Sample records for statistically significant model

  1. Strategies for Testing Statistical and Practical Significance in Detecting DIF with Logistic Regression Models

    ERIC Educational Resources Information Center

    Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza

    2014-01-01

    This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…
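The Wald statistic questioned in this record reduces to a short computation once a coefficient estimate and its standard error are in hand. A minimal sketch (the numbers are hypothetical, not from the study):

```python
import math

def wald_p(beta, se):
    """Two-sided Wald test for a single logistic regression coefficient:
    z = beta / SE(beta), p-value from the standard normal distribution."""
    z = beta / se
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Hypothetical DIF coefficient of 0.5 with standard error 0.25:
z, p = wald_p(0.5, 0.25)  # z = 2.0, p ≈ 0.0455
```

The record's concern is whether this normal approximation is trustworthy for the parameters of interest; the arithmetic itself is this simple.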

  2. Testing prediction methods: Earthquake clustering versus the Poisson model

    USGS Publications Warehouse

    Michael, A.J.

    1997-01-01

    Testing earthquake prediction methods requires statistical techniques that compare observed success to random chance. One technique is to produce simulated earthquake catalogs and measure the relative success of predicting real and simulated earthquakes. The accuracy of these tests depends on the validity of the statistical model used to simulate the earthquakes. This study tests the effect of clustering in the statistical earthquake model on the results. Three simulation models were used to produce significance levels for a VLF earthquake prediction method. As the degree of simulated clustering increases, the statistical significance drops. Hence, the use of a seismicity model with insufficient clustering can lead to overly optimistic results. A successful method must pass the statistical tests with a model that fully replicates the observed clustering. However, a method can be rejected based on tests with a model that contains insufficient clustering. U.S. copyright. Published in 1997 by the American Geophysical Union.
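The simulation test described here can be sketched as a Monte Carlo significance calculation. Under a memoryless (Poisson-like) null model each event is an independent chance hit; all numbers below are illustrative, not from the study:

```python
import random

def simulated_significance(observed_hits, n_events, hit_prob,
                           n_sims=10_000, seed=0):
    """Fraction of simulated catalogs whose chance hit count matches or
    beats the observed count. Events here are independent (no clustering),
    so per the record this can overstate significance for real seismicity."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_sims):
        hits = sum(rng.random() < hit_prob for _ in range(n_events))
        if hits >= observed_hits:
            exceed += 1
    return exceed / n_sims

# 9 of 20 events "predicted" when chance alone would hit 20% of them:
p = simulated_significance(9, 20, 0.2)  # small p: apparent skill
```

A clustered simulation would replace the independent draws with correlated ones, raising the chance hit counts and, as the record notes, the resulting significance level.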

  3. A two-component rain model for the prediction of attenuation statistics

    NASA Technical Reports Server (NTRS)

    Crane, R. K.

    1982-01-01

    A two-component rain model has been developed for calculating attenuation statistics. In contrast to most other attenuation prediction models, the two-component model calculates the occurrence probability for volume cells or debris attenuation events. The model performed significantly better than the International Radio Consultative Committee model when used for predictions on earth-satellite paths. It is expected that the model will have applications in modeling the joint statistics required for space diversity system design, the statistics of interference due to rain scatter at attenuating frequencies, and the duration statistics for attenuation events.

  4. New insights into the endophenotypic status of cognition in bipolar disorder: genetic modelling study of twins and siblings.

    PubMed

    Georgiades, Anna; Rijsdijk, Fruhling; Kane, Fergus; Rebollo-Mesa, Irene; Kalidindi, Sridevi; Schulze, Katja K; Stahl, Daniel; Walshe, Muriel; Sahakian, Barbara J; McDonald, Colm; Hall, Mei-Hua; Murray, Robin M; Kravariti, Eugenia

    2016-06-01

    Twin studies have lacked statistical power to apply advanced genetic modelling techniques to the search for cognitive endophenotypes for bipolar disorder. To quantify the shared genetic variability between bipolar disorder and cognitive measures. Structural equation modelling was performed on cognitive data collected from 331 twins/siblings of varying genetic relatedness, disease status and concordance for bipolar disorder. Using a parsimonious AE model, verbal episodic and spatial working memory showed statistically significant genetic correlations with bipolar disorder (rg = |0.23|-|0.27|), which lost statistical significance after covarying for affective symptoms. Using an ACE model, IQ and visual-spatial learning showed statistically significant genetic correlations with bipolar disorder (rg = |0.51|-|1.00|), which remained significant after covarying for affective symptoms. Verbal episodic and spatial working memory capture a modest fraction of the bipolar diathesis. IQ and visual-spatial learning may tap into genetic substrates of non-affective symptomatology in bipolar disorder. © The Royal College of Psychiatrists 2016.

  5. Comparing and combining process-based crop models and statistical models with some implications for climate change

    NASA Astrophysics Data System (ADS)

    Roberts, Michael J.; Braun, Noah O.; Sinclair, Thomas R.; Lobell, David B.; Schlenker, Wolfram

    2017-09-01

    We compare predictions of a simple process-based crop model (Soltani and Sinclair 2012), a simple statistical model (Schlenker and Roberts 2009), and a combination of both models to actual maize yields on a large, representative sample of farmer-managed fields in the Corn Belt region of the United States. After statistical post-model calibration, the process model (Simple Simulation Model, or SSM) predicts actual outcomes slightly better than the statistical model, but the combined model performs significantly better than either model. The SSM, statistical model and combined model all show similar relationships with precipitation, while the SSM better accounts for temporal patterns of precipitation, vapor pressure deficit and solar radiation. The statistical and combined models show a more negative impact associated with extreme heat for which the process model does not account. Due to the extreme heat effect, predicted impacts under uniform climate change scenarios are considerably more severe for the statistical and combined models than for the process-based model.
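The abstract does not specify how the combined model blends the two predictors; one common approach is a least-squares weighting of their outputs. A sketch under that assumption (no intercept, toy data only):

```python
def blend_weights(y, pred_a, pred_b):
    """Least-squares weights for y ≈ wa*pred_a + wb*pred_b, solved from
    the 2x2 normal equations. A simplified stand-in for the paper's
    combination method, which the abstract does not detail."""
    saa = sum(a * a for a in pred_a)
    sbb = sum(b * b for b in pred_b)
    sab = sum(a * b for a, b in zip(pred_a, pred_b))
    say = sum(a * yi for a, yi in zip(pred_a, y))
    sby = sum(b * yi for b, yi in zip(pred_b, y))
    det = saa * sbb - sab * sab
    return (say * sbb - sby * sab) / det, (sby * saa - say * sab) / det

# If model A already matches the yields, it gets all the weight:
wa, wb = blend_weights([1, 2, 3], [1, 2, 3], [3, 2, 1])  # wa ≈ 1, wb ≈ 0
```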

  6. Towards Accurate Modelling of Galaxy Clustering on Small Scales: Testing the Standard ΛCDM + Halo Model

    NASA Astrophysics Data System (ADS)

    Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron K.; Scoccimarro, Roman; Piscionere, Jennifer A.; Wibking, Benjamin D.

    2018-04-01

    Interpreting the small-scale clustering of galaxies with halo models can elucidate the connection between galaxies and dark matter halos. Unfortunately, the modelling is typically not sufficiently accurate for ruling out models statistically. It is thus difficult to use the information encoded in small scales to test cosmological models or probe subtle features of the galaxy-halo connection. In this paper, we attempt to push halo modelling into the "accurate" regime with a fully numerical mock-based methodology and careful treatment of statistical and systematic errors. With our forward-modelling approach, we can incorporate clustering statistics beyond the traditional two-point statistics. We use this modelling methodology to test the standard ΛCDM + halo model against the clustering of SDSS DR7 galaxies. Specifically, we use the projected correlation function, group multiplicity function and galaxy number density as constraints. We find that while the model fits each statistic separately, it struggles to fit them simultaneously. Adding group statistics leads to a more stringent test of the model and significantly tighter constraints on model parameters. We explore the impact of varying the adopted halo definition and cosmological model and find that changing the cosmology makes a significant difference. The most successful model we tried (Planck cosmology with Mvir halos) matches the clustering of low luminosity galaxies, but exhibits a 2.3σ tension with the clustering of luminous galaxies, thus providing evidence that the "standard" halo model needs to be extended. This work opens the door to adding interesting freedom to the halo model and including additional clustering statistics as constraints.

  7. Testing statistical self-similarity in the topology of river networks

    USGS Publications Warehouse

    Troutman, Brent M.; Mantilla, Ricardo; Gupta, Vijay K.

    2010-01-01

    Recent work has demonstrated that the topological properties of real river networks deviate significantly from predictions of Shreve's random model. At the same time the property of mean self-similarity postulated by Tokunaga's model is well supported by data. Recently, a new class of network model called random self-similar networks (RSN) that combines self-similarity and randomness has been introduced to replicate important topological features observed in real river networks. We investigate if the hypothesis of statistical self-similarity in the RSN model is supported by data on a set of 30 basins located across the continental United States that encompass a wide range of hydroclimatic variability. We demonstrate that the generators of the RSN model obey a geometric distribution, and self-similarity holds in a statistical sense in 26 of these 30 basins. The parameters describing the distributions of interior and exterior generators are found to be statistically different, and this difference is shown to produce the well-known Hack's law. The inter-basin variability of RSN parameters is found to be statistically significant. We also test generator dependence on two climatic indices, mean annual precipitation and radiative index of dryness. Some indication of climatic influence on the generators is detected, but this influence is not statistically significant with the sample size available. Finally, two key applications of the RSN model to hydrology and geomorphology are briefly discussed.
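Checking whether generators obey a geometric distribution, as described here, can be sketched with a method-of-moments fit and a chi-square goodness-of-fit statistic (tail-bin pooling and degrees-of-freedom bookkeeping are omitted for brevity):

```python
def geometric_fit(counts):
    """counts[k] = number of generators equal to k+1 (support 1, 2, 3, ...).
    Returns the fitted geometric parameter and a chi-square statistic to
    compare against a chi-square critical value."""
    n = sum(counts)
    mean = sum((k + 1) * c for k, c in enumerate(counts)) / n
    p_hat = 1.0 / mean  # method-of-moments estimate
    chi2 = 0.0
    for k, obs in enumerate(counts):
        expected = n * p_hat * (1 - p_hat) ** k
        chi2 += (obs - expected) ** 2 / expected
    return p_hat, chi2

# Illustrative counts that follow a geometric law with p = 0.5 closely:
p_hat, chi2 = geometric_fit([500, 250, 125, 62, 31, 16, 8, 4, 2, 1, 1])
```

A small chi-square value relative to the critical value supports the geometric hypothesis, which is the statistical sense in which self-similarity "holds" in most of the 30 basins.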

  8. The Development of Statistical Models for Predicting Surgical Site Infections in Japan: Toward a Statistical Model-Based Standardized Infection Ratio.

    PubMed

    Fukuda, Haruhisa; Kuroki, Manabu

    2016-03-01

    To develop and internally validate a surgical site infection (SSI) prediction model for Japan. Retrospective observational cohort study. We analyzed surveillance data submitted to the Japan Nosocomial Infections Surveillance system for patients who had undergone target surgical procedures from January 1, 2010, through December 31, 2012. Logistic regression analyses were used to develop statistical models for predicting SSIs. An SSI prediction model was constructed for each of the procedure categories by statistically selecting the appropriate risk factors from among the collected surveillance data and determining their optimal categorization. Standard bootstrapping techniques were applied to assess potential overfitting. The C-index was used to compare the predictive performances of the new statistical models with those of models based on conventional risk index variables. The study sample comprised 349,987 cases from 428 participant hospitals throughout Japan, and the overall SSI incidence was 7.0%. The C-indices of the new statistical models were significantly higher than those of the conventional risk index models in 21 (67.7%) of the 31 procedure categories (P<.05). No significant overfitting was detected. Japan-specific SSI prediction models were shown to generally have higher accuracy than conventional risk index models. These new models may have applications in assessing hospital performance and identifying high-risk patients in specific procedure categories.
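For a binary outcome, the C-index used to compare models in this record is the probability that a randomly chosen case with the event is scored higher than a randomly chosen case without it. A self-contained sketch (toy data, not the surveillance dataset):

```python
def c_index(y_true, y_score):
    """Concordance probability: the chance that a randomly chosen positive
    case is scored above a randomly chosen negative one (ties count half).
    For binary outcomes this equals the ROC AUC."""
    pos = [s for y, s in zip(y_true, y_score) if y == 1]
    neg = [s for y, s in zip(y_true, y_score) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Toy predictions: the model ranks 3 of the 4 positive/negative pairs correctly:
auc = c_index([1, 1, 0, 0], [0.9, 0.4, 0.5, 0.1])  # 0.75
```

A C-index of 0.5 is chance-level discrimination and 1.0 is perfect; the study's comparison is between C-indices of the new statistical models and the conventional risk index models.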

  9. The Development and Demonstration of Multiple Regression Models for Operant Conditioning Questions.

    ERIC Educational Resources Information Center

    Fanning, Fred; Newman, Isadore

    Based on the assumption that inferential statistics can make the operant conditioner more sensitive to possible significant relationships, regression models were developed to test the statistical significance of differences between slopes and Y intercepts of the experimental and control group subjects. These results were then compared to the traditional operant…

  10. Towards accurate modelling of galaxy clustering on small scales: testing the standard ΛCDM + halo model

    NASA Astrophysics Data System (ADS)

    Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron K.; Scoccimarro, Roman; Piscionere, Jennifer A.; Wibking, Benjamin D.

    2018-07-01

    Interpreting the small-scale clustering of galaxies with halo models can elucidate the connection between galaxies and dark matter haloes. Unfortunately, the modelling is typically not sufficiently accurate for ruling out models statistically. It is thus difficult to use the information encoded in small scales to test cosmological models or probe subtle features of the galaxy-halo connection. In this paper, we attempt to push halo modelling into the `accurate' regime with a fully numerical mock-based methodology and careful treatment of statistical and systematic errors. With our forward-modelling approach, we can incorporate clustering statistics beyond the traditional two-point statistics. We use this modelling methodology to test the standard Λ cold dark matter (ΛCDM) + halo model against the clustering of Sloan Digital Sky Survey (SDSS) seventh data release (DR7) galaxies. Specifically, we use the projected correlation function, group multiplicity function, and galaxy number density as constraints. We find that while the model fits each statistic separately, it struggles to fit them simultaneously. Adding group statistics leads to a more stringent test of the model and significantly tighter constraints on model parameters. We explore the impact of varying the adopted halo definition and cosmological model and find that changing the cosmology makes a significant difference. The most successful model we tried (Planck cosmology with Mvir haloes) matches the clustering of low-luminosity galaxies, but exhibits a 2.3σ tension with the clustering of luminous galaxies, thus providing evidence that the `standard' halo model needs to be extended. This work opens the door to adding interesting freedom to the halo model and including additional clustering statistics as constraints.

  11. Duration on unemployment: geographic mobility and selectivity bias.

    PubMed

    Goss, E P; Paul, C; Wilhite, A

    1994-01-01

    Modeling the factors affecting the duration of unemployment was found to be influenced by the inclusion of migration factors. Traditional models that did not control for migration were found to underestimate movers' probability of finding an acceptable job. The empirical test of the theory, based on data for US household heads unemployed in 1982 and employed in 1982 and 1983, found that the cumulative probability of reemployment after 30 weeks of searching was .422 in the traditional model and .624 in the migration selectivity model. In addition, controlling for selectivity eliminated the significance of the relationship between race and job search duration in the model. The relationship between search duration and the 1982 county unemployment rate became statistically significant, and the relationship between search duration and 1980 population per square mile in the 1982 county of residence became statistically insignificant. The finding that non-Whites experience longer unemployment durations can better be understood in terms of non-Whites' lower geographic mobility and more limited job contacts. The statistically significant finding that a high unemployment rate in the home labor market reduces the probability of finding employment was more in keeping with expectations. The analysis assumed that the duration of unemployment accurately reflected the length of job search. The sample was redrawn to exclude discouraged workers and the analysis was repeated; the findings were similar to those for the full sample, with the coefficient for the migration variable remaining negative and statistically significant and the coefficient for alpha remaining positive and statistically significant. Race in the selectivity model remained statistically insignificant. The findings supported the Schwartz model, which hypothesizes that expanding the radius of the search reduces the duration of unemployment. The exclusion of the migration factor misspecified the equation for unemployment duration. Policy should be directed to the problems of geographic mobility, particularly among non-Whites.

  12. A General Model for Estimating and Correcting the Effects of Nonindependence in Meta-Analysis.

    ERIC Educational Resources Information Center

    Strube, Michael J.

    A general model is described which can be used to represent the four common types of meta-analysis: (1) estimation of effect size by combining study outcomes; (2) estimation of effect size by contrasting study outcomes; (3) estimation of statistical significance by combining study outcomes; and (4) estimation of statistical significance by…
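Type (3) in this record, estimating overall significance by combining study outcomes, is classically done with Stouffer's method, sketched below under the simplifying assumption of independent studies (the nonindependence corrections that motivate the record's general model are omitted):

```python
import math

def stouffer_combined_p(p_values):
    """Stouffer's method: convert one-sided p-values to z-scores, sum them
    with a sqrt(k) scaling, and convert back. Assumes independent studies."""
    def z_of(p):
        # inverse standard-normal CDF via bisection (avoids external libraries)
        lo, hi = -10.0, 10.0
        while hi - lo > 1e-10:
            mid = (lo + hi) / 2
            if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < 1 - p:
                lo = mid
            else:
                hi = mid
        return lo
    z = sum(z_of(p) for p in p_values) / math.sqrt(len(p_values))
    return 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Two independent studies, each with one-sided p = 0.05:
combined = stouffer_combined_p([0.05, 0.05])  # ≈ 0.010
```

When study outcomes are positively correlated, the effective number of studies is smaller than k, so this independent-studies formula overstates the combined significance; modeling that correlation is exactly what the record's general framework addresses.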

  13. Functional annotation of regulatory pathways.

    PubMed

    Pandey, Jayesh; Koyutürk, Mehmet; Kim, Yohan; Szpankowski, Wojciech; Subramaniam, Shankar; Grama, Ananth

    2007-07-01

    Standardized annotations of biomolecules in interaction networks (e.g. Gene Ontology) provide comprehensive understanding of the function of individual molecules. Extending such annotations to pathways is a critical component of functional characterization of cellular signaling at the systems level. We propose a framework for projecting gene regulatory networks onto the space of functional attributes using multigraph models, with the objective of deriving statistically significant pathway annotations. We first demonstrate that annotations of pairwise interactions do not generalize to indirect relationships between processes. Motivated by this result, we formalize the problem of identifying statistically overrepresented pathways of functional attributes. We establish the hardness of this problem by demonstrating the non-monotonicity of common statistical significance measures. We propose a statistical model that emphasizes the modularity of a pathway, evaluating its significance based on the coupling of its building blocks. We complement the statistical model by an efficient algorithm and software, Narada, for computing significant pathways in large regulatory networks. Comprehensive results from our methods applied to the Escherichia coli transcription network demonstrate that our approach is effective in identifying known, as well as novel biological pathway annotations. Narada is implemented in Java and is available at http://www.cs.purdue.edu/homes/jpandey/narada/.
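The paper's modularity-based significance model is more involved than the abstract can convey; the baseline overrepresentation test such methods build on is the hypergeometric tail probability, sketched here with illustrative numbers:

```python
from math import comb

def enrichment_p(N, K, n, k):
    """P(X >= k) for a hypergeometric draw: n genes sampled from a universe
    of N, of which K carry the annotation. The classic overrepresentation
    p-value; the modularity-based scoring in Narada goes beyond this."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# 5 genes drawn from a universe of 10; all 5 annotated genes recovered:
p = enrichment_p(10, 5, 5, 5)  # 1/252 ≈ 0.004
```

The record's hardness result concerns exactly such measures: because they are non-monotonic over growing pathways, greedy extension of significant sub-pathways is not guaranteed to find significant larger ones.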

  14. Effect of Internet-Based Cognitive Apprenticeship Model (i-CAM) on Statistics Learning among Postgraduate Students.

    PubMed

    Saadati, Farzaneh; Ahmad Tarmizi, Rohani; Mohd Ayub, Ahmad Fauzi; Abu Bakar, Kamariah

    2015-01-01

    Because students' ability to use statistics, which is mathematical in nature, is one of the concerns of educators, embedding the pedagogical characteristics of learning within an e-learning system adds value by facilitating the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship in learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling, followed by practice and coaching. This study utilized an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, compared to the conventional mathematics learning model, i-CAM significantly promoted students' problem-solving performance at the end of each phase. In addition, the overall difference in students' test scores was statistically significant after controlling for pre-test scores. The findings conveyed in this paper confirm the considerable value of i-CAM in improving statistics learning for non-specialized postgraduate students.

  15. A Model Assessment of Satellite Observed Trends in Polar Sea Ice Extents

    NASA Technical Reports Server (NTRS)

    Vinnikov, Konstantin Y.; Cavalieri, Donald J.; Parkinson, Claire L.

    2005-01-01

    For more than three decades now, satellite passive microwave observations have been used to monitor polar sea ice. Here we utilize sea ice extent trends determined from primarily satellite data for both the Northern and Southern Hemispheres for the period 1972(73)-2004 and compare them with results from simulations by eleven climate models. In the Northern Hemisphere, observations show a statistically significant decrease of sea ice extent and an acceleration of sea ice retreat during the past three decades. However, from the modeled natural variability of sea ice extents in control simulations, we conclude that the acceleration is not statistically significant and should not be extrapolated into the future. Observations and model simulations show that the time scale of climate variability in sea ice extent in the Southern Hemisphere is much larger than in the Northern Hemisphere and that the Southern Hemisphere sea ice extent trends are not statistically significant.
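Trend significance of the kind assessed here can be sketched as an ordinary least-squares slope with a t statistic. Note the simplification: residuals are treated as uncorrelated, which understates uncertainty for autocorrelated series such as sea ice extent (the data below are illustrative, not observations):

```python
def trend_significance(x, y):
    """OLS slope, its standard error, and the t statistic. |t| > 2 is a
    rough 5% cutoff for a long series; assumes uncorrelated residuals,
    which is optimistic for autocorrelated climate records."""
    n = len(x)
    xm = sum(x) / n
    ym = sum(y) / n
    sxx = sum((xi - xm) ** 2 for xi in x)
    slope = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y)) / sxx
    intercept = ym - slope * xm
    resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
    se = (sum(r * r for r in resid) / ((n - 2) * sxx)) ** 0.5
    return slope, se, slope / se

# Synthetic 10-point series: trend of 2 per step plus alternating noise:
years = list(range(10))
vals = [2 * t + (1 if t % 2 else -1) for t in years]
slope, se, t_stat = trend_significance(years, vals)
```

Comparing an observed trend against the model-simulated natural variability, as the record does for the acceleration of Arctic sea ice retreat, replaces this textbook standard error with a spread estimated from control simulations.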

  16. Sensation seeking and smoking behaviors among adolescents in the Republic of Korea.

    PubMed

    Hwang, Heejin; Park, Sunhee

    2015-06-01

    This study aimed to explore the relationship between the four components of sensation seeking (i.e., disinhibition, thrill and adventure seeking, experience seeking, and boredom susceptibility) and three types of smoking behavior (i.e., non-smoking, experimental smoking, and current smoking) among high school students in the Republic of Korea. Multivariate multinomial logistic regression analysis was performed using two models. In Model 1, the four subscales of sensation seeking were used as covariates, and in Model 2, other control factors (i.e., characteristics related to demographics, individuals, family, school, and friends) were added to Model 1 in order to adjust for their effects. In Model 1, the impact of disinhibition on experimental smoking and current smoking was statistically significant. In Model 2, the influence of disinhibition on both of these smoking behaviors remained statistically significant after controlling for all the other covariates. Also, the effect of thrill and adventure seeking on experimental smoking was statistically significant. The two statistically significant subscales of sensation seeking were positively associated with the risk of smoking behaviors. According to extant literature and current research, sensation seeking, particularly disinhibition, is strongly associated with smoking among youth. Therefore, sensation seeking should be measured among adolescents to identify those who are at greater risk of smoking and to develop more effective intervention strategies in order to curb the smoking epidemic among youth. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Evaluation of Regression Models of Balance Calibration Data Using an Empirical Criterion

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert; Volden, Thomas R.

    2012-01-01

    An empirical criterion for assessing the significance of individual terms of regression models of wind tunnel strain gage balance outputs is evaluated. The criterion is based on the percent contribution of a regression model term: a term is considered significant if its percent contribution exceeds the empirical threshold of 0.05%. The criterion has the advantage that it can easily be computed using the regression coefficients of the gage outputs and the load capacities of the balance. First, a definition of the empirical criterion is provided. Then, it is compared with an alternate statistical criterion that is widely used in regression analysis. Finally, calibration data sets from a variety of balances are used to illustrate the connection between the empirical and the statistical criterion. A review of these results indicates that the empirical criterion is suitable only for a crude assessment of the significance of a regression model term, because the boundary between significant and insignificant terms cannot be defined precisely. Therefore, regression model term reduction should only be performed using the more universally applicable statistical criterion.
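The percent-contribution idea lends itself to a short computation. The exact normalization the authors use is not given in the abstract, so the version below (each term's share of the summed absolute term magnitudes evaluated at load capacity) is an assumption:

```python
def significant_terms(coefficients, term_values_at_capacity, threshold=0.05):
    """Flag regression terms whose percent contribution exceeds `threshold`
    percent (the 0.05% empirical cutoff from the record). Contribution is
    computed as each term's share of the summed absolute magnitudes at load
    capacity -- an assumed normalization, not the paper's exact definition."""
    magnitudes = [abs(c * v)
                  for c, v in zip(coefficients, term_values_at_capacity)]
    total = sum(magnitudes)
    return [100.0 * m / total > threshold for m in magnitudes]

# A dominant linear term and a negligible cross-term (hypothetical values):
flags = significant_terms([1.0, 1e-6], [1.0, 1.0])  # [True, False]
```

The record's caveat applies directly: a hard 0.05% cutoff draws a sharp line through what is really a continuum, which is why the statistical criterion is preferred for term reduction.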

  18. [Mechanism study on leptin resistance in lung cancer cachexia rats treated by Xiaoyan Decoction].

    PubMed

    Zhang, Yun-Chao; Jia, Ying-Jie; Yang, Pei-Ying; Zhang, Xing; Li, Xiao-Jiang; Zhang, Ying; Zhu, Jin-Li; Sun, Yi-Yu; Chen, Jun; Duan, Hao-Guo; Guo, Hua; Li, Chao

    2014-12-01

    To study the leptin resistance mechanism of Xiaoyan Decoction (XD) in lung cancer cachexia (LCC) rats, an LCC rat model was established. Forty rats were randomly divided into the normal control group, the LCC model group, the XD group, and the positive control group, 10 in each group. After the LCC model was set up, rats in the LCC model group were administered normal saline, 2 mL each time; rats in the XD group were administered XD at a daily dose of 2 mL; and rats in the positive control group were administered Medroxyprogesterone Acetate suspension (20 mg/kg) by gastrogavage at a daily dose of 2 mL. All medication lasted for 14 days. The general condition and tumor growth were observed. Serum leptin levels and leptin receptor levels in the hypothalamus were detected using enzyme-linked immunosorbent assay. Contents of neuropeptide Y (NPY) and the anorexigenic gene product proopiomelanocortin (POMC) were measured using real-time PCR. Serum leptin levels were significantly lower in the LCC model group than in the normal control group (P < 0.05). Compared with the LCC model group, serum leptin levels significantly increased in the XD group (P < 0.01). Leptin receptor levels in the hypothalamus increased significantly in the LCC model group (P < 0.01); relative to these elevated levels, both XD and Medroxyprogesterone Acetate effectively reduced leptin receptor levels with statistical significance (P < 0.01), and there was also a statistical difference between the XD group and the positive control group (P < 0.05). The content of NPY was significantly higher in the LCC model group than in the other groups (P < 0.05), with no statistical difference in NPY between the normal control group and the two treatment groups (P > 0.05). There was a statistical difference in POMC between the normal control group and the LCC model group (P < 0.05). POMC was significantly decreased in the XD group and the positive control group (P < 0.05), and more markedly so in the XD group (P < 0.05). In conclusion, leptin resistance existed in LCC rats. XD could increase serum leptin levels and reduce leptin receptor levels in the hypothalamus. LCC could be improved by elevating NPY contents in the hypothalamus, reducing POMC contents, promoting appetite, and increasing food intake through both peripheral and central pathways.

  19. The application of feature selection to the development of Gaussian process models for percutaneous absorption.

    PubMed

    Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P

    2010-06-01

    The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset, to employ these methods, particularly feature selection, to determine the key physicochemical descriptors that exert the most significant influence on percutaneous absorption, and to compare the resulting models with established existing models. Gaussian processes, including automatic relevance determination (GPRARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MatLab software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study, and a range of statistical measures was used to determine model quality. The inherently nonlinear nature of the skin dataset was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (rank order: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis found that considerable synergy existed between certain parameters, suggesting that a number of the descriptors employed were effectively interchangeable and thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity compared with SLN or QSPR models, and feature selection methods were able to provide important mechanistic information. However, the synergy between certain parameters meant that some descriptors (e.g. molecular weight and melting point) could be interchanged without incurring a loss of model quality. Such synergy suggests that a model constructed from discrete terms in an equation may not be the most appropriate way of representing mechanistic understanding of skin absorption.

  20. Comparing geological and statistical approaches for element selection in sediment tracing research

    NASA Astrophysics Data System (ADS)

    Laceby, J. Patrick; McMahon, Joe; Evrard, Olivier; Olley, Jon

    2015-04-01

    Elevated suspended sediment loads reduce reservoir capacity and significantly increase the cost of operating water treatment infrastructure, making the management of sediment supply to reservoirs of increasing importance. Sediment fingerprinting techniques can be used to determine the relative contributions of different sources of sediment accumulating in reservoirs. The objective of this research is to compare geological and statistical approaches to element selection for sediment fingerprinting modelling. Time-integrated samplers (n=45) were used to obtain source samples from four major subcatchments flowing into the Baroon Pocket Dam in South East Queensland, Australia. The geochemistry of potential sources was compared to the geochemistry of sediment cores (n=12) sampled in the reservoir. The geological approach selected elements for modelling that provided expected, observed and statistical discrimination between sediment sources. Two statistical approaches selected elements for modelling with the Kruskal-Wallis H-test and Discriminant Function Analysis (DFA). In particular, two different significance levels (0.05 & 0.35) for the DFA were included to investigate the importance of element selection on modelling results. A distribution model determined the relative contributions of different sources to sediment sampled in the Baroon Pocket Dam. Elemental discrimination was expected between one subcatchment (Obi Obi Creek) and the remaining subcatchments (Lexys, Falls and Bridge Creek); six major elements were expected to provide discrimination, but of these six only Fe2O3 and SiO2 provided expected, observed and statistical discrimination. Modelling results with this geological approach indicated that 36% (+/- 9%) of sediment sampled in the reservoir cores was derived from mafic sources and 64% (+/- 9%) from felsic sources. The geological approach and the first statistical approach (DFA0.05) differed by only 1% (σ 5%) for 5 of the 6 model groupings, with only the Lexys Creek modelling results differing significantly (35%). The statistical model with expanded elemental selection (DFA0.35) differed from the geological model by an average of 30% across all 6 models. Elemental selection therefore has the potential to impact sediment fingerprinting modelling results. Accordingly, it is important to incorporate both robust geological and statistical approaches when selecting elements for sediment fingerprinting. For the Baroon Pocket Dam, management should focus on reducing the supply of sediment derived from felsic sources in each of the subcatchments.
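The Kruskal-Wallis H-test used for element selection in this record reduces, in the no-ties case, to a rank-sum computation; H is then compared against a chi-square critical value (3.84 for two groups at the 0.05 level). A minimal sketch with illustrative concentrations:

```python
def kruskal_h(groups):
    """Kruskal-Wallis H statistic over a list of sample groups.
    Assumes no tied values (midranks for ties are omitted for brevity)."""
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}  # 1-based ranks
    n = len(pooled)
    rank_sum_term = sum(sum(rank[v] for v in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * rank_sum_term - 3 * (n + 1)

# Two hypothetical tracer-element groups with clear separation:
h = kruskal_h([[1, 2, 3], [10, 11, 12]])  # ≈ 3.857, just above 3.84
```

An element whose H exceeds the critical value discriminates between source groups and is retained for the mixing model; the record's point is that where this statistical cutoff is set (0.05 vs 0.35 in the DFA step) can shift the modelled source apportionment substantially.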

  1. Transfer Student Success: Educationally Purposeful Activities Predictive of Undergraduate GPA

    ERIC Educational Resources Information Center

    Fauria, Renee M.; Fuller, Matthew B.

    2015-01-01

    Researchers evaluated the effects of Educationally Purposeful Activities (EPAs) on transfer and nontransfer students' cumulative GPAs. Hierarchical, linear, and multiple regression models yielded seven statistically significant educationally purposeful items that influenced undergraduate student GPAs. Statistically significant positive EPAs for…

  2. Prediction of the presence of insulin resistance using general health checkup data in Japanese employees with metabolic risk factors.

    PubMed

    Takahara, Mitsuyoshi; Katakami, Naoto; Kaneto, Hideaki; Noguchi, Midori; Shimomura, Iichiro

    2014-01-01

    The aim of the current study was to develop a predictive model of insulin resistance using general health checkup data in Japanese employees with one or more metabolic risk factors. We used a database of 846 Japanese employees with one or more metabolic risk factors who underwent general health checkup and a 75-g oral glucose tolerance test (OGTT). Logistic regression models were developed to predict existing insulin resistance evaluated using the Matsuda index. The predictive performance of these models was assessed using the C statistic. The C statistics of body mass index (BMI), waist circumference and their combined use were 0.743, 0.732 and 0.749, with no significant differences. The multivariate backward selection model, in which BMI, the levels of plasma glucose, high-density lipoprotein (HDL) cholesterol, log-transformed triglycerides and log-transformed alanine aminotransferase and hypertension under treatment remained, had a C statistic of 0.816, with a significant difference compared to the combined use of BMI and waist circumference (p<0.01). The C statistic was not significantly reduced when the levels of log-transformed triglycerides and log-transformed alanine aminotransferase and hypertension under treatment were simultaneously excluded from the multivariate model (p=0.14). On the other hand, further exclusion of any of the remaining three variables significantly reduced the C statistic (all p<0.01). When predicting the presence of insulin resistance using general health checkup data in Japanese employees with metabolic risk factors, it is important to take into consideration the BMI and fasting plasma glucose and HDL cholesterol levels.
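
    The C statistic used to compare the models above is the concordance probability. A minimal sketch on hypothetical scores and outcomes (not the checkup data) is:

    ```python
    import numpy as np

    def c_statistic(y_true, y_score):
        """Probability that a randomly chosen positive case is scored above a
        randomly chosen negative case (ties count one half)."""
        y_true = np.asarray(y_true)
        y_score = np.asarray(y_score)
        pos = y_score[y_true == 1]
        neg = y_score[y_true == 0]
        diff = pos[:, None] - neg[None, :]
        return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / (len(pos) * len(neg))

    # Hypothetical data: outcome = insulin resistance present, score = model output.
    y = [1, 1, 1, 0, 0, 0, 0]
    score = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2, 0.1]
    print(round(c_statistic(y, score), 3))  # 11 of 12 pairs concordant: 0.917
    ```

    A C statistic of 0.5 means no discrimination and 1.0 means perfect ranking, which is how values such as 0.749 and 0.816 above are compared.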

  3. Effect of Internet-Based Cognitive Apprenticeship Model (i-CAM) on Statistics Learning among Postgraduate Students

    PubMed Central

    Saadati, Farzaneh; Ahmad Tarmizi, Rohani

    2015-01-01

    Because students’ ability to use statistics, which is mathematical in nature, is one of the concerns of educators, embedding within an e-learning system the pedagogical characteristics of learning is ‘value added’ because it facilitates the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship in learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling and then practice plus coaching. This study utilized an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, when compared to the conventional mathematics learning model, the i-CAM could significantly promote students’ problem-solving performance at the end of each phase. In addition, the combination of the differences in students' test scores was considered to be statistically significant after controlling for the pre-test scores. The findings conveyed in this paper confirmed the considerable value of i-CAM in the improvement of statistics learning for non-specialized postgraduate students. PMID:26132553

  4. Long-term occlusal changes assessed by the American Board of Orthodontics' model grading system.

    PubMed

    Aszkler, Robert M; Preston, Charles B; Saltaji, Humam; Tabbaa, Sawsan

    2014-02-01

    The purpose of this study was to assess the long-term posttreatment changes in all criteria of the American Board of Orthodontics' (ABO) model grading system. We used plaster models from patients' final and posttreatment records. Thirty patients treated by 1 orthodontist using 1 bracket prescription were selected. An initial discrepancy index for each subject was performed to determine the complexity of each case. The final models were then graded using the ABO's model grading system immediately at posttreatment and postretention. Statistical analysis was performed on the 8 criteria of the model grading system, including paired t tests and Pearson correlations. An alpha of 0.05 was considered statistically significant. The average length of time between the posttreatment and postretention records was 12.7 ± 4.4 years. It was shown that alignment and rotations worsened by postretention (P = 0.014), and a weak statistically significant correlation at posttreatment and postretention was found (0.44; P = 0.016). Both marginal ridges and occlusal contacts scored less well at posttreatment. These criteria showed a significant decrease in scores between posttreatment and postretention (P <0.001), but the correlations were not statistically significant. The average total score showed a significant decrease between posttreatment and postretention (P <0.001), partly because of the large decrease in the previous 2 criteria. Higher scores for occlusal contacts and marginal ridges were found at the end of treatment; however, those scores and the overall scores for the 30 subjects improved in the postretention phase. Copyright © 2014. Published by Mosby, Inc.

  5. A statistical assessment of seismic models of the U.S. continental crust using Bayesian inversion of ambient noise surface wave dispersion data

    NASA Astrophysics Data System (ADS)

    Olugboji, T. M.; Lekic, V.; McDonough, W.

    2017-07-01

    We present a new approach for evaluating existing crustal models using ambient noise data sets and its associated uncertainties. We use a transdimensional hierarchical Bayesian inversion approach to invert ambient noise surface wave phase dispersion maps for Love and Rayleigh waves using measurements obtained from Ekström (2014). Spatiospectral analysis shows that our results are comparable to a linear least squares inverse approach (except at higher harmonic degrees), but the procedure has additional advantages: (1) it yields an autoadaptive parameterization that follows Earth structure without making restricting assumptions on model resolution (regularization or damping) and data errors; (2) it can recover non-Gaussian phase velocity probability distributions while quantifying the sources of uncertainties in the data measurements and modeling procedure; and (3) it enables statistical assessments of different crustal models (e.g., CRUST1.0, LITHO1.0, and NACr14) using variable resolution residual and standard deviation maps estimated from the ensemble. These assessments show that in the stable old crust of the Archean, the misfits are statistically negligible, requiring no significant update to crustal models from the ambient noise data set. In other regions of the U.S., significant updates to regionalization and crustal structure are expected especially in the shallow sedimentary basins and the tectonically active regions, where the differences between model predictions and data are statistically significant.

  6. Statistical methods for the beta-binomial model in teratology.

    PubMed Central

    Yamamoto, E; Yanagimoto, T

    1994-01-01

    The beta-binomial model is widely used for analyzing teratological data involving littermates. Recent developments in statistical analyses of teratological data are briefly reviewed with emphasis on the model. For statistical inference of the parameters in the beta-binomial distribution, separation of the likelihood introduces a likelihood-based inference. This leads to reducing biases of estimators and also to improving accuracy of empirical significance levels of tests. Separate inference of the parameters can be conducted in a unified way. PMID:8187716
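
    A minimal sketch of beta-binomial likelihood fitting on hypothetical litter data (the counts below are invented for illustration, not from the review):

    ```python
    import numpy as np
    from scipy.special import betaln
    from scipy.optimize import minimize

    # Hypothetical teratology data: affected pups and litter sizes per litter.
    affected = np.array([0, 1, 2, 0, 3, 1, 0, 2])
    size = np.array([8, 10, 9, 7, 12, 10, 8, 11])

    def neg_loglik(params):
        # Beta-binomial log-likelihood; binomial coefficients are dropped
        # because they do not depend on the shape parameters a, b.
        a, b = np.exp(params)  # log-parameterization keeps a, b positive
        return -np.sum(betaln(affected + a, size - affected + b) - betaln(a, b))

    fit = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
    a_hat, b_hat = np.exp(fit.x)
    print(round(a_hat / (a_hat + b_hat), 3))  # fitted mean response probability
    ```

    The beta layer lets the per-litter response probability vary, which is what distinguishes this model from a plain binomial fit to pooled littermate data.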

  7. Publication of statistically significant research findings in prosthodontics & implant dentistry in the context of other dental specialties.

    PubMed

    Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos

    2015-10-01

    To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry with random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the significant results ranging from 47% to 86%. Multivariable modeling identified geographical area and involvement of a statistician as predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to various study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. 'Chain pooling' model selection as developed for the statistical analysis of a rotor burst protection experiment

    NASA Technical Reports Server (NTRS)

    Holms, A. G.

    1977-01-01

    A statistical decision procedure called chain pooling had been developed for model selection in fitting the results of a two-level fixed-effects full or fractional factorial experiment not having replication. The basic strategy included the use of one nominal level of significance for a preliminary test and a second nominal level of significance for the final test. The subject has been reexamined from the point of view of using as many as three successive statistical model deletion procedures in fitting the results of a single experiment. The investigation consisted of random number studies intended to simulate the results of a proposed aircraft turbine-engine rotor-burst-protection experiment. As a conservative approach, population model coefficients were chosen to represent a saturated 2 to the 4th power experiment with a distribution of parameter values unfavorable to the decision procedures. Three model selection strategies were developed.

  9. Evaluation of airborne lidar data to predict vegetation Presence/Absence

    USGS Publications Warehouse

    Palaseanu-Lovejoy, M.; Nayegandhi, A.; Brock, J.; Woodman, R.; Wright, C.W.

    2009-01-01

    This study evaluates the capabilities of the Experimental Advanced Airborne Research Lidar (EAARL) in delineating vegetation assemblages in Jean Lafitte National Park, Louisiana. Five-meter-resolution grids of bare earth, canopy height, canopy-reflection ratio, and height of median energy were derived from EAARL data acquired in September 2006. Ground-truth data were collected along transects to assess species composition, canopy cover, and ground cover. To decide which model is more accurate, comparisons of generalized linear models and generalized additive models were conducted using conventional evaluation methods (i.e., sensitivity, specificity, Kappa statistics, and area under the curve) and two new indexes, net reclassification improvement and integrated discrimination improvement. Generalized additive models were superior to generalized linear models in modeling presence/absence in training vegetation categories, but no statistically significant differences between the two models were achieved in determining the classification accuracy at validation locations using conventional evaluation methods, although statistically significant improvements in net reclassifications were observed. © 2009 Coastal Education and Research Foundation.

  10. Statistical significance of the rich-club phenomenon in complex networks

    NASA Astrophysics Data System (ADS)

    Jiang, Zhi-Qiang; Zhou, Wei-Xing

    2008-04-01

    We propose that the rich-club phenomenon in complex networks should be defined in the spirit of bootstrapping, in which a null model is adopted to assess the statistical significance of the rich-club detected. Our method can serve as a definition of the rich-club phenomenon and is applied to analyze three real networks and three model networks. The results show significant improvement compared with previously reported results. We report a dilemma with an exceptional example, showing that there does not exist an omnipotent definition for the rich-club phenomenon.
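
    The bootstrapping idea above can be sketched by comparing the observed rich-club coefficient against a degree-preserving null model. A Barabasi-Albert graph stands in for a real network, and the rewiring count is an arbitrary choice:

    ```python
    import networkx as nx

    def rich_club(G):
        """Unnormalized rich-club coefficient phi(k): edge density among nodes of degree > k."""
        deg = dict(G.degree())
        phi = {}
        for k in sorted(set(deg.values()))[:-1]:
            rich = [n for n, d in deg.items() if d > k]
            if len(rich) < 2:
                continue
            e = G.subgraph(rich).number_of_edges()
            phi[k] = 2 * e / (len(rich) * (len(rich) - 1))
        return phi

    G = nx.barabasi_albert_graph(200, 3, seed=1)
    obs = rich_club(G)

    # Null model: degree-preserving rewiring. Comparing phi(k) on the observed
    # graph against the rewired graph is the simplest version of the
    # significance assessment; a full test would repeat the rewiring many times.
    R = G.copy()
    nx.double_edge_swap(R, nswap=5 * R.number_of_edges(), max_tries=10**6, seed=1)
    null = rich_club(R)

    k = sorted(obs)[len(obs) // 2]
    print(k, round(obs[k], 3), round(null[k], 3))
    ```

    Ratios obs/null well above 1 across many randomized replicates would indicate a rich club beyond what the degree sequence alone explains.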

  11. Evaluating pictogram prediction in a location-aware augmentative and alternative communication system.

    PubMed

    Garcia, Luís Filipe; de Oliveira, Luís Caldas; de Matos, David Martins

    2016-01-01

    This study compared the performance of two statistical location-aware pictogram prediction mechanisms, with an all-purpose (All) pictogram prediction mechanism, having no location knowledge. The All approach had a unique language model under all locations. One of the location-aware alternatives, the location-specific (Spec) approach, made use of specific language models for pictogram prediction in each location of interest. The other location-aware approach resulted from combining the Spec and the All approaches, and was designated the mixed approach (Mix). In this approach, the language models acquired knowledge from all locations, but a higher relevance was assigned to the vocabulary from the associated location. Results from simulations showed that the Mix and Spec approaches could only outperform the baseline in a statistically significant way if pictogram users reuse more than 50% and 75% of their sentences, respectively. Under low sentence reuse conditions there were no statistically significant differences between the location-aware approaches and the All approach. Under these conditions, the Mix approach performed better than the Spec approach in a statistically significant way.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poyer, D.A.

    In this report, tests of statistical significance of five sets of variables with household energy consumption (at the point of end-use) are described. Five models, in sequence, were empirically estimated and tested for statistical significance by using the Residential Energy Consumption Survey of the US Department of Energy, Energy Information Administration. Each model incorporated additional information, embodied in a set of variables not previously specified in the energy demand system. The variable sets were generally labeled as economic variables, weather variables, household-structure variables, end-use variables, and housing-type variables. The tests of statistical significance showed each of the variable sets to be highly significant in explaining the overall variance in energy consumption. The findings imply that the contemporaneous interaction of different types of variables, and not just one exclusive set of variables, determines the level of household energy consumption.
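
    Testing whether an added set of variables is jointly significant is a nested-model comparison; a minimal sketch on simulated data (the survey data are not reproduced here, and the variable names are placeholders) is:

    ```python
    import numpy as np
    from scipy.stats import f as f_dist

    rng = np.random.default_rng(5)
    n = 300
    econ = rng.normal(size=(n, 2))     # stand-in for the economic variable set
    weather = rng.normal(size=(n, 2))  # stand-in for the added set under test
    y = econ @ [1.0, -0.5] + weather @ [0.8, 0.3] + rng.normal(size=n)

    def rss(X, y):
        """Residual sum of squares and parameter count for an OLS fit with intercept."""
        X = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        return r @ r, X.shape[1]

    rss0, p0 = rss(econ, y)                               # restricted model
    rss1, p1 = rss(np.column_stack([econ, weather]), y)   # expanded model
    F = ((rss0 - rss1) / (p1 - p0)) / (rss1 / (n - p1))
    p_value = f_dist.sf(F, p1 - p0, n - p1)
    print(p_value < 0.05)  # the added set is jointly significant here
    ```

    Repeating this for each successive variable set mirrors the sequential model-building the report describes.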

  13. "Suicide shall cease to be a crime": suicide and undetermined death trends 1970-2000 before and after the decriminalization of suicide in Ireland 1993.

    PubMed

    Osman, Mugtaba; Parnell, Andrew C; Haley, Clifford

    2017-02-01

    Suicide is criminalized in more than 100 countries around the world. A dearth of research exists into the effect of suicide legislation on suicide rates, and available statistics are mixed. This study investigates 10,353 suicide deaths in Ireland that took place between 1970 and 2000. Irish 1970-2000 annual suicide data were obtained from the Central Statistics Office and modelled via a negative binomial regression approach. We examined the effect of suicide legislation on different age groups and on both sexes. We used Bonferroni correction for multiple modelling. Statistical analysis was performed using the R statistical package version 3.1.2. The coefficient for the effect of the suicide act on overall suicide deaths was -9.094 (95 % confidence interval (CI) -34.086 to 15.899), statistically non-significant (p = 0.476). The coefficient for the effect of the suicide act on undetermined deaths was statistically significant (p < 0.001) and was estimated to be -644.4 (95 % CI -818.6 to -469.9). The results of our study indicate that decriminalization of suicide is not associated with a significant increase in subsequent suicide deaths. However, undetermined death verdict rates dropped significantly following decriminalization.
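
    A negative binomial regression with a pre/post-law indicator, fit by maximum likelihood, can be sketched as below. The counts are simulated with no true law effect (they are not the Irish series), so the fitted effect should be near zero:

    ```python
    import numpy as np
    from scipy.stats import nbinom
    from scipy.optimize import minimize

    rng = np.random.default_rng(3)
    post = np.repeat([0, 1], 15)            # 15 years before, 15 after the law
    mu_true = np.exp(5.8 + 0.0 * post)      # simulated truth: no law effect
    r_true = 50.0                           # dispersion
    y = rng.negative_binomial(r_true, r_true / (r_true + mu_true))

    def neg_loglik(theta):
        b0, b1, log_r = theta
        mu = np.exp(b0 + b1 * post)
        r = np.exp(log_r)
        # scipy's nbinom(n, p) has mean n*(1-p)/p; p = r/(r+mu) gives mean mu.
        return -np.sum(nbinom.logpmf(y, r, r / (r + mu)))

    fit = minimize(neg_loglik, x0=[5.0, 0.0, 1.0], method="Nelder-Mead")
    b0, b1, _ = fit.x
    print(round(b1, 3))  # log-rate shift after the law; close to zero here
    ```

    The study's coefficient for the suicide act plays the role of `b1`, with the negative binomial absorbing the year-to-year overdispersion a Poisson model would miss.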

  14. Conducting Multilevel Analyses in Medical Education

    ERIC Educational Resources Information Center

    Zyphur, Michael J.; Kaplan, Seth A.; Islam, Gazi; Barsky, Adam P.; Franklin, Michael S.

    2008-01-01

    A significant body of education literature has begun using multilevel statistical models to examine data that reside at multiple levels of analysis. In order to provide a primer for medical education researchers, the current work gives a brief overview of some issues associated with multilevel statistical modeling. To provide an example of this…

  15. A Comparative Evaluation of Mixed Dentition Analysis on Reliability of Cone Beam Computed Tomography Image Compared to Plaster Model.

    PubMed

    Gowd, Snigdha; Shankar, T; Dash, Samarendra; Sahoo, Nivedita; Chatterjee, Suravi; Mohanty, Pritam

    2017-01-01

    The aim of the study was to evaluate the reliability of cone beam computed tomography (CBCT) obtained images over plaster models for the assessment of mixed dentition analysis. Thirty CBCT-derived images and thirty plaster models were retrieved from the dental archives, and Moyer's and Tanaka-Johnston analyses were performed. The data obtained were interpreted and analyzed statistically using SPSS 10.0/PC (SPSS Inc., Chicago, IL, USA). Descriptive and analytical analysis along with Student's t-test was performed to qualitatively evaluate the data, and P < 0.05 was considered statistically significant. Statistically significant results were obtained on comparison of data between CBCT-derived images and plaster models; the mean for Moyer's analysis in the left and right lower arch for CBCT and plaster models was 21.2 mm, 21.1 mm and 22.5 mm, 22.5 mm, respectively. CBCT-derived images were less reliable than data obtained directly from plaster models for mixed dentition analysis.

  16. On the Determination of Poisson Statistics for Haystack Radar Observations of Orbital Debris

    NASA Technical Reports Server (NTRS)

    Stokely, Christopher L.; Benbrook, James R.; Horstman, Matt

    2007-01-01

    A convenient and powerful method is used to determine if radar detections of orbital debris are observed according to Poisson statistics. This is done by analyzing the time interval between detection events. For Poisson statistics, the probability distribution of the time interval between events is shown to be an exponential distribution. This distribution is a special case of the Erlang distribution that is used in estimating traffic loads on telecommunication networks. Poisson statistics form the basis of many orbital debris models but the statistical basis of these models has not been clearly demonstrated empirically until now. Interestingly, during the fiscal year 2003 observations with the Haystack radar in a fixed staring mode, there are no statistically significant deviations observed from that expected with Poisson statistics, either independent or dependent of altitude or inclination. One would potentially expect some significant clustering of events in time as a result of satellite breakups, but the presence of Poisson statistics indicates that such debris disperse rapidly with respect to Haystack's very narrow radar beam. An exception to Poisson statistics is observed in the months following the intentional breakup of the Fengyun satellite in January 2007.
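
    The inter-event-time check described above can be sketched directly: for a Poisson process, waiting times between detections are exponential, which a Kolmogorov-Smirnov test can assess. The event times below are simulated, not Haystack data:

    ```python
    import numpy as np
    from scipy.stats import kstest

    rng = np.random.default_rng(7)
    # Conditional on the number of events, Poisson arrival times are uniform
    # over the observation window, so this simulates a Poisson-like record.
    event_times = np.sort(rng.uniform(0, 1000, size=500))
    gaps = np.diff(event_times)

    # Compare the gaps against an exponential with the observed mean gap.
    stat, p = kstest(gaps, "expon", args=(0, gaps.mean()))
    print(round(p, 3))  # large p: no significant deviation from Poisson
    ```

    Applied to real detections, clustering (e.g. after a breakup) would show up as an excess of short gaps and a small p-value; note that estimating the rate from the same data makes this version of the test only approximate.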

  17. Modeling Soot Oxidation and Gasification with Bayesian Statistics

    DOE PAGES

    Josephson, Alexander J.; Gaffin, Neal D.; Smith, Sean T.; ...

    2017-08-22

    This paper presents a statistical method for model calibration using data collected from literature. The method is used to calibrate parameters for global models of soot consumption in combustion systems. This consumption is broken into two different submodels: first for oxidation, where soot particles are attacked by certain oxidizing agents; second for gasification, where soot particles are attacked by H2O or CO2 molecules. Rate data were collected from 19 studies in the literature and evaluated using Bayesian statistics to calibrate the model parameters. Bayesian statistics are valued in their ability to quantify uncertainty in modeling. The calibrated consumption model with quantified uncertainty is presented here along with a discussion of associated implications. The oxidation results are found to be consistent with previous studies. Significant variation is found in the CO2 gasification rates.

  18. Modeling Soot Oxidation and Gasification with Bayesian Statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Josephson, Alexander J.; Gaffin, Neal D.; Smith, Sean T.

    This paper presents a statistical method for model calibration using data collected from literature. The method is used to calibrate parameters for global models of soot consumption in combustion systems. This consumption is broken into two different submodels: first for oxidation, where soot particles are attacked by certain oxidizing agents; second for gasification, where soot particles are attacked by H2O or CO2 molecules. Rate data were collected from 19 studies in the literature and evaluated using Bayesian statistics to calibrate the model parameters. Bayesian statistics are valued in their ability to quantify uncertainty in modeling. The calibrated consumption model with quantified uncertainty is presented here along with a discussion of associated implications. The oxidation results are found to be consistent with previous studies. Significant variation is found in the CO2 gasification rates.

  19. The epistemology of mathematical and statistical modeling: a quiet methodological revolution.

    PubMed

    Rodgers, Joseph Lee

    2010-01-01

    A quiet methodological revolution, a modeling revolution, has occurred over the past several decades, almost without discussion. In contrast, the 20th century ended with contentious argument over the utility of null hypothesis significance testing (NHST). The NHST controversy may have been at least partially irrelevant, because in certain ways the modeling revolution obviated the NHST argument. I begin with a history of NHST and modeling and their relation to one another. Next, I define and illustrate principles involved in developing and evaluating mathematical models. Following, I discuss the difference between using statistical procedures within a rule-based framework and building mathematical models from a scientific epistemology. Only the former is treated carefully in most psychology graduate training. The pedagogical implications of this imbalance and the revised pedagogy required to account for the modeling revolution are described. To conclude, I discuss how attention to modeling implies shifting statistical practice in certain progressive ways. The epistemological basis of statistics has moved away from being a set of procedures, applied mechanistically, and moved toward building and evaluating statistical and scientific models. Copyright 2009 APA, all rights reserved.

  20. Use of statistical and neural net approaches in predicting toxicity of chemicals.

    PubMed

    Basak, S C; Grunwald, G D; Gute, B D; Balasubramanian, K; Opitz, D

    2000-01-01

    Hierarchical quantitative structure-activity relationships (H-QSAR) have been developed as a new approach in constructing models for estimating physicochemical, biomedicinal, and toxicological properties of interest. This approach uses increasingly more complex molecular descriptors in a graduated approach to model building. In this study, statistical and neural network methods have been applied to the development of H-QSAR models for estimating the acute aquatic toxicity (LC50) of 69 benzene derivatives to Pimephales promelas (fathead minnow). Topostructural, topochemical, geometrical, and quantum chemical indices were used as the four levels of the hierarchical method. It is clear from both the statistical and neural network models that topostructural indices alone cannot adequately model this set of congeneric chemicals. Not surprisingly, topochemical indices greatly increase the predictive power of both statistical and neural network models. Quantum chemical indices also add significantly to the modeling of this set of acute aquatic toxicity data.

  1. Nature's style: Naturally trendy

    USGS Publications Warehouse

    Cohn, T.A.; Lins, H.F.

    2005-01-01

    Hydroclimatological time series often exhibit trends. While trend magnitude can be determined with little ambiguity, the corresponding statistical significance, sometimes cited to bolster scientific and political argument, is less certain because significance depends critically on the null hypothesis which in turn reflects subjective notions about what one expects to see. We consider statistical trend tests of hydroclimatological data in the presence of long-term persistence (LTP). Monte Carlo experiments employing FARIMA models indicate that trend tests which fail to consider LTP greatly overstate the statistical significance of observed trends when LTP is present. A new test is presented that avoids this problem. From a practical standpoint, however, it may be preferable to acknowledge that the concept of statistical significance is meaningless when discussing poorly understood systems.

  2. Nature's style: Naturally trendy

    NASA Astrophysics Data System (ADS)

    Cohn, Timothy A.; Lins, Harry F.

    2005-12-01

    Hydroclimatological time series often exhibit trends. While trend magnitude can be determined with little ambiguity, the corresponding statistical significance, sometimes cited to bolster scientific and political argument, is less certain because significance depends critically on the null hypothesis which in turn reflects subjective notions about what one expects to see. We consider statistical trend tests of hydroclimatological data in the presence of long-term persistence (LTP). Monte Carlo experiments employing FARIMA models indicate that trend tests which fail to consider LTP greatly overstate the statistical significance of observed trends when LTP is present. A new test is presented that avoids this problem. From a practical standpoint, however, it may be preferable to acknowledge that the concept of statistical significance is meaningless when discussing poorly understood systems.
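
    The effect described above can be demonstrated with a short simulation: a naive trend test applied to persistent noise (here AR(1), a simple stand-in for full LTP/FARIMA structure) rejects the no-trend null far more often than the nominal 5% level:

    ```python
    import numpy as np
    from scipy.stats import linregress

    rng = np.random.default_rng(42)
    n, reps, phi = 100, 500, 0.8  # series length, replicates, AR(1) persistence
    t = np.arange(n)

    false_positives = 0
    for _ in range(reps):
        # Trend-free but persistent series.
        x = np.zeros(n)
        for i in range(1, n):
            x[i] = phi * x[i - 1] + rng.normal()
        # Naive OLS trend test that ignores the persistence.
        if linregress(t, x).pvalue < 0.05:
            false_positives += 1

    print(false_positives / reps)  # far above the nominal 0.05
    ```

    A test built for the persistent null (as the paper proposes for LTP) would bring this rejection rate back toward 5%.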

  3. Statistical analysis of effective singular values in matrix rank determination

    NASA Technical Reports Server (NTRS)

    Konstantinides, Konstantinos; Yao, Kung

    1988-01-01

    A major problem in using SVD (singular-value decomposition) as a tool in determining the effective rank of a perturbed matrix is that of distinguishing between significantly small and significantly large singular values. To this end, confidence regions are derived for the perturbed singular values of matrices with noisy observation data. The analysis is based on the theories of perturbations of singular values and statistical significance testing. Threshold bounds for perturbation due to finite-precision and i.i.d. random models are evaluated. In random models, the threshold bounds depend on the dimension of the matrix, the noise variance, and a predefined statistical level of significance. Results applied to the problem of determining the effective order of a linear autoregressive system from the approximate rank of a sample autocorrelation matrix are considered. Various numerical examples illustrating the usefulness of these bounds and comparisons to other previously known approaches are given.
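
    The idea can be sketched on a noisy low-rank matrix: singular values above a noise-level threshold are counted toward the effective rank. The threshold form below (a 1.1 safety factor times sigma*(sqrt(m)+sqrt(n))) is a rough heuristic, not the paper's confidence regions:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    m, n, true_rank, sigma = 50, 40, 3, 0.01
    # Rank-3 matrix plus i.i.d. Gaussian observation noise.
    A = rng.normal(size=(m, true_rank)) @ rng.normal(size=(true_rank, n))
    noisy = A + sigma * rng.normal(size=(m, n))

    s = np.linalg.svd(noisy, compute_uv=False)
    # Largest singular value of an m x n Gaussian noise matrix is roughly
    # sigma * (sqrt(m) + sqrt(n)); keep singular values clearly above that.
    threshold = 1.1 * sigma * (np.sqrt(m) + np.sqrt(n))
    effective_rank = int(np.sum(s > threshold))
    print(effective_rank)
    ```

    With the noise variance known, the signal singular values sit far above the threshold and the true rank is recovered; the paper's contribution is making this cutoff a calibrated significance test rather than a heuristic.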

  4. Statistical Models for Predicting Automobile Driving Postures for Men and Women Including Effects of Age.

    PubMed

    Park, Jangwoon; Ebert, Sheila M; Reed, Matthew P; Hallman, Jason J

    2016-03-01

    Previously published statistical models of driving posture have been effective for vehicle design but have not taken into account the effects of age. The present study developed new statistical models for predicting driving posture. Driving postures of 90 U.S. drivers with a wide range of age and body size were measured in a laboratory mockup under nine package conditions. Posture-prediction models for female and male drivers were separately developed by employing a stepwise regression technique using age, body dimensions, vehicle package conditions, and two-way interactions, among other variables. Driving posture was significantly associated with age, and the effects of other variables depended on age. A set of posture-prediction models is presented for women and men. The results are compared with a previously developed model. The present study is the first study of driver posture to include a large cohort of older drivers and the first to report a significant effect of age. The posture-prediction models can be used to position computational human models or crash-test dummies for vehicle design and assessment. © 2015, Human Factors and Ergonomics Society.

  5. Enhanced understanding of the relationship between erection and satisfaction in ED treatment: application of a longitudinal mediation model.

    PubMed

    Bushmakin, A G; Cappelleri, J C; Symonds, T; Stecher, V J

    2014-01-01

    To apportion the direct effect and the indirect effect (through erections) that sildenafil (vs placebo) has on individual satisfaction and couple satisfaction over time, longitudinal mediation modeling was applied to outcomes on the Sexual Experience Questionnaire. The model included data from weeks 4 and 10 (double-blind phase) and week 16 (open-label phase) of a controlled study. Data from 167 patients with erectile dysfunction (ED) were available for analysis. Estimation of statistical significance was based on bootstrap simulations, which allowed inferences at and between time points. Percentages (and corresponding 95% confidence intervals) for direct and indirect effects of treatment were calculated using the model. For the individual satisfaction and couple satisfaction domains, direct treatment effects were negligible (not statistically significant) whereas indirect treatment effects via the erection domain represented >90% of the treatment effects (statistically significant). Week 4 vs week 10 percentages of direct and indirect effects were not statistically different, indicating that the mediation effects are longitudinally invariant. As there was no placebo arm in the open-label phase, mediation effects at week 16 were not estimable. In conclusion, erection has a crucial role as a mediator in restoring individual satisfaction and couple satisfaction in men with ED treated with sildenafil.
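
    The direct/indirect decomposition with bootstrap inference can be sketched on simulated data with a treatment -> mediator -> outcome chain (mirroring treatment -> erection -> satisfaction; the data and effect sizes below are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    treat = rng.integers(0, 2, n)                    # 0 = placebo, 1 = drug
    mediator = 1.0 * treat + rng.normal(size=n)      # treatment improves mediator
    outcome = 0.8 * mediator + 0.0 * treat + rng.normal(size=n)  # no direct effect

    def indirect_effect(t, m, y):
        """Product-of-coefficients indirect effect a*b."""
        a = np.polyfit(t, m, 1)[0]                   # t -> m slope
        # m -> y slope adjusting for t (two-predictor least squares).
        X = np.column_stack([np.ones_like(t), t, m])
        b = np.linalg.lstsq(X, y, rcond=None)[0][2]
        return a * b

    # Bootstrap a confidence interval for the indirect effect.
    boot = []
    for _ in range(500):
        idx = rng.integers(0, n, n)
        boot.append(indirect_effect(treat[idx], mediator[idx], outcome[idx]))
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(lo > 0)  # indirect effect significantly positive
    ```

    A bootstrap interval excluding zero for the indirect effect, alongside a negligible direct effect, is the pattern the study reports for satisfaction mediated through erections.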

  6. The Effects of Selection Strategies for Bivariate Loglinear Smoothing Models on NEAT Equating Functions

    ERIC Educational Resources Information Center

    Moses, Tim; Holland, Paul W.

    2010-01-01

    In this study, eight statistical strategies were evaluated for selecting the parameterizations of loglinear models for smoothing the bivariate test score distributions used in nonequivalent groups with anchor test (NEAT) equating. Four of the strategies were based on significance tests of chi-square statistics (Likelihood Ratio, Pearson,…

  7. Adapting an Agent-Based Model of Socio-Technical Systems to Analyze Security Failures

    DTIC Science & Technology

    2016-10-17

    total number of non-blackouts differed from the total number in the baseline data to a statistically significant extent with a p-value < 0.0003 ... I. Nikolic, and Z. Lukszo, Eds., Agent-based modelling of socio-technical systems. Springer Science & Business Media, 2013, vol. 9. [12] A. P. Shaw

  8. [A strategic family medicine model for controlling borderline and mild arterial hypertension].

    PubMed

    Uzcátegui Contreras, D; Granadillo Vera, D; Salinas, P J; Alvarez, N

    1999-10-31

    To investigate the relationship between the patient and his/her family as a non-pharmacological factor in blood hypertension; to determine whether a low-sodium, low-calorie, low-fat, low-cholesterol diet decreases blood pressure; to determine whether physical exercise by the patient and his/her family helps reduce hypertension; to observe whether muscle-relaxation therapy helps reduce hypertension; to evaluate, in the sample of families, the experience of each member as well as their suggestions and complaints about the programme; and to design a strategic model for ambulatory control of blood pressure. Controlled, descriptive, non-randomized, prospective intervention study. SETTING: Primary care. Study group of 10 patients, 10 wives, and 12 children; control group of 10 patients without family members. Both groups (study and control) met every 15 days for 6 months according to an established schedule. The meetings included talks, pamphlets, physical exercises, and muscle-relaxation therapy, all concerning blood hypertension. Questionnaires were administered before and after each activity. MEASUREMENTS AND MAIN RESULTS: Both groups (study and control) showed a statistically significant (p < 0.01) reduction in weight. Systolic blood pressure decreased in both the seated and standing positions in the study group (a statistically significant difference) but not in the control group, although there was a non-significant decrease of 1.5 mmHg in the seated position. Diastolic pressure decreased significantly in the study group in both seated and standing positions, but not in the control group. Total cholesterol showed a statistically significant decrease in the study group but not in the control group. HDL-C showed a statistically significant reduction in the study group; in the control group there was a decrease that was not statistically significant. Triglycerides did not decrease significantly in either group (study or control).

  9. Statistical validation of normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
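    The permutation-testing idea recommended above can be sketched as follows: break the predictor-outcome link by shuffling outcome labels, then compare observed model performance (here a simple AUC) with its permutation null. The data and the single-score "model" are synthetic stand-ins, not the authors' LASSO/NTCP pipeline.

```python
# Permutation test of predictive performance (AUC) on synthetic data.
import random

random.seed(0)
n = 120
outcome = [random.random() < 0.4 for _ in range(n)]             # binary complication
score = [random.gauss(1.5 if y else 0.0, 1.5) for y in outcome] # mildly predictive score

def auc(scores, labels):
    """Probability that a random positive outscores a random negative (ties = 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

observed = auc(score, outcome)

perm_aucs = []
labels = list(outcome)
for _ in range(500):
    random.shuffle(labels)                  # break the score/outcome association
    perm_aucs.append(auc(score, labels))

p_value = sum(a >= observed for a in perm_aucs) / len(perm_aucs)
print(f"AUC = {observed:.2f}, permutation p = {p_value:.3f}")
```

A small permutation p-value indicates that the model's performance exceeds what label-shuffling chance alone would produce.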

  10. Eruption patterns of the chilean volcanoes Villarrica, Llaima, and Tupungatito

    NASA Astrophysics Data System (ADS)

    Muñoz, Miguel

    1983-09-01

    The historical eruption records of three Chilean volcanoes have been subjected to many statistical tests, and none has been found to differ significantly from random, or Poissonian, behaviour. The statistical analysis shows rough conformity with the descriptions determined from the eruption rate functions. It is possible that a constant eruption rate describes the activity of Villarrica; Llaima and Tupungatito present complex eruption rate patterns that appear, however, to have no statistical significance. Questions related to loading and extinction processes and to the existence of shallow secondary magma chambers to which magma is supplied from a deeper system are also addressed. The analysis and the computation of the serial correlation coefficients indicate that the three series may be regarded as stationary renewal processes. None of the test statistics indicates rejection of the Poisson hypothesis at a level less than 5%, but the coefficient of variation for the eruption series at Llaima is significantly different from the value expected for a Poisson process. Also, the estimates of the normalized spectrum of the counting process for the three series suggest a departure from the random model, but the deviations are not found to be significant at the 5% level. Kolmogorov-Smirnov and chi-squared test statistics, applied directly to ascertain the probability P with which the random Poisson model fits the data, indicate significant agreement in the case of Villarrica (P = 0.59) and Tupungatito (P = 0.3). Even though the P-value for Llaima is a marginally significant 0.1 (equivalent to rejecting the Poisson model at the 90% confidence level), the series suggests that nonrandom features may be present in the eruptive activity of this volcano.
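    Two of the standard diagnostics used above can be illustrated briefly: for a Poisson process the interevent times are exponential, so their coefficient of variation should be near 1 and a Kolmogorov-Smirnov statistic against a fitted exponential should be small. The dates below are simulated, not the actual eruption catalogs, and because the rate is estimated from the data the nominal KS critical value is only approximate (a Lilliefors-type correction would be needed for a strict test).

```python
# Poisson-model diagnostics for simulated eruption interevent times.
import math, random

random.seed(7)
# simulate 60 interevent gaps from a Poisson process (rate ~0.5 per year)
gaps = [random.expovariate(0.5) for _ in range(60)]

mean = sum(gaps) / len(gaps)
var = sum((g - mean) ** 2 for g in gaps) / (len(gaps) - 1)
cv = math.sqrt(var) / mean                  # ~1 for a Poisson process

# one-sample Kolmogorov-Smirnov statistic vs. Exp(1/mean)
xs = sorted(gaps)
n = len(xs)
D = max(max((i + 1) / n - (1 - math.exp(-x / mean)),
            (1 - math.exp(-x / mean)) - i / n)
        for i, x in enumerate(xs))
# rough 5% critical value for large n (rate known, not fitted): 1.36 / sqrt(n)
print(f"CV = {cv:.2f}, KS D = {D:.3f}, crit ~ {1.36 / math.sqrt(n):.3f}")
```

Clustered activity would show up as CV well above 1 and a KS statistic exceeding the critical value.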

  11. A statistical model including age to predict passenger postures in the rear seats of automobiles.

    PubMed

    Park, Jangwoon; Ebert, Sheila M; Reed, Matthew P; Hallman, Jason J

    2016-06-01

    Few statistical models of rear seat passenger posture have been published, and none has taken into account the effects of occupant age. This study developed new statistical models for predicting passenger postures in the rear seats of automobiles. Postures of 89 adults with a wide range of age and body size were measured in a laboratory mock-up in seven seat configurations. Posture-prediction models for female and male passengers were separately developed by stepwise regression using age, body dimensions, seat configurations and two-way interactions as potential predictors. Passenger posture was significantly associated with age, and the effects of other two-way interaction variables depended on age. A set of posture-prediction models is presented for women and men, and the prediction results are compared with previously published models. This study is the first study of passenger posture to include a large cohort of older passengers and the first to report a significant effect of age for adults. The presented models can be used to position computational and physical human models for vehicle design and assessment. Practitioner Summary: The significant effects of age, body dimensions and seat configuration on rear seat passenger posture were identified. The models can be used to accurately position computational human models or crash test dummies for older passengers in known rear seat configurations.

  12. An application of seasonal ARIMA models on group commodities to forecast Philippine merchandise exports performance

    NASA Astrophysics Data System (ADS)

    Natividad, Gina May R.; Cawiding, Olive R.; Addawe, Rizavel C.

    2017-11-01

    The increase in the merchandise exports of the country offers information about the Philippines' trading role within the global economy. Merchandise exports statistics are used to monitor the country's overall production that is consumed overseas. This paper investigates the comparison between two models obtained by a) clustering the commodity groups into two based on their proportional contributions to the total exports, and b) treating only the total exports. Different seasonal autoregressive integrated moving average (SARIMA) models were then developed for the clustered commodities and for the total exports based on the monthly merchandise exports of the Philippines from 2011 to 2016. The data set used in this study was retrieved from the Philippine Statistics Authority (PSA), which is the central statistical authority in the country responsible for primary data collection. The difference between the means of the forecasts produced was then tested for significance at the 0.05 level. The result indicates that there is a significant difference between the means of the forecasts of the two models. Moreover, upon a comparison of the root mean square error (RMSE) and mean absolute error (MAE) of the models, it was found that the models used for the clustered groups outperform the model for the total exports.
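    The RMSE/MAE comparison used to rank the two models above amounts to the following computation; the numbers here are made up for illustration and do not reproduce the study's SARIMA forecasts.

```python
# Forecast-accuracy comparison by RMSE and MAE, with illustrative numbers.
import math

actual  = [102.0, 98.0, 110.0, 105.0, 99.0, 115.0]
model_a = [100.0, 99.0, 108.0, 106.0, 101.0, 112.0]   # e.g. clustered-commodity model
model_b = [ 95.0, 104.0, 103.0, 111.0, 94.0, 120.0]   # e.g. total-exports model

def rmse(f, a):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(f, a)) / len(a))

def mae(f, a):
    return sum(abs(x - y) for x, y in zip(f, a)) / len(a)

for name, f in [("A", model_a), ("B", model_b)]:
    print(f"model {name}: RMSE = {rmse(f, actual):.2f}, MAE = {mae(f, actual):.2f}")
```

The model with the smaller RMSE and MAE (here model A) would be preferred, matching the study's selection criterion.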

  13. The potential of composite cognitive scores for tracking progression in Huntington's disease.

    PubMed

    Jones, Rebecca; Stout, Julie C; Labuschagne, Izelle; Say, Miranda; Justo, Damian; Coleman, Allison; Dumas, Eve M; Hart, Ellen; Owen, Gail; Durr, Alexandra; Leavitt, Blair R; Roos, Raymund; O'Regan, Alison; Langbehn, Doug; Tabrizi, Sarah J; Frost, Chris

    2014-01-01

    Composite scores derived from joint statistical modelling of individual risk factors are widely used to identify individuals who are at increased risk of developing disease or of faster disease progression. We investigated the ability of composite measures developed using statistical models to differentiate progressive cognitive deterioration in Huntington's disease (HD) from natural decline in healthy controls. Using longitudinal data from TRACK-HD, the optimal combinations of quantitative cognitive measures to differentiate premanifest and early stage HD individuals, respectively, from controls were determined using logistic regression. Composite scores were calculated from the parameters of each statistical model. Linear regression models were used to calculate effect sizes (ES) quantifying the difference in longitudinal change over 24 months between the premanifest and early stage HD groups, respectively, and controls. ES for the composites were compared with ES for individual cognitive outcomes and other measures used in HD research. The 0.632 bootstrap was used to eliminate the biases which result from developing and testing models in the same sample. In early HD, the composite score from the HD change prediction model produced an ES for difference in rate of 24-month change relative to controls of 1.14 (95% CI: 0.90 to 1.39), larger than the ES for any individual cognitive outcome and for UHDRS Total Motor Score and Total Functional Capacity. In addition, this composite gave a statistically significant difference in rate of change in premanifest HD compared to controls over 24 months (ES: 0.24; 95% CI: 0.04 to 0.44), even though none of the individual cognitive outcomes produced statistically significant ES over this period. Composite scores developed using appropriate statistical modelling techniques have the potential to materially reduce required sample sizes for randomised controlled trials.

  14. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    PubMed

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for a binary outcome was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, this likelihood function offers an alternative, indirect way to identify the potential cluster, with the test statistic taken as the extreme value of the likelihood function. As in Kulldorff's method, we adopt a Monte Carlo test for significance. Both methods were applied to detect spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters were identical. A simulation on independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise Kulldorff's statistics are superior.
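    The building block of the hypergeometric formulation above is the hypergeometric probability of observing a given case count inside a candidate window. A minimal sketch follows (toy numbers, not the Sichuan data; the full scan statistic maximizes a likelihood over all candidate windows and assesses significance by Monte Carlo replication).

```python
# Hypergeometric probability of the case count inside one candidate window.
from math import comb

def hypergeom_pmf(c, n, C, N):
    """P(X = c) when drawing a window of n people from N, of whom C are cases."""
    return comb(C, c) * comb(N - C, n - c) / comb(N, n)

# toy window: 500 of 10,000 people; 30 of the 120 total cases fall inside it
p = hypergeom_pmf(30, 500, 120, 10_000)
expected = 120 * 500 / 10_000            # ~6 cases expected under spatial homogeneity
print(f"P(X = 30) = {p:.3e}, expected count = {expected:.1f}")
```

An observed count far above its hypergeometric expectation flags the window as a potential cluster, to be confirmed against the Monte Carlo null distribution.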

  15. Statistically significant relational data mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann

    This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor these models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.

  16. Comparison of Histograms for Use in Cloud Observation and Modeling

    NASA Technical Reports Server (NTRS)

    Green, Lisa; Xu, Kuan-Man

    2005-01-01

    Cloud observation and cloud modeling data can be presented in histograms for each characteristic to be measured. Combining information from single-cloud histograms yields a summary histogram. Summary histograms can be compared to each other to reach conclusions about the behavior of an ensemble of clouds in different places at different times or about the accuracy of a particular cloud model. As in any scientific comparison, it is necessary to decide whether any apparent differences are statistically significant. The usual methods of deciding statistical significance when comparing histograms do not apply in this case because they assume independent data. Thus, a new method is necessary. The proposed method uses the Euclidean distance metric and bootstrapping to calculate the significance level.
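    The proposed method can be sketched as follows: compute the Euclidean distance between two summary histograms, then judge it against a bootstrap null distribution obtained by resampling whole clouds (so that within-cloud dependence is preserved). The single-cloud histograms below are synthetic, not the observational or model data.

```python
# Euclidean distance between summary histograms, with a cloud-level bootstrap.
import math, random

random.seed(3)

def rand_hist(bias):
    """A synthetic normalized 5-bin single-cloud histogram."""
    w = [random.random() + bias * i for i in range(5)]
    s = sum(w)
    return [x / s for x in w]

clouds_a = [rand_hist(0.0) for _ in range(40)]   # e.g. observations
clouds_b = [rand_hist(0.3) for _ in range(40)]   # e.g. model output, shifted

def summary(clouds):
    n = len(clouds)
    return [sum(h[i] for h in clouds) / n for i in range(5)]

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

observed = dist(summary(clouds_a), summary(clouds_b))

# bootstrap under "no difference": pool the clouds, resample two groups
pool = clouds_a + clouds_b
null = []
for _ in range(500):
    ra = [random.choice(pool) for _ in range(40)]
    rb = [random.choice(pool) for _ in range(40)]
    null.append(dist(summary(ra), summary(rb)))

p_value = sum(d >= observed for d in null) / len(null)
print(f"distance = {observed:.3f}, bootstrap p = {p_value:.3f}")
```

Resampling whole clouds rather than individual measurements is what avoids the independence assumption that invalidates the usual histogram tests.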

  17. Flexible statistical modelling detects clinical functional magnetic resonance imaging activation in partially compliant subjects.

    PubMed

    Waites, Anthony B; Mannfolk, Peter; Shaw, Marnie E; Olsrud, Johan; Jackson, Graeme D

    2007-02-01

    Clinical functional magnetic resonance imaging (fMRI) occasionally fails to detect significant activation, often due to variability in task performance. The present study seeks to test whether a more flexible statistical analysis can better detect activation, by accounting for variance associated with variable compliance to the task over time. Experimental results and simulated data both confirm that even at 80% compliance to the task, such a flexible model outperforms standard statistical analysis when assessed using the extent of activation (experimental data), goodness of fit (experimental data), and area under the operator characteristic curve (simulated data). Furthermore, retrospective examination of 14 clinical fMRI examinations reveals that in patients where the standard statistical approach yields activation, there is a measurable gain in model performance in adopting the flexible statistical model, with little or no penalty in lost sensitivity. This indicates that a flexible model should be considered, particularly for clinical patients who may have difficulty complying fully with the study task.

  18. TSP Symposium 2012 Proceedings

    DTIC Science & Technology

    2012-11-01

    and Statistical Model 78 7.3 Analysis and Results 79 7.4 Threats to Validity and Limitations 85 7.5 Conclusions 86 7.6 Acknowledgments 87 7.7...Table 12: Overall Statistics of the Experiment 32 Table 13: Results of Pairwise ANOVA Analysis, Highlighting Statistically Significant Differences...we calculated the percentage of defects injected. The distribution statistics are shown in Table 2. Table 2: Mean Lower, Upper Confidence Interval

  19. Modelling the effect of structural QSAR parameters on skin penetration using genetic programming

    NASA Astrophysics Data System (ADS)

    Chung, K. K.; Do, D. Q.

    2010-09-01

    In order to model relationships between chemical structures and biological effects in quantitative structure-activity relationship (QSAR) data, an alternative artificial-intelligence technique, genetic programming (GP), was investigated and compared to the traditional statistical approach. GP, whose primary advantage is that it generates explicit mathematical equations, was employed to model QSAR data and to identify the most important molecular descriptors. The models produced by GP agreed with the statistical results, and ANOVA showed that the most predictive GP models were significantly improved compared to the statistical models. Recently, artificial intelligence techniques have been applied widely to analyse QSAR data. With its capability of generating mathematical equations, GP can be considered an effective and efficient method for modelling QSAR data.

  20. Statistical characteristics of trajectories of diamagnetic unicellular organisms in a magnetic field.

    PubMed

    Gorobets, Yu I; Gorobets, O Yu

    2015-01-01

    A statistical model is proposed in this paper for describing the orientation of trajectories of unicellular diamagnetic organisms in a magnetic field. A statistical parameter, the effective energy, is calculated on the basis of this model. The resulting effective energy is a statistical characteristic of the trajectories of diamagnetic microorganisms in a magnetic field connected with their metabolism. The statistical model is applicable when the energy of the thermal motion of the bacteria is negligible in comparison with their energy in a magnetic field and the bacteria manifest significant "active random movement", i.e. randomizing motion of non-thermal nature, for example, movement by means of flagella. The energy of this randomizing active self-motion is characterized by a new statistical parameter for biological objects, which replaces the energy of randomizing thermal motion in the calculation of the statistical distribution. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. A consistent framework for Horton regression statistics that leads to a modified Hack's law

    USGS Publications Warehouse

    Furey, P.R.; Troutman, B.M.

    2008-01-01

    A statistical framework is introduced that resolves important problems with the interpretation and use of traditional Horton regression statistics. The framework is based on a univariate regression model that leads to an alternative expression for the Horton ratio, connects Horton regression statistics to distributional simple scaling, and improves the accuracy in estimating Horton plot parameters. The model is used to examine data for drainage area A and mainstream length L from two groups of basins located in different physiographic settings. Results show that confidence intervals for the Horton plot regression statistics are quite wide. Nonetheless, an analysis of covariance shows that regression intercepts, but not regression slopes, can be used to distinguish between basin groups. The univariate model is generalized to include n > 1 dependent variables. For the case where the dependent variables represent ln A and ln L, the generalized model performs somewhat better at distinguishing between basin groups than two separate univariate models. The generalized model leads to a modification of Hack's law where L depends on both A and Strahler order ω. Data show that ω plays a statistically significant role in the modified Hack's law expression. © 2008 Elsevier B.V.
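    For context, the classical Hack's law that the generalized model above modifies is L = c·A^h, commonly fitted by log-log regression. The sketch below recovers the exponent from synthetic basins generated with h = 0.57; it does not use the paper's basin data and omits the Strahler-order term of the modified law.

```python
# Log-log regression fit of Hack's law L = c * A^h on synthetic basins.
import math, random

random.seed(5)
basins = []
for _ in range(50):
    logA = random.uniform(0, 6)                        # ln drainage area
    logL = 0.2 + 0.57 * logA + random.gauss(0, 0.1)    # ln mainstream length
    basins.append((logA, logL))

xs = [a for a, _ in basins]
ys = [l for _, l in basins]
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
h = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
c = math.exp(my - h * mx)
print(f"estimated Hack exponent h = {h:.3f}, coefficient c = {c:.3f}")
```

The modified law adds Strahler order as a second predictor, which in this framework would simply mean a bivariate regression on (ln A, ω).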

  2. The statistical average of optical properties for alumina particle cluster in aircraft plume

    NASA Astrophysics Data System (ADS)

    Li, Jingying; Bai, Lu; Wu, Zhensen; Guo, Lixin

    2018-04-01

    We establish a model for the lognormal distribution of monomer radius and number of alumina particle clusters in a plume. Based on the Multi-Sphere T-Matrix (MSTM) theory, we provide a method for finding the statistical average of the optical properties of alumina particle clusters in a plume, analyze the effect of different distributions and different detection wavelengths on these statistical averages, and compare the statistical average optical properties under the alumina particle cluster model established in this study with those under three simplified alumina particle models. The calculation results show that the monomer number of an alumina particle cluster and its size distribution have a considerable effect on its statistical average optical properties. The statistical averages at common detection wavelengths exhibit obvious differences, which strongly affect the modelling of the IR and UV radiation properties of the plume. Compared with the three simplified models, the alumina particle cluster model presented here features both higher extinction and scattering efficiencies. An accurate description of the scattering properties of alumina particles in an aircraft plume is therefore of great significance in the study of plume radiation properties.

  3. A Comparative Evaluation of Mixed Dentition Analysis on Reliability of Cone Beam Computed Tomography Image Compared to Plaster Model

    PubMed Central

    Gowd, Snigdha; Shankar, T; Dash, Samarendra; Sahoo, Nivedita; Chatterjee, Suravi; Mohanty, Pritam

    2017-01-01

    Aims and Objective: The aim of the study was to evaluate the reliability of cone beam computed tomography (CBCT)-derived images relative to plaster models for mixed dentition analysis. Materials and Methods: Thirty CBCT-derived images and thirty plaster models were retrieved from the dental archives, and Moyer's and Tanaka-Johnston analyses were performed. The data obtained were interpreted and analyzed statistically using SPSS 10.0/PC (SPSS Inc., Chicago, IL, USA). Descriptive and analytical analyses along with Student's t-test were performed to evaluate the data, and P < 0.05 was considered statistically significant. Results: Statistically significant differences were found between CBCT-derived images and plaster models; the means for Moyer's analysis in the left and right lower arches were 21.2 mm and 21.1 mm for CBCT and 22.5 mm and 22.5 mm for the plaster models, respectively. Conclusion: CBCT-derived images were less reliable than data obtained directly from plaster models for mixed dentition analysis. PMID:28852639

  4. Radiation detection method and system using the sequential probability ratio test

    DOEpatents

    Nelson, Karl E [Livermore, CA; Valentine, John D [Redwood City, CA; Beauchamp, Brock R [San Ramon, CA

    2007-07-17

    A method and system using the Sequential Probability Ratio Test to enhance the detection of an elevated level of radiation, by determining whether a set of observations are consistent with a specified model within a given bounds of statistical significance. In particular, the SPRT is used in the present invention to maximize the range of detection, by providing processing mechanisms for estimating the dynamic background radiation, adjusting the models to reflect the amount of background knowledge at the current point in time, analyzing the current sample using the models to determine statistical significance, and determining when the sample has returned to the expected background conditions.
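    A minimal sketch of the SPRT idea for count data follows, assuming Poisson counts with a known background rate and a hypothesized elevated rate; the rates, error targets, and thresholds are illustrative and not taken from the patent.

```python
# Sequential Probability Ratio Test for elevated vs. background Poisson rates.
import math, random

random.seed(2)

bg_rate, elev_rate = 4.0, 8.0            # counts per measurement interval
alpha, beta = 0.01, 0.01                 # target false-alarm / miss rates
upper = math.log((1 - beta) / alpha)     # cross above: accept "elevated"
lower = math.log(beta / (1 - alpha))     # cross below: accept "background"

def poisson_sample(lam):
    """Knuth's Poisson sampler; adequate for small rates."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

llr, n = 0.0, 0
while lower < llr < upper:
    x = poisson_sample(elev_rate)        # simulated truth: a source is present
    # per-observation log-likelihood ratio for Poisson counts
    llr += x * math.log(elev_rate / bg_rate) - (elev_rate - bg_rate)
    n += 1

decision = "elevated" if llr >= upper else "background"
print(f"decision = {decision} after {n} observations")
```

The appeal of the SPRT, as in the patent, is that the number of observations is not fixed in advance: the test stops as soon as the accumulated evidence crosses either threshold.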

  5. Cognitive Complaints After Breast Cancer Treatments: Examining the Relationship With Neuropsychological Test Performance

    PubMed Central

    2013-01-01

    Background Cognitive complaints are reported frequently after breast cancer treatments. Their association with neuropsychological (NP) test performance is not well-established. Methods Early-stage, posttreatment breast cancer patients were enrolled in a prospective, longitudinal, cohort study prior to starting endocrine therapy. Evaluation included an NP test battery and self-report questionnaires assessing symptoms, including cognitive complaints. Multivariable regression models assessed associations among cognitive complaints, mood, treatment exposures, and NP test performance. Results One hundred eighty-nine breast cancer patients, aged 21–65 years, completed the evaluation; 23.3% endorsed higher memory complaints and 19.0% reported higher executive function complaints (>1 SD above the mean for healthy control sample). Regression modeling demonstrated a statistically significant association of higher memory complaints with combined chemotherapy and radiation treatments (P = .01), poorer NP verbal memory performance (P = .02), and higher depressive symptoms (P < .001), controlling for age and IQ. For executive functioning complaints, multivariable modeling controlling for age, IQ, and other confounds demonstrated statistically significant associations with better NP visual memory performance (P = .03) and higher depressive symptoms (P < .001), whereas combined chemotherapy and radiation treatment (P = .05) approached statistical significance. Conclusions About one in five post–adjuvant treatment breast cancer patients had elevated memory and/or executive function complaints that were statistically significantly associated with domain-specific NP test performances and depressive symptoms; combined chemotherapy and radiation treatment was also statistically significantly associated with memory complaints. 
These results and other emerging studies suggest that subjective cognitive complaints in part reflect objective NP performance, although their etiology and biology appear to be multifactorial, motivating further transdisciplinary research. PMID:23606729

  6. Preventive Effect of Phosphodiesterase Inhibitor Pentoxifylline Against Medication-Related Osteonecrosis of the Jaw: An Animal Study.

    PubMed

    Yalcin-Ulker, Gül Merve; Cumbul, Alev; Duygu-Capar, Gonca; Uslu, Ünal; Sencift, Kemal

    2017-11-01

    The aim of this experimental study was to investigate the prophylactic effect of pentoxifylline (PTX) on medication-related osteonecrosis of the jaw (MRONJ). Female Sprague-Dawley rats (n = 33) received zoledronic acid (ZA) for 8 weeks to create an osteonecrosis model. The left mandibular second molars were extracted and the recovery period lasted 8 weeks before sacrifice. PTX was intraperitoneally administered to prevent MRONJ. The specimens were histopathologically and histomorphometrically evaluated. Histomorphometrically, between the control and ZA groups, there was no statistically significant difference in total bone volume (P = .999), but there was a statistically significant difference in bone ratio in the extraction sockets (P < .001). A comparison of the bone ratio of the ZA group with the ZA/PTX group (PTX administered after extraction) showed no statistically significant difference (P = .69), but there was a statistically significant difference with the ZA/PTX/PTX group (PTX administered before and after extraction; P = .008). Histopathologically, between the control and ZA groups, there were statistically significant differences for inflammation (P = .013), vascularization (P = .022), hemorrhage (P = .025), and regeneration (P = .008). Between the ZA and ZA/PTX groups, there were no statistically significant differences for inflammation (P = .536), vascularization (P = .642), hemorrhage (P = .765), and regeneration (P = .127). Between the ZA and ZA/PTX/PTX groups, there were statistically significant differences for inflammation (P = .017), vascularization (P = .04), hemorrhage (P = .044), and regeneration (P = .04). In this experimental model of MRONJ, it might be concluded that although PTX, given after tooth extraction, improves new bone formation that positively affects bone healing, it is not prophylactic. However, PTX given before tooth extraction is prophylactic. 
Therefore, PTX might affect healing in a positive way by optimizing the inflammatory response. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  7. Linearised and non-linearised isotherm models optimization analysis by error functions and statistical means

    PubMed Central

    2014-01-01

    In adsorption studies, describing the sorption process and identifying the best-fitting isotherm model are key steps in testing the theoretical hypothesis. Hence, numerous statistical analyses have been used extensively to compare experimental equilibrium adsorption values with predicted equilibrium values. In the present study, several statistical error analyses were carried out to evaluate adsorption isotherm model fitness, including the Pearson correlation, the coefficient of determination, and the Chi-square test. An ANOVA test was carried out to evaluate the significance of the various error functions, and the coefficient of dispersion was evaluated for linearised and non-linearised models. The adsorption of phenol onto natural soil (local name: Kalathur soil) was carried out in batch mode at 30 ± 2 °C. To obtain a holistic view of the estimation of the isotherm parameters, linear and non-linear isotherm models were compared. The results revealed which of the above error functions and statistical measures best identified the best-fitting isotherm. PMID:25018878

  8. The impacts of recent smoking control policies on individual smoking choice: the case of Japan

    PubMed Central

    2013-01-01

    This article comprehensively examines the impact of recent smoking control policies in Japan (increases in cigarette taxes and the enforcement of the Health Promotion Law) on individual smoking choice, using multi-year and nationwide individual survey data to overcome the analytical problems of previous Japanese studies. In the econometric analyses, I specify a simple binary choice model based on a random utility model to examine the effects of smoking control policies on individual smoking choice, employing the instrumental variable probit model to control for the endogeneity of cigarette prices. The empirical results show that an increase in cigarette prices statistically significantly reduces the smoking probability of males by 1.0 percent and that of females by 1.4 to 2.0 percent. The enforcement of the Health Promotion Law has a statistically significant effect on reducing the smoking probability of males by 15.2 percent and of females by 11.9 percent. Furthermore, an increase in cigarette prices has a statistically significant negative effect on the smoking probability of office workers, non-workers, male manual workers, and female unemployed people, and the enforcement of the Health Promotion Law has a statistically significant effect on decreasing the smoking probabilities of office workers, female manual workers, and male non-workers. JEL classification: C25, C26, I18. PMID:23497490

  9. Diagnostic index of 3D osteoarthritic changes in TMJ condylar morphology

    NASA Astrophysics Data System (ADS)

    Gomes, Liliane R.; Gomes, Marcelo; Jung, Bryan; Paniagua, Beatriz; Ruellas, Antonio C.; Gonçalves, João Roberto; Styner, Martin A.; Wolford, Larry; Cevidanes, Lucia

    2015-03-01

    The aim of this study was to investigate imaging statistical approaches for classifying 3D osteoarthritic morphological variations among 169 Temporomandibular Joint (TMJ) condyles. Cone beam Computed Tomography (CBCT) scans were acquired from 69 patients with long-term TMJ Osteoarthritis (OA) (39.1 ± 15.7 years), 15 patients at initial diagnosis of OA (44.9 ± 14.8 years) and 7 healthy controls (43 ± 12.4 years). 3D surface models of the condyles were constructed and Shape Correspondence was used to establish correspondent points on each model. The statistical framework included a multivariate analysis of covariance (MANCOVA) and Direction-Projection-Permutation (DiProPerm) for testing the statistical significance of differences between the healthy control and OA groups determined by clinical and radiographic diagnoses. Unsupervised classification using hierarchical agglomerative clustering (HAC) was then conducted. Condylar morphology in OA and healthy subjects varied widely. Compared with healthy controls, the OA average condyle was statistically significantly smaller in all dimensions except its anterior surface. Significant flattening of the lateral pole was noticed at initial diagnosis (p < 0.05). Areas of 3.88 mm bone resorption at the superior surface and 3.10 mm bone apposition at the anterior aspect of the long-term OA average model were observed. Permutation statistics (1000 permutations) with DiProPerm supported a significant difference between the healthy control group and the OA group (t = 6.7, empirical p-value = 0.001). Clinically meaningful unsupervised classification of TMJ condylar morphology produced a preliminary diagnostic index of 3D osteoarthritic changes, which may be the first step towards a more targeted diagnosis of this condition.

  10. Seismic activity prediction using computational intelligence techniques in northern Pakistan

    NASA Astrophysics Data System (ADS)

    Asim, Khawaja M.; Awais, Muhammad; Martínez-Álvarez, F.; Iqbal, Talat

    2017-10-01

    An earthquake prediction study is carried out for the region of northern Pakistan. The prediction methodology involves an interdisciplinary interaction of seismology and computational intelligence. Eight seismic parameters are computed based upon past earthquakes. The predictive ability of these eight seismic parameters is evaluated in terms of information gain, which leads to the selection of six parameters for use in prediction. Multiple computationally intelligent models have been developed for earthquake prediction using the selected seismic parameters. These models include a feed-forward neural network, recurrent neural network, random forest, multilayer perceptron, radial basis neural network, and support vector machine. The performance of every prediction model is evaluated, and McNemar's statistical test is applied to assess the statistical significance of the computational methodologies. The feed-forward neural network shows statistically significant predictions, with an accuracy of 75% and a positive predictive value of 78%, in the context of northern Pakistan.
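    McNemar's test, as used above to compare prediction models, operates on the discordant predictions of two models over the same set of events. A minimal sketch with hypothetical counts (not the paper's results):

```python
from scipy.stats import chi2

# Hypothetical discordant counts on a shared test set:
# b = events model A predicted correctly and model B missed; c = the reverse.
b, c = 30, 12

# McNemar's chi-squared statistic with continuity correction, df = 1.
statistic = (abs(b - c) - 1) ** 2 / (b + c)
p_value = chi2.sf(statistic, df=1)

print(f"chi2 = {statistic:.3f}, p = {p_value:.4f}")
```

    A small p-value indicates that the two models' error patterns differ beyond what chance would produce, which is the sense in which one model's predictions are "statistically significant" relative to another's.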

  11. Statistical Simulation of the Performance and Degradation of a PEMFC Membrane Electrode Assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, David; Bellemare-Davis, Alexander; Karan, Kunal

    2012-07-01

    A 1-D MEA performance model was developed that considered transport of liquid water, agglomerate catalyst structure, and the statistical variation of the MEA characteristic parameters. The model was validated against a low-surface-area carbon-supported catalyst across various platinum loadings and operational conditions. The statistical variation was found to play a significant role in creating noise in the validation data, and there was a coupling effect between movement in material properties and liquid water transport. Further, in studying the low-platinum-loaded catalyst layers, it was found that liquid water played a significant role in increasing the overall transport losses. The model was then further applied to study platinum dissolution via potential-cycling accelerated stress tests, in which the platinum was found to dissolve nearest the membrane, effectively resulting in reaction distribution shifts within the layer.

  12. Evaluation of The Operational Benefits Versus Costs of An Automated Cargo Mover

    DTIC Science & Technology

    2016-12-01

    Logistics footprint and life-cycle cost are presented as part of this report. Analysis of modeling and simulation results identified statistically significant differences... Error of Estimation. Source: Eskew and Lawler (1994). Figure 24. Load Results (100 Runs per Scenario)

  13. Metamodelling Messages Conveyed in Five Statistical Mechanical Textbooks from 1936 to 2001

    ERIC Educational Resources Information Center

    Niss, Martin

    2009-01-01

    Modelling is a significant aspect of doing physics and it is important how this activity is taught. This paper focuses on the explicit or implicit messages about modelling conveyed to the student in the treatments of phase transitions in statistical mechanics textbooks at beginning graduate level. Five textbooks from the 1930s to the present are…

  14. Cognitive predictors of balance in Parkinson's disease.

    PubMed

    Fernandes, Ângela; Mendes, Andreia; Rocha, Nuno; Tavares, João Manuel R S

    2016-06-01

    Postural instability is one of the most incapacitating symptoms of Parkinson's disease (PD) and appears to be related to cognitive deficits. This study aims to determine the cognitive factors that can predict deficits in static and dynamic balance in individuals with PD. A sociodemographic questionnaire characterized the 52 individuals with PD recruited for this work. The Trail Making Test, Rule Shift Cards Test, and Digit Span Test assessed the executive functions. Static balance was assessed using a plantar pressure platform, and dynamic balance was based on the Timed Up and Go Test. The results were statistically analysed using SPSS Statistics software through linear regression analysis. The results show that a statistically significant model based on cognitive outcomes was able to explain the variance of the motor variables. The explanatory value of the model tended to increase with the addition of individual and clinical variables, although the resulting model was not statistically significant. The model explained 25-29% of the variability of the Timed Up and Go Test, while for the anteroposterior displacement it was 23-34%, and for the mediolateral displacement it was 24-39%. From these findings, we conclude that cognitive performance, especially the executive functions, is a predictor of balance deficit in individuals with PD.

  15. On use of the multistage dose-response model for assessing laboratory animal carcinogenicity

    PubMed Central

    Nitcheva, Daniella; Piegorsch, Walter W.; West, R. Webster

    2007-01-01

    We explore how well a statistical multistage model describes dose-response patterns in laboratory animal carcinogenicity experiments from a large database of quantal response data. The data are collected from the U.S. EPA’s publicly available IRIS data warehouse and examined statistically to determine how often higher-order values in the multistage predictor yield significant improvements in explanatory power over lower-order values. Our results suggest that the addition of a second-order parameter to the model only improves the fit about 20% of the time, while adding even higher-order terms apparently does not contribute to the fit at all, at least with the study designs we captured in the IRIS database. Also included is an examination of statistical tests for assessing significance of higher-order terms in a multistage dose-response model. It is noted that bootstrap testing methodology appears to offer greater stability for performing the hypothesis tests than a more-common, but possibly unstable, “Wald” test. PMID:17490794

  16. Significance testing of rules in rule-based models of human problem solving

    NASA Technical Reports Server (NTRS)

    Lewis, C. M.; Hammer, J. M.

    1986-01-01

    Rule-based models of human problem solving have typically not been tested for statistical significance. Three methods of testing rules - analysis of variance, randomization, and contingency tables - are presented. Advantages and disadvantages of the methods are also described.
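    Of the three methods listed, the contingency-table approach is the simplest to sketch: cross-tabulate whether a rule fired against whether the model's prediction matched the human's action, and test for independence. The counts below are invented for illustration:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows = rule fired / did not fire,
# columns = model matched the human's next action / did not match.
table = [[40, 10],
         [20, 30]]

# chi2_contingency applies Yates' continuity correction for 2x2 tables.
stat, p_value, dof, expected = chi2_contingency(table)
```

    A significant result here says the rule's firing is associated with the model's predictive success, i.e., the rule carries real explanatory weight rather than being an artifact of the model-building process.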

  17. Assessing Climate Change Impacts for DoD installations in the Southwest United States During the Warm Season

    DTIC Science & Technology

    2017-03-10

    4. Statistical analysis methods to characterize distributions and trends... duration precipitation diagram from convective-permitting simulations for Barry Goldwater Range, Arizona. Figure 60: Statistically... Same as Fig. 60 for other DoD facilities in the Southwest as labeled. Figure 62: Statistically significant model ensemble changes in rainfall

  18. Strategies for Reduced-Order Models in Uncertainty Quantification of Complex Turbulent Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Qi, Di

    Turbulent dynamical systems are ubiquitous in science and engineering. Uncertainty quantification (UQ) in turbulent dynamical systems is a grand challenge where the goal is to obtain statistical estimates for key physical quantities. In the development of a proper UQ scheme for systems characterized by both a high-dimensional phase space and a large number of instabilities, significant model errors compared with the true natural signal are always unavoidable due to both the imperfect understanding of the underlying physical processes and the limited computational resources available. One central issue in contemporary research is the development of a systematic methodology for reduced order models that can recover the crucial features both with model fidelity in statistical equilibrium and with model sensitivity in response to perturbations. In the first part, we discuss a general mathematical framework to construct statistically accurate reduced-order models that have skill in capturing the statistical variability in the principal directions of a general class of complex systems with quadratic nonlinearity. A systematic hierarchy of simple statistical closure schemes, which are built through new global statistical energy conservation principles combined with statistical equilibrium fidelity, are designed and tested for UQ of these problems. Second, the capacity of imperfect low-order stochastic approximations to model extreme events in a passive scalar field advected by turbulent flows is investigated. The effects in complicated flow systems are considered including strong nonlinear and non-Gaussian interactions, and much simpler and cheaper imperfect models with model error are constructed to capture the crucial statistical features in the stationary tracer field. Several mathematical ideas are introduced to improve the prediction skill of the imperfect reduced-order models. 
Most importantly, empirical information theory and statistical linear response theory are applied in the training phase for calibrating model errors to achieve optimal imperfect model parameters; and total statistical energy dynamics are introduced to improve the model sensitivity in the prediction phase especially when strong external perturbations are exerted. The validity of reduced-order models for predicting statistical responses and intermittency is demonstrated on a series of instructive models with increasing complexity, including the stochastic triad model, the Lorenz '96 model, and models for barotropic and baroclinic turbulence. The skillful low-order modeling methods developed here should also be useful for other applications such as efficient algorithms for data assimilation.

  19. Intraoral versus extraoral measurement of the height of the interproximal contact area in maxillary anterior teeth.

    PubMed

    Sghaireen, Mohd G; Albhiran, Heyam Mobark; Alzoubi, Ibrahim A; Lynch, Edward; Al-Omiri, Mahmoud K

    2015-01-01

    This study aimed to clinically quantify the apicoincisal height of the upper interproximal areas directly in patients' mouths compared to measurements on stone models. One hundred and fifty participants (75 females and 75 males, age range 20-45 years) were recruited for this study. A digital caliper was used to measure the anterior maxillary interproximal contact areas directly in patients' mouths and on stone models. The digital caliper accuracy was up to 0.01 mm. The Statistical Package for Social Sciences software (SPSS, version 19.0, Chicago, Ill., USA) was used for statistical analysis. Statistical significance was based on probability values <0.05. The intraoral measurement of proximal contacts as well as the measurement on stone models showed that the dimensions of interproximal contacts on both sides of each tooth were significantly different (p < 0.001) and that the dimension of the mesial contact point was larger than that of the distal contact point of each tooth. The largest contact point was the one between the central incisors (direct intraoral measurement = 2.9-6.49 mm; model measurement = 3.31-6.91 mm). On the other hand, the contact point between the canine and first premolar was the smallest on both sides of the arch (0.63-2.52 mm intraorally, 0.98-2.88 mm on models). The intraoral measurement of contact points was more accurate than model measurements, and the differences were statistically significant (p < 0.001). The clinical evaluation of contact point dimensions using a digital caliper was more precise than measuring contact points on stone models; hence, it is a viable, quick and adequate method to be used routinely. © 2015 S. Karger AG, Basel.

  20. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    PubMed

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.

  1. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model

    PubMed Central

    Austin, Peter C.

    2017-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest. PMID:29321694

  2. Fast, Statistical Model of Surface Roughness for Ion-Solid Interaction Simulations and Efficient Code Coupling

    NASA Astrophysics Data System (ADS)

    Drobny, Jon; Curreli, Davide; Ruzic, David; Lasa, Ane; Green, David; Canik, John; Younkin, Tim; Blondel, Sophie; Wirth, Brian

    2017-10-01

    Surface roughness greatly impacts material erosion, and thus plays an important role in Plasma-Surface Interactions. Developing strategies for efficiently introducing rough surfaces into ion-solid interaction codes will be an important step towards whole-device modeling of plasma devices and future fusion reactors such as ITER. Fractal TRIDYN (F-TRIDYN) is an upgraded version of the Monte Carlo, BCA program TRIDYN developed for this purpose that includes an explicit fractal model of surface roughness and extended input and output options for file-based code coupling. Code coupling with both plasma and material codes has been achieved and allows for multi-scale, whole-device modeling of plasma experiments. These code coupling results will be presented. F-TRIDYN has been further upgraded with an alternative, statistical model of surface roughness. The statistical model is significantly faster than and compares favorably to the fractal model. Additionally, the statistical model compares well to alternative computational surface roughness models and experiments. Theoretical links between the fractal and statistical models are made, and further connections to experimental measurements of surface roughness are explored. This work was supported by the PSI-SciDAC Project funded by the U.S. Department of Energy through contract DOE-DE-SC0008658.

  3. Development of uncertainty-based work injury model using Bayesian structural equation modelling.

    PubMed

    Chatterjee, Snehamoy

    2014-01-01

    This paper proposed a Bayesian method-based structural equation model (SEM) of miners' work injury for an underground coal mine in India. The environmental and behavioural variables for work injury were identified and causal relationships were developed. For Bayesian modelling, prior distributions of the SEM parameters are necessary to develop the model. In this paper, two approaches were adopted to obtain prior distributions for the factor loading parameters and structural parameters of the SEM. In the first approach, the prior distributions were considered as a fixed distribution function with specific parameter values, whereas, in the second approach, prior distributions of the parameters were generated from experts' opinions. The posterior distributions of these parameters were obtained by applying Bayes' rule. Markov Chain Monte Carlo sampling, in the form of Gibbs sampling, was applied to sample from the posterior distribution. The results revealed that all coefficients of the structural and measurement model parameters are statistically significant under the experts' opinion-based priors, whereas two coefficients are not statistically significant when the fixed prior-based distributions are applied. The error statistics reveal that the Bayesian structural model provides a reasonably good fit of work injury, with a high coefficient of determination (0.91) and lower mean squared error as compared to traditional SEM.

  4. Tooth-size discrepancy: A comparison between manual and digital methods

    PubMed Central

    Correia, Gabriele Dória Cabral; Habib, Fernando Antonio Lima; Vogel, Carlos Jorge

    2014-01-01

    Introduction: Technological advances in Dentistry have emerged primarily in the area of diagnostic tools. One example is the 3D scanner, which can transform plaster models into three-dimensional digital models. Objective: This study aimed to assess the reliability of tooth size-arch length discrepancy analysis measurements performed on three-dimensional digital models, and to compare these measurements with those obtained from plaster models. Material and Methods: To this end, plaster models of lower dental arches and their corresponding three-dimensional digital models, acquired with a 3Shape R700T scanner, were used. All of them had lower permanent dentition. Four different tooth size-arch length discrepancy calculations were performed on each model, two by manual methods using calipers and brass wire, and two by digital methods using linear measurements and parabolas. Results: Data were statistically assessed using the Friedman test, and no statistically significant differences were found between the methods (P > 0.05); only the linear digital method showed a slight, statistically non-significant difference. Conclusions: Based on the results, it is reasonable to assert that any of these resources used by orthodontists to clinically assess tooth size-arch length discrepancy can be considered reliable. PMID:25279529
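    The Friedman test used in this kind of comparison treats each model as a block and ranks the measurements of the competing methods within it. A minimal sketch with invented measurements (not the study's data):

```python
from scipy.stats import friedmanchisquare

# Hypothetical tooth size-arch length discrepancy (mm) for five plaster/digital
# model pairs, each measured by four methods; values are made up for illustration.
caliper  = [1.2, 0.8, 2.1, 1.5, 0.9]
wire     = [1.3, 0.7, 2.0, 1.6, 1.0]
linear   = [1.4, 0.9, 2.2, 1.7, 1.1]
parabola = [1.2, 0.8, 2.1, 1.5, 0.9]

# Friedman ranks each model's four measurements and tests whether the
# methods' mean ranks differ (chi-squared distributed, df = k - 1 = 3).
stat, p_value = friedmanchisquare(caliper, wire, linear, parabola)
```

    Because the same physical models are measured by every method, this paired, rank-based test is appropriate where a one-way ANOVA on independent groups would not be.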

  5. Statistical model specification and power: recommendations on the use of test-qualified pooling in analysis of experimental data

    PubMed Central

    Colegrave, Nick

    2017-01-01

    A common approach to the analysis of experimental data across much of the biological sciences is test-qualified pooling. Here non-significant terms are dropped from a statistical model, effectively pooling the variation associated with each removed term with the error term used to test hypotheses (or estimate effect sizes). This pooling is only carried out if statistical testing on the basis of applying that data to a previous more complicated model provides motivation for this model simplification; hence the pooling is test-qualified. In pooling, the researcher increases the degrees of freedom of the error term with the aim of increasing statistical power to test their hypotheses of interest. Despite this approach being widely adopted and explicitly recommended by some of the most widely cited statistical textbooks aimed at biologists, here we argue that (except in highly specialized circumstances that we can identify) the hoped-for improvement in statistical power will be small or non-existent, and there is likely to be much reduced reliability of the statistical procedures through deviation of type I error rates from nominal levels. We thus call for greatly reduced use of test-qualified pooling across experimental biology, more careful justification of any use that continues, and a different philosophy for initial selection of statistical models in the light of this change in procedure. PMID:28330912

  6. Restenosis of the CYPHER-Select, TAXUS-Express, and Polyzene-F Nanocoated Cobalt-Chromium Stents in the Minipig Coronary Artery Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radeleff, Boris, E-mail: Boris.radeleff@med.uni-heidelberg.de; Thierjung, Heidi; Stampfl, Ulrike

    2008-09-15

    Purpose: To date, no direct experimental comparison between the CYPHER-Select and TAXUS-Express stents is available. Therefore, we investigated late in-stent stenosis, thrombogenicity, and inflammation, comparing the CYPHER-Select, TAXUS-Express, and custom-made cobalt-chromium Polyzene-F nanocoated stents (CCPS) in the minipig coronary artery model. Methods: The three stent types were implanted in the right coronary artery of 30 minipigs. The primary endpoint was in-stent stenosis assessed by quantitative angiography and microscopy. Secondary endpoints were inflammation and thrombogenicity evaluated by scores for inflammation and immunoreactivity (C-reactive protein and transforming growth factor beta). Follow-up was at 4 and 12 weeks. Results: Stent placement was successful in all animals; no thrombus deposition occurred. Quantitative angiography did not depict statistically significant differences between the three stent types after 4 and 12 weeks. Quantitative microscopy at 4 weeks showed a statistically significantly thicker neointima (p = 0.0431) for the CYPHER (105.034 ± 62.52 µm) versus the TAXUS (74.864 ± 66.03 µm) and versus the CCPS (63.542 ± 39.57 µm). At 12 weeks there were no statistically significant differences. Inflammation scores at 4 weeks were significantly lower for the CCPS and CYPHER compared with the TAXUS stent (p = 0.0431). After 12 weeks statistical significance was only found for the CYPHER versus the TAXUS stent (p = 0.0431). The semiquantitative immunoreactivity scores for C-reactive protein and transforming growth factor beta showed no statistically significant differences between the three stent types after 4 and 12 weeks. Conclusions: The CCPS provided effective control of late in-stent stenosis and thrombogenicity in this porcine model compared with the two drug-eluting stents. Its low inflammation score underscores its noninflammatory potential and might explain its equivalence to the two DES.

  7. A Comparison of Student Understanding of Seasons Using Inquiry and Didactic Teaching Methods

    NASA Astrophysics Data System (ADS)

    Ashcraft, Paul G.

    2006-02-01

    Student performance on open-ended questions concerning seasons in a university physical science content course was examined to note differences between classes that experienced inquiry using a 5-E lesson planning model and those that experienced the same content with a traditional, didactic lesson. The class examined is a required content course for elementary education majors and understanding the seasons is part of the university's state's elementary science standards. The two self-selected groups of students showed no statistically significant differences in pre-test scores, while there were statistically significant differences between the groups' post-test scores with those who participated in inquiry-based activities scoring higher. There were no statistically significant differences between the pre-test and the post-test for the students who experienced didactic teaching, while there were statistically significant improvements for the students who experienced the 5-E lesson.

  8. PSSMSearch: a server for modeling, visualization, proteome-wide discovery and annotation of protein motif specificity determinants.

    PubMed

    Krystkowiak, Izabella; Manguy, Jean; Davey, Norman E

    2018-06-05

    There is a pressing need for in silico tools that can aid in the identification of the complete repertoire of protein binding (SLiMs, MoRFs, miniMotifs) and modification (moiety attachment/removal, isomerization, cleavage) motifs. We have created PSSMSearch, an interactive web-based tool for rapid statistical modeling, visualization, discovery and annotation of protein motif specificity determinants to discover novel motifs in a proteome-wide manner. PSSMSearch analyses proteomes for regions with significant similarity to a motif specificity determinant model built from a set of aligned motif-containing peptides. Multiple scoring methods are available to build a position-specific scoring matrix (PSSM) describing the motif specificity determinant model. This model can then be modified by a user to add prior knowledge of specificity determinants through an interactive PSSM heatmap. PSSMSearch includes a statistical framework to calculate the significance of specificity determinant model matches against a proteome of interest. PSSMSearch also includes the SLiMSearch framework's annotation, motif functional analysis and filtering tools to highlight relevant discriminatory information. Additional tools to annotate statistically significant shared keywords and GO terms, or experimental evidence of interaction with a motif-recognizing protein have been added. Finally, PSSM-based conservation metrics have been created for taxonomic range analyses. The PSSMSearch web server is available at http://slim.ucd.ie/pssmsearch/.
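    The PSSM machinery a tool like PSSMSearch builds on can be illustrated in miniature: derive a position-specific log-odds matrix from aligned motif peptides, then scan a sequence for high-scoring windows. The peptides, pseudocount, and uniform background below are simplifying assumptions for illustration, not PSSMSearch's actual scoring methods:

```python
import math

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

# Hypothetical aligned motif-containing peptides (e.g. a kinase site).
peptides = ["RRASV", "KRASI", "RRGSV", "KKASL"]
length = len(peptides[0])

# Position-specific frequencies with a pseudocount of 1 per residue,
# converted to log2 odds against a uniform background of 1/20.
pssm = []
for pos in range(length):
    counts = {aa: 1 for aa in AMINO_ACIDS}
    for pep in peptides:
        counts[pep[pos]] += 1
    total = sum(counts.values())
    pssm.append({aa: math.log2((counts[aa] / total) / (1 / 20))
                 for aa in AMINO_ACIDS})

def score(window):
    """Sum of per-position log-odds scores for one candidate window."""
    return sum(pssm[i][aa] for i, aa in enumerate(window))

# Scan a toy sequence and report the best-scoring window.
seq = "MKRRASVLLK"
hits = [(i, score(seq[i:i + length])) for i in range(len(seq) - length + 1)]
best_start, best_score = max(hits, key=lambda h: h[1])
```

    A proteome-wide search is the same scan over every protein, followed by the kind of statistical significance calibration and annotation filtering the abstract describes.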

  9. Comparison of small diameter stone baskets in an in vitro caliceal and ureteral model.

    PubMed

    Korman, Emily; Hendlin, Kari; Chotikawanich, Ekkarin; Monga, Manoj

    2011-01-01

    Three small diameter (<1.5F) stone baskets have recently been introduced. Our objective was to evaluate the stone capture rate of these baskets in an in vitro ureteral model and an in vitro caliceal model using novice, resident, and expert operators. Sacred Heart Medical Halo™ (1.5F), Cook N-Circle(®) Nitinol Tipless Stone Extractor (1.5F), and Boston Scientific OptiFlex(®) (1.3F) stone baskets were tested in an in vitro ureteral and a caliceal model by three novices, three residents, and three experts. The caliceal model consisted of a 7-cm length of 10-mm O.D. plastic tubing with a convex base. Each operator was timed during removal of a 3-mm calculus from each model, with three repetitions for each basket. Data were analyzed by single-factor analysis of variance and t tests assuming unequal variances. In the ureteral model, the Halo had the fastest average rate of stone extraction for experts and novices (0:02 ± 0:01 and 0:08 ± 0:04 min, respectively), as well as the overall fastest average stone extraction rate (0:08 ± 0:06 min). No statistically significant differences in extraction times between baskets were identified in the resident group. In the novice group, the Halo stone extraction rate was significantly faster than that of the OptiFlex (P=0.029). In the expert group, the OptiFlex had statistically significantly slower average extraction rates compared with the Halo (P=0.005) and the N-Circle (P=0.017). In the caliceal model, no statistically significant differences were noted. While no significant differences were noted in extraction times for the caliceal model, the extraction times for the ureteral model were slowest with the OptiFlex basket. Other variables important in selection of the appropriate basket include operator preference, clinical setting, and cost.

  10. A Path Model of School Violence Perpetration: Introducing Online Game Addiction as a New Risk Factor.

    PubMed

    Kim, Jae Yop; Lee, Jeen Suk; Oh, Sehun

    2015-08-10

    Drawing on the cognitive information-processing model of aggression and the general aggression model, we explored why adolescents become addicted to online games and how their immersion in online games affects school violence perpetration (SVP). For this purpose, we conducted statistical analyses on 1,775 elementary and middle school students who resided in northern districts of Seoul, South Korea. The results validated the proposed structural equation model and confirmed the statistical significance of the structural paths from the variables; that is, the paths from child abuse and self-esteem to SVP were significant. The levels of self-esteem and child abuse victimization affected SVP, and this effect was mediated by online game addiction (OGA). Furthermore, a multigroup path analysis showed significant gender differences in the path coefficients of the proposed model, indicating that gender exerted differential effects on adolescents' OGA and SVP. Based on these results, prevention and intervention methods to curb violence in schools have been proposed. © The Author(s) 2015.

  11. DETECTING UNSPECIFIED STRUCTURE IN LOW-COUNT IMAGES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stein, Nathan M.; Dyk, David A. van; Kashyap, Vinay L.

    Unexpected structure in images of astronomical sources often presents itself upon visual inspection of the image, but such apparent structure may either correspond to true features in the source or be due to noise in the data. This paper presents a method for testing whether inferred structure in an image with Poisson noise represents a significant departure from a baseline (null) model of the image. To infer image structure, we conduct a Bayesian analysis of a full model that uses a multiscale component to allow flexible departures from the posited null model. As a test statistic, we use a tail probability of the posterior distribution under the full model. This choice of test statistic allows us to estimate a computationally efficient upper bound on a p-value that enables us to draw strong conclusions even when there are limited computational resources that can be devoted to simulations under the null model. We demonstrate the statistical performance of our method on simulated images. Applying our method to an X-ray image of the quasar 0730+257, we find significant evidence against the null model of a single point source and uniform background, lending support to the claim of an X-ray jet.
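
    The conservative upper-bound idea can be illustrated with a much simpler Monte Carlo p-value sketch (our illustration, not the paper's Bayesian multiscale machinery): simulate null images with flat Poisson background and count how often the test statistic exceeds its observed value; the add-one correction keeps the estimate conservative even with few null simulations.

```python
import math, random
random.seed(7)

def rpois(mu):
    """Poisson sampler (Knuth's multiplication method; fine for small mu)."""
    limit, k, prod = math.exp(-mu), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

def mc_pvalue(t_obs, mu, npix, nsim=2000):
    """Conservative Monte Carlo p-value for the maximum-count statistic
    under a flat Poisson(mu) null; the +1 terms make the estimate valid
    (never anti-conservative) with a finite number of simulations."""
    exceed = sum(max(rpois(mu) for _ in range(npix)) >= t_obs
                 for _ in range(nsim))
    return (exceed + 1) / (nsim + 1)

# A 100-pixel image with flat background mu = 2 and one pixel at 12 counts
p = mc_pvalue(12, 2.0, 100)   # small p: evidence against the flat null
```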

  12. Applications of spatial statistical network models to stream data

    USGS Publications Warehouse

    Isaak, Daniel J.; Peterson, Erin E.; Ver Hoef, Jay M.; Wenger, Seth J.; Falke, Jeffrey A.; Torgersen, Christian E.; Sowder, Colin; Steel, E. Ashley; Fortin, Marie-Josée; Jordan, Chris E.; Ruesch, Aaron S.; Som, Nicholas; Monestiez, Pascal

    2014-01-01

    Streams and rivers host a significant portion of Earth's biodiversity and provide important ecosystem services for human populations. Accurate information regarding the status and trends of stream resources is vital for their effective conservation and management. Most statistical techniques applied to data measured on stream networks were developed for terrestrial applications and are not optimized for streams. A new class of spatial statistical model, based on valid covariance structures for stream networks, can be used with many common types of stream data (e.g., water quality attributes, habitat conditions, biological surveys) through application of appropriate distributions (e.g., Gaussian, binomial, Poisson). The spatial statistical network models account for spatial autocorrelation (i.e., nonindependence) among measurements, which allows their application to databases with clustered measurement locations. Large amounts of stream data exist in many areas where spatial statistical analyses could be used to develop novel insights, improve predictions at unsampled sites, and aid in the design of efficient monitoring strategies at relatively low cost. We review the topic of spatial autocorrelation and its effects on statistical inference, demonstrate the use of spatial statistics with stream datasets relevant to common research and management questions, and discuss additional applications and development potential for spatial statistics on stream networks. Free software for implementing the spatial statistical network models has been developed that enables custom applications with many stream databases.

  13. PyEvolve: a toolkit for statistical modelling of molecular evolution.

    PubMed

    Butterfield, Andrew; Vedagiri, Vivek; Lang, Edward; Lawrence, Cath; Wakefield, Matthew J; Isaev, Alexander; Huttley, Gavin A

    2004-01-05

    Examining the distribution of variation has proven an extremely profitable technique in the effort to identify sequences of biological significance. Most approaches in the field, however, evaluate only the conserved portions of sequences - ignoring the biological significance of sequence differences. A suite of sophisticated likelihood-based statistical models from the field of molecular evolution provides the basis for extracting the information from the full distribution of sequence variation. The number of different problems to which phylogeny-based maximum likelihood calculations can be applied is extensive. Available software packages that can perform likelihood calculations suffer from a lack of flexibility and scalability, or employ error-prone approaches to model parameterisation. Here we describe the implementation of PyEvolve, a toolkit for the application of existing, and development of new, statistical methods for molecular evolution. We present the object architecture and design schema of PyEvolve, which includes an adaptable multi-level parallelisation schema. The approach for defining new methods is illustrated by implementing a novel dinucleotide model of substitution that includes a parameter for mutation of methylated CpGs, which required 8 lines of standard Python code to define. Benchmarking was performed using either a dinucleotide or codon substitution model applied to an alignment of BRCA1 sequences from 20 mammals, or a 10-species subset. Up to five-fold parallel performance gains over serial execution were recorded. Compared to leading alternative software, PyEvolve exhibited significantly better real-world performance for parameter-rich models with a large data set, reducing the time required for optimisation from approximately 10 days to approximately 6 hours. PyEvolve provides flexible functionality that can be used either for statistical modelling of molecular evolution, or the development of new methods in the field.
The toolkit can be used interactively or by writing and executing scripts. The toolkit uses efficient processes for specifying the parameterisation of statistical models, and implements numerous optimisations that make highly parameter rich likelihood functions solvable within hours on multi-cpu hardware. PyEvolve can be readily adapted in response to changing computational demands and hardware configurations to maximise performance. PyEvolve is released under the GPL and can be downloaded from http://cbis.anu.edu.au/software.

  14. A SIGNIFICANCE TEST FOR THE LASSO

    PubMed Central

    Lockhart, Richard; Taylor, Jonathan; Tibshirani, Ryan J.; Tibshirani, Robert

    2014-01-01

    In the sparse linear regression setting, we consider testing the significance of the predictor variable that enters the current lasso model, in the sequence of models visited along the lasso solution path. We propose a simple test statistic based on lasso fitted values, called the covariance test statistic, and show that when the true model is linear, this statistic has an Exp(1) asymptotic distribution under the null hypothesis (the null being that all truly active variables are contained in the current lasso model). Our proof of this result for the special case of the first predictor to enter the model (i.e., testing for a single significant predictor variable against the global null) requires only weak assumptions on the predictor matrix X. On the other hand, our proof for a general step in the lasso path places further technical assumptions on X and the generative model, but still allows for the important high-dimensional case p > n, and does not necessarily require that the current lasso model achieves perfect recovery of the truly active variables. Of course, for testing the significance of an additional variable between two nested linear models, one typically uses the chi-squared test, comparing the drop in residual sum of squares (RSS) to a χ₁² distribution. But when this additional variable is not fixed, and has been chosen adaptively or greedily, this test is no longer appropriate: adaptivity makes the drop in RSS stochastically much larger than χ₁² under the null hypothesis. Our analysis explicitly accounts for adaptivity, as it must, since the lasso builds an adaptive sequence of linear models as the tuning parameter λ decreases. In this analysis, shrinkage plays a key role: though additional variables are chosen adaptively, the coefficients of lasso active variables are shrunken due to the ℓ₁ penalty. 
Therefore, the test statistic (which is based on lasso fitted values) is in a sense balanced by these two opposing properties—adaptivity and shrinkage—and its null distribution is tractable and asymptotically Exp(1). PMID:25574062
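
    The point that an adaptively chosen variable inflates the drop in RSS can be seen in a toy simulation (our illustration, not the paper's analysis): under the global null with orthonormal predictors, each candidate's RSS drop is a chi-squared(1) draw, so greedily taking the best of p candidates yields the maximum of p such draws, stochastically much larger than a single fixed-variable drop.

```python
import random
random.seed(0)

def best_rss_drop(p):
    """Drop in RSS when the best of p orthonormal predictors is chosen
    adaptively under the global null: each candidate's drop equals a
    squared standard-normal inner product, so the greedy drop is the
    max of p chi-squared(1) draws."""
    return max(random.gauss(0.0, 1.0) ** 2 for _ in range(p))

draws = [best_rss_drop(10) for _ in range(2000)]
mean_drop = sum(draws) / len(draws)
# A fixed (non-adaptive) predictor would give a chi-squared(1) drop with
# mean 1; adaptive selection inflates the null distribution well past that.
```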

  15. Patch-Based Generative Shape Model and MDL Model Selection for Statistical Analysis of Archipelagos

    NASA Astrophysics Data System (ADS)

    Ganz, Melanie; Nielsen, Mads; Brandt, Sami

    We propose a statistical generative shape model for archipelago-like structures. These kinds of structures occur, for instance, in medical images, where our intention is to model the appearance and shapes of calcifications in X-ray radiographs. The generative model is constructed by (1) learning a patch-based dictionary of possible shapes, (2) building up a time-homogeneous Markov model to capture the neighbourhood correlations between the patches, and (3) automatically selecting the model complexity by the minimum description length principle. The generative shape model is proposed as a probability distribution of a binary image where the model is intended to facilitate sequential simulation. Our results show that a relatively simple model is able to generate structures visually similar to calcifications. Furthermore, we used the shape model as a shape prior in the statistical segmentation of calcifications, where the area overlap with the ground-truth shapes improved significantly compared to the case where the prior was not used.

  16. A cross-national analysis of how economic inequality predicts biodiversity loss.

    PubMed

    Holland, Tim G; Peterson, Garry D; Gonzalez, Andrew

    2009-10-01

    We used socioeconomic models that included economic inequality to predict biodiversity loss, measured as the proportion of threatened plant and vertebrate species, across 50 countries. Our main goal was to evaluate whether economic inequality, measured as the Gini index of income distribution, improved the explanatory power of our statistical models. We compared four models that included the following: only population density, economic footprint (i.e., the size of the economy relative to the country area), economic footprint and income inequality (Gini index), and an index of environmental governance. We also tested the environmental Kuznets curve hypothesis, but it was not supported by the data. Statistical comparisons of the models revealed that the model including both economic footprint and inequality was the best predictor of threatened species. It significantly outperformed population density alone and the environmental governance model according to the Akaike information criterion. Inequality was a significant predictor of biodiversity loss and significantly improved the fit of our models. These results confirm that socioeconomic inequality is an important factor to consider when predicting rates of anthropogenic biodiversity loss.
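
    Model comparison by the Akaike information criterion, as used in this study, can be sketched with synthetic data (the predictor and response below are toy stand-ins, not the study's country-level data): the model whose extra parameter genuinely improves fit attains the lower AIC despite its complexity penalty.

```python
import math, random
random.seed(1)

def aic_gaussian(n, rss, k):
    """AIC for a Gaussian least-squares model with k parameters
    (up to an additive constant): n * ln(RSS / n) + 2k."""
    return n * math.log(rss / n) + 2 * k

def simple_ols_rss(x, y):
    """Residual sum of squares from regressing y on x with an intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)
    return sum((b - my - beta * (a - mx)) ** 2 for a, b in zip(x, y))

n = 50
x = [random.gauss(0, 1) for _ in range(n)]       # toy predictor
y = [0.8 * v + random.gauss(0, 1) for v in x]    # toy response with signal
my = sum(y) / n
aic_null = aic_gaussian(n, sum((v - my) ** 2 for v in y), 1)  # mean only
aic_pred = aic_gaussian(n, simple_ols_rss(x, y), 2)  # adds the predictor
```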

  17. Assessment of corneal properties based on statistical modeling of OCT speckle.

    PubMed

    Jesus, Danilo A; Iskander, D Robert

    2017-01-01

    A new approach to assess the properties of the corneal micro-structure in vivo based on the statistical modeling of speckle obtained from Optical Coherence Tomography (OCT) is presented. A number of statistical models were proposed to fit the corneal speckle data obtained from raw OCT images. Short-term changes in corneal properties were studied by inducing corneal swelling, whereas age-related changes were observed by analyzing data from sixty-five subjects aged between twenty-four and seventy-three years. The generalized Gamma distribution was shown to be the best model, in terms of Akaike's Information Criterion, for fitting the OCT corneal speckle. Its parameters showed statistically significant differences (Kruskal-Wallis, p < 0.001) for short-term and age-related corneal changes. In addition, it was observed that age-related changes influence corneal biomechanical behaviour when corneal swelling is induced. This study shows that the generalized Gamma distribution can be utilized to model corneal speckle in OCT in vivo, providing complementary quantified information where the micro-structure of corneal tissue is of the essence.
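
    The generalized Gamma family used here can be written in the three-parameter Stacy form (a generic sketch; the parameter names are ours, not the paper's notation), and a quick numerical check confirms the density is properly normalised.

```python
import math

def gengamma_pdf(x, a, d, p):
    """Stacy's generalized Gamma density:
    f(x) = (p / a**d) * x**(d - 1) * exp(-(x / a)**p) / Gamma(d / p), x > 0.
    It nests the Gamma (p = 1) and Weibull (d = p) families."""
    return (p / a ** d) * x ** (d - 1) * math.exp(-(x / a) ** p) \
        / math.gamma(d / p)

# Sanity check: the density integrates to ~1 (Riemann sum on a fine grid)
a, d, p = 1.5, 2.0, 1.3       # arbitrary illustrative parameters
step = 0.01
area = sum(gengamma_pdf(i * step, a, d, p) for i in range(1, 4001)) * step
```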

  18. Inevitable end-of-21st-century trends toward earlier surface runoff timing in California's Sierra Nevada Mountains

    NASA Astrophysics Data System (ADS)

    Schwartz, M. A.; Hall, A. D.; Sun, F.; Walton, D.; Berg, N.

    2015-12-01

    Hybrid dynamical-statistical downscaling is used to produce surface runoff timing projections for California's Sierra Nevada, a high-elevation mountain range with significant seasonal snow cover. First, future climate change projections (RCP8.5 forcing scenario, 2081-2100 period) from five CMIP5 global climate models (GCMs) are dynamically downscaled. These projections reveal that future warming leads to a shift toward earlier snowmelt and surface runoff timing throughout the Sierra Nevada region. Relationships between warming and surface runoff timing from the dynamical simulations are used to build a simple statistical model that mimics the dynamical model's projected surface runoff timing changes given GCM input or other statistically-downscaled input. This statistical model can be used to produce surface runoff timing projections for other GCMs, periods, and forcing scenarios to quantify ensemble-mean changes, uncertainty due to intermodel variability, and consequences stemming from the choice of forcing scenario. For all CMIP5 GCMs and forcing scenarios, significant trends toward earlier surface runoff timing occur at elevations below 2500 m. Thus, we conclude that trends toward earlier surface runoff timing by the end of the 21st century are inevitable. The changes to surface runoff timing diagnosed in this study have implications for many dimensions of climate change, including impacts on surface hydrology, water resources, and ecosystems.

  19. Statistical methodology for the analysis of dye-switch microarray experiments

    PubMed Central

    Mary-Huard, Tristan; Aubert, Julie; Mansouri-Attia, Nadera; Sandra, Olivier; Daudin, Jean-Jacques

    2008-01-01

    Background In individually dye-balanced microarray designs, each biological sample is hybridized on two different slides, once with Cy3 and once with Cy5. While this strategy ensures an automatic correction of the gene-specific labelling bias, it also induces dependencies between log-ratio measurements that must be taken into account in the statistical analysis. Results We present two original statistical procedures for the statistical analysis of individually balanced designs. These procedures are compared with the usual ML and REML mixed model procedures proposed in most statistical toolboxes, on both simulated and real data. Conclusion The UP procedure we propose as an alternative to usual mixed model procedures is more efficient and significantly faster to compute. This result provides some useful guidelines for the analysis of complex designs. PMID:18271965

  20. A multibody knee model with discrete cartilage prediction of tibio-femoral contact mechanics.

    PubMed

    Guess, Trent M; Liu, Hongzeng; Bhashyam, Sampath; Thiagarajan, Ganesh

    2013-01-01

    Combining musculoskeletal simulations with anatomical joint models capable of predicting cartilage contact mechanics would provide a valuable tool for studying the relationships between muscle force and cartilage loading. As a step towards producing multibody musculoskeletal models that include representation of cartilage tissue mechanics, this research developed a subject-specific multibody knee model that represented the tibia plateau cartilage as discrete rigid bodies that interacted with the femur through deformable contacts. Parameters for the compliant contact law were derived using three methods: (1) simplified Hertzian contact theory, (2) simplified elastic foundation contact theory and (3) parameter optimisation from a finite element (FE) solution. The contact parameters and contact friction were evaluated during a simulated walk in a virtual dynamic knee simulator, and the resulting kinematics were compared with measured in vitro kinematics. The effects on predicted contact pressures and cartilage-bone interface shear forces during the simulated walk were also evaluated. The compliant contact stiffness parameters had a statistically significant effect on predicted contact pressures as well as all tibio-femoral motions except flexion-extension. Contact friction did not have a statistically significant effect on contact pressures, but it did have a statistically significant effect on medial-lateral translation and all rotations except flexion-extension. The magnitude of kinematic differences between model formulations was relatively small, but contact pressure predictions were sensitive to model formulation. The developed multibody knee model was computationally efficient and had a computation time 283 times faster than an FE simulation using the same geometries and boundary conditions.

  1. Simple Statistics: - Summarized!

    ERIC Educational Resources Information Center

    Blai, Boris, Jr.

    Statistics are an essential tool for making sound decisions. The field is concerned with probability distribution models, testing of hypotheses, significance tests and other means of determining the correctness of deductions and the most likely outcome of decisions. Measures of central tendency include the mean, median and mode. A second…

  2. Increasing the statistical significance of entanglement detection in experiments.

    PubMed

    Jungnitsch, Bastian; Niekamp, Sönke; Kleinmann, Matthias; Gühne, Otfried; Lu, He; Gao, Wei-Bo; Chen, Yu-Ao; Chen, Zeng-Bing; Pan, Jian-Wei

    2010-05-28

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. Experimentally, we observe this phenomenon in a four-photon experiment, testing the Mermin and Ardehali inequality for different levels of noise. Furthermore, we provide a way to develop entanglement tests with high statistical significance.
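
    The paper's central point, that a larger violation is not automatically a more significant one, reduces to measuring the violation in units of its statistical error. The numbers below are hypothetical, chosen only to illustrate the effect:

```python
def significance(value, bound, sigma):
    """Violation of an inequality, expressed in standard deviations."""
    return (value - bound) / sigma

# Hypothetical measurements against a local (classical) bound of 2:
s_noisy = significance(3.2, 2.0, 0.5)      # larger violation, larger error
s_precise = significance(2.4, 2.0, 0.05)   # smaller violation, tiny error
# s_precise > s_noisy: the smaller violation is the more significant one.
```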

  3. Statistical Models for Averaging of the Pump–Probe Traces: Example of Denoising in Terahertz Time-Domain Spectroscopy

    NASA Astrophysics Data System (ADS)

    Skorobogatiy, Maksim; Sadasivan, Jayesh; Guerboukha, Hichem

    2018-05-01

    In this paper, we first discuss the main types of noise in a typical pump-probe system, and then focus specifically on terahertz time domain spectroscopy (THz-TDS) setups. We then introduce four statistical models for the noisy pulses obtained in such systems, and detail rigorous mathematical algorithms to de-noise such traces, find the proper averages and characterise various types of experimental noise. Finally, we perform a comparative analysis of the performance, advantages and limitations of the algorithms by testing them on the experimental data collected using a particular THz-TDS system available in our laboratories. We conclude that using advanced statistical models for trace averaging results in fitting errors that are significantly smaller than those obtained when only a simple statistical average is used.

  4. Role of socioeconomic status measures in long-term mortality risk prediction after myocardial infarction.

    PubMed

    Molshatzki, Noa; Drory, Yaacov; Myers, Vicki; Goldbourt, Uri; Benyamini, Yael; Steinberg, David M; Gerber, Yariv

    2011-07-01

    The relationship of risk factors to outcomes has traditionally been assessed by measures of association such as odds ratio or hazard ratio and their statistical significance from an adjusted model. However, a strong, highly significant association does not guarantee a gain in stratification capacity. Using recently developed model performance indices, we evaluated the incremental discriminatory power of individual and neighborhood socioeconomic status (SES) measures after myocardial infarction (MI). Consecutive patients aged ≤65 years (N=1178) discharged from 8 hospitals in central Israel after incident MI in 1992 to 1993 were followed up through 2005. A basic model (demographic variables, traditional cardiovascular risk factors, and disease severity indicators) was compared with an extended model including SES measures (education, income, employment, living with a steady partner, and neighborhood SES) in terms of the Harrell c statistic, integrated discrimination improvement (IDI), and net reclassification improvement (NRI). During the 13-year follow-up, 326 (28%) patients died. Cox proportional hazards models showed that all SES measures were significantly and independently associated with mortality. Furthermore, compared with the basic model, the extended model yielded substantial gains (all P<0.001) in c statistic (0.723 to 0.757), NRI (15.2%), IDI (5.9%), and relative IDI (32%). Improvement was observed both for sensitivity (classification of events) and specificity (classification of nonevents). This study illustrates the additional insights that can be gained from considering the IDI and NRI measures of model performance and suggests that, among community patients with incident MI, incorporating SES measures into a clinical-based model substantially improves long-term mortality risk prediction.
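
    For binary outcomes without censoring, the Harrell c statistic reduces to the fraction of concordant (event, non-event) pairs, i.e. the AUC; a minimal sketch with made-up risk scores (the survival version with censoring restricts to comparable pairs but follows the same pattern):

```python
from itertools import combinations

def c_statistic(scores, events):
    """Simplified Harrell c for binary outcomes (no censoring): the share
    of (event, non-event) pairs in which the event case has the higher
    predicted risk; tied scores count one half."""
    conc = ties = pairs = 0
    for (s_i, e_i), (s_j, e_j) in combinations(zip(scores, events), 2):
        if e_i == e_j:           # only discordant-outcome pairs count
            continue
        pairs += 1
        hi, lo = (s_i, s_j) if e_i else (s_j, s_i)  # hi = event case's score
        if hi > lo:
            conc += 1
        elif hi == lo:
            ties += 1
    return (conc + 0.5 * ties) / pairs

# Made-up risk scores and death indicators
c = c_statistic([0.9, 0.8, 0.7, 0.4, 0.3, 0.2], [1, 1, 0, 1, 0, 0])
```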

  5. Impact of the calibration period on the conceptual rainfall-runoff model parameter estimates

    NASA Astrophysics Data System (ADS)

    Todorovic, Andrijana; Plavsic, Jasna

    2015-04-01

    A conceptual rainfall-runoff model is defined by its structure and parameters, which are commonly inferred through model calibration. Parameter estimates depend on objective function(s), optimisation method, and calibration period. Model calibration over different periods may result in dissimilar parameter estimates, while model efficiency decreases outside the calibration period. The problem of model (parameter) transferability, which conditions the reliability of hydrologic simulations, has been investigated for decades. In this paper, the dependence of the parameter estimates and model performance on the calibration period is analysed. The main question that is addressed is: are there any changes in optimised parameters and model efficiency that can be linked to the changes in hydrologic or meteorological variables (flow, precipitation and temperature)? The conceptual, semi-distributed HBV-light model is calibrated over five-year periods shifted by a year (sliding time windows). The length of the calibration periods is selected to enable identification of all parameters. One water year of model warm-up precedes every simulation, which starts with the beginning of a water year. The model is calibrated using the built-in GAP optimisation algorithm. The objective function used for calibration is composed of the Nash-Sutcliffe coefficient for flows and logarithms of flows, and volumetric error, all of which participate in the composite objective function with approximately equal weights. The same prior parameter ranges are used in all simulations. The model is calibrated against flows observed at the Slovac stream gauge on the Kolubara River in Serbia (records from 1954 to 2013). There are no trends in precipitation or flows; however, there is a statistically significant increasing trend in temperature in this catchment. 
Parameter variability across the calibration periods is quantified in terms of standard deviations of normalised parameters, enabling detection of the most variable parameters. Correlation coefficients between optimised model parameters and total precipitation P, mean temperature T, and mean flow Q are calculated to give an insight into parameter dependence on the hydrometeorological drivers. The results reveal high sensitivity of almost all model parameters to the calibration period. The highest variability is displayed by the refreezing coefficient, water holding capacity, and temperature gradient. The only statistically significant (decreasing) trend is detected in the evapotranspiration reduction threshold. Statistically significant correlations are detected between the precipitation gradient and precipitation depth, and between the time-area histogram base and flows. No other correlations are statistically significant, implying that changes in optimised parameters cannot generally be linked to the changes in P, T or Q. As for the model performance, the model reproduces the observed runoff satisfactorily, though the runoff is slightly overestimated in wet periods. The Nash-Sutcliffe efficiency coefficient (NSE) ranges from 0.44 to 0.79. Higher NSE values are obtained over wetter periods, which is supported by a statistically significant correlation between NSE and flow. Overall, no systematic variations in parameters or in model performance are detected. Parameter variability may therefore be attributed instead to errors in data or inadequacies in the model structure. Further research is required to examine the impact of the calibration strategy or model structure on the variability in optimised parameters in time.
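
    The Nash-Sutcliffe efficiency used both in the objective function and in model evaluation is a one-liner; the flow series below are invented for illustration.

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / SST, where SST is the spread
    of the observations about their mean. 1 is a perfect fit; 0 means the
    model is no better than predicting the mean observed flow."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

# Invented observed and simulated flow series
score = nse([1.0, 2.0, 4.0, 3.0, 2.0], [1.2, 1.8, 3.5, 3.1, 2.2])
```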

  6. A multi-level approach for investigating socio-economic and agricultural risk factors associated with rates of reported cases of Escherichia coli O157 in humans in Alberta, Canada.

    PubMed

    Pearl, D L; Louie, M; Chui, L; Doré, K; Grimsrud, K M; Martin, S W; Michel, P; Svenson, L W; McEwen, S A

    2009-10-01

    Using negative binomial and multi-level Poisson models, the authors determined the statistical significance of agricultural and socio-economic risk factors for rates of reported disease associated with Escherichia coli O157 in census subdivisions (CSDs) in Alberta, Canada, 2000-2002. Variables relating to population stability, aboriginal composition of the CSDs, and the economic relationship between CSDs and urban centres were significant risk factors. The percentage of individuals living in low-income households was not a statistically significant risk factor for rates of disease. The statistical significance of cattle density, recorded at a higher geographical level, depended on the method used to correct for overdispersion, the number of levels included in the multi-level models, and the choice of using all reported cases or only sporadic cases. Our results highlight the importance of local socio-economic risk factors in determining rates of disease associated with E. coli O157, but their relationship with individual risk factors requires further evaluation.

  7. Statistical label fusion with hierarchical performance models

    PubMed Central

    Asman, Andrew J.; Dagley, Alexander S.; Landman, Bennett A.

    2014-01-01

    Label fusion is a critical step in many image segmentation frameworks (e.g., multi-atlas segmentation) as it provides a mechanism for generalizing a collection of labeled examples into a single estimate of the underlying segmentation. In the multi-label case, typical label fusion algorithms treat all labels equally – fully neglecting the known, yet complex, anatomical relationships exhibited in the data. To address this problem, we propose a generalized statistical fusion framework using hierarchical models of rater performance. Building on the seminal work in statistical fusion, we reformulate the traditional rater performance model from a multi-tiered hierarchical perspective. This new approach provides a natural framework for leveraging known anatomical relationships and accurately modeling the types of errors that raters (or atlases) make within a hierarchically consistent formulation. Herein, we describe several contributions. First, we derive a theoretical advancement to the statistical fusion framework that enables the simultaneous estimation of multiple (hierarchical) performance models within the statistical fusion context. Second, we demonstrate that the proposed hierarchical formulation is highly amenable to the state-of-the-art advancements that have been made to the statistical fusion framework. Lastly, in an empirical whole-brain segmentation task we demonstrate substantial qualitative and significant quantitative improvement in overall segmentation accuracy. PMID:24817809

  8. Computational Analysis for Rocket-Based Combined-Cycle Systems During Rocket-Only Operation

    NASA Technical Reports Server (NTRS)

    Steffen, C. J., Jr.; Smith, T. D.; Yungster, S.; Keller, D. J.

    2000-01-01

    A series of Reynolds-averaged Navier-Stokes calculations was employed to study the performance of rocket-based combined-cycle systems operating in an all-rocket mode. This parametric series of calculations was executed within a statistical framework, commonly known as design of experiments. The parametric design space included four geometric and two flowfield variables set at three levels each, for a total of 729 possible combinations. A D-optimal design strategy was selected. It required that only 36 separate computational fluid dynamics (CFD) solutions be performed to develop a full response surface model, which quantified the linear, bilinear, and curvilinear effects of the six experimental variables. The axisymmetric, Reynolds-averaged Navier-Stokes simulations were executed with the NPARC v3.0 code. The response used in the statistical analysis was created from Isp efficiency data integrated from the 36 CFD simulations. The influence of turbulence modeling was analyzed by using both one- and two-equation models. Careful attention was also given to quantifying the influence of mesh dependence, iterative convergence, and artificial viscosity upon the resulting statistical model. Thirteen statistically significant effects were observed to have an influence on rocket-based combined-cycle nozzle performance. It was apparent that the free-expansion process, directly downstream of the rocket nozzle, can influence the Isp efficiency. Numerical schlieren images and particle traces have been used to further understand the physical phenomena behind several of the statistically significant results.
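
    The run-count arithmetic behind this design of experiments is easy to reproduce: a full quadratic response surface in six factors has 28 coefficients (intercept, linear, bilinear, curvilinear), so 36 D-optimal runs out of the 729 three-level candidate points suffice to estimate all of them with spare degrees of freedom.

```python
from math import comb

def quadratic_terms(k):
    """Coefficients in a full quadratic response surface in k factors:
    1 intercept + k linear + C(k, 2) bilinear + k curvilinear (squared)."""
    return 1 + k + comb(k, 2) + k

candidates = 3 ** 6          # six variables at three levels each
terms = quadratic_terms(6)   # 28 coefficients to estimate from 36 runs
```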

  9. A Weibull statistics-based lignocellulose saccharification model and a built-in parameter accurately predict lignocellulose hydrolysis performance.

    PubMed

    Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu

    2015-09-01

    Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrate, enzyme, or saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior and we can use the λ parameter to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment were discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
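
    A Weibull-type cumulative-yield curve and the role of λ as a characteristic time can be sketched as follows; the functional form shown and the parameter values are illustrative assumptions, not taken from the paper.

```python
import math

def weibull_yield(t, y_max, lam, n):
    """Weibull-type saccharification curve:
    yield(t) = y_max * (1 - exp(-(t / lam)**n)).
    lam is the characteristic time: yield(lam) = y_max * (1 - 1/e),
    i.e. ~63.2% of the final yield, so a smaller lam means a faster
    overall saccharification system."""
    return y_max * (1.0 - math.exp(-(t / lam) ** n))

y_max, lam, n = 0.85, 24.0, 0.8   # hypothetical parameters (time in hours)
y_at_lam = weibull_yield(lam, y_max, lam, n)
```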

  10. Generating survival times to simulate Cox proportional hazards models with time-varying covariates.

    PubMed

    Austin, Peter C

    2012-12-20

    Simulations and Monte Carlo methods serve an important role in modern statistical research. They allow for an examination of the performance of statistical procedures in settings in which analytic and mathematical derivations may not be feasible. A key element in any statistical simulation is the existence of an appropriate data-generating process: one must be able to simulate data from a specified statistical model. We describe data-generating processes for the Cox proportional hazards model with time-varying covariates when event times follow an exponential, Weibull, or Gompertz distribution. We consider three types of time-varying covariates: first, a dichotomous time-varying covariate that can change at most once from untreated to treated (e.g., organ transplant); second, a continuous time-varying covariate such as cumulative exposure at a constant dose to radiation or to a pharmaceutical agent used for a chronic condition; third, a dichotomous time-varying covariate with a subject being able to move repeatedly between treatment states (e.g., current compliance or use of a medication). In each setting, we derive closed-form expressions that allow one to simulate survival times so that survival times are related to a vector of fixed or time-invariant covariates and to a single time-varying covariate. We illustrate the utility of our closed-form expressions for simulating event times by using Monte Carlo simulations to estimate the statistical power to detect as statistically significant the effect of different types of binary time-varying covariates. This is compared with the statistical power to detect as statistically significant a binary time-invariant covariate. Copyright © 2012 John Wiley & Sons, Ltd.
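
    For the first setting above (a binary covariate that switches once from untreated to treated at time t0) with an exponential baseline hazard, the closed-form inversion is short. The sketch below assumes the hazard h(t) = λ·exp(β·x(t)); parameter names and values are illustrative, not taken from the paper:

```python
import math
import random

def sim_event_time(lam, beta, t0, u):
    """Invert the cumulative hazard H(t) for h(t) = lam * exp(beta * x(t)),
    where x(t) = 0 before treatment time t0 and 1 afterwards.
    u is a Uniform(0,1) draw; the event time solves H(T) = -log(u)."""
    target = -math.log(u)
    if target < lam * t0:          # event occurs before treatment starts
        return target / lam
    # remaining hazard accrues at the treated rate lam * exp(beta)
    return t0 + (target - lam * t0) / (lam * math.exp(beta))

random.seed(1)
# Protective treatment (beta < 0) started at t0 = 0.5:
times = [sim_event_time(lam=1.0, beta=-0.7, t0=0.5, u=random.random())
         for _ in range(100_000)]
```

    A protective treatment (β < 0) stretches out event times that would otherwise have occurred after t0; with β = 0 the draw reduces to a plain exponential, a useful sanity check.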

  11. Progress of statistical analysis in biomedical research through the historical review of the development of the Framingham score.

    PubMed

    Ignjatović, Aleksandra; Stojanović, Miodrag; Milošević, Zoran; Anđelković Apostolović, Marija

    2017-12-02

    Developing risk models in medicine is appealing, but it is also associated with many obstacles across the different aspects of predictive model development. Initially, the association of one or more biomarkers with a specific outcome was established by statistical significance alone, but novel and demanding questions have required the development of new and more complex statistical techniques. The progress of statistical analysis in biomedical research is best observed through the history of the Framingham study and the development of the Framingham score. Evaluation of predictive models rests on a combination of several metrics. Logistic regression and Cox proportional hazards regression analysis, the calibration test, and ROC curve analysis should be mandatory and eliminatory, while the central place should be taken by newer statistical techniques. In order to obtain complete information on a new marker in the model, it is now recommended to use reclassification tables, calculating the net reclassification index and the integrated discrimination improvement. Decision curve analysis is a novel method for evaluating the clinical usefulness of a predictive model. It may be noted that customizing and fine-tuning the Framingham risk score spurred the development of statistical analysis. A clinically applicable predictive model should be a trade-off among all of the abovementioned statistical metrics: between calibration and discrimination, accuracy and decision-making, costs and benefits, and the quality and quantity of the patient's life.

  12. Pattern statistics on Markov chains and sensitivity to parameter estimation

    PubMed Central

    Nuel, Grégory

    2006-01-01

    Background: In order to compute pattern statistics in computational biology, a Markov model is commonly used to take into account the sequence composition. Usually its parameters must be estimated. The aim of this paper is to determine how sensitive these statistics are to parameter estimation, and what the consequences of this variability are for pattern studies (finding the most over-represented words in a genome, the most significant words common to a set of sequences, ...). Results: In the particular case where pattern statistics (overlap counting only) are computed through binomial approximations, we use the delta-method to give an explicit expression for σ, the standard deviation of a pattern statistic. This result is validated using simulations, and a simple pattern study is also considered. Conclusion: We establish that the use of high-order Markov models could easily lead to major mistakes due to the high sensitivity of pattern statistics to parameter estimation. PMID:17044916

  13. Pattern statistics on Markov chains and sensitivity to parameter estimation.

    PubMed

    Nuel, Grégory

    2006-10-17

    In order to compute pattern statistics in computational biology, a Markov model is commonly used to take into account the sequence composition. Usually its parameters must be estimated. The aim of this paper is to determine how sensitive these statistics are to parameter estimation, and what the consequences of this variability are for pattern studies (finding the most over-represented words in a genome, the most significant words common to a set of sequences, ...). In the particular case where pattern statistics (overlap counting only) are computed through binomial approximations, we use the delta-method to give an explicit expression for sigma, the standard deviation of a pattern statistic. This result is validated using simulations, and a simple pattern study is also considered. We establish that the use of high-order Markov models could easily lead to major mistakes due to the high sensitivity of pattern statistics to parameter estimation.
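
    The sensitivity discussed above starts from the expected pattern count itself. As a hedged sketch, for a stationary first-order Markov chain the expected number of overlapping occurrences of a word w in a sequence of length n is (n − |w| + 1) · μ(w₁) · ∏ π(wᵢ, wᵢ₊₁); the helper below assumes that formula, with illustrative parameters:

```python
def expected_count(word, n, start, trans):
    """Expected number of overlapping occurrences of `word` in a sequence
    of length n from a stationary first-order Markov chain with stationary
    distribution `start` and transition probabilities `trans`."""
    p = start[word[0]]
    for a, b in zip(word, word[1:]):
        p *= trans[a][b]
    return (n - len(word) + 1) * p

# Uniform i.i.d. model over the DNA alphabet as a special Markov case:
alphabet = "ACGT"
start = {a: 0.25 for a in alphabet}
trans = {a: {b: 0.25 for b in alphabet} for a in alphabet}
e = expected_count("ATG", 10, start, trans)  # 8 positions * 0.25**3 = 0.125
```

    Because the estimated transition probabilities enter the formula as a product, small estimation errors compound with word length, which is one intuition for the high sensitivity the paper reports for high-order models.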

  14. The effect of a major cigarette price change on smoking behavior in california: a zero-inflated negative binomial model.

    PubMed

    Sheu, Mei-Ling; Hu, Teh-Wei; Keeler, Theodore E; Ong, Michael; Sung, Hai-Yen

    2004-08-01

    The objective of this paper is to determine the price sensitivity of smokers in their consumption of cigarettes, using evidence from a major increase in California cigarette prices due to Proposition 10 and the Tobacco Settlement. The study sample consists of individual survey data from Behavioral Risk Factor Survey (BRFS) and price data from the Bureau of Labor Statistics between 1996 and 1999. A zero-inflated negative binomial (ZINB) regression model was applied for the statistical analysis. The statistical model showed that price did not have an effect on reducing the estimated prevalence of smoking. However, it indicated that among smokers the price elasticity was at the level of -0.46 and statistically significant. Since smoking prevalence is significantly lower than it was a decade ago, price increases are becoming less effective as an inducement for hard-core smokers to quit, although they may respond by decreasing consumption. For those who only smoke occasionally (many of them being young adults) price increases alone may not be an effective inducement to quit smoking. Additional underlying behavioral factors need to be identified so that more effective anti-smoking strategies can be developed.
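
    As a hedged sketch of the ZINB idea (using the textbook (r, p) count parameterization rather than the regression form fitted in the paper), the model mixes a point mass at zero with a negative binomial distribution, which is what lets it accommodate both never-smokers' structural zeros and smokers' consumption counts:

```python
from math import comb

def nb_pmf(k, r, p):
    """Negative binomial pmf: number of failures k before the r-th success,
    with per-trial success probability p."""
    return comb(k + r - 1, k) * (p ** r) * ((1 - p) ** k)

def zinb_pmf(k, pi, r, p):
    """Zero-inflated NB: with probability pi the count is a structural zero,
    otherwise it is drawn from NB(r, p)."""
    base = (1 - pi) * nb_pmf(k, r, p)
    return pi + base if k == 0 else base

# Excess zeros relative to the plain NB model (illustrative parameters):
p0_nb = nb_pmf(0, r=2, p=0.3)            # 0.3**2 = 0.09
p0_zinb = zinb_pmf(0, pi=0.4, r=2, p=0.3)  # 0.4 + 0.6*0.09 = 0.454
```

    In the full regression model, pi and the NB mean are each linked to covariates such as price, so that price can affect participation (the zero part) and consumption (the count part) separately, matching the paper's two distinct findings.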

  15. Weak lensing probe of cubic Galileon model

    NASA Astrophysics Data System (ADS)

    Dinda, Bikash R.

    2018-06-01

    The cubic Galileon model, containing the lowest non-trivial order of the full Galileon action, can produce stable late-time cosmic acceleration. This model can have a significant role in the growth of structures. The signatures of the cubic Galileon model in structure formation can be probed by weak lensing statistics. Weak lensing convergence statistics is one of the strongest probes of structure formation and hence can probe dark energy or modified theories of gravity. In this work, we investigate the distinguishability of the cubic Galileon model from the ΛCDM model or from the canonical quintessence model through the convergence power spectrum and bi-spectrum.

  16. Governance and Regional Variation of Homicide Rates: Evidence From Cross-National Data.

    PubMed

    Cao, Liqun; Zhang, Yan

    2017-01-01

    Criminological theories of cross-national studies of homicide have underestimated the effects of quality governance of liberal democracy and region. Data sets from several sources are combined and a comprehensive model of homicide is proposed. Results of the spatial regression model, which controls for the effect of spatial autocorrelation, show that quality governance, human development, economic inequality, and ethnic heterogeneity are statistically significant in predicting homicide. In addition, regions of Latin America and non-Muslim Sub-Saharan Africa have significantly higher rates of homicides ceteris paribus while the effects of East Asian countries and Islamic societies are not statistically significant. These findings are consistent with the expectation of the new modernization and regional theories. © The Author(s) 2015.

  17. Computational algebraic geometry for statistical modeling FY09Q2 progress.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, David C.; Rojas, Joseph Maurice; Pebay, Philippe Pierre

    2009-03-01

    This is a progress report on polynomial system solving for statistical modeling. This quarter we have developed our first model of shock response data and an algorithm for identifying the chamber cone containing a polynomial system in n variables with n+k terms within polynomial time - a significant improvement over previous algorithms, all having exponential worst-case complexity. We have implemented and verified the chamber cone algorithm for n+3 and are working to extend the implementation to handle arbitrary k. Later sections of this report explain chamber cones in more detail; the next section provides an overview of the project and how the current progress fits into it.

  18. Statistical assessment of bi-exponential diffusion weighted imaging signal characteristics induced by intravoxel incoherent motion in malignant breast tumors

    PubMed Central

    Wong, Oi Lei; Lo, Gladys G.; Chan, Helen H. L.; Wong, Ting Ting; Cheung, Polly S. Y.

    2016-01-01

    Background The purpose of this study is to statistically assess whether bi-exponential intravoxel incoherent motion (IVIM) model better characterizes diffusion weighted imaging (DWI) signal of malignant breast tumor than mono-exponential Gaussian diffusion model. Methods 3 T DWI data of 29 malignant breast tumors were retrospectively included. Linear least-square mono-exponential fitting and segmented least-square bi-exponential fitting were used for apparent diffusion coefficient (ADC) and IVIM parameter quantification, respectively. F-test and Akaike Information Criterion (AIC) were used to statistically assess the preference of mono-exponential and bi-exponential model using region-of-interests (ROI)-averaged and voxel-wise analysis. Results For ROI-averaged analysis, 15 tumors were significantly better fitted by bi-exponential function and 14 tumors exhibited mono-exponential behavior. The calculated ADC, D (true diffusion coefficient) and f (pseudo-diffusion fraction) showed no significant differences between mono-exponential and bi-exponential preferable tumors. Voxel-wise analysis revealed that 27 tumors contained more voxels exhibiting mono-exponential DWI decay while only 2 tumors presented more bi-exponential decay voxels. ADC was consistently and significantly larger than D for both ROI-averaged and voxel-wise analysis. Conclusions Although the presence of IVIM effect in malignant breast tumors could be suggested, statistical assessment shows that bi-exponential fitting does not necessarily better represent the DWI signal decay in breast cancer under clinically typical acquisition protocol and signal-to-noise ratio (SNR). Our study indicates the importance to statistically examine the breast cancer DWI signal characteristics in practice. PMID:27709078
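
    The AIC comparison described above reduces to simple arithmetic once each fit's residual sum of squares (RSS) is in hand: for least-squares fits, AIC = n·ln(RSS/n) + 2k up to an additive constant, and the nested mono- vs bi-exponential models can also be compared with an F statistic. A sketch with illustrative numbers (the paper may use a corrected AIC variant):

```python
import math

def aic_ls(rss, n, k):
    """AIC for a least-squares fit with k free parameters
    (Gaussian errors, up to an additive constant)."""
    return n * math.log(rss / n) + 2 * k

def f_statistic(rss_simple, k_simple, rss_complex, k_complex, n):
    """F statistic for comparing nested least-squares models."""
    num = (rss_simple - rss_complex) / (k_complex - k_simple)
    den = rss_complex / (n - k_complex)
    return num / den

# Illustrative: 16 b-values, mono-exponential (2 params: S0, ADC)
# versus bi-exponential IVIM (4 params: S0, D, D*, f).
n = 16
aic_mono = aic_ls(rss=0.020, n=n, k=2)
aic_bi = aic_ls(rss=0.008, n=n, k=4)
prefer_bi = aic_bi < aic_mono   # bi-exponential wins only if the RSS drop
                                # outweighs its two extra parameters
```

    This penalty on extra parameters is why, as the abstract reports, many voxels can still favor the mono-exponential fit even when some IVIM effect is present.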

  19. Assessing the statistical significance of the achieved classification error of classifiers constructed using serum peptide profiles, and a prescription for random sampling repeated studies for massive high-throughput genomic and proteomic studies.

    PubMed

    Lyons-Weiler, James; Pelikan, Richard; Zeh, Herbert J; Whitcomb, David C; Malehorn, David E; Bigbee, William L; Hauskrecht, Milos

    2005-01-01

    Peptide profiles generated using SELDI/MALDI time of flight mass spectrometry provide a promising source of patient-specific information with high potential impact on the early detection and classification of cancer and other diseases. The new profiling technology comes, however, with numerous challenges and concerns. Particularly important are concerns of reproducibility of classification results and their significance. In this work we describe a computational validation framework, called PACE (Permutation-Achieved Classification Error), that lets us assess, for a given classification model, the significance of the Achieved Classification Error (ACE) on the profile data. The framework compares the performance statistic of the classifier on true data samples and checks if these are consistent with the behavior of the classifier on the same data with randomly reassigned class labels. A statistically significant ACE increases our belief that a discriminative signal was found in the data. The advantage of PACE analysis is that it can be easily combined with any classification model and is relatively easy to interpret. PACE analysis does not protect researchers against confounding in the experimental design, or other sources of systematic or random error. We use PACE analysis to assess significance of classification results we have achieved on a number of published data sets. The results show that many of these datasets indeed possess a signal that leads to a statistically significant ACE.
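
    The PACE logic can be sketched compactly: compute the achieved classification error (ACE) on the true labels, then compare it with the error distribution obtained after randomly reassigning class labels. The toy below uses a nearest-centroid classifier under leave-one-out on 1-D synthetic data, a stand-in for the authors' models and profile data, not their pipeline:

```python
import random
from statistics import mean

def loo_error(values, labels):
    """Leave-one-out error of a nearest-centroid classifier on 1-D data."""
    errors = 0
    for i, (x, y) in enumerate(zip(values, labels)):
        train = [(v, l) for j, (v, l) in enumerate(zip(values, labels)) if j != i]
        cents = {c: mean(v for v, l in train if l == c) for c in set(labels)}
        pred = min(cents, key=lambda c: abs(x - cents[c]))
        errors += (pred != y)
    return errors / len(values)

random.seed(7)
values = [random.gauss(0, 1) for _ in range(8)] + [random.gauss(4, 1) for _ in range(8)]
labels = [0] * 8 + [1] * 8

ace = loo_error(values, labels)   # achieved classification error

# Permutation null: same data, randomly reassigned class labels.
B = 199
hits = 0
for _ in range(B):
    perm = labels[:]
    random.shuffle(perm)
    if loo_error(values, perm) <= ace:
        hits += 1
p_value = (hits + 1) / (B + 1)
```

    A small p-value says the classifier's error on the true labels is rarely matched by chance labelings, increasing belief that a discriminative signal was found; as the abstract cautions, it says nothing about confounding in the experimental design.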

  20. Predicting juvenile recidivism: new method, old problems.

    PubMed

    Benda, B B

    1987-01-01

    This prediction study compared three statistical procedures for accuracy using two assessment methods. The criterion is return to a juvenile prison after the first release, and the models tested are logit analysis, predictive attribute analysis, and a Burgess procedure. No significant differences in predictive accuracy are found among the three procedures.

  1. Teaching MBA Statistics Online: A Pedagogically Sound Process Approach

    ERIC Educational Resources Information Center

    Grandzol, John R.

    2004-01-01

    Delivering MBA statistics in the online environment presents significant challenges to educators and students alike because of varying student preparedness levels, complexity of content, difficulty in assessing learning outcomes, and faculty availability and technological expertise. In this article, the author suggests a process model that…

  2. Administration of honey to prevent peritoneal adhesions in a rat peritonitis model.

    PubMed

    Yuzbasioglu, Mehmet Fatih; Kurutas, Ergul Belge; Bulbuloglu, Ertan; Goksu, Mustafa; Atli, Yalcin; Bakan, Vedat; Kale, Ilhami Taner

    2009-02-01

    We investigated the effects of intraperitoneal honey on the development of postoperative intra-abdominal adhesions and oxidative stress in a model of bacterial peritonitis. Bacterial peritonitis was induced in 18 rats by cecal ligation and puncture. The rats were randomly assigned to three groups. Group 1 (n=6) received honey intraperitoneally, group 2 (n=6) received 5% dextrose intraperitoneally, and the third group received no fluid or medicine intraperitoneally one day after the cecal ligation and puncture procedure. All animals were killed 14 days later for assessment of adhesion scores. Tissue antioxidant levels were measured in 1-g tissue samples taken from the abdominal wall. Adhesion scores in the honey-treated group were significantly lower than in the control group (P<0.05). Adhesion scores in the honey group were also lower than in the 5% dextrose group, but the difference was not statistically significant (P>0.05). Malondialdehyde values in the honey group were significantly lower than in the control group (P<0.05), and levels in the 5% dextrose group were higher than in the honey group. Catalase levels were high in the control and 5% dextrose groups. Superoxide dismutase levels were significantly higher in the control group than in the honey group. Intraperitoneal honey decreased the formation of postoperative intra-abdominal adhesions without compromising wound healing in this bacterial peritonitis rat model. Honey also decreased oxidative stress during peritonitis.

  3. Statistical framework for evaluation of climate model simulations by use of climate proxy data from the last millennium - Part 1: Theory

    NASA Astrophysics Data System (ADS)

    Sundberg, R.; Moberg, A.; Hind, A.

    2012-08-01

    A statistical framework for comparing the output of ensemble simulations from global climate models with networks of climate proxy and instrumental records has been developed, focusing on near-surface temperatures for the last millennium. This framework includes the formulation of a joint statistical model for proxy data, instrumental data and simulation data, which is used to optimize a quadratic distance measure for ranking climate model simulations. An essential underlying assumption is that the simulations and the proxy/instrumental series have a shared component of variability that is due to temporal changes in external forcing, such as volcanic aerosol load, solar irradiance or greenhouse gas concentrations. Two statistical tests have been formulated. Firstly, a preliminary test establishes whether a significant temporal correlation exists between instrumental/proxy and simulation data. Secondly, the distance measure is expressed in the form of a test statistic of whether a forced simulation is closer to the instrumental/proxy series than unforced simulations. The proposed framework allows any number of proxy locations to be used jointly, with different seasons, record lengths and statistical precision. The goal is to objectively rank several competing climate model simulations (e.g. with alternative model parameterizations or alternative forcing histories) by means of their goodness of fit to the unobservable true past climate variations, as estimated from noisy proxy data and instrumental observations.

  4. Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling

    PubMed Central

    Wood, John

    2017-01-01

    Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered—some very seriously so—but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. 
This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. PMID:28706080
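
    The mixture-modeling step described above can be sketched with a plain two-component EM fit. The code below is a minimal 1-D Gaussian mixture sketch on synthetic "power" values, not the authors' procedure or data:

```python
import math
import random

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit_gmm2(data, iters=100):
    """EM for a two-component 1-D Gaussian mixture."""
    m0 = sum(data) / len(data)
    v0 = sum((x - m0) ** 2 for x in data) / len(data)
    w = [0.5, 0.5]
    mu = [min(data), max(data)]      # spread the initial means apart
    var = [v0, v0]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            d = [w[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)]
            s = d[0] + d[1]
            resp.append([d[0] / s, d[1] / s])
        # M-step: re-estimate weights, means, variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
    return w, mu, var

random.seed(3)
# Synthetic "statistical power" values with two subcomponents:
data = [random.gauss(0.2, 0.05) for _ in range(400)] + \
       [random.gauss(0.8, 0.05) for _ in range(200)]
weights, means, variances = fit_gmm2(data)
```

    When such a fit finds well-separated components, a single summary statistic like the median clearly cannot characterize the distribution, which is the abstract's central point.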

  5. Human-modified temperatures induce species changes: Joint attribution.

    PubMed

    Root, Terry L; MacMynowski, Dena P; Mastrandrea, Michael D; Schneider, Stephen H

    2005-05-24

    Average global surface-air temperature is increasing. Contention exists over relative contributions by natural and anthropogenic forcings. Ecological studies attribute plant and animal changes to observed warming. Until now, temperature-species connections have not been statistically attributed directly to anthropogenic climatic change. Using modeled climatic variables and observed species data, which are independent of thermometer records and paleoclimatic proxies, we demonstrate statistically significant "joint attribution," a two-step linkage: human activities contribute significantly to temperature changes and human-changed temperatures are associated with discernible changes in plant and animal traits. Additionally, our analyses provide independent testing of grid-box-scale temperature projections from a general circulation model (HadCM3).

  6. Relative strength of tailor's bunion osteotomies and fixation techniques.

    PubMed

    Haddon, Todd B; LaPointe, Stephan J

    2013-01-01

    A paucity of data is available on the mechanical strength of fifth metatarsal osteotomies. The present study was designed to provide that information. Five osteotomies were mechanically tested to failure using a materials testing machine and compared with an intact fifth metatarsal using a hollow saw bone model with a sample size of 10 for each construct. The osteotomies tested were the distal reverse chevron fixated with a Kirschner wire, the long plantar reverse chevron osteotomy fixated with 2 screws, a mid-diaphyseal sagittal plane osteotomy fixated with 2 screws, the mid-diaphyseal sagittal plane osteotomy fixated with 2 screws and an additional cerclage wire, and a transverse closing wedge osteotomy fixated with a box wire technique. Analysis of variance was performed, resulting in a statistically significant difference among the data at p <.0001. The Tukey-Kramer honestly significant difference test was performed post hoc to separate out the pairs at a minimum α of 0.05. The chevron was statistically the strongest construct at 130 N, followed by the long plantar osteotomy at 78 N. The chevron compared well with the control at 114 N, and both fractured at the proximal model-to-fixture interface. The other osteotomies were statistically significantly weaker than both the chevron and the long plantar constructs, with no statistically significant difference among them at 36, 39, and 48 N. In conclusion, the chevron osteotomy was superior in strength to the sagittal and transverse plane osteotomies and similar in strength and failure mode to the intact model. Copyright © 2013 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  7. Using Geographic Information Science to Explore Associations between Air Pollution, Environmental Amenities, and Preterm Births

    PubMed Central

    Ogneva-Himmelberger, Yelena; Dahlberg, Tyler; Kelly, Kristen; Simas, Tiffany A. Moore

    2015-01-01

    The study uses geographic information science (GIS) and statistics to find out if there are statistical differences between full term and preterm births to non-Hispanic white, non-Hispanic Black, and Hispanic mothers in their exposure to air pollution and access to environmental amenities (green space and vendors of healthy food) in the second largest city in New England, Worcester, Massachusetts. Proximity to a Toxic Release Inventory site has a statistically significant effect on preterm birth regardless of race. The air-pollution hazard score from the Risk Screening Environmental Indicators Model is also a statistically significant factor when preterm births are categorized into three groups based on the degree of prematurity. Proximity to green space and to a healthy food vendor did not have an effect on preterm births. The study also used cluster analysis and found statistically significant spatial clusters of high preterm birth volume for non-Hispanic white, non-Hispanic Black, and Hispanic mothers. PMID:29546120

  8. Using Geographic Information Science to Explore Associations between Air Pollution, Environmental Amenities, and Preterm Births.

    PubMed

    Ogneva-Himmelberger, Yelena; Dahlberg, Tyler; Kelly, Kristen; Simas, Tiffany A Moore

    2015-01-01

    The study uses geographic information science (GIS) and statistics to find out if there are statistical differences between full term and preterm births to non-Hispanic white, non-Hispanic Black, and Hispanic mothers in their exposure to air pollution and access to environmental amenities (green space and vendors of healthy food) in the second largest city in New England, Worcester, Massachusetts. Proximity to a Toxic Release Inventory site has a statistically significant effect on preterm birth regardless of race. The air-pollution hazard score from the Risk Screening Environmental Indicators Model is also a statistically significant factor when preterm births are categorized into three groups based on the degree of prematurity. Proximity to green space and to a healthy food vendor did not have an effect on preterm births. The study also used cluster analysis and found statistically significant spatial clusters of high preterm birth volume for non-Hispanic white, non-Hispanic Black, and Hispanic mothers.

  9. Exposure time independent summary statistics for assessment of drug dependent cell line growth inhibition.

    PubMed

    Falgreen, Steffen; Laursen, Maria Bach; Bødker, Julie Støve; Kjeldsen, Malene Krag; Schmitz, Alexander; Nyegaard, Mette; Johnsen, Hans Erik; Dybkær, Karen; Bøgsted, Martin

    2014-06-05

    In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves' dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. 
    Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significantly diverse range of responses, ensuring that it is useful for biological interpretations. Time-independent summary statistics may aid the understanding of drugs' mechanisms of action on tumour cells and potentially renew previous drug sensitivity evaluation studies.
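
    Step 2 of the workflow above, isotonic regression, has a classic stdlib-friendly algorithm: pool adjacent violators (PAVA). The sketch below fits a nonincreasing curve, as for cell viability that falls with dose; the data are illustrative, not from the screen:

```python
def pava_nondecreasing(y):
    """Pool-adjacent-violators: least-squares nondecreasing fit to y."""
    # Each block holds [sum, count]; adjacent violating blocks are merged.
    blocks = []
    for v in y:
        blocks.append([v, 1])
        while (len(blocks) > 1
               and blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]):
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    fit = []
    for s, c in blocks:
        fit.extend([s / c] * c)      # each merged block becomes its mean
    return fit

def pava_nonincreasing(y):
    """Nonincreasing fit, obtained by negating a nondecreasing fit."""
    return [-v for v in pava_nondecreasing([-v for v in y])]

# Noisy viability fractions across increasing doses:
viability = [1.0, 0.9, 0.95, 0.5, 0.6, 0.2]
curve = pava_nonincreasing(viability)
```

    The monotone constraint encodes the biological assumption that higher drug doses cannot increase growth, without imposing any parametric curve shape.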

  10. Exposure time independent summary statistics for assessment of drug dependent cell line growth inhibition

    PubMed Central

    2014-01-01

    Background In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves’ dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. Results First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. 
    Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significantly diverse range of responses, ensuring that it is useful for biological interpretations. Conclusion Time-independent summary statistics may aid the understanding of drugs' mechanisms of action on tumour cells and potentially renew previous drug sensitivity evaluation studies. PMID:24902483

  11. Short-term Forecasting of the Prevalence of Trachoma: Expert Opinion, Statistical Regression, versus Transmission Models

    PubMed Central

    Liu, Fengchen; Porco, Travis C.; Amza, Abdou; Kadri, Boubacar; Nassirou, Baido; West, Sheila K.; Bailey, Robin L.; Keenan, Jeremy D.; Solomon, Anthony W.; Emerson, Paul M.; Gambhir, Manoj; Lietman, Thomas M.

    2015-01-01

    Background Trachoma programs rely on guidelines made in large part using expert opinion of what will happen with and without intervention. Large community-randomized trials offer an opportunity to actually compare forecasting methods in a masked fashion. Methods The Program for the Rapid Elimination of Trachoma trials estimated longitudinal prevalence of ocular chlamydial infection from 24 communities treated annually with mass azithromycin. Given antibiotic coverage and biannual assessments from baseline through 30 months, forecasts of the prevalence of infection in each of the 24 communities at 36 months were made by three methods: the sum of 15 experts’ opinion, statistical regression of the square-root-transformed prevalence, and a stochastic hidden Markov model of infection transmission (Susceptible-Infectious-Susceptible, or SIS model). All forecasters were masked to the 36-month results and to the other forecasts. Forecasts of the 24 communities were scored by the likelihood of the observed results and compared using Wilcoxon’s signed-rank statistic. Findings Regression and SIS hidden Markov models had significantly better likelihood than community expert opinion (p = 0.004 and p = 0.01, respectively). All forecasts scored better when perturbed to decrease Fisher’s information. Each individual expert’s forecast was poorer than the sum of experts. Interpretation Regression and SIS models performed significantly better than expert opinion, although all forecasts were overly confident. Further model refinements may score better, although they would need to be tested and compared in new masked studies. Construction of guidelines that rely on forecasting future prevalence could consider use of mathematical and statistical models. PMID:26302380
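
    Scoring a forecast "by the likelihood of the observed results" can be sketched with a binomial likelihood. The example below is purely illustrative (sample sizes, prevalences, and the trial's exact scoring rule are assumptions, not taken from the paper):

```python
from math import comb, log

def binom_log_likelihood(forecast_prev, n_sampled, n_positive):
    """Log-likelihood of observing n_positive infections out of n_sampled,
    under a forecast prevalence."""
    p = forecast_prev
    return (log(comb(n_sampled, n_positive))
            + n_positive * log(p) + (n_sampled - n_positive) * log(1 - p))

# One hypothetical community: 100 children sampled at 36 months, 12 infected.
obs_n, obs_k = 100, 12
score_model = binom_log_likelihood(0.13, obs_n, obs_k)   # model's forecast
score_expert = binom_log_likelihood(0.30, obs_n, obs_k)  # expert's forecast
model_wins = score_model > score_expert
```

    Summing such log-likelihoods over the 24 communities gives each forecaster a single score; paired scores can then be compared with Wilcoxon's signed-rank test as the abstract describes. An overconfident forecast (too narrow around the wrong value) is penalized heavily, which is consistent with the finding that all forecasts scored better when perturbed to decrease Fisher's information.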

  12. Short-term Forecasting of the Prevalence of Trachoma: Expert Opinion, Statistical Regression, versus Transmission Models.

    PubMed

    Liu, Fengchen; Porco, Travis C; Amza, Abdou; Kadri, Boubacar; Nassirou, Baido; West, Sheila K; Bailey, Robin L; Keenan, Jeremy D; Solomon, Anthony W; Emerson, Paul M; Gambhir, Manoj; Lietman, Thomas M

    2015-08-01

    Trachoma programs rely on guidelines made in large part using expert opinion of what will happen with and without intervention. Large community-randomized trials offer an opportunity to actually compare forecasting methods in a masked fashion. The Program for the Rapid Elimination of Trachoma trials estimated longitudinal prevalence of ocular chlamydial infection from 24 communities treated annually with mass azithromycin. Given antibiotic coverage and biannual assessments from baseline through 30 months, forecasts of the prevalence of infection in each of the 24 communities at 36 months were made by three methods: the sum of 15 experts' opinion, statistical regression of the square-root-transformed prevalence, and a stochastic hidden Markov model of infection transmission (Susceptible-Infectious-Susceptible, or SIS model). All forecasters were masked to the 36-month results and to the other forecasts. Forecasts of the 24 communities were scored by the likelihood of the observed results and compared using Wilcoxon's signed-rank statistic. Regression and SIS hidden Markov models had significantly better likelihood than community expert opinion (p = 0.004 and p = 0.01, respectively). All forecasts scored better when perturbed to decrease Fisher's information. Each individual expert's forecast was poorer than the sum of experts. Regression and SIS models performed significantly better than expert opinion, although all forecasts were overly confident. Further model refinements may score better, although would need to be tested and compared in new masked studies. Construction of guidelines that rely on forecasting future prevalence could consider use of mathematical and statistical models. Clinicaltrials.gov NCT00792922.
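The scoring step in this record — rating each community's forecast by the likelihood of the observed result, then comparing methods pairwise with Wilcoxon's signed-rank statistic — can be sketched in a few lines. This is a minimal pure-Python illustration; the scores below are invented, not trial data, and real analyses would also derive a p-value from the statistic.

```python
# Minimal sketch: comparing two forecasting methods by paired
# per-community log-likelihood scores using Wilcoxon's signed-rank
# statistic W+ (sum of ranks of the positive differences).
# All scores below are illustrative, not data from the trials.

def wilcoxon_signed_rank(x, y):
    """Return W+ = the sum of ranks of the positive (x - y) differences.
    Zero differences are dropped; tied |d| values share average ranks."""
    diffs = [round(a - b, 9) for a, b in zip(x, y) if round(a - b, 9) != 0]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1          # 1-based average rank for a tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return sum(r for d, r in zip(diffs, ranks) if d > 0)

# Hypothetical per-community log-likelihood scores for 6 communities:
model_scores  = [-1.2, -0.8, -1.5, -0.9, -1.1, -0.7]
expert_scores = [-1.6, -1.0, -1.4, -1.3, -1.5, -1.2]
w_plus = wilcoxon_signed_rank(model_scores, expert_scores)
print(w_plus)  # → 20.0
```

A large W+ (relative to its null distribution) indicates the first method's scores are systematically higher, which is the sense in which the regression and SIS models "had significantly better likelihood" than expert opinion.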

  13. Applying quantitative adiposity feature analysis models to predict benefit of bevacizumab-based chemotherapy in ovarian cancer patients

    NASA Astrophysics Data System (ADS)

    Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; More, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2016-03-01

    How to rationally identify epithelial ovarian cancer (EOC) patients who will benefit from bevacizumab or other antiangiogenic therapies is a critical issue in EOC treatment. The motivation of this study is to quantitatively measure adiposity features from CT images and investigate the feasibility of predicting the potential benefit to EOC patients, with or without bevacizumab-based chemotherapy, using multivariate statistical models built on quantitative adiposity image features. A dataset of CT images from 59 advanced EOC patients was included. Among them, 32 patients received maintenance bevacizumab after primary chemotherapy and the remaining 27 did not. We developed a computer-aided detection (CAD) scheme to automatically segment subcutaneous fat areas (SFA) and visceral fat areas (VFA) and then extracted 7 adiposity-related quantitative features. Three multivariate data analysis models (linear regression, logistic regression, and Cox proportional hazards regression) were applied to investigate the potential association between the model-generated prediction results and the patients' progression-free survival (PFS) and overall survival (OS). The results show that with all 3 statistical models, a statistically significant association was detected between the model-generated results and both clinical outcomes in the group of patients receiving maintenance bevacizumab (p<0.01), while there was no significant association for either PFS or OS in the group of patients not receiving maintenance bevacizumab. This study therefore demonstrated the feasibility of using statistical prediction models built on quantitative adiposity-related CT image features to generate a new clinical marker and predict the clinical outcome of EOC patients receiving maintenance bevacizumab-based chemotherapy.

  14. Construction of cosmic string induced temperature anisotropy maps with CMBFAST and statistical analysis

    NASA Astrophysics Data System (ADS)

    Simatos, N.; Perivolaropoulos, L.

    2001-01-01

    We use the publicly available code CMBFAST, as modified by Pogosian and Vachaspati, to simulate the effects of wiggly cosmic strings on the cosmic microwave background (CMB). Using the modified CMBFAST code, which takes into account vector modes and models wiggly cosmic strings by the one-scale model, we go beyond the angular power spectrum to construct CMB temperature maps with a resolution of a few degrees. The statistics of these maps are then studied using conventional and recently proposed statistical tests optimized for the detection of hidden temperature discontinuities induced by the Gott-Kaiser-Stebbins effect. We show, however, that these realistic maps cannot be distinguished in a statistically significant way from purely Gaussian maps with an identical power spectrum.

  15. Modeling Ka-band low elevation angle propagation statistics

    NASA Technical Reports Server (NTRS)

    Russell, Thomas A.; Weinfield, John; Pearson, Chris; Ippolito, Louis J.

    1995-01-01

    The statistical variability of the secondary atmospheric propagation effects on satellite communications cannot be ignored at frequencies of 20 GHz or higher, particularly if the propagation margin allocation is such that link availability falls below 99 percent. The secondary effects considered in this paper are gaseous absorption, cloud absorption, and tropospheric scintillation; rain attenuation is the primary effect. Techniques and example results are presented for estimation of the overall combined impact of the atmosphere on satellite communications reliability. Statistical methods are employed throughout and the most widely accepted models for the individual effects are used wherever possible. The degree of correlation between the effects is addressed and some bounds on the expected variability in the combined effects statistics are derived from the expected variability in correlation. Example estimates are presented of combined effects statistics for the Washington, D.C. area at 20 GHz and a 5-degree elevation angle. The statistics of water vapor are shown to be sufficient for estimation of the statistics of gaseous absorption at 20 GHz. A computer model based on monthly surface weather is described and tested. Significant improvement in prediction of absorption extremes is demonstrated with the use of path weather data instead of surface data.

  16. Statistics of Statisticians: Critical Mass of Statistics and Operational Research Groups

    NASA Astrophysics Data System (ADS)

    Kenna, Ralph; Berche, Bertrand

    Using a recently developed model, inspired by mean field theory in statistical physics, and data from the UK's Research Assessment Exercise, we analyse the relationship between the qualities of statistics and operational research groups and the quantities of researchers in them. Similar to other academic disciplines, we provide evidence for a linear dependency of quality on quantity up to an upper critical mass, which is interpreted as the average maximum number of colleagues with whom a researcher can communicate meaningfully within a research group. The model also predicts a lower critical mass, which research groups should strive to achieve to avoid extinction. For statistics and operational research, the lower critical mass is estimated to be 9 ± 3. The upper critical mass, beyond which research quality does not significantly depend on group size, is 17 ± 6.

  17. Applications of spatial statistical network models to stream data

    Treesearch

    Daniel J. Isaak; Erin E. Peterson; Jay M. Ver Hoef; Seth J. Wenger; Jeffrey A. Falke; Christian E. Torgersen; Colin Sowder; E. Ashley Steel; Marie-Josee Fortin; Chris E. Jordan; Aaron S. Ruesch; Nicholas Som; Pascal Monestiez

    2014-01-01

    Streams and rivers host a significant portion of Earth's biodiversity and provide important ecosystem services for human populations. Accurate information regarding the status and trends of stream resources is vital for their effective conservation and management. Most statistical techniques applied to data measured on stream networks were developed for...

  18. Three Strategies for the Critical Use of Statistical Methods in Psychological Research

    ERIC Educational Resources Information Center

    Campitelli, Guillermo; Macbeth, Guillermo; Ospina, Raydonal; Marmolejo-Ramos, Fernando

    2017-01-01

    We present three strategies to replace the null hypothesis statistical significance testing approach in psychological research: (1) visual representation of cognitive processes and predictions, (2) visual representation of data distributions and choice of the appropriate distribution for analysis, and (3) model comparison. The three strategies…

  19. Application of a Fuzzy Verification Technique for Assessment of the Weather Running Estimate-Nowcast (WRE-N) Model

    DTIC Science & Technology

    2016-10-01

    comes when considering numerous scores and statistics during a preliminary evaluation of the applicability of the fuzzy-verification minimum coverage...The selection of thresholds with which to generate categorical-verification scores and statistics from the application of both traditional and...of statistically significant numbers of cases; the latter presents a challenge of limited application for assessment of the forecast models’ ability

  20. Addressing the mischaracterization of extreme rainfall in regional climate model simulations - A synoptic pattern based bias correction approach

    NASA Astrophysics Data System (ADS)

    Li, Jingwan; Sharma, Ashish; Evans, Jason; Johnson, Fiona

    2018-01-01

    Addressing systematic biases in regional climate model simulations of extreme rainfall is a necessary first step before assessing changes in future rainfall extremes. Commonly used bias correction methods are designed to match statistics of the overall simulated rainfall with observations. This assumes that change in the mix of different types of extreme rainfall events (i.e. convective and non-convective) in a warmer climate is of little relevance in the estimation of overall change, an assumption that is not supported by empirical or physical evidence. This study proposes an alternative approach to account for the potential change of alternate rainfall types, characterized here by synoptic weather patterns (SPs) using self-organizing maps classification. The objective of this study is to evaluate the added influence of SPs on the bias correction, which is achieved by comparing the corrected distribution of future extreme rainfall with that using conventional quantile mapping. A comprehensive synthetic experiment is first defined to investigate the conditions under which the additional information of SPs makes a significant difference to the bias correction. Using over 600,000 synthetic cases, statistically significant differences are found to be present in 46% of cases. This is followed by a case study over the Sydney region using a high-resolution run of the Weather Research and Forecasting (WRF) regional climate model, which indicates a small change in the proportions of the SPs and a statistically significant change in the extreme rainfall over the region, although the differences between the changes obtained from the two bias correction methods are not statistically significant.
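The "conventional quantile mapping" that the SP-conditioned approach is compared against can be sketched concisely: each simulated value is replaced by the observed value at the same empirical quantile. A minimal sketch with synthetic values and a nearest-rank lookup; real implementations interpolate between quantiles and handle out-of-sample extremes.

```python
# Minimal empirical quantile-mapping sketch: correct a simulated
# rainfall value by locating its quantile in the simulated sample and
# reading off the observed value at that quantile. Synthetic data;
# nearest-rank lookup (no interpolation) for brevity.
from bisect import bisect_left

def quantile_map(value, simulated, observed):
    """Map `value` through the simulated CDF onto the observed one."""
    sim = sorted(simulated)
    obs = sorted(observed)
    q = bisect_left(sim, value) / len(sim)       # empirical quantile of value
    idx = min(int(q * len(obs)), len(obs) - 1)   # nearest observed rank
    return obs[idx]

simulated = [0.0, 1.0, 2.0, 4.0, 8.0]   # model rainfall sample (made up)
observed  = [0.0, 0.5, 3.0, 6.0, 12.0]  # gauge rainfall sample (made up)
print(quantile_map(4.0, simulated, observed))  # → 6.0
```

The SP-based variant in this record would, in effect, apply such a mapping separately within each synoptic-pattern class rather than to the pooled rainfall distribution.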

  1. Statistical relations of salt and selenium loads to geospatial characteristics of corresponding subbasins of the Colorado and Gunnison Rivers in Colorado

    USGS Publications Warehouse

    Leib, Kenneth J.; Linard, Joshua I.; Williams, Cory A.

    2012-01-01

    Elevated loads of salt and selenium can impair the quality of water for both anthropogenic and natural uses. Understanding the environmental processes controlling how salt and selenium are introduced to streams is critical to managing and mitigating the effects of elevated loads. Dominant relations between salt and selenium loads and environmental characteristics can be established by using geospatial data. The U.S. Geological Survey, in cooperation with the Bureau of Reclamation, investigated statistical relations between seasonal salt or selenium loads emanating from the Upper Colorado River Basin and geospatial data. Salt and selenium loads measured during the irrigation and nonirrigation seasons were related to geospatial variables for 168 subbasins within the Gunnison and Colorado River Basins. These geospatial variables represented subbasin characteristics of the physical environment, precipitation, geology, land use, and the irrigation network. All subbasin variables with units of area had statistically significant relations with load. The few variables that were not in units of area but were statistically significant helped to identify types of geospatial data that might influence salt and selenium loading. Following a stepwise approach, combinations of these statistically significant variables were used to develop multiple linear regression models. The models can be used to help prioritize areas where salt and selenium control projects might be most effective.
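The model-building step described above — regressing seasonal load on subbasin characteristics and keeping only statistically significant variables — can be sketched for a single predictor: fit the slope by least squares and judge its significance with a t statistic. The data below are synthetic; a real stepwise build would repeat such tests across the many candidate geospatial variables.

```python
# Sketch: simple least-squares regression of seasonal load on a
# subbasin area variable, with a t statistic for the slope as the
# significance check. Synthetic data, one predictor; a stepwise
# procedure would add or drop candidates based on such tests.
import math

def slope_t_stat(x, y):
    """Fit y = b0 + b1*x by least squares; return (b1, t statistic)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b1 = sxy / sxx
    b0 = my - b1 * mx
    resid = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
    s2 = sum(r ** 2 for r in resid) / (n - 2)   # residual variance
    se_b1 = math.sqrt(s2 / sxx)                 # standard error of slope
    return b1, b1 / se_b1

area = [10.0, 25.0, 40.0, 55.0, 80.0]   # subbasin area, km^2 (made up)
load = [12.0, 30.0, 44.0, 61.0, 85.0]   # seasonal load, tons (made up)
b1, t = slope_t_stat(area, load)
print(round(b1, 3), round(t, 1))
```

A |t| well above the critical value for n − 2 degrees of freedom marks the variable as statistically significant, consistent with the record's finding that area-based variables qualified for the multiple linear regression models.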

  2. Comparison of Artificial Neural Networks and ARIMA statistical models in simulations of target wind time series

    NASA Astrophysics Data System (ADS)

    Kolokythas, Kostantinos; Salamalikis, Vasileios; Argiriou, Athanassios; Kazantzidis, Andreas

    2015-04-01

    Wind results from complex interactions of numerous mechanisms taking place at small or large scales, so better knowledge of its behavior is essential in a variety of applications, especially in the field of power production from wind turbines. In the literature there is a considerable number of models, either physical or statistical, dealing with the simulation and prediction of wind speed. Among others, Artificial Neural Networks (ANNs) are widely used for wind forecasting and, in the great majority of cases, outperform other conventional statistical models. In this study, a number of ANNs with different architectures, created and applied to a dataset of wind time series, are compared to Auto Regressive Integrated Moving Average (ARIMA) statistical models. The data consist of mean hourly wind speeds from a wind farm in a hilly Greek region and cover a period of one year (2013). The main goal is to evaluate the models' ability to simulate successfully the wind speed at a significant point (target). Goodness-of-fit statistics are computed to compare the different methods. In general, the ANNs showed the best performance in the estimation of wind speed, prevailing over the ARIMA models.
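The comparison in this record reduces to fitting competing models to the same wind series and scoring their forecasts with goodness-of-fit statistics. A toy stand-in, under stated assumptions: an AR(1) fit (the simplest ARIMA relative) scored by RMSE against a naive persistence forecast on a synthetic series; the study itself used full ARIMA models and ANNs on real wind-farm data.

```python
# Toy goodness-of-fit comparison: one-step forecasts of a synthetic
# "wind speed" series from (a) a least-squares AR(1) fit and (b) naive
# persistence, both scored by RMSE. A stand-in for the ARIMA-vs-ANN
# comparison; not the study's models or data.
import math
import random

random.seed(0)
# Synthetic AR(1)-like wind series: w_t = 5 + 0.5*(w_{t-1} - 5) + noise
w = [5.0]
for _ in range(1999):
    w.append(5 + 0.5 * (w[-1] - 5) + random.gauss(0, 0.5))

train, test = w[:1000], w[999:]   # split shares one point so test lags line up

# Least-squares AR(1) fit on the training series: w_t - m = phi*(w_{t-1} - m)
m = sum(train) / len(train)
num = sum((a - m) * (b - m) for a, b in zip(train, train[1:]))
den = sum((a - m) ** 2 for a in train[:-1])
phi = num / den

def rmse(pred, actual):
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(actual))

actual = test[1:]                               # one-step-ahead targets
ar1 = [m + phi * (x - m) for x in test[:-1]]    # AR(1) one-step forecasts
persistence = test[:-1]                         # naive "no change" forecasts
print(round(rmse(ar1, actual), 3), round(rmse(persistence, actual), 3))
```

On a genuinely autocorrelated series the fitted model should post the lower RMSE; the study's point is that ANNs pushed this kind of score further than ARIMA on real wind data.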

  3. Analysis of intra-arch and interarch measurements from digital models with 2 impression materials and a modeling process based on cone-beam computed tomography.

    PubMed

    White, Aaron J; Fallis, Drew W; Vandewalle, Kraig S

    2010-04-01

    Study models are an essential part of an orthodontic record. Digital models are now available. One option for generating a digital model is cone-beam computed tomography (CBCT) scanning of orthodontic impressions and bite registrations. However, the accuracy of digital measurements from models generated by this method has yet to be thoroughly evaluated. A plastic typodont was modified with reference points for standardized intra-arch and interarch measurements, and 16 sets of maxillary and mandibular vinylpolysiloxane and alginate impressions were made. A copper wax-bite registration was made with the typodont in maximum intercuspal position to accompany each set of impressions. The impressions were shipped to OrthoProofUSA (Albuquerque, NM), where digital orthodontic models were generated via CBCT. Intra-arch and interarch measurements were made directly on the typodont with electronic digital calipers and on the digital models by using OrthoProofUSA's proprietary DigiModel software. Percentage differences from the typodont of all intra-arch measurements in the alginate and vinylpolysiloxane groups were low, from 0.1% to 0.7%. Statistical analysis of the intra-arch percentage differences from the typodont showed a statistically significant difference between the alginate and vinylpolysiloxane groups only for maxillary intermolar width. However, because of the small percentage differences, this was not considered clinically significant for orthodontic measurements. Percentage differences from the typodont of all interarch measurements in the alginate and vinylpolysiloxane groups were much higher, from 3.3% to 10.7%. 
Statistical analysis of the interarch percentage differences from the typodont of the alginate and vinylpolysiloxane groups showed statistically significant differences between the groups in both the maxillary right canine to mandibular right canine (alginate with a lower percentage difference than vinylpolysiloxane) and the maxillary left second molar to mandibular left second molar (alginate with a greater percentage difference than vinylpolysiloxane) segments. This difference, ranging from 0.24 to 0.72 mm, is clinically significant. In this study, digital orthodontic models from CBCT scans of alginate and vinylpolysiloxane impressions provided a dimensionally accurate representation of intra-arch relationships for orthodontic evaluation. However, the use of copper wax-bite registrations in this CBCT-based process did not result in an accurate digital representation of interarch relationships. Copyright (c) 2010 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  4. Statistical Modeling of Zr/Hf Extraction using TBP-D2EHPA Mixtures

    NASA Astrophysics Data System (ADS)

    Rezaeinejhad Jirandehi, Vahid; Haghshenas Fatmehsari, Davoud; Firoozi, Sadegh; Taghizadeh, Mohammad; Keshavarz Alamdari, Eskandar

    2012-12-01

    In the present work, response surface methodology was employed to study and predict Zr/Hf extraction curves in a solvent extraction system using D2EHPA-TBP mixtures. The effects of temperature, nitric acid concentration, and the TBP/D2EHPA ratio (T/D) on Zr/Hf extraction/separation were studied using a central composite design. The results showed statistically significant effects of T/D, nitric acid concentration, and temperature on the extraction percentages of Zr and Hf. In the case of Zr, a statistically significant interaction was found between T/D and nitric acid concentration, whereas for Hf, the interactions of temperature with both T/D and nitric acid concentration were significant. Additionally, the extraction curves were successfully predicted by the developed statistical regression equations; this approach is faster and more economical than obtaining the curves experimentally.

  5. Efficacy of keratinocyte growth factor (palifermin) for the treatment of caustic esophageal burns

    PubMed Central

    NUMANOĞLU, KEMAL VARIM; TATLI, DUYGU; BEKTAŞ, SIBEL; ER, EBUBEKIR

    2014-01-01

    Current treatment strategies against the development of corrosive esophageal strictures remain unsatisfactory. Thus, the aim of the present study was to investigate the efficacy of keratinocyte growth factor, in the form of palifermin, for the prevention of stricture development following esophageal caustic injuries in a rat model. A total of 32 female Wistar albino rats were divided into four groups, which included the control (C), burn (B), steroid (S) and steroid plus palifermin (S/P) groups. An experimental corrosive esophageal burn model was established in the B, S and S/P groups. Weight gain was recorded and histopathological evaluation was performed for each group. Weight gain in the S and B groups was compared with the control group and statistically significant differences were observed. In addition, statistically significant differences in weight gain were observed between the S/P group and the B group. Histopathologically, statistically significant differences were identified with regard to submucosal collagen deposition, muscularis mucosa and tunica muscularis damage when comparing the B group with the C group. In addition, statistically significant differences were observed when comparing the S and S/P groups with the B group. Furthermore, significant submucosal collagen deposition and tunica muscularis damage were observed in the S group when compared with the S/P group. The stenosis indexes in the C and S groups were significantly lower compared with the B group. In addition, the stenosis index in the S/P group was significantly lower compared with the S group. To the best of our knowledge, the present study is the first to investigate the effect of palifermin on corrosive esophageal burns. The addition of palifermin to the corrosive esophageal burn standard treatment regimen was found to reduce the degree of fibrosis and ameliorate histopathological damage in an experimental model of corrosive esophagitis in rats. PMID:25187801

  6. Artificial neural network study on organ-targeting peptides

    NASA Astrophysics Data System (ADS)

    Jung, Eunkyoung; Kim, Junhyoung; Choi, Seung-Hoon; Kim, Minkyoung; Rhee, Hokyoung; Shin, Jae-Min; Choi, Kihang; Kang, Sang-Kee; Lee, Nam Kyung; Choi, Yun-Jaie; Jung, Dong Hyun

    2010-01-01

    We report a new approach to studying organ targeting of peptides on the basis of peptide sequence information. The positive control data sets consist of organ-targeting peptide sequences identified by the peroral phage-display technique for four organs, and the negative control data are prepared from random sequences. The capacity of our models to make appropriate predictions is validated by statistical indicators including sensitivity, specificity, the enrichment curve, and the area under the receiver operating characteristic (ROC) curve (the ROC score). The VHSE descriptor produces statistically significant training models, and models with simple neural network architectures show slightly greater predictive power than those with complex ones. The training and test set statistics indicate that our models can discriminate between organ-targeting and random sequences. We anticipate that our models will be applicable to the selection of organ-targeting peptides for generating peptide drugs or peptidomimetics.

  7. Cross-Participant EEG-Based Assessment of Cognitive Workload Using Multi-Path Convolutional Recurrent Neural Networks.

    PubMed

    Hefron, Ryan; Borghetti, Brett; Schubert Kabban, Christine; Christensen, James; Estepp, Justin

    2018-04-26

    Applying deep learning methods to electroencephalograph (EEG) data for cognitive state assessment has yielded improvements over previous modeling methods. However, research focused on cross-participant cognitive workload modeling using these techniques is underrepresented. We study the problem of cross-participant state estimation in a non-stimulus-locked task environment, where a trained model is used to make workload estimates on a new participant who is not represented in the training set. Using experimental data from the Multi-Attribute Task Battery (MATB) environment, a variety of deep neural network models are evaluated in the trade-space of computational efficiency, model accuracy, variance and temporal specificity yielding three important contributions: (1) The performance of ensembles of individually-trained models is statistically indistinguishable from group-trained methods at most sequence lengths. These ensembles can be trained for a fraction of the computational cost compared to group-trained methods and enable simpler model updates. (2) While increasing temporal sequence length improves mean accuracy, it is not sufficient to overcome distributional dissimilarities between individuals’ EEG data, as it results in statistically significant increases in cross-participant variance. (3) Compared to all other networks evaluated, a novel convolutional-recurrent model using multi-path subnetworks and bi-directional, residual recurrent layers resulted in statistically significant increases in predictive accuracy and decreases in cross-participant variance.

  8. Cross-Participant EEG-Based Assessment of Cognitive Workload Using Multi-Path Convolutional Recurrent Neural Networks

    PubMed Central

    Hefron, Ryan; Borghetti, Brett; Schubert Kabban, Christine; Christensen, James; Estepp, Justin

    2018-01-01

    Applying deep learning methods to electroencephalograph (EEG) data for cognitive state assessment has yielded improvements over previous modeling methods. However, research focused on cross-participant cognitive workload modeling using these techniques is underrepresented. We study the problem of cross-participant state estimation in a non-stimulus-locked task environment, where a trained model is used to make workload estimates on a new participant who is not represented in the training set. Using experimental data from the Multi-Attribute Task Battery (MATB) environment, a variety of deep neural network models are evaluated in the trade-space of computational efficiency, model accuracy, variance and temporal specificity yielding three important contributions: (1) The performance of ensembles of individually-trained models is statistically indistinguishable from group-trained methods at most sequence lengths. These ensembles can be trained for a fraction of the computational cost compared to group-trained methods and enable simpler model updates. (2) While increasing temporal sequence length improves mean accuracy, it is not sufficient to overcome distributional dissimilarities between individuals’ EEG data, as it results in statistically significant increases in cross-participant variance. (3) Compared to all other networks evaluated, a novel convolutional-recurrent model using multi-path subnetworks and bi-directional, residual recurrent layers resulted in statistically significant increases in predictive accuracy and decreases in cross-participant variance. PMID:29701668

  9. Estimation of Total Nitrogen and Phosphorus in New England Streams Using Spatially Referenced Regression Models

    USGS Publications Warehouse

    Moore, Richard Bridge; Johnston, Craig M.; Robinson, Keith W.; Deacon, Jeffrey R.

    2004-01-01

    The U.S. Geological Survey (USGS), in cooperation with the U.S. Environmental Protection Agency (USEPA) and the New England Interstate Water Pollution Control Commission (NEIWPCC), has developed a water-quality model, called SPARROW (Spatially Referenced Regressions on Watershed Attributes), to assist in regional total maximum daily load (TMDL) and nutrient-criteria activities in New England. SPARROW is a spatially detailed, statistical model that uses regression equations to relate total nitrogen and phosphorus (nutrient) stream loads to nutrient sources and watershed characteristics. The statistical relations in these equations are then used to predict nutrient loads in unmonitored streams. The New England SPARROW models are built using a hydrologic network of 42,000 stream reaches and associated watersheds. Watershed boundaries are defined for each stream reach in the network through the use of a digital elevation model and existing digitized watershed divides. Nutrient source data are derived from permitted wastewater discharges in USEPA's Permit Compliance System (PCS), various land-use sources, and atmospheric deposition. Physical watershed characteristics include drainage area, land use, streamflow, time-of-travel, stream density, percent wetlands, slope of the land surface, and soil permeability. The New England SPARROW models for total nitrogen and total phosphorus have R-squared values of 0.95 and 0.94, with mean square errors of 0.16 and 0.23, respectively. Variables that were statistically significant in the total nitrogen model include permitted municipal-wastewater discharges, atmospheric deposition, agricultural area, and developed land area. Total nitrogen stream-loss rates were significant only in streams with average annual flows less than or equal to 2.83 cubic meters per second. In larger streams, in-stream loss of annual total nitrogen was not detectable in New England. 
Variables that were statistically significant in the total phosphorus model include discharges for municipal wastewater-treatment facilities and pulp and paper facilities, developed land area, agricultural area, and forested area. For total phosphorus, loss rates were significant for reservoirs with surface areas of 10 square kilometers or less, and in streams with flows less than or equal to 2.83 cubic meters per second. Applications of SPARROW for evaluating nutrient loading in New England waters include estimates of the spatial distributions of total nitrogen and phosphorus yields, sources of the nutrients, and the potential for delivery of those yields to receiving waters. This information can be used to (1) predict ranges in nutrient levels in surface waters, (2) identify the environmental variables that are statistically significant predictors of nutrient levels in streams, (3) evaluate monitoring efforts for better determination of nutrient loads, and (4) evaluate management options for reducing nutrient loads to achieve water-quality goals.

  10. Method for Real-Time Model Based Structural Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Urnes, James M., Sr. (Inventor); Smith, Timothy A. (Inventor); Reichenbach, Eric Y. (Inventor)

    2015-01-01

    A system and methods for real-time, model-based vehicle structural anomaly detection are disclosed. A real-time measurement corresponding to a location on a vehicle structure during operation of the vehicle is received, and the real-time measurement is compared to expected operation data for the location to produce a modeling error signal. The statistical significance of the modeling error signal is calculated to provide an error significance, and the persistence of the error significance is determined. A structural anomaly is indicated if the persistence exceeds a persistence threshold value.
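The logic of this patented method can be sketched generically: compare each measurement with the model's expectation, flag statistically significant errors, and raise an anomaly only when the significance persists. In the sketch below a simple z-score threshold stands in for the significance calculation, and all data and thresholds are illustrative assumptions, not values from the patent.

```python
# Generic sketch of persistence-gated anomaly detection: a z-score on
# the modeling error stands in for the significance test, and an
# anomaly is raised only after the error stays significant for
# `persistence_threshold` consecutive samples. Illustrative values.

def detect_anomaly(measurements, expected, sigma=1.0,
                   z_crit=3.0, persistence_threshold=3):
    """Yield (index, anomaly_flag) for each sample."""
    run = 0
    for i, (meas, exp) in enumerate(zip(measurements, expected)):
        z = abs(meas - exp) / sigma            # modeling error significance
        run = run + 1 if z > z_crit else 0     # persistence of significance
        yield i, run >= persistence_threshold

expected = [10.0] * 8
measured = [10.1, 9.8, 14.2, 14.0, 13.9, 14.1, 10.2, 10.0]
flags = [f for _, f in detect_anomaly(measured, expected)]
print(flags)
```

The persistence gate is what keeps a single noisy but significant-looking sample from triggering a false structural-anomaly indication: only the sustained excursion in the middle of the series is flagged.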

  11. Multiplicity Control in Structural Equation Modeling

    ERIC Educational Resources Information Center

    Cribbie, Robert A.

    2007-01-01

    Researchers conducting structural equation modeling analyses rarely, if ever, control for the inflated probability of Type I errors when evaluating the statistical significance of multiple parameters in a model. In this study, the Type I error control, power, and true model rates of familywise and false discovery rate controlling procedures were…

  12. Weak simulated extratropical responses to complete tropical deforestation

    USGS Publications Warehouse

    Findell, K.L.; Knutson, T.R.; Milly, P.C.D.

    2006-01-01

    The Geophysical Fluid Dynamics Laboratory atmosphere-land model version 2 (AM2/LM2) coupled to a 50-m-thick slab ocean model has been used to investigate remote responses to tropical deforestation. Magnitudes and significance of differences between a control run and a deforested run are assessed through comparisons of 50-yr time series, accounting for autocorrelation and field significance. Complete conversion of the broadleaf evergreen forests of South America, central Africa, and the islands of Oceania to grasslands leads to highly significant local responses. In addition, a broad but mild warming is seen throughout the tropical troposphere (<0.2°C between 700 and 150 mb), significant in northern spring and summer. However, the simulation results show very little statistically significant response beyond the Tropics. There are no significant differences in any hydroclimatic variables (e.g., precipitation, soil moisture, evaporation) in either the northern or the southern extratropics. Small but statistically significant local differences in some geopotential height and wind fields are present in the southeastern Pacific Ocean. Use of the same statistical tests on two 50-yr segments of the control run show that the small but significant extratropical differences between the deforested run and the control run are similar in magnitude and area to the differences between nonoverlapping segments of the control run. These simulations suggest that extratropical responses to complete tropical deforestation are unlikely to be distinguishable from natural climate variability.

  13. Detecting signals of drug-drug interactions in a spontaneous reports database.

    PubMed

    Thakrar, Bharat T; Grundschober, Sabine Borel; Doessegger, Lucette

    2007-10-01

    The spontaneous reports database is widely used for detecting signals of ADRs. We have extended the methodology to include the detection of signals of ADRs that are associated with drug-drug interactions (DDI). In particular, we have investigated two different statistical assumptions for detecting signals of DDI. Using the FDA's spontaneous reports database, we investigated two models, a multiplicative and an additive model, to detect signals of DDI. We applied the models to four known DDIs (methotrexate-diclofenac and bone marrow depression, simvastatin-ciclosporin and myopathy, ketoconazole-terfenadine and torsades de pointes, and cisapride-erythromycin and torsades de pointes) and to four drug-event combinations where there is currently no evidence of a DDI (fexofenadine-ketoconazole and torsades de pointes, methotrexate-rofecoxib and bone marrow depression, fluvastatin-ciclosporin and myopathy, and cisapride-azithromycin and torsades de pointes) and estimated the measure of interaction on the two scales. The additive model correctly identified all four known DDIs by giving a statistically significant (P < 0.05) positive measure of interaction. The multiplicative model identified the first two of the known DDIs as having a statistically significant or borderline significant (P < 0.1) positive measure of interaction term, gave a nonsignificant positive trend for the third interaction (P = 0.27), and a negative trend for the last interaction. Both models correctly identified the four known non-interactions by estimating a negative measure of interaction. The spontaneous reports database is a valuable resource for detecting signals of DDIs. In particular, the additive model is more sensitive in detecting such signals. The multiplicative model may further help qualify the strength of the signal detected by the additive model.
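
    The two statistical assumptions can be illustrated with a toy calculation on event-reporting proportions; the numbers below are invented, not taken from the FDA database:

```python
# Proportion of reports mentioning the adverse event, by drug exposure:
# r00 = neither drug, r10 = drug A only, r01 = drug B only, r11 = both drugs.
r00, r10, r01, r11 = 0.01, 0.03, 0.04, 0.10

# Additive scale: is the joint rate larger than the sum of single-drug excesses?
additive_excess = r11 - r10 - r01 + r00          # > 0 suggests a DDI signal

# Multiplicative scale: is the joint rate ratio larger than the product
# of the single-drug rate ratios?
multiplicative_ratio = (r11 * r00) / (r10 * r01)  # > 1 suggests a DDI signal

print(additive_excess, multiplicative_ratio)
```

With these illustrative numbers the additive measure flags an interaction while the multiplicative one does not, which mirrors the paper's finding that the additive model is the more sensitive of the two.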

  14. Detecting signals of drug–drug interactions in a spontaneous reports database

    PubMed Central

    Thakrar, Bharat T; Grundschober, Sabine Borel; Doessegger, Lucette

    2007-01-01

    Aims The spontaneous reports database is widely used for detecting signals of ADRs. We have extended the methodology to include the detection of signals of ADRs that are associated with drug–drug interactions (DDI). In particular, we have investigated two different statistical assumptions for detecting signals of DDI. Methods Using the FDA's spontaneous reports database, we investigated two models, a multiplicative and an additive model, to detect signals of DDI. We applied the models to four known DDIs (methotrexate-diclofenac and bone marrow depression, simvastatin-ciclosporin and myopathy, ketoconazole-terfenadine and torsades de pointes, and cisapride-erythromycin and torsades de pointes) and to four drug-event combinations where there is currently no evidence of a DDI (fexofenadine-ketoconazole and torsades de pointes, methotrexate-rofecoxib and bone marrow depression, fluvastatin-ciclosporin and myopathy, and cisapride-azithromycin and torsades de pointes) and estimated the measure of interaction on the two scales. Results The additive model correctly identified all four known DDIs by giving a statistically significant (P < 0.05) positive measure of interaction. The multiplicative model identified the first two of the known DDIs as having a statistically significant or borderline significant (P < 0.1) positive measure of interaction term, gave a nonsignificant positive trend for the third interaction (P = 0.27), and a negative trend for the last interaction. Both models correctly identified the four known non-interactions by estimating a negative measure of interaction. Conclusions The spontaneous reports database is a valuable resource for detecting signals of DDIs. In particular, the additive model is more sensitive in detecting such signals. The multiplicative model may further help qualify the strength of the signal detected by the additive model. PMID:17506784

  15. Assessment of corneal properties based on statistical modeling of OCT speckle

    PubMed Central

    Jesus, Danilo A.; Iskander, D. Robert

    2016-01-01

    A new approach to assess the properties of the corneal micro-structure in vivo based on the statistical modeling of speckle obtained from Optical Coherence Tomography (OCT) is presented. A number of statistical models were proposed to fit the corneal speckle data obtained from the OCT raw image. Short-term changes in corneal properties were studied by inducing corneal swelling, whereas age-related changes were observed by analyzing data of sixty-five subjects aged between twenty-four and seventy-three years. The Generalized Gamma distribution was shown to be the best model, in terms of the Akaike’s Information Criterion, to fit the OCT corneal speckle. Its parameters showed statistically significant differences (Kruskal-Wallis, p < 0.001) for short-term and age-related corneal changes. In addition, it was observed that age-related changes influence the corneal biomechanical behaviour when corneal swelling is induced. This study shows that the Generalized Gamma distribution can be utilized to model corneal speckle in OCT in vivo, providing complementary quantitative information where the micro-structure of corneal tissue is of essence. PMID:28101409
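
    A minimal sketch of the model-selection step described above, fitting candidate distributions to speckle-like amplitude data and ranking them by Akaike's Information Criterion; the data here are simulated, not OCT measurements, and the candidate set is illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for OCT speckle amplitudes (illustrative only)
data = stats.gengamma.rvs(1.5, 2.0, scale=1.0, size=2000, random_state=rng)

def aic(dist, sample, **fixed):
    """AIC = 2k - 2*log-likelihood, counting only free parameters."""
    params = dist.fit(sample, **fixed)
    k = len(params) - len(fixed)
    loglik = np.sum(dist.logpdf(sample, *params))
    return 2 * k - 2 * loglik

# Location fixed at zero, since speckle amplitudes are non-negative
scores = {name: aic(dist, data, floc=0)
          for name, dist in [("gengamma", stats.gengamma),
                             ("gamma", stats.gamma),
                             ("rayleigh", stats.rayleigh)]}
best = min(scores, key=scores.get)   # lowest AIC wins
print(best, scores)
```

The same pattern extends to any set of `scipy.stats` distributions; the model with the lowest AIC balances goodness of fit against the number of parameters.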

  16. Forecasting runout of rock and debris avalanches

    USGS Publications Warehouse

    Iverson, Richard M.; Evans, S.G.; Mugnozza, G.S.; Strom, A.; Hermanns, R.L.

    2006-01-01

    Physically based mathematical models and statistically based empirical equations each may provide useful means of forecasting runout of rock and debris avalanches. This paper compares the foundations, strengths, and limitations of a physically based model and a statistically based forecasting method, both of which were developed to predict runout across three-dimensional topography. The chief advantage of the physically based model results from its ties to physical conservation laws and well-tested axioms of soil and rock mechanics, such as the Coulomb friction rule and effective-stress principle. The output of this model provides detailed information about the dynamics of avalanche runout, at the expense of high demands for accurate input data, numerical computation, and experimental testing. In comparison, the statistical method requires relatively modest computation and no input data except identification of prospective avalanche source areas and a range of postulated avalanche volumes. Like the physically based model, the statistical method yields maps of predicted runout, but it provides no information on runout dynamics. Although the two methods differ significantly in their structure and objectives, insights gained from one method can aid refinement of the other.

  17. Does solar activity affect human happiness?

    NASA Astrophysics Data System (ADS)

    Kristoufek, Ladislav

    2018-03-01

    We investigate the direct influence of solar activity (represented by sunspot numbers) on human happiness (represented by the Twitter-based Happiness Index). We construct four models controlling for various statistical and dynamic effects of the analyzed series. The final model gives promising results. First, there is a statistically significant negative influence of solar activity on happiness which holds even after controlling for the other factors. Second, the final model, which is still rather simple, explains around 75% of the variance of the Happiness Index. Third, our control variables contribute significantly as well: happiness is higher on days with no sunspots, happiness is strongly persistent, there are strong intra-week cycles, and happiness peaks during holidays. Our results strongly contribute to the topical literature and provide evidence of the unique utility of online data.
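
    The kind of regression described, a happiness series regressed on sunspot numbers while controlling for weekday and holiday effects, can be sketched on synthetic data; every series and coefficient below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 365
sunspots = rng.poisson(50, n).astype(float)
weekday = np.arange(n) % 7
holiday = (rng.random(n) < 0.03).astype(float)

# Synthetic "happiness" with a small negative sunspot effect plus controls
happiness = (6.0 - 0.002 * sunspots + 0.05 * holiday
             + 0.01 * (weekday >= 5) + rng.normal(0, 0.02, n))

# Design matrix: intercept, sunspots, holiday dummy, six weekday dummies
X = np.column_stack([np.ones(n), sunspots, holiday] +
                    [(weekday == d).astype(float) for d in range(1, 7)])
beta, *_ = np.linalg.lstsq(X, happiness, rcond=None)
print(beta[1])   # estimated sunspot coefficient, close to the true -0.002
```

The point of the controls is visible here: the sunspot coefficient is recovered even though weekday cycles and holidays also move the series.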

  18. Statistically accurate low-order models for uncertainty quantification in turbulent dynamical systems.

    PubMed

    Sapsis, Themistoklis P; Majda, Andrew J

    2013-08-20

    A framework for low-order predictive statistical modeling and uncertainty quantification in turbulent dynamical systems is developed here. These reduced-order, modified quasilinear Gaussian (ROMQG) algorithms apply to turbulent dynamical systems in which there is significant linear instability or linear nonnormal dynamics in the unperturbed system and energy-conserving nonlinear interactions that transfer energy from the unstable modes to the stable modes where dissipation occurs, resulting in a statistical steady state; such turbulent dynamical systems are ubiquitous in geophysical and engineering turbulence. The ROMQG method involves constructing a low-order, nonlinear, dynamical system for the mean and covariance statistics in the reduced subspace that has the unperturbed statistics as a stable fixed point and optimally incorporates the indirect effect of non-Gaussian third-order statistics for the unperturbed system in a systematic calibration stage. This calibration procedure is achieved through information involving only the mean and covariance statistics for the unperturbed equilibrium. The performance of the ROMQG algorithm is assessed on two stringent test cases: the 40-mode Lorenz 96 model mimicking midlatitude atmospheric turbulence and two-layer baroclinic models for high-latitude ocean turbulence with over 125,000 degrees of freedom. In the Lorenz 96 model, the ROMQG algorithm with just a single mode captures the transient response to random or deterministic forcing. For the baroclinic ocean turbulence models, the inexpensive ROMQG algorithm with 252 modes, less than 0.2% of the total, captures the nonlinear response of the energy, the heat flux, and even the one-dimensional energy and heat flux spectra.

  19. Effect of social support on informed consent in older adults with Parkinson disease and their caregivers.

    PubMed

    Ford, M E; Kallen, M; Richardson, P; Matthiesen, E; Cox, V; Teng, E J; Cook, K F; Petersen, N J

    2008-01-01

    To evaluate the effects of social support on comprehension and recall of consent form information in a study of Parkinson disease patients and their caregivers. Comparison of comprehension and recall outcomes among participants who read and signed the consent form accompanied by a family member/friend versus those of participants who read and signed the consent form unaccompanied. Comprehension and recall of consent form information were measured at one week and one month respectively, using Part A of the Quality of Informed Consent Questionnaire (QuIC). The mean age of the sample of 143 participants was 71 years (SD = 8.6 years). Analysis of covariance was used to compare QuIC scores between the intervention group (n = 70) and control group (n = 73). In the 1-week model, no statistically significant intervention effect was found (p = 0.860). However, the intervention status by patient status interaction was statistically significant (p = 0.012). In the 1-month model, no statistically significant intervention effect was found (p = 0.480). Again, however, the intervention status by patient status interaction was statistically significant (p = 0.040). At both time periods, intervention group patients scored higher (better) on the QuIC than did intervention group caregivers, and control group patients scored lower (worse) on the QuIC than did control group caregivers. Social support played a significant role in enhancing comprehension and recall of consent form information among patients.

  20. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.

    This study statistically analyzed a grain-size based additivity model that has been proposed to scale reaction rates and parameters from laboratory to field. The additivity model assumed that reaction properties in a sediment including surface area, reactive site concentration, reaction rate, and extent can be predicted from field-scale grain size distribution by linearly adding reaction properties for individual grain size fractions. This study focused on the statistical analysis of the additivity model with respect to reaction rate constants using multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment as an example. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of multi-rate parameters for individual grain size fractions. The statistical properties of the rate constants for the individual grain size fractions were then used to analyze the statistical properties of the additivity model to predict rate-limited U(VI) desorption in the composite sediment, and to evaluate the relative importance of individual grain size fractions to the overall U(VI) desorption. The result indicated that the additivity model provided a good prediction of the U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model, and U(VI) desorption in individual grain size fractions has to be simulated in order to apply the additivity model. An approximate additivity model for directly scaling rate constants was subsequently proposed and evaluated. The results showed that the approximate model provided a good prediction of the experimental results within statistical uncertainty. This study also found that a gravel size fraction (2-8 mm), which is often ignored in modeling U(VI) sorption and desorption, is statistically significant to the U(VI) desorption in the sediment.
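
    The core of the additivity model, predicting a composite-sediment property by mass-weighted summation over grain-size fractions, reduces to a one-line calculation; the fraction weights and rate constants below are invented for illustration:

```python
import numpy as np

# Illustrative numbers only: mass fraction and rate constant of each
# grain-size fraction (a gravel-like fraction is deliberately included,
# since the study found it statistically significant).
mass_fraction = np.array([0.35, 0.25, 0.20, 0.20])          # sums to 1
rate_constant = np.array([2.0e-4, 1.1e-4, 0.6e-4, 0.2e-4])  # per hour

# Additivity model: composite property = mass-weighted sum over fractions
composite_rate = np.sum(mass_fraction * rate_constant)
print(composite_rate)  # → 1.135e-04
```

The study's caveat is that this direct summation works for quantities like surface area and site concentration but not for rate constants themselves, which is why the approximate scaling model was introduced.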

  1. Development and validation of a risk calculator predicting exercise-induced ventricular arrhythmia in patients with cardiovascular disease.

    PubMed

    Hermes, Ilarraza-Lomelí; Marianna, García-Saldivia; Jessica, Rojano-Castillo; Carlos, Barrera-Ramírez; Rafael, Chávez-Domínguez; María Dolores, Rius-Suárez; Pedro, Iturralde

    2016-10-01

    Mortality due to cardiovascular disease is often associated with ventricular arrhythmias. Nowadays, patients with cardiovascular disease are more encouraged to take part in physical training programs. Nevertheless, high-intensity exercise is associated with a higher risk for sudden death, even in apparently healthy people. During an exercise test (ET), health care professionals provide patients, in a controlled scenario, an intense physiological stimulus that could precipitate cardiac arrhythmia in high risk individuals. There is still no clinical or statistical tool to predict this incidence. The aim of this study was to develop a statistical model to predict the incidence of exercise-induced potentially life-threatening ventricular arrhythmia (PLVA) during high intensity exercise. 6415 patients underwent a symptom-limited ET with a Balke ramp protocol. A multivariate logistic regression model where the primary outcome was PLVA was performed. Incidence of PLVA was 548 cases (8.5%). After a bivariate model, thirty-one clinical or ergometric variables were statistically associated with PLVA and were included in the regression model. In the multivariate model, 13 of these variables were found to be statistically significant. A regression model (G) with a χ² of 283.987 and p < 0.001 was constructed. Significant variables included: heart failure, antiarrhythmic drugs, myocardial lower-VD, age and use of digoxin, nitrates, among others. This study allows clinicians to identify patients at risk of ventricular tachycardia or couplets during exercise, and to take preventive measures or appropriate supervision. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  2. Estimating order statistics of network degrees

    NASA Astrophysics Data System (ADS)

    Chu, J.; Nadarajah, S.

    2018-01-01

    We model the order statistics of network degrees of big data sets by a range of generalised beta distributions. A three parameter beta distribution due to Libby and Novick (1982) is shown to give the best overall fit for at least four big data sets. The fit of this distribution is significantly better than the fit suggested by Olhede and Wolfe (2012) across the whole range of order statistics for all four data sets.

  3. The effectiveness of repeat lumbar transforaminal epidural steroid injections.

    PubMed

    Murthy, Naveen S; Geske, Jennifer R; Shelerud, Randy A; Wald, John T; Diehn, Felix E; Thielen, Kent R; Kaufmann, Timothy J; Morris, Jonathan M; Lehman, Vance T; Amrami, Kimberly K; Carter, Rickey E; Maus, Timothy P

    2014-10-01

    The aim of this study was to determine 1) if repeat lumbar transforaminal epidural steroid injections (TFESIs) resulted in recovery of pain relief, which has waned since an index injection, and 2) if cumulative benefit could be achieved by repeat injections within 3 months of the index injection. Retrospective observational study with statistical modeling of the response to repeat TFESI. Academic radiology practice. Two thousand eighty-seven single-level TFESIs were performed for radicular pain on 933 subjects. Subjects received repeat TFESIs >2 weeks and <1 year from the index injection. Hierarchical linear modeling was performed to evaluate changes in continuous and categorical pain relief outcomes after repeat TFESI. Subgroup analyses were performed on patients with <3 months duration of pain (acute pain), patients receiving repeat injections within 3 months (clustered injections), and in patients with both acute pain and clustered injections. Repeat TFESIs achieved pain relief in both continuous and categorical outcomes. Relative to the index injection, there was a minimal but statistically significant decrease in pain relief in modeled continuous outcome measures with subsequent injections. Acute pain patients recovered all prior benefit with a statistically significant cumulative benefit. Patients receiving clustered injections achieved statistically significant cumulative benefit, of greater magnitude in acute pain patients. Repeat TFESI may be performed for recurrence of radicular pain with the expectation of recovery of most or all previously achieved benefit; acute pain patients will likely recover all prior benefit. Repeat TFESIs within 3 months of the index injection can provide cumulative benefit. Wiley Periodicals, Inc.

  4. Counts-in-cylinders in the Sloan Digital Sky Survey with Comparisons to N-body Simulations

    NASA Astrophysics Data System (ADS)

    Berrier, Heather D.; Barton, Elizabeth J.; Berrier, Joel C.; Bullock, James S.; Zentner, Andrew R.; Wechsler, Risa H.

    2011-01-01

    Environmental statistics provide a necessary means of comparing the properties of galaxies in different environments, and a vital test of models of galaxy formation within the prevailing hierarchical cosmological model. We explore counts-in-cylinders, a common statistic defined as the number of companions of a particular galaxy found within a given projected radius and redshift interval. Galaxy distributions with the same two-point correlation functions do not necessarily have the same companion count distributions. We use this statistic to examine the environments of galaxies in the Sloan Digital Sky Survey Data Release 4 (SDSS DR4). We also make preliminary comparisons to four models for the spatial distributions of galaxies, based on N-body simulations and data from SDSS DR4, to study the utility of the counts-in-cylinders statistic. There is a very large scatter between the number of companions a galaxy has and the mass of its parent dark matter halo and the halo occupation, limiting the utility of this statistic for certain kinds of environmental studies. We also show that prevalent empirical models of galaxy clustering, that match observed two- and three-point clustering statistics well, fail to reproduce some aspects of the observed distribution of counts-in-cylinders on 1, 3, and 6 h⁻¹ Mpc scales. All models that we explore underpredict the fraction of galaxies with few or no companions in 3 and 6 h⁻¹ Mpc cylinders. Roughly 7% of galaxies in the real universe are significantly more isolated within a 6 h⁻¹ Mpc cylinder than the galaxies in any of the models we use. Simple phenomenological models that map galaxies to dark matter halos fail to reproduce high-order clustering statistics in low-density environments.

  5. Hormone replacement therapy is associated with gastro-oesophageal reflux disease: a retrospective cohort study.

    PubMed

    Close, Helen; Mason, James M; Wilson, Douglas; Hungin, A Pali S

    2012-05-29

    Oestrogen and progestogen have the potential to influence gastro-intestinal motility; both are key components of hormone replacement therapy (HRT). Results of observational studies in women taking HRT rely on self-reporting of gastro-oesophageal symptoms and the aetiology of gastro-oesophageal reflux disease (GORD) remains unclear. This study investigated the association between HRT and GORD in menopausal women using validated general practice records. 51,182 menopausal women were identified using the UK General Practice Research Database between 1995-2004. Of these, 8,831 were matched with and without hormone use. Odds ratios (ORs) were calculated for GORD and proton-pump inhibitor (PPI) use in hormone and non-hormone users, adjusting for age, co-morbidities, and co-pharmacy. In unadjusted analysis, all forms of hormone use (oestrogen-only, tibolone, combined HRT and progestogen) were statistically significantly associated with GORD. In adjusted models, this association remained statistically significant for oestrogen-only treatment (OR 1.49; 1.18-1.89). Unadjusted analysis showed a statistically significant association between PPI use and oestrogen-only and combined HRT treatment. When adjusted for covariates, oestrogen-only treatment was significant (OR 1.34; 95% CI 1.03-1.74). Findings from the adjusted model demonstrated the greater use of PPI by progestogen users (OR 1.50; 1.01-2.22). This first large cohort study of the association between GORD and HRT found a statistically significant association between oestrogen-only hormone and GORD and PPI use. This should be further investigated using prospective follow-up to validate the strength of association and describe its clinical significance.

  6. Risk prediction model: Statistical and artificial neural network approach

    NASA Astrophysics Data System (ADS)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approach, development and validation process of a risk prediction model. A qualitative review of the aims, methods and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was done. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, the artificial neural network approach to developing prediction models was more accurate than the statistical approach. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.

  7. Detailed Spectral Analysis of the 260 ks XMM-Newton Data of 1E 1207.4-5209 and Significance of a 2.1 keV Absorption Feature

    NASA Astrophysics Data System (ADS)

    Mori, Kaya; Chonko, James C.; Hailey, Charles J.

    2005-10-01

    We have reanalyzed the 260 ks XMM-Newton observation of 1E 1207.4-5209. There are several significant improvements over previous work. First, a much broader range of physically plausible spectral models was used. Second, we have used a more rigorous statistical analysis. The standard F-distribution was not employed, but rather the exact finite statistics F-distribution was determined by Monte Carlo simulations. This approach was motivated by the recent work of Protassov and coworkers and Freeman and coworkers. They demonstrated that the standard F-distribution is not even asymptotically correct when applied to assess the significance of additional absorption features in a spectrum. With our improved analysis we do not find a third and fourth spectral feature in 1E 1207.4-5209 but only the two broad absorption features previously reported. Two additional statistical tests, one line model dependent and the other line model independent, confirmed our modified F-test analysis. For all physically plausible continuum models in which the weak residuals are strong enough to fit, the residuals occur at the instrument Au M edge. As a sanity check we confirmed that the residuals are consistent in strength and position with the instrument Au M residuals observed in 3C 273.
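
    The Monte Carlo calibration of the F-statistic that replaces the standard F-distribution can be sketched on a toy nested-model problem, a line versus a quadratic, with invented data standing in for the spectral fits:

```python
import numpy as np

rng = np.random.default_rng(1)

def f_stat(y, x):
    """F statistic for adding a quadratic term to a linear fit."""
    def rss(deg):
        coef = np.polyfit(x, y, deg)
        return np.sum((y - np.polyval(coef, x)) ** 2)
    rss1, rss2 = rss(1), rss(2)
    n, p2 = len(y), 3
    return (rss1 - rss2) / (rss2 / (n - p2))

x = np.linspace(0.0, 1.0, 40)
y_obs = 1.0 + 2.0 * x + rng.normal(0, 0.1, x.size)
f_obs = f_stat(y_obs, x)

# Monte Carlo null: simulate data from the simpler model, refit both models,
# and build the finite-statistics F-distribution empirically
f_null = np.array([f_stat(1.0 + 2.0 * x + rng.normal(0, 0.1, x.size), x)
                   for _ in range(500)])

p_mc = np.mean(f_null >= f_obs)   # empirical p-value for the extra component
print(p_mc)
```

This is the general shape of the procedure the authors motivate: the significance of an added component is read off the simulated distribution rather than the analytic F-table, which can be badly miscalibrated for added absorption features.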

  8. Evaluation of a Local Anesthesia Simulation Model with Dental Students as Novice Clinicians.

    PubMed

    Lee, Jessica S; Graham, Roseanna; Bassiur, Jennifer P; Lichtenthal, Richard M

    2015-12-01

    The aim of this study was to evaluate the use of a local anesthesia (LA) simulation model in a facilitated small group setting before dental students administered an inferior alveolar nerve block (IANB) for the first time. For this pilot study, 60 dental students transitioning from preclinical to clinical education were randomly assigned to either an experimental group (N=30) that participated in a small group session using the simulation model or a control group (N=30). After administering local anesthesia for the first time, students in both groups were given questionnaires regarding levels of preparedness and confidence when administering an IANB and level of anesthesia effectiveness and pain when receiving an IANB. Students in the experimental group exhibited a positive difference on all six questions regarding preparedness and confidence when administering LA to another student. One of these six questions ("I was prepared in administering local anesthesia for the first time") showed a statistically significant difference (p<0.05). Students who received LA from students who practiced on the simulation model also experienced fewer post-injection complications one day after receiving the IANB, including a statistically significant reduction in trismus. No statistically significant difference was found in level of effectiveness of the IANB or perceived levels of pain between the two groups. The results of this pilot study suggest that using a local anesthesia simulation model may be beneficial in increasing a dental student's level of comfort prior to administering local anesthesia for the first time.

  9. Weighted Feature Significance: A Simple, Interpretable Model of Compound Toxicity Based on the Statistical Enrichment of Structural Features

    PubMed Central

    Huang, Ruili; Southall, Noel; Xia, Menghang; Cho, Ming-Hsuang; Jadhav, Ajit; Nguyen, Dac-Trung; Inglese, James; Tice, Raymond R.; Austin, Christopher P.

    2009-01-01

    In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high–throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) data from Salmonella typhimurium reverse mutagenicity assays conducted by the U.S. National Toxicology Program, and (3) hepatotoxicity data published in the Registry of Toxic Effects of Chemical Substances. Enrichments of structural features in toxic compounds are evaluated for their statistical significance and compiled into a simple additive model of toxicity and then used to score new compounds for potential toxicity. The predictive power of the model for cytotoxicity was validated using an independent set of compounds from the U.S. Environmental Protection Agency tested also at the National Institutes of Health Chemical Genomics Center. We compared the performance of our WFS approach with classical classification methods such as Naive Bayesian clustering and support vector machines. In most test cases, WFS showed similar or slightly better predictive power, especially in the prediction of hepatotoxic compounds, where WFS appeared to have the best performance among the three methods. The new algorithm has the important advantages of simplicity, power, interpretability, and ease of implementation. PMID:19805409
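
    The enrichment step at the heart of WFS can be sketched with a hypergeometric test; all counts, and the single-feature score for a new compound, are hypothetical:

```python
import math
from scipy.stats import hypergeom

# Hypothetical counts: a structural feature appears in K of the N compounds
# in the library, and in k of the n compounds flagged as toxic.
N, K, n, k = 10000, 120, 500, 24

# Enrichment p-value: chance of drawing >= k feature-bearing compounds
# among n toxic compounds, if the feature were unrelated to toxicity
p_enrich = hypergeom.sf(k - 1, N, K, n)

# WFS-style additive score for a new compound: sum of weights over the
# significant features it contains (a single feature here, for illustration)
score = -math.log10(p_enrich)
print(p_enrich, score)
```

Scoring a compound then amounts to summing such weights over all significant features it contains, which is what makes the model simple and interpretable compared with Naive Bayes or support vector machines.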

  10. Maternal factors predicting cognitive and behavioral characteristics of children with fetal alcohol spectrum disorders.

    PubMed

    May, Philip A; Tabachnick, Barbara G; Gossage, J Phillip; Kalberg, Wendy O; Marais, Anna-Susan; Robinson, Luther K; Manning, Melanie A; Blankenship, Jason; Buckley, David; Hoyme, H Eugene; Adnams, Colleen M

    2013-06-01

    To provide an analysis of multiple predictors of cognitive and behavioral traits for children with fetal alcohol spectrum disorders (FASDs). Multivariate correlation techniques were used with maternal and child data from epidemiologic studies in a community in South Africa. Data on 561 first-grade children with fetal alcohol syndrome (FAS), partial FAS (PFAS), and not FASD and their mothers were analyzed by grouping 19 maternal variables into categories (physical, demographic, childbearing, and drinking) and used in structural equation models (SEMs) to assess correlates of child intelligence (verbal and nonverbal) and behavior. A first SEM using only 7 maternal alcohol use variables to predict cognitive/behavioral traits was statistically significant (B = 3.10, p < .05) but explained only 17.3% of the variance. The second model incorporated multiple maternal variables and was statistically significant explaining 55.3% of the variance. Significantly correlated with low intelligence and problem behavior were demographic (B = 3.83, p < .05) (low maternal education, low socioeconomic status [SES], and rural residence) and maternal physical characteristics (B = 2.70, p < .05) (short stature, small head circumference, and low weight). Childbearing history and alcohol use composites were not statistically significant in the final complex model and were overpowered by SES and maternal physical traits. Although other analytic techniques have amply demonstrated the negative effects of maternal drinking on intelligence and behavior, this highly controlled analysis of multiple maternal influences reveals that maternal demographics and physical traits make a significant enabling or disabling contribution to child functioning in FASD.

  11. The effect of clustering of galaxies on the statistics of gravitational lenses

    NASA Technical Reports Server (NTRS)

    Anderson, N.; Alcock, C.

    1986-01-01

    It is examined whether clustering of galaxies can significantly alter the statistical properties of gravitational lenses. Only models of clustering that resemble the observed distribution of galaxies in the properties of the two-point correlation function are considered. Monte-Carlo simulations of the imaging process are described. It is found that the effect of clustering is too small to be significant, unless the mass of the deflectors is so large that gravitational lenses become common occurrences. A special model is described which was concocted to optimize the effect of clustering on gravitational lensing but still resemble the observed distribution of galaxies; even this simulation did not satisfactorily produce large numbers of wide-angle lenses.

  12. Improved statistical power with a sparse shape model in detecting an aging effect in the hippocampus and amygdala

    NASA Astrophysics Data System (ADS)

    Chung, Moo K.; Kim, Seung-Goo; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matthew J.; Davidson, Richard J.

    2014-03-01

    The sparse regression framework has been widely used in medical image processing and analysis. However, it has rarely been used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high frequency noise, only the first few terms are used in the expansion and higher frequency terms are simply thrown away. However, some lower frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we present an LB-based method that filters out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which reduces the error of incorrectly detecting false negatives. Hence the statistical power improves. The sparse shape model is then applied in investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power.
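As a minimal illustration of the sparse-penalty idea described above (not the authors' actual estimator), an L1 penalty on basis coefficients can be solved coordinate-wise by soft-thresholding, which zeroes out negligible terms regardless of their frequency, unlike plain truncation of high-frequency terms:

```python
def soft_threshold(coeffs, lam):
    """Lasso-style shrinkage: coefficients smaller than lam in magnitude
    become exactly zero; larger ones are shrunk toward zero by lam."""
    out = []
    for c in coeffs:
        if c > lam:
            out.append(c - lam)
        elif c < -lam:
            out.append(c + lam)
        else:
            out.append(0.0)
    return out

# Toy Fourier-type coefficients: a low-frequency term can be negligible,
# so thresholding drops it while keeping a strong higher-frequency term.
coeffs = [5.0, 0.1, -3.0, 0.05, 2.0]
print(soft_threshold(coeffs, lam=0.5))  # [4.5, 0.0, -2.5, 0.0, 1.5]
```

The same shrinkage applied to LB expansion coefficients selects the "significant eigenfunctions" sparsely rather than by frequency order.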

  13. A new test statistic for climate models that includes field and spatial dependencies using Gaussian Markov random fields

    DOE PAGES

    Nosedal-Sanchez, Alvaro; Jackson, Charles S.; Huerta, Gabriel

    2016-07-20

    A new test statistic for climate model evaluation has been developed that potentially mitigates some of the limitations that exist for observing and representing field and space dependencies of climate phenomena. Traditionally such dependencies have been ignored when climate models have been evaluated against observational data, which makes it difficult to assess whether any given model is simulating observed climate for the right reasons. The new statistic uses Gaussian Markov random fields for estimating field and space dependencies within a first-order grid point neighborhood structure. We illustrate the ability of Gaussian Markov random fields to represent empirical estimates of field and space covariances using "witch hat" graphs. We further use the new statistic to evaluate the tropical response of a climate model (CAM3.1) to changes in two parameters important to its representation of cloud and precipitation physics. Overall, the inclusion of dependency information did not alter significantly the recognition of those regions of parameter space that best approximated observations. However, there were some qualitative differences in the shape of the response surface that suggest how such a measure could affect estimates of model uncertainty.
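A minimal sketch of the underlying construction, assuming a standard first-order GMRF parameterization Q = tau*(D - rho*A) on a regular grid (the tau and rho values are invented, and the paper's actual estimation procedure is more involved):

```python
def gmrf_precision(nrow, ncol, tau=1.0, rho=0.24):
    """Precision matrix Q = tau*(D - rho*A) for a first-order
    (4-neighbour) grid GMRF; rho < 0.25 keeps Q diagonally dominant."""
    n = nrow * ncol
    Q = [[0.0] * n for _ in range(n)]
    def idx(i, j):
        return i * ncol + j
    for i in range(nrow):
        for j in range(ncol):
            k = idx(i, j)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < nrow and 0 <= jj < ncol:
                    Q[k][k] += tau                 # degree term of D
                    Q[k][idx(ii, jj)] -= tau * rho  # neighbour coupling
    return Q

def discrepancy_statistic(Q, e):
    """Quadratic form e' Q e over a model-minus-observation field e:
    spatially coherent errors are penalised less than incoherent ones."""
    n = len(e)
    return sum(e[i] * Q[i][j] * e[j] for i in range(n) for j in range(n))

Q = gmrf_precision(3, 3)
smooth = [1.0] * 9                                   # coherent mismatch
rough = [(-1.0) ** (k // 3 + k % 3) for k in range(9)]  # checkerboard mismatch
print(discrepancy_statistic(Q, smooth) < discrepancy_statistic(Q, rough))  # True
```

The comparison shows why dependency information matters: two error fields with identical magnitude receive different scores once neighbourhood structure is modeled.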

  15. The Real World Significance of Performance Prediction

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Wang, Qing Yang; Trivedi, Shubhendu

    2012-01-01

    In recent years, the educational data mining and user modeling communities have been aggressively introducing models for predicting student performance on external measures such as standardized tests as well as within-tutor performance. While these models have brought statistically reliable improvement to performance prediction, the real world…

  16. Performance comparison of LUR and OK in PM2.5 concentration mapping: a multidimensional perspective

    PubMed Central

    Zou, Bin; Luo, Yanqing; Wan, Neng; Zheng, Zhong; Sternberg, Troy; Liao, Yilan

    2015-01-01

    Methods of Land Use Regression (LUR) modeling and Ordinary Kriging (OK) interpolation have been widely used to offset the shortcomings of PM2.5 data observed at sparse monitoring sites. However, the traditional point-based strategy for evaluating the performance of these methods has remained unchanged, which can yield misleading mapping results. To address this challenge, this study employs ‘information entropy’, an area-based statistic, along with traditional point-based statistics (e.g. error rate, RMSE) to evaluate the performance of the LUR model and OK interpolation in mapping PM2.5 concentrations in Houston from a multidimensional perspective. The point-based validation reveals significant differences between LUR and OK at different test sites despite the similar end-result accuracy (e.g. error rate 6.13% vs. 7.01%). Meanwhile, the area-based validation demonstrates that the PM2.5 concentrations simulated by the LUR model exhibit more detailed variations than those interpolated by the OK method (i.e. information entropy, 7.79 vs. 3.63). Results suggest that LUR modeling can better refine the spatial distribution of PM2.5 concentrations than OK interpolation. The significance of this study primarily lies in promoting the integration of point- and area-based statistics for model performance evaluation in air pollution mapping. PMID:25731103
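The area-based criterion can be sketched as Shannon entropy over binned map values; the function below is one illustrative reading of "information entropy" for a mapped surface, not the authors' exact computation:

```python
import math
from collections import Counter

def map_entropy(values, nbins=16):
    """Shannon entropy (bits) of a mapped field after equal-width binning;
    higher entropy indicates more detailed variation in the surface."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / nbins or 1.0   # guard against a constant field
    bins = Counter(min(int((v - lo) / width), nbins - 1) for v in values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in bins.values())

flat = [9.8] * 100                             # over-smoothed surface
detailed = [float(i % 16) for i in range(160)]  # surface with fine variation
print(map_entropy(detailed) > map_entropy(flat))  # True
```

A perfectly flat map carries zero bits of spatial information, which is the intuition behind the 7.79 vs. 3.63 contrast between LUR and OK above.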

  17. Incidence trend and risk factors for campylobacter infections in humans in Norway

    PubMed Central

    Sandberg, Marianne; Nygård, Karin; Meldal, Hege; Valle, Paul Steinar; Kruse, Hilde; Skjerve, Eystein

    2006-01-01

    Background The objectives of the study were to evaluate whether the increase in incidence of campylobacteriosis observed in humans in Norway from 1995 to 2001 was statistically significant and whether different biologically plausible risk factors were associated with the incidence of campylobacteriosis in the different counties in Norway. Methods To model the incidence of domestically acquired campylobacteriosis from 1995 to 2001, a population-average random-effects Poisson model was applied (the trend model). An equivalent model accounting for geographical clustering (the ecological model) was applied to case data and assumed risk or protective factors from 2000 and 2001, such as sale of chicken, receiving treated drinking water, density of dogs and grazing animals, occupation of people in the municipalities, and climatic factors. Results The increase in incidence of campylobacteriosis in humans in Norway from 1995 to 2001 was statistically significant from 1998. Treated water was a protective factor against Campylobacter infections in humans with an IRR of 0.78 per percentage increase in people supplied. The two-level modelling technique showed no evidence of clustering of campylobacteriosis in any particular county. Aggregation of data on municipality level makes interpretation of the results at the individual level difficult. Conclusion The increase in incidence of Campylobacter infections in humans from 1995 to 2001 was statistically significant from 1998. Treated water was a protective factor against Campylobacter infections in humans with an IRR of 0.78 per percentage increase in people supplied. Campylobacter infections did not appear to be clustered in any particular county in Norway. PMID:16827925
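For interpretation, an incidence rate ratio (IRR) of 0.78 per percentage-point increase acts multiplicatively on the expected incidence in a Poisson regression, since exp(beta * delta) = IRR ** delta. A short sketch (the 5-point increase is an invented example, not from the study):

```python
def expected_rate_ratio(irr_per_unit, delta):
    """Multiplicative change in expected incidence under Poisson
    regression: exp(beta * delta) = IRR ** delta for a delta-unit change."""
    return irr_per_unit ** delta

# IRR of 0.78 per percentage-point increase in treated-water coverage:
# a hypothetical 5-point increase multiplies expected incidence by
print(round(expected_rate_ratio(0.78, 5), 3))  # 0.289
```

That is, each additional percentage point of coverage is associated with a 22% relative reduction, compounding multiplicatively.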

  18. Development of a Cadaveric Model for Arthrocentesis.

    PubMed

    MacIver, Melissa A; Johnson, Matthew

    2015-01-01

    This article reports the development of a novel cadaveric model for future use in teaching arthrocentesis. In the clinical setting, animal safety is essential and practice is thus limited. Objectives of the study were to develop and compare a model to an unmodified cadaver by injecting one of two types of fluids to increase yield. The two fluids injected, mineral oil (MO) and hypertonic saline (HS), were compared to determine any difference on yield. Lastly, aspiration immediately after (T1) or three hours after (T2) injection were compared to determine any effect on diagnostic yield. Joints used included the stifle, elbow, and carpus in eight medium dog cadavers. Arthrocentesis was performed before injection (control) and yield measured. Test joints were injected with MO or HS and yield measured after range of motion (T1) and three hours post injection to simulate lab preparation (T2). Both models had statistically significantly higher yield compared with the unmodified cadaver in all joints at T1 and T2 (p<.05) with the exception of HST2 carpus. T2 aspiration had a statistically significant lower yield when compared to T1HS carpus, T1HS elbow, and T1MO carpus. Overall, irrespective of fluid volume or type, percent yield was lower in T2 compared to T1. No statistically significant difference was seen between HS and MO in most joints with the exception of MOT1 stifle and HST2 elbow. Within the time frame assessed, both models were acceptable. However, HS arthrocentesis models proved appropriate for student trial due to the difficult aspirations with MO.

  19. Impact of Gender on Pharmacokinetics of Intranasal Scopolamine

    NASA Technical Reports Server (NTRS)

    Putcha, L.; Lei, Wu.; S-L Chow, Diana

    2013-01-01

    Introduction: An intranasal gel dosage formulation of scopolamine (INSCOP) was developed for the treatment of Space Motion Sickness (SMS), which is commonly experienced by astronauts during space missions. The bioavailability and pharmacokinetics (PK) were evaluated under IND guidelines. Since information is lacking on the effect of gender on the PK of scopolamine, we examined gender differences in PK parameters of INSCOP at three dose levels of 0.1, 0.2 and 0.4 mg. Methods: Plasma scopolamine concentrations as a function of time were collected from twelve normal healthy human subjects (6 male/6 female) who participated in a fully randomized double-blind crossover study. The PK parameters were derived using WinNonlin. Covariate analysis of PK profiles was performed using NONMEM and statistically compared using a likelihood ratio test on the difference of objective function value (OFV). Statistical significance for covariate analysis was set at P < 0.05 (ΔOFV = 3.84). Results: No significant difference in PK parameters between male and female subjects was observed with the 0.1 and 0.2 mg doses. However, CL and Vd were significantly different between male and female subjects at the 0.4 mg dose. Results from population covariate modeling analysis indicate that a one-compartment PK model with a first-order elimination rate offers the best fit for describing INSCOP concentration-time profiles. The inclusion of sex as a covariate enhanced the model fitting (ΔOFV = -4.1) owing to the gender-dependent CL and Vd differences after the 0.4 mg dose. Conclusion: Statistical modeling of scopolamine concentration-time data suggests gender-dependent pharmacokinetics of scopolamine at the high dose level of 0.4 mg. Clearance of the parent compound was significantly faster and the volume of distribution significantly higher in males than in females. As a result, including gender as a covariate in the pharmacokinetic model of scopolamine offers the best fit for PK modeling of the drug at doses of 0.4 mg or higher.
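The ΔOFV = 3.84 cutoff follows from the OFV being -2 times the log-likelihood, so the drop from adding one covariate is referred to a chi-square distribution with 1 degree of freedom. A small illustrative sketch of that decision rule (not the NONMEM internals):

```python
import math

def chi2_sf_df1(x):
    """Survival function of chi-square with 1 d.f.:
    P(X >= x) = erfc(sqrt(x/2))."""
    return math.erfc(math.sqrt(x / 2.0))

def covariate_significant(delta_ofv, alpha=0.05):
    """Likelihood ratio test for one added covariate: the OFV drop
    (here passed as a positive number) is chi-square with 1 d.f."""
    return chi2_sf_df1(delta_ofv) < alpha

# The reported |ΔOFV| of 4.1 exceeds the 3.84 cutoff:
print(covariate_significant(4.1))  # True
```

This is why an OFV decrease of 3.84 or more corresponds to P < 0.05 for a single added parameter.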

  20. Adapting an Agent-Based Model of Socio-Technical Systems to Analyze System and Security Failures

    DTIC Science & Technology

    2016-05-09

    statistically significant amount, which it did with a p-value < 0.0003 on a simulation of 3125 iterations; the data is shown in the Delegation 1 column of...Blackout metric to a statistically significant amount, with a p-value < 0.0003 on a simulation of 3125 iterations; the data is shown in the Delegation 2...Proceedings of the 9th International Conference on Autonomous Agents and Multiagent Systems: volume 1-Volume 1, pp. 1007-1014. International Foundation

  1. Relationship of body weight parameters with the incidence of common spontaneous tumors in Tg.rasH2 mice.

    PubMed

    Paranjpe, Madhav G; Denton, Melissa D; Vidmar, Tom J; Elbekai, Reem H

    2014-10-01

    The mechanistic relationship between increased food consumption, increased body weights, and increased incidence of tumors has been well established in 2-year rodent models. Body weight parameters such as initial body weights, terminal body weights, food consumption, and the body weight gains in grams and percentages were analyzed to determine whether such relationship exists between these parameters with the incidence of common spontaneous tumors in Tg.rasH2 mice. None of these body weight parameters had any statistically significant relationship with the incidence of common spontaneous tumors in Tg.rasH2 males, namely lung tumors, splenic hemangiosarcomas, nonsplenic hemangiosarcomas, combined incidence of all hemangiosarcomas, and Harderian gland tumors. These parameters also did not have any statistically significant relationship with the incidence of lung and Harderian gland tumors in females. However, in females, increased initial body weights did have a statistically significant relationship with the nonsplenic hemangiosarcomas, and increased terminal body weights did have a statistically significant relationship with the incidence of splenic hemangiosarcomas, nonsplenic hemangiosarcomas, and the combined incidence of all hemangiosarcomas. In addition, increased body weight gains in grams and percentages had a statistically significant relationship with the combined incidence of all hemangiosarcomas in females, but not separately with splenic and nonsplenic hemangiosarcomas. © 2013 by The Author(s).

  2. CoMFA analyses of C-2 position salvinorin A analogs at the kappa-opioid receptor provides insights into epimer selectivity.

    PubMed

    McGovern, Donna L; Mosier, Philip D; Roth, Bryan L; Westkaemper, Richard B

    2010-04-01

    The highly potent and kappa-opioid (KOP) receptor-selective hallucinogen Salvinorin A and selected analogs have been analyzed using the 3D quantitative structure-affinity relationship technique Comparative Molecular Field Analysis (CoMFA) in an effort to derive a statistically significant and predictive model of salvinorin affinity at the KOP receptor and to provide additional statistical support for the validity of previously proposed structure-based interaction models. Two CoMFA models of Salvinorin A analogs substituted at the C-2 position are presented. Separate models were developed based on the radioligand used in the kappa-opioid binding assay, [(3)H]diprenorphine or [(125)I]6 beta-iodo-3,14-dihydroxy-17-cyclopropylmethyl-4,5 alpha-epoxymorphinan ([(125)I]IOXY). For each dataset, three methods of alignment were employed: a receptor-docked alignment derived from the structure-based docking algorithm GOLD, another from the ligand-based alignment algorithm FlexS, and a rigid realignment of the poses from the receptor-docked alignment. The receptor-docked alignment produced statistically superior results compared to either the FlexS alignment or the realignment in both datasets. The [(125)I]IOXY set (Model 1) and [(3)H]diprenorphine set (Model 2) gave q(2) values of 0.592 and 0.620, respectively, using the receptor-docked alignment, and both models produced similar CoMFA contour maps that reflected the stereoelectronic features of the receptor model from which they were derived. Each model gave significantly predictive CoMFA statistics (Model 1 PSET r(2)=0.833; Model 2 PSET r(2)=0.813). Based on the CoMFA contour maps, a binding mode was proposed for amine-containing Salvinorin A analogs that provides a rationale for the observation that the beta-epimers (R-configuration) of protonated amines at the C-2 position have a higher affinity than the corresponding alpha-epimers (S-configuration). (c) 2010. Published by Elsevier Inc.

  3. On Lack of Robustness in Hydrological Model Development Due to Absence of Guidelines for Selecting Calibration and Evaluation Data: Demonstration for Data-Driven Models

    NASA Astrophysics Data System (ADS)

    Zheng, Feifei; Maier, Holger R.; Wu, Wenyan; Dandy, Graeme C.; Gupta, Hoshin V.; Zhang, Tuqiao

    2018-02-01

    Hydrological models are used for a wide variety of engineering purposes, including streamflow forecasting and flood-risk estimation. To develop such models, it is common to allocate the available data to calibration and evaluation data subsets. Surprisingly, the issue of how this allocation can affect model evaluation performance has been largely ignored in the research literature. This paper discusses the evaluation performance bias that can arise from how available data are allocated to calibration and evaluation subsets. As a first step to assessing this issue in a statistically rigorous fashion, we present a comprehensive investigation of the influence of data allocation on the development of data-driven artificial neural network (ANN) models of streamflow. Four well-known formal data splitting methods are applied to 754 catchments from Australia and the U.S. to develop 902,483 ANN models. Results clearly show that the choice of the method used for data allocation has a significant impact on model performance, particularly for runoff data that are more highly skewed, highlighting the importance of considering the impact of data splitting when developing hydrological models. The statistical behavior of the data splitting methods investigated is discussed and guidance is offered on the selection of the most appropriate data splitting methods to achieve representative evaluation performance for streamflow data with different statistical properties. Although our results are obtained for data-driven models, they highlight the fact that this issue is likely to have a significant impact on all types of hydrological models, especially conceptual rainfall-runoff models.

  4. Weather variability, tides, and Barmah Forest virus disease in the Gladstone region, Australia.

    PubMed

    Naish, Suchithra; Hu, Wenbiao; Nicholls, Neville; Mackenzie, John S; McMichael, Anthony J; Dale, Pat; Tong, Shilu

    2006-05-01

    In this study we examined the impact of weather variability and tides on the transmission of Barmah Forest virus (BFV) disease and developed a weather-based forecasting model for BFV disease in the Gladstone region, Australia. We used seasonal autoregressive integrated moving-average (SARIMA) models to determine the contribution of weather variables to BFV transmission after the time-series data of response and explanatory variables were made stationary through seasonal differencing. We obtained data on the monthly counts of BFV cases, weather variables (e.g., mean minimum and maximum temperature, total rainfall, and mean relative humidity), high and low tides, and the population size in the Gladstone region between January 1992 and December 2001 from the Queensland Department of Health, Australian Bureau of Meteorology, Queensland Department of Transport, and Australian Bureau of Statistics, respectively. The SARIMA model shows that the 5-month moving average of minimum temperature (b=0.15, p-value<0.001) was statistically significantly and positively associated with BFV disease, whereas high tide in the current month (b=-1.03, p-value=0.04) was statistically significantly and inversely associated with it. However, no significant association was found for other variables. These results may be applied to forecast the occurrence of BFV disease and to use public health resources in BFV control and prevention.
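A one-line sketch of the seasonal differencing step mentioned above, which makes monthly series stationary with respect to an annual cycle before SARIMA fitting (illustrative only; the study's SARIMA estimation itself requires a statistics package):

```python
def seasonal_difference(series, period=12):
    """First-order seasonal difference: y'[t] = y[t] - y[t - period],
    removing a repeating annual cycle from monthly counts."""
    return [series[t] - series[t - period] for t in range(period, len(series))]

# Invented monthly counts with a pure 12-month cycle difference to zeros:
cycle = [5, 3, 2, 2, 4, 8, 12, 15, 14, 10, 7, 6] * 3
print(seasonal_difference(cycle)[:12])  # all zeros: seasonality removed
```

Real case counts would leave a residual series around zero, which is what the SARIMA model then relates to lagged weather and tide covariates.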

  5. Analysis of variance to assess statistical significance of Laplacian estimation accuracy improvement due to novel variable inter-ring distances concentric ring electrodes.

    PubMed

    Makeyev, Oleksandr; Joe, Cody; Lee, Colin; Besio, Walter G

    2017-07-01

    Concentric ring electrodes have shown promise in non-invasive electrophysiological measurement demonstrating their superiority to conventional disc electrodes, in particular, in accuracy of Laplacian estimation. Recently, we have proposed novel variable inter-ring distances concentric ring electrodes. Analytic and finite element method modeling results for linearly increasing distances electrode configurations suggested they may decrease the truncation error resulting in more accurate Laplacian estimates compared to currently used constant inter-ring distances configurations. This study assesses statistical significance of Laplacian estimation accuracy improvement due to novel variable inter-ring distances concentric ring electrodes. Full factorial design of analysis of variance was used with one categorical and two numerical factors: the inter-ring distances, the electrode diameter, and the number of concentric rings in the electrode. The response variables were the Relative Error and the Maximum Error of Laplacian estimation computed using a finite element method model for each of the combinations of levels of three factors. Effects of the main factors and their interactions on Relative Error and Maximum Error were assessed and the obtained results suggest that all three factors have statistically significant effects in the model confirming the potential of using inter-ring distances as a means of improving accuracy of Laplacian estimation.

  6. A model for indexing medical documents combining statistical and symbolic knowledge.

    PubMed

    Avillach, Paul; Joubert, Michel; Fieschi, Marius

    2007-10-11

    To develop and evaluate an information processing method based on terminologies, in order to index medical documents in any given documentary context. We designed a model using both symbolic general knowledge extracted from the Unified Medical Language System (UMLS) and statistical knowledge extracted from a domain of application. Using statistical knowledge allowed us to contextualize the general knowledge for every particular situation. For each document studied, the extracted terms are ranked to highlight the most significant ones. The model was tested on a set of 17,079 French standardized discharge summaries (SDSs). The most important ICD-10 term of each SDS was ranked 1st or 2nd by the method in nearly 90% of the cases. The use of several terminologies leads to more precise indexing. The improvement achieved in the model's performance as a result of using semantic relationships is encouraging.

  7. A note about high blood pressure in childhood

    NASA Astrophysics Data System (ADS)

    Teodoro, M. Filomena; Simão, Carla

    2017-06-01

    In the medical, behavioral, and social sciences, binary outcomes are common. The present work collects information in which some of the outcomes are binary variables (1 = 'yes' / 0 = 'no'). In [14] a preliminary study about the caregivers' perception of pediatric hypertension was introduced. An experimental questionnaire was designed to be answered by the caregivers of routine pediatric consultation attendees in the Santa Maria's hospital (HSM). The collected data were analyzed statistically, with a descriptive analysis and a predictive model. Significant relations between some socio-demographic variables and the assessed knowledge were obtained. A statistical analysis using part of the questionnaire information can be found in [14]. The present article completes that approach by estimating a model for the relevant remaining questions of the questionnaire using Generalized Linear Models (GLMs). Exploring the binary outcome issue, we intend to extend this approach using Generalized Linear Mixed Models (GLMMs), but the process is still ongoing.
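As an illustrative sketch of the GLM step for a binary outcome (binomial family, logit link), the toy fit below maximizes the Bernoulli log-likelihood by gradient ascent; the data and the education predictor are invented, not taken from the questionnaire:

```python
import math

def fit_logistic(xs, ys, lr=0.01, steps=20000):
    """Fit P(y=1|x) = 1/(1+exp(-(b0 + b1*x))) by gradient ascent on the
    Bernoulli log-likelihood -- the GLM with binomial family and logit link."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (y - p)        # score w.r.t. intercept
            g1 += (y - p) * x    # score w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Toy data: correct answers ('1') become likelier as schooling years rise.
xs = [4, 6, 8, 9, 10, 12, 12, 14, 16, 18]
ys = [0, 0, 0, 1, 0, 1, 1, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
print(b1 > 0)  # True: a positive education effect on the odds
```

In practice such models are fit with a statistics package (e.g. by iteratively reweighted least squares), but the likelihood being maximized is the same.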

  8. Fully Bayesian tests of neutrality using genealogical summary statistics.

    PubMed

    Drummond, Alexei J; Suchard, Marc A

    2008-10-31

    Many data summary statistics have been developed to detect departures from neutral expectations of evolutionary models. However, questions about the neutrality of the evolution of genetic loci within natural populations remain difficult to assess. One critical cause of this difficulty is that most methods for testing neutrality make simplifying assumptions simultaneously about the mutational model and the population size model. Consequently, rejecting the null hypothesis of neutrality under these methods could result from violations of either or both assumptions, making interpretation troublesome. Here we harness posterior predictive simulation to exploit summary statistics of both the data and model parameters to test the goodness-of-fit of standard models of evolution. We apply the method to test the selective neutrality of molecular evolution in non-recombining gene genealogies and we demonstrate the utility of our method on four real data sets, identifying significant departures from neutrality in human influenza A virus, even after controlling for variation in population size. Importantly, by employing a full model-based Bayesian analysis, our method separates the effects of demography from the effects of selection. The method also allows multiple summary statistics to be used in concert, thus potentially increasing sensitivity. Furthermore, our method remains useful in situations where analytical expectations and variances of summary statistics are not available. This aspect has great potential for the analysis of temporally spaced data, an expanding area previously neglected owing to the limited availability of theory and methods.

  9. Comparison of statistical models for writer verification

    NASA Astrophysics Data System (ADS)

    Srihari, Sargur; Ball, Gregory R.

    2009-01-01

    A novel statistical model for determining whether a pair of documents, a known and a questioned, were written by the same individual is proposed. The goal of this formulation is to learn the specific uniqueness of style in a particular author's writing, given the known document. Since there are often insufficient samples to extrapolate a generalized model of a writer's handwriting based solely on the document, we instead generalize over the differences between the author and a large population of known different writers. This contrasts with an earlier proposed model in which probability distributions were specified a priori, without learning. We show the performance of the model and compare it with the older, non-learning model, demonstrating significant improvement.

  10. Statistical inference methods for sparse biological time series data.

    PubMed

    Ndukum, Juliet; Fonseca, Luís L; Santos, Helena; Voit, Eberhard O; Datta, Susmita

    2011-04-25

    Comparing metabolic profiles under different biological perturbations has become a powerful approach to investigating the functioning of cells. The profiles can be taken as single snapshots of a system, but more information is gained if they are measured longitudinally over time. The results are short time series consisting of relatively sparse data that cannot be analyzed effectively with standard time series techniques, such as autocorrelation and frequency domain methods. In this work, we study longitudinal time series profiles of glucose consumption in the yeast Saccharomyces cerevisiae under different temperatures and preconditioning regimens, which we obtained with methods of in vivo nuclear magnetic resonance (NMR) spectroscopy. For the statistical analysis we first fit several nonlinear mixed-effects regression models to the longitudinal profiles and then used an ANOVA likelihood ratio method in order to test for significant differences between the profiles. The proposed methods are capable of distinguishing metabolic time trends resulting from different treatments and associate significance levels to these differences. Among several nonlinear mixed-effects regression models tested, a three-parameter logistic function represents the data with highest accuracy. ANOVA and likelihood ratio tests suggest that there are significant differences between the glucose consumption rate profiles for cells that had been, or had not been, preconditioned by heat during growth. Furthermore, pair-wise t-tests reveal significant differences in the longitudinal profiles for glucose consumption rates between optimal conditions and heat stress, optimal and recovery conditions, and heat stress and recovery conditions (p-values <0.0001). We have developed a nonlinear mixed-effects model that is appropriate for the analysis of sparse metabolic and physiological time profiles.
The model permits sound statistical inference procedures, based on ANOVA likelihood ratio tests, for testing the significance of differences between short time course data under different biological perturbations.
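The three-parameter logistic function that best represented the profiles has a simple closed form; the parameter names below are generic (asymptote, rate, inflection time), not the authors' notation:

```python
import math

def logistic3(t, a, k, t0):
    """Three-parameter logistic curve: asymptote a, growth rate k,
    inflection time t0 -- the mean-profile shape fit in such studies."""
    return a / (1.0 + math.exp(-k * (t - t0)))

# At the inflection point the curve reaches exactly half its asymptote:
print(logistic3(5.0, a=2.0, k=1.3, t0=5.0))  # 1.0
```

In a mixed-effects setting, each subject (or culture) gets random deviations on these three parameters, and treatment differences are tested on the fixed effects via likelihood ratios.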

  11. Ultra-low-dose computed tomographic angiography with model-based iterative reconstruction compared with standard-dose imaging after endovascular aneurysm repair: a prospective pilot study.

    PubMed

    Naidu, Sailen G; Kriegshauser, J Scott; Paden, Robert G; He, Miao; Wu, Qing; Hara, Amy K

    2014-12-01

    An ultra-low-dose radiation protocol reconstructed with model-based iterative reconstruction was compared with our standard-dose protocol. This prospective study evaluated 20 men undergoing surveillance-enhanced computed tomography after endovascular aneurysm repair. All patients underwent standard-dose and ultra-low-dose venous phase imaging; images were compared after reconstruction with filtered back projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction. Objective measures of aortic contrast attenuation and image noise were averaged. Images were subjectively assessed (1 = worst, 5 = best) for diagnostic confidence, image noise, and vessel sharpness. Aneurysm sac diameter and endoleak detection were compared. Quantitative image noise was 26% less with ultra-low-dose model-based iterative reconstruction than with standard-dose adaptive statistical iterative reconstruction and 58% less than with ultra-low-dose adaptive statistical iterative reconstruction. Average subjective noise scores were not different between ultra-low-dose model-based iterative reconstruction and standard-dose adaptive statistical iterative reconstruction (3.8 vs. 4.0, P = .25). Subjective scores for diagnostic confidence were better with standard-dose adaptive statistical iterative reconstruction than with ultra-low-dose model-based iterative reconstruction (4.4 vs. 4.0, P = .002). Vessel sharpness was decreased with ultra-low-dose model-based iterative reconstruction compared with standard-dose adaptive statistical iterative reconstruction (3.3 vs. 4.1, P < .0001). Ultra-low-dose model-based iterative reconstruction and standard-dose adaptive statistical iterative reconstruction aneurysm sac diameters were not significantly different (4.9 vs. 4.9 cm); concordance for the presence of endoleak was 100% (P < .001). 
Compared with a standard-dose technique, an ultra-low-dose model-based iterative reconstruction protocol provides comparable image quality and diagnostic assessment at a 73% lower radiation dose.

  12. Proliferative changes in the bronchial epithelium of former smokers treated with retinoids.

    PubMed

    Hittelman, Walter N; Liu, Diane D; Kurie, Jonathan M; Lotan, Reuben; Lee, Jin Soo; Khuri, Fadlo; Ibarguen, Heladio; Morice, Rodolfo C; Walsh, Garrett; Roth, Jack A; Minna, John; Ro, Jae Y; Broxson, Anita; Hong, Waun Ki; Lee, J Jack

    2007-11-07

Retinoids have shown antiproliferative and chemopreventive activity. We analyzed data from a randomized, placebo-controlled chemoprevention trial to determine whether a 3-month treatment with either 9-cis-retinoic acid (RA) or 13-cis-RA and alpha-tocopherol reduced Ki-67, a proliferation biomarker, in the bronchial epithelium. Former smokers (n = 225) were randomly assigned to receive 3 months of daily oral 9-cis-RA (100 mg), 13-cis-RA (1 mg/kg) and alpha-tocopherol (1200 IU), or placebo. Bronchoscopic biopsy specimens obtained before and after treatment were immunohistochemically assessed for changes in the Ki-67 proliferative index (i.e., percentage of cells with Ki-67-positive nuclear staining) in the basal and parabasal layers of the bronchial epithelium. Per-subject and per-biopsy site analyses were conducted. Multicovariable analyses, including a mixed-effects model and a generalized estimating equations model, were used to investigate the treatment effect (Ki-67 labeling index and percentage of bronchial epithelial biopsy sites with a Ki-67 index ≥ 5%) with adjustment for multiple covariates, such as smoking history and metaplasia. Coefficient estimates and 95% confidence intervals (CIs) were obtained from the models. All statistical tests were two-sided. In per-subject analyses, Ki-67 labeling in the basal layer was not changed by any treatment; the percentage of subjects with high Ki-67 labeling in the parabasal layer dropped statistically significantly after treatment with 13-cis-RA and alpha-tocopherol (P = .04) compared with placebo, but the drop was not statistically significant after 9-cis-RA treatment (P = .17). 
A similar effect was observed in the parabasal layer in a per-site analysis; the percentage of sites with high Ki-67 labeling dropped statistically significantly after 9-cis-RA treatment (coefficient estimate = -0.72, 95% CI = -1.24 to -0.20; P = .007) compared with placebo, and after 13-cis-RA and alpha-tocopherol treatment (coefficient estimate = -0.66, 95% CI = -1.15 to -0.17; P = .008). In per-subject analyses, treatment with 13-cis-RA and alpha-tocopherol, compared with placebo, was statistically significantly associated with reduced bronchial epithelial cell proliferation; treatment with 9-cis-RA was not. In per-site analyses, statistically significant associations were obtained with both treatments.
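The per-site coefficients above can be checked by hand: under a normal approximation, a 95% confidence interval spans about 1.96 standard errors on each side of the estimate, so the standard error, z statistic, and two-sided P value all follow from the estimate and its interval. A minimal sketch (illustrative only, not the study's GEE code):

```python
import math

def p_from_estimate_and_ci(estimate, lower, upper):
    """Two-sided P value from a coefficient and its 95% CI (normal approximation).

    The CI half-width is 1.96 standard errors, so SE = (upper - lower) / (2 * 1.96).
    """
    se = (upper - lower) / (2 * 1.96)
    z = estimate / se
    # Standard normal survival function via the error function
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Reported per-site estimate for 9-cis-RA: -0.72 (95% CI -1.24 to -0.20)
p = p_from_estimate_and_ci(-0.72, -1.24, -0.20)
```

Applied to the reported 9-cis-RA estimate, this recovers a value close to the reported P = .007.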

  13. Proliferative Changes in the Bronchial Epithelium of Former Smokers Treated With Retinoids

    PubMed Central

    Hittelman, Walter N.; Liu, Diane D.; Kurie, Jonathan M.; Lotan, Reuben; Lee, Jin Soo; Khuri, Fadlo; Ibarguen, Heladio; Morice, Rodolfo C.; Walsh, Garrett; Roth, Jack A.; Minna, John; Ro, Jae Y.; Broxson, Anita; Hong, Waun Ki; Lee, J. Jack

    2012-01-01

Background Retinoids have shown antiproliferative and chemopreventive activity. We analyzed data from a randomized, placebo-controlled chemoprevention trial to determine whether a 3-month treatment with either 9-cis-retinoic acid (RA) or 13-cis-RA and α-tocopherol reduced Ki-67, a proliferation biomarker, in the bronchial epithelium. Methods Former smokers (n = 225) were randomly assigned to receive 3 months of daily oral 9-cis-RA (100 mg), 13-cis-RA (1 mg/kg) and α-tocopherol (1200 IU), or placebo. Bronchoscopic biopsy specimens obtained before and after treatment were immunohistochemically assessed for changes in the Ki-67 proliferative index (i.e., percentage of cells with Ki-67–positive nuclear staining) in the basal and parabasal layers of the bronchial epithelium. Per-subject and per–biopsy site analyses were conducted. Multicovariable analyses, including a mixed-effects model and a generalized estimating equations model, were used to investigate the treatment effect (Ki-67 labeling index and percentage of bronchial epithelial biopsy sites with a Ki-67 index ≥ 5%) with adjustment for multiple covariates, such as smoking history and metaplasia. Coefficient estimates and 95% confidence intervals (CIs) were obtained from the models. All statistical tests were two-sided. Results In per-subject analyses, Ki-67 labeling in the basal layer was not changed by any treatment; the percentage of subjects with high Ki-67 labeling in the parabasal layer dropped statistically significantly after treatment with 13-cis-RA and α-tocopherol (P = .04) compared with placebo, but the drop was not statistically significant after 9-cis-RA treatment (P = .17). 
A similar effect was observed in the parabasal layer in a per-site analysis; the percentage of sites with high Ki-67 labeling dropped statistically significantly after 9-cis-RA treatment (coefficient estimate = −0.72, 95% CI = −1.24 to −0.20; P = .007) compared with placebo, and after 13-cis-RA and α-tocopherol treatment (coefficient estimate = −0.66, 95% CI = −1.15 to −0.17; P = .008). Conclusions In per-subject analyses, treatment with 13-cis-RA and α-tocopherol, compared with placebo, was statistically significantly associated with reduced bronchial epithelial cell proliferation; treatment with 9-cis-RA was not. In per-site analyses, statistically significant associations were obtained with both treatments. PMID:17971525

  14. The effects of modeling instruction on high school physics academic achievement

    NASA Astrophysics Data System (ADS)

    Wright, Tiffanie L.

The purpose of this study was to explore whether Modeling Instruction, compared to traditional lecturing, is an effective instructional method to promote academic achievement in selected high school physics classes at a rural middle Tennessee high school. This study used an ex post facto, quasi-experimental research methodology. The independent variables in this study were the instructional methods of teaching. The treatment variable was Modeling Instruction and the control variable was traditional lecture instruction. The Treatment Group consisted of participants in Physical World Concepts who received Modeling Instruction. The Control Group consisted of participants in Physical Science who received traditional lecture instruction. The dependent variable was gain scores on the Force Concept Inventory (FCI). The participants for this study were 133 students each in the Treatment and Control Groups (n = 266), who attended a public high school in rural middle Tennessee. The participants were administered the Force Concept Inventory (FCI) prior to being taught the mechanics of physics. The FCI data were entered into the computer-based Statistical Package for the Social Sciences (SPSS). Two independent samples t-tests were conducted to answer the research questions. There was a statistically significant difference between the treatment and control groups concerning the instructional method. Modeling Instructional methods were found to be effective in increasing the academic achievement of students in high school physics. There was no statistically significant difference between FCI gain scores for gender. Gender was found to have no effect on the academic achievement of students in high school physics classes. However, even though there was not a statistically significant difference, female students' gain scores were higher than male students' gain scores when Modeling Instructional methods of teaching were used. 
Based on these findings, it is recommended that high school science teachers use Modeling Instructional methods of teaching daily in their classrooms. A recommendation for further research is to expand the Modeling Instructional methods of teaching into different content areas (e.g., reading and language arts) to explore academic achievement gains.
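The gain-score comparison described above rests on the independent samples t-test. A minimal pooled-variance sketch with hypothetical gain scores (not the study's data):

```python
import math

def pooled_t(group_a, group_b):
    """Equal-variance (pooled) two-sample t statistic, as used to compare gain scores."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    # Sums of squared deviations about each group mean
    ssa = sum((x - ma) ** 2 for x in group_a)
    ssb = sum((x - mb) ** 2 for x in group_b)
    pooled_var = (ssa + ssb) / (na + nb - 2)
    se = math.sqrt(pooled_var * (1 / na + 1 / nb))
    return (ma - mb) / se

# Hypothetical FCI gain scores: treatment group vs. control group
t = pooled_t([3, 5, 4, 6, 5], [2, 3, 2, 4, 3])
```

The resulting t value would then be compared against the t distribution with na + nb - 2 degrees of freedom.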

  15. Human Immunodeficiency Virus and the Enrolled Student: A Model Policy.

    ERIC Educational Resources Information Center

    Iowa State Dept. of Education, Des Moines.

    In the nearly 4 years since the initial publication of the model policy "Communicable Diseases and the Enrolled Student" in January 1986, the statistics, recommendations, and even the terminology of Acquired Immune Deficiency Syndrome (AIDS) have changed significantly. In light of the new information, the model policy, recommended for…

  16. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    NASA Astrophysics Data System (ADS)

    Eum, H. I.; Gupta, A.

    2017-12-01

Climate models provide essential information to assess impacts of climate change at regional and global scales. However, statistical downscaling methods must be applied to prepare climate model data for applications such as hydrologic and ecologic modelling at a watershed scale. As the reliability and (spatial and temporal) resolution of statistically downscaled climate data depend mainly on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for key climate variables, which are the main input data to regional modelling systems. However, inconsistencies in these climate products, for example, different combinations of climate variables, varying data domains and data lengths, and data accuracy varying with physiographic characteristics of the landscape, have caused significant challenges in selecting the most suitable reference climate data for various environmental studies and modelling. Employing various observation-based daily gridded climate products available in the public domain, i.e., thin plate spline regression products (ANUSPLIN and TPS), an inverse distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis), and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparing them with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation and minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available climate products by the elevations of stations, discretized into several classes. According to the rank of climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. 
A web-based system was developed to allow users to easily select the most reliable reference climate data at each target point based on the elevation of the grid cell. By constructing the best combination of reference data for the study domain, the accuracy and reliability of statistically downscaled climate projections could be significantly improved.
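The selection logic described above (rank products within elevation classes, then pick the top-ranked product for a target point's elevation) can be sketched as a simple lookup. The class boundaries and product rankings below are hypothetical placeholders, not the study's results:

```python
# Hypothetical ranking of gridded products by elevation class; the study's
# actual ranking comes from comparisons against AHCCD station observations.
RANK_BY_ELEVATION = [
    (1000.0, "ANUSPLIN"),    # below 1000 m
    (2000.0, "CaPA"),        # 1000-2000 m
    (float("inf"), "NARR"),  # above 2000 m
]

def best_product(elevation_m):
    """Return the top-ranked reference product for a target point's elevation."""
    for upper_bound, product in RANK_BY_ELEVATION:
        if elevation_m < upper_bound:
            return product

# A target grid cell at 1500 m falls into the middle class
choice = best_product(1500.0)
```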

  17. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    PubMed

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include models commonly used in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model, and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added, these resources are constantly improved. 
The reader is strongly encouraged to check the SOCR site for most updated information and newly added models.

  18. Improving the Validity of Activity of Daily Living Dependency Risk Assessment

    PubMed Central

    Clark, Daniel O.; Stump, Timothy E.; Tu, Wanzhu; Miller, Douglas K.

    2015-01-01

    Objectives Efforts to prevent activity of daily living (ADL) dependency may be improved through models that assess older adults’ dependency risk. We evaluated whether cognition and gait speed measures improve the predictive validity of interview-based models. Method Participants were 8,095 self-respondents in the 2006 Health and Retirement Survey who were aged 65 years or over and independent in five ADLs. Incident ADL dependency was determined from the 2008 interview. Models were developed using random 2/3rd cohorts and validated in the remaining 1/3rd. Results Compared to a c-statistic of 0.79 in the best interview model, the model including cognitive measures had c-statistics of 0.82 and 0.80 while the best fitting gait speed model had c-statistics of 0.83 and 0.79 in the development and validation cohorts, respectively. Conclusion Two relatively brief models, one that requires an in-person assessment and one that does not, had excellent validity for predicting incident ADL dependency but did not significantly improve the predictive validity of the best fitting interview-based models. PMID:24652867
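The c-statistic reported above is the probability that a randomly chosen person who became ADL dependent receives a higher predicted risk than one who did not, so it can be computed directly from pairwise comparisons. A minimal sketch with hypothetical risk scores:

```python
def c_statistic(event_scores, nonevent_scores):
    """c-statistic (AUC): fraction of concordant event/non-event pairs; ties count half."""
    concordant = 0.0
    for e in event_scores:
        for n in nonevent_scores:
            if e > n:
                concordant += 1.0
            elif e == n:
                concordant += 0.5
    return concordant / (len(event_scores) * len(nonevent_scores))

# Hypothetical predicted risks for people who did / did not become ADL dependent
c = c_statistic([0.9, 0.8, 0.4], [0.5, 0.3, 0.2])
```

A c-statistic of 0.5 means no discrimination; values around 0.8, as in the models above, are conventionally regarded as excellent.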

  19. BCM: toolkit for Bayesian analysis of Computational Models using samplers.

    PubMed

    Thijssen, Bram; Dijkstra, Tjeerd M H; Heskes, Tom; Wessels, Lodewyk F A

    2016-10-21

Computational models in biology are characterized by a large degree of uncertainty. This uncertainty can be analyzed with Bayesian statistics; however, the sampling algorithms frequently used for calculating Bayesian statistical estimates are computationally demanding, and each algorithm has unique advantages and disadvantages. It is typically unclear, before starting an analysis, which algorithm will perform well on a given computational model. We present BCM, a toolkit for the Bayesian analysis of Computational Models using samplers. It provides efficient, multithreaded implementations of eleven algorithms for sampling from posterior probability distributions and for calculating marginal likelihoods. BCM includes tools to simplify the process of model specification and scripts for visualizing the results. The flexible architecture allows it to be used on diverse types of biological computational models. In an example inference task using a model of the cell cycle based on ordinary differential equations, BCM is significantly more efficient than existing software packages, allowing more challenging inference problems to be solved. BCM represents an efficient one-stop shop for computational modelers wishing to use sampler-based Bayesian statistics.
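The samplers such a toolkit bundles all target posterior distributions; the simplest member of that family is random-walk Metropolis. A minimal sketch of the idea (not BCM's API, whose eleven algorithms are more sophisticated and multithreaded):

```python
import math
import random

def metropolis(log_density, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis: draw samples from an unnormalized log density."""
    rng = random.Random(seed)
    samples = []
    x = x0
    logp = log_density(x)
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        logp_prop = log_density(proposal)
        # Accept with probability min(1, p(proposal) / p(x))
        if math.log(rng.random() + 1e-300) < logp_prop - logp:
            x, logp = proposal, logp_prop
        samples.append(x)
    return samples

# Sample from a standard normal "posterior" (log density up to a constant)
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=5000)
```

In practice one would discard a burn-in period and monitor acceptance rates; marginal-likelihood estimation, as in BCM, requires further machinery.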

  20. Projected changes, climate change signal, and uncertainties in the CMIP5-based projections of ocean surface wave heights

    NASA Astrophysics Data System (ADS)

    Wang, Xiaolan; Feng, Yang; Swail, Val R.

    2016-04-01

Ocean surface waves can be major hazards in coastal and offshore activities. However, wave observations are available only at limited locations and cover only the recent few decades. There is also very limited information on ocean wave behavior in response to climate change, because waves are not simulated in current global climate models. In a recent study, we used a multivariate regression model with a lagged dependent variable to make statistical global projections of changes in significant wave heights (Hs) using mean sea level pressure (SLP) information from 20 CMIP5 climate models for the twenty-first century. The statistical model was calibrated and validated using the ERA-Interim reanalysis of Hs and SLP for the period 1981-2010. The results show Hs increases in the tropics (especially in the eastern tropical Pacific) and in Southern Hemisphere high latitudes. Under the projected 2070-2099 climate condition of the RCP8.5 scenario, the occurrence frequency of the present-day one-in-10-year extreme wave heights is likely to double or triple in several coastal regions around the world (e.g., the Chilean coast, Gulf of Oman, Bay of Bengal, Gulf of Mexico). More recently, we used analysis of variance approaches to quantify the climate change signal and uncertainty in multi-model ensembles of statistical Hs simulations globally, which are based on the CMIP5 historical, RCP4.5, and RCP8.5 forcing scenario simulations of SLP. In a 4-model 3-run ensemble, the 4-model common signal of climate change is found to strengthen over time, as would be expected. For the historical followed by RCP8.5 scenario, the common signal in annual mean Hs is found to be significant over 16.6%, 55.0%, and 82.2% of the area by year 2005, 2050, and 2099, respectively. For the annual maximum, the signal is much weaker. The signal is strongest in the eastern tropical Pacific, featuring significant increases in both the annual mean and maximum of Hs in this region. 
The climate model uncertainty (i.e., inter-model variability) is significant over 99.9% of the area; its magnitude is comparable to or greater than the climate change signal by 2099 over most areas, except in the eastern tropical Pacific where the signal is much larger. In a 20-model 2-scenario single-run ensemble of statistical Hs simulations for the period 2006-2099, the model uncertainty is found to be significant globally; it is about 10 times as large as the scenario uncertainty between RCP4.5 and RCP8.5 scenarios.

  1. Evaluating the utility of companion animal tick surveillance practices for monitoring spread and occurrence of human Lyme disease in West Virginia, 2014-2016.

    PubMed

    Hendricks, Brian; Mark-Carew, Miguella; Conley, Jamison

    2017-11-13

    Domestic dogs and cats are potentially effective sentinel populations for monitoring occurrence and spread of Lyme disease. Few studies have evaluated the public health utility of sentinel programmes using geo-analytic approaches. Confirmed Lyme disease cases diagnosed by physicians and ticks submitted by veterinarians to the West Virginia State Health Department were obtained for 2014-2016. Ticks were identified to species, and only Ixodes scapularis were incorporated in the analysis. Separate ordinary least squares (OLS) and spatial lag regression models were conducted to estimate the association between average numbers of Ix. scapularis collected on pets and human Lyme disease incidence. Regression residuals were visualised using Local Moran's I as a diagnostic tool to identify spatial dependence. Statistically significant associations were identified between average numbers of Ix. scapularis collected from dogs and human Lyme disease in the OLS (β=20.7, P<0.001) and spatial lag (β=12.0, P=0.002) regression. No significant associations were identified for cats in either regression model. Statistically significant (P≤0.05) spatial dependence was identified in all regression models. Local Moran's I maps produced for spatial lag regression residuals indicated a decrease in model over- and under-estimation, but identified a higher number of statistically significant outliers than OLS regression. Results support previous conclusions that dogs are effective sentinel populations for monitoring risk of human exposure to Lyme disease. Findings reinforce the utility of spatial analysis of surveillance data, and highlight West Virginia's unique position within the eastern United States in regards to Lyme disease occurrence.
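The OLS estimates reported above come from ordinary least squares; for a single predictor the slope and intercept have closed forms. A sketch with hypothetical county-level data (the study's spatial lag model additionally includes a spatially lagged response term, which this omits):

```python
def ols_fit(x, y):
    """Ordinary least squares for a single predictor: returns (slope, intercept)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    # Slope = covariance(x, y) / variance(x); intercept follows from the means
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical counties: mean ticks collected per dog vs. Lyme incidence per 100,000
slope, intercept = ols_fit([0.1, 0.3, 0.5, 0.9], [3.0, 7.0, 11.0, 19.0])
```

Spatially correlated residuals, as diagnosed here with Local Moran's I, violate the independence assumption behind OLS standard errors, which motivates the spatial lag specification.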

  2. On statistical analysis of factors affecting anthocyanin extraction from Ixora siamensis

    NASA Astrophysics Data System (ADS)

    Mat Nor, N. A.; Arof, A. K.

    2016-10-01

This study focused on designing an experimental model to evaluate the influence of the extraction parameters employed for anthocyanin extraction from Ixora siamensis on CIE color measurements (a*, b* and color saturation). Extractions were conducted at temperatures of 30, 55 and 80°C, soaking times of 60, 120 and 180 min, using acidified methanol solvent with different trifluoroacetic acid (TFA) contents of 0.5, 1.75 and 3% (v/v). The statistical evaluation was performed by running analysis of variance (ANOVA) and regression calculations to investigate the significance of the generated model. Results show that the generated regression models adequately explain the data variation and significantly represent the actual relationship between the independent variables and the responses. The analysis showed high coefficient of determination (R2) values of 0.9687 for a*, 0.9621 for b* and 0.9758 for color saturation, thus ensuring a satisfactory fit of the developed models to the experimental data. The interaction between TFA content and extraction temperature exhibited the strongest significant influence on the CIE color parameters.
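The reported R2 values measure the fraction of response variance the fitted regression models explain. A minimal sketch with hypothetical observed and model-predicted a* values:

```python
def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

# Hypothetical a* readings vs. regression-model predictions
r2 = r_squared([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
```

Values near 1, such as the 0.96-0.98 reported above, indicate the model reproduces most of the variation in the measured responses.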

  3. Population Pharmacokinetics of Intranasal Scopolamine

    NASA Technical Reports Server (NTRS)

    Wu, L.; Chow, D. S. L.; Putcha, L.

    2013-01-01

Introduction: An intranasal gel dosage formulation of scopolamine (INSCOP) was developed for the treatment of Space Motion Sickness (SMS). The bioavailability and pharmacokinetics (PK) were evaluated using data collected in Phase II IND protocols. We reported earlier statistically significant gender differences in PK parameters of INSCOP at a dose level of 0.4 mg. To identify covariates that influence PK parameters of INSCOP, we examined population covariates of the INSCOP PK model for the 0.4 mg dose. Methods: Plasma scopolamine concentration versus time data were collected from 20 normal healthy human subjects (11 male/9 female) after a 0.4 mg dose. Phoenix NLME was employed for PK analysis of these data using gender, body weight and age as covariates for model selection. Model selection was based on a likelihood ratio test on the difference of criteria (-2LL). Statistical significance for base model building and individual covariate analysis was set at P < 0.05 (delta(-2LL) = 3.84). Results: A one-compartment pharmacokinetic model with first-order elimination best described INSCOP concentration-time profiles. Inclusion of gender, body weight and age as covariates individually significantly reduced -2LL by more than the cut-off value of 3.84 (P < 0.05) when tested against the base model. After forward stepwise selection and backward elimination steps, gender was retained in the final model and had a significant influence on the absorption rate constant (ka) and the volume of distribution (V) of INSCOP. Conclusion: A population pharmacokinetic model for INSCOP has been identified, and gender was a significant contributing covariate for the final model. The volume of distribution and ka were significantly higher in males than in females, which confirms gender-dependent pharmacokinetics of scopolamine after administration of a 0.4 mg dose.
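The selected structural model, one compartment with first-order absorption and first-order elimination, has a closed-form concentration-time curve. A sketch with hypothetical parameter values (not the fitted population estimates):

```python
import math

def conc_one_compartment(t, dose, ka, ke, v, f=1.0):
    """Plasma concentration at time t for a one-compartment model with
    first-order absorption (ka) and first-order elimination (ke);
    v is the volume of distribution, f the bioavailable fraction."""
    return (f * dose * ka) / (v * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# Hypothetical parameters for a 0.4 mg dose, concentration over 12 hours
profile = [conc_one_compartment(t, dose=0.4, ka=1.2, ke=0.3, v=50.0) for t in range(13)]
```

The curve starts at zero, rises while absorption dominates, and declines once elimination takes over; covariates such as gender enter by letting ka and v differ across subgroups.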

  4. Disaster response team FAST skills training with a portable ultrasound simulator compared to traditional training: pilot study.

    PubMed

    Paddock, Michael T; Bailitz, John; Horowitz, Russ; Khishfe, Basem; Cosby, Karen; Sergel, Michelle J

    2015-03-01

    Pre-hospital focused assessment with sonography in trauma (FAST) has been effectively used to improve patient care in multiple mass casualty events throughout the world. Although requisite FAST knowledge may now be learned remotely by disaster response team members, traditional live instructor and model hands-on FAST skills training remains logistically challenging. The objective of this pilot study was to compare the effectiveness of a novel portable ultrasound (US) simulator with traditional FAST skills training for a deployed mixed provider disaster response team. We randomized participants into one of three training groups stratified by provider role: Group A. Traditional Skills Training, Group B. US Simulator Skills Training, and Group C. Traditional Skills Training Plus US Simulator Skills Training. After skills training, we measured participants' FAST image acquisition and interpretation skills using a standardized direct observation tool (SDOT) with healthy models and review of FAST patient images. Pre- and post-course US and FAST knowledge were also assessed using a previously validated multiple-choice evaluation. We used the ANOVA procedure to determine the statistical significance of differences between the means of each group's skills scores. Paired sample t-tests were used to determine the statistical significance of pre- and post-course mean knowledge scores within groups. We enrolled 36 participants, 12 randomized to each training group. Randomization resulted in similar distribution of participants between training groups with respect to provider role, age, sex, and prior US training. For the FAST SDOT image acquisition and interpretation mean skills scores, there was no statistically significant difference between training groups. 
For US and FAST mean knowledge scores, there was a statistically significant improvement between pre- and post-course scores within each group, but again there was not a statistically significant difference between training groups. This pilot study of a deployed mixed-provider disaster response team suggests that a novel portable US simulator may provide equivalent skills training in comparison to traditional live instructor and model training. Further studies with a larger sample size and other measures of short- and long-term clinical performance are warranted.

  5. Air pollution and hospital admissions for asthma in a tropical city: Kaohsiung, Taiwan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shang-Shyue Tsai; Meng-Hsuan Cheng; Hui-Fen Chiu

    2006-07-15

This study was undertaken to determine whether there is an association between air pollutant levels and hospital admissions for asthma in Kaohsiung, Taiwan. Hospital admissions for asthma and ambient air pollution data for Kaohsiung were obtained for the period from 1996 through 2003. The relative risk of hospital admission was estimated using a case-crossover approach, controlling for weather variables, day of the week, seasonality, and long-term time trends. In the single-pollutant models, on warm days (≥25°C) statistically significant positive associations were found for all pollutants except sulfur dioxide. On cool days (≤25°C), all pollutants were significantly associated with asthma admissions. For the two-pollutant models, CO and O3 were significant in combination with each of the other four pollutants on warm days. On cool days, NO2 remained statistically significant in all the two-pollutant models. This study provides evidence that higher levels of ambient pollutants increase the risk of hospital admissions for asthma.

  6. System and method for statistically monitoring and analyzing sensed conditions

    DOEpatents

    Pebay, Philippe P [Livermore, CA; Brandt, James M [Dublin, CA; Gentile, Ann C [Dublin, CA; Marzouk, Youssef M [Oakland, CA; Hale, Darrian J [San Jose, CA; Thompson, David C [Livermore, CA

    2011-01-04

    A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
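The patent's core idea, flagging attribute values that are improbable relative to the normal behavior of statistically similar components, can be illustrated with a simple z-score cutoff. This is a hypothetical simplification; the patented method infers model parameters and relative probabilities rather than applying a fixed threshold:

```python
import statistics

def alarm(value, baseline, threshold=3.0):
    """Flag a sensed value that deviates from the baseline population's
    normal behavior by more than `threshold` standard deviations."""
    mean = sum(baseline) / len(baseline)
    std = statistics.pstdev(baseline)
    return abs(value - mean) > threshold * std

# Readings from five statistically similar components establish normal behavior
baseline = [10.0, 11.0, 9.0, 10.0, 10.0]
triggered = alarm(50.0, baseline)
```

An alarm raised this way can prompt intervention before an impending failure, which is the stated purpose of the monitoring system.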

  7. System and method for statistically monitoring and analyzing sensed conditions

    DOEpatents

    Pebay, Philippe P [Livermore, CA; Brandt, James M [Dublin, CA; Gentile, Ann C [Dublin, CA; Marzouk, Youssef M [Oakland, CA; Hale, Darrian J [San Jose, CA; Thompson, David C [Livermore, CA

    2011-01-25

    A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.

  8. System and method for statistically monitoring and analyzing sensed conditions

    DOEpatents

Pebay, Philippe P [Livermore, CA; Brandt, James M [Dublin, CA; Gentile, Ann C [Dublin, CA; Marzouk, Youssef M [Oakland, CA; Hale, Darrian J [San Jose, CA; Thompson, David C [Livermore, CA

    2010-07-13

    A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.

  9. Projected changes in significant wave height toward the end of the 21st century: Northeast Atlantic

    NASA Astrophysics Data System (ADS)

    Aarnes, Ole Johan; Reistad, Magnar; Breivik, Øyvind; Bitner-Gregersen, Elzbieta; Ingolf Eide, Lars; Gramstad, Odin; Magnusson, Anne Karin; Natvig, Bent; Vanem, Erik

    2017-04-01

Wind field ensembles from six CMIP5 models force wave model time slices of the northeast Atlantic over the last three decades of the 20th and the 21st centuries. The future wave climate is investigated by considering the RCP4.5 and RCP8.5 emission scenarios. The CMIP5 model selection is based on their ability to reconstruct the present (1971-2000) extratropical cyclone activity, but increased spatial resolution has also been emphasized. In total, the study comprises 35 wave model integrations, each about 30 years long, in total more than 1000 years. Here annual statistics of significant wave height are analyzed, including mean parameters and upper percentiles. There is general agreement among all models considered that the mean significant wave height is expected to decrease by the end of the 21st century. This signal is also statistically significant for higher percentiles, but less evident for annual maxima. The RCP8.5 scenario yields the strongest reduction in wave height. The exception to this is the northwestern part of the Norwegian Sea and the Barents Sea, where receding ice cover gives longer fetch and higher waves. The upper percentiles are reduced less than the mean wave height, suggesting that the future wave climate has higher variance than the historical period.

  10. Development of a Microsimulation Model to Predict Stroke and Long-Term Mortality in Adherent and Nonadherent Medically Managed and Surgically Treated Octogenarians with Asymptomatic Significant Carotid Artery Stenosis.

    PubMed

    Luebke, Thomas; Brunkwall, Jan

    2016-08-01

    The primary study objective was to develop a microsimulation model to predict preventable first-ever and recurrent strokes and mortality for a population of medically or surgically managed octogenarians with substantial (>60%) asymptomatic carotid artery stenosis, comparing an adherent with a real-world nonadherent best medical treatment (BMT) regimen, stratified by sex. A Monte Carlo microsimulation model was constructed with a 14-year time horizon and with 10,000 patients. Probabilities and values for clinical outcomes were obtained from the current literature. Stratification of the microsimulation estimates by treatment strategy within the female group of octogenarians showed a statistically significant lower stroke rate during follow-up for carotid endarterectomy (CEA) compared with nonadherent BMT (P < 0.0001) as well as compared with adherent BMT (P < 0.0001). In male octogenarians, the CEA strategy was also associated with statistically significant lower stroke rates compared with adherent and nonadherent BMT (P < 0.0001 and P < 0.0001, respectively). For each treatment strategy, female octogenarians had a statistically significant longer overall long-term survival than male octogenarians (P < 0.0001). Stratified by sex, in both octogenarian men and women, long-term survival was significantly better for adherent BMT compared with nonadherent BMT, and CEA was associated with significantly better long-term survival compared with nonadherent BMT. In the present microsimulation, under real-world drug adherence, a strategy of early endarterectomy was likely beneficial in octogenarians with significant asymptomatic carotid artery disease compared with BMT alone. Copyright © 2016 Elsevier Inc. All rights reserved.
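    The record above describes a Monte Carlo microsimulation. A toy version of such a cohort simulation is sketched below; the annual transition probabilities are purely illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(n_patients, years, p_stroke, p_death):
    """Count first-ever strokes over a horizon, one annual cycle per patient.
    Probabilities are illustrative assumptions, not the study's inputs."""
    strokes = 0
    for _ in range(n_patients):
        for _ in range(years):
            if rng.random() < p_death:    # death ends the trajectory
                break
            if rng.random() < p_stroke:   # first-ever stroke recorded
                strokes += 1
                break
    return strokes

n = 10_000
# Hypothetical: surgical strategy lowers annual stroke risk vs nonadherent BMT
strokes_cea = simulate(n, 14, p_stroke=0.010, p_death=0.06)
strokes_bmt = simulate(n, 14, p_stroke=0.025, p_death=0.06)
```

A full microsimulation would track recurrent strokes, sex-specific mortality, and adherence states per cycle in the same loop structure.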

  11. Statistical shear lag model - unraveling the size effect in hierarchical composites.

    PubMed

    Wei, Xiaoding; Filleter, Tobin; Espinosa, Horacio D

    2015-05-01

    Numerous experimental and computational studies have established that the hierarchical structures encountered in natural materials, such as the brick-and-mortar structure observed in sea shells, are essential for achieving defect tolerance. Due to this hierarchy, the mechanical properties of natural materials have a different size dependence compared to that of typical engineered materials. This study aimed to explore size effects on the strength of bio-inspired staggered hierarchical composites and to define the influence of the geometry of constituents in their outstanding defect tolerance capability. A statistical shear lag model is derived by extending the classical shear lag model to account for the statistics of the constituents' strength. A general solution emerges from rigorous mathematical derivations, unifying the various empirical formulations for the fundamental link length used in previous statistical models. The model shows that the staggered arrangement of constituents grants composites a unique size effect on mechanical strength in contrast to homogenous continuous materials. The model is applied to hierarchical yarns consisting of double-walled carbon nanotube bundles to assess its predictive capabilities for novel synthetic materials. Interestingly, the model predicts that yarn gauge length does not significantly influence the yarn strength, in close agreement with experimental observations. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  12. A Statistical Analysis of Brain Morphology Using Wild Bootstrapping

    PubMed Central

    Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.

    2008-01-01

    Methods for the analysis of brain morphology, including voxel-based and surface-based morphometry, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two statistical procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correction for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose the use of new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method, called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
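    Wild bootstrapping as described can be sketched in a few lines: refit the model on pseudo-responses built from fitted values plus sign-flipped residuals, which preserves point-wise heteroscedasticity. This toy version uses a single simulated regression rather than imaging data, and Rademacher weights are one common choice of multiplier.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy heteroscedastic regression: y = b0 + b1*x + noise with x-dependent variance
n = 200
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.2 + 0.8 * x, n)
X = np.column_stack([np.ones(n), x])

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat

# Wild bootstrap: y* = X b_hat + resid * v, with Rademacher weights v in {-1, +1}
B = 2000
boot = np.empty(B)
for b in range(B):
    v = rng.choice([-1.0, 1.0], size=n)
    y_star = X @ beta_hat + resid * v
    bb, *_ = np.linalg.lstsq(X, y_star, rcond=None)
    boot[b] = bb[1]

se_wild = boot.std(ddof=1)   # heteroscedasticity-robust SE of the slope
```

The bootstrap distribution of the test statistic, rather than a normal-theory reference, then drives the significance assessment.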

  13. Model fitting for small skin permeability data sets: hyperparameter optimisation in Gaussian Process Regression.

    PubMed

    Ashrafi, Parivash; Sun, Yi; Davey, Neil; Adams, Roderick G; Wilkinson, Simon C; Moss, Gary Patrick

    2018-03-01

    The aim of this study was to investigate how to improve predictions from Gaussian Process models by optimising the model hyperparameters. Optimisation methods, including Grid Search, Conjugate Gradient, Random Search, Evolutionary Algorithm and Hyper-prior, were evaluated and applied to previously published data. Data sets were also altered in a structured manner to reduce their size, which retained the range, or 'chemical space', of the key descriptors, to assess the effect of the data range on model quality. The Hyper-prior Smoothbox kernel resulted in the best models for the majority of data sets, and they exhibited significantly better performance than benchmark quantitative structure-permeability relationship (QSPR) models. When the data sets were systematically reduced in size, the different optimisation methods generally retained their statistical quality, whereas benchmark QSPR models performed poorly. The design of the data set, and possibly also the approach to validation of the model, is critical in the development of improved models. The size of the data set, if carefully controlled, was not generally a significant factor for these models, and models of excellent statistical quality could be produced from substantially smaller data sets. © 2018 Royal Pharmaceutical Society.
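    Of the optimisation strategies compared, grid search over the log marginal likelihood is the simplest to illustrate. The sketch below tunes a single RBF length-scale for a one-dimensional GP on synthetic data; the kernel form, noise level, and data are assumptions for illustration, not the study's QSPR setup.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy 1-D regression standing in for a small permeability data set.
x = rng.uniform(0, 5, 30)
y = np.sin(x) + rng.normal(0, 0.1, 30)

def log_marginal_likelihood(ell, noise=0.1):
    """GP log marginal likelihood with an RBF kernel of length-scale ell."""
    K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell**2)
    K += noise**2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
            - 0.5 * len(x) * np.log(2 * np.pi))

# Grid search over the hyperparameter: pick the length-scale that maximises
# the marginal likelihood (evidence).
grid = np.linspace(0.1, 3.0, 60)
best_ell = grid[np.argmax([log_marginal_likelihood(e) for e in grid])]
```

The more elaborate strategies in the study (random search, evolutionary algorithms, hyper-priors) replace the grid with smarter proposals over the same objective.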

  14. Watershed Regressions for Pesticides (WARP) models for predicting atrazine concentrations in Corn Belt streams

    USGS Publications Warehouse

    Stone, Wesley W.; Gilliom, Robert J.

    2012-01-01

    Watershed Regressions for Pesticides (WARP) models, previously developed for atrazine at the national scale, are improved for application to the United States (U.S.) Corn Belt region by developing region-specific models that include watershed characteristics that are influential in predicting atrazine concentration statistics within the Corn Belt. WARP models for the Corn Belt (WARP-CB) were developed for annual maximum moving-average (14-, 21-, 30-, 60-, and 90-day durations) and annual 95th-percentile atrazine concentrations in streams of the Corn Belt region. The WARP-CB models accounted for 53 to 62% of the variability in the various concentration statistics among the model-development sites. Model predictions were within a factor of 5 of the observed concentration statistic for over 90% of the model-development sites. The WARP-CB residuals and uncertainty are lower than those of the National WARP model for the same sites. Although atrazine-use intensity is the most important explanatory variable in the National WARP models, it is not a significant variable in the WARP-CB models. The WARP-CB models provide improved predictions for Corn Belt streams draining watersheds with atrazine-use intensities of 17 kg/km2 of watershed area or greater.

  15. Statistical Methodology for the Analysis of Repeated Duration Data in Behavioral Studies.

    PubMed

    Letué, Frédérique; Martinez, Marie-José; Samson, Adeline; Vilain, Anne; Vilain, Coriandre

    2018-03-15

    Repeated duration data are frequently used in behavioral studies. Classical linear or log-linear mixed models are often inadequate for such data, because durations are nonnegative and typically skew-distributed. Therefore, we recommend a statistical methodology specific to duration data. We propose a methodology based on Cox mixed models, implemented in the R language. This semiparametric model is flexible enough to fit duration data. To compare log-linear and Cox mixed models in terms of goodness of fit on real data sets, we also provide a procedure based on simulations and quantile-quantile plots. We present two examples from a data set of speech and gesture interactions, which illustrate the limitations of linear and log-linear mixed models as compared to Cox models. The linear models are not validated on our data, whereas the Cox models are. Moreover, in the second example, the Cox model exhibits a significant effect that the linear model does not. We provide methods to select the best-fitting models for repeated duration data and to compare statistical methodologies. In this study, we show that Cox models are best suited to the analysis of our data set.

  16. Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling.

    PubMed

    Nord, Camilla L; Valton, Vincent; Wood, John; Roiser, Jonathan P

    2017-08-23

    Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents, so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered, some very seriously so, but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. Copyright © 2017 Nord, Valton et al.
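    A two-component Gaussian mixture of the kind used in this reanalysis can be fitted with a short EM loop. The "power" values below are synthetic stand-ins for the 730-study sample, with a large low-powered cluster and a smaller well-powered one.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic statistical-power values from two subpopulations (illustrative):
# many low-powered studies near 0.15 and a smaller cluster near 0.80.
power = np.concatenate([rng.normal(0.15, 0.05, 300),
                        rng.normal(0.80, 0.07, 120)])

# Two-component 1-D Gaussian mixture fitted by expectation-maximisation.
mu = np.array([0.2, 0.6]); sd = np.array([0.1, 0.1]); w = np.array([0.5, 0.5])
for _ in range(200):
    # E-step: responsibility of each component for each observation
    pdf = (w / (sd * np.sqrt(2 * np.pi)) *
           np.exp(-0.5 * ((power[:, None] - mu) / sd) ** 2))
    r = pdf / pdf.sum(axis=1, keepdims=True)
    # M-step: update weights, means, and standard deviations
    nk = r.sum(axis=0)
    w = nk / len(power)
    mu = (r * power[:, None]).sum(axis=0) / nk
    sd = np.sqrt((r * (power[:, None] - mu) ** 2).sum(axis=0) / nk)

lo, hi = sorted(mu)   # estimated subgroup mean power levels
```

Reporting the component means and weights, rather than one median, is exactly the "several subcomponents" point the abstract makes.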

  17. Forecasting defoliation by the gypsy moth in oak stands

    Treesearch

    Robert W. Campbell; Joseph P. Standaert

    1974-01-01

    A multiple-regression model is presented that reflects statistically significant correlations between defoliation by the gypsy moth, the dependent variable, and a series of biotic and physical independent variables. Both possible uses and shortcomings of this model are discussed.

  18. Evaluation of high-resolution sea ice models on the basis of statistical and scaling properties of Arctic sea ice drift and deformation

    NASA Astrophysics Data System (ADS)

    Girard, L.; Weiss, J.; Molines, J. M.; Barnier, B.; Bouillon, S.

    2009-08-01

    Sea ice drift and deformation from models are evaluated on the basis of statistical and scaling properties. These properties are derived from two observation data sets: the RADARSAT Geophysical Processor System (RGPS) and buoy trajectories from the International Arctic Buoy Program (IABP). Two simulations obtained with the Louvain-la-Neuve Ice Model (LIM) coupled to a high-resolution ocean model and a simulation obtained with the Los Alamos Sea Ice Model (CICE) were analyzed. Model ice drift compares well with observations in terms of large-scale velocity field and distributions of velocity fluctuations although a significant bias on the mean ice speed is noted. On the other hand, the statistical properties of ice deformation are not well simulated by the models: (1) The distributions of strain rates are incorrect: RGPS distributions of strain rates are power law tailed, i.e., exhibit "wild randomness," whereas model distributions remain in the Gaussian attraction basin, i.e., exhibit "mild randomness." (2) The models are unable to reproduce the spatial and temporal correlations of the deformation fields: In the observations, ice deformation follows spatial and temporal scaling laws that express the heterogeneity and the intermittency of deformation. These relations do not appear in simulated ice deformation. Mean deformation in models is almost scale independent. The statistical properties of ice deformation are a signature of the ice mechanical behavior. The present work therefore suggests that the mechanical framework currently used by models is inappropriate. A different modeling framework based on elastic interactions could improve the representation of the statistical and scaling properties of ice deformation.

  19. Development of a funding, cost, and spending model for satellite projects

    NASA Technical Reports Server (NTRS)

    Johnson, Jesse P.

    1989-01-01

    The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) are used to predict the total costs of satellite projects. An effort to extend the modeling capabilities from total budget analysis to total budget and budget outlays over time analysis was conducted. A statistically based and data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model which would describe the historical spending patterns. This raw data consisted of dollars spent in each specific year and their 1989 dollar equivalent. This data was converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach used is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large-scale projects over extended periods of time is described by Life Cycle Cost Models (LCCM). The data was analyzed to find a model in the generic form of a LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. In order to use this model it is necessary to transform the problem from a dollar/time space to a percentage of total budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule which relates the amount of money spent to the percentage of project completion.
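    The Weibull-based spending curve can be illustrated by recovering shape and scale parameters from a cumulative budget-fraction series. A brute-force grid search stands in for the paper's nonlinear optimization, and the data are synthetic rather than the GSFC records.

```python
import numpy as np

# Hypothetical cumulative budget fractions at fractions of project duration
# (synthetic stand-ins for the historical spending data).
t = np.linspace(0.1, 1.0, 10)
true_shape, true_scale = 2.0, 0.6
frac = 1.0 - np.exp(-(t / true_scale) ** true_shape)

def weibull_cdf(t, shape, scale):
    """Fraction of total budget spent by normalized time t."""
    return 1.0 - np.exp(-(t / scale) ** shape)

# Brute-force grid search standing in for the nonlinear optimization step.
shapes = np.linspace(0.5, 4.0, 71)    # step 0.05, grid includes 2.0
scales = np.linspace(0.2, 1.5, 131)   # step 0.01, grid includes 0.6
sse, shape_hat, scale_hat = min(
    (float(np.sum((weibull_cdf(t, a, b) - frac) ** 2)), a, b)
    for a in shapes for b in scales)
```

Working in budget-fraction space is the "probability space" transformation the abstract describes: the fitted CDF can then be inverted to map a spending level to a completion percentage.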

  20. Comparison of potential fecundity models for walleye pollock Gadus chalcogrammus in the Pacific waters off Hokkaido, Japan.

    PubMed

    Tanaka, H; Hamatsu, T; Mori, K

    2017-01-01

    Potential fecundity models of walleye or Alaska pollock Gadus chalcogrammus in the Pacific waters off Hokkaido, Japan, were developed. They were compared using a generalized linear model with either standard body length (L_S) or total body mass (M_T) as the main covariate, along with Fulton's condition factor (K) and mean diameter of oocytes (D_O) as additional potential covariates to account for maternal condition and maturity stage. The results of model selection showed that M_T was a better single predictor of potential fecundity (F_P) than L_S. The biological importance of K on F_P was obscure, because it was statistically significant when used in the predictor with L_S (i.e. the length-based model), but not significant when used with M_T (i.e. the mass-based model). Meanwhile, D_O was statistically significant in both the length- and mass-based models, suggesting the importance of downregulation of the number of oocytes with advancing maturation. Among all candidate models, the model with M_T and D_O in the predictor had the lowest Akaike's information criterion value, suggesting its better predictive power. These newly developed models will improve future comparisons of potential fecundity within and among stocks by excluding potential biases other than body size. © 2016 The Fisheries Society of the British Isles.

  1. Employing the Gini coefficient to measure participation inequality in treatment-focused Digital Health Social Networks.

    PubMed

    van Mierlo, Trevor; Hyatt, Douglas; Ching, Andrew T

    2016-01-01

    Digital Health Social Networks (DHSNs) are common; however, there are few metrics that can be used to identify participation inequality. The objective of this study was to investigate whether the Gini coefficient, an economic measure of statistical dispersion traditionally used to measure income inequality, could be employed to measure DHSN inequality. Quarterly Gini coefficients were derived from four long-standing DHSNs. The combined data set included 625,736 posts that were generated from 15,181 actors over 18,671 days. The range of actors (8-2323), posts (29-28,684), and Gini coefficients (0.15-0.37) varied. Pearson correlations indicated statistically significant associations between number of actors and number of posts (0.527-0.835, p < .001), and Gini coefficients and number of posts (0.342-0.725, p < .001). However, the association between Gini coefficient and number of actors was only statistically significant for the addiction networks (0.619 and 0.276, p < .036). Linear regression models had positive but mixed R² results (0.333-0.527). In all four regression models, the association between Gini coefficient and posts was statistically significant (t = 3.346-7.381, p < .002). However, unlike the Pearson correlations, the association between Gini coefficient and number of actors was only statistically significant in the two mental health networks (t = -4.305 and -5.934, p < .000). The Gini coefficient is helpful in measuring shifts in DHSN inequality. However, as a standalone metric, the Gini coefficient does not indicate optimal numbers or ratios of actors to posts, or effective network engagement. Further, mixed-methods research investigating quantitative performance metrics is required.
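    The Gini coefficient used here is straightforward to compute from per-actor post counts. One common closed-form version (of several equivalent formulas) is sketched below; the post-count vectors are made up for illustration.

```python
import numpy as np

def gini(counts):
    """Gini coefficient of a non-negative 1-D array (e.g. posts per actor)."""
    x = np.sort(np.asarray(counts, dtype=float))
    n = x.size
    if n == 0 or x.sum() == 0:
        return 0.0
    # G = (2 * sum(i * x_i)) / (n * sum(x)) - (n + 1) / n, with i = 1..n
    i = np.arange(1, n + 1)
    return (2.0 * np.sum(i * x)) / (n * x.sum()) - (n + 1.0) / n

# Perfectly equal participation -> G = 0
equal = gini([5, 5, 5, 5])
# One actor makes all posts -> G approaches 1 - 1/n
skewed = gini([0, 0, 0, 20])
```

Applied quarterly to each network's posts-per-actor vector, this yields exactly the kind of 0-to-1 inequality series the study analyzes.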

  2. An evaluation of the variable-resolution CESM for modeling California's climate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Xingying; Rhoades, Alan M.; Ullrich, Paul A.

    In this paper, the recently developed variable-resolution option within the Community Earth System Model (VR-CESM) is assessed for long-term regional climate modeling of California at 0.25° (~28 km) and 0.125° (~14 km) horizontal resolutions. The mean climatology of near-surface temperature and precipitation is analyzed and contrasted with reanalysis, gridded observational data sets, and a traditional regional climate model (RCM)—the Weather Research and Forecasting (WRF) model. Statistical metrics for model evaluation and tests for differential significance have been extensively applied. VR-CESM tended to produce a warmer summer (by about 1–3°C) and overestimated overall winter precipitation (about 25%–35%) compared to reference data sets when sea surface temperatures were prescribed. Increasing resolution from 0.25° to 0.125° did not produce a statistically significant improvement in the model results. By comparison, the analogous WRF climatology (constrained laterally and at the sea surface by ERA-Interim reanalysis) was ~1–3°C colder than the reference data sets, underestimated precipitation by ~20%–30% at 27 km resolution, and overestimated precipitation by ~65–85% at 9 km. Overall, VR-CESM produced comparable statistical biases to WRF in key climatological quantities. Moreover, this assessment highlights the value of variable-resolution global climate models (VRGCMs) in capturing fine-scale atmospheric processes, projecting future regional climate, and addressing the computational expense of uniform-resolution global climate models.

  3. An evaluation of the variable-resolution CESM for modeling California's climate

    DOE PAGES

    Huang, Xingying; Rhoades, Alan M.; Ullrich, Paul A.; ...

    2016-03-01

    In this paper, the recently developed variable-resolution option within the Community Earth System Model (VR-CESM) is assessed for long-term regional climate modeling of California at 0.25° (~28 km) and 0.125° (~14 km) horizontal resolutions. The mean climatology of near-surface temperature and precipitation is analyzed and contrasted with reanalysis, gridded observational data sets, and a traditional regional climate model (RCM)—the Weather Research and Forecasting (WRF) model. Statistical metrics for model evaluation and tests for differential significance have been extensively applied. VR-CESM tended to produce a warmer summer (by about 1–3°C) and overestimated overall winter precipitation (about 25%–35%) compared to reference data sets when sea surface temperatures were prescribed. Increasing resolution from 0.25° to 0.125° did not produce a statistically significant improvement in the model results. By comparison, the analogous WRF climatology (constrained laterally and at the sea surface by ERA-Interim reanalysis) was ~1–3°C colder than the reference data sets, underestimated precipitation by ~20%–30% at 27 km resolution, and overestimated precipitation by ~65–85% at 9 km. Overall, VR-CESM produced comparable statistical biases to WRF in key climatological quantities. Moreover, this assessment highlights the value of variable-resolution global climate models (VRGCMs) in capturing fine-scale atmospheric processes, projecting future regional climate, and addressing the computational expense of uniform-resolution global climate models.

  4. Factorial analysis of trihalomethanes formation in drinking water.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2010-06-01

    Disinfection of drinking water reduces pathogenic infection, but may pose risks to human health through the formation of disinfection byproducts. The effects of different factors on the formation of trihalomethanes were investigated using a statistically designed experimental program, and a predictive model for trihalomethanes formation was developed. Synthetic water samples with different factor levels were produced, and trihalomethanes concentrations were measured. A replicated fractional factorial design with center points was performed, and significant factors were identified through statistical analysis. A second-order trihalomethanes formation model was developed from 92 experiments, and the statistical adequacy was assessed through appropriate diagnostics. This model was validated using additional data from the Drinking Water Surveillance Program database and was applied to the Smiths Falls water supply system in Ontario, Canada. The model predictions were correlated strongly to the measured trihalomethanes, with correlations of 0.95 and 0.91, respectively. The resulting model can assist in analyzing risk-cost tradeoffs in the design and operation of water supply systems.
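    A second-order model of the kind described is ordinary least squares on linear, interaction, and squared terms of the coded factors. The two factors and the response surface below are invented for illustration, not the experimental THM data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two coded factors scaled to [-1, 1] (e.g. hypothetical chlorine dose x1 and
# reaction time x2); the response is an illustrative stand-in for THM formation.
n = 92   # same number of runs as the reported experimental program
x1 = rng.uniform(-1, 1, n)
x2 = rng.uniform(-1, 1, n)
y = 40 + 8 * x1 + 5 * x2 + 3 * x1 * x2 + 2 * x1**2 + rng.normal(0, 0.5, n)

# Second-order (quadratic) response-surface model fitted by least squares.
X = np.column_stack([np.ones(n), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
corr = np.corrcoef(y, pred)[0, 1]   # analogous to the reported correlations
```

Diagnostics on the residuals (normality, constant variance) would then assess the model's statistical adequacy, as the abstract describes.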

  5. Predictors of Latina/o Community College Student Vocational Choice of STEM Fields: Testing of the STEM-Vocational Choice Model

    ERIC Educational Resources Information Center

    Johnson, Joel D.

    2013-01-01

    This study confirmed appropriate measurement model fit for a theoretical model, the STEM vocational choice (STEM-VC) model. This model identifies exogenous factors that successfully predicted, at a statistically significant level, a student's vocational choice decision to pursue a STEM degree at transfer. The student population examined for this…

  6. Comparing Regression Coefficients between Nested Linear Models for Clustered Data with Generalized Estimating Equations

    ERIC Educational Resources Information Center

    Yan, Jun; Aseltine, Robert H., Jr.; Harel, Ofer

    2013-01-01

    Comparing regression coefficients between models when one model is nested within another is of great practical interest when two explanations of a given phenomenon are specified as linear models. The statistical problem is whether the coefficients associated with a given set of covariates change significantly when other covariates are added into…

  7. Light propagation in Swiss-cheese models of random close-packed Szekeres structures: Effects of anisotropy and comparisons with perturbative results

    NASA Astrophysics Data System (ADS)

    Koksbang, S. M.

    2017-03-01

    Light propagation in two Swiss-cheese models based on anisotropic Szekeres structures is studied and compared with light propagation in Swiss-cheese models based on the Szekeres models' underlying Lemaître-Tolman-Bondi models. The study shows that the anisotropy of the Szekeres models has only a small effect on quantities such as redshift-distance relations, projected shear and expansion rate along individual light rays. The average angular diameter distance to the last scattering surface is computed for each model. Contrary to earlier studies, the results obtained here are (mostly) in agreement with perturbative results. In particular, a small negative shift, δ_DA := (D_A - D_A,bg)/D_A,bg, in the angular diameter distance is obtained upon line-of-sight averaging in three of the four models. The results are, however, not statistically significant. In the fourth model, there is a small positive shift which has an especially small statistical significance. The line-of-sight averaged inverse magnification at z = 1100 is consistent with 1 to a high level of confidence for all models, indicating that the area of the surface corresponding to z = 1100 is close to that of the background.

  8. Predicting risk for portal vein thrombosis in acute pancreatitis patients: A comparison of radial basis function artificial neural network and logistic regression models.

    PubMed

    Fei, Yang; Hu, Jian; Gao, Kun; Tu, Jianfeng; Li, Wei-Qin; Wang, Wei

    2017-06-01

    To construct a radial basis function (RBF) artificial neural network (ANN) model to predict the incidence of acute pancreatitis (AP)-induced portal vein thrombosis. The analysis included 353 patients with AP who had been admitted between January 2011 and December 2015. An RBF ANN model and a logistic regression model were each constructed based on eleven factors relevant to AP. Statistical indexes were used to evaluate the predictive value of the two models. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the RBF ANN model for PVT were 73.3%, 91.4%, 68.8%, 93.0%, and 87.7%, respectively. There were significant differences between the RBF ANN and logistic regression models in these parameters (P<0.05). In addition, a comparison of the areas under the receiver operating characteristic curves of the two models showed a statistically significant difference (P<0.05). The RBF ANN model is more likely to predict the occurrence of PVT induced by AP than the logistic regression model. D-dimer, AMY, Hct, and PT were important predictive factors for AP-induced PVT. Copyright © 2017 Elsevier Inc. All rights reserved.
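    The reported diagnostics (sensitivity, specificity, PPV, NPV, accuracy) all derive from a 2x2 confusion matrix. A minimal helper is shown below, with made-up labels rather than the study's predictions.

```python
import numpy as np

def diagnostics(y_true, y_pred):
    """Sensitivity, specificity, PPV, NPV and accuracy from binary labels."""
    y_true = np.asarray(y_true, bool)
    y_pred = np.asarray(y_pred, bool)
    tp = np.sum(y_true & y_pred)     # true positives
    tn = np.sum(~y_true & ~y_pred)   # true negatives
    fp = np.sum(~y_true & y_pred)    # false positives
    fn = np.sum(y_true & ~y_pred)    # false negatives
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
            "accuracy": (tp + tn) / (tp + tn + fp + fn)}

# Tiny illustrative example (not the study's data): 3 PVT cases, 5 non-cases
m = diagnostics([1, 1, 1, 0, 0, 0, 0, 0], [1, 1, 0, 0, 0, 0, 1, 0])
```

Applying the same helper to both models' predictions yields the paired comparison the abstract reports.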

  9. Hormone replacement therapy is associated with gastro-oesophageal reflux disease: a retrospective cohort study

    PubMed Central

    2012-01-01

    Background Oestrogen and progestogen have the potential to influence gastro-intestinal motility; both are key components of hormone replacement therapy (HRT). Results of observational studies in women taking HRT rely on self-reporting of gastro-oesophageal symptoms and the aetiology of gastro-oesophageal reflux disease (GORD) remains unclear. This study investigated the association between HRT and GORD in menopausal women using validated general practice records. Methods 51,182 menopausal women were identified using the UK General Practice Research Database between 1995–2004. Of these, 8,831 were matched with and without hormone use. Odds ratios (ORs) were calculated for GORD and proton-pump inhibitor (PPI) use in hormone and non-hormone users, adjusting for age, co-morbidities, and co-pharmacy. Results In unadjusted analysis, all forms of hormone use (oestrogen-only, tibolone, combined HRT and progestogen) were statistically significantly associated with GORD. In adjusted models, this association remained statistically significant for oestrogen-only treatment (OR 1.49; 1.18–1.89). Unadjusted analysis showed a statistically significant association between PPI use and oestrogen-only and combined HRT treatment. When adjusted for covariates, oestrogen-only treatment was significant (OR 1.34; 95% CI 1.03–1.74). Findings from the adjusted model demonstrated the greater use of PPI by progestogen users (OR 1.50; 1.01–2.22). Conclusions This first large cohort study of the association between GORD and HRT found a statistically significant association between oestrogen-only hormone and GORD and PPI use. This should be further investigated using prospective follow-up to validate the strength of association and describe its clinical significance. PMID:22642788

  10. Can upstaging of ductal carcinoma in situ be predicted at biopsy by histologic and mammographic features?

    NASA Astrophysics Data System (ADS)

    Shi, Bibo; Grimm, Lars J.; Mazurowski, Maciej A.; Marks, Jeffrey R.; King, Lorraine M.; Maley, Carlo C.; Hwang, E. Shelley; Lo, Joseph Y.

    2017-03-01

    Reducing the overdiagnosis and overtreatment associated with ductal carcinoma in situ (DCIS) requires accurate prediction of invasive potential at cancer screening. In this work, we investigated the utility of pre-operative histologic and mammographic features to predict upstaging of DCIS. The goal was to provide an intentionally conservative baseline performance using readily available data from radiologists and pathologists and only linear models. We conducted a retrospective analysis of 99 patients with DCIS, of whom 25 were upstaged to invasive cancer at the time of definitive surgery. Pre-operative factors, including histologic features extracted from stereotactic core needle biopsy (SCNB) reports and mammographic features annotated by an expert breast radiologist, were investigated with statistical analysis. Furthermore, we built classification models based on those features in an attempt to predict the presence of an occult invasive component in DCIS, with generalization performance assessed by receiver operating characteristic (ROC) curve analysis. Histologic features, including nuclear grade and DCIS subtype, did not show statistically significant differences between cases with pure DCIS and cases with DCIS plus invasive disease. However, three mammographic features (the major axis length of the DCIS lesion, the BI-RADS level of suspicion, and the radiologist's assessment) did achieve statistical significance. Using those three statistically significant features as input, a linear discriminant model was able to distinguish patients with DCIS plus invasive disease from those with pure DCIS, with an AUC-ROC of 0.62. Overall, mammograms used for breast screening contain useful information that radiologists can perceive and that helps predict occult invasive components in DCIS.
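    A minimal NumPy sketch of the kind of linear-discriminant-plus-ROC analysis the abstract describes. The three "mammographic features" below are synthetic stand-ins, not the study's data; only the 74/25 class split mirrors the reported cohort.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for the three significant mammographic features
    # (lesion major-axis length, BI-RADS level, radiologist assessment).
    n_pure, n_inv = 74, 25                        # pure DCIS vs. DCIS + invasive
    X0 = rng.normal(0.0, 1.0, size=(n_pure, 3))
    X1 = rng.normal(0.5, 1.0, size=(n_inv, 3))    # upstaged cases shifted slightly

    # Fisher's linear discriminant: w = Sw^-1 (mu1 - mu0)
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) * (n_pure - 1) + np.cov(X1, rowvar=False) * (n_inv - 1)
    w = np.linalg.solve(Sw, mu1 - mu0)

    scores = np.concatenate([X0 @ w, X1 @ w])
    labels = np.concatenate([np.zeros(n_pure), np.ones(n_inv)])

    # AUC-ROC via the Mann-Whitney rank-sum identity
    ranks = scores.argsort().argsort() + 1
    auc = (ranks[labels == 1].sum() - n_inv * (n_inv + 1) / 2) / (n_pure * n_inv)
    print(f"AUC-ROC: {auc:.2f}")
    ```

    With real data the same pipeline would simply swap the synthetic arrays for the measured feature matrix and upstaging labels.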

  11. QSAR study of curcumine derivatives as HIV-1 integrase inhibitors.

    PubMed

    Gupta, Pawan; Sharma, Anju; Garg, Prabha; Roy, Nilanjan

    2013-03-01

    A QSAR study was performed on curcumine derivatives as HIV-1 integrase inhibitors using multiple linear regression. A statistically significant model was developed with a squared correlation coefficient (r²) of 0.891 and a cross-validated r² (r²cv) of 0.825. The developed model revealed that electronic properties, shape, size, geometry, substitution information, and hydrophilicity were important atomic properties for determining the inhibitory activity of these molecules. The model was also tested successfully for external validation (r²pred = 0.849) as well as Tropsha's test for model predictability. Furthermore, domain analysis was carried out to evaluate the prediction reliability for external-set molecules. The model was statistically robust and had good predictive power, and it can be utilized for screening new molecules.
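    A hedged sketch of the fitted-r² versus cross-validated-r² (q²) workflow that underlies QSAR validation like this, using invented descriptors rather than the curcumine dataset:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Invented descriptor matrix (stand-ins for electronic, shape, size,
    # hydrophilicity descriptors) and activities generated with known weights.
    n, k = 30, 4
    X = rng.normal(size=(n, k))
    y = X @ np.array([1.2, -0.8, 0.5, 0.3]) + rng.normal(scale=0.3, size=n)

    A = np.column_stack([np.ones(n), X])            # design matrix with intercept
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = ((y - A @ beta) ** 2).sum()
    tss = ((y - y.mean()) ** 2).sum()
    r2 = 1 - rss / tss                              # fitted r^2

    # Leave-one-out cross-validated r^2 (q^2), the standard QSAR check
    press = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        b, *_ = np.linalg.lstsq(A[mask], y[mask], rcond=None)
        press += (y[i] - A[i] @ b) ** 2
    q2 = 1 - press / tss

    print(f"r2 = {r2:.3f}, q2 = {q2:.3f}")
    ```

    Because the leave-one-out prediction error (PRESS) can never be smaller than the fitted residual sum of squares, q² is always at or below r², which is why a high q² is the stronger claim.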

  12. Estimating the Regional Economic Significance of Airports

    DTIC Science & Technology

    1992-09-01

    following three options for estimating induced impacts: the economic base model, an econometric model, and a regional input-output model. One approach to...limitations, however, the economic base model has been widely used for regional economic analysis. A second approach is to develop an econometric model of...analysis is the principal statistical tool used to estimate the economic relationships. Regional econometric models are capable of estimating a single

  13. The effects of BleedArrest on hemorrhage control in a porcine model.

    PubMed

    Gegel, Brian; Burgert, James; Loughren, Michael; Johnson, Don

    2012-01-01

    The purpose of this study was to examine the effectiveness of the hemostatic agent BleedArrest compared to control. This was a prospective, experimental design employing an established porcine model of uncontrolled hemorrhage. The minimum number of animals (n=10 per group) was used to obtain a statistically valid result. There were no statistically significant differences between the groups (P>.05) indicating that the groups were equivalent on the following parameters: activating clotting time, the subject weights, core body temperatures, amount of one minute hemorrhage, arterial blood pressures, and the amount and percentage of total blood volume. There were significant differences in the amount of hemorrhage (P=.033) between the BleedArrest (mean=72, SD±72 mL) and control (mean=317.30, SD±112.02 mL). BleedArrest is statistically and clinically superior at controlling hemorrhage compared to the standard pressure dressing control group. In conclusion, BleedArrest is an effective hemostatic agent for use in civilian and military trauma management.

  14. Modelling spruce bark beetle infestation probability

    Treesearch

    Paulius Zolubas; Jose Negron; A. Steven Munson

    2009-01-01

    A spruce bark beetle (Ips typographus L.) risk model, based on pure Norway spruce (Picea abies Karst.) stand characteristics in experimental and control plots, was developed using the classification and regression tree statistical technique under endemic pest population density. The most significant variable in spruce bark beetle...

  15. Maternal Factors Predicting Cognitive and Behavioral Characteristics of Children with Fetal Alcohol Spectrum Disorders

    PubMed Central

    May, Philip A.; Tabachnick, Barbara G.; Gossage, J. Phillip; Kalberg, Wendy O.; Marais, Anna-Susan; Robinson, Luther K.; Manning, Melanie A.; Blankenship, Jason; Buckley, David; Hoyme, H. Eugene; Adnams, Colleen M.

    2013-01-01

    Objective To provide an analysis of multiple predictors of cognitive and behavioral traits for children with fetal alcohol spectrum disorders (FASD). Method Multivariate correlation techniques were employed with maternal and child data from epidemiologic studies in a community in South Africa. Data on 561 first-grade children with fetal alcohol syndrome (FAS), partial FAS (PFAS), or without FASD and their mothers were analyzed by grouping 19 maternal variables into categories (physical, demographic, childbearing, and drinking) and employing them in structural equation models (SEM) to assess correlates of child intelligence (verbal and non-verbal) and behavior. Results A first SEM utilizing only seven maternal alcohol use variables to predict cognitive/behavioral traits was statistically significant (B = 3.10, p < .05), but explained only 17.3% of the variance. The second model incorporated multiple maternal variables and was statistically significant, explaining 55.3% of the variance. Significantly correlated with low intelligence and problem behavior were demographic composites (B = 3.83, p < .05: low maternal education, low socioeconomic status (SES), and rural residence) and maternal physical characteristics (B = 2.70, p < .05: short stature, small head circumference, and low weight). Childbearing history and alcohol use composites were not statistically significant in the final complex model and were overpowered by SES and maternal physical traits. Conclusions While other analytic techniques have amply demonstrated the negative effects of maternal drinking on intelligence and behavior, this highly controlled analysis of multiple maternal influences reveals that maternal demographics and physical traits make a significant enabling or disabling contribution to child functioning in FASD. PMID:23751886

  16. The Detection and Correction of Bias in Student Ratings of Instruction.

    ERIC Educational Resources Information Center

    Haladyna, Thomas; Hess, Robert K.

    1994-01-01

    A Rasch model was used to detect and correct bias in Likert rating scales used to assess student perceptions of college teaching, using a database of ratings. Statistical corrections were significant, supporting the model's potential utility. Recommendations are made for a theoretical rationale and further research on the model. (Author/MSE)

  17. On the Use of Principal Component and Spectral Density Analysis to Evaluate the Community Multiscale Air Quality (CMAQ) Model

    EPA Science Inventory

    A 5-year (2002-2006) simulation of CMAQ covering the eastern United States is evaluated using principal component analysis in order to identify and characterize statistically significant patterns of model bias. Such analysis is useful in that it can identify areas of poor model ...

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, C.; Potts, I.; Reeks, M. W., E-mail: mike.reeks@ncl.ac.uk

    We present a simple stochastic quadrant model for calculating the transport and deposition of heavy particles in a fully developed turbulent boundary layer based on the statistics of wall-normal fluid velocity fluctuations obtained from a fully developed channel flow. Individual particles are tracked through the boundary layer via their interactions with a succession of random eddies found in each of the quadrants of the fluid Reynolds shear stress domain in a homogeneous Markov chain process. In this way, we are able to account directly for the influence of ejection and sweeping events as others have done but without resorting to the use of adjustable parameters. Deposition rate predictions for a wide range of heavy particles predicted by the model compare well with benchmark experimental measurements. In addition, deposition rates are compared with those obtained from continuous random walk models and Langevin equation based ejection and sweep models, which noticeably give significantly lower deposition rates. Various statistics related to the particle near-wall behavior are also presented. Finally, we consider the limitations of using the model to calculate deposition in more complex flows where the near-wall turbulence may be significantly different.

  19. A Statistical Model of Tropical Cyclone Tracks in the Western North Pacific with ENSO-Dependent Cyclogenesis

    NASA Technical Reports Server (NTRS)

    Yonekura, Emmi; Hall, Timothy M.

    2011-01-01

    A new statistical model for western North Pacific Ocean tropical cyclone genesis and tracks is developed and applied to estimate regionally resolved tropical cyclone landfall rates along the coasts of the Asian mainland, Japan, and the Philippines. The model is constructed on International Best Track Archive for Climate Stewardship (IBTrACS) 1945-2007 historical data for the western North Pacific. The model is evaluated in several ways, including comparing the stochastic spread in simulated landfall rates with historic landfall rates. Although certain biases have been detected, overall the model performs well on the diagnostic tests, for example, reproducing well the geographic distribution of landfall rates. Western North Pacific cyclogenesis is influenced by the El Nino-Southern Oscillation (ENSO). This dependence is incorporated in the model's genesis component to project the ENSO-genesis dependence onto landfall rates. There is a pronounced southeastward shift in cyclogenesis and a small but significant reduction in basinwide annual counts with increasing ENSO index value. On almost all regions of coast, landfall rates are significantly higher in a negative ENSO state (La Nina).

  20. Examination of environmentally friendly "green" logistics behavior of managers in the pharmaceutical sector using the Theory of Planned Behavior.

    PubMed

    Arslan, Miray; Şar, Sevgi

    2017-12-11

    Logistics activities play a prominent role in enabling manufacturers, distribution channels, and pharmacies to work in harmony. These activities have become increasingly prominent in the pharmaceutical industry and are seen as a development area for the sector. Additionally, green practices are becoming more attractive, particularly for decreasing costs and improving the image of pharmaceutical companies. The main objective of this study was to model the green logistics (GL) behavior of managers in the pharmaceutical sector within the theory of planned behavior (TPB) framework via structural equation modeling (SEM). A measurement tool was developed according to the TPB. Exploratory factor analysis was conducted to determine subfactors of GL behavior. In the second step, confirmatory factor analysis (CFA) was conducted to confirm whether there is a relationship between the observed variables and their underlying latent constructs. Finally, a structural equation model was fitted to specify the relationships between latent variables. In the proposed green logistics behavior (GLB) model, the positive effects of environmental attitude towards GL, perceived behavioral control over GL, and subjective norm about GL on intention towards GL were statistically significant. However, the effect of attitude towards the costs of GL on intention towards GL was not statistically significant. Intention towards GL was found to have a positive, statistically significant effect on GL behavior. Based on the results of this study, TPB appears to be an appropriate theory for modeling the green logistics behavior of managers, and the model can serve as a guide for companies in the pharmaceutical sector seeking to adopt green logistics.

  1. Geospatial clustering in sugar-sweetened beverage consumption among Boston youth.

    PubMed

    Tamura, Kosuke; Duncan, Dustin T; Athens, Jessica K; Bragg, Marie A; Rienti, Michael; Aldstadt, Jared; Scott, Marc A; Elbel, Brian

    2017-09-01

    The objective was to detect geospatial clustering of sugar-sweetened beverage (SSB) intake in Boston adolescents (age = 16.3 ± 1.3 years [range: 13-19]; female = 56.1%; White = 10.4%, Black = 42.6%, Hispanics = 32.4%, and others = 14.6%) using spatial scan statistics. We used data on self-reported SSB intake from the 2008 Boston Youth Survey Geospatial Dataset (n = 1292). Two binary variables were created: consumption of SSB (never versus any) on (1) soda and (2) other sugary drinks (e.g., lemonade). A Bernoulli spatial scan statistic was used to identify geospatial clusters of soda and other sugary drinks in unadjusted models and models adjusted for age, gender, and race/ethnicity. There was no statistically significant clustering of soda consumption in the unadjusted model. In contrast, a cluster of non-soda SSB consumption emerged in the middle of Boston (relative risk = 1.20, p = .005), indicating that adolescents within the cluster had a 20% higher probability of reporting non-soda SSB intake than outside the cluster. The cluster was no longer significant in the adjusted model, suggesting spatial variation in non-soda SSB drink intake correlates with the geographic distribution of students by race/ethnicity, age, and gender.

  2. Influence of neurophysiological hippotherapy on the transference of the centre of gravity among children with cerebral palsy.

    PubMed

    Maćków, Anna; Małachowska-Sobieska, Monika; Demczuk-Włodarczyk, Ewa; Sidorowska, Marta; Szklarska, Alicja; Lipowicz, Anna

    2014-01-01

    The aim of the study was to assess the influence of neurophysiological hippotherapy on the transference of the centre of gravity (COG) among children with cerebral palsy (CP). The study involved 19 children aged 4-13 years suffering from CP who demonstrated an asymmetric (A/P) model of compensation. Body balance was studied with the Cosmogamma Balance Platform before and after a session of neurophysiological hippotherapy. To compare the correlations and differences between the examinations, the results were analysed using Student's t-test for dependent samples, with p ≤ 0.05 as the level of statistical significance, and descriptive statistics were calculated. The mean value of the body's centre of gravity in the frontal plane (COG X) was 18.33 mm during the first examination, changing by 21.84 mm after neurophysiological hippotherapy towards deloading of the antigravity lower limb (p ≤ 0.0001). The other stabilographic parameters increased; however, only the change in the average speed of antero-posterior COG oscillation was statistically significant (p = 0.0354). In conclusion, one session of neurophysiological hippotherapy induced statistically significant changes in the position of the body's centre of gravity in the frontal plane and in the average speed of COG oscillation in the sagittal plane among CP children demonstrating an asymmetric model of compensation (A/P).
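    The before/after comparison above is a paired (dependent-samples) t-test. A minimal SciPy sketch with invented COG values (the n of 19 and the ~21.8 mm mean shift mirror the abstract; the individual data points do not):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    # Invented COG X positions (mm) for n = 19 children, before and after a
    # hippotherapy session; the ~21.8 mm mean shift mirrors the reported change.
    before = rng.normal(18.3, 5.0, size=19)
    after = before - rng.normal(21.8, 5.0, size=19)

    t, p = stats.ttest_rel(before, after)   # paired t-test for dependent samples
    print(f"t = {t:.2f}, p = {p:.4f}")
    ```

    The paired form is the right choice here because each child is measured twice, so the test operates on the within-child differences rather than treating the two examinations as independent groups.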

  3. The Influence of 16-year-old Students' Gender, Mental Abilities, and Motivation on their Reading and Drawing Submicrorepresentations Achievements

    NASA Astrophysics Data System (ADS)

    Devetak, Iztok; Aleksij Glažar, Saša

    2010-08-01

    Submicrorepresentations (SMRs) are a powerful tool for identifying misconceptions of chemical concepts and for generating proper mental models of chemical phenomena in students' long-term memory during chemical education. The main purpose of the study was to determine which independent variables (gender, formal reasoning abilities, visualization abilities, and intrinsic motivation for learning chemistry) have the maximum influence on students' reading and drawing SMRs. A total of 386 secondary school students (aged 16.3 years) participated in the study. The instruments used in the study were: test of Chemical Knowledge, Test of Logical Thinking, two tests of visualization abilities Patterns and Rotations, and questionnaire on Intrinsic Motivation for Learning Science. The results show moderate, but statistically significant correlations between students' intrinsic motivation, formal reasoning abilities and chemical knowledge at submicroscopic level based on reading and drawing SMRs. Visualization abilities are not statistically significantly correlated with students' success on items that comprise reading or drawing SMRs. It can be also concluded that there is a statistically significant difference between male and female students in solving problems that include reading or drawing SMRs. Based on these statistical results and content analysis of the sample problems, several educational strategies can be implemented for students to develop adequate mental models of chemical concepts on all three levels of representations.

  4. Acceleration techniques for dependability simulation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Barnette, James David

    1995-01-01

    As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.

  5. Does transport time help explain the high trauma mortality rates in rural areas? New and traditional predictors assessed by new and traditional statistical methods

    PubMed Central

    Røislien, Jo; Lossius, Hans Morten; Kristiansen, Thomas

    2015-01-01

    Background Trauma is a leading global cause of death. Trauma mortality rates are higher in rural areas, constituting a challenge for quality and equality in trauma care. The aim of the study was to explore population density and transport time to hospital care as possible predictors of geographical differences in mortality rates, and to what extent choice of statistical method might affect the analytical results and accompanying clinical conclusions. Methods Using data from the Norwegian Cause of Death registry, deaths from external causes 1998–2007 were analysed. Norway consists of 434 municipalities, and municipality population density and travel time to hospital care were entered as predictors of municipality mortality rates in univariate and multiple regression models of increasing model complexity. We fitted linear regression models with continuous and categorised predictors, as well as piecewise linear and generalised additive models (GAMs). Models were compared using Akaike's information criterion (AIC). Results Population density was an independent predictor of trauma mortality rates, while the contribution of transport time to hospital care was highly dependent on choice of statistical model. A multiple GAM or piecewise linear model was superior, and similar, in terms of AIC. However, while transport time was statistically significant in multiple models with piecewise linear or categorised predictors, it was not in GAM or standard linear regression. Conclusions Population density is an independent predictor of trauma mortality rates. The added explanatory value of transport time to hospital care is marginal and model-dependent, highlighting the importance of exploring several statistical models when studying complex associations in observational data. PMID:25972600
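    The paper's point that a predictor can be significant in one model family and not another can be illustrated with a toy AIC comparison between a standard linear fit and a piecewise linear fit. The simulated data and the 60-minute knot are purely illustrative assumptions, not the Norwegian registry data.

    ```python
    import numpy as np

    def aic_ols(X, y):
        """Fit OLS by least squares and return AIC = n*log(RSS/n) + 2k."""
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = ((y - X @ beta) ** 2).sum()
        n, k = X.shape
        return n * np.log(rss / n) + 2 * k

    rng = np.random.default_rng(3)

    # Simulated municipalities: mortality rises only beyond a 60-minute
    # transport-time threshold (an invented, illustrative relationship).
    t = rng.uniform(0, 120, size=200)
    y = 5 + 0.04 * np.maximum(t - 60, 0) + rng.normal(scale=0.5, size=200)

    ones = np.ones_like(t)
    linear = np.column_stack([ones, t])
    piecewise = np.column_stack([ones, t, np.maximum(t - 60, 0)])  # knot at 60

    print(f"AIC linear:    {aic_ols(linear, y):.1f}")
    print(f"AIC piecewise: {aic_ols(piecewise, y):.1f}")   # lower = preferred
    ```

    When the true association has a threshold, the straight-line model averages the effect away, which is one mechanism behind the model-dependence the authors report.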

  6. Predictors of surgeons' efficiency in the operating rooms.

    PubMed

    Nakata, Yoshinori; Watanabe, Yuichi; Narimatsu, Hiroto; Yoshimura, Tatsuya; Otake, Hiroshi; Sawa, Tomohiro

    2017-02-01

    The sustainability of the Japanese healthcare system is questionable because of a huge fiscal debt. One of the solutions is to improve the efficiency of healthcare. The purpose of this study is to determine what factors are predictive of surgeons' efficiency scores. The authors collected data from all the surgical procedures performed at Teikyo University Hospital from April 1 through September 30 in 2013-2015. Output-oriented Charnes-Cooper-Rhodes model of data envelopment analysis was employed to calculate each surgeon's efficiency score. Seven independent variables that may predict their efficiency scores were selected: experience, medical school, surgical volume, gender, academic rank, surgical specialty, and the surgical fee schedule. Multiple regression analysis using random-effects Tobit model was used for our panel data. The data from total 8722 surgical cases were obtained in 18-month study period. The authors analyzed 134 surgeons. The only statistically significant coefficients were surgical specialty and surgical fee schedule (p = 0.000 and p = 0.016, respectively). Experience had some positive association with efficiency scores but did not reach statistical significance (p = 0.062). The other coefficients were not statistically significant. These results demonstrated that the surgical reimbursement system, not surgeons' personal characteristics, is a significant predictor of surgeons' efficiency.

  7. Joint resonant CMB power spectrum and bispectrum estimation

    NASA Astrophysics Data System (ADS)

    Meerburg, P. Daniel; Münchmeyer, Moritz; Wandelt, Benjamin

    2016-02-01

    We develop the tools necessary to assess the statistical significance of resonant features in the CMB correlation functions, combining power spectrum and bispectrum measurements. This significance is typically addressed by running a large number of simulations to derive the probability density function (PDF) of the feature-amplitude in the Gaussian case. Although these simulations are tractable for the power spectrum, for the bispectrum they require significant computational resources. We show that, by assuming that the PDF is given by a multivariate Gaussian where the covariance is determined by the Fisher matrix of the sine and cosine terms, we can efficiently produce spectra that are statistically close to those derived from full simulations. By drawing a large number of spectra from this PDF, both for the power spectrum and the bispectrum, we can quickly determine the statistical significance of candidate signatures in the CMB, considering both single frequency and multifrequency estimators. We show that for resonance models, cosmology and foreground parameters have little influence on the estimated amplitude, which allows us to simplify the analysis considerably. A more precise likelihood treatment can then be applied to candidate signatures only. We also discuss a modal expansion approach for the power spectrum, aimed at quickly scanning through large families of oscillating models.
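    A sketch of the shortcut the authors describe: instead of running full simulations, draw feature amplitudes from a multivariate Gaussian whose covariance is the inverse Fisher matrix of the sine and cosine terms, and read the significance off the tail of the draws. The 2x2 Fisher matrix and the candidate amplitude below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Invented 2x2 Fisher matrix for the sine/cosine amplitudes at one
    # resonance frequency; its inverse is the Gaussian covariance of (As, Ac).
    F = np.array([[4.0, 0.5],
                  [0.5, 3.0]])
    cov = np.linalg.inv(F)

    # Many cheap Gaussian draws replace expensive full simulations.
    draws = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=100_000)
    amplitudes = np.hypot(draws[:, 0], draws[:, 1])   # |A| per realization

    # Significance of a candidate feature = tail probability under the null
    candidate = 1.5
    p_value = (amplitudes >= candidate).mean()
    print(f"p = {p_value:.4f}")
    ```

    The same idea extends to the bispectrum estimator: the PDF of the feature amplitude is approximated once from the Fisher matrix, and only surviving candidates get the expensive likelihood treatment.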

  8. Japanese migration in contemporary Japan: economic segmentation and interprefectural migration.

    PubMed

    Fukurai, H

    1991-01-01

    This paper examines the economic segmentation model in explaining 1985-86 Japanese interregional migration. The analysis takes advantage of statistical graphic techniques to illustrate the following substantive issues of interregional migration: (1) to examine whether economic segmentation significantly influences Japanese regional migration and (2) to explain socioeconomic characteristics of prefectures for both in- and out-migration. Analytic techniques include a latent structural equation (LISREL) methodology and statistical residual mapping. The residual dispersion patterns, for instance, suggest the extent to which socioeconomic and geopolitical variables explain migration differences by showing unique clusters of unexplained residuals. The analysis further points out that extraneous factors such as high residential land values, significant commuting populations, and regional-specific cultures and traditions need to be incorporated in the economic segmentation model in order to assess the extent of the model's reliability in explaining the pattern of interprefectural migration.

  9. An Evaluation of the EuroNCAP Crash Test Safety Ratings in the Real World

    PubMed Central

    Segui-Gomez, Maria; Lopez-Valdes, Francisco J.; Frampton, Richard

    2007-01-01

    We investigated whether the rating obtained in the EuroNCAP test procedures correlates with injury protection to vehicle occupants in real crashes using data in the UK Cooperative Crash Injury Study (CCIS) database from 1996 to 2005. Multivariate Poisson regression models were developed, using the Abbreviated Injury Scale (AIS) score by body region as the dependent variable and the EuroNCAP score for that particular body region, seat belt use, mass ratio and Equivalent Test Speed (ETS) as independent variables. Our models identified statistically significant relationships between injury severity and safety belt use, mass ratio and ETS. We could not identify any statistically significant relationships between the EuroNCAP body region scores and real injury outcome except for the protection to pelvis-femur-knee in frontal impacts where scoring “green” is significantly better than scoring “yellow” or “red”.

  10. Evaluation of the ecological relevance of mysid toxicity tests using population modeling techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhn-Hines, A.; Munns, W.R. Jr.; Lussier, S.

    1995-12-31

    A number of acute and chronic bioassay statistics are used to evaluate the toxicity and risks of chemical stressors to the mysid shrimp, Mysidopsis bahia. These include LC50s from acute tests, NOECs from 7-day and life-cycle tests, and the US EPA Water Quality Criteria Criterion Continuous Concentration (CCC). Because these statistics are generated from endpoints which focus upon the responses of individual organisms, their relationships to significant effects at higher levels of ecological organization are unknown. This study was conducted to evaluate the quantitative relationships between toxicity test statistics and a concentration-based statistic derived from exposure-response models relating population growth rate (λ) to stressor concentration. This statistic, C* (the concentration at which λ = 1, i.e., zero population growth), describes the concentration above which mysid populations are projected to decline in abundance, as determined using population modeling techniques. An analysis of M. bahia responses to 9 metals and 9 organic contaminants indicated the NOEC from life-cycle tests to be the best predictor of C*, although the acute LC50 predicted population-level response surprisingly well. These analyses provide useful information regarding uncertainties of extrapolation among test statistics in assessments of ecological risk.
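    The population growth rate λ used in this kind of analysis is the dominant eigenvalue of a projection (Leslie) matrix built from stage-specific vital rates; populations grow when λ > 1 and are projected to decline when λ < 1. A minimal sketch with invented vital rates, not mysid data:

    ```python
    import numpy as np

    def growth_rate(fecundity, survival):
        """Dominant eigenvalue (lambda) of a Leslie matrix with stage
        fecundities in the first row and survival probabilities on the
        subdiagonal."""
        n = len(fecundity)
        L = np.zeros((n, n))
        L[0, :] = fecundity
        L[np.arange(1, n), np.arange(n - 1)] = survival
        return max(abs(np.linalg.eigvals(L)))

    # Invented vital rates for an unstressed vs. a stressed population
    lam_control = growth_rate([0.0, 2.0, 3.0], [0.8, 0.7])
    lam_stressed = growth_rate([0.0, 0.8, 1.0], [0.5, 0.4])

    # lambda > 1: growth; lambda < 1: projected decline
    print(f"control lambda = {lam_control:.2f}, stressed lambda = {lam_stressed:.2f}")
    ```

    Fitting λ as a function of stressor concentration and solving for the concentration where λ crosses 1 gives the population-level statistic the study denotes C*.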

  11. Testing for significance of phase synchronisation dynamics in the EEG.

    PubMed

    Daly, Ian; Sweeney-Reed, Catherine M; Nasuto, Slawomir J

    2013-06-01

    A number of tests exist to check for statistical significance of phase synchronisation within the Electroencephalogram (EEG); however, the majority suffer from a lack of generality and applicability. They may also fail to account for temporal dynamics in the phase synchronisation, regarding synchronisation as a constant state instead of a dynamical process. Therefore, a novel test is developed for identifying the statistical significance of phase synchronisation based upon a combination of work characterising temporal dynamics of multivariate time-series and Markov modelling. We show how this method is better able to assess the significance of phase synchronisation than a range of commonly used significance tests. We also show how the method may be applied to identify and classify significantly different phase synchronisation dynamics in both univariate and multivariate datasets.

  12. Accuracy of single-abutment digital cast obtained using intraoral and cast scanners.

    PubMed

    Lee, Jae-Jun; Jeong, Ii-Do; Park, Jin-Young; Jeon, Jin-Hun; Kim, Ji-Hwan; Kim, Woong-Chul

    2017-02-01

    Scanners are frequently used in the fabrication of dental prostheses. However, the accuracy of these scanners is variable, and little information is available. The purpose of this in vitro study was to compare the accuracy of cast scanners with that of intraoral scanners by using different image impression techniques. A poly(methyl methacrylate) master model was fabricated to replicate a maxillary first molar single-abutment tooth model. The master model was scanned with an accurate engineering scanner to obtain a true value (n=1) and with 2 intraoral scanners (CEREC Bluecam and CEREC Omnicam; n=6 each). The cast scanner scanned the master model and duplicated the dental stone cast from the master model (n=6). The trueness and precision of the data were measured using a 3-dimensional analysis program. The Kruskal-Wallis test was used to compare the different sets of scanning data, followed by a post hoc Mann-Whitney U test with a significance level modified by Bonferroni correction (α/6=.0083). The type 1 error level (α) was set at .05. The trueness value (root mean square: mean ±standard deviation) was 17.5 ±1.8 μm for the Bluecam, 13.8 ±1.4 μm for the Omnicam, 17.4 ±1.7 μm for cast scanner 1, and 12.3 ±0.1 μm for cast scanner 2. The differences between the Bluecam and the cast scanner 1 and between the Omnicam and the cast scanner 2 were not statistically significant (P>.0083), but a statistically significant difference was found between all the other pairs (P<.0083). The precision of the scanners was 12.7 ±2.6 μm for the Bluecam, 12.5 ±3.7 μm for the Omnicam, 9.2 ±1.2 μm for cast scanner 1, and 6.9 ±2.6 μm for cast scanner 2. The differences between Bluecam and Omnicam and between Omnicam and cast scanner 1 were not statistically significant (P>.0083), but there was a statistically significant difference between all the other pairs (P<.0083). An Omnicam in video image impression had better trueness than a cast scanner but with a similar level of precision. 
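    The study's statistical recipe (an omnibus Kruskal-Wallis test followed by pairwise Mann-Whitney U tests at a Bonferroni-corrected α of .05/6 = .0083) can be sketched as follows. The trueness values are random draws roughly matching the reported means and SDs, not the measured data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)

    # Simulated trueness values (um, RMS), n = 6 per scanner, drawn to roughly
    # match the reported means and SDs; not the study's measurements.
    bluecam = rng.normal(17.5, 1.8, size=6)
    omnicam = rng.normal(13.8, 1.4, size=6)
    cast1 = rng.normal(17.4, 1.7, size=6)
    cast2 = rng.normal(12.3, 0.1, size=6)

    # Omnibus nonparametric test across the four scanners
    h, p_kw = stats.kruskal(bluecam, omnicam, cast1, cast2)
    print(f"Kruskal-Wallis p = {p_kw:.4f}")

    # Post hoc pairwise comparison at the Bonferroni-corrected level
    alpha = 0.05 / 6                                  # = .0083, as in the study
    _, p_pair = stats.mannwhitneyu(omnicam, cast2, alternative="two-sided")
    print(f"Omnicam vs cast scanner 2: p = {p_pair:.4f}")
    ```

    Nonparametric tests are the natural choice here because each group has only six measurements, too few to rely on normality assumptions.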

  13. Evaluation of the 29-km Eta Model. Part 1: Objective Verification at Three Selected Stations

    NASA Technical Reports Server (NTRS)

    Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)

    1998-01-01

    This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall, however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates and further emphasize the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance, but, when used alone, it is not enough to quantify the overall value that model guidance may add to the forecast process. 
Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are discussed in the companion paper by Manobianco and Nutter. Overall verification results presented here and in Part 2 should establish a reasonable benchmark from which model users and developers may pursue ongoing eta model verification strategies in the future.

  14. Statistical analyses to support guidelines for marine avian sampling. Final report

    USGS Publications Warehouse

    Kinlan, Brian P.; Zipkin, Elise; O'Connell, Allan F.; Caldow, Chris

    2012-01-01

    Interest in development of offshore renewable energy facilities has led to a need for high-quality, statistically robust information on marine wildlife distributions. A practical approach is described to estimate the amount of sampling effort required to have sufficient statistical power to identify species-specific “hotspots” and “coldspots” of marine bird abundance and occurrence in an offshore environment divided into discrete spatial units (e.g., lease blocks), where “hotspots” and “coldspots” are defined relative to a reference (e.g., regional) mean abundance and/or occurrence probability for each species of interest. For example, a location with average abundance or occurrence that is three times larger than the mean (3x effect size) could be defined as a “hotspot,” and a location that is three times smaller than the mean (1/3x effect size) as a “coldspot.” The choice of the effect size used to define hot and coldspots will generally depend on a combination of ecological and regulatory considerations. A method is also developed for testing the statistical significance of possible hotspots and coldspots. Both methods are illustrated with historical seabird survey data from the USGS Avian Compendium Database. Our approach consists of five main components: 1. A review of the primary scientific literature on statistical modeling of animal group size and avian count data to develop a candidate set of statistical distributions that have been used or may be useful to model seabird counts. 2. Statistical power curves for one-sample, one-tailed Monte Carlo significance tests of differences of observed small-sample means from a specified reference distribution. These curves show the power to detect "hotspots" or "coldspots" of occurrence and abundance at a range of effect sizes, given assumptions which we discuss. 3. 
A model selection procedure, based on maximum likelihood fits of models in the candidate set, to determine an appropriate statistical distribution to describe counts of a given species in a particular region and season. 4. Using a large database of historical at-sea seabird survey data, we applied this technique to identify appropriate statistical distributions for modeling a variety of species, allowing the distribution to vary by season. For each species and season, we used the selected distribution to calculate and map retrospective statistical power to detect hotspots and coldspots, and map p-values from Monte Carlo significance tests of hotspots and coldspots, in discrete lease blocks designated by the U.S. Department of Interior, Bureau of Ocean Energy Management (BOEM). 5. Because our definition of hotspots and coldspots does not explicitly include variability over time, we examine the relationship between the temporal scale of sampling and the proportion of variance captured in time series of key environmental correlates of marine bird abundance, as well as available marine bird abundance time series, and use these analyses to develop recommendations for the temporal distribution of sampling to adequately represent both short-term and long-term variability. We conclude by presenting a schematic “decision tree” showing how this power analysis approach would fit in a general framework for avian survey design, and discuss implications of model assumptions and results. We discuss avenues for future development of this work, and recommendations for practical implementation in the context of siting and wildlife assessment for offshore renewable energy development projects.
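
Component 2 above (one-sample, one-tailed Monte Carlo significance tests of observed small-sample means against a reference distribution) can be sketched generically. The exponential reference distribution and sample size below are illustrative assumptions, not choices from the report:

```python
import random

def mc_hotspot_pvalue(observed_mean, draw, n, n_sims=10000, seed=42):
    """One-tailed Monte Carlo p-value: the fraction of simulated small-sample
    means (n draws from the reference distribution) that reach or exceed the
    observed mean. Small p suggests a genuine 'hotspot'."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_sims):
        sim_mean = sum(draw(rng) for _ in range(n)) / n
        if sim_mean >= observed_mean:
            exceed += 1
    return (exceed + 1) / (n_sims + 1)  # add-one correction keeps p > 0

# Illustrative: a candidate lease block whose n=5 survey counts average 3x a
# reference mean of 1.0, under an assumed exponential count model
p = mc_hotspot_pvalue(3.0, lambda rng: rng.expovariate(1.0), n=5)
```

Repeating this calculation over a grid of effect sizes and sample sizes yields the power curves the report describes; a "coldspot" test reverses the tail direction.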

  15. A Probabilistic Model of Local Sequence Alignment That Simplifies Statistical Significance Estimation

    PubMed Central

    Eddy, Sean R.

    2008-01-01

    Sequence database searches require accurate estimation of the statistical significance of scores. Optimal local sequence alignment scores follow Gumbel distributions, but determining an important parameter of the distribution (λ) requires time-consuming computational simulation. Moreover, optimal alignment scores are less powerful than probabilistic scores that integrate over alignment uncertainty (“Forward” scores), but the expected distribution of Forward scores remains unknown. Here, I conjecture that both expected score distributions have simple, predictable forms when full probabilistic modeling methods are used. For a probabilistic model of local sequence alignment, optimal alignment bit scores (“Viterbi” scores) are Gumbel-distributed with constant λ = log 2, and the high scoring tail of Forward scores is exponential with the same constant λ. Simulation studies support these conjectures over a wide range of profile/sequence comparisons, using 9,318 profile-hidden Markov models from the Pfam database. This enables efficient and accurate determination of expectation values (E-values) for both Viterbi and Forward scores for probabilistic local alignments. PMID:18516236
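
Under the paper's conjecture, computing p-values and E-values from bit scores becomes simple because λ is fixed at log 2 and only the location parameter μ must be determined. A minimal sketch (the μ and target-count values used in the test are hypothetical inputs, not values from the paper):

```python
import math

LAMBDA = math.log(2)  # conjectured Gumbel slope for probabilistic bit scores

def gumbel_pvalue(score, mu, lam=LAMBDA):
    """P(S >= score) under a Gumbel distribution with location mu and scale 1/lam."""
    return 1.0 - math.exp(-math.exp(-lam * (score - mu)))

def evalue(score, mu, n_targets, lam=LAMBDA):
    """Expected number of hits with score >= `score` over n_targets comparisons."""
    return n_targets * gumbel_pvalue(score, mu, lam)
```

With λ known a priori, fitting reduces to estimating μ alone, which is the simplification that makes E-value determination efficient and accurate.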

  16. The impact of mother's literacy on child dental caries: Individual data or aggregate data analysis?

    PubMed

    Haghdoost, Ali-Akbar; Hessari, Hossein; Baneshi, Mohammad Reza; Rad, Maryam; Shahravan, Arash

    2017-01-01

    To evaluate the impact of mother's literacy on child dental caries based on a national oral health survey in Iran, and to investigate the possibility of ecological fallacy in aggregate data analysis. Existing data were from the second national oral health survey, carried out in 2004, which included 8725 six-year-old participants. The association of mother's literacy with caries occurrence (DMF (Decayed, Missing, Filled) total score >0) of her child was assessed using individual data with a logistic regression model. Then the association between the percentage of literate mothers and the percentage of decayed teeth in each of the 30 provinces of Iran was assessed using aggregated data retrieved from the second national oral health survey of Iran and, alternatively, from the census of the "Statistical Center of Iran", using a linear regression model. The significance level was set at 0.05 for all analyses. Individual data analysis showed a statistically significant association between mother's literacy and decayed teeth of children (P = 0.02, odds ratio = 0.83). There was no statistically significant association between mother's literacy and child dental caries in the aggregate data analysis of the oral health survey (P = 0.79, B = 0.03) or of the census of the "Statistical Center of Iran" (P = 0.60, B = 0.14). Maternal literacy has a preventive effect on the occurrence of dental caries in children. Given the high percentage of illiterate parents in Iran, it is logical to consider methods of oral health education that do not require reading or writing. Aggregate data analysis and individual data analysis yielded completely different results in this study.

  17. Statistical inference for template aging

    NASA Astrophysics Data System (ADS)

    Schuckers, Michael E.

    2006-04-01

    A change in classification error rates for a biometric device over time is often referred to as template aging. Here we offer two methods for determining whether the effect of time is statistically significant. The first of these is the use of a generalized linear model to determine whether these error rates change linearly over time. This approach generalizes previous work assessing the impact of covariates using generalized linear models. The second approach uses likelihood ratio test methodology. The focus here is on statistical methods for estimation, not on the underlying cause of the change in error rates over time. These methodologies are applied to data from the National Institute of Standards and Technology Biometric Score Set Release 1. The results of these applications are discussed.

  18. Cancer Survival Estimates Due to Non-Uniform Loss to Follow-Up and Non-Proportional Hazards

    PubMed

    K M, Jagathnath Krishna; Mathew, Aleyamma; Sara George, Preethi

    2017-06-25

    Background: Cancer survival depends on loss to follow-up (LFU) and non-proportional hazards (non-PH). If LFU is high, survival will be over-estimated. If hazard is non-PH, rank tests will provide biased inference and the Cox model will provide a biased hazard ratio. We assessed the bias due to LFU and a non-PH factor in cancer survival and provided alternate methods for unbiased inference and hazard ratios. Materials and Methods: Kaplan-Meier survival curves were plotted using a realistic breast cancer (BC) data-set with >40% 5-year LFU and compared with another BC data-set with <15% 5-year LFU to assess the bias in survival due to high LFU. Age at diagnosis in the latter data set was used to illustrate the bias due to a non-PH factor. The log-rank test was employed to assess the bias in the p-value, and the Cox model was used to assess the bias in the hazard ratio for the non-PH factor. The Schoenfeld statistic was used to test the non-PH of age. For the non-PH factor, we employed the Renyi statistic for inference and a time-dependent Cox model for the hazard ratio. Results: Five-year BC survival was 69% (SE: 1.1%) vs. 90% (SE: 0.7%) for data with low vs. high LFU, respectively. Age (<45, 46-54 & >54 years) was a non-PH factor (p-value: 0.036). Survival by age was significant using the log-rank test (p-value: 0.026), but not using the Renyi statistic (p=0.067). The hazard ratio (HR) for age using the Cox model was 1.012 (95%CI: 1.004-1.019), while the same using the time-dependent Cox model was in the opposite direction (HR: 0.997; 95%CI: 0.997-0.998). Conclusion: Over-estimated survival was observed for cancer with high LFU. The log-rank statistic and Cox model provided biased results for the non-PH factor. For data with non-PH factors, the Renyi statistic and time-dependent Cox model can be used as alternate methods to obtain unbiased inference and estimates.
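
As background to the Kaplan-Meier curves discussed above, a minimal product-limit estimator shows how censoring (including loss to follow-up) enters the calculation. This is a generic sketch with made-up data, not the study's analysis:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimates.
    times: follow-up times; events: 1 = death observed, 0 = censored (e.g. LFU).
    Returns (event_time, S(t)) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        if deaths:
            s *= 1.0 - deaths / n_at_risk
            curve.append((t, s))
        at_t = sum(1 for tt, _ in data if tt == t)
        n_at_risk -= at_t   # everyone with time t leaves the risk set
        i += at_t
    return curve

# Censoring removes subjects from the risk set without registering deaths,
# which is the mechanism by which heavy LFU inflates the survival estimate.
curve = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```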

  19. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    PubMed

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.

  1. Knowledge level of effect size statistics, confidence intervals and meta-analysis in Spanish academic psychologists.

    PubMed

    Badenes-Ribera, Laura; Frias-Navarro, Dolores; Pascual-Soler, Marcos; Monterde-I-Bort, Héctor

    2016-11-01

    The statistical reform movement and the American Psychological Association (APA) defend the use of estimators of the effect size and its confidence intervals, as well as the interpretation of the clinical significance of the findings. A survey was conducted in which academic psychologists were asked about their behavior in designing and carrying out their studies. The sample was composed of 472 participants (45.8% men). The mean number of years as a university professor was 13.56 (SD = 9.27). The use of effect-size estimators is becoming generalized, as is the consideration of meta-analytic studies. However, several inadequate practices still persist. A traditional model of methodological behavior based on statistical significance tests is maintained, based on the predominance of Cohen’s d and the unadjusted R²/η², which are not robust to outliers, departures from normality, or violations of statistical assumptions, and on the under-reporting of confidence intervals of effect-size statistics. The paper concludes with recommendations for improving statistical practice.
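
For reference, the predominant estimator the survey identifies, Cohen's d, is the mean difference standardized by the pooled standard deviation. A minimal sketch with made-up samples:

```python
import math

def cohens_d(x, y):
    """Cohen's d: difference of sample means divided by the pooled SD."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled_sd = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / pooled_sd

d = cohens_d([2, 4, 6], [1, 3, 5])  # hypothetical group scores
```

As the abstract notes, d is not robust to outliers: both the mean difference and the pooled SD are moment-based, so a single extreme value can distort the estimate.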

  2. Statistical analysis of the factors that influenced the mechanical properties improvement of cassava starch films

    NASA Astrophysics Data System (ADS)

    Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle

    2017-08-01

    In order to obtain cassava starch films with improved mechanical properties relative to synthetic polymers in packaging production, a complete 2³ factorial design was carried out to investigate which factors significantly influence the tensile strength of the biofilm. The factors investigated were the cassava starch, glycerol, and modified clay contents. Modified bentonite clay was used as the filler material of the biofilm, and glycerol was used as the plasticizer for the cassava starch. The factorial analysis suggested a regression model capable of predicting the optimal mechanical property of the cassava starch film from the maximization of the tensile strength. The reliability of the regression model was tested by the correlation established with the experimental data through statistical analysis, namely the Pareto chart. The modified clay was the factor of greatest statistical significance on the observed response variable, being the factor that contributed most to the improvement of the mechanical property of the starch film. The factorial experiments showed that the interaction of glycerol with both modified clay and cassava starch was significant for the reduction of biofilm ductility. Modified clay and cassava starch contributed to the maximization of biofilm ductility, while glycerol contributed to its minimization.
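
In a full 2³ factorial design like the one described, each main effect is the contrast between the mean response at the factor's high (+1) and low (−1) coded levels. A generic sketch with hypothetical tensile-strength responses (not the study's data):

```python
from itertools import product

def main_effect(runs, y, factor):
    """Main effect of one coded (-1/+1) factor in a full factorial design:
    mean response at the high level minus mean response at the low level."""
    hi = [yi for run, yi in zip(runs, y) if run[factor] == +1]
    lo = [yi for run, yi in zip(runs, y) if run[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

# All 8 runs of a 2^3 design; factors: (starch, glycerol, modified clay)
runs = list(product([-1, +1], repeat=3))
# Hypothetical tensile-strength responses with clay as the dominant factor
y = [10 + 1 * a - 2 * b + 4 * c for a, b, c in runs]
effects = [main_effect(runs, y, f) for f in range(3)]
```

Ranking the absolute effects (and interaction contrasts computed the same way from products of coded columns) is exactly what a Pareto chart of a factorial design displays.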

  3. Spatial scan statistics for detection of multiple clusters with arbitrary shapes.

    PubMed

    Lin, Pei-Sheng; Kung, Yi-Hung; Clayton, Murray

    2016-12-01

    In applying scan statistics for public health research, it would be valuable to develop a detection method for multiple clusters that accommodates spatial correlation and covariate effects in an integrated model. In this article, we connect the concepts of the likelihood ratio (LR) scan statistic and the quasi-likelihood (QL) scan statistic to provide a series of detection procedures sufficiently flexible to apply to clusters of arbitrary shape. First, we use an independent scan model for detection of clusters and then a variogram tool to examine the existence of spatial correlation and regional variation based on residuals of the independent scan model. When the estimate of regional variation is significantly different from zero, a mixed QL estimating equation is developed to estimate coefficients of geographic clusters and covariates. We use the Benjamini-Hochberg procedure (1995) to find a threshold for p-values to address the multiple testing problem. A quasi-deviance criterion is used to regroup the estimated clusters to find geographic clusters with arbitrary shapes. We conduct simulations to compare the performance of the proposed method with other scan statistics. For illustration, the method is applied to enterovirus data from Taiwan. © 2016, The International Biometric Society.
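
The Benjamini-Hochberg step-up procedure used above to control the multiple testing problem can be sketched directly; this is the generic 1995 procedure, not the authors' implementation, and the p-values shown are illustrative:

```python
def benjamini_hochberg(pvalues, q=0.05):
    """Benjamini-Hochberg step-up procedure: reject the hypotheses with the
    k smallest p-values, where k is the largest rank i with p_(i) <= (i/m)*q."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= rank / m * q:
            k = rank  # keep the largest qualifying rank
    reject = [False] * m
    for rank, idx in enumerate(order, start=1):
        if rank <= k:
            reject[idx] = True
    return reject

# Illustrative per-cluster p-values from scan tests
reject = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.042,
                             0.06, 0.074, 0.205])
```

The largest rejected p-value then serves as the data-dependent threshold for declaring candidate clusters significant.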

  4. Weather Variability, Tides, and Barmah Forest Virus Disease in the Gladstone Region, Australia

    PubMed Central

    Naish, Suchithra; Hu, Wenbiao; Nicholls, Neville; Mackenzie, John S.; McMichael, Anthony J.; Dale, Pat; Tong, Shilu

    2006-01-01

    In this study we examined the impact of weather variability and tides on the transmission of Barmah Forest virus (BFV) disease and developed a weather-based forecasting model for BFV disease in the Gladstone region, Australia. We used seasonal autoregressive integrated moving-average (SARIMA) models to determine the contribution of weather variables to BFV transmission after the time-series data of response and explanatory variables were made stationary through seasonal differencing. We obtained data on the monthly counts of BFV cases, weather variables (e.g., mean minimum and maximum temperature, total rainfall, and mean relative humidity), high and low tides, and the population size in the Gladstone region between January 1992 and December 2001 from the Queensland Department of Health, Australian Bureau of Meteorology, Queensland Department of Transport, and Australian Bureau of Statistics, respectively. The SARIMA model shows that the 5-month moving average of minimum temperature (β = 0.15, p-value < 0.001) was statistically significantly and positively associated with BFV disease, whereas high tide in the current month (β = −1.03, p-value = 0.04) was statistically significantly and inversely associated with it. However, no significant association was found for other variables. These results may be applied to forecast the occurrence of BFV disease and to use public health resources in BFV control and prevention. PMID:16675420
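
The seasonal differencing step mentioned above, which makes the monthly series stationary before SARIMA fitting, is straightforward to state. A generic sketch (period 12 for monthly data, as in the study; the series shown is made up):

```python
def seasonal_difference(series, period=12):
    """Seasonal difference y'_t = y_t - y_{t-period}: removes a repeating
    seasonal component before fitting a SARIMA model."""
    return [series[t] - series[t - period] for t in range(period, len(series))]

# A purely seasonal monthly series differences to all zeros
pattern = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8]
flat = seasonal_difference(pattern * 3)
```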

  5. Ultralow-dose CT of the craniofacial bone for navigated surgery using adaptive statistical iterative reconstruction and model-based iterative reconstruction: 2D and 3D image quality.

    PubMed

    Widmann, Gerlig; Schullian, Peter; Gassner, Eva-Maria; Hoermann, Romed; Bale, Reto; Puelacher, Wolfgang

    2015-03-01

    OBJECTIVE. The purpose of this article is to evaluate 2D and 3D image quality of high-resolution ultralow-dose CT images of the craniofacial bone for navigated surgery using adaptive statistical iterative reconstruction (ASIR) and model-based iterative reconstruction (MBIR) in comparison with standard filtered backprojection (FBP). MATERIALS AND METHODS. A formalin-fixed human cadaver head was scanned using a clinical reference protocol at a CT dose index volume of 30.48 mGy and a series of five ultralow-dose protocols at 3.48, 2.19, 0.82, 0.44, and 0.22 mGy using FBP and ASIR at 50% (ASIR-50), ASIR at 100% (ASIR-100), and MBIR. Blinded 2D axial and 3D volume-rendered images were compared with each other by three readers using top-down scoring. Scores were analyzed per protocol or dose and reconstruction. All images were compared with the FBP reference at 30.48 mGy. A nonparametric Mann-Whitney U test was used. Statistical significance was set at p < 0.05. RESULTS. For 2D images, the FBP reference at 30.48 mGy did not statistically significantly differ from ASIR-100 at 3.48 mGy, ASIR-100 at 2.19 mGy, and MBIR at 0.82 mGy. MBIR at 2.19 and 3.48 mGy scored statistically significantly better than the FBP reference (p = 0.032 and 0.001, respectively). For 3D images, the FBP reference at 30.48 mGy did not statistically significantly differ from all reconstructions at 3.48 mGy; FBP and ASIR-100 at 2.19 mGy; FBP, ASIR-100, and MBIR at 0.82 mGy; MBIR at 0.44 mGy; and MBIR at 0.22 mGy. CONCLUSION. MBIR (2D and 3D) and ASIR-100 (2D) may significantly improve subjective image quality of ultralow-dose images and may allow more than 90% dose reductions.

  6. Adopting adequate leaching requirement for practical response models of basil to salinity

    NASA Astrophysics Data System (ADS)

    Babazadeh, Hossein; Tabrizi, Mahdi Sarai; Darvishi, Hossein Hassanpour

    2016-07-01

    Several mathematical models are used for assessing plant response to salinity of the root zone. The objectives of this study were to quantify the yield salinity threshold value of basil plants to irrigation water salinity and to investigate the possibility of using irrigation water salinity instead of saturated-extract salinity in the available mathematical models for estimating yield. To achieve these objectives, an extensive greenhouse experiment was conducted with 13 irrigation water salinity levels, namely 1.175 dS m-1 (control treatment) and 1.8 to 10 dS m-1. The results indicated that, among these models, the modified discount model (one of the best-known statistically based root water uptake models) produced more accurate results in simulating the basil yield reduction function from irrigation water salinities. Overall, the statistical model of Steppuhn et al., based on the modified discount model, and the math-empirical model of van Genuchten and Hoffman provided the best results. In general, all of the statistical models produced very similar results, and their results were better than those of the math-empirical models. It was also concluded that, when sufficient leaching was present, there was no significant difference between models based on soil saturated-extract salinity and models using irrigation water salinity.

  7. Joint inversion of marine seismic AVA and CSEM data using statistical rock-physics models and Markov random fields: Stochastic inversion of AVA and CSEM data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, J.; Hoversten, G.M.

    2011-09-15

    Joint inversion of seismic AVA and CSEM data requires rock-physics relationships to link seismic attributes to electrical properties. Ideally, we can connect them through reservoir parameters (e.g., porosity and water saturation) by developing physics-based models, such as Gassmann’s equations and Archie’s law, using nearby borehole logs. This could be difficult in the exploration stage because the information available is typically insufficient for choosing suitable rock-physics models and for subsequently obtaining reliable estimates of the associated parameters. The use of improper rock-physics models and inaccurate estimates of model parameters may lead to misleading inversion results. Conversely, it is easy to derive statistical relationships among seismic and electrical attributes and reservoir parameters from distant borehole logs. In this study, we develop a Bayesian model to jointly invert seismic AVA and CSEM data for reservoir parameter estimation using statistical rock-physics models; the spatial dependence of geophysical and reservoir parameters is captured by lithotypes through Markov random fields. We apply the developed model to a synthetic case that simulates a CO2 monitoring application. We derive statistical rock-physics relations from borehole logs at one location and estimate seismic P- and S-wave velocity ratio, acoustic impedance, density, electrical resistivity, lithotypes, porosity, and water saturation at three different locations by conditioning to seismic AVA and CSEM data. Comparison of the inversion results with their corresponding true values shows that the correlation-based statistical rock-physics models provide significant information for improving the joint inversion results.

  8. Evaluating the capabilities of watershed-scale models in estimating sediment yield at field-scale.

    PubMed

    Sommerlot, Andrew R; Nejadhashemi, A Pouyan; Woznicki, Sean A; Giri, Subhasis; Prohaska, Michael D

    2013-09-30

    Many watershed model interfaces have been developed in recent years for predicting field-scale sediment loads. They share the goal of providing data for decisions aimed at improving watershed health and the effectiveness of water quality conservation efforts. The objectives of this study were to: 1) compare three watershed-scale models (Soil and Water Assessment Tool (SWAT), Field_SWAT, and the High Impact Targeting (HIT) model) against a calibrated field-scale model (RUSLE2) in estimating sediment yield from 41 randomly selected agricultural fields within the River Raisin watershed; 2) evaluate the statistical significance of differences among models; 3) assess the watershed models' capabilities in identifying areas of concern at the field level; 4) evaluate the reliability of the watershed-scale models for field-scale analysis. The SWAT model produced the estimates most similar to RUSLE2, providing the closest median and the lowest absolute error in sediment yield predictions, while the HIT model estimates were the worst. Concerning statistically significant differences between models, SWAT was the only model found to be not significantly different from the calibrated RUSLE2 at α = 0.05. Meanwhile, none of the models identified priority areas similar to those of the RUSLE2 model. Overall, SWAT provided the most correct estimates (51%) within the uncertainty bounds of RUSLE2 and is the most reliable among the studied models, while HIT is the least reliable. The results of this study suggest caution should be exercised when using watershed-scale models for field-level decision-making, where field-specific data are of paramount importance. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Statistical-Dynamical Seasonal Forecasts of Central-Southwest Asian Winter Precipitation.

    NASA Astrophysics Data System (ADS)

    Tippett, Michael K.; Goddard, Lisa; Barnston, Anthony G.

    2005-06-01

    Interannual precipitation variability in central-southwest (CSW) Asia has been associated with East Asian jet stream variability and western Pacific tropical convection. However, atmospheric general circulation models (AGCMs) forced by observed sea surface temperature (SST) poorly simulate the region's interannual precipitation variability. The statistical-dynamical approach uses statistical methods to correct systematic deficiencies in the response of AGCMs to SST forcing. Statistical correction methods linking model-simulated Indo-west Pacific precipitation and observed CSW Asia precipitation result in modest, but statistically significant, cross-validated simulation skill in the northeast part of the domain for the period from 1951 to 1998. The statistical-dynamical method is also applied to recent (winter 1998/99 to 2002/03) multimodel, two-tier December-March precipitation forecasts initiated in October. This period includes 4 yr (winter of 1998/99 to 2001/02) of severe drought. Tercile probability forecasts are produced using ensemble-mean forecasts and forecast error estimates. The statistical-dynamical forecasts show enhanced probability of below-normal precipitation for the four drought years and capture the return to normal conditions in part of the region during the winter of 2002/03. "May Kabul be without gold, but not without snow." —Traditional Afghan proverb

  10. Response statistics of rotating shaft with non-linear elastic restoring forces by path integration

    NASA Astrophysics Data System (ADS)

    Gaidai, Oleg; Naess, Arvid; Dimentberg, Michael

    2017-07-01

    Extreme statistics of random vibrations is studied for a Jeffcott rotor under uniaxial white noise excitation. The restoring force is modelled as non-linear elastic; a comparison is made with a linearized restoring force to see the effect of force non-linearity on the response statistics. While analytical solutions and stability conditions are available for the linear model, this is generally not the case for the non-linear system except in some special cases. The statistics of the non-linear case are studied by applying the path integration (PI) method, which is based on the Markov property of the coupled dynamic system. The Jeffcott rotor response statistics can be obtained by solving the Fokker-Planck (FP) equation of the 4D dynamic system. An efficient implementation of the PI algorithm is applied; namely, a fast Fourier transform (FFT) is used to simulate the dynamic system's additive noise. The latter significantly reduces computational time compared to classical PI. Excitation is modelled as Gaussian white noise; however, white noise with any distribution can be implemented with the same PI technique. Multidirectional Markov noise can also be modelled with PI in the same way as unidirectional noise. PI is accelerated by using a Monte Carlo (MC) estimated joint probability density function (PDF) as the initial input. Symmetry of the dynamic system was utilized to afford higher mesh resolution. Both internal (rotating) and external damping are included in the mechanical model of the rotor. The main advantage of using PI rather than MC is that PI offers high accuracy in the tail of the probability distribution. The latter is of critical importance for, e.g., extreme value statistics, system reliability, and first passage probability.

  11. A statistical parts-based appearance model of inter-subject variability.

    PubMed

    Toews, Matthew; Collins, D Louis; Arbel, Tal

    2006-01-01

    In this article, we present a general statistical parts-based model for representing the appearance of an image set, applied to the problem of inter-subject MR brain image matching. In contrast with global image representations such as active appearance models, the parts-based model consists of a collection of localized image parts whose appearance, geometry and occurrence frequency are quantified statistically. The parts-based approach explicitly addresses the case where one-to-one correspondence does not exist between subjects due to anatomical differences, as parts are not expected to occur in all subjects. The model can be learned automatically, discovering structures that appear with statistical regularity in a large set of subject images, and can be robustly fit to new images, all in the presence of significant inter-subject variability. As parts are derived from generic scale-invariant features, the framework can be applied in a wide variety of image contexts, in order to study the commonality of anatomical parts or to group subjects according to the parts they share. Experimentation shows that a parts-based model can be learned from a large set of MR brain images, and used to determine parts that are common within the group of subjects. Preliminary results indicate that the model can be used to automatically identify distinctive features for inter-subject image registration despite large changes in appearance.

  12. A Model for Indexing Medical Documents Combining Statistical and Symbolic Knowledge.

    PubMed Central

    Avillach, Paul; Joubert, Michel; Fieschi, Marius

    2007-01-01

    OBJECTIVES: To develop and evaluate an information processing method based on terminologies, in order to index medical documents in any given documentary context. METHODS: We designed a model using both symbolic general knowledge extracted from the Unified Medical Language System (UMLS) and statistical knowledge extracted from a domain of application. Using statistical knowledge allowed us to contextualize the general knowledge for each particular situation. For each document studied, the extracted terms are ranked to highlight the most significant ones. The model was tested on a set of 17,079 French standardized discharge summaries (SDSs). RESULTS: The most important ICD-10 term of each SDS was ranked 1st or 2nd by the method in nearly 90% of cases. CONCLUSIONS: The use of several terminologies leads to more precise indexing. The improvement in the model's performance achieved by using semantic relationships is encouraging. PMID:18693792

  13. Mathematical and statistical models for determining the crop load in grapevine

    NASA Astrophysics Data System (ADS)

    Alina, Dobrei; Alin, Dobrei; Eleonora, Nistor; Teodor, Cristea; Marius, Boldea; Florin, Sala

    2016-06-01

    Ensuring a balance between vine crop load and vegetative growth is a dynamic process, so it is necessary to develop models describing this relationship. This study analyzed the interrelationship between the crop load and specific growth parameters (viable buds - VB, dead (frost-injured) buds - DB, total shoot growth - TSG, one-year-old wood - MSG) in two grapevine varieties: the Muscat Ottonel cultivar for wine and the Victoria cultivar for fresh grapes. In both varieties the interrelationships between bud number and vegetative growth parameters were described by statistically validated polynomial functions. Using regression analysis it was possible to develop predictive models for one-year-old wood (MSG), an important parameter for the yield and quality of wine grape production, with statistically significant results (R2 = 0.884, p < 0.001, F = 45.957 for the Muscat Ottonel cultivar and R2 = 0.893, p = 0.001, F = 49.886 for the Victoria cultivar).
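    As a rough illustration of the kind of polynomial regression and goodness-of-fit reporting described above (synthetic data; not the study's measurements, and coefficient values are invented):

```python
import numpy as np

# Fit a quadratic crop-load model and report R^2 and an F statistic,
# mirroring the abstract's reporting. Data are synthetic.
rng = np.random.default_rng(0)
vb = np.linspace(10, 40, 20)                    # viable buds (illustrative)
msg = 0.02 * vb**2 - 0.3 * vb + 5 + rng.normal(0, 0.5, vb.size)

coef = np.polyfit(vb, msg, 2)                   # quadratic polynomial fit
pred = np.polyval(coef, vb)

ss_res = np.sum((msg - pred)**2)                # residual sum of squares
ss_tot = np.sum((msg - msg.mean())**2)          # total sum of squares
r2 = 1 - ss_res / ss_tot

k, n = 2, vb.size                               # 2 predictors: vb and vb^2
f_stat = (r2 / k) / ((1 - r2) / (n - k - 1))    # overall regression F
```

    With low noise relative to the signal, R² approaches 1 and the F statistic is large, which is the pattern of values (R² ≈ 0.88-0.89, large F) the abstract reports.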

  14. The effectiveness and cost-effectiveness of intraoperative imaging in high-grade glioma resection; a comparative review of intraoperative ALA, fluorescein, ultrasound and MRI.

    PubMed

    Eljamel, M Sam; Mahboob, Syed Osama

    2016-12-01

    Surgical resection of high-grade gliomas (HGG) is standard therapy because it imparts significant progression-free (PFS) and overall survival (OS). However, HGG tumor margins are indistinguishable from normal brain during surgery. Hence intraoperative technologies such as fluorescence (ALA, fluorescein), intraoperative ultrasound (IoUS) and intraoperative MRI (IoMRI) have been deployed. This study compares the effectiveness and cost-effectiveness of these technologies through critical literature review and meta-analyses, using the MEDLINE/PubMed service. The list of references in each article was double-checked for any missing references. We included all studies that reported the use of ALA, fluorescein (FLCN), IoUS or IoMRI to guide HGG surgery. The meta-analyses were conducted according to statistical heterogeneity between studies: if there was no heterogeneity, a fixed-effects model was used; otherwise, a random-effects model was used. Statistical heterogeneity was explored by χ² and inconsistency (I²) statistics. To assess cost-effectiveness, we calculated the incremental cost per quality-adjusted life-year (QALY). Gross total resection (GTR) after ALA, FLCN, IoUS and IoMRI was 69.1%, 84.4%, 73.4% and 70%, respectively. The differences were not statistically significant. All four techniques led to significant prolongation of PFS and tended to prolong OS; however, none of these technologies led to significant prolongation of OS compared to controls. The cost/QALY was $16,218, $3181, $6049 and $32,954 for ALA, FLCN, IoUS and IoMRI, respectively. ALA, FLCN, IoUS and IoMRI significantly improve GTR and PFS of HGG. Their incremental cost was below the threshold for cost-effectiveness of HGG therapy, denoting that each intraoperative technology was cost-effective on its own. Copyright © 2016 Elsevier B.V. All rights reserved.
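    The heterogeneity-driven choice between fixed- and random-effects pooling described above can be sketched as follows (effect sizes and variances below are made up for illustration, not taken from the reviewed studies):

```python
import numpy as np

# Cochran's Q and the I^2 statistic decide between a fixed-effects
# and a DerSimonian-Laird random-effects pooled estimate.
effects = np.array([0.69, 0.84, 0.73, 0.70])    # per-study effect sizes
variances = np.array([0.010, 0.008, 0.012, 0.015])

w = 1.0 / variances                             # inverse-variance weights
pooled_fixed = np.sum(w * effects) / np.sum(w)

Q = np.sum(w * (effects - pooled_fixed)**2)     # Cochran's Q
df = effects.size - 1
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0  # % heterogeneity

if I2 > 50:                                     # common rule of thumb
    # DerSimonian-Laird between-study variance tau^2
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (variances + tau2)
    pooled = np.sum(w_re * effects) / np.sum(w_re)
else:
    pooled = pooled_fixed
```

    With these illustrative inputs Q is below its degrees of freedom, I² is 0, and the fixed-effects estimate is retained, which is one of the two branches the abstract describes.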

  15. Using Computational Modeling to Assess the Impact of Clinical Decision Support on Cancer Screening within Community Health Centers

    PubMed Central

    Carney, Timothy Jay; Morgan, Geoffrey P.; Jones, Josette; McDaniel, Anna M.; Weaver, Michael; Weiner, Bryan; Haggstrom, David A.

    2014-01-01

    Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health care (CHC) setting. We employed a dual modeling technique using both statistical and computational modeling to evaluate impact. Our statistical model used the Spearman’s Rho test to evaluate the strength of relationship between our proximal outcome measures (CDS utilization) against our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS measured by the rate of organizational learning. We employed the use of previously collected survey data from community health centers Cancer Health Disparities Collaborative (HDCC). Our intent is to demonstrate the added valued gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability. PMID:24953241
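    The statistical half of the dual-modeling approach, a Spearman's rho rank correlation, is simple to sketch from ranks (the data below are illustrative, not the HDCC survey responses):

```python
import numpy as np

# Minimal Spearman's rho: Pearson correlation of the ranks.
# argsort(argsort(x)) yields 0-based ranks for distinct values.
def spearman_rho(x, y):
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return np.sum(rx * ry) / np.sqrt(np.sum(rx**2) * np.sum(ry**2))

cds_use = np.array([1, 3, 2, 5, 4, 7, 6, 9, 8, 10])    # proximal measure
screening = np.array([2, 1, 3, 4, 6, 5, 8, 7, 10, 9])  # distal measure
rho = spearman_rho(cds_use, screening)
```

    This rank-based version ignores ties; production code would average tied ranks, as library implementations do.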

  16. An open-access CMIP5 pattern library for temperature and precipitation: Description and methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lynch, Cary D.; Hartin, Corinne A.; Bond-Lamberty, Benjamin

    Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing for spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least squared regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest in high latitudes (60-90°N/S). Bias and mean errors between modeled and pattern predicted output from the linear regression method were smaller than patterns generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5°C, but choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. As a result, this paper describes our library of least squared regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns.
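    The two pattern-generation techniques named above can be sketched side by side on a toy grid (synthetic local and global temperature series; the variable names and the "true" pattern are illustrative):

```python
import numpy as np

# Pattern scaling sketch: `local` is a (time, cell) array of local
# temperature anomalies and `gmt` the global-mean series.
rng = np.random.default_rng(1)
t = np.arange(50, dtype=float)
gmt = 0.02 * t + rng.normal(0, 0.05, t.size)       # global-mean warming
true_pat = np.array([0.5, 1.0, 2.5])               # e.g. tropics..Arctic
local = gmt[:, None] * true_pat[None, :] + rng.normal(0, 0.1, (t.size, 3))

# Least-squares method: per-cell slope of local change vs. global mean.
X = np.column_stack([np.ones_like(gmt), gmt])
beta = np.linalg.lstsq(X, local, rcond=None)[0]
pattern_lsq = beta[1]                              # degC per degC of GMT

# Delta method: epoch-mean difference scaled by the GMT difference.
early, late = slice(0, 10), slice(40, 50)
pattern_delta = (local[late].mean(0) - local[early].mean(0)) / \
                (gmt[late].mean() - gmt[early].mean())
```

    The regression method uses every time step, while the delta method uses only two epochs, which is one reason the abstract finds smaller bias and mean errors for the regression patterns.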

  17. An open-access CMIP5 pattern library for temperature and precipitation: Description and methodology

    DOE PAGES

    Lynch, Cary D.; Hartin, Corinne A.; Bond-Lamberty, Benjamin; ...

    2017-05-15

    Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing for spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least squared regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest in high latitudes (60-90°N/S). Bias and mean errors between modeled and pattern predicted output from the linear regression method were smaller than patterns generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5°C, but choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. As a result, this paper describes our library of least squared regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns.

  18. Non-rigid image registration using a statistical spline deformation model.

    PubMed

    Loeckx, Dirk; Maes, Frederik; Vandermeulen, Dirk; Suetens, Paul

    2003-07-01

    We propose a statistical spline deformation model (SSDM) as a method to solve non-rigid image registration. Within this model, the deformation is expressed using a statistically trained B-spline deformation mesh. The model is trained by principal component analysis of a training set. This approach reduces the number of degrees of freedom needed for non-rigid registration by retaining only the most significant modes of variation observed in the training set. User-defined transformation components, such as affine modes, are merged with the principal components into a unified framework. Optimization proceeds along the transformation components rather than along the individual spline coefficients. The concept of SSDMs is applied to the temporal registration of thorax CR-images using pattern intensity as the registration measure. Our results show that, using 30 training pairs, a 33% reduction in the number of degrees of freedom is possible without deterioration of the result. The same accuracy as without SSDMs is still achieved after a reduction of up to 66% of the degrees of freedom.

  19. Diurnal fluctuations in brain volume: Statistical analyses of MRI from large populations.

    PubMed

    Nakamura, Kunio; Brown, Robert A; Narayanan, Sridar; Collins, D Louis; Arnold, Douglas L

    2015-09-01

    We investigated fluctuations in brain volume throughout the day using statistical modeling of magnetic resonance imaging (MRI) from large populations. We applied fully automated image analysis software to measure the brain parenchymal fraction (BPF), defined as the ratio of the brain parenchymal volume and intracranial volume, thus accounting for variations in head size. The MRI data came from serial scans of multiple sclerosis (MS) patients in clinical trials (n=755, 3269 scans) and from subjects participating in the Alzheimer's Disease Neuroimaging Initiative (ADNI, n=834, 6114 scans). The percent change in BPF was modeled with a linear mixed effect (LME) model, and the model was applied separately to the MS and ADNI datasets. The LME model for the MS datasets included random subject effects (intercept and slope over time) and fixed effects for the time-of-day, time from the baseline scan, and trial, which accounted for trial-related effects (for example, different inclusion criteria and imaging protocol). The model for ADNI additionally included demographics (baseline age, sex, subject type [normal, mild cognitive impairment, or Alzheimer's disease], and the interaction between subject type and time from baseline). There was a statistically significant effect of time-of-day on the BPF change in the MS clinical trial datasets (-0.180 per day, that is, 0.180% of intracranial volume, p=0.019) as well as the ADNI dataset (-0.438 per day, that is, 0.438% of intracranial volume, p<0.0001), showing that the brain volume is greater in the morning. Linearly correcting the BPF values for the time-of-day reduced the sample size required to detect a 25% treatment effect (80% power and 0.05 significance level) on change in brain volume from 2 time-points over a period of 1 year by 2.6%. Our results have significant implications for future brain volumetric studies, suggesting that there is a potential acquisition-time bias that should be randomized or statistically controlled to account for diurnal brain volume fluctuations. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. System Study: Residual Heat Removal 1998-2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-12-01

    This report presents an unreliability evaluation of the residual heat removal (RHR) system in two modes of operation (low-pressure injection in response to a large loss-of-coolant accident, and post-trip shutdown cooling) at 104 U.S. commercial nuclear power plants. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10-year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant increasing trends were identified in the RHR results. A highly statistically significant decreasing trend was observed for the RHR injection mode start-only unreliability. Statistically significant decreasing trends were observed for RHR shutdown cooling mode start-only unreliability and RHR shutdown cooling mode 24-hour unreliability.

  1. Effect of water extract of Turkish propolis on tuberculosis infection in guinea-pigs.

    PubMed

    Yildirim, Zeki; Hacievliyagil, Süleyman; Kutlu, Nurettin Onur; Aydin, Nasuhi Engin; Kurkcuoglu, Mine; Iraz, Mustafa; Durmaz, Riza

    2004-03-01

    A Mycobacterium tuberculosis (H37Rv)-infected guinea-pig model was used to investigate the effect of a water extract of propolis (WEP). After subcutaneous inoculation with tubercle bacilli, each animal received oral WEP (n=9), isoniazid (n=5) or saline (n=6) as placebo, and all were sacrificed 30 days later. Necrosis was less prominent in the WEP-treated group, but the difference was not statistically significant (P>0.05). Granuloma formation in the same group was more prominent than in the placebo and isoniazid groups; however, this finding also failed to reach statistical significance by the Kruskal-Wallis test (P>0.05). These findings suggest that Turkish WEP may have a limited effect on the development of tuberculosis infection in this guinea-pig model.

  2. A short-term ensemble wind speed forecasting system for wind power applications

    NASA Astrophysics Data System (ADS)

    Baidya Roy, S.; Traiteur, J. J.; Callicutt, D.; Smith, M.

    2011-12-01

    This study develops an adaptive, blended forecasting system to provide accurate wind speed forecasts 1 hour ahead of time for wind power applications. The system consists of an ensemble of 21 forecasts with different configurations of the Weather Research and Forecasting Single Column Model (WRFSCM) and a persistence model. The ensemble is calibrated against observations for a 2 month period (June-July, 2008) at a potential wind farm site in Illinois using the Bayesian Model Averaging (BMA) technique. The forecasting system is evaluated against observations for August 2008 at the same site. The calibrated ensemble forecasts significantly outperform the forecasts from the uncalibrated ensemble while significantly reducing forecast uncertainty under all environmental stability conditions. The system also generates significantly better forecasts than persistence, autoregressive (AR) and autoregressive moving average (ARMA) models during the morning transition and the diurnal convective regimes. This forecasting system is computationally more efficient than traditional numerical weather prediction models and can generate a calibrated forecast, including model runs and calibration, in approximately 1 minute. Currently, hour-ahead wind speed forecasts are almost exclusively produced using statistical models. However, numerical models have several distinct advantages over statistical models including the potential to provide turbulence forecasts. Hence, there is an urgent need to explore the role of numerical models in short-term wind speed forecasting. This work is a step in that direction and is likely to trigger a debate within the wind speed forecasting community.

  3. A Statistical Skull Geometry Model for Children 0-3 Years Old

    PubMed Central

    Li, Zhigang; Park, Byoung-Keon; Liu, Weiguo; Zhang, Jinhuan; Reed, Matthew P.; Rupp, Jonathan D.; Hoff, Carrie N.; Hu, Jingwen

    2015-01-01

    Head injury is the leading cause of fatality and long-term disability for children. Pediatric heads change rapidly in both size and shape during growth, especially for children under 3 years old (YO). To accurately assess the head injury risks for children, it is necessary to understand the geometry of the pediatric head and how morphologic features influence injury causation within the 0–3 YO population. In this study, head CT scans from fifty-six 0–3 YO children were used to develop a statistical model of pediatric skull geometry. Geometric features important for injury prediction, including skull size and shape, skull thickness and suture width, along with their variations among the sample population, were quantified through a series of image and statistical analyses. The size and shape of the pediatric skull change significantly with age and head circumference. The skull thickness and suture width vary with age, head circumference and location, which will have important effects on skull stiffness and injury prediction. The statistical geometry model developed in this study can provide a geometrical basis for future development of child anthropomorphic test devices and pediatric head finite element models. PMID:25992998

  4. Statistics of Optical Coherence Tomography Data From Human Retina

    PubMed Central

    de Juan, Joaquín; Ferrone, Claudia; Giannini, Daniela; Huang, David; Koch, Giorgio; Russo, Valentina; Tan, Ou; Bruni, Carlo

    2010-01-01

    Optical coherence tomography (OCT) has recently become one of the primary methods for noninvasive probing of the human retina. The pseudoimage formed by OCT (the so-called B-scan) varies probabilistically across pixels due to complexities in the measurement technique. Hence, sensitive automatic procedures of diagnosis using OCT may exploit statistical analysis of the spatial distribution of reflectance. In this paper, we perform a statistical study of retinal OCT data. We find that the stretched exponential probability density function can model well the distribution of intensities in OCT pseudoimages. Moreover, we show a small, but significant correlation between neighbor pixels when measuring OCT intensities with pixels of about 5 µm. We then develop a simple joint probability model for the OCT data consistent with known retinal features. This model fits well the stretched exponential distribution of intensities and their spatial correlation. In normal retinas, fit parameters of this model are relatively constant along retinal layers, but varies across layers. However, in retinas with diabetic retinopathy, large spikes of parameter modulation interrupt the constancy within layers, exactly where pathologies are visible. We argue that these results give hope for improvement in statistical pathology-detection methods even when the disease is in its early stages. PMID:20304733
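    A minimal sketch of fitting a stretched-exponential intensity model, here parameterized through its survival function exp(-(x/a)^b) (a Weibull form), by profile maximum likelihood. This is only an illustration on synthetic samples, not the paper's estimation procedure:

```python
import math
import numpy as np

# Synthetic "OCT-like" intensities drawn with known scale a and
# stretching exponent b; both values are illustrative.
rng = np.random.default_rng(2)
a_true, b_true = 1.0, 0.7
data = a_true * rng.weibull(b_true, 20000)

def profile_loglik(b, x):
    # For fixed shape b the scale MLE is closed-form: a^b = mean(x^b),
    # so the likelihood can be profiled over b alone.
    a_b = np.mean(x**b)
    return (x.size * (math.log(b) - math.log(a_b))
            + (b - 1) * np.sum(np.log(x))
            - np.sum(x**b) / a_b)

bs = np.linspace(0.3, 1.5, 121)                 # grid search over b
b_hat = bs[np.argmax([profile_loglik(b, data) for b in bs])]
a_hat = np.mean(data**b_hat) ** (1.0 / b_hat)
```

    With enough samples the grid search recovers both parameters to within the grid resolution; a production fit would refine b_hat with a one-dimensional optimizer.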

  5. A statistical skull geometry model for children 0-3 years old.

    PubMed

    Li, Zhigang; Park, Byoung-Keon; Liu, Weiguo; Zhang, Jinhuan; Reed, Matthew P; Rupp, Jonathan D; Hoff, Carrie N; Hu, Jingwen

    2015-01-01

    Head injury is the leading cause of fatality and long-term disability for children. Pediatric heads change rapidly in both size and shape during growth, especially for children under 3 years old (YO). To accurately assess the head injury risks for children, it is necessary to understand the geometry of the pediatric head and how morphologic features influence injury causation within the 0-3 YO population. In this study, head CT scans from fifty-six 0-3 YO children were used to develop a statistical model of pediatric skull geometry. Geometric features important for injury prediction, including skull size and shape, skull thickness and suture width, along with their variations among the sample population, were quantified through a series of image and statistical analyses. The size and shape of the pediatric skull change significantly with age and head circumference. The skull thickness and suture width vary with age, head circumference and location, which will have important effects on skull stiffness and injury prediction. The statistical geometry model developed in this study can provide a geometrical basis for future development of child anthropomorphic test devices and pediatric head finite element models.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagos, Samson M.; Feng, Zhe; Burleyson, Casey D.

    Regional cloud-permitting model simulations of cloud populations observed during the 2011 ARM Madden-Julian Oscillation Investigation Experiment/Dynamics of Madden-Julian Experiment (AMIE/DYNAMO) field campaign are evaluated against radar and ship-based measurements. The sensitivity of model-simulated surface rain-rate statistics to parameters and to the parameterization of hydrometeor sizes in five commonly used WRF microphysics schemes is examined. It is shown that at 2 km grid spacing, the model generally overestimates the rain rate from large and deep convective cores. Sensitivity runs that vary parameters affecting the raindrop or ice-particle size distribution (for example, a more aggressive break-up process) generally reduce the bias in rain-rate and boundary-layer temperature statistics, as the smaller particles become more vulnerable to evaporation. Furthermore, a significant improvement in the convective rain-rate statistics is observed when the horizontal grid spacing is reduced to 1 km and 0.5 km, while results worsen at 4 km grid spacing as increased turbulence enhances evaporation. The results suggest that modulation of evaporation processes, through the parameterization of turbulent mixing and break-up of hydrometeors, may provide a potential avenue for correcting cloud statistics and associated boundary-layer temperature biases in regional and global cloud-permitting model simulations.

  7. Formulating Spatially Varying Performance in the Statistical Fusion Framework

    PubMed Central

    Landman, Bennett A.

    2012-01-01

    To date, label fusion methods have primarily relied either on global (e.g. STAPLE, globally weighted vote) or voxelwise (e.g. locally weighted vote) performance models. Optimality of the statistical fusion framework hinges upon the validity of the stochastic model of how a rater errs (i.e., the labeling process model). Hitherto, approaches have tended to focus on the extremes of potential models. Herein, we propose an extension to the STAPLE approach to seamlessly account for spatially varying performance by extending the performance level parameters to account for a smooth, voxelwise performance level field that is unique to each rater. This approach, Spatial STAPLE, provides significant improvements over state-of-the-art label fusion algorithms in both simulated and empirical data sets. PMID:22438513

  8. On Fitting Generalized Linear Mixed-effects Models for Binary Responses using Different Statistical Packages

    PubMed Central

    Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W.; Xia, Yinglin; Tu, Xin M.

    2011-01-01

    Summary The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. PMID:21671252

  9. The Effects of Various High School Scheduling Models on Student Achievement in Michigan

    ERIC Educational Resources Information Center

    Pickell, Russell E.

    2017-01-01

    This study reviews research and data to determine whether student achievement is affected by the high school scheduling model, and whether changes in scheduling models result in statistically significant changes in student achievement, as measured by the ACT Composite, ACT English Language Arts, and ACT Math scores. The high school scheduling…

  10. Resuscitation quality of rotating chest compression providers at one-minute vs. two-minute intervals: A mannequin study.

    PubMed

    Kılıç, D; Göksu, E; Kılıç, T; Buyurgan, C S

    2018-05-01

    The aim of this randomized cross-over study was to compare one-minute and two-minute continuous chest compressions in terms of compression-only CPR quality metrics on a mannequin model in the ED. Thirty-six emergency medicine residents participated in this study. In the 1-minute group, there was no statistically significant difference in the mean compression rate (p=0.83), mean compression depth (p=0.61), good compressions (p=0.31), the percentage of complete release (p=0.07), adequate compression depth (p=0.11) or the percentage of good rate (p=0.51) over the four-minute time period. Only flow time differed significantly among the 1-minute intervals (p<0.001). In the 2-minute group, the mean compression depth (p=0.19), good compressions (p=0.92), the percentage of complete release (p=0.28), adequate compression depth (p=0.96), and the percentage of good rate (p=0.09) did not change significantly over time. In this group, the number of compressions (248±31 vs 253±33, p=0.01), mean compression rates (123±15 vs 126±17, p=0.01) and flow time (p=0.001) differed significantly across the two-minute intervals. Overall, there was no statistically significant difference between the 1-minute and 2-minute groups in the mean number of chest compressions per minute, mean chest compression depth, the percentage of good compressions, complete release, adequate chest compression depth, or the percentage of good rate. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Test anxiety and academic performance in chiropractic students.

    PubMed

    Zhang, Niu; Henderson, Charles N R

    2014-01-01

    Objective: We assessed the level of students' test anxiety and the relationship between test anxiety and academic performance. Methods: We recruited 166 third-quarter students. The Test Anxiety Inventory (TAI) was administered to all participants. Total scores from written examinations and objective structured clinical examinations (OSCEs) were used as response variables. Results: Multiple regression analysis shows that there was a modest but statistically significant negative correlation between TAI scores and written exam scores, but not OSCE scores. Worry and emotionality were the best predictive models for written exam scores. Mean total anxiety and emotionality scores for females were significantly higher than those for males, but not worry scores. Conclusion: Moderate-to-high test anxiety was observed in 85% of the chiropractic students examined. However, total test anxiety, as measured by the TAI score, was a very weak predictive model for written exam performance. Multiple regression analysis demonstrated that replacing total anxiety (TAI) with worry and emotionality (TAI subscales) produces a much more effective predictive model of written exam performance. Sex, age, highest current academic degree, and ethnicity contributed little additional predictive power in either regression model. Moreover, TAI scores were not found to be statistically significant predictors of physical exam skill performance, as measured by OSCEs.

  12. Model variations in predicting incidence of Plasmodium falciparum malaria using 1998-2007 morbidity and meteorological data from south Ethiopia.

    PubMed

    Loha, Eskindir; Lindtjørn, Bernt

    2010-06-16

    Malaria transmission is complex and is believed to be associated with local climate changes. However, simple attempts to extrapolate malaria incidence rates from averaged regional meteorological conditions have proven unsuccessful. Therefore, the objective of this study was to determine if variations in specific meteorological factors are able to consistently predict P. falciparum malaria incidence at different locations in south Ethiopia. Retrospective data from 42 locations were collected, including P. falciparum malaria incidence for the period 1998-2007 and meteorological variables such as monthly rainfall (all locations), temperature (17 locations), and relative humidity (three locations). Thirty-five data sets qualified for the analysis. The Ljung-Box Q statistic was used for model diagnosis, and R squared or stationary R squared was taken as the goodness-of-fit measure. Time series modelling was carried out using Transfer Function (TF) models, and univariate auto-regressive integrated moving average (ARIMA) models when there was no significant meteorological predictor. Of 35 models, five were discarded because of significant Ljung-Box Q statistics. Past P. falciparum malaria incidence alone (17 locations) or coupled with meteorological variables (four locations) was able to predict P. falciparum malaria incidence within statistical significance. All seasonal ARIMA orders were from locations at altitudes above 1742 m. Monthly rainfall, minimum temperature and maximum temperature were able to predict incidence at four, five and two locations, respectively. In contrast, relative humidity was not able to predict P. falciparum malaria incidence. The R squared values for the models ranged from 16% to 97%, with the exception of one model which had a negative value. Models with seasonal ARIMA orders were found to perform better. However, the models for predicting P. falciparum malaria incidence varied from location to location in lagged effects, data transformation forms, and ARIMA and TF orders. This study describes P. falciparum malaria incidence models linked with meteorological data. Variability in the models was principally attributed to regional differences, and no single model was found that fits all locations. Past P. falciparum malaria incidence appeared to be a better predictor than meteorology. Future efforts in malaria modelling may benefit from the inclusion of non-meteorological factors.
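
    The modelling loop this record describes — fit a time-series model, then use the Ljung-Box Q statistic on the residuals to decide whether to keep or discard it — can be illustrated with a minimal AR(1) example. This is a generic numpy sketch on synthetic data, not the study's actual TF/ARIMA models; the lag count and series length are arbitrary assumptions.

```python
import numpy as np

def fit_ar1(y):
    """Least-squares fit of y[t] = c + phi * y[t-1] + e[t]."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    resid = y[1:] - X @ coef
    return coef, resid

def ljung_box_q(resid, lags=10):
    """Ljung-Box Q statistic: n(n+2) * sum_k r_k^2 / (n-k) over the
    first `lags` residual autocorrelations."""
    n = len(resid)
    r = resid - resid.mean()
    denom = np.sum(r ** 2)
    acf = np.array([np.sum(r[k:] * r[:-k]) / denom for k in range(1, lags + 1)])
    return n * (n + 2) * np.sum(acf ** 2 / (n - np.arange(1, lags + 1)))

rng = np.random.default_rng(0)
# Synthetic monthly "incidence" series with AR(1) structure
y = np.empty(120)
y[0] = 0.0
for t in range(1, 120):
    y[t] = 2.0 + 0.6 * y[t - 1] + rng.normal(scale=1.0)

coef, resid = fit_ar1(y)
q = ljung_box_q(resid, lags=10)
# A Q below the chi-square 95th percentile with 10 df (~18.31) means the
# residuals look like white noise, so the model passes the diagnostic
print(f"phi ≈ {coef[1]:.2f}, Q = {q:.1f}")
```

In the study's terms, the five models "discarded because of significant Ljung-Box Q statistics" correspond to the case where Q exceeds the chi-square critical value.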

  13. Intra-articular decorin influences the fibrosis genetic expression profile in a rabbit model of joint contracture.

    PubMed

    Abdel, M P; Morrey, M E; Barlow, J D; Grill, D E; Kolbert, C P; An, K N; Steinmann, S P; Morrey, B F; Sanchez-Sotelo, J

    2014-01-01

    The goal of this study was to determine whether intra-articular administration of the potentially anti-fibrotic agent decorin influences the expression of genes involved in the fibrotic cascade, and ultimately leads to less contracture, in an animal model. A total of 18 rabbits underwent an operation on their right knees to form contractures. Six limbs in group 1 received four intra-articular injections of decorin; six limbs in group 2 received four intra-articular injections of bovine serum albumin (BSA) over eight days; six limbs in group 3 received no injections. The contracted limbs of rabbits in group 1 were biomechanically and genetically compared with the contracted limbs of rabbits in groups 2 and 3, with the use of a calibrated joint measuring device and a custom microarray, respectively. There was no statistical difference in the flexion contracture angles between the limbs that received intra-articular decorin and those that received intra-articular BSA (66° vs 69°; p = 0.41). Likewise, there was no statistical difference between the limbs that received intra-articular decorin and those that received no injection (66° vs 72°; p = 0.27). When compared with BSA, decorin led to a statistically significant increase in the mRNA expression of 12 genes (p < 0.01). In addition, there was a statistically significant change in the mRNA expression of three genes when compared with the no-injection group. In this model, when administered intra-articularly at eight weeks, 2 mg of decorin had no significant effect on joint contractures. However, our genetic analysis revealed a significant alteration in several fibrotic genes. Cite this article: Bone Joint Res 2014;3:82-8.

  14. Statistical analysis of target acquisition sensor modeling experiments

    NASA Astrophysics Data System (ADS)

    Deaver, Dawne M.; Moyer, Steve

    2015-05-01

    The U.S. Army RDECOM CERDEC NVESD Modeling and Simulation Division is charged with the development and advancement of military target acquisition models to estimate expected soldier performance when using all types of imaging sensors. Two elements of sensor modeling are (1) laboratory-based psychophysical experiments used to measure task performance and calibrate the various models and (2) field-based experiments used to verify the model estimates for specific sensors. In both types of experiments, it is common practice to control or measure environmental, sensor, and target physical parameters in order to minimize the uncertainty of the physics-based modeling. Predicting the minimum number of test subjects required to calibrate or validate the model should be, but is not always, done during test planning. The objective of this analysis is to develop guidelines for test planners that recommend the number and types of test samples required to yield a statistically significant result.
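
    Planning "the number of test samples required to yield a statistically significant result" is, in essence, a power analysis. As one illustrative sketch (not NVESD's actual procedure), the standard normal-approximation sample size for detecting a difference between two task-performance proportions, with alpha and power hard-coded to the common 0.05 / 0.80 choices:

```python
import math

def n_per_group(p1, p2):
    """Observations per group to detect p1 vs p2 with a two-sided
    two-proportion z-test; alpha = 0.05, power = 0.80 (z quantiles
    hard-coded for those levels)."""
    z_alpha, z_beta = 1.959964, 0.841621
    pbar = (p1 + p2) / 2
    num = (z_alpha * math.sqrt(2 * pbar * (1 - pbar))
           + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

# e.g. distinguishing a 70% from a 55% target-identification rate
# (hypothetical performance levels, chosen for illustration)
print(n_per_group(0.70, 0.55))
```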

  15. Benefits of statistical molecular design, covariance analysis, and reference models in QSAR: a case study on acetylcholinesterase

    NASA Astrophysics Data System (ADS)

    Andersson, C. David; Hillgren, J. Mikael; Lindgren, Cecilia; Qian, Weixing; Akfur, Christine; Berg, Lotta; Ekström, Fredrik; Linusson, Anna

    2015-03-01

    Scientific disciplines such as medicinal and environmental chemistry, pharmacology, and toxicology deal with questions related to the effects small organic compounds exert on biological targets and the compounds' physicochemical properties responsible for these effects. A common strategy in this endeavor is to establish structure-activity relationships (SARs). The aim of this work was to illustrate the benefits of performing a statistical molecular design (SMD) and proper statistical analysis of the molecules' properties before SAR and quantitative structure-activity relationship (QSAR) analysis. Our SMD followed by synthesis yielded a set of inhibitors of the enzyme acetylcholinesterase (AChE) that had very few inherent dependencies between the substructures in the molecules. If such dependencies exist, they cause severe errors in SAR interpretation and predictions by QSAR models, and leave a set of molecules less suitable for future decision-making. In our study, SAR and QSAR models could show which molecular substructures and physicochemical features were advantageous for AChE inhibition. Finally, the QSAR model was used to predict the inhibition of AChE by an external prediction set of molecules. The accuracy of these predictions was assessed by statistical significance tests and by comparisons to simple but relevant reference models.

  16. Aspartame induces angiogenesis in vitro and in vivo models.

    PubMed

    Yesildal, F; Aydin, F N; Deveci, S; Tekin, S; Aydin, I; Mammadov, R; Fermanli, O; Avcu, F; Acikel, C H; Ozgurtas, T

    2015-03-01

    Angiogenesis is the process of generating new blood vessels from preexisting vessels and is considered essential in many pathological conditions. The purpose of the present study was to evaluate the effect of aspartame on angiogenesis in the in vivo chick chorioallantoic membrane (CAM) and wound-healing models as well as in the in vitro 2,3-bis-2H-tetrazolium-5-carboxanilide (XTT) and tube formation assays. In the CAM assay, aspartame increased angiogenesis in a concentration-dependent manner; compared with the control group, aspartame significantly increased vessel proliferation (p < 0.001). In addition, an in vivo rat model of skin wound healing showed that the aspartame group had better healing than the control group, a difference that was statistically significant at p < 0.05. There was a slight proliferative effect of aspartame on human umbilical vein endothelial cells in the XTT assay in vitro, but it was not statistically significant, and there was no antiangiogenic effect of aspartame in the tube formation assay in vitro. These results provide evidence that aspartame induces angiogenesis in vitro and in vivo, so regular use may have undesirable effects in susceptible cases. © The Author(s) 2015.

  17. Imaging of the midpalatal suture in a porcine model: flat-panel volume computed tomography compared with multislice computed tomography.

    PubMed

    Hahn, Wolfram; Fricke-Zech, Susanne; Fialka-Fricke, Julia; Dullin, Christian; Zapf, Antonia; Gruber, Rudolf; Sennhenn-Kirchner, Sabine; Kubein-Meesenburg, Dietmar; Sadat-Khonsari, Reza

    2009-09-01

    An investigation was conducted to compare the image quality of prototype flat-panel volume computed tomography (fpVCT) and multislice computed tomography (MSCT) of suture structures. Bone samples were taken from the midpalatal suture of 5 young (16 weeks) and 5 old (200 weeks) Sus scrofa domestica and fixed in formalin solution. An fpVCT prototype and an MSCT were used to obtain images of the specimens. The facial reformations were assessed by 4 observers using a 1 (excellent) to 5 (poor) rating scale for the weighted criteria visualization of the suture structure. A linear mixed model was used for statistical analysis. Results with P < .05 were considered to be statistically significant. The visualization of the suture of young specimens was significantly better than that of older animals (P < .001). The visualization of the suture with fpVCT was significantly better than that with MSCT (P < .001). Compared with MSCT, fpVCT produces superior results in the visualization of the midpalatal suture in a Sus scrofa domestica model.

  18. Verification of relationship model between Korean new elderly class's recovery resilience and productive aging.

    PubMed

    Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk

    2015-12-01

    The purpose of this study was to verify a relationship model between the Korean new elderly class's recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0 and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficients associated with model fitness were examined. The standardized path coefficient between recovery resilience and productive aging was β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, the proposed basic model of a direct path from recovery resilience to productive aging was found to fit the data.

  19. Verification of relationship model between Korean new elderly class’s recovery resilience and productive aging

    PubMed Central

    Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk

    2015-01-01

    The purpose of this study was to verify a relationship model between the Korean new elderly class’s recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0 and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficients associated with model fitness were examined. The standardized path coefficient between recovery resilience and productive aging was β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, the proposed basic model of a direct path from recovery resilience to productive aging was found to fit the data. PMID:26730383

  20. Passage relevance models for genomics search.

    PubMed

    Urbain, Jay; Frieder, Ophir; Goharian, Nazli

    2009-03-19

    We present a passage relevance model for integrating syntactic and semantic evidence of biomedical concepts and topics using a probabilistic graphical model. Component models of topics, concepts, terms, and document are represented as potential functions within a Markov Random Field. The probability of a passage being relevant to a biologist's information need is represented as the joint distribution across all potential functions. Relevance model feedback of top ranked passages is used to improve distributional estimates of query concepts and topics in context, and a dimensional indexing strategy is used for efficient aggregation of concept and term statistics. By integrating multiple sources of evidence including dependencies between topics, concepts, and terms, we seek to improve genomics literature passage retrieval precision. Using this model, we are able to demonstrate statistically significant improvements in retrieval precision using a large genomics literature corpus.

  1. Cardiac arrest risk standardization using administrative data compared to registry data.

    PubMed

    Grossestreuer, Anne V; Gaieski, David F; Donnino, Michael W; Nelson, Joshua I M; Mutter, Eric L; Carr, Brendan G; Abella, Benjamin S; Wiebe, Douglas J

    2017-01-01

    Methods for comparing hospitals regarding cardiac arrest (CA) outcomes, vital for improving resuscitation performance, rely on data collected by cardiac arrest registries. However, most CA patients are treated at hospitals that do not participate in such registries. This study aimed to determine whether CA risk standardization modeling based on administrative data could perform as well as that based on registry data. Two risk standardization logistic regression models were developed using 2453 patients treated from 2000-2015 at three hospitals in an academic health system. Registry and administrative data were accessed for all patients. The outcome was death at hospital discharge. The registry model was considered the "gold standard" with which to compare the administrative model, using metrics including comparing areas under the curve, calibration curves, and Bland-Altman plots. The administrative risk standardization model had a c-statistic of 0.891 (95% CI: 0.876-0.905) compared to a registry c-statistic of 0.907 (95% CI: 0.895-0.919). When limited to only non-modifiable factors, the administrative model had a c-statistic of 0.818 (95% CI: 0.799-0.838) compared to a registry c-statistic of 0.810 (95% CI: 0.788-0.831). All models were well-calibrated. There was no significant difference between c-statistics of the models, providing evidence that valid risk standardization can be performed using administrative data. Risk standardization using administrative data performs comparably to standardization using registry data. This methodology represents a new tool that can enable opportunities to compare hospital performance in specific hospital systems or across the entire US in terms of survival after CA.
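
    The comparison in this record hinges on the c-statistic, which for a binary outcome equals the area under the ROC curve: the probability that a randomly chosen positive case is scored higher than a randomly chosen negative case. A minimal numpy sketch of that rank-based computation, using synthetic scores (not the study's data) where a "registry-like" model is less noisy than an "administrative-like" one:

```python
import numpy as np

def c_statistic(y_true, score):
    """AUC / c-statistic via the Mann-Whitney relationship:
    P(score of a random positive > score of a random negative)."""
    pos = score[y_true == 1]
    neg = score[y_true == 0]
    # pairwise comparisons; ties count one half
    wins = (pos[:, None] > neg[None, :]).sum() \
         + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

rng = np.random.default_rng(1)
n = 500
y = rng.integers(0, 2, n)
# hypothetical risk scores: the "registry" score carries less noise
registry_score = y + rng.normal(scale=0.7, size=n)
admin_score = y + rng.normal(scale=0.9, size=n)

c_reg = c_statistic(y, registry_score)
c_adm = c_statistic(y, admin_score)
print(f"registry c = {c_reg:.3f}, administrative c = {c_adm:.3f}")
```

Whether two such c-statistics differ significantly is then tested on their confidence intervals (the study reports overlapping 95% CIs).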

  2. Cardiac arrest risk standardization using administrative data compared to registry data

    PubMed Central

    Gaieski, David F.; Donnino, Michael W.; Nelson, Joshua I. M.; Mutter, Eric L.; Carr, Brendan G.; Abella, Benjamin S.; Wiebe, Douglas J.

    2017-01-01

    Background Methods for comparing hospitals regarding cardiac arrest (CA) outcomes, vital for improving resuscitation performance, rely on data collected by cardiac arrest registries. However, most CA patients are treated at hospitals that do not participate in such registries. This study aimed to determine whether CA risk standardization modeling based on administrative data could perform as well as that based on registry data. Methods and results Two risk standardization logistic regression models were developed using 2453 patients treated from 2000–2015 at three hospitals in an academic health system. Registry and administrative data were accessed for all patients. The outcome was death at hospital discharge. The registry model was considered the “gold standard” with which to compare the administrative model, using metrics including comparing areas under the curve, calibration curves, and Bland-Altman plots. The administrative risk standardization model had a c-statistic of 0.891 (95% CI: 0.876–0.905) compared to a registry c-statistic of 0.907 (95% CI: 0.895–0.919). When limited to only non-modifiable factors, the administrative model had a c-statistic of 0.818 (95% CI: 0.799–0.838) compared to a registry c-statistic of 0.810 (95% CI: 0.788–0.831). All models were well-calibrated. There was no significant difference between c-statistics of the models, providing evidence that valid risk standardization can be performed using administrative data. Conclusions Risk standardization using administrative data performs comparably to standardization using registry data. This methodology represents a new tool that can enable opportunities to compare hospital performance in specific hospital systems or across the entire US in terms of survival after CA. PMID:28783754

  3. A quantitative analysis of factors influencing the professional longevity of high school science teachers in Florida

    NASA Astrophysics Data System (ADS)

    Ridgley, James Alexander, Jr.

    This dissertation is an exploratory quantitative analysis of various independent variables to determine their effect on the professional longevity (years of service) of high school science teachers in the state of Florida for the academic years 2011-2012 to 2013-2014. Data are collected from the Florida Department of Education, National Center for Education Statistics, and the National Assessment of Educational Progress databases. The following research hypotheses are examined: H1 - There are statistically significant differences in Level 1 (teacher variables) that influence the professional longevity of a high school science teacher in Florida. H2 - There are statistically significant differences in Level 2 (school variables) that influence the professional longevity of a high school science teacher in Florida. H3 - There are statistically significant differences in Level 3 (district variables) that influence the professional longevity of a high school science teacher in Florida. H4 - When tested in a hierarchical multiple regression, there are statistically significant differences in Level 1, Level 2, or Level 3 that influence the professional longevity of a high school science teacher in Florida. The professional longevity of a Floridian high school science teacher is the dependent variable. The independent variables are: (Level 1) a teacher's sex, age, ethnicity, earned degree, salary, number of schools taught in, migration count, and various years of service in different areas of education; (Level 2) a school's geographic location, residential population density, average class size, charter status, and SES; and (Level 3) a school district's average SES and average spending per pupil. Statistical analyses using exploratory multiple linear regressions (MLRs) and a hierarchical multiple regression (HMR) are used to test the research hypotheses. 
The final results of the HMR analysis show a teacher's age, salary, earned degree (unknown, associate, and doctorate), and ethnicity (Hispanic and Native Hawaiian/Pacific Islander); a school's charter status; and a school district's average SES are all significant predictors of a Florida high school science teacher's professional longevity. Although statistically significant in the initial exploratory MLR analyses, a teacher's ethnicity (Asian and Black), a school's geographic location (city and rural), and a school's SES are not statistically significant in the final HMR model.
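
    A hierarchical multiple regression of the kind used here enters predictor blocks one level at a time and tracks how much variance each level adds (the change in R²). A minimal numpy sketch with hypothetical Level 1-3 variables and synthetic data; the variable names and effect sizes are illustrative, not the dissertation's:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(2)
n = 400
teacher = rng.normal(size=(n, 2))   # Level 1: e.g. age, salary (hypothetical)
school = rng.normal(size=(n, 1))    # Level 2: e.g. school SES
district = rng.normal(size=(n, 1))  # Level 3: e.g. district SES
y = (1.5 * teacher[:, 0] + 0.8 * school[:, 0]
     + 0.3 * district[:, 0] + rng.normal(size=n))

history = []
X = None
for name, block in zip(["Level 1", "Level 2", "Level 3"],
                       [teacher, school, district]):
    X = block if X is None else np.column_stack([X, block])
    r2 = r_squared(X, y)
    delta = r2 - (history[-1] if history else 0.0)
    history.append(r2)
    print(f"{name}: R^2 = {r2:.3f}, delta R^2 = {delta:.3f}")
```

The delta-R² at each step is what tells the analyst whether a level of predictors contributes beyond the levels already entered.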

  4. The effects and interactions of student, teacher, and setting variables on reading outcomes for kindergartners receiving supplemental reading intervention.

    PubMed

    Hagan-Burke, Shanna; Coyne, Michael D; Kwok, Oi-Man; Simmons, Deborah C; Kim, Minjung; Simmons, Leslie E; Skidmore, Susan T; Hernandez, Caitlin L; McSparran Ruby, Maureen

    2013-01-01

    This exploratory study examined the influences of student, teacher, and setting characteristics on kindergarteners' early reading outcomes and investigated whether those relations were moderated by type of intervention. Participants included 206 kindergarteners identified as at risk for reading difficulties and randomly assigned to one of two supplemental interventions: (a) an experimental explicit, systematic, code-based program or (b) their schools' typical kindergarten reading intervention. Results from separate multilevel structural equation models indicated that among student variables, entry-level alphabet knowledge was positively associated with phonemic and decoding outcomes in both conditions. Entry-level rapid automatized naming also positively influenced decoding outcomes in both conditions. However, its effect on phonemic outcomes was statistically significant only among children in the typical practice comparison condition. Regarding teacher variables, the quality of instruction was associated with significantly higher decoding outcomes in the typical reading intervention condition but had no statistically significant influence on phonemic outcomes in either condition. Among setting variables, instruction in smaller group sizes was associated with better phonemic outcomes in the comparison condition but had no statistically significant influence on outcomes of children in the intervention group. Mode of delivery (i.e., pullout vs. in class) had no statistically significant influence on either outcome variable.

  5. Climate change projections for winter precipitation over Tropical America using statistical downscaling

    NASA Astrophysics Data System (ADS)

    Palomino-Lemus, Reiner; Córdoba-Machado, Samir; Quishpe-Vásquez, César; García-Valdecasas-Ojeda, Matilde; Raquel Gámiz-Fortis, Sonia; Castro-Díez, Yolanda; Jesús Esteban-Parra, María

    2017-04-01

    In this study the Principal Component Regression (PCR) method has been used as a statistical downscaling technique for simulating boreal winter precipitation in Tropical America during the period 1950-2010, and then for generating climate change projections for the 2071-2100 period. The study uses the Global Precipitation Climatology Centre (GPCC, version 6) data set over the Tropical America region [30°N-30°S, 120°W-30°W] as the predictand variable in the downscaling model. The mean monthly sea level pressure (SLP) from the National Center for Environmental Prediction - National Center for Atmospheric Research (NCEP-NCAR) reanalysis project has been used as the predictor variable, covering a more extended area [30°N-30°S, 180°W-30°W]. Also, the SLP outputs from 20 GCMs taken from the Coupled Model Intercomparison Project (CMIP5) have been used. The model data include simulations with historical atmospheric concentrations and future projections for the representative concentration pathways RCP2.6, RCP4.5, and RCP8.5. The ability of the different GCMs to simulate the winter precipitation in the study area for the present climate (1971-2000) was analyzed by calculating the differences between the simulated and observed precipitation values. Additionally, the statistical significance at the 95% confidence level of these differences has been estimated by means of the bilateral rank sum test of Wilcoxon-Mann-Whitney. Finally, to project winter precipitation in the area for the period 2071-2100, the downscaling model, recalibrated for the total period 1950-2010, was applied to the SLP outputs of the GCMs under the RCP2.6, RCP4.5, and RCP8.5 scenarios. The results show that, for the present climate, the statistical downscaling generally reproduces the precipitation field faithfully, while simulations performed directly with non-downscaled GCM outputs strongly distort the precipitation field. For the future climate, the projections under the RCP4.5 and RCP8.5 scenarios show large areas with significant changes. For the RCP2.6 scenario, the projected results present a predominance of very moderate decreases in rainfall, although significant in some models. Keywords: climate change projections, precipitation, Tropical America, statistical downscaling. Acknowledgements: This work has been financed by the projects P11-RNM-7941 (Junta de Andalucía-Spain) and CGL2013-48539-R (MINECO-Spain, FEDER).
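
    Principal Component Regression for downscaling, as described here, compresses the large predictor field (SLP) into a few leading EOFs/principal components, regresses the predictand (precipitation) on those PCs, and then projects new SLP fields (e.g. GCM outputs) onto the same EOFs to predict precipitation. A minimal numpy sketch on synthetic fields; grid sizes, series length, and the number of retained PCs are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical setup: 60 winters of SLP anomalies at 200 grid points
# (predictor field) and precipitation at 50 grid points (predictand)
n_time, n_slp, n_pre = 60, 200, 50
slp = rng.normal(size=(n_time, n_slp))
true_map = 0.1 * rng.normal(size=(n_slp, n_pre))
precip = slp @ true_map + rng.normal(scale=0.5, size=(n_time, n_pre))

# 1) EOF/PCA decomposition of the predictor field
slp_anom = slp - slp.mean(axis=0)
U, s, Vt = np.linalg.svd(slp_anom, full_matrices=False)
k = 10                      # number of retained principal components
pcs = U[:, :k] * s[:k]      # time series of the leading PCs

# 2) Regress precipitation on the retained PCs (the PCR step)
design = np.column_stack([np.ones(n_time), pcs])
G = np.linalg.lstsq(design, precip, rcond=None)[0]

# 3) Project new (e.g. GCM-derived) SLP onto the EOFs and predict
new_slp = rng.normal(size=(5, n_slp)) - slp.mean(axis=0)
new_pcs = new_slp @ Vt[:k].T
pred = np.column_stack([np.ones(5), new_pcs]) @ G
print("downscaled field shape:", pred.shape)
```

Truncating to k PCs is what regularizes the regression; with 200 gridded predictors and only 60 winters, a full multiple regression would be badly overdetermined.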

  6. SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit

    PubMed Central

    Chu, Annie; Cui, Jenny; Dinov, Ivo D.

    2011-01-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include models commonly used in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons: the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), with the hope of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model, and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models. PMID:21546994

  7. A Methodology for the Parametric Reconstruction of Non-Steady and Noisy Meteorological Time Series

    NASA Astrophysics Data System (ADS)

    Rovira, F.; Palau, J. L.; Millán, M.

    2009-09-01

    Climatic and meteorological time series often show some persistence (in time) in the variability of certain features. One could regard annual, seasonal and diurnal time variability as trivial persistence in the variability of some meteorological magnitudes (e.g., global radiation, air temperature above the surface, etc.). In these cases, the traditional Fourier transform into frequency space will show the principal harmonics as the components with the largest amplitude. Nevertheless, meteorological measurements often show other, non-steady (in time) variability. Some fluctuations in measurements (at different time scales) are driven by processes that prevail on some days (or months) of the year but disappear on others. By decomposing a time series into time-frequency space through the continuous wavelet transformation, one is able to determine both the dominant modes of variability and how those modes vary in time. This study is based on a numerical methodology to analyse non-steady principal harmonics in noisy meteorological time series. This methodology combines the continuous wavelet transform with the development of a parametric model that includes the time evolution of the principal and most statistically significant harmonics of the original time series. The parameterisation scheme proposed in this study consists of reproducing the original time series by means of a statistically significant finite sum of sinusoidal signals (waves), each defined by the three usual parameters: amplitude, frequency and phase. To ensure the statistical significance of the parametric reconstruction of the original signal, we propose a standard Student's t analysis of the confidence level of the amplitude in the parametric spectrum for the different wave components. Once we have established the level of significance of the different waves composing the parametric model, we can obtain the statistically significant principal harmonics (in time) of the original time series by using the Fourier transform of the modelled signal. Acknowledgements: The CEAM Foundation is supported by the Generalitat Valenciana and BANCAIXA (València, Spain). This study has been partially funded by the European Commission (FP VI, Integrated Project CIRCE - No. 036961) and by the Ministerio de Ciencia e Innovación, research projects "TRANSREG" (CGL2007-65359/CLI) and "GRACCIE" (CSD2007-00067, Program CONSOLIDER-INGENIO 2010).
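
    Estimating one wave of the parametric model — a sinusoid of known frequency with unknown amplitude and phase — reduces to linear least squares, since A·sin(ωt + φ) = A cos φ·sin(ωt) + A sin φ·cos(ωt). A minimal numpy sketch on synthetic daily data (the frequency, amplitude, and noise level are illustrative, not the study's):

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(730)  # two years of daily samples
# Synthetic series: one annual harmonic plus noise
true_amp, true_freq, true_phase = 5.0, 1 / 365.0, 0.8
y = (true_amp * np.sin(2 * np.pi * true_freq * t + true_phase)
     + rng.normal(scale=1.0, size=t.size))

# Fit a*sin(wt) + b*cos(wt): linear in (a, b), so ordinary least
# squares recovers amplitude hypot(a, b) and phase atan2(b, a)
w = 2 * np.pi * true_freq
X = np.column_stack([np.sin(w * t), np.cos(w * t)])
(a, b), *_ = np.linalg.lstsq(X, y, rcond=None)
amp, phase = np.hypot(a, b), np.arctan2(b, a)
print(f"amplitude ≈ {amp:.2f}, phase ≈ {phase:.2f}")
```

In the full methodology the frequencies themselves come from the wavelet analysis, and a significance test on each fitted amplitude decides which waves stay in the parametric sum.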

  8. Syndromic surveillance models using Web data: the case of scarlet fever in the UK.

    PubMed

    Samaras, Loukas; García-Barriocanal, Elena; Sicilia, Miguel-Angel

    2012-03-01

    Recent research has shown the potential of Web queries as a source for syndromic surveillance, and existing studies show that these queries can be used as a basis for estimating and predicting the development of a syndromic disease, such as influenza, using log-linear (logit) statistical models. Two alternative models are applied to the relationship between cases and Web queries in this paper. We examine the applicability of statistical methods that relate search engine queries to scarlet fever cases in the UK, taking advantage of tools to acquire the appropriate data from Google, and using an alternative statistical method based on gamma distributions. The results show that, using logit models, the Pearson correlation coefficient between Web queries and the data obtained from the official agencies must be over 0.90; otherwise the prediction of the peak and the spread of the distributions shows significant deviations. In this paper, we describe the gamma distribution model and show that we can obtain better results in all cases using gamma transformations, especially in those with smaller correlation coefficients.
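
    One simple way to apply a gamma-shaped model to a seasonal epidemic curve — in the spirit of, though not identical to, the gamma transformations this record describes — is a method-of-moments fit to the normalized weekly counts. A numpy/stdlib sketch with hypothetical case counts:

```python
import numpy as np
from math import gamma as gamma_fn

# Hypothetical weekly scarlet fever counts over one season
weeks = np.arange(1, 31)
cases = np.array([2, 4, 7, 12, 20, 31, 45, 60, 72, 80,
                  83, 81, 75, 66, 56, 46, 37, 29, 22, 17,
                  12, 9, 7, 5, 4, 3, 2, 2, 1, 1])

# Treat the normalized epidemic curve as a density over week of onset
# and fit gamma shape/scale by matching its mean and variance
p = cases / cases.sum()
mean = np.sum(weeks * p)
var = np.sum((weeks - mean) ** 2 * p)
shape, scale = mean ** 2 / var, var / mean

# Reconstructed curve: gamma pdf evaluated at each week
pdf = weeks ** (shape - 1) * np.exp(-weeks / scale) / (gamma_fn(shape) * scale ** shape)
peak_week = weeks[np.argmax(pdf)]
print(f"shape = {shape:.2f}, scale = {scale:.2f}, modelled peak at week {peak_week}")
```

The asymmetric rise-then-long-tail shape of the gamma density is what makes it a better template for epidemic peaks than a symmetric curve.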

  9. Constructing three emotion knowledge tests from the invariant measurement approach

    PubMed Central

    Prieto, Gerardo; Burin, Debora I.

    2017-01-01

    Background Psychological constructionist models like the Conceptual Act Theory (CAT) postulate that complex states such as emotions are composed of basic psychological ingredients that are more clearly respected by the brain than basic emotions. The objective of this study was the construction and initial validation of Emotion Knowledge (EK) measures within the CAT framework by means of an invariant measurement approach, the Rasch Model (RM). Psychological distance theory was used to inform item generation. Methods Three EK tests—emotion vocabulary (EV), close emotional situations (CES) and far emotional situations (FES)—were constructed and tested with the RM in a community sample of 100 females and 100 males (age range: 18–65), both separately and conjointly. Results It was corroborated that data-RM fit was sufficient. Then, the effect of type of test and emotion on Rasch-modelled item difficulty was tested. Significant effects of emotion on EK item difficulty were found, but the only statistically significant difference was that between “happiness” and the remaining emotions; neither type of test nor interaction effects on EK item difficulty were statistically significant. The testing of gender differences was carried out after corroborating that differential item functioning (DIF) would not be a plausible alternative hypothesis for the results. No statistically significant sex-related differences were found in EV, CES, FES, or total EK. However, the sign of d indicates that female participants were consistently better than male ones, a result that will be of interest for future meta-analyses. Discussion The three EK tests are ready to be used as components of a higher-level measurement process. PMID:28929013

  10. Observed and Projected Precipitation Changes over the Nine US Climate Regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chylek, Petr; Dubey, Manvendra; Hengartner, Nicholas

    Here, we analyze the past (1900–2015) temperature and precipitation changes in nine separate US climate regions. We find that the temperature increased in a statistically significant manner (at the 95% confidence level, equivalent to an alpha level of 0.05) in all of these regions. However, the variability in the observed precipitation was much more complex. In the eastern US (east of the Rocky Mountains), the precipitation increased in all five climate regions, and the increase was statistically significant in three of them. In contrast, in the western US, the precipitation increased in two regions and decreased in two, with no statistical significance in any region. The CMIP5 climate models (an ensemble mean) were not able to properly capture either the large precipitation differences between the eastern and the western US, or the changes of precipitation between 1900 and 2015 in the eastern US. The statistical regression model explains the differences between the eastern and western US precipitation as the result of different significant predictors. The anthropogenic greenhouse gases and aerosol (GHGA) are the major forcing of the precipitation in the eastern part of the US, while the Pacific Decadal Oscillation (PDO) has the major influence on precipitation in the western part of the US. This analysis suggests that the precipitation over the eastern US increased at an approximate rate of 6.7%/K, in agreement with the Clausius-Clapeyron equation, while the precipitation of the western US was approximately constant, independent of the temperature. Future precipitation over the western part of the US will depend on the behavior of the PDO, and how it may be affected by future warming. The low hydrological sensitivity (percent increase of precipitation per one K of warming) projected by the CMIP5 models for the eastern US suggests either an underestimate of future precipitation or an overestimate of future warming.

  11. Observed and Projected Precipitation Changes over the Nine US Climate Regions

    DOE PAGES

    Chylek, Petr; Dubey, Manvendra; Hengartner, Nicholas; ...

    2017-10-25

Here, we analyze the past (1900–2015) temperature and precipitation changes in nine separate US climate regions. We find that the temperature increased in a statistically significant (95% confidence level, equivalent to an alpha level of 0.05) manner in all of these regions. However, the variability in the observed precipitation was much more complex. In the eastern US (east of the Rocky Mountains), the precipitation increased in all five climate regions and the increase was statistically significant in three of them. In contrast, in the western US, the precipitation increased in two regions and decreased in two, with no statistical significance in any region. The CMIP5 climate models (an ensemble mean) were not able to properly capture either the large precipitation differences between the eastern and the western US, or the changes of precipitation between 1900 and 2015 in the eastern US. The statistical regression model explains the differences between the eastern and western US precipitation as the result of different significant predictors. The anthropogenic greenhouse gases and aerosol (GHGA) are the major forcing of the precipitation in the eastern part of the US, while the Pacific Decadal Oscillation (PDO) has the major influence on precipitation in the western part of the US. This analysis suggests that the precipitation over the eastern US increased at an approximate rate of 6.7%/K, in agreement with the Clausius-Clapeyron equation, while the precipitation of the western US was approximately constant, independent of the temperature. Future precipitation over the western part of the US will depend on the behavior of the PDO, and how it may be affected by future warming. Low hydrological sensitivity (percent increase of precipitation per one K of warming) projected by the CMIP5 models for the eastern US suggests either an underestimate of future precipitation or an overestimate of future warming.

  12. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent any unexpected gear failure because gear cracks lead to gear tooth breakage. Signal processing based methods mainly require expertise to explain gear fault signatures, which is usually not easy for ordinary users. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. Previous case studies experimentally proved that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance prediction accuracies of existing K-nearest neighbors based methods and extend identification of 3 different gear crack levels to identification of 5 different gear crack levels, redundant statistical features are constructed by using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads.
Based on the new significant statistical features, some other popular statistical models including linear discriminant analysis, quadratic discriminant analysis, classification and regression tree and naive Bayes classifier, are compared with the developed method. The results show that the developed method has the highest prediction accuracies among these statistical models. Additionally, selection of the number of new significant features and parameter selection of K-nearest neighbors are thoroughly investigated.
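    The classification step described above can be sketched in a few lines. The snippet below is a minimal illustration of the K-nearest neighbors method only: it uses toy two-dimensional features standing in for the paper's 620 wavelet-packet statistics, and hypothetical crack-level labels, not the authors' implementation or data.

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points
    (Euclidean distance), as in a basic K-nearest neighbors method."""
    dists = sorted(
        (math.dist(xi, x), yi) for xi, yi in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy "statistical features" (e.g. RMS, kurtosis) for three crack levels;
# the feature values and labels are hypothetical.
train_X = [(0.1, 0.2), (0.2, 0.1), (1.0, 1.1), (1.1, 0.9), (2.0, 2.1), (2.1, 1.9)]
train_y = ["healthy", "healthy", "crack25", "crack25", "crack50", "crack50"]

print(knn_predict(train_X, train_y, (1.05, 1.0)))  # nearest neighbors vote "crack25"
```

    In the paper's setting, the inputs would instead be the reduced set of significant statistical features, and k would be tuned as discussed in the parameter-selection study.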

  13. Rapid Fuel Quality Surveillance Through Chemometric Modeling of Near-Infrared Spectra

    DTIC Science & Technology

    2009-01-01

    measurements also have a first order advantage and are not time-dependent as is the case for chromatography. Thus, the data preprocessing requirements, while...due in part to the nature of hydrocarbon fuels, which imposes significant technical challenges that must be overcome, and in many cases, traditional...properties. The statistical significance of some other fuel properties is given in Table 2. Note also that in those cases where the property models

  14. Structural uncertainty of downscaled climate model output in a difficult-to-resolve environment: data sparseness and parameterization error contribution to statistical and dynamical downscaling output in the U.S. Caribbean region

    NASA Astrophysics Data System (ADS)

    Terando, A. J.; Grade, S.; Bowden, J.; Henareh Khalyani, A.; Wootten, A.; Misra, V.; Collazo, J.; Gould, W. A.; Boyles, R.

    2016-12-01

    Sub-tropical island nations may be particularly vulnerable to anthropogenic climate change because of predicted changes in the hydrologic cycle that would lead to significant drying in the future. However, decision makers in these regions have seen their adaptation planning efforts frustrated by the lack of island-resolving climate model information. Recently, two investigations have used statistical and dynamical downscaling techniques to develop climate change projections for the U.S. Caribbean region (Puerto Rico and U.S. Virgin Islands). We compare the results from these two studies with respect to three commonly downscaled CMIP5 global climate models (GCMs). The GCMs were dynamically downscaled at a convective-permitting scale using two different regional climate models. The statistical downscaling approach was conducted at locations with long-term climate observations and then further post-processed using climatologically aided interpolation (yielding two sets of projections). Overall, both approaches face unique challenges. The statistical approach suffers from a lack of observations necessary to constrain the model, particularly at the land-ocean boundary and in complex terrain. The dynamically downscaled model output has a systematic dry bias over the island despite ample availability of moisture in the atmospheric column. Notwithstanding these differences, both approaches are consistent in projecting a drier climate that is driven by the strong global-scale anthropogenic forcing.

  15. Parameter optimization in biased decoy-state quantum key distribution with both source errors and statistical fluctuations

    NASA Astrophysics Data System (ADS)

    Zhu, Jian-Rong; Li, Jian; Zhang, Chun-Mei; Wang, Qin

    2017-10-01

    The decoy-state method has been widely used in commercial quantum key distribution (QKD) systems. In view of practical decoy-state QKD with both source errors and statistical fluctuations, we propose a universal model of full parameter optimization in biased decoy-state QKD with phase-randomized sources. We then adopt this model to carry out simulations of two widely used sources: the weak coherent source (WCS) and the heralded single-photon source (HSPS). Results show that full parameter optimization can significantly improve not only the secure transmission distance but also the final key generation rate. Moreover, when taking source errors and statistical fluctuations into account, the performance of decoy-state QKD using HSPS suffers less than that of decoy-state QKD using WCS.

  16. The challenge of identifying greenhouse gas-induced climatic change

    NASA Technical Reports Server (NTRS)

    Maccracken, Michael C.

    1992-01-01

    Meeting the challenge of identifying greenhouse gas-induced climatic change involves three steps. First, observations of critical variables must be assembled, evaluated, and analyzed to determine that there has been a statistically significant change. Second, reliable theoretical (model) calculations must be conducted to provide a definitive set of changes for which to search. Third, a quantitative and statistically significant association must be made between the projected and observed changes to exclude the possibility that the changes are due to natural variability or other factors. This paper provides a qualitative overview of scientific progress in successfully fulfilling these three steps.

  17. Systematic and fully automated identification of protein sequence patterns.

    PubMed

    Hart, R K; Royyuru, A K; Stolovitzky, G; Califano, A

    2000-01-01

    We present an efficient algorithm to systematically and automatically identify patterns in protein sequence families. The procedure is based on the Splash deterministic pattern discovery algorithm and on a framework to assess the statistical significance of patterns. We demonstrate its application to the fully automated discovery of patterns in 974 PROSITE families (the complete subset of PROSITE families which are defined by patterns and contain DR records). Splash generates patterns with better specificity and undiminished sensitivity, or vice versa, in 28% of the families; identical statistics were obtained in 48% of the families, worse statistics in 15%, and mixed behavior in the remaining 9%. In about 75% of the cases, Splash patterns identify sequence sites that overlap more than 50% with the corresponding PROSITE pattern. The procedure is sufficiently rapid to enable its use for daily curation of existing motif and profile databases. Third, our results show that the statistical significance of discovered patterns correlates well with their biological significance. The trypsin subfamily of serine proteases is used to illustrate this method's ability to exhaustively discover all motifs in a family that are statistically and biologically significant. Finally, we discuss applications of sequence patterns to multiple sequence alignment and the training of more sensitive score-based motif models, akin to the procedure used by PSI-BLAST. All results are available at http://www.research.ibm.com/spat/.
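    The general idea of scoring a discovered pattern's statistical significance can be illustrated with a binomial tail probability: how likely is it that a pattern with a given background match rate would hit at least as many family members by chance? This is a generic sketch with hypothetical numbers, not the specific significance framework used by Splash.

```python
from math import comb

def match_significance(n_seqs, n_matches, p_match):
    """Upper-tail binomial probability of seeing at least n_matches of a
    pattern in n_seqs independent sequences, given a background per-sequence
    match probability p_match.  Small values indicate a statistically
    significant (and, per the paper, likely biologically meaningful) motif."""
    return sum(
        comb(n_seqs, k) * p_match**k * (1 - p_match)**(n_seqs - k)
        for k in range(n_matches, n_seqs + 1)
    )

# A pattern matching 40 of 50 family members, with a 1% chance of matching
# a random background sequence, is overwhelmingly significant:
print(match_significance(50, 40, 0.01))  # ~1e-70, effectively zero
```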

  18. Caries prevention during orthodontic treatment: In-vivo assessment of high-fluoride varnish to prevent white spot lesions.

    PubMed

    Perrini, Federico; Lombardo, Luca; Arreghini, Angela; Medori, Silvia; Siciliani, Giuseppe

    2016-02-01

    Our objective was to evaluate the efficacy of a fluoridated varnish in preventing white spot lesions in patients with fixed appliances. A laser-induced fluorescence device was used to determine any correlations between the degree of demineralization and the length of the observation period, the arch sector, the frequency of varnish application, and the specific tooth site. A split-mouth study design was used for 24 orthodontic patients, allocated randomly to 2 subgroups with differing frequencies of Duraphat varnish (Colgate-Palmolive, New York, NY) application. Repeated measures of the degree of demineralization were taken on the vestibular surfaces of 12 teeth (6 varnished and 6 unvarnished controls). Measurements were taken at 4 sites using a DIAGNOdent Pen 2190 laser (KaVo, Biberach an der Riss, Germany) and then subjected to statistical analysis. Generalized linear model and coefficient model analysis showed differences in the degrees of demineralization between treated and untreated teeth, but this was not statistically significant in terms of time point, frequency of application, or specific tooth site. However, when we analyzed the position of the teeth, the varnished anterior teeth showed a statistically significant reduction in demineralization compared with their unvarnished counterparts. Periodic application of fluoride varnish can offer some protection against white spots, but not to a statistically significant degree if the patients have excellent oral hygiene. Copyright © 2016 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  19. [Relationship between finger dermatoglyphics and body size indicators in adulthood among Chinese twin population from Qingdao and Lishui cities].

    PubMed

    Sun, Luanluan; Yu, Canqing; Lyu, Jun; Cao, Weihua; Pang, Zengchang; Chen, Weijian; Wang, Shaojie; Chen, Rongfu; Gao, Wenjing; Li, Liming

    2014-01-01

    To study the correlation between fingerprints and body size indicators in adulthood. Samples comprised twins from two sub-registries of the Chinese National Twin Registry (CNTR): 405 twin pairs in Lishui and 427 twin pairs in Qingdao. All participants completed the field survey, consisting of a questionnaire, physical examination and blood collection. From the 832 twin pairs, those with complete and clear dermatoglyphic prints were selected as the target population; 100 twin pairs with information on demographic characteristics and related adulthood body size indicators were finally included in this research. Descriptive statistics and mixed linear models were used for data analyses. In the mixed linear models adjusted for age and sex, the body fat percentage of those who had arches was higher than that of those who did not (P = 0.002), and those who had radial loops had higher body fat percentage than those who did not (P = 0.041). After adjusting for age, there was no statistically significant correlation between radial loops and systolic pressure, but the correlations of arches (P = 0.031) and radial loops (P = 0.022) with diastolic pressure remained statistically significant. Statistically significant correlations were found between fingerprint types and body size indicators; fingerprint types may thus be a useful tool to explore the effects of the uterine environment on health status in adulthood.

  20. Meteorological models for estimating phenology of corn

    NASA Technical Reports Server (NTRS)

    Daughtry, C. S. T.; Cochran, J. C.; Hollinger, S. E.

    1984-01-01

    Knowledge of when critical crop stages occur and how the environment affects them should provide useful information for crop management decisions and crop production models. Two sources of data were evaluated for predicting dates of silking and physiological maturity of corn (Zea mays L.). Initial evaluations were conducted using data of an adapted corn hybrid grown on a Typic Agriaquoll at the Purdue University Agronomy Farm. The second phase extended the analyses to large areas using data acquired by the Statistical Reporting Service of USDA for crop reporting districts (CRD) in Indiana and Iowa. Several thermal models were compared to calendar days for predicting dates of silking and physiological maturity. Mixed models which used a combination of thermal units to predict silking and days after silking to predict physiological maturity were also evaluated. At the Agronomy Farm the models were calibrated and tested on the same data. The thermal models were significantly less biased and more accurate than calendar days for predicting dates of silking. Differences among the thermal models were small. Significant improvements in both bias and accuracy were observed when the mixed models were used to predict dates of physiological maturity. The results indicate that statistical data for CRD can be used to evaluate models developed at agricultural experiment stations.

  1. Analysis Monthly Import of Palm Oil Products Using Box-Jenkins Model

    NASA Astrophysics Data System (ADS)

    Ahmad, Nurul F. Y.; Khalid, Kamil; Saifullah Rusiman, Mohd; Ghazali Kamardan, M.; Roslan, Rozaini; Che-Him, Norziha

    2018-04-01

    The palm oil industry has been an important component of the national economy, especially the agriculture sector. The aim of this study is to identify the pattern of imports of palm oil products, to model the time series using the Box-Jenkins approach and to forecast the monthly import of palm oil products. The approach includes statistical tests for verifying model adequacy and statistical measures for comparing three models, namely the Autoregressive (AR) model, the Moving Average (MA) model and the Autoregressive Moving Average (ARMA) model. The identified model differs across products: AR(1) was found to be the best model for imports of palm oil, while MA(3) was found to be the best model for imports of palm kernel oil. For palm kernel, MA(4) was found to be the best model. The forecasts for the next four months for imports of palm oil, palm kernel oil and palm kernel showed a marked decrease compared to the actual data.
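    A minimal sketch of the simplest Box-Jenkins case, AR(1), follows. It assumes a Yule-Walker estimate of the lag-1 coefficient and a made-up series; a full Box-Jenkins analysis would also cover model identification (ACF/PACF inspection) and diagnostic checking, and the study's MA(3)/MA(4) fits are not reproduced here.

```python
def fit_ar1(series):
    """Estimate an AR(1) model x_t - mu = phi*(x_{t-1} - mu) + e_t by the
    Yule-Walker method (phi = lag-1 sample autocorrelation)."""
    n = len(series)
    mu = sum(series) / n
    dev = [x - mu for x in series]
    num = sum(dev[t] * dev[t - 1] for t in range(1, n))
    den = sum(d * d for d in dev)
    return mu, num / den

def forecast_ar1(series, steps, mu=None, phi=None):
    """Iterate the fitted recursion forward; forecasts decay toward the mean."""
    if mu is None or phi is None:
        mu, phi = fit_ar1(series)
    out, last = [], series[-1]
    for _ in range(steps):
        last = mu + phi * (last - mu)
        out.append(last)
    return out

# Hypothetical monthly import figures, forecast four months ahead:
series = [10, 12, 11, 13, 12, 14, 13, 15]
print(forecast_ar1(series, 4))
```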

  2. Angular Baryon Acoustic Oscillation measure at z=2.225 from the SDSS quasar survey

    NASA Astrophysics Data System (ADS)

    de Carvalho, E.; Bernui, A.; Carvalho, G. C.; Novaes, C. P.; Xavier, H. S.

    2018-04-01

    Following a quasi model-independent approach we measure the transversal BAO mode at high redshift using the two-point angular correlation function (2PACF). The analyses done here are only possible now with the quasar catalogue from the twelfth data release (DR12Q) from the Sloan Digital Sky Survey, because it is spatially dense enough to allow the measurement of the angular BAO signature with moderate statistical significance and acceptable precision. Our analyses with quasars in the redshift interval z in [2.20,2.25] produce the angular BAO scale θBAO = 1.77° ± 0.31° with a statistical significance of 2.12 σ (i.e., 97% confidence level), calculated through a likelihood analysis performed using the theoretical covariance matrix sourced by the analytical power spectra expected in the ΛCDM concordance model. Additionally, we show that the BAO signal is robust—although with less statistical significance—under diverse bin-size choices and under small displacements of the quasars' angular coordinates. Finally, we also performed cosmological parameter analyses comparing the θBAO predictions for wCDM and w(a)CDM models with angular BAO data available in the literature, including the measurement obtained here, jointly with CMB data. The constraints on the parameters ΩM, w0 and wa are in excellent agreement with the ΛCDM concordance model.
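    The two-point angular correlation function can be sketched with the simple "natural" estimator w(θ) = DD/RR − 1, comparing pair counts in the data catalogue against a random catalogue; the paper's likelihood analysis and covariance treatment are far more involved, and the coordinates below are hypothetical.

```python
import math

def ang_sep_deg(p, q):
    """Angular separation in degrees between two (ra, dec) points in degrees,
    via the spherical law of cosines."""
    ra1, de1 = map(math.radians, p)
    ra2, de2 = map(math.radians, q)
    c = (math.sin(de1) * math.sin(de2)
         + math.cos(de1) * math.cos(de2) * math.cos(ra1 - ra2))
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

def w_natural(data, rand, theta_lo, theta_hi):
    """Natural estimator w = (DD/RR) * Nr(Nr-1)/(Nd(Nd-1)) - 1 for one
    angular bin [theta_lo, theta_hi) in degrees."""
    dd = sum(theta_lo <= ang_sep_deg(a, b) < theta_hi
             for i, a in enumerate(data) for b in data[i + 1:])
    rr = sum(theta_lo <= ang_sep_deg(a, b) < theta_hi
             for i, a in enumerate(rand) for b in rand[i + 1:])
    nd, nr = len(data), len(rand)
    return (dd / rr) * (nr * (nr - 1)) / (nd * (nd - 1)) - 1

# Sanity check: with identical data and random catalogues, w(θ) = 0;
# an excess of close data pairs (clustering, e.g. a BAO bump) gives w > 0.
pts = [(0, 0), (1, 0), (5, 5), (20, 10)]
print(w_natural(pts, pts, 0, 10))  # 0.0
```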

  3. Simulation, identification and statistical variation in cardiovascular analysis (SISCA) - A software framework for multi-compartment lumped modeling.

    PubMed

    Huttary, Rudolf; Goubergrits, Leonid; Schütte, Christof; Bernhard, Stefan

    2017-08-01

    It has not yet been possible to obtain modeling approaches suitable for covering a wide range of real world scenarios in cardiovascular physiology because many of the system parameters are uncertain or even unknown. Natural variability and statistical variation of cardiovascular system parameters in healthy and diseased conditions are characteristic features for understanding cardiovascular diseases in more detail. This paper presents SISCA, a novel software framework for cardiovascular system modeling and its MATLAB implementation. The framework defines a multi-model statistical ensemble approach for dimension reduced, multi-compartment models and focuses on statistical variation, system identification and patient-specific simulation based on clinical data. We also discuss a data-driven modeling scenario as a use case example. The regarded dataset originated from routine clinical examinations and comprised typical pre and post surgery clinical data from a patient diagnosed with coarctation of aorta. We conducted patient and disease specific pre/post surgery modeling by adapting a validated nominal multi-compartment model with respect to structure and parametrization using metadata and MRI geometry. In both models, the simulation reproduced measured pressures and flows fairly well with respect to stenosis and stent treatment and by pre-treatment cross stenosis phase shift of the pulse wave. However, with post-treatment data showing unrealistic phase shifts and other more obvious inconsistencies within the dataset, the methods and results we present suggest that conditioning and uncertainty management of routine clinical data sets needs significantly more attention to obtain reasonable results in patient-specific cardiovascular modeling. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Forest-stressing climate factors on the US West Coast as simulated by CMIP5

    NASA Astrophysics Data System (ADS)

    Rupp, D. E.; Buotte, P.; Hicke, J. A.; Law, B. E.; Mote, P.; Sharp, D.; Zhenlin, Y.

    2013-12-01

    The rate of forest mortality has increased significantly in western North America since the 1970s. Causes include insect attacks, fire, and soil water deficit, all of which are interdependent. We first identify climate factors that stress forests by reducing photosynthesis and hydraulic conductance, and by promoting bark beetle infestation and wildfire. Examples of such factors may be two consecutive years of extreme summer precipitation deficit, or prolonged vapor pressure deficit exceeding some threshold. Second, we quantify the frequency and magnitude of these climate factors in the 20th and 21st century climates of Washington, Oregon, and California in the western US, as simulated by global climate models (GCMs) in the Coupled Model Intercomparison Project phase 5 (CMIP5). Both 'raw' (i.e., original spatial resolution) and statistically downscaled simulations are considered, the latter generated using the Multivariate Adaptive Constructed Analogs (MACA) method. CMIP5 models that most faithfully reproduce the observed historical statistics of these climate factors are identified. Furthermore, significant changes in the statistics between the 20th and 21st centuries are reported. A subsequent task will be to use a selected subset of MACA-downscaled CMIP5 simulations to force the Community Land Model, version 4.5 (CLM 4.5). CLM 4.5 will be modified to better simulate forest mortality and to couple CLM with an economic model. The ultimate goal of this study is to understand the interactions and the feedbacks by which the market and the forest ecosystem influence each other.

  5. A methodology for the design of experiments in computational intelligence with multiple regression models.

    PubMed

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided for different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant using this kind of algorithms. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as for bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  6. Bootstrap study of genome-enabled prediction reliabilities using haplotype blocks across Nordic Red cattle breeds.

    PubMed

    Cuyabano, B C D; Su, G; Rosa, G J M; Lund, M S; Gianola, D

    2015-10-01

    This study compared the accuracy of genome-enabled prediction models using individual single nucleotide polymorphisms (SNP) or haplotype blocks as covariates when using either a single breed or a combined population of Nordic Red cattle. The main objective was to compare predictions of breeding values of complex traits using a combined training population with haplotype blocks, with predictions using a single breed as training population and individual SNP as predictors. To compare the prediction reliabilities, bootstrap samples were taken from the test data set. With the bootstrapped samples of prediction reliabilities, we built and graphed confidence ellipses to allow comparisons. Finally, measures of statistical distances were used to calculate the gain in predictive ability. Our analyses are innovative in the context of assessment of predictive models, allowing a better understanding of prediction reliabilities and providing a statistical basis to effectively calibrate whether one prediction scenario is indeed more accurate than another. An ANOVA indicated that use of haplotype blocks produced significant gains mainly when Bayesian mixture models were used but not when Bayesian BLUP was fitted to the data. Furthermore, when haplotype blocks were used to train prediction models in a combined Nordic Red cattle population, we obtained up to a statistically significant 5.5% average gain in prediction accuracy, over predictions using individual SNP and training the model with a single breed. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
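    The bootstrap comparison of prediction reliabilities can be illustrated with a percentile bootstrap confidence interval on paired accuracy gains; the gain values below are hypothetical, and the study's confidence-ellipse construction and statistical-distance measures are not reproduced.

```python
import random
import statistics

def bootstrap_ci(sample, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic of a sample:
    resample with replacement, recompute the statistic, take the
    alpha/2 and 1-alpha/2 quantiles."""
    rng = random.Random(seed)
    n = len(sample)
    boots = sorted(stat([sample[rng.randrange(n)] for _ in range(n)])
                   for _ in range(n_boot))
    lo = boots[int(alpha / 2 * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical paired differences in prediction reliability (haplotype
# blocks minus individual SNPs): if the interval excludes 0, the gain
# is credibly positive.
gains = [0.05, 0.07, 0.04, 0.06, 0.05, 0.08, 0.03, 0.06, 0.07, 0.05]
print(bootstrap_ci(gains))
```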

  7. A methodology for the design of experiments in computational intelligence with multiple regression models

    PubMed Central

    Gestal, Marcos; Munteanu, Cristian R.; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided for different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant using this kind of algorithms. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as for bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable. PMID:27920952

  8. Multifactor-Dimensionality Reduction Reveals High-Order Interactions among Estrogen-Metabolism Genes in Sporadic Breast Cancer

    PubMed Central

    Ritchie, Marylyn D.; Hahn, Lance W.; Roodi, Nady; Bailey, L. Renee; Dupont, William D.; Parl, Fritz F.; Moore, Jason H.

    2001-01-01

    One of the greatest challenges facing human geneticists is the identification and characterization of susceptibility genes for common complex multifactorial human diseases. This challenge is partly due to the limitations of parametric-statistical methods for detection of gene effects that are dependent solely or partially on interactions with other genes and with environmental exposures. We introduce multifactor-dimensionality reduction (MDR) as a method for reducing the dimensionality of multilocus information, to improve the identification of polymorphism combinations associated with disease risk. The MDR method is nonparametric (i.e., no hypothesis about the value of a statistical parameter is made), is model-free (i.e., it assumes no particular inheritance model), and is directly applicable to case-control and discordant-sib-pair studies. Using simulated case-control data, we demonstrate that MDR has reasonable power to identify interactions among two or more loci in relatively small samples. When it was applied to a sporadic breast cancer case-control data set, in the absence of any statistically significant independent main effects, MDR identified a statistically significant high-order interaction among four polymorphisms from three different estrogen-metabolism genes. To our knowledge, this is the first report of a four-locus interaction associated with a common complex multifactorial disease. PMID:11404819
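    The core MDR pooling step, collapsing multilocus genotype cells into high- and low-risk groups by their case:control ratio, can be sketched as follows; the genotype data and the 1.0 threshold are toy values, and the cross-validation and permutation-testing layers of the full method are omitted.

```python
from collections import defaultdict

def mdr_labels(genotypes, status, threshold=1.0):
    """Pool multilocus genotype combinations into 'high' or 'low' risk
    cells by comparing each cell's case:control ratio to a threshold,
    the dimensionality-reduction step at the heart of MDR."""
    cases = defaultdict(int)
    ctrls = defaultdict(int)
    for g, s in zip(genotypes, status):
        (cases if s == 1 else ctrls)[tuple(g)] += 1
    labels = {}
    for cell in set(cases) | set(ctrls):
        ratio = cases[cell] / max(ctrls[cell], 1)  # avoid division by zero
        labels[cell] = "high" if ratio > threshold else "low"
    return labels

# Toy two-locus data: combination (1, 1) appears mostly in cases (status 1).
genos  = [(0, 0), (0, 1), (1, 1), (1, 1), (1, 1), (0, 0), (0, 1), (1, 1)]
status = [0, 0, 1, 1, 1, 0, 1, 0]
print(mdr_labels(genos, status))
```

    The resulting one-dimensional high/low variable is what the full method then evaluates by cross-validated prediction accuracy.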

  9. The Relationship between Zinc Levels and Autism: A Systematic Review and Meta-analysis.

    PubMed

    Babaknejad, Nasim; Sayehmiri, Fatemeh; Sayehmiri, Kourosh; Mohamadkhani, Ashraf; Bahrami, Somaye

    2016-01-01

    Autism is a complex, behaviorally defined disorder. A relationship between zinc (Zn) levels in autistic patients and development of the pathogenesis has been suggested, but no firm conclusion has been reached. The present study was conducted to estimate this association using meta-analysis. In this study, using a Fixed Effect Model, twelve articles published from 1978 to 2012 were selected by searching Google Scholar, PubMed, ISI Web of Science, and Scopus, and the information was analyzed. I² statistics were calculated to examine heterogeneity. The data were analyzed using R and STATA Ver. 12.2. There was no statistically significant difference in hair, nail, and teeth Zn levels between controls and autistic patients: -0.471 [95% confidence interval (95% CI): -1.172 to 0.231]. There was a statistically significant difference in plasma Zn concentration between autistic patients and healthy controls: -0.253 (95% CI: -0.498 to -0.007). Using a Random Effect Model, the overall integration of data from the two groups was -0.414 (95% CI: -0.878 to -0.051). Based on sensitivity analysis, zinc supplements may be used as nutritional therapy for autistic patients.
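    The fixed-effect pooling step of such a meta-analysis is a short computation: an inverse-variance weighted average of the per-study effect sizes. The effect sizes and variances below are hypothetical, not the study's data.

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance weighted (fixed-effect) pooled effect size with a
    95% confidence interval: each study is weighted by 1/variance, and the
    pooled standard error is sqrt(1 / sum of weights)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical standardized mean differences from three studies:
effect, (lo, hi) = fixed_effect_pool([-0.3, -0.2, -0.25], [0.01, 0.02, 0.015])
print(round(effect, 3), round(lo, 3), round(hi, 3))
```

    Because the interval here lies entirely below zero, the pooled difference would be reported as statistically significant; a random-effects model would additionally incorporate between-study variance.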

  10. Examination of the Ovarian Reserve after Generation of Unilateral Rudimentary Uterine Horns in Rats

    PubMed Central

    Toyganözü, Hasan; Nazik, Hakan; Narin, Raziye; Satar, Deniz; Narin, Mehmet Ali; Büyüknacar, Sinem; Api, Murat; Aytan, Hakan

    2014-01-01

    Objective. The purpose of this experimental rat model study is to evaluate the changes in the ovarian environment after excision of the rudimentary horn. Methods. Ten female Wistar albino rats were used in this study. One cm of right uterine horn length was excised in the first operation. Two months after the first operation, all animals were sacrificed to obtain ovaries for histological examination. Mann-Whitney U test and Student's t-test were used for statistical analysis purposes. Statistical significance was defined as P < 0.005. Results. The number of primordial follicles (P = 0.415), primary follicles (P = 0.959), preantral follicles (P = 0.645), antral follicles (P = 0.328), and Graafian follicles (P = 0.721) was decreased and the number of atretic follicles (P = 0.374) increased in the right ovarian side. However, this difference was not found to be statistically significant. Conclusion. The results of this experimental rat model study suggest that the excision of rudimentary horn could have negative effects on ipsilateral ovarian functions. PMID:24672393
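    The Mann-Whitney U statistic used in such comparisons is easy to compute directly. The sketch below gives the statistic only (follicle counts are hypothetical); converting U to a p-value would additionally require the exact null distribution or a normal approximation.

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for two independent samples: count, over
    all cross-sample pairs, how often x exceeds y (ties count 0.5), then
    report the smaller of the two U values."""
    u_x = sum(
        1.0 if xi > yi else 0.5 if xi == yi else 0.0
        for xi in x for yi in y
    )
    u_y = len(x) * len(y) - u_x
    return min(u_x, u_y)

# Complete separation gives U = 0; heavy overlap gives U near n*m/2.
print(mann_whitney_u([1, 2, 3], [4, 5, 6]))  # 0.0
```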

  11. A Model Comparison for Count Data with a Positively Skewed Distribution with an Application to the Number of University Mathematics Courses Completed

    ERIC Educational Resources Information Center

    Liou, Pey-Yan

    2009-01-01

    The current study examines three regression models: OLS (ordinary least squares) linear regression, Poisson regression, and negative binomial regression for analyzing count data. Simulation results show that the OLS regression model performed better than the others, since it did not produce more false statistically significant relationships than…

  12. The Impact of State Legislation and Model Policies on Bullying in Schools.

    PubMed

    Terry, Amanda

    2018-04-01

    The purpose of this study was to determine the impact of the coverage of state legislation and the expansiveness ratings of state model policies on the state-level prevalence of bullying in schools. The state-level prevalence of bullying in schools was based on cross-sectional data from the 2013 High School Youth Risk Behavior Survey. Multiple regression was conducted to determine whether the coverage of state legislation and the expansiveness rating of a state model policy affected the state-level prevalence of bullying in schools. The purpose and definition category of components in state legislation and the expansiveness rating of a state model policy were statistically significant predictors of the state-level prevalence of bullying in schools. The other 3 categories of components in state legislation (District Policy Development and Review, District Policy Components, and Additional Components) were not statistically significant predictors in the model. Extensive coverage in the purpose and definition category of components in state legislation and a high expansiveness rating of a state model policy may be important in efforts to reduce bullying in schools. Improving these areas may reduce the state-level prevalence of bullying in schools. © 2018, American School Health Association.

  13. Association between hepatitis B virus/hepatitis C virus infection and primary hepatocellular carcinoma risk: A meta-analysis based on Chinese population.

    PubMed

    Li, Libo; Lan, Xiaolin

    2016-12-01

    To assess the relationship between hepatitis B virus (HBV), hepatitis C virus (HCV), and HBV/HCV double infection and hepatocellular carcinoma risk in the Chinese population. The PubMed and CNKI databases were electronically searched using the search terms HBV, HCV, and hepatocellular carcinoma. Related case-control and cohort studies were included. The association between virus infection and hepatocellular carcinoma risk was expressed as the odds ratio (OR) with 95% confidence interval (95% CI). The data were pooled by a fixed- or random-effects model according to the statistical heterogeneity. Publication bias was assessed by Begg's funnel plot and Egger's linear regression test. Finally, 13 publications were included in this meta-analysis. Because of significant statistical heterogeneity (I² = 99.8%, P = 0.00), the OR was pooled by a random-effects model; the pooled results showed that HBV infection significantly increases the risk of developing hepatocellular carcinoma (OR = 58.01, 95% CI: 44.27-71.75). Significant heterogeneity also existed in the evaluation of HCV infection and hepatocellular carcinoma risk across the 13 included studies (I² = 77.78%, P = 0.00), so the OR was pooled by a random-effects model; HCV infection significantly increases the risk of developing hepatocellular carcinoma (OR = 2.34, 95% CI: 1.20-3.47). Significant heterogeneity did not exist in the evaluation of HBV/HCV double infection and hepatocellular carcinoma risk (I² = 0.00%, P = 0.80), so the OR was pooled by a fixed-effects model; HBV/HCV double infection significantly increases the risk of developing hepatocellular carcinoma (OR = 11.39, 95% CI: 4.58-18.20). No publication bias was found for HBV, HCV, or HBV/HCV double infection and hepatocellular carcinoma. For the Chinese population, HBV, HCV, or HBV/HCV double infection significantly increases the risk of developing hepatocellular carcinoma.

  14. Statistical analysis of the effect of temperature and inlet humidities on the parameters of a semiempirical model of the internal resistance of a polymer electrolyte membrane fuel cell

    NASA Astrophysics Data System (ADS)

    Giner-Sanz, J. J.; Ortega, E. M.; Pérez-Herranz, V.

    2018-03-01

    The internal resistance of a PEM fuel cell depends on the operation conditions and on the current delivered by the cell. This work's goal is to obtain a semiempirical model able to reproduce the effect of the operation current on the internal resistance of an individual cell of a commercial PEM fuel cell stack; and to perform a statistical analysis in order to study the effect of the operation temperature and the inlet humidities on the parameters of the model. First, the internal resistance of the individual fuel cell operating in different operation conditions was experimentally measured for different DC currents, using the high frequency intercept of the impedance spectra. Then, a semiempirical model based on Springer and co-workers' model was proposed. This model is able to successfully reproduce the experimental trends. Subsequently, the curves of resistance versus DC current obtained for different operation conditions were fitted to the semiempirical model, and an analysis of variance (ANOVA) was performed in order to determine which factors have a statistically significant effect on each model parameter. Finally, a response surface method was applied in order to obtain a regression model.
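
    An ANOVA of fitted model parameters against the operating factors, as described above, can be sketched like this; the two-level design and the resistance-parameter values are invented for illustration, not the paper's measurements:

```python
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

# Hypothetical fitted membrane-resistance parameter over a 2x2 design
# (temperature x inlet relative humidity), 4 replicates per cell
rng = np.random.default_rng(2)
T = np.repeat([60, 80], 8)                # cell temperature (deg C)
RH = np.tile(np.repeat([50, 100], 4), 2)  # inlet relative humidity (%)
param = 0.2 - 0.001 * T - 0.0005 * RH + rng.normal(0, 0.005, 16)
df = pd.DataFrame({"T": T, "RH": RH, "param": param})

# Two-way ANOVA with interaction on the fitted parameter
fit = ols("param ~ C(T) * C(RH)", data=df).fit()
table = anova_lm(fit, typ=2)
print(table)
```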

  15. Detection of crossover time scales in multifractal detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Ge, Erjia; Leung, Yee

    2013-04-01

    Fractal analysis is employed in this paper as a scale-based method for identifying the scaling behavior of time series. Many spatial and temporal processes exhibiting complex multi(mono)-scaling behaviors are fractals, and a common method for analyzing them is multifractal detrended fluctuation analysis (MF-DFA). One of the important concepts in fractal analysis is the crossover time scale(s) separating distinct regimes that have different fractal scaling behaviors. The detection of crossover time scales is, however, relatively subjective, since it has been made without rigorous statistical procedures and has generally been determined by eyeballing or subjective observation. Crossover time scales so determined may be spurious and problematic and may not reflect the genuine underlying scaling behavior of a time series. The purpose of this paper is to propose a statistical procedure to model complex fractal scaling behaviors and reliably identify the crossover time scales under MF-DFA. The scaling-identification regression model, grounded on a solid statistical foundation, is first proposed to describe multi-scaling behaviors of fractals. Through regression analysis and statistical inference, we can (1) identify crossover time scales that cannot be detected by eyeballing, (2) determine the number and locations of the genuine crossover time scales, (3) give confidence intervals for the crossover time scales, and (4) establish a statistically significant regression model depicting the underlying scaling behavior of a time series. To substantiate our argument, the regression model is applied to analyze the multi-scaling behaviors of avian-influenza outbreaks, water consumption, daily mean temperature, and rainfall in Hong Kong. Through the proposed model, we can gain a deeper understanding of fractals in general and a statistical approach to identifying multi-scaling behavior under MF-DFA in particular.
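
    One simple statistical alternative to eyeballing a crossover is a piecewise (two-segment) regression on the log-log fluctuation curve, choosing the breakpoint that minimizes total squared error. This sketch uses synthetic data and is a simplified stand-in for the paper's scaling-identification regression model:

```python
import numpy as np

def crossover_fit(logs, logF):
    """Fit two joined straight lines to a log-log fluctuation curve;
    return the breakpoint index with minimum total squared error."""
    best = (np.inf, None)
    for k in range(3, len(logs) - 3):  # keep >= 3 points per segment
        sse = 0.0
        for seg in (slice(0, k + 1), slice(k, len(logs))):
            a, b = np.polyfit(logs[seg], logF[seg], 1)
            sse += np.sum((logF[seg] - (a * logs[seg] + b)) ** 2)
        if sse < best[0]:
            best = (sse, k)
    return best[1]

# Synthetic curve: scaling exponent 0.5 below the crossover, 1.0 above it
logs = np.linspace(1, 4, 31)
true_k = 15
logF = np.where(logs <= logs[true_k],
                0.5 * logs,
                0.5 * logs[true_k] + 1.0 * (logs - logs[true_k]))
k = crossover_fit(logs, logF)
print(f"estimated crossover at log s = {logs[k]:.2f}")
```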

  16. Functional Status Outperforms Comorbidities as a Predictor of 30-Day Acute Care Readmissions in the Inpatient Rehabilitation Population.

    PubMed

    Shih, Shirley L; Zafonte, Ross; Bates, David W; Gerrard, Paul; Goldstein, Richard; Mix, Jacqueline; Niewczyk, Paulette; Greysen, S Ryan; Kazis, Lewis; Ryan, Colleen M; Schneider, Jeffrey C

    2016-10-01

    Functional status is associated with patient outcomes, but is rarely included in hospital readmission risk models. The objective of this study was to determine whether functional status is a better predictor of 30-day acute care readmission than traditionally investigated variables including demographics and comorbidities. Retrospective database analysis between 2002 and 2011. 1158 US inpatient rehabilitation facilities. 4,199,002 inpatient rehabilitation facility admissions comprising patients from 16 impairment groups within the Uniform Data System for Medical Rehabilitation database. Logistic regression models predicting 30-day readmission were developed based on age, gender, comorbidities (Elixhauser comorbidity index, Deyo-Charlson comorbidity index, and Medicare comorbidity tier system), and functional status [Functional Independence Measure (FIM)]. We hypothesized that (1) function-based models would outperform demographic- and comorbidity-based models and (2) the addition of demographic and comorbidity data would not significantly enhance function-based models. For each impairment group, Function Only Models were compared against Demographic-Comorbidity Models and Function Plus Models (Function-Demographic-Comorbidity Models). The primary outcome was 30-day readmission, and the primary measure of model performance was the c-statistic. All-cause 30-day readmission rate from inpatient rehabilitation facilities to acute care hospitals was 9.87%. C-statistics for the Function Only Models were 0.64 to 0.70. For all 16 impairment groups, the Function Only Model demonstrated better c-statistics than the Demographic-Comorbidity Models (c-statistic difference: 0.03-0.12). The best-performing Function Plus Models exhibited negligible improvements in model performance compared to Function Only Models, with c-statistic improvements of only 0.01 to 0.05. 
Readmissions are currently used as a marker of hospital performance, with recent financial penalties to hospitals for excessive readmissions. Function-based readmission models outperform models based only on demographics and comorbidities. Readmission risk models would benefit from the inclusion of functional status as a primary predictor. Copyright © 2016 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
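
    The c-statistic used above to compare models is the probability that a readmitted patient is ranked riskier than a non-readmitted one. A sketch on simulated data (the variables and coefficients are invented, not the study's):

```python
import numpy as np

def c_statistic(y, score):
    """Concordance (c-statistic / ROC AUC) via the pairwise-rank identity."""
    y, score = np.asarray(y), np.asarray(score)
    pos, neg = score[y == 1], score[y == 0]
    # Fraction of (positive, negative) pairs where the positive scores higher
    wins = ((pos[:, None] > neg[None, :]).sum()
            + 0.5 * (pos[:, None] == neg[None, :]).sum())
    return wins / (len(pos) * len(neg))

# Hypothetical readmission data: lower functional score -> higher risk
rng = np.random.default_rng(3)
n = 2000
fim = rng.normal(80, 15, n)          # functional status (FIM-like)
comorb = rng.poisson(2, n)           # comorbidity count
logit = -1.0 - 0.04 * (fim - 80) + 0.1 * (comorb - 2)
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

c_func = c_statistic(y, -fim)        # function-only model
c_com = c_statistic(y, comorb)       # comorbidity-only model
print(f"function-only c = {c_func:.3f}, comorbidity-only c = {c_com:.3f}")
```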

  17. Students' Emergent Articulations of Statistical Models and Modeling in Making Informal Statistical Inferences

    ERIC Educational Resources Information Center

    Braham, Hana Manor; Ben-Zvi, Dani

    2017-01-01

    A fundamental aspect of statistical inference is representation of real-world data using statistical models. This article analyzes students' articulations of statistical models and modeling during their first steps in making informal statistical inferences. An integrated modeling approach (IMA) was designed and implemented to help students…

  18. A statistical experiment design approach for optimizing biodegradation of weathered crude oil in coastal sediments.

    PubMed

    Mohajeri, Leila; Aziz, Hamidi Abdul; Isa, Mohamed Hasnain; Zahed, Mohammad Ali

    2010-02-01

    This work studied the bioremediation of weathered crude oil (WCO) in coastal sediment samples using a central composite face-centered design (CCFD) under response surface methodology (RSM). Initial oil concentration, biomass, nitrogen and phosphorus concentrations were used as independent variables (factors) and oil removal as the dependent variable (response) in a 60-day trial. A statistically significant model for WCO removal was obtained. The coefficient of determination (R² = 0.9732) and probability value (P < 0.0001) demonstrated the significance of the regression model. Numerical optimization based on the desirability function was carried out for initial oil concentrations of 2, 16 and 30 g per kg sediment; 83.13, 78.06 and 69.92 per cent removal were observed, respectively, compared with 77.13, 74.17 and 69.87 per cent for the un-optimized conditions.
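
    Fitting the second-order response-surface model behind such a CCFD analysis reduces to least squares on linear, quadratic, and interaction terms. The design points and removal values below are invented for illustration, not the study's data:

```python
import numpy as np

# Hypothetical CCFD-style runs: removal (%) vs oil (g/kg) and nitrogen (mg/kg)
oil = np.array([2, 2, 30, 30, 2, 30, 16, 16, 16], float)
nit = np.array([50, 150, 50, 150, 100, 100, 50, 150, 100], float)
removal = np.array([82, 84, 68, 71, 83, 70, 75, 78, 77], float)

# Second-order RSM model: b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(oil), oil, nit, oil**2, nit**2, oil * nit])
beta, *_ = np.linalg.lstsq(X, removal, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((removal - pred) ** 2) / np.sum((removal - removal.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
```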

  19. Exploratory analysis of the potential relationship between urinary molybdenum and bone mineral density among adult men and women from NHANES 2007-2010.

    PubMed

    Lewis, Ryan C; Johns, Lauren E; Meeker, John D

    2016-12-01

    Human exposure to molybdenum (Mo) may play a role in reducing bone mineral density (BMD) by interfering with steroid sex hormone levels. To begin to address gaps in the literature on this topic, the potential relationship between urinary Mo (U-Mo) and BMD at the femoral neck (FN-BMD) and lumbar spine (LS-BMD) was explored in a sample of 1496 adults participating in the 2007-2010 cycles of the National Health and Nutrition Examination Survey. Associations were assessed using multiple linear regression models stratified on sex and age. In adjusted models for 50-80+ year-old women, there was a statistically significant inverse relationship between natural log-U-Mo and LS-BMD (p-value: 0.002), and a statistically significant dose-dependent decrease in LS-BMD with increasing U-Mo quartiles (trend p-value: 0.002). A suggestive (trend p-value: 0.08), dose-dependent decrease in FN-BMD with increasing U-Mo quartiles was noted in this group of women as well. All other adjusted models revealed no statistically significant or suggestive relationships between U-Mo and FN-BMD or LS-BMD. Bone health is important for overall human health and well-being and, given the exploratory nature of this work, additional studies are needed to confirm the results in other populations, and clarify the potential underlying mechanisms of Mo on BMD. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Neopuff T-piece resuscitator mask ventilation: Does mask leak vary with different peak inspiratory pressures in a manikin model?

    PubMed

    Maheshwari, Rajesh; Tracy, Mark; Hinder, Murray; Wright, Audrey

    2017-08-01

    The aim of this study was to compare mask leak with three different peak inspiratory pressure (PIP) settings during T-piece resuscitator (TPR; Neopuff) mask ventilation on a neonatal manikin model. Participants were neonatal unit staff members. They were instructed to provide mask ventilation with a TPR with three PIP settings (20, 30, 40 cm H2O) chosen in a random order. Each episode was for 2 min with a 2-min rest period. Flow rate and positive end-expiratory pressure (PEEP) were kept constant. Airway pressure, inspiratory and expiratory tidal volumes, mask leak, respiratory rate and inspiratory time were recorded. Repeated measures analysis of variance was used for statistical analysis. A total of 12 749 inflations delivered by 40 participants were analysed. There were no statistically significant differences (P > 0.05) in the mask leak with the three PIP settings. No statistically significant differences were seen in respiratory rate and inspiratory time with the three PIP settings. There was a significant rise in PEEP as the PIP increased. Failure to achieve the desired PIP was observed especially at the higher settings. In a neonatal manikin model, the mask leak does not vary as a function of the PIP when the flow rate is constant. With a fixed rate and inspiratory time, there seems to be a rise in PEEP with increasing PIP. © 2017 Paediatrics and Child Health Division (The Royal Australasian College of Physicians).
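
    The repeated-measures ANOVA used above can be sketched as follows; the per-participant leak percentages are simulated, not the study's measurements:

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical per-participant mean mask leak (%) at three PIP settings
rng = np.random.default_rng(4)
n = 12
base = rng.normal(30, 8, n)  # participant-specific leak level
rows = []
for pip in (20, 30, 40):
    for subj in range(n):
        # No true PIP effect here, only participant and measurement noise
        rows.append({"subj": subj, "pip": pip,
                     "leak": base[subj] + rng.normal(0, 3)})
df = pd.DataFrame(rows)

# Repeated-measures ANOVA: PIP as the within-subject factor
res = AnovaRM(df, depvar="leak", subject="subj", within=["pip"]).fit()
print(res)
```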

  1. Royal jelly and bee pollen decrease bone loss due to osteoporosis in an oophorectomized rat model.

    PubMed

    Kafadar, Ibrahim Halil; Güney, Ahmet; Türk, Cemil Yildirim; Oner, Mithat; Silici, Sibel

    2012-01-01

    In this study, we aimed to investigate whether royal jelly and bee pollen reduce bone loss due to osteoporosis in an oophorectomized rat model. Thirty-two mature female Sprague-Dawley rats, six months old and weighing 180-260 g, were used in the study. The rats were divided into four groups: a sham-operation group, an oophorectomy-only group, an oophorectomy plus royal jelly group, and an oophorectomy plus bee pollen group. The rats were sacrificed within 12 weeks following surgery. Bone mineral density (BMD) was measured and blood samples were collected for biochemical analysis before sacrifice. After sacrifice, uterine weights were measured and tissue samples were taken to determine bone calcium and phosphate levels, with imaging through scanning electron microscopy. The uterine weights of the rats were higher in the sham-operation group than in the other groups, and the difference among the groups was statistically significant (p=0.001). Total body BMD results were similar in all groups, with no statistically significant difference (p=0.19). The lumbar spine and proximal femur BMD results were statistically significantly higher in the royal jelly and bee pollen groups compared with the oophorectomy-only group (p=0.001). Bone tissue calcium and phosphate levels were higher in the royal jelly and bee pollen groups. Royal jelly and bee pollen decrease bone loss due to osteoporosis in the oophorectomized rat model. These results may contribute to clinical practice.

  2. Absolute plate motions relative to deep mantle plumes

    NASA Astrophysics Data System (ADS)

    Wang, Shimin; Yu, Hongzheng; Zhang, Qiong; Zhao, Yonghong

    2018-05-01

    Advances in whole waveform seismic tomography have revealed the presence of broad mantle plumes rooted at the base of the Earth's mantle beneath major hotspots. Hotspot tracks associated with these deep mantle plumes provide ideal constraints for inverting absolute plate motions as well as testing the fixed hotspot hypothesis. In this paper, 27 observed hotspot trends associated with 24 deep mantle plumes are used together with the MORVEL model for relative plate motions to determine an absolute plate motion model, in terms of a maximum likelihood optimization for angular data fitting, combined with an outlier data detection procedure based on statistical tests. The obtained T25M model fits 25 observed trends of globally distributed hotspot tracks to the statistically required level, while the other two hotspot trend data (Comores on Somalia and Iceland on Eurasia) are identified as outliers, which are significantly incompatible with other data. For most hotspots with rate data available, T25M predicts plate velocities significantly lower than the observed rates of hotspot volcanic migration, which cannot be fully explained by biased errors in observed rate data. Instead, the apparent hotspot motions derived by subtracting the observed hotspot migration velocities from the T25M plate velocities exhibit a combined pattern of being opposite to plate velocities and moving towards mid-ocean ridges. The newly estimated net rotation of the lithosphere is statistically compatible with three recent estimates, but differs significantly from 30 of 33 prior estimates.

  3. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    PubMed

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis is employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. 
This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
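
    The Shannon entropy and Jensen-Shannon distance quantities described above can be computed directly; the binned methylation-level distributions here are hypothetical, chosen only to illustrate the calculation:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

# Hypothetical distributions of methylation level over 5 bins (0, 0.25, ..., 1)
normal = np.array([0.50, 0.10, 0.05, 0.10, 0.25])  # mostly un-/fully methylated
tumor = np.array([0.25, 0.20, 0.15, 0.20, 0.20])   # flatter, more stochastic

def shannon_entropy(p):
    """Entropy in bits; higher means more methylation stochasticity."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(f"entropy normal = {shannon_entropy(normal):.3f} bits")
print(f"entropy tumor  = {shannon_entropy(tumor):.3f} bits")
# jensenshannon returns the JS *distance* (square root of the divergence)
print(f"JS distance    = {jensenshannon(normal, tumor, base=2):.3f}")
```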

  4. On fitting generalized linear mixed-effects models for binary responses using different statistical packages.

    PubMed

    Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W; Xia, Yinglin; Zhu, Liang; Tu, Xin M

    2011-09-10

    The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. Copyright © 2011 John Wiley & Sons, Ltd.
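
    A random-intercept logistic GLMM's data-generating process, and the within-subject correlation that makes naive independent-data analysis unreliable, can be sketched as follows (all parameter values are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
n_subj, n_obs = 200, 5
sigma_b = 1.5  # random-intercept standard deviation

# GLMM data-generating process: logit P(y=1) = beta0 + beta1*x + b_i
b = rng.normal(0, sigma_b, n_subj)        # subject random intercepts
x = rng.normal(size=(n_subj, n_obs))
logit = -0.5 + 0.8 * x + b[:, None]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Within-subject correlation inflates the variance of subject-level means
# far beyond what an independent-binomial model predicts
subj_means = y.mean(axis=1)
binom_var = y.mean() * (1 - y.mean()) / n_obs
print(f"variance of subject means: {subj_means.var():.3f}")
print(f"binomial prediction if independent: {binom_var:.3f}")
```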

  5. Testing for nonlinearity in time series: The method of surrogate data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theiler, J.; Galdrikian, B.; Longtin, A.

    1991-01-01

    We describe a statistical approach for identifying nonlinearity in time series; in particular, we want to avoid claims of chaos when simpler models (such as linearly correlated noise) can explain the data. The method requires a careful statement of the null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. We present algorithms for generating surrogate data under various null hypotheses, and we show the results of numerical experiments on artificial data using correlation dimension, Lyapunov exponent, and forecasting error as discriminating statistics. Finally, we consider a number of experimental time series -- including sunspots, electroencephalogram (EEG) signals, and fluid convection -- and evaluate the statistical significance of the evidence for nonlinear structure in each case. 56 refs., 8 figs.
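
    A minimal version of the surrogate-data test can use phase-randomized surrogates for the linear-Gaussian null and skewness as the discriminating statistic; these choices are made for illustration and are simpler than the statistics used in the paper:

```python
import numpy as np

def phase_surrogate(x, rng):
    """Surrogate with the same power spectrum but randomized Fourier
    phases, consistent with a linear-Gaussian null hypothesis."""
    n = len(x)
    X = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(X))
    phases[0] = 0.0           # keep the mean component real
    if n % 2 == 0:
        phases[-1] = 0.0      # keep the Nyquist component real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n)

rng = np.random.default_rng(6)
# Nonlinear test series: a squared AR(1) process
z = np.zeros(1024)
for t in range(1, 1024):
    z[t] = 0.9 * z[t - 1] + rng.normal()
x = z ** 2

skew = lambda v: np.mean(((v - v.mean()) / v.std()) ** 3)
surr = [skew(phase_surrogate(x, rng)) for _ in range(99)]
print(f"original skewness {skew(x):.2f} vs surrogate range "
      f"[{min(surr):.2f}, {max(surr):.2f}]")
```

    The original series falling outside the surrogate distribution is the evidence against the linear null.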

  6. [Establishment of diagnostic model to monitor minimal residual disease of acute promyelocytic leukemia by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry].

    PubMed

    Zhang, Lin-lin; Xu, Zhi-fang; Tan, Yan-hong; Chen, Xiu-hua; Xu, Ai-ning; Ren, Fang-gang; Wang, Hong-wei

    2013-01-01

    To screen potential protein biomarkers of minimal residual disease (MRD) in acute promyelocytic leukemia (APL) by comparing differentially expressed serum proteins among APL patients at diagnosis, APL patients after complete remission (CR), and healthy controls, and to establish and verify a diagnostic model. Serum proteins from 36 cases of primary APL, 29 cases of APL in complete remission, and 32 healthy controls were purified by magnetic beads and then analyzed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). The spectra were analyzed statistically using FlexAnalysis(TM) and ClinProt(TM) software. Two prediction models, primary APL versus healthy control and primary APL versus APL CR, were developed. Thirty-four statistically significant peptide peaks with m/z values ranging from 1000 to 10 000 were obtained in the primary APL/healthy control model (P < 0.001). Seven statistically significant peptide peaks were obtained in the primary APL/APL CR model (P < 0.001). By comparing the protein profiles of the two models, three peptides with m/z 4642, 7764 and 9289 were considered candidate protein biomarkers of APL MRD. A diagnostic pattern for APL CR using m/z 4642 and 9289 was established. Blind validation yielded correct classification of 6 out of 8 cases. MALDI-TOF MS analysis of APL patients' serum proteins is a promising dynamic method for MRD detection, and the two peptides with m/z 4642 and 9289 may be better biomarkers.

  7. Statistical Analysis of Big Data on Pharmacogenomics

    PubMed Central

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905

  8. Prediction of local concentration statistics in variably saturated soils: Influence of observation scale and comparison with field data

    NASA Astrophysics Data System (ADS)

    Graham, Wendy; Destouni, Georgia; Demmy, George; Foussereau, Xavier

    1998-07-01

    The methodology developed in Destouni and Graham [Destouni, G., Graham, W.D., 1997. The influence of observation method on local concentration statistics in the subsurface. Water Resour. Res. 33 (4) 663-676.] for predicting locally measured concentration statistics for solute transport in heterogeneous porous media under saturated flow conditions is applied to the prediction of conservative nonreactive solute transport in the vadose zone where observations are obtained by soil coring. Exact analytical solutions are developed for both the mean and variance of solute concentrations measured in discrete soil cores using a simplified physical model for vadose-zone flow and solute transport. Theoretical results show that while the ensemble mean concentration is relatively insensitive to the length-scale of the measurement, predictions of the concentration variance are significantly impacted by the sampling interval. Results also show that accounting for vertical heterogeneity in the soil profile results in significantly less spreading in the mean and variance of the measured solute breakthrough curves, indicating that it is important to account for vertical heterogeneity even for relatively small travel distances. Model predictions for both the mean and variance of locally measured solute concentration, based on independently estimated model parameters, agree well with data from a field tracer test conducted in Manatee County, Florida.

  9. Modeling the Risk of Radiation-Induced Acute Esophagitis for Combined Washington University and RTOG Trial 93-11 Lung Cancer Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Ellen X.; Bradley, Jeffrey D.; El Naqa, Issam

    2012-04-01

    Purpose: To construct a maximally predictive model of the risk of severe acute esophagitis (AE) for patients who receive definitive radiation therapy (RT) for non-small-cell lung cancer. Methods and Materials: The dataset includes Washington University and RTOG 93-11 clinical trial data (events/patients: 120/374, WUSTL = 101/237, RTOG9311 = 19/137). Statistical model building was performed based on dosimetric and clinical parameters (patient age, sex, weight loss, pretreatment chemotherapy, concurrent chemotherapy, fraction size). A wide range of dose-volume parameters were extracted from dearchived treatment plans, including Dx, Vx, MOHx (mean of hottest x% volume), MOCx (mean of coldest x% volume), and gEUD (generalized equivalent uniform dose) values. Results: The most significant single parameters for predicting acute esophagitis (RTOG Grade 2 or greater) were MOH85, mean esophagus dose (MED), and V30. A superior-inferior weighted dose-center position was derived but not found to be significant. Fraction size was found to be significant on univariate logistic analysis (Spearman R = 0.421, p < 0.00001) but not multivariate logistic modeling. Cross-validation model building was used to determine that an optimal model size needed only two parameters (MOH85 and concurrent chemotherapy, robustly selected on bootstrap model-rebuilding). Mean esophagus dose (MED) is preferred instead of MOH85, as it gives nearly the same statistical performance and is easier to compute. AE risk is given as a logistic function of (0.0688 × MED + 1.50 × ConChemo - 3.13), where MED is in Gy and ConChemo is 1 if concurrent chemotherapy was given and 0 if not. This model correlates with the observed risk of AE with a Spearman coefficient of 0.629 (p < 0.000001). 
    Conclusions: Multivariate statistical model building with cross-validation suggests that a two-variable logistic model based on mean dose and the use of concurrent chemotherapy robustly predicts acute esophagitis risk in the combined WUSTL and RTOG 93-11 trial datasets.
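
    The reported two-variable logistic model can be evaluated directly from the published coefficients; this sketch only restates the formula given in the abstract (logit = 0.0688 × MED + 1.50 × ConChemo - 3.13):

```python
import math

def ae_risk(mean_esoph_dose_gy, concurrent_chemo):
    """Acute esophagitis risk from the two-variable logistic model:
    MED in Gy, concurrent_chemo 1 (yes) or 0 (no)."""
    logit = 0.0688 * mean_esoph_dose_gy + 1.50 * concurrent_chemo - 3.13
    return 1.0 / (1.0 + math.exp(-logit))

print(f"MED 20 Gy, no chemo:   {ae_risk(20, 0):.2f}")
print(f"MED 30 Gy, with chemo: {ae_risk(30, 1):.2f}")
```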

  10. An ensemble Kalman filter for statistical estimation of physics constrained nonlinear regression models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harlim, John, E-mail: jharlim@psu.edu; Mahdi, Adam, E-mail: amahdi@ncsu.edu; Majda, Andrew J., E-mail: jonjon@cims.nyu.edu

    2014-01-15

    A central issue in contemporary science is the development of nonlinear data driven statistical–dynamical models for time series of noisy partial observations from nature or a complex model. It has been established recently that ad-hoc quadratic multi-level regression models can have finite-time blow-up of statistical solutions and/or pathological behavior of their invariant measure. Recently, a new class of physics constrained nonlinear regression models were developed to ameliorate this pathological behavior. Here a new finite ensemble Kalman filtering algorithm is developed for estimating the state, the linear and nonlinear model coefficients, the model and the observation noise covariances from available partial noisy observations of the state. Several stringent tests and applications of the method are developed here. In the most complex application, the perfect model has 57 degrees of freedom involving a zonal (east–west) jet, two topographic Rossby waves, and 54 nonlinearly interacting Rossby waves; the perfect model has significant non-Gaussian statistics in the zonal jet with blocked and unblocked regimes and a non-Gaussian skewed distribution due to interaction with the other 56 modes. We only observe the zonal jet contaminated by noise and apply the ensemble filter algorithm for estimation. Numerically, we find that a three dimensional nonlinear stochastic model with one level of memory mimics the statistical effect of the other 56 modes on the zonal jet in an accurate fashion, including the skew non-Gaussian distribution and autocorrelation decay. On the other hand, a similar stochastic model with zero memory levels fails to capture the crucial non-Gaussian behavior of the zonal jet from the perfect 57-mode model.
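
    The forecast/analysis cycle of an ensemble Kalman filter can be sketched on a scalar linear model, far simpler than the 57-mode system above; all settings are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
F, Q, R = 0.9, 0.5, 1.0  # dynamics, model-noise var, obs-noise var
T, N = 200, 50           # time steps, ensemble size

# Truth and noisy observations
truth = np.zeros(T)
for t in range(1, T):
    truth[t] = F * truth[t - 1] + rng.normal(0, np.sqrt(Q))
obs = truth + rng.normal(0, np.sqrt(R), T)

ens = rng.normal(0, 1, N)  # initial ensemble
est = np.zeros(T)
for t in range(T):
    # Forecast step: propagate each member through the stochastic dynamics
    ens = F * ens + rng.normal(0, np.sqrt(Q), N)
    # Analysis step: Kalman gain from the ensemble variance,
    # assimilating perturbed observations (stochastic EnKF)
    K = ens.var() / (ens.var() + R)
    ens = ens + K * (obs[t] + rng.normal(0, np.sqrt(R), N) - ens)
    est[t] = ens.mean()

rmse_filter = np.sqrt(np.mean((est - truth) ** 2))
rmse_obs = np.sqrt(np.mean((obs - truth) ** 2))
print(f"filter RMSE {rmse_filter:.2f} vs observation RMSE {rmse_obs:.2f}")
```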

  11. Evaluating the statistical performance of less applied algorithms in classification of worldview-3 imagery data in an urbanized landscape

    NASA Astrophysics Data System (ADS)

    Ranaie, Mehrdad; Soffianian, Alireza; Pourmanafi, Saeid; Mirghaffari, Noorollah; Tarkesh, Mostafa

    2018-03-01

    In the recent decade, analysis of remotely sensed imagery has become one of the most common and widely used procedures in environmental studies, in which supervised image classification techniques play a central role. Hence, using a high-resolution Worldview-3 image over a mixed urbanized landscape in Iran, three less commonly applied image classification methods, bagged CART, stochastic gradient boosting, and neural network with feature extraction, were tested and compared with two prevalent methods: random forest and support vector machine with a linear kernel. To do so, each method was run ten times, and three validation techniques were used to estimate the accuracy statistics: cross validation, independent validation, and validation with the total training data. Moreover, the statistical significance of differences between the classification methods was assessed using ANOVA and Tukey's test. In general, the results showed that random forest, by a marginal difference over bagged CART and stochastic gradient boosting, was the best performing method, although based on independent validation there was no significant difference between the performances of the classification methods. It should finally be noted that neural network with feature extraction and linear support vector machine had better processing speed than the others.
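The evaluation protocol above (repeated runs, then an ANOVA on the accuracy scores) can be sketched as follows; the accuracies are hypothetical stand-ins for the study's data, and Tukey's post hoc test is omitted:

```python
import random

def one_way_anova_F(groups):
    """F statistic of a one-way ANOVA over lists of accuracy scores."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

random.seed(0)
# hypothetical accuracies from ten repeated runs of three classifiers
rf   = [random.gauss(0.90, 0.01) for _ in range(10)]  # random forest
svm  = [random.gauss(0.88, 0.01) for _ in range(10)]  # linear SVM
cart = [random.gauss(0.89, 0.01) for _ in range(10)]  # bagged CART
F = one_way_anova_F([rf, svm, cart])
```

A large F (compared against the F distribution with k-1 and n-k degrees of freedom) indicates that at least one classifier's mean accuracy differs; a Tukey test would then localize the differences pairwise.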

  12. Better prognostic marker in ICU - APACHE II, SOFA or SAP II!

    PubMed

    Naqvi, Iftikhar Haider; Mahmood, Khalid; Ziaullaha, Syed; Kashif, Syed Mohammad; Sharif, Asim

    2016-01-01

    This study was designed to determine the comparative efficacy of different scoring systems in assessing the prognosis of critically ill patients. This was a retrospective study conducted in the medical intensive care unit (MICU) and high dependency unit (HDU), Medical Unit III, Civil Hospital, from April 2012 to August 2012. All patients over 16 years of age who fulfilled the criteria for MICU admission were included. Predicted mortality for APACHE II, SAP II and SOFA was calculated. Calibration and discrimination were used to assess the validity of each scoring model. A total of 96 patients with equal gender distribution were enrolled. The average APACHE II score in non-survivors (27.97±8.53) was higher than in survivors (15.82±8.79), a statistically significant difference (p<0.001). The average SOFA score in non-survivors (9.68±4.88) was higher than in survivors (5.63±3.63), also statistically significant (p<0.001). The average SAP II score in non-survivors (53.71±19.05) was higher than in survivors (30.18±16.24), again statistically significant (p<0.001). All three tested scoring models (APACHE II, SAP II and SOFA) were accurate enough for a general description of our ICU patients. APACHE II showed better calibration and discrimination power than SAP II and SOFA.
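Discrimination of a severity score, as reported above, is typically summarized by the area under the ROC curve (the c-statistic). A minimal sketch using the Mann-Whitney formulation, with hypothetical APACHE II scores:

```python
def c_statistic(score_pos, score_neg):
    """Probability that a randomly chosen non-survivor scores higher than a
    randomly chosen survivor, counting ties as half: the ROC area / c-statistic."""
    wins = sum((p > n) + 0.5 * (p == n) for p in score_pos for n in score_neg)
    return wins / (len(score_pos) * len(score_neg))

# hypothetical APACHE II scores: non-survivors tend to score higher
nonsurv = [28, 31, 25, 30]
surv = [16, 14, 20, 18, 15]
auc = c_statistic(nonsurv, surv)
```

A c-statistic of 0.5 means no discrimination; values approaching 1.0 mean the score separates survivors from non-survivors almost perfectly.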

  13. An open-access CMIP5 pattern library for temperature and precipitation: description and methodology

    NASA Astrophysics Data System (ADS)

    Lynch, Cary; Hartin, Corinne; Bond-Lamberty, Ben; Kravitz, Ben

    2017-05-01

    Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing for spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least squares regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest in high latitudes (60-90° N/S). Bias and mean errors between modeled and pattern-predicted output were smaller for patterns generated by the linear regression method than for those generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5 °C, but the choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. This paper describes our library of least squares regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns. The dataset and netCDF data generation code are available at doi:10.5281/zenodo.495632.
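The two pattern-generation methods compared above can be sketched for a single grid cell with hypothetical temperature series: the delta method divides an epoch difference by the global mean change, while the regression method fits a least-squares slope of local on global temperature:

```python
def regression_pattern(global_T, local_T):
    """Least-squares slope of local temperature on global mean temperature."""
    n = len(global_T)
    gbar = sum(global_T) / n
    lbar = sum(local_T) / n
    num = sum((g - gbar) * (l - lbar) for g, l in zip(global_T, local_T))
    den = sum((g - gbar) ** 2 for g in global_T)
    return num / den

def delta_pattern(local_base, local_future, global_base, global_future):
    """Epoch-difference (delta) pattern: local change per degree of global change."""
    return (local_future - local_base) / (global_future - global_base)

# hypothetical grid cell warming 1.5 times as fast as the global mean
global_T = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
local_T = [1.5 * g for g in global_T]
slope = regression_pattern(global_T, local_T)
```

For noise-free, perfectly linear data the two methods agree; with internal variability the regression estimate uses the whole time series rather than two epochs, which is one reason its errors were smaller in the study.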

  14. The Abdominal Aortic Aneurysm Statistically Corrected Operative Risk Evaluation (AAA SCORE) for predicting mortality after open and endovascular interventions.

    PubMed

    Ambler, Graeme K; Gohel, Manjit S; Mitchell, David C; Loftus, Ian M; Boyle, Jonathan R

    2015-01-01

    Accurate adjustment of surgical outcome data for risk is vital in an era of surgeon-level reporting. Current risk prediction models for abdominal aortic aneurysm (AAA) repair are suboptimal. We aimed to develop a reliable risk model for in-hospital mortality after intervention for AAA, using rigorous contemporary statistical techniques to handle missing data. Using data collected during a 15-month period in the United Kingdom National Vascular Database, we applied multiple imputation methodology together with stepwise model selection to generate preoperative and perioperative models of in-hospital mortality after AAA repair, using two thirds of the available data. Model performance was then assessed on the remaining third of the data by receiver operating characteristic curve analysis and compared with existing risk prediction models. Model calibration was assessed by Hosmer-Lemeshow analysis. A total of 8088 AAA repair operations were recorded in the National Vascular Database during the study period, of which 5870 (72.6%) were elective procedures. Both preoperative and perioperative models showed excellent discrimination, with areas under the receiver operating characteristic curve of .89 and .92, respectively. This was significantly better than any of the existing models (area under the receiver operating characteristic curve for best comparator model, .84 and .88; P < .001 and P = .001, respectively). Discrimination remained excellent when only elective procedures were considered. There was no evidence of miscalibration by Hosmer-Lemeshow analysis. We have developed accurate models to assess risk of in-hospital mortality after AAA repair. These models were carefully developed with rigorous statistical methodology and significantly outperform existing methods for both elective cases and overall AAA mortality. These models will be invaluable for both preoperative patient counseling and accurate risk adjustment of published outcome data. 
Copyright © 2015 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  15. Effects of metal- and fiber-reinforced composite root canal posts on flexural properties.

    PubMed

    Kim, Su-Hyeon; Oh, Tack-Oon; Kim, Ju-Young; Park, Chun-Woong; Baek, Seung-Ho; Park, Eun-Seok

    2016-01-01

    The aim of this study was to observe the effects of different test conditions on the flexural properties of root canal posts. Metal- and fiber-reinforced composite root canal posts of various diameters were measured to determine flexural properties using a three-point bending test under different conditions. In this study, the span length/post diameter ratio of the root canal posts varied from 3.0 to 10.0. Multiple regression models with maximum load as a dependent variable were statistically significant. The models with flexural properties as dependent variables were statistically significant, but linear regression models could not be fitted to the data sets. At a low span length/post diameter ratio, the flexural properties were distorted by the occurrence of shear stress in short samples. It was impossible to obtain a high span length/post diameter ratio with root canal posts. The addition of parameters or coefficients is necessary to appropriately represent the flexural properties of root canal posts.
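For a cylindrical post in three-point bending, the textbook beam formulas behind the flexural properties discussed above can be sketched as follows (an idealization, not the study's protocol; loads and dimensions are hypothetical, and the formulas neglect exactly the shear contribution that distorts results at low span/diameter ratios):

```python
import math

def flexural_strength(F_max, span, d):
    """Three-point-bend flexural strength of a cylindrical beam:
    sigma = 8*F*L / (pi*d^3), from sigma = M*c/I with I = pi*d^4/64."""
    return 8 * F_max * span / (math.pi * d ** 3)

def flexural_modulus(stiffness, span, d):
    """Flexural modulus from the load-deflection slope m = F/delta:
    E = 4*m*L^3 / (3*pi*d^4), from E = m*L^3 / (48*I)."""
    return 4 * stiffness * span ** 3 / (3 * math.pi * d ** 4)

# hypothetical 1.4 mm post loaded to 100 N at span/diameter ratios of 3 and 10
d = 1.4
for ratio in (3.0, 10.0):
    L = ratio * d
    sigma = flexural_strength(100.0, L, d)   # N and mm give stress in MPa
```

Because both properties scale with powers of the span, measured values are only comparable across studies when the span/diameter ratio is reported, which is the paper's central point.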

  16. [The reentrant binomial model of nuclear anomalies growth in rhabdomyosarcoma RA-23 cell populations under increasing doses of sparsely ionizing radiation].

    PubMed

    Alekseeva, N P; Alekseev, A O; Vakhtin, Iu B; Kravtsov, V Iu; Kuzovatov, S N; Skorikova, T I

    2008-01-01

    Distributions of nuclear morphology anomalies in transplantable rhabdomyosarcoma RA-23 cell populations were investigated under the effect of ionizing radiation at doses from 0 to 45 Gy. Internuclear bridges, nuclear protrusions and dumbbell-shaped nuclei were counted as morphological anomalies. Empirical distributions of the number of anomalies per 100 nuclei were used. A reentrant binomial distribution was found to be an adequate model: it is the distribution of a sum of binomial random variables whose number of summands is itself binomial. The averages of these random variables were named, accordingly, the internal and external average reentrant components, and their maximum likelihood estimates were derived. The statistical properties of these estimates were investigated by means of statistical modeling. We found that, although the radiation dose correlates equally significantly with the average number of nuclear anomalies, in cell populations examined two to three cell cycles after irradiation in vivo the dose correlates significantly with the internal average reentrant component, whereas in remote descendants of cell transplants irradiated in vitro it correlates with the external one.
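The reentrant binomial construction described above (a sum of binomial variables with a binomially distributed number of summands) can be simulated directly; the parameters below are hypothetical, and the sample mean should approach the product of the two component means:

```python
import random

def reentrant_binomial(n, p, m, q, rng):
    """One draw of a sum of Binomial(m, q) variables, where the number of
    summands is itself Binomial(n, p)."""
    N = sum(rng.random() < p for _ in range(n))                    # external component
    return sum(sum(rng.random() < q for _ in range(m)) for _ in range(N))  # internal sums

rng = random.Random(42)
draws = [reentrant_binomial(20, 0.3, 10, 0.2, rng) for _ in range(5000)]
mean = sum(draws) / len(draws)   # theory: E[X] = (n*p) * (m*q) = 6 * 2 = 12
```

This factorization of the mean into an external part (n*p) and an internal part (m*q) is what lets the authors attribute dose dependence to one component or the other.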

  17. Do sophisticated epistemic beliefs predict meaningful learning? Findings from a structural equation model of undergraduate biology learning

    NASA Astrophysics Data System (ADS)

    Lee, Silvia Wen-Yu; Liang, Jyh-Chong; Tsai, Chin-Chung

    2016-10-01

    This study investigated the relationships among college students' epistemic beliefs in biology (EBB), conceptions of learning biology (COLB), and strategies of learning biology (SLB). EBB includes four dimensions, namely 'multiple-source,' 'uncertainty,' 'development,' and 'justification.' COLB is further divided into 'constructivist' and 'reproductive' conceptions, while SLB represents deep strategies and surface learning strategies. Questionnaire responses were gathered from 303 college students. The results of the confirmatory factor analysis and structural equation modelling showed acceptable model fits. Mediation testing further revealed two paths with complete mediation. In sum, students' epistemic beliefs of 'uncertainty' and 'justification' in biology were statistically significant in explaining the constructivist and reproductive COLB, respectively; and 'uncertainty' was statistically significant in explaining the deep SLB as well. The results of mediation testing further revealed that 'uncertainty' predicted surface strategies through the mediation of 'reproductive' conceptions; and the relationship between 'justification' and deep strategies was mediated by 'constructivist' COLB. This study provides evidence for the essential roles some epistemic beliefs play in predicting students' learning.

  18. A fractional factorial probabilistic collocation method for uncertainty propagation of hydrologic model parameters in a reduced dimensional space

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Huang, W.; Fan, Y. R.; Li, Z.

    2015-10-01

    In this study, a fractional factorial probabilistic collocation method is proposed to reveal statistical significance of hydrologic model parameters and their multi-level interactions affecting model outputs, facilitating uncertainty propagation in a reduced dimensional space. The proposed methodology is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability, as well as its capability of revealing complex and dynamic parameter interactions. A set of reduced polynomial chaos expansions (PCEs) only with statistically significant terms can be obtained based on the results of factorial analysis of variance (ANOVA), achieving a reduction of uncertainty in hydrologic predictions. The predictive performance of reduced PCEs is verified by comparing against standard PCEs and the Monte Carlo with Latin hypercube sampling (MC-LHS) method in terms of reliability, sharpness, and Nash-Sutcliffe efficiency (NSE). Results reveal that the reduced PCEs are able to capture hydrologic behaviors of the Xiangxi River watershed, and they are efficient functional representations for propagating uncertainties in hydrologic predictions.

  19. Correction of the significance level when attempting multiple transformations of an explanatory variable in generalized linear models

    PubMed Central

    2013-01-01

    Background In statistical modeling, finding the most favorable coding for an explanatory quantitative variable involves many tests. This process raises a multiple testing problem and requires correction of the significance level. Methods For each coding, a test of the nullity of the coefficient associated with the newly coded variable is computed. The selected coding is the one associated with the largest test statistic (or, equivalently, the smallest p-value). In the context of the generalized linear model, Liquet and Commenges (Stat Probability Lett, 71:33–38, 2005) proposed an asymptotic correction of the significance level. This procedure, based on the score test, was developed for dichotomous and Box-Cox transformations. In this paper, we suggest the use of resampling methods to estimate the significance level for categorical transformations with more than two levels and, by definition, those that involve more than one parameter in the model. The categorical transformation is a more flexible way to explore the unknown shape of the effect between an explanatory and a dependent variable. Results The simulations we ran in this study showed good performance of the proposed methods. These methods were illustrated using data from a study of the relationship between cholesterol and dementia. Conclusion The algorithms were implemented in R, and the associated CPMCGLM R package is available on the CRAN. PMID:23758852
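Why a correction is needed, and how resampling can estimate it, can be sketched with a Monte Carlo experiment: under the null, keeping the best of several dichotomizations inflates the type I error above the nominal level, and the empirical distribution of the minimum p-value yields a corrected threshold. This sketch uses a normal-approximation z-test rather than the authors' score test, and all settings are hypothetical:

```python
import math
import random

def z_pvalue(a, b):
    """Two-sided p-value of a two-sample z-test (normal approximation)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    z = (ma - mb) / math.sqrt(va / na + vb / nb)
    return math.erfc(abs(z) / math.sqrt(2))

rng = random.Random(1)
sims = 400
pmins = []
for _ in range(sims):
    x = [rng.random() for _ in range(60)]
    y = [rng.gauss(0, 1) for _ in range(60)]      # null: y is independent of x
    # smallest p-value over three candidate dichotomizations of x
    pmins.append(min(
        z_pvalue([yi for xi, yi in zip(x, y) if xi < c],
                 [yi for xi, yi in zip(x, y) if xi >= c])
        for c in (0.3, 0.5, 0.7)))
rate = sum(p < 0.05 for p in pmins) / sims        # exceeds the nominal 0.05
corrected = sorted(pmins)[int(0.05 * sims)]       # resampled 5% threshold, below 0.05
```

Rejecting only when the observed minimum p-value falls below `corrected` restores the intended family-wise 5% level, which is the idea behind the resampling correction.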

  20. Low-Level Contrast Statistics of Natural Images Can Modulate the Frequency of Event-Related Potentials (ERP) in Humans.

    PubMed

    Ghodrati, Masoud; Ghodousi, Mahrad; Yoonessi, Ali

    2016-01-01

    Humans are fast and accurate in categorizing complex natural images. It is, however, unclear what features of visual information are exploited by the brain to perceive the images with such speed and accuracy. It has been shown that low-level contrast statistics of natural scenes can explain the variance of amplitude of event-related potentials (ERP) in response to rapidly presented images. In this study, we investigated the effect of these statistics on the frequency content of ERPs. We recorded ERPs from human subjects while they viewed natural images, each presented for 70 ms. Our results showed that Weibull contrast statistics, as a biologically plausible model, explained the variance of ERPs best, compared to the other image statistics that we assessed. Our time-frequency analysis revealed a significant correlation between these statistics and ERPs' power within the theta frequency band (~3-7 Hz). This is interesting, as the theta band is believed to be involved in context updating and semantic encoding. This correlation became significant at ~110 ms after stimulus onset and peaked at 138 ms. Our results show that not only the amplitude but also the frequency of neural responses can be modulated by low-level contrast statistics of natural images, and highlight their potential role in scene perception.

  2. The Epistemology of Mathematical and Statistical Modeling: A Quiet Methodological Revolution

    ERIC Educational Resources Information Center

    Rodgers, Joseph Lee

    2010-01-01

    A quiet methodological revolution, a modeling revolution, has occurred over the past several decades, almost without discussion. In contrast, the 20th century ended with contentious argument over the utility of null hypothesis significance testing (NHST). The NHST controversy may have been at least partially irrelevant, because in certain ways the…

  3. Evaluating Video Self-Modeling Treatment Outcomes: Differentiating between Statistically and Clinically Significant Change

    ERIC Educational Resources Information Center

    La Spata, Michelle G.; Carter, Christopher W.; Johnson, Wendi L.; McGill, Ryan J.

    2016-01-01

    The present study examined the utility of video self-modeling (VSM) for reducing externalizing behaviors (e.g., aggression, conduct problems, hyperactivity, and impulsivity) observed within the classroom environment. After identification of relevant target behaviors, VSM interventions were developed for first and second grade students (N = 4),…

  4. The Concentric Support Model: A Model for the Planning and Evaluation of Distance Learning Programs

    ERIC Educational Resources Information Center

    Osika, Elizabeth

    2006-01-01

    Each year, the number of institutions offering distance learning courses continues to grow significantly (Green, 2002; National Center for Educational Statistics, 2003; Wagner, 2000). Broskoske and Harvey (2000) explained that "many institutions begin a distance education initiative encouraged by the potential benefits, influenced by their…

  5. EVALUATION OF INTERSPECIES DIFFERENCES IN PHARMACOKINETICS (PK) USING A PBPK MODEL FOR THE PESTICIDE DIMETHYLARSINIC ACID (DMAV)

    EPA Science Inventory

    DMAV is an organoarsenical pesticide registered for use on certain citrus crops and as a cotton defoliant. In lifetime oral route studies in rodents, DMAV causes statistically significant increases in bladder tumors in rats, but not in mice. We have developed a PBPK model for D...

  6. Recessions and health: the impact of economic trends on air pollution in California.

    PubMed

    Davis, Mary E

    2012-10-01

    I explored the hypothesis that economic activity has a significant impact on exposure to air pollution and ultimately human health. I used county-level employment statistics in California (1980-2000), along with major regulatory periods and other controlling factors, to estimate local concentrations of the coefficient of haze, carbon monoxide, and nitrogen dioxide using a mixed regression model approach. The model explained between 33% and 48% of the variability in air pollution levels as estimated by the overall R(2) values. The relationship between employment measures and air pollution was statistically significant, suggesting that air quality improves during economic downturns. Additionally, major air quality regulations played a significant role in reducing air pollution levels over the study period. This study provides important evidence of a role for the economy in understanding human exposure to environmental pollution. The evidence further suggests that the impact of environmental regulations is likely to be overstated when they occur during recessionary periods, and understated when they play out during periods of economic growth.

  7. Anticoagulant vs. antiplatelet therapy in patients with cryptogenic stroke and patent foramen ovale: an individual participant data meta-analysis.

    PubMed

    Kent, David M; Dahabreh, Issa J; Ruthazer, Robin; Furlan, Anthony J; Weimar, Christian; Serena, Joaquín; Meier, Bernhard; Mattle, Heinrich P; Di Angelantonio, Emanuele; Paciaroni, Maurizio; Schuchlenz, Herwig; Homma, Shunichi; Lutz, Jennifer S; Thaler, David E

    2015-09-14

    The preferred antithrombotic strategy for secondary prevention in patients with cryptogenic stroke (CS) and patent foramen ovale (PFO) is unknown. We pooled multiple observational studies and used propensity score-based methods to estimate the comparative effectiveness of oral anticoagulation (OAC) compared with antiplatelet therapy (APT). Individual participant data from 12 databases of medically treated patients with CS and PFO were analysed with Cox regression models, to estimate database-specific hazard ratios (HRs) comparing OAC with APT, for both the primary composite outcome [recurrent stroke, transient ischaemic attack (TIA), or death] and stroke alone. Propensity scores were applied via inverse probability of treatment weighting to control for confounding. We synthesized database-specific HRs using random-effects meta-analysis models. This analysis included 2385 (OAC = 804 and APT = 1581) patients with 227 composite endpoints (stroke/TIA/death). The difference between OAC and APT was not statistically significant for the primary composite outcome [adjusted HR = 0.76, 95% confidence interval (CI) 0.52-1.12] or for the secondary outcome of stroke alone (adjusted HR = 0.75, 95% CI 0.44-1.27). Results were consistent in analyses applying alternative weighting schemes, with the exception that OAC had a statistically significant beneficial effect on the composite outcome in analyses standardized to the patient population who actually received APT (adjusted HR = 0.64, 95% CI 0.42-0.99). Subgroup analyses did not detect statistically significant heterogeneity of treatment effects across clinically important patient groups. We did not find a statistically significant difference comparing OAC with APT; our results justify randomized trials comparing different antithrombotic approaches in these patients. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.

  8. Breast feeding, infant growth, and body mass index at 30 and 35 years.

    PubMed

    Fergusson, David M; McLeod, Geraldine F H; Horwood, L John

    2014-11-01

    This study examined the associations between duration of breast feeding, early infant growth, and body mass index (BMI) at 30 and 35 years, in a birth cohort studied to age 35. Data were gathered on duration of exclusive and non-exclusive breast feeding (months), early growth (kg; 0-9 months), and BMI at ages 30 and 35 from the Christchurch Health and Development Study. The Christchurch Health and Development Study is a study of a birth cohort of 1265 children, born in Christchurch in 1977. Population-averaged generalised estimating regression models showed statistically significant associations between: duration of breast feeding and mean BMI; and early growth and mean BMI. After adjustment for perinatal, family, and social background factors, statistically significant associations were found between: longer duration of breast feeding and lower adult BMI (B = -0.424 [95% confidence interval (CI) -0.708, -0.140]); and increasing early growth and higher adult BMI (B = 0.393 [95% CI 0.080, 0.707]). When breast feeding and infant growth were entered into the regression model and adjusted for covariates, breast feeding was no longer statistically significantly associated with BMI (B = -0.250 [95% CI -0.553, 0.054]), while early growth remained statistically significantly associated with BMI (B = 0.355 [95% CI 0.039, 0.671]). A test for mediation showed that the association between breast feeding and BMI was mediated by early growth (P = 0.01). The association between longer duration of breast feeding and later lower BMI scores in adulthood was mediated by lower early growth. Breast feeding may be included as one component of multicompartment programmes targeted at early growth and later obesity. © 2014 John Wiley & Sons Ltd.
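The mediation logic reported above can be sketched with the product-of-coefficients approach on simulated data (hypothetical effect sizes; the study itself used generalised estimating equations and a formal mediation test):

```python
import random

def slope(x, y):
    """Least-squares slope of y on x (simple regression)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)

def resid(x, y):
    """Residuals of y after simple regression on x."""
    b = slope(x, y)
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return [yi - (my + b * (xi - mx)) for xi, yi in zip(x, y)]

rng = random.Random(7)
X = [rng.gauss(0, 1) for _ in range(2000)]       # e.g. breastfeeding duration (standardized)
M = [-0.5 * x + rng.gauss(0, 1) for x in X]      # mediator: early infant growth
Y = [0.8 * m + rng.gauss(0, 1) for m in M]       # outcome: adult BMI, driven by growth only

c_total = slope(X, Y)                            # total effect of X on Y
a = slope(X, M)                                  # path X -> M
b = slope(resid(X, M), resid(X, Y))              # path M -> Y given X (Frisch-Waugh)
indirect = a * b                                 # mediated (indirect) effect
direct = c_total - indirect                      # ~0 here: the effect is fully mediated
```

When the direct effect vanishes after adjusting for the mediator, as in this simulation, the association between exposure and outcome runs entirely through the mediator, which mirrors the paper's finding for early growth.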

  9. Genetic programming based models in plant tissue culture: An addendum to traditional statistical approach.

    PubMed

    Mridula, Meenu R; Nair, Ashalatha S; Kumar, K Satheesh

    2018-02-01

    In this paper, we compared the efficacy of an observation-based modeling approach using a genetic algorithm with regular statistical analysis as an alternative methodology in plant research. Preliminary experimental data on in vitro rooting were taken for this study with the aim of understanding the effect of charcoal and naphthalene acetic acid (NAA) on successful rooting, and of optimizing the two variables for maximum result. Observation-based modelling, as well as the traditional approach, could identify NAA as a critical factor in rooting of the plantlets under the experimental conditions employed. Symbolic regression analysis using the software deployed here optimised the treatments studied and was successful in identifying the complex non-linear interaction among the variables, with minimal preliminary data. The presence of charcoal in the culture medium has a significant impact on root generation by reducing basal callus mass formation. Such an approach is advantageous for establishing in vitro culture protocols, as these models have significant potential for saving time and expenditure in plant tissue culture laboratories, and further reduce the need for a specialised background.

  10. Forecasting volatility with neural regression: a contribution to model adequacy.

    PubMed

    Refenes, A N; Holt, W T

    2001-01-01

    Neural nets' usefulness for forecasting is limited by problems of overfitting and the lack of rigorous procedures for model identification, selection and adequacy testing. This paper describes a methodology for neural model misspecification testing. We introduce a generalization of the Durbin-Watson statistic for neural regression and discuss the general issues of misspecification testing using residual analysis. We derive a generalized influence matrix for neural estimators which enables us to evaluate the distribution of the statistic. We deploy Monte Carlo simulation to compare the power of the test for neural and linear regressors. While residual testing is not a sufficient condition for model adequacy, it is nevertheless a necessary condition to demonstrate that the model is a good approximation to the data generating process, particularly as neural-network estimation procedures are susceptible to partial convergence. The work is also an important step toward developing rigorous procedures for neural model identification, selection and adequacy testing which have started to appear in the literature. We demonstrate its applicability in the nontrivial problem of forecasting implied volatility innovations using high-frequency stock index options. Each step of the model building process is validated using statistical tests to verify variable significance and model adequacy with the results confirming the presence of nonlinear relationships in implied volatility innovations.
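The Durbin-Watson statistic on which the paper's generalization builds can be sketched directly: values near 2 indicate no first-order autocorrelation in the residuals, while positive autocorrelation pulls the statistic toward 0 (the series below are simulated):

```python
import random

def durbin_watson(residuals):
    """DW = sum((e_t - e_{t-1})^2) / sum(e_t^2); approximately 2*(1 - rho)
    for first-order residual autocorrelation rho."""
    num = sum((residuals[t] - residuals[t - 1]) ** 2 for t in range(1, len(residuals)))
    den = sum(e ** 2 for e in residuals)
    return num / den

rng = random.Random(3)
white = [rng.gauss(0, 1) for _ in range(4000)]     # residuals of a well-specified model
ar = [0.0]
for _ in range(4000):
    ar.append(0.7 * ar[-1] + rng.gauss(0, 1))      # misspecification leaves AR(1) residuals
dw_white = durbin_watson(white)                    # close to 2
dw_ar = durbin_watson(ar[1:])                      # well below 2
```

The paper's contribution is the harder part not shown here: deriving the distribution of this statistic when the residuals come from a neural regression rather than a linear one.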

  11. Sex-Specific Prediction Models for Sleep Apnea From the Hispanic Community Health Study/Study of Latinos.

    PubMed

    Shah, Neomi; Hanna, David B; Teng, Yanping; Sotres-Alvarez, Daniela; Hall, Martica; Loredo, Jose S; Zee, Phyllis; Kim, Mimi; Yaggi, H Klar; Redline, Susan; Kaplan, Robert C

    2016-06-01

    We developed and validated the first-ever sleep apnea (SA) risk calculator in a large population-based cohort of Hispanic/Latino subjects. Cross-sectional data on adults from the Hispanic Community Health Study/Study of Latinos (2008-2011) were analyzed. Subjective and objective sleep measurements were obtained. Clinically significant SA was defined as an apnea-hypopnea index ≥ 15 events per hour. Using logistic regression, four prediction models were created: three sex-specific models (female-only, male-only, and a sex × covariate interaction model to allow differential predictor effects), and one overall model with sex included as a main effect only. Models underwent 10-fold cross-validation and were assessed by using the C statistic. The outcome was SA; a total of 17 candidate predictor variables were considered. A total of 12,158 participants had complete sleep data available; 7,363 (61%) were women. The population-weighted prevalence of SA (apnea-hypopnea index ≥ 15 events per hour) was 6.1% in female subjects and 13.5% in male subjects. Male-only (C statistic, 0.808) and female-only (C statistic, 0.836) prediction models had the same predictor variables (ie, age, BMI, self-reported snoring). The sex-interaction model (C statistic, 0.836) contained sex, age, age × sex, BMI, BMI × sex, and self-reported snoring. The final overall model (C statistic, 0.832) contained age, BMI, snoring, and sex. We developed two websites for our SA risk calculator: one in English (https://www.montefiore.org/sleepapneariskcalc.html) and another in Spanish (http://www.montefiore.org/sleepapneariskcalc-es.html). We created an internally validated, highly discriminating, well-calibrated, and parsimonious prediction model for SA. Contrary to the study hypothesis, the variables did not have different predictive magnitudes in male and female subjects. Copyright © 2016 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
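A risk calculator of the kind described above evaluates a fitted logistic model at a patient's covariates; a minimal sketch with placeholder coefficients (NOT the published model's values):

```python
import math

def sa_risk(age, bmi, snoring, coef):
    """Predicted probability from a logistic model: 1 / (1 + exp(-linear_predictor)).
    The coefficient dictionary holds placeholder values, not the published model."""
    lp = (coef["intercept"] + coef["age"] * age
          + coef["bmi"] * bmi + coef["snoring"] * snoring)
    return 1 / (1 + math.exp(-lp))

# hypothetical coefficients chosen only to make the demo behave sensibly
demo = {"intercept": -8.0, "age": 0.05, "bmi": 0.10, "snoring": 0.9}
low = sa_risk(30, 22, 0, demo)    # younger, lean, non-snoring: low predicted risk
high = sa_risk(60, 35, 1, demo)   # older, obese, snoring: much higher predicted risk
```

The published calculator works the same way with its fitted coefficients for age, BMI, snoring, and sex; the websites cited above simply wrap this computation in a form.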

  12. Risk of metabolic syndrome for stroke is not greater than the sum of its components: Thai Epidemiologic Stroke (TES) study.

    PubMed

    Hanchaiphiboolkul, Suchat; Suwanwela, Nijasri Charnnarong; Poungvarin, Niphon; Nidhinandana, Samart; Puthkhao, Pimchanok; Towanabut, Somchai; Tantirittisak, Tasanee; Suwantamee, Jithanorm; Samsen, Maiyadhaj

    2013-11-01

    Limited information is available on the association between the metabolic syndrome (MetS) and stroke. Whether or not MetS confers a risk greater than the sum of its components is controversial. This study aimed to assess the association of MetS with stroke, and to evaluate whether the risk of MetS is greater than the sum of its components. The Thai Epidemiologic Stroke (TES) study is a community-based cohort study with 19,997 participants, aged 45-80 years, recruited from the general population from 5 regions of Thailand. Baseline survey data were analyzed in cross-sectional analyses. MetS was defined according to criteria from the National Cholesterol Education Program (NCEP) Adult Treatment Panel III, the American Heart Association/National Heart, Lung, and Blood Institute (revised NCEP), and International Diabetes Federation (IDF). Logistic regression analysis was used to estimate the association of MetS and its components with stroke. Using c statistics and the likelihood ratio test, we compared a logistic model containing all MetS components and potential confounders with a model that also included the MetS variable, in terms of their ability to discriminate participants with and without stroke. We found that among the MetS components, high blood pressure and hypertriglyceridemia were independently and significantly related to stroke. MetS defined by the NCEP (odds ratio [OR], 1.64; 95% confidence interval [CI], 1.32-2.04), revised NCEP (OR, 2.27; 95% CI, 1.80-2.87), and IDF definitions (OR, 1.70; 95% CI, 1.37-2.13) was significantly associated with stroke after adjustment for age, sex, geographical area, education level, occupation, smoking status, alcohol consumption, and low-density lipoprotein cholesterol. After additional adjustment for all MetS components, these associations were no longer significant. There was no statistically significant difference (P=.723-.901) in c statistics between the model containing all MetS components and potential confounders and the model also including the MetS variable. The likelihood ratio test likewise showed no statistically significant difference (P=.166-.718) between these 2 models. Our findings suggest that MetS is associated with stroke, but not to a greater degree than the sum of its components. Thus, the focus should be on identification and appropriate control of its individual components, particularly high blood pressure and hypertriglyceridemia, rather than on MetS itself. Copyright © 2013 National Stroke Association. Published by Elsevier Inc. All rights reserved.
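
The nested-model comparison described above can be sketched with a likelihood-ratio test. This is a minimal illustration, not the authors' code: the two log-likelihood values are hypothetical, and since the full model adds a single parameter (the MetS indicator), the statistic is referred to a chi-square distribution with 1 degree of freedom.

```python
import math

def lr_test_df1(ll_reduced, ll_full):
    """Likelihood-ratio test for two nested models differing by one
    parameter (df = 1), e.g. a logistic model with all MetS components
    versus the same model plus a MetS indicator."""
    stat = 2.0 * (ll_full - ll_reduced)
    p = math.erfc(math.sqrt(stat / 2.0))  # chi-square(1) upper-tail probability
    return stat, p

# hypothetical log-likelihoods for the reduced and full models
stat, p = lr_test_df1(-612.4, -611.6)
```

With these made-up values the statistic is 1.6 and p is about 0.21, i.e. adding the MetS variable does not significantly improve the fit, mirroring the abstract's conclusion.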

  13. Microgravity experiments on vibrated granular gases in a dilute regime: non-classical statistics

    NASA Astrophysics Data System (ADS)

    Leconte, M.; Garrabos, Y.; Falcon, E.; Lecoutre-Chabot, C.; Palencia, F.; Évesque, P.; Beysens, D.

    2006-07-01

    We report on an experimental study of a dilute gas of steel spheres colliding inelastically, excited by a piston undergoing sinusoidal vibration in low gravity. Using an improved experimental apparatus, we present results on the collision statistics of particles at a wall of the container. We also propose a simple model in which the non-classical statistics obtained from our data are attributed to the boundary condition playing the role of a 'velostat' instead of a thermostat. The significant departures from the kinetic theory of ordinary gases are related to the inelasticity of the collisions.

  14. Assessing the effect of land use change on catchment runoff by combined use of statistical tests and hydrological modelling: Case studies from Zimbabwe

    NASA Astrophysics Data System (ADS)

    Lørup, Jens Kristian; Refsgaard, Jens Christian; Mazvimavi, Dominic

    1998-03-01

    The purpose of this study was to identify and assess long-term impacts of land use change on catchment runoff in semi-arid Zimbabwe, based on analyses of long hydrological time series (25-50 years) from six medium-sized (200-1000 km²) non-experimental rural catchments. A methodology combining common statistical methods with hydrological modelling was adopted in order to distinguish between the effects of climate variability and the effects of land use change. The hydrological model (NAM) was in general able to simulate the observed hydrographs very well during the reference period, thus providing a means to account for the effects of climate variability and hence strengthening the power of the subsequent statistical tests. In the test period the validated model was used to provide the runoff record which would have occurred in the absence of land use change. The analyses indicated a decrease in the annual runoff for most of the six catchments, with the largest changes occurring for catchments located within communal land, where large increases in population and agricultural intensity have taken place. However, the decrease was only statistically significant at the 5% level for one of the catchments.
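
The core of the approach, comparing observed runoff against the model-simulated record expected in the absence of land use change, can be sketched as a simple bias test on the residuals. This is an illustrative stand-in with invented annual runoff values, not the study's actual test battery:

```python
import statistics

def runoff_bias_test(observed, simulated):
    """Mean bias of observed minus model-simulated annual runoff in the
    test period, with a rough 5% significance screen (|t| > 2)."""
    resid = [o - s for o, s in zip(observed, simulated)]
    mean = statistics.fmean(resid)
    se = statistics.stdev(resid) / len(resid) ** 0.5
    t = mean / se
    return mean, t, abs(t) > 2.0

# hypothetical annual runoff (mm) for a communal-land catchment
obs = [100.0, 95.0, 90.0, 85.0]
sim = [105.0, 100.0, 97.0, 93.0]   # model-simulated "no land use change" record
bias, t, significant = runoff_bias_test(obs, sim)
```

A consistently negative bias with |t| above the threshold would indicate a runoff decrease beyond what climate variability (captured by the model) explains.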

  15. Mass detection, localization and estimation for wind turbine blades based on statistical pattern recognition

    NASA Astrophysics Data System (ADS)

    Colone, L.; Hovgaard, M. K.; Glavind, L.; Brincker, R.

    2018-07-01

    A method for mass change detection on wind turbine blades using natural frequencies is presented. The approach is based on two statistical tests. The first test decides if there is a significant mass change and the second test is a statistical group classification based on Linear Discriminant Analysis. The frequencies are identified by means of Operational Modal Analysis using natural excitation. Based on the assumption of Gaussianity of the frequencies, a multi-class statistical model is developed by combining finite element model sensitivities in 10 classes of change location on the blade, the smallest area being 1/5 of the span. The method is experimentally validated on a full-scale wind turbine blade in a test setup, loaded by natural wind. Mass change from natural causes was imitated with sand bags and the algorithm was observed to perform well, with an experimental detection rate of 1, localization rate of 0.88 and mass estimation rate of 0.72.
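
The two-stage scheme (a significance test for any mass change, then a group classification to localize it) can be sketched as follows. Everything here is hypothetical: two classes instead of the paper's ten, a nearest-class-mean rule standing in for the full LDA, and invented frequency-shift numbers:

```python
import math

# hypothetical class "means" of modal-frequency shifts (Hz) for two blade
# regions, standing in for the paper's finite-element-derived classes
CLASS_MEANS = {"root_fifth": [-0.50, -0.10], "tip_fifth": [-0.10, -0.60]}

def detect_and_localize(freq_shift, noise_sd, class_means=CLASS_MEANS):
    """Stage 1: flag a significant frequency change (two-sided z-test,
    5% level). Stage 2: nearest-class-mean assignment, a simplified
    stand-in for the Linear Discriminant Analysis step."""
    z = max(abs(s) / sd for s, sd in zip(freq_shift, noise_sd))
    if z < 1.96:
        return None  # no significant mass change detected
    return min(class_means, key=lambda c: math.dist(freq_shift, class_means[c]))

loc = detect_and_localize([-0.12, -0.55], [0.05, 0.05])
no_change = detect_and_localize([0.01, -0.02], [0.05, 0.05])
```

The first call's shift pattern sits closest to the tip class; the second call's shifts are within noise, so no detection is declared.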

  16. The writer independent online handwriting recognition system frog on hand and cluster generative statistical dynamic time warping.

    PubMed

    Bahlmann, Claus; Burkhardt, Hans

    2004-03-01

    In this paper, we give a comprehensive description of our writer-independent online handwriting recognition system frog on hand. The focus of this work concerns the presentation of the classification/training approach, which we call cluster generative statistical dynamic time warping (CSDTW). CSDTW is a general, scalable, HMM-based method for variable-sized, sequential data that holistically combines cluster analysis and statistical sequence modeling. It can handle general classification problems that rely on this sequential type of data, e.g., speech recognition, genome processing, robotics, etc. Contrary to previous attempts, clustering and statistical sequence modeling are embedded in a single feature space and use a closely related distance measure. We show character recognition experiments of frog on hand using CSDTW on the UNIPEN online handwriting database. The recognition accuracy is significantly higher than reported results of other handwriting recognition systems. Finally, we describe the real-time implementation of frog on hand on a Linux Compaq iPAQ embedded device.
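
The alignment core of CSDTW is dynamic time warping. A textbook DTW distance, without the cluster analysis and statistical sequence modeling that CSDTW layers on top, might look like:

```python
def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two sequences:
    the minimal cumulative cost over all monotone alignments."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three allowed predecessor cells
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

Because warping can repeat samples, a sequence and a time-stretched copy of it have distance zero, which is exactly what makes DTW suitable for variable-sized handwriting trajectories.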

  17. Computer-aided auditing of prescription drug claims.

    PubMed

    Iyengar, Vijay S; Hermiz, Keith B; Natarajan, Ramesh

    2014-09-01

    We describe a methodology for identifying and ranking candidate audit targets from a database of prescription drug claims. The relevant audit targets may include various entities such as prescribers, patients and pharmacies, who exhibit certain statistical behavior indicative of potential fraud and abuse over the prescription claims during a specified period of interest. Our overall approach is consistent with related work in statistical methods for detection of fraud and abuse, but has a relative emphasis on three specific aspects: first, based on the assessment of domain experts, certain focus areas are selected and data elements pertinent to the audit analysis in each focus area are identified; second, specialized statistical models are developed to characterize the normalized baseline behavior in each focus area; and third, statistical hypothesis testing is used to identify entities that diverge significantly from their expected behavior according to the relevant baseline model. The application of this overall methodology to a prescription claims database from a large health plan is considered in detail.
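
The third step, flagging entities that diverge significantly from their baseline model, can be illustrated with a toy z-test ranking. The baseline here is a simple normal model with invented parameters, far simpler than the specialized per-focus-area models the paper describes:

```python
import math

def rank_audit_targets(claim_counts, baseline_mean, baseline_sd):
    """Score each entity (prescriber, patient, pharmacy) by how far its
    claim count diverges above the baseline model (one-sided z-test),
    then rank worst-first."""
    scored = []
    for entity, count in claim_counts.items():
        z = (count - baseline_mean) / baseline_sd
        p = 0.5 * math.erfc(z / math.sqrt(2))  # upper-tail p-value
        scored.append((entity, round(z, 2), p))
    return sorted(scored, key=lambda row: row[1], reverse=True)

# hypothetical claim counts against a baseline of mean 100, sd 20
ranked = rank_audit_targets({"A": 120, "B": 95, "C": 180}, 100.0, 20.0)
```

Entity "C" (z = 4) tops the list with a p-value far below conventional thresholds, making it the first candidate audit target.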

  18. Simulating statistics of lightning-induced and man-made fires

    NASA Astrophysics Data System (ADS)

    Krenn, R.; Hergarten, S.

    2009-04-01

    The frequency-area distributions of forest fires show power-law behavior with scaling exponents α in a quite narrow range, relating wildfire research to the theoretical framework of self-organized criticality. Examples of self-organized critical behavior can be found in computer simulations of simple cellular automata. The established self-organized critical Drossel-Schwabl forest fire model (DS-FFM) is one of the most widespread models in this context. Despite its qualitative agreement with event-size statistics from nature, its applicability is still questioned. Apart from general concerns that the DS-FFM apparently oversimplifies the complex nature of forest dynamics, it significantly overestimates the frequency of large fires. We present a straightforward modification of the model rules that increases the scaling exponent α by approximately 1/3 and brings the simulated event-size statistics close to those observed in nature. In addition, combined simulations of both the original and the modified model predict a dependence of the overall distribution on the ratio of lightning-induced and man-made fires as well as a difference between their respective event-size statistics. The increase of the scaling exponent with decreasing lightning probability as well as the splitting of the partial distributions are confirmed by the analysis of the Canadian Large Fire Database. As a consequence, lightning-induced and man-made forest fires cannot be treated separately in wildfire modeling, hazard assessment and forest management.
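
Estimating the scaling exponent α of a frequency-area distribution from a fire-size catalog is commonly done by maximum likelihood. A minimal continuous (Hill-type) estimator, applied to a synthetic catalog rather than Canadian Large Fire Database data, might be:

```python
import math

def ml_powerlaw_exponent(sizes, xmin):
    """Maximum-likelihood (Hill-type) estimate of the exponent of a
    power-law size distribution, using events at or above a lower
    cutoff xmin: alpha = 1 + n / sum(ln(x / xmin))."""
    tail = [x for x in sizes if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# synthetic catalog constructed so the estimator returns exactly 2.0;
# the two sub-cutoff events are discarded by the xmin filter
alpha = ml_powerlaw_exponent([math.e] * 10 + [0.5, 0.2], xmin=1.0)
```

Fitting the lightning-caused and human-caused subsets separately would expose the exponent splitting the abstract describes.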

  19. Further Development of a Tissue Engineered Muscle Repair Construct In Vitro for Enhanced Functional Recovery Following Implantation In Vivo in a Murine Model of Volumetric Muscle Loss Injury

    DTIC Science & Technology

    2012-01-01

    ...least significant difference (LSD) correction. Statistical significance was set at α ≤ 0.05. Statistical analyses were performed using SPSS 18.0. Results BAM...Tissue Eng Part A 16, 1395, 2010. 15. Page, R.L., Malcuit, C., Vilner, L., Vojtic, I., Shaw, S., Hedblom, E., Hu, J., Pins, G.D., Rolle, M.W., and

  20. Integration of Marine Mammal Movement and Behavior into the Effects of Sound on the Marine Environment

    DTIC Science & Technology

    2011-09-30

    capability to emulate the dive and movement behavior of marine mammals provides a significant advantage to modeling environmental impact than do historic...approaches used in Navy environmental assessments (EA) and impact statements (EIS). Many previous methods have been statistical or pseudo-statistical...Siderius. 2011. Comparison of methods used for computing the impact of sound on the marine environment, Marine Environmental Research, 71:342-350. [published

  1. QSAR Study of p56lck Protein Tyrosine Kinase Inhibitory Activity of Flavonoid Derivatives Using MLR and GA-PLS

    PubMed Central

    Fassihi, Afshin; Sabet, Razieh

    2008-01-01

    Quantitative relationships between molecular structure and p56lck protein tyrosine kinase inhibitory activity of 50 flavonoid derivatives are discovered by MLR and GA-PLS methods. Different QSAR models revealed that substituent electronic descriptors (SED) parameters have significant impact on protein tyrosine kinase inhibitory activity of the compounds. Between the two statistical methods employed, GA-PLS gave superior results. The resultant GA-PLS model had a high statistical quality (R2 = 0.74 and Q2 = 0.61) for predicting the activity of the inhibitors. The models proposed in the present work are more useful in describing QSAR of flavonoid derivatives as p56lck protein tyrosine kinase inhibitors than those provided previously. PMID:19325836

  2. Prevalence of refractive errors in the Slovak population calculated using the Gullstrand schematic eye model.

    PubMed

    Popov, I; Valašková, J; Štefaničková, J; Krásnik, V

    2017-01-01

    A substantial part of the population suffers from some kind of refractive error. It is envisaged that their prevalence may change with the development of society. The aim of this study is to determine the prevalence of refractive errors using calculations based on the Gullstrand schematic eye model. We used the Gullstrand schematic eye model to calculate refraction retrospectively. Refraction was presented as the need for glasses correction at a vertex distance of 12 mm. The necessary data was obtained using the optical biometer Lenstar LS900. Data which could not be obtained due to the limitations of the device was substituted by theoretical data from the Gullstrand schematic eye model. Only analyses from the right eyes were presented. The data was interpreted using descriptive statistics, Pearson correlation and t-test. The statistical tests were conducted at a level of significance of 5%. Our sample included 1663 patients (665 male, 998 female) within the age range of 19 to 96 years. Average age was 70.8 ± 9.53 years. Average refraction of the eye was 2.73 ± 2.13D (males 2.49 ± 2.34, females 2.90 ± 2.76). The mean absolute error from emmetropia was 3.01 ± 1.58 (males 2.83 ± 2.95, females 3.25 ± 3.35). 89.06% of the sample was hyperopic, 6.61% was myopic and 4.33% emmetropic. We did not find any correlation between refraction and age. Females were more hyperopic than males. We did not find any statistically significant hypermetropic shift of refraction with age. According to our estimation, the calculations of refractive errors using the Gullstrand schematic eye model showed a significant hypermetropic shift of more than +2D. Our results could be used in the future to compare the prevalence of refractive errors using the same methods. Key words: refractive errors, refraction, Gullstrand schematic eye model, population, emmetropia.
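
The abstract presents refraction "as the need for glasses correction at a vertex distance of 12 mm". The standard vertex-distance conversion (a textbook optics formula, not code from the study) is:

```python
def spectacle_power(corneal_plane_D, vertex_m=0.012):
    """Convert a refraction expressed at the corneal plane (dioptres)
    into the equivalent spectacle-lens power worn at the given vertex
    distance (12 mm, as in the study): F = K / (1 + d*K)."""
    return corneal_plane_D / (1.0 + vertex_m * corneal_plane_D)

hyperope = spectacle_power(3.0)   # +3.00 D at the cornea -> slightly weaker plus in glasses
myope = spectacle_power(-5.0)     # -5.00 D at the cornea -> slightly stronger minus in glasses
```

The asymmetry (hyperopes need less plus at the spectacle plane, myopes more minus) is why the vertex distance must be stated when reporting prevalence in dioptres.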

  3. Inservice trainings for Shiraz University of Medical Sciences employees: Effectiveness assessment by using the CIPP model

    PubMed Central

    MOKHTARZADEGAN, MARYAM; AMINI, MITRA; TAKMIL, FARNAZ; ADAMIAT, MOHAMMAD; SARVERAVAN, POONEH

    2015-01-01

    Introduction Nowadays, employees' in-service training has become one of the core components in the survival and success of any organization. Unfortunately, despite the importance of training evaluation, only a small portion of resources is allocated to this matter. Among many evaluation models, the CIPP (Context, Input, Process, Product) model is a very useful approach to educational evaluation. So far, evaluation of training courses has mostly provided information for learners, but this investigation aims at evaluating the effectiveness of the experts' training programs in SUMS and identifying its pros and cons based on the 4 stages of the CIPP model. Method In this descriptive analytical study, done in 2013, 250 employees of SUMS who had participated in in-service training courses were randomly selected. The evaluated variables were designed using the CIPP model, and a researcher-made questionnaire was used for data collection; the questionnaire was validated using expert opinion and its reliability was confirmed by Cronbach's alpha (0.89). Quantitative data were analyzed using SPSS 14 and statistical tests were performed as needed. Results In the context phase, the mean score was highest in solving work problems (4.07±0.88) and lowest in focusing on learners' learning styles in training courses (2.68±0.91). There is a statistically significant difference between the employees' education level and the product phase evaluation (p<0.001). The necessary effectiveness was not statistically significant at the context and input levels (p>0.001), in contrast with the process and product phases, which showed a significant difference (p<0.001). Conclusion Considering our results, although the in-service training given to SUMS employees has been effective in many ways, it has some weaknesses as well. Therefore, improving these weaknesses and reinforcing strong points within the identified fields in this study should be taken into account by decision makers and administrators. PMID:25927072

  4. Inservice trainings for Shiraz University of Medical Sciences employees: Effectiveness assessment by using the CIPP model.

    PubMed

    Mokhtarzadegan, Maryam; Amini, Mitra; Takmil, Farnaz; Adamiat, Mohammad; Sarveravan, Pooneh

    2015-04-01

    Nowadays, employees' in-service training has become one of the core components in the survival and success of any organization. Unfortunately, despite the importance of training evaluation, only a small portion of resources is allocated to this matter. Among many evaluation models, the CIPP (Context, Input, Process, Product) model is a very useful approach to educational evaluation. So far, evaluation of training courses has mostly provided information for learners, but this investigation aims at evaluating the effectiveness of the experts' training programs in SUMS and identifying its pros and cons based on the 4 stages of the CIPP model. In this descriptive analytical study, done in 2013, 250 employees of SUMS who had participated in in-service training courses were randomly selected. The evaluated variables were designed using the CIPP model, and a researcher-made questionnaire was used for data collection; the questionnaire was validated using expert opinion and its reliability was confirmed by Cronbach's alpha (0.89). Quantitative data were analyzed using SPSS 14 and statistical tests were performed as needed. In the context phase, the mean score was highest in solving work problems (4.07±0.88) and lowest in focusing on learners' learning styles in training courses (2.68±0.91). There is a statistically significant difference between the employees' education level and the product phase evaluation (p<0.001). The necessary effectiveness was not statistically significant at the context and input levels (p>0.001), in contrast with the process and product phases, which showed a significant difference (p<0.001). Considering our results, although the in-service training given to SUMS employees has been effective in many ways, it has some weaknesses as well. Therefore, improving these weaknesses and reinforcing strong points within the identified fields in this study should be taken into account by decision makers and administrators.

  5. A Theory-Based Model for Understanding Faculty Intention to Use Students Ratings to Improve Teaching in a Health Sciences Institution in Puerto Rico

    ERIC Educational Resources Information Center

    Collazo, Andrés A.

    2018-01-01

    A model derived from the theory of planned behavior was empirically assessed for understanding faculty intention to use student ratings for teaching improvement. A sample of 175 professors participated in the study. The model was statistically significant and had a very large explanatory power. Instrumental attitude, affective attitude, perceived…

  6. Largest-Crown- Width Prediction Models for 53 Species in the Western United States

    Treesearch

    William A. Bechtold

    2004-01-01

    The mean crown diameters of stand-grown trees 5.0-in. dbh and larger were modeled as a function of stem diameter, live-crown ratio, stand-level basal area, latitude, longitude, elevation, and Hopkins bioclimatic index for 53 tree species in the western United States. Stem diameter was statistically significant in all models, and a quadratic term for stem diameter was...

  7. Crown-Diameter Prediction Models for 87 Species of Stand-Grown Trees in the Eastern United States

    Treesearch

    William A. Bechtold

    2003-01-01

    The mean crown diameters of stand-grown trees were modeled as a function of stem diameter, live-crown ratio, stand basal area, latitude, longitude, elevation, and Hopkins bioclimatic index for 87 tree species in the eastern United States. Stem diameter was statistically significant in all models, and a quadratic term for stem diameter was required for some species....

  8. PCA as a practical indicator of OPLS-DA model reliability.

    PubMed

    Worley, Bradley; Powers, Robert

    Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (added noise) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
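
The Monte Carlo design, adding increasing Gaussian noise and watching the scores-space group separation, can be sketched with plain SVD-based PCA. The data here are synthetic stand-ins for NMR spectra, and the separation is standardized by the pooled within-group spread of the PC1 scores:

```python
import numpy as np

rng = np.random.default_rng(0)

def pc1_separation(X, labels):
    """Standardized distance between two group means along the first
    principal component (a simple scores-space separation measure)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    s = Xc @ Vt[0]                      # PC1 scores
    a, b = s[labels == 0], s[labels == 1]
    pooled = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2.0)
    return abs(a.mean() - b.mean()) / pooled

# two synthetic "spectral" groups separated along a single variable
labels = np.repeat([0, 1], 10)
X = rng.normal(size=(20, 50))
X[labels == 1, 0] += 5.0

sep_clean = pc1_separation(X, labels)
sep_noisy = pc1_separation(X + rng.normal(scale=5.0, size=X.shape), labels)
```

As in the paper's Monte Carlo analysis, heavy added noise degrades the PCA scores-space separation; an OPLS-DA model fit to the same noisy data would still show separation, which is exactly why cross-validation is required.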

  9. Respiratory Disease Related Mortality and Morbidity on an Island of Greece Exposed to Perlite and Bentonite Mining Dust

    PubMed Central

    Sampatakakis, Stefanos; Linos, Athena; Papadimitriou, Eleni; Petralias, Athanasios; Dalma, Archontoula; Papasaranti, Eirini Saranti; Christoforidou, Eleni; Stoltidis, Melina

    2013-01-01

    A morbidity and mortality study took place, focused on Milos Island, where perlite and bentonite mining sites are located. Official data concerning number and cause of deaths, regarding specific respiratory diseases and the total of respiratory diseases, for both Milos Island and the Cyclades Prefecture were used. Standardized Mortality Ratios (SMRs) were computed, adjusted specifically for age, gender and calendar year. Tests of linear trend were performed. By means of a predefined questionnaire, the morbidity rates of specific respiratory diseases in Milos, were compared to those of the municipality of Oinofita, an industrial region. Chi-square analysis was used and the confounding factors of age, gender and smoking were taken into account, by estimating binary logistic regression models. The SMRs for Pneumonia and Chronic Obstructive Pulmonary Disease (COPD) were found elevated for both genders, although they did not reach statistical significance. For the total of respiratory diseases, a statistically significant SMR was identified regarding the decade 1989–1998. The morbidity study revealed elevated and statistically significant Odds Ratios (ORs), associated with allergic rhinitis, pneumonia, COPD and bronchiectasis. An elevated OR was also identified for asthma. After controlling for age, gender and smoking, the ORs were statistically significant and towards the same direction. PMID:24129114

  10. Respiratory disease related mortality and morbidity on an island of Greece exposed to perlite and bentonite mining dust.

    PubMed

    Sampatakakis, Stefanos; Linos, Athena; Papadimitriou, Eleni; Petralias, Athanasios; Dalma, Archontoula; Papasaranti, Eirini Saranti; Christoforidou, Eleni; Stoltidis, Melina

    2013-10-14

    A morbidity and mortality study took place, focused on Milos Island, where perlite and bentonite mining sites are located. Official data concerning number and cause of deaths, regarding specific respiratory diseases and the total of respiratory diseases, for both Milos Island and the Cyclades Prefecture were used. Standardized Mortality Ratios (SMRs) were computed, adjusted specifically for age, gender and calendar year. Tests of linear trend were performed. By means of a predefined questionnaire, the morbidity rates of specific respiratory diseases in Milos, were compared to those of the municipality of Oinofita, an industrial region. Chi-square analysis was used and the confounding factors of age, gender and smoking were taken into account, by estimating binary logistic regression models. The SMRs for Pneumonia and Chronic Obstructive Pulmonary Disease (COPD) were found elevated for both genders, although they did not reach statistical significance. For the total of respiratory diseases, a statistically significant SMR was identified regarding the decade 1989-1998. The morbidity study revealed elevated and statistically significant Odds Ratios (ORs), associated with allergic rhinitis, pneumonia, COPD and bronchiectasis. An elevated OR was also identified for asthma. After controlling for age, gender and smoking, the ORs were statistically significant and towards the same direction.
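
An SMR that is elevated but misses statistical significance, as reported for pneumonia and COPD, typically means its confidence interval still includes 1.0. A minimal sketch with invented death counts (using the simple normal approximation to the Poisson variance, not necessarily the study's exact method):

```python
import math

def smr_with_ci(observed_deaths, expected_deaths):
    """Standardized Mortality Ratio (observed/expected) with an
    approximate 95% CI via the normal approximation to the Poisson
    variance of the observed count."""
    est = observed_deaths / expected_deaths
    half = 1.96 * math.sqrt(observed_deaths) / expected_deaths
    return est, (est - half, est + half)

# hypothetical counts: SMR = 1.5, elevated but with a CI spanning 1.0
est, (lo, hi) = smr_with_ci(30, 20.0)
```

Here the point estimate is elevated (1.5) yet the interval contains 1.0, so the excess would not be declared statistically significant.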

  11. The relationship of bone and blood lead to hypertension: Further analyses of the normative aging study data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, H.; Kim, Rokho; Korrick, S.

    1996-12-31

    In an earlier report based on participants in the Veterans Administration Normative Aging Study, we found a significant association between the risk of hypertension and lead levels in tibia. To examine the possible confounding effects of education and occupation, we considered in this study five levels of education and three levels of occupation as independent variables in the statistical model. Of 1,171 active subjects seen between August 1991 and December 1994, 563 provided complete data for this analysis. In the initial logistic regression model, age and body mass index, family history of hypertension, and dietary sodium intake, but neither cumulative smoking nor alcohol ingestion, conferred increased odds ratios for being hypertensive that were statistically significant. When the lead biomarkers were added separately to this initial logistic model, tibia lead and patella lead levels were associated with significantly elevated odds ratios for hypertension. In the final backward elimination logistic regression model that included categorical variables for education and occupation, the only variables retained were body mass index, family history of hypertension, and tibia lead level. We conclude that education and occupation variables were not confounding the association between the lead biomarkers and hypertension that we reported previously. 27 refs., 3 tabs.

  12. Local Inflammation in Fracture Hematoma: Results from a Combined Trauma Model in Pigs

    PubMed Central

    Horst, K.; Eschbach, D.; Pfeifer, R.; Hübenthal, S.; Sassen, M.; Steinfeldt, T.; Wulf, H.; Ruchholtz, S.; Pape, H. C.; Hildebrand, F.

    2015-01-01

    Background. Previous studies showed significant interaction between the local and systemic inflammatory response after severe trauma in small animal models. The purpose of this study was to establish a new combined trauma model in pigs to investigate fracture-associated local inflammation and gain information about the early inflammatory stages after polytrauma. Material and Methods. Combined trauma consisted of tibial fracture, lung contusion, liver laceration, and controlled hemorrhage. Animals were mechanically ventilated and under ICU-monitoring for 48 h. Blood and fracture hematoma samples were collected during the time course of the study. Local and systemic levels of serum cytokines and diverse alarmins were measured by ELISA kit. Results. A statistically significant difference in the systemic serum values of IL-6 and HMGB1 was observed when compared to the sham. Moreover, there was a statistically significant difference in the serum values of the fracture hematoma of IL-6, IL-8, IL-10, and HMGB1 when compared to the systemic inflammatory response. However, a decrease of local proinflammatory concentrations was observed while anti-inflammatory mediators increased. Conclusion. Our data showed a time-dependent activation of the local and systemic inflammatory response. Indeed, this is the first study focusing on the local and systemic inflammatory response to multiple trauma in a large animal model. PMID:25694748

  13. Knowledge, Perceptions, and Self-reported Performance of Hand Hygiene Among Registered Nurses at Community-based Hospitals in the Republic of Korea: A Cross-sectional Multi-center Study.

    PubMed

    Oh, Hyang Soon

    2018-05-01

    To assess nurses' hand hygiene (HH) knowledge, perception, attitude, and self-reported performance in small- and medium-sized hospitals after the Middle East Respiratory Syndrome outbreak. The structured questionnaire was adapted from the World Health Organization's survey. Data were collected between June 26 and July 14, 2017. Nurses showed scores on knowledge (17.6±2.5), perception (69.3±0.8), self-reported HH performance of non-self (86.0±11.0), self-reported performance of self (88.2±11.0), and attitude (50.5±5.5). The HH performance rate of non-self was Y1 = 36.678 + 0.555·X1 (HH performance rate of self) (adjusted R² = 0.280, p<0.001). The regression model for performance was Y4 = 18.302 + 0.247·X41 (perception) + 0.232·X42 (attitude) + 0.875·X43 (role model); all coefficients except attitude were statistically significant, and the model itself was statistically significant (adjusted R² = 0.191, p<0.001). An advanced HH education program should be developed and operated continuously. Perception, attitude, and role model were found to be significant predictors of HH performance of self, so these findings could be used in future HH promotion strategies for nurses.
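
The reported models are ordinary multiple linear regressions summarized by adjusted R². A generic least-squares fit producing that statistic (on synthetic data, not the survey's) might look like:

```python
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares with an intercept: returns the
    coefficient vector [b0, b1, ...] and adjusted R^2, the model form
    and fit statistic reported in the abstract."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    n, k = A.shape
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    adj_r2 = 1.0 - (ss_res / (n - k)) / (ss_tot / (n - 1))
    return beta, adj_r2

# noiseless synthetic data y = 2 + 3x recovers the coefficients exactly
x = np.arange(10.0).reshape(-1, 1)
beta, adj_r2 = fit_ols(x, 2.0 + 3.0 * np.arange(10.0))
```

With real survey data the predictors would be the perception, attitude, and role-model scores, and adjusted R² would land well below 1, as in the abstract's 0.191.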

  14. Thrombectomy for ischemic stroke: meta-analyses of recurrent strokes, vasospasms, and subarachnoid hemorrhages.

    PubMed

    Emprechtinger, Robert; Piso, Brigitte; Ringleb, Peter A

    2017-03-01

    Mechanical thrombectomy with stent retrievers is an effective treatment for patients with ischemic stroke. Recent meta-analyses report that the treatment is safe. However, the endpoints recurrent stroke, vasospasm, and subarachnoid hemorrhage have not been evaluated sufficiently. Hence, we extracted data on these outcomes from the five recent thrombectomy trials (MR CLEAN, ESCAPE, REVASCAT, SWIFT PRIME, and EXTEND IA, published in 2015). Subsequently, we conducted meta-analyses for each outcome. We report the results of the fixed as well as the random effects model. Three studies reported data on recurrent strokes. While the results did not reach statistical significance in the random effects model (despite a threefold elevated risk), the fixed effects model revealed a significantly higher rate of recurrent strokes after thrombectomy. Four studies reported data on subarachnoid hemorrhage. The higher pooled rates in the intervention groups were statistically significant in both the fixed and the random effects models. One study reported on vasospasms. We recorded 14 events in the intervention group and none in the control group. The efficacy of mechanical thrombectomy is not questioned, yet our results indicate an increased risk for recurrent strokes, subarachnoid hemorrhage, and vasospasms post-treatment. Therefore, we strongly recommend thorough surveillance of these adverse events in future clinical trials and routine registries.
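
The fixed and random effects models referred to above can be sketched with inverse-variance pooling and the DerSimonian-Laird heterogeneity estimator. The two hypothetical studies below show how heterogeneity moves the random-effects estimate relative to the fixed-effect one, which is why the two models can disagree on significance:

```python
def pool_effects(effects, variances):
    """Inverse-variance fixed-effect pooling and DerSimonian-Laird
    random-effects pooling of study effect sizes (e.g. log odds ratios)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q and the DerSimonian-Laird between-study variance tau^2
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    wr = [1.0 / (v + tau2) for v in variances]
    rand = sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
    return fixed, rand

# hypothetical heterogeneous studies: random-effects weighting evens out
# the study weights and pulls the estimate toward the unweighted mean
fixed, rand = pool_effects([0.0, 1.0], [0.01, 0.09])
```

With heterogeneous studies the random-effects estimate also carries a wider confidence interval, which is how a threefold risk elevation can remain non-significant under that model while the fixed-effect analysis is significant.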

  15. Eutrophication risk assessment in coastal embayments using simple statistical models.

    PubMed

    Arhonditsis, G; Eleftheriadou, M; Karydis, M; Tsirtsis, G

    2003-09-01

    A statistical methodology is proposed for assessing the risk of eutrophication in marine coastal embayments. The procedure followed was the development of regression models relating the levels of chlorophyll a (Chl) with the concentration of the limiting nutrient--usually nitrogen--and the renewal rate of the systems. The method was applied in the Gulf of Gera, Island of Lesvos, Aegean Sea and a surrogate for renewal rate was created using the Canberra metric as a measure of the resemblance between the Gulf and the oligotrophic waters of the open sea in terms of their physical, chemical and biological properties. The Chl-total dissolved nitrogen-renewal rate regression model was the most significant, accounting for 60% of the variation observed in Chl. Predicted distributions of Chl for various combinations of the independent variables, based on Bayesian analysis of the models, enabled comparison of the outcomes of specific scenarios of interest as well as further analysis of the system dynamics. The present statistical approach can be used as a methodological tool for testing the resilience of coastal ecosystems under alternative managerial schemes and levels of exogenous nutrient loading.
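
The renewal-rate surrogate was built with the Canberra metric as a resemblance measure between the gulf's waters and open-sea reference waters. The metric itself is simple; the sample vectors below are hypothetical standardized water-quality variables, not the study's data:

```python
def canberra(u, v):
    """Canberra metric between two multivariate water samples: the sum
    of |a-b| / (|a|+|b|) over variables, skipping all-zero pairs.
    Larger values mean the samples are less similar."""
    return sum(abs(a - b) / (abs(a) + abs(b))
               for a, b in zip(u, v) if a != 0 or b != 0)

# hypothetical variables (e.g. nutrients, Chl, salinity) for two samples
d_same = canberra([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
d_diff = canberra([0.0, 1.0, 3.0], [2.0, 1.0, 3.0])
```

Each variable contributes at most 1 to the sum, so the metric naturally balances properties measured on very different scales, which suits mixed physical, chemical, and biological data.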

  16. Common pitfalls in statistical analysis: Clinical versus statistical significance

    PubMed Central

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2015-01-01

    In clinical research, study results, which are statistically significant are often interpreted as being clinically important. While statistical significance indicates the reliability of the study results, clinical significance reflects its impact on clinical practice. The third article in this series exploring pitfalls in statistical analysis clarifies the importance of differentiating between statistical significance and clinical significance. PMID:26229754

  17. Time to significant pain reduction following DETP application vs placebo for acute soft tissue injuries.

    PubMed

    Yanchick, J; Magelli, M; Bodie, J; Sjogren, J; Rovati, S

    2010-08-01

    Nonsteroidal anti-inflammatory drugs (NSAIDs) provide fast and effective acute pain relief, but systemic administration has increased risk for some adverse reactions. The diclofenac epolamine 1.3% topical patch (DETP) is a topical NSAID with demonstrated safety and efficacy in treatment of acute pain from minor soft tissue injuries. Significant pain reduction has been observed in clinical trials within several hours following DETP application, suggesting rapid pain relief; however, this has not been extensively studied for topical NSAIDs in general. This retrospective post-hoc analysis examined time to onset of significant pain reduction after DETP application compared to a placebo patch for patients with mild-to-moderate acute ankle sprain, evaluating the primary efficacy endpoint from two nearly identical studies. Data from two double-blind, randomized, parallel-group, placebo-controlled studies (N = 274) of safety and efficacy of the DETP applied once daily for 7 days for acute ankle sprain were evaluated post-hoc using statistical modeling to estimate time to onset of significant pain reduction following DETP application. Outcome measures included pain on active movement on a 100 mm Visual Analog Scale (VAS) recorded in patient diaries, physician- and patient-assessed tolerability, and adverse events. DETP treatment resulted in significant pain reduction within approximately 3 hours compared to placebo. Within-treatment post-hoc analysis based on a statistical model suggested significant pain reduction occurred as early as 1.27 hours for the DETP group. The study may have been limited by the retrospective nature of the analyses. In both studies, the DETP was well tolerated with few adverse events, limited primarily to application site skin reactions. The DETP is an effective treatment for acute minor soft tissue injury, providing pain relief as rapidly as 1.27 hours post-treatment. Statistical modeling may be useful in estimating time to onset of pain relief for comparison of topical and oral NSAIDs.

  18. Diagnostic index of three-dimensional osteoarthritic changes in temporomandibular joint condylar morphology

    PubMed Central

    Gomes, Liliane R.; Gomes, Marcelo; Jung, Bryan; Paniagua, Beatriz; Ruellas, Antonio C.; Gonçalves, João Roberto; Styner, Martin A.; Wolford, Larry; Cevidanes, Lucia

    2015-01-01

    Abstract. This study aimed to investigate imaging statistical approaches for classifying three-dimensional (3-D) osteoarthritic morphological variations among 169 temporomandibular joint (TMJ) condyles. Cone-beam computed tomography scans were acquired from 69 subjects with long-term TMJ osteoarthritis (OA), 15 subjects at initial diagnosis of OA, and 7 healthy controls. Three-dimensional surface models of the condyles were constructed and SPHARM-PDM established correspondent points on each model. Multivariate analysis of covariance and direction-projection-permutation (DiProPerm) were used for testing statistical significance of the differences between the groups determined by clinical and radiographic diagnoses. Unsupervised classification using hierarchical agglomerative clustering was then conducted. Compared with healthy controls, the average OA condyle was significantly smaller in all dimensions except its anterior surface. Significant flattening of the lateral pole was noticed at initial diagnosis. We observed areas of 3.88-mm bone resorption at the superior surface and 3.10-mm bone apposition at the anterior aspect of the long-term OA average model. DiProPerm supported a significant difference between the healthy control and OA group (p-value=0.001). Clinically meaningful unsupervised classification of TMJ condylar morphology determined a preliminary diagnostic index of 3-D osteoarthritic changes, which may be the first step towards a more targeted diagnosis of this condition. PMID:26158119

  19. Orthotopic bladder substitution in men revisited: identification of continence predictors.

    PubMed

    Koraitim, M M; Atta, M A; Foda, M K

    2006-11-01

    We determined the impact of the functional characteristics of the neobladder and urethral sphincter on continence results, and determined the most significant predictors of continence. A total of 88 male patients 29 to 70 years old underwent orthotopic bladder substitution with tubularized ileocecal segment (40) and detubularized sigmoid (25) or ileum (23). Uroflowmetry, cystometry and urethral pressure profilometry were performed at 13 to 36 months (mean 19) postoperatively. The correlation between urinary continence and 28 urodynamic variables was assessed. Parameters that correlated significantly with continence were entered into a multivariate analysis using a logistic regression model to determine the most significant predictors of continence. Maximum urethral closure pressure was the only parameter that showed a statistically significant correlation with diurnal continence. Nocturnal continence had not only a statistically significant positive correlation with maximum urethral closure pressure, but also statistically significant negative correlations with maximum contraction amplitude, and baseline pressure at mid and maximum capacity. Three of these 4 parameters, including maximum urethral closure pressure, maximum contraction amplitude and baseline pressure at mid capacity, proved to be significant predictors of continence on multivariate analysis. While daytime continence is determined by maximum urethral closure pressure, during the night it is the net result of 2 forces that have about equal influence but in opposite directions, that is maximum urethral closure pressure vs maximum contraction amplitude plus baseline pressure at mid capacity. 
Two equations were derived from the logistic regression model to predict the probability of continence after orthotopic bladder substitution, including Z1 (diurnal) = 0.605 + 0.0085 maximum urethral closure pressure and Z2 (nocturnal) = 0.841 + 0.01 [maximum urethral closure pressure - (maximum contraction amplitude + baseline pressure at mid capacity)].
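
    Assuming the standard logistic link (the abstract reports the linear predictors Z1 and Z2 but not the transform), the continence probabilities follow as p = e^Z / (1 + e^Z). The patient values below are hypothetical:

```python
import math

def continence_probability(z):
    """Logistic transform: probability = exp(Z) / (1 + exp(Z))."""
    return 1.0 / (1.0 + math.exp(-z))

def z_diurnal(mucp):
    """Z1 (diurnal) = 0.605 + 0.0085 * maximum urethral closure pressure."""
    return 0.605 + 0.0085 * mucp

def z_nocturnal(mucp, mca, bp_mid):
    """Z2 (nocturnal) = 0.841 + 0.01 * [MUCP - (MCA + baseline pressure
    at mid capacity)]."""
    return 0.841 + 0.01 * (mucp - (mca + bp_mid))

# Hypothetical patient: MUCP 60, MCA 30, baseline pressure at mid capacity 15.
p_day = continence_probability(z_diurnal(60))
p_night = continence_probability(z_nocturnal(60, 30, 15))
```

    The nocturnal equation encodes the abstract's conclusion directly: contraction amplitude and baseline pressure enter with the same weight as closure pressure but the opposite sign.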

  20. Statistical Method to Overcome Overfitting Issue in Rational Function Models

    NASA Astrophysics Data System (ADS)

    Alizadeh Moghaddam, S. H.; Mokhtarzade, M.; Alizadeh Naeini, A.; Alizadeh Moghaddam, S. A.

    2017-09-01

    Rational function models (RFMs) are known as one of the most appealing models, extensively applied in geometric correction of satellite images and map production. Overfitting is a common issue in the case of terrain-dependent RFMs that degrades the accuracy of RFM-derived geospatial products. This issue, resulting from the high number of RFM parameters, leads to ill-posedness of the RFMs. To tackle this problem, in this study, a fast and robust statistical approach is proposed and compared to the Tikhonov regularization (TR) method, a frequently used remedy for RFM overfitting. In the proposed method, a statistical significance test is applied to search for the RFM parameters that are resistant to the overfitting issue. The performance of the proposed method was evaluated on two real data sets of Cartosat-1 satellite images. The obtained results demonstrate the efficiency of the proposed method in terms of the achievable level of accuracy. Indeed, this technique shows an improvement of 50-80% over TR.
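
    The TR baseline mentioned here amounts to ridge-penalized least squares. A generic sketch of why it stabilizes an ill-conditioned fit, using a toy polynomial design as a stand-in for RFM terms (not the authors' actual pipeline):

```python
import numpy as np

# Few observations, many near-collinear terms: the classic overfitting setup.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 12)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.05, x.size)

degree = 9
X = np.vander(x, degree + 1)    # ill-conditioned, overparameterized design

# Plain least squares tends to produce large, unstable coefficients here.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Tikhonov (ridge): solve (X'X + lambda I) beta = X'y, shrinking the solution.
lam = 1e-3
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(degree + 1), X.T @ y)

norm_ols = np.linalg.norm(beta_ols)
norm_ridge = np.linalg.norm(beta_ridge)
```

    The proposed alternative in the paper instead drops statistically non-significant parameters outright, which avoids choosing a regularization weight.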

  1. Segmented regression analysis of interrupted time series data to assess outcomes of a South American road traffic alcohol policy change.

    PubMed

    Nistal-Nuño, Beatriz

    2017-09-01

    In Chile, a new law introduced in March 2012 decreased the legal blood alcohol concentration (BAC) limit for driving while impaired from 1 to 0.8 g/l and the legal BAC limit for driving under the influence of alcohol from 0.5 to 0.3 g/l. The goal is to assess the impact of this new law on mortality and morbidity outcomes in Chile. A review of national databases in Chile was conducted from January 2003 to December 2014. Segmented regression analysis of interrupted time series was used for analyzing the data. In a series of multivariable linear regression models, the change in intercept and slope in the monthly incidence rate of traffic deaths and injuries associated with alcohol per 100,000 inhabitants was estimated from pre-intervention to post-intervention, while controlling for secular changes. In nested regression models, potential confounding seasonal effects were accounted for. All analyses were performed at a two-sided significance level of 0.05. Immediate level drops in all the monthly rates were observed after the law, relative to the end of the prelaw period, in the majority of models and in all the de-seasonalized models, although statistical significance was reached only in the model for injuries related to alcohol. After the law, the estimated monthly rate dropped abruptly by -0.869 for injuries related to alcohol and by -0.859 when adjusting for seasonality (P < 0.001). Regarding the postlaw long-term trends, a steeper decreasing trend after the law was evident in the models for deaths related to alcohol, although these differences were not statistically significant. Strong evidence of a reduction in traffic injuries related to alcohol was found following the law in Chile. Although the beneficial effects seen on deaths and overall injuries did not reach statistical significance, potential clinically important effects cannot be ruled out. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
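
    The core segmented-regression design (an immediate level change plus a slope change at the intervention month) can be sketched as follows, on simulated rates rather than the Chilean registry data:

```python
import numpy as np

# Interrupted time series, Jan 2003 .. Dec 2014 (144 months); the law
# takes effect at month 110 (March 2012). Data are simulated.
rng = np.random.default_rng(2)
months = np.arange(144)
law = (months >= 110).astype(float)                  # post-law indicator
time_after = np.where(months >= 110, months - 110.0, 0.0)

# True process: slow secular decline, an immediate -0.87 level drop at
# the law, a steeper decline afterwards, plus noise.
y = (5.0 - 0.01 * months - 0.87 * law - 0.02 * time_after
     + rng.normal(0, 0.1, months.size))

# Design: intercept, pre-existing trend, level change, slope change.
X = np.column_stack([np.ones(144), months, law, time_after])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change, slope_change = beta[2], beta[3]
```

    The coefficient on the indicator recovers the abrupt drop (compare the reported -0.869), while the coefficient on `time_after` recovers the post-law change in trend.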

  2. Measuring the Pharmacokinetic Properties of Drugs with a Novel Surgical Rat Model.

    PubMed

    Christakis, Ioannis; Scott, Rebecca; Minnion, James; Cuenco, Joyceline; Tan, Tricia; Palazzo, Fausto; Bloom, Stephen

    2017-06-01

    Purpose/aim of the study: The pharmacokinetic (PK) parameters in animal models can help optimize novel candidate drugs prior to human trials. However, due to the complexity of pharmacokinetic experiments, their use is limited in academia. We present a novel surgical rat model for investigation of pharmacokinetic parameters and its use in an anti-obesity drug development program. The model uses anesthetized male Wistar rats fitted with jugular and femoral catheters and an insulin pump for peptide infusion. The following pharmacokinetic parameters were measured: metabolic clearance rate (MCR), half-life, and volume of distribution (Vd). Glucagon-like peptide 1 (GLP-1), glucagon (GCG), and exendin-4 (Ex-4) were used to validate the model. The pharmacokinetic parameters of anti-obesity drug candidates X1, X2, and X3 were measured. GLP-1 had a significantly higher MCR (83.9 ± 14.1 mL/min/kg) compared to GCG (40.7 ± 14.3 mL/min/kg) and Ex-4 (10.1 ± 2.5 mL/min/kg) (p < .01 and p < .001 respectively). Ex-4 had a statistically significant longer half-life (35.1 ± 7.4 min) compared to both GCG (3.2 ± 1.7 min) and GLP-1 (1.2 ± 0.4 min) (p < .01 for both GCG and GLP-1). Ex-4 had a statistically significant higher volume of distribution (429.7 ± 164.9 mL/kg) compared to both GCG (146.8 ± 49.6 mL/kg) and GLP-1 (149.7 ± 53.5 mL/kg) (p < .01 for both GCG and GLP-1). Peptide X3 had a statistically significant longer half-life (21.3 ± 3.5 min) compared to both X1 (3.9 ± 0.4 min) and X2 (16.1 ± 2.8 min) (p < .001 for both X1 and X2). We present an affordable and easily accessible platform for the measurement of PK parameters of peptides. This novel surgical rat model produces consistent and reproducible results while minimizing animal use.
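
    The three reported parameters are tied together by standard constant-infusion relationships: MCR = infusion rate / steady-state concentration, k = ln 2 / half-life, and Vd = MCR / k. The inputs below are hypothetical GLP-1-like values, not the study's raw data:

```python
import math

def metabolic_clearance_rate(infusion_rate, steady_state_conc):
    """MCR = infusion rate / steady-state plasma concentration."""
    return infusion_rate / steady_state_conc

def elimination_constant(half_life_min):
    """First-order elimination rate constant k = ln 2 / half-life."""
    return math.log(2) / half_life_min

def volume_of_distribution(mcr, k):
    """Vd = clearance / elimination rate constant."""
    return mcr / k

# Hypothetical peptide: 2 pmol/kg/min infused, plateau at 0.024 pmol/mL.
mcr = metabolic_clearance_rate(2.0, 0.024)   # mL/min/kg
k = elimination_constant(1.2)                # from a 1.2 min half-life
vd = volume_of_distribution(mcr, k)          # mL/kg
```

    With these made-up inputs the three numbers come out mutually consistent with the magnitudes reported for GLP-1 (roughly 83 mL/min/kg, 1.2 min, 145 mL/kg), illustrating that any two of the parameters determine the third.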

  3. A Statistical Analysis of the Economic Drivers of Battery Energy Storage in Commercial Buildings: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Matthew; Simpkins, Travis; Cutler, Dylan

    There is significant interest in using battery energy storage systems (BESS) to reduce peak demand charges, and therefore the life cycle cost of electricity, in commercial buildings. This paper explores the drivers of economic viability of BESS in commercial buildings through statistical analysis. A sample population of buildings was generated, a techno-economic optimization model was used to size and dispatch the BESS, and the resulting optimal BESS sizes were analyzed for relevant predictor variables. Explanatory regression analyses were used to demonstrate that peak demand charges are the most significant predictor of an economically viable battery, and that the shape of the load profile is the most significant predictor of the size of the battery.
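
    The explanatory-regression step can be sketched with hand-rolled OLS t-statistics. The data below are synthetic, and the predictor names are stand-ins for the paper's variables:

```python
import numpy as np

# Synthetic buildings: battery size driven by the demand charge, with a
# second, uninformative predictor for contrast.
rng = np.random.default_rng(3)
n = 200
demand_charge = rng.uniform(2, 40, n)    # $/kW, hypothetical tariff range
load_factor = rng.uniform(0.2, 0.9, n)   # deliberately unrelated to size
battery_kwh = 5.0 * demand_charge + rng.normal(0, 20, n)

# OLS fit plus coefficient t-statistics (no statsmodels dependency).
X = np.column_stack([np.ones(n), demand_charge, load_factor])
beta, *_ = np.linalg.lstsq(X, battery_kwh, rcond=None)
resid = battery_kwh - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)
t_stats = beta / np.sqrt(np.diag(cov))
```

    In this synthetic setup only the demand-charge coefficient comes out significant, mirroring the kind of predictor-screening result the paper reports.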

  4. A Statistical Analysis of the Economic Drivers of Battery Energy Storage in Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Matthew; Simpkins, Travis; Cutler, Dylan

    There is significant interest in using battery energy storage systems (BESS) to reduce peak demand charges, and therefore the life cycle cost of electricity, in commercial buildings. This paper explores the drivers of economic viability of BESS in commercial buildings through statistical analysis. A sample population of buildings was generated, a techno-economic optimization model was used to size and dispatch the BESS, and the resulting optimal BESS sizes were analyzed for relevant predictor variables. Explanatory regression analyses were used to demonstrate that peak demand charges are the most significant predictor of an economically viable battery, and that the shape of the load profile is the most significant predictor of the size of the battery.

  5. Predictive modeling of outcomes following definitive chemoradiotherapy for oropharyngeal cancer based on FDG-PET image characteristics

    NASA Astrophysics Data System (ADS)

    Folkert, Michael R.; Setton, Jeremy; Apte, Aditya P.; Grkovski, Milan; Young, Robert J.; Schöder, Heiko; Thorstad, Wade L.; Lee, Nancy Y.; Deasy, Joseph O.; Oh, Jung Hun

    2017-07-01

    In this study, we investigate the use of imaging feature-based outcomes research (‘radiomics’) combined with machine learning techniques to develop robust predictive models for the risk of all-cause mortality (ACM), local failure (LF), and distant metastasis (DM) following definitive chemoradiation therapy (CRT). One hundred seventy four patients with stage III-IV oropharyngeal cancer (OC) treated at our institution with CRT with retrievable pre- and post-treatment 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) scans were identified. From pre-treatment PET scans, 24 representative imaging features of FDG-avid disease regions were extracted. Using machine learning-based feature selection methods, multiparameter logistic regression models were built incorporating clinical factors and imaging features. All model building methods were tested by cross validation to avoid overfitting, and final outcome models were validated on an independent dataset from a collaborating institution. Multiparameter models were statistically significant on 5 fold cross validation with the area under the receiver operating characteristic curve (AUC)  =  0.65 (p  =  0.004), 0.73 (p  =  0.026), and 0.66 (p  =  0.015) for ACM, LF, and DM, respectively. The model for LF retained significance on the independent validation cohort with AUC  =  0.68 (p  =  0.029) whereas the models for ACM and DM did not reach statistical significance, but resulted in comparable predictive power to the 5 fold cross validation with AUC  =  0.60 (p  =  0.092) and 0.65 (p  =  0.062), respectively. In the largest study of its kind to date, predictive features including increasing metabolic tumor volume, increasing image heterogeneity, and increasing tumor surface irregularity significantly correlated to mortality, LF, and DM on 5 fold cross validation in a relatively uniform single-institution cohort. The LF model also retained significance in an independent population.
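
    The AUCs above are rank statistics; a minimal Mann-Whitney implementation on synthetic scores makes the endpoint concrete (this is not the study's radiomics pipeline):

```python
import numpy as np

def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney form: the probability
    that a random positive outscores a random negative (ties count 1/2)."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(4)
y = rng.integers(0, 2, 300)
noise_scores = rng.normal(0, 1, 300)    # uninformative model -> AUC near 0.5
good_scores = noise_scores + 1.5 * y    # adds real signal -> AUC well above 0.5
a_noise = auc(y, noise_scores)
a_good = auc(y, good_scores)
```

    An AUC of 0.5 is chance level, which is why the paper's cross-validated values of 0.65-0.73 with small p-values count as modest but real predictive power.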

  6. Structure of Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition Criteria for Obsessive–Compulsive Personality Disorder in Patients With Binge Eating Disorder

    PubMed Central

    Ansell, Emily B; Pinto, Anthony; Edelen, Maria Orlando; Grilo, Carlos M

    2013-01-01

    Objective To examine 1-, 2-, and 3-factor model structures through confirmatory analytic procedures for Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) obsessive–compulsive personality disorder (OCPD) criteria in patients with binge eating disorder (BED). Method Participants were consecutive outpatients (n = 263) with binge eating disorder and were assessed with semi-structured interviews. The 8 OCPD criteria were submitted to confirmatory factor analyses in Mplus Version 4.2 (Los Angeles, CA) in which previously identified factor models of OCPD were compared for fit, theoretical relevance, and parsimony. Nested models were compared for significant improvements in model fit. Results Evaluation of indices of fit in combination with theoretical considerations suggest a multifactorial model is a significant improvement in fit over the current DSM-IV single-factor model of OCPD. Though the data support both 2- and 3-factor models, the 3-factor model is hindered by an underspecified third factor. Conclusion A multifactorial model of OCPD incorporating the factors perfectionism and rigidity represents the best compromise of fit and theory in modelling the structure of OCPD in patients with BED. A third factor representing miserliness may be relevant in BED populations but needs further development. The perfectionism and rigidity factors may represent distinct intrapersonal and interpersonal attempts at control and may have implications for the assessment of OCPD. PMID:19087485

  7. Structure of diagnostic and statistical manual of mental disorders, fourth edition criteria for obsessive-compulsive personality disorder in patients with binge eating disorder.

    PubMed

    Ansell, Emily B; Pinto, Anthony; Edelen, Maria Orlando; Grilo, Carlos M

    2008-12-01

    To examine 1-, 2-, and 3-factor model structures through confirmatory analytic procedures for Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) obsessive-compulsive personality disorder (OCPD) criteria in patients with binge eating disorder (BED). Participants were consecutive outpatients (n = 263) with binge eating disorder and were assessed with semi-structured interviews. The 8 OCPD criteria were submitted to confirmatory factor analyses in Mplus Version 4.2 (Los Angeles, CA) in which previously identified factor models of OCPD were compared for fit, theoretical relevance, and parsimony. Nested models were compared for significant improvements in model fit. Evaluation of indices of fit in combination with theoretical considerations suggest a multifactorial model is a significant improvement in fit over the current DSM-IV single-factor model of OCPD. Though the data support both 2- and 3-factor models, the 3-factor model is hindered by an underspecified third factor. A multifactorial model of OCPD incorporating the factors perfectionism and rigidity represents the best compromise of fit and theory in modelling the structure of OCPD in patients with BED. A third factor representing miserliness may be relevant in BED populations but needs further development. The perfectionism and rigidity factors may represent distinct intrapersonal and interpersonal attempts at control and may have implications for the assessment of OCPD.
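
    The nested-model comparison described here is typically carried out as a chi-square difference test. A minimal sketch with hypothetical fit statistics (the critical values are standard chi-square table entries; the fit values are NOT the paper's):

```python
# Upper 5% critical values of the chi-square distribution (standard table).
CHI2_CRIT_05 = {1: 3.84, 2: 5.99, 3: 7.81}

def chi2_difference_test(chi2_restricted, df_restricted, chi2_full, df_full):
    """Compare nested models: the restricted model never fits better, so
    delta_chi2 >= 0; significance means the fuller model improves fit."""
    d_chi2 = chi2_restricted - chi2_full
    d_df = df_restricted - df_full
    return d_chi2, d_df, d_chi2 > CHI2_CRIT_05[d_df]

# Hypothetical fit: 1-factor chi2 = 92.4 on 20 df vs 2-factor chi2 = 61.0
# on 19 df (illustrative numbers only).
d_chi2, d_df, significant = chi2_difference_test(92.4, 20, 61.0, 19)
```

    A significant difference licenses the move from the single-factor DSM-IV structure to the multifactorial model, exactly the inferential step the abstract describes.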

  8. Standard and reduced radiation dose liver CT images: adaptive statistical iterative reconstruction versus model-based iterative reconstruction-comparison of findings and image quality.

    PubMed

    Shuman, William P; Chan, Keith T; Busey, Janet M; Mitsumori, Lee M; Choi, Eunice; Koprowicz, Kent M; Kanal, Kalpana M

    2014-12-01

    To investigate whether reduced radiation dose liver computed tomography (CT) images reconstructed with model-based iterative reconstruction (MBIR) might compromise depiction of clinically relevant findings or might have decreased image quality when compared with clinical standard radiation dose CT images reconstructed with adaptive statistical iterative reconstruction (ASIR). With institutional review board approval, informed consent, and HIPAA compliance, 50 patients (39 men, 11 women) were prospectively included who underwent liver CT. After a portal venous pass with ASIR images, a 60% reduced radiation dose pass was added with MBIR images. One reviewer scored ASIR image quality and marked findings. Two additional independent reviewers noted whether marked findings were present on MBIR images and assigned scores for relative conspicuity, spatial resolution, image noise, and image quality. Liver and aorta Hounsfield units and image noise were measured. Volume CT dose index and size-specific dose estimate (SSDE) were recorded. Qualitative reviewer scores were summarized. Formal statistical inference for signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), volume CT dose index, and SSDE was made (paired t tests), with Bonferroni adjustment. Two independent reviewers identified all 136 ASIR image findings (n = 272) on MBIR images, scoring them as equal or better for conspicuity, spatial resolution, and image noise in 94.1% (256 of 272), 96.7% (263 of 272), and 99.3% (270 of 272), respectively. In 50 image sets, two reviewers (n = 100) scored overall image quality as sufficient or good with MBIR in 99% (99 of 100). Liver SNR was significantly greater for MBIR (10.8 ± 2.5 [standard deviation] vs 7.7 ± 1.4, P < .001); there was no difference for CNR (2.5 ± 1.4 vs 2.4 ± 1.4, P = .45). For ASIR and MBIR, respectively, volume CT dose index was 15.2 mGy ± 7.6 versus 6.2 mGy ± 3.6; SSDE was 16.4 mGy ± 6.6 versus 6.7 mGy ± 3.1 (P < .001). Liver CT images reconstructed with MBIR may allow up to 59% radiation dose reduction compared with the dose with ASIR, without compromising depiction of findings or image quality. © RSNA, 2014.
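
    The paired t tests with Bonferroni adjustment used for the SNR comparison can be sketched generically; the per-patient SNR differences below are simulated stand-ins, not the study's measurements:

```python
import math
import random

def paired_t(diffs):
    """Paired t statistic and degrees of freedom for a list of differences."""
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n), n - 1

# Simulated per-patient liver SNR differences (MBIR minus ASIR), centred
# near +3.1 to echo the reported 10.8 vs 7.7; invented values.
random.seed(5)
diffs = [random.gauss(3.1, 1.5) for _ in range(50)]
t, df = paired_t(diffs)

# With 4 comparisons, Bonferroni-adjusted alpha = 0.05 / 4 = 0.0125; the
# corresponding two-sided critical |t| for df = 49 is about 2.6.
significant = t > 2.6
```

    Pairing each patient with themselves removes between-patient variability, which is why a 3-unit SNR gap is so decisively significant at n = 50.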

  9. Combined statistical and mechanistic modelling suggests food and temperature effects on survival of early life stages of Northeast Arctic cod (Gadus morhua)

    NASA Astrophysics Data System (ADS)

    Stige, Leif Chr.; Langangen, Øystein; Yaragina, Natalia A.; Vikebø, Frode B.; Bogstad, Bjarte; Ottersen, Geir; Stenseth, Nils Chr.; Hjermann, Dag Ø.

    2015-05-01

    Understanding the causes of the large interannual fluctuations in the recruitment to many marine fishes is a key challenge in fisheries ecology. We here propose that the combination of mechanistic and statistical modelling of the pelagic early life stages (ELS) prior to recruitment can be a powerful approach for improving our understanding of local-scale and population-scale dynamics. Specifically, this approach allows separating effects of ocean transport and survival, and thereby enhances the knowledge of the processes that regulate recruitment. We analyse data on the pelagic eggs, larvae and post-larvae of Northeast Arctic cod and on copepod nauplii, the main prey of the cod larvae. The data originate from two surveys, one in spring and one in summer, for 30 years. A coupled physical-biological model is used to simulate the transport, ambient temperature and development of cod ELS from spawning through spring and summer. The predictions from this model are used as input in a statistical analysis of the summer data, to investigate effects of covariates thought to be linked to growth and survival. We find significant associations between the local-scale ambient copepod nauplii concentration and temperature in spring and the local-scale occurrence of cod (post)larvae in summer, consistent with effects on survival. Moreover, years with low copepod nauplii concentrations and low temperature in spring are significantly associated with lower mean length of the cod (post)larvae in summer, likely caused in part by higher mortality leading to increased dominance of young and hence small individuals. Finally, we find that the recruitment at age 3 is strongly associated with the mean body length of the cod ELS, highlighting the biological significance of the findings.

  10. Evaluation of the learning curve of non-penetrating glaucoma surgery.

    PubMed

    Aslan, Fatih; Yuce, Berna; Oztas, Zafer; Ates, Halil

    2017-08-11

    To evaluate the learning curve of non-penetrating glaucoma surgery (NPGS). The study included 32 eyes of 27 patients (20 male and 7 female) with medically uncontrolled glaucoma. Non-penetrating glaucoma surgeries performed by trainees under the supervision of an experienced surgeon between 2005 and 2007 at our tertiary referral hospital were evaluated. Residents were separated into two groups. A humanistic training model was applied to the resident in the first group, who practiced with experimental models before performing NPGS. Two residents in the second group performed NPGS after a conventional training model. Surgeries of the residents were recorded on video and intraoperative parameters were scored by the experienced surgeon at the end of the study. Postoperative intraocular pressure and absolute and total success rates were analyzed. In the first group 19 eyes of 16 patients and in the second group 13 eyes of 11 patients had been operated on by residents. Intraoperative parameters and complication rates did not differ significantly between groups (p > 0.05, Chi-square). The duration of surgery was 32.7 ± 5.6 min in the first group and 45 ± 3.8 min in the second group. The difference was statistically significant (p < 0.001, Student's t test). Absolute and total success rates were 68.8 and 93.8% in the first group and 62.5 and 87.5% in the second group, respectively. The difference was not statistically significant. Humanistic and conventional training models under the supervision of an experienced surgeon are safe and effective for senior residents who manage phacoemulsification surgery in routine cataract cases. Senior residents can practice these surgical techniques with reasonable complication rates.

  11. The effects of a hardiness educational intervention on hardiness and perceived stress of junior baccalaureate nursing students.

    PubMed

    Jameson, Paula R

    2014-04-01

    Baccalaureate nursing education is stressful. The stress encompasses a range of academic, personal, clinical, and social reasons. A hardiness educational program, a tool for stress management, based on theory, research, and practice, exists to enhance the attitudes and coping strategies of hardiness (Maddi, 2007; Maddi et al., 2002). Research has shown that students who completed the hardiness educational program, subsequently improved in grade point average (GPA), college retention rates, and health (Maddi et al., 2002). Little research has been done to explore the effects of hardiness education with junior baccalaureate nursing students. Early identification of hardiness, the need for hardiness education, or stress management in this population may influence persistence in and completion of a nursing program (Hensel and Stoelting-Gettelfinger, 2011). Therefore, the aims were to determine if an increase in hardiness and a decrease in perceived stress in junior baccalaureate nursing students occurred in those who participated in a hardiness intervention. The application of the Hardiness Model and the Roy Adaptation Model established connections and conceptual collaboration among stress, stimuli, adaptation, and hardi-coping. A quasi-experimental non-equivalent control group with pre-test and post-test was used with a convenience sample of full-time junior level baccalaureate nursing students. Data were collected from August 2011 to December 2011. Results of statistical analyses by paired t-tests revealed that the hardiness intervention did not have a statistically significant effect on increasing hardiness scores. The hardiness intervention did have a statistically significant effect on decreasing perceived stress scores. The significant decrease in perceived stress was congruent with the Hardiness Model and the Roy Adaptation Model. 
Further hardiness research among junior baccalaureate nursing students, utilizing the entire hardiness intervention, was recommended. © 2013.

  12. Comparison of two dental implant surface modifications on implants with same macrodesign: an experimental study in the pelvic sheep model.

    PubMed

    Ernst, Sabrina; Stübinger, Stefan; Schüpbach, Peter; Sidler, Michéle; Klein, Karina; Ferguson, Stephen J; von Rechenberg, Brigitte

    2015-08-01

    The aim of this study was to compare two different surfaces of one uniform macro-implant design in order to focus exclusively on the osseointegration properties after 2, 4 and 8 weeks and to discuss the animal model chosen. In six mature sheep, n = 36 implants with a highly crystalline and phosphate-enriched anodized titanium oxide surface (TiU) and n = 36 implants with a hydrophilic, sandblasted, large grit and acid-etched surface (SLA) were placed in the pelvic bone. TiU implants were custom-made to match the SLA implant design. The implant stability and bone-to-implant contact (BIC) were assessed by resonance frequency (ISQ), backscatter scanning electron microscopy (B-SEM), light microscopy (LM), micro-CT and intravital fluorochrome staining. Biomechanical removal torque testing was performed. Overall, no statistically significant differences in BIC total (trabecular + cortical) between TiU and SLA were found via LM and B-SEM. BIC values (B-SEM; LM) in both groups revealed a steady rise in trabecular bone attachment to the implant surface after 2, 4 and 8 weeks. In the 2- to 4-week time interval in the TiU group (P = 0.005) as well as in the SLA group (P = 0.01), a statistically significant increase in BIC trabecular could be observed via LM. B-SEM values confirmed the statistically significant increase for TiU (P = 0.001). In both groups, BIC trabecular values after 8 weeks were significantly higher (P ≤ 0.05) than after 2 weeks (B-SEM; LM). Biomechanical data confirmed the histological data. The two surfaces showed comparable osseointegration in this sheep model. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  13. A methodology for the stochastic generation of hourly synthetic direct normal irradiation time series

    NASA Astrophysics Data System (ADS)

    Larrañeta, M.; Moreno-Tejera, S.; Lillo-Bravo, I.; Silva-Pérez, M. A.

    2018-02-01

    Many of the available solar radiation databases only provide global horizontal irradiance (GHI), while there is a growing need for extensive databases of direct normal irradiance (DNI), mainly for the development of concentrated solar power and concentrated photovoltaic technologies. In the present work, we propose a methodology for the generation of synthetic DNI hourly data from the hourly average GHI values by dividing the irradiance into a deterministic and a stochastic component, intending to emulate the dynamics of the solar radiation. The deterministic component is modeled through a simple classical model. The stochastic component is fitted to measured data in order to maintain the consistency of the synthetic data with the state of the sky, generating statistically significant DNI data with a cumulative frequency distribution very similar to that of the measured data. The adaptation and application of the model to the location of Seville shows significant improvements in terms of frequency distribution over the classical models. Applied to other locations with different climatological characteristics, the proposed methodology also yields better results than the classical models in terms of frequency distribution, reaching a 50% reduction in the Finkelstein-Schafer (FS) and Kolmogorov-Smirnov test integral (KSI) statistics.
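
    The FS and KSI figures of merit mentioned above are both built from differences between empirical cumulative distributions of the synthetic and measured series. As a rough illustration (not the authors' implementation; the evaluation grid and its size are assumptions here), an FS-like statistic can be computed as the mean absolute CDF difference:

```python
import bisect

def empirical_cdf(values, grid):
    """Empirical CDF of `values` evaluated at each point of `grid`."""
    s = sorted(values)
    n = len(s)
    return [bisect.bisect_right(s, g) / n for g in grid]

def fs_statistic(measured, synthetic, bins=50):
    """Finkelstein-Schafer-style statistic: mean absolute CDF difference
    over an evenly spaced grid spanning both samples (illustrative binning)."""
    lo = min(min(measured), min(synthetic))
    hi = max(max(measured), max(synthetic))
    grid = [lo + (hi - lo) * i / (bins - 1) for i in range(bins)]
    cm = empirical_cdf(measured, grid)
    cs = empirical_cdf(synthetic, grid)
    diffs = [abs(a - b) for a, b in zip(cm, cs)]
    return sum(diffs) / len(diffs)
```

    The KSI statistic is analogous but integrates the absolute CDF difference over the irradiance range, usually normalized by a critical area derived from the Kolmogorov-Smirnov critical value.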

  14. Detection of Clostridium difficile infection clusters, using the temporal scan statistic, in a community hospital in southern Ontario, Canada, 2006-2011.

    PubMed

    Faires, Meredith C; Pearl, David L; Ciccotelli, William A; Berke, Olaf; Reid-Smith, Richard J; Weese, J Scott

    2014-05-12

    In hospitals, Clostridium difficile infection (CDI) surveillance relies on unvalidated guidelines or threshold criteria to identify outbreaks. This can result in false-positive and -negative cluster alarms. The application of statistical methods to identify and understand CDI clusters may be a useful alternative or complement to standard surveillance techniques. The objectives of this study were to investigate the utility of the temporal scan statistic for detecting CDI clusters and determine if there are significant differences in the rate of CDI cases by month, season, and year in a community hospital. Bacteriology reports of patients identified with a CDI from August 2006 to February 2011 were collected. For patients detected with CDI from March 2010 to February 2011, stool specimens were obtained. Clostridium difficile isolates were characterized by ribotyping and investigated for the presence of toxin genes by PCR. CDI clusters were investigated using a retrospective temporal scan test statistic. Statistically significant clusters were compared to known CDI outbreaks within the hospital. A negative binomial regression model was used to identify associations between year, season, month and the rate of CDI cases. Overall, 86 CDI cases were identified. Eighteen specimens were analyzed and nine ribotypes were classified with ribotype 027 (n = 6) the most prevalent. The temporal scan statistic identified significant CDI clusters at the hospital (n = 5), service (n = 6), and ward (n = 4) levels (P ≤ 0.05). Three clusters were concordant with the one C. difficile outbreak identified by hospital personnel. Two clusters were identified as potential outbreaks. The negative binomial model indicated years 2007-2010 (P ≤ 0.05) had decreased CDI rates compared to 2006 and spring had an increased CDI rate compared to the fall (P = 0.023). 
Application of the temporal scan statistic identified several clusters, including potential outbreaks not detected by hospital personnel. The identification of time periods with decreased or increased CDI rates may have been a result of specific hospital events. Understanding the clustering of CDIs can aid in the interpretation of surveillance data and lead to the development of better early detection systems.
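
    The retrospective temporal scan test used above slides windows of varying length over the case series and flags the window whose count is most improbable under a null model. A minimal sketch under a Poisson null (illustrative only; the study's actual implementation, and the Monte Carlo replication step that converts the likelihood ratio into a P-value, are not reproduced here):

```python
import math

def scan_llr(c, total, e):
    """Kulldorff-style Poisson log-likelihood ratio for a window with
    c observed cases and e expected cases, out of `total` cases overall."""
    if c <= e or c == 0:
        return 0.0
    llr = c * math.log(c / e)
    if total - c > 0:
        llr += (total - c) * math.log((total - c) / (total - e))
    return llr

def best_cluster(monthly_counts, max_window=6):
    """Return (llr, start, length) of the most unusual run of months,
    assuming cases are spread uniformly over time under the null."""
    total = sum(monthly_counts)
    n = len(monthly_counts)
    best = (0.0, 0, 0)
    for length in range(1, max_window + 1):
        for start in range(n - length + 1):
            c = sum(monthly_counts[start:start + length])
            e = total * length / n
            llr = scan_llr(c, total, e)
            if llr > best[0]:
                best = (llr, start, length)
    return best
```

    In practice, significance is obtained by re-running the scan on many series simulated under the null and ranking the observed maximum likelihood ratio among the simulated maxima.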

  15. Online Statistical Modeling (Regression Analysis) for Independent Responses

    NASA Astrophysics Data System (ADS)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Statistical models have been developed in various directions to handle various types of data and complex relationships. Rich varieties of advanced and recent statistical models are available, mostly in open source software (one of them being R). However, these advanced statistical models are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny) so that recent and advanced statistical modelling becomes readily available, accessible and applicable on the web. We have previously built interfaces in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of a data-analysis workflow, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and easier to compare, in order to find the most appropriate model for the data.

  16. Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.

    PubMed

    Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao

    2015-08-01

    Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis of combining data of multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing methods of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.

  17. Statistical iterative material image reconstruction for spectral CT using a semi-empirical forward model

    NASA Astrophysics Data System (ADS)

    Mechlem, Korbinian; Ehn, Sebastian; Sellerer, Thorsten; Pfeiffer, Franz; Noël, Peter B.

    2017-03-01

    In spectral computed tomography (spectral CT), the additional information about the energy dependence of attenuation coefficients can be exploited to generate material selective images. These images have found applications in various areas such as artifact reduction, quantitative imaging or clinical diagnosis. However, significant noise amplification on material decomposed images remains a fundamental problem of spectral CT. Most spectral CT algorithms separate the process of material decomposition and image reconstruction. Separating these steps is suboptimal because the full statistical information contained in the spectral tomographic measurements cannot be exploited. Statistical iterative reconstruction (SIR) techniques provide an alternative, mathematically elegant approach to obtaining material selective images with improved tradeoffs between noise and resolution. Furthermore, image reconstruction and material decomposition can be performed jointly. This is accomplished by a forward model which directly connects the (expected) spectral projection measurements and the material selective images. To obtain this forward model, detailed knowledge of the different photon energy spectra and the detector response was assumed in previous work. However, accurately determining the spectrum is often difficult in practice. In this work, a new algorithm for statistical iterative material decomposition is presented. It uses a semi-empirical forward model which relies on simple calibration measurements. Furthermore, an efficient optimization algorithm based on separable surrogate functions is employed. This partially negates one of the major shortcomings of SIR, namely high computational cost and long reconstruction times. Numerical simulations and real experiments show strongly improved image quality and reduced statistical bias compared to projection-based material decomposition.

  18. Empirical evidence for acceleration-dependent amplification factors

    USGS Publications Warehouse

    Borcherdt, R.D.

    2002-01-01

    Site-specific amplification factors, Fa and Fv, used in current U.S. building codes decrease with increasing base acceleration level as implied by the Loma Prieta earthquake at 0.1g and extrapolated using numerical models and laboratory results. The Northridge earthquake recordings of 17 January 1994 and subsequent geotechnical data permit empirical estimates of amplification at base acceleration levels up to 0.5g. Distance measures and normalization procedures used to infer amplification ratios from soil-rock pairs in predetermined azimuth-distance bins significantly influence the dependence of amplification estimates on base acceleration. Factors inferred using a hypocentral distance norm do not show a statistically significant dependence on base acceleration. Factors inferred using norms implied by the attenuation functions of Abrahamson and Silva show a statistically significant decrease with increasing base acceleration. The decrease is statistically more significant for stiff clay and sandy soil (site class D) sites than for stiffer sites underlain by gravely soils and soft rock (site class C). The decrease in amplification with increasing base acceleration is more pronounced for the short-period amplification factor, Fa, than for the midperiod factor, Fv.

  19. Dose fractionation theorem in 3-D reconstruction (tomography)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glaeser, R.M.

    It is commonly assumed that the large number of projections for single-axis tomography precludes its application to most beam-labile specimens. However, Hegerl and Hoppe have pointed out that the total dose required to achieve statistical significance for each voxel of a computed 3-D reconstruction is the same as that required to obtain a single 2-D image of that isolated voxel, at the same level of statistical significance. Thus a statistically significant 3-D image can be computed from statistically insignificant projections, as long as the total dosage that is distributed among these projections is high enough that it would have resulted in a statistically significant projection, if applied to only one image. We have tested this critical theorem by simulating the tomographic reconstruction of a realistic 3-D model created from an electron micrograph. The simulations verify the basic conclusions of the theorem under conditions of high absorption, signal-dependent noise, varying specimen contrast and missing angular range. Furthermore, the simulations demonstrate that individual projections in the series of fractionated-dose images can be aligned by cross-correlation because they contain significant information derived from the summation of features from different depths in the structure. This latter information is generally not useful for structural interpretation prior to 3-D reconstruction, owing to the complexity of most specimens investigated by single-axis tomography. These results, in combination with dose estimates for imaging single voxels and measurements of radiation damage in the electron microscope, demonstrate that it is feasible to use single-axis tomography with soft X-ray microscopy of frozen-hydrated specimens.
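
    The Hegerl-Hoppe argument rests on the additivity of Poisson counting statistics: a sum of independent low-dose exposures is statistically identical to a single exposure at the combined dose. A toy numerical check (not the article's simulation; the dose level, number of fractions and trial count below are arbitrary):

```python
import math, random

random.seed(42)

def poisson(lam):
    """Knuth's Poisson sampler (adequate for modest rates)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def mean(xs):
    return sum(xs) / len(xs)

total_dose = 50.0   # expected counts for one full-dose exposure of a voxel
fractions = 10      # number of low-dose projections sharing that dose
trials = 2000

full = [poisson(total_dose) for _ in range(trials)]
split = [sum(poisson(total_dose / fractions) for _ in range(fractions))
         for _ in range(trials)]

# A sum of independent Poisson counts is Poisson with the summed mean, so
# the fractionated series carries the same counting statistics as one exposure.
```

    Both sample means converge to the same expected count, which is the statistical content of the dose fractionation theorem at the single-voxel level.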

  20. Beyond δ: Tailoring marked statistics to reveal modified gravity

    NASA Astrophysics Data System (ADS)

    Valogiannis, Georgios; Bean, Rachel

    2018-01-01

    Models that seek to explain cosmic acceleration through modifications to general relativity (GR) evade stringent Solar System constraints through a restoring, screening mechanism. Down-weighting the high-density, screened regions in favor of the low-density, unscreened ones offers the potential to enhance the amount of information carried in such modified gravity models. In this work, we assess the performance of a new "marked" transformation and perform a systematic comparison with the clipping and logarithmic transformations, in the context of ΛCDM and the symmetron and f(R) modified gravity models. Performance is measured in terms of the fractional boost in the Fisher information and the signal-to-noise ratio (SNR) for these models relative to the statistics derived from the standard density distribution. We find that all three statistics provide improved Fisher boosts over the basic density statistics. The model parameters for the marked and clipped transformation that best enhance signals and the Fisher boosts are determined. We also show that the mark is useful both as a Fourier and real-space transformation; a marked correlation function also enhances the SNR relative to the standard correlation function, and can on mildly nonlinear scales show a significant difference between the ΛCDM and the modified gravity models. Our results demonstrate how a series of simple analytical transformations could dramatically increase the predicted information extracted on deviations from GR, from large-scale surveys, and give the prospect for a much more feasible potential detection.
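
    A "marked" statistic of the kind assessed here re-weights the density field before computing correlations, so that unscreened low-density regions contribute more. One commonly used functional form for the mark (the parameter values below are illustrative assumptions, not the paper's fitted values) is:

```python
def mark(delta, delta_s=4.0, p=1.0):
    """Mark m(delta) = ((1 + delta_s) / (1 + delta_s + delta))**p.
    Down-weights overdense (screened) regions: m -> 0 as delta grows,
    m > 1 in underdense regions. `delta` is the density contrast."""
    return ((1.0 + delta_s) / (1.0 + delta_s + delta)) ** p

# The marked field is m(delta) * (1 + delta); its correlation function or
# power spectrum then replaces the standard density statistics.
```

    The free parameters delta_s and p control how aggressively screened regions are suppressed, which is what the paper tunes to maximize the Fisher boost.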

  1. Statistical-dynamical modeling of the cloud-to-ground lightning activity in Portugal

    NASA Astrophysics Data System (ADS)

    Sousa, J. F.; Fragoso, M.; Mendes, S.; Corte-Real, J.; Santos, J. A.

    2013-10-01

    The present study employs a dataset of cloud-to-ground discharges over Portugal, collected by the Portuguese lightning detection network in the period of 2003-2009, to identify dynamically coherent lightning regimes in Portugal and to implement a statistical-dynamical modeling of the daily discharges over the country. For this purpose, the high-resolution MERRA reanalysis is used. Three lightning regimes are then identified for Portugal: WREG, WREM and SREG. WREG is a typical cold-core cut-off low. WREM is connected to strong frontal systems driven by remote low pressure systems at higher latitudes over the North Atlantic. SREG is a combination of an inverted trough and a mid-tropospheric cold core near Portugal. The statistical-dynamical modeling is based on logistic regressions (statistical component) developed for each regime separately (dynamical component). It is shown that the strength of the lightning activity (either strong or weak) for each regime is consistently modeled by a set of suitable dynamical predictors (65-70% efficiency). The difference in equivalent potential temperature in the 700-500 hPa layer is the best predictor for the three regimes, while the best 4-layer lifted index is also important for all regimes, though with much weaker significance. Six other predictors are more suitable for a specific regime. For the purpose of validating the modeling approach, a regional-scale climate model simulation is carried out for a very intense lightning episode.

  2. No-reference image quality assessment based on natural scene statistics and gradient magnitude similarity

    NASA Astrophysics Data System (ADS)

    Jia, Huizhen; Sun, Quansen; Ji, Zexuan; Wang, Tonghan; Chen, Qiang

    2014-11-01

    The goal of no-reference/blind image quality assessment (NR-IQA) is to devise a perceptual model that can accurately predict the quality of a distorted image in line with human opinions, in which feature extraction is an important issue. However, the features used in the state-of-the-art "general purpose" NR-IQA algorithms are usually either natural scene statistics (NSS) based or perceptually relevant, but rarely both; the performance of these models is therefore limited. To further improve the performance of NR-IQA, we propose a general purpose NR-IQA algorithm which combines NSS-based features with perceptually relevant features. The new method extracts features in both the spatial and gradient domains. In the spatial domain, we extract the point-wise statistics for single pixel values, which are characterized by a generalized Gaussian distribution model to form the underlying features. In the gradient domain, statistical features based on neighboring gradient magnitude similarity are extracted. Then a mapping is learned to predict quality scores using support vector regression. The experimental results on the benchmark image databases demonstrate that the proposed algorithm correlates highly with human judgments of quality and leads to significant performance improvements over state-of-the-art methods.
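
    Generalized Gaussian fits of the kind used for such spatial-domain features are typically done by moment matching: the ratio of variance to squared mean absolute deviation pins down the shape parameter. A sketch of this standard BRISQUE-style estimator (the grid range and sample sizes are illustrative; this is not necessarily the authors' exact procedure):

```python
import math, random

def ggd_ratio(gamma):
    """Theoretical sigma^2 / E|X|^2 for a generalized Gaussian of shape gamma."""
    return (math.gamma(1.0 / gamma) * math.gamma(3.0 / gamma)
            / math.gamma(2.0 / gamma) ** 2)

def estimate_ggd_shape(samples, grid=None):
    """Moment-matching estimate of the GGD shape parameter over a coarse grid."""
    if grid is None:
        grid = [0.2 + 0.01 * i for i in range(581)]  # shapes 0.2 .. 6.0
    m = sum(samples) / len(samples)
    centered = [x - m for x in samples]
    var = sum(x * x for x in centered) / len(centered)
    mean_abs = sum(abs(x) for x in centered) / len(centered)
    rho = var / (mean_abs ** 2)
    return min(grid, key=lambda g: abs(ggd_ratio(g) - rho))

random.seed(1)
gauss = [random.gauss(0.0, 1.0) for _ in range(20000)]
shape = estimate_ggd_shape(gauss)  # near 2.0, since Gaussian is GGD with gamma = 2
```

    For a Gaussian the theoretical ratio is pi/2, so the estimator recovers a shape near 2; heavier-tailed coefficient distributions yield smaller shapes, and those fitted shapes serve as the features.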

  3. Estimation of Cell Proliferation Dynamics Using CFSE Data

    PubMed Central

    Banks, H.T.; Sutton, Karyn L.; Thompson, W. Clayton; Bocharov, Gennady; Roose, Dirk; Schenkel, Tim; Meyerhans, Andreas

    2010-01-01

    Advances in fluorescent labeling of cells as measured by flow cytometry have allowed for quantitative studies of proliferating populations of cells. The investigations (Luzyanina et al. in J. Math. Biol. 54:57–89, 2007; J. Math. Biol., 2009; Theor. Biol. Med. Model. 4:1–26, 2007) contain a mathematical model with fluorescence intensity as a structure variable to describe the evolution in time of proliferating cells labeled by carboxyfluorescein succinimidyl ester (CFSE). Here, this model and several extensions/modifications are discussed. Suggestions for improvements are presented and analyzed with respect to statistical significance for better agreement between model solutions and experimental data. These investigations suggest that the new decay/label loss and time dependent effective proliferation and death rates do indeed provide improved fits of the model to data. Statistical models for the observed variability/noise in the data are discussed with implications for uncertainty quantification. The resulting new cell dynamics model should prove useful in proliferation assay tracking and modeling, with numerous applications in the biomedical sciences. PMID:20195910

  4. Predictors of workplace violence among female sex workers in Tijuana, Mexico.

    PubMed

    Katsulis, Yasmina; Durfee, Alesha; Lopez, Vera; Robillard, Alyssa

    2015-05-01

    For sex workers, differences in rates of exposure to workplace violence are likely influenced by a variety of risk factors, including where one works and under what circumstances. Economic stressors, such as housing insecurity, may also increase the likelihood of exposure. Bivariate analyses demonstrate statistically significant associations between workplace violence and selected predictor variables, including age, drug use, exchanging sex for goods, soliciting clients outdoors, and experiencing housing insecurity. Multivariate regression analysis shows that after controlling for each of these variables in one model, only soliciting clients outdoors and housing insecurity emerge as statistically significant predictors for workplace violence. © The Author(s) 2014.

  5. Relations Between Environmental and Water-Quality Variables and Escherichia coli in the Cuyahoga River With Emphasis on Turbidity as a Predictor of Recreational Water Quality, Cuyahoga Valley National Park, Ohio, 2008

    USGS Publications Warehouse

    Brady, Amie M.G.; Plona, Meg B.

    2009-01-01

    During the recreational season of 2008 (May through August), a regression model relating turbidity to concentrations of Escherichia coli (E. coli) was used to predict recreational water quality in the Cuyahoga River at the historical community of Jaite, within the present city of Brecksville, Ohio, a site centrally located within Cuyahoga Valley National Park. Samples were collected three days per week at Jaite and at three other sites on the river. Concentrations of E. coli were determined and compared to environmental and water-quality measures and to concentrations predicted with a regression model. Linear relations between E. coli concentrations and turbidity, gage height, and rainfall were statistically significant for Jaite. Relations between E. coli concentrations and turbidity were statistically significant for the three additional sites, and relations between E. coli concentrations and gage height were significant at the two sites where gage-height data were available. The turbidity model correctly predicted concentrations of E. coli above or below Ohio's single-sample standard for primary-contact recreation for 77 percent of samples collected at Jaite.
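
    Turbidity-based E. coli models of this kind are usually fit on log-transformed data. A minimal sketch (the actual USGS model form, coefficients and any retransformation bias correction are not given in the abstract; this is just ordinary least squares on log10 values):

```python
import math

def fit_loglog(turbidity, ecoli):
    """Least squares for log10(E. coli) = a + b * log10(turbidity)."""
    xs = [math.log10(t) for t in turbidity]
    ys = [math.log10(c) for c in ecoli]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b  # intercept a, slope b

def predict_ecoli(a, b, turbidity):
    """Back-transformed point prediction (no bias correction applied)."""
    return 10 ** (a + b * math.log10(turbidity))
```

    Each day's prediction is then compared against the applicable single-sample recreational standard to classify the day as exceeding or not, which is the comparison behind the 77 percent figure above.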

  6. Exploration of preterm birth rates associated with different models of antenatal midwifery care in Scotland: Unmatched retrospective cohort analysis.

    PubMed

    Symon, Andrew; Winter, Clare; Cochrane, Lynda

    2015-06-01

    preterm birth represents a significant personal, clinical, organisational and financial burden. Strategies to reduce the preterm birth rate have had limited success. Limited evidence indicates that certain antenatal care models may offer some protection, although the causal mechanism is not understood. We sought to compare preterm birth rates for mixed-risk pregnant women accessing antenatal care organised at a freestanding midwifery unit (FMU) and mixed-risk pregnant women attending an obstetric unit (OU) with related community-based antenatal care. unmatched retrospective 4-year Scottish cohort analysis (2008-2011) of mixed-risk pregnant women accessing (i) FMU antenatal care (n=1107); (ii) combined community-based and OU antenatal care (n=7567). Data were accessed via the Information and Statistics Division of the NHS in Scotland. Aggregates analysis and binary logistic regression were used to compare the cohorts' rates of preterm birth; and of spontaneous labour onset, use of pharmacological analgesia, unassisted vertex birth, and low birth weight. Odds ratios were adjusted for age, parity, deprivation score and smoking status in pregnancy. after adjustment the 'mixed risk' FMU cohort had a statistically significantly reduced risk of preterm birth (5.1% [n=57] versus 7.7% [n=583]; AOR 0.73 [95% CI 0.55-0.98]; p=0.034). Differences in these secondary outcome measures were also statistically significant: spontaneous labour onset (FMU 83.9% versus OU 74.6%; AOR 1.74 [95% CI 1.46-2.08]; p<0.001); minimal intrapartum analgesia (FMU 53.7% versus OU 34.4%; AOR 2.17 [95% CI 1.90-2.49]; p<0.001); spontaneous vertex delivery (FMU 71.9% versus OU 63.5%; AOR 1.46 [95% CI 1.32-1.78]; p<0.001). Incidence of low birth weight was not statistically significant after adjustment for other variables. There was no significant difference in the rate of perinatal or neonatal death.
    given this study's methodological limitations, we can only claim associations between the care model and our chosen outcomes. Although both cohorts were mixed risk, differences in risk levels could have contributed to these findings. Nevertheless, the significant difference in preterm birth rates in this study resonates with other research, including the recent Cochrane review of midwife-led continuity models. Because of the multiplicity of risk factors for preterm birth we need to explore the salient features of the FMU model which may be contributing to this apparent protective effect. Because a randomised controlled trial would necessarily restrict choice to pregnant women, we feel that this option is problematic in exploring this further. We therefore plan to conduct a prospective matched cohort analysis together with a survey of unit practices and experiences. Copyright © 2015 Elsevier Ltd. All rights reserved.
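
    The adjusted odds ratios above come from the logistic regression; for orientation, a crude (unadjusted) odds ratio with a Woolf log-scale confidence interval can be computed directly from the reported 2x2 counts (57/1107 preterm births in the FMU cohort versus 583/7567 in the OU cohort). A sketch, noting that the crude estimate necessarily differs from the adjusted AOR of 0.73:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table (a, b = outcome yes/no in group 1;
    c, d = outcome yes/no in group 2) with a Woolf 95% CI."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (or_,
            math.exp(math.log(or_) - z * se),
            math.exp(math.log(or_) + z * se))

# FMU: 57 preterm of 1107; OU: 583 preterm of 7567
crude_or, lo, hi = odds_ratio_ci(57, 1107 - 57, 583, 7567 - 583)
```

    Adjustment for age, parity, deprivation and smoking, as done in the study, requires the full logistic model rather than this single-table calculation.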

  7. Evaluation of conventional, protaper hand and protaper rotary instrumentation system for apical extrusion of debris, irrigants and bacteria- An in vitro randomized trial

    PubMed Central

    Kalra, Pinky; Suman, Ethel; Shenoy, Ramya; Suprabha, Baranya-Shrikrishna

    2017-01-01

    Background: Endodontic instrumentation carries the risk of over-extrusion of debris and bacteria. The technique used and the type of instrumentation influence this risk. Aim: The purpose of this study was to evaluate and compare the K-file, ProTaper hand and ProTaper rotary instrumentation systems for the amount of apically extruded debris, irrigant solution and intracanal bacteria. Design: An experimental, single-blinded, randomized in vitro study with a sample of 30 single-rooted teeth. Endodontic access cavities were prepared and the root canals were filled with a suspension of E. faecalis. The Myers and Montgomery model was used to collect apically extruded debris and irrigant. Canals were prepared using K-files, ProTaper hand and ProTaper rotary files. Statistical analysis: Non-parametric tests (Kruskal-Wallis and Mann-Whitney U) were applied to determine significant differences among the groups. Results: Tests revealed a statistically significant difference between the amounts of debris and numbers of bacteria extruded by the ProTaper hand and the K-files. No statistically significant difference was observed between the amounts of irrigant extruded by the ProTaper hand and the K-file system. Statistically significant differences were observed between the amounts of bacteria and irrigant extruded by the ProTaper rotary and the ProTaper hand. No statistically significant difference was observed between the amounts of debris extruded by the ProTaper hand and the K-file system. Conclusions: The amounts of apically extruded irrigant solution, bacteria and debris are significantly greater with K-file instruments and least with ProTaper rotary instruments. Key words: ProTaper, rotary, periapical extrusion. PMID:28210445
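
    The Mann-Whitney U test used here compares two groups without assuming normality; the statistic simply counts, over all pairs, how often one group's measurement exceeds the other's. A minimal sketch (the exact or normal-approximation P-value, and the Kruskal-Wallis extension to all three instrument groups, are omitted):

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for sample x versus sample y,
    via direct pairwise comparison; ties contribute 0.5."""
    u = 0.0
    for xi in x:
        for yi in y:
            if xi > yi:
                u += 1.0
            elif xi == yi:
                u += 0.5
    return u
```

    U near 0 or near len(x) * len(y) indicates strong separation between the groups; significance is then judged against the U distribution or its normal approximation.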

  8. On the urban land-surface impact on climate over Central Europe

    NASA Astrophysics Data System (ADS)

    Huszar, Peter; Halenka, Tomas; Belda, Michal; Zemankova, Katerina; Zak, Michal

    2014-05-01

    For the purpose of qualifying and quantifying the impact of cities, and in general of urban surfaces, on climate over central Europe, the surface parameterization in the regional climate model RegCM4 has been extended with the Single Layer Urban Canopy Model (SLUCM) for urban and suburban land surfaces. This can be used both on the dynamic scale within the BATS scheme and on the more detailed SUBBATS scale to treat the surface processes on a higher-resolution subgrid. A set of experiments was performed over the period 2005-2009 over central Europe, both without considering urban surfaces and with the SLUCM treatment. Results show a statistically significant impact of urbanized surfaces on temperature (up to 1.5 K increase in summer) and on the boundary layer height (ZPBL, increases up to 50 m). Urbanization further influences surface wind, with a winter decrease up to -0.6 m s-1 and both increases and decreases in summer depending on the location with respect to cities and on daytime (changes up to 0.3 m s-1). Urban surfaces significantly reduce evaporation and thus the humidity over the surface. In our simulations this impacts the summer precipitation rate, showing a decrease over cities of up to -2 mm day-1. We further showed that significant temperature increases are not limited to the urban canopy layer but span the whole boundary layer. Above that, a small but statistically significant temperature decrease is modeled. The comparison with observational data showed significant improvement in modeling the monthly surface temperatures in summer, and the model better describes the diurnal temperature variation, reducing the afternoon and evening bias due to the UHI development, which was not captured when the urban parameterization was not applied.
Sensitivity experiments were carried out as well to quantify the response of the meteorological conditions to changes in the parameters specific to the urban environment such as street width, building height, albedo of the roofs, anthropogenic heat release etc. and showed that the results are rather robust and the choice of the key SLUCM parameters impacts the results only slightly (mainly temperature, ZPBL and wind velocity). Further, the important conclusion is that statistically significant impacts are modeled not only over large urbanized areas (cities), but the influence of cities is evident over remote rural areas as well with minor or without any urban surfaces. We show that this is the result of the combined effect of the distant influence of surrounding cities and the influence of the minor local urban surface coverage.

  9. Socio-Demographic and Clinical Characteristics are Not Clinically Useful Predictors of Refill Adherence in Patients with Hypertension

    PubMed Central

    Steiner, John F.; Ho, P. Michael; Beaty, Brenda L.; Dickinson, L. Miriam; Hanratty, Rebecca; Zeng, Chan; Tavel, Heather M.; Havranek, Edward P.; Davidson, Arthur J.; Magid, David J.; Estacio, Raymond O.

    2009-01-01

    Background Although many studies have identified patient characteristics or chronic diseases associated with medication adherence, the clinical utility of such predictors has rarely been assessed. We attempted to develop clinical prediction rules for adherence with antihypertensive medications in two health care delivery systems. Methods and Results Retrospective cohort studies of hypertension registries in an inner-city health care delivery system (N = 17176) and a health maintenance organization (N = 94297) in Denver, Colorado. Adherence was defined by acquisition of 80% or more of antihypertensive medications. A multivariable model in the inner-city system found that adherent patients (36.3% of the total) were more likely than non-adherent patients to be older, white, married, and acculturated in US society, to have diabetes or cerebrovascular disease, not to abuse alcohol or controlled substances, and to be prescribed less than three antihypertensive medications. Although statistically significant, all multivariate odds ratios were 1.7 or less, and the model did not accurately discriminate adherent from non-adherent patients (C-statistic = 0.606). In the health maintenance organization, where 72.1% of patients were adherent, significant but weak associations existed between adherence and older age, white race, the lack of alcohol abuse, and fewer antihypertensive medications. The multivariate model again failed to accurately discriminate adherent from non-adherent individuals (C-statistic = 0.576). Conclusions Although certain socio-demographic characteristics or clinical diagnoses are statistically associated with adherence to refills of antihypertensive medications, a combination of these characteristics is not sufficiently accurate to allow clinicians to predict whether their patients will be adherent with treatment. PMID:20031876
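
    The C-statistic reported above measures discrimination: the probability that a randomly chosen adherent patient receives a higher model score than a randomly chosen non-adherent one (0.5 is no better than chance, which is why values near 0.6 were judged clinically useless). A direct pairwise sketch (fine for small samples; production implementations use an equivalent rank-based form):

```python
def c_statistic(scores, outcomes):
    """C-statistic (AUC): fraction of (positive, negative) pairs in which
    the positive case is scored higher; ties count 0.5. `outcomes` are 0/1."""
    pos = [s for s, o in zip(scores, outcomes) if o == 1]
    neg = [s for s, o in zip(scores, outcomes) if o == 0]
    concordant = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                concordant += 1.0
            elif p == q:
                concordant += 0.5
    return concordant / (len(pos) * len(neg))
```

    A model that perfectly separates adherent from non-adherent patients scores 1.0; a model whose scores carry no information scores about 0.5, regardless of how many predictors are statistically significant.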

  10. The trend of changes in the evaluation scores of faculty members from administrators' and students' perspectives at the medical school over 10 years.

    PubMed

    Yamani, Nikoo; Changiz, Tahereh; Feizi, Awat; Kamali, Farahnaz

    2018-01-01

    To assess the trend of changes in the evaluation scores of faculty members and the discrepancy between administrators' and students' perspectives in a medical school from 2006 to 2015. This repeated cross-sectional study was conducted on the 10-year evaluation scores of all faculty members of a medical school (n=579) in an urban area of Iran. Data on evaluation scores given by students and administrators and the total of these scores were evaluated. Data were analyzed using descriptive and inferential statistics, including linear mixed effect models for repeated measures, via the SPSS software. There were statistically significant differences between the students' and administrators' perspectives over time (p < 0.001). The mean of the total evaluation scores also showed a statistically significant change over time (p < 0.001). Furthermore, the mean of changes over time in the total evaluation score between different departments was statistically significant (p < 0.001). The trend of changes in the students' evaluations was clear and positive, but the trend of the administrators' evaluations was unclear. Since the evaluation of faculty members is affected by many other factors, there is a need for more future studies.

  11. Model variations in predicting incidence of Plasmodium falciparum malaria using 1998-2007 morbidity and meteorological data from south Ethiopia

    PubMed Central

    2010-01-01

    Background Malaria transmission is complex and is believed to be associated with local climate changes. However, simple attempts to extrapolate malaria incidence rates from averaged regional meteorological conditions have proven unsuccessful. Therefore, the objective of this study was to determine if variations in specific meteorological factors are able to consistently predict P. falciparum malaria incidence at different locations in south Ethiopia. Methods Retrospective data from 42 locations were collected including P. falciparum malaria incidence for the period of 1998-2007 and meteorological variables such as monthly rainfall (all locations), temperature (17 locations), and relative humidity (three locations). Thirty-five data sets qualified for the analysis. The Ljung-Box Q statistic was used for model diagnosis, and R squared or stationary R squared was taken as the goodness-of-fit measure. Time series modelling was carried out using Transfer Function (TF) models, or univariate auto-regressive integrated moving average (ARIMA) models when there was no significant meteorological predictor variable. Results Of 35 models, five were discarded because of significant Ljung-Box Q statistics. Past P. falciparum malaria incidence alone (17 locations) or coupled with meteorological variables (four locations) was able to predict P. falciparum malaria incidence within statistical significance. All seasonal ARIMA orders were from locations at altitudes above 1742 m. Monthly rainfall, minimum temperature, and maximum temperature were able to predict incidence at four, five and two locations, respectively. In contrast, relative humidity was not able to predict P. falciparum malaria incidence. The R squared values for the models ranged from 16% to 97%, with the exception of one model which had a negative value. Models with seasonal ARIMA orders were found to perform better. However, the models for predicting P. falciparum malaria incidence varied from location to location, and among lagged effects, data transformation forms, ARIMA and TF orders. Conclusions This study describes P. falciparum malaria incidence models linked with meteorological data. Variability in the models was principally attributed to regional differences, and no single model was found that fits all locations. Past P. falciparum malaria incidence appeared to be a superior predictor to meteorology. Future efforts in malaria modelling may benefit from inclusion of non-meteorological factors. PMID:20553590
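The Ljung-Box Q statistic used for model diagnosis in the record above is Q = n(n+2) * sum_{k=1..h} r_k^2 / (n-k), where r_k is the lag-k autocorrelation of the residuals. A minimal sketch on hypothetical residuals (the data and cutoff comparison are illustrative):

```python
def autocorr(x, k):
    """Lag-k sample autocorrelation."""
    n = len(x)
    m = sum(x) / n
    denom = sum((v - m) ** 2 for v in x)
    num = sum((x[i] - m) * (x[i + k] - m) for i in range(n - k))
    return num / denom

def ljung_box_q(x, lags):
    """Ljung-Box portmanteau statistic: Q = n(n+2) * sum_k r_k^2 / (n-k).
    Under the null of white-noise residuals, Q is approximately chi-square
    with `lags` degrees of freedom (minus the number of fitted ARMA terms)."""
    n = len(x)
    return n * (n + 2) * sum(
        autocorr(x, k) ** 2 / (n - k) for k in range(1, lags + 1)
    )

# Hypothetical model residuals; a strongly alternating series inflates Q
residuals = [1.0, 2.0, 1.0, 2.0, 1.0, 2.0, 1.0, 2.0]
print(ljung_box_q(residuals, lags=1))  # 8.75, beyond the 1-df 5% cutoff of 3.84
```

A significant Q means the residuals still carry autocorrelation, which is why such models were discarded in the study.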

  12. A statistical human rib cage geometry model accounting for variations by age, sex, stature and body mass index.

    PubMed

    Shi, Xiangnan; Cao, Libo; Reed, Matthew P; Rupp, Jonathan D; Hoff, Carrie N; Hu, Jingwen

    2014-07-18

    In this study, we developed a statistical rib cage geometry model accounting for variations by age, sex, stature and body mass index (BMI). Thorax CT scans were obtained from 89 subjects approximately evenly distributed among 8 age groups and both sexes. Threshold-based CT image segmentation was performed to extract the rib geometries, and a total of 464 landmarks on the left side of each subject's ribcage were collected to describe the size and shape of the rib cage as well as the cross-sectional geometry of each rib. Principal component analysis and multivariate regression analysis were conducted to predict rib cage geometry as a function of age, sex, stature, and BMI, all of which showed strong effects on rib cage geometry. Except for BMI, all parameters also showed significant effects on rib cross-sectional area using a linear mixed model. This statistical rib cage geometry model can serve as a geometric basis for developing a parametric human thorax finite element model for quantifying effects from different human attributes on thoracic injury risks. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Specious causal attributions in the social sciences: the reformulated stepping-stone theory of heroin use as exemplar.

    PubMed

    Baumrind, D

    1983-12-01

    The claims based on causal models employing either statistical or experimental controls are examined and found to be excessive when applied to social or behavioral science data. An exemplary case, in which strong causal claims are made on the basis of a weak version of the regularity model of cause, is critiqued. O'Donnell and Clayton claim that in order to establish that marijuana use is a cause of heroin use (their "reformulated stepping-stone" hypothesis), it is necessary and sufficient to demonstrate that marijuana use precedes heroin use and that the statistically significant association between the two does not vanish when the effects of other variables deemed to be prior to both of them are removed. I argue that O'Donnell and Clayton's version of the regularity model is not sufficient to establish cause and that the planning of social interventions both presumes and requires a generative rather than a regularity causal model. Causal modeling using statistical controls is of value when it compels the investigator to make explicit and to justify a causal explanation but not when it is offered as a substitute for a generative analysis of causal connection.

  14. Confidence intervals for effect sizes: compliance and clinical significance in the Journal of Consulting and Clinical Psychology.

    PubMed

    Odgaard, Eric C; Fowler, Robert L

    2010-06-01

    In 2005, the Journal of Consulting and Clinical Psychology (JCCP) became the first American Psychological Association (APA) journal to require statistical measures of clinical significance, plus effect sizes (ESs) and associated confidence intervals (CIs), for primary outcomes (La Greca, 2005). As this represents the single largest editorial effort to improve statistical reporting practices in any APA journal in at least a decade, in this article we investigate the efficacy of that change. All intervention studies published in JCCP in 2003, 2004, 2007, and 2008 were reviewed. Each article was coded for method of clinical significance, type of ES, and type of associated CI, broken down by statistical test (F, t, chi-square, r/R(2), and multivariate modeling). By 2008, clinical significance compliance was 75% (up from 31%), with 94% of studies reporting some measure of ES (reporting improved for individual statistical tests ranging from eta(2) = .05 to .17, with reasonable CIs). Reporting of CIs for ESs also improved, although only to 40%. Also, the vast majority of reported CIs used approximations, which become progressively less accurate for smaller sample sizes and larger ESs (cf. Algina & Keselman, 2003). Changes are near asymptote for ESs and clinical significance, but CIs lag behind. As CIs for ESs are required for primary outcomes, we show how to compute CIs for the vast majority of ESs reported in JCCP, with an example of how to use CIs for ESs as a method to assess clinical significance.
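The approximate CIs the abstract flags can be illustrated for Cohen's d: the common large-sample formula pairs d with a normal-theory standard error. A sketch on hypothetical group data (this is the approximation the abstract critiques as less accurate for small samples and large effects, not the exact noncentral-t interval):

```python
import math

def cohens_d_ci(x, y, z=1.96):
    """Cohen's d with the common large-sample (approximate) 95% CI:
    SE = sqrt((n1+n2)/(n1*n2) + d^2 / (2*(n1+n2)))."""
    n1, n2 = len(x), len(y)
    m1, m2 = sum(x) / n1, sum(y) / n2
    v1 = sum((v - m1) ** 2 for v in x) / (n1 - 1)
    v2 = sum((v - m2) ** 2 for v in y) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd
    se = math.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
    return d, (d - z * se, d + z * se)

# Hypothetical outcome scores for treatment and control groups
d, ci = cohens_d_ci([5, 6, 7, 8], [3, 4, 5, 6])
```

An exact interval instead inverts the noncentral t distribution, which is the method the cited Algina & Keselman work recommends.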

  15. Reproducible research in vadose zone sciences

    USDA-ARS?s Scientific Manuscript database

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  16. How allele frequency and study design affect association test statistics with misrepresentation errors.

    PubMed

    Escott-Price, Valentina; Ghodsi, Mansoureh; Schmidt, Karl Michael

    2014-04-01

    We evaluate the effect of genotyping errors on the type-I error of a general association test based on genotypes, showing that, in the presence of errors in the case and control samples, the test statistic asymptotically follows a scaled non-central $\chi^2$ distribution. We give explicit formulae for the scaling factor and non-centrality parameter for the symmetric allele-based genotyping error model and for additive and recessive disease models. They show how genotyping errors can lead to a significantly higher false-positive rate, growing with sample size, compared with the nominal significance levels. The strength of this effect depends very strongly on the population distribution of the genotype, with a pronounced effect in the case of rare alleles, and a great robustness against error in the case of large minor allele frequency. We also show how these results can be used to correct $p$-values.
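A small Monte Carlo can illustrate the reported false-positive inflation for rare alleles. The sketch below assumes a differential, symmetric allele-call error (flips applied to case calls only), one scenario in which the allele-based chi-square test's type-I error grows; the function names, sample sizes, and error rates are illustrative:

```python
import random

def allele_chisq(case_alt, case_n, ctrl_alt, ctrl_n):
    """Pearson chi-square (1 df) on the 2x2 table of alt/ref allele counts."""
    a, b = case_alt, 2 * case_n - case_alt   # case alt / ref allele counts
    c, d = ctrl_alt, 2 * ctrl_n - ctrl_alt   # control alt / ref allele counts
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def sim_false_positive_rate(maf, err, n=500, sims=400, crit=3.841, seed=1):
    """Fraction of null simulations exceeding the nominal 5% chi-square cutoff
    when each case allele call is flipped with probability `err`."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        case_alt = sum(
            1 for _ in range(2 * n)
            if (rng.random() < maf) != (rng.random() < err)  # true call XOR flip
        )
        ctrl_alt = sum(1 for _ in range(2 * n) if rng.random() < maf)
        if allele_chisq(case_alt, n, ctrl_alt, n) > crit:
            hits += 1
    return hits / sims

print(sim_false_positive_rate(maf=0.05, err=0.0))   # near the nominal 0.05
print(sim_false_positive_rate(maf=0.05, err=0.03))  # markedly inflated
```

Even a 3% call-error rate at a 5% minor allele frequency pushes the empirical type-I error well above the nominal level, consistent with the abstract's rare-allele result.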

  17. Massive parallelization of serial inference algorithms for a complex generalized linear model

    PubMed Central

    Suchard, Marc A.; Simpson, Shawn E.; Zorych, Ivan; Ryan, Patrick; Madigan, David

    2014-01-01

    Following a series of high-profile drug safety disasters in recent years, many countries are redoubling their efforts to ensure the safety of licensed medical products. Large-scale observational databases such as claims databases or electronic health record systems are attracting particular attention in this regard, but present significant methodological and computational concerns. In this paper we show how high-performance statistical computation, including graphics processing units, relatively inexpensive highly parallel computing devices, can enable complex methods in large databases. We focus on optimization and massive parallelization of cyclic coordinate descent approaches to fit a conditioned generalized linear model involving tens of millions of observations and thousands of predictors in a Bayesian context. We find orders-of-magnitude improvement in overall run-time. Coordinate descent approaches are ubiquitous in high-dimensional statistics and the algorithms we propose open up exciting new methodological possibilities with the potential to significantly improve drug safety. PMID:25328363

  18. Assessing the specificity of posttraumatic stress disorder's dysphoric items within the dysphoria model.

    PubMed

    Armour, Cherie; Shevlin, Mark

    2013-10-01

    The factor structure of posttraumatic stress disorder (PTSD) currently used by the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV), has received limited support. A four-factor dysphoria model is widely supported. However, the dysphoria factor of this model has been characterized as a nonspecific factor of PTSD. The present study investigated the specificity of the dysphoria factor within the dysphoria model by conducting a confirmatory factor analysis while statistically controlling for the variance attributable to depression. The sample consisted of 429 individuals who met the diagnostic criteria for PTSD in the National Comorbidity Survey. The results showed no significant attenuation in any of the PTSD items. This finding is pertinent given several proposals for the removal of dysphoric items from the diagnostic criteria set of PTSD in the upcoming DSM-5.

  19. iCFD: Interpreted Computational Fluid Dynamics - Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design - The secondary clarifier.

    PubMed

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy

    2015-10-15

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter then are identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. 
For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii) assessment of modelling the onset of transient and compression settling. Furthermore, the optimal level of model discretization both in 2-D and 1-D was undertaken. Results suggest that the iCFD model developed for the SST through the proposed methodology is able to predict solid distribution with high accuracy - taking a reasonable computational effort - when compared to multi-dimensional numerical experiments, under a wide range of flow and design conditions. iCFD tools could play a crucial role in reliably predicting systems' performance under normal and shock events. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Risk Estimation for Lung Cancer in Libya: Analysis Based on Standardized Morbidity Ratio, Poisson-Gamma Model, BYM Model and Mixture Model

    PubMed

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-03-01

    Cancer is the most rapidly spreading disease in the world, especially in developing countries, including Libya. Cancer represents a significant burden on patients, families, and their societies. This disease can be controlled if detected early. Therefore, disease mapping has recently become an important method in the fields of public health research and disease epidemiology. The correct choice of statistical model is a very important step to producing a good map of a disease. Libya was selected to perform this work and to examine its geographical variation in the incidence of lung cancer. The objective of this paper is to estimate the relative risk for lung cancer. Four statistical models for estimating the relative risk for lung cancer, together with population censuses of the study area for the period 2006 to 2011, were used in this work: the Standardized Morbidity Ratio (SMR), the most popular statistic used in the field of disease mapping; the Poisson-gamma model, one of the earliest applications of Bayesian methodology; the Besag, York and Mollie (BYM) model; and the Mixture model. As an initial step, this study begins by providing a review of all proposed models, which we then apply to lung cancer data in Libya. Maps, tables, graphs, and goodness-of-fit (GOF) measures were used to compare and present the preliminary results; GOF measures are commonly used in statistical modelling to compare fitted models. The main general results presented in this study show that the Poisson-gamma, BYM, and Mixture models can overcome the problem of the first model (SMR) when there are no observed lung cancer cases in certain districts. Results show that the Mixture model is the most robust and provides better relative risk estimates across a range of models. Creative Commons Attribution License
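The zero-count problem the SMR suffers from, and the Poisson-gamma remedy, can be sketched in a few lines. The Gamma prior parameters below are placeholder values (in practice they are estimated from the data, empirical-Bayes style), and the district counts are hypothetical:

```python
def smr(obs, exp):
    """Standardized Morbidity Ratio per district: observed / expected cases."""
    return [o / e for o, e in zip(obs, exp)]

def poisson_gamma_rr(obs, exp, a=1.0, b=1.0):
    """Posterior mean relative risk under a Gamma(a, b) prior on the risk:
    RR_i = (O_i + a) / (E_i + b), shrinking unstable districts toward the prior."""
    return [(o + a) / (e + b) for o, e in zip(obs, exp)]

# Hypothetical district counts: observed and expected lung cancer cases
obs = [0, 3, 12]
exp = [1.5, 2.0, 8.0]
print(smr(obs, exp))               # first district collapses to 0.0
print(poisson_gamma_rr(obs, exp))  # smoothed estimates, no degenerate zero
```

Districts with no observed cases get SMR = 0 regardless of population, whereas the Poisson-gamma estimate stays positive and borrows strength from the prior, which is the behavior the abstract describes.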

  1. Risk Estimation for Lung Cancer in Libya: Analysis Based on Standardized Morbidity Ratio, Poisson-Gamma Model, BYM Model and Mixture Model

    PubMed Central

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-01-01

    Cancer is the most rapidly spreading disease in the world, especially in developing countries, including Libya. Cancer represents a significant burden on patients, families, and their societies. This disease can be controlled if detected early. Therefore, disease mapping has recently become an important method in the fields of public health research and disease epidemiology. The correct choice of statistical model is a very important step to producing a good map of a disease. Libya was selected to perform this work and to examine its geographical variation in the incidence of lung cancer. The objective of this paper is to estimate the relative risk for lung cancer. Four statistical models for estimating the relative risk for lung cancer, together with population censuses of the study area for the period 2006 to 2011, were used in this work: the Standardized Morbidity Ratio (SMR), the most popular statistic used in the field of disease mapping; the Poisson-gamma model, one of the earliest applications of Bayesian methodology; the Besag, York and Mollie (BYM) model; and the Mixture model. As an initial step, this study begins by providing a review of all proposed models, which we then apply to lung cancer data in Libya. Maps, tables, graphs, and goodness-of-fit (GOF) measures were used to compare and present the preliminary results; GOF measures are commonly used in statistical modelling to compare fitted models. The main general results presented in this study show that the Poisson-gamma, BYM, and Mixture models can overcome the problem of the first model (SMR) when there are no observed lung cancer cases in certain districts. Results show that the Mixture model is the most robust and provides better relative risk estimates across a range of models. PMID:28440974

  2. Government Expenditures on Education as the Percentage of GDP in the EU

    ERIC Educational Resources Information Center

    Galetic, Fran

    2015-01-01

    This paper analyzes the government expenditures as the percentage of gross domestic product across countries of the European Union. There is a statistical model based on Z-score, whose aim is to calculate how much each EU country deviates from the average value. The model shows that government expenditures on education vary significantly between…
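A Z-score deviation measure of the kind described can be sketched as follows; the country labels and expenditure values are hypothetical, not EU data:

```python
import math

def z_scores(values):
    """Standardize: how many standard deviations each value lies from the mean."""
    m = sum(values) / len(values)
    sd = math.sqrt(sum((v - m) ** 2 for v in values) / len(values))
    return [(v - m) / sd for v in values]

# Hypothetical education expenditure as % of GDP for five countries
spending = {"A": 4.1, "B": 5.6, "C": 6.2, "D": 4.8, "E": 5.3}
deviations = dict(zip(spending, z_scores(list(spending.values()))))
```

Positive scores mark countries spending above the group average, negative below, in units of the group standard deviation.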

  3. Modeling driver stop/run behavior at the onset of a yellow indication considering driver run tendency and roadway surface conditions.

    PubMed

    Elhenawy, Mohammed; Jahangiri, Arash; Rakha, Hesham A; El-Shawarby, Ihab

    2015-10-01

    The ability to model driver stop/run behavior at signalized intersections considering the roadway surface condition is critical in the design of advanced driver assistance systems. Such systems can reduce intersection crashes and fatalities by predicting driver stop/run behavior. The research presented in this paper uses data collected from two controlled field experiments on the Smart Road at the Virginia Tech Transportation Institute (VTTI) to model driver stop/run behavior at the onset of a yellow indication for different roadway surface conditions. The paper offers two contributions. First, it introduces a new predictor related to driver aggressiveness and demonstrates that this measure enhances the modeling of driver stop/run behavior. Second, it applies well-known artificial intelligence techniques including: adaptive boosting (AdaBoost), random forest, and support vector machine (SVM) algorithms as well as traditional logistic regression techniques on the data in order to develop a model that can be used by traffic signal controllers to predict driver stop/run decisions in a connected vehicle environment. The research demonstrates that by adding the proposed driver aggressiveness predictor to the model, there is a statistically significant increase in the model accuracy. Moreover, the false alarm rate is also reduced, although this reduction is not statistically significant. The study demonstrates that, for the subject data, the SVM machine learning algorithm performs the best in terms of optimum classification accuracy and false positive rates. However, the SVM model produces the best performance in terms of the classification accuracy only. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. A comparative assessment of preclinical chemotherapeutic response of tumors using quantitative non-Gaussian diffusion MRI

    PubMed Central

    Xu, Junzhong; Li, Ke; Smith, R. Adam; Waterton, John C.; Zhao, Ping; Ding, Zhaohua; Does, Mark D.; Manning, H. Charles; Gore, John C.

    2016-01-01

    Background Diffusion-weighted MRI (DWI) signal attenuation is often not mono-exponential (i.e., non-Gaussian diffusion) at stronger diffusion weighting. Several non-Gaussian diffusion models have been developed and may provide new information or higher sensitivity compared with the conventional apparent diffusion coefficient (ADC) method. However, the relative merits of these models for detecting tumor therapeutic response are not fully clear. Methods Conventional ADC and three widely-used non-Gaussian models (bi-exponential, stretched exponential, and statistical model) were implemented and compared for assessing SW620 human colon cancer xenografts responding to barasertib, an agent known to induce apoptosis via polyploidy. The Bayesian Information Criterion (BIC) was used for model selection among the three non-Gaussian models. Results Tumor volume, histology, conventional ADC, and all three non-Gaussian DWI models showed significant differences between control and treatment groups after four days of treatment. However, only the non-Gaussian models detected significant changes after two days of treatment. For every treatment or control group, over 65.7% of tumor voxels indicated that the bi-exponential model was strongly or very strongly preferred. Conclusion Non-Gaussian DWI model-derived biomarkers are capable of detecting chemotherapeutic response of tumors earlier than conventional ADC and tumor volume. The bi-exponential model provides better fitting than the statistical and stretched exponential models for the tumor and treatment models used in the current work. PMID:27919785

  5. Exploring a Dynamic Model of Trust Management

    DTIC Science & Technology

    2014-06-01

    individualists whereby their satisfaction towards business negotiation stems mainly from high economic gains or personal outcomes. Therefore, it...undergraduate business students. No significant differences were found in the extent to which Canadian and Japanese trustors rely on dispositional signs...analytic-holism scale signifies a high level of holism. No statistically significant difference was found between Malays, Chinese, and Indians on the

  6. Assessing Continuous Operator Workload With a Hybrid Scaffolded Neuroergonomic Modeling Approach.

    PubMed

    Borghetti, Brett J; Giametta, Joseph J; Rusnock, Christina F

    2017-02-01

    We aimed to predict operator workload from neurological data using statistical learning methods to fit neurological-to-state-assessment models. Adaptive systems require real-time mental workload assessment to perform dynamic task allocations or operator augmentation as workload issues arise. Neuroergonomic measures have great potential for informing adaptive systems, and we combine these measures with models of task demand as well as information about critical events and performance to clarify the inherent ambiguity of interpretation. We use machine learning algorithms on electroencephalogram (EEG) input to infer operator workload based upon Improved Performance Research Integration Tool workload model estimates. Cross-participant models predict workload of other participants, statistically distinguishing between 62% of the workload changes. Machine learning models trained from Monte Carlo resampled workload profiles can be used in place of deterministic workload profiles for cross-participant modeling without incurring a significant decrease in machine learning model performance, suggesting that stochastic models can be used when limited training data are available. We employed a novel temporary scaffold of simulation-generated workload profile truth data during the model-fitting process. A continuous workload profile serves as the target to train our statistical machine learning models. Once trained, the workload profile scaffolding is removed and the trained model is used directly on neurophysiological data in future operator state assessments. These modeling techniques demonstrate how to use neuroergonomic methods to develop operator state assessments, which can be employed in adaptive systems.

  7. [Analysis of the technical efficiency of hospitals in the Spanish National Health Service].

    PubMed

    Pérez-Romero, Carmen; Ortega-Díaz, M Isabel; Ocaña-Riola, Ricardo; Martín-Martín, José Jesús

    To analyse the technical efficiency and productivity of general hospitals in the Spanish National Health Service (NHS) (2010-2012) and identify explanatory hospital and regional variables. 230 NHS hospitals were analysed by data envelopment analysis for overall, technical and scale efficiency, and Malmquist index. The robustness of the analysis is contrasted with alternative input-output models. A fixed effects multilevel cross-sectional linear model was used to analyse the explanatory efficiency variables. The average rate of overall technical efficiency (OTE) was 0.736 in 2012; there was considerable variability by region. The Malmquist index (2010-2012) was 1.013. A 23% variability in OTE is attributable to the region in question. Statistically significant exogenous variables (residents per 100 physicians, aging index, average annual income per household, essential public service expenditure and public health expenditure per capita) explain 42% of the OTE variability between hospitals and 64% between regions. The number of residents showed a statistically significant relationship. As regards regions, there is a statistically significant direct linear association between OTE and annual income per capita and essential public service expenditure, and an indirect association with the aging index and annual public health expenditure per capita. The significant room for improvement in the efficiency of hospitals is conditioned by region-specific characteristics, specifically the aging, wealth and public expenditure policies of each region. Copyright © 2016 SESPAS. Publicado por Elsevier España, S.L.U. All rights reserved.

  8. AIDS susceptibility in a migrant population: perception and behavior.

    PubMed

    McBride, D C; Weatherby, N L; Inciardi, J A; Gillespie, S A

    1999-01-01

    Within the framework of the Health Belief Model, this paper examines correlates of perception of AIDS susceptibility among 846 drug-using migrant farm workers and their sex partners. Significant but relatively small differences by ethnicity and gender were found. The data showed a consistent significant statistical relationship between frequency of drug use, high-risk sexual behavior, and perception of AIDS susceptibility. Perception of AIDS susceptibility was significantly related to a subsequent reduction in sexual risk behaviors. Consistent with the Health Belief Model, the data suggest that increasing perception of AIDS susceptibility may be an important motivator in reducing high-risk behaviors.

  9. Performance impact of stop lists and morphological decomposition on word-word corpus-based semantic space models.

    PubMed

    Keith, Jeff; Westbury, Chris; Goldman, James

    2015-09-01

    Corpus-based semantic space models, which primarily rely on lexical co-occurrence statistics, have proven effective in modeling and predicting human behavior in a number of experimental paradigms that explore semantic memory representation. The most widely studied extant models, however, are strongly influenced by orthographic word frequency (e.g., Shaoul & Westbury, Behavior Research Methods, 38, 190-195, 2006). This has the implication that high-frequency closed-class words can potentially bias co-occurrence statistics. Because these closed-class words are purported to carry primarily syntactic, rather than semantic, information, the performance of corpus-based semantic space models may be improved by excluding closed-class words (using stop lists) from co-occurrence statistics, while retaining their syntactic information through other means (e.g., part-of-speech tagging and/or affixes from inflected word forms). Additionally, very little work has been done to explore the effect of employing morphological decomposition on the inflected forms of words in corpora prior to compiling co-occurrence statistics, despite (controversial) evidence that humans perform early morphological decomposition in semantic processing. In this study, we explored the impact of these factors on corpus-based semantic space models. From this study, morphological decomposition appears to significantly improve performance in word-word co-occurrence semantic space models, providing some support for the claim that sublexical information (specifically, word morphology) plays a role in lexical semantic processing. An overall decrease in performance was observed in models employing stop lists (e.g., excluding closed-class words). Furthermore, we found some evidence that weakens the claim that closed-class words supply primarily syntactic information in word-word co-occurrence semantic space models.
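The effect of a stop list on word-word co-occurrence counts can be sketched as follows; the toy corpus, stop list, and window size are illustrative (real semantic space models use large corpora and weighted or larger windows):

```python
from collections import Counter

STOP = {"the", "a", "of", "and", "to", "in", "on"}

def cooccurrence(tokens, window=2, stoplist=None):
    """Symmetric word-word co-occurrence counts within a forward `window`.
    Passing a stoplist removes closed-class words before counting."""
    if stoplist:
        tokens = [t for t in tokens if t not in stoplist]
    counts = Counter()
    for i, w in enumerate(tokens):
        for j in range(i + 1, min(i + 1 + window, len(tokens))):
            counts[frozenset((w, tokens[j]))] += 1  # unordered word pair
    return counts

corpus = "the cat sat on the mat and the dog sat on the rug".split()
raw = cooccurrence(corpus)                      # dominated by 'the', 'on', 'and'
filtered = cooccurrence(corpus, stoplist=STOP)  # content-word pairs only
```

Without the stop list, high-frequency closed-class words dominate the counts, which is exactly the frequency bias the abstract discusses.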

  10. Implications of Analytical Investigations about the Semiconductor Equations on Device Modeling Programs.

    DTIC Science & Technology

    1983-04-01

    SIGNIFICANCE AND EXPLANATION Many different codes for the simulation of semiconductor devices such as transistors, diodes, thyristors are already circulated... partially take into account the consequences introduced by degenerate semiconductors (e.g. invalidity of Boltzmann's statistics, bandgap narrowing). These... [equations (2.10) and (2.11) are garbled in the source record] ... (2.10) can be physically interpreted as the application of Boltzmann statistics. However (2.10)...

  11. The relationship between procrastination, learning strategies and statistics anxiety among Iranian college students: a canonical correlation analysis.

    PubMed

    Vahedi, Shahrum; Farrokhi, Farahman; Gahramani, Farahnaz; Issazadegan, Ali

    2012-01-01

    Approximately 66-80% of graduate students experience statistics anxiety, and some researchers propose that many students identify statistics courses as the most anxiety-inducing courses in their academic curriculums. As such, it is likely that statistics anxiety is, in part, responsible for many students delaying enrollment in these courses for as long as possible. This paper proposes a canonical model treating academic procrastination (AP) and learning strategies (LS) as predictor variables and statistics anxiety (SA) as the explained variable. A questionnaire survey was used for data collection, and 246 female college students participated in this study. To examine the mutually independent relations between the procrastination, learning strategies and statistics anxiety variables, a canonical correlation analysis was computed. Findings show that two canonical functions were statistically significant. The set of variables (metacognitive self-regulation, source management, preparing homework, preparing for tests and preparing term papers) helped predict changes in statistics anxiety with respect to fearful behavior, attitude towards math and class, and performance, but not anxiety. These findings could be used in educational and psychological interventions in the context of statistics anxiety reduction.
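
    The canonical correlation analysis used in studies of this kind can be sketched with a small whitening-plus-SVD implementation. The two variable sets below are synthetic, sharing one latent factor (standing in, loosely, for the predictor and anxiety sets); nothing here reproduces the study's data.

```python
import numpy as np

def canonical_correlations(X, Y):
    # Whiten each variable set by its covariance, then take the singular
    # values of the whitened cross-covariance: these are the canonical
    # correlations, largest first.
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    n = len(X)

    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    M = inv_sqrt(X.T @ X / n) @ (X.T @ Y / n) @ inv_sqrt(Y.T @ Y / n)
    return np.linalg.svd(M, compute_uv=False)

rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 1))                      # shared factor
X = np.hstack([latent + 0.1 * rng.normal(size=(500, 1)),
               rng.normal(size=(500, 1))])
Y = np.hstack([latent + 0.1 * rng.normal(size=(500, 1)),
               rng.normal(size=(500, 1))])
rho = canonical_correlations(X, Y)  # rho[0] near 1 (shared factor), rho[1] small
```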

  12. Statistical downscaling of summer precipitation over northwestern South America

    NASA Astrophysics Data System (ADS)

    Palomino Lemus, Reiner; Córdoba Machado, Samir; Raquel Gámiz Fortis, Sonia; Castro Díez, Yolanda; Jesús Esteban Parra, María

    2015-04-01

    In this study a statistical downscaling (SD) model using Principal Component Regression (PCR) has been developed to simulate summer precipitation in Colombia during the period 1950-2005, and climate projections for the 2071-2100 period have been obtained by applying the fitted SD model. To this end, the Principal Components (PCs) of the SLP reanalysis data from NCEP were used as predictor variables, while the observed gridded summer precipitation was the predictand variable. The period 1950-1993 was used for calibration and 1994-2010 for validation. Bootstrap resampling with replacement was applied to provide estimates of the statistical errors. All models perform reasonably well at regional scales, and the spatial distribution of the correlation coefficients between predicted and observed gridded precipitation values shows high values (between 0.5 and 0.93) along the Andes range and in the north and north Pacific of Colombia. Additionally, the ability of the MIROC5 GCM to simulate the summer precipitation in Colombia for the present climate (1971-2005) has been analyzed by calculating the differences between the simulated and observed precipitation values. The simulation obtained by this GCM strongly overestimates the precipitation along a horizontal sector through the center of Colombia, especially at the east and west of the country. However, the SD model applied to the SLP of the GCM shows its ability to faithfully reproduce the rainfall field. Finally, in order to obtain summer precipitation projections for Colombia for the period 1971-2100, the downscaling model, recalibrated for the total period 1950-2010, has been applied to the SLP output from the MIROC5 model under the RCP2.6, RCP4.5 and RCP8.5 scenarios. The changes estimated by the SD models are not significant under the RCP2.6 scenario, while for the RCP4.5 and RCP8.5 scenarios a significant increase in precipitation relative to present values appears in all the regions, reaching around 27% in the northern Colombia region under the RCP8.5 scenario. Keywords: Statistical downscaling, precipitation, Principal Component Regression, climate change, Colombia. ACKNOWLEDGEMENTS: This work has been financed by the projects P11-RNM-7941 (Junta de Andalucía-Spain) and CGL2013-48539-R (MINECO-Spain, FEDER).
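
    The Principal Component Regression step at the core of such a downscaling model can be sketched as follows. The "pressure field" and "precipitation" series below are synthetic stand-ins driven by two latent modes, not the NCEP or Colombian data.

```python
import numpy as np

rng = np.random.default_rng(1)
t = rng.normal(size=(60, 2))                 # two latent large-scale modes
L = rng.normal(size=(2, 20))                 # their spatial loadings
X = t @ L + 0.1 * rng.normal(size=(60, 20))  # gridded predictor field
y = t[:, 0] + 0.05 * rng.normal(size=60)     # predictand tied to mode 1

def pcr_fit(X, y, k):
    # Regress y on the leading k principal components of X.
    mu = X.mean(0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    scores = (X - mu) @ Vt[:k].T
    beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(y)), scores]),
                               y, rcond=None)
    return mu, Vt[:k], beta

def pcr_predict(model, X):
    mu, comps, beta = model
    return beta[0] + (X - mu) @ comps.T @ beta[1:]

model = pcr_fit(X, y, k=2)
r = np.corrcoef(pcr_predict(model, X), y)[0, 1]  # in-sample skill
```

    In the study itself, calibration and validation would of course use disjoint periods; the sketch only shows the fit-then-predict mechanics.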

  13. Predictors of the number of under-five malnourished children in Bangladesh: application of the generalized poisson regression model

    PubMed Central

    2013-01-01

    Background Malnutrition is one of the principal causes of child mortality in developing countries, including Bangladesh. To our knowledge, most of the available studies that addressed the issue of malnutrition among under-five children considered categorical (dichotomous/polychotomous) outcome variables and applied logistic regression (binary/multinomial) to find their predictors. In this study the malnutrition (i.e. outcome) variable is defined as the number of under-five malnourished children in a family, which is a non-negative count variable. The purposes of the study are (i) to demonstrate the applicability of the generalized Poisson regression (GPR) model as an alternative to other statistical methods and (ii) to find some predictors of this outcome variable. Methods The data are extracted from the Bangladesh Demographic and Health Survey (BDHS) 2007. Briefly, this survey employs a nationally representative sample based on a two-stage stratified sample of households. A total of 4,460 under-five children were analysed using various statistical techniques, namely the Chi-square test and the GPR model. Results The GPR model (as compared with the standard Poisson regression and negative binomial regression) is found to be justified for studying the above-mentioned outcome variable because of its under-dispersion (variance < mean) property. Our study also identifies several significant predictors of the outcome variable, namely mother's education, father's education, wealth index, sanitation status, source of drinking water, and total number of children ever born to a woman. Conclusions The consistency of our findings with many other studies suggests that the GPR model is an ideal alternative to other statistical models for analysing the number of under-five malnourished children in a family. Strategies based on significant predictors may improve the nutritional status of children in Bangladesh. PMID:23297699
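
    The under-dispersion property that justifies the GPR model can be checked directly from the generalized Poisson probability mass function (Consul-Jain form): a negative dispersion parameter gives variance < mean. The parameter values below are arbitrary illustrations, not estimates from the BDHS data.

```python
import math

def gen_poisson_pmf(y, theta, lam):
    # Consul-Jain generalized Poisson pmf:
    #   P(Y=y) = theta * (theta + lam*y)**(y-1) * exp(-theta - lam*y) / y!
    # lam = 0 recovers the ordinary Poisson; lam < 0 gives under-dispersion.
    if y > 0 and theta + lam * y <= 0:
        return 0.0  # support is truncated when lam < 0
    return theta * (theta + lam * y) ** (y - 1) * math.exp(-theta - lam * y) / math.factorial(y)

theta, lam = 2.0, -0.2
probs = [gen_poisson_pmf(y, theta, lam) for y in range(50)]
mean = sum(y * p for y, p in enumerate(probs))
var = sum((y - mean) ** 2 * p for y, p in enumerate(probs))
# theory: mean = theta/(1-lam), var = theta/(1-lam)**3, so var < mean here
```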

  14. An R2 statistic for fixed effects in the linear mixed model.

    PubMed

    Edwards, Lloyd J; Muller, Keith E; Wolfinger, Russell D; Qaqish, Bahjat F; Schabenberger, Oliver

    2008-12-20

    Statisticians most often use the linear mixed model to analyze Gaussian longitudinal data. The value and familiarity of the R(2) statistic in the linear univariate model naturally creates great interest in extending it to the linear mixed model. We define and describe how to compute a model R(2) statistic for the linear mixed model by using only a single model. The proposed R(2) statistic measures multivariate association between the repeated outcomes and the fixed effects in the linear mixed model. The R(2) statistic arises as a one-to-one function of an appropriate F statistic for testing all fixed effects (except typically the intercept) in a full model. The statistic compares the full model with a null model with all fixed effects deleted (except typically the intercept) while retaining exactly the same covariance structure. Furthermore, the R(2) statistic leads immediately to a natural definition of a partial R(2) statistic. A mixed model in which ethnicity gives a very small p-value as a longitudinal predictor of blood pressure (BP) compellingly illustrates the value of the statistic. In sharp contrast to the extreme p-value, a very small R(2), a measure of statistical and scientific importance, indicates that ethnicity has an almost negligible association with the repeated BP outcomes for the study.
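
    The one-to-one relation between R(2) and an F statistic mentioned above can be written down directly. The sketch uses the familiar fixed-effects form F = (R2/q)/((1-R2)/v); the mixed-model degrees-of-freedom conventions are the paper's, not this sketch's.

```python
def r2_from_f(F, q, v):
    # Invert F = (R2/q) / ((1-R2)/v) for R2, with q numerator and
    # v denominator degrees of freedom.
    return q * F / (q * F + v)

# The abstract's point in miniature: a "significant" F on large df
# can correspond to a negligible R2.
r2 = r2_from_f(4.0, 1, 1000)   # p is roughly 0.05, yet R2 is below 0.01
```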

  15. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A

    2012-03-15

    To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.
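
    The LASSO selection step can be illustrated with a minimal coordinate-descent implementation on synthetic data (two real predictors, eight noise predictors). This is a sketch of the method only, not the NTCP modelling pipeline used in the study.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    # Minimal coordinate-descent LASSO minimizing
    # (1/2n)||y - Xb||^2 + lam*||b||_1 (no intercept, for brevity).
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]        # partial residual
            rho = X[:, j] @ r / n
            z = X[:, j] @ X[:, j] / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z  # soft threshold
    return b

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 10))
beta_true = np.array([2.0, -1.5] + [0.0] * 8)
y = X @ beta_true + 0.1 * rng.normal(size=100)
b = lasso_cd(X, y, lam=0.1)  # noise predictors are driven to (near) zero
```

    The interpretability advantage noted in the abstract is visible here: the fitted model keeps only the few predictors that matter, unlike an averaged ensemble.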

  16. Active Duty - U.S. Army Noise Induced Hearing Injury Surveillance Calendar Years 2009-2013

    DTIC Science & Technology

    2014-06-01

    rates for sensorineural hearing loss, significant threshold shift, tinnitus, and noise-induced hearing loss. The intention is to monitor the morbidity... surveillance. These code groups include sensorineural hearing loss (SNHL), significant threshold shift (STS), noise-induced hearing loss (NIHL) and tinnitus... Tinnitus) was analyzed using a regression model to determine the trend of incidence rates from 2007 to the current year. Statistical significance of a...

  17. Bayesian hierarchical modelling of North Atlantic windiness

    NASA Astrophysics Data System (ADS)

    Vanem, E.; Breivik, O. N.

    2013-03-01

    Extreme weather conditions represent serious natural hazards to ship operations and may be the direct cause of, or a contributing factor to, maritime accidents. Such severe environmental conditions can be taken into account in ship design, and operational windows can be defined that limit hazardous operations to less extreme conditions. Nevertheless, possible changes in the statistics of extreme weather conditions, possibly due to anthropogenic climate change, represent an additional hazard to ship operations that is less straightforward to account for in a consistent way. Obviously, there are large uncertainties as to how future climate change will affect the extreme weather conditions at sea, and there is a need for stochastic models that can describe the variability of the environmental conditions in both space and time at various scales. Previously, Bayesian hierarchical space-time models have been developed to describe the variability and complex dependence structures of significant wave height in space and time. These models were found to perform reasonably well and provided some interesting results, in particular pertaining to long-term trends in the wave climate. In this paper, a similar framework is applied to oceanic windiness, and the spatial and temporal variability of the 10-m wind speed over an area in the North Atlantic Ocean is investigated. When the results from the model for North Atlantic windiness are compared to the results for significant wave height over the same area, it is interesting to observe that whereas an increasing trend in significant wave height was identified, no statistically significant long-term trend was estimated in windiness. This may indicate that the increase in significant wave height is not due to an increase in locally generated wind waves, but rather to increased swell. This observation is also consistent with studies that have suggested a poleward shift of the main storm tracks.

  18. Homicide mortality rates in Canada, 2000-2009: Youth at increased risk.

    PubMed

    Basham, C Andrew; Snider, Carolyn

    2016-10-20

    To estimate and compare Canadian homicide mortality rates (HMRs) and trends in HMRs across age groups, with a focus on trends for youth. Data for the period of 2000 to 2009 were collected from Statistics Canada's CANSIM (Canadian Statistical Information Management) Table 102-0540 with the following ICD-10-CA coded external causes of death: X85 to Y09 (assault) and Y87.1 (sequelae of assault). Annual population counts from 2000 to 2009 were obtained from Statistics Canada's CANSIM Table 051-0001. Both death and population counts were organized into five-year age groups. A random effects negative binomial regression analysis was conducted to estimate age group-specific rates, rate ratios, and trends in homicide mortality. There were 9,878 homicide deaths in Canada during the study period. The increase in the overall homicide mortality rate (HMR) of 0.3% per year was not statistically significant (95% CI: -1.1% to +1.8%). Canadians aged 15-19 years and 20-24 years had the highest HMRs during the study period, and experienced statistically significant annual increases in their HMRs of 3% and 4% respectively (p < 0.05). A general, though not statistically significant, decrease in the HMR was observed for all age groups 50+ years. A fixed effects negative binomial regression model showed that the HMR for males was higher than for females over the study period [RR(female/male) = 0.473 (95% CI: 0.361, 0.621)], but no significant difference in sex-specific trends in the HMR was found. An increasing risk of homicide mortality was identified among Canadian youth, ages 15-24, over the 10-year study period. Research that seeks to understand the reasons for the increased homicide risk facing Canada's youth, and public policy responses to reduce this risk, are warranted.
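
    The age-group comparisons in a study like this reduce, in their simplest form, to Poisson rate ratios with a log-scale confidence interval. The counts and person-years below are invented for illustration; they are not the Canadian homicide data.

```python
import math

def rate_ratio_ci(d1, py1, d2, py2, z=1.96):
    # Crude rate ratio between two groups with Poisson death counts
    # d1, d2 and person-years py1, py2; the CI is built on the log scale
    # with SE = sqrt(1/d1 + 1/d2).
    rr = (d1 / py1) / (d2 / py2)
    se = math.sqrt(1.0 / d1 + 1.0 / d2)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# invented illustrative counts (young vs. older age group):
rr, lo, hi = rate_ratio_ci(300, 2.2e6, 150, 2.0e6)
```

    The regression models in the abstract generalize this by adding covariates and allowing extra-Poisson variation via the negative binomial.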

  19. Insights into Corona Formation through Statistical Analyses

    NASA Technical Reports Server (NTRS)

    Glaze, L. S.; Stofan, E. R.; Smrekar, S. E.; Baloga, S. M.

    2002-01-01

    Statistical analysis of an expanded database of coronae on Venus indicates that the populations of Type 1 (with fracture annuli) and Type 2 (without fracture annuli) corona diameters are statistically indistinguishable, and therefore we have no basis for assuming different formation mechanisms. Analysis of the topography and diameters of coronae shows that coronae that are depressions, rimmed depressions, and domes tend to be significantly smaller than those that are plateaus, rimmed plateaus, or domes with surrounding rims. This is consistent with the model of Smrekar and Stofan and inconsistent with predictions of the spreading drop model of Koch and Manga. The diameter range for domes, the initial stage of corona formation, provides a broad constraint on the buoyancy of corona-forming plumes. Coronae are only slightly more likely to be topographically raised than depressions, with Type 1 coronae most frequently occurring as rimmed depressions and Type 2 coronae most frequently occurring with flat interiors and raised rims. Most Type 1 coronae are located along chasmata systems or fracture belts, while Type 2 coronae are found predominantly as isolated features in the plains. Coronae at hotspot rises tend to be significantly larger than coronae in other settings, consistent with a hotter upper mantle at hotspot rises and their active state.

  20. Statistical Analysis of the Indus Script Using n-Grams

    PubMed Central

    Yadav, Nisha; Joglekar, Hrishikesh; Rao, Rajesh P. N.; Vahia, Mayank N.; Adhikari, Ronojoy; Mahadevan, Iravatham

    2010-01-01

    The Indus script is one of the major undeciphered scripts of the ancient world. The small size of the corpus, the absence of bilingual texts, and the lack of definite knowledge of the underlying language has frustrated efforts at decipherment since the discovery of the remains of the Indus civilization. Building on previous statistical approaches, we apply the tools of statistical language processing, specifically n-gram Markov chains, to analyze the syntax of the Indus script. We find that unigrams follow a Zipf-Mandelbrot distribution. Text beginner and ender distributions are unequal, providing internal evidence for syntax. We see clear evidence of strong bigram correlations and extract significant pairs and triplets using a log-likelihood measure of association. Highly frequent pairs and triplets are not always highly significant. The model performance is evaluated using information-theoretic measures and cross-validation. The model can restore doubtfully read texts with an accuracy of about 75%. We find that a quadrigram Markov chain saturates information theoretic measures against a held-out corpus. Our work forms the basis for the development of a stochastic grammar which may be used to explore the syntax of the Indus script in greater detail. PMID:20333254
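
    The log-likelihood measure of association used to extract significant sign pairs in analyses of this kind is typically Dunning's G2 statistic; a compact version is sketched below with invented counts, not Indus corpus frequencies.

```python
import math

def g2(k11, k12, k21, k22):
    # Dunning's log-likelihood ratio (G^2) for bigram association:
    # k11 = count(a followed by b), k12 = count(a followed by not-b),
    # k21 = count(not-a followed by b), k22 = everything else.
    def ll(k, n, p):
        if p <= 0.0 or p >= 1.0:
            return 0.0
        return k * math.log(p) + (n - k) * math.log(1.0 - p)

    n1, n2 = k11 + k12, k21 + k22
    p = (k11 + k21) / (n1 + n2)
    return 2.0 * (ll(k11, n1, k11 / n1) + ll(k21, n2, k21 / n2)
                  - ll(k11, n1, p) - ll(k21, n2, p))

strong = g2(20, 80, 50, 9850)   # pair far above chance: large G^2
none = g2(7, 93, 693, 9207)     # exactly at independence: G^2 = 0
```

    As the abstract notes, highly frequent pairs are not always highly significant: G2 compares observed counts to chance expectation, rather than ranking by raw frequency.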

  1. Influence of science and technology magnet middle schools on students' motivation and achievement in science

    NASA Astrophysics Data System (ADS)

    Allen, David

    Some informal discussions among educators regarding student motivation and academic performance have included the topic of magnet schools. The premise is that a focused theme, such as an aspect of science, positively affects student motivation and academic achievement. However, there is limited research involving magnet schools and their influence on student motivation and academic performance. This study provides empirical data for the discussion about magnet schools' influence on motivation and academic ability. This study utilized path analysis in a structural equation modeling framework to simultaneously investigate the relationships between demographic exogenous independent variables, the independent variable of attending a science or technology magnet middle school, and the dependent variables of motivation to learn science and academic achievement in science. Due to the categorical nature of the variables, Bayesian statistical analysis was used to calculate the path coefficients and the standardized effects for each relationship in the model. The coefficients of determination were calculated to determine the amount of variance each path explained. Only five of 21 paths had statistical significance. Only one of the five statistically significant paths (Attended Magnet School to Motivation to Learn Science) explained a noteworthy amount (45.8%) of the variance.

  2. Neuropsychological study of IQ scores in offspring of parents with bipolar I disorder.

    PubMed

    Sharma, Aditya; Camilleri, Nigel; Grunze, Heinz; Barron, Evelyn; Le Couteur, James; Close, Andrew; Rushton, Steven; Kelly, Thomas; Ferrier, Ian Nicol; Le Couteur, Ann

    2017-01-01

    Studies comparing IQ in Offspring of Bipolar Parents (OBP) with Offspring of Healthy Controls (OHC) have reported conflicting findings. They have included OBP with mental health/neurodevelopmental disorders and/or pharmacological treatment which could affect results. This UK study aimed to assess IQ in OBP with no mental health/neurodevelopmental disorder and assess the relationship of sociodemographic variables with IQ. IQ data using the Wechsler Abbreviated Scale of Intelligence (WASI) from 24 OBP and 34 OHC from the North East of England was analysed using mixed-effects modelling. All participants had IQ in the average range. OBP differed statistically significantly from OHC on Full Scale IQ (p = .001), Performance IQ (PIQ) (p = .003) and Verbal IQ (VIQ) (p = .001) but not on the PIQ-VIQ split. OBP and OHC groups did not differ on socio-economic status (SES) and gender. SES made a statistically significant contribution to the variance of IQ scores (p = .001). Using a robust statistical model of analysis, the OBP with no current/past history of mental health/neurodevelopmental disorders had lower IQ scores compared to OHC. This finding should be borne in mind when assessing and recommending interventions for OBP.

  3. No-Impact Threshold Values for NRAP's Reduced Order Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Last, George V.; Murray, Christopher J.; Brown, Christopher F.

    2013-02-01

    The purpose of this study was to develop methodologies for establishing baseline datasets and statistical protocols for determining statistically significant changes between background concentrations and predicted concentrations that would be used to represent a contamination plume in the Gen II models being developed by NRAP's Groundwater Protection team. The initial effort examined selected portions of two aquifer systems: the urban shallow-unconfined aquifer system of the Edwards-Trinity Aquifer System (being used to develop the ROM for carbonate-rock aquifers), and a portion of the High Plains Aquifer (an unconsolidated and semi-consolidated sand and gravel aquifer, being used to develop the ROM for sandstone aquifers). Threshold values were determined for Cd, Pb, As, pH, and TDS that could be used to identify contamination due to predicted impacts from carbon sequestration storage reservoirs, based on recommendations found in the EPA's ''Unified Guidance for Statistical Analysis of Groundwater Monitoring Data at RCRA Facilities'' (US Environmental Protection Agency 2009). Results from this effort can be used to inform a ''no change'' scenario with respect to groundwater impacts, rather than the use of an MCL that could be significantly higher than existing concentrations in the aquifer.

  4. Using Social Network Analysis to Better Understand Compulsive Exercise Behavior Among a Sample of Sorority Members.

    PubMed

    Patterson, Megan S; Goodson, Patricia

    2017-05-01

    Compulsive exercise, a form of unhealthy exercise often associated with prioritizing exercise and feeling guilty when exercise is missed, is a common precursor to and symptom of eating disorders. College-aged women are at high risk of exercising compulsively compared with other groups. Social network analysis (SNA) is a theoretical perspective and methodology allowing researchers to observe the effects of relational dynamics on the behaviors of people. SNA was used to assess the relationship between compulsive exercise and body dissatisfaction, physical activity, and network variables. Descriptive statistics were conducted using SPSS, and quadratic assignment procedure (QAP) analyses were conducted using UCINET. QAP regression analysis revealed a statistically significant model (R2 = .375, P < .0001) predicting compulsive exercise behavior. Physical activity, body dissatisfaction, and network variables were statistically significant predictor variables in the QAP regression model. In our sample, women who are connected to "important" or "powerful" people in their network are likely to have higher compulsive exercise scores. This result provides healthcare practitioners with key target points for intervention within similar groups of women. For scholars researching eating disorders and associated behaviors, this study supports looking into group dynamics and network structure in conjunction with body dissatisfaction and exercise frequency.
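
    The QAP idea (correlate two dyadic matrices, then permute node labels to build a null distribution that respects dyadic dependence) can be sketched as follows. The networks below are random toys; UCINET implements the full regression version used in the study.

```python
import numpy as np

def qap_corr_test(A, B, n_perm=500, seed=0):
    # Correlate the off-diagonal dyads of two matrices, then permute the
    # node labels of B (rows and columns together) and re-correlate to
    # obtain a permutation p-value.
    rng = np.random.default_rng(seed)
    mask = ~np.eye(len(A), dtype=bool)
    obs = np.corrcoef(A[mask], B[mask])[0, 1]
    exceed = 0
    for _ in range(n_perm):
        p = rng.permutation(len(A))
        Bp = B[np.ix_(p, p)]
        if abs(np.corrcoef(A[mask], Bp[mask])[0, 1]) >= abs(obs):
            exceed += 1
    return obs, (exceed + 1) / (n_perm + 1)

rng = np.random.default_rng(42)
n = 15
A = np.triu((rng.random((n, n)) < 0.3).astype(float), 1)
A = A + A.T                                   # symmetric toy network
flip = np.triu((rng.random((n, n)) < 0.05).astype(float), 1)
B = np.abs(A - (flip + flip.T))               # B is A with a few dyads flipped
obs, pval = qap_corr_test(A, B)               # strong association, small p
```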

  5. Temporal Variability and Statistics of the Strehl Ratio in Adaptive-Optics Images

    DTIC Science & Technology

    2010-01-01

    with the appropriate models and the residuals were extracted. This was done using ARIMA modelling (Box & Jenkins 1970). ARIMA stands for... It was used here for the opposite goal: to obtain the values of the i.i.d. "noise" and test its distribution. Mixed ARIMA models of order 2 were... often sufficient to ensure non-significant autocorrelation of the residuals. Table 2 lists the stationary sequences with their respective ARIMA models

  6. Work domain constraints for modelling surgical performance.

    PubMed

    Morineau, Thierry; Riffaud, Laurent; Morandi, Xavier; Villain, Jonathan; Jannin, Pierre

    2015-10-01

    Three main approaches can be identified for modelling surgical performance: a competency-based approach, a task-based approach, both largely explored in the literature, and a less known work domain-based approach. The work domain-based approach first describes the work domain properties that constrain the agent's actions and shape the performance. This paper presents a work domain-based approach for modelling performance during cervical spine surgery, based on the idea that anatomical structures delineate the surgical performance. This model was evaluated through an analysis of junior and senior surgeons' actions. Twenty-four cervical spine surgeries performed by two junior and two senior surgeons were recorded in real time by an expert surgeon. According to a work domain-based model describing an optimal progression through anatomical structures, the degree of adjustment of each surgical procedure to a statistical polynomial function was assessed. Each surgical procedure showed a significant fit to the model, with regression coefficient values around 0.9. However, the surgeries performed by senior surgeons fitted this model significantly better than those performed by junior surgeons. Analysis of the relative frequencies of actions on anatomical structures showed that some specific anatomical structures discriminate senior from junior performances. The work domain-based modelling approach can provide an overall statistical indicator of surgical performance, but in particular, it can highlight specific points of interest among anatomical structures that the surgeons dwelled on according to their level of expertise.

  7. Estimating the color of maxillary central incisors based on age and gender

    PubMed Central

    Gozalo-Diaz, David; Johnston, William M.; Wee, Alvin G.

    2008-01-01

    Statement of problem There is no scientific information regarding the selection of the color of teeth for edentulous patients. Purpose The purpose of this study was to evaluate linear regression models that may be used to predict color parameters for central incisors of edentulous patients based on some characteristics of dentate subjects. Material and methods A spectroradiometer and an external light source were set in a noncontacting 45/0 degree (45-degree illumination and 0-degree observer) optical configuration to measure the color of subjects’ vital craniofacial structures (maxillary central incisor, attached gingiva, and facial skin). The subjects (n=120) were stratified into 5 age groups with 4 racial groups and balanced for gender. Linear first-order regression was used to determine the significant factors (α=.05) in the prediction model for each color direction of the color of the maxillary central incisor. Age, gender, and color of the other craniofacial structures were studied as potential predictors. Final predictions in each color direction were based only on the statistically significant factors, and then the color differences between observed and predicted CIELAB values for the central incisors were calculated and summarized. Results The statistically significant predictors of age and gender accounted for 36% of the total variability in L*. The statistically significant predictor of age accounted for 16% of the total variability in a*. The statistically significant predictors of age and gender accounted for 21% of the variability in b*. The mean ΔE (SD) between predicted and observed CIELAB values for the central incisor was 5.8 (3.2). Conclusions Age and gender were found to be statistically significant determinants in predicting the natural color of central incisors. 
Although the precision of these predictions was less than the median color difference found for all pairs of teeth studied, and may therefore be considered acceptable, further study is needed to improve this precision toward the limit of detection. Clinical Implications Age is highly correlated with the natural color of the central incisors. As age increases, the central incisor becomes darker, more reddish, and more yellow. Also, the women subjects in this study had lighter and less yellow central incisors than the men. PMID:18672125

  8. Cosmic shear measurements with Dark Energy Survey Science Verification data

    DOE PAGES

    Becker, M. R.

    2016-07-06

    Here, we present measurements of weak gravitational lensing cosmic shear two-point statistics using Dark Energy Survey Science Verification data. We demonstrate that our results are robust to the choice of shear measurement pipeline, either ngmix or im3shape, and robust to the choice of two-point statistic, including both real and Fourier-space statistics. Our results pass a suite of null tests including tests for B-mode contamination and direct tests for any dependence of the two-point functions on a set of 16 observing conditions and galaxy properties, such as seeing, airmass, galaxy color, galaxy magnitude, etc. We use a large suite of simulations to compute the covariance matrix of the cosmic shear measurements and assign statistical significance to our null tests. We find that our covariance matrix is consistent with the halo model prediction, indicating that it has the appropriate level of halo sample variance. We also compare the same jackknife procedure applied to the data and the simulations in order to search for additional sources of noise not captured by the simulations. We find no statistically significant extra sources of noise in the data. The overall detection significance with tomography for our highest source density catalog is 9.7σ. Cosmological constraints from the measurements in this work are presented in a companion paper.

  9. Stationary statistical theory of two-surface multipactor regarding all impacts for efficient threshold analysis

    NASA Astrophysics Data System (ADS)

    Lin, Shu; Wang, Rui; Xia, Ning; Li, Yongdong; Liu, Chunliang

    2018-01-01

    Statistical multipactor theories are critical prediction approaches for multipactor breakdown determination. However, these approaches still require a trade-off between calculation efficiency and accuracy. This paper presents an improved stationary statistical theory for efficient threshold analysis of two-surface multipactor. A general integral equation over the distribution function of the electron emission phase, with both single-sided and double-sided impacts considered, is formulated. The modeling results indicate that the improved stationary statistical theory not only matches the accuracy of multipactor threshold calculation achieved by the nonstationary statistical theory, but also attains high calculation efficiency. By using this improved stationary statistical theory, the total time consumed in calculating full multipactor susceptibility zones of parallel plates can be decreased by as much as a factor of four relative to the nonstationary statistical theory. It also shows that the effect of single-sided impacts is indispensable for accurate multipactor prediction of coaxial lines, and is more significant for high-order multipactor. Finally, the influence of secondary emission yield (SEY) properties on the multipactor threshold is further investigated. It is observed that the first cross energy and the energy range between the first cross and the SEY maximum both play a significant role in determining the multipactor threshold, which agrees with numerical simulation results in the literature.

  10. What can 35 years and over 700,000 measurements tell us about noise exposure in the mining industry?

    PubMed

    Roberts, Benjamin; Sun, Kan; Neitzel, Richard L

    2017-01-01

    To analyse over 700,000 cross-sectional measurements from the Mine Safety and Health Administration (MSHA) and develop statistical models to predict noise exposure for a worker. Descriptive statistics were used to summarise the data. Two linear regression models were used to predict noise exposure based on the MSHA permissible exposure limit (PEL) and action level (AL), respectively. Twofold cross-validation was used to compare the exposure estimates from the models to actual measurements. The mean difference and t-statistic were calculated for each job title to determine whether the model predictions were significantly different from the actual data. Measurements were acquired from MSHA through a Freedom of Information Act request. From 1979 to 2014, noise exposure has decreased. Measurements taken before the implementation of MSHA's revised noise regulation in 2000 were on average 4.5 dBA higher than after the law was implemented. Both models produced exposure predictions that differed by less than 1 dBA from the holdout data. Overall noise levels in mines have been decreasing. However, this decrease has not been uniform across all mining sectors. The exposure predictions from the model will be useful in predicting hearing loss in workers in the mining industry.

  11. Sampling methods to the statistical control of the production of blood components.

    PubMed

    Pereira, Paulo; Seghatchian, Jerard; Caldeira, Beatriz; Santos, Paula; Castro, Rosa; Fernandes, Teresa; Xavier, Sandra; de Sousa, Gracinda; de Almeida E Sousa, João Paulo

    2017-12-01

    The control of blood component specifications is a requirement generalized in Europe by the European Commission Directives and in the US by the AABB standards. The use of a statistical process control methodology is recommended in the related literature, including the EDQM guideline. The reliability of the control depends on the sampling. However, a correct sampling methodology does not seem to be systematically applied. Commonly, the sampling is intended solely to comply with the 1% specification for the produced blood components. Nevertheless, from a purely statistical viewpoint, this model is arguably not grounded in a consistent sampling technique. This could be a severe limitation in detecting abnormal patterns and in assuring that the production has a non-significant probability of producing nonconforming components. This article discusses what is happening in blood establishments. Three statistical methodologies are proposed: simple random sampling, sampling based on the proportion of a finite population, and sampling based on the inspection level. The empirical results demonstrate that these models are practicable in blood establishments, contributing to the robustness of sampling and of the related statistical process control decisions. Copyright © 2017 Elsevier Ltd. All rights reserved.
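
    As an illustration of the second proposed methodology, a common textbook formula gives the sample size for estimating a proportion in a finite lot, with a finite-population correction. This is a generic sketch, not the article's worked example; the p, e, and z values are assumptions.

```python
# Generic sample-size sketch for checking a proportion in a finite lot.
# The 1% nonconforming rate (p), 1% margin (e), and 95% confidence (z)
# are illustrative assumptions, not values from the article.
import math

def sample_size_proportion(N, p=0.01, e=0.01, z=1.96):
    """Sample size for a proportion, with finite-population correction."""
    n0 = z ** 2 * p * (1 - p) / e ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / N))

# e.g. a hypothetical monthly production lot of 3000 blood components
print(sample_size_proportion(3000))
```

    Note how the correction matters: for a small lot the required sample is noticeably below the infinite-population figure.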

  12. Statistical validation of predictive TRANSP simulations of baseline discharges in preparation for extrapolation to JET D-T

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Tae; Romanelli, M.; Yuan, X.; Kaye, S.; Sips, A. C. C.; Frassinetti, L.; Buchanan, J.; Contributors, JET

    2017-06-01

    This paper presents for the first time a statistical validation of predictive TRANSP simulations of plasma temperature using two transport models, GLF23 and TGLF, over a database of 80 baseline H-mode discharges in JET-ILW. While the accuracy of the Te predicted with TRANSP-GLF23 is affected by plasma collisionality, the dependency of the predictions on collisionality is less significant when using TRANSP-TGLF, indicating that the latter model has a broader applicability across plasma regimes. TRANSP-TGLF also shows good agreement between the predicted Ti and experimental measurements, allowing for a more accurate prediction of the neutron yields. The impact of the input data and assumptions prescribed in the simulations is also investigated in this paper. The statistical validation and the assessment of the uncertainty level in predictive TRANSP simulations for JET-ILW-DD will constitute the basis for the extrapolation to JET-ILW-DT experiments.

  13. Streamwise evolution of statistical events and the triple correlation in a model wind turbine array

    NASA Astrophysics Data System (ADS)

    Viestenz, Kyle; Cal, Raúl Bayoán

    2013-11-01

    Hot-wire anemometry data, obtained from a wind tunnel experiment containing a 3 × 3 wind turbine array, are used to conditionally average the Reynolds stresses. Nine profiles at the centerline behind the array are analyzed to characterize the turbulent velocity statistics of the wake flow. Quadrant analysis yields the statistical events occurring in the wake of the wind farm, where quadrants 2 and 4 produce ejections and sweeps, respectively. The balance between these quadrants is expressed via the ΔSo parameter, which attains a maximum value at the bottom tip and changes sign near the top tip of the rotor. These events are then associated with the triple correlation term present in the turbulent kinetic energy equation of the fluctuations. The development of these various quantities is assessed in light of wake remediation and energy transport, and possesses significance for closure models. National Science Foundation: ECCS-1032647.
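
    The quadrant decomposition used above is easy to state in code: each velocity-fluctuation sample (u', v') is binned by the signs of u' and v', and the Reynolds-stress contribution -u'v' is summed per quadrant. This is a generic sketch of quadrant analysis, not the authors' processing chain; the sample values are invented.

```python
# Quadrant analysis sketch: Q2 (u'<0, v'>0) = ejections,
# Q4 (u'>0, v'<0) = sweeps. Fluctuation samples below are invented.
def quadrant_contributions(u_fluct, v_fluct):
    """Return the summed -u'v' contribution from each quadrant Q1..Q4."""
    totals = {1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0}
    for u, v in zip(u_fluct, v_fluct):
        if u >= 0 and v >= 0:
            q = 1
        elif u < 0 and v >= 0:
            q = 2
        elif u < 0 and v < 0:
            q = 3
        else:
            q = 4
        totals[q] += -u * v
    return totals

u = [0.5, -0.4, -0.3, 0.6, -0.5, 0.2]
v = [-0.2, 0.3, -0.1, 0.1, 0.4, -0.3]
c = quadrant_contributions(u, v)
# A ΔSo-style imbalance compares ejections (Q2) against sweeps (Q4):
print(c[2] - c[4])
```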

  14. Prediction of Patient-Controlled Analgesic Consumption: A Multimodel Regression Tree Approach.

    PubMed

    Hu, Yuh-Jyh; Ku, Tien-Hsiung; Yang, Yu-Hung; Shen, Jia-Ying

    2018-01-01

    Several factors contribute to individual variability in postoperative pain; therefore, individuals consume postoperative analgesics at different rates. Although many statistical studies have analyzed postoperative pain and analgesic consumption, most have identified only correlations and have not subjected the statistical model to further tests to evaluate its predictive accuracy. In this study involving 3052 patients, a multistrategy computational approach was developed for predicting analgesic consumption. This approach uses data on patient-controlled analgesia demand behavior over time and combines clustering, classification, and regression to mitigate the limitations of current statistical models. Cross-validation results indicated that the proposed approach significantly outperforms various existing regression methods. Moreover, a comparison between the predictions of anesthesiologists and medical specialists and those of the computational approach, on an independent test data set of 60 patients, further evidenced the superiority of the computational approach in predicting analgesic consumption, as it produced markedly lower root mean squared errors.
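
    The comparison metric named at the end, root mean squared error, is simple to state. A minimal sketch with invented consumption numbers (the study's data are not reproduced here):

```python
# RMSE sketch: lower is better; compare two sets of predictions
# against the same (invented) observed values.
import math

def rmse(predicted, observed):
    """Root mean squared error over paired predictions."""
    n = len(predicted)
    return math.sqrt(sum((p - o) ** 2
                         for p, o in zip(predicted, observed)) / n)

model_pred = [22.0, 35.0, 18.0, 40.0]      # hypothetical consumption
clinician_pred = [25.0, 30.0, 25.0, 33.0]  # hypothetical consumption
actual = [24.0, 33.0, 20.0, 38.0]
print(rmse(model_pred, actual), rmse(clinician_pred, actual))
```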

  15. Effect of Orem's Self-Care Model on Self-Esteem of Adolescents with Asthma Referred to an Asthma and Allergy Clinic in Isfahan.

    PubMed

    Hemati, Zeinab; Mosaviasl, Fatemeh Sadat; Abasi, Samira; Ghazavi, Zohre; Kiani, Davood

    2015-01-01

    Acquisition of chronic diseases such as asthma leads to psychological, mental and physical complications in adolescents, and hence their self-esteem may be compromised. Therefore, the present study was conducted to assess the effect of Orem's self-care model on the self-esteem of adolescents with asthma. This semi-experimental study enrolled 64 asthmatic adolescents referred to Shariati Hospital, Isfahan. Subjects were consecutively assigned to control and intervention groups. The self-care training program was then conducted according to Orem's self-care model in eight two-hour sessions based on self-care needs, and self-esteem was measured in the two groups prior to and two months after the last training session. The data were collected with a questionnaire of demographic characteristics and the Coopersmith Self-Esteem Inventories (CSEI) and analyzed with SPSS version 20. An independent t-test showed a significant difference in the mean score of self-esteem between the intervention and control groups after the training (P<0.05), but the difference was not statistically significant prior to the intervention. A paired t-test showed a significant difference in the mean score of self-esteem before and after the training in the intervention group (P<0.01), but this difference was not statistically significant in the control group (P>0.05). Given the effect of Orem's self-care model on the self-esteem of adolescents with asthma, we recommend the use of this model as a care intervention in healthcare centers to promote adolescents' health.

  16. Climate change and dissolved organic carbon export to the Gulf of Maine

    USGS Publications Warehouse

    Huntington, Thomas G.; Balch, William M.; Aiken, George R.; Sheffield, Justin; Luo, Lifeng; Roesler, Collin S.; Camill, Philip

    2016-01-01

    Ongoing climate change is affecting the concentration, export (flux), and timing of dissolved organic carbon (DOC) exported to the Gulf of Maine (GoM) through changes in hydrologic regime. DOC export was calculated for water years 1950 through 2013 for 20 rivers and for water years 1930 through 2013 for 14 rivers draining to the GoM. DOC export was also estimated for the 21st century based on climate and hydrologic modeling in a previously published study. DOC export was calculated by using the regression model LOADEST to fit seasonally adjusted concentration-discharge (C-Q) relations. Because land cover and vegetation were held constant over time, our results constitute an analysis of the sensitivity of DOC export to changes in hydrologic conditions. Despite large interannual variability, all rivers had increasing DOC export during winter, and these trends were significant (p < 0.05) in 10 out of 20 rivers for 1950 to 2013 and in 13 out of 14 rivers for 1930 to 2013. All rivers also had increasing annual export of DOC, although fewer trends were statistically significant than for winter export. Projections for DOC export during the 21st century varied depending on the climate model and greenhouse gas emission scenario, which affected future river discharge through effects on precipitation and evapotranspiration. The most consistent result was a significant increase in DOC export in winter in all model-by-emission scenarios. DOC export was projected to decrease during the summer in all model-by-emission scenarios, with statistically significant decreases in half of the scenarios.
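
    The LOADEST fit mentioned above is, in essence, a least-squares regression of log concentration on log discharge plus seasonal harmonics. The sketch below shows that idea only; it is not the LOADEST code, and the data are synthetic.

```python
# Sketch of a seasonally adjusted C-Q regression in the spirit of
# LOADEST: ln(C) = b0 + b1*ln(Q) + b2*sin(2*pi*t) + b3*cos(2*pi*t).
# Coefficients and data are synthetic, chosen only for illustration.
import numpy as np

rng = np.random.default_rng(0)
t = rng.uniform(0, 1, 200)      # decimal-year fraction of each sample
lnQ = rng.normal(0, 1, 200)     # log discharge
true = 1.0 + 0.8 * lnQ + 0.3 * np.sin(2 * np.pi * t) \
       + 0.1 * np.cos(2 * np.pi * t)
lnC = true + rng.normal(0, 0.05, 200)   # log concentration with noise

X = np.column_stack([np.ones_like(t), lnQ,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
beta, *_ = np.linalg.lstsq(X, lnC, rcond=None)
print(np.round(beta, 2))   # recovered coefficients
```

    With the fitted coefficients, export follows by evaluating C along the daily discharge record and integrating C·Q over time.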

  17. Normalization, bias correction, and peak calling for ChIP-seq

    PubMed Central

    Diaz, Aaron; Park, Kiyoub; Lim, Daniel A.; Song, Jun S.

    2012-01-01

    Next-generation sequencing is rapidly transforming our ability to profile the transcriptional, genetic, and epigenetic states of a cell. In particular, sequencing DNA from the immunoprecipitation of protein-DNA complexes (ChIP-seq) and methylated DNA (MeDIP-seq) can reveal the locations of protein binding sites and epigenetic modifications. These approaches contain numerous biases that may significantly influence the interpretation of the resulting data. Rigorous computational methods for detecting and removing such biases are still lacking. Multi-sample normalization also remains an important open problem. This theoretical paper systematically characterizes the biases and properties of ChIP-seq data by comparing 62 separate publicly available datasets, using rigorous statistical models and signal processing techniques. Statistical methods for separating ChIP-seq signal from background noise, as well as for correcting enrichment test statistics for sequence-dependent and sonication biases, are presented. Our method effectively separates reads into signal and background components prior to normalization, improving the signal-to-noise ratio. Moreover, most peak callers currently use a generic null model that suffers from low specificity at the sensitivity level requisite for detecting subtle, but true, ChIP enrichment. The proposed method of determining a cell type-specific null model, which accounts for cell type-specific biases, is shown to achieve a lower false discovery rate at a given significance threshold than current methods. PMID:22499706

  18. Statistical inference of dynamic resting-state functional connectivity using hierarchical observation modeling.

    PubMed

    Sojoudi, Alireza; Goodyear, Bradley G

    2016-12-01

    Spontaneous fluctuations of blood-oxygenation level-dependent functional magnetic resonance imaging (BOLD fMRI) signals are highly synchronous between brain regions that serve similar functions. This provides a means to investigate functional networks; however, most analysis techniques assume functional connections are constant over time. This may be problematic in the case of neurological disease, where functional connections may be highly variable. Recently, several methods have been proposed to determine moment-to-moment changes in the strength of functional connections over an imaging session (so-called dynamic connectivity). Here a novel analysis framework based on a hierarchical observation modeling approach was proposed to permit statistical inference of the presence of dynamic connectivity. A two-level linear model composed of overlapping sliding windows of fMRI signals, incorporating the fact that overlapping windows are not independent, was described. To test this approach, datasets were synthesized whereby functional connectivity was either constant (significant or insignificant) or modulated by an external input. The method successfully determines the statistical significance of a functional connection in phase with the modulation, and it exhibits greater sensitivity and specificity in detecting regions with variable connectivity than sliding-window correlation analysis. For real data, this technique possesses greater reproducibility and provides a more discriminative estimate of dynamic connectivity than sliding-window correlation analysis. Hum Brain Mapp 37:4566-4580, 2016. © 2016 Wiley Periodicals, Inc.

  19. The effect of health shocks on smoking and obesity.

    PubMed

    Sundmacher, Leonie

    2012-08-01

    To investigate whether negative changes in their own health (i.e. health shocks) or in that of a smoking or obese household member, lead smokers to quit smoking and obese individuals to lose weight. The study is informed by economic models ('rational addiction' and 'demand for health' models) which offer hypotheses on the relationship between health shocks and health-related behaviour. Each hypothesis was tested applying a discrete-time hazard model with random effects using up to ten waves of the German Socioeconomic Panel (GSOEP) and statistics on cigarette, food and beverage prices provided by the Federal Statistical Office. Health shocks had a significant positive impact on the probability that smokers quit during the same year in which they experienced the health shock. Health shocks of a smoking household member between year t-2 and t-1 also motivated smoking cessation, although statistical evidence for this was weaker. Health shocks experienced by obese individuals or their household members had, on the other hand, no significant effect on weight loss, as measured by changes in Body Mass Index (BMI). The results of the study suggest that smokers are aware of the risks associated with tobacco consumption, know about effective strategies to quit smoking, and are willing to quit for health-related reasons. In contrast, there was no evidence for changes in health-related behaviour among obese individuals after a health shock.

  20. A knowledge-based approach to automated planning for hepatocellular carcinoma.

    PubMed

    Zhang, Yujie; Li, Tingting; Xiao, Han; Ji, Weixing; Guo, Ming; Zeng, Zhaochong; Zhang, Jianying

    2018-01-01

    To build a knowledge-based model of liver cancer for Auto-Planning, a function in Pinnacle that serves as an automated inverse intensity-modulated radiation therapy (IMRT) planning system. Fifty Tomotherapy patients were enrolled to extract dose-volume histogram (DVH) information and construct the protocol for the Auto-Planning model. Twenty additional patients were chosen to test the model. Manual planning and automatic planning were performed blindly for all twenty test patients with the same machine and treatment planning system. The dose distributions of the target and organs at risk (OARs), along with the working time for planning, were evaluated. Statistically significant results showed that automated plans achieved a better target conformity index (CI), although the mean target dose was 0.5 Gy higher than with manual plans. The differences between the target homogeneity indexes (HI) of the two methods were not statistically significant. Additionally, the doses to the normal liver, left kidney, and small bowel were significantly reduced with the automated plans. In particular, the mean dose and V15 of the normal liver were 1.4 Gy and 40.5 cc lower with automated plans, respectively. Mean doses to the left kidney and small bowel were reduced with automated plans by 1.2 Gy and 2.1 Gy, respectively. Working time was also significantly reduced with automated planning. Auto-Planning proved feasible and effective in our knowledge-based model for liver cancer. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  1. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.

    The additivity model assumes that field-scale reaction properties in a sediment, including surface area, reactive site concentration, and reaction rate, can be predicted from the field-scale grain-size distribution by linearly adding reaction properties estimated in the laboratory for individual grain-size fractions. This study evaluated the additivity model in scaling mass transfer-limited, multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of the rate constants for individual grain-size fractions, which were then used to predict rate-limited U(VI) desorption in the composite sediment. The results indicated that the additivity model with respect to the rate of U(VI) desorption provided a good prediction of U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model. An approximate additivity model for directly scaling rate constants was subsequently proposed and evaluated. The approximate model provided a good prediction of the experimental results within statistical uncertainty. This study also found that a gravel-size fraction (2 to 8 mm), which is often ignored in modeling U(VI) sorption and desorption, contributes statistically significantly to U(VI) desorption in the sediment.
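
    The additivity assumption itself reduces to a mass-fraction-weighted sum of laboratory-measured values for each grain-size fraction. A minimal sketch of that idea (all numbers invented):

```python
# Additivity sketch: a field-scale reaction property is the
# mass-fraction-weighted sum of per-fraction lab values.
def additive_property(mass_fractions, lab_values):
    """Field-scale property predicted by linear additivity."""
    assert abs(sum(mass_fractions) - 1.0) < 1e-9  # fractions must sum to 1
    return sum(f * v for f, v in zip(mass_fractions, lab_values))

# Hypothetical reactive site concentrations (umol/g) for a <2 mm
# fraction and the often-ignored 2-8 mm gravel fraction.
fractions = [0.7, 0.3]
site_conc = [0.12, 0.02]
print(additive_property(fractions, site_conc))
```

    The study's point is that this works for desorption rates but not directly for the rate constants, which require the approximate scaling model.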

  2. An application of statistics to comparative metagenomics

    PubMed Central

    Rodriguez-Brito, Beltran; Rohwer, Forest; Edwards, Robert A

    2006-01-01

    Background Metagenomics, sequence analyses of genomic DNA isolated directly from the environments, can be used to identify organisms and model community dynamics of a particular ecosystem. Metagenomics also has the potential to identify significantly different metabolic potential in different environments. Results Here we use a statistical method to compare curated subsystems, to predict the physiology, metabolism, and ecology from metagenomes. This approach can be used to identify those subsystems that are significantly different between metagenome sequences. Subsystems that were overrepresented in the Sargasso Sea and Acid Mine Drainage metagenome when compared to non-redundant databases were identified. Conclusion The methodology described herein applies statistics to the comparisons of metabolic potential in metagenomes. This analysis reveals those subsystems that are more, or less, represented in the different environments that are compared. These differences in metabolic potential lead to several testable hypotheses about physiology and metabolism of microbes from these ecosystems. PMID:16549025

  3. An application of statistics to comparative metagenomics.

    PubMed

    Rodriguez-Brito, Beltran; Rohwer, Forest; Edwards, Robert A

    2006-03-20

    Metagenomics, sequence analyses of genomic DNA isolated directly from the environments, can be used to identify organisms and model community dynamics of a particular ecosystem. Metagenomics also has the potential to identify significantly different metabolic potential in different environments. Here we use a statistical method to compare curated subsystems, to predict the physiology, metabolism, and ecology from metagenomes. This approach can be used to identify those subsystems that are significantly different between metagenome sequences. Subsystems that were overrepresented in the Sargasso Sea and Acid Mine Drainage metagenome when compared to non-redundant databases were identified. The methodology described herein applies statistics to the comparisons of metabolic potential in metagenomes. This analysis reveals those subsystems that are more, or less, represented in the different environments that are compared. These differences in metabolic potential lead to several testable hypotheses about physiology and metabolism of microbes from these ecosystems.
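
    One simple way to test whether a subsystem is overrepresented in one metagenome relative to another is a two-proportion z-test on assigned-read counts. The sketch below illustrates that generic idea with invented counts; it is not the paper's own statistic or data.

```python
# Two-proportion z-test sketch: compare the fraction of reads assigned
# to one subsystem in two metagenomes. All counts are invented.
import math

def two_proportion_z(k1, n1, k2, n2):
    """z statistic for the difference of two proportions (pooled SE)."""
    p1, p2 = k1 / n1, k2 / n2
    p = (k1 + k2) / (n1 + n2)          # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# e.g. 180 of 10,000 reads vs. 120 of 10,000 reads hit the subsystem
z = two_proportion_z(180, 10000, 120, 10000)
print(round(z, 2))
```

    A |z| above about 1.96 corresponds to p < 0.05, which is the sense in which a subsystem is called significantly different between environments.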

  4. Vitamin B12 production from crude glycerol by Propionibacterium freudenreichii ssp. shermanii: optimization of medium composition through statistical experimental designs.

    PubMed

    Kośmider, Alicja; Białas, Wojciech; Kubiak, Piotr; Drożdżyńska, Agnieszka; Czaczyk, Katarzyna

    2012-02-01

    A two-step statistical experimental design was employed to optimize the medium for vitamin B(12) production from crude glycerol by Propionibacterium freudenreichii ssp. shermanii. In the first step, using Plackett-Burman design, five of 13 tested medium components (calcium pantothenate, NaH(2)PO(4)·2H(2)O, casein hydrolysate, glycerol and FeSO(4)·7H(2)O) were identified as factors having significant influence on vitamin production. In the second step, a central composite design was used to optimize levels of medium components selected in the first step. Valid statistical models describing the influence of significant factors on vitamin B(12) production were established for each optimization phase. The optimized medium provided a 93% increase in final vitamin concentration compared to the original medium. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Cluster and propensity based approximation of a network

    PubMed Central

    2013-01-01

    Background The models in this article generalize current models for both correlation networks and multigraph networks. Correlation networks are widely applied in genomics research. In contrast to general networks, it is straightforward to test the statistical significance of an edge in a correlation network. It is also easy to decompose the underlying correlation matrix and generate informative network statistics such as the module eigenvector. However, correlation networks only capture the connections between numeric variables. An open question is whether one can find suitable decompositions of the similarity measures employed in constructing general networks. Multigraph networks are attractive because they support likelihood based inference. Unfortunately, it is unclear how to adjust current statistical methods to detect the clusters inherent in many data sets. Results Here we present an intuitive and parsimonious parametrization of a general similarity measure such as a network adjacency matrix. The cluster and propensity based approximation (CPBA) of a network not only generalizes correlation network methods but also multigraph methods. In particular, it gives rise to a novel and more realistic multigraph model that accounts for clustering and provides likelihood based tests for assessing the significance of an edge after controlling for clustering. We present a novel Majorization-Minimization (MM) algorithm for estimating the parameters of the CPBA. To illustrate the practical utility of the CPBA of a network, we apply it to gene expression data and to a bi-partite network model for diseases and disease genes from the Online Mendelian Inheritance in Man (OMIM). 
Conclusions The CPBA of a network is theoretically appealing since a) it generalizes correlation and multigraph network methods, b) it improves likelihood based significance tests for edge counts, c) it directly models higher-order relationships between clusters, and d) it suggests novel clustering algorithms. The CPBA of a network is implemented in Fortran 95 and bundled in the freely available R package PropClust. PMID:23497424

  6. Pupil Influence on the Visual Outcomes of a New-Generation Multifocal Toric Intraocular Lens With a Surface-Embedded Near Segment.

    PubMed

    Wang, Mengmeng; Corpuz, Christine Carole C; Huseynova, Tukezban; Tomita, Minoru

    2016-02-01

    To evaluate the influences of preoperative pupil parameters on the visual outcomes of a new-generation multifocal toric intraocular lens (IOL) model with a surface-embedded near segment. In this prospective study, patients with cataract underwent phacoemulsification and implantation of Lentis Mplus toric LU-313 30TY IOLs (Oculentis GmbH, Berlin, Germany). The visual and optical outcomes were measured and compared preoperatively and postoperatively. The correlations between preoperative pupil parameters (diameter and decentration) and 3-month postoperative visual outcomes were evaluated using the Spearman's rank-order correlation coefficient (Rs) for the nonparametric data. A total of 27 eyes (16 patients) were enrolled into the current study. Statistically significant improvements in visual and refractive performance were found after the implantation of Lentis Mplus toric LU-313 30TY IOLs (P < .05). Statistically significant correlations were present between preoperative pupil diameters and postoperative visual acuities (Rs > 0; P < .05); patients with larger pupils consistently had better postoperative visual acuities. Meanwhile, there was no statistically significant correlation between pupil decentration and visual acuities (P > .05). Lentis Mplus toric LU-313 30TY IOLs provided excellent visual and optical performance during the 3-month follow-up. The preoperative pupil size is an important parameter when this toric multifocal IOL model is contemplated for surgery. Copyright 2016, SLACK Incorporated.

  7. Multinomial logistic regression analysis for differentiating 3 treatment outcome trajectory groups for headache-associated disability.

    PubMed

    Lewis, Kristin Nicole; Heckman, Bernadette Davantes; Himawan, Lina

    2011-08-01

    Growth mixture modeling (GMM) identified latent groups based on treatment outcome trajectories of headache disability measures in patients in headache subspecialty treatment clinics. Using a longitudinal design, 219 patients in headache subspecialty clinics in 4 large cities throughout Ohio provided data on their headache disability at pretreatment and 3 follow-up assessments. GMM identified 3 treatment outcome trajectory groups: (1) patients who initiated treatment with elevated disability levels and who reported statistically significant reductions in headache disability (high-disability improvers; 11%); (2) patients who initiated treatment with elevated disability but who reported no reductions in disability (high-disability nonimprovers; 34%); and (3) patients who initiated treatment with moderate disability and who reported statistically significant reductions in headache disability (moderate-disability improvers; 55%). Based on the final multinomial logistic regression model, a dichotomized treatment appointment attendance variable was a statistically significant predictor for differentiating high-disability improvers from high-disability nonimprovers. Three-fourths of patients who initiated treatment with elevated disability levels did not report reductions in disability after 5 months of treatment with new preventive pharmacotherapies. Preventive headache agents may be most efficacious for patients with moderate levels of disability and for patients with high disability levels who attend all treatment appointments. Copyright © 2011 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  8. PTEN Loss as Determined by Clinical-grade Immunohistochemistry Assay Is Associated with Worse Recurrence-free Survival in Prostate Cancer

    PubMed Central

    Lotan, Tamara L.; Wei, Wei; Morais, Carlos L.; Hawley, Sarah T.; Fazli, Ladan; Hurtado-Coll, Antonio; Troyer, Dean; McKenney, Jesse K.; Simko, Jeffrey; Carroll, Peter R.; Gleave, Martin; Lance, Raymond; Lin, Daniel W.; Nelson, Peter S.; Thompson, Ian M.; True, Lawrence D.; Feng, Ziding; Brooks, James D.

    2015-01-01

    Background PTEN is the most commonly deleted tumor suppressor gene in primary prostate cancer (PCa) and its loss is associated with poor clinical outcomes and ERG gene rearrangement. Objective We tested whether PTEN loss is associated with shorter recurrence-free survival (RFS) in surgically treated PCa patients with known ERG status. Design, setting, and participants A genetically validated, automated PTEN immunohistochemistry (IHC) protocol was used for 1275 primary prostate tumors from the Canary Foundation retrospective PCa tissue microarray cohort to assess homogeneous (in all tumor tissue sampled) or heterogeneous (in a subset of tumor tissue sampled) PTEN loss. ERG status as determined by a genetically validated IHC assay was available for a subset of 938 tumors. Outcome measurements and statistical analysis Associations between PTEN and ERG status were assessed using Fisher’s exact test. Kaplan-Meier and multivariate weighted Cox proportional models for RFS were constructed. Results and limitations When compared to intact PTEN, homogeneous (hazard ratio [HR] 1.66, p = 0.001) but not heterogeneous (HR 1.24, p = 0.14) PTEN loss was significantly associated with shorter RFS in multivariate models. Among ERG-positive tumors, homogeneous (HR 3.07, p < 0.0001) but not heterogeneous (HR 1.46, p = 0.10) PTEN loss was significantly associated with shorter RFS. Among ERG-negative tumors, PTEN did not reach significance for inclusion in the final multivariate models. The interaction term for PTEN and ERG status with respect to RFS did not reach statistical significance (p = 0.11) for the current sample size. Conclusions These data suggest that PTEN is a useful prognostic biomarker and that there is no statistically significant interaction between PTEN and ERG status for RFS. 
Patient summary We found that loss of the PTEN tumor suppressor gene in prostate tumors as assessed by tissue staining is correlated with shorter time to prostate cancer recurrence after radical prostatectomy. PMID:27617307

  9. Variables that influence BRAF mutation probability: A next-generation sequencing, non-interventional investigation of BRAFV600 mutation status in melanoma.

    PubMed

    Gaiser, Maria Rita; Skorokhod, Alexander; Gransheier, Diana; Weide, Benjamin; Koch, Winfried; Schif, Birgit; Enk, Alexander; Garbe, Claus; Bauer, Jürgen

    2017-01-01

    The incidence of melanoma, particularly in older patients, has steadily increased over the past few decades. Activating mutations of BRAF, the majority occurring in BRAFV600, are frequently detected in melanoma; however, the prognostic significance remains unclear. This study aimed to define the probability and distribution of BRAFV600 mutations, and the clinico-pathological factors that may affect BRAF mutation status, in patients with advanced melanoma using next-generation sequencing. This was a non-interventional, retrospective study of BRAF mutation testing at two German centers, in Heidelberg and Tübingen. Archival tumor samples from patients with histologically confirmed melanoma (stage IIIB, IIIC, IV) were analyzed using PCR amplification and deep sequencing. Clinical, histological, and mutation data were collected. The statistical influence of patient- and tumor-related characteristics on BRAFV600 mutation status was assessed using multiple logistic regression (MLR) and a prediction profiler. BRAFV600 mutation status was assessed in 453 samples. Mutations were detected in 57.6% of patients (n = 261), with 48.1% (n = 102) at the Heidelberg site and 66.0% (n = 159) at the Tübingen site. The decreasing influence of increasing age on mutation probability was quantified. A main effects MLR model identified age (p = 0.0001), center (p = 0.0004), and melanoma subtype (p = 0.014) as significantly influencing BRAFV600 mutation probability; ultraviolet (UV) exposure showed a statistical trend (p = 0.1419). An interaction model of age versus other variables showed that center (p<0.0001) and melanoma subtype (p = 0.0038) significantly influenced BRAF mutation probability; age had a statistically significant effect only as part of an interaction with both UV exposure (p = 0.0110) and melanoma subtype (p = 0.0134). 
This exploratory study highlights that testing center, melanoma subtype, and age in combination with UV exposure and melanoma subtype significantly influence BRAFV600 mutation probability in patients with melanoma. Further validation of this model, in terms of reproducibility and broader relevance, is required.
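
    The main-effects MLR approach described above can be sketched as follows. This is an illustrative reconstruction on synthetic data: the predictor coding, effect sizes, and simulated outcomes are invented, not the study's data; only the sample size and the direction of the age effect mirror the abstract.

```python
# Hypothetical sketch of a multiple logistic regression for mutation status
# (0/1) as a function of age, center, and subtype. All rows are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 453  # matches the study's sample count; the rows themselves are synthetic
age = rng.uniform(20, 90, n)
center = rng.integers(0, 2, n)      # 0 = site A, 1 = site B (invented coding)
subtype = rng.integers(0, 3, n)     # invented melanoma-subtype coding

# Simulate the reported direction of effect: mutation probability falls with age
logit = 2.0 - 0.04 * age + 0.7 * center + 0.3 * (subtype == 1)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([age, center, subtype == 1])
model = LogisticRegression().fit(X, y)
print(model.coef_[0][0] < 0)  # expect a negative age coefficient, as reported
```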

  10. Estimation of social value of statistical life using willingness-to-pay method in Nanjing, China.

    PubMed

    Yang, Zhao; Liu, Pan; Xu, Xin

    2016-10-01

    Rational decision making regarding safety-related investment programs depends greatly on the economic valuation of traffic crashes. The primary objective of this study was to estimate the social value of statistical life in the city of Nanjing in China. A stated preference survey was conducted to investigate travelers' willingness to pay for traffic risk reduction. Face-to-face interviews were conducted at stations, shopping centers, schools, and parks in different districts in the urban area of Nanjing. The respondents were categorized into two groups: motorists and non-motorists. Both binary logit and mixed logit models were developed for the two groups. The results revealed that the mixed logit model is superior to the fixed-coefficient binary logit model. The factors that significantly affect people's willingness to pay for risk reduction include income, education, gender, age, drive age (for motorists), occupation, whether the charged fees were used to improve private vehicle equipment (for motorists), reduction in fatality rate, and change in travel cost. The Monte Carlo simulation method was used to generate the distribution of value of statistical life (VSL). Based on the mixed logit model, the VSL had a mean value of 3,729,493 RMB ($586,610) with a standard deviation of 2,181,592 RMB ($343,142) for motorists, and a mean of 3,281,283 RMB ($505,318) with a standard deviation of 2,376,975 RMB ($366,054) for non-motorists. Using the tax system to illustrate the contribution of different income groups to social funds, the social value of statistical life was estimated. The average social value of statistical life was found to be 7,184,406 RMB ($1,130,032). Copyright © 2016 Elsevier Ltd. All rights reserved.
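
    The Monte Carlo step can be sketched as below: draws are taken from an assumed distribution matched to the motorists' reported mean and standard deviation. The lognormal form is an illustrative assumption on our part, not the paper's stated distribution; only the two moments come from the abstract.

```python
# Hedged sketch: simulate a VSL distribution matched to the reported moments.
import numpy as np

mean, sd = 3_729_493.0, 2_181_592.0    # motorists' VSL in RMB, from the abstract
sigma2 = np.log(1 + (sd / mean) ** 2)  # lognormal parameters from the moments
mu = np.log(mean) - sigma2 / 2

rng = np.random.default_rng(42)
draws = rng.lognormal(mu, np.sqrt(sigma2), size=100_000)
print(round(draws.mean(), -4))  # sample mean lands close to ~3.73 million RMB
```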

  11. Psychosocial Predictors for Cancer Prevention Behaviors in Workplace Using Protection Motivation Theory.

    PubMed

    Zare Sakhvidi, Mohammad Javad; Zare, Maryam; Mostaghaci, Mehrdad; Mehrparvar, Amir Houshang; Morowatisharifabad, Mohammad Ali; Naghshineh, Elham

    2015-01-01

    Background. The aim of this study was to describe the preventive behaviors of industrial workers and factors influencing occupational cancer prevention behaviors using protection motivation theory. Methods. A self-administered questionnaire was completed by 161 petrochemical workers in Iran in 2014; it consisted of three sections: background information, protection motivation theory measures, and occupational cancer preventive behaviors. Results. A statistically significant positive correlation was found between protection motivation (PM) and self-efficacy, response efficacy, and cancer preventive behaviors. Meanwhile, statistically significant negative correlations were found between PM and both cost and reward. Conclusions. Among the available PMT constructs, only self-efficacy and cost were significant predictors of preventive behaviors. Protection motivation model-based health promotion interventions focusing on self-efficacy and cost would be desirable for occupational cancer prevention.

  12. Psychosocial Predictors for Cancer Prevention Behaviors in Workplace Using Protection Motivation Theory

    PubMed Central

    Zare Sakhvidi, Mohammad Javad; Zare, Maryam; Mehrparvar, Amir Houshang; Morowatisharifabad, Mohammad Ali; Naghshineh, Elham

    2015-01-01

    Background. The aim of this study was to describe the preventive behaviors of industrial workers and factors influencing occupational cancer prevention behaviors using protection motivation theory. Methods. A self-administered questionnaire was completed by 161 petrochemical workers in Iran in 2014; it consisted of three sections: background information, protection motivation theory measures, and occupational cancer preventive behaviors. Results. A statistically significant positive correlation was found between protection motivation (PM) and self-efficacy, response efficacy, and cancer preventive behaviors. Meanwhile, statistically significant negative correlations were found between PM and both cost and reward. Conclusions. Among the available PMT constructs, only self-efficacy and cost were significant predictors of preventive behaviors. Protection motivation model-based health promotion interventions focusing on self-efficacy and cost would be desirable for occupational cancer prevention. PMID:26543649

  13. Estimating the impact of mineral aerosols on crop yields in food insecure regions using statistical crop models

    NASA Astrophysics Data System (ADS)

    Hoffman, A.; Forest, C. E.; Kemanian, A.

    2016-12-01

    A significant number of food-insecure nations exist in regions of the world where dust plays a large role in the climate system. While the impacts of common climate variables (e.g., temperature, precipitation, ozone, and carbon dioxide) on crop yields are relatively well understood, the impact of mineral aerosols on yields has not yet been thoroughly investigated. This research aims to develop the data and tools to advance our understanding of mineral aerosol impacts on crop yields. Suspended dust affects crop yields by altering the amount and type of radiation reaching the plant and by modifying local temperature and precipitation, while dust events (i.e., dust storms) affect yields by depleting the soil of nutrients or by defoliating plants through particle abrasion. The impact of dust on yields is modeled statistically because we are uncertain which impacts will dominate the response on the national and regional scales considered in this study. Multiple linear regression has been used in a number of large-scale statistical crop modeling studies to estimate yield responses to various climate variables. In alignment with previous work, we develop linear crop models, but build upon this simple regression method with machine-learning techniques (e.g., random forests) to identify important statistical predictors and isolate how dust affects yields on the scales of interest. To perform this analysis, we develop a crop-climate dataset for maize, soybean, groundnut, sorghum, rice, and wheat for the regions of West Africa, East Africa, South Africa, and the Sahel. Random forest regression models consistently model historic crop yields better than the linear models. In several instances, the random forest models accurately capture the temperature and precipitation threshold behavior in crops. Additionally, improving agricultural technology has caused a well-documented positive trend that dominates time series of global and regional yields.
This trend is often removed before regression with traditional crop models, but likely at the cost of removing climate information. Our random forest models consistently discover the positive trend without removing any additional data. The application of random forests as a statistical crop model provides insight into understanding the impact of dust on yields in marginal food producing regions.
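
    The advantage of random forests over linear models for threshold behavior, as described above, can be sketched on synthetic data. The threshold location, slope, and noise level below are invented for illustration; the comparison metric is in-sample R².

```python
# Illustrative sketch: a random forest captures a simulated temperature
# threshold in yields that a single straight line cannot.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
temp = rng.uniform(15, 40, 500)
# Simulated yield: flat below a ~30 °C threshold, declining sharply above it
yield_ = 5.0 - 0.3 * np.clip(temp - 30, 0, None) + rng.normal(0, 0.2, 500)

X = temp.reshape(-1, 1)
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, yield_)
lin = LinearRegression().fit(X, yield_)
print(rf.score(X, yield_) > lin.score(X, yield_))  # forest fits the threshold better
```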

  14. Nutritional Status of Rural Older Adults is Linked to Physical and Emotional Health

    PubMed Central

    Jung, Seung Eun; Bishop, Alex J; Kim, Minjung; Hermann, Janice; Kim, Giyeon; Lawrence, Jeannine

    2017-01-01

    Background Although nutritional status is influenced by multi-dimensional aspects encompassing physical and emotional well-being, there is limited research on this complex relationship. Objective The purpose of this study was to examine the interplay between indicators of physical health (perceived health status and self-care capacity) and emotional well-being (depressive affect and loneliness) on rural older adults’ nutritional status. Design The cross-sectional study was conducted from June 1, 2007 to June 1, 2008. Participants/setting A total of 171 community-dwelling older adults, 65 years and older, who resided within non-metro rural communities in the U.S. participated in this study. Main outcome measures Participants completed validated instruments measuring self-care capacity, perceived health status, loneliness, depressive affect, and nutritional status. Statistical analyses performed Structural equation modeling (SEM) was employed to investigate the complex interplay of physical and emotional health status with nutritional status among rural older adults. The chi-square statistic, CFI, RMSEA, and SRMR were used to assess model fit. Results The chi-square statistic and the other model fit indices showed the hypothesized SEM model provided a good fit to the data (χ2 (2)=2.15, p=0.34; CFI=1.00; RMSEA=0.02; SRMR=0.03). Self-care capacity was significantly related to depressive affect (γ = −0.11, p=0.03), whereas self-care capacity was not significantly related to loneliness. Perceived health status had a significant negative relationship with both loneliness (γ = −0.16, p=0.03) and depressive affect (γ = −0.22, p=0.03). Although loneliness showed no significant direct relationship with nutritional status, it showed a significant direct relationship with depressive affect (β = 0.46, p<0.01). Finally, the results demonstrated that depressive affect had a significant negative relationship with nutritional status (β = −0.30, p<0.01).
The results indicated that physical and emotional health indicators have significant multi-dimensional associations with nutritional status among rural older adults. Conclusions The present study provides insights into the importance of addressing physical and emotional well-being together to reduce the potential effects of poor emotional well-being on nutritional status, particularly among rural older adults with impaired physical health and self-care capacity. PMID:28274787
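
    The reported fit statistics above can be sanity-checked against one another: the RMSEA point estimate follows from the chi-square, its degrees of freedom, and the sample size via the standard formula RMSEA = sqrt(max(χ² − df, 0) / (df·(N − 1))).

```python
# Recompute RMSEA from the abstract's reported χ²(2) = 2.15 and N = 171.
import math

chi2, df, n = 2.15, 2, 171
rmsea = math.sqrt(max(chi2 - df, 0) / (df * (n - 1)))
print(round(rmsea, 2))  # 0.02, consistent with the reported RMSEA
```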

  15. Forging a link between mentoring and collaboration: a new training model for implementation science.

    PubMed

    Luke, Douglas A; Baumann, Ana A; Carothers, Bobbi J; Landsverk, John; Proctor, Enola K

    2016-10-13

    Training investigators for the rapidly developing field of implementation science requires both mentoring and scientific collaboration. Using social network descriptive analyses, visualization, and modeling, this paper presents results of an evaluation of the mentoring and collaborations fostered over time through the National Institute of Mental Health (NIMH)-supported Implementation Research Institute (IRI). Data comprised IRI participant self-reported collaborations and mentoring relationships, measured in three annual surveys from 2012 to 2014. Network descriptive statistics, visualizations, and network statistical modeling were conducted to examine patterns of mentoring and collaboration among IRI participants and to model the relationship between mentoring and subsequent collaboration. Findings suggest that IRI is successful in forming mentoring relationships among its participants, and that these mentoring relationships are related to future scientific collaborations. Exponential random graph network models demonstrated that mentoring received in 2012 was positively and significantly related to the likelihood of having a scientific collaboration 2 years later in 2014 (p = 0.001). More specifically, mentoring was significantly related to future collaborations focusing on new research (p = 0.009), grant submissions (p = 0.003), and publications (p = 0.017). Predictions based on the network model suggest that for every additional mentoring relationship established in 2012, the likelihood of a scientific collaboration 2 years later is increased by almost 7%. These results support the importance of mentoring in implementation science specifically and team science more generally. Mentoring relationships were established quickly and early by the IRI core faculty.
IRI fellows reported increasing scientific collaboration of all types over time, including starting new research, submitting new grants, presenting research results, and publishing peer-reviewed papers. Statistical network models demonstrated that mentoring was strongly and significantly related to subsequent scientific collaboration, which supported a core design principle of the IRI. Future work should establish the link between mentoring and scientific productivity. These results may be of interest to team science, as they suggest the importance of mentoring for future team collaborations, as well as illustrate the utility of network analysis for studying team characteristics and activities.
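
    Predictions like the "almost 7%" figure above are typically derived by converting a fitted log-odds coefficient into a change in tie probability. The sketch below illustrates that conversion with invented numbers; the baseline log-odds and mentoring coefficient are hypothetical, not the study's estimates.

```python
# Hedged illustration: turn a hypothetical logistic (ERGM-style) coefficient
# into a percentage-point change in collaboration probability.
import math

def prob(logit):
    return 1 / (1 + math.exp(-logit))

base_logit = -1.0      # hypothetical baseline log-odds of a collaboration tie
beta_mentoring = 0.4   # hypothetical effect of one added mentoring relationship
increase = prob(base_logit + beta_mentoring) - prob(base_logit)
print(round(increase * 100, 1))  # percentage-point increase per mentoring tie
```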

  16. External validation of ADO, DOSE, COTE and CODEX at predicting death in primary care patients with COPD using standard and machine learning approaches.

    PubMed

    Morales, Daniel R; Flynn, Rob; Zhang, Jianguo; Trucco, Emmanuel; Quint, Jennifer K; Zutis, Kris

    2018-05-01

    Several models for predicting the risk of death in people with chronic obstructive pulmonary disease (COPD) exist but have not undergone large-scale validation in primary care. The objective of this study was to externally validate these models using statistical and machine learning approaches. We used a primary care COPD cohort identified using data from the UK Clinical Practice Research Datalink. Age-standardised mortality rates were calculated for the population by gender, and the discrimination of ADO (age, dyspnoea, airflow obstruction), COTE (COPD-specific comorbidity test), DOSE (dyspnoea, airflow obstruction, smoking, exacerbations) and CODEX (comorbidity, dyspnoea, airflow obstruction, exacerbations) at predicting death over 1-3 years was measured using logistic regression and a support vector machine (SVM) learning method of analysis. The age-standardised mortality rate was 32.8 (95%CI 32.5-33.1) and 25.2 (95%CI 25.4-25.7) per 1000 person-years for men and women respectively. Complete data were available for 54,879 patients to predict 1-year mortality. ADO performed best (c-statistic 0.730) compared with DOSE (c-statistic 0.645), COTE (c-statistic 0.655) and CODEX (c-statistic 0.649) at predicting 1-year mortality. Discrimination of ADO and DOSE improved at predicting 1-year mortality when combined with COTE comorbidities (c-statistic 0.780 ADO + COTE; c-statistic 0.727 DOSE + COTE). Discrimination did not change significantly over 1-3 years. Comparable results were observed using SVM. In primary care, ADO appears superior at predicting death in COPD. Performance of ADO and DOSE improved when combined with COTE comorbidities, suggesting better models may be generated with additional data facilitated using novel approaches. Copyright © 2018. Published by Elsevier Ltd.
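
    For a binary outcome such as 1-year mortality, the c-statistic reported above is equivalent to the area under the ROC curve of the predicted risks against the observed outcomes. The outcomes and risk scores below are invented purely to show the computation.

```python
# Sketch: the c-statistic of a mortality model is the ROC AUC of its
# predicted risks versus observed deaths (toy, invented data).
from sklearn.metrics import roc_auc_score

died = [0, 0, 1, 0, 1, 1, 0, 1]                   # observed 1-year mortality
risk = [0.1, 0.3, 0.8, 0.2, 0.6, 0.4, 0.5, 0.9]   # model-predicted risks
print(round(roc_auc_score(died, risk), 3))
```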

  17. Surface Ozone Variability and Trends over the South African Highveld from 1990 to 2007

    NASA Technical Reports Server (NTRS)

    Balashov, Nikolay V.; Thompson, Anne M.; Piketh, Stuart J.; Langerman, Kristy E.

    2014-01-01

    Surface ozone is a secondary air pollutant formed from reactions between nitrogen oxides (NOx = NO + NO2) and volatile organic compounds in the presence of sunlight. In this work we examine the effects of the climate pattern known as the El Niño-Southern Oscillation (ENSO) and of NOx variability on surface ozone from 1990 to 2007 over the South African Highveld, a heavily populated region of South Africa with numerous industrial facilities. Over summer and autumn (December-May) on the Highveld, El Niño, as signified by positive sea surface temperature (SST) anomalies over the central Pacific Ocean, is typically associated with drier and warmer than normal conditions favoring ozone formation. Conversely, La Niña, or negative SST anomalies over the central Pacific Ocean, is typically associated with cloudier conditions and above-normal rainfall, hindering ozone production. We use a generalized regression model to identify any linear dependence that Highveld ozone, measured at five air quality monitoring stations, may have on ENSO and NOx. Our results indicate that four of the five stations exhibit a statistically significant sensitivity to ENSO at some point over the December-May period, with El Niño amplifying ozone formation and La Niña reducing it. Three of the five stations reveal statistically significant sensitivity to NOx variability, primarily in winter and spring. Accounting for ENSO and NOx effects throughout the 18-year study period, two stations exhibit statistically significant negative ozone trends in spring, one station displays a statistically significant positive trend in August, and two stations show no statistically significant change in surface ozone.
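
    The kind of linear-dependence regression described above can be sketched on synthetic data: ozone anomalies regressed on an ENSO index and a NOx term, with the sign of the fitted ENSO coefficient giving the direction of sensitivity. The index values, effect sizes, and noise level below are all invented.

```python
# Illustrative multiple regression of ozone anomalies on ENSO and NOx.
import numpy as np

rng = np.random.default_rng(7)
n = 216  # 18 years of monthly values, matching the 1990-2007 study period
enso = rng.normal(0, 1, n)   # standardized SST anomaly index (synthetic)
nox = rng.normal(0, 1, n)    # standardized NOx anomaly (synthetic)
ozone = 2.0 * enso + 1.0 * nox + rng.normal(0, 1, n)  # El Niño raises ozone

X = np.column_stack([np.ones(n), enso, nox])
coef, *_ = np.linalg.lstsq(X, ozone, rcond=None)
print(coef[1] > 0)  # positive ENSO sensitivity recovered from the simulation
```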

  18. Comparison of mean climate trends in the Northern Hemisphere between National Centers for Environmental Prediction and two atmosphere-ocean model forced runs

    NASA Astrophysics Data System (ADS)

    Lucarini, Valerio; Russell, Gary L.

    2002-08-01

    Results are presented for two greenhouse gas experiments of the Goddard Institute for Space Studies atmosphere-ocean model (AOM). The computed trends of surface pressure; surface temperature; 850, 500, and 200 mbar geopotential heights; and related temperatures of the model for the time frame 1960-2000 are compared with those obtained from the National Centers for Environmental Prediction (NCEP) observations. The domain of interest is the Northern Hemisphere because of the higher reliability of both the model results and the observations. A spatial correlation analysis and a mean value comparison are performed, showing good agreement in terms of statistical significance for most of the variables considered in the winter and annual means. However, the 850 mbar temperature trends do not show significant positive correlation, and the confidence intervals of the surface pressure and 850 mbar geopotential height mean trends do not overlap. A brief general discussion about the statistics of trend detection is presented. The accuracy that this AOM has in describing the regional and NH mean climate trends inferred from NCEP through the atmosphere suggests that it may be reliable in forecasting future climate changes.

  19. What’s the good of education on our overall quality of life? A simultaneous equation model of education and life satisfaction for Australia

    PubMed Central

    Powdthavee, Nattavudh; Lekfuangfu, Warn N.; Wooden, Mark

    2017-01-01

    Many economists and educators favour public support for education on the premise that education improves the overall quality of life of citizens. However, little is known about the different pathways through which education shapes people’s satisfaction with life overall. One reason for this is that previous studies have traditionally analysed the effect of education on life satisfaction using single-equation models that ignore interrelationships between different theoretical explanatory variables. In order to advance our understanding of how education may be related to overall quality of life, the current study estimates a structural equation model using nationally representative data for Australia to obtain the direct and indirect associations between education and life satisfaction through five different adult outcomes: income, employment, marriage, children, and health. Although we find the estimated direct (or net) effect of education on life satisfaction to be negative and statistically significant in Australia, the total indirect effect is positive, sizeable and statistically significant for both men and women. This implies that misleading conclusions regarding the influence of education on life satisfaction might be obtained if only single-equation models were used in the analysis. PMID:28713668
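
    The decomposition logic above (total effect = direct effect + sum of mediated indirect effects) can be illustrated with a toy calculation. The path values below are invented, not the study's estimates; only the qualitative pattern (negative direct effect outweighed by positive indirect effects) follows the abstract.

```python
# Toy effect decomposition: a negative direct effect of education on life
# satisfaction, outweighed by positive indirect paths through mediators.
direct = -0.10                      # hypothetical direct (net) effect
indirect = {"income": 0.08, "employment": 0.03, "marriage": 0.02,
            "children": 0.01, "health": 0.04}   # hypothetical path products
total = direct + sum(indirect.values())
print(round(total, 2))  # total effect is positive despite the negative direct path
```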

  20. Quantification of downscaled precipitation uncertainties via Bayesian inference

    NASA Astrophysics Data System (ADS)

    Nury, A. H.; Sharma, A.; Marshall, L. A.

    2017-12-01

    Prediction of precipitation from global climate model (GCM) outputs remains critical to decision-making in water-stressed regions. In this regard, downscaling of GCM output has been a useful tool for analysing future hydro-climatological states. Several approaches have been developed for precipitation downscaling, using either dynamical or statistical methods. Frequently, outputs from dynamical downscaling are not readily transferable across regions owing to significant methodological and computational difficulties. Statistical downscaling approaches provide a flexible and efficient alternative, producing hydro-climatological outputs across multiple temporal and spatial scales in many locations. However, these approaches are subject to significant uncertainty, arising from uncertainty in the downscaled model parameters and from the use of different reanalysis products for inferring appropriate model parameters. Consequently, these uncertainties affect simulation performance at the catchment scale. This study develops a Bayesian framework for modelling downscaled daily precipitation from GCM outputs, and aims to quantify downscaling uncertainties by evaluating reanalysis datasets against observed rainfall data over Australia. A consistent technique for quantifying downscaling uncertainties by means of a Bayesian downscaling framework is proposed. The results suggest that there are differences in downscaled precipitation occurrences and extremes.
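
    A minimal sketch of Bayesian parameter uncertainty in this setting: a Beta-Binomial update of the daily rain-occurrence probability, with a credible interval expressing the uncertainty that a downscaling model would propagate. The counts are invented, and this conjugate toy stands in for the study's (unspecified) full framework.

```python
# Hedged sketch: posterior for P(rain on a given day) under a uniform prior,
# given hypothetical occurrence counts.
from scipy import stats

rain_days, total_days = 95, 365           # invented observed occurrences
posterior = stats.beta(1 + rain_days, 1 + total_days - rain_days)
lo, hi = posterior.ppf(0.025), posterior.ppf(0.975)
print(round(lo, 2), round(hi, 2))         # 95% credible interval for P(rain)
```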
