Sample records for zero-inflated Poisson regression

  1. A test of inflated zeros for Poisson regression models.

    PubMed

    He, Hua; Zhang, Hui; Ye, Peng; Tang, Wan

    2017-01-01

    Excessive zeros are common in practice and may cause overdispersion and invalidate inference when fitting Poisson regression models. There is a large body of literature on zero-inflated Poisson models. However, methods for testing whether there are excessive zeros are less well developed. The Vuong test comparing a Poisson and a zero-inflated Poisson model is commonly applied in practice. However, the type I error of the test often deviates seriously from the nominal level, casting serious doubt on the validity of the test in such applications. In this paper, we develop a new approach for testing inflated zeros under the Poisson model. Unlike the Vuong test for inflated zeros, our method does not require a zero-inflated Poisson model to perform the test. Simulation studies show that, compared with the Vuong test, our approach not only controls the type I error rate better but also yields more power.
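The excess-zero mechanism these tests target can be sketched with the ZIP probability mass function. A minimal stdlib-Python illustration (the parameter values are arbitrary, chosen only to make the zero inflation visible):

```python
import math

def zip_pmf(y, lam, pi):
    """P(Y = y) under a zero-inflated Poisson: with probability pi the
    observation is a structural zero, otherwise it is Poisson(lam)."""
    pois = math.exp(-lam) * lam ** y / math.factorial(y)
    return pi * (y == 0) + (1 - pi) * pois

lam, pi = 2.0, 0.3                 # arbitrary illustrative values
p0_zip = zip_pmf(0, lam, pi)       # zero probability under the ZIP model
p0_pois = math.exp(-lam)           # zero probability under plain Poisson(lam)
mean_zip = (1 - pi) * lam          # ZIP mean shrinks toward zero: E[Y] = (1 - pi) * lam
```

The gap between `p0_zip` and `p0_pois` is exactly the "inflated zeros" that a test of zero inflation must detect.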

  2. Zero-inflated Conway-Maxwell Poisson Distribution to Analyze Discrete Data.

    PubMed

    Sim, Shin Zhu; Gupta, Ramesh C; Ong, Seng Huat

    2018-01-09

    In this paper, we study the zero-inflated Conway-Maxwell Poisson (ZICMP) distribution and develop a regression model. Score and likelihood ratio tests are also implemented for testing the inflation/deflation parameter. Simulation studies are carried out to examine the performance of these tests. A data example is presented to illustrate the concepts. In this example, the proposed model is compared to the well-known zero-inflated Poisson (ZIP) and the zero-inflated generalized Poisson (ZIGP) regression models. It is shown that the fit by ZICMP is comparable to or better than that of these models.
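The CMP normalizing constant has no closed form, so it is usually truncated numerically. A hedged stdlib-Python sketch of the ZICMP pmf (the truncation length and parameter values are arbitrary; log-space sums avoid factorial overflow):

```python
import math

def cmp_pmf(y, lam, nu, terms=100):
    """Conway-Maxwell-Poisson pmf with the normalizing constant truncated
    at `terms`. nu > 1 gives underdispersion, nu < 1 overdispersion,
    and nu = 1 recovers the ordinary Poisson."""
    logw = [j * math.log(lam) - nu * math.lgamma(j + 1) for j in range(terms)]
    m = max(logw)
    z = sum(math.exp(w - m) for w in logw)
    return math.exp(y * math.log(lam) - nu * math.lgamma(y + 1) - m) / z

def zicmp_pmf(y, lam, nu, pi):
    """Zero-inflated CMP: structural zero with probability pi."""
    return pi * (y == 0) + (1 - pi) * cmp_pmf(y, lam, nu)
```

With `nu = 1` the CMP pmf matches the Poisson pmf, and with `nu > 1` the distribution is underdispersed (variance below the mean), which is what makes ZICMP more flexible than ZIP.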

  3. Modeling health survey data with excessive zero and K responses.

    PubMed

    Lin, Ting Hsiang; Tsai, Min-Hsiao

    2013-04-30

    Zero-inflated Poisson regression is a popular tool used to analyze data with excessive zeros. Although much work has already been performed to fit zero-inflated data, most models heavily depend on special features of the individual data. Specifically, there is often a sizable group of respondents who endorse the same answers, giving the data peaks. In this paper, we propose a new model with the flexibility to model excessive counts other than zero; the model is a mixture of multinomial logistic and Poisson regression, in which the multinomial logistic component models the occurrence of excessive counts, including zeros, K (where K is a positive integer), and all other values, while the Poisson regression component models the counts that are assumed to follow a Poisson distribution. Two examples are provided to illustrate our models when the data have counts containing many ones and sixes. As a result, the zero-inflated and K-inflated models exhibit a better fit than the zero-inflated Poisson and standard Poisson regressions. Copyright © 2012 John Wiley & Sons, Ltd.
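The zero-and-K-inflated mixture described above can be written down directly. A small stdlib-Python sketch (mixture weights and K are hypothetical, chosen for illustration):

```python
import math

def poisson_pmf(y, lam):
    return math.exp(-lam) * lam ** y / math.factorial(y)

def k_inflated_pmf(y, lam, pi0, pik, k):
    """Mixture pmf: structural zero with probability pi0, structural count
    k with probability pik, otherwise Poisson(lam) — the zero- and
    K-inflated idea from the abstract."""
    return pi0 * (y == 0) + pik * (y == k) + (1 - pi0 - pik) * poisson_pmf(y, lam)

lam, pi0, pik, k = 2.0, 0.2, 0.1, 6   # arbitrary illustrative values
```

In a full model, the weights `pi0` and `pik` would come from a multinomial logistic regression on covariates, as the abstract describes.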

  4. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach.

    PubMed

    Mohammadi, Tayeb; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donations and blood deferrals has a major impact on blood transfusion. There is a positive correlation between the variables "number of blood donation" and "number of blood deferral": as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an extra zero frequency for both of the above-mentioned variables. In this study, in order to account for the correlation and the excess zero frequency, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using the Bayesian approach, applying noninformative priors in the presence and absence of covariates. The parameters of the model, that is, the correlation, the zero-inflation parameter, and the regression coefficients, were estimated through MCMC simulation. Finally, the double-Poisson, bivariate Poisson, and bivariate zero-inflated Poisson models were fitted to the data and compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models.

  5. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach

    PubMed Central

    Mohammadi, Tayeb; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donations and blood deferrals has a major impact on blood transfusion. There is a positive correlation between the variables “number of blood donation” and “number of blood deferral”: as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an extra zero frequency for both of the above-mentioned variables. In this study, in order to account for the correlation and the excess zero frequency, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using the Bayesian approach, applying noninformative priors in the presence and absence of covariates. The parameters of the model, that is, the correlation, the zero-inflation parameter, and the regression coefficients, were estimated through MCMC simulation. Finally, the double-Poisson, bivariate Poisson, and bivariate zero-inflated Poisson models were fitted to the data and compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models. PMID:27703493
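One standard construction behind a bivariate ZIP model is a shared Poisson component plus a joint structural zero (the donor who never returns contributes a zero to both counts at once). A stdlib-Python simulation sketch under hypothetical parameter values, checking that the construction does produce correlated, zero-heavy counts:

```python
import math
import random

def rpois(lam, rng):
    # Knuth's method for drawing a Poisson(lam) variate
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(7)
pi, lam1, lam2, lam3 = 0.3, 1.0, 1.5, 1.0   # hypothetical parameters
xs, ys = [], []
for _ in range(2000):
    if rng.random() < pi:            # joint structural zero (never returns)
        xs.append(0); ys.append(0)
    else:                            # correlated Poissons via shared component
        c = rpois(lam3, rng)
        xs.append(rpois(lam1, rng) + c)
        ys.append(rpois(lam2, rng) + c)

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
sx = math.sqrt(sum((a - mx) ** 2 for a in xs) / n)
sy = math.sqrt(sum((b - my) ** 2 for b in ys) / n)
corr = cov / (sx * sy)               # positive: shared component + joint zeros
frac0 = xs.count(0) / n              # well above the plain-Poisson zero rate
```

The Bayesian fitting step in the paper (noninformative priors, MCMC) is beyond a short sketch; this only illustrates the data-generating structure the model assumes.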

  6. Application of zero-inflated Poisson mixed models in prognostic factors of hepatitis C.

    PubMed

    Akbarzadeh Baghban, Alireza; Pourhoseingholi, Asma; Zayeri, Farid; Jafari, Ali Akbar; Alavian, Seyed Moayed

    2013-01-01

    In recent years, hepatitis C virus (HCV) infection has represented a major public health problem. Evaluation of risk factors is one of the solutions that help protect people from the infection. This study aims to employ zero-inflated Poisson mixed models to evaluate prognostic factors of hepatitis C. The data were collected from a longitudinal study during 2005-2010. First, a mixed Poisson regression (PR) model was fitted to the data. Then, a mixed zero-inflated Poisson model was fitted with compound Poisson random effects. For evaluating the performance of the proposed mixed model, the standard errors of the estimators were compared. The results obtained from the mixed PR model showed that genotype 3 and treatment protocol were statistically significant. Results of the zero-inflated Poisson mixed model showed that age, sex, genotypes 2 and 3, the treatment protocol, and having risk factors had significant effects on the viral load of HCV patients. Of these two models, the estimators of the zero-inflated Poisson mixed model had the smaller standard errors. The results showed that the mixed zero-inflated Poisson model provided the better fit. The proposed model can capture serial dependence, additional overdispersion, and excess zeros in longitudinal count data.

  7. Marginalized zero-inflated negative binomial regression with application to dental caries

    PubMed Central

    Preisser, John S.; Das, Kalyan; Long, D. Leann; Divaris, Kimon

    2015-01-01

    The zero-inflated negative binomial regression model (ZINB) is often employed in diverse fields such as dentistry, health care utilization, highway safety, and medicine to examine relationships between exposures of interest and overdispersed count outcomes exhibiting many zeros. The regression coefficients of ZINB have latent class interpretations for a susceptible subpopulation at risk for the disease/condition under study with counts generated from a negative binomial distribution and for a non-susceptible subpopulation that provides only zero counts. The ZINB parameters, however, are not well-suited for estimating overall exposure effects, specifically, in quantifying the effect of an explanatory variable in the overall mixture population. In this paper, a marginalized zero-inflated negative binomial regression (MZINB) model for independent responses is proposed to model the population marginal mean count directly, providing straightforward inference for overall exposure effects based on maximum likelihood estimation. Through simulation studies, the finite sample performance of MZINB is compared to marginalized zero-inflated Poisson, Poisson, and negative binomial regression. The MZINB model is applied in the evaluation of a school-based fluoride mouthrinse program on dental caries in 677 children. PMID:26568034
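The key distinction the abstract draws — latent-class versus overall (marginal) effects — comes down to a simple identity: under any zero-inflated model, the overall mean is nu = (1 - psi) * mu, where psi is the structural-zero probability and mu the mean of the susceptible class. A marginalized model regresses log(nu) on covariates directly, so its coefficients are overall rate ratios. A stdlib-Python sketch with hypothetical coefficients:

```python
import math

# Marginalized zero-inflated model, in sketch form: the regression
# log(nu) = b0 + b1 * x targets the OVERALL mean nu = E[Y] directly,
# while a separate parameter psi is the structural-zero probability.
# The latent mean among the susceptible class is then mu = nu / (1 - psi).
b0, b1, psi = 0.5, 0.3, 0.25                    # hypothetical coefficients
nus = [math.exp(b0 + b1 * x) for x in (0, 1)]   # overall means at x = 0, 1
mus = [nu / (1 - psi) for nu in nus]            # susceptible-class means
rate_ratio = nus[1] / nus[0]                    # overall rate ratio = exp(b1)
```

In a conventional ZIP/ZINB, exp(b1) would instead be the rate ratio within the susceptible class only, which is the interpretability pitfall the marginalized formulation avoids.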

  8. Modeling Tetanus Neonatorum case using the regression of negative binomial and zero-inflated negative binomial

    NASA Astrophysics Data System (ADS)

    Amaliana, Luthfatul; Sa'adah, Umu; Wayan Surya Wardhani, Ni

    2017-12-01

    Tetanus Neonatorum is an infectious disease that can be prevented by immunization. The number of Tetanus Neonatorum cases in East Java Province was the highest in Indonesia up to 2015. The Tetanus Neonatorum data contain overdispersion and a large proportion of zeros. Negative Binomial (NB) regression is an alternative method when overdispersion occurs in Poisson regression. However, data containing both overdispersion and zero-inflation are more appropriately analyzed using Zero-Inflated Negative Binomial (ZINB) regression. The purposes of this study are: (1) to model Tetanus Neonatorum cases in East Java Province, with a 71.05 percent proportion of zeros, using NB and ZINB regression, and (2) to obtain the best model. The results of this study indicate that ZINB regression is better than NB regression, with a smaller AIC.

  9. Zero-inflated Poisson regression models for QTL mapping applied to tick-resistance in a Gyr × Holstein F2 population

    PubMed Central

    Silva, Fabyano Fonseca; Tunin, Karen P.; Rosa, Guilherme J.M.; da Silva, Marcos V.B.; Azevedo, Ana Luisa Souza; da Silva Verneque, Rui; Machado, Marco Antonio; Packer, Irineu Umberto

    2011-01-01

    Nowadays, an important and interesting alternative in the control of tick infestation in cattle is to select resistant animals and identify the respective quantitative trait loci (QTLs) and DNA markers for later use in breeding programs. The number of ticks per animal is a discrete count trait, which could potentially follow a Poisson distribution. However, in the case of an excess of zeros, due to the occurrence of several noninfected animals, the zero-inflated Poisson (ZIP) and generalized zero-inflated Poisson (GZIP) distributions may provide a better description of the data. Thus, the objective here was to compare, through simulation, Poisson and ZIP models (simple and generalized) with classical approaches for QTL mapping with count phenotypes under different scenarios, and to apply these approaches to a QTL study of tick resistance in an F2 cattle (Gyr × Holstein) population. It was concluded that, when working with zero-inflated data, it is recommended to use the generalized and simple ZIP models for analysis. On the other hand, when working with data containing zeros, but not zero-inflated, the Poisson model or a data transformation approach, such as a square-root or Box-Cox transformation, is applicable. PMID:22215960

  10. Marginalized zero-inflated Poisson models with missing covariates.

    PubMed

    Benecha, Habtamu K; Preisser, John S; Divaris, Kimon; Herring, Amy H; Das, Kalyan

    2018-05-11

    Unlike zero-inflated Poisson regression, marginalized zero-inflated Poisson (MZIP) models for counts with excess zeros provide estimates with direct interpretations for the overall effects of covariates on the marginal mean. In the presence of missing covariates, MZIP and many other count data models are ordinarily fitted using complete case analysis methods due to lack of appropriate statistical methods and software. This article presents an estimation method for MZIP models with missing covariates. The method, which is applicable to other missing data problems, is illustrated and compared with complete case analysis by using simulations and dental data on the caries preventive effects of a school-based fluoride mouthrinse program. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Growth Curve Models for Zero-Inflated Count Data: An Application to Smoking Behavior

    ERIC Educational Resources Information Center

    Liu, Hui; Powers, Daniel A.

    2007-01-01

    This article applies growth curve models to longitudinal count data characterized by an excess of zero counts. We discuss a zero-inflated Poisson regression model for longitudinal data in which the impact of covariates on the initial counts and the rate of change in counts over time is the focus of inference. Basic growth curve models using a…

  12. A comparison between Poisson and zero-inflated Poisson regression models with an application to number of black spots in Corriedale sheep

    PubMed Central

    Naya, Hugo; Urioste, Jorge I; Chang, Yu-Mei; Rodrigues-Motta, Mariana; Kremer, Roberto; Gianola, Daniel

    2008-01-01

    Dark spots in the fleece area are often associated with dark fibres in wool, which limits its competitiveness with other textile fibres. Field data from a sheep experiment in Uruguay revealed an excess number of zeros for dark spots. We compared the performance of four Poisson and zero-inflated Poisson (ZIP) models under four simulation scenarios. All models performed reasonably well under the same scenario for which the data were simulated. The deviance information criterion favoured a Poisson model with residual, while the ZIP model with a residual gave estimates closer to their true values under all simulation scenarios. Both Poisson and ZIP models with an error term at the regression level performed better than their counterparts without such an error. Field data from Corriedale sheep were analysed with Poisson and ZIP models with residuals. Parameter estimates were similar for both models. Although the posterior distribution of the sire variance was skewed due to a small number of rams in the dataset, the median of this variance suggested a scope for genetic selection. The main environmental factor was the age of the sheep at shearing. In summary, age related processes seem to drive the number of dark spots in this breed of sheep. PMID:18558072

  13. Modeling animal-vehicle collisions using diagonal inflated bivariate Poisson regression.

    PubMed

    Lao, Yunteng; Wu, Yao-Jan; Corey, Jonathan; Wang, Yinhai

    2011-01-01

    Two types of animal-vehicle collision (AVC) data are commonly adopted for AVC-related risk analysis research: reported AVC data and carcass removal data. One issue with these two data sets is that previous studies found significant discrepancies between them. In order to model these two types of data together and provide a better understanding of highway AVCs, this study adopts a diagonal inflated bivariate Poisson regression method, an inflated version of the bivariate Poisson regression model, to fit the reported AVC and carcass removal data sets collected in Washington State during 2002-2006. The diagonal inflated bivariate Poisson model can not only model paired data with correlation but also handle under- or over-dispersed data sets. Compared with three other types of models, double Poisson, bivariate Poisson, and zero-inflated double Poisson, the diagonal inflated bivariate Poisson model demonstrates its capability of fitting two data sets with remarkable overlapping portions resulting from the same stochastic process. Therefore, the diagonal inflated bivariate Poisson model provides researchers a new approach to investigating AVCs from a different perspective involving the three distribution parameters (λ(1), λ(2) and λ(3)). The modeling results show the impacts of traffic elements, geometric design and geographic characteristics on the occurrences of both reported AVCs and carcass removals. It is found that increases in some associated factors, such as speed limit, annual average daily traffic, and shoulder width, will increase the numbers of reported AVCs and carcass removals. Conversely, the presence of some geometric factors, such as rolling and mountainous terrain, will decrease the number of reported AVCs. Published by Elsevier Ltd.

  14. Analyzing hospitalization data: potential limitations of Poisson regression.

    PubMed

    Weaver, Colin G; Ravani, Pietro; Oliver, Matthew J; Austin, Peter C; Quinn, Robert R

    2015-08-01

    Poisson regression is commonly used to analyze hospitalization data when outcomes are expressed as counts (e.g. number of days in hospital). However, data often violate the assumptions on which Poisson regression is based. More appropriate extensions of this model, while available, are rarely used. We compared hospitalization data between 206 patients treated with hemodialysis (HD) and 107 treated with peritoneal dialysis (PD) using Poisson regression and compared results from standard Poisson regression with those obtained using three other approaches for modeling count data: negative binomial (NB) regression, zero-inflated Poisson (ZIP) regression and zero-inflated negative binomial (ZINB) regression. We examined the appropriateness of each model and compared the results obtained with each approach. During a mean 1.9 years of follow-up, 183 of 313 patients (58%) were never hospitalized (indicating an excess of 'zeros'). The data also displayed overdispersion (variance greater than mean), violating another assumption of the Poisson model. Using four criteria, we determined that the NB and ZINB models performed best. According to these two models, patients treated with HD experienced similar hospitalization rates as those receiving PD {NB rate ratio (RR): 1.04 [bootstrapped 95% confidence interval (CI): 0.49-2.20]; ZINB summary RR: 1.21 (bootstrapped 95% CI 0.60-2.46)}. Poisson and ZIP models fit the data poorly and had much larger point estimates than the NB and ZINB models [Poisson RR: 1.93 (bootstrapped 95% CI 0.88-4.23); ZIP summary RR: 1.84 (bootstrapped 95% CI 0.88-3.84)]. We found substantially different results when modeling hospitalization data, depending on the approach used. Our results argue strongly for a sound model selection process and improved reporting around statistical methods used for modeling count data. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
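The two Poisson-assumption violations this paper documents — overdispersion and excess zeros — are easy to check before committing to a model. A stdlib-Python sketch on simulated data whose composition loosely mimics the abstract (58% never-hospitalized; all numbers here are hypothetical):

```python
import math
import random

def rpois(lam, rng):
    # Knuth's method for drawing a Poisson(lam) variate
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(42)
# hypothetical counts: ~58% structural zeros, Poisson(3) hospitalizations otherwise
counts = [0 if rng.random() < 0.58 else rpois(3.0, rng) for _ in range(313)]

n = len(counts)
mean = sum(counts) / n
var = sum((c - mean) ** 2 for c in counts) / (n - 1)
frac0 = counts.count(0) / n
p0_pois = math.exp(-mean)   # zero fraction a Poisson with this mean would predict
```

`var > mean` flags overdispersion and `frac0 > p0_pois` flags excess zeros; either finding argues for NB, ZIP, or ZINB alternatives over plain Poisson regression, which is the paper's point.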

  15. Review and Recommendations for Zero-inflated Count Regression Modeling of Dental Caries Indices in Epidemiological Studies

    PubMed Central

    Stamm, John W.; Long, D. Leann; Kincade, Megan E.

    2012-01-01

    Over the past five to ten years, zero-inflated count regression models have been increasingly applied to the analysis of dental caries indices (e.g., DMFT, dfms, etc.). The main reason is the broad decline in children’s caries experience, such that dmf and DMF indices more frequently generate low or even zero counts. This article specifically reviews the application of zero-inflated Poisson and zero-inflated negative binomial regression models to dental caries, with emphasis on the description of the models and the interpretation of fitted model results given the study goals. The review finds that interpretations provided in the published caries research are often imprecise or inadvertently misleading, particularly with respect to failing to discriminate between inference for the class of susceptible persons defined by such models and inference for the sampled population in terms of overall exposure effects. Recommendations are provided to enhance the use as well as the interpretation and reporting of results of count regression models when applied to epidemiological studies of dental caries. PMID:22710271

  16. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    PubMed

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero cell counts, and some of them are "true zeros", indicating that the drug-adverse event pairs cannot occur; these are distinguished from the other zero counts, which simply indicate that the drug-adverse event pairs have not occurred yet or have not been reported yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters of the zero-inflated Poisson model based likelihood ratio test are obtained using the expectation-maximization algorithm. The zero-inflated Poisson model based likelihood ratio test is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed zero-inflated Poisson model based likelihood ratio test method is shown to asymptotically control the type I error and false discovery rate, and its finite sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test method performs similarly to the Poisson model based likelihood ratio test method when the estimated percentage of true zeros in the database is small. Both the zero-inflated Poisson model based likelihood ratio test and likelihood ratio test methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.

  17. Marginal regression models for clustered count data based on zero-inflated Conway-Maxwell-Poisson distribution with applications.

    PubMed

    Choo-Wosoba, Hyoyoung; Levy, Steven M; Datta, Somnath

    2016-06-01

    Community water fluoridation is an important public health measure to prevent dental caries, but it continues to be somewhat controversial. The Iowa Fluoride Study (IFS) is a longitudinal study on a cohort of Iowa children that began in 1991. The main purposes of this study (http://www.dentistry.uiowa.edu/preventive-fluoride-study) were to quantify fluoride exposures from both dietary and nondietary sources and to associate longitudinal fluoride exposures with dental fluorosis (spots on teeth) and dental caries (cavities). We analyze a subset of the IFS data with a marginal regression model using a zero-inflated version of the Conway-Maxwell-Poisson (ZICMP) distribution for count data exhibiting excessive zeros and a wide range of dispersion patterns. We introduce two estimation methods for fitting a ZICMP marginal regression model. Finite sample behaviors of the estimators and the resulting confidence intervals are studied using extensive simulation studies. We apply our methodologies to the dental caries data. Our novel modeling incorporating zero inflation, clustering, and overdispersion sheds some new light on the effect of community water fluoridation and other factors. We also include a second application of our methodology to a genomic (next-generation sequencing) dataset that exhibits underdispersion. © 2015, The International Biometric Society.

  18. Marginalized zero-altered models for longitudinal count data.

    PubMed

    Tabb, Loni Philip; Tchetgen, Eric J Tchetgen; Wellenius, Greg A; Coull, Brent A

    2016-10-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias.

  19. Marginalized zero-altered models for longitudinal count data

    PubMed Central

    Tabb, Loni Philip; Tchetgen, Eric J. Tchetgen; Wellenius, Greg A.; Coull, Brent A.

    2015-01-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias. PMID:27867423

  20. Distribution-free Inference of Zero-inflated Binomial Data for Longitudinal Studies.

    PubMed

    He, H; Wang, W J; Hu, J; Gallop, R; Crits-Christoph, P; Xia, Y L

    2015-10-01

    Count responses with structural zeros are very common in medical and psychosocial research, especially in alcohol and HIV research, and the zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models are widely used for modeling such outcomes. However, as alcohol drinking outcomes such as days of drinking are counts within a given period, their distributions are bounded above by an upper limit (the total days in the period) and thus inherently follow a binomial or zero-inflated binomial (ZIB) distribution, rather than a Poisson or ZIP distribution, in the presence of structural zeros. In this paper, we develop a new semiparametric approach for modeling ZIB-like count responses for cross-sectional as well as longitudinal data. We illustrate this approach with both simulated and real study data.
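The bounded-support point is the crux: a ZIB pmf is zero beyond the period length n, unlike ZIP, whose support is unbounded. A stdlib-Python sketch with hypothetical parameters (30-day period, per-day drinking probability 0.2):

```python
import math

def zib_pmf(y, n, p, pi):
    """Zero-inflated binomial: support is bounded above by n (e.g. days in
    a follow-up period), unlike ZIP whose support is unbounded."""
    binom = math.comb(n, y) * p ** y * (1 - p) ** (n - y)
    return pi * (y == 0) + (1 - pi) * binom

n, p, pi = 30, 0.2, 0.25   # hypothetical: 30-day period, drinking prob 0.2
total = sum(zib_pmf(y, n, p, pi) for y in range(n + 1))
```

The probability mass sums to one over 0..n exactly, and `zib_pmf(0, ...)` exceeds the plain binomial's zero probability `(1 - p) ** n` by the structural-zero mass.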

  1. New variable selection methods for zero-inflated count data with applications to the substance abuse field

    PubMed Central

    Buu, Anne; Johnson, Norman J.; Li, Runze; Tan, Xianming

    2011-01-01

    Zero-inflated count data are very common in health surveys. This study develops new variable selection methods for the zero-inflated Poisson regression model. Our simulations demonstrate the negative consequences that arise from ignoring zero-inflation. Among the competing methods, the one-step SCAD method is recommended because it has the highest specificity, sensitivity, and exact fit, and the lowest estimation error. The design of the simulations is based on the special features of two large national databases commonly used in the alcoholism and substance abuse field, so that our findings can be easily generalized to real settings. Applications of the methodology are demonstrated by empirical analyses on data from a well-known alcohol study. PMID:21563207

  2. Functional linear models for zero-inflated count data with application to modeling hospitalizations in patients on dialysis.

    PubMed

    Sentürk, Damla; Dalrymple, Lorien S; Nguyen, Danh V

    2014-11-30

    We propose functional linear models for zero-inflated count data with a focus on the functional hurdle and functional zero-inflated Poisson (ZIP) models. While the hurdle model assumes the counts come from a mixture of a degenerate distribution at zero and a zero-truncated Poisson distribution, the ZIP model considers a mixture of a degenerate distribution at zero and a standard Poisson distribution. We extend the generalized functional linear model framework with a functional predictor and multiple cross-sectional predictors to model counts generated by a mixture distribution. We propose an estimation procedure for functional hurdle and ZIP models, called penalized reconstruction, geared towards error-prone and sparsely observed longitudinal functional predictors. The approach relies on dimension reduction and pooling of information across subjects involving basis expansions and penalized maximum likelihood techniques. The developed functional hurdle model is applied to modeling hospitalizations within the first 2 years from initiation of dialysis, with a high percentage of zeros, in the Comprehensive Dialysis Study participants. Hospitalization counts are modeled as a function of sparse longitudinal measurements of serum albumin concentrations, patient demographics, and comorbidities. Simulation studies are used to study finite sample properties of the proposed method and include comparisons with an adaptation of standard principal components regression. Copyright © 2014 John Wiley & Sons, Ltd.
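The hurdle-versus-ZIP contrast in the first two sentences reduces to where the zeros come from: the hurdle puts all zeros in one component, the ZIP splits them between a point mass and the Poisson. The two pmfs coincide when the hurdle's zero probability is set to the ZIP's total zero probability; a stdlib-Python check (parameter values arbitrary):

```python
import math

def poisson_pmf(y, lam):
    return math.exp(-lam) * lam ** y / math.factorial(y)

def zip_pmf(y, lam, pi):
    """ZIP: zeros arise from a point mass (prob pi) or from Poisson(lam)."""
    return pi * (y == 0) + (1 - pi) * poisson_pmf(y, lam)

def hurdle_pmf(y, lam, p0):
    """Hurdle: all zeros from one source (prob p0); positive counts follow
    a zero-truncated Poisson."""
    if y == 0:
        return p0
    return (1 - p0) * poisson_pmf(y, lam) / (1 - math.exp(-lam))

lam, pi = 2.0, 0.3
p0 = pi + (1 - pi) * math.exp(-lam)   # hurdle zero prob matching the ZIP
```

The distinction matters for interpretation (sampling zeros vs. structural zeros), even though the two families can represent the same marginal distribution.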

  3. Poisson Mixture Regression Models for Heart Disease Prediction.

    PubMed

    Mufudza, Chipo; Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models are addressed here under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary generalized linear Poisson regression model due to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model for heart disease prediction over all models, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease component-wise given the available clusters. It is deduced that heart disease prediction can be done effectively by identifying the major risks component-wise using a Poisson mixture regression model.
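A finite Poisson mixture of the kind used here is just a weighted sum of Poisson pmfs, one component per latent risk class. A stdlib-Python sketch with hypothetical low-risk and high-risk components:

```python
import math

def poisson_pmf(y, lam):
    return math.exp(-lam) * lam ** y / math.factorial(y)

def mixture_pmf(y, lams, weights):
    """Finite Poisson mixture: each component models one latent risk class,
    weighted by its class probability."""
    return sum(w * poisson_pmf(y, l) for w, l in zip(weights, lams))

# hypothetical low-risk / high-risk components
lams, weights = (0.5, 4.0), (0.6, 0.4)
mean = sum(y * mixture_pmf(y, lams, weights) for y in range(80))
```

In the concomitant-variable version discussed in the abstract, the weights would themselves depend on covariates (typically via a logistic model), rather than being fixed constants as here.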

  5. Modeling Zero-Inflated and Overdispersed Count Data: An Empirical Study of School Suspensions

    ERIC Educational Resources Information Center

    Desjardins, Christopher David

    2016-01-01

    The purpose of this article is to develop a statistical model that best explains variability in the number of school days suspended. Number of school days suspended is a count variable that may be zero-inflated and overdispersed relative to a Poisson model. Four models were examined: Poisson, negative binomial, Poisson hurdle, and negative…

  6. IRT-ZIP Modeling for Multivariate Zero-Inflated Count Data

    ERIC Educational Resources Information Center

    Wang, Lijuan

    2010-01-01

    This study introduces an item response theory-zero-inflated Poisson (IRT-ZIP) model to investigate psychometric properties of multiple items and predict individuals' latent trait scores for multivariate zero-inflated count data. In the model, two link functions are used to capture two processes of the zero-inflated count data. Item parameters are…

  7. A quantile count model of water depth constraints on Cape Sable seaside sparrows

    USGS Publications Warehouse

    Cade, B.S.; Dong, Q.

    2008-01-01

    1. A quantile regression model for counts of breeding Cape Sable seaside sparrows Ammodramus maritimus mirabilis (L.) as a function of water depth and previous year abundance was developed based on extensive surveys, 1992-2005, in the Florida Everglades. The quantile count model extends linear quantile regression methods to discrete response variables, providing a flexible alternative to discrete parametric distributional models, e.g. Poisson, negative binomial and their zero-inflated counterparts. 2. Estimates from our multiplicative model demonstrated that negative effects of increasing water depth in breeding habitat on sparrow numbers were dependent on recent occupation history. Upper 10th percentiles of counts (one to three sparrows) decreased with increasing water depth from 0 to 30 cm when sites were not occupied in previous years. However, upper 40th percentiles of counts (one to six sparrows) decreased with increasing water depth for sites occupied in previous years. 3. Greatest decreases (-50% to -83%) in upper quantiles of sparrow counts occurred as water depths increased from 0 to 15 cm when previous year counts were ≥ 1, but a small proportion of sites (5-10%) held at least one sparrow even as water depths increased to 20 or 30 cm. 4. A zero-inflated Poisson regression model provided estimates of conditional means that also decreased with increasing water depth, but rates of change were lower and decreased with increasing previous year counts compared to the quantile count model. Quantiles computed for the zero-inflated Poisson model enhanced interpretation of this model but had greater lack-of-fit for water depths > 0 cm and previous year counts ≥ 1, conditions where the negative effect of water depth was readily apparent and fitted better by the quantile count model.

  8. EM Adaptive LASSO—A Multilocus Modeling Strategy for Detecting SNPs Associated with Zero-inflated Count Phenotypes

    PubMed Central

    Mallick, Himel; Tiwari, Hemant K.

    2016-01-01

    Count data are increasingly ubiquitous in genetic association studies, where it is possible to observe excess zero counts as compared to what is expected based on standard assumptions. For instance, in rheumatology, data are usually collected in multiple joints within a person or multiple sub-regions of a joint, and it is not uncommon for the phenotypes to contain an enormous number of zeros due to excessive zero counts in the majority of patients. Most existing statistical methods assume that the count phenotypes follow one of these four distributions with appropriate dispersion-handling mechanisms: Poisson, zero-inflated Poisson (ZIP), negative binomial, and zero-inflated negative binomial (ZINB). However, little is known about their implications in genetic association studies. Also, there is a relative paucity of literature on their usefulness with respect to model misspecification and variable selection. In this article, we have investigated the performance of several state-of-the-art approaches for handling zero-inflated count data along with a novel penalized regression approach with an adaptive LASSO penalty, by simulating data under a variety of disease models and linkage disequilibrium patterns. By taking into account data-adaptive weights in the estimation procedure, the proposed method provides greater flexibility in multi-SNP modeling of zero-inflated count phenotypes. A fast coordinate descent algorithm nested within an EM (expectation-maximization) algorithm is implemented for estimating the model parameters and conducting variable selection simultaneously. Results show that the proposed method has optimal performance in the presence of multicollinearity, as measured by both prediction accuracy and empirical power, which is especially apparent as the sample size increases. Moreover, the Type I error rates become more or less uncontrollable for the competing methods when a model is misspecified, a phenomenon routinely encountered in practice. PMID:27066062
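
    The EM machinery mentioned in the abstract reduces, in the intercept-only ZIP case without the LASSO penalty or SNP covariates, to a simple two-step iteration. A sketch under those simplifying assumptions (toy data, not the study's genotype data): the E-step computes the posterior probability that each observed zero is structural, and the M-step updates the mixing weight and Poisson mean.

```python
import math

# toy zero-inflated sample: 60% zeros plus Poisson-like positive counts
counts = [0] * 60 + [1, 2, 3, 4, 2, 3, 1, 5, 2, 3] * 4
n = len(counts)
ybar = sum(counts) / n

pi, lam = 0.5, ybar  # initial guesses for mixing weight and Poisson mean
for _ in range(500):
    # E-step: posterior probability that an observed zero is structural
    w0 = pi / (pi + (1 - pi) * math.exp(-lam))
    sum_w = w0 * sum(1 for y in counts if y == 0)
    # M-step: update the mixing weight and the Poisson mean
    pi = sum_w / n
    lam = sum(counts) / (n - sum_w)
```

    A useful sanity check: at the intercept-only ZIP MLE, the fitted mean (1 - pi) * lam equals the sample mean and the fitted zero probability equals the observed zero fraction.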

  10. Does attitude matter in computer use in Australian general practice? A zero-inflated Poisson regression analysis.

    PubMed

    Khan, Asaduzzaman; Western, Mark

    The purpose of this study was to explore factors that facilitate or hinder effective use of computers in Australian general medical practice. This study is based on data extracted from a national telephone survey of 480 general practitioners (GPs) across Australia. Clinical functions performed by GPs using computers were examined using zero-inflated Poisson (ZIP) regression modelling. About 17% of GPs were not using a computer for any clinical function, while 18% reported using computers for all clinical functions. The ZIP model showed that computer anxiety was negatively associated with effective computer use, while practitioners' belief about the usefulness of computers was positively associated with effective computer use. Being a female GP or working in a partnership or group practice increased the odds of effectively using computers for clinical functions. To fully capitalise on the benefits of computer technology, GPs need to be convinced that this technology is useful and can make a difference.

  11. Zero adjusted models with applications to analysing helminths count data.

    PubMed

    Chipeta, Michael G; Ngwira, Bagrey M; Simoonga, Christopher; Kazembe, Lawrence N

    2014-11-27

    It is common in public health and epidemiology that the outcome of interest is a count of event occurrences. Analysing these data using classical linear models is mostly inappropriate, even after transformation of the outcome variable, due to overdispersion. Zero-adjusted mixture count models such as zero-inflated and hurdle count models are applied to count data when overdispersion and excess zeros exist. The main objective of the current paper is to apply such models to analyse risk factors associated with human helminths (S. haematobium), particularly in a case where there is a high proportion of zero counts. The data were collected during a community-based randomised control trial assessing the impact of mass drug administration (MDA) with praziquantel in Malawi, and a school-based cross-sectional epidemiology survey in Zambia. Count data models including traditional (Poisson and negative binomial) models, zero-modified models (zero-inflated Poisson and zero-inflated negative binomial) and hurdle models (Poisson logit hurdle and negative binomial logit hurdle) were fitted and compared. Using the Akaike information criterion (AIC), the negative binomial logit hurdle (NBLH) and zero-inflated negative binomial (ZINB) models showed the best performance in both datasets. With regard to capturing zero counts, these models performed better than the other models. This paper showed that the zero-modified NBLH and ZINB models are more appropriate methods for the analysis of data with excess zeros. The choice between hurdle and zero-inflated models should be based on the aim and endpoints of the study.
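
    Model comparison via AIC, as used above, only needs each candidate's maximized log-likelihood: AIC = 2k - 2·logL, smaller is better. A hedged pure-Python sketch comparing Poisson and ZIP fits on a toy zero-heavy sample (a crude grid search stands in for a proper optimizer; the data are hypothetical, not the helminth counts):

```python
import math

def pois_ll(data, lam):
    # Poisson log-likelihood
    return sum(-lam + k * math.log(lam) - math.log(math.factorial(k)) for k in data)

def zip_ll(data, pi, lam):
    # zero-inflated Poisson log-likelihood
    ll = 0.0
    for k in data:
        p = (1 - pi) * math.exp(-lam) * lam ** k / math.factorial(k)
        ll += math.log(pi + p if k == 0 else p)
    return ll

# toy sample with 60% zeros
counts = [0] * 60 + [1, 2, 3, 4, 2, 3, 1, 5, 2, 3] * 4

lam_hat = sum(counts) / len(counts)            # Poisson MLE is the sample mean
aic_pois = 2 * 1 - 2 * pois_ll(counts, lam_hat)

# crude grid search standing in for the ZIP MLE
best_ll, best_pi, best_lam = max(
    (zip_ll(counts, i / 20, 0.5 + j / 10), i / 20, 0.5 + j / 10)
    for i in range(19) for j in range(46))
aic_zip = 2 * 2 - 2 * best_ll                  # ZIP pays for one extra parameter
```

    On data like these the ZIP's extra parameter is amply repaid: the AIC penalty of 2 is small next to the likelihood gained by modeling the surplus zeros.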

  12. Selecting the right statistical model for analysis of insect count data by using information theoretic measures.

    PubMed

    Sileshi, G

    2006-10-01

    Researchers and regulatory agencies often make statistical inferences from insect count data using modelling approaches that assume homogeneous variance. Such models do not allow for formal appraisal of variability, which in its different forms is the subject of interest in ecology. Therefore, the objectives of this paper were to (i) compare models suitable for handling variance heterogeneity and (ii) select optimal models to ensure valid statistical inferences from insect count data. The log-normal, standard Poisson, Poisson corrected for overdispersion, zero-inflated Poisson, negative binomial and zero-inflated negative binomial models were compared using six count datasets on foliage-dwelling insects and five families of soil-dwelling insects. Akaike's and Schwarz's Bayesian information criteria were used for comparing the various models. Over 50% of the counts were zeros even in locally abundant species such as Ootheca bennigseni Weise, Mesoplatys ochroptera Stål and Diaecoderus spp. The Poisson model corrected for overdispersion and the standard negative binomial model provided a better description of the probability distribution of seven of the 11 insects than the log-normal, standard Poisson, zero-inflated Poisson or zero-inflated negative binomial models. It is concluded that excess zeros and variance heterogeneity are common phenomena in insect counts. If not properly modelled, these properties can invalidate normal distribution assumptions, resulting in biased estimation of ecological effects and jeopardizing the integrity of the scientific inferences. Therefore, it is recommended that statistical models appropriate for handling these data properties be selected using objective criteria to ensure efficient statistical inference.

  13. Variable selection for distribution-free models for longitudinal zero-inflated count responses.

    PubMed

    Chen, Tian; Wu, Pan; Tang, Wan; Zhang, Hui; Feng, Changyong; Kowalski, Jeanne; Tu, Xin M

    2016-07-20

    Zero-inflated count outcomes arise quite often in research and practice. Parametric models such as the zero-inflated Poisson and zero-inflated negative binomial are widely used to model such responses. Like most parametric models, they are quite sensitive to departures from assumed distributions. Recently, new approaches have been proposed to provide distribution-free, or semi-parametric, alternatives. These methods extend the generalized estimating equations to provide robust inference for population mixtures defined by zero-inflated count outcomes. In this paper, we propose methods to extend smoothly clipped absolute deviation (SCAD)-based variable selection methods to these new models. Variable selection has been gaining popularity in modern clinical research studies, as determining differential treatment effects of interventions for different subgroups has become the norm, rather than the exception, in the era of patient-centered outcome research. Such moderation analysis in general creates many explanatory variables in regression analysis, and the advantages of SCAD-based methods over their traditional counterparts render them a great choice for addressing these important and timely issues in clinical research. We illustrate the proposed approach with both simulated and real study data. Copyright © 2016 John Wiley & Sons, Ltd.

  14. Zero-Inflated Poisson Modeling of Fall Risk Factors in Community-Dwelling Older Adults.

    PubMed

    Jung, Dukyoo; Kang, Younhee; Kim, Mi Young; Ma, Rye-Won; Bhandari, Pratibha

    2016-02-01

    The aim of this study was to identify risk factors for falls among community-dwelling older adults. The study used a cross-sectional descriptive design. Self-report questionnaires were used to collect data from 658 community-dwelling older adults and were analyzed using logistic and zero-inflated Poisson (ZIP) regression. Perceived health status was a significant factor in the count model, and fall efficacy emerged as a significant predictor in the logistic models. The findings suggest that fall efficacy is important for predicting not only faller and nonfaller status but also fall counts in older adults who may or may not have experienced a previous fall. The fall predictors identified in this study, perceived health status and fall efficacy, indicate the need for fall-prevention programs tailored to address both the physical and psychological issues unique to older adults. © The Author(s) 2014.

  15. Structural zeroes and zero-inflated models.

    PubMed

    He, Hua; Tang, Wan; Wang, Wenjuan; Crits-Christoph, Paul

    2014-08-01

    In psychosocial and behavioral studies, count outcomes recording the frequencies of the occurrence of some health or behavior outcomes (such as the number of unprotected sexual behaviors during a period of time) often contain a preponderance of zeroes because of the presence of 'structural zeroes' that occur when some subjects are not at risk for the behavior of interest. Unlike random zeroes (responses that can be greater than zero, but are zero due to sampling variability), structural zeroes are usually very different, both statistically and clinically. False interpretations of results and study findings may result if differences between the two types of zeroes are ignored. However, in practice, the status of the structural zeroes is often not observed, and this latent nature complicates the data analysis. In this article, we focus on one model, the zero-inflated Poisson (ZIP) regression model, that is commonly used to address zero-inflated data. We first give a brief overview of the issues of structural zeroes and the ZIP model. We then give an illustration of ZIP with data from a study on HIV-risk sexual behaviors among adolescent girls. Sample code in SAS and Stata is also included to help perform and explain ZIP analyses.

  16. Cumulative sum control charts for monitoring geometrically inflated Poisson processes: An application to infectious disease counts data.

    PubMed

    Rakitzis, Athanasios C; Castagliola, Philippe; Maravelakis, Petros E

    2018-02-01

    In this work, we study upper-sided cumulative sum control charts that are suitable for monitoring geometrically inflated Poisson processes. We assume that a process is properly described by a two-parameter extension of the zero-inflated Poisson distribution, which can be used for modeling count data with an excessive number of zero and non-zero values. Two different upper-sided cumulative sum-type schemes are considered, both suitable for the detection of increasing shifts in the average of the process. Aspects of their statistical design are discussed and their performance is compared under various out-of-control situations. Changes in both parameters of the process are considered. Finally, the monitoring of the monthly cases of poliomyelitis in the USA is given as an illustrative example.
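
    The upper-sided CUSUM schemes studied here share one simple recursion: accumulate deviations above a reference value k and signal when the statistic crosses a decision interval h. A generic sketch (the values of k and h below are hypothetical; in practice both are tuned to the in-control, possibly zero-inflated, count distribution):

```python
def cusum_upper(counts, k, h):
    """One-sided CUSUM C_t = max(0, C_{t-1} + x_t - k); signal when C_t > h.

    Returns the index of the first signal, or None if none occurs."""
    c = 0.0
    for t, x in enumerate(counts):
        c = max(0.0, c + x - k)
        if c > h:
            return t
    return None

# hypothetical monitoring run: small in-control counts, then an upward shift
in_control = [0, 1, 0, 2, 1, 0, 1, 0]
shifted = in_control + [6, 6, 6, 6]
```

    On the in-control stretch the statistic never leaves zero, while the shift to counts of 6 accumulates 6 - k = 4 per observation and triggers on the second shifted value.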

  17. Determinants of The Grade A Embryos in Infertile Women; Zero-Inflated Regression Model.

    PubMed

    Almasi-Hashiani, Amir; Ghaheri, Azadeh; Omani Samani, Reza

    2017-10-01

    In assisted reproductive technology, it is important to choose high quality embryos for embryo transfer. The aim of the present study was to determine the grade A embryo count and factors related to it in infertile women. This historical cohort study included 996 infertile women. The main outcome was the number of grade A embryos. Zero-inflated Poisson (ZIP) regression and zero-inflated negative binomial (ZINB) regression were used to model the count data, as it contained excessive zeros. Stata software, version 13 (Stata Corp, College Station, TX, USA) was used for all statistical analyses. After adjusting for potential confounders, results from the ZINB model show that for each unit increase in the number of 2 pronuclear (2PN) zygotes, the expected grade A embryo count increases by a factor of 1.45 (incidence rate ratio; 95% confidence interval (CI): 1.23-1.69, P=0.001), and for each one-day increase in cleavage day it decreases by a factor of 0.35 (95% CI: 0.20-0.61, P=0.001). There is a significant association between both the number of 2PN zygotes and cleavage day and the number of grade A embryos in both the ZINB and ZIP regression models. The estimated coefficients are more plausible than values found in earlier studies using less relevant models. Copyright© by Royan Institute. All rights reserved.

  18. A comparison of different statistical methods analyzing hypoglycemia data using bootstrap simulations.

    PubMed

    Jiang, Honghua; Ni, Xiao; Huster, William; Heilmann, Cory

    2015-01-01

    Hypoglycemia has long been recognized as a major barrier to achieving normoglycemia with intensive diabetic therapies. It is a common safety concern for diabetes patients. Therefore, it is important to apply appropriate statistical methods when analyzing hypoglycemia data. Here, we carried out bootstrap simulations to investigate the performance of four commonly used statistical models (Poisson, negative binomial, analysis of covariance [ANCOVA], and rank ANCOVA) based on data from a diabetes clinical trial. The zero-inflated Poisson (ZIP) model and zero-inflated negative binomial (ZINB) model were also evaluated. Simulation results showed that the Poisson model inflated the type I error, while the negative binomial model was overly conservative. However, after adjusting for dispersion, both the Poisson and negative binomial models yielded slightly inflated type I errors, which were close to the nominal level, and reasonable power. Reasonable control of the type I error was associated with the ANCOVA model. The rank ANCOVA model was associated with the greatest power and with reasonable control of the type I error. Inflated type I errors were observed with the ZIP and ZINB models.

  19. Poisson, Poisson-gamma and zero-inflated regression models of motor vehicle crashes: balancing statistical fit and theory.

    PubMed

    Lord, Dominique; Washington, Simon P; Ivan, John N

    2005-01-01

    There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions with each modeling approach, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states: perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of "excess" zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows a Bernoulli trial with unequal probability of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how crash data give rise to the "excess" zeros frequently observed. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales, not an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
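
    The paper's central simulation point, that apparent "excess" zeros can arise from low exposure alone, is easy to reproduce: a plain Poisson process with a small mean already produces mostly zeros, with no dual-state mechanism involved. A sketch with an assumed mean of 0.2 (hypothetical, not the authors' experiment), using Knuth's sampler so only the standard library is needed:

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's multiplication-of-uniforms Poisson sampler
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(42)
lam = 0.2  # low exposure: e.g. short observation periods or short segments
draws = [poisson_sample(lam, rng) for _ in range(5000)]
prop_zero = draws.count(0) / len(draws)
# theory: P(0) = exp(-0.2) ≈ 0.82, so ~82% zeros from a single-state Poisson
```

    Seeing roughly 82% zeros here requires no zero-inflation at all, which is exactly why a preponderance of zeros by itself does not justify a dual-state model.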

  20. Full Bayes Poisson gamma, Poisson lognormal, and zero inflated random effects models: Comparing the precision of crash frequency estimates.

    PubMed

    Aguero-Valverde, Jonathan

    2013-01-01

    In recent years, complex statistical modeling approaches have been proposed to handle the unobserved heterogeneity and the excess of zeros frequently found in crash data, including random effects and zero-inflated models. This research compares random effects, zero-inflated, and zero-inflated random effects models using a full Bayes hierarchical approach. The models are compared not just in terms of goodness-of-fit measures but also in terms of the precision of posterior crash frequency estimates, since the precision of these estimates is vital for ranking sites for engineering improvement. Fixed-over-time random effects models are also compared to independent-over-time random effects models. For the crash dataset being analyzed, it was found that once the random effects are included in the zero-inflated models, the probability of being in the zero state is drastically reduced, and the zero-inflated models degenerate to their non-zero-inflated counterparts. Also, by fixing the random effects over time, the fit of the models and the precision of the crash frequency estimates are significantly increased. It was found that the rankings of the fixed-over-time random effects models are very consistent among them. In addition, the results show that by fixing the random effects over time, the standard errors of the crash frequency estimates are significantly reduced for the majority of the segments at the top of the ranking. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. A review on models for count data with extra zeros

    NASA Astrophysics Data System (ADS)

    Zamri, Nik Sarah Nik; Zamzuri, Zamira Hasanah

    2017-04-01

    Typically, zero-inflated models are used in modelling count data with excess zeros. The extra zeros may be structural zeros or random zeros, which occur by chance. These types of data are commonly found in various disciplines such as finance, insurance, biomedicine, econometrics, ecology, and the health sciences. As found in the literature, the most popular zero-inflated models are the zero-inflated Poisson and the zero-inflated negative binomial. Recently, more complex models have been developed to account for overdispersion and unobserved heterogeneity. In addition, more extended distributions are also considered in modelling data with this feature. In this paper, we review the related literature and provide a summary of recent developments in models for count data with extra zeros.

  2. Modeling number of claims and prediction of total claim amount

    NASA Astrophysics Data System (ADS)

    Acar, Aslıhan Şentürk; Karabey, Uǧur

    2017-07-01

    In this study, we focus on the annual number of claims in a private health insurance data set belonging to a local insurance company in Turkey. In addition to the Poisson and negative binomial models, the zero-inflated Poisson and zero-inflated negative binomial models are used to model the number of claims in order to take excess zeros into account. To investigate the impact of different distributional assumptions for the number of claims on the prediction of the total claim amount, the predictive performances of the candidate models are compared using the root mean square error (RMSE) and mean absolute error (MAE) criteria.
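
    The two comparison criteria used above are one-liners. A minimal sketch with hypothetical observed and predicted claim counts (illustrative values only, not the Turkish insurance data):

```python
import math

def rmse(actual, predicted):
    # root mean square error: penalizes large errors quadratically
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def mae(actual, predicted):
    # mean absolute error: weights all errors linearly
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

# hypothetical observed claim counts and model predictions
observed = [0, 0, 3, 1, 0, 2]
predicted = [1, 0, 2, 1, 1, 2]
```

    Because RMSE squares the residuals, a model that occasionally misses badly is punished more under RMSE than under MAE, which is one reason both criteria are reported side by side.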

  3. Analysis of overdispersed count data: application to the Human Papillomavirus Infection in Men (HIM) Study.

    PubMed

    Lee, J-H; Han, G; Fulp, W J; Giuliano, A R

    2012-06-01

    The Poisson model can be applied to the count of events occurring within a specific time period. The main feature of the Poisson model is the assumption that the mean and variance of the count data are equal. However, this equal mean-variance relationship rarely occurs in observational data. In most cases, the observed variance is larger than the assumed variance, which is called overdispersion. Further, when the observed data involve excessive zero counts, the problem of overdispersion results in underestimating the variance of the estimated parameter, and thus produces a misleading conclusion. We illustrated the use of four models for overdispersed count data that may be attributed to excessive zeros. These are Poisson, negative binomial, zero-inflated Poisson and zero-inflated negative binomial models. The example data in this article deal with the number of incidents involving human papillomavirus infection. The four models resulted in differing statistical inferences. The Poisson model, which is widely used in epidemiology research, underestimated the standard errors and overstated the significance of some covariates.
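
    The overdispersion described here is built into the zero-inflated Poisson itself: its variance-to-mean ratio is 1 + πλ, which exceeds 1 (the Poisson's equal mean-variance benchmark) whenever there is any zero inflation. A quick numerical check of those moment formulas with hypothetical parameters:

```python
import math

def zip_pmf(k, pi, lam):
    p = (1 - pi) * math.exp(-lam) * lam ** k / math.factorial(k)
    return pi + p if k == 0 else p

pi, lam = 0.4, 3.0  # hypothetical zero-inflation weight and Poisson mean
mean = sum(k * zip_pmf(k, pi, lam) for k in range(100))
var = sum(k * k * zip_pmf(k, pi, lam) for k in range(100)) - mean ** 2
# for a ZIP: E[Y] = (1 - pi) * lam and Var/Mean = 1 + pi * lam > 1
```

    Fitting a plain Poisson to such data forces its single parameter to equal the mean, so the variance (here 2.2 times the mean) is understated, which is precisely the mechanism behind the underestimated standard errors noted in the abstract.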

  4. Tobit analysis of vehicle accident rates on interstate highways.

    PubMed

    Anastasopoulos, Panagiotis Ch; Tarko, Andrew P; Mannering, Fred L

    2008-03-01

    There has been an abundance of research that has used Poisson models and its variants (negative binomial and zero-inflated models) to improve our understanding of the factors that affect accident frequencies on roadway segments. This study explores the application of an alternate method, tobit regression, by viewing vehicle accident rates directly (instead of frequencies) as a continuous variable that is left-censored at zero. Using data from vehicle accidents on Indiana interstates, the estimation results show that many factors relating to pavement condition, roadway geometrics and traffic characteristics significantly affect vehicle accident rates.

  5. The Role of Depressive Symptoms, Family Invalidation and Behavioral Impulsivity in the Occurrence and Repetition of Non-Suicidal Self-Injury in Chinese Adolescents: A 2-Year Follow-Up Study

    ERIC Educational Resources Information Center

    You, Jianing; Leung, Freedom

    2012-01-01

    This study used zero-inflated poisson regression analysis to examine the role of depressive symptoms, family invalidation, and behavioral impulsivity in the occurrence and repetition of non-suicidal self-injury among Chinese community adolescents over a 2-year period. Participants, 4782 high school students, were assessed twice during the…

  6. Mapping species abundance by a spatial zero-inflated Poisson model: a case study in the Wadden Sea, the Netherlands.

    PubMed

    Lyashevska, Olga; Brus, Dick J; van der Meer, Jaap

    2016-01-01

    The objective of the study was to provide a general procedure for mapping species abundance when the data are zero-inflated and spatially correlated counts. The bivalve species Macoma balthica was observed on a 500×500 m grid in the Dutch part of the Wadden Sea. In total, 66% of the 3451 counts were zeros. A zero-inflated Poisson mixture model was used to relate counts to environmental covariates. Two models were considered, one with fewer covariates (model "small") and one with more (model "large"). The models contained two processes: a Bernoulli (species prevalence) and a Poisson (species intensity, when the Bernoulli process predicts presence). The model was used to make predictions for sites where only environmental data are available. Predicted prevalences and intensities show that the model "small" predicts lower mean prevalence and higher mean intensity than the model "large". Yet, the product of prevalence and intensity, which might be called the unconditional intensity, is very similar. Cross-validation showed that the model "small" performed slightly better, but the difference was small. The proposed methodology might be generally applicable, but is computer intensive.
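
    The "unconditional intensity" invoked above factors exactly as prevalence times intensity, because the mean of a zero-inflated Poisson is (1 - π)λ. A quick numerical check with hypothetical values echoing the study's 66% zeros (illustrative only, not the Wadden Sea fit):

```python
import math

def zip_pmf(k, pi, lam):
    p = (1 - pi) * math.exp(-lam) * lam ** k / math.factorial(k)
    return pi + p if k == 0 else p

pi, lam = 0.66, 2.0        # hypothetical zero-inflation weight and Poisson mean
prevalence = 1 - pi        # Bernoulli part: probability the species is present
intensity = lam            # Poisson part: mean count given presence
unconditional = sum(k * zip_pmf(k, pi, lam) for k in range(80))
# the mean of the mixture factors exactly: E[Y] = prevalence * intensity
```

    This identity explains why the "small" and "large" models can disagree on prevalence and intensity separately yet agree closely on the unconditional intensity: only the product is pinned down by the mean counts.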

  7. Hurdle models for multilevel zero-inflated data via h-likelihood.

    PubMed

    Molas, Marek; Lesaffre, Emmanuel

    2010-12-30

    Count data often exhibit overdispersion. One type of overdispersion arises when there is an excess of zeros in comparison with the standard Poisson distribution. Zero-inflated Poisson and hurdle models have been proposed to perform a valid likelihood-based analysis to account for the surplus of zeros. Further, data often arise in clustered, longitudinal or multiple-membership settings. The proper analysis needs to reflect the design of a study. Typically random effects are used to account for dependencies in the data. We examine the h-likelihood estimation and inference framework for hurdle models with random effects for complex designs. We extend the h-likelihood procedures to fit hurdle models, thereby extending h-likelihood to truncated distributions. Two applications of the methodology are presented. Copyright © 2010 John Wiley & Sons, Ltd.

  8. Zero-inflated count models for longitudinal measurements with heterogeneous random effects.

    PubMed

    Zhu, Huirong; Luo, Sheng; DeSantis, Stacia M

    2017-08-01

    Longitudinal zero-inflated count data arise frequently in substance use research when assessing the effects of behavioral and pharmacological interventions. Zero-inflated count models (e.g. zero-inflated Poisson or zero-inflated negative binomial) with random effects have been developed to analyze this type of data. In random effects zero-inflated count models, the random effects covariance matrix is typically assumed to be homogeneous (constant across subjects). However, in many situations this matrix may be heterogeneous (differ by measured covariates). In this paper, we extend zero-inflated count models to account for random effects heterogeneity by modeling their variance as a function of covariates. We show via simulation that ignoring intervention and covariate-specific heterogeneity can produce biased covariate and random effect estimates. Moreover, those biased estimates can be rectified by correctly modeling the random effects covariance structure. The methodological development is motivated by and applied to the Combined Pharmacotherapies and Behavioral Interventions for Alcohol Dependence (COMBINE) study, the largest clinical trial of alcohol dependence performed in the United States, with 1383 individuals.

  9. On performance of parametric and distribution-free models for zero-inflated and over-dispersed count responses.

    PubMed

    Tang, Wan; Lu, Naiji; Chen, Tian; Wang, Wenjuan; Gunzler, Douglas David; Han, Yu; Tu, Xin M

    2015-10-30

    Zero-inflated Poisson (ZIP) and negative binomial (ZINB) models are widely used to model zero-inflated count responses. These models extend the Poisson and negative binomial (NB) to address excessive zeros in the count response. By adding a degenerate distribution centered at 0 and interpreting it as describing a non-risk group in the population, the ZIP (ZINB) models a two-component population mixture. As in applications of Poisson and NB, the key difference between ZIP and ZINB is the allowance for overdispersion by the ZINB in its NB component in modeling the count response for the at-risk group. In practice, however, overdispersion often does not follow the NB, and applications of ZINB to such data yield invalid inference. If sources of overdispersion are known, other parametric models may be used to directly model the overdispersion. Such models, too, are subject to assumed distributions. Further, this approach may not be applicable if information about the sources of overdispersion is unavailable. In this paper, we propose a distribution-free alternative and compare its performance with these popular parametric models as well as a moment-based approach proposed by Yu et al. [Statistics in Medicine 2013; 32: 2390-2405]. Like the generalized estimating equations, the proposed approach requires no elaborate distribution assumptions. Compared with the approach of Yu et al., it is more robust to overdispersed zero-inflated responses. We illustrate our approach with both simulated and real study data. Copyright © 2015 John Wiley & Sons, Ltd.

  10. Spatiotemporal hurdle models for zero-inflated count data: Exploring trends in emergency department visits.

    PubMed

    Neelon, Brian; Chang, Howard H; Ling, Qiang; Hastings, Nicole S

    2016-12-01

    Motivated by a study exploring spatiotemporal trends in emergency department use, we develop a class of two-part hurdle models for the analysis of zero-inflated areal count data. The models consist of two components-one for the probability of any emergency department use and one for the number of emergency department visits given use. Through a hierarchical structure, the models incorporate both patient- and region-level predictors, as well as spatially and temporally correlated random effects for each model component. The random effects are assigned multivariate conditionally autoregressive priors, which induce dependence between the components and provide spatial and temporal smoothing across adjacent spatial units and time periods, resulting in improved inferences. To accommodate potential overdispersion, we consider a range of parametric specifications for the positive counts, including truncated negative binomial and generalized Poisson distributions. We adopt a Bayesian inferential approach, and posterior computation is handled conveniently within standard Bayesian software. Our results indicate that the negative binomial and generalized Poisson hurdle models vastly outperform the Poisson hurdle model, demonstrating that overdispersed hurdle models provide a useful approach to analyzing zero-inflated spatiotemporal data. © The Author(s) 2014.

  11. Semiparametric bivariate zero-inflated Poisson models with application to studies of abundance for multiple species

    USGS Publications Warehouse

    Arab, Ali; Holan, Scott H.; Wikle, Christopher K.; Wildhaber, Mark L.

    2012-01-01

    Ecological studies involving counts of abundance, presence–absence or occupancy rates often produce data having a substantial proportion of zeros. Furthermore, these types of processes are typically multivariate and only adequately described by complex nonlinear relationships involving externally measured covariates. Ignoring these aspects of the data and implementing standard approaches can lead to models that fail to provide adequate scientific understanding of the underlying ecological processes, possibly resulting in a loss of inferential power. One method of dealing with data having excess zeros is to consider the class of univariate zero-inflated generalized linear models. However, this class of models fails to address the multivariate and nonlinear aspects associated with the data usually encountered in practice. Therefore, we propose a semiparametric bivariate zero-inflated Poisson model that takes into account both of these data attributes. The general modeling framework is hierarchical Bayes and is suitable for a broad range of applications. We demonstrate the effectiveness of our model through a motivating example on modeling catch per unit area for multiple species using data from the Missouri River Benthic Fishes Study, implemented by the United States Geological Survey.

  12. Statistical procedures for analyzing mental health services data.

    PubMed

    Elhai, Jon D; Calhoun, Patrick S; Ford, Julian D

    2008-08-15

    In mental health services research, analyzing service utilization data often poses serious problems, given the presence of substantially skewed data distributions. This article presents a non-technical introduction to statistical methods specifically designed to handle the complexly distributed datasets that represent mental health service use, including Poisson, negative binomial, zero-inflated, and zero-truncated regression models. A flowchart is provided to assist the investigator in selecting the most appropriate method. Finally, a dataset of mental health service use reported by medical patients is described, and a comparison of results across several different statistical methods is presented. Implications of matching data analytic techniques appropriately with the often complexly distributed datasets of mental health services utilization variables are discussed.

  13. Statistical Models for the Analysis of Zero-Inflated Pain Intensity Numeric Rating Scale Data.

    PubMed

    Goulet, Joseph L; Buta, Eugenia; Bathulapalli, Harini; Gueorguieva, Ralitza; Brandt, Cynthia A

    2017-03-01

    Pain intensity is often measured in clinical and research settings using the 0 to 10 numeric rating scale (NRS). NRS scores are recorded as discrete values, and in some samples they may display a high proportion of zeroes and a right-skewed distribution. Despite this, statistical methods for normally distributed data are frequently used in the analysis of NRS data. We present results from an observational cross-sectional study examining the association of NRS scores with patient characteristics using data collected from a large cohort of 18,935 veterans in Department of Veterans Affairs care diagnosed with a potentially painful musculoskeletal disorder. The mean (variance) NRS pain was 3.0 (7.5), and 34% of patients reported no pain (NRS = 0). We compared the following statistical models for analyzing NRS scores: linear regression, generalized linear models (Poisson and negative binomial), zero-inflated and hurdle models for data with an excess of zeroes, and a cumulative logit model for ordinal data. We examined model fit, interpretability of results, and whether conclusions about the predictor effects changed across models. In this study, models that accommodate zero inflation provided a better fit than the other models. These models should be considered for the analysis of NRS data with a large proportion of zeroes. We examined and analyzed pain data from a large cohort of veterans with musculoskeletal disorders. We found that many reported no current pain on the NRS on the diagnosis date. We present several alternative statistical methods for the analysis of pain intensity data with a large proportion of zeroes. Published by Elsevier Inc.

  14. A time-varying effect model for examining group differences in trajectories of zero-inflated count outcomes with applications in substance abuse research.

    PubMed

    Yang, Songshan; Cranford, James A; Jester, Jennifer M; Li, Runze; Zucker, Robert A; Buu, Anne

    2017-02-28

    This study proposes a time-varying effect model for examining group differences in trajectories of zero-inflated count outcomes. The motivating example demonstrates that this zero-inflated Poisson model allows investigators to study group differences in different aspects of substance use (e.g., the probability of abstinence and the quantity of alcohol use) simultaneously. The simulation study shows that the accuracy of estimation of trajectory functions improves as the sample size increases; the accuracy under equal group sizes is only higher when the sample size is small (100). In terms of the performance of the hypothesis testing, the type I error rates are close to their corresponding significance levels under all settings. Furthermore, the power increases as the alternative hypothesis deviates more from the null hypothesis, and the rate of this increasing trend is higher when the sample size is larger. Moreover, the hypothesis test for the group difference in the zero component tends to be less powerful than the test for the group difference in the Poisson component. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Accident prediction model for public highway-rail grade crossings.

    PubMed

    Lu, Pan; Tolliver, Denver

    2016-05-01

    Considerable research has focused on roadway accident frequency analysis, but relatively little research has examined safety evaluation at highway-rail grade crossings. Highway-rail grade crossings are critical spatial locations of utmost importance for transportation safety because traffic crashes at highway-rail grade crossings are often catastrophic with serious consequences. The Poisson regression model has been employed to analyze vehicle accident frequency as a good starting point for many years. The most commonly applied variations of the Poisson model include the negative binomial and zero-inflated Poisson. These models are used to deal with common crash data issues such as over-dispersion (sample variance is larger than the sample mean) and preponderance of zeros (low sample mean and small sample size). On rare occasions traffic crash data have been shown to be under-dispersed (sample variance is smaller than the sample mean), and traditional distributions such as the Poisson or negative binomial cannot handle under-dispersion well. The objective of this study is to investigate and compare various alternate highway-rail grade crossing accident frequency models that can handle the under-dispersion issue. The contributions of the paper are two-fold: (1) applying probability models that can deal with under-dispersion and (2) obtaining insights into vehicle crashes at public highway-rail grade crossings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Zero-state Markov switching count-data models: an empirical assessment.

    PubMed

    Malyshkina, Nataliya V; Mannering, Fred L

    2010-01-01

    In this study, a two-state Markov switching count-data model is proposed as an alternative to zero-inflated models to account for the preponderance of zeros sometimes observed in transportation count data, such as the number of accidents occurring on a roadway segment over some period of time. For this accident-frequency case, zero-inflated models assume the existence of two states: one of the states is a zero-accident count state, which has accident probabilities that are so low that they cannot be statistically distinguished from zero, and the other state is a normal-count state, in which counts can be non-negative integers that are generated by some counting process, for example, a Poisson or negative binomial. While zero-inflated models have come under some criticism with regard to accident-frequency applications - one fact is undeniable - in many applications they provide a statistically superior fit to the data. The Markov switching approach we propose seeks to overcome some of the criticism associated with the zero-accident state of the zero-inflated model by allowing individual roadway segments to switch between zero and normal-count states over time. An important advantage of this Markov switching approach is that it allows for the direct statistical estimation of the specific roadway-segment state (i.e., zero-accident or normal-count state) whereas traditional zero-inflated models do not. To demonstrate the applicability of this approach, a two-state Markov switching negative binomial model (estimated with Bayesian inference) and standard zero-inflated negative binomial models are estimated using five-year accident frequencies on Indiana interstate highway segments. It is shown that the Markov switching model is a viable alternative and results in a superior statistical fit relative to the zero-inflated models.

  17. Applying the zero-inflated Poisson model with random effects to detect abnormal rises in school absenteeism indicating infectious diseases outbreak.

    PubMed

    Song, X X; Zhao, Q; Tao, T; Zhou, C M; Diwan, V K; Xu, B

    2018-05-30

    Records of absenteeism from primary schools are valuable data for infectious diseases surveillance. However, the analysis of the absenteeism is complicated by the data features of clustering at zero, non-independence and overdispersion. This study aimed to generate an appropriate model to handle the absenteeism data collected in a European Commission granted project for infectious disease surveillance in rural China and to evaluate the validity and timeliness of the resulting model for early warnings of infectious disease outbreak. Four steps were taken: (1) building a 'well-fitting' model by the zero-inflated Poisson model with random effects (ZIP-RE) using the absenteeism data from the first implementation year; (2) applying the resulting model to predict the 'expected' number of absenteeism events in the second implementation year; (3) computing the differences between the observations and the expected values (O-E values) to generate an alternative series of data; (4) evaluating the early warning validity and timeliness of the observational data and model-based O-E values via the EARS-3C algorithms with regard to the detection of real cluster events. The results indicate that ZIP-RE and its corresponding O-E values could improve the detection of aberrations, reduce the false-positive signals and are applicable to the zero-inflated data.

  18. Comparing statistical methods for analyzing skewed longitudinal count data with many zeros: an example of smoking cessation.

    PubMed

    Xie, Haiyi; Tao, Jill; McHugo, Gregory J; Drake, Robert E

    2013-07-01

    Count data with skewness and many zeros are common in substance abuse and addiction research. Zero-adjusting models, especially zero-inflated models, have become increasingly popular in analyzing this type of data. This paper reviews and compares five mixed-effects Poisson family models commonly used to analyze count data with a high proportion of zeros by analyzing a longitudinal outcome: number of smoking quit attempts from the New Hampshire Dual Disorders Study. The findings of our study indicated that count data with many zeros do not necessarily require zero-inflated or other zero-adjusting models. For rare event counts or count data with small means, a simpler model such as the negative binomial model may provide a better fit. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. A new multivariate zero-adjusted Poisson model with applications to biomedicine.

    PubMed

    Liu, Yin; Tian, Guo-Liang; Tang, Man-Lai; Yuen, Kam Chuen

    2018-05-25

    Although advances have recently been made in modeling multivariate count data, existing models still have several limitations: (i) the multivariate Poisson log-normal model (Aitchison and Ho, ) cannot be used to fit multivariate count data with excess zero-vectors; (ii) the multivariate zero-inflated Poisson (ZIP) distribution (Li et al., 1999) cannot be used to model zero-truncated/deflated count data and is difficult to apply to high-dimensional cases; (iii) the Type I multivariate zero-adjusted Poisson (ZAP) distribution (Tian et al., 2017) can only model multivariate count data with a special correlation structure, in which the random components are all positively or all negatively correlated. In this paper, we first introduce a new multivariate ZAP distribution, based on a multivariate Poisson distribution, which allows a more flexible dependency structure between components; that is, some of the correlation coefficients can be positive while others are negative. We then develop its important distributional properties and provide efficient statistical inference methods for the multivariate ZAP model with or without covariates. Two real data examples in biomedicine are used to illustrate the proposed methods. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. A new approach for handling longitudinal count data with zero-inflation and overdispersion: Poisson geometric process model.

    PubMed

    Wan, Wai-Yin; Chan, Jennifer S K

    2009-08-01

    For time series of count data, correlated measurements, clustering as well as excessive zeros occur simultaneously in biomedical applications. Ignoring such effects might lead to misleading conclusions about treatment outcomes. A generalized mixture Poisson geometric process (GMPGP) model and a zero-altered mixture Poisson geometric process (ZMPGP) model are developed from the geometric process model, which was originally developed for modelling positive continuous data and was extended to handle count data. These models are motivated by evaluating the trend development of new tumour counts for bladder cancer patients as well as by identifying useful covariates which affect the count level. The models are implemented using Bayesian methods with Markov chain Monte Carlo (MCMC) algorithms and are assessed using the deviance information criterion (DIC).

  1. A tutorial on count regression and zero-altered count models for longitudinal substance use data

    PubMed Central

    Atkins, David C.; Baldwin, Scott A.; Zheng, Cheng; Gallop, Robert J.; Neighbors, Clayton

    2012-01-01

    Critical research questions in the study of addictive behaviors concern how these behaviors change over time - either as the result of intervention or in naturalistic settings. The combination of count outcomes that are often strongly skewed with many zeroes (e.g., days using, number of total drinks, number of drinking consequences) with repeated assessments (e.g., longitudinal follow-up after intervention or daily diary data) presents challenges for data analyses. The current article provides a tutorial on methods for analyzing longitudinal substance use data, focusing on Poisson, zero-inflated, and hurdle mixed models, which are types of hierarchical or multilevel models. Two example datasets are used throughout, focusing on drinking-related consequences following an intervention and daily drinking over the past 30 days, respectively. Both datasets as well as R, SAS, Mplus, Stata, and SPSS code showing how to fit the models are available on a supplemental website. PMID:22905895
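
The zero-handling distinction among the model families covered by the tutorial can be sketched in a few lines: a hurdle model routes all zeros through one component and models positives with a zero-truncated count distribution, whereas a zero-inflated model lets zeros arise from both components. A minimal sketch (not the tutorial's own code; all parameters hypothetical):

```python
import math

def hurdle_poisson_pmf(k, p_zero, lam):
    """Hurdle Poisson: every zero comes from the hurdle component; positive
    counts follow a zero-truncated Poisson with rate lam."""
    if k == 0:
        return p_zero
    pois = math.exp(-lam) * lam ** k / math.factorial(k)
    # renormalize the Poisson over k >= 1 so the positive part sums to 1
    return (1.0 - p_zero) * pois / (1.0 - math.exp(-lam))
```

In a zero-inflated model, by contrast, the zero probability would be p_zero plus a Poisson zero contributed by the count component, which changes the interpretation of the zero class (structural zeros only vs. a mix of structural and sampling zeros).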

  2. Zero-Inflated Models for Identifying Relationships Between Body Mass Index and Gastroesophageal Reflux Symptoms: A Nationwide Population-Based Study in China.

    PubMed

    Xu, Qin; Zhang, Wei; Zhang, Tianyi; Zhang, Ruijie; Zhao, Yanfang; Zhang, Yuan; Guo, Yibin; Wang, Rui; Ma, Xiuqiang; He, Jia

    2016-07-01

    That obesity leads to gastroesophageal reflux is a widespread notion. However, scientific evidence for this association is limited, with no rigorous epidemiological approach conducted to address this question. This study examined the relationship between body mass index (BMI) and gastroesophageal reflux symptoms in a large population-representative sample from China. We performed a cross-sectional study in an age- and gender-stratified random sample of the population of five central regions in China. Participants aged 18-80 years completed a general information questionnaire and a Chinese version of the Reflux Disease Questionnaire. The zero-inflated Poisson regression model estimated the relationship between body mass index and gastroesophageal reflux symptoms. Overall, 16,091 (89.4 %) of the 18,000 eligible participants responded. 638 (3.97 %) and 1738 (10.81 %) experienced at least weekly heartburn and weekly acid regurgitation, respectively. After adjusting for potential risk factors in the zero-inflated part, the frequency [odds ratio (OR) 0.66, 95 % confidence interval (95 % CI) 0.50-0.86, p = 0.002] and severity (OR 0.66, 95 % CI 0.50-0.88, p = 0.004) of heartburn in obese participants were statistically significantly different from those in normal-weight participants. In the Poisson part, for the frequency of acid regurgitation, overweight (OR 1.10, 95 % CI 1.01-1.21, p = 0.038) and obesity (OR 1.19, 95 % CI 1.04-1.37, p = 0.013) were statistically significant. BMI was strongly and positively related to the frequency and severity of gastroesophageal reflux symptoms. Additionally, gender exerted strong specific effects on the relationship between BMI and gastroesophageal reflux symptoms. The severity and frequency of heartburn were positively correlated with obesity. This relationship was distinct in male participants only.

  3. Classifying next-generation sequencing data using a zero-inflated Poisson model.

    PubMed

    Zhou, Yan; Wan, Xiang; Zhang, Baoxue; Tong, Tiejun

    2018-04-15

    With the development of high-throughput techniques, RNA-sequencing (RNA-seq) is becoming increasingly popular as an alternative for gene expression analysis, such as RNAs profiling and classification. Identifying which type of diseases a new patient belongs to with RNA-seq data has been recognized as a vital problem in medical research. As RNA-seq data are discrete, statistical methods developed for classifying microarray data cannot be readily applied to RNA-seq data classification. Witten proposed a Poisson linear discriminant analysis (PLDA) to classify RNA-seq data in 2011. Note, however, that count datasets are frequently characterized by excess zeros in real RNA-seq or microRNA sequence data (e.g. when the sequencing depth is insufficient, or for small RNAs 18-30 nucleotides in length). Therefore, it is desirable to develop a new model to analyze RNA-seq data with an excess of zeros. In this paper, we propose a Zero-Inflated Poisson Logistic Discriminant Analysis (ZIPLDA) for RNA-seq data with an excess of zeros. The new method assumes that the data are from a mixture of two distributions: one is a point mass at zero, and the other follows a Poisson distribution. We then consider a logistic relation between the probability of observing zeros and the mean of the genes and the sequencing depth in the model. Simulation studies show that the proposed method performs better than, or at least as well as, the existing methods in a wide range of settings. Two real datasets including a breast cancer RNA-seq dataset and a microRNA-seq dataset are also analyzed, and they coincide with the simulation results that our proposed method outperforms the existing competitors. The software is available at http://www.math.hkbu.edu.hk/∼tongt. xwan@comp.hkbu.edu.hk or tongt@hkbu.edu.hk. Supplementary data are available at Bioinformatics online.
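
The likelihood-comparison step behind a classifier of this kind can be sketched as follows. This is not the authors' ZIPLDA (which additionally links the zero probability to gene means and sequencing depth through a logistic model); it only illustrates the simpler core idea of assigning a count vector to the class whose zero-inflated Poisson likelihood is highest, with all parameters hypothetical:

```python
import math

def zip_loglik(counts, pi, lams):
    """Log-likelihood of one count vector under independent ZIP components:
    shared zero-inflation probability pi, gene-specific Poisson means lams."""
    ll = 0.0
    for k, lam in zip(counts, lams):
        pois = math.exp(-lam) * lam ** k / math.factorial(k)
        p = (pi if k == 0 else 0.0) + (1.0 - pi) * pois
        ll += math.log(p)
    return ll

def classify(counts, class_params):
    """Assign the vector to the class with the highest ZIP log-likelihood.

    class_params: dict mapping class label -> (pi, lams), all hypothetical.
    """
    return max(class_params, key=lambda c: zip_loglik(counts, *class_params[c]))
```

For example, with class "A" expecting low counts on gene 1 and high on gene 2, and class "B" the reverse, the vector [0, 8] is assigned to "A".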

  4. A review and comparison of Bayesian and likelihood-based inferences in beta regression and zero-or-one-inflated beta regression.

    PubMed

    Liu, Fang; Eugenio, Evercita C

    2018-04-01

    Beta regression is an increasingly popular statistical technique in medical research for modeling of outcomes that assume values in (0, 1), such as proportions and patient reported outcomes. When outcomes take values in the intervals [0,1), (0,1], or [0,1], zero-or-one-inflated beta (zoib) regression can be used. We provide a thorough review on beta regression and zoib regression in the modeling, inferential, and computational aspects via the likelihood-based and Bayesian approaches. We demonstrate the statistical and practical importance of correctly modeling the inflation at zero/one rather than ad hoc replacing them with values close to zero/one via simulation studies; the latter approach can lead to biased estimates and invalid inferences. We show via simulation studies that the likelihood-based approach is computationally faster in general than MCMC algorithms used in the Bayesian inferences, but runs the risk of non-convergence, large biases, and sensitivity to starting values in the optimization algorithm especially with clustered/correlated data, data with sparse inflation at zero and one, and data that warrant regularization of the likelihood. The disadvantages of the regular likelihood-based approach make the Bayesian approach an attractive alternative in these cases. Software packages and tools for fitting beta and zoib regressions in both the likelihood-based and Bayesian frameworks are also reviewed.

  5. Analyzing Propensity Matched Zero-Inflated Count Outcomes in Observational Studies

    PubMed Central

    DeSantis, Stacia M.; Lazaridis, Christos; Ji, Shuang; Spinale, Francis G.

    2013-01-01

    Determining the effectiveness of different treatments from observational data, which are characterized by imbalance between groups due to lack of randomization, is challenging. Propensity matching is often used to rectify imbalances among prognostic variables. However, there are no guidelines on how appropriately to analyze group matched data when the outcome is a zero inflated count. In addition, there is debate over whether to account for correlation of responses induced by matching, and/or whether to adjust for variables used in generating the propensity score in the final analysis. The aim of this research is to compare covariate unadjusted and adjusted zero-inflated Poisson models that do and do not account for the correlation. A simulation study is conducted, demonstrating that it is necessary to adjust for potential residual confounding, but that accounting for correlation is less important. The methods are applied to a biomedical research data set. PMID:24298197

  6. Some considerations for excess zeroes in substance abuse research.

    PubMed

    Bandyopadhyay, Dipankar; DeSantis, Stacia M; Korte, Jeffrey E; Brady, Kathleen T

    2011-09-01

    Count data collected in substance abuse research often come with an excess of "zeroes," which are typically handled using zero-inflated regression models. However, there is a need to consider the design aspects of those studies before using such a statistical model to ascertain the sources of zeroes. We sought to illustrate hurdle models as alternatives to zero-inflated models to validate a two-stage decision-making process in situations of "excess zeroes." We use data from a study of 45 cocaine-dependent subjects where the primary scientific question was to evaluate whether study participation influences drug-seeking behavior. The outcome, "the frequency (count) of cocaine use days per week," is bounded (ranging from 0 to 7). We fit and compare binomial, Poisson, negative binomial, and the hurdle versions of these models to study the effect of gender, age, time, and study participation on cocaine use. The hurdle binomial model provides the best fit. Gender and time are not predictive of use. Higher odds of use versus no use are associated with age; however, once use is experienced, the odds of further use decrease with increasing age. Participation was associated with higher odds of no cocaine use; once there was use, participation reduced the odds of further use. Age and study participation are significantly predictive of cocaine-use behavior. The two-stage decision process as modeled by a hurdle binomial model (appropriate for bounded count data with excess zeroes) provides interesting insights into the study of covariate effects on count responses of substance use, when all enrolled subjects are believed to be "at-risk" of use.
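
The bounded-count hurdle idea used in this study can be sketched directly: stage one models any use versus none, and stage two models how many of the n = 7 days, via a zero-truncated binomial. A minimal sketch with hypothetical parameters, not the fitted values from the study:

```python
import math

def hurdle_binomial_pmf(k, p_zero, n, q):
    """Hurdle binomial for counts bounded in 0..n (here n = 7 use-days per
    week): stage one is any use vs. none (probability p_zero of none),
    stage two is a zero-truncated binomial for how many of the n days."""
    if k == 0:
        return p_zero
    binom = math.comb(n, k) * q ** k * (1.0 - q) ** (n - k)
    # renormalize the binomial over k >= 1
    return (1.0 - p_zero) * binom / (1.0 - (1.0 - q) ** n)
```

Unlike an unbounded Poisson hurdle, this respects the 0-7 ceiling on weekly use days, which is why a binomial hurdle can fit such data better.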

  7. Herd-level risk factors for Campylobacter fetus infection, Brucella seropositivity and within-herd seroprevalence of brucellosis in cattle in northern Nigeria.

    PubMed

    Mai, H M; Irons, P C; Kabir, J; Thompson, P N

    2013-09-01

    Brucellosis and campylobacteriosis are economically important diseases affecting bovine reproductive efficiency in Nigeria. A questionnaire-based survey was conducted in 271 cattle herds in Adamawa, Kaduna and Kano states of northern Nigeria using multistage cluster sampling. Serum from 4745 mature animals was tested for Brucella antibodies using the Rose-Bengal plate test and positives were confirmed in series-testing protocol using competitive enzyme-linked immunosorbent assay. Preputial scrapings from 602 bulls were tested using culture and identification for Campylobacter fetus. For each disease, a herd was classified as positive if one or more animals tested positive. For each herd, information on potential managemental and environmental risk factors was collected through a questionnaire administered during an interview with the manager, owner or herdsman. Multiple logistic regression models were used to model the odds of herd infection for each disease. A zero-inflated Poisson model was used to model the count of Brucella-positive animals within herds, with the number tested as an exposure variable. The presence of small ruminants (sheep and/or goats) on the same farm, and buying-in of >3 new animals in the previous year or failure to practice quarantine were associated with increased odds of herd-level campylobacteriosis and brucellosis, as well as increased within-herd counts of Brucella-positive animals. In addition, high rainfall, initial acquisition of animals from markets, practice of gynaecological examination and failure to practice herd prophylactic measures were positively associated with the odds of C. fetus infection in the herd. Herd size of >15, pastoral management system and presence of handling facility on the farm were associated with increased odds, and gynaecological examination with reduced odds of herd-level Brucella seropositivity. 
Furthermore, the zero-inflated Poisson model showed that borrowing or sharing of bulls was associated with higher counts, and provision of mineral supplement with lower counts of Brucella-positive cattle within herds. Identification of risk factors for bovine campylobacteriosis and brucellosis can help to identify appropriate control measures, and the use of zero-inflated count model can provide more specific information on these risk factors. Copyright © 2013 Elsevier B.V. All rights reserved.
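
    The exposure-variable device used in this study (modeling the count of seropositive animals with the number tested as an exposure) can be sketched as a zero-inflated Poisson likelihood in which the exposure scales the count mean multiplicatively. The coefficients, covariate, and parameter values below are hypothetical illustrations, not estimates from the study.

    ```python
    import math

    def zip_logpmf(y, lam, pi):
        """Log-pmf of a zero-inflated Poisson: a point mass at zero
        (probability pi) mixed with a Poisson(lam) count component."""
        pois = math.exp(-lam) * lam**y / math.factorial(y)
        if y == 0:
            return math.log(pi + (1 - pi) * pois)
        return math.log((1 - pi) * pois)

    # The exposure (number of animals tested) scales the count mean:
    # lam = n_tested * exp(x'beta). Coefficients here are hypothetical.
    beta0, beta_bull_sharing = -3.0, 0.7
    n_tested, shares_bulls = 25, 1
    lam = n_tested * math.exp(beta0 + beta_bull_sharing * shares_bulls)

    # Log-likelihood contribution of three herds with 0, 0 and 2 positives:
    loglik = sum(zip_logpmf(y, lam, pi=0.4) for y in (0, 0, 2))
    ```

    In practice the full log-likelihood is maximized numerically over the regression coefficients and the zero-inflation probability; the sketch only shows how the exposure enters.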

  8. Selecting a distributional assumption for modelling relative densities of benthic macroinvertebrates

    USGS Publications Warehouse

    Gray, B.R.

    2005-01-01

    The selection of a distributional assumption suitable for modelling macroinvertebrate density data is typically challenging. Macroinvertebrate data often exhibit substantially larger variances than expected under a standard count assumption, that of the Poisson distribution. Such overdispersion may derive from multiple sources, including heterogeneity of habitat (historically and spatially), differing life histories for organisms collected within a single collection in space and time, and autocorrelation. Taken to extreme, heterogeneity of habitat may be argued to explain the frequent large proportions of zero observations in macroinvertebrate data. Sampling locations may consist of habitats defined qualitatively as either suitable or unsuitable. The former category may yield random or stochastic zeroes and the latter structural zeroes. Heterogeneity among counts may be accommodated by treating the count mean itself as a random variable, while extra zeroes may be accommodated using zero-modified count assumptions, including zero-inflated and two-stage (or hurdle) approaches. These and linear assumptions (following log- and square root-transformations) were evaluated using 9 years of mayfly density data from a 52 km, ninth-order reach of the Upper Mississippi River (n = 959). The data exhibited substantial overdispersion relative to that expected under a Poisson assumption (i.e. variance:mean ratio = 23 ≫ 1), and 43% of the sampling locations yielded zero mayflies. Based on the Akaike Information Criterion (AIC), count models were improved most by treating the count mean as a random variable (via a Poisson-gamma distributional assumption) and secondarily by zero modification (i.e. improvements in AIC values = 9184 units and 47-48 units, respectively).
Zeroes were underestimated by the Poisson, log-transform and square root-transform models, slightly by the standard negative binomial model but not by the zero-modified models (61%, 24%, 32%, 7%, and 0%, respectively). However, the zero-modified Poisson models underestimated small counts (1 ≤ y ≤ 4) and overestimated intermediate counts (7 ≤ y ≤ 23). Counts greater than zero were estimated well by zero-modified negative binomial models, while counts greater than one were also estimated well by the standard negative binomial model. Based on AIC and percent zero estimation criteria, the two-stage and zero-inflated models performed similarly. The above inferences were largely confirmed when the models were used to predict values from a separate, evaluation data set (n = 110). An exception was that, using the evaluation data set, the standard negative binomial model appeared superior to its zero-modified counterparts using the AIC (but not percent zero criteria). This and other evidence suggest that a negative binomial distributional assumption should be routinely considered when modelling benthic macroinvertebrate data from low flow environments. Whether negative binomial models should themselves be routinely examined for extra zeroes requires, from a statistical perspective, more investigation. However, this question may best be answered by ecological arguments that may be specific to the sampled species and locations. © 2004 Elsevier B.V. All rights reserved.

  9. Disease Mapping of Zero-excessive Mesothelioma Data in Flanders

    PubMed Central

    Neyens, Thomas; Lawson, Andrew B.; Kirby, Russell S.; Nuyts, Valerie; Watjou, Kevin; Aregay, Mehreteab; Carroll, Rachel; Nawrot, Tim S.; Faes, Christel

    2016-01-01

    Purpose: To investigate the distribution of mesothelioma in Flanders using Bayesian disease mapping models that account for both an excess of zeros and overdispersion. Methods: The numbers of newly diagnosed mesothelioma cases within all Flemish municipalities between 1999 and 2008 were obtained from the Belgian Cancer Registry. To deal with overdispersion, zero-inflation and geographical association, the hurdle combined model was proposed, which has three components: a Bernoulli zero-inflation mixture component to account for excess zeros, a gamma random effect to adjust for overdispersion and a normal conditional autoregressive random effect to attribute spatial association. This model was compared with other existing methods in literature. Results: The results indicate that hurdle models with a random effects term accounting for extra-variance in the Bernoulli zero-inflation component fit the data better than hurdle models that do not take overdispersion in the occurrence of zeros into account. Furthermore, traditional models that do not take into account excessive zeros but contain at least one random effects term that models extra-variance in the counts have better fits compared to their hurdle counterparts. In other words, the extra-variability, due to an excess of zeros, can be accommodated by spatially structured and/or unstructured random effects in a Poisson model such that the hurdle mixture model is not necessary. Conclusions: Models taking into account zero-inflation do not always provide better fits to data with excessive zeros than less complex models. In this study, a simple conditional autoregressive model identified a cluster in mesothelioma cases near a former asbestos processing plant (Kapelle-op-den-Bos). This observation is likely linked with historical local asbestos exposures. Future research will clarify this. PMID:27908590

  10. Disease mapping of zero-excessive mesothelioma data in Flanders.

    PubMed

    Neyens, Thomas; Lawson, Andrew B; Kirby, Russell S; Nuyts, Valerie; Watjou, Kevin; Aregay, Mehreteab; Carroll, Rachel; Nawrot, Tim S; Faes, Christel

    2017-01-01

    To investigate the distribution of mesothelioma in Flanders using Bayesian disease mapping models that account for both an excess of zeros and overdispersion. The numbers of newly diagnosed mesothelioma cases within all Flemish municipalities between 1999 and 2008 were obtained from the Belgian Cancer Registry. To deal with overdispersion, zero inflation, and geographical association, the hurdle combined model was proposed, which has three components: a Bernoulli zero-inflation mixture component to account for excess zeros, a gamma random effect to adjust for overdispersion, and a normal conditional autoregressive random effect to attribute spatial association. This model was compared with other existing methods in literature. The results indicate that hurdle models with a random effects term accounting for extra variance in the Bernoulli zero-inflation component fit the data better than hurdle models that do not take overdispersion in the occurrence of zeros into account. Furthermore, traditional models that do not take into account excessive zeros but contain at least one random effects term that models extra variance in the counts have better fits compared to their hurdle counterparts. In other words, the extra variability, due to an excess of zeros, can be accommodated by spatially structured and/or unstructured random effects in a Poisson model such that the hurdle mixture model is not necessary. Models taking into account zero inflation do not always provide better fits to data with excessive zeros than less complex models. In this study, a simple conditional autoregressive model identified a cluster in mesothelioma cases near a former asbestos processing plant (Kapelle-op-den-Bos). This observation is likely linked with historical local asbestos exposures. Future research will clarify this. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Prediction of Short-Distance Aerial Movement of Phakopsora pachyrhizi Urediniospores Using Machine Learning.

    PubMed

    Wen, L; Bowen, C R; Hartman, G L

    2017-10-01

    Dispersal of urediniospores by wind is the primary means of spread for Phakopsora pachyrhizi, the cause of soybean rust. Our research focused on the short-distance movement of urediniospores from within the soybean canopy and up to 61 m from field-grown rust-infected soybean plants. Environmental variables were used to develop and compare models including the least absolute shrinkage and selection operator regression, zero-inflated Poisson/regular Poisson regression, random forest, and neural network to describe deposition of urediniospores collected in passive and active traps. All four models identified distance of trap from source, humidity, temperature, wind direction, and wind speed as the five most important variables influencing short-distance movement of urediniospores. The random forest model provided the best predictions, explaining 76.1 and 86.8% of the total variation in the passive- and active-trap datasets, respectively. The prediction accuracy based on the correlation coefficient (r) between predicted values and the true values was 0.83 (P < 0.0001) and 0.94 (P < 0.0001) for the passive and active trap datasets, respectively. Overall, multiple machine learning techniques identified the most important variables for making the most accurate predictions of short-distance movement of P. pachyrhizi urediniospores.

  12. Predictors for the Number of Warning Information Sources During Tornadoes.

    PubMed

    Cong, Zhen; Luo, Jianjun; Liang, Daan; Nejat, Ali

    2017-04-01

    People may receive tornado warnings from multiple information sources, but little is known about factors that affect the number of warning information sources (WISs). This study examined predictors for the number of WISs with a telephone survey of randomly sampled residents in Tuscaloosa, Alabama, and Joplin, Missouri, approximately 1 year after both cities were struck by violent tornadoes (EF4 and EF5) in 2011. The survey included 1006 finished interviews and the working sample included 903 respondents. Poisson regression and zero-inflated Poisson regression showed that older age and having an emergency plan predicted more WISs in both cities. Education, marital status, and gender affected the likelihood of receiving warnings and the number of WISs in either Joplin or Tuscaloosa. The findings suggest that social disparity affects access to warnings not only with respect to the likelihood of receiving any warnings but also with respect to the number of WISs. In addition, historical and social contexts are important for examining predictors for the number of WISs. We recommend that the number of WISs be regarded as an important measure of access to warnings, in addition to the likelihood of receiving warnings. (Disaster Med Public Health Preparedness. 2017;11:168-172).

  13. Population heterogeneity in the salience of multiple risk factors for adolescent delinquency.

    PubMed

    Lanza, Stephanie T; Cooper, Brittany R; Bray, Bethany C

    2014-03-01

    To present mixture regression analysis as an alternative to more standard regression analysis for predicting adolescent delinquency. We demonstrate how mixture regression analysis allows for the identification of population subgroups defined by the salience of multiple risk factors. We identified population subgroups (i.e., latent classes) of individuals based on their coefficients in a regression model predicting adolescent delinquency from eight previously established risk indices drawn from the community, school, family, peer, and individual levels. The study included N = 37,763 10th-grade adolescents who participated in the Communities That Care Youth Survey. Standard, zero-inflated, and mixture Poisson and negative binomial regression models were considered. Standard and mixture negative binomial regression models were selected as optimal. The five-class regression model was interpreted based on the class-specific regression coefficients, indicating that risk factors had varying salience across classes of adolescents. Standard regression showed that all risk factors were significantly associated with delinquency. Mixture regression provided more nuanced information, suggesting a unique set of risk factors that were salient for different subgroups of adolescents. Implications for the design of subgroup-specific interventions are discussed. Copyright © 2014 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  14. Efficacy of a savings-led microfinance intervention to reduce sexual risk for HIV among women engaged in sex work: a randomized clinical trial.

    PubMed

    Witte, Susan S; Aira, Toivgoo; Tsai, Laura Cordisco; Riedel, Marion; Offringa, Reid; Chang, Mingway; El-Bassel, Nabila; Ssewamala, Fred

    2015-03-01

    We tested whether a structural intervention combining savings-led microfinance and HIV prevention components would achieve enhanced reductions in sexual risk among women engaging in street-based sex work in Ulaanbaatar, Mongolia, compared with an HIV prevention intervention alone. Between November 2011 and August 2012, we randomized 107 eligible women who completed baseline assessments to either a 4-session HIV sexual risk reduction intervention (HIVSRR) alone (n=50) or a 34-session HIVSRR plus a savings-led microfinance intervention (n=57). At 3- and 6-month follow-up assessments, participants reported unprotected acts of vaginal intercourse with paying partners and number of paying partners with whom they engaged in sexual intercourse in the previous 90 days. Using Poisson and zero-inflated Poisson model regressions, we examined the effects of assignment to treatment versus control condition on outcomes. At 6-month follow-up, the HIVSRR plus microfinance participants reported significantly fewer paying sexual partners and were more likely to report zero unprotected vaginal sex acts with paying sexual partners. Findings advance the HIV prevention repertoire for women, demonstrating that risk reduction may be achieved through a structural intervention that relies on asset building, including savings, and alternatives to income from sex work.

  15. Conditional modeling of antibody titers using a zero-inflated poisson random effects model: application to Fabrazyme.

    PubMed

    Bonate, Peter L; Sung, Crystal; Welch, Karen; Richards, Susan

    2009-10-01

    Patients that are exposed to biotechnology-derived therapeutics often develop antibodies to the therapeutic, the magnitude of which is assessed by measuring antibody titers. A statistical approach for analyzing antibody titer data conditional on seroconversion is presented. The proposed method is to first transform the antibody titer data based on a geometric series using a common ratio of 2 and a scale factor of 50 and then analyze the exponent using a zero-inflated or hurdle model assuming a Poisson or negative binomial distribution with random effects to account for patient heterogeneity. Patient specific covariates can be used to model the probability of developing an antibody response, i.e., seroconversion, as well as the magnitude of the antibody titer itself. The method was illustrated using antibody titer data from 87 male seroconverted Fabry patients receiving Fabrazyme. Titers from five clinical trials were collected over 276 weeks of therapy with anti-Fabrazyme IgG titers ranging from 100 to 409,600 after exclusion of seronegative patients. The best model to explain seroconversion was a zero-inflated Poisson (ZIP) model where cumulative dose (under a constant dose regimen of dosing every 2 weeks) influenced the probability of seroconversion. There was an 80% chance of seroconversion when the cumulative dose reached 210 mg (90% confidence interval: 194-226 mg). No difference in antibody titers was noted between Japanese or Western patients. Once seroconverted, antibody titers did not remain constant but decreased in an exponential manner from an initial magnitude to a new lower steady-state value. The expected titer after the new steady-state titer had been achieved was 870 (90% CI: 630-1109). The half-life to the new steady-state value after seroconversion was 44 weeks (90% CI: 17-70 weeks). Time to seroconversion did not appear to be correlated with titer at the time of seroconversion. The method can be adequately used to model antibody titer data.
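
    The titer transformation described above (a geometric series with common ratio 2 and scale factor 50) maps each observed titer to an integer exponent, which is the quantity modeled as a count. A minimal sketch of that mapping:

    ```python
    import math

    def titer_exponent(titer, scale=50, ratio=2):
        """Map a titer on the geometric scale scale * ratio**k back to its
        integer exponent k, the quantity analyzed as a count."""
        return round(math.log(titer / scale, ratio))

    # Titers in the study ranged from 100 to 409,600:
    exponents = [titer_exponent(t) for t in (100, 800, 409600)]
    # -> [1, 4, 13]
    ```

    The exponent, rather than the raw titer, is then modeled with the zero-inflated or hurdle Poisson/negative binomial random-effects structure the abstract describes.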

  16. The impact of state safe routes to school-related laws on active travel to school policies and practices in U.S. elementary schools.

    PubMed

    Chriqui, Jamie F; Taber, Daniel R; Slater, Sandy J; Turner, Lindsey; Lowrey, Kerri McGowan; Chaloupka, Frank J

    2012-01-01

    This study examined the relationship between state laws requiring minimum bussing distances, hazardous route exemptions, sidewalks, crossing guards, speed zones, and traffic control measures around schools and active travel to school (ATS) policies/practices in nationally representative samples of U.S. public elementary schools between 2007 and 2009. The state laws and school data were compiled through primary legal research and annual mail-back surveys of principals, respectively. Multivariate logistic and zero-inflated Poisson regression indicated that all state law categories (except for sidewalks) were related to ATS. These laws should be considered in addition to formal safe routes to school programs as possible influences on ATS. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. Some findings on zero-inflated and hurdle Poisson models for disease mapping.

    PubMed

    Corpas-Burgos, Francisca; García-Donato, Gonzalo; Martinez-Beneito, Miguel A

    2018-05-27

    Zero excess in the study of geographically referenced mortality data sets has been the focus of considerable attention in the literature, with zero-inflation being the most common procedure to handle this lack of fit. Although hurdle models have also been used in disease mapping studies, their use is rarer. We show in this paper that models using particular treatments of zero excess are often required for achieving appropriate fits in regular mortality studies since, otherwise, geographical units with low expected counts are oversmoothed. However, as also shown, an indiscriminate treatment of zero excess may be unnecessary and has a problematic implementation. In this regard, we find that naive zero-inflation and hurdle models, without an explicit modeling of the probabilities of zeroes, do not fix zero-excess problems well enough and are clearly unsatisfactory. Results sharply suggest the need for an explicit modeling of the probabilities, which should vary across areal units. Unfortunately, these more flexible modeling strategies can easily lead to improper posterior distributions, as we prove in several theoretical results. Those procedures have been repeatedly used in the disease mapping literature, and one should bear these issues in mind in order to propose valid models. We finally propose several valid modeling alternatives, in line with the results mentioned, that are suitable for fitting zero excesses. We show that those proposals fix zero-excess problems and correct the mentioned oversmoothing of risks in sparsely populated units, depicting geographic patterns more suited to the data. Copyright © 2018 John Wiley & Sons, Ltd.

  18. Simulation on Poisson and negative binomial models of count road accident modeling

    NASA Astrophysics Data System (ADS)

    Sapuan, M. S.; Razali, A. M.; Zamzuri, Z. H.; Ibrahim, K.

    2016-11-01

    Accident count data often exhibit overdispersion, and they may also contain excess zero counts. A simulation study was conducted to create scenarios in which accidents occur at a T-junction, with the dependent variable generated from a given distribution, namely the Poisson or the negative binomial, at sample sizes ranging from n = 30 to n = 500. The study objective was accomplished by fitting Poisson regression, negative binomial regression, and hurdle negative binomial models to the simulated data. Comparison of the fitted models shows that, for a given sample size, not every model fits the data well, even when the data were generated from that model's own distribution, especially when the sample size is larger. Furthermore, larger sample sizes tend to produce more zero accident counts in the dataset.
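
    A data-generating setup of this kind can be sketched by simulating zero-inflated Poisson counts with only the standard library; the parameter values below are illustrative, not those used in the paper.

    ```python
    import math
    import random

    def rpois(lam, rng):
        """Poisson(lam) draw via Knuth's multiplication method."""
        threshold, k, p = math.exp(-lam), 0, 1.0
        while p > threshold:
            k += 1
            p *= rng.random()
        return k - 1

    def simulate_zip(n, lam, pi, seed=0):
        """n counts from a zero-inflated Poisson: a structural zero with
        probability pi, otherwise a Poisson(lam) draw."""
        rng = random.Random(seed)
        return [0 if rng.random() < pi else rpois(lam, rng) for _ in range(n)]

    counts = simulate_zip(500, lam=2.0, pi=0.3)
    zero_fraction = counts.count(0) / len(counts)
    # Expected zero fraction: pi + (1 - pi) * exp(-lam), about 0.39 here
    ```

    Fitting Poisson, negative binomial, and hurdle candidates to such draws at several sample sizes reproduces the kind of comparison the abstract reports.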

  19. Matching the Statistical Model to the Research Question for Dental Caries Indices with Many Zero Counts.

    PubMed

    Preisser, John S; Long, D Leann; Stamm, John W

    2017-01-01

    Marginalized zero-inflated count regression models have recently been introduced for the statistical analysis of dental caries indices and other zero-inflated count data as alternatives to traditional zero-inflated and hurdle models. Unlike the standard approaches, the marginalized models directly estimate overall exposure or treatment effects by relating covariates to the marginal mean count. This article discusses model interpretation and model class choice according to the research question being addressed in caries research. Two data sets, one consisting of fictional dmft counts in 2 groups and the other on DMFS among schoolchildren from a randomized clinical trial comparing 3 toothpaste formulations to prevent incident dental caries, are analyzed with negative binomial hurdle, zero-inflated negative binomial, and marginalized zero-inflated negative binomial models. In the first example, estimates of treatment effects vary according to the type of incidence rate ratio (IRR) estimated by the model. Estimates of IRRs in the analysis of the randomized clinical trial were similar despite their distinctive interpretations. The choice of statistical model class should match the study's purpose, while accounting for the broad decline in children's caries experience, such that dmft and DMFS indices more frequently generate zero counts. Marginalized (marginal mean) models for zero-inflated count data should be considered for direct assessment of exposure effects on the marginal mean dental caries count in the presence of high frequencies of zero counts. © 2017 S. Karger AG, Basel.

  20. Matching the Statistical Model to the Research Question for Dental Caries Indices with Many Zero Counts

    PubMed Central

    Preisser, John S.; Long, D. Leann; Stamm, John W.

    2017-01-01

    Marginalized zero-inflated count regression models have recently been introduced for the statistical analysis of dental caries indices and other zero-inflated count data as alternatives to traditional zero-inflated and hurdle models. Unlike the standard approaches, the marginalized models directly estimate overall exposure or treatment effects by relating covariates to the marginal mean count. This article discusses model interpretation and model class choice according to the research question being addressed in caries research. Two datasets, one consisting of fictional dmft counts in two groups and the other on DMFS among schoolchildren from a randomized clinical trial (RCT) comparing three toothpaste formulations to prevent incident dental caries, are analysed with negative binomial hurdle (NBH), zero-inflated negative binomial (ZINB), and marginalized zero-inflated negative binomial (MZINB) models. In the first example, estimates of treatment effects vary according to the type of incidence rate ratio (IRR) estimated by the model. Estimates of IRRs in the analysis of the RCT were similar despite their distinctive interpretations. Choice of statistical model class should match the study’s purpose, while accounting for the broad decline in children’s caries experience, such that dmft and DMFS indices more frequently generate zero counts. Marginalized (marginal mean) models for zero-inflated count data should be considered for direct assessment of exposure effects on the marginal mean dental caries count in the presence of high frequencies of zero counts. PMID:28291962

  1. Efficacy of a Savings-Led Microfinance Intervention to Reduce Sexual Risk for HIV Among Women Engaged in Sex Work: A Randomized Clinical Trial

    PubMed Central

    Aira, Toivgoo; Tsai, Laura Cordisco; Riedel, Marion; Offringa, Reid; Chang, Mingway; El-Bassel, Nabila; Ssewamala, Fred

    2015-01-01

    Objectives. We tested whether a structural intervention combining savings-led microfinance and HIV prevention components would achieve enhanced reductions in sexual risk among women engaging in street-based sex work in Ulaanbaatar, Mongolia, compared with an HIV prevention intervention alone. Methods. Between November 2011 and August 2012, we randomized 107 eligible women who completed baseline assessments to either a 4-session HIV sexual risk reduction intervention (HIVSRR) alone (n = 50) or a 34-session HIVSRR plus a savings-led microfinance intervention (n = 57). At 3- and 6-month follow-up assessments, participants reported unprotected acts of vaginal intercourse with paying partners and number of paying partners with whom they engaged in sexual intercourse in the previous 90 days. Using Poisson and zero-inflated Poisson model regressions, we examined the effects of assignment to treatment versus control condition on outcomes. Results. At 6-month follow-up, the HIVSRR plus microfinance participants reported significantly fewer paying sexual partners and were more likely to report zero unprotected vaginal sex acts with paying sexual partners. Conclusions. Findings advance the HIV prevention repertoire for women, demonstrating that risk reduction may be achieved through a structural intervention that relies on asset building, including savings, and alternatives to income from sex work. PMID:25602889

  2. Marginalized multilevel hurdle and zero-inflated models for overdispersed and correlated count data with excess zeros.

    PubMed

    Kassahun, Wondwosen; Neyens, Thomas; Molenberghs, Geert; Faes, Christel; Verbeke, Geert

    2014-11-10

    Count data are collected repeatedly over time in many applications, such as biology, epidemiology, and public health. Such data are often characterized by the following three features. First, correlation due to the repeated measures is usually accounted for using subject-specific random effects, which are assumed to be normally distributed. Second, the sample variance may exceed the mean, and hence, the theoretical mean-variance relationship is violated, leading to overdispersion. This is usually allowed for based on a hierarchical approach, combining a Poisson model with gamma distributed random effects. Third, an excess of zeros beyond what standard count distributions can predict is often handled by either the hurdle or the zero-inflated model. A zero-inflated model assumes two processes as sources of zeros and combines a count distribution with a discrete point mass as a mixture, while the hurdle model separately handles zero observations and positive counts, where then a truncated-at-zero count distribution is used for the non-zero state. In practice, however, all these three features can appear simultaneously. Hence, a modeling framework that incorporates all three is necessary, and this presents challenges for the data analysis. Such models, when conditionally specified, will naturally have a subject-specific interpretation. However, adopting their purposefully modified marginalized versions leads to a direct marginal or population-averaged interpretation for parameter estimates of covariate effects, which is the primary interest in many applications. In this paper, we present a marginalized hurdle model and a marginalized zero-inflated model for correlated and overdispersed count data with excess zero observations and then illustrate these further with two case studies. 
The first dataset focuses on the Anopheles mosquito density around a hydroelectric dam, while adolescents' involvement in work, to earn money and support their families or themselves, is studied in the second example. Sub-models, which result from omitting the zero-inflation and/or overdispersion features, are also considered for comparison purposes. Analysis of the two datasets showed that accounting for the correlation, overdispersion, and excess zeros simultaneously resulted in a better fit to the data and, more importantly, that omission of any of them leads to incorrect marginal inference and erroneous conclusions about covariate effects. Copyright © 2014 John Wiley & Sons, Ltd.
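
    The two formulations contrasted in this abstract differ only in how the zero probability is built: the zero-inflated model mixes a point mass at zero with a full count distribution, while the hurdle model pairs a separate zero probability with a truncated-at-zero count distribution. A sketch with hypothetical parameter values shows that, with matched zero probabilities, they imply the same distribution:

    ```python
    import math

    def poisson_pmf(y, lam):
        return math.exp(-lam) * lam**y / math.factorial(y)

    def zip_pmf(y, lam, pi):
        """Zero-inflated: a point mass at zero mixed with a full Poisson."""
        return (pi if y == 0 else 0.0) + (1 - pi) * poisson_pmf(y, lam)

    def hurdle_pmf(y, lam, p_zero):
        """Hurdle: zeros modeled separately; positive counts follow a
        truncated-at-zero Poisson."""
        if y == 0:
            return p_zero
        return (1 - p_zero) * poisson_pmf(y, lam) / (1 - math.exp(-lam))

    # Setting the hurdle's zero probability to the ZIP's total zero
    # probability makes the two models coincide:
    lam, pi = 1.5, 0.25
    p_zero = zip_pmf(0, lam, pi)
    same = all(abs(hurdle_pmf(y, lam, p_zero) - zip_pmf(y, lam, pi)) < 1e-12
               for y in range(20))
    # same -> True
    ```

    The practical difference appears once covariates enter each component, which is why the two model classes can support different interpretations of the same excess-zero data.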

  3. Estimating cavity tree and snag abundance using negative binomial regression models and nearest neighbor imputation methods

    Treesearch

    Bianca N.I. Eskelson; Hailemariam Temesgen; Tara M. Barrett

    2009-01-01

    Cavity tree and snag abundance data are highly variable and contain many zero observations. We predict cavity tree and snag abundance from variables that are readily available from forest cover maps or remotely sensed data using negative binomial (NB), zero-inflated NB, and zero-altered NB (ZANB) regression models as well as nearest neighbor (NN) imputation methods....

  4. Variable- and Person-Centered Approaches to the Analysis of Early Adolescent Substance Use: Linking Peer, Family, and Intervention Effects with Developmental Trajectories

    ERIC Educational Resources Information Center

    Connell, Arin M.; Dishion, Thomas J.; Deater-Deckard, Kirby

    2006-01-01

    This 4-year study of 698 young adolescents examined the covariates of early onset substance use from Grade 6 through Grade 9. The youth were randomly assigned to a family-centered Adolescent Transitions Program (ATP) condition. Variable-centered (zero-inflated Poisson growth model) and person-centered (latent growth mixture model) approaches were…

  5. Fitting statistical distributions to sea duck count data: implications for survey design and abundance estimation

    USGS Publications Warehouse

    Zipkin, Elise F.; Leirness, Jeffery B.; Kinlan, Brian P.; O'Connell, Allan F.; Silverman, Emily D.

    2014-01-01

    Determining appropriate statistical distributions for modeling animal count data is important for accurate estimation of abundance, distribution, and trends. In the case of sea ducks along the U.S. Atlantic coast, managers want to estimate local and regional abundance to detect and track population declines, to define areas of high and low use, and to predict the impact of future habitat change on populations. In this paper, we used a modified marked point process to model survey data that recorded flock sizes of Common eiders, Long-tailed ducks, and Black, Surf, and White-winged scoters. The data come from an experimental aerial survey, conducted by the United States Fish & Wildlife Service (USFWS) Division of Migratory Bird Management, during which east-west transects were flown along the Atlantic Coast from Maine to Florida during the winters of 2009–2011. To model the number of flocks per transect (the points), we compared the fit of four statistical distributions (zero-inflated Poisson, zero-inflated geometric, zero-inflated negative binomial and negative binomial) to data on the number of species-specific sea duck flocks that were recorded for each transect flown. To model the flock sizes (the marks), we compared the fit of flock size data for each species to seven statistical distributions: positive Poisson, positive negative binomial, positive geometric, logarithmic, discretized lognormal, zeta and Yule–Simon. Akaike’s Information Criterion and Vuong’s closeness tests indicated that the negative binomial and discretized lognormal were the best distributions for all species for the points and marks, respectively. 
These findings have important implications for estimating sea duck abundances as the discretized lognormal is a more skewed distribution than the Poisson and negative binomial, which are frequently used to model avian counts; the lognormal is also less heavy-tailed than the power law distributions (e.g., zeta and Yule–Simon), which are becoming increasingly popular for group size modeling. Choosing appropriate statistical distributions for modeling flock size data is fundamental to accurately estimating population summaries, determining required survey effort, and assessing and propagating uncertainty through decision-making processes.
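    The distribution-comparison step described in this abstract (fitting competing count distributions and ranking them by AIC) can be sketched in a few lines. This is a generic illustration on simulated data, not the survey's flock counts; the parameter values and the hand-rolled likelihoods are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(42)
n, pi_true, lam_true = 2000, 0.4, 3.0
# Zero-inflated Poisson draws: a structural zero with prob pi, else Poisson(lam)
y = np.where(rng.random(n) < pi_true, 0, rng.poisson(lam_true, n))

def poisson_nll(params):
    lam = np.exp(params[0])                      # log link keeps lambda > 0
    return -np.sum(y * np.log(lam) - lam - gammaln(y + 1))

def zip_nll(params):
    lam = np.exp(params[0])
    pi = 1.0 / (1.0 + np.exp(-params[1]))        # logit link for inflation weight
    p_zero = pi + (1.0 - pi) * np.exp(-lam)      # structural + sampling zeros
    ll_pos = np.log1p(-pi) + y * np.log(lam) - lam - gammaln(y + 1)
    return -np.sum(np.where(y == 0, np.log(p_zero), ll_pos))

fit_p = minimize(poisson_nll, x0=[0.0], method="Nelder-Mead")
fit_zip = minimize(zip_nll, x0=[0.0, 0.0], method="Nelder-Mead")
aic_p = 2 * 1 + 2 * fit_p.fun                    # AIC = 2k - 2 log L
aic_zip = 2 * 2 + 2 * fit_zip.fun
print(aic_p, aic_zip)   # on zero-inflated data the ZIP AIC should be smaller
```

    On data like these, the extra zero-inflation parameter buys a large likelihood gain, so the ZIP easily repays its AIC penalty for one additional parameter.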

  6. Antibiotic Resistances in Livestock: A Comparative Approach to Identify an Appropriate Regression Model for Count Data

    PubMed Central

    Hüls, Anke; Frömke, Cornelia; Ickstadt, Katja; Hille, Katja; Hering, Johanna; von Münchhausen, Christiane; Hartmann, Maria; Kreienbrock, Lothar

    2017-01-01

    Antimicrobial resistance in livestock is a matter of general concern. To develop hygiene measures and methods for resistance prevention and control, epidemiological studies on a population level are needed to detect factors associated with antimicrobial resistance in livestock holdings. In general, regression models are used to describe these relationships between environmental factors and resistance outcome. Besides the study design, the correlation structures of the different outcomes of antibiotic resistance and structural zero measurements on the resistance outcome as well as on the exposure side are challenges for the epidemiological model building process. The use of appropriate regression models that acknowledge these complexities is essential to assure valid epidemiological interpretations. The aims of this paper are (i) to explain the model building process comparing several competing models for count data (negative binomial model, quasi-Poisson model, zero-inflated model, and hurdle model) and (ii) to compare these models using data from a cross-sectional study on antibiotic resistance in animal husbandry. These goals are essential to evaluate which model is most suitable to identify potential prevention measures. The dataset used as an example in our analyses was generated initially to study the prevalence and associated factors for the appearance of cefotaxime-resistant Escherichia coli in 48 German fattening pig farms. For each farm, the outcome was the count of samples with resistant bacteria. There was almost no overdispersion and only moderate evidence of excess zeros in the data. Our analyses show that it is essential to evaluate regression models in studies analyzing the relationship between environmental factors and antibiotic resistances in livestock. After model comparison based on evaluation of model predictions, Akaike information criterion, and Pearson residuals, here the hurdle model was judged to be the most appropriate model. 
PMID:28620609
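    The hurdle model this study judges most appropriate treats zeros and positives as two separate processes: a Bernoulli part for whether any resistant sample occurs at all, and a zero-truncated count part for how many occur given that some do. A minimal intercept-only sketch on simulated data (all parameter values are invented, not the pig-farm data):

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

rng = np.random.default_rng(1)
n, p_true, lam_true = 3000, 0.35, 2.5
# Simulate hurdle data: zero with prob 1 - p; else a zero-truncated Poisson draw
y = np.zeros(n, dtype=int)
nonzero = rng.random(n) < p_true
draws = rng.poisson(lam_true, size=nonzero.sum())
while (draws == 0).any():                 # rejection step enforces truncation
    m = draws == 0
    draws[m] = rng.poisson(lam_true, size=m.sum())
y[nonzero] = draws

p_hat = (y > 0).mean()                    # Bernoulli component: closed form

def trunc_pois_nll(lam):
    pos = y[y > 0]
    # log P(Y=k | Y>0) = k*log(lam) - lam - log(k!) - log(1 - e^(-lam))
    return -np.sum(pos * np.log(lam) - lam - gammaln(pos + 1)
                   - np.log1p(-np.exp(-lam)))

lam_hat = minimize_scalar(trunc_pois_nll, bounds=(1e-6, 50), method="bounded").x
print(round(p_hat, 3), round(lam_hat, 3))
```

    Because the two likelihood components share no parameters, they can be maximized separately, which is part of what makes hurdle models convenient to fit and compare.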

  7. Zero-inflated spatio-temporal models for disease mapping.

    PubMed

    Torabi, Mahmoud

    2017-05-01

    In this paper, our aim is to analyze geographical and temporal variability of disease incidence when spatio-temporal count data have excess zeros. To that end, we consider random effects in zero-inflated Poisson models to investigate geographical and temporal patterns of disease incidence. Spatio-temporal models that employ conditionally autoregressive smoothing across the spatial dimension and B-spline smoothing over the temporal dimension are proposed. The analysis of these complex models is computationally difficult from the frequentist perspective. On the other hand, the advent of the Markov chain Monte Carlo algorithm has made the Bayesian analysis of complex models computationally convenient. The recently developed data cloning method provides a frequentist approach to mixed models that is also computationally convenient. We propose to use data cloning, which yields maximum likelihood estimates, to conduct frequentist analysis of zero-inflated spatio-temporal modeling of disease incidence. One of the advantages of the data cloning approach is that predictions and corresponding standard errors (or prediction intervals) of smoothed disease incidence over space and time are easily obtained. We illustrate our approach using a real dataset of monthly children's asthma visits to hospital in the province of Manitoba, Canada, during the period April 2006 to March 2010. Performance of our approach is also evaluated through a simulation study. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. SEMIPARAMETRIC ZERO-INFLATED MODELING IN MULTI-ETHNIC STUDY OF ATHEROSCLEROSIS (MESA)

    PubMed Central

    Liu, Hai; Ma, Shuangge; Kronmal, Richard; Chan, Kung-Sik

    2013-01-01

    We analyze the Agatston score of coronary artery calcium (CAC) from the Multi-Ethnic Study of Atherosclerosis (MESA) using a semiparametric zero-inflated modeling approach, where the observed CAC scores from this cohort consist of a high frequency of zeroes and continuously distributed positive values. Both partially constrained and unconstrained models are considered to investigate the underlying biological processes of CAC development from zero to positive, and from small amount to large amount. Different from existing studies, a model selection procedure based on likelihood cross-validation is adopted to identify the optimal model, which is justified by comparative Monte Carlo studies. A shrinkage version of the cubic regression spline is used for model estimation and variable selection simultaneously. When applying the proposed methods to the MESA data analysis, we show that the two biological mechanisms influencing the initiation of CAC and the magnitude of CAC when it is positive are better characterized by an unconstrained zero-inflated normal model. Our results are significantly different from those in published studies, and may provide further insights into the biological mechanisms underlying CAC development in humans. This highly flexible statistical framework can be applied to zero-inflated data analyses in other areas. PMID:23805172

  9. Map scale effects on estimating the number of undiscovered mineral deposits

    USGS Publications Warehouse

    Singer, D.A.; Menzie, W.D.

    2008-01-01

    Estimates of numbers of undiscovered mineral deposits, fundamental to assessing mineral resources, are affected by map scale. Where consistently defined deposits of a particular type are estimated, spatial and frequency distributions of deposits are linked in that some frequency distributions can be generated by processes that are random in space, whereas others are generated by processes suggesting clustering in space. Possible spatial distributions of mineral deposits and their related frequency distributions are affected by map scale and associated inclusions of non-permissive or covered geological settings. More generalized map scales are more likely to cause inclusion of geologic settings that are not really permissive for the deposit type, or that include unreported cover over permissive areas, resulting in the appearance of deposit clustering. Thus, overly generalized map scales can cause deposits to appear clustered. We propose a model that captures the effects of map scale and the related inclusion of non-permissive geologic settings on estimates of numbers of deposits: the zero-inflated Poisson distribution. Effects of map scale as represented by the zero-inflated Poisson distribution suggest that the appearance of deposit clustering should diminish as mapping becomes more detailed, because the number of inflated zeros would decrease with more detailed maps. Based on observed worldwide relationships between map scale and areas permissive for deposit types, mapping at a scale with twice the detail should cut the permissive area of a porphyry copper tract to 29% and of a volcanic-hosted massive sulfide tract to 50% of their original sizes. Thus some direct benefits of mapping an area at a more detailed scale are indicated by significant reductions in areas permissive for deposit types, increased deposit density and, as a consequence, reduced uncertainty in the estimate of the number of undiscovered deposits.
Exploration enterprises benefit from reduced areas requiring detailed and expensive exploration, and land-use planners benefit from reduced areas of concern. © 2008 International Association for Mathematical Geology.

  10. Evaluation of the Use of Zero-Augmented Regression Techniques to Model Incidence of Campylobacter Infections in FoodNet.

    PubMed

    Tremblay, Marlène; Crim, Stacy M; Cole, Dana J; Hoekstra, Robert M; Henao, Olga L; Döpfer, Dörte

    2017-10-01

    The Foodborne Diseases Active Surveillance Network (FoodNet) is currently using a negative binomial (NB) regression model to estimate temporal changes in the incidence of Campylobacter infection. FoodNet active surveillance in 483 counties collected data on 40,212 Campylobacter cases between years 2004 and 2011. We explored models that disaggregated these data to allow us to account for demographic, geographic, and seasonal factors when examining changes in incidence of Campylobacter infection. We hypothesized that modeling structural zeros and including demographic variables would increase the fit of FoodNet's Campylobacter incidence regression models. Five different models were compared: NB without demographic covariates, NB with demographic covariates, hurdle NB with covariates in the count component only, hurdle NB with covariates in both zero and count components, and zero-inflated NB with covariates in the count component only. Of the models evaluated, the NB model without zero augmentation but with demographic variables provided the best fit. Results suggest that even though zero inflation was not present at this level, individualizing the level of aggregation and using different model structures and predictors per site might be required to correctly distinguish between structural and observational zeros and account for risk factors that vary geographically.
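    The structural-versus-observational distinction the authors draw can be made concrete: under a zero-inflated count model with inflation probability pi and count mean lambda, the posterior probability that an observed zero is structural is pi / (pi + (1 - pi)·e^(-lambda)). A small illustrative sketch (parameter values invented, not FoodNet estimates):

```python
import math

def prob_structural_zero(pi, lam):
    """P(zero is structural | Y = 0) under a zero-inflated Poisson."""
    return pi / (pi + (1.0 - pi) * math.exp(-lam))

# With a large count mean, a sampling zero is nearly impossible, so almost
# every observed zero must be structural; with a tiny mean the reverse holds.
print(prob_structural_zero(0.2, 5.0))   # close to 1
print(prob_structural_zero(0.2, 0.1))   # close to the marginal rate
```

    This is why the distinction is hard to make when the count mean is small: observational and structural zeros then look almost identical in the data.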

  11. Zero-inflated modeling of fish catch per unit area resulting from multiple gears: Application to channel catfish and shovelnose sturgeon in the Missouri River

    USGS Publications Warehouse

    Arab, A.; Wildhaber, M.L.; Wikle, C.K.; Gentry, C.N.

    2008-01-01

    Fisheries studies often employ multiple gears that result in large percentages of zero values. We considered a zero-inflated Poisson (ZIP) model with random effects to address these excessive zeros. By employing a Bayesian ZIP model that simultaneously incorporates data from multiple gears to analyze data from the Missouri River, we were able to compare gears and make more year, segment, and macrohabitat comparisons than did the original data analysis. For channel catfish Ictalurus punctatus, our results rank (highest to lowest) the mean catch per unit area (CPUA) for gears (beach seine, benthic trawl, electrofishing, and drifting trammel net); years (1998 and 1997); macrohabitats (tributary mouth, connected secondary channel, nonconnected secondary channel, and bend); and river segment zones (channelized, inter-reservoir, and least-altered). For shovelnose sturgeon Scaphirhynchus platorynchus, the mean CPUA was significantly higher for benthic trawls and drifting trammel nets; 1998 and 1997; tributary mouths, bends, and connected secondary channels; and some channelized or least-altered inter-reservoir segments. One important advantage of our approach is the ability to reliably infer patterns of relative abundance by means of multiple gears without using gear efficiencies. © Copyright by the American Fisheries Society 2008.

  12. Empirical null estimation using zero-inflated discrete mixture distributions and its application to protein domain data.

    PubMed

    Gauran, Iris Ivy M; Park, Junyong; Lim, Johan; Park, DoHwan; Zylstra, John; Peterson, Thomas; Kann, Maricel; Spouge, John L

    2017-09-22

    In recent mutation studies, analyses based on protein domain positions are gaining popularity over gene-centric approaches since the latter have limitations in considering the functional context that the position of the mutation provides. This presents a large-scale simultaneous inference problem, with hundreds of hypothesis tests to consider at the same time. This article aims to select significant mutation counts while controlling a given level of Type I error via False Discovery Rate (FDR) procedures. One main assumption is that the mutation counts follow a zero-inflated model in order to account for both the true zeros in the count model and the excess zeros. The class of models considered is the Zero-inflated Generalized Poisson (ZIGP) distribution. Furthermore, we assumed that there exists a cut-off value such that counts smaller than this value are generated from the null distribution. We present several data-dependent methods to determine the cut-off value. We also consider a two-stage procedure based on a screening process in which mutation counts exceeding a certain value are considered significant. Simulated and protein domain data sets are used to illustrate this procedure in estimation of the empirical null using a mixture of discrete distributions. Overall, while maintaining control of the FDR, the proposed two-stage testing procedure has superior empirical power. © 2017 The Authors. Biometrics published by Wiley Periodicals, Inc. on behalf of the International Biometric Society. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
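    The FDR control invoked here is, in its most common generic form, the Benjamini-Hochberg step-up procedure; the authors' actual two-stage method is more elaborate, so the following is only a minimal sketch of the baseline idea:

```python
def benjamini_hochberg(pvals, q=0.05):
    """Step-up BH procedure: returns a reject/accept flag per p-value,
    controlling the false discovery rate at level q."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= q * rank / m:
            k_max = rank                 # largest rank passing the BH line
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            reject[i] = True
    return reject

print(benjamini_hochberg([0.01, 0.04, 0.03, 0.005, 0.20]))
# → [True, True, True, True, False]: the four smallest all fall under q*k/m
```

    Note the step-up character: every p-value at or below the largest passing rank is rejected, even if an intermediate one sits above its own threshold.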

  13. Household Implementation of Smoke-Free Rules in Homes and Cars: A Focus on Adolescent Smoking Behavior and Secondhand Smoke Exposure.

    PubMed

    Parks, Michael J; Kingsbury, John H; Boyle, Raymond G; Evered, Sharrilyn

    2018-01-01

    This study addresses the dearth of population-based research on how comprehensive household smoke-free rules (ie, in the home and car) relate to tobacco use and secondhand smoke (SHS) exposure among adolescents. Analysis of 2014 Minnesota Youth Tobacco Survey. Representative sample of Minnesota youth. A total of 1287 youth who lived with a smoker. Measures included household smoke-free rules (no rules, partial rules-home or car, but not both-and comprehensive rules), lifetime and 30-day cigarette use, 30-day cigarette and other product use, and SHS exposure in past 7 days in home and car. Weighted multivariate logistic, zero-inflated Poisson, and zero-inflated negative binomial regressions were used. Compared to comprehensive rules, partial and no smoke-free rules were significantly and positively related to lifetime cigarette use (respectively, adjusted odds ratio [AOR] = 1.80, 95% confidence interval [CI] = 1.24-2.61; AOR = 2.87, 95% CI = 1.93-4.25), and a similar significant pattern was found for 30-day cigarette use (respectively, AOR = 2.20, 95% CI = 1.21-4.02; AOR = 2.45, 95% CI = 1.34-4.50). No smoke-free rules significantly predicted using cigarettes and other tobacco products compared to comprehensive rules. In both descriptive and regression analyses, we found SHS exposure rates in both the home and car were significantly lower among youth whose household implemented comprehensive smoke-free rules. Comprehensive smoke-free rules protect youth from the harms of caregiver tobacco use. Relative to both partial and no smoke-free rules, comprehensive smoke-free rules have a marked impact on tobacco use and SHS exposure among youth who live with a smoker. Health promotion efforts should promote comprehensive smoke-free rules among all households and particularly households with children and adolescents.

  14. Family size and old-age wellbeing: effects of the fertility transition in Mexico

    PubMed Central

    DÍAZ-VENEGAS, CARLOS; SÁENZ, JOSEPH L.; WONG, REBECA

    2016-01-01

    The present study aims to determine how family size affects psycho-social, economic and health wellbeing in old age differently across two cohorts with declining fertility. The data are from the 2012 Mexican Health and Ageing Study (MHAS) including respondents aged 50+ (N = 13,102). Poisson (standard and zero-inflated) and logistic regressions are used to model determinants of wellbeing in old age: psycho-social (depressive symptoms), economic (consumer durables and insurance) and health (chronic conditions). In the younger cohort, having fewer children is associated with fewer depressive symptoms and chronic conditions, and better economic well-being. For the older cohort, having fewer children is associated with lower economic wellbeing and higher odds of being uninsured. Lower fertility benefited the younger cohort (born after 1937), whereas the older cohort (born in 1937 or earlier) benefited from lower fertility only in chronic conditions. Further research is needed to continue exploring the old-age effects of the fertility transition. PMID:28239210

  15. Health care usage among immigrants and native-born elderly populations in eleven European countries: results from SHARE

    PubMed Central

    Guillén, Montserrat; Crimmins, Eileen M.

    2013-01-01

    Differences in health care utilization of immigrants 50 years of age and older relative to the native-born populations in eleven European countries are investigated. Negative binomial and zero-inflated Poisson regression are used to examine differences between immigrants and native-borns in number of doctor visits, visits to general practitioners, and hospital stays using the 2004 Survey of Health, Ageing, and Retirement in Europe database. In the pooled European sample and in some individual countries, older immigrants use from 13 to 20% more health services than native-borns after demographic characteristics are controlled. After controlling for the need for health care, differences between immigrants and native-borns in the use of physicians, but not hospitals, are reduced by about half. These are not changed much with the incorporation of indicators of socioeconomic status and extra insurance coverage. Higher country-level relative expenditures on health, paying physicians a fee-for-service, and physician density are associated with higher usage of physician services among immigrants. PMID:21660564

  16. Examining psychosocial and physical hazards in the Ghanaian mining industry and their implications for employees' safety experience.

    PubMed

    Amponsah-Tawiah, Kwesi; Jain, Aditya; Leka, Stavroula; Hollis, David; Cox, Tom

    2013-06-01

    In addition to hazardous conditions that are prevalent in mines, there are various physical and psychosocial risk factors that can affect mine workers' safety and health. Without due diligence to mine safety, these risk factors can affect workers' safety experience, in terms of near misses, disabling injuries and accidents experienced or witnessed by workers. This study sets out to examine the effects of physical and psychosocial risk factors on workers' safety experience in a sample of Ghanaian miners. 307 participants from five mining companies responded to a cross sectional survey examining physical and psychosocial hazards and their implications for employees' safety experience. Zero-inflated Poisson regression models indicated that mining conditions, equipment, ambient conditions, support and security, and work demands and control are significant predictors of near misses, disabling injuries, and accidents experienced or witnessed by workers. The type of mine had important implications for workers' safety experience. Copyright © 2013 Elsevier Ltd and National Safety Council. All rights reserved.

  17. A Spatial Poisson Hurdle Model for Exploring Geographic Variation in Emergency Department Visits

    PubMed Central

    Neelon, Brian; Ghosh, Pulak; Loebs, Patrick F.

    2012-01-01

    Summary We develop a spatial Poisson hurdle model to explore geographic variation in emergency department (ED) visits while accounting for zero inflation. The model consists of two components: a Bernoulli component that models the probability of any ED use (i.e., at least one ED visit per year), and a truncated Poisson component that models the number of ED visits given use. Together, these components address both the abundance of zeros and the right-skewed nature of the nonzero counts. The model has a hierarchical structure that incorporates patient- and area-level covariates, as well as spatially correlated random effects for each areal unit. Because regions with high rates of ED use are likely to have high expected counts among users, we model the spatial random effects via a bivariate conditionally autoregressive (CAR) prior, which introduces dependence between the components and provides spatial smoothing and sharing of information across neighboring regions. Using a simulation study, we show that modeling the between-component correlation reduces bias in parameter estimates. We adopt a Bayesian estimation approach, and the model can be fit using standard Bayesian software. We apply the model to a study of patient and neighborhood factors influencing emergency department use in Durham County, North Carolina. PMID:23543242

  18. Coronary artery calcium distributions in older persons in the AGES-Reykjavik study

    PubMed Central

    Gudmundsson, Elias Freyr; Gudnason, Vilmundur; Sigurdsson, Sigurdur; Launer, Lenore J.; Harris, Tamara B.; Aspelund, Thor

    2013-01-01

    Coronary Artery Calcium (CAC) is a sign of advanced atherosclerosis and an independent risk factor for cardiac events. Here, we describe CAC distributions in an unselected aged population and compare modelling methods to characterize the CAC distribution. CAC is difficult to model because it has a skewed and zero-inflated distribution with over-dispersion. Data are from the AGES-Reykjavik sample, a large population-based study (2002-2006) in Iceland of 5,764 persons aged 66-96 years. Linear regressions using logarithmic and Box-Cox transformations on CAC+1, quantile regression and a Zero-Inflated Negative Binomial model (ZINB) were applied. Methods were compared visually and with the PRESS statistic, R2 and the number of detected associations with concurrently measured variables. There were pronounced differences in CAC according to sex, age, history of coronary events and presence of plaque in the carotid artery. Associations with conventional coronary artery disease (CAD) risk factors varied between the sexes. The ZINB model provided the best results with respect to the PRESS statistic, R2, and predicted proportion of zero scores. The ZINB model detected similar numbers of associations as the linear regression on ln(CAC+1) and usually with the same risk factors. PMID:22990371
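    As a generic illustration of why a ZINB can reproduce both the excess zeros and the over-dispersion that defeat simpler models, the model-implied zero probability pi + (1 - pi)·P_NB(0) can be checked against simulated data. All parameter values below are invented for the sketch, not AGES-Reykjavik estimates:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
pi, r, p = 0.5, 1.2, 0.05        # inflation weight; NB shape and success prob
n = 50_000
# ZINB draws: structural zero with prob pi, else a negative binomial count
counts = np.where(rng.random(n) < pi, 0, rng.negative_binomial(r, p, n))

# Model-implied zero proportion mixes structural and NB sampling zeros
p0_model = pi + (1.0 - pi) * stats.nbinom.pmf(0, r, p)
p0_data = (counts == 0).mean()

mean, var = counts.mean(), counts.var()
print(round(p0_model, 3), round(p0_data, 3), var > mean)
```

    The variance here far exceeds the mean, so a Poisson (which forces them equal) could not match both the zero proportion and the spread of the positive scores, whereas the ZINB matches the empirical zero rate almost exactly.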

  19. Paramedic-Initiated Home Care Referrals and Use of Home Care and Emergency Medical Services.

    PubMed

    Verma, Amol A; Klich, John; Thurston, Adam; Scantlebury, Jordan; Kiss, Alex; Seddon, Gayle; Sinha, Samir K

    2018-01-01

    We examined the association between paramedic-initiated home care referrals and utilization of home care, 9-1-1, and Emergency Department (ED) services. This was a retrospective cohort study of individuals who received a paramedic-initiated home care referral after a 9-1-1 call between January 1, 2011 and December 31, 2012 in Toronto, Ontario, Canada. Home care, 9-1-1, and ED utilization were compared in the 6 months before and after home care referral. Nonparametric longitudinal regression was performed to assess changes in hours of home care service use and zero-inflated Poisson regression was performed to assess changes in the number of 9-1-1 calls and ambulance transports to ED. During the 24-month study period, 2,382 individuals received a paramedic-initiated home care referral. After excluding individuals who died, were hospitalized, or were admitted to a nursing home, the final study cohort was 1,851. The proportion of the study population receiving home care services increased from 18.2% to 42.5% after referral, representing 450 additional people receiving services. In longitudinal regression analysis, there was an increase of 17.4 hours in total services per person in the six months after referral (95% CI: 1.7-33.1, p = 0.03). The mean number of 9-1-1 calls per person was 1.44 (SD 9.58) before home care referral and 1.20 (SD 7.04) after home care referral in the overall study cohort. This represented a 10% reduction in 9-1-1 calls (95% CI: 7-13%, p < 0.001) in Poisson regression analysis. The mean number of ambulance transports to ED per person was 0.91 (SD 8.90) before home care referral and 0.79 (SD 6.27) after home care referral, representing a 7% reduction (95% CI: 3-11%, p < 0.001) in Poisson regression analysis. When only the participants with complete paramedic and home care records were included in the analysis, the reductions in 9-1-1 calls and ambulance transports to ED were attenuated but remained statistically significant. 
Paramedic-initiated home care referrals in Toronto were associated with improved access to and use of home care services and may have been associated with reduced 9-1-1 calls and ambulance transports to ED.

  20. Exploring the effects of roadway characteristics on the frequency and severity of head-on crashes: case studies from Malaysian federal roads.

    PubMed

    Hosseinpour, Mehdi; Yahaya, Ahmad Shukri; Sadullah, Ahmad Farhan

    2014-01-01

    Head-on crashes are among the most severe collision types and of great concern to road safety authorities, which justifies greater effort to reduce both the frequency and severity of this collision type. To this end, it is necessary to first identify the factors associated with crash occurrence. This can be done by developing crash prediction models that relate crash outcomes to a set of contributing factors. This study intends to identify the factors affecting both the frequency and severity of head-on crashes that occurred on 448 segments of five federal roads in Malaysia. Data on road characteristics and crash history were collected on the study segments during a 4-year period between 2007 and 2010. The frequency of head-on crashes was fitted by developing and comparing seven count-data models including Poisson, standard negative binomial (NB), random-effect negative binomial, hurdle Poisson, hurdle negative binomial, zero-inflated Poisson, and zero-inflated negative binomial models. To model crash severity, a random-effect generalized ordered probit model (REGOPM) was used given that a head-on crash had occurred. With respect to crash frequency, the random-effect negative binomial (RENB) model was found to outperform the other models according to goodness-of-fit measures. Based on the results of the model, the variables horizontal curvature, terrain type, heavy-vehicle traffic, and access points were found to be positively related to the frequency of head-on crashes, while posted speed limit and shoulder width decreased the crash frequency. With regard to crash severity, the results of the REGOPM showed that horizontal curvature, paved shoulder width, terrain type, and side friction were associated with more severe crashes, whereas land use, access points, and presence of a median reduced the probability of severe crashes. Based on the results of this study, some potential countermeasures were proposed to minimize the risk of head-on crashes.
Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Dental Caries and Enamel Defects in Very Low Birth Weight Adolescents

    PubMed Central

    Nelson, S.; Albert, J.M.; Lombardi, G.; Wishnek, S.; Asaad, G.; Kirchner, H.L.; Singer, L.T.

    2011-01-01

    Objectives The purpose of this study was to examine developmental enamel defects and dental caries in very low birth weight adolescents with high risk (HR-VLBW) and low risk (LR-VLBW) compared to full-term (term) adolescents. Methods The sample consisted of 224 subjects (80 HR-VLBW, 59 LR-VLBW, 85 term adolescents) recruited from an ongoing longitudinal study. Sociodemographic and medical information was available from birth. Dental examination of the adolescent at the 14-year visit included: enamel defects (opacity and hypoplasia); decayed, missing, filled teeth of incisors and molars (DMFT-IM) and of overall permanent teeth (DMFT); Simplified Oral Hygiene Index for debris/calculus on teeth, and sealant presence. A caregiver questionnaire completed simultaneously assessed dental behavior, access, insurance status and prevention factors. Hierarchical analysis utilized the zero-inflated negative binomial model and zero-inflated Poisson model. Results The zero-inflated negative binomial model controlling for sociodemographic variables indicated that the LR-VLBW group had an estimated 75% increase (p < 0.05) in number of demarcated opacities in the incisors and first molar teeth compared to the term group. Hierarchical modeling indicated that demarcated opacities were a significant predictor of DMFT-IM after control for relevant covariates. The term adolescents had significantly increased DMFT-IM and DMFT scores compared to the LR-VLBW adolescents. Conclusion LR-VLBW was a significant risk factor for increased enamel defects in the permanent incisors and first molars. Term children had increased caries compared to the LR-VLBW group. The effect of birth group and enamel defects on caries has to be investigated longitudinally from birth. PMID:20975268

  2. Fast-food exposure around schools in urban Adelaide.

    PubMed

    Coffee, Neil T; Kennedy, Hannah P; Niyonsenga, Theo

    2016-12-01

    To assess whether exposure to fast-food outlets around schools differed depending on socio-economic status (SES). Binary logistic regression was used to investigate the presence of fast food, and zero-inflated Poisson regression was used for the count of fast food (due to the excess of zeros), within 1000 m and 1500 m road network buffers around schools. The low and middle SES tertiles were combined (due to a lack of significant variation) as the 'disadvantaged' group and compared with the high SES tertile as the 'advantaged' group. School SES was expressed using the 2011 Australian Bureau of Statistics socio-economic indices for areas, index of relative socio-economic disadvantage. Fast-food data included independent takeaway food outlets and major fast-food chains. Metropolitan Adelaide, South Australia. A total of 459 schools were geocoded to the street address and 1000 m and 1500 m road network distance buffers calculated. There was a 1·6 times greater risk of exposure to fast food within 1000 m (OR=1·634; 95 % CI 1·017, 2·625) and a 9·5 times greater risk of exposure to fast food within 1500 m (OR=9·524; 95 % CI 3·497, 25·641) around disadvantaged schools compared with advantaged schools. Disadvantaged schools were exposed to more fast food, with more than twice the number of disadvantaged schools exposed to fast food. The higher exposure to fast food near more disadvantaged schools may reflect lower commercial land cost in low-SES areas, potentially creating more financially desirable investments for fast-food developers.

  3. Spatio-temporal patterns of gun violence in Syracuse, New York 2009-2015.

    PubMed

    Larsen, David A; Lane, Sandra; Jennings-Bey, Timothy; Haygood-El, Arnett; Brundage, Kim; Rubinstein, Robert A

    2017-01-01

    Gun violence in the United States of America is a large public health problem that disproportionately affects urban areas. The epidemiology of gun violence reflects various aspects of an infectious disease including spatial and temporal clustering. We examined the spatial and temporal trends of gun violence in Syracuse, New York, a city of 145,000. We used a spatial scan statistic to reveal spatio-temporal clusters of gunshots investigated and corroborated by Syracuse City Police Department for the years 2009-2015. We also examined predictors of areas with increased gun violence using a multi-level zero-inflated Poisson regression with data from the 2010 census. Two space-time clusters of gun violence were revealed in the city. Higher rates of segregation, poverty and the summer months were all associated with increased risk of gun violence. Previous gunshots in the area were associated with a 26.8% increase in the risk of gun violence. Gun violence in Syracuse, NY is both spatially and temporally stable, with some neighborhoods of the city greatly afflicted.

  4. Spatio-temporal patterns of gun violence in Syracuse, New York 2009-2015

    PubMed Central

    Lane, Sandra; Jennings-Bey, Timothy; Haygood-El, Arnett; Brundage, Kim; Rubinstein, Robert A.

    2017-01-01

    Gun violence in the United States of America is a large public health problem that disproportionately affects urban areas. The epidemiology of gun violence reflects various aspects of an infectious disease including spatial and temporal clustering. We examined the spatial and temporal trends of gun violence in Syracuse, New York, a city of 145,000. We used a spatial scan statistic to reveal spatio-temporal clusters of gunshots investigated and corroborated by Syracuse City Police Department for the years 2009–2015. We also examined predictors of areas with increased gun violence using a multi-level zero-inflated Poisson regression with data from the 2010 census. Two space-time clusters of gun violence were revealed in the city. Higher rates of segregation, poverty and the summer months were all associated with increased risk of gun violence. Previous gunshots in the area were associated with a 26.8% increase in the risk of gun violence. Gun violence in Syracuse, NY is both spatially and temporally stable, with some neighborhoods of the city greatly afflicted. PMID:28319125

  5. Novel Phenotype Issues Raised in Cross-National Epidemiological Research on Drug Dependence

    PubMed Central

    Anthony, James C.

    2010-01-01

    Stage-transition models based on the American Diagnostic and Statistical Manual (DSM) are generally applied in epidemiology and genetics research on drug dependence syndromes associated with cannabis, cocaine, and other internationally regulated drugs (IRD). Difficulties with DSM stage-transition models have surfaced during cross-national research intended to provide a truly global perspective, such as the work of the World Mental Health Surveys (WMHS) Consortium. Alternative, simpler dependence-related phenotypes are possible, including population-level count process models for early steps, before clinical features coalesce into a coherent syndrome (e.g., zero-inflated Poisson regression). Selected findings are reviewed, based on ZIP modeling of alcohol, tobacco, and IRD count processes, with an illustration that may stimulate new research on genetic susceptibility traits. The annual National Surveys on Drug Use and Health can be readily modified for this purpose, along the lines of a truly anonymous research approach that can help make NSDUH-type cross-national epidemiological surveys more useful in the context of subsequent genome-wide association study (GWAS) research and post-GWAS investigations with a truly global health perspective. PMID:20201862

  6. A Three-dimensional Polymer Scaffolding Material Exhibiting a Zero Poisson's Ratio.

    PubMed

    Soman, Pranav; Fozdar, David Y; Lee, Jin Woo; Phadke, Ameya; Varghese, Shyni; Chen, Shaochen

    2012-05-14

    Poisson's ratio describes the degree to which a material contracts (expands) transversally when axially strained. A material with a zero Poisson's ratio does not transversally deform in response to an axial strain (stretching). In tissue engineering applications, scaffolding having a zero Poisson's ratio (ZPR) may be more suitable for emulating the behavior of native tissues and accommodating and transmitting forces to the host tissue site during wound healing (or tissue regrowth). For example, scaffolding with a zero Poisson's ratio may be beneficial in the engineering of cartilage, ligament, corneal, and brain tissues, which are known to possess Poisson's ratios of nearly zero. Here, we report a 3D biomaterial constructed from polyethylene glycol (PEG) exhibiting in-plane Poisson's ratios of zero for large values of axial strain. We use digital micro-mirror device projection printing (DMD-PP) to create single- and double-layer scaffolds composed of semi re-entrant pores whose arrangement and deformation mechanisms contribute to the zero Poisson's ratio. Strain experiments confirm the zero-Poisson's-ratio behavior of the scaffolds and show that the addition of layers does not change the Poisson's ratio. Human mesenchymal stem cells (hMSCs) cultured on biomaterials with zero Poisson's ratio demonstrate the feasibility of utilizing these novel materials for biological applications which require little to no transverse deformation resulting from axial strain. Techniques used in this work allow Poisson's ratio to be both scale-independent and independent of the choice of strut material for strains in the elastic regime, and therefore ZPR behavior can be imparted to a variety of photocurable biomaterials.

  7. Identifiability in N-mixture models: a large-scale screening test with bird data.

    PubMed

    Kéry, Marc

    2018-02-01

    Binomial N-mixture models have proven very useful in ecology, conservation, and monitoring: they allow estimation and modeling of abundance separately from detection probability using simple counts. Recently, doubts about parameter identifiability have been voiced. I conducted a large-scale screening test with 137 bird data sets from 2,037 sites. I found virtually no identifiability problems for Poisson and zero-inflated Poisson (ZIP) binomial N-mixture models, but negative-binomial (NB) models had problems in 25% of all data sets. The corresponding multinomial N-mixture models had no problems. Parameter estimates under Poisson and ZIP binomial and multinomial N-mixture models were extremely similar. Identifiability problems became a little more frequent with smaller sample sizes (267 and 50 sites), but were unaffected by whether the models did or did not include covariates. Hence, binomial N-mixture model parameters with Poisson and ZIP mixtures typically appeared identifiable. In contrast, NB mixtures were often unidentifiable, which is worrying since these were often selected by Akaike's information criterion. Identifiability of binomial N-mixture models should always be checked. If problems are found, simpler models, integrated models that combine different observation models, or the use of external information via informative priors or penalized likelihoods may help. © 2017 by the Ecological Society of America.
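
The binomial N-mixture likelihood behind this screening test marginalizes a latent Poisson abundance out of repeated binomial counts at each site. A brute-force sketch for a single site (the function name, truncation bound, and parameter values are illustrative assumptions, not Kéry's code):

```python
import math

def site_likelihood(counts, lam, p, n_max=100):
    """Poisson binomial N-mixture likelihood for one site: latent
    abundance N ~ Poisson(lam); each repeated count y_j ~ Binomial(N, p)
    independently given N. The infinite sum over N is truncated at n_max."""
    total = 0.0
    for n in range(max(counts), n_max + 1):
        prior = math.exp(-lam) * lam ** n / math.factorial(n)
        cond = 1.0
        for y in counts:
            cond *= math.comb(n, y) * p ** y * (1.0 - p) ** (n - y)
        total += prior * cond
    return total
```

With perfect detection (p = 1) every count must equal N, so the likelihood collapses to the Poisson mass at the observed count, which is a convenient sanity check; a ZIP or NB mixture replaces the Poisson prior term.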

  8. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2015-02-01

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters.

  9. Understanding the Determinants of Debt Burden among College Graduates

    ERIC Educational Resources Information Center

    Chen, Rong; Wiederspan, Mark

    2014-01-01

    This article examines debt burden among college graduates and contributes to previous research by incorporating institutional and state characteristics. Utilizing a combination of national datasets and zero-one inflated beta regression, we find several major themes. First, family income and college experiences are strongly associated with the…

  10. Logistic quantile regression provides improved estimates for bounded avian counts: a case study of California Spotted Owl fledgling production

    Treesearch

    Brian S. Cade; Barry R. Noon; Rick D. Scherer; John J. Keane

    2017-01-01

    Counts of avian fledglings, nestlings, or clutch size that are bounded below by zero and above by some small integer form a discrete random variable distribution that is not approximated well by conventional parametric count distributions such as the Poisson or negative binomial. We developed a logistic quantile regression model to provide estimates of the empirical...

  11. Observation weights unlock bulk RNA-seq tools for zero inflation and single-cell applications.

    PubMed

    Van den Berge, Koen; Perraudeau, Fanny; Soneson, Charlotte; Love, Michael I; Risso, Davide; Vert, Jean-Philippe; Robinson, Mark D; Dudoit, Sandrine; Clement, Lieven

    2018-02-26

    Dropout events in single-cell RNA sequencing (scRNA-seq) cause many transcripts to go undetected and induce an excess of zero read counts, leading to power issues in differential expression (DE) analysis. This has triggered the development of bespoke scRNA-seq DE methods to cope with zero inflation. Recent evaluations, however, have shown that dedicated scRNA-seq tools provide no advantage compared to traditional bulk RNA-seq tools. We introduce a weighting strategy, based on a zero-inflated negative binomial model, that identifies excess zero counts and generates gene- and cell-specific weights to unlock bulk RNA-seq DE pipelines for zero-inflated data, boosting performance for scRNA-seq.
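
The weighting idea can be sketched in a few lines: given zero-inflated negative binomial parameters, the weight for an observed zero is the posterior probability that it came from the NB count component, while nonzero counts keep weight 1. This is a schematic of the strategy only, not the authors' implementation; the parameterization (mean mu, size phi) and names are my own:

```python
def nb_zero_prob(mu, phi):
    """P(Y = 0) for a negative binomial with mean mu and size phi."""
    return (phi / (phi + mu)) ** phi

def zero_weight(pi, mu, phi):
    """Posterior probability that an observed zero is a genuine NB count
    rather than an excess (dropout) zero, under a ZINB with mixing
    probability pi for the zero component."""
    f0 = nb_zero_prob(mu, phi)
    return (1.0 - pi) * f0 / (pi + (1.0 - pi) * f0)

# No zero inflation -> every zero is a genuine count (weight 1);
# more inflation -> observed zeros are downweighted more heavily.
print(zero_weight(0.0, 5.0, 2.0))  # 1.0
```

Such gene- and cell-specific weights can then be passed to a weighted bulk DE fit, which is the "unlocking" the abstract describes.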

  12. Assessment and Selection of Competing Models for Zero-Inflated Microbiome Data

    PubMed Central

    Xu, Lizhen; Paterson, Andrew D.; Turpin, Williams; Xu, Wei

    2015-01-01

    Typical data in a microbiome study consist of the operational taxonomic unit (OTU) counts that have the characteristic of excess zeros, which are often ignored by investigators. In this paper, we compare the performance of different competing methods to model data with zero inflated features through extensive simulations and application to a microbiome study. These methods include standard parametric and non-parametric models, hurdle models, and zero inflated models. We examine varying degrees of zero inflation, with or without dispersion in the count component, as well as different magnitude and direction of the covariate effect on structural zeros and the count components. We focus on the assessment of type I error, power to detect the overall covariate effect, measures of model fit, and bias and effectiveness of parameter estimations. We also evaluate the abilities of model selection strategies using Akaike information criterion (AIC) or Vuong test to identify the correct model. The simulation studies show that hurdle and zero inflated models have well controlled type I errors, higher power, better goodness of fit measures, and are more accurate and efficient in the parameter estimation. Besides that, the hurdle models have similar goodness of fit and parameter estimation for the count component as their corresponding zero inflated models. However, the estimation and interpretation of the parameters for the zero components differs, and hurdle models are more stable when structural zeros are absent. We then discuss the model selection strategy for zero inflated data and implement it in a gut microbiome study of > 400 independent subjects. PMID:26148172

  13. Assessment and Selection of Competing Models for Zero-Inflated Microbiome Data.

    PubMed

    Xu, Lizhen; Paterson, Andrew D; Turpin, Williams; Xu, Wei

    2015-01-01

    Typical data in a microbiome study consist of the operational taxonomic unit (OTU) counts that have the characteristic of excess zeros, which are often ignored by investigators. In this paper, we compare the performance of different competing methods to model data with zero inflated features through extensive simulations and application to a microbiome study. These methods include standard parametric and non-parametric models, hurdle models, and zero inflated models. We examine varying degrees of zero inflation, with or without dispersion in the count component, as well as different magnitude and direction of the covariate effect on structural zeros and the count components. We focus on the assessment of type I error, power to detect the overall covariate effect, measures of model fit, and bias and effectiveness of parameter estimations. We also evaluate the abilities of model selection strategies using Akaike information criterion (AIC) or Vuong test to identify the correct model. The simulation studies show that hurdle and zero inflated models have well controlled type I errors, higher power, better goodness of fit measures, and are more accurate and efficient in the parameter estimation. Besides that, the hurdle models have similar goodness of fit and parameter estimation for the count component as their corresponding zero inflated models. However, the estimation and interpretation of the parameters for the zero components differs, and hurdle models are more stable when structural zeros are absent. We then discuss the model selection strategy for zero inflated data and implement it in a gut microbiome study of > 400 independent subjects.
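
An AIC-based comparison of the kind used in these model-selection strategies can be sketched with fixed (not fitted) parameters on a toy zero-heavy sample; the data and parameter values below are illustrative only:

```python
import math

def poisson_pmf(y, lam):
    return math.exp(-lam) * lam ** y / math.factorial(y)

def zip_pmf(y, lam, pi):
    p = (1.0 - pi) * poisson_pmf(y, lam)
    return p + pi if y == 0 else p

def aic(loglik, n_params):
    return 2.0 * n_params - 2.0 * loglik

# Toy zero-inflated sample: 8 zeros plus 4 positive counts.
data = [0] * 8 + [1, 2, 2, 3]
ll_pois = sum(math.log(poisson_pmf(y, 2.0 / 3.0)) for y in data)  # lam = sample mean
ll_zip = sum(math.log(zip_pmf(y, 2.0, 0.6)) for y in data)

# Despite its extra parameter, the ZIP accounts for the excess zeros
# well enough to win on AIC here.
print(aic(ll_zip, 2) < aic(ll_pois, 1))  # True
```

The Vuong test mentioned in the abstract compares the same two log-likelihoods observation by observation instead of through a penalized total.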

  14. Hierarchical modeling of bycatch rates of sea turtles in the western North Atlantic

    USGS Publications Warehouse

    Gardner, B.; Sullivan, P.J.; Epperly, S.; Morreale, S.J.

    2008-01-01

    Previous studies indicate that the locations of the endangered loggerhead Caretta caretta and critically endangered leatherback Dermochelys coriacea sea turtles are influenced by water temperatures, and that incidental catch rates in the pelagic longline fishery vary by region. We present a Bayesian hierarchical model to examine the effects of environmental variables, including water temperature, on the number of sea turtles captured in the US pelagic longline fishery in the western North Atlantic. The modeling structure is highly flexible, utilizes a Bayesian model selection technique, and is fully implemented in the software program WinBUGS. The number of sea turtles captured is modeled as a zero-inflated Poisson distribution and the model incorporates fixed effects to examine region-specific differences in the parameter estimates. Results indicate that water temperature, region, bottom depth, and target species are all significant predictors of the number of loggerhead sea turtles captured. For leatherback sea turtles, the model with only target species had the most posterior model weight, though a re-parameterization of the model indicates that temperature influences the zero-inflation parameter. The relationship between the number of sea turtles captured and the variables of interest all varied by region. This suggests that management decisions aimed at reducing sea turtle bycatch may be more effective if they are spatially explicit. ?? Inter-Research 2008.

  15. Unified Computational Methods for Regression Analysis of Zero-Inflated and Bound-Inflated Data

    PubMed Central

    Yang, Yan; Simpson, Douglas

    2010-01-01

    Bounded data with excess observations at the boundary are common in many areas of application. Various individual cases of inflated mixture models have been studied in the literature for bound-inflated data, yet the computational methods have been developed separately for each type of model. In this article we use a common framework for computing these models, and expand the range of models for both discrete and semi-continuous data with point inflation at the lower boundary. The quasi-Newton and EM algorithms are adapted and compared for estimation of model parameters. The numerical Hessian and generalized Louis method are investigated as means for computing standard errors after optimization. Correlated data are included in this framework via generalized estimating equations. The estimation of parameters and effectiveness of standard errors are demonstrated through simulation and in the analysis of data from an ultrasound bioeffect study. The unified approach enables reliable computation for a wide class of inflated mixture models and comparison of competing models. PMID:20228950
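
One concrete instance of the EM algorithm in this unified framework is the zero-inflated Poisson case, where both M-step updates are closed-form. A sketch on arbitrary toy data (not the article's ultrasound data; initial values and iteration count are assumptions):

```python
import math

def em_zip(data, n_iter=200):
    """EM for a zero-inflated Poisson. z holds the E-step
    responsibilities: the posterior probability that each observed
    zero is a structural zero (nonzero counts cannot be structural)."""
    pi = 0.5
    lam = max(sum(data) / len(data), 1e-6)
    for _ in range(n_iter):
        z = [pi / (pi + (1.0 - pi) * math.exp(-lam)) if y == 0 else 0.0
             for y in data]
        pi = sum(z) / len(data)                 # M-step: mixing weight
        lam = sum(data) / (len(data) - sum(z))  # M-step: Poisson rate
    return pi, lam

data = [0] * 60 + [1] * 10 + [2] * 20 + [3] * 10
pi_hat, lam_hat = em_zip(data)
# Every M-step restores the moment identity (1 - pi) * lam = sample mean.
print(round((1 - pi_hat) * lam_hat, 6))  # 0.8
```

The quasi-Newton alternative compared in the article would instead maximize the observed-data log-likelihood directly, without the latent-indicator augmentation.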

  16. Screening Adolescents in the Emergency Department for Weapon Carriage

    PubMed Central

    Cunningham, Rebecca M.; Resko, Stella M.; Harrison, Stephanie Roahen; Zimmerman, Marc; Stanley, Rachel; Chermack, Stephen T.; Walton, Maureen A.

    2010-01-01

    Objective To describe prevalence and correlates of past-year weapon involvement among adolescents seeking care in an inner-city ED. Methods This cross-sectional study administered a computerized survey to all eligible adolescents (ages 14–18) seeking care in the ED, seven days a week, over an 18-month period in an inner-city Level 1 ED. Validated measures were administered, including measures of demographics, sexual activity, substance use, injury, violent behavior, and weapon carriage/use. Results Adolescents (N=2069, 86% response rate) completed the computerized survey. 55% were female; 56.5% were African American. In the past year, 20% of adolescents reported knife/razor carriage, 7% reported gun carriage, and 6% pulled a knife/gun on someone; zero-inflated Poisson (ZIP) regression models were used to identify correlates of the occurrence and past-year frequency of these weapon variables. Although gun carriage was more frequent among males, females were as likely to carry a knife or pull a weapon in the past year. Conclusions One-fifth of all adolescents seeking care in this inner-city ED have carried a weapon. Understanding weapon carriage among teens seeking ED care is a critical first step toward future ED-based injury prevention initiatives. PMID:20370746

  17. Prescription Drug Misuse and Sexual Behavior Among Young Adults.

    PubMed

    Wells, Brooke E; Kelly, Brian C; Rendina, H Jonathon; Parsons, Jeffrey T

    2015-01-01

    Though research indicates a complex link between substance use and sexual risk behavior, there is limited research on the association between sexual risk behavior and prescription drug misuse. In light of alarming increases in prescription drug misuse and the role of demographic characteristics in sexual risk behavior and outcomes, the current study examined demographic differences (gender, sexual identity, age, relationship status, parental class background, and race/ethnicity) in sexual risk behavior, sexual behavior under the influence of prescription drugs, and sexual risk behavior under the influence of prescription drugs in a sample of 402 young adults (ages 18 to 29) who misused prescription drugs. Nearly half of the sexually active young adult prescription drug misusers in this sample reported recent sex under the influence of prescription drugs; more than three-quarters reported recent sex without a condom; and more than one-third reported recent sex without a condom after using prescription drugs. Zero-inflated Poisson regression models indicated that White race, younger age, higher parental class, and being a heterosexual man were all associated with sexual risk behavior, sex under the influence of prescription drugs, and sexual risk under the influence of prescription drugs. Findings have implications for the targeting of prevention and intervention efforts.

  18. Physiological Response to Reward and Extinction Predicts Alcohol, Marijuana, and Cigarette Use Two Years Later

    PubMed Central

    Derefinko, Karen J.; Eisenlohr-Moul, Tory A.; Peters, Jessica R.; Roberts, Walter; Walsh, Erin C.; Milich, Richard; Lynam, Donald R.

    2017-01-01

    Background Physiological responses to reward and extinction are believed to represent the Behavioral Activation System (BAS) and Behavioral Inhibition System (BIS) constructs of Reinforcement Sensitivity Theory and underlie externalizing behaviors, including substance use. However, little research has examined these relations directly. Methods We assessed individuals’ cardiac pre-ejection periods (PEP) and electrodermal responses (EDR) during reward and extinction trials through the “Number Elimination Game” paradigm. Responses represented BAS and BIS, respectively. We then examined whether these responses provided incremental utility in the prediction of future alcohol, marijuana, and cigarette use. Results Zero-inflated Poisson (ZIP) regression models were used to examine the predictive utility of physiological BAS and BIS responses above and beyond previous substance use. Physiological responses accounted for incremental variance over previous use. Low BAS responses during reward predicted frequency of alcohol use at year 3. Low BAS responses during reward and extinction and high BIS responses during extinction predicted frequency of marijuana use at year 3. For cigarette use, low BAS response during extinction predicted use at year 3. Conclusions These findings suggest that the constructs of Reinforcement Sensitivity Theory, as assessed through physiology, contribute to the longitudinal maintenance of substance use. PMID:27306728

  19. Quantitative Relationship Between AUEC of Absolute Neutrophil Count and Duration of Severe Neutropenia for G-CSF in Breast Cancer Patients.

    PubMed

    Li, Liang; Ma, Lian; Schrieber, Sarah J; Rahman, Nam Atiqur; Deisseroth, Albert; Farrell, Ann T; Wang, Yaning; Sinha, Vikram; Marathe, Anshu

    2018-02-02

    The aim of the study was to evaluate the quantitative relationship between duration of severe neutropenia (DSN, the efficacy endpoint) and area under effect curve of absolute neutrophil counts (ANC-AUEC, the pharmacodynamic endpoint), based on data from filgrastim products, a human granulocyte colony-stimulating factor (G-CSF). Clinical data from filgrastim product comparator and test arms of two randomized, parallel-group, phase III studies in breast cancer patients treated with myelosuppressive chemotherapy were utilized. A zero-inflated Poisson regression model best described the negative correlation between DSN and ANC-AUEC. The models predicted that with 10 × 10⁹ day/L of increase in ANC-AUEC, the mean DSN would decrease from 1.1 days to 0.93 day in Trial 1 and from 1.2 days to 1.0 day in Trial 2. The findings of the analysis provide useful information regarding the relationship between ANC and DSN that can be used for dose selection and optimization of clinical trial design for G-CSF. Published 2018. This article is a U.S. Government work and is in the public domain in the USA.

  20. QMRA for Drinking Water: 2. The Effect of Pathogen Clustering in Single-Hit Dose-Response Models.

    PubMed

    Nilsen, Vegard; Wyller, John

    2016-01-01

    Spatial and/or temporal clustering of pathogens will invalidate the commonly used assumption of Poisson-distributed pathogen counts (doses) in quantitative microbial risk assessment. In this work, the theoretically predicted effect of spatial clustering in conventional "single-hit" dose-response models is investigated by employing the stuttering Poisson distribution, a very general family of count distributions that naturally models pathogen clustering and contains the Poisson and negative binomial distributions as special cases. The analysis is facilitated by formulating the dose-response models in terms of probability generating functions. It is shown formally that the theoretical single-hit risk obtained with a stuttering Poisson distribution is lower than that obtained with a Poisson distribution, assuming identical mean doses. A similar result holds for mixed Poisson distributions. Numerical examples indicate that the theoretical single-hit risk is fairly insensitive to moderate clustering, though the effect tends to be more pronounced for low mean doses. Furthermore, using Jensen's inequality, an upper bound on risk is derived that tends to better approximate the exact theoretical single-hit risk for highly overdispersed dose distributions. The bound holds with any dose distribution (characterized by its mean and zero inflation index) and any conditional dose-response model that is concave in the dose variable. Its application is exemplified with published data from Norovirus feeding trials, for which some of the administered doses were prepared from an inoculum of aggregated viruses. The potential implications of clustering for dose-response assessment as well as practical risk characterization are discussed. © 2016 Society for Risk Analysis.
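
The PGF formulation makes the Poisson case of the single-hit model a one-liner: if P(inf | dose d) = 1 - (1-r)^d, the marginal risk under a Poisson(mu) dose is 1 - G(1-r) = 1 - exp(-mu·r), with G the Poisson PGF. The sketch below shows that identity together with a simple Jensen-style bound (evaluating the concave conditional model at the mean dose); this is not the authors' refined bound, which also uses the zero-inflation index, and the parameter values are illustrative:

```python
import math

def poisson_single_hit_risk(mu, r):
    """Marginal infection risk when the conditional single-hit model is
    P(inf | dose d) = 1 - (1 - r)**d and the dose is Poisson(mu):
    risk = 1 - G(1 - r) = 1 - exp(-mu * r), via the Poisson PGF
    G(s) = exp(mu * (s - 1))."""
    return 1.0 - math.exp(-mu * r)

def jensen_upper_bound(mu, r):
    """Evaluate the concave conditional model at the mean dose; by
    Jensen's inequality this bounds the exact marginal risk from above."""
    return 1.0 - (1.0 - r) ** mu

mu, r = 5.0, 0.1
print(poisson_single_hit_risk(mu, r) <= jensen_upper_bound(mu, r))  # True
```

Clustered (e.g. stuttering Poisson) doses push the exact risk even further below this bound at the same mean dose, which is the article's central comparison.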

  1. Item Response Modeling of Multivariate Count Data with Zero Inflation, Maximum Inflation, and Heaping

    ERIC Educational Resources Information Center

    Magnus, Brooke E.; Thissen, David

    2017-01-01

    Questionnaires that include items eliciting count responses are becoming increasingly common in psychology. This study proposes methodological techniques to overcome some of the challenges associated with analyzing multivariate item response data that exhibit zero inflation, maximum inflation, and heaping at preferred digits. The modeling…

  2. Contribution of chronic diseases to disability in elderly people in countries with low and middle incomes: a 10/66 Dementia Research Group population-based survey.

    PubMed

    Sousa, Renata M; Ferri, Cleusa P; Acosta, Daisy; Albanese, Emiliano; Guerra, Mariella; Huang, Yueqin; Jacob, K S; Jotheeswaran, A T; Rodriguez, Juan J Llibre; Pichardo, Guillermina Rodriguez; Rodriguez, Marina Calvo; Salas, Aquiles; Sosa, Ana Luisa; Williams, Joseph; Zuniga, Tirso; Prince, Martin

    2009-11-28

    Disability in elderly people in countries with low and middle incomes is little studied; according to Global Burden of Disease estimates, visual impairment is the leading contributor to years lived with disability in this population. We aimed to assess the contribution of physical, mental, and cognitive chronic diseases to disability, and the extent to which sociodemographic and health characteristics account for geographical variation in disability. We undertook cross-sectional surveys of residents aged older than 65 years (n=15 022) in 11 sites in seven countries with low and middle incomes (China, India, Cuba, Dominican Republic, Venezuela, Mexico, and Peru). Disability was assessed with the 12-item WHO disability assessment schedule 2.0. Dementia, depression, hypertension, and chronic obstructive pulmonary disease were ascertained by clinical assessment; diabetes, stroke, and heart disease by self-reported diagnosis; and sensory, gastrointestinal, skin, limb, and arthritic disorders by self-reported impairment. Independent contributions to disability scores were assessed by zero-inflated negative binomial regression and Poisson regression to generate population-attributable prevalence fractions (PAPF). In regions other than rural India and Venezuela, dementia made the largest contribution to disability (median PAPF 25.1% [IQR 19.2-43.6]). Other substantial contributors were stroke (11.4% [1.8-21.4]), limb impairment (10.5% [5.7-33.8]), arthritis (9.9% [3.2-34.8]), depression (8.3% [0.5-23.0]), eyesight problems (6.8% [1.7-17.6]), and gastrointestinal impairments (6.5% [0.3-23.1]). Associations with chronic diseases accounted for around two-thirds of prevalent disability. When zero inflation was taken into account, between-site differences in disability scores were largely attributable to compositional differences in health and sociodemographic characteristics. 
On the basis of empirical research, dementia, not blindness, is overwhelmingly the most important independent contributor to disability for elderly people in countries with low and middle incomes. Chronic diseases of the brain and mind deserve increased prioritisation. Besides disability, they lead to dependency and present stressful, complex, long-term challenges to carers. Societal costs are enormous. Wellcome Trust; WHO; US Alzheimer's Association; Fondo Nacional de Ciencia Y Tecnologia, Consejo de Desarrollo Cientifico Y Humanistico, Universidad Central de Venezuela.

  3. Predictors for hospitalization and outpatient visits in patients with inflammatory bowel disease: results from the Swiss Inflammatory Bowel Disease Cohort Study.

    PubMed

    Sulz, Michael C; Siebert, Uwe; Arvandi, Marjan; Gothe, Raffaella M; Wurm, Johannes; von Känel, Roland; Vavricka, Stephan R; Meyenberger, Christa; Sagmeister, Markus

    2013-07-01

    Patients with inflammatory bowel disease (IBD) have a high resource consumption, with considerable costs for the healthcare system. In a system with sparse resources, treatment is influenced not only by clinical judgement but also by resource consumption. We aimed to determine the resource consumption of IBD patients and to identify its significant predictors. Data from the prospective Swiss Inflammatory Bowel Disease Cohort Study were analysed for the resource consumption endpoints hospitalization and outpatient consultations at enrolment [1187 patients; 41.1% ulcerative colitis (UC), 58.9% Crohn's disease (CD)] and at 1-year follow-up (794 patients). Predictors of interest were chosen through an expert panel and a review of the relevant literature. Logistic regressions were used for binary endpoints, and negative binomial regressions and zero-inflated Poisson regressions were used for count data. For CD, fistula, use of biologics and disease activity were significant predictors for hospitalization days (all P-values <0.001); age, sex, steroid therapy and biologics were significant predictors for the number of outpatient visits (P=0.0368, 0.023, 0.0002, 0.0003, respectively). For UC, biologics, C-reactive protein, smoke quitters, age and sex were significantly predictive for hospitalization days (P=0.0167, 0.0003, 0.0003, 0.0076 and 0.0175 respectively); disease activity and immunosuppressive therapy predicted the number of outpatient visits (P=0.0009 and 0.0017, respectively). The results of multivariate regressions are shown in detail. Several highly significant clinical predictors for resource consumption in IBD were identified that might be considered in medical decision-making. In terms of resource consumption and its predictors, CD and UC show a different behaviour.

  4. Modeling factors influencing the demand for emergency department services in Ontario: a comparison of methods.

    PubMed

    Moineddin, Rahim; Meaney, Christopher; Agha, Mohammad; Zagorski, Brandon; Glazier, Richard Henry

    2011-08-19

    Emergency departments are medical treatment facilities designed to provide episodic care to patients suffering from acute injuries and illnesses as well as patients who are experiencing sporadic flare-ups of underlying chronic medical conditions which require immediate attention. Supply and demand for emergency department services vary across geographic regions and time. Some persons do not rely on the service at all, whereas others use the service on repeated occasions. Issues regarding increased wait times for services and crowding illustrate the need to investigate which factors are associated with increased frequency of emergency department utilization. The evidence from this study can help inform policy makers on the appropriate mix of supply- and demand-targeted health care policies necessary to ensure that patients receive appropriate health care delivery in an efficient and cost-effective manner. The purpose of this report is to assess those factors resulting in increased demand for emergency department services in Ontario. We assess how utilization rates vary according to the severity of patient presentation in the emergency department. We are specifically interested in the impact that access to primary care physicians has on the demand for emergency department services. Additionally, we wish to investigate these trends using a series of novel regression models for count outcomes which have yet to be employed in the domain of emergency medical research. Data regarding the frequency of emergency department visits for the respondents of the Canadian Community Health Survey (CCHS) during our study interval (2003-2005) are obtained from the National Ambulatory Care Reporting System (NACRS).
Patients' emergency department utilizations were linked with information from the Canadian Community Health Survey (CCHS), which provides individual-level medical, socio-demographic, psychological and behavioral information for investigating predictors of increased emergency department utilization. Six different multiple regression models for count data were fitted to assess the influence of predictors on demand for emergency department services: Poisson, negative binomial, zero-inflated Poisson, zero-inflated negative binomial, hurdle Poisson, and hurdle negative binomial. Competing models were compared using the Vuong test statistic. The CCHS cycle 2.1 respondents were a roughly equal mix of males (50.4%) and females (49.6%). The majority (86.2%) were young to middle-aged adults between the ages of 20 and 64, living in predominantly urban environments (85.9%), with mid-to-high household incomes (92.2%), and well educated, having received at least a high-school diploma (84.1%). Many participants reported no chronic disease (51.9%), fell into a small number (0-5) of ambulatory diagnostic groups (62.3%), and perceived their health status as good/excellent (88.1%); however, they were projected to have high Resource Utilization Band levels of health resource utilization (68.2%). These factors were largely stable for CCHS cycle 3.1 respondents. Factors influencing demand for emergency department services varied according to the severity of triage scores at initial presentation. For example, although a non-significant predictor of the odds of emergency department utilization in high-severity cases, access to a primary care physician was a statistically significant predictor of the likelihood of emergency department utilization (OR: 0.69; 95% CI: 0.63-0.75) and the rate of emergency department utilization (RR: 0.57; 95% CI: 0.50-0.66) in low-severity cases. 
Using a theoretically appropriate hurdle negative binomial regression model, this study illustrates that access to a primary care physician is an important predictor of both the odds and the rate of emergency department utilization in Ontario. Restructuring primary care services with the aim of increasing access for undersupplied populations may decrease emergency department utilization rates by approximately 43% for low-severity triage level cases.
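
    The six count-model families compared in this record differ chiefly in how they allocate probability to zero. As a minimal, hedged sketch (pure Python; parameter names are illustrative, not taken from the paper), the zero-inflated Poisson mixes a point mass at zero with a full Poisson, while the hurdle Poisson models the zero probability directly and uses a zero-truncated Poisson for positive counts:

    ```python
    import math

    def zip_pmf(k: int, lam: float, pi: float) -> float:
        """Zero-inflated Poisson: point mass pi at zero, mixed with Poisson(lam)."""
        poisson = math.exp(-lam) * lam ** k / math.factorial(k)
        return pi + (1 - pi) * poisson if k == 0 else (1 - pi) * poisson

    def hurdle_pmf(k: int, lam: float, p0: float) -> float:
        """Hurdle Poisson: P(0) = p0; positives follow a zero-truncated Poisson(lam)."""
        if k == 0:
            return p0
        poisson = math.exp(-lam) * lam ** k / math.factorial(k)
        return (1 - p0) * poisson / (1 - math.exp(-lam))
    ```

    Both families put extra mass at zero, but only the hurdle model separates the zero process completely from the count process, which is what permits the distinct odds (any use) and rate (frequency of use) interpretation reported above.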

  5. Variable selection for zero-inflated and overdispersed data with application to health care demand in Germany

    PubMed Central

    Wang, Zhu; Ma, Shuangge; Wang, Ching-Yun

    2017-01-01

    In health services and outcome research, count outcomes are frequently encountered and often have a large proportion of zeros. The zero-inflated negative binomial (ZINB) regression model has important applications for this type of data. With many possible candidate risk factors, this paper proposes new variable selection methods for the ZINB model. We consider the maximum likelihood function plus a penalty, including the least absolute shrinkage and selection operator (LASSO), smoothly clipped absolute deviation (SCAD) and minimax concave penalty (MCP). An EM (expectation-maximization) algorithm is proposed for estimating the model parameters and conducting variable selection simultaneously. This algorithm consists of estimating penalized weighted negative binomial models and penalized logistic models via the coordinated descent algorithm. Furthermore, statistical properties including the standard error formulae are provided. A simulation study shows that the new algorithm not only has more accurate or at least comparable estimation, but also is more robust than traditional stepwise variable selection. The proposed methods are applied to analyze the health care demand in Germany using the open-source R package mpath. PMID:26059498
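
    The EM scheme described in this record alternates between imputing the latent zero-component membership and maximizing a penalized weighted likelihood. A stripped-down illustration for the intercept-only zero-inflated Poisson case (no covariates, no penalty; the function and variable names are ours, not the paper's) shows the two steps in closed form:

    ```python
    import math
    import random

    def rpois(lam: float) -> int:
        """Knuth's method for sampling a Poisson(lam) variate (demo helper)."""
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            k += 1
            p *= random.random()
            if p <= L:
                return k - 1

    def fit_zip_em(y, iters=200):
        """EM for an intercept-only zero-inflated Poisson.

        Latent z_i = 1 iff observation i is a structural zero.
        E-step: posterior P(z_i = 1 | y_i), nonzero only when y_i == 0.
        M-step: closed-form updates for pi and lam given the posteriors.
        """
        n = len(y)
        pi, lam = 0.5, sum(y) / n + 1e-9
        for _ in range(iters):
            z = [pi / (pi + (1 - pi) * math.exp(-lam)) if yi == 0 else 0.0
                 for yi in y]
            pi = sum(z) / n
            w = [1.0 - zi for zi in z]  # posterior weight of the count component
            lam = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
        return pi, lam
    ```

    In the paper's penalized ZINB setting, the M-step is instead carried out by coordinate-wise fits of a penalized weighted negative binomial model and a penalized logistic model, but the E-step logic is the same.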

  6. On the Singularity of the Vlasov-Poisson System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Jian; Qin, Hong

    2013-04-26

    The Vlasov-Poisson system can be viewed as the collisionless limit of the corresponding Fokker-Planck-Poisson system. It is reasonable to expect that the result of Landau damping can also be obtained from the Fokker-Planck-Poisson system when the collision frequency ν approaches zero. However, we show that the collisionless Vlasov-Poisson system is a singular limit of the collisional Fokker-Planck-Poisson system, and Landau's result can be recovered only as ν approaches zero from the positive side.

  7. On the singularity of the Vlasov-Poisson system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Jian; Qin, Hong; Plasma Physics Laboratory, Princeton University, Princeton, New Jersey 08550

    2013-09-15

    The Vlasov-Poisson system can be viewed as the collisionless limit of the corresponding Fokker-Planck-Poisson system. It is reasonable to expect that the result of Landau damping can also be obtained from the Fokker-Planck-Poisson system when the collision frequency ν approaches zero. However, we show that the collisionless Vlasov-Poisson system is a singular limit of the collisional Fokker-Planck-Poisson system, and Landau's result can be recovered only as ν approaches zero from the positive side.

  8. Testing anti-smoking messages for Air Force trainees

    PubMed Central

    Popova, Lucy; Linde, Brittany D.; Bursac, Zoran; Talcott, G. Wayne; Modayil, Mary V.; Little, Melissa A.; Ling, Pamela M.; Glantz, Stanton A.; Klesges, Robert C.

    2015-01-01

    Introduction Young adults in the military are aggressively targeted by tobacco companies and are at high risk of tobacco use. Existing anti-smoking advertisements developed for the general population might be effective in educating young adults in the military. This study evaluated the effects of different themes of existing anti-smoking advertisements on perceived harm and intentions to use cigarettes and other tobacco products among Air Force trainees. Methods In a pretest-posttest experiment, 782 Airmen were randomized to view anti-smoking advertisements in one of six conditions: anti-industry, health effects+anti-industry, sexual health, secondhand smoke, environment+anti-industry, or control. We assessed the effect of different conditions on changes in perceived harm and intentions to use cigarettes, electronic cigarettes (e-cigarettes), smokeless tobacco, hookah and cigarillos from pretest to posttest with multivariable linear regression models (perceived harm) and zero-inflated Poisson regression model (intentions). Results Anti-smoking advertisements increased perceived harm of various tobacco products and reduced intentions to use. Advertisements featuring negative effects of tobacco on health and sexual performance coupled with revealing tobacco industry manipulations had the most consistent pattern of effects on perceived harm and intentions. Conclusion Anti-smoking advertisements produced for the general public might also be effective with a young adult military population and could have spillover effects on perceptions of harm and intentions to use other tobacco products besides cigarettes. Existing anti-smoking advertising may be a cost-effective tool to educate young adults in the military. PMID:26482786

  9. Nonsuicidal Self-Injury Among “Privileged” Youths: Longitudinal and Cross-Sectional Approaches to Developmental Process

    PubMed Central

    Yates, Tuppett M.; Luthar, Suniya S.; Tracy, Allison J.

    2015-01-01

    This investigation examined process-level pathways to nonsuicidal self-injury (NSSI; e.g., self-cutting, -burning, -hitting) in 2 cohorts of suburban, upper-middle-class youths: a cross-sectional sample of 9th–12th graders (n = 1,036, 51.9% girls) on the West Coast and a longitudinal sample followed annually from the 6th through 12th grades (n = 245, 53.1% girls) on the East Coast. High rates of NSSI were found in both the cross-sectional (37.2%) and the longitudinal (26.1%) samples. Zero-inflated Poisson regression models estimated process-level pathways from perceived parental criticism to NSSI via youth-reported alienation toward parents. Pathways toward the initiation of NSSI were distinct from those accounting for its frequency. Parental criticism was associated with increased NSSI, and youth alienation toward parents emerged as a relevant process underlying this pathway, particularly for boys. The specificity of these pathways was explored by examining separate trajectories toward delinquent outcomes. The findings illustrate the prominence of NSSI among “privileged” youths, the salience of the caregiving environment in NSSI, the importance of parental alienation in explaining these relations, and the value of incorporating multiple systems in treatment approaches for adolescents who self-injure. PMID:18229983

  10. Screening adolescents in the emergency department for weapon carriage.

    PubMed

    Cunningham, Rebecca M; Resko, Stella M; Harrison, Stephanie Roahen; Zimmerman, Marc; Stanley, Rachel; Chermack, Stephen T; Walton, Maureen A

    2010-02-01

    The objective was to describe the prevalence and correlates of past-year weapon involvement among adolescents seeking care in an inner-city emergency department (ED). This cross-sectional study administered a computerized survey to all eligible adolescents (age 14-18 years), 7 days a week, who were seeking care over an 18-month period at an inner-city Level 1 ED. Validated measures were administered, including measures of demographics, sexual activity, substance use, injury, violent behavior, weapon carriage, and/or weapon use. Zero-inflated Poisson (ZIP) regression models were used to identify correlates of the occurrence and past-year frequency of these weapons variables. Adolescents (n = 2069, 86% response rate) completed the computerized survey. Fifty-five percent were female; 56.5% were African American. In the past year, 20% of adolescents reported knife or razor carriage, 7% reported gun carriage, and 6% pulled a knife or gun on someone. Although gun carriage was more frequent among males, females were as likely to carry a knife or pull a weapon in the past year. One-fifth of all adolescents seeking care in this inner-city ED have carried a weapon. Understanding weapon carriage among teens seeking ED care is a critical first step to future ED-based injury prevention initiatives. (c) 2010 by the Society for Academic Emergency Medicine.

  11. Statistical analysis of the El Niño-Southern Oscillation and sea-floor seismicity in the eastern tropical Pacific.

    PubMed

    Guillas, Serge; Day, Simon J; McGuire, B

    2010-05-28

    We present statistical evidence for a temporal link between variations in the El Niño-Southern Oscillation (ENSO) and the occurrence of earthquakes on the East Pacific Rise (EPR). We adopt a zero-inflated Poisson regression model to represent the relationship between the number of earthquakes in the Easter microplate on the EPR and ENSO (expressed using the southern oscillation index (SOI) for east Pacific sea-level pressure anomalies) from February 1973 to February 2009. We also examine the relationship between the numbers of earthquakes and sea levels, as retrieved by Topex/Poseidon from October 1992 to July 2002. We observe a significant (95% confidence level) positive influence of SOI on seismicity: positive SOI values trigger more earthquakes over the following 2 to 6 months than negative SOI values. There is a significant negative influence of absolute sea levels on seismicity (at 6 months lag). We propose that increased seismicity is associated with ENSO-driven sea-surface gradients (rising from east to west) in the equatorial Pacific, leading to a reduction in ocean-bottom pressure over the EPR by a few kilopascal. This relationship is opposite to reservoir-triggered seismicity and suggests that EPR fault activity may be triggered by plate flexure associated with the reduced pressure.

  12. 'Green' on the ground but not in the air: Pro-environmental attitudes are related to household behaviours but not discretionary air travel.

    PubMed

    Alcock, Ian; White, Mathew P; Taylor, Tim; Coldwell, Deborah F; Gribble, Matthew O; Evans, Karl L; Corner, Adam; Vardoulakis, Sotiris; Fleming, Lora E

    2017-01-01

    The rise in greenhouse gas emissions from air travel could be reduced by individuals voluntarily abstaining from, or reducing, flights for leisure and recreational purposes. In theory, we might expect that people with pro-environmental value orientations and concerns about the risks of climate change, and those who engage in more pro-environmental household behaviours, would also be more likely to abstain from such voluntary air travel, or at least to fly less far. Analysis of two large datasets from the United Kingdom, weighted to be representative of the whole population, tested these associations. Using zero-inflated Poisson regression models, we found that, after accounting for potential confounders, there was no association between individuals' environmental attitudes, concern over climate change, or their routine pro-environmental household behaviours, and either their propensity to take non-work related flights, or the distances flown by those who do so. These findings contrasted with those for pro-environmental household behaviours, where associations with environmental attitudes and concern were observed. Our results offer little encouragement for policies aiming to reduce discretionary air travel through pro-environmental advocacy, or through 'spill-over' from interventions to improve environmental impacts of household routines.

  13. Prescription Drug Misuse and Sexual Behavior among Young Adults

    PubMed Central

    Wells, Brooke E.; Kelly, Brian C.; Rendina, H. Jonathon; Parsons, Jeffrey T.

    2015-01-01

    Though research indicates a complex link between substance use and sexual risk behavior, there is limited research on the association between sexual risk behavior and prescription drug misuse. In light of the alarming increases in prescription drug misuse and the role of demographic characteristics in sexual risk behavior and outcomes, the current study examines demographic differences (gender, sexual identity, age, relationship status, parental class background, and race/ethnicity) in sexual risk behavior, sexual behavior under the influence of prescription drugs, and sexual risk behavior under the influence of prescription drugs in a sample of 402 young adults (18–29) who misuse prescription drugs. Nearly half of the sexually active young adult prescription drug misusers in this sample reported recent sex under the influence of prescription drugs, more than three quarters reported recent sex without a condom, and more than one-third reported recent sex without a condom after using prescription drugs. Zero-inflated Poisson regression models indicated that white race, younger age, higher parental class, and being a heterosexual man were all associated with sexual risk behavior, sex under the influence of prescription drugs, and sexual risk under the influence of prescription drugs. Findings have implications for the targeting of prevention and intervention efforts. PMID:25569204

  14. ‘Green’ on the ground but not in the air: Pro-environmental attitudes are related to household behaviours but not discretionary air travel

    PubMed Central

    White, Mathew P.; Taylor, Tim; Coldwell, Deborah F.; Gribble, Matthew O.; Evans, Karl L.; Corner, Adam; Vardoulakis, Sotiris; Fleming, Lora E.

    2017-01-01

    The rise in greenhouse gas emissions from air travel could be reduced by individuals voluntarily abstaining from, or reducing, flights for leisure and recreational purposes. In theory, we might expect that people with pro-environmental value orientations and concerns about the risks of climate change, and those who engage in more pro-environmental household behaviours, would also be more likely to abstain from such voluntary air travel, or at least to fly less far. Analysis of two large datasets from the United Kingdom, weighted to be representative of the whole population, tested these associations. Using zero-inflated Poisson regression models, we found that, after accounting for potential confounders, there was no association between individuals' environmental attitudes, concern over climate change, or their routine pro-environmental household behaviours, and either their propensity to take non-work related flights, or the distances flown by those who do so. These findings contrasted with those for pro-environmental household behaviours, where associations with environmental attitudes and concern were observed. Our results offer little encouragement for policies aiming to reduce discretionary air travel through pro-environmental advocacy, or through ‘spill-over’ from interventions to improve environmental impacts of household routines. PMID:28367001

  15. Nonsuicidal self-injury among "privileged" youths: longitudinal and cross-sectional approaches to developmental process.

    PubMed

    Yates, Tuppett M; Tracy, Allison J; Luthar, Suniya S

    2008-02-01

    This investigation examined process-level pathways to nonsuicidal self-injury (NSSI; e.g., self-cutting, -burning, -hitting) in 2 cohorts of suburban, upper-middle-class youths: a cross-sectional sample of 9th-12th graders (n = 1,036, 51.9% girls) on the West Coast and a longitudinal sample followed annually from the 6th through 12th grades (n = 245, 53.1% girls) on the East Coast. High rates of NSSI were found in both the cross-sectional (37.2%) and the longitudinal (26.1%) samples. Zero-inflated Poisson regression models estimated process-level pathways from perceived parental criticism to NSSI via youth-reported alienation toward parents. Pathways toward the initiation of NSSI were distinct from those accounting for its frequency. Parental criticism was associated with increased NSSI, and youth alienation toward parents emerged as a relevant process underlying this pathway, particularly for boys. The specificity of these pathways was explored by examining separate trajectories toward delinquent outcomes. The findings illustrate the prominence of NSSI among "privileged" youths, the salience of the caregiving environment in NSSI, the importance of parental alienation in explaining these relations, and the value of incorporating multiple systems in treatment approaches for adolescents who self-injure.

  16. Breakdown in the organ donation process and its effect on organ availability.

    PubMed

    Razdan, Manik; Degenholtz, Howard B; Kahn, Jeremy M; Driessen, Julia

    2015-01-01

    Background. This study examines the effect of breakdown in the organ donation process on the availability of transplantable organs. A process breakdown is defined as a deviation from the organ donation protocol that may jeopardize organ recovery. Methods. A retrospective analysis of donation-eligible decedents was conducted using data from an independent organ procurement organization. Adjusted effect of process breakdown on organs transplanted from an eligible decedent was examined using multivariable zero-inflated Poisson regression. Results. An eligible decedent is four times more likely to become an organ donor when there is no process breakdown (adjusted OR: 4.01; 95% CI: 1.6838, 9.6414; P < 0.01) even after controlling for the decedent's age, gender, race, and whether or not a decedent had joined the state donor registry. However once the eligible decedent becomes a donor, whether or not there was a process breakdown does not affect the number of transplantable organs yielded. Overall, for every process breakdown occurring in the care of an eligible decedent, one less organ is available for transplant. Decedent's age is a strong predictor of likelihood of donation and the number of organs transplanted from a donor. Conclusion. Eliminating breakdowns in the donation process can potentially increase the number of organs available for transplant but some organs will still be lost.

  17. Bayesian spatiotemporal analysis of zero-inflated biological population density data by a delta-normal spatiotemporal additive model.

    PubMed

    Arcuti, Simona; Pollice, Alessio; Ribecco, Nunziata; D'Onghia, Gianfranco

    2016-03-01

    We evaluate the spatiotemporal changes in the density of a particular species of crustacean known as deep-water rose shrimp, Parapenaeus longirostris, based on biological sample data collected during trawl surveys carried out from 1995 to 2006 as part of the international project MEDITS (MEDiterranean International Trawl Surveys). As is the case for many biological variables, density data are continuous and characterized by unusually large amounts of zeros, accompanied by a skewed distribution of the remaining values. Here we analyze the normalized density data by a Bayesian delta-normal semiparametric additive model including the effects of covariates, using penalized regression with low-rank thin-plate splines for nonlinear spatial and temporal effects. Modeling the zero and nonzero values by two joint processes, as we propose in this work, allows for great flexibility and easy handling of complex likelihood functions, avoiding inaccurate statistical inferences due to misclassification of the high proportion of exact zeros in the model. Bayesian model estimation is obtained by Markov chain Monte Carlo simulations, suitably specifying the complex likelihood function of the zero-inflated density data. The study highlights relevant nonlinear spatial and temporal effects and the influence of the annual Mediterranean oscillations index and of the sea surface temperature on the distribution of the deep-water rose shrimp density. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Multivariate random-parameters zero-inflated negative binomial regression model: an application to estimate crash frequencies at intersections.

    PubMed

    Dong, Chunjiao; Clarke, David B; Yan, Xuedong; Khattak, Asad; Huang, Baoshan

    2014-09-01

    Crash data are collected through police reports and integrated with road inventory data for further analysis. Integrated police reports and inventory data yield correlated multivariate data for roadway entities (e.g., segments or intersections). Analysis of such data reveals important relationships that can help identify high-risk situations and develop safety countermeasures. To understand relationships between crash frequencies and associated variables, while taking full advantage of the available data, multivariate random-parameters models are appropriate since they can simultaneously consider the correlation among specific crash types and account for unobserved heterogeneity. However, a key issue that arises with correlated multivariate data is that the number of crash-free samples increases as crash counts are disaggregated into many categories. In this paper, we describe a multivariate random-parameters zero-inflated negative binomial (MRZINB) regression model for jointly modeling crash counts. The full Bayesian method is employed to estimate the model parameters. Crash frequencies at urban signalized intersections in Tennessee are analyzed. The paper investigates the performance of multivariate zero-inflated negative binomial (MZINB) and MRZINB regression models in establishing the relationship between crash frequencies, pavement conditions, traffic factors, and geometric design features of roadway intersections. Compared to the MZINB model, the MRZINB model identifies additional statistically significant factors and provides better goodness of fit in developing the relationships. The empirical results show that the MRZINB model possesses most of the desirable statistical properties in terms of its ability to accommodate unobserved heterogeneity and excess zero counts in correlated data. Notably, in the MRZINB model, the estimated parameters vary significantly across intersections for different crash types. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Golden alga presence and abundance are inversely related to salinity in a high-salinity river ecosystem, Pecos River, USA

    USGS Publications Warehouse

    Israël, Natascha M.D.; VanLandeghem, Matthew M.; Denny, Shawn; Ingle, John; Patino, Reynaldo

    2014-01-01

    Prymnesium parvum (golden alga, GA) is a toxigenic harmful alga native to marine ecosystems that has also affected brackish inland waters. The first toxic bloom of GA in the western hemisphere occurred in the Pecos River, one of the saltiest rivers in North America. Environmental factors (water quality) associated with GA occurrence in this basin, however, have not been examined. Water quality and GA presence and abundance were determined at eight sites in the Pecos River basin with or without prior history of toxic blooms. Sampling was conducted monthly from January 2012 to July 2013. Specific conductance (salinity) varied spatiotemporally between 4408 and 73,786 µS/cm. Results of graphical, principal component (PCA), and zero-inflated Poisson (ZIP) regression analyses indicated that the incidence and abundance of GA are reduced as salinity increases spatiotemporally. LOWESS regression and correlation analyses of archived data for specific conductance and GA abundance at one of the study sites retrospectively confirmed the negative association between these variables. Results of PCA also suggested that at <15,000 µS/cm, GA was present at a relatively wide range of nutrient (nitrogen and phosphorus) concentrations, whereas at higher salinity, GA was observed only at mid-to-high nutrient levels. Generally consistent with earlier studies, results of ZIP regression indicated that GA presence is positively associated with organic phosphorus and, in samples where GA is present, GA abundance is positively associated with organic nitrogen and negatively associated with inorganic nitrogen. This is the first report of an inverse relation between salinity and GA presence and abundance in riverine waters and of interaction effects of salinity and nutrients in the field. These observations contribute to a more complete understanding of environmental conditions that influence GA distribution in inland waters.

  20. Logistic regression for dichotomized counts.

    PubMed

    Preisser, John S; Das, Kalyan; Benecha, Habtamu; Stamm, John W

    2016-12-01

    Sometimes there is interest in a dichotomized outcome indicating whether a count variable is positive or zero. Under this scenario, the application of ordinary logistic regression may result in efficiency loss, which is quantifiable under an assumed model for the counts. In such situations, a shared-parameter hurdle model is investigated for more efficient estimation of regression parameters relating to overall effects of covariates on the dichotomous outcome, while handling count data with many zeroes. One model part provides a logistic regression containing marginal log odds ratio effects of primary interest, while an ancillary model part describes the mean count of a Poisson or negative binomial process in terms of nuisance regression parameters. Asymptotic efficiency of the logistic model parameter estimators of the two-part models is evaluated with respect to ordinary logistic regression. Simulations are used to assess the properties of the models with respect to power and Type I error, the latter investigated under both misspecified and correctly specified models. The methods are applied to data from a randomized clinical trial of three toothpaste formulations to prevent incident dental caries in a large population of Scottish schoolchildren. © The Author(s) 2014.
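
    The efficiency loss noted in this record has a simple analytic root: under a Poisson(μ) count model the dichotomized outcome satisfies P(Y > 0) = 1 − exp(−μ), so a log link on μ corresponds exactly to a complementary log-log link (not the logit) on the zero/positive indicator. A brief sketch of this identity (function names and coefficient values are illustrative, not from the paper):

    ```python
    import math

    def p_positive(mu: float) -> float:
        """P(Y > 0) when Y ~ Poisson(mu)."""
        return 1.0 - math.exp(-mu)

    def cloglog(p: float) -> float:
        """Complementary log-log link: log(-log(1 - p))."""
        return math.log(-math.log(1.0 - p))

    # With a Poisson log link, log(mu) = b0 + b1 * x, the cloglog of the
    # dichotomized outcome recovers that same linear predictor exactly;
    # an ordinary logit does not, which is one source of the efficiency
    # loss when logistic regression is applied to dichotomized counts.
    b0, b1, x = 0.4, 0.7, 1.3  # illustrative coefficients
    mu = math.exp(b0 + b1 * x)
    assert abs(cloglog(p_positive(mu)) - (b0 + b1 * x)) < 1e-12
    ```

    The shared-parameter hurdle model above retains marginal log odds ratio effects of primary interest in the logistic part while using an ancillary count part, rather than relying on this link change.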

  1. Variable selection for zero-inflated and overdispersed data with application to health care demand in Germany.

    PubMed

    Wang, Zhu; Ma, Shuangge; Wang, Ching-Yun

    2015-09-01

    In health services and outcome research, count outcomes are frequently encountered and often have a large proportion of zeros. The zero-inflated negative binomial (ZINB) regression model has important applications for this type of data. With many possible candidate risk factors, this paper proposes new variable selection methods for the ZINB model. We consider maximum likelihood function plus a penalty including the least absolute shrinkage and selection operator (LASSO), smoothly clipped absolute deviation (SCAD), and minimax concave penalty (MCP). An EM (expectation-maximization) algorithm is proposed for estimating the model parameters and conducting variable selection simultaneously. This algorithm consists of estimating penalized weighted negative binomial models and penalized logistic models via the coordinated descent algorithm. Furthermore, statistical properties including the standard error formulae are provided. A simulation study shows that the new algorithm not only has more accurate or at least comparable estimation, but also is more robust than the traditional stepwise variable selection. The proposed methods are applied to analyze the health care demand in Germany using the open-source R package mpath. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Contribution of chronic diseases to disability in elderly people in countries with low and middle incomes: a 10/66 Dementia Research Group population-based survey

    PubMed Central

    Sousa, Renata M; Ferri, Cleusa P; Acosta, Daisy; Albanese, Emiliano; Guerra, Mariella; Huang, Yueqin; Jacob, KS; Jotheeswaran, AT; Rodriguez, Juan J Llibre; Pichardo, Guillermina Rodriguez; Rodriguez, Marina Calvo; Salas, Aquiles; Sosa, Ana Luisa; Williams, Joseph; Zuniga, Tirso; Prince, Martin

    2009-01-01

    Summary Background Disability in elderly people in countries with low and middle incomes is little studied; according to Global Burden of Disease estimates, visual impairment is the leading contributor to years lived with disability in this population. We aimed to assess the contribution of physical, mental, and cognitive chronic diseases to disability, and the extent to which sociodemographic and health characteristics account for geographical variation in disability. Methods We undertook cross-sectional surveys of residents aged older than 65 years (n=15 022) in 11 sites in seven countries with low and middle incomes (China, India, Cuba, Dominican Republic, Venezuela, Mexico, and Peru). Disability was assessed with the 12-item WHO disability assessment schedule 2.0. Dementia, depression, hypertension, and chronic obstructive pulmonary disease were ascertained by clinical assessment; diabetes, stroke, and heart disease by self-reported diagnosis; and sensory, gastrointestinal, skin, limb, and arthritic disorders by self-reported impairment. Independent contributions to disability scores were assessed by zero-inflated negative binomial regression and Poisson regression to generate population-attributable prevalence fractions (PAPF). Findings In regions other than rural India and Venezuela, dementia made the largest contribution to disability (median PAPF 25·1% [IQR 19·2–43·6]). Other substantial contributors were stroke (11·4% [1·8–21·4]), limb impairment (10·5% [5·7–33·8]), arthritis (9·9% [3·2–34·8]), depression (8·3% [0·5–23·0]), eyesight problems (6·8% [1·7–17·6]), and gastrointestinal impairments (6·5% [0·3–23·1]). Associations with chronic diseases accounted for around two-thirds of prevalent disability. When zero inflation was taken into account, between-site differences in disability scores were largely attributable to compositional differences in health and sociodemographic characteristics. 
Interpretation On the basis of empirical research, dementia, not blindness, is overwhelmingly the most important independent contributor to disability for elderly people in countries with low and middle incomes. Chronic diseases of the brain and mind deserve increased prioritisation. Besides disability, they lead to dependency and present stressful, complex, long-term challenges to carers. Societal costs are enormous. Funding Wellcome Trust; WHO; US Alzheimer's Association; Fondo Nacional de Ciencia Y Tecnologia, Consejo de Desarrollo Cientifico Y Humanistico, Universidad Central de Venezuela. PMID:19944863

  3. Bayesian inference for unidirectional misclassification of a binary response trait.

    PubMed

    Xia, Michelle; Gustafson, Paul

    2018-03-15

    When assessing association between a binary trait and some covariates, the binary response may be subject to unidirectional misclassification. Unidirectional misclassification can occur when revealing a particular level of the trait is associated with a type of cost, such as a social desirability or financial cost. The feasibility of addressing misclassification is commonly obscured by model identification issues. The current paper attempts to study the efficacy of inference when the binary response variable is subject to unidirectional misclassification. From a theoretical perspective, we demonstrate that the key model parameters possess identifiability, except for the case with a single binary covariate. From a practical standpoint, the logistic model with quantitative covariates can be weakly identified, in the sense that the Fisher information matrix may be near singular. This can make learning some parameters difficult under certain parameter settings, even with quite large samples. In other cases, the stronger identification enables the model to provide more effective adjustment for unidirectional misclassification. An extension to the Poisson approximation of the binomial model reveals the identifiability of the Poisson and zero-inflated Poisson models. For fully identified models, the proposed method adjusts for misclassification based on learning from data. For binary models where there is difficulty in identification, the method is useful for sensitivity analyses on the potential impact from unidirectional misclassification. Copyright © 2017 John Wiley & Sons, Ltd.
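The Poisson approximation invoked in this record (a binomial with large n and small p behaves like a Poisson with mean np) can be checked numerically. The snippet below is a generic illustration of that classical approximation, not the authors' misclassification model:

```python
import numpy as np
from scipy.stats import binom, poisson

# Binomial(n, p) with large n and small p is close to Poisson(n * p).
n, p = 1000, 0.003
k = np.arange(0, 51)  # support range holding essentially all probability mass

tv_distance = 0.5 * np.sum(np.abs(binom.pmf(k, n, p) - poisson.pmf(k, n * p)))

# Le Cam's inequality bounds the total variation distance by n * p**2.
print(f"TV distance: {tv_distance:.5f} (Le Cam bound: {n * p**2:.5f})")
```

Le Cam's bound guarantees the two distributions differ by at most n·p² = 0.009 in total variation here, so the approximation is tight.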

  4. Modeling the spatial distribution of African buffalo (Syncerus caffer) in the Kruger National Park, South Africa

    PubMed Central

    Hughes, Kristen; Budke, Christine M.; Ward, Michael P.; Kerry, Ruth; Ingram, Ben

    2017-01-01

    The population density of wildlife reservoirs contributes to disease transmission risk for domestic animals. The objective of this study was to model the African buffalo distribution in the Kruger National Park. A secondary objective was to collect field data to evaluate models and determine environmental predictors of buffalo detection. Spatial distribution models were created using buffalo census information and archived data from previous research. Field data were collected during the dry (August 2012) and wet (January 2013) seasons using a random walk design. The fit of the prediction models was assessed descriptively and formally by calculating the root mean square error (rMSE) of deviations from field observations. Logistic regression was used to estimate the effects of environmental variables on the detection of buffalo herds and linear regression was used to identify predictors of larger herd sizes. A zero-inflated Poisson model produced distributions that were most consistent with expected buffalo behavior. Field data confirmed that environmental factors including season (P = 0.008), vegetation type (P = 0.002), and vegetation density (P = 0.010) were significant predictors of buffalo detection. Bachelor herds were more likely to be detected in dense vegetation (P = 0.005) and during the wet season (P = 0.022) compared to the larger mixed-sex herds. Static distribution models for African buffalo can produce biologically reasonable results but environmental factors have significant effects and therefore could be used to improve model performance. Accurate distribution models are critical for the evaluation of disease risk and to model disease transmission. PMID:28902858

  5. Testing antismoking messages for Air Force trainees.

    PubMed

    Popova, Lucy; Linde, Brittany D; Bursac, Zoran; Talcott, G Wayne; Modayil, Mary V; Little, Melissa A; Ling, Pamela M; Glantz, Stanton A; Klesges, Robert C

    2016-11-01

    Young adults in the military are aggressively targeted by tobacco companies and are at high risk of tobacco use. Existing antismoking advertisements developed for the general population might be effective in educating young adults in the military. This study evaluated the effects of different themes of existing antismoking advertisements on perceived harm and intentions to use cigarettes and other tobacco products among Air Force trainees. In a pretest-post-test experiment, 782 Airmen were randomised to view antismoking advertisements in 1 of 6 conditions: anti-industry, health effects+anti-industry, sexual health, secondhand smoke, environment+anti-industry or control. We assessed the effect of different conditions on changes in perceived harm and intentions to use cigarettes, electronic cigarettes, smokeless tobacco, hookah and cigarillos from pretest to post-test with multivariable linear regression models (perceived harm) and zero-inflated Poisson regression model (intentions). Antismoking advertisements increased perceived harm of various tobacco products and reduced intentions to use. Advertisements featuring negative effects of tobacco on health and sexual performance coupled with revealing tobacco industry manipulations had the most consistent pattern of effects on perceived harm and intentions. Antismoking advertisements produced for the general public might also be effective with a young adult military population and could have spillover effects on perceptions of harm and intentions to use other tobacco products besides cigarettes. Existing antismoking advertising may be a cost-effective tool to educate young adults in the military. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  6. Zero-truncated negative binomial - Erlang distribution

    NASA Astrophysics Data System (ADS)

    Bodhisuwan, Winai; Pudprommarat, Chookait; Bodhisuwan, Rujira; Saothayanun, Luckhana

    2017-11-01

    The zero-truncated negative binomial-Erlang distribution is introduced. It is developed from the negative binomial-Erlang distribution. In this work, the probability mass function is derived and some properties are presented. The parameters of the zero-truncated negative binomial-Erlang distribution are estimated by maximum likelihood. Finally, the proposed distribution is applied to real data on methamphetamine counts in Bangkok, Thailand. The results show that the zero-truncated negative binomial-Erlang distribution provides a better fit to these data than the zero-truncated Poisson, zero-truncated negative binomial, zero-truncated generalized negative binomial and zero-truncated Poisson-Lindley distributions.
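For reference, the simplest member of the family compared in this record, the zero-truncated Poisson, has pmf P(X = k) = e^(-λ) λ^k / (k! (1 − e^(-λ))) for k ≥ 1, and its maximum likelihood estimate solves λ / (1 − e^(-λ)) = sample mean. A minimal sketch on hypothetical counts (not the paper's data):

```python
import numpy as np
from scipy.optimize import brentq

def fit_zt_poisson(y):
    """MLE for the zero-truncated Poisson: solve lam / (1 - exp(-lam)) = mean(y)."""
    y = np.asarray(y)
    assert (y >= 1).all(), "zero-truncated data must be strictly positive"
    m = y.mean()
    # lam / (1 - exp(-lam)) increases from 1 (as lam -> 0) to infinity,
    # so a unique root exists whenever the sample mean exceeds 1.
    return brentq(lambda lam: lam / -np.expm1(-lam) - m, 1e-8, 100.0)

y = [1, 1, 2, 2, 3, 4, 6]  # hypothetical counts with no zeros
lam_hat = fit_zt_poisson(y)
```

Note that the estimate is always below the raw sample mean, since truncation removes the zeros that would otherwise pull the mean down.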

  7. Mental illness in bariatric surgery: A cohort study from the PORTAL network.

    PubMed

    Fisher, David; Coleman, Karen J; Arterburn, David E; Fischer, Heidi; Yamamoto, Ayae; Young, Deborah R; Sherwood, Nancy E; Trinacty, Connie Mah; Lewis, Kristina H

    2017-05-01

    To compare bariatric surgery outcomes according to preoperative mental illness category. Electronic health record data from several US healthcare systems were used to compare outcomes of four groups of patients who underwent bariatric surgery in 2012 and 2013. These included the following: people with (1) no mental illness, (2) mild-to-moderate depression or anxiety, (3) severe depression or anxiety, and (4) bipolar, psychosis, or schizophrenia spectrum disorders. Groups were compared on weight loss trajectory using generalized estimating equations with B-spline bases and on all-cause emergency department visits and hospital days using zero-inflated Poisson and negative binomial regression up to 2 years after surgery. Models were adjusted for demographic and health covariates, including baseline healthcare use. Among 8,192 patients, mean age was 44.3 (10.7) years, 79.9% were female, and 45.6% were white. Fifty-seven percent had preoperative mental illness. There were no differences between groups for weight loss, but patients with preoperative severe depression or anxiety or bipolar, psychosis, or schizophrenia spectrum disorders had higher follow-up levels of emergency department visits and hospital days compared to those with no mental illness. In this multicenter study, mental illness was not associated with differential weight loss after bariatric surgery, but additional research could focus on reducing acute care use among these patients. © 2017 The Obesity Society.

  8. Dental erosion prevalence and associated risk indicators among preschool children in Athens, Greece.

    PubMed

    Mantonanaki, Magdalini; Koletsi-Kounari, Haroula; Mamai-Homata, Eleni; Papaioannou, William

    2013-03-01

    The aims of the study were to investigate dental erosion prevalence, distribution and severity in Greek preschool children attending public kindergartens in the prefecture of Attica, Greece and to determine the effect of dental caries, oral hygiene level, socio-economic factors, dental behavior, erosion related medication and chronic illness. A random and stratified sample of 605 Greek preschool children was clinically examined for dental erosion using the Basic Erosive Wear Examination Index (BEWE). Dental caries (dmfs) and Simplified Debris Index were also recorded. The data concerning possible risk indicators were derived from a questionnaire. A zero-inflated Poisson regression model was fitted to test the predictive effects of the independent variables on dental erosion. The prevalence of dental erosion was 78.8%, and the mean and SE of the BEWE index was 3.64 ± 0.15. High monthly family income was positively related to BEWE cumulative scores [RR = 1.204 (1.016-1.427)], while high maternal education level [RR = 0.872 (0.771-0.986)] and poor oral hygiene level [DI-s, RR = 0.584 (0.450-0.756)] showed a negative association. Dental erosion is a common oral disease in Greek preschool children in Attica, related to oral hygiene and socio-economic factors. Programs aimed at erosion prevention should begin at an early age for all children.

  9. The influence of impulsiveness on binge eating and problem gambling: A prospective study of gender differences in Canadian adults.

    PubMed

    Farstad, Sarah M; von Ranson, Kristin M; Hodgins, David C; El-Guebaly, Nady; Casey, David M; Schopflocher, Don P

    2015-09-01

    This study investigated the degree to which facets of impulsiveness predicted future binge eating and problem gambling, 2 theorized forms of behavioral addiction. Participants were 596 women and 406 men from 4 age cohorts randomly recruited from a Canadian province. Participants completed self-report measures of 3 facets of impulsiveness (negative urgency, sensation seeking, lack of persistence), binge-eating frequency, and problem-gambling symptoms. Impulsiveness was assessed at baseline, and assessments of binge eating and problem gambling were followed up after 3 years. Weighted data were analyzed using zero-inflated negative binomial and Poisson regression models. We found evidence of transdiagnostic and disorder-specific predictors of binge eating and problem gambling. Negative urgency emerged as a common predictor of binge eating and problem gambling among women and men. There were disorder-specific personality traits identified among men only: High lack-of-persistence scores predicted binge eating and high sensation-seeking scores predicted problem gambling. Among women, younger age predicted binge eating and older age predicted problem gambling. Thus, there are gender differences in facets of impulsiveness that longitudinally predict binge eating and problem gambling, suggesting that treatments for these behaviors should consider gender-specific personality and demographic traits in addition to the common personality trait of negative urgency. (c) 2015 APA, all rights reserved.

  10. Association of Pediatric Abusive Head Trauma Rates With Macroeconomic Indicators.

    PubMed

    Wood, Joanne N; French, Benjamin; Fromkin, Janet; Fakeye, Oludolapo; Scribano, Philip V; Letson, Megan M; Makoroff, Kathi L; Feldman, Kenneth W; Fabio, Anthony; Berger, Rachel

    2016-04-01

    We aimed to examine abusive head trauma (AHT) incidence before, during and after the recession of 2007-2009 in 3 US regions and assess the association of economic measures with AHT incidence. Data for children <5 years old diagnosed with AHT between January 1, 2004, and December 31, 2012, in 3 regions were linked to county-level economic data using an ecologic time series analysis. Associations between county-level AHT rates and recession period as well as employment growth, mortgage delinquency, and foreclosure rates were examined using zero-inflated Poisson regression models. During the 9-year period, 712 children were diagnosed with AHT. The mean rate of AHT per 100,000 child-years increased from 9.8 before the recession to 15.6 during the recession before decreasing to 12.8 after the recession. The AHT rates after the recession were higher than the rates before the recession (incidence rate ratio 1.31, P = .004) but lower than rates during the recession (incidence rate ratio 0.78, P = .005). There was no association between the AHT rate and employment growth, mortgage delinquency rates, or foreclosure rates. In the period after the recession, AHT rate was lower than during the recession period yet higher than the level before the recession, suggesting a lingering effect of the economic stress of the recession on maltreatment risk. Copyright © 2016 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  11. GMPR: A robust normalization method for zero-inflated count data with application to microbiome sequencing data.

    PubMed

    Chen, Li; Reeve, James; Zhang, Lujun; Huang, Shengbing; Wang, Xuefeng; Chen, Jun

    2018-01-01

    Normalization is the first critical step in microbiome sequencing data analysis used to account for variable library sizes. Current RNA-Seq based normalization methods that have been adapted for microbiome data fail to consider the unique characteristics of microbiome data, which contain a vast number of zeros due to the physical absence or under-sampling of the microbes. Normalization methods that specifically address the zero-inflation remain largely undeveloped. Here we propose the geometric mean of pairwise ratios (GMPR), a simple but effective normalization method for zero-inflated sequencing data such as microbiome data. Simulation studies and analyses of real datasets demonstrate that the proposed method is more robust than competing methods, leading to more powerful detection of differentially abundant taxa and higher reproducibility of the relative abundances of taxa.
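The pairwise-ratio idea can be sketched as follows: for each pair of samples, take the median of count ratios over taxa observed in both, then combine each sample's medians with a geometric mean. This is a simplified reading of the method on a toy matrix, not the authors' reference implementation (which also handles edge cases such as sample pairs sharing no taxa):

```python
import numpy as np

def gmpr_size_factors(counts):
    """Geometric mean of pairwise median count ratios, one size factor per sample.

    counts: (n_samples, n_taxa) array of nonnegative counts.
    Assumes every pair of samples shares at least one nonzero taxon.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.shape[0]
    factors = np.empty(n)
    for i in range(n):
        log_ratios = []
        for j in range(n):
            if i == j:
                continue
            shared = (counts[i] > 0) & (counts[j] > 0)  # taxa present in both samples
            r = np.median(counts[i, shared] / counts[j, shared])
            log_ratios.append(np.log(r))
        factors[i] = np.exp(np.mean(log_ratios))  # geometric mean over partner samples
    return factors

# Toy check: the second sample is an exact 2x-depth copy of the first,
# except for one taxon unobserved in the shallower sample.
counts = np.array([[2, 4, 0, 6],
                   [4, 8, 2, 12]])
s = gmpr_size_factors(counts)
```

Restricting each pairwise comparison to taxa present in both samples is what makes the estimator robust to the zero-inflation that breaks median-of-ratios methods designed for RNA-Seq.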

  12. Trending in Probability of Collision Measurements via a Bayesian Zero-Inflated Beta Mixed Model

    NASA Technical Reports Server (NTRS)

    Vallejo, Jonathon; Hejduk, Matt; Stamey, James

    2015-01-01

    We investigate the performance of a generalized linear mixed model in predicting the Probabilities of Collision (Pc) for conjunction events. Specifically, we apply this model to the log10 transformation of these probabilities and argue that this transformation yields values that can be considered bounded in practice. Additionally, this bounded random variable, after scaling, is zero-inflated. Consequently, we model these values using the zero-inflated Beta distribution, and utilize the Bayesian paradigm and the mixed model framework to borrow information from past and current events. This provides a natural way to model the data and provides a basis for answering questions of interest, such as what is the likelihood of observing a probability of collision equal to the effective value of zero on a subsequent observation.
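The density being described is a mixture of a point mass at zero and a Beta distribution on (0, 1). A minimal sketch of its log-density (generic, not the paper's Bayesian hierarchical model; `pi` is the zero-inflation probability):

```python
import numpy as np
from scipy.stats import beta as beta_dist

def zi_beta_logpdf(x, pi, a, b):
    """Zero-inflated Beta: probability mass pi at exactly zero, else Beta(a, b)."""
    x = np.asarray(x, dtype=float)
    out = np.empty_like(x)
    zero = x == 0.0
    out[zero] = np.log(pi)                                        # structural zeros
    out[~zero] = np.log1p(-pi) + beta_dist.logpdf(x[~zero], a, b)  # continuous part
    return out

xs = np.array([0.0, 0.2, 0.5])
lp = zi_beta_logpdf(xs, pi=0.3, a=2.0, b=5.0)
```

Because the zero component is a discrete atom while the Beta part is a density, the two branches must be evaluated separately rather than summed.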

  13. Performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data.

    PubMed

    Yelland, Lisa N; Salter, Amy B; Ryan, Philip

    2011-10-15

    Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.
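For independent data, the method under evaluation pairs a log-link Poisson fit to a binary outcome with a robust "sandwich" variance estimate; exponentiated coefficients are then relative risks. A minimal numpy sketch on hypothetical data (the clustered GEE extension studied in the paper is not shown):

```python
import numpy as np

def modified_poisson(X, y, n_iter=50):
    """Poisson regression (log link) with robust sandwich standard errors.

    Applied to a binary y, exp(coefficients) estimate relative risks.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):                        # Newton / IRLS iterations
        mu = np.exp(X @ beta)
        beta += np.linalg.solve(X.T * mu @ X, X.T @ (y - mu))
    mu = np.exp(X @ beta)
    bread = np.linalg.inv(X.T * mu @ X)            # inverse model-based information
    meat = X.T * (y - mu) ** 2 @ X                 # empirical score variance
    robust_se = np.sqrt(np.diag(bread @ meat @ bread))
    return beta, robust_se

# Hypothetical exposed/unexposed groups with risks 0.2 vs 0.4, so true RR = 2.
x = np.repeat([0, 1], 5)
y = np.array([0, 0, 0, 0, 1, 0, 0, 0, 1, 1], dtype=float)
X = np.column_stack([np.ones(len(x)), x])
beta, se = modified_poisson(X, y)
rr = np.exp(beta[1])  # estimated relative risk for the exposure
```

The robust variance is what rescues the method: a Bernoulli outcome violates the Poisson mean-variance assumption, so the model-based standard errors would be wrong even though the point estimate of the relative risk is consistent.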

  14. Regular exercise and related factors in patients with Parkinson's disease: Applying zero-inflated negative binomial modeling of exercise count data.

    PubMed

    Lee, JuHee; Park, Chang Gi; Choi, Moonki

    2016-05-01

    This study was conducted to identify risk factors that influence regular exercise among patients with Parkinson's disease in Korea. Parkinson's disease is prevalent in the elderly, and may lead to a sedentary lifestyle. Exercise can enhance physical and psychological health. However, patients with Parkinson's disease are less likely to exercise than are other populations due to physical disability. A secondary data analysis and cross-sectional descriptive study were conducted. A convenience sample of 106 patients with Parkinson's disease was recruited at an outpatient neurology clinic of a tertiary hospital in Korea. Demographic characteristics, disease-related characteristics (including disease duration and motor symptoms), self-efficacy for exercise, balance, and exercise level were investigated. Negative binomial regression and zero-inflated negative binomial regression for exercise count data were utilized to determine factors involved in exercise. The mean age of participants was 65.85 ± 8.77 years, and the mean duration of Parkinson's disease was 7.23 ± 6.02 years. Most participants indicated that they engaged in regular exercise (80.19%). Approximately half of participants exercised at least 5 days per week for 30 min, as recommended (51.9%). Motor symptoms were a significant predictor of exercise in the count model, and self-efficacy for exercise was a significant predictor of exercise in the zero model. Severity of motor symptoms was related to frequency of exercise. Self-efficacy contributed to the probability of exercise. Symptom management and improvement of self-efficacy for exercise are important to encourage regular exercise in patients with Parkinson's disease. Copyright © 2015 Elsevier Inc. All rights reserved.
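As a simplified sketch of the zero-inflation idea behind the paper's count and zero models: a zero-inflated Poisson mixes structural zeros (probability pi) with a Poisson count process, and both parameters can be recovered by maximum likelihood. The data below are simulated for illustration, not the study's; an intercept-only model stands in for the covariate-based count and zero components:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, expit

def zip_negloglik(params, y):
    """Negative log-likelihood of an intercept-only zero-inflated Poisson."""
    pi, lam = expit(params[0]), np.exp(params[1])    # map to (0,1) and (0,inf)
    ll_zero = np.log(pi + (1 - pi) * np.exp(-lam))   # structural or sampling zero
    ll_pos = np.log(1 - pi) + y * np.log(lam) - lam - gammaln(y + 1)
    return -np.sum(np.where(y == 0, ll_zero, ll_pos))

# Simulate: 40% structural zeros, otherwise Poisson with mean 3.
rng = np.random.default_rng(42)
n = 1000
structural = rng.random(n) < 0.4
y = np.where(structural, 0, rng.poisson(3.0, n))

res = minimize(zip_negloglik, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
pi_hat, lam_hat = expit(res.x[0]), np.exp(res.x[1])
```

Fitting the two parts separately mirrors the paper's finding that different predictors can drive the zero model (whether a patient exercises at all) and the count model (how often).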

  15. Floral traits influencing plant attractiveness to three bee species: Consequences for plant reproductive success.

    PubMed

    Bauer, Austin A; Clayton, Murray K; Brunet, Johanne

    2017-05-01

    The ability to attract pollinators is crucial to plants that rely on insects for pollination. We contrasted the roles of floral display size and flower color in attracting three bee species and determined the relationships between plant attractiveness (number of pollinator visits) and seed set for each bee species. We recorded pollinator visits to plants, measured plant traits, and quantified plant reproductive success. A zero-inflated Poisson regression model indicated plant traits associated with pollinator attraction. It identified traits that increased the number of bee visits and traits that increased the probability of a plant not receiving any visits. Different components of floral display size were examined and two models of flower color contrasted. Relationships between plant attractiveness and seed set were determined using regression analyses. Plants with more racemes received more bee visits from all three bee species. Plants with few racemes were more likely not to receive any bee visits. The role of flower color varied with bee species and was influenced by the choice of the flower color model. Increasing bee visits increased seed set for all three bee species, with the steepest slope for leafcutting bees, followed by bumble bees, and finally honey bees. Floral display size influenced pollinator attraction more consistently than flower color. The same plant traits affected the probability of not being visited and the number of pollinator visits received. The impact of plant attractiveness on female reproductive success varied, together with pollinator effectiveness, by pollinator species. © 2017 Bauer et al. Published by the Botanical Society of America. This work is licensed under a Creative Commons public domain license (CC0 1.0).

  16. Persistently Auxetic Materials: Engineering the Poisson Ratio of 2D Self-Avoiding Membranes under Conditions of Non-Zero Anisotropic Strain.

    PubMed

    Ulissi, Zachary W; Govind Rajan, Ananth; Strano, Michael S

    2016-08-23

    Entropic surfaces represented by fluctuating two-dimensional (2D) membranes are predicted to have desirable mechanical properties when unstressed, including a negative Poisson's ratio ("auxetic" behavior). Herein, we present calculations of the strain-dependent Poisson ratio of self-avoiding 2D membranes demonstrating desirable auxetic properties over a range of mechanical strain. Finite-size membranes with unclamped boundary conditions have positive Poisson's ratio due to spontaneous non-zero mean curvature, which can be suppressed with an explicit bending rigidity in agreement with prior findings. Applying longitudinal strain along a singular axis to this system suppresses this mean curvature and the entropic out-of-plane fluctuations, resulting in a molecular-scale mechanism for realizing a negative Poisson's ratio above a critical strain, with values significantly more negative than the previously observed zero-strain limit for infinite sheets. We find that auxetic behavior persists over surprisingly high strains of more than 20% for the smallest surfaces, with desirable finite-size scaling producing surfaces with negative Poisson's ratio over a wide range of strains. These results promise the design of surfaces and composite materials with tunable Poisson's ratio by prestressing platelet inclusions or controlling the surface rigidity of a matrix of 2D materials.

  17. Work-related injuries involving a hand or fingers among union carpenters in Washington State, 1989 to 2008.

    PubMed

    Lipscomb, Hester J; Schoenfisch, Ashley; Cameron, Wilfrid

    2013-07-01

    We evaluated work-related injuries involving a hand or fingers and associated costs among a cohort of 24,830 carpenters between 1989 and 2008. Injury rates and rate ratios were calculated by using Poisson regression to explore higher risk on the basis of age, sex, time in the union, predominant work, and calendar time. Negative binomial regression was used to model dollars paid per claim after adjustment for inflation and discounting. Hand injuries accounted for 21.1% of reported injuries and 9.5% of paid lost time injuries. Older carpenters had proportionately more amputations, fractures, and multiple injuries, but their rates of these more severe injuries were not higher. Costs exceeded $21 million, a cost burden of $0.11 per hour worked. Older carpenters' higher proportion of serious injuries in the absence of higher rates likely reflects age-related reporting differences.

  18. Bayesian hierarchical modelling of continuous non‐negative longitudinal data with a spike at zero: An application to a study of birds visiting gardens in winter

    PubMed Central

    Buckland, Stephen T.; King, Ruth; Toms, Mike P.

    2015-01-01

    The development of methods for dealing with continuous data with a spike at zero has lagged behind those for overdispersed or zero‐inflated count data. We consider longitudinal ecological data corresponding to an annual average of 26 weekly maximum counts of birds, and are hence effectively continuous, bounded below by zero but also with a discrete mass at zero. We develop a Bayesian hierarchical Tweedie regression model that can directly accommodate the excess number of zeros common to this type of data, whilst accounting for both spatial and temporal correlation. Implementation of the model is conducted in a Markov chain Monte Carlo (MCMC) framework, using reversible jump MCMC to explore uncertainty across both parameter and model spaces. This regression modelling framework is very flexible and removes the need to make strong assumptions about mean‐variance relationships a priori. It can also directly account for the spike at zero, whilst being easily applicable to other types of data and other model formulations. Whilst a correlative study such as this cannot prove causation, our results suggest that an increase in an avian predator may have led to an overall decrease in the number of one of its prey species visiting garden feeding stations in the United Kingdom. This may reflect a change in behaviour of house sparrows to avoid feeding stations frequented by sparrowhawks, or a reduction in house sparrow population size as a result of sparrowhawk increase. PMID:25737026

  19. Childhood temperament predictors of adolescent physical activity.

    PubMed

    Janssen, James A; Kolacz, Jacek; Shanahan, Lilly; Gangel, Meghan J; Calkins, Susan D; Keane, Susan P; Wideman, Laurie

    2017-01-05

    Physical inactivity is a leading cause of mortality worldwide. Many patterns of physical activity (PA) involvement are established early in life. To date, the role of easily identifiable early-life individual predictors of PA, such as childhood temperament, remains relatively unexplored. Here, we tested whether childhood temperamental activity level, high intensity pleasure, low intensity pleasure, and surgency predicted engagement in PA patterns 11 years later in adolescence. Data came from a longitudinal community study (N = 206 participants, 53% females, 70% Caucasian). Parents reported their children's temperamental characteristics using the Child Behavior Questionnaire (CBQ) when children were 4 & 5 years old. Approximately 11 years later, adolescents completed self-reports of PA using the Godin Leisure Time Exercise Questionnaire and the Youth Risk Behavior Survey. Ordered logistic regression, ordinary least squares linear regression, and Zero-inflated Poisson regression models were used to predict adolescent PA from childhood temperament. Race, socioeconomic status, and adolescent body mass index were used as covariates. Males with greater childhood temperamental activity level engaged in greater adolescent PA volume (B = .42, SE = .13) and a 1 SD difference in childhood temperamental activity level predicted 29.7% more strenuous adolescent PA per week. Males' high intensity pleasure predicted higher adolescent PA volume (B = .28, SE = .12). Males' surgency positively predicted more frequent PA activity (B = .47, SE = .23, OR = 1.61, 95% CI: 1.02, 2.54) and PA volume (B = .31, SE = .12). No predictions from females' childhood temperament to later PA engagement were identified. Childhood temperament may influence the formation of later PA habits, particularly in males. Boys with high temperamental activity level, high intensity pleasure, and surgency may directly seek out pastimes that involve PA. 
Indirectly, temperament may also influence caregivers' perceptions of optimal activity choices for children. Understanding how temperament influences the development of PA patterns has the potential to inform efforts aimed at promoting long-term PA engagement and physical health.

  20. Population trends, bend use relative to available habitat and within-river-bend habitat use of eight indicator species of Missouri and Lower Kansas River benthic fishes: 15 years after baseline assessment

    USGS Publications Warehouse

    Wildhaber, Mark L.; Yang, Wen-Hsi; Arab, Ali

    2016-01-01

    A baseline assessment of the Missouri River fish community and species-specific habitat use patterns conducted from 1996 to 1998 provided the first comprehensive analysis of Missouri River benthic fish population trends and habitat use in the Missouri and Lower Yellowstone rivers, exclusive of reservoirs, and provided the foundation for the present Pallid Sturgeon Population Assessment Program (PSPAP). Data used in such studies are frequently zero-inflated. To address this issue, the zero-inflated Poisson (ZIP) model was applied. This follow-up study is based on PSPAP data collected up to 15 years later along with new understanding of how habitat characteristics among and within bends affect habitat use of fish species targeted by PSPAP, including pallid sturgeon. This work demonstrated that a large-scale, large-river, PSPAP-type monitoring program can be an effective tool for assessing population trends and habitat usage of large-river fish species. Using multiple gears, PSPAP was effective in monitoring shovelnose and pallid sturgeons, sicklefin, shoal and sturgeon chubs, sand shiner, blue sucker and sauger. For all species, the relationship between environmental variables and relative abundance differed, somewhat, among river segments suggesting the importance of the overall conditions of Upper and Middle Missouri River and Lower Missouri and Kansas rivers on the habitat usage patterns exhibited. Shoal and sicklefin chubs exhibited many similar habitat usage patterns; blue sucker and shovelnose sturgeon also shared similar responses. For pallid sturgeon, the primary focus of PSPAP, relative abundance tended to increase in Upper and Middle Missouri River paralleling stocking efforts, whereas no evidence of an increasing relative abundance was found in the Lower Missouri River despite stocking.

  1. Bayesian hierarchical modelling of continuous non-negative longitudinal data with a spike at zero: An application to a study of birds visiting gardens in winter.

    PubMed

    Swallow, Ben; Buckland, Stephen T; King, Ruth; Toms, Mike P

    2016-03-01

    The development of methods for dealing with continuous data with a spike at zero has lagged behind those for overdispersed or zero-inflated count data. We consider longitudinal ecological data corresponding to an annual average of 26 weekly maximum counts of birds, and are hence effectively continuous, bounded below by zero but also with a discrete mass at zero. We develop a Bayesian hierarchical Tweedie regression model that can directly accommodate the excess number of zeros common to this type of data, whilst accounting for both spatial and temporal correlation. Implementation of the model is conducted in a Markov chain Monte Carlo (MCMC) framework, using reversible jump MCMC to explore uncertainty across both parameter and model spaces. This regression modelling framework is very flexible and removes the need to make strong assumptions about mean-variance relationships a priori. It can also directly account for the spike at zero, whilst being easily applicable to other types of data and other model formulations. Whilst a correlative study such as this cannot prove causation, our results suggest that an increase in an avian predator may have led to an overall decrease in the number of one of its prey species visiting garden feeding stations in the United Kingdom. This may reflect a change in behaviour of house sparrows to avoid feeding stations frequented by sparrowhawks, or a reduction in house sparrow population size as a result of sparrowhawk increase. © 2015 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Current hormonal contraceptive use predicts female extra-pair and dyadic sexual behavior: evidence based on Czech National Survey data.

    PubMed

    Klapilová, Kateřina; Cobey, Kelly D; Wells, Timothy; Roberts, S Craig; Weiss, Petr; Havlíček, Jan

    2014-01-10

    Data from 1155 Czech women (493 using oral contraception, 662 non-users), obtained from the Czech National Survey of Sexual Behavior, were used to investigate evolutionary-based hypotheses concerning the predictive value of current oral contraceptive (OC) use on extra-pair and dyadic (in-pair) sexual behavior of coupled women. Specifically, the aim was to determine whether current OC use was associated with lower extra-pair and higher in-pair sexual interest and behavior, because OC use suppresses cyclical shifts in mating psychology that occur in normally cycling women. Zero-inflated Poisson (ZIP) regression and negative binomial models were used to test associations between OC use and these sexual measures, controlling for other relevant predictors (e.g., age, parity, in-pair sexual satisfaction, relationship length). The overall incidence of having had an extra-pair partner or one-night stand in the previous year was not related to current OC use (the majority of the sample had not). However, among the women who had engaged in extra-pair sexual behavior, OC users had fewer one-night stands and tended to have fewer partners than non-users. OC users also had more frequent dyadic intercourse than non-users, potentially indicating higher commitment to their current relationship. These results suggest that suppression of fertility through OC use may alter important aspects of female sexual behavior, with potential implications for relationship functioning and stability.

  3. Parallel Demand-Withdraw Processes in Family Therapy for Adolescent Drug Abuse

    PubMed Central

    Rynes, Kristina N.; Rohrbaugh, Michael J.; Lebensohn-Chialvo, Florencia; Shoham, Varda

    2013-01-01

    Isomorphism, or parallel process, occurs in family therapy when patterns of therapist-client interaction replicate problematic interaction patterns within the family. This study investigated parallel demand-withdraw processes in Brief Strategic Family Therapy (BSFT) for adolescent drug abuse, hypothesizing that therapist-demand/adolescent-withdraw interaction (TD/AW) cycles observed early in treatment would predict poor adolescent outcomes at follow-up for families who exhibited entrenched parent-demand/adolescent-withdraw interaction (PD/AW) before treatment began. Participants were 91 families who received at least 4 sessions of BSFT in a multi-site clinical trial on adolescent drug abuse (Robbins et al., 2011). Prior to receiving therapy, families completed videotaped family interaction tasks from which trained observers coded PD/AW. Another team of raters coded TD/AW during two early BSFT sessions. The main dependent variable was the number of drug use days that adolescents reported in Timeline Follow-Back interviews 7 to 12 months after family therapy began. Zero-inflated Poisson (ZIP) regression analyses supported the main hypothesis, showing that PD/AW and TD/AW interacted to predict adolescent drug use at follow-up. For adolescents in high PD/AW families, higher levels of TD/AW predicted significant increases in drug use at follow-up, whereas for low PD/AW families, TD/AW and follow-up drug use were unrelated. Results suggest that attending to parallel demand-withdraw processes in parent/adolescent and therapist/adolescent dyads may be useful in family therapy for substance-using adolescents. PMID:23438248

  4. Psychological Aggression, Physical Aggression, and Injury in Nonpartner Relationships Among Men and Women in Treatment for Substance-Use Disorders*

    PubMed Central

    Murray, Regan L.; Chermack, Stephen T.; Walton, Maureen A.; Winters, Jamie; Booth, Brenda M.; Blow, Frederic C.

    2008-01-01

    Objective: This study focused on the prevalence and predictors of psychological aggression, physical aggression, and injury rates in nonintimate partner relationships in a substance-use disorder treatment sample. Method: The sample included 489 (76% men, 24% women) participants who completed screening measures for inclusion in a randomized controlled trial for an aggression-prevention treatment. Primary outcome measures included rates of past-year psychological aggression, physical aggression, and injury (both from the participant to nonpartners and from nonpartners to the participant). Potential predictors included individual factors (e.g., age, gender), developmental factors (e.g., family history of drug use, childhood physical abuse), and recent factors (e.g., depression, cocaine use). Results: Rates of participant-to-nonpartner psychological aggression (83%), physical aggression (61%), and injury (47%) were high, as were rates of nonpartner-to-participant aggression. Bivariate analyses revealed significant relationships between the aggression outcomes and most of the individual, developmental, and recent factors. However, multivariate analyses (zero-inflated Poisson regression) revealed that age, treatment status, current symptoms of depression, heavy periods of drinking, and cocaine use were related most frequently to the occurrence of aggression to and from nonpartners. Conclusions: Nonpartner aggression may be as common within a substance-use disorder sample as partner aggression, and it is associated with heavy drinking episodes, cocaine use, and depressive symptoms. The findings highlight the need for the development of effective interventions addressing violence in nonpartner relationships. PMID:18925348

  5. Family composition and children's dental health behavior: evidence from Germany.

    PubMed

    Listl, Stefan

    2011-01-01

    To assess whether children's dental health behavior differs between family compositions of either natural parents or birth mothers together with stepfathers. We use data from the German Health Interview and Examination Survey for Children and Adolescents (KiGGS) public use file. This is the first nationally representative sample on child health in Germany and particularly contains variables for dental attendance, tooth care, and eating behavior of 13,904 children below 14 years of age. A series of zero-inflated Poisson, ordinary least squares, binary, and ordered logistic regression models was set up in order to identify whether family composition is a significant explanatory variable for children's dental health behavior. Family composition turned out to be a significant parameter for some aspects of children's dental health behavior. Specifically, children who grow up in families with a birth mother and a stepfather have only half the probability of accessing dental services but, once seeking treatment, the number of visits is significantly higher in comparison with children raised by their natural parents. Moreover, children growing up in such a patchwork family setting consume a higher amount of sugary foods and drinks. This appears mainly attributable to differential consumption habits for juices, cookies, and chocolate. Children who grow up in settings other than the nuclear family may develop different dental health behaviors than children who grow up with both natural parents, albeit more research is needed to identify the extent to which such behavioral changes lead to variations in caries occurrence.

  6. Personality disorder risk factors for suicide attempts over 10 years of follow-up.

    PubMed

    Ansell, Emily B; Wright, Aidan G C; Markowitz, John C; Sanislow, Charles A; Hopwood, Christopher J; Zanarini, Mary C; Yen, Shirley; Pinto, Anthony; McGlashan, Thomas H; Grilo, Carlos M

    2015-04-01

    Identifying personality disorder (PD) risk factors for suicide attempts is an important consideration for research and clinical care alike. However, most prior research has focused on single PDs or categorical PD diagnoses without considering unique influences of different PDs or of severity (sum) of PD criteria on the risk for suicide-related outcomes. This has usually been done with cross-sectional or retrospective assessment methods. Rarely are dimensional models of PDs examined in longitudinal, naturalistic prospective designs. In addition, it is important to consider divergent risk factors in predicting the risk of ever making a suicide attempt versus the risk of making an increasing number of attempts within the same model. This study examined 431 participants who were followed for 10 years in the Collaborative Longitudinal Personality Disorders Study. Baseline assessments of personality disorder criteria were summed as dimensional counts of personality pathology and examined as predictors of suicide attempts reported at annual interviews throughout the 10-year follow-up period. We used univariate and multivariate zero-inflated Poisson regression models to simultaneously evaluate PD risk factors for ever attempting suicide and for increasing numbers of attempts among attempters. Consistent with prior research, borderline PD was uniquely associated with ever attempting. However, only narcissistic PD was uniquely associated with an increasing number of attempts. These findings highlight the relevance of both borderline and narcissistic personality pathology as unique contributors to suicide-related outcomes. (c) 2015 APA, all rights reserved.

  7. The effect of a major cigarette price change on smoking behavior in california: a zero-inflated negative binomial model.

    PubMed

    Sheu, Mei-Ling; Hu, Teh-Wei; Keeler, Theodore E; Ong, Michael; Sung, Hai-Yen

    2004-08-01

    The objective of this paper is to determine the price sensitivity of smokers in their consumption of cigarettes, using evidence from a major increase in California cigarette prices due to Proposition 10 and the Tobacco Settlement. The study sample consists of individual survey data from the Behavioral Risk Factor Survey (BRFS) and price data from the Bureau of Labor Statistics between 1996 and 1999. A zero-inflated negative binomial (ZINB) regression model was applied for the statistical analysis. The statistical model showed that price did not have an effect on reducing the estimated prevalence of smoking. However, it indicated that among smokers the price elasticity was -0.46 and statistically significant. Since smoking prevalence is significantly lower than it was a decade ago, price increases are becoming less effective as an inducement for hard-core smokers to quit, although they may respond by decreasing consumption. For those who only smoke occasionally (many of them being young adults) price increases alone may not be an effective inducement to quit smoking. Additional underlying behavioral factors need to be identified so that more effective anti-smoking strategies can be developed.

  8. Modified Regression Correlation Coefficient for Poisson Regression Model

    NASA Astrophysics Data System (ADS)

    Kaengthong, Nattacha; Domthong, Uthumporn

    2017-09-01

    This study focuses on indicators of predictive power for the generalized linear model (GLM), which are widely used but often subject to restrictions. We are interested in the regression correlation coefficient for a Poisson regression model. This is a measure of predictive power defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables [E(Y|X)] for the Poisson regression model, where the dependent variable is Poisson distributed. The purpose of this research was to modify the regression correlation coefficient for the Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables and of multicollinearity among the independent variables. The results show that the proposed regression correlation coefficient is better than the traditional one in terms of bias and root mean square error (RMSE).

  9. Poisson's ratio from polarization of acoustic zero-group velocity Lamb mode.

    PubMed

    Baggens, Oskar; Ryden, Nils

    2015-07-01

    Poisson's ratio of an isotropic and free elastic plate is estimated from the polarization of the first symmetric acoustic zero-group velocity Lamb mode. This polarization is interpreted as the ratio of the absolute amplitudes of the surface normal and surface in-plane components of the acoustic mode. Results from the evaluation of simulated datasets indicate that the presented relation, which links the polarization and Poisson's ratio, can be extended to incorporate plates with material damping. Furthermore, the proposed application of the polarization is demonstrated in a practical field case, where an increased accuracy of estimated nominal thickness is obtained.

  10. Modelling infant mortality rate in Central Java, Indonesia using the generalized Poisson regression method

    NASA Astrophysics Data System (ADS)

    Prahutama, Alan; Sudarno

    2018-05-01

    The infant mortality rate is the number of deaths under one year of age occurring among the live births in a given geographical area during a given year, per 1,000 live births occurring among the population of the given geographical area during the same year. This problem needs to be addressed because it is an important element of a country’s economic development. A high infant mortality rate will disrupt the stability of a country as it relates to the sustainability of the population in the country. One regression model that can be used to analyze the relationship between a discrete dependent variable Y and independent variables X is the Poisson regression model. Regression models commonly used for discrete dependent variables include Poisson regression, negative binomial regression, and generalized Poisson regression. In this research, generalized Poisson regression modeling gives a better AIC value than Poisson regression. The most significant variable is the number of health facilities (X1), while the variable that gives the most influence to the infant mortality rate is average breastfeeding (X9).

  11. [Application of negative binomial regression and modified Poisson regression in the research of risk factors for injury frequency].

    PubMed

    Cao, Qingqing; Wu, Zhenqiang; Sun, Ying; Wang, Tiezhu; Han, Tengwei; Gu, Chaomei; Sun, Yehuan

    2011-11-01

    To explore the application of negative binomial regression and modified Poisson regression in analyzing the influential factors for injury frequency and the risk factors leading to increased injury frequency. 2917 primary and secondary school students were selected from Hefei by a cluster random sampling method and surveyed by questionnaire. The count data on event-based injuries were used to fit modified Poisson regression and negative binomial regression models. The risk factors associated with increased unintentional injury frequency among juvenile students were explored, so as to probe the efficiency of these two models in studying the influential factors for injury frequency. The Poisson model exhibited over-dispersion (P < 0.0001) based on the Lagrange multiplier test. The over-dispersed data were therefore better fitted by the modified Poisson regression and negative binomial regression models. Both showed that male gender, younger age, a father working outside the hometown, a guardian educated above junior high school level, and smoking might result in higher injury frequencies. For clustered frequency data on injury events, both modified Poisson regression analysis and negative binomial regression analysis can be used. However, based on our data, the modified Poisson regression fitted better, and this model could give a more accurate interpretation of the relevant factors affecting the frequency of injury.

  12. Modeling forest fire occurrences using count-data mixed models in Qiannan autonomous prefecture of Guizhou province in China.

    PubMed

    Xiao, Yundan; Zhang, Xiongqing; Ji, Ping

    2015-01-01

    Forest fires can cause catastrophic damage to natural resources and can also bring serious economic and social impacts. Meteorological factors play a critical role in establishing conditions favorable for a forest fire. Effective prediction of forest fire occurrences could prevent or minimize losses. This paper uses count data models to analyze fire occurrence data, which are likely to be overdispersed and frequently contain an excess of zero counts (no fire occurrence). Such data have commonly been analyzed using count data models such as the Poisson model, negative binomial (NB) model, zero-inflated models, and hurdle models. The data used in this paper were collected from Qiannan autonomous prefecture of Guizhou province in China. Using the fire occurrence data from January to April (spring fire season) for the years 1996 through 2007, we introduced random effects to the count data models. In this study, the results indicated that the prediction achieved through the NB model provided a more compelling and credible inferential basis for fitting actual forest fire occurrence, and the mixed-effects model performed better than the corresponding fixed-effects model in forest fire forecasting. Besides, among all meteorological factors, we found that relative humidity and wind speed are highly correlated with fire occurrence.

  13. Modeling Forest Fire Occurrences Using Count-Data Mixed Models in Qiannan Autonomous Prefecture of Guizhou Province in China

    PubMed Central

    Ji, Ping

    2015-01-01

    Forest fires can cause catastrophic damage to natural resources and can also bring serious economic and social impacts. Meteorological factors play a critical role in establishing conditions favorable for a forest fire. Effective prediction of forest fire occurrences could prevent or minimize losses. This paper uses count data models to analyze fire occurrence data, which are likely to be overdispersed and frequently contain an excess of zero counts (no fire occurrence). Such data have commonly been analyzed using count data models such as the Poisson model, negative binomial (NB) model, zero-inflated models, and hurdle models. The data used in this paper were collected from Qiannan autonomous prefecture of Guizhou province in China. Using the fire occurrence data from January to April (spring fire season) for the years 1996 through 2007, we introduced random effects to the count data models. In this study, the results indicated that the prediction achieved through the NB model provided a more compelling and credible inferential basis for fitting actual forest fire occurrence, and the mixed-effects model performed better than the corresponding fixed-effects model in forest fire forecasting. Besides, among all meteorological factors, we found that relative humidity and wind speed are highly correlated with fire occurrence. PMID:25790309

  14. An application of a zero-inflated lifetime distribution with multiple and incomplete data sources

    DOE PAGES

    Hamada, M. S.; Margevicius, K. J.

    2016-02-11

    In this study, we analyze data sampled from a population of parts in which an associated anomaly can occur at assembly or after assembly. Using a zero-inflated lifetime distribution to fit left-censored and right-censored data as well as data from a supplementary sample, we make predictions about the proportion of the population with anomalies today and in the future. Goodness-of-fit is also addressed.

  15. Frequency distribution of Echinococcus multilocularis and other helminths of foxes in Kyrgyzstan

    PubMed Central

    Ziadinov, I.; Deplazes, P.; Mathis, A.; Mutunova, B.; Abdykerimov, K.; Nurgaziev, R.; Torgerson, P.R.

    2010-01-01

    Echinococcosis is a major emerging zoonosis in central Asia. A study of the helminth fauna of foxes from Naryn Oblast in central Kyrgyzstan was undertaken to investigate the abundance of Echinococcus multilocularis in a district where a high prevalence of this parasite had previously been detected in dogs. A total of 151 foxes (Vulpes vulpes) were investigated in a necropsy study. Of these, 96 (64%) were infected with E. multilocularis with a mean abundance of 8669 parasites per fox. This indicates that red foxes are a major definitive host of E. multilocularis in this country. This also demonstrates that the abundance and prevalence of E. multilocularis in the natural definitive host are likely to be high in geographical regions where there is a concomitant high prevalence in alternative definitive hosts such as dogs. In addition, Mesocestoides spp., Dipylidium caninum, Taenia spp., Toxocara canis, Toxascaris leonina, Capillaria and Acanthocephala spp. were found in 99 (66%), 50 (33%), 48 (32%), 46 (30%), 9 (6%), 34 (23%) and 2 (1%) of foxes, respectively. The prevalence but not the abundance of E. multilocularis decreased with age. The abundance of Dipylidium caninum also decreased with age. The frequency distribution of E. multilocularis and Mesocestoides spp. followed a zero inflated negative binomial distribution, whilst all other helminths had a negative binomial distribution. This demonstrates that the frequency distribution of positive counts and not just the frequency of zeros in the data set can determine if a zero inflated or non-zero inflated model is more appropriate. This is because the prevalences of E. multilocularis and Mesocestoides spp. were the highest (and hence had fewest zero counts) yet the parasite distribution nevertheless gave a better fit to the zero inflated models. PMID:20434845
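
The comparison described above, fitting a negative binomial versus a zero-inflated negative binomial to a raw frequency distribution of parasite counts, can be sketched with intercept-only models (simulated burdens, not the Kyrgyzstan data; convergence settings are illustrative):

```python
import numpy as np
from statsmodels.discrete.discrete_model import NegativeBinomial
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(11)
n = 1500
# ZINB-type burdens: 40% structural zeros, otherwise a gamma-Poisson mixture
lam = rng.gamma(shape=2.0, scale=5.0, size=n)
y = np.where(rng.random(n) < 0.4, 0, rng.poisson(lam))

X = np.ones((n, 1))   # intercept-only: a pure frequency-distribution fit
aic_nb = NegativeBinomial(y, X).fit(maxiter=200, disp=False).aic
aic_zinb = ZeroInflatedNegativeBinomialP(y, X, exog_infl=X).fit(maxiter=500, disp=False).aic
print(aic_nb, aic_zinb)
```

As the abstract notes, what separates the models is the shape of the positive counts relative to the zero class, not merely the number of zeros.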

  16. Are Negative Peer Influences Domain Specific? Examining the Influence of Peers and Parents on Externalizing and Drug Use Behaviors.

    PubMed

    Cox, Ronald B; Criss, Michael M; Harrist, Amanda W; Zapata-Roblyer, Martha

    2017-10-01

    Most studies tend to characterize peer influences as either positive or negative. In a sample of 1815 youth from 14 different schools in Caracas, Venezuela, we explored how two types of peer affiliations (i.e., deviant and drug-using peers) differentially mediated the paths from positive parenting to youth's externalizing behavior and licit and illicit drug use. We used Zero Inflated Poisson models to test the probability of use and the extent of use during the past 12 months. Results suggested that peer influences are domain specific among Venezuelan youth. That is, deviant peer affiliations mediated the path from positive parenting to youth externalizing behaviors, and peer drug-using affiliations mediated the paths to the drug use outcomes. Mediation effects were partial, suggesting that parenting explained unique variance in the outcomes after accounting for both peer variables, gender, and age. We discuss implications for the development of screening tools and for prevention interventions targeting adolescents from different cultures.

  17. Do Stress Trajectories Predict Mortality in Older Men? Longitudinal Findings from the VA Normative Aging Study

    PubMed Central

    Aldwin, Carolyn M.; Molitor, Nuoo-Ting; Avron, Spiro; Levenson, Michael R.; Molitor, John; Igarashi, Heidi

    2011-01-01

    We examined long-term patterns of stressful life events (SLE) and their impact on mortality contrasting two theoretical models: allostatic load (linear relationship) and hormesis (inverted U relationship) in 1443 NAS men (aged 41–87 in 1985; M = 60.30, SD = 7.3) with at least two reports of SLEs over 18 years (total observations = 7,634). Using a zero-inflated Poisson growth mixture model, we identified four patterns of SLE trajectories, three showing linear decreases over time with low, medium, and high intercepts, respectively, and one an inverted U, peaking at age 70. Repeating the analysis omitting two health-related SLEs yielded only the first three linear patterns. Compared to the low-stress group, both the moderate and the high-stress groups showed excess mortality, controlling for demographics and health behavior habits, HRs = 1.42 and 1.37, ps <.01 and <.05. The relationship between stress trajectories and mortality was complex and not easily explained by either theoretical model. PMID:21961066

  18. Three-dimensionally bonded spongy graphene material with super compressive elasticity and near-zero Poisson's ratio.

    PubMed

    Wu, Yingpeng; Yi, Ningbo; Huang, Lu; Zhang, Tengfei; Fang, Shaoli; Chang, Huicong; Li, Na; Oh, Jiyoung; Lee, Jae Ah; Kozlov, Mikhail; Chipara, Alin C; Terrones, Humberto; Xiao, Peishuang; Long, Guankui; Huang, Yi; Zhang, Fan; Zhang, Long; Lepró, Xavier; Haines, Carter; Lima, Márcio Dias; Lopez, Nestor Perea; Rajukumar, Lakshmy P; Elias, Ana L; Feng, Simin; Kim, Seon Jeong; Narayanan, N T; Ajayan, Pulickel M; Terrones, Mauricio; Aliev, Ali; Chu, Pengfei; Zhang, Zhong; Baughman, Ray H; Chen, Yongsheng

    2015-01-20

    It is a challenge to fabricate graphene bulk materials with properties arising from the nature of individual graphene sheets, and which assemble into monolithic three-dimensional structures. Here we report the scalable self-assembly of randomly oriented graphene sheets into additive-free, essentially homogenous graphene sponge materials that provide a combination of both cork-like and rubber-like properties. These graphene sponges, with densities similar to air, display Poisson's ratios in all directions that are near-zero and largely strain-independent during reversible compression to giant strains. And at the same time, they function as enthalpic rubbers, which can recover up to 98% compression in air and 90% in liquids, and operate between -196 and 900 °C. Furthermore, these sponges provide reversible liquid absorption for hundreds of cycles and then discharge it within seconds, while still providing an effective near-zero Poisson's ratio.

  19. Treatment of singularities in cracked bodies

    NASA Technical Reports Server (NTRS)

    Shivakumar, K. N.; Raju, I. S.

    1990-01-01

    Three-dimensional finite-element analyses of middle-crack tension (M-T) and bend specimens subjected to mode I loadings were performed to study the stress singularity along the crack front. The specimen was modeled using 20-node isoparametric elements. The displacements and stresses from the analysis were used to estimate the power of singularities using a log-log regression analysis along the crack front. The analyses showed that finite-sized cracked bodies have two singular stress fields of the form ρ = C_o(θ, z) r^(-1/2) + D_o(θ, φ) R^(λ_ρ). The first term is the cylindrical singularity with the power -1/2 and is dominant over the middle 96 pct (for Poisson's ratio = 0.3) of the crack front and becomes nearly zero at the free surface. The second singularity is a vertex singularity with the vertex point located at the intersection of the crack front and the free surface. The second term is dominant at the free surface and becomes nearly zero away from the boundary layer. The thickness of the boundary layer depends on Poisson's ratio of the material and is independent of the specimen type. The thickness of the boundary layer varied from 0 pct to about 5 pct of the total specimen thickness as Poisson's ratio varied from 0.0 to 0.45. Because there are two singular stress fields near the free surface, the strain energy release rate (G) is an appropriate parameter to measure the severity of the crack.

  20. Treatment of singularities in cracked bodies

    NASA Technical Reports Server (NTRS)

    Shivakumar, K. N.; Raju, I. S.

    1989-01-01

    Three-dimensional finite-element analyses of middle-crack tension (M-T) and bend specimens subjected to mode I loadings were performed to study the stress singularity along the crack front. The specimen was modeled using 20-node isoparametric elements. The displacements and stresses from the analysis were used to estimate the power of singularities using a log-log regression analysis along the crack front. The analyses showed that finite-sized cracked bodies have two singular stress fields of the form ρ = C_o(θ, z) r^(-1/2) + D_o(θ, φ) R^(λ_ρ). The first term is the cylindrical singularity with the power -1/2 and is dominant over the middle 96 pct (for Poisson's ratio = 0.3) of the crack front and becomes nearly zero at the free surface. The second singularity is a vertex singularity with the vertex point located at the intersection of the crack front and the free surface. The second term is dominant at the free surface and becomes nearly zero away from the boundary layer. The thickness of the boundary layer depends on Poisson's ratio of the material and is independent of the specimen type. The thickness of the boundary layer varied from 0 pct to about 5 pct of the total specimen thickness as Poisson's ratio varied from 0.0 to 0.45. Because there are two singular stress fields near the free surface, the strain energy release rate (G) is an appropriate parameter to measure the severity of the crack.

  1. A generalized right truncated bivariate Poisson regression model with applications to health data.

    PubMed

    Islam, M Ataharul; Chowdhury, Rafiqul I

    2017-01-01

    A generalized right truncated bivariate Poisson regression model is proposed in this paper. Estimation and tests for goodness of fit and over- or under-dispersion are illustrated for both untruncated and right truncated bivariate Poisson regression models using a marginal-conditional approach. Estimation and test procedures are illustrated for bivariate Poisson regression models with applications to Health and Retirement Study data on the number of health conditions and the number of health care services utilized. The proposed test statistics are easy to compute, and it is evident from the results that the models fit the data very well. A comparison between the right truncated and untruncated bivariate Poisson regression models using the test for nonnested models clearly shows that the truncated model performs significantly better than the untruncated model.

  2. A generalized right truncated bivariate Poisson regression model with applications to health data

    PubMed Central

    Islam, M. Ataharul; Chowdhury, Rafiqul I.

    2017-01-01

    A generalized right truncated bivariate Poisson regression model is proposed in this paper. Estimation and tests for goodness of fit and over- or under-dispersion are illustrated for both untruncated and right truncated bivariate Poisson regression models using a marginal-conditional approach. Estimation and test procedures are illustrated for bivariate Poisson regression models with applications to Health and Retirement Study data on the number of health conditions and the number of health care services utilized. The proposed test statistics are easy to compute, and it is evident from the results that the models fit the data very well. A comparison between the right truncated and untruncated bivariate Poisson regression models using the test for nonnested models clearly shows that the truncated model performs significantly better than the untruncated model. PMID:28586344

  3. Multinomial model and zero-inflated gamma model to study time spent on leisure time physical activity: an example of ELSA-Brasil.

    PubMed

    Nobre, Aline Araújo; Carvalho, Marilia Sá; Griep, Rosane Härter; Fonseca, Maria de Jesus Mendes da; Melo, Enirtes Caetano Prates; Santos, Itamar de Souza; Chor, Dora

    2017-08-17

    To compare two methodological approaches: the multinomial model and the zero-inflated gamma model, evaluating the factors associated with the practice and amount of time spent on leisure time physical activity. Data collected from 14,823 baseline participants in the Longitudinal Study of Adult Health (ELSA-Brasil - Estudo Longitudinal de Saúde do Adulto) have been analysed. Regular leisure time physical activity has been measured using the leisure time physical activity module of the International Physical Activity Questionnaire. The explanatory variables considered were gender, age, education level, and annual per capita family income. The main advantage of the zero-inflated gamma model over the multinomial model is that it estimates mean time (minutes per week) spent on leisure time physical activity. For example, on average, men spent 28 minutes/week longer on leisure time physical activity than women did. The most sedentary groups were young women with low education level and income. The zero-inflated gamma model, which is rarely used in epidemiological studies, can give more appropriate answers in several situations. In our case, we have obtained important information on the main determinants of the duration of leisure time physical activity. This information can help guide efforts towards the most vulnerable groups since physical inactivity is associated with different diseases and even premature death.

  4. Background stratified Poisson regression analysis of cohort data.

    PubMed

    Richardson, David B; Langholz, Bryan

    2012-03-01

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models.
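
    The conditioning step described above can be sketched for a single stratum (hypothetical function names, not the authors' software): given the stratum total, independent Poisson cell counts become multinomial with probabilities proportional to the modeled relative rates, so the stratum-specific baseline parameter cancels from the likelihood:

```python
import math

def stratum_conditional_loglik(counts, rel_rates):
    """Conditional Poisson log-likelihood contribution of one
    background stratum: multinomial in the relative rates, with the
    stratum baseline eliminated by conditioning on the total."""
    total = sum(rel_rates)
    return sum(y * math.log(r / total) for y, r in zip(counts, rel_rates))

def excess_rr_loglik(beta, counts, doses):
    """Stratum contribution under a linear excess relative risk model,
    rate = baseline * (1 + beta * dose), a common form in radiation
    epidemiology; the baseline drops out of the conditional term."""
    return stratum_conditional_loglik(counts, [1.0 + beta * d for d in doses])
```

Because the contribution depends only on rate ratios, maximizing it reproduces the point estimates of unconditional Poisson regression with explicit stratum indicator terms, as the record notes.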

  5. Socioeconomic differences in alcohol-related risk-taking behaviours.

    PubMed

    Livingston, Michael

    2014-11-01

    There is substantial research showing that low socioeconomic position is a predictor of negative outcomes from alcohol consumption, while alcohol consumption itself does not exhibit a strong social gradient. This study aims to examine socioeconomic differences in self-reported alcohol-related risk-taking behaviour to explore whether differences in risk-taking while drinking may explain some of the socioeconomic disparities in alcohol-related harm. Cross-sectional data from current drinkers (n = 21 452) in the 2010 wave of the Australian National Drug Strategy Household Survey were used. Ten items on risk-taking behaviour while drinking were combined into two risk scores, and zero-inflated Poisson regression was used to assess the relationship between socioeconomic position and risk-taking while controlling for age, sex and alcohol consumption. Socioeconomically advantaged respondents reported substantially higher rates of alcohol-related hazardous behaviour than socioeconomically disadvantaged respondents. Controlling for age, sex, volume of drinking and frequency of heavy drinking, respondents living in the most advantaged quintile of neighbourhoods reported significantly higher rates of hazardous behaviour than those in the least advantaged quintile. A similar pattern was evident for household income. Socioeconomically advantaged Australians engage in alcohol-related risky behaviour at higher rates than more disadvantaged Australians even with alcohol consumption controlled. The significant socioeconomic disparities in negative consequences linked to alcohol consumption cannot in this instance be explained via differences in behaviour while drinking. Other factors not directly related to alcohol consumption may be responsible for health inequalities in outcomes with significant alcohol involvement. © 2014 Australasian Professional Society on Alcohol and other Drugs.
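
    The zero-inflated Poisson model used here (and in several records below) mixes a point mass at zero with an ordinary Poisson count; a minimal pmf sketch, with illustrative parameter names:

```python
import math

def zip_pmf(y, lam, pi):
    """Zero-inflated Poisson pmf: with probability pi the count is a
    structural zero, otherwise it is drawn from Poisson(lam)."""
    poisson = math.exp(-lam) * lam ** y / math.factorial(y)
    return pi * (y == 0) + (1 - pi) * poisson
```

Because P(Y=0) = pi + (1 - pi) * exp(-lam) always exceeds the plain Poisson value exp(-lam) for pi > 0, the model accommodates the excess zeros that motivate its use.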

  6. Association of Maternal Depressive Symptoms and Offspring Physical Health in Low-Income Families.

    PubMed

    Thompson, Sarah M; Jiang, Lu; Hammen, Constance; Whaley, Shannon E

    2018-06-01

    Objectives The present study sought to examine the association between maternal depressive symptoms and characteristics of offspring physical health, including health status, health behaviors, and healthcare utilization, among low-income families. Maternal engagement was explored as a mediator of observed effects. Methods Cross-sectional survey data from a community sample of 4589 low-income women and their preschool-age children participating in the WIC program in Los Angeles County were analyzed using logistic, Poisson, and zero-inflated negative binomial regression. Mediation was tested via conditional process analyses. Results After controlling for the effects of demographic characteristics including maternal health insurance coverage, employment status, education, and preferred language, children of depressed women (N = 1025) were significantly more likely than children of non-depressed women (N = 3564) to receive a "poor" or "fair" maternal rating of general health (OR 2.34), eat fewer vegetables (IRR: 0.94), more sweets (IRR: 1.20), and sugary drinks daily (IRR: 1.32), and consume fast food more often (OR 1.21). These children were also less likely to have health insurance (OR 1.59) and more likely to receive medical care from a public medical clinic or hospital emergency room (OR 1.30). Reduced maternal engagement partially mediated associations between maternal depressive symptoms and several child health outcomes including poor diet, health insurance coverage, and use of public medical services. Conclusions for Practice Maternal depressive symptoms are associated with poor health among preschool-age children in low-income families. Prevention, screening, and treatment efforts aimed at reducing the prevalence of maternal depression may positively affect young children's health.

  7. Prevalence of dental caries in 5-year-old Greek children and the use of dental services: evaluation of socioeconomic, behavioural factors and living conditions.

    PubMed

    Mantonanaki, Magdalini; Koletsi-Kounari, Haroula; Mamai-Homata, Eleni; Papaioannou, William

    2013-04-01

    To assess dental caries experience and use of dental services in 5-year-old children attending public kindergartens in Attica, Greece, and to examine the influence of certain socioeconomic factors and living conditions as well as dental behaviours and attitudes. In this cross-sectional study, a random and stratified sample of 605 Greek children was examined using decayed, missing, filled tooth surfaces and simplified debris indices. The use of dental services was measured by children's dental visits (any dental visit up to the age of 5 years). The Care Index was also calculated. Risk indicators were assessed by a questionnaire. Zero-inflated Poisson and logistic regression analyses were used to test for statistically significant associations. The prevalence of dental caries was 16.5%. The Care Index was 32%, and dental visits were reported for 84% of the children. Medium socio-economic level (SEL) was associated with no detectable caries. High SEL was related to decreased decayed, missing, filled teeth values, while female gender and rented houses had the opposite effect. The age of the mother (35-39 years) and higher SEL were related to higher levels of dental services use. It is suggested that there are differences in the experience of dental caries and use of dental services among preschool children in Attica, which are related to demographic and socioeconomic factors and living conditions. Dental public policies should focus on groups with specific characteristics in order to improve the oral health of disease-susceptible populations. © 2013 FDI World Dental Federation.

  8. Sampling Strategies for Evaluating the Rate of Adventitious Transgene Presence in Non-Genetically Modified Crop Fields.

    PubMed

    Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine

    2017-09-01

    According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.
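
    A toy sketch of why the auxiliary-variable designs help (illustrative only; the gene-flow model itself is not reproduced here): a stratified estimate weights each stratum's sample mean by that stratum's known share of the field, which reduces variance when the model-defined strata have contrasting presence rates:

```python
def stratified_estimate(strata):
    """Stratified estimate of an adventitious-presence rate.
    `strata` is a list of (stratum_share, indicators) pairs, where
    stratum_share is the fraction of the field in that stratum and
    indicators are 0/1 grain test results sampled from it."""
    return sum(share * sum(obs) / len(obs) for share, obs in strata)
```

For instance, if 80% of a field lies far from the nearest GM source (near-zero rate) and 20% lies downwind of it, sampling each stratum separately and reweighting recovers the field-level rate without oversampling the low-rate area.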

  9. Parental Alcohol Involvement and Adolescent Alcohol Expectancies Predict Alcohol Involvement in Male Adolescents

    PubMed Central

    Cranford, James A.; Zucker, Robert A.; Jester, Jennifer M.; Puttler, Leon I.; Fitzgerald, Hiram E.

    2010-01-01

    Current models of adolescent drinking behavior hypothesize that alcohol expectancies mediate the effects of other proximal and distal risk factors. This longitudinal study tested the hypothesis that the effects of parental alcohol involvement on their children’s drinking behavior in mid-adolescence are mediated by the children’s alcohol expectancies in early adolescence. A sample of 148 initially 9–11 year old boys and their parents from a high-risk population and a contrast group of community families completed measures of drinking behavior and alcohol expectancies over a 6-year interval. We analyzed data from middle childhood (M age = 10.4 years), early adolescence (M age = 13.5 years), and mid-adolescence (M age = 16.5 years). The sample was restricted only to adolescents who had begun to drink by mid-adolescence. Results from zero-inflated Poisson regression analyses showed that 1) maternal drinking during their children’s middle childhood predicted number of drinking days in middle adolescence; 2) negative and positive alcohol expectancies in early adolescence predicted odds of any intoxication in middle adolescence; and 3) paternal alcoholism during their children’s middle childhood and adolescents’ alcohol expectancies in early adolescence predicted frequency of intoxication in middle adolescence. Contrary to predictions, child alcohol expectancies did not mediate the effects of parental alcohol involvement in this high-risk sample. Different aspects of parental alcohol involvement, along with early adolescent alcohol expectancies, independently predicted adolescent drinking behavior in middle adolescence. Alternative pathways for the influence of maternal and paternal alcohol involvement and implications for expectancy models of adolescent drinking behavior were discussed. PMID:20853923

  10. Exposure to Hurricane Sandy, neighborhood collective efficacy, and post-traumatic stress symptoms in older adults.

    PubMed

    Heid, Allison R; Pruchno, Rachel; Cartwright, Francine P; Wilson-Genderson, Maureen

    2017-07-01

    Older adults exposed to natural disasters are at risk for negative psychological outcomes such as post-traumatic stress disorder (PTSD). Neighborhood social capital can act as a resource that supports individual-level coping with stressors. This study explores the ability of perceived neighborhood collective efficacy, a form of social capital, to moderate the association between exposure to Hurricane Sandy and PTSD symptoms in older adults. Data from 2205 older individuals aged 54-80 residing in New Jersey who self-reported exposure to Hurricane Sandy in October of 2012 were identified and extracted from the ORANJ BOWL™ research panel. Participants completed baseline assessments of demographic and individual-level characteristics in 2006-2008 and follow-up assessments about storm exposure, perceived neighborhood collective efficacy (social cohesion and social control), and PTSD symptoms 8-33 months following the storm. Zero-inflated Poisson regression models were tested to examine the association between exposure, neighborhood collective efficacy, and PTSD symptoms. After accounting for known demographic and individual-level covariates, greater storm exposure was linked to higher levels of PTSD symptoms. Social cohesion, but not social control, was linked to lower reports of PTSD symptoms and moderated the association between exposure and PTSD. The impact of storm exposure on PTSD symptoms was less for individuals reporting higher levels of social cohesion. Mental health service providers and disaster preparedness and response teams should consider the larger social network of individuals served. Building social connections in older adults' neighborhoods that promote cohesion can reduce the negative psychological impact of a disaster.

  11. Predictive validity of cannabis consumption measures: Results from a national longitudinal study.

    PubMed

    Buu, Anne; Hu, Yi-Han; Pampati, Sanjana; Arterberry, Brooke J; Lin, Hsien-Chang

    2017-10-01

    Validating the utility of cannabis consumption measures for predicting later cannabis related symptomatology or progression to cannabis use disorder (CUD) is crucial for prevention and intervention work that may use consumption measures for quick screening. This study examined whether cannabis use quantity and frequency predicted CUD symptom counts, progression to onset of CUD, and persistence of CUD. Data from the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) at Wave 1 (2001-2002) and Wave 2 (2004-2005) were used to identify three risk samples: (1) current cannabis users at Wave 1 who were at risk for having CUD symptoms at Wave 2; (2) current users without lifetime CUD who were at risk for incident CUD; and (3) current users with past-year CUD who were at risk for persistent CUD. Logistic regression and zero-inflated Poisson models were used to examine the longitudinal effect of cannabis consumption on CUD outcomes. Higher frequency of cannabis use predicted lower likelihood of being symptom-free but it did not predict the severity of CUD symptomatology. Higher frequency of cannabis use also predicted higher likelihood of progression to onset of CUD and persistence of CUD. Cannabis use quantity, however, did not predict any of the developmental stages of CUD symptomatology examined in this study. This study has provided a new piece of evidence to support the predictive validity of cannabis use frequency based on national longitudinal data. The result supports the common practice of including frequency items in cannabis screening tools. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. The relationship between gambling attitudes, involvement, and problems in adolescence: Examining the moderating role of coping strategies and parenting styles.

    PubMed

    Dixon, Ramsay W; Youssef, George J; Hasking, Penelope; Yücel, Murat; Jackson, Alun C; Dowling, Nicki A

    2016-07-01

    Several factors are associated with an increased risk of adolescent problem gambling, including positive gambling attitudes, higher levels of gambling involvement, ineffective coping strategies and unhelpful parenting practices. It is less clear, however, how these factors interact or influence each other in the development of problem gambling behavior during adolescence. The aim of the current study was to simultaneously explore these predictors, with a particular focus on the extent to which coping skills and parenting styles may moderate the expected association between gambling involvement and gambling problems. Participants were 612 high school students. The data were analyzed using a zero-inflated Poisson (ZIP) regression model, controlling for gender. Although several variables predicted the number of symptoms associated with problem gambling, none of them predicted the probability of displaying any problem gambling. Gambling involvement fully mediated the relationship between positive gambling attitudes and gambling problem severity. There was a significant relationship between gambling involvement and problems at any level of problem-focused coping, reference to others and inconsistent discipline. However, adaptive coping styles employed by adolescents and consistent disciplinary practices by parents were buffers of gambling problems at low levels of adolescent gambling involvement, but failed to protect adolescents when their gambling involvement was high. These findings indicate that further research exploring the development of gambling problems is required and imply that coping and parenting interventions may have particular utility for adolescents who are at risk of developing gambling problems but who are not gambling frequently. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Three-part joint modeling methods for complex functional data mixed with zero-and-one-inflated proportions and zero-inflated continuous outcomes with skewness.

    PubMed

    Li, Haocheng; Staudenmayer, John; Wang, Tianying; Keadle, Sarah Kozey; Carroll, Raymond J

    2018-02-20

    We take a functional data approach to longitudinal studies with complex bivariate outcomes. This work is motivated by data from a physical activity study that measured 2 responses over time in 5-minute intervals. One response is the proportion of time active in each interval, a continuous proportion with excess zeros and ones. The other response, energy expenditure rate in the interval, is a continuous variable with excess zeros and skewness. This outcome is complex because there are 3 possible activity patterns in each interval (inactive, partially active, and completely active), and those patterns, which are observed, induce both nonrandom and random associations between the responses. More specifically, the inactive pattern requires a zero value in both the proportion for active behavior and the energy expenditure rate; a partially active pattern means that the proportion of activity is strictly between zero and one and that the energy expenditure rate is greater than zero and likely to be moderate; and the completely active pattern means that the proportion of activity is exactly one and the energy expenditure rate is greater than zero and likely to be higher. To address these challenges, we propose a 3-part functional data joint modeling approach. The first part is a continuation-ratio model to reorder the 3 ordinal-valued activity patterns. The second part models the proportions when they are in the interval (0,1). The last component specifies the skewed continuous energy expenditure rate with Box-Cox transformations when they are greater than zero. In this 3-part model, the regression structures are specified as smooth curves measured at various time points with random effects that have a correlation structure. The smoothed random curves for each variable are summarized using a few important principal components, and the association of the 3 longitudinal components is modeled through the association of the principal component scores.
The difficulties in handling the ordinal and proportional variables are addressed using a quasi-likelihood type approximation. We develop an efficient algorithm to fit the model that also involves the selection of the number of principal components. The method is applied to physical activity data and is evaluated empirically by a simulation study. Copyright © 2017 John Wiley & Sons, Ltd.
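
    The Box-Cox step used for the skewed positive energy-expenditure rates can be sketched in its standard form (not the authors' implementation):

```python
import math

def box_cox(y, lam):
    """Box-Cox transform of a positive response y; the lam -> 0 limit
    recovers the log transform, which reduces right skewness."""
    if lam == 0:
        return math.log(y)
    return (y ** lam - 1.0) / lam
```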

  14. Predictors of participant engagement and naloxone utilization in a community-based naloxone distribution program.

    PubMed

    Rowe, Christopher; Santos, Glenn-Milo; Vittinghoff, Eric; Wheeler, Eliza; Davidson, Peter; Coffin, Philip O

    2015-08-01

    To describe characteristics of participants and overdose reversals associated with a community-based naloxone distribution program and identify predictors of obtaining naloxone refills and using naloxone for overdose reversal. Bivariate statistical tests were used to compare characteristics of participants who obtained refills and reported overdose reversals versus those who did not. We fitted multiple logistic regression models to identify predictors of refills and reversals; zero-inflated multiple Poisson regression models were used to identify predictors of the number of refills and reversals. San Francisco, California, USA. Naloxone program participants registered and reversals reported from 2010 to 2013. Baseline characteristics of participants and reported characteristics of reversals. A total of 2500 participants were registered and 702 reversals were reported from 2010 to 2013. Participants who had witnessed an overdose [adjusted odds ratio (AOR) = 2.02, 95% confidence interval (CI) = 1.53-2.66; AOR = 2.73, 95% CI = 1.73-4.30] or used heroin (AOR = 1.85, 95% CI = 1.44-2.37; AOR = 2.19, 95% CI = 1.54-3.13) or methamphetamine (AOR = 1.71, 95% CI = 1.37-2.15; AOR = 1.61, 95% CI = 1.18-2.19) had higher odds of obtaining a refill and reporting a reversal, respectively. African American (AOR = 0.63, 95% CI = 0.45-0.88) and Latino (AOR = 0.65, 95% CI = 0.43-1.00) participants had lower odds of obtaining a naloxone refill, whereas Latino participants who obtained at least one refill reported a higher number of refills [incidence rate ratio (IRR) = 1.33 (1.05-1.69)]. Community naloxone distribution programs are capable of reaching sizeable populations of high-risk individuals and facilitating large numbers of overdose reversals. Community members most likely to engage with a naloxone program and use naloxone to reverse an overdose are active drug users. © 2015 Society for the Study of Addiction.

  15. A smooth exit from eternal inflation?

    NASA Astrophysics Data System (ADS)

    Hawking, S. W.; Hertog, Thomas

    2018-04-01

    The usual theory of inflation breaks down in eternal inflation. We derive a dual description of eternal inflation in terms of a deformed Euclidean CFT located at the threshold of eternal inflation. The partition function gives the amplitude of different geometries of the threshold surface in the no-boundary state. Its local and global behavior in dual toy models shows that the amplitude is low for surfaces which are not nearly conformal to the round three-sphere and essentially zero for surfaces with negative curvature. Based on this we conjecture that the exit from eternal inflation does not produce an infinite fractal-like multiverse, but is finite and reasonably smooth.

  16. A Zero- and K-Inflated Mixture Model for Health Questionnaire Data

    PubMed Central

    Finkelman, Matthew D.; Green, Jennifer Greif; Gruber, Michael J.; Zaslavsky, Alan M.

    2011-01-01

    In psychiatric assessment, Item Response Theory (IRT) is a popular tool to formalize the relation between the severity of a disorder and associated responses to questionnaire items. Practitioners of IRT sometimes make the assumption of normally distributed severities within a population; while convenient, this assumption is often violated when measuring psychiatric disorders. Specifically, there may be a sizable group of respondents whose answers place them at an extreme of the latent trait spectrum. In this article, a zero- and K-inflated mixture model is developed to account for the presence of such respondents. The model is fitted using an expectation-maximization (E-M) algorithm to estimate the percentage of the population at each end of the continuum, concurrently analyzing the remaining “graded component” via IRT. A method to perform factor analysis for only the graded component is introduced. In assessments of oppositional defiant disorder and conduct disorder, the zero- and K-inflated model exhibited better fit than the standard IRT model. PMID:21365673
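
    In the same spirit as the EM fitting described here, a sketch for the simpler zero-inflated Poisson mixture (not the authors' zero- and K-inflated IRT model) alternates a posterior step for the zeros with closed-form parameter updates:

```python
import math

def fit_zip_em(counts, iters=500):
    """EM for a zero-inflated Poisson mixture: estimates the inflation
    probability pi and the Poisson mean lam from nonnegative counts.
    Assumes at least one count is nonzero."""
    n = len(counts)
    n_zero = sum(1 for y in counts if y == 0)
    total = sum(counts)
    pi, lam = 0.5 * n_zero / n, max(total / n, 0.1)
    for _ in range(iters):
        # E-step: posterior probability that an observed zero is structural
        z = pi / (pi + (1 - pi) * math.exp(-lam))
        # M-step: closed-form updates for the mixing weight and the mean
        pi = z * n_zero / n
        lam = total / (n - z * n_zero)
    return pi, lam
```

At convergence the fitted model matches both the sample mean, (1 - pi) * lam, and the observed fraction of zeros, pi + (1 - pi) * exp(-lam).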

  17. Understanding Poisson regression.

    PubMed

    Hayat, Matthew J; Higgins, Melinda

    2014-04-01

    Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached count data either as if they were continuous and normally distributed or by dichotomizing the counts into the categories of occurred or did not occur. These outdated methods have been replaced with more appropriate statistical methods based on the Poisson probability distribution, which is well suited to count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including the addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included. Copyright 2014, SLACK Incorporated.
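
    The overview above can be made concrete with a single-covariate sketch (illustrative only, not the article's ENSPIRE example): Newton-Raphson maximization of the Poisson log-likelihood for log(mu) = b0 + b1*x, followed by a Pearson statistic to check the equidispersion assumption:

```python
import math

def poisson_regression(x, y, iters=25):
    """Fit log(mu) = b0 + b1 * x by Newton-Raphson using the Poisson
    score vector and 2x2 Fisher information."""
    b0, b1 = math.log(max(sum(y) / len(y), 1e-9)), 0.0
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        u0 = sum(yi - mi for yi, mi in zip(y, mu))          # score for b0
        u1 = sum((yi - mi) * xi for yi, mi, xi in zip(y, mu, x))  # score for b1
        i00 = sum(mu)
        i01 = sum(mi * xi for mi, xi in zip(mu, x))
        i11 = sum(mi * xi * xi for mi, xi in zip(mu, x))
        det = i00 * i11 - i01 * i01
        b0 += (i11 * u0 - i01 * u1) / det                   # Newton step
        b1 += (i00 * u1 - i01 * u0) / det
    return b0, b1

def dispersion(x, y, b0, b1):
    """Pearson chi-square divided by residual degrees of freedom;
    values well above 1 suggest overdispersion (consider adding an
    overdispersion parameter or using negative binomial regression)."""
    mu = [math.exp(b0 + b1 * xi) for xi in x]
    return sum((yi - mi) ** 2 / mi for yi, mi in zip(y, mu)) / (len(y) - 2)
```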

  18. Computational prediction of new auxetic materials.

    PubMed

    Dagdelen, John; Montoya, Joseph; de Jong, Maarten; Persson, Kristin

    2017-08-22

    Auxetics comprise a rare family of materials that manifest negative Poisson's ratio, which causes an expansion instead of contraction under tension. Most known homogeneously auxetic materials are porous foams or artificial macrostructures and there are few examples of inorganic materials that exhibit this behavior as polycrystalline solids. It is now possible to accelerate the discovery of materials with target properties, such as auxetics, using high-throughput computations, open databases, and efficient search algorithms. Candidates exhibiting features correlating with auxetic behavior were chosen from the set of more than 67 000 materials in the Materials Project database. Poisson's ratios were derived from the calculated elastic tensor of each material in this reduced set of compounds. We report that this strategy results in the prediction of three previously unidentified homogeneously auxetic materials as well as a number of compounds with a near-zero homogeneous Poisson's ratio, which are here denoted "anepirretic materials".
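
    For intuition about the screening target (a standard isotropic-aggregate relation, not the Materials Project pipeline itself): once effective bulk and shear moduli K and G have been reduced from the elastic tensor, the homogeneous Poisson's ratio follows from:

```python
def poisson_ratio(K, G):
    """Isotropic-aggregate Poisson's ratio from bulk modulus K and
    shear modulus G (any consistent units). Negative values indicate
    auxetic behavior; values near zero, the "anepirretic" regime."""
    return (3 * K - 2 * G) / (2 * (3 * K + G))
```

A stiff shear response relative to the bulk modulus (G greater than 1.5 K) is what drives the ratio negative.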

  19. MIXED MODEL AND ESTIMATING EQUATION APPROACHES FOR ZERO INFLATION IN CLUSTERED BINARY RESPONSE DATA WITH APPLICATION TO A DATING VIOLENCE STUDY

    PubMed Central

    Fulton, Kara A.; Liu, Danping; Haynie, Denise L.; Albert, Paul S.

    2016-01-01

    The NEXT Generation Health study investigates the dating violence of adolescents using a survey questionnaire. Each student is asked to affirm or deny multiple instances of violence in his/her dating relationship. There is, however, evidence suggesting that students not in a relationship responded to the survey, resulting in excessive zeros in the responses. This paper proposes likelihood-based and estimating equation approaches to analyze the zero-inflated clustered binary response data. We adopt a mixed model method to account for the cluster effect, and the model parameters are estimated using a maximum-likelihood (ML) approach that requires a Gaussian–Hermite quadrature (GHQ) approximation for implementation. Since an incorrect assumption on the random effects distribution may bias the results, we construct generalized estimating equations (GEE) that do not require the correct specification of within-cluster correlation. In a series of simulation studies, we examine the performance of ML and GEE methods in terms of their bias, efficiency and robustness. We illustrate the importance of properly accounting for this zero inflation by reanalyzing the NEXT data where this issue has previously been ignored. PMID:26937263

  20. Descriptive Analysis on the Impacts of Universal Zero-Markup Drug Policy on a Chinese Urban Tertiary Hospital

    PubMed Central

    Tian, Wei; Yuan, Jiangfan; Yang, Dong; Zhang, Lanjing

    2016-01-01

    Background Universal Zero-Markup Drug Policy (UZMDP) mandates no price mark-ups on any drug dispensed by a healthcare institution, and covers the medicines not included in China's National Essential Medicine System. Five tertiary hospitals in Beijing, China implemented UZMDP in 2012. Its impacts on these hospitals are unknown. We described the effects of UZMDP on a participating hospital, Jishuitan Hospital, Beijing, China (JST). Methods This retrospective longitudinal study examined the hospital-level data of JST and city-level data of tertiary hospitals of Beijing, China (BJT) for 2009–2015. Rank-sum tests and join-point regression analyses were used to assess absolute changes and differences in trends, respectively. Results In absolute terms, after the UZMDP implementation, there were increased annual patient-visits and decreased ratios of medicine-to-healthcare-charges (RMOH) in JST outpatient and inpatient services; however, in the outpatient service, physician work-days decreased and physician workload and inflation-adjusted per-visit healthcare charges increased, while inpatient physician work-days increased and the inpatient mortality rate fell. Interestingly, the decreasing trend in the inpatient mortality rate was neutralized after UZMDP implementation. Compared with BJT, and under the influence of UZMDP, JST outpatient and inpatient services both had increasing trends in annual patient-visits (annual percentage changes [APC] = 8.1% and 6.5%, respectively) and decreasing trends in RMOH (APC = -4.3% and -5.4%, respectively), while JST outpatient services had an increasing trend in inflation-adjusted per-visit healthcare charges (APC = 3.4%) and the JST inpatient service had a decreasing trend in inflation-adjusted per-visit medicine charges (APC = -5.2%). Conclusion Implementation of UZMDP seems to increase annual patient-visits, reduce RMOH and have different impacts on outpatient and inpatient services in a Chinese urban tertiary hospital. PMID:27627811

  2. Referent group proximity, social norms, and context: alcohol use in a low-use environment.

    PubMed

    Cox, Jared M; Bates, Scott C

    2011-01-01

    The purpose of this study was to investigate the relationship between perceived normative use of alcohol and reported consumption in an environment where relatively little alcohol use occurs. A total of 585 undergraduate students completed an online survey on alcohol use in March 2006. Participants reported personal alcohol use and perceptions of use by "friends," "the average student," and "the average student who drinks." Due to the large number of students reporting zero alcohol use, zero-inflated negative binomial regression was used to analyze the data. Results showed that perceptions of use and beliefs about the acceptability of use by proximal groups were strongly and positively correlated with personal alcohol use. Perceptions of distal groups were either not correlated or were correlated negatively with personal use. These findings suggest that the use of distal referent groups for a social norms campaign in a low-use environment may have paradoxical effects.
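
    The zero-inflated mixture used here treats each count as coming either from a structural-zero (abstainer) class or from a count distribution. As a minimal sketch of that idea, and not the study's actual analysis, the following fits an intercept-only zero-inflated negative binomial by maximum likelihood to simulated data; all data, parameter values, and names are illustrative assumptions.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)

# Hypothetical drinking counts: a structural-zero (abstainer) class mixed
# with a negative binomial count class -- the kind of data the study
# analyzed with zero-inflated negative binomial regression.
n = 20000
pi_true, mu_true, alpha_true = 0.4, 3.0, 1.0    # P(abstainer), NB mean, NB dispersion
size_true = 1.0 / alpha_true                    # NB "successes" parametrization
p_true = size_true / (size_true + mu_true)
abstainer = rng.random(n) < pi_true
y = np.where(abstainer, 0, rng.negative_binomial(size_true, p_true, n))

def zinb_negloglik(params, y):
    """Negative log-likelihood of an intercept-only ZINB model."""
    logit_pi, log_mu, log_alpha = params
    pi = 1.0 / (1.0 + np.exp(-logit_pi))
    mu, alpha = np.exp(log_mu), np.exp(log_alpha)
    size = 1.0 / alpha
    p = size / (size + mu)
    pmf = stats.nbinom.pmf(y, size, p)
    # Zeros can arise from either class; positive counts only from the NB class.
    like = np.where(y == 0, pi + (1 - pi) * pmf, (1 - pi) * pmf)
    return -np.sum(np.log(np.maximum(like, 1e-300)))

res = optimize.minimize(zinb_negloglik, x0=[0.0, 0.0, 0.0], args=(y,),
                        method="Nelder-Mead")
pi_hat = 1.0 / (1.0 + np.exp(-res.x[0]))
mu_hat = float(np.exp(res.x[1]))
print(round(pi_hat, 2), round(mu_hat, 2))
```

    A full regression version would let logit(pi) and log(mu) each depend on covariates such as the perceived-norms measures.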

  3. Two-Part and Related Regression Models for Longitudinal Data

    PubMed Central

    Farewell, V.T.; Long, D.L.; Tom, B.D.M.; Yiu, S.; Su, L.

    2017-01-01

    Statistical models that involve a two-part mixture distribution are applicable in a variety of situations. Frequently, the two parts are a model for the binary response variable and a model for the outcome variable that is conditioned on the binary response. Two common examples are zero-inflated or hurdle models for count data and two-part models for semicontinuous data. Recently, there has been particular interest in the use of these models for the analysis of repeated measures of an outcome variable over time. The aim of this review is to consider motivations for the use of such models in this context and to highlight the central issues that arise with their use. We examine two-part models for semicontinuous and zero-heavy count data, and we also consider models for count data with a two-part random effects distribution. PMID:28890906

  4. Extending the Applicability of the Generalized Likelihood Function for Zero-Inflated Data Series

    NASA Astrophysics Data System (ADS)

    Oliveira, Debora Y.; Chaffe, Pedro L. B.; Sá, João H. M.

    2018-03-01

    Proper uncertainty estimation for data series with a high proportion of zero and near zero observations has been a challenge in hydrologic studies. This technical note proposes a modification to the Generalized Likelihood function that accounts for zero inflation of the error distribution (ZI-GL). We compare the performance of the proposed ZI-GL with the original Generalized Likelihood function using the entire data series (GL) and by simply suppressing zero observations (GLy>0). These approaches were applied to two interception modeling examples characterized by data series with a significant number of zeros. The ZI-GL produced better uncertainty ranges than the GL as measured by the precision, reliability and volumetric bias metrics. The comparison between ZI-GL and GLy>0 highlights the need for further improvement in the treatment of residuals from near zero simulations when a linear heteroscedastic error model is considered. Aside from the interception modeling examples illustrated herein, the proposed ZI-GL may be useful for other hydrologic studies, such as for the modeling of the runoff generation in hillslopes and ephemeral catchments.

  5. Access to Transportation and Health Care Visits for Medicaid Enrollees With Diabetes.

    PubMed

    Thomas, Leela V; Wedel, Kenneth R; Christopher, Jan E

    2018-03-01

    Diabetes is a chronic condition that requires frequent health care visits for its management. Individuals without nonemergency medical transportation often miss appointments and do not receive optimal care. This study aims to evaluate the association between Medicaid-provided nonemergency medical transportation and diabetes care visits. A retrospective analysis was conducted of demographic and claims data obtained from the Oklahoma Medicaid program. Participants consisted of Medicaid enrollees with diabetes who made at least 1 visit for diabetes care in a year. The sample was predominantly female and white, with an average age of 46.38 years. Two zero-truncated Poisson regression models were estimated to assess the independent effect of transportation use on the number of diabetes care visits. Use of nonemergency medical transportation is a significant predictor of diabetes care visits. Zero-truncated Poisson regression coefficients showed a positive association between the use of transportation and number of visits (0.6563, P < .001). Age, gender, race/ethnicity, area of residence, and presence of additional chronic conditions had independent associations with number of visits. Older enrollees were likely to make more visits than younger enrollees with diabetes (0.02382); controlling for all other factors in the model, rural residents made more visits than urban residents; women made fewer visits than men (-0.09312; P < .001); and minorities made fewer visits than whites, with pronounced differences for Hispanics and Asians compared to whites. Findings underscore the importance of ensuring transportation for Medicaid populations with diabetes, particularly in rural areas, where the prevalence of diabetes and complications is higher and the availability of medical resources lower than in urban areas. © 2017 National Rural Health Association.

  6. On the Dequantization of Fedosov's Deformation Quantization

    NASA Astrophysics Data System (ADS)

    Karabegov, Alexander V.

    2003-08-01

    To each natural deformation quantization on a Poisson manifold M we associate a Poisson morphism from the formal neighborhood of the zero section of the cotangent bundle to M to the formal neighborhood of the diagonal of the product M × M~, where M~ is a copy of M with the opposite Poisson structure. We call it dequantization of the natural deformation quantization. Then we "dequantize" Fedosov's quantization.

  7. Dark energy from gravitoelectromagnetic inflation?

    NASA Astrophysics Data System (ADS)

    Membiela, F. A.; Bellini, M.

    2008-02-01

    Gravitoelectromagnetic Inflation (GI) was introduced to describe, in a unified manner, electromagnetic, gravitational, and inflaton fields from a 5D vacuum state. On the other hand, the primordial origin and evolution of dark energy are today unknown. In this letter we show, using GI, that the zero modes of some redefined vector fields $B_i=A_i/a$ produced during inflation could be the source of dark energy in the universe.

  8. Conditional Poisson models: a flexible alternative to conditional logistic case cross-over analysis.

    PubMed

    Armstrong, Ben G; Gasparrini, Antonio; Tobias, Aurelio

    2014-11-24

    The time stratified case cross-over approach is a popular alternative to conventional time series regression for analysing associations between time series of environmental exposures (air pollution, weather) and counts of health outcomes. These are almost always analyzed using conditional logistic regression on data expanded to case-control (case crossover) format, but this has some limitations. In particular, adjusting for overdispersion and auto-correlation in the counts is not possible. It has been established that a Poisson model for counts with stratum indicators gives identical estimates to those from conditional logistic regression and does not have these limitations, but it is little used, probably because of the overhead of estimating many stratum parameters. The conditional Poisson model avoids estimating stratum parameters by conditioning on the total event count in each stratum, thus simplifying the computation and increasing the number of strata for which fitting is feasible compared with the standard unconditional Poisson model. Unlike the conditional logistic model, the conditional Poisson model does not require expanding the data, and can adjust for overdispersion and auto-correlation. It is available in Stata, R, and other packages. By applying the models to real data and using simulations, we demonstrate that conditional Poisson models are simpler to code and faster to run than conditional logistic analyses and can be fitted to larger data sets than is possible with standard Poisson models. Allowing for overdispersion or autocorrelation was possible with the conditional Poisson model, but when not required this model gave identical estimates to those from conditional logistic regression. Conditional Poisson regression models provide an alternative to case crossover analysis of stratified time series data, with some advantages. The conditional Poisson model can also be used in other contexts in which primary control for confounding is by fine stratification.
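
    The equivalence noted above, between a Poisson model with one indicator per stratum and the conditional Poisson likelihood (multinomial within strata), can be checked numerically. The sketch below simulates hypothetical stratified count data, maximizes both likelihoods with scipy, and compares the exposure coefficient; all data and names are illustrative assumptions, not from the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

rng = np.random.default_rng(1)

# Hypothetical stratified time series: 40 strata (e.g. year x month x weekday),
# 4 days per stratum, one exposure x, true log rate ratio beta = 0.3.
n_strata, per = 40, 4
stratum = np.repeat(np.arange(n_strata), per)
x = rng.normal(size=n_strata * per)
alpha = 1.0 + 0.5 * rng.normal(size=n_strata)   # nuisance stratum intercepts
beta_true = 0.3
y = rng.poisson(np.exp(alpha[stratum] + beta_true * x))

def nll_unconditional(params):
    """Poisson negative log-likelihood with one indicator per stratum."""
    a, b = params[:n_strata], params[-1]
    eta = a[stratum] + b * x
    return -(y * eta - np.exp(eta)).sum()

def nll_conditional(b):
    """Conditioned on stratum totals, the likelihood is multinomial:
    p_it = exp(b*x_it) / sum_t exp(b*x_it) within each stratum."""
    eta = (b[0] * x).reshape(n_strata, per)
    log_p = eta - logsumexp(eta, axis=1, keepdims=True)
    return -(y.reshape(n_strata, per) * log_p).sum()

b_unc = minimize(nll_unconditional, np.zeros(n_strata + 1), method="BFGS").x[-1]
b_con = minimize(nll_conditional, np.zeros(1), method="BFGS").x[0]
print(round(b_unc, 3), round(b_con, 3))
```

    The two estimates of beta agree because the Poisson profile likelihood in beta (with stratum intercepts maximized out) equals the conditional likelihood up to a constant; the conditional fit simply avoids the 40 nuisance parameters.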

  9. Deformation of a flexible disk bonded to an elastic half space-application to the lung.

    PubMed

    Lai-Fook, S J; Hajji, M A; Wilson, T A

    1980-08-01

    An analysis is presented of the deformation of a homogeneous, isotropic, elastic half space subjected to a constant radial strain in a circular area on the boundary. Explicit analytic expressions for the normal and radial displacements and the shear stress on the boundary are used to interpret experiments performed on inflated pig lungs. The boundary strain was induced by inflating or deflating the lung after bonding a flexible disk to the lung surface. The prediction that the surface bulges outward for positive boundary strain and inward for negative strain was confirmed in the experiments. Poisson's ratio at two transpulmonary pressures was measured by use of the normal displacement equation evaluated at the surface. A direct estimate of Poisson's ratio was possible because the normal displacement of the surface depends uniquely on the compressibility of the material. Qualitative comparisons between theory and experiment support the use of continuum analyses in evaluating the behavior of the lung parenchyma when subjected to small local distortions.

  10. Estimating a Logistic Discrimination Function When One of the Training Samples Is Subject to Misclassification: A Maximum Likelihood Approach.

    PubMed

    Nagelkerke, Nico; Fidler, Vaclav

    2015-01-01

    The problem of discrimination and classification is central to much of epidemiology. Here we consider the estimation of a logistic regression/discrimination function from training samples, when one of the training samples is subject to misclassification or mislabeling, e.g. diseased individuals are incorrectly classified/labeled as healthy controls. We show that this leads to a zero-inflated binomial model with a defective logistic regression or discrimination function, whose parameters can be estimated using standard statistical methods such as maximum likelihood. These parameters can be used to estimate the probability of true group membership among those, possibly erroneously, classified as controls. Two examples are analyzed and discussed. A simulation study explores the properties of the maximum likelihood parameter estimates and the estimates of the number of mislabeled observations.

  11. Weather variability and the incidence of cryptosporidiosis: comparison of time series poisson regression and SARIMA models.

    PubMed

    Hu, Wenbiao; Tong, Shilu; Mengersen, Kerrie; Connell, Des

    2007-09-01

    Few studies have examined the relationship between weather variables and cryptosporidiosis in Australia. This paper examines the potential impact of weather variability on the transmission of cryptosporidiosis and explores the possibility of developing an empirical forecast system. Data on weather variables, notified cryptosporidiosis cases, and population size in Brisbane were supplied by the Australian Bureau of Meteorology, Queensland Department of Health, and Australian Bureau of Statistics, respectively, for the period January 1, 1996-December 31, 2004. Time series Poisson regression and seasonal auto-regressive integrated moving average (SARIMA) models were used to examine the potential impact of weather variability on the transmission of cryptosporidiosis. Both the time series Poisson regression and SARIMA models show that seasonal and monthly maximum temperature at a prior moving average of 1 and 3 months were significantly associated with cryptosporidiosis. The models suggest that there may be 50 more cases a year for an average increase of 1°C in maximum temperature in Brisbane. Model assessments indicated that the SARIMA model had better predictive ability than the Poisson regression model (SARIMA: root mean square error (RMSE): 0.40, Akaike information criterion (AIC): -12.53; Poisson regression: RMSE: 0.54, AIC: -2.84). Furthermore, the analysis of residuals shows that the time series Poisson regression appeared to violate a modeling assumption, in that residual autocorrelation persisted. The results of this study suggest that weather variability (particularly maximum temperature) may have played a significant role in the transmission of cryptosporidiosis. A SARIMA model may be a better predictive model than a Poisson regression model in the assessment of the relationship between weather variability and the incidence of cryptosporidiosis.

  12. The Effect of the "Zero Tolerance for Head Contact" Rule Change on the Risk of Concussions in Youth Ice Hockey Players.

    PubMed

    Krolikowski, Maciej P; Black, Amanda M; Palacios-Derflingher, Luz; Blake, Tracy A; Schneider, Kathryn J; Emery, Carolyn A

    2017-02-01

    Ice hockey is a popular winter sport in Canada. Concussions account for the greatest proportion of all injuries in youth ice hockey. In 2011, a policy change enforcing "zero tolerance for head contact" was implemented in all leagues in Canada. To determine if the risk of game-related concussions and more severe concussions (ie, resulting in >10 days of time loss) and the mechanisms of a concussion differed for Pee Wee class (ages 11-12 years) and Bantam class (ages 13-14 years) players after the 2011 "zero tolerance for head contact" policy change compared with players in similar divisions before the policy change. Cohort study; Level of evidence, 3. The retrospective cohort included Pee Wee (most elite 70%, 2007-2008; n = 891) and Bantam (most elite 30%, 2008-2009; n = 378) players before the rule change and Pee Wee (2011-2012; n = 588) and Bantam (2011-2012; n = 242) players in the same levels of play after the policy change. Suspected concussions were identified by a team designate and referred to a sport medicine physician for diagnosis. Incidence rate ratios (IRRs) were estimated based on multiple Poisson regression analysis, controlling for clustering by team and other important covariates and offset by game-exposure hours. Incidence rates based on the mechanisms of a concussion were estimated based on univariate Poisson regression analysis. The risk of game-related concussions increased after the head contact rule in Pee Wee (IRR, 1.85; 95% CI, 1.20-2.86) and Bantam (IRR, 2.48; 95% CI, 1.17-5.24) players. The risk of more severe concussions increased after the head contact rule in Pee Wee (IRR, 4.12; 95% CI, 2.00-8.50) and Bantam (IRR, 7.91; 95% CI, 3.13-19.94) players. The rates of concussions due to body checking and direct head contact increased after the rule change. The "zero tolerance for head contact" policy change did not reduce the risk of game-related concussions in Pee Wee or Bantam class ice hockey players. 
Increased concussion awareness and education after the policy change may have contributed to the increased risk of concussions found after the policy change.

  13. Zero-truncated panel Poisson mixture models: Estimating the impact on tourism benefits in Fukushima Prefecture.

    PubMed

    Narukawa, Masaki; Nohara, Katsuhito

    2018-04-01

    This study proposes an estimation approach for panel count data truncated at zero, in order to apply a contingent behavior travel cost method to revealed and stated preference data collected via a web-based survey. We develop zero-truncated panel Poisson mixture models by focusing on respondents who visited a site. In addition, we introduce an inverse Gaussian distribution for unobserved individual heterogeneity as an alternative to the popular gamma distribution, making it possible to effectively capture the long tail typically observed in trip data. We apply the proposed method to estimate the impact on tourism benefits in Fukushima Prefecture as a result of the Fukushima Nuclear Power Plant No. 1 accident. Copyright © 2018 Elsevier Ltd. All rights reserved.
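
    Zero truncation changes the likelihood: for a Poisson(lam) count restricted to y >= 1, P(Y = y) = lam^y e^(-lam) / (y! (1 - e^(-lam))), whose mean is lam / (1 - e^(-lam)). As a minimal sketch of the building block behind such models (intercept-only, no panel structure or mixing, with hypothetical simulated trip counts), the MLE of lam solves lam / (1 - e^(-lam)) = ybar:

```python
import numpy as np
from scipy.optimize import brentq

def zt_poisson_mle(y):
    """MLE of lam for zero-truncated Poisson data (all y >= 1):
    solve lam / (1 - exp(-lam)) = sample mean."""
    ybar = np.mean(y)
    f = lambda lam: lam / (1.0 - np.exp(-lam)) - ybar
    # The ratio -> 1 as lam -> 0 and exceeds lam, so the root lies in (0, ybar).
    return brentq(f, 1e-8, ybar)

# Hypothetical trip counts observed only from visitors (zeros are
# unobservable by design, as in the web survey of site visitors):
rng = np.random.default_rng(7)
raw = rng.poisson(2.0, 50000)
trips = raw[raw > 0]              # the observed sample is truncated at zero
lam_hat = zt_poisson_mle(trips)
print(round(lam_hat, 2))
```

    The paper's models go further, adding panel structure and an inverse Gaussian mixing distribution for individual heterogeneity; this sketch only shows how truncation at zero enters the likelihood.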

  14. Macro-level pedestrian and bicycle crash analysis: Incorporating spatial spillover effects in dual state count models.

    PubMed

    Cai, Qing; Lee, Jaeyoung; Eluru, Naveen; Abdel-Aty, Mohamed

    2016-08-01

    This study attempts to explore the viability of dual-state models (i.e., zero-inflated and hurdle models) for pedestrian and bicycle crash frequency analysis based on traffic analysis zones (TAZs). Additionally, spatial spillover effects are explored in the models by employing exogenous variables from neighboring zones. The dual-state models, such as the zero-inflated negative binomial and hurdle negative binomial models (with and without spatial effects), are compared with the conventional single-state model (i.e., negative binomial). The model comparison for pedestrian and bicycle crashes revealed that the models that considered observed spatial effects perform better than the models that did not. Across the models with spatial spillover effects, the dual-state models, especially the zero-inflated negative binomial model, offered better performance than the single-state models. Moreover, the model results clearly highlighted the importance of various traffic, roadway, and sociodemographic characteristics of the TAZ as well as neighboring TAZs on pedestrian and bicycle crash frequency. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Terminal Duct Lobular Unit Involution of the Normal Breast: Implications for Breast Cancer Etiology

    PubMed Central

    Pfeiffer, Ruth M.; Patel, Deesha A.; Linville, Laura; Brinton, Louise A.; Gierach, Gretchen L.; Yang, Xiaohong R.; Papathomas, Daphne; Visscher, Daniel; Mies, Carolyn; Degnim, Amy C.; Anderson, William F.; Hewitt, Stephen; Khodr, Zeina G.; Clare, Susan E.; Storniolo, Anna Maria; Sherman, Mark E.

    2014-01-01

    Background Greater degrees of terminal duct lobular unit (TDLU) involution have been linked to lower breast cancer risk; however, factors that influence this process are poorly characterized. Methods To study this question, we developed three reproducible measures that are inversely associated with TDLU involution: TDLU counts, median TDLU span, and median acini counts/TDLU. We determined factors associated with TDLU involution using normal breast tissues from 1938 participants (1369 premenopausal and 569 postmenopausal) ages 18 to 75 years in the Susan G. Komen Tissue Bank at the Indiana University Simon Cancer Center. Multivariable zero-inflated Poisson models were used to estimate relative risks (RRs) and 95% confidence intervals (95% CIs) for factors associated with TDLU counts, and multivariable ordinal logistic regression models were used to estimate odds ratios (ORs) and 95% CIs for factors associated with categories of median TDLU span and acini counts/TDLU. Results All TDLU measures started declining in the third decade of age (all measures, two-sided P trend ≤ .001), and all metrics were statistically significantly lower among postmenopausal women. Nulliparous women demonstrated lower TDLU counts compared with uniparous women (among premenopausal women, RR = 0.79, 95% CI = 0.73 to 0.85; among postmenopausal, RR = 0.67, 95% CI = 0.56 to 0.79); however, rates of age-related TDLU decline were faster among parous women. Other factors were related to specific measures of TDLU involution. Conclusion Morphometric analysis of TDLU involution warrants further evaluation to understand the pathogenesis of breast cancer and to assess its role as a progression marker for women with benign biopsies or as an intermediate endpoint in prevention studies. PMID:25274491

  16. Physical activity and asthma: A longitudinal and multi-country study.

    PubMed

    Russell, Melissa A; Janson, Christer; Real, Francisco Gómez; Johannessen, Ane; Waatevik, Marie; Benediktsdóttir, Bryndis; Holm, Mathias; Lindberg, Eva; Schlünssen, Vivi; Raza, Wasif; Dharmage, Shyamali C; Svanes, Cecilie

    2017-11-01

    To investigate the impact of physical activity on asthma in middle-aged adults, in one longitudinal analysis and one multi-centre cross-sectional analysis. The Respiratory Health in Northern Europe (RHINE) is a population-based postal questionnaire cohort study. Physical activity, height and weight were self-reported in Bergen, Norway, at RHINE II (1999-2001) and in all centres at RHINE III (2010-2012). A longitudinal analysis of Bergen data investigated the association of baseline physical activity with follow-up asthma, incident asthma and symptoms, using logistic and zero-inflated Poisson regression (n = 1782). A cross-sectional analysis of all RHINE III centres investigated the association of physical activity with concurrent asthma and symptoms (n = 13,542) using mixed-effects models. Body mass index (BMI) was categorised (<20, 20-24.99, 25-29.99, 30+ kg/m²) and physical activity grouped by amount and frequency of lighter (no sweating/heavy breathing) and vigorous (sweating/heavy breathing) activity. In the Bergen longitudinal analysis, undertaking light activity 3+ times/week at baseline was associated with less follow-up asthma (odds ratio [OR] 0.44, 95% confidence interval [CI] 0.22, 0.89), whilst an effect from undertaking vigorous activity 3+ times/week was not detected (OR 1.22, 95% CI 0.44, 2.76). The associations were attenuated with BMI adjustment. In the all-centre cross-sectional analysis an interaction was found, with the association between physical activity and asthma varying across BMI categories. These findings suggest potential longer-term benefit from lighter physical activity, whilst improvement in asthma outcomes from increasing activity intensity was not evident. Additionally, it appears the benefit from physical activity may differ according to BMI.

  17. Effect of automated ultraviolet C-emitting device on decontamination of hospital rooms with and without real-time observation of terminal room disinfection.

    PubMed

    Penno, Katie; Jandarov, Roman A; Sopirala, Madhuri M

    2017-11-01

    We studied the effectiveness of an ultraviolet C (UV-C) emitter in clinical settings and compared it with observed terminal disinfection. We cultured 22 hospital discharge rooms at a tertiary care academic medical center. Phase 1 (unobserved terminal disinfection) included cultures of 11 high-touch environmental surfaces (HTSs) after terminal room disinfection (AD) and after the use of a UV-C-emitting device (AUV). Phase 2 (observed terminal disinfection) included cultures before terminal room disinfection (BD), AD, and AUV. Zero-inflated Poisson regression compared mean colony-forming units (CFU) between the groups. Two-sample proportion tests identified the significance of observed differences in proportions of thoroughly cleaned HTSs (CFU < 5). Significance was determined using the Bonferroni-corrected threshold of α = .05/12 = .004. We obtained 594 samples. Risk of overall contamination in the AUV group was 0.48 times that in the AD group (P < .001), a 1.04 log10 reduction. During phase 1, the overall proportion of HTSs with <5 CFU increased in AUV versus AD by 0.12 (P = .001). During phase 2, it increased in AD versus BD by 0.45 (P < .001), with no significant difference between AD and AUV (P = .02). Use of UV-C with standard cleaning significantly reduced microbial burden and improved the thoroughness of terminal disinfection. We found no further benefit from UV-C use when standard terminal disinfection was observed. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
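
    The two-sample proportion test with a Bonferroni-corrected threshold can be sketched in a few lines. The counts below are hypothetical stand-ins for "surfaces with CFU < 5" in two phases, not the study's data; only the corrected threshold α = .05/12 comes from the abstract.

```python
import math

def two_prop_ztest(x1, n1, x2, n2):
    """Pooled two-sided z-test for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    pool = (x1 + x2) / (n1 + n2)                # pooled proportion under H0
    se = math.sqrt(pool * (1 - pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    pval = math.erfc(abs(z) / math.sqrt(2))     # two-sided normal p-value
    return z, pval

# e.g. 90 of 121 surfaces clean in one group vs 75 of 121 in the other
z, p = two_prop_ztest(90, 121, 75, 121)
alpha_bonferroni = 0.05 / 12                    # the paper's corrected threshold
print(round(z, 2), round(p, 3), p < alpha_bonferroni)
```

    With these illustrative counts the difference would be significant at the nominal .05 level but not at the Bonferroni-corrected threshold, which is why such corrections matter when 12 comparisons are made.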

  18. Absenteeism and Employer Costs Associated With Chronic Diseases and Health Risk Factors in the US Workforce.

    PubMed

    Asay, Garrett R Beeler; Roy, Kakoli; Lang, Jason E; Payne, Rebecca L; Howard, David H

    2016-10-06

    Employers may incur costs related to absenteeism among employees who have chronic diseases or unhealthy behaviors. We examined the association between employee absenteeism and 5 conditions: 3 risk factors (smoking, physical inactivity, and obesity) and 2 chronic diseases (hypertension and diabetes). We identified 5 chronic diseases or risk factors from 2 data sources: MarketScan Health Risk Assessment and the Medical Expenditure Panel Survey (MEPS). Absenteeism was measured as the number of workdays missed because of sickness or injury. We used zero-inflated Poisson regression to estimate excess absenteeism as the difference in the number of days missed from work by those who reported having a risk factor or chronic disease and those who did not. Covariates included demographics (eg, age, education, sex) and employment variables (eg, industry, union membership). We quantified absenteeism costs in 2011 and adjusted them to reflect growth in employment costs to 2015 dollars. Finally, we estimated absenteeism costs for a hypothetical small employer (100 employees) and a hypothetical large employer (1,000 employees). Absenteeism estimates ranged from 1 to 2 days per individual per year depending on the risk factor or chronic disease. Except for the physical inactivity and obesity estimates, disease- and risk-factor-specific estimates were similar in MEPS and MarketScan. Absenteeism increased with the number of risk factors or diseases reported. Nationally, each risk factor or disease was associated with annual absenteeism costs greater than $2 billion. Absenteeism costs ranged from $16 to $81 (small employer) and $17 to $286 (large employer) per employee per year. Absenteeism costs associated with chronic diseases and health risk factors can be substantial. Employers may incur these costs through lower productivity, and employees could incur costs through lower wages.

  19. Health Care Utilization and Expenditures Attributable to Cigar Smoking Among US Adults, 2000-2015.

    PubMed

    Wang, Yingning; Sung, Hai-Yen; Yao, Tingting; Lightwood, James; Max, Wendy

    Cigar use in the United States is a growing public health concern because of its increasing popularity. We estimated health care utilization and expenditures attributable to cigar smoking among US adults aged ≥35. We analyzed data on 84 178 adults using the 2000, 2005, 2010, and 2015 National Health Interview Surveys. We estimated zero-inflated Poisson (ZIP) regression models on hospital nights, emergency department (ED) visits, physician visits, and home-care visits as a function of tobacco use status and other covariates. Tobacco use status comprised current sole cigar smokers (ie, smoke cigars only), current poly cigar smokers (smoke cigars and smoke cigarettes or use smokeless tobacco), former sole cigar smokers (used to smoke cigars only), former poly cigar smokers (used to smoke cigars and smoke cigarettes or use smokeless tobacco), other tobacco users (ever smoked cigarettes or used smokeless tobacco but not cigars), and never tobacco users (never smoked cigars, smoked cigarettes, or used smokeless tobacco). We calculated health care utilization attributable to current and former sole cigar smoking based on the estimated ZIP models, and then we calculated total health care expenditures attributable to cigar smoking. Current and former sole cigar smoking was associated with excess annual utilization of 72 137 hospital nights, 32 748 ED visits, and 420 118 home-care visits. Annual health care expenditures attributable to sole cigar smoking were $284 million ($625 per sole cigar smoker), and total annual health care expenditures attributable to sole and poly cigar smoking were $1.75 billion. Comprehensive tobacco control policies and interventions are needed to reduce cigar smoking and the associated health care burden.

  20. Legalization of recreational marijuana and community sales policy in Oregon: Impact on adolescent willingness and intent to use, parent use, and adolescent use.

    PubMed

    Rusby, Julie C; Westling, Erika; Crowley, Ryann; Light, John M

    2018-02-01

    Studies investigating the impact of medical marijuana legalization have found no significant changes in adolescent use. In one of the few studies focused on recreational marijuana, we investigated how recreational marijuana legalization and community sales policy influenced factors that likely impact youth use (youth willingness and intent to use, parent use) as well as youth use. Legalization of recreational marijuana in Oregon coincided with our study on adolescent substance use. Cohort 1 transitioned from 8th to 9th grade prior to legalization and Cohort 2 made this transition during legalization (N = 444; 53% female). Communities were allowed to opt out of sales. Multivariate linear regression models estimated the impact of legalization and community sales policy on changes in attitudes and parent use (2 time points 1 year apart). Zero-inflated Poisson growth curve models estimated the effects on initial levels and rate of change from 8th through 9th grade (4 time points). In communities opting out of sales, the prior-to-legalization cohort was less likely to increase their willingness and intent to use marijuana, and the legalization cohort was more likely to increase intent to use. For youth who used marijuana, legalization was associated with increased use, and those in communities opting out of sales had greater growth in marijuana use. Community policy appears to impact youth attitudes toward, and use of, marijuana. Results suggest that legalization of recreational marijuana did not increase marijuana use for youth who did not use marijuana but did increase use in youth who were already using. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  1. Identification of burden hotspots and risk factors for cholera in India: An observational study

    PubMed Central

    Sen Gupta, Sanjukta; Arora, Nisha; Khasnobis, Pradeep; Venkatesh, Srinivas; Sur, Dipika; Nair, Gopinath B.; Sack, David A.; Ganguly, Nirmal K.

    2017-01-01

    Background Even though cholera has existed for centuries and many parts of the country have sporadic, endemic and epidemic cholera, it is still an under-recognized health problem in India. A Cholera Expert Group in the country was established to gather evidence and to prepare a road map for control of cholera in India. This paper identifies cholera burden hotspots and factors associated with an increased risk of the disease. Methodology/Principal findings We acquired district level data on cholera case reports of 2010–2015 from the Integrated Disease Surveillance Program. Socioeconomic characteristics and coverage of water and sanitation were obtained from the 2011 census. Spatial analysis was performed to identify cholera hotspots, and a zero-inflated Poisson regression was employed to identify the factors associated with cholera and the predicted case count in a district. 27,615 cholera cases were reported during the 6-year period. Twenty-four of 36 states of India reported cholera during these years, and 13 states were classified as endemic. Of 641 districts, 78 districts in 15 states were identified as "hotspots" based on the reported cases. On the other hand, 111 districts in nine states were identified as "hotspots" from the model-based predicted number of cases. The risk for cholera in a district was negatively associated with the coverage of literate persons, households using a treated water source, and households owning a mobile telephone, and positively associated with the coverage of poor sanitation and drainage conditions and the urbanization level in the district. Conclusions/Significance The study reaffirms that cholera continues to occur throughout a large part of India and identifies the burden hotspots and risk factors. Policymakers may use the findings of the article to develop a roadmap for prevention and control of cholera in India. PMID:28837645

  2. Equity in children's dental caries before and after cessation of community water fluoridation: differential impact by dental insurance status and geographic material deprivation.

    PubMed

    McLaren, Lindsay; McNeil, Deborah A; Potestio, Melissa; Patterson, Steve; Thawer, Salima; Faris, Peter; Shi, Congshi; Shwart, Luke

    2016-02-11

    One of the main arguments made in favor of community water fluoridation is that it is equitable in its impact on dental caries (i.e., helps to offset inequities in dental caries). Although an equitable effect of fluoridation has been demonstrated in cross-sectional studies, it has not been studied in the context of cessation of community water fluoridation (CWF). The objective of this study was to compare the socio-economic patterns of children's dental caries (tooth decay) in Calgary, Canada, in 2009/10 when CWF was in place, and in 2013/14, after it had been discontinued. We analyzed data from population-based samples of schoolchildren (grade 2) in 2009/10 and 2013/14. Data on dental caries (decayed, missing, and filled primary and permanent teeth) were gathered via open mouth exams conducted in schools by registered dental hygienists. We examined the association between dental caries and 1) presence/absence of dental insurance and 2) small area index of material deprivation, using Poisson (zero-inflated) and logistic regression, for both time points separately. For small-area material deprivation at each time point, we also computed the concentration index of inequality for each outcome variable. Statistically significant inequities by dental insurance status and by small area material deprivation were more apparent in 2013/14 than in 2009/10. Results are consistent with increasing inequities in dental caries following cessation of CWF. However, further research is needed to 1) confirm the effects in a study that includes a comparison community, and 2) explore possible alternative reasons for the findings, including changes in treatment and preventive programming.

  3. Predictors of temporary and permanent work disability in patients with inflammatory bowel disease: results of the swiss inflammatory bowel disease cohort study.

    PubMed

    Siebert, Uwe; Wurm, Johannes; Gothe, Raffaella Matteucci; Arvandi, Marjan; Vavricka, Stephan R; von Känel, Roland; Begré, Stefan; Sulz, Michael C; Meyenberger, Christa; Sagmeister, Markus

    2013-01-01

    Inflammatory bowel disease can decrease the quality of life and induce work disability. We sought to (1) identify and quantify the predictors of disease-specific work disability in patients with inflammatory bowel disease and (2) assess the suitability of using cross-sectional data to predict future outcomes, using the Swiss Inflammatory Bowel Disease Cohort Study data. A total of 1187 patients were enrolled and followed up for an average of 13 months. Predictors included patient and disease characteristics and drug utilization. Potential predictors were identified through an expert panel and published literature. We estimated adjusted effect estimates with 95% confidence intervals using logistic and zero-inflated Poisson regression. Overall, 699 (58.9%) had Crohn's disease and 488 (41.1%) had ulcerative colitis. The most important predictors for temporary work disability in patients with Crohn's disease included gender, disease duration, disease activity, C-reactive protein level, smoking, depressive symptoms, fistulas, extraintestinal manifestations, and the use of immunosuppressants/steroids. Temporary work disability in patients with ulcerative colitis was associated with age, disease duration, disease activity, and the use of steroids/antibiotics. In all patients, disease activity emerged as the only predictor of permanent work disability. Comparing data at enrollment versus follow-up yielded substantial differences regarding disability and predictors, with follow-up data showing greater predictor effects. We identified predictors of work disability in patients with Crohn's disease and ulcerative colitis. Our findings can help in forecasting these disease courses and guide the choice of appropriate measures to prevent adverse outcomes. Comparing cross-sectional and longitudinal data showed that cohort studies are indispensable for the examination of disability.

  4. Healthcare costs attributable to secondhand smoke exposure at home for U.S. adults.

    PubMed

    Yao, Tingting; Sung, Hai-Yen; Wang, Yingning; Lightwood, James; Max, Wendy

    2018-03-01

    To estimate healthcare costs attributable to secondhand smoke (SHS) exposure at home among nonsmoking adults (18+) in the U.S. We analyzed data on nonsmoking adults (N=67,735) from the 2000, 2005, and 2010 (the latest available data on SHS exposure at home) U.S. National Health Interview Surveys. This study was conducted from 2015 to 2017. We examined hospital nights, home care visits, doctor visits, and emergency room (ER) visits. For each, we analyzed the association of SHS exposure at home with healthcare utilization with a Zero-Inflated Poisson regression model controlling for socio-demographic and other risk characteristics. Excess healthcare utilization attributable to SHS exposure at home was determined and multiplied by unit costs derived from the 2014 Medical Expenditures Panel Survey to determine annual SHS-attributable healthcare costs. SHS exposure at home was positively associated with hospital nights and ER visits, but was not statistically associated with home care visits and doctor visits. Exposed adults had 1.28 times more hospital nights and 1.16 times more ER visits than non-exposed adults. Annual SHS-attributable healthcare costs totaled $4.6 billion (including $3.8 billion for hospital nights and $0.8 billion for ER visits, 2014 dollars) in 2000, $2.1 billion (including $1.8 billion for hospital nights and $0.3 billion for ER visits) in 2005, and $1.9 billion (including $1.6 billion for hospital nights and $0.4 billion for ER visits) in 2010. SHS-attributable costs remain high, but have fallen over time. Tobacco control efforts are needed to further reduce SHS exposure at home and associated healthcare costs.

  5. Characterizing the effect of summer temperature on heatstroke-related emergency ambulance dispatches in the Kanto area of Japan

    NASA Astrophysics Data System (ADS)

    Ng, Chris Fook Sheng; Ueda, Kayo; Ono, Masaji; Nitta, Hiroshi; Takami, Akinori

    2014-07-01

    Despite rising concern on the impact of heat on human health, the risk of high summer temperature on heatstroke-related emergency dispatches is not well understood in Japan. A time-series study was conducted to examine the association between apparent temperature and daily heatstroke-related ambulance dispatches (HSAD) within the Kanto area of Japan. A total of 12,907 HSAD occurring from 2000 to 2009 in five major cities—Saitama, Chiba, Tokyo, Kawasaki, and Yokohama—were analyzed. Generalized additive models and zero-inflated Poisson regressions were used to estimate the effects of daily maximum three-hour apparent temperature (AT) on dispatch frequency from May to September, with adjustment for seasonality, long-term trend, weekends, and public holidays. Linear and non-linear exposure effects were considered. Effects on days when AT first exceeded its summer median were also investigated. City-specific estimates were combined using random effects meta-analyses. Exposure-response relationship was found to be fairly linear. Significant risk increase began from 21 °C with a combined relative risk (RR) of 1.22 (95 % confidence interval, 1.03-1.44), increasing to 1.49 (1.42-1.57) at peak AT. When linear exposure was assumed, combined RR was 1.43 (1.37-1.50) per degree Celsius increment. Overall association was significant the first few times when median AT was initially exceeded in a particular warm season. More than two-thirds of these initial hot days were in June, implying the harmful effect of initial warming as the season changed. Risk increase that began early at the fairly mild perceived temperature implies the need for early precaution.

  6. Characterizing the effect of summer temperature on heatstroke-related emergency ambulance dispatches in the Kanto area of Japan.

    PubMed

    Ng, Chris Fook Sheng; Ueda, Kayo; Ono, Masaji; Nitta, Hiroshi; Takami, Akinori

    2014-07-01

    Despite rising concern on the impact of heat on human health, the risk of high summer temperature on heatstroke-related emergency dispatches is not well understood in Japan. A time-series study was conducted to examine the association between apparent temperature and daily heatstroke-related ambulance dispatches (HSAD) within the Kanto area of Japan. A total of 12,907 HSAD occurring from 2000 to 2009 in five major cities-Saitama, Chiba, Tokyo, Kawasaki, and Yokohama-were analyzed. Generalized additive models and zero-inflated Poisson regressions were used to estimate the effects of daily maximum three-hour apparent temperature (AT) on dispatch frequency from May to September, with adjustment for seasonality, long-term trend, weekends, and public holidays. Linear and non-linear exposure effects were considered. Effects on days when AT first exceeded its summer median were also investigated. City-specific estimates were combined using random effects meta-analyses. Exposure-response relationship was found to be fairly linear. Significant risk increase began from 21 °C with a combined relative risk (RR) of 1.22 (95% confidence interval, 1.03-1.44), increasing to 1.49 (1.42-1.57) at peak AT. When linear exposure was assumed, combined RR was 1.43 (1.37-1.50) per degree Celsius increment. Overall association was significant the first few times when median AT was initially exceeded in a particular warm season. More than two-thirds of these initial hot days were in June, implying the harmful effect of initial warming as the season changed. Risk increase that began early at the fairly mild perceived temperature implies the need for early precaution.

  7. Age, occupational class and sickness absence during pregnancy: a retrospective analysis study of the Norwegian population registry

    PubMed Central

    Ariansen, Anja M S

    2014-01-01

    Objective Western women increasingly delay having children to advance their career, and pregnancy is considered to be riskier among older women. In Norway, this development surprisingly coincides with increased sickness absence among young pregnant women, rather than their older counterparts. This paper tests the hypothesis that young pregnant women have a higher number of sick days because this age group includes a higher proportion of working class women, who are more prone to sickness absence. Design A zero-inflated Poisson regression was conducted on the Norwegian population registry. Participants All pregnant employees giving birth in 2004–2008 were included in the study. A total number of 216 541 pregnancies were observed among 180 483 women. Outcome measure Number of sick days. Results Although the association between age and number of sick days was U-shaped, pregnant women in their early 20s had a higher number of sick days than those in their mid-40s. This was particularly the case for pregnant women with previous births. In this group, 20-year-olds had 12.6 more sick days than 45-year-olds; this age difference was reduced to 6.3 after control for class. Among women undergoing their first pregnancy, 20-year-olds initially had 1.2 more sick days than 45-year-olds, but control for class reversed this age difference. After control for class, 45-year-old first-time pregnant women had 2.9 more sick days than 20-year-olds with corresponding characteristics. Conclusions The negative association between age and sickness absence was partly due to younger age groups including more working class women, who were more prone to sickness absence. Young pregnant women's needs for job adjustments should not be underestimated. PMID:24793246

  8. Absenteeism and Employer Costs Associated With Chronic Diseases and Health Risk Factors in the US Workforce

    PubMed Central

    Roy, Kakoli; Lang, Jason E.; Payne, Rebecca L.; Howard, David H.

    2016-01-01

    Introduction Employers may incur costs related to absenteeism among employees who have chronic diseases or unhealthy behaviors. We examined the association between employee absenteeism and 5 conditions: 3 risk factors (smoking, physical inactivity, and obesity) and 2 chronic diseases (hypertension and diabetes). Methods We identified 5 chronic diseases or risk factors from 2 data sources: MarketScan Health Risk Assessment and the Medical Expenditure Panel Survey (MEPS). Absenteeism was measured as the number of workdays missed because of sickness or injury. We used zero-inflated Poisson regression to estimate excess absenteeism as the difference in the number of days missed from work by those who reported having a risk factor or chronic disease and those who did not. Covariates included demographics (eg, age, education, sex) and employment variables (eg, industry, union membership). We quantified absenteeism costs in 2011 and adjusted them to reflect growth in employment costs to 2015 dollars. Finally, we estimated absenteeism costs for a hypothetical small employer (100 employees) and a hypothetical large employer (1,000 employees). Results Absenteeism estimates ranged from 1 to 2 days per individual per year depending on the risk factor or chronic disease. Except for the physical inactivity and obesity estimates, disease- and risk-factor–specific estimates were similar in MEPS and MarketScan. Absenteeism increased with the number of risk factors or diseases reported. Nationally, each risk factor or disease was associated with annual absenteeism costs greater than $2 billion. Absenteeism costs ranged from $16 to $81 (small employer) and $17 to $286 (large employer) per employee per year. Conclusion Absenteeism costs associated with chronic diseases and health risk factors can be substantial. Employers may incur these costs through lower productivity, and employees could incur costs through lower wages. PMID:27710764
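
    The excess-days comparison above follows from the zero-inflated Poisson mean structure, E[Y] = (1 - pi) * lambda, where pi is the structural-zero probability and lambda is the Poisson mean. A small sketch with hypothetical numbers (assumed for illustration, not the study's estimates) shows how a difference of such means yields excess absenteeism:

```python
def zip_mean(pi_zero: float, lam: float) -> float:
    """Expected count under a zero-inflated Poisson:
    a structural zero with probability pi_zero, otherwise Poisson(lam)."""
    return (1.0 - pi_zero) * lam

# Hypothetical fitted components for workers with and without a risk factor
days_with = zip_mean(pi_zero=0.40, lam=6.0)      # (1 - 0.40) * 6.0 = 3.6
days_without = zip_mean(pi_zero=0.45, lam=4.0)   # (1 - 0.45) * 4.0 = 2.2
excess_days = days_with - days_without

print(round(excess_days, 1))  # 1.4, within the 1-2 day range reported above
```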

  9. Spatial and temporal patterns of dengue infections in Timor-Leste, 2005-2013.

    PubMed

    Wangdi, Kinley; Clements, Archie C A; Du, Tai; Nery, Susana Vaz

    2018-01-04

    Dengue remains an important public health problem in Timor-Leste, with several major epidemics occurring over the last 10 years. The aim of this study was to identify dengue clusters at high geographical resolution and to determine the association between local environmental characteristics and the distribution and transmission of the disease. Notifications of dengue cases that occurred from January 2005 to December 2013 were obtained from the Ministry of Health, Timor-Leste. The population of each suco (the third-level administrative subdivision) was obtained from the Population and Housing Census 2010. Spatial autocorrelation in dengue incidence was explored using Moran's I statistic, Local Indicators of Spatial Association (LISA), and the Getis-Ord statistics. A multivariate, Zero-Inflated, Poisson (ZIP) regression model was developed with a conditional autoregressive (CAR) prior structure, and with posterior parameters estimated using Bayesian Markov chain Monte Carlo (MCMC) simulation with Gibbs sampling. The analysis used data from 3206 cases. Dengue incidence was highly seasonal with a large peak in January. Patients ≥ 14 years were found to be 74% [95% credible interval (CrI): 72-76%] less likely to be infected than those < 14 years, and females were 12% (95% CrI: 4-21%) more likely to suffer from dengue as compared to males. Dengue incidence increased by 0.7% (95% CrI: 0.6-0.8%) for a 1 °C increase in mean temperature; and 47% (95% CrI: 29-59%) for a 1 mm increase in precipitation. There was no significant residual spatial clustering after accounting for climate and demographic variables. Dengue incidence was highly seasonal and spatially clustered, with positive associations with temperature, precipitation and demographic factors. These factors explained the observed spatial heterogeneity of infection.

  10. Hospitalizations for asthma among adults exposed to the September 11, 2001 World Trade Center terrorist attack.

    PubMed

    Miller-Archie, Sara A; Jordan, Hannah T; Alper, Howard; Wisnivesky, Juan P; Cone, James E; Friedman, Stephen M; Brackbill, Robert M

    2018-04-01

    We described the patterns of asthma hospitalization among persons exposed to the 2001 World Trade Center (WTC) attacks, and assessed whether 9/11-related exposures or comorbidities, including posttraumatic stress disorder (PTSD) and gastroesophageal reflux symptoms (GERS), were associated with an increased rate of hospitalization. Data for adult enrollees in the WTC Health Registry, a prospective cohort study, with self-reported physician-diagnosed asthma who resided in New York State on 9/11 were linked to administrative hospitalization data to identify asthma hospitalizations during September 11, 2001-December 31, 2010. Multivariable zero-inflated Poisson regression was used to examine associations among 9/11 exposures, comorbid conditions, and asthma hospitalizations. Of 11 471 enrollees with asthma, 406 (3.5%) had ≥1 asthma hospitalization during the study period (721 total hospitalizations). Among enrollees diagnosed before 9/11 (n = 6319), those with PTSD or GERS had over twice the rate of hospitalization (adjusted rate ratio (ARR) = 2.5, 95% CI = 1.4-4.1; ARR = 2.1, 95% CI = 1.3-3.2, respectively) compared to those without. This association was not statistically significant in enrollees diagnosed after 9/11. Compared to higher educational attainment, completing less than college was associated with an increased hospitalization rate among participants with both pre-9/11- and post-9/11-onset asthma (ARR = 1.9, 95% CI = 1.2-2.9; ARR = 2.6, 95% CI = 1.6-4.1, respectively). Sinus symptoms, exposure to the dust cloud, and having been a WTC responder were not associated with asthma hospitalization. Among enrollees with pre-9/11 asthma, comorbid PTSD and GERS were associated with an increase in asthma hospitalizations. Management of these comorbidities may be an important factor in preventing hospitalization.

  11. Identification of burden hotspots and risk factors for cholera in India: An observational study.

    PubMed

    Ali, Mohammad; Sen Gupta, Sanjukta; Arora, Nisha; Khasnobis, Pradeep; Venkatesh, Srinivas; Sur, Dipika; Nair, Gopinath B; Sack, David A; Ganguly, Nirmal K

    2017-01-01

    Even though cholera has existed for centuries and many parts of the country have sporadic, endemic and epidemic cholera, it is still an under-recognized health problem in India. A Cholera Expert Group in the country was established to gather evidence and to prepare a road map for control of cholera in India. This paper identifies cholera burden hotspots and factors associated with an increased risk of the disease. We acquired district level data on cholera case reports of 2010-2015 from the Integrated Disease Surveillance Program. Socioeconomic characteristics and coverage of water and sanitation were obtained from the 2011 census. Spatial analysis was performed to identify cholera hotspots, and a zero-inflated Poisson regression was employed to identify the factors associated with cholera and predicted case count in the district. 27,615 cholera cases were reported during the 6-year period. Twenty-four of 36 states of India reported cholera during these years, and 13 states were classified as endemic. Of 641 districts, 78 districts in 15 states were identified as "hotspots" based on the reported cases. On the other hand, 111 districts in nine states were identified as "hotspots" from model-based predicted number of cases. The risk for cholera in a district was negatively associated with the coverage of literate persons, households using treated water source and owning mobile telephone, and positively associated with the coverage of poor sanitation and drainage conditions and urbanization level in the district. The study reaffirms that cholera continues to occur throughout a large part of India and identifies the burden hotspots and risk factors. Policymakers may use the findings of the article to develop a roadmap for prevention and control of cholera in India.

  12. Effectiveness of Healthy Relationships Video-Group—A Videoconferencing Group Intervention for Women Living with HIV: Preliminary Findings from a Randomized Controlled Trial

    PubMed Central

    Buhi, Eric R.; Baldwin, Julie; Chen, Henian; Johnson, Ayesha; Lynn, Vickie; Glueckauf, Robert

    2014-01-01

    Introduction: Expanded access to efficacious interventions is needed for women living with human immunodeficiency virus (WLH) in the United States. Availability of “prevention with (human immunodeficiency virus [HIV]) positives” interventions in rural/remote and low HIV prevalence areas remains limited, leaving WLH in these communities few options for receiving effective behavioral interventions such as Healthy Relationships (HR). Offering such programs via videoconferencing groups (VGs) may expand access. This analysis tests the effectiveness of HR-VG (versus wait-list control) for reducing sexual risk behavior among WLH and explores intervention satisfaction. Subjects and Methods: In this randomized controlled trial, unprotected vaginal/anal sex occasions over the prior 3 months reported at the 6-month follow-up were compared across randomization groups through zero-inflated Poisson regression modeling, controlling for unprotected sex at baseline. Seventy-one WLH were randomized and completed the baseline assessment (n=36 intervention and n=35 control); 59 (83% in each group) had follow-up data. Results: Among those who engaged in unprotected sex at 6-month follow-up, intervention participants had approximately seven fewer unprotected occasions than control participants (95% confidence interval 5.43–7.43). Intervention participants reported high levels of satisfaction with HR-VG; 84% reported being “very satisfied” overall. Conclusions: This study found promising evidence for effective dissemination of HIV risk reduction interventions via VGs. Important next steps will be to determine whether VGs are effective with other subpopulations of people living with HIV (i.e., men and non-English speakers) and to assess cost-effectiveness. Possibilities for using VGs to expand access to other psychosocial and behavioral interventions and reduce stigma are discussed. PMID:24237482

  13. Racial and Ethnic Service Use Disparities Among Homeless Adults With Severe Mental Illnesses Receiving ACT

    PubMed Central

    Horvitz-Lennon, Marcela; Zhou, Dongli; Normand, Sharon-Lise T.; Alegría, Margarita; Thompson, Wes K.

    2013-01-01

    Objective Case management–based interventions aimed at improving quality of care have the potential to narrow racial and ethnic disparities among people with chronic illnesses. The aim of this study was to assess the equity effects of assertive community treatment (ACT), an evidence-based case management intervention, among homeless adults with severe mental illness. Methods This study used baseline, three-, and 12-month data for 6,829 black, Latino, and white adults who received ACT services through the ACCESS study (Access to Community Care and Effective Services and Support). Zero-inflated Poisson random regression models were used to estimate the adjusted probability of use of outpatient psychiatric services and, among service users, the intensity of use. Odds ratios and rate ratios (RRs) were computed to assess disparities at baseline and over time. Results No disparities were found in probability of use at baseline or over time. Compared with white users, baseline intensity of use was lower for black users (RR=.89; 95% confidence interval [CI]=.83–.96) and Latino users (RR=.65; CI=.52–.81). Intensity did not change over time for whites, but it did for black and Latino users. Intensity increased for blacks between baseline and three months (RR=1.11, CI=1.06–1.17) and baseline and 12 months (RR=1.17, CI=1.11–1.22). Intensity of use dropped for Latinos between baseline and three months (RR=.83, CI=.70–.98). Conclusions Receipt of ACT was associated with a reduction in service use disparities for blacks but not for Latinos. Findings suggest that ACT’s equity effects differ depending on race-ethnicity. PMID:21632726

  14. The Effect of Reactive Oxygen Species on Embryo Quality in IVF.

    PubMed

    Siristatidis, Charalampos; Vogiatzi, Paraskevi; Varounis, Christos; Askoxylaki, Marily; Chrelias, Charalampos; Papantoniou, Nikolaos

    2016-01-01

    BACKGROUND/AIM: Reactive oxygen species (ROS) are involved in critical biological processes in human reproduction. The aim of this study was to evaluate the association of embryo quality following in vitro fertilization (IVF) with ROS levels in the serum and follicular fluid (FF). Eighty-five participants underwent ovarian stimulation and IVF; ROS levels were measured in blood samples on the day of oocyte retrieval and in the FF from follicular aspirates using enzyme-linked immunosorbent assay. These values were associated with the quality of embryos generated. A univariable zero-inflated Poisson model revealed that ROS levels, both in serum at oocyte retrieval and in FF, were not associated with the number of grade I, II, III and IV embryos (p>0.05). Age, body mass index, stimulation protocol and smoking status were not associated with the number of embryos of any grade (p>0.05). Neither ROS levels in serum nor in FF were associated with the quality of embryos produced following IVF.

  15. Non-Poisson Processes: Regression to Equilibrium Versus Equilibrium Correlation Functions

    DTIC Science & Technology

    2004-07-07

    Physica A 347 (2005) 268–288. PACS: 05.40.-a; 89.75.-k; 02.50.Ey. Keywords: Stochastic processes; Non-Poisson processes; Liouville and Liouville-like equations; Correlation function. Abstract fragment: “…which is not legitimate with renewal non-Poisson processes, is a correct property if the deviation from the exponential relaxation is obtained by time…”

  16. Non-linear properties of metallic cellular materials with a negative Poisson's ratio

    NASA Technical Reports Server (NTRS)

    Choi, J. B.; Lakes, R. S.

    1992-01-01

    Negative Poisson's ratio copper foam was prepared and characterized experimentally. The transformation into re-entrant foam was accomplished by applying sequential permanent compressions above the yield point to achieve a triaxial compression. The Poisson's ratio of the re-entrant foam depended on strain and attained a relative minimum at strains near zero. Poisson's ratio as small as -0.8 was achieved. The strain dependence of properties occurred over a narrower range of strain than in the polymer foams studied earlier. Annealing of the foam resulted in a slightly greater magnitude of negative Poisson's ratio and greater toughness at the expense of a decrease in the Young's modulus.

  17. A flexible count data model to fit the wide diversity of expression profiles arising from extensively replicated RNA-seq experiments

    PubMed Central

    2013-01-01

    Background High-throughput RNA sequencing (RNA-seq) offers unprecedented power to capture the real dynamics of gene expression. Experimental designs with extensive biological replication present a unique opportunity to exploit this feature and distinguish expression profiles with higher resolution. RNA-seq data analysis methods so far have been mostly applied to data sets with few replicates and their default settings try to provide the best performance under this constraint. These methods are based on two well-known count data distributions: the Poisson and the negative binomial. The way to properly calibrate them with large RNA-seq data sets is not trivial for the non-expert bioinformatics user. Results Here we show that expression profiles produced by extensively-replicated RNA-seq experiments lead to a rich diversity of count data distributions beyond the Poisson and the negative binomial, such as Poisson-Inverse Gaussian or Pólya-Aeppli, which can be captured by a more general family of count data distributions called the Poisson-Tweedie. The flexibility of the Poisson-Tweedie family enables a direct fitting of emerging features of large expression profiles, such as heavy-tails or zero-inflation, without the need to alter a single configuration parameter. We provide a software package for R called tweeDEseq implementing a new test for differential expression based on the Poisson-Tweedie family. Using simulations on synthetic and real RNA-seq data we show that tweeDEseq yields P-values that are equally or more accurate than competing methods under different configuration parameters. By surveying the tiny fraction of sex-specific gene expression changes in human lymphoblastoid cell lines, we also show that tweeDEseq accurately detects differentially expressed genes in a real large RNA-seq data set with improved performance and reproducibility over the previously compared methodologies. 
Finally, we compared the results with those obtained from microarrays in order to check for reproducibility. Conclusions RNA-seq data with many replicates leads to a handful of count data distributions which can be accurately estimated with the statistical model illustrated in this paper. This method provides a better fit to the underlying biological variability; this may be critical when comparing groups of RNA-seq samples with markedly different count data distributions. The tweeDEseq package forms part of the Bioconductor project and it is available for download at http://www.bioconductor.org. PMID:23965047

  18. Bycatch, bait, anglers, and roads: quantifying vector activity and propagule introduction risk across lake ecosystems.

    PubMed

    Drake, D Andrew R; Mandrak, Nicholas E

    2014-06-01

Long implicated in the invasion process, live-bait anglers are highly mobile species vectors with frequent overland transport of fishes. To test hypotheses about the role of anglers in propagule transport, we developed a social-ecological model quantifying the opportunity for species transport beyond the invaded range resulting from bycatch during commercial bait operations, incidental transport, and release to lake ecosystems by anglers. We combined a gravity model with a stochastic, agent-based simulation, representing a 1-yr iteration of live-bait angling and the dynamics of propagule transport at fine spatiotemporal scales (i.e., probability of introducing n propagules per lake per year). A baseline scenario involving round goby (Neogobius melanostomus) indicated that most angling trips were benign; irrespective of lake visitation, anglers failed to purchase and transport propagules (benign trips, median probability P = 0.99912). However, given the large number of probability trials (4.2 million live-bait angling events per year), even the rarest sequence of events (uptake, movement, and deposition of propagules) is anticipated to occur. Risky trips (modal P = 0.00088; approximately 1 in 1136 trips) were sufficient to introduce a substantial number of propagules (modal values: Poisson model, 3715 propagules among 1288 lakes per year; zero-inflated negative binomial model, 6722 propagules among 1292 lakes per year). Two patterns of lake-specific introduction risk emerged. Large lakes supporting substantial angling activity experienced propagule pressure likely to surpass demographic barriers to establishment (top 2.5% of lakes with modal outcomes of five to 76 propagules per year; 303 high-risk lakes with three or more propagules per year). Small or remote lakes were less likely to receive propagules; however, most risk distributions were leptokurtic with a long right tail, indicating the rare occurrence of high propagule loads to most waterbodies. Infestation simulations indicated that the number of high-risk waterbodies could be as great as 1318 (zero-inflated negative binomial), whereas a 90% reduction in bycatch from baseline would reduce the modal number of high-risk lakes to zero. Results indicate that the combination of invasive bycatch and live-bait anglers warrants management concern as a species vector, but that risk is confined to a subset of individuals and recipient sites that may be effectively managed with targeted strategies.

  19. Mental Health Symptoms Among Student Service Members/Veterans and Civilian College Students.

    PubMed

    Cleveland, Sandi D; Branscum, Adam J; Bovbjerg, Viktor E; Thorburn, Sheryl

    2015-01-01

The aim of this study was to investigate if and to what extent student service members/veterans differ from civilian college students in the prevalence of self-reported symptoms of poor mental health. The Fall 2011 implementation of the American College Health Association-National College Health Assessment included 27,774 respondents from 44 colleges and universities. Participants were matched using propensity scores, and the prevalence of symptoms was compared using logistic regression and zero-inflated negative binomial regression models. The odds of feeling overwhelmed in the last 12 months were significantly lower among student service members/veterans with a history of hazardous duty (odds ratio [OR] = 0.46, adjusted p value <.05) compared with civilian students. Military service, with and without hazardous duty deployment, was not a significant predictor of the total number of symptoms of poor mental health. Current student service members/veterans may not be disproportionately affected by poor psychological functioning.

  20. 31 CFR 356.20 - How does the Treasury determine auction awards?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and bond issues. We set the interest rate at a 1/8 of one percent increment. If a Treasury inflation-protected securities auction results in a negative or zero yield, the interest rate will be set at zero, and...

  1. Zero inflation in ordinal data: Incorporating susceptibility to response through the use of a mixture model

    PubMed Central

    Kelley, Mary E.; Anderson, Stewart J.

    2008-01-01

    Summary The aim of the paper is to produce a methodology that will allow users of ordinal scale data to more accurately model the distribution of ordinal outcomes in which some subjects are susceptible to exhibiting the response and some are not (i.e., the dependent variable exhibits zero inflation). This situation occurs with ordinal scales in which there is an anchor that represents the absence of the symptom or activity, such as “none”, “never” or “normal”, and is particularly common when measuring abnormal behavior, symptoms, and side effects. Due to the unusually large number of zeros, traditional statistical tests of association can be non-informative. We propose a mixture model for ordinal data with a built-in probability of non-response that allows modeling of the range (e.g., severity) of the scale, while simultaneously modeling the presence/absence of the symptom. Simulations show that the model is well behaved and a likelihood ratio test can be used to choose between the zero-inflated and the traditional proportional odds model. The model, however, does have minor restrictions on the nature of the covariates that must be satisfied in order for the model to be identifiable. The method is particularly relevant for public health research such as large epidemiological surveys where more careful documentation of the reasons for response may be difficult. PMID:18351711

  2. Poisson regression models outperform the geometrical model in estimating the peak-to-trough ratio of seasonal variation: a simulation study.

    PubMed

    Christensen, A L; Lundbye-Christensen, S; Dethlefsen, C

    2011-12-01

Several statistical methods of assessing seasonal variation are available. Brookhart and Rothman [3] proposed a second-order moment-based estimator based on the geometrical model derived by Edwards [1], and reported that this estimator is superior to Edwards' estimator in estimating the peak-to-trough ratio of seasonal variation with respect to bias and mean squared error. Alternatively, seasonal variation may be modelled using a Poisson regression model, which provides flexibility in modelling the pattern of seasonal variation and adjustment for covariates. Based on a Monte Carlo simulation study, three estimators, one based on the geometrical model and two based on log-linear Poisson regression models, were evaluated with regard to bias and standard deviation (SD). We evaluated the estimators on data simulated according to schemes varying in seasonal variation and the presence of a secular trend. All methods and analyses in this paper are available in the R package Peak2Trough [13]. Applying a Poisson regression model resulted in lower absolute bias and SD for data simulated according to the corresponding model assumptions. The Poisson regression models also had lower bias and SD than the geometrical model for data simulated to deviate from the corresponding model assumptions. This simulation study encourages the use of Poisson regression models, as opposed to the geometrical model, in estimating the peak-to-trough ratio of seasonal variation. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
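The log-linear Poisson approach favoured by this abstract can be sketched with a small simulation: fit an intercept plus first-harmonic sine/cosine terms by iteratively reweighted least squares, then read the peak-to-trough ratio off the fitted amplitude. This is a minimal Python illustration, not the authors' R package Peak2Trough; the data, rates, and the hand-rolled IRLS fitter are all hypothetical.

```python
import numpy as np

def fit_poisson_irls(X, y, n_iter=25):
    """Log-linear Poisson regression via iteratively reweighted least squares."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean() + 1.0)          # sensible starting intercept
    for _ in range(n_iter):
        mu = np.exp(X @ beta)                 # fitted means
        z = X @ beta + (y - mu) / mu          # working response
        W = mu                                # Poisson working weights
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

rng = np.random.default_rng(42)
month = np.arange(120)                        # ten years of monthly counts
angle = 2 * np.pi * month / 12
true_ptr = 1.5                                # peak-to-trough ratio used to simulate
amplitude = np.log(true_ptr) / 2
y = rng.poisson(np.exp(3.0 + amplitude * np.cos(angle)))

X = np.column_stack([np.ones_like(angle), np.cos(angle), np.sin(angle)])
beta = fit_poisson_irls(X, y)
ptr = np.exp(2 * np.hypot(beta[1], beta[2]))  # estimated peak-to-trough ratio
```

The peak-to-trough ratio follows because the fitted log-rate oscillates with amplitude sqrt(b1^2 + b2^2), so the ratio of peak to trough rate is exp(2 * amplitude).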

  3. Computation of solar perturbations with Poisson series

    NASA Technical Reports Server (NTRS)

    Broucke, R.

    1974-01-01

Description of a project for computing first-order perturbations of natural or artificial satellites by integrating the equations of motion on a computer with automatic Poisson series expansions. A basic feature of the method of solution is that the classical variation-of-parameters formulation is used rather than rectangular coordinates. However, the variation-of-parameters formulation uses the three rectangular components of the disturbing force rather than the classical disturbing function, so that there is no problem in expanding the disturbing function in series. Another characteristic of the variation-of-parameters formulation employed is that six rather unusual variables are used in order to avoid singularities at zero eccentricity and zero (or 90 deg) inclination. The integration process starts by assuming that all the orbit elements present on the right-hand sides of the equations of motion are constants. These right-hand sides are then simple Poisson series which can be obtained with the use of the Bessel expansions of the two-body problem in conjunction with certain iteration methods. These Poisson series can then be integrated term by term, and a first-order solution is obtained.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gim, Yongwan; Kim, Wontae, E-mail: yongwan89@sogang.ac.kr, E-mail: wtkim@sogang.ac.kr

In warm inflation scenarios, radiation always exists, so that the radiation energy density is also assumed to be finite when inflation starts. To find out the origin of the non-vanishing initial radiation energy density, we revisit the thermodynamic analysis for a warm inflation model and then derive an effective Stefan-Boltzmann law which is commensurate with the temperature-dependent effective potential by taking into account the non-vanishing trace of the total energy-momentum tensors. The effective Stefan-Boltzmann law shows that the zero energy density for radiation at the Grand Unification epoch increases until the inflation starts and it becomes eventually finite at the initial stage of warm inflation. By using the above effective Stefan-Boltzmann law, we also study the cosmological scalar perturbation, and obtain the sufficient radiation energy density in order for GUT baryogenesis at the end of inflation.

  5. Factors associated with the frequency of monitoring of liver enzymes, renal function and lipid laboratory markers among individuals initiating combination antiretroviral therapy: a cohort study.

    PubMed

    Gillis, Jennifer; Bayoumi, Ahmed M; Burchell, Ann N; Cooper, Curtis; Klein, Marina B; Loutfy, Mona; Machouf, Nima; Montaner, Julio Sg; Tsoukas, Chris; Hogg, Robert S; Raboud, Janet

    2015-10-26

    As the average age of the HIV-positive population increases, there is increasing need to monitor patients for the development of comorbidities as well as for drug toxicities. We examined factors associated with the frequency of measurement of liver enzymes, renal function tests, and lipid levels among participants of the Canadian Observational Cohort (CANOC) collaboration which follows people who initiated HIV antiretroviral therapy in 2000 or later. We used zero-inflated negative binomial regression models to examine the associations of demographic and clinical characteristics with the rates of measurement during follow-up. Generalized estimating equations with a logit link were used to examine factors associated with gaps of 12 months or more between measurements. Electronic laboratory data were available for 3940 of 7718 CANOC participants. The median duration of electronic follow-up was 3.5 years. The median (interquartile) rates of tests per year were 2.76 (1.60, 3.73), 2.55 (1.44, 3.38) and 1.42 (0.50, 2.52) for liver, renal and lipid parameters, respectively. In multivariable zero-inflated negative binomial regression models, individuals infected through injection drug use (IDU) were significantly less likely to have any measurements. Among participants with at least one measurement, rates of measurement of liver, renal and lipid tests were significantly lower for younger individuals and Aboriginal Peoples. Hepatitis C co-infected individuals with a history of IDU had lower rates of measurement and were at greater risk of having 12 month gaps between measurements. Hepatitis C co-infected participants infected through IDU were at increased risk of gaps in testing, despite publicly funded health care and increased risk of comorbid conditions. This should be taken into consideration in analyses examining factors associated with outcomes based on laboratory parameters.

  6. Age and the economics of an emergency medical admission-what factors determine costs?

    PubMed

    McCabe, J J; Cournane, S; Byrne, D; Conway, R; O'Riordan, D; Silke, B

    2017-02-01

The ageing of the population may be anticipated to increase demand on hospital resources. We have investigated the relationship between hospital episode costs and age profile in a single centre. All emergency medical admissions (33 732 episodes) to an Irish hospital over a 6-year period, categorized into three age groups, were evaluated against total hospital episode costs. Univariate and adjusted incidence rate ratios (IRRs) were calculated using zero-truncated Poisson regression. The total hospital episode cost increased with age (P < 0.001). The multivariable Poisson regression model demonstrated that the most important drivers of overall costs were Acute Illness Severity (IRR 1.36; 95% CI: 1.30, 1.41), Sepsis Status (IRR 1.46; 95% CI: 1.42, 1.51) and Chronic Disabling Disease Score (IRR 1.25; 95% CI: 1.22, 1.27), as well as the Age Group, exemplified by an IRR of 1.23 (95% CI: 1.15, 1.32) for those aged 85 years. Total hospital episode costs are a product of clinical complexity, with contributions from the Acute Illness Severity, Co-Morbidity, Chronic Disabling Disease Score and Sepsis Status. However, age is also an important contributor, and an increasing patient age profile will have a predictable impact on total hospital episode costs. © The Author 2016. Published by Oxford University Press on behalf of the Association of Physicians. All rights reserved. For Permissions, please email: journals.permissions@oup.com
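The zero-truncated Poisson regression used here conditions on at least one event being observed: P(Y = y | Y > 0) = e^(-lambda) lambda^y / (y! (1 - e^(-lambda))). For the intercept-only case the maximum-likelihood estimate solves mean(y) = lambda / (1 - e^(-lambda)), which can be sketched as follows (hypothetical simulated data, not the authors' analysis):

```python
import numpy as np
from scipy.optimize import brentq

def zt_poisson_mle(y):
    """MLE of lambda for a zero-truncated Poisson sample.

    The score equation equates the truncated mean to the sample mean:
        lambda / (1 - exp(-lambda)) = mean(y)."""
    ybar = y.mean()
    return brentq(lambda lam: lam / (1.0 - np.exp(-lam)) - ybar, 1e-6, ybar + 10.0)

rng = np.random.default_rng(1)
draws = rng.poisson(2.5, 20000)
sample = draws[draws > 0]   # counts are only recorded when at least one event occurs

lam_hat = zt_poisson_mle(sample)
```

The naive sample mean of the truncated data overestimates lambda (here about 2.72 rather than 2.5); the root-finding step corrects exactly for the missing zeros.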

  7. Atmospheric pollutants and hospital admissions due to pneumonia in children

    PubMed Central

    Negrisoli, Juliana; Nascimento, Luiz Fernando C.

    2013-01-01

OBJECTIVE: To analyze the relationship between exposure to air pollutants and hospitalizations due to pneumonia in children of Sorocaba, São Paulo, Brazil. METHODS: Time series ecological study, from 2007 to 2008. Daily data were obtained from the State Environmental Agency for Pollution Control for particulate matter, nitric oxide, nitrogen dioxide and ozone, besides air temperature and relative humidity. The data concerning pneumonia admissions were collected in the public health system of Sorocaba. Correlations between the variables of interest were calculated using the Pearson coefficient. Models with lags from zero to five days after exposure to pollutants were performed to analyze the association between the exposure to environmental pollutants and hospital admissions. The analysis used the generalized linear model of Poisson regression, with significance set at p<0.05. RESULTS: There were 1,825 admissions for pneumonia, with a daily mean of 2.5±2.1. There was a strong correlation between pollutants and hospital admissions, except for ozone. Regarding the Poisson regression analysis with the multi-pollutant model, only nitrogen dioxide was statistically significant on the same day (relative risk - RR=1.016), as well as particulate matter with a lag of four days (RR=1.009) after exposure to pollutants. CONCLUSIONS: There was an acute effect of exposure to nitrogen dioxide and a later effect of exposure to particulate matter on children's hospitalizations for pneumonia in Sorocaba. PMID:24473956

  8. False Positives in Multiple Regression: Unanticipated Consequences of Measurement Error in the Predictor Variables

    ERIC Educational Resources Information Center

    Shear, Benjamin R.; Zumbo, Bruno D.

    2013-01-01

    Type I error rates in multiple regression, and hence the chance for false positive research findings, can be drastically inflated when multiple regression models are used to analyze data that contain random measurement error. This article shows the potential for inflated Type I error rates in commonly encountered scenarios and provides new…

  9. Dual impact of organisational change on subsequent exit from work unit and sickness absence: a longitudinal study among public healthcare employees.

    PubMed

    Jensen, Johan Høy; Flachs, Esben Meulengracht; Skakon, Janne; Rod, Naja Hulvej; Bonde, Jens Peter

    2018-05-14

    We investigated work-unit exit, total and long-term sickness absence following organisational change among public healthcare employees. The study population comprised employees from the Capital Region of Denmark (n=14 388). Data on reorganisation at the work-unit level (merger, demerger, relocation, change of management, employee layoff or budget cut) between July and December 2013 were obtained via surveys distributed to the managers of each work unit. Individual-level data on work-unit exit, total and long-term sickness absence (≥29 days) in 2014 were obtained from company registries. For exposure to any, each type or number of reorganisations (1, 2 or ≥3), the HRs and 95% CIs for subsequent work-unit exit were estimated by Cox regression, and the risk for total and long-term sickness absence were estimated by zero-inflated Poisson regression. Reorganisation was associated with subsequent work-unit exit (HR 1.10, 95% CI 1.01 to 1.19) in the year after reorganisation. This association was specifically important for exposure to ≥3 types of changes (HR 1.52, 95% CI 1.30 to 1.79), merger (HR 1.29, 95% CI 1.12 to 1.49), demerger (HR 1.41, 95% CI 1.16 to 1.71) or change of management (HR 1.24, 95% CI 1.11 to 1.38). Among the employees remaining in the work unit, reorganisation was also associated with more events of long-term sickness absence (OR 1.15, 95% CI 1.00 to 1.33), which was particularly important for merger (OR 1.31, 95% CI 1.00 to 1.72) and employee layoff (OR 1.31, 95% CI 1.08 to 1.59). Specific types of reorganisation seem to have a dual impact on subsequent work-unit exit and sickness absence in the year after change. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  10. Genericness of inflation in isotropic loop quantum cosmology.

    PubMed

    Date, Ghanashyam; Hossain, Golam Mortuza

    2005-01-14

    Nonperturbative corrections from loop quantum cosmology (LQC) to the scalar matter sector are already known to imply inflation. We prove that the LQC modified scalar field generates exponential inflation in the small scale factor regime, for all positive definite potentials, independent of initial conditions and independent of ambiguity parameters. For positive semidefinite potentials it is always possible to choose, without fine-tuning, a value of one of the ambiguity parameters such that exponential inflation results, provided zeros of the potential are approached at most as a power law in the scale factor. In conjunction with the generic occurrence of bounce at small volumes, particle horizon is absent, thus eliminating the horizon problem of the standard big bang model.

  11. The emergence of gravity as a retro-causal post-inflation macro-quantum-coherent holographic vacuum Higgs-Goldstone field

    NASA Astrophysics Data System (ADS)

    Sarfatti, Jack; Levit, Creon

    2009-06-01

We present a model for the origin of gravity, dark energy and dark matter: Dark energy and dark matter are residual pre-inflation false vacuum random zero point energy (w = -1) of large-scale negative, and short-scale positive, pressure, respectively, corresponding to the "zero point" (incoherent) component of a superfluid (supersolid) ground state. Gravity, in contrast, arises from the 2nd order topological defects in the post-inflation virtual "condensate" (coherent) component. We predict, as a consequence, that the LHC will never detect exotic real on-mass-shell particles that can explain dark matter Ω_MDM ≈ 0.23. We also point out that the future holographic dark energy de Sitter horizon is a total absorber (in the sense of retro-causal Wheeler-Feynman action-at-a-distance electrodynamics) because it is an infinite redshift surface for static detectors. Therefore, the advanced Hawking-Unruh thermal radiation from the future de Sitter horizon is a candidate for the negative pressure dark vacuum energy.

  12. Prediction of attendance at fitness center: a comparison between the theory of planned behavior, the social cognitive theory, and the physical activity maintenance theory

    PubMed Central

    Jekauc, Darko; Völkle, Manuel; Wagner, Matthias O.; Mess, Filip; Reiner, Miriam; Renner, Britta

    2015-01-01

In the process of physical activity (PA) maintenance, specific predictors are effective that differ from those in other stages of PA development. Recently, the Physical Activity Maintenance Theory (PAMT) was developed specifically for the prediction of PA maintenance. The aim of the present study was to evaluate the predictability of future behavior by the PAMT and to compare it with the Theory of Planned Behavior (TPB) and Social Cognitive Theory (SCT). Participation in a fitness center was observed for 101 college students (53 female) aged between 19 and 32 years (M = 23.6; SD = 2.9) over 20 weeks using a magnetic card. TPB, SCT and PAMT were used to predict the pattern of participation. A latent class zero-inflated Poisson growth curve analysis identified two participation patterns: regular attenders and intermittent exercisers. SCT showed the highest predictive power, followed by PAMT and TPB. Impeding aspects such as life stress and barriers were the strongest predictors, suggesting that overcoming barriers might be an important aspect of working out on a regular basis. Self-efficacy, perceived behavioral control, and social support could also significantly differentiate between the participation patterns. PMID:25717313

  13. Interpolation between spatial frameworks: an application of process convolution to estimating neighbourhood disease prevalence.

    PubMed

    Congdon, Peter

    2014-04-01

    Health data may be collected across one spatial framework (e.g. health provider agencies), but contrasts in health over another spatial framework (neighbourhoods) may be of policy interest. In the UK, population prevalence totals for chronic diseases are provided for populations served by general practitioner practices, but not for neighbourhoods (small areas of circa 1500 people), raising the question whether data for one framework can be used to provide spatially interpolated estimates of disease prevalence for the other. A discrete process convolution is applied to this end and has advantages when there are a relatively large number of area units in one or other framework. Additionally, the interpolation is modified to take account of the observed neighbourhood indicators (e.g. hospitalisation rates) of neighbourhood disease prevalence. These are reflective indicators of neighbourhood prevalence viewed as a latent construct. An illustrative application is to prevalence of psychosis in northeast London, containing 190 general practitioner practices and 562 neighbourhoods, including an assessment of sensitivity to kernel choice (e.g. normal vs exponential). This application illustrates how a zero-inflated Poisson can be used as the likelihood model for a reflective indicator.

  14. The association between commuter cycling and sickness absence.

    PubMed

    Hendriksen, Ingrid J M; Simons, Monique; Garre, Francisca Galindo; Hildebrandt, Vincent H

    2010-08-01

    To study the association between commuter cycling and all-cause sickness absence, and the possible dose-response relationship between absenteeism and the distance, frequency and speed of commuter cycling. Cross-sectional data about cycling in 1236 Dutch employees were collected using a self-report questionnaire. Company absenteeism records were checked over a one-year period (May 2007-April 2008). Propensity scores were used to make groups comparable and to adjust for confounders. Zero-inflated Poisson models were used to assess differences in absenteeism between cyclists and non-cyclists. The mean total duration of absenteeism over the study year was more than 1 day shorter in cyclists than in non-cyclists. This can be explained by the higher proportion of people with no absenteeism in the cycling group. A dose-response relationship was observed between the speed and distance of cycling and absenteeism. Compared to people who cycle a short distance (
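A zero-inflated Poisson model of the kind used in this study mixes a point mass at zero (employees with no absenteeism) with a Poisson count. A minimal intercept-only sketch in Python, estimating the mixing probability and Poisson mean by maximum likelihood, is shown below; the study's actual software is not stated, and the simulated parameters are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def zip_negloglik(params, y):
    """Negative log-likelihood of an intercept-only zero-inflated Poisson.

    params[0] is the logit of the structural-zero probability pi,
    params[1] is the log of the Poisson mean lambda."""
    pi = 1.0 / (1.0 + np.exp(-params[0]))
    lam = np.exp(params[1])
    ll_zero = np.log(pi + (1.0 - pi) * np.exp(-lam))             # P(Y = 0)
    ll_count = np.log1p(-pi) - lam + y * np.log(lam) - gammaln(y + 1)
    return -np.sum(np.where(y == 0, ll_zero, ll_count))

rng = np.random.default_rng(0)
n = 5000
structural_zero = rng.random(n) < 0.3          # true zero-inflation probability
y = np.where(structural_zero, 0, rng.poisson(2.0, n))

res = minimize(zip_negloglik, x0=np.zeros(2), args=(y,), method="Nelder-Mead")
pi_hat = 1.0 / (1.0 + np.exp(-res.x[0]))
lam_hat = np.exp(res.x[1])
```

Note that the zero cell receives contributions from both components (structural zeros plus Poisson zeros, e^(-lambda)), which is exactly why the two parameters must be estimated jointly rather than by counting zeros.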

  15. Efficiency optimization of a fast Poisson solver in beam dynamics simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Dawei; Pöplau, Gisela; van Rienen, Ursula

    2016-01-01

Calculating the solution of Poisson's equation for the space charge force is still the major time consumer in beam dynamics simulations and calls for further improvement. In this paper, we summarize a classical fast Poisson solver in beam dynamics simulations: the integrated Green's function method. We introduce three optimizations of the classical Poisson solver routine: using the reduced integrated Green's function instead of the integrated Green's function; using the discrete cosine transform instead of the discrete Fourier transform for the Green's function; and using a novel fast convolution routine instead of an explicitly zero-padded convolution. The new Poisson solver routine preserves the advantages of fast computation and high accuracy. This provides a fast routine for high-performance calculation of the space charge effect in accelerators.
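The "explicitly zero-padded convolution" that the authors optimize away is the textbook way to obtain an aperiodic (free-space) convolution from the cyclic FFT product: pad both the Green's function and the source to twice the grid size so that wrap-around terms vanish. A one-dimensional sketch (hypothetical data; the actual solver is three-dimensional) looks like:

```python
import numpy as np

def fft_convolve(green, rho):
    """Linear (aperiodic) convolution via an explicitly zero-padded FFT.

    Padding both sequences to twice the grid size removes the wrap-around
    that a plain cyclic FFT product would introduce."""
    n = rho.size
    m = 2 * n
    G = np.fft.rfft(green, m)   # rfft zero-pads to length m automatically
    R = np.fft.rfft(rho, m)
    return np.fft.irfft(G * R, m)[:n]

rng = np.random.default_rng(7)
n = 64
rho = rng.standard_normal(n)    # stand-in for a charge density on a 1-D grid
green = rng.standard_normal(n)  # stand-in for an integrated Green's function

phi = fft_convolve(green, rho)
```

The result agrees with the direct linear convolution restricted to the grid, at O(n log n) cost instead of O(n^2).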

  16. An Analysis of the Number of Medical Malpractice Claims and Their Amounts

    PubMed Central

    Bonetti, Marco; Cirillo, Pasquale; Musile Tanzi, Paola; Trinchero, Elisabetta

    2016-01-01

    Starting from an extensive database, pooling 9 years of data from the top three insurance brokers in Italy, and containing 38125 reported claims due to alleged cases of medical malpractice, we use an inhomogeneous Poisson process to model the number of medical malpractice claims in Italy. The intensity of the process is allowed to vary over time, and it depends on a set of covariates, like the size of the hospital, the medical department and the complexity of the medical operations performed. We choose the combination medical department by hospital as the unit of analysis. Together with the number of claims, we also model the associated amounts paid by insurance companies, using a two-stage regression model. In particular, we use logistic regression for the probability that a claim is closed with a zero payment, whereas, conditionally on the fact that an amount is strictly positive, we make use of lognormal regression to model it as a function of several covariates. The model produces estimates and forecasts that are relevant to both insurance companies and hospitals, for quality assurance, service improvement and cost reduction. PMID:27077661
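The two-stage structure described here (a logistic model for the probability of a zero payment, then a lognormal regression for strictly positive amounts) can be sketched on simulated data. The covariate, coefficients, and fit-by-hand below are illustrative assumptions, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10000
x = rng.standard_normal(n)                        # hypothetical covariate (e.g. case complexity)
p_zero = 1.0 / (1.0 + np.exp(-(-1.0 + 0.5 * x)))  # stage-1 truth: P(zero payment)
is_zero = (rng.random(n) < p_zero).astype(float)
log_amount = 10.0 + 0.3 * x + 0.8 * rng.standard_normal(n)
amount = np.where(is_zero == 1.0, 0.0, np.exp(log_amount))

X = np.column_stack([np.ones(n), x])

# Stage 1: logistic regression for the zero-payment indicator (Newton-Raphson)
beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = p * (1.0 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (is_zero - p))

# Stage 2: lognormal regression, i.e. OLS on log(amount) over strictly positive claims
pos = amount > 0
gamma, *_ = np.linalg.lstsq(X[pos], np.log(amount[pos]), rcond=None)
```

The expected payment then combines both stages, E[amount | x] = (1 - p(x)) * exp(x'gamma + sigma^2 / 2), which is what makes the two-part model useful for forecasting aggregate liabilities.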

  17. Estimation of inflation parameters for Perturbed Power Law model using recent CMB measurements

    NASA Astrophysics Data System (ADS)

    Mukherjee, Suvodip; Das, Santanu; Joy, Minu; Souradeep, Tarun

    2015-01-01

Cosmic Microwave Background (CMB) is an important probe for understanding the inflationary era of the Universe. We consider the Perturbed Power Law (PPL) model of inflation, a soft deviation from the Power Law (PL) inflationary model. This model captures the effect of higher-order derivatives of the Hubble parameter during inflation, which in turn lead to a non-zero effective mass m_eff for the inflaton field. At leading order, the higher-order derivatives of the Hubble parameter source a constant difference between the spectral indices for scalar and tensor perturbations, going beyond the PL model of inflation. The PPL model has two independent observable parameters, namely the spectral index for tensor perturbations ν_t and the change in the spectral index for scalar perturbations ν_st, to explain the observed features in the scalar and tensor power spectra of perturbations. From the recent measurements of CMB power spectra by WMAP, Planck and BICEP-2 for temperature and polarization, we estimate the feasibility of the PPL model alongside the standard ΛCDM model. Although BICEP-2 claimed a detection of r = 0.2, estimates of dust contamination provided by Planck have left open the possibility that only an upper bound on r will emerge from a joint analysis. As a result, we consider different upper bounds on the value of r and show that the PPL model can explain a lower value of the tensor-to-scalar ratio (r < 0.1 or r < 0.01) for a scalar spectral index of n_s = 0.96 by having a non-zero value of the effective mass of the inflaton field, m_eff^2/H^2. The analysis with the WP + Planck likelihood shows a non-zero detection of m_eff^2/H^2 at 5.7σ and 8.1σ for r < 0.1 and r < 0.01, respectively. With the BICEP-2 likelihood, m_eff^2/H^2 = -0.0237 ± 0.0135, which is consistent with zero.

  18. Inflationary predictions of double-well, Coleman-Weinberg, and hilltop potentials with non-minimal coupling

    NASA Astrophysics Data System (ADS)

    Bostan, Nilay; Güleryüz, Ömer; Nefer Şenoğuz, Vedat

    2018-05-01

We discuss how the non-minimal coupling ξφ^2 R between the inflaton and the Ricci scalar affects the predictions of single field inflation models where the inflaton has a non-zero vacuum expectation value (VEV) v after inflation. We show that, for inflaton values both above the VEV and below the VEV during inflation, under certain conditions the inflationary predictions become approximately the same as the predictions of the Starobinsky model. We then analyze inflation with double-well and Coleman-Weinberg potentials in detail, displaying the regions in the v-ξ plane for which the spectral index n_s and the tensor-to-scalar ratio r are compatible with the current observations. r is always larger than 0.002 in these regions. Finally, we consider the effect of ξ on small field inflation (hilltop) potentials.

  19. A multi-worksite analysis of the relationships among body mass index, medical utilization, and worker productivity.

    PubMed

    Goetzel, Ron Z; Gibson, Teresa B; Short, Meghan E; Chu, Bong-Chul; Waddell, Jessica; Bowen, Jennie; Lemon, Stephenie C; Fernandez, Isabel Diana; Ozminkowski, Ronald J; Wilson, Mark G; DeJoy, David M

    2010-01-01

    The relationships between worker health and productivity are becoming clearer. However, few large scale studies have measured the direct and indirect cost burden of overweight and obesity among employees using actual biometric values. The objective of this study was to quantify the direct medical and indirect (absence and productivity) cost burden of overweight and obesity in workers. A cross-sectional study of 10,026 employees in multiple professions and worksites across the United States was conducted. The main outcomes were five self-reported measures of workers' annual health care use and productivity: doctor visits, emergency department visits, hospitalizations, absenteeism (days absent from work), and presenteeism (percent on-the-job productivity losses). Multivariate count and continuous data models (Poisson, negative binomial, and zero-inflated Poisson) were estimated. After adjusting for covariates, obese employees had 20% higher doctor visits than normal weight employees (confidence interval [CI] 16%, 24%, P < 0.01) and 26% higher emergency department visits (CI 11%, 42%, P < 0.01). Rates of doctor and emergency department visits for overweight employees were no different than those of normal weight employees. Compared to normal weight employees, presenteeism rates were 10% and 12% higher for overweight and obese employees, respectively (CI 5%, 15% and 5%, 19%, all P < 0.01). Taken together, compared to normal weight employees, obese and overweight workers were estimated to cost employers $644 and $201 more per employee per year, respectively. This study provides evidence that employers face a financial burden imposed by obesity. Implementation of effective workplace programs for the prevention and management of excess weight will benefit employers and their workers.

  20. Enhanced polarization of the cosmic microwave background radiation from thermal gravitational waves.

    PubMed

    Bhattacharya, Kaushik; Mohanty, Subhendra; Nautiyal, Akhilesh

    2006-12-22

    If inflation was preceded by a radiation era, then at the time of inflation there will exist a decoupled thermal distribution of gravitons. Gravitational waves generated during inflation will be amplified by the process of stimulated emission into the existing thermal distribution of gravitons. Consequently, the usual zero temperature scale invariant tensor spectrum is modified by a temperature dependent factor. This thermal correction factor amplifies the B-mode polarization of the cosmic microwave background radiation by an order of magnitude at large angles, which may now be in the range of observability of the Wilkinson Microwave Anisotropy Probe.

  1. LD Score Regression Distinguishes Confounding from Polygenicity in Genome-Wide Association Studies

    PubMed Central

    Bulik-Sullivan, Brendan K.; Loh, Po-Ru; Finucane, Hilary; Ripke, Stephan; Yang, Jian; Patterson, Nick; Daly, Mark J.; Price, Alkes L.; Neale, Benjamin M.

    2015-01-01

    Both polygenicity (i.e., many small genetic effects) and confounding biases, such as cryptic relatedness and population stratification, can yield an inflated distribution of test statistics in genome-wide association studies (GWAS). However, current methods cannot distinguish between inflation from true polygenic signal and bias. We have developed an approach, LD Score regression, that quantifies the contribution of each by examining the relationship between test statistics and linkage disequilibrium (LD). The LD Score regression intercept can be used to estimate a more powerful and accurate correction factor than genomic control. We find strong evidence that polygenicity accounts for the majority of test statistic inflation in many GWAS of large sample size. PMID:25642630
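The regression behind this decomposition is simple: under pure polygenicity, E[χ²_j] ≈ N h² ℓ_j / M + 1, where ℓ_j is SNP j's LD score, while confounding adds to the intercept rather than the slope. A toy sketch of recovering both pieces follows; Gaussian noise stands in for the χ² sampling variation, all numbers are hypothetical, and the real LDSC software additionally uses heteroskedasticity-aware weights.

```python
import numpy as np

rng = np.random.default_rng(5)
m = 20000                        # number of SNPs (hypothetical)
N = 50000                        # GWAS sample size (hypothetical)
h2 = 0.4                         # simulated total SNP heritability
ld_scores = rng.uniform(1.0, 200.0, m)

# Under pure polygenicity, E[chi2_j] = N * h2 * l_j / M + 1; confounding
# would shift the intercept above 1 instead of changing the slope.
chi2 = 1.0 + N * h2 * ld_scores / m + rng.standard_normal(m)

A = np.column_stack([np.ones(m), ld_scores])
(intercept, slope), *_ = np.linalg.lstsq(A, chi2, rcond=None)
h2_hat = slope * m / N           # heritability recovered from the slope
```

An intercept near 1 indicates inflation driven by true polygenic signal; an intercept well above 1 would flag confounding such as population stratification.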

  2. Distribution and habitat use of the Missouri River and Lower Yellowstone River benthic fishes from 1996 to 1998: A baseline for fish community recovery

    USGS Publications Warehouse

    Wildhaber, M.L.; Gladish, D.W.; Arab, A.

    2011-01-01

    Past and present Missouri River management practices have resulted in native fishes being identified as in jeopardy. In 1995, the Missouri River Benthic Fishes Study was initiated to provide improved information on Missouri River fish populations and how alterations might affect them. The study produced a baseline against which to evaluate future changes in Missouri River operating criteria. The objective was to evaluate population structure and habitat use of benthic fishes along the entire mainstem Missouri River, exclusive of reservoirs. Here we use the data from this study to provide a recent-past baseline for on-going Missouri River fish population monitoring programmes along with a more powerful method for analysing data containing large percentages of zero values. This is carried out by describing the distribution and habitat use of 21 species of Missouri River benthic fishes based on catch-per-unit area data from multiple gears. We employ a Bayesian zero-inflated Poisson model expanded to include continuous measures of habitat quality (i.e. substrate composition, depth, velocity, temperature, turbidity and conductivity). Along with presenting the method, we provide a relatively complete picture of the Missouri River benthic fish community and the relationship between their relative population numbers and habitat conditions. We demonstrate that our single model provides all the information that is often obtained by a myriad of analytical techniques. An important advantage of the present approach is reliable inference for patterns of relative abundance using multiple gears without using gear efficiencies.
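
    The zero-inflated Poisson model used here treats each count as a two-component mixture: a structural zero with probability pi, otherwise a Poisson(lam) draw. A minimal numpy sketch (all parameter values hypothetical, not taken from the study) shows why such data carry far more zeros than a plain Poisson fit would predict:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_zip(n, lam, pi, rng):
    """Draw n counts from a zero-inflated Poisson: with probability pi
    the observation is a structural zero, otherwise it is Poisson(lam)."""
    counts = rng.poisson(lam, size=n)
    counts[rng.random(n) < pi] = 0
    return counts

n, lam, pi = 100_000, 2.0, 0.3
y = simulate_zip(n, lam, pi, rng)

obs_zero = (y == 0).mean()                 # observed zero fraction
pois_zero = np.exp(-lam)                   # P(Y=0) under plain Poisson(2), ~0.135
zip_zero = pi + (1 - pi) * np.exp(-lam)    # mixture zero probability, ~0.395
```

The gap between `obs_zero` and `pois_zero` is exactly the excess-zero problem that motivates ZIP over plain Poisson regression.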

  3. Program Manager: Journal of the Defense Systems Management College. Volume 18, Number 1, January-February 1989

    DTIC Science & Technology

    1989-02-01


  4. Prescription Drug Misuse and Sexual Risk Behaviors Among Young Men Who Have Sex With Men (YMSM) in Philadelphia

    PubMed Central

    Kecojevic, Aleksandar; Silva, Karol; Sell, Randall; Lankenau, Stephen E.

    2014-01-01

    This study examined the relationship between prescription drug misuse and sexual risk behaviors (i.e. unprotected sex, increased number of sex partners) in a sample of young men who have sex with men (YMSM) in Philadelphia. Data come from a cross-sectional study of 18-29 year old YMSM (N=191) who misused prescription drugs in the past 6 months. Associations were investigated in two regression models: logistic models for unprotected anal intercourse (UAI) and zero-truncated Poisson regression model for number of sex partners. Of 177 participants engaging in anal intercourse in the past 6 months, 57.6% engaged in UAI. After adjusting for socio-demographic variables and illicit drug use, misuse of prescription pain pills and muscle relaxants remained significantly associated with engaging in receptive UAI. No prescription drug class was associated with a high number of sex partners. This study provides additional evidence that some prescription drugs are associated with sexual risk behaviors among YMSM. PMID:25240627

  5. Prescription Drug Misuse and Sexual Risk Behaviors Among Young Men Who have Sex with Men (YMSM) in Philadelphia.

    PubMed

    Kecojevic, Aleksandar; Silva, Karol; Sell, Randall L; Lankenau, Stephen E

    2015-05-01

    This study examined the relationship between prescription drug misuse and sexual risk behaviors (i.e. unprotected sex, increased number of sex partners) in a sample of young men who have sex with men (YMSM) in Philadelphia. Data come from a cross-sectional study of 18-29 year old YMSM (N = 191) who misused prescription drugs in the past 6 months. Associations were investigated in two regression models: logistic models for unprotected anal intercourse (UAI) and zero-truncated Poisson regression model for number of sex partners. Of 177 participants engaging in anal intercourse in the past 6 months, 57.6 % engaged in UAI. After adjusting for socio-demographic variables and illicit drug use, misuse of prescription pain pills and muscle relaxants remained significantly associated with engaging in receptive UAI. No prescription drug class was associated with a high number of sex partners. This study provides additional evidence that some prescription drugs are associated with sexual risk behaviors among YMSM.
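
    For reference, the zero-truncated Poisson used for the partner-count model conditions on Y > 0, which shifts the mean up to lam / (1 - exp(-lam)). A small numpy sketch with a hypothetical rate checks this by rejection sampling:

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 1.5   # hypothetical rate, not from the study

# Zero truncation conditions on Y > 0, so dropping the zeros from plain
# Poisson draws samples exactly the zero-truncated Poisson.
draws = rng.poisson(lam, size=500_000)
ztp = draws[draws > 0]

empirical_mean = ztp.mean()
theoretical_mean = lam / (1 - np.exp(-lam))   # E[Y | Y > 0], ~1.93 here
```

Ignoring the truncation and fitting a plain Poisson would therefore overstate the rate parameter, which is why a zero-truncated model is appropriate when zero counts cannot occur by design.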

  6. A Linear Regression Model Identifying the Primary Factors Contributing to Maintenance Man Hours for the C-17 Globemaster III in the Air National Guard

    DTIC Science & Technology

    2012-06-15

    Maintenance AFSCs ................................................................................................. 14 2. Variation Inflation Factors...total variability in the data. It is an indication of how much of the   20    variation in the data can be accounted for in the regression model. In... Variation Inflation Factors for each independent variable (predictor) as regressed against all of the other independent variables in the model. The

  7. Analysis of multinomial models with unknown index using data augmentation

    USGS Publications Warehouse

    Royle, J. Andrew; Dorazio, R.M.; Link, W.A.

    2007-01-01

    Multinomial models with unknown index ('sample size') arise in many practical settings. In practice, Bayesian analysis of such models has proved difficult because the dimension of the parameter space is not fixed, being in some cases a function of the unknown index. We describe a data augmentation approach to the analysis of this class of models that provides for a generic and efficient Bayesian implementation. Under this approach, the data are augmented with all-zero detection histories. The resulting augmented dataset is modeled as a zero-inflated version of the complete-data model where an estimable zero-inflation parameter takes the place of the unknown multinomial index. Interestingly, data augmentation can be justified as being equivalent to imposing a discrete uniform prior on the multinomial index. We provide three examples involving estimating the size of an animal population, estimating the number of diabetes cases in a population using the Rasch model, and the motivating example of estimating the number of species in an animal community with latent probabilities of species occurrence and detection.
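
    The augmentation step can be sketched for the simplest capture-recapture case (constant detection probability over T occasions). The code below is an illustrative simulation, not the authors' implementation: it pads the observed histories with all-zero rows up to a hypothetical bound M and maximizes the zero-inflated likelihood, so that the estimated zero-inflation parameter psi times M recovers the unknown index N.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import comb

rng = np.random.default_rng(7)

# Simulate capture histories: N individuals, T occasions, detection prob p.
N_true, T, p_true = 200, 5, 0.3
y_all = rng.binomial(T, p_true, size=N_true)   # detections per individual
y_obs = y_all[y_all > 0]                       # only detected individuals observed

# Data augmentation: pad with all-zero histories up to M pseudo-individuals;
# psi replaces the unknown N (equivalent to a discrete uniform prior on N).
M = 600
y_aug = np.concatenate([y_obs, np.zeros(M - y_obs.size)])

def neg_loglik(params):
    psi, p = 1 / (1 + np.exp(-np.asarray(params)))   # logit -> (0, 1)
    pmf = comb(T, y_aug) * p ** y_aug * (1 - p) ** (T - y_aug)
    lik = psi * pmf + (1 - psi) * (y_aug == 0)       # zero-inflated mixture
    return -np.log(lik).sum()

res = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat = 1 / (1 + np.exp(-res.x[0]))
N_hat = psi_hat * M                                  # estimated population size
```

The estimable zero-inflation parameter `psi_hat` plays exactly the role of the unknown multinomial index: `N_hat` should land near the simulated `N_true`.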

  8. On the impact of relatedness on SNP association analysis.

    PubMed

    Gross, Arnd; Tönjes, Anke; Scholz, Markus

    2017-12-06

    When testing for SNP (single nucleotide polymorphism) associations in related individuals, observations are not independent. Simple linear regression assuming independent normally distributed residuals results in an increased type I error and the power of the test is also affected in a more complicated manner. Inflation of type I error is often successfully corrected by genomic control. However, this reduces the power of the test when relatedness is of concern. In the present paper, we derive explicit formulae to investigate how heritability and strength of relatedness contribute to variance inflation of the effect estimate of the linear model. Further, we study the consequences of variance inflation on hypothesis testing and compare the results with those of genomic control correction. We apply the developed theory to the publicly available HapMap trio data (N=129), the Sorbs (a self-contained population with N=977 characterised by a cryptic relatedness structure) and synthetic family studies with different sample sizes (ranging from N=129 to N=999) and different degrees of relatedness. We derive explicit and easy-to-apply approximation formulae to estimate the impact of relatedness on the variance of the effect estimate of the linear regression model. Variance inflation increases with increasing heritability. Relatedness structure also impacts the degree of variance inflation as shown for the example family structures. Variance inflation is smallest for HapMap trios, followed by a synthetic family study corresponding to the trio data but with larger sample size than HapMap. The next strongest inflation is observed for the Sorbs, and finally, for a synthetic family study with a more extreme relatedness structure but with similar sample size as the Sorbs. Type I error increases rapidly with increasing inflation. However, for smaller significance levels, power increases with increasing inflation while the opposite holds for larger significance levels. 
When genomic control is applied, type I error is preserved while power decreases rapidly with increasing variance inflation. Stronger relatedness as well as higher heritability result in increased variance of the effect estimate of simple linear regression analysis. While type I error rates are generally inflated, the behaviour of power is more complex since power can be increased or reduced in dependence on relatedness and the heritability of the phenotype. Genomic control cannot be recommended to deal with inflation due to relatedness. Although it preserves type I error, the loss in power can be considerable. We provide a simple formula for estimating variance inflation given the relatedness structure and the heritability of a trait of interest. As a rule of thumb, variance inflation below 1.05 does not require correction and simple linear regression analysis is still appropriate.
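
    The median-based genomic control correction discussed above can be sketched in a few lines. This is an illustrative numpy/scipy simulation (the inflation factor c is hypothetical), not the paper's method: test-statistic inflation is simulated directly, the genomic control factor is estimated from the median statistic, and the correction rescales every statistic.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)

# Simulate association z-scores whose variance is inflated by a constant
# factor c, the situation the paper's variance-inflation formulae address.
c = 1.3                                        # hypothetical inflation factor
z = rng.normal(scale=np.sqrt(c), size=200_000)
stats = z ** 2                                 # 1-df chi-square statistics

# Median-based genomic control: the inflation factor is the ratio of the
# observed median statistic to the median of a central chi-square(1).
lambda_gc = np.median(stats) / chi2.ppf(0.5, df=1)

# Genomic control divides every statistic by lambda_gc, which restores
# type I error but, as the paper notes, can cost considerable power.
corrected = stats / lambda_gc
```

With purely multiplicative inflation, `lambda_gc` recovers `c`, and the corrected statistics have the null median by construction.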

  9. Universality of the Volume Bound in Slow-Roll Eternal Inflation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubovsky, Sergei; Senatore, Leonardo; Villadoro, Giovanni

    2012-03-28

    It has recently been shown that in single field slow-roll inflation the total volume cannot grow by a factor larger than e^(S_dS/2) without becoming infinite. The bound is saturated exactly at the phase transition to eternal inflation where the probability to produce infinite volume becomes nonzero. We show that the bound also holds sharply in any number of space-time dimensions, when arbitrary higher-dimensional operators are included and in the multi-field inflationary case. The relation with the entropy of de Sitter and the universality of the bound strengthen the case for a deeper holographic interpretation. As a spin-off we provide the formalism to compute the probability distribution of the volume after inflation for generic multi-field models, which might help to address questions about the population of vacua of the landscape during slow-roll inflation.

  10. Poisson Regression Analysis of Illness and Injury Surveillance Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frome E.L., Watkins J.P., Ellis E.D.

    2012-12-12

    The Department of Energy (DOE) uses illness and injury surveillance to monitor morbidity and assess the overall health of the work force. Data collected from each participating site include health events and a roster file with demographic information. The source data files are maintained in a relational database, and are used to obtain stratified tables of health event counts and person time at risk that serve as the starting point for Poisson regression analysis. The explanatory variables that define these tables are age, gender, occupational group, and time. Typical response variables of interest are the number of absences due to illness or injury, i.e., the response variable is a count. Poisson regression methods are used to describe the effect of the explanatory variables on the health event rates using a log-linear main effects model. Results of fitting the main effects model are summarized in a tabular and graphical form and interpretation of model parameters is provided. An analysis of deviance table is used to evaluate the importance of each of the explanatory variables on the event rate of interest and to determine if interaction terms should be considered in the analysis. Although Poisson regression methods are widely used in the analysis of count data, there are situations in which over-dispersion occurs. This could be due to lack-of-fit of the regression model, extra-Poisson variation, or both. A score test statistic and regression diagnostics are used to identify over-dispersion. A quasi-likelihood method of moments procedure is used to evaluate and adjust for extra-Poisson variation when necessary. Two examples are presented using respiratory disease absence rates at two DOE sites to illustrate the methods and interpretation of the results. In the first example the Poisson main effects model is adequate. 
In the second example the score test indicates considerable over-dispersion and a more detailed analysis attributes the over-dispersion to extra-Poisson variation. The R open source software environment for statistical computing and graphics is used for analysis. Additional details about R and the data that were used in this report are provided in an Appendix. Information on how to obtain R and utility functions that can be used to duplicate results in this report is provided.
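
    The report carries out its analysis in R; purely as an illustration, the numpy sketch below fits the same kind of log-linear Poisson model with a log person-time offset by iteratively reweighted least squares, on simulated data with hypothetical rates, and checks the fit against the closed-form group rates.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated surveillance data: one binary explanatory variable (e.g. two
# occupational groups), person-time at risk, and event counts.
n = 2000
group = rng.integers(0, 2, size=n)
time_at_risk = rng.uniform(0.5, 2.0, size=n)      # person-years
true_rate = np.exp(-1.0 + 0.5 * group)            # log-linear rate model
events = rng.poisson(true_rate * time_at_risk)

def poisson_irls(X, y, offset, n_iter=25):
    """Poisson log-linear fit with an offset via iteratively reweighted
    least squares: log E[y] = offset + X @ beta."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta + offset
        mu = np.exp(eta)
        z = (eta - offset) + (y - mu) / mu        # working response
        W = mu                                    # working weights
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

X = np.column_stack([np.ones(n), group])
beta = poisson_irls(X, events, np.log(time_at_risk))

# For an intercept-plus-indicator design the MLE rates equal the
# events / person-time ratio within each group, giving a closed-form check.
rate0 = events[group == 0].sum() / time_at_risk[group == 0].sum()
rate1 = events[group == 1].sum() / time_at_risk[group == 1].sum()
rate_ratio = np.exp(beta[1])                      # estimates exp(0.5)
```

The offset term is what turns the count model into a rate model: exponentiated coefficients are rate ratios per unit person-time.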

  11. Validation and Improvement of Reliability Methods for Air Force Building Systems

    DTIC Science & Technology

    focusing primarily on HVAC systems. This research used contingency analysis to assess the performance of each model for HVAC systems at six Air Force...probabilistic model produced inflated reliability calculations for HVAC systems. In light of these findings, this research employed a stochastic method, a Nonhomogeneous Poisson Process (NHPP), in an attempt to produce accurate HVAC system reliability calculations. This effort ultimately concluded that

  12. Effects of categorization method, regression type, and variable distribution on the inflation of Type-I error rate when categorizing a confounding variable.

    PubMed

    Barnwell-Ménard, Jean-Louis; Li, Qing; Cohen, Alan A

    2015-03-15

    The loss of signal associated with categorizing a continuous variable is well known, and previous studies have demonstrated that this can lead to an inflation of Type-I error when the categorized variable is a confounder in a regression analysis estimating the effect of an exposure on an outcome. However, it is not known how the Type-I error may vary under different circumstances, including logistic versus linear regression, different distributions of the confounder, and different categorization methods. Here, we analytically quantified the effect of categorization and then performed a series of 9600 Monte Carlo simulations to estimate the Type-I error inflation associated with categorization of a confounder under different regression scenarios. We show that Type-I error is unacceptably high (>10% in most scenarios and often 100%). The only exception was when the variable categorized was a continuous mixture proxy for a genuinely dichotomous latent variable, where both the continuous proxy and the categorized variable are error-ridden proxies for the dichotomous latent variable. As expected, error inflation was also higher with larger sample size, fewer categories, and stronger associations between the confounder and the exposure or outcome. We provide online tools that can help researchers estimate the potential error inflation and understand how serious a problem this is. Copyright © 2014 John Wiley & Sons, Ltd.
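
    The mechanism can be reproduced with a minimal Monte Carlo sketch (hypothetical effect sizes and sample sizes, not the paper's 9600-simulation design). The true exposure effect is zero, so every rejection is a Type-I error; residual confounding left by the median split drives the rejection rate far above the nominal 5%.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def exposure_pvalue(y, x, c_cat):
    """Two-sided p-value for the exposure coefficient in an OLS fit of
    y on an intercept, the exposure x, and the categorized confounder."""
    X = np.column_stack([np.ones_like(x), x, c_cat])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n, p = X.shape
    sigma2 = resid @ resid / (n - p)
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    return 2 * stats.t.sf(abs(beta[1] / se), df=n - p)

n_sims, n = 1000, 500
rejections = 0
for _ in range(n_sims):
    c = rng.normal(size=n)                    # continuous confounder
    x = 0.7 * c + rng.normal(size=n)          # exposure driven by confounder
    y = 0.7 * c + rng.normal(size=n)          # outcome: true exposure effect is 0
    c_cat = (c > np.median(c)).astype(float)  # median dichotomization
    rejections += exposure_pvalue(y, x, c_cat) < 0.05

type1_rate = rejections / n_sims              # far above the nominal 0.05
```

Adjusting for the continuous confounder itself (passing `c` instead of `c_cat`) brings the rejection rate back to roughly 5%, which is the contrast the paper quantifies.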

  13. Contemporary Trends in Radiation Oncology Resident Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verma, Vivek; Burt, Lindsay; Gimotty, Phyllis A.

    Purpose: To test the hypothesis that recent resident research productivity might be different than a decade ago, and to provide contemporary information about resident scholarly activity. Methods and Materials: We compiled a list of radiation oncology residents from the 2 most recent graduating classes (June 2014 and 2015) using the Association of Residents in Radiation Oncology annual directories. We queried the PubMed database for each resident's first-authored publications from postgraduate years (PGY) 2 through 5, plus a 3-month period after residency completion. We abstracted corresponding historical data for 2002 to 2007 from the benchmark publication by Morgan and colleagues (Int J Radiat Oncol Biol Phys 2009;74:1567-1572). We tested the null hypothesis that these 2 samples had the same distribution for number of publications using the Wilcoxon rank-sum test. We explored the association of demographic factors and publication number using multivariable zero-inflated Poisson regression. Results: There were 334 residents publishing 659 eligible first-author publications during residency (range 0-17; interquartile range 0-3; mean 2.0; median 1). The contemporary and historical distributions were significantly different (P<.001); contemporary publication rates were higher. Publications accrued late in residency (27% in PGY-4, 59% in PGY-5), and most were original research (75%). In the historical cohort, half of all articles were published in 3 journals; in contrast, the top half of contemporary publications were spread over 10 journals, most commonly International Journal of Radiation Oncology • Biology • Physics (17%), Practical Radiation Oncology (7%), and Radiation Oncology (4%). Male gender, non-PhD status, and larger residency size were associated with higher number of publications in the multivariable analysis. Conclusion: We observed an increase in first-author publications during training compared with historical data from the mid-2000s. 
These contemporary figures may be useful to medical students considering radiation oncology, current residents, training programs, and prospective employers.

  14. Extreme climatic conditions and health service utilisation across rural and metropolitan New South Wales

    NASA Astrophysics Data System (ADS)

    Jegasothy, Edward; McGuire, Rhydwyn; Nairn, John; Fawcett, Robert; Scalley, Benjamin

    2017-08-01

    Periods of successive extreme heat and cold temperature have major effects on human health and increase rates of health service utilisation. The severity of these events varies between geographic locations and populations. This study aimed to estimate the effects of heat waves and cold waves on health service utilisation across urban, regional and remote areas in New South Wales (NSW), Australia, during the 10-year study period 2005-2015. We divided the state into three regions and used 24 over-dispersed or zero-inflated Poisson time-series regression models to estimate the effect of heat waves and cold waves, of three levels of severity, on the rates of ambulance call-outs, emergency department (ED) presentations and mortality. We defined heat waves and cold waves using excess heat factor (EHF) and excess cold factor (ECF) metrics, respectively. Heat waves generally resulted in increased rates of ambulance call-outs, ED presentations and mortality across the three regions and the entire state. For all of NSW, very intense heat waves resulted in an increase of 10.8% (95% confidence interval (CI) 4.5, 17.4%) in mortality, 3.4% (95% CI 0.8, 7.8%) in ED presentations and 10.9% (95% CI 7.7, 14.2%) in ambulance call-outs. Cold waves were shown to have significant effects on ED presentations (9.3% increase for intense events, 95% CI 8.0-10.6%) and mortality (8.8% increase for intense events, 95% CI 2.1-15.9%) in outer regional and remote areas. There was little evidence for an effect from cold waves on health service utilisation in major cities and inner regional areas. Heat waves have a large impact on health service utilisation in NSW in both urban and rural settings. Cold waves also have significant effects in outer regional and remote areas. EHF is a good predictor of health service utilisation for heat waves, although service needs may differ between urban and rural areas.

  15. The role of socioeconomic status in longitudinal trends of cholera in Matlab, Bangladesh, 1993-2007.

    PubMed

    Root, Elisabeth Dowling; Rodd, Joshua; Yunus, Mohammad; Emch, Michael

    2013-01-01

    There has been little evidence of a decline in the global burden of cholera in recent years as the number of cholera cases reported to WHO continues to rise. Cholera remains a global threat to public health and a key indicator of lack of socioeconomic development. Overall socioeconomic development is the ultimate solution for control of cholera as evidenced in developed countries. However, most research has focused on cross-country comparisons so that the role of individual- or small area-level socioeconomic status (SES) in cholera dynamics has not been carefully studied. Reported cases of cholera in Matlab, Bangladesh have fluctuated greatly over time and epidemic outbreaks of cholera continue, most recently with the introduction of a new serotype into the region. The wealth of longitudinal data on the population of Matlab provides a unique opportunity to explore the impact of socioeconomic status and other demographic characteristics on the long-term temporal dynamics of cholera in the region. In this population-based study we examine which factors impact the initial number of cholera cases in a bari at the beginning of the O139 epidemic and which factors impact the number of cases over time. Cholera data were derived from the ICDDR,B health records and linked to socioeconomic and geographic data collected as part of the Matlab Health and Demographic Surveillance System. Longitudinal zero-inflated Poisson (ZIP) multilevel regression models are used to examine the impact of environmental and socio-demographic factors on cholera counts across baris. Results indicate that baris with a high socioeconomic status had lower initial rates of cholera at the beginning of the O139 epidemic (γ(01) = -0.147, p = 0.041) and a higher probability of reporting no cholera cases (α(01) = 0.156, p = 0.061). Populations in baris characterized by low SES are more likely to experience higher cholera morbidity at the beginning of an epidemic than populations in high SES baris.

  16. Chronic diseases as predictors of labour market attachment after participation in subsidised re-employment programme: a 6-year follow-up study.

    PubMed

    Nwaru, Chioma A; Peutere, Laura; Kivimäki, Mika; Pentti, Jaana; Vahtera, Jussi; Virtanen, Pekka J

    2017-11-01

    Little is known about the work patterns of re-employed people. We investigated the labour market attachment trajectories of re-employed people and assessed the influence of chronic diseases on these trajectories. The study was based on register data of 18 944 people (aged 18-60 years) who participated in a subsidised re-employment programme in Finland. Latent class growth analysis with zero-inflated Poisson was used to model the labour market attachment trajectories over a 6-year follow-up time. Multinomial logistic regression was used to examine the associations between chronic diseases and labour market attachment trajectories, adjusting for age, gender, educational level, size of town and calendar year in the subsidised re-employment programme. We identified four distinct labour market attachment trajectories, namely: strengthening (a relatively stable attachment throughout the follow-up time; 77%), delayed (initial weak attachment increasing later; 6%), leavers (attachment declined with time; 10%) and none-attached (weak attachment throughout the study period; 7%). We found that severe mental problems strongly increased the likelihood of belonging to the leavers (OR 3.61; 95% CI 2.23 to 5.37) and none-attached (OR 3.41; 95% CI 1.91 to 6.10) trajectories, while chronic hypertension was associated with the none-attached (OR 1.37; 95% CI 1.06 to 1.77) trajectory. The associations between other chronic diseases (diabetes, heart disease, asthma and arthritis) and labour market attachment trajectories were less evident. Re-employed people appear to follow distinct labour market attachment trajectories over time. Having chronic diseases, especially mental disorders, appears to increase the risk of relatively poor labour market attachment. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  17. Contemporary Trends in Radiation Oncology Resident Research.

    PubMed

    Verma, Vivek; Burt, Lindsay; Gimotty, Phyllis A; Ojerholm, Eric

    2016-11-15

    To test the hypothesis that recent resident research productivity might be different than a decade ago, and to provide contemporary information about resident scholarly activity. We compiled a list of radiation oncology residents from the 2 most recent graduating classes (June 2014 and 2015) using the Association of Residents in Radiation Oncology annual directories. We queried the PubMed database for each resident's first-authored publications from postgraduate years (PGY) 2 through 5, plus a 3-month period after residency completion. We abstracted corresponding historical data for 2002 to 2007 from the benchmark publication by Morgan and colleagues (Int J Radiat Oncol Biol Phys 2009;74:1567-1572). We tested the null hypothesis that these 2 samples had the same distribution for number of publications using the Wilcoxon rank-sum test. We explored the association of demographic factors and publication number using multivariable zero-inflated Poisson regression. There were 334 residents publishing 659 eligible first-author publications during residency (range 0-17; interquartile range 0-3; mean 2.0; median 1). The contemporary and historical distributions were significantly different (P<.001); contemporary publication rates were higher. Publications accrued late in residency (27% in PGY-4, 59% in PGY-5), and most were original research (75%). In the historical cohort, half of all articles were published in 3 journals; in contrast, the top half of contemporary publications were spread over 10 journals-most commonly International Journal of Radiation Oncology • Biology • Physics (17%), Practical Radiation Oncology (7%), and Radiation Oncology (4%). Male gender, non-PhD status, and larger residency size were associated with higher number of publications in the multivariable analysis. We observed an increase in first-author publications during training compared with historical data from the mid-2000s. 
These contemporary figures may be useful to medical students considering radiation oncology, current residents, training programs, and prospective employers. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Protection from annual flooding is correlated with increased cholera prevalence in Bangladesh: a zero-inflated regression analysis.

    PubMed

    Carrel, Margaret; Voss, Paul; Streatfield, Peter K; Yunus, Mohammad; Emch, Michael

    2010-03-22

    Alteration of natural or historical aquatic flows can have unintended consequences for regions where waterborne diseases are endemic and where the epidemiologic implications of such change are poorly understood. The implementation of flood protection measures for a portion of an intensely monitored population in Matlab, Bangladesh, allows us to examine whether cholera outcomes respond positively or negatively to measures designed to control river flooding. Using a zero-inflated negative binomial model, we examine how selected covariates can simultaneously distinguish household clusters reporting no cholera from those with positive counts and separate residential areas with low counts from areas with high cholera counts. Our goal is to examine how residence within or outside a flood protected area interacts with the probability of cholera presence and the effect of flood protection on the magnitude of cholera prevalence. In Matlab, living in a household that is protected from annual monsoon flooding appears to have no significant effect on whether the household experiences cholera, net of other covariates. However, counter-intuitively, among households where cholera is reported, living within the flood protected region significantly increases the number of cholera cases. The construction of dams or other water impoundment strategies for economic or social motives can have profound and unanticipated consequences for waterborne disease. Our results indicate that the construction of a flood control structure in rural Bangladesh is correlated with an increase in cholera cases for residents protected from annual monsoon flooding. Such a finding requires attention from both the health community and from governments and non-governmental organizations involved in ongoing water management schemes.
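
    For reference, the zero-inflated negative binomial places a point mass at zero alongside a negative binomial count component. A short scipy sketch with hypothetical parameters gives the mixture's zero probability and mean, and checks the zero probability by simulation:

```python
import numpy as np
from scipy.stats import nbinom

rng = np.random.default_rng(6)

# Zero-inflated negative binomial: a point mass at zero (probability pi)
# mixed with a negative binomial count component (size r, success prob p).
pi, r, p = 0.25, 2.0, 0.4                      # hypothetical parameters

p_zero = pi + (1 - pi) * nbinom.pmf(0, r, p)   # mixture P(Y = 0)
mean = (1 - pi) * nbinom.mean(r, p)            # structural zeros add nothing

# Check by simulation: negative binomial draws with a fraction pi of
# observations forced to structural zeros.
draws = nbinom.rvs(r, p, size=200_000, random_state=rng)
draws[rng.random(draws.size) < pi] = 0
empirical_zero = (draws == 0).mean()
```

The negative binomial component handles overdispersion in the positive counts, while pi absorbs the excess zeros; this is the same two-part structure as ZIP but with an extra dispersion parameter.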

  19. Effects of health intervention programs and arsenic exposure on child mortality from acute lower respiratory infections in rural Bangladesh.

    PubMed

    Jochem, Warren C; Razzaque, Abdur; Root, Elisabeth Dowling

    2016-09-01

    Respiratory infections continue to be a public health threat, particularly to young children in developing countries. Understanding the geographic patterns of diseases and the role of potential risk factors can help improve future mitigation efforts. Toward this goal, this paper applies a spatial scan statistic combined with a zero-inflated negative-binomial regression to re-examine the impacts of a community-based treatment program on the geographic patterns of acute lower respiratory infection (ALRI) mortality in an area of rural Bangladesh. Exposure to arsenic-contaminated drinking water is also a serious threat to the health of children in this area, and the variation in exposure to arsenic must be considered when evaluating the health interventions. ALRI mortality data were obtained for children under 2 years old from 1989 to 1996 in the Matlab Health and Demographic Surveillance System. This study period covers the years immediately following the implementation of an ALRI control program. A zero-inflated negative binomial (ZINB) regression model was first used to simultaneously estimate mortality rates and the likelihood of no deaths in groups of related households while controlling for socioeconomic status, potential arsenic exposure, and access to care. Next a spatial scan statistic was used to assess the location and magnitude of clusters of ALRI mortality. The ZINB model was used to adjust the scan statistic for multiple social and environmental risk factors. The results of the ZINB models and spatial scan statistic suggest that the ALRI control program was successful in reducing child mortality in the study area. Exposure to arsenic-contaminated drinking water was not associated with increased mortality. Higher socioeconomic status also significantly reduced mortality rates, even among households who were in the treatment program area. 
Community-based ALRI interventions can be effective at reducing child mortality, though socioeconomic factors may continue to influence mortality patterns. The combination of spatial and non-spatial methods used in this paper has not been applied previously in the literature, and this study demonstrates the importance of such approaches for evaluating and improving public health intervention programs.

  20. DEsingle for detecting three types of differential expression in single-cell RNA-seq data.

    PubMed

    Miao, Zhun; Deng, Ke; Wang, Xiaowo; Zhang, Xuegong

    2018-04-24

    The excessive zeros in single-cell RNA-seq data include "real" zeros due to the on-off nature of gene transcription in single cells and "dropout" zeros due to technical reasons. Existing differential expression (DE) analysis methods cannot distinguish these two types of zeros. We developed an R package, DEsingle, which employs a Zero-Inflated Negative Binomial model to estimate the proportion of real and dropout zeros and to define and detect three types of DE genes in single-cell RNA-seq data with higher accuracy. The R package DEsingle is freely available at https://github.com/miaozhun/DEsingle and is currently under consideration at Bioconductor. zhangxg@tsinghua.edu.cn. Supplementary data are available at Bioinformatics online.
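    The real-versus-dropout distinction rests on the ZINB decomposition of the zero probability, P(Y=0) = π + (1−π)·P_NB(0). A small numeric sketch with hypothetical fitted values for one gene (not DEsingle's actual estimator):

```python
from scipy.stats import nbinom

# Under a ZINB model the zero probability decomposes as
#   P(Y = 0) = pi + (1 - pi) * P_NB(0),
# where pi is the structural ("dropout") zero weight. The fitted values
# below are hypothetical, for one illustrative gene.
pi = 0.30               # dropout weight
mu, alpha = 2.5, 0.8    # NB mean and dispersion (Var = mu + alpha * mu**2)

# Convert (mu, alpha) to scipy's (n, p) parameterization of the NB
n_param = 1.0 / alpha
p_param = n_param / (n_param + mu)
p_nb_zero = nbinom.pmf(0, n_param, p_param)   # probability of a "real" zero

p_zero = pi + (1 - pi) * p_nb_zero
frac_dropout = pi / p_zero    # share of observed zeros attributed to dropout
print(p_zero, frac_dropout)
```

    With these (made-up) parameters, a bit under half the cells show a zero, and roughly 60% of those zeros are attributed to dropout rather than transcriptional silence.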

  1. Mixed effect Poisson log-linear models for clinical and epidemiological sleep hypnogram data

    PubMed Central

    Swihart, Bruce J.; Caffo, Brian S.; Crainiceanu, Ciprian; Punjabi, Naresh M.

    2013-01-01

    Bayesian Poisson log-linear multilevel models scalable to epidemiological studies are proposed to investigate population variability in sleep state transition rates. Hierarchical random effects are used to account for pairings of subjects and repeated measures within those subjects, as comparing diseased to non-diseased subjects while minimizing bias is of importance. Essentially, non-parametric piecewise constant hazards are estimated and smoothed, allowing for time-varying covariates and segment of the night comparisons. The Bayesian Poisson regression is justified through a re-derivation of a classical algebraic likelihood equivalence of Poisson regression with a log(time) offset and survival regression assuming exponentially distributed survival times. Such re-derivation allows synthesis of two methods currently used to analyze sleep transition phenomena: stratified multi-state proportional hazards models and log-linear models with GEE for transition counts. An example data set from the Sleep Heart Health Study is analyzed. Supplementary material includes the analyzed data set as well as the code for a reproducible analysis. PMID:22241689

  2. Estimation of inflation parameters for Perturbed Power Law model using recent CMB measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mukherjee, Suvodip; Das, Santanu; Souradeep, Tarun

    2015-01-01

    Cosmic Microwave Background (CMB) is an important probe for understanding the inflationary era of the Universe. We consider the Perturbed Power Law (PPL) model of inflation, which is a soft deviation from the Power Law (PL) inflationary model. This model captures the effect of higher-order derivatives of the Hubble parameter during inflation, which in turn lead to a non-zero effective mass m_eff for the inflaton field. At leading order, the higher-order derivatives of the Hubble parameter source a constant difference between the spectral indices for scalar and tensor perturbations, going beyond the PL model of inflation. The PPL model has two independent observable parameters, namely the spectral index for tensor perturbations ν_t and the change in the spectral index for scalar perturbations ν_st, to explain the observed features in the scalar and tensor power spectra of perturbations. From the recent measurements of CMB power spectra by WMAP, Planck and BICEP-2 for temperature and polarization, we estimate the feasibility of the PPL model against the standard ΛCDM model. Although BICEP-2 claimed a detection of r = 0.2, estimates of dust contamination provided by Planck have left open the possibility that only an upper bound on r will be expected in a joint analysis. As a result we consider different upper bounds on the value of r and show that the PPL model can explain a lower value of the tensor-to-scalar ratio (r < 0.1 or r < 0.01) for a scalar spectral index of n_s = 0.96 by having a non-zero value of the effective mass of the inflaton field, m²_eff/H². The analysis with the WP + Planck likelihood shows a non-zero detection of m²_eff/H² at 5.7σ and 8.1σ for r < 0.1 and r < 0.01, respectively. With the BICEP-2 likelihood, m²_eff/H² = −0.0237 ± 0.0135, which is consistent with zero.

  3. Inflation and dark energy from f(R) gravity

    NASA Astrophysics Data System (ADS)

    Artymowski, Michał; Lalak, Zygmunt

    2014-09-01

    The standard Starobinsky inflation has been extended to the R + αR^n − βR^(2−n) model to obtain a stable minimum of the Einstein-frame scalar potential of the auxiliary field. As a result we have obtained a scalar potential with a non-zero value of residual vacuum energy, which may be a source of Dark Energy. Our results can easily be made consistent with PLANCK or BICEP2 data for appropriate choices of the value of n.

  4. Fuzzy classifier based support vector regression framework for Poisson ratio determination

    NASA Astrophysics Data System (ADS)

    Asoodeh, Mojtaba; Bagheripour, Parisa

    2013-09-01

    Poisson ratio is considered one of the most important rock mechanical properties of hydrocarbon reservoirs. Determination of this parameter through laboratory measurement is time-, cost-, and labor-intensive. Furthermore, laboratory measurements do not provide continuous data along the reservoir intervals. Hence, a fast, accurate, and inexpensive way of determining Poisson ratio which produces continuous data over the whole reservoir interval is desirable. For this purpose, the support vector regression (SVR) method based on statistical learning theory (SLT) was employed as a supervised learning algorithm to estimate Poisson ratio from conventional well log data. SVR is capable of accurately extracting the implicit knowledge contained in conventional well logs and converting the gained knowledge into Poisson ratio data. The structural risk minimization (SRM) principle, which is embedded in the SVR structure in addition to the empirical risk minimization (ERM) principle, provides a robust model for finding a quantitative formulation between conventional well log data and Poisson ratio. Although satisfying results were obtained from an individual SVR model, it had flaws of overestimation at low Poisson ratios and underestimation at high Poisson ratios. These errors were eliminated through implementation of fuzzy classifier based SVR (FCBSVR). The FCBSVR significantly improved the accuracy of the final prediction. This strategy was successfully applied to data from carbonate reservoir rocks of an Iranian oil field. Results indicated that SVR-predicted Poisson ratio values are in good agreement with measured values.
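    The core SVR step (without the fuzzy-classifier stage) can be sketched with scikit-learn. The well-log features and the target relationship below are synthetic stand-ins, not the paper's field data:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 400
logs = rng.normal(size=(n, 3))  # stand-ins for three conventional well-log curves

# Synthetic Poisson's-ratio-like target with a mild nonlinearity plus noise
target = (0.25 + 0.05 * np.tanh(logs[:, 0]) - 0.02 * logs[:, 1]
          + 0.01 * logs[:, 2] + rng.normal(scale=0.005, size=n))

X_tr, X_te, y_tr, y_te = train_test_split(logs, target, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.005))
model.fit(X_tr, y_tr)
print("held-out R^2:", model.score(X_te, y_te))
```

    Scaling before the RBF kernel matters in practice, since raw well-log curves live on very different units; the C and epsilon values here are arbitrary starting points, not tuned choices.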

  5. Data driven CAN node reliability assessment for manufacturing system

    NASA Astrophysics Data System (ADS)

    Zhang, Leiming; Yuan, Yong; Lei, Yong

    2017-01-01

    The reliability of the Controller Area Network (CAN) is critical to the performance and safety of the system. However, direct bus-off time assessment tools are lacking in practice due to inaccessibility of the node information and the complexity of the node interactions upon errors. In order to measure the mean time to bus-off (MTTB) of all the nodes, a novel data-driven node bus-off time assessment method for CAN networks is proposed that directly uses network error information. First, the corresponding network error event sequence for each node is constructed using multiple-layer network error information. Then, a generalized zero-inflated Poisson process (GZIP) model is established for each node based on the error event sequence. Finally, the stochastic model is constructed to predict the MTTB of the node. Accelerated case studies with different error injection rates are conducted on a laboratory network to demonstrate the proposed method, where the network errors are generated by a computer-controlled error injection system. Experiment results show that the MTTB of nodes predicted by the proposed method agrees well with observations in the case studies. The proposed data-driven node time-to-bus-off assessment method for CAN networks can successfully predict the MTTB of nodes by directly using network error event data.

  6. Bayesian analysis of zero inflated spatiotemporal HIV/TB child mortality data through the INLA and SPDE approaches: Applied to data observed between 1992 and 2010 in rural North East South Africa

    NASA Astrophysics Data System (ADS)

    Musenge, Eustasius; Chirwa, Tobias Freeman; Kahn, Kathleen; Vounatsou, Penelope

    2013-06-01

    Longitudinal mortality data with few deaths usually have problems of zero-inflation. This paper presents and applies two Bayesian models which cater for zero-inflation, spatial and temporal random effects. To reduce the computational burden experienced when a large number of geo-locations are treated as a Gaussian field (GF) we transformed the field to a Gaussian Markov Random Fields (GMRF) by triangulation. We then modelled the spatial random effects using the Stochastic Partial Differential Equations (SPDEs). Inference was done using a computationally efficient alternative to Markov chain Monte Carlo (MCMC) called Integrated Nested Laplace Approximation (INLA) suited for GMRF. The models were applied to data from 71,057 children aged 0 to under 10 years from rural north-east South Africa living in 15,703 households over the years 1992-2010. We found protective effects on HIV/TB mortality due to greater birth weight, older age and more antenatal clinic visits during pregnancy (adjusted RR (95% CI)): 0.73(0.53;0.99), 0.18(0.14;0.22) and 0.96(0.94;0.97) respectively. Therefore childhood HIV/TB mortality could be reduced if mothers are better catered for during pregnancy as this can reduce mother-to-child transmissions and contribute to improved birth weights. The INLA and SPDE approaches are computationally good alternatives in modelling large multilevel spatiotemporal GMRF data structures.

  7. Temperature of the inflaton and duration of inflation from Wilkinson microwave anisotropy probe data.

    PubMed

    Bhattacharya, Kaushik; Mohanty, Subhendra; Rangarajan, Raghavan

    2006-03-31

    If the initial state of the inflaton field is taken to have a thermal distribution instead of the conventional zero-particle vacuum state, then the curvature power spectrum gets modified by a temperature-dependent factor such that the fluctuation spectrum of the microwave background radiation is enhanced at larger angles. We compare this modified cosmic microwave background spectrum with Wilkinson microwave anisotropy probe data to obtain an upper bound on the temperature of the inflaton at the time our current horizon crossed the horizon during inflation. We further conclude that there must be additional e-foldings of inflation beyond what is needed to solve the horizon problem.

  8. Automatic processes in at-risk adolescents: the role of alcohol-approach tendencies and response inhibition in drinking behavior.

    PubMed

    Peeters, Margot; Wiers, Reinout W; Monshouwer, Karin; van de Schoot, Rens; Janssen, Tim; Vollebergh, Wilma A M

    2012-11-01

    This study examined the association between automatic processes and drinking behavior in relation to individual differences in response inhibition in young adolescents who had just started drinking. It was hypothesized that strong automatic behavioral tendencies toward alcohol-related stimuli (alcohol-approach bias) were associated with higher levels of alcohol use, especially amongst adolescents with relatively weak inhibition skills. To test this hypothesis, structural equation modeling (SEM) analyses were performed using a zero-inflated Poisson (ZIP) model. A well-known problem in studying risk behavior is the low incidence rate, resulting in a zero-dominated distribution; a ZIP model accounts for this non-normality of the data. Adolescents were selected from secondary Special Education schools (a risk group for the development of substance use problems). Participants were 374 adolescents (mean age M = 13.6 years). Adolescents completed the alcohol approach avoidance task (a-AAT), the Stroop colour naming task (Stroop) and a questionnaire that assessed alcohol use. The ZIP model established stronger alcohol-approach tendencies for adolescent drinkers (P < 0.01), and the interaction revealed a stronger effect of alcohol-approach tendencies on alcohol use in the absence of good inhibition skills (P < 0.05). Automatically activated cognitive processes are associated with the drinking behavior of young, at-risk adolescents. It appears that alcohol-approach tendencies are formed shortly after the initiation of drinking and particularly affect the drinking behavior of adolescents with relatively weak inhibition skills. Implications for the prevention of problem drinking in adolescents are discussed. © 2012 The Authors. Addiction © 2012 Society for the Study of Addiction.
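    A bare-bones version of the ZIP part of such an analysis (without the structural-equation machinery) can be sketched with statsmodels. The data below are simulated and the covariate names hypothetical; only the sample size echoes the abstract:

```python
import numpy as np
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(3)
n = 374                      # sample size echoes the abstract; data are simulated
bias = rng.normal(size=n)    # hypothetical alcohol-approach bias score
inhib = rng.normal(size=n)   # hypothetical inhibition (Stroop) score

# Simulate: structural zeros (abstainers) mixed with Poisson counts whose
# mean depends on approach bias and its interaction with inhibition
pi = 0.6
mu = np.exp(0.8 + 0.4 * bias - 0.3 * bias * inhib)
drinks = np.where(rng.random(n) < pi, 0, rng.poisson(mu))

X = np.column_stack([np.ones(n), bias, bias * inhib])
res = ZeroInflatedPoisson(drinks, X, exog_infl=np.ones((n, 1))).fit(
    method="bfgs", maxiter=500, disp=0)
print(res.params)   # inflation intercept, then count-model coefficients
```

    The interaction column carries the hypothesis of interest: a negative coefficient would mean approach bias matters less as inhibition improves, mirroring the moderation the study reports.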

  9. Characterizing the performance of the Conway-Maxwell Poisson generalized linear model.

    PubMed

    Francis, Royce A; Geedipally, Srinivas Reddy; Guikema, Seth D; Dhavala, Soma Sekhar; Lord, Dominique; LaRocca, Sarah

    2012-01-01

    Count data are pervasive in many areas of risk analysis; deaths, adverse health outcomes, infrastructure system failures, and traffic accidents are all recorded as count events, for example. Risk analysts often wish to estimate the probability distribution for the number of discrete events as part of doing a risk assessment. Traditional count data regression models of the type often used in risk assessment for this problem suffer from limitations due to the assumed variance structure. A more flexible model based on the Conway-Maxwell Poisson (COM-Poisson) distribution was recently proposed, a model that has the potential to overcome the limitations of the traditional model. However, the statistical performance of this new model has not yet been fully characterized. This article assesses the performance of a maximum likelihood estimation method for fitting the COM-Poisson generalized linear model (GLM). The objectives of this article are to (1) characterize the parameter estimation accuracy of the MLE implementation of the COM-Poisson GLM, and (2) estimate the prediction accuracy of the COM-Poisson GLM using simulated data sets. The results of the study indicate that the COM-Poisson GLM is flexible enough to model under-, equi-, and overdispersed data sets with different sample mean values. The results also show that the COM-Poisson GLM yields accurate parameter estimates. The COM-Poisson GLM provides a promising and flexible approach for performing count data regression. © 2011 Society for Risk Analysis.
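    The COM-Poisson pmf that underlies the GLM, P(X = k) ∝ λ^k/(k!)^ν, moves from over- to underdispersion as ν increases through 1. A direct numerical sketch (truncating the normalizing constant; not the article's MLE code):

```python
import numpy as np

def com_poisson_pmf(x, lam, nu, max_k=200):
    """COM-Poisson pmf P(X = x) = lam**x / (x!)**nu / Z(lam, nu).
    nu < 1: overdispersed; nu = 1: Poisson; nu > 1: underdispersed."""
    ks = np.arange(max_k + 1)
    log_fact = np.cumsum(np.log(np.maximum(ks, 1)))   # log k! (log 0! = 0)
    log_terms = ks * np.log(lam) - nu * log_fact
    log_z = np.logaddexp.reduce(log_terms)            # log normalizing constant
    return np.exp(log_terms[x] - log_z)

def com_poisson_moments(lam, nu, max_k=200):
    ks = np.arange(max_k + 1)
    pmf = np.array([com_poisson_pmf(k, lam, nu, max_k) for k in ks])
    mean = (ks * pmf).sum()
    var = ((ks - mean) ** 2 * pmf).sum()
    return mean, var

for nu in (0.5, 1.0, 2.0):
    m, v = com_poisson_moments(3.0, nu)
    print(f"nu={nu}: mean={m:.3f}  var/mean={v / m:.3f}")
```

    At ν = 1 the distribution reduces to Poisson(λ) with variance-to-mean ratio 1; ν = 0.5 pushes the ratio above 1 and ν = 2 below 1, which is exactly the flexibility the article evaluates.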

  10. The transverse Poisson's ratio of composites.

    NASA Technical Reports Server (NTRS)

    Foye, R. L.

    1972-01-01

    An expression is developed that makes possible the prediction of Poisson's ratio for unidirectional composites with reference to any pair of orthogonal axes that are normal to the direction of the reinforcing fibers. This prediction appears to be a reasonable one in that it follows the trends of the finite element analysis and the bounding estimates, and has the correct limiting value for zero fiber content. It can only be expected to apply to composites containing stiff, circular, isotropic fibers bonded to a soft matrix material.

  11. What is the cause of confidence inflation in the Life Events Inventory (LEI) paradigm?

    PubMed

    Von Glahn, Nicholas R; Otani, Hajime; Migita, Mai; Langford, Sara J; Hillard, Erin E

    2012-01-01

    Briefly imagining, paraphrasing, or explaining an event causes people to increase their confidence that this event occurred during childhood: the imagination inflation effect. The mechanisms responsible for the effect were investigated with a new paradigm. In Experiment 1, event familiarity (defined as processing fluency) was varied by asking participants to rate each event once, three times, or five times. No inflation was found, indicating that familiarity does not account for the effect. In Experiment 2, richness of memory representation was manipulated by asking participants to generate zero, three, or six details. Confidence increased from the initial to the final rating in the three- and six-detail conditions, indicating that the effect is based on reality-monitoring errors. However, greater inflation in the three-detail condition than in the six-detail condition indicated that there is a boundary condition. These results were also consistent with an alternative hypothesis, the mental workload hypothesis.

  12. Applicability of Newton's law of cooling in monetary economics

    NASA Astrophysics Data System (ADS)

    Todorović, Jadranka Đurović; Tomić, Zoran; Denić, Nebojša; Petković, Dalibor; Kojić, Nenad; Petrović, Jelena; Petković, Biljana

    2018-03-01

    Inflation is a phenomenon which attracts the attention of many researchers. Inflation is not a phenomenon of recent date; it has existed ever since money emerged in the world's first economies. With the development of economies and markets, inflation developed as well. Today, even though there is a considerable number of research papers on inflation, there is still not enough knowledge about all the factors which might cause inflation and influence its evolution and dynamics. Regression analysis is a powerful statistical tool which might help analyse a vast amount of data on inflation, and provide an answer to the question of which factors drive inflation, as well as how those factors influence it. In this article, Newton's Law of Cooling was applied to determine the long-term dynamics of monetary aggregates and inflation in Serbia and Croatia.
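    Newton's law of cooling posits y(t) = y_env + (y0 − y_env)·e^(−kt); read for inflation, y_env plays the role of a long-run equilibrium rate. A sketch of fitting the law by nonlinear least squares on synthetic data (not the Serbian or Croatian series):

```python
import numpy as np
from scipy.optimize import curve_fit

# Newton's law of cooling: y(t) = y_env + (y0 - y_env) * exp(-k * t).
# Read for inflation: y_env is a long-run equilibrium rate, k a decay rate.
def cooling(t, y_env, y0, k):
    return y_env + (y0 - y_env) * np.exp(-k * t)

rng = np.random.default_rng(5)
t = np.linspace(0.0, 20.0, 60)
y = cooling(t, 3.0, 12.0, 0.4) + rng.normal(scale=0.1, size=t.size)

params, _ = curve_fit(cooling, t, y, p0=(1.0, 10.0, 0.1))
print("equilibrium, initial, decay rate:", params)
```

    The fitted decay rate k gives a characteristic half-life ln(2)/k, a convenient single-number summary of how quickly the series relaxes toward equilibrium.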

  13. Estimating the Depth of the Navy Recruiting Market

    DTIC Science & Technology

    2016-09-01

    recommend that NRC make use of the Poisson regression model in order to determine high-yield ZIP codes for market depth. Thesis by Emilie M. Monaghan, September 2016. Thesis Advisor: Lyn R. Whitaker; Second Reader: Jonathan K. Alt.

  14. The BRST complex of homological Poisson reduction

    NASA Astrophysics Data System (ADS)

    Müller-Lennert, Martin

    2017-02-01

    BRST complexes are differential graded Poisson algebras. They are associated with a coisotropic ideal J of a Poisson algebra P and provide a description of the Poisson algebra (P/J)^J as their cohomology in degree zero. Using the notion of stable equivalence introduced in Felder and Kazhdan (Contemporary Mathematics 610, Perspectives in representation theory, 2014), we prove that any two BRST complexes associated with the same coisotropic ideal are quasi-isomorphic in the case P = R[V] where V is a finite-dimensional symplectic vector space and the bracket on P is induced by the symplectic structure on V. As a corollary, the cohomology of the BRST complexes is canonically associated with the coisotropic ideal J in the symplectic case. We do not require any regularity assumptions on the constraints generating the ideal J. We finally quantize the BRST complex rigorously in the presence of infinitely many ghost variables and discuss the uniqueness of the quantization procedure.

  15. Inflation without inflaton: A model for dark energy

    NASA Astrophysics Data System (ADS)

    Falomir, H.; Gamboa, J.; Méndez, F.; Gondolo, P.

    2017-10-01

    The interaction between two initially causally disconnected regions of the Universe is studied using analogies of noncommutative quantum mechanics and the deformation of Poisson manifolds. These causally disconnected regions are governed by two independent Friedmann-Lemaître-Robertson-Walker (FLRW) metrics with scale factors a and b and cosmological constants Λ_a and Λ_b, respectively. The causality is turned on by positing a nontrivial Poisson bracket [P_α, P_β] = ε_αβ κ/G, where G is Newton's gravitational constant and κ is a dimensionless parameter. The posited deformed Poisson bracket has an interpretation in terms of 3-cocycles, anomalies, and Poissonian manifolds. The modified FLRW equations acquire an energy-momentum tensor from which we explicitly obtain the equation-of-state parameter. The modified FLRW equations are solved numerically and the solutions are inflationary or oscillating depending on the values of κ. In this model, the accelerating and decelerating regimes may be periodic. The analysis of the equation of state clearly shows the presence of dark energy. For completeness, the perturbative solution for κ ≪ 1 is also studied.

  16. Health information exchange and healthcare utilization.

    PubMed

    Vest, Joshua R

    2009-06-01

    Health information exchange (HIE) makes previously inaccessible data available to clinicians, resulting in more complete information. This study tested the hypotheses that HIE information access reduced emergency room visits and inpatient hospitalizations for ambulatory care sensitive conditions among medically indigent adults. HIE access was quantified by how frequently system users accessed patients' data. Encounter counts were modeled using zero-inflated binomial regression. HIE was not accessed for 43% of individuals. Patient factors associated with accessed data included: prior utilization, chronic conditions, and age. Higher levels of information access were significantly associated with increased counts of all encounter types. Results indicate system users were more likely to access HIE for patients for whom the information might be considered most beneficial. Ultimately, these results imply that HIE information access did not transform care in the ways many would expect. Expectations of utilization reductions, however logical, may have to be reevaluated or postponed.

  17. STS-45 crewmembers during zero gravity activities onboard KC-135 NASA 930

    NASA Image and Video Library

    1991-08-21

    S91-44453 (21 Aug 1991) --- The crew of STS-45 is already training for its March 1992 mission, including stints on the KC-135 zero-gravity-simulating aircraft. Shown with an inflatable globe are, clockwise from the top, C. Michael Foale, mission specialist; Dirk Frimout, payload specialist; Brian Duffy, pilot; Charles R. (Rick) Chappell, backup payload specialist; Charles F. Bolden, mission commander; Byron K. Lichtenberg, payload specialist; and Kathryn D. Sullivan, payload commander.

  18. Impact of a New Law to Reduce the Legal Blood Alcohol Concentration Limit - A Poisson Regression Analysis and Descriptive Approach.

    PubMed

    Nistal-Nuño, Beatriz

    2017-03-31

    In Chile, a new law introduced in March 2012 lowered the blood alcohol concentration (BAC) limit for impaired drivers from 0.1% to 0.08% and the BAC limit for driving under the influence of alcohol from 0.05% to 0.03%, but its effectiveness remains uncertain. The goal of this investigation was to evaluate the effects of this enactment on road traffic injuries and fatalities in Chile. A retrospective cohort study. Data were analyzed using a descriptive approach and Generalized Linear Models, a type of Poisson regression, to analyze deaths and injuries in a series of additive log-linear models accounting for the effects of law implementation, month influence, a linear time trend, and population exposure. A review of national databases in Chile was conducted from 2003 to 2014 to evaluate the monthly rates of traffic fatalities and injuries associated with alcohol and in total. A decrease of 28.1 percent was observed in the monthly rate of traffic fatalities related to alcohol as compared to before the law (P < 0.001); adding a linear time trend as a predictor, the decrease was 20.9 percent (P < 0.001). There was a reduction of 10.5 percent in the monthly rate of traffic injuries related to alcohol as compared to before the law (P < 0.001); adding a linear time trend as a predictor, the decrease was 24.8 percent (P < 0.001). Positive results followed from this new 'zero-tolerance' law implemented in 2012 in Chile. Chile experienced a significant reduction in alcohol-related traffic fatalities and injuries, making this a successful public health intervention.

  19. Quasi-neutral limit of Euler–Poisson system of compressible fluids coupled to a magnetic field

    NASA Astrophysics Data System (ADS)

    Yang, Jianwei

    2018-06-01

    In this paper, we consider the quasi-neutral limit of a three-dimensional Euler-Poisson system of compressible fluids coupled to a magnetic field. We prove that, as the Debye length tends to zero, periodic initial-value problems of the model have unique smooth solutions existing in the time interval where the ideal incompressible magnetohydrodynamic equations have a smooth solution. Meanwhile, it is proved that the smooth solutions converge to solutions of the incompressible magnetohydrodynamic equations with a sharp convergence rate in the quasi-neutral limit.

  20. Characterization and modelling of magneto-auxeticity, the magnetically induced auxetic behavior, in Galfenol

    NASA Astrophysics Data System (ADS)

    Raghunath, Ganesh

    Iron-Gallium alloy (Galfenol) is a magnetostrictive smart material (λ_sat ≈ 400 ppm) with potential for robust transduction owing to good magneto-mechanical coupling and useful mechanical properties. In addition, Galfenol exhibits a highly negative Poisson's ratio (denoted by ν) along the crystallographic directions on {100} planes, with ν values as low as -0.7 under tensile loads. Consequently, samples become wider when elongated and narrower when compressed (aka auxeticity). This is an anisotropic, in-plane and volume-conserving phenomenon with compensating contractions and expansions in the third (out-of-plane) direction. Since there is good magneto-elastic coupling in Galfenol, a negative Poisson's ratio is expected to be observed under application of magnetic fields even under zero-stress conditions. This work deals with systematically studying the magneto-elastic contributions in Galfenol samples between 12 and 33 atomic percent Ga as a non-synthetic (no artificial linkages, unlike foams) 'structural auxetic' material, capable of bearing loads. This investigation addresses the profound gap in understanding this atypical behavior using empirical data supported by analytical modeling from first principles to predict the Poisson's ratio at magnetic saturation, multi-physics finite element simulations to determine the trends in the strains along the {100} directions, and magnetic domain imaging to explain the mechanical response from a magnetic domain perspective. The outcome of this effort will help comprehend the association between anisotropic magnetic and mechanical energies and hence the magnetic contributions to the atomic-level interactions that are the origins of this magneto-auxetic characteristic. Also, it is well established that a number of mechanical properties such as shear resistance and toughness depend on the value of Poisson's ratio. There is a slight increase in these mechanical properties with non-zero ν values, but as we enter the highly auxetic regime (ν < -0.5), these values increase by orders of magnitude. Hence, the possibility of ν values approaching -1.0 under applied magnetic fields at zero stress is extremely intriguing, as these properties can be much larger than is possible in conventional materials. This has potential for several novel applications where the value of Poisson's ratio can be magnetically tuned to keep it near -1 under applied stresses.

  1. Analysis of aggregate impact factor inflation in ophthalmology.

    PubMed

    Caramoy, Albert; Korwitz, Ulrich; Eppelin, Anita; Kirchhof, Bernd; Fauser, Sascha

    2013-01-01

    To analyze the aggregate impact factor (AIF) in ophthalmology, its inflation rate, and its relation to other subject fields. A retrospective database review of all subject fields in the Journal Citation Reports (JCR), Science edition. Citation data, AIF, number of journals, and citations from the years 2003-2011 were analyzed. Data were retrieved from JCR. Future trends were calculated using a linear regression method. The AIF varies considerably between subjects. It also shows an inflation rate, which varies annually. The AIF inflation rate in ophthalmology was not as high as the background AIF inflation rate. The AIF inflation rate caused the AIF to increase annually. Not considering these variations in the AIF between years and between fields will make the AIF inappropriate as a bibliometric tool. Copyright © 2012 S. Karger AG, Basel.

  2. A Predictive Model Has Identified Tick-Borne Encephalitis High-Risk Areas in Regions Where No Cases Were Reported Previously, Poland, 1999–2012

    PubMed Central

    Rubikowska, Barbara; Bratkowski, Jakub; Ustrnul, Zbigniew; Vanwambeke, Sophie O.

    2018-01-01

    During 1999–2012, 77% of the cases of tick-borne encephalitis (TBE) were recorded in two out of 16 Polish provinces. However, historical data, mostly from national serosurveys, suggest that the disease could be undetected in many areas. The aim of this study was to identify which routinely-measured meteorological, environmental, and socio-economic factors are associated with TBE human risk across Poland, with a particular focus on areas reporting few cases, but where serosurveys suggest higher incidence. We fitted a zero-inflated Poisson model using data on TBE incidence recorded in 108 NUTS-5 administrative units in high-risk areas over the period 1999–2012. Subsequently we applied the best-fitting model to all Polish municipalities. Keeping the remaining variables constant, the predicted rate increased with increasing air temperature over the previous 10–20 days, precipitation over the previous 20–30 days, forestation, forest edge density, forest road density, and unemployment. The predicted rate decreased with increasing distance from forests. The map of predicted rates was consistent with the established risk areas. It predicted, however, high rates in provinces considered TBE-free. We recommend raising awareness among physicians working in the predicted high-risk areas and considering routine use of household animal surveys for risk mapping. PMID:29617333

  3. A Predictive Model Has Identified Tick-Borne Encephalitis High-Risk Areas in Regions Where No Cases Were Reported Previously, Poland, 1999-2012.

    PubMed

    Stefanoff, Pawel; Rubikowska, Barbara; Bratkowski, Jakub; Ustrnul, Zbigniew; Vanwambeke, Sophie O; Rosinska, Magdalena

    2018-04-04

    During 1999–2012, 77% of the cases of tick-borne encephalitis (TBE) were recorded in two out of 16 Polish provinces. However, historical data, mostly from national serosurveys, suggest that the disease could be undetected in many areas. The aim of this study was to identify which routinely-measured meteorological, environmental, and socio-economic factors are associated with TBE human risk across Poland, with a particular focus on areas reporting few cases, but where serosurveys suggest higher incidence. We fitted a zero-inflated Poisson model using data on TBE incidence recorded in 108 NUTS-5 administrative units in high-risk areas over the period 1999–2012. Subsequently we applied the best-fitting model to all Polish municipalities. Keeping the remaining variables constant, the predicted rate increased with increasing air temperature over the previous 10–20 days, precipitation over the previous 20–30 days, forestation, forest edge density, forest road density, and unemployment. The predicted rate decreased with increasing distance from forests. The map of predicted rates was consistent with the established risk areas. It predicted, however, high rates in provinces considered TBE-free. We recommend raising awareness among physicians working in the predicted high-risk areas and considering routine use of household animal surveys for risk mapping.

  4. Attention Deficit Hyperactivity Disorder symptoms and smoking trajectories: race and gender differences.

    PubMed

    Lee, Chien-Ti; Clark, Trenette T; Kollins, Scott H; McClernon, F Joseph; Fuemmeler, Bernard F

    2015-03-01

    This study examined the influence of Attention Deficit Hyperactivity Disorder (ADHD) symptom severity and directionality (hyperactive-impulsive symptoms relative to inattentive symptoms) on trajectories of the probability of current (past month) smoking and the number of cigarettes smoked from age 13 to 32. Racial and gender differences in the relationship of ADHD symptoms and smoking trajectories were also assessed. A subsample of 9719 youth (54.5% female) was drawn from the National Longitudinal Study of Adolescent to Adult Health (Add Health). A cohort sequential design and zero-inflated Poisson (ZIP) latent growth modeling were used to estimate the relationships of ADHD directionality and severity with smoking development. The effect of ADHD severity on the likelihood of ever smoking cigarettes at the intercept (age 13) was greater among White males than among other groups. ADHD severity also had a stronger influence on the initial number of cigarettes smoked at age 13 among Hispanic participants. The relationship between ADHD directionality (hyperactive-impulsive symptoms relative to inattentive symptoms) and a higher number of cigarettes smoked at the intercept was stronger among Hispanic males than among others. Gender differences manifested only among Whites. ADHD severity and directionality had unique effects on smoking trajectories. Our results also highlight that the risk of ADHD symptoms may differ by race and gender. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. THE ROLE OF IMMIGRATION AGE ON ALCOHOL AND DRUG USE AMONG BORDER AND NON-BORDER MEXICAN AMERICANS

    PubMed Central

    Reingle, Jennifer M.; Caetano, Raul; Mills, Britain A.; Vaeth, Patrice A. C.

    2014-01-01

    Background To determine the age of immigration at which the marked increase in risk for alcohol- and drug use problems in adulthood is observed among Mexican American adults residing in two distinct contexts: the U.S.-Mexico border, and cities not proximal to the border. Methods We used two samples of Mexican American adults; specifically, 1,307 who resided along the U.S.-Mexico border, and 1,288 non-border adults who were interviewed as a part of the 2006 Hispanic Americans Baseline Alcohol Survey study. Survey logistic and Zero-Inflated Poisson methods were used to examine how immigration age during adolescence is related to alcohol and drug use behavior in adulthood. Results We found that participants who immigrate to the U.S. prior to age 12 have qualitatively different alcohol- and drug-related outcomes compared to those who immigrate later in life. Adults who immigrated at younger ages have alcohol and drug use patterns similar to those who were U.S.-born. Similarly, adults who immigrated at younger ages and live along the U.S.-Mexico border are at greater risk for alcohol and drug use than those who live in non-border contexts. Conclusions Immigration from Mexico to the U.S. before age 12 results in alcohol and drug-related behavior that mirrors the behavior of U.S.-born residents. PMID:24846850

  6. [Use of multiple regression models in observational studies (1970-2013) and requirements of the STROBE guidelines in Spanish scientific journals].

    PubMed

    Real, J; Cleries, R; Forné, C; Roso-Llorach, A; Martínez-Sánchez, J M

    In medicine and biomedical research, statistical techniques such as logistic, linear, Cox, and Poisson regression are widely known. The main objective is to describe the evolution of multivariate techniques used in observational studies indexed in PubMed (1970-2013), and to check the requirements of the STROBE guidelines in the author guidelines of Spanish journals indexed in PubMed. A targeted PubMed search was performed to identify papers that used logistic, linear, Cox, and Poisson models. Furthermore, a review was also made of the author guidelines of journals published in Spain and indexed in PubMed and Web of Science. Only 6.1% of the indexed manuscripts included a term related to multivariate analysis, increasing from 0.14% in 1980 to 12.3% in 2013. In 2013, 6.7%, 2.5%, 3.5%, and 0.31% of the manuscripts contained terms related to logistic, linear, Cox, and Poisson regression, respectively. On the other hand, 12.8% of the journals' author guidelines explicitly recommend following the STROBE guidelines, and 35.9% recommend the CONSORT guideline. A low percentage of Spanish scientific journals indexed in PubMed include the STROBE statement requirement in their author guidelines. Multivariate regression models such as logistic, linear, Cox, and Poisson regression are increasingly used in published observational studies, both internationally and in journals published in Spanish. Copyright © 2015 Sociedad Española de Médicos de Atención Primaria (SEMERGEN). Publicado por Elsevier España, S.L.U. All rights reserved.

  7. A comparison of methods for the analysis of binomial clustered outcomes in behavioral research.

    PubMed

    Ferrari, Alberto; Comelli, Mario

    2016-12-01

    In behavioral research, data consisting of a per-subject proportion of "successes" and "failures" over a finite number of trials often arise. Such clustered binary data are usually non-normally distributed, which can distort inference if the usual general linear model is applied and the sample size is small. A number of more advanced methods are available, but they are often technically challenging, and a comparative assessment of their performance in behavioral setups has not been performed. We studied the performance of some methods applicable to the analysis of proportions, namely linear regression, Poisson regression, beta-binomial regression and Generalized Linear Mixed Models (GLMMs). We report on a simulation study evaluating power and Type I error rate of these models in hypothetical scenarios met by behavioral researchers; in addition, we describe results from the application of these methods to data from real experiments. Our results show that, while GLMMs are powerful instruments for the analysis of clustered binary outcomes, beta-binomial regression can outperform them in a range of scenarios. Linear regression gave results consistent with the nominal level of significance, but was overall less powerful. Poisson regression, instead, mostly led to anticonservative inference. GLMMs and beta-binomial regression are generally more powerful than linear regression; yet linear regression is robust to model misspecification in some conditions, whereas Poisson regression suffers heavily from violations of the assumptions when used to model proportion data. We conclude by providing directions to behavioral scientists dealing with clustered binary data and small sample sizes. Copyright © 2016 Elsevier B.V. All rights reserved.
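    The overdispersion problem this study targets is easy to reproduce. A small simulation (a sketch under invented parameters, not the authors' setup) draws each subject's success probability from a Beta distribution before counting successes, which inflates the variance beyond the independent-trials binomial benchmark:

```python
import random
import statistics

random.seed(1)

def beta_binomial_sample(n, a, b):
    """One subject: a success probability drawn from Beta(a, b), then
    n Bernoulli trials -- the clustered-binary setup of the abstract."""
    p = random.betavariate(a, b)
    return sum(random.random() < p for _ in range(n))

n, a, b = 20, 2.0, 2.0                      # invented cluster size and Beta shape
counts = [beta_binomial_sample(n, a, b) for _ in range(5000)]
m = statistics.mean(counts)
v = statistics.variance(counts)
p_hat = m / n
binomial_var = n * p_hat * (1 - p_hat)      # variance if trials were independent
# v far exceeds binomial_var: the overdispersion that distorts naive models
```

    This within-subject correlation is exactly what the beta-binomial and GLMM approaches compared in the study are designed to absorb.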

  8. Mass inflation followed by Belinskii-Khalatnikov-Lifshitz collapse inside accreting, rotating black holes

    NASA Astrophysics Data System (ADS)

    Hamilton, Andrew J. S.

    2017-10-01

    Numerical evidence is presented that the Poisson-Israel mass inflation instability at the inner horizon of an accreting, rotating black hole is generically followed by Belinskii-Khalatnikov-Lifshitz oscillatory collapse to a spacelike singularity. The computation involves following all 6 degrees of freedom of the gravitational field. To simplify the problem, the computation takes as initial conditions the conformally separable solutions of Andrew J. S. Hamilton and Gavin Polhemus [Interior structure of rotating black holes. I. Concise derivation, Phys. Rev. D 84, 124055 (2011), 10.1103/PhysRevD.84.124055] and Andrew J. S. Hamilton [Interior structure of rotating black holes. II. Uncharged black holes, Phys. Rev. D 84, 124056 (2011), 10.1103/PhysRevD.84.124056] just above the inner horizon of a slowly accreting, rotating black hole and integrates the equations inward along single latitudes.

  9. Efficacy of a Universal Brief Intervention for Violence Among Urban Emergency Department Youth

    PubMed Central

    Carter, Patrick M.; Walton, Maureen A.; Zimmerman, Marc A.; Chermack, Stephen T.; Roche, Jessica S.; Cunningham, Rebecca M.

    2016-01-01

    Background Violent injury is the leading cause of death among urban youth. Emergency department (ED) visits represent an opportunity to deliver a brief intervention (BI) to reduce violence among youth seeking medical care in high-risk communities. Objective To determine the efficacy of a universally applied brief intervention (BI) addressing violence behaviors among youth presenting to an urban ED. Methods ED youth (14 to 20 years old) seeking medical or injury-related care in a Level-1 ED (October 2011–March 2015) and screening positive for a home address within the intervention or comparison neighborhood of a larger youth violence project were enrolled in this quasi-experimental study. Based on home address, participants were assigned to receive either the 30-min therapist-delivered BI (Project Sync) or a resource brochure (enhanced usual care [EUC] condition). The Project Sync BI combined motivational interviewing and cognitive skills training, including a review of participant goals, tailored feedback, decisional balance exercises, role-playing exercises, and linkage to community resources. Participants completed validated survey measures at baseline and a 2-month follow-up assessment. Main outcome measures included self-report of physical victimization, aggression, and self-efficacy to avoid fighting. Poisson and zero-inflated Poisson regression analyses were used to assess the effects of the BI, as compared to the EUC condition, on primary outcomes. Results 409 eligible youth (82% participation) were enrolled and assigned to either receive the BI (n=263) or the EUC condition (n=146). Two-month follow-up was 91% (n=373). There were no significant baseline differences between study conditions. Among the entire sample, mean age was 17.7 years (SD 1.9), 60% were female, 93% were African-American, and 79% reported receipt of public assistance. 
    Of participants, 9% presented for a violent injury, 9% reported recent firearm carriage, 20% reported recent alcohol use, and 39% reported recent marijuana use. Compared with the EUC group, participants in the therapist BI group showed self-reported reductions in the frequency of violent aggression (therapist, −46.8%; EUC, −36.9%; incident rate ratio [IRR], 0.87; 95% confidence interval [CI], 0.76–0.99) and increased self-efficacy for avoiding fighting (therapist, +7.2%; EUC, −1.3%; IRR, 1.09; 95% CI, 1.02–1.15). No significant changes were noted for victimization. Conclusions Among youth seeking ED care in a high-risk community, a brief, universally applied BI shows promise in increasing self-efficacy for avoiding fighting and decreasing the frequency of violent aggression. Trial Registration Clinicaltrials.gov identifier – NCT02586766 PMID:27265097

  10. Poisson-Like Spiking in Circuits with Probabilistic Synapses

    PubMed Central

    Moreno-Bote, Rubén

    2014-01-01

    Neuronal activity in cortex is variable both spontaneously and during stimulation, and it has the remarkable property that it is Poisson-like over broad ranges of firing rates covering from virtually zero to hundreds of spikes per second. The mechanisms underlying cortical-like spiking variability over such a broad continuum of rates are currently unknown. We show that neuronal networks endowed with probabilistic synaptic transmission, a well-documented source of variability in cortex, robustly generate Poisson-like variability over several orders of magnitude in their firing rate without fine-tuning of the network parameters. Other sources of variability, such as random synaptic delays or spike generation jittering, do not lead to Poisson-like variability at high rates because they cannot be sufficiently amplified by recurrent neuronal networks. We also show that probabilistic synapses predict Fano factor constancy of synaptic conductances. Our results suggest that synaptic noise is a robust and sufficient mechanism for the type of variability found in cortex. PMID:25032705
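    The Fano-factor prediction for probabilistic synapses can be illustrated directly: if each of n presynaptic spikes is transmitted independently with probability p, the released count is binomial and its variance-to-mean ratio is 1 − p at every rate. A toy simulation (invented numbers, not the paper's network model):

```python
import random
import statistics

random.seed(5)

def released(n_spikes, p):
    """Vesicles released given n presynaptic spikes, each transmitted
    independently with probability p (a probabilistic synapse)."""
    return sum(random.random() < p for _ in range(n_spikes))

p = 0.3
fanos = {}
for n in (10, 50, 200):                      # widely different input rates
    xs = [released(n, p) for _ in range(20000)]
    fanos[n] = statistics.variance(xs) / statistics.mean(xs)
# Each Fano factor sits near 1 - p = 0.7 regardless of rate:
# the constancy property mentioned in the abstract.
```

    The constancy across a 20-fold range of input counts is the signature that distinguishes synaptic noise from rate-dependent noise sources.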

  11. Inverse Jacobi multiplier as a link between conservative systems and Poisson structures

    NASA Astrophysics Data System (ADS)

    García, Isaac A.; Hernández-Bermejo, Benito

    2017-08-01

    Some aspects of the relationship between conservativeness of a dynamical system (namely the preservation of a finite measure) and the existence of a Poisson structure for that system are analyzed. From the local point of view, due to the flow-box theorem we restrict ourselves to neighborhoods of singularities. In this sense, we characterize Poisson structures around the typical zero-Hopf singularity in dimension 3 under the assumption of having a local analytic first integral with non-vanishing first jet by connecting with the classical Poincaré center problem. From the global point of view, we connect the property of being strictly conservative (the invariant measure must be positive) with the existence of a Poisson structure depending on the phase space dimension. Finally, weak conservativeness in dimension two is introduced by the extension of inverse Jacobi multipliers as weak solutions of its defining partial differential equation and some of its applications are developed. Examples including Lotka-Volterra systems, quadratic isochronous centers, and non-smooth oscillators are provided.
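    For reference, the object linking the two notions is defined as follows (the standard definition under the usual smoothness assumptions, not quoted from the paper): a function V is an inverse Jacobi multiplier of a vector field X when

```latex
% Inverse Jacobi multiplier V of a smooth vector field X (standard definition):
X \cdot \nabla V = V \,(\nabla \cdot X),
\qquad\text{equivalently}\qquad
\nabla \cdot \left(\frac{X}{V}\right) = 0 \quad \text{where } V \neq 0 .
```

    so 1/|V| plays the role of the density of an invariant measure, which is the bridge between conservativeness and the Poisson structures studied in the abstract.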

  12. Species abundance in a forest community in South China: A case of poisson lognormal distribution

    USGS Publications Warehouse

    Yin, Z.-Y.; Ren, H.; Zhang, Q.-M.; Peng, S.-L.; Guo, Q.-F.; Zhou, G.-Y.

    2005-01-01

    Case studies on Poisson lognormal distribution of species abundance have been rare, especially in forest communities. We propose a numerical method to fit the Poisson lognormal to the species abundance data at an evergreen mixed forest in the Dinghushan Biosphere Reserve, South China. Plants in the tree, shrub and herb layers in 25 quadrats of 20 m × 20 m, 5 m × 5 m, and 1 m × 1 m were surveyed. Results indicated that: (i) for each layer, the observed species abundance with a similarly small median, mode, and a variance larger than the mean was reverse J-shaped and followed well the zero-truncated Poisson lognormal; (ii) the coefficient of variation, skewness and kurtosis of abundance, and the two Poisson lognormal parameters (μ and σ) for the shrub layer were closer to those for the herb layer than to those for the tree layer; and (iii) from the tree to the shrub to the herb layer, σ and the coefficient of variation decreased, whereas diversity increased. We suggest that: (i) the species abundance distributions in the three layers reflect the overall community characteristics; (ii) the Poisson lognormal can describe the species abundance distribution in diverse communities with a few abundant species but many rare species; and (iii) 1/σ should be an alternative measure of diversity.
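    A numerical fit of the Poisson lognormal needs only its mixed pmf, which has no closed form. A minimal sketch of evaluating it, and the zero-truncated version fitted to observed abundances, by quadrature (invented parameter values, not the paper's estimates):

```python
import math

def poisson_lognormal_pmf(k, mu, sigma, grid=400):
    """P(K = k) when K | lam ~ Poisson(lam) and log(lam) ~ Normal(mu, sigma^2).
    The mixing integral has no closed form; integrate over the standard
    normal variable z with a trapezoid rule on [-8, 8]."""
    lo, hi = -8.0, 8.0
    h = (hi - lo) / grid
    total = 0.0
    for i in range(grid + 1):
        z = lo + i * h
        lam = math.exp(mu + sigma * z)
        log_pois = -lam + k * math.log(lam) - math.lgamma(k + 1)
        weight = 0.5 if i in (0, grid) else 1.0
        total += weight * math.exp(log_pois - 0.5 * z * z) / math.sqrt(2 * math.pi)
    return total * h

def zero_truncated_pmf(k, mu, sigma):
    """Condition on K >= 1, since species with zero abundance are unobserved."""
    p0 = poisson_lognormal_pmf(0, mu, sigma)
    return poisson_lognormal_pmf(k, mu, sigma) / (1.0 - p0)
```

    Maximum-likelihood fitting then amounts to maximizing the sum of log zero-truncated pmf values over (μ, σ) with any generic optimizer.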

  13. Horizon feedback inflation

    NASA Astrophysics Data System (ADS)

    Fairbairn, Malcolm; Markkanen, Tommi; Rodriguez Roman, David

    2018-04-01

    We consider the effect of the Gibbons-Hawking radiation on the inflaton in the situation where it is coupled to a large number of spectator fields. We argue that this will lead to two important effects - a thermal contribution to the potential and a gradual change in parameters in the Lagrangian which results from thermodynamic and energy conservation arguments. We present a scenario of hilltop inflation where the field starts trapped at the origin before slowly experiencing a phase transition during which the field extremely slowly moves towards its zero temperature expectation value. We show that it is possible to obtain enough e-folds of expansion as well as the correct spectrum of perturbations without hugely fine-tuned parameters in the potential (albeit with many spectator fields). We also comment on how initial conditions for inflation can arise naturally in this situation.

  14. Earthquake number forecasts testing

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. 
    The Poisson distribution for large rate values approaches the Gaussian law; therefore its skewness and kurtosis both tend to zero for large earthquake rates: for the Gaussian law, these values are identically zero. A calculation of the NBD skewness and kurtosis, based on the values of the first two statistical moments of the distribution, shows a rapid increase of these upper moments. However, the observed catalogue values of skewness and kurtosis rise even faster. This means that for small time intervals, the earthquake number distribution is even more heavy-tailed than the NBD predicts. Therefore, for small time intervals, we propose testing forecasted earthquake numbers against appropriately smoothed empirical number distributions.
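    The moment comparison described above can be reproduced from closed forms (standard textbook expressions, not the paper's computation; the rate and overdispersion values are invented):

```python
import math

def poisson_moments(lam):
    """Skewness and excess kurtosis of Poisson(lam); both tend to zero as
    lam grows, which is the Gaussian limit noted in the abstract."""
    return 1.0 / math.sqrt(lam), 1.0 / lam

def nbd_moments(r, p):
    """Skewness and excess kurtosis of the negative binomial with size r and
    success probability p (mean r(1-p)/p, variance r(1-p)/p^2)."""
    skew = (2.0 - p) / math.sqrt(r * (1.0 - p))
    kurt = 6.0 / r + p * p / (r * (1.0 - p))
    return skew, kurt

lam = 50.0                       # invented mean earthquake count per interval
p = 0.5                          # NBD matched to the same mean, but overdispersed
r = lam * p / (1.0 - p)

pois_skew, pois_kurt = poisson_moments(lam)
nbd_skew, nbd_kurt = nbd_moments(r, p)
# At the same mean, the NBD retains much larger skewness and kurtosis.
```

    The gap between the two pairs of moments is the quantity the paper compares against empirical catalogue values.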

  15. Impact of diabetes on hospital admission and length of stay among a general population aged 45 year or more: a record linkage study.

    PubMed

    Comino, Elizabeth Jean; Harris, Mark Fort; Islam, M D Fakhrul; Tran, Duong Thuy; Jalaludin, Bin; Jorm, Louisa; Flack, Jeff; Haas, Marion

    2015-01-22

    The increased prevalence of diabetes and its significant impact on use of health care services, particularly hospitals, is a concern for health planners. This paper explores the risk factors for all-cause hospitalisation and the excess risk due to diabetes in a large sample of older Australians. The study population was 263,482 participants in the 45 and Up Study. The data assessed were linked records of hospital admissions in the 12 months following completion of a baseline questionnaire. All cause and ambulatory care sensitive admission rates and length of stay were examined. The associations between demographic characteristics, socioeconomic status, lifestyle factors, and health and wellbeing and risk of hospitalisation were explored using zero inflated Poisson (ZIP) regression models adjusting for age and gender. The ratios of adjusted relative rates and 95% confidence intervals were calculated to determine the excess risk due to diabetes. Prevalence of diabetes was 9.0% (n = 23,779). Age adjusted admission rates for all-cause hospitalisation were 631.3 and 454.8 per 1,000 participant years and the mean length of stay was 8.2 and 7.1 days respectively for participants with and without diabetes. In people with and without diabetes, the risk of hospitalisation was associated with age, gender, household income, smoking, BMI, physical activity, and health and wellbeing. However, the increased risk of hospitalisation was attenuated for participants with diabetes who were older, obese, or had hypertension or hyperlipidaemia and enhanced for those participants with diabetes who were male, on low income, current smokers or who had anxiety or depression. This study is one of the few studies published to explore the impact of diabetes on hospitalisation in a large non-clinical population, the 45 and Up Study. The attenuation of risk associated with some factors is likely to be due to correlation between diabetes and factors such as age and obesity. 
The increased risk in association with other factors such as gender and low income in participants with diabetes is likely to be due to their synergistic influence on health status and the way services are accessed.

  16. The Role of Socioeconomic Status in Longitudinal Trends of Cholera in Matlab, Bangladesh, 1993–2007

    PubMed Central

    Root, Elisabeth Dowling; Rodd, Joshua; Yunus, Mohammad; Emch, Michael

    2013-01-01

    There has been little evidence of a decline in the global burden of cholera in recent years as the number of cholera cases reported to WHO continues to rise. Cholera remains a global threat to public health and a key indicator of lack of socioeconomic development. Overall socioeconomic development is the ultimate solution for control of cholera, as evidenced in developed countries. However, most research has focused on cross-country comparisons, so that the role of individual- or small area-level socioeconomic status (SES) in cholera dynamics has not been carefully studied. Reported cases of cholera in Matlab, Bangladesh have fluctuated greatly over time and epidemic outbreaks of cholera continue, most recently with the introduction of a new serotype into the region. The wealth of longitudinal data on the population of Matlab provides a unique opportunity to explore the impact of socioeconomic status and other demographic characteristics on the long-term temporal dynamics of cholera in the region. In this population-based study we examine which factors impact the initial number of cholera cases in a bari at the beginning of the O139 epidemic and the factors impacting the number of cases over time. Cholera data were derived from the ICDDR,B health records and linked to socioeconomic and geographic data collected as part of the Matlab Health and Demographic Surveillance System. Longitudinal zero-inflated Poisson (ZIP) multilevel regression models are used to examine the impact of environmental and socio-demographic factors on cholera counts across baris. Results indicate that baris with a high socioeconomic status had lower initial rates of cholera at the beginning of the O139 epidemic (γ01 = −0.147, p = 0.041) and a higher probability of reporting no cholera cases (α01 = 0.156, p = 0.061). Populations in baris characterized by low SES are more likely to experience higher cholera morbidity at the beginning of an epidemic than populations in high SES baris. 
PMID:23326618

  17. Bayesian semi-parametric analysis of Poisson change-point regression models: application to policy making in Cali, Colombia.

    PubMed

    Park, Taeyoung; Krafty, Robert T; Sánchez, Alvaro I

    2012-07-27

    A Poisson regression model with an offset assumes a constant baseline rate after accounting for measured covariates, which may lead to biased estimates of coefficients in an inhomogeneous Poisson process. To correctly estimate the effect of time-dependent covariates, we propose a Poisson change-point regression model with an offset that allows a time-varying baseline rate. When the nonconstant pattern of a log baseline rate is modeled with a nonparametric step function, the resulting semi-parametric model involves a model component of varying dimension and thus requires a sophisticated varying-dimensional inference to obtain correct estimates of model parameters of fixed dimension. To fit the proposed varying-dimensional model, we devise a state-of-the-art MCMC-type algorithm based on partial collapse. The proposed model and methods are used to investigate an association between daily homicide rates in Cali, Colombia and policies that restrict the hours during which the legal sale of alcoholic beverages is permitted. While simultaneously identifying the latent changes in the baseline homicide rate which correspond to the incidence of sociopolitical events, we explore the effect of policies governing the sale of alcohol on homicide rates and seek a policy that balances the economic and cultural dependencies on alcohol sales to the health of the public.
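    The step-function baseline rate described here can be illustrated with its simplest one-change-point version, profiled by maximum likelihood over the change location (a toy sketch on simulated daily counts, unrelated to the Cali data and to the paper's Bayesian MCMC machinery):

```python
import math
import random

random.seed(11)

def poisson_draw(lam):
    """Knuth's multiplication method; adequate for small lam."""
    limit = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

def poisson_loglik(xs, lam):
    return sum(-lam + k * math.log(lam) - math.lgamma(k + 1) for k in xs)

def fit_changepoint(counts):
    """Profile likelihood over a single change point tau: one constant rate
    before tau, another after. A crude stand-in for the nonparametric
    step-function baseline described in the abstract."""
    best = None
    for tau in range(5, len(counts) - 5):
        lam1 = max(sum(counts[:tau]) / tau, 1e-9)
        lam2 = max(sum(counts[tau:]) / (len(counts) - tau), 1e-9)
        ll = poisson_loglik(counts[:tau], lam1) + poisson_loglik(counts[tau:], lam2)
        if best is None or ll > best[0]:
            best = (ll, tau, lam1, lam2)
    return best

# Simulated daily counts: rate 4 for 60 days, then rate 8 for 60 days
counts = [poisson_draw(4.0) for _ in range(60)] + [poisson_draw(8.0) for _ in range(60)]
ll, tau, lam1, lam2 = fit_changepoint(counts)
```

    The paper's semi-parametric model generalizes this in two directions: multiple change points of unknown number, and covariates with an offset on top of the time-varying baseline.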

  18. Dark zone in the centre of the Arago-Poisson diffraction spot of a helical laser beam

    NASA Astrophysics Data System (ADS)

    Emile, O.; Voisin, A.; Niemiec, R.; Viaris de Lesegno, B.; Pruvost, L.; Ropars, G.; Emile, J.; Brousseau, C.

    2013-03-01

    We report on the diffraction of Laguerre-Gaussian laser beams with non-zero azimuthal order by an opaque disk. We observe a tiny circular dark zone at the centre of the usual Arago-Poisson diffraction bright spot. For such non-diffracting dark hollow beams, we have measured diameters as small as 20 μm over distances of the order of ten metres, without focusing. Diameters depend on the diffracting object size and on the topological charge of the input Laguerre-Gaussian beam. These results are in good agreement with theoretical considerations. Potential applications are then discussed.

  19. DQE as detection probability of the radiation detectors

    NASA Astrophysics Data System (ADS)

    Zanella, Giovanni

    2008-02-01

    In this paper it is shown that the detective quantum efficiency (DQE), as commonly defined for imaging detectors, can be extended to all radiation detectors with the meaning of a detection probability, if Poisson statistics applies. This unified approach is possible in the time domain at zero spatial frequency.
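    The claim can be checked with a toy Monte Carlo (invented numbers, not the paper's derivation): Poisson-distributed input quanta are detected independently with probability p, and the squared-SNR transfer ratio, which is the zero-frequency DQE, then estimates p:

```python
import math
import random
import statistics

random.seed(7)
lam, p = 50.0, 0.6          # mean input quanta per frame; detection probability

def poisson_draw(lam):
    """Knuth's multiplication method; adequate for moderate lam."""
    limit = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

n_in = [poisson_draw(lam) for _ in range(20000)]
# Binomial thinning: each quantum is detected independently with probability p
n_out = [sum(random.random() < p for _ in range(n)) for n in n_in]

def snr2(xs):
    m = statistics.mean(xs)
    return m * m / statistics.variance(xs)

dqe = snr2(n_out) / snr2(n_in)   # ≈ p: DQE as a detection probability
```

    The result follows because thinning a Poisson process gives another Poisson process with rate pλ, so SNR² drops from λ to pλ.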

  20. Comparison of robustness to outliers between robust poisson models and log-binomial models when estimating relative risks for common binary outcomes: a simulation study.

    PubMed

    Chen, Wansu; Shi, Jiaxiao; Qian, Lei; Azen, Stanley P

    2014-06-26

    To estimate relative risks or risk ratios for common binary outcomes, the most popular model-based methods are the robust (also known as modified) Poisson and the log-binomial regression. Of the two methods, it is believed that the log-binomial regression yields more efficient estimators because it is maximum likelihood based, while the robust Poisson model may be less affected by outliers. Evidence to support the robustness of robust Poisson models in comparison with log-binomial models is very limited. In this study, a simulation was conducted to evaluate the performance of the two methods in several scenarios where outliers existed. The findings indicate that for data coming from a population where the relationship between the outcome and the covariate was in a simple form (e.g. log-linear), the two models yielded comparable biases and mean square errors. However, if the true relationship contained a higher order term, the robust Poisson models consistently outperformed the log-binomial models even when the level of contamination was low. The robust Poisson models are more robust (or less sensitive) to outliers compared to the log-binomial models when estimating relative risks or risk ratios for common binary outcomes. Users should be aware of the limitations when choosing appropriate models to estimate relative risks or risk ratios.
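    For a single binary exposure the modified Poisson estimator has a closed form: the log-link Poisson score equations return the ratio of observed risks, and the sandwich (robust) variance of the log risk ratio reduces to a simple expression for binary outcomes. A self-contained sketch on simulated data (invented risks, not the paper's simulation scenarios):

```python
import math
import random
import statistics

random.seed(3)

def log_rr_robust(y1, y0):
    """Relative risk from a log-link Poisson working model with one binary
    exposure: the MLE is the ratio of observed risks, and the sandwich
    variance of log RR has a closed form for binary outcomes."""
    p1, p0 = statistics.mean(y1), statistics.mean(y0)
    log_rr = math.log(p1 / p0)
    se = math.sqrt((1 - p1) / (len(y1) * p1) + (1 - p0) / (len(y0) * p0))
    return log_rr, se

# Simulated cohort with invented risks: 0.30 exposed, 0.15 unexposed (true RR = 2)
y1 = [random.random() < 0.30 for _ in range(4000)]
y0 = [random.random() < 0.15 for _ in range(4000)]
log_rr, se = log_rr_robust(y1, y0)
rr = math.exp(log_rr)
ci = (math.exp(log_rr - 1.96 * se), math.exp(log_rr + 1.96 * se))
```

    With covariates the same estimator requires iterative fitting and a full sandwich matrix; the closed form above only covers the two-group case.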

  1. Poisson-Based Inference for Perturbation Models in Adaptive Spelling Training

    ERIC Educational Resources Information Center

    Baschera, Gian-Marco; Gross, Markus

    2010-01-01

    We present an inference algorithm for perturbation models based on Poisson regression. The algorithm is designed to handle unclassified input with multiple errors described by independent mal-rules. This knowledge representation provides an intelligent tutoring system with local and global information about a student, such as error classification…

  2. Does money matter in inflation forecasting?

    NASA Astrophysics Data System (ADS)

    Binner, J. M.; Tino, P.; Tepper, J.; Anderson, R.; Jones, B.; Kendall, G.

    2010-11-01

    This paper provides the most fully comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely, recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naïve random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists’ long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies.

  3. Yes, the GIGP Really Does Work--And Is Workable!

    ERIC Educational Resources Information Center

    Burrell, Quentin L.; Fenton, Michael R.

    1993-01-01

    Discusses the generalized inverse Gaussian-Poisson (GIGP) process for informetric modeling. Negative binomial distribution is discussed, construction of the GIGP process is explained, zero-truncated GIGP is considered, and applications of the process with journals, library circulation statistics, and database index terms are described. (50…

  4. Zimbabwe

    DTIC Science & Technology

    2009-02-20

    arrears, and foreign currency for essential imports, particularly fuel, is in extremely short supply. The IMF suggests that the inflation rate will not... devalue the official exchange rate. Instead, in June 2006, Gono devalued the country’s currency, the Zimbabwe dollar, removing three zeros in an effort to... The IMF and the World Bank

  5. Is there scale-dependent bias in single-field inflation?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Putter, Roland; Doré, Olivier; Green, Daniel, E-mail: rdputter@caltech.edu, E-mail: Olivier.P.Dore@jpl.nasa.gov, E-mail: drgreen@cita.utoronto.ca

    2015-10-01

    Scale-dependent halo bias due to local primordial non-Gaussianity provides a strong test of single-field inflation. While it is universally understood that single-field inflation predicts negligible scale-dependent bias compared to current observational uncertainties, there is still disagreement on the exact level of scale-dependent bias at a level that could strongly impact inferences made from future surveys. In this paper, we clarify this confusion and derive in various ways that there is exactly zero scale-dependent bias in single-field inflation. Much of the current confusion follows from the fact that single-field inflation does predict a mode coupling of matter perturbations at the level of f_NL^local ≈ −5/3, which naively would lead to scale-dependent bias. However, we show explicitly that this mode coupling cancels out when perturbations are evaluated at a fixed physical scale rather than a fixed coordinate scale. Furthermore, we show how the absence of scale-dependent bias can be derived easily in any gauge. This result can then be incorporated into a complete description of the observed galaxy clustering, including the previously studied general relativistic terms, which are important at the same level as scale-dependent bias of order f_NL^local ∼ 1. This description will allow us to draw unbiased conclusions about inflation from future galaxy clustering data.

  6. Meta-analysis of haplotype-association studies: comparison of methods and empirical evaluation of the literature

    PubMed Central

    2011-01-01

    Background Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared and presented in a unified framework along with an empirical evaluation of the literature. Results We present multivariate methods that use summary-based data as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression and Poisson regression). The methods presented here avoid the inflation of the type I error rate that could result from the traditional approach of comparing a haplotype against the remaining ones, while they can still be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can be easily extended to allow for such uncertainty by weighting the haplotypes by their probability. Conclusions An empirical evaluation of the published literature and a comparison against meta-analyses that use single nucleotide polymorphisms suggest that studies reporting meta-analysis of haplotypes include approximately half as many studies and produce significant results twice as often. We show that this excess of statistically significant results stems from the sub-optimal method of analysis used and, in approximately half of the cases, the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata and it is anticipated that the methods developed in this work will be widely applied in the meta-analysis of haplotype association studies. PMID:21247440

  7. Neighborhood walkability and active travel (walking and cycling) in New York City.

    PubMed

    Freeman, Lance; Neckerman, Kathryn; Schwartz-Soicher, Ofira; Quinn, James; Richards, Catherine; Bader, Michael D M; Lovasi, Gina; Jack, Darby; Weiss, Christopher; Konty, Kevin; Arno, Peter; Viola, Deborah; Kerker, Bonnie; Rundle, Andrew G

    2013-08-01

    Urban planners have suggested that built environment characteristics can support active travel (walking and cycling) and reduce sedentary behavior. This study assessed whether engagement in active travel is associated with neighborhood walkability measured for zip codes in New York City. Data were analyzed on engagement in active travel and the frequency of walking or biking ten blocks or more in the past month, from 8,064 respondents to the New York City 2003 Community Health Survey (CHS). A neighborhood walkability scale that measures: residential, intersection, and subway stop density; land use mix; and the ratio of retail building floor area to retail land area was calculated for each zip code. Data were analyzed using zero-inflated negative binomial regression incorporating survey sample weights and adjusting for respondents' sociodemographic characteristics. Overall, 44 % of respondents reported no episodes of active travel and among those who reported any episode, the mean number was 43.2 episodes per month. Comparing the 75th to the 25th percentile of zip code walkability, the odds ratio for reporting zero episodes of active travel was 0.71 (95 % CI 0.61, 0.83) and the exponentiated beta coefficient for the count of episodes of active travel was 1.13 (95 % CI 1.06, 1.21). Associations between lower walkability and reporting zero episodes of active travel were significantly stronger for non-Hispanic Whites as compared to non-Hispanic Blacks and to Hispanics and for those living in higher income zip codes. The results suggest that neighborhood walkability is associated with higher engagement in active travel.
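The zero-inflated negative binomial used in studies like this one mixes a point mass at zero with a negative binomial count distribution. A minimal sketch of its probability mass function (parameter values are illustrative, and SciPy's `nbinom` parameterization with size `n` and success probability `p` is assumed):

```python
import numpy as np
from scipy.stats import nbinom

def zinb_pmf(k, pi, n, p):
    """P(Y = k) under a zero-inflated negative binomial:
    a point mass pi at zero mixed with NB(n, p)."""
    k = np.asarray(k)
    base = nbinom.pmf(k, n, p)
    return np.where(k == 0, pi + (1 - pi) * base, (1 - pi) * base)

pi, n, p = 0.4, 2.0, 0.5
ks = np.arange(0, 200)
pmf = zinb_pmf(ks, pi, n, p)
total = pmf.sum()          # ≈ 1.0 (the tail beyond k = 200 is negligible here)
mean = (ks * pmf).sum()    # (1 - pi) * n * (1 - p) / p = 1.2
```

Regression versions of this model make `pi` and the negative binomial mean functions of covariates, typically through logit and log links respectively.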

  8. Can a supersonically expanding Bose-Einstein condensate be used to study cosmological inflation?

    NASA Astrophysics Data System (ADS)

    Banik, Swarnav; Eckel, Stephen; Kumar, Avinash; Jacobson, Ted; Spielman, Ian; Campbell, Gretchen

    2017-04-01

    The massive scale of the universe makes the experimental study of cosmological inflation difficult. This has led to an interest in developing analogous systems using table-top experiments. Here, we present the basic features of an expanding universe by drawing parallels with an expanding toroidal Bose-Einstein condensate (BEC) of 23Na atoms. The toroidal BEC serves as the background vacuum and phonons are the analogue to photons in the expanding universe. We study the dynamics of phonons in both non-expanding and expanding condensates and measure dissipation using the structure factor. We demonstrate red shifting of phonons and quasi-particle production similar to pre-heating after the inflation of the universe. At the end of expansion, we also observe spontaneous non-zero winding numbers in the ring. Using Monte-Carlo simulations, we predict the widths of the resulting winding number distribution, which agree well with our experimental findings.

  9. Outpatient Pain Predicts Subsequent One-Year Acute Health Care Utilization Among Adults With Sickle Cell Disease

    PubMed Central

    Ezenwa, Miriam O.; Molokie, Robert E.; Wang, Zaijie Jim; Yao, Yingwei; Suarez, Marie L.; Angulo, Veronica; Wilkie, Diana J.

    2014-01-01

    Context Patient demographic and clinical factors have known associations with acute health care utilization (AHCU) among patients with sickle cell disease (SCD), but it is unknown if pain measured predominantly in an outpatient setting is a predictor of future AHCU in patients with SCD. Objectives To determine whether multidimensional pain scores obtained predominantly in an outpatient setting predicted subsequent one-year AHCU by 137 adults with SCD and whether the pain measured at a second visit also predicted AHCU. Methods Pain data included the Composite Pain Index (CPI), a single score representative of a multidimensional pain experience (number of pain sites, intensity, quality, and pattern). Based on the distribution of AHCU events, we divided patients into three groups: (1) zero events (Zero), (2) 1–3 events (Low), or (3) 4–23 events (High). Results The initial CPI scores differed significantly by the three groups (F(2,134)=7.38, P=0.001). Post hoc comparisons showed that the Zero group had lower CPI scores than both the Low group (P<0.01) and the High group (P<0.001). In multivariate, overdispersed Poisson regression analyses, age and CPI scores (at both measurement times) were statistically significant predictors of utilization events. Pain intensity scores at both measurement times were significant predictors of utilization, but other pain scores (number of pain sites, quality, and pattern) were not. Conclusion Findings support use of outpatient CPI scores or pain intensity and age to identify at-risk young adults with SCD who are likely to benefit from improved outpatient pain management plans. PMID:24636960

  10. Inflatable Re-Entry Vehicle Experiment (IRVE) Design Overview

    NASA Technical Reports Server (NTRS)

    Hughes, Stephen J.; Dillman, Robert A.; Starr, Brett R.; Stephan, Ryan A.; Lindell, Michael C.; Player, Charles J.; Cheatwood, F. McNeil

    2005-01-01

    Inflatable aeroshells offer several advantages over traditional rigid aeroshells for atmospheric entry. Inflatables offer increased payload volume fraction of the launch vehicle shroud and the possibility to deliver more payload mass to the surface for equivalent trajectory constraints. An inflatable's diameter is not constrained by the launch vehicle shroud. The resultant larger drag area can provide deceleration equivalent to a rigid system at higher atmospheric altitudes, thus offering access to higher landing sites. When stowed for launch and cruise, inflatable aeroshells allow access to the payload after the vehicle is integrated for launch and offer direct access to vehicle structure for structural attachment with the launch vehicle. They also offer an opportunity to eliminate system duplication between the cruise stage and entry vehicle. There are, however, several potential technical challenges for inflatable aeroshells. First and foremost is the fact that they are flexible structures. That flexibility could lead to unpredictable drag performance or an aerostructural dynamic instability. In addition, durability of large inflatable structures may limit their application. They are susceptible to puncture, a potentially catastrophic insult, from many possible sources. Finally, aerothermal heating during planetary entry poses a significant challenge to a thin membrane. NASA Langley Research Center and NASA's Wallops Flight Facility are jointly developing inflatable aeroshell technology for use on future NASA missions. The technology will be demonstrated in the Inflatable Re-entry Vehicle Experiment (IRVE). This paper will detail the development of the initial IRVE inflatable system to be launched on a Terrier/Orion sounding rocket in the fourth quarter of CY2005.
The experiment will demonstrate achievable packaging efficiency of the inflatable aeroshell for launch, inflation, leak performance of the inflatable system throughout the flight regime, structural integrity when exposed to a relevant dynamic pressure and aerodynamic stability of the inflatable system. Structural integrity and structural response of the inflatable will be verified with photogrammetric measurements of the back side of the aeroshell in flight. Aerodynamic stability as well as drag performance will be verified with on board inertial measurements and radar tracking from multiple ground radar stations. The experiment will yield valuable information about zero-g vacuum deployment dynamics of the flexible inflatable structure with both inertial and photographic measurements. In addition to demonstrating inflatable technology, IRVE will validate structural, aerothermal, and trajectory modeling techniques for the inflatable. Structural response determined from photogrammetrics will validate structural models, skin temperature measurements and additional in-depth temperature measurements will validate material thermal performance models, and on board inertial measurements along with radar tracking from multiple ground radar stations will validate trajectory simulation models.

  11. Modeling zero-modified count and semicontinuous data in health services research part 2: case studies.

    PubMed

    Neelon, Brian; O'Malley, A James; Smith, Valerie A

    2016-11-30

    This article is the second installment of a two-part tutorial on the analysis of zero-modified count and semicontinuous data. Part 1, which appears as a companion piece in this issue of Statistics in Medicine, provides a general background and overview of the topic, with particular emphasis on applications to health services research. Here, we present three case studies highlighting various approaches for the analysis of zero-modified data. The first case study describes methods for analyzing zero-inflated longitudinal count data. Case study 2 considers the use of hurdle models for the analysis of spatiotemporal count data. The third case study discusses an application of marginalized two-part models to the analysis of semicontinuous health expenditure data. Copyright © 2016 John Wiley & Sons, Ltd.
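A hurdle model, in contrast to zero inflation, routes all zeros through a single binary component and models the positives with a zero-truncated count distribution. A minimal sketch of the hurdle Poisson probability mass function (parameter values are illustrative; NumPy/SciPy assumed):

```python
import numpy as np
from scipy.stats import poisson

def hurdle_poisson_pmf(k, p0, lam):
    """Hurdle Poisson: P(Y = 0) = p0, and positive counts follow a
    zero-truncated Poisson(lam)."""
    k = np.asarray(k)
    trunc = poisson.pmf(k, lam) / (1.0 - np.exp(-lam))  # zero-truncated pmf
    return np.where(k == 0, p0, (1 - p0) * trunc)

p0, lam = 0.6, 2.0
ks = np.arange(0, 60)
pmf = hurdle_poisson_pmf(ks, p0, lam)
total = pmf.sum()        # ≈ 1.0
mean = (ks * pmf).sum()  # (1 - p0) * lam / (1 - exp(-lam))
```

Unlike a zero-inflated model, the zero and positive parts separate cleanly in the likelihood, so the two components can be fitted independently (a logistic model for the hurdle, a truncated count model for the positives).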

  12. Geographically weighted poisson regression semiparametric on modeling of the number of tuberculosis cases (Case study: Bandung city)

    NASA Astrophysics Data System (ADS)

    Octavianty; Toharudin, Toni; Jaya, I. G. N. Mindra

    2017-03-01

    Tuberculosis (TB) is a disease caused by a bacterium, called Mycobacterium tuberculosis, which typically attacks the lungs but can also affect the kidney, spine, and brain (Centers for Disease Control and Prevention). Indonesia had the largest number of TB cases after India (Global Tuberculosis Report 2015 by WHO). The distribution of Mycobacterium tuberculosis genotypes in Indonesia showed high genetic diversity and tended to vary by geographic region. In Bandung city, for instance, the prevalence rate of TB morbidity is quite high, and the number of TB patients is count data. Geographically Weighted Poisson Regression Semiparametric (GWPRS) is a statistical tool for determining the factors that significantly influence the number of tuberculosis patients at each observation location. GWPRS extends Poisson regression and GWPR to incorporate geographical factors, allowing some variables to act globally and others locally. Using the TB data for Bandung city (in 2015), the results identify both global and local variables that influence the number of tuberculosis patients in every sub-district.

  13. Multi-material Additive Manufacturing of Metamaterials with Giant, Tailorable Negative Poisson's Ratios.

    PubMed

    Chen, Da; Zheng, Xiaoyu

    2018-06-14

    Nature has evolved with a recurring strategy to achieve unusual mechanical properties through coupling variable elastic moduli, from a few GPa to below kPa, within a single tissue. The ability to produce multi-material, three-dimensional (3D) micro-architectures with high fidelity incorporating dissimilar components has been a major challenge in man-made materials. Here we show multi-modulus metamaterials whose architectural element is comprised of encoded elasticity ranging from rigid to soft. We found that, in contrast to ordinary architected materials whose negative Poisson's ratio is dictated by their geometry, this type of metamaterial is capable of displaying Poisson's ratios from extreme negative to zero, independent of its 3D micro-architecture. The resulting low-density metamaterials are capable of achieving functionally graded, distributed strain amplification within uniform micro-architectures. Simultaneous tuning of Poisson's ratio and moduli within the 3D multi-materials could open up a broad array of material-by-design applications ranging from flexible armor and artificial muscles to actuators and bio-mimetic materials.

  14. Adiabatic elimination for systems with inertia driven by compound Poisson colored noise.

    PubMed

    Li, Tiejun; Min, Bin; Wang, Zhiming

    2014-02-01

    We consider the dynamics of systems driven by compound Poisson colored noise in the presence of inertia. We study the limit when the frictional relaxation time and the noise autocorrelation time both tend to zero. We show that the Itô and Marcus stochastic calculuses naturally arise depending on these two time scales, and an extra intermediate type occurs when the two time scales are comparable. This leads to three different limiting regimes which are supported by numerical simulations. Furthermore, we establish that when the resulting compound Poisson process tends to the Wiener process in the frequent jump limit the Itô and Marcus calculuses, respectively, tend to the classical Itô and Stratonovich calculuses for Gaussian white noise, and the crossover type calculus tends to a crossover between the Itô and Stratonovich calculuses. Our results would be very helpful for understanding relevant experiments when jump type noise is involved.

  15. A comparative study of count models: application to pedestrian-vehicle crashes along Malaysia federal roads.

    PubMed

    Hosseinpour, Mehdi; Pour, Mehdi Hossein; Prasetijo, Joewono; Yahaya, Ahmad Shukri; Ghadiri, Seyed Mohammad Reza

    2013-01-01

    The objective of this study was to examine the effects of various roadway characteristics on the incidence of pedestrian-vehicle crashes by developing a set of crash prediction models on 543 km of Malaysia federal roads over a 4-year time span between 2007 and 2010. Four count models including the Poisson, negative binomial (NB), hurdle Poisson (HP), and hurdle negative binomial (HNB) models were developed and compared to model the number of pedestrian crashes. The results indicated the presence of overdispersion in the pedestrian crashes (PCs) and showed that it is due to excess zeros rather than variability in the crash data. To handle the issue, the hurdle Poisson model was found to be the best model among the considered models in terms of comparative measures. Moreover, the variables average daily traffic, heavy vehicle traffic, speed limit, land use, and area type were significantly associated with PCs.
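The point that excess zeros alone can induce overdispersion is easy to check numerically: a short simulation (illustrative parameters, assuming NumPy) of zero-inflated Poisson counts shows the variance exceeding the mean even though the underlying Poisson component is equidispersed.

```python
import numpy as np

rng = np.random.default_rng(0)
n, pi, lam = 100_000, 0.3, 2.0

# With probability pi force a structural zero, otherwise draw Poisson(lam).
structural_zero = rng.random(n) < pi
y = np.where(structural_zero, 0, rng.poisson(lam, n))

m, v = y.mean(), y.var()
# Theory: mean = (1 - pi) * lam = 1.4
#         var  = mean * (1 + pi * lam) = 2.24, which exceeds the mean
```

A plain Poisson fit to such data would understate the variance, which is why the abstract's diagnostic (overdispersion driven by zeros) points toward hurdle or zero-inflated alternatives.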

  16. Biomechanical remodeling of obstructed guinea pig jejunum

    PubMed Central

    Zhao, Jingbo; Liao, Donghua; Yang, Jian; Gregersen, Hans

    2010-01-01

    Data on morphological and biomechanical remodeling are needed to understand the mechanisms behind intestinal obstruction. The effect of partial obstruction on mechanical properties with reference to the zero-stress state and on the histomorphological properties of the guinea pig small intestine was determined in this study. Partial obstruction and sham operation were surgically created in mid-jejunum of guinea pigs. The animals survived 2, 4, 7, and 14 days, respectively. Age-matched guinea pigs that were not operated on served as normal controls. The segment proximal to the obstruction site was used for histological analysis, no-load state and zero-stress state data, and distension test. The segment for distension was immersed in an organ bath and inflated to 10 cmH2O. The outer diameter change during the inflation was monitored using a microscope with CCD camera. Circumferential stresses and strains were computed from the diameter, pressure and the zero-stress state data. The opening angle and absolute value of residual strain decreased (P<0.01 and P<0.001) whereas the wall thickness, wall cross-sectional area, and the wall stiffness increased after 7 days obstruction (P<0.05, P<0.01). Histologically, the muscle and submucosa layers, especially the circumferential muscle layer, increased in thickness after obstruction. The opening angle and residual strain mainly depended on the thickness of the muscle layer whereas the wall stiffness mainly depended on the thickness of the submucosa layer. In conclusion, the histomorphological and biomechanical properties of small intestine (referenced for the first time to the zero-stress state) remodel proximal to the obstruction site in a time-dependent manner. PMID:20189575

  17. A Model Comparison for Count Data with a Positively Skewed Distribution with an Application to the Number of University Mathematics Courses Completed

    ERIC Educational Resources Information Center

    Liou, Pey-Yan

    2009-01-01

    The current study examines three regression models: OLS (ordinary least square) linear regression, Poisson regression, and negative binomial regression for analyzing count data. Simulation results show that the OLS regression model performed better than the others, since it did not produce more false statistically significant relationships than…

  18. Anisotropic inflation with a non-minimally coupled electromagnetic field to gravity

    NASA Astrophysics Data System (ADS)

    Adak, Muzaffer; Akarsu, Özgür; Dereli, Tekin; Sert, Özcan

    2017-11-01

    We consider the non-minimal model of gravity in Y(R)F^2 form. We investigate a particular case of the model, for which the higher order derivatives are eliminated but the scalar curvature R is kept dynamical via the constraint Y_R F_mn F^mn = −2/κ^2. The effective fluid obtained can be represented by an interacting electromagnetic field and vacuum depending on Y(R); namely, the energy density of the vacuum tracks R while the energy density of the conventional electromagnetic field is dynamically scaled by the factor Y(R)/2. We give exact solutions for anisotropic inflation by assuming the volume scale factor of the Universe exhibits a power-law expansion. The directional scale factors need not exhibit power-law expansion, which would give rise to a constant expansion anisotropy; instead they expand non-trivially and give rise to a non-monotonically evolving expansion anisotropy that eventually converges to a non-zero constant. Relying on this fact, we discuss the anisotropic e-folds during inflation by considering the observed scale invariance in the CMB and demanding that the Universe undergo the same number of e-folds in all directions. We calculate the residual expansion anisotropy at the end of inflation, though, as a result of the non-monotonic behaviour of the expansion anisotropy, all the axes of the Universe undergo the same number of e-folds by the end of inflation. We also discuss the generation of the modified electromagnetic field during the first few e-folds of inflation and its persistence against the vacuum till the end of inflation.

  19. A New Zero-Inflated Negative Binomial Methodology for Latent Category Identification

    ERIC Educational Resources Information Center

    Blanchard, Simon J.; DeSarbo, Wayne S.

    2013-01-01

    We introduce a new statistical procedure for the identification of unobserved categories that vary between individuals and in which objects may span multiple categories. This procedure can be used to analyze data from a proposed sorting task in which individuals may simultaneously assign objects to multiple piles. The results of a synthetic…

  20. An analysis of input errors in precipitation-runoff models using regression with errors in the independent variables

    USGS Publications Warehouse

    Troutman, Brent M.

    1982-01-01

    Errors in runoff prediction caused by input data errors are analyzed by treating precipitation-runoff models as regression (conditional expectation) models. Independent variables of the regression consist of precipitation and other input measurements; the dependent variable is runoff. In models using erroneous input data, prediction errors are inflated and estimates of expected storm runoff for given observed input variables are biased. This bias in expected runoff estimation results in biased parameter estimates if these parameter estimates are obtained by a least squares fit of predicted to observed runoff values. The problems of error inflation and bias are examined in detail for a simple linear regression of runoff on rainfall and for a nonlinear U.S. Geological Survey precipitation-runoff model. Some implications for flood frequency analysis are considered. A case study using a set of data from Turtle Creek near Dallas, Texas illustrates the problems of model input errors.
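The attenuation described above, where error in the independent variable inflates prediction error and biases the fitted slope toward zero, can be demonstrated with a small simulation. This is a sketch of the classical errors-in-variables effect for simple linear regression (hypothetical numbers, assuming NumPy), not the USGS precipitation-runoff model itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta = 50_000, 2.0
x = rng.normal(0.0, 1.0, n)           # "true" input, e.g. actual rainfall
y = beta * x + rng.normal(0.0, 1.0, n)
x_meas = x + rng.normal(0.0, 1.0, n)  # measured input with unit error variance

def ols_slope(x, y):
    # slope of a least-squares fit of y on x (population formulas)
    return np.cov(x, y, bias=True)[0, 1] / x.var()

b_true = ols_slope(x, y)       # ≈ beta = 2.0
b_meas = ols_slope(x_meas, y)  # attenuated toward beta * 1 / (1 + 1) ≈ 1.0
```

The attenuation factor is var(x) / (var(x) + var(error)), so with equal signal and error variances the estimated slope is roughly half the true one, mirroring the bias in expected-runoff estimates discussed in the abstract.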

  1. Relative age effect in elite soccer: More early-born players, but no better valued, and no paragon clubs or countries

    PubMed Central

    Doyle, John R.

    2018-01-01

    The paper analyses two datasets of elite soccer players (top 1000 professionals and UEFA Under-19 Youth League). In both, we find a Relative Age Effect (RAE) for frequency, but not for value. That is, while there are more players born at the start of the competition year, their transfer values are no higher, nor are they given more game time. We use Poisson regression to derive a transparent index of the discrimination present in RAE. Also, because Poisson is valid for small frequency counts, it supports analysis at the disaggregated levels of country and club. From this, we conclude there are no paragon clubs or countries immune to RAE; that is, clubs and countries do not differ systematically in the RAE they experience. We also conclude that Poisson regression is a powerful and flexible method of analysing RAE data. PMID:29420576

  2. Use of ceramic water filtration in the prevention of diarrheal disease: a randomized controlled trial in rural South Africa and Zimbabwe.

    PubMed

    du Preez, Martella; Conroy, Ronán M; Wright, James A; Moyo, Sibonginkosi; Potgieter, Natasha; Gundry, Stephen W

    2008-11-01

    To determine the effectiveness of ceramic filters in reducing diarrhea, we conducted a randomized controlled trial in Zimbabwe and South Africa, in which 61 of 115 households received ceramic filters. Incidence of non-bloody and bloody diarrhea was recorded daily over 6 months using pictorial diaries for children 24-36 months of age. Poisson regression was used to compare incidence rates in intervention and control households. Adjusted for source quality, intervention household drinking water showed reduced Escherichia coli counts (relative risk, 0.67; 95% CI, 0.50-0.89). Zero E. coli were obtained for drinking water in 56.9% of intervention households. The incidence rate ratio for bloody diarrhea was 0.20 (95% CI, 0.09-0.43; P < 0.001) and for non-bloody diarrhea was 0.17 (95% CI, 0.08-0.38; P < 0.001), indicating much lower diarrhea incidence among filter users. The results suggest that ceramic filters are effective in reducing diarrheal disease incidence.

  3. Flight height preference for oviposition of mosquito (Diptera: Culicidae) vectors of sylvatic yellow fever virus near the hydroelectric reservoir of Simplício, Minas Gerais, Brazil.

    PubMed

    Alencar, Jeronimo; Morone, Fernanda; De Mello, Cecília Ferreira; Dégallier, Nicolas; Lucio, Paulo Sérgio; de Serra-Freire, Nicolau Maués; Guimarães, Anthony Erico

    2013-07-01

    In this study, the oviposition behavior of mosquito species exhibiting acrodendrophilic habits was investigated. The study was conducted near the Simplício Hydroelectric Reservoir (SHR) located on the border of the states of Minas Gerais and Rio de Janeiro, Brazil. Samples were collected using oviposition traps installed in forest vegetation cover between 1.70 and 4.30 m above ground level during the months of April, June, August, October, and December of 2011. Haemagogus janthinomys (Dyar), Haemagogus leucocelaenus (Dyar and Shannon), Aedes albopictus (Skuse), and Aedes terrens (Walker) specimens were present among the collected samples, the first two of which are proven vectors of sylvatic yellow fever (SYF) in Brazil, while the latter is a vector of dengue in mainland Asia. As the data set was zero-inflated, a specific Poisson-based model was used for the statistical analysis. When all four species were considered in the model, only the heights used for egg laying and the months of sampling explained the distribution. However, grouping the species under the genera Haemagogus Williston and Aedes Meigen showed a significant preference of the former for higher traps. Considering that the local working population of the SHR is very large, fluctuating, and potentially exposed to SYF, and that this virus occurs in almost all Brazilian states, monitoring of Culicidae in Brazil is essential for assessing the risk of transmission of this arbovirus.

  4. The Effects of a Park Awareness Campaign on Rural Park Use and Physical Activity.

    PubMed

    Banda, Jorge A; Hooker, Steven P; Wilcox, Sara; Colabianchi, Natalie; Kaczynski, Andrew T; Hussey, James

    To examine the effects of a park awareness campaign on park use in 6 community parks. One-group pretest-posttest design. Six community parks located in a South Carolina county. Children, adolescents, and adults observed in community parks. A 1-month awareness campaign that culminated in single 1.5-hour events at 6 parks in April 2011 and May 2011. The System for Observing Play and Recreation in Communities was used to objectively measure park use in May 2010 (baseline) and May 2011 (postcampaign). Zero-inflated Poisson models tested whether the number of total park users and the number of park users engaged in sedentary, walking, and vigorous activities differed by observation date. Park use was significantly greater at baseline than postcampaign (97 vs 84 users, respectively; χ² = 4.69, P = .03). There were no significant differences in the number of park users engaged in sedentary (χ² = 2.45, P = .12), walking (χ² = 0.29, P = .59), and vigorous (χ² = 0.20, P = .65) activities between baseline and postcampaign. Although only 97 and 84 people were observed across all parks at baseline and postcampaign, a total of 629 people were observed during the 6 separate 1.5-hour campaign park events. This suggests that there is potential for greater park utilization in these communities, and important questions remain on how to conduct effective awareness campaigns and how to harness interest in park events for the purpose of contributing to future community-wide physical activity and health promotion efforts.

  5. Development of Secondary Woodland in Oak Wood Pastures Reduces the Richness of Rare Epiphytic Lichens

    PubMed Central

    Paltto, Heidi; Nordberg, Anna; Nordén, Björn; Snäll, Tord

    2011-01-01

    Wooded pastures with ancient trees were formerly abundant throughout Europe, but during the last century, grazing has largely been abandoned often resulting in dense forests. Ancient trees constitute habitat for many declining and threatened species, but the effects of secondary woodland on the biodiversity associated with these trees are largely unknown. We tested for difference in species richness, occurrence, and abundance of a set of nationally and regionally red-listed epiphytic lichens between ancient oaks located in secondary woodland and ancient oaks located in open conditions. We refined the test of the effect of secondary woodland by also including other explanatory variables. Species occurrence and abundance were modelled jointly using overdispersed zero-inflated Poisson models. The richness of the red-listed lichens on ancient oaks in secondary woodland was half of that compared with oaks growing in open conditions. The species-level analyses revealed that this was mainly the result of lower occupancy of two of the study species. The tree-level abundance of one species was also lower in secondary woodland. Potential explanations for this pattern are that the study lichens are adapted to desiccating conditions enhancing their population persistence by low competition or that open, windy conditions enhance their colonisation rate. This means that the development of secondary woodland is a threat to red-listed epiphytic lichens. We therefore suggest that woody vegetation is cleared and grazing resumed in abandoned oak pastures. Importantly, this will also benefit the vitality of the oaks. PMID:21961041

  6. Selecting exposure measures in crash rate prediction for two-lane highway segments.

    PubMed

    Qin, Xiao; Ivan, John N; Ravishanker, Nalini

    2004-03-01

    A critical part of any risk assessment is identifying how to represent exposure to the risk involved. Recent research shows that the relationship between crash count and traffic volume is non-linear; consequently, a simple crash rate computed as the ratio of crash count to volume is not proper for comparing the safety of sites with different traffic volumes. To solve this problem, we describe a new approach for relating traffic volume and crash incidence. Specifically, we disaggregate crashes into four types: (1) single-vehicle, (2) multi-vehicle same direction, (3) multi-vehicle opposite direction, and (4) multi-vehicle intersecting, and define candidate exposure measures for each that we hypothesize will be linear with respect to each crash type. This paper describes initial investigation using crash and physical characteristics data for highway segments in Michigan from the Highway Safety Information System (HSIS). We use zero-inflated Poisson (ZIP) modeling to estimate models for predicting counts for each of the above crash types as a function of the daily volume, segment length, speed limit and roadway width. We found that the relationship between crashes and the daily volume (AADT) is non-linear and varies by crash type, and is significantly different from the relationship between crashes and segment length for all crash types. Our research will provide information to improve accuracy of crash predictions and, thus, facilitate more meaningful comparison of the safety record of seemingly similar highway locations.
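As a sketch of how the ZIP models used in studies like this are fitted by maximum likelihood (simulated data with illustrative parameters, assuming NumPy/SciPy; a real crash model would add covariates such as volume and segment length through link functions):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

# Simulated zero-inflated Poisson sample (illustrative parameters).
rng = np.random.default_rng(2)
n, pi_true, lam_true = 20_000, 0.4, 3.0
y = np.where(rng.random(n) < pi_true, 0, rng.poisson(lam_true, n))

def zip_negloglik(theta, y):
    """Negative ZIP log-likelihood; theta holds (logit pi, log lam)
    so the optimizer works on an unconstrained scale."""
    pi = 1.0 / (1.0 + np.exp(-theta[0]))
    lam = np.exp(theta[1])
    ll_zero = np.log(pi + (1 - pi) * np.exp(-lam))            # P(Y = 0)
    ll_pos = np.log1p(-pi) - lam + y * np.log(lam) - gammaln(y + 1)
    return -np.where(y == 0, ll_zero, ll_pos).sum()

res = minimize(zip_negloglik, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
pi_hat = 1.0 / (1.0 + np.exp(-res.x[0]))   # ≈ pi_true on this sample
lam_hat = np.exp(res.x[1])                 # ≈ lam_true on this sample
```

The key feature is that a zero can come from either the structural-zero component (probability pi) or the Poisson component, so the two sources are mixed in the likelihood rather than separated as in a hurdle model.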

  7. Modeling Associations between Principals' Reported Indoor Environmental Quality and Students' Self-Reported Respiratory Health Outcomes Using GLMM and ZIP Models.

    PubMed

    Toyinbo, Oluyemi; Matilainen, Markus; Turunen, Mari; Putus, Tuula; Shaughnessy, Richard; Haverinen-Shaughnessy, Ulla

    2016-03-30

    The aim of this paper was to examine associations between school building characteristics, indoor environmental quality (IEQ), and health responses using questionnaire data from both school principals and students. From 334 randomly sampled schools, 4248 sixth grade students from 297 schools completed a questionnaire. From these schools, 134 principals returned questionnaires concerning 51 IEQ-related questions about their school. Generalized linear mixed models (GLMM) were used to study the associations between IEQ indicators and the existence of self-reported upper respiratory symptoms, while hierarchical zero-inflated Poisson (ZIP) models were used to model the number of symptoms. Significant associations were established between the existence of upper respiratory symptoms and unsatisfactory classroom temperature during the heating season (ORs 1.45 for too hot and cold, and 1.27 for too cold as compared to satisfactory temperature) and dampness or moisture damage during the year 2006-2007 (OR: 1.80 as compared to no moisture damage), respectively. The number of upper respiratory symptoms was significantly associated with inadequate ventilation and dampness or moisture damage. A higher number of missed school days due to respiratory infections was reported in schools with inadequate ventilation (RR: 1.16). The school-level IEQ indicator variables described in this paper could explain a relatively large part of the school-level variation observed in the self-reported upper respiratory symptoms and missed school days due to respiratory infections among students.

  8. Modeling Associations between Principals’ Reported Indoor Environmental Quality and Students’ Self-Reported Respiratory Health Outcomes Using GLMM and ZIP Models

    PubMed Central

    Toyinbo, Oluyemi; Matilainen, Markus; Turunen, Mari; Putus, Tuula; Shaughnessy, Richard; Haverinen-Shaughnessy, Ulla

    2016-01-01

    Background: The aim of this paper was to examine associations between school building characteristics, indoor environmental quality (IEQ), and health responses using questionnaire data from both school principals and students. Methods: From 334 randomly sampled schools, 4248 sixth grade students from 297 schools completed a questionnaire. From these schools, 134 principals returned questionnaires concerning 51 IEQ-related questions about their school. Generalized linear mixed models (GLMM) were used to study the associations between IEQ indicators and the existence of self-reported upper respiratory symptoms, while hierarchical zero-inflated Poisson (ZIP) models were used to model the number of symptoms. Results: Significant associations were established between the existence of upper respiratory symptoms and unsatisfactory classroom temperature during the heating season (ORs 1.45 for too hot and cold, and 1.27 for too cold as compared to satisfactory temperature) and dampness or moisture damage during the year 2006–2007 (OR: 1.80 as compared to no moisture damage), respectively. The number of upper respiratory symptoms was significantly associated with inadequate ventilation and dampness or moisture damage. A higher number of missed school days due to respiratory infections was reported in schools with inadequate ventilation (RR: 1.16). Conclusions: The school-level IEQ indicator variables described in this paper could explain a relatively large part of the school-level variation observed in the self-reported upper respiratory symptoms and missed school days due to respiratory infections among students. PMID:27043595

  9. The Effects of Text Message Content on the Use of an Internet-Based Physical Activity Intervention in Hong Kong Chinese Adolescents.

    PubMed

    Lau, Erica Y; Lau, Patrick W C; Cai, Bo; Archer, Edward

    2015-01-01

    This study examined the effects of text message content (generic vs. culturally tailored) on the login rate of an Internet physical activity program in Hong Kong Chinese adolescent school children. A convenience sample of 252 Hong Kong secondary school adolescents (51% female, 49% male; M age = 13.17 years, SD = 1.28 years) was assigned to one of 3 treatments for 8 weeks. The control group consisted of an Internet physical activity program. The Internet plus generic text message group consisted of the same Internet physical activity program and included daily generic text messages. The Internet plus culturally tailored text message group consisted of the Internet physical activity program and included daily culturally tailored text messages. Zero-inflated Poisson mixed models showed that the overall effect of the treatment group on login rates varied significantly across individuals. The login rates over time were significantly higher in the Internet plus culturally tailored text message group than in the control group (β = 46.06, 95% CI 13.60, 156.02; p = .002) and the Internet plus generic text message group (β = 15.80, 95% CI 4.81, 51.9; p = .021) after adjusting for covariates. These findings suggest that culturally tailored text messages may be more advantageous than generic text messages in improving adolescents' website login rate, but effects varied significantly across individuals. Our results support the inclusion of culturally tailored messaging in future online physical activity interventions.

  10. Unemployment and inflation dynamics prior to the economic downturn of 2007-2008.

    PubMed

    Guastello, Stephen J; Myers, Adam

    2009-10-01

    This article revisits a long-standing theoretical issue as to whether a "natural rate" of unemployment exists in the sense of an exogenously driven fixed-point Walrasian equilibrium or attractor, or whether more complex dynamics such as hysteresis or chaos characterize an endogenous dynamical process instead. The same questions are posed regarding a possible natural rate of inflation, along with an investigation of the actual relationship between inflation and unemployment, for which extant theories differ. Time series of unemployment and inflation for US data were analyzed using the exponential series model and nonlinear regression for capturing Lyapunov exponents and transfer effects from other variables. The best explanation for unemployment was that it is a chaotic variable that is driven in part by inflation. The best explanation for inflation is that it is also a chaotic variable, driven in part by unemployment and the prices of treasury bills. Estimates of the attractors' epicenters were calculated in lieu of classical natural rates.

  11. Modeling zero-modified count and semicontinuous data in health services research Part 1: background and overview.

    PubMed

    Neelon, Brian; O'Malley, A James; Smith, Valerie A

    2016-11-30

    Health services data often contain a high proportion of zeros. In studies examining patient hospitalization rates, for instance, many patients will have no hospitalizations, resulting in a count of zero. When the number of zeros is greater or less than expected under a standard count model, the data are said to be zero modified relative to the standard model. A similar phenomenon arises with semicontinuous data, which are characterized by a spike at zero followed by a continuous distribution with positive support. When analyzing zero-modified count and semicontinuous data, flexible mixture distributions are often needed to accommodate both the excess zeros and the typically skewed distribution of nonzero values. Various models have been introduced over the past three decades to accommodate such data, including hurdle models, zero-inflated models, and two-part semicontinuous models. This tutorial describes recent modeling strategies for zero-modified count and semicontinuous data and highlights their role in health services research studies. Part 1 of the tutorial, presented here, provides a general overview of the topic. Part 2, appearing as a companion piece in this issue of Statistics in Medicine, discusses three case studies illustrating applications of the methods to health services research. Copyright © 2016 John Wiley & Sons, Ltd.
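
    The tutorial's distinction between hurdle and zero-inflated models can be made concrete: a hurdle model replaces the entire probability at zero with a separate Bernoulli gate, with positive counts following a zero-truncated distribution, so it can represent zero deflation as well as inflation. A minimal sketch with illustrative parameters:

```python
import math

def hurdle_pmf(k, p_zero, lam):
    """Hurdle model: a Bernoulli gate decides zero vs positive, and
    positive counts follow a zero-truncated Poisson(lam)."""
    if k == 0:
        return p_zero
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    return (1 - p_zero) * poisson / (1 - math.exp(-lam))

# Unlike zero inflation, a hurdle model can also produce *fewer* zeros
# than a Poisson: just set p_zero below exp(-lam).
total = sum(hurdle_pmf(k, 0.05, 2.0) for k in range(100))
```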

  12. Climate variations and salmonellosis transmission in Adelaide, South Australia: a comparison between regression models

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Bi, Peng; Hiller, Janet

    2008-01-01

    This is the first study to identify appropriate regression models for the association between climate variation and salmonellosis transmission. A comparison between different regression models was conducted using surveillance data in Adelaide, South Australia. Using notified salmonellosis cases and climatic variables from the Adelaide metropolitan area over the period 1990-2003, four regression methods were examined: standard Poisson regression, autoregressive adjusted Poisson regression, multiple linear regression, and a seasonal autoregressive integrated moving average (SARIMA) model. Notified salmonellosis cases in 2004 were used to test the forecasting ability of the four models. Parameter estimation, goodness-of-fit and forecasting ability of the four regression models were compared. Temperatures occurring 2 weeks prior to cases were positively associated with cases of salmonellosis, and rainfall was inversely related to the number of cases. The comparison of goodness-of-fit and forecasting ability suggests that the SARIMA model is better than the other three regression models. Temperature and rainfall may be used as climatic predictors of salmonellosis cases in regions with climatic characteristics similar to those of Adelaide, and the SARIMA model could, thus, be adopted to quantify the relationship between climate variations and salmonellosis transmission.
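
    The seasonal part of a SARIMA model works by differencing the series at the seasonal lag before fitting the ARMA components. The toy series below (period 4, purely illustrative) differences to exactly zero, which is the point of the operation:

```python
def seasonal_difference(series, period):
    """First step of SARIMA fitting: remove seasonality by differencing
    at the seasonal lag (e.g. period=12 for monthly case counts)."""
    return [series[t] - series[t - period] for t in range(period, len(series))]

# A purely seasonal series differences away completely.
monthly = [10, 12, 15, 11] * 6           # period-4 pattern repeated 6 times
diffed = seasonal_difference(monthly, 4)
```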

  13. Infrared Extinction Coefficients of Aerosolized Conductive Flake Powders and Flake Suspensions having a Zero-Truncated Poisson Size Distribution

    DTIC Science & Technology

    2012-11-01

    ... visible wavelengths, the eye perceives an image as a result of color contrasts that consist of differences in luminance and chromaticity (hue and ...

  14. The Application of Censored Regression Models in Low Streamflow Analyses

    NASA Astrophysics Data System (ADS)

    Kroll, C.; Luz, J.

    2003-12-01

    Estimation of low streamflow statistics at gauged and ungauged river sites is often a daunting task. This process is further confounded by the presence of intermittent streamflows, where streamflow is sometimes reported as zero, within a region. Streamflows recorded as zero may truly be zero, or may be less than the measurement detection limit. Such data are often referred to as censored data. Numerous methods have been developed to characterize intermittent streamflow series. Logit regression has been proposed to develop regional models of the probability that annual lowflow series (such as 7-day lowflows) are zero. In addition, Tobit regression, a method of regression that allows for censored dependent variables, has been proposed for lowflow regional regression models in regions where the lowflow statistic of interest is estimated as zero at some sites. While these methods have been proposed, their use in practice has been limited. Here a delete-one jackknife simulation is presented to examine the performance of Logit and Tobit models of 7-day annual minimum flows in 6 USGS water resource regions in the United States. For the Logit model, an assessment is made of whether sites are correctly classified as having at least 10% of 7-day annual lowflows equal to zero. In such a situation, the 7-day, 10-year lowflow (Q710), a commonly employed low streamflow statistic, would be reported as zero. For the Tobit model, a comparison is made between results from the Tobit model and from performing either ordinary least squares (OLS) or principal component regression (PCR) after the zero sites are dropped from the analysis. Initial results for the Logit model indicate that it has a high probability of correctly classifying sites into groups with Q710s of zero and non-zero. Initial results also indicate that the Tobit model produces better results than PCR and OLS when more than 5% of the sites in the region have Q710 values calculated as zero.
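
    The Tobit likelihood referred to above treats a value recorded at the detection limit as "at or below the limit": censored points contribute a normal CDF term and fully observed points a density term. A minimal left-censored log-likelihood sketch with a single covariate and made-up data:

```python
import math

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def norm_cdf(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def tobit_loglik(y, x, beta, sigma, limit=0.0):
    """Log-likelihood of a left-censored (Tobit) regression: points at the
    detection limit contribute a CDF term, the rest a density term."""
    ll = 0.0
    for yi, xi in zip(y, x):
        mu = beta[0] + beta[1] * xi
        if yi <= limit:               # censored: we only know y* <= limit
            ll += math.log(norm_cdf((limit - mu) / sigma))
        else:                         # fully observed
            ll += math.log(norm_pdf((yi - mu) / sigma) / sigma)
    return ll

# toy check: one censored and one fully observed flow
ll = tobit_loglik([0.0, 1.0], [0.0, 1.0], (0.0, 1.0), sigma=1.0)
```

Maximising this likelihood over beta and sigma (e.g. with a numerical optimiser) gives the Tobit fit; dropping the censored sites and running OLS instead is exactly the comparison the abstract describes.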

  15. STS-45 crewmembers during zero gravity activities onboard KC-135 NASA 930

    NASA Technical Reports Server (NTRS)

    1991-01-01

    STS-45 Atlantis, Orbiter Vehicle (OV) 104, crewmembers and backup payload specialist participate in zero gravity activities onboard KC-135 NASA 930. The crewmembers, wearing flight suits, float and tumble around an inflated globe during the few seconds of microgravity created by parabolic flight. With his hand on the fuselage ceiling is Payload Specialist Dirk D. Frimout. Clockwise from his position are Mission Specialist (MS) C. Michael Foale, Pilot Brian Duffy, backup Payload Specialist Charles R. Chappell, MS and Payload Commander (PLC) Kathryn D. Sullivan (with eye glasses), Commander Charles F. Bolden, and Payload Specialist Byron K. Lichtenberg.

  16. Modelling fourier regression for time series data- a case study: modelling inflation in foods sector in Indonesia

    NASA Astrophysics Data System (ADS)

    Prahutama, Alan; Suparti; Wahyu Utami, Tiani

    2018-03-01

    Regression analysis is an analysis to model the relationship between response variables and predictor variables. The parametric approach to the regression model is very strict with its assumptions, but a nonparametric regression model does not require model assumptions. Time series data are observations of a variable recorded over time, so if time series data are to be modeled by regression, the response and predictor variables must be determined first. The response variable in a time series is the value at time t (yt), while the predictor variables are significant lags. In nonparametric regression modeling, one developing approach is to use the Fourier series approach. One of the advantages of the nonparametric regression approach using Fourier series is that it can accommodate data with periodic (trigonometric) patterns. Modeling with Fourier series requires choosing the number of basis terms K, which can be determined by the Generalized Cross Validation method. In inflation modeling for the transportation sector, communication and financial services, the Fourier series approach yields an optimal K of 120 parameters with an R-square of 99%, compared with an R-square of 90% for multiple linear regression.
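
    The Fourier-series regression described above is ordinary least squares on a harmonic design matrix, with K controlling how many sine/cosine pairs enter. A sketch on a toy seasonal series (the data, period, and K below are illustrative, not the study's):

```python
import numpy as np

def fourier_design(t, K, period):
    """Design matrix for a Fourier-series regression with K harmonic pairs:
    a constant column plus cos/sin columns at frequencies k/period."""
    cols = [np.ones_like(t)]
    for k in range(1, K + 1):
        cols.append(np.cos(2 * np.pi * k * t / period))
        cols.append(np.sin(2 * np.pi * k * t / period))
    return np.column_stack(cols)

t = np.arange(120, dtype=float)
y = 2.0 + 1.5 * np.sin(2 * np.pi * t / 12)      # toy seasonal "inflation"
X = fourier_design(t, K=3, period=12)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)    # plain OLS fit
fitted = X @ coef
r2 = 1 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
```

Because the toy signal lies exactly in the harmonic basis, OLS recovers the intercept (2.0) and the first sine coefficient (1.5) and the R-square is essentially 1; in practice K is tuned by generalized cross-validation as the abstract describes.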

  17. Secondhand smoke exposure in the workplace.

    PubMed

    Skeer, Margie; Cheng, Debbie M; Rigotti, Nancy A; Siegel, Michael

    2005-05-01

    Currently, there is little understanding of the relationship between the strength of workplace smoking policies and the likelihood and duration, not just the likelihood, of exposure to secondhand smoke at work. This study assessed self-reported exposure to secondhand smoke at work in hours per week among a cross-sectional sample of 3650 Massachusetts adults who were employed primarily at a single worksite outside the home that was not mainly outdoors. The sample data were from a larger longitudinal study designed to examine the effect of community-based tobacco control interventions on adult and youth smoking behavior. Participants were identified through a random-digit-dialing telephone survey. Multiple logistic regression and zero-inflated negative binomial regression models were used to estimate the independent effect of workplace smoking policies on the likelihood and duration of exposure to secondhand smoke. Compared to employees whose workplace banned smoking completely, those whose workplace provided designated smoking areas had 2.9 times the odds of being exposed to secondhand smoke and 1.74 times the duration of exposure, while those with no restrictions had 10.27 times the odds of being exposed and 6.34 times the duration of exposure. Workplace smoking policies substantially reduce the likelihood of self-reported secondhand smoke exposure among employees in the workplace and also greatly affect the duration of exposure.
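
    The zero-inflated negative binomial used above for weekly hours of exposure generalises the ZIP by allowing overdispersion in the count part. A minimal sketch with an integer dispersion parameter r (all values illustrative):

```python
import math

def nb_pmf(k, r, p):
    """Negative binomial: number of failures before the r-th success."""
    return math.comb(k + r - 1, k) * p ** r * (1 - p) ** k

def zinb_pmf(k, pi, r, p):
    """Zero-inflated negative binomial, suited to outcomes such as hours
    of exposure clustered at zero: extra mass pi at zero, NB(r, p) otherwise."""
    base = nb_pmf(k, r, p)
    return pi + (1 - pi) * base if k == 0 else (1 - pi) * base

total = sum(zinb_pmf(k, 0.4, 2, 0.5) for k in range(200))
```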

  18. [Exploration of the factors influencing herbal medicine prices based on a VAR model].

    PubMed

    Wang, Nuo; Liu, Shu-Zhen; Yang, Guang

    2014-10-01

    Based on a vector autoregression (VAR) model, this paper uses Granger causality tests, variance decomposition, and impulse response analysis to carry out a comprehensive study of the factors influencing the price of Chinese herbal medicines, including cultivation costs, acreage, natural disasters, residents' demand, and inflation. The study found Granger-causal relationships between inflation and herbal prices, and between cultivation costs and herbal prices. In the total variance analysis of the Chinese herbal medicine price index, the largest contribution comes from its own fluctuations, followed by cultivation costs and inflation.
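
    A VAR(1) of the kind described above is simply a multivariate least-squares regression of each series on the lagged vector; Granger-style questions then roughly amount to asking whether the lagged cross-coefficients are nonzero. A sketch on simulated data (the coefficient matrix is made up):

```python
import numpy as np

def fit_var1(Y):
    """OLS fit of a VAR(1), y_t = c + A y_{t-1} + e_t; rows of Y are
    time points, columns are the series (e.g. price index and inflation)."""
    X = np.column_stack([np.ones(len(Y) - 1), Y[:-1]])
    coef, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
    return coef[0], coef[1:].T        # intercept vector c, coefficient matrix A

rng = np.random.default_rng(0)
A_true = np.array([[0.5, 0.2],        # series 2 feeds into series 1 ...
                   [0.0, 0.3]])       # ... but not the other way round
Y = np.zeros((500, 2))
for t in range(1, 500):
    Y[t] = A_true @ Y[t - 1] + rng.normal(0.0, 0.1, size=2)

c, A_hat = fit_var1(Y)
```

The recovered A_hat has a clearly nonzero (1,2) entry and a near-zero (2,1) entry, mirroring a one-directional Granger-causal pattern like the one reported for inflation and herbal prices.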

  19. Measuring moral hazard and adverse selection by propensity scoring in the mixed health care economy of Hong Kong.

    PubMed

    Wong, Irene O L; Lindner, Michael J; Cowling, Benjamin J; Lau, Eric H Y; Lo, Su-Vui; Leung, Gabriel M

    2010-04-01

    To evaluate the presence of moral hazard, adjusted for the propensity to have self-purchased insurance policies, employer-based medical benefits, and welfare-associated medical benefits in Hong Kong. Based on the 2005 population survey, we used logistic regression and zero-truncated negative binomial/Poisson regressions to assess the presence of moral hazard by comparing inpatient and outpatient utilization between insured and uninsured individuals. We fitted each enabling factor specific to the type of service covered, and adjusted for predisposing socioeconomic and demographic factors. We used a propensity score approach to account for potential adverse selection. Employment-based benefits coverage was associated with increased access and intensity of use for both inpatient and outpatient care, except for public hospital use. Similarly, welfare-based coverage had effect sizes comparable to employment-based schemes, except for the total number of public ambulatory episodes. Self-purchased insurance facilitated access but did not apparently induce greater demand for services among ever users. Nevertheless, there was no evidence of moral hazard in public hospital use. Our findings suggest that employment-based benefits coverage leads to the greatest degree of moral hazard in Hong Kong. Future studies should focus on confirming these observational findings using a randomized design. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.

  20. A Negative Binomial Regression Model for Accuracy Tests

    ERIC Educational Resources Information Center

    Hung, Lai-Fa

    2012-01-01

    Rasch used a Poisson model to analyze errors and speed in reading tests. An important property of the Poisson distribution is that the mean and variance are equal. However, in social science research, it is very common for the variance to be greater than the mean (i.e., the data are overdispersed). This study embeds the Rasch model within an…

  1. Inspiratory capacity at inflation hold in ventilated newborns: a surrogate measure for static compliance of the respiratory system.

    PubMed

    Hentschel, Roland; Semar, Nicole; Guttmann, Josef

    2012-09-01

    To study appropriateness of respiratory system compliance calculation using an inflation hold and compare it with ventilator readouts of pressure and tidal volume as well as with measurement of compliance of the respiratory system with the single-breath-single-occlusion technique gained with a standard lung function measurement. Prospective clinical trial. Level III neonatal unit of a university hospital. Sixty-seven newborns, born prematurely or at term, ventilated for a variety of pathologic conditions. A standardized sigh maneuver with a predefined peak inspiratory pressure of 30 cm H2O, termed inspiratory capacity at inflation hold, was applied. Using tidal volume, exhaled from inspiratory pause down to ambient pressure, as displayed by the ventilator, and predefined peak inspiratory pressure, compliance at inspiratory capacity at inflation hold conditions could be calculated as well as ratio of tidal volume and ventilator pressure using tidal volume and differential pressure at baseline ventilator settings: peak inspiratory pressure minus positive end-expiratory pressure. For the whole cohort, the equation for the regression between tidal volume at inspiratory capacity at inflation hold and compliance of the respiratory system was: compliance of the respiratory system = 0.052 * tidal volume at inspiratory capacity at inflation hold - 0.113, and compliance at inspiratory capacity at inflation hold conditions was closely related to the standard lung function measurement method of compliance of the respiratory system (R = 0.958). In contrast, ratio of tidal volume and ventilator pressure per kilogram calculated from the ventilator readouts and displayed against compliance of the respiratory system per kilogram yielded a broad scatter throughout the whole range of compliance; both were only weakly correlated (R = 0.309) and also the regression line was significantly different from the line of identity (p < .05). 
Peak inspiratory pressure at study entry did not affect the correlation between compliance at inspiratory capacity at inflation hold conditions and compliance of the respiratory system. After a standard sigh maneuver, inspiratory capacity at inflation hold and the derived quantity compliance at inspiratory capacity at inflation hold conditions can be regarded as a valid, accurate, and reliable surrogate measure for standard compliance of the respiratory system in contrast to ratio of tidal volume and ventilator pressure calculated from the ventilator readouts during ongoing mechanical ventilation at respective ventilator settings.

  2. Maternal hypertension and risk for hypospadias in offspring.

    PubMed

    Agopian, A J; Hoang, Thanh T; Mitchell, Laura E; Morrison, Alanna C; Tu, Duong; Nassar, Natasha; Canfield, Mark A

    2016-12-01

    Hypospadias is one of the most common birth defects in male infants. Maternal hypertension is a suspected risk factor; however, few previous studies have addressed the possibility of reporting bias, and several previous studies have not accounted for hypospadias severity. We analyzed data from the Texas Birth Defects Registry for 10,924 nonsyndromic cases and statewide vital records for deliveries during 1999-2009, using Poisson regression. After adjustment for potential confounders, hypospadias was associated with maternal hypertension (adjusted prevalence ratio: 1.5, 95% confidence interval: 1.4-1.7). Similar associations were observed with gestational and pregestational hypertension, including separate analyses restricted to the subset of cases with severe (second- or third-degree) hypospadias. All of these associations were also similar among the subset of cases with isolated hypospadias (without additional birth defects). To evaluate the potential for bias due to potential hypertension misclassification, we repeated our analyses using logistic regression, comparing the cases to controls with other birth defects. In these analyses, the associations with gestational hypertension were similar, but adjusted associations with pregestational hypertension were no longer observed. Our findings support an association between gestational hypertension and hypospadias in offspring, but also suggest that previously observed associations with pregestational hypertension may have been inflated due to differential misclassification of hypertension (e.g., reporting bias). As gestational hypertension is recognized after hypospadias development, more research is needed to determine if this association reflects an increase in gestational hypertension risk secondary to hypospadias or if both conditions have shared risk factors (e.g., precursors of gestational hypertension). © 2016 Wiley Periodicals, Inc.
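
    Adjusted prevalence ratios like those quoted above come from a log-link model, so point estimates and confidence intervals are formed on the log scale and exponentiated. In the sketch below, beta and se are hypothetical values chosen only to yield a PR of 1.5, not the study's actual estimates:

```python
import math

# In Poisson regression with a log link, an adjusted prevalence ratio is
# exp(beta); a 95% CI is computed on the log scale and exponentiated.
beta, se = math.log(1.5), 0.05        # hypothetical coefficient and SE
pr = math.exp(beta)
ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)
```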

  3. Quasiopen inflation

    NASA Astrophysics Data System (ADS)

    García-Bellido, Juan; Garriga, Jaume; Montes, Xavier

    1998-04-01

    We show that a large class of two-field models of single-bubble open inflation does not lead to infinite open universes, as was previously thought, but to an ensemble of very large but finite inflating "islands." The reason is that the quantum tunneling responsible for the nucleation of the bubble does not occur simultaneously along both field directions and equal-time hypersurfaces in the open universe are not synchronized with equal-density or fixed-field hypersurfaces. The most probable tunneling trajectory corresponds to a zero value of the inflaton field; large values, necessary for the second period of inflation inside the bubble, only arise as localized fluctuations. The interior of each nucleated bubble will contain an infinite number of such inflating regions of comoving size of order γ^-1, where γ is the supercurvature eigenvalue, which depends on the parameters of the model. Each one of these islands will be a quasi-open universe. Since the volume of the hyperboloid is infinite, inflating islands with all possible values of the field at their center will be realized inside of a single bubble. We may happen to live in one of those patches of comoving size d ≲ γ^-1, where the universe appears to be open. In particular, we consider the "supernatural" model proposed by Linde and Mezhlumian. There, an approximate U(1) symmetry is broken by a tunneling field in a first order phase transition, and slow-roll inflation inside the nucleated bubble is driven by the pseudo Goldstone field. We find that the excitations of the pseudo Goldstone field produced by the nucleation and subsequent expansion of the bubble place severe constraints on this model. We also discuss the coupled and uncoupled two-field models.

  4. Intraurban Differences in the Use of Ambulatory Health Services in a Large Brazilian City

    PubMed Central

    Lima-Costa, Maria Fernanda; Proietti, Fernando Augusto; Cesar, Cibele C.; Macinko, James

    2010-01-01

    A major goal of health systems is to reduce inequities in access to services, that is, to ensure that health care is provided based on health needs rather than social or economic factors. This study aims to identify the determinants of health services utilization among adults in a large Brazilian city and intraurban disparities in health care use. We combine household survey data with census-derived classification of social vulnerability of each household’s census tract. The dependent variable was utilization of physician services in the prior 12 months, and the independent variables included predisposing factors, health needs, enabling factors, and context. Prevalence ratios and 95% confidence intervals were estimated by the Hurdle regression model, which combined Poisson regression analysis of factors associated with any doctor visits (dichotomous variable) and zero-truncated negative binomial regression for the analysis of factors associated with the number of visits among those who had at least one. Results indicate that the use of health services was greater among women and increased with age, and was determined primarily by health needs and whether the individual had a regular doctor, even among those living in areas of the city with the worst socio-environmental indicators. The experience of Belo Horizonte may have implications for other world cities, particularly in the development and use of a comprehensive index to identify populations at risk and in order to guide expansion of primary health care services as a means of enhancing equity in health. PMID:21104332

  5. Seeing Double with K2: Testing Re-inflation with Two Remarkably Similar Planets around Red Giant Branch Stars

    NASA Astrophysics Data System (ADS)

    Grunblatt, Samuel K.; Huber, Daniel; Gaidos, Eric; Lopez, Eric D.; Howard, Andrew W.; Isaacson, Howard T.; Sinukoff, Evan; Vanderburg, Andrew; Nofi, Larissa; Yu, Jie; North, Thomas S. H.; Chaplin, William; Foreman-Mackey, Daniel; Petigura, Erik; Ansdell, Megan; Weiss, Lauren; Fulton, Benjamin; Lin, Douglas N. C.

    2017-12-01

    Despite more than 20 years since the discovery of the first gas giant planet with an anomalously large radius, the mechanism for planet inflation remains unknown. Here, we report the discovery of K2-132b, an inflated gas giant planet found with the NASA K2 Mission, and a revised mass for another inflated planet, K2-97b. These planets orbit on ≈9 day orbits around host stars that recently evolved into red giants. We constrain the irradiation history of these planets using models constrained by asteroseismology and Keck/High Resolution Echelle Spectrometer spectroscopy and radial velocity measurements. We measure planet radii of 1.31 ± 0.11 R_J and 1.30 ± 0.07 R_J, respectively. These radii are typical for planets receiving the current irradiation, but not the former, zero-age main-sequence irradiation of these planets. This suggests that the current sizes of these planets are directly correlated to their current irradiation. Our precise constraints of the masses and radii of the stars and planets in these systems allow us to constrain the planetary heating efficiency of both systems as 0.03% (+0.03%, -0.02%). These results are consistent with a planet re-inflation scenario, but suggest that the efficiency of planet re-inflation may be lower than previously theorized. Finally, we discuss the agreement within 10% of the stellar masses and radii, and the planet masses, radii, and orbital periods of both systems, and speculate that this may be due to selection bias in searching for planets around evolved stars.

  6. Zimbabwe

    DTIC Science & Technology

    2008-09-26

    foreign currency for essential imports, particularly fuel, is in extremely short supply. The IMF suggests that the inflation rate will not reverse without...international assessments of Zimbabwe's economic prospects remain bleak. Ignoring the advice of the IMF, the government has refused to devalue the official...exchange rate. Instead, in June 2006, Gono devalued the country's currency, the Zimbabwe dollar, removing three zeros in an effort to mitigate

  7. The zero inflation of standing dead tree carbon stocks

    Treesearch

    Christopher W. Woodall; David W. MacFarlane

    2012-01-01

    Given the importance of standing dead trees in numerous forest ecosystem attributes/processes such as carbon (C) stocks, the USDA Forest Service’s Forest Inventory and Analysis (FIA) program began consistent nationwide sampling of standing dead trees in 1999. Modeled estimates of standing dead tree C stocks are currently used as the official C stock estimates for the...

  8. Economic and policy factors driving adoption of institutional woody biomass heating systems in the United States

    Treesearch

    Jesse D. Young; Nathaniel M. Anderson; Helen T. Naughton; Katrina Mullan

    2018-01-01

    Abundant stocks of woody biomass that are associated with active forest management can be used as fuel for bioenergy in many applications. Though factors driving large-scale biomass use in industrial settings have been studied extensively, small-scale biomass combustion systems commonly used by institutions for heating have received less attention. A zero inflated...

  9. Bayesian Poisson hierarchical models for crash data analysis: Investigating the impact of model choice on site-specific predictions.

    PubMed

    Khazraee, S Hadi; Johnson, Valen; Lord, Dominique

    2018-08-01

    The Poisson-gamma (PG) and Poisson-lognormal (PLN) regression models are among the most popular means for motor vehicle crash data analysis. Both models belong to the Poisson-hierarchical family of models. While numerous studies have compared the overall performance of alternative Bayesian Poisson-hierarchical models, little research has addressed the impact of model choice on the expected crash frequency prediction at individual sites. This paper sought to examine whether there are any trends among candidate models' predictions, e.g., that an alternative model's prediction for sites with certain conditions tends to be higher (or lower) than that from another model. In addition to the PG and PLN models, this research formulated a new member of the Poisson-hierarchical family of models: the Poisson-inverse gamma (PIGam). Three field datasets (from Texas, Michigan and Indiana) covering a wide range of over-dispersion characteristics were selected for analysis. This study demonstrated that the model choice can be critical when the calibrated models are used for prediction at new sites, especially when the data are highly over-dispersed. For all three datasets, the PIGam model would predict higher expected crash frequencies than would the PLN and PG models, in order, indicating a clear link between the models' predictions and the shape of their mixing distributions (i.e., gamma, lognormal, and inverse gamma, respectively). The thicker tail of the PIGam and PLN models (in order) may provide an advantage when the data are highly over-dispersed.
The analysis results also illustrated a major deficiency of the Deviance Information Criterion (DIC) in comparing the goodness-of-fit of hierarchical models; models with drastically different sets of coefficients (and thus predictions for new sites) may yield similar DIC values, because the DIC only accounts for the parameters in the lowest (observation) level of the hierarchy and ignores the higher levels (regression coefficients). Copyright © 2018. Published by Elsevier Ltd.
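
    As a quick illustration of how such mixing distributions induce over-dispersion, the following numpy sketch (illustrative only; the parameter values are arbitrary and not from the paper) simulates the Poisson-gamma marginal and checks that its variance exceeds its mean:

```python
import numpy as np

rng = np.random.default_rng(42)

# Poisson-gamma mixing: each site's expected crash rate lambda_i is drawn
# from a gamma distribution, and the observed count y_i ~ Poisson(lambda_i).
# Marginally, y is negative binomial with mean a*b and variance a*b*(1 + b),
# so the variance exceeds the mean (over-dispersion).
a, b, n = 2.0, 3.0, 200_000          # gamma shape, gamma scale, number of sites
lam = rng.gamma(shape=a, scale=b, size=n)
y = rng.poisson(lam)

mean, var = y.mean(), y.var()
print(mean, var)   # mean ~= a*b = 6, variance ~= a*b*(1 + b) = 24
```

    Replacing the gamma draw with a lognormal or inverse-gamma draw gives the PLN and PIGam marginals compared in the paper.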

  10. Effectiveness on Early Childhood Caries of an Oral Health Promotion Program for Medical Providers

    PubMed Central

    Widmer-Racich, Katina; Sevick, Carter; Starzyk, Erin J.; Mauritson, Katya; Hambidge, Simon J.

    2017-01-01

    Objectives. To assess the impact of an oral health promotion (OHP) intervention for medical providers on early childhood caries (ECC). Methods. We implemented a quasiexperimental OHP intervention in 8 federally qualified health centers that trained medical providers on ECC risk assessment, oral examination and instruction, dental referral, and fluoride varnish applications (FVAs). We measured OHP delivery by FVA count at medical visits. We measured the intervention’s impact on ECC in 3 unique cohorts of children aged 3 to 4 years in 2009 (preintervention; n = 202), 2011 (midintervention; n = 420), and 2015 (≥ 4 FVAs; n = 153). We compared numbers of decayed, missing, and filled tooth surfaces using adjusted zero-inflated negative binomial models. Results. Across 3 unique cohorts, the FVA mean (range) count was 0.0 (0), 1.1 (0–7), and 4.5 (4–7) in 2009, 2011, and 2015, respectively. In adjusted zero-inflated negative binomial model analyses, children in the 2015 cohort had significantly fewer decayed, missing, and filled tooth surfaces than did children in previous cohorts. Conclusions. An OHP intervention targeting medical providers reduced ECC when children received 4 or more FVAs at medical visits by age 3 years. PMID:28661802

  11. Combined analysis of magnetic and gravity anomalies using normalized source strength (NSS)

    NASA Astrophysics Data System (ADS)

    Li, L.; Wu, Y.

    2017-12-01

    Gravity and magnetic fields are potential fields, and their interpretation is inherently non-unique. Combined analysis of magnetic and gravity anomalies based on Poisson's relation is used to determine homologous gravity and magnetic anomalies and reduce this ambiguity. The traditional combined analysis linearly regresses the reduction-to-pole (RTP) magnetic anomaly against the first-order vertical derivative of the gravity anomaly, and provides a quantitative or semi-quantitative interpretation through the resulting correlation coefficient, slope, and intercept. In this calculation, because of remanent magnetization, the RTP anomaly still contains the effect of oblique magnetization; in that case, homologous gravity and magnetic anomalies can appear uncorrelated in the linear regression. The normalized source strength (NSS), which can be computed from the magnetic tensor matrix, is insensitive to remanence. Here we present a new combined analysis using the NSS. Based on Poisson's relation, the gravity tensor matrix can be transformed into a pseudomagnetic tensor matrix for magnetization along the geomagnetic field direction under the homology condition. The NSS of the pseudomagnetic tensor matrix and of the original magnetic tensor matrix are calculated, and a linear regression analysis is carried out. The resulting correlation coefficient, slope, and intercept indicate the homology level, the Poisson's ratio, and the distribution of remanent magnetization, respectively. We test the approach on a synthetic model with complex magnetization; the results show that it can still identify a common source under strong remanence and recover the Poisson's ratio. Finally, the approach is applied to data from China, and the results demonstrate that it is feasible.

  12. Modification of the Mantel-Haenszel and Logistic Regression DIF Procedures to Incorporate the SIBTEST Regression Correction

    ERIC Educational Resources Information Center

    DeMars, Christine E.

    2009-01-01

    The Mantel-Haenszel (MH) and logistic regression (LR) differential item functioning (DIF) procedures have inflated Type I error rates when there are large mean group differences, short tests, and large sample sizes. When there are large group differences in mean score, groups matched on the observed number-correct score differ on true score,…

  13. Exact Analytic Result of Contact Value for the Density in a Modified Poisson-Boltzmann Theory of an Electrical Double Layer.

    PubMed

    Lou, Ping; Lee, Jin Yong

    2009-04-14

    For a simple modified Poisson-Boltzmann (SMPB) theory, taking into account the finite ionic size, we have derived the exact analytic expression for the contact values of the difference profile of the counterion and co-ion, as well as of the sum (density) and product profiles, near a charged planar electrode that is immersed in a binary symmetric electrolyte. In the zero ionic size or dilute limit, these contact values reduce to the contact values of the Poisson-Boltzmann (PB) theory. The analytic results of the SMPB theory, for the difference, sum, and product profiles were compared with the results of the Monte-Carlo (MC) simulations [ Bhuiyan, L. B.; Outhwaite, C. W.; Henderson, D. J. Electroanal. Chem. 2007, 607, 54 ; Bhuiyan, L. B.; Henderson, D. J. Chem. Phys. 2008, 128, 117101 ], as well as of the PB theory. In general, the analytic expression of the SMPB theory gives better agreement with the MC data than the PB theory does. For the difference profile, as the electrode charge increases, the result of the PB theory departs from the MC data, but the SMPB theory still reproduces the MC data quite well, which indicates the importance of including steric effects in modeling diffuse layer properties. As for the product profile, (i) it drops to zero as the electrode charge approaches infinity; (ii) the speed of the drop increases with the ionic size, and these behaviors are in contrast with the predictions of the PB theory, where the product is identically 1.

  14. Spatial variation of natural radiation and childhood leukaemia incidence in Great Britain.

    PubMed

    Richardson, S; Monfort, C; Green, M; Draper, G; Muirhead, C

    This paper describes an analysis of the geographical variation of childhood leukaemia incidence in Great Britain over a 15 year period in relation to natural radiation (gamma and radon). Data at the level of the 459 district-level local authorities in England, Wales and regional districts in Scotland are analysed in two complementary ways: first, by Poisson regressions with the inclusion of environmental covariates and a smooth spatial structure; secondly, by a hierarchical Bayesian model in which extra-Poisson variability is modelled explicitly in terms of spatial and non-spatial components. From this analysis, we deduce a strong indication that a main part of the variability is accounted for by a local neighbourhood 'clustering' structure. This structure is furthermore relatively stable over the 15 year period for the lymphocytic leukaemias which make up the majority of observed cases. We found no evidence of a positive association of childhood leukaemia incidence with outdoor or indoor gamma radiation levels. There is no consistent evidence of any association with radon levels. Indeed, in the Poisson regressions, a significant positive association was only observed for one 5-year period, a result which is not compatible with a stable environmental effect. Moreover, this positive association became clearly non-significant when over-dispersion relative to the Poisson distribution was taken into account.

  15. Decoding and modelling of time series count data using Poisson hidden Markov model and Markov ordinal logistic regression models.

    PubMed

    Sebastian, Tunny; Jeyaseelan, Visalakshi; Jeyaseelan, Lakshmanan; Anandan, Shalini; George, Sebastian; Bangdiwala, Shrikant I

    2018-01-01

    Hidden Markov models are stochastic models in which the observations are assumed to follow a mixture distribution, but the parameters of the components are governed by a Markov chain which is unobservable. The issues related to the estimation of Poisson-hidden Markov models, in which the observations come from a mixture of Poisson distributions and the parameters of the component Poisson distributions are governed by an m-state Markov chain with an unknown transition probability matrix, are explained here. These methods were applied to the data on Vibrio cholerae counts reported every month over an 11-year span at Christian Medical College, Vellore, India. Using the Viterbi algorithm, the best estimate of the state sequence was obtained, and hence the transition probability matrix. The mean passage times between the states were estimated. The 95% confidence interval for the mean passage time was estimated via Monte Carlo simulation. The three hidden states of the estimated Markov chain are labelled as 'Low', 'Moderate' and 'High' with the mean counts of 1.4, 6.6 and 20.2 and the estimated average duration of stay of 3, 3 and 4 months, respectively. Environmental risk factors were studied using Markov ordinal logistic regression analysis. No significant association was found between disease severity levels and climate components.
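
    The Viterbi decoding step described above can be sketched in a few lines of numpy; the state means below (1.4, 6.6, 20.2) come from the abstract, while the initial distribution, transition matrix, and observation series are made-up illustrations:

```python
import numpy as np
from scipy.stats import poisson

def viterbi_poisson(obs, log_pi, log_A, lams):
    """Most likely hidden-state path for counts `obs` under a Poisson HMM."""
    m, T = len(lams), len(obs)
    logB = poisson.logpmf(np.asarray(obs)[:, None], lams)   # (T, m) emission log-probs
    delta = log_pi + logB[0]                                # best log-prob ending in each state
    psi = np.zeros((T, m), dtype=int)                       # back-pointers
    for t in range(1, T):
        cand = delta[:, None] + log_A                       # indexed (from-state, to-state)
        psi[t] = cand.argmax(axis=0)
        delta = cand.max(axis=0) + logB[t]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):                           # backtrack
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

lams = np.array([1.4, 6.6, 20.2])          # 'Low', 'Moderate', 'High' state means
log_pi = np.log(np.full(3, 1.0 / 3.0))     # uniform initial distribution
log_A = np.log(0.7 * np.eye(3) + 0.1)      # sticky illustrative transition matrix
obs = [1, 0, 2, 7, 6, 5, 22, 19, 21]
path = viterbi_poisson(obs, log_pi, log_A, lams)
print(path)   # → [0, 0, 0, 1, 1, 1, 2, 2, 2]
```

    On real monthly counts, the transition matrix and state means would themselves be estimated (e.g., by EM) before decoding.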

  16. Drought impact functions as intermediate step towards drought damage assessment

    NASA Astrophysics Data System (ADS)

    Bachmair, Sophie; Svensson, Cecilia; Prosdocimi, Ilaria; Hannaford, Jamie; Helm Smith, Kelly; Svoboda, Mark; Stahl, Kerstin

    2016-04-01

    While damage or vulnerability functions for floods and seismic hazards have gained considerable attention, there is comparably little knowledge on drought damage or loss. On the one hand this is due to the complexity of the drought hazard affecting different domains of the hydrological cycle and different sectors of human activity. Hence, a single hazard indicator is likely not able to fully capture this multifaceted hazard. On the other hand, drought impacts are often non-structural and hard to quantify or monetize. Examples are impaired navigability of streams, restrictions on domestic water use, reduced hydropower production, reduced tree growth, and irreversible deterioration/loss of wetlands. Apart from reduced crop yield, data about drought damage or loss with adequate spatial and temporal resolution is scarce, making the development of drought damage functions difficult. As an intermediate step towards drought damage functions we exploit text-based reports on drought impacts from the European Drought Impact report Inventory and the US Drought Impact Reporter to derive surrogate information for drought damage or loss. First, text-based information on drought impacts is converted into time series of absence versus presence of impacts, or number of impact occurrences. Second, meaningful hydro-meteorological indicators characterizing drought intensity are identified. Third, different statistical models are tested as link functions relating drought hazard indicators with drought impacts: 1) logistic regression for drought impacts coded as a binary response variable; and 2) mixture/hurdle models (zero-inflated/zero-altered negative binomial regression) and an ensemble regression tree approach for modeling the number of drought impact occurrences.
Testing the predictability of (number of) drought impact occurrences based on cross-validation revealed a good agreement between observed and modeled (number of) impacts for regions at the scale of federal states or provinces with good data availability. Impact functions representing localized drought impacts are more challenging to construct given that less data is available, yet may provide information that more directly addresses stakeholders' needs. Overall, our study contributes insights into how drought intensity translates into ecological and socioeconomic impacts, and how such information may be used for enhancing drought monitoring and early warning.
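
    The zero-inflation idea behind these mixture/hurdle models can be sketched in its simplest form: a zero-inflated Poisson fit by maximum likelihood on simulated data (illustrative only; the study itself used zero-inflated/zero-altered negative binomial regression):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit, gammaln

def zip_negloglik(params, y):
    """Mean negative log-likelihood of a zero-inflated Poisson (ZIP).

    params = (logit of pi, log of lam), where pi is the probability of a
    structural ('extra') zero and lam is the Poisson mean of the count part.
    """
    pi, lam = expit(params[0]), np.exp(params[1])
    log_pois = -lam + y * np.log(lam) - gammaln(y + 1)      # Poisson log-pmf
    ll = np.where(
        y == 0,
        np.log(pi + (1.0 - pi) * np.exp(-lam)),             # a zero from either part
        np.log1p(-pi) + log_pois,                           # positive counts
    )
    return -ll.mean()

# Simulate ZIP data: 30% structural zeros on top of a Poisson(4) count process.
rng = np.random.default_rng(0)
n, pi_true, lam_true = 20_000, 0.3, 4.0
y = rng.poisson(lam_true, n) * (rng.random(n) > pi_true)

res = minimize(zip_negloglik, x0=np.zeros(2), args=(y,))
pi_hat, lam_hat = expit(res.x[0]), np.exp(res.x[1])
print(pi_hat, lam_hat)   # close to 0.3 and 4.0
```

    Extending the sketch to regression amounts to replacing pi and lam with inverse-link transforms of covariate effects, as standard zero-inflated count software does.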

  17. Prediction of forest fires occurrences with area-level Poisson mixed models.

    PubMed

    Boubeta, Miguel; Lombardía, María José; Marey-Pérez, Manuel Francisco; Morales, Domingo

    2015-05-01

    The number of fires in forest areas of Galicia (north-west of Spain) during the summer period is quite high. Local authorities are interested in analyzing the factors that explain this phenomenon. Poisson regression models are good tools for describing and predicting the number of fires per forest areas. This work employs area-level Poisson mixed models for treating real data about fires in forest areas. A parametric bootstrap method is applied for estimating the mean squared errors of fires predictors. The developed methodology and software are applied to a real data set of fires in forest areas of Galicia. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Oscillatory Reduction in Option Pricing Formula Using Shifted Poisson and Linear Approximation

    NASA Astrophysics Data System (ADS)

    Nur Rachmawati, Ro'fah; Irene; Budiharto, Widodo

    2014-03-01

    Options are derivative instruments that can help investors improve their expected return and minimize risk. However, the Black-Scholes formula generally used to determine an option's price does not involve a skewness factor, and it is difficult to apply in computation because it produces oscillation for skewness values close to zero. In this paper, we construct an option pricing formula that involves skewness by modifying the Black-Scholes formula using a shifted Poisson model, and transform it into the form of a linear approximation in the complete market to reduce the oscillation. The resulting linear approximation formula predicts the price of an option very accurately and successfully reduces the oscillations in the calculation process.

  19. A critical re-evaluation of the regression model specification in the US D1 EQ-5D value function

    PubMed Central

    2012-01-01

    Background The EQ-5D is a generic health-related quality of life instrument (five dimensions with three levels, 243 health states), used extensively in cost-utility/cost-effectiveness analyses. EQ-5D health states are assigned values on a scale anchored in perfect health (1) and death (0). The dominant procedure for defining values for EQ-5D health states involves regression modeling. These regression models have typically included a constant term, interpreted as the utility loss associated with any movement away from perfect health. The authors of the United States EQ-5D valuation study replaced this constant with a variable, D1, which corresponds to the number of impaired dimensions beyond the first. The aim of this study was to illustrate how the use of the D1 variable in place of a constant is problematic. Methods We compared the original D1 regression model with a mathematically equivalent model with a constant term. Comparisons included implications for the magnitude and statistical significance of the coefficients, multicollinearity (variance inflation factors, or VIFs), number of calculation steps needed to determine tariff values, and consequences for tariff interpretation. Results Using the D1 variable in place of a constant shifted all dummy variable coefficients away from zero by the value of the constant, greatly increased the multicollinearity of the model (maximum VIF of 113.2 vs. 21.2), and increased the mean number of calculation steps required to determine health state values. Discussion Using the D1 variable in place of a constant constitutes an unnecessary complication of the model, obscures the fact that at least two of the main effect dummy variables are statistically nonsignificant, and complicates and biases interpretation of the tariff algorithm. PMID:22244261

  20. A critical re-evaluation of the regression model specification in the US D1 EQ-5D value function.

    PubMed

    Rand-Hendriksen, Kim; Augestad, Liv A; Dahl, Fredrik A

    2012-01-13

    The EQ-5D is a generic health-related quality of life instrument (five dimensions with three levels, 243 health states), used extensively in cost-utility/cost-effectiveness analyses. EQ-5D health states are assigned values on a scale anchored in perfect health (1) and death (0). The dominant procedure for defining values for EQ-5D health states involves regression modeling. These regression models have typically included a constant term, interpreted as the utility loss associated with any movement away from perfect health. The authors of the United States EQ-5D valuation study replaced this constant with a variable, D1, which corresponds to the number of impaired dimensions beyond the first. The aim of this study was to illustrate how the use of the D1 variable in place of a constant is problematic. We compared the original D1 regression model with a mathematically equivalent model with a constant term. Comparisons included implications for the magnitude and statistical significance of the coefficients, multicollinearity (variance inflation factors, or VIFs), number of calculation steps needed to determine tariff values, and consequences for tariff interpretation. Using the D1 variable in place of a constant shifted all dummy variable coefficients away from zero by the value of the constant, greatly increased the multicollinearity of the model (maximum VIF of 113.2 vs. 21.2), and increased the mean number of calculation steps required to determine health state values. Using the D1 variable in place of a constant constitutes an unnecessary complication of the model, obscures the fact that at least two of the main effect dummy variables are statistically nonsignificant, and complicates and biases interpretation of the tariff algorithm.

  1. Serum insulin-like growth factor (IGF)-I and IGF binding protein-3 in relation to terminal duct lobular unit involution of the normal breast in Caucasian and African American women: The Susan G. Komen Tissue Bank.

    PubMed

    Oh, Hannah; Pfeiffer, Ruth M; Falk, Roni T; Horne, Hisani N; Xiang, Jackie; Pollak, Michael; Brinton, Louise A; Storniolo, Anna Maria V; Sherman, Mark E; Gierach, Gretchen L; Figueroa, Jonine D

    2018-08-01

    Lesser degrees of terminal duct lobular unit (TDLU) involution, as reflected by higher numbers of TDLUs and acini/TDLU, are associated with elevated breast cancer risk. In rodent models, the insulin-like growth factor (IGF) system regulates involution of the mammary gland. We examined associations of circulating IGF measures with TDLU involution in normal breast tissues among women without precancerous lesions. Among 715 Caucasian and 283 African American (AA) women who donated normal breast tissue samples to the Komen Tissue Bank between 2009 and 2012 (75% premenopausal), serum concentrations of IGF-I and binding protein (IGFBP)-3 were quantified using enzyme-linked immunosorbent assay. Hematoxylin and eosin-stained tissue sections were assessed for numbers of TDLUs ("TDLU count"). Zero-inflated Poisson regression models with a robust variance estimator were used to estimate relative risks (RRs) for association of IGF measures (tertiles) with TDLU count by race and menopausal status, adjusting for potential confounders. AA (vs. Caucasian) women had higher age-adjusted mean levels of serum IGF-I (137 vs. 131 ng/mL, p = 0.07) and lower levels of IGFBP-3 (4165 vs. 4684 ng/mL, p < 0.0001). Postmenopausal IGFBP-3 was inversely associated with TDLU count among AA (RR(T3 vs. T1) = 0.49, 95% CI = 0.28-0.84, p-trend = 0.04) and Caucasian (RR(T3 vs. T1) = 0.64, 95% CI = 0.42-0.98, p-trend = 0.04) women. In premenopausal women, higher IGF-I:IGFBP-3 ratios were associated with higher TDLU count in Caucasian (RR(T3 vs. T1) = 1.33, 95% CI = 1.02-1.75, p-trend = 0.04), but not in AA (RR(T3 vs. T1) = 0.65, 95% CI = 0.42-1.00, p-trend = 0.05), women. Our data suggest a role of the IGF system, particularly IGFBP-3, in TDLU involution of the normal breast, a breast cancer risk factor, among Caucasian and AA women. © 2018 UICC.

  2. A semi-nonparametric Poisson regression model for analyzing motor vehicle crash data.

    PubMed

    Ye, Xin; Wang, Ke; Zou, Yajie; Lord, Dominique

    2018-01-01

    This paper develops a semi-nonparametric Poisson regression model to analyze motor vehicle crash frequency data collected from rural multilane highway segments in California, US. Motor vehicle crash frequency on rural highways is a topic of interest in the area of transportation safety due to higher driving speeds and the resultant severity level. Unlike the traditional Negative Binomial (NB) model, the semi-nonparametric Poisson regression model can accommodate unobserved heterogeneity following a highly flexible semi-nonparametric (SNP) distribution. Simulation experiments are conducted to demonstrate that the SNP distribution can well mimic a large family of distributions, including normal distributions, log-gamma distributions, bimodal and trimodal distributions. Empirical estimation results show that such flexibility offered by the SNP distribution can greatly improve model precision and the overall goodness-of-fit. The semi-nonparametric distribution can provide a better understanding of crash data structure through its ability to capture potential multimodality in the distribution of unobserved heterogeneity. When estimated coefficients in empirical models are compared, SNP and NB models are found to have a substantially different coefficient for the dummy variable indicating the lane width. The SNP model with better statistical performance suggests that the NB model overestimates the effect of lane width on crash frequency reduction by 83.1%.

  3. Effect of motivational interviewing on rates of early childhood caries: a randomized trial.

    PubMed

    Harrison, Rosamund; Benton, Tonya; Everson-Stewart, Siobhan; Weinstein, Phil

    2007-01-01

    The purposes of this randomized controlled trial were to: (1) test motivational interviewing (MI) to prevent early childhood caries; and (2) use Poisson regression for data analysis. A total of 240 South Asian children 6 to 18 months old were enrolled and randomly assigned to either the MI or control condition. Children had a dental exam, and their mothers completed pretested instruments at baseline and 1 and 2 years postintervention. Other covariates that might explain outcomes over and above treatment differences were modeled using Poisson regression. Hazard ratios were produced. Analyses included all participants whenever possible. Poisson regression supported a protective effect of MI (hazard ratio [HR] = 0.54; 95% CI = 0.35-0.84); that is, the MI group had about a 46% lower rate of dmfs at 2 years than did control children. Similar treatment effect estimates were obtained from models that included, as alternative outcomes, ds, dms, and dmfs, including "white spot lesions." Exploratory analyses revealed that rates of dmfs were higher in children whose mothers had: (1) prechewed their food; (2) been raised in a rural environment; and (3) a higher family income (P<.05). A motivational interviewing-style intervention shows promise to promote preventive behaviors in mothers of young children at high risk for caries.
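
    Poisson regression with a log link, as used here to produce rate ratios, can be fit by iteratively reweighted least squares (IRLS). The sketch below simulates a two-arm trial with a true rate ratio of 0.54 (borrowing the abstract's effect size as inspiration; the data and design are synthetic):

```python
import numpy as np

def poisson_irls(X, y, tol=1e-8, max_iter=100):
    """Fit a log-link Poisson regression by iteratively reweighted least squares."""
    beta = np.zeros(X.shape[1])
    for _ in range(max_iter):
        mu = np.exp(X @ beta)                 # fitted means
        z = X @ beta + (y - mu) / mu          # working response
        beta_new = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

rng = np.random.default_rng(7)
n = 20_000
treat = rng.integers(0, 2, n)                 # 1 = intervention arm
X = np.column_stack([np.ones(n), treat])      # intercept + treatment dummy
y = rng.poisson(np.exp(1.0 + np.log(0.54) * treat))

beta = poisson_irls(X, y)
rate_ratio = np.exp(beta[1])
print(rate_ratio)   # close to the true value 0.54
```

    In practice a standard GLM routine with a robust (sandwich) variance estimator would also supply the confidence interval around this ratio.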

  4. Trajectories of suicidal ideation in depressed older adults undergoing antidepressant treatment.

    PubMed

    Kasckow, John; Youk, Ada; Anderson, Stewart J; Dew, Mary Amanda; Butters, Meryl A; Marron, Megan M; Begley, Amy E; Szanto, Katalin; Dombrovski, Alexander Y; Mulsant, Benoit H; Lenze, Eric J; Reynolds, Charles F

    2016-02-01

    Suicide is a public health concern in older adults. Recent cross sectional studies suggest that impairments in executive functioning, memory and attention are associated with suicidal ideation in older adults. It is unknown whether these neuropsychological features predict persistent suicidal ideation. We analyzed data from 468 individuals ≥ age 60 with major depression who received venlafaxine XR monotherapy for up to 16 weeks. We used latent class growth modeling to classify groups of individuals based on trajectories of suicidal ideation. We also examined whether cognitive dysfunction predicted suicidal ideation while controlling for time-dependent variables including depression severity, and age and education. The optimal model using a zero inflated Poisson link classified individuals into four groups, each with a distinct temporal trajectory of suicidal ideation: those with 'minimal suicidal ideation' across time points; those with 'low suicidal ideation'; those with 'rapidly decreasing suicidal ideation'; and those with 'high and persistent suicidal ideation'. Participants in the 'high and persistent suicidal ideation' group had worse scores relative to those in the 'rapidly decreasing suicidal ideation' group on the Color-Word 'inhibition/switching' subtest from the Delis-Kaplan Executive Function Scale, worse attention index scores on the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS) and worse total RBANS index scores. These findings suggest that individuals with poorer ability to switch between inhibitory and non-inhibitory responses as well as worse attention and worse overall cognitive status are more likely to have persistently higher levels of suicidal ideation. Clinical trial: NCT00892047. Published by Elsevier Ltd.

  5. WHERE THE INDIVIDUAL MEETS THE ECOLOGICAL: A STUDY OF PARENT DRINKING PATTERNS, ALCOHOL OUTLETS AND CHILD PHYSICAL ABUSE

    PubMed Central

    Freisthler, Bridget; Gruenewald, Paul J.

    2012-01-01

    Background Despite well-known associations between heavy drinking and child physical abuse, little is known about specific risks related to drinking different amounts of alcohol in different drinking venues. This study uses a context-specific dose-response model to examine how drinking in various venues (e.g., at bars or parties) is related to physically abusive parenting practices while controlling for individual and psychosocial characteristics. Methods Data were collected via a telephone survey of parents in 50 cities in California resulting in 2,163 respondents who reported drinking in the past year. Child physical abuse and corporal punishment were measured using the Conflict Tactics Scale, Parent Child version. Drinking behaviors were measured using continued drinking measures. Data were analyzed using zero inflated Poisson models. Results Drinking at homes, parties or bars more frequently was related to greater frequencies of physically abusive parenting practices. The use of greater amounts of alcohol in association with drinking at bars appeared to increase risks for corporal punishment, a dose-response effect. Dose-response relationships were not found for drinking at homes, parties, or bars for physical abuse, nor for drinking at homes or parties for corporal punishment. Conclusion Frequencies of using drinking venues, particularly bars and home or parties, are associated with greater use of abusive parenting practices. These findings suggest that a parent’s routine drinking activities place children at different risks for being physically abused. They also suggest that interventions that take into account parents’ alcohol use at drinking venues are an important avenue for secondary prevention efforts. PMID:23316780

  6. Racial differences in parenting style typologies and heavy episodic drinking trajectories.

    PubMed

    Clark, Trenette T; Yang, Chongming; McClernon, F Joseph; Fuemmeler, Bernard F

    2015-07-01

    This study examines racial differences between Whites and Blacks in the association of parenting style typologies with changes in heavy episodic drinking from adolescence to young adulthood. The analytic sample consists of 9,942 adolescents drawn from the National Longitudinal Study of Adolescent Health, which followed respondents from ages 12 to 31 years. Confirmatory factor analysis and factor mixture modeling are used to classify parenting style typologies based on measures of parental acceptance and control. Heavy Episodic Drinking (HED) trajectories are evaluated using a zero-inflated Poisson multigroup latent growth curve modeling approach. The mixture model identified 4 heterogeneous groups that differed based on the 2 latent variables (parental acceptance and control): balanced (65.8% of the sample), authoritarian (12.2%), permissive (19.4%), and uninvolved or neglectful (2.7%). Regardless of race, we found that at age 12 years, children of authoritarian parents have a higher probability of not engaging in HED than children of parents with balanced, permissive, or neglectful parenting styles. However, among Black youth who reported HED at age 12, authoritarian parenting was associated with greater level of HED at age 12 but a less steep increase in level of HED as age increased yearly as compared with balanced parenting. For White adolescents, uninvolved, permissive, and authoritarian parenting were not associated with a greater level of HED as age increased yearly as compared with adolescents exposed to balanced parenting. The influence of parenting styles on HED during adolescence persists into young adulthood and differs by race for youth engaging in HED. (c) 2015 APA, all rights reserved.

  7. Racial Differences in Parenting Style Typologies and Heavy Episodic Drinking Trajectories

    PubMed Central

    Clark, Trenette T.; Yang, Chongming; McClernon, F. Joseph; Fuemmeler, Bernard

    2014-01-01

    Objective This study examines racial differences between Caucasians and African Americans in the association of parenting style typologies with changes in heavy episodic drinking (HED) from adolescence to young adulthood. Methods The analytic sample consists of 9,942 adolescents drawn from the National Longitudinal Study of Adolescent Health, which followed respondents from ages 12 to 31 years. Confirmatory factor analysis and factor mixture modeling are used to classify parenting style typologies based on measures of parental acceptance and control. HED trajectories are evaluated using a zero-inflated Poisson multigroup latent growth curve modeling approach. Results The mixture model identified four heterogeneous groups that differed based on the two latent variables (parental acceptance and control): balanced (65.8% of the sample), authoritarian (12.2%), permissive (19.4%), and uninvolved/neglectful (2.7%). Regardless of race, we found that at age 12 years, children of authoritarian parents have a higher probability of not engaging in HED than children of parents with balanced, permissive, or neglectful parenting styles. However, among African American youth who reported HED at age 12, authoritarian parenting was associated with greater level of HED at age 12 but a less steep increase in level of HED as age increased yearly as compared with balanced parenting. For Caucasian adolescents, uninvolved, permissive, and authoritarian parenting were not associated with a greater level of HED as age increased yearly as compared with adolescents exposed to balanced parenting. Conclusion The influence of parenting styles on HED during adolescence persists into young adulthood and differs by race for youth engaging in HED. PMID:25222086

  8. Trajectories of Suicidal Ideation in Depressed Older Adults Undergoing Antidepressant Treatment

    PubMed Central

    Youk, Ada; Anderson, Stewart J.; Dew, Mary Amanda; Butters, Meryl A.; Marron, Megan M.; Begley, Amy E.; Szanto, Katalin; Dombrovski, Alexander Y.; Mulsant, Benoit H.; Lenze, Eric J.; Reynolds, Charles F.

    2015-01-01

    Suicide is a public health concern in older adults. Recent cross-sectional studies suggest that impairments in executive functioning, memory, and attention are associated with suicidal ideation in older adults. It is unknown whether these neuropsychological features predict persistent suicidal ideation. We analyzed data from 468 individuals ≥ age 60 with major depression who received venlafaxine XR monotherapy for up to 16 weeks. We used latent class growth modeling to classify groups of individuals based on trajectories of suicidal ideation. We also examined whether cognitive dysfunction predicted suicidal ideation while controlling for time-dependent variables including depression severity, age, and education. The optimal model, using a zero-inflated Poisson link, classified individuals into four groups, each with a distinct temporal trajectory of suicidal ideation: those with ‘minimal suicidal ideation’ across time points; those with ‘low suicidal ideation’; those with ‘rapidly decreasing suicidal ideation’; and those with ‘high and persistent suicidal ideation’. Participants in the ‘high and persistent suicidal ideation’ group had worse scores relative to those in the ‘rapidly decreasing suicidal ideation’ group on the Color-Word ‘inhibition/switching’ subtest from the Delis-Kaplan Executive Function Scale, worse attention index scores on the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS), and worse total RBANS index scores. These findings suggest that individuals with poorer ability to switch between inhibitory and non-inhibitory responses, as well as worse attention and worse overall cognitive status, are more likely to have persistently higher levels of suicidal ideation. PMID:26708830

  9. Modeling the number of car theft using Poisson regression

    NASA Astrophysics Data System (ADS)

    Zulkifli, Malina; Ling, Agnes Beh Yen; Kasim, Maznah Mat; Ismail, Noriszura

    2016-10-01

    Regression analysis is among the most popular statistical methods used to express the relationship between a response variable and covariates. The aim of this paper is to evaluate the factors that influence the number of car thefts using a Poisson regression model, focusing on car thefts that occurred in districts of Peninsular Malaysia. Two groups of factors are considered, namely district descriptive factors and socio-demographic factors. The results show that Bumiputera composition, Chinese composition, other ethnic composition, foreign migration, number of residents aged 25 to 64, number of employed persons, and number of unemployed persons are the factors that most influence car theft cases. This information is useful for law enforcement departments, insurance companies, and car owners in reducing and limiting car theft cases in Peninsular Malaysia.

  10. An integrated model for detecting significant chromatin interactions from high-resolution Hi-C data

    PubMed Central

    Carty, Mark; Zamparo, Lee; Sahin, Merve; González, Alvaro; Pelossof, Raphael; Elemento, Olivier; Leslie, Christina S.

    2017-01-01

    Here we present HiC-DC, a principled method to estimate the statistical significance (P values) of chromatin interactions from Hi-C experiments. HiC-DC uses hurdle negative binomial regression to account for systematic sources of variation in Hi-C read counts—for example, distance-dependent random polymer ligation and GC content and mappability bias—and to model zero inflation and overdispersion. Applied to high-resolution Hi-C data in a lymphoblastoid cell line, HiC-DC detects significant interactions at the sub-topologically associating domain level, identifying potential structural and regulatory interactions supported by CTCF binding sites, DNase accessibility, and/or active histone marks. CTCF-associated interactions are most strongly enriched in the middle genomic distance range (∼700 kb–1.5 Mb), while interactions involving actively marked DNase accessible elements are enriched both at short (<500 kb) and longer (>1.5 Mb) genomic distances. There is a striking enrichment of longer-range interactions connecting replication-dependent histone genes on chromosome 6, potentially representing the chromatin architecture at the histone locus body. PMID:28513628

  11. Social vulnerability and the natural and built environment: a model of flood casualties in Texas.

    PubMed

    Zahran, Sammy; Brody, Samuel D; Peacock, Walter Gillis; Vedlitz, Arnold; Grover, Himanshu

    2008-12-01

    Studies on the impacts of hurricanes, tropical storms, and tornados indicate that poor communities of colour suffer disproportionately in human death and injury. Few quantitative studies have been conducted on the degree to which flood events affect socially vulnerable populations. We address this research void by analysing 832 countywide flood events in Texas from 1997 to 2001. Specifically, we examine whether geographic localities characterised by high percentages of socially vulnerable populations experience significantly more casualties due to flood events, adjusting for characteristics of the natural and built environment. Zero-inflated negative binomial regression models indicate that the odds of a flood casualty increase with the level of precipitation on the day of a flood event, flood duration, property damage caused by the flood, population density, and the presence of socially vulnerable populations. Odds decrease with the number of dams, the level of precipitation on the day before a recorded flood event, and the extent to which localities have enacted flood mitigation strategies. The study concludes with comments on hazard-resilient communities and protection of casualty-prone populations.

  12. Social network and individual correlates of sexual risk behavior among homeless young men who have sex with men.

    PubMed

    Tucker, Joan S; Hu, Jianhui; Golinelli, Daniela; Kennedy, David P; Green, Harold D; Wenzel, Suzanne L

    2012-10-01

    There is growing interest in network-based interventions to reduce HIV sexual risk behavior among both homeless youth and men who have sex with men. The goal of this study was to better understand the social network and individual correlates of sexual risk behavior among homeless young men who have sex with men (YMSM) to inform these HIV prevention efforts. A multistage sampling design was used to recruit a probability sample of 121 homeless YMSM (ages: 16-24 years) from shelters, drop-in centers, and street venues in Los Angeles County. Face-to-face interviews were conducted. Because of the different distributions of the three outcome variables, three distinct regression models were needed: ordinal logistic regression for unprotected sex, zero-truncated Poisson regression for number of sex partners, and logistic regression for any sex trade. Homeless YMSM were less likely to engage in unprotected sex and had fewer sex partners if their networks included platonic ties to peers who regularly attended school, and had fewer sex partners if most of their network members were not heavy drinkers. Most other aspects of network composition were unrelated to sexual risk behavior. Individual predictors of sexual risk behavior included older age, Hispanic ethnicity, lower education, depressive symptoms, less positive condom attitudes, and sleeping outdoors because of nowhere else to stay. HIV prevention programs for homeless YMSM may warrant a multipronged approach that helps these youth strengthen their ties to prosocial peers, develop more positive condom attitudes, and access needed mental health and housing services. Copyright © 2012 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  13. Censored Hurdle Negative Binomial Regression (Case Study: Neonatorum Tetanus Case in Indonesia)

    NASA Astrophysics Data System (ADS)

    Yuli Rusdiana, Riza; Zain, Ismaini; Wulan Purnami, Santi

    2017-06-01

    Hurdle negative binomial regression is a method for discrete dependent variables with excess zeros and under- or overdispersion. It uses a two-part approach: the first part, the zero hurdle model, models whether the dependent variable is zero, while the second part, a truncated negative binomial model, models the positive (non-negative integer) values. The dependent variable in such cases may also be censored; the type of censoring studied in this research is right censoring. This study aims to obtain the parameter estimator of hurdle negative binomial regression for a right-censored dependent variable. Maximum likelihood estimation (MLE) is used for parameter estimation. The censored hurdle negative binomial regression model is applied to the number of neonatorum tetanus cases in Indonesia, count data that contain zeros for some observations and varying positive values for others. This study also aims to obtain the parameter estimator and test statistic of the censored hurdle negative binomial model. Based on the regression results, the factors that influence neonatorum tetanus cases in Indonesia are the percentage of baby health care coverage and neonatal visits.

  14. Zimbabwe

    DTIC Science & Technology

    2009-04-01

    Fund ( IMF ) lending has been suspended since 2000 due to nonpayment of arrears, and foreign currency for essential imports, particularly fuel, is in...remain bleak in the near term. Ignoring the advice of the IMF , the government refused to devalue the official exchange rate. Instead, in June 2006...Gono devalued the country’s currency , the Zimbabwe dollar, removing three zeros in an effort to mitigate inflation. Under “Operation Sunrise,” the

  15. Intimate partner violence and women's economic and non-economic activities in Minya, Egypt.

    PubMed

    Yount, Kathryn M.; Zureick-Brown, Sarah; Salem, Rania

    2014-06-01

    Intimate partner violence (IPV) against women is widespread, but its implications for their economic and non-economic activities are understudied. Leveraging new data from 564 ever-married women aged 22–65 in rural Minya, Egypt, we estimated logistic regressions and zero-inflated negative binomial regressions to test spillover, compensation, and patriarchal bargaining theories about the influences of women's exposure to IPV on their engagement in and time spent on market, subsistence, domestic, and care work. Supporting compensation theory, exposures to lifetime, recent, and chronic physical or sexual IPV were associated with higher adjusted odds of performing market work in the prior month, and exposures to recent and chronic IPV were associated with higher adjusted odds of performing subsistence work in this period. Supporting compensation and patriarchal bargaining theories, exposures to recent and chronic IPV were associated with more time spent on domestic work in the prior day. Supporting spillover and patriarchal bargaining theories, exposures to lifetime IPV of all forms were associated with lower adjusted odds of performing mostly nonspousal care work in the prior day, and this association was partially mediated by women's generalized anxiety. Women in rural Minya who are exposed to IPV may escalate their housework to fulfill local norms of feminine domesticity while substituting economic activities for nonspousal care work to enhance their economic independence from violent partners.

  16. Is there an Appalachian disparity in dental caries in Pennsylvania schoolchildren?

    PubMed

    Polk, Deborah E; Kim, Sunghee; Manz, Michael; Weyant, Robert J

    2015-02-01

    To determine whether there is an Appalachian disparity in caries prevalence or extent in children living in Pennsylvania. We conducted a cross-sectional clinical assessment of caries in a sample representing 1st, 3rd, 9th, and 11th grade students across Pennsylvania. We used logistic regression and zero-inflated negative binomial regression controlling for age to examine the association of residence in an Appalachian county with caries prevalence and extent in the primary and permanent dentitions. Compared with children living outside Appalachia, more children living in Appalachia had a dft >0 (OR = 1.37, 95% CI = 1.07-1.76) and more had a DMFT >0 (OR = 1.32, 95% CI = 1.06-1.64). In addition, compared with children living outside Appalachia, children living in Appalachia had a greater primary but not permanent caries extent (IRR = 1.10, 95% CI = 1.01-1.19). We found Appalachian disparities in caries prevalence in both the primary and permanent dentitions and an Appalachian disparity in caries extent in the primary dentition. None of the disparities was moderated by age. This suggests that the search for the mechanism or mechanisms for the Appalachian disparities should focus on differential exposures to risk factors occurring prior to and at the start of elementary school. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Is There an Appalachian Disparity in Dental Caries in Pennsylvania Schoolchildren?

    PubMed Central

    Polk, Deborah E.; Kim, Sunghee; Manz, Michael; Weyant, Robert J.

    2015-01-01

    Objectives To determine whether there is an Appalachian disparity in caries prevalence or extent in children living in Pennsylvania. Methods We conducted a cross-sectional clinical assessment of caries in a sample representing 1st, 3rd, 9th, and 11th grade students across Pennsylvania. We used logistic regression and zero-inflated negative binomial regression controlling for age to examine the association of residence in an Appalachian county with caries prevalence and extent in the primary and permanent dentitions. Results Compared with children living outside Appalachia, more children living in Appalachia had a dft > 0 (OR = 1.37, 95% CI = 1.07 – 1.76) and more had a DMFT > 0 (OR = 1.32, 95% CI = 1.06 – 1.64). In addition, compared with children living outside Appalachia, children living in Appalachia had a greater primary but not permanent caries extent (IRR = 1.10, 95% CI = 1.01 – 1.19). Conclusions We found Appalachian disparities in caries prevalence in both the primary and permanent dentitions and an Appalachian disparity in caries extent in the primary dentition. None of the disparities was moderated by age. This suggests that the search for the mechanism or mechanisms for the Appalachian disparities should focus on differential exposures to risk factors occurring prior to and at the start of elementary school. PMID:25470650

  18. Type I error rates of rare single nucleotide variants are inflated in tests of association with non-normally distributed traits using simple linear regression methods.

    PubMed

    Schwantes-An, Tae-Hwi; Sung, Heejong; Sabourin, Jeremy A; Justice, Cristina M; Sorant, Alexa J M; Wilson, Alexander F

    2016-01-01

    In this study, the effects of (a) the minor allele frequency of the single nucleotide variant (SNV), (b) the degree of departure from normality of the trait, and (c) the position of the SNVs on type I error rates were investigated in the Genetic Analysis Workshop (GAW) 19 whole exome sequence data. To test the distribution of the type I error rate, 5 simulated traits were considered: standard normal and gamma distributed traits; 2 transformed versions of the gamma trait (log 10 and rank-based inverse normal transformations); and trait Q1 provided by GAW 19. Each trait was tested with 313,340 SNVs. Tests of association were performed with simple linear regression and average type I error rates were determined for minor allele frequency classes. Rare SNVs (minor allele frequency < 0.05) showed inflated type I error rates for non-normally distributed traits that increased as the minor allele frequency decreased. The inflation of average type I error rates increased as the significance threshold decreased. Normally distributed traits did not show inflated type I error rates with respect to the minor allele frequency for rare SNVs. There was no consistent effect of transformation on the uniformity of the distribution of the location of SNVs with a type I error.
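
The core finding (inflated type I error for rare variants with non-normal traits under simple linear regression) is easy to reproduce in miniature. A hedged simulation sketch, with parameter values chosen for speed rather than to match the GAW 19 setup:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)
n, maf, nsim, alpha = 500, 0.02, 1000, 0.05
hits_gamma = hits_normal = 0
for _ in range(nsim):
    g = rng.binomial(2, maf, size=n)           # rare-SNV genotypes, null model
    trait_gamma = rng.gamma(shape=1.0, size=n) # skewed trait, independent of g
    trait_norm = rng.normal(size=n)            # normal trait, independent of g
    # simple linear regression of trait on genotype; count nominal rejections
    hits_gamma += stats.linregress(g, trait_gamma).pvalue < alpha
    hits_normal += stats.linregress(g, trait_norm).pvalue < alpha
rate_gamma, rate_normal = hits_gamma / nsim, hits_normal / nsim
print(rate_gamma, rate_normal)   # empirical type I error rates at alpha = 0.05
```

Per the record, the departure from the nominal 0.05 level for the skewed trait grows as the minor allele frequency and the significance threshold decrease; rerunning with smaller `maf` and `alpha` makes the inflation more visible.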

  19. Impact of cigarette smoking on utilization of nursing home services.

    PubMed

    Warner, Kenneth E; McCammon, Ryan J; Fries, Brant E; Langa, Kenneth M

    2013-11-01

    Few studies have examined the effects of smoking on nursing home utilization, generally using poor data on smoking status. No previous study has distinguished utilization for recent from long-term quitters. Using the Health and Retirement Study, we assessed nursing home utilization by never-smokers, long-term quitters (quit >3 years), recent quitters (quit ≤3 years), and current smokers. We used logistic regression to evaluate the likelihood of a nursing home admission. For those with an admission, we used negative binomial regression on the number of nursing home nights. Finally, we employed zero-inflated negative binomial regression to estimate nights for the full sample. Controlling for other variables, compared with never-smokers, long-term quitters have an odds ratio (OR) for nursing home admission of 1.18 (95% CI: 1.07-1.2), current smokers 1.39 (1.23-1.57), and recent quitters 1.55 (1.29-1.87). The probability of admission rises rapidly with age and is lower for African Americans and Hispanics, more affluent respondents, respondents with a spouse present in the home, and respondents with a living child. Given admission, smoking status is not associated with length of stay (LOS). LOS is longer for older respondents and women and shorter for more affluent respondents and those with spouses present. Compared with otherwise identical never-smokers, former and current smokers have a significantly increased risk of nursing home admission. That recent quitters are at greatest risk of admission is consistent with evidence that many stop smoking because they are sick, often due to smoking.

  20. Phenomenology of fermion production during axion inflation

    NASA Astrophysics Data System (ADS)

    Adshead, Peter; Pearce, Lauren; Peloso, Marco; Roberts, Michael A.; Sorbo, Lorenzo

    2018-06-01

    We study the production of fermions through a derivative coupling with a pseudoscalar inflaton and the effects of the produced fermions on the scalar primordial perturbations. We present analytic results for the modification of the scalar power spectrum due to the produced fermions, and we estimate the amplitude of the non-Gaussianities in the equilateral regime. Remarkably, we find a regime where the effect of the fermions gives the dominant contribution to the scalar spectrum while the amplitude of the bispectrum is small and in agreement with observation. We also note the existence of a regime in which the backreaction of the fermions on the evolution of the zero-mode of the inflaton can lead to inflation even if the potential of the inflaton is steep and does not satisfy the slow-roll conditions.

  1. Effects of Smoking on Cost of Hospitalization and Length of Stay among Patients with Lung Cancer in Iran: a Hospital-Based Study.

    PubMed

    Sari, Ali Akbari; Rezaei, Satar; Arab, Mohammad; Majdzadeh, Reza; Matin, Behzad Karami; Zandian, Hamed

    2016-01-01

    Smoking is recognized as a leading preventable cause of mortality and morbidity worldwide. It is responsible for a considerable financial burden both on the health system and in society. This study aimed to examine the effect of smoking on cost of hospitalization and length of stay (LoS) among patients with lung cancer in Iran in 2014. A total of 415 patients were included in the study. Data on age, sex, insurance status, type of hospital, type of insurance, geographic location, length of stay, and cost of hospitalization were extracted from medical records, and smoking status was obtained from a telephone survey. To compare cost of hospitalization and LoS across smoking groups (current smokers, former smokers, and never smokers), a gamma regression model and zero-truncated Poisson regression were used, respectively. Compared with never smokers, current and former smokers showed a 48% and 35% increase in hospitalization costs, respectively. Also, hospital LoS for current and former smokers was 72% and 31% higher than for never smokers, respectively. Our study indicated that cigarette smoking imposes a significant financial burden on hospitals in Iran. It is, however, recommended that more research should be done to implement and evaluate hospital-based smoking cessation interventions to better increase cessation rates in these settings.

  2. Predictors of the number of under-five malnourished children in Bangladesh: application of the generalized poisson regression model

    PubMed Central

    2013-01-01

    Background Malnutrition is one of the principal causes of child mortality in developing countries including Bangladesh. According to our knowledge, most of the available studies, that addressed the issue of malnutrition among under-five children, considered the categorical (dichotomous/polychotomous) outcome variables and applied logistic regression (binary/multinomial) to find their predictors. In this study the malnutrition variable (i.e. outcome) is defined as the number of under-five malnourished children in a family, which is a non-negative count variable. The purposes of the study are (i) to demonstrate the applicability of the generalized Poisson regression (GPR) model as an alternative to other statistical methods and (ii) to find some predictors of this outcome variable. Methods The data are extracted from the Bangladesh Demographic and Health Survey (BDHS) 2007. Briefly, this survey employs a nationally representative sample which is based on a two-stage stratified sample of households. A total of 4,460 under-five children are analysed using various statistical techniques, namely the Chi-square test and the GPR model. Results The GPR model (as compared to standard Poisson regression and negative binomial regression) is found to be justified to study the above-mentioned outcome variable because of its under-dispersion (variance < mean) property. Our study also identifies several significant predictors of the outcome variable, namely mother’s education, father’s education, wealth index, sanitation status, source of drinking water, and total number of children ever born to a woman. Conclusions Consistencies of our findings in light of many other studies suggest that the GPR model is an ideal alternative to other statistical models to analyse the number of under-five malnourished children in a family. Strategies based on significant predictors may improve the nutritional status of children in Bangladesh. PMID:23297699

  3. Evaluation of Denoising Strategies to Address Motion-Correlated Artifacts in Resting-State Functional Magnetic Resonance Imaging Data from the Human Connectome Project

    PubMed Central

    Kandala, Sridhar; Nolan, Dan; Laumann, Timothy O.; Power, Jonathan D.; Adeyemo, Babatunde; Harms, Michael P.; Petersen, Steven E.; Barch, Deanna M.

    2016-01-01

    Abstract Like all resting-state functional connectivity data, the data from the Human Connectome Project (HCP) are adversely affected by structured noise artifacts arising from head motion and physiological processes. Functional connectivity estimates (Pearson's correlation coefficients) were inflated for high-motion time points and for high-motion participants. This inflation occurred across the brain, suggesting the presence of globally distributed artifacts. The degree of inflation was further increased for connections between nearby regions compared with distant regions, suggesting the presence of distance-dependent spatially specific artifacts. We evaluated several denoising methods: censoring high-motion time points, motion regression, the FMRIB independent component analysis-based X-noiseifier (FIX), and mean grayordinate time series regression (MGTR; as a proxy for global signal regression). The results suggest that FIX denoising reduced both types of artifacts, but left substantial global artifacts behind. MGTR significantly reduced global artifacts, but left substantial spatially specific artifacts behind. Censoring high-motion time points resulted in a small reduction of distance-dependent and global artifacts, eliminating neither type. All denoising strategies left differences between high- and low-motion participants, but only MGTR substantially reduced those differences. Ultimately, functional connectivity estimates from HCP data showed spatially specific and globally distributed artifacts, and the most effective approach to address both types of motion-correlated artifacts was a combination of FIX and MGTR. PMID:27571276

  4. Football goal distributions and extremal statistics

    NASA Astrophysics Data System (ADS)

    Greenhough, J.; Birch, P. C.; Chapman, S. C.; Rowlands, G.

    2002-12-01

    We analyse the distributions of the number of goals scored by home teams, away teams, and the total scored in the match, in domestic football games from 169 countries between 1999 and 2001. The probability density functions (PDFs) of goals scored are too heavy-tailed to be fitted over their entire ranges by Poisson or negative binomial distributions which would be expected for uncorrelated processes. Log-normal distributions cannot include zero scores and here we find that the PDFs are consistent with those arising from extremal statistics. In addition, we show that it is sufficient to model English top division and FA Cup matches in the seasons of 1970/71-2000/01 on Poisson or negative binomial distributions, as reported in analyses of earlier seasons, and that these are not consistent with extremal statistics.

  5. An INAR(1) Negative Multinomial Regression Model for Longitudinal Count Data.

    ERIC Educational Resources Information Center

    Bockenholt, Ulf

    1999-01-01

    Discusses a regression model for the analysis of longitudinal count data in a panel study by adapting an integer-valued first-order autoregressive (INAR(1)) Poisson process to represent time-dependent correlation between counts. Derives a new negative multinomial distribution by combining INAR(1) representation with a random effects approach.…
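
The INAR(1) Poisson process this record adapts is defined by binomial thinning plus Poisson innovations, and is easy to simulate; the parameter values below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(14)
alpha, lam, T = 0.6, 2.0, 5000

# INAR(1): X_t = (alpha thinned X_{t-1}) + Poisson(lam) innovations
x = np.empty(T, dtype=int)
x[0] = rng.poisson(lam / (1 - alpha))          # stationary mean is lam / (1 - alpha)
for t in range(1, T):
    survivors = rng.binomial(x[t - 1], alpha)  # binomial thinning operator
    x[t] = survivors + rng.poisson(lam)

acf1 = np.corrcoef(x[:-1], x[1:])[0, 1]        # lag-1 autocorrelation equals alpha
print(x.mean(), acf1)
```

For a Poisson INAR(1), the stationary distribution is itself Poisson and the lag-1 autocorrelation equals the thinning probability, which is what makes the process a natural integer-valued analogue of AR(1) for longitudinal counts.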

  6. Pick Your Poisson: A Tutorial on Analyzing Counts of Student Victimization Data

    ERIC Educational Resources Information Center

    Huang, Francis L.; Cornell, Dewey G.

    2012-01-01

    School violence research is often concerned with infrequently occurring events such as counts of the number of bullying incidents or fights a student may experience. Analyzing count data using ordinary least squares regression may produce improbable predicted values, and as a result of regression assumption violations, result in higher Type I…

  7. [Prediction and spatial distribution of recruitment trees of natural secondary forest based on geographically weighted Poisson model].

    PubMed

    Zhang, Ling Yu; Liu, Zhao Gang

    2017-12-01

    Based on data collected from 108 permanent plots of the forest resources survey in Maoershan Experimental Forest Farm during 2004-2016, this study investigated the spatial distribution of recruitment trees in natural secondary forest by global Poisson regression and geographically weighted Poisson regression (GWPR) with four bandwidths of 2.5, 5, 10 and 15 km. The simulation effects of the 5 regressions and the factors influencing the recruitment trees in stands were analyzed, and the spatial autocorrelation of the regression residuals was described at global and local levels using Moran's I. The results showed that the spatial distribution of the number of natural secondary forest recruitment trees was significantly influenced by stand and topographic factors, especially average DBH. The GWPR model at the small scale (2.5 km) had high accuracy of model fitting, generated a large range of model parameter estimates, and captured the localized spatial distribution effect of the model parameters. The GWPR models at small scales (2.5 and 5 km) produced a small range of model residuals, and the stability of the model was improved. The global spatial autocorrelation of the GWPR model residuals at the small scale (2.5 km) was the lowest, and the local spatial autocorrelation was significantly reduced, forming an ideal spatial distribution pattern of small clusters with different observations. The local model at the small scale (2.5 km) was much better than the global model in simulating the spatial distribution of recruitment tree numbers.

  8. Modeling urban coastal flood severity from crowd-sourced flood reports using Poisson regression and Random Forest

    NASA Astrophysics Data System (ADS)

    Sadler, J. M.; Goodall, J. L.; Morsy, M. M.; Spencer, K.

    2018-04-01

    Sea level rise has already caused more frequent and severe coastal flooding and this trend will likely continue. Flood prediction is an essential part of a coastal city's capacity to adapt to and mitigate this growing problem. Complex coastal urban hydrological systems, however, do not always lend themselves easily to physically-based flood prediction approaches. This paper presents a method for using a data-driven approach to estimate flood severity in an urban coastal setting using crowd-sourced data, a non-traditional but growing data source, along with environmental observation data. Two data-driven models, Poisson regression and Random Forest regression, are trained to predict the number of flood reports per storm event as a proxy for flood severity, given extensive environmental data (i.e., rainfall, tide, groundwater table level, and wind conditions) as input. The method is demonstrated using data from Norfolk, Virginia USA from September 2010 to October 2016. Quality-controlled, crowd-sourced street flooding reports ranging from 1 to 159 per storm event for 45 storm events are used to train and evaluate the models. Random Forest performed better than Poisson regression at predicting the number of flood reports and had a lower false negative rate. From the Random Forest model, total cumulative rainfall was by far the most dominant input variable in predicting flood severity, followed by low tide and lower low tide. These methods serve as a first step toward using data-driven methods for spatially and temporally detailed coastal urban flood prediction.
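
The Poisson-regression-versus-Random-Forest comparison can be sketched with scikit-learn; the synthetic covariates (rainfall, tide) only loosely mirror the study's inputs, and this is not the authors' pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import PoissonRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(16)
n = 400
rain = rng.gamma(2.0, 2.0, size=n)             # cumulative rainfall (illustrative)
tide = rng.normal(size=n)                      # tide anomaly (illustrative)
X = np.column_stack([rain, tide])
reports = rng.poisson(np.exp(0.1 + 0.25 * rain + 0.2 * tide))

Xtr, Xte, ytr, yte = train_test_split(X, reports, random_state=0)
pois = PoissonRegressor(alpha=1e-4, max_iter=300).fit(Xtr, ytr)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(Xtr, ytr)
# Note: .score is D^2 (deviance explained) for PoissonRegressor, R^2 for the forest
print(pois.score(Xte, yte), rf.score(Xte, yte))
```

On real data, `rf.feature_importances_` gives the input-variable ranking the abstract describes (rainfall dominating), while the Poisson model keeps interpretable rate ratios.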

  9. 77 FR 13691 - Qualification of Drivers; Exemption Applications; Vision

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-07

    ..., ocular hypertension, retinal detachment, cataracts and corneal scaring. In most cases, their eye... Application of Multiple Regression Analysis of a Poisson Process,'' Journal of American Statistical...

  10. A covariance correction that accounts for correlation estimation to improve finite-sample inference with generalized estimating equations: A study on its applicability with structured correlation matrices

    PubMed Central

    Westgate, Philip M.

    2016-01-01

    When generalized estimating equations (GEE) incorporate an unstructured working correlation matrix, the variances of regression parameter estimates can inflate due to the estimation of the correlation parameters. In previous work, an approximation for this inflation that results in a corrected version of the sandwich formula for the covariance matrix of regression parameter estimates was derived. Use of this correction for correlation structure selection also reduces the over-selection of the unstructured working correlation matrix. In this manuscript, we conduct a simulation study to demonstrate that an increase in variances of regression parameter estimates can occur when GEE incorporates structured working correlation matrices as well. Correspondingly, we show the ability of the corrected version of the sandwich formula to improve the validity of inference and correlation structure selection. We also study the relative influences of two popular corrections to a different source of bias in the empirical sandwich covariance estimator. PMID:27818539

  12. Should the poor have no medicines to cure? A study on the association between social class and social security among the rural migrant workers in urban China.

    PubMed

    Guan, Ming

    2017-11-07

    The rampant urbanization and medical marketization in China have resulted in increased vulnerabilities to health and socioeconomic disparities among the rural migrant workers in urban China. In the Chinese context, the socioeconomic characteristics of rural migrant workers have attracted considerable research attention in recent years. However, to date, no previous studies have explored the association between the socioeconomic factors and social security among the rural migrant workers in urban China. This study aims to explore the association between socioeconomic inequity and social security inequity and the subsequent associations with medical inequity and reimbursement rejection. Data from a regionally representative sample of the 2009 Survey of Migrant Workers in Pearl River Delta in China were used for analyses. Multiple logistic regressions were used to analyze the impacts of socioeconomic factors on the eight dimensions of social security (sick pay, paid leave, maternity pay, medical insurance, pension insurance, occupational injury insurance, unemployment insurance, and maternity insurance) and the impacts of social security on medical reimbursement rejection. The zero-inflated negative binomial regression model (ZINB regression) was adopted to explore the relationship between socioeconomic factors and hospital visits among the rural migrant workers with social security. The study population consisted of 848 rural migrant workers with high income who were young and middle-aged, low-educated, and covered by social security. Reimbursement rejection and abusive supervision for the rural migrant workers were observed. Logistic regression analysis showed that there were significant associations between socioeconomic factors and social security. ZINB regression showed that there were significant associations between socioeconomic factors and hospital visits among the rural migrant workers.
Also, several dimensions of social security had significant associations with reimbursement rejections. This study showed that social security inequity, medical inequity, and reimbursement inequity happened to the rural migrant workers simultaneously. Future policy should strengthen health justice and enterprises' medical responsibilities to the employed rural migrant workers.

  13. Method selection and adaptation for distributed monitoring of infectious diseases for syndromic surveillance.

    PubMed

    Xing, Jian; Burkom, Howard; Tokars, Jerome

    2011-12-01

    Automated surveillance systems require statistical methods to recognize increases in visit counts that might indicate an outbreak. In prior work we presented methods to enhance the sensitivity of C2, a commonly used time series method. In this study, we compared the enhanced C2 method with five regression models. We used emergency department chief complaint data from the US CDC BioSense surveillance system, aggregated by city (total of 206 hospitals, 16 cities) during 5/2008-4/2009. Data for six syndromes (asthma, gastrointestinal, nausea and vomiting, rash, respiratory, and influenza-like illness) were used and were stratified by mean count (1-19, 20-49, ≥50 per day) into 14 syndrome-count categories. We compared the sensitivity for detecting single-day artificially-added increases in syndrome counts. Four modifications of the C2 time series method, and five regression models (two linear and three Poisson), were tested. A constant alert rate of 1% was used for all methods. Among the regression models tested, we found that a Poisson model controlling for the logarithm of total visits (i.e., visits both meeting and not meeting a syndrome definition), day of week, and 14-day time period was best. Among the 14 syndrome-count categories, the time series and regression methods produced approximately the same sensitivity (<5% difference) in six; in six categories, the regression method had higher sensitivity (range 6-14% improvement), and in two categories the time series method had higher sensitivity. When automated data are aggregated to the city level, a Poisson regression model that controls for total visits produces the best overall sensitivity for detecting artificially added visit counts. This improvement was achieved without increasing the alert rate, which was held constant at 1% for all methods. These findings will improve our ability to detect outbreaks in automated surveillance system data. Published by Elsevier Inc.
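    For intuition on how a fixed 1% alert rate translates into a count threshold, here is a stdlib-only sketch: given a baseline daily mean from a fitted model (a placeholder value below, not taken from the study), the alert cutoff is the smallest count whose Poisson upper-tail probability is at most 1%:

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam), by summing the pmf."""
    term = math.exp(-lam)
    total = term
    for i in range(1, k + 1):
        term *= lam / i
        total += term
    return total

def alert_threshold(lam, alert_rate=0.01):
    """Smallest count c with P(X > c) <= alert_rate under the baseline mean,
    so alerting on counts above c holds the expected alert rate constant."""
    c = 0
    while 1.0 - poisson_cdf(c, lam) > alert_rate:
        c += 1
    return c

baseline = 20.0                   # placeholder fitted daily mean
cutoff = alert_threshold(baseline)  # alert fires on counts above this
```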

  14. On the Inverse Mapping of the Formal Symplectic Groupoid of a Deformation Quantization

    NASA Astrophysics Data System (ADS)

    Karabegov, Alexander V.

    2004-10-01

    To each natural star product on a Poisson manifold $M$ we associate an antisymplectic involutive automorphism of the formal neighborhood of the zero section of the cotangent bundle of $M$. If $M$ is symplectic, this mapping is shown to be the inverse mapping of the formal symplectic groupoid of the star product. The construction of the inverse mapping involves modular automorphisms of the star product.

  15. What Do Test Scores Really Mean? A Latent Class Analysis of Danish Test Score Performance

    ERIC Educational Resources Information Center

    McIntosh, James; Munk, Martin D.

    2014-01-01

    Latent class Poisson count models are used to analyse a sample of Danish test score results from a cohort of individuals born in 1954-1955, tested in 1968, and followed until 2011. The procedure takes account of unobservable effects as well as excessive zeros in the data. We show that the test scores measure manifest or measured ability as it has…

  16. Estimation of Microbial Concentration in Food Products from Qualitative, Microbiological Test Data with the MPN Technique.

    PubMed

    Fujikawa, Hiroshi

    2017-01-01

    Microbial concentration in samples of a food product lot has been generally assumed to follow the log-normal distribution in food sampling, but this distribution cannot accommodate the concentration of zero. In the present study, first, a probabilistic study with the most probable number (MPN) technique was done for a target microbe present at a low (or zero) concentration in food products. Namely, based on the number of target pathogen-positive samples in the total samples of a product found by a qualitative, microbiological examination, the concentration of the pathogen in the product was estimated by means of the MPN technique. The effects of the sample size and the total sample number of a product were then examined. Second, operating characteristic (OC) curves for the concentration of a target microbe in a product lot were generated on the assumption that the concentration of a target microbe could be expressed with the Poisson distribution. OC curves for Salmonella and Cronobacter sakazakii in powdered formulae for infants and young children were successfully generated. The present study suggested that the MPN technique and the Poisson distribution would be useful for qualitative microbiological test data analysis for a target microbe whose concentration in a lot is expected to be low.
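    The single-dilution MPN logic the abstract relies on (zero counts allowed) reduces to a one-line estimator under the Poisson assumption; the sample numbers below are hypothetical:

```python
import math

def mpn_single_dilution(n_samples, n_positive, sample_size_g):
    """Single-dilution MPN estimate: with cells Poisson-distributed at
    concentration c, P(a sample of size v tests negative) = exp(-c*v),
    so c_hat = -ln(negatives/total) / v."""
    n_negative = n_samples - n_positive
    if n_negative == 0:
        raise ValueError("all samples positive: MPN estimate is unbounded")
    return -math.log(n_negative / n_samples) / sample_size_g

# Hypothetical lot: 3 positive qualitative tests among 20 samples of 25 g each
conc = mpn_single_dilution(20, 3, 25.0)   # estimated cells per gram
```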

  17. Synchrotron x-ray imaging of pulmonary alveoli in respiration in live intact mice

    NASA Astrophysics Data System (ADS)

    Chang, Soeun; Kwon, Namseop; Kim, Jinkyung; Kohmura, Yoshiki; Ishikawa, Tetsuya; Rhee, Chin Kook; Je, Jung Ho; Tsuda, Akira

    2015-03-01

    Despite nearly a half century of studies, it has not been fully understood how pulmonary alveoli, the elementary gas exchange units in mammalian lungs, inflate and deflate during respiration. Understanding alveolar dynamics is crucial for treating patients with pulmonary diseases. In-vivo, real-time visualization of the alveoli during respiration has been hampered by active lung movement. Previous studies have been therefore limited to alveoli at lung apices or subpleural alveoli under open thorax conditions. Here we report direct and real-time visualization of alveoli of live intact mice during respiration using tracking X-ray microscopy. Our studies, for the first time, determine the alveolar size of normal mice in respiration without positive end expiratory pressure as 58 +/- 14 (mean +/- s.d.) μm on average, accurately measured in the lung bases as well as the apices. Individual alveoli of normal lungs clearly show heterogeneous inflation from zero to ~25% (6.7 +/- 4.7% (mean +/- s.d.)) in size. The degree of inflation is higher in the lung bases (8.7 +/- 4.3% (mean +/- s.d.)) than in the apices (5.7 +/- 3.2% (mean +/- s.d.)). The fraction of the total tidal volume allocated for alveolar inflation is 34 +/- 3.8% (mean +/- s.e.m). This study contributes to the better understanding of alveolar dynamics and helps to develop potential treatment options for pulmonary diseases.

  18. Bayesian inference and assessment for rare-event bycatch in marine fisheries: a drift gillnet fishery case study.

    PubMed

    Martin, Summer L; Stohs, Stephen M; Moore, Jeffrey E

    2015-03-01

    Fisheries bycatch is a global threat to marine megafauna. Environmental laws require bycatch assessment for protected species, but this is difficult when bycatch is rare. Low bycatch rates, combined with low observer coverage, may lead to biased, imprecise estimates when using standard ratio estimators. Bayesian model-based approaches incorporate uncertainty, produce less volatile estimates, and enable probabilistic evaluation of estimates relative to management thresholds. Here, we demonstrate a pragmatic decision-making process that uses Bayesian model-based inferences to estimate the probability of exceeding management thresholds for bycatch in fisheries with < 100% observer coverage. Using the California drift gillnet fishery as a case study, we (1) model rates of rare-event bycatch and mortality using Bayesian Markov chain Monte Carlo estimation methods and 20 years of observer data; (2) predict unobserved counts of bycatch and mortality; (3) infer expected annual mortality; (4) determine probabilities of mortality exceeding regulatory thresholds; and (5) classify the fishery as having low, medium, or high bycatch impact using those probabilities. We focused on leatherback sea turtles (Dermochelys coriacea) and humpback whales (Megaptera novaeangliae). Candidate models included Poisson or zero-inflated Poisson likelihood, fishing effort, and a bycatch rate that varied with area, time, or regulatory regime. Regulatory regime had the strongest effect on leatherback bycatch, with the highest levels occurring prior to a regulatory change. Area had the strongest effect on humpback bycatch. Cumulative bycatch estimates for the 20-year period were 104-242 leatherbacks (52-153 deaths) and 6-50 humpbacks (0-21 deaths). The probability of exceeding a regulatory threshold under the U.S. Marine Mammal Protection Act (Potential Biological Removal, PBR) of 0.113 humpback deaths was 0.58, warranting a "medium bycatch impact" classification of the fishery. 
No PBR thresholds exist for leatherbacks, but the probability of exceeding an anticipated level of two deaths per year, stated as part of a U.S. Endangered Species Act assessment process, was 0.0007. The approach demonstrated here would allow managers to objectively and probabilistically classify fisheries with respect to bycatch impacts on species that have population-relevant mortality reference points, and declare with a stipulated level of certainty that bycatch did or did not exceed estimated upper bounds.
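    A zero-inflated Poisson bycatch process, and a Monte Carlo exceedance probability of the kind compared against PBR-style thresholds, can be sketched as follows (all rates and the threshold are invented, not the paper's estimates):

```python
import math
import random

random.seed(42)

def zip_draw(pi_zero, lam):
    """Zero-inflated Poisson draw: a structural zero with probability pi_zero,
    otherwise an ordinary Poisson(lam) count (inverse-CDF sampling)."""
    if random.random() < pi_zero:
        return 0
    u = random.random()
    k, p = 0, math.exp(-lam)
    cdf = p
    while u > cdf:
        k += 1
        p *= lam / k
        cdf += p
    return k

# Invented rates: 95% of observed sets have no bycatch; the rest are Poisson(2)
annual_deaths = [sum(zip_draw(0.95, 2.0) for _ in range(500))  # 500 sets/year
                 for _ in range(2000)]                         # simulated years
threshold = 60                                                 # hypothetical limit
p_exceed = sum(d > threshold for d in annual_deaths) / len(annual_deaths)
```

    The full analysis instead samples the ZIP parameters from their posterior, so `p_exceed` integrates over parameter uncertainty rather than fixing the rates.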

  19. Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.

    PubMed

    Ritz, Christian; Van der Vliet, Leana

    2009-09-01

    The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions (variance homogeneity and normality) that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable only is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the deprecation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
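    The Box-Cox transformation mentioned here is simple to state; a minimal sketch (the lambda value is illustrative, not taken from the paper):

```python
import math

def box_cox(y, lam):
    """Box-Cox power transform of a positive response; the lam -> 0 limit
    is the natural log."""
    if abs(lam) < 1e-12:
        return math.log(y)
    return (y ** lam - 1.0) / lam

# lam = 0.5 (a square-root-like transform) often tames variance that grows
# with the mean, as at the severe-effect concentrations described above
vals = [1.0, 4.0, 9.0, 16.0]
transformed = [box_cox(v, 0.5) for v in vals]
```

    In practice lambda is chosen by profiling the likelihood over a grid rather than fixed in advance.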

  20. Inflation from Minkowski space

    DOE PAGES

    Pirtskhalava, David; Santoni, Luca; Trincherini, Enrico; ...

    2014-12-23

    Here, we propose a class of scalar models that, once coupled to gravity, lead to cosmologies that smoothly and stably connect an inflationary quasi-de Sitter universe to a low, or even zero-curvature, maximally symmetric spacetime in the asymptotic past, strongly violating the null energy condition (Ḣ ≫ H²) at intermediate times. The models are deformations of the conformal galileon Lagrangian and are therefore based on symmetries, both exact and approximate, that ensure the quantum robustness of the whole picture. The resulting cosmological backgrounds can be viewed as regularized extensions of the galilean genesis scenario, or, equivalently, as ‘early-time-complete’ realizations of inflation. The late-time inflationary dynamics possesses phenomenologically interesting properties: it can produce a large tensor-to-scalar ratio within the regime of validity of the effective field theory and can lead to sizeable equilateral nongaussianities.

  1. Impact of multicollinearity on small sample hydrologic regression models

    NASA Astrophysics Data System (ADS)

    Kroll, Charles N.; Song, Peter

    2013-06-01

    Often hydrologic regression models are developed with ordinary least squares (OLS) procedures. The use of OLS with highly correlated explanatory variables produces multicollinearity, which creates highly sensitive parameter estimators with inflated variances and improper model selection. It is not clear how to best address multicollinearity in hydrologic regression models. Here a Monte Carlo simulation is developed to compare four techniques to address multicollinearity: OLS, OLS with variance inflation factor screening (VIF), principal component regression (PCR), and partial least squares regression (PLS). The performance of these four techniques was observed for varying sample sizes, correlation coefficients between the explanatory variables, and model error variances consistent with hydrologic regional regression models. The negative effects of multicollinearity are magnified at smaller sample sizes, higher correlations between the variables, and larger model error variances (smaller R2). The Monte Carlo simulation indicates that if the true model is known, multicollinearity is present, and the estimation and statistical testing of regression parameters are of interest, then PCR or PLS should be employed. If the model is unknown, or if the interest is solely in model predictions, it is recommended that OLS be employed, since using more complicated techniques did not produce any improvement in model performance. A leave-one-out cross-validation case study was also performed using low-streamflow data sets from the eastern United States. Results indicate that OLS with stepwise selection generally produces models across study regions with varying levels of multicollinearity that are as good as biased regression techniques such as PCR and PLS.
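    The variance inflation factor behind the VIF screening compared here can be computed directly. A NumPy sketch on synthetic data (the 0.95 coefficient simply manufactures near-collinearity):

```python
import numpy as np

def vif(X):
    """Variance inflation factor per column: VIF_j = 1 / (1 - R2_j), where
    R2_j comes from an OLS regression of column j on the remaining columns."""
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        ss_tot = float((y - y.mean()) @ (y - y.mean()))
        rss = float(resid @ resid)
        out.append(ss_tot / rss)   # identical to 1 / (1 - R2_j)
    return out

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = 0.95 * x1 + 0.10 * rng.normal(size=200)   # nearly collinear with x1
x3 = rng.normal(size=200)
vifs = vif(np.column_stack([x1, x2, x3]))      # expect large VIFs for x1, x2
```

    A common screening rule drops or combines variables whose VIF exceeds a cutoff such as 5 or 10.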

  2. The effects of ion adsorption on the potential of zero charge and the differential capacitance of charged aqueous interfaces

    NASA Astrophysics Data System (ADS)

    Uematsu, Yuki; Netz, Roland R.; Bonthuis, Douwe Jan

    2018-02-01

    Using a box profile approximation for the non-electrostatic surface adsorption potentials of anions and cations, we calculate the differential capacitance of aqueous electrolyte interfaces from a numerical solution of the Poisson-Boltzmann equation, including steric interactions between the ions and an inhomogeneous dielectric profile. Preferential adsorption of the positive (negative) ion shifts the minimum of the differential capacitance to positive (negative) surface potential values. The trends are similar for the potential of zero charge; however, the potential of zero charge does not correspond to the minimum of the differential capacitance in the case of asymmetric ion adsorption, contrary to the assumption commonly used to determine the potential of zero charge. Our model can be used to obtain more accurate estimates of ion adsorption properties from differential capacitance or electrocapillary measurements. Asymmetric ion adsorption also affects the relative heights of the characteristic maxima in the differential capacitance curves as a function of the surface potential, but even for strong adsorption potentials the effect is small, making it difficult to reliably determine the adsorption properties from the peak heights.
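    As a baseline for the adsorption-induced shifts described above: the classical Gouy-Chapman differential capacitance of a symmetric electrolyte with no adsorption and no steric effects has its minimum exactly at zero surface potential. A stdlib sketch with textbook constants (10 mM 1:1 electrolyte; illustrative, not the paper's full numerical model):

```python
import math

# Gouy-Chapman differential capacitance of a 1:1 electrolyte with NO ion
# adsorption -- the textbook limit in which the capacitance minimum and the
# potential of zero charge coincide at V = 0.
eps0, eps_r = 8.854e-12, 78.5            # vacuum permittivity (F/m), water
e, kB, T, z = 1.602e-19, 1.381e-23, 298.0, 1
NA = 6.022e23
n0 = 0.01 * 1000 * NA                    # 10 mM salt, ions per m^3
kappa = math.sqrt(2 * n0 * (z * e) ** 2 / (eps0 * eps_r * kB * T))

def c_diff(V):
    """Differential capacitance (F/m^2) at surface potential V (volts)."""
    return eps0 * eps_r * kappa * math.cosh(z * e * V / (2 * kB * T))

caps = [c_diff(v) for v in (-0.05, 0.0, 0.05)]   # symmetric about V = 0
```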

  3. Using multi-year national survey cohorts for period estimates: an application of weighted discrete Poisson regression for assessing annual national mortality in US adults with and without diabetes, 2000-2006.

    PubMed

    Cheng, Yiling J; Gregg, Edward W; Rolka, Deborah B; Thompson, Theodore J

    2016-12-15

    Monitoring national mortality among persons with a disease is important to guide and evaluate progress in disease control and prevention. However, a method to estimate nationally representative annual mortality among persons with and without diabetes in the United States does not currently exist. The aim of this study is to demonstrate use of weighted discrete Poisson regression on national survey mortality follow-up data to estimate annual mortality rates among adults with diabetes. To estimate mortality among US adults with diabetes, we applied a weighted discrete time-to-event Poisson regression approach with post-stratification adjustment to national survey data. Adult participants aged 18 or older with and without diabetes in the National Health Interview Survey 1997-2004 were followed up through 2006 for mortality status. We estimated mortality among all US adults, and by self-reported diabetes status at baseline. The time-varying covariates used were age and calendar year. Mortality among all US adults was validated using direct estimates from the National Vital Statistics System (NVSS). Using our approach, annual all-cause mortality among all US adults ranged from 8.8 deaths per 1,000 person-years (95% confidence interval [CI]: 8.0, 9.6) in year 2000 to 7.9 (95% CI: 7.6, 8.3) in year 2006. By comparison, the NVSS estimates ranged from 8.6 to 7.9 (correlation = 0.94). All-cause mortality among persons with diabetes decreased from 35.7 (95% CI: 28.4, 42.9) in 2000 to 31.8 (95% CI: 28.5, 35.1) in 2006. After adjusting for age, sex, and race/ethnicity, persons with diabetes had 2.1 (95% CI: 2.01, 2.26) times the risk of death of those without diabetes. Period-specific national mortality can be estimated for people with and without a chronic condition using national surveys with mortality follow-up and a discrete time-to-event Poisson regression approach with post-stratification adjustment.
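    The discrete time-to-event structure amounts to expanding each respondent into weighted person-year records. A toy sketch with invented weights and follow-up (the actual analysis adds covariates, Poisson regression, and post-stratification):

```python
# Each respondent contributes survey-weighted person-years for every calendar
# year observed, plus a weighted death in the year of death. Records invented.
subjects = [
    # (survey weight, entry year, last year observed, died in last year?)
    (1500.0, 2000, 2006, False),
    (2200.0, 2001, 2004, True),
    (1800.0, 2000, 2002, True),
]

def annual_rate(records, year):
    """Weighted deaths per 1,000 weighted person-years in a given year."""
    deaths = pyears = 0.0
    for w, entry, last, died in records:
        if entry <= year <= last:
            pyears += w
            if died and year == last:
                deaths += w
    return 1000.0 * deaths / pyears if pyears else float("nan")

rate_2002 = annual_rate(subjects, 2002)   # deaths per 1,000 person-years
```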

  4. Year-round monitoring reveals prevalence of fatal bird-window collisions at the Virginia Tech Corporate Research Center

    PubMed Central

    Barton, Christine M.; Zirkle, Keith W.; Greene, Caitlin F.; Newman, Kara B.

    2018-01-01

    Collisions with glass are a serious threat to avian life and are estimated to kill hundreds of millions of birds per year in the United States. We monitored 22 buildings at the Virginia Tech Corporate Research Center (VTCRC) in Blacksburg, Virginia, for collision fatalities from October 2013 through May 2015 and explored possible effects exerted by glass area and surrounding land cover on avian mortality. We documented 240 individuals representing 55 identifiable species that died due to collisions with windows at the VTCRC. The relative risk of fatal collisions at all buildings over the study period was estimated using a Bayesian hierarchical zero-inflated Poisson model, adjusting for percentage of tree and lawn cover within 50 m of buildings, as well as for glass area. We found significant relationships between fatalities and surrounding lawn area (relative risk: 0.96, 95% credible interval: 0.93, 0.98) as well as glass area on buildings (RR: 1.30, 95% CI [1.05–1.65]). The model also found a moderately significant relationship between fatal collisions and the percent land cover of ornamental trees surrounding buildings (RR = 1.02, 95% CI [1.00–1.05]). Every building surveyed had at least one recorded collision death. Our findings indicate that birds collide with VTCRC windows during the summer breeding season in addition to spring and fall migration. The Ruby-throated Hummingbird (Archilochus colubris) was the most common window collision species and accounted for 10% of deaths. Though research has identified various correlates with fatal bird-window collisions, such studies rarely culminate in mitigation. We hope our study brings attention, and ultimately action, to address this significant threat to birds at the VTCRC and elsewhere. PMID:29637021

  5. Repercussions of mild diabetes on pregnancy in Wistar rats and on the fetal development

    PubMed Central

    2010-01-01

    Background Experimental models are necessary to elucidate diabetes pathophysiological mechanisms not yet understood in humans. Objective: To evaluate the repercussions of mild diabetes, considering two methodologies, on the pregnancy of Wistar rats and on the development of their offspring. Methods In the 1st induction, female offspring were distributed into two experimental groups: Group streptozotocin (STZ, n = 67): received the β-cytotoxic agent (100 mg STZ/kg body weight - sc) on the 1st day of life; and Non-diabetic Group (ND, n = 14): received the vehicle in a similar time period. In adult life, the animals were mated. After a positive diagnosis of pregnancy (0), female rats from group STZ presenting glycemia lower than 120 mg/dL received a further 20 mg STZ/kg (ip) on day 7 of pregnancy (2nd induction). The female rats with glycemia higher than 120 mg/dL were discarded because they reproduced results already found in the literature. On the mornings of days 0, 7, 14 and 21 of pregnancy, glycemia was determined. At day 21 of pregnancy (at term), the female rats were anesthetized and killed for maternal reproductive performance and fetal development analysis. The data were analyzed using Student-Newman-Keuls, Chi-square and Zero-inflated Poisson (ZIP) Tests (p < 0.05). Results STZ rats presented increased rates of pre- (STZ = 22.0%; ND = 5.1%) and post-implantation losses (STZ = 26.1%; ND = 5.7%), reduced rates of fetuses with appropriate weight for gestational age (STZ = 66%; ND = 93%) and reduced degree of development (ossification sites). Conclusion Mild diabetes had a negative impact on maternal reproductive performance and caused intrauterine growth restriction and impaired fetal development. PMID:20416073

  6. Risk factors related to Toxoplasma gondii seroprevalence in indoor-housed Dutch dairy goats.

    PubMed

    Deng, Huifang; Dam-Deisz, Cecile; Luttikholt, Saskia; Maas, Miriam; Nielen, Mirjam; Swart, Arno; Vellema, Piet; van der Giessen, Joke; Opsteegh, Marieke

    2016-02-01

    Toxoplasma gondii can cause disease in goats, but also has an impact on human health through food-borne transmission. Our aims were to determine the seroprevalence of T. gondii infection in indoor-housed Dutch dairy goats and to identify the risk factors related to T. gondii seroprevalence. Fifty-two out of ninety approached farmers with indoor-kept goats (58%) participated by answering a standardized questionnaire and contributing 32 goat blood samples each. Serum samples were tested for T. gondii SAG1 antibodies by ELISA and results showed that the frequency distribution of the log10-transformed OD-values fitted well with a binary mixture of a shifted gamma and a shifted reflected gamma distribution. The overall animal seroprevalence was 13.3% (95% CI: 11.7–14.9%), and at least one seropositive animal was found on 61.5% (95% CI: 48.3–74.7%) of the farms. To evaluate potential risk factors on herd level, three modeling strategies (Poisson, negative binomial and zero-inflated) were compared. The negative binomial model fitted the data best, with the number of cats (1–4 cats: IR: 2.6, 95% CI: 1.1–6.5; ≥5 cats: IR: 14.2, 95% CI: 3.9–51.1) and mean animal age (IR: 1.5, 95% CI: 1.1–2.1) related to herd positivity. In conclusion, the ELISA test was 100% sensitive and specific based on binary mixture analysis. T. gondii infection is prevalent in indoor housed Dutch dairy goats but at a lower overall animal level seroprevalence than outdoor farmed goats in other European countries, and cat exposure is an important risk factor.

  7. Individual- and community-level correlates of cigarette-smoking trajectories from age 13 to 32 in a U.S. population-based sample.

    PubMed

    Fuemmeler, Bernard; Lee, Chien-Ti; Ranby, Krista W; Clark, Trenette; McClernon, F Joseph; Yang, Chongming; Kollins, Scott H

    2013-09-01

    Characterizing smoking behavior is important for informing etiologic models and targeting prevention efforts. This study explored the effects of both individual- and community-level variables in predicting cigarette use vs. non-use and level of use among adolescents as they transition into adulthood. Data on 14,779 youths (53% female) were drawn from the National Longitudinal Study of Adolescent Health (Add Health); a nationally representative longitudinal cohort. A cohort sequential design allowed for examining trajectories of smoking typologies from age 13 to 32 years. Smoking trajectories were evaluated by using a zero-inflated Poisson (ZIP) latent growth analysis and latent class growth analysis modeling approach. Significant relationships emerged between both individual- and community-level variables and smoking outcomes. Maternal and peer smoking predicted increases in smoking over development and were associated with a greater likelihood of belonging to any of the four identified smoking groups versus Non-Users. Conduct problems and depressive symptoms during adolescence were related to cigarette use versus non-use. State-level prevalence of adolescent smoking was related to greater cigarette use during adolescence. Individual- and community-level variables that distinguish smoking patterns within the population aid in understanding cigarette use versus non-use and the quantity of cigarette use into adulthood. Our findings suggest that efforts to prevent cigarette use would benefit from attention to both parental and peer smoking and individual well-being. Future work is needed to better understand the role of variables in the context of multiple levels (individual and community-level) on smoking trajectories. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  8. Assessing the mandatory bovine abortion notification system in France using unilist capture-recapture approach.

    PubMed

    Bronner, Anne; Hénaux, Viviane; Vergne, Timothée; Vinard, Jean-Luc; Morignat, Eric; Hendrikx, Pascal; Calavas, Didier; Gay, Emilie

    2013-01-01

    The mandatory bovine abortion notification system in France aims to detect as soon as possible any resurgence of bovine brucellosis. However, under-reporting seems to be a major limitation of this system. We used a unilist capture-recapture approach to assess the sensitivity, i.e. the proportion of farmers who reported at least one abortion among those who detected such events, and representativeness of the system during 2006-2011. We implemented a zero-inflated Poisson model to estimate the proportion of farmers who detected at least one abortion, and among them, the proportion of farmers not reporting. We also applied a hurdle model to evaluate the effect of factors influencing the notification process. We found that the overall surveillance sensitivity was about 34%, and was higher in beef than dairy cattle farms. The observed increase in the proportion of notifying farmers from 2007 to 2009 resulted from an increase in the surveillance sensitivity in 2007/2008 and an increase in the proportion of farmers who detected at least one abortion in 2008/2009. These patterns suggest a rise in farmers' awareness in 2007/2008 when the Bluetongue Virus (BTV) was detected in France, followed by an increase in the number of abortions in 2008/2009 as BTV spread across the country. Our study indicated a lack of sensitivity of the mandatory bovine abortion notification system, raising concerns about the ability to detect brucellosis outbreaks early. With the increasing need to survey the zoonotic Rift Valley Fever and Q fever diseases that may also cause bovine abortions, our approach is of primary interest for animal health stakeholders to develop information programs to increase abortion notifications. Our framework combining hurdle and ZIP models may also be applied to estimate the completeness of other clinical surveillance systems.
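    The separation of structural zeros (farmers who never report) from ordinary Poisson zeros that the ZIP model performs can be illustrated with a crude method-of-moments fit on simulated notification counts (the 60% and 2.5 values are invented, not the study's estimates, and the real analysis uses maximum likelihood):

```python
import math
import random

random.seed(7)

def zip_moments(counts):
    """Method-of-moments ZIP fit: with mean m = (1-pi)*lam and variance
    s2 = (1-pi)*lam*(1+pi*lam), solve lam = (s2 + m^2 - m)/m and
    pi = (s2 - m)/(s2 + m^2 - m)."""
    n = len(counts)
    m = sum(counts) / n
    s2 = sum((c - m) ** 2 for c in counts) / n
    lam = (s2 + m * m - m) / m
    pi = (s2 - m) / (s2 + m * m - m)
    return pi, lam

def draw():
    """60% structural zeros (never-reporters), else Poisson(2.5) counts."""
    if random.random() < 0.6:
        return 0
    u = random.random()
    k, p = 0, math.exp(-2.5)
    cdf = p
    while u > cdf:
        k += 1
        p *= 2.5 / k
        cdf += p
    return k

counts = [draw() for _ in range(20000)]
pi_hat, lam_hat = zip_moments(counts)   # should recover roughly 0.6 and 2.5
```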

  10. A big data approach to the development of mixed-effects models for seizure count data.

    PubMed

    Tharayil, Joseph J; Chiang, Sharon; Moss, Robert; Stern, John M; Theodore, William H; Goldenholz, Daniel M

    2017-05-01

    Our objective was to develop a generalized linear mixed model for predicting seizure count that is useful in the design and analysis of clinical trials. This model also may benefit the design and interpretation of seizure-recording paradigms. Most existing seizure count models do not include children, and there is currently no consensus regarding the most suitable model that can be applied to children and adults. Therefore, an additional objective was to develop a model that accounts for both adult and pediatric epilepsy. Using data from SeizureTracker.com, a patient-reported seizure diary tool with >1.2 million recorded seizures across 8 years, we evaluated the appropriateness of Poisson, negative binomial, zero-inflated negative binomial, and modified negative binomial models for seizure count data based on minimization of the Bayesian information criterion. Generalized linear mixed-effects models were used to account for demographic and etiologic covariates and for autocorrelation structure. Holdout cross-validation was used to evaluate predictive accuracy in simulating seizure frequencies. For both adults and children, we found that a negative binomial model with autocorrelation over 1 day was optimal. Using holdout cross-validation, the proposed model was found to provide accurate simulation of seizure counts for patients with up to four seizures per day. The optimal model can be used to generate more realistic simulated patient data with very few input parameters. The availability of a parsimonious, realistic virtual patient model can be of great utility in simulations of phase II/III clinical trials, epilepsy monitoring units, outpatient biosensors, and mobile Health (mHealth) applications. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.
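
    The negative binomial family selected here can be sketched via its NB2 mass function; `mu` and `r` below are hypothetical values for illustration, not fitted SeizureTracker.com parameters:

    ```python
    from math import exp, lgamma, log

    def nb_pmf(k, mu, r):
        """NB2 negative binomial mass function with mean mu and dispersion r;
        the variance, mu + mu**2 / r, exceeds the Poisson variance mu."""
        logp = (lgamma(k + r) - lgamma(r) - lgamma(k + 1)
                + r * log(r / (r + mu)) + k * log(mu / (r + mu)))
        return exp(logp)

    mu, r = 0.3, 0.8   # illustrative daily seizure mean and dispersion
    mean = sum(k * nb_pmf(k, mu, r) for k in range(200))
    ```

    The dispersion parameter `r` is what lets the model absorb the overdispersion that a plain Poisson cannot.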

  11. Effects of Barometric Pressure and Temperature on Acute Ischemic Stroke Hospitalization in Augusta, GA.

    PubMed

    Guan, Weihua; Clay, Sandra J; Sloan, Gloria J; Pretlow, Lester G

    2018-06-24

    Several studies worldwide have demonstrated significant relationships between meteorological parameters and stroke events. However, authors often reported discordant effects of both barometric pressure and air temperature on stroke occurrence. The present study investigated whether there was an association between weather parameters (barometric pressure and temperature) and ischemic stroke hospitalization. The aim of the study was to find out whether daily barometric pressure may be used as a prognostic variable to evaluate the workload change of a neurological intensive care unit. We conducted a retrospective review study in which we collected the independent (barometric pressure and temperature) and dependent variables (stroke hospitalization) every 24 h for the periods 10/1/2016-4/30/2017 at Augusta University Medical Center of Augusta, GA. We analyzed the data with zero-inflated Poisson model to assess the relationship between the barometric pressure, temperature, and daily stroke hospitalization. The results showed that there was a significantly correlation between daily barometric pressure variation and daily stroke hospitalization, especially on elder male patients (≥ 65). Stroke events were more likely to occur in the patients with risk factors than in those without risk factors when exposed to barometric pressure and temperature changes. Decreased barometric pressure and increased temperature were associated with increased daily stroke hospitalization. Furthermore, there was a potential delayed effect of increased stroke events after cold temperature exposure. Barometric pressure and temperature changes over the preceding 24 h are associated with daily stroke hospitalization. These findings may enhance our understanding of relationship between stroke and weather and maybe used in the development of public health strategies to minimize the weather-related stroke risk.

  12. Predicting stem borer density in maize using RapidEye data and generalized linear models

    NASA Astrophysics Data System (ADS)

    Abdel-Rahman, Elfatih M.; Landmann, Tobias; Kyalo, Richard; Ong'amo, George; Mwalusepo, Sizah; Sulieman, Saad; Ru, Bruno Le

    2017-05-01

    Average maize yield in eastern Africa is 2.03 t ha-1, compared to a global average of 6.06 t ha-1, due to biotic and abiotic constraints. Amongst the biotic production constraints in Africa, stem borers are the most injurious. In eastern Africa, maize yield losses due to stem borers are currently estimated at between 12% and 21% of total production. The objective of the present study was to explore the potential of RapidEye spectral data for assessing stem borer larva densities in maize fields at two study sites in Kenya. RapidEye images were acquired for the Bomet (western Kenya) test site on 9 December 2014 and 27 January 2015, and for Machakos (eastern Kenya) a RapidEye image was acquired on 3 January 2015. Five RapidEye spectral bands as well as 30 spectral vegetation indices (SVIs) were utilized to predict per-field maize stem borer larva densities using generalized linear models (GLMs), assuming Poisson ('Po') and negative binomial ('NB') distributions. Root mean square error (RMSE) and ratio of prediction to deviation (RPD) statistics were used to assess the models' performance in a leave-one-out cross-validation approach. The zero-inflated NB ('ZINB') models outperformed the 'NB' models, and stem borer larva densities could only be predicted during the mid growing season, in December and early January at the two study sites respectively (RMSE = 0.69-1.06 and RPD = 8.25-19.57). Overall, all models performed similarly whether all 30 SVIs (non-nested) or only the significant (nested) SVIs were used. The models developed could improve decision making for controlling maize stem borers within integrated pest management (IPM) interventions.
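
    The leave-one-out RMSE/RPD evaluation used here can be sketched generically. The mean predictor below stands in for the fitted GLM (an assumption for illustration only), and RPD is taken as the ratio of the sample standard deviation of the response to the prediction RMSE:

    ```python
    from math import sqrt

    def loocv_rmse_rpd(y, fit_predict):
        """Leave-one-out CV: refit on n-1 points, predict the held-out one.
        Returns (RMSE, RPD), with RPD = sample SD of y / RMSE."""
        n = len(y)
        errs = []
        for i in range(n):
            train = y[:i] + y[i+1:]
            errs.append(y[i] - fit_predict(train))
        rmse = sqrt(sum(e * e for e in errs) / n)
        mean = sum(y) / n
        sd = sqrt(sum((v - mean) ** 2 for v in y) / (n - 1))
        return rmse, sd / rmse

    # toy stand-in for the GLM: predict the training-set mean
    rmse, rpd = loocv_rmse_rpd([2, 4, 3, 5, 4, 3], lambda t: sum(t) / len(t))
    ```

    In a real analysis `fit_predict` would refit the Poisson/NB GLM on each training fold; larger RPD values indicate predictions that are tight relative to the natural spread of the data.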

  13. Analyzing Seasonal Variations in Suicide With Fourier Poisson Time-Series Regression: A Registry-Based Study From Norway, 1969-2007.

    PubMed

    Bramness, Jørgen G; Walby, Fredrik A; Morken, Gunnar; Røislien, Jo

    2015-08-01

    Seasonal variation in the number of suicides has long been acknowledged. It has been suggested that this seasonality has declined in recent years, but studies have generally used statistical methods incapable of confirming this. We examined all suicides occurring in Norway during 1969-2007 (more than 20,000 suicides in total) to establish whether seasonality decreased over time. Fitting of additive Fourier Poisson time-series regression models allowed for formal testing of a possible linear decrease in seasonality, or a reduction at a specific point in time, while adjusting for a possible smooth nonlinear long-term change without having to categorize time into discrete yearly units. The models were compared using Akaike's Information Criterion and analysis of variance. A model with a seasonal pattern was significantly superior to a model without one. There was a reduction in seasonality during the period. The model assuming a linear decrease in seasonality and the model assuming a change at a specific point in time were both superior to a model assuming constant seasonality, thus confirming by formal statistical testing that the magnitude of the seasonality in suicides has diminished. The additive Fourier Poisson time-series regression model would also be useful for studying other temporal phenomena with seasonal components. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
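
    A Fourier Poisson regression enters sine/cosine harmonics additively in the log rate. A minimal sketch (coefficients illustrative, a 12-month period assumed):

    ```python
    from math import sin, cos, pi

    def fourier_terms(t, period=12.0, harmonics=2):
        """Seasonal covariates sin(2*pi*k*t/period) and cos(2*pi*k*t/period)
        for k = 1..harmonics, entered additively in the log rate of a
        Poisson regression."""
        row = []
        for k in range(1, harmonics + 1):
            ang = 2 * pi * k * t / period
            row += [sin(ang), cos(ang)]
        return row

    # illustrative log rate: smooth linear trend plus one seasonal harmonic
    def log_rate(t, beta0=2.0, trend=-0.001, a=0.15, b=0.05):
        s, c = fourier_terms(t, harmonics=1)
        return beta0 + trend * t + a * s + b * c
    ```

    Testing whether seasonality declines then amounts to letting the harmonic coefficients themselves vary (linearly, or with a change point) over time and comparing fits by AIC.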

  14. The effect of starspots on the radii of low-mass pre-main-sequence stars

    NASA Astrophysics Data System (ADS)

    Jackson, R. J.; Jeffries, R. D.

    2014-07-01

    A polytropic model is used to investigate the effects of dark photospheric spots on the evolution and radii of magnetically active, low-mass (M < 0.5 M⊙), pre-main-sequence (PMS) stars. Spots slow the contraction along Hayashi tracks and inflate the radii of PMS stars by a factor of (1 - β)^-N compared to unspotted stars of the same luminosity, where β is the equivalent covering fraction of dark starspots and N ≃ 0.45 ± 0.05. This is a much stronger inflation than predicted by Spruit & Weiss for main-sequence stars with the same β, where N ~ 0.2-0.3. These models have been compared to radii determined for very magnetically active K- and M-dwarfs in the young Pleiades and NGC 2516 clusters, and the radii of tidally locked, low-mass eclipsing binary components. The binary components and zero-age main-sequence K-dwarfs have radii inflated by ~10 per cent compared to an empirical radius-luminosity relation that is defined by magnetically inactive field dwarfs with interferometrically measured radii; low-mass M-type PMS stars, which are still on their Hayashi tracks, are inflated by up to ~40 per cent. If this were attributable to starspots alone, we estimate that an effective spot coverage of 0.35 < β < 0.51 is required. Alternatively, global inhibition of convective flux transport by dynamo-generated fields may play a role. However, we find greater consistency with the starspot models when comparing the loci of active young stars and inactive field stars in colour-magnitude diagrams, particularly for the highly inflated PMS stars, where the large, uniform temperature reduction required in globally inhibited convection models would cause the stars to be much redder than observed.
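
    The quoted inflation law is easy to evaluate directly. A sketch using the central exponent N ≈ 0.45 and the spot coverages quoted in the abstract:

    ```python
    def inflation_factor(beta, n=0.45):
        """Radius inflation (1 - beta)**-n for spot covering fraction beta."""
        return (1.0 - beta) ** -n

    # spot coverages from the abstract, with the central N = 0.45
    low = inflation_factor(0.35)    # ~1.21, i.e. ~21 per cent inflation
    high = inflation_factor(0.51)   # ~1.38, i.e. ~38 per cent inflation
    ```

    These values are consistent with the "up to ~40 per cent" inflation reported for the PMS stars.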

  15. Predictors of birth-related post-traumatic stress symptoms: secondary analysis of a cohort study.

    PubMed

    Furuta, Marie; Sandall, Jane; Cooper, Derek; Bick, Debra

    2016-12-01

    This study aimed to identify factors associated with birth-related post-traumatic stress symptoms during the early postnatal period. Secondary analysis was conducted using data from a prospective cohort study of 1824 women who gave birth in one large hospital in England. Post-traumatic stress symptoms were measured by the Impact of Event Scale at 6 to 8 weeks postpartum. Zero-inflated negative binomial regression models were developed for the analyses. Results showed that post-traumatic stress symptoms were more frequently observed in black women and in women who had a higher pre-pregnancy BMI compared to those with a lower BMI. Women who had a history of mental illness, as well as those who gave birth before arriving at the hospital, underwent an emergency caesarean section, or experienced severe maternal morbidity or neonatal complications, also showed more symptoms. Women's perceived control during labour and birth significantly reduced the effects of some risk factors. A higher level of perceived social support during the postnatal period also reduced the risk of post-traumatic stress symptoms. From the perspective of clinical practice, improving women's sense of control during labour and birth appears to be important, as does providing social support following the birth.

  16. Value-based purchasing and hospital acquired conditions: are we seeing improvement?

    PubMed

    Spaulding, Aaron; Zhao, Mei; Haley, D Rob

    2014-12-01

    To determine if the Value-Based Purchasing Performance Scoring system correlates with hospital acquired condition quality indicators. This study utilizes the following secondary data sources: the American Hospital Association (AHA) annual survey and the Centers for Medicare and Medicaid Services (CMS) Value-Based Purchasing and Hospital Acquired Conditions databases. Zero-inflated negative binomial regression was used to examine the effect of CMS total performance score on counts of hospital acquired conditions. Hospital structure variables including size, ownership, teaching status, payer mix, case mix, and location were utilized as control variables. The secondary data sources were merged into a single database using Stata 10. Total performance scores, which are used to determine if hospitals should receive incentive money, do not correlate well with quality outcomes in the form of hospital acquired conditions. Value-based purchasing does not appear to correlate with improved quality and patient safety as indicated by Hospital Acquired Condition (HAC) scores. This leads us to believe that either the total performance score does not measure what it should, or the quality outcome measurements do not reflect what the total performance score is intended to measure. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
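
    The zero-inflated negative binomial used in several of these records mixes a structural-zero component with an NB2 count component. A minimal sketch with illustrative parameters (not values from any of the studies):

    ```python
    from math import exp, lgamma, log

    def zinb_pmf(k, mu, r, pi):
        """Zero-inflated negative binomial: a structural zero with
        probability pi, otherwise NB2 with mean mu and dispersion r."""
        lognb = (lgamma(k + r) - lgamma(r) - lgamma(k + 1)
                 + r * log(r / (r + mu)) + k * log(mu / (r + mu)))
        nb = exp(lognb)
        return pi + (1 - pi) * nb if k == 0 else (1 - pi) * nb

    mu, r, pi = 1.5, 1.2, 0.3     # illustrative values only
    overall_mean = sum(k * zinb_pmf(k, mu, r, pi) for k in range(500))
    ```

    The overall mean is (1 - π)·μ, so the inflation component shrinks the marginal count mean while adding mass at zero.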

  17. Volunteerism: Social Network Dynamics and Education

    PubMed Central

    Ajrouch, Kristine J.; Antonucci, Toni C.; Webster, Noah J.

    2016-01-01

    Objectives. We examine how changes in social networks influence volunteerism through bridging (diversity) and bonding (spending time) mechanisms. We further investigate whether social network change substitutes or amplifies the effects of education on volunteerism. Methods. Data (n = 543) are drawn from a two-wave survey of Social Relations and Health over the Life Course (SRHLC). Zero-inflated negative binomial regressions were conducted to test competing hypotheses about how changes in social network characteristics alone and in conjunction with education level predict likelihood and frequency of volunteering. Results. Changes in social networks were associated with volunteerism: as the proportion of family members decreased and the average number of network members living within a one-hour drive increased over time, participants reported higher odds of volunteering. The substitution hypothesis was supported: social networks that exhibited more geographic proximity and greater contact frequency over time compensated for lower levels of education to predict volunteering more hours. Discussion. The dynamic role of social networks and the ways in which they may work through bridging and bonding to influence both likelihood and frequency of volunteering are discussed. The potential benefits of volunteerism in light of longer life expectancies and smaller families are also considered. PMID:25512570

  18. Hospital utilization outcome of an assertive outreach model for schizophrenic patients - results of a quasi-experimental study.

    PubMed

    Büchtemann, Dorothea; Kästner, Denise; Warnke, Ingeborg; Radisch, Jeanett; Baumgardt, Johanna; Giersberg, Steffi; Kleine-Budde, Katja; Moock, Jörn; Kawohl, Wolfram; Rössler, Wulf

    2016-07-30

    We assessed whether an Assertive Outreach (AO) program for patients with schizophrenia, implemented in German routine care in rural areas, reduces psychiatric hospital admissions and/or psychiatric hospital days. We conducted a quasi-experimental controlled study with five assessments over 12 months. Data collection included health care utilization (Client Sociodemographic and Service Receipt Inventory) and clinical parameters. The assessments took place in the practices of the psychiatrists. Admission incidence rates were calculated. For bivariate group comparisons we used U-tests, t-tests and chi-square tests; multivariate analysis was conducted using zero-inflated regression models. For hospital outcomes, data from 295 patients were analysed. No statistically significant differences between AO and treatment-as-usual (TAU) patients in terms of hospital admissions or hospital days were found. Overall hospital utilization was low (8%). No advantage of AO over TAU with respect to hospital utilization was found. However, a spill-over effect might have reduced hospital utilization in both groups. Further research should differentiate patient subgroups. Patient subgroups and spill-over effects appear to be key factors in explaining the presence or absence of effects and in drawing conclusions for mental health care delivery. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. Health services utilization of people having and not having a regular doctor in Canada.

    PubMed

    Thanh, Nguyen Xuan; Rapoport, John

    2017-04-01

    Canada's universal health insurance plan, which provides hospital and physician benefits, offers a natural experiment on whether continuity of care actually results in lower or higher utilization of services. The question we evaluate is whether Canadians who have a regular physician use more health resources than those who do not. Using two statistical methods, propensity score matching and zero-inflated negative binomial regression, we analyzed data from the 2010 and 2007/2008 Canadian Community Health Surveys separately to document differences between people self-reportedly having and not having a regular doctor in the utilization of general practitioner, specialist, and hospital services. The results showed, consistently for both statistical methods and both datasets, that people reporting a regular doctor used more healthcare services than a matched group of people reporting no regular doctor. For specialist and hospital utilization, the statistically significant differences were in the likelihood of any use but not in the number of specialist visits or hospital nights among users. Copyright © 2016 John Wiley & Sons, Ltd.
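
    Propensity score matching can be sketched as greedy 1:1 nearest-neighbour matching with a caliper. The scores below are invented for illustration; real analyses often match on the logit of the propensity score and use a caliper tied to its standard deviation:

    ```python
    def greedy_match(treated, control, caliper=0.05):
        """1:1 nearest-neighbour matching on propensity scores, without
        replacement; pairs farther apart than the caliper are dropped.
        Returns (treated_index, control_index) pairs."""
        available = dict(enumerate(control))
        pairs = []
        for ti, ps in enumerate(treated):
            if not available:
                break
            ci = min(available, key=lambda j: abs(available[j] - ps))
            if abs(available[ci] - ps) <= caliper:
                pairs.append((ti, ci))
                del available[ci]
        return pairs

    # hypothetical propensity scores for people with / without a regular doctor
    pairs = greedy_match([0.31, 0.62, 0.55], [0.30, 0.90, 0.57, 0.52])
    ```

    Outcome comparisons (e.g. counts of GP visits) are then made within the matched sample only.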

  20. Effects of early dental office visits on dental caries experience.

    PubMed

    Beil, Heather; Rozier, R Gary; Preisser, John S; Stearns, Sally C; Lee, Jessica Y

    2014-10-01

    We determined the association between timing of a first dentist office visit before age 5 years and dental disease in kindergarten. We used North Carolina Medicaid claims (1999-2006) linked to state oral health surveillance data to compare caries experience for kindergarten students (2005-2006) who had a visit before age 60 months (n=11,394) to derive overall exposure effects from a zero-inflated negative binomial regression model. We repeated the analysis separately for children who had preventive and tertiary visits. Children who had a visit at age 37 to 48 and 49 to 60 months had significantly less disease than children with a visit by age 24 months (incidence rate ratio [IRR]=0.88; 95% confidence interval [CI]=0.81, 0.95; IRR=0.75; 95% CI=0.69, 0.82, respectively). Disease status did not differ between children who had a tertiary visit by age 24 months and other children. Medicaid-enrolled children in our study followed an urgent care type of utilization, and access to dental care was limited. Children at high risk for dental disease should be given priority for a preventive dental visit before age 3 years.

  1. Spatial distribution of citizen science casuistic observations for different taxonomic groups.

    PubMed

    Tiago, Patrícia; Ceia-Hasse, Ana; Marques, Tiago A; Capinha, César; Pereira, Henrique M

    2017-10-16

    Opportunistic citizen science databases are becoming an important way of gathering information on species distributions. These data are temporally and spatially dispersed and could have limitations regarding biases in the distribution of the observations in space and/or time. In this work, we test the influence of landscape variables on the distribution of citizen science observations for eight taxonomic groups. We use data collected through a Portuguese citizen science database (biodiversity4all.org). We use a zero-inflated negative binomial regression to model the distribution of observations as a function of a set of variables representing the landscape features plausibly influencing the spatial distribution of the records. Results suggest that the density of paths is the most important variable, having a statistically significant positive relationship with the number of observations for seven of the eight taxa considered. Wetland coverage was also identified as having a significant positive relationship for birds, amphibians and reptiles, and mammals. Our results highlight that the distribution of species observations in citizen science projects is spatially biased. Higher frequency of observations is driven largely by accessibility and by the presence of water bodies. We conclude that efforts are required to increase the spatial evenness of sampling effort from volunteers.

  2. Associations between sensitivity to punishment, sensitivity to reward, and gambling.

    PubMed

    Gaher, Raluca M; Hahn, Austin M; Shishido, Hanako; Simons, Jeffrey S; Gaster, Sam

    2015-03-01

    The majority of individuals gamble during their lifetime; however only a subset of these individuals develops problematic gambling. Gray's Reinforcement Sensitivity Theory may be relevant to understanding gambling problems. Differences in sensitivity to punishments and rewards can influence an individual's behavior and may be pertinent to the development of gambling problems. This study examined the functional associations between sensitivity to punishment (SP), sensitivity to reward (SR), and gambling problems in a sample of 2254 college students. Zero-inflated negative binomial regression was used to predict gambling problems as well as the absence of gambling problems. Gambling problems were hypothesized to be positively associated with SR and inversely associated with SP. In addition, SP was hypothesized to moderate the association between SR and gambling problems, attenuating the strength of the association. As hypothesized, SR was positively associated with gambling problems. However, SP did not moderate the relationship between SR and gambling problems. SP did, however, moderate the relationship between SR and the likelihood of never experiencing gambling problems. The results demonstrate that individual differences in SP and SR are functionally associated with gambling problems. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    PubMed

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
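
    G*Power's Poisson-regression power routines are more general than this, but the core idea can be sketched with a normal-approximation power calculation for a one-sided test of a Poisson rate (a simplification for illustration, not G*Power's algorithm):

    ```python
    from math import sqrt
    from statistics import NormalDist

    def poisson_power(lam0, lam1, n, alpha=0.05):
        """Approximate power of a one-sided test of H0: rate = lam0
        against the alternative rate lam1 > lam0, based on the sample
        mean of n unit-exposure Poisson counts and a normal approximation."""
        nd = NormalDist()
        crit = lam0 + nd.inv_cdf(1 - alpha) * sqrt(lam0 / n)
        return 1.0 - nd.cdf((crit - lam1) / sqrt(lam1 / n))
    ```

    As expected, power equals alpha when the true rate matches the null, and grows with the sample size.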

  4. Cosmic Inflation

    ScienceCinema

    Lincoln, Don

    2018-01-16

    In 1964, scientists discovered a faint radio hiss coming from the heavens and realized that the hiss wasn’t just noise. It was a message from eons ago; specifically the remnants of the primordial fireball, cooled to about 3 degrees above absolute zero. Subsequent research revealed that the radio hiss was the same in every direction. The temperature of the early universe was uniform to better than a part in a hundred thousand. And this was weird. According to the prevailing theory, the two sides of the universe have never been in contact. So how could two places that had never been in contact be so similar? One possible explanation was proposed in 1979. Called inflation, the theory required that early in the history of the universe, the universe expanded faster than the speed of light. Confused? Watch this video as Fermilab’s Dr. Don Lincoln makes sense of this mind-bending idea.

  5. Cosmic Inflation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lincoln, Don

    In 1964, scientists discovered a faint radio hiss coming from the heavens and realized that the hiss wasn’t just noise. It was a message from eons ago; specifically the remnants of the primordial fireball, cooled to about 3 degrees above absolute zero. Subsequent research revealed that the radio hiss was the same in every direction. The temperature of the early universe was uniform to better than a part in a hundred thousand. And this was weird. According to the prevailing theory, the two sides of the universe have never been in contact. So how could two places that had never been in contact be so similar? One possible explanation was proposed in 1979. Called inflation, the theory required that early in the history of the universe, the universe expanded faster than the speed of light. Confused? Watch this video as Fermilab’s Dr. Don Lincoln makes sense of this mind-bending idea.

  6. General methodology for nonlinear modeling of neural systems with Poisson point-process inputs.

    PubMed

    Marmarelis, V Z; Berger, T W

    2005-07-01

    This paper presents a general methodological framework for the practical modeling of neural systems with point-process inputs (sequences of action potentials or, more broadly, identical events) based on the Volterra and Wiener theories of functional expansions and system identification. The paper clarifies the distinctions between Volterra and Wiener kernels obtained from Poisson point-process inputs. It shows that only the Wiener kernels can be estimated via cross-correlation, but must be defined as zero along the diagonals. The Volterra kernels can be estimated far more accurately (and from shorter data-records) by use of the Laguerre expansion technique adapted to point-process inputs, and they are independent of the mean rate of stimulation (unlike their P-W counterparts that depend on it). The Volterra kernels can also be estimated for broadband point-process inputs that are not Poisson. Useful applications of this modeling approach include cases where we seek to determine (model) the transfer characteristics between one neuronal axon (a point-process 'input') and another axon (a point-process 'output') or some other measure of neuronal activity (a continuous 'output', such as population activity) with which a causal link exists.

  7. A study on industrial accident rate forecasting and program development of estimated zero accident time in Korea.

    PubMed

    Kim, Tae-gu; Kang, Young-sig; Lee, Hyung-won

    2011-01-01

    To begin a zero accident campaign in industry, the first step is to estimate the industrial accident rate and the zero accident time systematically. This paper considers the social and technical changes in the business environment after the beginning of the zero accident campaign, using quantitative time series analysis methods. These methods include the sum of squared errors (SSE), regression analysis method (RAM), exponential smoothing method (ESM), double exponential smoothing method (DESM), the auto-regressive integrated moving average (ARIMA) model, and the proposed analytic function method (AFM). A program was developed to estimate the accident rate, the zero accident time, and the achievement probability for an efficient industrial environment. In this paper, MFC (Microsoft Foundation Class) software of Visual Studio 2008 was used to develop the zero accident program. The results of this paper will provide major information for industrial accident prevention and be an important part of stimulating the zero accident campaign within all industrial environments.
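
    The exponential smoothing method (ESM) mentioned above is straightforward to sketch; the accident-rate series below is invented for illustration:

    ```python
    def exp_smooth(series, alpha=0.3):
        """Simple exponential smoothing: each smoothed value is a weighted
        average of the new observation and the previous smoothed value.
        The final value serves as the one-step-ahead forecast."""
        s = series[0]
        for x in series[1:]:
            s = alpha * x + (1 - alpha) * s
        return s

    # hypothetical yearly accident rates trending toward zero
    forecast = exp_smooth([1.8, 1.6, 1.5, 1.3, 1.2, 1.1], alpha=0.3)
    ```

    The smoothing constant alpha trades responsiveness to recent years against noise suppression; DESM adds a second smoothing pass to track the trend itself.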

  8. Synchrotron X-ray imaging of pulmonary alveoli in respiration in live intact mice.

    PubMed

    Chang, Soeun; Kwon, Namseop; Kim, Jinkyung; Kohmura, Yoshiki; Ishikawa, Tetsuya; Rhee, Chin Kook; Je, Jung Ho; Tsuda, Akira

    2015-03-04

    Despite nearly a half century of studies, it has not been fully understood how pulmonary alveoli, the elementary gas exchange units in mammalian lungs, inflate and deflate during respiration. Understanding alveolar dynamics is crucial for treating patients with pulmonary diseases. In-vivo, real-time visualization of the alveoli during respiration has been hampered by active lung movement. Previous studies have been therefore limited to alveoli at lung apices or subpleural alveoli under open thorax conditions. Here we report direct and real-time visualization of alveoli of live intact mice during respiration using tracking X-ray microscopy. Our studies, for the first time, determine the alveolar size of normal mice in respiration without positive end expiratory pressure as 58 ± 14 (mean ± s.d.) μm on average, accurately measured in the lung bases as well as the apices. Individual alveoli of normal lungs clearly show heterogeneous inflation from zero to ~25% (6.7 ± 4.7% (mean ± s.d.)) in size. The degree of inflation is higher in the lung bases (8.7 ± 4.3% (mean ± s.d.)) than in the apices (5.7 ± 3.2% (mean ± s.d.)). The fraction of the total tidal volume allocated for alveolar inflation is 34 ± 3.8% (mean ± s.e.m). This study contributes to the better understanding of alveolar dynamics and helps to develop potential treatment options for pulmonary diseases.

  9. Equivalent Theories and Changing Hamiltonian Observables in General Relativity

    NASA Astrophysics Data System (ADS)

    Pitts, J. Brian

    2018-03-01

    Change and local spatial variation are missing in Hamiltonian general relativity according to the most common definition of observables as having 0 Poisson bracket with all first-class constraints. But other definitions of observables have been proposed. In pursuit of Hamiltonian-Lagrangian equivalence, Pons, Salisbury and Sundermeyer use the Anderson-Bergmann-Castellani gauge generator G, a tuned sum of first-class constraints. Kuchař waived the 0 Poisson bracket condition for the Hamiltonian constraint to achieve changing observables. A systematic combination of the two reforms might use the gauge generator but permit non-zero Lie derivative Poisson brackets for the external gauge symmetry of General Relativity. Fortunately one can test definitions of observables by calculation using two formulations of a theory, one without gauge freedom and one with gauge freedom. The formulations, being empirically equivalent, must have equivalent observables. For de Broglie-Proca non-gauge massive electromagnetism, all constraints are second-class, so everything is observable. Demanding equivalent observables from gauge Stueckelberg-Utiyama electromagnetism, one finds that the usual definition fails while the Pons-Salisbury-Sundermeyer definition with G succeeds. This definition does not readily yield change in GR, however. Should the external gauge freedom of general relativity share with internal gauge symmetries the 0 Poisson bracket (invariance), or is covariance (a transformation rule) sufficient? A graviton mass breaks the gauge symmetry (general covariance), but it can be restored by parametrization with clock fields. By requiring equivalent observables, one can test whether observables should have 0 or the Lie derivative as the Poisson bracket with the gauge generator G. The latter definition is vindicated by calculation. While this conclusion has been reported previously, here the calculation is given in some detail.
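
    In symbols, the two candidate conditions weighed in this abstract can be written compactly (notation generic, not drawn from the paper):

    ```latex
    % Standard definition: a vanishing Poisson bracket with every
    % first-class constraint C_i (invariance)
    \{ O, C_i \} \approx 0 \quad \text{for all first-class } C_i
    % Covariance alternative: the bracket with the gauge generator
    % G[\xi] equals the Lie derivative along the descriptor \xi
    \{ O, G[\xi] \} = \mathcal{L}_{\xi} O
    ```

    The paper's calculation favours the second, covariance-style condition.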

  10. Equivalent Theories and Changing Hamiltonian Observables in General Relativity

    NASA Astrophysics Data System (ADS)

    Pitts, J. Brian

    2018-05-01

    Change and local spatial variation are missing in Hamiltonian general relativity according to the most common definition of observables as having 0 Poisson bracket with all first-class constraints. But other definitions of observables have been proposed. In pursuit of Hamiltonian-Lagrangian equivalence, Pons, Salisbury and Sundermeyer use the Anderson-Bergmann-Castellani gauge generator G, a tuned sum of first-class constraints. Kuchař waived the 0 Poisson bracket condition for the Hamiltonian constraint to achieve changing observables. A systematic combination of the two reforms might use the gauge generator but permit non-zero Lie derivative Poisson brackets for the external gauge symmetry of General Relativity. Fortunately one can test definitions of observables by calculation using two formulations of a theory, one without gauge freedom and one with gauge freedom. The formulations, being empirically equivalent, must have equivalent observables. For de Broglie-Proca non-gauge massive electromagnetism, all constraints are second-class, so everything is observable. Demanding equivalent observables from gauge Stueckelberg-Utiyama electromagnetism, one finds that the usual definition fails while the Pons-Salisbury-Sundermeyer definition with G succeeds. This definition does not readily yield change in GR, however. Should the external gauge freedom of general relativity share with internal gauge symmetries the 0 Poisson bracket (invariance), or is covariance (a transformation rule) sufficient? A graviton mass breaks the gauge symmetry (general covariance), but it can be restored by parametrization with clock fields. By requiring equivalent observables, one can test whether observables should have 0 or the Lie derivative as the Poisson bracket with the gauge generator G. The latter definition is vindicated by calculation. While this conclusion has been reported previously, here the calculation is given in some detail.

  11. A novel stent inflation protocol improves long-term outcomes compared with rapid inflation/deflation deployment method.

    PubMed

    Vallurupalli, Srikanth; Kasula, Srikanth; Kumar Agarwal, Shiv; Pothineni, Naga Venkata K; Abualsuod, Amjad; Hakeem, Abdul; Ahmed, Zubair; Uretsky, Barry F

    2017-08-01

    High-pressure inflation for coronary stent deployment is universally performed. However, the duration of inflation is variable and does not take into account differences in lesion compliance. We developed a standardized "pressure optimization protocol" (POP) using inflation pressure stability rather than an arbitrary inflation time or angiographic balloon appearance for stent deployment. Whether this approach improves long-term outcomes is unknown. A total of 792 patients who underwent PCI using either rapid inflation/deflation (n = 376) or POP (n = 416) between January 2009 and March 2014 were included. Exclusion criteria included PCI for acute myocardial infarction, in-stent restenosis, chronic total occlusion, left main, and saphenous vein graft lesions. The primary endpoint was target vessel failure [TVF = combined endpoint of target vessel revascularization (TVR), myocardial infarction, and cardiac death]. Outcomes were analyzed in the entire cohort and in a propensity analysis. Stent implantation using POP with a median follow-up of 1317 days was associated with lower TVF compared with rapid inflation/deflation (10.1 vs. 17.8%, P < 0.0001). This difference was driven by a decrease in TVR (7 vs. 10.6%, P = 0.0016) and cardiac death (2.9 vs. 5.8%, P = 0.017), while there was no difference in myocardial infarction (1 vs. 1.9%, P = 0.19). In the Cox regression model, deployment using POP was the only independent predictor of reduced TVF (HR 0.43; 0.29-0.64; P < 0.0001). In the propensity analysis (330 patients per group) TVF remained lower with POP vs. rapid inflation/deflation (10 vs. 18%, P < 0.0001). Stent deployment using POP led to reduced TVF compared to rapid inflation/deflation. These results support this method for improving long-term outcomes. © 2017 Wiley Periodicals, Inc.
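    The protocol's core idea, holding inflation until the recorded pressure stops drifting rather than for a fixed time, can be sketched as a stopping rule. The 0.1 atm tolerance and 3-sample window below are invented for illustration; the study defines pressure stability clinically, not with these numbers.

```python
# Hypothetical sketch of a "pressure stability" stopping rule:
# keep the balloon inflated until the recorded pressure stops drifting.
# The tolerance (0.1 atm) and window (3 readings) are invented values,
# used only to illustrate the idea of inflating until stability.

def inflation_stable(pressures, window=3, tol=0.1):
    """Return True once the last `window` pressure readings (atm)
    vary by no more than `tol`, i.e. the lesion has stopped yielding."""
    if len(pressures) < window:
        return False
    recent = pressures[-window:]
    return max(recent) - min(recent) <= tol

# A lesion that keeps yielding (pressure still dropping) is not stable ...
yielding = inflation_stable([14.0, 13.2, 12.6, 12.0])
# ... while a plateaued pressure trace is.
plateaued = inflation_stable([14.0, 13.2, 12.9, 12.9, 12.85])
```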

  12. Two dimensional analytical model for a reconfigurable field effect transistor

    NASA Astrophysics Data System (ADS)

    Ranjith, R.; Jayachandran, Remya; Suja, K. J.; Komaragiri, Rama S.

    2018-02-01

    This paper presents two-dimensional potential and current models for a reconfigurable field effect transistor (RFET). Two potential models, which describe the subthreshold and above-threshold channel potentials, are developed by solving the two-dimensional (2D) Poisson's equation. In the first potential model, the 2D Poisson's equation is solved by considering a constant/zero charge density in the channel region of the device to obtain the subthreshold potential characteristics. In the second model, the accumulation charge density is considered to obtain the above-threshold potential characteristics of the device. The proposed models are applicable to devices having a lightly doped or intrinsic channel. In deriving the mathematical model, the whole body area is divided into two regions: a gated region and an un-gated region. The analytical models are compared with technology computer-aided design (TCAD) simulation results and are in complete agreement for different lengths of the gated regions as well as at various supply voltage levels.
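    For contrast with the paper's analytical solutions, a minimal numerical treatment of the same kind of boundary-value problem is a finite-difference Jacobi iteration. The grid size, uniform source term, and iteration count below are illustrative assumptions, not the RFET device parameters.

```python
# Minimal Jacobi solver for the 2D Poisson equation
#   d2u/dx2 + d2u/dy2 = -rho
# on a unit square with u = 0 on the boundary. Grid size, source
# density, and iteration count are illustrative assumptions, not
# the device parameters of the paper's RFET model.

def solve_poisson_2d(n=21, rho=1.0, iters=2000):
    h = 1.0 / (n - 1)
    u = [[0.0] * n for _ in range(n)]
    for _ in range(iters):
        new = [row[:] for row in u]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                # Five-point stencil: average of neighbours plus source term.
                new[i][j] = 0.25 * (u[i + 1][j] + u[i - 1][j]
                                    + u[i][j + 1] + u[i][j - 1]
                                    + h * h * rho)
        u = new
    return u

u = solve_poisson_2d()
# For a uniform source, the potential peaks at the centre of the square.
center = u[10][10]
```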

  13. Linking local vulnerability to climatic hazard damage assessment for integrated river basin management

    NASA Astrophysics Data System (ADS)

    Hung, Hung-Chih; Liu, Yi-Chung; Chien, Sung-Ying

    2015-04-01

    1. Background Major portions of Asia are expected to face increased exposure and vulnerability to climate change and weather extremes due to rapid urbanization and overdevelopment in hazard-prone areas. To prepare for and confront the potential impacts of climate change and related hazard risk, many countries have implemented programs of integrated river basin management. This has created a pressing challenge for policy-makers in many developing countries: to build effective mechanisms to assess how vulnerability is distributed over river basins, and to understand how local vulnerability is linked to climatic (climate-related) hazard damages and risks. However, related studies have received relatively little attention. This study aims to examine whether geographic localities characterized by high vulnerability experience significantly more damage owing to extreme weather events at the river basin level, and to explain which vulnerability factors influence these damages or losses. 2. Methods and data An indicator-based assessment framework is constructed with the goal of identifying composite indicators (including exposure, biophysical, socioeconomic, land-use and adaptive capacity factors) that could serve as proxies for attributes of local vulnerability. This framework is applied by combining geographical information system (GIS) techniques with multicriteria decision analysis (MCDA) to evaluate and map integrated vulnerability to climatic hazards across river basins. Furthermore, to explain the relationship between vulnerability factors and disaster damages, we develop a disaster damage model (DDM) based on existing disaster impact theory. We then combine a zero-inflated Poisson regression model with a Tobit regression analysis to identify and examine how disaster impacts and vulnerability factors connect to typhoon disaster damages and losses.
To illustrate the proposed methodology, the study collects data on the vulnerability attributes of the Kaoping, Tsengwen, and Taimali River basins in southern Taiwan, and on the disaster impacts and damages in these river basins due to Typhoon Morakot in 2009. The data were provided by the National Science and Technology Center for Disaster Reduction, Taiwan, and collected from the National Land Use Investigation, official census statistics and questionnaire surveys. 3. Results We use an MCDA to create a composite vulnerability index, and this index is incorporated into a GIS analysis to demonstrate the results of integrated vulnerability assessment throughout the river basins. Results of the vulnerability assessment indicate that the most vulnerable areas are almost all situated in the middle and upper reaches of the river basins. Examination of the DDM shows that vulnerability factors play a critical role in determining disaster damages. Findings also show that the losses and casualties caused by Typhoon Morakot increase with elevation, urban and agricultural development, and proximity to rivers, and decrease with levels of income and adaptive capacity. Finally, we propose adaptive options for minimizing vulnerability and risk, as well as for integrated river basin governance.
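    The zero-inflated Poisson mechanism used in the damage model can be sketched directly: with probability pi a count is a structural zero, otherwise it is drawn from a Poisson distribution. The parameter values below are illustrative; the point of the sketch is that a plain Poisson fit badly underpredicts the zero fraction such data produce.

```python
import math, random

# Sketch of the zero-inflated Poisson (ZIP) mechanism: with probability
# pi the count is a "structural" zero, otherwise it is Poisson(lam).
# The parameter values and sample size are illustrative assumptions.

def zip_zero_probability(pi, lam):
    """Exact P(Y = 0) under a ZIP(pi, lam) model."""
    return pi + (1 - pi) * math.exp(-lam)

def simulate_zip(pi, lam, n, seed=0):
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        if rng.random() < pi:
            draws.append(0)                      # structural zero
        else:
            # Poisson draw via Knuth's multiplication method.
            L, k, p = math.exp(-lam), 0, 1.0
            while True:
                p *= rng.random()
                if p <= L:
                    break
                k += 1
            draws.append(k)
    return draws

pi, lam = 0.4, 2.5
draws = simulate_zip(pi, lam, n=20000)
observed_zero_rate = draws.count(0) / len(draws)
poisson_only_zero_rate = math.exp(-lam)   # what a plain Poisson predicts
```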

  14. Is the United States in the middle of a healthcare bubble?

    PubMed

    Chen, Wen-Yi; Liang, Yia-Wun; Lin, Yu-Hui

    2016-01-01

    This study investigates the possibility of multiple healthcare bubbles in the US healthcare market. We first applied the newly developed Generalized Sup ADF test to locate multiple healthcare bubble episodes and then estimated the switching regression model specifying multiple healthcare bubble periods to evaluate to what extent macroeconomic variables (such as the interest rate, public debt, and fiscal deficit) and publicly financed healthcare programs influence the magnitude of healthcare bubbles in terms of the deviation of the medical care price inflation from either the overall price inflation or the money wage growth. Our results show that expansionary monetary and fiscal policies play important roles in determining the deviation of the medical care price inflation from the overall price inflation and that the net government debt has a positive impact on the deviation of the medical care price inflation from the money wage growth. The US healthcare market is now in the middle of a healthcare bubble, and this healthcare bubble has developed slowly and has lasted for approximately three decades, mirroring an increased societal preference for healthcare. Policymakers in the US should cautiously consider the fact that a healthcare bubble implies a misallocation of resources into healthcare, leading to negative consequences for the sustainability of the healthcare system.

  15. Exploring the impact of a dedicated streetcar right-of-way on pedestrian motor vehicle collisions: a quasi experimental design.

    PubMed

    Richmond, Sarah A; Rothman, Linda; Buliung, Ron; Schwartz, Naomi; Larsen, Kristian; Howard, Andrew

    2014-10-01

    The frequency of pedestrian collisions is strongly influenced by the built environment, including road width, street connectivity and public transit design. In 2010, 2159 pedestrian collisions were reported in the City of Toronto, Canada with 20 fatalities. Previous studies have reported that streetcars operating in mixed traffic pose safety risks to pedestrians; however, few studies evaluate the effects on pedestrian-motor vehicle collisions (PMVC). The objective of this study was to examine changes in the rate and spatial patterning of PMVC, pre to post right-of-way (ROW) installation of the St. Clair Avenue West streetcar in the City of Toronto, Canada. A quasi-experimental design was used to evaluate changes in PMVC rate, following implementation of a streetcar ROW. Collision data were extracted from all police-reported PMVC, compiled and verified by the City of Toronto, from January 1, 2000 to December 31, 2011. A zero-inflated Poisson regression analysis estimated the change in PMVC, pre to post ROW. Age and injury severity were also examined. Changes in the spatial pattern of collisions were examined by applying the G function to describe the proportion of collision events that shared a nearest neighbor distance less than or equal to a threshold distance. A total of 23,607 PMVC occurred on roadways during the study period; 441 occurred on St. Clair Ave., 153 of them during the period of analysis. There was a 48% decrease in the rate of collisions on St. Clair [Incidence rate ratio (IRR)=0.52, 95% CI: 0.37-0.74], post ROW installation. There were also decreases noted for children (IRR=0.13, 95% CI: 0.04-0.44), adults (IRR=0.61, 95% CI: 0.38-0.97), and minor injuries (IRR=0.56, 95% CI: 0.40-0.80). Spatial analyses indicated increased dispersion of collision events across each redeveloped route segment following the changes in ROW design. Construction of a raised ROW operating on St. Clair Ave. was associated with a reduction in the rate of collisions. 
Differences in pre- and post-collision spatial structure indicated changes in collision locations. Results from this study suggest that a streetcar ROW may be a safer alternative for pedestrians compared to a mixed traffic streetcar route and should be considered by city planners where appropriate to the street environment. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
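    The G function used in the spatial analysis is simply the proportion of events whose nearest-neighbour distance falls at or below a threshold d. A minimal empirical version, with invented coordinates rather than the St. Clair collision data, is:

```python
import math

# Minimal empirical G function: the fraction of events whose
# nearest-neighbour distance is at or below a threshold d.
# The coordinates below are invented illustration, not collision data.

def g_function(points, d):
    count = 0
    for i, (xi, yi) in enumerate(points):
        nearest = min(math.hypot(xi - xj, yi - yj)
                      for j, (xj, yj) in enumerate(points) if j != i)
        if nearest <= d:
            count += 1
    return count / len(points)

clustered = [(0, 0), (0, 0.1), (5, 5), (5, 5.1)]
dispersed = [(0, 0), (3, 0), (0, 3), (3, 3)]
# Clustered events reach G = 1 at a much shorter distance than dispersed
# ones, which is how the study reads dispersion from the G curve.
```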

  16. Equivalence of MAXENT and Poisson point process models for species distribution modeling in ecology.

    PubMed

    Renner, Ian W; Warton, David I

    2013-03-01

    Modeling the spatial distribution of a species is a fundamental problem in ecology. A number of modeling methods have been developed, an extremely popular one being MAXENT, a maximum entropy modeling approach. In this article, we show that MAXENT is equivalent to a Poisson regression model and hence is related to a Poisson point process model, differing only in the intercept term, which is scale-dependent in MAXENT. We illustrate a number of improvements to MAXENT that follow from these relations. In particular, a point process model approach facilitates methods for choosing the appropriate spatial resolution, assessing model adequacy, and choosing the LASSO penalty parameter, all currently unavailable to MAXENT. The equivalence result represents a significant step in the unification of the species distribution modeling literature. Copyright © 2013, The International Biometric Society.
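    The scale dependence of the intercept noted above can be made concrete. For an intercept-only Poisson model with an offset for cell area, the maximum likelihood estimate has a closed form, and rescaling the cells shifts only the intercept, by the log of the scale factor. The counts below are invented for illustration.

```python
import math

# Sketch of the intercept's scale dependence: for an intercept-only
# Poisson model with offset log(cell_area), the MLE has the closed form
# beta0 = log(total count / total area). Halving the cell area shifts
# beta0 by log(2) and changes nothing else. Counts are invented.

def intercept_mle(counts, cell_area):
    """Closed-form intercept MLE with offset log(cell_area)."""
    return math.log(sum(counts) / (len(counts) * cell_area))

counts = [0, 2, 1, 3, 0, 1]
beta_coarse = intercept_mle(counts, cell_area=1.0)
beta_fine = intercept_mle(counts, cell_area=0.5)
shift = beta_fine - beta_coarse   # equals log(2): only the intercept moves
```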

  17. Study of non-Hodgkin's lymphoma mortality associated with industrial pollution in Spain, using Poisson models

    PubMed Central

    Ramis, Rebeca; Vidal, Enrique; García-Pérez, Javier; Lope, Virginia; Aragonés, Nuria; Pérez-Gómez, Beatriz; Pollán, Marina; López-Abente, Gonzalo

    2009-01-01

    Background Non-Hodgkin's lymphomas (NHLs) have been linked to proximity to industrial areas, but evidence regarding the health risk posed by residence near pollutant industries is very limited. The European Pollutant Emission Register (EPER) is a public register that furnishes valuable information on industries that release pollutants to air and water, along with their geographical location. This study sought to explore the relationship between NHL mortality in small areas in Spain and environmental exposure to pollutant emissions from EPER-registered industries, using three Poisson-regression-based mathematical models. Methods Observed cases were drawn from mortality registries in Spain for the period 1994–2003. Industries were grouped into the following sectors: energy; metal; mineral; organic chemicals; waste; paper; food; and use of solvents. Populations having an industry within a radius of 1, 1.5, or 2 kilometres from the municipal centroid were deemed to be exposed. Municipalities outside those radii were considered as reference populations. The relative risks (RRs) associated with proximity to pollutant industries were estimated using the following methods: Poisson Regression; mixed Poisson model with random provincial effect; and spatial autoregressive modelling (BYM model). Results Only proximity of paper industries to population centres (>2 km) could be associated with a greater risk of NHL mortality (mixed model: RR:1.24, 95% CI:1.09–1.42; BYM model: RR:1.21, 95% CI:1.01–1.45; Poisson model: RR:1.16, 95% CI:1.06–1.27). Spatial models yielded higher estimates. Conclusion The reported association between exposure to air pollution from the paper, pulp and board industry and NHL mortality is independent of the model used. Inclusion of spatial random effects terms in the risk estimate improves the study of associations between environmental exposures and mortality. 
The EPER could be of great utility when studying the effects of industrial pollution on the health of the population. PMID:19159450
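    The quantity all three models estimate is a relative risk of mortality for exposed versus reference municipalities. A crude, unadjusted version (with invented counts and person-years chosen to mimic the paper-industry result) shows the relationship between the rate ratio and the exposure coefficient of a log-linear Poisson model:

```python
import math

# Crude relative risk of mortality for municipalities near a pollutant
# industry versus reference municipalities: the quantity the abstract's
# Poisson models estimate after covariate and spatial adjustment.
# Counts and person-years below are invented for illustration.

def relative_risk(cases_exposed, pyears_exposed, cases_ref, pyears_ref):
    rate_exposed = cases_exposed / pyears_exposed
    rate_ref = cases_ref / pyears_ref
    return rate_exposed / rate_ref

# In a log-linear Poisson model with offset log(person-years),
# the exposure coefficient equals log(RR).
rr = relative_risk(62, 500_000, 430, 4_300_000)
beta_exposure = math.log(rr)
```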

  18. The analysis of incontinence episodes and other count data in patients with overactive bladder by Poisson and negative binomial regression.

    PubMed

    Martina, R; Kay, R; van Maanen, R; Ridder, A

    2015-01-01

    Clinical studies in overactive bladder have traditionally used analysis of covariance or nonparametric methods to analyse the number of incontinence episodes and other count data. It is known that if the underlying distributional assumptions of a particular parametric method do not hold, an alternative parametric method may be more efficient than a nonparametric one, which makes no assumptions regarding the underlying distribution of the data. Therefore, there are advantages in using methods based on the Poisson distribution or extensions of that method, which incorporate specific features that provide a modelling framework for count data. One challenge with count data is overdispersion, but methods are available that can account for this through the introduction of random effect terms in the modelling, and it is this modelling framework that leads to the negative binomial distribution. These models can also provide clinicians with a clearer and more appropriate interpretation of treatment effects in terms of rate ratios. In this paper, the previously used parametric and non-parametric approaches are contrasted with those based on Poisson regression and various extensions in trials evaluating solifenacin and mirabegron in patients with overactive bladder. In these applications, negative binomial models are seen to fit the data well. Copyright © 2014 John Wiley & Sons, Ltd.
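    Overdispersion, the key challenge named above, is visible in a gamma-Poisson mixture: the resulting negative binomial counts have variance mu + mu**2/theta, exceeding the Poisson's variance mu. The parameters and sample size in this simulation sketch are illustrative assumptions.

```python
import math, random

# Overdispersion sketch: a gamma-Poisson mixture (negative binomial)
# has variance mu + mu**2/theta, exceeding the Poisson's variance mu.
# Parameter values and sample size are illustrative assumptions.

def simulate_neg_binomial(mu, theta, n, seed=1):
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        # Subject-specific rate from a Gamma(theta, scale=mu/theta) ...
        rate = rng.gammavariate(theta, mu / theta)
        # ... then a Poisson count at that rate (Knuth's method).
        L, k, p = math.exp(-rate), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                break
            k += 1
        draws.append(k)
    return draws

mu, theta = 4.0, 2.0
draws = simulate_neg_binomial(mu, theta, n=30000)
mean = sum(draws) / len(draws)
var = sum((x - mean) ** 2 for x in draws) / len(draws)
# Theory: variance = mu + mu**2/theta = 4 + 16/2 = 12, well above mean = 4.
```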

  19. [Analysis on risk factors of endotracheal cuff under inflation in mechanically ventilated patients].

    PubMed

    Fu, You; Xi, Xiuming

    2014-12-01

    To investigate the prevalence of abnormal endotracheal cuff pressure and risk factors for under-inflation. A prospective cohort study was conducted. Patients admitted to the Department of Critical Care Medicine of Fuxing Hospital Affiliated to Capital Medical University, who were intubated with a high-volume low-pressure endotracheal tube and had undergone mechanical ventilation for at least 48 hours, were enrolled. The endotracheal cuff pressure was determined every 8 hours, at 07:00, 15:00, and 23:00, by a manual manometer connected to the distal edge of the cuff valve. Measurement of the endotracheal cuff pressure was continued until removal of the endotracheal or tracheostomy tube or death of the patient. According to the incidence of under-inflation of the endotracheal cuff, patients were divided into a group with an under-inflation incidence lower than 25% (lower low-cuff-pressure group) and a group with an incidence higher than 25% (higher low-cuff-pressure group). Possible influencing factors were evaluated in the two groups, including body mass index (BMI), size of the endotracheal tube, duration of intubation, use of sedatives or analgesics, number of times the patient left the intensive care unit (ICU), number of times the patient was turned, and sputum aspiration. Logistic regression analysis was used to determine risk factors for under-inflation of the endotracheal cuff. During the study period, 53 patients were enrolled. There were 812 measurements, 46.3% of which were abnormal; under-inflation of the endotracheal cuff was found in 204 (25.1%). There were 24 patients (45.3%) in whom the incidence of under-inflation was higher than 25%. The average number of under-inflation episodes was 7 (4, 10). Compared with the lower low-cuff-pressure group, a longer duration of intubation was found in the higher low-cuff-pressure group [hours: 162 (113, 225) vs. 118 (97, 168), Z=-2.034, P=0.042]. 
There were no differences between the two groups in other factors, including size of the endotracheal tube, time from intubation to first measurement of endotracheal cuff pressure, number of times the patient left the ICU during admission, use of sedative agents or analgesics, and number of body turnings and aspirations (all P>0.05). Logistic regression analysis identified no risk factor for under-inflation of the endotracheal cuff. No significant difference was found between the two groups in the incidence of ventilator-associated pneumonia, duration of mechanical ventilation, successful weaning rate on day 28, 28-day mortality after weaning from mechanical ventilation, or ICU mortality. However, patients in the higher low-cuff-pressure group had a longer ICU stay than those in the lower low-cuff-pressure group [days: 13 (8, 21) vs. 10 (6, 18), Z=-2.120, P=0.034]. Abnormal endotracheal cuff pressure is common in critically ill patients with endotracheal intubation. Duration of intubation is associated with under-inflation of the cuff, which calls for strengthened monitoring and management.

  20. Validation of Statistical Models for Estimating Hospitalization Associated with Influenza and Other Respiratory Viruses

    PubMed Central

    Chan, King-Pan; Chan, Kwok-Hung; Wong, Wilfred Hing-Sang; Peiris, J. S. Malik; Wong, Chit-Ming

    2011-01-01

    Background Reliable estimates of the disease burden associated with respiratory viruses are key to the deployment of preventive strategies such as vaccination and to resource allocation. Such estimates are particularly needed in tropical and subtropical regions, where some methods commonly used in temperate regions are not applicable. While a number of alternative approaches to assess the influenza-associated disease burden have been reported recently, none of these models has been validated with virologically confirmed data. Even fewer methods have been developed for other common respiratory viruses such as respiratory syncytial virus (RSV), parainfluenza and adenovirus. Methods and Findings We had recently conducted a prospective population-based study of virologically confirmed hospitalization for acute respiratory illnesses in persons <18 years residing in Hong Kong Island. Here we used this dataset to validate two commonly used models for estimation of influenza disease burden, namely the rate difference model and the Poisson regression model, and also explored the applicability of these models to estimate the disease burden of other respiratory viruses. The Poisson regression models with different link functions all yielded estimates well correlated with the virologically confirmed influenza-associated hospitalization, especially in children older than two years. The disease burden estimates for RSV, parainfluenza and adenovirus were less reliable, with wide confidence intervals. The rate difference model was not applicable to RSV, parainfluenza and adenovirus, and grossly underestimated the true burden of influenza-associated hospitalization. Conclusion The Poisson regression model generally produced satisfactory estimates of the disease burden of respiratory viruses in a subtropical region such as Hong Kong. PMID:21412433
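    The rate difference model assessed in the study attributes to influenza the admission rate in virus-active weeks minus the baseline rate, scaled to the length of the active period. A minimal sketch with invented weekly counts:

```python
# Sketch of the rate difference model: excess hospitalization attributed
# to influenza is the admission rate during virus-active weeks minus the
# rate during baseline weeks, scaled to the length of the active period.
# The weekly counts below are invented for illustration.

def rate_difference_burden(active_weeks, baseline_weeks):
    active_rate = sum(active_weeks) / len(active_weeks)
    baseline_rate = sum(baseline_weeks) / len(baseline_weeks)
    return (active_rate - baseline_rate) * len(active_weeks)

active = [30, 34, 41, 37]        # admissions in high influenza-activity weeks
baseline = [20, 18, 22, 21, 19]  # admissions in low-activity weeks
excess = rate_difference_burden(active, baseline)
```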

  1. Systematic review of treatment modalities for gingival depigmentation: a random-effects poisson regression analysis.

    PubMed

    Lin, Yi Hung; Tu, Yu Kang; Lu, Chun Tai; Chung, Wen Chen; Huang, Chiung Fang; Huang, Mao Suan; Lu, Hsein Kun

    2014-01-01

    Repigmentation variably occurs with different treatment methods in patients with gingival pigmentation. A systematic review was conducted of various treatment modalities for eliminating melanin pigmentation of the gingiva, comprising bur abrasion, scalpel surgery, cryosurgery, electrosurgery, gingival grafts, and laser techniques, to compare the recurrence rates (Rrs) of these treatment procedures. Electronic databases, including PubMed, Web of Science, Google, and Medline, were comprehensively searched, and manual searches were conducted for studies published from January 1951 to June 2013. After applying inclusion and exclusion criteria, the final list of articles was reviewed in depth to achieve the objectives of this review. A Poisson regression was used to analyze the outcome of depigmentation using the various treatment methods. The systematic review was based mainly on case reports. In total, 61 eligible publications met the defined criteria. The various therapeutic procedures showed variable clinical results with a wide range of Rrs. A random-effects Poisson regression showed that cryosurgery (Rr = 0.32%), electrosurgery (Rr = 0.74%), and laser depigmentation (Rr = 1.16%) yielded superior results, whereas bur abrasion yielded the highest Rr (8.89%). Within the limits of the sampling, the present evidence-based results show that cryosurgery exhibits the optimal predictability for depigmentation of the gingiva among all procedures examined, followed by electrosurgery and laser techniques. It is possible to treat melanin pigmentation of the gingiva with various methods and prevent repigmentation. Among those treatment modalities, cryosurgery, electrosurgery, and laser surgery appear to be the best choices for treating gingival pigmentation. © 2014 Wiley Periodicals, Inc.

  2. Disability rates for cardiovascular and psychological disorders among autoworkers by job category, facility type, and facility overtime hours.

    PubMed

    Landsbergis, Paul A; Janevic, Teresa; Rothenberg, Laura; Adamu, Mohammed T; Johnson, Sylvia; Mirer, Franklin E

    2013-07-01

    We examined the association between long work hours, assembly line work and stress-related diseases utilizing objective health and employment data from an employer's administrative databases. A North American automobile manufacturing company provided data for claims for sickness, accident and disability insurance (work absence of at least 4 days) for cardiovascular disease (CVD), hypertension and psychological disorders, employee demographics, and facility hours worked per year for 1996-2001. Age-adjusted claim rates and age-adjusted rate ratios were calculated using Poisson regression, except for comparisons between production and skilled trades workers owing to lack of age denominator data by job category. Associations between overtime hours and claim rates by facility were examined by Poisson regression and multi-level Poisson regression. Claims for hypertension, coronary heart disease, CVD, and psychological disorders were associated with facility overtime hours. We estimate that a facility with 10 more overtime hours per week than another facility would have 4.36 more claims for psychological disorders, 2.33 more claims for CVD, and 3.29 more claims for hypertension per 1,000 employees per year. Assembly plants had the highest rates of claims for most conditions. Production workers tended to have higher rates of claims than skilled trades workers. Data from an auto manufacturer's administrative databases suggest that autoworkers working long hours, and assembly-line workers relative to skilled trades workers or workers in non-assembly facilities, have a higher risk of hypertension, CVD, and psychological disorders. Occupational disease surveillance and disease prevention programs need to fully utilize such administrative data. Copyright © 2013 Wiley Periodicals, Inc.

  3. Influences on preschool children's oral health-related quality of life as reported by English and Spanish-speaking parents and caregivers.

    PubMed

    Born, Catherine D; Divaris, Kimon; Zeldin, Leslie P; Rozier, R Gary

    2016-09-01

    This study examined young, preschool children's oral health-related quality of life (OHRQoL) among a community-based cohort of English- and Spanish-speaking parent-child dyads in North Carolina, and sought to quantify the association of parent/caregiver characteristics, including spoken language, with OHRQoL impacts. Data from structured interviews with 1,111 parents of children aged 6-23 months enrolled in the Zero-Out Early Childhood Caries study in 2010-2012 were used. OHRQoL was measured using the overall score (range: 0-52) of the Early Childhood Oral Health Impact Scale (ECOHIS). We examined associations with parents' sociodemographic characteristics, spoken language, self-reported oral and general health, oral health knowledge, children's dental attendance, and dental care needs. Analyses included descriptive, bivariate, and multivariate methods based upon zero-inflated negative binomial regression. To determine differences between English and Spanish speakers, language-stratified model estimates were contrasted using homogeneity χ2 tests. The mean overall ECOHIS score was 3.9 [95% confidence interval (CI) = 3.6-4.2]; 4.7 among English speakers and 1.5 among Spanish speakers. In multivariate analyses, caregivers' education showed a positive association with OHRQoL impacts among Spanish speakers [prevalence ratio (PR) = 1.12 (95% CI = 1.03-1.22) for every added year of schooling], whereas caregivers' fair/poor oral health showed a positive association among English speakers (PR = 1.20; 95% CI = 1.02-1.41). The overall severity of ECOHIS impacts was low among this population-based sample of young, preschool children, and substantially lower among Spanish versus English speakers. Further studies are warranted to identify the sources of these differences in actual or reported OHRQoL impacts. © 2016 American Association of Public Health Dentistry.

  4. The impact of poor psychosocial work environment on non-work-related sickness absence.

    PubMed

    Catalina-Romero, C; Sainz, J C; Pastrana-Jiménez, J I; García-Diéguez, N; Irízar-Muñoz, I; Aleixandre-Chiva, J L; Gonzalez-Quintela, A; Calvo-Bonacho, E

    2015-08-01

    We aimed to analyse the impact of a poor psychosocial work environment on non-work-related sickness absence (NWRSA) in a prospective cohort study, stratified using a random sampling technique. Psychosocial variables were assessed among 15,643 healthy workers using a brief version of the Spanish adaptation of the Copenhagen Psychosocial Questionnaire. A one-year follow-up assessed the total count of NWRSA days. Zero-inflated negative binomial regression was used for multivariate analyses. After adjusting for covariates, low levels of job control and possibilities for development (Odds Ratio [OR]: 1.17; 95% CI: 1.01-1.36 [men]; OR: 1.39; 95% CI: 1.09-1.77 [women]), poor social support and quality of leadership (OR: 1.29; 95% CI: 1.11-1.50 [men]; OR: 1.28; 95% CI: 1.01-1.63 [women]), and poor rewards (OR: 1.34; 95% CI: 1.14-1.57 [men]; OR: 1.30; 95% CI: 1.01-1.66 [women]) predicted a total count of sickness absence greater than zero, in both men and women. Double presence was also significantly associated with an NWRSA count different from zero, but only among women (OR: 1.40; 95% CI: 1.08-1.81). Analyses found no association between psychosocial risk factors at work and the total count (i.e., number of days) of sickness absence. The results suggest that work-related psychosocial factors may increase the likelihood of initiating an NWRSA episode, but were not associated with the length of the sickness absence episode. In our large cohort we observed that some associations were gender-dependent, suggesting that future research should consider gender when designing psychosocial interventions aimed at decreasing sickness absence. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. [The short-term effects of air pollution on mortality: the results of the EMECAM project in the city of Pamplona, 1991-95. Estudio Multicéntrico Español sobre la Relación entre la Contaminación Atmosférica y la Mortalidad].

    PubMed

    Aguinaga Ontoso, I; Guillén Grima, F; Oviedo de Sola, P J; Floristan Floristan, M Y; Laborda Santesteban, M S; Martínez Ramírez, M T; Martínez González, M A

    1999-01-01

    To assess the short-term impact of air pollution on the daily death rate in the city of Pamplona. Ecological study of a population of 212,000 inhabitants. A time-series analysis was conducted by means of multiple linear regression and Poisson regression, using daily death rate data, air pollution levels for particles and SO2, daily weather parameters of average relative humidity and temperature, and weekly numbers of influenza cases for the 1991-1995 period. The average number of daily deaths from non-external causes was 4.15, with a range from zero to 13 deaths. The city of Pamplona has a mean annual temperature of 12.7 degrees C (-2.3 degrees C to 31.6 degrees C) and a relative humidity of 68.5%. In the model, temperature (with a one-day lag, and a six-day lag for temperature squared) and humidity (with a one-day lag) are related to the death rate for all causes. The death rate for non-external causes, however, is related in the model only to temperature (one-day lag, P=0.035) and to temperature squared with a five-day lag (P=0.028). The point estimates of the particle-related relative risk show that the highest risk of dying is from respiratory causes, with a relative risk of 1.13; however, none of these relationships is statistically significant. In the case of sulfur dioxide, the estimates are very close to zero, and none of them is significant. Temperature has an impact on the death rate for all causes, both external and non-external, whereas relative humidity has an impact only on the death rate for non-external causes. No influence of daily environmental pollution levels on the daily death rate could be demonstrated.

  6. Maternal Smoking during Pregnancy, Prematurity and Recurrent Wheezing in Early Childhood

    PubMed Central

    Robison, Rachel G; Kumar, Rajesh; Arguelles, Lester M; Hong, Xiumei; Wang, Guoying; Apollon, Stephanie; Bonzagni, Anthony; Ortiz, Kathryn; Pearson, Colleen; Pongracic, Jacqueline A; Wang, Xiaobin

    2013-01-01

Background: Prenatal maternal smoking and prematurity independently affect wheezing and asthma in childhood. Objective: We sought to evaluate the interactive effects of maternal smoking and prematurity upon the development of early childhood wheezing. Methods: We evaluated 1448 children with smoke exposure data from a prospective urban birth cohort in Boston. Maternal antenatal and postnatal exposure was determined from standardized questionnaires. Gestational age was assessed by the first day of the last menstrual period and early prenatal ultrasound (preterm <37 weeks gestation). Wheezing episodes were determined from medical record extraction of well and ill/unscheduled visits. The primary outcome was recurrent wheezing, defined as ≥ 4 episodes of physician-documented wheezing. Logistic regression models and zero-inflated negative binomial regression (for number of episodes of wheeze) assessed the independent and joint association of prematurity and maternal antenatal smoking on recurrent wheeze, controlling for relevant covariates. Results: In the cohort, 90 (6%) children had recurrent wheezing, 147 (10%) were exposed to in utero maternal smoke and 419 (29%) were premature. Prematurity (odds ratio [OR] 2.0; 95% CI, 1.3-3.1) was associated with an increased risk of recurrent wheezing, but in utero maternal smoking was not (OR 1.1, 95% CI 0.5-2.4). Jointly, maternal smoke exposure and prematurity were associated with an increased risk of recurrent wheezing (OR 3.8, 95% CI 1.8-8.0). There was an interaction between prematurity and maternal smoking upon episodes of wheezing (p=0.049). Conclusions: We demonstrated an interaction between maternal smoking during pregnancy and prematurity on childhood wheezing in this urban, multiethnic birth cohort. PMID:22290763

  7. Evidence of a truncated spectrum in the angular correlation function of the cosmic microwave background

    NASA Astrophysics Data System (ADS)

    Melia, F.; López-Corredoira, M.

    2018-03-01

Aim. The lack of large-angle correlations in the fluctuations of the cosmic microwave background (CMB) conflicts with predictions of slow-roll inflation. But while probabilities (≲0.24%) for the missing correlations disfavour the conventional picture at ≳3σ, factors not associated with the model itself may be contributing to the tension. Here we aim to show that the absence of large-angle correlations is best explained with the introduction of a non-zero minimum wave number kmin for the fluctuation power spectrum P(k). Methods: We assumed that quantum fluctuations were generated in the early Universe with a well-defined power spectrum P(k), although with a cut-off kmin ≠ 0. We then re-calculated the angular correlation function of the CMB and compared it with Planck observations. Results: The Planck 2013 data rule out a zero kmin at a confidence level exceeding 8σ. Whereas purely slow-roll inflation would have stretched all fluctuations beyond the horizon, producing a P(k) with kmin = 0 (and therefore strong correlations at all angles), a kmin ≠ 0 would signal the presence of a maximum wavelength at the time (tdec) of decoupling. This argues against the basic inflationary paradigm, and perhaps even suggests non-inflationary alternatives, for the origin and growth of perturbations in the early Universe. In at least one competing cosmology, the Rh = ct universe, the inferred kmin corresponds to the gravitational radius at tdec.

  8. Comparison of modeling methods to predict the spatial distribution of deep-sea coral and sponge in the Gulf of Alaska

    NASA Astrophysics Data System (ADS)

    Rooper, Christopher N.; Zimmermann, Mark; Prescott, Megan M.

    2017-08-01

Deep-sea coral and sponge ecosystems are widespread throughout most of Alaska's marine waters, and are associated with many different species of fishes and invertebrates. These ecosystems are vulnerable to the effects of commercial fishing activities and climate change. We compared four commonly used species distribution models (general linear models, generalized additive models, boosted regression trees and random forest models) and an ensemble model to predict the presence or absence and abundance of six groups of benthic invertebrate taxa in the Gulf of Alaska. All four model types performed adequately on training data for predicting presence and absence, with random forest models having the best overall performance as measured by the area under the receiver operating characteristic curve (AUC). The models also performed well on the test data for presence and absence, with average AUCs ranging from 0.66 to 0.82. For the test data, ensemble models performed the best. For abundance data, there was an obvious demarcation in performance between the two regression-based methods (general linear models and generalized additive models) and the tree-based models. The boosted regression tree and random forest models out-performed the other models by a wide margin on both the training and testing data. However, there was a significant drop-off in performance (approximately 50%) for all models of invertebrate abundance when moving from the training data to the testing data. Ensemble model performance was between the tree-based and regression-based methods. The maps of predictions from the models for both presence and abundance agreed very well across model types, with an increase in variability in predictions for the abundance data. We conclude that where data conform well to the modeled distribution (such as the presence-absence data and binomial distribution in this study), the four types of models will provide similar results, although the regression-type models may be more consistent with biological theory. For data with highly zero-inflated and non-normal distributions, such as the abundance data from this study, the tree-based methods performed better. Ensemble models that averaged predictions across the four model types performed better than the GLM or GAM models but slightly poorer than the tree-based methods, suggesting ensemble models might be more robust to overfitting than tree methods, while mitigating some of the disadvantages in predictive performance of regression methods.

  9. An examination of sources of sensitivity of consumer surplus estimates in travel cost models.

    PubMed

    Blaine, Thomas W; Lichtkoppler, Frank R; Bader, Timothy J; Hartman, Travis J; Lucente, Joseph E

    2015-03-15

We examine the sensitivity of estimates of recreation demand obtained using the Travel Cost Method (TCM) to four factors. Three of the four have been routinely and widely discussed in the TCM literature: a) Poisson versus negative binomial regression; b) application of the Englin correction to account for endogenous stratification; c) truncation of the data set to eliminate outliers. A fourth issue we address has not been widely modeled: the potential effect on recreation demand of the interaction between income and travel cost. We provide a straightforward comparison of all four factors, analyzing the impact of each on regression parameters and consumer surplus estimates. Truncation has a modest effect on estimates obtained from the Poisson models but a radical effect on the estimates obtained by way of the negative binomial. Inclusion of an income-travel cost interaction term generally produces a more conservative but not statistically significantly different estimate of consumer surplus in both Poisson and negative binomial models. It also generates broader confidence intervals. Application of truncation, the Englin correction and the income-travel cost interaction produced the most conservative estimates of consumer surplus and eliminated the statistical difference between the Poisson and the negative binomial. Use of the income-travel cost interaction term reveals that for visitors who face relatively low travel costs, the relationship between income and travel demand is negative, while it is positive for those who face high travel costs. This helps explain the ambiguous findings regarding the role of income widely observed in the TCM literature. Our results suggest that policies that reduce access to publicly owned resources inordinately impact local low-income recreationists and are contrary to environmental justice. Copyright © 2014 Elsevier Ltd. All rights reserved.
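The Englin correction mentioned in this abstract has a convenient practical form: for on-site samples (which are truncated at one trip and over-represent frequent visitors), the endogenously stratified, truncated Poisson likelihood reduces to an ordinary Poisson regression on the trip count minus one. A minimal sketch, not the authors' code (function names are illustrative, assuming NumPy and SciPy):

```python
import numpy as np
from scipy.optimize import minimize

def poisson_negloglik(beta, X, y):
    """Negative Poisson log-likelihood (constant log(y!) term dropped)."""
    lam = np.exp(X @ beta)
    return np.sum(lam - y * np.log(lam))

def englin_corrected_fit(X, trips):
    """On-site TCM sample: the endogenously stratified, truncated Poisson
    MLE equals an ordinary Poisson regression fit to (trips - 1)."""
    beta0 = np.zeros(X.shape[1])
    res = minimize(poisson_negloglik, beta0, args=(X, trips - 1.0),
                   method="BFGS")
    return res.x
```

Shifting the counts down by one absorbs both the truncation at zero and the oversampling of frequent visitors in a single step, which is why the correction is so widely used in this literature.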

  10. Logistic quantile regression provides improved estimates for bounded avian counts: A case study of California Spotted Owl fledgling production

    USGS Publications Warehouse

    Cade, Brian S.; Noon, Barry R.; Scherer, Rick D.; Keane, John J.

    2017-01-01

    Counts of avian fledglings, nestlings, or clutch size that are bounded below by zero and above by some small integer form a discrete random variable distribution that is not approximated well by conventional parametric count distributions such as the Poisson or negative binomial. We developed a logistic quantile regression model to provide estimates of the empirical conditional distribution of a bounded discrete random variable. The logistic quantile regression model requires that counts are randomly jittered to a continuous random variable, logit transformed to bound them between specified lower and upper values, then estimated in conventional linear quantile regression, repeating the 3 steps and averaging estimates. Back-transformation to the original discrete scale relies on the fact that quantiles are equivariant to monotonic transformations. We demonstrate this statistical procedure by modeling 20 years of California Spotted Owl fledgling production (0−3 per territory) on the Lassen National Forest, California, USA, as related to climate, demographic, and landscape habitat characteristics at territories. Spotted Owl fledgling counts increased nonlinearly with decreasing precipitation in the early nesting period, in the winter prior to nesting, and in the prior growing season; with increasing minimum temperatures in the early nesting period; with adult compared to subadult parents; when there was no fledgling production in the prior year; and when percentage of the landscape surrounding nesting sites (202 ha) with trees ≥25 m height increased. Changes in production were primarily driven by changes in the proportion of territories with 2 or 3 fledglings. Average variances of the discrete cumulative distributions of the estimated fledgling counts indicated that temporal changes in climate and parent age class explained 18% of the annual variance in owl fledgling production, which was 34% of the total variance. 
Prior fledgling production explained as much of the variance in the fledgling counts as climate, parent age class, and landscape habitat predictors. Our logistic quantile regression model can be used for any discrete response variables with fixed upper and lower bounds.
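The three-step procedure described above (jitter the counts, logit-transform them between the bounds, fit linear quantile regression, then back-transform using the equivariance of quantiles to monotone maps) can be sketched as follows. This is an illustrative reimplementation, not the authors' code; the pinball-loss minimizer and all function names are assumptions:

```python
import numpy as np
from scipy.optimize import minimize

def logit_jitter(y, lower, upper, rng):
    """Steps 1-2: jitter integer counts to continuous values, then
    logit-transform so (lower, upper + 1) maps onto the real line."""
    z = y + rng.uniform(0.0, 1.0, size=len(y))
    p = (z - lower) / (upper + 1.0 - lower)
    return np.log(p / (1.0 - p))

def pinball_loss(beta, X, t, tau):
    """Quantile-regression check (pinball) loss for quantile tau."""
    r = t - X @ beta
    return np.mean(np.where(r >= 0.0, tau * r, (tau - 1.0) * r))

def fit_logistic_quantile(y, X, tau, lower, upper, n_jitter=20, seed=0):
    """Step 3: repeat the jitter + linear quantile fit, then average."""
    rng = np.random.default_rng(seed)
    betas = [minimize(pinball_loss, np.zeros(X.shape[1]),
                      args=(X, logit_jitter(y, lower, upper, rng), tau),
                      method="Nelder-Mead").x
             for _ in range(n_jitter)]
    return np.mean(betas, axis=0)

def back_transform(eta, lower, upper):
    """Quantiles are equivariant to monotone maps: invert the logit."""
    return lower + (upper + 1.0 - lower) / (1.0 + np.exp(-eta))
```

By construction the back-transformed quantile estimates always fall strictly inside the bounds, which is the point of the method for counts such as 0-3 fledglings per territory.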

  11. Modeling Count Outcomes from HIV Risk Reduction Interventions: A Comparison of Competing Statistical Models for Count Responses

    PubMed Central

    Xia, Yinglin; Morrison-Beedy, Dianne; Ma, Jingming; Feng, Changyong; Cross, Wendi; Tu, Xin

    2012-01-01

Modeling count data from sexual behavioral outcomes involves many challenges, especially when the data exhibit a preponderance of zeros and overdispersion. In particular, the popular Poisson log-linear model is not appropriate for modeling such outcomes. Although alternatives exist for addressing both issues, they are not widely and effectively used in sexual health research, especially in HIV prevention intervention and related studies. In this paper, we discuss how to analyze count outcomes with excess zeros and overdispersion and introduce appropriate model-fit indices for comparing the performance of competing models, using data from a real study on HIV prevention intervention. This in-depth look at common issues arising from studies involving behavioral outcomes will promote sound statistical analyses and facilitate research in this and other related areas. PMID:22536496
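As a concrete example of the kind of model these abstracts discuss, a zero-inflated Poisson (ZIP) mixes a point mass at zero (probability pi) with a Poisson(lambda) component, so zeros arise from both sources. A minimal maximum-likelihood sketch (illustrative only, not the paper's code), fitting on unconstrained logit/log scales:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def zip_negloglik(params, y):
    """Negative log-likelihood of a zero-inflated Poisson:
    P(Y=0) = pi + (1-pi)exp(-lam);  P(Y=k) = (1-pi)Poisson(k; lam), k>0."""
    logit_pi, log_lam = params
    pi = 1.0 / (1.0 + np.exp(-logit_pi))
    lam = np.exp(log_lam)
    ll_zero = np.log(pi + (1.0 - pi) * np.exp(-lam))
    ll_pos = np.log(1.0 - pi) - lam + y * np.log(lam) - gammaln(y + 1.0)
    return -np.sum(np.where(y == 0, ll_zero, ll_pos))

def fit_zip(y):
    """Maximize the likelihood; return (pi_hat, lambda_hat)."""
    res = minimize(zip_negloglik, np.zeros(2), args=(np.asarray(y, float),),
                   method="Nelder-Mead")
    return 1.0 / (1.0 + np.exp(-res.x[0])), np.exp(res.x[1])
```

Regression versions replace the two constants with linear predictors for logit(pi) and log(lambda); model-fit indices such as AIC can then be computed from the maximized likelihood.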

  12. Are star formation rates of galaxies bimodal?

    NASA Astrophysics Data System (ADS)

    Feldmann, Robert

    2017-09-01

    Star formation rate (SFR) distributions of galaxies are often assumed to be bimodal with modes corresponding to star-forming and quiescent galaxies, respectively. Both classes of galaxies are typically studied separately, and SFR distributions of star-forming galaxies are commonly modelled as lognormals. Using both observational data and results from numerical simulations, I argue that this division into star-forming and quiescent galaxies is unnecessary from a theoretical point of view and that the SFR distributions of the whole population can be well fitted by zero-inflated negative binomial distributions. This family of distributions has three parameters that determine the average SFR of the galaxies in the sample, the scatter relative to the star-forming sequence and the fraction of galaxies with zero SFRs, respectively. The proposed distributions naturally account for (I) the discrete nature of star formation, (II) the presence of 'dead' galaxies with zero SFRs and (III) asymmetric scatter. Excluding 'dead' galaxies, the distribution of log SFR is unimodal with a peak at the star-forming sequence and an extended tail towards low SFRs. However, uncertainties and biases in the SFR measurements can create the appearance of a bimodal distribution.
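The three-parameter family proposed above can be written down directly: a mean mu for the negative binomial component, a dispersion alpha controlling the scatter, and a zero-inflation fraction pi for the 'dead' galaxies. A sketch of the pmf, using one common parameterization (an assumption, not necessarily the paper's):

```python
import numpy as np
from scipy.special import gammaln

def zinb_pmf(k, mu, alpha, pi):
    """Zero-inflated negative binomial pmf.
    mu: mean of the NB component; alpha: dispersion
    (NB variance = mu + alpha*mu**2); pi: fraction of extra zeros."""
    k = np.asarray(k, dtype=float)
    r = 1.0 / alpha                      # NB "size" parameter
    p = r / (r + mu)                     # NB success probability
    log_nb = (gammaln(k + r) - gammaln(r) - gammaln(k + 1.0)
              + r * np.log(p) + k * np.log(1.0 - p))
    nb = np.exp(log_nb)
    return np.where(k == 0, pi + (1.0 - pi) * nb, (1.0 - pi) * nb)
```

The overall mean is (1 - pi) * mu, and the positive-alpha dispersion term produces the asymmetric scatter and extended low-SFR tail described in the abstract.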

  13. Introduction to the use of regression models in epidemiology.

    PubMed

    Bender, Ralf

    2009-01-01

    Regression modeling is one of the most important statistical techniques used in analytical epidemiology. By means of regression models the effect of one or several explanatory variables (e.g., exposures, subject characteristics, risk factors) on a response variable such as mortality or cancer can be investigated. From multiple regression models, adjusted effect estimates can be obtained that take the effect of potential confounders into account. Regression methods can be applied in all epidemiologic study designs so that they represent a universal tool for data analysis in epidemiology. Different kinds of regression models have been developed in dependence on the measurement scale of the response variable and the study design. The most important methods are linear regression for continuous outcomes, logistic regression for binary outcomes, Cox regression for time-to-event data, and Poisson regression for frequencies and rates. This chapter provides a nontechnical introduction to these regression models with illustrating examples from cancer research.

  14. Degravitation, inflation and the cosmological constant as an afterglow

    NASA Astrophysics Data System (ADS)

    Patil, Subodh P.

    2009-01-01

In this report, we adopt the phenomenological approach of taking the degravitation paradigm seriously as a consistent modification of gravity in the IR, and investigate its consequences for various cosmological situations. We motivate degravitation, where Newton's constant is promoted to a scale-dependent filter function, as arising from either a small (resonant) mass for the graviton, or as an effect in semi-classical gravity. After addressing how the Bianchi identities are to be satisfied in such a setup, we turn our attention towards the cosmological consequences of degravitation. By considering the example filter function corresponding to a resonantly massive graviton (with a filter scale larger than the present horizon scale), we show that slow roll inflation, hybrid inflation and old inflation remain quantitatively unchanged. We also find that the degravitation mechanism inherits a memory of past energy densities in the present epoch, in a way that is likely significant for present cosmological evolution. For example, if the universe underwent inflation in the past due to having tunneled out of some false vacuum, we find that degravitation implies a remnant `afterglow' cosmological constant, whose scale immediately afterwards is parametrically suppressed by the filter scale (L) in Planck units, Λ ~ l_pl^2/L^2. We discuss circumstances through which this scenario reasonably yields the presently observed value for Λ ~ O(10^-120). We also find that in a universe still currently trapped in some false vacuum state, resonance graviton models of degravitation only degravitate initially Planck or GUT scale energy densities down to the presently observed value over timescales comparable to the filter scale. We argue that different functional forms for the filter function will yield similar conclusions. In this way, we argue that although the degravitation models we study have the potential to explain why the cosmological constant is not large, in addition to why it is not zero, they do not satisfactorily address the coincidence problem without additional tuning.

  15. An episode of reinflation of the Long Valley Caldera, eastern California: 1989-1991

    USGS Publications Warehouse

    Langbein, J.; Hill, D.P.; Parker, T.N.; Wilkinson, S.K.

    1993-01-01

Following the episodes of inflation of the resurgent dome associated with the May 1980 earthquake sequence (four M 6 earthquakes) and the January 1983 earthquake swarm (two M 5.2 events), 7 years of frequently repeated two-color geodimeter measurements spanning the Long Valley caldera document gradually decreasing extensional strain rates from 5 ppm/yr in mid-1983, when the measurements began, to near zero in mid-1989. Early October 1989 marked a change in activity, when measurements of the two-color geodimeter network showed a significant increase in extensional strain rate (9 ppm/yr) across the caldera. The seismic activity began exceeding 10 M ≥ 1.2 earthquakes per week in early December 1989 and rapidly increased to a sustained level of tens of M ≥ 1.2 events per week, with bursts having hundreds of events per day. The episode of inflation can be modeled by a single Mogi point source located about 7 km beneath the center of the resurgent dome.

  16. Simplifying the EFT of Inflation: generalized disformal transformations and redundant couplings

    NASA Astrophysics Data System (ADS)

    Bordin, Lorenzo; Cabass, Giovanni; Creminelli, Paolo; Vernizzi, Filippo

    2017-09-01

    We study generalized disformal transformations, including derivatives of the metric, in the context of the Effective Field Theory of Inflation. All these transformations do not change the late-time cosmological observables but change the coefficients of the operators in the action: some couplings are effectively redundant. At leading order in derivatives and up to cubic order in perturbations, one has 6 free functions that can be used to set to zero 6 of the 17 operators at this order. This is used to show that the tensor three-point function cannot be modified at leading order in derivatives, while the scalar-tensor-tensor correlator can only be modified by changing the scalar dynamics. At higher order in derivatives there are transformations that do not affect the Einstein-Hilbert action: one can find 6 additional transformations that can be used to simplify the inflaton action, at least when the dynamics is dominated by the lowest derivative terms. We also identify the leading higher-derivative corrections to the tensor power spectrum and bispectrum.

  17. A first principle calculation of anisotropic elastic, mechanical and electronic properties of TiB

    NASA Astrophysics Data System (ADS)

    Zhang, Junqin; Zhao, Bin; Ma, Huihui; Wei, Qun; Yang, Yintang

    2018-04-01

The structural, mechanical and electronic properties of the NaCl-type structure TiB are theoretically calculated based on first principles. The density of states of TiB shows obvious density peaks at -0.70 eV. Furthermore, there exists a pseudogap at 0.71 eV to the right of the Fermi level. The calculated structural and mechanical parameters (i.e., bulk modulus, shear modulus, Young's modulus, Poisson's ratio and universal elastic anisotropy index) were in good agreement both with the previously reported experimental values and theoretical results at zero pressure. The mechanical stability criterion shows that TiB at zero pressure is mechanically stable and exhibits ductility. The universal anisotropic index and the 3D graphics of Young's modulus are also given in this paper, which indicate that TiB is anisotropic at zero pressure. Moreover, the effects of applied pressures on the structural, mechanical and anisotropic elastic properties of TiB were studied in the range from 0 to 100 GPa. It was found that the ductility and anisotropy of TiB were enhanced with increasing pressure.

  18. Leptospirosis disease mapping with standardized morbidity ratio and Poisson-Gamma model: An analysis of Leptospirosis disease in Kelantan, Malaysia

    NASA Astrophysics Data System (ADS)

    Che Awang, Aznida; Azah Samat, Nor

    2017-09-01

Leptospirosis is a disease caused by infection with pathogenic species of the genus Leptospira. Humans can be infected with leptospirosis through direct or indirect exposure to the urine of infected animals. The excretion of urine from an animal host that carries pathogenic Leptospira contaminates the soil or water. People can therefore become infected when they are exposed to contaminated soil and water through a cut in the skin or an open wound. The bacteria can also enter the human body through mucous membranes such as the nose, eyes and mouth, for example by splashing contaminated water or urine into the eyes or swallowing contaminated water or food. Currently, there is no vaccine available for the prevention or treatment of leptospirosis, but the disease can be treated if it is diagnosed early, avoiding complications. Disease risk mapping is important for disease control and prevention, and a good choice of statistical model produces a good disease risk map. Therefore, the aim of this study is to estimate the relative risk of leptospirosis based first on the most common statistic used in disease mapping, the standardized morbidity ratio (SMR), and then on a Poisson-gamma model. This paper begins by providing a review of the SMR method and the Poisson-gamma model, which we then apply to leptospirosis data from Kelantan, Malaysia. Both results are displayed and compared using graphs, tables and maps. The results show that the Poisson-gamma model produces better relative risk estimates than the SMR method, because it overcomes the drawback of the SMR, whose relative risk estimate becomes zero when a region has no observed leptospirosis cases. However, the Poisson-gamma model has its own limitations: covariate adjustment is difficult, and it does not allow for spatial correlation between risks in neighbouring areas. These problems have motivated many researchers to introduce alternative methods for estimating the risk.
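The drawback noted above (the SMR collapses to exactly zero in regions with no observed cases) and the Poisson-gamma fix can be made concrete. Under O_i ~ Poisson(E_i * theta_i) with a Gamma(a, b) prior on the relative risk theta_i, the posterior mean is (O_i + a)/(E_i + b), which shrinks estimates toward the prior mean a/b and stays positive even when O_i = 0. A minimal sketch (the prior values in the usage note are hypothetical, not from the paper):

```python
import numpy as np

def smr(observed, expected):
    """Standardized morbidity ratio per region: O_i / E_i.
    A region with no observed cases gets a relative risk of exactly zero."""
    return np.asarray(observed, float) / np.asarray(expected, float)

def poisson_gamma_rr(observed, expected, a, b):
    """Posterior-mean relative risk under O_i ~ Poisson(E_i * theta_i)
    with a Gamma(a, b) prior on theta_i: (O_i + a) / (E_i + b)."""
    return (np.asarray(observed, float) + a) / (np.asarray(expected, float) + b)
```

For example, with observed counts [0, 5, 12], expected counts [2, 4, 10] and a Gamma(1, 1) prior, the SMR for the first region is 0 while the Poisson-gamma estimate remains positive, shrunk toward the prior mean of 1.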

  19. Volunteerism: Social Network Dynamics and Education.

    PubMed

    Ajrouch, Kristine J; Antonucci, Toni C; Webster, Noah J

    2016-03-01

We examine how changes in social networks influence volunteerism through bridging (diversity) and bonding (spending time) mechanisms. We further investigate whether social network change substitutes or amplifies the effects of education on volunteerism. Data (n = 543) are drawn from a two-wave survey of Social Relations and Health over the Life Course (SRHLC). Zero-inflated negative binomial regressions were conducted to test competing hypotheses about how changes in social network characteristics alone and in conjunction with education level predict likelihood and frequency of volunteering. Changes in social networks were associated with volunteerism: as the proportion of family members decreased and the average number of network members living within a one-hour drive increased over time, participants reported higher odds of volunteering. The substitution hypothesis was supported: social networks that exhibited more geographic proximity and greater contact frequency over time compensated for lower levels of education to predict volunteering more hours. The dynamic role of social networks and the ways in which they may work through bridging and bonding to influence both likelihood and frequency of volunteering are discussed. The potential benefits of volunteerism in light of longer life expectancies and smaller families are also considered. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. Negative Urgency, Distress Tolerance, and Substance Abuse Among College Students

    PubMed Central

    Kaiser, Alison J.; Milich, Richard; Lynam, Donald R.; Charnigo, Richard J.

    2012-01-01

Objective: Negative affect has been consistently linked with substance use/problems in prior research. The present study sought to build upon these findings by exploring how an individual’s characteristic responding to negative affect impacts substance abuse risk. Trait negative affect was examined in relation to substance abuse outcomes along with two variables tapping into response to negative affect: Distress Tolerance, an individual’s perceived ability to tolerate negative affect, and Negative Urgency, the tendency to act rashly while experiencing distress. Method: Participants were 525 first-year college students (48.1% male, 81.1% Caucasian), who completed self-report measures assessing personality traits and alcohol-related problems, and a structured interview assessing past and current substance use. Relations were tested using zero-inflated negative binomial regression models; each of the personality variables was tested in a model on its own, and in a model where all three traits were accounted for. Results: Negative Urgency emerged as the best predictor, relating to every one of the substance use outcome variables even when trait negative affect and Distress Tolerance were accounted for. Conclusions: These findings suggest that Negative Urgency is an important factor to consider in developing prevention and intervention efforts aimed at reducing substance use and problems. PMID:22698894

  1. Race, Sex, and Discrimination in School Settings: A Multilevel Analysis of Associations With Delinquency.

    PubMed

    Chambers, Brittany D; Erausquin, Jennifer Toller

    2018-02-01

Adolescence is a critical phase of development and experimentation with delinquent behaviors. There is a growing body of literature exploring individual and structural impacts of discrimination on health outcomes and delinquent behaviors. However, there is limited research assessing how school diversity and discrimination impact students' delinquent behaviors. In response, the purpose of this study was to assess whether individual- and school-level indicators of discrimination and diversity were associated with delinquent behaviors among African American and White students. We analyzed Wave I (1994-1995) data from the National Longitudinal Study of Adolescent Health. Our analysis was limited to 8947 African American and White students (73% White, 48% male, and 88% with parent education at or above high school). We used multilevel zero-inflated negative binomial regression to test the association of individual- and school-level characteristics and discrimination with the number of self-reported delinquent behaviors. Race, sex, perceived peer inclusion, and teacher discrimination were predictors of students' delinquent behaviors. The average school-level perceived peer inclusion and the percentage of African Americans in teaching roles were associated with delinquent behaviors. Findings from this study highlight the potential for intervention at the interpersonal and school levels to reduce delinquency among African American and White students. © 2018, American School Health Association.

  2. Ascaris and hookworm transmission in preschool children from rural Panama: role of yard environment, soil eggs/larvae and hygiene and play behaviours.

    PubMed

    Krause, Rachel J; Koski, Kristine G; Pons, Emérita; Sandoval, Nidia; Sinisterra, Odalis; Scott, Marilyn E

    2015-10-01

    This study explored whether the yard environment and child hygiene and play behaviours were associated with presence and intensity of Ascaris and hookworm in preschool children and with eggs and larvae in soil. Data were collected using questionnaires, a visual survey of the yard, soil samples and fecal samples collected at baseline and following re-infection. The presence of eggs/larvae in soil was associated negatively with water storage (eggs) but positively with dogs (eggs) and distance from home to latrine (larvae). Baseline and re-infection prevalences were: hookworm (28.0%, 3.4%); Ascaris (16.9%, 9.5%); Trichuris (0.9%, 0.7%). Zero-inflated negative binomial regression models revealed a higher baseline hookworm infection if yards had eggs or larvae, more vegetation or garbage, and if the child played with soil. Baseline Ascaris was associated with dirt floor, dogs, exposed soil in yard, open defecation and with less time playing outdoors, whereas Ascaris re-infection was associated with water storage, vegetation cover and garbage near the home and not playing with animals. Our results show complex interactions between infection, the yard environment and child behaviours, and indicate that transmission would be reduced if latrines were closer to the home, and if open defecation and water spillage were reduced.

  3. Peritraumatic tonic immobility is associated with PTSD symptom severity in Brazilian police officers: a prospective study.

    PubMed

    Maia, Deborah B; Nóbrega, Augusta; Marques-Portella, Carla; Mendlowicz, Mauro V; Volchan, Eliane; Coutinho, Evandro S; Figueira, Ivan

    2015-01-01

    Peritraumatic reactions feature prominently among the main predictors for development of posttraumatic stress disorder (PTSD). Peritraumatic tonic immobility (PTI), a less investigated but equally important type of peritraumatic response, has been recently attracting the attention of researchers and clinicians for its close association with traumatic reactions and PTSD. Our objective was to investigate the role of PTI, peritraumatic panic, and dissociation as predictors of PTSD symptoms in a cohort of police recruits (n=132). Participants were asked to complete the following questionnaires during academy training and after the first year of work: Posttraumatic Stress Disorder Checklist - Civilian Version (PCL-C), Physical Reactions Subscale (PRS), Peritraumatic Dissociative Experiences Questionnaire (PDEQ), Tonic Immobility Scale (TIS), and Critical Incident History Questionnaire. Employing a zero-inflated negative binomial regression model, we found that each additional point in the TIS was associated with a 9% increment in PCL-C mean scores (RM = 1.09), whereas for PRS, the increment was 7% (RM = 1.07). As the severity of peritraumatic dissociation increased one point in the PDEQ, the chance of having at least one symptom in the PCL-C increased 22% (OR = 1.22). Our findings highlight the need to expand investigation on the incidence and impact of PTI on the mental health of police officers.

  4. Differences in passenger car and large truck involved crash frequencies at urban signalized intersections: an exploratory analysis.

    PubMed

    Dong, Chunjiao; Clarke, David B; Richards, Stephen H; Huang, Baoshan

    2014-01-01

The influence of intersection features on safety has been examined extensively because intersections experience a relatively large proportion of motor vehicle conflicts and crashes. Although there are distinct differences between passenger cars and large trucks (size, operating characteristics, dimensions, and weight), modeling crash counts across vehicle types is rarely addressed. This paper develops and presents a multivariate regression model of crash frequencies by collision vehicle type using crash data for urban signalized intersections in Tennessee. In addition, the performance of univariate Poisson-lognormal (UVPLN), multivariate Poisson (MVP), and multivariate Poisson-lognormal (MVPLN) regression models in establishing the relationship between crashes, traffic factors, and geometric design of roadway intersections is investigated. Bayesian methods are used to estimate the unknown parameters of these models. The evaluation results suggest that the MVPLN model possesses most of the desirable statistical properties in developing the relationships. Compared to the UVPLN and MVP models, the MVPLN model better identifies significant factors and predicts crash frequencies. The findings suggest that traffic volume, truck percentage, lighting condition, and intersection angle significantly affect intersection safety. Important differences in car, car-truck, and truck crash frequencies with respect to various risk factors were found to exist between models. The paper provides some new or more comprehensive observations that have not been covered in previous studies. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Geographical variation in the incidence of childhood leukaemia in Manitoba.

    PubMed

    Torabi, Mahmoud; Singh, Harminder; Galloway, Katie; Israels, Sara J

    2015-11-01

    Identification of geographical areas and ecological factors associated with higher incidence of childhood leukaemias can direct further study for preventable factors and location of health services to manage such individuals. The aim of this study was to describe the geographical variation and the socio-demographic factors associated with childhood leukaemia in Manitoba. Information on childhood leukaemia incidence between 1992 and 2008 was obtained from the Canadian Cancer Registry and the socio-demographic characteristics for the area of residence from the 2006 Canadian Census. Bayesian spatial Poisson mixed models were used to describe the geographical variation of childhood leukaemia and to determine the association between childhood leukaemia and socio-demographic factors. The south-eastern part of the province had a higher incidence of childhood leukaemia than other parts of the province. In the age and sex-adjusted Poisson regression models, areas with higher proportions of visible minorities and immigrant residents had higher childhood leukaemia incidence rate ratios. In the saturated Poisson regression model, the childhood leukaemia rates were higher in areas with higher proportions of immigrant residents. Unemployment rates were not a significant factor in leukaemia incidence. In Manitoba, areas with higher proportions of immigrants experience higher incidence rates of childhood leukaemia. We have identified geographical areas with higher incidence, which require further study and attention. © 2015 The Authors. Journal of Paediatrics and Child Health © 2015 Paediatrics and Child Health Division (Royal Australasian College of Physicians).

  6. Temporal framing and the hidden-zero effect: rate-dependent outcomes on delay discounting.

    PubMed

    Naudé, Gideon P; Kaplan, Brent A; Reed, Derek D; Henley, Amy J; DiGennaro Reed, Florence D

    2018-05-01

Recent research suggests that presenting time intervals as units (e.g., days) or as specific dates can modulate the degree to which humans discount delayed outcomes. Another framing effect involves explicitly stating that choosing a smaller-sooner reward is mutually exclusive with receiving a larger-later reward, thus presenting choices as an extended sequence. In Experiment 1, participants (N = 201) recruited from Amazon Mechanical Turk completed the Monetary Choice Questionnaire in a 2 (delay framing) by 2 (zero framing) design. Regression suggested a main effect of delay, but not zero, framing after accounting for other demographic variables and manipulations. We observed a rate-dependent effect for the date-framing group, such that those with initially steep discounting exhibited greater sensitivity to the manipulation than those with initially shallow discounting. Subsequent analyses suggest these effects cannot be explained by regression to the mean. Experiment 2 addressed the possibility that the null effect of zero framing was due to within-subject exposure to the hidden- and explicit-zero conditions. A new Amazon Mechanical Turk sample completed the Monetary Choice Questionnaire in either hidden- or explicit-zero formats. Analyses revealed a main effect of reward magnitude, but not zero framing, suggesting potential limitations to the generality of the hidden-zero effect. © 2018 Society for the Experimental Analysis of Behavior.
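The Monetary Choice Questionnaire infers a discount-rate parameter from choices between smaller-sooner and larger-later rewards, conventionally under Mazur's hyperbolic model V = A/(1 + kD). A small illustration with hypothetical k values, not estimates from this study:

```python
def hyperbolic_value(amount, delay_days, k):
    """Present value of a delayed reward under the hyperbolic model
    V = A / (1 + k*D); k is the discount-rate parameter that the
    Monetary Choice Questionnaire is designed to estimate."""
    return amount / (1.0 + k * delay_days)

# A steep discounter (large k) prefers $50 now over $100 in 30 days;
# a shallow discounter (small k) prefers to wait.  Values are illustrative.
steep, shallow = 0.25, 0.005
print(hyperbolic_value(100, 30, steep) < 50)    # True
print(hyperbolic_value(100, 30, shallow) > 50)  # True
```

Steeper initial discounting (larger k) leaving more room to move is what makes the rate-dependent effect reported above plausible on this scale.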

  7. Particulate matter exposure increases JC polyomavirus replication in the human host.

    PubMed

    Dolci, Maria; Favero, Chiara; Bollati, Valentina; Campo, Laura; Cattaneo, Andrea; Bonzini, Matteo; Villani, Sonia; Ticozzi, Rosalia; Ferrante, Pasquale; Delbue, Serena

    2018-05-29

Human polyomaviruses (HPyVs) asymptomatically infect the human population during childhood and establish latency in the host. Viral reactivation and urinary excretion can occur when the immune system is impaired. Exposure to particulate air pollution, including the PM10/PM2.5 components, is a public health problem and has been linked to several disorders. Studies assessing the relationship between PM10/PM2.5 exposure and viral replication are lacking. We therefore investigated the relationship between HPyV viruria and PM10/PM2.5 exposures. Individual environmental exposure was assessed in 50 healthy adult volunteers using a chemical transport model (CTM) with municipality resolution for daily PM10 and monitoring-station data for daily PM2.5 exposures. For each subject, a urine sample was collected, and HPyV (JCPyV, BKPyV, MCPyV, HPyV6, HPyV7 and HPyV9) loads were determined. Zero-inflated negative binomial (ZINB) regression was used to model the count data, as it contained excessive zeros. Covariates were chosen by stepwise selection. HPyV DNA was detected in 54% of the urine samples (median: 87.6×10^5 copies/ml). JCPyV was the most prevalent (48%; median viral load: 126×10^5 copies/ml). Considering the load of the most frequently measured HPyV, JCPyV, in the count part of the ZINB model, every unit increase in PM measured 2 days before urine collection (PM day -2) was associated with an increase in JCPyV load (PM10: +4.0%, p-value = 0.002; PM2.5: +3.6%, p-value = 0.005). In the zero part, the significant predictor was the PM10 measured 5 days before urine collection (+3%, p-value = 0.03). Environmental levels of PM10/PM2.5 thus increase JCPyV viruria. Our findings emphasize the need for studies assessing the influence of air pollution exposure on the risk of viral reactivation. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Dark energy and the cosmic microwave background radiation

    NASA Technical Reports Server (NTRS)

    Dodelson, S.; Knox, L.

    2000-01-01

    We find that current cosmic microwave background anisotropy data strongly constrain the mean spatial curvature of the Universe to be near zero, or, equivalently, the total energy density to be near critical-as predicted by inflation. This result is robust to editing of data sets, and variation of other cosmological parameters (totaling seven, including a cosmological constant). Other lines of argument indicate that the energy density of nonrelativistic matter is much less than critical. Together, these results are evidence, independent of supernovae data, for dark energy in the Universe.

  9. Dark energy and the cosmic microwave background radiation.

    PubMed

    Dodelson, S; Knox, L

    2000-04-17

    We find that current cosmic microwave background anisotropy data strongly constrain the mean spatial curvature of the Universe to be near zero, or, equivalently, the total energy density to be near critical-as predicted by inflation. This result is robust to editing of data sets, and variation of other cosmological parameters (totaling seven, including a cosmological constant). Other lines of argument indicate that the energy density of nonrelativistic matter is much less than critical. Together, these results are evidence, independent of supernovae data, for dark energy in the Universe.

  10. Simple, Efficient Estimators of Treatment Effects in Randomized Trials Using Generalized Linear Models to Leverage Baseline Variables

    PubMed Central

    Rosenblum, Michael; van der Laan, Mark J.

    2010-01-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation. PMID:20628636
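The special case discussed above, a main-terms Poisson working model in a two-arm trial, can be illustrated with toy numbers: with only an intercept and a treatment indicator, the maximum likelihood estimate of the treatment coefficient reduces to the log of the ratio of arm means. A sketch with made-up counts:

```python
import math

# Toy outcome counts from a two-arm randomized trial (illustrative
# numbers only, not data from the paper).
control = [0, 1, 2, 1, 0, 3, 1, 2, 1, 1]   # arm mean 1.2
treated = [2, 3, 1, 4, 2, 2, 3, 1, 2, 4]   # arm mean 2.4

def log_rate_ratio(t, c):
    """MLE of the treatment coefficient in a Poisson working model
    with only an intercept and a treatment indicator: the score
    equations force fitted means to equal arm sample means, so the
    coefficient is log(mean_treated / mean_control), the marginal
    log rate ratio, regardless of whether counts are truly Poisson."""
    return math.log(sum(t) / len(t)) - math.log(sum(c) / len(c))

print(round(math.exp(log_rate_ratio(treated, control)), 2))  # 2.0
```

This is the sense in which the point estimate survives arbitrary misspecification of the Poisson variance; the paper's contribution concerns the more general case with baseline covariates in the working model.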

  11. Galerkin methods for Boltzmann-Poisson transport with reflection conditions on rough boundaries

    NASA Astrophysics Data System (ADS)

    Morales Escalante, José A.; Gamba, Irene M.

    2018-06-01

We consider in this paper the mathematical and numerical modeling of reflective boundary conditions (BC) associated with Boltzmann-Poisson systems, including diffusive reflection in addition to specularity, in the context of electron transport in semiconductor device modeling at nano scales, and their implementation in Discontinuous Galerkin (DG) schemes. We study these BC on the physical boundaries of the device and develop a numerical approximation to model an insulating boundary condition, or equivalently, a pointwise zero-flux mathematical condition for the electron transport equation. Such a condition balances the incident and reflective momentum flux at the microscopic level, pointwise at the boundary, in the case of a more general mixed reflection with momentum-dependent specularity probability p(k⃗). We compare the computational prediction of physical observables given by the numerical implementation of these different reflection conditions in our DG scheme for BP models, and observe that the diffusive condition influences the kinetic moments over the whole domain in position space.

  12. Consumer-choice health plan (first of two parts). Inflation and inequity in health care today: alternatives for cost control and an analysis of proposals for national health insurance.

    PubMed

    Enthoven, A C

    1978-03-23

    The financing system for medical costs in this country suffers from severe inflation and inequity. The tax-supported system of fee for service for doctors, third-party intermediaries and cost reimbursement for hospitals produces inflation by rewarding cost-increasing behavior and failing to provide incentives for economy. The system is inequitable because the government pays more on behalf of those who choose more costly systems of care, because tax benefits subsidize the health insurance of the well-to-do, while not helping many low-income people, and because employment health insurance does not guarantee continuity of coverage and is regressive in its financing. Analysis of previous proposals for national health insurance shows none to be capable of solving most of these problems. Direct economic regulation by government will not improve the situation. Cost controls through incentives and regulated competition in the private sector are most likely to be effective.

  13. A bridge between unified cosmic history by f( R)-gravity and BIonic system

    NASA Astrophysics Data System (ADS)

    Sepehri, Alireza; Capozziello, Salvatore; Setare, Mohammad Reza

    2016-04-01

Recently, the cosmological deceleration-acceleration transition redshift in f(R) gravity has been considered in order to address consistently the problem of cosmic evolution. It is possible to show that the deceleration parameter changes sign at a given redshift according to observational data. Furthermore, an f(R) gravity cosmological model can be constructed in a brane-antibrane system, starting from the very early universe and accounting for the cosmological redshift at all phases of cosmic history, from inflation to late-time acceleration. Here we propose an f(R) model where the transition redshifts correspond to the inflation-deceleration and deceleration-late-time-acceleration transitions, starting from a BIon system. At the point where the universe was born, due to the transition of k black fundamental strings to the BIon configuration, the redshift is approximately infinite and decreases with reducing temperature (z ~ T^2). The BIon is a configuration in flat space of a universe-brane and a parallel anti-universe-brane connected by a wormhole. This wormhole is a channel for flowing energy from extra dimensions into our universe, occurring at inflation and decreasing with redshift as z ~ T^{4+1/7}. The dynamics is consistent with the fact that the wormhole loses its energy and vanishes as soon as inflation ends and deceleration begins. As the two universe-branes approach each other, a tachyon originates, grows, and causes the formation of a wormhole. We show that, in the framework of f(R) gravity, the cosmological redshift depends on the tachyonic potential and decreases significantly at the deceleration-late-time-acceleration transition point (z ~ T^{2/3}). As today's acceleration approaches, the redshift tends to zero and the cosmological model reduces to the standard ΛCDM cosmology.

  14. [Spatio-temporal distribution of scrub typhus and related influencing factors in coastal beach area of Yancheng, China].

    PubMed

    Chen, Y Z; Li, F; Xu, H; Huang, L C; Gu, Z G; Sun, Z Y; Yan, G J; Zhu, Y J; Tang, C

    2016-02-01

In order to provide better programs for monitoring, early warning and prevention of scrub typhus in the coastal beach area, the temporal-spatial distribution characteristics of scrub typhus were summarized, and relationships between the temporal-spatial clustering of scrub typhus, meteorological factors, rodent distribution and biological characteristics in the coastal beach area of Yancheng city were studied. Reports on network-based scrub typhus epidemics and information on population and weather conditions from monitoring stations in the coastal beach area of Yancheng city, from 2005 to 2014, were collected and processed. The distribution, density and seasonal fluctuation of rodent populations in the coastal beach area were monitored from April 2011 to December 2013. Methods including descriptive statistics, space-time permutation scan statistics, autocorrelation and cross-correlation analysis were used to analyze the temporal-spatial distribution of scrub typhus and its correlation with rodent distribution, density fluctuation and meteorological indexes. A zero-inflated Poisson (ZIP) regression model was constructed according to the distribution of the related data. All analyses were performed with Excel 2003, SPSS 16.0, Mapinfo 11.0, Satscan 9.0 and Stata/SE 10.0. (1) The incidence of scrub typhus gradually increased, with the highest annual incidence, 5.81/10 million, seen in 2014. There was an autumn peak of scrub typhus, with the highest monthly incidence rate, 12.02/10 million, in November. The incidence rate of scrub typhus was high in Binhai, Dafeng and Xiangshui, with average incidence rates of 3.30/10 million, 3.21/10 million and 2.79/10 million, respectively. There were 12 towns with high incidence rates in the coastal beach area, with rates between 4.41/10 million and 10.03/10 million. (2) Three incidence clusters of scrub typhus were detected: in 25 towns in Dongtai, Dafeng and Sheyang between October and November 2012; in 5 towns in Xiangshui between October and November 2014; and in 6 towns in Binhai in November 2006. (3) Apodemus agrarius was the dominant rodent species in the coastal area, with a constituent ratio of 89.19%. Rodent density showed two peaks, in winter and summer, in 2011 and 2013; the winter peak occurred in January and the summer peak spanned May to August. Scrub typhus occurred mainly in October and November each year, and its incidence increased in parallel with the peak in rodent density. The incidence peak of scrub typhus was related to temperature and rainfall. Under cross-correlation analysis, rodent density, temperature and rainfall were correlated with the incidence of scrub typhus. In the zero-inflated Poisson (ZIP) regression model, rainfall and the mean minimum temperature at a 3-month lag were positively correlated, while the duration of sunshine and relative humidity were negatively correlated, with the incidence of scrub typhus. Temporal-spatial clustering of scrub typhus and related factors such as vector organisms and weather conditions were identified, providing evidence for effective measures on prevention and control of the disease.
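The ZIP model used above mixes a point mass at zero with a Poisson count distribution. A minimal sketch of its probability mass function, with illustrative parameters unrelated to the scrub typhus data:

```python
import math

def zip_pmf(y, lam, pi):
    """Zero-inflated Poisson: with probability `pi` the count is a
    structural zero; otherwise it is drawn from Poisson(lam)."""
    poisson = math.exp(-lam) * lam ** y / math.factorial(y)
    return pi * (y == 0) + (1 - pi) * poisson

lam, pi = 2.5, 0.4           # illustrative parameters
p0_zip = zip_pmf(0, lam, pi)
p0_poisson = math.exp(-lam)  # zero probability under a plain Poisson

print(p0_zip > p0_poisson)   # True: ZIP puts extra mass at zero
print(round(sum(zip_pmf(y, lam, pi) for y in range(50)), 6))  # 1.0
```

In a ZIP regression, lam and pi are each linked to covariates (typically log and logit links), which is how lagged temperature and rainfall enter the model above.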

  15. Detecting isotopic ratio outliers

    NASA Astrophysics Data System (ADS)

    Bayne, C. K.; Smith, D. H.

    An alternative method is proposed for improving isotopic ratio estimates. This method mathematically models pulse-count data and uses iterative reweighted Poisson regression to estimate model parameters to calculate the isotopic ratios. This computer-oriented approach provides theoretically better methods than conventional techniques to establish error limits and to identify outliers.
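Iteratively reweighted Poisson regression of the kind described can be sketched in a few lines: each pass solves a weighted least-squares problem on a working response until the coefficients settle. The single-covariate design and the data below are illustrative, not actual pulse-count measurements:

```python
import math

# Illustrative pulse-count-like data for log mu_i = b0 + b1 * x_i.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1, 2, 4, 9, 17]

def irls_poisson(x, y, iters=25):
    """Fit a one-covariate Poisson regression by iteratively
    reweighted least squares (weights w_i = mu_i, working response
    z_i = eta_i + (y_i - mu_i)/mu_i), starting from the null model."""
    b0, b1 = math.log(sum(y) / len(y)), 0.0
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        z = [(b0 + b1 * xi) + (yi - mi) / mi for xi, yi, mi in zip(x, y, mu)]
        # Solve the 2x2 weighted normal equations directly.
        sw = sum(mu)
        swx = sum(m * xi for m, xi in zip(mu, x))
        swxx = sum(m * xi * xi for m, xi in zip(mu, x))
        swz = sum(m * zi for m, zi in zip(mu, z))
        swxz = sum(m * xi * zi for m, xi, zi in zip(mu, x, z))
        det = sw * swxx - swx * swx
        b0 = (swxx * swz - swx * swxz) / det
        b1 = (sw * swxz - swx * swz) / det
    return b0, b1

b0, b1 = irls_poisson(x, y)
print(round(b1, 2))  # estimated log-slope
```

The Poisson weights are also what make outlier diagnostics natural here: standardized residuals (y_i - mu_i)/sqrt(mu_i) from the converged fit flag counts inconsistent with the model.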

  16. Evaluation of the CATSIB DIF Procedure in a Pretest Setting

    ERIC Educational Resources Information Center

    Nandakumar, Ratna; Roussos, Louis

    2004-01-01

A new procedure, CATSIB, for assessing differential item functioning (DIF) on computerized adaptive tests (CATs) is proposed. CATSIB, a modified SIBTEST procedure, matches test takers on estimated ability and controls for impact-induced Type I error inflation by employing a CAT version of the SIBTEST "regression correction." The…

  17. Do `negative' temperatures exist?

    NASA Astrophysics Data System (ADS)

    Lavenda, B. H.

    1999-06-01

    A modification of the second law is required for a system with a bounded density of states and not the introduction of a `negative' temperature scale. The ascending and descending branches of the entropy versus energy curve describe particle and hole states, having thermal equations of state that are given by the Fermi and logistic distributions, respectively. Conservation of energy requires isentropic states to be isothermal. The effect of adiabatically reversing the field is entirely mechanical because the only difference between the two states is their energies. The laws of large and small numbers, leading to the normal and Poisson approximations, characterize statistically the states of infinite and zero temperatures, respectively. Since the heat capacity also vanishes in the state of maximum disorder, the third law can be generalized in systems with a bounded density of states: the entropy tends to a constant as the temperature tends to either zero or infinity.

  18. Tracheal tube cuff inflation guided by pressure volume loop closure associated with lower postoperative cuff-related complications: Prospective, randomized clinical trial.

    PubMed

    Almarakbi, Waleed A; Kaki, Abdullah M

    2014-07-01

The main function of an endotracheal tube (ETT) cuff is to prevent aspiration. High cuff pressure is usually associated with postoperative complications. We compared cuff inflation guided by pressure-volume loop closure (PV-L) with the just-to-seal (JS) technique and assessed the postoperative incidence of sore throat, cough and hoarseness. In a prospective, randomized clinical trial, 100 patients' tracheas were intubated. In the first group (n = 50), ETT cuff inflation was guided by PV-L, while in the second group (n = 50) the ETT cuff was inflated using the JS technique. Intracuff pressures and volumes were measured, and the incidence of postoperative cuff-related complications was recorded. Demographic data and durations of intubation were comparable between the groups. The use of PV-L was associated with a smaller amount of intracuff air [4.05 (3.7-4.5) vs 5 (4.8-5.5), P < 0.001] and lower cuff pressure than in the JS group [18.25 (18-19) vs 33 (32-35), P ≤ 0.001]. Postextubation cuff-related complications were significantly less frequent among the PV-L group patients than among the JS group patients (P ≤ 0.009), except for hoarseness of voice, which was less frequent in the PV-L group but not statistically significantly so (P ≤ 0.065). Multiple regression models for prediction of intracuff pressure after intubation and before extubation revealed a statistically significant association with the technique used for cuff inflation (P < 0.0001). The study confirms that PV-L-guided ETT cuff inflation effectively seals the airway and is associated with a lower ETT cuff pressure and a lower incidence of cuff-related complications.

  19. Effects of economic crises on population health outcomes in Latin America, 1981–2010: an ecological study

    PubMed Central

    Williams, Callum; Gilbert, Barnabas James; Zeltner, Thomas; Watkins, Johnathan; Atun, Rifat; Maruthappu, Mahiben

    2016-01-01

    Objectives The relative health effects of changes in unemployment, inflation and gross domestic product (GDP) per capita on population health have not been assessed. We aimed to determine the effect of changes in these economic measures on mortality metrics across Latin America. Design Ecological study. Setting Latin America (21 countries), 1981–2010. Outcome measures Uses multivariate regression analysis to assess the effects of changes in unemployment, inflation and GDP per capita on 5 mortality indicators across 21 countries in Latin America, 1981–2010. Country-specific differences in healthcare infrastructure, population structure and population size were controlled for. Results Between 1981 and 2010, a 1% rise in unemployment was associated with statistically significant deteriorations (p<0.05) in 5 population health outcomes, with largest deteriorations in 1–5 years of age and male adult mortality rates (1.14 and 0.53 rises per 1000 deaths respectively). A 1% rise in inflation rate was associated with significant deteriorations (p<0.05) in 4 population health outcomes, with the largest deterioration in male adult mortality rate (0.0033 rise per 1000 deaths). Lag analysis showed that 5 years after rises in unemployment and inflation, significant deteriorations (p<0.05) occurred in 3 and 5 mortality metrics, respectively. A 1% rise in GDP per capita was associated with no significant deteriorations in population health outcomes either in the short or long term. β coefficient comparisons indicated that the effect of unemployment increases was substantially greater than that of changes in GDP per capita or inflation. Conclusions Rises in unemployment and inflation are associated with long-lasting deteriorations in several population health outcomes. Unemployment exerted much larger effects on health than inflation. In contrast, changes in GDP per capita had almost no association with the explored health outcomes. 
Contrary to neoclassical development economics, policymakers should prioritise amelioration of unemployment if population health outcomes are to be optimised. PMID:26739715

  20. Inflation of the screening length induced by Bjerrum pairs.

    PubMed

    Zwanikken, Jos; van Roij, René

    2009-10-21

    Within a modified Poisson-Boltzmann theory we study the effect of Bjerrum pairs on the typical length scale [Formula: see text] over which electric fields are screened in electrolyte solutions, taking into account a simple association-dissociation equilibrium between free ions and Bjerrum pairs. At low densities of Bjerrum pairs, this length scale is well approximated by the Debye length [Formula: see text], with ρ(s) the free-ion density. At high densities of Bjerrum pairs, however, we find [Formula: see text], which is significantly larger than 1/κ due to the enhanced effective permittivity of the electrolyte, caused by the polarization of Bjerrum pairs. We argue that this mechanism may explain the recently observed anomalously large colloid-free zones between an oil-dispersed colloidal crystal and a colloidal monolayer at the oil-water interface.
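The scaling behind the inflated screening length can be illustrated numerically: the Debye length of a monovalent electrolyte grows as the square root of the effective permittivity, so polarizable Bjerrum pairs that raise the permittivity lengthen the screening at fixed free-ion density. All numbers below are illustrative, not taken from the paper:

```python
import math

# Physical constants in SI units; the point is the scaling
# kappa^{-1} ~ sqrt(eps_r / rho_s), not the specific values.
kB_T = 1.381e-23 * 298  # thermal energy at 298 K, J
e = 1.602e-19           # elementary charge, C
eps0 = 8.854e-12        # vacuum permittivity, F/m

def screening_length(eps_r, rho_s):
    """Debye screening length for a monovalent electrolyte with
    relative permittivity eps_r and free-ion number density rho_s."""
    return math.sqrt(eps_r * eps0 * kB_T / (2 * e ** 2 * rho_s))

rho_s = 1e24                               # free-ion density, m^-3
lam_low = screening_length(2.0, rho_s)     # low-polarizability solvent
# Hypothetical enhanced permittivity from the polarization of Bjerrum
# pairs at fixed free-ion density.
lam_high = screening_length(8.0, rho_s)

print(round(lam_high / lam_low, 2))  # 2.0, i.e. sqrt(8/2)
```

In the paper's mechanism the enhancement itself depends on the pair density through the association-dissociation equilibrium; the fixed-permittivity comparison here only isolates the square-root scaling.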
