A Bayesian bird's eye view of ‘Replications of important results in social psychology’
Schönbrodt, Felix D.; Yao, Yuling; Gelman, Andrew; Wagenmakers, Eric-Jan
2017-01-01
We applied three Bayesian methods to reanalyse the preregistered contributions to the Social Psychology special issue ‘Replications of Important Results in Social Psychology’ (Nosek & Lakens 2014 Registered reports: a method to increase the credibility of published results. Soc. Psychol. 45, 137–141. (doi:10.1027/1864-9335/a000192)). First, individual-experiment Bayesian parameter estimation revealed that for directed effect size measures, only three out of 44 central 95% credible intervals did not overlap with zero and fell in the expected direction. For undirected effect size measures, only four out of 59 credible intervals contained values greater than 0.10 (10% of variance explained) and only 19 intervals contained values larger than 0.05. Second, a Bayesian random-effects meta-analysis for all 38 t-tests showed that only one out of the 38 hierarchically estimated credible intervals did not overlap with zero and fell in the expected direction. Third, a Bayes factor hypothesis test was used to quantify the evidence for the null hypothesis against a default one-sided alternative. Only seven out of 60 Bayes factors indicated non-anecdotal support in favour of the alternative hypothesis (BF10 > 3), whereas 51 Bayes factors indicated at least some support for the null hypothesis. We hope that future analyses of replication success will embrace a more inclusive statistical approach by adopting a wider range of complementary techniques. PMID:28280547
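A minimal sketch of the interval check described above: given posterior draws of a directed effect size, compute the central 95% credible interval and ask whether it excludes zero in the expected direction. The quantile approach and the toy posterior are illustrative assumptions, not the authors' actual models.

```python
import numpy as np

def credible_interval(samples, level=0.95):
    """Central credible interval from posterior samples."""
    tail = (1.0 - level) / 2.0
    return np.quantile(samples, [tail, 1.0 - tail])

def supports_effect(samples, level=0.95):
    """True if the central CrI excludes zero in the expected (positive)
    direction -- the criterion applied to the directed effect sizes."""
    lo, hi = credible_interval(samples, level)
    return lo > 0.0

# Toy posterior for one replication: effect-size draws centred near zero
rng = np.random.default_rng(1)
null_ish = rng.normal(0.02, 0.10, 20000)
print(credible_interval(null_ish))  # straddles zero
print(supports_effect(null_ish))    # False
```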
Comparing interval estimates for small sample ordinal CFA models
Natesan, Prathiba
2015-01-01
Robust maximum likelihood (RML) and asymptotically generalized least squares (AGLS) methods have been recommended for fitting ordinal structural equation models. Studies show that some of these methods underestimate standard errors. However, these studies have not investigated the coverage and bias of interval estimates. An estimate with a reasonable standard error could still be severely biased. This can only be known by systematically investigating the interval estimates. The present study compares Bayesian, RML, and AGLS interval estimates of factor correlations in ordinal confirmatory factor analysis (CFA) models for small sample data. Six sample sizes, three factor correlations, and two factor score distributions (multivariate normal and multivariate mildly skewed) were studied. Two Bayesian prior specifications, informative and relatively less informative, were studied. Undercoverage of confidence intervals and underestimation of standard errors were common in non-Bayesian methods. Underestimated standard errors may lead to inflated Type-I error rates. Non-Bayesian intervals were more often positively biased than negatively biased; that is, most intervals that did not contain the true value were greater than the true value. Some non-Bayesian methods had non-converging and inadmissible solutions for small samples and non-normal data. Bayesian empirical standard error estimates for informative and relatively less informative priors were closer to the average standard errors of the estimates. The coverage of Bayesian credibility intervals was closer to what was expected, with overcoverage in a few cases. Although some Bayesian credibility intervals were wider, they reflected the nature of statistical uncertainty that comes with the data (e.g., small sample). Bayesian point estimates were also more accurate than non-Bayesian estimates. The results illustrate the importance of analyzing coverage and bias of interval estimates, and how ignoring interval estimates can be misleading.
Therefore, editors and policymakers should continue to emphasize the inclusion of interval estimates in research. PMID:26579002
Love, Jeffrey J.
2012-01-01
Statistical analysis is made of rare, extreme geophysical events recorded in historical data -- counting the number of events $k$ with sizes that exceed chosen thresholds during specific durations of time $\tau$. Under transformations that stabilize data and model-parameter variances, the most likely Poisson-event occurrence rate, $k/\tau$, applies for frequentist inference and, also, for Bayesian inference with a Jeffreys prior that ensures posterior invariance under changes of variables. Frequentist confidence intervals and Bayesian (Jeffreys) credibility intervals are approximately the same and easy to calculate: $(1/\tau)[(\sqrt{k} - z/2)^{2},(\sqrt{k} + z/2)^{2}]$, where $z$ is a parameter that specifies the width, $z=1$ ($z=2$) corresponding to $1\sigma$, $68.3\%$ ($2\sigma$, $95.4\%$). If only a few events have been observed, as is usually the case for extreme events, then these "error-bar" intervals might be considered to be relatively wide. From historical records, we estimate most likely long-term occurrence rates, 10-yr occurrence probabilities, and intervals of frequentist confidence and Bayesian credibility for large earthquakes, explosive volcanic eruptions, and magnetic storms.
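The quoted interval formula is simple to compute directly. The sketch below implements $(1/\tau)[(\sqrt{k} - z/2)^2, (\sqrt{k} + z/2)^2]$; the floor at zero on the lower limit is an added guard for $k = 0$, not part of the quoted expression.

```python
import numpy as np

def poisson_rate_interval(k, tau, z=2.0):
    """Approximate frequentist/Jeffreys interval for a Poisson rate k/tau:
    (1/tau) * [(sqrt(k) - z/2)^2, (sqrt(k) + z/2)^2].
    The lower limit is floored at zero (an added guard for k = 0)."""
    lo = max(np.sqrt(k) - z / 2.0, 0.0) ** 2 / tau
    hi = (np.sqrt(k) + z / 2.0) ** 2 / tau
    return lo, hi

# Example: k = 4 events in tau = 100 years, 2-sigma (~95.4%) interval
lo, hi = poisson_rate_interval(4, 100.0, z=2.0)
print(lo, hi)  # 0.01 0.09
```

The most likely rate, $k/\tau = 0.04$, sits inside this interval, as expected.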
Prokinetics for the treatment of functional dyspepsia: Bayesian network meta-analysis.
Yang, Young Joo; Bang, Chang Seok; Baik, Gwang Ho; Park, Tae Young; Shin, Suk Pyo; Suk, Ki Tae; Kim, Dong Joon
2017-06-26
Controversies persist regarding the effect of prokinetics for the treatment of functional dyspepsia (FD). This study aimed to assess the comparative efficacy of prokinetic agents for the treatment of FD. Randomized controlled trials (RCTs) of prokinetics for the treatment of FD were identified from core databases. Symptom response rates were extracted and analyzed using odds ratios (ORs). A Bayesian network meta-analysis was performed using the Markov chain Monte Carlo method in WinBUGS and NetMetaXL. In total, 25 RCTs, which included 4473 patients with FD who were treated with 6 different prokinetics or placebo, were identified and analyzed. Metoclopramide showed the best surface under the cumulative ranking curve (SUCRA) probability (92.5%), followed by trimebutine (74.5%) and mosapride (63.3%). However, the therapeutic efficacy of metoclopramide was not significantly different from that of trimebutine (OR: 1.32, 95% credible interval: 0.27-6.06), mosapride (OR: 1.99, 95% credible interval: 0.87-4.72), or domperidone (OR: 2.04, 95% credible interval: 0.92-4.60). Metoclopramide showed better efficacy than itopride (OR: 2.79, 95% credible interval: 1.29-6.21) and acotiamide (OR: 3.07, 95% credible interval: 1.43-6.75). Domperidone (SUCRA probability 62.9%) showed better efficacy than itopride (OR: 1.37, 95% credible interval: 1.07-1.77) and acotiamide (OR: 1.51, 95% credible interval: 1.04-2.18). Metoclopramide, trimebutine, mosapride, and domperidone showed better efficacy for the treatment of FD than itopride or acotiamide. Considering the adverse events related to metoclopramide or domperidone, the short-term use of these agents or the alternative use of trimebutine or mosapride could be recommended for the symptomatic relief of FD.
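SUCRA values like those reported can be computed from a matrix of posterior ranking probabilities. The sketch below assumes such a matrix is already available (e.g. from MCMC rank indicators); the numbers are hypothetical, not the trial data.

```python
import numpy as np

def sucra(rank_probs):
    """SUCRA from a matrix of ranking probabilities.
    rank_probs[j, r] = P(treatment j has rank r+1), rank 1 = best.
    SUCRA_j = mean of the cumulative ranking probabilities over ranks 1..a-1."""
    rank_probs = np.asarray(rank_probs, dtype=float)
    a = rank_probs.shape[1]
    cum = np.cumsum(rank_probs, axis=1)[:, : a - 1]
    return cum.mean(axis=1)

# Hypothetical rank probabilities for three treatments (rows sum to 1)
p = np.array([
    [0.7, 0.2, 0.1],   # mostly ranked best -> high SUCRA
    [0.2, 0.5, 0.3],
    [0.1, 0.3, 0.6],   # mostly ranked worst -> low SUCRA
])
print(sucra(p))  # [0.8, 0.45, 0.25]
```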
Power in Bayesian Mediation Analysis for Small Sample Research
Miočević, Milica; MacKinnon, David P.; Levy, Roy
2018-01-01
It was suggested that Bayesian methods have potential for increasing power in mediation analysis (Koopman, Howe, Hollenbeck, & Sin, 2015; Yuan & MacKinnon, 2009). This paper compares the power of Bayesian credibility intervals for the mediated effect to the power of normal theory, distribution of the product, percentile, and bias-corrected bootstrap confidence intervals at N ≤ 200. Bayesian methods with diffuse priors had power comparable to the distribution of the product and bootstrap methods, and Bayesian methods with informative priors had the most power. Varying degrees of precision of prior distributions were also examined. Increased precision led to greater power only when N ≥ 100 and the effects were small, N < 60 and the effects were large, and N < 200 and the effects were medium. An empirical example from psychology illustrated a Bayesian analysis of the single mediator model from prior selection to interpreting results. PMID:29662296
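A common way to obtain a credible-style interval for the mediated effect a*b is Monte Carlo simulation from the two paths' posteriors. The sketch below assumes independent normal posteriors for the paths (roughly the diffuse-prior case); it is not the authors' exact procedure, and the path estimates are hypothetical.

```python
import numpy as np

def mediated_effect_ci(a_hat, se_a, b_hat, se_b,
                       n_draws=100000, level=0.95, seed=0):
    """Monte Carlo interval for the mediated effect a*b, treating the two
    path estimates as independent normals (a rough diffuse-prior
    approximation, not the exact models compared in the paper)."""
    rng = np.random.default_rng(seed)
    ab = rng.normal(a_hat, se_a, n_draws) * rng.normal(b_hat, se_b, n_draws)
    tail = (1.0 - level) / 2.0
    return np.quantile(ab, [tail, 1.0 - tail])

# Hypothetical path estimates: a = 0.4 (SE 0.1), b = 0.3 (SE 0.1)
print(mediated_effect_ci(0.4, 0.1, 0.3, 0.1))
```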
Browne, Erica N; Rathinam, Sivakumar R; Kanakath, Anuradha; Thundikandy, Radhika; Babu, Manohar; Lietman, Thomas M; Acharya, Nisha R
2017-02-01
To conduct a Bayesian analysis of a randomized clinical trial (RCT) for non-infectious uveitis using expert opinion as a subjective prior belief. An RCT was conducted to determine which antimetabolite, methotrexate or mycophenolate mofetil, is more effective as an initial corticosteroid-sparing agent for the treatment of intermediate, posterior, and pan-uveitis. Before the release of trial results, expert opinion on the relative effectiveness of these two medications was collected via online survey. Members of the American Uveitis Society executive committee were invited to provide an estimate for the relative decrease in efficacy with a 95% credible interval (CrI). A prior probability distribution was created from experts' estimates. A Bayesian analysis was performed using the constructed expert prior probability distribution and the trial's primary outcome. A total of 11 of the 12 invited uveitis specialists provided estimates. Eight of 11 experts (73%) believed mycophenolate mofetil is more effective. The group prior belief was that the odds of treatment success for patients taking mycophenolate mofetil were 1.4-fold the odds of those taking methotrexate (95% CrI 0.03-45.0). The odds of treatment success with mycophenolate mofetil compared to methotrexate was 0.4 from the RCT (95% confidence interval 0.1-1.2) and 0.7 (95% CrI 0.2-1.7) from the Bayesian analysis. A Bayesian analysis combining expert belief with the trial's result did not indicate preference for one drug. However, the wide credible interval leaves open the possibility of a substantial treatment effect. This suggests the clinical equipoise necessary to allow a larger, more definitive RCT. PMID:27982726
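The prior-plus-trial update reported here can be approximated with a textbook normal-normal conjugate calculation on the log odds-ratio scale. This is a sketch under a normality assumption, so it will not reproduce the published posterior (0.7, 95% CrI 0.2-1.7) exactly; the published analysis used the experts' actual prior distribution.

```python
import numpy as np

def combine_normal(prior_or, prior_ci, trial_or, trial_ci, z=1.96):
    """Normal-normal conjugate update on the log odds-ratio scale.
    Interval widths are converted to normal SDs via (log hi - log lo)/(2z);
    an approximation only, not the published computation."""
    def to_mean_sd(point, lo_hi):
        mean = np.log(point)
        sd = (np.log(lo_hi[1]) - np.log(lo_hi[0])) / (2.0 * z)
        return mean, sd
    m0, s0 = to_mean_sd(prior_or, prior_ci)   # expert prior
    m1, s1 = to_mean_sd(trial_or, trial_ci)   # trial likelihood
    w0, w1 = 1.0 / s0**2, 1.0 / s1**2         # precisions
    post_m = (w0 * m0 + w1 * m1) / (w0 + w1)
    post_s = (w0 + w1) ** -0.5
    return np.exp(post_m), (np.exp(post_m - z * post_s),
                            np.exp(post_m + z * post_s))

or_post, cri = combine_normal(1.4, (0.03, 45.0), 0.4, (0.1, 1.2))
print(or_post, cri)  # pulled toward the trial's 0.4, since the prior is wide
```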
Author Correction: Phase-resolved X-ray polarimetry of the Crab pulsar with the AstroSat CZT Imager
NASA Astrophysics Data System (ADS)
Vadawale, S. V.; Chattopadhyay, T.; Mithun, N. P. S.; Rao, A. R.; Bhattacharya, D.; Vibhute, A.; Bhalerao, V. B.; Dewangan, G. C.; Misra, R.; Paul, B.; Basu, A.; Joshi, B. C.; Sreekumar, S.; Samuel, E.; Priya, P.; Vinod, P.; Seetha, S.
2018-05-01
In the Supplementary Information file originally published for this Letter, in Supplementary Fig. 7 the error bars for the polarization fraction were provided as confidence intervals but instead should have been Bayesian credibility intervals. This has been corrected and does not alter the conclusions of the Letter in any way.
Resolved Stellar Streams around NGC 4631 from a Subaru/Hyper Suprime-Cam Survey
NASA Astrophysics Data System (ADS)
Tanaka, Mikito; Chiba, Masashi; Komiyama, Yutaka
2017-06-01
We present the first results of the Subaru/Hyper Suprime-Cam survey of the interacting galaxy system NGC 4631 and NGC 4656. From the maps of resolved stellar populations, we identify 11 dwarf galaxies (including already-known dwarfs) in the outer region of NGC 4631 and two tidal stellar streams around NGC 4631, named Stream SE and Stream NW. This paper describes the fundamental properties of these tidal streams. Based on the tip of the red giant branch method and Bayesian statistics, we find that Stream SE (expected a posteriori, EAP, distance of 7.10 Mpc with a 90% credible interval of [6.22, 7.29] Mpc) and Stream NW (EAP distance of 7.91 Mpc with a 90% credible interval of [6.44, 7.97] Mpc) are located in front of and behind NGC 4631, respectively. We also calculate the metallicity distribution of the stellar streams by comparing the member stars with theoretical isochrones on the color-magnitude diagram. We find that both streams have the same stellar population based on the Bayesian model selection method, suggesting that they originated from a tidal interaction between NGC 4631 and a single dwarf satellite. The expected progenitor has a positively skewed metallicity distribution function with [M/H]_EAP = -0.92 and a 90% credible interval of [-1.46, -0.51]. The stellar mass of the progenitor is estimated as 3.7 × 10^8 M⊙, with a 90% credible interval of [5.8 × 10^6, 8.6 × 10^9] M⊙, based on the mass-metallicity relation for Local Group dwarf galaxies. This is in good agreement with the initial stellar mass of the progenitor that was presumed in the previous N-body simulation.
Application of Bayesian model averaging to measurements of the primordial power spectrum
NASA Astrophysics Data System (ADS)
Parkinson, David; Liddle, Andrew R.
2010-11-01
Cosmological parameter uncertainties are often stated assuming a particular model, neglecting the model uncertainty, even when Bayesian model selection is unable to identify a conclusive best model. Bayesian model averaging is a method for assessing parameter uncertainties in situations where there is also uncertainty in the underlying model. We apply model averaging to the estimation of the parameters associated with the primordial power spectra of curvature and tensor perturbations. We use CosmoNest and MultiNest to compute the model evidences and posteriors, using cosmic microwave background data from WMAP, ACBAR, BOOMERanG, and CBI, plus large-scale structure data from the SDSS DR7. We find that the model-averaged 95% credible interval for the spectral index using all of the data is 0.940
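Model averaging over posterior samples can be sketched as a mixture weighted by posterior model probabilities. The toy "models" and evidences below are hypothetical stand-ins for the CosmoNest/MultiNest outputs, and equal prior model odds are assumed.

```python
import numpy as np

def model_average(samples_per_model, log_evidences, n_draws=20000, seed=0):
    """Bayesian model averaging: pool posterior samples of a shared
    parameter, with each model weighted by its posterior probability
    (computed from the evidences, assuming equal prior model odds)."""
    rng = np.random.default_rng(seed)
    log_z = np.asarray(log_evidences, float)
    w = np.exp(log_z - log_z.max())   # subtract max for numerical stability
    w /= w.sum()
    counts = rng.multinomial(n_draws, w)
    pooled = [rng.choice(np.asarray(s), size=n, replace=True)
              for s, n in zip(samples_per_model, counts)]
    return np.concatenate(pooled)

# Toy posteriors for a spectral-index-like parameter from two models
rng = np.random.default_rng(1)
m1 = rng.normal(0.96, 0.010, 10000)   # tilted-spectrum-like model
m2 = rng.normal(1.00, 0.005, 10000)   # scale-invariant-like model
avg = model_average([m1, m2], log_evidences=[0.0, -1.0])
print(np.quantile(avg, [0.025, 0.975]))  # model-averaged 95% interval
```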
2014-01-01
Background: Meta-regression is becoming increasingly used to model study level covariate effects. However, this type of statistical analysis presents many difficulties and challenges. Here two methods for calculating confidence intervals for the magnitude of the residual between-study variance in random effects meta-regression models are developed. A further suggestion for calculating credible intervals using informative prior distributions for the residual between-study variance is presented. Methods: Two recently proposed and, under the assumptions of the random effects model, exact methods for constructing confidence intervals for the between-study variance in random effects meta-analyses are extended to the meta-regression setting. The use of Generalised Cochran heterogeneity statistics is extended to the meta-regression setting and a Newton-Raphson procedure is developed to implement the Q profile method for meta-analysis and meta-regression. WinBUGS is used to implement informative priors for the residual between-study variance in the context of Bayesian meta-regressions. Results: Results are obtained for two contrasting examples, where the first example involves a binary covariate and the second involves a continuous covariate. Intervals for the residual between-study variance are wide for both examples. Conclusions: Statistical methods, and R computer software, are available to compute exact confidence intervals for the residual between-study variance under the random effects model for meta-regression. These frequentist methods are almost as easily implemented as their established counterparts for meta-analysis. Bayesian meta-regressions are also easily performed by analysts who are comfortable using WinBUGS. Estimates of the residual between-study variance in random effects meta-regressions should be routinely reported and accompanied by some measure of their uncertainty. Confidence and/or credible intervals are well-suited to this purpose. PMID:25196829
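The Q-profile idea can be illustrated for plain random-effects meta-analysis (the paper's Newton-Raphson extension handles meta-regression). A sketch using generic root finding, with hypothetical data:

```python
import numpy as np
from scipy import optimize, stats

def q_profile_ci(y, v, level=0.95):
    """Q-profile confidence interval for the between-study variance tau^2
    in a plain random-effects meta-analysis (no covariates, so df = k - 1)."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    k = len(y)

    def Q(tau2):
        w = 1.0 / (v + tau2)
        mu = np.sum(w * y) / np.sum(w)
        return np.sum(w * (y - mu) ** 2)

    alpha = 1.0 - level
    q_hi = stats.chi2.ppf(1.0 - alpha / 2.0, k - 1)
    q_lo = stats.chi2.ppf(alpha / 2.0, k - 1)
    big = 100.0 * (np.var(y) + v.max())  # generous bracket; Q decreases in tau2
    lo = 0.0 if Q(0.0) < q_hi else optimize.brentq(lambda t: Q(t) - q_hi, 0.0, big)
    hi = 0.0 if Q(0.0) < q_lo else optimize.brentq(lambda t: Q(t) - q_lo, 0.0, big)
    return lo, hi

# Hypothetical log odds ratios and within-study variances
y = [0.1, 0.4, -0.2, 0.6, 0.3]
v = [0.04, 0.05, 0.06, 0.04, 0.05]
print(q_profile_ci(y, v))
```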
Kruschke, John K; Liddell, Torrin M
2018-02-01
In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
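One estimation-oriented summary prominent in this literature is the highest density interval (HDI): the narrowest interval containing a given posterior mass. A minimal sketch computing an HDI from posterior samples (a standard sample-based approximation, not any particular package's implementation):

```python
import numpy as np

def hdi(samples, mass=0.95):
    """Highest density interval from posterior samples: the narrowest
    interval containing `mass` of the draws."""
    s = np.sort(np.asarray(samples))
    n = len(s)
    m = int(np.ceil(mass * n))           # points each candidate must contain
    widths = s[m - 1:] - s[: n - m + 1]  # width of every m-point window
    i = np.argmin(widths)
    return s[i], s[i + m - 1]

rng = np.random.default_rng(0)
draws = rng.normal(0.0, 1.0, 50000)
lo, hi = hdi(draws)
print(lo, hi)  # close to (-1.96, 1.96) for a standard normal
```

For symmetric posteriors the HDI and the central credible interval nearly coincide; for skewed posteriors they can differ noticeably.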
A hierarchical Bayesian GEV model for improving local and regional flood quantile estimates
NASA Astrophysics Data System (ADS)
Lima, Carlos H. R.; Lall, Upmanu; Troy, Tara; Devineni, Naresh
2016-10-01
We estimate local and regional Generalized Extreme Value (GEV) distribution parameters for flood frequency analysis in a multilevel, hierarchical Bayesian framework, to explicitly model and reduce uncertainties. As prior information for the model, we assume that the GEV location and scale parameters for each site come from independent log-normal distributions, whose mean parameter scales with the drainage area. From empirical and theoretical arguments, the shape parameter for each site is shrunk towards a common mean. Non-informative prior distributions are assumed for the hyperparameters and the MCMC method is used to sample from the joint posterior distribution. The model is tested using annual maximum series from 20 streamflow gauges located in an 83,000 km2 flood prone basin in Southeast Brazil. The results show a significant reduction of uncertainty estimates of flood quantile estimates over the traditional GEV model, particularly for sites with shorter records. For return periods within the range of the data (around 50 years), the Bayesian credible intervals for the flood quantiles tend to be narrower than the classical confidence limits based on the delta method. As the return period increases beyond the range of the data, the confidence limits from the delta method become unreliable and the Bayesian credible intervals provide a way to estimate satisfactory confidence bands for the flood quantiles considering parameter uncertainties and regional information. In order to evaluate the applicability of the proposed hierarchical Bayesian model for regional flood frequency analysis, we estimate flood quantiles for three randomly chosen out-of-sample sites and compare with classical estimates using the index flood method. 
The posterior distributions of the scaling law coefficients are used to define the predictive distributions of the GEV location and scale parameters for the out-of-sample sites given only their drainage areas and the posterior distribution of the average shape parameter is taken as the regional predictive distribution for this parameter. While the index flood method does not provide a straightforward way to consider the uncertainties in the index flood and in the regional parameters, the results obtained here show that the proposed Bayesian method is able to produce adequate credible intervals for flood quantiles that are in accordance with empirical estimates.
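Flood quantiles and their credible intervals can be sketched by pushing posterior parameter draws through the GEV quantile function. The draws below are hypothetical independent normals, not the paper's MCMC output; note scipy's shape-sign convention.

```python
import numpy as np
from scipy.stats import genextreme

def flood_quantile(T, loc, scale, shape):
    """T-year flood quantile from GEV parameters; scipy's shape parameter
    is c = -xi relative to the usual hydrological convention."""
    return genextreme.ppf(1.0 - 1.0 / T, c=-shape, loc=loc, scale=scale)

def quantile_credible_interval(T, post_draws, level=0.90):
    """Credible interval for the T-year flood from posterior parameter
    draws (hypothetical draws here; the paper's come from MCMC on the
    hierarchical model)."""
    q = np.array([flood_quantile(T, *d) for d in post_draws])
    a = (1.0 - level) / 2.0
    return np.quantile(q, [a, 1.0 - a])

rng = np.random.default_rng(0)
draws = np.column_stack([
    rng.normal(1000.0, 50.0, 2000),  # location mu (m^3/s)
    rng.normal(300.0, 20.0, 2000),   # scale sigma
    rng.normal(0.1, 0.02, 2000),     # shape xi
])
print(quantile_credible_interval(50, draws))  # 90% CrI for the 50-yr flood
```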
A computer program for uncertainty analysis integrating regression and Bayesian methods
Lu, Dan; Ye, Ming; Hill, Mary C.; Poeter, Eileen P.; Curtis, Gary
2014-01-01
This work develops a new functionality in UCODE_2014 to evaluate Bayesian credible intervals using the Markov Chain Monte Carlo (MCMC) method. The MCMC capability in UCODE_2014 is based on the FORTRAN version of the differential evolution adaptive Metropolis (DREAM) algorithm of Vrugt et al. (2009), which estimates the posterior probability density function of model parameters in high-dimensional and multimodal sampling problems. The UCODE MCMC capability provides eleven prior probability distributions and three ways to initialize the sampling process. It evaluates parametric and predictive uncertainties and it has parallel computing capability based on multiple chains to accelerate the sampling process. This paper tests and demonstrates the MCMC capability using a 10-dimensional multimodal mathematical function, a 100-dimensional Gaussian function, and a groundwater reactive transport model. The use of the MCMC capability is made straightforward and flexible by adopting the JUPITER API protocol. With the new MCMC capability, UCODE_2014 can be used to calculate three types of uncertainty intervals, which all can account for prior information: (1) linear confidence intervals which require linearity and Gaussian error assumptions and typically 10s–100s of highly parallelizable model runs after optimization, (2) nonlinear confidence intervals which require a smooth objective function surface and Gaussian observation error assumptions and typically 100s–1,000s of partially parallelizable model runs after optimization, and (3) MCMC Bayesian credible intervals which require few assumptions and commonly 10,000s–100,000s or more partially parallelizable model runs. Ready access allows users to select methods best suited to their work, and to compare methods in many circumstances.
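The MCMC route to credible intervals can be illustrated with a random-walk Metropolis sampler; this is a minimal stand-in for DREAM, which uses adaptive differential-evolution proposals across multiple chains. The toy posterior is an assumption for demonstration.

```python
import numpy as np

def metropolis(log_post, x0, n_steps=30000, step=2.4, seed=0):
    """Random-walk Metropolis sampler for a 1-D posterior (a minimal
    stand-in for DREAM's adaptive multi-chain sampler)."""
    rng = np.random.default_rng(seed)
    x = x0
    lp = log_post(x)
    chain = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + rng.normal(0.0, step)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

# Toy posterior: standard normal; 95% CrI should be near +/-1.96
chain = metropolis(lambda x: -0.5 * x * x, x0=0.0)
print(np.quantile(chain[5000:], [0.025, 0.975]))  # discard burn-in
```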
Meta-analysis of few small studies in orphan diseases.
Friede, Tim; Röver, Christian; Wandel, Simon; Neuenschwander, Beat
2017-03-01
Meta-analyses in orphan diseases and small populations generally face particular problems, including small numbers of studies, small study sizes and heterogeneity of results. However, the heterogeneity is difficult to estimate if only very few studies are included. Motivated by a systematic review in immunosuppression following liver transplantation in children, we investigate the properties of a range of commonly used frequentist and Bayesian procedures in simulation studies. Furthermore, the consequences for interval estimation of the common treatment effect in random-effects meta-analysis are assessed. The Bayesian credibility intervals using weakly informative priors for the between-trial heterogeneity exhibited coverage probabilities in excess of the nominal level for a range of scenarios considered. However, they tended to be shorter than those obtained by the Knapp-Hartung method, which were also conservative. In contrast, methods based on normal quantiles exhibited coverages well below the nominal levels in many scenarios. With very few studies, the performance of the Bayesian credibility intervals is of course sensitive to the specification of the prior for the between-trial heterogeneity. In conclusion, the use of weakly informative priors as exemplified by half-normal priors (with a scale of 0.5 or 1.0) for log odds ratios is recommended for applications in rare diseases. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
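The recommended half-normal heterogeneity prior can be illustrated with a grid approximation to the posterior of the between-trial SD tau, integrating the overall effect out under a flat prior. The two "trials" are hypothetical, and this sketch is not the authors' implementation.

```python
import numpy as np
from scipy import stats

def tau_posterior(y, se, prior_scale=0.5, grid=None):
    """Grid posterior for the between-trial SD tau with a half-normal
    prior, integrating the overall mean effect out under a flat prior
    (normal approximation for each trial's log odds ratio)."""
    y, se = np.asarray(y, float), np.asarray(se, float)
    if grid is None:
        grid = np.linspace(1e-6, 4.0, 2000)
    log_post = np.empty_like(grid)
    for i, tau in enumerate(grid):
        var = se ** 2 + tau ** 2
        w = 1.0 / var
        mu_hat = np.sum(w * y) / np.sum(w)
        log_marg = (-0.5 * np.sum(np.log(var))
                    - 0.5 * np.sum(w * (y - mu_hat) ** 2)
                    - 0.5 * np.log(np.sum(w)))  # from integrating mu out
        log_post[i] = log_marg + stats.halfnorm.logpdf(tau, scale=prior_scale)
    post = np.exp(log_post - log_post.max())
    post /= post.sum() * (grid[1] - grid[0])   # normalise on the grid
    return grid, post

# Two small hypothetical trials: log odds ratios with standard errors
grid, post = tau_posterior(y=[0.2, 0.8], se=[0.3, 0.4])
cdf = np.cumsum(post) * (grid[1] - grid[0])
print(grid[np.searchsorted(cdf, [0.025, 0.5, 0.975])])  # 95% CrI and median
```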
Gustafsson, Mats G; Wallman, Mikael; Wickenberg Bolin, Ulrika; Göransson, Hanna; Fryknäs, M; Andersson, Claes R; Isaksson, Anders
2010-06-01
Successful use of classifiers that learn to make decisions from a set of patient examples requires robust methods for performance estimation. Recently, many promising approaches for determination of an upper bound for the error rate of a single classifier have been reported, but the Bayesian credibility interval (CI) obtained from a conventional holdout test still delivers one of the tightest bounds. The conventional Bayesian CI becomes unacceptably large in real world applications where the test set sizes are less than a few hundred. The source of this problem is the fact that the CI is determined exclusively by the result on the test examples. In other words, there is no information at all provided by the uniform prior density distribution employed, which reflects complete lack of prior knowledge about the unknown error rate. Therefore, the aim of the study reported here was to study a maximum entropy (ME) based approach to improved prior knowledge and Bayesian CIs, demonstrating its relevance for biomedical research and clinical practice. It is demonstrated how a refined non-uniform prior density distribution can be obtained by means of the ME principle using empirical results from a few designs and tests using non-overlapping sets of examples. Experimental results show that ME based priors improve the CIs when employed to four quite different simulated and two real world data sets. An empirically derived ME prior seems promising for improving the Bayesian CI for the unknown error rate of a designed classifier. Copyright 2010 Elsevier B.V. All rights reserved.
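The conventional holdout-based Bayesian CI the paper starts from is a Beta-binomial calculation. The sketch below shows the uniform-prior interval and how an informative Beta prior tightens it; the informative Beta here merely stands in for an ME-derived prior, which would generally not be a Beta distribution.

```python
import numpy as np
from scipy import stats

def error_rate_ci(errors, n_test, a=1.0, b=1.0, level=0.95):
    """Bayesian credibility interval for a classifier's error rate from a
    holdout test: Beta(a, b) prior (uniform by default, as in the
    conventional approach) plus a binomial likelihood."""
    post = stats.beta(a + errors, b + n_test - errors)
    alpha = (1.0 - level) / 2.0
    return post.ppf(alpha), post.ppf(1.0 - alpha)

# Small test set: the uniform-prior interval is wide...
print(error_rate_ci(5, 30))
# ...an informative prior (hypothetical, standing in for an ME-derived
# one built from earlier designs) tightens it
print(error_rate_ci(5, 30, a=3.0, b=20.0))
```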
Bayesian inference for disease prevalence using negative binomial group testing
Pritchard, Nicholas A.; Tebbs, Joshua M.
2011-01-01
Group testing, also known as pooled testing, and inverse sampling are both widely used methods of data collection when the goal is to estimate a small proportion. Taking a Bayesian approach, we consider the new problem of estimating disease prevalence from group testing when inverse (negative binomial) sampling is used. Using different distributions to incorporate prior knowledge of disease incidence and different loss functions, we derive closed form expressions for posterior distributions and resulting point and credible interval estimators. We then evaluate our new estimators, on Bayesian and classical grounds, and apply our methods to a West Nile Virus data set. PMID:21259308
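For intuition, a grid (rather than closed-form) version of the posterior under inverse group-testing sampling can be sketched as follows; the data, pool size, and Beta prior are all hypothetical.

```python
import numpy as np

def prevalence_posterior(n_neg_pools, t_pos_pools, pool_size,
                         a=1.0, b=9.0, grid=None):
    """Grid posterior for disease prevalence p under inverse (negative
    binomial) group testing: pools of `pool_size` are tested until
    `t_pos_pools` test positive, with `n_neg_pools` negatives observed.
    A Beta(a, b) prior on p encodes low expected prevalence."""
    if grid is None:
        grid = np.linspace(1e-6, 0.5, 5000)
    p = grid
    theta = 1.0 - (1.0 - p) ** pool_size          # P(a pool tests positive)
    log_lik = (n_neg_pools * pool_size * np.log(1.0 - p)   # (1-theta)^n
               + t_pos_pools * np.log(theta))
    log_prior = (a - 1.0) * np.log(p) + (b - 1.0) * np.log(1.0 - p)
    post = np.exp(log_lik + log_prior - (log_lik + log_prior).max())
    post /= post.sum() * (grid[1] - grid[0])
    return grid, post

grid, post = prevalence_posterior(n_neg_pools=40, t_pos_pools=3, pool_size=10)
cdf = np.cumsum(post) * (grid[1] - grid[0])
print(grid[np.searchsorted(cdf, [0.025, 0.975])])  # 95% credible interval for p
```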
A Two-Step Bayesian Approach for Propensity Score Analysis: Simulations and Case Study.
Kaplan, David; Chen, Jianshen
2012-07-01
A two-step Bayesian propensity score approach is introduced that incorporates prior information in the propensity score equation and outcome equation without the problems associated with simultaneous Bayesian propensity score approaches. The corresponding variance estimators are also provided. The two-step Bayesian propensity score is provided for three methods of implementation: propensity score stratification, weighting, and optimal full matching. Three simulation studies and one case study are presented to elaborate the proposed two-step Bayesian propensity score approach. Results of the simulation studies reveal that greater precision in the propensity score equation yields better recovery of the frequentist-based treatment effect. A slight advantage is shown for the Bayesian approach in small samples. Results also reveal that greater precision around the wrong treatment effect can lead to seriously distorted results. However, greater precision around the correct treatment effect parameter yields quite good results, with slight improvement seen with greater precision in the propensity score equation. A comparison of coverage rates for the conventional frequentist approach and proposed Bayesian approach is also provided. The case study reveals that credible intervals are wider than frequentist confidence intervals when priors are non-informative.
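For orientation, the stratification step can be sketched in its plain frequentist form (the paper's version makes both the propensity model and the outcome model Bayesian); the simulated data and the use of the true propensity score are assumptions for illustration.

```python
import numpy as np

def stratified_effect(ps, y, t, n_strata=5):
    """Propensity-score stratification: average the within-stratum
    treated-vs-control mean differences, weighted by stratum size."""
    edges = np.quantile(ps, np.linspace(0.0, 1.0, n_strata + 1))
    idx = np.clip(np.searchsorted(edges[1:-1], ps, side="right"),
                  0, n_strata - 1)
    est, wsum = 0.0, 0.0
    for s in range(n_strata):
        m = idx == s
        if t[m].any() and (~t[m]).any():      # need both arms in stratum
            est += m.sum() * (y[m & t].mean() - y[m & ~t].mean())
            wsum += m.sum()
    return est / wsum

# Simulated data with a confounder x; true treatment effect = 2
rng = np.random.default_rng(0)
x = rng.normal(size=4000)
p_treat = 1.0 / (1.0 + np.exp(-x))
t = rng.uniform(size=4000) < p_treat
y = 2.0 * t + x + rng.normal(size=4000)
print(stratified_effect(p_treat, y, t))  # near the true effect of 2
```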
Inverse analysis and regularisation in conditional source-term estimation modelling
NASA Astrophysics Data System (ADS)
Labahn, Jeffrey W.; Devaud, Cecile B.; Sipkens, Timothy A.; Daun, Kyle J.
2014-05-01
Conditional Source-term Estimation (CSE) obtains the conditional species mass fractions by inverting a Fredholm integral equation of the first kind. In the present work, a Bayesian framework is used to compare two different regularisation methods: zeroth-order temporal Tikhonov regularisation and first-order spatial Tikhonov regularisation. The objectives of the current study are: (i) to elucidate the ill-posedness of the inverse problem; (ii) to understand the origin of the perturbations in the data and quantify their magnitude; (iii) to quantify the uncertainty in the solution using different priors; and (iv) to determine the regularisation method best suited to this problem. A singular value decomposition shows that the current inverse problem is ill-posed. Perturbations to the data may be caused by the use of a discrete mixture fraction grid for calculating the mixture fraction PDF. The magnitude of the perturbations is estimated using a box filter and the uncertainty in the solution is determined based on the width of the credible intervals. The width of the credible intervals is significantly reduced with the inclusion of a smoothing prior and the recovered solution is in better agreement with the exact solution. The credible intervals for temporal and spatial smoothing are shown to be similar. Credible intervals for temporal smoothing depend on the solution from the previous time step and a smooth solution is not guaranteed. For spatial smoothing, the credible intervals are not dependent upon a previous solution and better predict characteristics for higher mixture fraction values. These characteristics make spatial smoothing a promising alternative method for recovering a solution from the CSE inversion process.
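Zeroth- and first-order Tikhonov regularisation of a discretised Fredholm problem can be sketched generically; this is not the CSE code, and the toy Gaussian kernel below is an assumption for illustration.

```python
import numpy as np

def tikhonov(A, b, lam, order=0):
    """Tikhonov-regularised solution of A x ~= b: minimise
    ||A x - b||^2 + lam^2 ||L x||^2 with L = I (zeroth order) or a
    first-difference matrix (first order, i.e. a smoothing prior)."""
    A = np.asarray(A, float)
    n = A.shape[1]
    L = np.eye(n) if order == 0 else np.diff(np.eye(n), axis=0)
    return np.linalg.solve(A.T @ A + lam ** 2 * (L.T @ L), A.T @ b)

# Ill-conditioned toy kernel: a discretised Gaussian blur
x_true = np.sin(np.linspace(0.0, np.pi, 20))
A = np.exp(-0.5 * (np.subtract.outer(np.arange(20), np.arange(20)) / 3.0) ** 2)
b = A @ x_true + np.random.default_rng(0).normal(0.0, 0.01, 20)
x_reg = tikhonov(A, b, lam=0.1, order=1)   # first-order (smoothing) solution
print(np.max(np.abs(x_reg - x_true)))      # modest reconstruction error
```

Without regularisation the small singular values of the kernel amplify the noise in b, which is the ill-posedness the abstract's singular value decomposition exposes.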
Demography and population status of polar bears in western Hudson Bay
Lunn, Nicholas J.; Regehr, Eric V.; Servanty, Sabrina; Converse, Sarah J.; Richardson, Evan S.; Stirling, Ian
2013-01-01
The 2011 abundance estimate from this analysis was 806 bears with a 95% Bayesian credible interval of 653-984. This is lower than, but broadly consistent with, the abundance estimate of 1,030 (95% confidence interval = 745-1406) from a 2011 aerial survey (Stapleton et al. 2014). The capture-recapture and aerial survey approaches have different spatial and temporal coverage of the WH subpopulation and, consequently, the effective study population considered by each approach is different.
Benedetto, Umberto; Taggart, David P; Sousa-Uva, Miguel; Biondi-Zoccai, Giuseppe; Di Franco, Antonino; Ohmes, Lucas B; Rahouma, Mohamed; Kamel, Mohamed; Caputo, Massimo; Girardi, Leonard N; Angelini, Gianni D; Gaudino, Mario
2018-05-01
With the advent of bare metal stents and drug-eluting stents, percutaneous coronary intervention has emerged as an alternative to coronary artery bypass grafting surgery for unprotected left main disease. However, whether the evolution of stent technology has translated into better results after percutaneous coronary intervention remains unclear. We aimed to compare coronary artery bypass grafting with stents of different generations for left main disease by performing a Bayesian network meta-analysis of available randomized controlled trials. All randomized controlled trials with at least 1 arm randomized to percutaneous coronary intervention with stents or coronary artery bypass grafting for left main disease were included. Bare metal stents and first- and second-generation drug-eluting stents were compared with coronary artery bypass grafting. Poisson methods and a Bayesian framework were used to compute the head-to-head incidence rate ratios and 95% credible intervals. Primary end points were the composite of death/myocardial infarction/stroke and repeat revascularization. Nine randomized controlled trials were included in the final analysis. Six trials compared percutaneous coronary intervention with coronary artery bypass grafting (n = 4654), and 3 trials compared different types of stents (n = 1360). Follow-up ranged from 6 months to 5 years. Second-generation drug-eluting stents (incidence rate ratio, 1.3; 95% credible interval, 1.1-1.6), but not bare metal stents (incidence rate ratio, 0.63; 95% credible interval, 0.27-1.4) or first-generation drug-eluting stents (incidence rate ratio, 0.85; 95% credible interval, 0.65-1.1), were associated with a significantly increased risk of death/myocardial infarction/stroke when compared with coronary artery bypass grafting.
When compared with coronary artery bypass grafting, the highest risk of repeat revascularization was observed for bare metal stents (incidence rate ratio, 5.1; 95% credible interval, 2.1-14), whereas first-generation drug-eluting stents (incidence rate ratio, 1.8; 95% credible interval, 1.4-2.4) and second-generation drug-eluting stents (incidence rate ratio, 1.8; 95% credible interval, 1.4-2.4) were comparable. The introduction of new-generation drug-eluting stents did not translate into better outcomes for percutaneous coronary intervention when compared with coronary artery bypass grafting. Copyright © 2017 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Bayesian estimation of seasonal course of canopy leaf area index from hyperspectral satellite data
NASA Astrophysics Data System (ADS)
Varvia, Petri; Rautiainen, Miina; Seppänen, Aku
2018-03-01
In this paper, Bayesian inversion of a physically based forest reflectance model is investigated to estimate boreal forest canopy leaf area index (LAI) from EO-1 Hyperion hyperspectral data. The data consist of multiple forest stands with different species compositions and structures, imaged in three phases of the growing season. The Bayesian estimates of canopy LAI are compared to reference estimates based on a spectral vegetation index. In addition to LAI, the forest reflectance model contains other unknown variables, for example leaf single scattering albedo and understory reflectance; in the Bayesian approach, these variables are estimated simultaneously with LAI, and the feasibility and seasonal variation of these estimates are also examined. Credible intervals for the estimates are calculated and evaluated. The results show that the Bayesian inversion approach is significantly better than a comparable spectral vegetation index regression.
Pensgaard, Anne Marte; Ivarsson, Andreas; Nilstad, Agnethe; Solstad, Bård Erlend; Steffen, Kathrin
2018-01-01
The relationship between specific types of stressors (eg, teammates, coach) and acute versus overuse injuries is not well understood. To examine the roles of different types of stressors as well as the effect of motivational climate on the occurrence of acute and overuse injuries. Players in the Norwegian elite female football league (n=193 players from 12 teams) participated in baseline screening tests prior to the 2009 competitive football season. As part of the screening, we included the Life Event Survey for Collegiate Athletes and the Perceived Motivational Climate in Sport Questionnaire (Norwegian short version). Acute and overuse time-loss injuries and exposure to training and matches were recorded prospectively in the football season using weekly text messaging. Data were analysed with Bayesian logistic regression analyses. Using Bayesian logistic regression analyses, we showed that perceived negative life event stress from teammates was associated with an increased risk of acute injuries (OR=1.23, 95% credibility interval (1.01 to 1.48)). There was a credible positive association between perceived negative life event stress from the coach and the risk of overuse injuries (OR=1.21, 95% credibility interval (1.01 to 1.45)). Players who report teammates as a source of stress have a greater risk of sustaining an acute injury, while players reporting the coach as a source of stress are at greater risk of sustaining an overuse injury. Motivational climate did not relate to increased injury occurrence.
On a full Bayesian inference for force reconstruction problems
NASA Astrophysics Data System (ADS)
Aucejo, M.; De Smet, O.
2018-05-01
In a previous paper, the authors introduced a flexible methodology for reconstructing mechanical sources in the frequency domain from prior local information on both their nature and location over a linear and time-invariant structure. The proposed approach was derived from Bayesian statistics, because of its ability to account mathematically for the experimenter's prior knowledge. However, since only the maximum a posteriori estimate was computed, the posterior uncertainty about the regularized solution given the measured vibration field, the mechanical model and the regularization parameter was not assessed. To address this limitation, this paper fully exploits the Bayesian framework to provide, from a Markov chain Monte Carlo algorithm, credible intervals and other statistical measures (mean, median, mode) for all the parameters of the force reconstruction problem.
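The general workflow described here (draw MCMC samples, then read off credible intervals and posterior summaries) can be sketched with a toy random-walk Metropolis sampler for a Gaussian mean. This is a generic illustration, not the authors' force-reconstruction model; the data, step size, and burn-in are arbitrary choices.

```python
import math
import random
import statistics

random.seed(0)
data = [random.gauss(1.0, 1.0) for _ in range(50)]
SIGMA = 1.0  # known measurement noise

def log_post(mu):
    # Flat prior on mu; Gaussian likelihood with known sigma.
    return -sum((y - mu) ** 2 for y in data) / (2 * SIGMA ** 2)

def metropolis(n_iter=20000, step=0.3, burn=2000):
    """Random-walk Metropolis: propose mu' ~ N(mu, step), accept with
    probability min(1, post(mu') / post(mu))."""
    mu, lp, chain = 0.0, log_post(0.0), []
    for _ in range(n_iter):
        prop = mu + random.gauss(0, step)
        lp_prop = log_post(prop)
        if math.log(random.random()) < lp_prop - lp:   # accept
            mu, lp = prop, lp_prop
        chain.append(mu)
    return chain[burn:]

samples = sorted(metropolis())
post_mean = statistics.mean(samples)
lo = samples[int(0.025 * len(samples))]   # 95% credible interval
hi = samples[int(0.975 * len(samples))]
```

With a flat prior the posterior is N(ȳ, σ²/n), so the sampled interval should closely match ȳ ± 1.96·σ/√n.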
NASA Astrophysics Data System (ADS)
Piecuch, C. G.; Huybers, P. J.; Tingley, M.
2016-12-01
Sea level observations from coastal tide gauges are some of the longest instrumental records of the ocean. However, these data can be noisy, biased, and gappy, with missing values and contamination from land motion and local effects. Coping with these issues in a formal manner is a challenging task. Some studies use Bayesian approaches to estimate sea level from tide gauge records, making inference probabilistically. Such methods are typically empirical Bayesian in nature: model parameters are treated as known and assigned point values. In reality, parameters are not perfectly known. Empirical Bayes methods thus neglect a potentially important source of uncertainty, and so may overestimate the precision (i.e., underestimate the uncertainty) of sea level estimates. We consider whether empirical Bayes methods underestimate uncertainty in sea level from tide gauge data, comparing to a full Bayes method that treats parameters as unknowns to be solved for along with the sea level field. We develop a hierarchical algorithm that we apply to tide gauge data on the North American northeast coast over 1893-2015. The algorithm is run in full Bayes mode, solving for the sea level process and parameters, and in empirical mode, solving only for the process using fixed parameter values. Error bars on sea level from the empirical method are smaller than from the full Bayes method, and the relative discrepancies increase with time; the 95% credible intervals on sea level values from the empirical Bayes method in 1910 and 2010 are 23% and 56% narrower, respectively, than from the full Bayes approach. To evaluate the representativeness of the credible intervals, empirical Bayes and full Bayes methods are applied to corrupted data of a known surrogate field.
Using rank histograms to evaluate the solutions, we find that the full Bayes method produces generally reliable error bars, whereas the empirical Bayes method gives too-narrow error bars, such that the 90% credible interval only encompasses 70% of true process values. Results demonstrate that parameter uncertainty is an important source of process uncertainty, and advocate for the fully Bayesian treatment of tide gauge records in ocean circulation and climate studies.
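The calibration check used here can be reduced to a simple coverage experiment: if the posterior is correct, a nominal 90% interval should capture the truth about 90% of the time, while understated parameter uncertainty shrinks the intervals and the empirical coverage drops. A self-contained toy version (a conjugate normal model, not the tide gauge hierarchy):

```python
import math
import random

random.seed(3)

def coverage(sd_used, n_trials=4000, sd_true=1.0, z=1.645):
    """Empirical coverage of a nominal 90% posterior interval for a
    N(0, 1) truth observed with noise sd_true, when the analyst plugs
    in sd_used for the noise level (empirical-Bayes style point value)."""
    hits = 0
    for _ in range(n_trials):
        truth = random.gauss(0, 1)
        obs = truth + random.gauss(0, sd_true)
        post_mean = obs / (1 + sd_used ** 2)                    # conjugate update
        post_sd = math.sqrt(sd_used ** 2 / (1 + sd_used ** 2))
        if abs(truth - post_mean) <= z * post_sd:
            hits += 1
    return hits / n_trials

well_calibrated = coverage(1.0)   # correct noise level: ~90% coverage
overconfident = coverage(0.5)     # understated noise: too-narrow intervals
```

The overconfident analysis mirrors the paper's finding of a 90% credible interval that encompasses far fewer than 90% of the true process values.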
Wang, Zengfang; Wang, Zengyan; Wang, Luang; Qiu, Mingyue; Wang, Yangang; Hou, Xu; Guo, Zhong; Wang, Bin
2017-03-01
Many studies have assessed the association between hypertensive disorders during pregnancy and risk of type 2 diabetes mellitus in later life, but contradictory findings were reported. A systematic review and meta-analysis was carried out to elucidate type 2 diabetes mellitus risk in women with hypertensive disorders during pregnancy. Pubmed, Embase, and Web of Science were searched for cohort or case-control studies on the association between hypertensive disorders during pregnancy and subsequent type 2 diabetes mellitus. A random-effects model was used to pool risk estimates. Bayesian meta-analysis was carried out to further estimate the type 2 diabetes mellitus risk associated with hypertensive disorders during pregnancy. Seventeen cohort or prospective matched case-control studies were finally included. Those 17 studies involved 2,984,634 women and 46,732 type 2 diabetes mellitus cases. Overall, hypertensive disorders during pregnancy were significantly correlated with type 2 diabetes mellitus risk (relative risk = 1.56, 95% confidence interval 1.21-2.01, P = 0.001). Preeclampsia was significantly and independently correlated with type 2 diabetes mellitus risk (relative risk = 2.25, 95% confidence interval 1.73-2.90, P < 0.001). In addition, gestational hypertension was also significantly and independently correlated with subsequent type 2 diabetes mellitus risk (relative risk = 2.06, 95% confidence interval 1.57-2.69, P < 0.001). The pooled estimates were not significantly altered in the subgroup analyses of studies on preeclampsia or gestational hypertension. Bayesian meta-analysis showed the relative risks of type 2 diabetes mellitus for individuals with hypertensive disorders during pregnancy, preeclampsia, and gestational hypertension were 1.59 (95% credibility interval: 1.11-2.32), 2.27 (95% credibility interval: 1.67-2.97), and 2.06 (95% credibility interval: 1.41-2.84), respectively. Publication bias was not evident in the meta-analysis.
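The frequentist random-effects pooling step can be sketched with the standard DerSimonian-Laird estimator on the log-risk scale. The study-level inputs below are invented for illustration, not the 17 studies analysed here.

```python
import math

def dersimonian_laird(log_rr, se, z=1.96):
    """Random-effects pooled relative risk using the DerSimonian-Laird
    moment estimate of the between-study variance tau^2."""
    w = [1 / s ** 2 for s in se]
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)
    w_star = [1 / (s ** 2 + tau2) for s in se]                   # RE weights
    mu = sum(wi * y for wi, y in zip(w_star, log_rr)) / sum(w_star)
    se_mu = math.sqrt(1 / sum(w_star))
    return math.exp(mu), math.exp(mu - z * se_mu), math.exp(mu + z * se_mu)

# Hypothetical study-level log relative risks and their standard errors.
rr, lo, hi = dersimonian_laird([0.62, 0.30, 0.47, 0.85], [0.15, 0.20, 0.10, 0.25])
```

When the studies are homogeneous (Q below its degrees of freedom), tau² is truncated at zero and the estimator reduces to the fixed-effect inverse-variance pool.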
Preeclampsia and gestational hypertension are independently associated with substantially elevated risk of type 2 diabetes mellitus in later life.
Hadwin, Paul J; Peterson, Sean D
2017-04-01
The Bayesian framework for parameter inference provides a basis from which subject-specific reduced-order vocal fold models can be generated. Previously, it has been shown that a particle filter technique is capable of producing estimates and associated credibility intervals of time-varying reduced-order vocal fold model parameters. However, the particle filter approach is difficult to implement and has a high computational cost, which can be barriers to clinical adoption. This work presents an alternative estimation strategy based upon Kalman filtering aimed at reducing the computational cost of subject-specific model development. The robustness of this approach to Gaussian and non-Gaussian noise is discussed. The extended Kalman filter (EKF) approach is found to perform very well in comparison with the particle filter technique at dramatically lower computational cost. Based upon the test cases explored, the EKF is comparable in terms of accuracy to the particle filter technique when more than 6000 particles are employed; if fewer particles are employed, the EKF actually performs better. For comparable levels of accuracy, the solution time is reduced by two orders of magnitude when employing the EKF. By virtue of the approximations used in the EKF, however, the credibility intervals tend to be slightly underpredicted.
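A scalar Kalman filter conveys the flavour of the approach at minimal cost: each step alternates a predict and a measurement update, and the running posterior variance yields approximate credibility intervals. This toy filter tracks a noisy constant, not the reduced-order vocal fold model; q, r, and the data are placeholders.

```python
import random

random.seed(7)

def kalman_1d(obs, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter: q is the process variance, r the
    measurement variance. Returns per-step posterior means and
    variances (the variances give approximate credibility intervals)."""
    x, p = x0, p0
    means, variances = [], []
    for z in obs:
        p = p + q                 # predict
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # measurement update
        p = (1 - k) * p
        means.append(x)
        variances.append(p)
    return means, variances

# Noisy observations of a constant state with true value 2.0.
obs = [2.0 + random.gauss(0, 0.5) for _ in range(200)]
means, variances = kalman_1d(obs)
```

The posterior variance shrinks as observations accumulate, which is exactly the quantity a particle filter would instead estimate from thousands of weighted samples.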
Modeling epilepsy disparities among ethnic groups in Philadelphia, PA
Wheeler, David C.; Waller, Lance A.; Elliott, John O.
2014-01-01
The Centers for Disease Control and Prevention defined epilepsy as an emerging public health issue in a recent report and emphasized the importance of epilepsy studies in minorities and people of low socioeconomic status. Previous research has suggested that the incidence rate for epilepsy is positively associated with various measures of social and economic disadvantage. In response, we utilize hierarchical Bayesian models to analyze health disparities in epilepsy and seizure risks among multiple ethnicities in the city of Philadelphia, Pennsylvania. The goals of the analysis are to highlight any overall significant disparities in epilepsy risks between the populations of Caucasians, African Americans, and Hispanics in the study area during the years 2002–2004 and to visualize the spatial pattern of epilepsy risks by ethnicity to indicate where certain ethnic populations were most adversely affected by epilepsy within the study area. Results of the Bayesian model indicate that Hispanics have the highest epilepsy risk overall, followed by African Americans, and then Caucasians. There are significant increases in relative risk for both African Americans and Hispanics when compared with Caucasians, as indicated by the posterior mean estimates of 2.09 with a 95 per cent credible interval of (1.67, 2.62) for African Americans and 2.97 with a 95 per cent credible interval of (2.37, 3.71) for Hispanics. Results also demonstrate that using a Bayesian analysis in combination with geographic information system (GIS) technology can reveal spatial patterns in patient data and highlight areas of disparity in epilepsy risk among subgroups of the population. PMID:18381676
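The relative risks above come from a hierarchical spatial model, but the core Poisson-Gamma update behind such disease-mapping estimates can be sketched with a Monte Carlo posterior for a rate ratio between two groups. The counts, exposures, and Gamma(0.5, 0.5) prior below are invented placeholders.

```python
import random

random.seed(2)

def rate_ratio_posterior(obs_a, exp_a, obs_b, exp_b,
                         a0=0.5, b0=0.5, n_draws=20000):
    """Monte Carlo posterior of the rate ratio theta_a / theta_b under
    independent Gamma(a0, b0) priors and Poisson likelihoods; each
    Gamma(a0 + obs, b0 + exposure) posterior is conjugate."""
    draws = []
    for _ in range(n_draws):
        ta = random.gammavariate(a0 + obs_a, 1.0 / (b0 + exp_a))
        tb = random.gammavariate(a0 + obs_b, 1.0 / (b0 + exp_b))
        draws.append(ta / tb)
    draws.sort()
    return (draws[n_draws // 2],            # posterior median
            draws[int(0.025 * n_draws)],    # 95% credible interval
            draws[int(0.975 * n_draws)])

# Hypothetical counts: 100 cases in 1000 person-years vs 50 in 1000.
med, lo, hi = rate_ratio_posterior(100, 1000, 50, 1000)
```

An interval that excludes 1.0, as here, is the Monte Carlo analogue of the "significant increase in relative risk" reported for the posterior mean estimates above.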
Hay, K E; Morton, J M; Schibrowski, M L; Clements, A C A; Mahony, T J; Barnes, T S
2016-05-01
Bovine respiratory disease (BRD) is the major cause of clinical disease and death in feedlot populations worldwide. A longitudinal study was conducted to assess associations between risk factors related to on-farm management prior to transport to the feedlot and risk of BRD in a population of feedlot beef cattle sourced from throughout the cattle producing regions of Australia. Exposure variables were derived from questionnaire data provided by farmers supplying cattle (N=10,721) that were a subset of the population included in a nationwide prospective study investigating numerous putative risk factors for BRD. Causal diagrams were used to inform model building to allow estimation of effects of interest. Multilevel mixed effects logistic regression models were fitted within the Bayesian framework. Animals that were yard weaned were at reduced risk (OR: 0.7, 95% credible interval: 0.5-1.0) of BRD at the feedlot compared to animals immediately returned to pasture after weaning. Animals that had previously been fed grain (OR: 0.6, 95% credible interval: 0.3-1.1) were probably at reduced risk of BRD at the feedlot compared to animals not previously fed grain. Animals that received prior vaccinations against Bovine viral diarrhoea virus 1 (OR: 0.8, 95% credible interval: 0.5-1.1) or Mannheimia haemolytica (OR: 0.8, 95% credible interval: 0.6-1.0) were also probably at reduced risk compared to non-vaccinated animals. The results of this study confirm that on-farm management before feedlot entry can alter risk of BRD after beef cattle enter feedlots. Copyright © 2016 Elsevier B.V. All rights reserved.
Performing Contrast Analysis in Factorial Designs: From NHST to Confidence Intervals and Beyond
Wiens, Stefan; Nilsson, Mats E.
2016-01-01
Because of the continuing debates about statistics, many researchers may feel confused about how to analyze and interpret data. Current guidelines in psychology advocate the use of effect sizes and confidence intervals (CIs). However, researchers may be unsure about how to extract effect sizes from factorial designs. Contrast analysis is helpful because it can be used to test specific questions of central interest in studies with factorial designs. It weights several means and combines them into one or two sets that can be tested with t tests. The effect size produced by a contrast analysis is simply the difference between means. The CI of the effect size informs directly about direction, hypothesis exclusion, and the relevance of the effects of interest. However, any interpretation in terms of precision or likelihood requires the use of likelihood intervals or credible intervals (Bayesian). These various intervals and even a Bayesian t test can be obtained easily with free software. This tutorial reviews these methods to guide researchers in answering the following questions: When I analyze mean differences in factorial designs, where can I find the effects of central interest, and what can I learn about their effect sizes? PMID:29805179
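The mechanics of a contrast analysis (zero-sum weights applied to cell means, with a standard error from the pooled within-group variance) can be sketched as follows; the 2x2 design and data are hypothetical.

```python
import math
import statistics

def contrast(groups, weights):
    """Contrast L = sum_j w_j * mean_j across independent groups; the
    standard error comes from the pooled within-group variance, and
    the weights must sum to zero."""
    assert abs(sum(weights)) < 1e-12
    means = [statistics.mean(g) for g in groups]
    ns = [len(g) for g in groups]
    df = sum(ns) - len(groups)
    pooled = sum(sum((x - m) ** 2 for x in g)
                 for g, m in zip(groups, means)) / df
    L = sum(w * m for w, m in zip(weights, means))
    se = math.sqrt(pooled * sum(w * w / n for w, n in zip(weights, ns)))
    return L, se, L / se, df

# Hypothetical 2x2 design; weights (1, -1, -1, 1) pick out the interaction.
groups = [[1, 2, 3], [2, 3, 4], [3, 4, 5], [6, 7, 8]]
L, se, t, df = contrast(groups, (1, -1, -1, 1))
```

The returned L is the effect size on the original scale (a difference between combined means), and t = L/se is referred to a t distribution on df degrees of freedom.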
Time-varying nonstationary multivariate risk analysis using a dynamic Bayesian copula
NASA Astrophysics Data System (ADS)
Sarhadi, Ali; Burn, Donald H.; Concepción Ausín, María; Wiper, Michael P.
2016-03-01
A time-varying risk analysis is proposed for an adaptive design framework in nonstationary conditions arising from climate change. A Bayesian, dynamic conditional copula is developed for modeling the time-varying dependence structure between mixed continuous and discrete multiattributes of multidimensional hydrometeorological phenomena. Joint Bayesian inference is carried out to fit the marginals and copula in an illustrative example using an adaptive, Gibbs Markov Chain Monte Carlo (MCMC) sampler. Posterior mean estimates and credible intervals are provided for the model parameters and the Deviance Information Criterion (DIC) is used to select the model that best captures different forms of nonstationarity over time. This study also introduces a fully Bayesian, time-varying joint return period for multivariate time-dependent risk analysis in nonstationary environments. The results demonstrate that the nature and the risk of extreme-climate multidimensional processes change over time under the impact of climate change; accordingly, long-term decision-making strategies should be updated based on the anomalies of the nonstationary environment.
Ashley-Martin, Jillian; Dodds, Linda; Arbuckle, Tye E.; Bouchard, Maryse F.; Fisher, Mandy; Morriset, Anne-Sophie; Monnier, Patricia; Shapiro, Gabriel D.; Ettinger, Adrienne S.; Dallaire, Renee; Taback, Shayne; Fraser, William; Platt, Robert W.
2017-01-01
Perfluoroalkyl substances (PFAS) are ubiquitous, persistent chemicals that have been widely used in the production of common household and consumer goods for their nonflammable, lipophobic, and hydrophobic properties. Inverse associations between maternal or umbilical cord blood concentrations of perfluorooctanoic acid and perfluorooctanesulfonate and birth weight have been identified. This literature has primarily examined each PFAS individually without consideration of the potential influence of correlated exposures. Further, the association between PFAS exposures and indicators of metabolic function (i.e., leptin and adiponectin) has received limited attention. We examined associations between first-trimester maternal plasma PFAS concentrations and birth weight and cord blood concentrations of leptin and adiponectin using data on 1,705 mother-infant pairs from the Maternal Infant Research on Environmental Chemicals (MIREC) Study, a trans-Canada birth cohort study that recruited women between 2008 and 2011. Bayesian hierarchical models were used to quantify associations and calculate credible intervals. Maternal perfluorooctanoic acid concentrations were inversely associated with birth weight z score, though the null value was included in all credible intervals (log10 β = −0.10, 95% credible interval: −0.34, 0.13). All associations between maternal PFAS concentrations and cord blood adipocytokine concentrations were of small magnitude and centered around the null value. Follow-up in a cohort of children is required to determine how the observed associations manifest in childhood. PMID:28172036
Goldstein, Neal D; Burstyn, Igor; Newbern, E Claire; Tabb, Loni P; Gutowski, Jennifer; Welles, Seth L
2016-06-01
Diagnosis of pertussis remains a challenge, and consequently research on the risk of disease might be biased because of misclassification. We quantified this misclassification and corrected for it in a case-control study of children in Philadelphia, Pennsylvania, who were 3 months to 6 years of age and diagnosed with pertussis between 2011 and 2013. Vaccine effectiveness (VE; calculated as (1 - odds ratio) × 100) was used to describe the average reduction in reported pertussis incidence resulting from persons being up to date on pertussis-antigen containing vaccines. Bayesian techniques were used to correct for purported nondifferential misclassification by reclassifying the cases per the 2014 Council of State and Territorial Epidemiologists pertussis case definition. Naïve VE was 50% (95% confidence interval: 16%, 69%). After correcting for misclassification, VE ranged from 57% (95% credible interval: 30, 73) to 82% (95% credible interval: 43, 95), depending on the amount of underreporting of pertussis that was assumed to have occurred in the study period. Meaningful misclassification was observed in terms of false negatives detected after the incorporation of infant apnea into the 2014 case definition. Although specificity was nearly perfect, sensitivity of the case definition varied from 90% to 20%, depending on the assumption about missed cases. Knowing the degree of the underreporting is essential to the accurate evaluation of VE. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved.
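The deterministic core of a misclassification correction can be sketched with the usual sensitivity/specificity algebra applied to case counts before computing VE = (1 - odds ratio) × 100. All counts and error rates below are hypothetical, not the Philadelphia data, and the full Bayesian treatment (priors on sensitivity and specificity) is omitted.

```python
def corrected_count(observed, n, sens, spec):
    """Back-correct a misclassified case count: the observed count is
    sens*true + (1 - spec)*(n - true); solve for true."""
    fp = 1 - spec
    return (observed - fp * n) / (sens - fp)

def vaccine_effectiveness(cases_v, controls_v, cases_u, controls_u):
    """VE = (1 - odds ratio) * 100, from a 2x2 table of cases and
    controls by vaccination status."""
    odds_ratio = (cases_v * controls_u) / (cases_u * controls_v)
    return (1 - odds_ratio) * 100

# Hypothetical study: naive VE straight from the observed counts...
naive_ve = vaccine_effectiveness(30, 470, 60, 440)

# ...and VE after correcting the case counts for an assumed
# sensitivity of 0.8 and specificity of 0.995 of the case definition.
n_v, n_u = 500, 500
true_v = corrected_count(30, n_v, 0.8, 0.995)
true_u = corrected_count(60, n_u, 0.8, 0.995)
corrected_ve = vaccine_effectiveness(true_v, n_v - true_v, true_u, n_u - true_u)
```

With perfect classification the correction is the identity; with imperfect sensitivity the corrected VE moves away from the attenuated naive value.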
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Ray-Bing; Wang, Weichung; Wu, C. F. Jeff
2017-04-12
A numerical method, called OBSM, was recently proposed which employs overcomplete basis functions to achieve sparse representations. While the method can handle non-stationary response without the need of inverting large covariance matrices, it lacks the capability to quantify uncertainty in predictions. We address this issue by proposing a Bayesian approach which first imposes a normal prior on the large space of linear coefficients, then applies the MCMC algorithm to generate posterior samples for predictions. From these samples, Bayesian credible intervals can then be obtained to assess prediction uncertainty. A key application for the proposed method is the efficient construction of sequential designs. Several sequential design procedures with different infill criteria are proposed based on the generated posterior samples. Numerical studies show that the proposed schemes are capable of solving problems of positive point identification, optimization, and surrogate fitting.
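The core idea (a normal prior on linear coefficients, a posterior over them, then credible or predictive intervals for new responses) can be sketched in one dimension with a conjugate posterior in place of MCMC. The basis, data, and prior scale below are placeholders, not the OBSM setup.

```python
import math
import random

random.seed(4)

def bayes_linear_1d(xs, ys, sigma2=0.25, tau2=100.0):
    """Conjugate posterior for y = w*x + N(0, sigma2) noise under a
    N(0, tau2) prior on the single coefficient w."""
    precision = sum(x * x for x in xs) / sigma2 + 1.0 / tau2
    mean = (sum(x * y for x, y in zip(xs, ys)) / sigma2) / precision
    return mean, 1.0 / precision

def predictive_interval(x_new, w_mean, w_var, sigma2=0.25, z=1.96):
    """95% predictive interval for a new response at x_new, combining
    coefficient uncertainty with observation noise."""
    mu = w_mean * x_new
    sd = math.sqrt(w_var * x_new ** 2 + sigma2)
    return mu - z * sd, mu + z * sd

# Simulated data with true coefficient 1.5 and noise sd 0.5.
xs = [random.uniform(-1, 1) for _ in range(200)]
ys = [1.5 * x + random.gauss(0, 0.5) for x in xs]
w_mean, w_var = bayes_linear_1d(xs, ys)
lo, hi = predictive_interval(1.0, w_mean, w_var)
```

With many basis functions the same update holds in matrix form, and drawing posterior samples of the coefficients (as in the paper) yields the same intervals by percentiles.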
A Comparison of Metamodeling Techniques via Numerical Experiments
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2016-01-01
This paper presents a comparative analysis of a few metamodeling techniques using numerical experiments for the single-input, single-output case. These experiments enable comparing the models' predictions with the phenomenon they aim to describe as more data are made available. The techniques include (i) prediction intervals associated with a least squares parameter estimate, (ii) Bayesian credible intervals, (iii) Gaussian process models, and (iv) interval predictor models. Aspects being compared are computational complexity, accuracy (i.e., the degree to which the resulting prediction conforms to the actual data-generating mechanism), reliability (i.e., the probability that new observations will fall inside the predicted interval), sensitivity to outliers, extrapolation properties, ease of use, and asymptotic behavior. The numerical experiments describe typical application scenarios that challenge the underlying assumptions supporting most metamodeling techniques.
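Technique (i) is easy to state concretely: fit by ordinary least squares and widen the fitted line by the standard error for a new observation. A sketch for the single-input, single-output case, with a large-sample normal quantile standing in for the exact t quantile (data are simulated placeholders):

```python
import math
import random

random.seed(11)

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x, returning everything
    the prediction-interval formula needs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    a = my - b * mx
    s2 = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys)) / (n - 2)
    return a, b, s2, mx, sxx, n

def prediction_interval(x_new, fit, z=1.96):
    """Approximate 95% interval for a *new observation* at x_new
    (z replaces the exact t quantile, fine for large n)."""
    a, b, s2, mx, sxx, n = fit
    pred = a + b * x_new
    se = math.sqrt(s2 * (1 + 1.0 / n + (x_new - mx) ** 2 / sxx))
    return pred - z * se, pred + z * se

xs = [random.uniform(0, 1) for _ in range(200)]
ys = [1.0 + 2.0 * x + random.gauss(0, 0.5) for x in xs]
fit = fit_line(xs, ys)
lo, hi = prediction_interval(0.5, fit)
```

The reliability criterion in the paper is exactly the question of whether about 95% of new observations fall inside such intervals.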
Statistical inferences with jointly type-II censored samples from two Pareto distributions
NASA Astrophysics Data System (ADS)
Abu-Zinadah, Hanaa H.
2017-08-01
In several industries, products come from more than one production line, and comparative life tests are required. Such tests involve sampling from the different production lines, which gives rise to a joint censoring scheme. In this article we consider the lifetime Pareto distribution under a joint type-II censoring scheme. The maximum likelihood estimators (MLE) and the corresponding approximate confidence intervals, as well as bootstrap confidence intervals of the model parameters, are obtained. Bayesian point estimates and credible intervals of the model parameters are also presented. A lifetime data set is analyzed for illustrative purposes, and Monte Carlo simulation results are presented to assess the performance of the proposed methods.
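For the uncensored Pareto case with known scale x_m, the shape MLE has the closed form α̂ = n / Σ log(x_i/x_m), and a percentile bootstrap interval follows directly. This is a generic sketch of the MLE-plus-bootstrap machinery, not the paper's joint type-II censoring scheme; all parameter values are invented.

```python
import math
import random

random.seed(5)

def pareto_sample(alpha, xm, n):
    # Inverse-CDF sampling: X = xm * U**(-1/alpha) for U ~ Uniform(0, 1).
    return [xm * random.random() ** (-1.0 / alpha) for _ in range(n)]

def alpha_mle(data, xm):
    """Closed-form MLE of the Pareto shape when the scale xm is known."""
    return len(data) / sum(math.log(x / xm) for x in data)

def bootstrap_ci(data, xm, n_boot=2000, level=0.95):
    """Percentile bootstrap confidence interval for the shape MLE."""
    stats = sorted(alpha_mle([random.choice(data) for _ in data], xm)
                   for _ in range(n_boot))
    return (stats[int((1 - level) / 2 * n_boot)],
            stats[int((1 + level) / 2 * n_boot)])

data = pareto_sample(2.5, 1.0, 300)
a_hat = alpha_mle(data, 1.0)
lo, hi = bootstrap_ci(data, 1.0)
```

Under censoring the likelihood changes (censored units contribute survival terms), but the estimate-then-resample pattern is the same.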
Saha, Dibakar; Alluri, Priyanka; Gan, Albert; Wu, Wanyang
2018-02-21
The objective of this study was to investigate the relationship between bicycle crash frequency and contributing factors at the census block group level in Florida, USA. Crashes aggregated over census block groups tend to be clustered (i.e., spatially dependent) rather than randomly distributed. To account for the effect of spatial dependence across the census block groups, the class of conditional autoregressive (CAR) models was employed within the hierarchical Bayesian framework. Based on four years (2011-2014) of crash data, total and fatal-and-severe injury bicycle crash frequencies were modeled as a function of a large number of variables representing demographic and socio-economic characteristics, roadway infrastructure and traffic characteristics, and bicycle activity characteristics. This study explored and compared the performance of two CAR models, the Besag model and the Leroux model, in crash prediction. The Besag models, which differ from the Leroux models in how spatial autocorrelation is specified, were found to fit the data better. A 95% Bayesian credible interval was used to identify the variables that had a credible impact on bicycle crashes. A total of 21 variables were found to be credible in the total crash model, while 18 variables were found to be credible in the fatal-and-severe injury crash model. Population, daily vehicle miles traveled, age cohorts, household automobile ownership, density of urban roads by functional class, bicycle trip miles, and bicycle trip intensity had positive effects in both the total and fatal-and-severe crash models. Educational attainment variables, truck percentage, and density of rural roads by functional class were found to be negatively associated with both total and fatal-and-severe bicycle crash frequencies. Published by Elsevier Ltd.
Bayesian stock assessment of Pacific herring in Prince William Sound, Alaska.
Muradian, Melissa L; Branch, Trevor A; Moffitt, Steven D; Hulson, Peter-John F
2017-01-01
The Pacific herring (Clupea pallasii) population in Prince William Sound, Alaska crashed in 1993 and has yet to recover, affecting food web dynamics in the Sound and impacting Alaskan communities. To help researchers design and implement the most effective monitoring, management, and recovery programs, a Bayesian assessment of Prince William Sound herring was developed by reformulating the current model used by the Alaska Department of Fish and Game. The Bayesian model estimated pre-fishery spawning biomass of herring age-3 and older in 2013 to be a median of 19,410 mt (95% credibility interval 12,150-31,740 mt), with a 54% probability that biomass in 2013 was below the management limit used to regulate fisheries in Prince William Sound. The main advantages of the Bayesian model are that it can more objectively weight different datasets and provide estimates of uncertainty for model parameters and outputs, unlike the weighted sum-of-squares used in the original model. In addition, the revised model could be used to manage herring stocks with a decision rule that considers both stock status and the uncertainty in stock status.
NASA Astrophysics Data System (ADS)
Xu, T.; Valocchi, A. J.; Ye, M.; Liang, F.
2016-12-01
Due to simplification and/or misrepresentation of the real aquifer system, numerical groundwater flow and solute transport models are usually subject to model structural error. During model calibration, the hydrogeological parameters may be overly adjusted to compensate for unknown structural error. This may result in biased predictions when models are used to forecast aquifer response to new forcing. In this study, we extend a fully Bayesian method [Xu and Valocchi, 2015] to calibrate a real-world, regional groundwater flow model. The method uses a data-driven error model to describe model structural error and jointly infers model parameters and structural error. In this study, Bayesian inference is facilitated using high-performance computing and fast surrogate models. The surrogate models are constructed using machine learning techniques to emulate the response simulated by the computationally expensive groundwater model. We demonstrate in the real-world case study that explicitly accounting for model structural error yields parameter posterior distributions that are substantially different from those derived by classical Bayesian calibration, which does not account for model structural error. In addition, the Bayesian method with the error model gives significantly more accurate predictions along with reasonable credible intervals.
Rediscovery of Good-Turing estimators via Bayesian nonparametrics.
Favaro, Stefano; Nipoti, Bernardo; Teh, Yee Whye
2016-03-01
The problem of estimating discovery probabilities originated in the context of statistical ecology, and in recent years it has become popular due to its frequent appearance in challenging applications arising in genetics, bioinformatics, linguistics, designs of experiments, machine learning, etc. A full range of statistical approaches, parametric and nonparametric as well as frequentist and Bayesian, has been proposed for estimating discovery probabilities. In this article, we investigate the relationships between the celebrated Good-Turing approach, which is a frequentist nonparametric approach developed in the 1940s, and a Bayesian nonparametric approach recently introduced in the literature. Specifically, under the assumption of a two parameter Poisson-Dirichlet prior, we show that Bayesian nonparametric estimators of discovery probabilities are asymptotically equivalent, for a large sample size, to suitably smoothed Good-Turing estimators. As a by-product of this result, we introduce and investigate a methodology for deriving exact and asymptotic credible intervals to be associated with the Bayesian nonparametric estimators of discovery probabilities. The proposed methodology is illustrated through a comprehensive simulation study and the analysis of Expressed Sequence Tags data generated by sequencing a benchmark complementary DNA library. © 2015, The International Biometric Society.
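The smoothed estimators discussed in this abstract reduce, in the simplest case, to the classical Good-Turing estimate of the discovery probability: the chance that the next draw is a previously unseen species is m1/n, where m1 is the number of species observed exactly once and n is the sample size. A minimal sketch of that frequentist baseline (not the authors' Bayesian nonparametric estimator):

```python
from collections import Counter

def good_turing_new_species(sample):
    """Good-Turing estimate of the discovery probability: m1 / n, where m1
    is the number of species seen exactly once and n is the sample size."""
    counts = Counter(sample)
    n = len(sample)
    m1 = sum(1 for c in counts.values() if c == 1)
    return m1 / n

# 10 draws: 'a' x4, 'b' x3, and 'c', 'd', 'e' once each -> m1 = 3, n = 10
sample = list("aaaabbbcde")
print(good_turing_new_species(sample))  # -> 0.3
```

The paper's result is that, under a two-parameter Poisson-Dirichlet prior, the Bayesian nonparametric estimators converge for large samples to smoothed versions of this quantity.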
A solution to the static frame validation challenge problem using Bayesian model selection
Grigoriu, M. D.; Field, R. V.
2007-12-23
Within this paper, we provide a solution to the static frame validation challenge problem (see this issue) in a manner that is consistent with the guidelines provided by the Validation Challenge Workshop tasking document. The static frame problem is constructed such that variability in material properties is known to be the only source of uncertainty in the system description, but there is ignorance on the type of model that best describes this variability. Hence both types of uncertainty, aleatoric and epistemic, are present and must be addressed. Our approach is to consider a collection of competing probabilistic models for the material properties, and calibrate these models to the information provided; models of different levels of complexity and numerical efficiency are included in the analysis. A Bayesian formulation is used to select the optimal model from the collection, which is then used for the regulatory assessment. Lastly, Bayesian credible intervals are used to provide a measure of confidence in our regulatory assessment.
Saha, Dibakar; Alluri, Priyanka; Gan, Albert
2017-01-01
The Highway Safety Manual (HSM) presents statistical models to quantitatively estimate an agency's safety performance. The models were developed using data from only a few U.S. states. To account for the effects of local attributes and temporal factors on crash occurrence, agencies are required to calibrate the HSM-default models for crash predictions. The manual suggests updating calibration factors every two to three years, or preferably on an annual basis. Given that the calibration process involves substantial time, effort, and resources, a comprehensive analysis of the required calibration factor update frequency is valuable to agencies. Accordingly, the objective of this study is to evaluate the HSM's recommendation and determine the required frequency of calibration factor updates. A robust Bayesian estimation procedure is used to assess the variation between calibration factors computed annually, biennially, and triennially using data collected from over 2400 miles of segments and over 700 intersections on urban and suburban facilities in Florida. The Bayesian model yields a posterior distribution of the model parameters that gives credible information to infer whether the difference between calibration factors computed at specified intervals is credibly different from the null value, which represents unaltered calibration factors between the comparison years (i.e., zero difference). The concept of the null value is extended to include the range of values that are practically equivalent to zero. Bayesian inference shows that calibration factors based on total crash frequency need to be updated every two years in cases where the variations between calibration factors are not greater than 0.01. When the variations are between 0.01 and 0.05, calibration factors based on total crash frequency could be updated every three years. Copyright © 2016 Elsevier Ltd. All rights reserved.
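The "range of values that are practically equivalent to zero" described above is a region of practical equivalence (ROPE) check on the posterior of the calibration-factor difference. A minimal sketch, where the posterior draws (normal, mean 0.004, sd 0.003) and the ROPE half-width of 0.01 are illustrative assumptions, not the study's fitted values:

```python
import numpy as np

def within_rope_probability(posterior_draws, rope=(-0.01, 0.01)):
    """Share of posterior draws of a calibration-factor difference that fall
    inside a region of practical equivalence (ROPE) around zero."""
    draws = np.asarray(posterior_draws)
    return np.mean((draws >= rope[0]) & (draws <= rope[1]))

# Hypothetical posterior for the difference between two annual calibration factors
rng = np.random.default_rng(7)
diff_draws = rng.normal(0.004, 0.003, 50_000)
p_equiv = within_rope_probability(diff_draws)
```

A high value of `p_equiv` supports treating the two years' calibration factors as practically unchanged, which is the logic behind relaxing the update frequency.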
Shahian, David M; He, Xia; Jacobs, Jeffrey P; Kurlansky, Paul A; Badhwar, Vinay; Cleveland, Joseph C; Fazzalari, Frank L; Filardo, Giovanni; Normand, Sharon-Lise T; Furnary, Anthony P; Magee, Mitchell J; Rankin, J Scott; Welke, Karl F; Han, Jane; O'Brien, Sean M
2015-10-01
Previous composite performance measures of The Society of Thoracic Surgeons (STS) were estimated at the STS participant level, typically a hospital or group practice. The STS Quality Measurement Task Force has now developed a multiprocedural, multidimensional composite measure suitable for estimating the performance of individual surgeons. The development sample from the STS National Database included 621,489 isolated coronary artery bypass grafting procedures, isolated aortic valve replacement, aortic valve replacement plus coronary artery bypass grafting, mitral, or mitral plus coronary artery bypass grafting procedures performed by 2,286 surgeons between July 1, 2011, and June 30, 2014. Each surgeon's composite score combined their aggregate risk-adjusted mortality and major morbidity rates (each weighted inversely by their standard deviations) and reflected the proportion of case types they performed. Model parameters were estimated in a Bayesian framework. Composite star ratings were examined using 90%, 95%, or 98% Bayesian credible intervals. Measure reliability was estimated using various 3-year case thresholds. The final composite measure was defined as 0.81 × (1 minus risk-standardized mortality rate) + 0.19 × (1 minus risk-standardized complication rate). Risk-adjusted mortality (median, 2.3%; interquartile range, 1.7% to 3.0%), morbidity (median, 13.7%; interquartile range, 10.8% to 17.1%), and composite scores (median, 95.4%; interquartile range, 94.4% to 96.3%) varied substantially across surgeons. Using 98% Bayesian credible intervals, there were 207 1-star (lower performance) surgeons (9.1%), 1,701 2-star (as-expected performance) surgeons (74.4%), and 378 3-star (higher performance) surgeons (16.5%). With an eligibility threshold of 100 cases over 3 years, measure reliability was 0.81. The STS has developed a multiprocedural composite measure suitable for evaluating performance at the individual surgeon level. 
Copyright © 2015 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
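The composite formula quoted in this abstract can be applied directly. Using the median risk-adjusted rates reported above, the score lands near the reported median composite of 95.4%; the small gap is expected, since medians of components do not combine exactly into the median composite:

```python
def sts_composite(mortality, morbidity):
    """Composite score with the weights quoted in the abstract:
    0.81 * (1 - risk-standardized mortality rate)
      + 0.19 * (1 - risk-standardized complication rate)."""
    return 0.81 * (1.0 - mortality) + 0.19 * (1.0 - morbidity)

# Median risk-adjusted rates reported above: 2.3% mortality, 13.7% morbidity
score = sts_composite(0.023, 0.137)
print(round(score, 4))  # -> 0.9553
```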
Cheng, Ji; Iorio, Alfonso; Marcucci, Maura; Romanov, Vadim; Pullenayegum, Eleanor M; Marshall, John K; Thabane, Lehana
2016-01-01
Background Developing inhibitors is a rare event during the treatment of hemophilia A. The many facets of, and uncertainty surrounding, inhibitor development further complicate the process of estimating the inhibitor rate from limited data. Bayesian statistical modeling provides a useful tool for generating, enhancing, and exploring the evidence by incorporating all the available information. Methods We built our Bayesian analysis using three study cases to estimate the inhibitor rates of patients with hemophilia A in three different scenarios: Case 1, a single cohort of previously treated patients (PTPs) or previously untreated patients; Case 2, a meta-analysis of PTP cohorts; and Case 3, a previously unexplored patient population – patients with baseline low-titer inhibitor or a history of inhibitor development. The data used in this study were extracted from three published ADVATE (antihemophilic factor [recombinant], a product of Baxter for treating hemophilia A) post-authorization surveillance studies. Noninformative and informative priors were applied to Bayesian standard (Case 1) or random-effects (Case 2 and Case 3) logistic models. Bayesian probabilities of satisfying three meaningful thresholds of the risk of developing a clinically significant inhibitor (10/100, 5/100 [high rates], and 1/86 [the Food and Drug Administration mandated cutoff rate in PTPs]) were calculated. The effect of discounting prior information or scaling up the study data was evaluated. Results Results based on noninformative priors were similar to the classical approach. Using priors from PTPs lowered the point estimate and narrowed the 95% credible intervals (Case 1: from 1.3 [0.5, 2.7] to 0.8 [0.5, 1.1]; Case 2: from 1.9 [0.6, 6.0] to 0.8 [0.5, 1.1]; Case 3: from 2.3 [0.5, 6.8] to 0.7 [0.5, 1.1]). All probabilities of satisfying a threshold of 1/86 were above 0.65.
Increasing the number of patients by two and ten times substantially narrowed the credible intervals for the single cohort study (1.4 [0.7, 2.3] and 1.4 [1.1, 1.8], respectively). Increasing the number of studies by two and ten times for the multiple study scenarios (Case 2: 1.9 [0.6, 4.0] and 1.9 [1.5, 2.6]; Case 3: 2.4 [0.9, 5.0] and 2.6 [1.9, 3.5], respectively) had a similar effect. Conclusion The Bayesian approach, as a robust, transparent, and reproducible analytic method, can be efficiently used to estimate the inhibitor rate of hemophilia A in complex clinical settings. PMID:27822129
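The abstract's models are random-effects logistic regressions; as a simpler stand-in for the core idea (an informative prior tightening the credible interval of a rare-event rate), here is a conjugate beta-binomial sketch. The cohort counts (2 inhibitors among 150 patients) and the Beta(2, 198) informative prior are invented for illustration:

```python
import numpy as np

def beta_binomial_ci(events, n, a=1.0, b=1.0, level=0.95, draws=200_000, seed=1):
    """Posterior for a binomial event rate under a conjugate Beta(a, b) prior:
    Beta(a + events, b + n - events). Returns the posterior mean and a
    Monte Carlo equal-tailed credible interval."""
    rng = np.random.default_rng(seed)
    post = rng.beta(a + events, b + n - events, size=draws)
    lo, hi = np.quantile(post, [(1 - level) / 2, 1 - (1 - level) / 2])
    return post.mean(), lo, hi

# Hypothetical cohort: 2 inhibitors among 150 previously treated patients,
# first with a flat Beta(1, 1) prior, then a made-up informative prior.
flat = beta_binomial_ci(2, 150)
informative = beta_binomial_ci(2, 150, a=2.0, b=198.0)
```

As in the abstract, the informative prior both lowers the point estimate and narrows the interval relative to the noninformative analysis.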
Lee, R S C; Hermens, D F; Scott, J; O'Dea, B; Glozier, N; Scott, E M; Hickie, I B
2017-09-01
Optimizing functional recovery in young individuals with severe mental illness constitutes a major healthcare priority. The current study sought to quantify the cognitive and clinical factors underpinning academic and vocational engagement in a transdiagnostic and prospective youth mental health cohort. The primary outcome measure was 'not in education, employment or training' ('NEET') status. A clinical sample of psychiatric out-patients aged 15-25 years (n = 163) was assessed at two time points, on average, 24 months apart. Functional status, and clinical and neuropsychological data were collected. Bayesian structural equation modelling was used to confirm the factor structure of predictors and cross-lagged effects at follow-up. Individually, NEET status, cognitive dysfunction and negative symptoms at baseline were predictive of NEET status at follow-up (p < 0.05). Baseline cognitive functioning was the only predictor of follow-up NEET status in the multivariate Bayesian model, while controlling for baseline NEET status. For every 1 s.d. deficit in cognition, the probability of being disengaged at follow-up increased by 40% (95% credible interval 19-58%). Baseline NEET status predicted more severe negative symptoms at follow-up (β = 0.24, 95% credible interval 0.04-0.43). Disengagement with education, employment or training (i.e. being NEET) was reported in about one in four members of this cohort. The initial level of cognitive functioning was the strongest determinant of future NEET status, whereas being academically or vocationally engaged had an impact on future negative symptomatology. If replicated, these findings support the need to develop early interventions that target cognitive phenotypes transdiagnostically.
Converse, Sarah J.; Chandler, J. N.; Olsen, Glenn H.; Shafer, C. C.; Hartup, Barry K.; Urbanek, Richard P.
2010-01-01
In captive-rearing programs, small sample sizes can limit the quality of information on performance of propagation methods. Bayesian updating can be used to increase information on method performance over time. We demonstrate an application to incubator testing at USGS Patuxent Wildlife Research Center. A new type of incubator was purchased for use in the whooping crane (Grus americana) propagation program, which produces birds for release. We tested the new incubator for reliability, using sandhill crane (Grus canadensis) eggs as surrogates. We determined that the new incubator should result in hatching rates no more than 5% lower than the available incubators, with 95% confidence, before it would be used to incubate whooping crane eggs. In 2007, 5 healthy chicks hatched from 12 eggs in the new incubator, and 2 hatched from 5 in an available incubator, for a median posterior difference of <1%, but with a large 95% credible interval (-41%, 43%). In 2008, we implemented a double-blind evaluation method, where a veterinarian determined whether eggs produced chicks that, at hatching, had no apparent health problems that would impede future release. We used the 2007 estimates as priors in the 2008 analysis. In 2008, 7 normal chicks hatched from 15 eggs in the new incubator, and 11 hatched from 15 in an available incubator, for a median posterior difference of 19%, with 95% credible interval (-8%, 44%). The increased sample size has increased our understanding of incubator performance. While additional data will be collected, at this time the new incubator does not appear adequate for use with whooping crane eggs.
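The sequential use of one year's posterior as the next year's prior described above is conjugate beta-binomial updating. A minimal sketch using the hatch counts reported in the abstract; the flat Beta(1, 1) starting priors and the Monte Carlo interval are assumptions of this sketch, not the authors' exact specification:

```python
import numpy as np

def update(a, b, hatched, total):
    """Conjugate beta-binomial update: Beta(a, b) prior plus binomial data."""
    return a + hatched, b + (total - hatched)

# 2007: flat Beta(1, 1) priors, then 5/12 (new incubator) and 2/5 (available)
new_a, new_b = update(1, 1, 5, 12)
old_a, old_b = update(1, 1, 2, 5)

# 2008: carry the 2007 posteriors forward as priors, then add 7/15 and 11/15
new_a, new_b = update(new_a, new_b, 7, 15)
old_a, old_b = update(old_a, old_b, 11, 15)

# Posterior of the hatch-rate difference (available minus new incubator)
rng = np.random.default_rng(0)
diff = rng.beta(old_a, old_b, 100_000) - rng.beta(new_a, new_b, 100_000)
lo, hi = np.quantile(diff, [0.025, 0.975])
```

The resulting posterior median difference is close to the 19% reported in the abstract, with a wide interval that still spans zero, matching the paper's conclusion that more data are needed.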
CDMBE: A Case Description Model Based on Evidence
Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing
2015-01-01
By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), suitable for the continental law system, is proposed to describe criminal cases. The model's logic adopts credibility-based logical reasoning and derives evidence-based reasoning quantitatively from the evidence. To be consistent with practical inference rules, five types of relationship and a set of rules are defined to calculate the credibility of assumptions based on the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a diagram, and the results calculated from CDMBE are in line with those from a Bayesian model. PMID:26421006
Improving Photometric Redshifts for Hyper Suprime-Cam
NASA Astrophysics Data System (ADS)
Speagle, Josh S.; Leauthaud, Alexie; Eisenstein, Daniel; Bundy, Kevin; Capak, Peter L.; Leistedt, Boris; Masters, Daniel C.; Mortlock, Daniel; Peiris, Hiranya; HSC Photo-z Team; HSC Weak Lensing Team
2017-01-01
Deriving accurate photometric redshift (photo-z) probability distribution functions (PDFs) is a crucial science component for current and upcoming large-scale surveys. We outline how rigorous Bayesian inference and machine learning can be combined to quickly derive joint photo-z PDFs for individual galaxies and their parent populations. Using the first 170 deg^2 of data from the ongoing Hyper Suprime-Cam survey, we demonstrate that our method is able to generate accurate predictions and reliable credible intervals over ~370k high-quality redshifts. We then use galaxy-galaxy lensing to empirically validate our predicted photo-z's over ~14M objects, finding a robust signal.
Thorlund, Kristian; Thabane, Lehana; Mills, Edward J
2013-01-11
Multiple treatment comparison (MTC) meta-analyses are commonly modeled in a Bayesian framework, and weakly informative priors are typically preferred to mirror familiar data-driven frequentist approaches. Random-effects MTCs have commonly modeled heterogeneity under the assumption that the between-trial variances for all involved treatment comparisons are equal (i.e., the 'common variance' assumption). This approach 'borrows strength' for heterogeneity estimation across treatment comparisons and thus adds valuable precision when data are sparse. The homogeneous variance assumption, however, is unrealistic and can severely bias variance estimates. Consequently, 95% credible intervals may not retain nominal coverage, and treatment rank probabilities may become distorted. Relaxing the homogeneous variance assumption may be equally problematic due to reduced precision. To regain good precision, moderately informative variance priors or additional mathematical assumptions may be necessary. In this paper we describe four novel approaches to modeling heterogeneity variance - two novel model structures, and two approaches for the use of moderately informative variance priors. We examine the relative performance of all approaches in two illustrative MTC data sets. In particular, we compare between-study heterogeneity estimates and model fits, treatment effect estimates and 95% credible intervals, and treatment rank probabilities. In both data sets, use of moderately informative variance priors constructed from the pairwise meta-analysis data yielded the best model fit and narrower credible intervals. Imposing consistency equations on variance estimates, assuming variances to be exchangeable, or using empirically informed variance priors also yielded good model fits and narrow credible intervals. The homogeneous variance model yielded high precision at all times, but overall inadequate estimates of between-trial variances.
Lastly, treatment rankings were similar among the novel approaches, but considerably different when compared with the homogeneous variance approach. MTC models using a homogeneous variance structure appear to perform sub-optimally when between-trial variances vary between comparisons. Using informative variance priors, assuming exchangeability, or imposing consistency between heterogeneity variances can all ensure sufficiently reliable and realistic heterogeneity estimation, and thus more reliable MTC inferences. All four approaches should be viable candidates for replacing or supplementing the conventional homogeneous variance MTC model, which is currently the most widely used in practice.
Catelan, Dolores; Buzzoni, Carlotta; Coviello, Enzo; Crocetti, Emanuele; Pasetto, Roberto; Pirastu, Roberta; Biggeri, Annibale
2014-01-01
Epidemiological surveillance of high-risk environmental areas or areas covered by cancer registration yields long inventories of relative risks. Summaries of the results tables must be produced to identify priorities and tailor public health actions. The aim is, therefore, to draw conclusions from each area's disease profile, or from the area signature of each disease. With this in mind, we used data on cancer incidence from 17 Cancer Registries that participated in the ISS-AIRTUM (National Institute of Health-Italian Network of Cancer Registries) study, and we produced conditional and marginal rankings of areas/diseases using a multivariate hierarchical Bayesian model. In this context, it is important to obtain an uncertainty evaluation by calculating the credibility intervals of the ranks. The marginal ranking of areas shows a large overlapping of credibility intervals, such that it is not possible to speak of a limited number of ISS-AIRTUM areas as being particularly affected. Every ISS-AIRTUM area, therefore, must be considered individually, and ordering them by ranking of cancer incidence would not be appropriate. Instead, the marginal ranking of diseases highlights the impact of asbestos exposure in all the analyzed areas.
A history of chagas disease transmission, control, and re-emergence in peri-rural La Joya, Peru.
Delgado, Stephen; Castillo Neyra, Ricardo; Quispe Machaca, Víctor R; Ancca Juárez, Jenny; Chou Chu, Lily; Verastegui, Manuela Renee; Moscoso Apaza, Giovanna M; Bocángel, César D; Tustin, Aaron W; Sterling, Charles R; Comrie, Andrew C; Náquira, César; Cornejo del Carpio, Juan G; Gilman, Robert H; Bern, Caryn; Levy, Michael Z
2011-02-22
The history of Chagas disease control in Peru and many other nations is marked by scattered and poorly documented vector control campaigns. The complexities of human migration and sporadic control campaigns complicate evaluation of the burden of Chagas disease and dynamics of Trypanosoma cruzi transmission. We conducted a cross-sectional serological and entomological study to evaluate temporal and spatial patterns of T. cruzi transmission in a peri-rural region of La Joya, Peru. We use a multivariate catalytic model and Bayesian methods to estimate incidence of infection over time and thereby elucidate the complex history of transmission in the area. Of 1,333 study participants, 101 (7.6%; 95% CI: 6.2-9.0%) were confirmed T. cruzi seropositive. Spatial clustering of parasitic infection was found in vector insects, but not in human cases. Expanded catalytic models suggest that transmission was interrupted in the study area in 1996 (95% credible interval: 1991-2000), with a resultant decline in the average annual incidence of infection from 0.9% (95% credible interval: 0.6-1.3%) to 0.1% (95% credible interval: 0.005-0.3%). Through a search of archival newspaper reports, we uncovered documentation of a 1995 vector control campaign, and thereby independently validated the model estimates. High levels of T. cruzi transmission had been ongoing in peri-rural La Joya prior to interruption of parasite transmission through a little-documented vector control campaign in 1995. Despite the efficacy of the 1995 control campaign, T. cruzi was rapidly reemerging in vector populations in La Joya, emphasizing the need for continuing surveillance and control at the rural-urban interface.
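The simplest (irreversible) catalytic model underlying the analysis above says that, with a constant force of infection while transmission is ongoing and zero after interruption, seroprevalence depends only on a person's years of exposure. A minimal sketch plugging in the abstract's estimates (interruption in 1996, ~0.9% annual incidence beforehand); the 2008 survey year is an assumption of this sketch:

```python
import math

def seroprevalence(age, survey_year, stop_year, lam):
    """Irreversible catalytic model: constant force of infection `lam` per
    year while transmission is ongoing, zero after `stop_year`.
    P(seropositive) = 1 - exp(-lam * years_of_exposure)."""
    birth_year = survey_year - age
    exposure = max(0.0, min(survey_year, stop_year) - birth_year)
    return 1.0 - math.exp(-lam * exposure)

p_young = seroprevalence(10, 2008, 1996, 0.009)  # born after the interruption
p_older = seroprevalence(30, 2008, 1996, 0.009)  # 18 years of pre-1996 exposure
```

This age-dependence is what lets the expanded catalytic models in the study infer the timing of the (initially undocumented) 1995 control campaign from a single cross-sectional serosurvey.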
Hao, Yongping; Balluz, Lina; Strosnider, Heather; Wen, Xiao Jun; Li, Chaoyang; Qualters, Judith R
2015-08-01
Short-term effects of air pollution exposure on respiratory disease mortality are well established. However, few studies have examined the effects of long-term exposure, and among those that have, results are inconsistent. To evaluate long-term association between ambient ozone, fine particulate matter (PM2.5, particles with an aerodynamic diameter of 2.5 μm or less), and chronic lower respiratory disease (CLRD) mortality in the contiguous United States. We fit Bayesian hierarchical spatial Poisson models, adjusting for five county-level covariates (percentage of adults aged ≥65 years, poverty, lifetime smoking, obesity, and temperature), with random effects at state and county levels to account for spatial heterogeneity and spatial dependence. We derived county-level average daily concentration levels for ambient ozone and PM2.5 for 2001-2008 from the U.S. Environmental Protection Agency's down-scaled estimates and obtained 2007-2008 CLRD deaths from the National Center for Health Statistics. Exposure to ambient ozone was associated with an increased rate of CLRD deaths, with a rate ratio of 1.05 (95% credible interval, 1.01-1.09) per 5-ppb increase in ozone; the association between ambient PM2.5 and CLRD mortality was positive but statistically insignificant (rate ratio, 1.07; 95% credible interval, 0.99-1.14). This study links air pollution exposure data with CLRD mortality for all 3,109 contiguous U.S. counties. Ambient ozone may be associated with an increased rate of death from CLRD in the contiguous United States. Although we adjusted for selected county-level covariates and unobserved influences through Bayesian hierarchical spatial modeling, the possibility of ecologic bias remains.
Wang, Tianli; Baron, Kyle; Zhong, Wei; Brundage, Richard; Elmquist, William
2014-03-01
The current study presents a Bayesian approach to non-compartmental analysis (NCA), which provides accurate and precise estimates of AUC(0-∞) and any AUC(0-∞)-based NCA parameter or derivation. To assess the performance of the proposed method, 1,000 simulated datasets were generated in different scenarios. A Bayesian method was used to estimate the tissue and plasma AUC(0-∞) values and the tissue-to-plasma AUC(0-∞) ratio. The posterior medians and the coverage of 95% credible intervals for the true parameter values were examined. The method was applied to laboratory data from a mouse brain distribution study with a serial sacrifice design for illustration. The Bayesian NCA approach is accurate and precise in point estimation of AUC(0-∞) and the partition coefficient under a serial sacrifice design. It also provides a consistently good variance estimate, even considering the variability of the data and the physiological structure of the pharmacokinetic model. The application in the case study obtained a physiologically reasonable posterior distribution of AUC, with a posterior median close to the value estimated by classic Bailer-type methods. This Bayesian NCA approach for sparse data analysis provides statistical inference on the variability of AUC(0-∞)-based parameters such as the partition coefficient and drug targeting index, so that the comparison of these parameters following destructive sampling becomes statistically feasible.
NASA Astrophysics Data System (ADS)
Figueira, P.; Faria, J. P.; Adibekyan, V. Zh.; Oshagh, M.; Santos, N. C.
2016-11-01
We apply the Bayesian framework to assess the presence of a correlation between two quantities. To do so, we estimate the probability distribution of the parameter of interest, ρ, characterizing the strength of the correlation. We provide an implementation of these ideas and concepts using the Python programming language and the pyMC module in a very short (~130 lines of code, heavily commented) and user-friendly program. We used this tool to assess the presence and properties of the correlation between planetary surface gravity and stellar activity level as measured by the log(R'_HK) indicator. The results of the Bayesian analysis are qualitatively similar to those obtained via p-value analysis, and support the presence of a correlation in the data. The results are more robust in their derivation and more informative, revealing interesting features such as asymmetric posterior distributions or markedly different credible intervals, and allowing for a deeper exploration. We encourage readers interested in this kind of problem to apply our code to their own scientific problems. A full understanding of the Bayesian framework can only be gained through the insight that comes from handling priors, assessing the convergence of Monte Carlo runs, and a multitude of other practical problems. We hope to contribute so that Bayesian analysis becomes a tool in the toolkit of researchers, and that they understand by experience its advantages and limitations.
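The core computation can be sketched without pyMC at all: for standardized data, a grid approximation of the posterior of ρ under a bivariate-normal likelihood with a flat prior on (-1, 1). The data here are simulated, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated paired measurements with true correlation 0.5.
n = 40
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(scale=np.sqrt(1 - 0.25), size=n)

# Standardize, then evaluate the bivariate-normal log-likelihood of rho
# on a grid (flat prior, so posterior ∝ likelihood).
xs = (x - x.mean()) / x.std()
ys = (y - y.mean()) / y.std()
rho = np.linspace(-0.99, 0.99, 1999)
loglik = (-0.5 * n * np.log(1 - rho**2)
          - ((xs**2).sum() - 2 * rho * (xs * ys).sum()
             + (ys**2).sum()) / (2 * (1 - rho**2)))
post = np.exp(loglik - loglik.max())
post /= post.sum()

# Central 95% credible interval from the posterior CDF.
cdf = np.cumsum(post)
lo = rho[np.searchsorted(cdf, 0.025)]
hi = rho[np.searchsorted(cdf, 0.975)]
print(f"rho 95% credible interval: [{lo:.2f}, {hi:.2f}]")
```

The posterior is visibly asymmetric when the sample correlation is strong, which is exactly the kind of feature the abstract highlights over a single p-value.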
The Challenges of Credible Thermal Protection System Reliability Quantification
NASA Technical Reports Server (NTRS)
Green, Lawrence L.
2013-01-01
The paper discusses several of the challenges associated with developing a credible reliability estimate for a human-rated crew capsule thermal protection system. The process of developing such a credible estimate is subject to the quantification, modeling and propagation of numerous uncertainties within a probabilistic analysis. The development of specific investment recommendations, to improve the reliability prediction, among various potential testing and programmatic options is then accomplished through Bayesian analysis.
Lu, Y; Baggett, H C; Rhodes, J; Thamthitiwat, S; Joseph, L; Gregory, C J
2016-10-01
Pneumonia is a leading cause of mortality and morbidity worldwide, with radiographically confirmed pneumonia serving as a key disease-burden indicator. Radiographic confirmation is usually determined by a radiology panel, which is assumed to be the best available standard; however, this assumption may introduce bias into pneumonia incidence estimates. To improve estimates of radiographic pneumonia incidence, we applied Bayesian latent class modelling (BLCM) to a large database of hospitalized patients with acute lower respiratory tract illness in Sa Kaeo and Nakhon Phanom provinces, Thailand, from 2005 to 2010, with chest radiographs read by both a radiology panel and a clinician. We compared these estimates to those from conventional analysis. For children aged <5 years, estimated radiographically confirmed pneumonia incidence by BLCM was 2394/100 000 person-years (95% credible interval 2185-2574) vs. 1736/100 000 person-years (95% confidence interval 1706-1766) from conventional analysis. For persons aged ⩾5 years, estimated radiographically confirmed pneumonia incidence was similar between BLCM and conventional analysis (235 vs. 215/100 000 person-years). BLCM suggests the incidence of radiographically confirmed pneumonia in young children is substantially larger than estimated from the conventional approach using radiology panels as the reference standard.
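A stripped-down version of the adjustment idea, with reader sensitivity and specificity fixed rather than jointly estimated as in the paper's latent-class model (all counts and accuracy values here are invented):

```python
import numpy as np
from scipy.stats import binom

# Adjust an apparent prevalence for an imperfect reference reader.
# Se/Sp are fixed for simplicity; a full BLCM estimates them jointly.
se, sp = 0.80, 0.99                 # assumed reader accuracy (invented)
k, n = 250, 10_000                  # panel-positive radiographs out of n

# Apparent positive probability as a function of true prevalence:
# P(test+) = prev*Se + (1 - prev)*(1 - Sp).
prev = np.linspace(0.0, 0.2, 2001)
p_obs = prev * se + (1 - prev) * (1 - sp)
post = binom.pmf(k, n, p_obs)       # flat prior on prevalence
post /= post.sum()

cdf = np.cumsum(post)
lo = prev[np.searchsorted(cdf, 0.025)]
hi = prev[np.searchsorted(cdf, 0.975)]
mean = (prev * post).sum()
print(f"true prevalence: {mean:.4f} (95% CrI {lo:.4f}-{hi:.4f})")
```

Note how an apparent 2.5% positive rate maps to a smaller true prevalence here because false positives inflate the count; with a less sensitive reader the correction can also run the other way, which is the mechanism behind the paper's larger BLCM incidence estimate.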
Timsit, E; Dendukuri, N; Schiller, I; Buczinski, S
2016-12-01
Diagnosis of bovine respiratory disease (BRD) in beef cattle placed in feedlots is typically based on clinical illness (CI) detected by pen-checkers. Unfortunately, the accuracy of this diagnostic approach (namely, sensitivity [Se] and specificity [Sp]) remains poorly understood, in part due to the absence of a reference test for ante-mortem diagnosis of BRD. Our objective was to pool available estimates of CI's diagnostic accuracy for BRD diagnosis in feedlot beef cattle while adjusting for the inaccuracy in the reference test. The presence of lung lesions (LU) at slaughter was used as the reference test. A systematic review of the literature was conducted to identify research articles comparing CI detected by pen-checkers during the feeding period to LU at slaughter. A hierarchical Bayesian latent-class meta-analysis was used to model test accuracy. This approach accounted for imperfections of both tests as well as the within- and between-study variability in the accuracy of CI. It also predicted the Se(CI) and Sp(CI) for future studies. Conditional independence between CI and LU was assumed, as these two tests are not based on similar biological principles. Seven studies were included in the meta-analysis. Estimated pooled Se(CI) and Sp(CI) were 0.27 (95% Bayesian credible interval: 0.12-0.65) and 0.92 (0.72-0.98), respectively, whereas estimated pooled Se(LU) and Sp(LU) were 0.91 (0.82-0.99) and 0.67 (0.64-0.79). Predicted Se(CI) and Sp(CI) for future studies were 0.27 (0.01-0.96) and 0.92 (0.14-1.00), respectively. The wide credible intervals around the predicted Se(CI) and Sp(CI) estimates indicated considerable heterogeneity among studies, which suggests that the pooled Se(CI) and Sp(CI) are not generalizable to individual studies. In conclusion, CI appeared to have poor Se but high Sp for BRD diagnosis in feedlots. Furthermore, the considerable heterogeneity among studies highlights an urgent need to standardize BRD diagnosis in feedlots.
Copyright © 2016 Elsevier B.V. All rights reserved.
Yelland, L N; Gajewski, B J; Colombo, J; Gibson, R A; Makrides, M; Carlson, S E
2016-09-01
The DHA to Optimize Mother Infant Outcome (DOMInO) and Kansas DHA Outcomes Study (KUDOS) were randomized controlled trials that supplemented mothers with 800 and 600 mg DHA/day, respectively, or a placebo during pregnancy. DOMInO was conducted in Australia and KUDOS in the United States. Both trials found an unanticipated and statistically significant reduction in early preterm birth (ePTB; i.e., birth before 34 weeks gestation). However, in each trial, the number of ePTBs was small. We used a novel Bayesian approach to estimate statistically derived low, moderate or high risk for ePTB, and to test for differences between the DHA and placebo groups. In both trials, the model predicted DHA would significantly reduce the expected proportion of deliveries in the high risk group under the trial conditions of the parent studies. Among the next 300,000 births in Australia, we estimated that 1112 ePTB (95% credible interval 51-2189) could be avoided by providing DHA. In the USA, we estimated that 106,030 ePTB (95% credible interval 6400 to 175,700) could be avoided with DHA. Copyright © 2016 Elsevier Ltd. All rights reserved.
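The projection step can be sketched as follows, using invented beta posteriors for the ePTB risk in each arm rather than the trials' actual model:

```python
import numpy as np

rng = np.random.default_rng(3)
births = 300_000

# Invented beta posteriors for ePTB risk under placebo and DHA;
# a real analysis would take these draws from the fitted model.
risk_placebo = rng.beta(25, 2300, size=50_000)
risk_dha = rng.beta(12, 2300, size=50_000)

# Avoided cases per `births` deliveries, draw by draw, so the credible
# interval reflects uncertainty in both arms simultaneously.
avoided = (risk_placebo - risk_dha) * births
lo, med, hi = np.quantile(avoided, [0.025, 0.5, 0.975])
print(f"ePTB avoided per {births} births: {med:.0f} (95% CrI {lo:.0f}-{hi:.0f})")
```

Scaling a risk difference by a projected number of births is how interval estimates like "1112 (51-2189) avoided" arise from a posterior over risks.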
First oscillation analysis using neutrino and antineutrino data at T2K
NASA Astrophysics Data System (ADS)
Duffy, Kirsty
2017-09-01
We present details of the first T2K neutrino and antineutrino oscillation results, in which data collected using both a muon neutrino-enhanced neutrino beam and a muon antineutrino-enhanced neutrino beam are analysed, corresponding to 7.002×10^20 protons on target (POT) and 7.471×10^20 POT respectively. Both ν_μ/ν̄_μ disappearance and ν_e/ν̄_e appearance data are analysed using a Bayesian Markov chain Monte Carlo method, providing the first ever sensitivity to the CP-violating phase δCP from T2K data alone. The T2K data favour near-maximal mixing, with sin²θ₂₃ and Δm²₃₂ consistent with previous T2K measurements, a value of sin²θ₁₃ consistent with measurements by reactor experiments, and δCP close to -π/2. When fitting with T2K data alone, the 90% credible interval disfavours values of δCP in [0.38, 2.60] rad. When using a prior on sin²2θ₁₃ from reactor measurements, the 90% credible interval is δCP ∈ [-3.10, -0.17] rad, disfavouring the CP-conserving values 0 and ±π. The effect of the δCP prior on this result is also investigated and presented.
Uncertainty analysis of depth predictions from seismic reflection data using Bayesian statistics
NASA Astrophysics Data System (ADS)
Michelioudakis, Dimitrios G.; Hobbs, Richard W.; Caiado, Camila C. S.
2018-03-01
Estimating the depths of target horizons from seismic reflection data is an important task in exploration geophysics. To constrain these depths we need a reliable and accurate velocity model. Here, we build an optimum 2D seismic reflection data processing flow focused on pre-stack deghosting filters and velocity model building, and apply Bayesian methods, including Gaussian process emulation and Bayesian History Matching (BHM), to estimate the uncertainties of the depths of key horizons near the borehole DSDP-258 located in the Mentelle Basin, southwest of Australia, and compare the results with the drilled core from that well. Following this strategy, the tie between the modelled and observed depths from the DSDP-258 core was in accordance with the ±2σ posterior credibility intervals, and predictions for depths to key horizons were made for the two new drill sites adjacent to the existing borehole of the area. The probabilistic analysis allowed us to generate multiple realizations of pre-stack depth migrated images; these can be directly used to better constrain interpretation and identify potential risk at drill sites. The method will be applied to constrain the drilling targets for the upcoming International Ocean Discovery Program (IODP), leg 369.
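Bayesian History Matching rules out candidate models through an implausibility measure; a minimal sketch, with all travel times, variances, and the conventional 3-sigma cutoff chosen purely for illustration:

```python
import numpy as np

def implausibility(z_obs, em_mean, em_var, obs_var, disc_var):
    """|z - E[f(x)]| scaled by emulator, observation and discrepancy
    uncertainty; candidates with large values are implausible."""
    return np.abs(z_obs - em_mean) / np.sqrt(em_var + obs_var + disc_var)

z_obs = 2.45                              # observed two-way travel time (s)
em_mean = np.array([2.40, 2.70, 2.46])    # emulator means, 3 candidates
em_var = np.array([0.002, 0.001, 0.0005]) # emulator variances

I = implausibility(z_obs, em_mean, em_var, obs_var=0.001, disc_var=0.001)
accepted = I < 3.0                        # conventional 3-sigma cutoff
print(I.round(2), accepted)
```

Candidates surviving successive waves of this test form the "non-implausible" region from which depth realizations are drawn.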
A Bayesian framework to estimate diversification rates and their variation through time and space
2011-01-01
Background Patterns of species diversity are the result of speciation and extinction processes, and molecular phylogenetic data can provide valuable information to derive their variability through time and across clades. Bayesian Markov chain Monte Carlo methods offer a promising framework to incorporate phylogenetic uncertainty when estimating rates of diversification. Results We introduce a new approach to estimate diversification rates in a Bayesian framework over a distribution of trees under various constant and variable rate birth-death and pure-birth models, and test it on simulated phylogenies. Furthermore, speciation and extinction rates and their posterior credibility intervals can be estimated while accounting for non-random taxon sampling. The framework is particularly suitable for hypothesis testing using Bayes factors, as we demonstrate analyzing dated phylogenies of Chondrostoma (Cyprinidae) and Lupinus (Fabaceae). In addition, we develop a model that extends the rate estimation to a meta-analysis framework in which different data sets are combined in a single analysis to detect general temporal and spatial trends in diversification. Conclusions Our approach provides a flexible framework for the estimation of diversification parameters and hypothesis testing while simultaneously accounting for uncertainties in the divergence times and incomplete taxon sampling. PMID:22013891
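A much-simplified sketch of the rate-estimation idea: if speciation events are treated as a Poisson process over total lineage-time, a Gamma prior is conjugate and the posterior is available in closed form. The paper's birth-death models over distributions of trees are far richer; the counts here are invented:

```python
import numpy as np
from scipy.stats import gamma

# Invented summary of a dated phylogeny.
events = 24           # speciation events observed on the tree
lineage_time = 180.0  # summed branch lengths (Myr)

# Vague Gamma(shape, rate) prior; conjugate update for a Poisson rate:
# posterior is Gamma(a0 + events, rate = b0 + lineage_time).
a0, b0 = 1.0, 0.1
post = gamma(a=a0 + events, scale=1.0 / (b0 + lineage_time))

lo, hi = post.ppf([0.025, 0.975])
print(f"speciation rate: {post.mean():.3f}/Myr (95% CrI {lo:.3f}-{hi:.3f})")
```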
2013-01-01
Background Multiple treatment comparison (MTC) meta-analyses are commonly modeled in a Bayesian framework, and weakly informative priors are typically preferred to mirror familiar data-driven frequentist approaches. Random-effects MTCs have commonly modeled heterogeneity under the assumption that the between-trial variance for all involved treatment comparisons is equal (i.e., the ‘common variance’ assumption). This approach ‘borrows strength’ for heterogeneity estimation across treatment comparisons, and thus adds valuable precision when data are sparse. The homogeneous variance assumption, however, is unrealistic and can severely bias variance estimates. Consequently, 95% credible intervals may not retain nominal coverage, and treatment rank probabilities may become distorted. Relaxing the homogeneous variance assumption may be equally problematic due to reduced precision. To regain good precision, moderately informative variance priors or additional mathematical assumptions may be necessary. Methods In this paper we describe four novel approaches to modeling heterogeneity variance - two novel model structures, and two approaches for use of moderately informative variance priors. We examine the relative performance of all approaches in two illustrative MTC data sets. We particularly compare between-study heterogeneity estimates and model fits, treatment effect estimates and 95% credible intervals, and treatment rank probabilities. Results In both data sets, use of moderately informative variance priors constructed from the pairwise meta-analysis data yielded the best model fit and narrower credible intervals. Imposing consistency equations on variance estimates, assuming variances to be exchangeable, or using empirically informed variance priors also yielded good model fits and narrow credible intervals. The homogeneous variance model yielded high precision at all times, but overall inadequate estimates of between-trial variances.
Lastly, treatment rankings were similar among the novel approaches, but considerably different when compared with the homogeneous variance approach. Conclusions MTC models using a homogeneous variance structure appear to perform sub-optimally when between-trial variances vary between comparisons. Using informative variance priors, assuming exchangeability or imposing consistency between heterogeneity variances can all ensure sufficiently reliable and realistic heterogeneity estimation, and thus more reliable MTC inferences. All four approaches should be viable candidates for replacing or supplementing the conventional homogeneous variance MTC model, which is currently the most widely used in practice. PMID:23311298
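A minimal pairwise random-effects meta-analysis with a moderately informative half-normal prior on the between-trial SD τ, by grid approximation, illustrates the kind of prior the abstract advocates (effect estimates invented; a full MTC model is much richer):

```python
import numpy as np

# Invented trial-level log-odds ratios and their sampling variances.
y = np.array([-0.30, -0.10, -0.45, 0.05])
v = np.array([0.04, 0.02, 0.06, 0.03])

# Joint grid over the pooled effect mu and between-trial SD tau.
mu = np.linspace(-1.0, 1.0, 401)
tau = np.linspace(0.0, 1.0, 201)
M, T = np.meshgrid(mu, tau, indexing="ij")

# Marginal likelihood of each trial: Normal(mu, v_i + tau^2).
loglik = np.zeros_like(M)
for yi, vi in zip(y, v):
    var = vi + T**2
    loglik += -0.5 * np.log(var) - (yi - M) ** 2 / (2 * var)

logprior = -0.5 * (T / 0.25) ** 2          # half-normal(0.25) prior on tau
post = np.exp(loglik + logprior - (loglik + logprior).max())
post /= post.sum()

# Marginal posterior of mu and its central 95% credible interval.
mu_marg = post.sum(axis=1)
mu_mean = (mu * mu_marg).sum()
cdf = np.cumsum(mu_marg)
lo, hi = mu[np.searchsorted(cdf, 0.025)], mu[np.searchsorted(cdf, 0.975)]
print(f"pooled log-OR: {mu_mean:.2f} (95% CrI {lo:.2f}, {hi:.2f})")
```

Tightening or loosening the half-normal scale on τ is exactly the lever the paper studies: a narrower prior shrinks the heterogeneity estimate and narrows the interval for μ.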
Hu, Wenbiao; Clements, Archie; Williams, Gail; Tong, Shilu; Mengersen, Kerrie
2010-01-01
This study aims to examine the impact of socio-ecologic factors on the transmission of Ross River virus (RRV) infection and to identify areas prone to social and ecologic-driven epidemics in Queensland, Australia. We used a Bayesian spatiotemporal conditional autoregressive model to quantify the relationship between monthly variation of RRV incidence and socio-ecologic factors and to determine spatiotemporal patterns. Our results show that the average increase in monthly RRV incidence was 2.4% (95% credible interval (CrI): 0.1–4.5%) and 2.0% (95% CrI: 1.6–2.3%) for a 1°C increase in monthly average maximum temperature and a 10 mm increase in monthly average rainfall, respectively. A significant spatiotemporal variation and interactive effect between temperature and rainfall on RRV incidence were found. No association between Socio-economic Index for Areas (SEIFA) and RRV was observed. The transmission of RRV in Queensland, Australia appeared to be primarily driven by ecologic variables rather than social factors. PMID:20810846
Bayesian Estimation Supersedes the "t" Test
ERIC Educational Resources Information Center
Kruschke, John K.
2013-01-01
Bayesian estimation for 2 groups provides complete distributions of credible values for the effect size, group means and their difference, standard deviations and their difference, and the normality of the data. The method handles outliers. The decision rule can accept the null value (unlike traditional "t" tests) when certainty in the estimate is…
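A toy two-group comparison in the spirit of the method: posterior draws of each group mean under a normal model with vague priors (the marginal posterior of a mean is then Student-t), summarized by the credible interval of the difference. Kruschke's full model additionally estimates the SDs and a normality parameter to handle outliers; the data here are simulated:

```python
import numpy as np

rng = np.random.default_rng(4)

def posterior_mean_draws(x, size=50_000):
    """Draws from the marginal posterior of the mean under a normal model
    with vague priors: mean ~ m + (s/sqrt(n)) * t_{n-1}."""
    n, m, s = len(x), x.mean(), x.std(ddof=1)
    return m + s / np.sqrt(n) * rng.standard_t(df=n - 1, size=size)

# Simulated data for two groups with a true mean difference of 6.
g1 = rng.normal(101.0, 15.0, size=40)
g2 = rng.normal(95.0, 15.0, size=40)

diff = posterior_mean_draws(g1) - posterior_mean_draws(g2)
lo, hi = np.quantile(diff, [0.025, 0.975])
print(f"mean difference: {diff.mean():.1f} (95% CrI {lo:.1f}, {hi:.1f})")
```

Unlike a t-test p-value, the full distribution of `diff` can also be used to accept the null region, e.g. by checking how much posterior mass falls inside a pre-declared interval of practical equivalence.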
Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian E; Simon, Steven L
2016-02-10
Most conventional risk analysis methods rely on a single best estimate of exposure per person, which does not allow for adjustment for exposure-related uncertainty. Here, we propose a Bayesian model averaging method to properly quantify the relationship between radiation dose and disease outcomes by accounting for shared and unshared uncertainty in estimated dose. Our Bayesian risk analysis method utilizes multiple realizations of sets (vectors) of doses generated by a two-dimensional Monte Carlo simulation method that properly separates shared and unshared errors in dose estimation. The exposure model used in this work is taken from a study of the risk of thyroid nodules among a cohort of 2376 subjects who were exposed to fallout from nuclear testing in Kazakhstan. We assessed the performance of our method through an extensive series of simulations and comparisons against conventional regression risk analysis methods. When the estimated doses contain relatively small amounts of uncertainty, the Bayesian method using multiple a priori plausible draws of dose vectors gave similar results to the conventional regression-based methods of dose-response analysis. However, when large and complex mixtures of shared and unshared uncertainties are present, the Bayesian method using multiple dose vectors had significantly lower relative bias than conventional regression-based risk analysis methods and better coverage, that is, a markedly increased capability to include the true risk coefficient within the 95% credible interval of the Bayesian-based risk estimate. An evaluation of the dose-response using our method is presented for an epidemiological study of thyroid disease following radiation exposure. Copyright © 2015 John Wiley & Sons, Ltd.
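The two-dimensional Monte Carlo idea can be sketched as follows: each realization applies one multiplicative error shared by the whole cohort plus an independent per-person error, yielding multiple plausible dose vectors (all magnitudes invented):

```python
import numpy as np

rng = np.random.default_rng(5)
n_people, n_real = 1000, 400

# Invented "true" doses for a cohort (Gy).
true_dose = rng.lognormal(np.log(0.1), 0.5, size=n_people)

# Shared error: one draw per realization, applied to everyone.
# Unshared error: an independent draw per person per realization.
shared = rng.lognormal(0.0, 0.3, size=(n_real, 1))
unshared = rng.lognormal(0.0, 0.4, size=(n_real, n_people))
dose_vectors = true_dose * shared * unshared   # n_real plausible dose sets

# Shared error moves the whole cohort together, so cohort means vary
# substantially across realizations; unshared error mostly averages out.
ratio = dose_vectors.mean(axis=1).std() / dose_vectors.mean()
print(round(ratio, 3))
```

Feeding each of these dose vectors through the dose-response model, rather than a single best-estimate vector, is what lets the Bayesian model-averaging step propagate the shared component of dose uncertainty into the risk estimate.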
Beach, Jeremy; Burstyn, Igor; Cherry, Nicola
2012-07-01
We previously described a method to identify the incidence of new-onset adult asthma (NOAA) in Alberta by industry and occupation, utilizing Workers' Compensation Board (WCB) and physician billing data. The aim of this study was to extend this method to data from British Columbia (BC) so as to compare the two provinces and to incorporate Bayesian methodology into estimates of risk. WCB claims for any reason 1995-2004 were linked to physician billing data. NOAA was defined as a billing for asthma (ICD-9 493) in the 12 months before a WCB claim without asthma in the previous 3 years. Incidence was calculated by occupation and industry. In a matched case-referent analysis, associations with exposures were examined using an asthma-specific job exposure matrix (JEM). Posterior distributions from the Alberta analysis and estimated misclassification parameters were used as priors in the Bayesian analysis of the BC data. Among 1 118 239 eligible WCB claims the incidence of NOAA was 1.4%. Sixteen occupations and 44 industries had a significantly increased risk; six industries had a decreased risk. The JEM identified wood dust [odds ratio (OR) 1.55, 95% confidence interval (CI) 1.08-2.24] and animal antigens (OR 1.66, 95% CI 1.17-2.36) as related to an increased risk of NOAA. Exposure to isocyanates was associated with decreased risk (OR 0.57, 95% CI 0.39-0.85). Bayesian analyses taking account of exposure misclassification and informative priors resulted in posterior distributions of ORs with lower boundary of 95% credible intervals >1.00 for almost all exposures. The distribution of NOAA in BC appeared somewhat similar to that in Alberta, except for isocyanates. Bayesian analyses allowed incorporation of prior evidence into risk estimates, permitting reconsideration of the apparently protective effect of isocyanate exposure.
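A conjugate sketch of using an informative prior: a normal prior on the log odds ratio (standing in for the Alberta posterior) combined with a normal summary of the current data. All OR values and standard errors here are invented:

```python
import numpy as np

def combine(prior_mean, prior_sd, lik_mean, lik_sd):
    """Precision-weighted normal-normal update on the log-OR scale."""
    w0, w1 = 1 / prior_sd**2, 1 / lik_sd**2
    post_mean = (w0 * prior_mean + w1 * lik_mean) / (w0 + w1)
    return post_mean, np.sqrt(1 / (w0 + w1))

log_or_prior = (np.log(1.4), 0.30)  # prior: OR 1.4 from the earlier province
log_or_lik = (np.log(1.1), 0.25)    # current-data estimate: OR 1.1

m, s = combine(*log_or_prior, *log_or_lik)
lo, hi = np.exp(m - 1.96 * s), np.exp(m + 1.96 * s)
print(f"posterior OR {np.exp(m):.2f} (95% CrI {lo:.2f}-{hi:.2f})")
```

The posterior sits between prior and data in proportion to their precisions; this is the mechanism by which prior evidence from Alberta can pull an apparently protective BC estimate back toward or above 1.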
Anderson-Cook, Christine Michaela
2017-03-01
Here, one of the substantial improvements to the practice of data analysis in recent decades is the change from reporting just a point estimate for a parameter or characteristic, to now including a summary of uncertainty for that estimate. Understanding the precision of the estimate for the quantity of interest provides better understanding of what to expect and how well we are able to predict future behavior from the process. For example, when we report a sample average as an estimate of the population mean, it is good practice to also provide a confidence interval (or credible interval, if you are doing a Bayesian analysis) to accompany that summary. This helps to calibrate what ranges of values are reasonable given the variability observed in the sample and the amount of data that were included in producing the summary.
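The practice described can be shown in a few lines; the data here are simulated:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
x = rng.normal(50.0, 8.0, size=25)   # simulated sample

# Point estimate plus a 95% t-based confidence interval for the mean.
m, se = x.mean(), stats.sem(x)
lo, hi = stats.t.interval(0.95, df=len(x) - 1, loc=m, scale=se)
print(f"mean {m:.1f}, 95% CI ({lo:.1f}, {hi:.1f})")
```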
Liu, Guang-ying; Zheng, Yang; Deng, Yan; Gao, Yan-yan; Wang, Lie
2013-01-01
Background Although transfusion-transmitted infection of hepatitis B virus (HBV) threatens the blood safety of China, the nationwide circumstance of HBV infection among blood donors is still unclear. Objectives To comprehensively estimate the prevalence of HBsAg positivity and occult HBV infection (OBI) among Chinese volunteer blood donors through Bayesian meta-analysis. Methods We performed an electronic search in PubMed, Web of Knowledge, Medline, Wanfang Data and CNKI, complemented by a hand search of relevant reference lists. Two authors independently extracted data from the eligible studies. Then two Bayesian random-effects meta-analyses were performed, followed by Bayesian meta-regressions. Results 5957412 and 571227 donors were identified in the HBsAg group and OBI group, respectively. The pooled prevalence in the HBsAg group and OBI group among donors is 1.085% (95% credible interval [CI] 0.859%∼1.398%) and 0.094% (95% CI 0.0578%∼0.1655%). For the HBsAg group, subgroup analysis shows the more developed area has a lower prevalence than the less developed area; meta-regression indicates there is a significant decreasing trend in HBsAg-positive prevalence with sampling year (beta = −0.1202, 95% CI −0.2081∼−0.0312). Conclusion Blood safety against HBV infection in China is suffering serious threats and the government should take effective measures to improve this situation. PMID:24236110
Evaluation of uncertainty in the adjustment of fundamental constants
NASA Astrophysics Data System (ADS)
Bodnar, Olha; Elster, Clemens; Fischer, Joachim; Possolo, Antonio; Toman, Blaza
2016-02-01
Combining multiple measurement results for the same quantity is an important task in metrology and in many other areas. Examples include the determination of fundamental constants, the calculation of reference values in interlaboratory comparisons, or the meta-analysis of clinical studies. However, neither the GUM nor its supplements give any guidance for this task. Various approaches are applied such as weighted least-squares in conjunction with the Birge ratio or random effects models. While the former approach, which is based on a location-scale model, is particularly popular in metrology, the latter represents a standard tool used in statistics for meta-analysis. We investigate the reliability and robustness of the location-scale model and the random effects model with particular focus on resulting coverage or credible intervals. The interval estimates are obtained by adopting a Bayesian point of view in conjunction with a non-informative prior that is determined by a currently favored principle for selecting non-informative priors. Both approaches are compared by applying them to simulated data as well as to data for the Planck constant and the Newtonian constant of gravitation. Our results suggest that the proposed Bayesian inference based on the random effects model is more reliable and less sensitive to model misspecifications than the approach based on the location-scale model.
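A minimal sketch of the weighted-least-squares/Birge-ratio approach described above: the uncertainty of the weighted mean is inflated when the scatter exceeds the stated uncertainties. The measured values are invented (loosely styled on determinations of a physical constant):

```python
import numpy as np

# Invented measurement results with stated standard uncertainties.
x = np.array([6.67430, 6.67408, 6.67455, 6.67398])
u = np.array([0.00015, 0.00024, 0.00013, 0.00020])

# Inverse-variance weighted mean and its nominal uncertainty.
w = 1.0 / u**2
mean = (w * x).sum() / w.sum()
u_mean = 1.0 / np.sqrt(w.sum())

# Birge ratio: sqrt(chi^2 / (n-1)); values > 1 signal excess scatter.
chi2 = (w * (x - mean) ** 2).sum()
birge = np.sqrt(chi2 / (len(x) - 1))
u_final = u_mean * max(birge, 1.0)   # inflate only when R_B > 1
print(f"mean {mean:.5f}, Birge ratio {birge:.2f}, u {u_final:.5f}")
```

The random-effects alternative studied in the paper instead adds an estimated between-laboratory variance to every stated uncertainty, which generally widens the interval in a data-driven rather than multiplicative way.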
Cortina-Borja, Mario; Tan, Hooi Kuan; Wallon, Martine; Paul, Malgorzata; Prusa, Andrea; Buffolano, Wilma; Malm, Gunilla; Salt, Alison; Freeman, Katherine; Petersen, Eskild; Gilbert, Ruth E.
2010-01-01
Background The effectiveness of prenatal treatment to prevent serious neurological sequelae or death (SNSD) of congenital toxoplasmosis is not known. Methods and Findings Congenital toxoplasmosis was prospectively identified by universal prenatal or neonatal screening in 14 European centres and children were followed for a median of 4 years. We evaluated determinants of SNSD, defined by postnatal death or one or more of functional neurological abnormalities, severe bilateral visual impairment, or pregnancy termination for confirmed congenital toxoplasmosis. Two-thirds of the cohort received prenatal treatment (189/293; 65%). 23/293 (8%) fetuses developed SNSD, of which nine were pregnancy terminations. Prenatal treatment reduced the risk of SNSD. The odds ratio for prenatal treatment, adjusted for gestational age at maternal seroconversion, was 0.24 (95% Bayesian credible interval 0.07–0.71). This effect was robust to most sensitivity analyses. The number of infected fetuses needed to be treated to prevent one case of SNSD was three (95% Bayesian credible interval 2–15) after maternal seroconversion at 10 weeks, and 18 (9–75) at 30 weeks of gestation. Pyrimethamine-sulphonamide treatment did not reduce SNSD compared with spiramycin alone (adjusted odds ratio 0.78, 0.21–2.95). The proportion of live-born infants with intracranial lesions detected postnatally who developed SNSD was 31.0% (17.0%–38.1%). Conclusion The finding that prenatal treatment reduced the risk of SNSD in infected fetuses should be interpreted with caution because of the low number of SNSD cases and uncertainty about the timing of maternal seroconversion. As these are observational data, policy decisions about screening require further evidence from a randomized trial of prenatal screening and from cost-effectiveness analyses that take into account the incidence and prevalence of maternal infection. PMID:20967235
Kwon, Deukwoo; Reis, Isildinha M
2015-08-12
When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating the mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions. The corresponding average relative error (ARE) approaches zero as sample size increases. In data generated from the normal distribution, our ABC method performs well. However, the Wan et al. method is best for estimating the standard deviation under the normal distribution. In the estimation of the mean, our ABC method is best regardless of the assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can be applied using other reported summary statistics, such as the posterior mean and 95% credible interval, when Bayesian analysis has been employed.
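An ABC rejection sketch of the estimation problem: draw candidate (mean, SD) pairs, simulate samples, and keep the candidates whose simulated median/min/max land closest to the reported ones. The paper's method is more refined; the target summaries here are invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# Reported summaries from a hypothetical study of size n.
n = 50
med_obs, min_obs, max_obs = 12.0, 4.0, 28.0

# Candidate (mu, sigma) pairs from broad uniform priors.
n_cand = 50_000
mu_c = rng.uniform(5, 25, size=n_cand)
sd_c = rng.uniform(0.5, 15, size=n_cand)

# Simulate a dataset per candidate and compute the same summaries.
sims = rng.normal(mu_c[:, None], sd_c[:, None], size=(n_cand, n))
dist = (np.abs(np.median(sims, axis=1) - med_obs)
        + np.abs(sims.min(axis=1) - min_obs)
        + np.abs(sims.max(axis=1) - max_obs))

# Accept the closest ~0.2% of candidates as approximate posterior draws.
keep = dist <= np.quantile(dist, 0.002)
mu_hat, sd_hat = mu_c[keep].mean(), sd_c[keep].mean()
print(f"mean ~ {mu_hat:.1f}, sd ~ {sd_hat:.1f}")
```

Because only summary distances are compared, the same scheme works unchanged if the simulating distribution is swapped for a skewed or heavy-tailed one, which is where the paper reports its largest gains.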
Buscot, Marie-Jeanne; Wotherspoon, Simon S; Magnussen, Costan G; Juonala, Markus; Sabin, Matthew A; Burgner, David P; Lehtimäki, Terho; Viikari, Jorma S A; Hutri-Kähönen, Nina; Raitakari, Olli T; Thomson, Russell J
2017-06-06
Bayesian hierarchical piecewise regression (BHPR) modeling has not previously been formulated to detect and characterise the mechanism of trajectory divergence between groups of participants that have longitudinal responses with distinct developmental phases. These models are useful when participants in a prospective cohort study are grouped according to a distal dichotomous health outcome. Indeed, a refined understanding of how deleterious risk factor profiles develop across the life-course may help inform early-life interventions. Previous techniques to determine between-group differences in risk factors at each age may result in biased estimates of the age at divergence. We demonstrate the use of BHPR to generate a point estimate and credible interval for the age at which trajectories diverge between groups for continuous outcome measures that exhibit non-linear within-person response profiles over time. We illustrate our approach by modeling the divergence in childhood-to-adulthood body mass index (BMI) trajectories between two groups of adults with/without type 2 diabetes mellitus (T2DM) in the Cardiovascular Risk in Young Finns Study (YFS). Using the proposed BHPR approach, we estimated that the BMI profiles of participants with T2DM diverged from healthy participants at age 16 years for males (95% credible interval (CI): 13.5-18 years) and 21 years for females (95% CI: 19.5-23 years). These data suggest that a critical window for weight management intervention in preventing T2DM might exist before the age when BMI growth rate is naturally expected to decrease. Simulation showed that when using pairwise comparison of least-square means from categorical mixed models, smaller sample sizes tended to yield later estimates of the age of divergence. In contrast, the point estimate of the divergence time is not biased by sample size when using the proposed BHPR method.
BHPR is a powerful analytic tool to model long-term non-linear longitudinal outcomes, enabling the identification of the age at which risk factor trajectories diverge between groups of participants. The method is suitable for the analysis of unbalanced longitudinal data, with only a limited number of repeated measures per participants and where the time-related outcome is typically marked by transitional changes or by distinct phases of change over time.
Kaguelidou, Florentia; Alberti, Corinne; Biran, Valerie; Bourdon, Olivier; Farnoux, Caroline; Zohar, Sarah; Jacqz-Aigrain, Evelyne
2016-01-01
Proton pump inhibitors are frequently administered on the basis of clinical symptoms in neonates, but their benefit remains controversial. Clinical trials validating omeprazole dosage in neonates are limited. The objective of this trial was to determine the minimum effective dose (MED) of omeprazole to treat pathological acid reflux in neonates, using reflux index as a surrogate marker. Double-blind dose-finding trial with continual reassessment method of individual dose administration using a Bayesian approach, aiming to select the drug dose as close as possible to the predefined target level of efficacy (with a credibility interval of 95%). Neonatal intensive care unit of the Robert Debré University Hospital in Paris, France. Neonates with a postmenstrual age ≥ 35 weeks and a pathologic 24-hour intra-esophageal pH monitoring, defined by a reflux index ≥ 5% over 24 hours, were considered for participation. Recruitment was stratified into 3 groups according to gestational age (GA) at birth. Five preselected doses of oral omeprazole from 1 to 3 mg/kg/day were studied. The primary outcome, measured at 35 weeks postmenstrual age or more, was a reflux index <5% during the 24-h pH monitoring registered 72±24 hours after omeprazole initiation. Fifty-four neonates with a reflux index ranging from 5.06 to 27.7% were included. Median age was 37.5 days and median postmenstrual age was 36 weeks. In neonates born at less than 32 weeks of GA (n = 30), the MED was 2.5 mg/kg/day with an estimated mean posterior probability of success of 97.7% (95% credibility interval: 90.3-99.7%). The MED was 1 mg/kg/day for neonates born at more than 32 weeks of GA (n = 24). Omeprazole is extensively prescribed on the basis of clinical symptoms, but its efficacy is not demonstrated while safety concerns do exist. When treatment is required, the daily dose needs to be validated in preterm and term neonates. Optimal doses of omeprazole to increase gastric pH and decrease reflux index below 5% over 24 hours, determined using an adaptive Bayesian design, differ among neonates.
Both gestational and postnatal ages account for these differences but their differential impact on omeprazole doses remains to be determined.
Prion Amplification and Hierarchical Bayesian Modeling Refine Detection of Prion Infection
NASA Astrophysics Data System (ADS)
Wyckoff, A. Christy; Galloway, Nathan; Meyerett-Reid, Crystal; Powers, Jenny; Spraker, Terry; Monello, Ryan J.; Pulford, Bruce; Wild, Margaret; Antolin, Michael; Vercauteren, Kurt; Zabel, Mark
2015-02-01
Prions are unique infectious agents that replicate without a genome and cause neurodegenerative diseases that include chronic wasting disease (CWD) of cervids. Immunohistochemistry (IHC) is currently considered the gold standard for diagnosis of a prion infection but may be insensitive to early or sub-clinical CWD that are important to understanding CWD transmission and ecology. We assessed the potential of serial protein misfolding cyclic amplification (sPMCA) to improve detection of CWD prior to the onset of clinical signs. We analyzed tissue samples from free-ranging Rocky Mountain elk (Cervus elaphus nelsoni) and used hierarchical Bayesian analysis to estimate the specificity and sensitivity of IHC and sPMCA conditional on simultaneously estimated disease states. Sensitivity estimates were higher for sPMCA (99.51%, credible interval (CI) 97.15-100%) than IHC of obex (brain stem, 76.56%, CI 57.00-91.46%) or retropharyngeal lymph node (90.06%, CI 74.13-98.70%) tissues, or both (98.99%, CI 90.01-100%). Our hierarchical Bayesian model predicts the prevalence of prion infection in this elk population to be 18.90% (CI 15.50-32.72%), compared to previous estimates of 12.90%. Our data reveal a previously unidentified sub-clinical prion-positive portion of the elk population that could represent silent carriers capable of significantly impacting CWD ecology.
A Bayesian approach to meta-analysis of plant pathology studies.
Mila, A L; Ngugi, H K
2011-01-01
Bayesian statistical methods are used for meta-analysis in many disciplines, including medicine, molecular biology, and engineering, but have not yet been applied for quantitative synthesis of plant pathology studies. In this paper, we illustrate the key concepts of Bayesian statistics and outline the differences between Bayesian and classical (frequentist) methods in the way parameters describing population attributes are considered. We then describe a Bayesian approach to meta-analysis and present a plant pathological example based on studies evaluating the efficacy of plant protection products that induce systemic acquired resistance for the management of fire blight of apple. In a simple random-effects model assuming a normal distribution of effect sizes and no prior information (i.e., a noninformative prior), the results of the Bayesian meta-analysis are similar to those obtained with classical methods. Implementing the same model with a Student's t distribution and a noninformative prior for the effect sizes, instead of a normal distribution, yields similar results for all but acibenzolar-S-methyl (Actigard), which was evaluated in only seven studies in this example. Whereas both the classical (P = 0.28) and the Bayesian analysis with a noninformative prior (95% credibility interval [CRI] for the log response ratio: -0.63 to 0.08) indicate a nonsignificant effect for Actigard, specifying a t distribution resulted in a significant, albeit variable, effect for this product (CRI: -0.73 to -0.10). These results confirm the sensitivity of the analytical outcome (i.e., the posterior distribution) to the choice of prior in Bayesian meta-analyses involving a limited number of studies. We review some pertinent literature on more advanced topics, including modeling of among-study heterogeneity, publication bias, analyses involving a limited number of studies, and methods for dealing with missing data, and show how these issues can be approached in a Bayesian framework. 
Bayesian meta-analysis can readily include information not easily incorporated in classical methods, and allow for a full evaluation of competing models. Given the power and flexibility of Bayesian methods, we expect them to become widely adopted for meta-analysis of plant pathology studies.
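The simple random-effects model with a noninformative prior described in this abstract can be sketched numerically. In the sketch below the study effect sizes (log response ratios), their variances, and the between-study variance are hypothetical, and the between-study variance is held fixed for illustration rather than estimated; with a flat prior on the overall mean, the posterior is available in closed form.

```python
import numpy as np

# Hypothetical study effect sizes (log response ratios) and their variances;
# tau2 is the between-study variance, fixed here for illustration.
y = np.array([-0.40, -0.15, -0.55, 0.05, -0.30, -0.20, -0.25])
v = np.array([0.04, 0.06, 0.05, 0.08, 0.03, 0.07, 0.05])
tau2 = 0.02

# With a flat (noninformative) prior on the overall mean mu, the posterior
# for mu is normal with precision-weighted mean and variance:
w = 1.0 / (v + tau2)                  # precision weights
post_mean = np.sum(w * y) / np.sum(w)
post_sd = np.sqrt(1.0 / np.sum(w))

# Central 95% credibility interval for mu
lo, hi = post_mean - 1.96 * post_sd, post_mean + 1.96 * post_sd
print(round(post_mean, 3), round(lo, 3), round(hi, 3))
```

An interval excluding zero would correspond to a "significant" pooled effect in the sense used above; replacing the normal likelihood with a Student's t, as the paper does, has no closed form and would require sampling.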
Hay, K E; Morton, J M; Clements, A C A; Mahony, T J; Barnes, T S
2016-06-01
Bovine respiratory disease (BRD) is the major cause of clinical disease and death in feedlot cattle. A prospective longitudinal study was conducted in a population of Australian feedlot cattle to assess associations between factors related to feedlot management and risk of BRD. In total, 35,131 animals in 170 pens (cohorts) inducted into 14 feedlots were included in statistical analyses. Causal diagrams were used to inform model building to allow separate estimation of total and direct effects. Multilevel mixed effects logistic regression models were fitted within the Bayesian framework. The placement of pen water troughs such that they could be accessed by animals in adjoining pens was associated with markedly increased risk of BRD (OR 4.3, 95% credible interval: 1.4-10.3). Adding animals to pens over multiple days was associated with increased risk of BRD across all animals in those pens compared to placing all animals in the pen on a single day (total effect: OR 1.9, 95% credible interval: 1.2-2.8). The much attenuated direct effect indicated that this was primarily mediated via factors on indirect pathways so it may be possible to ameliorate the adverse effects of adding animals to pens over multiple days by altering exposure to these intervening factors (e.g. mixing history). In pens in which animals were added to the pen over multiple days, animals added ≥7 days (OR: 0.7, credible interval: 0.5-0.9) or 1-6 days (OR: 0.8, credible interval: 0.7-1.0) before the last animal was added were at modestly reduced risk of BRD compared to the animals that were added to the pen on the latest day. Further research is required to disentangle effects of cohort formation patterns at animal-level and higher levels on animal-level risk of BRD. Vaccination against Bovine herpesvirus 1 at feedlot entry was investigated but results were inconclusive and further research is required to evaluate vaccine efficacy. 
We conclude that there are practical interventions available to feedlot managers to reduce the risk of cattle developing BRD at the feedlot. We recommend placement of water troughs in feedlot pens so that they cannot be accessed by animals in adjoining pens. Further research is required to identify practical and cost-effective management strategies that allow longer adaptation times for cattle identified prior to induction as being at higher risk of developing BRD. Copyright © 2016 Elsevier B.V. All rights reserved.
Oldenkamp, Rik; Hendriks, Harrie W M; van de Meent, Dik; Ragas, Ad M J
2015-09-01
Species in the aquatic environment differ in their toxicological sensitivity to the various chemicals they encounter. In aquatic risk assessment, this interspecies variation is often quantified via species sensitivity distributions. Because the information available for the characterization of these distributions is typically limited, optimal use of information is essential to reduce uncertainty involved in the assessment. In the present study, we show that the credibility intervals on the estimated potentially affected fraction of species after exposure to a mixture of chemicals at environmentally relevant surface water concentrations can be extremely wide if a classical approach is followed, in which each chemical in the mixture is considered in isolation. As an alternative, we propose a hierarchical Bayesian approach, in which knowledge on the toxicity of chemicals other than those assessed is incorporated. A case study with a mixture of 13 pharmaceuticals demonstrates that this hierarchical approach results in more realistic estimations of the potentially affected fraction, as a result of reduced uncertainty in species sensitivity distributions for data-poor chemicals.
A Bayesian inferential approach to quantify the transmission intensity of disease outbreak.
Kadi, Adiveppa S; Avaradi, Shivakumari R
2015-01-01
The emergence of infectious diseases like the 2009 influenza A(H1N1) pandemic has become a great concern, posing new challenges to health authorities worldwide. To control these diseases, various studies have been developed in the field of mathematical modelling, a useful tool for understanding epidemiological dynamics and their dependence on social mixing patterns. We used a Bayesian approach to quantify the disease outbreak through the key epidemiological parameter, the basic reproduction number (R0), using effective contacts, defined as the sum of the products of incidence cases and the probabilities of the generation time distribution. We estimated R0 from daily case incidence data for pandemic influenza A/H1N1 2009 in India for the initial phase. The estimated R0 with 95% credible interval is consistent with several other studies on the same strain. Through sensitivity analysis, our study indicates that infectiousness affects the estimate of R0. The basic reproduction number R0 provides useful information to the public health system for controlling the disease through mitigation strategies such as vaccination and quarantine.
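The "effective contacts" construction above, past incidence weighted by the generation-time distribution, can be sketched with a conjugate Poisson-gamma model. The incidence series, generation-time weights, and Gamma(1, 1) prior below are illustrative, not the H1N1 data or estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily incidence for the initial phase of an outbreak
incidence = np.array([1, 2, 2, 4, 5, 8, 11, 15, 22, 30])

# Illustrative discretised generation-time distribution (lags 1..4 days)
w = np.array([0.2, 0.4, 0.3, 0.1])

# Effective contacts on day t: sum over lags of past incidence times
# the generation-time probability.
lam = np.array([sum(incidence[t - s - 1] * w[s]
                    for s in range(len(w)) if t - s - 1 >= 0)
                for t in range(len(incidence))])

# Poisson likelihood I_t ~ Poisson(R0 * lam_t) with a Gamma(1, 1) prior on R0
# gives a Gamma posterior by conjugacy (skip t = 0, where lam is zero).
a_post = 1 + incidence[1:].sum()
b_post = 1 + lam[1:].sum()

r0_mean = a_post / b_post
samples = rng.gamma(a_post, 1.0 / b_post, size=100_000)
ci = np.percentile(samples, [2.5, 97.5])          # 95% credible interval
print(r0_mean, ci)
```

The 95% credible interval here comes from posterior draws; with a conjugate gamma posterior it could equally be read off the gamma quantile function.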
Taking error into account when fitting models using Approximate Bayesian Computation.
van der Vaart, Elske; Prangle, Dennis; Sibly, Richard M
2018-03-01
Stochastic computer simulations are often the only practical way of answering questions relating to ecological management. However, due to their complexity, such models are difficult to calibrate and evaluate. Approximate Bayesian Computation (ABC) offers an increasingly popular approach to this problem, widely applied across a variety of fields. However, ensuring the accuracy of ABC's estimates has been difficult. Here, we obtain more accurate estimates by incorporating estimation of error into the ABC protocol. We show how this can be done where the data consist of repeated measures of the same quantity and errors may be assumed to be normally distributed and independent. We then derive the correct acceptance probabilities for a probabilistic ABC algorithm, and update the coverage test with which accuracy is assessed. We apply this method, which we call error-calibrated ABC, to a toy example and a realistic 14-parameter simulation model of earthworms that is used in environmental risk assessment. A comparison with exact methods and the diagnostic coverage test show that our approach improves estimation of parameter values and their credible intervals for both models. © 2017 by the Ecological Society of America.
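For context, plain rejection ABC, the baseline that the error-calibrated variant above improves on, can be sketched in a few lines. The data, prior, summary statistic, and tolerance below are all hypothetical; the paper's method replaces the hard accept/reject threshold with a probabilistic acceptance based on a normal error model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "observed" repeated measures with true mean 5
observed = rng.normal(5.0, 1.0, size=50)
obs_stat = observed.mean()

# Plain rejection ABC for the mean, with a Uniform(0, 10) prior
accepted = []
for _ in range(20_000):
    theta = rng.uniform(0.0, 10.0)           # draw a parameter from the prior
    sim = rng.normal(theta, 1.0, size=50)    # run the "simulator"
    if abs(sim.mean() - obs_stat) < 0.1:     # accept if summaries are close
        accepted.append(theta)

posterior = np.array(accepted)
ci = np.percentile(posterior, [2.5, 97.5])   # approximate credible interval
print(posterior.mean(), ci)
```

The coverage test mentioned in the abstract checks whether such intervals contain the true parameter at their nominal rate across many synthetic datasets.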
Sharmin, Sifat; Glass, Kathryn; Viennet, Elvina; Harley, David
2018-04-01
Determining the relation between climate and dengue incidence is challenging due to under-reporting of disease and consequent biased incidence estimates. Non-linear associations between climate and incidence compound this. Here, we introduce a modelling framework to estimate dengue incidence from passive surveillance data while incorporating non-linear climate effects. We estimated the true number of cases per month using a Bayesian generalised linear model, developed in stages to adjust for under-reporting. A semi-parametric thin-plate spline approach was used to quantify non-linear climate effects. The approach was applied to data collected from the national dengue surveillance system of Bangladesh. The model estimated that only 2.8% (95% credible interval 2.7-2.8) of all cases in the capital Dhaka were reported through passive case reporting. The optimal mean monthly temperature for dengue transmission is 29℃ and average monthly rainfall above 15 mm decreases transmission. Our approach provides an estimate of true incidence and an understanding of the effects of temperature and rainfall on dengue transmission in Dhaka, Bangladesh.
Impact of Bayesian Priors on the Characterization of Binary Black Hole Coalescences
NASA Astrophysics Data System (ADS)
Vitale, Salvatore; Gerosa, Davide; Haster, Carl-Johan; Chatziioannou, Katerina; Zimmerman, Aaron
2017-12-01
In a regime where data are only mildly informative, prior choices can play a significant role in Bayesian statistical inference, potentially affecting the inferred physics. We show this is indeed the case for some of the parameters inferred from current gravitational-wave measurements of binary black hole coalescences. We reanalyze the first detections performed by the twin LIGO interferometers using alternative (and astrophysically motivated) prior assumptions. We find different prior distributions can introduce deviations in the resulting posteriors that impact the physical interpretation of these systems. For instance, (i) limits on the 90% credible interval on the effective black hole spin χeff are subject to variations of ∼10% if a prior with black hole spins mostly aligned to the binary's angular momentum is considered instead of the standard choice of isotropic spin directions, and (ii) under priors motivated by the initial stellar mass function, we infer tighter constraints on the black hole masses, and in particular, we find no support for any of the inferred masses within the putative mass gap M ≲ 5 M⊙.
Practical differences among probabilities, possibilities, and credibilities
NASA Astrophysics Data System (ADS)
Grandin, Jean-Francois; Moulin, Caroline
2002-03-01
This paper presents some important differences between theories that allow uncertainty management in data fusion. The main comparative results illustrated in this paper are the following. Incompatibility between decisions obtained from probabilities and from credibilities is highlighted. In the dynamic frame, as remarked in [19] or [17], the belief and plausibility of the Dempster-Shafer model do not frame the Bayesian probability. This framing can, however, be obtained by the Modified Dempster-Shafer approach; it can also be obtained in the Bayesian framework, either by simulation techniques or by studentization. The uncommitted mass in the Dempster-Shafer approach, i.e. the mass assigned to ignorance, provides a mechanism similar to reliability in the Bayesian model. Uncommitted mass in Dempster-Shafer theory, or reliability in Bayes theory, acts like a filter that weakens extracted information and improves robustness to outliers. It is therefore logical to observe, on examples like the one presented in particular by D.M. Buede, faster convergence of a Bayesian method that does not take reliability into account than of a Dempster-Shafer method that uses uncommitted mass. But on Bayesian masses, if reliability is taken into account at the same level as the uncommitted mass, e.g. F = 1 - m, we observe an equivalent rate of convergence. When the Dempster-Shafer and Bayes operators are informed by uncertainty, faster or slower convergence can be exhibited on non-Bayesian masses. This is due to positive or negative synergy between the information delivered by the sensors, a direct consequence of non-additivity when considering non-Bayesian masses. Lack of knowledge of the prior in Bayesian techniques can be quickly compensated by the information accumulated over time by a set of sensors. All these results are presented on simple examples and developed where necessary.
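Dempster's rule of combination with an uncommitted mass, the mechanism this abstract contrasts with Bayesian reliability, can be sketched on a two-hypothesis frame. The sensors and mass values below are hypothetical, not taken from the paper's examples.

```python
# Dempster's rule of combination on a binary frame {A, B}, with mass on the
# full frame Θ (key 'AB') representing the "uncommitted" mass (ignorance).
def combine(m1, m2):
    """Combine two mass functions over focal sets 'A', 'B', 'AB' (= Θ)."""
    m = {'A': 0.0, 'B': 0.0, 'AB': 0.0}
    conflict = 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = set(s1) & set(s2)
            if not inter:
                conflict += v1 * v2              # mass lost to conflict
            else:
                m[''.join(sorted(inter))] += v1 * v2
    return {k: v / (1.0 - conflict) for k, v in m.items()}  # renormalise

sensor1 = {'A': 0.6, 'B': 0.1, 'AB': 0.3}        # 0.3 uncommitted
sensor2 = {'A': 0.5, 'B': 0.2, 'AB': 0.3}

fused = combine(sensor1, sensor2)
belief_A = fused['A']                            # Bel(A)
plausibility_A = fused['A'] + fused['AB']        # Pl(A) = Bel(A) + m(Θ)
print(fused, belief_A, plausibility_A)
```

With zero uncommitted mass the same rule reduces to a Bayesian update, which is why the abstract can compare the two operators head to head.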
DiMaggio, Charles; Chen, Qixuan; Muennig, Peter A; Li, Guohua
2014-12-01
In 2005, the US Congress allocated $612 million for a national Safe Routes to School (SRTS) program to encourage walking and bicycling to schools. We evaluated the effectiveness of SRTS in controlling pedestrian injuries among school-age children. Bayesian changepoint analysis was applied to model the quarterly counts of pedestrian injuries among 5- to 19-year-old children in New York City between 2001 and 2010 during school-travel hours in census tracts with and without SRTS. An overdispersed Poisson model was used to estimate the difference-in-differences in injury risk between census tracts with and without SRTS following the changepoint. In SRTS-intervention census tracts, a changepoint in the quarterly counts of injuries was identified in the second quarter of 2008, which was consistent with the timing of the implementation of SRTS interventions. In census tracts with SRTS interventions, the estimated quarterly rates of pedestrian injury per 10,000 population among school-age children during school-travel hours were 3.47 (95% Credible Interval [CrI] 2.67, 4.39) prior to the changepoint, and 0.74 (95% CrI 0.30, 1.50) after the changepoint. There was no change in the average number of quarterly injuries in non-SRTS census tracts. Overdispersed Poisson modeling revealed that SRTS implementation was associated with a 44% reduction (95% Confidence Interval [CI] 87% decrease to 130% increase) in school-age pedestrian injury risk during school-travel hours. Bayesian changepoint analysis of quarterly counts of school-age pedestrian injuries successfully identified the timing of the SRTS intervention in New York City. Implementation of the SRTS program in New York City appears to be effective in reducing school-age pedestrian injuries during school-travel hours.
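A minimal Bayesian changepoint analysis for quarterly Poisson counts, in the spirit of the approach above, can be sketched with conjugate gamma priors and an exhaustive scan over candidate changepoints. The counts, priors, and the true changepoint below are synthetic; this ignores the overdispersion the paper models.

```python
import numpy as np
from math import lgamma, log

rng = np.random.default_rng(2)

# Synthetic quarterly injury counts: the rate drops from 12 to 4 at quarter 20
counts = np.concatenate([rng.poisson(12, 20), rng.poisson(4, 20)])

def log_marginal(seg, a=1.0, b=0.1):
    """Log marginal likelihood of a Poisson segment under a Gamma(a, b) prior
    on the rate (the factorial terms cancel when comparing changepoints)."""
    s, n = seg.sum(), len(seg)
    return a * log(b) - lgamma(a) + lgamma(a + s) - (a + s) * log(b + n)

# Posterior over the changepoint position (uniform prior on tau)
taus = np.arange(1, len(counts))
logpost = np.array([log_marginal(counts[:t]) + log_marginal(counts[t:])
                    for t in taus])
post = np.exp(logpost - logpost.max())
post /= post.sum()

tau_map = taus[np.argmax(post)]      # most probable changepoint
print(tau_map)
```

The full posterior `post` also yields a credible set of changepoint locations, which is how timing uncertainty would be reported.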
Neutron multiplicity counting: Confidence intervals for reconstruction parameters
Verbeke, Jerome M.
2016-03-09
From nuclear materials accountability to homeland security, the need for improved nuclear material detection, assay, and authentication has grown over the past decades. Starting in the 1940s, neutron multiplicity counting techniques have enabled quantitative evaluation of masses and multiplications of fissile materials. In this paper, we propose a new method to compute uncertainties on these parameters using a model-based sequential Bayesian processor, resulting in credible regions in the fissile material mass and multiplication space. These uncertainties will enable us to evaluate quantitatively proposed improvements to the theoretical fission chain model. Additionally, because the processor can calculate uncertainties in real time, it is a useful tool in applications such as portal monitoring: monitoring can stop as soon as a preset confidence of non-threat is reached.
Tutorial: Asteroseismic Stellar Modelling with AIMS
NASA Astrophysics Data System (ADS)
Lund, Mikkel N.; Reese, Daniel R.
The goal of AIMS (Asteroseismic Inference on a Massive Scale) is to estimate stellar parameters and credible intervals/error bars in a Bayesian manner from a set of asteroseismic frequency data and so-called classical constraints. To achieve reliable parameter estimates and computational efficiency, it searches through a grid of pre-computed models using an MCMC algorithm; interpolation within the grid of models is performed by first tessellating the grid using a Delaunay triangulation and then doing a linear barycentric interpolation on matching simplexes. Inputs for the modelling consist of individual frequencies from peak-bagging, which can be complemented with classical spectroscopic constraints. AIMS is mostly written in Python with a modular structure to facilitate contributions from the community. Only a few computationally intensive parts have been rewritten in Fortran in order to speed up calculations.
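The interpolation step described above, linear barycentric interpolation on a simplex of the tessellated grid, can be sketched for a single 2-D triangle. The vertices and the model values attached to them are hypothetical, and a real grid would be higher-dimensional.

```python
import numpy as np

# One simplex (a 2-D triangle) of a model grid, with a model output f
# stored at each vertex; values are illustrative.
verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])   # simplex vertices
f = np.array([10.0, 14.0, 18.0])                         # values at vertices

def barycentric_interp(p):
    """Linear barycentric interpolation of f at point p inside the simplex."""
    # Solve T @ [w1, w2] = p - v3 for the first two barycentric coordinates.
    T = (verts[:2] - verts[2]).T
    w12 = np.linalg.solve(T, p - verts[2])
    w = np.append(w12, 1.0 - w12.sum())                  # weights sum to 1
    return w @ f

print(barycentric_interp(np.array([0.25, 0.25])))
```

Interpolating at a vertex recovers that vertex's value exactly, which is the property that makes this scheme continuous across adjacent simplexes of the Delaunay tessellation.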
Coker, Eric; Gunier, Robert; Bradman, Asa; Harley, Kim; Kogut, Katherine; Molitor, John; Eskenazi, Brenda
2017-05-09
We previously showed that potential prenatal exposure to agricultural pesticides was associated with adverse neurodevelopmental outcomes in children, yet the effects of joint exposure to multiple pesticides is poorly understood. In this paper, we investigate associations between the joint distribution of agricultural use patterns of multiple pesticides (denoted as "pesticide profiles") applied near maternal residences during pregnancy and Full-Scale Intelligence Quotient (FSIQ) at 7 years of age. Among a cohort of children residing in California's Salinas Valley, we used Pesticide Use Report (PUR) data to characterize potential exposure from use within 1 km of maternal residences during pregnancy for 15 potentially neurotoxic pesticides from five different chemical classes. We used Bayesian profile regression (BPR) to examine associations between clustered pesticide profiles and deficits in childhood FSIQ. BPR identified eight distinct clusters of prenatal pesticide profiles. Two of the pesticide profile clusters exhibited some of the highest cumulative pesticide use levels and were associated with deficits in adjusted FSIQ of -6.9 (95% credible interval: -11.3, -2.2) and -6.4 (95% credible interval: -13.1, 0.49), respectively, when compared with the pesticide profile cluster that showed the lowest level of pesticides use. Although maternal residence during pregnancy near high agricultural use of multiple neurotoxic pesticides was associated with FSIQ deficit, the magnitude of the associations showed potential for sub-additive effects. Epidemiologic analysis of pesticides and their potential health effects can benefit from a multi-pollutant approach to analysis.
Virlogeux, Victor; Yang, Juan; Fang, Vicky J; Feng, Luzhao; Tsang, Tim K; Jiang, Hui; Wu, Peng; Zheng, Jiandong; Lau, Eric H Y; Qin, Ying; Peng, Zhibin; Peiris, J S Malik; Yu, Hongjie; Cowling, Benjamin J
2016-01-01
In early 2013, a novel avian-origin influenza A(H7N9) virus emerged in China and has caused sporadic human infections. The incubation period is the delay from infection until onset of symptoms, and varies from person to person. Few previous studies have examined whether the duration of the incubation period correlates with subsequent disease severity. We analyzed data on the period of exposure for 395 human cases of laboratory-confirmed influenza A(H7N9) virus infection in China in a Bayesian framework using a Weibull distribution. We found a longer incubation period for the 173 fatal cases, with a mean of 3.7 days (95% credibility interval, CrI: 3.4-4.1), compared to a mean of 3.3 days (95% CrI: 2.9-3.6) for the 222 non-fatal cases; the difference in means was marginally significant at 0.47 days (95% CrI: -0.04, 0.99). There was a statistically significant correlation between a longer incubation period and an increased risk of death after adjustment for age, sex, geographical location and underlying medical conditions (adjusted odds ratio 1.70 per day increase in incubation period; 95% credibility interval 1.47-1.97). We found a significant association between a longer incubation period and a greater risk of death among human H7N9 cases. The underlying biological mechanisms leading to this association deserve further exploration.
NASA Astrophysics Data System (ADS)
Eadie, Gwendolyn M.; Springford, Aaron; Harris, William E.
2017-02-01
We present a hierarchical Bayesian method for estimating the total mass and mass profile of the Milky Way Galaxy. The new hierarchical Bayesian approach further improves the framework presented by Eadie et al. and Eadie and Harris and builds upon the preliminary reports by Eadie et al. The method uses a distribution function f(E, L) to model the Galaxy and kinematic data from satellite objects, such as globular clusters (GCs), to trace the Galaxy's gravitational potential. A major advantage of the method is that it not only includes complete and incomplete data simultaneously in the analysis, but also incorporates measurement uncertainties in a coherent and meaningful way. We first test the hierarchical Bayesian framework, which includes measurement uncertainties, using the same data and power-law model assumed in Eadie and Harris, and find the results are similar but more strongly constrained. Next, we take advantage of the new statistical framework and incorporate all possible GC data, finding a cumulative mass profile with Bayesian credible regions. This profile implies a mass within 125 kpc of 4.8 × 10^11 M⊙ with a 95% Bayesian credible region of (4.0-5.8) × 10^11 M⊙. Our results also provide estimates of the true specific energies of all the GCs. By comparing these estimated energies to the measured energies of GCs with complete velocity measurements, we observe that (the few) remote tracers with complete measurements may play a large role in determining a total mass estimate of the Galaxy. Thus, our study stresses the need for more remote tracers with complete velocity measurements.
Bayesian inference for the distribution of grams of marijuana in a joint.
Ridgeway, Greg; Kilmer, Beau
2016-08-01
The average amount of marijuana in a joint is unknown, yet this figure is a critical quantity for creating credible measures of marijuana consumption. It is essential for projecting tax revenues post-legalization, estimating the size of illicit marijuana markets, and learning about how much marijuana users are consuming in order to understand health and behavioral consequences. Arrestee Drug Abuse Monitoring data collected between 2000 and 2010 contain relevant information on 10,628 marijuana transactions, joints and loose marijuana purchases, including the city in which the purchase occurred and the price paid for the marijuana. Using the Brown-Silverman drug pricing model to link marijuana price and weight, we are able to infer the distribution of grams of marijuana in a joint and provide a Bayesian posterior distribution for the mean weight of marijuana in a joint. We estimate that the mean weight of marijuana in a joint is 0.32g (95% Bayesian posterior interval: 0.30-0.35). Our estimate of the mean weight of marijuana in a joint is lower than figures commonly used to make estimates of marijuana consumption. These estimates can be incorporated into drug policy discussions to produce better understanding about illicit marijuana markets, the size of potential legalized marijuana markets, and health and behavior outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Hu, Zhiyong; Liebens, Johan; Rao, K Ranga
2008-01-01
Background: Relatively few studies have examined the association between air pollution and stroke mortality. Inconsistent and inconclusive results from existing studies on air pollution and stroke justify the need to continue to investigate the linkage between stroke and air pollution. No studies have been done to investigate the association between stroke and greenness. The objective of this study was to examine whether there is an association of stroke with air pollution, income and greenness in northwest Florida. Results: Our study used an ecological geographical approach and a dasymetric mapping technique. We adopted a Bayesian hierarchical model with a convolution prior considering five census tract-specific covariates. A 95% credible set, which defines an interval having a 0.95 posterior probability of containing the parameter for each covariate, was calculated from Markov chain Monte Carlo simulations. The 95% credible sets are (-0.286, -0.097) for household income, (0.034, 0.144) for the traffic air pollution effect, (0.419, 1.495) for the emission density of monitored point source polluters, (0.413, 1.522) for the simple point density of point source polluters without emission data, and (-0.289, -0.031) for greenness. Household income and greenness show negative effects (the posterior densities primarily cover negative values). Air pollution covariates have positive effects (the 95% credible sets cover positive values). Conclusion: High risk of stroke mortality was found in areas with low income level, high air pollution level, and low level of exposure to green space. PMID:18452609
NASA Astrophysics Data System (ADS)
Huang, Maoyi; Ray, Jaideep; Hou, Zhangshuan; Ren, Huiying; Liu, Ying; Swiler, Laura
2016-07-01
The Community Land Model (CLM) has been widely used in climate and Earth system modeling. Accurate estimation of model parameters is needed for reliable model simulations and predictions under current and future conditions, respectively. In our previous work, a subset of hydrological parameters has been identified to have significant impact on surface energy fluxes at selected flux tower sites based on parameter screening and sensitivity analysis, which indicate that the parameters could potentially be estimated from surface flux observations at the towers. To date, such estimates do not exist. In this paper, we assess the feasibility of applying a Bayesian model calibration technique to estimate CLM parameters at selected flux tower sites under various site conditions. The parameters are estimated as a joint probability density function (PDF) that provides estimates of uncertainty of the parameters being inverted, conditional on climatologically average latent heat fluxes derived from observations. We find that the simulated mean latent heat fluxes from CLM using the calibrated parameters are generally improved at all sites when compared to those obtained with CLM simulations using default parameter sets. Further, our calibration method also results in credibility bounds around the simulated mean fluxes which bracket the measured data. The modes (or maximum a posteriori values) and 95% credibility intervals of the site-specific posterior PDFs are tabulated as suggested parameter values for each site. Analysis of relationships between the posterior PDFs and site conditions suggests that the parameter values are likely correlated with the plant functional type, which needs to be confirmed in future studies by extending the approach to more sites.
Pedroza, Claudia; Han, Weilu; Thanh Truong, Van Thi; Green, Charles; Tyson, Jon E
2018-01-01
One of the main advantages of Bayesian analyses of clinical trials is their ability to formally incorporate skepticism about large treatment effects through the use of informative priors. We conducted a simulation study to assess the performance of informative normal, Student-t, and beta distributions in estimating relative risk (RR) or odds ratio (OR) for binary outcomes. Simulation scenarios varied the prior standard deviation (SD; level of skepticism of large treatment effects), outcome rate in the control group, true treatment effect, and sample size. We compared the priors with regard to bias, mean squared error (MSE), and coverage of 95% credible intervals. Simulation results show that the prior SD influenced the posterior to a greater degree than the particular distributional form of the prior. For RR, priors with a 95% interval of 0.50-2.0 performed well in terms of bias, MSE, and coverage under most scenarios. For OR, priors with a wider 95% interval of 0.23-4.35 had good performance. We recommend the use of informative priors that exclude implausibly large treatment effects in analyses of clinical trials, particularly for major outcomes such as mortality.
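The stated prior intervals translate directly into normal-prior standard deviations on the log scale. A sketch of that arithmetic, assuming a normal prior centered so the 95% interval is symmetric on the log scale:

```python
import math

def normal_prior_sd(lower, upper):
    """SD of a normal prior on the log scale whose central 95%
    interval maps to (lower, upper) on the RR/OR scale."""
    z = 1.959964  # 97.5% standard-normal quantile
    return (math.log(upper) - math.log(lower)) / (2 * z)

sd_rr = normal_prior_sd(0.50, 2.0)   # ~0.35 on the log-RR scale
sd_or = normal_prior_sd(0.23, 4.35)  # ~0.75 on the log-OR scale
```

So a 95% prior interval of 0.50-2.0 for RR corresponds to a normal prior on log RR with SD of roughly 0.35.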
Pritt, Jeremy J.; DuFour, Mark R.; Mayer, Christine M.; Kocovsky, Patrick M.; Tyson, Jeffrey T.; Weimer, Eric J.; Vandergoot, Christopher S.
2013-01-01
Walleye (Sander vitreus) in Lake Erie is a valuable and migratory species that spawns in tributaries. We used hydroacoustic sampling, gill net sampling, and Bayesian state-space modeling to estimate the spawning stock abundance, characterize size and sex structure, and explore environmental factors cuing migration of walleye in the Maumee River for 2011 and 2012. We estimated the spawning stock abundance to be between 431,000 and 1,446,000 individuals in 2011 and between 386,400 and 857,200 individuals in 2012 (95% Bayesian credible intervals). A back-calculation from a concurrent larval fish study produced an estimate of 78,000 to 237,000 spawners for 2011. The sex ratio was skewed towards males early in the spawning season but approached 1:1 later, and larger individuals entered the river earlier in the season than smaller individuals. Walleye migration was greater during low river discharge and intermediate temperatures. Our approach to estimating absolute abundance and uncertainty as well as characterization of the spawning stock could improve assessment and management of this species, and our methodology is applicable to other diadromous populations.
Stringer, Lesley A; Jones, Geoff; Jewell, Chris P; Noble, Alasdair D; Heuer, Cord; Wilson, Peter R; Johnson, Wesley O
2013-11-01
A Bayesian latent class model was used to estimate the sensitivity and specificity of an immunoglobulin G1 serum enzyme-linked immunosorbent assay (Paralisa) and individual fecal culture to detect young deer infected with Mycobacterium avium subsp. paratuberculosis. Paired fecal and serum samples were collected, between July 2009 and April 2010, from 20 individual yearling (12-24-month-old) deer in each of 20 South Island and 18 North Island herds in New Zealand and subjected to culture and Paralisa, respectively. Two fecal samples and 16 serum samples from 356 North Island deer, and 55 fecal and 37 serum samples from 401 South Island deer, were positive. The estimate of individual fecal culture sensitivity was 77% (95% credible interval [CI] = 61-92%) with specificity of 99% (95% CI = 98-99.7%). The Paralisa sensitivity estimate was 19% (95% CI = 10-30%), with specificity of 94% (95% CI = 93-96%). All estimates were robust to variation of priors and assumptions tested in a sensitivity analysis. These data inform the use of the tests in determining infection status at the individual and herd levels.
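The latent class model links unobserved infection status to observed test results through sensitivity and specificity. The basic identity can be illustrated as follows (the 10% true prevalence is a hypothetical value, not an estimate from the study):

```python
def apparent_prevalence(p, se, sp):
    """Probability that a sampled animal tests positive, given true
    prevalence p, test sensitivity se and specificity sp: the basic
    identity underlying the latent class likelihood."""
    return p * se + (1.0 - p) * (1.0 - sp)

# With the reported fecal-culture estimates (Se = 0.77, Sp = 0.99)
# and a hypothetical true prevalence of 10%:
obs = apparent_prevalence(0.10, 0.77, 0.99)   # ~0.086
```

With two conditionally independent tests, the analogous cross-classified probabilities give enough equations to estimate sensitivity, specificity, and prevalence jointly, which is what the Bayesian latent class model does.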
Strong Bayesian evidence for the normal neutrino hierarchy
NASA Astrophysics Data System (ADS)
Simpson, Fergus; Jimenez, Raul; Pena-Garay, Carlos; Verde, Licia
2017-06-01
The configuration of the three neutrino masses can take two forms, known as the normal and inverted hierarchies. We compute the Bayesian evidence associated with these two hierarchies. Previous studies found a mild preference for the normal hierarchy, and this was driven by the asymmetric manner in which cosmological data have confined the available parameter space. Here we identify the presence of a second asymmetry, which is imposed by data from neutrino oscillations. By combining constraints on the squared-mass splittings [1] with the limit on the sum of neutrino masses of Σmν < 0.13 eV [2], and using a minimally informative prior on the masses, we infer odds of 42:1 in favour of the normal hierarchy, which is classified as "strong" in the Jeffreys' scale. We explore how these odds may evolve in light of higher precision cosmological data, and discuss the implications of this finding with regard to the nature of neutrinos. Finally, the individual masses are inferred to be m1 = 3.80 (+26.2, -3.73) meV, m2 = 8.8 (+18, -1.2) meV and m3 = 50.4 (+5.8, -1.2) meV (95% credible intervals).
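With equal prior odds for the two hierarchies, Bayes-factor odds convert directly to a posterior probability. A quick sketch of the conversion for the 42:1 odds quoted above:

```python
# Posterior probability implied by 42:1 odds in favour of the normal
# hierarchy, assuming equal prior odds for the two hierarchies.
odds = 42.0
p_normal = odds / (1.0 + odds)   # ~0.977
p_inverted = 1.0 - p_normal      # ~0.023
```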
A Bayesian Inferential Approach to Quantify the Transmission Intensity of Disease Outbreak
Kadi, Adiveppa S.; Avaradi, Shivakumari R.
2015-01-01
Background. The emergence of infectious diseases like the 2009 influenza pandemic (H1N1) has become a great concern and has posed new challenges to health authorities worldwide. To control these diseases, various studies have been developed in the field of mathematical modelling, a useful tool for understanding epidemiological dynamics and their dependence on social mixing patterns. Method. We used a Bayesian approach to quantify the disease outbreak through the key epidemiological parameter, the basic reproduction number (R0), using effective contacts, defined as the sum of the products of incidence cases and the probability of the generation time distribution. We estimated R0 from daily case incidence data for the initial phase of pandemic influenza A/H1N1 2009 in India. Result. The estimated R0 with 95% credible interval is consistent with several other studies on the same strain. Through sensitivity analysis, our study indicates that infectiousness affects the estimate of R0. Conclusion. The basic reproduction number R0 provides useful information to the public health system for controlling the disease through mitigation strategies such as vaccination and quarantine. PMID:25784956
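The "effective contacts" construction can be sketched as a pointwise ratio of incidence to generation-time-weighted past incidence (a simplification of the paper's Bayesian treatment; the toy inputs below are assumptions):

```python
import numpy as np

def crude_r0(incidence, gen_time_pmf):
    """Crude R estimate: cases on day t divided by 'effective contacts',
    i.e. earlier incidence weighted by the generation-time distribution.
    A sketch of the idea; the study embeds it in a full Bayesian model
    rather than averaging pointwise ratios."""
    inc = np.asarray(incidence, dtype=float)
    w = np.asarray(gen_time_pmf, dtype=float)
    w = w / w.sum()
    ratios = []
    for t in range(1, len(inc)):
        # Effective contacts on day t.
        lam = sum(w[s - 1] * inc[t - s] for s in range(1, min(t, len(w)) + 1))
        if lam > 0:
            ratios.append(inc[t] / lam)
    return float(np.mean(ratios))

# Toy series doubling daily with a 1-day generation time gives R = 2.
crude_r0([1, 2, 4, 8, 16], [1.0])   # -> 2.0
```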
Bayesian analyses of time-interval data for environmental radiation monitoring.
Luo, Peng; Sharp, Julia L; DeVol, Timothy A
2013-01-01
Time-interval (time difference between two consecutive pulses) analysis based on the principles of Bayesian inference was investigated for online radiation monitoring. Using experimental and simulated data, Bayesian analysis of time-interval data [Bayesian (ti)] was compared with Bayesian and a conventional frequentist analysis of counts in a fixed count time [Bayesian (cnt) and single interval test (SIT), respectively]. The performances of the three methods were compared in terms of average run length (ARL) and detection probability for several simulated detection scenarios. Experimental data were acquired with a DGF-4C system in list mode. Simulated data were obtained using Monte Carlo techniques to obtain a random sampling of the Poisson distribution. All statistical algorithms were developed using the R Project for statistical computing. Bayesian analysis of time-interval information provided a similar detection probability as Bayesian analysis of count information, but the authors were able to make a decision with fewer pulses at relatively higher radiation levels. In addition, for the cases with very short presence of the source (< count time), time-interval information is more sensitive to detect a change than count information since the source data is averaged by the background data over the entire count time. The relationships of the source time, change points, and modifications to the Bayesian approach for increasing detection probability are presented.
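Because interarrival times of a Poisson process are exponentially distributed, a Gamma prior on the count rate updates in closed form as intervals accumulate. A minimal sketch (not the authors' R implementation; the prior values are assumed):

```python
import numpy as np

rng = np.random.default_rng(42)

# Pulse interarrival times of a Poisson process are exponential, so a
# Gamma(a, b) prior on the count rate is conjugate to interval data.
true_rate = 5.0                                    # counts per second
intervals = rng.exponential(1.0 / true_rate, size=200)

a0, b0 = 1.0, 0.1             # weakly informative prior (values assumed)
a_post = a0 + len(intervals)              # shape gains one per pulse
b_post = b0 + intervals.sum()             # rate gains the elapsed time
posterior_mean = a_post / b_post          # should sit near true_rate
```

Because the posterior can be updated pulse by pulse, a decision can be reached as soon as the accumulated intervals are informative enough, which is the advantage over waiting out a fixed count time.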
Meyers, S.R.; Siewert, S.E.; Singer, B.S.; Sageman, B.B.; Condon, D.J.; Obradovich, J.D.; Jicha, B.R.; Sawyer, D.A.
2012-01-01
We develop an intercalibrated astrochronologic and radioisotopic time scale for the Cenomanian-Turonian boundary (CTB) interval near the Global Stratotype Section and Point in Colorado, USA, where orbitally influenced rhythmic strata host bentonites that contain sanidine and zircon suitable for 40Ar/39Ar and U-Pb dating. Paired 40Ar/39Ar and U-Pb ages are determined from four bentonites that span the Vascoceras diartianum to Pseudaspidoceras flexuosum ammonite biozones, utilizing both newly collected material and legacy sanidine samples of J. Obradovich. Comparison of the 40Ar/39Ar and U-Pb results underscores the strengths and limitations of each system, and supports an astronomically calibrated Fish Canyon sanidine standard age of 28.201 Ma. The radioisotopic data and published astrochronology are employed to develop a new CTB time scale, using two statistical approaches: (1) a simple integration that yields a CTB age of 93.89 ± 0.14 Ma (2σ; total radioisotopic uncertainty), and (2) a Bayesian intercalibration that explicitly accounts for orbital time scale uncertainty, and yields a CTB age of 93.90 ± 0.15 Ma (95% credible interval; total radioisotopic and orbital time scale uncertainty). Both approaches firmly anchor the floating orbital time scale, and the Bayesian technique yields astronomically recalibrated radioisotopic ages for individual bentonites, with analytical uncertainties at the permil level of resolution, and total uncertainties below 2‰. Using our new results, the duration between the Cenomanian-Turonian and the Cretaceous-Paleogene boundaries is 27.94 ± 0.16 Ma, with an uncertainty of less than one-half of a long eccentricity cycle. © 2012 Geological Society of America.
Teaching Note--Was the Champions League Draw Rigged?
ERIC Educational Resources Information Center
Tijms, Henk
2015-01-01
This teaching note gives a real-life example of Bayesian thinking. It discusses how credible accusations are that the outcome of the draw for the quarter-finals in the 2013 European Champions League Football was manipulated.
Fang, Vicky J.; Feng, Luzhao; Tsang, Tim K.; Jiang, Hui; Wu, Peng; Zheng, Jiandong; Lau, Eric H. Y.; Qin, Ying; Peng, Zhibin; Peiris, J. S. Malik; Yu, Hongjie; Cowling, Benjamin J.
2016-01-01
Background In early 2013, a novel avian-origin influenza A(H7N9) virus emerged in China, and has caused sporadic human infections. The incubation period is the delay from infection until onset of symptoms, and varies from person to person. Few previous studies have examined whether the duration of the incubation period correlates with subsequent disease severity. Methods and Findings We analyzed data of period of exposure on 395 human cases of laboratory-confirmed influenza A(H7N9) virus infection in China in a Bayesian framework using a Weibull distribution. We found a longer incubation period for the 173 fatal cases with a mean of 3.7 days (95% credibility interval, CrI: 3.4–4.1), compared to a mean of 3.3 days (95% CrI: 2.9–3.6) for the 222 non-fatal cases, and the difference in means was marginally significant at 0.47 days (95% CrI: -0.04, 0.99). There was a statistically significant correlation between a longer incubation period and an increased risk of death after adjustment for age, sex, geographical location and underlying medical conditions (adjusted odds ratio 1.70 per day increase in incubation period; 95% credibility interval 1.47–1.97). Conclusions We found a significant association between a longer incubation period and a greater risk of death among human H7N9 cases. The underlying biological mechanisms leading to this association deserve further exploration. PMID:26885816
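A Weibull incubation distribution has mean scale × Γ(1 + 1/shape), so fitted parameters map directly to the reported mean delays. A sketch with hypothetical parameter values chosen to land near the 3.7-day mean for fatal cases (the study's actual shape and scale are not given in the abstract):

```python
import math

def weibull_mean(shape, scale):
    """Mean of a Weibull(shape, scale) distribution:
    scale * Gamma(1 + 1/shape)."""
    return scale * math.gamma(1.0 + 1.0 / shape)

# Hypothetical shape/scale values; the study estimates its Weibull
# parameters within a Bayesian framework.
mean_days = weibull_mean(1.5, 4.1)   # ~3.7 days
```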
Coker, Eric; Gunier, Robert; Bradman, Asa; Harley, Kim; Kogut, Katherine; Molitor, John; Eskenazi, Brenda
2017-01-01
We previously showed that potential prenatal exposure to agricultural pesticides was associated with adverse neurodevelopmental outcomes in children, yet the effects of joint exposure to multiple pesticides is poorly understood. In this paper, we investigate associations between the joint distribution of agricultural use patterns of multiple pesticides (denoted as “pesticide profiles”) applied near maternal residences during pregnancy and Full-Scale Intelligence Quotient (FSIQ) at 7 years of age. Among a cohort of children residing in California’s Salinas Valley, we used Pesticide Use Report (PUR) data to characterize potential exposure from use within 1 km of maternal residences during pregnancy for 15 potentially neurotoxic pesticides from five different chemical classes. We used Bayesian profile regression (BPR) to examine associations between clustered pesticide profiles and deficits in childhood FSIQ. BPR identified eight distinct clusters of prenatal pesticide profiles. Two of the pesticide profile clusters exhibited some of the highest cumulative pesticide use levels and were associated with deficits in adjusted FSIQ of −6.9 (95% credible interval: −11.3, −2.2) and −6.4 (95% credible interval: −13.1, 0.49), respectively, when compared with the pesticide profile cluster that showed the lowest level of pesticides use. Although maternal residence during pregnancy near high agricultural use of multiple neurotoxic pesticides was associated with FSIQ deficit, the magnitude of the associations showed potential for sub-additive effects. Epidemiologic analysis of pesticides and their potential health effects can benefit from a multi-pollutant approach to analysis. PMID:28486423
Wolfe, Lisa L; Conner, Mary M; Bedwell, Cathy L; Lukacs, Paul M; Miller, Michael W
2010-07-01
Trace mineral imbalances have been suggested as having a causative or contributory role in chronic wasting disease (CWD), a prion disease of several North American cervid species. To begin exploring relationships between tissue mineral concentrations and CWD in natural systems, we measured liver tissue concentrations of copper, manganese, and molybdenum in samples from 447 apparently healthy, adult (> or = 2 yr old) mule deer (Odocoileus hemionus) culled or vehicle killed from free-ranging populations in north-central Colorado, United States, where CWD occurs naturally; we also measured copper concentrations in brain-stem (medulla oblongata at the obex) tissue from 181 of these deer. Analyses revealed a wide range of concentrations of all three minerals among sampled deer (copper: 5.6-331 ppm in liver, 1.5-31.9 ppm in obex; manganese: 0.1-21.4 ppm in liver; molybdenum: 0.5-4.0 ppm in liver). Bayesian multiple regression analysis revealed a negative association between obex copper (-0.097; 95% credible interval -0.192 to -0.006) and the probability of sampled deer also being infected with CWD, as well as a positive association between liver manganese (0.158; 95% credible interval 0.066 to 0.253) and probability of infection. We could not discern whether the tendencies toward lower brain-stem copper concentrations or higher systemic manganese concentrations in infected deer preceded prion infection or rather were the result of infection and its subsequent effects, although the distribution of trace mineral concentrations in infected deer seemed more suggestive of the latter.
Integration of individual and social information for decision-making in groups of different sizes.
Park, Seongmin A; Goïame, Sidney; O'Connor, David A; Dreher, Jean-Claude
2017-06-01
When making judgments in a group, individuals often revise their initial beliefs about the best judgment to make given what others believe. Despite the ubiquity of this phenomenon, we know little about how the brain updates beliefs when integrating personal judgments (individual information) with those of others (social information). Here, we investigated the neurocomputational mechanisms of how we adapt our judgments to those made by groups of different sizes, in the context of jury decisions for a criminal. By testing different theoretical models, we showed that a social Bayesian inference model captured changes in judgments better than 2 other models. Our results showed that participants updated their beliefs by appropriately weighting individual and social sources of information according to their respective credibility. When investigating 2 fundamental computations of Bayesian inference, belief updates and credibility estimates of social information, we found that the dorsal anterior cingulate cortex (dACC) computed the level of belief updates, while the bilateral frontopolar cortex (FPC) was more engaged in individuals who assigned a greater credibility to the judgments of a larger group. Moreover, increased functional connectivity between these 2 brain regions reflected a greater influence of group size on the relative credibility of social information. These results provide a mechanistic understanding of the computational roles of the FPC-dACC network in steering judgment adaptation to a group's opinion. Taken together, these findings provide a computational account of how the human brain integrates individual and social information for decision-making in groups.
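The weighting of individual and social information by credibility can be sketched as a precision-weighted average, the standard Bayesian combination of two normal sources (a generic illustration, not the paper's fitted model):

```python
def update_judgment(mu_own, prec_own, mu_group, prec_group):
    """Precision-weighted combination of one's own judgment and a
    group's judgment; each source is weighted by its credibility
    (precision)."""
    w = prec_group / (prec_own + prec_group)
    return (1.0 - w) * mu_own + w * mu_group

# A larger group (higher credibility/precision) pulls the revised
# judgment further toward the group's opinion.
update_judgment(10.0, 1.0, 20.0, 1.0)   # -> 15.0 (equal weight)
update_judgment(10.0, 1.0, 20.0, 3.0)   # -> 17.5 (group dominates)
```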
Budke, Christine M.; Carabin, Hélène; Ndimubanzi, Patrick C.; Nguyen, Hai; Rainwater, Elizabeth; Dickey, Mary; Bhattarai, Rachana; Zeziulin, Oleksandr; Qian, Men-Bao
2013-01-01
A systematic literature review of cystic echinoccocosis (CE) frequency and symptoms was conducted. Studies without denominators, original data, or using one serological test were excluded. Random-effect log-binomial models were run for CE frequency and proportion of reported symptoms where appropriate. A total of 45 and 25 articles on CE frequency and symptoms met all inclusion criteria. Prevalence of CE ranged from 1% to 7% in community-based studies and incidence rates ranged from 0 to 32 cases per 100,000 in hospital-based studies. The CE prevalence was higher in females (Prevalence Proportion Ratio: 1.35 [95% Bayesian Credible Interval: 1.16–1.53]) and increased with age. The most common manifestations of hepatic and pulmonary CE were abdominal pain (57.3% [95% confidence interval [CI]: 37.3–76.1%]) and cough (51.3% [95% CI: 35.7–66.7%]), respectively. The results are limited by the small number of unbiased studies. Nonetheless, the age/gender prevalence differences could be used to inform future models of CE burden. PMID:23546806
Bayman, Emine O; Chaloner, Kathryn M; Hindman, Bradley J; Todd, Michael M
2013-01-16
The objectives were to quantify the variability among centers, to identify centers whose performance is potentially outside of normal variability in the primary outcome, and to propose a guideline for declaring them outliers. Novel statistical methodology using a Bayesian hierarchical model is used. Bayesian methods for estimation and outlier detection are applied assuming an additive random center effect on the log odds of response: centers are similar but different (exchangeable). The Intraoperative Hypothermia for Aneurysm Surgery Trial (IHAST) is used as an example. Analyses were adjusted for treatment, age, gender, aneurysm location, World Federation of Neurological Surgeons scale, Fisher score and baseline NIH stroke scale scores. Adjustments for differences in center characteristics were also examined. Graphical and numerical summaries of the between-center standard deviation (sd) and variability, as well as the identification of potential outliers, are implemented. In the IHAST, the center-to-center variation in the log odds of favorable outcome at each center is consistent with a normal distribution with posterior sd of 0.538 (95% credible interval: 0.397 to 0.726) after adjusting for the effects of important covariates. Outcome differences among centers show no outlying centers. Four potential outlying centers were identified but did not meet the proposed guideline for declaring them outliers. Center characteristics (number of subjects enrolled from the center, geographical location, learning over time, nitrous oxide, and temporary clipping use) did not predict outcome, but subject and disease characteristics did. Bayesian hierarchical methods allow for determination of whether outcomes from a specific center differ from others and whether specific clinical practices predict outcome, even when some centers/subgroups have relatively small sample sizes. In the IHAST no outlying centers were found. The estimated variability between centers was moderately large.
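The reported between-center sd of 0.538 on the log-odds scale has a direct interpretation: under the model's normality assumption, roughly 95% of centers fall within a factor of exp(±1.96 × 0.538) of the typical center's odds. A quick back-of-the-envelope sketch:

```python
import math

# Between-center sd of the random center effect (log-odds scale),
# as reported in the IHAST analysis.
sd = 0.538

# 95% of centers' odds of favorable outcome, relative to the typical
# center, span roughly this range of odds ratios:
lo_or = math.exp(-1.96 * sd)   # ~0.35
hi_or = math.exp(1.96 * sd)    # ~2.87
```

This is what "moderately large" between-center variability means in practice: some centers' odds of a favorable outcome are nearly three times those of the typical center.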
Nicoulaud-Gouin, V; Garcia-Sanchez, L; Giacalone, M; Attard, J C; Martin-Garin, A; Bois, F Y
2016-10-01
This paper addresses the methodological conditions (particularly experimental design and statistical inference) ensuring the identifiability of sorption parameters from breakthrough curves measured during stirred flow-through reactor experiments, also known as continuous flow stirred-tank reactor (CSTR) experiments. The equilibrium-kinetic (EK) sorption model was selected as a nonequilibrium parameterization embedding the Kd approach. Parameter identifiability was studied formally on the equations governing outlet concentrations. It was also studied numerically on 6 simulated CSTR experiments on a soil with known equilibrium-kinetic sorption parameters. EK sorption parameters cannot be identified from a single breakthrough curve of a CSTR experiment, because Kd,1 and k- were diagnosed as collinear. For pairs of CSTR experiments, Bayesian inference allowed selection of the correct models of sorption and error among sorption alternatives. Bayesian inference was conducted with SAMCAT software (Sensitivity Analysis and Markov Chain simulations Applied to Transfer models), which launched the simulations through the embedded simulation engine GNU-MCSim and automated their configuration and post-processing. Experimental designs consisting of varying flow rates between experiments reaching equilibrium at the contamination stage were found optimal, because they simultaneously gave accurate sorption parameters and predictions. Bayesian results were comparable to the maximum likelihood method, but they avoided convergence problems, the marginal likelihood allowed comparison of all models, and credible intervals directly gave the uncertainty of the sorption parameters θ. Although these findings are limited to the specific conditions studied here, in particular the considered sorption model, the chosen parameter values and error structure, they help in the conception and analysis of future CSTR experiments with radionuclides whose kinetic behaviour is suspected. Copyright © 2016 Elsevier Ltd. All rights reserved.
Badve, Sunil V; Palmer, Suetonia C; Strippoli, Giovanni F M; Roberts, Matthew A; Teixeira-Pinto, Armando; Boudville, Neil; Cass, Alan; Hawley, Carmel M; Hiremath, Swapnil S; Pascoe, Elaine M; Perkovic, Vlado; Whalley, Gillian A; Craig, Jonathan C; Johnson, David W
2016-10-01
Left ventricular mass (LVM) is a widely used surrogate end point in randomized trials involving people with chronic kidney disease (CKD) because treatment-induced LVM reductions are assumed to lower cardiovascular risk. The aim of this study was to assess the validity of LVM as a surrogate end point for all-cause and cardiovascular mortality in CKD. Systematic review and meta-analysis. Participants with any stages of CKD. Randomized controlled trials with 3 or more months' follow-up that reported LVM data. Any pharmacologic or nonpharmacologic intervention. The surrogate outcome of interest was LVM change from baseline to last measurement, and clinical outcomes of interest were all-cause and cardiovascular mortality. Standardized mean differences (SMDs) of LVM change and relative risk for mortality were estimated using pairwise random-effects meta-analysis. Correlations between surrogate and clinical outcomes were summarized across all interventions combined using bivariate random-effects Bayesian models, and 95% credible intervals were computed. 73 trials (6,732 participants) covering 25 intervention classes were included in the meta-analysis. Overall, risk of bias was uncertain or high. Only 3 interventions reduced LVM: erythropoiesis-stimulating agents (9 trials; SMD, -0.13; 95% CI, -0.23 to -0.03), renin-angiotensin-aldosterone system inhibitors (13 trials; SMD, -0.28; 95% CI, -0.45 to -0.12), and isosorbide mononitrate (2 trials; SMD, -0.43; 95% CI, -0.72 to -0.14). All interventions had uncertain effects on all-cause and cardiovascular mortality. There were weak and imprecise associations between the effects of interventions on LVM change and all-cause (32 trials; 5,044 participants; correlation coefficient, 0.28; 95% credible interval, -0.13 to 0.59) and cardiovascular mortality (13 trials; 2,327 participants; correlation coefficient, 0.30; 95% credible interval, -0.54 to 0.76). Limited long-term data, suboptimal quality of included studies. 
There was no clear and consistent association between intervention-induced LVM change and mortality. Evidence for LVM as a valid surrogate end point in CKD is currently lacking. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
Koch, Tobias; Schultze, Martin; Jeon, Minjeong; Nussbeck, Fridtjof W; Praetorius, Anna-Katharina; Eid, Michael
2016-01-01
Multirater (multimethod, multisource) studies are increasingly applied in psychology. Eid and colleagues (2008) proposed a multilevel confirmatory factor model for multitrait-multimethod (MTMM) data combining structurally different and multiple independent interchangeable methods (raters). In many studies, however, different interchangeable raters (e.g., peers, subordinates) are asked to rate different targets (students, supervisors), leading to violations of the independence assumption and to cross-classified data structures. In the present work, we extend the ML-CFA-MTMM model by Eid and colleagues (2008) to cross-classified multirater designs. The new C4 model (Cross-Classified CTC[M-1] Combination of Methods) accounts for nonindependent interchangeable raters and enables researchers to explicitly model the interaction between targets and raters as a latent variable. Using a real data application, it is shown how credibility intervals of model parameters and different variance components can be obtained using Bayesian estimation techniques.
NASA Technical Reports Server (NTRS)
Veitch, J.; Raymond, V.; Farr, B.; Farr, W.; Graff, P.; Vitale, S.; Aylott, B.; Blackburn, K.; Christensen, N.; Coughlin, M.
2015-01-01
The Advanced LIGO and Advanced Virgo gravitational wave (GW) detectors will begin operation in the coming years, with compact binary coalescence events a likely source for the first detections. The gravitational waveforms emitted directly encode information about the sources, including the masses and spins of the compact objects. Recovering the physical parameters of the sources from the GW observations is a key analysis task. This work describes the LALInference software library for Bayesian parameter estimation of compact binary signals, which builds on several previous methods to provide a well-tested toolkit which has already been used for several studies. We show that our implementation is able to correctly recover the parameters of compact binary signals from simulated data from the advanced GW detectors. We demonstrate this with a detailed comparison on three compact binary systems: a binary neutron star (BNS), a neutron star - black hole binary (NSBH) and a binary black hole (BBH), where we show a cross-comparison of results obtained using three independent sampling algorithms. These systems were analysed with non-spinning, aligned spin and generic spin configurations respectively, showing that consistent results can be obtained even with the full 15-dimensional parameter space of the generic spin configurations. We also demonstrate statistically that the Bayesian credible intervals we recover correspond to frequentist confidence intervals under correct prior assumptions by analysing a set of 100 signals drawn from the prior. We discuss the computational cost of these algorithms, and describe the general and problem-specific sampling techniques we have used to improve the efficiency of sampling the compact binary coalescence (CBC) parameter space.
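The credible-interval calibration check described above (coverage over signals drawn from the prior) can be illustrated on a toy conjugate model; this sketch is not the LALInference implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Calibration check: draw the "true" parameter from the prior, simulate
# data, and count how often the 95% credible interval covers the truth.
n_signals, n_obs, sigma = 100, 20, 1.0
covered = 0
for _ in range(n_signals):
    theta = rng.normal(0.0, 1.0)                  # prior: N(0, 1)
    data = rng.normal(theta, sigma, size=n_obs)
    post_var = 1.0 / (1.0 + n_obs / sigma**2)     # conjugate normal posterior
    post_mean = post_var * data.sum() / sigma**2
    post_sd = post_var**0.5
    lo, hi = post_mean - 1.96 * post_sd, post_mean + 1.96 * post_sd
    covered += lo <= theta <= hi
# Under a correct prior, coverage should be close to 95 out of 100.
```

The same logic applies to the 15-dimensional compact-binary parameter space: if the priors are correct, an x% credible interval should cover the injected value x% of the time.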
Bayesian Statistical Analysis of Historical and Late Holocene Rates of Sea-Level Change
NASA Astrophysics Data System (ADS)
Cahill, Niamh; Parnell, Andrew; Kemp, Andrew; Horton, Benjamin
2014-05-01
A fundamental concern associated with climate change is the rate at which sea levels are rising. Studies of past sea level (particularly beyond the instrumental data range) allow modern sea-level rise to be placed in a more complete context. Considering this, we perform a Bayesian statistical analysis on historical and late Holocene rates of sea-level change. The data that form the input to the statistical model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. The aims are to estimate rates of sea-level rise, to determine when modern rates of sea-level rise began and to observe how these rates have been changing over time. Many of the current methods for doing this use simple linear regression to estimate rates. This is often inappropriate as it is too rigid and it can ignore uncertainties that arise as part of the data collection exercise. This can lead to overconfidence in the sea-level trends being characterized. The proposed Bayesian model places a Gaussian process prior on the rate process (i.e. the process that determines how rates of sea level are changing over time). The likelihood of the observed data is the integral of this process. When dealing with proxy reconstructions, this is set in an errors-in-variables framework so as to take account of age uncertainty. It is also necessary, in this case, for the model to account for glacio-isostatic adjustment, which introduces a covariance between individual age and sea-level observations. This method provides a flexible fit and it allows for the direct estimation of the rate process with full consideration of all sources of uncertainty. Analysis of tide-gauge datasets and proxy reconstructions in this way means that changing rates of sea level can be estimated more comprehensively and accurately than previously possible.
The model captures the continuous and dynamic evolution of sea-level change and results show that not only are modern sea levels rising but that the rates of rise are continuously increasing. Analysis of the a global tide-gauge record (Church and White, 2011) indicated that the rate of sea-level rise increased continuously since 1880AD and is currently 2.57mm/yr (95% credible interval of 1.71 to 4.35mm/yr). Application of the model a proxy reconstruction from North Carolina (Kemp et al., 2011) indicated that the mean rate of rise in this locality since the middle of the 19th century (current rate of 2.66 mm/yr with a 95% credible interval of 1.29 to 4.59mm/yr) is in agreement with results from the tide gauge analysis and is unprecedented in at least the last 2000 years.
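The core linear-Gaussian idea in this abstract, a Gaussian process prior on the rate with the observed level as its integral, can be sketched on synthetic data. Everything below (kernel choice, length-scale, noise level, the accelerating true rate) is assumed for illustration; the paper's errors-in-variables and glacio-isostatic-adjustment terms are omitted.

```python
import numpy as np

# Synthetic illustration: rate r(t) has a GP prior; sea level s(t) is its
# integral, so s = I @ r for a discrete integration operator I, and the
# posterior rate is available in closed form under Gaussian noise.
rng = np.random.default_rng(0)

t = np.linspace(0.0, 100.0, 51)                  # years
dt = t[1] - t[0]

def sq_exp(a, b, ell=30.0, var=4.0):
    """Squared-exponential covariance (assumed kernel)."""
    return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

I = np.tril(np.ones((t.size, t.size)), -1) * dt  # left Riemann-sum integral
K_r = sq_exp(t, t)                               # GP prior on the rate
true_rate = 0.02 * t                             # rate accelerating over time
y = I @ true_rate + rng.normal(0.0, 2.0, t.size) # noisy "sea-level" data

# Posterior mean of the rate given the observed (integrated) levels
A = I @ K_r @ I.T + 2.0**2 * np.eye(t.size)
r_hat = K_r @ I.T @ np.linalg.solve(A, y)
```

On this toy example the recovered rate increases over time, mirroring the paper's finding of accelerating rates, without ever regressing the level data directly.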
NASA Astrophysics Data System (ADS)
Salonen, Heidi; Duchaine, Caroline; Mazaheri, Mandana; Clifford, Sam; Morawska, Lidia
2015-04-01
There is currently a lack of reference values for indoor air fungal concentrations to allow for the interpretation of measurement results in subtropical school settings. Analysis of the results of this work established that, in the majority of properly maintained subtropical school buildings, without any major affecting events such as floods or visible mould or moisture contamination, indoor culturable fungi levels were driven by outdoor concentrations. The results also allowed us to benchmark the "baseline range" concentrations for total culturable fungi, Penicillium spp., Cladosporium spp. and Aspergillus spp. in such school settings. The measured concentrations of total culturable fungi and three individual fungal genera were estimated using Bayesian hierarchical modelling. Pooling of these estimates provided a predictive distribution for concentrations at an unobserved school. The results indicated that "baseline" indoor concentration levels for indoor total fungi, Penicillium spp., Cladosporium spp. and Aspergillus spp. in such school settings were generally ≤1450, ≤680, ≤480 and ≤90 cfu/m3, respectively, and elevated levels would indicate mould damage in building structures. The indoor/outdoor ratio for most classrooms had 95% credible intervals containing 1, indicating that fungi concentrations are generally the same indoors and outdoors at each school. Bayesian fixed effects regression modelling showed that increasing both temperature and humidity resulted in higher levels of fungi concentration.
Bayesian inference for OPC modeling
NASA Astrophysics Data System (ADS)
Burbine, Andrew; Sturtevant, John; Fryer, David; Smith, Bruce W.
2016-03-01
The use of optical proximity correction (OPC) demands increasingly accurate models of the photolithographic process. Model building and inference techniques in the data science community have seen great strides in the past two decades which make better use of available information. This paper aims to demonstrate the predictive power of Bayesian inference as a method for parameter selection in lithographic models by quantifying the uncertainty associated with model inputs and wafer data. Specifically, the method combines the model builder's prior information about each modelling assumption with the maximization of each observation's likelihood as a Student's t-distributed random variable. Through the use of a Markov chain Monte Carlo (MCMC) algorithm, a model's parameter space is explored to find the most credible parameter values. During parameter exploration, the parameters' posterior distributions are generated by applying Bayes' rule, using a likelihood function and the a priori knowledge supplied. The MCMC algorithm used, an affine invariant ensemble sampler (AIES), is implemented by initializing many walkers which semi-independently explore the space. The convergence of these walkers to global maxima of the likelihood volume determines the parameter values' highest density intervals (HDI) to reveal champion models. We show that this method of parameter selection provides insights into the data that traditional methods do not and outline continued experiments to vet the method.
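The AIES "stretch move" at the heart of this approach is compact enough to sketch directly. The toy model below (a line fit with Student's t noise; all settings, walker counts and data are our assumptions, not the paper's lithographic model) shows the mechanics that emcee-style samplers implement:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: straight line with heavy-tailed (Student's t) noise
x = np.linspace(0.0, 1.0, 40)
y = 2.0 * x + 0.5 + 0.1 * rng.standard_t(df=4, size=x.size)

def log_post(theta):
    """Student-t log-likelihood with a flat prior (constants dropped)."""
    slope, intercept = theta
    resid = y - (slope * x + intercept)
    nu, scale = 4.0, 0.1
    return np.sum(-0.5 * (nu + 1) * np.log1p((resid / scale) ** 2 / nu))

def stretch_step(walkers, lp, a=2.0):
    """One sweep of the affine-invariant stretch move over all walkers."""
    n, d = walkers.shape
    walkers, lp = walkers.copy(), lp.copy()
    for k in range(n):
        j = rng.integers(n - 1)
        j = j if j < k else j + 1            # pick a different walker
        z = ((a - 1) * rng.random() + 1) ** 2 / a   # g(z) ~ 1/sqrt(z)
        prop = walkers[j] + z * (walkers[k] - walkers[j])
        lp_prop = log_post(prop)
        if np.log(rng.random()) < (d - 1) * np.log(z) + lp_prop - lp[k]:
            walkers[k], lp[k] = prop, lp_prop
    return walkers, lp

walkers = rng.normal([2.0, 0.5], 0.5, size=(20, 2))
lp = np.array([log_post(w) for w in walkers])
chain = []
for step in range(600):
    walkers, lp = stretch_step(walkers, lp)
    if step >= 200:                          # discard burn-in
        chain.append(walkers.copy())
chain = np.concatenate(chain)
slope_mean, intercept_mean = chain.mean(axis=0)
```

Because proposals are built from differences between walkers, the move is invariant under affine transformations of the parameter space, which is why such samplers need little tuning.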
The improved business valuation model for RFID company based on the community mining method.
Li, Shugang; Yu, Zhaoxu
2017-01-01
Nowadays, the appetite for investment and mergers and acquisitions (M&A) activity in RFID companies is growing rapidly. Although a large number of papers have addressed the topic of business valuation models based on statistical methods or neural network methods, only a few are dedicated to constructing a general framework for business valuation that improves performance with a network graph (NG) and the corresponding community mining (CM) method. In this study, an NG-based business valuation model is proposed, in which a real options approach (ROA) integrating the CM method is designed to predict the company's net profit as well as estimate the company value. Three improvements are made in the proposed valuation model. Firstly, our model determines the credibility of each node belonging to each community and clusters the network according to the evolutionary Bayesian method. Secondly, the improved bacterial foraging optimization algorithm (IBFOA) is adopted to calculate the optimized Bayesian posterior probability function. Finally, in IBFOA, a bi-objective method is used to assess the accuracy of prediction, and these two objectives are combined into one objective function using a new Pareto boundary method. The proposed method returns lower forecasting error than 10 well-known forecasting models on 3 different time-interval valuing tasks for a real-life simulation of RFID companies.
NASA Astrophysics Data System (ADS)
Wübbeler, Gerd; Bodnar, Olha; Elster, Clemens
2018-02-01
Weighted least-squares estimation is commonly applied in metrology to fit models to measurements that are accompanied by quoted uncertainties. The weights are chosen according to the quoted uncertainties. However, when the data and model are inconsistent in view of the quoted uncertainties, this procedure does not yield adequate results. When it can be assumed that all uncertainties ought to be rescaled by a common factor, weighted least-squares estimation may still be used, provided that a simple correction of the uncertainty obtained for the estimated model is applied. We show that these uncertainties and credible intervals are robust, as they do not rely on the assumption of a Gaussian distribution of the data. Hence, common software for weighted least-squares estimation may still safely be employed in such a case, followed by a simple modification of the uncertainties obtained by that software. We also provide means of checking the assumptions of such an approach. The Bayesian regression procedure is applied to analyze the CODATA values for the Planck constant published over the past decades in terms of three different models: a constant model, a straight-line model and a spline model. Our results indicate that the CODATA values may not have yet stabilized.
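The "simple correction" pattern can be sketched for the easiest case, fitting a constant by weighted least squares and rescaling its uncertainty by a common factor. One standard choice for that factor is the Birge ratio sqrt(chi2/dof); the numbers below are invented for illustration (loosely patterned on Planck-constant-style values, but not actual CODATA entries):

```python
import numpy as np

# Illustrative values and quoted uncertainties (made up, not real CODATA data)
y = np.array([6.62606957, 6.62607004, 6.62607015, 6.62606896])
u = np.array([0.00000029, 0.00000091, 0.00000081, 0.00000033])

w = 1.0 / u**2
est = np.sum(w * y) / np.sum(w)          # WLS estimate of the constant
u_est = np.sqrt(1.0 / np.sum(w))         # naive WLS uncertainty

chi2 = np.sum(w * (y - est) ** 2)        # consistency check of data vs model
dof = y.size - 1
birge = np.sqrt(chi2 / dof)

# Rescale the reported uncertainty when the data look inconsistent
u_corrected = u_est * max(birge, 1.0)
```

The point of the abstract is that this post hoc rescaling of the WLS output can be justified within a Bayesian model, and that the resulting intervals do not depend on a Gaussian data assumption.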
A Bayesian Approach for Summarizing and Modeling Time-Series Exposure Data with Left Censoring.
Houseman, E Andres; Virji, M Abbas
2017-08-01
Direct reading instruments are valuable tools for measuring exposure as they provide real-time measurements for rapid decision making. However, their use is limited to general survey applications in part due to issues related to their performance. Moreover, statistical analysis of real-time data is complicated by autocorrelation among successive measurements, non-stationary time series, and the presence of left-censoring due to the limit of detection (LOD). A Bayesian framework is proposed that accounts for non-stationary autocorrelation and LOD issues in exposure time-series data in order to model workplace factors that affect exposure and estimate summary statistics for tasks or other covariates of interest. A spline-based approach is used to model non-stationary autocorrelation with relatively few assumptions about the autocorrelation structure. Left-censoring is addressed by integrating over the left tail of the distribution. The model is fit using Markov chain Monte Carlo within a Bayesian paradigm. The method can flexibly account for hierarchical relationships, random effects and fixed effects of covariates. The method is implemented using the rjags package in R, and is illustrated by applying it to real-time exposure data. Estimates for task means and covariates from the Bayesian model are compared to those from conventional frequentist models including linear regression, mixed-effects, and time-series models with different autocorrelation structures. Simulation studies are also conducted to evaluate method performance. Simulation studies with the percentage of measurements below the LOD ranging from 0 to 50% showed the lowest root mean squared errors for task means and the least biased standard deviations from the Bayesian model compared to the frequentist models across all levels of LOD. In the application, task means from the Bayesian model were similar to means from the frequentist models, while the standard deviations were different.
Parameter estimates for covariates were significant in some frequentist models, but in the Bayesian model their credible intervals contained zero; such discrepancies were observed in multiple datasets. Variance components from the Bayesian model reflected substantial autocorrelation, consistent with the frequentist models, except for the auto-regressive moving average model. Plots of means from the Bayesian model showed good fit to the observed data. The proposed Bayesian model provides an approach for modeling non-stationary autocorrelation in a hierarchical modeling framework to estimate task means, standard deviations, quantiles, and parameter estimates for covariates that are less biased and have better performance characteristics than some of the contemporary methods. Published by Oxford University Press on behalf of the British Occupational Hygiene Society 2017.
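The key censoring device, replacing each non-detect's density contribution with the integrated left tail below the LOD, can be sketched outside the full Bayesian spline model. Everything below (lognormal data, a crude maximum-likelihood grid search instead of MCMC) is an assumed simplification showing only the censored-likelihood term:

```python
import numpy as np
from math import erf, log, sqrt, pi

rng = np.random.default_rng(2)

# Simulated exposure data (lognormal), with ~30% of values below the LOD
true_mu, true_sigma = 1.0, 0.5
x = rng.lognormal(true_mu, true_sigma, size=500)
lod = np.quantile(x, 0.30)
observed = np.where(x >= lod, x, np.nan)   # censored values are unknown

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def neg_loglik(mu, sigma):
    ll = 0.0
    for v in observed:
        if np.isnan(v):
            # censored: integrate the left tail up to the LOD
            ll += log(norm_cdf((log(lod) - mu) / sigma) + 1e-300)
        else:
            # detected: ordinary lognormal log-density
            z = (log(v) - mu) / sigma
            ll += -log(v * sigma * sqrt(2.0 * pi)) - 0.5 * z * z
    return -ll

# Crude grid search for the maximum-likelihood (mu, sigma)
mus = np.linspace(0.5, 1.5, 41)
sigmas = np.linspace(0.2, 1.0, 41)
_, mu_hat, sigma_hat = min(
    (neg_loglik(m, s), m, s) for m in mus for s in sigmas)
```

Substituting values such as LOD/2 for non-detects, by contrast, biases both the mean and the spread; integrating the tail uses exactly the information the non-detect carries.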
Jones, Hayley E; Hickman, Matthew; Kasprzyk-Hordern, Barbara; Welton, Nicky J; Baker, David R; Ades, A E
2014-07-15
Concentrations of metabolites of illicit drugs in sewage water can be measured with great accuracy and precision, thanks to the development of sensitive and robust analytical methods. Based on assumptions about factors including the excretion profile of the parent drug, routes of administration and the number of individuals using the wastewater system, the level of consumption of a drug can be estimated from such measured concentrations. When presenting results from these 'back-calculations', the multiple sources of uncertainty are often discussed, but are not usually explicitly taken into account in the estimation process. In this paper we demonstrate how these calculations can be placed in a more formal statistical framework by assuming a distribution for each parameter involved, based on a review of the evidence underpinning it. Using a Monte Carlo simulation approach, it is then straightforward to propagate uncertainty in each parameter through the back-calculations, producing a distribution for, instead of a single estimate of, daily or average consumption. This can be summarised, for example, by a median and credible interval. To demonstrate this approach, we estimate cocaine consumption in a large urban UK population, using measured concentrations of two of its metabolites, benzoylecgonine and norbenzoylecgonine. We also demonstrate a more sophisticated analysis, implemented within a Bayesian statistical framework using Markov chain Monte Carlo simulation. Our model allows the two metabolites to simultaneously inform estimates of daily cocaine consumption and explicitly allows for variability between days. After accounting for this variability, the resulting credible interval for average daily consumption is appropriately wider, representing additional uncertainty. We discuss possibilities for extensions to the model, and whether analysis of wastewater samples has potential to contribute to a prevalence model for illicit drug use. Copyright © 2014. Published by Elsevier B.V.
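The Monte Carlo propagation step is simple to sketch. Every distribution below is hypothetical (invented magnitudes, not the paper's evidence review); only the mechanics, drawing each back-calculation input from a distribution and summarising the result by a median and credible interval, follow the abstract:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical input distributions for the back-calculation
conc = np.clip(rng.normal(800.0, 80.0, n), 0.0, None)   # metabolite, ng/L
flow = np.clip(rng.normal(300.0, 30.0, n), 1.0, None)   # plant flow, ML/day
excreted = rng.uniform(0.35, 0.55, n)                   # fraction excreted
population = rng.normal(900_000.0, 50_000.0, n)         # served population

metabolite_g = conc * flow * 1e6 * 1e-9                 # ng/L * L/day -> g/day
cocaine_g = metabolite_g / excreted * (303.4 / 289.3)   # molar-mass ratio
per_1000 = cocaine_g / (population / 1000.0)            # g/day per 1000 people

median = np.median(per_1000)
ci_95 = np.percentile(per_1000, [2.5, 97.5])
```

The output is a full distribution of daily consumption, so any summary (median, credible interval, exceedance probability) comes for free, rather than a single point estimate with undocumented uncertainty.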
Duchêne, Sebastián; Duchêne, David; Holmes, Edward C; Ho, Simon Y W
2015-07-01
Rates and timescales of viral evolution can be estimated using phylogenetic analyses of time-structured molecular sequences. This involves the use of molecular-clock methods, calibrated by the sampling times of the viral sequences. However, the spread of these sampling times is not always sufficient to allow the substitution rate to be estimated accurately. We conducted Bayesian phylogenetic analyses of simulated virus data to evaluate the performance of the date-randomization test, which is sometimes used to investigate whether time-structured data sets have temporal signal. An estimate of the substitution rate passes this test if its mean does not fall within the 95% credible intervals of rate estimates obtained using replicate data sets in which the sampling times have been randomized. We find that the test sometimes fails to detect rate estimates from data with no temporal signal. This error can be minimized by using a more conservative criterion, whereby the 95% credible interval of the estimate with correct sampling times should not overlap with those obtained with randomized sampling times. We also investigated the behavior of the test when the sampling times are not uniformly distributed throughout the tree, which sometimes occurs in empirical data sets. The test performs poorly in these circumstances, such that a modification to the randomization scheme is needed. Finally, we illustrate the behavior of the test in analyses of nucleotide sequences of cereal yellow dwarf virus. Our results validate the use of the date-randomization test and allow us to propose guidelines for interpretation of its results. © The Author 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
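The two pass/fail criteria this abstract contrasts reduce to simple interval logic once the 95% credible intervals are in hand. A sketch (function names are ours, and intervals are represented as (low, high) pairs):

```python
def passes_standard(real_mean, randomized_cis):
    """Standard criterion: the rate estimated with correct sampling times
    must not fall inside any interval estimated with randomized times."""
    return all(not (lo <= real_mean <= hi) for lo, hi in randomized_cis)

def passes_conservative(real_ci, randomized_cis):
    """Conservative criterion: the correct-times interval must not overlap
    any randomized-times interval."""
    lo, hi = real_ci
    return all(hi < r_lo or lo > r_hi for r_lo, r_hi in randomized_cis)

# Illustrative rates (substitutions/site/year, invented numbers):
# the mean clears both randomized intervals, so the standard test passes,
# but the full interval overlaps one of them, so the conservative test fails.
standard_ok = passes_standard(3e-3, [(5e-4, 2e-3), (4e-4, 1.5e-3)])
conservative_ok = passes_conservative((1e-3, 3e-3), [(5e-4, 2e-3)])
```

This illustrates the paper's point: a data set can pass the mean-based criterion while its interval still overlaps the randomized ones, which is why the conservative criterion reduces false signals of temporal structure.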
Thorlund, Kristian; Druyts, Eric; Mills, Edward J; Fedorak, Richard N; Marshall, John K
2014-07-01
To compare the efficacy of adalimumab and infliximab for the treatment of moderate to severe ulcerative colitis using indirect treatment comparison meta-analysis. A systematic review and Bayesian indirect treatment comparison meta-analyses were performed for seven patient-important clinical outcomes at 8 weeks and 52 weeks. Odds ratio (OR) estimates and associated 95% credible intervals (CrIs) were produced. Five eligible RCTs informed clinical remission, response, mucosal healing, quality of life, colectomy, serious adverse events, and discontinuation due to adverse events at 8 weeks and 52 weeks. At 8 weeks of induction therapy, clinical remission (OR=0.42, 95% CrI 0.17-0.97), clinical response (OR=0.45, 95% CrI 0.23-0.89) and mucosal healing (OR=0.46, 95% CrI 0.25-0.86) statistically favored infliximab. However, after 52 weeks of maintenance therapy OR estimates showed no significant difference between infliximab and adalimumab. For serious adverse events and discontinuations due to adverse events, adalimumab and infliximab were similar to placebo. Further, the indirect treatment comparison of adalimumab and infliximab yielded odds ratios close to 1.00 with wide credible intervals. The findings of this indirect treatment comparison meta-analysis suggest that both infliximab and adalimumab are superior to placebo in the treatment of moderate to moderately severe ulcerative colitis. While infliximab is statistically more effective than adalimumab in the induction of remission, response and mucosal healing at 8 weeks, infliximab and adalimumab are comparable in efficacy at 52 weeks of maintenance treatment. Copyright © 2014 European Crohn's and Colitis Organisation. Published by Elsevier B.V. All rights reserved.
Spatial analysis of community-onset Staphylococcus aureus bacteremia in Queensland, Australia.
Marquess, John; Hu, Wenbiao; Nimmo, Graeme R; Clements, Archie C A
2013-03-01
To investigate and describe the relationship between indigenous Australian populations, residential aged care services, and community-onset Staphylococcus aureus bacteremia (SAB) among patients admitted to public hospitals in Queensland, Australia. Ecological study. We used administrative healthcare data linked to microbiology results from patients with SAB admitted to Queensland public hospitals from 2005 through 2010 to identify community-onset infections. Data about indigenous Australian population and residential aged care services at the local government area level were obtained from the Queensland Office of Economic and Statistical Research. Associations between community-onset SAB and indigenous Australian population and residential aged care services were calculated using Poisson regression models in a Bayesian framework. Choropleth maps were used to describe the spatial patterns of SAB risk. We observed a 21% increase in relative risk (RR) of bacteremia with methicillin-susceptible S. aureus (MSSA; RR, 1.21 [95% credible interval, 1.15-1.26]) and a 24% increase in RR with nonmultiresistant methicillin-resistant S. aureus (nmMRSA; RR, 1.24 [95% credible interval, 1.13-1.34]) with a 10% increase in the indigenous Australian population proportion. There was no significant association between RR of SAB and the number of residential aged care services. Areas with the highest RR for nmMRSA and MSSA bacteremia were identified in the northern and western regions of Queensland. The RR of community-onset SAB varied spatially across Queensland. There was increased RR of community-onset SAB with nmMRSA and MSSA in areas of Queensland with increased indigenous population proportions. Additional research should be undertaken to understand other factors that increase the risk of infection due to this organism.
1981-12-01
Gisler; William S. Jewell
[Garbled OCR record: a credibility-theory paper on ratemaking and experience rating, addressing the dilemma of whether or not to give full credibility to the observations; the numerical work is credited to R. Schnieper.]
Determinants of medication adherence in older people with dementia from the caregivers' perspective.
El-Saifi, Najwan; Moyle, Wendy; Jones, Cindy; Alston-Knox, Clair
2018-05-11
Background: Adherence to treatment is a primary determinant of treatment success. Caregiver support can influence medication adherence in people with cognitive impairment. This study sought to characterize medication adherence in older people with dementia from the caregivers' perspective, and to identify influencing factors. Caregivers caring for a person with dementia and living in the community were eligible to complete the survey. Bayesian profile regression was applied to identify determinants of medication adherence measured using the Adherence to Refills and Medication Scale. Out of the 320 caregivers who participated in the survey, Bayesian profile regression on 221 participants identified two groups: Profile 1 (55 caregivers) with a mean adherence rate of 0.69 (80% credible interval (CrI): 0.61-0.77), and Profile 2 (166 caregivers) with a mean adherence rate of 0.80 (80% CrI: 0.77-0.84). Caregivers in Profile 1 had scores below the data average for cognitive functioning, commitment or intention, self-efficacy, and health knowledge, all of which were above the data average in Profile 2, except for health knowledge. Caregivers in Profile 1 had a greater proportion of care recipients taking more than five medications and with late-stage dementia. Trade, technical, or vocational training was more common among the caregivers in Profile 1. Profile 2 caregivers had a better patient-provider relationship and fewer medical problems. Bayesian profile regression was useful in understanding caregiver factors that influence medication adherence. Tailoring interventions to the determinants of medication adherence can guide the development of evidence-based interventions.
Bayesian model selection validates a biokinetic model for zirconium processing in humans
2012-01-01
Background In radiation protection, biokinetic models for zirconium processing are of crucial importance in dose estimation and further risk analysis for humans exposed to this radioactive substance. They provide limiting values of detrimental effects and build the basis for applications in internal dosimetry, the prediction for radioactive zirconium retention in various organs as well as retrospective dosimetry. Multi-compartmental models are the tool of choice for simulating the processing of zirconium. Although easily interpretable, determining the exact compartment structure and interaction mechanisms is generally daunting. In the context of observing the dynamics of multiple compartments, Bayesian methods provide efficient tools for model inference and selection. Results We are the first to apply a Markov chain Monte Carlo approach to compute Bayes factors for the evaluation of two competing models for zirconium processing in the human body after ingestion. Based on in vivo measurements of human plasma and urine levels we were able to show that a recently published model is superior to the standard model of the International Commission on Radiological Protection. The Bayes factors were estimated by means of the numerically stable thermodynamic integration in combination with a recently developed copula-based Metropolis-Hastings sampler. Conclusions In contrast to the standard model the novel model predicts lower accretion of zirconium in bones. This results in lower levels of noxious doses for exposed individuals. Moreover, the Bayesian approach allows for retrospective dose assessment, including credible intervals for the initially ingested zirconium, in a significantly more reliable fashion than previously possible. All methods presented here are readily applicable to many modeling tasks in systems biology. PMID:22863152
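Thermodynamic integration itself is easy to demonstrate on a toy conjugate model where the marginal likelihood is known exactly. The normal-normal example below is ours, not the zirconium biokinetic model, and exact power-posterior draws stand in for the paper's copula-based Metropolis-Hastings sampler; only the identity log Z = ∫₀¹ E_β[log L] dβ is the technique being shown:

```python
import numpy as np

# Toy model: y_i ~ N(theta, 1), prior theta ~ N(0, 1). The power posterior
# at inverse temperature beta is Gaussian, so we can draw from it exactly.
rng = np.random.default_rng(4)
y = rng.normal(0.7, 1.0, size=20)
n, S, SS = y.size, y.sum(), np.sum(y**2)

def log_lik(theta):
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum(
        (y[:, None] - theta[None, :]) ** 2, axis=0)

betas = np.linspace(0.0, 1.0, 101)
e_ll = np.empty_like(betas)
for i, b in enumerate(betas):
    var = 1.0 / (1.0 + b * n)                    # power-posterior variance
    theta = rng.normal(b * S * var, np.sqrt(var), size=4000)
    e_ll[i] = log_lik(theta).mean()              # Monte Carlo E_beta[log L]

# Trapezoidal rule over the inverse-temperature ladder
log_z_ti = np.sum(0.5 * (e_ll[1:] + e_ll[:-1]) * np.diff(betas))

# Exact log marginal likelihood for this conjugate model, for comparison
log_z_exact = (-0.5 * n * np.log(2 * np.pi) - 0.5 * np.log(n + 1.0)
               - 0.5 * (SS - S**2 / (n + 1.0)))
```

Running the ladder once per competing model and differencing the two log Z estimates gives the log Bayes factor, the quantity the study uses to rank the two biokinetic models.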
Carter, Daniel; Charlett, André; Conti, Stefano; Robotham, Julie V.; Johnson, Alan P.; Livermore, David M.; Fowler, Tom; Sharland, Mike; Hopkins, Susan; Woodford, Neil; Burgess, Philip; Dobra, Stephen
2017-01-01
To inform the UK antimicrobial resistance strategy, a risk assessment was undertaken of the likelihood, over a five-year time-frame, of the emergence and widespread dissemination of pan-drug-resistant (PDR) Gram-negative bacteria that would pose a major public health threat by compromising effective healthcare delivery. Subsequent impact over five- and 20-year time-frames was assessed in terms of morbidity and mortality attributable to PDR Gram-negative bacteraemia. A Bayesian approach, combining available data with expert prior opinion, was used to determine the probability of the emergence, persistence and spread of PDR bacteria. Overall probability was modelled using Monte Carlo simulation. Estimates of impact were also obtained using Bayesian methods. The estimated probability of widespread occurrence of PDR pathogens within five years was 0.2 (95% credibility interval (CrI): 0.07–0.37). Estimated annual numbers of PDR Gram-negative bacteraemias at five and 20 years were 6800 (95% CrI: 400–58,600) and 22,800 (95% CrI: 1500–160,000), respectively; corresponding estimates of excess deaths were 1900 (95% CrI: 0–23,000) and 6400 (95% CrI: 0–64,000). Over 20 years, cumulative estimates indicate 284,000 (95% CrI: 17,000–1,990,000) cases of PDR Gram-negative bacteraemia, leading to an estimated 79,000 (95% CrI: 0–821,000) deaths. This risk assessment reinforces the need for urgent national and international action to tackle antibiotic resistance. PMID:28272350
Measuring the radius of PSR J0437−4715 using NICER observations of X-ray oscillations
NASA Astrophysics Data System (ADS)
Lamb, Frederick; Miller, M. Coleman
2017-01-01
The Neutron Star Interior Composition Explorer (NICER) will launch early in 2017. Its first scientific objective is to precisely and reliably measure the radius of several neutron stars, thereby constraining the properties of cold matter at supranuclear densities. This will be done by fitting energy-dependent waveform models to the observed thermal X-ray waveforms of selected rotation-powered millisecond pulsars. A key target is the 174-Hz pulsar PSR J0437−4715. Using synthetic waveform data and Bayesian methods, we have estimated the precisions with which its mass M and radius R can be measured by NICER. When generating the synthetic data, we assumed M = 1.4 M⊙ and R = 13 km. When generating the data and when analyzing it, we assumed the X-ray spectrum and radiation beaming pattern given by models with cool hydrogen atmospheres and two hot spots. Assuming NICER observations lasting a total of 1.0 Msec, current knowledge of M and the distance, and knowledge of the pulsar's spin axis to within 1°, the 1σ credible region in R extends from 11.83 to 13.73 km (7.4%) and in M, from 1.307 to 1.567 M⊙ (9.1%). Marginalizing over M, we find the 1σ credible interval for R alone extends from 12.62 to 13.68 km (4%).
The impact of the rate prior on Bayesian estimation of divergence times with multiple Loci.
Dos Reis, Mario; Zhu, Tianqi; Yang, Ziheng
2014-07-01
Bayesian methods provide a powerful way to estimate species divergence times by combining information from molecular sequences with information from the fossil record. With the explosive increase of genomic data, divergence time estimation increasingly uses data of multiple loci (genes or site partitions). Widely used computer programs to estimate divergence times use independent and identically distributed (i.i.d.) priors on the substitution rates for different loci. The i.i.d. prior is problematic. As the number of loci (L) increases, the prior variance of the average rate across all loci goes to zero at the rate 1/L. As a consequence, the rate prior dominates posterior time estimates when many loci are analyzed, and if the rate prior is misspecified, the estimated divergence times will converge to wrong values with very narrow credibility intervals. Here we develop a new prior on the locus rates based on the Dirichlet distribution that corrects the problematic behavior of the i.i.d. prior. We use computer simulation and real data analysis to highlight the differences between the old and new priors. For a dataset for six primate species, we show that with the old i.i.d. prior, if the prior rate is too high (or too low), the estimated divergence times are too young (or too old), outside the bounds imposed by the fossil calibrations. In contrast, with the new Dirichlet prior, posterior time estimates are insensitive to the rate prior and are compatible with the fossil calibrations. We re-analyzed a phylogenomic data set of 36 mammal species and show that using many fossil calibrations can alleviate the adverse impact of a misspecified rate prior to some extent. We recommend the use of the new Dirichlet prior in Bayesian divergence time estimation. [Bayesian inference, divergence time, relaxed clock, rate prior, partition analysis.]. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
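The problematic 1/L shrinkage, and the Dirichlet fix, can be seen directly by simulating the two priors. The gamma shapes below are arbitrary choices of ours; the construction (an overall mean rate with its own prior, split across loci by Dirichlet-distributed proportions) follows the abstract's description:

```python
import numpy as np

rng = np.random.default_rng(5)
draws = 100_000
results = {}

for L in (2, 50):
    # i.i.d. prior: each locus rate ~ Gamma(shape=2, scale=0.5), mean 1
    iid = rng.gamma(2.0, 0.5, size=(draws, L))
    # Dirichlet prior: the mean rate mu gets its own Gamma prior, and the
    # locus rates are mu * L * (Dirichlet proportions), so their average
    # across loci is exactly mu regardless of L
    mu = rng.gamma(2.0, 0.5, size=draws)
    props = rng.dirichlet(np.full(L, 2.0), size=draws)
    dirichlet_rates = mu[:, None] * L * props
    results[L] = (iid.mean(axis=1).var(), dirichlet_rates.mean(axis=1).var())
```

Under the i.i.d. prior the variance of the average rate collapses toward zero as L grows, so with many loci the prior, not the data, pins down the average rate (and hence the times); under the Dirichlet construction the prior uncertainty on the average rate stays fixed.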
Raghavan, Ram K.; Hanlon, Cathleen A.; Goodin, Douglas G.; Anderson, Gary A.
2016-01-01
Striped skunks are one of the most important terrestrial reservoirs of rabies virus in North America, and yet the prevalence of rabies among this host is only passively monitored and the disease among this host remains largely unmanaged. Oral vaccination campaigns have not efficiently targeted striped skunks, while periodic spillovers of striped skunk variant viruses to other animals, including some domestic animals, are routinely recorded. In this study we evaluated the spatial and spatio-temporal patterns of infection status among striped skunk cases submitted for rabies testing in the North Central Plains of US in a Bayesian hierarchical framework, and also evaluated potential eco-climatological drivers of such patterns. Two Bayesian hierarchical models were fitted to point-referenced striped skunk rabies cases [n = 656 (negative), and n = 310 (positive)] received at a leading rabies diagnostic facility between the years 2007–2013. The first model included only spatial and temporal terms and a second covariate model included additional covariates representing eco-climatic conditions within a 4km2 home-range area for striped skunks. The better performing covariate model indicated the presence of significant spatial and temporal trends in the dataset and identified higher amounts of land covered by low-intensity developed areas [Odds ratio (OR) = 3.41; 95% Bayesian Credible Intervals (CrI) = 2.08, 3.85], higher level of patch fragmentation (OR = 1.70; 95% CrI = 1.25, 2.89), and diurnal temperature range (OR = 0.54; 95% CrI = 0.27, 0.91) to be important drivers of striped skunk rabies incidence in the study area. Model validation statistics indicated satisfactory performance for both models; however, the covariate model fared better. 
The findings of this study are important in the context of rabies management among striped skunks in North America, and the relevance of physical and climatological factors as risk factors for skunk to human rabies transmission and the space-time patterns of striped skunk rabies are discussed. PMID:27127994
Liew, Bernard X W; Drovandi, Christopher C; Clifford, Samuel; Keogh, Justin W L; Morris, Susan; Netto, Kevin
2018-01-01
There is convincing evidence for the benefits of resistance training on vertical jump improvements, but little evidence to guide optimal training prescription. The inability to detect small between-modality effects may partially reflect the use of ANOVA statistics. This study represents the results of a sub-study from a larger project investigating the effects of two resistance training methods on load carriage running energetics. Bayesian statistics were used to compare the effectiveness of isoinertial resistance training against speed-power training for changing countermovement jump (CMJ) and squat jump (SJ) height, and joint energetics. Active adults were randomly allocated to either a six-week isoinertial (n = 16; calf raises, leg press, and lunge) or a speed-power training program (n = 14; countermovement jumps, hopping, with hip flexor training to target pre-swing running energetics). Primary outcome variables included jump height and joint power. Bayesian mixed modelling and Functional Data Analysis were used, with an effect deemed credible when the 95% Bayesian Credible Interval (CrI) excluded zero. The gain in CMJ height after isoinertial training was 1.95 cm (95% CrI [0.85-3.04] cm) greater than the gain after speed-power training, but the gain in SJ height was similar between groups. In the CMJ, isoinertial training produced a larger increase in power absorption at the hip by a mean 0.018% (equivalent to 35 W) (95% CrI [0.007-0.03]), knee by 0.014% (equivalent to 27 W) (95% CrI [0.006-0.02]) and foot by 0.011% (equivalent to 21 W) (95% CrI [0.005-0.02]) compared to speed-power training. Short-term isoinertial training improved CMJ height more than speed-power training. The principal adaptive difference between training modalities was at the level of hip, knee and foot power absorption.
Rosinska, M; Gwiazda, P; De Angelis, D; Presanis, A M
2016-04-01
HIV spread in men who have sex with men (MSM) is an increasing problem in Poland. Despite the existence of a surveillance system, there is no direct evidence to allow estimation of HIV prevalence and the proportion undiagnosed in MSM. We extracted data on HIV and the MSM population in Poland, including case-based surveillance data, diagnostic testing prevalence data and behavioural data relating to self-reported prior diagnosis, stratified by age (≤35, >35 years) and region (Mazowieckie, including the capital city of Warsaw; other regions). They were integrated into one model based on a Bayesian evidence synthesis approach. The posterior distributions for HIV prevalence and the undiagnosed fraction were estimated by Markov chain Monte Carlo methods. To improve the model fit we repeated the analysis, introducing bias parameters to account for potential lack of representativeness in the data. By placing additional constraints on bias parameters we obtained precisely identified estimates. This family of models indicates a high undiagnosed fraction [68.3%, 95% credibility interval (CrI) 53.9-76.1] and overall low prevalence (2.3%, 95% CrI 1.4-4.1) of HIV in MSM. Additional data are necessary in order to produce more robust epidemiological estimates. More effort is urgently needed to ensure timely diagnosis of HIV in Poland.
Smith, Brian J; Zhang, Lixun; Field, R William
2007-11-10
This paper presents a Bayesian model that allows for the joint prediction of county-average radon levels and estimation of the associated leukaemia risk. The methods are motivated by radon data from an epidemiologic study of residential radon in Iowa that include 2726 outdoor and indoor measurements. Prediction of county-average radon is based on a geostatistical model for the radon data which assumes an underlying continuous spatial process. In the radon model, we account for uncertainties due to incomplete spatial coverage, spatial variability, characteristic differences between homes, and detector measurement error. The predicted radon averages are, in turn, included as a covariate in Poisson models for incident cases of acute lymphocytic (ALL), acute myelogenous (AML), chronic lymphocytic (CLL), and chronic myelogenous (CML) leukaemias reported to the Iowa cancer registry from 1973 to 2002. Since radon and leukaemia risk are modelled simultaneously in our approach, the resulting risk estimates accurately reflect uncertainties in the predicted radon exposure covariate. Posterior mean (95 per cent Bayesian credible interval) estimates of the relative risk associated with a 1 pCi/L increase in radon for ALL, AML, CLL, and CML are 0.91 (0.78-1.03), 1.01 (0.92-1.12), 1.06 (0.96-1.16), and 1.12 (0.98-1.27), respectively. Copyright 2007 John Wiley & Sons, Ltd.
Ochoa, J; León, A L; Ramírez, I C; Lopera, C M; Bernal, E; Arbeláez, M P
2017-04-01
A latent tuberculosis infection (LTBI) prevalence survey was conducted using the tuberculin skin test (TST) and Quantiferon test (QFT) in 1218 healthcare workers (HCWs) in Medellín, Colombia. In order to improve the prevalence estimates, a latent class model was built using a Bayesian approach with informative priors on the sensitivity and specificity of the TST. The proportion of concordant results (TST+, QFT+) was 41% and the discordant results contributed 27%. The marginal estimate of the prevalence P(LTBI+) was 62.1% [95% credible interval (CrI) 53.0-68.2]. The probability of LTBI+ given positive results for both tests was 99.6% (95% CrI 98.1-99.9). Sensitivity was 88.5% for TST and 74.3% for QFT, and specificity was 87.8% for TST and 97.6% for QFT. A high LTBI prevalence was found in HCWs with time-accumulated exposure in hospitals that lack control plans. In a context of intermediate tuberculosis (TB) incidence it is recommended to use only one test (either QFT or TST) in prevalence surveys or as pre-employment tests. Results will be useful to help implement TB infection control plans in hospitals where HCWs may be repeatedly exposed to unnoticed TB patients, and to inform the design of TB control policies.
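Given the reported point estimates, and assuming the two tests are conditionally independent given true infection status, the probability of LTBI after two positive tests can be reproduced with Bayes' rule (a back-of-envelope check, not the latent class model itself):

```python
# Point estimates taken from the abstract (posterior summaries); conditional
# independence of TST and QFT given infection status is assumed here
prev, se_tst, sp_tst, se_qft, sp_qft = 0.621, 0.885, 0.878, 0.743, 0.976

p_pos_both_given_inf = se_tst * se_qft          # P(TST+, QFT+ | LTBI+)
p_pos_both_given_uninf = (1 - sp_tst) * (1 - sp_qft)  # P(TST+, QFT+ | LTBI-)
ppv_both = prev * p_pos_both_given_inf / (
    prev * p_pos_both_given_inf + (1 - prev) * p_pos_both_given_uninf
)
print(f"P(LTBI+ | TST+, QFT+) = {ppv_both:.3f}")
```

The result is close to the 99.6% reported in the abstract; the small gap is expected since the published figure comes from the full posterior rather than plugged-in point estimates.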
Geographical Heterogeneity of Multiple Sclerosis Prevalence in France.
Pivot, Diane; Debouverie, Marc; Grzebyk, Michel; Brassat, David; Clanet, Michel; Clavelou, Pierre; Confavreux, Christian; Edan, Gilles; Leray, Emmanuelle; Moreau, Thibault; Vukusic, Sandra; Hédelin, Guy; Guillemin, Francis
2016-01-01
Geographical variation in the prevalence of multiple sclerosis (MS) is controversial. Heterogeneity is important to acknowledge to adapt the provision of care within the healthcare system. We aimed to investigate differences in prevalence of MS in departments in the French territory. We estimated MS prevalence on October 31, 2004 in 21 administrative departments in France (22% of the metropolitan departments) by using multiple data sources: the main French health insurance systems, neurologist networks devoted to MS and the Technical Information Agency of Hospitalization. We used a spatial Bayesian approach based on estimating the number of MS cases from 2005 and 2008 capture-recapture studies to analyze differences in prevalence. The age- and sex-standardized prevalence of MS per 100,000 inhabitants ranged from 68.1 (95% credible interval 54.6, 84.4) in Hautes-Pyrénées (southwest France) to 296.5 (258.8, 338.9) in Moselle (northeast France). The greatest prevalence was in the northeast departments, and the other departments showed great variability. By combining multiple data sources into a spatial Bayesian model, we found heterogeneity in MS prevalence among the 21 departments of France, some with higher prevalence than anticipated from previous publications. No clear explanation related to health insurance coverage and hospital facilities can be advanced. Population migration, socioeconomic status of the population studied and environmental effects are suspected.
Bayesian modeling of Clostridium perfringens growth in beef-in-sauce products.
Jaloustre, S; Cornu, M; Morelli, E; Noël, V; Delignette-Muller, M L
2011-04-01
Models of Clostridium perfringens growth which have been published to date have all been deterministic. A probabilistic model describing growth under non-isothermal conditions was thus proposed for predicting C. perfringens growth in beef-in-sauce products cooked and distributed in a French hospital. Model parameters were estimated from different types of data from various studies. A Bayesian approach was proposed to model the overall uncertainty regarding parameters and potential variability in the 'work to be done' (h₀) during the germination, outgrowth and lag phase. Three models which differed according to their description of this parameter h₀ were tested. The model with inter-curve variability on h₀ was found to be the best one, on the basis of goodness-of-fit assessment and validation with literature data on results obtained under non-isothermal conditions. This model was used in two-dimensional Monte Carlo simulations to predict C. perfringens growth throughout the preparation of beef-in-sauce products, using temperature profiles recorded in a hospital kitchen. The median predicted growth was 7.8×10⁻² log₁₀ cfu·g⁻¹ (95% credibility interval [2.4×10⁻², 0.8]), despite the fact that for more than 50% of the recorded temperature profiles the cooling steps were longer than those required by French regulations. Copyright © 2010 Elsevier Ltd. All rights reserved.
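The two-dimensional Monte Carlo idea (an outer loop over parameter uncertainty, an inner loop over batch-to-batch variability in the 'work to be done' h0) can be sketched as below; every numeric value and the linearised growth rule are invented for illustration and are not the authors' fitted model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_unc, n_var = 500, 200  # outer (uncertainty) and inner (variability) dimensions

# Hypothetical hyper-parameters: uncertainty about the mean and sd of log10(h0),
# and about a growth-rate-like parameter mu during the cooling window
log_h0_mean = rng.normal(0.3, 0.05, size=n_unc)
log_h0_sd = np.abs(rng.normal(0.15, 0.03, size=n_unc))
mu_max = rng.normal(0.6, 0.1, size=n_unc)

growth = np.empty((n_unc, n_var))
for i in range(n_unc):
    # variability dimension: h0 differs between batches (lognormal stand-in)
    h0 = 10 ** rng.normal(log_h0_mean[i], log_h0_sd[i], size=n_var)
    hours_in_growth_range = 3.0  # fixed illustrative cooling profile
    work_done = mu_max[i] * hours_in_growth_range
    # crude rule: growth occurs only once the 'work to be done' is exhausted
    growth[i] = np.maximum(work_done - h0, 0.0)

print("median growth:", np.median(growth))
print("95% interval:", np.percentile(growth, [2.5, 97.5]))
```

The outer dimension propagates what we do not know (parameters); the inner dimension propagates what genuinely varies (batches), and the two are summarised separately, which is the point of the two-dimensional scheme.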
Bayesian Inversion of 2D Models from Airborne Transient EM Data
NASA Astrophysics Data System (ADS)
Blatter, D. B.; Key, K.; Ray, A.
2016-12-01
The inherent non-uniqueness in most geophysical inverse problems leads to an infinite number of Earth models that fit observed data to within an adequate tolerance. To resolve this ambiguity, traditional inversion methods based on optimization techniques such as the Gauss-Newton and conjugate gradient methods rely on an additional regularization constraint on the properties that an acceptable model can possess, such as having minimal roughness. While allowing such an inversion scheme to converge on a solution, regularization makes it difficult to estimate the uncertainty associated with the model parameters. This is because regularization biases the inversion process toward certain models that satisfy the regularization constraint and away from others that do not, even when both may suitably fit the data. By contrast, a Bayesian inversion framework aims to produce not a single 'most acceptable' model but an estimate of the posterior likelihood of the model parameters, given the observed data. In this work, we develop a 2D Bayesian framework for the inversion of transient electromagnetic (TEM) data. Our method relies on a reversible-jump Markov Chain Monte Carlo (RJ-MCMC) Bayesian inverse method with parallel tempering. Previous gradient-based inversion work in this area used a spatially constrained scheme wherein individual (1D) soundings were inverted together and non-uniqueness was tackled by using lateral and vertical smoothness constraints. By contrast, our work uses a 2D model space of Voronoi cells whose parameterization (including the number of cells) is fully data-driven. To make the problem computationally tractable, we approximate the forward solution for each TEM sounding using a local 1D approximation, where the model is obtained from the 2D model by retrieving a vertical profile through the Voronoi cells.
The implicit parsimony of the Bayesian inversion process leads to the simplest models that adequately explain the data, obviating the need for explicit smoothness constraints. In addition, credible intervals in model space are directly obtained, resolving some of the uncertainty introduced by regularization. An example application shows how the method can be used to quantify the uncertainty in airborne EM soundings for imaging subglacial brine channels and groundwater systems.
Bayesian analysis of multimethod ego-depletion studies favours the null hypothesis.
Etherton, Joseph L; Osborne, Randall; Stephenson, Katelyn; Grace, Morgan; Jones, Chas; De Nadai, Alessandro S
2018-04-01
Ego-depletion refers to the purported decrease in performance on a task requiring self-control after engaging in a previous task involving self-control, with self-control proposed to be a limited resource. Despite many published studies consistent with this hypothesis, recurrent null findings within our laboratory and indications of publication bias have called into question the validity of the depletion effect. This project used three depletion protocols involving three different depleting initial tasks followed by three different self-control tasks as dependent measures (total n = 840). For each method, effect sizes were not significantly different from zero. When data were aggregated across the three different methods and examined meta-analytically, the pooled effect size was not significantly different from zero (for all priors evaluated, Hedges' g = 0.10 with 95% credibility interval of [-0.05, 0.24]) and Bayes factors reflected strong support for the null hypothesis (Bayes factor > 25 for all priors evaluated). © 2018 The British Psychological Society.
De Knegt, L V; Pires, S M; Hald, T
2015-04-01
A Bayesian modelling approach comparing the occurrence of Salmonella serovars in animals and humans was used to attribute salmonellosis cases to broilers, turkeys, pigs, laying hens, travel and outbreaks in 24 European Union countries. Salmonella data for animals and humans, covering the period from 2007 to 2009, were mainly obtained from studies and reports published by the European Food Safety Authority. Availability of food sources for consumption was derived from trade and production data from the European Statistical Office. Results showed layers as the most important reservoir of human salmonellosis in Europe, with 42.4% (7,903,000 cases, 95% credibility interval 4,181,000-14,510,000) of cases, 95.9% of which was caused by S. Enteritidis. In Finland and Sweden, most cases were travel-related, while in most other countries the main sources were related to the laying hen or pig reservoir, highlighting differences in the epidemiology of Salmonella, surveillance focus and eating habits across the European Union.
NASA Astrophysics Data System (ADS)
Piecuch, C. G.; Huybers, P. J.; Hay, C.; Mitrovica, J. X.; Little, C. M.; Ponte, R. M.; Tingley, M.
2017-12-01
Understanding observed spatial variations in centennial relative sea level trends on the United States east coast has important scientific and societal applications. Past studies based on models and proxies variously suggest roles for crustal displacement, ocean dynamics, and melting of the Greenland ice sheet. Here we perform joint Bayesian inference on regional relative sea level, vertical land motion, and absolute sea level fields based on tide gauge records and GPS data. Posterior solutions show that regional vertical land motion explains most (80% median estimate) of the spatial variance in the large-scale relative sea level trend field on the east coast over 1900-2016. The posterior estimate for coastal absolute sea level rise is remarkably spatially uniform compared to previous studies, with a spatial average of 1.4-2.3 mm/yr (95% credible interval). Results corroborate glacial isostatic adjustment models and reveal that meaningful long-period, large-scale vertical velocity signals can be extracted from short GPS records.
Evaluating a fish monitoring protocol using state-space hierarchical models
Russell, Robin E.; Schmetterling, David A.; Guy, Chris S.; Shepard, Bradley B.; McFarland, Robert; Skaar, Donald
2012-01-01
Using data collected from three river reaches in Montana, we evaluated our ability to detect population trends and predict future fish abundance. Data were collected as part of a long-term monitoring program conducted by Montana Fish, Wildlife and Parks primarily to estimate rainbow trout (Oncorhynchus mykiss) and brown trout (Salmo trutta) abundance in numerous rivers across Montana. We used a hierarchical Bayesian mark-recapture model to estimate fish abundance over time in each of the three river reaches. We then fit a state-space Gompertz model to estimate current trends and predict future fish populations. Density-dependent effects were detected in 1 of the 6 fish populations. Predictions of future fish populations displayed wide credible intervals. Our simulations indicated that given the observed variation in the abundance estimates, the probability of detecting a 30% decline in fish populations over a five-year period was less than 50%. We recommend a monitoring program that is closely tied to management objectives and reflects the precision necessary to make informed management decisions.
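The low detection power the authors report can be illustrated with a toy simulation (Python; the abundance level, noise magnitudes, and the simple log-linear trend test are assumptions for illustration, not the authors' state-space Gompertz model):

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_counts(n_years, decline=0.30, proc_sd=0.15, obs_cv=0.25):
    """Declining log-abundance with process noise, plus lognormal observation
    error from the abundance-estimation step (all values hypothetical)."""
    r = np.log(1 - decline) / (n_years - 1)  # per-year log trend giving the target decline
    x = np.log(1000) + r * np.arange(n_years) + rng.normal(0, proc_sd, n_years)
    return np.exp(x + rng.normal(0, obs_cv, n_years))

# Probability that a simple log-linear regression flags a 30% five-year decline
n_sim, detected = 500, 0
for _ in range(n_sim):
    y = np.log(simulate_counts(5))
    t = np.arange(5.0)
    coef = np.polyfit(t, y, 1)
    resid = y - np.polyval(coef, t)
    se = np.sqrt(resid.var(ddof=2) / ((t - t.mean()) ** 2).sum())
    detected += (coef[0] + 1.64 * se) < 0  # crude one-sided test at ~5%
print("detection probability:", detected / n_sim)
```

With realistic sampling noise and only five annual surveys, the detection probability lands well below 50%, mirroring the paper's conclusion that the monitoring design lacks power for this effect size.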
Lottering, Nicolene; MacGregor, Donna M; Alston, Clair L; Watson, Debbie; Gregory, Laura S
2016-01-01
Contemporary, population-specific ossification timings of the cranium are lacking in current literature due to challenges in obtaining large repositories of documented subadult material, forcing Australian practitioners to rely on North American, arguably antiquated reference standards for age estimation. This study assessed the temporal pattern of ossification of the cranium and provides recalibrated probabilistic information for age estimation of modern Australian children. Fusion status of the occipital and frontal bones, atlas, and axis was scored using a modified two- to four-tier system from cranial/cervical DICOM datasets of 585 children aged birth to 10 years. Transition analysis was applied to elucidate maximum-likelihood estimates between consecutive fusion stages, in conjunction with Bayesian statistics to calculate credible intervals for age estimation. Results demonstrate significant sex differences in skeletal maturation (p < 0.05) and earlier timings in comparison with major literary sources, underscoring the requisite of updated standards for age estimation of modern individuals. © 2015 American Academy of Forensic Sciences.
Hanks, Ephraim M.; Schliep, Erin M.; Hooten, Mevin B.; Hoeting, Jennifer A.
2015-01-01
In spatial generalized linear mixed models (SGLMMs), covariates that are spatially smooth are often collinear with spatially smooth random effects. This phenomenon is known as spatial confounding and has been studied primarily in the case where the spatial support of the process being studied is discrete (e.g., areal spatial data). In this case, the most common approach suggested is restricted spatial regression (RSR) in which the spatial random effects are constrained to be orthogonal to the fixed effects. We consider spatial confounding and RSR in the geostatistical (continuous spatial support) setting. We show that RSR provides computational benefits relative to the confounded SGLMM, but that Bayesian credible intervals under RSR can be inappropriately narrow under model misspecification. We propose a posterior predictive approach to alleviating this potential problem and discuss the appropriateness of RSR in a variety of situations. We illustrate RSR and SGLMM approaches through simulation studies and an analysis of malaria frequencies in The Gambia, Africa.
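The core RSR construction is a projection of the random effect onto the orthogonal complement of the fixed-effect design. A minimal NumPy sketch, assuming a toy design matrix and using white noise as a stand-in for the spatially correlated effect:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # fixed-effect design

# Restricted spatial regression: constrain the random effect eta to be
# orthogonal to the column space of X via the residual projector
P_perp = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)
eta = rng.normal(size=n)   # stand-in for a spatially correlated random effect
eta_rsr = P_perp @ eta     # restricted random effect

print(np.abs(X.T @ eta_rsr).max())  # numerically zero: orthogonal to fixed effects
```

Because `eta_rsr` carries no component in the span of X, the fixed effects absorb all collinear variation, which is what removes the confounding but also what can make the resulting credible intervals too narrow under misspecification.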
Catelan, Dolores; Biggeri, Annibale
2008-11-01
In environmental epidemiology, long lists of relative risk estimates from exposed populations are compared to a reference to scrutinize the dataset for extremes. Here, inference on disease profiles for given areas, or for fixed disease population signatures, is of interest, and summaries can be obtained by averaging over areas or diseases. We have developed a multivariate hierarchical Bayesian approach to estimate posterior rank distributions, and we show how to produce league tables of ranks with credibility intervals useful for addressing the above-mentioned inferential problems. Applying the procedure to a real dataset from the report "Environment and Health in Sardinia (Italy)", we selected 18 areas characterized by high environmental pressure from industrial, mining or military activities, investigated for 29 causes of death among male residents. Ranking diseases highlighted the increased burdens of neoplastic (cancerous) and non-neoplastic respiratory diseases in the heavily polluted area of Portoscuso. The averaged ranks by disease over areas showed lung cancer among the three highest positions.
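Posterior rank distributions of this kind fall straight out of MCMC output: rank the units within each posterior draw, then summarise the ranks. A sketch in Python, with simulated draws standing in for the fitted hierarchical model's samples:

```python
import numpy as np

rng = np.random.default_rng(4)
n_draws, n_areas = 4000, 18

# Stand-in posterior draws of area-specific log relative risks
# (in practice these come from the fitted multivariate hierarchical model)
true_lrr = rng.normal(0, 0.3, n_areas)
draws = true_lrr + rng.normal(0, 0.1, size=(n_draws, n_areas))

# Rank areas within each posterior draw (1 = lowest risk, 18 = highest)
ranks = draws.argsort(axis=1).argsort(axis=1) + 1
median_rank = np.median(ranks, axis=0)
rank_ci = np.percentile(ranks, [2.5, 97.5], axis=0)  # 95% credibility interval per area
print("median ranks:", median_rank)
print("rank CrIs (first 5 areas):", rank_ci[:, :5])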
qPR: An adaptive partial-report procedure based on Bayesian inference.
Baek, Jongsoo; Lesmes, Luis Andres; Lu, Zhong-Lin
2016-08-01
Iconic memory is best assessed with the partial report procedure in which an array of letters appears briefly on the screen and a poststimulus cue directs the observer to report the identity of the cued letter(s). Typically, 6-8 cue delays or 600-800 trials are tested to measure the iconic memory decay function. Here we develop a quick partial report, or qPR, procedure based on a Bayesian adaptive framework to estimate the iconic memory decay function with much reduced testing time. The iconic memory decay function is characterized by an exponential function and a joint probability distribution of its three parameters. Starting with a prior of the parameters, the method selects the stimulus to maximize the expected information gain in the next test trial. It then updates the posterior probability distribution of the parameters based on the observer's response using Bayesian inference. The procedure is reiterated until either the total number of trials or the precision of the parameter estimates reaches a certain criterion. Simulation studies showed that only 100 trials were necessary to reach an average absolute bias of 0.026 and a precision of 0.070 (both in terms of probability correct). A psychophysical validation experiment showed that estimates of the iconic memory decay function obtained with 100 qPR trials exhibited good precision (the half width of the 68.2% credible interval = 0.055) and excellent agreement with those obtained with 1,600 trials of the conventional method of constant stimuli procedure (RMSE = 0.063). Quick partial-report relieves the data collection burden in characterizing iconic memory and makes it possible to assess iconic memory in clinical populations.
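A minimal, one-parameter version of the adaptive idea can be sketched as follows (Python; the decay form, grids, and the fixed initial/asymptotic accuracies are assumptions for illustration, not the qPR implementation, which estimates three parameters jointly):

```python
import numpy as np

rng = np.random.default_rng(5)

delays = np.linspace(0.0, 1.0, 11)     # candidate cue delays (s), hypothetical
tau_grid = np.linspace(0.05, 1.0, 96)  # grid over the decay time constant
post = np.ones_like(tau_grid) / tau_grid.size  # flat prior

base, asym = 0.95, 0.30  # initial accuracy and asymptote, assumed known here
def p_correct(tau, d):
    return asym + (base - asym) * np.exp(-d / tau)

true_tau = 0.3  # simulated observer
for trial in range(100):
    # expected information gain (mutual information) for each candidate delay
    p1 = p_correct(tau_grid[None, :], delays[:, None])   # P(correct | tau, d)
    pm = (post[None, :] * p1).sum(axis=1)                # predictive P(correct | d)
    h_pred = -(pm * np.log(pm) + (1 - pm) * np.log(1 - pm))
    h_cond = (post[None, :] * -(p1 * np.log(p1) + (1 - p1) * np.log(1 - p1))).sum(axis=1)
    d = delays[np.argmax(h_pred - h_cond)]               # most informative delay
    # simulate the observer's response and update the posterior by Bayes' rule
    correct = rng.random() < p_correct(true_tau, d)
    post = post * (p_correct(tau_grid, d) if correct else 1 - p_correct(tau_grid, d))
    post /= post.sum()

est = (post * tau_grid).sum()
print("posterior mean tau after 100 trials:", round(est, 3))
```

Each trial is placed where the predicted response is most informative about the decay constant, which is why far fewer trials are needed than with the method of constant stimuli.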
Ide, Kazuki; Kawasaki, Yohei; Akutagawa, Maiko; Yamada, Hiroshi
2017-02-01
The aim of this study is to analyze the data obtained from a randomized trial on the prevention of influenza by gargling with green tea, which gave nonsignificant results based on frequentist approaches, by using Bayesian approaches. The posterior proportion, with 95% credible interval (CrI), of influenza in each group was calculated. The Bayesian index θ is the probability that a hypothesis is true. In this case, θ is the probability that the hypothesis that green tea gargling reduced influenza compared with water gargling is true. Univariate and multivariate logistic regression analyses were also performed by using the Markov chain Monte Carlo method. The full analysis set included 747 participants. During the study period, influenza occurred in 44 participants (5.9%). The difference between the two independent binomial proportions was -0.019 (95% CrI, -0.054 to 0.015; θ = 0.87). The partial regression coefficients in the univariate analysis were -0.35 (95% CrI, -1.00 to 0.24) with use of a uniform prior and -0.34 (95% CrI, -0.96 to 0.27) with use of a Jeffreys prior. In the multivariate analysis, the values were -0.37 (95% CrI, -0.96 to 0.30) and -0.36 (95% CrI, -1.03 to 0.21), respectively. The difference between the two independent binomial proportions was less than 0, and θ was greater than 0.85. Therefore, green tea gargling may slightly reduce influenza compared with water gargling. This analysis suggests that green tea gargling can be an additional preventive measure for use with other pharmaceutical and nonpharmaceutical measures and indicates the need for additional studies to confirm the effect of green tea gargling.
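A Bayesian index θ for two binomial proportions can be computed by simulation from independent Beta posteriors. In the sketch below (Python), the arm-level counts are hypothetical, since the abstract reports only the overall total of 44/747; they are chosen to roughly match the reported risk difference:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical arm-level counts (illustrative; totals match the reported 44/747)
flu_tea, n_tea = 18, 380
flu_water, n_water = 26, 367

# Beta(1, 1) (uniform) priors give Beta posteriors for each influenza proportion
p_tea = rng.beta(1 + flu_tea, 1 + n_tea - flu_tea, 100_000)
p_water = rng.beta(1 + flu_water, 1 + n_water - flu_water, 100_000)

diff = p_tea - p_water
theta = (diff < 0).mean()  # P(green tea arm has the lower influenza risk | data)
print("risk difference:", round(diff.mean(), 3), " theta:", round(theta, 2))
```

θ here is a direct posterior probability of the directional hypothesis, which is the quantity the abstract contrasts with the nonsignificant frequentist result.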
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cauthen, Katherine Regina; Lambert, Gregory Joseph; Finley, Patrick D.
There is mounting evidence that alcohol use is significantly linked to lower HCV treatment response rates in interferon-based therapies, though some of the evidence is conflicting. Furthermore, although health care providers recommend reducing or abstaining from alcohol use prior to treatment, many patients do not succeed in doing so. The goal of this meta-analysis was to systematically review and summarize the English-language literature up through January 30, 2015 regarding the relationship between alcohol use and HCV treatment outcomes, among patients who were not required to abstain from alcohol use in order to receive treatment. Seven pertinent articles studying 1,751 HCV-infected patients were identified. Log-ORs of HCV treatment response for heavy alcohol use and light alcohol use were calculated and compared. We employed a hierarchical Bayesian meta-analytic model to accommodate the small sample size. The summary estimate for the log-OR of HCV treatment response was -0.775 with a 95% credible interval of (-1.397, -0.236). The results of the Bayesian meta-analysis are slightly more conservative compared to those obtained from a bootstrapped, random effects model. We found evidence of heterogeneity (Q = 14.489, p = 0.025), accounting for 60.28% of the variation among log-ORs. Meta-regression to capture the sources of this heterogeneity did not identify any of the covariates investigated as significant. This meta-analysis confirms that heavy alcohol use is associated with decreased HCV treatment response compared to lighter levels of alcohol use. Further research is required to characterize the mechanism by which alcohol use affects HCV treatment response.
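A minimal grid-based version of Bayesian random-effects pooling of log-ORs looks like this (Python; the seven study-level estimates and standard errors are invented for illustration, as the abstract does not report them, and the flat/uniform priors are simple stand-ins for the authors' hierarchical model):

```python
import numpy as np

# Hypothetical study-level log-ORs and standard errors for seven studies
y = np.array([-1.2, -0.4, -0.9, -1.5, -0.2, -0.6, -1.0])
se = np.array([0.5, 0.4, 0.6, 0.7, 0.45, 0.5, 0.55])

# Random-effects model y_i ~ N(mu, se_i^2 + tau^2), flat prior on mu,
# uniform prior on tau, posterior evaluated on a grid
mu_grid = np.linspace(-3, 1, 201)
tau_grid = np.linspace(0, 2, 101)
M, T = np.meshgrid(mu_grid, tau_grid)

var = se[:, None, None] ** 2 + T[None] ** 2
loglik = (-0.5 * np.log(2 * np.pi * var)
          - (y[:, None, None] - M[None]) ** 2 / (2 * var)).sum(axis=0)
post = np.exp(loglik - loglik.max())
post /= post.sum()

mu_marg = post.sum(axis=0)               # marginalise over tau
mu_mean = (mu_marg * mu_grid).sum()
cdf = np.cumsum(mu_marg)
lo, hi = mu_grid[np.searchsorted(cdf, 0.025)], mu_grid[np.searchsorted(cdf, 0.975)]
print(f"pooled log-OR: {mu_mean:.2f} (95% CrI {lo:.2f}, {hi:.2f})")
```

Marginalising over the between-study standard deviation tau, rather than plugging in a point estimate of it, is what makes the Bayesian interval appropriately wider with only seven studies.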
Inhaled Cannabis for Chronic Neuropathic Pain: A Meta-analysis of Individual Patient Data.
Andreae, Michael H; Carter, George M; Shaparin, Naum; Suslov, Kathryn; Ellis, Ronald J; Ware, Mark A; Abrams, Donald I; Prasad, Hannah; Wilsey, Barth; Indyk, Debbie; Johnson, Matthew; Sacks, Henry S
2015-12-01
Chronic neuropathic pain, the most frequent condition affecting the peripheral nervous system, remains underdiagnosed and difficult to treat. Inhaled cannabis may alleviate chronic neuropathic pain. Our objective was to synthesize the evidence on the use of inhaled cannabis for chronic neuropathic pain. We performed a systematic review and a meta-analysis of individual patient data. We registered our protocol with PROSPERO CRD42011001182. We searched in Cochrane Central, PubMed, EMBASE, and AMED. We considered all randomized controlled trials investigating chronic painful neuropathy and comparing inhaled cannabis with placebo. We pooled treatment effects following a hierarchical random-effects Bayesian responder model for the population-averaged subject-specific effect. Our evidence synthesis of individual patient data from 178 participants with 405 observed responses in 5 randomized controlled trials following patients for days to weeks provides evidence that inhaled cannabis results in short-term reductions in chronic neuropathic pain for 1 in every 5 to 6 patients treated (number needed to treat = 5.6 with a Bayesian 95% credible interval ranging between 3.4 and 14). Our inferences were insensitive to model assumptions, priors, and parameter choices. We caution that the small number of studies and participants, the short follow-up, shortcomings in allocation concealment, and considerable attrition limit the conclusions that can be drawn from the review. The Bayes factor is 332, corresponding to a posterior probability of effect of 99.7%. This novel Bayesian meta-analysis of individual patient data from 5 randomized trials suggests that inhaled cannabis may provide short-term relief for 1 in 5 to 6 patients with neuropathic pain. Pragmatic trials are needed to evaluate the long-term benefits and risks of this treatment. Copyright © 2015 American Pain Society. Published by Elsevier Inc. All rights reserved.
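The link between a responder-rate difference and an NNT with a credible interval can be sketched as follows (Python; the arm-level responder counts are hypothetical, not the trial data, and simple independent Beta posteriors replace the authors' hierarchical responder model):

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical responder counts on cannabis vs placebo (illustrative only)
resp_c, n_c = 35, 89
resp_p, n_p = 19, 89

# Beta(1, 1) priors; posterior draws of the response probabilities
p_c = rng.beta(1 + resp_c, 1 + n_c - resp_c, 100_000)
p_p = rng.beta(1 + resp_p, 1 + n_p - resp_p, 100_000)

rd = p_c - p_p                # risk difference per posterior draw
nnt = 1.0 / rd[rd > 0]        # NNT is defined where the treatment helps
print("median NNT:", round(float(np.median(nnt)), 1))
print("95% CrI for the risk difference:", np.percentile(rd, [2.5, 97.5]).round(3))
```

Propagating the full posterior of the risk difference into the NNT is what yields an asymmetric interval like the paper's 3.4 to 14, since NNT is a reciprocal of a quantity near zero.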
Estimating Tree Height-Diameter Models with the Bayesian Method
Duan, Aiguo; Zhang, Jianguo; Xiang, Congwei
2014-01-01
Six candidate height-diameter models were used to analyze the height-diameter relationships. The common methods for estimating the height-diameter models have taken the classical (frequentist) approach based on the frequency interpretation of probability, for example, the nonlinear least squares method (NLS) and the maximum likelihood method (ML). The Bayesian method has an exclusive advantage compared with classical method that the parameters to be estimated are regarded as random variables. In this study, the classical and Bayesian methods were used to estimate six height-diameter models, respectively. Both the classical method and Bayesian method showed that the Weibull model was the “best” model using data1. In addition, based on the Weibull model, data2 was used for comparing Bayesian method with informative priors with uninformative priors and classical method. The results showed that the improvement in prediction accuracy with Bayesian method led to narrower confidence bands of predicted value in comparison to that for the classical method, and the credible bands of parameters with informative priors were also narrower than uninformative priors and classical method. The estimated posterior distributions for parameters can be set as new priors in estimating the parameters using data2. PMID:24711733
Giacoppo, Daniele; Gargiulo, Giuseppe; Aruta, Patrizia; Capranzano, Piera; Tamburino, Corrado
2015-01-01
Study question What is the safest and most effective interventional treatment for coronary in-stent restenosis? Methods In a hierarchical Bayesian network meta-analysis, PubMed, Embase, Scopus, Cochrane Library, Web of Science, ScienceDirect, and major scientific websites were screened up to 10 August 2015. Randomised controlled trials of patients with any type of coronary in-stent restenosis (either of bare metal stents or drug eluting stents; and either first or recurrent instances) were included. Trials including multiple treatments at the same time in the same group or comparing variants of the same intervention were excluded. Primary endpoints were target lesion revascularisation and late lumen loss, both at six to 12 months. The main analysis was complemented by network subanalyses, standard pairwise comparisons, and subgroup and sensitivity analyses. Study answer and limitations Twenty four trials (4880 patients), including seven interventional treatments, were identified. Compared with plain balloons, bare metal stents, brachytherapy, rotational atherectomy, and cutting balloons, drug coated balloons and drug eluting stents were associated with a reduced risk of target lesion revascularisation and major adverse cardiac events, and with reduced late lumen loss. Treatment ranking indicated that drug eluting stents had the highest probability (61.4%) of being the most effective for target lesion revascularisation; drug coated balloons were similarly indicated as the most effective treatment for late lumen loss (probability 70.3%). The comparative efficacy of drug coated balloons and drug eluting stents was similar for target lesion revascularisation (summary odds ratio 1.10, 95% credible interval 0.59 to 2.01) and late lumen loss reduction (mean difference in minimum lumen diameter 0.04 mm, 95% credible interval −0.20 to 0.10). 
Risks of death, myocardial infarction, and stent thrombosis were comparable across all treatments, but these analyses were limited by a low number of events. Trials were heterogeneous in investigation periods, baseline characteristics, and endpoint reporting, and lacked information at long-term follow-up. Direct and indirect evidence was also inconsistent for the comparison between drug eluting stents and drug coated balloons. What this study adds Compared with other currently available interventional treatments for coronary in-stent restenosis, drug coated balloons and drug eluting stents are associated with superior clinical and angiographic outcomes, with similar comparative efficacy. Funding, competing interests, data sharing This study received no external funding. The authors declare no competing interests. No additional data available. PMID:26537292
2014 Gulf of Mexico Hypoxia Forecast
Scavia, Donald; Evans, Mary Anne; Obenour, Dan
2014-01-01
The Gulf of Mexico annual summer hypoxia forecasts are based on average May total nitrogen loads from the Mississippi River basin for that year. The load estimate, recently released by USGS, is 4,761 metric tons per day. Based on that estimate, we predict the area of this summer’s hypoxic zone to be 14,000 square kilometers (95% credible interval, 8,000 to 20,000) – an “average year”. Our forecast hypoxic volume is 50 km3 (95% credible interval, 20 to 77).
Consistency between direct and indirect trial evidence: is direct evidence always more reliable?
Madan, Jason; Stevenson, Matt D; Cooper, Katy L; Ades, A E; Whyte, Sophie; Akehurst, Ron
2011-01-01
To present a case study involving the reduction in incidence of febrile neutropenia (FN) after chemotherapy with granulocyte colony-stimulating factors (G-CSFs), illustrating difficulties that may arise when following the common preference for direct evidence over indirect evidence. Evidence of the efficacy of treatments was identified from two previous systematic reviews. We used Bayesian evidence synthesis to estimate relative treatment effects based on direct evidence, indirect evidence, and both pooled together. We checked for inconsistency between direct and indirect evidence and explored the role of one specific trial using cross-validation. A subsequent review identified further studies not available at the time of the original analysis. We repeated the analyses on the enlarged evidence base. We found substantial inconsistency in the original evidence base. The median odds ratio of FN for primary pegfilgrastim versus no primary G-CSF was 0.06 (95% credible interval: 0.02-0.19) based on direct evidence, but 0.27 (95% credible interval: 0.13-0.53) based on indirect evidence (P value for consistency hypothesis 0.027). The additional trials were consistent with the earlier indirect, rather than the direct, evidence, and there was no inconsistency between direct and indirect estimates in the updated evidence. The earlier inconsistency was due to one trial comparing primary pegfilgrastim with no primary G-CSF. Predictive cross-validation showed that this study was inconsistent with the evidence as a whole and with other trials making this comparison. Both the Cochrane Handbook and the NICE Methods Guide express a preference for direct evidence. A more robust strategy, which is in line with the accepted principles of evidence synthesis, would be to combine all relevant and appropriate information, whether direct or indirect. Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
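The direct-indirect conflict the authors report can be illustrated with a Bucher-style check on the log odds-ratio scale, backing out approximate standard errors from the 95% intervals quoted in the abstract. This frequentist approximation is not the paper's Bayesian cross-validation; it merely reproduces the same inconsistency signal from the published summaries.

```python
import math

def se_from_ci(lo, hi):
    """Approximate the standard error of a log odds ratio from its 95% interval."""
    return (math.log(hi) - math.log(lo)) / (2 * 1.96)

# Estimates reported in the abstract (primary pegfilgrastim vs no primary G-CSF):
direct = (0.06, 0.02, 0.19)     # OR and 95% credible interval, direct evidence
indirect = (0.27, 0.13, 0.53)   # OR and 95% credible interval, indirect evidence

d_log, d_se = math.log(direct[0]), se_from_ci(direct[1], direct[2])
i_log, i_se = math.log(indirect[0]), se_from_ci(indirect[1], indirect[2])

# Bucher-style inconsistency test: difference of log ORs over the pooled SE.
z = (d_log - i_log) / math.sqrt(d_se ** 2 + i_se ** 2)
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
print(f"z = {z:.2f}, two-sided p = {p:.3f}")
```

The resulting p-value is close to the 0.027 the authors report for the consistency hypothesis, which is reassuring given that only interval summaries were used here.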
New insights into faster computation of uncertainties
NASA Astrophysics Data System (ADS)
Bhattacharya, Atreyee
2012-11-01
Heavy computation power, lengthy simulations, and an exhaustive number of model runs—often these seem like the only statistical tools that scientists have at their disposal when computing uncertainties associated with predictions, particularly in cases of environmental processes such as groundwater movement. However, calculation of uncertainties need not be as lengthy, a new study shows. Comparing two approaches—the classical Bayesian “credible interval” and a less commonly used regression-based “confidence interval” method—Lu et al. show that for many practical purposes both methods provide similar estimates of uncertainties. The advantage of the regression method is that it demands 10-1000 model runs, whereas the classical Bayesian approach requires 10,000 to millions of model runs.
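The agreement between the two interval types can be seen on a toy problem. In the sketch below a simple linear model with known noise stands in for a calibrated groundwater model (an assumption for the demo, not Lu et al.'s setup): with a flat prior, the Monte Carlo credible interval for the slope essentially coincides with the regression confidence interval.

```python
import math, random

random.seed(0)

# Toy stand-in for a calibrated model: y = 2 + 0.5 x with known noise sigma.
n, sigma = 200, 1.0
x = [random.uniform(0, 10) for _ in range(n)]
y = [2.0 + 0.5 * xi + random.gauss(0, sigma) for xi in x]

xbar = sum(x) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
slope = sum((xi - xbar) * yi for xi, yi in zip(x, y)) / sxx
se_slope = sigma / math.sqrt(sxx)

# Regression-based 95% confidence interval for the slope.
conf = (slope - 1.96 * se_slope, slope + 1.96 * se_slope)

# Bayesian credible interval: with a flat prior and known sigma the posterior is
# Normal(slope, se_slope^2); sample it as a (deliberately cheap) Monte Carlo posterior.
draws = sorted(random.gauss(slope, se_slope) for _ in range(20000))
cred = (draws[500], draws[19500])      # 2.5% and 97.5% sample quantiles

print("confidence:", conf)
print("credible:  ", cred)
```

In nonlinear, high-dimensional settings the two intervals need not match this closely, which is exactly the comparison the study carries out at scale.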
Statistical surrogate models for prediction of high-consequence climate change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Constantine, Paul; Field, Richard V., Jr.; Boslough, Mark Bruce Elrick
2011-09-01
In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. A SSM is different than a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, the realization of a large number of independent model outputs from a SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.
Chong, Ka Chun; Zee, Benny Chung Ying; Wang, Maggie Haitian
2018-04-10
In an influenza pandemic, arrival times of cases are a proxy of the epidemic size and disease transmissibility. Because of intense surveillance of travelers from infected countries, detection is more rapid and complete than in local surveillance. Travel information can therefore provide a more reliable estimation of transmission parameters. We developed an Approximate Bayesian Computation algorithm to estimate the basic reproduction number (R0) in addition to the reporting rate and unobserved epidemic start time, utilizing travel and routine surveillance data in an influenza pandemic. A simulation was conducted to assess the sampling uncertainty. The estimation approach was further applied to the 2009 influenza A/H1N1 pandemic in Mexico as a case study. In the simulations, we showed that the estimation approach was valid and reliable in different simulation settings. We also found estimates of R0 and the reporting rate to be 1.37 (95% Credible Interval [CI]: 1.26-1.42) and 4.9% (95% CI: 0.1%-18%), respectively, in the 2009 influenza pandemic in Mexico, which were robust to variations in the fixed parameters. The estimated R0 was consistent with that in the literature. This method is useful for officials to obtain reliable estimates of disease transmissibility for strategic planning. We suggest that improvements to the flow of reporting for confirmed cases among patients arriving at different countries are required. Copyright © 2018 Elsevier Ltd. All rights reserved.
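The core of Approximate Bayesian Computation can be sketched with a rejection sampler on a toy branching-process outbreak. Everything here is an illustrative assumption (the five seed cases, the eight generations, the observed total of 250, the uniform prior), not the Mexico surveillance data or the authors' full algorithm, which also estimates the reporting rate and start time.

```python
import math, random

random.seed(7)

def rpois(lam):
    """Poisson draw; normal approximation for large means to stay in pure Python."""
    if lam <= 0:
        return 0
    if lam > 30:
        return max(0, int(round(random.gauss(lam, math.sqrt(lam)))))
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def outbreak_size(r0, seeds=5, generations=8, cap=5000):
    """Cumulative cases of a simple Poisson branching process."""
    total, current = seeds, seeds
    for _ in range(generations):
        current = rpois(r0 * current)
        total += current
        if total > cap:
            break
    return total

obs_total = 250   # assumed observed cumulative case count (illustrative only)

# ABC rejection: draw R0 from the prior, keep draws whose simulated outbreak
# size lands within 10% of the observed summary statistic.
accepted = []
for _ in range(20000):
    r0 = random.uniform(0.5, 3.0)
    if abs(outbreak_size(r0) - obs_total) <= 25:
        accepted.append(r0)

accepted.sort()
post_mean = sum(accepted) / len(accepted)
cri = (accepted[int(0.025 * len(accepted))], accepted[int(0.975 * len(accepted))])
print(f"accepted {len(accepted)} draws; posterior mean R0 = {post_mean:.2f}, "
      f"95% CrI = ({cri[0]:.2f}, {cri[1]:.2f})")
```

Replacing the crude tolerance band with a sequential or regression-adjusted ABC scheme would tighten the posterior, but the accept/reject skeleton is the same.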
Bayesian Source Attribution of Salmonellosis in South Australia.
Glass, K; Fearnley, E; Hocking, H; Raupach, J; Veitch, M; Ford, L; Kirk, M D
2016-03-01
Salmonellosis is a significant cause of foodborne gastroenteritis in Australia, and rates of illness have increased over recent years. We adopt a Bayesian source attribution model to estimate the contribution of different animal reservoirs to illness due to Salmonella spp. in South Australia between 2000 and 2010, together with 95% credible intervals (CrI). We excluded known travel associated cases and those of rare subtypes (fewer than 20 human cases or fewer than 10 isolates from included sources over the 11-year period), and the remaining 76% of cases were classified as sporadic or outbreak associated. Source-related parameters were included to allow for different handling and consumption practices. We attributed 35% (95% CrI: 20-49) of sporadic cases to chicken meat and 37% (95% CrI: 23-53) of sporadic cases to eggs. Of outbreak-related cases, 33% (95% CrI: 20-62) were attributed to chicken meat and 59% (95% CrI: 29-75) to eggs. A comparison of alternative model assumptions indicated that biases due to possible clustering of samples from sources had relatively minor effects on these estimates. Analysis of source-related parameters showed higher risk of illness from contaminated eggs than from contaminated chicken meat, suggesting that consumption and handling practices potentially play a bigger role in illness due to eggs, considering low Salmonella prevalence on eggs. Our results strengthen the evidence that eggs and chicken meat are important vehicles for salmonellosis in South Australia. © 2015 Society for Risk Analysis.
Koons, David N; Colchero, Fernando; Hersey, Kent; Gimenez, Olivier
2015-06-01
Understanding the relative effects of climate, harvest, and density dependence on population dynamics is critical for guiding sound population management, especially for ungulates in arid and semiarid environments experiencing climate change. To address these issues for bison in southern Utah, USA, we applied a Bayesian state-space model to a 72-yr time series of abundance counts. While accounting for known harvest (as well as live removal) from the population, we found that the bison population in southern Utah exhibited a strong potential to grow from low density (β0 = 0.26; Bayesian credible interval based on 95% of the highest posterior density [BCI] = 0.19-0.33), and weak but statistically significant density dependence (β1 = -0.02, BCI = -0.04 to -0.004). Early spring temperatures also had strong positive effects on population growth (Pfat1 = 0.09, BCI = 0.04-0.14), much more so than precipitation and other temperature-related variables (model weight > three times more than that for other climate variables). Although we hypothesized that harvest is the primary driving force of bison population dynamics in southern Utah, our elasticity analysis indicated that changes in early spring temperature could have a greater relative effect on equilibrium abundance than either harvest or the strength of density dependence. Our findings highlight the utility of incorporating elasticity analyses into state-space population models, and the need to include climatic processes in wildlife management policies and planning.
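The density-dependence structure behind estimates like β0 and β1 can be sketched with a Ricker-type growth model. The sketch below simulates a harvested population and recovers the growth parameters by ordinary least squares; the parameter values, harvest rule, and noise level are illustrative assumptions, and OLS is a deliberate simplification of the authors' Bayesian state-space model (which also separates process from observation error).

```python
import math, random

random.seed(3)

# Simulate a harvested population with Ricker-type density dependence:
# log(N[t+1 incl. harvest] / N[t]) = b0 + b1 * N[t] + process noise.
b0, b1, years = 0.26, -0.002, 72
N = [50.0]
harvest = []
for t in range(years - 1):
    growth = math.exp(b0 + b1 * N[t] + random.gauss(0, 0.05))
    h = 0.05 * N[t]                 # known removals, analogous to recorded harvest
    harvest.append(h)
    N.append(max(1.0, N[t] * growth - h))

# Recover (b0, b1): regress log growth (with known harvest added back) on abundance.
xs = N[:-1]
ys = [math.log((N[t + 1] + harvest[t]) / N[t]) for t in range(years - 1)]
xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
b1_hat = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
          / sum((x - xbar) ** 2 for x in xs))
b0_hat = ybar - b1_hat * xbar
print(f"b0_hat = {b0_hat:.3f}, b1_hat = {b1_hat:.4f}")
```

A positive intercept with a small negative slope, as recovered here, is exactly the "strong growth from low density, weak density dependence" pattern the abstract reports.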
Fablet, C; Rose, N; Bernard, C; Messager, I; Piel, Y; Grasland, B
2017-11-01
This study was designed to assess the diagnostic characteristics of two PCV2 ELISAs without a gold standard. Four hundred and sixty-five serum samples from finishing pigs (25 herds) not vaccinated against PCV2 were used. Samples were tested by two ELISAs: an in-house ELISA (I-ELISA) and the commercial SERELISA ® PCV2 Ab Mono Blocking kit (S-ELISA). A ROC curve was used to assess the S-ELISA's optimal threshold by taking the I-ELISA as a reference and using the cut-off previously determined by comparison to an immunoperoxidase monolayer assay (IPMA). This led to an S-ELISA result ≥170 being considered as positive. The sensitivity (Se) and specificity (Sp) of each ELISA were then estimated without a gold standard using a Bayesian approach. The mean Se and Sp values of the I-ELISA were slightly higher than those of the S-ELISA (mean Se I-ELISA=0.90 vs. mean Se S-ELISA=0.86; mean Sp I-ELISA=0.92 vs. mean Sp S-ELISA=0.85). However, the 95% credibility intervals (CI95%) overlapped (Se I-ELISA CI95%=0.85-0.95 vs. Se S-ELISA CI95%=0.82-0.90; Sp I-ELISA CI95%=0.82-0.98 vs. Sp S-ELISA CI95%=0.75-0.94). Both ELISAs appeared to be valuable tools for detecting PCV2 antibodies. Copyright © 2017 Elsevier B.V. All rights reserved.
A Bayesian analysis of trends in ozone sounding data series from 9 Nordic stations
NASA Astrophysics Data System (ADS)
Christiansen, Bo; Jepsen, Nis; Larsen, Niels; Korsholm, Ulrik S.
2016-04-01
Ozone soundings from 9 Nordic stations have been homogenized and interpolated to standard pressure levels. The different stations have very different data coverage; the longest period with data runs from the late 1980s to 2013. We apply a model that includes low-frequency variability in the form of a polynomial, an annual cycle with harmonics, the possibility of low-frequency variability in the annual amplitude and phasing, and either white noise or AR1 noise. The fitting of the parameters is performed with a Bayesian approach, giving not only the posterior mean values but also credible intervals. We find that all stations agree on a well-defined annual cycle in the free troposphere with a relatively confined maximum in early summer. Regarding the low-frequency variability, we find that Scoresbysund, Ny Aalesund, and Sodankyla show similar structures with a maximum near 2005 followed by a decrease. However, these results are only weakly significant. A significant change in the amplitude of the annual cycle was only found for Ny Aalesund. Here the peak-to-peak amplitude changes from 0.9 to 0.8 mhPa between 1995-2000 and 2007-2012. The results are shown to be robust to the different settings of the model parameters (order of the polynomial, number of harmonics in the annual cycle, type of noise, etc.). The results are also shown to be characteristic of all pressure levels in the free troposphere.
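The trend-plus-annual-cycle decomposition can be sketched by least squares: detrend with a first-order polynomial, then project the residual onto one annual harmonic to recover its amplitude. The simulated monthly series, its trend, and its amplitude below are illustrative assumptions, not the sonde data, and the sketch omits the Bayesian machinery (priors, AR1 noise, time-varying amplitude) that the paper uses to get credible intervals.

```python
import math, random

random.seed(1)

# Illustrative monthly ozone-like series sampled over 25 years:
# linear trend plus an annual harmonic plus white noise.
n = 300
t = [i / 12 for i in range(n)]                  # time in years
y = [30 + 0.1 * ti
     + 4 * math.cos(2 * math.pi * ti) + 2 * math.sin(2 * math.pi * ti)
     + random.gauss(0, 1) for ti in t]

# Step 1: remove the low-frequency part with a simple OLS line
# (a first-order version of the abstract's polynomial term).
tbar, ybar = sum(t) / n, sum(y) / n
slope = (sum((a - tbar) * (b - ybar) for a, b in zip(t, y))
         / sum((a - tbar) ** 2 for a in t))
resid = [b - (ybar + slope * (a - tbar)) for a, b in zip(t, y)]

# Step 2: project the residual onto the annual harmonic to get its amplitude.
A = 2 / n * sum(r * math.cos(2 * math.pi * a) for a, r in zip(t, resid))
B = 2 / n * sum(r * math.sin(2 * math.pi * a) for a, r in zip(t, resid))
amplitude = math.hypot(A, B)
print(f"trend = {slope:.3f}/yr, annual amplitude = {amplitude:.2f}")
```

Fitting the same projection in two sub-periods is the natural point estimate behind the paper's test for a change in annual amplitude.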
Eilstein, Daniel; Uhry, Zoé; Lim, Tek-Ang; Bloch, Juliette
2008-03-01
Lung cancer is currently the most common cancer in the world and as such is an important public health concern. One of the main challenges is to foresee the evolution of trends in lung cancer mortality rates in order to anticipate the future burden of this disease as well as to plan the supply of adequate health care. The aim of this study is to propose a quantification of future lung cancer mortality rates by gender in France until the year 2012. Lung cancer mortality data in France (1978-2002) were extracted from the National Statistics of Death and analyzed by 5-year age-groups and periods, using a Bayesian age-period-cohort model. Between 1978 and 2002, the female lung cancer mortality rate rose by 3.3% per year. For men, a slow increase is observed until 1988-1992, followed by a declining trend. In 1998-2002, age-standardized mortality rates were, respectively, 45.5 and 7.6 per 100000 for males and females. By 2008-2012 these figures would reach 40.8 (95% credibility interval (CI): 32.7, 50.0) and 12.1 (CI: 11.7, 12.6) per 100000, respectively, which represents a 4.7% annual increase among women (CI: 4.5, 5.0). Our results highlight the relevance of pursuing public health measures in order to cope more actively with tobacco smoking in the prevention strategy against lung cancer, specifically among women.
Espino-Hernandez, Gabriela; Gustafson, Paul; Burstyn, Igor
2011-05-14
In epidemiological studies explanatory variables are frequently subject to measurement error. The aim of this paper is to develop a Bayesian method to correct for measurement error in multiple continuous exposures in individually matched case-control studies. This is a topic that has not been widely investigated. The new method is illustrated using data from an individually matched case-control study of the association between thyroid hormone levels during pregnancy and exposure to perfluorinated acids. The objective of the motivating study was to examine the risk of maternal hypothyroxinemia due to exposure to three perfluorinated acids measured on a continuous scale. Results from the proposed method are compared with those obtained from a naive analysis. Using a Bayesian approach, the developed method considers a classical measurement error model for the exposures, as well as the conditional logistic regression likelihood as the disease model, together with a random-effect exposure model. Proper and diffuse prior distributions are assigned, and results from a quality control experiment are used to estimate the perfluorinated acids' measurement error variability. As a result, posterior distributions and 95% credible intervals of the odds ratios are computed. A sensitivity analysis of the method's performance in this particular application with different measurement error variability was performed. The proposed Bayesian method to correct for measurement error is feasible and can be implemented using statistical software. For the study on perfluorinated acids, a comparison of the inferences which are corrected for measurement error to those which ignore it indicates that little adjustment is manifested for the level of measurement error actually exhibited in the exposures. Nevertheless, a sensitivity analysis shows that more substantial adjustments arise if larger measurement errors are assumed. 
In individually matched case-control studies, the use of conditional logistic regression likelihood as a disease model in the presence of measurement error in multiple continuous exposures can be justified by having a random-effect exposure model. The proposed method can be successfully implemented in WinBUGS to correct individually matched case-control studies for several mismeasured continuous exposures under a classical measurement error model.
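The phenomenon being corrected can be shown in miniature: under the classical error model, regressing an outcome on a noisily measured exposure attenuates the coefficient by the reliability factor λ = σx²/(σx² + σu²), and dividing by λ undoes it. The sketch below is a regression-calibration toy with assumed variances and a linear outcome, not the paper's Bayesian conditional-logistic model.

```python
import random

random.seed(5)

# Classical measurement error: we observe w = x + u instead of the true exposure x.
n = 5000
sigma_x, sigma_u, slope = 1.0, 0.7, 0.8
x = [random.gauss(0, sigma_x) for _ in range(n)]
y = [slope * xi + random.gauss(0, 0.5) for xi in x]
w = [xi + random.gauss(0, sigma_u) for xi in x]

def ols_slope(xs, ys):
    xb, yb = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((a - xb) * (b - yb) for a, b in zip(xs, ys))
            / sum((a - xb) ** 2 for a in xs))

naive = ols_slope(w, y)                                      # attenuated estimate
reliability = sigma_x ** 2 / (sigma_x ** 2 + sigma_u ** 2)   # attenuation factor
corrected = naive / reliability                              # error variance assumed known
print(f"naive = {naive:.3f}, corrected = {corrected:.3f} (truth {slope})")
```

In practice σu² is not known and must be estimated, e.g. from a quality control experiment as in the paper, which is what motivates propagating that extra uncertainty through a Bayesian model.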
Persichetti, Maria Flaminia; Solano-Gallego, Laia; Vullo, Angela; Masucci, Marisa; Marty, Pierre; Delaunay, Pascal; Vitale, Fabrizio; Pennisi, Maria Grazia
2017-03-13
Anti-Leishmania antibodies are increasingly investigated in cats for epidemiological studies or for the diagnosis of clinical feline leishmaniosis. The immunofluorescent antibody test (IFAT), the enzyme-linked immunosorbent assay (ELISA) and western blot (WB) are the serological tests most frequently used. The aim of the present study was to assess the diagnostic performance of IFAT, ELISA and WB to detect anti-L. infantum antibodies in feline serum samples obtained from endemic (n = 76) and non-endemic (n = 64) areas and from cats affected by feline leishmaniosis (n = 21) by a Bayesian approach without a gold standard. Cut-offs were set at 80 titre for IFAT and 40 ELISA units for ELISA. WB was considered positive in the presence of at least an 18 kDa band. Statistical analysis was performed through a written routine with MATLAB software in the Bayesian framework. The latent data and observations from the joint posterior were simulated in the Bayesian approach by an iterative Markov Chain Monte Carlo technique using the Gibbs sampler for estimating sensitivity and specificity of the three tests. The median seroprevalence in the sample used for evaluating the performance of tests was estimated at 0.27 [credible interval (CI) = 0.20-0.34]. The median sensitivity of the three different methods was 0.97 (CI: 0.86-1.00), 0.75 (CI: 0.61-0.87) and 0.70 (CI: 0.56-0.83) for WB, IFAT and ELISA, respectively. Median specificity reached 0.99 (CI: 0.96-1.00) with WB, 0.97 (CI: 0.93-0.99) with IFAT and 0.98 (CI: 0.94-1.00) with ELISA. IFAT was more sensitive than ELISA (75 vs 70%) for the detection of subclinical infection while ELISA was better for diagnosing clinical leishmaniosis when compared with IFAT (98 vs 97%). The overall performance of all serological techniques was good and the most accurate test for anti-Leishmania antibody detection in feline serum samples was WB.
National spatial and temporal patterns of notified dengue cases, Colombia 2007-2010.
Restrepo, Angela Cadavid; Baker, Peter; Clements, Archie C A
2014-07-01
To explore the variation in the spatial distribution of notified dengue cases in Colombia from January 2007 to December 2010 and examine associations between the disease and selected environmental risk factors. Data on the number of notified dengue cases in Colombia were obtained from the National Institute of Health (Instituto Nacional de Salud - INS) for the period 1 January 2007 through 31 December 2010. Data on environmental factors were collected from the Worldclim website. A Bayesian spatio-temporal conditional autoregressive model was used to quantify the relationship between monthly dengue cases and temperature, precipitation and elevation. Monthly dengue counts decreased by 18% (95% credible interval (CrI): 17-19%) in 2008 and increased by 30% (95% CrI: 28-31%) and 326% (95% CrI: 322-331%) in 2009 and 2010, respectively, compared to 2007. Additionally, there was a significant, nonlinear effect of monthly average precipitation. The results highlight the role of environmental risk factors in determining the spatial distribution of dengue and show how these factors can be used to develop and refine preventive approaches for dengue in Colombia. © 2014 John Wiley & Sons Ltd.
Kim, Hea-Jung
2014-01-01
This paper considers a hierarchical screened Gaussian model (HSGM) for Bayesian inference of normal models when an interval constraint in the mean parameter space needs to be incorporated in the modeling but when such a restriction is uncertain. An objective measure of the uncertainty, regarding the interval constraint, accounted for by using the HSGM is proposed for the Bayesian inference. For this purpose, we derive a maximum entropy prior of the normal mean, eliciting the uncertainty regarding the interval constraint, and then obtain the uncertainty measure by considering the relationship between the maximum entropy prior and the marginal prior of the normal mean in the HSGM. A Bayesian estimation procedure for the HSGM is developed, and two numerical illustrations pertaining to the properties of the uncertainty measure are provided.
2013 Gulf of Mexico Hypoxia Forecast
Scavia, Donald; Evans, Mary Anne; Obenour, Dan
2013-01-01
The Gulf of Mexico annual summer hypoxia forecasts are based on average May total nitrogen loads from the Mississippi River basin for that year. The load estimate, recently released by USGS, is 7,316 metric tons per day. Based on that estimate, we predict the area of this summer’s hypoxic zone to be 18,900 square kilometers (95% credible interval, 13,400 to 24,200), the 7th largest reported and about the size of New Jersey. Our forecast hypoxic volume is 74.5 km3 (95% credible interval, 51.5 to 97.0), also the 7th largest on record.
Bayesian Parameter Inference and Model Selection by Population Annealing in Systems Biology
Murakami, Yohei
2014-01-01
Parameter inference and model selection are very important for mathematical modeling in systems biology. Bayesian statistics can be used to conduct both parameter inference and model selection. In particular, the framework named approximate Bayesian computation is often used for parameter inference and model selection in systems biology. However, Monte Carlo methods need to be used to compute Bayesian posterior distributions. In addition, the posterior distributions of parameters are sometimes almost uniform or very similar to their prior distributions. In such cases, it is difficult to choose one specific value of a parameter with high credibility as the representative value of the distribution. To overcome these problems, we introduced one of the population Monte Carlo algorithms, population annealing. Although population annealing is usually used in statistical mechanics, we showed that population annealing can be used to compute Bayesian posterior distributions in the approximate Bayesian computation framework. To deal with the non-identifiability of the representative values of parameters, we proposed running the simulations with the parameter ensemble sampled from the posterior distribution, named the “posterior parameter ensemble”. We showed that population annealing is an efficient and convenient algorithm for generating a posterior parameter ensemble. We also showed that simulations with the posterior parameter ensemble can not only reproduce the data used for parameter inference but also capture and predict data that were not used for parameter inference. Lastly, we introduced the marginal likelihood in the approximate Bayesian computation framework for Bayesian model selection. We showed that population annealing enables us to compute the marginal likelihood in the approximate Bayesian computation framework and conduct model selection based on the Bayes factor. PMID:25089832
Bayesian phylogenetic estimation of fossil ages.
Drummond, Alexei J; Stadler, Tanja
2016-07-19
Recent advances have allowed for both morphological fossil evidence and molecular sequences to be integrated into a single combined inference of divergence dates under the rule of Bayesian probability. In particular, the fossilized birth-death tree prior and the Lewis-Mk model of discrete morphological evolution allow for the estimation of both divergence times and phylogenetic relationships between fossil and extant taxa. We exploit this statistical framework to investigate the internal consistency of these models by producing phylogenetic estimates of the age of each fossil in turn, within two rich and well-characterized datasets of fossil and extant species (penguins and canids). We find that the estimation accuracy of fossil ages is generally high with credible intervals seldom excluding the true age and median relative error in the two datasets of 5.7% and 13.2%, respectively. The median relative standard error (RSD) was 9.2% and 7.2%, respectively, suggesting good precision, although with some outliers. In fact, in the two datasets we analyse, the phylogenetic estimate of fossil age is on average less than 2 Myr from the mid-point age of the geological strata from which it was excavated. The high level of internal consistency found in our analyses suggests that the Bayesian statistical model employed is an adequate fit for both the geological and morphological data, and provides evidence from real data that the framework used can accurately model the evolution of discrete morphological traits coded from fossil and extant taxa. 
We anticipate that this approach will have diverse applications beyond divergence time dating, including dating fossils that are temporally unconstrained, testing of the 'morphological clock', and for uncovering potential model misspecification and/or data errors when controversial phylogenetic hypotheses are obtained based on combined divergence dating analyses.This article is part of the themed issue 'Dating species divergences using rocks and clocks'. © 2016 The Authors.
Blacksell, Stuart D.; Tanganuchitcharnchai, Ampai; Jintaworn, Suthatip; Kantipong, Pacharee; Richards, Allen L.; Day, Nicholas P. J.
2016-01-01
The enzyme-linked immunosorbent assay (ELISA) has been proposed as an alternative serologic diagnostic test to the indirect immunofluorescence assay (IFA) for scrub typhus. Here, we systematically determine the optimal sample dilution and cutoff optical density (OD) and estimate the accuracy of IgM ELISA using Bayesian latent class models (LCMs). Data from 135 patients with undifferentiated fever were reevaluated using Bayesian LCMs. Every patient was evaluated for the presence of an eschar and tested with a blood culture for Orientia tsutsugamushi, three different PCR assays, and an IgM IFA. The IgM ELISA was performed for every sample at sample dilutions from 1:100 to 1:102,400 using crude whole-cell antigens of the Karp, Kato, and Gilliam strains of O. tsutsugamushi developed by the Naval Medical Research Center. We used Bayesian LCMs to generate unbiased receiver operating characteristic curves and found that the sample dilution of 1:400 was optimal for the IgM ELISA. With the optimal cutoff OD of 1.474 at a sample dilution of 1:400, the IgM ELISA had a sensitivity of 85.7% (95% credible interval [CrI], 77.4% to 86.7%) and a specificity of 98.1% (95% CrI, 97.2% to 100%) using paired samples. For the ELISA, the OD could be determined objectively and quickly, in contrast to the reading of IFA slides, which was both subjective and labor-intensive. The IgM ELISA for scrub typhus has high diagnostic accuracy and is less subjective than the IgM IFA. We suggest that the IgM ELISA may be used as an alternative reference test to the IgM IFA for the serological diagnosis of scrub typhus. PMID:27008880
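The cutoff-selection step can be sketched with a plain ROC scan using Youden's J. The simulated optical-density distributions below are illustrative assumptions, not the NMRC antigen data, and the sketch assumes known infection status, whereas the paper's point is precisely to avoid that assumption via Bayesian latent class models.

```python
import random

random.seed(11)

# Simulated ELISA optical densities: infected sera read higher than uninfected
# (illustrative distributions only).
pos = [random.gauss(1.8, 0.5) for _ in range(100)]
neg = [random.gauss(0.8, 0.4) for _ in range(200)]

def sens_spec(cutoff):
    sens = sum(v >= cutoff for v in pos) / len(pos)
    spec = sum(v < cutoff for v in neg) / len(neg)
    return sens, spec

# Scan candidate OD cutoffs and keep the one maximizing Youden's J = sens + spec - 1.
best = max((c / 100 for c in range(0, 301)), key=lambda c: sum(sens_spec(c)))
sens, spec = sens_spec(best)
print(f"optimal OD cutoff = {best:.2f}, sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```

In the latent class setting, the same scan is performed against model-estimated (rather than observed) disease status, which is how the study arrives at its unbiased ROC curve and the 1.474 OD cutoff.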
Buczinski, S; Ménard, J; Timsit, E
2016-07-01
Thoracic ultrasonography (TUS) is a specific and relatively sensitive method to diagnose bronchopneumonia (BP) in dairy calves. Unfortunately, as it requires specific training and equipment, veterinarians typically base their diagnosis on thoracic auscultation (AUSC), which is rapid and easy to perform. We hypothesized that the use of TUS, in addition to AUSC, can significantly increase the accuracy of BP diagnosis. Therefore, the objectives were to (i) determine the incremental value of TUS over AUSC for diagnosis of BP in preweaned dairy calves and (ii) assess the diagnostic accuracy of AUSC. Two hundred and nine dairy calves (<1 month of age) from a veal calf unit were enrolled in this prospective cross-sectional study. All calves were examined (independent operators) using the Wisconsin Calf Respiratory Scoring Criteria (CRSC), AUSC, and TUS. A Bayesian latent class approach was used to estimate the incremental value of TUS over AUSC (integrated discrimination improvement [IDI]) and the diagnostic accuracy of AUSC. Abnormal CRSC, AUSC, and TUS were recorded in 3.3, 53.1, and 23.9% of calves, respectively. AUSC was sensitive (72.9%; 95% Bayesian credible interval [BCI]: 50.1-96.4%) but not specific (53.3%; 95% BCI: 43.3-64.0%) for diagnosing BP. Compared with AUSC, TUS was more specific (92.9%; 95% BCI: 86.5-97.1%) but had similar sensitivity (76.5%; 95% BCI: 60.2-88.8%). The incremental value of TUS over AUSC was high (IDI = 43.7%; 95% BCI: 22.0-63.0%), significantly improving the proportions of sick and healthy calves appropriately classified. The use of TUS in addition to AUSC significantly improved the accuracy of BP diagnosis in dairy calves. Copyright © 2016 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
Angelidou, E; Kostoulas, P; Leontides, L
2014-02-01
We validated a commercial (Idexx Pourquier, Montpellier, France) serum and milk indirect ELISA that detects antibodies against Mycobacterium avium ssp. paratuberculosis (MAP) in Greek dairy goats. Each goat was sampled 4 times, starting from kidding and covering early, mid, and late lactation. A total of 1,268 paired milk (or colostrum) and serum samples were collected during the 7-mo lactation period. Bayesian latent class models, which allow for the continuous interpretation of test results, were used to derive the distribution of the serum and milk ELISA response for healthy and MAP-infected individuals at each lactation stage. Both serum and milk ELISA, in all lactation stages, had average and similar overall discriminatory ability as measured by the area under the curve (AUC). For each test, the smallest overlap between the distribution of the healthy and MAP-infected does was in late lactation. At this stage, the AUC was 0.89 (95% credible interval: 0.70; 0.98) and 0.92 (0.74; 0.99) for the milk and serum ELISA, respectively. Both tests had comparable sensitivities and specificities at the recommended cutoffs across lactation. Lowering the cutoffs led to an increase in sensitivity without serious loss in specificity. In conclusion, the milk ELISA was as accurate as the serum ELISA. Therefore, it could serve as the diagnostic tool of choice, especially during the implementation of MAP control programs that require frequent testing, because milk sampling is a noninvasive, rapid, and easy process. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Marchetta, Claire M; Devine, Owen J; Crider, Krista S; Tsang, Becky L; Cordero, Amy M; Qi, Yan Ping; Guo, Jing; Berry, Robert J; Rosenthal, Jorge; Mulinare, Joseph; Mersereau, Patricia; Hamner, Heather C
2015-04-10
Folate is found naturally in foods or as synthetic folic acid in dietary supplements and fortified foods. Adequate periconceptional folic acid intake can prevent neural tube defects. Folate intake impacts blood folate concentration; however, the dose-response between natural food folate and blood folate concentrations has not been well described. We estimated this association among healthy females. A systematic literature review identified studies (January 1992 to March 2014) with both natural food folate intake alone and blood folate concentration among females aged 12-49 years. Bayesian methods were used to estimate regression model parameters describing the association between natural food folate intake and subsequent blood folate concentration. Seven controlled trials and 29 observational studies met the inclusion criteria. For the six studies using microbiologic assay (MA) included in the meta-analysis, we estimate that a 6% (95% Credible Interval (CrI): 4%, 9%) increase in red blood cell (RBC) folate concentration and a 7% (95% CrI: 1%, 12%) increase in serum/plasma folate concentration can occur for every 10% increase in natural food folate intake. Using modeled results, we estimate that a natural food folate intake of ≥ 450 μg dietary folate equivalents (DFE)/day could achieve the lower bound of an RBC folate concentration (~ 1050 nmol/L) associated with the lowest risk of a neural tube defect. Natural food folate intake affects blood folate concentration and adequate intakes could help women achieve a RBC folate concentration associated with a risk of 6 neural tube defects/10,000 live births.
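The reported dose-response can be read as an elasticity in a log-log model; the sketch below back-calculates that elasticity from the 10%-intake/6%-RBC-folate figure. The log-log form is our simplifying assumption, not the paper's full Bayesian regression.

```python
import math

# Reading the dose-response as an elasticity: if a 10% increase in natural
# food folate intake yields a 6% increase in RBC folate concentration, a
# log-log model ln(RBC) = a + b*ln(intake) implies the slope b below. The
# log-log form is our simplifying assumption, not the paper's model.

b = math.log(1.06) / math.log(1.10)   # elasticity consistent with 10% -> 6%

# Implied relative change in RBC folate for a hypothetical 25% intake increase:
rel_change = 1.25 ** b - 1.0
```

Under this assumed functional form, b comes out near 0.61, so a 25% intake increase would imply roughly a 15% rise in RBC folate.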
Park, Sun-Kyeong; Lee, Min-Young; Jang, Eun-Jin; Kim, Hye-Lin; Ha, Dong-Mun; Lee, Eui-Kyung
2017-01-01
The purpose of this study was to compare the discontinuation rates of tofacitinib and biologics (tumour necrosis factor inhibitors (TNFi), abatacept, rituximab, and tocilizumab) in rheumatoid arthritis (RA) patients, considering inadequate responses (IRs) to previous treatment(s). Randomised controlled trials of tofacitinib and biologics reporting at least one of total discontinuation, discontinuation due to lack of efficacy (LOE), or discontinuation due to adverse events (AEs) were identified through systematic review. The analyses were conducted separately for patients with IRs to conventional synthetic disease-modifying anti-rheumatic drugs (cDMARDs) and for patients with biologics-IR. Bayesian network meta-analysis was used to estimate the rate ratio (RR) of a biologic relative to tofacitinib with a 95% credible interval (CrI), and the probability of RR being <1 (P[RR<1]). The analyses of 34 studies showed no significant differences in discontinuation rates between tofacitinib and biologics in the cDMARDs-IR group. In the biologics-IR group, however, TNFi (RR 0.17, 95% CrI 0.01-3.61, P[RR<1] 92.0%) and rituximab (RR 0.20, 95% CrI 0.01-2.91, P[RR<1] 92.3%) showed lower total discontinuation rates than tofacitinib did, although the 95% CrIs included 1. For discontinuation owing to LOE and AEs, tofacitinib was comparable to the biologics. The comparability of discontinuation rates between tofacitinib and biologics differed according to previous treatments and discontinuation reasons: LOE, AEs, and total (due to other reasons). Those factors therefore need to be considered when deciding the optimal treatment strategy.
Buczinski, Sébastien; L Ollivett, Terri; Dendukuri, Nandini
2015-05-01
There is currently no gold standard method for the diagnosis of bovine respiratory disease (BRD) complex in Holstein pre-weaned dairy calves. Systematic thoracic ultrasonography (TUS) has been used as a proxy for BRD, but cannot be directly used by producers. The Wisconsin calf respiratory scoring chart (CRSC) is a simpler alternative, but with unknown accuracy. Our objective was to estimate the accuracy of CRSC, while adjusting for the lack of a gold standard. Two cross sectional study populations with a high BRD prevalence (n=106 pre-weaned Holstein calves) and an average BRD prevalence (n=85 pre-weaned Holstein calves) from North America were studied. All calves were simultaneously assessed using CRSC (cutoff used ≥ 5) and TUS (cutoff used ≥ 1cm of lung consolidation). Bayesian latent class models allowing for conditional dependence were used with informative priors for BRD prevalence and TUS accuracy (sensitivity (Se) and specificity (Sp)) and non-informative priors for CRSC accuracies. Robustness of the model was tested by relaxing priors for prevalence or TUS accuracy. The SeCRSC (95% credible interval (CI)) and SpCRSC were 62.4% (47.9-75.8) and 74.1% (64.9-82.8) respectively. The SeTUS was 79.4% (66.4-90.9) and SpTUS was 93.9% (88.0-97.6). The imperfect accuracy of CRSC and TUS should be taken into account when using those tools to assess BRD status. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Butler, Rhett; Frazer, L. Neil; Templeton, William J.
2016-05-01
We use the global rate of Mw ≥ 9.0 earthquakes, and standard Bayesian procedures, to estimate the probability of such mega events in the Aleutian Islands, where they pose a significant risk to Hawai'i. We find that the probability of such an earthquake along the Aleutian island arc is 6.5% to 12% over the next 50 years (50% credibility interval) and that the annualized risk to Hawai'i is about $30 M. Our method (the regionally scaled global rate method, or RSGR) is to scale the global rate of Mw 9.0+ events in proportion to the fraction of global subduction (units of area per year) that takes place in the Aleutians. The RSGR method assumes that Mw 9.0+ events are a Poisson process with a rate that is both globally and regionally stationary on the time scale of centuries, and it follows the principle of Burbidge et al. (2008), who used the product of fault length and convergence rate, i.e., the area being subducted per annum, to scale the Poisson rate for the global subduction system (GSS) to sections of the Indonesian subduction zone. Before applying RSGR to the Aleutians, we first apply it to five other regions of the GSS where its rate predictions can be compared with those from paleotsunami, paleoseismic, and geoarcheology data. To obtain regional rates from paleodata, we give a closed-form solution for the probability density function of the Poisson rate when event count and observation time are both uncertain.
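The RSGR arithmetic reduces to scaling a global Poisson rate and converting it to an exceedance probability. In the sketch below, both input numbers are illustrative assumptions, not values from the paper:

```python
import math

# Regionally scaled global rate (RSGR) arithmetic: scale a global Poisson
# rate of Mw >= 9.0 events by the region's share of global subduction, then
# convert to a 50-year exceedance probability. Both inputs below are
# illustrative assumptions, not the paper's values.

global_rate = 0.05        # assumed global Mw >= 9.0 rate, events/yr (~5 per century)
aleutian_fraction = 0.04  # assumed Aleutian share of global subduction area/yr

regional_rate = global_rate * aleutian_fraction     # events/yr in the region
p_50yr = 1.0 - math.exp(-regional_rate * 50.0)      # P(at least one event in 50 yr)
```

With these assumed inputs, p_50yr works out to about 9.5%, which happens to fall inside the 6.5% to 12% band the authors report for the Aleutians.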
Huang, X; Lambert, S; Lau, C; Soares Magalhaes, R J; Marquess, J; Rajmokan, M; Milinovich, G; Hu, W
2017-04-01
Pertussis epidemics have displayed substantial spatial heterogeneity in countries with high socioeconomic conditions and high vaccine coverage. This study aims to investigate the effect of socio-environmental factors on the spatio-temporal variation underlying pertussis infection. We obtained daily case numbers of pertussis notifications from Queensland Health, Australia by postal area, for the period January 2006 to December 2012. A Bayesian spatio-temporal model was used to quantify the relationship between monthly pertussis incidence and socio-environmental factors. The socio-environmental factors included monthly mean minimum temperature (MIT), monthly mean vapour pressure (VAP), Queensland school calendar pattern (SCP), and socioeconomic index for area (SEIFA). An increase in pertussis incidence was observed from 2006 to 2010 and a slight decrease from 2011 to 2012. Spatial analyses showed pertussis incidence across Queensland postal areas to be low and more spatially homogeneous during 2006-2008; incidence was higher and more spatially heterogeneous after 2009. The results also showed that the average decrease in monthly pertussis incidence was 3·1% [95% credible interval (CrI) 1·3-4·8] for each 1 °C increase in monthly MIT, while average increases in monthly pertussis incidence were 6·2% (95% CrI 0·4-12·4) during SCP periods and 2% (95% CrI 1-3) for each 10-unit increase in SEIFA. This study demonstrated that pertussis transmission is significantly associated with MIT, SEIFA, and SCP. Mapping derived from this work highlights the potential for future investigation and areas for focusing future control strategies.
NASA Astrophysics Data System (ADS)
Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang
2016-07-01
This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This extends a previously presented verification approach based on Procrustes shape analysis into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
Thomas, Len; Jaramillo-Legorreta, Armando; Cardenas-Hinojosa, Gustavo; Nieto-Garcia, Edwyna; Rojas-Bracho, Lorenzo; Ver Hoef, Jay M; Moore, Jeffrey; Taylor, Barbara; Barlow, Jay; Tregenza, Nicholas
2017-11-01
The vaquita is a critically endangered species of porpoise. It produces echolocation clicks, making it a good candidate for passive acoustic monitoring. A systematic grid of sensors has been deployed for 3 months annually since 2011; results from 2016 are reported here. Statistical models (to compensate for non-uniform data loss) show an overall decline in the acoustic detection rate between 2015 and 2016 of 49% (95% credible interval 82% decline to 8% increase), and total decline between 2011 and 2016 of over 90%. Assuming the acoustic detection rate is proportional to population size, approximately 30 vaquita (95% credible interval 8-96) remained in November 2016.
Virlogeux, Victor; Fang, Vicky J; Park, Minah; Wu, Joseph T; Cowling, Benjamin J
2016-10-24
The incubation period is an important epidemiologic distribution: it is often incorporated in case definitions, used to determine appropriate quarantine periods, and serves as an input to mathematical modeling studies. Middle East Respiratory Syndrome (MERS), caused by a coronavirus, is an emerging infectious disease in the Arabian Peninsula. There was a large outbreak of MERS in South Korea in 2015. We examined the incubation period distribution of MERS coronavirus infection for cases in South Korea and in Saudi Arabia. Using parametric and nonparametric methods, we estimated a mean incubation period of 6.9 days (95% credibility interval: 6.3-7.5) for cases in South Korea and 5.0 days (95% credibility interval: 4.0-6.6) among cases in Saudi Arabia. In a log-linear regression model, the mean incubation period was 1.42 times longer (95% credibility interval: 1.18-1.71) among cases in South Korea compared to Saudi Arabia. The variation that we identified in the incubation period distribution between locations could be associated with differences in ascertainment or reporting of exposure dates and illness onset dates, differences in the source or mode of infection, or environmental differences.
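As a rough illustration of the parametric approach, the sketch below fits a lognormal incubation distribution by maximum likelihood to hypothetical, exactly observed onset delays; the study itself works in a Bayesian framework with interval-censored exposure windows, which this toy example ignores.

```python
import math
import statistics

# Parametric sketch: lognormal MLE fit to hypothetical, exactly observed
# incubation periods. The study estimates the distribution in a Bayesian
# framework from interval-censored exposure data; this toy example ignores
# censoring entirely.

incubation_days = [4, 5, 6, 6, 7, 7, 8, 9, 10, 12]   # hypothetical delays

logs = [math.log(d) for d in incubation_days]
mu = statistics.fmean(logs)                      # MLE of the log-scale mean
sigma = statistics.pstdev(logs)                  # MLE uses the population sd
mean_incubation = math.exp(mu + sigma ** 2 / 2)  # mean of the fitted lognormal
```

Note the lognormal mean exp(mu + sigma²/2) exceeds exp(mu), the median, reflecting the right skew typical of incubation periods.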
A Bayesian framework for infrasound location
NASA Astrophysics Data System (ADS)
Modrak, Ryan T.; Arrowsmith, Stephen J.; Anderson, Dale N.
2010-04-01
We develop a framework for locating infrasound events using back azimuths and infrasonic arrival times from multiple arrays. Bayesian infrasonic source location (BISL), developed here, estimates event location and associated credibility regions. BISL accounts for the unknown source-to-array path or phase by treating infrasonic group velocity as random. Differences between observed and predicted source-to-array traveltimes are partitioned into two additive Gaussian sources, measurement error and model error, the second of which accounts for the unknown influence of wind and temperature on the path. By applying the technique to both synthetic tests and ground-truth events, we highlight the complementary nature of back azimuths and arrival times for estimating well-constrained event locations. BISL is an extension of methods developed earlier by Arrowsmith et al. that provided simple bounds on location using a grid-search technique.
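The grid-search idea that BISL generalizes can be sketched in a few lines: pick the grid node whose predicted travel times, after removing the unknown origin time, best match the observations in a least-squares (Gaussian) sense. The geometry, group velocity, and noise-free arrivals below are all invented for illustration; BISL goes further by treating group velocity as random and folding in back azimuths.

```python
import math

# Grid-search location sketch: choose the node whose predicted travel times
# (with the unknown origin time removed by demeaning) best match observed
# arrivals under a Gaussian (least-squares) error model. Array geometry,
# group velocity, and the noise-free arrivals are all invented here.

arrays = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]   # array positions, km
true_src, v = (40.0, 30.0), 0.30                    # source (km), group velocity (km/s)
obs = [math.hypot(x - true_src[0], y - true_src[1]) / v for x, y in arrays]

def misfit(sx, sy):
    pred = [math.hypot(x - sx, y - sy) / v for x, y in arrays]
    dp, do = sum(pred) / len(pred), sum(obs) / len(obs)  # remove origin time
    return sum(((p - dp) - (o - do)) ** 2 for p, o in zip(pred, obs))

# Exhaustive search over a 5 km grid; best = (misfit, x, y).
best = min((misfit(sx, sy), sx, sy)
           for sx in range(0, 101, 5) for sy in range(0, 101, 5))
```

With noise-free synthetic arrivals the search recovers the true node exactly; in the Bayesian extension, the same Gaussian misfit becomes a likelihood that is integrated over the random group velocity.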
Machine learning methods for credibility assessment of interviewees based on posturographic data.
Saripalle, Sashi K; Vemulapalli, Spandana; King, Gregory W; Burgoon, Judee K; Derakhshani, Reza
2015-01-01
This paper discusses the advantages of using posturographic signals from force plates for non-invasive credibility assessment. The contributions of our work are twofold: first, the proposed method is highly efficient and non-invasive; second, the feasibility of creating an autonomous credibility assessment system using machine-learning algorithms is studied. This study employs an interview paradigm in which subjects respond with truthful and deceptive intent while their center of pressure (COP) signal is recorded. Classification models utilizing sets of COP features for deceptive responses are derived, and a best test-interval accuracy of 93.5% is reported.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Maoyi; Ray, Jaideep; Hou, Zhangshuan
2016-07-04
The Community Land Model (CLM) has been widely used in climate and Earth system modeling. Accurate estimation of model parameters is needed for reliable model simulations and predictions under current and future conditions, respectively. In our previous work, a subset of hydrological parameters was identified as having significant impact on surface energy fluxes at selected flux tower sites based on parameter screening and sensitivity analysis, which indicates that the parameters could potentially be estimated from surface flux observations at the towers. To date, such estimates do not exist. In this paper, we assess the feasibility of applying a Bayesian model calibration technique to estimate CLM parameters at selected flux tower sites under various site conditions. The parameters are estimated as a joint probability density function (PDF) that provides estimates of uncertainty of the parameters being inverted, conditional on climatologically averaged latent heat fluxes derived from observations. We find that the simulated mean latent heat fluxes from CLM using the calibrated parameters are generally improved at all sites when compared to those obtained with CLM simulations using default parameter sets. Further, our calibration method also results in credibility bounds around the simulated mean fluxes which bracket the measured data. The modes (or maximum a posteriori values) and 95% credibility intervals of the site-specific posterior PDFs are tabulated as suggested parameter values for each site. Analysis of relationships between the posterior PDFs and site conditions suggests that the parameter values are likely correlated with the plant functional type, which needs to be confirmed in future studies by extending the approach to more sites.
Goldberg, Joshua F.; Tempa, Tshering; Norbu, Nawang; Hebblewhite, Mark; Mills, L. Scott; Wangchuk, Tshewang R.; Lukacs, Paul
2015-01-01
Many large carnivores occupy a wide geographic distribution and face threats from habitat loss and fragmentation, poaching, prey depletion, and human-wildlife conflicts. Conservation requires robust techniques for estimating population densities and trends, but the elusive nature and low densities of many large carnivores make them difficult to detect. Spatial capture-recapture (SCR) models provide a means for handling imperfect detectability, while linking population estimates to individual movement patterns to provide more accurate estimates than standard approaches. Within this framework, we investigate the effect of different sample interval lengths on density estimates, using simulations and a common leopard (Panthera pardus) model system. We apply Bayesian SCR methods to 89 simulated datasets and camera-trapping data from 22 leopards captured 82 times during winter 2010–2011 in Royal Manas National Park, Bhutan. We show that sample interval length from daily, weekly, monthly or quarterly periods did not appreciably affect median abundance or density, but did influence precision. We observed the largest gains in precision when moving from quarterly to shorter intervals. We therefore recommend daily sampling intervals for monitoring rare or elusive species where practicable, but note that monthly or quarterly sample periods can have similar informative value. We further develop a novel application of Bayes factors to select models where multiple ecological factors are integrated into density estimation. Our simulations demonstrate that these methods can help identify the “true” explanatory mechanisms underlying the data. Using this method, we found strong evidence for sex-specific movement distributions in leopards, suggesting that sexual patterns of space-use influence density. This model estimated a density of 10.0 leopards/100 km2 (95% credibility interval: 6.25–15.93), comparable to contemporary estimates in Asia. 
These SCR methods provide a guide to monitor and observe the effect of management interventions on leopards and other species of conservation interest. PMID:26536231
Hagos, Seifu; Hailemariam, Damen; WoldeHanna, Tasew; Lindtjørn, Bernt
2017-01-01
Understanding the spatial distribution of stunting and the underlying factors operating at meso-scale is of paramount importance for intervention design and implementation. Yet, little is known about the spatial distribution of stunting, and some discrepancies are documented on the relative importance of reported risk factors. Therefore, the present study aims to explore the spatial distribution of stunting at meso- (district) scale, and evaluates the effect of spatial dependency on the identification of risk factors and their relative contribution to the occurrence of stunting and severe stunting in a rural area of Ethiopia. A community-based cross-sectional study was conducted to measure the occurrence of stunting and severe stunting among children aged 0-59 months. Additionally, we collected relevant information on anthropometric measures, dietary habits, and parent- and child-related demographic and socio-economic status. Latitude and longitude of surveyed households were also recorded. Anselin's local Moran's I was calculated to investigate the spatial variation of stunting prevalence and identify potential local pockets (hotspots) of high prevalence. Finally, we employed a Bayesian geo-statistical model, which accounted for the spatial dependency structure in the data, to identify potential risk factors for stunting in the study area. Overall, the prevalence of stunting and severe stunting in the district was 43.7% [95% CI: 40.9, 46.4] and 21.3% [95% CI: 19.5, 23.3], respectively. We identified statistically significant clusters of high prevalence of stunting (hotspots) in the eastern part of the district and clusters of low prevalence (cold spots) in the western part. We found that the inclusion of the spatial structure of the data into the Bayesian model improved the fit of the stunting model. 
The Bayesian geo-statistical model indicated that the risk of stunting increased with the child's age (OR 4.74; 95% Bayesian credible interval [BCI]: 3.35-6.58) and among boys (OR 1.28; 95% BCI: 1.12-1.45). However, maternal education and household food security were found to be protective against stunting and severe stunting. Stunting prevalence may vary across space at different scales. It is therefore important that nutrition studies and, more importantly, control interventions take into account this spatial heterogeneity in the distribution of nutritional deficits and their underlying associated factors. The findings of this study also indicated that interventions integrating household food insecurity into nutrition programs in the district might help to avert the burden of stunting.
NASA Astrophysics Data System (ADS)
Wang, Hongrui; Wang, Cheng; Wang, Ying; Gao, Xiong; Yu, Chen
2017-06-01
This paper presents a Bayesian approach using the Metropolis-Hastings Markov chain Monte Carlo (MCMC) algorithm and applies it to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a narrower credible interval than the MLE confidence interval and thus a more precise estimate by using the related information from regional gage stations. The Bayesian MCMC method may therefore be more favorable for uncertainty analysis and risk management.
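A minimal Metropolis-Hastings sketch conveys the core of the approach. It samples the posterior of a normal mean with known unit variance under a flat prior, using a symmetric Gaussian random walk; the data and tuning constants are invented and unrelated to the Zhujiachuan flow model.

```python
import math
import random

# Minimal Metropolis-Hastings sketch: sample the posterior of a normal mean
# with known sd = 1 under a flat prior, so the true posterior is
# Normal(sample mean, 1/sqrt(n)). Data and tuning constants are invented
# and unrelated to the paper's river-flow model.

random.seed(42)
data = [9.5, 10.2, 10.8, 9.9, 10.4, 10.1, 9.7, 10.3]

def log_post(mu):
    # Flat prior: log posterior = log likelihood up to a constant.
    return -0.5 * sum((x - mu) ** 2 for x in data)

mu, samples = 10.0, []
for _ in range(20000):
    prop = mu + random.gauss(0.0, 0.5)          # symmetric random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(mu):
        mu = prop                               # accept; otherwise keep current mu
    samples.append(mu)

burned = sorted(samples[5000:])                 # drop burn-in, sort for quantiles
post_mean = sum(burned) / len(burned)
lo, hi = burned[int(0.025 * len(burned))], burned[int(0.975 * len(burned))]
```

The retained draws give both a posterior mean and an equal-tailed 95% credible interval in a single pass, which is the uncertainty summary that the MLE approach must approximate separately.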
Bayesian characterization of uncertainty in species interaction strengths.
Wolf, Christopher; Novak, Mark; Gitelman, Alix I
2017-06-01
Considerable effort has been devoted to the estimation of species interaction strengths. This effort has focused primarily on statistical significance testing and obtaining point estimates of parameters that contribute to interaction strength magnitudes, leaving the characterization of uncertainty associated with those estimates unconsidered. We consider a means of characterizing the uncertainty of a generalist predator's interaction strengths by formulating an observational method for estimating a predator's prey-specific per capita attack rates as a Bayesian statistical model. This formulation permits the explicit incorporation of multiple sources of uncertainty. A key insight is the informative nature of several so-called non-informative priors that have been used in modeling the sparse data typical of predator feeding surveys. We introduce to ecology a new neutral prior and provide evidence for its superior performance. We use a case study to consider the attack rates in a New Zealand intertidal whelk predator, and we illustrate not only that Bayesian point estimates can be made to correspond with those obtained by frequentist approaches, but also that estimation uncertainty as described by 95% intervals is more useful and biologically realistic using the Bayesian method. In particular, unlike in bootstrap confidence intervals, the lower bounds of the Bayesian posterior intervals for attack rates do not include zero when a predator-prey interaction is in fact observed. We conclude that the Bayesian framework provides a straightforward, probabilistic characterization of interaction strength uncertainty, enabling future considerations of both the deterministic and stochastic drivers of interaction strength and their impact on food webs.
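A conjugate gamma-Poisson toy model shows the flavor of the calculation, though the paper's observational model and its neutral prior are more elaborate. All survey numbers here are hypothetical:

```python
# Conjugate gamma-Poisson sketch of a per capita attack rate: feeding counts
# are Poisson with rate = attack_rate * exposure, and a Gamma prior yields a
# Gamma posterior in closed form. The prior and survey numbers below are
# hypothetical; the paper's observational model and neutral prior are more
# involved than this.

prior_shape, prior_rate = 1.0, 1.0   # assumed Gamma(1, 1) prior on the attack rate
feeding_events = 3                   # hypothetical predators observed feeding on prey i
predator_hours = 120.0               # hypothetical total exposure (predator-hours)

post_shape = prior_shape + feeding_events   # Gamma posterior shape
post_rate = prior_rate + predator_hours     # Gamma posterior rate
post_mean = post_shape / post_rate          # posterior mean attack rate
post_sd = post_shape ** 0.5 / post_rate     # posterior sd
```

Because the Gamma(4, 121) posterior has strictly positive support, its 95% interval cannot include zero once a feeding event is observed, mirroring the contrast the authors draw with bootstrap confidence intervals.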
Bravo, Mercedes A; Anthopolos, Rebecca; Kimbro, Rachel T; Miranda, Marie Lynn
2018-05-14
Neighborhood characteristics such as racial segregation may be associated with type 2 diabetes mellitus, but studies have not examined these relationships using spatial models appropriate for geographically patterned health outcomes. We construct a local, spatial index of racial isolation (RI) for blacks, which measures the extent to which blacks are exposed only to one another, to estimate associations of diabetes with RI and examine how RI relates to spatial patterning in diabetes. We obtained 2007-2011 electronic health records from the Duke Medicine Enterprise Data Warehouse. Patient data were linked to RI based on census block of residence. We used aspatial and spatial Bayesian models to assess spatial variation in diabetes and relationships with RI. Compared to spatial models with patient age and sex, residual geographic heterogeneity in diabetes in spatial models that also included RI was 29% and 24% lower for non-Hispanic whites and blacks, respectively. A 0.20-unit increase in RI was associated with a 1.24-fold (95% credible interval: 1.17, 1.31) and 1.07-fold (1.05, 1.10) increased risk of diabetes for whites and blacks, respectively. Improved understanding of neighborhood characteristics associated with diabetes can inform the development of policy interventions.
Hallett, Timothy B; Gregson, Simon; Mugurungi, Owen; Gonese, Elizabeth; Garnett, Geoff P
2009-06-01
Determining whether interventions to reduce HIV transmission have worked is essential, but complicated by the potential for generalised epidemics to evolve over time without individuals changing risk behaviour. We aimed to develop a method to evaluate evidence for changes in risk behaviour altering the course of an HIV epidemic. We developed a mathematical model of HIV transmission, incorporating the potential for natural changes in the epidemic as it matures and the introduction of antiretroviral treatment, and applied a Bayesian melding framework in which the model and observed trends in prevalence can be compared. We applied the model to Zimbabwe, using HIV prevalence estimates from antenatal clinic surveillance and household-based surveys, and basing model parameters on data from sexual behaviour surveys. There was strong evidence for reductions in risk behaviour stemming HIV transmission. We estimate these changes occurred between 1999 and 2004 and averted 660,000 (95% credible interval: 460,000-860,000) infections by 2008. The model and associated analysis framework provide a robust way to evaluate the evidence for changes in risk behaviour affecting the course of HIV epidemics, avoiding confounding by the natural evolution of HIV epidemics.
Bayesian evidence for the prevalence of waterworlds
NASA Astrophysics Data System (ADS)
Simpson, Fergus
2017-07-01
Should we expect most habitable planets to share the Earth's marbled appearance? For a planetary surface to boast extensive areas of both land and water, a delicate balance must be struck between the volume of water it retains and the capacity of its perturbations. These two quantities may show substantial variability across the full spectrum of water-bearing worlds. This would suggest that, barring strong feedback effects, most surfaces are heavily dominated by either water or land. Why is the Earth so finely poised? To address this question, we construct a simple model for the selection bias that would arise within an ensemble of surface conditions. Based on the Earth's ocean coverage of 71 per cent, we find substantial evidence (Bayes factor K ≃ 6) supporting the hypothesis that anthropic selection effects are at work. Furthermore, due to the Earth's proximity to the waterworld limit, this model predicts that most habitable planets are dominated by oceans spanning over 90 per cent of their surface area (95 per cent credible interval). This scenario, in which the Earth has a much greater land area than most habitable planets, is consistent with results from numerical simulations and could help explain the apparently low-mass transition in the mass-radius relation.
Dettmer, Jan; Dosso, Stan E; Holland, Charles W
2008-03-01
This paper develops a joint time/frequency-domain inversion for high-resolution single-bounce reflection data, with the potential to resolve fine-scale profiles of sediment velocity, density, and attenuation over small seafloor footprints (approximately 100 m). The approach utilizes sequential Bayesian inversion of time- and frequency-domain reflection data, employing ray-tracing inversion for reflection travel times and a layer-packet stripping method for spherical-wave reflection-coefficient inversion. Posterior credibility intervals from the travel-time inversion are passed on as prior information to the reflection-coefficient inversion. Within the reflection-coefficient inversion, parameter information is passed from one layer packet inversion to the next in terms of marginal probability distributions rotated into principal components, providing an efficient approach to (partially) account for multi-dimensional parameter correlations with one-dimensional, numerical distributions. Quantitative geoacoustic parameter uncertainties are provided by a nonlinear Gibbs sampling approach employing full data error covariance estimation (including nonstationary effects) and accounting for possible biases in travel-time picks. Posterior examination of data residuals shows the importance of including data covariance estimates in the inversion. The joint inversion is applied to data collected on the Malta Plateau during the SCARAB98 experiment.
Implementation of Instrumental Variable Bounds for Data Missing Not at Random.
Marden, Jessica R; Wang, Linbo; Tchetgen, Eric J Tchetgen; Walter, Stefan; Glymour, M Maria; Wirth, Kathleen E
2018-05-01
Instrumental variables are routinely used to recover a consistent estimator of an exposure causal effect in the presence of unmeasured confounding. Instrumental variable approaches to account for nonignorable missing data also exist but are less familiar to epidemiologists. Like instrumental variables for exposure causal effects, instrumental variables for missing data rely on exclusion restriction and instrumental variable relevance assumptions. Yet these two conditions alone are insufficient for point identification. For estimation, researchers have invoked a third assumption, typically involving fairly restrictive parametric constraints. Inferences can be sensitive to these parametric assumptions, which are typically not empirically testable. The purpose of our article is to discuss another approach for leveraging a valid instrumental variable. Although the approach is insufficient for nonparametric identification, it can nonetheless provide informative inferences about the presence, direction, and magnitude of selection bias, without invoking a third untestable parametric assumption. An important contribution of this article is an Excel spreadsheet tool that can be used to obtain empirical evidence of selection bias and calculate bounds and corresponding Bayesian 95% credible intervals for a nonidentifiable population proportion. For illustrative purposes, we used the spreadsheet tool to analyze HIV prevalence data collected by the 2007 Zambia Demographic and Health Survey (DHS).
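The credible-interval piece of such a tool can be sketched in a few lines. This is a hedged illustration, not the article's Excel implementation: a uniform Beta(1, 1) prior and invented survey counts are assumed, and the interval is equal-tailed from Monte Carlo draws of the Beta posterior.

```python
import random

def beta_credible_interval(successes, n, level=0.95, draws=100_000, seed=7):
    """Equal-tailed credible interval for a proportion under a uniform prior;
    the posterior is Beta(successes + 1, n - successes + 1)."""
    rng = random.Random(seed)
    post = sorted(rng.betavariate(successes + 1, n - successes + 1)
                  for _ in range(draws))
    alpha = (1 - level) / 2
    return post[int(alpha * draws)], post[int((1 - alpha) * draws) - 1]

# Invented survey: 120 of 1,000 sampled individuals test positive.
lo, hi = beta_credible_interval(120, 1000)
```

Bounds for a nonidentifiable proportion would then be obtained by applying the same summary to the extreme assumptions about the missing respondents; the mechanics of the interval itself stay unchanged.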
NASA Astrophysics Data System (ADS)
Liu, Yiming; Shi, Yimin; Bai, Xuchao; Zhan, Pei
2018-01-01
In this paper, we study estimation of the reliability of a multicomponent system, the N-M-cold-standby redundancy system, based on a progressively Type-II censored sample. The system consists of N subsystems, each containing M statistically independent, identically distributed strength components; only one of these subsystems works under the impact of stresses at a time while the others remain on standby. Whenever the working subsystem fails, one of the standbys takes its place, and the system fails when all subsystems have failed. The underlying distributions of random strength and stress are both assumed to belong to the generalized half-logistic distribution with different shape parameters. The reliability of the system is estimated using both classical and Bayesian statistical inference. The uniformly minimum variance unbiased estimator and maximum likelihood estimator for the reliability of the system are derived. Under a squared error loss function, the exact expression of the Bayes estimator for the reliability of the system is developed using the Gauss hypergeometric function. The asymptotic confidence interval and corresponding coverage probabilities are derived based on both the Fisher and the observed information matrices. The approximate highest probability density credible interval is constructed using a Monte Carlo method. Monte Carlo simulations are performed to compare the performances of the proposed reliability estimators, and a real data set is also analyzed for an illustration of the findings.
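The "approximate highest probability density credible interval ... by Monte Carlo" step can be sketched generically: given posterior draws, the HPD interval is approximately the shortest window containing the desired mass. The exponential draws below are an invented stand-in for a skewed reliability posterior, not the paper's model.

```python
import random

def hpd_interval(draws, level=0.95):
    """Approximate highest-probability-density interval: the shortest
    window over the sorted draws that contains `level` of them."""
    s = sorted(draws)
    k = int(level * len(s))
    # slide a window of k draws and keep the narrowest one
    i = min(range(len(s) - k), key=lambda j: s[j + k] - s[j])
    return s[i], s[i + k]

rng = random.Random(3)
post = [rng.expovariate(1.0) for _ in range(50_000)]  # skewed toy posterior
lo, hi = hpd_interval(post)
```

For skewed posteriors the HPD interval is shorter than the equal-tailed interval; here it hugs zero instead of cutting off the lowest 2.5% of the draws.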
Global mean sea-level rise in a world agreed upon in Paris
NASA Astrophysics Data System (ADS)
Bittermann, Klaus; Rahmstorf, Stefan; Kopp, Robert E.; Kemp, Andrew C.
2017-12-01
Although the 2015 Paris Agreement seeks to hold global average temperature to ‘well below 2 °C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5 °C above pre-industrial levels’, projections of global mean sea-level (GMSL) rise commonly focus on scenarios in which there is a high probability that warming exceeds 1.5 °C. Using a semi-empirical model, we project GMSL changes between now and 2150 CE under a suite of temperature scenarios that satisfy the Paris Agreement temperature targets. The projected magnitude and rate of GMSL rise varies among these low emissions scenarios. Stabilizing temperature at 1.5 °C instead of 2 °C above preindustrial reduces GMSL in 2150 CE by 17 cm (90% credible interval: 14-21 cm) and reduces peak rates of rise by 1.9 mm yr-1 (90% credible interval: 1.4-2.6 mm yr-1). Delaying the year of peak temperature has little long-term influence on GMSL, but does reduce the maximum rate of rise. Stabilizing at 2 °C in 2080 CE rather than 2030 CE reduces the peak rate by 2.7 mm yr-1 (90% credible interval: 2.0-4.0 mm yr-1).
NASA Astrophysics Data System (ADS)
Abe, K.; Adam, J.; Aihara, H.; Akiri, T.; Andreopoulos, C.; Aoki, S.; Ariga, A.; Assylbekov, S.; Autiero, D.; Barbi, M.; Barker, G. J.; Barr, G.; Bartet-Friburg, P.; Bass, M.; Batkiewicz, M.; Bay, F.; Berardi, V.; Berger, B. E.; Berkman, S.; Bhadra, S.; Blaszczyk, F. d. M.; Blondel, A.; Bolognesi, S.; Bordoni, S.; Boyd, S. B.; Brailsford, D.; Bravar, A.; Bronner, C.; Buchanan, N.; Calland, R. G.; Caravaca Rodríguez, J.; Cartwright, S. L.; Castillo, R.; Catanesi, M. G.; Cervera, A.; Cherdack, D.; Chikuma, N.; Christodoulou, G.; Clifton, A.; Coleman, J.; Coleman, S. J.; Collazuol, G.; Connolly, K.; Cremonesi, L.; Dabrowska, A.; Danko, I.; Das, R.; Davis, S.; de Perio, P.; De Rosa, G.; Dealtry, T.; Dennis, S. R.; Densham, C.; Dewhurst, D.; Di Lodovico, F.; Di Luise, S.; Dolan, S.; Drapier, O.; Duboyski, T.; Duffy, K.; Dumarchez, J.; Dytman, S.; Dziewiecki, M.; Emery-Schrenk, S.; Ereditato, A.; Escudero, L.; Ferchichi, C.; Feusels, T.; Finch, A. J.; Fiorentini, G. A.; Friend, M.; Fujii, Y.; Fukuda, Y.; Furmanski, A. P.; Galymov, V.; Garcia, A.; Giffin, S.; Giganti, C.; Gilje, K.; Goeldi, D.; Golan, T.; Gonin, M.; Grant, N.; Gudin, D.; Hadley, D. R.; Haegel, L.; Haesler, A.; Haigh, M. D.; Hamilton, P.; Hansen, D.; Hara, T.; Hartz, M.; Hasegawa, T.; Hastings, N. C.; Hayashino, T.; Hayato, Y.; Hearty, C.; Helmer, R. L.; Hierholzer, M.; Hignight, J.; Hillairet, A.; Himmel, A.; Hiraki, T.; Hirota, S.; Holeczek, J.; Horikawa, S.; Hosomi, F.; Huang, K.; Ichikawa, A. K.; Ieki, K.; Ieva, M.; Ikeda, M.; Imber, J.; Insler, J.; Irvine, T. J.; Ishida, T.; Ishii, T.; Iwai, E.; Iwamoto, K.; Iyogi, K.; Izmaylov, A.; Jacob, A.; Jamieson, B.; Jiang, M.; Johnson, S.; Jo, J. H.; Jonsson, P.; Jung, C. K.; Kabirnezhad, M.; Kaboth, A. 
C.; Kajita, T.; Kakuno, H.; Kameda, J.; Kanazawa, Y.; Karlen, D.; Karpikov, I.; Katori, T.; Kearns, E.; Khabibullin, M.; Khotjantsev, A.; Kielczewska, D.; Kikawa, T.; Kilinski, A.; Kim, J.; King, S.; Kisiel, J.; Kitching, P.; Kobayashi, T.; Koch, L.; Koga, T.; Kolaceke, A.; Konaka, A.; Kopylov, A.; Kormos, L. L.; Korzenev, A.; Koshio, Y.; Kropp, W.; Kubo, H.; Kudenko, Y.; Kurjata, R.; Kutter, T.; Lagoda, J.; Lamont, I.; Larkin, E.; Laveder, M.; Lawe, M.; Lazos, M.; Lindner, T.; Lister, C.; Litchfield, R. P.; Longhin, A.; Lopez, J. P.; Ludovici, L.; Magaletti, L.; Mahn, K.; Malek, M.; Manly, S.; Marino, A. D.; Marteau, J.; Martin, J. F.; Martins, P.; Martynenko, S.; Maruyama, T.; Matveev, V.; Mavrokoridis, K.; Mazzucato, E.; McCarthy, M.; McCauley, N.; McFarland, K. S.; McGrew, C.; Mefodiev, A.; Metelko, C.; Mezzetto, M.; Mijakowski, P.; Miller, C. A.; Minamino, A.; Mineev, O.; Missert, A.; Miura, M.; Moriyama, S.; Mueller, Th. A.; Murakami, A.; Murdoch, M.; Murphy, S.; Myslik, J.; Nakadaira, T.; Nakahata, M.; Nakamura, K. G.; Nakamura, K.; Nakayama, S.; Nakaya, T.; Nakayoshi, K.; Nantais, C.; Nielsen, C.; Nirkko, M.; Nishikawa, K.; Nishimura, Y.; Nowak, J.; O'Keeffe, H. M.; Ohta, R.; Okumura, K.; Okusawa, T.; Oryszczak, W.; Oser, S. M.; Ovsyannikova, T.; Owen, R. A.; Oyama, Y.; Palladino, V.; Palomino, J. L.; Paolone, V.; Payne, D.; Perevozchikov, O.; Perkin, J. D.; Petrov, Y.; Pickard, L.; Pinzon Guerra, E. S.; Pistillo, C.; Plonski, P.; Poplawska, E.; Popov, B.; Posiadala-Zezula, M.; Poutissou, J.-M.; Poutissou, R.; Przewlocki, P.; Quilain, B.; Radicioni, E.; Ratoff, P. N.; Ravonel, M.; Rayner, M. A. M.; Redij, A.; Reeves, M.; Reinherz-Aronis, E.; Riccio, C.; Rodrigues, P. A.; Rojas, P.; Rondio, E.; Roth, S.; Rubbia, A.; Ruterbories, D.; Rychter, A.; Sacco, R.; Sakashita, K.; Sánchez, F.; Sato, F.; Scantamburlo, E.; Scholberg, K.; Schoppmann, S.; Schwehr, J. 
D.; Scott, M.; Seiya, Y.; Sekiguchi, T.; Sekiya, H.; Sgalaberna, D.; Shah, R.; Shaker, F.; Shaw, D.; Shiozawa, M.; Short, S.; Shustrov, Y.; Sinclair, P.; Smith, B.; Smy, M.; Sobczyk, J. T.; Sobel, H.; Sorel, M.; Southwell, L.; Stamoulis, P.; Steinmann, J.; Still, B.; Suda, Y.; Suzuki, A.; Suzuki, K.; Suzuki, S. Y.; Suzuki, Y.; Tacik, R.; Tada, M.; Takahashi, S.; Takeda, A.; Takeuchi, Y.; Tanaka, H. K.; Tanaka, H. A.; Tanaka, M. M.; Terhorst, D.; Terri, R.; Thompson, L. F.; Thorley, A.; Tobayama, S.; Toki, W.; Tomura, T.; Touramanis, C.; Tsukamoto, T.; Tzanov, M.; Uchida, Y.; Vacheret, A.; Vagins, M.; Vasseur, G.; Wachala, T.; Wakamatsu, K.; Walter, C. W.; Wark, D.; Warzycha, W.; Wascko, M. O.; Weber, A.; Wendell, R.; Wilkes, R. J.; Wilking, M. J.; Wilkinson, C.; Williamson, Z.; Wilson, J. R.; Wilson, R. J.; Wongjirad, T.; Yamada, Y.; Yamamoto, K.; Yanagisawa, C.; Yano, T.; Yen, S.; Yershov, N.; Yokoyama, M.; Yoo, J.; Yoshida, K.; Yuan, T.; Yu, M.; Zalewska, A.; Zalipska, J.; Zambelli, L.; Zaremba, K.; Ziembicki, M.; Zimmerman, E. D.; Zito, M.; Żmuda, J.; T2K Collaboration
2015-04-01
We report on measurements of neutrino oscillation using data from the T2K long-baseline neutrino experiment collected between 2010 and 2013. In an analysis of muon neutrino disappearance alone, we find the following estimates and 68% confidence intervals for the two possible mass hierarchies: normal hierarchy: sin²θ₂₃ = 0.514 (+0.055/-0.056) and Δm²₃₂ = (2.51 ± 0.10) × 10⁻³ eV²/c⁴; inverted hierarchy: sin²θ₂₃ = 0.511 ± 0.055 and Δm²₁₃ = (2.48 ± 0.10) × 10⁻³ eV²/c⁴. The analysis accounts for multinucleon mechanisms in neutrino interactions, which were found to introduce negligible bias. We describe our first analyses that combine measurements of muon neutrino disappearance and electron neutrino appearance to estimate four oscillation parameters, |Δm²|, sin²θ₂₃, sin²θ₁₃, δCP, and the mass hierarchy. Frequentist and Bayesian intervals are presented for combinations of these parameters, with and without including recent reactor measurements. At 90% confidence level and including reactor measurements, we exclude the region δCP = [0.15, 0.83]π for the normal hierarchy and δCP = [-0.08, 1.09]π for the inverted hierarchy. The T2K and reactor data weakly favor the normal hierarchy with a Bayes factor of 2.2. The most probable values and 68% one-dimensional credible intervals for the other oscillation parameters, when reactor data are included, are sin²θ₂₃ = 0.528 (+0.055/-0.038) and |Δm²₃₂| = (2.51 ± 0.11) × 10⁻³ eV²/c⁴.
Swartz, Michael D; Cai, Yi; Chan, Wenyaw; Symanski, Elaine; Mitchell, Laura E; Danysh, Heather E; Langlois, Peter H; Lupo, Philip J
2015-02-09
While there is evidence that maternal exposure to benzene is associated with spina bifida in offspring, to our knowledge there have been no assessments to evaluate the role of multiple hazardous air pollutants (HAPs) simultaneously on the risk of this relatively common birth defect. In the current study, we evaluated the association between maternal exposure to HAPs identified by the United States Environmental Protection Agency (U.S. EPA) and spina bifida in offspring using hierarchical Bayesian modeling that includes Stochastic Search Variable Selection (SSVS). The Texas Birth Defects Registry provided data on spina bifida cases delivered between 1999 and 2004. The control group was a random sample of unaffected live births, frequency matched to cases on year of birth. Census tract-level estimates of annual HAP levels were obtained from the U.S. EPA's 1999 Assessment System for Population Exposure Nationwide. Using the distribution among controls, exposure was categorized as high exposure (>95th percentile), medium exposure (5th-95th percentile), and low exposure (<5th percentile, reference). We used hierarchical Bayesian logistic regression models with SSVS to evaluate the association between HAPs and spina bifida by computing an odds ratio (OR) for each HAP using the posterior mean, and a 95% credible interval (CI) using the 2.5th and 97.5th quantiles of the posterior samples. Based on previous assessments, any pollutant with a Bayes factor greater than 1 was selected for inclusion in a final model. Twenty-five HAPs were selected in the final analysis to represent "bins" of highly correlated HAPs (ρ > 0.80). We identified two out of 25 HAPs with a Bayes factor greater than 1: quinoline (OR for high exposure = 2.06, 95% CI: 1.11-3.87, Bayes factor = 1.01) and trichloroethylene (OR for medium exposure = 2.00, 95% CI: 1.14-3.61, Bayes factor = 3.79). Overall there is evidence that quinoline and trichloroethylene may be significant contributors to the risk of spina bifida.
Additionally, the use of Bayesian hierarchical models with SSVS is an alternative approach in the evaluation of multiple environmental pollutants on disease risk. This approach can be easily extended to environmental exposures, where novel approaches are needed in the context of multi-pollutant modeling.
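Once posterior draws of a log-odds coefficient are available (from SSVS or any other sampler), the OR-and-CrI summary used above is a mechanical step. A hedged sketch follows, with simulated normal draws standing in for real MCMC output; the center and spread of the draws are invented.

```python
import math
import random

def posterior_odds_ratio(beta_draws, level=0.95):
    """Posterior mean odds ratio and equal-tailed credible interval
    from draws of a log-odds-scale coefficient."""
    ors = sorted(math.exp(b) for b in beta_draws)
    n = len(ors)
    alpha = (1 - level) / 2
    return sum(ors) / n, ors[int(alpha * n)], ors[int((1 - alpha) * n) - 1]

# Simulated posterior draws for a coefficient centred near log(2) with sd 0.3.
rng = random.Random(11)
beta = [rng.gauss(0.7, 0.3) for _ in range(100_000)]
or_mean, or_lo, or_hi = posterior_odds_ratio(beta)
```

A pollutant would be flagged when the interval (or_lo, or_hi) excludes 1; the Bayes factor threshold used in the study is a separate model-selection criterion applied on top of this summary.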
Estimation of divergence from Hardy-Weinberg form.
Stark, Alan E
2015-08-01
The Hardy–Weinberg (HW) principle explains how random mating (RM) can produce and maintain a population in equilibrium, that is, with constant genotypic proportions. When proportions diverge from HW form, it is of interest to estimate the fixation index F, which reflects the degree of divergence. Starting from a sample of genotypic counts, a mixed procedure gives first the orthodox estimate of gene frequency q and then a Bayesian estimate of F, based on a credible prior distribution of F, which is described here.
Komócsi, András; Kehl, Dániel; d'Ascenso, Fabrizio; DiNicolantonio, James; Vorobcsuk, András
2017-03-01
In ST-segment elevation myocardial infarction (STEMI), current guidelines discourage treatment of the non-culprit lesions at the time of the primary intervention. Recent trials have challenged this strategy, suggesting a benefit of early complete revascularization. We performed a Bayesian multiple-treatment network meta-analysis of randomized clinical trials (RCTs) in STEMI comparing culprit-only intervention (CO) with multivessel revascularization at different times: immediate (IM), during the same hospitalization (SH), or staged later (ST). Outcome parameters were pooled with a random-effects model. For the multiple-treatment meta-analysis, a Bayesian Markov chain Monte Carlo method was used. Eight RCTs involving 2077 patients were identified. ST and IM revascularization were associated with a decrease in major adverse cardiac events (MACEs) compared to the culprit-only approach (risk ratio [RR]: 0.43, credible interval [CrI]: 0.22-0.77 and RR: 0.36, CrI: 0.24-0.54, respectively). IM was superior to SH (RR: 0.49, CrI: 0.29-0.80). With regard to myocardial infarction, IM was superior to SH (RR: 0.18, CrI: 0.02-0.99). The posterior probability of being the best choice of treatment regarding the frequency of MACEs was 71.2% for IM, 28.5% for ST, 0.3% for SH and 0.05% for the culprit-only approach. Results from RCTs indicate that immediate or staged revascularization of non-culprit lesions reduces major adverse events in patients after primary percutaneous coronary intervention. Differences in MACEs suggest superiority of the immediate or staged intervention; however, further randomized trials are needed to determine the optimal timing of revascularization of the non-culprit lesions.
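The "posterior probability of being the best treatment" quoted above is computed by counting, across joint posterior draws, how often each arm has the most favorable effect. A toy sketch with simulated log-risk-ratio draws follows; the effect values and their spreads are invented, not the meta-analysis posteriors.

```python
import random

def prob_best(draws_by_arm, lower_is_better=True):
    """Share of joint posterior draws in which each arm is most favorable."""
    arms = list(draws_by_arm)
    n = len(draws_by_arm[arms[0]])
    pick = min if lower_is_better else max
    wins = {a: 0 for a in arms}
    for i in range(n):
        wins[pick(arms, key=lambda a: draws_by_arm[a][i])] += 1
    return {a: wins[a] / n for a in arms}

rng = random.Random(5)
n = 20_000
draws = {                         # invented log risk-ratio posterior draws
    "IM": [rng.gauss(-1.2, 0.2) for _ in range(n)],
    "ST": [rng.gauss(-0.6, 0.2) for _ in range(n)],
    "SH": [rng.gauss(-0.2, 0.2) for _ in range(n)],
    "CO": [rng.gauss(0.0, 0.1) for _ in range(n)],
}
p_best = prob_best(draws)
```

Because the ranking is evaluated draw by draw, the probabilities automatically account for the joint uncertainty in all arms and sum to one.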
Mina, George S; Watti, Hussam; Soliman, Demiana; Shewale, Anand; Atkins, Jessica; Reddy, Pratap; Dominic, Paari
2018-01-05
Most data guiding revascularization of multivessel disease (MVD) and/or left main disease (LMD) favor coronary artery bypass grafting (CABG) over percutaneous coronary intervention (PCI). However, those data are based on trials comparing CABG to bare metal stents (BMS) or old generation drug eluting stents (OG-DES). Hence, it is essential to compare outcomes of CABG to those of new generation drug eluting stents (NG-DES). We searched the PUBMED and Cochrane databases for trials evaluating revascularization of MVD and/or LMD with CABG and/or PCI. A Bayesian network meta-analysis was performed to calculate odds ratios (OR) and 95% credible intervals (CrI). The primary outcome was major adverse cardiovascular events (MACE) at 3-5 years. Secondary outcomes were mortality, cerebrovascular accidents (CVA), myocardial infarction (MI) and repeat revascularization. We included 10 trials with a total of 9287 patients. CABG was associated with lower MACE when compared to BMS or OG-DES. However, MACE was not significantly different between CABG and NG-DES (OR 0.79, CrI 0.45-1.40). Moreover, there were no significant differences between CABG and NG-DES in mortality (OR 0.78, CrI 0.45-1.37), CVA (OR 0.93, CrI 0.35-2.2) or MI (OR 0.6, CrI 0.17-2.0). On the other hand, CABG was associated with lower repeat revascularization (OR 0.55, CrI 0.36-0.84). Our study suggests that NG-DES is an acceptable alternative to CABG in patients with MVD and/or LMD. However, repeat revascularization remains lower with CABG than with PCI.
Low coverage of central point vaccination against dog rabies in Bamako, Mali.
Muthiani, Yvonne; Traoré, Abdallah; Mauti, Stephanie; Zinsstag, Jakob; Hattendorf, Jan
2015-06-15
Canine rabies remains an important public-health problem in Africa. Dog mass vaccination is the recommended method for rabies control and elimination. We report on the first small-scale mass dog vaccination campaign trial in Bamako, Mali. Our objective was to estimate coverage of the vaccination campaign and to quantify determinants of intervention effectiveness. In September 2013, a central point vaccination campaign, free of cost for dog owners, was carried out in 17 posts on three consecutive days within Bamako's Commune 1. Vaccination coverage and the proportion of ownerless dogs were estimated by combining mark-recapture household and transect surveys using Bayesian modeling. The estimated vaccination coverage was 17.6% (95% credibility interval, CI: 14.4-22.1%), which is far below the World Health Organization (WHO) recommended vaccination coverage of 70%. The Bayesian estimate for the owned dog population of Commune 1 was 3459 dogs (95% CI: 2786-4131) and the proportion of ownerless dogs was about 8%. The low coverage observed is primarily attributed to low participation by dog owners. Dog owners reported several reasons for not bringing their dogs to the vaccination posts. The most frequently reported reasons for non-attendance were lack of information (25%) and the inability to handle the dog (16%). For 37% of respondents, no clear reason was given for non-vaccination. Despite low coverage, the vaccination campaign in Bamako was relatively easy to implement, both in terms of logistics and organization. Almost half of the participating dog owners brought their pets on the first day of the campaign. Participatory stakeholder processes involving communities and local authorities are needed to identify effective communication channels and locally adapted vaccination strategies, which could include both central-point and door-to-door vaccination.
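The mark-recapture logic behind such population estimates can be sketched with a simple grid posterior. This is a hedged illustration with invented counts; the study's actual model combined household and transect surveys and is richer than this single-resighting sketch.

```python
import math

def log_comb(n, k):
    """log of the binomial coefficient C(n, k), via lgamma."""
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

def population_posterior(marked, sighted, resighted, n_max=5000):
    """Grid posterior for population size N under a flat prior, with a
    hypergeometric likelihood for resighting `resighted` marked animals
    among `sighted` when `marked` of the N animals carry marks."""
    def log_lik(N):
        return (log_comb(marked, resighted)
                + log_comb(N - marked, sighted - resighted)
                - log_comb(N, sighted))
    grid = list(range(max(marked, marked + sighted - resighted), n_max))
    logs = [log_lik(N) for N in grid]
    top = max(logs)                      # stabilize before exponentiating
    weights = [math.exp(l - top) for l in logs]
    total = sum(weights)
    return grid, [w / total for w in weights]

# Invented counts: 200 dogs marked at vaccination, 150 sighted on transects, 60 marked.
grid, probs = population_posterior(200, 150, 60)
post_mean = sum(N * p for N, p in zip(grid, probs))
```

A 95% credible interval follows by accumulating `probs` from each tail; the classical Chapman point estimate (M+1)(C+1)/(R+1) - 1, about 497 for these invented counts, is consistent with the posterior mean.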
van Gelder, Marleen M. H. J.; Donders, A. Rogier T.; Devine, Owen; Roeleveld, Nel; Reefhuis, Jennita
2015-01-01
Background: Studies on associations between periconceptional cannabis exposure and birth defects have mainly relied on self-reported exposure. Therefore, the results may be biased due to underreporting of the exposure. The aim of this study was to quantify the potential effects of this form of exposure misclassification. Methods: Using multivariable logistic regression, we re-analyzed associations between periconceptional cannabis use and 20 specific birth defects using data from the National Birth Defects Prevention Study from 1997–2005 for 13 859 case infants and 6556 control infants. For seven birth defects, we implemented four Bayesian models based on various assumptions concerning the sensitivity of self-reported cannabis use to estimate odds ratios (ORs), adjusted for confounding and underreporting of the exposure. We used information on sensitivity of self-reported cannabis use from the literature for prior assumptions. Results: The results unadjusted for underreporting of the exposure showed an association between cannabis use and anencephaly (posterior OR 1.9 [95% credible interval (CRI) 1.1, 3.2]) which persisted after adjustment for potential exposure misclassification. Initially, no statistically significant associations were observed between cannabis use and the other birth defect categories studied. Although adjustment for underreporting did not notably change these effect estimates, cannabis use was associated with esophageal atresia (posterior OR 1.7 [95% CRI 1.0, 2.9]), diaphragmatic hernia (posterior OR 1.8 [95% CRI 1.1, 3.0]) and gastroschisis (posterior OR 1.7 [95% CRI 1.2, 2.3]) after correction for exposure misclassification. Conclusions: Underreporting of the exposure may have obscured some cannabis-birth defect associations in previous studies. However, the resulting bias is likely to be limited. PMID:25155701
van Gelder, Marleen M H J; Donders, A Rogier T; Devine, Owen; Roeleveld, Nel; Reefhuis, Jennita
2014-09-01
Studies on associations between periconceptional cannabis exposure and birth defects have mainly relied on self-reported exposure. Therefore, the results may be biased due to under-reporting of the exposure. The aim of this study was to quantify the potential effects of this form of exposure misclassification. Using multivariable logistic regression, we re-analysed associations between periconceptional cannabis use and 20 specific birth defects using data from the National Birth Defects Prevention Study from 1997-2005 for 13 859 case infants and 6556 control infants. For seven birth defects, we implemented four Bayesian models based on various assumptions concerning the sensitivity of self-reported cannabis use to estimate odds ratios (ORs), adjusted for confounding and under-reporting of the exposure. We used information on sensitivity of self-reported cannabis use from the literature for prior assumptions. The results unadjusted for under-reporting of the exposure showed an association between cannabis use and anencephaly (posterior OR 1.9 [95% credible interval (CRI) 1.1, 3.2]) which persisted after adjustment for potential exposure misclassification. Initially, no statistically significant associations were observed between cannabis use and the other birth defect categories studied. Although adjustment for under-reporting did not notably change these effect estimates, cannabis use was associated with esophageal atresia (posterior OR 1.7 [95% CRI 1.0, 2.9]), diaphragmatic hernia (posterior OR 1.8 [95% CRI 1.1, 3.0]), and gastroschisis (posterior OR 1.7 [95% CRI 1.2, 2.3]) after correction for exposure misclassification. Under-reporting of the exposure may have obscured some cannabis-birth defect associations in previous studies. However, the resulting bias is likely to be limited.
Chen, Cong; Zhang, Guohui; Huang, Helai; Wang, Jiangfeng; Tarefder, Rafiqul A
2016-11-01
Rural non-interstate crashes induce a significant number of severe injuries and fatalities. Examination of such injury patterns and the associated contributing factors is of practical importance. Taking into account the ordinal nature of injury severity levels and the hierarchical feature of crash data, this study employs a hierarchical ordered logit model to examine the significant factors in predicting driver injury severities in rural non-interstate crashes based on two-year New Mexico crash records. Bayesian inference is utilized in the model estimation procedure, and the 95% Bayesian credible interval (BCI) is applied to test variable significance. An ordinary ordered logit model omitting the between-crash variance effect is evaluated as well for model performance comparison. Results indicate that the model employed in this study outperforms the ordinary ordered logit model in model fit and parameter estimation. Variables regarding crash features, environment conditions, and driver and vehicle characteristics are found to have significant influence on the predictions of driver injury severities in rural non-interstate crashes. Factors such as road segments far from intersections, wet road surface conditions, collisions with animals, heavy vehicle drivers, male drivers and driver seatbelt use tend to induce less severe driver injury outcomes than factors such as multiple-vehicle crashes, severe vehicle damage in a crash, motorcyclists, females, senior drivers, drivers with alcohol or drug impairment, and other major collision types. Research limitations regarding crash data and model assumptions are also discussed. Overall, this research provides reasonable results and insight for developing effective road safety measures for crash injury severity reduction and prevention.
de Vocht, Frank
2016-12-01
Mobile phone use has been increasing rapidly in the past decades and, in parallel, so has the annual incidence of certain types of brain cancers. However, it remains unclear whether this correlation is coincidental or whether use of mobile phones may cause the development, promotion or progression of specific cancers. The 1985-2014 incidence of selected brain cancer subtypes in England was analyzed and compared to counterfactual 'synthetic control' time series. Annual 1985-2014 incidence of malignant glioma, glioblastoma multiforme, and malignant neoplasms of the temporal and parietal lobes in England was modelled based on population-level covariates using Bayesian structural time series models assuming 5-, 10- and 15-year minimal latency periods. Post-latency counterfactual 'synthetic England' time series were nowcast based on covariate trends. The impact of mobile phone use was inferred from differences between the measured and modelled time series. There is no evidence of an increase in malignant glioma, glioblastoma multiforme, or malignant neoplasms of the parietal lobe not predicted by the 'synthetic England' time series. Malignant neoplasms of the temporal lobe, however, have increased faster than expected. A latency period of 10 years was the earliest latency period at which this increase was measurable and related to mobile phone penetration rates, and it indicated an additional increase of 35% (95% credible interval 9%:59%) during 2005-2014, corresponding to an additional 188 (95% CI 48-324) cases annually. A causal factor consistent with mobile phone use (and possibly use of other wireless equipment) under the hypothesized temporal association is related to an increased risk of developing malignant neoplasms in the temporal lobe.
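The headline numbers above (an additional 35%, or 188 cases per year) come from contrasting observed counts with posterior draws of the counterfactual series. A minimal hedged sketch follows, with invented observed counts and invented counterfactual draws standing in for the structural time series model's output.

```python
import random

def excess_summary(observed, counterfactual_draws):
    """Median and equal-tailed 95% interval for total observed-minus-
    counterfactual cases, one difference per posterior draw of the
    counterfactual series."""
    obs_total = sum(observed)
    diffs = sorted(obs_total - sum(draw) for draw in counterfactual_draws)
    n = len(diffs)
    return diffs[n // 2], diffs[int(0.025 * n)], diffs[int(0.975 * n) - 1]

rng = random.Random(9)
observed = [110] * 10                               # invented annual counts
draws = [[rng.gauss(100, 5) for _ in range(10)]     # invented counterfactual draws
         for _ in range(20_000)]
med, lo, hi = excess_summary(observed, draws)
```

Because each difference is computed within a single posterior draw, the resulting interval propagates the model's uncertainty about the counterfactual directly into the excess-case estimate.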
Efficacy of nonvenous medications for acute convulsive seizures
Kothari, Harsh; Zhang, Zongjun; Han, Baoguang; Horn, Paul S.; Glauser, Tracy A.
2015-01-01
Objective: This is a network meta-analysis of nonvenous drugs used in randomized controlled trials (RCTs) for treatment of acute convulsive seizures and convulsive status epilepticus. Methods: The literature was searched according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines for RCTs examining treatment of acute convulsive seizures or status epilepticus with at least one of the study arms being a nonvenous medication. After demographic and outcome data extraction, a Bayesian network meta-analysis was performed and efficacy results were summarized using treatment effects and their credible intervals (CrI). We also calculated the probability of each route–drug combination being the most clinically effective for a given outcome, and provided their Bayesian hierarchical ranking. Results: This meta-analysis of 16 studies found that intramuscular midazolam (IM-MDZ) is superior to other nonvenous medications regarding time to seizure termination after administration (2.145 minutes, 95% CrI 1.308–3.489), time to seizure cessation after arrival in the hospital (3.841 minutes, 95% CrI 2.697–5.416), and time to initiate treatment (0.779 minutes, 95% CrI 0.495–1.221). Additionally, intranasal midazolam (IN-MDZ) was adjudged most efficacious for seizure cessation within 10 minutes of administration (90.4% of participants, 95% CrI 79.4%–96.9%) and persistent seizure cessation for ≥1 hour (78.5% of participants, 95% CrI 59.5%–92.1%). The paucity of RCTs produced evidence gaps, resulting in small networks, routes/drugs included in some networks but not others, and some trials not connected to any network. Conclusions: Despite the evidence gaps, IM-MDZ and IN-MDZ exhibit the best efficacy data for the nonvenous treatment of acute convulsive seizures or status epilepticus. PMID:26511448
Hiruki-Raring, Lisa M; Ver Hoef, Jay M; Boveng, Peter L; Bengtson, John L
2012-03-01
We created a Bayesian hierarchical model (BHM) to investigate ecosystem relationships between the physical environment (sea ice extent), a prey measure (krill density), predator behaviors (diving and foraging effort of female Antarctic fur seals, Arctocephalus gazella, with pups) and predator characteristics (mass of maternal fur seals and pups). We collected data on Antarctic fur seals from 1987/1988 to 1994/1995 at Seal Island, Antarctica. The BHM allowed us to link predators and prey in a single model that uses all the data efficiently and accounts for major sources of uncertainty. Based on the literature, we made hypotheses about the relationships in the model, which we compared with the outcome after fitting the BHM. For each BHM parameter, we calculated the mean of the posterior density and the 95% credible interval. Our model confirmed others' findings that increased sea ice was related to increased krill density. Higher krill density led to reduced dive intensity of maternal fur seals, as measured by dive depth and duration, and to less time spent foraging by maternal fur seals. Heavier maternal fur seals and lower maternal foraging effort resulted in heavier pups at 22 d. No relationship was found between krill density and maternal mass; maternal mass may have reflected environmental conditions prior to the pup provisioning season rather than summer prey densities. Maternal mass and foraging effort were not related to pup growth rates between 22 and 85 d, possibly indicating that food was not limiting, that food sources other than krill were being used, or that differences occurred before pups reached age 22 d.
Wen, Yi Feng; Wong, Hai Ming; Lin, Ruitao; Yin, Guosheng; McGrath, Colman
2015-01-01
Background Numerous facial photogrammetric studies have been published around the world. We aimed to critically review these studies so as to establish population norms for various angular and linear facial measurements; and to determine inter-ethnic/racial facial variations. Methods and Findings A comprehensive and systematic search of PubMed, ISI Web of Science, Embase, and Scopus was conducted to identify facial photogrammetric studies published before December, 2014. Subjects of eligible studies were either Africans, Asians or Caucasians. A Bayesian hierarchical random effects model was developed to estimate posterior means and 95% credible intervals (CrI) for each measurement by ethnicity/race. Linear contrasts were constructed to explore inter-ethnic/racial facial variations. We identified 38 eligible studies reporting 11 angular and 18 linear facial measurements. Risk of bias of the studies ranged from 0.06 to 0.66. At the significance level of 0.05, African males were found to have smaller nasofrontal angle (posterior mean difference: 8.1°, 95% CrI: 2.2°–13.5°) compared to Caucasian males and larger nasofacial angle (7.4°, 0.1°–13.2°) compared to Asian males. Nasolabial angle was more obtuse in Caucasian females than in African (17.4°, 0.2°–35.3°) and Asian (9.1°, 0.4°–17.3°) females. Additional inter-ethnic/racial variations were revealed when the level of statistical significance was set at 0.10. Conclusions A comprehensive database for angular and linear facial measurements was established from existing studies using the statistical model and inter-ethnic/racial variations of facial features were observed. The results have implications for clinical practice and highlight the need and value for high quality photogrammetric studies. PMID:26247212
Huang, Xiaodong; Mengersen, Kerrie; Milinovich, Gabriel; Hu, Wenbiao
2017-06-01
The effects of weather variability on seasonal influenza among different age groups remain unclear. This comparative study aims to explore differences in the associations between weather variability and seasonal influenza, and in the growth rates of seasonal influenza epidemics, among different age groups in Queensland, Australia. Three Bayesian spatiotemporal conditional autoregressive models were fitted at the postal area level to quantify the relationships between seasonal influenza and monthly minimum temperature (MIT), monthly vapor pressure, school calendar pattern, and the Index of Relative Socio-Economic Advantage and Disadvantage for 3 age groups (<15, 15-64, and ≥65 years). The results showed that the expected decrease in monthly influenza cases was 19.3% (95% credible interval [CI], 14.7%-23.4%), 16.3% (95% CI, 13.6%-19.0%), and 8.5% (95% CI, 1.5%-15.0%) for a 1°C increase in monthly MIT at <15, 15-64, and ≥65 years of age, respectively, while the average increase in monthly influenza cases was 14.6% (95% CI, 9.0%-21.0%), 12.1% (95% CI, 8.8%-16.1%), and 9.2% (95% CI, 1.4%-16.9%) for a 1-hPa increase in vapor pressure. Weather variability appears to be more influential on seasonal influenza transmission in the younger (<15 years) age group. The growth rates of influenza at the postal area level were relatively small for the older (≥65 years) age group in Queensland, Australia. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.
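Percent changes like those quoted above are the standard transformation of a log-link regression coefficient. A sketch with an assumed coefficient value, chosen here only to reproduce roughly the -19.3% figure (it is not a value reported by the study):

```python
import math

def pct_change(beta: float) -> float:
    """Percent change in expected monthly cases per one-unit covariate increase,
    under a log-link (e.g. Poisson/CAR) regression model."""
    return (math.exp(beta) - 1.0) * 100.0

beta_mit = -0.2146  # assumed log-rate coefficient per 1 deg C increase in MIT
print(f"{pct_change(beta_mit):.1f}%")  # -> -19.3%
```

The same transformation applied to the posterior 2.5th and 97.5th percentiles of the coefficient yields the percent-change credible intervals quoted in the abstract.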
Mills, Edward J; Adhvaryu, Achyuta; Jakiela, Pamela; Birungi, Josephine; Okoboi, Stephen; Chimulwa, Teddy; Wangisi, Jonathan; Achilla, Tina; Popoff, Evan; Golchi, Shirin; Karlan, Dean
2018-05-28
HIV infection has profound clinical and economic costs at the household level. This is particularly important in low-income settings, where access to additional sources of income or loans may be limited. While several microfinance interventions have been proposed, unconditional cash grants, a strategy that allows participants to choose how to use finances in ways that may improve household security and health, have not previously been evaluated. We examined the effect of an unconditional cash transfer to HIV-infected individuals using a 2 x 2 factorial randomised trial in two rural districts in Uganda. Our primary outcomes were changes in CD4 cell count, sexual behaviors, and adherence to ART. Secondary outcomes were changes in household food security and adult mental health. We applied a Bayesian approach for our primary analysis. We randomized 2170 patients as participants, with 1081 receiving a cash grant. We found no important intervention effects on CD4 T-cell counts between groups (mean difference [MD] 35.48, 95% credible interval [CrI] -59.9 to 1131.6), food security (odds ratio [OR] 1.22, 95% CrI: 0.47, 3.02), medication adherence (OR 3.15, 95% CrI: 0.58, 18.15), sexual behavior (OR 0.45, 95% CrI: 0.12, 1.55), or health expenditure in the previous 3 weeks (MD $2.65, 95% CrI: -9.30, 15.69). In secondary analysis, we detected an effect of mental planning on CD4 change between groups (104.2 cells, 95% CrI: 5.99, 202.16). We did not have data on viral load outcomes. Although all outcomes were associated with favorable point estimates, our trial did not demonstrate important effects of unconditional cash grants on health outcomes.
Alegana, Victor A; Wright, Jim; Bosco, Claudio; Okiro, Emelda A; Atkinson, Peter M; Snow, Robert W; Tatem, Andrew J; Noor, Abdisalan M
2017-11-21
One pillar of monitoring progress towards the Sustainable Development Goals is investment in high quality data to strengthen the scientific basis for decision-making. At present, nationally-representative surveys are the main source of data for establishing a scientific evidence base, monitoring, and evaluation of health metrics. However, the optimal precision of various population-level health and development indicators remains unquantified in nationally-representative household surveys. Here, a retrospective analysis of the precision of prevalence estimates from these surveys was conducted. Using malaria indicators, data were assembled for nine sub-Saharan African countries with at least two nationally-representative surveys. A Bayesian statistical model was used to estimate between- and within-cluster variability for fever and malaria prevalence, and insecticide-treated bed net (ITN) use, in children under the age of 5 years. The intra-class correlation coefficient was estimated along with the optimal sample size for each indicator, with associated uncertainty. Results suggest that the sample sizes required by the current nationally-representative surveys increase with declining malaria prevalence. Comparison between the actual sample size and the modelled estimate showed a requirement to increase the sample size for parasite prevalence by up to 77.7% (95% Bayesian credible interval 74.7-79.4) for the 2015 Kenya MIS (estimated sample size of children 0-4 years 7218 [7099-7288]), and by 54.1% [50.1-56.5] for the 2014-2015 Rwanda DHS (12,220 [11,950-12,410]). This study highlights the importance of defining indicator-relevant sample sizes to achieve the required precision in the current national surveys. While expanding the current surveys would need additional investment, the study highlights the need for improved approaches to cost-effective sampling.
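The link between within-cluster correlation and required sample size in cluster surveys is usually expressed through the design effect. A minimal sketch of that relationship, with assumed values (the ICC and cluster size below are illustrative, not the study's estimates):

```python
# Design-effect sketch: how much a cluster sample must be inflated to match
# the precision of a simple random sample (SRS). All numbers are assumed.
def design_effect(icc: float, cluster_size: float) -> float:
    """DEFF = 1 + (m - 1) * ICC for average cluster size m."""
    return 1.0 + (cluster_size - 1.0) * icc

def required_n(n_srs: int, icc: float, cluster_size: float) -> float:
    """Sample size needed under cluster sampling for SRS-equivalent precision."""
    return n_srs * design_effect(icc, cluster_size)

# e.g. 4000 children under SRS, 25 children per cluster, assumed ICC = 0.05
print(required_n(4000, icc=0.05, cluster_size=25))  # -> 8800.0
```

The abstract's point follows from this relationship: as prevalence declines, the SRS-equivalent sample size itself grows, so the clustered national surveys need proportionally larger samples to retain precision.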
Coward, Stephanie; Kuenzig, M Ellen; Hazlewood, Glen; Clement, Fiona; McBrien, Kerry; Holmes, Rebecca; Panaccione, Remo; Ghosh, Subrata; Seow, Cynthia H; Rezaie, Ali; Kaplan, Gilaad G
2017-03-01
Induction treatment of mild-to-moderate Crohn's disease is controversial. To compare the induction of remission between different doses of mesalamine, sulfasalazine, corticosteroids, and budesonide for active Crohn's disease. We identified randomized controlled trials from existing Cochrane reviews and an updated literature search in Medline, EMBASE, and CENTRAL to November 2015. We included randomized controlled trials (n = 22) in adult patients with Crohn's disease that compared budesonide, sulfasalazine, mesalamine, or corticosteroids with placebo or each other, for the induction of remission (8-17 wks). Mesalamine (above and below 2.4 g/d) and budesonide (above and below 6 mg/d) were stratified into low and high doses. Our primary outcome was remission, defined as a Crohn's Disease Activity Index score <150. A Bayesian random-effects network meta-analysis was performed on the proportion in remission. Corticosteroids (odds ratio [OR] = 3.80; 95% credible interval [CrI]: 2.48-5.66), high-dose budesonide (OR = 2.96; 95% CrI: 2.06-4.30), and high-dose mesalamine (OR = 2.29; 95% CrI: 1.58-3.33) were superior to placebo. Corticosteroids were similar to high-dose budesonide (OR = 1.21; 95% CrI: 0.84-1.76), but more effective than high-dose mesalamine (OR = 1.83; 95% CrI: 1.16-2.88). Sulfasalazine was not significantly superior to any therapy including placebo. Randomized controlled trials that use a strict definition of induction of remission and disease severity at enrollment to assess effectiveness in treating mild-to-moderate Crohn's disease are limited. Corticosteroids and high-dose budesonide were effective treatments for inducing remission in mild-to-moderate Crohn's disease. High-dose mesalamine is an option among patients preferring to avoid steroids.
Schroeder, Bernard K.; Lindsay, David J.; Faust, Deborah A.
2015-01-01
Species at risk with secretive breeding behaviours, low densities, and wide geographic range pose a significant challenge to conservation actions because population trends are difficult to detect. Such is the case with the Marbled Murrelet (Brachyramphus marmoratus), a seabird listed as ‘Threatened’ by the Species at Risk Act in Canada largely due to the loss of its old growth forest nesting habitat. We report the first estimates of population trend of Marbled Murrelets in Canada derived from a monitoring program that uses marine radar to detect birds as they enter forest watersheds during 923 dawn surveys at 58 radar monitoring stations within the six Marbled Murrelet Conservation Regions on coastal British Columbia, Canada, 1996–2013. Temporal trends in radar counts were analyzed with a hierarchical Bayesian multivariate modeling approach that controlled for variation in tilt of the radar unit and day of year, included year-specific deviations from the overall trend (‘year effects’), and allowed for trends to be estimated at three spatial scales. A negative overall trend of -1.6%/yr (95% credibility interval: -3.2%, 0.01%) indicated moderate evidence for a coast-wide decline, although trends varied strongly among the six conservation regions. Negative annual trends were detected in East Vancouver Island (-9%/yr) and South Mainland Coast (-3%/yr) Conservation Regions. Over a quarter of the year effects were significantly different from zero, and the estimated standard deviation in common-shared year effects between sites within each region was about 50% per year. This large common-shared interannual variation in counts may have been caused by regional movements of birds related to changes in marine conditions that affect the availability of prey. PMID:26258803
Delbiso, Tefera Darge; Rodriguez-Llanes, Jose Manuel; Altare, Chiara; Masquelier, Bruno; Guha-Sapir, Debarati
2016-01-01
Women's malnutrition, particularly undernutrition, remains an important public health challenge in Ethiopia. Although various studies have examined the levels and determinants of women's nutritional status, the influence of living close to an international border on women's nutrition has not been investigated. Yet, Ethiopian borders are regularly affected by conflict and refugee flows, which might ultimately impact health. We aimed to investigate the impact of living close to borders on the nutritional status of women in Ethiopia, while considering other important covariates. Our analysis was based on the body mass index (BMI) of 6,334 adult women aged 20-49 years, obtained from the 2011 Ethiopian Demographic and Health Survey (EDHS). A Bayesian multilevel multinomial logistic regression analysis was used to capture the clustered structure of the data and the possible correlation that may exist within and between clusters. After controlling for potential confounders, women living close to borders (i.e. ≤100 km) in Ethiopia were 59% more likely to be underweight (posterior odds ratio [OR]=1.59; 95% credible interval [CrI]: 1.32-1.90) than their counterparts living farther from the borders. This result was robust to different choices of border delineation (i.e. ≤50, ≤75, ≤125, and ≤150 km). Being from a poor family, lacking access to improved toilets, residing in lowland areas, and being Muslim were independently associated with underweight. In contrast, greater wealth, higher education, older age, access to improved toilets, being married, and living in urban or lowland areas were independently associated with overweight. The problem of undernutrition among women in Ethiopia is most worrisome in the border areas. Targeted interventions to improve nutritional status in these areas, such as improved access to sanitation and economic and livelihood support, are recommended.
Modeling Input Errors to Improve Uncertainty Estimates for Sediment Transport Model Predictions
NASA Astrophysics Data System (ADS)
Jung, J. Y.; Niemann, J. D.; Greimann, B. P.
2016-12-01
Bayesian methods using Markov chain Monte Carlo algorithms have recently been applied to sediment transport models to assess the uncertainty in the model predictions due to the parameter values. Unfortunately, the existing approaches can only attribute overall uncertainty to the parameters. This limitation is critical because no model can produce accurate forecasts if forced with inaccurate input data, even if the model is well founded in physical theory. In this research, an existing Bayesian method is modified to consider the potential errors in input data during the uncertainty evaluation process. The input error is modeled using Gaussian distributions, and the means and standard deviations are treated as uncertain parameters. The proposed approach is tested by coupling it to the Sedimentation and River Hydraulics - One Dimension (SRH-1D) model and simulating a 23-km reach of the Tachia River in Taiwan. The Wu equation in SRH-1D is used for computing the transport capacity for a bed material load of non-cohesive material. Three types of input data are considered uncertain: (1) the input flowrate at the upstream boundary, (2) the water surface elevation at the downstream boundary, and (3) the water surface elevation at a hydraulic structure in the middle of the reach. The benefits of modeling the input errors in the uncertainty analysis are evaluated by comparing the accuracy of the most likely forecast and the coverage of the observed data by the credible intervals to those of the existing method. The results indicate that the internal boundary condition has the largest uncertainty among those considered. Overall, the uncertainty estimates from the new method are notably different from those of the existing method for both the calibration and forecast periods.
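The core idea above — treating the mean and standard deviation of a Gaussian input error as additional uncertain parameters sampled by Markov chain Monte Carlo — can be sketched with a minimal Metropolis-Hastings sampler. The 'observed' discrepancies here are synthetic, flat priors are assumed, and SRH-1D itself is not involved:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic input-error data, e.g. discrepancies in an upstream flowrate series.
obs = rng.normal(0.5, 2.0, size=200)

def log_post(mu, sigma):
    """Log posterior for Gaussian error parameters under flat priors."""
    if sigma <= 0:
        return -np.inf
    return -obs.size * np.log(sigma) - np.sum((obs - mu) ** 2) / (2 * sigma**2)

chain = []
mu, sigma = 0.0, 1.0
lp = log_post(mu, sigma)
for _ in range(20000):
    # Random-walk proposal for both uncertain parameters
    mu_p, sigma_p = mu + rng.normal(0, 0.2), sigma + rng.normal(0, 0.2)
    lp_p = log_post(mu_p, sigma_p)
    if np.log(rng.uniform()) < lp_p - lp:  # Metropolis accept/reject
        mu, sigma, lp = mu_p, sigma_p, lp_p
    chain.append((mu, sigma))

mu_draws = np.array(chain)[5000:, 0]            # discard burn-in
print(np.percentile(mu_draws, [2.5, 97.5]))     # 95% credible interval for mu
```

The printed interval recovers the synthetic error mean (0.5) to within sampling noise; in the study, such error parameters are sampled jointly with the sediment transport model's parameters so their uncertainty propagates into the forecast credible intervals.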
ERIC Educational Resources Information Center
Rindskopf, David
2012-01-01
Muthen and Asparouhov (2012) made a strong case for the advantages of Bayesian methodology in factor analysis and structural equation models. I show additional extensions and adaptations of their methods and show how non-Bayesians can take advantage of many (though not all) of these advantages by using interval restrictions on parameters. By…
Wang, Hongrui; Wang, Cheng; Wang, Ying; ...
2017-04-05
This paper presents a Bayesian approach using a Metropolis-Hastings Markov chain Monte Carlo algorithm, and applies it to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of the associated uncertainties. While the Bayesian method performs similarly in estimating the mean daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a relatively narrower credible interval than the MLE confidence interval and thus a more precise estimate, by using related information from regional gage stations. As a result, the Bayesian MCMC method may be more favorable for uncertainty analysis and risk management.
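Why borrowing outside information can tighten an interval is visible even in a toy conjugate-normal calculation, with a prior standing in for the regional gauges. This is a deliberate simplification, not the paper's MCMC scheme, and all numbers are assumed:

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed local daily flow observations and a known observation sd.
local = rng.normal(100.0, 20.0, size=25)
sigma = 20.0

# MLE interval: xbar +/- 1.96 * sigma / sqrt(n), from local data alone.
se = sigma / np.sqrt(local.size)
mle_ci = (local.mean() - 1.96 * se, local.mean() + 1.96 * se)

# Conjugate posterior with prior N(mu0, tau0^2) encoding regional information.
mu0, tau0 = 105.0, 10.0
post_var = 1.0 / (1.0 / tau0**2 + local.size / sigma**2)
post_mean = post_var * (mu0 / tau0**2 + local.sum() / sigma**2)
bayes_ci = (post_mean - 1.96 * post_var**0.5, post_mean + 1.96 * post_var**0.5)

print(f"MLE CI width:   {mle_ci[1] - mle_ci[0]:.2f}")
print(f"Bayes CI width: {bayes_ci[1] - bayes_ci[0]:.2f}")  # narrower
```

The posterior precision is the sum of the prior and data precisions, so any informative prior shrinks the interval relative to the data-only MLE interval; the benefit in practice depends on how well the regional information matches the local site.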
Rong, Qiangqiang; Cai, Yanpeng; Chen, Bing; Yue, Wencong; Yin, Xin'an; Tan, Qian
2017-02-15
In this research, an export coefficient based dual inexact two-stage stochastic credibility constrained programming (ECDITSCCP) model was developed by integrating an improved export coefficient model (ECM), interval linear programming (ILP), fuzzy credibility constrained programming (FCCP) and a fuzzy expected value equation within a general two-stage programming (TSP) framework. The proposed ECDITSCCP model can effectively address multiple uncertainties expressed as random variables, fuzzy numbers, and pure and dual intervals. The model also provides a direct linkage between pre-regulated management policies and the associated economic implications. Moreover, solutions under multiple credibility levels can be obtained, providing potential decision alternatives for decision makers. The model was then applied to identify optimal land use structures for agricultural NPS pollution mitigation in a representative upstream subcatchment of the Miyun Reservoir watershed in north China. Optimal solutions were successfully obtained, indicating desired land use patterns and nutrient discharge schemes that maximize agricultural system benefits under a limited discharge permit. The numerous results under multiple credibility levels also give policy makers several options, helping them strike an appropriate balance between system benefits and pollution mitigation. The developed ECDITSCCP model can effectively address uncertain information in agricultural systems and shows great applicability to land use adjustment for agricultural NPS pollution mitigation. Copyright © 2016 Elsevier B.V. All rights reserved.
Conroy, M.J.; Runge, J.P.; Barker, R.J.; Schofield, M.R.; Fonnesbeck, C.J.
2008-01-01
Many organisms are patchily distributed, with some patches occupied at high density, others at lower densities, and others not occupied. Estimation of overall abundance can be difficult and is inefficient via intensive approaches such as capture-mark-recapture (CMR) or distance sampling. We propose a two-phase sampling scheme and model in a Bayesian framework to estimate abundance for patchily distributed populations. In the first phase, occupancy is estimated by binomial detection samples taken on all selected sites, where selection may be of all sites available, or a random sample of sites. Detection can be by visual surveys, detection of sign, physical captures, or other approach. At the second phase, if a detection threshold is achieved, CMR or other intensive sampling is conducted via standard procedures (grids or webs) to estimate abundance. Detection and CMR data are then used in a joint likelihood to model probability of detection in the occupancy sample via an abundance-detection model. CMR modeling is used to estimate abundance for the abundance-detection relationship, which in turn is used to predict abundance at the remaining sites, where only detection data are collected. We present a full Bayesian modeling treatment of this problem, in which posterior inference on abundance and other parameters (detection, capture probability) is obtained under a variety of assumptions about spatial and individual sources of heterogeneity. We apply the approach to abundance estimation for two species of voles (Microtus spp.) in Montana, USA. We also use a simulation study to evaluate the frequentist properties of our procedure given known patterns in abundance and detection among sites as well as design criteria. For most population characteristics and designs considered, bias and mean-square error (MSE) were low, and coverage of true parameter values by Bayesian credibility intervals was near nominal. 
Our two-phase, adaptive approach allows efficient estimation of abundance of rare and patchily distributed species and is particularly appropriate when sampling in all patches is impossible, but a global estimate of abundance is required.
Lai, Ying-Si; Zhou, Xiao-Nong; Pan, Zhi-Heng; Utzinger, Jürg; Vounatsou, Penelope
2017-01-01
Background Clonorchiasis, one of the most important food-borne trematodiases, affects more than 12 million people in the People’s Republic of China (P.R. China). Spatially explicit risk estimates of Clonorchis sinensis infection are needed in order to target control interventions. Methodology Georeferenced survey data pertaining to infection prevalence of C. sinensis in P.R. China from 2000 onwards were obtained via a systematic review in PubMed, ISI Web of Science, Chinese National Knowledge Internet, and Wanfang Data from January 1, 2000 until January 10, 2016, with no restriction of language or study design. Additional disease data were provided by the National Institute of Parasitic Diseases, Chinese Center for Diseases Control and Prevention in Shanghai. Environmental and socioeconomic proxies were extracted from remote-sensing and other data sources. Bayesian variable selection was carried out to identify the most important predictors of C. sinensis risk. Geostatistical models were applied to quantify the association between infection risk and the predictors of the disease, and to predict the risk of infection across P.R. China at high spatial resolution (over a grid with grid cell size of 5×5 km). Principal findings We obtained clonorchiasis survey data at 633 unique locations in P.R. China. We observed that the risk of C. sinensis infection increased over time, particularly from 2005 onwards. We estimate that around 14.8 million (95% Bayesian credible interval 13.8–15.8 million) people in P.R. China were infected with C. sinensis in 2010. Highly endemic areas (≥ 20%) were concentrated in southern and northeastern parts of the country. The provinces with the highest risk of infection and the largest number of infected people were Guangdong, Guangxi, and Heilongjiang. Conclusions/Significance Our results provide spatially relevant information for guiding clonorchiasis control interventions in P.R. China. The trend toward higher risk of C. 
sinensis infection in the recent past urges the Chinese government to pay more attention to the public health importance of clonorchiasis and to target interventions to high-risk areas. PMID:28253272
Seliske, L; Norwood, T A; McLaughlin, J R; Wang, S; Palleschi, C; Holowaty, E
2016-06-07
An important public health goal is to decrease the prevalence of key behavioural risk factors, such as tobacco use and obesity. Survey information is often available at the regional level, but heterogeneity within large geographic regions cannot be assessed. Advanced spatial analysis techniques are demonstrated to produce sensible micro area estimates of behavioural risk factors that enable identification of areas with high prevalence. A spatial Bayesian hierarchical model was used to estimate the micro area prevalence of current smoking and excess bodyweight for the Erie-St. Clair region in southwestern Ontario. Estimates were mapped for male and female respondents of five cycles of the Canadian Community Health Survey (CCHS). The micro areas were 2006 Census Dissemination Areas, with an average population of 400-700 people. Two individual-level models were specified: one controlled for survey cycle and age group (model 1), and one controlled for survey cycle, age group and micro area median household income (model 2). Post-stratification was used to derive micro area behavioural risk factor estimates weighted to the population structure. SaTScan analyses were conducted on the granular, postal-code level CCHS data to corroborate findings of elevated prevalence. Current smoking was elevated in two urban areas for both sexes (Sarnia and Windsor), and an additional small community (Chatham) for males only. Areas of excess bodyweight were prevalent in an urban core (Windsor) among males, but not females. Precision of the posterior post-stratified current smoking estimates was improved in model 2, as indicated by narrower credible intervals and a lower coefficient of variation. For excess bodyweight, both models had similar precision. Aggregation of the micro area estimates to CCHS design-based estimates validated the findings. 
This is among the first studies to apply a full Bayesian model to complex sample survey data to identify micro areas with variation in risk factor prevalence, accounting for spatial correlation and other covariates. Application of micro area analysis techniques helps define areas for public health planning, and may be informative to surveillance and research modeling of relevant chronic disease outcomes.
NASA Astrophysics Data System (ADS)
Costa, Veber; Fernandes, Wilson
2017-11-01
Extreme flood estimation has been a key research topic in hydrological sciences. Reliable estimates of such events are necessary as structures for flood conveyance continuously evolve in size and complexity and, as a result, their failure-associated hazards become more and more pronounced. For this reason, several estimation techniques intended to improve flood frequency analysis and reduce uncertainty in extreme quantile estimation have been addressed in the literature over recent decades. In this paper, we develop a Bayesian framework for the indirect estimation of extreme flood quantiles from rainfall-runoff models. In the proposed approach, an ensemble of long daily rainfall series is simulated with a stochastic generator, which models extreme rainfall amounts with an upper-bounded distribution function, namely, the 4-parameter lognormal model. The rationale behind the generation model is that physical limits for rainfall amounts, and consequently for floods, exist, and by imposing an appropriate upper bound on the probabilistic model, more plausible estimates can be obtained for rainfall quantiles with very low exceedance probabilities. Daily rainfall time series are converted into streamflows by routing each realization of the synthetic ensemble through a conceptual hydrologic model, the Rio Grande rainfall-runoff model. Calibration of parameters is performed through a nonlinear regression model, by means of the specification of a statistical model for the residuals that is able to accommodate autocorrelation, heteroscedasticity and non-normality. By combining the outlined steps in a Bayesian structure of analysis, one is able to properly summarize the resulting uncertainty and estimate more accurate credible intervals for a set of flood quantiles of interest. The method for indirect extreme flood estimation was applied to the American River catchment, at the Folsom dam, in the state of California, USA.
Results show that most floods, including exceptionally large non-systematic events, were reasonably estimated with the proposed approach. In addition, by accounting for uncertainties in each modeling step, one is able to obtain a better understanding of the influential factors in large flood formation dynamics.
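The simulate-then-route logic described above can be sketched end to end. The snippet below is a toy stand-in, not the authors' model: it draws daily rainfall from a generic upper-bounded (logit-normal) distribution in place of the 4-parameter lognormal, routes it through a single linear reservoir in place of the Rio Grande model, and reads flood quantiles off the synthetic ensemble; all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

def bounded_rainfall(n, upper=300.0, mu=-3.0, sigma=1.2):
    """Daily rainfall (mm) from an upper-bounded distribution.
    Stand-in for the paper's 4-parameter lognormal: a logit-normal
    scaled to (0, upper), so no simulated amount exceeds `upper`."""
    z = rng.normal(mu, sigma, n)
    wet = rng.random(n) < 0.3            # ~30% wet days (assumed)
    return np.where(wet, upper / (1.0 + np.exp(-z)), 0.0)

def linear_reservoir(p, k=0.2, area=1.0):
    """Toy conceptual rainfall-runoff model (stand-in for Rio Grande):
    storage S_t = S_{t-1} + P_t - k*S_{t-1}; discharge Q_t = k*S_t."""
    s, q = 0.0, np.empty_like(p)
    for t, pt in enumerate(p):
        s = s + pt - k * s
        q[t] = k * s * area
    return q

# Ensemble of long synthetic series -> annual-maximum floods per member.
n_members, years, days = 50, 100, 365
t100 = []
for _ in range(n_members):
    p = bounded_rainfall(years * days)
    q = linear_reservoir(p).reshape(years, days)
    annual_max = q.max(axis=1)
    # Empirical 100-year quantile (99th percentile of annual maxima).
    t100.append(np.quantile(annual_max, 0.99))

lo, hi = np.percentile(t100, [5, 95])
print(f"100-year flood, 90% ensemble interval: [{lo:.1f}, {hi:.1f}]")
```

The spread of the quantile across ensemble members is what the Bayesian machinery in the paper summarizes more formally as a credible interval.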
Intervals for posttest probabilities: a comparison of 5 methods.
Mossman, D; Berger, J O
2001-01-01
Several medical articles discuss methods of constructing confidence intervals for single proportions and the likelihood ratio, but scant attention has been given to the systematic study of intervals for the posterior odds, or the positive predictive value, of a test. The authors describe 5 methods of constructing confidence intervals for posttest probabilities when estimates of sensitivity, specificity, and the pretest probability of a disorder are derived from empirical data. They then evaluate each method to determine how well the intervals' coverage properties correspond to their nominal value. When the estimates of pretest probabilities, sensitivity, and specificity are derived from more than 80 subjects and are not close to 0 or 1, all methods generate intervals with appropriate coverage properties. When these conditions are not met, however, the best-performing method is an objective Bayesian approach implemented by a simple simulation using a spreadsheet. Physicians and investigators can generate accurate confidence intervals for posttest probabilities in small-sample situations using the objective Bayesian approach.
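A minimal sketch of the simulation-based objective Bayesian approach the authors recommend, assuming hypothetical 2x2 study counts and Jeffreys Beta(1/2, 1/2) posteriors (one common "objective" choice; the paper's exact prior may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical study data (invented for illustration):
tp, fn = 45, 5      # diseased subjects: test positive / negative
tn, fp = 40, 10     # healthy subjects:  test negative / positive
d, nd = 30, 70      # pretest sample:    diseased / not diseased

n = 100_000
# Jeffreys Beta(1/2, 1/2) posteriors for each estimated proportion.
sens = rng.beta(tp + 0.5, fn + 0.5, n)
spec = rng.beta(tn + 0.5, fp + 0.5, n)
prev = rng.beta(d + 0.5, nd + 0.5, n)

# Positive predictive value via Bayes' theorem, per posterior draw.
ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))

lo, hi = np.percentile(ppv, [2.5, 97.5])
print(f"posttest probability 95% interval: [{lo:.3f}, {hi:.3f}]")
```

This is exactly the kind of computation that fits in a spreadsheet, as the abstract notes: three random draws and one formula per row.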
Hu, X H; Li, Y P; Huang, G H; Zhuang, X W; Ding, X W
2016-05-01
In this study, a Bayesian-based two-stage inexact optimization (BTIO) method is developed for supporting water quality management through coupling Bayesian analysis with interval two-stage stochastic programming (ITSP). The BTIO method is capable of addressing uncertainties caused by insufficient inputs to the water quality model as well as uncertainties expressed as probabilistic distributions and interval numbers. The BTIO method is applied to a real case of water quality management for the Xiangxi River basin in the Three Gorges Reservoir region to seek optimal water quality management schemes under various uncertainties. Interval solutions for production patterns under a range of probabilistic water quality constraints have been generated. Results obtained demonstrate compromises between the system benefit and the system failure risk due to inherent uncertainties that exist in various system components. Moreover, information about pollutant emissions is obtained, which would help managers adjust production patterns of regional industry and local policies in light of the interactions among water quality requirements, economic benefits and industry structure.
Classical and Bayesian Seismic Yield Estimation: The 1998 Indian and Pakistani Tests
NASA Astrophysics Data System (ADS)
Shumway, R. H.
2001-10-01
The nuclear tests in May 1998 in India and Pakistan have stimulated a renewed interest in yield estimation, based on limited data from uncalibrated test sites. We study here the problem of estimating yields using classical and Bayesian methods developed by Shumway (1992), utilizing calibration data from the Semipalatinsk test site and measured magnitudes for the 1998 Indian and Pakistani tests given by Murphy (1998). Calibration is done using multivariate classical or Bayesian linear regression, depending on the availability of measured magnitude-yield data and prior information. Confidence intervals for the classical approach are derived applying an extension of Fieller's method suggested by Brown (1982). In the case where prior information is available, the posterior predictive magnitude densities are inverted to give posterior intervals for yield. Intervals obtained using the joint distribution of magnitudes are comparable to the single-magnitude estimates produced by Murphy (1998) and reinforce the conclusion that the announced yields of the Indian and Pakistani tests were too high.
Henschel, Volkmar; Engel, Jutta; Hölzel, Dieter; Mansmann, Ulrich
2009-02-10
Multivariate analysis of interval-censored event data based on classical likelihood methods is notoriously cumbersome, and likelihood inference for models that additionally include random effects is not available at all. Existing algorithms pose problems for practical users, such as matrix inversion, slow convergence and no assessment of statistical uncertainty. MCMC procedures combined with imputation are used to implement hierarchical models for interval-censored data within a Bayesian framework. Two examples from clinical practice demonstrate the handling of clustered interval-censored event times as well as multilayer random effects for inter-institutional quality assessment. The software developed is called survBayes and is freely available on CRAN. The proposed software supports the solution of complex analyses in many fields of clinical epidemiology as well as health services research.
NASA Technical Reports Server (NTRS)
Kraft, Ralph P.; Burrows, David N.; Nousek, John A.
1991-01-01
Two different methods, classical and Bayesian, for determining confidence intervals involving Poisson-distributed data are compared. Particular consideration is given to cases where the number of counts observed is small and is comparable to the mean number of background counts. Reasons for preferring the Bayesian over the classical method are given. Tables of confidence limits calculated by the Bayesian method are provided for quick reference.
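The Bayesian construction can be sketched numerically: with a flat prior on the source intensity S >= 0 and a known mean background, the posterior is proportional to the Poisson likelihood viewed as a function of S. The grid version below is an illustration of that setup, not the authors' tabulated algorithm, and it reports a central rather than a minimal-length interval:

```python
import numpy as np

def bayes_poisson_interval(n_obs, bkg, cl=0.90, s_max=50.0, ns=20001):
    """Bayesian credible interval for a Poisson source intensity S >= 0
    with known mean background `bkg` and a flat prior on S (the setup
    of Kraft, Burrows & Nousek 1991; numeric-grid sketch only)."""
    s = np.linspace(0.0, s_max, ns)
    mu = s + bkg
    # Unnormalised log posterior: Poisson likelihood as a function of S.
    log_post = n_obs * np.log(mu) - mu
    post = np.exp(log_post - log_post.max())
    cdf = np.cumsum(post)
    cdf /= cdf[-1]
    # Central interval; the published tables use the minimal-length
    # interval instead, so this simplifies their choice.
    lo = s[np.searchsorted(cdf, (1 - cl) / 2)]
    hi = s[np.searchsorted(cdf, 1 - (1 - cl) / 2)]
    return lo, hi

print(bayes_poisson_interval(n_obs=3, bkg=1.2))
```

Because the prior truncates at S = 0, the interval stays physically sensible even when the observed counts barely exceed the expected background, which is where the classical construction runs into trouble.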
2014-10-02
intervals (Neil, Tailor, Marquez, Fenton, & Hear, 2007). This is cumbersome, error-prone and usually inaccurate. Even though a universal framework...Science. Neil, M., Tailor, M., Marquez, D., Fenton, N., & Hear. (2007). Inference in Bayesian networks using dynamic discretisation. Statistics
McCool, Rachael; Gould, Ian M; Eales, Jacqui; Barata, Teresa; Arber, Mick; Fleetwood, Kelly; Glanville, Julie; Kauf, Teresa L
2017-01-07
Tedizolid, the active moiety of tedizolid phosphate, is approved in the United States, the European Union, Canada and a number of other countries for the treatment of acute bacterial skin and skin structure infections (ABSSSI) caused by certain susceptible bacteria, including methicillin-resistant Staphylococcus aureus (MRSA). This network meta-analysis (NMA) evaluates the comparative effectiveness of tedizolid and other antibacterials indicated for the treatment of ABSSSI caused by MRSA. Systematic review of 10 databases was undertaken to inform an NMA to estimate the relative effectiveness of tedizolid and established monotherapy comparators (ceftaroline, daptomycin, linezolid, teicoplanin, tigecycline, vancomycin) for treating MRSA-associated ABSSSI. Randomized controlled trials enrolling adults with ABSSSI or complicated skin and skin structure infections caused by suspected/documented MRSA were eligible for inclusion. Networks were developed based on similarity of study design, patient characteristics, outcome measures and available data. Outcomes of interest included clinical response at end of therapy (EOT), post-therapy evaluation (PTE) or test-of-cure assessment and treatment discontinuations resulting from adverse events (AEs). Bayesian NMA was conducted for each outcome using fixed-effects and random-effects models. Literature searches identified 3,618 records; 15 trials met the inclusion criteria and were considered suitable for NMA comparison. In fixed-effects models, tedizolid had higher odds of clinical response at EOT (odds ratio [OR], 1.7; credible interval, 1.0, 3.0) and PTE than vancomycin (OR, 1.6; credible interval, 1.1, 2.5). No differences in odds of clinical response at EOT or PTE were observed between tedizolid and other comparators. There was no evidence of a difference among treatments for discontinuation due to AEs. Results from random-effects and fixed-effects models were generally consistent.
Tedizolid was superior to vancomycin for clinical response at EOT and PTE. There was no evidence of a difference between tedizolid and the other comparators for clinical response, and no evidence of a difference between tedizolid and any comparator for discontinuation due to AEs. These findings suggest that tedizolid provides an alternative option for the management of serious skin infections caused by suspected or documented MRSA. This study is subject to the limitations inherent in all NMAs, and the results should be interpreted accordingly.
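The pairwise building block of such an NMA (a single direct contrast, here with invented arm counts and flat Beta(1, 1) priors) can be sketched by Monte Carlo; a full NMA additionally pools direct and indirect evidence across the trial network:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical arm-level counts (responders / total), invented numbers.
r_ted, n_ted = 80, 100   # tedizolid arm
r_van, n_van = 70, 100   # vancomycin arm

draws = 100_000
# Beta(1, 1) priors on each arm's response probability give
# Beta(r + 1, n - r + 1) posteriors.
p_ted = rng.beta(r_ted + 1, n_ted - r_ted + 1, draws)
p_van = rng.beta(r_van + 1, n_van - r_van + 1, draws)

# Posterior draws of the odds ratio and its 95% credible interval.
or_draws = (p_ted / (1 - p_ted)) / (p_van / (1 - p_van))
lo, med, hi = np.percentile(or_draws, [2.5, 50, 97.5])
print(f"OR {med:.2f} (95% CrI {lo:.2f}, {hi:.2f})")
```

A credible interval that excludes 1 corresponds to the kind of "higher odds" finding reported for tedizolid versus vancomycin above.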
Genetic variability in calving success in Aberdeen Angus cows under extensive recording.
Urioste, J I; Chang, Y M; Naya, H; Gianola, D
2007-09-01
Data from 2032 Uruguayan Aberdeen Angus cows under extensive management and recording practices were analysed with Bayesian threshold-liability sire models, to assess genetic variability in calving success (CS), defined as a different binary trait for each of the second (CS2), third (CS3) and fourth (CS4) calving opportunities. Sire (herd) variances ranged from 0.08 to 0.11 (0.10 to 0.20) and heritability from 0.27 to 0.35, with large credibility intervals. Correlations between herd effects on CS at different calving opportunities were positive. Genetic correlation between CS2 and CS4 was positive (0.68), whereas those involving adjacent calving opportunities (CS2-CS3 and CS3-CS4) were negative, at -0.39 and -0.54, respectively. The residual correlation CS2-CS3 was negative (-0.32). The extent of uncertainty associated with the posterior estimates of the parameters was further evaluated through simulation, assuming different true values (-0.4, -0.2, +0.2 and +0.4) for the genetic correlations and changes in the degree of belief parameters of the inverse Wishart priors for the sire covariance matrix. Although inferences were not sharp enough, CS appears to be moderately heritable. The quality of data recording should be improved, in order to effect genetic improvement in female fertility.
Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.
Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana
2014-06-01
Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd.
Optimal Search for an Astrophysical Gravitational-Wave Background
NASA Astrophysics Data System (ADS)
Smith, Rory; Thrane, Eric
2018-04-01
Roughly every 2-10 min, a pair of stellar-mass black holes merge somewhere in the Universe. A small fraction of these mergers are detected as individually resolvable gravitational-wave events by advanced detectors such as LIGO and Virgo. The rest contribute to a stochastic background. We derive the statistically optimal search strategy (producing minimum credible intervals) for a background of unresolved binaries. Our method applies Bayesian parameter estimation to all available data. Using Monte Carlo simulations, we demonstrate that the search is both "safe" and effective: it is not fooled by instrumental artifacts such as glitches and it recovers simulated stochastic signals without bias. Given realistic assumptions, we estimate that the search can detect the binary black hole background with about 1 day of design sensitivity data versus ≈40 months using the traditional cross-correlation search. This framework independently constrains the merger rate and black hole mass distribution, breaking a degeneracy present in the cross-correlation approach. The search provides a unified framework for population studies of compact binaries, which is cast in terms of hyperparameter estimation. We discuss a number of extensions and generalizations, including application to other sources (such as binary neutron stars and continuous-wave sources), simultaneous estimation of a continuous Gaussian background, and applications to pulsar timing.
Analyzing chromatographic data using multilevel modeling.
Wiczling, Paweł
2018-06-01
It is relatively easy to collect chromatographic measurements for a large number of analytes, especially with gradient chromatographic methods coupled with mass spectrometry detection. Such data often have a hierarchical or clustered structure. For example, analytes with similar hydrophobicity and dissociation constant tend to be more alike in their retention than a randomly chosen set of analytes. Multilevel models recognize the existence of such data structures by assigning a model for each parameter, with its parameters also estimated from data. In this work, a multilevel model is proposed to describe retention time data obtained from a series of wide linear organic modifier gradients of different gradient duration and different mobile phase pH for a large set of acids and bases. The multilevel model consists of (1) the same deterministic equation describing the relationship between retention time and analyte-specific and instrument-specific parameters, (2) covariance relationships relating various physicochemical properties of the analyte to chromatographically specific parameters through quantitative structure-retention relationship based equations, and (3) stochastic components of intra-analyte and interanalyte variability. The model was implemented in Stan, which provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods. Graphical abstract: Relationships between log k and MeOH content for acidic, basic, and neutral compounds with different log P. CI, credible interval; PSA, polar surface area.
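The partial-pooling idea behind such a multilevel model can be shown in miniature. The sketch below is not the paper's Stan model: it shrinks per-analyte means toward the grand mean with known (assumed) variance components, on simulated data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated retention-time offsets for 20 analytes, 5 replicates each.
n_analytes, n_rep = 20, 5
true_mu, tau, sigma = 10.0, 2.0, 1.0          # assumed hyperparameters
theta = rng.normal(true_mu, tau, n_analytes)  # analyte-level parameters
y = rng.normal(theta[:, None], sigma, (n_analytes, n_rep))

# Partial pooling: each analyte's estimate is shrunk toward the grand
# mean, with weight set by within- vs between-analyte precision.
ybar = y.mean(axis=1)
grand = ybar.mean()
w = (n_rep / sigma**2) / (n_rep / sigma**2 + 1 / tau**2)
pooled = w * ybar + (1 - w) * grand

print("shrinkage weight:", round(w, 3))
```

In the full model the hyperparameters are estimated jointly with everything else (which is what Stan's MCMC does), rather than fixed as here.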
An introduction to using Bayesian linear regression with clinical data.
Baldwin, Scott A; Larson, Michael J
2017-11-01
Statistical training in psychology focuses on frequentist methods. Bayesian methods are an alternative to standard frequentist methods. This article provides researchers with an introduction to fundamental ideas in Bayesian modeling. We use data from an electroencephalogram (EEG) and anxiety study to illustrate Bayesian models. Specifically, the models examine the relationship between error-related negativity (ERN), a particular event-related potential, and trait anxiety. Methodological topics covered include: how to set up a regression model in a Bayesian framework, specifying priors, examining convergence of the model, visualizing and interpreting posterior distributions, interval estimates, expected and predicted values, and model comparison tools. We also discuss situations where Bayesian methods can outperform frequentist methods, as well as how to specify more complicated regression models. Finally, we conclude with recommendations about reporting guidelines for those using Bayesian methods in their own research. We provide data and R code for replicating our analyses. Copyright © 2017 Elsevier Ltd. All rights reserved.
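A minimal worked example of the kind of model the article introduces, using simulated data in place of the authors' EEG dataset and a conjugate normal prior with the residual variance assumed known (the article itself fits fuller models in R):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated stand-in for an ERN ~ anxiety regression (not the authors' data).
n = 50
anxiety = rng.normal(0.0, 1.0, n)
ern = 1.0 - 0.5 * anxiety + rng.normal(0.0, 1.0, n)   # true slope -0.5

X = np.column_stack([np.ones(n), anxiety])
sigma2 = 1.0                       # residual variance, assumed known here
prior_mean = np.zeros(2)
prior_cov = np.eye(2) * 10.0       # weakly informative N(0, 10 I) prior

# Conjugate posterior for the coefficients (normal prior, known sigma^2).
post_cov = np.linalg.inv(np.linalg.inv(prior_cov) + X.T @ X / sigma2)
post_mean = post_cov @ (np.linalg.inv(prior_cov) @ prior_mean
                        + X.T @ ern / sigma2)

# 95% posterior interval for the slope.
sd = np.sqrt(post_cov[1, 1])
lo, hi = post_mean[1] - 1.96 * sd, post_mean[1] + 1.96 * sd
print(f"slope: {post_mean[1]:.2f} (95% CrI {lo:.2f}, {hi:.2f})")
```

With an unknown residual variance or hierarchical structure, the closed form disappears and one switches to MCMC, which is the workflow the article walks through.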
The psychometric properties of Observer OPTION(5), an observer measure of shared decision making.
Barr, Paul J; O'Malley, Alistair James; Tsulukidze, Maka; Gionfriddo, Michael R; Montori, Victor; Elwyn, Glyn
2015-08-01
Observer OPTION(5) was designed as a more efficient version of OPTION(12), the most commonly used measure of shared decision making (SDM). The current paper assesses the psychometric properties of OPTION(5). Two raters used OPTION(5) to rate recordings of clinical encounters from two previous patient decision aid (PDA) trials (n=201; n=110). A subsample was re-rated two weeks later. We assessed discriminative validity, inter-rater reliability, intra-rater reliability, and concurrent validity. OPTION(5) demonstrated discriminative validity, with increases in SDM between usual care and PDA arms. OPTION(5) also demonstrated concurrent validity with OPTION(12), r=0.61 (95%CI 0.54, 0.68) and intra-rater reliability, r=0.93 (0.83, 0.97). The mean difference in rater score was 8.89 (95% Credibility Interval, 7.5, 10.3), with intraclass correlation (ICC) of 0.67 (95% Credibility Interval, 0.51, 0.91) for the accuracy of rater scores and 0.70 (95% Credibility Interval, 0.56, 0.94) for the consistency of rater scores across encounters, indicating good inter-rater reliability. Raters reported lower cognitive burden when using OPTION(5) compared to OPTION(12). OPTION(5) is a brief, theoretically grounded observer measure of SDM with promising psychometric properties in this sample and low burden on raters. OPTION(5) has potential to provide reliable, valid assessment of SDM in clinical encounters. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
2015-03-26
[List-of-figures fragment: CSE implementation for use with CV Domes data; validation results for N = 1 observation at 1.0 and 0.01 intervals, Legendre polynomial of order Nl = 5.]
Transmission potential of influenza A/H7N9, February to May 2013, China
2013-01-01
Background On 31 March 2013, the first human infections with the novel influenza A/H7N9 virus were reported in Eastern China. The outbreak expanded rapidly in geographic scope and size, with a total of 132 laboratory-confirmed cases reported by 3 June 2013, in 10 Chinese provinces and Taiwan. The incidence of A/H7N9 cases has stalled in recent weeks, presumably as a consequence of live bird market closures in the most heavily affected areas. Here we compare the transmission potential of influenza A/H7N9 with that of other emerging pathogens and evaluate the impact of intervention measures in an effort to guide pandemic preparedness. Methods We used a Bayesian approach combined with a SEIR (Susceptible-Exposed-Infectious-Removed) transmission model fitted to daily case data to assess the reproduction number (R) of A/H7N9 by province and to evaluate the impact of live bird market closures in April and May 2013. Simulation studies helped quantify the performance of our approach in the context of an emerging pathogen, where human-to-human transmission is limited and most cases arise from spillover events. We also used alternative approaches to estimate R based on individual-level information on prior exposure and compared the transmission potential of influenza A/H7N9 with that of other recent zoonoses. Results Estimates of R for the A/H7N9 outbreak were below the epidemic threshold required for sustained human-to-human transmission and remained near 0.1 throughout the study period, with broad 95% credible intervals by the Bayesian method (0.01 to 0.49). The Bayesian estimation approach was dominated by the prior distribution, however, due to relatively little information contained in the case data. We observe a statistically significant deceleration in growth rate after 6 April 2013, which is consistent with a reduction in A/H7N9 transmission associated with the preemptive closure of live bird markets. 
Although confidence intervals are broad, the estimated transmission potential of A/H7N9 appears lower than that of recent zoonotic threats, including avian influenza A/H5N1, swine influenza H3N2sw and Nipah virus. Conclusion Although uncertainty remains high in R estimates for H7N9 due to limited epidemiological information, all available evidence points to a low transmission potential. Continued monitoring of the transmission potential of A/H7N9 is critical in the coming months as intervention measures may be relaxed and seasonal factors could promote disease transmission in colder months. PMID:24083506
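The subcritical dynamics described here can be illustrated with a deterministic SEIR toy model: with R well below 1, case counts track the animal-to-human spillover rate rather than growing on their own. All rates below are assumed for illustration, and the sketch omits the paper's Bayesian fitting entirely:

```python
import numpy as np

def seir_with_spillover(r0, days=90, pop=1e7, spill=1.0,
                        latent=3.0, infectious=5.0):
    """Deterministic daily-step SEIR with a constant spillover rate
    (new human cases/day from the animal reservoir). Shows why R < 1
    keeps an outbreak spillover-driven; a sketch, not the paper's
    estimation procedure. All parameter values are invented."""
    beta = r0 / infectious
    s, e, i, cases = pop, 0.0, 0.0, []
    for _ in range(days):
        force = beta * s * i / pop          # human-to-human transmission
        new_exp = force + spill             # plus reservoir spillover
        s -= force
        e += new_exp - e / latent
        i += e / latent - i / infectious
        cases.append(new_exp)
    return np.array(cases)

sub = seir_with_spillover(r0=0.1).sum()
sup = seir_with_spillover(r0=2.5).sum()
print(f"total cases, R=0.1: {sub:.0f}; R=2.5: {sup:.0f}")
```

At R = 0.1 each spillover case generates on average 1/(1 - R) ≈ 1.1 total cases, so incidence stays pinned near the spillover rate; at R = 2.5 the same seeding produces exponential growth, which is the contrast the abstract draws with pandemic influenza.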
ERIC Educational Resources Information Center
Abayomi, Kobi; Pizarro, Gonzalo
2013-01-01
We offer a straightforward framework for measurement of progress, across many dimensions, using cross-national social indices, which we classify as linear combinations of multivariate country level data onto a univariate score. We suggest a Bayesian approach which yields probabilistic (confidence type) intervals for the point estimates of country…
ERIC Educational Resources Information Center
Gustafson, S. C.; Costello, C. S.; Like, E. C.; Pierce, S. J.; Shenoy, K. N.
2009-01-01
Bayesian estimation of a threshold time (hereafter simply threshold) for the receipt of impulse signals is accomplished given the following: 1) data, consisting of the number of impulses received in a time interval from zero to one and the time of the largest time impulse; 2) a model, consisting of a uniform probability density of impulse time…
A fuzzy Bayesian approach to flood frequency estimation with imprecise historical information
Salinas, José Luis; Kiss, Andrea; Viglione, Alberto; Viertl, Reinhard; Blöschl, Günter
2016-01-01
This paper presents a novel framework that links imprecision (through a fuzzy approach) and stochastic uncertainty (through a Bayesian approach) in estimating flood probabilities from historical flood information and systematic flood discharge data. The method exploits the linguistic characteristics of historical source material to construct membership functions, which may be wider or narrower, depending on the vagueness of the statements. The membership functions are either included in the prior distribution or the likelihood function to obtain a fuzzy version of the flood frequency curve. The viability of the approach is demonstrated by three case studies that differ in terms of their hydromorphological conditions (from an Alpine river with bedrock profile to a flat lowland river with extensive flood plains) and historical source material (including narratives, town and county meeting protocols, flood marks and damage accounts). The case studies are presented in order of increasing fuzziness (the Rhine at Basel, Switzerland; the Werra at Meiningen, Germany; and the Tisza at Szeged, Hungary). Incorporating imprecise historical information is found to reduce the range between the 5% and 95% Bayesian credibility bounds of the 100 year floods by 45% and 61% for the Rhine and Werra case studies, respectively. The strengths and limitations of the framework are discussed relative to alternative (non-fuzzy) methods. The fuzzy Bayesian inference framework provides a flexible methodology that fits the imprecise nature of linguistic information on historical floods as available in historical written documentation. PMID:27840456
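The membership-function idea can be sketched as follows, with an invented vague statement and a Gumbel flood model (the paper's distributional choices may differ): the historical observation enters the likelihood as an integral of the density against its membership function rather than as a crisp value.

```python
import numpy as np

rng = np.random.default_rng(5)

def trapezoid(x, a, b, c, d):
    """Membership function for a vague statement such as 'well above a,
    most plausibly between b and c, certainly below d'."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

def gumbel_pdf(x, mu, beta=400.0):
    z = (x - mu) / beta
    return np.exp(-z - np.exp(-z)) / beta

# Synthetic systematic record plus one fuzzily described historical flood
# ('well above 3000 m3/s, certainly below 5000') -- invented numbers.
sys_floods = rng.gumbel(2000.0, 400.0, 30)
xg = np.linspace(1000.0, 7000.0, 2001)
dx = xg[1] - xg[0]
member = trapezoid(xg, 3000.0, 3500.0, 4500.0, 5000.0)

# Grid posterior for the Gumbel location (scale fixed for brevity).
mu_grid = np.linspace(1500.0, 3000.0, 601)
dmu = mu_grid[1] - mu_grid[0]
log_post = np.empty_like(mu_grid)
for j, mu in enumerate(mu_grid):
    log_post[j] = np.log(gumbel_pdf(sys_floods, mu)).sum()
    # Fuzzy likelihood: integrate the density against the membership
    # function rather than evaluating it at a crisp observation.
    log_post[j] += np.log((member * gumbel_pdf(xg, mu)).sum() * dx)

post = np.exp(log_post - log_post.max())
post /= post.sum() * dmu
mean_mu = (mu_grid * post).sum() * dmu
print(f"posterior mean Gumbel location: {mean_mu:.0f} m3/s")
```

A wider trapezoid (a vaguer statement) contributes a flatter fuzzy likelihood and therefore constrains the flood frequency curve less, which is the trade-off the case studies quantify.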
Peña, Carlos; Espeland, Marianne
2015-01-01
The species rich butterfly family Nymphalidae has been used to study evolutionary interactions between plants and insects. Theories of insect-hostplant dynamics predict accelerated diversification due to key innovations. In evolutionary biology, analysis of maximum credibility trees in the software MEDUSA (modelling evolutionary diversity using stepwise AIC) is a popular method for estimation of shifts in diversification rates. We investigated whether phylogenetic uncertainty can produce different results by extending the method across a random sample of trees from the posterior distribution of a Bayesian run. Using the MultiMEDUSA approach, we found that phylogenetic uncertainty greatly affects diversification rate estimates. Different trees produced diversification rates ranging from high values to almost zero for the same clade, and both significant rate increase and decrease in some clades. Only four out of 18 significant shifts found on the maximum clade credibility tree were consistent across most of the sampled trees. Among these, we found accelerated diversification for Ithomiini butterflies. We used the binary speciation and extinction model (BiSSE) and found that a hostplant shift to Solanaceae is correlated with increased net diversification rates in Ithomiini, congruent with the diffuse cospeciation hypothesis. Our results show that taking phylogenetic uncertainty into account when estimating net diversification rate shifts is of great importance, as very different results can be obtained when using the maximum clade credibility tree and other trees from the posterior distribution. PMID:25830910
Josefsson, Torbjörn; Ivarsson, Andreas; Lindwall, Magnus; Gustafsson, Henrik; Stenling, Andreas; Böröy, Jan; Mattsson, Emil; Carnebratt, Jakob; Sevholt, Simon; Falkevik, Emil
2017-01-01
The main objective of the project was to examine a proposed theoretical model of mindfulness mechanisms in sports. We conducted two studies (the first study using a cross-sectional design and the second a longitudinal design) to investigate if rumination and emotion regulation mediate the relation between dispositional mindfulness and sport-specific coping. Two hundred and forty-two young elite athletes, drawn from various sports, were recruited for the cross-sectional study. For the longitudinal study, 65 elite athletes were recruited. All analyses were performed using Bayesian statistics. The path analyses showed credible indirect effects of dispositional mindfulness on coping via rumination and emotion regulation in both the cross-sectional study and the longitudinal study. Additionally, the results in both studies showed credible direct effects of dispositional mindfulness on rumination and emotion regulation. Further, credible direct effects of emotion regulation as well as rumination on coping were also found in both studies. Our findings support the theoretical model, indicating that rumination and emotion regulation function as essential mechanisms in the relation between dispositional mindfulness and sport-specific coping skills. Increased dispositional mindfulness in competitive athletes (i.e. by practicing mindfulness) may lead to reductions in rumination, as well as an improved capacity to regulate negative emotions. By doing so, athletes may improve their sport-related coping skills, and thereby enhance athletic performance.
Bórquez, Annick; Cori, Anne; Pufall, Erica L; Kasule, Jingo; Slaymaker, Emma; Price, Alison; Elmes, Jocelyn; Zaba, Basia; Crampin, Amelia C; Kagaayi, Joseph; Lutalo, Tom; Urassa, Mark; Gregson, Simon; Hallett, Timothy B
2016-09-01
Programmatic planning in HIV requires estimates of the distribution of new HIV infections according to identifiable characteristics of individuals. In sub-Saharan Africa, robust routine data sources and historical epidemiological observations are available to inform and validate such estimates. We developed a predictive model, the Incidence Patterns Model (IPM), representing populations according to factors that have been demonstrated to be strongly associated with HIV acquisition risk: gender, marital/sexual activity status, geographic location, "key populations" based on risk behaviours (sex work, injecting drug use, and male-to-male sex), HIV and ART status within married or cohabiting unions, and circumcision status. The IPM estimates the distribution of new infections acquired by group based on these factors within a Bayesian framework accounting for regional prior information on demographic and epidemiological characteristics from trials or observational studies. We validated and trained the model against direct observations of HIV incidence by group in seven rounds of cohort data from four studies ("sites") conducted in Manicaland, Zimbabwe; Rakai, Uganda; Karonga, Malawi; and Kisesa, Tanzania. The IPM performed well, with the projections' credible intervals for the proportion of new infections per group overlapping the data's confidence intervals for all groups in all rounds of data. In terms of geographical distribution, the projections' credible intervals overlapped the confidence intervals for four out of seven rounds; geographic locations served as proxies for administrative divisions within a country. We assessed model performance after internal training (within one site) and external training (between sites) by comparing mean posterior log-likelihoods and used the best model to estimate the distribution of HIV incidence in six countries (Gabon, Kenya, Malawi, Rwanda, Swaziland, and Zambia) in the region.
We subsequently inferred the potential contribution of each group to transmission using a simple model that builds on the results from the IPM and makes further assumptions about sexual mixing patterns and transmission rates. In all countries except Swaziland, individuals in unions were the single group contributing to the largest proportion of new infections acquired (39%-77%), followed by never married women and men. Female sex workers accounted for a large proportion of new infections (5%-16%) compared to their population size. Individuals in unions were also the single largest contributor to the proportion of infections transmitted (35%-62%), followed by key populations and previously married men and women. Swaziland exhibited different incidence patterns, with never married men and women accounting for over 65% of new infections acquired and also contributing to a large proportion of infections transmitted (up to 56%). Between- and within-country variations indicated different incidence patterns in specific settings. It is possible to reliably predict the distribution of new HIV infections acquired using data routinely available in many countries in the sub-Saharan African region with a single relatively simple mathematical model. This tool would complement more specific analyses to guide resource allocation, data collection, and programme planning.
Zou, Yonghong; Wang, Lixia; Christensen, Erik R
2015-10-01
This work aimed to explain the challenges of the fingerprints-based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in the sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns, but it is not able to suggest specific sources. Not all positive matrix factorization (PMF)-determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs for a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). Copyright © 2015. Published by Elsevier Ltd.
Bayesian GGE biplot models applied to maize multi-environments trials.
de Oliveira, L A; da Silva, C P; Nuvunga, J J; da Silva, A Q; Balestre, M
2016-06-17
The additive main effects and multiplicative interaction (AMMI) and the genotype main effects and genotype x environment interaction (GGE) models stand out among the linear-bilinear models used in genotype x environment interaction studies. Despite the advantages of their use to describe genotype x environment (AMMI) or genotype and genotype x environment (GGE) interactions, these methods have known limitations that are inherent to fixed effects models, including difficulty in treating variance heterogeneity and missing data. Traditional biplots include no measure of uncertainty regarding the principal components. The present study aimed to apply the Bayesian approach to GGE biplot models and assess the implications for selecting stable and adapted genotypes. Our results demonstrated that the Bayesian approach applied to GGE models with non-informative priors was consistent with the traditional GGE biplot analysis, although the credible region incorporated into the biplot enabled distinguishing, based on probability, the performance of genotypes, and their relationships with the environments in the biplot. Those regions also enabled the identification of groups of genotypes and environments with similar effects in terms of adaptability and stability. The relative position of genotypes and environments in biplots is highly affected by the experimental accuracy. Thus, incorporation of uncertainty in biplots is a key tool for breeders to make decisions regarding stability selection and adaptability and the definition of mega-environments.
Bayesian Approach for Reliability Assessment of Sunshield Deployment on JWST
NASA Technical Reports Server (NTRS)
Kaminskiy, Mark P.; Evans, John W.; Gallo, Luis D.
2013-01-01
Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications, for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a Bayesian approach for reliability estimation of spacecraft deployment was developed for this purpose. This approach was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the observatory's telescope and science instruments. In order to collect the prior information on deployable systems, detailed studies of "heritage information" were conducted, extending over 45 years of spacecraft launches. The NASA Goddard Space Flight Center (GSFC) Spacecraft Operational Anomaly and Reporting System (SOARS) data were then used to estimate the parameters of the conjugate beta prior distribution for anomaly and failure occurrence, as the most consistent set of available data that could be matched to launch histories. This allows for an empirical Bayesian prediction of the risk of an anomaly occurrence during the complex Sunshield deployment, with credibility limits, using prior deployment data and test information.
A Bayesian Framework for Reliability Analysis of Spacecraft Deployments
NASA Technical Reports Server (NTRS)
Evans, John W.; Gallo, Luis; Kaminsky, Mark
2012-01-01
Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a two-stage sequential Bayesian framework for reliability estimation of spacecraft deployment was developed for this purpose. This process was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the Optical Telescope Element. Initially, detailed studies of NASA deployment history, "heritage information", were conducted, extending over 45 years of spacecraft launches. This information was then coupled to a non-informative prior and a binomial likelihood function to create a posterior distribution for deployments of various subsystems using Markov chain Monte Carlo sampling. Select distributions were then coupled to a subsequent analysis, using test data and anomaly occurrences on successive ground test deployments of scale model test articles of JWST hardware, to update the NASA heritage data. This allowed for a realistic prediction of the reliability of the complex Sunshield deployment, with credibility limits, within this two-stage Bayesian framework.
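The conjugate updating described in both JWST abstracts above can be sketched in a few lines. The counts below are hypothetical, purely to illustrate how a Beta prior combined with binomial deployment data yields a posterior and credibility limits; a simple grid approximation stands in for the Beta quantile function.

```python
def beta_posterior(successes, failures, a_prior=1.0, b_prior=1.0):
    """Conjugate update: Beta(a, b) prior + binomial data -> Beta posterior."""
    return a_prior + successes, b_prior + failures

def credible_interval(a, b, level=0.95, n_grid=100_001):
    """Equal-tailed interval from a grid approximation of the Beta(a, b) density."""
    grid = [(i + 0.5) / n_grid for i in range(n_grid)]
    weights = [p ** (a - 1) * (1 - p) ** (b - 1) for p in grid]  # unnormalised
    total = sum(weights)
    cdf, cum = [], 0.0
    for w in weights:
        cum += w
        cdf.append(cum / total)
    lo = next(p for p, c in zip(grid, cdf) if c >= (1 - level) / 2)
    hi = next(p for p, c in zip(grid, cdf) if c >= 1 - (1 - level) / 2)
    return lo, hi

# Hypothetical heritage record: 97 successful deployments in 100 attempts,
# starting from a uniform Beta(1, 1) prior.
a, b = beta_posterior(successes=97, failures=3)
post_mean = a / (a + b)
lo, hi = credible_interval(a, b)
```

Under this sketch the posterior mean is 98/102 ≈ 0.96; the studies above instead fit the prior parameters empirically from the SOARS heritage data and then update with test deployments.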
He, Jian; Wu, Ping; Tang, Yaoyun; Liu, Sulai; Xie, Chubo; Luo, Shi; Zeng, Junfeng; Xu, Jing; Zhao, Suping
2017-01-01
Objective A Bayesian network meta-analysis (NMA) was conducted to estimate the overall survival (OS) and complete response (CR) performance in nasopharyngeal carcinoma (NPC) patients treated with radiotherapy, concurrent chemoradiotherapy (C), adjuvant chemotherapy (A), neoadjuvant chemotherapy (N), concurrent chemoradiotherapy with adjuvant chemotherapy (C+A), concurrent chemoradiotherapy with neoadjuvant chemotherapy (C+N) and neoadjuvant chemotherapy with adjuvant chemotherapy (N+A). Methods A literature search was conducted in electronic databases. Hazard ratios (HRs) with their 95% confidence intervals (95% CIs) or 95% credible intervals (95% CrIs) were used to measure the relative survival benefit between two comparators, and odds ratios (ORs) with their 95% CIs or CrIs were used to present CR data from individual studies. Results In total, 52 qualified studies with 10,081 patients were included in this NMA. In conventional meta-analysis (MA), patients with C+N exhibited an average increase of 9% in the 3-year OS relative to those with C+A. As for the NMA results, five therapies were associated with a significantly reduced HR for 5-year OS when compared with the control group. C, C+A and N+A also presented a decreased HR compared with A. There was continuity among 1-year, 3-year and 5-year OS status. Cluster analysis suggested that the three concurrent chemoradiotherapy-based regimens formed the most competitive group, located in the upper right corner of the cluster plot. Conclusion In view of survival rate and complete response, the NMA results revealed that C, C+A and C+N showed excellent efficacy. As a result, these three therapies should be considered first-line treatments according to this NMA. PMID:28418901
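A network meta-analysis combines direct and indirect evidence under a consistency assumption. The core of the indirect step can be sketched as a Bucher-style comparison on the log hazard-ratio scale; the HR and standard-error values below are hypothetical, not taken from the 52 included studies.

```python
import math

def indirect_estimate(log_hr_ab, se_ab, log_hr_cb, se_cb):
    """Indirect comparison of A vs C through a common comparator B:
    log HR(A vs C) = log HR(A vs B) - log HR(C vs B); variances add."""
    log_hr_ac = log_hr_ab - log_hr_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    return math.exp(log_hr_ac), se_ac

# Hypothetical inputs: HR(C+N vs radiotherapy) = 0.70 (SE 0.10 on log scale),
# HR(C+A vs radiotherapy) = 0.80 (SE 0.12 on log scale).
hr, se = indirect_estimate(math.log(0.70), 0.10, math.log(0.80), 0.12)
```

The Bayesian NMA in the study generalises this idea, pooling all such comparison loops simultaneously and reporting credible rather than confidence intervals.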
Choosing MUSE: Validation of a Low-Cost, Portable EEG System for ERP Research
Krigolson, Olave E.; Williams, Chad C.; Norton, Angela; Hassall, Cameron D.; Colino, Francisco L.
2017-01-01
In recent years there has been an increase in the number of portable low-cost electroencephalographic (EEG) systems available to researchers. However, to date the validation of the use of low-cost EEG systems has focused on continuous recording of EEG data and/or the replication of large system EEG setups reliant on event-markers to afford examination of event-related brain potentials (ERP). Here, we demonstrate that it is possible to conduct ERP research without being reliant on event markers using a portable MUSE EEG system and a single computer. Specifically, we report the results of two experiments using data collected with the MUSE EEG system—one using the well-known visual oddball paradigm and the other using a standard reward-learning task. Our results demonstrate that we could observe and quantify the N200 and P300 ERP components in the visual oddball task and the reward positivity (the mirror opposite component to the feedback-related negativity) in the reward-learning task. Specifically, single sample t-tests of component existence (all p's < 0.05), computation of Bayesian credible intervals, and 95% confidence intervals all statistically verified the existence of the N200, P300, and reward positivity in all analyses. We provide with this research paper an open source website with all the instructions, methods, and software to replicate our findings and to provide researchers with an easy way to use the MUSE EEG system for ERP research. Importantly, our work highlights that with a single computer and a portable EEG system such as the MUSE one can conduct ERP research with ease thus greatly extending the possible use of the ERP methodology to a variety of novel contexts. PMID:28344546
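The credible-interval check of component existence reported above can be illustrated for a single component. Under a noninformative (Jeffreys) prior with unknown variance, the 95% credible interval for a mean amplitude coincides with the classical t interval; the amplitude values below are hypothetical.

```python
import math
import statistics

# Hypothetical mean P300 amplitudes (uV), one value per participant.
amps = [4.1, 5.3, 3.8, 6.0, 4.7, 5.5, 4.9, 3.6, 5.1, 4.4]

n = len(amps)
m = statistics.fmean(amps)
s = statistics.stdev(amps)
t_stat = m / (s / math.sqrt(n))  # one-sample t against 0 (component existence)

# 95% interval for the mean; t_crit = 2.262 is the two-sided critical value
# for df = n - 1 = 9. The component "exists" if the interval excludes zero.
t_crit = 2.262
lo, hi = m - t_crit * s / math.sqrt(n), m + t_crit * s / math.sqrt(n)
```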
Sylvatic plague vaccine partially protects prairie dogs (Cynomys spp.) in field trials
Rocke, Tonie E.; Tripp, Daniel W.; Russell, Robin E.; Abbott, Rachel C.; Richgels, Katherine; Matchett, Marc R.; Biggins, Dean E.; Griebel, Randall; Schroeder, Greg; Grassel, Shaun M.; Pipkin, David R.; Cordova, Jennifer; Kavalunas, Adam; Maxfield, Brian; Boulerice, Jesse; Miller, Michael W.
2017-01-01
Sylvatic plague, caused by Yersinia pestis, frequently afflicts prairie dogs (Cynomys spp.), causing population declines and local extirpations. We tested the effectiveness of bait-delivered sylvatic plague vaccine (SPV) in prairie dog colonies on 29 paired placebo and treatment plots (1–59 ha in size; average 16.9 ha) in 7 western states from 2013 to 2015. We compared relative abundance (using catch per unit effort (CPUE) as an index) and apparent survival of prairie dogs on 26 of the 29 paired plots, 12 with confirmed or suspected plague (Y. pestis positive carcasses or fleas). Even though plague mortality occurred in prairie dogs on vaccine plots, SPV treatment had an overall positive effect on CPUE in all three years, regardless of plague status. Odds of capturing a unique animal were 1.10 (95% confidence interval [C.I.] 1.02–1.19) times higher per trap day on vaccine-treated plots than placebo plots in 2013, 1.47 (95% C.I. 1.41–1.52) times higher in 2014 and 1.19 (95% C.I. 1.13–1.25) times higher in 2015. On pairs where plague occurred, odds of apparent survival were 1.76 (95% Bayesian credible interval [B.C.I.] 1.28–2.43) times higher on vaccine plots than placebo plots for adults and 2.41 (95% B.C.I. 1.72–3.38) times higher for juveniles. Our results provide evidence that consumption of vaccine-laden baits can protect prairie dogs against plague; however, further evaluation and refinement are needed to optimize SPV use as a management tool.
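For readers unfamiliar with the odds-based quantities above: an odds ratio and its Wald 95% confidence interval can be computed from a 2x2 table as sketched below. The counts are hypothetical; the study's estimates came from models of capture and survival data, not from a raw table.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table:
    a/b = events/non-events under treatment, c/d = events/non-events under control."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 120 captures / 880 non-captures on vaccine plots,
# 100 captures / 900 non-captures on placebo plots.
or_, lo, hi = odds_ratio_ci(120, 880, 100, 900)
```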
Lee, Young Ho; Bae, Sang-Cheol; Song, Gwan Gyu
2015-12-01
This study aimed to assess the relative efficacy and safety of tofacitinib 5 and 10 mg twice daily, alone or in combination with methotrexate (MTX), in patients with active RA. Randomized controlled trials (RCTs) examining the efficacy and safety of tofacitinib in patients with active RA were included in this network meta-analysis. We performed a Bayesian network meta-analysis to combine the direct and indirect evidence from the RCTs. Ten RCTs including 4867 patients met the inclusion criteria. There were 21 pairwise comparisons including 11 direct comparisons of seven interventions. The ACR20 response rate was significantly higher in the tofacitinib 10 mg + MTX group than in the placebo and MTX groups (OR 7.56, 95% credible interval (CrI) 3.07-21.16; OR 3.67, 95% CrI 2.60-5.71, respectively). Ranking probabilities based on the surface under the cumulative ranking curve (SUCRA) indicated that tofacitinib 10 mg + MTX had the highest probability of being the best treatment for achieving the ACR20 response rate (SUCRA = 0.9254), followed by tofacitinib 5 mg + MTX (SUCRA = 0.7156), adalimumab 40 mg + MTX (SUCRA = 0.6097), tofacitinib 10 mg (SUCRA = 0.5984), tofacitinib 5 mg (SUCRA = 0.4749), MTX (SUCRA = 0.1674), and placebo (SUCRA = 0.0086). In contrast, safety, based on the number of withdrawals due to adverse events, did not differ significantly among the seven interventions. Tofacitinib at dosages of 5 and 10 mg twice daily, in combination with MTX, was the most efficacious intervention for active RA and was not associated with a significant risk of withdrawals due to adverse events.
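The SUCRA values quoted above summarise a treatment's ranking distribution on a 0-1 scale (1 = certainly best, 0 = certainly worst). Given the posterior rank probabilities from an NMA, the computation is a short sum; the probabilities below are invented for illustration.

```python
def sucra(rank_probs):
    """Surface under the cumulative ranking curve.
    rank_probs[k] = probability the treatment is ranked (k+1)-th best.
    SUCRA = (sum of cumulative probabilities over ranks 1..a-1) / (a - 1),
    where a is the number of treatments."""
    a = len(rank_probs)
    cum, total = 0.0, 0.0
    for p in rank_probs[:-1]:
        cum += p
        total += cum
    return total / (a - 1)

# Hypothetical rank probabilities for one treatment among four:
score = sucra([0.6, 0.3, 0.1, 0.0])
```

A treatment that is certainly best, `sucra([1, 0, 0, 0])`, scores 1.0; one that is certainly worst scores 0.0.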
Giacoppo, Daniele; Gargiulo, Giuseppe; Buccheri, Sergio; Aruta, Patrizia; Byrne, Robert A; Cassese, Salvatore; Dangas, George; Kastrati, Adnan; Mehran, Roxana; Tamburino, Corrado; Capodanno, Davide
2017-05-01
The effectiveness of currently available preventive strategies for contrast-induced acute kidney injury (CIAKI) is a matter of debate. We performed a Bayesian random-effects network meta-analysis of 124 trials (28,240 patients) comparing a total of 10 strategies: saline, statin, N-acetylcysteine (NAC), sodium bicarbonate (NaHCO3), NAC+NaHCO3, ascorbic acid, xanthine, dopaminergic agent, peripheral ischemic preconditioning, and natriuretic peptide. Compared with saline, the risk of CIAKI was reduced by using statin (odds ratio [OR], 0.42; 95% credible interval [CrI], 0.26-0.67), xanthine (OR, 0.32; 95% CrI, 0.17-0.57), ischemic preconditioning (OR, 0.48; 95% CrI, 0.26-0.87), NAC+NaHCO3 (OR, 0.50; 95% CrI, 0.33-0.76), NAC (OR, 0.68; 95% CrI, 0.55-0.84), and NaHCO3 (OR, 0.66; 95% CrI, 0.47-0.90). The benefit of statin therapy was consistent across multiple sensitivity analyses, whereas the efficacy of all the other strategies was questioned by restricting the analysis to high-quality trials. Overall, high heterogeneity was observed for comparisons involving xanthine and ischemic preconditioning, although the impact of NAC and xanthine was probably influenced by publication bias/small-study effect. Hydration alone was the least effective preventive strategy for CIAKI. Meta-regressions did not reveal significant associations with baseline creatinine and contrast volume. In patients with diabetes mellitus, no strategy was found to reduce the incidence of CIAKI. In patients undergoing percutaneous coronary procedures, statin administration is associated with a marked and consistent reduction in the risk of CIAKI compared with saline. Although xanthine, NAC, NaHCO3, NAC+NaHCO3, ischemic preconditioning, and natriuretic peptide may have nephroprotective effects, these results were not consistent across multiple sensitivity analyses. © 2017 American Heart Association, Inc.
Garrard, Georgia E; McCarthy, Michael A; Vesk, Peter A; Radford, James Q; Bennett, Andrew F
2012-01-01
1. Informative Bayesian priors can improve the precision of estimates in ecological studies or estimate parameters for which little or no information is available. While Bayesian analyses are becoming more popular in ecology, the use of strongly informative priors remains rare, perhaps because examples of informative priors are not readily available in the published literature. 2. Dispersal distance is an important ecological parameter, but is difficult to measure and estimates are scarce. General models that provide informative prior estimates of dispersal distances will therefore be valuable. 3. Using a world-wide data set on birds, we develop a predictive model of median natal dispersal distance that includes body mass, wingspan, sex and feeding guild. This model predicts median dispersal distance well when using the fitted data and an independent test data set, explaining up to 53% of the variation. 4. Using this model, we predict a priori estimates of median dispersal distance for 57 woodland-dependent bird species in northern Victoria, Australia. These estimates are then used to investigate the relationship between dispersal ability and vulnerability to landscape-scale changes in habitat cover and fragmentation. 5. We find evidence that woodland bird species with poor predicted dispersal ability are more vulnerable to habitat fragmentation than those species with longer predicted dispersal distances, thus improving the understanding of this important phenomenon. 6. The value of constructing informative priors from existing information is also demonstrated. When used as informative priors for four example species, predicted dispersal distances reduced the 95% credible intervals of posterior estimates of dispersal distance by 8-19%. 
Further, should we have wished to collect information on avian dispersal distances and relate it to species' responses to habitat loss and fragmentation, data from 221 individuals across 57 species would have been required to obtain estimates with the same precision as those provided by the general model. © 2011 The Authors. Journal of Animal Ecology © 2011 British Ecological Society.
Surgical or Transcatheter Aortic-Valve Replacement in Intermediate-Risk Patients.
Reardon, Michael J; Van Mieghem, Nicolas M; Popma, Jeffrey J; Kleiman, Neal S; Søndergaard, Lars; Mumtaz, Mubashir; Adams, David H; Deeb, G Michael; Maini, Brijeshwar; Gada, Hemal; Chetcuti, Stanley; Gleason, Thomas; Heiser, John; Lange, Rüdiger; Merhi, William; Oh, Jae K; Olsen, Peter S; Piazza, Nicolo; Williams, Mathew; Windecker, Stephan; Yakubov, Steven J; Grube, Eberhard; Makkar, Raj; Lee, Joon S; Conte, John; Vang, Eric; Nguyen, Hang; Chang, Yanping; Mugglin, Andrew S; Serruys, Patrick W J C; Kappetein, Arie P
2017-04-06
Although transcatheter aortic-valve replacement (TAVR) is an accepted alternative to surgery in patients with severe aortic stenosis who are at high surgical risk, less is known about comparative outcomes among patients with aortic stenosis who are at intermediate surgical risk. We evaluated the clinical outcomes in intermediate-risk patients with severe, symptomatic aortic stenosis in a randomized trial comparing TAVR (performed with the use of a self-expanding prosthesis) with surgical aortic-valve replacement. The primary end point was a composite of death from any cause or disabling stroke at 24 months in patients undergoing attempted aortic-valve replacement. We used Bayesian analytical methods (with a margin of 0.07) to evaluate the noninferiority of TAVR as compared with surgical valve replacement. A total of 1746 patients underwent randomization at 87 centers. Of these patients, 1660 underwent an attempted TAVR or surgical procedure. The mean (±SD) age of the patients was 79.8±6.2 years, and all were at intermediate risk for surgery (Society of Thoracic Surgeons Predicted Risk of Mortality, 4.5±1.6%). At 24 months, the estimated incidence of the primary end point was 12.6% in the TAVR group and 14.0% in the surgery group (95% credible interval [Bayesian analysis] for difference, -5.2 to 2.3%; posterior probability of noninferiority, >0.999). Surgery was associated with higher rates of acute kidney injury, atrial fibrillation, and transfusion requirements, whereas TAVR had higher rates of residual aortic regurgitation and need for pacemaker implantation. TAVR resulted in lower mean gradients and larger aortic-valve areas than surgery. Structural valve deterioration at 24 months did not occur in either group. TAVR was a noninferior alternative to surgery in patients with severe aortic stenosis at intermediate surgical risk, with a different pattern of adverse events associated with each procedure. 
(Funded by Medtronic; SURTAVI ClinicalTrials.gov number, NCT01586910 .).
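The "posterior probability of noninferiority, >0.999" above is the posterior mass lying below the 0.07 margin. Assuming an approximately normal posterior for the rate difference, it can be sketched with the standard normal CDF; the mean and standard deviation below are illustrative, not the trial's actual posterior.

```python
import math

def prob_noninferior(diff_mean, diff_sd, margin=0.07):
    """P(difference < margin) under a Normal(diff_mean, diff_sd) posterior,
    via the standard normal CDF Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))."""
    z = (margin - diff_mean) / diff_sd
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative posterior: TAVR-minus-surgery event-rate difference of
# -1.4 percentage points (-0.014) with posterior SD 0.019.
p = prob_noninferior(-0.014, 0.019)
```

When the posterior mean sits exactly on the margin, the probability is 0.5 by symmetry; noninferiority is declared when this probability exceeds a prespecified threshold.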
de Vocht, Frank; Tilling, Kate; Pliakas, Triantafyllos; Angus, Colin; Egan, Matt; Brennan, Alan; Campbell, Rona; Hickman, Matthew
2017-09-01
Control of alcohol licensing at local government level is a key component of alcohol policy in England. There is, however, only weak evidence of any public health improvement. We used a novel natural experiment design to estimate the impact of new local alcohol licensing policies on hospital admissions and crime. We used Home Office licensing data (2007-2012) to identify (1) interventions: local areas where both a cumulative impact zone and increased licensing enforcement were introduced in 2011; and (2) controls: local areas with neither. Outcomes were 2009-2015 alcohol-related hospital admissions, violent and sexual crimes, and antisocial behaviour. Bayesian structural time series were used to create postintervention synthetic time series (counterfactuals) based on weighted time series in control areas. Intervention effects were calculated from differences between measured and expected trends. Validation analyses were conducted using randomly selected controls. Five intervention and 86 control areas were identified. Intervention was associated with an average reduction in alcohol-related hospital admissions of 6.3% (95% credible intervals (CI) -12.8% to 0.2%) and, to a lesser extent, with a reduction in violent crimes, especially up to 2013 (-4.6%, 95% CI -10.7% to 1.4%). There was weak evidence of an effect on sexual crimes up to 2013 (-8.4%, 95% CI -21.4% to 4.6%) and insufficient evidence of an effect on antisocial behaviour as a result of a change in reporting. Moderate reductions in alcohol-related hospital admissions and violent and sexual crimes were associated with introduction of local alcohol licensing policies. This novel methodology holds promise for use in other natural experiments in public health. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
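The intervention effects above are "differences between measured and expected trends". Once the Bayesian structural time series has produced a counterfactual series for an intervention area, the point estimate reduces to an average percent difference, as in this sketch (the admission counts are invented).

```python
def intervention_effect(observed, counterfactual):
    """Average percent difference between measured and expected
    post-intervention values (negative = reduction)."""
    diffs = [(o - c) / c * 100.0 for o, c in zip(observed, counterfactual)]
    return sum(diffs) / len(diffs)

# Hypothetical post-policy alcohol-related admissions in one intervention area
# versus its synthetic counterfactual built from weighted control areas.
observed = [94, 91, 95, 90]
counterfactual = [100, 98, 100, 97]
effect = intervention_effect(observed, counterfactual)
```

In the study, the credible interval around such an effect comes from propagating the posterior uncertainty of the counterfactual, which this point-estimate sketch omits.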
Okano, Justin T; Robbins, Danielle; Palk, Laurence; Gerstoft, Jan; Obel, Niels; Blower, Sally
2016-07-01
Worldwide, approximately 35 million individuals are infected with HIV; about 25 million of these live in sub-Saharan Africa. WHO proposes using treatment as prevention (TasP) to eliminate HIV. Treatment suppresses viral load, decreasing the probability an individual transmits HIV. The elimination threshold is one new HIV infection per 1000 individuals. Here, we test the hypothesis that TasP can substantially reduce epidemics and eliminate HIV. We estimate the impact of TasP, between 1996 and 2013, on the Danish HIV epidemic in men who have sex with men (MSM), an epidemic UNAIDS has identified as a priority for elimination. We use a CD4-staged Bayesian back-calculation approach to estimate incidence, and the hidden epidemic (the number of HIV-infected undiagnosed MSM). To develop the back-calculation model, we use data from an ongoing nationwide population-based study: the Danish HIV Cohort Study. Incidence, and the hidden epidemic, decreased substantially after treatment was introduced in 1996. By 2013, incidence was close to the elimination threshold: 1.4 (median, 95% Bayesian credible interval [BCI] 0.4-2.1) new HIV infections per 1000 MSM and there were only 617 (264-858) undiagnosed MSM. Decreasing incidence and increasing treatment coverage were highly correlated; a treatment threshold effect was apparent. Our study is the first to show that TasP can substantially reduce a country's HIV epidemic, and bring it close to elimination. However, we have shown the effectiveness of TasP under optimal conditions: very high treatment coverage, and exceptionally high (98%) viral suppression rate. Unless these extremely challenging conditions can be met in sub-Saharan Africa, the WHO's global elimination strategy is unlikely to succeed. National Institute of Allergy and Infectious Diseases. Copyright © 2016 Elsevier Ltd. All rights reserved.
Fang, Xin; Fang, Bo; Wang, Chunfang; Xia, Tian; Bottai, Matteo; Fang, Fang; Cao, Yang
2017-01-01
There are concerns that the reported association of ambient fine particulate matter (PM2.5) with mortality might be a mixture of PM2.5 and weather conditions. We evaluated the effects of extreme weather conditions and weather types on mortality as well as their interactions with PM2.5 concentrations in a time series study. Daily non-accidental deaths, individual demographic information, daily average PM2.5 concentrations and meteorological data between 2012 and 2014 were obtained from Shanghai, China. Days with extreme weather conditions were identified. Six synoptic weather types (SWTs) were generated. The generalized additive model was set up to link the mortality with PM2.5 and weather conditions. Parameter estimation was based on Bayesian methods using both the Jeffreys' prior and an informative normal prior in a sensitivity analysis. We estimated the percent increase in non-accidental mortality per 10 μg/m3 increase in PM2.5 concentration and constructed the corresponding 95% credible interval (CrI). In total, 336,379 non-accidental deaths occurred during the study period. Average daily deaths were 307. The results indicated that per 10 μg/m3 increase in daily average PM2.5 concentration alone corresponded to 0.26-0.35% increase in daily non-accidental mortality in Shanghai. Statistically significant positive associations between PM2.5 and mortality were found for favorable SWTs when considering the interaction between PM2.5 and SWTs. The greatest effect was found in hot dry SWT (percent increase = 1.28, 95% CrI: 0.72, 1.83), followed by warm humid SWT (percent increase = 0.64, 95% CrI: 0.15, 1.13). The effect of PM2.5 on non-accidental mortality differed under specific extreme weather conditions and SWTs. Environmental policies and actions should take into account the interrelationship between the two hazardous exposures.
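The "percent increase per 10 μg/m3" figures above come from exponentiating a log-rate coefficient of the generalized additive model. The conversion itself is one line; the coefficient and its credible bounds below are hypothetical.

```python
import math

def percent_increase(beta, delta=10.0):
    """Percent increase in mortality for a `delta` ug/m3 rise in PM2.5,
    given a log-rate coefficient `beta` per 1 ug/m3."""
    return (math.exp(beta * delta) - 1.0) * 100.0

# Hypothetical posterior summary for beta (per 1 ug/m3 of PM2.5):
beta_hat, beta_lo, beta_hi = 0.00030, 0.00010, 0.00050
est = percent_increase(beta_hat)
cri = (percent_increase(beta_lo), percent_increase(beta_hi))
```

Applying the same transformation to the posterior quantiles of the coefficient yields the reported credible interval on the percent scale.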
Proximity to mining industry and cancer mortality.
Fernández-Navarro, Pablo; García-Pérez, Javier; Ramis, Rebeca; Boldo, Elena; López-Abente, Gonzalo
2012-10-01
Mining installations release toxic substances into the environment that could pose a health problem to populations in their vicinity. We sought to investigate whether there might be excess cancer-related mortality in populations residing in towns lying in the vicinity of Spanish mining industries governed by the Integrated Pollution Prevention and Control Directive and the European Pollutant Release and Transfer Register Regulation, according to the type of extraction method used. An ecologic study was designed to examine municipal mortality due to 32 types of cancer across the period 1997 through 2006. Population exposure to pollution was estimated on the basis of distance from town of residence to pollution source. Poisson regression models, using the Bayesian conditional autoregressive model proposed by Besag, York and Mollié and Integrated Nested Laplace Approximations for Bayesian inference, were used to: analyze the risk of dying from cancer in a 5-kilometer zone around mining installations; assess the effect of type of industrial activity; and conduct individual analyses within a 50-kilometer radius of each installation. Excess mortality (relative risk, 95% credible interval) of colorectal cancer (1.097, 1.041-1.157) and lung cancer (1.066, 1.009-1.126), specifically related to proximity to opencast coal mining, and of bladder cancer (1.106, 1.016-1.203) and leukemia (1.093, 1.003-1.191), related to other opencast mining installations, was detected among the overall population in the vicinity of mining installations. Other tumors associated in the analysis stratified by type of mine were: thyroid, gallbladder and liver cancers (underground coal installations); brain cancer (opencast coal mining); stomach cancer (coal and other opencast mining installations); and myeloma (underground mining installations). 
The results suggested an association between risk of dying due to digestive, respiratory, hematologic and thyroid cancers and proximity to Spanish mining industries. These associations were dependent on the type of mine. Copyright © 2012 Elsevier B.V. All rights reserved.
Dedefo, Melkamu; Oljira, Lemessa; Assefa, Nega
2016-02-01
Child mortality reflects a country's level of socio-economic development and quality of life. In Ethiopia, limited studies have been conducted on under-five mortality, and almost none of them tried to identify the spatial effect on mortality. Thus, this study explored the small-area clustering of under-five mortality and associated factors in Kersa HDSS, Eastern Ethiopia. The study population included all children under the age of five years between September 2008 and August 31, 2012 who were registered in the Kersa Health and Demographic Surveillance System (Kersa HDSS). A flexible Bayesian geo-additive discrete-time survival mixed model was used. Factors significantly associated with under-five mortality, with posterior odds ratios and 95% credible intervals, included maternal educational status 1.31 (1.13, 1.49), place of delivery 1.016 (1.013, 1.12), number of live births at a delivery 0.35 (0.23, 1.83), low household wealth index 1.26 (1.10, 1.43), middle-level household wealth index 0.95 (0.84, 1.07), pre-term duration of pregnancy 1.95 (1.27, 2.91), post-term duration of pregnancy 0.74 (0.60, 0.93) and antenatal visits 1.19 (1.06, 1.35). Variation was noted in the risk of under-five mortality across the selected small administrative regions (kebeles). This study reveals geographic patterns in rates of under-five mortality in those selected small administrative regions and shows some important determinants of under-five mortality. More importantly, we observed clustering of under-five mortality, which indicates the importance of spatial effects; presenting this clustering through maps facilitates visualization and highlights differentials across geographical areas that would otherwise be overlooked in traditional data-analytic methods. Copyright © 2015 Elsevier Ltd. All rights reserved.
Huang, Xiaodong; Clements, Archie C A; Williams, Gail; Mengersen, Kerrie; Tong, Shilu; Hu, Wenbiao
2016-04-01
A pandemic strain of influenza A spread rapidly around the world in 2009, now referred to as pandemic (H1N1) 2009. This study aimed to examine the spatiotemporal variation in the transmission rate of pandemic (H1N1) 2009 associated with changes in local socio-environmental conditions from May 7 to December 31, 2009, at a postal area level in Queensland, Australia. We used data on laboratory-confirmed H1N1 cases to examine the spatiotemporal dynamics of transmission using a flexible Bayesian, space-time, Susceptible-Infected-Recovered (SIR) modelling approach. The model incorporated parameters describing spatiotemporal variation in H1N1 infection and local socio-environmental factors. The weekly transmission rate of pandemic (H1N1) 2009 was negatively associated with the weekly area-mean maximum temperature at a lag of 1 week (LMXT) (posterior mean: -0.341; 95% credible interval (CI): -0.370 to -0.311) and the socio-economic index for area (SEIFA) (posterior mean: -0.003; 95% CI: -0.004 to -0.001), and was positively associated with the product of LMXT and the weekly area-mean vapour pressure at a lag of 1 week (LVAP) (posterior mean: 0.008; 95% CI: 0.007-0.009). There was substantial spatiotemporal variation in the transmission rate of pandemic (H1N1) 2009 across Queensland over the epidemic period. High random effects of estimated transmission rates were apparent in remote areas and in some postal areas with a higher proportion of indigenous populations and smaller overall populations. Local SEIFA and local atmospheric conditions were associated with the transmission rate of pandemic (H1N1) 2009. The more populated regions displayed consistent and synchronized epidemics with low average transmission rates. The less populated regions had high average transmission rates with more variation during the H1N1 epidemic period. Copyright © 2016 Elsevier Inc. All rights reserved.
Zhao, Bing-Cheng; Jiang, Hong-Ye; Ma, Wei-Ying; Jin, Da-Di; Li, Hao-Miao; Lu, Hai; Nakajima, Hideaki; Huang, Tong-Yi; Sun, Kai-Yu; Chen, Shu-Ling; Chen, Ke-Bing
2016-02-01
Solitary cysticercus granuloma (SCG) is the commonest form of neurocysticercosis in the Indian subcontinent and in travelers. Several different treatment options exist for SCG. We conducted a Bayesian network meta-analysis of randomized clinical trials (RCTs) to identify the best treatment option to prevent seizure recurrence and promote lesion resolution for patients with SCG. PubMed, EMBASE and the Cochrane Library databases (up to June 1, 2015) were searched for RCTs that compared any anthelmintics or corticosteroids, alone or in combination, with placebo or head to head and reported on seizure recurrence and lesion resolution in patients with SCG. A total of 14 RCTs (1277 patients) were included in the quantitative analysis focusing on four different treatment options. A Bayesian network model computing odds ratios (OR) with 95% credible intervals (CrI) and probability of being best (Pbest) was used to compare all interventions simultaneously. Albendazole and corticosteroids combination therapy was the only regimen that significantly decreased the risk of seizure recurrence compared with conservative treatment (OR 0.32, 95% CrI 0.10-0.93, Pbest 73.3%). Albendazole and corticosteroids alone or in combination were all efficacious in hastening granuloma resolution, but the combined therapy remained the best option based on probability analysis (OR 3.05, 95% CrI 1.24-7.95, Pbest 53.9%). The superiority of the combination therapy changed little in RCTs with different follow-up durations and in sensitivity analyses. The limitations of this study include high risk of bias and short follow-up duration in most studies. Dual therapy of albendazole and corticosteroids was the most efficacious regimen that could prevent seizure recurrence and promote lesion resolution in a follow-up period of around one year. It should be recommended for the management of SCG until more high-quality evidence is available.
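The "probability of being best" (Pbest) statistic used in this network meta-analysis can be read off posterior samples by counting, draw by draw, which treatment has the most favourable effect. A minimal sketch with invented posterior draws, not the authors' model or data:

```python
import numpy as np

# Sketch of Pbest: for each posterior draw, rank the treatments and count
# how often each one wins. Treatment names, means and sds are illustrative
# only; lower log-odds of seizure recurrence is better here.
rng = np.random.default_rng(42)
n_draws = 10_000
draws = {
    "albendazole":          rng.normal(-0.6, 0.4, n_draws),
    "corticosteroids":      rng.normal(-0.5, 0.4, n_draws),
    "albendazole+steroids": rng.normal(-1.1, 0.5, n_draws),
}
names = list(draws)
stacked = np.column_stack([draws[n] for n in names])
best_idx = stacked.argmin(axis=1)  # winner in each draw (lowest log-odds)
pbest = {n: float((best_idx == i).mean()) for i, n in enumerate(names)}
```

Because each draw has exactly one winner, the Pbest values sum to 1 across treatments.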
Wang, Ming; Li, Zheng; Lee, Eun Young; Lewis, Mechelle M; Zhang, Lijun; Sterling, Nicholas W; Wagner, Daymond; Eslinger, Paul; Du, Guangwei; Huang, Xuemei
2017-09-25
It is challenging for current statistical models to predict clinical progression of Parkinson's disease (PD) because of the involvement of multiple domains and longitudinal data. Past univariate longitudinal or multivariate analyses from cross-sectional trials have limited power to predict individual outcomes or a single moment. A multivariate generalized linear mixed-effect model (GLMM) under the Bayesian framework was proposed to study multi-domain longitudinal outcomes obtained at baseline, 18, and 36 months. The outcomes included motor, non-motor, and postural instability scores from the MDS-UPDRS, and demographic and standardized clinical data were utilized as covariates. Dynamic prediction was performed for both internal and external subjects using samples from the posterior distributions of the parameter estimates and random effects, and predictive accuracy was evaluated based on the root mean square error (RMSE), absolute bias (AB) and the area under the receiver operating characteristic (ROC) curve. First, our prediction model identified clinical data that were differentially associated with motor, non-motor, and postural stability scores. Second, the predictive accuracy of our model for the training data was assessed, with improved prediction gained particularly for non-motor scores (RMSE and AB: 2.89 and 2.20) compared to univariate analysis (RMSE and AB: 3.04 and 2.35). Third, individual-level predictions of longitudinal trajectories were performed for the testing data, with ~80% of observed values falling within the 95% credible intervals. Multivariate generalized mixed-effect models hold promise for predicting the clinical progression of individual outcomes in PD. The data were obtained from Dr. Xuemei Huang's NIH grant R01 NS060722, part of the NINDS PD Biomarker Program (PDBP). All data were entered within 24 h of collection into the Data Management Repository (DMR), which is publicly available ( https://pdbp.ninds.nih.gov/data-management ).
Amene, E; Horn, B; Pirie, R; Lake, R; Döpfer, D
2016-09-06
Data containing notified cases of disease are often compromised by incomplete or partial information related to individual cases. In an effort to enhance the value of information from enteric disease notifications in New Zealand, this study explored the use of Bayesian and Multiple Imputation (MI) models to fill risk factor data gaps. As a test case, overseas travel as a risk factor for infection with campylobacteriosis was examined. Two methods, namely Bayesian Specification (BAS) and Multiple Imputation (MI), were compared regarding predictive performance for various levels of artificially induced missingness of overseas travel status in campylobacteriosis notification data. Predictive performance of the models was assessed through the Brier Score, the Area Under the ROC Curve and the Percent Bias of regression coefficients. Finally, the best model was selected and applied to predict the missing overseas travel status of campylobacteriosis notifications. No difference was observed in the predictive performance of the BAS and MI methods at a lower rate of missingness (<10 %), but the BAS approach performed better than MI at higher rates of missingness (50 %, 65 %, 80 %). The estimated proportion (95 % Credibility Intervals) of travel-related cases was greatest in the highly urban District Health Boards (DHBs) of Counties Manukau, Auckland and Waitemata, at 0.37 (0.12, 0.57), 0.33 (0.13, 0.55) and 0.28 (0.10, 0.49), whereas the lowest proportions were estimated for the more rural West Coast, Northland and Tairawhiti DHBs at 0.02 (0.01, 0.05), 0.03 (0.01, 0.08) and 0.04 (0.01, 0.06), respectively. The national rate of travel-related campylobacteriosis cases was estimated at 0.16 (0.02, 0.48). The use of BAS offers a flexible approach to data augmentation, particularly when the missing rate is very high and when the Missing At Random (MAR) assumption holds. 
High rates of travel associated cases in urban regions of New Zealand predicted by this approach are plausible given the high rate of travel in these regions, including destinations with higher risk of infection. The added advantage of using a Bayesian approach is that the model's prediction can be improved whenever new information becomes available.
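The Brier Score used to compare the BAS and MI models is simply the mean squared difference between predicted probabilities and observed binary outcomes. A minimal sketch with hypothetical travel-status data:

```python
# Brier score: mean squared error of probabilistic predictions against
# observed 0/1 outcomes (here, overseas-travel status). Data are invented.
def brier_score(probs, outcomes):
    return sum((p - y) ** 2 for p, y in zip(probs, outcomes)) / len(probs)

y = [1, 0, 0, 1, 0]                  # true travel status (hypothetical)
preds_a = [0.9, 0.2, 0.1, 0.7, 0.3]  # sharper imputation model
preds_b = [0.6, 0.5, 0.4, 0.5, 0.5]  # vaguer imputation model
bs_a, bs_b = brier_score(preds_a, y), brier_score(preds_b, y)
# Lower is better, so model A would be preferred here.
```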
Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change
NASA Astrophysics Data System (ADS)
Field, R.; Constantine, P.; Boslough, M.
2011-12-01
We have posed the climate change problem in a framework similar to that used in safety engineering, by acknowledging that probabilistic risk assessments focused on low-probability, high-consequence climate events are perhaps more appropriate than studies focused simply on best estimates. Properly exploring the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We have developed specialized statistical surrogate models (SSMs) that can be used to make predictions about the tails of the associated probability distributions. An SSM differs from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field, that is, a random variable for every fixed location in the atmosphere at all times. The SSM can be calibrated to available spatial and temporal data from existing climate databases, or to a collection of outputs from general circulation models. Because of its reduced size and complexity, the realization of a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework was also developed to provide quantitative measures of confidence, via Bayesian credible intervals, for assessing these risks. To illustrate the use of the SSM, we considered two collections of NCAR CCSM 3.0 output data. The first collection corresponds to average December surface temperature for years 1990-1999 based on a collection of 8 different model runs obtained from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). We calibrated the surrogate model to the available model data and made various point predictions. We also analyzed average precipitation rate in June, July, and August over a 54-year period assuming a cyclic Y2K ocean model. 
We applied the calibrated surrogate model to study the probability that the precipitation rate falls below certain thresholds and utilized the Bayesian approach to quantify our confidence in these predictions. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.
Tarafder, M R; Carabin, H; Joseph, L; Balolong, E; Olveda, R; McGarvey, S T
2010-03-15
The accuracy of the Kato-Katz technique in identifying individuals with soil-transmitted helminth (STH) infections is limited by day-to-day variation in helminth egg excretion, confusion with other parasites and the laboratory technicians' experience. We aimed to estimate the sensitivity and specificity of the Kato-Katz technique to detect infection with Ascaris lumbricoides, hookworm and Trichuris trichiura using a Bayesian approach in the absence of a 'gold standard'. Data were obtained from a longitudinal study conducted between January 2004 and December 2005 in Samar Province, the Philippines. Each participant provided between one and three stool samples over consecutive days. Stool samples were examined using the Kato-Katz technique and reported as positive or negative for STHs. In the presence of measurement error, the true status of each individual is considered as latent data. Using a Bayesian method, we calculated marginal posterior densities of sensitivity and specificity parameters from the product of the likelihood function of observed and latent data. A uniform prior distribution was used (beta distribution: alpha=1, beta=1). A total of 5624 individuals provided at least one stool sample. One, two and three stool samples were provided by 1582, 1893 and 2149 individuals, respectively. All STHs showed variation in test results from day to day. Sensitivity estimates of the Kato-Katz technique for one stool sample were 96.9% (95% Bayesian Credible Interval [BCI]: 96.1%, 97.6%), 65.2% (60.0%, 69.8%) and 91.4% (90.5%, 92.3%), for A. lumbricoides, hookworm and T. trichiura, respectively. Specificity estimates for one stool sample were 96.1% (95.5%, 96.7%), 93.8% (92.4%, 95.4%) and 94.4% (93.2%, 95.5%), for A. lumbricoides, hookworm and T. trichiura, respectively. Our results show that the Kato-Katz technique can perform with reasonable accuracy with one day's stool collection for A. lumbricoides and T. trichiura. 
Low sensitivity of the Kato-Katz for detection of hookworm infection may be related to rapid degeneration of delicate hookworm eggs with time. (c) 2009 Australian Society for Parasitology Inc. Published by Elsevier Ltd. All rights reserved.
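A simplified version of the Bayesian estimation above, for the easy case where true infection status is known: with a uniform Beta(1, 1) prior, k test-positives among n truly infected individuals yield a Beta(k+1, n-k+1) posterior for sensitivity, and a credible interval falls out of its quantiles. The counts are hypothetical, and the study itself treats the true status as latent rather than observed:

```python
import numpy as np

# Beta-binomial posterior for test sensitivity under a uniform prior,
# assuming (unlike the paper) that true infection status is known.
rng = np.random.default_rng(0)
k, n = 652, 1000  # hypothetical: 652 positives among 1000 infected
post = rng.beta(k + 1, n - k + 1, size=100_000)  # posterior samples
lo, hi = np.percentile(post, [2.5, 97.5])        # 95% Bayesian credible interval
```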
Cerdá, Magdalena; Tracy, Melissa; Messner, Steven F; Vlahov, David; Tardiff, Kenneth; Galea, Sandro
2009-07-01
Homicide contributes substantially to the burden of death in the US and remains a key contributor to the gap in white-black life expectancy. It has been hypothesized that "broken-windows" policing is associated with lower homicide rates and that physical disorder may mediate this association. However, the empiric evidence is limited and conflicting. We used pooled, cross-sectional time-series data for 74 New York City (NYC) Police Precincts between 1990 and 1999 to test the relation between neighborhood misdemeanor policing (an indicator of physical order) and homicide in NYC in the 1990s. We applied Bayesian hierarchical models, including a random effect of place, to account for serial correlations in homicide across adjacent neighborhoods. An increase of 5000 misdemeanor arrests in a precinct with 100,000 people was associated with a reduction of 3.5 homicides (95% credible interval = -5.00 to -1.00). However, increased misdemeanor arrests were associated with lower physical order (posterior median = -0.015 [-0.025 to -0.01]), and physical order was unrelated to homicide. Our study replicated prior findings suggesting that misdemeanor policing reduces homicide rates, but offered no support for the hypothesis that physical disorder is a mediator of the impact of such policing. Factors responsible for the dramatic decline in US homicides in the last decade remain unclear.
Boulais, Christophe; Wacker, Ron; Augustin, Jean-Christophe; Cheikh, Mohamed Hedi Ben; Peladan, Fabrice
2011-07-01
Mycobacterium avium subsp. paratuberculosis (MAP) is the causal agent of paratuberculosis (Johne's disease) in cattle and other farm ruminants. The potential role of MAP in Crohn's disease in humans and the contribution of dairy products to human exposure to MAP continue to be the subject of scientific debate. The occurrence of MAP in bulk raw milk from dairy herds was assessed using a stochastic modeling approach. Raw milk samples were collected from bulk tanks in dairy plants and tested for the presence of MAP. Results from this analytical screening were used in a Bayesian network to update the model prediction. Of the 83 raw milk samples tested, 4 were positive for MAP by culture and PCR. We estimated that the level of MAP in bulk tanks ranged from 0 CFU/ml for the 2.5th percentile to 65 CFU/ml for the 97.5th percentile, with 95% credibility intervals of [0, 0] and [16, 326], respectively. The model was used to evaluate the effect of measures aimed at reducing the occurrence of MAP in raw milk. Reducing the prevalence of paratuberculosis has less of an effect on the occurrence of MAP in bulk raw milk than does managing clinically infected animals through good farming practices. Copyright © International Association for Food Protection.
Hajna, Samantha; Ross, Nancy A; Brazeau, Anne-Sophie; Bélisle, Patrick; Joseph, Lawrence; Dasgupta, Kaberi
2015-08-11
Higher street connectivity, land use mix and residential density (collectively referred to as neighbourhood walkability) have been linked to higher levels of walking. The objective of our study was to summarize the current body of knowledge on the association between neighbourhood walkability and biosensor-assessed daily steps in adults. We conducted a systematic search of PubMed, SCOPUS, and Embase (Ovid) for articles published prior to May 2014 on the association between walkability (based on Geographic Information Systems-derived street connectivity, land use mix, and/or residential density) and daily steps (pedometer or accelerometer-assessed) in adults. The mean differences in daily steps between adults living in high versus low walkable neighbourhoods were pooled across studies using a Bayesian hierarchical model. The search strategy yielded 8,744 unique abstracts. Thirty of these underwent full article review of which six met the inclusion criteria. Four of these studies were conducted in Europe and two were conducted in Asia. A meta-analysis of four of these six studies indicates that participants living in high compared to low walkable neighbourhoods accumulate 766 more steps per day (95 % credible interval 250, 1271). This accounts for approximately 8 % of recommended daily steps. The results of European and Asian studies support the hypothesis that higher neighbourhood walkability is associated with higher levels of biosensor-assessed walking in adults. More studies on this association are needed in North America.
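The pooling of per-study mean differences can be illustrated with a simplified, non-hierarchical analogue: an inverse-variance weighted average. The study values below are invented, and the paper used a full Bayesian hierarchical model rather than this fixed-effect shortcut:

```python
import math

# Inverse-variance (fixed-effect) pooling of per-study mean differences in
# daily steps between high- and low-walkability neighbourhoods.
# (Mean difference, standard error) pairs are illustrative only.
studies = [
    (900, 300),
    (500, 250),
    (1100, 400),
    (650, 350),
]
weights = [1 / se ** 2 for _, se in studies]                       # precision weights
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))                            # SE of the pooled estimate
```

A hierarchical model additionally allows the true effect to vary between studies, widening the interval accordingly.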
Dunham, Jason B.; Chelgren, Nathan D.; Heck, Michael P.; Clark, Steven M.
2013-01-01
We evaluated the probability of detecting larval lampreys using different methods of backpack electrofishing in wadeable streams in the U.S. Pacific Northwest. Our primary objective was to compare capture of lampreys using electrofishing with standard settings for salmon and trout to settings specifically adapted for capture of lampreys. Field work consisted of removal sampling by means of backpack electrofishing in 19 sites in streams representing a broad range of conditions in the region. Captures of lampreys at these sites were analyzed with a modified removal-sampling model and Bayesian estimation to measure the relative odds of capture using the lamprey-specific settings compared with the standard salmonid settings. We found that the odds of capture were 2.66 (95% credible interval, 0.87–78.18) times greater for the lamprey-specific settings relative to standard salmonid settings. When estimates of capture probability were applied to estimating the probabilities of detection, we found high (>0.80) detectability when the actual number of lampreys in a site was greater than 10 individuals and effort was at least two passes of electrofishing, regardless of the settings used. Further work is needed to evaluate key assumptions in our approach, including the evaluation of individual-specific capture probabilities and population closure. For now, our results suggest that comparable detection of lampreys is possible using backpack electrofishing with either salmonid- or lamprey-specific settings.
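The detection-probability calculation above has a simple closed form under independence: if each pass captures a given individual with probability p, then N individuals and k passes give a detection probability of 1 - (1 - p)^(N*k). A sketch in which the per-pass capture probability is assumed for illustration, not estimated from the paper's data:

```python
# Probability of detecting at least one lamprey, assuming each of
# n_individuals is captured independently on each pass with probability p.
# This is a simplification of the paper's removal-sampling model.
def detection_prob(p, n_individuals, n_passes):
    miss_one = (1.0 - p) ** n_passes           # one animal missed on every pass
    return 1.0 - miss_one ** n_individuals     # at least one animal captured

# With a modest per-pass capture probability, >10 individuals and two
# passes already push detection above 0.80, matching the pattern reported.
d = detection_prob(0.10, 11, 2)
```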
Ziehl-Quirós, E Carolina; García-Aguilar, María C; Mellink, Eric
2017-01-24
The relatively small population size and restricted distribution of the Guadalupe fur seal Arctocephalus townsendi could make it highly vulnerable to infectious diseases. We performed a colony-level assessment in this species of the prevalence and presence of Brucella spp. and Leptospira spp., pathogenic bacteria that have been reported in several pinniped species worldwide. Forty-six serum samples were collected in 2014 from pups at Isla Guadalupe, the only place where the species effectively reproduces. Samples were tested for Brucella using 3 consecutive serological tests, and for Leptospira using the microscopic agglutination test. For each bacterium, a Bayesian approach was used to estimate prevalence to exposure, and an epidemiological model was used to test the null hypothesis that the bacterium was present in the colony. No serum sample tested positive for Brucella, and the statistical analyses concluded that the colony was bacterium-free with a 96.3% confidence level. However, a Brucella surveillance program would be highly recommendable. Twelve samples were positive (titers 1:50) to 1 or more serovars of Leptospira. The prevalence was calculated at 27.1% (95% credible interval: 15.6-40.3%), and the posterior analyses indicated that the colony was not Leptospira-free with a 100% confidence level. Serovars Icterohaemorrhagiae, Canicola, and Bratislava were detected, but only further research can unveil whether they affect the fur seal population.
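The "bacterium-free with a given confidence level" conclusion can be sketched as a freedom-from-infection calculation: posit a design prevalence, then compute how unlikely an all-negative survey of n animals would be if the pathogen were actually present at that level. The numbers are illustrative, not the paper's exact epidemiological model:

```python
# Freedom-from-infection sketch: probability that all n sampled animals
# test negative if the pathogen is present at `design_prevalence`, with
# test sensitivity folded in. Values are hypothetical.
def prob_all_negative(n, design_prevalence, sensitivity=1.0):
    p_detect_one = design_prevalence * sensitivity  # chance one sample flags positive
    return (1.0 - p_detect_one) ** n

n_samples = 46  # pups sampled, as in the abstract
p_miss = prob_all_negative(n_samples, design_prevalence=0.10)
confidence_free = 1.0 - p_miss  # "confidence" of freedom at 10% design prevalence
```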
Cooper, Ben S.; Kotirum, Surachai; Kulpeng, Wantanee; Praditsitthikorn, Naiyana; Chittaganpitch, Malinee; Limmathurotsakul, Direk; Day, Nicholas P. J.; Coker, Richard; Teerawattananon, Yot; Meeyai, Aronrag
2015-01-01
Influenza epidemiology differs substantially in tropical and temperate zones, but estimates of seasonal influenza mortality in developing countries in the tropics are lacking. We aimed to quantify mortality due to seasonal influenza in Thailand, a tropical middle-income country. Time series of polymerase chain reaction–confirmed influenza infections between 2005 and 2009 were constructed from a sentinel surveillance network. These were combined with influenza-like illness data to derive measures of influenza activity and relationships to mortality by using a Bayesian regression framework. We estimated 6.1 (95% credible interval: 0.5, 12.4) annual deaths per 100,000 population attributable to influenza A and B, predominantly in those aged ≥60 years, with the largest contribution from influenza A(H1N1) in 3 out of 4 years. For A(H3N2), the relationship between influenza activity and mortality varied over time. Influenza was associated with increases in deaths classified as resulting from respiratory disease (posterior probability of positive association, 99.8%), cancer (98.6%), renal disease (98.0%), and liver disease (99.2%). No association with circulatory disease mortality was found. Seasonal influenza infections are associated with substantial mortality in Thailand, but evidence for the strong relationship between influenza activity and circulatory disease mortality reported in temperate countries is lacking. PMID:25899091
Sequential updating of a new dynamic pharmacokinetic model for caffeine in premature neonates.
Micallef, Sandrine; Amzal, Billy; Bach, Véronique; Chardon, Karen; Tourneux, Pierre; Bois, Frédéric Y
2007-01-01
Caffeine treatment is widely used in nursing care to reduce the risk of apnoea in premature neonates. To check the therapeutic efficacy of the treatment against apnoea, caffeine concentration in blood is an important indicator. The present study was aimed at building a pharmacokinetic model as a basis for a medical decision support tool. In the proposed model, time dependence of physiological parameters is introduced to describe the rapid growth of neonates. To take into account the large variability in the population, the pharmacokinetic model is embedded in a population structure. The whole model is inferred within a Bayesian framework. To update caffeine concentration predictions as data on an incoming patient are collected, we propose a fast method that can be used in a medical context. This involves the sequential updating of model parameters (at individual and population levels) via a stochastic particle algorithm. Our model provides better predictions than those obtained with previously published models. We show, through an example, that sequential updating improves predictions of caffeine concentration in blood (reduced bias and shorter credibility intervals). The updating of the pharmacokinetic model using body mass and caffeine concentration data is studied; it shows how informative caffeine concentration data are in contrast to body mass data. This study provides the methodological basis to predict caffeine concentration in blood after a given treatment, if data are collected on the treated neonate.
Hardelid, P; Cortina-Borja, M; Munro, A; Jones, H; Cleary, M; Champion, M P; Foo, Y; Scriver, C R; Dezateux, C
2008-01-01
Phenylketonuria (PKU) is an autosomal recessive inborn error of metabolism (OMIM 261600). Treatment with a low-phenylalanine diet following early ascertainment by newborn screening prevents impaired cognitive development, the major disease phenotype in PKU. The overall birth prevalence of PKU in European, Chinese and Korean populations is approximately 1/10,000. Since the human PAH locus contains PKU-causing alleles and polymorphic core haplotypes that describe and corroborate an out-of-Africa range expansion in modern human populations, it is of interest to know the prevalence of PKU in different ethnic groups with diverse geographical origins. We estimated PKU prevalence in South East England, where a sizeable proportion of the population is of Sub-Saharan African or South Asian ancestry. Over the period 1994 to 2004, 167 children were diagnosed with PKU. Using birth registration and census data to derive denominators, PKU birth prevalence per 10,000 live births (95% Bayesian credible intervals) was estimated to be 1.14 (0.96-1.33) among white, 0.11 (0.02-0.37) among black, and 0.29 (0.10-0.63) among Asian ethnic groups. This suggests that PKU is up to an order of magnitude less prevalent in populations with Sub-Saharan African and South Asian ancestry that have migrated to the UK.
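Credible intervals of this binomial form can be reproduced with a conjugate Beta posterior. A sketch with hypothetical counts of the same order of magnitude as the study; the per-group case counts and denominators are not given above, so the numbers below are illustrative only:

```python
from scipy.stats import beta

def prevalence_credible_interval(cases, births, a=0.5, b=0.5, level=0.95):
    """Posterior credible interval for a binomial birth prevalence.

    With a conjugate Beta(a, b) prior (Jeffreys by default), the posterior
    for the prevalence p given `cases` out of `births` is
    Beta(a + cases, b + births - cases).  Returns the central interval
    per 10,000 live births.
    """
    post = beta(a + cases, b + births - cases)
    lo = post.ppf((1 - level) / 2)
    hi = post.ppf(1 - (1 - level) / 2)
    return lo * 10_000, hi * 10_000

# Hypothetical counts, not the study's group-level data:
low, high = prevalence_credible_interval(cases=140, births=1_230_000)
```

With counts of this size, the interval lands near 1.0-1.3 per 10,000, the same scale as the estimate reported for the white ethnic group.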
Statistical properties of four effect-size measures for mediation models.
Miočević, Milica; O'Rourke, Holly P; MacKinnon, David P; Brown, Hendricks C
2018-02-01
This project examined the performance of classical and Bayesian estimators of four effect size measures for the indirect effect in a single-mediator model and a two-mediator model. Compared to the proportion and ratio mediation effect sizes, standardized mediation effect-size measures were relatively unbiased and efficient in the single-mediator model and the two-mediator model. Percentile and bias-corrected bootstrap interval estimates of ab/s_Y and ab(s_X)/s_Y in the single-mediator model outperformed interval estimates of the proportion and ratio effect sizes in terms of power, Type I error rate, coverage, imbalance, and interval width. For the two-mediator model, standardized effect-size measures were superior to the proportion and ratio effect-size measures. Furthermore, it was found that Bayesian point and interval summaries of posterior distributions of standardized effect-size measures reduced excessive relative bias for certain parameter combinations. The standardized effect-size measures are the best effect-size measures for quantifying mediated effects.
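A percentile bootstrap interval for the standardized effect size ab/s_Y can be sketched directly. The data below are simulated; the slope a (M on X), the slope b (Y on M adjusting for X), and the scaling by the sample standard deviation of Y follow the usual single-mediator definitions, but the simulation settings are illustrative:

```python
import numpy as np

def ab_over_sy(x, m, y):
    """Standardized indirect effect ab/s_Y in a single-mediator model."""
    a = np.polyfit(x, m, 1)[0]                        # slope of M on X
    design = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(design, y, rcond=None)[0][2]  # slope of Y on M given X
    return a * b / np.std(y, ddof=1)

def percentile_ci(x, m, y, n_boot=500, level=0.95, seed=0):
    """Percentile bootstrap interval for ab/s_Y (resampling cases)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    boots = [ab_over_sy(x[i], m[i], y[i])
             for i in (rng.integers(0, n, n) for _ in range(n_boot))]
    lo, hi = np.percentile(boots, [100 * (1 - level) / 2,
                                   100 * (1 + level) / 2])
    return lo, hi

# Simulated data with a genuine indirect effect (a = b = 0.5):
rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.5 * m + rng.normal(size=n)
lo, hi = percentile_ci(x, m, y)
```

The bias-corrected variant studied in the paper adjusts the two percentile cut points using the proportion of bootstrap statistics below the point estimate.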
NASA Astrophysics Data System (ADS)
Han, Feng; Zheng, Yi
2018-06-01
Input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
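The error model named above can be made concrete in simplified form. A sketch of a lag-1 autocorrelated, heteroscedastic log-likelihood, with a normal density standing in for the skew exponential power distribution; all function and parameter names here are illustrative, not the BAIPU formulation:

```python
import math

def error_loglik(residuals, phi, sigma0, sigma1, simulated):
    """Log-likelihood under a lag-1 autocorrelated, heteroscedastic
    normal error model (a simplification of the SEP model in the text).

    phi: lag-1 autocorrelation of the residuals; the error standard
    deviation grows linearly with the simulated value,
    sigma_t = sigma0 + sigma1 * simulated_t.
    """
    ll = 0.0
    prev = 0.0
    for r, s in zip(residuals, simulated):
        innov = r - phi * prev           # decorrelated innovation
        sd = sigma0 + sigma1 * s         # heteroscedastic sd
        ll += -0.5 * math.log(2 * math.pi * sd * sd) - 0.5 * (innov / sd) ** 2
        prev = r
    return ll
```

An MCMC sampler such as DREAM(ZS) evaluates a likelihood of this shape at every proposed parameter set; the SEP density additionally carries skewness and kurtosis parameters.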
Aaltonen, T; Adelman, J; Akimoto, T; Alvarez González, B; Amerio, S; Amidei, D; Anastassov, A; Annovi, A; Antos, J; Apollinari, G; Apresyan, A; Arisawa, T; Artikov, A; Ashmanskas, W; Attal, A; Aurisano, A; Azfar, F; Azzurri, P; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Bartsch, V; Bauer, G; Beauchemin, P-H; Bedeschi, F; Beecher, D; Behari, S; Bellettini, G; Bellinger, J; Benjamin, D; Beretvas, A; Beringer, J; Bhatti, A; Binkley, M; Bisello, D; Bizjak, I; Blair, R E; Blocker, C; Blumenfeld, B; Bocci, A; Bodek, A; Boisvert, V; Bolla, G; Bortoletto, D; Boudreau, J; Boveia, A; Brau, B; Bridgeman, A; Brigliadori, L; Bromberg, C; Brubaker, E; Budagov, J; Budd, H S; Budd, S; Burke, S; Burkett, K; Busetto, G; Bussey, P; Buzatu, A; Byrum, K L; Cabrera, S; Calancha, C; Campanelli, M; Campbell, M; Canelli, F; Canepa, A; Carls, B; Carlsmith, D; Carosi, R; Carrillo, S; Carron, S; Casal, B; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavaliere, V; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chang, S H; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, K; Chokheli, D; Chou, J P; Choudalakis, G; Chuang, S H; Chung, K; Chung, W H; Chung, Y S; Chwalek, T; Ciobanu, C I; Ciocci, M A; Clark, A; Clark, D; Compostella, G; Convery, M E; Conway, J; Cordelli, M; Cortiana, G; Cox, C A; Cox, D J; Crescioli, F; Cuenca Almenar, C; Cuevas, J; Culbertson, R; Cully, J C; Dagenhart, D; Datta, M; Davies, T; de Barbaro, P; De Cecco, S; Deisher, A; De Lorenzo, G; Dell'orso, M; Deluca, C; Demay, C; Demortier, L; Deng, J; Deninno, M; Derwent, P F; di Giovanni, G P; Dionisi, C; Di Ruzza, B; Dittmann, J R; D'Onofrio, M; Donati, S; Dong, P; Donini, J; Dorigo, T; Dube, S; Efron, J; Elagin, A; Erbacher, R; Errede, D; Errede, S; Eusebi, R; Fang, H C; Farrington, S; Fedorko, W T; Feild, R G; Feindt, M; Fernandez, J P; Ferrazza, C; Field, R; Flanagan, G; Forrest, R; Frank, M J; Franklin, M; Freeman, J C; Furic, I; Gallinaro, M; Galyardt, J; Garberson, F; Garcia, J E; 
Garfinkel, A F; Genser, K; Gerberich, H; Gerdes, D; Gessler, A; Giagu, S; Giakoumopoulou, V; Giannetti, P; Gibson, K; Gimmell, J L; Ginsburg, C M; Giokaris, N; Giordani, M; Giromini, P; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D; Gold, M; Goldschmidt, N; Golossanov, A; Gomez, G; Gomez-Ceballos, G; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Goulianos, K; Gresele, A; Grinstein, S; Grosso-Pilcher, C; Grundler, U; Guimaraes da Costa, J; Gunay-Unalan, Z; Haber, C; Hahn, K; Hahn, S R; Halkiadakis, E; Han, B-Y; Han, J Y; Happacher, F; Hara, K; Hare, D; Hare, M; Harper, S; Harr, R F; Harris, R M; Hartz, M; Hatakeyama, K; Hays, C; Heck, M; Heijboer, A; Heinrich, J; Henderson, C; Herndon, M; Heuser, J; Hewamanage, S; Hidas, D; Hill, C S; Hirschbuehl, D; Hocker, A; Hou, S; Houlden, M; Hsu, S-C; Huffman, B T; Hughes, R E; Husemann, U; Hussein, M; Husemann, U; Huston, J; Incandela, J; Introzzi, G; Iori, M; Ivanov, A; James, E; Jayatilaka, B; Jeon, E J; Jha, M K; Jindariani, S; Johnson, W; Jones, M; Joo, K K; Jun, S Y; Jung, J E; Junk, T R; Kamon, T; Kar, D; Karchin, P E; Kato, Y; Kephart, R; Keung, J; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, H W; Kim, J E; Kim, M J; Kim, S B; Kim, S H; Kim, Y K; Kimura, N; Kirsch, L; Klimenko, S; Knuteson, B; Ko, B R; Kondo, K; Kong, D J; Konigsberg, J; Korytov, A; Kotwal, A V; Kreps, M; Kroll, J; Krop, D; Krumnack, N; Kruse, M; Krutelyov, V; Kubo, T; Kuhr, T; Kulkarni, N P; Kurata, M; Kwang, S; Laasanen, A T; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, G; Lazzizzera, I; Lecompte, T; Lee, E; Lee, H S; Lee, S W; Leone, S; Lewis, J D; Lin, C-S; Linacre, J; Lindgren, M; Lipeles, E; Lister, A; Litvintsev, D O; Liu, C; Liu, T; Lockyer, N S; Loginov, A; Loreti, M; Lovas, L; Lucchesi, D; Luci, C; Lueck, J; Lujan, P; Lukens, P; Lungu, G; Lyons, L; Lys, J; Lysak, R; Macqueen, D; Madrak, R; Maeshima, K; Makhoul, K; Maki, T; Maksimovic, P; Malde, S; Malik, S; Manca, G; Manousakis-Katsikakis, A; 
Margaroli, F; Marino, C; Marino, C P; Martin, A; Martin, V; Martínez, M; Martínez-Ballarín, R; Maruyama, T; Mastrandrea, P; Masubuchi, T; Mathis, M; Mattson, M E; Mazzanti, P; McFarland, K S; McIntyre, P; McNulty, R; Mehta, A; Mehtala, P; Menzione, A; Merkel, P; Mesropian, C; Miao, T; Miladinovic, N; Miller, R; Mills, C; Milnik, M; Mitra, A; Mitselmakher, G; Miyake, H; Moggi, N; Moon, C S; Moore, R; Morello, M J; Morlock, J; Movilla Fernandez, P; Mülmenstädt, J; Mukherjee, A; Muller, Th; Mumford, R; Murat, P; Mussini, M; Nachtman, J; Nagai, Y; Nagano, A; Naganoma, J; Nakamura, K; Nakano, I; Napier, A; Necula, V; Nett, J; Neu, C; Neubauer, M S; Neubauer, S; Nielsen, J; Nodulman, L; Norman, M; Norniella, O; Nurse, E; Oakes, L; Oh, S H; Oh, Y D; Oksuzian, I; Okusawa, T; Orava, R; Osterberg, K; Pagan Griso, S; Palencia, E; Papadimitriou, V; Papaikonomou, A; Paramonov, A A; Parks, B; Pashapour, S; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Peiffer, T; Pellett, D E; Penzo, A; Phillips, T J; Piacentino, G; Pianori, E; Pinera, L; Pitts, K; Plager, C; Pondrom, L; Poukhov, O; Pounder, N; Prakoshyn, F; Pronko, A; Proudfoot, J; Ptohos, F; Pueschel, E; Punzi, G; Pursley, J; Rademacker, J; Rahaman, A; Ramakrishnan, V; Ranjan, N; Ray, J; Redondo, I; Renton, P; Renz, M; Rescigno, M; Richter, S; Rimondi, F; Ristori, L; Robson, A; Rodrigo, T; Rodriguez, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rossin, R; Roy, P; Ruiz, A; Russ, J; Rusu, V; Saarikko, H; Safonov, A; Sakumoto, W K; Saltó, O; Santi, L; Sarkar, S; Sartori, L; Sato, K; Savoy-Navarro, A; Schlabach, P; Schmidt, A; Schmidt, E E; Schmidt, M A; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scribano, A; Scuri, F; Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Sexton-Kennedy, L; Sforza, F; Sfyrla, A; Shalhout, S Z; Shears, T; Shepard, P F; Shimojima, M; Shiraishi, S; Shochet, M; Shon, Y; Shreyber, I; Sidoti, A; Sinervo, P; Sisakyan, A; Slaughter, A J; Slaunwhite, J; Sliwa, K; Smith, J R; Snider, F D; Snihur, R; Soha, A; 
Somalwar, S; Sorin, V; Spalding, J; Spreitzer, T; Squillacioti, P; Stanitzki, M; St Denis, R; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Strycker, G L; Stuart, D; Suh, J S; Sukhanov, A; Suslov, I; Suzuki, T; Taffard, A; Takashima, R; Takeuchi, Y; Tanaka, R; Tecchio, M; Teng, P K; Terashi, K; Thom, J; Thompson, A S; Thompson, G A; Thomson, E; Tipton, P; Ttito-Guzmán, P; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Torre, S; Torretta, D; Totaro, P; Tourneur, S; Trovato, M; Tsai, S-Y; Tu, Y; Turini, N; Ukegawa, F; Vallecorsa, S; van Remortel, N; Varganov, A; Vataga, E; Vázquez, F; Velev, G; Vellidis, C; Vidal, M; Vidal, R; Vila, I; Vilar, R; Vine, T; Vogel, M; Volobouev, I; Volpi, G; Wagner, P; Wagner, R G; Wagner, R L; Wagner, W; Wagner-Kuhr, J; Wakisaka, T; Wallny, R; Wang, S M; Warburton, A; Waters, D; Weinberger, M; Weinelt, J; Wester, W C; Whitehouse, B; Whiteson, D; Wicklund, A B; Wicklund, E; Wilbur, S; Williams, G; Williams, H H; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; Wolfe, C; Wright, T; Wu, X; Würthwein, F; Xie, S; Yagil, A; Yamamoto, K; Yamaoka, J; Yang, U K; Yang, Y C; Yao, W M; Yeh, G P; Yoh, J; Yorita, K; Yoshida, T; Yu, G B; Yu, I; Yu, S S; Yun, J C; Zanello, L; Zanetti, A; Zhang, X; Zheng, Y; Zucchelli, S
2009-08-07
A search for a narrow diphoton mass resonance is presented based on data from 3.0 fb⁻¹ of integrated luminosity from pp̄ collisions at √s = 1.96 TeV collected by the CDF experiment. No evidence of a resonance in the diphoton mass spectrum is observed, and upper limits are set on the cross section times branching fraction of the resonant state as a function of Higgs boson mass. The resulting limits exclude Higgs bosons with masses below 106 GeV/c² at a 95% Bayesian credibility level for one fermiophobic benchmark model.
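In its simplest counting-experiment form, a 95% Bayesian credibility limit is a posterior quantile of a Poisson signal rate. A minimal sketch with a flat signal prior and a fixed background, offered only as an illustration of the statistical idea; the collaboration's actual limit-setting accounts for systematic uncertainties and mass-dependent efficiencies:

```python
import math

def bayes_upper_limit(n_obs, bkg, cl=0.95, s_max=50.0, grid=20001):
    """Bayesian credible upper limit on a Poisson signal mean (sketch).

    Flat prior on the signal s >= 0; the posterior density is
    proportional to the Poisson likelihood exp(-(s + bkg)) * (s + bkg)**n_obs.
    The limit is the cl-quantile of the normalized posterior, located on
    a uniform grid by accumulating probability mass.
    """
    def density(s):
        lam = s + bkg
        if lam == 0.0:
            return 1.0 if n_obs == 0 else 0.0
        return math.exp(-lam + n_obs * math.log(lam))

    ds = s_max / (grid - 1)
    svals = [i * ds for i in range(grid)]
    post = [density(s) for s in svals]
    total = sum(post)
    acc = 0.0
    for s, p in zip(svals, post):
        acc += p
        if acc >= cl * total:
            return s
    return s_max
```

Dividing such a limit on the expected signal count by the integrated luminosity times the selection efficiency converts it into a limit on cross section times branching fraction.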
Abdullah, Nasreen; Laing, Robert S; Hariri, Susan; Young, Collette M; Schafer, Sean
2016-04-01
Human papillomavirus (HPV) vaccine should reduce cervical dysplasia before cervical cancer. However, dysplasia diagnosis is screening-dependent, so accurate screening estimates are needed. Our objective was to estimate the percentage of women in a geographic population that has had cervical cancer screening. We analyzed claims data for Papanicolaou (Pap) tests from 2008-2012 to estimate the percentage of insured women aged 18-39 years screened. We estimated screening in uninsured women by dividing the percentage of insured Behavioral Risk Factor Surveillance Survey respondents reporting previous-year testing by the percentage of uninsured respondents reporting previous-year testing, and multiplying this ratio by claims-based estimates of insured women with previous-year screening. We calculated a simple weighted average of the two estimates to estimate the overall screening percentage. We estimated credible intervals using Monte Carlo simulations. During 2008-2012, an annual average of 29.6% of women aged 18-39 years were screened. Screening increased from 2008 to 2009 in all age groups. During 2009-2012, the screening percentages decreased for all groups, but declined most in women aged 18-20 years, from 21.5% to 5.4%. Within age groups, compared to 2009, credible intervals did not overlap during 2011 (except age group 21-29 years) and 2012, and credible intervals in the 18-20 year group did not overlap with older groups in any year. This study introduces a novel method to estimate population-level cervical cancer screening. Overall, the percentage of women screened in Portland, Oregon fell following changes in screening recommendations released in 2009 and later modified in 2012. Copyright © 2016 Elsevier Ltd. All rights reserved.
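The combination of a claims-based estimate with a survey-derived ratio, plus Monte Carlo credible intervals, can be sketched as follows. Every input value below is a hypothetical stand-in, since the abstract does not report the underlying counts or uncertainties:

```python
import random

def screening_estimate(n_draws=50_000, seed=42):
    """Monte Carlo sketch of a combined screening percentage.

    The insured screening proportion comes from claims; the uninsured
    proportion is the insured one scaled by a survey-derived
    uninsured/insured ratio.  Each input is drawn with its own sampling
    noise, and a 95% credible interval is read off the simulated draws.
    All means, standard deviations, and the 50/50 weighting are
    illustrative assumptions.
    """
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        insured = rng.gauss(0.33, 0.005)   # claims-based estimate
        ratio = rng.gauss(0.55, 0.05)      # survey uninsured/insured ratio
        uninsured = insured * ratio
        draws.append(0.5 * insured + 0.5 * uninsured)
    draws.sort()
    point = sum(draws) / n_draws
    lo = draws[int(0.025 * n_draws)]
    hi = draws[int(0.975 * n_draws)]
    return point, lo, hi

point, lo, hi = screening_estimate()
```

Propagating the inputs by simulation rather than by a delta-method formula is what makes the resulting interval a credible interval in the paper's sense.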
NASA Astrophysics Data System (ADS)
Bakoban, Rana A.
2017-08-01
The coefficient of variation (CV) has several applications in applied statistics. In this paper, we adopt Bayesian and non-Bayesian approaches to the estimation of the CV under type-II censored data from the extension exponential distribution (EED). Point and interval estimates of the CV are obtained for each of the maximum likelihood and parametric bootstrap techniques. A Bayesian approach with the help of the MCMC method is also presented. A real data set is presented and analyzed, and the obtained results are used to assess the theoretical results.
Singh, Jasvinder A; Cameron, Chris; Noorbaloochi, Shahrzad; Cullis, Tyler; Tucker, Matthew; Christensen, Robin; Ghogomu, Elizabeth Tanjong; Coyle, Doug; Clifford, Tammy; Tugwell, Peter; Wells, George A
2015-07-18
Serious infections are a major concern for patients considering treatments for rheumatoid arthritis. Evidence is inconsistent as to whether biological drugs are associated with an increased risk of serious infection compared with traditional disease-modifying antirheumatic drugs (DMARDs). We did a systematic review and meta-analysis of serious infections in patients treated with biological drugs compared with those treated with traditional DMARDs. We did a systematic literature search with Medline, Embase, Cochrane Central Register of Controlled Trials, and ClinicalTrials.gov from their inception to Feb 11, 2014. Search terms included "biologics", "rheumatoid arthritis" and their synonyms. Trials were eligible for inclusion if they included any of the approved biological drugs and reported serious infections. We assessed the risk of bias with the Cochrane Risk of Bias Tool. We did a Bayesian network meta-analysis of published trials using a binomial likelihood model to assess the risk of serious infections in patients with rheumatoid arthritis who were treated with biological drugs, compared with those treated with traditional DMARDs. The odds ratio (OR) of serious infection was the primary measure of treatment effect, and 95% credible intervals were calculated using Markov chain Monte Carlo methods. The systematic review identified 106 trials that reported serious infections and included patients with rheumatoid arthritis who received biological drugs. Compared with traditional DMARDs, standard-dose biological drugs (OR 1.31, 95% credible interval [CrI] 1.09-1.58) and high-dose biological drugs (1.90, 1.50-2.39) were associated with an increased risk of serious infections, although low-dose biological drugs (0.93, 0.65-1.33) were not. The risk was lower in patients who were methotrexate naive compared with traditional DMARD-experienced or anti-tumour necrosis factor biological drug-experienced patients.
The absolute increase in the number of serious infections per 1000 patients treated each year ranged from six for standard-dose biological drugs to 55 for combination biological therapy, compared with traditional DMARDs. Standard-dose and high-dose biological drugs (with or without traditional DMARDs) are associated with an increase in serious infections in rheumatoid arthritis compared with traditional DMARDs, although low-dose biological drugs are not. Clinicians should discuss the balance between benefit and harm with the individual patient before starting biological treatment for rheumatoid arthritis. Rheumatology Division at the University of Alabama at Birmingham. Copyright © 2015 Elsevier Ltd. All rights reserved.
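The conversion from an odds ratio to an absolute increase per 1000 patients is simple arithmetic once a baseline risk is assumed. A sketch assuming a baseline serious-infection risk of about 2% per year on traditional DMARDs; that baseline is an illustrative assumption, whereas the paper derives it from trial data:

```python
def absolute_increase_per_1000(baseline_risk, odds_ratio):
    """Convert an odds ratio into an absolute annual increase per 1000.

    baseline_risk: annual serious-infection risk on traditional DMARDs
    (e.g. 0.02 for 2%).  The treated risk is recovered from the odds
    ratio via the odds scale, and the difference is expressed as extra
    events per 1000 patient-years.
    """
    odds0 = baseline_risk / (1 - baseline_risk)
    odds1 = odds_ratio * odds0
    treated_risk = odds1 / (1 + odds1)
    return 1000 * (treated_risk - baseline_risk)
```

With a 2% baseline, an OR of 1.31 works out to roughly 6 extra infections per 1000 patient-years, the same order as the figure quoted for standard-dose biological drugs.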
Thrombectomy 6 to 24 Hours after Stroke with a Mismatch between Deficit and Infarct.
Nogueira, Raul G; Jadhav, Ashutosh P; Haussen, Diogo C; Bonafe, Alain; Budzik, Ronald F; Bhuva, Parita; Yavagal, Dileep R; Ribo, Marc; Cognard, Christophe; Hanel, Ricardo A; Sila, Cathy A; Hassan, Ameer E; Millan, Monica; Levy, Elad I; Mitchell, Peter; Chen, Michael; English, Joey D; Shah, Qaisar A; Silver, Frank L; Pereira, Vitor M; Mehta, Brijesh P; Baxter, Blaise W; Abraham, Michael G; Cardona, Pedro; Veznedaroglu, Erol; Hellinger, Frank R; Feng, Lei; Kirmani, Jawad F; Lopes, Demetrius K; Jankowitz, Brian T; Frankel, Michael R; Costalat, Vincent; Vora, Nirav A; Yoo, Albert J; Malik, Amer M; Furlan, Anthony J; Rubiera, Marta; Aghaebrahim, Amin; Olivot, Jean-Marc; Tekle, Wondwossen G; Shields, Ryan; Graves, Todd; Lewis, Roger J; Smith, Wade S; Liebeskind, David S; Saver, Jeffrey L; Jovin, Tudor G
2018-01-04
The effect of endovascular thrombectomy that is performed more than 6 hours after the onset of ischemic stroke is uncertain. Patients with a clinical deficit that is disproportionately severe relative to the infarct volume may benefit from late thrombectomy. We enrolled patients with occlusion of the intracranial internal carotid artery or proximal middle cerebral artery who had last been known to be well 6 to 24 hours earlier and who had a mismatch between the severity of the clinical deficit and the infarct volume, with mismatch criteria defined according to age (<80 years or ≥80 years). Patients were randomly assigned to thrombectomy plus standard care (the thrombectomy group) or to standard care alone (the control group). The coprimary end points were the mean score for disability on the utility-weighted modified Rankin scale (which ranges from 0 [death] to 10 [no symptoms or disability]) and the rate of functional independence (a score of 0, 1, or 2 on the modified Rankin scale, which ranges from 0 to 6, with higher scores indicating more severe disability) at 90 days. A total of 206 patients were enrolled; 107 were assigned to the thrombectomy group and 99 to the control group. At 31 months, enrollment in the trial was stopped because of the results of a prespecified interim analysis. The mean score on the utility-weighted modified Rankin scale at 90 days was 5.5 in the thrombectomy group as compared with 3.4 in the control group (adjusted difference [Bayesian analysis], 2.0 points; 95% credible interval, 1.1 to 3.0; posterior probability of superiority, >0.999), and the rate of functional independence at 90 days was 49% in the thrombectomy group as compared with 13% in the control group (adjusted difference, 33 percentage points; 95% credible interval, 24 to 44; posterior probability of superiority, >0.999). 
The rate of symptomatic intracranial hemorrhage did not differ significantly between the two groups (6% in the thrombectomy group and 3% in the control group, P=0.50), nor did 90-day mortality (19% and 18%, respectively; P=1.00). Among patients with acute stroke who had last been known to be well 6 to 24 hours earlier and who had a mismatch between clinical deficit and infarct, outcomes for disability at 90 days were better with thrombectomy plus standard care than with standard care alone. (Funded by Stryker Neurovascular; DAWN ClinicalTrials.gov number, NCT02142283 .).
Hawken, Steven; Kwong, Jeffrey C; Deeks, Shelley L; Crowcroft, Natasha S; McGeer, Allison J; Ducharme, Robin; Campitelli, Michael A; Coyle, Doug; Wilson, Kumanan
2015-02-01
It is unclear whether seasonal influenza vaccination results in a net increase or decrease in the risk for Guillain-Barré syndrome (GBS). To assess the effect of seasonal influenza vaccination on the absolute risk of acquiring GBS, we used simulation models and published estimates of age- and sex-specific risks for GBS, influenza incidence, and vaccine effectiveness. For a hypothetical 45-year-old woman and 75-year-old man, excess GBS risk for influenza vaccination versus no vaccination was -0.36/1 million vaccinations (95% credible interval -1.22 to 0.28) and -0.42/1 million vaccinations (95% credible interval -3.68 to 2.44), respectively. These numbers represent a small absolute reduction in GBS risk with vaccination. Under typical conditions (e.g. influenza incidence rates >5% and vaccine effectiveness >60%), vaccination reduced GBS risk. These findings should strengthen confidence in the safety of influenza vaccine and allow health professionals to better put GBS risk in context when discussing influenza vaccination with patients.
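The net-risk calculation behind these figures is a small piece of arithmetic repeated over simulation draws. A deterministic sketch with hypothetical inputs of the right order of magnitude; the paper draws each input from a distribution to obtain the credible intervals:

```python
def excess_gbs_risk_per_million(vaccine_gbs_risk, flu_incidence,
                                vaccine_effectiveness, flu_gbs_risk):
    """Net GBS risk change per million vaccinations (negative = reduction).

    Vaccination adds `vaccine_gbs_risk` GBS cases per million doses but
    averts influenza infections (incidence * effectiveness), each
    carrying `flu_gbs_risk` GBS cases per million infections.  All
    inputs here are illustrative, not the paper's published estimates.
    """
    averted = flu_incidence * vaccine_effectiveness * flu_gbs_risk
    return vaccine_gbs_risk - averted
```

For example, 1 vaccine-attributable case per million doses against a 5% attack rate, 60% effectiveness, and 45 influenza-attributable cases per million infections yields a small net reduction, consistent in sign with the abstract's estimates.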
Bayesian Nonparametric Ordination for the Analysis of Microbial Communities.
Ren, Boyu; Bacallado, Sergio; Favaro, Stefano; Holmes, Susan; Trippa, Lorenzo
2017-01-01
Human microbiome studies use sequencing technologies to measure the abundance of bacterial species or Operational Taxonomic Units (OTUs) in samples of biological material. Typically the data are organized in contingency tables with OTU counts across heterogeneous biological samples. In the microbial ecology community, ordination methods are frequently used to investigate latent factors or clusters that capture and describe variations of OTU counts across biological samples. It remains important to evaluate how uncertainty in estimates of each biological sample's microbial distribution propagates to ordination analyses, including visualization of clusters and projections of biological samples on low dimensional spaces. We propose a Bayesian analysis for dependent distributions to endow frequently used ordinations with estimates of uncertainty. A Bayesian nonparametric prior for dependent normalized random measures is constructed, which is marginally equivalent to the normalized generalized Gamma process, a well-known prior for nonparametric analyses. In our prior, the dependence and similarity between microbial distributions is represented by latent factors that concentrate in a low dimensional space. We use a shrinkage prior to tune the dimensionality of the latent factors. The resulting posterior samples of model parameters can be used to evaluate uncertainty in analyses routinely applied in microbiome studies. Specifically, by combining them with multivariate data analysis techniques we can visualize credible regions in ecological ordination plots. The characteristics of the proposed model are illustrated through a simulation study and applications in two microbiome datasets.
Bayesian Spatial Design of Optimal Deep Tubewell Locations in Matlab, Bangladesh.
Warren, Joshua L; Perez-Heydrich, Carolina; Yunus, Mohammad
2013-09-01
We introduce a method for statistically identifying the optimal locations of deep tubewells (DTWs) to be installed in Matlab, Bangladesh. DTW installations serve to mitigate exposure to naturally occurring arsenic found at groundwater depths less than 200 meters, a serious environmental health threat for the population of Bangladesh. We introduce an objective function, which incorporates both arsenic level and nearest town population size, to identify optimal locations for DTW placement. Assuming complete knowledge of the arsenic surface, we then demonstrate how minimizing the objective function over a domain favors DTWs placed in areas with high arsenic values and close to largely populated regions. Given only a partial realization of the arsenic surface over a domain, we use a Bayesian spatial statistical model to predict the full arsenic surface and estimate the optimal DTW locations. The uncertainty associated with these estimated locations is correctly characterized as well. The new method is applied to a dataset from a village in Matlab and the estimated optimal locations are analyzed along with their respective 95% credible regions.
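The role of the objective function can be illustrated on a toy grid. The functional form and weight below are assumptions for illustration, and this sketch maximizes a score where the paper minimizes an objective; the two are equivalent up to sign:

```python
import math

def score(candidate, arsenic_field, towns, w=0.5):
    """Score favouring high-arsenic areas close to large towns (sketch).

    candidate: (x, y) grid point; arsenic_field: dict mapping grid
    points to arsenic level; towns: list of ((x, y), population).
    The weight w and the 1/(1 + distance) proximity term are
    illustrative, not the paper's exact objective.
    """
    arsenic = arsenic_field[candidate]
    # proximity-weighted population of the most attractive town
    best_town = 0.0
    for (tx, ty), pop in towns:
        d = math.hypot(candidate[0] - tx, candidate[1] - ty)
        best_town = max(best_town, pop / (1.0 + d))
    return w * arsenic + (1 - w) * best_town

def best_location(arsenic_field, towns):
    """Grid point with the highest score under full knowledge of the field."""
    return max(arsenic_field, key=lambda p: score(p, arsenic_field, towns))
```

In the paper's setting, the arsenic field is only partially observed, so the Bayesian spatial model supplies posterior draws of the surface and the optimization is repeated per draw, yielding credible regions for the optimal locations.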
Cori, Anne; Pufall, Erica L.; Price, Alison; Elmes, Jocelyn; Zaba, Basia; Crampin, Amelia C.; Lutalo, Tom; Gregson, Simon; Hallett, Timothy B.
2016-01-01
Background: Programmatic planning in HIV requires estimates of the distribution of new HIV infections according to identifiable characteristics of individuals. In sub-Saharan Africa, robust routine data sources and historical epidemiological observations are available to inform and validate such estimates. Methods and Findings: We developed a predictive model, the Incidence Patterns Model (IPM), representing populations according to factors that have been demonstrated to be strongly associated with HIV acquisition risk: gender, marital/sexual activity status, geographic location, “key populations” based on risk behaviours (sex work, injecting drug use, and male-to-male sex), HIV and ART status within married or cohabiting unions, and circumcision status. The IPM estimates the distribution of new infections acquired by group based on these factors within a Bayesian framework accounting for regional prior information on demographic and epidemiological characteristics from trials or observational studies. We validated and trained the model against direct observations of HIV incidence by group in seven rounds of cohort data from four studies (“sites”) conducted in Manicaland, Zimbabwe; Rakai, Uganda; Karonga, Malawi; and Kisesa, Tanzania. The IPM performed well, with the projections’ credible intervals for the proportion of new infections per group overlapping the data’s confidence intervals for all groups in all rounds of data. In terms of geographical distribution, the projections’ credible intervals overlapped the confidence intervals for four out of seven rounds, which were used as proxies for administrative divisions in a country. We assessed model performance after internal training (within one site) and external training (between sites) by comparing mean posterior log-likelihoods and used the best model to estimate the distribution of HIV incidence in six countries (Gabon, Kenya, Malawi, Rwanda, Swaziland, and Zambia) in the region.
We subsequently inferred the potential contribution of each group to transmission using a simple model that builds on the results from the IPM and makes further assumptions about sexual mixing patterns and transmission rates. In all countries except Swaziland, individuals in unions were the single group contributing to the largest proportion of new infections acquired (39%–77%), followed by never married women and men. Female sex workers accounted for a large proportion of new infections (5%–16%) compared to their population size. Individuals in unions were also the single largest contributor to the proportion of infections transmitted (35%–62%), followed by key populations and previously married men and women. Swaziland exhibited different incidence patterns, with never married men and women accounting for over 65% of new infections acquired and also contributing to a large proportion of infections transmitted (up to 56%). Between- and within-country variations indicated different incidence patterns in specific settings. Conclusions: It is possible to reliably predict the distribution of new HIV infections acquired using data routinely available in many countries in the sub-Saharan African region with a single relatively simple mathematical model. This tool would complement more specific analyses to guide resource allocation, data collection, and programme planning. PMID:27622516
[Evaluation of estimation of prevalence ratio using Bayesian log-binomial regression model].
Gao, W L; Lin, H; Liu, X N; Ren, X W; Li, J S; Shen, X P; Zhu, S L
2017-03-10
To evaluate the estimation of the prevalence ratio (PR) using a Bayesian log-binomial regression model and its application, we estimated the PR of medical care-seeking prevalence relative to caregivers' recognition of risk signs of diarrhea in their infants, using a Bayesian log-binomial regression model in OpenBUGS software. The results showed that caregivers' recognition of their infants' risk signs of diarrhea was significantly associated with a 13% increase in medical care-seeking. We also compared the point and interval estimation of the PR and the convergence of three models (model 1: not adjusting for covariates; model 2: adjusting for duration of caregivers' education; model 3: adjusting for distance between village and township and child month-age, based on model 2) between the Bayesian log-binomial regression model and the conventional log-binomial regression model. All three Bayesian log-binomial regression models converged, and the estimated PRs were 1.130 (95% CI: 1.005-1.265), 1.128 (95% CI: 1.001-1.264) and 1.132 (95% CI: 1.004-1.267), respectively. Conventional log-binomial regression models 1 and 2 converged, with PRs of 1.130 (95% CI: 1.055-1.206) and 1.126 (95% CI: 1.051-1.203), respectively, but model 3 failed to converge, so the COPY method was used to estimate the PR, which was 1.125 (95% CI: 1.051-1.200). The point and interval estimates of the PRs from the three Bayesian log-binomial regression models differed only slightly from those of the conventional log-binomial regression model, and the two approaches were highly consistent in estimating the PR. Therefore, the Bayesian log-binomial regression model can effectively estimate the PR with fewer convergence failures and has advantages in application over the conventional log-binomial regression model.
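A log-binomial model of this kind can be fitted with a generic MCMC sampler. A minimal pure-Python sketch with flat priors and a Metropolis random walk; the tuning constants and the simple log link with a probability constraint are illustrative, not the OpenBUGS specification used in the study:

```python
import math
import random

def log_binomial_pr(x, y, n_iter=4000, burn=1000, step=0.05, seed=7):
    """Metropolis sampler for a log-binomial model, returning the PR.

    Model: P(y=1 | x) = exp(b0 + b1*x), with every fitted probability
    constrained to (0, 1); flat priors on (b0, b1).  For a binary
    exposure, the prevalence ratio is exp(b1).  Returns the posterior
    mean of exp(b1) and a 95% credible interval.
    """
    rng = random.Random(seed)

    def loglik(b0, b1):
        ll = 0.0
        for xi, yi in zip(x, y):
            p = math.exp(b0 + b1 * xi)
            if p >= 1.0:
                return float("-inf")     # constraint violated
            ll += math.log(p) if yi else math.log(1.0 - p)
        return ll

    b0, b1 = math.log(sum(y) / len(y)), 0.0
    ll = loglik(b0, b1)
    draws = []
    for i in range(n_iter):
        c0 = b0 + rng.gauss(0.0, step)
        c1 = b1 + rng.gauss(0.0, step)
        cll = loglik(c0, c1)
        if cll >= ll or rng.random() < math.exp(cll - ll):
            b0, b1, ll = c0, c1, cll
        if i >= burn:
            draws.append(math.exp(b1))
    draws.sort()
    pr = sum(draws) / len(draws)
    ci = (draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))])
    return pr, ci
```

Because the log link can push fitted probabilities above 1, the hard constraint inside the likelihood is what makes the Bayesian sampler robust where maximum-likelihood log-binomial fits fail to converge.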
Dong, Linsong; Wang, Zhiyong
2018-06-11
Genomic prediction is feasible for estimating genomic breeding values because of dense genome-wide markers and credible statistical methods, such as Genomic Best Linear Unbiased Prediction (GBLUP) and various Bayesian methods. Compared with GBLUP, Bayesian methods propose more flexible assumptions for the distributions of SNP effects. However, most Bayesian methods are performed based on Markov chain Monte Carlo (MCMC) algorithms, leading to computational efficiency challenges. Hence, some fast Bayesian approaches, such as fast BayesB (fBayesB), were proposed to speed up the calculation. This study proposed another fast Bayesian method termed fast BayesC (fBayesC). The prior distribution of fBayesC assumes that a SNP with probability γ has a non-zero effect which comes from a normal density with a common variance. The simulated data from QTLMAS XII workshop and actual data on large yellow croaker were used to compare the predictive results of fBayesB, fBayesC and (MCMC-based) BayesC. The results showed that when γ was set as a small value, such as 0.01 in the simulated data or 0.001 in the actual data, fBayesB and fBayesC yielded lower prediction accuracies (abilities) than BayesC. In the actual data, fBayesC could yield very similar predictive abilities as BayesC when γ ≥ 0.01. When γ = 0.01, fBayesB could also yield similar results as fBayesC and BayesC. However, fBayesB could not yield an explicit result when γ ≥ 0.1, but a similar situation was not observed for fBayesC. Moreover, the computational speed of fBayesC was significantly faster than that of BayesC, making fBayesC a promising method for genomic prediction.
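The "probability γ of a non-zero effect from a normal density" prior admits a closed-form single-SNP update, which is what fast (non-MCMC) methods exploit. A sketch for one SNP with known variances, offered as an illustration of the idea rather than the published fBayesC algorithm, which iterates an update of this kind over all SNPs:

```python
import math

def snp_posterior(x, r, gamma, sigma2_b, sigma2_e):
    """Closed-form single-SNP spike-and-slab update (sketch).

    Prior: with probability gamma the SNP effect b ~ N(0, sigma2_b),
    otherwise b = 0.  Given genotype codes x and current residuals r
    with residual variance sigma2_e, returns the posterior inclusion
    probability and the (unconditional) posterior mean of b.
    """
    xx = sum(xi * xi for xi in x)
    xr = sum(xi * ri for xi, ri in zip(x, r))
    # conditional posterior of b given inclusion: N(mu, v)
    v = 1.0 / (xx / sigma2_e + 1.0 / sigma2_b)
    mu = v * xr / sigma2_e
    # Bayes factor of "included" vs "excluded" (marginal likelihood ratio)
    log_bf = 0.5 * (math.log(v / sigma2_b) + mu * mu / v)
    odds = gamma / (1.0 - gamma) * math.exp(log_bf)
    prob = odds / (1.0 + odds)
    return prob, prob * mu
```

Replacing MCMC sampling of each effect with this expectation is the source of the speed-up, at the cost of ignoring uncertainty in the variance components, which is one reason accuracy can drop for very small γ.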
The role of probability arguments in the history of science.
Weinert, Friedel
2010-03-01
The paper examines Wesley Salmon's claim that the primary role of plausibility arguments in the history of science is to impose constraints on the prior probability of hypotheses (in the language of Bayesian confirmation theory). A detailed look at Copernicanism and Darwinism and, more briefly, Rutherford's discovery of the atomic nucleus reveals a further and arguably more important role of plausibility arguments. It resides in the consideration of likelihoods, which state how likely a given hypothesis makes a given piece of evidence. In each case the likelihoods raise the probability of one of the competing hypotheses and diminish the credibility of its rival, and this may happen either on the basis of 'old' or 'new' evidence.
NASA Astrophysics Data System (ADS)
Annan, James; Hargreaves, Julia
2016-04-01
In order to perform any Bayesian processing of a model ensemble, we need a prior over the ensemble members. In the case of multimodel ensembles such as CMIP, the historical approach of ``model democracy'' (i.e. equal weight for all models in the sample) is no longer credible (if it ever was) due to model duplication and inbreeding. The question of ``model independence'' is central to the question of prior weights. However, although this question has been repeatedly raised, it has not yet been satisfactorily addressed. Here I will discuss the issue of independence and present a theoretical foundation for understanding and analysing the ensemble in this context. I will also present some simple examples showing how these ideas may be applied and developed.
NASA Astrophysics Data System (ADS)
Nomura, Shunichi; Ogata, Yosihiko
2016-04-01
We propose a Bayesian method for probability forecasting of recurrent earthquakes on inland active faults in Japan. Renewal processes with the Brownian Passage Time (BPT) distribution are applied to more than half of the active faults in Japan by the Headquarters for Earthquake Research Promotion (HERP) of Japan. Long-term forecasting with the BPT distribution requires two parameters: the mean and the coefficient of variation (COV) of the recurrence intervals. HERP applies a common COV parameter to all of these faults because most of them have very few documented paleoseismic events, too few to estimate a reliable COV for each fault. However, related studies have proposed different COV estimates from the same paleoseismic catalog. Applying different COV estimates can make a critical difference to the forecast, so the COV should be selected carefully for individual faults. Recurrence intervals on a fault are, on average, determined by the long-term slip rate driven by tectonic motion, but are perturbed by nearby seismicity, which influences the surrounding stress field. The COVs of recurrence intervals depend on this stress perturbation and therefore show spatial trends arising from the heterogeneity of tectonic motion and seismicity. We therefore introduce a spatial structure on the COV parameter through Bayesian modeling with a Gaussian process prior, so that the COVs of closely located faults are correlated and take similar values. We find that the spatial trends in the estimated COV values coincide with the density of active faults in Japan. We also show Bayesian forecasts from the proposed model, computed by Markov chain Monte Carlo, which differ from HERP's forecasts especially on active faults where HERP's forecasts are very high or very low.
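The BPT renewal forecast above rests on the inverse-Gaussian form of the BPT distribution, which is fully specified by the mean recurrence interval and the COV. A minimal sketch of the resulting conditional (time-dependent) probability follows; the fault parameters (mean interval 1000 yr, COV 0.24, 800 yr elapsed) are illustrative assumptions, not values from the study.

```python
import math

def phi(x):
    """Standard normal CDF via erfc for numerical stability in the tails."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def bpt_cdf(t, mean, cov):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence interval `mean` and coefficient of variation `cov`."""
    lam = mean / cov**2                      # inverse-Gaussian shape parameter
    a = math.sqrt(lam / t)
    return (phi(a * (t / mean - 1.0))
            + math.exp(2.0 * lam / mean) * phi(-a * (t / mean + 1.0)))

def conditional_prob(elapsed, horizon, mean, cov):
    """P(event within `horizon` years | quiet for `elapsed` years)."""
    f_now = bpt_cdf(elapsed, mean, cov)
    f_later = bpt_cdf(elapsed + horizon, mean, cov)
    return (f_later - f_now) / (1.0 - f_now)

# Illustrative fault: mean interval 1000 yr, COV 0.24, 800 yr since last event.
p30 = conditional_prob(800.0, 30.0, 1000.0, 0.24)
print(f"30-year conditional probability ≈ {p30:.4f}")
```

Because the forecast depends strongly on the COV through the shape parameter λ = μ/COV², varying the COV in this function shows directly why careful, fault-specific COV selection matters.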
NASA Astrophysics Data System (ADS)
Kim, Y.; Nishina, K.; Chae, N.; Park, S.; Yoon, Y.; Lee, B.
2014-04-01
The tundra ecosystem is quite vulnerable to drastic climate change in the Arctic, and quantifying its carbon dynamics is important in light of thawing permafrost, changes in the snow-covered period and in snow and shrub community extent, and the decline of sea ice in the Arctic. Here, CO2 efflux measurements using a manual chamber system within a 40 m × 40 m plot (5 m interval; 81 points in total) were conducted in dominant tundra vegetation on the Seward Peninsula of Alaska during the growing seasons of 2011 and 2012, to assess the drivers of CO2 efflux. We applied a hierarchical Bayesian (HB) model, a function of soil temperature, soil moisture, vegetation type and thaw depth, to quantify the effect of environmental parameters on CO2 efflux and to estimate growing-season CO2 emission. Our results showed that the average CO2 efflux in 2011 was 1.4-fold higher than in 2012, resulting from the distinct difference in soil moisture between the two years. Tussock-dominated CO2 efflux was 1.4 to 2.3 times higher than that measured in lichen and moss communities, marking tussock as a significant CO2 source in the Arctic, with a wide distribution at the circumpolar scale. CO2 efflux increased nearly exponentially with soil temperature, both in the observed data and in the posterior medians of the HB model. This reveals soil temperature as the most important parameter regulating CO2 efflux, ahead of soil moisture and thaw depth. The marked difference in soil moisture between the 2011 and 2012 growing seasons produced a clear difference in CO2 efflux, 742 and 539 g CO2 m^-2 period^-1 in 2011 and 2012, respectively, suggesting that the 2012 CO2 emission rate was reduced by 27% (95% credible interval: 17-36%) relative to 2011 owing to higher soil moisture from heavy rain.
The estimated growing-season CO2 emission rate within the 40 m × 40 m plot ranged from 0.86 Mg CO2 period^-1 in 2012 to 1.2 Mg CO2 period^-1 in 2011, corresponding to 86% and 80% of the annual CO2 emission rates within the western Alaska tundra ecosystem. The HB model can therefore be readily applied to observed CO2 efflux, as it requires only four environmental parameters, and it is also effective for quantitatively assessing the drivers of CO2 efflux.
Chen, Jinsong; Zhang, Dake; Choi, Jaehwa
2015-12-01
It is common to encounter latent variables with ordinal data in social or behavioral research. Although a mediated effect of latent variables (latent mediated effect, or LME) with ordinal data may appear to be a straightforward combination of LME with continuous data and latent variables with ordinal data, the methodological challenges to combine the two are not trivial. This research covers model structures as complex as LME and formulates both point and interval estimates of LME for ordinal data using the Bayesian full-information approach. We also combine weighted least squares (WLS) estimation with the bias-corrected bootstrapping (BCB; Efron Journal of the American Statistical Association, 82, 171-185, 1987) method or the traditional delta method as the limited-information approach. We evaluated the viability of these different approaches across various conditions through simulation studies, and provide an empirical example to illustrate the approaches. We found that the Bayesian approach with reasonably informative priors is preferred when both point and interval estimates are of interest and the sample size is 200 or above.
Schächtele, Simone; Tümena, Thomas; Gaßmann, Karl-Günter; Fromm, Martin F; Maas, Renke
2016-01-01
Drug-induced QT-interval prolongation is associated with the occurrence of potentially fatal Torsades de Pointes arrhythmias (TdP). So far, data regarding the overall burden of QT-interval-prolonging drugs (QT-drugs) in geriatric patients are limited. This study was performed to assess the individual burden of QT-drugs in geriatric polymedicated patients and to identify the most frequent and risky combinations of QT-drugs. Co-prescriptions of QT-drugs were investigated in the discharge medication of geriatric patients recorded in the Geriatrics in Bavaria Database (GiB-DAT) between July 2009 and June 2013. QT-drugs were classified according to a publicly available reference site (CredibleMeds®) as ALL-QT-drugs (associated with any QT-risk) or High-risk-QT-drugs (corresponding to QT-drugs with known risk of Torsades de Pointes according to CredibleMeds®), and in addition as SmPC-high-risk-QT-drugs (drugs whose co-prescription with other QT-drugs is contraindicated according to the German prescribing information, SmPC). Of a cohort of 130,434 geriatric patients (mean age 81 years, 67% women), prescribed a median of 8 drugs, 76,594 patients (58.7%) received at least one ALL-QT-drug. Co-prescriptions of two or more ALL-QT-drugs were observed in 28,768 (22.1%) patients. Particularly risky co-prescriptions of High-risk-QT-drugs or SmPC-high-risk-QT-drugs with at least one further QT-drug occurred in 55.9% (N = 12,633) and 54.2% (N = 12,429) of these patients, respectively. Consideration of SmPCs (SmPC-high-risk-QT-drugs) allowed the identification of an additional 15% (N = 3,999) of patients taking a risky combination that was not covered by the commonly used CredibleMeds® classification. Only 20 drug-drug combinations accounted for more than 90% of these potentially most dangerous co-prescriptions. In this geriatric study population, co-prescriptions of two or more QT-drugs were common.
A considerable proportion of higher-risk QT-drugs could be detected only by using more than one classification system. Local adaptation of international classifications can improve the identification of patients at risk.
Petrie, Joshua G; Eisenberg, Marisa C; Ng, Sophia; Malosh, Ryan E; Lee, Kyu Han; Ohmit, Suzanne E; Monto, Arnold S
2017-12-15
Household cohort studies are an important design for the study of respiratory virus transmission. Inferences from these studies can be improved through the use of mechanistic models to account for household structure and risk as an alternative to traditional regression models. We adapted a previously described individual-based transmission hazard (TH) model and assessed its utility for analyzing data from a household cohort maintained in part for study of influenza vaccine effectiveness (VE). Households with ≥4 individuals, including ≥2 children <18 years of age, were enrolled and followed during the 2010-2011 influenza season. VE was estimated in both TH and Cox proportional hazards (PH) models. For each individual, TH models estimated hazards of infection from the community and from each infected household contact. Influenza A(H3N2) infection was laboratory-confirmed in 58 (4%) subjects. VE estimates from both models were similarly low overall (Cox PH: 20%, 95% confidence interval: -57, 59; TH: 27%, 95% credible interval: -23, 58) and highest for children <9 years of age (Cox PH: 40%, 95% confidence interval: -49, 76; TH: 52%, 95% credible interval: 7, 75). VE estimates were robust to model choice, and the TH model's ability to explicitly describe influenza transmission offers continued opportunities for analysis.
The 2-10 keV unabsorbed luminosity function of AGN from the LSS, CDFS, and COSMOS surveys
NASA Astrophysics Data System (ADS)
Ranalli, P.; Koulouridis, E.; Georgantopoulos, I.; Fotopoulou, S.; Hsu, L.-T.; Salvato, M.; Comastri, A.; Pierre, M.; Cappelluti, N.; Carrera, F. J.; Chiappetti, L.; Clerc, N.; Gilli, R.; Iwasawa, K.; Pacaud, F.; Paltani, S.; Plionis, E.; Vignali, C.
2016-05-01
The XMM-Large scale structure (XMM-LSS), XMM-Cosmological evolution survey (XMM-COSMOS), and XMM-Chandra deep field south (XMM-CDFS) surveys are complementary in terms of sky coverage and depth. Together, they form a clean sample with the least possible variance in instrument effective areas and point spread function, making this one of the best samples available to determine the 2-10 keV luminosity function of active galactic nuclei (AGN) and its evolution. The samples and the relevant corrections for incompleteness are described. A total of 2887 AGN are used to build the LF in the luminosity interval 10^42-10^46 erg s^-1 and the redshift interval 0.001-4. We present a new method to correct for absorption, which considers the probability distribution for the column density conditioned on the hardness ratio. The binned luminosity function and its evolution are determined with a variant of the Page-Carrera method, improved to include corrections for absorption and to account for the full probability distribution of photometric redshifts. Parametric models, namely a double power law with luminosity and density evolution (LADE) or luminosity-dependent density evolution (LDDE), are explored using Bayesian inference. We introduce the Watanabe-Akaike information criterion (WAIC) to compare the models and estimate their predictive power. Our data are best described by the LADE model, as hinted by the WAIC indicator. We also explore the recently proposed 15-parameter extended LDDE model and find that this extension is not supported by our data. The strength of our method is that it provides unabsorbed, non-parametric estimates, credible intervals for the luminosity function parameters, and a model choice based on predictive power for future data.
Based on observations obtained with XMM-Newton, an ESA science mission with instruments and contributions directly funded by ESA member states and NASA. Tables with the samples of the posterior probability distributions are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/590/A80
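The WAIC used above for model comparison is computed from pointwise posterior log-likelihoods: WAIC = −2(lppd − p_waic), where lppd is the log pointwise predictive density and p_waic the effective number of parameters. A minimal sketch on a toy normal model follows (not the AGN luminosity-function likelihood; data and posterior draws are simulated for illustration).

```python
import math, random

random.seed(0)

def waic(loglik):
    """WAIC from a matrix loglik[s][i] = log p(y_i | theta_s) over S
    posterior draws and n data points: -2 * (lppd - p_waic)."""
    S, n = len(loglik), len(loglik[0])
    lppd, p_waic = 0.0, 0.0
    for i in range(n):
        col = [loglik[s][i] for s in range(S)]
        m = max(col)
        # log of the posterior-mean likelihood (log-sum-exp for stability)
        lppd += m + math.log(sum(math.exp(c - m) for c in col) / S)
        mean = sum(col) / S
        p_waic += sum((c - mean) ** 2 for c in col) / (S - 1)
    return -2.0 * (lppd - p_waic)

# Toy check: y ~ N(mu, 1) with posterior draws of mu concentrated near 0.
y = [random.gauss(0.0, 1.0) for _ in range(50)]
draws = [random.gauss(0.0, 0.15) for _ in range(1000)]
loglik = [[-0.5 * math.log(2 * math.pi) - 0.5 * (yi - mu) ** 2 for yi in y]
          for mu in draws]
w = waic(loglik)
print(f"WAIC ≈ {w:.1f}")
```

Lower WAIC indicates better expected predictive power for future data, which is the sense in which the abstract's LADE model is preferred.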
Flood quantile estimation at ungauged sites by Bayesian networks
NASA Astrophysics Data System (ADS)
Mediero, L.; Santillán, D.; Garrote, L.
2012-04-01
Estimating flood quantiles at sites for which no observed measurements are available is essential for water resources planning and management. Ungauged sites have no observations of flood magnitudes, but some site and basin characteristics are known. The most common technique is multiple regression analysis, which relates physical and climatic basin characteristics to flood quantiles. Regression equations are fitted from flood frequency data and basin characteristics at gauged sites. Regression is a rigid technique that assumes linear relationships between variables and cannot take measurement errors into account; in addition, prediction intervals are estimated in a very simplistic way from the variance of the residuals of the fitted model. Bayesian networks are a probabilistic computational structure from the field of Artificial Intelligence that has been widely and successfully applied in fields such as medicine and informatics, whereas its application in hydrology is recent. Bayesian networks infer the joint probability distribution of several related variables from observations through nodes, which represent random variables, and links, which represent causal dependencies between them. A Bayesian network is more flexible than regression equations, as it captures non-linear relationships between variables. In addition, the probabilistic nature of Bayesian networks allows the different sources of estimation uncertainty to be taken into account, since the result is a probability distribution. A homogeneous region in the Tagus Basin was selected as a case study. A regression equation was fitted with basin area, annual maximum 24-hour rainfall for a given recurrence interval, and mean elevation as explanatory variables. Flood quantiles at ungauged sites were then estimated by Bayesian networks. Bayesian networks need to be learnt from a sufficiently large data set.
Because the available observational data are limited, a stochastic generator of synthetic data was developed. Synthetic basin characteristics were randomised, preserving the statistical properties of the observed physical and climatic variables in the homogeneous region, and the synthetic flood quantiles were generated stochastically with the regression equation as a basis. The learnt Bayesian network was validated with the reliability diagram, the Brier score and the ROC diagram, which are common measures in the validation of probabilistic forecasts. In summary, flood quantile estimation with Bayesian networks supplies information about the prediction uncertainty, since the result is a probability distribution function of discharges. The Bayesian network model can therefore serve as decision support for water resources planning and management.
Bayes to the Rescue: Continuous Positive Airway Pressure Has Less Mortality Than High-Flow Oxygen.
Modesto I Alapont, Vicent; Khemani, Robinder G; Medina, Alberto; Del Villar Guerra, Pablo; Molina Cambra, Alfred
2017-02-01
The merits of high-flow nasal cannula oxygen versus bubble continuous positive airway pressure are debated in children with pneumonia, with suggestions that randomized controlled trials are needed. In light of a previous randomized controlled trial showing a trend toward lower mortality with bubble continuous positive airway pressure, we sought to determine, through a "robust" Bayesian analysis, the probability that a new randomized controlled trial would find high-flow nasal cannula oxygen superior to bubble continuous positive airway pressure. Sample data were extracted from the trial by Chisti et al., and, as required for a "robust" Bayesian analysis, we specified three prior distributions to represent clinically meaningful assumptions. These priors (reference, pessimistic, and optimistic) were used to generate three scenarios representing the range of possible hypotheses: 1) "Reference": we believe bubble continuous positive airway pressure and high-flow nasal cannula oxygen are equally effective, with the same uninformative reference priors; 2) "Sceptic on high-flow nasal cannula oxygen": we believe that bubble continuous positive airway pressure is better than high-flow nasal cannula oxygen (bubble continuous positive airway pressure has an optimistic prior and high-flow nasal cannula oxygen a pessimistic prior); and 3) "Enthusiastic on high-flow nasal cannula oxygen": we believe that high-flow nasal cannula oxygen is better than bubble continuous positive airway pressure (high-flow nasal cannula oxygen has an optimistic prior and bubble continuous positive airway pressure a pessimistic prior). Finally, posterior empirical Bayesian distributions were obtained through 100,000 Markov Chain Monte Carlo simulations.
In all three scenarios, there was a high probability of greater mortality with high-flow nasal cannula oxygen than with bubble continuous positive airway pressure (reference, 0.98; sceptic on high-flow nasal cannula oxygen, 0.982; enthusiastic on high-flow nasal cannula oxygen, 0.742). The posterior 95% credible interval on the difference in mortality indicated that a future randomized controlled trial would be extremely unlikely to find a mortality benefit for high-flow nasal cannula oxygen over bubble continuous positive airway pressure, regardless of the scenario. Interpreting these findings using the "range of practical equivalence" framework would lead to rejecting the hypothesis that high-flow nasal cannula oxygen is superior to bubble continuous positive airway pressure for these children. For children younger than 5 years with pneumonia, high-flow nasal cannula oxygen has higher mortality than bubble continuous positive airway pressure. A future randomized controlled trial in this population is unlikely to find high-flow nasal cannula oxygen superior to bubble continuous positive airway pressure.
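The beta-binomial core of such a reanalysis can be sketched as follows: each arm's mortality rate gets a Beta prior, the Beta posterior is sampled, and the draws are compared. The arm-level counts and the flat Beta(1, 1) priors below are illustrative placeholders, not the Chisti et al. data or the paper's actual priors.

```python
import random

random.seed(7)

# Illustrative arm-level counts (hypothetical, not the trial's data):
deaths_hfnc, n_hfnc = 10, 75    # high-flow nasal cannula arm
deaths_cpap, n_cpap = 3, 80     # bubble CPAP arm

def posterior_draws(deaths, n, a_prior, b_prior, n_draws=100_000):
    """Monte Carlo draws from the Beta posterior of a binomial mortality rate."""
    a, b = a_prior + deaths, b_prior + (n - deaths)
    return [random.betavariate(a, b) for _ in range(n_draws)]

# "Reference"-style scenario: the same uninformative Beta(1, 1) prior per arm.
hfnc = posterior_draws(deaths_hfnc, n_hfnc, 1, 1)
cpap = posterior_draws(deaths_cpap, n_cpap, 1, 1)
p_worse = sum(h > c for h, c in zip(hfnc, cpap)) / len(hfnc)
print(f"P(mortality HFNC > mortality CPAP) ≈ {p_worse:.3f}")
```

Swapping the Beta(1, 1) priors for optimistic or pessimistic Beta shapes reproduces the sceptic/enthusiastic scenarios described above.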
Combined effects of smoking and HPV16 in oropharyngeal cancer
Anantharaman, Devasena; Muller, David C; Lagiou, Pagona; Ahrens, Wolfgang; Holcátová, Ivana; Merletti, Franco; Kjærheim, Kristina; Polesel, Jerry; Simonato, Lorenzo; Canova, Cristina; Castellsague, Xavier; Macfarlane, Tatiana V; Znaor, Ariana; Thomson, Peter; Robinson, Max; Conway, David I; Healy, Claire M; Tjønneland, Anne; Westin, Ulla; Ekström, Johanna; Chang-Claude, Jenny; Kaaks, Rudolf; Overvad, Kim; Drogan, Dagmar; Hallmans, Göran; Laurell, Göran; Bueno-de-Mesquita, HB; Peeters, Petra H; Agudo, Antonio; Larrañaga, Nerea; Travis, Ruth C; Palli, Domenico; Barricarte, Aurelio; Trichopoulou, Antonia; George, Saitakis; Trichopoulos, Dimitrios; Quirós, J Ramón; Grioni, Sara; Sacerdote, Carlotta; Navarro, Carmen; Sánchez, María-José; Tumino, Rosario; Severi, Gianluca; Boutron-Ruault, Marie-Christine; Clavel-Chapelon, Francoise; Panico, Salvatore; Weiderpass, Elisabete; Lund, Eiliv; Gram, Inger T; Riboli, Elio; Pawlita, Michael; Waterboer, Tim; Kreimer, Aimée R; Johansson, Mattias; Brennan, Paul
2016-01-01
Background: Although smoking and HPV infection are recognized as important risk factors for oropharyngeal cancer, how their joint exposure impacts on oropharyngeal cancer risk is unclear. Specifically, whether smoking confers any additional risk to HPV-positive oropharyngeal cancer is not understood. Methods: Using HPV serology as a marker of HPV-related cancer, we examined the interaction between smoking and HPV16 in 459 oropharyngeal (and 1445 oral cavity and laryngeal) cancer patients and 3024 control participants from two large European multi-centre studies. Odds ratios and credible intervals [CrI], adjusted for potential confounders, were estimated using Bayesian logistic regression. Results: Both smoking (odds ratio [CrI]: 6.82 [4.52, 10.29]) and HPV seropositivity (OR [CrI]: 235.69 [99.95, 555.74]) were independently associated with oropharyngeal cancer. The joint association of smoking and HPV seropositivity was consistent with that expected on the additive scale (synergy index [CrI]: 1.32 [0.51, 3.45]), suggesting they act as independent risk factors for oropharyngeal cancer. Conclusions: Smoking was consistently associated with an increase in oropharyngeal cancer risk in models stratified by HPV16 seropositivity. In addition, we report that the prevalence of oropharyngeal cancer increases with smoking for both HPV16-positive and HPV16-negative persons. The impact of smoking on HPV16-positive oropharyngeal cancer highlights the continued need for smoking cessation programmes for primary prevention of head and neck cancer. PMID:27197530
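The synergy index reported above has a simple closed form on the additive scale: S = (OR_both − 1) / ((OR_A − 1) + (OR_B − 1)), with S = 1 indicating purely additive (independent) joint effects. A sketch follows; the ORs in the usage lines are made up for illustration, not taken from the study.

```python
def synergy_index(or_both, or_a, or_b):
    """Rothman's synergy index on the additive odds-ratio scale: the joint
    excess relative to the sum of the separate excesses (1 = purely additive)."""
    return (or_both - 1.0) / ((or_a - 1.0) + (or_b - 1.0))

# Purely additive joint effect: excesses 1 and 2 sum to the joint excess 3.
assert synergy_index(4.0, 2.0, 3.0) == 1.0
# S > 1 indicates super-additive (synergistic) interaction.
print(synergy_index(7.0, 2.0, 3.0))  # → 2.0
```

In the Bayesian analysis above, this quantity is computed on each posterior draw of the ORs, giving the credible interval reported for S.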
Hughes, Jacob B.; Hightower, Joseph E.
2015-01-01
Riverine hydroacoustic techniques are an effective method for evaluating abundance of upstream migrating anadromous fishes. To use these methods in the Roanoke River, North Carolina, at a wide site with uneven bottom topography, we used a combination of split-beam sonar and dual-frequency identification sonar (DIDSON) deployments. We aimed a split-beam sonar horizontally to monitor midchannel and near-bottom zones continuously over the 3-month spring monitoring periods in 2010 and 2011. The DIDSON was rotated between seven cross-channel locations (using a vertical aim) and nearshore regions (using horizontal aims). Vertical deployment addressed blind spots in split-beam coverage along the bottom and provided reliable information about the cross-channel and vertical distributions of upstream migrants. Using a Bayesian framework, we modeled sonar counts within four cross-channel strata and apportioned counts by species using species proportions from boat electrofishing and gill netting. Modeled estimates (95% credible intervals [CIs]) of total upstream migrants in 2010 and 2011 were 2.5 million (95% CI, 2.4–2.6 million) and 3.6 million (95% CI, 3.4–3.9 million), respectively. Results indicated that upstream migrants are extremely shore- and bottom-oriented, suggesting nearshore DIDSON monitoring improved the accuracy and precision of our estimates. This monitoring protocol and model may be widely applicable to river systems regardless of their cross-sectional width or profile.
Policing and risk of overdose mortality in urban neighborhoods
Bohnert, Amy S.B.; Nandi, Arijit; Tracy, Melissa; Cerdá, Magdalena; Tardiff, Kenneth J; Vlahov, David; Galea, Sandro
2010-01-01
Background Accidental drug overdose is a major cause of mortality among drug users. Fears of police arrest may deter witnesses of drug overdose from calling for medical help and may be a determinant of drug overdose mortality. To our knowledge, no studies have empirically assessed the relation between levels of policing and drug overdose mortality. We hypothesized that levels of police activity, congruent with fears of police arrest, are positively associated with drug overdose mortality. Methods We assembled cross-sectional time-series data for 74 New York City (NYC) police precincts over the period 1990–1999 using data collected from the Office of the Chief Medical Examiner of NYC, the NYC Police Department, and the US Census Bureau. Misdemeanor arrest rate—reflecting police activity—was our primary independent variable of interest, and overdose rate our primary dependent variable of interest. Results The mean overdose rate per 100,000 among police precincts in NYC between 1990 and 1999 was 10.8 (standard deviation = 10.0). In a Bayesian hierarchical model that included random spatial and temporal effects and a space-time interaction, the misdemeanor arrest rate per 1,000 was associated with higher overdose mortality (posterior median = 0.003, 95% Credible Interval = 0.001, 0.005) after adjustment for overall drug use in the precinct and demographic characteristics. Conclusions Levels of police activity in a precinct are associated with accidental drug overdose mortality. Future research should examine aspects of police-community interactions that contribute to higher overdose mortality. PMID:20727684
Salmonella risk in imported fresh beef, beef preparations, and beef products.
Tuominen, P; Ranta, J; Maijala, R
2006-08-01
Additional guarantees (AGs) for Salmonella in defined imported animal-derived foods were agreed on for Finland when it was admitted to the European Community. The aim of this project was to evaluate the impact of these AGs on the prevalence of Salmonella in the Finnish beef supply and the adequacy of their scope. According to the quantitative Bayesian model, the efficacy of the AGs depended mainly on the proportions of the different beef categories imported and on the true prevalence in the countries of origin. According to the model, the AGs reached their target in the reference year, 1999, keeping the true Salmonella prevalence of beef imports below 1% with quantified uncertainty. Extending the AGs to all imported fresh beef would have reduced the Salmonella prevalence of beef imports three- to fourfold, whereas extending them to all imports of fresh beef, beef preparations, and beef products would have resulted in a sixfold decrease. If the current AGs, which target fresh beef intended to be sold as fresh or to be processed by the Finnish industry with processes not reaching 70 degrees C, were not implemented, the 95% credible interval of Salmonella prevalence in the Finnish beef supply would be 0.2 to 1.3% (mean, 0.6%) instead of 0.1 to 1.2% (mean, 0.5%). However, if the prevalence in the exporting countries were to rise, or if the main import countries and/or volumes were to change, the AGs would be of greater importance.
Nationwide Increase in Cryptorchidism After the Fukushima Nuclear Accident.
Murase, Kaori; Murase, Joe; Machidori, Koji; Mizuno, Kentaro; Hayashi, Yutaro; Kohri, Kenjiro
2018-05-08
Our objective was to estimate the change in the discharge rate after cryptorchidism surgery between the pre- and postdisaster periods in Japan. Cryptorchidism cannot be diagnosed before birth and is not a factor that would influence a woman's decision to seek an abortion; this disease is therefore considered suitable for assessing how the Great East Japan Earthquake and the subsequent Fukushima Daiichi nuclear accident (2011) influenced congenital diseases. We obtained cryptorchidism discharge data collected over 6 years from hospitals included in an impact assessment survey of the Diagnosis Procedure Combination survey database in Japan and used these data to estimate the discharge rate after cryptorchidism surgery before and after the disaster. The 94 hospitals in Japan that participated in the Diagnosis Procedure Combination system and had 10 or more discharges after cryptorchidism surgery within the 6 consecutive years covering the pre- and postdisaster periods (FY2010-FY2015) were included. The change in the discharge rate between the pre- and postdisaster periods was analyzed using a Bayesian generalized linear mixed model. Nationwide, a 13.4% (95% credible interval 4.7%-23.0%) increase in discharge rates was estimated, and the results of all sensitivity analyses were similar to the reported main results. The discharge rate for cryptorchidism thus increased nationwide. The rates of low-birthweight babies and preterm births, risk factors for cryptorchidism, were almost constant during the study period, and the age distribution at surgery was also unchanged, suggesting that other factors associated with the disaster increased the incidence of cryptorchidism.
A Daily Diary Study of Posttraumatic Stress Symptoms and Romantic Partner Accommodation
Campbell, Sarah B.; Renshaw, Keith D.; Kashdan, Todd B.; Curby, Timothy W.; Carter, Sarah P.
2017-01-01
Little is known about the role of romantic partner symptom accommodation in PTSD symptom maintenance. To explore the bidirectional associations of posttraumatic stress disorder (PTSD) symptoms and romantic partner symptom accommodation over time, military servicemen (n = 64) with symptoms of PTSD and their co-habiting heterosexual civilian romantic partners (n = 64) completed a 2-week daily diary study. Cross-lagged, autoregressive models assessed the stability of men’s PTSD symptoms and partners’ accommodation, as well as the prospective associations of earlier PTSD symptoms with later accommodation and vice versa. Analyses used Bayesian estimation to provide point estimates (b) and Credible Intervals (CIs). In all models, PTSD symptoms (total and individual clusters) were highly stable (b = 0.91; CI: 0.88–0.95), and accommodation was moderately stable (b = 0.48; CI: 0.40–0.54). In all models, earlier PTSD symptoms (total and clusters) were significantly, positively associated with later accommodation (b = 0.04; CI: 0.02–0.07). In contrast, earlier accommodation was significantly associated only with later situational avoidance (b = 0.02; CI: 0.00–0.07). Thus, PTSD symptoms may lead to subsequent accommodating behaviors in romantic partners, but partner accommodation seems to contribute only to survivors’ future situational avoidance symptoms. The findings reinforce the notion that PTSD symptoms have an impact on relationship behaviors, and that accommodation from partners may sustain avoidant behaviors in particular. Clinicians should attend to romantic partners’ accommodating behaviors when working with survivors. PMID:28270332
Neurobehavioral performance in adolescents is inversely associated with traffic exposure.
Kicinski, Michal; Vermeir, Griet; Van Larebeke, Nicolas; Den Hond, Elly; Schoeters, Greet; Bruckers, Liesbeth; Sioen, Isabelle; Bijnens, Esmée; Roels, Harry A; Baeyens, Willy; Viaene, Mineke K; Nawrot, Tim S
2015-02-01
On the basis of animal research and of epidemiological studies in children and the elderly, there is growing concern that traffic exposure may affect the brain. The aim of our study was to investigate the association between traffic exposure and neurobehavioral performance in adolescents. We examined 606 adolescents. To model the exposure, we constructed a traffic exposure factor based on a biomarker of benzene (urinary trans,trans-muconic acid) and the amount of contact with traffic preceding the neurobehavioral examination (using distance-weighted traffic density and time spent in traffic). We used a Bayesian structural equation model to investigate the association between traffic exposure and three neurobehavioral domains: sustained attention, short-term memory, and manual motor speed. A one-standard-deviation increase in traffic exposure was associated with a 0.26-standard-deviation decrease in sustained attention (95% credible interval: -0.02 to -0.51), adjusting for gender, age, smoking, passive smoking, the mother's level of education, socioeconomic status, time of day, and day of the week. The associations between traffic exposure and the other neurobehavioral domains had the same direction but did not reach statistical significance. The results remained consistent in a sensitivity analysis excluding smokers and passive smokers, and the inverse association between sustained attention and traffic exposure was independent of the blood lead level. Our study in adolescents supports recent findings in children and the elderly suggesting that traffic exposure adversely affects neurobehavioral function.
Zhao, Xiaojing; Zhou, Changcheng; Ma, Jingjing; Zhu, Yunjuan; Sun, Min; Wang, Peixue; Zhang, Yi; Ma, Haiqin; Zhang, Hongjie
2017-01-01
Topical 5-aminosalicylic acid (5-ASA) and corticosteroids are used frequently in the treatment of active distal ulcerative colitis (UC). Our study aimed to determine the efficacy and safety of different topical drugs used to treat active distal UC. A random-effects model within a Bayesian framework was utilized to compare treatment effects and safety as odds ratios (ORs) with corresponding 95% credible intervals (CrI). The surface under the cumulative ranking area (SUCRA) and median rank (MR) with corresponding 95% CrI were calculated to rank the treatment outcomes. In the induction of clinical and endoscopic remission, most regimens showed significant advantages over placebo except topical budesonide 0.5 mg/d and hydrocortisone 100 mg/d. According to SUCRA and MR values, rectal 5-ASA 1.5 to 2.0 g/d + beclomethasone dipropionate (BDP) 3 mg/d rendered the highest probability of being the best regimen to achieve clinical and endoscopic remission, followed by the separate use of 5-ASA 4 g/d and BDP 3 mg/d. The occurrence of adverse events was not significantly different between each treatment and placebo. In conclusion, the combined use of topical 5-ASA and BDP proved to be the best choice for active distal UC, and further well-designed research is warranted to assess its efficacy and safety. PMID:28440311
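As a rough illustration of the SUCRA statistic used above to rank regimens, the sketch below computes SUCRA from posterior rank probabilities. The rank-probability vectors are made up for illustration, not taken from the study.

```python
# SUCRA (surface under the cumulative ranking curve): the average of the
# cumulative probabilities of being among the top r treatments, r = 1..k-1.

def sucra(rank_probs):
    """rank_probs[r] = posterior probability of being ranked r+1 (rank 1 = best)."""
    k = len(rank_probs)
    cum = 0.0
    total = 0.0
    for r in range(k - 1):
        cum += rank_probs[r]   # P(rank <= r+1)
        total += cum
    return total / (k - 1)

# A treatment certain to rank first scores 1; certain to rank last scores 0.
assert sucra([1.0, 0.0, 0.0]) == 1.0
assert sucra([0.0, 0.0, 1.0]) == 0.0
```

Higher SUCRA values correspond to a higher probability of being among the better-ranked treatments, which is how the "best regimen" statements above are derived.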
Passive acoustic monitoring of the decline of Mexico's critically endangered vaquita.
Jaramillo-Legorreta, Armando; Cardenas-Hinojosa, Gustavo; Nieto-Garcia, Edwyna; Rojas-Bracho, Lorenzo; Ver Hoef, Jay; Moore, Jeffrey; Tregenza, Nicholas; Barlow, Jay; Gerrodette, Tim; Thomas, Len; Taylor, Barbara
2017-02-01
The vaquita (Phocoena sinus) is the world's most endangered marine mammal with approximately 245 individuals remaining in 2008. This species of porpoise is endemic to the northern Gulf of California, Mexico, and historically the population has declined because of unsustainable bycatch in gillnets. An illegal gillnet fishery for an endangered fish, the totoaba (Totoaba macdonaldi), has recently resurged throughout the vaquita's range. The secretive but lucrative wildlife trade with China for totoaba swim bladders has probably increased vaquita bycatch mortality by an unknown amount. Precise population monitoring by visual surveys is difficult because vaquitas are inherently hard to see and have now become so rare that sighting rates are very low. However, their echolocation clicks can be identified readily on specialized acoustic detectors. Acoustic detections on an array of 46 moored detectors indicated vaquita acoustic activity declined by 80% between 2011 and 2015 in the central part of the species' range. Statistical models estimated an annual rate of decline of 34% (95% Bayesian credible interval -48% to -21%). Based on results from 2011 to 2014, the government of Mexico enacted and is enforcing an emergency 2-year ban on gillnets throughout the species' range to prevent extinction, at a cost of US$74 million to compensate fishers. Developing precise acoustic monitoring methods proved critical to exposing the severity of vaquitas' decline and emphasizes the need for continual monitoring to effectively manage critically endangered species. © 2016 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
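A back-of-envelope check of the figures in this abstract: an 80% decline in acoustic activity over the four years 2011-2015 corresponds to a constant annual rate of decline close to the reported 34%. This is simple arithmetic, not the study's statistical model.

```python
# Solve (1 - annual)**years = 1 - total_decline for the annual rate.

def annual_decline(total_decline, years):
    return 1.0 - (1.0 - total_decline) ** (1.0 / years)

rate = annual_decline(0.80, 4)
assert abs(rate - 0.33) < 0.01  # ~33% per year, near the reported 34%
```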
North Atlantic observations sharpen meridional overturning projections
NASA Astrophysics Data System (ADS)
Olson, R.; An, S.-I.; Fan, Y.; Evans, J. P.; Caesar, L.
2018-06-01
Atlantic Meridional Overturning Circulation (AMOC) projections are uncertain due to both model errors and internal climate variability. An AMOC slowdown projected by many climate models is likely to have considerable effects on many aspects of global and North Atlantic climate. Previous probabilistic AMOC projection studies have broken new ground. However, they do not drift-correct or cross-validate the projections, and do not fully account for internal variability. Furthermore, they consider a limited subset of models, and ignore the skill of models at representing the temporal North Atlantic dynamics. We improve on previous work by applying Bayesian Model Averaging to weight 13 Coupled Model Intercomparison Project phase 5 models by their skill at modeling the AMOC strength, and its temporal dynamics, as approximated by the northern North Atlantic temperature-based AMOC Index. We make drift-corrected projections accounting for structural model errors, and for the internal variability. Cross-validation experiments give approximately correct empirical coverage probabilities, which validates our method. Our results present more evidence that the AMOC likely already started slowing down. While weighting considerably moderates and sharpens our projections, our results are at the low end of previously published estimates. We project mean AMOC changes between the periods 1960-1999 and 2060-2099 of -4.0 Sv and -6.8 Sv for the RCP4.5 and RCP8.5 emissions scenarios respectively. The corresponding average 90% credible intervals for our weighted experiments are [-7.2, -1.2] and [-10.5, -3.7] Sv respectively for the two scenarios.
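The model-weighting step of Bayesian Model Averaging can be sketched as a normalized weighted average of per-model projections. The projections and skill scores below are hypothetical placeholders, not CMIP5 values or the study's weights.

```python
# Minimal BMA sketch: weight each model's projection by its relative
# skill (proportional to a marginal likelihood), then average.

def bma_mean(projections, likelihoods):
    z = sum(likelihoods)
    weights = [l / z for l in likelihoods]          # normalize to sum to 1
    return sum(w * p for w, p in zip(weights, projections))

proj = [-2.0, -4.0, -6.0]   # projected AMOC change (Sv), hypothetical
lik = [0.1, 0.6, 0.3]       # relative model skill, hypothetical
assert abs(bma_mean(proj, lik) - (-4.4)) < 1e-9
```

In the full method the same weights would also combine the models' predictive distributions (not just their means), which is what moderates and sharpens the credible intervals.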
Siikamaki, H; Kivela, P; Fotopoulos, M; Ollgren, J; Kantele, A
2015-05-14
The number of international tourist arrivals reached 1,000 million in 2012. Assessment of travellers' health problems has relied on proportionate morbidity data. Given the lack of data on the number of visitors to each region, incidences have been impossible to calculate. This study, the largest yet to report travellers' health problems, is the first to present incidences of illness and injury. Data on Finnish travellers with health problems abroad during 2010 to 2012 were retrieved from the database of an assistance organisation, SOS International, covering 95% of those requiring aid abroad. The numbers were compared with those of Finnish travellers in the database of the Official Statistics of Finland. The SOS International database included 50,710 cases: infections constituted the most common health problem (60%), followed by injuries (14%), diseases of the skin (5%), musculoskeletal system and connective tissue (5%), digestive tract (3%), and vascular system (2%). Gastroenteritis (23%) and respiratory infections (21%) proved the most frequent diagnoses. Overall incidence of illness or injury was high in Africa (97.9/100,000 travel days; 95% Bayesian credible interval (BCI): 53.1–145.5), southern Europe plus the eastern Mediterranean (92.3; 95% BCI: 75.4–110.1) and Asia (65.0; 95% BCI: 41.5–87.9). The data show significant differences between geographical regions, indicating the main risks and thus providing destination-specific tools for travellers' healthcare.
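An incidence estimate with a Bayesian credible interval, as reported above, can be sketched with a conjugate Gamma-Poisson model. The case count, exposure, and prior below are assumptions chosen for illustration only, not the SOS International data.

```python
import random

# With a Gamma(a, b) prior on the rate and a Poisson count of cases over a
# known exposure (travel days), the posterior is Gamma(a + cases, b + exposure).

def incidence_interval(cases, exposure_days, a=0.5, b=0.0, draws=20000, seed=1):
    rng = random.Random(seed)
    shape, rate = a + cases, b + exposure_days
    # rate per 100,000 travel days, sampled from the Gamma posterior
    samples = sorted(rng.gammavariate(shape, 1.0 / rate) * 1e5
                     for _ in range(draws))
    return samples[int(0.025 * draws)], samples[int(0.975 * draws)]

lo, hi = incidence_interval(cases=50, exposure_days=55000)
assert lo < 50 / 55000 * 1e5 < hi   # interval brackets the crude rate
```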
Association between Severity of MERS-CoV Infection and Incubation Period.
Virlogeux, Victor; Park, Minah; Wu, Joseph T; Cowling, Benjamin J
2016-03-01
We analyzed data for 170 patients in South Korea who had laboratory-confirmed infection with Middle East respiratory syndrome coronavirus. A longer incubation period was associated with a reduction in the risk for death (adjusted odds ratio/1-day increase in incubation period 0.83, 95% credibility interval 0.68-1.03).
Hawken, Steven; Kwong, Jeffrey C.; Deeks, Shelley L.; Crowcroft, Natasha S.; McGeer, Allison J.; Ducharme, Robin; Campitelli, Michael A.; Coyle, Doug
2015-01-01
It is unclear whether seasonal influenza vaccination results in a net increase or decrease in the risk for Guillain-Barré syndrome (GBS). To assess the effect of seasonal influenza vaccination on the absolute risk of acquiring GBS, we used simulation models and published estimates of age- and sex-specific risks for GBS, influenza incidence, and vaccine effectiveness. For a hypothetical 45-year-old woman and 75-year-old man, excess GBS risk for influenza vaccination versus no vaccination was −0.36/1 million vaccinations (95% credible interval −1.22 to 0.28) and −0.42/1 million vaccinations (95% credible interval −3.68 to 2.44), respectively. These numbers represent a small absolute reduction in GBS risk with vaccination. Under typical conditions (e.g., influenza incidence rates >5% and vaccine effectiveness >60%), vaccination reduced GBS risk. These findings should strengthen confidence in the safety of influenza vaccine and allow health professionals to better put GBS risk in context when discussing influenza vaccination with patients. PMID:25625590
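The risk-balance logic behind this kind of simulation can be sketched in a few lines: vaccination adds a small direct GBS risk but averts influenza-associated GBS cases. The parameter values below are invented for illustration, not the study's published estimates.

```python
# Net (excess) GBS risk per vaccinated person: direct vaccine-attributable
# risk minus the influenza-attributable risk averted by vaccination.

def excess_gbs_risk(vaccine_gbs_risk, flu_incidence,
                    vaccine_effectiveness, flu_gbs_risk):
    averted = vaccine_effectiveness * flu_incidence * flu_gbs_risk
    return vaccine_gbs_risk - averted

# hypothetical inputs, per million vaccinations: +1.0 direct risk,
# 5% influenza incidence, 60% effectiveness, 50/million GBS risk per case
risk = excess_gbs_risk(1.0, 0.05, 0.6, 50.0)
assert risk < 0   # net reduction in GBS risk under these assumptions
```

The full study propagates uncertainty in each input through the simulation, which is what yields the credible intervals quoted above.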
Population and energy elasticity of tornado casualties
NASA Astrophysics Data System (ADS)
Fricker, Tyler; Elsner, James B.; Jagger, Thomas H.
2017-04-01
Tornadoes are capable of catastrophic destruction and mass casualties, but there are as yet no estimates of how sensitive the number of casualties is to changes in the number of people in harm's way or to changes in tornado energy. Here the relationship between tornado casualties (deaths and injuries), population, and energy dissipation is quantified using the economic concept of "elasticity." Records of casualties from individual tornadoes over the period 2007-2015 are fit to a regression model. The coefficient on the population term (population elasticity) indicates that a doubling in population increases the casualty rate by 21% [(17, 24)%, 95% credible interval]. The coefficient on the energy term (energy elasticity) indicates that a doubling in energy dissipation leads to a 33% [(30, 35)%, 95% credible interval] increase in the casualty rate. The difference in elasticity values shows that, on average, changes in energy dissipation have been relatively more important in explaining tornado casualties than changes in population. Assuming no changes in warning effectiveness or mitigation efforts, these elasticity estimates can be used to project changes in casualties given the known population trends and possible trends in tornado activity.
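The elasticity reading above (a doubling of population raises the casualty rate by 21%) follows from a regression on log-transformed predictors: a coefficient b on log2(population) multiplies the expected rate by 2**b per doubling. The sketch below works backwards from the reported percentages; it is not the fitted model.

```python
import math

def doubling_effect(beta):
    """Fractional change in the expected rate per doubling of the predictor."""
    return 2.0 ** beta - 1.0

beta_pop = math.log(1.21, 2)     # coefficient implying the reported +21%
beta_energy = math.log(1.33, 2)  # coefficient implying the reported +33%
assert abs(doubling_effect(beta_pop) - 0.21) < 1e-9
assert doubling_effect(beta_energy) > doubling_effect(beta_pop)
```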
A Fatty Acid Based Bayesian Approach for Inferring Diet in Aquatic Consumers
Holtgrieve, Gordon W.; Ward, Eric J.; Ballantyne, Ashley P.; Burns, Carolyn W.; Kainz, Martin J.; Müller-Navarra, Doerthe C.; Persson, Jonas; Ravet, Joseph L.; Strandberg, Ursula; Taipale, Sami J.; Alhgren, Gunnel
2015-01-01
We modified the stable isotope mixing model MixSIR to infer primary producer contributions to consumer diets based on their fatty acid composition. To parameterize the algorithm, we generated a ‘consumer-resource library’ of FA signatures of Daphnia fed different algal diets, using 34 feeding trials representing diverse phytoplankton lineages. This library corresponds to the resource or producer file in classic Bayesian mixing models such as MixSIR or SIAR. Because this library is based on the FA profiles of zooplankton consuming known diets, and not the FA profiles of algae directly, trophic modification of consumer lipids is directly accounted for. To test the model, we simulated hypothetical Daphnia composed of 80% diatoms, 10% green algae, and 10% cryptophytes and compared the FA signatures of these known pseudo-mixtures to outputs generated by the mixing model. The algorithm inferred these simulated consumers were composed of 82% (63-92%) [median (2.5th to 97.5th percentile credible interval)] diatoms, 11% (4-22%) green algae, and 6% (0-25%) cryptophytes. We used the same model with published phytoplankton stable isotope (SI) data for δ13C and δ15N to examine how a SI based approach resolved a similar scenario. With SI, the algorithm inferred that the simulated consumer assimilated 52% (4-91%) diatoms, 23% (1-78%) green algae, and 18% (1-73%) cyanobacteria. The accuracy and precision of SI based estimates were extremely sensitive to both resource and consumer uncertainty, as well as to the trophic fractionation assumption. These results indicate that when using only two tracers with substantial uncertainty for the putative resources, as is often the case in this class of analyses, the underdetermined constraint in consumer-resource SI analyses may be intractable. 
The FA based approach alleviated the underdetermined constraint because many more FA biomarkers were utilized (n > 20), different primary producers (e.g., diatoms, green algae, and cryptophytes) have very characteristic FA compositions, and the FA profiles of many aquatic primary consumers are strongly influenced by their diets. PMID:26114945
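The forward step of a MixSIR-style mixing model can be sketched as a proportion-weighted combination of resource signatures, which is then compared against the observed consumer signature during Bayesian inference. The two FA markers and diet proportions below are invented for illustration.

```python
# Predicted consumer signature under given diet proportions: each biomarker
# is the proportion-weighted average of the resource signatures.

def mix(sources, proportions):
    n = len(sources[0])
    return [sum(p * s[i] for p, s in zip(proportions, sources))
            for i in range(n)]

# hypothetical signatures for two FA markers per resource
diatom, green, crypto = [0.30, 0.05], [0.05, 0.25], [0.10, 0.10]
pred = mix([diatom, green, crypto], [0.8, 0.1, 0.1])
assert abs(pred[0] - (0.8 * 0.30 + 0.1 * 0.05 + 0.1 * 0.10)) < 1e-12
```

With 20+ biomarkers instead of two isotopes, many more such equations constrain the same three diet proportions, which is why the FA approach escapes the underdetermined-system problem noted above.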
Colorectal cancer mortality and industrial pollution in Spain
2012-01-01
Background Records kept as a result of the implementation of Integrated Pollution Prevention and Control (IPPC) and the European Pollutant Release and Transfer Register (E-PRTR) constitute a public inventory of industries, created by the European Commission, which is a valuable resource for monitoring industrial pollution. Our objective is to ascertain whether there might be excess colorectal cancer mortality among populations residing in the vicinity of Spanish industrial installations that are governed by the IPPC Directive and E-PRTR Regulation and report their emissions to air. Methods An ecological study was designed to examine colorectal cancer mortality at a municipal level (8098 Spanish towns), over the period 1997–2006. We conducted an exploratory "near vs. far" analysis to estimate the relative risks (RR) of towns situated at a distance of less than 2 km from industrial installations. The analysis was repeated for each of the 24 industrial groups. RR and their 95% credible/confidence intervals (95%CI) were estimated on the basis of Poisson regression models, using two types of modelling: a) the conditional autoregressive Bayesian model proposed by Besag, York and Mollié, with explanatory variables; and b) a mixed regression model. Integrated nested Laplace approximations were used as a Bayesian inference tool. Results Statistically significant RRs were detected in the vicinity of the mining industry (RR 1.258; 95%CI 1.082–1.463), paper and wood production (RR 1.071; 95%CI 1.007–1.140), the food and beverage sector (RR 1.069; 95%CI 1.029–1.111), metal production and processing installations (RR 1.065; 95%CI 1.011–1.123) and ceramics (RR 1.050; 95%CI 1.004–1.099). Conclusions Given the exploratory nature of this study, it would seem advisable to check in other countries or with other designs, if the proximity of industries that emit pollutants into the air could be an added risk factor for colorectal cancer mortality. 
Nevertheless, some of the differences between men and women observed in the analyses of the industrial groups suggest that there may be a component of occupational exposure, little-studied in the case of cancers of the digestive system. PMID:22852770
Nuñez-Garcia, Javier; Downs, Sara H; Parry, Jessica E; Abernethy, Darrell A; Broughan, Jennifer M; Cameron, Angus R; Cook, Alasdair J; de la Rua-Domenech, Ricardo; Goodchild, Anthony V; Gunn, Jane; More, Simon J; Rhodes, Shelley; Rolfe, Simon; Sharp, Michael; Upton, Paul A; Vordermeier, H Martin; Watson, Eamon; Welsh, Michael; Whelan, Adam O; Woolliams, John A; Clifton-Hadley, Richard S; Greiner, Matthias
2018-05-01
Bovine Tuberculosis (bTB) in cattle is a global health problem and eradication of the disease requires accurate estimates of diagnostic test performance to optimize their efficiency. The objective of this study was, through statistical meta-analyses, to obtain estimates of sensitivity (Se) and specificity (Sp), for 14 different ante-mortem and post-mortem diagnostic tests for bTB in cattle. Using data from a systematic review of the scientific literature (published 1934-2009) diagnostic Se and Sp were estimated using Bayesian logistic regression models adjusting for confounding factors. Random effect terms were used to account for unexplained heterogeneity. Parameters in the models were implemented using Markov Chain Monte Carlo (MCMC), and posterior distributions for the diagnostic parameters with adjustment for covariates (confounding factors) were obtained using the inverse logit function. Estimates for Se and/or Sp of the tuberculin skin tests and the IFN-γ blood test were compared with estimates published 2010-2015. Median Se for the single intradermal comparative cervical tuberculin skin (SICCT) test (standard interpretation) was 0.50 and Bayesian credible intervals (CrI) were wide (95% CrI 0.26, 0.78). Median Sp for the SICCT test was 1.00 (95% CrI 0.99, 1.00). Estimates for the IFN-γ blood test with Bovine Purified Protein Derivative (PPD)-Avian PPD and with Early Secreted Antigen target 6/Culture Filtrate Protein 10 (ESAT-6/CFP10) antigens were 0.67 (95% CrI 0.49, 0.82) and 0.78 (95% CrI 0.60, 0.90) respectively for Se, and 0.98 (95% CrI 0.96, 0.99) and 0.99 (95% CrI 0.99, 1.00) for Sp. The study provides an overview of the accuracy of a range of contemporary diagnostic tests for bTB in cattle. Better understanding of diagnostic test performance is essential for the design of effective control strategies and their evaluation. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
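The inverse logit transformation named in this abstract maps posterior draws from the logit scale back to the probability (Se/Sp) scale. A minimal sketch, with hypothetical posterior draws:

```python
import math

def inv_logit(x):
    """Inverse logit (expit): maps the real line onto (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

logit_draws = [-0.2, 0.0, 0.3]        # hypothetical MCMC draws on the logit scale
se_draws = [inv_logit(x) for x in logit_draws]
assert inv_logit(0.0) == 0.5
assert all(0.0 < s < 1.0 for s in se_draws)
```

Summaries such as the posterior median and the 2.5th/97.5th percentiles of the transformed draws give the median Se/Sp and 95% CrI quoted above.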
Lunn, Nicholas J; Servanty, Sabrina; Regehr, Eric V; Converse, Sarah J; Richardson, Evan; Stirling, Ian
2016-07-01
Changes in the abundance and distribution of wildlife populations are common consequences of historic and contemporary climate change. Some Arctic marine mammals, such as the polar bear (Ursus maritimus), may be particularly vulnerable to such changes due to the loss of Arctic sea ice. We evaluated the impacts of environmental variation on demographic rates for the Western Hudson Bay (WH), polar bear subpopulation from 1984 to 2011 using live-recapture and dead-recovery data in a Bayesian implementation of multistate capture-recapture models. We found that survival of female polar bears was related to the annual timing of sea ice break-up and formation. Using estimated vital rates (e.g., survival and reproduction) in matrix projection models, we calculated the growth rate of the WH subpopulation and projected population responses under different environmental scenarios while accounting for parametric uncertainty, temporal variation, and demographic stochasticity. Our analysis suggested a long-term decline in the number of bears from 1185 (95% Bayesian credible interval [BCI] = 993-1411) in 1987 to 806 (95% BCI = 653-984) in 2011. In the last 10 yr of the study, the number of bears appeared stable due to temporary stability in sea ice conditions (mean population growth rate for the period 2001-2010 = 1.02, 95% BCI = 0.98-1.06). Looking forward, we estimated long-term growth rates for the WH subpopulation of ~1.02 (95% BCI = 1.00-1.05) and 0.97 (95% BCI = 0.92-1.01) under hypothetical high and low sea ice conditions, respectively. Our findings support previous evidence for a demographic linkage between sea ice conditions and polar bear population dynamics. Furthermore, we present a robust framework for sensitivity analysis with respect to continued climate change (e.g., to inform scenario planning) and for evaluating the combined effects of climate change and management actions on the status of wildlife populations. © 2016 by the Ecological Society of America.
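The matrix projection step described above can be sketched with power iteration on a stage-structured matrix: the long-term growth rate is the dominant eigenvalue of the projection matrix. The vital rates below are hypothetical, not the Western Hudson Bay estimates.

```python
# Power iteration on a non-negative projection matrix: repeatedly project
# the stage vector and renormalize; the normalizing constant converges to
# the dominant eigenvalue (the asymptotic population growth rate).

def growth_rate(matrix, iters=200):
    n = [1.0] * len(matrix)
    lam = 1.0
    for _ in range(iters):
        n = [sum(row[j] * n[j] for j in range(len(n))) for row in matrix]
        lam = sum(n)
        n = [x / lam for x in n]
    return lam

# hypothetical two-stage model: row 1 = fecundity, row 2 = survival rates
A = [[0.0, 0.4],
     [0.5, 0.9]]
lam = growth_rate(A)
assert 0.9 < lam < 1.2   # near-stationary population under these rates
```

Propagating posterior draws of the vital rates through this calculation yields the credible intervals on growth rate reported above.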
Bogaards, Johannes A; Wallinga, Jacco; Brakenhoff, Ruud H; Meijer, Chris J L M; Berkhof, Johannes
2015-05-12
To assess the reduction in the vaccine preventable burden of cancer in men if boys are vaccinated along with girls against oncogenic human papillomavirus (HPV). Bayesian evidence synthesis approach used to evaluate the impact of vaccination against HPV types 16 and 18 on the burden of anal, penile, and oropharyngeal carcinomas among heterosexual men and men who have sex with men. The reduced transmission of vaccine-type HPV from vaccination of girls was assumed to lower the risk of HPV associated cancer in all men but not to affect the excess risk of HPV associated cancers among men who have sex with men. General population in the Netherlands. Inclusion of boys aged 12 into HPV vaccination programmes. Quality adjusted life years (QALYs) and numbers needed to vaccinate. Before HPV vaccination, 14.9 (95% credible interval 12.2 to 18.1) QALYs per thousand men were lost to vaccine preventable cancers associated with HPV in the Netherlands. This burden would be reduced by 37% (28% to 48%) if the vaccine uptake among girls remains at the current level of 60%. To prevent one additional case of cancer among men, 795 boys (660 to 987) would need to be vaccinated; with tumour specific numbers for anal, penile, and oropharyngeal cancer of 2162, 3486, and 1975, respectively. The burden of HPV related cancer in men would be reduced by 66% (53% to 80%) if vaccine uptake among girls increases to 90%. In that case, 1735 boys (1240 to 2900) would need to be vaccinated to prevent an additional case; with tumour specific numbers for anal, penile, and oropharyngeal cancer of 2593, 29107, and 6484, respectively. Men will benefit indirectly from vaccination of girls but remain at risk of cancers associated with HPV. The incremental benefit of vaccinating boys when vaccine uptake among girls is high is driven by the prevention of anal carcinomas, which underscores the relevance of HPV prevention efforts for men who have sex with men. © Bogaards et al 2015.
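Number-needed-to-vaccinate figures like those above are reciprocals of absolute risk reductions. A sketch with invented risks of roughly the right order of magnitude, not the study's model outputs:

```python
# NNV = 1 / (absolute risk reduction per vaccinated boy).

def nnv(risk_unvaccinated, risk_vaccinated):
    arr = risk_unvaccinated - risk_vaccinated   # absolute risk reduction
    return 1.0 / arr

# e.g. a hypothetical lifetime HPV-cancer risk per boy falling
# from 0.40% to 0.27% with vaccination
assert round(nnv(0.0040, 0.0027)) == 769   # hundreds of boys per case averted
```

Smaller absolute risk reductions (as when high uptake among girls already suppresses transmission) drive the NNV up, which is the pattern reported above.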
Washington, Simon; Oh, Jutaek
2006-03-01
Transportation professionals are sometimes required to make difficult transportation safety investment decisions in the face of uncertainty. In particular, an engineer may be expected to choose among an array of technologies and/or countermeasures to remediate perceived safety problems when: (1) little information is known about the countermeasure effects on safety; (2) information is known but from different regions, states, or countries where a direct generalization may not be appropriate; (3) where the technologies and/or countermeasures are relatively untested, or (4) where costs prohibit the full and careful testing of each of the candidate countermeasures via before-after studies. The importance of an informed and well-considered decision based on the best possible engineering knowledge and information is imperative due to the potential impact on the numbers of human injuries and deaths that may result from these investments. This paper describes the formalization and application of a methodology to evaluate the safety benefit of countermeasures in the face of uncertainty. To illustrate the methodology, 18 countermeasures for improving safety of at grade railroad crossings (AGRXs) in the Republic of Korea are considered. Akin to "stated preference" methods in travel survey research, the methodology applies random selection and laws of large numbers to derive accident modification factor (AMF) densities from expert opinions. In a full Bayesian analysis framework, the collective opinions in the form of AMF densities (data likelihood) are combined with prior knowledge (AMF density priors) for the 18 countermeasures to obtain 'best' estimates of AMFs (AMF posterior credible intervals). The countermeasures are then compared and recommended based on the largest safety returns with minimum risk (uncertainty). To the authors' knowledge the complete methodology is new and has not previously been applied or reported in the literature. 
The results demonstrate that the methodology is able to discern anticipated safety benefit differences across candidate countermeasures. For the 18 countermeasures considered in this analysis, it was found that the top three performing countermeasures for reducing crashes are in-vehicle warning systems, obstacle detection systems, and constant warning time systems.
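The prior-times-expert-likelihood combination described above can be sketched, under a normal approximation to both densities, as a precision-weighted average. The AMF means and variances below are illustrative values, not the study's elicited densities.

```python
# Conjugate normal-normal update: the posterior mean is the
# precision-weighted average of the prior mean and the data (expert) mean.

def combine_normal(prior_mean, prior_var, data_mean, data_var):
    w1, w2 = 1.0 / prior_var, 1.0 / data_var
    post_var = 1.0 / (w1 + w2)
    post_mean = post_var * (w1 * prior_mean + w2 * data_mean)
    return post_mean, post_var

# hypothetical AMF: prior centred at 1.0 (no effect), experts centre at 0.7
mean, var = combine_normal(prior_mean=1.0, prior_var=0.04,
                           data_mean=0.7, data_var=0.01)
assert 0.7 < mean < 1.0   # posterior pulled toward the expert consensus
assert var < 0.01         # and more certain than either source alone
```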
A brain-machine interface for control of medically-induced coma.
Shanechi, Maryam M; Chemali, Jessica J; Liberman, Max; Solt, Ken; Brown, Emery N
2013-10-01
Medically-induced coma is a drug-induced state of profound brain inactivation and unconsciousness used to treat refractory intracranial hypertension and to manage treatment-resistant epilepsy. The state of coma is achieved by continually monitoring the patient's brain activity with an electroencephalogram (EEG) and manually titrating the anesthetic infusion rate to maintain a specified level of burst suppression, an EEG marker of profound brain inactivation in which bursts of electrical activity alternate with periods of quiescence or suppression. Medical coma is often required for several days. A more rational approach would be to implement a brain-machine interface (BMI) that monitors the EEG and adjusts the anesthetic infusion rate in real time to maintain the specified target level of burst suppression. We used a stochastic control framework to develop a BMI to control medically-induced coma in a rodent model. The BMI controlled an EEG-guided closed-loop infusion of the anesthetic propofol to maintain precisely specified dynamic target levels of burst suppression. We used as the control signal the burst suppression probability (BSP), the brain's instantaneous probability of being in the suppressed state. We characterized the EEG response to propofol using a two-dimensional linear compartment model and estimated the model parameters specific to each animal prior to initiating control. We derived a recursive Bayesian binary filter algorithm to compute the BSP from the EEG, and derived controllers using a linear-quadratic regulator and a model-predictive control strategy. Both controllers used the estimated BSP as feedback. The BMI accurately controlled burst suppression in individual rodents across dynamic target trajectories, and enabled prompt transitions between target levels while avoiding both undershoot and overshoot. 
The median performance error for the BMI was 3.6%, the median bias was -1.4%, and the overall posterior probability of reliable control was 1 (95% Bayesian credibility interval of [0.87, 1.0]). A BMI can maintain reliable and accurate real-time control of medically-induced coma in a rodent model, suggesting that this strategy could be applied in patient care.
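A highly simplified stand-in for the recursive Bayesian binary filter described above: a discounted Beta-Bernoulli update of the BSP from a binarised EEG stream (1 = suppressed). The real algorithm uses a two-compartment state-space model; this sketch only conveys the recursive-update idea, with made-up data.

```python
# Discounted Beta-Bernoulli filter: each binary observation updates the
# Beta pseudo-counts, and the "forget" factor down-weights old data so
# the estimate can track a drifting suppression level.

def bsp_filter(binary_eeg, a=1.0, b=1.0, forget=0.98):
    estimates = []
    for s in binary_eeg:
        a, b = forget * a + s, forget * b + (1 - s)
        estimates.append(a / (a + b))   # posterior mean BSP
    return estimates

track = bsp_filter([1, 1, 1, 0, 1, 1, 1, 1])   # mostly-suppressed stream
assert track[-1] > 0.5
```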
Kang, Si-Hyuck; Park, Kyung Woo; Kang, Do-Yoon; Lim, Woo-Hyun; Park, Kyung Taek; Han, Jung-Kyu; Kang, Hyun-Jae; Koo, Bon-Kwon; Oh, Byung-Hee; Park, Young-Bae; Kandzari, David E; Cohen, David J; Hwang, Seung-Sik; Kim, Hyo-Soo
2014-05-01
The aim of this study was to compare the safety and efficacy of biodegradable-polymer (BP) drug-eluting stents (DES), bare metal stents (BMS), and durable-polymer DES in patients undergoing coronary revascularization; to do so, we performed a systematic review and network meta-analysis using a Bayesian framework. Study stents included BMS, paclitaxel-eluting (PES), sirolimus-eluting (SES), Endeavor zotarolimus-eluting (ZES-E), cobalt-chromium everolimus-eluting (CoCr-EES), platinum-chromium everolimus-eluting (PtCr-EES), Resolute zotarolimus-eluting (ZES-R), and BP biolimus-eluting stents (BP-BES). After a systematic electronic search, 113 trials with 90 584 patients were selected. The principal endpoint was definite or probable stent thrombosis (ST), defined according to the Academic Research Consortium, within 1 year. Biodegradable polymer biolimus-eluting stents [OR, 0.56; 95% credible interval (CrI), 0.33-0.90], SES (OR, 0.53; 95% CrI, 0.38-0.73), CoCr-EES (OR, 0.34; 95% CrI, 0.23-0.52), and PtCr-EES (OR, 0.31; 95% CrI, 0.10-0.90) were all superior to BMS in terms of definite or probable ST within 1 year. Cobalt-chromium everolimus-eluting stents demonstrated the lowest risk of ST of all stents at all times after stent implantation. Biodegradable polymer biolimus-eluting stents were associated with a higher risk of definite or probable ST than CoCr-EES (OR, 1.72; 95% CrI, 1.04-2.98). All DES reduced the need for repeat revascularization, and all but PES reduced the risk of myocardial infarction compared with BMS. All DES but PES and ZES-E were superior to BMS in terms of ST within 1 year. Cobalt-chromium everolimus-eluting stents were safer than any other DES, including BP-BES. Our results suggest that not only the biodegradability of the polymer, but the optimal combination of stent alloy, design, strut thickness, polymer, and drug together determines the safety of DES.
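Credible intervals for odds ratios like those above are typically obtained by exponentiating quantiles of posterior draws of the log-odds-ratio. The normal posterior below is an illustrative stand-in for MCMC output, centred (arbitrarily) near one of the reported point estimates.

```python
import math
import random

# Exponentiate the 2.5th/97.5th percentiles of log-OR draws to get a
# 95% credible interval on the OR scale.

def or_interval(log_or_mean, log_or_sd, draws=20000, seed=7):
    rng = random.Random(seed)
    samples = sorted(math.exp(rng.gauss(log_or_mean, log_or_sd))
                     for _ in range(draws))
    return samples[int(0.025 * draws)], samples[int(0.975 * draws)]

lo, hi = or_interval(math.log(0.34), 0.2)   # hypothetical posterior spread
assert lo < 0.34 < hi
assert hi < 1.0   # whole interval below 1: fewer events than the comparator
```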
Spatial analysis of MODIS aerosol optical depth, PM2.5, and chronic coronary heart disease.
Hu, Zhiyong
2009-05-12
Numerous studies have found adverse health effects of acute and chronic exposure to fine particulate matter (PM2.5). Air pollution epidemiological studies relying on ground measurements provided by monitoring networks are often limited by the sparse and unbalanced spatial distribution of the monitors. Studies have found correlations between satellite aerosol optical depth (AOD) and PM2.5 in some land regions. Satellite aerosol data may be used to extend the spatial coverage of PM2.5 exposure assessment. This study aimed to investigate the correlation between PM2.5 and AOD in the conterminous USA, to derive a spatially complete PM2.5 surface by merging satellite AOD data and ground measurements based on the potential correlation, and to examine whether there is an association of coronary heart disease with PM2.5. Daily MODIS (Moderate Resolution Imaging Spectroradiometer) Level 2 AOD images for 2003 and 2004 were collated with US EPA PM2.5 data covering the conterminous USA. Pearson's correlation analysis and geographically weighted regression (GWR) found that the relationship between PM2.5 and AOD is not spatially consistent across the conterminous states. The average correlation is 0.67 in the east and 0.22 in the west. GWR predicts well in the east and poorly in the west. The GWR model was used to derive a PM2.5 grid surface from the mean AOD raster calculated from the daily AOD data (RMSE = 1.67 microg/m3). Fitting a Bayesian hierarchical model linking PM2.5 with age-race standardized mortality rates (SMRs) of chronic coronary heart disease (CCHD) found that areas with higher values of PM2.5 also show high rates of CCHD mortality (regression coefficient = 0.802; posterior 95% Bayesian credible interval (CrI) = (0.386, 1.225)). There is spatial variation in the relationship between PM2.5 and AOD in the conterminous USA. In the eastern USA, where AOD correlates well with PM2.5, AOD can be merged with ground PM2.5 data to derive a PM2.5 surface for epidemiological study.
The study found that the chronic coronary heart disease mortality rate increases with exposure to PM2.5.
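The Bayesian machinery behind an SMR estimate can be sketched with a much-simplified, non-spatial conjugate model: with a Gamma prior on an area's relative risk and a Poisson likelihood for observed deaths, the posterior is again Gamma, and a credible interval falls out by sampling. This is only an illustration with hypothetical counts, not the paper's spatial hierarchical regression.

```python
import random

def smr_posterior_ci(observed, expected, a=1.0, b=1.0, n=50_000, seed=0):
    # Gamma(a, b) prior on the relative risk theta; Poisson likelihood
    # O ~ Poisson(theta * E) gives a Gamma(a + O, b + E) posterior.
    rng = random.Random(seed)
    draws = sorted(rng.gammavariate(a + observed, 1.0 / (b + expected))
                   for _ in range(n))
    return draws[int(0.025 * n)], draws[int(0.975 * n)]

# Hypothetical area: 30 CCHD deaths observed against 20 expected
lo, hi = smr_posterior_ci(30, 20)
print(f"posterior 95% CrI for the SMR: ({lo:.2f}, {hi:.2f})")
```

An interval lying wholly above 1 would flag the area as having credibly elevated mortality, which is the kind of statement the spatial model in the paper makes while also borrowing strength across neighbouring areas.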
Significance testing - are we ready yet to abandon its use?
The, Bertram
2011-11-01
Understanding of the damaging effects of significance testing has steadily grown. Reporting p values without dichotomizing the result as significant or not is not the solution. Confidence intervals are better, but they suffer from a non-intuitive interpretation and are often misused merely to check whether the null value lies within the interval. Bayesian statistics provide an alternative that solves most of these problems. Although criticized for relying on subjective models, the interpretation of a Bayesian posterior probability is more intuitive than that of a p value, and seems closest to intuitive patterns of human decision making. Another alternative is to use confidence interval functions (or p value functions) to display a continuum of intervals at different levels of confidence around a point estimate. Thus, better alternatives to significance testing exist. The reluctance to abandon the practice might reflect both a preference for old habits and unfamiliarity with better methods. Authors might question whether less commonly exercised, though superior, techniques will be well received by editors, reviewers and the readership. A joint effort will be needed to abandon significance testing in clinical research in the future.
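The p value function mentioned above can be sketched in a few lines: rather than one p value for a single null, it traces the two-sided p value across a range of hypothesized parameter values. The summary numbers here are hypothetical, and a normal approximation is assumed; note that the values of mu0 with p > 0.05 are exactly those inside the usual 95% confidence interval.

```python
import math

def p_value_function(xbar, se, mu0):
    # Two-sided p value for H0: mu = mu0 under a normal approximation
    z = abs(xbar - mu0) / se
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

# Hypothetical study result: point estimate 2.0, standard error 0.8
xbar, se = 2.0, 0.8
for mu0 in [0.0, 0.43, 2.0, 3.57]:
    print(f"mu0={mu0:5.2f}  p={p_value_function(xbar, se, mu0):.3f}")
```

Plotting this function gives the "continuum of intervals" the abstract describes: the curve peaks at p = 1 at the point estimate and falls off symmetrically, so any confidence level can be read from it.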
Herman, Benjamin R; Gross, Barry; Moshary, Fred; Ahmed, Samir
2008-04-01
We investigate the assessment of uncertainty in the inference of aerosol size distributions from backscatter and extinction measurements that can be obtained from a modern elastic/Raman lidar system with a Nd:YAG laser transmitter. To calculate the uncertainty, an analytic formula for the correlated probability density function (PDF) describing the error for an optical coefficient ratio is derived based on a normally distributed fractional error in the optical coefficients. Assuming a monomodal lognormal particle size distribution of spherical, homogeneous particles with a known index of refraction, we compare the assessment of uncertainty using a more conventional forward Monte Carlo method with that obtained from a Bayesian posterior PDF assuming a uniform prior PDF, and show that substantial differences between the two methods exist. In addition, we use the posterior PDF formalism, extended to include an unknown refractive index, to find credible sets for a variety of optical measurement scenarios. We find that the uncertainty is greatly reduced by the addition of suitable extinction measurements, in contrast to the inclusion of extra backscatter coefficients, which we show to have a minimal effect; this strengthens similar observations based on numerical regularization methods.
Cholapranee, Aurada; Hazlewood, Glen S; Kaplan, Gilaad G.; Peyrin-Biroulet, Laurent; Ananthakrishnan, Ashwin N
2017-01-01
Background Mucosal healing is an important therapeutic endpoint in the management of Crohn’s disease (CD) and ulcerative colitis (UC). Limited data exist regarding the comparative efficacy of various therapies in achieving this outcome. Methods We performed a systematic review and meta-analysis of randomized controlled trials (RCT) examining mucosal healing as an endpoint of immunosuppressives, anti-tumor necrosis factor α (anti-TNF) or anti-integrin monoclonal antibody therapy for moderate-to-severe CD or UC. Pooled effect sizes for induction and maintenance of mucosal healing were calculated and pair-wise treatment comparisons evaluated using a Bayesian network meta-analysis. Results A total of 12 RCTs were included in the meta-analysis (CD – 2 induction, 4 maintenance; UC – 8 induction, 5 maintenance). Duration of follow-up was 6–12 weeks for induction and 32–54 weeks for maintenance trials. In CD, anti-TNFs were more effective than placebo for maintaining mucosal healing (28% vs. 1%, Odds ratio (OR) 19.71, 95% confidence interval (CI) 3.51 – 110.84). In UC, anti-TNFs and anti-integrins were more effective than placebo for inducing (45% vs. 30%) and maintaining mucosal healing (33% vs. 18%). In network analysis, adalimumab therapy was inferior to infliximab (OR 0.45, 95% credible interval (CrI) 0.25 – 0.82) and combination infliximab-azathioprine (OR 0.32, 95% CrI 0.12 – 0.84) for inducing mucosal healing in UC. There was no statistically significant pairwise difference between vedolizumab and anti-TNF agents in UC. Conclusion Anti-TNF and anti-integrin biologic agents are effective in inducing mucosal healing in UC with adalimumab being inferior to infliximab or combination therapy. Infliximab and adalimumab were similar in CD. PMID:28326566
The Chandra Source Catalog: X-ray Aperture Photometry
NASA Astrophysics Data System (ADS)
Kashyap, Vinay; Primini, F. A.; Glotfelty, K. J.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, I. N.; Evans, J. D.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hain, R.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.
2009-09-01
The Chandra Source Catalog (CSC) represents a reanalysis of the entire set of ACIS and HRC imaging observations over the 9-year Chandra mission. We describe here the method by which fluxes are measured for detected sources. Source detection is carried out on a uniform basis, using the CIAO tool wavdetect. Source fluxes are estimated post facto using a Bayesian method that accounts for background, spatial resolution effects, and contamination from nearby sources. We use gamma-function prior distributions, which may be either non-informative or, in cases where previous observations of the same source exist, strongly informative. The current implementation, however, is limited to non-informative priors. The resulting posterior probability density functions allow us to report the flux and a robust credible range on it.
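The core of such Poisson aperture photometry can be sketched with a grid posterior: given observed counts, a known expected background, and a flat (limiting gamma) prior on the source intensity, normalize the likelihood over candidate intensities and read off a central credible range from the cumulative posterior. This is a stripped-down illustration with hypothetical counts; the catalog pipeline additionally models PSF aperture fractions and contamination from neighbouring sources.

```python
import math

def posterior_source_intensity(n_counts, background, s_grid):
    # Poisson likelihood with known background, flat prior on s >= 0;
    # normalized over the grid of candidate source intensities.
    logs = [n_counts * math.log(s + background) - (s + background)
            for s in s_grid]
    m = max(logs)
    w = [math.exp(l - m) for l in logs]
    z = sum(w)
    return [x / z for x in w]

# Hypothetical aperture: 15 counts observed, 4.0 expected from background
grid = [i * 0.01 for i in range(4001)]   # s in [0, 40]
post = posterior_source_intensity(15, 4.0, grid)
# 68% central credible range from the cumulative posterior
cum, lo, hi = 0.0, None, None
for s, p in zip(grid, post):
    cum += p
    if lo is None and cum >= 0.16:
        lo = s
    if hi is None and cum >= 0.84:
        hi = s
print(f"68% credible range for source counts: ({lo:.1f}, {hi:.1f})")
```

Because the posterior is a full density rather than a point estimate with symmetric errors, the reported range stays sensible even in the low-count regime where Gaussian error bars fail.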
Search for a Higgs boson decaying to two W bosons at CDF.
Aaltonen, T; Adelman, J; Akimoto, T; Albrow, M G; Alvarez González, B; Amerio, S; Amidei, D; Anastassov, A; Annovi, A; Antos, J; Apollinari, G; Apresyan, A; Arisawa, T; Artikov, A; Ashmanskas, W; Attal, A; Aurisano, A; Azfar, F; Azzurri, P; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Bartsch, V; Bauer, G; Beauchemin, P-H; Bedeschi, F; Beecher, D; Behari, S; Bellettini, G; Bellinger, J; Benjamin, D; Beretvas, A; Beringer, J; Bhatti, A; Binkley, M; Bisello, D; Bizjak, I; Blair, R E; Blocker, C; Blumenfeld, B; Bocci, A; Bodek, A; Boisvert, V; Bolla, G; Bortoletto, D; Boudreau, J; Boveia, A; Brau, B; Bridgeman, A; Brigliadori, L; Bromberg, C; Brubaker, E; Budagov, J; Budd, H S; Budd, S; Burke, S; Burkett, K; Busetto, G; Bussey, P; Buzatu, A; Byrum, K L; Cabrera, S; Calancha, C; Campanelli, M; Campbell, M; Canelli, F; Canepa, A; Carls, B; Carlsmith, D; Carosi, R; Carrillo, S; Carron, S; Casal, B; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavaliere, V; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chang, S H; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, K; Chokheli, D; Chou, J P; Choudalakis, G; Chuang, S H; Chung, K; Chung, W H; Chung, Y S; Chwalek, T; Ciobanu, C I; Ciocci, M A; Clark, A; Clark, D; Compostella, G; Convery, M E; Conway, J; Cordelli, M; Cortiana, G; Cox, C A; Cox, D J; Crescioli, F; Cuenca Almenar, C; Cuevas, J; Culbertson, R; Cully, J C; Dagenhart, D; Datta, M; Davies, T; de Barbaro, P; De Cecco, S; Deisher, A; De Lorenzo, G; Dell'orso, M; Deluca, C; Demortier, L; Deng, J; Deninno, M; Derwent, P F; di Giovanni, G P; Dionisi, C; Di Ruzza, B; Dittmann, J R; D'Onofrio, M; Donati, S; Dong, P; Donini, J; Dorigo, T; Dube, S; Efron, J; Elagin, A; Erbacher, R; Errede, D; Errede, S; Eusebi, R; Fang, H C; Farrington, S; Fedorko, W T; Feild, R G; Feindt, M; Fernandez, J P; Ferrazza, C; Field, R; Flanagan, G; Forrest, R; Frank, M J; Franklin, M; Freeman, J C; Furic, I; Gallinaro, M; Galyardt, J; Garberson, F; Garcia, J E; 
Garfinkel, A F; Genser, K; Gerberich, H; Gerdes, D; Gessler, A; Giagu, S; Giakoumopoulou, V; Giannetti, P; Gibson, K; Gimmell, J L; Ginsburg, C M; Giokaris, N; Giordani, M; Giromini, P; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D; Gold, M; Goldschmidt, N; Golossanov, A; Gomez, G; Gomez-Ceballos, G; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Goulianos, K; Gresele, A; Grinstein, S; Grosso-Pilcher, C; Grundler, U; Guimaraes da Costa, J; Gunay-Unalan, Z; Haber, C; Hahn, K; Hahn, S R; Halkiadakis, E; Han, B-Y; Han, J Y; Happacher, F; Hara, K; Hare, D; Hare, M; Harper, S; Harr, R F; Harris, R M; Hartz, M; Hatakeyama, K; Hays, C; Heck, M; Heijboer, A; Heinrich, J; Henderson, C; Herndon, M; Heuser, J; Hewamanage, S; Hidas, D; Hill, C S; Hirschbuehl, D; Hocker, A; Hou, S; Houlden, M; Hsu, S-C; Huffman, B T; Hughes, R E; Husemann, U; Huston, J; Incandela, J; Introzzi, G; Iori, M; Ivanov, A; James, E; Jayatilaka, B; Jeon, E J; Jha, M K; Jindariani, S; Johnson, W; Jones, M; Joo, K K; Jun, S Y; Jung, J E; Junk, T R; Kamon, T; Kar, D; Karchin, P E; Kato, Y; Kephart, R; Keung, J; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, H W; Kim, J E; Kim, M J; Kim, S B; Kim, S H; Kim, Y K; Kimura, N; Kirsch, L; Klimenko, S; Knuteson, B; Ko, B R; Kondo, K; Kong, D J; Konigsberg, J; Korytov, A; Kotwal, A V; Kreps, M; Kroll, J; Krop, D; Krumnack, N; Kruse, M; Krutelyov, V; Kubo, T; Kuhr, T; Kulkarni, N P; Kurata, M; Kusakabe, Y; Kwang, S; Laasanen, A T; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, G; Lazzizzera, I; Lecompte, T; Lee, E; Lee, H S; Lee, S W; Leone, S; Lewis, J D; Lin, C-S; Linacre, J; Lindgren, M; Lipeles, E; Lister, A; Litvintsev, D O; Liu, C; Liu, T; Lockyer, N S; Loginov, A; Loreti, M; Lovas, L; Lucchesi, D; Luci, C; Lueck, J; Lujan, P; Lukens, P; Lungu, G; Lyons, L; Lys, J; Lysak, R; Macqueen, D; Madrak, R; Maeshima, K; Makhoul, K; Maki, T; Maksimovic, P; Malde, S; Malik, S; Manca, G; Manousakis-Katsikakis, A; Margaroli, 
F; Marino, C; Marino, C P; Martin, A; Martin, V; Martínez, M; Martínez-Ballarín, R; Maruyama, T; Mastrandrea, P; Masubuchi, T; Mathis, M; Mattson, M E; Mazzanti, P; McFarland, K S; McIntyre, P; McNulty, R; Mehta, A; Mehtala, P; Menzione, A; Merkel, P; Mesropian, C; Miao, T; Miladinovic, N; Miller, R; Mills, C; Milnik, M; Mitra, A; Mitselmakher, G; Miyake, H; Moggi, N; Moon, C S; Moore, R; Morello, M J; Morlok, J; Movilla Fernandez, P; Mülmenstädt, J; Mukherjee, A; Muller, Th; Mumford, R; Murat, P; Mussini, M; Nachtman, J; Nagai, Y; Nagano, A; Naganoma, J; Nakamura, K; Nakano, I; Napier, A; Necula, V; Nett, J; Neu, C; Neubauer, M S; Neubauer, S; Nielsen, J; Nodulman, L; Norman, M; Norniella, O; Nurse, E; Oakes, L; Oh, S H; Oh, Y D; Oksuzian, I; Okusawa, T; Orava, R; Griso, S Pagan; Palencia, E; Papadimitriou, V; Papaikonomou, A; Paramonov, A A; Parks, B; Pashapour, S; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Peiffer, T; Pellett, D E; Penzo, A; Phillips, T J; Piacentino, G; Pianori, E; Pinera, L; Pitts, K; Plager, C; Pondrom, L; Poukhov, O; Pounder, N; Prakoshyn, F; Pronko, A; Proudfoot, J; Ptohos, F; Pueschel, E; Punzi, G; Pursley, J; Rademacker, J; Rahaman, A; Ramakrishnan, V; Ranjan, N; Redondo, I; Rekovic, V; Renton, P; Renz, M; Rescigno, M; Richter, S; Rimondi, F; Ristori, L; Robson, A; Rodrigo, T; Rodriguez, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rossin, R; Roy, P; Ruiz, A; Russ, J; Rusu, V; Safonov, A; Sakumoto, W K; Saltó, O; Santi, L; Sarkar, S; Sartori, L; Sato, K; Savoy-Navarro, A; Schlabach, P; Schmidt, A; Schmidt, E E; Schmidt, M A; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scribano, A; Scuri, F; Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Sexton-Kennedy, L; Sforza, F; Sfyrla, A; Shalhout, S Z; Shears, T; Shepard, P F; Shimojima, M; Shiraishi, S; Shochet, M; Shon, Y; Shreyber, I; Sidoti, A; Sinervo, P; Sisakyan, A; Slaughter, A J; Slaunwhite, J; Sliwa, K; Smith, J R; Snider, F D; Snihur, R; Soha, A; Somalwar, S; Sorin, V; Spalding, J; 
Spreitzer, T; Squillacioti, P; Stanitzki, M; St Denis, R; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Strycker, G L; Stuart, D; Suh, J S; Sukhanov, A; Suslov, I; Suzuki, T; Taffard, A; Takashima, R; Takeuchi, Y; Tanaka, R; Tecchio, M; Teng, P K; Terashi, K; Thom, J; Thompson, A S; Thompson, G A; Thomson, E; Tipton, P; Ttito-Guzmán, P; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Torre, S; Torretta, D; Totaro, P; Tourneur, S; Trovato, M; Tsai, S-Y; Tu, Y; Turini, N; Ukegawa, F; Vallecorsa, S; van Remortel, N; Varganov, A; Vataga, E; Vázquez, F; Velev, G; Vellidis, C; Veszpremi, V; Vidal, M; Vidal, R; Vila, I; Vilar, R; Vine, T; Vogel, M; Volobouev, I; Volpi, G; Wagner, P; Wagner, R G; Wagner, R L; Wagner, W; Wagner-Kuhr, J; Wakisaka, T; Wallny, R; Wang, S M; Warburton, A; Waters, D; Weinberger, M; Weinelt, J; Wester Iii, W C; Whitehouse, B; Whiteson, D; Wicklund, A B; Wicklund, E; Wilbur, S; Williams, G; Williams, H H; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; Wolfe, C; Wright, T; Wu, X; Würthwein, F; Wynne, S M; Xie, S; Yagil, A; Yamamoto, K; Yamaoka, J; Yang, U K; Yang, Y C; Yao, W M; Yeh, G P; Yoh, J; Yorita, K; Yoshida, T; Yu, G B; Yu, I; Yu, S S; Yun, J C; Zanello, L; Zanetti, A; Zhang, X; Zheng, Y; Zucchelli, S
2009-01-16
We present a search for a Higgs boson decaying to two W bosons in pp̄ collisions at a center-of-mass energy of sqrt(s) = 1.96 TeV. The data sample corresponds to an integrated luminosity of 3.0 fb^-1 collected with the CDF II detector. We find no evidence for production of a Higgs boson with mass between 110 and 200 GeV/c^2, and determine upper limits on the production cross section. For a mass of 160 GeV/c^2, where the analysis is most sensitive, the observed (expected) limit is 0.7 pb (0.9 pb) at the 95% Bayesian credibility level, which is 1.7 (2.2) times the standard model cross section.
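A Bayesian upper limit of the kind quoted above can be illustrated with a textbook counting-experiment sketch: a flat prior on the signal rate s >= 0, a Poisson likelihood for observed counts on top of a known background, and the credibility-level quantile of the posterior found by bisection. The numbers below are made up; the actual CDF analysis uses a far richer likelihood over kinematic distributions.

```python
import math

def bayes_upper_limit(n_obs, background=0.0, cl=0.95):
    # Flat prior on signal s >= 0; likelihood Poisson(n_obs; s + background).
    # Posterior CDF via the gamma/Poisson-sum identity, normalized to the
    # s >= 0 region, solved for the credibility-level quantile by bisection.
    def poisson_cdf(n, mu):
        return sum(math.exp(-mu) * mu**k / math.factorial(k)
                   for k in range(n + 1))
    norm = poisson_cdf(n_obs, background)      # posterior mass with s >= 0
    def cdf(s):
        return 1.0 - poisson_cdf(n_obs, s + background) / norm
    lo, hi = 0.0, 10.0 * (n_obs + 1) + 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if cdf(mid) < cl:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"{bayes_upper_limit(0):.2f}")                    # → 3.00
print(f"{bayes_upper_limit(3, background=1.2):.2f}")
```

The first case reproduces the classic result that zero observed events with no background give a 95% upper limit of about 3 expected signal events.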
Numerical study on the sequential Bayesian approach for radioactive materials detection
NASA Astrophysics Data System (ADS)
Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng
2013-01-01
A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for research in radioactive detection. Compared with commonly adopted detection methods based on statistical theory, the sequential Bayesian approach offers shorter verification times when analysing spectra that contain low total counts, especially with complex radionuclide components. In this paper, a simulation experiment platform implementing the sequential Bayesian approach was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event-sequence generator based on Monte Carlo sampling theory to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, represented respectively by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
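The sequential idea can be sketched in a toy two-hypothesis form: each photon inter-arrival time updates the posterior probability that a source is present, via the ratio of exponential inter-arrival likelihoods under "source plus background" versus "background only" rates. All rates and data below are invented, and Candy's full formulation tracks energy channels as well as timing.

```python
import math
import random

def sequential_posterior(intervals, rate_bkg, rate_src, prior=0.5):
    # Each inter-arrival time dt updates the odds for "source present"
    # (total rate rate_src) vs "background only" via exponential likelihoods.
    post = prior
    history = []
    for dt in intervals:
        l1 = rate_src * math.exp(-rate_src * dt)
        l0 = rate_bkg * math.exp(-rate_bkg * dt)
        odds = (post / (1 - post)) * (l1 / l0)
        post = odds / (1 + odds)
        history.append(post)
    return history

rng = random.Random(7)
true_rate = 5.0    # counts/s with the source present; background alone is 2.0
intervals = [rng.expovariate(true_rate) for _ in range(40)]
post = sequential_posterior(intervals, rate_bkg=2.0, rate_src=5.0)
print(f"P(source) after {len(post)} events: {post[-1]:.3f}")
```

Because every event updates the decision immediately, a confident call can arrive long before a full spectrum has accumulated, which is the verification-time advantage the abstract describes.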
Satellite altimetry based rating curves throughout the entire Amazon basin
NASA Astrophysics Data System (ADS)
Paris, A.; Calmant, S.; Paiva, R. C.; Collischonn, W.; Silva, J. S.; Bonnet, M.; Seyler, F.
2013-05-01
The Amazon basin is the largest hydrological basin in the world. In recent years, the basin has experienced an unusual succession of extreme droughts and floods, whose origin is still a matter of debate. Yet the available data are sparse, over both time and space, owing to factors such as the basin's size and difficulty of access. One of the major obstacles is obtaining discharge series distributed over the entire basin. Satellite altimetry can be used to improve our knowledge of hydrological stream flow conditions in the basin through rating curves. Rating curves are mathematical relationships between stage and discharge at a given place. The common way to determine the parameters of the relationship is to compute a non-linear regression between the discharge and stage series. In this study, the discharge data were obtained by simulation across the entire basin using the MGB-IPH model with TRMM Merge input rainfall data and assimilation of gauge data, run from 1998 to 2010. The stage dataset consists of ~800 altimetry series at ENVISAT and JASON-2 virtual stations, spanning 2002 to 2010. In the present work we present the benefits of using stochastic methods instead of deterministic ones to determine a dataset of rating curve parameters that is consistent throughout the entire Amazon basin. The rating curve parameters were computed using a parameter optimization technique based on a Markov chain Monte Carlo sampler and a Bayesian inference scheme. This technique provides an estimate of the best parameters for the rating curve, together with their posterior probability distribution, allowing the determination of a credibility interval for the rating curve. The rating curve determination also accounts for the error in the discharge estimates from the MGB-IPH model; these errors come from either errors in the discharge derived from the gauge readings or errors in the satellite rainfall estimates.
The present experiment shows that the stochastic approach is more efficient than the deterministic one. Using user-defined prior credible intervals for the parameters, the method provides the best rating curve estimate without unlikely parameter values, and all sites achieved convergence before reaching the maximum number of model evaluations. Results were assessed through the Nash-Sutcliffe efficiency coefficient, applied both to discharges and to logarithms of discharges. Most of the virtual stations had good or very good results, with Ens values ranging from 0.7 to 0.98. However, worse results were found at a few virtual stations, revealing the need to investigate segmenting the rating curve, depending on the stage or on the rising or recession limb, as well as possible errors in the altimetry series.
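The MCMC rating-curve estimation described above can be sketched with a minimal random-walk Metropolis sampler on the standard power-law rating curve Q = a(h - h0)^b, fitted to synthetic stage-discharge pairs with lognormal noise. This is an illustration only: the priors, noise level, and data are invented, and the study's setup additionally propagates MGB-IPH discharge errors.

```python
import math
import random

rng = random.Random(3)

# Synthetic stage-discharge pairs from a known rating curve Q = a*(h-h0)^b
a_true, b_true, h0_true = 10.0, 1.8, 0.5
stages = [1.0 + 0.25 * i for i in range(20)]
flows = [a_true * (h - h0_true) ** b_true * math.exp(rng.gauss(0, 0.05))
         for h in stages]

def log_post(a, b, h0):
    # Flat priors on plausible ranges; lognormal observation error (sd 0.05)
    if not (0 < a < 100 and 0.5 < b < 3 and 0 <= h0 < min(stages)):
        return -math.inf
    sse = sum((math.log(q) - math.log(a * (h - h0) ** b)) ** 2
              for h, q in zip(stages, flows))
    return -sse / (2 * 0.05 ** 2)

# Random-walk Metropolis over (a, b, h0)
cur = (8.0, 1.5, 0.3)
cur_lp = log_post(*cur)
samples = []
for i in range(20_000):
    prop = (cur[0] + rng.gauss(0, 0.3),
            cur[1] + rng.gauss(0, 0.03),
            cur[2] + rng.gauss(0, 0.02))
    lp = log_post(*prop)
    if lp - cur_lp > math.log(rng.random()):
        cur, cur_lp = prop, lp
    if i > 5_000:                       # discard burn-in
        samples.append(cur)

bs = sorted(s[1] for s in samples)
n = len(bs)
print(f"b: median {bs[n//2]:.2f}, "
      f"95% CrI ({bs[int(0.025*n)]:.2f}, {bs[int(0.975*n)]:.2f})")
```

The posterior draws give exactly what the abstract highlights: not just a best-fit exponent but a credibility interval around it, so unlikely parameter combinations are excluded automatically.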
An empirical Bayesian and Buhlmann approach with non-homogenous Poisson process
NASA Astrophysics Data System (ADS)
Noviyanti, Lienda
2015-12-01
All general insurance companies in Indonesia have to adjust their current premium rates to the maximum and minimum limit rates in the new regulation established by the Financial Services Authority (Otoritas Jasa Keuangan / OJK). In this research, we estimated premium rates by means of the Bayesian and the Buhlmann approaches, using historical claim frequency and claim severity across five risk groups. We assumed Poisson-distributed claim frequencies and Normally distributed claim severities. In particular, we used a non-homogeneous Poisson process to estimate the parameters of the claim frequency. We found that the estimated premium rates are higher than the actual current rates. Relative to the OJK upper and lower limit rates, the estimates vary across the five risk groups; some fall within the interval and some fall outside it.
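The classic Buhlmann credibility calculation behind such premium estimates can be sketched as follows: each group's premium is a blend of its own mean claim experience and the collective mean, weighted by a credibility factor Z = n / (n + k). The claim data here are hypothetical with three groups for brevity, and the variance-of-hypothetical-means estimate is assumed positive (in practice it is floored at zero).

```python
claims = {  # hypothetical yearly aggregate claims per risk group
    "A": [12, 15, 11, 14],
    "B": [30, 28, 35, 33],
    "C": [20, 22, 19, 21],
}

def mean(xs):
    return sum(xs) / len(xs)

def sample_var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

n = len(next(iter(claims.values())))          # years of data per group
group_means = {g: mean(xs) for g, xs in claims.items()}
collective = mean(list(group_means.values()))
# Expected process variance (within groups) and variance of hypothetical means
epv = mean([sample_var(xs) for xs in claims.values()])
vhm = sample_var(list(group_means.values())) - epv / n
z = n / (n + epv / vhm)                       # credibility weight
premiums = {g: z * m + (1 - z) * collective for g, m in group_means.items()}
for g, p in sorted(premiums.items()):
    print(f"group {g}: credibility premium {p:.2f}")
```

Each premium lands between the group's own mean and the collective mean; with strongly separated groups, as here, Z is close to 1 and the group's own experience dominates.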
Search for the decays B_(s)^0 → e^+ μ^- and B_(s)^0 → e^+ e^- in CDF Run II.
Aaltonen, T; Adelman, J; Akimoto, T; Alvarez González, B; Amerio, S; Amidei, D; Anastassov, A; Annovi, A; Antos, J; Apollinari, G; Apresyan, A; Arisawa, T; Artikov, A; Ashmanskas, W; Attal, A; Aurisano, A; Azfar, F; Azzurri, P; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Bartsch, V; Bauer, G; Beauchemin, P-H; Bedeschi, F; Beecher, D; Behari, S; Bellettini, G; Bellinger, J; Benjamin, D; Beretvas, A; Beringer, J; Bhatti, A; Binkley, M; Bisello, D; Bizjak, I; Blair, R E; Blocker, C; Blumenfeld, B; Bocci, A; Bodek, A; Boisvert, V; Bolla, G; Bortoletto, D; Boudreau, J; Boveia, A; Brau, B; Bridgeman, A; Brigliadori, L; Bromberg, C; Brubaker, E; Budagov, J; Budd, H S; Budd, S; Burke, S; Burkett, K; Busetto, G; Bussey, P; Buzatu, A; Byrum, K L; Cabrera, S; Calancha, C; Campanelli, M; Campbell, M; Canelli, F; Canepa, A; Carls, B; Carlsmith, D; Carosi, R; Carrillo, S; Carron, S; Casal, B; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavaliere, V; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chang, S H; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, K; Chokheli, D; Chou, J P; Choudalakis, G; Chuang, S H; Chung, K; Chung, W H; Chung, Y S; Chwalek, T; Ciobanu, C I; Ciocci, M A; Clark, A; Clark, D; Compostella, G; Convery, M E; Conway, J; Cordelli, M; Cortiana, G; Cox, C A; Cox, D J; Crescioli, F; Cuenca Almenar, C; Cuevas, J; Culbertson, R; Cully, J C; Dagenhart, D; Datta, M; Davies, T; de Barbaro, P; De Cecco, S; Deisher, A; De Lorenzo, G; Dell'orso, M; Deluca, C; Demortier, L; Deng, J; Deninno, M; Derwent, P F; di Giovanni, G P; Dionisi, C; Di Ruzza, B; Dittmann, J R; D'Onofrio, M; Donati, S; Dong, P; Donini, J; Dorigo, T; Dube, S; Efron, J; Elagin, A; Erbacher, R; Errede, D; Errede, S; Eusebi, R; Fang, H C; Farrington, S; Fedorko, W T; Feild, R G; Feindt, M; Fernandez, J P; Ferrazza, C; Field, R; Flanagan, G; Forrest, R; Frank, M J; Franklin, M; Freeman, J C; Furic, I; Gallinaro, M; Galyardt, J; Garberson, F; Garcia, J E; Garfinkel, A F; 
Genser, K; Gerberich, H; Gerdes, D; Gessler, A; Giagu, S; Giakoumopoulou, V; Giannetti, P; Gibson, K; Gimmell, J L; Ginsburg, C M; Giokaris, N; Giordani, M; Giromini, P; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D; Gold, M; Goldschmidt, N; Golossanov, A; Gomez, G; Gomez-Ceballos, G; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Goulianos, K; Gresele, A; Grinstein, S; Grosso-Pilcher, C; Grundler, U; Guimaraes da Costa, J; Gunay-Unalan, Z; Haber, C; Hahn, K; Hahn, S R; Halkiadakis, E; Han, B-Y; Han, J Y; Happacher, F; Hara, K; Hare, D; Hare, M; Harper, S; Harr, R F; Harris, R M; Hartz, M; Hatakeyama, K; Hays, C; Heck, M; Heijboer, A; Heinrich, J; Henderson, C; Herndon, M; Heuser, J; Hewamanage, S; Hidas, D; Hill, C S; Hirschbuehl, D; Hocker, A; Hou, S; Houlden, M; Hsu, S-C; Huffman, B T; Hughes, R E; Husemann, U; Hussein, M; Huston, J; Incandela, J; Introzzi, G; Iori, M; Ivanov, A; James, E; Jang, D; Jayatilaka, B; Jeon, E J; Jha, M K; Jindariani, S; Johnson, W; Jones, M; Joo, K K; Jun, S Y; Jung, J E; Junk, T R; Kamon, T; Kar, D; Karchin, P E; Kato, Y; Kephart, R; Keung, J; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, H W; Kim, J E; Kim, M J; Kim, S B; Kim, S H; Kim, Y K; Kimura, N; Kirsch, L; Klimenko, S; Knuteson, B; Ko, B R; Kondo, K; Kong, D J; Konigsberg, J; Korytov, A; Kotwal, A V; Kreps, M; Kroll, J; Krop, D; Krumnack, N; Kruse, M; Krutelyov, V; Kubo, T; Kuhr, T; Kulkarni, N P; Kurata, M; Kwang, S; Laasanen, A T; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, G; Lazzizzera, I; Lecompte, T; Lee, E; Lee, H S; Lee, S W; Leone, S; Lewis, J D; Lin, C-S; Linacre, J; Lindgren, M; Lipeles, E; Lister, A; Litvintsev, D O; Liu, C; Liu, T; Lockyer, N S; Loginov, A; Loreti, M; Lovas, L; Lucchesi, D; Luci, C; Lueck, J; Lujan, P; Lukens, P; Lungu, G; Lyons, L; Lys, J; Lysak, R; Macqueen, D; Madrak, R; Maeshima, K; Makhoul, K; Maki, T; Maksimovic, P; Malde, S; Malik, S; Manca, G; Manousakis-Katsikakis, A; Margaroli, F; 
Marino, C; Marino, C P; Martin, A; Martin, V; Martínez, M; Martínez-Ballarín, R; Maruyama, T; Mastrandrea, P; Masubuchi, T; Mathis, M; Mattson, M E; Mazzanti, P; McFarland, K S; McIntyre, P; McNulty, R; Mehta, A; Mehtala, P; Menzione, A; Merkel, P; Mesropian, C; Miao, T; Miladinovic, N; Miller, R; Mills, C; Milnik, M; Mitra, A; Mitselmakher, G; Miyake, H; Moggi, N; Moon, C S; Moore, R; Morello, M J; Morlock, J; Movilla Fernandez, P; Mülmenstädt, J; Mukherjee, A; Muller, Th; Mumford, R; Murat, P; Mussini, M; Nachtman, J; Nagai, Y; Nagano, A; Naganoma, J; Nakamura, K; Nakano, I; Napier, A; Necula, V; Nett, J; Neu, C; Neubauer, M S; Neubauer, S; Nielsen, J; Nodulman, L; Norman, M; Norniella, O; Nurse, E; Oakes, L; Oh, S H; Oh, Y D; Oksuzian, I; Okusawa, T; Orava, R; Osterberg, K; Griso, S Pagan; Palencia, E; Papadimitriou, V; Papaikonomou, A; Paramonov, A A; Parks, B; Pashapour, S; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Peiffer, T; Pellett, D E; Penzo, A; Phillips, T J; Piacentino, G; Pianori, E; Pinera, L; Pitts, K; Plager, C; Pondrom, L; Poukhov, O; Pounder, N; Prakoshyn, F; Pronko, A; Proudfoot, J; Ptohos, F; Pueschel, E; Punzi, G; Pursley, J; Rademacker, J; Rahaman, A; Ramakrishnan, V; Ranjan, N; Redondo, I; Renton, P; Renz, M; Rescigno, M; Richter, S; Rimondi, F; Ristori, L; Robson, A; Rodrigo, T; Rodriguez, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rossin, R; Roy, P; Ruiz, A; Russ, J; Rusu, V; Rutherford, B; Saarikko, H; Safonov, A; Sakumoto, W K; Saltó, O; Santi, L; Sarkar, S; Sartori, L; Sato, K; Savoy-Navarro, A; Schlabach, P; Schmidt, A; Schmidt, E E; Schmidt, M A; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scribano, A; Scuri, F; Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Sexton-Kennedy, L; Sforza, F; Sfyrla, A; Shalhout, S Z; Shears, T; Shepard, P F; Shimojima, M; Shiraishi, S; Shochet, M; Shon, Y; Shreyber, I; Sidoti, A; Sinervo, P; Sisakyan, A; Slaughter, A J; Slaunwhite, J; Sliwa, K; Smith, J R; Snider, F D; Snihur, R; Soha, A; 
Somalwar, S; Sorin, V; Spalding, J; Spreitzer, T; Squillacioti, P; Stanitzki, M; St Denis, R; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Strycker, G L; Stuart, D; Suh, J S; Sukhanov, A; Suslov, I; Suzuki, T; Taffard, A; Takashima, R; Takeuchi, Y; Tanaka, R; Tecchio, M; Teng, P K; Terashi, K; Thom, J; Thompson, A S; Thompson, G A; Thomson, E; Tipton, P; Ttito-Guzmán, P; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Torre, S; Torretta, D; Totaro, P; Tourneur, S; Trovato, M; Tsai, S-Y; Tu, Y; Turini, N; Ukegawa, F; Vallecorsa, S; van Remortel, N; Varganov, A; Vataga, E; Vázquez, F; Velev, G; Vellidis, C; Vidal, M; Vidal, R; Vila, I; Vilar, R; Vine, T; Vogel, M; Volobouev, I; Volpi, G; Wagner, P; Wagner, R G; Wagner, R L; Wagner, W; Wagner-Kuhr, J; Wakisaka, T; Wallny, R; Wang, S M; Warburton, A; Waters, D; Weinberger, M; Weinelt, J; Wenzel, H; Wester, W C; Whitehouse, B; Whiteson, D; Wicklund, A B; Wicklund, E; Wilbur, S; Williams, G; Williams, H H; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; Wolfe, C; Wright, T; Wu, X; Würthwein, F; Xie, S; Yagil, A; Yamamoto, K; Yamaoka, J; Yang, U K; Yang, Y C; Yao, W M; Yeh, G P; Yoh, J; Yorita, K; Yoshida, T; Yu, G B; Yu, I; Yu, S S; Yun, J C; Zanello, L; Zanetti, A; Zhang, X; Zheng, Y; Zucchelli, S
2009-05-22
We report results from a search for the lepton flavor violating decays B_s^0 → e^+ μ^- and B^0 → e^+ μ^-, and the flavor-changing neutral-current decays B_s^0 → e^+ e^- and B^0 → e^+ e^-. The analysis uses data corresponding to 2 fb^-1 of integrated luminosity of pp̄ collisions at sqrt(s) = 1.96 TeV collected with the upgraded Collider Detector (CDF II) at the Fermilab Tevatron. The observed number of B^0 and B_s^0 candidates is consistent with background expectations. The resulting Bayesian upper limits on the branching ratios at 90% credibility level are B(B_s^0 → e^+ μ^-) < 2.0 x 10^-7, B(B^0 → e^+ μ^-) < 6.4 x 10^-8, B(B_s^0 → e^+ e^-) < 2.8 x 10^-7, and B(B^0 → e^+ e^-) < 8.3 x 10^-8. From the limits on B(B_(s)^0 → e^+ μ^-), the following lower bounds on the Pati-Salam leptoquark masses are also derived: M_LQ(B_s^0 → e^+ μ^-) > 47.8 TeV/c^2 and M_LQ(B^0 → e^+ μ^-) > 59.3 TeV/c^2, at 90% credibility level.
Empirical Bayes estimation of proportions with application to cowbird parasitism rates
Link, W.A.; Hahn, D.C.
1996-01-01
Bayesian models provide a structure for studying collections of parameters such as are considered in the investigation of communities, ecosystems, and landscapes. This structure allows for improved estimation of individual parameters, by considering them in the context of a group of related parameters. Individual estimates are differentially adjusted toward an overall mean, with the magnitude of their adjustment based on their precision. Consequently, Bayesian estimation allows for a more credible identification of extreme values in a collection of estimates. Bayesian models regard individual parameters as values sampled from a specified probability distribution, called a prior. The requirement that the prior be known is often regarded as an unattractive feature of Bayesian analysis and may be the reason why Bayesian analyses are not frequently applied in ecological studies. Empirical Bayes methods provide an alternative approach that incorporates the structural advantages of Bayesian models while requiring a less stringent specification of prior knowledge. Rather than requiring that the prior distribution be known, empirical Bayes methods require only that it be in a certain family of distributions, indexed by hyperparameters that can be estimated from the available data. This structure is of interest per se, in addition to its value in allowing for improved estimation of individual parameters; for example, hypotheses regarding the existence of distinct subgroups in a collection of parameters can be considered under the empirical Bayes framework by allowing the hyperparameters to vary among subgroups. Though empirical Bayes methods have been applied in a variety of contexts, they have received little attention in the ecological literature. We describe the empirical Bayes approach in application to estimation of proportions, using data obtained in a community-wide study of cowbird parasitism rates for illustration. 
Since observed proportions based on small sample sizes are heavily adjusted toward the mean, extreme values among empirical Bayes estimates identify those species for which there is the greatest evidence of extreme parasitism rates. Applying a subgroup analysis to our data on cowbird parasitism rates, we conclude that parasitism rates for Neotropical Migrants as a group are no greater than those of Resident/Short-distance Migrant species in this forest community. Our data and analyses demonstrate that the parasitism rates for certain Neotropical Migrant species are remarkably low (Wood Thrush and Rose-breasted Grosbeak) while those for others are remarkably high (Ovenbird and Red-eyed Vireo).
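The shrinkage behaviour described above can be sketched with a beta-binomial empirical Bayes estimator, fitting the Beta hyperparameters by the method of moments; the species counts below are invented for illustration and assume the raw proportions are over-dispersed enough that the moment equations have a valid solution.

```python
import numpy as np

def eb_shrink(successes, trials):
    """Empirical Bayes estimates of binomial proportions.
    A Beta(a, b) prior is fitted by method of moments to the raw
    proportions, then each estimate is shrunk toward the prior mean;
    assumes the sample variance v satisfies 0 < v < m*(1 - m)."""
    p = successes / trials
    m, v = p.mean(), p.var(ddof=1)
    common = m * (1.0 - m) / v - 1.0      # a + b under moment matching
    a, b = m * common, (1.0 - m) * common
    # Posterior mean: small-sample proportions are pulled hardest to m
    return (a + successes) / (a + b + trials)

# Invented parasitism counts: (parasitized nests, total nests)
succ = np.array([1.0, 9.0, 2.0, 15.0])
n = np.array([3.0, 40.0, 25.0, 30.0])
print(np.round(eb_shrink(succ, n), 3))
```

The species observed in only 3 nests is adjusted much more strongly toward the overall mean than those with 25 to 40 nests, mirroring the differential adjustment the abstract describes.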
Bayesian B-spline mapping for dynamic quantitative traits.
Xing, Jun; Li, Jiahan; Yang, Runqing; Zhou, Xiaojing; Xu, Shizhong
2012-04-01
Owing to their ability and flexibility to describe individual gene expression at different time points, random regression (RR) analyses have become a popular procedure for the genetic analysis of dynamic traits whose phenotypes are collected over time. Specifically, when modelling the dynamic patterns of gene expression in the RR framework, B-splines have proved successful as an alternative to orthogonal polynomials. In the so-called Bayesian B-spline quantitative trait locus (QTL) mapping, B-splines are used to characterize the patterns of QTL effects and individual-specific time-dependent environmental errors over time, and the Bayesian shrinkage estimation method is employed to estimate model parameters. Extensive simulations demonstrate that (1) in terms of statistical power, Bayesian B-spline mapping outperforms interval mapping based on maximum likelihood; (2) for a dataset with a complicated growth curve simulated by B-splines, Legendre polynomial-based Bayesian mapping is not capable of identifying the designed QTLs accurately, even when higher-order Legendre polynomials are considered; and (3) for a dataset simulated using Legendre polynomials, Bayesian B-spline mapping finds the same QTLs as those identified by Legendre polynomial analysis. All simulation results support the necessity and flexibility of B-splines in Bayesian mapping of dynamic traits. The proposed method is also applied to a real dataset, where QTLs controlling the growth trajectory of stem diameters in Populus are located.
Holme, Øyvind; Bretthauer, Michael; Fretheim, Atle; Odgaard-Jensen, Jan; Hoff, Geir
2013-10-01
Colorectal cancer is the third most frequent cancer in the world. As the sojourn time for this cancer is several years and a good prognosis is associated with early stage diagnosis, screening has been implemented in a number of countries. Both screening with faecal occult blood test and flexible sigmoidoscopy have been shown to reduce mortality from colorectal cancer in randomised controlled trials. The comparative effectiveness of these tests on colorectal cancer mortality has, however, never been evaluated, and controversies exist over which test to choose. To compare the effectiveness of screening for colorectal cancer with flexible sigmoidoscopy to faecal occult blood testing. We searched MEDLINE and EMBASE (November 16, 2012), the Cochrane Central Register of Controlled Trials (CENTRAL) (2012, Issue 11) and reference lists for eligible studies. Randomised controlled trials comparing screening with flexible sigmoidoscopy or faecal occult blood testing to each other or to no screening. Only studies reporting mortality from colorectal cancer were included. Faecal occult blood testing had to be repeated (annually or biennially). Data retrieval and assessment of risk of bias were performed independently by two review authors. Standard meta-analyses using a random-effects model were conducted for flexible sigmoidoscopy and faecal occult blood testing (FOBT) separately and we calculated relative risks with 95% confidence intervals (CI). We used a Bayesian approach (a contrast-based network meta-analysis method) for indirect analyses and presented the results as posterior median relative risk with 95% credibility intervals. We assessed the quality of evidence using GRADE. We identified nine studies comprising 338,467 individuals randomised to screening and 405,919 individuals to the control groups. Five studies compared flexible sigmoidoscopy to no screening and four studies compared repetitive guaiac-based FOBT (annually and biennially) to no screening. 
We did not consider that study risk of bias reduced our confidence in our results. We did not identify any studies comparing the two screening methods directly. When compared with no screening, colorectal cancer mortality was lower with flexible sigmoidoscopy (relative risk 0.72; 95% CI 0.65 to 0.79, high quality evidence) and FOBT (relative risk 0.86; 95% CI 0.80 to 0.92, high quality evidence). In the analyses based on indirect comparison of the two screening methods, the relative risk of dying from colorectal cancer was 0.85 (95% credibility interval 0.72 to 1.01, low quality evidence) for flexible sigmoidoscopy screening compared to FOBT. No complications occurred after the FOBT test itself, but 0.03% of participants suffered a major complication after follow-up. Among more than 60,000 flexible sigmoidoscopy screening procedures and almost 6000 work-up colonoscopies, a major complication was recorded in 0.08% of participants. Adverse event data should be interpreted with caution as the reporting of adverse effects was incomplete. There is high quality evidence that both flexible sigmoidoscopy and faecal occult blood testing reduce colorectal cancer mortality when applied as screening tools. There is only low quality indirect evidence on whether either screening approach reduces colorectal cancer deaths more than the other. Major complications associated with screening require validation from studies with more complete reporting of harms.
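The indirect comparison can be approximated with a simple Monte Carlo sketch: sample log relative risks from normal approximations whose 95% CIs match the published values, then take the ratio. This is not the contrast-based network meta-analysis of the review, only an illustration of the indirect-comparison logic.

```python
import numpy as np

rng = np.random.default_rng(0)

def draws(rr, lo, hi, n=100_000):
    """Sample log-RR from a normal whose 95% CI matches (lo, hi)."""
    se = (np.log(hi) - np.log(lo)) / (2 * 1.96)
    return rng.normal(np.log(rr), se, n)

# Published RRs versus no screening
sig = draws(0.72, 0.65, 0.79)    # flexible sigmoidoscopy
fobt = draws(0.86, 0.80, 0.92)   # faecal occult blood testing
indirect = np.exp(sig - fobt)    # sigmoidoscopy relative to FOBT
print(np.round(np.percentile(indirect, [2.5, 50, 97.5]), 2))
```

The point estimate lands near the reported 0.85, but this crude approximation yields a narrower interval than the published 0.72 to 1.01 credibility interval because it ignores between-study heterogeneity that the Bayesian network model propagates.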
Data free inference with processed data products
Chowdhary, K.; Najm, H. N.
2014-07-12
Here, we consider the context of probabilistic inference of model parameters given error bars or confidence intervals on model output values, when the underlying data are unavailable. We introduce a class of algorithms in a Bayesian framework, relying on maximum entropy arguments and approximate Bayesian computation methods, to generate data consistent with the given summary statistics. Once we obtain consistent data sets, we pool the respective posteriors to arrive at a single, averaged density on the parameters. This approach allows us to perform accurate forward uncertainty propagation consistent with the reported statistics.
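The generate-consistent-data-then-pool idea can be illustrated with a toy sketch. All numbers below are invented, and exact moment matching stands in for the maximum-entropy and approximate Bayesian computation machinery of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Reported summary only: a sample mean and its 95% CI (data unavailable).
mean_rep, ci_lo, ci_hi, n = 4.2, 3.6, 4.8, 25
se = (ci_hi - ci_lo) / (2 * 1.96)
sigma = se * np.sqrt(n)              # implied sample standard deviation

pooled = []
for _ in range(200):                 # many synthetic data sets
    # Generate a data set, then force it to match the reported stats
    x = rng.normal(mean_rep, sigma, n)
    x = (x - x.mean()) / x.std() * sigma + mean_rep
    # Conjugate posterior for the mean (known sigma, flat prior)
    post_mean, post_sd = x.mean(), sigma / np.sqrt(n)
    pooled.append(rng.normal(post_mean, post_sd, 500))

pooled = np.concatenate(pooled)      # pooled (averaged) posterior
print(np.round([pooled.mean(),
                np.percentile(pooled, 2.5),
                np.percentile(pooled, 97.5)], 2))
```

Because each synthetic data set is forced to match the summary exactly, the pooled posterior recovers the reported mean and an interval close to the reported CI; with only approximate matching the pooled density would also reflect the extra uncertainty across consistent data sets.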
Co-Prescription of QT-Interval Prolonging Drugs: An Analysis in a Large Cohort of Geriatric Patients
Schächtele, Simone; Tümena, Thomas; Gaßmann, Karl-Günter; Fromm, Martin F.; Maas, Renke
2016-01-01
Background Drug-induced QT-interval prolongation is associated with the occurrence of potentially fatal Torsades de Pointes arrhythmias (TdP). So far, data regarding the overall burden of QT-interval prolonging drugs (QT-drugs) in geriatric patients are limited. Objective This study was performed to assess the individual burden of QT-drugs in geriatric polymedicated patients and to identify the most frequent and risky combinations of QT-drugs. Methods (Co-)prescriptions of QT-drugs were investigated in the discharge medication of geriatric patients from the Geriatrics in Bavaria Database (GiB-DAT) between July 2009 and June 2013. QT-drugs were classified according to a publicly available reference site (CredibleMeds®) as ALL-QT-drugs (associated with any QT-risk) or High-risk-QT-drugs (QT-drugs with known risk of Torsades de Pointes according to CredibleMeds®), and in addition as SmPC-high-risk-QT-drugs (QT-drugs whose co-prescription with other QT-drugs is contraindicated according to the German prescribing information (SmPC)). Results Of a cohort of 130,434 geriatric patients (mean age 81 years, 67% women), prescribed a median of 8 drugs, 76,594 patients (58.7%) received at least one ALL-QT-drug. Co-prescriptions of two or more ALL-QT-drugs were observed in 28,768 (22.1%) patients. Particularly risky co-prescriptions of High-risk-QT-drugs or SmPC-high-risk-QT-drugs with at least one further QT-drug occurred in 55.9% (N = 12,633) and 54.2% (N = 12,429) of these patients, respectively. Consideration of SmPCs (SmPC-high-risk-QT-drugs) allowed the identification of an additional 15% (N = 3,999) of patients taking a risky combination that was not covered by the commonly used CredibleMeds® classification. Only 20 drug-drug combinations accounted for more than 90% of these potentially most dangerous co-prescriptions. Conclusion In a geriatric study population, co-prescriptions of two or more QT-drugs were common.
A considerable proportion of higher-risk QT-drug combinations could be detected only by using more than one classification system. Local adaptation of international classifications can improve identification of patients at risk. PMID:27192430
Shell egg handling and preparation practices in food service establishments in Finland.
Lievonen, S; Ranta, J; Maijala, R
2007-10-01
Foodborne outbreaks are often reported to be acquired at food service establishments. As a part of a quantitative risk assessment on the consumer risk of contracting Salmonella infection via shell eggs, we studied how small, medium, and large restaurants, institutional kitchens, and staff canteens (n=171) purchase, store, and use shell eggs. In addition, we estimated the fraction of raw and undercooked risky egg dishes among all egg dishes served in food service establishments of different sizes and types. The majority of establishments used shell eggs (78%), purchased eggs once per week (39%), and stored eggs at cool temperatures (82%). The size of the food service establishment had a less significant effect on shell egg preparation and handling practices than the type of the establishment. In particular, restaurants and institutional kitchens differed from each other. Restaurants purchased shell eggs more frequently, were more likely to store them at room temperature, stored shell eggs for a shorter period, and were more likely to prepare undercooked egg dishes than institutional kitchens. It was predicted that 6 to 20% of all different egg dishes prepared in a single randomly chosen food service establishment would be risky egg dishes with a 95% Bayesian credible interval of 0 to 96%, showing uncertainty because of the variability between kitchens and uncertainty in kitchen type-specific parameters. The results indicate that although most Finnish food service establishments had safe egg handling practices, a substantial minority expressed risky behavior. Compared with the egg consumption patterns in private Finnish households, however, practices in food service establishments did not prove to be more prone to risk.
Numeric score-based conditional and overall change-in-status indices for ordered categorical data.
Lyles, Robert H; Kupper, Lawrence L; Barnhart, Huiman X; Martin, Sandra L
2015-11-30
Planned interventions and/or natural conditions often effect change on an ordinal categorical outcome (e.g., symptom severity). In such scenarios, it is sometimes desirable to assign a priori scores to observed changes in status, typically giving higher weight to changes of greater magnitude. We define change indices for such data based upon a multinomial model for each row of a c × c table, where the rows represent the baseline status categories. We distinguish an index designed to assess conditional changes within each baseline category from two others designed to capture overall change. One of these overall indices measures expected change across a target population. The other is scaled to capture the proportion of total possible change in the direction indicated by the data, so that it ranges from -1 (when all subjects finish in the least favorable category) to +1 (when all finish in the most favorable category). The conditional assessment of change can be informative regardless of how subjects are sampled into the baseline categories. In contrast, the overall indices become relevant when subjects are randomly sampled at baseline from the target population of interest, or when the investigator is able to make certain assumptions about the baseline status distribution in that population. We use a Dirichlet-multinomial model to obtain Bayesian credible intervals for the conditional change index that exhibit favorable small-sample frequentist properties. Simulation studies illustrate the methods, and we apply them to examples involving changes in ordinal responses for studies of sleep deprivation and activities of daily living. Copyright © 2015 John Wiley & Sons, Ltd.
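The Dirichlet-multinomial credible interval for a conditional change index can be sketched as follows. The transition counts and change scores below are invented; the real application uses a c × c table with one such row per baseline category.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented follow-up counts for one baseline category (one table row),
# columns ordered worst -> best outcome
counts = np.array([12, 20, 8])
# A priori scores for moving from this baseline category
scores = np.array([-1.0, 0.0, 1.0])

# Dirichlet(1,...,1) prior + multinomial counts -> Dirichlet posterior
post = rng.dirichlet(counts + 1.0, size=50_000)
index = post @ scores                  # conditional change index draws
lo, hi = np.percentile(index, [2.5, 97.5])
print(round(lo, 3), round(hi, 3))
```

With these scores the index is bounded in [-1, 1] by construction, matching the scaling of the overall indices described in the abstract.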
Kinyoki, Damaris K; Berkley, James A; Moloney, Grainne M; Odundo, Elijah O; Kandala, Ngianga-Bakwin; Noor, Abdisalan M
2016-07-28
Stunting among children under five years old is associated with long-term effects on cognitive development, school achievement, economic productivity in adulthood and maternal reproductive outcomes. Accurate estimation of stunting and tools to forecast risk are key to planning interventions. We estimated the prevalence and distribution of stunting among children under five years in Somalia from 2007 to 2010 and explored the role of environmental covariates in its forecasting. Data from household nutritional surveys in Somalia from 2007 to 2010 with a total of 1,066 clusters covering 73,778 children were included. We developed a Bayesian hierarchical space-time model to forecast stunting by using the relationship between observed stunting and environmental covariates in the preceding years. We then applied the model coefficients to environmental covariates in subsequent years. To determine the accuracy of the forecasting, we compared this model with a model that used data from all the years with the corresponding environmental covariates. Rainfall (OR = 0.994, 95 % Credible interval (CrI): 0.993, 0.995) and vegetation cover (OR = 0.719, 95 % CrI: 0.603, 0.858) were significant in forecasting stunting. The difference in estimates of stunting using the two approaches was less than 3 % in all the regions for all forecast years. Stunting in Somalia is spatially and temporally heterogeneous. Rainfall and vegetation are major drivers of these variations. The use of environmental covariates for forecasting of stunting is a potentially useful and affordable tool for planning interventions to reduce the high burden of malnutrition in Somalia.
Addressing potential prior-data conflict when using informative priors in proof-of-concept studies.
Mutsvari, Timothy; Tytgat, Dominique; Walley, Rosalind
2016-01-01
Bayesian methods are increasingly used in proof-of-concept studies. An important benefit of these methods is the potential to use informative priors, thereby reducing sample size. This is particularly relevant for treatment arms where there is a substantial amount of historical information, such as placebo and active comparators. One issue with using an informative prior is the possibility of a mismatch between the informative prior and the observed data, referred to as prior-data conflict. We focus on two methods for dealing with this: a testing approach and a mixture prior approach. The testing approach assesses prior-data conflict by comparing the observed data to the prior predictive distribution and resorting to a non-informative prior if prior-data conflict is declared. The mixture prior approach uses a prior with a precise and a diffuse component. We assess these approaches for the normal case via simulation and show they have some attractive features as compared with the standard one-component informative prior. For example, when the discrepancy between the prior and the data is sufficiently marked, and intuitively, one feels less certain about the results, both the testing and mixture approaches typically yield wider posterior credible intervals than when there is no discrepancy. In contrast, when there is no discrepancy, the results of these approaches are typically similar to those of the standard approach. Whilst for any specific study the operating characteristics of any selected approach should be assessed and agreed at the design stage, we believe these two approaches are each worthy of consideration. Copyright © 2015 John Wiley & Sons, Ltd.
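The mixture prior mechanism can be sketched for a normal mean with known sampling variance (all weights, means, and standard deviations below are invented): under prior-data conflict, the posterior weight shifts to the diffuse component, which is what widens the credible interval.

```python
from math import sqrt, exp, pi

def norm_pdf(x, m, s):
    return exp(-0.5 * ((x - m) / s) ** 2) / (s * sqrt(2.0 * pi))

def mixture_posterior(ybar, se, prior):
    """Posterior for a normal mean under a mixture-of-normals prior.
    prior = [(weight, mean, sd), ...]; ybar is the observed mean with
    standard error se. Returns updated (weight, mean, sd) components."""
    post = []
    for w, m, s in prior:
        # Marginal likelihood of ybar under this prior component
        ml = w * norm_pdf(ybar, m, sqrt(s**2 + se**2))
        # Conjugate update of the component itself
        prec = 1.0 / s**2 + 1.0 / se**2
        mean = (m / s**2 + ybar / se**2) / prec
        post.append([ml, mean, sqrt(1.0 / prec)])
    z = sum(c[0] for c in post)
    return [(c[0] / z, c[1], c[2]) for c in post]

# Informative (precise) + robust (diffuse) component, invented numbers
prior = [(0.8, 0.0, 0.5), (0.2, 0.0, 4.0)]
print(mixture_posterior(ybar=3.0, se=1.0, prior=prior))   # conflict case
```

With ybar = 3.0 (far from the informative prior mean of 0) most posterior weight moves to the diffuse component; with ybar near 0 the precise component dominates and the result resembles the standard one-component informative analysis.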
Markham, Francis; Doran, Bruce; Young, Martin
2016-08-01
An emerging body of research has documented an association between problem gambling and domestic violence in a range of study populations and locations. Yet little research has analysed this relationship at ecological scales. This study investigates the proposition that gambling accessibility and the incidence of domestic violence might be linked. The association between police-recorded domestic violence and electronic gaming machine accessibility is described at the postcode level. Police-recorded family incidents per 10,000 and domestic-violence related physical assault offenses per 10,000 were used as outcome variables. Electronic gaming machine accessibility was measured as electronic gaming machines per 10,000 and gambling venues per 100,000. Bayesian spatio-temporal mixed-effects models were used to estimate the associations between gambling accessibility and domestic violence, using annual postcode-level data in Victoria, Australia between 2005 and 2014, adjusting for a range of covariates. Significant associations of policy-relevant magnitudes were found between all domestic violence and EGM accessibility variables. Postcodes with no electronic gaming machines were associated with 20% (95% credibility interval [C.I.]: 15%, 24%) fewer family incidents per 10,000 and 30% (95% C.I.: 24%, 35%) fewer domestic-violence assaults per 10,000, when compared with postcodes with 75 electronic gaming machines per 10,000. The causal relations underlying these associations are unclear. Quasi-experimental research is required to determine if reducing gambling accessibility is likely to reduce the incidence of domestic violence. Copyright © 2016 Elsevier Ltd. All rights reserved.
Gender-specific estimates of COPD prevalence: a systematic review and meta-analysis.
Ntritsos, Georgios; Franek, Jacob; Belbasis, Lazaros; Christou, Maria A; Markozannes, Georgios; Altman, Pablo; Fogel, Robert; Sayre, Tobias; Ntzani, Evangelia E; Evangelou, Evangelos
2018-01-01
COPD has been perceived as being a disease of older men. However, >7 million women are estimated to live with COPD in the USA alone. Despite a growing body of literature suggesting an increasing burden of COPD in women, the evidence is limited. To assess and synthesize the available evidence among population-based epidemiologic studies and calculate the global prevalence of COPD in men and women. A systematic review and meta-analysis reporting gender-specific prevalence of COPD was undertaken. Gender-specific prevalence estimates were abstracted from relevant studies. Associated patient characteristics as well as custom variables pertaining to the diagnostic method and other important epidemiologic covariates were also collected. A Bayesian random-effects meta-analysis was performed investigating gender-specific prevalence of COPD stratified by age, geography, calendar time, study setting, diagnostic method, and disease severity. Among 194 eligible studies, summary prevalence was 9.23% (95% credible interval [CrI]: 8.16%-10.36%) in men and 6.16% (95% CrI: 5.41%-6.95%) in women. Gender-specific prevalence varied widely across the World Health Organization Global Burden of Disease subregions, with the highest female prevalence found in North America (8.07% vs 7.30%) and in participants in urban settings (13.03% vs 8.34%). Meta-regression indicated that age ≥40 and bronchodilator testing contributed most significantly to heterogeneity of prevalence estimates across studies. We conducted the largest ever systematic review and meta-analysis of global prevalence of COPD and the first large gender-specific review. These results will increase awareness of COPD as a critical women's health issue.
Molecular evolution and phylodynamics of hepatitis B virus infection circulating in Iran.
Mozhgani, Sayed-Hamidreza; Malekpour, Seyed Amir; Norouzi, Mehdi; Ramezani, Fatemeh; Rezaee, Seyed Abdolrahim; Poortahmasebi, Vahdat; Sadeghi, Mehdi; Alavian, Seyed Moayed; Zarei-Ghobadi, Mohadeseh; Ghaziasadi, Azam; Karimzadeh, Hadi; Malekzadeh, Reza; Ziaee, Masood; Abedi, Farshid; Ataei, Behrooz; Yaran, Majid; Sayad, Babak; Jahantigh, Hamid Reza; Somi, Mohammad Hossein; Sarizadeh, Gholamreza; Sanei-Moghaddam, Ismail; Mansour-Ghanaei, Fariborz; Keyvani, Hossein; Kalantari, Ebrahim; Fakhari, Zahra; Geravand, Babak; Jazayeri, Seyed Mohammad
2018-06-01
Previous local and national Iranian publications indicate that all Iranian hepatitis B virus (HBV) strains belong to HBV genotype D. The aim of this study was to analyze the evolutionary history of HBV infection in Iran for the first time, based on an intensive phylodynamic study. The evolutionary parameters, time to most recent common ancestor (tMRCA), and the population dynamics of infections were investigated using the Bayesian Markov chain Monte Carlo (BMCMC) method. The effective sample size (ESS) and sampling convergence were then monitored. After sampling from the posterior distribution of the nucleotide substitution rate and other evolutionary parameters, the point estimations (median) of these parameters were obtained. All Iranian HBV isolates were of genotype D, sub-type ayw2. HBV in Iran appears to have emerged first on the eastern border before moving westward to Isfahan province, and from there to the south and west of the country. The tMRCA of HBV in Iran was estimated to be around 1894, with a 95% credible interval between the years 1701 and 1957. The effective number of infections increased exponentially from around 1925 to 1960. Conversely, from around 1992 onwards, the effective number of HBV infections has decreased at a very high rate. Phylodynamic inference clearly demonstrates a unique homogenous pattern of HBV genotype D compatible with a steady configuration of the decreased effective number of infections in the population in recent years, possibly due to the implementation of blood donation screening and vaccination programs. Adequate molecular epidemiology databases for HBV are crucial for infection prevention and treatment programs.
Johansson, Michael A; Vasconcelos, Pedro F C; Staples, J Erin
2014-08-01
Like many infectious agents, yellow fever (YF) virus only causes disease in a proportion of the individuals it infects, and severe illness represents only the tip of the iceberg relative to the total number of infections, which is the more critical factor for virus transmission. We compiled data on asymptomatic infections, mild disease, severe disease (fever with jaundice or hemorrhagic symptoms) and fatalities from 11 studies in Africa and South America between 1969 and 2011. We used a Bayesian model to estimate the probability of each infection outcome. For YF virus infections, the probability of being asymptomatic was 0.55 (95% credible interval [CI] 0.37-0.74), mild disease 0.33 (95% CI 0.13-0.52) and severe disease 0.12 (95% CI 0.05-0.26). The probability of death for people experiencing severe disease was 0.47 (95% CI 0.31-0.62). In outbreak situations where only severe cases may initially be detected, we estimated that there may be between one and seventy infections that are either asymptomatic or cause mild disease for every severe case identified. As it is generally only the most severe cases that are recognized and reported, these estimates will help improve the understanding of the burden of disease and the estimation of the potential risk of spread during YF outbreaks. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.
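The outcome-probability estimation can be sketched with a Dirichlet posterior over multinomial outcome counts; the counts below are invented for a single hypothetical study, whereas the paper pools 11 studies in a hierarchical model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented outcome counts: asymptomatic, mild, severe
counts = np.array([30, 18, 7])
# Flat Dirichlet(1,1,1) prior + multinomial counts -> Dirichlet posterior
post = rng.dirichlet(counts + 1.0, size=50_000)

# "Iceberg ratio": non-severe infections per severe case
ratio = (post[:, 0] + post[:, 1]) / post[:, 2]
print(np.round(np.percentile(ratio, [2.5, 50, 97.5]), 1))
```

The posterior interval for this ratio is the single-study analogue of the "one to seventy non-severe infections per severe case" range reported in the abstract.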
Trans-ethnic meta-analysis of white blood cell phenotypes
Keller, Margaux F.; Reiner, Alexander P.; Okada, Yukinori; van Rooij, Frank J.A.; Johnson, Andrew D.; Chen, Ming-Huei; Smith, Albert V.; Morris, Andrew P.; Tanaka, Toshiko; Ferrucci, Luigi; Zonderman, Alan B.; Lettre, Guillaume; Harris, Tamara; Garcia, Melissa; Bandinelli, Stefania; Qayyum, Rehan; Yanek, Lisa R.; Becker, Diane M.; Becker, Lewis C.; Kooperberg, Charles; Keating, Brendan; Reis, Jared; Tang, Hua; Boerwinkle, Eric; Kamatani, Yoichiro; Matsuda, Koichi; Kamatani, Naoyuki; Nakamura, Yusuke; Kubo, Michiaki; Liu, Simin; Dehghan, Abbas; Felix, Janine F.; Hofman, Albert; Uitterlinden, André G.; van Duijn, Cornelia M.; Franco, Oscar H.; Longo, Dan L.; Singleton, Andrew B.; Psaty, Bruce M.; Evans, Michelle K.; Cupples, L. Adrienne; Rotter, Jerome I.; O'Donnell, Christopher J.; Takahashi, Atsushi; Wilson, James G.; Ganesh, Santhi K.; Nalls, Mike A.
2014-01-01
White blood cell (WBC) count is a common clinical measure used as a predictor of certain aspects of human health, including immunity and infection status. WBC count is also a complex trait that varies among individuals and ancestry groups. Differences in linkage disequilibrium structure and heterogeneity in allelic effects are expected to play a role in the associations observed between populations. Prior genome-wide association study (GWAS) meta-analyses have identified genomic loci associated with WBC and its subtypes, but much of the heritability of these phenotypes remains unexplained. Using GWAS summary statistics for over 50 000 individuals from three diverse populations (Japanese, African-American and European ancestry), a Bayesian model methodology was employed to account for heterogeneity between ancestry groups. This approach was used to perform a trans-ethnic meta-analysis of total WBC, neutrophil and monocyte counts. Ten previously known associations were replicated and six new loci were identified, including several regions harboring genes related to inflammation and immune cell function. Ninety-five percent credible interval regions were calculated to narrow the association signals and fine-map the putatively causal variants within loci. Finally, a conditional analysis was performed on the most significant SNPs identified by the trans-ethnic meta-analysis (MA), and nine secondary signals within loci previously associated with WBC or its subtypes were identified. This work illustrates the potential of trans-ethnic analysis and ascribes a critical role to multi-ethnic cohorts and consortia in exploring complex phenotypes with respect to variants that lie outside the European-biased GWAS pool. PMID:25096241
Neighbourhood Walkability and Daily Steps in Adults with Type 2 Diabetes.
Hajna, Samantha; Ross, Nancy A; Joseph, Lawrence; Harper, Sam; Dasgupta, Kaberi
2016-01-01
There is evidence that greater neighbourhood walkability (i.e., neighbourhoods with more amenities and well-connected streets) is associated with higher levels of total walking in Europe and in Asia, but it remains unclear if this association holds in the Canadian context and in chronic disease populations. We examined the relationships of different walkability measures to biosensor-assessed total walking (i.e., steps/day) in adults with type 2 diabetes living in Montreal (QC, Canada). Participants (60.5±10.4 years; 48.1% women) were recruited through McGill University-affiliated clinics (June 2006 to May 2008). Steps/day were assessed once per season for one year with pedometers. Neighbourhood walkability was evaluated through participant reports, in-field audits, Geographic Information Systems (GIS)-derived measures, and the Walk Score®. Relationships between walkability and daily steps were estimated using Bayesian longitudinal hierarchical linear regression models (n = 131). Participants who reported living in the most compared to the least walkable neighbourhoods completed 1345 more steps/day (95% Credible Interval: 718, 1976; Quartiles 4 versus 1). Those living in the most compared to the least walkable neighbourhoods (based on GIS-derived walkability) completed 606 more steps per day (95% CrI: 8, 1203). No statistically significant associations with steps were observed for audit-assessed walkability or the Walk Score®. Adults with type 2 diabetes who perceived their neighbourhoods as more walkable accumulated more daily steps. This suggests that knowledge of local neighborhood features that enhance walking is a meaningful predictor of higher levels of walking and an important component of neighbourhood walkability.
Scleroderma prevalence: demographic variations in a population-based sample.
Bernatsky, S; Joseph, L; Pineau, C A; Belisle, P; Hudson, M; Clarke, A E
2009-03-15
To estimate the prevalence of systemic sclerosis (SSc) using population-based administrative data, and to assess the sensitivity of case ascertainment approaches. We ascertained SSc cases from Quebec physician billing and hospitalization databases (covering approximately 7.5 million individuals). Three case definition algorithms were compared, and statistical methods accounting for imperfect case ascertainment were used to estimate SSc prevalence and case ascertainment sensitivity. A hierarchical Bayesian latent class regression model that accounted for possible between-test dependence conditional on disease status estimated the effect of patient characteristics on SSc prevalence and the sensitivity of the 3 ascertainment algorithms. Accounting for error inherent in both the billing and the hospitalization data, we estimated SSc prevalence in 2003 at 74.4 cases per 100,000 women (95% credible interval [95% CrI] 69.3-79.7) and 13.3 cases per 100,000 men (95% CrI 11.1-16.1). Prevalence was higher for older individuals, particularly in urban women (161.2 cases per 100,000, 95% CrI 148.6-175.0). Prevalence was lowest in young men (in rural areas, as low as 2.8 cases per 100,000, 95% CrI 1.4-4.8). In general, no single algorithm was very sensitive, with point estimates for sensitivity ranging from 20-73%. We found marked differences in SSc prevalence according to age, sex, and region. In general, no single case ascertainment approach was very sensitive for SSc. Therefore, using data from multiple sources, with adjustment for the imperfect nature of each, is an important strategy in population-based studies of SSc and similar conditions.
Injecting drug users in Scotland, 2006: Listing, number, demography, and opiate-related death-rates.
King, Ruth; Bird, Sheila M; Overstall, Antony; Hay, Gordon; Hutchinson, Sharon J
2013-06-01
Using Bayesian capture-recapture analysis, we estimated the number of current injecting drug users (IDUs) in Scotland in 2006 from the cross-counts of 5670 IDUs listed on four data-sources: social enquiry reports (901 IDUs listed), hospital records (953), drug treatment agencies (3504), and recent Hepatitis C virus (HCV) diagnoses (827 listed as IDU-risk). Further, we accessed exact numbers of opiate-related drugs-related deaths (DRDs) in 2006 and 2007 to improve estimation of Scotland's DRD rates per 100 current IDUs. Using all four data-sources, and model-averaging of standard hierarchical log-linear models to allow for pairwise interactions between data-sources and/or demographic classifications, Scotland had an estimated 31700 IDUs in 2006 (95% credible interval: 24900-38700); but 25000 IDUs (95% CI: 20700-35000) by excluding recent HCV diagnoses whose IDU-risk can refer to past injecting. Only in the younger age-group (15-34 years) were Scotland's opiate-related DRD rates significantly lower for females than males. Older males' opiate-related DRD rate was 1.9 (1.24-2.40) per 100 current IDUs without or 1.3 (0.94-1.64) with inclusion of recent HCV diagnoses. If, indeed, Scotland had only 25000 current IDUs in 2006, with only 8200 of them aged 35+ years, the opiate-related DRD rate is higher among this older age group than has been appreciated hitherto. There is counter-balancing good news for the public health: the hitherto sharp increase in older current IDUs had stalled by 2006.
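The estimate above comes from model-averaging hierarchical log-linear models over four overlapping lists. As a much-simplified sketch of the underlying capture-recapture idea, the two-source Chapman estimator below derives a population size from two overlapping lists; the list sizes and overlap here are hypothetical placeholders, not the Scottish data.

```python
def chapman_estimate(n1, n2, m):
    """Chapman's nearly unbiased two-source capture-recapture estimator:
    n1 and n2 are the sizes of two independently compiled lists and m is
    the number of individuals appearing on both. The full study instead
    model-averages hierarchical log-linear models over four data sources."""
    if m < 0 or m > min(n1, n2):
        raise ValueError("overlap must lie between 0 and min(n1, n2)")
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical example: two lists of 900 and 3500 people sharing 600 names.
print(round(chapman_estimate(900, 3500, 600)))  # → 5248
```

The log-linear approach generalizes this by letting list memberships interact, which is why the published intervals are far wider than a naive two-list calculation would suggest.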
Gaussian Process Model for Antarctic Surface Mass Balance and Ice Core Site Selection
NASA Astrophysics Data System (ADS)
White, P. A.; Reese, S.; Christensen, W. F.; Rupper, S.
2017-12-01
Surface mass balance (SMB) is an important factor in the estimation of sea level change, and data are collected to estimate models for prediction of SMB on the Antarctic ice sheet. Using Favier et al.'s (2013) quality-controlled aggregate data set of SMB field measurements, a fully Bayesian spatial model is posed to estimate Antarctic SMB and propose new field measurement locations. Utilizing Nearest-Neighbor Gaussian process (NNGP) models, SMB is estimated over the Antarctic ice sheet. An Antarctic SMB map is rendered using this model and is compared with previous estimates. A prediction uncertainty map is created to identify regions of high SMB uncertainty. The model estimates net SMB to be 2173 Gton yr⁻¹ with 95% credible interval (2021, 2331) Gton yr⁻¹. On average, these results suggest lower Antarctic SMB and higher uncertainty than previously reported [Vaughan et al. (1999); Van de Berg et al. (2006); Arthern, Winebrenner and Vaughan (2006); Bromwich et al. (2004); Lenaerts et al. (2012)], even though this model utilizes significantly more observations than previous models. Using the Gaussian process's uncertainty and model parameters, we propose 15 new measurement locations for field study utilizing a maximin space-filling, error-minimizing design; these potential measurements are identified to minimize future estimation uncertainty. Using currently accepted Antarctic mass balance estimates and our SMB estimate, we estimate net mass loss [Shepherd et al. (2012); Jacob et al. (2012)]. Furthermore, we discuss modeling details for both space-time data and combining field measurement data with output from mathematical models using the NNGP framework.
Mair, Christina; Freisthler, Bridget; Ponicki, William R.; Gaidus, Andrew
2015-01-01
Background As an increasing number of states liberalize cannabis use and develop laws and local policies, it is essential to better understand the impacts of neighborhood ecology and marijuana dispensary density on marijuana use, abuse, and dependence. We investigated associations between marijuana abuse/dependence hospitalizations and community demographic and environmental conditions from 2001–2012 in California, as well as cross-sectional associations between local and adjacent marijuana dispensary densities and marijuana hospitalizations. Methods We analyzed panel population data relating hospitalizations coded for marijuana abuse or dependence and assigned to residential ZIP codes in California from 2001 through 2012 (20,219 space-time units) to ZIP code demographic and ecological characteristics. Bayesian space-time misalignment models were used to account for spatial variations in geographic unit definitions over time, while also accounting for spatial autocorrelation using conditional autoregressive priors. We also analyzed cross-sectional associations between marijuana abuse/dependence and the density of dispensaries in local and spatially adjacent ZIP codes in 2012. Results An additional one dispensary per square mile in a ZIP code was cross-sectionally associated with a 6.8% increase in the number of marijuana hospitalizations (95% credible interval 1.033, 1.105) with a marijuana abuse/dependence code. Other local characteristics, such as the median household income and age and racial/ethnic distributions, were associated with marijuana hospitalizations in cross-sectional and panel analyses. Conclusions Prevention and intervention programs for marijuana abuse and dependence may be particularly essential in areas of concentrated disadvantage. Policy makers may want to consider regulations that limit the density of dispensaries. PMID:26154479
Conflict in Somalia: impact on child undernutrition
Kinyoki, Damaris K; Moloney, Grainne M; Uthman, Olalekan A; Kandala, Ngianga-Bakwin; Odundo, Elijah O; Noor, Abdisalan M; Berkley, James A
2017-01-01
Introduction In Somalia, protracted conflict and drought have caused population displacement and livelihood destruction. There is also widespread childhood undernutrition. We aimed to determine the independent effects of conflict on wasting and stunting among children aged 6–59 months nationwide in Somalia. Methods Data were from household surveys during 2007–2010, including 73 778 children in 1066 clusters, the Armed Conflict Location and Event Data project database and remote sensing. We used Bayesian hierarchical spatial-temporal regression to examine the effects of conflict on wasting and stunting. Models included individual, household and environmental covariates and recent (<3 months) or longer term (3–12 months) conflict events. Results 15 355 (21%) and 22 739 (31%) observations were from wasted and stunted children, respectively. Conflict was associated with undernutrition independently of the individual, household and environmental factors, and its inclusion improved model performance. Recent conflict was associated with wasting (OR 1.37, 95% credible interval (CrI) (1.33, 1.42), attributable fraction (AF) 7.6%) and stunting (OR 1.21, 95% CrI (1.15, 1.28), AF 6.9%). Longer term conflict had greater effects on wasting (OR 1.76, 95% CrI (1.71, 1.81), AF 6.0%) and stunting (OR 1.88, 95% CrI (1.83, 1.94), AF 7.4%). After controlling for conflict, the harmful effect of internal displacement and the protective effects of rainfall and vegetation cover on undernutrition were enhanced. Conclusion Conflict and internal displacement have large effects on undernutrition in ways not fully captured by simply measuring individual, household and environmental factors or drought. PMID:28966793
Odden, Morten; Linnell, John D. C.; Odden, John
2017-01-01
Sarcoptic mange is a widely distributed disease that affects numerous mammalian species. We used camera traps to investigate the apparent prevalence and spatiotemporal dynamics of sarcoptic mange in a red fox population in southeastern Norway. We monitored red foxes for five years using 305 camera traps distributed across an 18,000 km² area. A total of 6581 fox events were examined to visually identify mange compatible lesions. We investigated factors associated with the occurrence of mange by using logistic models within a Bayesian framework, whereas the spatiotemporal dynamics of the disease were analysed with space-time scan statistics. The apparent prevalence of the disease fluctuated over the study period with a mean of 3.15% and credible interval [1.25, 6.37], and our best logistic model explaining the presence of red foxes with mange-compatible lesions included time since the beginning of the study and the interaction between distance to settlement and season as explanatory variables. The scan analyses detected several potential clusters of the disease that varied in persistence and size, and the locations in the cluster with the highest probability were closer to human settlements than the other survey locations. Our results indicate that red foxes in an advanced stage of the disease are most likely found closer to human settlements during periods of low wild prey availability (winter). We discuss different potential causes. Furthermore, the disease appears to follow a pattern of small localized outbreaks rather than sporadic isolated events. PMID:28423011
Spatial Patterns and Socioecological Drivers of Dengue Fever Transmission in Queensland, Australia
Clements, Archie; Williams, Gail; Tong, Shilu; Mengersen, Kerrie
2011-01-01
Background: Understanding how socioecological factors affect the transmission of dengue fever (DF) may help to develop an early warning system of DF. Objectives: We examined the impact of socioecological factors on the transmission of DF and assessed potential predictors of locally acquired and overseas-acquired cases of DF in Queensland, Australia. Methods: We obtained data from Queensland Health on the numbers of notified DF cases by local government area (LGA) in Queensland for the period 1 January 2002 through 31 December 2005. Data on weather and the socioeconomic index were obtained from the Australian Bureau of Meteorology and the Australian Bureau of Statistics, respectively. A Bayesian spatial conditional autoregressive model was fitted at the LGA level to quantify the relationship between DF and socioecological factors. Results: Our estimates suggest an increase in locally acquired DF of 6% [95% credible interval (CI): 2%, 11%] and 61% (95% CI: 2%, 241%) in association with a 1-mm increase in average monthly rainfall and a 1°C increase in average monthly maximum temperature between 2002 and 2005, respectively. By contrast, overseas-acquired DF cases increased by 1% (95% CI: 0%, 3%) and by 1% (95% CI: 0%, 2%) in association with a 1-mm increase in average monthly rainfall and a 1-unit increase in average socioeconomic index, respectively. Conclusions: Socioecological factors appear to influence the transmission of DF in Queensland, but the drivers of locally acquired and overseas-acquired DF may differ. DF risk is spatially clustered with different patterns for locally acquired and overseas-acquired cases. PMID:22015625
Diabetes mellitus mortality in Spanish cities: Trends and geographical inequalities.
Aguilar-Palacio, I; Martinez-Beneito, M A; Rabanaque, M J; Borrell, C; Cirera, L; Daponte, A; Domínguez-Berjón, M F; Gandarillas, A; Gotsens, M; Lorenzo-Ruano, P; Marí-Dell'Olmo, M; Nolasco, A; Saez, M; Sánchez-Villegas, P; Saurina, C; Martos, C
2017-10-01
To analyze the geographical pattern of diabetes mellitus (DM) mortality and its association with socioeconomic factors in 26 Spanish cities. We conducted an ecological study of DM mortality trends with two cross-sectional cuts (1996-2001; 2002-2007) using census tract (CT) as the unit of analysis. Smoothed standardized mortality rates (sSMR) were calculated using Bayesian models, and a socioeconomic deprivation score was calculated for each CT. In total, 27,757 deaths by DM were recorded, with higher mortality rates observed in men and in the period 1996-2001. For men, a significant association between CT deprivation score and DM mortality was observed in 6 cities in the first study period and in 7 cities in the second period. The highest relative risk was observed in Pamplona (RR, 5.13; 95% credible interval (95%CI), 1.32-15.16). For women, a significant association between CT deprivation score and DM mortality was observed in 13 cities in the first period and 8 in the second. The strongest association was observed in San Sebastián (RR, 3.44; 95%CI, 1.25-7.36). DM mortality remained stable in the majority of cities, although a marked decrease was observed in some cities, including Madrid (RR, 0.67 and 0.64 for men and women, respectively). Our findings demonstrate clear inequalities in DM mortality in Spain. These inequalities remained constant over time and were more marked in women. Detection of high-risk areas is crucial for the implementation of specific interventions. Copyright © 2017 Primary Care Diabetes Europe. Published by Elsevier Ltd. All rights reserved.
Grenet, Guillaume; Lajoinie, Audrey; Ribault, Shams; Nguyen, Gia Bao; Linet, Thomas; Metge, Augustin; Cornu, Catherine; Cucherat, Michel; Moulin, Philippe; Gueyffier, François
2017-06-01
The aim of this study was to propose a ranking of the currently available antidiabetic drugs, regarding vascular clinical outcomes, in patients with type 2 diabetes, through a network meta-analysis approach. Randomized clinical trials, regardless of the blinding design, testing contemporary antidiabetic drugs, and considering clinically relevant outcomes in patients with type 2 diabetes mellitus will be included. The primary outcomes of this analysis will be overall mortality, cardiovascular mortality, and major cardiovascular events. Diabetic microangiopathy will be a secondary outcome. Adverse events, hypoglycemia, weight evolution, bariatric surgery, and discontinuation of the treatment will also be recorded. Each drug will be analyzed according to its therapeutic class: biguanide, alpha-glucosidase inhibitors, sulfonylureas, glitazones, glinides, insulin, DPP-4 inhibitors, GLP-1 analogs, and gliflozins. The treatment effect of each drug class will be compared using pairwise meta-analysis and a Bayesian random-effects network meta-analysis. Sensitivity analyses will be conducted according to the quality of the studies and the glycemic control. The report will follow the PRISMA checklist for network meta-analysis. Results of the search strategy and of the study selection will be presented in a PRISMA compliant flowchart. The treatment effects will be summarized with odds ratio (OR) estimates and their 95% credible intervals. A ranking of the drugs will be proposed. Our network meta-analysis should allow a clinically relevant ranking of the contemporary antidiabetic drugs. © 2016 Société Française de Pharmacologie et de Thérapeutique.
The Chandra Source Catalog: X-ray Aperture Photometry
NASA Astrophysics Data System (ADS)
Kashyap, Vinay; Primini, F. A.; Glotfelty, K. J.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, I. N.; Evans, J. D.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hain, R.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.
2009-01-01
The Chandra Source Catalog represents a reanalysis of the entire ACIS and HRC imaging observations over the 9-year Chandra mission. Source detection is carried out on a uniform basis, using the CIAO tool wavdetect, and source fluxes are estimated post facto using a Bayesian method that accounts for background, spatial resolution effects, and contamination from nearby sources. We use gamma-function prior distributions, which can be either non-informative or, in cases where previous observations of the same source exist, strongly informative. The resulting posterior probability density functions allow us to report the flux and a robust credible range on it. We also determine limiting sensitivities at arbitrary locations in the field using the same formulation. This work was supported by CXC NASA contracts NAS8-39073 (VK) and NAS8-03060 (CSC).
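A minimal sketch of the conjugate gamma-Poisson machinery behind such photometry, with hypothetical counts and priors rather than the catalog's actual background-marginalized formulation: a Gamma prior on a Poisson source rate yields a Gamma posterior, from which a credible range can be read off empirically.

```python
import random

def poisson_rate_interval(counts, alpha=1.0, beta=1.0, level=0.95,
                          draws=50000, seed=7):
    """Conjugate gamma-Poisson sketch: a Gamma(alpha, rate=beta) prior on
    a Poisson source rate, updated with the observed counts, gives a
    Gamma(alpha + counts, rate=beta + 1) posterior. The credible range is
    read off empirically from sorted posterior draws."""
    rng = random.Random(seed)
    shape, rate = alpha + counts, beta + 1.0
    sample = sorted(rng.gammavariate(shape, 1.0 / rate) for _ in range(draws))
    lo = sample[int((1.0 - level) / 2.0 * draws)]
    hi = sample[int((1.0 + level) / 2.0 * draws) - 1]
    return lo, hi

# Hypothetical aperture with 25 counts per unit exposure.
lo, hi = poisson_rate_interval(25)
```

An informative prior from earlier observations of the same source would simply enter through larger `alpha` and `beta` here; the catalog's actual method additionally marginalizes over background and aperture overlap.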
Aaltonen, T.
2012-01-04
A search for a narrow Higgs boson resonance in the diphoton mass spectrum is presented based on data corresponding to 7.0 fb⁻¹ of integrated luminosity from pp̄ collisions at √s = 1.96 TeV collected by the CDF experiment. No evidence of such a resonance is observed, and upper limits are set on the cross section times branching ratio of the resonant state as a function of Higgs boson mass. The limits are interpreted in the context of the standard model and one fermiophobic benchmark model where the data exclude fermiophobic Higgs bosons with masses below 114 GeV/c² at a 95% Bayesian credibility level.
Search for a Higgs boson in the diphoton final state in pp collisions at sqrt[s]=1.96 TeV.
Aaltonen, T; Alvarez González, B; Amerio, S; Amidei, D; Anastassov, A; Annovi, A; Antos, J; Apollinari, G; Appel, J A; Apresyan, A; Arisawa, T; Artikov, A; Asaadi, J; Ashmanskas, W; Auerbach, B; Aurisano, A; Azfar, F; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Barria, P; Bartos, P; Bauce, M; Bauer, G; Bedeschi, F; Beecher, D; Behari, S; Bellettini, G; Bellinger, J; Benjamin, D; Beretvas, A; Bhatti, A; Binkley, M; Bisello, D; Bizjak, I; Bland, K R; Blumenfeld, B; Bocci, A; Bodek, A; Bortoletto, D; Boudreau, J; Boveia, A; Brigliadori, L; Brisuda, A; Bromberg, C; Brucken, E; Bucciantonio, M; Budagov, J; Budd, H S; Budd, S; Burkett, K; Busetto, G; Bussey, P; Buzatu, A; Calancha, C; Camarda, S; Campanelli, M; Campbell, M; Canelli, F; Carls, B; Carlsmith, D; Carosi, R; Carrillo, S; Carron, S; Casal, B; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavaliere, V; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, K; Chokheli, D; Chou, J P; Chung, W H; Chung, Y S; Ciobanu, C I; Ciocci, M A; Clark, A; Clarke, C; Compostella, G; Convery, M E; Conway, J; Corbo, M; Cordelli, M; Cox, C A; Cox, D J; Crescioli, F; Cuenca Almenar, C; Cuevas, J; Culbertson, R; Dagenhart, D; d'Ascenzo, N; Datta, M; de Barbaro, P; De Cecco, S; De Lorenzo, G; Dell'orso, M; Deluca, C; Demortier, L; Deng, J; Deninno, M; Devoto, F; d'Errico, M; Di Canto, A; Di Ruzza, B; Dittmann, J R; D'Onofrio, M; Donati, S; Dong, P; Dorigo, M; Dorigo, T; Ebina, K; Elagin, A; Eppig, A; Erbacher, R; Errede, D; Errede, S; Ershaidat, N; Eusebi, R; Fang, H C; Farrington, S; Feindt, M; Fernandez, J P; Ferrazza, C; Field, R; Flanagan, G; Forrest, R; Frank, M J; Franklin, M; Freeman, J C; Funakoshi, Y; Furic, I; Gallinaro, M; Galyardt, J; Garcia, J E; Garfinkel, A F; Garosi, P; Gerberich, H; Gerchtein, E; Giagu, S; Giakoumopoulou, V; Giannetti, P; Gibson, K; Ginsburg, C M; Giokaris, N; Giromini, P; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D; 
Gold, M; Goldin, D; Goldschmidt, N; Golossanov, A; Gomez, G; Gomez-Ceballos, G; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Goulianos, K; Grinstein, S; Grosso-Pilcher, C; Group, R C; Guimaraes da Costa, J; Gunay-Unalan, Z; Haber, C; Hahn, S R; Halkiadakis, E; Hamaguchi, A; Han, J Y; Happacher, F; Hara, K; Hare, D; Hare, M; Harr, R F; Hatakeyama, K; Hays, C; Heck, M; Heinrich, J; Herndon, M; Hewamanage, S; Hidas, D; Hocker, A; Hopkins, W; Horn, D; Hou, S; Hughes, R E; Hurwitz, M; Husemann, U; Hussain, N; Hussein, M; Huston, J; Introzzi, G; Iori, M; Ivanov, A; James, E; Jang, D; Jayatilaka, B; Jeon, E J; Jha, M K; Jindariani, S; Johnson, W; Jones, M; Joo, K K; Jun, S Y; Junk, T R; Kamon, T; Karchin, P E; Kasmi, A; Kato, Y; Ketchum, W; Keung, J; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, H W; Kim, J E; Kim, M J; Kim, S B; Kim, S H; Kim, Y K; Kimura, N; Kirby, M; Klimenko, S; Kondo, K; Kong, D J; Konigsberg, J; Kotwal, A V; Kreps, M; Kroll, J; Krop, D; Krumnack, N; Kruse, M; Krutelyov, V; Kuhr, T; Kurata, M; Kwang, S; Laasanen, A T; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, G; Lecompte, T; Lee, E; Lee, H S; Lee, J S; Lee, S W; Leo, S; Leone, S; Lewis, J D; Limosani, A; Lin, C-J; Linacre, J; Lindgren, M; Lipeles, E; Lister, A; Litvintsev, D O; Liu, C; Liu, Q; Liu, T; Lockwitz, S; Loginov, A; Lucchesi, D; Lueck, J; Lujan, P; Lukens, P; Lungu, G; Lys, J; Lysak, R; Madrak, R; Maeshima, K; Makhoul, K; Malik, S; Manca, G; Manousakis-Katsikakis, A; Margaroli, F; Marino, C; Martínez, M; Martínez-Ballarín, R; Mastrandrea, P; Mattson, M E; Mazzanti, P; McFarland, K S; McIntyre, P; McNulty, R; Mehta, A; Mehtala, P; Menzione, A; Mesropian, C; Miao, T; Mietlicki, D; Mitra, A; Miyake, H; Moed, S; Moggi, N; Mondragon, M N; Moon, C S; Moore, R; Morello, M J; Morlock, J; Movilla Fernandez, P; Mukherjee, A; Muller, Th; Murat, P; Mussini, M; Nachtman, J; Nagai, Y; Naganoma, J; Nakano, I; Napier, A; Nett, J; Neu, C; Neubauer, M S; 
Nielsen, J; Nodulman, L; Norniella, O; Nurse, E; Oakes, L; Oh, S H; Oh, Y D; Oksuzian, I; Okusawa, T; Orava, R; Ortolan, L; Pagan Griso, S; Pagliarone, C; Palencia, E; Papadimitriou, V; Paramonov, A A; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Pellett, D E; Penzo, A; Phillips, T J; Piacentino, G; Pianori, E; Pilot, J; Pitts, K; Plager, C; Pondrom, L; Poprocki, S; Potamianos, K; Poukhov, O; Prokoshin, F; Pronko, A; Ptohos, F; Pueschel, E; Punzi, G; Pursley, J; Rahaman, A; Ramakrishnan, V; Ranjan, N; Ray, J; Redondo, I; Renton, P; Rescigno, M; Riddick, T; Rimondi, F; Ristori, L; Robson, A; Rodrigo, T; Rodriguez, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rubbo, F; Ruffini, F; Ruiz, A; Russ, J; Rusu, V; Safonov, A; Sakumoto, W K; Sakurai, Y; Santi, L; Sartori, L; Sato, K; Saveliev, V; Savoy-Navarro, A; Schlabach, P; Schmidt, A; Schmidt, E E; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scribano, A; Scuri, F; Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Sforza, F; Sfyrla, A; Shalhout, S Z; Shears, T; Shepard, P F; Shimojima, M; Shiraishi, S; Shochet, M; Shreyber, I; Simonenko, A; Sinervo, P; Sissakian, A; Sliwa, K; Smith, J R; Snider, F D; Soha, A; Somalwar, S; Sorin, V; Squillacioti, P; Stancari, M; Stanitzki, M; St Denis, R; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Strycker, G L; Sudo, Y; Sukhanov, A; Suslov, I; Takemasa, K; Takeuchi, Y; Tang, J; Tecchio, M; Teng, P K; Thom, J; Thome, J; Thompson, G A; Thomson, E; Ttito-Guzmán, P; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Torre, S; Torretta, D; Totaro, P; Trovato, M; Tu, Y; Ukegawa, F; Uozumi, S; Varganov, A; Vázquez, F; Velev, G; Vellidis, C; Vidal, M; Vila, I; Vilar, R; Vizán, J; Vogel, M; Volpi, G; Wagner, P; Wagner, R L; Wakisaka, T; Wallny, R; Wang, S M; Warburton, A; Waters, D; Weinberger, M; Wester, W C; Whitehouse, B; Whiteson, D; Wicklund, A B; Wicklund, E; Wilbur, S; Wick, F; Williams, H H; Wilson, J S; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; 
Wolfe, H; Wright, T; Wu, X; Wu, Z; Yamamoto, K; Yamaoka, J; Yang, T; Yang, U K; Yang, Y C; Yao, W-M; Yeh, G P; Yi, K; Yoh, J; Yorita, K; Yoshida, T; Yu, G B; Yu, I; Yu, S S; Yun, J C; Zanetti, A; Zeng, Y; Zucchelli, S
2012-01-06
A search for a narrow Higgs boson resonance in the diphoton mass spectrum is presented based on data corresponding to 7.0 fb⁻¹ of integrated luminosity from pp̄ collisions at √s = 1.96 TeV collected by the CDF experiment. No evidence of such a resonance is observed, and upper limits are set on the cross section times branching ratio of the resonant state as a function of Higgs boson mass. The limits are interpreted in the context of the standard model and one fermiophobic benchmark model where the data exclude fermiophobic Higgs bosons with masses below 114 GeV/c² at a 95% Bayesian credibility level.
Wijeysundera, Duminda N; Austin, Peter C; Hux, Janet E; Beattie, W Scott; Laupacis, Andreas
2009-01-01
Randomized trials generally use "frequentist" statistics based on P-values and 95% confidence intervals. Frequentist methods have limitations that might be overcome, in part, by Bayesian inference. To illustrate these potential advantages, we re-analyzed randomized trials published in four general medical journals during 2004. We used Medline to identify randomized superiority trials with two parallel arms, individual-level randomization and dichotomous or time-to-event primary outcomes. Studies with P<0.05 in favor of the intervention were deemed "positive"; otherwise, they were "negative." We used several prior distributions and exact conjugate analyses to calculate Bayesian posterior probabilities for clinically relevant effects. Of 88 included studies, 39 were positive using a frequentist analysis. Although the Bayesian posterior probabilities of any benefit (relative risk or hazard ratio<1) were high in positive studies, these probabilities were lower and variable for larger benefits. The positive studies had only moderate probabilities for exceeding the effects that were assumed for calculating the sample size. By comparison, there were moderate probabilities of any benefit in negative studies. Bayesian and frequentist analyses complement each other when interpreting the results of randomized trials. Future reports of randomized trials should include both.
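A minimal sketch of such an exact conjugate analysis, assuming hypothetical two-arm event counts and uniform Beta(1, 1) priors (the paper explored several priors): with binomial likelihoods, each arm's posterior is a Beta distribution, and the posterior probability of any benefit is the probability that the treatment arm's event risk falls below the control arm's, estimated here by Monte Carlo.

```python
import random

def prob_benefit(events_t, n_t, events_c, n_c, a=1.0, b=1.0,
                 draws=20000, seed=42):
    """Posterior probability that the treatment arm's event risk is lower
    than the control arm's. With Beta(a, b) priors and binomial
    likelihoods, each arm's posterior is Beta(a + events, b + non-events);
    the probability is estimated by comparing paired posterior draws."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        p_t = rng.betavariate(a + events_t, b + n_t - events_t)
        p_c = rng.betavariate(a + events_c, b + n_c - events_c)
        if p_t < p_c:
            wins += 1
    return wins / draws

# Hypothetical trial: 30/200 events on treatment versus 50/200 on control.
print(prob_benefit(30, 200, 50, 200))
```

The same comparison with a stricter threshold (e.g., a relative risk below 0.8) gives the lower, more variable probabilities of "larger benefits" that the re-analysis highlights.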
NASA Astrophysics Data System (ADS)
Kim, Y.; Nishina, K.; Chae, N.; Park, S. J.; Yoon, Y. J.; Lee, B. Y.
2014-10-01
The tundra ecosystem is quite vulnerable to drastic climate change in the Arctic, and the quantification of carbon dynamics is of significant importance regarding thawing permafrost, changes to the snow-covered period and snow and shrub community extent, and the decline of sea ice in the Arctic. Here, CO2 efflux measurements using a manual chamber system within a 40 m × 40 m (5 m interval; 81 total points) plot were conducted within dominant tundra vegetation on the Seward Peninsula of Alaska, during the growing seasons of 2011 and 2012, for the assessment of driving parameters of CO2 efflux. We applied a hierarchical Bayesian (HB) model - a function of soil temperature, soil moisture, vegetation type, and thaw depth - to quantify the effects of environmental factors on CO2 efflux and to estimate growing season CO2 emissions. Our results showed that average CO2 efflux in 2011 was 1.4 times higher than in 2012, resulting from the distinct difference in soil moisture between the 2 years. Tussock-dominated CO2 efflux is 1.4 to 2.3 times higher than those measured in lichen and moss communities, revealing tussock as a significant CO2 source in the Arctic, with a wide area distribution on the circumpolar scale. CO2 efflux followed soil temperature nearly exponentially from both the observed data and the posterior medians of the HB model. This reveals that soil temperature regulates the seasonal variation of CO2 efflux and that soil moisture contributes to the interannual variation of CO2 efflux for the two growing seasons in question. Obvious changes in soil moisture during the growing seasons of 2011 and 2012 resulted in an explicit difference between CO2 effluxes - 742 and 539 g CO2 m⁻² period⁻¹ for 2011 and 2012, respectively - suggesting the 2012 CO2 emission was reduced by 27% (95% credible interval: 17-36%) relative to the 2011 emission, due to higher soil moisture from severe rain.
The estimated growing season CO2 emission rate ranged from 0.86 Mg CO2 in 2012 to 1.20 Mg CO2 in 2011 within a 40 m × 40 m plot, corresponding to 86 and 80% of annual CO2 emission rates within the western Alaska tundra ecosystem, estimated from the temperature dependence of CO2 efflux. Therefore, this HB model can be readily applied to observed CO2 efflux, as it demands only four environmental factors and can also be effective for quantitatively assessing the driving parameters of CO2 efflux.
A Bayesian CUSUM plot: Diagnosing quality of treatment.
Rosthøj, Steen; Jacobsen, Rikke-Line
2017-12-01
To present a CUSUM plot based on Bayesian diagnostic reasoning displaying evidence in favour of "healthy" rather than "sick" quality of treatment (QOT), and to demonstrate a technique using Kaplan-Meier survival curves permitting application to case series with ongoing follow-up. For a case series with known final outcomes: Consider each case a diagnostic test of good versus poor QOT (expected vs. increased failure rates), determine the likelihood ratio (LR) of the observed outcome, convert LR to weight taking log to base 2, and add up weights sequentially in a plot showing how many times odds in favour of good QOT have been doubled. For a series with observed survival times and an expected survival curve: Divide the curve into time intervals, determine "healthy" and specify "sick" risks of failure in each interval, construct a "sick" survival curve, determine the LR of survival or failure at the given observation times, convert to weights, and add up. The Bayesian plot was applied retrospectively to 39 children with acute lymphoblastic leukaemia with completed follow-up, using Nordic collaborative results as reference, showing equal odds between good and poor QOT. In the ongoing treatment trial, with 22 of 37 children still at risk for event, QOT has been monitored with average survival curves as reference, odds so far favoring good QOT 2:1. QOT in small patient series can be assessed with a Bayesian CUSUM plot, retrospectively when all treatment outcomes are known, but also in ongoing series with unfinished follow-up. © 2017 John Wiley & Sons, Ltd.
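The weighting scheme described above can be sketched as follows; the failure risks assumed under "good" and "poor" quality of treatment are hypothetical placeholders, not the leukaemia trial's reference values derived from Kaplan-Meier curves.

```python
import math

def cusum_weights(outcomes, p_good=0.10, p_poor=0.30):
    """Bayesian CUSUM: each case is treated as a diagnostic test of good
    versus poor quality of treatment (QOT). A case's weight is log base 2
    of the likelihood ratio of its observed outcome, so the running sum
    counts how many times the odds in favour of good QOT have doubled.
    outcomes: iterable of 0 (success) / 1 (treatment failure);
    p_good / p_poor: assumed failure risks under good vs. poor QOT."""
    total, path = 0.0, []
    for failed in outcomes:
        lr = (p_good / p_poor) if failed else ((1.0 - p_good) / (1.0 - p_poor))
        total += math.log2(lr)
        path.append(total)
    return path

# Five cases, one failure: each success adds evidence for good QOT,
# the failure subtracts it.
path = cusum_weights([0, 0, 1, 0, 0])
```

Plotting `path` against case number reproduces the evidence curve; the survival-curve technique in the abstract generalizes the per-case likelihood ratio to interval-censored observation times.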
Bayes in biological anthropology.
Konigsberg, Lyle W; Frankenberg, Susan R
2013-12-01
In this article, we both contend and illustrate that biological anthropologists, particularly in the Americas, often think like Bayesians but act like frequentists when it comes to analyzing a wide variety of data. In other words, while our research goals and perspectives are rooted in probabilistic thinking and rest on prior knowledge, we often proceed to use statistical hypothesis tests and confidence interval methods unrelated (or tenuously related) to the research questions of interest. We advocate for applying Bayesian analyses to a number of different bioanthropological questions, especially since many of the programming and computational challenges to doing so have been overcome in the past two decades. To facilitate such applications, this article explains Bayesian principles and concepts, and provides concrete examples of Bayesian computer simulations and statistics that address questions relevant to biological anthropology, focusing particularly on bioarchaeology and forensic anthropology. It also simultaneously reviews the use of Bayesian methods and inference within the discipline to date. This article is intended to act as a primer on Bayesian methods and inference in biological anthropology, explaining the relationships of various methods to likelihoods or probabilities and to classical statistical models. Our contention is not that traditional frequentist statistics should be rejected outright, but that there are many situations where biological anthropology is better served by taking a Bayesian approach. To this end it is hoped that the examples provided in this article will assist researchers in choosing from among the broad array of statistical methods currently available. Copyright © 2013 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally
2015-01-01
Purpose: Case-study methodology studying change is often used in the field of speech-language pathology, but it can be criticized for not being statistically robust. Yet with the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable…
Translating Climate Projections for Bridge Engineering
NASA Astrophysics Data System (ADS)
Anderson, C.; Takle, E. S.; Krajewski, W.; Mantilla, R.; Quintero, F.
2015-12-01
A bridge vulnerability pilot study was conducted by Iowa Department of Transportation (IADOT) as one of nineteen pilots supported by the Federal Highway Administration Climate Change Resilience Pilots. Our pilot study team consisted of the IADOT senior bridge engineer who is the preliminary design section leader as well as climate and hydrological scientists. The pilot project culminated in a visual graphic designed by the bridge engineer (Figure 1), and an evaluation framework for bridge engineering design. The framework has four stages. The first two stages evaluate the spatial and temporal resolution needed in climate projection data in order to be suitable for input to a hydrology model. The framework separates streamflow simulation error into errors from the streamflow model and from the coarseness of input weather data series. In the final two stages, the framework evaluates credibility of climate projection streamflow simulations. Using an empirically downscaled data set, projection streamflow is generated. Error is computed in two time frames: the training period of the empirical downscaling methodology, and an out-of-sample period. If large errors in projection streamflow were observed during the training period, it would indicate low accuracy and, therefore, low credibility. If large errors in streamflow were observed during the out-of-sample period, it would mean the approach may not include some causes of change and, therefore, the climate projections would have limited credibility for setting expectations for changes. We address uncertainty with confidence intervals on quantiles of streamflow discharge. The results show the 95% confidence intervals have significant overlap. Nevertheless, the use of confidence intervals enabled engineering judgement. 
In our discussions, we noted the consistency in the direction of change across basins, even though the flood mechanism differed between basins, and that the upper bound of the bridge-lifetime-period quantiles exceeded that of the historical period. This suggested the change was not isolated and that it systemically altered the risk profile. One suggestion for incorporating engineering judgement was to consider degrees of vulnerability using the median discharge of the historical period and the upper-bound discharge for the bridge lifetime period.
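The pilot's use of confidence intervals on streamflow quantiles can be illustrated with a generic bootstrap sketch. The pilot's actual interval method is not specified here; the distribution, sample size, and quantile below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical annual peak discharges (m^3/s) standing in for simulated
# streamflow; a bootstrap percentile interval is placed on the 0.99 quantile.
peaks = rng.gumbel(loc=500.0, scale=120.0, size=60)
q99 = np.quantile(peaks, 0.99)

# Resample the peak series with replacement and recompute the quantile
boot = np.array([
    np.quantile(rng.choice(peaks, peaks.size), 0.99) for _ in range(2000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
```

Overlapping intervals computed this way for two periods do not settle the comparison statistically, but they bound the range within which engineering judgement operates.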
Bayesian models for comparative analysis integrating phylogenetic uncertainty.
de Villemereuil, Pierre; Wells, Jessie A; Edwards, Robert D; Blomberg, Simon P
2012-06-28
Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. 
We show that BUGS is a useful, flexible general purpose tool for phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language.
Bayesian models for comparative analysis integrating phylogenetic uncertainty
2012-01-01
Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. 
We show that BUGS is a useful, flexible general purpose tool for phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language. PMID:22741602
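The paper provides its models in the BUGS language; as a language-neutral illustration (Python, with hypothetical data), the core idea of propagating phylogenetic uncertainty can be sketched by averaging a GLS regression over a posterior set of phylogenetic covariance matrices rather than conditioning on one consensus tree:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 8                                 # number of species (hypothetical)
x = rng.normal(size=n)                # predictor trait
y = 2.0 * x + rng.normal(size=n)      # response trait, true slope = 2
X = np.column_stack([np.ones(n), x])  # design matrix with intercept

def gls_slope(C):
    """GLS slope estimate given a phylogenetic covariance matrix C."""
    Ci = np.linalg.inv(C)
    beta = np.linalg.solve(X.T @ Ci @ X, X.T @ Ci @ y)
    return beta[1]

# Hypothetical posterior 'tree set': 50 covariance matrices jittering a base
# structure, standing in for trees sampled by a Bayesian phylogenetics program.
A = rng.normal(size=(n, n))
base_C = A @ A.T / n + np.eye(n)
slopes = []
for _ in range(50):
    J = rng.normal(scale=0.05, size=(n, n))
    C = base_C + J @ J.T              # keep each matrix positive definite
    slopes.append(gls_slope(C))
slopes = np.array(slopes)

# The spread of slopes across the tree set carries the phylogenetic
# uncertainty into the regression estimate.
```

The full Bayesian treatment in the paper samples the tree index inside the MCMC rather than averaging point estimates, but the effect on the reported interval is the same in spirit.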
NASA Astrophysics Data System (ADS)
Koch, Wolfgang
1996-05-01
Sensor data processing in a dense target/dense clutter environment is inevitably confronted with data association conflicts which correspond with the multiple hypothesis character of many modern approaches (MHT: multiple hypothesis tracking). In this paper we analyze the efficiency of retrodictive techniques that generalize standard fixed interval smoothing to MHT applications. 'Delayed estimation' based on retrodiction provides uniquely interpretable and accurate trajectories from ambiguous MHT output if a certain time delay is tolerated. In a Bayesian framework the theoretical background of retrodiction and its intimate relation to Bayesian MHT is sketched. By a simulated example with two closely-spaced targets, relatively low detection probabilities, and rather high false return densities, we demonstrate the benefits of retrodiction and quantitatively discuss the achievable track accuracies and the time delays involved for typical radar parameters.
Bayesian estimation of dynamic matching function for U-V analysis in Japan
NASA Astrophysics Data System (ADS)
Kyo, Koki; Noda, Hideo; Kitagawa, Genshiro
2012-05-01
In this paper we propose a Bayesian method for analyzing unemployment dynamics. We derive a Beveridge curve for unemployment and vacancy (U-V) analysis from a Bayesian model based on a labor market matching function. In our framework, the efficiency of matching and the elasticities of new hiring with respect to unemployment and vacancy are regarded as time varying parameters. To construct a flexible model and obtain reasonable estimates in an underdetermined estimation problem, we treat the time varying parameters as random variables and introduce smoothness priors. The model is then described in a state space representation, enabling the parameter estimation to be carried out using Kalman filter and fixed interval smoothing. In such a representation, dynamic features of the cyclic unemployment rate and the structural-frictional unemployment rate can be accurately captured.
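The filtering-and-smoothing machinery referred to above can be sketched for the simplest case, a scalar local-level model; this is a generic illustration, not the authors' U-V matching model:

```python
import numpy as np

# Local-level state space model: x_t = x_{t-1} + w_t, y_t = x_t + v_t,
# estimated with a Kalman filter followed by fixed-interval (RTS) smoothing.
def kalman_smooth(y, q=0.1, r=1.0, x0=0.0, p0=10.0):
    n = len(y)
    xf, pf = np.empty(n), np.empty(n)   # filtered mean / variance
    xp, pp = np.empty(n), np.empty(n)   # one-step predictions
    x, p = x0, p0
    for t in range(n):
        xp[t], pp[t] = x, p + q          # predict
        k = pp[t] / (pp[t] + r)          # Kalman gain
        x = xp[t] + k * (y[t] - xp[t])   # update
        p = (1 - k) * pp[t]
        xf[t], pf[t] = x, p
    xs, ps = xf.copy(), pf.copy()        # backward fixed-interval pass
    for t in range(n - 2, -1, -1):
        g = pf[t] / pp[t + 1]
        xs[t] = xf[t] + g * (xs[t + 1] - xp[t + 1])
        ps[t] = pf[t] + g * g * (ps[t + 1] - pp[t + 1])
    return xs, ps

rng = np.random.default_rng(1)
truth = np.cumsum(rng.normal(scale=0.3, size=200))   # time varying parameter
y = truth + rng.normal(scale=1.0, size=200)          # noisy observations
xs, ps = kalman_smooth(y, q=0.09, r=1.0)
```

The smoothness priors in the paper play the role of the state-transition variance `q` here: smaller `q` forces the recovered time varying parameter to evolve more slowly.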
Levecke, Bruno; Kaplan, Ray M; Thamsborg, Stig M; Torgerson, Paul R; Vercruysse, Jozef; Dobson, Robert J
2018-04-15
Although various studies have provided novel insights into how best to design, analyze and interpret a fecal egg count reduction test (FECRT), it is still not straightforward to provide guidance that would improve both the standardization and the analytical performance of the FECRT across a variety of animal and nematode species. For example, it has been suggested to recommend a minimum number of eggs to be counted under the microscope (not eggs per gram of feces), but we lack the evidence to recommend any number of eggs that would allow a reliable assessment of drug efficacy. Other aspects that need further research are the methodology for calculating uncertainty intervals (UIs; confidence intervals in the case of frequentist methods and credible intervals in the case of Bayesian methods) and the criteria for classifying drug efficacy as 'normal', 'suspected' or 'reduced'. The aim of this study is to provide complementary insights into the current knowledge, and ultimately to provide guidance for the development of new standardized guidelines for the FECRT. First, data were generated using a simulation in which the 'true' drug efficacy (TDE) was evaluated by the FECRT under varying scenarios of sample size, analytic sensitivity of the diagnostic technique, and level of both intensity and aggregation of egg excretion. Second, the obtained data were analyzed with the aim (i) to verify which classification criteria allow for reliable detection of reduced drug efficacy, (ii) to identify the UI methodology that yields the most reliable assessment of drug efficacy (coverage of TDE) and detection of reduced drug efficacy, and (iii) to determine the required sample size and number of eggs counted under the microscope that optimize the detection of reduced efficacy. Our results confirm that the currently recommended criteria for classifying drug efficacy are the most appropriate. 
Additionally, the UI methodologies we tested varied in coverage and ability to detect reduced drug efficacy, thus a combination of UI methodologies is recommended to assess the uncertainty across all scenarios of drug efficacy estimates. Finally, based on our model estimates we were able to determine the required number of eggs to count for each sample size, enabling investigators to optimize the probability of correctly classifying a theoretical TDE while minimizing both financial and technical resources. Copyright © 2018 Elsevier B.V. All rights reserved.
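As a hedged illustration of the kind of simulation described (the study's actual scenarios, UI methods, and classification thresholds are richer), a negative-binomial FECRT with a bootstrap percentile uncertainty interval might look like:

```python
import numpy as np

rng = np.random.default_rng(7)

def nb(mu, k, n):
    """Negative-binomial egg counts with mean mu and aggregation k.
    numpy parameterizes as (n=k, p=k/(k+mu))."""
    return rng.negative_binomial(k, k / (k + mu), size=n)

def fecr(pre, post):
    """Percent fecal egg count reduction from group means."""
    return 100.0 * (1.0 - post.mean() / pre.mean())

n, true_eff = 20, 0.85                 # herd size and 'true' drug efficacy (assumed)
pre = nb(mu=300.0, k=0.7, n=n)         # aggregated pre-treatment counts
post = nb(mu=300.0 * (1 - true_eff), k=0.7, n=n)

point = fecr(pre, post)
# Bootstrap percentile uncertainty interval, one of several UI methodologies
boot = np.array([
    fecr(rng.choice(pre, n), rng.choice(post, n)) for _ in range(2000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
```

Repeating this over grids of sample size, aggregation `k`, and TDE is how coverage and the probability of correctly classifying reduced efficacy can be tabulated.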
Building a Database for a Quantitative Model
NASA Technical Reports Server (NTRS)
Kahn, C. Joseph; Kleinhammer, Roger
2014-01-01
A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Less obviously, entering data this way does nothing to link the Basic Events to their data sources. The best way to organize large amounts of data on a computer is with a database. But a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data are used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian updating based on flight and testing experience. A simple, unique metadata field in both the model and the database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
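The Bayesian updating a database column might implement can be sketched with the standard conjugate gamma-Poisson update for a failure rate; the numbers below are illustrative, not from any NASA model:

```python
# Conjugate gamma-Poisson update: a Gamma(alpha, beta) prior on a failure
# rate (per hour) is updated with observed failures over accumulated
# operating time. Illustrative values only.
def gamma_poisson_update(alpha, beta, failures, exposure_hours):
    """Return the posterior Gamma parameters after observing the data."""
    return alpha + failures, beta + exposure_hours

alpha0, beta0 = 0.5, 1.0e5              # prior mean rate: 5e-6 per hour
a, b = gamma_poisson_update(alpha0, beta0, failures=2, exposure_hours=4.0e5)
posterior_mean = a / b                  # updated rate estimate
```

Because the update is two additions, it fits naturally in spreadsheet cells, with the prior parameters and exposure data traceable to their source columns.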
Feng, Dai; Svetnik, Vladimir; Coimbra, Alexandre; Baumgartner, Richard
2014-01-01
The intraclass correlation coefficient (ICC) with fixed raters or, equivalently, the concordance correlation coefficient (CCC) for continuous outcomes is a widely accepted aggregate index of agreement in settings with a small number of raters. Quantifying the precision of the CCC by constructing its confidence interval (CI) is important in early drug development applications, in particular in the qualification of biomarker platforms. In recent years, several new methods have been proposed for constructing CIs for the CCC, but a comprehensive comparison of them has not been attempted. The methods consisted of the delta method and jackknifing, each with and without Fisher's Z-transformation, and Bayesian methods with vague priors. In this study, we carried out a simulation study, with data simulated from a multivariate normal as well as a heavier-tailed distribution (t-distribution with 5 degrees of freedom), to compare the state-of-the-art methods for assigning a CI to the CCC. When the data are normally distributed, jackknifing with Fisher's Z-transformation (JZ) tended to provide superior coverage, and the difference between it and the closest competitor, the Bayesian method with the Jeffreys prior, was in general minimal. For the nonnormal data, the jackknife methods, especially the JZ method, provided coverage probabilities closest to nominal, in contrast to the others, which yielded overly liberal coverage. Approaches based upon the delta method and the Bayesian method with a conjugate prior generally provided slightly narrower intervals and larger lower bounds than the others, though this was offset by their poor coverage. Finally, we illustrate the utility of the CIs for the CCC in an example of a wake after sleep onset (WASO) biomarker, which is frequently used in clinical sleep studies of drugs for the treatment of insomnia.
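A minimal sketch of Lin's CCC and a jackknife interval on Fisher's Z scale (the 'JZ' idea) follows; the simulated two-rater data are illustrative only, and the paper's exact variance estimator may differ in detail:

```python
import numpy as np

rng = np.random.default_rng(3)

def ccc(x, y):
    """Concordance correlation coefficient (Lin's CCC), 1/n variances."""
    mx, my = x.mean(), y.mean()
    sxy = ((x - mx) * (y - my)).mean()
    return 2 * sxy / (x.var() + y.var() + (mx - my) ** 2)

def jz_interval(x, y, z_crit=1.96):
    """Jackknife CI for the CCC computed on Fisher's Z scale."""
    n = len(x)
    z_full = np.arctanh(ccc(x, y))
    idx = np.arange(n)
    z_loo = np.array([np.arctanh(ccc(x[idx != i], y[idx != i])) for i in idx])
    pseudo = n * z_full - (n - 1) * z_loo       # jackknife pseudo-values
    z_hat, se = pseudo.mean(), pseudo.std(ddof=1) / np.sqrt(n)
    return np.tanh(z_hat - z_crit * se), np.tanh(z_hat + z_crit * se)

# Two 'raters' measuring the same quantity with independent noise
truth = rng.normal(size=60)
x = truth + rng.normal(scale=0.3, size=60)
y = truth + rng.normal(scale=0.3, size=60)
lo, hi = jz_interval(x, y)
```

Transforming to the Z scale before forming the interval and back-transforming the endpoints keeps the interval inside (-1, 1) and improves coverage near the boundary.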
Bayesian alternative to the ISO-GUM's use of the Welch-Satterthwaite formula
NASA Astrophysics Data System (ADS)
Kacker, Raghu N.
2006-02-01
In certain disciplines, uncertainty is traditionally expressed as an interval about an estimate for the value of the measurand. Development of such uncertainty intervals with a stated coverage probability based on the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement (GUM) requires a description of the probability distribution for the value of the measurand. The ISO-GUM propagates the estimates and their associated standard uncertainties for various input quantities through a linear approximation of the measurement equation to determine an estimate and its associated standard uncertainty for the value of the measurand. This procedure does not yield a probability distribution for the value of the measurand. The ISO-GUM suggests that under certain conditions motivated by the central limit theorem the distribution for the value of the measurand may be approximated by a scaled-and-shifted t-distribution with effective degrees of freedom obtained from the Welch-Satterthwaite (W-S) formula. The approximate t-distribution may then be used to develop an uncertainty interval with a stated coverage probability for the value of the measurand. We propose an approximate normal distribution based on a Bayesian uncertainty as an alternative to the t-distribution based on the W-S formula. A benefit of the approximate normal distribution based on a Bayesian uncertainty is that it greatly simplifies the expression of uncertainty by eliminating altogether the need for calculating effective degrees of freedom from the W-S formula. In the special case where the measurand is the difference between two means, each evaluated from statistical analyses of independent normally distributed measurements with unknown and possibly unequal variances, the probability distribution for the value of the measurand is known to be a Behrens-Fisher distribution. 
We compare the performance of the approximate normal distribution based on a Bayesian uncertainty and the approximate t-distribution based on the W-S formula with respect to the Behrens-Fisher distribution. The approximate normal distribution is simpler and better in this case. A thorough investigation of the relative performance of the two approximate distributions would require comparison for a range of measurement equations by numerical methods.
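The W-S formula itself is simple to state in code. The sketch below computes the effective degrees of freedom for the two-means (Behrens-Fisher) case with illustrative numbers; the last lines show only the gist of the Bayesian alternative, which inflates each Type A uncertainty by sqrt(v/(v-2)) and then uses a plain normal interval, with no effective-dof calculation at all:

```python
def welch_satterthwaite(u, dof):
    """Effective degrees of freedom for the combined uncertainty
    sqrt(sum u_i^2): nu_eff = u_c^4 / sum(u_i^4 / nu_i)."""
    uc2 = sum(ui ** 2 for ui in u)
    return uc2 ** 2 / sum(ui ** 4 / vi for ui, vi in zip(u, dof))

# Difference of two means: u_i = s_i / sqrt(n_i), each with n_i - 1 dof
u, dof = [0.5, 0.3], [9, 14]          # illustrative values
nu_eff = welch_satterthwaite(u, dof)

# Simplified gist of the Bayesian alternative: inflate each Type A
# standard uncertainty, then use an ordinary normal coverage interval.
u_bayes = [(v / (v - 2)) ** 0.5 * ui for ui, v in zip(u, dof)]
```

The `nu_eff` value would index a scaled-and-shifted t-distribution under the ISO-GUM recipe; the Bayesian route skips that step entirely.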
Zhang, J L; Li, Y P; Huang, G H; Baetz, B W; Liu, J
2017-06-01
In this study, a Bayesian estimation-based simulation-optimization modeling approach (BESMA) is developed for identifying effluent trading strategies. BESMA incorporates nutrient fate modeling with the soil and water assessment tool (SWAT), Bayesian estimation, and probabilistic-possibilistic interval programming with fuzzy random coefficients (PPI-FRC) within a general framework. Based on the water quality protocols provided by SWAT, posterior distributions of parameters can be analyzed through Bayesian estimation, and the stochastic characteristics of nutrient loading can be investigated, which provides the inputs for decision making. PPI-FRC can address multiple uncertainties in the form of intervals with fuzzy random boundaries, and the associated system risk, through incorporating the concepts of possibility and necessity measures. The possibility and necessity measures are suitable for optimistic and pessimistic decision making, respectively. BESMA is applied to a real case of effluent trading planning in the Xiangxihe watershed, China. A number of decision alternatives can be obtained under different trading ratios and treatment rates. The results not only facilitate identification of optimal effluent-trading schemes, but also give insight into the effects of trading ratio and treatment rate on decision making. The results also reveal that the decision maker's preference towards risk would affect decision alternatives on the trading scheme as well as the system benefit. 
Compared with conventional optimization methods, BESMA proves advantageous in (i) dealing with multiple uncertainties associated with randomness and fuzziness in effluent-trading planning within a multi-source, multi-reach and multi-period context; (ii) reflecting the uncertainties in nutrient transport behaviors to improve the accuracy of water quality prediction; and (iii) supporting pessimistic and optimistic decision making for effluent trading as well as promoting diversity of decision alternatives. Copyright © 2017 Elsevier Ltd. All rights reserved.
Global determination of rating curves in the Amazon basin from satellite altimetry
NASA Astrophysics Data System (ADS)
Paris, Adrien; Paiva, Rodrigo C. D.; Santos da Silva, Joecila; Medeiros Moreira, Daniel; Calmant, Stéphane; Collischonn, Walter; Bonnet, Marie-Paule; Seyler, Frédérique
2014-05-01
The Amazon basin is the largest hydrological basin in the world. Over the past few years, it has experienced an unusual succession of extreme droughts and floods, whose origin is still a matter of debate. One of the major issues in understanding such events is obtaining discharge series distributed over the entire basin. Satellite altimetry can be used to improve our knowledge of the hydrological stream flow conditions in the basin through rating curves. Rating curves are mathematical relationships between stage and discharge at a given place. The common way to determine the parameters of the relationship is to compute a non-linear regression between the discharge and stage series. In this study, the discharge data were obtained by simulation over the entire basin using the MGB-IPH model with TRMM Merge input rainfall data and assimilation of gage data, run from 1998 to 2009. The stage dataset consists of ~900 altimetry series at ENVISAT and Jason-2 virtual stations, sampling the stages of more than a hundred rivers in the basin. The altimetry series span 2002 to 2011. In the present work we show the benefits of using stochastic methods instead of deterministic ones to determine a dataset of rating curve parameters that are hydrologically meaningful throughout the entire Amazon basin. The rating curve parameters were computed using an optimization technique based on a Markov chain Monte Carlo sampler and a Bayesian inference scheme. This technique provides an estimate of the best value for the parameters together with their posterior probability distribution, allowing the determination of a credibility interval for the calculated discharge. The error in the discharge estimates from the MGB-IPH model is also included in the rating curve determination. These MGB-IPH errors come either from errors in the discharge derived from the gage readings or from errors in the satellite rainfall estimates. 
The present experiment shows that the stochastic approach is more efficient than the deterministic one. By using user-defined prior credible intervals for the parameters, this method provides the best rating curve estimate without any unlikely parameter values. Results were assessed through the Nash-Sutcliffe efficiency coefficient; values above 0.7 were found for most of the 920 virtual stations. From these results we were able to determine a fully coherent map of river bed height, mean depth and Manning's roughness coefficient, information that can be reused in hydrological modeling. The poor results found at a few virtual stations are also of interest. For some sub-basins in the Andean piedmont, they confirm that the model failed to estimate discharges there. Others occur at tributary mouths experiencing backwater effects from the Amazon. By including the mean monthly slope at the virtual station in the rating curve equation, we obtain rated discharges much more consistent with the modeled and measured ones, showing that it is now possible to obtain a meaningful rating curve in such critical areas.
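A toy version of such an MCMC rating curve fit can be sketched as follows; this is not the authors' sampler, and the synthetic stage-discharge data, the power-law curve Q = a(h - h0)^b, the noise level, and the box priors are all assumed here:

```python
import numpy as np

rng = np.random.default_rng(5)

def rating(h, a, b, h0):
    """Power-law rating curve Q = a * (h - h0)^b, clipped for safety."""
    return a * np.clip(h - h0, 1e-6, None) ** b

# Synthetic stage/discharge series with multiplicative lognormal noise
true_a, true_b, true_h0 = 5.0, 1.7, 1.0
h = np.linspace(2.0, 10.0, 40)
q = rating(h, true_a, true_b, true_h0) * rng.lognormal(0.0, 0.05, h.size)

def log_post(theta):
    a, b, h0 = theta
    if a <= 0 or not 0.5 < b < 3.0 or not 0.0 < h0 < 1.9:  # flat box priors
        return -np.inf
    resid = np.log(q) - np.log(rating(h, a, b, h0))
    return -0.5 * np.sum(resid ** 2) / 0.05 ** 2

# Random-walk Metropolis: 2000 burn-in, 2000 retained samples
theta, lp = np.array([4.0, 1.5, 0.5]), None
lp = log_post(theta)
samples = []
for i in range(4000):
    prop = theta + rng.normal(scale=[0.1, 0.02, 0.02])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if i >= 2000:
        samples.append(theta.copy())
samples = np.array(samples)
b_lo, b_hi = np.percentile(samples[:, 1], [2.5, 97.5])  # credibility interval
```

The retained samples give both a best estimate and posterior intervals for each parameter, and the prior box is what keeps "unlikely parameter" values out of the fit.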
A Bayesian sequential processor approach to spectroscopic portal system decisions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sale, K; Candy, J; Breitfeller, E
The development of faster, more reliable techniques to detect radioactive contraband in a portal type scenario is an extremely important problem, especially in this era of constant terrorist threats. Towards this goal the development of a model-based, Bayesian sequential data processor for the detection problem is discussed. In the sequential processor each datum (detector energy deposit and pulse arrival time) is used to update the posterior probability distribution over the space of model parameters. The nature of the sequential processor approach is that a detection is produced as soon as it is statistically justified by the data rather than waiting for a fixed counting interval before any analysis is performed. In this paper the Bayesian model-based approach, physics and signal processing models and decision functions are discussed along with the first results of our research.
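The core idea of deciding "as soon as it is statistically justified" can be sketched with a sequential posterior-odds update on Poisson counts; the rates and threshold below are assumed for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(9)

b, s = 2.0, 3.0                        # assumed background and source rates per bin
threshold = np.log(99.0)               # stop when posterior odds exceed 99:1
counts = rng.poisson(b + s, size=500)  # simulated data: source actually present

log_odds, n_used = 0.0, 0              # prior odds 1:1
for k in counts:
    # Per-datum Bayes update: log likelihood ratio of Poisson(b+s) vs Poisson(b)
    log_odds += k * np.log((b + s) / b) - s
    n_used += 1
    if log_odds > threshold:           # declare detection as soon as justified
        break
```

The decision time adapts to the data: strong counts trigger a detection after a handful of bins, while weak sources take longer, with no fixed counting interval.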
Bayesian tomography and integrated data analysis in fusion diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Dong, E-mail: lid@swip.ac.cn; Dong, Y. B.; Deng, Wei
2016-11-15
In this article, a Bayesian tomography method using a non-stationary Gaussian process for a prior has been introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The implementation of this method for a soft X-ray diagnostic on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-size inference framework, aiming at an integrated analysis of heterogeneous diagnostics.
Su, Xiaole; Xie, Xinfang; Liu, Lijun; Lv, Jicheng; Song, Fujian; Perkovic, Vlado; Zhang, Hong
2017-01-01
To simultaneously evaluate the relative efficacy of multiple pharmacologic strategies for preventing contrast-induced acute kidney injury (AKI). Systematic review containing a Bayesian network meta-analysis of randomized controlled trials. Participants undergoing diagnostic and/or interventional procedures with contrast media. Randomized controlled trials comparing the active drug treatments with each other or with hydration alone. Any of the following drugs in combination with hydration: N-acetylcysteine (NAC), theophylline (aminophylline), fenoldopam, iloprost, alprostadil, prostaglandin E1, statins, statins plus NAC, sodium bicarbonate, sodium bicarbonate plus NAC, ascorbic acid (vitamin C), tocopherol (vitamin E), α-lipoic acid, atrial natriuretic peptide, B-type natriuretic peptide, and carperitide. The occurrence of contrast-induced AKI. The trial network included 150 trials with 31,631 participants and 4,182 contrast-induced AKI events assessing 12 different interventions. Compared to hydration, ORs (95% credible intervals) for contrast-induced AKI were 0.31 (0.14-0.60) for high-dose statin plus NAC, 0.37 (0.19-0.64) for high-dose statin alone, 0.37 (0.17-0.72) for prostaglandins, 0.48 (0.26-0.82) for theophylline, 0.62 (0.40-0.88) for sodium bicarbonate plus NAC, 0.67 (0.54-0.81) for NAC alone, 0.64 (0.41-0.95) for vitamins and analogues, 0.70 (0.29-1.37) for natriuretic peptides, 0.69 (0.31-1.37) for fenoldopam, 0.78 (0.59-1.01) for sodium bicarbonate, and 0.98 (0.41-2.07) for low-dose statin. High-dose statin plus NAC or high-dose statin alone were likely to be ranked the best or the second best for preventing contrast-induced AKI. The overall results were not materially changed in metaregressions or subgroup and sensitivity analyses. Patient-level data were unavailable; some treatment agents could not be included; event rates were low; and the distribution of participants among treatment strategies was imbalanced. 
High-dose statins plus hydration with or without NAC might be the preferred treatment strategy to prevent contrast-induced AKI in patients undergoing diagnostic and/or interventional procedures requiring contrast media. Copyright © 2016 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
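Statements such as "likely to be ranked the best" come from posterior rank probabilities. A hedged sketch follows, using synthetic normal posteriors for the log-ORs that only loosely mimic the reported point estimates (the standard deviations are assumed, not derived from the trial data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic posterior samples of log-OR versus hydration for four strategies
treatments = ["statin+NAC", "statin", "theophylline", "NAC"]
means = np.log([0.31, 0.37, 0.48, 0.67])   # reported median ORs
sds = [0.35, 0.30, 0.28, 0.10]             # rough posterior widths, assumed
draws = rng.normal(means, sds, size=(10000, 4))

# For each posterior draw, find the treatment with the lowest odds of AKI,
# then average: the probability each treatment is ranked best.
best = (draws.argmin(axis=1)[:, None] == np.arange(4)).mean(axis=0)
p_rank_best = dict(zip(treatments, best))
```

Because ranks are computed jointly within each draw, correlations among treatment effects (here absent by assumption) are automatically respected in a real network meta-analysis.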
Abraham, William T; Kuck, Karl-Heinz; Goldsmith, Rochelle L; Lindenfeld, JoAnn; Reddy, Vivek Y; Carson, Peter E; Mann, Douglas L; Saville, Benjamin; Parise, Helen; Chan, Rodrigo; Wiegn, Phi; Hastings, Jeffrey L; Kaplan, Andrew J; Edelmann, Frank; Luthje, Lars; Kahwash, Rami; Tomassoni, Gery F; Gutterman, David D; Stagg, Angela; Burkhoff, Daniel; Hasenfuß, Gerd
2018-05-05
The authors sought to confirm a subgroup analysis of the prior FIX-HF-5 (Evaluate Safety and Efficacy of the OPTIMIZER System in Subjects With Moderate-to-Severe Heart Failure) study showing that cardiac contractility modulation (CCM) improved exercise tolerance (ET) and quality of life in patients with ejection fractions between 25% and 45%. CCM therapy for New York Heart Association (NYHA) functional class III and IV heart failure (HF) patients consists of nonexcitatory electrical signals delivered to the heart during the absolute refractory period. A total of 160 patients with NYHA functional class III or IV symptoms, QRS duration <130 ms, and ejection fraction ≥25% and ≤45% were randomized to continued medical therapy (control, n = 86) or CCM (treatment, n = 74, unblinded) for 24 weeks. Peak VO2 (primary endpoint), Minnesota Living With Heart Failure questionnaire, NYHA functional class, and 6-min hall walk were measured at baseline and at 12 and 24 weeks. Bayesian repeated measures linear modeling was used for the primary endpoint analysis with 30% borrowing from the FIX-HF-5 subgroup. Safety was assessed by the percentage of patients free of device-related adverse events with a pre-specified lower bound of 70%. The difference in peak VO2 between groups was 0.84 (95% Bayesian credible interval: 0.123 to 1.552) ml O2/kg/min, satisfying the primary endpoint. Minnesota Living With Heart Failure questionnaire (p < 0.001), NYHA functional class (p < 0.001), and 6-min hall walk (p = 0.02) were all better in the treatment versus control group. There were 7 device-related events, yielding a lower bound of 80% of patients free of events, satisfying the primary safety endpoint. The composite of cardiovascular death and HF hospitalizations was reduced from 10.8% to 2.9% (p = 0.048). CCM is safe, improves exercise tolerance and quality of life in the specified group of HF patients, and leads to fewer HF hospitalizations. 
(Evaluate Safety and Efficacy of the OPTIMIZER System in Subjects With Moderate-to-Severe Heart Failure; NCT01381172). Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Dufour, Simon; Durocher, Jean; Dubuc, Jocelyn; Dendukuri, Nandini; Hassan, Shereen; Buczinski, Sébastien
2017-05-01
Using a milk sample for pregnancy diagnosis in dairy cattle is extremely convenient due to the low technical inputs required for collection of biological materials. Determining the accuracy of a novel pregnancy diagnostic test that relies on a milk sample is, however, difficult, since no gold standard test is available for comparison. The objective of the current study was to estimate the diagnostic accuracy of the milk PAG-based ELISA and of a transrectal ultrasonographic (TUS) exam for determining the pregnancy status of individual dairy cows, using a methodology suited for test validation in the absence of a gold standard. Secondary objectives were to evaluate whether test accuracy varies with cows' characteristics and to identify the optimal ELISA optical density threshold for PAG test interpretation. Cows (n=519) from 18 commercial dairies tested with both TUS and PAG between 28 and 45 days following breeding were included in the study. Other covariates (number of days since breeding, parity, and daily milk production) hypothesized to affect TUS or PAG test accuracy were measured. A Bayesian hierarchical latent class model (LCM) methodology assuming conditional independence between tests was used to obtain estimates of the tests' sensitivities (Se) and specificities (Sp), to evaluate the impact of covariates on these, and to compute misclassification costs across a range of ELISA thresholds. Very little disagreement was observed between tests, with only 23 cows yielding discordant results. Using the LCM model with non-informative priors for the test accuracy parameters, median (95% credibility interval [CI]) TUS Se and Sp estimates of 0.96 (0.91, 1.00) and 0.99 (0.97, 1.0) were obtained. For the PAG test, a median (95% CI) Se of 0.99 (0.98, 1.00) and Sp of 0.95 (0.89, 1.0) were observed. The impact of adjusting for conditional dependence between tests was negligible. Test accuracy of the PAG test varied slightly by parity number. 
When assuming false negative to false positive costs ratio≥3:1, the optimal ELISA optical density threshold allowing minimization of misclassification costs was 0.25. In conclusion, both TUS and PAG showed excellent accuracy for pregnancy diagnosis in dairy cows. When using the PAG test, a threshold of 0.25 could be used for test interpretation. Copyright © 2017 Elsevier B.V. All rights reserved.
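Under the conditional-independence assumption used in the latent class model, the likelihood of each joint test outcome is a prevalence-weighted mixture. A sketch using the abstract's median Se/Sp values follows; the prevalence is assumed, and the full Bayesian machinery (priors, hierarchy, covariates) is not reproduced here:

```python
def cell_probs(prev, se1, sp1, se2, sp2):
    """Joint probabilities of (TUS result, PAG result) pairs, 1 = positive,
    under conditional independence given the latent pregnancy status."""
    probs = {}
    for r1 in (1, 0):
        for r2 in (1, 0):
            p_if_preg = (se1 if r1 else 1 - se1) * (se2 if r2 else 1 - se2)
            p_if_open = ((1 - sp1) if r1 else sp1) * ((1 - sp2) if r2 else sp2)
            probs[(r1, r2)] = prev * p_if_preg + (1 - prev) * p_if_open
    return probs

# Median estimates from the abstract; prevalence of 0.6 is an assumption
p = cell_probs(prev=0.6, se1=0.96, sp1=0.99, se2=0.99, sp2=0.95)
disc = p[(1, 0)] + p[(0, 1)]   # expected fraction of discordant cows
```

With these values the expected discordance is around 5%, broadly consistent with the 23/519 (about 4.4%) discordant cows observed; the LCM inverts this relationship, inferring Se and Sp from the observed cell counts.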
Fuzzy Intervals for Designing Structural Signature: An Application to Graphic Symbol Recognition
NASA Astrophysics Data System (ADS)
Luqman, Muhammad Muzzamil; Delalandre, Mathieu; Brouard, Thierry; Ramel, Jean-Yves; Lladós, Josep
The motivation behind our work is to present a new methodology for symbol recognition. The proposed method employs a structural approach for representing visual associations in symbols and a statistical classifier for recognition. We vectorize a graphic symbol, encode its topological and geometrical information by an attributed relational graph, and compute a signature from this structural graph. We address the sensitivity of structural representations to noise by using data-adapted fuzzy intervals. The joint probability distribution of signatures is encoded by a Bayesian network, which serves as a mechanism for pruning irrelevant features and choosing a subset of interesting features from the structural signatures of the underlying symbol set. The Bayesian network is deployed in a supervised learning scenario for recognizing query symbols. The method has been evaluated for robustness against degradations and deformations on pre-segmented 2D linear architectural and electronic symbols from the GREC databases, and for its recognition abilities on symbols with context noise, i.e. cropped symbols.
Modeling stream fish distributions using interval-censored detection times.
Ferreira, Mário; Filipe, Ana Filipa; Bardos, David C; Magalhães, Maria Filomena; Beja, Pedro
2016-08-01
Controlling for imperfect detection is important for developing species distribution models (SDMs). Occupancy-detection models based on the time needed to detect a species can be used to address this problem, but this is hindered when times to detection are not known precisely. Here, we extend the time-to-detection model to deal with detections recorded in time intervals and illustrate the method using a case study on stream fish distribution modeling. We collected electrofishing samples of six fish species across a Mediterranean watershed in Northeast Portugal. Based on a Bayesian hierarchical framework, we modeled the probability of water presence in stream channels, and the probability of species occupancy conditional on water presence, in relation to environmental and spatial variables. We also modeled time-to-first detection conditional on occupancy in relation to local factors, using modified interval-censored exponential survival models. Posterior distributions of occupancy probabilities derived from the models were used to produce species distribution maps. Simulations indicated that the modified time-to-detection model provided unbiased parameter estimates despite interval-censoring. There was a tendency for spatial variation in detection rates to be primarily influenced by depth and, to a lesser extent, stream width. Species occupancies were consistently affected by stream order, elevation, and annual precipitation. Bayesian P-values and AUCs indicated that all models had adequate fit and high discrimination ability, respectively. Mapping of predicted occupancy probabilities showed widespread distribution by most species, but uncertainty was generally higher in tributaries and upper reaches. The interval-censored time-to-detection model provides a practical solution to model occupancy-detection when detections are recorded in time intervals. 
This modeling framework is useful for developing SDMs while controlling for variation in detection rates, as it uses simple data that can be readily collected by field ecologists.
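The key quantity in the modified model is simple to write down: if time to first detection is exponential with rate λ conditional on occupancy, the probability that a detection recorded only as falling in an interval (t0, t1] is the difference of two survival terms. A minimal sketch (variable names are ours, not the authors'):

```python
import math

def interval_detection_prob(lam, t0, t1):
    """Probability that the first detection of a species that is present
    falls in the recorded interval (t0, t1], under an exponential
    time-to-detection model with rate lam (detections per minute, say)."""
    return math.exp(-lam * t0) - math.exp(-lam * t1)

def nondetection_prob(lam, total_time):
    """Probability of no detection at an occupied site during a survey
    of length total_time."""
    return math.exp(-lam * total_time)

# Detections logged only to coarse time bins during a 5-minute pass:
edges = [0.0, 1.0, 2.0, 5.0]
bin_probs = [interval_detection_prob(0.3, a, b) for a, b in zip(edges, edges[1:])]
```

Because the bin probabilities plus the non-detection probability sum to one, interval censoring costs no probability mass, which is why the simulations recover unbiased parameter estimates.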
Systematic meta-analyses and field synopsis of genetic association studies in colorectal adenomas
Montazeri, Zahra; Theodoratou, Evropi; Nyiraneza, Christine; Timofeeva, Maria; Chen, Wanjing; Svinti, Victoria; Sivakumaran, Shanya; Gresham, Gillian; Cubitt, Laura; Carvajal-Carmona, Luis; Bertagnolli, Monica M; Zauber, Ann G; Tomlinson, Ian; Farrington, Susan M; Dunlop, Malcolm G; Campbell, Harry; Little, Julian
2018-01-01
Background Low-penetrance genetic variants, primarily single nucleotide polymorphisms, have substantial influence on colorectal cancer (CRC) susceptibility. Most CRCs develop from colorectal adenomas (CRA). Here, we report the first comprehensive field synopsis that catalogues all genetic association studies on CRA, with a parallel online database (http://www.chs.med.ed.ac.uk/CRAgene/). Methods We performed a systematic review, screening 9750 titles and extracting data from 130 publications reporting on 181 polymorphisms in 74 genes. We conducted meta-analyses to derive summary effect estimates for 37 polymorphisms in 26 genes. We applied the Venice criteria and the Bayesian False Discovery Probability (BFDP) to assess the credibility of the associations. Results We considered the association with the rs6983267 variant at 8q24 to be “highly credible”, reaching genome-wide statistical significance in at least one meta-analysis model. We identified “less credible” associations (higher heterogeneity, lower statistical power, BFDP>0.02) with a further four variants of four independent genes: MTHFR c.677C>T p.A222V (rs1801133), TP53 c.215C>G p.R72P (rs1042522), NQO1 c.559C>T p.P187S (rs1800566), and NAT1 alleles imputed as fast acetylator genotypes. For the remaining 32 variants of 22 genes for which positive associations with CRA risk had been previously reported, the meta-analyses revealed no credible evidence to support these as true associations. Conclusions The limited number of credible associations between low-penetrance genetic variants and CRA reflects the lower volume of evidence and associated lack of statistical power to detect associations of the magnitude typically observed for genetic variants and chronic diseases. The CRAgene database provides context for CRA genetic association data and will help inform future research directions. PMID:26451011
Socioeconomic inequalities in cause-specific mortality in 15 European cities.
Marí-Dell'Olmo, Marc; Gotsens, Mercè; Palència, Laia; Burström, Bo; Corman, Diana; Costa, Giuseppe; Deboosere, Patrick; Díez, Èlia; Domínguez-Berjón, Felicitas; Dzúrová, Dagmar; Gandarillas, Ana; Hoffmann, Rasmus; Kovács, Katalin; Martikainen, Pekka; Demaria, Moreno; Pikhart, Hynek; Rodríguez-Sanz, Maica; Saez, Marc; Santana, Paula; Schwierz, Cornelia; Tarkiainen, Lasse; Borrell, Carme
2015-05-01
Socioeconomic inequalities are increasingly recognised as an important public health issue, although their role in the leading causes of mortality in urban areas in Europe has not been fully evaluated. In this study, we used data from the INEQ-CITIES study to analyse inequalities in cause-specific mortality in 15 European cities at the beginning of the 21st century. A cross-sectional ecological study was carried out to analyse 9 of the leading specific causes of death in small areas from 15 European cities. Using a hierarchical Bayesian spatial model, we estimated smoothed Standardized Mortality Ratios, relative risks and 95% credible intervals for cause-specific mortality in relation to a socioeconomic deprivation index, separately for men and women. We detected spatial socioeconomic inequalities for most causes of mortality studied, although these inequalities differed markedly between cities, being more pronounced in Northern and Central-Eastern Europe. In the majority of cities, most of these causes of death were positively associated with deprivation among men, with the exception of prostatic cancer. Among women, diabetes, ischaemic heart disease, chronic liver diseases and respiratory diseases were also positively associated with deprivation in most cities. Lung cancer mortality was positively associated with deprivation in Northern European cities and in Kosice, but this association was non-existent or even negative in Southern European cities. Finally, breast cancer risk was inversely associated with deprivation in three Southern European cities. The results confirm the existence of socioeconomic inequalities in many of the main causes of mortality, and reveal variations in their magnitude between different European cities. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
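The smoothing idea behind the hierarchical SMR estimates can be illustrated with the simplest conjugate version: treat each small area's observed deaths as Poisson with mean E·θ and give the relative risk θ a Gamma prior, so sparse areas shrink toward the overall level. This is a deliberately reduced sketch; the INEQ-CITIES analysis used a spatially structured hierarchical Bayesian model, and the prior values here are illustrative.

```python
def smoothed_smr(observed, expected, a=1.0, b=1.0):
    """Posterior mean relative risk for one small area under a conjugate
    Gamma(a, b) prior and Poisson(expected * theta) deaths:
    theta | data ~ Gamma(a + observed, b + expected)."""
    return (a + observed) / (b + expected)

raw = 10 / 5                      # raw SMR: 10 observed deaths vs 5 expected
smoothed = smoothed_smr(10, 5)    # shrunk toward the prior mean of 1
```

The shrinkage is strongest exactly where raw SMRs are least reliable (small expected counts), which stabilizes the maps of cause-specific mortality across the 15 cities' small areas.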
Laurent, Olivier; Ancelet, Sophie; Richardson, David B; Hémon, Denis; Ielsch, Géraldine; Demoury, Claire; Clavel, Jacqueline; Laurier, Dominique
2013-05-01
Previous epidemiological studies and quantitative risk assessments (QRA) have suggested that natural background radiation may be a cause of childhood leukemia. The present work uses a QRA approach to predict the excess risk of childhood leukemia in France related to three components of natural radiation: radon, cosmic rays and terrestrial gamma rays, using excess relative and absolute risk models proposed by the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR). Both models were developed from the Life Span Study (LSS) of Japanese A-bomb survivors. Previous risk assessments were extended by considering uncertainties in radiation-related leukemia risk model parameters as part of this process, within a Bayesian framework. Estimated red bone marrow doses cumulated during childhood by the average French child due to radon, terrestrial gamma and cosmic rays are 4.4, 7.5 and 4.3 mSv, respectively. The excess fractions of cases (expressed as percentages) associated with these sources of natural radiation are 20% [95% credible interval (CI) 0-68%] and 4% (95% CI 0-11%) under the excess relative and excess absolute risk models, respectively. The large CIs, as well as the different point estimates obtained under these two models, highlight the uncertainties in predictions of radiation-related childhood leukemia risks. These results are only valid provided that models developed from the LSS can be transferred to the population of French children and to chronic natural radiation exposures, and must be considered in view of the currently limited knowledge concerning other potential risk factors for childhood leukemia. Lastly, they emphasize the need for further epidemiological investigations of the effects of natural radiation on childhood leukemia to reduce uncertainties and help refine radiation protection standards.
Clifford, Sam; Mazaheri, Mandana; Salimi, Farhad; Ezz, Wafaa Nabil; Yeganeh, Bijan; Low-Choy, Samantha; Walker, Katy; Mengersen, Kerrie; Marks, Guy B; Morawska, Lidia
2018-05-01
It is known that ultrafine particles (UFP, particles smaller than 0.1 μm) can penetrate deep into the lungs and potentially have adverse health effects. However, epidemiological data on the health effects of UFP is limited. Therefore, our objective was to test the hypothesis that exposure to UFPs is associated with respiratory health status and systemic inflammation among children aged 8 to 11 years. We conducted a cross-sectional study among 655 children (43.3% male) attending 25 primary (elementary) schools in the Brisbane Metropolitan Area, Australia. Ultrafine particle number concentration (PNC) was measured at each school and modelled at homes using Land Use Regression to derive exposure estimates. Health outcomes were respiratory symptoms and diagnoses, measured by parent-completed questionnaire, spirometric lung function, exhaled nitric oxide (FeNO), and serum C-reactive protein (CRP). Exposure-response models, adjusted for potential personal and environmental confounders measured at the individual, home and school level, were fitted using Bayesian methods. PNC was not independently associated with respiratory symptoms, asthma diagnosis or spirometric lung function. However, PNC was positively associated with an increase in CRP (1.188-fold change per 1000 UFP cm⁻³·day/day (95% credible interval 1.077 to 1.299)) and an increase in FeNO among atopic participants (1.054-fold change per 1000 UFP cm⁻³·day/day (95% CrI 1.005 to 1.106)). UFPs do not affect respiratory health outcomes in children but do have systemic effects, detected here in the form of a positive association with a biomarker for systemic inflammation. This is consistent with the known propensity of UFPs to penetrate deep into the lung and circulatory system. Copyright © 2018 Elsevier Ltd. All rights reserved.
Young, Jim; Rossi, Carmine; Gill, John; Walmsley, Sharon; Cooper, Curtis; Cox, Joseph; Martel-Laferriere, Valerie; Conway, Brian; Pick, Neora; Vachon, Marie-Louise
2017-01-01
Background. Highly effective hepatitis C virus (HCV) therapies have spurred a scale-up of treatment to populations at greater risk of reinfection after sustained virologic response (SVR). Reinfection may be higher in HIV–HCV coinfection, but prior studies have considered small selected populations. We assessed risk factors for reinfection after SVR in a representative cohort of Canadian coinfected patients in clinical care. Methods. All patients achieving SVR after HCV treatment were followed with HCV RNA measurements every 6 months in a prospective cohort study. We used Bayesian Cox regression to estimate reinfection rates according to patient-reported injection drug use (IDU) and sexual activity among men who have sex with men (MSM). Results. Of 497 patients treated for HCV, 257 achieved SVR and had at least 1 subsequent RNA measurement. During 589 person-years of follow-up (PYFU) after SVR, 18 (7%) became HCV RNA positive. The adjusted reinfection rate (per 1000 PYFU) in the first year after SVR was highest in those who reported high-frequency IDU (58; 95% credible interval [CrI], 18–134), followed by MSM reporting high-risk sexual activity (26; 95% CrI, 6–66) and low-frequency IDU (22; 95% CrI, 4–68). The rate in low-risk MSM (16; 95% CrI, 4–38) was similar to that in reference patients (10; 95% CrI, 4–20). Reinfection rates did not diminish with time. Conclusions. HCV reinfection rates varied according to risk. Measures are needed to reduce risk behaviors and increase monitoring in high-risk IDU and MSM if HCV elimination targets are to be realized. PMID:28199495
Karolemeas, Katerina; de la Rua-Domenech, Ricardo; Cooper, Roderick; Goodchild, Anthony V; Clifton-Hadley, Richard S; Conlan, Andrew J K; Mitchell, Andrew P; Hewinson, R Glyn; Donnelly, Christl A; Wood, James L N; McKinley, Trevelyan J
2012-01-01
Bovine tuberculosis (bTB) is one of the most serious economic animal health problems affecting the cattle industry in Great Britain (GB), with incidence in cattle herds increasing since the mid-1980s. The single intradermal comparative cervical tuberculin (SICCT) test is the primary screening test in the bTB surveillance and control programme in GB and Ireland. The sensitivity (ability to detect infected cattle) of this test is central to the efficacy of the current testing regime, but most previous studies that have estimated test sensitivity (relative to the number of slaughtered cattle with visible lesions [VL] and/or positive culture results) lacked post-mortem data for SICCT test-negative cattle. The slaughter of entire herds ("whole herd slaughters" or "depopulations") infected by bTB is occasionally conducted in GB as a last-resort control measure to resolve intractable bTB herd breakdowns. These slaughters provide additional post-mortem data for SICCT test-negative cattle, allowing a rare opportunity to calculate the animal-level sensitivity of the test relative to the total number of SICCT test-positive and test-negative VL animals identified post-mortem (rSe). In this study, data were analysed from 16 whole herd slaughters (748 SICCT test-positive and 1031 SICCT test-negative cattle) conducted in GB between 1988 and 2010, using a Bayesian hierarchical model. The overall rSe estimate of the SICCT test at the severe interpretation was 85% (95% credible interval [CI]: 78-91%), and at the standard interpretation was 81% (95% CI: 70-89%). These estimates are more robust than those previously reported in GB due to the inclusion of post-mortem data from SICCT test-negative cattle.
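For a single proportion like rSe, the Bayesian building block is a Beta posterior for successes out of trials; the hierarchical model combines several such herd-level proportions. The counts below are illustrative, not the study's whole-herd data, and a uniform Beta(1, 1) prior is assumed.

```python
import random

def beta_posterior_mean(successes, trials, a=1.0, b=1.0):
    """Posterior mean of a detection proportion under a Beta(a, b) prior."""
    return (a + successes) / (a + b + trials)

def beta_credible_interval(successes, trials, a=1.0, b=1.0,
                           level=0.95, draws=50_000, seed=42):
    """Equal-tailed credible interval via Monte Carlo draws from the
    Beta(a + successes, b + trials - successes) posterior."""
    rng = random.Random(seed)
    sample = sorted(rng.betavariate(a + successes, b + trials - successes)
                    for _ in range(draws))
    lo = sample[int((1 - level) / 2 * draws)]
    hi = sample[int((1 + level) / 2 * draws)]
    return lo, hi

# Hypothetical herd: 85 of 100 VL animals were SICCT test-positive.
mean = beta_posterior_mean(85, 100)
lo, hi = beta_credible_interval(85, 100)
```

The hierarchical version lets each herd have its own sensitivity drawn from a shared distribution, which is how between-herd variation enters the pooled 85% estimate and its interval.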
Barton, Christine M.; Zirkle, Keith W.; Greene, Caitlin F.; Newman, Kara B.
2018-01-01
Collisions with glass are a serious threat to avian life and are estimated to kill hundreds of millions of birds per year in the United States. We monitored 22 buildings at the Virginia Tech Corporate Research Center (VTCRC) in Blacksburg, Virginia, for collision fatalities from October 2013 through May 2015 and explored possible effects exerted by glass area and surrounding land cover on avian mortality. We documented 240 individuals representing 55 identifiable species that died due to collisions with windows at the VTCRC. The relative risk of fatal collisions at all buildings over the study period was estimated using a Bayesian hierarchical zero-inflated Poisson model adjusting for the percentage of tree and lawn cover within 50 m of buildings, as well as for glass area. We found significant relationships between fatalities and surrounding lawn area (relative risk: 0.96, 95% credible interval: 0.93, 0.98) as well as glass area on buildings (RR: 1.30, 95% CI [1.05–1.65]). The model also found a moderately significant relationship between fatal collisions and the percent land cover of ornamental trees surrounding buildings (RR = 1.02, 95% CI [1.00–1.05]). Every building surveyed had at least one recorded collision death. Our findings indicate that birds collide with VTCRC windows during the summer breeding season in addition to spring and fall migration. The Ruby-throated Hummingbird (Archilochus colubris) was the most common window collision species and accounted for 10% of deaths. Though research has identified various correlates of fatal bird-window collisions, such studies rarely culminate in mitigation. We hope our study brings attention, and ultimately action, to address this significant threat to birds at the VTCRC and elsewhere. PMID:29637021
Tuberculosis infection among young nursing trainees in South India.
Christopher, Devasahayam J; Daley, Peter; Armstrong, Lois; James, Prince; Gupta, Richa; Premkumar, Beulah; Michael, Joy Sarojini; Radha, Vedha; Zwerling, Alice; Schiller, Ian; Dendukuri, Nandini; Pai, Madhukar
2010-04-29
Among healthcare workers in developing countries, nurses spend a large amount of time in direct contact with tuberculosis (TB) patients, and are at high risk for acquisition of TB infection and disease. To better understand the epidemiology of nosocomial TB among nurses, we recruited a cohort of young nursing trainees at Christian Medical College, a large, tertiary medical school hospital in Southern India. Among 535 nursing students enrolled in 2007, 468 gave consent to participate, and 436 underwent two-step tuberculin skin testing (TST). A majority (95%) were females, and almost 80% were under 22 years of age. Detailed TB exposure information was obtained using interviews and clinical log books. Prevalence of latent TB infection (LTBI) was estimated using Bayesian latent class analyses (LCA). Logistic regression analyses were done to determine the association between LTBI prevalence and TB exposure and risk factors. 219 of 436 students (50.2%, 95% CI: 45.4-55.0) were TST positive using the 10 mm or greater cut-off. Based on the LCA, the prevalence of LTBI was 47.8% (95% credible interval 17.8% to 65.6%). In the multivariate analysis, TST positivity was strongly associated with time spent in health care, after adjusting for age at entry into healthcare. Our study showed a high prevalence of LTBI even in young nursing trainees. With the recent TB infection control (TBIC) policy guidance from the World Health Organization as the reference, Indian healthcare providers and the Indian Revised National TB Control Programme will need to implement TBIC interventions, and enhance capacity for TBIC at the country level. Young trainees and nurses, in particular, will need to be targeted for TBIC interventions.
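Latent class analysis is needed here because the TST is itself imperfect, so the 50.2% test-positive figure need not equal true LTBI prevalence. The classical closed-form version of this correction (Rogan-Gladen) shows the idea that the Bayesian LCA handles with full uncertainty propagation; the sensitivity and specificity values below are illustrative, not the study's estimates.

```python
def apparent_prevalence(true_prev, se, sp):
    """Expected test-positive fraction given true prevalence, Se and Sp."""
    return true_prev * se + (1.0 - true_prev) * (1.0 - sp)

def rogan_gladen(apparent_prev, se, sp):
    """Back out true prevalence from the test-positive fraction:
    p = (AP + Sp - 1) / (Se + Sp - 1)."""
    return (apparent_prev + sp - 1.0) / (se + sp - 1.0)

# With illustrative Se = 0.90, Sp = 0.85, a true prevalence near the
# LCA estimate of 47.8% produces an apparent prevalence near the
# observed 50.2% TST positivity.
ap = apparent_prevalence(0.478, se=0.90, sp=0.85)
recovered = rogan_gladen(ap, se=0.90, sp=0.85)
```

The Bayesian LCA replaces the plug-in Se/Sp with prior distributions, which is why its credible interval for LTBI prevalence (17.8% to 65.6%) is much wider than the binomial interval around the raw TST proportion.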
Aoki, Takuya; Yamamoto, Yosuke; Ikenoue, Tatsuyoshi; Kaneko, Makoto; Kise, Morito; Fujinuma, Yasuki; Fukuhara, Shunichi
2018-05-01
In discussing how best to implement the gatekeeping function of primary care, it is useful to identify the factors that lead patients to bypass their primary care gatekeepers when seeking care. Our objective was to examine the association between patients' experience with their primary care physicians and bypassing them to directly obtain care from higher-level healthcare facilities. This prospective cohort study was conducted in 13 primary care clinics in Japan. We assessed patient experience of primary care using the Japanese version of the Primary Care Assessment Tool (JPCAT), which comprises six domains: first contact, longitudinality, coordination, comprehensiveness (services available), comprehensiveness (services provided), and community orientation. The primary outcome was the patient bypassing their usual primary care physician to seek care at a hospital at least once in a year. We used a Bayesian hierarchical model to adjust for clustering within clinics and individual covariates. Data were analyzed from 205 patients for whom a physician at a clinic served as their usual primary care physician. The patient follow-up rate was 80.1%. After adjustment for patients' sociodemographic and health status characteristics, the JPCAT total score was found to be inversely associated with patient bypass behavior (odds ratio per 1 SD increase, 0.44; 95% credible interval, 0.21-0.88). The results of various sensitivity analyses were consistent with those of the primary analysis. We found that patient experience of primary care in Japan was inversely associated with bypassing a primary care gatekeeper to seek care at higher-level healthcare facilities, such as hospitals. Our findings suggest that primary care providers' efforts to improve patient experience should help to ensure appropriate use of healthcare services under loosely regulated gatekeeping systems; further studies are warranted.
Adamina, Michel; Kehlet, Henrik; Tomlinson, George A; Senagore, Anthony J; Delaney, Conor P
2011-06-01
Health care systems provide care to increasingly complex and elderly patients. Colorectal surgery is a prime example, with high volumes of major procedures, significant morbidity, prolonged hospital stays, and unplanned readmissions. This situation is exacerbated by an exponential rise in costs that threatens the stability of health care systems. Enhanced recovery pathways (ERP) have been proposed as a means to reduce morbidity and improve effectiveness of care. We have reviewed the evidence supporting the implementation of ERP in clinical practice. Medline, Embase, and the Cochrane library were searched for randomized, controlled trials comparing ERP with traditional care in colorectal surgery. Systematic reviews and papers on ERP based on data published in major surgical and anesthesiology journals were critically reviewed by international contributors experienced in the development and implementation of ERP. A random-effect Bayesian meta-analysis was performed, including 6 randomized, controlled trials totalling 452 patients. For patients adhering to ERP, length of stay decreased by 2.5 days (95% credible interval [CrI] -3.92 to -1.11), whereas 30-day morbidity was halved (relative risk, 0.52; 95% CrI, 0.36-0.73) and readmission was not increased (relative risk, 0.59; 95% CrI, 0.14-1.43) when compared with patients undergoing traditional care. Adherence to ERP achieves a reproducible improvement in the quality of care by enabling standardization of health care processes. Thus, while accelerating recovery and safely reducing hospital stay, ERPs optimize utilization of health care resources. ERPs can and should be routinely used in care after colorectal and other major gastrointestinal procedures. Copyright © 2011 Mosby, Inc. All rights reserved.
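The random-effects structure behind the pooled length-of-stay estimate can be sketched in its simplest form: each trial's effect is weighted by the inverse of its sampling variance plus a between-trial variance τ², and under a flat prior on the mean effect the posterior for that mean is Gaussian around the weighted average. The six effects, variances, and τ² below are hypothetical stand-ins, not the trial data.

```python
import math

def pooled_effect(effects, variances, tau2):
    """Precision-weighted pooled mean under a normal random-effects model
    with known between-study variance tau2 and a flat prior on the mean;
    returns (posterior mean, posterior sd)."""
    weights = [1.0 / (v + tau2) for v in variances]
    total = sum(weights)
    mean = sum(w * y for w, y in zip(weights, effects)) / total
    return mean, math.sqrt(1.0 / total)

# Six hypothetical trials reporting length-of-stay differences (days):
effects = [-2.0, -3.1, -1.5, -2.8, -2.4, -3.0]
variances = [0.6, 0.9, 0.5, 1.1, 0.7, 0.8]
mu, sd = pooled_effect(effects, variances, tau2=0.25)
interval = (mu - 1.96 * sd, mu + 1.96 * sd)
```

A fully Bayesian analysis additionally places a prior on τ² and integrates over it, which widens the credible interval when only six trials inform the between-trial heterogeneity.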
An empirically derived three-dimensional Laplace resonance in the Gliese 876 planetary system
NASA Astrophysics Data System (ADS)
Nelson, Benjamin E.; Robertson, Paul M.; Payne, Matthew J.; Pritchard, Seth M.; Deck, Katherine M.; Ford, Eric B.; Wright, Jason T.; Isaacson, Howard T.
2016-01-01
We report constraints on the three-dimensional orbital architecture for all four planets known to orbit the nearby M dwarf Gliese 876, based solely on Doppler measurements and demanding long-term orbital stability. Our data set incorporates publicly available radial velocities taken with the ELODIE and CORALIE spectrographs, the High Accuracy Radial velocity Planet Searcher (HARPS), and the Keck High Resolution Echelle Spectrometer (HIRES), as well as previously unpublished HIRES velocities. We first quantitatively assess the validity of the planets thought to orbit GJ 876 by computing the Bayes factors for a variety of different coplanar models using an importance sampling algorithm. We find that a four-planet model is preferred over a three-planet model. Next, we apply a Newtonian Markov chain Monte Carlo algorithm to perform a Bayesian analysis of the planet masses and orbits using an N-body model in three-dimensional space. Based on the radial velocities alone, we find that a 99 per cent credible interval provides upper limits on the mutual inclinations for the three resonant planets (Φcb < 6.20° for the {c} and {b} pair and Φbe < 28.5° for the {b} and {e} pair). Subsequent dynamical integrations of our posterior sample find that the GJ 876 planets must be roughly coplanar (Φcb < 2.60° and Φbe < 7.87°), suggesting that the amount of planet-planet scattering in the system has been low. We investigate the distribution of the respective resonant arguments of each planet pair and find that at least one argument for each planet pair, as well as the Laplace argument, librates. The libration amplitudes in our three-dimensional orbital model support the idea of the outer three planets having undergone significant past disc migration.
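Bayes factors like those used to compare the three- and four-planet models are ratios of marginal likelihoods, which importance sampling estimates as a weighted average of the likelihood over draws from a proposal density. A toy one-parameter version (Gaussian prior and likelihood, prior used as the proposal; nothing here is the paper's radial-velocity model) where the answer is known analytically:

```python
import math, random

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def marginal_likelihood_is(x, draws=100_000, seed=7):
    """Importance-sampling estimate of the evidence Z = ∫ L(x|θ) p(θ) dθ
    for theta ~ N(0, 1) and x | theta ~ N(theta, 1), using the prior as
    the importance density (so each weight is just the likelihood)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(draws):
        theta = rng.gauss(0.0, 1.0)
        total += normal_pdf(x, theta, 1.0)
    return total / draws

z_hat = marginal_likelihood_is(1.0)
z_exact = normal_pdf(1.0, 0.0, 2.0)   # analytic marginal: N(0, 1 + 1)
```

A Bayes factor is the ratio of two such evidence estimates (e.g. four-planet vs three-planet model); in high-dimensional orbital models the proposal must be chosen far more carefully than the prior used here.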
Dempsey, R L; Layde, P M; Laud, P W; Guse, C E; Hargarten, S W
2005-04-01
To describe the incidence and patterns of sports and recreation-related injuries resulting in inpatient hospitalization in Wisconsin. Although much sports and recreation-related injury research has focused on the emergency department setting, little is known about the scope or characteristics of more severe sports injuries resulting in hospitalization. The Wisconsin Bureau of Health Information (BHI) maintains hospital inpatient discharge data through a statewide mandatory reporting system. The database contains demographic and health information on all patients hospitalized in acute care non-federal hospitals in Wisconsin. The authors developed a classification scheme based on the International Classification of Diseases external cause of injury code (E code) to identify hospitalizations for sports and recreation-related injuries from the BHI data files (2000). Due to the uncertainty within E codes in specifying sports and recreation-related injuries, the authors used Bayesian analysis to model the incidence of these types of injuries. There were 1714 (95% credible interval 1499 to 2022) sports and recreation-related injury hospitalizations in Wisconsin in 2000 (32.0 per 100,000 population). The most common mechanisms of injury were being struck by/against an object in sports (6.4 per 100,000 population) and pedal cycle riding (6.2 per 100,000). Ten- to 19-year-olds had the highest rate of sports and recreation-related injury hospitalization (65.3 per 100,000 population), and males overall had a rate four times higher than that of females. Over 1700 sports and recreation-related injuries occurred in Wisconsin in 2000 that were treated during an inpatient hospitalization. Sports and recreation activities result in a substantial number of serious, as well as minor, injuries. Prevention efforts aimed at reducing injuries while continuing to promote participation in physical activity for all ages are critical.
Wood, Douglas E; Nader, Daniel A; Springmeyer, Steven C; Elstad, Mark R; Coxson, Harvey O; Chan, Andrew; Rai, Navdeep S; Mularski, Richard A; Cooper, Christopher B; Wise, Robert A; Jones, Paul W; Mehta, Atul C; Gonzalez, Xavier; Sterman, Daniel H
2014-10-01
Lung volume reduction surgery improves quality of life, exercise capacity, and survival in selected patients but is accompanied by significant morbidity. Bronchoscopic approaches may provide similar benefits with less morbidity. In a randomized, sham-procedure-controlled, double-blind trial, 277 subjects were enrolled at 36 centers. Patients had emphysema, airflow obstruction, hyperinflation, and severe dyspnea. The primary effectiveness measures were a significant improvement in disease-related quality of life (St. George's Respiratory Questionnaire) and changes in lobar lung volumes. The primary safety measure was a comparison of serious adverse events. There were 6/121 (5.0%) responders in the treatment group at 6 months, significantly more than the 1/134 (0.7%) in the control group [Bayesian credible interval (BCI), 0.05%, 9.21%]. Lobar volume changes were significantly different, with an average decrease in the treated lobes of -224 mL compared with -17 mL for the control group (BCI, -272, -143). The proportion of responders on the St. George's Respiratory Questionnaire was not greater in the treatment group. There were significantly more subjects with a serious adverse event in the treatment group (n=20, or 14.1%) compared with the control group (n=5, or 3.7%) (BCI, 4.0, 17.1), but most events were neither procedure nor device related. The trial was a technical and statistical success, but partial-bilateral endobronchial valve occlusion did not produce clinically meaningful results. Safety results were acceptable and compare favorably to lung volume reduction surgery and other bronchial valve studies. Further studies need to focus on improved patient selection and a different treatment algorithm. ClinicalTrials.gov NCT00475007.
Body density and diving gas volume of the northern bottlenose whale (Hyperoodon ampullatus).
Miller, Patrick; Narazaki, Tomoko; Isojunno, Saana; Aoki, Kagari; Smout, Sophie; Sato, Katsufumi
2016-08-15
Diving lung volume and tissue density, reflecting lipid store volume, are important physiological parameters that have only been estimated for a few breath-hold diving species. We fitted 12 northern bottlenose whales with data loggers that recorded depth, 3-axis acceleration and speed, either with a fly-wheel or from the change of depth corrected by pitch angle. We fitted measured values of the change in speed during 5 s descent and ascent glides to a hydrodynamic model of drag and buoyancy forces using a Bayesian estimation framework. The resulting estimate of diving gas volume was 27.4±4.2 ml kg⁻¹ (95% credible interval, CI), closely matching the measured lung capacity of the species. Dive-by-dive variation in gas volume did not correlate with dive depth or duration. Estimated body densities of individuals ranged from 1028.4 to 1033.9 kg m⁻³ at the sea surface, indicating overall negative tissue buoyancy of this species in seawater. Body density estimates were highly precise, with 95% CI widths ranging from 0.1 to 0.4 kg m⁻³, which would equate to a precision of <0.5% of lipid content based upon extrapolation from the elephant seal. Six whales tagged near Jan Mayen (Norway, 71°N) had lower body density and were closer to neutral buoyancy than six whales tagged in the Gully (Nova Scotia, Canada, 44°N), a difference that was consistent with the amount of gliding observed during ascent versus descent phases in these animals. Implementation of this approach using longer-duration tags could be used to track longitudinal changes in body density and lipid-store body condition of free-ranging cetaceans. © 2016. Published by The Company of Biologists Ltd.
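The force balance underlying the glide model can be sketched with two ingredients: Boyle's-law compression of the diving gas (ambient pressure grows by roughly one atmosphere per 10 m of seawater) and Archimedean buoyancy of tissue plus gas. Drag and acceleration are omitted here, and the parameter values are illustrative rather than the paper's estimates:

```python
def gas_volume_at_depth(surface_volume_m3, depth_m):
    """Boyle's law: gas volume shrinks with ambient pressure, which
    increases by ~1 atm per 10 m of seawater."""
    return surface_volume_m3 / (1.0 + depth_m / 10.0)

def net_buoyancy(mass_kg, tissue_density, gas_ml_per_kg, depth_m,
                 rho_sw=1027.0, g=9.81):
    """Net vertical force (N, positive = upward): weight of displaced
    seawater minus body weight; the mass of the gas is neglected."""
    tissue_vol = mass_kg / tissue_density                # m^3
    gas_surface = gas_ml_per_kg * mass_kg * 1e-6         # m^3 at surface
    gas_vol = gas_volume_at_depth(gas_surface, depth_m)
    return g * (rho_sw * (tissue_vol + gas_vol) - mass_kg)

# Hypothetical 7-tonne whale with tissue denser than seawater:
shallow = net_buoyancy(7000, tissue_density=1031.0, gas_ml_per_kg=27.4, depth_m=1)
deep = net_buoyancy(7000, tissue_density=1031.0, gas_ml_per_kg=27.4, depth_m=500)
```

Because gas compresses while tissue density stays nearly constant, net buoyancy changes sign with depth; the depth-dependence of glide accelerations is what lets the Bayesian fit separate gas volume from tissue density.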
Distribution of Marburg virus in Africa: An evolutionary approach.
Zehender, Gianguglielmo; Sorrentino, Chiara; Veo, Carla; Fiaschi, Lisa; Gioffrè, Sonia; Ebranati, Erika; Tanzi, Elisabetta; Ciccozzi, Massimo; Lai, Alessia; Galli, Massimo
2016-10-01
The aim of this study was to investigate the origin and geographical dispersion of Marburg virus, the first member of the Filoviridae family to be discovered. Seventy-three complete genome sequences of Marburg virus isolated from animals and humans were retrieved from public databases and analysed using a Bayesian phylogeographical framework. The phylogenetic tree of the Marburg virus data set showed two significant evolutionary lineages: Ravn virus (RAVV) and Marburg virus (MARV). MARV divided into two main clades; clade A included isolates from Uganda (five from the European epidemic in 1967), Kenya (1980) and Angola (from the epidemic of 2004-2005); clade B included most of the isolates obtained during the 1999-2000 epidemic in the Democratic Republic of the Congo (DRC) and a group of Ugandan isolates obtained in 2007-2009. The estimated mean evolutionary rate of the whole genome was 3.3×10⁻⁴ substitutions/site/year (credibility interval 2.0-4.8). The MARV strain had a mean root time of the most recent common ancestor of 177.9 years ago (YA) (95% highest posterior density 87-284), thus indicating that it probably originated in the mid-19th century, whereas the RAVV strain had a later origin dating back to a mean of 33.8 YA. The most probable location of the MARV ancestor was Uganda (state posterior probability, spp=0.41), whereas that of the RAVV ancestor was Kenya (spp=0.71). There were significant migration rates from Uganda to the DRC (Bayes factor, BF=42.0) and in the opposite direction (BF=5.7). Our data suggest that Uganda may have been the cradle of Marburg virus in Africa. Copyright © 2016 Elsevier B.V. All rights reserved.
Lyons, James E.; Royle, J. Andrew; Thomas, Susan M.; Elliott-Smith, Elise; Evenson, Joseph R.; Kelly, Elizabeth G.; Milner, Ruth L.; Nysewander, David R.; Andres, Brad A.
2012-01-01
Large-scale monitoring of bird populations is often based on count data collected across spatial scales that may include multiple physiographic regions and habitat types. Monitoring at large spatial scales may require multiple survey platforms (e.g., from boats and land when monitoring coastal species) and multiple survey methods. It becomes especially important to explicitly account for detection probability when analyzing count data that have been collected using multiple survey platforms or methods. We evaluated a new analytical framework, N-mixture models, to estimate actual abundance while accounting for multiple detection biases. During May 2006, we made repeated counts of Black Oystercatchers (Haematopus bachmani) from boats in the Puget Sound area of Washington (n = 55 sites) and from land along the coast of Oregon (n = 56 sites). We used a Bayesian analysis of N-mixture models to (1) assess detection probability as a function of environmental and survey covariates and (2) estimate total Black Oystercatcher abundance during the breeding season in the two regions. Probability of detecting individuals during boat-based surveys was 0.75 (95% credible interval: 0.42–0.91) and was not influenced by tidal stage. Detection probability from surveys conducted on foot was 0.68 (0.39–0.90), which was not influenced by fog, wind, or number of observers but was ~35% lower during rain. The estimated population size was 321 birds (262–511) in Washington and 311 (276–382) in Oregon. N-mixture models provide a flexible framework for modeling count data and covariates in large-scale bird monitoring programs designed to understand population change.
A bias-adjusted evidence synthesis of RCT and observational data: the case of total hip replacement.
Schnell-Inderst, Petra; Iglesias, Cynthia P; Arvandi, Marjan; Ciani, Oriana; Matteucci Gothe, Raffaella; Peters, Jaime; Blom, Ashley W; Taylor, Rod S; Siebert, Uwe
2017-02-01
Evaluation of clinical effectiveness of medical devices differs in some aspects from the evaluation of pharmaceuticals. One of the main challenges identified is lack of robust evidence and a willingness to make use of experimental and observational studies (OSs) in quantitative evidence synthesis accounting for internal and external biases. Using a case study of total hip replacement to compare the risk of revision of cemented and uncemented implant fixation modalities, we pooled treatment effect estimates from OSs and RCTs, and simplified existing methods for bias-adjusted evidence synthesis to enhance practical application. We performed an elicitation exercise using methodological and clinical experts to determine the strength of beliefs about the magnitude of internal and external bias affecting estimates of treatment effect. We incorporated the bias-adjusted treatment effects into a generalized evidence synthesis, calculating both frequentist and Bayesian statistical models. We estimated relative risks as summary effect estimates with 95% confidence/credibility intervals to capture uncertainty. When we compared alternative approaches to synthesizing evidence, we found that the pooled effect size strongly depended on the inclusion of observational data as well as on the use of bias-adjusted estimates. We demonstrated the feasibility of using observational studies in meta-analyses to complement RCTs and incorporate evidence from a wider spectrum of clinically relevant studies and healthcare settings. To ensure internal validity, OS data require sufficient correction for confounding and selection bias, either through study design and primary analysis, or by applying post-hoc bias adjustments to the results. © 2017 The Authors. Health Economics published by John Wiley & Sons, Ltd.
Neighbourhood Walkability and Daily Steps in Adults with Type 2 Diabetes
Hajna, Samantha; Ross, Nancy A.; Joseph, Lawrence; Harper, Sam; Dasgupta, Kaberi
2016-01-01
Introduction There is evidence that greater neighbourhood walkability (i.e., neighbourhoods with more amenities and well-connected streets) is associated with higher levels of total walking in Europe and in Asia, but it remains unclear if this association holds in the Canadian context and in chronic disease populations. We examined the relationships of different walkability measures to biosensor-assessed total walking (i.e., steps/day) in adults with type 2 diabetes living in Montreal (QC, Canada). Materials and Methods Participants (60.5±10.4 years; 48.1% women) were recruited through McGill University-affiliated clinics (June 2006 to May 2008). Steps/day were assessed once per season for one year with pedometers. Neighbourhood walkability was evaluated through participant reports, in-field audits, Geographic Information Systems (GIS)-derived measures, and the Walk Score®. Relationships between walkability and daily steps were estimated using Bayesian longitudinal hierarchical linear regression models (n = 131). Results Participants who reported living in the most compared to the least walkable neighbourhoods completed 1345 more steps/day (95% Credible Interval: 718, 1976; Quartiles 4 versus 1). Those living in the most compared to the least walkable neighbourhoods (based on GIS-derived walkability) completed 606 more steps per day (95% CrI: 8, 1203). No statistically significant associations with steps were observed for audit-assessed walkability or the Walk Score®. Conclusions Adults with type 2 diabetes who perceived their neighbourhoods as more walkable accumulated more daily steps. This suggests that knowledge of local neighbourhood features that enhance walking is a meaningful predictor of higher levels of walking and an important component of neighbourhood walkability. PMID:26991308
Quantifying Uncertainty in the Greenland Surface Mass Balance Elevation Feedback
NASA Astrophysics Data System (ADS)
Edwards, T.
2015-12-01
As the shape of the Greenland ice sheet responds to changes in surface mass balance (SMB) and dynamics, it affects the surface mass balance through the atmospheric lapse rate and by altering atmospheric circulation patterns. Positive degree day models include simplified representations of this feedback, but it is difficult to simulate with state-of-the-art models because it requires coupling of regional climate models with dynamical ice sheet models, which is technically challenging. This difficulty, along with the high computational expense of regional climate models, drastically limits opportunities for exploring the impact of modelling uncertainties on sea level projections. We present a parameterisation of the SMB-elevation feedback in the MAR regional climate model that provides a far easier and quicker estimate than atmosphere-ice sheet model coupling, which can be used with any ice sheet model. This allows us to use ensembles of different parameter values and ice sheet models to assess the effect of uncertainty in the feedback and ice sheet model structure on future sea level projections. We take a Bayesian approach to uncertainty in the feedback parameterisation, scoring the results from multiple possible "SMB lapse rates" according to how well they reproduce a MAR simulation with altered ice sheet topography. We test the impact of the resulting parameterisation on sea level projections using five ice sheet models forced by MAR (in turn forced by two different global climate models) under the emissions scenario A1B. The estimated additional sea level contribution due to the SMB-elevation feedback is 4.3% at 2100 (95% credibility interval 1.8-6.9%), and 9.6% at 2200 (3.6-16.0%).
Wang, Wei-Ting; You, Li-Kai; Chiang, Chern-En; Sung, Shih-Hsien; Chuang, Shao-Yuan; Cheng, Hao-Min; Chen, Chen-Huan
2016-01-01
Abstract Hypertension is the most important risk factor for stroke and stroke recurrence. However, the preferred blood pressure (BP)-lowering drug class for patients who have suffered from a stroke has yet to be determined. To investigate the relative effects of BP-lowering therapies [angiotensin-converting enzyme inhibitor (ACEI), angiotensin receptor blockers (ARB), β blockers, calcium channel blockers (CCBs), diuretics, and combinations of these drugs] in patients with a prior stroke history, we performed a systematic review and meta-analysis using both traditional frequentist and Bayesian random-effects models and meta-regression of randomized controlled trials (RCTs) on the outcomes of recurrent stroke, coronary heart disease (CHD), and any major adverse cardiac and cerebrovascular events (MACCE). Trials were identified from searches of published hypertension guidelines, electronic databases, and previous systematic reviews. Fifteen RCTs comprising 39,329 participants with previous stroke were identified. Compared with the placebo, only ACEI along with diuretics significantly reduced recurrent stroke events [odds ratio (OR) = 0.54, 95% credibility interval (95% CI) 0.33–0.90]. On the basis of the distribution of posterior probabilities, the treatment ranking consistently identified ACEI along with diuretics as the preferred BP-lowering strategy for the reduction of recurrent stroke and CHD (31% and 35%, respectively). For preventing MACCE, diuretics appeared to be the preferred agent for stroke survivors (34%). Moreover, the meta-regression analysis failed to demonstrate a statistically significant association between BP reduction and any of the outcomes (P = 0.1618 for total stroke, 0.4933 for CHD, and 0.2411 for MACCE). Evidence from RCTs supports the use of diuretic-based treatment, especially when combined with ACEI, for the secondary prevention of recurrent stroke and any vascular events in patients who have suffered from stroke. PMID:27082571
Global Genomic Epidemiology of Salmonella enterica Serovar Typhimurium DT104
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leekitcharoenphon, Pimlapas; Hendriksen, Rene S.; Le Hello, Simon
It has been 30 years since the initial emergence and subsequent rapid global spread of multidrug-resistant Salmonella enterica serovar Typhimurium DT104 (MDR DT104). Nonetheless, its origin and transmission route have never been revealed. In this paper, we used whole-genome sequencing (WGS) and temporally structured sequence analysis within a Bayesian framework to reconstruct temporal and spatial phylogenetic trees and estimate the rates of mutation and divergence times of 315 S. Typhimurium DT104 isolates sampled from 1969 to 2012 from 21 countries on six continents. DT104 was estimated to have emerged initially as antimicrobial susceptible in ~1948 (95% credible interval [CI], 1934 to 1962) and later became MDR DT104 in ~1972 (95% CI, 1972 to 1988) through horizontal transfer of the 13-kb Salmonella genomic island 1 (SGI1) MDR region into susceptible strains already containing SGI1. This was followed by multiple transmission events, initially from central Europe and later between several European countries. An independent transmission to the United States and another to Japan occurred, and from there MDR DT104 was probably transmitted to Taiwan and Canada. An independent acquisition of resistance genes took place in Thailand in ~1975 (95% CI, 1975 to 1990). In Denmark, WGS analysis provided evidence for transmission of the organism between herds of animals. Interestingly, the demographic history of Danish MDR DT104 provided evidence for the success of the program to eradicate Salmonella from pig herds in Denmark from 1996 to 2000. Finally, the results from this study refute several hypotheses on the evolution of DT104 and suggest that WGS may be useful in monitoring emerging clones and devising strategies for prevention of Salmonella infections.
Global Genomic Epidemiology of Salmonella enterica Serovar Typhimurium DT104
Hendriksen, Rene S.; Le Hello, Simon; Weill, François-Xavier; Baggesen, Dorte Lau; Jun, Se-Ran; Lund, Ole; Crook, Derrick W.; Wilson, Daniel J.; Aarestrup, Frank M.
2016-01-01
It has been 30 years since the initial emergence and subsequent rapid global spread of multidrug-resistant Salmonella enterica serovar Typhimurium DT104 (MDR DT104). Nonetheless, its origin and transmission route have never been revealed. We used whole-genome sequencing (WGS) and temporally structured sequence analysis within a Bayesian framework to reconstruct temporal and spatial phylogenetic trees and estimate the rates of mutation and divergence times of 315 S. Typhimurium DT104 isolates sampled from 1969 to 2012 from 21 countries on six continents. DT104 was estimated to have emerged initially as antimicrobial susceptible in ∼1948 (95% credible interval [CI], 1934 to 1962) and later became MDR DT104 in ∼1972 (95% CI, 1972 to 1988) through horizontal transfer of the 13-kb Salmonella genomic island 1 (SGI1) MDR region into susceptible strains already containing SGI1. This was followed by multiple transmission events, initially from central Europe and later between several European countries. An independent transmission to the United States and another to Japan occurred, and from there MDR DT104 was probably transmitted to Taiwan and Canada. An independent acquisition of resistance genes took place in Thailand in ∼1975 (95% CI, 1975 to 1990). In Denmark, WGS analysis provided evidence for transmission of the organism between herds of animals. Interestingly, the demographic history of Danish MDR DT104 provided evidence for the success of the program to eradicate Salmonella from pig herds in Denmark from 1996 to 2000. The results from this study refute several hypotheses on the evolution of DT104 and suggest that WGS may be useful in monitoring emerging clones and devising strategies for prevention of Salmonella infections. PMID:26944846
Cameron, Chris; Zummo, Jacqueline; Desai, Dharmik N; Drake, Christine; Hutton, Brian; Kotb, Ahmed; Weiden, Peter J
Aripiprazole lauroxil (AL) is a long-acting injectable atypical antipsychotic recently approved for treatment of schizophrenia on the basis of a large-scale trial of two doses of AL versus placebo. There are no direct-comparison studies with paliperidone palmitate (PP; the long-acting antipsychotic used most often in acute settings) for the acute psychotic episode. To indirectly compare efficacy and safety between the pivotal AL study and all PP studies meeting indirect comparison criteria, systematic searches of MEDLINE, Embase, Cochrane CENTRAL, PsycINFO, ClinicalTrials.gov, the International Clinical Trials Registry Platform, and gray literature were performed to identify randomized controlled trials of PP with designs similar to the AL trial. Bayesian network meta-analysis compared treatments with respect to symptom response and tolerability issues including weight gain, akathisia, parkinsonism, and likelihood of treatment-emergent adverse events. Three appropriate PP studies were identified for indirect comparison. Both doses of AL (441 mg and 882 mg monthly) were compared with two efficacious doses of PP (156 mg and 234 mg monthly). All four active-treatment conditions were associated with comparable reductions in acute symptoms (Positive and Negative Syndrome Scale) versus placebo and were of similar magnitude (range of mean difference -8.12 to -12.01, with overlapping 95% credible intervals). Between-group comparisons of active-treatment arms were associated with summary estimates of magnitude near 0. No clinically meaningful differences in selected safety or tolerability parameter incidence were found between active treatments. These results suggest that both AL and PP are effective for treatment of adults experiencing acute exacerbation of schizophrenia. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Zhang, Yuzhen; Vrancken, Bram; Feng, Yun; Dellicour, Simon; Yang, Qiqi; Yang, Weihong; Zhang, Yunzhi; Dong, Lu; Pybus, Oliver G; Zhang, Hailin; Tian, Huaiyu
2017-06-03
Rabies is an important but underestimated threat to public health, with most cases reported in Asia. Since 2000, a new epidemic wave of rabies has emerged in Yunnan Province, southwestern China, which borders three countries in Southeast Asia. We estimated gene-specific evolutionary rates for rabies virus using available data in GenBank, then used this information to calibrate the timescale of rabies virus (RABV) spread in Asia. We used 452 publicly available geo-referenced complete nucleoprotein (N) gene sequences, including 52 RABV sequences that were recently generated from samples collected in Yunnan between 2008 and 2012. The RABV N gene evolutionary rate was estimated to be 1.88×10(-4) (1.37-2.41×10(-4), 95% Bayesian credible interval, BCI) substitutions per site per year. Phylogenetic reconstructions show that the currently circulating RABV lineages in Yunnan result from at least seven independent introductions (95% BCI: 6-9 introductions) and represent each of the three main Asian RABV lineages, SEA-1, -2 and -3. We find that Yunnan is a sink location for the domestic spread of RABV and connects RABV epidemics in North China, South China, and Southeast Asia. Cross-border spread from Southeast Asia into South China, and intermixing of the North and South China epidemics, is also well supported. The influx of RABV into Yunnan from Southeast Asia was not well supported, likely due to the poor sampling of Southeast Asian RABV diversity. We found evidence for a lineage displacement of the Yunnan SEA-2 and -3 lineages by Yunnan SEA-1 strains, and considered whether this could be attributed to fitness differences. Overall, our study contributes to a better understanding of the spread of RABV that could facilitate future rabies virus control and prevention efforts.
Thorlund, Kristian; Druyts, Eric; Wu, Ping; Balijepalli, Chakrapani; Keohane, Denis; Mills, Edward
2015-05-01
To establish the comparative efficacy and safety of selective serotonin reuptake inhibitors and serotonin-norepinephrine reuptake inhibitors in older adults using the network meta-analysis approach. Systematic review and network meta-analysis. Individuals aged 60 and older. Data on partial response (defined as at least 50% reduction in depression score from baseline) and safety (dizziness, vertigo, syncope, falls, loss of consciousness) were extracted. A Bayesian network meta-analysis was performed on the efficacy and safety outcomes, and relative risks (RRs) with 95% credible intervals (CrIs) were produced. Fifteen randomized controlled trials were eligible for inclusion in the analysis. Citalopram, escitalopram, paroxetine, duloxetine, venlafaxine, fluoxetine, and sertraline were represented. Reporting on partial response and dizziness was sufficient to conduct a network meta-analysis. Reporting on other outcomes was sparse. For partial response, sertraline (RR=1.28), paroxetine (RR=1.48), and duloxetine (RR=1.62) were significantly better than placebo. The remaining interventions yielded RRs lower than 1.20. For dizziness, duloxetine (RR=3.18) and venlafaxine (RR=2.94) were statistically significantly worse than placebo. Compared with placebo, sertraline had the lowest RR for dizziness (1.14) and fluoxetine the second lowest (1.31). Citalopram, escitalopram, and paroxetine all had RRs between 1.4 and 1.7. There was clear evidence of the effectiveness of sertraline, paroxetine, and duloxetine. There also appears to be a hierarchy of safety associated with the different antidepressants, although there appears to be a dearth of reporting of safety outcomes. © 2015, Copyright the Authors Journal compilation © 2015, The American Geriatrics Society.
Accuracy of clinical diagnosis of Parkinson disease: A systematic review and meta-analysis.
Rizzo, Giovanni; Copetti, Massimiliano; Arcuti, Simona; Martino, Davide; Fontana, Andrea; Logroscino, Giancarlo
2016-02-09
To evaluate the diagnostic accuracy of clinical diagnosis of Parkinson disease (PD) reported in the last 25 years by a systematic review and meta-analysis. We searched for articles published between 1988 and August 2014. Studies were included if reporting diagnostic parameters regarding clinical diagnosis of PD or crude data. The selected studies were subclassified based on different study setting, type of test diagnosis, and gold standard. Bayesian meta-analyses of available data were performed. We selected 20 studies, including 11 using pathologic examination as gold standard. Considering only these 11 studies, the pooled diagnostic accuracy was 80.6% (95% credible interval [CrI] 75.2%-85.3%). Accuracy was 73.8% (95% CrI 67.8%-79.6%) for clinical diagnosis performed mainly by nonexperts. Accuracy of clinical diagnosis performed by movement disorders experts rose from 79.6% (95% CrI 46%-95.1%) at initial assessment to 83.9% (95% CrI 69.7%-92.6%) for refined diagnosis after follow-up. Using UK Parkinson's Disease Society Brain Bank Research Center criteria, the pooled diagnostic accuracy was 82.7% (95% CrI 62.6%-93%). The overall validity of clinical diagnosis of PD is not satisfactory. The accuracy did not significantly improve in the last 25 years, particularly in the early stages of disease, where response to dopaminergic treatment is less defined and hallmarks of alternative diagnoses such as atypical parkinsonism may not have emerged. The misclassification rate should be considered when calculating sample size in both observational studies and randomized controlled trials. Imaging and biomarkers are urgently needed to improve the accuracy of clinical diagnosis in vivo. © 2016 American Academy of Neurology.
Spatial and temporal patterns of dengue infections in Timor-Leste, 2005-2013.
Wangdi, Kinley; Clements, Archie C A; Du, Tai; Nery, Susana Vaz
2018-01-04
Dengue remains an important public health problem in Timor-Leste, with several major epidemics occurring over the last 10 years. The aim of this study was to identify dengue clusters at high geographical resolution and to determine the association between local environmental characteristics and the distribution and transmission of the disease. Notifications of dengue cases that occurred from January 2005 to December 2013 were obtained from the Ministry of Health, Timor-Leste. The population of each suco (the third-level administrative subdivision) was obtained from the Population and Housing Census 2010. Spatial autocorrelation in dengue incidence was explored using Moran's I statistic, Local Indicators of Spatial Association (LISA), and the Getis-Ord statistics. A multivariate, Zero-Inflated, Poisson (ZIP) regression model was developed with a conditional autoregressive (CAR) prior structure, and with posterior parameters estimated using Bayesian Markov chain Monte Carlo (MCMC) simulation with Gibbs sampling. The analysis used data from 3206 cases. Dengue incidence was highly seasonal with a large peak in January. Patients ≥ 14 years were found to be 74% [95% credible interval (CrI): 72-76%] less likely to be infected than those < 14 years, and females were 12% (95% CrI: 4-21%) more likely to suffer from dengue as compared to males. Dengue incidence increased by 0.7% (95% CrI: 0.6-0.8%) for a 1 °C increase in mean temperature; and 47% (95% CrI: 29-59%) for a 1 mm increase in precipitation. There was no significant residual spatial clustering after accounting for climate and demographic variables. Dengue incidence was highly seasonal and spatially clustered, with positive associations with temperature, precipitation and demographic factors. These factors explained the observed spatial heterogeneity of infection.
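The zero-inflated Poisson likelihood underlying the model above mixes a point mass at zero with a Poisson count distribution. A minimal stdlib-only sketch of the ZIP probability mass function is below; the covariates and CAR spatial prior used in the study are omitted, and the parameter names are illustrative.

```python
from math import exp, factorial

def zip_pmf(y, pi_zero, lam):
    """Probability mass of a zero-inflated Poisson count: with probability
    pi_zero the observation is a structural zero; otherwise it is Poisson(lam)."""
    poisson = exp(-lam) * lam**y / factorial(y)
    if y == 0:
        return pi_zero + (1.0 - pi_zero) * poisson
    return (1.0 - pi_zero) * poisson
```

The extra zero-mass term is what lets the model accommodate the many sucos reporting no dengue cases without distorting the Poisson fit to the areas that do report cases.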
McDonald, Scott A; van Boven, Michiel; Wallinga, Jacco
2017-07-01
Estimation of the national-level incidence of seasonal influenza is notoriously challenging. Surveillance of influenza-like illness is carried out in many countries using a variety of data sources, and several methods have been developed to estimate influenza incidence. Our aim was to obtain maximally informed estimates of the proportion of influenza-like illness that is true influenza using all available data. We combined data on weekly general practice sentinel surveillance consultation rates for influenza-like illness, virologic testing of sampled patients with influenza-like illness, and positive laboratory tests for influenza and other pathogens, applying Bayesian evidence synthesis to estimate the positive predictive value (PPV) of influenza-like illness as a test for influenza virus infection. We estimated the weekly number of influenza-like illness consultations attributable to influenza for nine influenza seasons, and for four age groups. The estimated PPV for influenza in influenza-like illness patients was highest in the weeks surrounding seasonal peaks in influenza-like illness rates, dropping to near zero in between-peak periods. Overall, 14.1% (95% credible interval [CrI]: 13.5%, 14.8%) of influenza-like illness consultations were attributed to influenza infection; the estimated PPV was 50% (95% CrI: 48%, 53%) for the peak weeks and 5.8% during the summer periods. The model quantifies the correspondence between influenza-like illness consultations and influenza at a weekly granularity. Even during peak periods, a substantial proportion of influenza-like illness (61%) was not attributed to influenza. The much lower proportion of influenza outside the peak periods reflects the greater circulation of other respiratory pathogens relative to influenza.
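The weekly attribution step described above can be illustrated with a toy beta-binomial version of the PPV estimate: virologic testing of sampled influenza-like illness patients updates a prior on the PPV, which then scales the week's consultation count. The function name and the uniform Beta(1, 1) prior are assumptions for this sketch; the study's full evidence synthesis pools many more data sources.

```python
def weekly_influenza_attribution(ili_consultations, tested, positive):
    """Posterior-mean PPV of influenza-like illness for influenza under a
    Beta(1, 1) prior, given `positive` confirmed infections out of `tested`
    sampled ILI patients, plus the implied number of ILI consultations
    attributable to influenza that week."""
    # Posterior is Beta(1 + positive, 1 + tested - positive); its mean is:
    ppv = (1 + positive) / (2 + tested)
    return ppv, ppv * ili_consultations
```

For example, 40 positives among 80 sampled patients in a week with 1000 ILI consultations would attribute roughly half of the consultations to influenza.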
Jarde, A; Lutsiv, O; Park, C K; Beyene, J; Dodd, J M; Barrett, J; Shah, P S; Cook, J L; Saito, S; Biringer, A B; Sabatino, L; Giglia, L; Han, Z; Staub, K; Mundle, W; Chamberlain, J; McDonald, S D
2017-07-01
Preterm birth (PTB) is the leading cause of infant death, but it is unclear which intervention is best to prevent it. To compare progesterone, cerclage and pessary, determine their relative effects and rank them, we searched Medline, EMBASE, CINAHL, Cochrane CENTRAL and Web of Science (to April 2016), without restrictions, and screened references of previous reviews. We included randomised trials of progesterone, cerclage or pessary for preventing PTB in women with singleton pregnancies at risk as defined by each study. We extracted data in duplicate using a piloted form and performed Bayesian random-effects network meta-analyses and pairwise meta-analyses. We rated evidence quality using GRADE, ranked interventions using SUCRA and calculated numbers needed to treat (NNT). We included 36 trials (9425 women; 25 trials at low risk of bias). Progesterone ranked first or second for most outcomes, reducing PTB < 34 weeks [odds ratio (OR) 0.44; 95% credible interval (CrI) 0.22-0.79; NNT 9; low quality], < 37 weeks (OR 0.58; 95% CrI 0.41-0.79; NNT 9; moderate quality), and neonatal death (OR 0.50; 95% CrI 0.28-0.85; NNT 35; high quality), compared with control, in women overall at risk. We found similar results in the subgroup with previous PTB, but only a reduction of PTB < 34 weeks in women with a short cervix. Pessary showed inconsistent benefit and cerclage did not reduce PTB < 37 or < 34 weeks. Progesterone was the best intervention for preventing PTB in singleton pregnancies at risk, reducing PTB < 34 weeks, < 37 weeks, neonatal demise and other sequelae. In the network meta-analysis, progesterone outperformed cerclage and pessary in preventing preterm birth and neonatal death. © 2017 Royal College of Obstetricians and Gynaecologists.
Mang, A V; Buczinski, S; Booker, C W; Timsit, E
2015-01-01
A computer-aided lung auscultation (CALA) system was recently developed to diagnose bovine respiratory disease (BRD) in feedlot cattle. The objectives were to determine, in a case-control study, the level of agreement between CALA and veterinary lung auscultation and to evaluate the sensitivity (Se) and specificity (Sp) of CALA to diagnose BRD in feedlot cattle. A total of 561 Angus-cross steers (initial body weight = 246 ± 45 kg) were observed during the first 50 days after entry to a feedlot. Steers with visual signs of BRD identified by pen checkers were examined by a veterinarian, including lung auscultation using a conventional stethoscope and CALA, which produced a lung score from 1 (normal) to 5 (chronic). For each steer examined for BRD, 1 apparently healthy steer was selected as control and similarly examined. Agreement between CALA and veterinary auscultation was assessed by kappa statistic. CALA's Se and Sp were estimated using Bayesian latent class analysis. Of the 561 steers, 35 were identified with visual signs of BRD and 35 were selected as controls. Comparison of veterinary auscultation and CALA (using a CALA score ≥ 2 as a cut-off) revealed a substantial agreement (kappa = 0.77). Using latent class analysis, CALA had a relatively high Se (92.9%; 95% credible interval [CI] = 71-99%) and Sp (89.6%; 95% CI = 64-99%) for diagnosing BRD compared with pen checking. CALA had good diagnostic accuracy (albeit with relatively wide CIs). Its use in feedlots could increase the proportion of cattle accurately diagnosed with BRD. Copyright © 2015 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
Chang, Shu-Sen; Lu, Tsung-Hsueh; Sterne, Jonathan Ac; Eddleston, Michael; Lin, Jin-Jia; Gunnell, David
2012-04-02
Pesticide self-poisoning is the most commonly used suicide method worldwide, but few studies have investigated the national epidemiology of pesticide suicide in countries where it is a major public health problem. This study aims to investigate geographic variations in pesticide suicide and their impact on the spatial distribution of suicide in Taiwan. Smoothed standardized mortality ratios for pesticide suicide (2002-2009) were mapped across Taiwan's 358 districts (median population aged 15 or above = 27 000), and their associations with the size of agricultural workforce were investigated using Bayesian hierarchical models. In 2002-2009 pesticide poisoning was the third most common suicide method in Taiwan, accounting for 13.6% (4913/36 110) of all suicides. Rates were higher in agricultural East and Central Taiwan and lower in major cities. Almost half (47%) of all pesticide suicides occurred in areas where only 13% of Taiwan's population lived. The geographic distribution of overall suicides was more similar to that of pesticide suicides than non-pesticide suicides. Rural-urban differences in suicide were mostly due to pesticide suicide. Areas where a higher proportion of people worked in agriculture showed higher pesticide suicide rates (adjusted rate ratio [ARR] per standard deviation increase in the proportion of agricultural workers = 1.58, 95% Credible Interval [CrI] 1.44-1.74) and overall suicide rates (ARR = 1.06, 95% CrI 1.03-1.10) but lower non-pesticide suicide rates (ARR = 0.91, 95% CrI 0.87-0.95). Easy access to pesticides appears to influence the geographic distribution of suicide in Taiwan, highlighting the potential benefits of targeted prevention strategies such as restricting access to highly toxic pesticides.
Zhang, Yi; Lu, Yongfang; Yindee, Marnoch; Li, Kuan-Yi; Kuo, Hsiao-Yun; Ju, Yu-Ten; Ye, Shaohui; Faruque, Md Omar; Li, Qiang; Wang, Yachun; Cuong, Vu Chi; Pham, Lan Doan; Bouahom, Bounthong; Yang, Bingzhuang; Liang, Xianwei; Cai, Zhihua; Vankan, Dianne; Manatchaiworakul, Wallaya; Kowlim, Nonglid; Duangchantrasiri, Somphot; Wajjwalku, Worawidh; Colenbrander, Ben; Zhang, Yuan; Beerli, Peter; Lenstra, Johannes A; Barker, J Stuart F
2016-04-01
The swamp type of the Asian water buffalo is assumed to have been domesticated by about 4000 years BP, following the introduction of rice cultivation. Previous localizations of the domestication site were based on mitochondrial DNA (mtDNA) variation within China, accounting only for the maternal lineage. We carried out a comprehensive sampling of China, Taiwan, Vietnam, Laos, Thailand, Nepal and Bangladesh and sequenced the mtDNA Cytochrome b gene and control region and the Y-chromosomal ZFY, SRY and DBY sequences. Swamp buffalo has a higher diversity of both maternal and paternal lineages than river buffalo, as well as a remarkable contrast between the weak phylogeographic structure of river buffalo and the strong geographic differentiation of swamp buffalo. The highest diversity of the swamp buffalo maternal lineages was found in south China and north Indochina on both banks of the Mekong River, while the highest diversity in paternal lineages was in the China/Indochina border region. We propose that domestication in this region was later followed by introgressive capture of wild cows west of the Mekong. Migration to the north followed the Yangtze valley as well as a more eastern route, but also involved translocations of both cows and bulls over large distances with a minor influence of river buffaloes in recent decades. Bayesian analyses of various migration models also supported domestication in the China/Indochina border region. Coalescence analysis yielded consistent estimates for the expansion of the major swamp buffalo haplogroups with a credibility interval of 900 to 3900 years BP. The spatial differentiation of mtDNA and Y-chromosomal haplotype distributions indicates a lack of gene flow between established populations that is unprecedented in livestock. © 2015 John Wiley & Sons Ltd.
Lum, Kirsten J.; Sundaram, Rajeshwari; Barr, Dana Boyd; Louis, Thomas A.; Louis, Germaine M. Buck
2016-01-01
Background: Perfluoroalkyl substances have been associated with changes in menstrual cycle characteristics and fecundity, when modeled separately. However, these outcomes are biologically related, and we evaluate their joint association with exposure to perfluoroalkyl substances. Methods: We recruited 501 couples from Michigan and Texas in 2005-2009 upon their discontinuing contraception and followed them until pregnancy or 12 months of trying. Female partners provided a serum sample upon enrollment and completed daily journals on menstruation, intercourse, and pregnancy test results. We measured seven perfluoroalkyl substances in serum using liquid-chromatography-tandem mass spectrometry. We assessed the association between perfluoroalkyl substances and menstrual cycle length using accelerated failure time models and between perfluoroalkyl substances and fecundity using a Bayesian joint modeling approach to incorporate cycle length. Results: Menstrual cycles were 3% longer comparing women in the second versus first tertile of perfluorodecanoate (PFDeA; acceleration factor [AF]=1.03, 95% credible interval [CrI]=[1.00, 1.05]), but 2% shorter for women in the highest versus lowest tertile of perfluorooctanoic acid (PFOA) (AF=0.98, 95% CrI=[0.96, 1.00]). When accounting for cycle length, relevant covariates and remaining perfluoroalkyl substances, the probability of pregnancy was lower for women in the second versus first tertile of PFNA (odds ratio [OR]=0.6, 95% CrI=[0.4, 1.0]) though not when comparing the highest versus lowest (OR=0.7, 95% CrI=[0.3, 1.1]) tertile. Conclusions: In this prospective cohort study, we observed associations between two perfluoroalkyl substances and menstrual cycle length changes, and between select perfluoroalkyl substances and diminished fecundity at some (but not all) concentrations. PMID:27541842
Mair, Christina; Freisthler, Bridget; Ponicki, William R; Gaidus, Andrew
2015-09-01
As an increasing number of states liberalize cannabis use and develop laws and local policies, it is essential to better understand the impacts of neighborhood ecology and marijuana dispensary density on marijuana use, abuse, and dependence. We investigated associations between marijuana abuse/dependence hospitalizations and community demographic and environmental conditions from 2001 to 2012 in California, as well as cross-sectional associations between local and adjacent marijuana dispensary densities and marijuana hospitalizations. We analyzed panel population data relating hospitalizations coded for marijuana abuse or dependence and assigned to residential ZIP codes in California from 2001 through 2012 (20,219 space-time units) to ZIP code demographic and ecological characteristics. Bayesian space-time misalignment models were used to account for spatial variations in geographic unit definitions over time, while also accounting for spatial autocorrelation using conditional autoregressive priors. We also analyzed cross-sectional associations between marijuana abuse/dependence and the density of dispensaries in local and spatially adjacent ZIP codes in 2012. An additional one dispensary per square mile in a ZIP code was cross-sectionally associated with a 6.8% increase in the number of marijuana hospitalizations (95% credible interval 1.033, 1.105) with a marijuana abuse/dependence code. Other local characteristics, such as the median household income and age and racial/ethnic distributions, were associated with marijuana hospitalizations in cross-sectional and panel analyses. Prevention and intervention programs for marijuana abuse and dependence may be particularly essential in areas of concentrated disadvantage. Policy makers may want to consider regulations that limit the density of dispensaries. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Moja, L; Danese, S; Fiorino, G; Del Giovane, C; Bonovas, S
2015-06-01
Budesonide and mesalazine (mesalamine) are commonly used in the medical management of patients with mild to moderate Crohn's disease. The aim was to assess their comparative efficacy and harm using the methodology of network meta-analysis. A comprehensive search of Medline, Embase, the Cochrane Library and ClinicalTrials.gov, through October 2014, was performed to identify randomised controlled trials (RCTs) that recruited adult patients with active or quiescent Crohn's disease, and compared budesonide or mesalazine with placebo, or against each other, or different dosing strategies of one drug. Twenty-five RCTs were combined using Bayesian network meta-analysis. Budesonide 9 mg/day, or at higher doses (15 or 18 mg/day), was shown to be superior to placebo for induction of remission [odds ratio (OR), 2.93; 95% credible interval (CrI), 1.52-5.39, and OR, 3.28; CrI, 1.46-7.55 respectively] and ranks at the top of the hierarchy of the competing treatments. For maintenance of remission, budesonide 6 mg/day demonstrated superiority over placebo (OR, 1.69; CrI, 1.05-2.75), also occupying the best ranking position among all compared treatment strategies. No other comparisons (i.e. different doses of mesalazine vs. placebo or budesonide, for induction or maintenance of remission) reached significance. The occurrence of withdrawals due to adverse events did not differ between budesonide, mesalazine and placebo, in both the induction and maintenance phases. Budesonide, at the doses of 9 mg/day, or higher, for induction of remission in active mild or moderate Crohn's disease, and at 6 mg/day for maintenance of remission, appears to be the best treatment choice. © 2015 John Wiley & Sons Ltd.
Genomic selection for fruit quality traits in apple (Malus×domestica Borkh.).
Kumar, Satish; Chagné, David; Bink, Marco C A M; Volz, Richard K; Whitworth, Claire; Carlisle, Charmaine
2012-01-01
The genome sequence of apple (Malus×domestica Borkh.) was published more than a year ago, which helped develop an 8K SNP chip to assist in implementing genomic selection (GS). In apple breeding programmes, GS can be used to obtain genomic breeding values (GEBV) for choosing next-generation parents or selections for further testing as potential commercial cultivars at a very early stage. Thus GS has the potential to accelerate breeding efficiency significantly because of a decreased generation interval or increased selection intensity. We evaluated the accuracy of GS in a population of 1120 seedlings generated from a factorial mating design of four female and two male parents. All seedlings were genotyped using an Illumina Infinium chip comprising 8,000 single nucleotide polymorphisms (SNPs), and were phenotyped for various fruit quality traits. Random-regression best linear unbiased prediction (RR-BLUP) and the Bayesian LASSO method were used to obtain GEBV, and compared using a cross-validation approach for their accuracy to predict unobserved BLUP-BV. Accuracies were very similar for both methods, varying from 0.70 to 0.90 for various fruit quality traits. The selection response per unit time using GS compared with traditional BLUP-based selection was very high (>100%), especially for low-heritability traits. Genome-wide average estimated linkage disequilibrium (LD) between adjacent SNPs was 0.32, with a relatively slow decay of LD in the long range (r(2) = 0.33 and 0.19 at 100 kb and 1,000 kb respectively), contributing to the higher accuracy of GS. Distribution of estimated SNP effects revealed involvement of large-effect genes with likely pleiotropic effects. These results demonstrated that genomic selection is a credible alternative to conventional selection for fruit quality traits.
Chronic kidney disease in dogs in UK veterinary practices: prevalence, risk factors, and survival.
O'Neill, D G; Elliott, J; Church, D B; McGreevy, P D; Thomson, P C; Brodbelt, D C
2013-01-01
Reported prevalence estimates for chronic kidney disease (CKD) in dogs vary widely (0.05-3.74%). Identified risk factors include advancing age, specific breeds, small body size, and periodontal disease. The aims were to estimate the prevalence and identify risk factors associated with CKD diagnosis and survival in dogs. Purebred dogs were hypothesized to have higher CKD risk and poorer survival characteristics than crossbred dogs. A merged clinical database covered 107,214 dogs attending 89 UK veterinary practices over a 2-year period (January 2010-December 2011). A longitudinal study design estimated the apparent prevalence (AP), whereas the true prevalence (TP) was estimated using Bayesian analysis. A nested case-control study design evaluated risk factors. Survival analysis used the Kaplan-Meier survival curve method and multivariable Cox proportional hazards regression modeling. The CKD AP was 0.21% (95% CI: 0.19-0.24%) and the TP was 0.37% (95% posterior credibility interval 0.02-1.44%). Significant risk factors included increasing age, being insured, and certain breeds (Cocker Spaniel, Cavalier King Charles Spaniel). Cardiac disease was a significant comorbid disorder. Significant clinical signs included halitosis, weight loss, polyuria/polydipsia, urinary incontinence, vomiting, decreased appetite, lethargy, and diarrhea. The median survival time from diagnosis was 226 days (95% CI 112-326 days). International Renal Interest Society stage and blood urea nitrogen concentration at diagnosis were significantly associated with hazard of death due to CKD. Chronic kidney disease compromises dog welfare. Increased awareness of CKD risk factors and association of blood biochemistry results with survival time should facilitate diagnosis and optimize case management to improve animal survival and welfare. Copyright © 2013 by the American College of Veterinary Internal Medicine.
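An apparent-prevalence interval of the kind quoted above can be sketched with a Wilson score interval. The case count below is a hypothetical figure chosen only to be of roughly the study's scale; the paper's exact counts and interval method are not given here.

```python
import math

def wilson_interval(cases, n, z=1.96):
    """Wilson score 95% interval for a proportion (e.g. apparent prevalence)."""
    p = cases / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

# Hypothetical: ~225 CKD cases among 107,214 dogs (counts assumed, not reported)
lo, hi = wilson_interval(225, 107214)
print(f"apparent prevalence CI: {lo * 100:.2f}%-{hi * 100:.2f}%")
```

The Wilson interval behaves better than the simple Wald interval when the proportion is close to zero, as it is for a rare diagnosis like CKD.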
Body density and diving gas volume of the northern bottlenose whale (Hyperoodon ampullatus)
Miller, Patrick; Narazaki, Tomoko; Isojunno, Saana; Aoki, Kagari; Smout, Sophie; Sato, Katsufumi
2016-01-01
Diving lung volume and tissue density, reflecting lipid store volume, are important physiological parameters that have only been estimated for a few breath-hold diving species. We fitted 12 northern bottlenose whales with data loggers that recorded depth, 3-axis acceleration and speed either with a fly-wheel or from change of depth corrected by pitch angle. We fitted measured values of the change in speed during 5 s descent and ascent glides to a hydrodynamic model of drag and buoyancy forces using a Bayesian estimation framework. The resulting estimate of diving gas volume was 27.4±4.2 (95% credible interval, CI) ml kg−1, closely matching the measured lung capacity of the species. Dive-by-dive variation in gas volume did not correlate with dive depth or duration. Estimated body densities of individuals ranged from 1028.4 to 1033.9 kg m−3 at the sea surface, indicating overall negative tissue buoyancy of this species in seawater. Body density estimates were highly precise with ±95% CI ranging from 0.1 to 0.4 kg m−3, which would equate to a precision of <0.5% of lipid content based upon extrapolation from the elephant seal. Six whales tagged near Jan Mayen (Norway, 71°N) had lower body density and were closer to neutral buoyancy than six whales tagged in the Gully (Nova Scotia, Canada, 44°N), a difference that was consistent with the amount of gliding observed during ascent versus descent phases in these animals. Implementation of this approach using longer-duration tags could be used to track longitudinal changes in body density and lipid store body condition of free-ranging cetaceans. PMID:27296044
Earnest, Arul; Hock Ong, Marcus Eng; Shahidah, Nur; Min Ng, Wen; Foo, Chuanyang; Nott, David John
2012-01-01
The main objective of this study was to establish the spatial variation in ambulance response times for out-of-hospital cardiac arrests (OHCAs) in the city-state of Singapore. The secondary objective involved studying the relationships between various covariates, such as traffic condition and time and day of collapse, and ambulance response times. The study design was observational and ecological in nature. Data on OHCAs were collected from a nationally representative database for the period October 2001 to October 2004. We used the conditional autoregressive (CAR) model to analyze the data. Within the Bayesian framework of analysis, we used a Weibull regression model that took into account spatial random effects. The regression model was used to study the independent effects of each covariate. Our results showed that there was spatial heterogeneity in the ambulance response times in Singapore. Generally, areas in the far outskirts (suburbs), such as Boon Lay (in the west) and Sembawang (in the north), fared badly in terms of ambulance response times. This improved when adjusted for key covariates, including distance from the nearest fire station. Ambulance response time was also associated with better traffic conditions, weekend OHCAs, distance from the nearest fire station, and OHCAs occurring during nonpeak driving hours. For instance, the hazard ratio for good ambulance response time was 2.35 (95% credible interval [CI] 1.97-2.81) when traffic conditions were light and 1.72 (95% CI 1.51-1.97) when traffic conditions were moderate, as compared with heavy traffic. We found a clear spatial gradient for ambulance response times, with far-outlying areas exhibiting poorer response times. Our study highlights the utility of this novel approach, which may be helpful for planning emergency medical services and public emergency responses.
Prospects for stakeholder coordination by protected-area managers in Europe.
Mattsson, Brady J; Vacik, Harald
2018-02-01
Growing resource demands by humans, invasive species, natural hazards, and a changing climate have created broad-scale impacts and the need for broader-extent conservation activities that span ownerships and even political borders. Implementing regional-scale conservation brings great challenges, and learning how to overcome these challenges is essential for maintaining biodiversity (i.e., richness and evenness of biological communities) and ecosystem functions and services across scales and borders in the face of system change. We administered an online survey to examine factors potentially driving perspectives of protected-area (PA) managers regarding coordination with neighboring PAs and other stakeholders (i.e., stakeholder coordination) for conserving biodiversity and ecosystem services during the next decade within diverse regions across Europe. Although >70% (n = 58) of responding PA managers indicated that climate change and invasive species are relevant for their PAs, they gave <50% probability that these threats could be mitigated through stakeholder coordination. They thought there was a >60% probability (n = 85) that stakeholder coordination would take place with the aim to improve conservation outcomes. Consistent with the foundation on which many European PAs were established, managers viewed maintaining or enhancing biodiversity as the most important (>70%; n = 61) expected benefit. Other benefits included maintaining or enhancing human resources and environmental education (range of Bayesian credibility intervals [CIs] 57-93%). They thought the main barriers to stakeholder coordination were the lack of human and economic resources (CI 59-67% chance of hindering; n = 64) followed by communication and interstakeholder differences in political structures and laws (CI 51-64% probability of hindering). 
European policies and strategies that address these hindering factors could be particularly effective means of enabling implementation of green infrastructure networks in which PAs are the nodes. © 2017 Society for Conservation Biology.
Karolemeas, Katerina; de la Rua-Domenech, Ricardo; Cooper, Roderick; Goodchild, Anthony V.; Clifton-Hadley, Richard S.; Conlan, Andrew J. K.; Mitchell, Andrew P.; Hewinson, R. Glyn; Donnelly, Christl A.; Wood, James L. N.; McKinley, Trevelyan J.
2012-01-01
Bovine tuberculosis (bTB) is one of the most serious economic animal health problems affecting the cattle industry in Great Britain (GB), with incidence in cattle herds increasing since the mid-1980s. The single intradermal comparative cervical tuberculin (SICCT) test is the primary screening test in the bTB surveillance and control programme in GB and Ireland. The sensitivity (ability to detect infected cattle) of this test is central to the efficacy of the current testing regime, but most previous studies that have estimated test sensitivity (relative to the number of slaughtered cattle with visible lesions [VL] and/or positive culture results) lacked post-mortem data for SICCT test-negative cattle. The slaughter of entire herds (“whole herd slaughters” or “depopulations”) that are infected by bTB are occasionally conducted in GB as a last-resort control measure to resolve intractable bTB herd breakdowns. These provide additional post-mortem data for SICCT test-negative cattle, allowing a rare opportunity to calculate the animal-level sensitivity of the test relative to the total number of SICCT test-positive and negative VL animals identified post-mortem (rSe). In this study, data were analysed from 16 whole herd slaughters (748 SICCT test-positive and 1031 SICCT test-negative cattle) conducted in GB between 1988 and 2010, using a Bayesian hierarchical model. The overall rSe estimate of the SICCT test at the severe interpretation was 85% (95% credible interval [CI]: 78–91%), and at standard interpretation was 81% (95% CI: 70–89%). These estimates are more robust than those previously reported in GB due to inclusion of post-mortem data from SICCT test-negative cattle. PMID:22927952
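A single-population simplification of the sensitivity estimate above (the paper uses a hierarchical model across herds) is the Beta-Binomial posterior for the proportion of visible-lesion animals that test SICCT-positive. A grid-integration sketch with hypothetical counts, not the study's data:

```python
import math

def beta_credible_interval(successes, failures, a=1.0, b=1.0, level=0.95, grid=200001):
    """Equal-tailed credible interval for a binomial proportion under a
    Beta(a, b) prior, via numerical integration of the Beta posterior."""
    pa, pb = a + successes, b + failures
    lognorm = math.lgamma(pa) + math.lgamma(pb) - math.lgamma(pa + pb)
    step = 1.0 / (grid - 1)
    cdf, lo, hi = 0.0, None, None
    tail = (1.0 - level) / 2.0
    for i in range(1, grid):
        x = i * step
        if 0.0 < x < 1.0:  # accumulate posterior density on the open interval
            cdf += math.exp((pa - 1) * math.log(x) + (pb - 1) * math.log(1 - x) - lognorm) * step
        if lo is None and cdf >= tail:
            lo = x
        if hi is None and cdf >= 1.0 - tail:
            hi = x
    return lo, hi

# Hypothetical: 85 of 100 visible-lesion animals were SICCT test-positive
lo, hi = beta_credible_interval(85, 15)
print(f"95% CrI for sensitivity: {lo:.2f}-{hi:.2f}")
```

The hierarchical model in the paper additionally lets sensitivity vary between herds and pools information across the 16 depopulated herds, which this single-herd sketch does not capture.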
Global Genomic Epidemiology of Salmonella enterica Serovar Typhimurium DT104
Leekitcharoenphon, Pimlapas; Hendriksen, Rene S.; Le Hello, Simon; ...
2016-03-04
It has been 30 years since the initial emergence and subsequent rapid global spread of multidrug-resistant Salmonella enterica serovar Typhimurium DT104 (MDR DT104). Nonetheless, its origin and transmission route have never been revealed. In this paper, we used whole-genome sequencing (WGS) and temporally structured sequence analysis within a Bayesian framework to reconstruct temporal and spatial phylogenetic trees and estimate the rates of mutation and divergence times of 315 S. Typhimurium DT104 isolates sampled from 1969 to 2012 from 21 countries on six continents. DT104 was estimated to have emerged initially as antimicrobial susceptible in ~1948 (95% credible interval [CI], 1934 to 1962) and later became MDR DT104 in ~1972 (95% CI, 1972 to 1988) through horizontal transfer of the 13-kb Salmonella genomic island 1 (SGI1) MDR region into susceptible strains already containing SGI1. This was followed by multiple transmission events, initially from central Europe and later between several European countries. An independent transmission to the United States and another to Japan occurred, and from there MDR DT104 was probably transmitted to Taiwan and Canada. An independent acquisition of resistance genes took place in Thailand in ~1975 (95% CI, 1975 to 1990). In Denmark, WGS analysis provided evidence for transmission of the organism between herds of animals. Interestingly, the demographic history of Danish MDR DT104 provided evidence for the success of the program to eradicate Salmonella from pig herds in Denmark from 1996 to 2000. Finally, the results from this study refute several hypotheses on the evolution of DT104 and suggest that WGS may be useful in monitoring emerging clones and devising strategies for prevention of Salmonella infections.
Jacobs, Jeffrey P; He, Xia; O'Brien, Sean M; Welke, Karl F; Filardo, Giovanni; Han, Jane M; Ferraris, Victor A; Prager, Richard L; Shahian, David M
2013-09-01
Short postoperative ventilation times are accepted as a marker of quality. This analysis assesses center level variation in postoperative ventilation time in a subset of patients undergoing isolated coronary artery bypass grafting (CABG). In 2009 and 2010, 325,129 patients in the STS Adult Cardiac Surgery Database underwent isolated CABG. Patients were excluded if they were intubated before entering the operating room, required ventilation for greater than 24 hours, or had missing data on key covariates. The final study cohort was 274,231 isolated CABG patients from 1,008 centers. Bayesian hierarchical models were used to assess between-center variation in ventilation time and to explore the effect of center-level covariates. Analyses were performed with and without adjusting for case mix. After adjusting for case mix, the ratio of median ventilator time at the 90th percentile of the center-level distribution to that at the tenth percentile was 9.0/5.0 = 1.8 (95% credible interval: 1.79 to 1.85). This ratio illustrates the scale of between-center differences: centers above the 90th percentile have a ventilation time of at least 1.8 times that of centers below the tenth percentile. Smaller hospital volume, presence of a residency program, and some census regions were associated with longer ventilation times. After adjustment for severity of illness, substantial inter-center variation exists in postoperative ventilation time in this subset of patients undergoing isolated CABG. This finding represents an opportunity for multi-institutional quality improvement initiatives designed to limit variations in ventilator management and achieve the shortest possible ventilation times for all patients, thus benefiting both clinical outcomes and resource utilization. Copyright © 2013 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Young, J; Mucsi, I; Rollet-Kurhajec, K C; Klein, M B
2016-05-01
Fibroblast growth factor 23 (FGF23) has been associated with cardiovascular mortality. We estimate associations between the level of plasma FGF23 and exposure to abacavir (ABC) and to other components of antiretroviral therapy in patients co-infected with HIV and hepatitis C. Both intact and c-terminal FGF23 were measured in plasma using commercial assays for a sub-cohort of 295 patients selected at random from the 1150 patients enrolled in the Canadian Co-infection Cohort. The multiplicative effects of antiretroviral drug exposures and covariates on median FGF23 were then estimated using a hierarchical Bayesian model. The median level of intact FGF23 was independent of either past or recent exposure to abacavir, with multiplicative ratios of 1.00 and 1.07, 95% credible intervals 0.90-1.12 and 0.94-1.23, respectively. Median intact FGF23 tended to increase with past use of both nonnucleoside reverse-transcriptase inhibitors and protease inhibitors, but tended to decrease with recent use of either tenofovir, efavirenz or lopinavir. There were no obvious associations between the median level of c-terminal FGF23 and individual drugs or drug classes. Age, female gender, smoking and the aspartate aminotransferase to platelet ratio index were all associated with a higher median c-terminal FGF23 but not with a higher median intact FGF23. The level of FGF23 in plasma was independent of exposure to ABC. Lower levels of intact FGF23 with recent use of tenofovir, efavirenz or lopinavir may reflect their adverse effects on bone and vitamin D metabolism relative to other drugs in their respective drug classes. © 2015 British HIV Association.
Krishnan, Neeraja M; Seligmann, Hervé; Stewart, Caro-Beth; De Koning, A P Jason; Pollock, David D
2004-10-01
Reconstruction of ancestral DNA and amino acid sequences is an important means of inferring information about past evolutionary events. Such reconstructions suggest changes in molecular function and evolutionary processes over the course of evolution and are used to infer adaptation and convergence. Maximum likelihood (ML) is generally thought to provide relatively accurate reconstructed sequences compared to parsimony, but both methods lead to the inference of multiple directional changes in nucleotide frequencies in primate mitochondrial DNA (mtDNA). To better understand this surprising result, as well as to better understand how parsimony and ML differ, we constructed a series of computationally simple "conditional pathway" methods that differed in the number of substitutions allowed per site along each branch, and we also evaluated the entire Bayesian posterior frequency distribution of reconstructed ancestral states. We analyzed primate mitochondrial cytochrome b (Cyt-b) and cytochrome oxidase subunit I (COI) genes and found that ML reconstructs ancestral frequencies that are often more different from tip sequences than are parsimony reconstructions. In contrast, frequency reconstructions based on the posterior ensemble more closely resemble extant nucleotide frequencies. Simulations indicate that these differences in ancestral sequence inference are probably due to deterministic bias caused by high uncertainty in the optimization-based ancestral reconstruction methods (parsimony, ML, Bayesian maximum a posteriori). In contrast, ancestral nucleotide frequencies based on an average of the Bayesian set of credible ancestral sequences are much less biased. 
The methods involving simpler conditional pathway calculations have slightly reduced likelihood values compared to full likelihood calculations, but they can provide fairly unbiased nucleotide reconstructions and may be useful in more complex phylogenetic analyses than considered here due to their speed and flexibility. To determine whether biased reconstructions using optimization methods might affect inferences of functional properties, ancestral primate mitochondrial tRNA sequences were inferred and helix-forming propensities for conserved pairs were evaluated in silico. For ambiguously reconstructed nucleotides at sites with high base composition variability, ancestral tRNA sequences from Bayesian analyses were more compatible with canonical base pairing than were those inferred by other methods. Thus, nucleotide bias in reconstructed sequences apparently can lead to serious bias and inaccuracies in functional predictions.
Bayesian Optimal Interval Design: A Simple and Well-Performing Design for Phase I Oncology Trials
Yuan, Ying; Hess, Kenneth R.; Hilsenbeck, Susan G.; Gilbert, Mark R.
2016-01-01
Despite more than two decades of publications that offer more innovative model-based designs, the classical 3+3 design remains the most dominant phase I trial design in practice. In this article, we introduce a new trial design, the Bayesian optimal interval (BOIN) design. The BOIN design is easy to implement in a way similar to the 3+3 design, but is more flexible for choosing the target toxicity rate and cohort size and yields a substantially better performance that is comparable to that of more complex model-based designs. The BOIN design contains the 3+3 design and the accelerated titration design as special cases, thus linking it to established phase I approaches. A numerical study shows that the BOIN design generally outperforms the 3+3 design and the modified toxicity probability interval (mTPI) design. The BOIN design is more likely than the 3+3 design to correctly select the maximum tolerated dose (MTD) and allocate more patients to the MTD. Compared to the mTPI design, the BOIN design has a substantially lower risk of overdosing patients and generally a higher probability of correctly selecting the MTD. User-friendly software is freely available to facilitate the application of the BOIN design. PMID:27407096
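The BOIN decision rule compares the observed toxicity rate at the current dose with two fixed boundaries derived from the target toxicity rate. With the common defaults phi1 = 0.6*phi and phi2 = 1.4*phi, the boundaries for a 30% target come out near the published values of 0.236 and 0.358; a sketch of that calculation:

```python
import math

def boin_boundaries(target, phi1=None, phi2=None):
    """Escalation/de-escalation boundaries of the BOIN design;
    defaults phi1 = 0.6*target and phi2 = 1.4*target."""
    phi = target
    phi1 = 0.6 * phi if phi1 is None else phi1
    phi2 = 1.4 * phi if phi2 is None else phi2
    lam_e = math.log((1 - phi1) / (1 - phi)) / math.log(phi * (1 - phi1) / (phi1 * (1 - phi)))
    lam_d = math.log((1 - phi) / (1 - phi2)) / math.log(phi2 * (1 - phi) / (phi * (1 - phi2)))
    return lam_e, lam_d

lam_e, lam_d = boin_boundaries(0.30)
# Escalate if the observed toxicity rate <= lam_e, de-escalate if >= lam_d
print(f"lambda_e = {lam_e:.3f}, lambda_d = {lam_d:.3f}")  # ~0.236 and ~0.358
```

Because the boundaries are fixed before the trial starts, the design can be run from a simple table, which is what makes it as easy to implement as the 3+3 design.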
Bayes Factor Approaches for Testing Interval Null Hypotheses
ERIC Educational Resources Information Center
Morey, Richard D.; Rouder, Jeffrey N.
2011-01-01
Psychological theories are statements of constraint. The role of hypothesis testing in psychology is to test whether specific theoretical constraints hold in data. Bayesian statistics is well suited to the task of finding supporting evidence for constraint, because it allows for comparing evidence for 2 hypotheses against one another. One issue…
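One simple way to test an interval null, in the spirit of the encompassing-prior approach discussed in this literature, is to compare prior and posterior odds of the parameter falling inside the interval. A normal-approximation sketch, with all settings hypothetical:

```python
import math

def norm_cdf(x, mu=0.0, sd=1.0):
    """Normal CDF via the error function (no external packages needed)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

def interval_null_bf(post_mean, post_sd, prior_sd, lo=-0.1, hi=0.1):
    """Bayes factor for the interval null [lo, hi] versus its complement,
    computed as the ratio of posterior odds to prior odds of falling
    inside the interval under a single encompassing N(0, prior_sd^2) prior."""
    post_in = norm_cdf(hi, post_mean, post_sd) - norm_cdf(lo, post_mean, post_sd)
    prior_in = norm_cdf(hi, 0.0, prior_sd) - norm_cdf(lo, 0.0, prior_sd)
    return (post_in / (1.0 - post_in)) / (prior_in / (1.0 - prior_in))

# A posterior concentrated inside the interval strongly favours the interval null
print(interval_null_bf(post_mean=0.0, post_sd=0.03, prior_sd=1.0))
```

Unlike a point-null test, this formulation rewards data that concentrate the posterior inside a region of theoretically negligible effect sizes.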
Bayesian Methods and Confidence Intervals for Automatic Target Recognition of SAR Canonical Shapes
2014-03-27
and DirectX [22]. The CUDA platform was developed by the NVIDIA Corporation to allow programmers access to the computational capabilities of the...were used for the intense repetitive computations. Developing CUDA software requires writing code for specialized compilers provided by NVIDIA and
Nowakowska, Marzena
2017-04-01
The development of the Bayesian logistic regression model classifying the road accident severity is discussed. The already exploited informative priors (method of moments, maximum likelihood estimation, and two-stage Bayesian updating), along with the original idea of a Boot prior proposal, are investigated when no expert opinion has been available. In addition, two possible approaches to updating the priors, in the form of unbalanced and balanced training data sets, are presented. The obtained logistic Bayesian models are assessed on the basis of a deviance information criterion (DIC), highest probability density (HPD) intervals, and coefficients of variation estimated for the model parameters. The verification of the model accuracy has been based on sensitivity, specificity and the harmonic mean of sensitivity and specificity, all calculated from a test data set. The models obtained from the balanced training data set have a better classification quality than the ones obtained from the unbalanced training data set. The two-stage Bayesian updating prior model and the Boot prior model, both identified with the use of the balanced training data set, outperform the non-informative, method of moments, and maximum likelihood estimation prior models. It is important to note that one should be careful when interpreting the parameters since different priors can lead to different models. Copyright © 2017 Elsevier Ltd. All rights reserved.
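The verification measures used above (sensitivity, specificity, and their harmonic mean) are direct functions of test-set confusion counts; the counts below are hypothetical, not from the road-accident data:

```python
def classification_summary(tp, fn, tn, fp):
    """Sensitivity, specificity, and their harmonic mean from confusion counts."""
    sens = tp / (tp + fn)                     # true positive rate
    spec = tn / (tn + fp)                     # true negative rate
    hmean = 2 * sens * spec / (sens + spec)   # harmonic mean of the two
    return sens, spec, hmean

# Hypothetical test-set counts
sens, spec, hmean = classification_summary(tp=80, fn=20, tn=90, fp=10)
print(f"Se={sens:.2f}, Sp={spec:.2f}, harmonic mean={hmean:.3f}")  # Se=0.80, Sp=0.90, harmonic mean=0.847
```

The harmonic mean penalizes imbalance between the two rates, so a classifier cannot score well by sacrificing specificity for sensitivity or vice versa.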
Liu, Fang; Eugenio, Evercita C
2018-04-01
Beta regression is an increasingly popular statistical technique in medical research for modeling outcomes that assume values in (0, 1), such as proportions and patient-reported outcomes. When outcomes take values in the intervals [0,1), (0,1], or [0,1], zero-or-one-inflated beta (zoib) regression can be used. We provide a thorough review of beta regression and zoib regression in their modeling, inferential, and computational aspects via likelihood-based and Bayesian approaches. Using simulation studies, we demonstrate the statistical and practical importance of correctly modeling the inflation at zero/one rather than replacing such values ad hoc with values close to zero/one; the latter approach can lead to biased estimates and invalid inferences. We show via simulation studies that the likelihood-based approach is in general computationally faster than the MCMC algorithms used in Bayesian inference, but runs the risk of non-convergence, large biases, and sensitivity to starting values in the optimization algorithm, especially with clustered/correlated data, data with sparse inflation at zero and one, and data that warrant regularization of the likelihood. The disadvantages of the regular likelihood-based approach make the Bayesian approach an attractive alternative in these cases. Software packages and tools for fitting beta and zoib regressions in both the likelihood-based and Bayesian frameworks are also reviewed.
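The zoib likelihood combines point masses at the boundaries with a beta density on the interior. A minimal sketch of the log-likelihood (function names are mine, and the regression structure linking covariates to the parameters is omitted):

```python
from math import log, lgamma

def beta_logpdf(y, a, b):
    """Log-density of the beta(a, b) distribution at y in (0, 1)."""
    return ((a - 1) * log(y) + (b - 1) * log(1 - y)
            - (lgamma(a) + lgamma(b) - lgamma(a + b)))

def zoib_loglik(data, p0, p1, a, b):
    """Log-likelihood of a zero-or-one-inflated beta (zoib) model:
    point mass p0 at 0, point mass p1 at 1, and (1 - p0 - p1) times a
    beta(a, b) density on the open interval (0, 1)."""
    ll = 0.0
    for y in data:
        if y == 0.0:
            ll += log(p0)
        elif y == 1.0:
            ll += log(p1)
        else:
            ll += log(1.0 - p0 - p1) + beta_logpdf(y, a, b)
    return ll
```

Replacing the boundary values with, say, 0.001 and 0.999 would instead force them through `beta_logpdf`, which is exactly the ad hoc substitution the authors show to be biased.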
Bayesian statistics in radionuclide metrology: measurement of a decaying source
NASA Astrophysics Data System (ADS)
Bochud, François O.; Bailat, Claude J.; Laedermann, Jean-Pascal
2007-08-01
The most intuitive way of defining a probability is perhaps through the frequency at which it appears when a large number of trials are realized in identical conditions. The probability derived from the obtained histogram characterizes the so-called frequentist or conventional statistical approach. In this sense, probability is defined as a physical property of the observed system. By contrast, in Bayesian statistics, a probability is not a physical property or a directly observable quantity, but a degree of belief or an element of inference. The goal of this paper is to show how Bayesian statistics can be used in radionuclide metrology and what its advantages and disadvantages are compared with conventional statistics. This is performed through the example of a yttrium-90 source typically encountered in environmental surveillance measurement. Because of the very low activity of this kind of source and the short half-life of the radionuclide, this measurement takes several days, during which the source decays significantly. Several methods are proposed to compute simultaneously the number of unstable nuclei at a given reference time, the decay constant and the background. Asymptotically, all approaches give the same result. However, Bayesian statistics produces coherent estimates and confidence intervals in a much smaller number of measurements. Apart from the conceptual understanding of statistics, the main difficulty that could deter radionuclide metrologists from using Bayesian statistics is the complexity of the computation.
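The kind of simultaneous estimation described above can be illustrated with a toy grid computation: daily Poisson counts from a decaying source on a flat background, with a joint posterior over the initial source rate and the decay constant. This is only a sketch with invented counts, and it treats the background as known, whereas the paper also estimates the background:

```python
from math import exp, log

# Hypothetical daily counts from a decaying source on a flat background
# (invented numbers, not the paper's yttrium-90 data).
counts = [120, 78, 55, 40, 33, 30]   # observed counts on days 0..5
background = 25.0                    # background rate per day, assumed known

def log_post(a0, lam):
    """Unnormalized log-posterior: flat priors, Poisson likelihood with
    mean background + a0 * exp(-lam * t) on day t."""
    lp = 0.0
    for t, c in enumerate(counts):
        mean = background + a0 * exp(-lam * t)
        lp += c * log(mean) - mean   # Poisson log-likelihood (c! term dropped)
    return lp

# Posterior on a grid over the initial source rate a0 and decay constant lam.
a0_grid = [60.0 + i for i in range(81)]            # 60 .. 140 counts/day
lam_grid = [0.05 + 0.01 * j for j in range(96)]    # 0.05 .. 1.00 per day
ref = log_post(95.0, 0.6)                          # subtracted for stability
weights = [[exp(log_post(a0, lam) - ref) for lam in lam_grid]
           for a0 in a0_grid]
total = sum(sum(row) for row in weights)

# Marginal posterior of the decay constant, its mean, and a 95% interval.
lam_marg = [sum(row[j] for row in weights) / total
            for j in range(len(lam_grid))]
lam_mean = sum(l * p for l, p in zip(lam_grid, lam_marg))
cum, lo, hi = 0.0, None, None
for l, p in zip(lam_grid, lam_marg):
    cum += p
    if lo is None and cum >= 0.025:
        lo = l
    if hi is None and cum >= 0.975:
        hi = l
```

Marginalizing over the nuisance parameter (here `a0`) is what yields coherent interval estimates for the decay constant even with only a handful of measurements.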
O'Reilly, Joseph E; Donoghue, Philip C J
2018-03-01
Consensus trees are required to summarize trees obtained through MCMC sampling of a posterior distribution, providing an overview of the distribution of estimated parameters such as topology, branch lengths, and divergence times. Numerous consensus tree construction methods are available, each presenting a different interpretation of the tree sample. The rise of morphological clock and sampled-ancestor methods of divergence time estimation, in which times and topology are coestimated, has increased the popularity of the maximum clade credibility (MCC) consensus tree method. The MCC method assumes that the sampled, fully resolved topology with the highest clade credibility is an adequate summary of the most probable clades, with parameter estimates from compatible sampled trees used to obtain the marginal distributions of parameters such as clade ages and branch lengths. Using both simulated and empirical data, we demonstrate that MCC trees, and trees constructed using the similar maximum a posteriori (MAP) method, often include poorly supported and incorrect clades when summarizing diffuse posterior samples of trees. We demonstrate that the paucity of information in morphological data sets contributes to the inability of MCC and MAP trees to accurately summarize the posterior distribution. Conversely, majority-rule consensus (MRC) trees represent a lower proportion of incorrect nodes when summarizing the same posterior samples of trees. Thus, we advocate the use of MRC trees, in place of MCC or MAP trees, in attempts to summarize the results of Bayesian phylogenetic analyses of morphological data.
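The majority-rule consensus the authors advocate amounts to retaining only those clades that appear in more than half of the posterior tree sample. A minimal sketch, with trees encoded simply as sets of clades (this encoding is mine, not a standard phylogenetics data structure):

```python
from collections import Counter

def majority_rule_clades(tree_samples, threshold=0.5):
    """tree_samples: a posterior sample of trees, each encoded as a set
    of clades, where a clade is a frozenset of taxon labels. Returns the
    clades whose sample frequency exceeds `threshold`, mapped to that
    frequency (the clade's posterior support)."""
    n = len(tree_samples)
    freq = Counter(clade for tree in tree_samples for clade in tree)
    return {clade: c / n for clade, c in freq.items() if c / n > threshold}
```

Unlike an MCC tree, which commits to one fully resolved sampled topology, this summary leaves weakly supported groupings unresolved: a clade appearing in only a third of the sample simply never enters the consensus.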
Joint distribution approaches to simultaneously quantifying benefit and risk.
Shaffer, Michele L; Watterberg, Kristi L
2006-10-12
The benefit-risk ratio has been proposed to measure the tradeoff between benefits and risks of two therapies for a single binary measure of efficacy and a single adverse event. The ratio is calculated from the difference in risk and the difference in benefit between therapies. Small sample sizes or small expected differences in benefit or risk can lead to no solution, or to problematic solutions, for confidence intervals. Alternatively, using the joint distribution of benefit and risk, confidence regions for the differences in risk and benefit can be constructed in the benefit-risk plane. The information in the joint distribution can be summarized by choosing regions of interest in this plane. Bayesian methodology provides a very flexible framework for summarizing information in the joint distribution. Data from a National Institute of Child Health & Human Development trial of hydrocortisone illustrate the construction of confidence regions and regions of interest in the benefit-risk plane, where benefit is survival without supplemental oxygen at 36 weeks postmenstrual age and risk is gastrointestinal perforation. For the subgroup of infants exposed to chorioamnionitis, the confidence interval based on the benefit-risk ratio is wide (benefit-risk ratio: 1.52; 90% confidence interval: 0.23 to 5.25). Choosing regions of appreciable risk and acceptable risk in the benefit-risk plane confirms the uncertainty seen in the wide confidence interval for the benefit-risk ratio (there is a greater than 50% chance of falling into the region of acceptable risk), while visually allowing the uncertainty in risk and benefit to be shown separately. Applying Bayesian methodology, an incremental net health benefit analysis shows there is a 72% chance of a positive incremental net benefit if hydrocortisone is used in place of placebo, if one is willing to incur at most one gastrointestinal perforation for each additional infant that survives without supplemental oxygen.
If the benefit-risk ratio is presented, the joint distribution of benefit and risk also should be shown. These regions avoid the ambiguity associated with collapsing benefit and risk to a single dimension. Bayesian methods allow even greater flexibility in simultaneously quantifying benefit and risk.
Search for two-neutrino double electron capture of
NASA Astrophysics Data System (ADS)
Aprile, E.; Aalbers, J.; Agostini, F.; Alfonsi, M.; Amaro, F. D.; Anthony, M.; Arneodo, F.; Barrow, P.; Baudis, L.; Bauermeister, B.; Benabderrahmane, M. L.; Berger, T.; Breur, P. A.; Brown, A.; Brown, E.; Bruenner, S.; Bruno, G.; Budnik, R.; Bütikofer, L.; Calvén, J.; Cardoso, J. M. R.; Cervantes, M.; Cichon, D.; Coderre, D.; Colijn, A. P.; Conrad, J.; Cussonneau, J. P.; Decowski, M. P.; de Perio, P.; di Gangi, P.; di Giovanni, A.; Diglio, S.; Duchovni, E.; Fei, J.; Ferella, A. D.; Fieguth, A.; Franco, D.; Fulgione, W.; Gallo Rosso, A.; Galloway, M.; Gao, F.; Garbini, M.; Geis, C.; Goetzke, L. W.; Greene, Z.; Grignon, C.; Hasterok, C.; Hogenbirk, E.; Itay, R.; Kaminsky, B.; Kessler, G.; Kish, A.; Landsman, H.; Lang, R. F.; Lellouch, D.; Levinson, L.; Le Calloch, M.; Levy, C.; Lin, Q.; Lindemann, S.; Lindner, M.; Lopes, J. A. M.; Manfredini, A.; Marrodán Undagoitia, T.; Masbou, J.; Massoli, F. V.; Masson, D.; Mayani, D.; Meng, Y.; Messina, M.; Micheneau, K.; Miguez, B.; Molinario, A.; Murra, M.; Naganoma, J.; Ni, K.; Oberlack, U.; Orrigo, S. E. A.; Pakarha, P.; Pelssers, B.; Persiani, R.; Piastra, F.; Pienaar, J.; Piro, M.-C.; Plante, G.; Priel, N.; Rauch, L.; Reichard, S.; Reuter, C.; Rizzo, A.; Rosendahl, S.; Rupp, N.; Dos Santos, J. M. F.; Sartorelli, G.; Scheibelhut, M.; Schindler, S.; Schreiner, J.; Schumann, M.; Scotto Lavina, L.; Selvi, M.; Shagin, P.; Silva, M.; Simgen, H.; Sivers, M. V.; Stein, A.; Thers, D.; Tiseni, A.; Trinchero, G.; Tunnell, C. D.; Wall, R.; Wang, H.; Weber, M.; Wei, Y.; Weinheimer, C.; Wulf, J.; Zhang, Y.; Xenon Collaboration
2017-02-01
Two-neutrino double electron capture is a rare nuclear decay where two electrons are simultaneously captured from the atomic shell. For
NASA Astrophysics Data System (ADS)
Aaltonen, T.; Amerio, S.; Amidei, D.; Anastassov, A.; Annovi, A.; Antos, J.; Apollinari, G.; Appel, J. A.; Arisawa, T.; Artikov, A.; Asaadi, J.; Ashmanskas, W.; Auerbach, B.; Aurisano, A.; Azfar, F.; Badgett, W.; Bae, T.; Barbaro-Galtieri, A.; Barnes, V. E.; Barnett, B. A.; Barria, P.; Bartos, P.; Bauce, M.; Bedeschi, F.; Behari, S.; Bellettini, G.; Bellinger, J.; Benjamin, D.; Beretvas, A.; Bhatti, A.; Bland, K. R.; Blumenfeld, B.; Bocci, A.; Bodek, A.; Bortoletto, D.; Boudreau, J.; Boveia, A.; Brigliadori, L.; Bromberg, C.; Brucken, E.; Budagov, J.; Budd, H. S.; Burkett, K.; Busetto, G.; Bussey, P.; Butti, P.; Buzatu, A.; Calamba, A.; Camarda, S.; Campanelli, M.; Canelli, F.; Carls, B.; Carlsmith, D.; Carosi, R.; Carrillo, S.; Casal, B.; Casarsa, M.; Castro, A.; Catastini, P.; Cauz, D.; Cavaliere, V.; Cerri, A.; Cerrito, L.; Chen, Y. C.; Chertok, M.; Chiarelli, G.; Chlachidze, G.; Cho, K.; Chokheli, D.; Clark, A.; Clarke, C.; Convery, M. E.; Conway, J.; Corbo, M.; Cordelli, M.; Cox, C. A.; Cox, D. J.; Cremonesi, M.; Cruz, D.; Cuevas, J.; Culbertson, R.; d'Ascenzo, N.; Datta, M.; de Barbaro, P.; Demortier, L.; Deninno, M.; D'Errico, M.; Devoto, F.; Di Canto, A.; Di Ruzza, B.; Dittmann, J. R.; Donati, S.; D'Onofrio, M.; Dorigo, M.; Driutti, A.; Ebina, K.; Edgar, R.; Erbacher, R.; Errede, S.; Esham, B.; Farrington, S.; Fernández Ramos, J. P.; Field, R.; Flanagan, G.; Forrest, R.; Franklin, M.; Freeman, J. C.; Frisch, H.; Funakoshi, Y.; Galloni, C.; Garfinkel, A. F.; Garosi, P.; Gerberich, H.; Gerchtein, E.; Giagu, S.; Giakoumopoulou, V.; Gibson, K.; Ginsburg, C. M.; Giokaris, N.; Giromini, P.; Glagolev, V.; Glenzinski, D.; Gold, M.; Goldin, D.; Golossanov, A.; Gomez, G.; Gomez-Ceballos, G.; Goncharov, M.; González López, O.; Gorelov, I.; Goshaw, A. T.; Goulianos, K.; Gramellini, E.; Grosso-Pilcher, C.; Guimaraes da Costa, J.; Hahn, S. R.; Han, J. Y.; Happacher, F.; Hara, K.; Hare, M.; Harr, R. 
F.; Harrington-Taber, T.; Hatakeyama, K.; Hays, C.; Heinrich, J.; Herndon, M.; Hocker, A.; Hong, Z.; Hopkins, W.; Hou, S.; Hughes, R. E.; Husemann, U.; Hussein, M.; Huston, J.; Introzzi, G.; Iori, M.; Ivanov, A.; James, E.; Jang, D.; Jayatilaka, B.; Jeon, E. J.; Jindariani, S.; Jones, M.; Joo, K. K.; Jun, S. Y.; Junk, T. R.; Kambeitz, M.; Kamon, T.; Karchin, P. E.; Kasmi, A.; Kato, Y.; Ketchum, W.; Keung, J.; Kilminster, B.; Kim, D. H.; Kim, H. S.; Kim, J. E.; Kim, M. J.; Kim, S. H.; Kim, S. B.; Kim, Y. J.; Kim, Y. K.; Kimura, N.; Kirby, M.; Knoepfel, K.; Kondo, K.; Kong, D. J.; Konigsberg, J.; Kotwal, A. V.; Kreps, M.; Kroll, J.; Kruse, M.; Kuhr, T.; Kurata, M.; Laasanen, A. T.; Lammel, S.; Lancaster, M.; Lannon, K.; Latino, G.; Lee, H. S.; Lee, J. S.; Leo, S.; Leone, S.; Lewis, J. D.; Limosani, A.; Lipeles, E.; Lister, A.; Liu, Q.; Liu, T.; Lockwitz, S.; Loginov, A.; Lucchesi, D.; Lucà, A.; Lueck, J.; Lujan, P.; Lukens, P.; Lungu, G.; Lys, J.; Lysak, R.; Madrak, R.; Maestro, P.; Malik, S.; Manca, G.; Manousakis-Katsikakis, A.; Marchese, L.; Margaroli, F.; Marino, P.; Matera, K.; Mattson, M. E.; Mazzacane, A.; Mazzanti, P.; McNulty, R.; Mehta, A.; Mehtala, P.; Mesropian, C.; Miao, T.; Mietlicki, D.; Mitra, A.; Miyake, H.; Moed, S.; Moggi, N.; Moon, C. S.; Moore, R.; Morello, M. J.; Mukherjee, A.; Muller, Th.; Murat, P.; Mussini, M.; Nachtman, J.; Nagai, Y.; Naganoma, J.; Nakano, I.; Napier, A.; Nett, J.; Nigmanov, T.; Nodulman, L.; Noh, S. Y.; Norniella, O.; Oakes, L.; Oh, S. H.; Oh, Y. D.; Okusawa, T.; Orava, R.; Ortolan, L.; Pagliarone, C.; Palencia, E.; Palni, P.; Papadimitriou, V.; Parker, W.; Pauletta, G.; Paulini, M.; Paus, C.; Phillips, T. J.; Piacentino, G.; Pianori, E.; Pilot, J.; Pitts, K.; Plager, C.; Pondrom, L.; Poprocki, S.; Potamianos, K.; Pranko, A.; Prokoshin, F.; Ptohos, F.; Punzi, G.; Redondo Fernández, I.; Renton, P.; Rescigno, M.; Rimondi, F.; Ristori, L.; Robson, A.; Rodriguez, T.; Rolli, S.; Ronzani, M.; Roser, R.; Rosner, J. 
L.; Ruffini, F.; Ruiz, A.; Russ, J.; Rusu, V.; Sakumoto, W. K.; Sakurai, Y.; Santi, L.; Sato, K.; Saveliev, V.; Savoy-Navarro, A.; Schlabach, P.; Schmidt, E. E.; Schwarz, T.; Scodellaro, L.; Scuri, F.; Seidel, S.; Seiya, Y.; Semenov, A.; Sforza, F.; Shalhout, S. Z.; Shears, T.; Shepard, P. F.; Shimojima, M.; Shochet, M.; Shreyber-Tecker, I.; Simonenko, A.; Sliwa, K.; Smith, J. R.; Snider, F. D.; Song, H.; Sorin, V.; St. Denis, R.; Stancari, M.; Stentz, D.; Strologas, J.; Sudo, Y.; Sukhanov, A.; Suslov, I.; Takemasa, K.; Takeuchi, Y.; Tang, J.; Tecchio, M.; Teng, P. K.; Thom, J.; Thomson, E.; Thukral, V.; Toback, D.; Tokar, S.; Tollefson, K.; Tomura, T.; Tonelli, D.; Torre, S.; Torretta, D.; Totaro, P.; Trovato, M.; Ukegawa, F.; Uozumi, S.; Vázquez, F.; Velev, G.; Vellidis, C.; Vernieri, C.; Vidal, M.; Vilar, R.; Vizán, J.; Vogel, M.; Volpi, G.; Wagner, P.; Wallny, R.; Wang, S. M.; Waters, D.; Wester, W. C.; Whiteson, D.; Wicklund, A. B.; Wilbur, S.; Williams, H. H.; Wilson, J. S.; Wilson, P.; Winer, B. L.; Wittich, P.; Wolbers, S.; Wolfe, H.; Wright, T.; Wu, X.; Wu, Z.; Yamamoto, K.; Yamato, D.; Yang, T.; Yang, U. K.; Yang, Y. C.; Yao, W.-M.; Yeh, G. P.; Yi, K.; Yoh, J.; Yorita, K.; Yoshida, T.; Yu, G. B.; Yu, I.; Zanetti, A. M.; Zeng, Y.; Zhou, C.; Zucchelli, S.; CDF Collaboration
2016-06-01
A search for a Higgs boson with suppressed couplings to fermions, hf, assumed to be the neutral, lower-mass partner of the Higgs boson discovered at the Large Hadron Collider, is reported. Such a Higgs boson could exist in extensions of the standard model with two Higgs doublets, and could be produced via pp̄ → H± hf → W* hf hf → 4γ + X, where H± is a charged Higgs boson. This analysis uses all events with at least three photons in the final state from proton-antiproton collisions at a center-of-mass energy of 1.96 TeV collected by the Collider Detector at Fermilab, corresponding to an integrated luminosity of 9.2 fb⁻¹. No evidence of a signal is observed in the data. Higgs-boson masses between 10 and 100 GeV/c² are excluded at 95% Bayesian credibility.
Use of Threshold of Toxicological Concern (TTC) with High ...
Although progress has been made with HTS (high-throughput screening) in profiling biological activity (e.g., EPA's ToxCast™), challenges arise in interpreting HTS results in the context of adversity and in converting HTS assay concentrations to equivalent human doses for the broad domain of commodity chemicals. Here, we propose using TTC as a risk screening method to evaluate exposure ranges derived from NHANES for 7968 chemicals. Because the well-established TTC approach uses hazard values derived from in vivo toxicity data, relevance to adverse effects is robust. We compared the conservative TTC (non-cancer) value of 90 μg/day (1.5 μg/kg/day) (Kroes et al., Fd Chem Toxicol, 2004) to quantitative exposure predictions of the upper 95% credible interval (UCI) of median daily exposures for 7968 chemicals in 10 different demographic groups (Wambaugh et al., Environ Sci Technol. 48:12760-7, 2014). Results indicate that: (1) none of the median values of the credible interval of exposure for any chemical in any demographic group was above the TTC; and (2) fewer than 5% of chemicals had a UCI that exceeded the TTC for any group. However, these median exposure predictions do not cover highly exposed (e.g., occupational) populations. Additionally, we propose an expanded risk-based screening workflow comprising a TTC decision tree that includes screening compounds for structural alerts for DNA reactivity, OPs and carbamates, as well as a comparison with bioactivity-based margins of
A hierarchical model for estimating change in American Woodcock populations
Sauer, J.R.; Link, W.A.; Kendall, W.L.; Kelley, J.R.; Niven, D.K.
2008-01-01
The Singing-Ground Survey (SGS) is a primary source of information on population change for American woodcock (Scolopax minor). We analyzed the SGS using a hierarchical log-linear model and compared the estimates of change and annual indices of abundance to a route regression analysis of SGS data. We also grouped SGS routes into Bird Conservation Regions (BCRs) and estimated population change and annual indices using BCRs within states and provinces as strata. Based on the hierarchical model-based estimates, we concluded that woodcock populations were declining in North America between 1968 and 2006 (trend = -0.9%/yr, 95% credible interval: -1.2, -0.5). Singing-Ground Survey results are generally similar between analytical approaches, but the hierarchical model has several important advantages over the route regression. Hierarchical models better accommodate changes in survey efficiency over time and space by treating strata, years, and observers as random effects in the context of a log-linear model, providing trend estimates that are derived directly from the annual indices. We also conducted a hierarchical model analysis of woodcock data from the Christmas Bird Count and the North American Breeding Bird Survey. All surveys showed general consistency in patterns of population change, but the SGS had the shortest credible intervals. We suggest that population management and conservation planning for woodcock involving interpretation of the SGS use estimates provided by the hierarchical model.
Guillain-Barré syndrome risk among individuals infected with Zika virus: a multi-country assessment.
Mier-Y-Teran-Romero, Luis; Delorey, Mark J; Sejvar, James J; Johansson, Michael A
2018-05-15
Countries with ongoing outbreaks of Zika virus have observed a notable rise in reported cases of Guillain-Barré syndrome (GBS), with mounting evidence of a causal link between Zika virus infection and the neurological syndrome. However, the risk of GBS following a Zika virus infection is not well characterized. In this work, we used data from 11 locations with publicly available data to estimate the risk of GBS following an infection with Zika virus, as well as the location-specific incidence of infection and the number of suspect GBS cases reported per infection. We built a mathematical inference framework utilizing data from 11 locations that had reported suspect Zika and GBS cases, two with completed outbreaks prior to 2015 (French Polynesia and Yap) and nine others in the Americas covering partial outbreaks and where transmission was ongoing as of early 2017. We estimated that 2.0 (95% credible interval 0.5-4.5) reported GBS cases may occur per 10,000 Zika virus infections. The frequency of reported suspect Zika cases varied substantially and was highly uncertain, with a mean of 0.11 (95% credible interval 0.01-0.24) suspect cases reported per infection. These estimates can help efforts to prepare for the GBS cases that may occur during Zika epidemics and highlight the need to better understand the relationship between infection and the reported incidence of clinical disease.
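The headline figure of roughly 2 reported GBS cases per 10,000 infections can be illustrated with a much simpler conjugate model than the paper's multi-location inference framework: a binomial likelihood with a uniform Beta(1, 1) prior, summarized by Monte Carlo draws. All counts below are invented for illustration:

```python
import random

random.seed(1)

def gbs_risk_interval(gbs_cases, infections, draws=100000):
    """Posterior for the per-infection GBS risk under a binomial model
    with a uniform Beta(1, 1) prior. Returns the posterior mean and an
    equal-tailed 95% credible interval, all scaled per 10,000 infections."""
    a, b = 1 + gbs_cases, 1 + infections - gbs_cases
    samples = sorted(random.betavariate(a, b) for _ in range(draws))
    mean = 10000 * sum(samples) / draws
    lo = 10000 * samples[int(0.025 * draws)]
    hi = 10000 * samples[int(0.975 * draws)]
    return mean, lo, hi
```

For example, 20 hypothetical GBS cases among 100,000 estimated infections give a posterior mean near 2.1 per 10,000; the paper's wider interval reflects the additional uncertainty in the infection counts themselves, which this sketch treats as known.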
Bellan, Steve E.; Dushoff, Jonathan; Galvani, Alison P.; Meyers, Lauren Ancel
2015-01-01
Background The infectivity of the HIV-1 acute phase has been directly measured only once, from a retrospectively identified cohort of serodiscordant heterosexual couples in Rakai, Uganda. Analyses of this cohort underlie the widespread view that the acute phase is highly infectious, even more so than would be predicted from its elevated viral load, and that transmission occurring shortly after infection may therefore compromise interventions that rely on diagnosis and treatment, such as antiretroviral treatment as prevention (TasP). Here, we re-estimate the duration and relative infectivity of the acute phase, while accounting for several possible sources of bias in published estimates, including the retrospective cohort exclusion criteria and unmeasured heterogeneity in risk. Methods and Findings We estimated acute phase infectivity using two approaches. First, we combined viral load trajectories and viral load-infectivity relationships to estimate infectivity trajectories over the course of infection, under the assumption that elevated acute phase infectivity is caused by elevated viral load alone. Second, we estimated the relative hazard of transmission during the acute phase versus the chronic phase (RHacute) and the acute phase duration (d acute) by fitting a couples transmission model to the Rakai retrospective cohort using approximate Bayesian computation. Our model fit the data well and accounted for characteristics overlooked by previous analyses, including individual heterogeneity in infectiousness and susceptibility and the retrospective cohort's exclusion of couples that were recorded as serodiscordant only once before being censored by loss to follow-up, couple dissolution, or study termination. Finally, we replicated two highly cited analyses of the Rakai data on simulated data to identify biases underlying the discrepancies between previous estimates and our own. 
From the Rakai data, we estimated RHacute = 5.3 (95% credibility interval [95% CrI]: 0.79–57) and d acute = 1.7 mo (95% CrI: 0.55–6.8). The wide credibility intervals reflect an inability to distinguish a long, mildly infectious acute phase from a short, highly infectious acute phase, given the 10-mo Rakai observation intervals. The total additional risk, measured as excess hazard-months attributable to the acute phase (EHMacute) can be estimated more precisely: EHMacute = (RHacute - 1) × d acute, and should be interpreted with respect to the 120 hazard-months generated by a constant untreated chronic phase infectivity over 10 y of infection. From the Rakai data, we estimated that EHMacute = 8.4 (95% CrI: -0.27 to 64). This estimate is considerably lower than previously published estimates, and consistent with our independent estimate from viral load trajectories, 5.6 (95% confidence interval: 3.3–9.1). We found that previous overestimates likely stemmed from failure to account for risk heterogeneity and bias resulting from the retrospective cohort study design. Our results reflect the interaction between the retrospective cohort exclusion criteria and high (47%) rates of censorship amongst incident serodiscordant couples in the Rakai study due to loss to follow-up, couple dissolution, or study termination. We estimated excess physiological infectivity during the acute phase from couples data, but not the proportion of transmission attributable to the acute phase, which would require data on the broader population's sexual network structure. Conclusions Previous EHMacute estimates relying on the Rakai retrospective cohort data range from 31 to 141. Our results indicate that these are substantial overestimates of HIV-1 acute phase infectivity, biased by unmodeled heterogeneity in transmission rates between couples and by inconsistent censoring. Elevated acute phase infectivity is therefore less likely to undermine TasP interventions than previously thought. 
Heterogeneity in infectiousness and susceptibility may still play an important role in intervention success and deserves attention in future analyses. PMID:25781323
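The excess-hazard-months arithmetic above can be checked directly. Note that plugging the point estimates into EHMacute = (RHacute - 1) × dacute gives about 7.3 rather than the reported 8.4, because the posterior expectation of a product is not the product of the individual posterior estimates:

```python
# Point estimates quoted in the abstract above.
rh_acute = 5.3                   # relative hazard, acute vs. chronic phase
d_acute = 1.7                    # acute phase duration, in months
chronic_hazard_months = 10 * 12  # constant chronic infectivity over 10 years

ehm_point = (rh_acute - 1) * d_acute             # excess hazard-months, ~7.3
excess_fraction = ehm_point / chronic_hazard_months  # share of chronic total
```

The excess fraction comes out around 6% of the 120 chronic hazard-months, which is why the authors conclude the acute phase is unlikely to undermine treatment-as-prevention.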
Nichols, J.M.; Link, W.A.; Murphy, K.D.; Olson, C.C.
2010-01-01
This work discusses a Bayesian approach to approximating the distribution of parameters governing nonlinear structural systems. Specifically, we use a Markov chain Monte Carlo method for sampling the posterior parameter distributions, thus producing both point and interval estimates for parameters. The method is first used to identify both linear and nonlinear parameters in multiple degree-of-freedom structural systems using free-decay vibrations. The approach is then applied to the problem of identifying the location, size, and depth of delamination in a model composite beam. The influence of additive Gaussian noise on the response data is explored with respect to the quality of the resulting parameter estimates.
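The MCMC scheme described above can be illustrated with a random-walk Metropolis sampler recovering the decay rate and frequency of a synthetic free-decay signal. This is a sketch with invented single-degree-of-freedom data and a known noise level, not the authors' multi-degree-of-freedom or delamination models:

```python
import math
import random

random.seed(0)

# Synthetic free-decay response of a damped oscillator:
# y(t) = exp(-c t) * cos(w t) + noise, with true c = 0.3, w = 2.0.
ts = [0.1 * i for i in range(101)]
true_c, true_w, noise_sd = 0.3, 2.0, 0.05
ys = [math.exp(-true_c * t) * math.cos(true_w * t)
      + random.gauss(0.0, noise_sd) for t in ts]

def log_lik(c, w):
    """Gaussian log-likelihood of decay rate c and frequency w."""
    sse = sum((y - math.exp(-c * t) * math.cos(w * t)) ** 2
              for t, y in zip(ts, ys))
    return -sse / (2 * noise_sd ** 2)

# Random-walk Metropolis over (c, w) with flat priors.
c, w = 0.5, 1.8                      # deliberately offset starting point
ll = log_lik(c, w)
chain = []
for _ in range(4000):
    c_new = c + random.gauss(0.0, 0.02)
    w_new = w + random.gauss(0.0, 0.02)
    ll_new = log_lik(c_new, w_new)
    if math.log(random.random()) < ll_new - ll:   # Metropolis acceptance
        c, w, ll = c_new, w_new, ll_new
    chain.append((c, w))

burn = chain[1000:]                  # discard burn-in
c_mean = sum(s[0] for s in burn) / len(burn)
w_mean = sum(s[1] for s in burn) / len(burn)
```

The retained chain gives both point estimates (posterior means) and interval estimates (e.g., percentiles of the sampled values), which is the practical appeal of the approach for structural identification.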
A Comparison of the β-Substitution Method and a Bayesian Method for Analyzing Left-Censored Data
Huynh, Tran; Quick, Harrison; Ramachandran, Gurumurthy; Banerjee, Sudipto; Stenzel, Mark; Sandler, Dale P.; Engel, Lawrence S.; Kwok, Richard K.; Blair, Aaron; Stewart, Patricia A.
2016-01-01
Classical statistical methods for analyzing exposure data with values below the detection limits are well described in the occupational hygiene literature, but an evaluation of a Bayesian approach for handling such data is currently lacking. Here, we first describe a Bayesian framework for analyzing censored data. We then present the results of a simulation study conducted to compare the β-substitution method with a Bayesian method for exposure datasets drawn from lognormal distributions and mixed lognormal distributions with varying sample sizes, geometric standard deviations (GSDs), and censoring for single and multiple limits of detection. For each set of factors, estimates for the arithmetic mean (AM), geometric mean, GSD, and the 95th percentile (X0.95) of the exposure distribution were obtained. We evaluated the performance of each method using relative bias, the root mean squared error (rMSE), and coverage (the proportion of the computed 95% uncertainty intervals containing the true value). The Bayesian method using non-informative priors and the β-substitution method were generally comparable in bias and rMSE when estimating the AM and GM. For the GSD and the 95th percentile, the Bayesian method with non-informative priors was more biased and had a higher rMSE than the β-substitution method, but use of more informative priors generally improved the Bayesian method’s performance, making both the bias and the rMSE more comparable to the β-substitution method. An advantage of the Bayesian method is that it provided estimates of uncertainty for these parameters of interest and good coverage, whereas the β-substitution method only provided estimates of uncertainty for the AM, and coverage was not as consistent. Selection of one or the other method depends on the needs of the practitioner, the availability of prior information, and the distribution characteristics of the measurement data. 
We suggest the use of Bayesian methods if the practitioner has the computational resources and prior information, as the method would generally provide accurate estimates and also provides the distributions of all of the parameters, which could be useful for making decisions in some applications. PMID:26209598
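The censored-likelihood idea described above can be sketched in a few lines: detected values contribute the lognormal density, and values below the limit of detection contribute the CDF at that limit. The following is a minimal illustration, not the authors' implementation: it uses simulated data, flat priors, a single detection limit, and a random-walk Metropolis sampler.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated exposures from a lognormal, left-censored at a detection limit
true_mu, true_sigma, lod = 0.0, 1.0, 0.5
x = rng.lognormal(true_mu, true_sigma, size=100)
censored = x < lod                 # only "below LOD" is recorded for these
obs = np.where(censored, lod, x)

def log_post(mu, log_sigma):
    """Log-posterior with flat priors: detected values contribute the
    (log-)normal density, censored values the CDF at the detection limit."""
    sigma = np.exp(log_sigma)
    ll = stats.norm.logpdf(np.log(obs[~censored]), mu, sigma).sum()
    ll += stats.norm.logcdf(np.log(lod), mu, sigma) * censored.sum()
    return ll

# Random-walk Metropolis over (mu, log sigma)
theta = np.array([0.0, 0.0])
lp = log_post(*theta)
draws = []
for i in range(4000):
    prop = theta + rng.normal(0, 0.1, size=2)
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if i >= 1000:                  # discard burn-in
        draws.append(theta.copy())
draws = np.array(draws)

mu_hat = draws[:, 0].mean()
gsd_hat = np.exp(np.exp(draws[:, 1]).mean())       # posterior-mean GSD
lo, hi = np.percentile(draws[:, 0], [2.5, 97.5])   # 95% interval for mu
```

The same posterior draws directly yield uncertainty intervals for derived quantities such as the GSD or the 95th percentile, which is the advantage over β-substitution noted in the abstract.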
BELM: Bayesian extreme learning machine.
Soria-Olivas, Emilio; Gómez-Sanchis, Juan; Martín, José D; Vila-Francés, Joan; Martínez, Marcelino; Magdalena, José R; Serrano, Antonio J
2011-03-01
The theory of the extreme learning machine (ELM) has become very popular in the last few years. ELM is a new approach for learning the parameters of the hidden layers of a multilayer neural network (such as the multilayer perceptron or the radial basis function neural network). Its main advantage is its lower computational cost, which is especially relevant when dealing with many patterns defined in a high-dimensional space. This brief proposes a Bayesian approach to ELM, which presents some advantages over other approaches: it allows the introduction of a priori knowledge; it obtains confidence intervals (CIs) without the need for computationally intensive methods such as the bootstrap; and it shows high generalization capability. Bayesian ELM is benchmarked against classical ELM on several artificial and real datasets that are widely used for the evaluation of machine learning algorithms. The results show that the proposed approach produces competitive accuracy with some additional advantages, namely, automatic production of CIs, a reduced probability of model overfitting, and the use of a priori knowledge.
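The core of a Bayesian ELM can be sketched as a random, fixed hidden layer followed by Bayesian linear regression on the hidden activations, whose closed-form Gaussian posterior yields CIs without bootstrapping. A minimal sketch, not the authors' implementation; the prior and noise precisions are assumed known here rather than estimated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem
X = np.linspace(-3, 3, 200)[:, None]
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, 200)

# ELM hidden layer: random weights, fixed after initialization
n_hidden = 50
W = rng.normal(0, 1, (1, n_hidden))
b = rng.normal(0, 1, n_hidden)
H = np.tanh(X @ W + b)              # hidden-layer design matrix

# Bayesian linear regression on H: Gaussian prior on output weights,
# Gaussian noise -> closed-form posterior (no bootstrap needed for CIs)
alpha, beta = 1.0, 100.0            # prior precision, noise precision (assumed)
A = alpha * np.eye(n_hidden) + beta * H.T @ H    # posterior precision
m = beta * np.linalg.solve(A, H.T @ y)           # posterior mean weights

# Predictive mean, variance, and 95% CI at new points
Xs = np.array([[0.0], [1.5]])
Hs = np.tanh(Xs @ W + b)
pred_mean = Hs @ m
pred_var = 1 / beta + np.einsum('ij,ji->i', Hs, np.linalg.solve(A, Hs.T))
ci_low = pred_mean - 1.96 * np.sqrt(pred_var)
ci_high = pred_mean + 1.96 * np.sqrt(pred_var)
```

Only the output weights are learned, so training reduces to one linear solve, which is the source of ELM's low computational cost.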
Internal Medicine residents use heuristics to estimate disease probability.
Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin
2015-01-01
Training in Bayesian reasoning may have limited impact on the accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics, then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate the probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) use of the representativeness heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) use of the anchoring-and-adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data, the odds of diagnosing the target condition increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to Bayesian reasoning, residents use heuristics such as representativeness and anchoring-and-adjustment to estimate probabilities. Potential explanations for this attribute substitution include the relative cognitive ease of heuristics versus Bayesian reasoning, or the possibility that residents rely on gist traces rather than precise probability estimates in their clinical practice.
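For contrast, the Bayesian updating that the residents were expected to apply is mechanical: posterior odds equal prior odds times the likelihood ratio, so a non-discriminating feature (LR = 1) should leave the estimate unchanged. A minimal illustration with hypothetical numbers:

```python
def posterior_prob(prior, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# A non-discriminating finding (LR = 1) leaves a 20% estimate at 20%;
# a finding with LR+ = 4 raises it to 50%
p_unchanged = posterior_prob(0.20, 1.0)
p_updated = posterior_prob(0.20, 4.0)
```

The study's point is precisely that residents' estimates moved in the first case, where a normative Bayesian update would not.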
Bayesian Optimal Interval Design: A Simple and Well-Performing Design for Phase I Oncology Trials.
Yuan, Ying; Hess, Kenneth R; Hilsenbeck, Susan G; Gilbert, Mark R
2016-09-01
Despite more than two decades of publications that offer more innovative model-based designs, the classical 3 + 3 design remains the most dominant phase I trial design in practice. In this article, we introduce a new trial design, the Bayesian optimal interval (BOIN) design. The BOIN design is easy to implement in a way similar to the 3 + 3 design, but is more flexible for choosing the target toxicity rate and cohort size and yields a substantially better performance that is comparable with that of more complex model-based designs. The BOIN design contains the 3 + 3 design and the accelerated titration design as special cases, thus linking it to established phase I approaches. A numerical study shows that the BOIN design generally outperforms the 3 + 3 design and the modified toxicity probability interval (mTPI) design. The BOIN design is more likely than the 3 + 3 design to correctly select the MTD and allocate more patients to the MTD. Compared with the mTPI design, the BOIN design has a substantially lower risk of overdosing patients and generally a higher probability of correctly selecting the MTD. User-friendly software is freely available to facilitate the application of the BOIN design. Clin Cancer Res; 22(17); 4291-301. ©2016 AACR. ©2016 American Association for Cancer Research.
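The BOIN decision rule itself is simple enough to state in a few lines: escalate if the observed toxicity rate at the current dose falls at or below a lower boundary, de-escalate at or above an upper boundary, and otherwise stay. A sketch using the optimal-interval boundaries as commonly presented for BOIN, with the usual defaults phi1 = 0.6*phi and phi2 = 1.4*phi:

```python
from math import log

def boin_boundaries(phi, phi1=None, phi2=None):
    """Escalation/de-escalation boundaries for the BOIN design
    (defaults phi1 = 0.6*phi, phi2 = 1.4*phi)."""
    phi1 = 0.6 * phi if phi1 is None else phi1
    phi2 = 1.4 * phi if phi2 is None else phi2
    lam_e = log((1 - phi1) / (1 - phi)) / log(phi * (1 - phi1) / (phi1 * (1 - phi)))
    lam_d = log((1 - phi) / (1 - phi2)) / log(phi2 * (1 - phi) / (phi * (1 - phi2)))
    return lam_e, lam_d

def decide(n_tox, n_treated, phi):
    """Compare the observed toxicity rate at the current dose to the boundaries."""
    lam_e, lam_d = boin_boundaries(phi)
    rate = n_tox / n_treated
    if rate <= lam_e:
        return "escalate"
    if rate >= lam_d:
        return "de-escalate"
    return "stay"
```

For a target toxicity rate of 0.30 this gives the familiar boundaries of roughly 0.236 and 0.358, so with a cohort of 3 the rule matches simple count thresholds, which is why BOIN is as easy to run as 3 + 3.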
Uncertainty estimation of Intensity-Duration-Frequency relationships: A regional analysis
NASA Astrophysics Data System (ADS)
Mélèse, Victor; Blanchet, Juliette; Molinié, Gilles
2018-03-01
We propose in this article a regional study of uncertainties in IDF curves derived from point-rainfall maxima. We develop two generalized extreme value models based on the simple scaling assumption, first in the frequentist framework and second in the Bayesian framework. Within the frequentist framework, uncertainties are obtained i) from the Gaussian density stemming from the asymptotic normality of the maximum likelihood estimator and ii) with a bootstrap procedure. Within the Bayesian framework, uncertainties are obtained from the posterior densities. We compare these two frameworks on the same database, covering a large region of 100,000 km² in southern France with contrasting rainfall regimes, in order to draw conclusions that are not specific to the data. The two frameworks are applied to 405 hourly stations with data going back to the 1980s, accumulated over durations in the range 3 h-120 h. We show that i) the Bayesian framework is more robust than the frequentist one to the starting point of the estimation procedure, ii) the posterior and bootstrap densities adjust uncertainty estimation to the data better than the Gaussian density does, and iii) the bootstrap density gives unreasonable confidence intervals, in particular for return levels associated with large return periods. Our recommendation therefore goes towards the use of the Bayesian framework to compute uncertainty.
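One of the two frequentist options compared above, a bootstrap uncertainty band for a GEV return level, can be sketched as follows; synthetic annual maxima and scipy's GEV fit stand in for the station data and the simple-scaling model:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic rainfall maxima (mm); in the article these would be station
# maxima at a given accumulation duration
maxima = stats.genextreme.rvs(c=-0.1, loc=40, scale=10, size=60,
                              random_state=rng)

def return_level(sample, T):
    """T-year return level from a GEV maximum-likelihood fit."""
    c, loc, scale = stats.genextreme.fit(sample)
    return stats.genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)

# Nonparametric bootstrap of the 20-year return level
boot = np.array([
    return_level(rng.choice(maxima, size=maxima.size, replace=True), 20)
    for _ in range(100)
])
ci = np.percentile(boot, [2.5, 97.5])     # bootstrap 95% confidence interval
point = return_level(maxima, 20)
```

In the Bayesian analogue, the bootstrap distribution is replaced by the posterior density of the return level, from which credible intervals are read off in the same way.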
Bayesian posterior distributions without Markov chains.
Cole, Stephen R; Chu, Haitao; Greenland, Sander; Hamra, Ghassan; Richardson, David B
2012-03-01
Bayesian posterior parameter distributions are often simulated using Markov chain Monte Carlo (MCMC) methods. However, MCMC methods are not always necessary and do not help the uninitiated understand Bayesian inference. As a bridge to understanding Bayesian inference, the authors illustrate a transparent rejection sampling method. In example 1, they illustrate rejection sampling using 36 cases and 198 controls from a case-control study (1976-1983) assessing the relation between residential exposure to magnetic fields and the development of childhood cancer. Results from rejection sampling (odds ratio (OR) = 1.69, 95% posterior interval (PI): 0.57, 5.00) were similar to MCMC results (OR = 1.69, 95% PI: 0.58, 4.95) and approximations from data-augmentation priors (OR = 1.74, 95% PI: 0.60, 5.06). In example 2, the authors apply rejection sampling to a cohort study of 315 human immunodeficiency virus seroconverters (1984-1998) to assess the relation between viral load after infection and 5-year incidence of acquired immunodeficiency syndrome, adjusting for (continuous) age at seroconversion and race. In this more complex example, rejection sampling required a notably longer run time than MCMC sampling but remained feasible and again yielded similar results. The transparency of the proposed approach comes at a price of being less broadly applicable than MCMC.
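The rejection-sampling recipe the authors illustrate can be reduced to a few lines for a simple binomial likelihood: draw candidate parameter values from the prior and accept each with probability proportional to its likelihood, scaled by the maximum likelihood so acceptance probabilities stay in [0, 1]. A toy sketch with a uniform prior; the data and prior are illustrative, not those of the case-control example:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy data: 36 successes in 234 trials
k, n = 36, 234

def log_lik(theta):
    return k * np.log(theta) + (n - k) * np.log(1 - theta)

theta_mle = k / n
log_lik_max = log_lik(theta_mle)

# Rejection sampling: draw from the Uniform(0,1) prior, accept each draw
# with probability L(theta) / L(theta_mle)
cand = rng.uniform(0, 1, 200_000)
accept = np.log(rng.uniform(size=cand.size)) < log_lik(cand) - log_lik_max
posterior = cand[accept]

post_mean = posterior.mean()
pi = np.percentile(posterior, [2.5, 97.5])   # 95% posterior interval
```

As the abstract notes, the method is transparent but wasteful: the acceptance rate falls quickly as the likelihood sharpens or the dimension grows, which is where MCMC earns its keep.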
Receptive Field Inference with Localized Priors
Park, Mijung; Pillow, Jonathan W.
2011-01-01
The linear receptive field describes a mapping from sensory stimuli to a one-dimensional variable governing a neuron's spike response. However, traditional receptive field estimators such as the spike-triggered average converge slowly and often require large amounts of data. Bayesian methods seek to overcome this problem by biasing estimates towards solutions that are more likely a priori, typically those with small, smooth, or sparse coefficients. Here we introduce a novel Bayesian receptive field estimator designed to incorporate locality, a powerful form of prior information about receptive field structure. The key to our approach is a hierarchical receptive field model that flexibly adapts to localized structure in both spacetime and spatiotemporal frequency, using an inference method known as empirical Bayes. We refer to our method as automatic locality determination (ALD), and show that it can accurately recover various types of smooth, sparse, and localized receptive fields. We apply ALD to neural data from retinal ganglion cells and V1 simple cells, and find it achieves error rates several times lower than standard estimators. Thus, estimates of comparable accuracy can be achieved with substantially less data. Finally, we introduce a computationally efficient Markov Chain Monte Carlo (MCMC) algorithm for fully Bayesian inference under the ALD prior, yielding accurate Bayesian confidence intervals for small or noisy datasets. PMID:22046110
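The empirical-Bayes machinery underlying such estimators can be illustrated with its simplest instance: a linear-Gaussian model whose ridge (prior-precision) hyperparameter is chosen by maximizing the marginal likelihood (evidence). This sketch is far simpler than the ALD prior itself, which adapts to locality in space-time and spatiotemporal frequency, but it shows the evidence-optimization step:

```python
import numpy as np

rng = np.random.default_rng(3)

# Linear-Gaussian toy model: y = X w + noise, with a smooth localized
# "receptive field" w (hypothetical, for illustration)
n, d = 300, 20
X = rng.normal(0, 1, (n, d))
w_true = np.exp(-0.5 * ((np.arange(d) - 10) / 2.0) ** 2)
y = X @ w_true + rng.normal(0, 1.0, n)

def neg_log_evidence(log_alpha, beta=1.0):
    """Negative log marginal likelihood for a ridge prior w ~ N(0, I/alpha),
    noise precision beta (standard linear-Gaussian evidence formula)."""
    alpha = np.exp(log_alpha)
    A = alpha * np.eye(d) + beta * X.T @ X        # posterior precision
    m = beta * np.linalg.solve(A, X.T @ y)        # posterior mean
    _, logdet = np.linalg.slogdet(A)
    return -(0.5 * d * np.log(alpha)
             - 0.5 * beta * np.sum((y - X @ m) ** 2)
             - 0.5 * alpha * m @ m
             - 0.5 * logdet
             - 0.5 * n * np.log(2 * np.pi / beta))

# Empirical Bayes: pick the prior precision that maximizes the evidence
grid = np.linspace(-4, 4, 81)
best = grid[np.argmin([neg_log_evidence(g) for g in grid])]
alpha_hat = np.exp(best)

# MAP estimate of the receptive field under the selected prior
A = alpha_hat * np.eye(d) + X.T @ X
w_map = np.linalg.solve(A, X.T @ y)
```

ALD replaces the single shared precision with a structured covariance whose hyperparameters encode the location and extent of the receptive field, optimized by the same evidence criterion.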
ERIC Educational Resources Information Center
Wilcox, Rand R.; Serang, Sarfaraz
2017-01-01
The article provides perspectives on p values, null hypothesis testing, and alternative techniques in light of modern robust statistical methods. Null hypothesis testing and "p" values can provide useful information provided they are interpreted in a sound manner, which includes taking into account insights and advances that have…
Scheduling viability tests for seeds in long-term storage based on a Bayesian Multi-Level Model
USDA-ARS?s Scientific Manuscript database
Genebank managers conduct viability tests on stored seeds so they can replace lots that have viability near a critical threshold, such as 50 or 85% germination. Currently, these tests are typically scheduled at uniform intervals; testing every 5 years is common. A manager needs to balance the cost...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef; Fast P.; Kraus, M.
2006-01-01
Terrorist attacks using an aerosolized pathogen preparation have gained credibility as a national security concern after the anthrax attacks of 2001. The ability to characterize such attacks, i.e., to estimate the number of people infected, the time of infection, and the average dose received, is important when planning a medical response. We address this question of characterization by formulating a Bayesian inverse problem predicated on a short time-series of diagnosed patients exhibiting symptoms. To be of relevance to response planning, we limit ourselves to 3-5 days of data. In tests performed with anthrax as the pathogen, we find that these data are usually sufficient, especially if the model of the outbreak used in the inverse problem is an accurate one. In some cases the scarcity of data may initially support outbreak characterizations at odds with the true one, but with sufficient data the correct inferences are recovered; in other words, the inverse problem posed and its solution methodology are consistent. We also explore the effect of model error, i.e., situations for which the model used in the inverse problem is only a partially accurate representation of the outbreak; here, the model predictions and the observations differ by more than a random noise. We find that while there is a consistent discrepancy between the inferred and the true characterizations, they are also close enough to be of relevance when planning a response.
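The characterization problem can be miniaturized to a grid posterior: given an assumed incubation-period distribution, infer the number infected and the infection time from a few days of symptom-onset counts. All numbers below (incubation parameters, cohort size, priors) are hypothetical and chosen only to make the sketch run:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Hypothetical incubation-period model (lognormal, median 4 days)
inc = stats.lognorm(s=0.5, scale=4.0)

def onset_probs(t0, days):
    """Probability that a person infected at time t0 first shows
    symptoms on each observation day."""
    return inc.cdf(days + 1 - t0) - inc.cdf(days - t0)

# Simulate a short time series: true_N people infected at time true_t0,
# then 5 days of observed symptom-onset counts
true_N, true_t0 = 200, 0.0
days = np.arange(5)
counts = rng.poisson(true_N * onset_probs(true_t0, days))

# Grid posterior over (N, t0) with flat priors and a Poisson likelihood
Ns = np.arange(50, 501, 5)
t0s = np.linspace(-3.0, 2.0, 51)
probs = [onset_probs(t0, days) for t0 in t0s]
logpost = np.array([[stats.poisson.logpmf(counts, N * p).sum() for p in probs]
                    for N in Ns])
logpost -= logpost.max()
post = np.exp(logpost)
post /= post.sum()

# Posterior mean of the number infected
N_mean = float((Ns * post.sum(axis=1)).sum())
```

With only a few days of data the posterior is broad, mirroring the paper's observation that scarce data can temporarily support characterizations at odds with the truth.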
Wainwright, Haruko M; Seki, Akiyuki; Chen, Jinsong; Saito, Kimiaki
2017-02-01
This paper presents a multiscale data integration method to estimate the spatial distribution of air dose rates in the regional scale around the Fukushima Daiichi Nuclear Power Plant. We integrate various types of datasets, such as ground-based walk and car surveys, and airborne surveys, all of which have different scales, resolutions, spatial coverage, and accuracy. This method is based on geostatistics to represent spatial heterogeneous structures, and also on Bayesian hierarchical models to integrate multiscale, multi-type datasets in a consistent manner. The Bayesian method allows us to quantify the uncertainty in the estimates, and to provide the confidence intervals that are critical for robust decision-making. Although this approach is primarily data-driven, it has great flexibility to include mechanistic models for representing radiation transport or other complex correlations. We demonstrate our approach using three types of datasets collected at the same time over Fukushima City in Japan: (1) coarse-resolution airborne surveys covering the entire area, (2) car surveys along major roads, and (3) walk surveys in multiple neighborhoods. Results show that the method can successfully integrate three types of datasets and create an integrated map (including the confidence intervals) of air dose rates over the domain in high resolution. Moreover, this study provides us with various insights into the characteristics of each dataset, as well as radiocaesium distribution. In particular, the urban areas show high heterogeneity in the contaminant distribution due to human activities as well as large discrepancy among different surveys due to such heterogeneity. Copyright © 2016 Elsevier Ltd. All rights reserved.
Wang, Ling; Abdel-Aty, Mohamed; Wang, Xuesong; Yu, Rongjie
2018-02-01
There have been plenty of traffic safety studies based on average daily traffic (ADT), average hourly traffic (AHT), or microscopic traffic at 5 min intervals. Nevertheless, little research has compared the performance of these three types of safety studies, and few previous studies have examined whether the results of one type of study are transferable to the other two. First, this study built three models: a Bayesian Poisson-lognormal model to estimate daily crash frequency using ADT, a Bayesian Poisson-lognormal model to estimate hourly crash frequency using AHT, and a Bayesian logistic regression model for real-time safety analysis using microscopic traffic. The model results showed that the crash contributing factors found by the different models were comparable but not the same. Four variables, i.e., the logarithm of volume, the standard deviation of speed, the logarithm of segment length, and the existence of a diverge segment, were positively significant in all three models. Additionally, weaving segments experienced higher daily and hourly crash frequencies than merge and basic segments. Then, each of the ADT-based, AHT-based, and real-time models was used to estimate safety conditions at different levels: daily and hourly; meanwhile, the real-time model was also used at 5 min intervals. The results uncovered that the ADT- and AHT-based safety models performed similarly in predicting daily and hourly crash frequencies, and the real-time safety model was able to provide hourly crash frequency. Copyright © 2017 Elsevier Ltd. All rights reserved.
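The Poisson-lognormal form used for the frequency models pairs a Poisson count with a lognormal random effect that captures segment-level heterogeneity. A simulation sketch with hypothetical coefficients shows why the resulting counts are overdispersed (variance exceeding the mean), which is what motivates the lognormal term:

```python
import numpy as np

rng = np.random.default_rng(5)

# Poisson-lognormal crash-frequency sketch: the log of the expected crash
# count is linear in log(ADT) plus a normal segment effect
# (all coefficients hypothetical)
n_seg = 500
log_adt = rng.normal(9.0, 0.5, n_seg)          # log of average daily traffic
seg_effect = rng.normal(0.0, 0.3, n_seg)       # lognormal heterogeneity term
lam = np.exp(-6.0 + 0.8 * log_adt + seg_effect)
crashes = rng.poisson(lam)

# The lognormal mixing inflates the variance beyond the Poisson mean
dispersion = crashes.var() / crashes.mean()
```

A plain Poisson model would force this dispersion ratio toward 1; the fitted lognormal variance absorbs the excess.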
Message propagation in the network based on node credibility
NASA Astrophysics Data System (ADS)
Nian, Fuzhong; Dang, Zhongkai
2018-04-01
From the viewpoint of propagation efficiency, this paper introduces the notion of node credibility. A receiving node only partially believes a message, in proportion to the credibility of the propagator. For each node, the credibility is variable: the more true messages it spreads, the higher its credibility becomes, and vice versa, spreading false messages lowers it. Based on this idea, a new network model incorporating node credibility was established. Finally, a comparison experiment between a fully trusted network and the network with node credibility was carried out. The results indicate that messages spread more effectively in the network with node credibility.
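One possible reading of the update rule described above, with hypothetical increments and a clamped credibility score; this is a toy interpretation, not the paper's model:

```python
def update_credibility(cred, message_true, delta=0.05):
    """Sender credibility rises after spreading a true message and falls
    after a false one, clamped to [0, 1] (delta is hypothetical)."""
    if message_true:
        return min(1.0, cred + delta)
    return max(0.0, cred - delta)

def received_belief(message_weight, sender_cred):
    """A receiver partially believes a message, in proportion to the
    sender's credibility."""
    return message_weight * sender_cred

# A node starting at credibility 0.5 spreads three true messages and one false
cred = 0.5
for truth in [True, True, False, True]:
    cred = update_credibility(cred, truth)
```

In the fully trusted baseline every sender effectively has credibility 1, so the comparison in the paper isolates the effect of this feedback loop.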
Case studies in Bayesian microbial risk assessments.
Kennedy, Marc C; Clough, Helen E; Turner, Joanne
2009-12-21
The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. Resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in 2 stages, first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). We estimated that the expected total number of children aged 1.5-4.5 who become ill due to VTEC O157 in milk is 8.6 per year, with 95% uncertainty interval (0,11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with 95% interval (0,11). 
In the second case study the effective number of inputs was reduced from 30 to 7 in the screening stage, and just 2 inputs were found to explain 82.8% of the output variance. A combined total of 500 runs of the computer code were used. These case studies illustrate the use of Bayesian statistics to perform detailed uncertainty and sensitivity analyses, integrating multiple information sources in a way that is both rigorous and efficient.
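The Monte Carlo propagation step in the first case study can be caricatured in a few lines: sample the uncertain inputs, push them through an exposure and dose-response chain, and read an uncertainty interval off the simulated outcomes. The distributions below are illustrative stand-ins, not those of the VTEC O157 model:

```python
import numpy as np

rng = np.random.default_rng(9)

# Monte Carlo uncertainty propagation (illustrative distributions only):
# uncertain contamination level and dose-response slope -> illness counts
n_sim = 10_000
conc = rng.lognormal(mean=-2.0, sigma=1.0, size=n_sim)   # organisms/serving
r = rng.beta(2, 200, size=n_sim)                         # exponential DR slope
servings = 1_000_000                                     # exposures per year

p_ill = 1 - np.exp(-r * conc)        # per-serving illness probability
cases = servings * p_ill             # implied yearly case counts
interval = np.percentile(cases, [2.5, 97.5])   # 95% uncertainty interval
```

Policy scenarios of the kind the authors describe amount to re-running this loop with the input distributions altered, e.g. shifting the contamination distribution downward, and comparing the resulting intervals.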
Dilokthornsakul, Piyameth; Patidar, Mausam; Campbell, Jonathan D
2017-12-01
To forecast lifetime outcomes and cost of lumacaftor/ivacaftor combination therapy in patients with cystic fibrosis (CF) with homozygous phe508del mutation from the US payer perspective. A lifetime Markov model was developed from a US payer perspective. The model included five health states: 1) mild lung disease (percent predicted forced expiratory volume in 1 second [FEV 1 ] >70%), 2) moderate lung disease (40% ≤ FEV 1 ≤ 70%), 3) severe lung disease (FEV 1 < 40%), 4) lung transplantation, and 5) death. All inputs were derived from published literature. We estimated lumacaftor/ivacaftor's improvement in outcomes compared with a non-CF referent population as well as CF-specific mortality estimates. Lumacaftor/ivacaftor was associated with additional 2.91 life-years (95% credible interval 2.55-3.56) and additional 2.42 quality-adjusted life-years (QALYs) (95% credible interval 2.10-2.98). Lumacaftor/ivacaftor was associated with improvements in survival and QALYs equivalent to 27.6% and 20.7%, respectively, for the survival and QALY gaps between CF usual care and their non-CF peers. The incremental lifetime cost was $2,632,249. Lumacaftor/ivacaftor increased life-years and QALYs in CF patients with the homozygous phe508del mutation and moved morbidity and mortality closer to that of their non-CF peers but it came with higher cost. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
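The lifetime Markov cohort model described above advances a state-occupancy vector through an annual transition matrix while accumulating discounted life-years and QALYs. A sketch with hypothetical transition probabilities and utilities; the article's literature-derived inputs are not reproduced here:

```python
import numpy as np

# Five health states as in the article: mild, moderate, severe lung disease,
# lung transplantation, and death
states = ["mild", "moderate", "severe", "transplant", "dead"]

# Hypothetical annual transition matrix (rows sum to 1)
P = np.array([
    [0.90, 0.08, 0.01, 0.00, 0.01],
    [0.05, 0.85, 0.07, 0.01, 0.02],
    [0.00, 0.05, 0.80, 0.08, 0.07],
    [0.00, 0.00, 0.00, 0.90, 0.10],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])
utility = np.array([0.85, 0.75, 0.60, 0.65, 0.0])   # hypothetical QALY weights

cohort = np.array([1.0, 0.0, 0.0, 0.0, 0.0])        # everyone starts mild
discount = 0.03
qalys = life_years = 0.0
for year in range(60):                               # lifetime horizon
    d = 1 / (1 + discount) ** year
    qalys += d * cohort @ utility                    # discounted QALYs this year
    life_years += d * (1 - cohort[-1])               # discounted survival
    cohort = cohort @ P                              # advance one year
```

A treatment effect enters by modifying the transition probabilities (slower FEV1 decline), and incremental costs and QALYs are obtained by differencing two such runs.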
HESS Opinions: The need for process-based evaluation of large-domain hyper-resolution models
NASA Astrophysics Data System (ADS)
Melsen, Lieke A.; Teuling, Adriaan J.; Torfs, Paul J. J. F.; Uijlenhoet, Remko; Mizukami, Naoki; Clark, Martyn P.
2016-03-01
A meta-analysis on 192 peer-reviewed articles reporting on applications of the variable infiltration capacity (VIC) model in a distributed way reveals that the spatial resolution at which the model is applied has increased over the years, while the calibration and validation time interval has remained unchanged. We argue that the calibration and validation time interval should keep pace with the increase in spatial resolution in order to resolve the processes that are relevant at the applied spatial resolution. We identified six time concepts in hydrological models, which all impact the model results and conclusions. Process-based model evaluation is particularly relevant when models are applied at hyper-resolution, where stakeholders expect credible results both at a high spatial and temporal resolution.
Navas González, Francisco Javier; Jordana Vidal, Jordi; Camacho Vallejo, María Esperanza; León Jurado, Jose Manuel; de la Haba Giraldo, Manuel Rafael; Barba Capote, Cecilio; Delgado Bermejo, Juan Vicente
2018-03-15
Cutaneous habronematidosis (CH) is a highly prevalent, seasonally recurrent skin disease of donkeys caused by the larvae of spirurid stomach worms. Carrier flies mistakenly deposit these larvae on existing skin lesions or on the moisture of natural orifices, causing distress and inflicting relapsing wounds on the animals. First, we carried out a meta-analysis of the predisposing factors that could condition the development of CH in Andalusian donkeys. Second, building on owners' empirical reports of inter- and intrafamilial variation, we isolated the genetic background behind hypersensitivity to this parasitic disease. To this aim, we designed a Bayesian linear model (BLM) to estimate breeding values and genetic parameters for hypersensitivity to CH, as a way to infer the potential suitability of this trait for selection and to improve donkey conservation programs. We studied the historical record of CH cases in 765 donkeys from 1984 to 2017. Fixed effects included birth year, birth season, sex, farm/owner, and husbandry system. Age was included as a linear and quadratic covariate. Although the effects of birth season and birth year were statistically non-significant (P > 0.05), their respective interactions with sex and farm/owner were statistically significant (P < 0.01), which translated into an increase of 40.5% in the specificity and 0.6% in the sensitivity of the model when these interactions were included. Our BLM reported highly accurate genetic parameters, as suggested by the low error of around 0.005 and the 95% credible interval for the heritability of ±0.0012. The heritability of CH hypersensitivity was 0.0346. The additive genetic variance of 0.1232 indicates relatively low genetic variation in the Andalusian donkey breed. Our results suggest that farms managed under extensive husbandry conditions are the most protective against developing CH. Furthermore, these results provide evidence of the lack of repercussion of other factors such as age or sex. Considering CH hypersensitivity as a trait to select against in donkey breeding programs may indirectly improve animal welfare. However, the low heritability makes it compulsory to control environmental factors to ensure the effectiveness of the breeding measures implemented to obtain individuals genetically less prone to develop the condition. Copyright © 2018 Elsevier B.V. All rights reserved.
Barbato, Luigi; Kalemaj, Zamira; Buti, Jacopo; Baccini, Michela; La Marca, Michele; Duvina, Marco; Tonelli, Paolo
2016-03-01
The aim of this systematic review is to evaluate and synthesize scientific evidence on the effect of surgical interventions for removal of mandibular third molar (M3M) on periodontal healing of adjacent mandibular second molar (M2M). The protocol was registered at PROSPERO (International Prospective Register of Systematic Reviews) as CRD42012003059. Medline, Cochrane, and EMBASE databases were interrogated to identify randomized controlled trials (RCTs) up to December 22, 2014. Patients with M3Ms fully developed, unilaterally or bilaterally impacted, were considered. Outcomes were clinical attachment level gain (CALg) and probing depth reduction (PDr) with a follow-up ≥ 6 months. Patient-subjective outcomes, such as pain, discomfort, and complications, and financial aspects and chair time, were also explored. A Bayesian network meta-analysis model was used to estimate direct and indirect effects and to establish a ranking of treatments. Sixteen RCTs were included and categorized into four groups investigating the following: 1) regenerative/grafting procedures (10 RCTs); 2) flap design (three RCTs); 3) type of suturing (one RCT); and 4) periodontal care of M2M (two RCTs). Guided tissue regeneration (GTR) with resorbable (GTRr) and non-resorbable (GTRnr) membrane and GTRr with anorganic xenograft (GTRr + AX) showed the highest mean ranking for CALg (2.99, 90% credible interval [CrI] = 1 to 5; 2.80, 90% CrI = 1 to 6; and 2.29, 90% CrI = 1 to 6, respectively) and PDr (2.83, 90% CrI = 1 to 5; 2.52, 90% CrI = 1 to 5; and 2.77, 90% CrI = 1 to 6, respectively). GTRr + AX showed the highest probability (Pr) of being the best treatment for CALg (Pr = 45%) and PDr (Pr = 32%). Direct and network quality of evidence were rated from very low to moderate. 
To the best of the authors' knowledge, the present review is the first one to evaluate quantitatively and qualitatively the effect of different interventions on periodontal healing distal to the second molar after extraction of the third molar. GTR-based procedures with or without combined grafting therapies provide some adjunctive clinical benefit compared to standard non-regenerative/non-grafting procedures. However, the overall low quality of evidence suggests a low degree of confidence and certainty in treatment effects. Evidence on variations of surgical M3M removal techniques based on flap design, type of suturing, and periodontal care of M2M is limited both qualitatively and quantitatively.
Clements, Michelle N; Corstjens, Paul L A M; Binder, Sue; Campbell, Carl H; de Dood, Claudia J; Fenwick, Alan; Harrison, Wendy; Kayugi, Donatien; King, Charles H; Kornelis, Dieuwke; Ndayishimiye, Onesime; Ortu, Giuseppina; Lamine, Mariama Sani; Zivieri, Antonio; Colley, Daniel G; van Dam, Govert J
2018-02-23
Kato-Katz examination of stool smears is the field-standard method for detecting Schistosoma mansoni infection. However, Kato-Katz misses many active infections, especially of light intensity. Point-of-care circulating cathodic antigen (CCA) is an alternative field diagnostic that is more sensitive than Kato-Katz when intensity is low, but interpretation of CCA-trace results is unclear. To evaluate trace results, we tested urine and stool specimens from 398 pupils from eight schools in Burundi using four approaches: two in Burundi and two in a laboratory in Leiden, the Netherlands. In Burundi, we used Kato-Katz and point-of-care CCA (CCAB). In Leiden, we repeated the CCA (CCAL) and also used Up-Converting Phosphor Circulating Anodic Antigen (CAA). We applied Bayesian latent class analyses (LCA), first considering CCA traces as negative and then as positive. We used the LCA output to estimate validity of the prevalence estimates of each test in comparison to the population-level infection prevalence and estimated the proportion of trace results that were likely true positives. Kato-Katz yielded the lowest prevalence (6.8%), and CCAB with trace considered positive yielded the highest (53.5%). There were many more trace results recorded by CCA in Burundi (32.4%) than in Leiden (2.3%). Estimated prevalence with CAA was 46.5%. LCA indicated that Kato-Katz had the lowest sensitivity: 15.9% [Bayesian Credible Interval (BCI): 9.2-23.5%] with CCA-trace considered negative and 15.0% with trace as positive (BCI: 9.6-21.4%), implying that Kato-Katz missed approximately 85% of infections. CCAB underestimated disease prevalence when trace was considered negative and overestimated disease prevalence when trace was considered positive, by approximately 12 percentage points each way, and CAA overestimated prevalence in both models. Our results suggest that approximately 52.2% (BCI: 37.8-5.8%) of the CCAB trace readings were true infections. 
Whether measured in the laboratory or the field, CCA outperformed Kato-Katz at the low infection intensities in Burundi. CCA with trace as negative likely missed many infections, whereas CCA with trace as positive overestimated prevalence. In the absence of a field-friendly gold standard diagnostic, the use of a variety of diagnostics with differing properties will become increasingly important as programs move towards elimination of schistosomiasis. It is clear that CCA is a valuable tool for the detection and mapping of S. mansoni infection in the field and CAA may be a valuable field tool in the future.
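A simpler, non-Bayesian way to see why Kato-Katz's low sensitivity matters is the Rogan-Gladen correction, which maps apparent (test-positive) prevalence back to an estimate of true prevalence given a test's sensitivity and specificity. The study itself used Bayesian latent class analysis; the numbers below are illustrative, in the spirit of its estimates:

```python
def true_prevalence(apparent, sensitivity, specificity):
    """Rogan-Gladen estimator: convert apparent prevalence into an
    estimate of true prevalence, clamped to [0, 1]."""
    est = (apparent + specificity - 1) / (sensitivity + specificity - 1)
    return min(max(est, 0.0), 1.0)

# Illustrative: a 6.8% apparent prevalence under a test with ~16% sensitivity
# and high specificity implies a much higher true prevalence
p_corrected = true_prevalence(0.068, sensitivity=0.16, specificity=0.99)
```

Latent class analysis generalizes this idea by estimating the sensitivities and specificities of several imperfect tests jointly, without assuming any of them is a gold standard.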
A Comparison of the β-Substitution Method and a Bayesian Method for Analyzing Left-Censored Data.
Huynh, Tran; Quick, Harrison; Ramachandran, Gurumurthy; Banerjee, Sudipto; Stenzel, Mark; Sandler, Dale P; Engel, Lawrence S; Kwok, Richard K; Blair, Aaron; Stewart, Patricia A
2016-01-01
Classical statistical methods for analyzing exposure data with values below the detection limits are well described in the occupational hygiene literature, but an evaluation of a Bayesian approach for handling such data is currently lacking. Here, we first describe a Bayesian framework for analyzing censored data. We then present the results of a simulation study conducted to compare the β-substitution method with a Bayesian method for exposure datasets drawn from lognormal distributions and mixed lognormal distributions with varying sample sizes, geometric standard deviations (GSDs), and censoring for single and multiple limits of detection. For each set of factors, estimates for the arithmetic mean (AM), geometric mean, GSD, and the 95th percentile (X0.95) of the exposure distribution were obtained. We evaluated the performance of each method using relative bias, the root mean squared error (rMSE), and coverage (the proportion of the computed 95% uncertainty intervals containing the true value). The Bayesian method using non-informative priors and the β-substitution method were generally comparable in bias and rMSE when estimating the AM and GM. For the GSD and the 95th percentile, the Bayesian method with non-informative priors was more biased and had a higher rMSE than the β-substitution method, but use of more informative priors generally improved the Bayesian method's performance, making both the bias and the rMSE more comparable to the β-substitution method. An advantage of the Bayesian method is that it provided estimates of uncertainty for these parameters of interest and good coverage, whereas the β-substitution method only provided estimates of uncertainty for the AM, and coverage was not as consistent. Selection of one or the other method depends on the needs of the practitioner, the availability of prior information, and the distribution characteristics of the measurement data. 
We suggest the use of Bayesian methods if the practitioner has the computational resources and prior information, as the method would generally provide accurate estimates and also provides the distributions of all of the parameters, which could be useful for making decisions in some applications. © The Author 2015. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
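To make the censoring setup concrete, here is a minimal sketch (not from the paper) that simulates a lognormal exposure dataset with a single limit of detection and applies the classical LOD/√2 substitution shortcut. The β-substitution method the paper evaluates refines this by tuning the substitution factor, and the Bayesian alternative would instead treat the censored values as latent quantities; all parameter values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate a lognormal exposure dataset (GM = 1.0, GSD = 2.5) with a
# single limit of detection (LOD); values below the LOD are censored.
gm, gsd, lod, n = 1.0, 2.5, 0.5, 200
x = rng.lognormal(mean=np.log(gm), sigma=np.log(gsd), size=n)
censored = x < lod

# Simple substitution: replace censored values by LOD / sqrt(2).
# (This is the classical shortcut, NOT the beta-substitution method
# evaluated in the paper, which adjusts the substitution factor.)
x_sub = np.where(censored, lod / np.sqrt(2), x)

am_hat = x_sub.mean()                         # arithmetic mean (AM)
gm_hat = np.exp(np.log(x_sub).mean())         # geometric mean (GM)
gsd_hat = np.exp(np.log(x_sub).std(ddof=1))   # geometric SD (GSD)

print(f"{censored.mean():.0%} censored; AM={am_hat:.2f}, "
      f"GM={gm_hat:.2f}, GSD={gsd_hat:.2f}")
```

A Bayesian fit would return full posterior distributions for AM, GM, GSD, and X0.95 rather than these point values, which is the uncertainty advantage the abstract highlights.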
Credibility analysis of risk classes by generalized linear model
NASA Astrophysics Data System (ADS)
Erdemir, Ovgucan Karadag; Sucu, Meral
2016-06-01
In this paper, the generalized linear model (GLM) and credibility theory, both frequently used in non-life insurance pricing, are combined for credibility analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data from a Turkish insurance company, and the results for credible risk classes are interpreted.
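The limited fluctuation approach mentioned here has a standard closed form for claim frequency, sketched below. The probability level p, tolerance k, and premium figures are illustrative defaults, not values from the paper.

```python
from math import sqrt
from statistics import NormalDist

def full_credibility_standard(p=0.90, k=0.05):
    """Expected claim count needed for full credibility of the claim
    frequency under the classical limited-fluctuation (Poisson) model:
    the observed frequency must lie within 100k% of its mean with
    probability p."""
    z = NormalDist().inv_cdf((1 + p) / 2)
    return (z / k) ** 2

def credibility_factor(n, n_full):
    """Square-root rule for partial credibility, capped at 1."""
    return min(1.0, sqrt(n / n_full))

def credibility_premium(class_mean, portfolio_mean, z):
    """Blend a risk class's own experience with the portfolio mean,
    e.g. a mean fitted by the GLM."""
    return z * class_mean + (1 - z) * portfolio_mean

n_full = full_credibility_standard()      # about 1082 expected claims
z = credibility_factor(300, n_full)       # partial credibility for 300 claims
```

With only 300 claims the class gets roughly half weight, and the rest of the estimate is pulled toward the portfolio-level (GLM) mean.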
CredibleMeds.org: What does it offer?
Woosley, Raymond L; Black, Kristin; Heise, C William; Romero, Klaus
2018-02-01
Since the 1990s, when numerous non-cardiac drugs were first recognized to have the potential to prolong the QT interval and cause torsades de pointes (TdP), clinicians, drug regulators, drug developers, and clinical investigators have become aware of the complexities of assessing evidence and determining TdP causality for the many drugs being marketed or under development. To facilitate better understanding, the Arizona Center for Education and Research on Therapeutics, known as AZCERT, has developed the CredibleMeds.org website, which includes QTdrugs, a listing of over 220 drugs placed in four risk categories based on their association with QT prolongation and TdP. Since the site was launched in 1999, it has become the single most reliable source of information of its kind for patients, healthcare providers, and research scientists. Over 96,000 registered users rely on the QTdrugs database as their primary resource to inform their medication use, their prescribing, or their clinical research into the impact of QT-prolonging drugs and drug-induced arrhythmias. The QTdrugs lists are increasingly used as the basis for clinical decision support systems in healthcare and for metrics of prescribing quality by healthcare insurers. A free smartphone app and an application program interface enable rapid and mobile access to the lists. Also, the CredibleMeds website offers numerous educational resources for patients, educators, and healthcare providers that foster the safe use of medications. Copyright © 2018 Elsevier Inc. All rights reserved.
Pocket Handbook on Reliability
1975-09-01
Topics covered include the exponential and Weibull distributions, estimating reliability, confidence intervals, reliability growth, O.C. curves, and Bayesian analysis. The handbook serves as an introduction for those not familiar with reliability and as a refresher for those currently working in the area. Its use includes one or both of the following objectives: (a) prediction of the current system reliability; (b) projection of the system reliability for some future time.
ERIC Educational Resources Information Center
Jackson, Dan
2013-01-01
Statistical inference is problematic in the common situation in meta-analysis where the random effects model is fitted to just a handful of studies. In particular, the asymptotic theory of maximum likelihood provides a poor approximation, and Bayesian methods are sensitive to the prior specification. Hence, less efficient, but easily computed and…
"Magnitude-based inference": a statistical review.
Welsh, Alan H; Knight, Emma J
2015-04-01
We consider "magnitude-based inference" and its interpretation by examining in detail its use in the problem of comparing two means. We extract from the spreadsheets, which are provided to users of the analysis (http://www.sportsci.org/), a precise description of how "magnitude-based inference" is implemented. We compare the implemented version of the method with general descriptions of it and interpret the method in familiar statistical terms. We show that "magnitude-based inference" is not a progressive improvement on modern statistics. The additional probabilities introduced are not directly related to the confidence interval but, rather, are interpretable either as P values for two different nonstandard tests (for different null hypotheses) or as approximate Bayesian calculations, which also lead to a type of test. We also discuss sample size calculations associated with "magnitude-based inference" and show that the substantial reduction in sample sizes claimed for the method (30% of the sample size obtained from standard frequentist calculations) is not justifiable so the sample size calculations should not be used. Rather than using "magnitude-based inference," a better solution is to be realistic about the limitations of the data and use either confidence intervals or a fully Bayesian analysis.
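The review's point that the extra "chances" behave like approximate Bayesian calculations can be sketched directly: under a flat prior, the posterior for the true effect is roughly Normal(estimate, SE), and the beneficial/trivial/harmful probabilities are its tail areas around a smallest-important threshold. The numbers below are illustrative, not taken from the spreadsheets the review examined.

```python
from statistics import NormalDist

def mbi_chances(estimate, se, threshold):
    """Approximate-Bayesian reading of the 'magnitude-based inference'
    probabilities: tail areas of a Normal(estimate, se) posterior
    around a smallest-important effect of +/- threshold."""
    post = NormalDist(mu=estimate, sigma=se)
    p_harm = post.cdf(-threshold)           # P(effect < -threshold)
    p_benefit = 1 - post.cdf(threshold)     # P(effect > +threshold)
    p_trivial = 1 - p_harm - p_benefit      # P(|effect| <= threshold)
    return p_benefit, p_trivial, p_harm

p_b, p_t, p_h = mbi_chances(estimate=0.3, se=0.2, threshold=0.2)
```

Treating these tail areas as if they controlled frequentist error rates is exactly the conflation the review criticizes; a fully Bayesian analysis states the prior and reports such probabilities as posterior statements.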
Internal Medicine residents use heuristics to estimate disease probability
Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin
2015-01-01
Background Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) using a representative heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) using anchoring with adjustment heuristic, by providing a high or low anchor for the target condition. Results When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Conclusions Our findings suggest that despite previous exposure to the use of Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning or perhaps residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing. PMID:27004080
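The normative benchmark the vignettes test against is Bayes' rule in odds form: a non-discriminating feature has likelihood ratio 1 and should leave the probability estimate unchanged, and an anchor should carry no weight at all. A minimal sketch (illustrative numbers, not the study's vignette data):

```python
def posterior_probability(prior_prob, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds x LR,
    then convert back to a probability."""
    prior_odds = prior_prob / (1 - prior_prob)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# A non-discriminating clinical feature (LR = 1) carries no
# information, so the normative estimate does not move:
unchanged = posterior_probability(0.30, 1.0)

# A discriminating feature (LR = 4, hypothetical) raises it:
raised = posterior_probability(0.30, 4.0)
```

The study's finding is that residents' estimates moved even in the LR = 1 case, consistent with representativeness and anchoring rather than this calculation.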
Bayesian Estimation of Small Effects in Exercise and Sports Science.
Mengersen, Kerrie L; Drovandi, Christopher C; Robert, Christian P; Pyne, David B; Gore, Christopher J
2016-01-01
The aim of this paper is to provide a Bayesian formulation of the so-called magnitude-based inference approach to quantifying and interpreting effects, and in a case study example provide accurate probabilistic statements that correspond to the intended magnitude-based inferences. The model is described in the context of a published small-scale athlete study which employed a magnitude-based inference approach to compare the effect of two altitude training regimens (live high-train low (LHTL), and intermittent hypoxic exposure (IHE)) on running performance and blood measurements of elite triathletes. The posterior distributions, and corresponding point and interval estimates, for the parameters and associated effects and comparisons of interest, were estimated using Markov chain Monte Carlo simulations. The Bayesian analysis was shown to provide more direct probabilistic comparisons of treatments and able to identify small effects of interest. The approach avoided asymptotic assumptions and overcame issues such as multiple testing. Bayesian analysis of unscaled effects showed a probability of 0.96 that LHTL yields a substantially greater increase in hemoglobin mass than IHE, a 0.93 probability of a substantially greater improvement in running economy and a greater than 0.96 probability that both IHE and LHTL yield a substantially greater improvement in maximum blood lactate concentration compared to a Placebo. The conclusions are consistent with those obtained using a 'magnitude-based inference' approach that has been promoted in the field. The paper demonstrates that a fully Bayesian analysis is a simple and effective way of analysing small effects, providing a rich set of results that are straightforward to interpret in terms of probabilistic statements.
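The probabilistic statements in this abstract ("a probability of 0.96 that LHTL yields a substantially greater increase...") are posterior tail probabilities computed from MCMC draws. The sketch below uses simulated draws and an assumed smallest-important threshold purely for illustration; in the paper they would come from the fitted model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for MCMC draws of a treatment difference (e.g. change in
# hemoglobin mass, LHTL minus IHE); simulated here for illustration.
posterior_draws = rng.normal(loc=2.0, scale=1.0, size=20_000)

smallest_important = 0.5   # assumed threshold for a 'substantial' effect

# The paper's probabilistic comparisons are tail probabilities and
# quantiles of draws like these:
p_substantial = np.mean(posterior_draws > smallest_important)
ci_95 = np.percentile(posterior_draws, [2.5, 97.5])
```

Because the probability is read straight off the posterior, no multiplicity correction or asymptotic approximation is needed, which is the advantage the paper emphasizes for small samples.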
Another look at confidence intervals: Proposal for a more relevant and transparent approach
NASA Astrophysics Data System (ADS)
Biller, Steven D.; Oser, Scott M.
2015-02-01
The behaviors of various confidence/credible interval constructions are explored, particularly in the region of low event numbers where methods diverge most. We highlight a number of challenges, such as the treatment of nuisance parameters, and common misconceptions associated with such constructions. An informal survey of the literature suggests that confidence intervals are not always defined in relevant ways and are too often misinterpreted and/or misapplied. This can lead to seemingly paradoxical behaviors and flawed comparisons regarding the relevance of experimental results. We therefore conclude that there is a need for a more pragmatic strategy which recognizes that, while it is critical to objectively convey the information content of the data, there is also a strong desire to derive bounds on model parameter values and a natural instinct to interpret things this way. Accordingly, we attempt to put aside philosophical biases in favor of a practical view to propose a more transparent and self-consistent approach that better addresses these issues.
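The divergence at low event numbers is easy to demonstrate by simulation. As one hedged example (the Wald construction is our illustrative choice, not one the paper singles out), the naive interval for a single Poisson count undercovers badly when the mean is small but behaves at large means:

```python
import numpy as np

rng = np.random.default_rng(0)

def wald_coverage(mu, n_sim=100_000):
    """Empirical coverage of the naive 95% Wald interval
    k +/- 1.96*sqrt(k) for a single Poisson count k with mean mu."""
    k = rng.poisson(mu, size=n_sim)
    half = 1.96 * np.sqrt(k)
    covered = (k - half <= mu) & (mu <= k + half)
    return covered.mean()

low = wald_coverage(mu=2.0)     # well below the nominal 0.95
high = wald_coverage(mu=100.0)  # close to nominal
```

Note the k = 0 pathology: the interval collapses to a point and can never cover a positive mean, which alone removes about 13.5% coverage at mu = 2. This is the regime where the choice among exact, profile-likelihood, and Bayesian constructions matters most.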
Assessing Mediational Models: Testing and Interval Estimation for Indirect Effects.
Biesanz, Jeremy C; Falk, Carl F; Savalei, Victoria
2010-08-06
Theoretical models specifying indirect or mediated effects are common in the social sciences. An indirect effect exists when an independent variable's influence on the dependent variable is mediated through an intervening variable. Classic approaches to assessing such mediational hypotheses (Baron & Kenny, 1986; Sobel, 1982) have in recent years been supplemented by computationally intensive methods such as bootstrapping, the distribution of the product methods, and hierarchical Bayesian Markov chain Monte Carlo (MCMC) methods. These different approaches for assessing mediation are illustrated using data from Dunn, Biesanz, Human, and Finn (2007). However, little is known about how these methods perform relative to each other, particularly in more challenging situations, such as with data that are incomplete and/or nonnormal. This article presents an extensive Monte Carlo simulation evaluating a host of approaches for assessing mediation. We examine Type I error rates, power, and coverage. We study normal and nonnormal data as well as complete and incomplete data. In addition, we adapt a method, recently proposed in the statistical literature, that does not rely on confidence intervals (CIs) to test the null hypothesis of no indirect effect. The results suggest that the new inferential method, the partial posterior p value, slightly outperforms existing ones in terms of maintaining Type I error rates while maximizing power, especially with incomplete data. Among confidence interval approaches, the bias-corrected accelerated (BCa) bootstrapping approach often has inflated Type I error rates and inconsistent coverage and is not recommended. In contrast, the bootstrapped percentile confidence interval and the hierarchical Bayesian MCMC method perform best overall, maintaining Type I error rates, exhibiting reasonable power, and producing stable and accurate coverage rates.
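The percentile bootstrap that performs well in this simulation study is straightforward to sketch. Below is a minimal version on simulated data (the variable names, sample size, and effect sizes are illustrative, not from the article): the indirect effect a*b is the slope of M on X times the partial slope of Y on M controlling for X, resampled with replacement.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated mediation data: X -> M -> Y with true indirect effect
# a*b = 0.5 * 0.4 = 0.2 (illustrative values).
n = 200
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.4 * m + 0.2 * x + rng.normal(size=n)

def indirect_effect(x, m, y):
    """a*b from two least-squares fits: M ~ X gives a;
    Y ~ M + X gives b (the coefficient on M)."""
    Xa = np.column_stack([x, np.ones_like(x)])
    a = np.linalg.lstsq(Xa, m, rcond=None)[0][0]
    Xb = np.column_stack([m, x, np.ones_like(x)])
    b = np.linalg.lstsq(Xb, y, rcond=None)[0][0]
    return a * b

est = indirect_effect(x, m, y)

# Percentile bootstrap CI (the simpler cousin of BCa, which the
# article finds better behaved for this problem):
boot = np.empty(2000)
for i in range(boot.size):
    idx = rng.integers(0, n, size=n)
    boot[i] = indirect_effect(x[idx], m[idx], y[idx])
ci = np.percentile(boot, [2.5, 97.5])
```

Resampling a*b directly sidesteps the non-normal sampling distribution of a product of coefficients, which is why the classic Sobel normal-theory test loses power here.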
Black, Andrew J.; Ross, Joshua V.
2013-01-01
The clinical serial interval of an infectious disease is the time between date of symptom onset in an index case and the date of symptom onset in one of its secondary cases. It is a quantity which is commonly collected during a pandemic and is of fundamental importance to public health policy and mathematical modelling. In this paper we present a novel method for calculating the serial interval distribution for a Markovian model of household transmission dynamics. This allows the use of Bayesian MCMC methods, with explicit evaluation of the likelihood, to fit to serial interval data and infer parameters of the underlying model. We use simulated and real data to verify the accuracy of our methodology and illustrate the importance of accounting for household size. The output of our approach can be used to produce posterior distributions of population level epidemic characteristics. PMID:24023679
Smith, Klayton; Boone, Kyle; Victor, Tara; Miora, Deborah; Cottingham, Maria; Ziegler, Elizabeth; Zeller, Michelle; Wright, Matthew
2014-01-01
The purpose of this archival study was to identify performance validity tests (PVTs) and standard IQ and neurocognitive test scores, which singly or in combination, differentiate credible patients of low IQ (FSIQ ≤ 75; n = 55) from non-credible patients. We compared the credible participants against a sample of 74 non-credible patients who appeared to have been attempting to feign low intelligence specifically (FSIQ ≤ 75), as well as a larger non-credible sample (n = 383) unselected for IQ. The entire non-credible group scored significantly higher than the credible participants on measures of verbal crystallized intelligence/semantic memory and manipulation of overlearned information, while the credible group performed significantly better on many processing speed and memory tests. Additionally, credible women showed faster finger-tapping speeds than non-credible women. The credible group also scored significantly higher than the non-credible subgroup with low IQ scores on measures of attention, visual perceptual/spatial tasks, processing speed, verbal learning/list learning, and visual memory, and credible women continued to outperform non-credible women on finger tapping. When cut-offs were selected to maintain approximately 90% specificity in the credible group, sensitivity rates were highest for verbal and visual memory measures (i.e., TOMM trials 1 and 2; Warrington Words correct and time; Rey Word Recognition Test total; RAVLT Effort Equation, Trial 5, total across learning trials, short delay, recognition, and RAVLT/RO discriminant function; and Digit Symbol recognition), followed by select attentional PVT scores (i.e., b Test omissions and time to recite four digits forward). When failure rates were tabulated across seven most sensitive scores, a cut-off of ≥ 2 failures was associated with 85.4% specificity and 85.7% sensitivity, while a cut-off of ≥ 3 failures resulted in 95.1% specificity and 66.0% sensitivity. 
Results are discussed in light of extant literature and directions for future research.
Tang, Zhongwen
2015-01-01
An analytical way to compute predictive probability of success (PPOS) together with credible interval at interim analysis (IA) is developed for big clinical trials with time-to-event endpoints. The method takes account of the fixed data up to IA, the amount of uncertainty in future data, and uncertainty about parameters. Predictive power is a special type of PPOS. The result is confirmed by simulation. An optimal design is proposed by finding optimal combination of analysis time and futility cutoff based on some PPOS criteria.
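The logic of PPOS can be sketched in a simpler setting than the paper's time-to-event endpoints. The example below (a hypothetical one-arm binomial trial, not the paper's design) averages the chance of a positive final analysis over both the posterior for the response rate and the predictive distribution of the not-yet-observed patients, which is exactly the "fixed interim data plus uncertainty about future data and parameters" decomposition the abstract describes:

```python
import numpy as np

rng = np.random.default_rng(3)

def ppos_binomial(successes, n_interim, n_final, p0=0.5, n_draws=20_000):
    """Predictive probability of success (PPOS) at interim analysis
    for a one-arm binomial trial with a flat Beta(1, 1) prior:
    the probability that a one-sided z-test of p <= p0 succeeds
    at the final analysis, averaged over posterior and predictive
    uncertainty."""
    # Posterior for the response rate given the fixed interim data.
    p_draws = rng.beta(1 + successes, 1 + n_interim - successes,
                       size=n_draws)
    # Predictive draws for the remaining, unobserved patients.
    future = rng.binomial(n_final - n_interim, p_draws)
    phat = (successes + future) / n_final
    z = (phat - p0) / np.sqrt(p0 * (1 - p0) / n_final)
    return np.mean(z > 1.645)

promising = ppos_binomial(successes=28, n_interim=40, n_final=80)
futile = ppos_binomial(successes=18, n_interim=40, n_final=80)
```

An optimal design in the paper's sense would scan analysis times and futility cutoffs on quantities like these, stopping arms whose PPOS falls below the chosen threshold.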
The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2)
2008-01-01
California's 35 million people live among some of the most active earthquake faults in the United States. Public safety demands credible assessments of the earthquake hazard to maintain appropriate building codes for safe construction and earthquake insurance for loss protection. Seismic hazard analysis begins with an earthquake rupture forecast: a model of probabilities that earthquakes of specified magnitudes, locations, and faulting types will occur during a specified time interval. This report describes a new earthquake rupture forecast for California developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP 2007).