Science.gov

Sample records for joint bayesian analysis

  1. JBASE: Joint Bayesian Analysis of Subphenotypes and Epistasis

    PubMed Central

    Colak, Recep; Kim, TaeHyung; Kazan, Hilal; Oh, Yoomi; Cruz, Miguel; Valladares-Salgado, Adan; Peralta, Jesus; Escobedo, Jorge; Parra, Esteban J.; Kim, Philip M.; Goldenberg, Anna

    2016-01-01

    Motivation: Rapid advances in genotyping and genome-wide association studies have enabled the discovery of many new genotype–phenotype associations at the resolution of individual markers. However, these associations explain only a small proportion of the theoretically estimated heritability of most diseases. In this work, we propose an integrative mixture model called JBASE: joint Bayesian analysis of subphenotypes and epistasis. JBASE explores two major sources of missing heritability: interactions between genetic variants (a phenomenon known as epistasis) and phenotypic heterogeneity, addressed via subphenotyping. Results: Our extensive simulations in a wide range of scenarios repeatedly demonstrate that JBASE can identify true underlying subphenotypes, including their associated variants and their interactions, with high precision. In the presence of phenotypic heterogeneity, JBASE has higher power and lower Type I error than five state-of-the-art approaches. We applied our method to a sample of individuals from Mexico with Type 2 diabetes and discovered two novel epistatic modules, including two loci each, that define two subphenotypes characterized by differences in body mass index and waist-to-hip ratio. We successfully replicated these subphenotypes and epistatic modules in an independent dataset from Mexico genotyped with a different platform. Availability and implementation: JBASE is implemented in C++, supported on Linux and is available at http://www.cs.toronto.edu/~goldenberg/JBASE/jbase.tar.gz. The genotype data underlying this study are available upon approval by the ethics review board of the Medical Centre Siglo XXI. Please contact Dr Miguel Cruz at mcruzl@yahoo.com for assistance with the application. Contact: anna.goldenberg@utoronto.ca Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26411870

  2. JAM: A Scalable Bayesian Framework for Joint Analysis of Marginal SNP Effects

    PubMed Central

    Newcombe, Paul J.; Conti, David V.; Richardson, Sylvia

    2016-01-01

    Recently, large scale genome‐wide association study (GWAS) meta‐analyses have boosted the number of known signals for some traits into the tens and hundreds. Typically, however, variants are only analysed one‐at‐a‐time. This complicates the ability of fine‐mapping to identify a small set of SNPs for further functional follow‐up. We describe a new and scalable algorithm, joint analysis of marginal summary statistics (JAM), for the re‐analysis of published marginal summary statistics under joint multi‐SNP models. The correlation is accounted for according to estimates from a reference dataset, and models and SNPs that best explain the complete joint pattern of marginal effects are highlighted via an integrated Bayesian penalized regression framework. We provide both enumerated and Reversible Jump MCMC implementations of JAM and present some comparisons of performance. In a series of realistic simulation studies, JAM demonstrated identical performance to various alternatives designed for single region settings. In multi‐region settings, where the only multivariate alternative involves stepwise selection, JAM offered greater power and specificity. We also present an application to real published results from MAGIC (meta‐analysis of glucose and insulin related traits consortium) – a GWAS meta‐analysis of more than 15,000 people. We re‐analysed several genomic regions that produced multiple significant signals with glucose levels 2 hr after oral stimulation. Through joint multivariate modelling, JAM was able to formally rule out many SNPs, and for one gene, ADCY5, suggests that an additional SNP, which transpired to be more biologically plausible, should be followed up with equal priority to the reported index. PMID:27027514

  3. JAM: A Scalable Bayesian Framework for Joint Analysis of Marginal SNP Effects.

    PubMed

    Newcombe, Paul J; Conti, David V; Richardson, Sylvia

    2016-04-01

    Recently, large scale genome-wide association study (GWAS) meta-analyses have boosted the number of known signals for some traits into the tens and hundreds. Typically, however, variants are only analysed one-at-a-time. This complicates the ability of fine-mapping to identify a small set of SNPs for further functional follow-up. We describe a new and scalable algorithm, joint analysis of marginal summary statistics (JAM), for the re-analysis of published marginal summary statistics under joint multi-SNP models. The correlation is accounted for according to estimates from a reference dataset, and models and SNPs that best explain the complete joint pattern of marginal effects are highlighted via an integrated Bayesian penalized regression framework. We provide both enumerated and Reversible Jump MCMC implementations of JAM and present some comparisons of performance. In a series of realistic simulation studies, JAM demonstrated identical performance to various alternatives designed for single region settings. In multi-region settings, where the only multivariate alternative involves stepwise selection, JAM offered greater power and specificity. We also present an application to real published results from MAGIC (meta-analysis of glucose and insulin related traits consortium) - a GWAS meta-analysis of more than 15,000 people. We re-analysed several genomic regions that produced multiple significant signals with glucose levels 2 hr after oral stimulation. Through joint multivariate modelling, JAM was able to formally rule out many SNPs, and for one gene, ADCY5, suggests that an additional SNP, which transpired to be more biologically plausible, should be followed up with equal priority to the reported index.
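
    The linear-algebra core of this re-analysis, mapping marginal one-at-a-time effects into joint multi-SNP effects through a reference correlation (LD) matrix, is easy to illustrate. Below is a minimal numpy sketch of that step only, with invented toy numbers; it omits JAM's Bayesian penalized-regression model search entirely.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy setting: 3 SNPs, only SNP 0 truly causal, SNPs 0 and 1 in strong LD.
    R = np.array([[1.0, 0.8, 0.1],   # reference correlation (LD) matrix,
                  [0.8, 1.0, 0.1],   # e.g. estimated from a reference panel
                  [0.1, 0.1, 1.0]])
    beta_joint_true = np.array([0.5, 0.0, 0.0])

    # For standardized genotypes and phenotype, marginal one-SNP effects are
    # LD-smeared joint effects: E[beta_marginal] = R @ beta_joint.
    beta_marginal = R @ beta_joint_true + rng.normal(0.0, 0.01, size=3)

    # Joint re-analysis: undo the smearing using only summary statistics and R.
    beta_joint_hat = np.linalg.solve(R, beta_marginal)

    print("marginal effects:", beta_marginal.round(3))   # SNPs 0 and 1 both look associated
    print("joint effects:   ", beta_joint_hat.round(3))  # only SNP 0 survives
    ```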

  4. A Bayesian approach to joint analysis of multivariate longitudinal data and parametric accelerated failure time.

    PubMed

    Luo, Sheng

    2014-02-20

    Impairment caused by Parkinson's disease (PD) is multidimensional (e.g., sensory, functional, and cognitive) and progressive. Its multidimensional nature precludes a single outcome to measure disease progression. Clinical trials of PD use multiple categorical and continuous longitudinal outcomes to assess the treatment effects on overall improvement. A terminal event such as death or dropout can stop the follow-up process. Moreover, the time to the terminal event may be dependent on the multivariate longitudinal measurements. In this article, we consider a joint random-effects model for the correlated outcomes. A multilevel item response theory model is used for the multivariate longitudinal outcomes and a parametric accelerated failure time model is used for the failure time because of the violation of the proportional hazards assumption. These two models are linked via random effects. The Bayesian inference via MCMC is implemented in the 'BUGS' language. Our proposed method is evaluated by a simulation study and is applied to the DATATOP study, a motivating clinical trial to determine if deprenyl slows the progression of PD.
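
    A minimal sketch of the linkage structure this abstract describes: a longitudinal submodel and a log-normal AFT submodel share a subject-level random effect, and the joint likelihood integrates it out. The sketch below simplifies the paper's multilevel IRT measurement model to a single Gaussian longitudinal outcome, and all parameter values are invented.

    ```python
    import numpy as np
    from numpy.polynomial.hermite_e import hermegauss
    from scipy.special import logsumexp
    from scipy.stats import norm

    def subject_loglik(t_obs, y_obs, event_time, beta0, beta1, sigma_y,
                       alpha0, alpha1, sigma_t, sigma_b, n_quad=30):
        """Joint log-likelihood for one subject: Gaussian longitudinal outcome
        and a log-normal AFT event time, linked by a shared random intercept b,
        which is integrated out by Gauss-Hermite quadrature (b ~ N(0, sigma_b^2))."""
        nodes, weights = hermegauss(n_quad)          # weight function exp(-x^2/2)
        b = nodes * sigma_b
        log_w = np.log(weights) - 0.5 * np.log(2.0 * np.pi)
        # Longitudinal submodel: y_ij ~ N(beta0 + b + beta1 * t_ij, sigma_y^2).
        mu_y = beta0 + b[:, None] + beta1 * t_obs[None, :]
        ll_y = norm.logpdf(y_obs[None, :], mu_y, sigma_y).sum(axis=1)
        # Survival submodel (AFT): log T ~ N(alpha0 + alpha1 * b, sigma_t^2).
        ll_t = norm.logpdf(np.log(event_time), alpha0 + alpha1 * b, sigma_t)
        return logsumexp(log_w + ll_y + ll_t)

    # One subject measured at three visits, with the terminal event at t = 4.2:
    print(subject_loglik(np.array([0.0, 1.0, 2.0]), np.array([1.1, 1.6, 2.4]), 4.2,
                         beta0=1.0, beta1=0.6, sigma_y=0.3,
                         alpha0=1.5, alpha1=-0.5, sigma_t=0.4, sigma_b=0.5))
    ```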

  5. Bayesian Mediation Analysis

    ERIC Educational Resources Information Center

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
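
    The record is truncated above, but the central computation in Bayesian mediation analysis is simple to sketch: the indirect effect is the product ab of two path coefficients, and its posterior follows directly from posterior draws of a and b, with no normality assumption on the product. The "posterior draws" below are stand-in normal approximations, not output from a fitted model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in posteriors for the two mediation paths (X -> M and M -> Y | X),
    # e.g. normal approximations from regressions or actual MCMC output.
    a = rng.normal(0.40, 0.10, 50_000)   # posterior draws of path a
    b = rng.normal(0.30, 0.12, 50_000)   # posterior draws of path b

    ab = a * b                           # posterior of the indirect effect
    lo, hi = np.percentile(ab, [2.5, 97.5])
    print(f"indirect effect: mean={ab.mean():.3f}, 95% CrI=({lo:.3f}, {hi:.3f})")
    print(f"P(indirect effect > 0) = {(ab > 0).mean():.3f}")
    ```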

  6. Bayesian Exploratory Factor Analysis

    PubMed Central

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  7. Bayesian joint modeling of longitudinal and spatial survival AIDS data.

    PubMed

    Martins, Rui; Silva, Giovani L; Andreozzi, Valeska

    2016-08-30

    Joint analysis of longitudinal and survival data has received increasing attention in recent years, especially for analyzing cancer and AIDS data. As both repeated measurements (longitudinal) and time-to-event (survival) outcomes are observed in an individual, a joint model is more appropriate because it takes into account the dependence between the two types of responses, which are often analyzed separately. We propose a Bayesian hierarchical model for jointly modeling longitudinal and survival data, considering functional time effects and spatial frailty effects, respectively. That is, the proposed model deals with non-linear longitudinal effects and spatial survival effects, accounting for the unobserved heterogeneity among individuals living in the same region. This joint approach is applied to a cohort study of patients with HIV/AIDS in Brazil during the years 2002-2006. Our Bayesian joint model presents considerable improvements in the estimation of survival times of the Brazilian HIV/AIDS patients when compared with those obtained through a separate survival model and shows that the spatial risk of death is the same across the different Brazilian states. Copyright © 2016 John Wiley & Sons, Ltd.

  8. Bayesian bivariate meta-analysis of correlated effects: Impact of the prior distributions on the between-study correlation, borrowing of strength, and joint inferences.

    PubMed

    Burke, Danielle L; Bujkiewicz, Sylwia; Riley, Richard D

    2016-03-17

    Multivariate random-effects meta-analysis allows the joint synthesis of correlated results from multiple studies, for example, for multiple outcomes or multiple treatment groups. In a Bayesian univariate meta-analysis of one endpoint, the importance of specifying a sensible prior distribution for the between-study variance is well understood. However, in multivariate meta-analysis, there is little guidance about the choice of prior distributions for the variances or, crucially, the between-study correlation, ρB; for the latter, researchers often use a Uniform(-1,1) distribution assuming it is vague. In this paper, an extensive simulation study and a real illustrative example are used to examine the impact of various (realistically) vague prior distributions for ρB and the between-study variances within a Bayesian bivariate random-effects meta-analysis of two correlated treatment effects. A range of diverse scenarios are considered, including complete and missing data, to examine the impact of the prior distributions on posterior results (for treatment effect and between-study correlation), the amount of borrowing of strength, and joint predictive distributions of treatment effectiveness in new studies. Two key recommendations are identified to improve the robustness of multivariate meta-analysis results. First, the routine use of a Uniform(-1,1) prior distribution for ρB should be avoided, if possible, as it is not necessarily vague. Instead, researchers should identify a sensible prior distribution, for example, by restricting values to be positive or negative as indicated by prior knowledge. Second, it remains critical to use sensible (e.g. empirically based) prior distributions for the between-study variances, as an inappropriate choice can adversely impact the posterior distribution for ρB, which may then adversely affect inferences such as joint predictive probabilities. These recommendations are especially important with a small number of studies and missing
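
    Why a Uniform(-1,1) prior on ρB is not innocuous is easy to demonstrate: with only a handful of studies the likelihood is nearly flat in ρB, so the posterior largely echoes the prior. A toy grid computation follows (bivariate normal with known unit variances; all numbers invented), comparing the uniform prior with a sign-restricted prior of the kind the paper recommends.

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(3)
    true_rho, n_studies = 0.6, 5
    data = rng.multivariate_normal([0, 0], [[1, true_rho], [true_rho, 1]], n_studies)

    # Grid likelihood for the correlation of a bivariate normal (variances known).
    rho = np.linspace(-0.99, 0.99, 397)
    loglik = np.array([multivariate_normal([0, 0], [[1, r], [r, 1]]).logpdf(data).sum()
                       for r in rho])
    lik = np.exp(loglik - loglik.max())

    for name, prior in [("Uniform(-1,1)", np.ones_like(rho)),
                        ("Uniform(0,1), sign known a priori", (rho > 0).astype(float))]:
        post = lik * prior
        post /= post.sum()                     # normalize on the uniform grid
        print(f"{name:34s} posterior mean of rho_B = {(rho * post).sum():+.3f}")
    ```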

  9. An Efficient Joint Formulation for Bayesian Face Verification.

    PubMed

    Chen, Dong; Cao, Xudong; Wipf, David; Wen, Fang; Sun, Jian

    2017-01-01

    This paper revisits the classical Bayesian face recognition algorithm from Baback Moghaddam et al. and proposes enhancements tailored to face verification, the problem of predicting whether or not a pair of facial images share the same identity. Like a variety of face verification algorithms, the original Bayesian face model only considers the appearance difference between two faces rather than the raw images themselves. However, we argue that such a fixed and blind projection may prematurely reduce the separability between classes. Consequently, we model two facial images jointly with an appropriate prior that considers intra- and extra-personal variations over the image pairs. This joint formulation is trained using a principled EM algorithm, while testing involves only efficient closed-form computations that are suitable for real-time practical deployment. Supporting theoretical analyses investigate computational complexity, scale-invariance properties, and convergence issues. We also detail important relationships with existing algorithms, such as probabilistic linear discriminant analysis and metric learning. Finally, on extensive experimental evaluations, the proposed model is superior to the classical Bayesian face algorithm and many alternative state-of-the-art supervised approaches, achieving the best test accuracy on three challenging datasets, Labeled Faces in the Wild, Multi-PIE, and YouTube Faces, all with unparalleled computational efficiency.
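
    The joint formulation scores a pair of feature vectors by comparing their likelihood under two zero-mean Gaussian hypotheses: both have S_mu + S_eps on the diagonal blocks, with cross-covariance S_mu if the two images share an identity and 0 otherwise. A direct numpy sketch of that log-likelihood ratio with toy covariances follows; the paper's efficient equivalent closed form, which avoids building the doubled Gaussians, is not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal

    def joint_bayesian_score(x1, x2, S_mu, S_eps):
        """Log-likelihood ratio: same identity vs. different identities."""
        d = len(x1)
        S = S_mu + S_eps
        pair = np.concatenate([x1, x2])
        cov_same = np.block([[S, S_mu], [S_mu, S]])   # shared identity component
        cov_diff = np.block([[S, np.zeros((d, d))], [np.zeros((d, d)), S]])
        return (multivariate_normal(np.zeros(2 * d), cov_same).logpdf(pair)
                - multivariate_normal(np.zeros(2 * d), cov_diff).logpdf(pair))

    rng = np.random.default_rng(7)
    d = 4
    S_mu = 2.0 * np.eye(d)    # identity (between-person) covariance, toy values
    S_eps = 0.5 * np.eye(d)   # within-person (intra-personal) covariance
    identity = rng.normal(0, np.sqrt(2.0), d)
    same = (identity + rng.normal(0, np.sqrt(0.5), d),
            identity + rng.normal(0, np.sqrt(0.5), d))
    diff = (rng.normal(0, np.sqrt(2.5), d), rng.normal(0, np.sqrt(2.5), d))
    print("same-identity pair score:", joint_bayesian_score(*same, S_mu, S_eps))
    print("different-identity score:", joint_bayesian_score(*diff, S_mu, S_eps))
    ```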

  10. Road network safety evaluation using Bayesian hierarchical joint model.

    PubMed

    Wang, Jie; Huang, Helai

    2016-05-01

    Safety and efficiency are commonly regarded as two significant performance indicators of transportation systems. In practice, road network planning has focused on road capacity and transport efficiency whereas the safety level of a road network has received little attention in the planning stage. This study develops a Bayesian hierarchical joint model for road network safety evaluation to help planners take traffic safety into account when planning a road network. The proposed model establishes relationships between road network risk and micro-level variables related to road entities and traffic volume, as well as socioeconomic, trip generation and network density variables at the macro level, which are generally used for long term transportation plans. In addition, network spatial correlation between intersections and their connected road segments is also considered in the model. A road network is carefully selected in order to compare the proposed hierarchical joint model with a previous joint model and a negative binomial model. According to the results of the model comparison, the hierarchical joint model outperforms the joint model and negative binomial model in terms of the goodness-of-fit and predictive performance, which indicates the reasonableness of considering the hierarchical data structure in crash prediction and analysis. Moreover, both random effects at the TAZ level and the spatial correlation between intersections and their adjacent segments are found to be significant, supporting the employment of the hierarchical joint model as an alternative in road-network-level safety modeling as well.

  11. Integrative Bayesian network analysis of genomic data.

    PubMed

    Ni, Yang; Stingo, Francesco C; Baladandayuthapani, Veerabhadran

    2014-01-01

    Rapid development of genome-wide profiling technologies has made it possible to conduct integrative analysis on genomic data from multiple platforms. In this study, we develop a novel integrative Bayesian network approach to investigate the relationships between genetic and epigenetic alterations as well as how these mutations affect a patient's clinical outcome. We take a Bayesian network approach that admits a convenient decomposition of the joint distribution into local distributions. Exploiting prior biological knowledge about regulatory mechanisms, we model each local distribution as a linear regression. This allows us to analyze multi-platform genome-wide data in a computationally efficient manner. We illustrate the performance of our approach through simulation studies. Our methods are motivated by and applied to a multi-platform glioblastoma dataset, from which we reveal several biologically relevant relationships that have been validated in the literature as well as new genes that could potentially be novel biomarkers for cancer progression.

  12. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
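
    The core reinterpretation behind BUS can be sketched in a few lines: augment the prior sample x with u ~ Uniform(0,1) and treat {u·c ≤ L(x)} as the "failure" domain of a reliability problem, so that any rare-event estimator yields posterior samples. The sketch below uses brute-force Monte Carlo in place of FORM/IS/SuS, with an invented one-dimensional likelihood.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)

    # Prior x ~ N(0,1); one observation y_obs with Gaussian noise gives L(x).
    y_obs, s = 1.5, 0.5
    L = lambda x: norm.pdf(y_obs, loc=x, scale=s)
    c = norm.pdf(0, scale=s)          # c >= max_x L(x): valid scaling constant

    # BUS: sample (x, u) from prior x Uniform(0,1); accept where u*c <= L(x).
    n = 1_000_000
    x = rng.standard_normal(n)
    u = rng.random(n)
    accept = u * c <= L(x)
    posterior = x[accept]             # exact posterior samples (rejection sampling)

    # Posterior probability of a rare event, here {x > 3}. Plain Monte Carlo is
    # inefficient in this tail, which is exactly what motivates FORM/IS/SuS.
    print("acceptance rate:", accept.mean())
    print("posterior mean: ", posterior.mean())    # analytic: y/(1+s^2) = 1.2
    print("P(x > 3 | data):", (posterior > 3).mean())
    ```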

  13. Bayesian Model Averaging for Propensity Score Analysis

    ERIC Educational Resources Information Center

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  14. Bayesian Statistics for Biological Data: Pedigree Analysis

    ERIC Educational Resources Information Center

    Stanfield, William D.; Carlton, Matthew A.

    2004-01-01

    Bayes' formula is applied to the biological problem of pedigree analysis to show that Bayesian and non-Bayesian or "classical" methods of probability calculation give different answers. First-year college students of biology can thus be introduced to Bayesian statistics.

  15. Bayesian robust principal component analysis.

    PubMed

    Ding, Xinghao; He, Lihan; Carin, Lawrence

    2011-12-01

    A hierarchical Bayesian model is considered for decomposing a matrix into low-rank and sparse components, assuming the observed matrix is a superposition of the two. The matrix is assumed noisy, with unknown and possibly non-stationary noise statistics. The Bayesian framework infers an approximate representation for the noise statistics while simultaneously inferring the low-rank and sparse-outlier contributions; the model is robust to a broad range of noise levels, without having to change model hyperparameter settings. In addition, the Bayesian framework allows exploitation of additional structure in the matrix. For example, in video applications each row (or column) corresponds to a video frame, and we introduce a Markov dependency between consecutive rows in the matrix (corresponding to consecutive frames in the video). The properties of this Markov process are also inferred based on the observed matrix, while simultaneously denoising and recovering the low-rank and sparse components. We compare the Bayesian model to a state-of-the-art optimization-based implementation of robust PCA; considering several examples, we demonstrate competitive performance of the proposed model.
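
    For context, the optimization-based robust PCA the authors benchmark against solves min ‖L‖* + λ‖S‖₁ subject to L + S = M. A compact inexact-ALM sketch of that baseline follows (step sizes and stopping rule are common defaults, not taken from the paper).

    ```python
    import numpy as np

    def rpca_ialm(M, max_iter=500, tol=1e-7):
        """Principal component pursuit via inexact augmented Lagrange multipliers."""
        n1, n2 = M.shape
        lam = 1.0 / np.sqrt(max(n1, n2))
        mu = n1 * n2 / (4.0 * np.abs(M).sum())
        Y = np.zeros_like(M)          # Lagrange multipliers
        S = np.zeros_like(M)
        for _ in range(max_iter):
            # Low-rank update: singular-value thresholding of M - S + Y/mu.
            U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
            L = (U * np.maximum(sig - 1.0 / mu, 0)) @ Vt
            # Sparse update: elementwise soft-thresholding.
            R = M - L + Y / mu
            S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0)
            Y += mu * (M - L - S)
            if np.linalg.norm(M - L - S) <= tol * np.linalg.norm(M):
                break
        return L, S

    rng = np.random.default_rng(0)
    L_true = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 100))   # rank 5
    S_true = np.zeros((100, 100))
    idx = rng.random((100, 100)) < 0.05                              # 5% gross outliers
    S_true[idx] = rng.normal(0, 10, idx.sum())
    L_hat, S_hat = rpca_ialm(L_true + S_true)
    print("relative low-rank error:",
          np.linalg.norm(L_hat - L_true) / np.linalg.norm(L_true))
    ```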

  16. Joint Bayesian Component Separation and CMB Power Spectrum Estimation

    NASA Technical Reports Server (NTRS)

    Eriksen, H. K.; Jewell, J. B.; Dickinson, C.; Banday, A. J.; Gorski, K. M.; Lawrence, C. R.

    2008-01-01

    We describe and implement an exact, flexible, and computationally efficient algorithm for joint component separation and CMB power spectrum estimation, building on a Gibbs sampling framework. Two essential new features are (1) conditional sampling of foreground spectral parameters and (2) joint sampling of all amplitude-type degrees of freedom (e.g., CMB, foreground pixel amplitudes, and global template amplitudes) given spectral parameters. Given a parametric model of the foreground signals, we estimate efficiently and accurately the exact joint foreground-CMB posterior distribution and, therefore, all marginal distributions such as the CMB power spectrum or foreground spectral index posteriors. The main limitation of the current implementation is the requirement of identical beam responses at all frequencies, which restricts the analysis to the lowest resolution of a given experiment. We outline a future generalization to multiresolution observations. To verify the method, we analyze simple models and compare the results to analytical predictions. We then analyze a realistic simulation with properties similar to the 3 yr WMAP data, downgraded to a common resolution of 3 deg FWHM. The results from the actual 3 yr WMAP temperature analysis are presented in a companion Letter.

  17. A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research

    ERIC Educational Resources Information Center

    van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A. G.

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are…

  18. Bayesian analysis for kaon photoproduction

    SciTech Connect

    Marsainy, T.; Mart, T.

    2014-09-25

    We have investigated the contribution of nucleon resonances to the kaon photoproduction process by using an established statistical decision-making method, i.e., the Bayesian method. This method not only evaluates the model over its entire parameter space, but also takes prior information and experimental data into account. The result indicates that certain resonances have larger probabilities of contributing to the process.

  19. Bayesian Meta-Analysis of Coefficient Alpha

    ERIC Educational Resources Information Center

    Brannick, Michael T.; Zhang, Nanhua

    2013-01-01

    The current paper describes and illustrates a Bayesian approach to the meta-analysis of coefficient alpha. Alpha is the most commonly used estimate of the reliability or consistency (freedom from measurement error) for educational and psychological measures. The conventional approach to meta-analysis uses inverse variance weights to combine…

  20. An Integrated Bayesian Model for DIF Analysis

    ERIC Educational Resources Information Center

    Soares, Tufi M.; Goncalves, Flavio B.; Gamerman, Dani

    2009-01-01

    In this article, an integrated Bayesian model for differential item functioning (DIF) analysis is proposed. The model is integrated in the sense of modeling the responses along with the DIF analysis. This approach allows DIF detection and explanation in a simultaneous setup. Previous empirical studies and/or subjective beliefs about the item…

  1. Bayesian analysis of structural equation models with dichotomous variables.

    PubMed

    Lee, Sik-Yum; Song, Xin-Yuan

    2003-10-15

    Structural equation modelling has been used extensively in the behavioural and social sciences for studying interrelationships among manifest and latent variables. Recently, its uses have been well recognized in medical research. This paper introduces a Bayesian approach to analysing general structural equation models with dichotomous variables. In the posterior analysis, the observed dichotomous data are augmented with the hypothetical missing values, which involve the latent variables in the model and the unobserved continuous measurements underlying the dichotomous data. An algorithm based on the Gibbs sampler is developed for drawing the parameter values and the hypothetical missing values from the joint posterior distributions. Useful statistics, such as the Bayesian estimates and their standard error estimates, and the highest posterior density intervals, can be obtained from the simulated observations. A posterior predictive p-value is used to test the goodness-of-fit of the posited model. The methodology is applied to a study of hypertensive patient non-adherence to medication.
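
    The augmentation step described here, drawing the unobserved continuous measurements underlying the dichotomous data inside the Gibbs sampler, follows the classic probit data-augmentation construction. A minimal sketch for plain probit regression (the full SEM adds latent factors, which are omitted; data and prior are invented):

    ```python
    import numpy as np
    from scipy.stats import truncnorm

    rng = np.random.default_rng(1)

    # Toy probit data: y_i = 1{ x_i' beta + e_i > 0 }, e_i ~ N(0, 1).
    n, beta_true = 500, np.array([-0.5, 1.0])
    X = np.column_stack([np.ones(n), rng.standard_normal(n)])
    y = (X @ beta_true + rng.standard_normal(n) > 0).astype(int)

    # Gibbs sampler with a N(0, 10 I) prior on beta.
    Vn = np.linalg.inv(np.eye(2) / 10.0 + X.T @ X)   # posterior covariance of beta | z
    beta, draws = np.zeros(2), []
    for it in range(2000):
        # 1. Augment: z_i | beta, y_i ~ N(x_i' beta, 1), truncated to y_i's side of 0.
        mu = X @ beta
        lo = np.where(y == 1, -mu, -np.inf)          # z > 0 when y = 1
        hi = np.where(y == 1, np.inf, -mu)           # z < 0 when y = 0
        z = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
        # 2. Conjugate update: beta | z ~ N(Vn X'z, Vn).
        beta = rng.multivariate_normal(Vn @ (X.T @ z), Vn)
        if it >= 500:                                # discard burn-in
            draws.append(beta)

    print("true beta:          ", beta_true)
    print("posterior mean beta:", np.mean(draws, axis=0).round(3))
    ```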

  2. Bayesian and Frequentist Methods for Estimating Joint Uncertainty of Freundlich Adsorption Isotherm Fitting Parameters

    EPA Science Inventory

    In this paper, we present methods for estimating Freundlich isotherm fitting parameters (K and N) and their joint uncertainty, which have been implemented into the freeware software platforms R and WinBUGS. These estimates were determined by both Frequentist and Bayesian analyse...

  3. Bayesian Correlation Analysis for Sequence Count Data

    PubMed Central

    Sánchez-Taltavull, Daniel; Ramachandran, Parameswaran; Lau, Nelson; Perkins, Theodore J.

    2016-01-01

    Evaluating the similarity of different measured variables is a fundamental task of statistics, and a key part of many bioinformatics algorithms. Here we propose a Bayesian scheme for estimating the correlation between different entities’ measurements based on high-throughput sequencing data. These entities could be different genes or miRNAs whose expression is measured by RNA-seq, different transcription factors or histone marks whose expression is measured by ChIP-seq, or even combinations of different types of entities. Our Bayesian formulation accounts for both measured signal levels and uncertainty in those levels, due to varying sequencing depth in different experiments and to varying absolute levels of individual entities, both of which affect the precision of the measurements. In comparison with a traditional Pearson correlation analysis, we show that our Bayesian correlation analysis retains high correlations when measurement confidence is high, but suppresses correlations when measurement confidence is low—especially for entities with low signal levels. In addition, we consider the influence of priors on the Bayesian correlation estimate. Perhaps surprisingly, we show that naive, uniform priors on entities’ signal levels can lead to highly biased correlation estimates, particularly when different experiments have widely varying sequencing depths. However, we propose two alternative priors that provably mitigate this problem. We also prove that, like traditional Pearson correlation, our Bayesian correlation calculation constitutes a kernel in the machine learning sense, and thus can be used as a similarity measure in any kernel-based machine learning algorithm. We demonstrate our approach on two RNA-seq datasets and one miRNA-seq dataset. PMID:27701449

  4. Bayesian Correlation Analysis for Sequence Count Data.

    PubMed

    Sánchez-Taltavull, Daniel; Ramachandran, Parameswaran; Lau, Nelson; Perkins, Theodore J

    2016-01-01

    Evaluating the similarity of different measured variables is a fundamental task of statistics, and a key part of many bioinformatics algorithms. Here we propose a Bayesian scheme for estimating the correlation between different entities' measurements based on high-throughput sequencing data. These entities could be different genes or miRNAs whose expression is measured by RNA-seq, different transcription factors or histone marks whose expression is measured by ChIP-seq, or even combinations of different types of entities. Our Bayesian formulation accounts for both measured signal levels and uncertainty in those levels, due to varying sequencing depth in different experiments and to varying absolute levels of individual entities, both of which affect the precision of the measurements. In comparison with a traditional Pearson correlation analysis, we show that our Bayesian correlation analysis retains high correlations when measurement confidence is high, but suppresses correlations when measurement confidence is low-especially for entities with low signal levels. In addition, we consider the influence of priors on the Bayesian correlation estimate. Perhaps surprisingly, we show that naive, uniform priors on entities' signal levels can lead to highly biased correlation estimates, particularly when different experiments have widely varying sequencing depths. However, we propose two alternative priors that provably mitigate this problem. We also prove that, like traditional Pearson correlation, our Bayesian correlation calculation constitutes a kernel in the machine learning sense, and thus can be used as a similarity measure in any kernel-based machine learning algorithm. We demonstrate our approach on two RNA-seq datasets and one miRNA-seq dataset.
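
    The key move, propagating count-level uncertainty into the correlation estimate, can be sketched with a conjugate Gamma-Poisson model: draw each entity's per-experiment rate from its posterior given the count and sequencing depth, correlate the draws, and average. The prior below is the naive Gamma(1,1) choice, not either of the corrected priors the paper proposes, and the counts are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Counts for two genes across 6 experiments with unequal sequencing depths.
    depth = np.array([1.0, 0.5, 2.0, 1.0, 0.2, 1.5]) * 1e6
    x = np.array([5, 2, 11, 6, 1, 8])     # gene A counts
    y = np.array([4, 1, 10, 5, 0, 7])     # gene B counts

    def posterior_rate_draws(counts, depth, n_draws, a0=1.0, b0=1.0):
        """Gamma(a0, b0) prior on the per-million rate, Poisson likelihood:
        rate | count ~ Gamma(a0 + count, b0 + depth/1e6)."""
        return rng.gamma(a0 + counts, 1.0 / (b0 + depth / 1e6),
                         size=(n_draws, len(counts)))

    ra = posterior_rate_draws(x, depth, 4000)
    rb = posterior_rate_draws(y, depth, 4000)

    # Bayesian correlation: average Pearson correlation over posterior draws.
    corrs = [np.corrcoef(a, b)[0, 1] for a, b in zip(ra, rb)]
    print("naive Pearson on raw rates:        ",
          np.corrcoef(x / depth, y / depth)[0, 1].round(3))
    print("Bayesian posterior-mean correlation:", np.mean(corrs).round(3))
    ```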

  5. A SAS Interface for Bayesian Analysis with WinBUGS

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; McArdle, John J.; Wang, Lijuan; Hamagami, Fumiaki

    2008-01-01

    Bayesian methods are becoming very popular despite some practical difficulties in implementation. To assist in the practical application of Bayesian methods, we show how to implement Bayesian analysis with WinBUGS as part of a standard set of SAS routines. This implementation procedure is first illustrated by fitting a multiple regression model…

  6. Semiparametric Bayesian inference on skew-normal joint modeling of multivariate longitudinal and survival data.

    PubMed

    Tang, An-Min; Tang, Nian-Sheng

    2015-02-28

    We propose a semiparametric multivariate skew-normal joint model for multivariate longitudinal and multivariate survival data. One main feature of the posited model is that we relax the commonly used normality assumption for random effects and within-subject error: a centered Dirichlet process prior specifies the random-effects distribution, a multivariate skew-normal distribution specifies the within-subject error distribution, and the trajectory functions of the longitudinal responses are modeled semiparametrically. A Bayesian approach is proposed to simultaneously obtain Bayesian estimates of unknown parameters, random effects and nonparametric functions by combining the Gibbs sampler and the Metropolis-Hastings algorithm. In particular, a Bayesian local influence approach is developed to assess the effect of minor perturbations to within-subject measurement error and random effects. Several simulation studies and an example are presented to illustrate the proposed methodologies.

  7. Joint Lung CT Image Segmentation: A Hierarchical Bayesian Approach

    PubMed Central

    Cheng, Wenjun; Ma, Luyao; Yang, Tiejun; Liang, Jiali

    2016-01-01

    Accurate lung CT image segmentation is of great clinical value, especially when it comes to delineating pathological regions, including lung tumors. In this paper, we present a novel framework that jointly segments multiple lung computed tomography (CT) images via a hierarchical Dirichlet process (HDP). Specifically, based on the assumption that lung CT images from different patients share similar image structure (organ sets and relative positioning), we derive a mathematical model to segment them simultaneously so that shared information across patients can be utilized to regularize each individual segmentation. Moreover, compared to many conventional models, the algorithm requires little manual involvement due to the nonparametric nature of the Dirichlet process (DP). We validated the proposed model on clinical data consisting of healthy and abnormal (lung cancer) patients. We demonstrate that, owing to the joint segmentation, more accurate and consistent segmentations can be obtained. PMID:27611188

  8. Bayesian Analysis of Individual Level Personality Dynamics

    PubMed Central

    Cripps, Edward; Wood, Robert E.; Beckmann, Nadin; Lau, John; Beckmann, Jens F.; Cripps, Sally Ann

    2016-01-01

    A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques. PMID:27486415

  9. Bayesian model selection analysis of WMAP3

    SciTech Connect

    Parkinson, David; Mukherjee, Pia; Liddle, Andrew R.

    2006-06-15

    We present a Bayesian model selection analysis of WMAP3 data using our code CosmoNest. We focus on the density perturbation spectral index n_S and the tensor-to-scalar ratio r, which define the plane of slow-roll inflationary models. We find that while the Bayesian evidence supports the conclusion that n_S ≠ 1, the data are not yet powerful enough to do so at a strong or decisive level. If tensors are assumed absent, the current odds are approximately 8 to 1 in favor of n_S ≠ 1 under our assumptions, when WMAP3 data is used together with external data sets. WMAP3 data on its own is unable to distinguish between the two models. Further, inclusion of r as a parameter weakens the conclusion against the Harrison-Zel'dovich case (n_S = 1, r = 0), albeit in a prior-dependent way. In appendices we describe the CosmoNest code in detail, noting its ability to supply posterior samples as well as to accurately compute the Bayesian evidence. We make a first public release of CosmoNest, now available at www.cosmonest.org.
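
    The model selection reported here rests on computing the Bayesian evidence, the likelihood integrated over each model's prior, and the resulting Bayes factor. A one-parameter toy version with a Gaussian likelihood shows the mechanics, including the Occam penalty; all numbers are invented and unrelated to the WMAP3 analysis.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Toy "measurement" of the spectral index: ns_hat ~ N(ns, sigma).
    ns_hat, sigma = 0.96, 0.015

    # M0: ns fixed at 1 (no free parameter) -> evidence is the likelihood there.
    Z0 = norm.pdf(ns_hat, loc=1.0, scale=sigma)

    # M1: ns free with a Uniform(0.8, 1.2) prior -> integrate likelihood * prior.
    ns = np.linspace(0.8, 1.2, 4001)
    prior = 1.0 / (1.2 - 0.8)
    Z1 = np.sum(norm.pdf(ns_hat, loc=ns, scale=sigma) * prior) * (ns[1] - ns[0])

    print(f"Bayes factor Z1/Z0 = {Z1 / Z0:.1f}")   # odds in favor of ns != 1
    # Occam penalty: widening the M1 prior lowers Z1 even though the
    # best-fit likelihood is unchanged.
    ```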

  10. A joint inter- and intrascale statistical model for Bayesian wavelet based image denoising.

    PubMed

    Pizurica, Aleksandra; Philips, Wilfried; Lemahieu, Ignace; Acheroy, Marc

    2002-01-01

    This paper presents a new wavelet-based image denoising method, which extends a "geometrical" Bayesian framework. The new method combines three criteria for distinguishing supposedly useful coefficients from noise: coefficient magnitudes, their evolution across scales and spatial clustering of large coefficients near image edges. These three criteria are combined in a Bayesian framework. The spatial clustering properties are expressed in a prior model. The statistical properties concerning coefficient magnitudes and their evolution across scales are expressed in a joint conditional model. The three main novelties with respect to related approaches are (1) the interscale-ratios of wavelet coefficients are statistically characterized and different local criteria for distinguishing useful coefficients from noise are evaluated, (2) a joint conditional model is introduced, and (3) a novel anisotropic Markov random field prior model is proposed. The results demonstrate an improved denoising performance over related earlier techniques.

  11. Bayesian Approach to the Joint Inversion of Gravity and Magnetic Data, with Application to the Ismenius Area of Mars

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey B.; Raymond, C.; Smrekar, S.; Millbury, C.

    2004-01-01

    This viewgraph presentation reviews a Bayesian approach to the inversion of gravity and magnetic data, with specific application to the Ismenius Area of Mars. Many inverse problems encountered in geophysics and planetary science are well known to be non-unique (e.g., inversion of gravity for the density structure of a body). In hopes of reducing the non-uniqueness of solutions, there has been interest in the joint analysis of data. An example is the joint inversion of gravity and magnetic data, with the assumption that the same physical anomalies generate both the observed magnetic and gravitational anomalies. In this talk, we formulate the joint analysis of different types of data in a Bayesian framework and apply the formalism to the inference of the density and remanent magnetization structure for a local region in the Ismenius area of Mars. The Bayesian approach allows prior information or constraints on the solutions to be incorporated in the inversion, with the "best" solutions those whose forward predictions most closely match the data while remaining consistent with assumed constraints. The application of this framework to the inversion of gravity and magnetic data on Mars reveals two typical challenges - the forward predictions of the data have a linear dependence on some of the quantities of interest, and non-linear dependence on others (termed the "linear" and "non-linear" variables, respectively). For observations with Gaussian noise, a Bayesian approach to inversion for "linear" variables reduces to a linear filtering problem, with an explicitly computable "error" matrix. However, for models whose forward predictions have non-linear dependencies, inference is no longer given by such a simple linear problem, and moreover, the uncertainty in the solution is no longer completely specified by a computable "error matrix". It is therefore important to develop methods for sampling from the full Bayesian posterior to provide a complete and statistically consistent
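
    For the "linear" variables mentioned above, Bayesian inversion reduces to a linear filtering problem with an explicitly computable error matrix: with data d = Gm + n, Gaussian prior m ~ N(m0, Cm) and noise n ~ N(0, Cd), the posterior is Gaussian. A minimal numpy sketch with an invented forward operator:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Linear forward model d = G m + noise (e.g. gravity from density anomalies).
    n_data, n_model = 20, 8
    G = rng.normal(size=(n_data, n_model))
    m_true = rng.normal(size=n_model)
    Cd = 0.1 * np.eye(n_data)          # noise covariance
    Cm = 4.0 * np.eye(n_model)         # prior covariance (the assumed constraint)
    m0 = np.zeros(n_model)             # prior mean
    d = G @ m_true + rng.multivariate_normal(np.zeros(n_data), Cd)

    # Gaussian posterior: C_post = (G' Cd^-1 G + Cm^-1)^-1,
    #                     m_post = C_post (G' Cd^-1 d + Cm^-1 m0).
    Cd_inv, Cm_inv = np.linalg.inv(Cd), np.linalg.inv(Cm)
    C_post = np.linalg.inv(G.T @ Cd_inv @ G + Cm_inv)   # the "error matrix"
    m_post = C_post @ (G.T @ Cd_inv @ d + Cm_inv @ m0)

    print("posterior mean error:", np.linalg.norm(m_post - m_true).round(3))
    print("posterior std devs:  ", np.sqrt(np.diag(C_post)).round(3))
    ```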

  12. A Bayesian mixture of semiparametric mixed-effects joint models for skewed-longitudinal and time-to-event data.

    PubMed

    Chen, Jiaqing; Huang, Yangxin

    2015-09-10

    In longitudinal studies, it is of interest to investigate how repeatedly measured markers in time are associated with the time to an event of interest; at the same time, the repeated measurements often exhibit the features of a heterogeneous population, non-normality, and covariates measured with error, owing to their longitudinal nature. Statistical analysis may become dramatically more complicated when one analyzes longitudinal-survival data with these features together. Recently, mixtures of skewed distributions have received increasing attention in the treatment of heterogeneous data involving asymmetric behaviors across subclasses, but relatively few studies accommodate heterogeneity, non-normality, and covariate measurement error simultaneously in the longitudinal-survival data setting. Under the umbrella of Bayesian inference, this article explores a finite mixture of semiparametric mixed-effects joint models with skewed distributions for longitudinal measures, in an attempt to accommodate heterogeneous characteristics, adjust for departures from normality, and correct for measurement error in covariates, as well as to overcome a lack of confidence in specifying a time-to-event model. The Bayesian mixture joint modeling offers an appropriate avenue to estimate not only all parameters of the mixture joint models but also the probabilities of class membership. Simulation studies are conducted to assess the performance of the proposed method, and a real example is analyzed to demonstrate the methodology. The results are reported by comparing potential models under various scenarios.

  13. A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research

    PubMed Central

    van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B; Neyer, Franz J; van Aken, Marcel AG

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are introduced using a simplified example. Thereafter, the advantages and pitfalls of the specification of prior knowledge are discussed. To illustrate Bayesian methods explained in this study, in a second example a series of studies that examine the theoretical framework of dynamic interactionism are considered. In the Discussion the advantages and disadvantages of using Bayesian statistics are reviewed, and guidelines on how to report on Bayesian statistics are provided. PMID:24116396

  14. Bayesian Nonparametric Models for Multiway Data Analysis.

    PubMed

    Xu, Zenglin; Yan, Feng; Qi, Yuan

    2015-02-01

    Tensor decomposition is a powerful computational tool for multiway data analysis. Many popular tensor decomposition approaches-such as the Tucker decomposition and CANDECOMP/PARAFAC (CP)-amount to multi-linear factorization. They are insufficient to model (i) complex interactions between data entities, (ii) various data types (e.g., missing data and binary data), and (iii) noisy observations and outliers. To address these issues, we propose tensor-variate latent nonparametric Bayesian models for multiway data analysis. We name these models InfTucker. These new models essentially conduct Tucker decomposition in an infinite feature space. Unlike classical tensor decomposition models, our new approaches handle both continuous and binary data in a probabilistic framework. Unlike previous Bayesian models on matrices and tensors, our models are based on latent Gaussian or t processes with nonlinear covariance functions. Moreover, on network data, our models reduce to nonparametric stochastic blockmodels and can be used to discover latent groups and predict missing interactions. To learn the models efficiently from data, we develop a variational inference technique and explore properties of the Kronecker product for computational efficiency. Compared with a classical variational implementation, this technique reduces both time and space complexities by several orders of magnitude. On real multiway and network data, our new models achieved significantly higher prediction accuracy than state-of-art tensor decomposition methods and blockmodels.
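
    The classical multilinear baseline that InfTucker generalizes is the Tucker decomposition, computable by truncated higher-order SVD: take the leading left singular vectors of each mode-n unfolding, then form the core by mode products. A numpy sketch of that baseline (not of InfTucker's variational inference):

    ```python
    import numpy as np

    def unfold(T, mode):
        """Mode-n unfolding: move axis `mode` to the front and flatten the rest."""
        return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

    def hosvd(T, ranks):
        """Truncated higher-order SVD: factor matrices U_n and core tensor g."""
        U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
             for n, r in enumerate(ranks)]
        g = T
        for n, Un in enumerate(U):            # core = T x_1 U1' x_2 U2' ...
            g = np.moveaxis(np.tensordot(Un.T, g, axes=(1, n)), 0, n)
        return g, U

    def reconstruct(g, U):
        T = g
        for n, Un in enumerate(U):            # T = g x_1 U1 x_2 U2 ...
            T = np.moveaxis(np.tensordot(Un, T, axes=(1, n)), 0, n)
        return T

    rng = np.random.default_rng(0)
    # Rank-(2, 3, 2) tensor plus noise.
    core = rng.normal(size=(2, 3, 2))
    A, B, C = (rng.normal(size=(s, r)) for s, r in [(10, 2), (12, 3), (8, 2)])
    T = reconstruct(core, [A, B, C]) + 0.01 * rng.normal(size=(10, 12, 8))

    g, U = hosvd(T, (2, 3, 2))
    err = np.linalg.norm(reconstruct(g, U) - T) / np.linalg.norm(T)
    print("relative reconstruction error:", round(err, 4))
    ```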

  15. Bayesian analysis of factors associated with fibromyalgia syndrome subjects

    NASA Astrophysics Data System (ADS)

    Jayawardana, Veroni; Mondal, Sumona; Russek, Leslie

    2015-01-01

    Factors contributing to movement-related fear in subjects with fibromyalgia (FM) were assessed by Russek et al. (2014), based on data collected through a national internet survey of community-based individuals. That study focused on the variables Activities-Specific Balance Confidence scale (ABC), Primary Care Post-Traumatic Stress Disorder screen (PC-PTSD), Tampa Scale of Kinesiophobia (TSK), a Joint Hypermobility Syndrome screen (JHS), Vertigo Symptom Scale (VSS-SF), Obsessive-Compulsive Personality Disorder (OCPD), and pain, work status and physical activity derived from the Revised Fibromyalgia Impact Questionnaire (FIQR). The study presented in this paper revisits the same data with a Bayesian analysis in which appropriate priors are introduced for the variables selected in Russek's paper.

  16. Joint AVO inversion in the time and frequency domain with Bayesian interference

    NASA Astrophysics Data System (ADS)

    Zong, Zhao-Yun; Yin, Xing-Yao; Li, Kun

    2016-12-01

    Amplitude variation with offset or incident angle (AVO/AVA) inversion is typically combined with statistical methods, such as Bayesian inference, or with deterministic inversion. We propose a joint elastic inversion method in the time and frequency domain based on Bayesian inversion theory to improve the resolution of the estimated P- and S-wave velocities and density. We initially construct the objective function using Bayesian inference by combining seismic data in the time and frequency domain. We use Cauchy and Gaussian probability density functions to obtain the prior information for the model parameters and the likelihood function, respectively. We estimate the elastic parameters by solving the initial objective function with added model constraints to improve the inversion robustness. The results on synthetic data suggest that the frequency spectra of the estimated parameters are wider than those obtained with conventional elastic inversion in the time domain. In addition, the proposed inversion approach is more robust to noise than the inversion approach in the frequency domain alone. Furthermore, results from synthetic examples with added Gaussian noise demonstrate the robustness of the proposed approach. From the real data, we infer that more model parameter details can be reproduced with the proposed joint elastic inversion.

  17. Bayesian analysis of multiple direct detection experiments

    NASA Astrophysics Data System (ADS)

    Arina, Chiara

    2014-12-01

    Bayesian methods offer a coherent and efficient framework for incorporating uncertainties into induction problems. In this article, we review how this approach applies to the analysis of dark matter direct detection experiments. In particular, we discuss the exclusion limit of XENON100 and the debated hints of detection under the hypothesis of a WIMP signal. Within parameter inference, marginalizing consistently over uncertainties to extract robust posterior probability distributions, we find that the claimed tension between XENON100 and the other experiments can be partially alleviated in the isospin-violating scenario, while the elastic scattering model appears to be compatible with the frequentist statistical approach. We then move to model comparison, for which Bayesian methods are particularly well suited. Firstly, we investigate the annual modulation seen in CoGeNT data, finding weak evidence for a modulation. Modulation models due to other physics compare unfavorably with the WIMP models, paying the price for their excessive complexity. Secondly, we confront several coherent scattering models to determine the current best physical scenario compatible with the experimental hints. We find that exothermic and inelastic dark matter are moderately disfavored relative to the elastic scenario, while the isospin-violating model has similar evidence. Lastly, the Bayes factor gives inconclusive evidence for an incompatibility between the data sets of XENON100 and the hints of detection; the same question assessed with goodness of fit would indicate a 2σ discrepancy. More data are therefore needed to settle this question.

  18. Bayesian Model Averaging for Propensity Score Analysis.

    PubMed

    Kaplan, David; Chen, Jianshen

    2014-01-01

    This article considers Bayesian model averaging as a means of addressing uncertainty in the selection of variables in the propensity score equation. We investigate an approximate Bayesian model averaging approach based on the model-averaged propensity score estimates produced by the R package BMA but that ignores uncertainty in the propensity score. We also provide a fully Bayesian model averaging approach via Markov chain Monte Carlo sampling (MCMC) to account for uncertainty in both parameters and models. A detailed study of our approach examines the differences in the causal estimate when incorporating noninformative versus informative priors in the model averaging stage. We examine these approaches under common methods of propensity score implementation. In addition, we evaluate the impact of changing the size of Occam's window used to narrow down the range of possible models. We also assess the predictive performance of both Bayesian model averaging propensity score approaches and compare it with the case without Bayesian model averaging. Overall, results show that both Bayesian model averaging propensity score approaches recover the treatment effect estimates well and generally provide larger uncertainty estimates, as expected. Both Bayesian model averaging approaches offer slightly better prediction of the propensity score compared with the Bayesian approach with a single propensity score equation. Covariate balance checks for the case study show that both Bayesian model averaging approaches offer good balance. The fully Bayesian model averaging approach also provides posterior probability intervals of the balance indices.
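
    The approximate (non-MCMC) flavor of this idea is straightforward to sketch: fit each candidate propensity model, weight it by a BIC-based approximation to its posterior model probability, and average the predicted propensity scores. The candidate set, simulated data, and BIC weighting below are illustrative choices of ours; the paper itself builds on the R package BMA and a fully Bayesian MCMC variant.

    ```python
    import numpy as np
    from itertools import combinations
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 1000
    X = rng.normal(size=(n, 3))                    # candidate confounders x1..x3
    p_true = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))   # x3 irrelevant
    treat = rng.binomial(1, p_true)

    models, bics, preds = [], [], []
    for k in range(1, 4):
        for cols in combinations(range(3), k):     # all non-empty covariate subsets
            m = LogisticRegression(C=1e10).fit(X[:, list(cols)], treat)
            p = m.predict_proba(X[:, list(cols)])[:, 1]
            ll = np.sum(treat * np.log(p) + (1 - treat) * np.log(1 - p))
            bics.append(-2 * ll + (k + 1) * np.log(n))   # k slopes + intercept
            preds.append(p)
            models.append(cols)

    # Posterior model probabilities via the BIC approximation, then average.
    bics = np.array(bics)
    w = np.exp(-0.5 * (bics - bics.min()))
    w /= w.sum()
    ps_bma = np.average(np.array(preds), axis=0, weights=w)

    for cols, wi in sorted(zip(models, w), key=lambda t: -t[1])[:3]:
        print(f"model {cols}: weight {wi:.3f}")
    print("RMSE of BMA propensity score:",
          np.sqrt(np.mean((ps_bma - p_true) ** 2)).round(4))
    ```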

  19. Bayesian analysis of the solar neutrino anomaly

    SciTech Connect

    Bhat, C.M.

    1998-02-01

    We present an analysis of the recent solar neutrino data from the five experiments using a Bayesian approach. We extract quantitative and easily understandable information pertaining to the solar neutrino problem. The probability distributions for the individual neutrino fluxes, and the discrepancy distributions for the ⁸B and ⁷Be fluxes, which include theoretical and experimental uncertainties, have been extracted. The analysis, carried out assuming that the neutrinos are unaltered during their passage from the Sun to the Earth, clearly indicates that the observed pp flux is consistent with the 1995 standard solar model predictions of Bahcall and Pinsonneault within 2σ (standard deviations), whereas the ⁸B flux is down by more than 12σ and the ⁷Be flux is maximally suppressed. We also deduce the experimental survival probability for the solar neutrinos as a function of their energy in a model-independent way. We find that the shape of that distribution is in qualitative agreement with the MSW oscillation predictions.

  20. Bayesian analysis on gravitational waves and exoplanets

    NASA Astrophysics Data System (ADS)

    Deng, Xihao

    Attempts to detect gravitational waves using a pulsar timing array (PTA), i.e., a collection of pulsars in our Galaxy, have become more organized over the last several years. PTAs act to detect gravitational waves generated from very distant sources by observing the small and correlated effect the waves have on pulse arrival times at the Earth. In this thesis, I present advanced Bayesian analysis methods that can be used to search for gravitational waves in pulsar timing data. These methods were also applied to analyze a set of radial velocity (RV) data collected by the Hobby-Eberly Telescope from observations of a K0 giant star. They confirmed the presence of two Jupiter-mass planets around the star and also characterized its stellar p-mode oscillation. The first part of the thesis investigates the effect of wavefront curvature on a pulsar's response to a gravitational wave. In it we show that we can assume the gravitational wave phasefront is planar across the array only if the source luminosity distance ≫ 2πL²/λ, where L is the pulsar distance to the Earth (~ kpc) and λ is the radiation wavelength (~ pc) in the PTA waveband. Correspondingly, for a point gravitational wave source closer than ~ 100 Mpc, we should take into account the effect of wavefront curvature across the pulsar-Earth line of sight, which depends on the luminosity distance to the source, when evaluating the pulsar timing response. As a consequence, if a PTA can detect a gravitational wave from a source closer than ~ 100 Mpc, the effect of wavefront curvature on the response allows us to determine the source luminosity distance. The second and third parts of the thesis propose a new analysis method based on Bayesian nonparametric regression to search for gravitational wave bursts and a gravitational wave background in PTA data. Unlike the conventional Bayesian analysis that introduces a signal model with a fixed number of parameters, Bayesian nonparametric regression sets

  1. Bayesian Logical Data Analysis for the Physical Sciences

    NASA Astrophysics Data System (ADS)

    Gregory, Phil

    2010-05-01

    Preface; Acknowledgements; 1. Role of probability theory in science; 2. Probability theory as extended logic; 3. The how-to of Bayesian inference; 4. Assigning probabilities; 5. Frequentist statistical inference; 6. What is a statistic?; 7. Frequentist hypothesis testing; 8. Maximum entropy probabilities; 9. Bayesian inference (Gaussian errors); 10. Linear model fitting (Gaussian errors); 11. Nonlinear model fitting; 12. Markov Chain Monte Carlo; 13. Bayesian spectral analysis; 14. Bayesian inference (Poisson sampling); Appendix A. Singular value decomposition; Appendix B. Discrete Fourier transforms; Appendix C. Difference in two samples; Appendix D. Poisson ON/OFF details; Appendix E. Multivariate Gaussian from maximum entropy; References; Index.

  2. Evaluation of a partial genome screening of two asthma susceptibility regions using Bayesian network based Bayesian multilevel analysis of relevance.

    PubMed

    Ungvári, Ildikó; Hullám, Gábor; Antal, Péter; Kiszel, Petra Sz; Gézsi, András; Hadadi, Éva; Virág, Viktor; Hajós, Gergely; Millinghoffer, András; Nagy, Adrienne; Kiss, András; Semsei, Ágnes F; Temesi, Gergely; Melegh, Béla; Kisfali, Péter; Széll, Márta; Bikov, András; Gálffy, Gabriella; Tamási, Lilla; Falus, András; Szalai, Csaba

    2012-01-01

    Genetic studies indicate a high number of potential factors related to asthma. Based on earlier linkage analyses we selected the 11q13 and 14q22 asthma susceptibility regions, for which we designed a partial genome screening study using 145 SNPs in 1201 individuals (436 asthmatic children and 765 controls). The results were evaluated with traditional frequentist methods and we applied a new statistical method, called Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA). This method uses Bayesian network representation to provide detailed characterization of the relevance of factors, such as joint significance, the type of dependency, and multi-target aspects. We estimated posteriors for these relations within the Bayesian statistical framework, in order to estimate the posterior probability that a variable is directly relevant or that its association is only mediated. With frequentist methods one SNP (rs3751464 in the FRMD6 gene) provided evidence for an association with asthma (OR = 1.43 (1.2-1.8); p = 3×10⁻⁴). The possible role of the FRMD6 gene in asthma was also confirmed in an animal model and human asthmatics. In the BN-BMLA analysis altogether 5 SNPs in 4 genes were found relevant in connection with the asthma phenotype: PRPF19 on chromosome 11, and FRMD6, PTGER2 and PTGDR on chromosome 14. In a subsequent step a partial dataset containing rhinitis and further clinical parameters was used, which allowed the analysis of the relevance of SNPs for asthma and multiple targets. These analyses suggested that SNPs in the AHNAK and MS4A2 genes were indirectly associated with asthma. This paper indicates that BN-BMLA explores the relevant factors more comprehensively than traditional statistical methods and extends the scope of strong-relevance-based methods to include partial relevance, global characterization of relevance and multi-target relevance.

  3. A Hierarchical Bayesian Procedure for Two-Mode Cluster Analysis

    ERIC Educational Resources Information Center

    DeSarbo, Wayne S.; Fong, Duncan K. H.; Liechty, John; Saxton, M. Kim

    2004-01-01

    This manuscript introduces a new Bayesian finite mixture methodology for the joint clustering of row and column stimuli/objects associated with two-mode asymmetric proximity, dominance, or profile data. That is, common clusters are derived which partition both the row and column stimuli/objects simultaneously into the same derived set of clusters.…

  4. Optimal sequential Bayesian analysis for degradation tests.

    PubMed

    Rodríguez-Narciso, Silvia; Christen, J Andrés

    2016-07-01

    Degradation tests are especially difficult to conduct for items with high reliability. Test costs, caused mainly by prolonged item duration and item destruction costs, establish the necessity of sequential degradation test designs. We propose a methodology that sequentially selects the optimal observation times to measure the degradation, using a convenient rule that maximizes the inference precision and minimizes test costs. In particular our objective is to estimate a quantile of the time to failure distribution, where the degradation process is modelled as a linear model using Bayesian inference. The proposed sequential analysis is based on an index that measures the expected discrepancy between the estimated quantile and its corresponding prediction, using Monte Carlo methods. The procedure was successfully implemented for simulated and real data.
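
    The following sketch illustrates, under simplifying assumptions, the kind of sequential rule described above: a conjugate Bayesian linear degradation model, with the next observation time chosen by a Monte Carlo index of how much a new measurement is expected to move the estimated time-to-failure quantile. The variable names, the known-noise assumption, and this particular discrepancy index are illustrative stand-ins, not the authors' implementation.

```python
# A minimal sketch (not the authors' code) of sequential design for a linear
# degradation model y = a + b*t + noise, with failure when y crosses a threshold.
import numpy as np

rng = np.random.default_rng(0)
THRESHOLD, SIGMA = 10.0, 0.5          # failure level and known noise sd (assumed)

def posterior_samples(t, y, n=2000):
    """Bayesian linear regression with a flat prior: sample (a, b)."""
    X = np.column_stack([np.ones_like(t), t])
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    cov = SIGMA**2 * np.linalg.inv(X.T @ X)
    return rng.multivariate_normal(beta_hat, cov, size=n)

def ttf_quantile(samples, q=0.1):
    """q-quantile of the time-to-failure T = (THRESHOLD - a) / b."""
    a, b = samples[:, 0], samples[:, 1]
    return np.quantile((THRESHOLD - a) / np.maximum(b, 1e-9), q)

def expected_discrepancy(t, y, t_new, n_sim=200):
    """Monte Carlo index: how far a new observation at t_new is expected
    to move the estimated quantile (larger = more informative)."""
    post = posterior_samples(t, y)
    q_now = ttf_quantile(post)
    moves = []
    for a, b in post[rng.choice(len(post), n_sim)]:
        y_new = a + b * t_new + rng.normal(0.0, SIGMA)
        post_new = posterior_samples(np.append(t, t_new), np.append(y, y_new))
        moves.append(abs(ttf_quantile(post_new) - q_now))
    return np.mean(moves)

t_obs = np.array([0.0, 1.0, 2.0]); y_obs = np.array([0.1, 1.1, 2.2])
candidates = [3.0, 5.0, 8.0]
best = max(candidates, key=lambda tc: expected_discrepancy(t_obs, y_obs, tc))
print("next observation time:", best)
```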

  5. A Bayesian Hierarchical Approach to Regional Frequency Analysis of Extremes

    NASA Astrophysics Data System (ADS)

    Renard, B.

    2010-12-01

    Rainfall and runoff frequency analysis is a major issue for the hydrological community. The distribution of hydrological extremes varies in space and possibly in time. Describing and understanding this spatiotemporal variability are primary challenges to improve hazard quantification and risk assessment. This presentation proposes a general approach based on a Bayesian hierarchical model, following previous work by Cooley et al. [2007], Micevski [2007], Aryal et al. [2009] or Lima and Lall [2009; 2010]. Such a hierarchical model is made up of two levels: (1) a data level modeling the distribution of observations, and (2) a process level describing the fluctuation of the distribution parameters in space and possibly in time. At the first level of the model, at-site data (e.g., annual maxima series) are modeled with a chosen distribution (e.g., a GEV distribution). Since data from several sites are considered, the joint distribution of a vector of (spatial) observations needs to be derived. This is challenging because data are in general not spatially independent, especially for nearby sites. An elliptical copula is therefore used to formally account for spatial dependence between at-site data. This choice might be questionable in the context of extreme value distributions. However, it is motivated by its applicability to highly dimensional spatial problems, where the joint pdf of a vector of n observations is required to derive the likelihood function (with n possibly amounting to hundreds of sites). At the second level of the model, parameters of the chosen at-site distribution are then modeled by a Gaussian spatial process, whose mean may depend on covariates (e.g. elevation, distance to sea, weather pattern, time). In particular, this spatial process allows estimating parameters at ungauged sites, and deriving the predictive distribution of rainfall/runoff at every pixel/catchment of the studied domain. An application to extreme rainfall series from the French
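
    A minimal sketch of the two levels described above, with assumed notation (an exponential covariance kernel, known GEV scale and shape): the data level is a per-site GEV likelihood, and the process level is a Gaussian-process prior on the GEV location parameter.

```python
# A minimal sketch (assumed notation, not the presenter's code) of the two
# hierarchical levels: GEV at-site likelihood plus a Gaussian-process layer
# on the GEV location parameter mu(site).
import numpy as np
from scipy.stats import genextreme, multivariate_normal

def gp_cov(coords, sill=1.0, rng_km=50.0):
    """Exponential covariance between site coordinates (assumed kernel)."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return sill * np.exp(-d / rng_km)

def log_posterior(mu, data, coords, xi=0.1, sigma=1.0, mean_mu=10.0):
    # Process level: mu ~ GP(mean_mu, exponential covariance).
    lp = multivariate_normal(mean_mu * np.ones(len(mu)), gp_cov(coords)).logpdf(mu)
    # Data level: annual maxima at site s ~ GEV(mu[s], sigma, xi).
    # scipy's shape parameter c equals -xi in the usual hydrology convention.
    for s, y in enumerate(data):
        lp += genextreme.logpdf(y, c=-xi, loc=mu[s], scale=sigma).sum()
    return lp

coords = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 20.0]])     # site coords (km)
data = [np.array([12.1, 9.8, 14.3]), np.array([11.0, 10.5]),
        np.array([8.7, 9.9, 10.2])]                           # annual maxima
print(log_posterior(np.array([10.0, 10.5, 9.5]), data, coords))
```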

  6. Bayesian spatial joint modeling of traffic crashes on an urban road network.

    PubMed

    Zeng, Qiang; Huang, Helai

    2014-06-01

    This study proposes a Bayesian spatial joint model of crash prediction including both road segments and intersections located in an urban road network, through which the spatial correlations between heterogeneous types of entities could be considered. A road network in Hillsborough, Florida, with crash, road, and traffic characteristics data for a three-year period was selected in order to compare the proposed joint model with three site-level crash prediction models, that is, the Poisson, negative binomial (NB), and conditional autoregressive (CAR) models. According to the results, the CAR and Joint models outperform the Poisson and NB models in terms of model fitting and predictive performance, which indicates the reasonableness of considering cross-entity spatial correlations. Although the goodness-of-fit and predictive performance of the CAR and Joint models are equivalent in this case study, spatial correlations between segments and the connected intersections are found to be more significant than those solely between segments or between intersections, which supports the employment of the Joint model as an alternative in road-network-level safety modeling.

  7. Bayesian analysis. II. Signal detection and model selection

    NASA Astrophysics Data System (ADS)

    Bretthorst, G. Larry

    In the preceding paper, Bayesian analysis was applied to the parameter estimation problem, given quadrature NMR data. Here Bayesian analysis is extended to the problem of selecting the model which is most probable in view of the data and all the prior information. In addition to the analytic calculation, two examples are given. The first example demonstrates how to use Bayesian probability theory to detect small signals in noise. The second example uses Bayesian probability theory to compute the probability of the number of decaying exponentials in simulated T1 data. The Bayesian answer to this question is essentially a microcosm of the scientific method and a quantitative statement of Ockham's razor: theorize about possible models, compare these to experiment, and select the simplest model that "best" fits the data.

  8. Learning Bayesian networks from big meteorological spatial datasets. An alternative to complex network analysis

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Jose Manuel; San Martín, Daniel; Herrera, Sixto; Santiago Cofiño, Antonio

    2016-04-01

    The growing availability of spatial datasets (observations, reanalyses, and regional and global climate models) demands efficient multivariate spatial modeling techniques for many problems of interest (e.g. teleconnection analysis, multi-site downscaling, etc.). Complex networks have recently been applied in this context using graphs built from pairwise correlations between the different stations (or grid boxes) forming the dataset. However, this analysis does not take into account the full dependence structure underlying the data, given by all possible marginal and conditional dependencies among the stations, and does not allow a probabilistic analysis of the dataset. In this talk we introduce Bayesian networks as an alternative data-driven technique for multivariate analysis and modeling which allows building a joint probability distribution of the stations including all relevant dependencies in the dataset. Bayesian networks are a sound machine learning technique that uses a graph to (1) encode the main dependencies among the variables and (2) obtain a factorization of the joint probability distribution of the stations given by a reduced number of parameters. For a particular problem, the resulting graph provides a qualitative analysis of the spatial relationships in the dataset (an alternative to complex network analysis), and the resulting model allows for a probabilistic analysis of the dataset. Bayesian networks have been widely applied in many fields, but their use in climate problems is hampered by the large number of variables (stations) involved in this field, since the complexity of the existing algorithms for learning the graphical structure from data grows nonlinearly with the number of variables. In this contribution we present a modified local learning algorithm for Bayesian networks adapted to this problem, which allows inferring the graphical structure for thousands of stations (from observations) and/or gridboxes (from model simulations) thus providing new
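
    The factorization at the heart of this argument can be made concrete with a toy three-station network (hypothetical stations A, B and C with the chain graph A→B→C); the conditional probability tables below are invented for illustration.

```python
# Toy illustration of the factorization a Bayesian network provides:
# p(A,B,C) = p(A) p(B|A) p(C|B) for the graph A -> B -> C, which needs far
# fewer parameters than a full joint table over the stations.
p_A = {0: 0.7, 1: 0.3}                                 # e.g. dry/wet at station A
p_B_given_A = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}
p_C_given_B = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}

def joint(a, b, c):
    """Joint probability read off the DAG factorization."""
    return p_A[a] * p_B_given_A[a][b] * p_C_given_B[b][c]

# Probabilistic queries follow by summing the factorization, e.g. p(C=1):
p_c1 = sum(joint(a, b, 1) for a in (0, 1) for b in (0, 1))
print(f"p(C=1) = {p_c1:.3f}")
```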

  9. Bayesian data analysis in population ecology: motivations, methods, and benefits

    USGS Publications Warehouse

    Dorazio, Robert

    2016-01-01

    During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.

  10. Ockham's razor and Bayesian analysis. [statistical theory for systems evaluation

    NASA Technical Reports Server (NTRS)

    Jefferys, William H.; Berger, James O.

    1992-01-01

    'Ockham's razor', the ad hoc principle enjoining the greatest possible simplicity in theoretical explanations, is presently shown to be justifiable as a consequence of Bayesian inference; Bayesian analysis can, moreover, clarify the nature of the 'simplest' hypothesis consistent with the given data. By choosing the prior probabilities of hypotheses, it becomes possible to quantify the scientific judgment that simpler hypotheses are more likely to be correct. Bayesian analysis also shows that a hypothesis with fewer adjustable parameters intrinsically possesses an enhanced posterior probability, due to the clarity of its predictions.
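
    The following back-of-envelope sketch (all numbers invented for illustration) shows the mechanism described above: the marginal likelihood of a model with an adjustable parameter is its best-fit likelihood deflated by an "Ockham factor", roughly the ratio of posterior to prior parameter width, so a simpler model can win even with a lower best-fit likelihood.

```python
# Illustrative Ockham-factor arithmetic (flat-prior approximation):
# evidence ~ L_max * (posterior width / prior width).
best_fit_like = 1.0e-3        # likelihood at the best-fit parameter (assumed)
prior_width = 10.0            # parameter range allowed by the complex model
posterior_width = 0.5         # range actually supported by the data

evidence_complex = best_fit_like * (posterior_width / prior_width)
evidence_simple = 0.4e-3      # no free parameter: evidence = its likelihood

# The simple model wins despite its lower best-fit likelihood.
print("Bayes factor (simple/complex):", evidence_simple / evidence_complex)
```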

  11. Enhancing the Modeling of PFOA Pharmacokinetics with Bayesian Analysis

    EPA Science Inventory

    The detail sufficient to describe the pharmacokinetics (PK) for perfluorooctanoic acid (PFOA) and the methods necessary to combine information from multiple data sets are both subjects of ongoing investigation. Bayesian analysis provides tools to accommodate these goals. We exa...

  12. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.

  13. Bayesian Analysis of the Cosmic Microwave Background

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey

    2007-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background! Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. Bayesian formulation of this problem allows consistent treatment of many complexities, including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analysis (as demonstrated on WMAP 1- and 3-year temperature and polarization data). Development is continuing for Planck; the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models through to the total uncertainty in cosmological parameters.
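
    A toy version of the Gibbs scheme described above (a single-variance signal-plus-noise model standing in for the sky map and power spectrum; all settings assumed for illustration) alternates the two conditional draws:

```python
# Toy Gibbs sampler mirroring the scheme above: alternate draws of the signal
# s given its variance C, and of C given s, for d_i = s_i + n_i with
# n_i ~ N(0, N) and s_i ~ N(0, C).
import numpy as np

rng = np.random.default_rng(1)
N_VAR = 1.0                                           # known noise variance (assumed)
d = rng.normal(0.0, np.sqrt(4.0 + N_VAR), size=500)   # synthetic data, true C = 4

C = 1.0                                               # initial signal-variance guess
keep = []
for it in range(3000):
    # 1) signal | variance: Gaussian posterior (Wiener-filter mean + fluctuation).
    var_s = 1.0 / (1.0 / C + 1.0 / N_VAR)
    s = rng.normal(var_s * d / N_VAR, np.sqrt(var_s))
    # 2) variance | signal: inverse-gamma draw under an assumed 1/C prior.
    C = np.sum(s**2) / rng.chisquare(len(s))
    keep.append(C)

print("posterior mean of C:", np.mean(keep[500:]))    # should be near 4
```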

  14. Bayesian analysis of the backreaction models

    SciTech Connect

    Kurek, Aleksandra; Bolejko, Krzysztof; Szydlowski, Marek

    2010-03-15

    We present a Bayesian analysis of four different types of backreaction models, which are based on the Buchert equations. In this approach, one considers a solution to the Einstein equations for a general matter distribution and then an average of various observable quantities is taken. Such an approach became of considerable interest when it was shown that it could lead to agreement with observations without resorting to dark energy. In this paper we compare the {Lambda}CDM model and the backreaction models with type Ia supernovae, baryon acoustic oscillations, and cosmic microwave background data, and find that the former is favored. However, the tested models were based on some particular assumptions about the relation between the average spatial curvature and the backreaction, as well as the relation between the curvature and curvature index. In this paper we modified the latter assumption, leaving the former unchanged. We find that, by varying the relation between the curvature and curvature index, we can obtain a better fit. Therefore, some further work is still needed--in particular, the relation between the backreaction and the curvature should be revisited in order to fully determine the feasibility of the backreaction models to mimic dark energy.

  15. Asymptotic analysis of Bayesian generalization error with Newton diagram.

    PubMed

    Yamazaki, Keisuke; Aoyagi, Miki; Watanabe, Sumio

    2010-01-01

    Statistical learning machines that have singularities in the parameter space, such as hidden Markov models, Bayesian networks, and neural networks, are widely used in the field of information engineering. Singularities in the parameter space determine the accuracy of estimation in the Bayesian scenario. The Newton diagram in algebraic geometry is recognized as an effective method by which to investigate a singularity. The present paper proposes a new technique to plug the diagram into the Bayesian analysis. The proposed technique allows the generalization error to be clarified and provides a foundation for efficient model selection. We apply the proposed technique to mixtures of binomial distributions.

  16. Bayesian analysis of a disability model for lung cancer survival.

    PubMed

    Armero, C; Cabras, S; Castellanos, M E; Perra, S; Quirós, A; Oruezábal, M J; Sánchez-Rubio, J

    2016-02-01

    Bayesian reasoning, survival analysis and multi-state models are used to assess survival times for Stage IV non-small-cell lung cancer patients and the evolution of the disease over time. Bayesian estimation is done using minimum informative priors for the Weibull regression survival model, leading to an automatic inferential procedure. Markov chain Monte Carlo methods have been used for approximating posterior distributions and the Bayesian information criterion has been considered for covariate selection. In particular, the posterior distribution of the transition probabilities, resulting from the multi-state model, constitutes a very interesting tool which could be useful to help oncologists and patients make efficient and effective decisions.

  17. Bayesian Analysis of Perceived Eye Level

    PubMed Central

    Orendorff, Elaine E.; Kalesinskas, Laurynas; Palumbo, Robert T.; Albert, Mark V.

    2016-01-01

    To accurately perceive the world, people must efficiently combine internal beliefs and external sensory cues. We introduce a Bayesian framework that explains the role of internal balance cues and visual stimuli on perceived eye level (PEL)—a self-reported measure of elevation angle. This framework provides a single, coherent model explaining a set of experimentally observed PEL over a range of experimental conditions. Further, it provides a parsimonious explanation for the additive effect of low fidelity cues as well as the averaging effect of high fidelity cues, as also found in other Bayesian cue combination psychophysical studies. Our model accurately estimates the PEL and explains the form of previous equations used in describing PEL behavior. Most importantly, the proposed Bayesian framework for PEL is more powerful than previous behavioral modeling; it permits behavioral estimation in a wider range of cue combination and perceptual studies than models previously reported. PMID:28018204

  18. Bayesian analysis of MEG visual evoked responses

    NASA Astrophysics Data System (ADS)

    Schmidt, David M.; George, John S.; Wood, C. C.

    1999-05-01

    We have developed a method for analyzing neural electromagnetic data that allows probabilistic inferences to be drawn about regions of activation. The method involves the generation of a large number of possible solutions which both fit the data and prior expectations about the nature of probable solutions made explicit by a Bayesian formalism. In addition, we have introduced a model for the current distributions that produce MEG (and EEG) data that allows extended regions of activity, and can easily incorporate prior information such as anatomical constraints from MRI. To evaluate the feasibility and utility of the Bayesian approach with actual data, we analyzed MEG data from a visual evoked response experiment. We compared Bayesian analyses of MEG responses to visual stimuli in the left and right visual fields, in order to examine the sensitivity of the method to detect known features of human visual cortex organization. We also examined the changing pattern of cortical activation as a function of time.

  19. Bayesian analysis of MEG visual evoked responses

    SciTech Connect

    Schmidt, D.M.; George, J.S.; Wood, C.C.

    1999-04-01

    The authors developed a method for analyzing neural electromagnetic data that allows probabilistic inferences to be drawn about regions of activation. The method involves the generation of a large number of possible solutions which both fit the data and prior expectations about the nature of probable solutions made explicit by a Bayesian formalism. In addition, they have introduced a model for the current distributions that produce MEG (and EEG) data that allows extended regions of activity, and can easily incorporate prior information such as anatomical constraints from MRI. To evaluate the feasibility and utility of the Bayesian approach with actual data, they analyzed MEG data from a visual evoked response experiment. They compared Bayesian analyses of MEG responses to visual stimuli in the left and right visual fields, in order to examine the sensitivity of the method to detect known features of human visual cortex organization. They also examined the changing pattern of cortical activation as a function of time.

  20. A Joint Bayesian Inversion for Glacial Isostatic Adjustment in North America and Greenland

    NASA Astrophysics Data System (ADS)

    Davis, J. L.; Wang, L.

    2014-12-01

    We have previously presented joint inversions of geodetic data for glacial isostatic adjustment (GIA) fields that employ a Bayesian framework for the combination of data and models. Data sets used include GNSS, GRACE gravity, and tide-gauge data, from which we estimate three-dimensional crustal deformation, geoid rate, and relative sea-level change (RSLC). The benefit of this approach is that solutions are less dependent on any particular Earth/ice model used to calculate the GIA fields, and instead employ a suite of GIA predictions that are then used to calculate statistical constraints. This approach was used both for the determination of the SNARF geodetic reference frame for North America, and for a study of GIA in Fennoscandia (Hill et al., 2010). One challenge to the method we developed is that the inherent reduction in resolution of, and correlation among, GRACE Stokes coefficients caused by the destriping procedure (Swenson and Wahr, 2006; Duan et al., 2009) was not accounted for. This important obstacle has been overcome by developing a Bayesian approach to destriping (Wang et al., in prep.). However, important issues of mixed resolution of these data types still remain. In this presentation, we report on the progress of this effort, and present a new GIA field for North America. For the first time, the region used in the solution includes Greenland, in order to provide internally consistent solutions for GIA, the spatial and temporal variability of present-day sea-level change, and present-day melting in Greenland.

  1. A Bayesian method for the joint estimation of outcrossing rate and inbreeding depression.

    PubMed

    Koelling, V A; Monnahan, P J; Kelly, J K

    2012-12-01

    The population outcrossing rate (t) and adult inbreeding coefficient (F) are key parameters in mating system evolution. The magnitude of inbreeding depression as expressed in the field can be estimated given t and F via the method of Ritland (1990). For a given total sample size, the optimal design for the joint estimation of t and F requires sampling large numbers of families (100-400) with fewer offspring (1-4) per family. Unfortunately, the standard inference procedure (MLTR) yields significantly biased estimates for t and F when family sizes are small and maternal genotypes are unknown (a common occurrence when sampling natural populations). Here, we present a Bayesian method implemented in the program BORICE (Bayesian Outcrossing Rate and Inbreeding Coefficient Estimation) that effectively estimates t and F when family sizes are small and maternal genotype information is lacking. BORICE should enable wider use of the Ritland approach for field-based estimates of inbreeding depression. As proof of concept, we estimate t and F in a natural population of Mimulus guttatus. In addition, we describe how individual maternal inbreeding histories inferred by BORICE may prove useful in studies of inbreeding and its consequences.

  2. Joint spatial Bayesian modeling for studies combining longitudinal and cross-sectional data

    PubMed Central

    Lawson, Andrew B; Carroll, Rachel; Castro, Marcia

    2017-01-01

    Design for intervention studies may combine longitudinal data collected from sampled locations over several survey rounds and cross-sectional data from other locations in the study area. In this case, modeling the impact of the intervention requires an approach that can accommodate both types of data, accounting for the dependence between individuals followed up over time. Inadequate modeling can mask intervention effects, with serious implications for policy making. In this paper we use data from a large-scale larviciding intervention for malaria control implemented in Dar es Salaam, United Republic of Tanzania, collected over a period of almost 5 years. We apply a longitudinal Bayesian spatial model to the Dar es Salaam data, combining follow-up and cross-sectional data, treating the correlation in longitudinal observations separately, and controlling for potential confounders. An innovative feature of this modeling is the use of an Ornstein–Uhlenbeck process to model random time effects. We contrast the results with other Bayesian modeling formulations, including cross-sectional approaches that consider individual-level random effects to account for subjects followed up in two or more surveys. The longitudinal modeling approach indicates that the intervention significantly reduced the prevalence of malaria infection in Dar es Salaam by 20%, whereas the joint model did not suggest significance within the results. Our results suggest that the longitudinal model is to be preferred when longitudinal information is available at the individual level. PMID:24713159

  3. Bayesian Model Assessment in Joint Modeling of Longitudinal and Survival Data with Applications to Cancer Clinical Trials

    PubMed Central

    Zhang, Danjie; Chen, Ming-Hui; Ibrahim, Joseph G.; Boye, Mark E.; Shen, Wei

    2015-01-01

    Joint models for longitudinal and survival data are routinely used in clinical trials or other studies to assess a treatment effect while accounting for longitudinal measures such as patient-reported outcomes (PROs). In the Bayesian framework, the deviance information criterion (DIC) and the logarithm of the pseudo marginal likelihood (LPML) are two well-known Bayesian criteria for comparing joint models. However, these criteria do not provide separate assessments of each component of the joint model. In this paper, we develop a novel decomposition of DIC and LPML to assess the fit of the longitudinal and survival components of the joint model, separately. Based on this decomposition, we then propose new Bayesian model assessment criteria, namely, ΔDIC and ΔLPML, to determine the importance and contribution of the longitudinal (survival) data to the model fit of the survival (longitudinal) data. Moreover, we develop an efficient Monte Carlo method for computing the Conditional Predictive Ordinate (CPO) statistics in the joint modeling setting. A simulation study is conducted to examine the empirical performance of the proposed criteria and the proposed methodology is further applied to a case study in mesothelioma. PMID:28239247
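
    The DIC arithmetic underlying the proposed decomposition can be sketched as follows; the posterior log-likelihood draws and the ΔDIC-style comparison below are invented placeholders, not values from the paper.

```python
# Minimal DIC computation from generic posterior draws; the numbers are
# placeholders standing in for the paper's longitudinal-component comparison.
import numpy as np

def dic(loglik_draws, loglik_at_mean):
    """DIC = Dbar + pD, with Dbar = -2*E[log L] and pD = Dbar - D(theta_bar)."""
    d_bar = -2.0 * np.mean(loglik_draws)
    p_d = d_bar - (-2.0 * loglik_at_mean)
    return d_bar + p_d

# Stand-in for the Delta-DIC idea: the longitudinal component's DIC within
# the joint fit versus fit alone (illustrative draws only).
dic_joint_long = dic(np.array([-120.3, -119.8, -121.1]), -119.5)
dic_alone_long = dic(np.array([-118.9, -118.4, -119.6]), -118.2)
print(f"Delta DIC (longitudinal) = {dic_joint_long - dic_alone_long:.2f}")
```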

  4. Dealing with Reflection Invariance in Bayesian Factor Analysis.

    PubMed

    Erosheva, Elena A; Curtis, S McKay

    2017-03-13

    This paper considers the reflection unidentifiability problem in confirmatory factor analysis (CFA) and the associated implications for Bayesian estimation. We note a direct analogy between the multimodality in CFA models that is due to all possible column sign changes in the matrix of loadings and the multimodality in finite mixture models that is due to all possible relabelings of the mixture components. Drawing on this analogy, we derive and present a simple approach for dealing with reflection invariance in Bayesian factor analysis. We recommend fitting Bayesian factor analysis models without rotational constraints on the loadings, allowing Markov chain Monte Carlo algorithms to explore the full posterior distribution, and then using a relabeling algorithm to pick a factor solution that corresponds to one mode. We demonstrate our approach on the case of a bifactor model; however, the relabeling algorithm is straightforward to generalize for handling multimodalities due to sign invariance in the likelihood in other factor analysis models.

  5. A Bayesian analysis of plutonium exposures in Sellafield workers.

    PubMed

    Puncher, M; Riddell, A E

    2016-03-01

    The joint Russian (Mayak Production Association) and British (Sellafield) plutonium worker epidemiological analysis, undertaken as part of the European Union Framework Programme 7 (FP7) SOLO project, aims to investigate potential associations between cancer incidence and occupational exposures to plutonium using estimates of organ/tissue doses. The dose reconstruction protocol derived for the study makes best use of the most recent biokinetic models derived by the International Commission on Radiological Protection (ICRP), including a recent update to the human respiratory tract model (HRTM). This protocol was used to derive the final point estimates of absorbed doses for the study. Although uncertainties on the dose estimates were not included in the final epidemiological analysis, a separate Bayesian analysis has been performed for each of the 11 808 Sellafield plutonium workers included in the study in order to assess: A. The reliability of the point estimates provided to the epidemiologists and B. The magnitude of the uncertainty on dose estimates. This analysis, which accounts for uncertainties in biokinetic model parameters, intakes and measurement uncertainties, is described in the present paper. The results show that there is excellent agreement between the point estimates of dose and posterior mean values of dose. However, it is also evident that there are significant uncertainties associated with these dose estimates: the geometric range of the 97.5%:2.5% posterior values is a factor of 100 for lung dose, 30 for doses to liver and red bone marrow, and 40 for intakes; these uncertainties are not reflected in estimates of risk when point doses are used to assess them. It is also shown that better estimates of certain key HRTM absorption parameters could significantly reduce the uncertainties on lung dose in future studies.

  6. Bayesian linkage and segregation analysis: factoring the problem.

    PubMed

    Matthysse, S

    2000-01-01

    Complex segregation analysis and linkage methods are mathematical techniques for the genetic dissection of complex diseases. They are used to delineate complex modes of familial transmission and to localize putative disease susceptibility loci to specific chromosomal locations. The computational problem of Bayesian linkage and segregation analysis is one of integration in high-dimensional spaces. In this paper, three available techniques for Bayesian linkage and segregation analysis are discussed: Markov Chain Monte Carlo (MCMC), importance sampling, and exact calculation. The contribution of each to the overall integration will be explicitly discussed.

  7. Time-varying nonstationary multivariate risk analysis using a dynamic Bayesian copula

    NASA Astrophysics Data System (ADS)

    Sarhadi, Ali; Burn, Donald H.; Concepción Ausín, María; Wiper, Michael P.

    2016-03-01

    A time-varying risk analysis is proposed for an adaptive design framework in nonstationary conditions arising from climate change. A Bayesian, dynamic conditional copula is developed for modeling the time-varying dependence structure between mixed continuous and discrete multiattributes of multidimensional hydrometeorological phenomena. Joint Bayesian inference is carried out to fit the marginals and copula in an illustrative example using an adaptive, Gibbs Markov Chain Monte Carlo (MCMC) sampler. Posterior mean estimates and credible intervals are provided for the model parameters, and the Deviance Information Criterion (DIC) is used to select the model that best captures different forms of nonstationarity over time. This study also introduces a fully Bayesian, time-varying joint return period for multivariate time-dependent risk analysis in nonstationary environments. The results demonstrate that the nature and the risk of extreme-climate multidimensional processes change over time under the impact of climate change, and accordingly long-term decision-making strategies should be updated based on the anomalies of the nonstationary environment.
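
    A minimal sketch of the copula building block described above, restricted to two continuous margins for brevity (the paper also handles discrete attributes); the gamma margins and the correlation value are assumed for illustration. In the dynamic version, the copula correlation would itself evolve over time.

```python
# Gaussian-copula building block: joint density of two margins tied by a
# correlation parameter (static here; time-varying in the dynamic model).
import numpy as np
from scipy.stats import norm, gamma, multivariate_normal

def gaussian_copula_logpdf(u, rho):
    """Log-density of a bivariate Gaussian copula at uniforms u = (u1, u2)."""
    z = norm.ppf(u)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    return (multivariate_normal(np.zeros(2), cov).logpdf(z)
            - norm.logpdf(z).sum())

def joint_logpdf(x, rho, margins):
    """Joint log-density: copula at the margins' CDFs plus marginal log-pdfs."""
    u = np.array([m.cdf(xi) for m, xi in zip(margins, x)])
    return gaussian_copula_logpdf(u, rho) + sum(m.logpdf(xi) for m, xi in zip(margins, x))

margins = [gamma(a=2.0, scale=3.0), gamma(a=1.5, scale=5.0)]  # e.g. two flood attributes
print(joint_logpdf(np.array([4.0, 6.0]), rho=0.6, margins=margins))
```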

  8. Spatiotemporal Bayesian inference dipole analysis for MEG neuroimaging data.

    PubMed

    Jun, Sung C; George, John S; Paré-Blagoev, Juliana; Plis, Sergey M; Ranken, Doug M; Schmidt, David M; Wood, C C

    2005-10-15

    Recently, we described a Bayesian inference approach to the MEG/EEG inverse problem that used numerical techniques to estimate the full posterior probability distributions of likely solutions upon which all inferences were based [Schmidt, D.M., George, J.S., Wood, C.C., 1999. Bayesian inference applied to the electromagnetic inverse problem. Human Brain Mapping 7, 195; Schmidt, D.M., George, J.S., Ranken, D.M., Wood, C.C., 2001. Spatial-temporal bayesian inference for MEG/EEG. In: Nenonen, J., Ilmoniemi, R. J., Katila, T. (Eds.), Biomag 2000: 12th International Conference on Biomagnetism. Espoo, Finland, p. 671]. Schmidt et al. (1999) focused on the analysis of data at a single point in time employing an extended region source model. They subsequently extended their work to a spatiotemporal Bayesian inference analysis of the full spatiotemporal MEG/EEG data set. Here, we formulate spatiotemporal Bayesian inference analysis using a multi-dipole model of neural activity. This approach is faster than the extended region model, does not require use of the subject's anatomical information, does not require prior determination of the number of dipoles, and yields quantitative probabilistic inferences. In addition, we have incorporated the ability to handle much more complex and realistic estimates of the background noise, which may be represented as a sum of Kronecker products of temporal and spatial noise covariance components. This reduces the effects of undermodeling noise. In order to reduce the rigidity of the multi-dipole formulation, which commonly causes problems due to multiple local minima, we treat the given covariance of the background as uncertain and marginalize over it in the analysis. Markov Chain Monte Carlo (MCMC) was used to sample the many possible likely solutions. The spatiotemporal Bayesian dipole analysis is demonstrated using simulated and empirical whole-head MEG data.

  9. APPLICATION OF BAYESIAN MONTE CARLO ANALYSIS TO A LAGRANGIAN PHOTOCHEMICAL AIR QUALITY MODEL. (R824792)

    EPA Science Inventory

    Uncertainties in ozone concentrations predicted with a Lagrangian photochemical air quality model have been estimated using Bayesian Monte Carlo (BMC) analysis. Bayesian Monte Carlo analysis provides a means of combining subjective "prior" uncertainty estimates developed ...

  10. Bayesian Analysis of the Pattern Informatics Technique

    NASA Astrophysics Data System (ADS)

    Cho, N.; Tiampo, K.; Klein, W.; Rundle, J.

    2007-12-01

    Pattern informatics (PI) [Rundle et al., 2000; Tiampo et al., 2002; Holliday et al., 2005] is a technique that uses phase dynamics to quantify temporal variations in seismicity patterns. This technique has shown interesting results for forecasting earthquakes with magnitude greater than or equal to 5 in southern California from 2000 to 2010 [Rundle et al., 2002]. In this work, a Bayesian approach is used to obtain a modified, updated version of PI called Bayesian pattern informatics (BPI). This alternative method uses the PI result as a prior probability, and models such as ETAS [Ogata, 1988, 2004; Helmstetter and Sornette, 2002] or BASS [Turcotte et al., 2007] to obtain the likelihood. Its result is similar to the one obtained by PI: the determination of regions, known as hotspots, that are most susceptible to the occurrence of events with M = 5 and larger during the forecast period. As an initial test, retrospective forecasts for the southern California region from 1990 to 2000 were made with both the BPI and the PI techniques, and the results are discussed in this work.
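
    The Bayes update at the core of BPI can be sketched on a toy grid: the PI map plays the role of the prior, and a clustering-model likelihood updates it cell by cell (stand-in numbers below; the real analysis uses ETAS or BASS rates).

```python
# Toy cell-by-cell Bayes update: posterior ∝ PI prior × model likelihood.
import numpy as np

pi_prior = np.array([[0.02, 0.10], [0.30, 0.05]])     # PI hotspot scores (assumed)
model_like = np.array([[0.50, 0.80], [0.90, 0.20]])   # likelihood of M>=5 per cell

posterior = pi_prior * model_like
posterior /= posterior.sum()                          # normalize over the grid
hotspots = posterior > 0.25                           # assumed hotspot threshold
print(posterior.round(3), hotspots, sep="\n")
```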

  11. Bayesian networks as a tool for epidemiological systems analysis

    NASA Astrophysics Data System (ADS)

    Lewis, F. I.

    2012-11-01

    Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter attempts not only to identify statistically associated variables, but to additionally, and empirically, separate these into those directly and indirectly dependent with one or more outcome variables. Such discrimination is vastly more ambitious but has the potential to reveal far more about key features of complex disease systems. Applying Bayesian network modeling to biological and medical data has considerable computational demands, combined with the need to ensure robust model selection given the vast model space of possible DAGs. These challenges require the use of approximation techniques, such as the Laplace approximation, Markov chain Monte Carlo simulation and parametric bootstrapping, along with computational parallelization. A case study in structure discovery - identification of an optimal DAG for given data - is presented which uses additive Bayesian networks to explore veterinary disease data of industrial and medical relevance.

  12. bamr: Bayesian analysis of mass and radius observations

    NASA Astrophysics Data System (ADS)

    Steiner, Andrew W.

    2014-08-01

    bamr is an MPI implementation of a Bayesian analysis of neutron star mass and radius data that determines the mass versus radius curve and the equation of state of dense matter. Written in C++, bamr provides some EOS models. This code requires O2scl (ascl:1408.019) be installed before compilation.

  13. A Comparison of Imputation Methods for Bayesian Factor Analysis Models

    ERIC Educational Resources Information Center

    Merkle, Edgar C.

    2011-01-01

    Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amiable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…

  14. Ensemble forecasting of sub-seasonal to seasonal streamflow by a Bayesian joint probability modelling approach

    NASA Astrophysics Data System (ADS)

    Zhao, Tongtiegang; Schepen, Andrew; Wang, Q. J.

    2016-10-01

    The Bayesian joint probability (BJP) modelling approach is used operationally to produce seasonal (three-month-total) ensemble streamflow forecasts in Australia. However, water resource managers are calling for more informative sub-seasonal forecasts. Taking advantage of BJP's capability of handling multiple predictands, ensemble forecasting of sub-seasonal to seasonal streamflows is investigated for 23 catchments around Australia. Using antecedent streamflow and climate indices as predictors, monthly forecasts are developed for the three-month period ahead. Forecast reliability and skill are evaluated for the period 1982-2011 using a rigorous leave-five-years-out cross validation strategy. BJP ensemble forecasts of monthly streamflow volumes are generally reliable in ensemble spread. Forecast skill, relative to climatology, is positive in 74% of cases in the first month, decreasing to 57% and 46% respectively for streamflow forecasts for the final two months of the season. As forecast skill diminishes with increasing lead time, the monthly forecasts approach climatology. Seasonal forecasts accumulated from monthly forecasts are found to be similarly skilful to forecasts from BJP models based on seasonal totals directly. The BJP modelling approach is demonstrated to be a viable option for producing ensemble time-series sub-seasonal to seasonal streamflow forecasts.

  15. Bayesian analysis of the flutter margin method in aeroelasticity

    NASA Astrophysics Data System (ADS)

    Khalil, Mohammad; Poirel, Dominique; Sarkar, Abhijit

    2016-12-01

    A Bayesian statistical framework is presented for the Zimmerman and Weissenburger flutter margin method, which considers the uncertainties in aeroelastic modal parameters. The proposed methodology overcomes the limitations of the previously developed least-square based estimation technique, which relies on the Gaussian approximation of the flutter margin probability density function (pdf). Using the measured free-decay responses at subcritical (preflutter) airspeeds, the joint non-Gaussian posterior pdf of the modal parameters is sampled using the Metropolis-Hastings (MH) Markov chain Monte Carlo (MCMC) algorithm. The posterior MCMC samples of the modal parameters are then used to obtain the flutter margin pdfs and finally the flutter speed pdf. The usefulness of the Bayesian flutter margin method is demonstrated using synthetic data generated from a two-degree-of-freedom pitch-plunge aeroelastic model. The robustness of the statistical framework is demonstrated using different sets of measurement data. It will be shown that the probabilistic (Bayesian) approach reduces the number of test points required in providing a flutter speed estimate for a given accuracy and precision.
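
    A generic sketch of the sampling step described above is given below. The log-posterior of the modal parameters and the map from modal parameters to the flutter margin are placeholders; the actual Zimmerman-Weissenburger expressions and the measured free-decay likelihood are in the paper.

```python
# Generic random-walk Metropolis-Hastings sampler; both target functions
# below are placeholders, not the paper's aeroelastic model.
import numpy as np

rng = np.random.default_rng(2)

def log_post(theta):
    """Placeholder log-posterior for the aeroelastic modal parameters."""
    return -0.5 * np.sum((theta - np.array([2.0, 0.3]))**2 / np.array([0.1, 0.01]))

def flutter_margin(theta):
    """Placeholder map from modal parameters to the flutter margin."""
    freq, damping = theta
    return freq * damping

theta = np.array([1.5, 0.2]); samples = []
for it in range(20000):
    prop = theta + rng.normal(0.0, 0.05, size=2)      # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                                   # accept
    samples.append(theta)

# Push posterior draws through the margin map; burn-in dropped.
margins = np.array([flutter_margin(t) for t in samples[5000:]])
print("flutter margin mean +/- sd:", margins.mean(), margins.std())
```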

  16. Bayesian analysis of the flutter margin method in aeroelasticity

    SciTech Connect

    Khalil, Mohammad; Poirel, Dominique; Sarkar, Abhijit

    2016-08-27

    A Bayesian statistical framework is presented for the Zimmerman and Weissenburger flutter margin method, which considers the uncertainties in aeroelastic modal parameters. The proposed methodology overcomes the limitations of the previously developed least-square based estimation technique, which relies on the Gaussian approximation of the flutter margin probability density function (pdf). Using the measured free-decay responses at subcritical (preflutter) airspeeds, the joint non-Gaussian posterior pdf of the modal parameters is sampled using the Metropolis–Hastings (MH) Markov chain Monte Carlo (MCMC) algorithm. The posterior MCMC samples of the modal parameters are then used to obtain the flutter margin pdfs and finally the flutter speed pdf. The usefulness of the Bayesian flutter margin method is demonstrated using synthetic data generated from a two-degree-of-freedom pitch-plunge aeroelastic model. The robustness of the statistical framework is demonstrated using different sets of measurement data. In conclusion, it will be shown that the probabilistic (Bayesian) approach reduces the number of test points required in providing a flutter speed estimate for a given accuracy and precision.

  17. Bayesian analysis of the flutter margin method in aeroelasticity

    DOE PAGES

    Khalil, Mohammad; Poirel, Dominique; Sarkar, Abhijit

    2016-08-27

    A Bayesian statistical framework is presented for the Zimmerman and Weissenburger flutter margin method, which considers the uncertainties in aeroelastic modal parameters. The proposed methodology overcomes the limitations of the previously developed least-square based estimation technique, which relies on the Gaussian approximation of the flutter margin probability density function (pdf). Using the measured free-decay responses at subcritical (preflutter) airspeeds, the joint non-Gaussian posterior pdf of the modal parameters is sampled using the Metropolis–Hastings (MH) Markov chain Monte Carlo (MCMC) algorithm. The posterior MCMC samples of the modal parameters are then used to obtain the flutter margin pdfs and finally the flutter speed pdf. The usefulness of the Bayesian flutter margin method is demonstrated using synthetic data generated from a two-degree-of-freedom pitch-plunge aeroelastic model. The robustness of the statistical framework is demonstrated using different sets of measurement data. In conclusion, it will be shown that the probabilistic (Bayesian) approach reduces the number of test points required in providing a flutter speed estimate for a given accuracy and precision.

  18. Bayesian Analysis of Nonlinear Structural Equation Models with Nonignorable Missing Data

    ERIC Educational Resources Information Center

    Lee, Sik-Yum

    2006-01-01

    A Bayesian approach is developed for analyzing nonlinear structural equation models with nonignorable missing data. The nonignorable missingness mechanism is specified by a logistic regression model. A hybrid algorithm that combines the Gibbs sampler and the Metropolis-Hastings algorithm is used to produce the joint Bayesian estimates of…

  19. Application of Bayesian graphs to SN Ia data analysis and compression

    NASA Astrophysics Data System (ADS)

    Ma, Cong; Corasaniti, Pier-Stefano; Bassett, Bruce A.

    2016-12-01

    Bayesian graphical models are an efficient tool for modelling complex data and derive self-consistent expressions of the posterior distribution of model parameters. We apply Bayesian graphs to perform statistical analyses of Type Ia supernova (SN Ia) luminosity distance measurements from the joint light-curve analysis (JLA) data set. In contrast to the χ2 approach used in previous studies, the Bayesian inference allows us to fully account for the standard-candle parameter dependence of the data covariance matrix. Comparing with χ2 analysis results, we find a systematic offset of the marginal model parameter bounds. We demonstrate that the bias is statistically significant in the case of the SN Ia standardization parameters with a maximal 6σ shift of the SN light-curve colour correction. In addition, we find that the evidence for a host galaxy correction is now only 2.4σ. Systematic offsets on the cosmological parameters remain small, but may increase by combining constraints from complementary cosmological probes. The bias of the χ2 analysis is due to neglecting the parameter-dependent log-determinant of the data covariance, which gives more statistical weight to larger values of the standardization parameters. We find a similar effect on compressed distance modulus data. To this end, we implement a fully consistent compression method of the JLA data set that uses a Gaussian approximation of the posterior distribution for fast generation of compressed data. Overall, the results of our analysis emphasize the need for a fully consistent Bayesian statistical approach in the analysis of future large SN Ia data sets.

  20. A Bayesian Analysis of Scale-Invariant Processes

    DTIC Science & Technology

    2012-01-01

    Nieves, Veronica; Wang, Jingfeng; Bras, Rafael L. A Bayesian analysis of scale-invariant processes. AIP Conf. Proc. 1443, 56 (2012); doi: 10.1063/1.3703620.

  1. An Overview of Bayesian Methods for Neural Spike Train Analysis

    PubMed Central

    2013-01-01

    Neural spike train analysis is an important task in computational neuroscience which aims to understand neural mechanisms and gain insights into neural circuits. With the advancement of multielectrode recording and imaging technologies, it has become increasingly demanding to develop statistical tools for analyzing large neuronal ensemble spike activity. Here we present a tutorial overview of Bayesian methods and their representative applications in neural spike train analysis, at both single neuron and population levels. On the theoretical side, we focus on various approximate Bayesian inference techniques as applied to latent state and parameter estimation. On the application side, the topics include spike sorting, tuning curve estimation, neural encoding and decoding, deconvolution of spike trains from calcium imaging signals, and inference of neuronal functional connectivity and synchrony. Some research challenges and opportunities for neural spike train analysis are discussed. PMID:24348527

  2. Bayesian phylogeny analysis via stochastic approximation Monte Carlo.

    PubMed

    Cheon, Sooyoung; Liang, Faming

    2009-11-01

    Monte Carlo methods have received much attention in the recent literature of phylogeny analysis. However, the conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode in simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo (SAMC) algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software packages, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes. Among the three methods, SAMC produces the consensus trees which have the highest similarity to the true trees, and the model parameter estimates which have the smallest mean square errors, but costs the least CPU time.

  3. A Bayesian Analysis of the Flood Frequency Hydrology Concept

    DTIC Science & Technology

    2016-02-01

    by Brian E. Skahill, Alberto Viglione, and Aaron Byrd. PURPOSE: The purpose of this document is to demonstrate a Bayesian analysis of the flood frequency hydrology concept as a formal probabilistic-based means by which to coherently combine and also evaluate the worth of different types ... and development. INTRODUCTION: Merz and Blöschl (2008a,b) proposed the concept of flood frequency hydrology, which emphasizes the importance of

  4. Integrated Data Analysis for Fusion: A Bayesian Tutorial for Fusion Diagnosticians

    SciTech Connect

    Dinklage, Andreas; Dreier, Heiko; Preuss, Roland; Fischer, Rainer; Gori, Silvio; Toussaint, Udo von

    2008-03-12

    Integrated Data Analysis (IDA) offers a unified way of combining information relevant to fusion experiments. Thereby, IDA meets with typical issues arising in fusion data analysis. In IDA, all information is consistently formulated as probability density functions quantifying uncertainties in the analysis within the Bayesian probability theory. For a single diagnostic, IDA allows the identification of faulty measurements and improvements in the setup. For a set of diagnostics, IDA gives joint error distributions allowing the comparison and integration of different diagnostics results. Validation of physics models can be performed by model comparison techniques. Typical data analysis applications benefit from IDA capabilities of nonlinear error propagation, the inclusion of systematic effects and the comparison of different physics models. Applications range from outlier detection, background discrimination, model assessment and design of diagnostics. In order to cope with next step fusion device requirements, appropriate techniques are explored for fast analysis applications.

  5. Integrated Data Analysis for Fusion: A Bayesian Tutorial for Fusion Diagnosticians

    NASA Astrophysics Data System (ADS)

    Dinklage, Andreas; Dreier, Heiko; Fischer, Rainer; Gori, Silvio; Preuss, Roland; Toussaint, Udo von

    2008-03-01

    Integrated Data Analysis (IDA) offers a unified way of combining information relevant to fusion experiments. Thereby, IDA meets with typical issues arising in fusion data analysis. In IDA, all information is consistently formulated as probability density functions quantifying uncertainties in the analysis within the Bayesian probability theory. For a single diagnostic, IDA allows the identification of faulty measurements and improvements in the setup. For a set of diagnostics, IDA gives joint error distributions allowing the comparison and integration of different diagnostics results. Validation of physics models can be performed by model comparison techniques. Typical data analysis applications benefit from IDA capabilities of nonlinear error propagation, the inclusion of systematic effects and the comparison of different physics models. Applications range from outlier detection, background discrimination, model assessment and design of diagnostics. In order to cope with next step fusion device requirements, appropriate techniques are explored for fast analysis applications.

  6. BAYESIAN SEMIPARAMETRIC ANALYSIS FOR TWO-PHASE STUDIES OF GENE-ENVIRONMENT INTERACTION

    PubMed Central

    Ahn, Jaeil; Mukherjee, Bhramar; Gruber, Stephen B.; Ghosh, Malay

    2013-01-01

    The two-phase sampling design is a cost-efficient way of collecting expensive covariate information on a judiciously selected sub-sample. It is natural to apply such a strategy for collecting genetic data in a sub-sample enriched for exposure to environmental factors for gene-environment interaction (G × E) analysis. In this paper, we consider two-phase studies of G × E interaction where phase I data are available on exposure, covariates and disease status. Stratified sampling is done to prioritize individuals for genotyping at phase II conditional on disease and exposure. We consider a Bayesian analysis based on the joint retrospective likelihood of phase I and phase II data. We address several important statistical issues: (i) we consider a model with multiple genes, environmental factors and their pairwise interactions. We employ a Bayesian variable selection algorithm to reduce the dimensionality of this potentially high-dimensional model; (ii) we use the assumption of gene-gene and gene-environment independence to trade off bias against efficiency when estimating the interaction parameters, through the use of hierarchical priors reflecting this assumption; (iii) we posit a flexible model for the joint distribution of the phase I categorical variables using the non-parametric Bayes construction of Dunson and Xing (2009). We carry out a small-scale simulation study to compare the proposed Bayesian method with weighted likelihood and pseudo likelihood methods that are standard choices for analyzing two-phase data. The motivating example originates from an ongoing case-control study of colorectal cancer, where the goal is to explore the interaction between the use of statins (a drug used for lowering lipid levels) and 294 genetic markers in the lipid metabolism/cholesterol synthesis pathway. The sub-sample of cases and controls on which these genetic markers were measured is enriched in terms of statin users. The example and simulation results illustrate that the

  7. Bayesian tomography and integrated data analysis in fusion diagnostics

    NASA Astrophysics Data System (ADS)

    Li, Dong; Dong, Y. B.; Deng, Wei; Shi, Z. B.; Fu, B. Z.; Gao, J. M.; Wang, T. B.; Zhou, Yan; Liu, Yi; Yang, Q. W.; Duan, X. R.

    2016-11-01

    In this article, a Bayesian tomography method using a non-stationary Gaussian process prior is introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. This method has been implemented for a soft X-ray diagnostic on HL-2A and used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.

  8. BAYESIAN ANALYSIS OF MULTIPLE HARMONIC OSCILLATIONS IN THE SOLAR CORONA

    SciTech Connect

    Arregui, I.; Asensio Ramos, A.; Diaz, A. J.

    2013-03-01

    The detection of multiple mode harmonic kink oscillations in coronal loops enables us to obtain information on coronal density stratification and magnetic field expansion using seismology inversion techniques. The inference is based on the measurement of the period ratio between the fundamental mode and the first overtone and theoretical results for the period ratio under the hypotheses of coronal density stratification and magnetic field expansion of the wave guide. We present a Bayesian analysis of multiple mode harmonic oscillations for the inversion of the density scale height and magnetic flux tube expansion under each of the hypotheses. The two models are then compared using a Bayesian model comparison scheme to assess how plausible each one is given our current state of knowledge.

  9. Bayesian methods for the analysis of inequality constrained contingency tables.

    PubMed

    Laudy, Olav; Hoijtink, Herbert

    2007-04-01

    A Bayesian methodology for the analysis of inequality constrained models for contingency tables is presented. The problem of interest lies in obtaining estimates of functions of cell probabilities subject to inequality constraints, testing hypotheses and selecting the best model. Constraints on conditional cell probabilities and on local, global, continuation and cumulative odds ratios are discussed. A Gibbs sampler is used to obtain a discrete representation of the posterior distribution of the inequality constrained parameters. Using this discrete representation, credibility regions of functions of cell probabilities can be constructed. Posterior model probabilities are used for model selection, and hypotheses are tested using posterior predictive checks. The proposed Bayesian methodology is illustrated in two examples.
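
    One standard Monte Carlo realization of this idea (in the spirit of the encompassing-prior approach associated with Hoijtink's work; whether it matches the paper's exact scheme is an assumption) estimates the proportion of unconstrained prior and posterior draws that satisfy the inequality:

```python
# Encompassing-prior Bayes factor for one inequality constraint on a
# 2x2 table; counts and prior are illustrative.
import numpy as np

rng = np.random.default_rng(2)
counts = np.array([30, 20, 15, 35])   # cells (1,1),(1,2),(2,1),(2,2)
alpha = np.ones(4)                     # uniform Dirichlet prior

def prop_satisfying(a, n=100_000):
    p = rng.dirichlet(a, size=n)
    p11, p12, p21, p22 = p.T
    # Constraint: P(col 1 | row 1) > P(col 1 | row 2)
    return np.mean(p11 / (p11 + p12) > p21 / (p21 + p22))

bf = prop_satisfying(alpha + counts) / prop_satisfying(alpha)
print("Bayes factor (constrained vs encompassing):", bf)
```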

  10. A Bayesian Double Fusion Model for Resting State Brain Connectivity Using Joint Functional and Structural Data.

    PubMed

    Kang, Hakmook; Ombao, Hernando; Fonnesbeck, Christopher; Ding, Zhaohua; Morgan, Victoria L

    2017-03-19

    Current approaches separately analyze concurrently acquired diffusion tensor imaging (DTI) and functional magnetic resonance imaging (fMRI) data. The primary limitation of these approaches is that they do not take advantage of the information from DTI that could potentially enhance estimation of resting-state functional connectivity (FC) between brain regions. To overcome this limitation, we develop a Bayesian hierarchical spatio-temporal model that incorporates structural connectivity (SC) into estimating FC. In our proposed approach, SC based on DTI data is used to construct an informative prior for FC based on resting-state fMRI data via the Cholesky decomposition. Simulation studies showed that incorporating the two modalities produced significantly reduced mean squared errors compared to the standard approach of analyzing the data from each modality separately. We applied our model to resting-state DTI and fMRI data collected to estimate FC between the brain regions hypothesized to be important in the origination and spread of temporal lobe epilepsy seizures. Our analysis concludes that the proposed model achieves smaller false positive rates and is much more robust to data decimation compared to the conventional approach.
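
    The central idea, letting SC inform the prior on FC, can be sketched in a few lines; the SC matrix, shrinkage weight and SC-to-correlation mapping below are invented, and the paper's full hierarchical spatio-temporal model is not reproduced.

```python
# SC-informed prior for FC, blended with the sample correlation.
import numpy as np

rng = np.random.default_rng(3)
R = 4                                      # brain regions (toy)

sc = np.array([[0, .8, .1, 0],
               [.8, 0, .2, .1],
               [.1, .2, 0, .7],
               [0, .1, .7, 0]], float)     # normalized fiber counts

prior_fc = np.eye(R) + 0.5 * sc            # SC-informed prior correlation
ts = rng.normal(size=(200, R))             # resting-state series (toy)
sample_fc = np.corrcoef(ts, rowvar=False)

w = 0.3                                    # prior weight (assumed)
post_fc = w * prior_fc + (1 - w) * sample_fc

# Cholesky factorization, as used in the paper to encode the prior,
# also certifies positive definiteness of the blended estimate.
L = np.linalg.cholesky(post_fc)
print(np.round(post_fc, 2))
```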

  11. A Bayesian Nonparametric Meta-Analysis Model

    ERIC Educational Resources Information Center

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G.

    2015-01-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall…

  12. Risk analysis using a hybrid Bayesian-approximate reasoning methodology.

    SciTech Connect

    Bott, T. F.; Eisenhawer, S. W.

    2001-01-01

    Analysts are sometimes asked to make frequency estimates for specific accidents in which the accident frequency is determined primarily by safety controls. Under these conditions, frequency estimates rely considerably on expert belief about how the controls affect the accident frequency. To evaluate and document beliefs about control effectiveness, we have modified a traditional Bayesian approach by using approximate reasoning (AR) to develop prior distributions. Our method produces accident frequency estimates that separately express the probabilistic results produced in Bayesian analysis and possibilistic results that reflect uncertainty about the prior estimates. Based on our experience using traditional methods, we feel that the AR approach better documents beliefs about the effectiveness of controls than if the beliefs are buried in Bayesian prior distributions. We have performed numerous expert elicitations in which probabilistic information was sought from subject matter experts not trained in probability. We find it much easier to elicit the linguistic variables and fuzzy set membership values used in AR than to obtain prior probability distributions directly from these experts, because the AR approach better captures their beliefs and better expresses their uncertainties.

  13. Acute Abdominal Pain: Bayesian Analysis in the Emergency Room

    PubMed Central

    Harvey, A. C.; Moodie, P. F.

    1982-01-01

    A non-sequential Bayesian analysis was deemed a suitable approach to the important clinical problem of analyzing acute abdominal pain in the Emergency Room. Using series reported in the literature as a data source, complemented by expert clinical estimates of the probabilities of clinical findings, a program has been established in St. Boniface, Canada. Prior to implementing the program as an online, quickly available diagnostic aid, a prospective preliminary study showed that the performance of computer plus clinician is significantly better than that of either clinician or computer alone. A major emphasis has been on developing the acceptability of the program for real-life diagnoses in the Emergency Room.
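
    The underlying computation is a single (non-sequential) Bayes update over candidate diagnoses. The sketch below uses invented diseases, findings and probabilities purely to illustrate the mechanics.

```python
# Naive Bayes update over diagnoses given observed findings.
import numpy as np

diseases = ["appendicitis", "cholecystitis", "non-specific pain"]
prior = np.array([0.15, 0.10, 0.75])       # illustrative priors

# P(finding | disease) for two findings: RLQ tenderness, fever.
p_find = np.array([[0.85, 0.20],
                   [0.15, 0.45],
                   [0.25, 0.10]])

findings = [True, True]                    # both findings present
like = np.ones(3)
for j, present in enumerate(findings):
    like *= p_find[:, j] if present else 1.0 - p_find[:, j]

post = prior * like
post /= post.sum()
for d, p in zip(diseases, post):
    print(f"{d}: {p:.2f}")
```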

  14. Spectral Analysis of B Stars: An Application of Bayesian Statistics

    NASA Astrophysics Data System (ADS)

    Mugnes, J.-M.; Robert, C.

    2012-12-01

    To better understand the processes involved in stellar physics, it is necessary to obtain accurate stellar parameters (effective temperature, surface gravity, abundances…). Spectral analysis is a powerful tool for investigating stars, but it is also vital to reduce uncertainties at a decent computational cost. Here we present a spectral analysis method based on a combination of Bayesian statistics and grids of synthetic spectra obtained with TLUSTY. This method simultaneously constrains the stellar parameters by using all the lines accessible in observed spectra and thus greatly reduces uncertainties and improves the overall spectrum fitting. Preliminary results are shown using spectra from the Observatoire du Mont-Mégantic.
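
    A grid posterior over two stellar parameters under a chi-squared spectral likelihood captures the gist of this approach; the toy model_spectrum function below stands in for a TLUSTY grid lookup, and all numbers are illustrative.

```python
# Grid-based posterior over (Teff, log g) from a spectral likelihood.
import numpy as np

rng = np.random.default_rng(4)
wave = np.linspace(4000, 4500, 300)        # wavelength grid (Angstrom)

def model_spectrum(teff, logg):
    """Toy stand-in for interpolating a TLUSTY synthetic spectrum."""
    depth = 0.5 * (teff / 20000.0)
    width = 2.0 + logg
    return 1.0 - depth * np.exp(-((wave - 4340.0) ** 2) / width**2)

obs = model_spectrum(18000, 3.9) + 0.02 * rng.normal(size=wave.size)

teffs = np.linspace(15000, 25000, 60)
loggs = np.linspace(3.0, 4.5, 40)
logpost = np.array([[-0.5 * np.sum((obs - model_spectrum(t, g)) ** 2) / 0.02**2
                     for g in loggs] for t in teffs])

post = np.exp(logpost - logpost.max())
post /= post.sum()
# Marginalize: every pixel constrains both parameters jointly.
print("P(Teff) peak:", teffs[post.sum(axis=1).argmax()])
print("P(logg) peak:", loggs[post.sum(axis=0).argmax()])
```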

  15. BaalChIP: Bayesian analysis of allele-specific transcription factor binding in cancer genomes.

    PubMed

    de Santiago, Ines; Liu, Wei; Yuan, Ke; O'Reilly, Martin; Chilamakuri, Chandra Sekhar Reddy; Ponder, Bruce A J; Meyer, Kerstin B; Markowetz, Florian

    2017-02-24

    Allele-specific measurements of transcription factor binding from ChIP-seq data are key to dissecting the allelic effects of non-coding variants and their contribution to phenotypic diversity. However, most methods of detecting an allelic imbalance assume diploid genomes. This assumption severely limits their applicability to cancer samples with frequent DNA copy-number changes. Here we present a Bayesian statistical approach called BaalChIP to correct for the effect of background allele frequency on the observed ChIP-seq read counts. BaalChIP allows the joint analysis of multiple ChIP-seq samples across a single variant and outperforms competing approaches in simulations. Using 548 ENCODE ChIP-seq and six targeted FAIRE-seq samples, we show that BaalChIP effectively corrects allele-specific analysis for copy-number variation and increases the power to detect putative cis-acting regulatory variants in cancer genomes.
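
    The core correction can be sketched as a beta-binomial test of the reference-allele fraction against a background allele frequency; the counts and prior strength below are invented, and this is not BaalChIP's full model.

```python
# Testing allelic imbalance against a non-diploid background (RAF).
from scipy import stats

ref, alt = 36, 14                # ChIP-seq read counts at the variant
raf = 0.5                        # background reference-allele fraction
                                 # (would shift under a copy-number gain)
k = 10.0                         # prior strength (assumed)

# Beta prior centred on the RAF, updated with the observed counts.
post = stats.beta(raf * k + ref, (1 - raf) * k + alt)
lo, hi = post.ppf([0.025, 0.975])
print(f"95% interval for ref fraction: ({lo:.2f}, {hi:.2f})")
print("allelic imbalance" if raf < lo or raf > hi else "consistent with RAF")
```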

  16. A Bayesian analysis of pentaquark signals from CLAS data

    SciTech Connect

    David Ireland; Bryan McKinnon; Dan Protopopescu; Pawel Ambrozewicz; Marco Anghinolfi; G. Asryan; Harutyun Avakian; H. Bagdasaryan; Nathan Baillie; Jacques Ball; Nathan Baltzell; V. Batourine; Marco Battaglieri; Ivan Bedlinski; Ivan Bedlinskiy; Matthew Bellis; Nawal Benmouna; Barry Berman; Angela Biselli; Lukasz Blaszczyk; Sylvain Bouchigny; Sergey Boyarinov; Robert Bradford; Derek Branford; William Briscoe; William Brooks; Volker Burkert; Cornel Butuceanu; John Calarco; Sharon Careccia; Daniel Carman; Liam Casey; Shifeng Chen; Lu Cheng; Philip Cole; Patrick Collins; Philip Coltharp; Donald Crabb; Volker Crede; Natalya Dashyan; Rita De Masi; Raffaella De Vita; Enzo De Sanctis; Pavel Degtiarenko; Alexandre Deur; Richard Dickson; Chaden Djalali; Gail Dodge; Joseph Donnelly; David Doughty; Michael Dugger; Oleksandr Dzyubak; Hovanes Egiyan; Kim Egiyan; Lamiaa Elfassi; Latifa Elouadrhiri; Paul Eugenio; Gleb Fedotov; Gerald Feldman; Ahmed Fradi; Herbert Funsten; Michel Garcon; Gagik Gavalian; Nerses Gevorgyan; Gerard Gilfoyle; Kevin Giovanetti; Francois-Xavier Girod; John Goetz; Wesley Gohn; Atilla Gonenc; Ralf Gothe; Keith Griffioen; Michel Guidal; Nevzat Guler; Lei Guo; Vardan Gyurjyan; Kawtar Hafidi; Hayk Hakobyan; Charles Hanretty; Neil Hassall; F. Hersman; Ishaq Hleiqawi; Maurik Holtrop; Charles Hyde; Yordanka Ilieva; Boris Ishkhanov; Eugeny Isupov; D. Jenkins; Hyon-Suk Jo; John Johnstone; Kyungseon Joo; Henry Juengst; Narbe Kalantarians; James Kellie; Mahbubul Khandaker; Wooyoung Kim; Andreas Klein; Franz Klein; Mikhail Kossov; Zebulun Krahn; Laird Kramer; Valery Kubarovsky; Joachim Kuhn; Sergey Kuleshov; Viacheslav Kuznetsov; Jeff Lachniet; Jean Laget; Jorn Langheinrich; D. Lawrence; Kenneth Livingston; Haiyun Lu; Marion MacCormick; Nikolai Markov; Paul Mattione; Bernhard Mecking; Mac Mestayer; Curtis Meyer; Tsutomu Mibe; Konstantin Mikhaylov; Marco Mirazita; Rory Miskimen; Viktor Mokeev; Brahim Moreno; Kei Moriya; Steven Morrow; Maryam Moteabbed; Edwin Munevar Espitia; Gordon Mutchler; Pawel Nadel-Turonski; Rakhsha Nasseripour; Silvia Niccolai; Gabriel Niculescu; Maria-Ioana Niculescu; Bogdan Niczyporuk; Megh Niroula; Rustam Niyazov; Mina Nozar; Mikhail Osipenko; Alexander Ostrovidov; Kijun Park; Evgueni Pasyuk; Craig Paterson; Sergio Pereira; Joshua Pierce; Nikolay Pivnyuk; Oleg Pogorelko; Sergey Pozdnyakov; John Price; Sebastien Procureur; Yelena Prok; Brian Raue; Giovanni Ricco; Marco Ripani; Barry Ritchie; Federico Ronchetti; Guenther Rosner; Patrizia Rossi; Franck Sabatie; Julian Salamanca; Carlos Salgado; Joseph Santoro; Vladimir Sapunenko; Reinhard Schumacher; Vladimir Serov; Youri Sharabian; Dmitri Sharov; Nikolay Shvedunov; Elton Smith; Lee Smith; Daniel Sober; Daria Sokhan; Aleksey Stavinskiy; Samuel Stepanyan; Stepan Stepanyan; Burnham Stokes; Paul Stoler; Steffen Strauch; Mauro Taiuti; David Tedeschi; Ulrike Thoma; Avtandil Tkabladze; Svyatoslav Tkachenko; Clarisse Tur; Maurizio Ungaro; Michael Vineyard; Alexander Vlassov; Daniel Watts; Lawrence Weinstein; Dennis Weygand; M. Williams; Elliott Wolin; M.H. Wood; Amrit Yegneswaran; Lorenzo Zana; Jixie Zhang; Bo Zhao; Zhiwen Zhao

    2008-02-01

    We examine the results of two measurements by the CLAS collaboration, one of which claimed evidence for a $\Theta^{+}$ pentaquark, whilst the other found no such evidence. The unique feature of these two experiments was that they were performed with the same experimental setup. Using a Bayesian analysis we find that the results of the two experiments are in fact compatible with each other, but that the first measurement did not contain sufficient information to determine unambiguously the existence of a $\Theta^{+}$. Further, we suggest a means by which the existence of a new candidate particle can be tested in a rigorous manner.

  17. A Bayesian Approach to Joint Modeling of Protein-DNA Binding, Gene Expression and Sequence Data

    PubMed Central

    Xie, Yang; Pan, Wei; Jeong, Kyeong S.; Xiao, Guanghua; Khodursky, Arkady B.

    2012-01-01

    The genome-wide DNA-protein binding data, DNA sequence data and gene expression data represent complementary means to deciphering global and local transcriptional regulatory circuits. Combining these different types of data can not only improve the statistical power, but also provide a more comprehensive picture of gene regulation. In this paper, we propose a novel statistical model to augment protein-DNA binding data with gene expression and DNA sequence data when available. We specify a hierarchical Bayes model and use Markov chain Monte Carlo simulations to draw inferences. Both simulation studies and an analysis of an experimental dataset show that the proposed joint modeling method can significantly improve the specificity and sensitivity of identifying target genes as compared to conventional approaches relying on a single data source. PMID:20049751

  18. Bayesian analysis of inflationary features in Planck and SDSS data

    NASA Astrophysics Data System (ADS)

    Benetti, Micol; Alcaniz, Jailson S.

    2016-07-01

    We perform a Bayesian analysis to study possible features in the primordial inflationary power spectrum of scalar perturbations. In particular, we analyze the possibility of detecting the imprint of these primordial features in the anisotropy temperature power spectrum of the cosmic microwave background (CMB) and also in the matter power spectrum P(k). We use the most recent CMB data provided by the Planck Collaboration and P(k) measurements from the 11th data release of the Sloan Digital Sky Survey. We focus our analysis on a class of potentials whose features are localized at different intervals of angular scales, corresponding to multipoles in the ranges 10 < ℓ < 60 (Oscill-1) and 150 < ℓ < 300 (Oscill-2). Our results show that one of the step potentials (Oscill-1) provides a better fit to the CMB data than does the featureless ΛCDM scenario, with moderate Bayesian evidence in favor of the former. Adding the P(k) data to the analysis weakens the evidence of the Oscill-1 potential relative to the standard model and strengthens the evidence of this latter scenario with respect to the Oscill-2 model.

  19. Implementation of a Bayesian Engine for Uncertainty Analysis

    SciTech Connect

    Leng Vang; Curtis Smith; Steven Prescott

    2014-08-01

    In probabilistic risk assessment, it is important to have an environment where analysts have access to shared and secured high-performance computing and a statistical analysis tool package. As part of the advanced small modular reactor probabilistic risk analysis framework implementation, we have identified the need for advanced Bayesian computations. However, in order to make this technology available to non-specialists, there is also a need for a simplified tool that allows users to author models and evaluate them within this framework. As a proof-of-concept, we have implemented an advanced open source Bayesian inference tool, OpenBUGS, within the browser-based cloud risk analysis framework that is under development at the Idaho National Laboratory. This development, the “OpenBUGS Scripter”, has been implemented as a client-side, visual, web-based and integrated development environment for creating OpenBUGS language scripts. It depends on the shared server environment to execute the generated scripts and to transmit results back to the user. The visual models are in the form of linked diagrams, from which we automatically create the applicable OpenBUGS script that matches the diagram. These diagrams can be saved locally or stored on the server environment to be shared with other users.

  20. Regional Frequency Analysis Based on Scaling Properties and Bayesian Models

    NASA Astrophysics Data System (ADS)

    Kwon, Hyun-Han; Lee, Jeong-Ju; Moon, Young-Il

    2010-05-01

    A regional frequency analysis based on a Hierarchical Bayesian Network (HBN) and scaling theory was developed, using many recording rain gauges over South Korea. First, a scaling approach combined with an extreme value distribution was employed to derive a regional formula for frequency analysis. Second, the HBN model was used to represent additional information about the regional structure of the scaling parameters, especially the location and shape parameters. The location and shape parameters of the extreme value distribution were estimated by utilizing scaling properties in a regression framework, and the scaling parameters linking these parameters to the various durations were estimated simultaneously. The regional frequency analysis combining the HBN and scaling properties was found to show promising results in terms of establishing regional IDF curves.
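
    The scaling step can be illustrated by estimating a power-law exponent linking a distribution parameter to duration via log-log regression; the synthetic data and exponent below are assumptions, not the Korean gauge record.

```python
# Power-law scaling of a location parameter with rainfall duration.
import numpy as np

rng = np.random.default_rng(5)
durations = np.array([1, 2, 3, 6, 12, 24])          # hours
eta_true = 0.45                                      # scaling exponent

# Toy annual-maximum location estimates per duration: mu(d) = mu1 * d**eta.
mu_hat = 20.0 * durations**eta_true * (1 + 0.05 * rng.normal(size=6))

slope, intercept = np.polyfit(np.log(durations), np.log(mu_hat), 1)
print(f"estimated scaling exponent eta = {slope:.2f}")
print(f"location at duration d: mu(d) = {np.exp(intercept):.1f} * d^{slope:.2f}")
```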

  1. Bayesian principal geodesic analysis for estimating intrinsic diffeomorphic image variability.

    PubMed

    Zhang, Miaomiao; Fletcher, P Thomas

    2015-10-01

    In this paper, we present a generative Bayesian approach for estimating the low-dimensional latent space of diffeomorphic shape variability in a population of images. We develop a latent variable model for principal geodesic analysis (PGA) that provides a probabilistic framework for factor analysis in the space of diffeomorphisms. A sparsity prior in the model results in automatic selection of the number of relevant dimensions by driving unnecessary principal geodesics to zero. To infer model parameters, including the image atlas, principal geodesic deformations, and the effective dimensionality, we introduce an expectation maximization (EM) algorithm. We evaluate our proposed model on 2D synthetic data and the 3D OASIS brain database of magnetic resonance images, and show that the automatically selected latent dimensions from our model are able to reconstruct unobserved testing images with lower error than both linear principal component analysis (LPCA) in the image space and tangent space principal component analysis (TPCA) in the diffeomorphism space.

  2. Multivariate Bayesian analysis of Gaussian, right censored Gaussian, ordered categorical and binary traits using Gibbs sampling.

    PubMed

    Korsgaard, Inge Riis; Lund, Mogens Sandø; Sorensen, Daniel; Gianola, Daniel; Madsen, Per; Jensen, Just

    2003-01-01

    A fully Bayesian analysis using Gibbs sampling and data augmentation in a multivariate model of Gaussian, right censored, and grouped Gaussian traits is described. The grouped Gaussian traits are either ordered categorical traits (with more than two categories) or binary traits, where the grouping is determined via thresholds on the underlying Gaussian scale, the liability scale. Allowances are made for unequal models, unknown covariance matrices and missing data. Having outlined the theory, strategies for implementation are reviewed. These include joint sampling of location parameters; efficient sampling from the fully conditional posterior distribution of augmented data, a multivariate truncated normal distribution; and sampling from the conditional inverse Wishart distribution, the fully conditional posterior distribution of the residual covariance matrix. Finally, a simulated dataset was analysed to illustrate the methodology. This paper concentrates on a model where residuals associated with liabilities of the binary traits are assumed to be independent. A Bayesian analysis using Gibbs sampling is outlined for the model where this assumption is relaxed.
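
    The liability-scale augmentation for a binary trait reduces, in its simplest form, to drawing truncated normals (the Albert-Chib step). A single-mean Python sketch follows, standing in for the full multivariate model described above.

```python
# Gibbs sampling of a probit mean via truncated-normal liabilities.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
y = rng.binomial(1, 0.7, size=200)        # binary trait observations

mu, draws = 0.0, []
for it in range(2000):
    # 1. Liabilities: truncated above 0 if y=1, below 0 if y=0.
    a = np.where(y == 1, -mu, -np.inf)
    b = np.where(y == 1, np.inf, -mu)
    z = mu + stats.truncnorm.rvs(a, b, size=y.size, random_state=rng)
    # 2. Mean given liabilities (flat prior, unit residual variance).
    mu = rng.normal(z.mean(), 1 / np.sqrt(y.size))
    draws.append(mu)

# Discard burn-in; true value is Phi^-1(0.7) ~ 0.52.
print("posterior mean liability threshold mu:", np.mean(draws[500:]))
```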

  3. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    PubMed

    Hack, C Eric

    2006-04-17

    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate, and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least-squares error approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach.
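
    A Metropolis sketch calibrating a one-compartment kinetic model gives the flavor of such an MCMC analysis; the toy model stands in for a full PBTK model, a lognormal prior plays the role of prior biological knowledge, and all values are illustrative.

```python
# MCMC calibration of C(t) = C0 * exp(-k t) with a lognormal prior on k.
import numpy as np

rng = np.random.default_rng(7)
t = np.array([0.5, 1, 2, 4, 8, 24.0])
obs = 10.0 * np.exp(-0.3 * t) * np.exp(0.1 * rng.normal(size=t.size))

def log_post(k):
    if k <= 0:
        return -np.inf
    resid = np.log(obs) - np.log(10.0 * np.exp(-k * t))
    loglik = -0.5 * np.sum(resid**2) / 0.1**2
    logprior = -0.5 * (np.log(k) - np.log(0.25)) ** 2 / 0.5**2
    return loglik + logprior

k, lp, chain = 0.25, log_post(0.25), []
for it in range(5000):
    k_new = k + 0.02 * rng.normal()
    lp_new = log_post(k_new)
    if np.log(rng.random()) < lp_new - lp:   # Metropolis accept step
        k, lp = k_new, lp_new
    chain.append(k)

post = np.array(chain[1000:])                # drop burn-in
print(f"k posterior: {post.mean():.3f} +/- {post.std():.3f}")
```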

  4. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    SciTech Connect

    Keselman, Dmitry; Tompkins, George H; Leishman, Deborah A

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e. the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
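
    One standard way to handle uncertain (soft) evidence is Jeffrey's rule, which mixes the hard-evidence posteriors by the user-specified distribution; whether this matches the paper's exact technique is an assumption, and the two-node network below is a toy.

```python
# Soft evidence on a two-node network H -> E via Jeffrey's rule.
import numpy as np

p_h = np.array([0.3, 0.7])                 # P(H)
p_e_given_h = np.array([[0.9, 0.1],        # P(E | H): rows are H states
                        [0.2, 0.8]])

# Posterior of H under each hard evidence state, via Bayes' rule.
joint = p_h[:, None] * p_e_given_h         # P(H, E)
p_h_given_e = joint / joint.sum(axis=0)    # columns: P(H | E=e)

q_e = np.array([0.6, 0.4])                 # user-specified soft evidence
p_h_soft = p_h_given_e @ q_e               # Jeffrey's rule mixture
print("P(H | soft evidence):", np.round(p_h_soft, 3))
```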

  5. Joint modeling of survival and longitudinal non-survival data: current methods and issues. Report of the DIA Bayesian joint modeling working group

    PubMed Central

    Gould, A. Lawrence; Boye, Mark Ernest; Crowther, Michael J.; Ibrahim, Joseph G.; Quartey, George; Micallef, Sandrine; Bois, Frederic Y.

    2015-01-01

    Explicitly modeling underlying relationships between a survival endpoint and processes that generate longitudinal measured or reported outcomes potentially could improve the efficiency of clinical trials and provide greater insight into the various dimensions of the clinical effect of interventions included in the trials. Various strategies have been proposed for using longitudinal findings to elucidate intervention effects on clinical outcomes such as survival. The application of specifically Bayesian approaches for constructing models that address longitudinal and survival outcomes explicitly has been recently addressed in the literature. We review currently available methods for carrying out joint analyses, including issues of implementation and interpretation, identify software tools that can be used to carry out the necessary calculations, and review applications of the methodology. PMID:24634327

  6. Bayesian regression analysis of data with random effects covariates from nonlinear longitudinal measurements

    PubMed Central

    De la Cruz, Rolando; Meza, Cristian; Arribas-Gil, Ana; Carroll, Raymond J.

    2016-01-01

    Joint models for a wide class of response variables and longitudinal measurements consist of a mixed-effects model to fit longitudinal trajectories whose random effects enter as covariates in a generalized linear model for the primary response. They provide a useful way to assess association between these two kinds of data, which in clinical studies are often collected jointly on a series of individuals and may help understanding, for instance, the mechanisms of recovery of a certain disease or the efficacy of a given therapy. When a nonlinear mixed-effects model is used to fit the longitudinal trajectories, the existing estimation strategies based on likelihood approximations have been shown to exhibit some computational efficiency problems (De la Cruz et al., 2011). In this article we consider a Bayesian estimation procedure for the joint model with a nonlinear mixed-effects model for the longitudinal data and a generalized linear model for the primary response. The proposed prior structure allows for the implementation of an MCMC sampler. Moreover, we consider that the errors in the longitudinal model may be correlated. We apply our method to the analysis of hormone levels measured at the early stages of pregnancy that can be used to predict normal versus abnormal pregnancy outcomes. We also conduct a simulation study to assess the importance of modelling correlated errors and quantify the consequences of model misspecification. PMID:27274601

  7. An Operant Analysis of Joint Attention Skills

    ERIC Educational Resources Information Center

    Holth, Per

    2005-01-01

    Joint attention, a synchronizing of the attention of two or more persons, has been an increasing focus of research in cognitive developmental psychology. Research in this area has progressed mainly outside of behavior analysis, and behavior-analytic research and theory has tended to ignore the work on joint attention. It is argued here, on the one…

  8. Bayesian informative dropout model for longitudinal binary data with random effects using conditional and joint modeling approaches.

    PubMed

    Chan, Jennifer S K

    2016-05-01

    Dropouts are common in longitudinal studies. If the dropout probability depends on the missing observations at or after dropout, the dropout is called informative (or nonignorable) dropout (ID). Failure to accommodate such a dropout mechanism in the model will bias the parameter estimates. We propose a conditional autoregressive model for longitudinal binary data with an ID model such that the probabilities of positive outcomes, as well as the dropout indicator at each occasion, are logit-linear in some covariates and outcomes. This model, adopting a marginal model for outcomes and a conditional model for dropouts, is called a selection model. To allow for heterogeneity and clustering effects, the outcome model is extended to incorporate mixture and random effects. Lastly, the model is further extended to a novel model that models the outcome and dropout jointly, such that their dependency is formulated through an odds-ratio function. Parameters are estimated by a Bayesian approach implemented using the user-friendly Bayesian software WinBUGS. A methadone clinic dataset is analyzed to illustrate the proposed models. Results show that the treatment time effect is still significant but weaker after allowing for an ID process in the data. Finally, the effect of dropout on parameter estimates is evaluated through simulation studies.

  9. A Bayesian Framework for Reliability Analysis of Spacecraft Deployments

    NASA Technical Reports Server (NTRS)

    Evans, John W.; Gallo, Luis; Kaminsky, Mark

    2012-01-01

    Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a two-stage sequential Bayesian framework for reliability estimation of spacecraft deployment was developed for this purpose. This process was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the Optical Telescope Element. Initially, detailed studies of NASA deployment history, "heritage information", were conducted, extending over 45 years of spacecraft launches. This information was then coupled to a non-informative prior and a binomial likelihood function to create a posterior distribution for deployments of various subsystems using Markov chain Monte Carlo sampling. Select distributions were then coupled to a subsequent analysis, using test data and anomaly occurrences on successive ground test deployments of scale-model test articles of JWST hardware, to update the NASA heritage data. This allowed for a realistic prediction for the reliability of the complex Sunshield deployment, with credibility limits, within this two-stage Bayesian framework.
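
    The two-stage structure can be sketched with conjugate beta-binomial updates: heritage data update a non-informative prior, and the result serves as the prior for the ground-test data. The counts below are invented, not the JWST record.

```python
# Two-stage sequential beta-binomial reliability update.
from scipy import stats

# Stage 1: non-informative prior + heritage data (successes, failures).
a0, b0 = 0.5, 0.5                   # Jeffreys prior (assumed)
heritage_s, heritage_f = 180, 6
a1, b1 = a0 + heritage_s, b0 + heritage_f

# Stage 2: heritage posterior as prior + ground-test deployments.
test_s, test_f = 14, 1
a2, b2 = a1 + test_s, b1 + test_f

post = stats.beta(a2, b2)
lo, hi = post.ppf([0.05, 0.95])
print(f"deployment reliability: mean {post.mean():.3f}, "
      f"90% credible interval ({lo:.3f}, {hi:.3f})")
```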

  10. A Bayesian subgroup analysis using collections of ANOVA models.

    PubMed

    Liu, Jinzhong; Sivaganesan, Siva; Laud, Purushottam W; Müller, Peter

    2017-03-20

    We develop a Bayesian approach to subgroup analysis using ANOVA models with multiple covariates, extending an earlier work. We assume a two-arm clinical trial with normally distributed response variable. We also assume that the covariates for subgroup finding are categorical and are a priori specified, and parsimonious easy-to-interpret subgroups are preferable. We represent the subgroups of interest by a collection of models and use a model selection approach to finding subgroups with heterogeneous effects. We develop suitable priors for the model space and use an objective Bayesian approach that yields multiplicity adjusted posterior probabilities for the models. We use a structured algorithm based on the posterior probabilities of the models to determine which subgroup effects to report. Frequentist operating characteristics of the approach are evaluated using simulation. While our approach is applicable in more general cases, we mainly focus on the 2 × 2 case of two covariates each at two levels for ease of presentation. The approach is illustrated using a real data example.

  11. Bayesian analysis of U.S. hurricane climate

    USGS Publications Warehouse

    Elsner, James B.; Bossak, Brian H.

    2001-01-01

    Predictive climate distributions of U.S. landfalling hurricanes are estimated from observational records over the period 1851–2000. The approach is Bayesian, combining the reliable records of hurricane activity during the twentieth century with the less precise accounts of activity during the nineteenth century to produce a best estimate of the posterior distribution on the annual rates. The methodology provides a predictive distribution of future activity that serves as a climatological benchmark. Results are presented for the entire coast as well as for the Gulf Coast, Florida, and the East Coast. Statistics on the observed annual counts of U.S. hurricanes, both for the entire coast and by region, are similar within each of the three consecutive 50-yr periods beginning in 1851. However, evidence indicates that the records during the nineteenth century are less precise. Bayesian theory provides a rational approach for defining hurricane climate that uses all available information and that makes no assumption about whether the 150-yr record of hurricanes has been adequately or uniformly monitored. The analysis shows that the number of major hurricanes expected to reach the U.S. coast over the next 30 yr is 18 and the number of hurricanes expected to hit Florida is 20.
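
    With a gamma posterior on the annual rate, the predictive distribution of counts over a future window is negative binomial. A sketch with illustrative posterior parameters (not the paper's fitted values):

```python
# Gamma-Poisson predictive distribution for multi-year hurricane counts.
from scipy import stats

# Posterior for the annual rate: Gamma(shape=a, rate=b), e.g. from
# a = a0 + total count, b = b0 + years observed.
a, b = 255.0, 150.0                 # ~1.7 landfalls per year (assumed)

# Predictive count over the next 30 years is negative binomial.
years = 30
pred = stats.nbinom(a, b / (b + years))
print("expected 30-yr count:", pred.mean())
print("90% interval:", pred.ppf([0.05, 0.95]))
```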

  12. Developing and Testing a Bayesian Analysis of Fluorescence Lifetime Measurements

    PubMed Central

    Needleman, Daniel J.

    2017-01-01

    FRET measurements can provide dynamic spatial information on length scales smaller than the diffraction limit of light. Several methods exist to measure FRET between fluorophores, including Fluorescence Lifetime Imaging Microscopy (FLIM), which relies on the reduction of fluorescence lifetime when a fluorophore is undergoing FRET. FLIM measurements take the form of histograms of photon arrival times, containing contributions from a mixed population of fluorophores both undergoing and not undergoing FRET, with the measured distribution being a mixture of exponentials of different lifetimes. Here, we present an analysis method based on Bayesian inference that rigorously takes into account several experimental complications. We test the precision and accuracy of our analysis on controlled experimental data and verify that we can faithfully extract model parameters, both in the low-photon and low-fraction regimes. PMID:28060890
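
    Since the arrival-time distribution is a mixture of exponentials, the likelihood for the FRET fraction can be written directly. A grid-posterior sketch with assumed, known lifetimes (the paper's full model handles further experimental complications not shown here):

```python
# Grid posterior for the FRET fraction from photon arrival times.
import numpy as np

rng = np.random.default_rng(8)
tau_fret, tau_free = 1.0, 2.5              # lifetimes (ns), assumed known
f_true = 0.3                               # fraction undergoing FRET

n = 5000
is_fret = rng.random(n) < f_true
t = rng.exponential(np.where(is_fret, tau_fret, tau_free))

fracs = np.linspace(0.01, 0.99, 99)
loglik = np.array([
    np.sum(np.log(f * np.exp(-t / tau_fret) / tau_fret
                  + (1 - f) * np.exp(-t / tau_free) / tau_free))
    for f in fracs])

post = np.exp(loglik - loglik.max())
post /= post.sum()                         # flat prior on the fraction
print("posterior mean FRET fraction:", np.sum(fracs * post))
```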

  13. Risk analysis of dust explosion scenarios using Bayesian networks.

    PubMed

    Yuan, Zhi; Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2015-02-01

    In this study, a methodology has been proposed for risk analysis of dust explosion scenarios based on Bayesian network. Our methodology also benefits from a bow-tie diagram to better represent the logical relationships existing among contributing factors and consequences of dust explosions. In this study, the risks of dust explosion scenarios are evaluated, taking into account common cause failures and dependencies among root events and possible consequences. Using a diagnostic analysis, dust particle properties, oxygen concentration, and safety training of staff are identified as the most critical root events leading to dust explosions. The probability adaptation concept is also used for sequential updating and thus learning from past dust explosion accidents, which is of great importance in dynamic risk assessment and management. We also apply the proposed methodology to a case study to model dust explosion scenarios, to estimate the envisaged risks, and to identify the vulnerable parts of the system that need additional safety measures.

  14. BaTMAn: Bayesian Technique for Multi-image Analysis

    NASA Astrophysics Data System (ADS)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.

  15. Simulation and analysis of flexibly jointed manipulators

    NASA Technical Reports Server (NTRS)

    Murphy, Steve H.; Wen, John T.; Saridis, George M.

    1990-01-01

    Modeling, simulation, and analysis of robot manipulators with non-negligible joint flexibility are studied. A recursive Newton-Euler model of the flexibly jointed manipulator is developed with many advantages over the traditional Lagrange-Euler methods. The Newton-Euler approach leads to a method for the simulation of a flexibly jointed manipulator in which the number of computations grows linearly with the number of links. Additionally, any function for the flexibility between the motor and link may be used permitting the simulation of nonlinear effects, such as backlash, in a uniform manner for all joints. An analysis of the control problems for flexibly jointed manipulators is presented by converting the Newton-Euler model to a Lagrange-Euler form. The detailed structure available in the model is used to examine linearizing controllers and shows the dependency of the control on the choice of flexible model and structure of the manipulator.

  16. Structural analysis of Aircraft fuselage splice joint

    NASA Astrophysics Data System (ADS)

    Udaya Prakash, R.; Kumar, G. Raj; Vijayanandh, R.; Senthil Kumar, M.; Ramganesh, T.

    2016-09-01

    In the aviation sector, composite materials and their application to each component are prime considerations due to their high strength-to-weight ratio, design flexibility and corrosion resistance, so composite materials are widely used in low-weight constructions and can be treated as a suitable alternative to metals. The objective of this paper is to estimate and compare the suitability of a composite skin joint in an aircraft fuselage with different joints by simulating the displacement, normal stress, von Mises stress and shear stress with the help of numerical solution methods. The reference Z-stringer component of this paper is modeled in CATIA, and numerical simulation is carried out in ANSYS for the splice joint present in the aircraft fuselage with three combinations of joints: riveted, bonded and hybrid. Stringers are used to prevent buckling of the fuselage skin; they are joined together by rivets and connected end to end by a splice joint. Design and static analysis of three-dimensional models of the bonded, riveted and hybrid joints are carried out and the results compared.

  17. BASE-9: Bayesian Analysis for Stellar Evolution with nine variables

    NASA Astrophysics Data System (ADS)

    Robinson, Elliot; von Hippel, Ted; Stein, Nathan; Stenning, David; Wagner-Kaiser, Rachel; Si, Shijing; van Dyk, David

    2016-08-01

    The BASE-9 (Bayesian Analysis for Stellar Evolution with nine variables) software suite recovers star cluster and stellar parameters from photometry and is useful for analyzing single-age, single-metallicity star clusters, binaries, or single stars, and for simulating such systems. BASE-9 uses a Markov chain Monte Carlo (MCMC) technique along with brute force numerical integration to estimate the posterior probability distribution for the age, metallicity, helium abundance, distance modulus, line-of-sight absorption, and parameters of the initial-final mass relation (IFMR) for a cluster, and for the primary mass, secondary mass (if a binary), and cluster probability for every potential cluster member. The MCMC technique is used for the cluster quantities (the first six items listed above) and numerical integration is used for the stellar quantities (the last three items in the above list).

  18. Objective Bayesian Comparison of Constrained Analysis of Variance Models.

    PubMed

    Consonni, Guido; Paroli, Roberta

    2016-10-04

    In the social sciences we are often interested in comparing models specified by parametric equality or inequality constraints. For instance, when examining three group means μ1, μ2, μ3 through an analysis of variance (ANOVA), a model may specify that μ1 = μ2 = μ3, while another one may state that μ1 < μ2 < μ3, and finally a third model may instead suggest that all means are unrestricted. This is a challenging problem, because it involves a combination of nonnested models, as well as nested models having the same dimension. We adopt an objective Bayesian approach, requiring no prior specification from the user, and derive the posterior probability of each model under consideration. Our method is based on the intrinsic prior methodology, suitably modified to accommodate equality and inequality constraints. Focussing on normal ANOVA models, a comparative assessment is carried out through simulation studies. We also present an application to real data collected in a psychological experiment.

  19. Bayesian Analysis of Peak Ground Acceleration Attenuation Relationship

    SciTech Connect

    Mu Heqing; Yuen Kaveng

    2010-05-21

    Estimation of peak ground acceleration is one of the main issues in civil and earthquake engineering practice. The Boore-Joyner-Fumal empirical formula is well known for this purpose. In this paper we propose to use the Bayesian probabilistic model class selection approach to obtain the most suitable prediction model class for the seismic attenuation formula. The optimal model class is robust in the sense that it has balance between the data fitting capability and the sensitivity to noise. A database of strong-motion records is utilized for the analysis. It turns out that the optimal model class is simpler than the full order attenuation model suggested by Boore, Joyner and Fumal (1993).

  20. Bayesian Library for the Analysis of Neutron Diffraction Data

    NASA Astrophysics Data System (ADS)

    Ratcliff, William; Lesniewski, Joseph; Quintana, Dylan

    During this talk, I will introduce the Bayesian Library for the Analysis of Neutron Diffraction Data. In this library we use the DREAM algorithm to effectively sample parameter space. This offers several advantages over traditional least-squares fitting approaches: it gives us more robust estimates of the fitting parameters, their errors, and their correlations; it is also more stable than least-squares methods and provides more confidence in finding a global minimum. I will discuss the algorithm and its application to several materials, showing applications to both structural and magnetic diffraction patterns and presenting examples of fitting both powder and single-crystal data. We would like to acknowledge support from the Department of Commerce and the NSF.

  1. Discrete Dynamic Bayesian Network Analysis of fMRI Data

    PubMed Central

    Burge, John; Lane, Terran; Link, Hamilton; Qiu, Shibin; Clark, Vincent P.

    2010-01-01

    We examine the efficacy of using discrete Dynamic Bayesian Networks (dDBNs), a data-driven modeling technique employed in machine learning, to identify functional correlations among neuroanatomical regions of interest. Unlike many neuroimaging analysis techniques, this method is not limited by linear and/or Gaussian noise assumptions. It achieves this by modeling the time series of neuroanatomical regions as discrete, as opposed to continuous, random variables with multinomial distributions. We demonstrated this method using an fMRI dataset collected from healthy and demented elderly subjects and identify correlates based on a diagnosis of dementia. The results are validated in three ways. First, the elicited correlates are shown to be robust over leave-one-out cross-validation and, via a Fourier bootstrapping method, that they were not likely due to random chance. Second, the dDBNs identified correlates that would be expected given the experimental paradigm. Third, the dDBN's ability to predict dementia is competitive with two commonly employed machine-learning classifiers: the support vector machine and the Gaussian naïve Bayesian network. We also verify that the dDBN selects correlates based on non-linear criteria. Finally, we provide a brief analysis of the correlates elicited from Buckner et al.'s data that suggests that demented elderly subjects have reduced involvement of entorhinal and occipital cortex and greater involvement of the parietal lobe and amygdala in brain activity compared with healthy elderly (as measured via functional correlations among BOLD measurements). Limitations and extensions to the dDBN method are discussed. PMID:17990301

  2. A Bayesian Seismic Hazard Analysis for the city of Naples

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years many studies have focused on the determination and definition of the seismic, volcanic and tsunamigenic hazard in the city of Naples. The reason is that the town of Naples with its neighboring area is one of the most densely populated places in Italy. In addition, the risk is increased by the type and condition of buildings and monuments in the city. It is therefore crucial to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present). Each seismic event has been associated with an active tectonic structure. Furthermore, a set of active faults around the study area, well known from geological investigations but not associated with any recorded earthquake, that could shake the city has been taken into account in our studies. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure to include the information from past earthquakes in the probabilistic seismic hazard analysis. This strategy allows us, on the one hand, to enlarge the information used in the hazard evaluation, from alternative models for the earthquake generation process to past shaking, and on the other hand, to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times (50, 10 and 5 years) on a dense grid that covers the municipality of Naples, considering bedrock soil…

  3. An Exploratory Study Examining the Feasibility of Using Bayesian Networks to Predict Circuit Analysis Understanding

    ERIC Educational Resources Information Center

    Chung, Gregory K. W. K.; Dionne, Gary B.; Kaiser, William J.

    2006-01-01

    Our research question was whether we could develop a feasible technique, using Bayesian networks, to diagnose gaps in student knowledge. Thirty-four college-age participants completed tasks designed to measure conceptual knowledge, procedural knowledge, and problem-solving skills related to circuit analysis. A Bayesian network was used to model…

  4. Bayesian Analysis of Evolutionary Divergence with Genomic Data Under Diverse Demographic Models.

    PubMed

    Chung, Yujin; Hey, Jody

    2017-02-25

    We present a new Bayesian method for estimating demographic and phylogenetic history using population genomic data. Several key innovations are introduced that allow the study of diverse models within an Isolation with Migration framework. The new method implements a 2-step analysis, with an initial Markov chain Monte Carlo (MCMC) phase that samples simple coalescent trees, followed by the calculation of the joint posterior density for the parameters of a demographic model. In step 1, the MCMC sampling phase, the method uses a reduced state space, consisting of coalescent trees without migration paths, and a simple importance sampling distribution without the demography of interest. Once obtained, a single sample of trees can be used in step 2 to calculate the joint posterior density for model parameters under multiple diverse demographic models, without having to repeat MCMC runs. Because migration paths are not included in the state space of the MCMC phase, but rather are handled by analytic integration in step 2 of the analysis, the method is scalable to a large number of loci with excellent MCMC mixing properties. With an implementation of the new method in the computer program MIST, we demonstrate the method's accuracy, scalability and other advantages using simulated data and DNA sequences of two common chimpanzee subspecies: Pan troglodytes troglodytes (P. t.) and P. t. verus.

  5. Bayesian quantile regression for nonlinear mixed-effects joint models for longitudinal data in the presence of mismeasured covariate errors.

    PubMed

    Huang, Yangxin; Chen, Jiaqing; Qiu, Huahai

    2016-12-09

    Quantile regression (QR) models have recently received increasing attention in longitudinal studies where measurements of the same individuals are taken repeatedly over time. When continuous (longitudinal) responses follow a distribution that is quite different from a normal distribution, usual mean regression (MR)-based linear models may fail to produce efficient estimators, whereas QR-based linear models may perform satisfactorily. To the best of our knowledge, there have been very few studies on QR-based nonlinear models for longitudinal data in comparison to MR-based nonlinear models. In this article, we study QR-based nonlinear mixed-effects (NLME) joint models for longitudinal data with non-central location and outliers and/or heavy tails in response, and non-normality and measurement errors in covariate under Bayesian framework. The proposed QR-based modeling method is compared with an MR-based one by an AIDS clinical dataset and through simulation studies. The proposed QR joint modeling approach can be not only applied to AIDS clinical studies, but also may have general applications in other fields as long as relevant technical specifications are met.
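
    The Bayesian QR building block is the asymmetric Laplace likelihood, whose mode reproduces check-loss quantile regression. A linear-model sketch follows, standing in for the NLME joint model; all values are illustrative.

```python
# Quantile regression via the asymmetric Laplace log-likelihood.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(9)
x = rng.uniform(0.0, 1.0, 300)
y = 1.0 + 2.0 * x + (0.5 + x) * rng.normal(size=300)   # heteroscedastic

def ald_loglik(beta, tau, sigma=1.0):
    """Asymmetric Laplace log-likelihood at quantile level tau."""
    u = y - (beta[0] + beta[1] * x)
    rho = u * (tau - (u < 0))                          # check loss
    return x.size * np.log(tau * (1.0 - tau) / sigma) - rho.sum() / sigma

# Posterior mode under a flat prior = check-loss quantile regression fit.
fit = minimize(lambda b: -ald_loglik(b, tau=0.9), x0=[0.0, 1.0],
               method="Nelder-Mead")
print("0.9-quantile line: intercept %.2f, slope %.2f" % tuple(fit.x))
```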

  6. RECONSTRUCTING EXPOSURE SCENARIOS USING DOSE BIOMARKERS - AN APPLICATION OF BAYESIAN UNCERTAINTY ANALYSIS

    EPA Science Inventory

    We use Bayesian uncertainty analysis to explore how to estimate pollutant exposures from biomarker concentrations. The growing number of national databases with exposure data makes such an analysis possible. They contain datasets of pharmacokinetic biomarkers for many pollutants…

  7. A Bayesian Analysis of Finite Mixtures in the LISREL Model.

    ERIC Educational Resources Information Center

    Zhu, Hong-Tu; Lee, Sik-Yum

    2001-01-01

    Proposes a Bayesian framework for estimating finite mixtures of the LISREL model. The model augments the observed data of the manifest variables with the latent variables and allocation variables and uses the Gibbs sampler to obtain the Bayesian solution. Discusses other associated statistical inferences. (SLD)

  8. UNSUPERVISED TRANSIENT LIGHT CURVE ANALYSIS VIA HIERARCHICAL BAYESIAN INFERENCE

    SciTech Connect

    Sanders, N. E.; Soderberg, A. M.; Betancourt, M.

    2015-02-10

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST.

  9. Structural dynamic analysis of a ball joint

    NASA Astrophysics Data System (ADS)

    Hwang, Seok-Cheol; Lee, Kwon-Hee

    2012-11-01

    A ball joint is a rotating and swiveling element that is typically installed at the interface between two parts. In an automobile, the ball joint is the component that connects the control arms to the steering knuckle. The ball joint can also be installed in linkage systems for motion-control applications. This paper describes a simulation strategy for ball joint analysis that takes the manufacturing process into account. The manufacturing process can be divided into plugging and spinning. The response of interest is the stress distribution generated between the ball and its bearing. In this paper, the commercial code NX DAFUL, using an implicit integration method, is employed to calculate the response. In addition, a gap analysis is performed to investigate the fit, focusing on the displacement response of the ball stud. The optimum design is then suggested through case studies.

  10. Joint Bayesian variable and graph selection for regression models with network-structured predictors.

    PubMed

    Peterson, Christine B; Stingo, Francesco C; Vannucci, Marina

    2016-03-30

    In this work, we develop a Bayesian approach to perform selection of predictors that are linked within a network. We achieve this by combining a sparse regression model relating the predictors to a response variable with a graphical model describing conditional dependencies among the predictors. The proposed method is well-suited for genomic applications because it allows the identification of pathways of functionally related genes or proteins that impact an outcome of interest. In contrast to previous approaches for network-guided variable selection, we infer the network among predictors using a Gaussian graphical model and do not assume that network information is available a priori. We demonstrate that our method outperforms existing methods in identifying network-structured predictors in simulation settings and illustrate our proposed model with an application to inference of proteins relevant to glioblastoma survival.

  11. A Bayesian Analysis of the Cepheid Distance Scale

    NASA Astrophysics Data System (ADS)

    Barnes, Thomas G., III; Jefferys, W. H.; Berger, J. O.; Mueller, Peter J.; Orr, K.; Rodriguez, R.

    2003-07-01

    We develop and describe a Bayesian statistical analysis to solve the surface brightness equations for Cepheid distances and stellar properties. Our analysis provides a mathematically rigorous and objective solution to the problem, including immunity from Lutz-Kelker bias. We discuss the choice of priors, show the construction of the likelihood distribution, and give sampling algorithms in a Markov chain Monte Carlo approach for efficiently and completely sampling the posterior probability distribution. Our analysis averages over the probabilities associated with several models rather than attempting to pick the "best model" from several possible models. Using a sample of 13 Cepheids we demonstrate the method. We discuss diagnostics of the analysis and the effects of the astrophysical choices going into the model. We show that we can objectively model the order of Fourier polynomial fits to the light and velocity data. By comparison with theoretical models of Bono et al. we find that EU Tau and SZ Tau are overtone pulsators, most likely without convective overshoot. The period-radius and period-luminosity relations we obtain are shown to be compatible with those in the recent literature. Specifically, we find log(R) = (0.693 ± 0.037)[log(P) - 1.2] + (2.042 ± 0.047) and <M_V> = -(2.690 ± 0.169)[log(P) - 1.2] - (4.699 ± 0.216).

  12. Bayesian survival analysis in clinical trials: What methods are used in practice?

    PubMed

    Brard, Caroline; Le Teuff, Gwénaël; Le Deley, Marie-Cécile; Hampson, Lisa V

    2017-02-01

    Background: Bayesian statistics are an appealing alternative to the traditional frequentist approach to designing, analysing, and reporting of clinical trials, especially in rare diseases. Time-to-event endpoints are widely used in many medical fields. There are additional complexities to designing Bayesian survival trials which arise from the need to specify a model for the survival distribution. The objective of this article was to critically review the use and reporting of Bayesian methods in survival trials. Methods: A systematic review of clinical trials using Bayesian survival analyses was performed through the PubMed and Web of Science databases. This was complemented by a full-text search of the online repositories of pre-selected journals. Cost-effectiveness, dose-finding studies, meta-analyses, and methodological papers using clinical trials were excluded. Results: In total, 28 articles met the inclusion criteria; 25 were original reports of clinical trials and 3 were re-analyses of a clinical trial. Most trials were in oncology (n = 25), were randomised controlled (n = 21) phase III trials (n = 13), and half considered a rare disease (n = 13). Bayesian approaches were used for monitoring in 14 trials and for the final analysis only in 14 trials. In the latter case, Bayesian survival analyses were used for the primary analysis in four cases, for the secondary analysis in seven cases, and for the trial re-analysis in three cases. Overall, 12 articles reported fitting Bayesian regression models (semi-parametric, n = 3; parametric, n = 9). Prior distributions were often incompletely reported: 20 articles did not define the prior distribution used for the parameter of interest. Over half of the trials used only non-informative priors for monitoring and the final analysis (n = 12) when it was specified. Indeed, no articles fitting Bayesian regression models placed informative priors on the parameter of interest. The prior for the treatment…

  13. BEAST 2: A Software Platform for Bayesian Evolutionary Analysis

    PubMed Central

    Bouckaert, Remco; Heled, Joseph; Kühnert, Denise; Vaughan, Tim; Wu, Chieh-Hsi; Xie, Dong; Suchard, Marc A.; Rambaut, Andrew; Drummond, Alexei J.

    2014-01-01

    We present a new open source, extensible and flexible software platform for Bayesian evolutionary analysis called BEAST 2. This software platform is a re-design of the popular BEAST 1 platform to correct structural deficiencies that became evident as the BEAST 1 software evolved. Key among those deficiencies was the lack of post-deployment extensibility. BEAST 2 now has a fully developed package management system that allows third-party developers to write additional functionality that can be directly installed to the BEAST 2 analysis platform via a package manager without requiring a new software release of the platform. This package architecture is showcased with a number of recently published new models encompassing birth-death-sampling tree priors, phylodynamics and model averaging for substitution models and site partitioning. A second major improvement is the ability to read/write the entire state of the MCMC chain to/from disk, allowing it to be easily shared between multiple instances of the BEAST software. This facilitates checkpointing and better support for multi-processor and high-end computing extensions. Finally, the functionality in new packages can be easily added to the user interface (BEAUti 2) by a simple XML template-based mechanism because BEAST 2 has been re-designed to provide greater integration between the analysis engine and the user interface so that, for example, BEAST and BEAUti use exactly the same XML file format. PMID:24722319

  14. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  15. Bayesian Model Selection with Network Based Diffusion Analysis

    PubMed Central

    Whalen, Andrew; Hoppitt, William J. E.

    2016-01-01

    A number of recent studies have used Network Based Diffusion Analysis (NBDA) to detect the role of social transmission in the spread of a novel behavior through a population. In this paper we present a unified framework for performing NBDA in a Bayesian setting, and demonstrate how the Watanabe-Akaike Information Criterion (WAIC) can be used for model selection. We present a specific example of applying this method to Time to Acquisition Diffusion Analysis (TADA). To examine the robustness of this technique, we performed a large-scale simulation study and found that NBDA using WAIC could recover the correct model of social transmission in a wide range of cases, including in the presence of random effects, individual-level variables, and alternative models of social transmission. This work suggests that NBDA is an effective and widely applicable tool for uncovering whether social transmission underpins the spread of a novel behavior, and may still provide accurate results even when key model assumptions are relaxed. PMID:27092089
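
    For reference, WAIC is straightforward to compute from the pointwise log-likelihood evaluated over posterior draws; the sketch below (Python; the log-likelihood matrices and the two competing models are illustrative assumptions, not the authors' code) shows the standard calculation.

        import numpy as np
        from scipy.special import logsumexp

        def waic(loglik):
            """WAIC from an (S posterior draws x N observations) log-likelihood matrix."""
            S = loglik.shape[0]
            # log pointwise predictive density: log of the posterior-mean likelihood
            lppd = np.sum(logsumexp(loglik, axis=0) - np.log(S))
            # effective number of parameters: posterior variance of the log-likelihood
            p_waic = np.sum(np.var(loglik, axis=0, ddof=1))
            return -2.0 * (lppd - p_waic)  # deviance scale; lower is better

        # e.g., compare waic(loglik_social) against waic(loglik_asocial),
        # each matrix computed from that model's MCMC output.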

  16. Bayesian network models in brain functional connectivity analysis

    PubMed Central

    Zhang, Sheng; Li, Chiang-shan R.

    2013-01-01

    Much effort has been made to better understand the complex integration of distinct parts of the human brain using functional magnetic resonance imaging (fMRI). Altered functional connectivity between brain regions is associated with many neurological and mental illnesses, such as Alzheimer's and Parkinson's diseases, addiction, and depression. In computational science, Bayesian networks (BN) have been used in a broad range of studies to model complex data sets in the presence of uncertainty and when expert prior knowledge is needed. However, little has been done to explore the use of BN in connectivity analysis of fMRI data. In this paper, we present an up-to-date literature review and methodological details of connectivity analyses using BN, while highlighting caveats in a real-world application. We present a BN model of an fMRI dataset obtained from sixty healthy subjects performing the stop-signal task (SST), a paradigm widely used to investigate response inhibition. Connectivity results are validated with the extant literature including our previous studies. By exploring the link strengths of the learned BNs and correlating them to behavioral performance measures, this novel use of BN in connectivity analysis provides new insights into the functional neural pathways underlying response inhibition. PMID:24319317

  17. Using Bayesian analysis in repeated preclinical in vivo studies for a more effective use of animals.

    PubMed

    Walley, Rosalind; Sherington, John; Rastrick, Joe; Detrait, Eric; Hanon, Etienne; Watt, Gillian

    2016-05-01

    Whilst innovative Bayesian approaches are increasingly used in clinical studies, in the preclinical area Bayesian methods appear to be rarely used in the reporting of pharmacology data. This is particularly surprising in the context of regularly repeated in vivo studies where there is a considerable amount of data from historical control groups, which has potential value. This paper describes our experience with introducing Bayesian analysis for such studies using a Bayesian meta-analytic predictive approach. This leads naturally either to an informative prior for a control group as part of a full Bayesian analysis of the next study or to a predictive distribution that replaces a control group entirely. We use quality control charts to illustrate study-to-study variation to the scientists and describe informative priors in terms of their approximate effective numbers of animals. We describe two case studies of animal models: the lipopolysaccharide-induced cytokine release model used in inflammation and the novel object recognition model used to screen cognitive enhancers, both of which show the advantage of a Bayesian approach over the standard frequentist analysis. We conclude that using Bayesian methods in stable repeated in vivo studies can result in a more effective use of animals, either by reducing the total number of animals used or by increasing the precision of key treatment differences. This will lead to clearer results and supports the "3Rs initiative" to Refine, Reduce and Replace animals in research. Copyright © 2016 John Wiley & Sons, Ltd.

  18. Guidance on the implementation and reporting of a drug safety Bayesian network meta-analysis.

    PubMed

    Ohlssen, David; Price, Karen L; Xia, H Amy; Hong, Hwanhee; Kerman, Jouni; Fu, Haoda; Quartey, George; Heilmann, Cory R; Ma, Haijun; Carlin, Bradley P

    2014-01-01

    The Drug Information Association Bayesian Scientific Working Group (BSWG) was formed in 2011 with a vision to ensure that Bayesian methods are well understood and broadly utilized for design and analysis throughout the medical product development process, and to improve industrial, regulatory, and economic decision making. The group, composed of individuals from academia, industry, and regulatory agencies, has as its mission to facilitate the appropriate use and contribute to the progress of Bayesian methodology. In this paper, the safety sub-team of the BSWG explores the use of Bayesian methods when applied to drug safety meta-analysis and network meta-analysis. Guidance is presented on the conduct and reporting of such analyses. We also discuss different structural model assumptions and provide discussion on prior specification. The work is illustrated through a case study involving a network meta-analysis related to the cardiovascular safety of non-steroidal anti-inflammatory drugs.

  19. Nuclear stockpile stewardship and Bayesian image analysis (DARHT and the BIE)

    SciTech Connect

    Carroll, James L

    2011-01-11

    Since the end of nuclear testing, the reliability of our nation's nuclear weapon stockpile has been assessed using sub-critical hydrodynamic testing. These tests involve some 'extreme' radiography. We discuss the challenges involved and the solutions provided by DARHT (the world's premier hydrodynamic testing facility) and the BIE, or Bayesian Inference Engine (a powerful radiography analysis software tool), and describe the application of Bayesian image analysis techniques to this important and difficult problem.

  20. Interacting Agricultural Pests and Their Effect on Crop Yield: Application of a Bayesian Decision Theory Approach to the Joint Management of Bromus tectorum and Cephus cinctus

    PubMed Central

    Keren, Ilai N.; Menalled, Fabian D.; Weaver, David K.; Robison-Cox, James F.

    2015-01-01

    Worldwide, the landscape homogeneity of extensive monocultures that characterizes conventional agriculture has resulted in the development of specialized and interacting multitrophic pest complexes. While integrated pest management emphasizes the need to consider the ecological context where multiple species coexist, management recommendations are often based on single-species tactics. This approach may not provide satisfactory solutions when confronted with the complex interactions occurring between organisms at the same or different trophic levels. Replacement of the single-species management model with more sophisticated, multi-species programs requires an understanding of the direct and indirect interactions occurring between the crop and all categories of pests. We evaluated a modeling framework to make multi-pest management decisions taking into account direct and indirect interactions among species belonging to different trophic levels. We adopted a Bayesian decision theory approach in combination with path analysis to evaluate interactions between Bromus tectorum (downy brome, cheatgrass) and Cephus cinctus (wheat stem sawfly) in wheat (Triticum aestivum) systems. We assessed their joint responses to weed management tactics, seeding rates, and cultivar tolerance to insect stem boring or competition. Our results indicated that C. cinctus oviposition behavior varied as a function of B. tectorum pressure. Crop responses were more readily explained by the joint effects of management tactics on both categories of pests and their interactions than just by the direct impact of any particular management scheme on yield. Accordingly, a C. cinctus-tolerant variety should be planted at a low seeding rate under high insect pressure. However, as B. tectorum levels increase, the C. cinctus-tolerant variety should be replaced by a competitive and drought-tolerant cultivar at high seeding rates despite C. cinctus infestation. This study exemplifies the necessity of

  1. Bayesian analysis of genetic interactions in case-control studies, with application to adiponectin genes and colorectal cancer risk.

    PubMed

    Yi, Nengjun; Kaklamani, Virginia G; Pasche, Boris

    2011-01-01

    Complex diseases such as cancers are influenced by interacting networks of genetic and environmental factors. However, a joint analysis of multiple genes and environmental factors is challenging, owing to potentially large numbers of correlated and complex variables. We describe Bayesian generalized linear models for simultaneously analyzing covariates, main effects of numerous loci, gene-gene and gene-environment interactions in population case-control studies. Our Bayesian models use Student-t prior distributions with different shrinkage parameters for different types of effects, allowing reliable estimates of main effects and interactions and hence increasing the power for detection of real signals. We implement a fast and stable algorithm for fitting models by extending available tools for classical generalized linear models to the Bayesian case. We propose a novel method to interpret and visualize models with multiple interactions by computing the average predictive probability. Simulations show that the method has the potential to dissect interacting networks of complex diseases. Application of the method to a large case-control study of adiponectin genes and colorectal cancer risk highlights the previous results and detects new epistatic interactions and sex-specific effects that warrant follow-up in independent studies.

  2. Bayesian semiparametric nonlinear mixed-effects joint models for data with skewness, missing responses, and measurement errors in covariates.

    PubMed

    Huang, Yangxin; Dagne, Getachew

    2012-09-01

    It is a common practice to analyze complex longitudinal data using semiparametric nonlinear mixed-effects (SNLME) models with a normal distribution. However, the assumption of normally distributed model errors may be unrealistic and obscure important features of subject variations. To partially explain between- and within-subject variations, covariates are usually introduced in such models, but some covariates may be measured with substantial errors. Moreover, the responses may be missing, and the missingness may be nonignorable. Inferential procedures can be complicated dramatically when data with skewness, missing values, and measurement error are observed. In the literature, there has been considerable interest in accommodating either skewness, incompleteness or covariate measurement error in such models, but there has been relatively little study concerning all three features simultaneously. In this article, our objective is to address the simultaneous impact of skewness, missingness, and covariate measurement error by jointly modeling the response and covariate processes based on a flexible Bayesian SNLME model. The method is illustrated using a real AIDS data set to compare potential models with various scenarios and different distribution specifications.

  3. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    SciTech Connect

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it (an improved and generalized version of Bayesian Blocks) that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research, all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
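
    The Bayesian Blocks algorithm described here has community implementations; for example, astropy provides one. A minimal usage sketch follows (Python; the step-signal data are simulated placeholders).

        import numpy as np
        from astropy.stats import bayesian_blocks

        rng = np.random.default_rng(42)
        t = np.sort(rng.uniform(0.0, 100.0, 500))                    # observation times
        x = np.where(t < 60, 1.0, 3.0) + rng.normal(0, 0.5, t.size)  # noisy step signal
        sigma = 0.5 * np.ones_like(t)                                # known errors

        # fitness='measures' handles point measurements with known errors;
        # fitness='events' would be used for time-tagged (photon arrival) data.
        edges = bayesian_blocks(t, x, sigma, fitness='measures')
        print(edges)  # optimal block edges of the segmentation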

  4. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it (an improved and generalized version of Bayesian Blocks [Scargle 1998]) that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. 2008], all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.

  5. A Bayesian model for the analysis of transgenerational epigenetic variation.

    PubMed

    Varona, Luis; Munilla, Sebastián; Mouresan, Elena Flavia; González-Rodríguez, Aldemar; Moreno, Carlos; Altarriba, Juan

    2015-01-23

    Epigenetics has become one of the major areas of biological research. However, the degree of phenotypic variability that is explained by epigenetic processes remains unclear. From a quantitative genetics perspective, the estimation of variance components is achieved by means of the information provided by the resemblance between relatives. In a previous study, this resemblance was described as a function of the epigenetic variance component and a reset coefficient that indicates the rate of dissipation of epigenetic marks across generations. Given these assumptions, we propose a Bayesian mixed model methodology that allows the estimation of epigenetic variance from a genealogical and phenotypic database. The methodology is based on the development of a matrix T of epigenetic relationships that depends on the reset coefficient. In addition, we present a simple procedure for the calculation of the inverse of this matrix (T^-1) and a Gibbs sampler algorithm that obtains posterior estimates of all the unknowns in the model. The new procedure was used with two simulated data sets and with a beef cattle database. In the simulated populations, the results of the analysis provided marginal posterior distributions that included the population parameters in the regions of highest posterior density. In the case of the beef cattle dataset, the posterior estimate of transgenerational epigenetic variability was very low and a model comparison test indicated that a model that did not include it was the most plausible.

  6. Cepheid light curve demography via Bayesian functional data analysis

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas J.; Hendry, Martin; Kowal, Daniel; Ruppert, David

    2016-01-01

    Synoptic time-domain surveys provide astronomers, not simply more data, but a different kind of data: large ensembles of multivariate, irregularly and asynchronously sampled light curves. We describe a statistical framework for light curve demography—optimal accumulation and extraction of information, not only along individual light curves as conventional methods do, but also across large ensembles of related light curves. We build the framework using tools from functional data analysis (FDA), a rapidly growing area of statistics that addresses inference from datasets that sample ensembles of related functions. Our Bayesian FDA framework builds hierarchical models that describe light curve ensembles using multiple levels of randomness: upper levels describe the source population, and lower levels describe the observation process, including measurement errors and selection effects. Roughly speaking, a particular object's light curve is modeled as the sum of a parameterized template component (modeling population-averaged behavior) and a peculiar component (modeling variability across the population), subsequently subjected to an observation model. A functional shrinkage adjustment to individual light curves emerges—an adaptive, functional generalization of the kind of adjustments made for Eddington or Malmquist bias in single-epoch photometric surveys. We describe ongoing work applying the framework to improved estimation of Cepheid variable star luminosities via FDA-based refinement and generalization of the Cepheid period-luminosity relation.

  7. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives

    PubMed Central

    Green, Adam W.; Bailey, Larissa L.

    2015-01-01

    Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools) using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies. PMID:26658734

  8. Light curve demography via Bayesian functional data analysis

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas; Budavari, Tamas; Hendry, Martin A.; Kowal, Daniel; Ruppert, David

    2015-08-01

    Synoptic time-domain surveys provide astronomers, not simply more data, but a different kind of data: large ensembles of multivariate, irregularly and asynchronously sampled light curves. We describe a statistical framework for light curve demography—optimal accumulation and extraction of information, not only along individual light curves as conventional methods do, but also across large ensembles of related light curves. We build the framework using tools from functional data analysis (FDA), a rapidly growing area of statistics that addresses inference from datasets that sample ensembles of related functions. Our Bayesian FDA framework builds hierarchical models that describe light curve ensembles using multiple levels of randomness: upper levels describe the source population, and lower levels describe the observation process, including measurement errors and selection effects. Schematically, a particular object's light curve is modeled as the sum of a parameterized template component (modeling population-averaged behavior) and a peculiar component (modeling variability across the population), subsequently subjected to an observation model. A functional shrinkage adjustment to individual light curves emerges—an adaptive, functional generalization of the kind of adjustments made for Eddington or Malmquist bias in single-epoch photometric surveys. We are applying the framework to a variety of problems in synoptic time-domain survey astronomy, including optimal detection of weak sources in multi-epoch data, and improved estimation of Cepheid variable star luminosities from detailed demographic modeling of ensembles of Cepheid light curves.

  9. Bayesian Angular Power Spectrum Analysis of Interferometric Data

    NASA Astrophysics Data System (ADS)

    Sutter, P. M.; Wandelt, Benjamin D.; Malu, Siddarth S.

    2012-09-01

    We present a Bayesian angular power spectrum and signal map inference engine which can be adapted to interferometric observations of anisotropies in the cosmic microwave background (CMB), 21 cm emission line mapping of galactic brightness fluctuations, or 21 cm absorption line mapping of neutral hydrogen in the dark ages. The method uses Gibbs sampling to generate a sampled representation of the angular power spectrum posterior and the posterior of signal maps given a set of measured visibilities in the uv-plane. We use a mock interferometric CMB observation to demonstrate the validity of this method in the flat-sky approximation when adapted to take into account arbitrary coverage of the uv-plane, mode-mode correlations due to observations on a finite patch, and heteroscedastic visibility errors. The computational requirements scale as O(n_p log n_p), where n_p measures the ratio of the size of the detector array to the inter-detector spacing, meaning that Gibbs sampling is a promising technique for meeting the data analysis requirements of future cosmology missions.

  10. Spatial Hierarchical Bayesian Analysis of the Historical Extreme Streamflow

    NASA Astrophysics Data System (ADS)

    Najafi, M. R.; Moradkhani, H.

    2012-04-01

    Analysis of the climate change impact on extreme hydro-climatic events is crucial for future hydrologic/hydraulic designs and water resources decision making. The purpose of this study is to investigate changes of the extreme value distribution parameters with respect to time, reflecting the impact of climate change. We develop a statistical model using the observed streamflow data of the Columbia River Basin in the USA to estimate the changes of high flows as a function of time as well as other variables. A Generalized Pareto Distribution (GPD) is used to model flows exceeding the 95th percentile during December through March for 31 gauge stations. In the process layer of the model, the covariates including time, latitude, longitude, elevation and basin area are considered to assess the sensitivity of the model to each variable. A Markov chain Monte Carlo (MCMC) method is used to estimate the parameters. The spatial hierarchical Bayesian technique models the GPD parameters spatially and borrows strength from other locations by pooling data together, while providing an explicit estimation of the uncertainties in all stages of modeling.
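
    A stripped-down, single-site sketch of the core inference step, Bayesian estimation of GPD parameters for threshold exceedances via random-walk Metropolis, follows (Python; the prior, proposal scales, and synthetic data are illustrative assumptions, and the study's spatial hierarchy across stations is omitted).

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(0)
        excess = genpareto.rvs(0.1, scale=2.0, size=200, random_state=rng)  # exceedances

        def log_post(xi, log_sigma):
            lp = -0.5 * (xi / 0.5) ** 2   # weakly informative N(0, 0.5^2) prior on shape
            ll = genpareto.logpdf(excess, xi, scale=np.exp(log_sigma)).sum()
            return lp + ll if np.isfinite(ll) else -np.inf

        xi, ls, draws = 0.0, 0.0, []
        for _ in range(5000):             # random-walk Metropolis updates
            xi_p, ls_p = xi + rng.normal(0, 0.05), ls + rng.normal(0, 0.05)
            if np.log(rng.uniform()) < log_post(xi_p, ls_p) - log_post(xi, ls):
                xi, ls = xi_p, ls_p
            draws.append((xi, np.exp(ls)))
        print(np.mean(draws[1000:], axis=0))  # posterior means of (shape, scale)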

  11. A Bayesian Analysis of Regularised Source Inversions in Gravitational Lensing

    SciTech Connect

    Suyu, Sherry H.; Marshall, P.J.; Hobson, M.P.; Blandford, R.D.

    2006-01-25

    Strong gravitational lens systems with extended sources are of special interest because they provide additional constraints on the models of the lens systems. To use a gravitational lens system for measuring the Hubble constant, one would need to determine the lens potential and the source intensity distribution simultaneously. A linear inversion method to reconstruct a pixellated source distribution of a given lens potential model was introduced by Warren and Dye. In the inversion process, a regularization on the source intensity is often needed to ensure a successful inversion with a faithful resulting source. In this paper, we use Bayesian analysis to determine the optimal regularization constant (strength of regularization) of a given form of regularization and to objectively choose the optimal form of regularization given a selection of regularizations. We consider and compare quantitatively three different forms of regularization previously described in the literature for source inversions in gravitational lensing: zeroth-order, gradient and curvature. We use simulated data with the exact lens potential to demonstrate the method. We find that the preferred form of regularization depends on the nature of the source distribution.
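
    A toy numerical sketch of this evidence-based selection of the regularization constant follows (Python; the blur operator, noise level, and gradient-type penalty are illustrative assumptions, and the log-evidence is written in the standard Gaussian-linear form, up to a constant independent of the regularization constant, rather than in the paper's exact notation).

        import numpy as np

        rng = np.random.default_rng(3)
        n = 50
        i = np.arange(n)
        M = np.exp(-0.5 * ((i[:, None] - i) / 2.0) ** 2)      # toy blur/lensing operator
        C = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # gradient-type penalty
        C += 1e-6 * np.eye(n)                                 # keep C positive definite
        sigma = 0.05
        d = M @ np.exp(-0.5 * ((i - 25) / 4.0) ** 2) + rng.normal(0, sigma, n)

        def log_evidence(lam):
            A = M.T @ M / sigma**2 + lam * C
            s_map = np.linalg.solve(A, M.T @ d / sigma**2)    # regularized inversion
            chi2 = np.sum((d - M @ s_map) ** 2) / sigma**2
            reg = lam * s_map @ C @ s_map
            _, logdet_A = np.linalg.slogdet(A)
            _, logdet_C = np.linalg.slogdet(lam * C)
            return -0.5 * (chi2 + reg + logdet_A - logdet_C)

        lams = np.logspace(-3, 3, 61)
        print(lams[np.argmax([log_evidence(l) for l in lams])])  # preferred constant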

  12. Bayesian analysis of a reduced-form air quality model.

    PubMed

    Foley, Kristen M; Reich, Brian J; Napelenok, Sergey L

    2012-07-17

    Numerical air quality models are being used for assessing emission control strategies for improving ambient pollution levels across the globe. This paper applies probabilistic modeling to evaluate the effectiveness of emission reduction scenarios aimed at lowering ground-level ozone concentrations. A Bayesian hierarchical model is used to combine air quality model output and monitoring data in order to characterize the impact of emissions reductions while accounting for different degrees of uncertainty in the modeled emissions inputs. The probabilistic model predictions are weighted based on population density in order to better quantify the societal benefits/disbenefits of four hypothetical emission reduction scenarios in which domain-wide NO(x) emissions from various sectors are reduced individually and then simultaneously. Cross-validation analysis shows that the statistical model performs well against observed ozone levels. Accounting for the variability and uncertainty in the emissions and atmospheric systems being modeled is shown to change how emission reduction scenarios would be ranked relative to standard methodology.

  13. Toward a Behavioral Analysis of Joint Attention

    ERIC Educational Resources Information Center

    Dube, William V.; MacDonald, Rebecca P. F.; Mansfield, Renee C.; Holcomb, William L.; Ahearn, William H.

    2004-01-01

    Joint attention (JA) initiation is defined in cognitive-developmental psychology as a child's actions that verify or produce simultaneous attending by that child and an adult to some object or event in the environment so that both may experience the object or event together. This paper presents a contingency analysis of gaze shift in JA…

  14. Bayesian analysis of anisotropic cosmologies: Bianchi VIIh and WMAP

    NASA Astrophysics Data System (ADS)

    McEwen, J. D.; Josset, T.; Feeney, S. M.; Peiris, H. V.; Lasenby, A. N.

    2013-12-01

    We perform a definitive analysis of Bianchi VIIh cosmologies with Wilkinson Microwave Anisotropy Probe (WMAP) observations of the cosmic microwave background (CMB) temperature anisotropies. Bayesian analysis techniques are developed to study anisotropic cosmologies using full-sky and partial-sky masked CMB temperature data. We apply these techniques to analyse the full-sky internal linear combination (ILC) map and a partial-sky masked W-band map of WMAP 9 yr observations. In addition to the physically motivated Bianchi VIIh model, we examine phenomenological models considered in previous studies, in which the Bianchi VIIh parameters are decoupled from the standard cosmological parameters. In the two phenomenological models considered, Bayes factors of 1.7 and 1.1 units of log-evidence favouring a Bianchi component are found in full-sky ILC data. The corresponding best-fitting Bianchi maps recovered are similar for both phenomenological models and are very close to those found in previous studies using earlier WMAP data releases. However, no evidence for a phenomenological Bianchi component is found in the partial-sky W-band data. In the physical Bianchi VIIh model, we find no evidence for a Bianchi component: WMAP data thus do not favour Bianchi VIIh cosmologies over the standard Λ cold dark matter (ΛCDM) cosmology. It is not possible to discount Bianchi VIIh cosmologies in favour of ΛCDM completely, but we are able to constrain the vorticity of physical Bianchi VIIh cosmologies at (ω/H)_0 < 8.6 × 10^-10 with 95 per cent confidence.

  15. Bayesian analysis versus discriminant function analysis: their relative utility in the diagnosis of coronary disease.

    PubMed

    Detrano, R; Leatherman, J; Salcedo, E E; Yiannikas, J; Williams, G

    1986-05-01

    Both Bayesian analysis assuming independence and discriminant function analysis have been used to estimate probabilities of coronary disease. To compare their relative accuracy, we submitted 303 subjects referred for coronary angiography to stress electrocardiography, thallium scintigraphy, and cine fluoroscopy. Severe angiographic disease was defined as at least one occlusion of greater than 50% in a major vessel. Four calculations were done: (1) Bayesian analysis using literature estimates of pretest probabilities, sensitivities, and specificities was applied to the clinical and test data of a randomly selected subgroup (group I, 151 patients) to calculate posttest probabilities. (2) Bayesian analysis using literature estimates of pretest probabilities (but with sensitivities and specificities derived from the remaining 152 subjects [group II]) was applied to group I data to estimate posttest probabilities. (3) A discriminant function with logistic regression coefficients derived from the clinical and test variables of group II was used to calculate posttest probabilities of group I. (4) A discriminant function derived with the use of test results from group II and pretest probabilities from the literature was used to calculate posttest probabilities of group I. Receiver operating characteristic curve analysis showed that all four calculations could equivalently rank the disease probabilities for our patients. A goodness-of-fit analysis suggested the following relationship between the accuracies of the four calculations: (1) < (2) ≈ (4) < (3). Our results suggest that data-based discriminant functions are more accurate than literature-based Bayesian analysis assuming independence in predicting severe coronary disease based on clinical and noninvasive test results.
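
    The "Bayesian analysis assuming independence" evaluated here is the familiar sequential application of Bayes' theorem with per-test likelihood ratios. A minimal sketch follows (Python; the pretest probability, sensitivities, and specificities are illustrative values, not those of the study).

        def posttest_probability(pretest, results, sens, spec):
            """Bayes under conditional independence: multiply pretest odds by each
            test's likelihood ratio (positive or negative, as observed)."""
            odds = pretest / (1.0 - pretest)
            for positive, se, sp in zip(results, sens, spec):
                odds *= se / (1.0 - sp) if positive else (1.0 - se) / sp
            return odds / (1.0 + odds)

        # Illustrative: stress ECG positive, thallium positive, fluoroscopy negative
        p = posttest_probability(pretest=0.30,
                                 results=[True, True, False],
                                 sens=[0.65, 0.85, 0.60],
                                 spec=[0.80, 0.85, 0.85])
        print(f"posttest probability of severe disease: {p:.2f}")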

  16. Bayesian Analysis of Multiple Populations in Galactic Globular Clusters

    NASA Astrophysics Data System (ADS)

    Wagner-Kaiser, Rachel A.; Sarajedini, Ata; von Hippel, Ted; Stenning, David; Piotto, Giampaolo; Milone, Antonino; van Dyk, David A.; Robinson, Elliot; Stein, Nathan

    2016-01-01

    We use GO 13297 Cycle 21 Hubble Space Telescope (HST) observations and archival GO 10775 Cycle 14 HST ACS Treasury observations of Galactic Globular Clusters to find and characterize multiple stellar populations. Determining how globular clusters are able to create and retain enriched material to produce several generations of stars is key to understanding how these objects formed and how they have affected the structural, kinematic, and chemical evolution of the Milky Way. We employ a sophisticated Bayesian technique with an adaptive MCMC algorithm to simultaneously fit the age, distance, absorption, and metallicity for each cluster. At the same time, we also fit unique helium values to two distinct populations of the cluster and determine the relative proportions of those populations. Our unique numerical approach allows objective and precise analysis of these complicated clusters, providing posterior distribution functions for each parameter of interest. We use these results to gain a better understanding of multiple populations in these clusters and their role in the history of the Milky Way. Support for this work was provided by NASA through grant numbers HST-GO-10775 and HST-GO-13297 from the Space Telescope Science Institute, which is operated by AURA, Inc., under NASA contract NAS5-26555. This material is based upon work supported by the National Aeronautics and Space Administration under Grant NNX11AF34G issued through the Office of Space Science. This project was supported by the National Aeronautics & Space Administration through the University of Central Florida's NASA Florida Space Grant Consortium.

  17. Bayesian inference approach to room-acoustic modal analysis

    NASA Astrophysics Data System (ADS)

    Henderson, Wesley; Goggans, Paul; Xiang, Ning; Botts, Jonathan

    2013-08-01

    Spectrum estimation is a problem common to many fields of physics, science, and engineering, and it has thus received a great deal of attention from the Bayesian data analysis community. In room acoustics, the modal or frequency response of a room is important for diagnosing and remedying acoustical defects. The physics of a sound field in a room dictates a model comprised of exponentially decaying sinusoids. Continuing in the tradition of the seminal work of Bretthorst and Jaynes, this work contributes an approach to analyzing the modal responses of rooms with a time-domain model. Room acoustic spectra are constructed of damped sinusoids, and the model-based approach allows estimation of the number of sinusoids in the signal as well as their frequencies, amplitudes, damping constants, and phase delays. The frequency-amplitude spectrum may be most useful for characterizing a room, but in some settings the damping constants are of primary interest. This is the case for measuring the absorptive properties of materials, for example. A further challenge of the room acoustic spectrum problem is that modal density increases quadratically with frequency. At a point called the Schroeder frequency, adjacent modes overlap enough that the spectrum - particularly when estimated with the discrete Fourier transform - can be treated as a continuum. The time-domain, model-based approach can resolve overlapping modes and in some cases be used to estimate the Schroeder frequency. The proposed approach also addresses the filtering and preprocessing needed for the sampling to accurately identify all room modes present, despite their quadratically increasing density.
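
    The time-domain model in question, a sum of exponentially decaying sinusoids, is easy to state concretely; the sketch below generates such a signal (Python; the mode parameters and sampling rate are illustrative, and the Bayesian estimation of the number of modes is not shown).

        import numpy as np

        def modal_response(t, modes):
            """Sum of damped sinusoids; each mode is (amplitude, freq_hz, damping, phase)."""
            y = np.zeros_like(t)
            for A, f, delta, phi in modes:
                y += A * np.exp(-delta * t) * np.sin(2 * np.pi * f * t + phi)
            return y

        fs = 8000.0                        # sampling rate, Hz
        t = np.arange(0.0, 1.0, 1.0 / fs)
        # two low-frequency room modes with distinct decay constants (illustrative)
        signal = modal_response(t, [(1.0, 34.5, 4.0, 0.0), (0.7, 57.2, 6.5, 1.1)])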

  18. Hierarchical Bayesian Modeling, Estimation, and Sampling for Multigroup Shape Analysis

    PubMed Central

    Yu, Yen-Yun; Fletcher, P. Thomas; Awate, Suyash P.

    2016-01-01

    This paper proposes a novel method for the analysis of anatomical shapes present in biomedical image data. Motivated by the natural organization of population data into multiple groups, this paper presents a novel hierarchical generative statistical model on shapes. The proposed method represents shapes using pointsets and defines a joint distribution on the population’s (i) shape variables and (ii) object-boundary data. The proposed method solves for optimal (i) point locations, (ii) correspondences, and (iii) model-parameter values as a single optimization problem. The optimization uses expectation maximization relying on a novel Markov-chain Monte-Carlo algorithm for sampling in Kendall shape space. Results on clinical brain images demonstrate advantages over the state of the art. PMID:25320776

  19. A Bayesian analysis of the solar neutrino problem

    SciTech Connect

    Bhat, C.M.; Bhat, P.C.; Paterno, M.; Prosper, H.B.

    1996-09-01

    We illustrate how the Bayesian approach can be used to provide a simple but powerful way to analyze data from solar neutrino experiments. The data are analyzed assuming that the neutrinos are unaltered during their passage from the Sun to the Earth. We derive quantitative and easily understood information pertaining to the solar neutrino problem.

  20. Bayesian Analysis of Order-Statistics Models for Ranking Data.

    ERIC Educational Resources Information Center

    Yu, Philip L. H.

    2000-01-01

    Studied the order-statistics models, extending the usual normal order-statistics model into one in which the underlying random variables followed a multivariate normal distribution. Used a Bayesian approach and the Gibbs sampling technique. Applied the proposed method to analyze presidential election data from the American Psychological…

  1. Incorporating Prior Theory in Covariance Structure Analysis: A Bayesian Approach.

    ERIC Educational Resources Information Center

    Fornell, Claes; Rust, Roland T.

    1989-01-01

    A Bayesian approach to the testing of competing covariance structures is developed. Approximate posterior probabilities are easily obtained from the chi square values and other known constants. The approach is illustrated using an example that demonstrates how the prior probabilities can alter results concerning the preferred model specification.…

  2. Semiparametric Thurstonian Models for Recurrent Choices: A Bayesian Analysis

    ERIC Educational Resources Information Center

    Ansari, Asim; Iyengar, Raghuram

    2006-01-01

    We develop semiparametric Bayesian Thurstonian models for analyzing repeated choice decisions involving multinomial, multivariate binary or multivariate ordinal data. Our modeling framework has multiple components that together yield considerable flexibility in modeling preference utilities, cross-sectional heterogeneity and parameter-driven…

  3. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    PubMed

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

    This study aims to search for economies of size and scope in the Portuguese water sector, applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). It demonstrates the usefulness and advantages of Bayesian statistics for making inference in SFA over traditional SFA, which relies on classical statistics alone. The resulting Bayesian methods overcome some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we found that there are important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to substantial advantages in undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services.

  4. Dust biasing of damped Lyman alpha systems: a Bayesian analysis

    NASA Astrophysics Data System (ADS)

    Pontzen, Andrew; Pettini, Max

    2009-02-01

    If damped Lyman alpha systems (DLAs) contain even modest amounts of dust, the ultraviolet luminosity of the background quasar can be severely diminished. When the spectrum is redshifted, this leads to a bias in optical surveys for DLAs. Previous estimates of the magnitude of this effect are in some tension; in particular, the distribution of DLAs in the (N_HI, Z) (i.e. column density-metallicity) plane has led to claims that we may be missing a considerable fraction of metal-rich, high column density DLAs, whereas radio surveys do not unveil a substantial population of otherwise hidden systems. Motivated by this tension, we perform a Bayesian parameter estimation analysis of a simple dust obscuration model. We include radio and optical observations of DLAs in our overall likelihood analysis and show that these do not, in fact, constitute conflicting constraints. Our model gives statistical limits on the biasing effects of dust, predicting that only 7 per cent of DLAs are missing from optical samples due to dust obscuration; at 2σ confidence, this figure takes a maximum value of 17 per cent. This contrasts with recent claims that DLA incidence rates are underestimated by 30-50 per cent. Optical measures of the mean metallicities of DLAs are found to underestimate the true value by just 0.1 dex (or at most 0.4 dex, 2σ confidence limit), in agreement with the radio survey results of Akerman et al. As an independent test, we use our model to make a rough prediction for dust reddening of the background quasar. We find a mean reddening in the DLA rest frame of log10<E(B-V)> ~= -2.4 +/- 0.6, consistent with direct analysis of the Sloan Digital Sky Survey (SDSS) quasar population by Vladilo et al., log10<E(B-V)> = -2.2 +/- 0.1. The quantity most affected by dust biasing is the total cosmic density of metals in DLAs, Ω_Z,DLA, which is underestimated in optical surveys by a factor of approximately 2.

  5. Organizing for Effective Joint Warfare: A Deductive Analysis of U.S. Armed Forces Joint Doctrine

    DTIC Science & Technology

    1993-06-18

    AD-A266 735. (Unclassified paper) Naval War College, Newport, R.I. Author: Jesse J. Kelso, Commander. Using organizational concepts embodied in existing joint doctrine, this paper deduces rational criteria for

  6. A Bayesian approach to meta-analysis of plant pathology studies.

    PubMed

    Mila, A L; Ngugi, H K

    2011-01-01

    Bayesian statistical methods are used for meta-analysis in many disciplines, including medicine, molecular biology, and engineering, but have not yet been applied for quantitative synthesis of plant pathology studies. In this paper, we illustrate the key concepts of Bayesian statistics and outline the differences between Bayesian and classical (frequentist) methods in the way parameters describing population attributes are considered. We then describe a Bayesian approach to meta-analysis and present a plant pathological example based on studies evaluating the efficacy of plant protection products that induce systemic acquired resistance for the management of fire blight of apple. In a simple random-effects model assuming a normal distribution of effect sizes and no prior information (i.e., a noninformative prior), the results of the Bayesian meta-analysis are similar to those obtained with classical methods. Implementing the same model with a Student's t distribution and a noninformative prior for the effect sizes, instead of a normal distribution, yields similar results for all but acibenzolar-S-methyl (Actigard), which was evaluated in only seven studies in this example. Whereas both the classical (P = 0.28) and the Bayesian analysis with a noninformative prior (95% credibility interval [CRI] for the log response ratio: -0.63 to 0.08) indicate a nonsignificant effect for Actigard, specifying a t distribution resulted in a significant, albeit variable, effect for this product (CRI: -0.73 to -0.10). These results confirm the sensitivity of the analytical outcome (i.e., the posterior distribution) to the choice of prior in Bayesian meta-analyses involving a limited number of studies. We review some pertinent literature on more advanced topics, including modeling of among-study heterogeneity, publication bias, analyses involving a limited number of studies, and methods for dealing with missing data, and show how these issues can be approached in a Bayesian framework.
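
    A minimal Gibbs sampler for the normal random-effects model described (y_i ~ N(theta_i, v_i) with theta_i ~ N(mu, tau^2)) is sketched below (Python; the flat prior on mu, the inverse-gamma prior on tau^2, and the input data are illustrative assumptions, and the Student's t variant discussed above is not shown).

        import numpy as np

        def meta_gibbs(y, v, n_iter=5000, a=0.001, b=0.001, seed=1):
            """y: observed effect sizes; v: known within-study variances."""
            rng = np.random.default_rng(seed)
            k = len(y)
            mu, tau2 = y.mean(), y.var()
            out = np.empty((n_iter, 2))
            for i in range(n_iter):
                prec = 1.0 / v + 1.0 / tau2          # study effects: conjugate updates
                theta = rng.normal((y / v + mu / tau2) / prec, np.sqrt(1.0 / prec))
                mu = rng.normal(theta.mean(), np.sqrt(tau2 / k))  # flat prior on mu
                tau2 = 1.0 / rng.gamma(a + k / 2.0,               # inverse-gamma update
                                       1.0 / (b + 0.5 * np.sum((theta - mu) ** 2)))
                out[i] = mu, tau2
            return out

        y = np.array([-0.41, -0.20, -0.52, -0.05, -0.31, -0.18, -0.44])  # illustrative
        v = np.array([0.02, 0.05, 0.04, 0.06, 0.03, 0.05, 0.04])         # log response ratios
        print(meta_gibbs(y, v)[1000:].mean(axis=0))  # posterior means of (mu, tau^2)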

  7. Bayesian analysis of heavy-tailed and long-range dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, Timothy; Watkins, Nick; Gramacy, Robert; Franzke, Christian

    2014-05-01

    We have used MCMC algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average ARFIMA(p,d,q) processes, which are capable of modelling long-range dependence (e.g., Beran et al., 2013). Our principal aim is to obtain inference about the long-memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short-memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series. We have extended the ARFIMA model by weakening the Gaussianity assumption, assuming an alpha-stable, heavy-tailed distribution for the innovations, and performing joint inference on d and alpha. We will present a study of the dependence of the posterior variance of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other popular measures of d.
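
    The long-memory operator at the heart of ARFIMA, the fractional difference (1-B)^d, expands into moving-average-style weights via a simple recursion; a sketch follows (Python; the truncation at the series length and the toy data are illustrative).

        import numpy as np

        def fracdiff_weights(d, n):
            """Coefficients of (1-B)^d from the recursion w_k = w_{k-1} * (k-1-d)/k."""
            w = np.empty(n)
            w[0] = 1.0
            for k in range(1, n):
                w[k] = w[k - 1] * (k - 1 - d) / k
            return w

        def fracdiff(x, d):
            """Apply the truncated fractional difference to a series x."""
            w = fracdiff_weights(d, len(x))
            return np.array([w[:t + 1][::-1] @ x[:t + 1] for t in range(len(x))])

        x = np.cumsum(np.random.default_rng(7).normal(size=300))  # toy series
        y = fracdiff(x, d=0.4)                                    # long-memory filtering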

  8. A Bayesian Solution for Two-Way Analysis of Variance. ACT Technical Bulletin No. 8.

    ERIC Educational Resources Information Center

    Lindley, Dennis V.

    The standard statistical analysis of data classified in two ways (say into rows and columns) is through an analysis of variance that splits the total variation of the data into the main effect of rows, the main effect of columns, and the interaction between rows and columns. This paper presents an alternative Bayesian analysis of the same…

  9. Bayesian analysis of two stellar populations in Galactic globular clusters- III. Analysis of 30 clusters

    NASA Astrophysics Data System (ADS)

    Wagner-Kaiser, R.; Stenning, D. C.; Sarajedini, A.; von Hippel, T.; van Dyk, D. A.; Robinson, E.; Stein, N.; Jefferys, W. H.

    2016-12-01

    We use Cycle 21 Hubble Space Telescope (HST) observations and HST archival ACS Treasury observations of 30 Galactic globular clusters to characterize two distinct stellar populations. A sophisticated Bayesian technique is employed to simultaneously sample the joint posterior distribution of age, distance, and extinction for each cluster, as well as unique helium values for two populations within each cluster and the relative proportion of those populations. We find the helium differences between the two populations in the clusters fall in the range of ~0.04 to 0.11. Because adequate models varying in carbon, nitrogen, and oxygen are not presently available, we view these spreads as upper limits and present them with statistical rather than observational uncertainties. Evidence supports previous studies suggesting an increase in helium content concurrent with increasing cluster mass, and we also find that the proportion of the first population of stars increases with mass as well. Our results are examined in the context of proposed globular cluster formation scenarios. Additionally, we leverage our Bayesian technique to shed light on the inconsistencies between the theoretical models and the observed data.

  10. Efficient Bayesian hierarchical functional data analysis with basis function approximations using Gaussian-Wishart processes.

    PubMed

    Yang, Jingjing; Cox, Dennis D; Lee, Jong Soo; Ren, Peng; Choi, Taeryon

    2017-04-10

    Functional data are defined as realizations of random functions (mostly smooth functions) varying over a continuum, which are usually collected on discretized grids with measurement errors. In order to accurately smooth noisy functional observations and deal with the issue of high-dimensional observation grids, we propose a novel Bayesian method based on the Bayesian hierarchical model with a Gaussian-Wishart process prior and basis function representations. We first derive an induced model for the basis-function coefficients of the functional data, and then use this model to conduct posterior inference through Markov chain Monte Carlo methods. Compared to standard Bayesian inference, which suffers from a serious computational burden and instability when analyzing high-dimensional functional data, our method greatly improves computational scalability and stability, while inheriting the advantage of simultaneously smoothing raw observations and estimating the mean-covariance functions in a nonparametric way. In addition, our method can naturally handle functional data observed on random or uncommon grids. Simulation and real-data studies demonstrate that our method produces results similar to those obtainable by standard Bayesian inference with low-dimensional common grids, while efficiently smoothing and estimating functional data with random and high-dimensional observation grids when standard Bayesian inference fails. In conclusion, our method can efficiently smooth and estimate high-dimensional functional data, providing one way to resolve the curse of dimensionality for Bayesian functional data analysis with Gaussian-Wishart processes.

  11. Bayesian quantile regression-based nonlinear mixed-effects joint models for time-to-event and longitudinal data with multiple features.

    PubMed

    Huang, Yangxin; Chen, Jiaqing

    2016-12-30

    This article explores Bayesian joint models for a quantile of longitudinal response, mismeasured covariate and event time outcome with an attempt to (i) characterize the entire conditional distribution of the response variable based on quantile regression that may be more robust to outliers and misspecification of error distribution; (ii) tailor accuracy from measurement error, evaluate non-ignorable missing observations, and adjust departures from normality in covariate; and (iii) overcome shortages of confidence in specifying a time-to-event model. When statistical inference is carried out for a longitudinal data set with non-central location, non-linearity, non-normality, measurement error, and missing values, as well as an interval-censored event time, it is important to account for the simultaneous treatment of these data features in order to obtain more reliable and robust inferential results. Toward this end, we develop a Bayesian joint modeling approach to simultaneously estimate all parameters in the three models: quantile regression-based nonlinear mixed-effects model for response using asymmetric Laplace distribution, linear mixed-effects model with skew-t distribution for mismeasured covariate in the presence of informative missingness and accelerated failure time model with unspecified nonparametric distribution for event time. We apply the proposed modeling approach to analyzing an AIDS clinical data set and conduct simulation studies to assess the performance of the proposed joint models and method. Copyright © 2016 John Wiley & Sons, Ltd.

  12. Bayesian time-series analysis of a repeated-measures poisson outcome with excess zeroes.

    PubMed

    Murphy, Terrence E; Van Ness, Peter H; Araujo, Katy L B; Pisani, Margaret A

    2011-12-01

    In this article, the authors demonstrate a time-series analysis based on a hierarchical Bayesian model of a Poisson outcome with an excessive number of zeroes. The motivating example for this analysis comes from the intensive care unit (ICU) of an urban university teaching hospital (New Haven, Connecticut, 2002-2004). Studies of medication use among older patients in the ICU are complicated by statistical factors such as an excessive number of zero doses, periodicity, and within-person autocorrelation. Whereas time-series techniques adjust for autocorrelation and periodicity in outcome measurements, Bayesian analysis provides greater precision for small samples and the flexibility to conduct posterior predictive simulations. By applying elements of time-series analysis within both frequentist and Bayesian frameworks, the authors evaluate differences in shift-based dosing of medication in a medical ICU. From a small sample and with adjustment for excess zeroes, linear trend, autocorrelation, and clinical covariates, both frequentist and Bayesian models provide evidence of a significant association between a specific nursing shift and dosing level of a sedative medication. Furthermore, the posterior distributions from a Bayesian random-effects Poisson model permit posterior predictive simulations of related results that are potentially difficult to model.
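
    For concreteness, the zero-inflated Poisson likelihood that underlies such models can be written in a few lines (Python; the mixing-weight parameterization is the standard one, but the hierarchical, autocorrelation, and covariate structure of the authors' model is not reproduced).

        import numpy as np
        from scipy.stats import poisson

        def zip_loglik(y, lam, pi):
            """Zero-inflated Poisson: P(0) = pi + (1-pi)exp(-lam);
            P(y) = (1-pi) * Poisson(y; lam) for y > 0."""
            y = np.asarray(y)
            return np.where(y == 0,
                            np.log(pi + (1.0 - pi) * np.exp(-lam)),
                            np.log(1.0 - pi) + poisson.logpmf(y, lam)).sum()

        doses = [0, 0, 2, 0, 1, 0, 0, 3, 0, 1]   # illustrative per-shift dose counts
        print(zip_loglik(doses, lam=1.2, pi=0.45))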

  13. Bayesian uncertainty analysis compared with the application of the GUM and its supplements

    NASA Astrophysics Data System (ADS)

    Elster, Clemens

    2014-08-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) has proven to be a major step towards the harmonization of uncertainty evaluation in metrology. Its procedures contain elements from both classical and Bayesian statistics. The recent supplements 1 and 2 to the GUM appear to move the guidelines towards the Bayesian point of view, and they produce a probability distribution that shall encode one's state of knowledge about the measurand. In contrast to a Bayesian uncertainty analysis, however, Bayes' theorem is not applied explicitly. Instead, a distribution is assigned for the input quantities which is then ‘propagated’ through a model that relates the input quantities to the measurand. The resulting distribution for the measurand may coincide with a distribution obtained by the application of Bayes' theorem, but this is not true in general. The relation between a Bayesian uncertainty analysis and the application of the GUM and its supplements is investigated. In terms of a simple example, similarities and differences in the approaches are illustrated. Then a general class of models is considered and conditions are specified for which the distribution obtained by supplement 1 to the GUM is equivalent to a posterior distribution resulting from the application of Bayes' theorem. The corresponding prior distribution is identified and assessed. Finally, we briefly compare the GUM approach with a Bayesian uncertainty analysis in the context of regression problems.
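
    The supplement 1 procedure the abstract contrasts with Bayes' theorem is easy to sketch: assign state-of-knowledge distributions to the input quantities, push samples through the measurement model, and summarize the resulting distribution for the measurand. A minimal sketch with a hypothetical model Y = X1/X2:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 1_000_000

# Hypothetical measurement model Y = X1 / X2 with state-of-knowledge
# distributions assigned to the inputs
x1 = rng.normal(10.0, 0.2, N)            # Gaussian from repeated indications
x2 = rng.uniform(1.98, 2.02, N)          # rectangular from specification limits

y = x1 / x2                               # 'propagate' the distributions

lo, hi = np.percentile(y, [2.5, 97.5])    # 95% coverage interval
print(f"y = {y.mean():.4f}, u(y) = {y.std(ddof=1):.4f}, "
      f"95% interval [{lo:.4f}, {hi:.4f}]")
```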

  14. A Gibbs sampler for Bayesian analysis of site-occupancy data

    USGS Publications Warehouse

    Dorazio, Robert M.; Rodriguez, Daniel Taylor

    2012-01-01

    1. A Bayesian analysis of site-occupancy data containing covariates of species occurrence and species detection probabilities is usually completed using Markov chain Monte Carlo methods in conjunction with software programs that can implement those methods for any statistical model, not just site-occupancy models. Although these software programs are quite flexible, considerable experience is often required to specify a model and to initialize the Markov chain so that summaries of the posterior distribution can be estimated efficiently and accurately. 2. As an alternative to these programs, we develop a Gibbs sampler for Bayesian analysis of site-occupancy data that include covariates of species occurrence and species detection probabilities. This Gibbs sampler is based on a class of site-occupancy models in which probabilities of species occurrence and detection are specified as probit-regression functions of site- and survey-specific covariate measurements. 3. To illustrate the Gibbs sampler, we analyse site-occupancy data of the blue hawker, Aeshna cyanea (Odonata, Aeshnidae), a common dragonfly species in Switzerland. Our analysis includes a comparison of results based on Bayesian and classical (non-Bayesian) methods of inference. We also provide code (based on the R software program) for conducting Bayesian and classical analyses of site-occupancy data.
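
    The probit formulation is what makes a clean Gibbs sampler possible, via the Albert-Chib data augmentation: latent Gaussian utilities are drawn from truncated normals, after which the regression coefficients have a conjugate Gaussian update. A minimal sketch of that augmentation for plain probit regression (the full occupancy sampler adds a latent occupancy state per site, omitted here):

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(0)

# Simulated probit data: y_i = 1{ x_i' beta + e_i > 0 },  e_i ~ N(0, 1)
n, beta_true = 500, np.array([-0.5, 1.0])
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = (X @ beta_true + rng.standard_normal(n) > 0).astype(int)

# Albert-Chib Gibbs sampler with a N(0, 10 I) prior on beta
V = np.linalg.inv(np.eye(2) / 10.0 + X.T @ X)   # constant posterior covariance
beta, draws = np.zeros(2), []
for it in range(2_000):
    # 1) latent utilities z_i | beta, y_i: truncated normal around X beta
    mu = X @ beta
    lo = np.where(y == 1, -mu, -np.inf)         # z_i > 0 when y_i = 1
    hi = np.where(y == 1, np.inf, -mu)          # z_i < 0 when y_i = 0
    z = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
    # 2) beta | z: conjugate Gaussian update
    beta = rng.multivariate_normal(V @ (X.T @ z), V)
    if it >= 500:
        draws.append(beta)

print(np.mean(draws, axis=0))                   # should be near beta_true
```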

  15. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis: A Cross-National Investigation of Schwartz Values

    ERIC Educational Resources Information Center

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the model parameters and demonstrates the consequences…

  16. Bayesian Factor Analysis When Only a Sample Covariance Matrix Is Available

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Arav, Marina

    2006-01-01

    In traditional factor analysis, the variance-covariance matrix or the correlation matrix has often been a form of inputting data. In contrast, in Bayesian factor analysis, the entire data set is typically required to compute the posterior estimates, such as Bayes factor loadings and Bayes unique variances. We propose a simple method for computing…

  17. Analysis of mechanical joint in composite cylinder

    NASA Astrophysics Data System (ADS)

    Hong, C. S.; Kim, Y. W.; Park, J. S.

    Joining techniques for composite materials are of great interest for cylindrical structures, as composites are widely used in weight-sensitive structures. Little information on mechanical fastening joints in laminated shell structures is available in the literature. In this study, a finite element program based on first-order shear deformation theory was developed for the analysis of mechanical joints in laminated composite structures. The failure of the mechanical fastening joint of a laminated graphite/epoxy cylinder subject to internal pressure was analyzed using the developed program. Modeling of the bolt head in the composite cylinder was studied, and the effect of steel reinforcement outside the composite cylinder on the failure was investigated. The stress component near the bolt head was influenced by the size of the bolt head. The failure load and the failure mode were dependent on the bolt diameter, the number of bolts, and the fiber orientation. The failure load was constant when the edge distance exceeded three times the bolt diameter.

  18. A Bayesian hidden Markov model for motif discovery through joint modeling of genomic sequence and ChIP-chip data.

    PubMed

    Gelfond, Jonathan A L; Gupta, Mayetri; Ibrahim, Joseph G

    2009-12-01

    We propose a unified framework for the analysis of chromatin (Ch) immunoprecipitation (IP) microarray (ChIP-chip) data for detecting transcription factor binding sites (TFBSs) or motifs. ChIP-chip assays are used to focus the genome-wide search for TFBSs by isolating a sample of DNA fragments with TFBSs and applying this sample to a microarray with probes corresponding to tiled segments across the genome. Present analytical methods use a two-step approach: (i) analyze array data to estimate IP-enrichment peaks then (ii) analyze the corresponding sequences independently of intensity information. The proposed model integrates peak finding and motif discovery through a unified Bayesian hidden Markov model (HMM) framework that accommodates the inherent uncertainty in both measurements. A Markov chain Monte Carlo algorithm is formulated for parameter estimation, adapting recursive techniques used for HMMs. In simulations and applications to a yeast RAP1 dataset, the proposed method has favorable TFBS discovery performance compared to currently available two-stage procedures in terms of both sensitivity and specificity.
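
    The recursive techniques mentioned are those of the standard HMM forward recursion, which accumulates the marginal likelihood by summing over hidden-state paths. A generic log-space sketch on toy intensity data, not the paper's joint sequence-intensity model:

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

def hmm_loglik(obs_logprob, log_A, log_pi):
    """Forward recursion in log space.
    obs_logprob[t, k] = log P(observation t | state k),
    log_A[j, k] = log P(state k at t+1 | state j at t), log_pi = log initial probs."""
    T, _ = obs_logprob.shape
    alpha = log_pi + obs_logprob[0]
    for t in range(1, T):
        alpha = logsumexp(alpha[:, None] + log_A, axis=0) + obs_logprob[t]
    return logsumexp(alpha)

# Toy data: background vs. enriched segments emitting Gaussian intensities
rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(3, 1, 10), rng.normal(0, 1, 50)])
obs = np.column_stack([norm.logpdf(x, 0, 1), norm.logpdf(x, 3, 1)])
log_A = np.log([[0.95, 0.05], [0.10, 0.90]])
log_pi = np.log([0.5, 0.5])
print(hmm_loglik(obs, log_A, log_pi))
```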

  19. A Bayesian hidden Markov model for motif discovery through joint modeling of genomic sequence and ChIP-chip data

    PubMed Central

    Gelfond, Jonathan A. L.; Gupta, Mayetri; Ibrahim, Joseph G.

    2009-01-01

    We propose a unified framework for the analysis of Chromatin (Ch) Immunoprecipitation (IP) microarray (ChIP-chip) data for detecting transcription factor binding sites (TFBSs) or motifs. ChIP-chip assays are used to focus the genome-wide search for TFBSs by isolating a sample of DNA fragments with TFBSs and applying this sample to a microarray with probes corresponding to tiled segments across the genome. Present analytical methods use a two-step approach: (i) analyze array data to estimate IP enrichment peaks then (ii) analyze the corresponding sequences independently of intensity information. The proposed model integrates peak finding and motif discovery through a unified Bayesian hidden Markov model (HMM) framework that accommodates the inherent uncertainty in both measurements. A Markov Chain Monte Carlo algorithm is formulated for parameter estimation, adapting recursive techniques used for HMMs. In simulations and applications to a yeast RAP1 dataset, the proposed method has favorable TFBS discovery performance compared to currently available two-stage procedures in terms of both sensitivity and specificity. PMID:19210737

  20. Analysis of Climate Change on Hydrologic Components by using Bayesian Neural Networks

    NASA Astrophysics Data System (ADS)

    Kang, K.

    2012-12-01

    Representing hydrologic processes under climate change is a challenging task. Hydrologic outputs driven by regional climate models (RCMs) nested in general circulation models (GCMs) are difficult to represent because of several sources of uncertainty in the hydrologic impacts of climate change. To overcome this problem, this research presents practical options for hydrological climate-change analysis using a Bayesian neural network approach to regional adaptation to climate change. Applying Bayesian neural networks to hydrologic components is a new line of research in climate-change studies. A key advantage of Bayesian neural networks is the detection of time-series structure in hydrologic components, which is complicated by uncertainty in the data, the parameters, and the model hypotheses under a climate-change scenario: connections are removed from and added to the network step by step, combined with the Bayesian cycle of parameter estimation, prediction, and updating. As an example study, the Mekong River watershed, which extends over four countries (Myanmar, Laos, Thailand and Cambodia), is selected. The results illustrate the trends of hydrologic components in climate model simulations as captured by Bayesian neural networks.

  1. Bayesian analysis of time-series data under case-crossover designs: posterior equivalence and inference.

    PubMed

    Li, Shi; Mukherjee, Bhramar; Batterman, Stuart; Ghosh, Malay

    2013-12-01

    Case-crossover designs are widely used to study short-term exposure effects on the risk of acute adverse health events. While the frequentist literature on this topic is vast, there is no Bayesian work in this general area. The contribution of this paper is twofold. First, the paper establishes Bayesian equivalence results that require characterization of the set of priors under which the posterior distributions of the risk ratio parameters based on a case-crossover and time-series analysis are identical. Second, the paper studies inferential issues under case-crossover designs in a Bayesian framework. Traditionally, a conditional logistic regression is used for inference on risk-ratio parameters in case-crossover studies. We consider instead a more general full likelihood-based approach which makes less restrictive assumptions on the risk functions. Formulation of a full likelihood leads to growth in the number of parameters proportional to the sample size. We propose a semi-parametric Bayesian approach using a Dirichlet process prior to handle the random nuisance parameters that appear in a full likelihood formulation. We carry out a simulation study to compare the Bayesian methods based on full and conditional likelihood with the standard frequentist approaches for case-crossover and time-series analysis. The proposed methods are illustrated through the Detroit Asthma Morbidity, Air Quality and Traffic study, which examines the association between acute asthma risk and ambient air pollutant concentrations.

  2. FABADA: Fitting Algorithm for Bayesian Analysis of DAta

    NASA Astrophysics Data System (ADS)

    Pardo, L.; Sala, G.

    2014-07-01

    The extraction of physical information from data has generally been performed by fitting the data through a χ^2 minimization procedure. However, as pointed out in the pioneering work of D. S. Sivia et al., another way to analyze the data is possible using a probabilistic approach based on Bayes' theorem. Expressed in a practical way, the main difference between the classical (χ^2 minimization) and the Bayesian approach is the way of expressing the final results of the fitting procedure: in the first case the result is expressed by parameter values and a figure of merit such as χ^2, while in the second case results are presented as probability distribution functions (PDFs) of both. In the method presented here we obtain the final probability distribution functions by exploring the combinations of parameters compatible with the experimental error, i.e. allowing the fitting procedure to wander in the parameter space with a probability of visiting a certain point P=exp(-χ^2/2), so-called Gibbs sampling. Among the advantages of this method, we would like to emphasize three. First of all, correlation between parameters is automatically taken into account in the Bayesian method. This implies, for example, that parameter errors are correctly calculated, correlations show up in a natural way, and ill-defined parameters are immediately recognized from their PDFs (i.e. parameters for which the data only support the calculation of lower or upper bounds). Secondly, it is possible to calculate the likelihood of a given physical model, and therefore to select the one which best fits the data with the minimum number of parameters, in a correctly defined probabilistic way. Finally, and not least, in the case of a low count rate, where the familiar error = √(counts) fails because the Poisson distribution can no longer be approximated by a Gaussian, the Bayesian method can still be used by simply redefining χ^2, which is not possible with the usual fitting procedure.
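
    The exploration rule described, wandering through parameter space and visiting a point with probability P = exp(-χ^2/2), is in essence a random-walk Metropolis sampler. A minimal illustration fitting a straight line to synthetic data (a sketch of the idea, not the FABADA code):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic data: straight line with Gaussian noise, sigma = 1 for all points
x = np.linspace(0, 10, 30)
y = 2.0 * x + 1.0 + rng.normal(0, 1.0, x.size)

def chi2(theta):
    a, b = theta
    return np.sum((y - (a * x + b)) ** 2)

# Random walk through parameter space; a move is accepted with
# probability min(1, exp(-(chi2_new - chi2_old) / 2))
theta = np.array([0.0, 0.0])
c_old, chain = chi2(theta), []
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.05, 2)
    c_new = chi2(prop)
    if c_new < c_old or rng.random() < np.exp(-0.5 * (c_new - c_old)):
        theta, c_old = prop, c_new
    chain.append(theta)

chain = np.array(chain[5_000:])               # discard burn-in
print(chain.mean(axis=0), chain.std(axis=0))  # PDFs of slope and intercept
```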

  3. Bayesian analysis of the dynamic structure in China's economic growth

    NASA Astrophysics Data System (ADS)

    Kyo, Koki; Noda, Hideo

    2008-11-01

    To analyze the dynamic structure in China's economic growth during the period 1952-1998, we introduce a model of the aggregate production function for the Chinese economy that considers total factor productivity (TFP) and output elasticities as time-varying parameters. Specifically, this paper is concerned with the relationship between the rate of economic growth in China and the trend in TFP. Here, we consider the time-varying parameters as random variables and introduce smoothness priors to construct a set of Bayesian linear models for parameter estimation. The results of the estimation are in agreement with the movements in China's social economy, thus illustrating the validity of the proposed methods.

  4. Bayesian analysis of truncation errors in chiral effective field theory

    NASA Astrophysics Data System (ADS)

    Melendez, J.; Furnstahl, R. J.; Klco, N.; Phillips, D. R.; Wesolowski, S.

    2016-09-01

    In the Bayesian approach to effective field theory (EFT) expansions, truncation errors are derived from degree-of-belief (DOB) intervals for EFT predictions. By encoding expectations about the naturalness of EFT expansion coefficients for observables, this framework provides a statistical interpretation of the standard EFT procedure where truncation errors are estimated using the order-by-order convergence of the expansion. We extend and test previous calculations of DOB intervals for chiral EFT observables, examine correlations between contributions at different orders and energies, and explore methods to validate the statistical consistency of the EFT expansion parameter. Supported in part by the NSF and the DOE.

  5. A Bayesian geostatistical transfer function approach to tracer test analysis

    NASA Astrophysics Data System (ADS)

    Fienen, Michael N.; Luo, Jian; Kitanidis, Peter K.

    2006-07-01

    Reactive transport modeling is often used in support of bioremediation and chemical treatment planning and design. There remains a pressing need for practical and efficient models that do not require (or assume attainable) the high level of characterization needed by complex numerical models. We focus on a linear systems or transfer function approach to the problem of reactive tracer transport in a heterogeneous saprolite aquifer. Transfer functions are obtained through the Bayesian geostatistical inverse method applied to tracer injection histories and breakthrough curves. We employ nonparametric transfer functions, which require minimal assumptions about shape and structure. The resulting flexibility empowers the data to determine the nature of the transfer function with minimal prior assumptions. Nonnegativity is enforced through a reflected Brownian motion stochastic model. The inverse method enables us to quantify uncertainty and to generate conditional realizations of the transfer function. Complex information about a hydrogeologic system is distilled into a relatively simple but rigorously obtained function that describes the transport behavior of the system between two wells. The resulting transfer functions are valuable in reactive transport models based on traveltime and streamline methods. The information contained in the data, particularly in the case of strong heterogeneity, is not overextended but is fully used. This is the first application of Bayesian geostatistical inversion to transfer functions in hydrogeology but the methodology can be extended to any linear system.

  6. OBJECTIVE BAYESIAN ANALYSIS OF ''ON/OFF'' MEASUREMENTS

    SciTech Connect

    Casadei, Diego

    2015-01-01

    In high-energy astrophysics, it is common practice to account for the background overlaid with counts from the source of interest with the help of auxiliary measurements carried out by pointing off-source. In this ''on/off'' measurement, one knows the number of photons detected while pointing toward the source, the number of photons collected while pointing away from the source, and how to estimate the background counts in the source region from the flux observed in the auxiliary measurements. For very faint sources, the number of photons detected is so low that the approximations that hold asymptotically are not valid. On the other hand, an analytical solution exists for the Bayesian statistical inference, which is valid at low and high counts. Here we illustrate the objective Bayesian solution based on the reference posterior and compare the result with the approach very recently proposed by Knoetig, and discuss its most delicate points. In addition, we propose to compute the significance of the excess with respect to the background-only expectation with a method that is able to account for any uncertainty on the background and is valid for any photon count. This method is compared to the widely used significance formula by Li and Ma, which is based on asymptotic properties.
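
    The widely used Li and Ma significance mentioned at the end (their equation 17) is straightforward to compute. A small sketch with illustrative counts, where alpha is the ratio of on-source to off-source exposure:

```python
import numpy as np

def li_ma_significance(n_on, n_off, alpha):
    """Eq. (17) of Li & Ma (1983): significance of an on-source excess.
    alpha is the ratio of on-source to off-source exposure; counts must be > 0."""
    total = n_on + n_off
    term_on = n_on * np.log((1 + alpha) / alpha * n_on / total)
    term_off = n_off * np.log((1 + alpha) * n_off / total)
    return np.sqrt(2.0 * (term_on + term_off))

# Illustrative counts: 30 photons on source, 60 off source, exposure ratio 1:3
print(li_ma_significance(30, 60, alpha=1 / 3))   # roughly 1.8 sigma
```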

  7. Objective Bayesian Analysis of "on/off" Measurements

    NASA Astrophysics Data System (ADS)

    Casadei, Diego

    2015-01-01

    In high-energy astrophysics, it is common practice to account for the background overlaid with counts from the source of interest with the help of auxiliary measurements carried out by pointing off-source. In this "on/off" measurement, one knows the number of photons detected while pointing toward the source, the number of photons collected while pointing away from the source, and how to estimate the background counts in the source region from the flux observed in the auxiliary measurements. For very faint sources, the number of photons detected is so low that the approximations that hold asymptotically are not valid. On the other hand, an analytical solution exists for the Bayesian statistical inference, which is valid at low and high counts. Here we illustrate the objective Bayesian solution based on the reference posterior and compare the result with the approach very recently proposed by Knoetig, and discuss its most delicate points. In addition, we propose to compute the significance of the excess with respect to the background-only expectation with a method that is able to account for any uncertainty on the background and is valid for any photon count. This method is compared to the widely used significance formula by Li & Ma, which is based on asymptotic properties.

  8. A Bayesian Analysis of the Ages of Four Open Clusters

    NASA Astrophysics Data System (ADS)

    Jeffery, Elizabeth J.; von Hippel, Ted; van Dyk, David A.; Stenning, David C.; Robinson, Elliot; Stein, Nathan; Jefferys, William H.

    2016-09-01

    In this paper we apply a Bayesian technique to determine the best fit of stellar evolution models to find the main sequence turn-off age and other cluster parameters of four intermediate-age open clusters: NGC 2360, NGC 2477, NGC 2660, and NGC 3960. Our algorithm utilizes a Markov chain Monte Carlo technique to fit these various parameters, objectively finding the best-fit isochrone for each cluster. The result is a high-precision isochrone fit. We compare these results with those of traditional “by-eye” isochrone fitting methods. By applying this Bayesian technique to NGC 2360, NGC 2477, NGC 2660, and NGC 3960, we determine the ages of these clusters to be 1.35 ± 0.05, 1.02 ± 0.02, 1.64 ± 0.04, and 0.860 ± 0.04 Gyr, respectively. The results of this paper continue our effort to determine cluster ages to a higher precision than that offered by traditional methods of isochrone fitting.

  9. Toward a behavioral analysis of joint attention

    PubMed Central

    Dube, William V.; MacDonald, Rebecca P. F.; Mansfield, Reneé C.; Holcomb, William L.; Ahearn, William H.

    2004-01-01

    Joint attention (JA) initiation is defined in cognitive-developmental psychology as a child's actions that verify or produce simultaneous attending by that child and an adult to some object or event in the environment so that both may experience the object or event together. This paper presents a contingency analysis of gaze shift in JA initiation. The analysis describes reinforcer-establishing and evocative effects of antecedent objects or events, discriminative and conditioned reinforcing functions of stimuli generated by adult behavior, and socially mediated reinforcers that may maintain JA behavior. A functional analysis of JA may describe multiple operant classes. The paper concludes with a discussion of JA deficits in children with autism spectrum disorders and suggestions for research and treatment. PMID:22478429

  10. The Joint Physics Analysis Center: Recent results

    NASA Astrophysics Data System (ADS)

    Fernández-Ramírez, César

    2016-10-01

    We review some of the recent achievements of the Joint Physics Analysis Center, a theoretical collaboration with ties to experimental collaborations, that aims to provide amplitudes suitable for the analysis of current and forthcoming experimental data on hadron physics. Since its foundation in 2013, the group has focused on hadron spectroscopy in preparation for the forthcoming high-statistics and high-precision experimental data from the Belle II, BESIII, CLAS12, COMPASS, GlueX, LHCb and (hopefully) PANDA collaborations. So far, we have developed amplitudes for πN scattering, KN scattering, pion and J/ψ photoproduction, two-kaon photoproduction and three-body decays of light mesons (η, ω, ϕ). The codes for the amplitudes are available to download from the group web page and can be straightforwardly incorporated into the analysis of experimental data.

  11. Calibration of crash risk models on freeways with limited real-time traffic data using Bayesian meta-analysis and Bayesian inference approach.

    PubMed

    Xu, Chengcheng; Wang, Wei; Liu, Pan; Li, Zhibin

    2015-12-01

    This study aimed to develop a real-time crash risk model with limited data in China by using a Bayesian meta-analysis and Bayesian inference approach. A systematic review was first conducted using three different Bayesian meta-analyses: the fixed effect meta-analysis, the random effect meta-analysis, and the meta-regression. The meta-analyses provided a numerical summary of the effects of traffic variables on crash risks by quantitatively synthesizing results from previous studies. The random effect meta-analysis and the meta-regression produced more conservative estimates of the effects of traffic variables than the fixed effect meta-analysis. Then, the meta-analysis results were used as informative priors for developing crash risk models with limited data. The choice among the three meta-analyses significantly affected model fit and prediction accuracy. The model based on meta-regression increased prediction accuracy by about 15% compared with the model developed directly from the limited data. Finally, Bayesian predictive density analysis was used to identify outliers in the limited data, which further improved prediction accuracy by 5.0%.

  12. Covariate balance in a Bayesian propensity score analysis of beta blocker therapy in heart failure patients.

    PubMed

    McCandless, Lawrence C; Gustafson, Paul; Austin, Peter C; Levy, Adrian R

    2009-09-10

    Regression adjustment for the propensity score is a statistical method that reduces confounding from measured variables in observational data. A Bayesian propensity score analysis extends this idea by using simultaneous estimation of the propensity scores and the treatment effect. In this article, we conduct an empirical investigation of the performance of Bayesian propensity scores in the context of an observational study of the effectiveness of beta-blocker therapy in heart failure patients. We study the balancing properties of the estimated propensity scores. Traditional Frequentist propensity scores focus attention on balancing covariates that are strongly associated with treatment. In contrast, we demonstrate that Bayesian propensity scores can be used to balance the association between covariates and the outcome. This balancing property has the effect of reducing confounding bias because it reduces the degree to which covariates are outcome risk factors.

  13. Robust Bayesian Analysis of Heavy-tailed Stochastic Volatility Models using Scale Mixtures of Normal Distributions

    PubMed Central

    Abanto-Valle, C. A.; Bandyopadhyay, D.; Lachos, V. H.; Enriquez, I.

    2009-01-01

    A Bayesian analysis of stochastic volatility (SV) models using the class of symmetric scale mixtures of normal (SMN) distributions is considered. In the face of non-normality, this provides an appealing robust alternative to the routine use of the normal distribution. Specific distributions examined include the normal, Student-t, slash and variance gamma distributions. Using a Bayesian paradigm, an efficient Markov chain Monte Carlo (MCMC) algorithm is introduced for parameter estimation. Moreover, the mixing parameters obtained as a by-product of the scale mixture representation can be used to identify outliers. The methods developed are applied to analyze daily stock return data on the S&P 500 index. Bayesian model selection criteria as well as out-of-sample forecasting results reveal that the SV models based on heavy-tailed SMN distributions provide significant improvement in model fit and prediction for the S&P 500 index data over the usual normal model. PMID:20730043
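
    The scale-mixture device is easy to demonstrate: a Student-t draw is a normal draw whose precision is itself gamma-distributed, and small mixing values flag outliers exactly as the abstract notes. A minimal sketch of the representation (not the SV sampler itself):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
nu, n = 5.0, 200_000

# Student-t via its scale-mixture-of-normals representation:
# lambda ~ Gamma(nu/2, rate nu/2),  x | lambda ~ N(0, 1/lambda)
lam = rng.gamma(shape=nu / 2, scale=2 / nu, size=n)   # numpy gamma takes scale = 1/rate
x = rng.normal(0.0, 1.0 / np.sqrt(lam))

# The sample agrees in distribution with a direct Student-t draw
direct = stats.t.rvs(df=nu, size=n, random_state=rng)
print(stats.ks_2samp(x, direct).pvalue)               # large p-value expected

# Small mixing values flag outliers: large |x| pairs with small lambda
print(np.corrcoef(np.abs(x), lam)[0, 1])              # clearly negative
```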

  14. Symptoms of Depression and Challenging Behaviours in People with Intellectual Disability: A Bayesian Analysis. Brief Report

    ERIC Educational Resources Information Center

    Tsiouris, John; Mann, Rachel; Patti, Paul; Sturmey, Peter

    2004-01-01

    Clinicians need to know the likelihood of a condition given a positive or negative diagnostic test. In this study a Bayesian analysis of the Clinical Behavior Checklist for Persons with Intellectual Disabilities (CBCPID) to predict depression in people with intellectual disability was conducted. The CBCPID was administered to 92 adults with…

  15. Bayesian analysis of cross-prefectural production function with time varying structure in Japan

    NASA Astrophysics Data System (ADS)

    Kyo, Koki; Noda, Hideo

    2006-11-01

    A cross-prefectural production function (CPPF) in Japan is constructed in a set of Bayesian models to examine the performance of Japan's post-war economy. The parameters in the model are estimated by using the procedure of a Monte Carlo filter together with the method of maximum likelihood. The estimated results are applied to regional and historical analysis of the Japanese economy.

  16. Bayesian Meta-Analysis of Cronbach's Coefficient Alpha to Evaluate Informative Hypotheses

    ERIC Educational Resources Information Center

    Okada, Kensuke

    2015-01-01

    This paper proposes a new method to evaluate informative hypotheses for meta-analysis of Cronbach's coefficient alpha using a Bayesian approach. The coefficient alpha is one of the most widely used reliability indices. In meta-analyses of reliability, researchers typically form specific informative hypotheses beforehand, such as "alpha of…

  17. Monte Carlo Algorithms for a Bayesian Analysis of the Cosmic Microwave Background

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey B.; Eriksen, H. K.; O'Dwyer, I. J.; Wandelt, B. D.; Gorski, K.; Knox, L.; Chu, M.

    2006-01-01

    A viewgraph presentation is given reviewing the Bayesian approach to Cosmic Microwave Background (CMB) analysis, its numerical implementation with Gibbs sampling, a summary of the application to WMAP I, and work in progress on generalizations to polarization, foregrounds, asymmetric beams, and 1/f noise.

  18. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...

  19. Bayesian Factor Analysis as a Variable-Selection Problem: Alternative Priors and Consequences.

    PubMed

    Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric

    2016-01-01

    Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor-loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, Muthén & Asparouhov proposed a Bayesian structural equation modeling (BSEM) approach to explore the presence of cross loadings in CFA models. We show that the issue of determining factor-loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov's approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike-and-slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set is used to demonstrate our approach.

  20. Using Discrete Loss Functions and Weighted Kappa for Classification: An Illustration Based on Bayesian Network Analysis

    ERIC Educational Resources Information Center

    Zwick, Rebecca; Lenaburg, Lubella

    2009-01-01

    In certain data analyses (e.g., multiple discriminant analysis and multinomial log-linear modeling), classification decisions are made based on the estimated posterior probabilities that individuals belong to each of several distinct categories. In the Bayesian network literature, this type of classification is often accomplished by assigning…

  1. Application of a data-mining method based on Bayesian networks to lesion-deficit analysis

    NASA Technical Reports Server (NTRS)

    Herskovits, Edward H.; Gerring, Joan P.

    2003-01-01

    Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics.

  2. Applying Bayesian Modeling and Receiver Operating Characteristic Methodologies for Test Utility Analysis

    ERIC Educational Resources Information Center

    Wang, Qiu; Diemer, Matthew A.; Maier, Kimberly S.

    2013-01-01

    This study integrated Bayesian hierarchical modeling and receiver operating characteristic analysis (BROCA) to evaluate how interest strength (IS) and interest differentiation (ID) predicted low–socioeconomic status (SES) youth's interest-major congruence (IMC). Using large-scale Kuder Career Search online-assessment data, this study fit three…

  3. Bayesian Network Meta-Analysis for Unordered Categorical Outcomes with Incomplete Data

    ERIC Educational Resources Information Center

    Schmid, Christopher H.; Trikalinos, Thomas A.; Olkin, Ingram

    2014-01-01

    We develop a Bayesian multinomial network meta-analysis model for unordered (nominal) categorical outcomes that allows for partially observed data in which exact event counts may not be known for each category. This model properly accounts for correlations of counts in mutually exclusive categories and enables proper comparison and ranking of…

  4. Results and Analysis from Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Matty, Jennifer

    2010-01-01

    This joint mobility KC lecture included information from two papers, "A Method for and Issues Associated with the Determination of Space Suit Joint Requirements" and "Results and Analysis from Space Suit Joint Torque Testing," as presented for the International Conference on Environmental Systems in 2009 and 2010, respectively. The first paper discusses historical joint torque testing methodologies and approaches that were tested in 2008 and 2009. The second paper discusses the testing that was completed in 2009 and 2010.

  5. Bayesian Analysis of Non-Gaussian Long-Range Dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, Timothy; Watkins, Nicholas; Franzke, Christian; Gramacy, Robert

    2013-04-01

    Recent studies [e.g. the Antarctic study of Franzke, J. Climate, 2010] have strongly suggested that surface temperatures exhibit long-range dependence (LRD). The presence of LRD would hamper the identification of deterministic trends and the quantification of their significance. It is well established that LRD processes exhibit stochastic trends over rather long periods of time. Thus, accurate methods for discriminating between physical processes that possess long memory and those that do not are an important adjunct to climate modeling. As we briefly review, the LRD idea originated at the same time as H-self-similarity, so it is often not realised that a model does not have to be H-self-similar to show LRD [e.g. Watkins, GRL Frontiers, 2013]. We have used Markov chain Monte Carlo algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average ARFIMA(p,d,q) processes, which are capable of modeling LRD. Our principal aim is to obtain inference about the long memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series. Many physical processes, for example the Faraday Antarctic time series, are significantly non-Gaussian. We have therefore extended this work by weakening the Gaussianity assumption, assuming an alpha-stable distribution for the innovations, and performing joint inference on d and alpha. Such a modified ARFIMA(p,d,q) process is a flexible, initial model for non-Gaussian processes with long memory. We will present a study of the dependence of the posterior variance of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other measures of d.
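
    The long-memory ingredient of an ARFIMA model can be sketched directly: fractionally integrating white noise with MA(∞) weights ψ_k = Γ(k+d)/(Γ(d)Γ(k+1)) produces a series whose autocorrelations decay slowly for 0 < d < 1/2. A minimal simulation sketch:

```python
import numpy as np

def arfima_0d0(n, d, rng):
    """Simulate ARFIMA(0, d, 0) by fractionally integrating white noise with
    the MA(inf) weights psi_k = Gamma(k + d) / (Gamma(d) Gamma(k + 1)),
    computed via the stable recursion psi_k = psi_{k-1} (k - 1 + d) / k."""
    psi = np.empty(n)
    psi[0] = 1.0
    for k in range(1, n):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    eps = rng.standard_normal(n)
    return np.convolve(psi, eps)[:n]     # x_t = sum_k psi_k eps_{t-k}

rng = np.random.default_rng(5)
x = arfima_0d0(10_000, d=0.3, rng=rng)

# Long-range dependence shows up as slowly decaying autocorrelations
print([round(np.corrcoef(x[:-k], x[k:])[0, 1], 3) for k in (1, 10, 100)])
```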

  6. Bayesian approach to the analysis of neutron Brillouin scattering data on liquid metals

    NASA Astrophysics Data System (ADS)

    De Francesco, A.; Guarini, E.; Bafile, U.; Formisano, F.; Scaccia, L.

    2016-08-01

    When the dynamics of liquids and disordered systems at the mesoscopic level is investigated by means of inelastic scattering (e.g., neutron or x-ray), spectra are often characterized by a poor definition of the excitation lines and spectroscopic features in general, and one important issue is to establish how many of these lines need to be included in the modeling function and to estimate their parameters. Furthermore, when strongly damped excitations are present, commonly used and widespread fitting algorithms are particularly affected by the choice of initial parameter values. An inadequate choice may lead to an inefficient exploration of the parameter space, resulting in the algorithm getting stuck in a local minimum. In this paper, we present a Bayesian approach to the analysis of neutron Brillouin scattering data in which the number of excitation lines is treated as unknown and estimated along with the other model parameters. We propose a joint estimation procedure based on a reversible-jump Markov chain Monte Carlo algorithm, which efficiently explores the parameter space, producing a probabilistic measure to quantify the uncertainty on the number of excitation lines as well as reliable parameter estimates. The method proposed could turn out to be of great importance in extracting physical information from experimental data, especially when the detection of spectral features is complicated not only because of the properties of the sample, but also because of the limited instrumental resolution and count statistics. The approach is tested on a generated data set and then applied to real experimental spectra of neutron Brillouin scattering from a liquid metal, previously analyzed in a more traditional way.

  7. Variational Bayesian causal connectivity analysis for fMRI.

    PubMed

    Luessi, Martin; Babacan, S Derin; Molina, Rafael; Booth, James R; Katsaggelos, Aggelos K

    2014-01-01

    The ability to accurately estimate effective connectivity among brain regions from neuroimaging data could help answer many open questions in neuroscience. We propose a method which uses causality to obtain a measure of effective connectivity from fMRI data. The method uses a vector autoregressive model for the latent variables describing neuronal activity in combination with a linear observation model based on a convolution with a hemodynamic response function. Due to the employed modeling, it is possible to efficiently estimate all latent variables of the model using a variational Bayesian inference algorithm. The computational efficiency of the method enables us to apply it to large scale problems with high sampling rates and several hundred regions of interest. We use a comprehensive empirical evaluation with synthetic and real fMRI data to evaluate the performance of our method under various conditions.

  8. FABADA: a Fitting Algorithm for Bayesian Analysis of DAta

    NASA Astrophysics Data System (ADS)

    Pardo, L. C.; Rovira-Esteva, M.; Busch, S.; Ruiz-Martin, M. D.; Tamarit, J. Ll

    2011-10-01

    The fit of data using a mathematical model is the standard way to know if the model describes data correctly and to obtain parameters that describe the physical processes hidden behind the experimental results. This is usually done by means of a χ2 minimization procedure. Although this procedure is fast and quite reliable for simple models, it has many drawbacks when dealing with complicated problems such as models with many or correlated parameters. We present here a Bayesian method to explore the parameter space guided only by the probability laws underlying the χ2 figure of merit. The presented method does not get stuck in local minima of the χ2 landscape as it usually happens with classical minimization procedures. Moreover correlations between parameters are taken into account in a natural way. Finally, parameters are obtained as probability distribution functions so that all the complexity of the parameter space is shown.

  9. Variational Bayesian causal connectivity analysis for fMRI

    PubMed Central

    Luessi, Martin; Babacan, S. Derin; Molina, Rafael; Booth, James R.; Katsaggelos, Aggelos K.

    2014-01-01

    The ability to accurately estimate effective connectivity among brain regions from neuroimaging data could help answer many open questions in neuroscience. We propose a method which uses causality to obtain a measure of effective connectivity from fMRI data. The method uses a vector autoregressive model for the latent variables describing neuronal activity in combination with a linear observation model based on a convolution with a hemodynamic response function. Due to the employed modeling, it is possible to efficiently estimate all latent variables of the model using a variational Bayesian inference algorithm. The computational efficiency of the method enables us to apply it to large scale problems with high sampling rates and several hundred regions of interest. We use a comprehensive empirical evaluation with synthetic and real fMRI data to evaluate the performance of our method under various conditions. PMID:24847244

  10. Micronutrients in HIV: A Bayesian Meta-Analysis

    PubMed Central

    Carter, George M.; Indyk, Debbie; Johnson, Matthew; Andreae, Michael; Suslov, Kathryn; Busani, Sudharani; Esmaeili, Aryan; Sacks, Henry S.

    2015-01-01

    Background: Approximately 28.5 million people living with HIV are eligible for treatment (CD4<500) but currently have no access to antiretroviral therapy. Reduced serum levels of micronutrients are common in HIV disease. Micronutrient supplementation (MNS) may mitigate disease progression and mortality. Objectives: We synthesized evidence on the effect of micronutrient supplementation on mortality and the rate of disease progression in HIV disease. Methods: We searched the MEDLINE, EMBASE, Cochrane Central, AMED and CINAHL databases through December 2014, without language restriction, for studies of more than three micronutrients versus any or no comparator. We built a hierarchical Bayesian random-effects model to synthesize results. Inferences are based on the posterior distribution of the population effects; posterior distributions were approximated by Markov chain Monte Carlo in OpenBUGS. Principal Findings: From 2166 initial references, we selected 49 studies for full review and identified eight reporting on disease progression and/or mortality. Bayesian synthesis of data from 2,249 adults in three studies estimated the relative risk of disease progression in subjects on MNS vs. control as 0.62 (95% credible interval, 0.37, 0.96). The median number needed to treat is 8.4 (4.8, 29.9) and the Bayes factor 53.4. Based on mortality data for 4,095 adults in seven randomized controlled studies, the RR was 0.84 (0.38, 1.85) and the NNT 25 (4.3, ∞). Conclusions: MNS significantly and substantially slows disease progression in HIV+ adults not on ARV and possibly reduces mortality. Micronutrient supplements are effective in reducing progression, with a posterior probability of 97.9%. Considering the low cost of MNS and its lack of adverse effects, MNS should be standard of care for HIV+ adults not yet on ARV. PMID:25830916
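
    The normal-normal random-effects synthesis underlying such a meta-analysis admits a compact Gibbs sampler. The sketch below uses illustrative study values, not the data of this meta-analysis:

```python
import numpy as np

rng = np.random.default_rng(13)

# Illustrative study-level log relative risks and standard errors
# (placeholders, NOT the data of this meta-analysis)
y = np.array([-0.48, -0.30, -0.65])
se = np.array([0.20, 0.25, 0.30])
k, a0, b0 = y.size, 1.0, 0.1              # weak InvGamma(a0, b0) prior on tau^2

mu, tau2, mus = 0.0, 0.1, []
for it in range(20_000):
    # study effects theta_i | mu, tau2: conjugate normal updates
    prec = 1 / se**2 + 1 / tau2
    theta = rng.normal((y / se**2 + mu / tau2) / prec, np.sqrt(1 / prec))
    # pooled effect mu | theta, tau2 (flat prior)
    mu = rng.normal(theta.mean(), np.sqrt(tau2 / k))
    # heterogeneity tau2 | theta, mu: inverse-gamma update
    tau2 = 1 / rng.gamma(a0 + k / 2, 1 / (b0 + 0.5 * ((theta - mu) ** 2).sum()))
    if it >= 5_000:
        mus.append(mu)

rr = np.exp(np.array(mus))
print(np.median(rr), np.percentile(rr, [2.5, 97.5]))  # pooled RR and 95% CrI
```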

  11. Uncertainty Analysis in Fatigue Life Prediction of Gas Turbine Blades Using Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Li, Yan-Feng; Zhu, Shun-Peng; Li, Jing; Peng, Weiwen; Huang, Hong-Zhong

    2015-12-01

    This paper investigates Bayesian model selection for fatigue life estimation of gas turbine blades considering model uncertainty and parameter uncertainty. Fatigue life estimation of gas turbine blades is a critical issue for the operation and health management of modern aircraft engines. Since many life prediction models have been proposed for the fatigue life of gas turbine blades, model uncertainty and model selection among these models have become important issues in the lifecycle management of turbine blades. In this paper, fatigue life estimation is carried out by considering model uncertainty and parameter uncertainty simultaneously. It is formulated as the joint posterior distribution of a fatigue life prediction model and its model parameters using the Bayesian inference method. The Bayes factor is incorporated to implement model selection with the quantified model uncertainty. The Markov chain Monte Carlo method is used to facilitate the calculation. A pictorial framework and a step-by-step procedure of the Bayesian inference method for fatigue life estimation considering model uncertainty are presented. Fatigue life estimation of a gas turbine blade is implemented to demonstrate the proposed method.

  12. A Pragmatic Bayesian Perspective on Correlation Analysis. The exoplanetary gravity - stellar activity case

    NASA Astrophysics Data System (ADS)

    Figueira, P.; Faria, J. P.; Adibekyan, V. Zh.; Oshagh, M.; Santos, N. C.

    2016-11-01

    We apply the Bayesian framework to assess the presence of a correlation between two quantities. To do so, we estimate the probability distribution of the parameter of interest, ρ, characterizing the strength of the correlation. We provide an implementation of these ideas and concepts using the Python programming language and the PyMC module in a very short (~130 lines of code, heavily commented) and user-friendly program. We used this tool to assess the presence and properties of the correlation between planetary surface gravity and stellar activity level as measured by the log(R'_HK) indicator. The results of the Bayesian analysis are qualitatively similar to those obtained via p-value analysis, and support the presence of a correlation in the data. The results are more robust in their derivation and more informative, revealing interesting features such as asymmetric posterior distributions or markedly different credible intervals, and allowing for a deeper exploration. We encourage the reader interested in this kind of problem to apply our code to his/her own scientific problems. The full understanding of what the Bayesian framework is can only be gained through the insight that comes by handling priors, assessing the convergence of Monte Carlo runs, and a multitude of other practical problems. We hope to contribute so that Bayesian analysis becomes a tool in the toolkit of researchers, and they understand by experience its advantages and limitations.
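
    The flavor of the approach can be reproduced without the PyMC machinery: for standardized data (zero means and unit variances assumed known, a simplification of this sketch), the posterior of ρ under a uniform prior can be evaluated on a grid:

```python
import numpy as np

rng = np.random.default_rng(17)

# Hypothetical standardized data with a true correlation of 0.5
n, rho_true = 40, 0.5
x, y = rng.multivariate_normal([0, 0], [[1, rho_true], [rho_true, 1]], size=n).T

# Grid posterior of rho under a uniform prior on (-1, 1), assuming zero
# means and unit variances are known (the simplifying assumption here)
rho = np.linspace(-0.999, 0.999, 4001)
s_xx, s_yy, s_xy = (x * x).sum(), (y * y).sum(), (x * y).sum()
loglik = -n / 2 * np.log(1 - rho**2) - (s_xx - 2 * rho * s_xy + s_yy) / (2 * (1 - rho**2))
post = np.exp(loglik - loglik.max())
drho = rho[1] - rho[0]
post /= post.sum() * drho                   # normalize to a density

mean = (rho * post).sum() * drho
cdf = np.cumsum(post) * drho
lo, hi = rho[np.searchsorted(cdf, 0.025)], rho[np.searchsorted(cdf, 0.975)]
print(f"posterior mean {mean:.2f}, 95% credible interval [{lo:.2f}, {hi:.2f}]")
```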

  13. A fully Bayesian before-after analysis of permeable friction course (PFC) pavement wet weather safety.

    PubMed

    Buddhavarapu, Prasad; Smit, Andre F; Prozzi, Jorge A

    2015-07-01

    Permeable friction course (PFC), a porous hot-mix asphalt, is typically applied to improve wet weather safety on high-speed roadways in Texas. In order to warrant expensive PFC construction, a statistical evaluation of its safety benefits is essential. Generally, the literature on the effectiveness of porous mixes in reducing wet-weather crashes is limited and often inconclusive. In this study, the safety effectiveness of PFC was evaluated using a fully Bayesian before-after safety analysis. First, two groups of road segments overlaid with PFC and non-PFC material were identified across Texas; the non-PFC or reference road segments selected were similar to their PFC counterparts in terms of site specific features. Second, a negative binomial data generating process was assumed to model the underlying distribution of crash counts of PFC and reference road segments to perform Bayesian inference on the safety effectiveness. A data-augmentation based computationally efficient algorithm was employed for a fully Bayesian estimation. The statistical analysis shows that PFC is not effective in reducing wet weather crashes. It should be noted that the findings of this study are in agreement with the existing literature, although these studies were not based on a fully Bayesian statistical analysis. Our study suggests that the safety effectiveness of PFC road surfaces, or any other safety infrastructure, largely relies on its interrelationship with the road user. The results suggest that the safety infrastructure must be properly used to reap the benefits of the substantial investments.

  14. Fully Bayesian inference for structural MRI: application to segmentation and statistical analysis of T2-hypointensities.

    PubMed

    Schmidt, Paul; Schmid, Volker J; Gaser, Christian; Buck, Dorothea; Bührlen, Susanne; Förschler, Annette; Mühlau, Mark

    2013-01-01

    Aiming at iron-related T2-hypointensity, which is related to normal aging and neurodegenerative processes, we here present two practicable approaches, based on Bayesian inference, for preprocessing and statistical analysis of a complex set of structural MRI data. In particular, Markov Chain Monte Carlo methods were used to simulate posterior distributions. First, we rendered a segmentation algorithm that uses outlier detection based on model checking techniques within a Bayesian mixture model. Second, we rendered an analytical tool comprising a Bayesian regression model with smoothness priors (in the form of Gaussian Markov random fields) mitigating the necessity to smooth data prior to statistical analysis. For validation, we used simulated data and MRI data of 27 healthy controls (age: [Formula: see text]; range, [Formula: see text]). We first observed robust segmentation of both simulated T2-hypointensities and gray-matter regions known to be T2-hypointense. Second, simulated data and images of segmented T2-hypointensity were analyzed. We found not only robust identification of simulated effects but also a biologically plausible age-related increase of T2-hypointensity primarily within the dentate nucleus but also within the globus pallidus, substantia nigra, and red nucleus. Our results indicate that fully Bayesian inference can successfully be applied for preprocessing and statistical analysis of structural MRI data.

  15. A Pragmatic Bayesian Perspective on Correlation Analysis : The exoplanetary gravity - stellar activity case.

    PubMed

    Figueira, P; Faria, J P; Adibekyan, V Zh; Oshagh, M; Santos, N C

    2016-11-01

    We apply the Bayesian framework to assess the presence of a correlation between two quantities. To do so, we estimate the probability distribution of the parameter of interest, ρ, characterizing the strength of the correlation. We provide an implementation of these ideas and concepts using the Python programming language and the PyMC module in a very short (~130 lines of code, heavily commented) and user-friendly program. We used this tool to assess the presence and properties of the correlation between planetary surface gravity and stellar activity level as measured by the log([Formula: see text]) indicator. The results of the Bayesian analysis are qualitatively similar to those obtained via p-value analysis, and support the presence of a correlation in the data. The results are more robust in their derivation and more informative, revealing interesting features such as asymmetric posterior distributions or markedly different credible intervals, and allowing for a deeper exploration. We encourage the reader interested in this kind of problem to apply our code to his/her own scientific problems. The full understanding of what the Bayesian framework is can only be gained through the insight that comes by handling priors, assessing the convergence of Monte Carlo runs, and a multitude of other practical problems. We hope to contribute so that Bayesian analysis becomes a tool in the toolkit of researchers, and they understand by experience its advantages and limitations.

  16. Bayesian dose-response analysis for epidemiological studies with complex uncertainty in dose estimation.

    PubMed

    Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian E; Simon, Steven L

    2016-02-10

    Most conventional risk analysis methods rely on a single best estimate of exposure per person, which does not allow for adjustment for exposure-related uncertainty. Here, we propose a Bayesian model averaging method to properly quantify the relationship between radiation dose and disease outcomes by accounting for shared and unshared uncertainty in estimated dose. Our Bayesian risk analysis method utilizes multiple realizations of sets (vectors) of doses generated by a two-dimensional Monte Carlo simulation method that properly separates shared and unshared errors in dose estimation. The exposure model used in this work is taken from a study of the risk of thyroid nodules among a cohort of 2376 subjects who were exposed to fallout from nuclear testing in Kazakhstan. We assessed the performance of our method through an extensive series of simulations and comparisons against conventional regression risk analysis methods. When the estimated doses contain relatively small amounts of uncertainty, the Bayesian method using multiple a priori plausible draws of dose vectors gave similar results to the conventional regression-based methods of dose-response analysis. However, when large and complex mixtures of shared and unshared uncertainties are present, the Bayesian method using multiple dose vectors had significantly lower relative bias than conventional regression-based risk analysis methods and better coverage, that is, a markedly increased capability to include the true risk coefficient within the 95% credible interval of the Bayesian-based risk estimate. An evaluation of the dose-response using our method is presented for an epidemiological study of thyroid disease following radiation exposure.
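
    The structure of the two-dimensional Monte Carlo dose simulation is easy to sketch: each realization draws one shared error applied to all subjects plus an unshared error per subject, yielding a set of equally plausible dose vectors. A schematic sketch with made-up error magnitudes:

```python
import numpy as np

rng = np.random.default_rng(19)
n_subjects, n_realizations = 2376, 500    # cohort size as in the abstract

true_dose = rng.lognormal(0.0, 0.7, n_subjects)       # hypothetical true doses

# Each realization draws ONE shared multiplicative error applied to every
# subject, plus an unshared error per subject (made-up magnitudes)
shared = rng.lognormal(0.0, 0.3, size=(n_realizations, 1))
unshared = rng.lognormal(0.0, 0.2, size=(n_realizations, n_subjects))
dose_vectors = true_dose * shared * unshared          # realizations x subjects

# Each row is one equally plausible dose vector; the Bayesian dose-response
# analysis averages over these vectors instead of using a single estimate
print(dose_vectors.shape, dose_vectors.mean(), dose_vectors.std())
```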

  17. Coronal joint spaces of the Temporomandibular joint: Systematic review and meta-analysis

    PubMed Central

    Silva, Joana-Cristina; Pires, Carlos A.; Ponces-Ramalhão, Maria-João-Feio; Lopes, Jorge-Dias

    2015-01-01

    Introduction: Joint space measurements of the temporomandibular joint have been used to determine variation in condyle position. The aim of this study is therefore to perform a systematic review and meta-analysis of coronal joint space measurements of the temporomandibular joint. Material and Methods: An electronic database search was performed with the terms “condylar position”; “joint space” AND “TMJ”. Inclusion criteria were: tomographic 3D imaging of the TMJ and presentation of at least two joint space measurements in the coronal plane. Exclusion criteria were: mandibular fractures, animal studies, surgery, presence of genetic or chronic diseases, case reports, opinion or debate articles, or unpublished material. The risk of bias of each study was judged as high, moderate or low according to the Cochrane risk of bias tool. The values used in the meta-analysis were the medial, superior and lateral joint space measurements and their differences between the right and left joints. Results: From the initial search, 2706 articles were retrieved. After excluding duplicates and all studies that did not match the eligibility criteria, 4 articles qualified for final review. All retrieved articles were judged as low level of evidence. All reviewed studies were included in the meta-analysis, which concluded that the mean coronal joint space values were: medial 2.94 mm, superior 2.55 mm and lateral 2.16 mm. Conclusions: The analysis also showed high levels of heterogeneity. Comparison of right and left sides did not show statistically significant differences. Key words: Temporomandibular joint, systematic review, meta-analysis. PMID:26330944

  18. APPLICATION OF PRINCIPAL COMPONENT ANALYSIS AND BAYESIAN DECOMPOSITION TO RELAXOGRAPHIC IMAGING

    SciTech Connect

    OCHS,M.F.; STOYANOVA,R.S.; BROWN,T.R.; ROONEY,W.D.; LI,X.; LEE,J.H.; SPRINGER,C.S.

    1999-05-22

    Recent developments in high field imaging have made possible the acquisition of high quality, low noise relaxographic data in reasonable imaging times. The datasets comprise a huge amount of information (>>1 million points) which makes rigorous analysis daunting. Here, the authors present results demonstrating that Principal Component Analysis (PCA) and Bayesian Decomposition (BD) provide powerful methods for relaxographic analysis of T1 recovery curves and editing of tissue type in resulting images.
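
    As a toy illustration of the PCA step (invented T1 values and mixing fractions, not the authors' data or code), inversion-recovery curves from a two-tissue mixture can be decomposed with an SVD:

      import numpy as np

      # Synthetic inversion-recovery curves: S(t) = 1 - 2*exp(-t/T1), two tissue
      # types mixed per voxel; T1 values and mixing are illustrative only.
      t = np.linspace(0.05, 5.0, 32)                                  # recovery times (s)
      basis = np.stack([1 - 2*np.exp(-t/0.9), 1 - 2*np.exp(-t/1.4)])  # two T1 components
      rng = np.random.default_rng(1)
      mix = rng.dirichlet([1, 1], size=10000)                         # per-voxel fractions
      data = mix @ basis + rng.normal(0, 0.01, (10000, t.size))

      # PCA via SVD of the mean-centred data: a few components capture the curves.
      centred = data - data.mean(axis=0)
      U, s, Vt = np.linalg.svd(centred, full_matrices=False)
      explained = s**2 / np.sum(s**2)
      print("variance explained by first two PCs:", explained[:2])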

  19. Competing risk models in reliability systems, a Weibull distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may be from many causes depending upon the age and the environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation analyses allow us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation analyses to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using both Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another changes the value of the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimation methods are better than those of maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
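
    A compact sketch of the setting (independent Weibull causes, observed time = earliest failure) with a random-walk Metropolis posterior under flat priors on the log parameters; all parameter values are illustrative, and the paper's own simulation design may differ.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      # Two independent Weibull failure causes (shape k, scale lam); values invented.
      k1, l1, k2, l2 = 1.5, 10.0, 0.8, 15.0
      t1 = l1 * rng.weibull(k1, 200)
      t2 = l2 * rng.weibull(k2, 200)
      t = np.minimum(t1, t2)                 # observed failure time
      cause = (t2 < t1).astype(int)          # 0 -> cause 1 failed first, 1 -> cause 2

      def loglik(theta):
          """Competing-risks likelihood: density of the failing cause times the
          survival of the other cause."""
          k1, l1, k2, l2 = np.exp(theta)     # log scale keeps parameters positive
          ll = stats.weibull_min.logpdf(t, k1, scale=l1) * (cause == 0)
          ll = ll + stats.weibull_min.logsf(t, k1, scale=l1) * (cause == 1)
          ll = ll + stats.weibull_min.logpdf(t, k2, scale=l2) * (cause == 1)
          ll = ll + stats.weibull_min.logsf(t, k2, scale=l2) * (cause == 0)
          return ll.sum()

      theta = np.log([1.0, 8.0, 1.0, 8.0])
      samples, cur = [], loglik(theta)
      for _ in range(10000):                 # random-walk Metropolis
          prop = theta + rng.normal(0, 0.05, 4)
          new = loglik(prop)
          if np.log(rng.uniform()) < new - cur:
              theta, cur = prop, new
          samples.append(theta)
      post = np.exp(np.array(samples[2000:]))
      print("posterior means (k1, lam1, k2, lam2):", post.mean(axis=0).round(2))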

  20. Toward an ecological analysis of Bayesian inferences: how task characteristics influence responses

    PubMed Central

    Hafenbrädl, Sebastian; Hoffrage, Ulrich

    2015-01-01

    In research on Bayesian inferences, the specific tasks, with their narratives and characteristics, are typically seen as exchangeable vehicles that merely transport the structure of the problem to research participants. In the present paper, we explore whether, and possibly how, task characteristics that are usually ignored influence participants’ responses in these tasks. We focus both on quantitative dimensions of the tasks, such as their base rates, hit rates, and false-alarm rates, and on qualitative characteristics, such as whether the task involves a norm violation or not, whether the stakes are high or low, and whether the focus is on the individual case or on the numbers. Using a data set of 19 different tasks presented to 500 different participants who provided a total of 1,773 responses, we analyze these responses in two ways: first, on the level of the numerical estimates themselves, and second, on the level of various response strategies, Bayesian and non-Bayesian, that might have produced the estimates. We identified various contingencies, and most of the task characteristics had an influence on participants’ responses. Typically, this influence was stronger when the numerical information in the tasks was presented in terms of probabilities or percentages, compared to natural frequencies – and this effect cannot be fully explained by a higher proportion of Bayesian responses when natural frequencies were used. One characteristic that did not seem to influence participants’ response strategy was the numerical value of the Bayesian solution itself. Our exploratory study is a first step toward an ecological analysis of Bayesian inferences, and highlights new avenues for future research. PMID:26300791
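
    For concreteness, the normative Bayesian solution to such a task is a one-line application of Bayes' rule to the three quantities; the numbers below are the classic illustrative mammography values, not data from this study.

      def bayes_posterior(base_rate, hit_rate, false_alarm_rate):
          """P(condition | positive cue) from the three task quantities."""
          joint_true = base_rate * hit_rate
          joint_false = (1 - base_rate) * false_alarm_rate
          return joint_true / (joint_true + joint_false)

      # 1% base rate, 80% hit rate, 9.6% false-alarm rate -> posterior of about 7.8%.
      print(bayes_posterior(0.01, 0.80, 0.096))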

  1. Joint spatial analysis of gastrointestinal infectious diseases.

    PubMed

    Held, Leonhard; Graziano, Giusi; Frank, Christina; Rue, Håvard

    2006-10-01

    A major obstacle in the spatial analysis of infectious disease surveillance data is the problem of under-reporting. This article investigates the possibility of inferring reporting rates through joint statistical modelling of several infectious diseases with different aetiologies. Once variation in under-reporting can be estimated, geographic risk patterns for infections associated with specific food vehicles may be discerned. We adapt to the infectious disease setting the shared component model proposed by Knorr-Held and Best for two chronic diseases and further extended to more than two chronic diseases by Held et al. (Towards joint disease mapping. Statistical Methods in Medical Research 2005b; 14: 61-82). Our goal is to estimate a shared component, common to all diseases, which may be interpreted as representing the spatial variation in reporting rates. Additional components are introduced to describe the real spatial variation of the different diseases. Of course, this interpretation is only allowed under specific assumptions; in particular, the geographical variation in under-reporting should be similar for the diseases considered. In addition, it is vital that the data do not contain large local outbreaks, so an adjustment based on a time series method recently proposed by Held et al. (A statistical framework for the analysis of multivariate infectious disease surveillance data. Statistical Modelling 2005a; 5: 187-99) is made at a preliminary stage. We illustrate our approach through the analysis of gastrointestinal disease notification data obtained from the German infectious disease surveillance system, administered by the Robert Koch Institute in Berlin.

  2. Bayesian inversion analysis of nonlinear dynamics in surface heterogeneous reactions

    NASA Astrophysics Data System (ADS)

    Omori, Toshiaki; Kuwatani, Tatsu; Okamoto, Atsushi; Hukushima, Koji

    2016-09-01

    It is essential to extract nonlinear dynamics from time-series data as an inverse problem in the natural sciences. We propose a Bayesian statistical framework for extracting the nonlinear dynamics of surface heterogeneous reactions from sparse and noisy observable data. Surface heterogeneous reactions are chemical reactions with conjugation of multiple phases, and the effect of the surface area between phases gives their dynamics an intrinsic nonlinearity. We adapt a belief propagation method and an expectation-maximization (EM) algorithm to the partial observation problem, in order to simultaneously estimate the time course of the hidden variables and the kinetic parameters underlying the dynamics. The belief propagation step is performed with a sequential Monte Carlo algorithm so that the nonlinear dynamical system can be estimated. Using the proposed method, we show that the rate constants of dissolution and precipitation reactions, which are typical examples of surface heterogeneous reactions, as well as the temporal changes of solid reactants and products, can be successfully estimated from only the observable temporal changes in the concentration of the dissolved intermediate product.
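
    A minimal sketch of the sequential Monte Carlo ingredient (a toy one-variable reaction with invented kinetics, not the authors' reaction system): propagate particles through the nonlinear dynamics, weight them by the observation likelihood, and resample.

      import numpy as np

      rng = np.random.default_rng(3)
      # Toy surrogate for a partially observed nonlinear reaction: hidden state x
      # decays nonlinearly with rate kappa; only a noisy version of x is observed.
      kappa, dt, T, n_particles = 0.8, 0.1, 100, 2000
      x_true = np.empty(T); x_true[0] = 1.0
      for k in range(1, T):
          x_true[k] = x_true[k-1] - kappa * x_true[k-1]**2 * dt   # nonlinear kinetics
      y = x_true + rng.normal(0, 0.05, T)                         # noisy observations

      # Bootstrap particle filter: propagate, weight by the likelihood, resample.
      particles = rng.uniform(0.5, 1.5, n_particles)
      est = []
      for k in range(T):
          if k > 0:
              particles = particles - kappa * particles**2 * dt \
                          + rng.normal(0, 0.01, n_particles)      # process noise
          w = np.exp(-0.5 * ((y[k] - particles) / 0.05)**2)
          w /= w.sum()
          est.append(np.sum(w * particles))
          idx = rng.choice(n_particles, n_particles, p=w)         # multinomial resampling
          particles = particles[idx]
      print("final state estimate vs truth:", round(est[-1], 3), round(x_true[-1], 3))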

  3. Bayesian Analysis of the Mass Distribution of Neutron Stars

    NASA Astrophysics Data System (ADS)

    Valentim, Rodolfo; Horvath, Jorge E.; Rangel, Eraldo M.

    The distribution of masses for neutron stars is analyzed using Bayesian statistical inference, evaluating the likelihood of a two-Gaussian-peak a priori distribution against fifty-five measured masses obtained in a variety of systems. The results strongly suggest the existence of a bimodal distribution of the masses, with the first peak around 1.35M⊙ ± 0.06M⊙ and a much wider second peak at 1.73M⊙ ± 0.36M⊙. We compared the two-Gaussian model, with peaks centered at 1.35M⊙ and 1.55M⊙, against a "single Gaussian" model with 1.50M⊙ ± 0.11M⊙ (3σ), which provides a wide peak covering the full range of observed masses. To compare the models, the BIC (Bayesian Information Criterion) can be used, and strong evidence was found for the two-distribution model against the one-peak model. The results support earlier views relating the first two peaks to the different evolutionary histories of their members, which produces a natural separation (in spite of the fact that no attempt to "label" the systems has been made). However, the recently claimed low-mass group, possibly related to O - Mg - Ne core collapse events, has a monotonically decreasing likelihood and was not identified within this sample.
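
    A rough sketch of the model comparison (synthetic masses drawn near the reported peaks; the study's own likelihood analysis is more careful): fit one- and two-component Gaussian mixtures and compare information criteria.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(4)
      # Synthetic masses resembling the reported peaks (1.35 and 1.73 solar masses);
      # the real study used 55 measured masses.
      masses = np.concatenate([rng.normal(1.35, 0.06, 35),
                               rng.normal(1.73, 0.36, 20)]).reshape(-1, 1)

      for n in (1, 2):
          gm = GaussianMixture(n_components=n, random_state=0).fit(masses)
          # Lower BIC indicates the preferred model (penalizes extra parameters).
          print(f"{n} component(s): BIC = {gm.bic(masses):.1f}",
                "means =", gm.means_.ravel().round(2))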

  4. Multi-Class Sparse Bayesian Regression for Neuroimaging Data Analysis

    NASA Astrophysics Data System (ADS)

    Michel, Vincent; Eger, Evelyn; Keribin, Christine; Thirion, Bertrand

    The use of machine learning tools is gaining popularity in neuroimaging, as it provides a sensitive assessment of the information conveyed by brain images. In particular, finding regions of the brain whose functional signal reliably predicts some behavioral information makes it possible to better understand how this information is encoded or processed in the brain. However, such a prediction is performed through regression or classification algorithms that suffer from the curse of dimensionality, because a huge number of features (i.e. voxels) are available to fit some target, with very few samples (i.e. scans) to learn the informative regions. A commonly used solution is to regularize the weights of the parametric prediction function. However, model specification needs a careful design to balance adaptiveness and sparsity. In this paper, we introduce a novel method, Multi-Class Sparse Bayesian Regression (MCBR), that generalizes classical approaches such as Ridge regression and Automatic Relevance Determination. Our approach is based on a grouping of the features into several classes, where each class is regularized with specific parameters. We apply our algorithm to the prediction of a behavioral variable from brain activation images. The method presented here achieves prediction accuracies similar to those of reference methods, and yields more interpretable feature loadings.
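
    MCBR itself is not sketched here, but the two classical endpoints it generalizes can be contrasted on a voxels-much-greater-than-scans problem (synthetic data): ridge regularizes all features equally, while ARD learns one relevance parameter per feature.

      import numpy as np
      from sklearn.linear_model import Ridge, ARDRegression

      rng = np.random.default_rng(5)
      # Many "voxels", few "scans": 50 samples x 200 features, only 5 informative.
      X = rng.normal(size=(50, 200))
      w_true = np.zeros(200); w_true[:5] = 2.0
      y = X @ w_true + rng.normal(0, 0.5, 50)

      ridge = Ridge(alpha=1.0).fit(X, y)
      ard = ARDRegression().fit(X, y)   # per-feature relevance hyperparameters
      print("ridge weights above 0.5:", int(np.sum(np.abs(ridge.coef_) > 0.5)))
      print("ARD weights above 0.5:  ", int(np.sum(np.abs(ard.coef_) > 0.5)))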

  5. Heterogeneous multimodal biomarkers analysis for Alzheimer's disease via Bayesian network.

    PubMed

    Jin, Yan; Su, Yi; Zhou, Xiao-Hua; Huang, Shuai

    2016-12-01

    By 2050, it is estimated that the number of worldwide Alzheimer's disease (AD) patients will quadruple from the current number of 36 million, while no proven disease-modifying treatments are available. At present, the underlying disease mechanisms remain under investigation, and recent studies suggest that the disease involves multiple etiological pathways. To better understand the disease and develop treatment strategies, a number of ongoing studies including the Alzheimer's Disease Neuroimaging Initiative (ADNI) enroll many study participants and acquire a large number of biomarkers from various modalities including demographics, genotyping, fluid biomarkers, neuroimaging, neuropsychometric tests, and clinical assessments. However, a systematic approach that can integrate all the collected data is lacking. The overarching goal of our study is to use machine learning techniques to understand the relationships among different biomarkers and to establish a system-level model that can better describe the interactions among biomarkers and provide superior diagnostic and prognostic information. In this pilot study, we use a Bayesian network (BN) to analyze multimodal data from ADNI, including demographics, volumetric MRI, PET, genotypes, and neuropsychometric measurements, and demonstrate that our approach has superior prediction accuracy.

  6. Klebsiella pneumoniae blaKPC-3 nosocomial epidemic: Bayesian and evolutionary analysis.

    PubMed

    Angeletti, Silvia; Presti, Alessandra Lo; Cella, Eleonora; Fogolari, Marta; De Florio, Lucia; Dedej, Etleva; Blasi, Aletheia; Milano, Teresa; Pascarella, Stefano; Incalzi, Raffaele Antonelli; Coppola, Roberto; Dicuonzo, Giordano; Ciccozzi, Massimo

    2016-12-01

    K. pneumoniae isolates carrying the blaKPC-3 gene were collected to perform Bayesian phylogenetic and selective pressure analyses and to apply homology modeling to the KPC-3 protein. A dataset of 44 blaKPC-3 gene sequences from clinical isolates of K. pneumoniae was used for the Bayesian phylogenetic and selective pressure analyses and for homology modeling. The mean evolutionary rate for the blaKPC-3 gene was 2.67×10⁻³ substitutions/site/year (95% HPD: 3.4×10⁻⁴ to 5.59×10⁻³). The root of the Bayesian tree dated back to the year 2011 (95% HPD: 2007-2012). Two main clades (I and II) were identified. The population dynamics analysis showed exponential growth from 2011 to 2013 followed by a plateau. The phylogeographic reconstruction showed that the root of the tree had a probable common ancestor in the general surgery ward. Selective pressure analysis revealed twelve positively selected sites. Structural analysis of the KPC-3 protein predicted that the amino acid mutations are destabilizing for the protein and could alter its substrate specificity. Phylogenetic analysis and homology modeling of the blaKPC-3 gene could represent a useful tool to follow KPC spread in the nosocomial setting and to detect amino acid substitutions that alter substrate specificity.

  7. PFG NMR and Bayesian analysis to characterise non-Newtonian fluids

    NASA Astrophysics Data System (ADS)

    Blythe, Thomas W.; Sederman, Andrew J.; Stitt, E. Hugh; York, Andrew P. E.; Gladden, Lynn F.

    2017-01-01

    Many industrial flow processes are sensitive to changes in the rheological behaviour of process fluids, and there therefore exists a need for methods that provide online, or inline, rheological characterisation for process control and optimisation over timescales of minutes or less. Nuclear magnetic resonance (NMR) offers a non-invasive technique for this application, without limitation on optical opacity. We present a Bayesian analysis approach using pulsed field gradient (PFG) NMR to enable estimation of the rheological parameters of Herschel-Bulkley fluids in a pipe flow geometry, characterised by a flow behaviour index n, yield stress τ0, and consistency factor k, by analysis of the signal in q-space. This approach eliminates the need for velocity image acquisition and expensive gradient hardware. We investigate the robustness of the proposed Bayesian NMR approach to noisy data and reduced sampling using simulated NMR data and show that even with a signal-to-noise ratio (SNR) of 100, only 16 sampled points are required to provide rheological parameters accurate to within 2% of the ground truth. Experimental validation is provided through a case study on Carbopol 940 solutions (model Herschel-Bulkley fluids) using PFG NMR at a 1H resonance frequency of 85.2 MHz; for SNR > 1000, only 8 sampled points are required. This corresponds to a total acquisition time of <60 s and represents an 88% reduction in acquisition time compared to MR flow imaging. Comparison of the shear stress-shear rate relationship, quantified using Bayesian NMR, with non-Bayesian NMR methods demonstrates that the Bayesian NMR approach agrees with MR flow imaging to within the accuracy of the measurement. Furthermore, as we increase the concentration of Carbopol 940 we observe a change in rheological characteristics, probably due to shear history-dependent behaviour and the different geometries used. This behaviour highlights the need for
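
    For orientation (not the authors' q-space analysis; geometry and parameters are invented), the forward model is the Herschel-Bulkley laminar velocity profile in a pipe, which can be computed by integrating the shear rate inward from the wall.

      import numpy as np

      def hb_velocity_profile(r, R, dpdx, tau0, k, n):
          """Herschel-Bulkley laminar pipe flow: tau(r) = tau_w * r / R, plug flow
          where tau <= tau0; v(r) is the integral of the shear rate from r to R."""
          tau_w = 0.5 * abs(dpdx) * R                  # wall shear stress
          tau = tau_w * r / R
          gamma = np.where(tau > tau0, ((tau - tau0) / k)**(1.0 / n), 0.0)
          # Cumulative trapezoid gives the integral from 0 to r; subtract from total.
          cum = np.concatenate([[0.0],
                                np.cumsum(0.5 * (gamma[1:] + gamma[:-1]) * np.diff(r))])
          return cum[-1] - cum                         # zero at the wall, max in the plug

      r = np.linspace(0.0, 0.01, 500)                  # radii up to R = 1 cm
      v = hb_velocity_profile(r, R=0.01, dpdx=2.0e4, tau0=10.0, k=2.0, n=0.5)
      print("centreline velocity (m/s):", round(float(v[0]), 4))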

  8. An intake prior for the Bayesian analysis of plutonium and uranium exposures in an epidemiology study.

    PubMed

    Puncher, M; Birchall, A; Bull, R K

    2014-12-01

    In Bayesian inference, the initial knowledge regarding the value of a parameter, before additional data are considered, is represented as a prior probability distribution. This paper describes the derivation of a prior distribution of intake that was used for the Bayesian analysis of plutonium and uranium worker doses in a recent epidemiology study. The chosen distribution is log-normal with a geometric standard deviation of 6 and a median value that is derived for each worker based on the duration of the work history and the number of reported acute intakes. The median value is a function of the work history and a constant related to activity in air concentration, M, which is derived separately for uranium and plutonium. The value of M is based primarily on measurements of plutonium and uranium in air derived from historical personal air sampler (PAS) data. However, there is significant uncertainty on the value of M that results from the paucity of PAS data and from extrapolating these measurements to actual intakes. This paper compares posterior and prior distributions of intake and investigates the sensitivity of the Bayesian analyses to the assumed value of M. It is found that varying M by a factor of 10 results in a much smaller factor of 2 variation in mean intake and lung dose for both plutonium and uranium. It is concluded that if a log-normal distribution is considered to adequately represent worker intakes, then the Bayesian posterior distribution of dose is relatively insensitive to the assumed value of M.
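
    A toy version of the sensitivity check (all numbers invented; the real analysis uses biokinetic models and bioassay data): place a log-normal prior with median M and geometric SD 6 on intake, update with a single measurement, and vary M by a factor of 10 either way.

      import numpy as np

      def posterior_mean_intake(M, gsd=6.0, obs=50.0, obs_sd=15.0):
          """Grid posterior for one worker's intake: log-normal prior with median M
          and geometric SD gsd, Gaussian likelihood for a single bioassay result."""
          intake = np.logspace(-1, 4, 2000)
          prior = np.exp(-0.5 * (np.log(intake / M) / np.log(gsd))**2) / intake
          like = np.exp(-0.5 * ((obs - intake) / obs_sd)**2)
          post, dx = prior * like, np.gradient(intake)
          post /= np.sum(post * dx)                    # normalise on the grid
          return np.sum(intake * post * dx)

      for M in (5.0, 50.0, 500.0):
          print(f"prior median {M:6.1f} -> posterior mean {posterior_mean_intake(M):.1f}")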

  9. A problem in particle physics and its Bayesian analysis

    NASA Astrophysics Data System (ADS)

    Landon, Joshua

    An up-and-coming field in contemporary nuclear and particle physics is "Lattice Quantum Chromodynamics", henceforth Lattice QCD. Indeed the 2004 Nobel Prize in Physics went to the developers of equations that describe QCD. In this dissertation, following a layperson's introduction to the structure of matter, we outline the statistical aspects of a problem in Lattice QCD faced by particle physicists, and point out the difficulties encountered by them in trying to address the problem. The difficulties stem from the fact that one is required to estimate a large -- conceptually infinite -- number of parameters based on a finite number of non-linear equations, each of which is a sum of exponential functions. We then present a plausible approach for solving the problem. Our approach is Bayesian and is driven by a computationally intensive Markov Chain Monte Carlo based solution. However, in order to invoke our approach we first look at the underlying anatomy of the problem and synthesize its essentials. These essentials reveal a pattern that can be harnessed via some assumptions, and this in turn enables us to outline a pathway towards a solution. We demonstrate the viability of our approach via simulated data, followed by its validation against real data provided to us by our physicist colleagues. Our approach yields results that in the past were not obtainable via alternate approaches. The contribution of this dissertation is two-fold. The first is a use of computationally intensive statistical technology to produce results in physics that could not be obtained using physics based techniques. Since the statistical architecture of the problem considered here can arise in other contexts as well, the second contribution of this dissertation is to indicate a plausible approach for addressing a generic class of problems wherein the number of parameters to be estimated exceeds the number of constraints, each constraint being a non-linear equation that is the sum of

  10. A hierarchical Bayesian approach to the modified Bartlett-Lewis rectangular pulse model for a joint estimation of model parameters across stations

    NASA Astrophysics Data System (ADS)

    Kim, Jang-Gyeong; Kwon, Hyun-Han; Kim, Dongkyun

    2017-01-01

    Poisson cluster stochastic rainfall generators (e.g., modified Bartlett-Lewis rectangular pulse, MBLRP) have been widely applied to generate synthetic sub-daily rainfall sequences. The MBLRP model reproduces the underlying distribution of the rainfall generating process. The existing optimization techniques are typically based on individual parameter estimates that treat each parameter as independent. However, parameter estimates sometimes compensate for the estimates of other parameters, which can cause high variability in the results if the covariance structure is not formally considered. Moreover, uncertainty associated with model parameters in the MBLRP rainfall generator is not usually addressed properly. Here, we develop a hierarchical Bayesian model (HBM)-based MBLRP model to jointly estimate parameters across weather stations and explicitly consider the covariance and uncertainty through a Bayesian framework. The model is tested using weather stations in South Korea. The HBM-based MBLRP model improves the identification of parameters with better reproduction of rainfall statistics at various temporal scales. Additionally, the spatial variability of the parameters across weather stations is substantially reduced compared to that of other methods.

  11. Contact analysis for riveted and bolted joints of composite laminates

    NASA Astrophysics Data System (ADS)

    Ye, Tian-Qi; Li, Wei; Shen, Guanqing

    The computational strategy and numerical technique developed are demonstrated to be efficient for the analysis of riveted and bolted joints of composite laminates. The 3D contact analysis provides more accurate results for evaluating the strength of mechanically fastened joints in composite structures. The method described can be extended to multibody contact problems; it has been implemented in computer codes.

  12. The effect of close relatives on unsupervised Bayesian clustering algorithms in population genetic structure analysis.

    PubMed

    Rodríguez-Ramilo, Silvia T; Wang, Jinliang

    2012-09-01

    The inference of population genetic structures is essential in many research areas in population genetics, conservation biology and evolutionary biology. Recently, unsupervised Bayesian clustering algorithms have been developed to detect a hidden population structure from genotypic data, assuming, among other things, that individuals taken from the population are unrelated. Under this assumption, markers in a sample taken from a subpopulation can be considered to be in Hardy-Weinberg and linkage equilibrium. However, close relatives might be sampled from the same subpopulation, and consequently, might cause Hardy-Weinberg and linkage disequilibrium and thus bias a population genetic structure analysis. In this study, we used simulated and real data to investigate the impact of close relatives in a sample on Bayesian population structure analysis. We also showed that, when close relatives were identified by a pedigree reconstruction approach and removed, the accuracy of a population genetic structure analysis can be greatly improved. The results indicate that unsupervised Bayesian clustering algorithms cannot be used blindly to detect genetic structure in a sample with closely related individuals. Rather, when closely related individuals are suspected to be frequent in a sample, these individuals should be first identified and removed before conducting a population structure analysis.

  13. Enhanced characterization of solid solitary pulmonary nodules with Bayesian analysis-based computer-aided diagnosis

    PubMed Central

    Perandini, Simone; Soardi, Gian Alberto; Motton, Massimiliano; Augelli, Raffaele; Dallaserra, Chiara; Puntel, Gino; Rossi, Arianna; Sala, Giuseppe; Signorini, Manuel; Spezia, Laura; Zamboni, Federico; Montemezzi, Stefania

    2016-01-01

    The aim of this study was to prospectively assess the accuracy gain of Bayesian analysis-based computer-aided diagnosis (CAD) vs human judgment alone in characterizing solitary pulmonary nodules (SPNs) at computed tomography (CT). The study included 100 randomly selected SPNs with a definitive diagnosis. Nodule features at first and follow-up CT scans as well as clinical data were evaluated individually on a 1-to-5-point risk chart by 7 radiologists, first blinded to and then aware of the Bayesian Inference Malignancy Calculator (BIMC) model predictions. Raters’ predictions were evaluated by means of receiver operating characteristic (ROC) curve analysis and decision analysis. Overall ROC area under the curve was 0.758 before and 0.803 after the disclosure of CAD predictions (P = 0.003). A net gain in diagnostic accuracy was found in 6 out of 7 readers. The mean risk class of benign nodules dropped from 2.48 to 2.29, while the mean risk class of malignancies rose from 3.66 to 3.92. Awareness of CAD predictions also produced a significant drop in the mean number of indeterminate SPNs (15 vs 23.86) and raised the mean number of correct and confident diagnoses (39.57 vs 25.71). This study provides evidence supporting the integration of the Bayesian analysis-based BIMC model in SPN characterization. PMID:27648166

  14. Empirical Markov Chain Monte Carlo Bayesian analysis of fMRI data.

    PubMed

    de Pasquale, F; Del Gratta, C; Romani, G L

    2008-08-01

    In this work an Empirical Markov Chain Monte Carlo Bayesian approach to analyse fMRI data is proposed. The Bayesian framework is appealing since complex models can be adopted in the analysis both for the image and noise model. Here, the noise autocorrelation is taken into account by adopting an AutoRegressive model of order one and a versatile non-linear model is assumed for the task-related activation. Model parameters include the noise variance and autocorrelation, activation amplitudes and the hemodynamic response function parameters. These are estimated at each voxel from samples of the Posterior Distribution. Prior information is included by means of a 4D spatio-temporal model for the interaction between neighbouring voxels in space and time. The results show that this model can provide smooth estimates from low SNR data while important spatial structures in the data can be preserved. A simulation study is presented in which the accuracy and bias of the estimates are addressed. Furthermore, some results on convergence diagnostics of the adopted algorithm are presented. To validate the proposed approach a comparison of the results with those from a standard GLM analysis, spatial filtering techniques and a Variational Bayes approach is provided. This comparison shows that our approach outperforms the classical analysis and is consistent with other Bayesian techniques. This is investigated further by means of the Bayes Factors and the analysis of the residuals. The proposed approach applied to Blocked Design and Event Related datasets produced reliable maps of activation.

  15. Bayesian Statistical Analysis Applied to NAA Data for Neutron Flux Spectrum Determination

    NASA Astrophysics Data System (ADS)

    Chiesa, D.; Previtali, E.; Sisti, M.

    2014-04-01

    In this paper, we present a statistical method, based on Bayesian statistics, to evaluate the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation analysis (NAA) experiment [A. Borio di Tigliole et al., Absolute flux measurement by NAA at the Pavia University TRIGA Mark II reactor facilities, ENC 2012 - Transactions Research Reactors, ISBN 978-92-95064-14-0, 22 (2012)] performed at the TRIGA Mark II reactor of Pavia University (Italy). In order to evaluate the neutron flux spectrum, subdivided in energy groups, we must solve a system of linear equations containing the grouped cross sections and the activation rate data. We solve this problem with Bayesian statistical analysis, including the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov Chain Monte Carlo (MCMC) simulations, is used to define the problem statistical model and solve it. The energy group fluxes and their uncertainties are then determined with great accuracy and the correlations between the groups are analyzed. Finally, the dependence of the results on the prior distribution choice and on the group cross section data is investigated to confirm the reliability of the analysis.
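
    The study solves the unfolding problem with MCMC over a hierarchical model; as a simpler conjugate sketch of the same idea (all cross sections, fluxes, and uncertainties invented), a Gaussian prior on the group fluxes gives a closed-form MAP estimate and posterior covariance.

      import numpy as np

      rng = np.random.default_rng(6)
      # Hypothetical 6 isotopes x 3 energy groups of grouped cross sections
      # (arbitrary units) and a "true" group flux.
      A = rng.uniform(0.1, 2.0, size=(6, 3))            # grouped cross sections
      phi_true = np.array([1.0, 3.0, 0.5])              # group fluxes
      a = A @ phi_true * (1 + rng.normal(0, 0.03, 6))   # measured activation rates

      Sigma_inv = np.diag(1.0 / (0.03 * (A @ phi_true))**2)  # measurement precision
      mu, P_inv = np.ones(3), np.eye(3) / 10.0**2            # weak a priori flux knowledge

      lhs = A.T @ Sigma_inv @ A + P_inv
      rhs = A.T @ Sigma_inv @ a + P_inv @ mu
      phi_map = np.linalg.solve(lhs, rhs)               # posterior mode (Gaussian case)
      cov = np.linalg.inv(lhs)                          # posterior covariance
      print("MAP group fluxes:", phi_map.round(3), "true:", phi_true)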

  16. Development of a clinical pathways analysis system with adaptive Bayesian nets and data mining techniques.

    PubMed

    Kopec, D; Shagas, G; Reinharth, D; Tamang, S

    2004-01-01

    The use and development of software in the medical field offers tremendous opportunities for making health care delivery more efficient, more effective, and less error-prone. We discuss and explore the use of clinical pathways analysis with Adaptive Bayesian Networks and Data Mining Techniques to perform such analyses. The computation of "lift" (a measure of completed pathways improvement potential) leads us to optimism regarding the potential for this approach.
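
    As a reminder of the metric (illustrative numbers only), lift compares how often two pathway events co-occur with how often they would co-occur under independence.

      def lift(support_ab, support_a, support_b):
          """Lift of association rule A -> B: observed co-occurrence relative to
          what independence would predict; values > 1 indicate positive association."""
          return support_ab / (support_a * support_b)

      # 30% of records contain both steps, each step appears in 50% of records
      # -> lift 1.2, i.e. 20% more co-occurrence than independence predicts.
      print(lift(0.30, 0.50, 0.50))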

  17. Bayesian Switching Factor Analysis for Estimating Time-varying Functional Connectivity in fMRI.

    PubMed

    Taghia, Jalil; Ryali, Srikanth; Chen, Tianwen; Supekar, Kaustubh; Cai, Weidong; Menon, Vinod

    2017-03-03

    There is growing interest in understanding the dynamical properties of functional interactions between distributed brain regions. However, robust estimation of temporal dynamics from functional magnetic resonance imaging (fMRI) data remains challenging due to limitations in extant multivariate methods for modeling time-varying functional interactions between multiple brain areas. Here, we develop a Bayesian generative model for fMRI time-series within the framework of hidden Markov models (HMMs). The model is a dynamic variant of the static factor analysis model (Ghahramani and Beal, 2000). We refer to this model as Bayesian switching factor analysis (BSFA) as it integrates factor analysis into a generative HMM in a unified Bayesian framework. In BSFA, brain dynamic functional networks are represented by latent states which are learnt from the data. Crucially, BSFA is a generative model which estimates the temporal evolution of brain states and transition probabilities between states as a function of time. An attractive feature of BSFA is the automatic determination of the number of latent states via Bayesian model selection arising from penalization of excessively complex models. Key features of BSFA are validated using extensive simulations on carefully designed synthetic data. We further validate BSFA using fingerprint analysis of multisession resting-state fMRI data from the Human Connectome Project (HCP). Our results show that modeling temporal dependencies in the generative model of BSFA results in improved fingerprinting of individual participants. Finally, we apply BSFA to elucidate the dynamic functional organization of the salience, central-executive, and default mode networks, three core neurocognitive systems with a central role in cognitive and affective information processing (Menon, 2011). Across two HCP sessions, we demonstrate a high level of dynamic interactions between these networks and determine that the salience network has the highest temporal

  18. Progressive Damage Analysis of Bonded Composite Joints

    NASA Technical Reports Server (NTRS)

    Leone, Frank A., Jr.; Girolamo, Donato; Davila, Carlos G.

    2012-01-01

    The present work is related to the development and application of progressive damage modeling techniques to bonded joint technology. The joint designs studied in this work include a conventional composite splice joint and a NASA-patented durable redundant joint. Both designs involve honeycomb sandwich structures with carbon/epoxy facesheets joined using adhesively bonded doublers. Progressive damage modeling allows for the prediction of the initiation and evolution of damage within a structure. For structures that include multiple material systems, such as the joint designs under consideration, the number of potential failure mechanisms that must be accounted for drastically increases the complexity of the analyses. Potential failure mechanisms include fiber fracture, intraply matrix cracking, delamination, core crushing, adhesive failure, and their interactions. The bonded joints were modeled using highly parametric, explicitly solved finite element models, with damage modeling implemented via custom user-written subroutines. Each ply was discretely meshed using three-dimensional solid elements. Layers of cohesive elements were included between each ply to account for the possibility of delaminations and were used to model the adhesive layers forming the joint. Good correlation with experimental results was achieved both in terms of load-displacement history and the predicted failure mechanism(s).

  19. Labour and residential accessibility: a Bayesian analysis based on Poisson gravity models with spatial effects

    NASA Astrophysics Data System (ADS)

    Alonso, M. P.; Beamonte, M. A.; Gargallo, P.; Salvador, M. J.

    2014-10-01

    In this study, we measure jointly the labour and the residential accessibility of a basic spatial unit using a Bayesian Poisson gravity model with spatial effects. The accessibility measures are broken down into two components: the attractiveness component, which is related to its socio-economic and demographic characteristics, and the impedance component, which reflects the ease of communication within and between basic spatial units. For illustration purposes, the methodology is applied to a data set containing information about commuters from the Spanish region of Aragón. We identify the areas with better labour and residential accessibility, and we also analyse the attractiveness and the impedance components of a set of chosen localities which allows us to better understand their mobility patterns.
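
    The paper's model is Bayesian and includes spatial effects; as a stripped-down, non-spatial sketch of the underlying Poisson gravity regression (synthetic masses, distances, and elasticities), a log-linear Poisson GLM already separates the attractiveness and impedance components.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      # Synthetic origin-destination commuting flows: log-linear in origin "mass",
      # destination "mass", and distance (the impedance term). Illustrative only.
      n = 400
      log_mass_o = rng.normal(10, 1, n)
      log_mass_d = rng.normal(10, 1, n)
      log_dist = rng.normal(2, 0.5, n)
      eta = -8 + 0.8 * log_mass_o + 0.7 * log_mass_d - 1.2 * log_dist
      flows = rng.poisson(np.exp(eta))

      X = sm.add_constant(np.column_stack([log_mass_o, log_mass_d, log_dist]))
      model = sm.GLM(flows, X, family=sm.families.Poisson()).fit()
      print(model.params.round(2))  # recovers attractiveness and impedance elasticities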

  20. Reverse engineering of modified genes by Bayesian network analysis defines molecular determinants critical to the development of glioblastoma.

    PubMed

    Kunkle, Brian W; Yoo, Changwon; Roy, Deodutta

    2013-01-01

    In this study we have identified key genes that are critical in the development of astrocytic tumors. Meta-analysis of microarray studies which compared normal tissue to astrocytoma revealed a set of 646 differentially expressed genes in the majority of astrocytomas. Reverse engineering of these 646 genes using Bayesian network analysis produced a gene network for each grade of astrocytoma (Grade I-IV), and 'key genes' within each grade were identified. Genes found to be most influential in the development of the highest grade of astrocytoma, glioblastoma multiforme, were: COL4A1, EGFR, BTF3, MPP2, RAB31, CDK4, CD99, ANXA2, TOP2A, and SERBP1. All of these genes were up-regulated, except MPP2 (down-regulated). These 10 genes were able to predict tumor status with 96-100% confidence when using logistic regression, cross validation, and support vector machine analysis. The Markov blanket genes interact with NF-κB, ERK, MAPK, VEGF, growth hormone and collagen to produce a network whose top biological functions are cancer, neurological disease, and cellular movement. Three of the 10 genes (EGFR, COL4A1, and CDK4) in particular seemed to be potential 'hubs of activity'. Modified expression of these 10 Markov blanket genes increases the lifetime risk of developing glioblastoma compared to the normal population. The glioblastoma risk estimates increased dramatically with the joint effects of 4 or more Markov blanket genes: joint interaction effects of 4, 5, 6, 7, 8, 9 or 10 Markov blanket genes produced increases of 9, 13, 20.9, 26.7, 52.8, 53.2, 78.1 or 85.9%, respectively, in the lifetime risk of developing glioblastoma compared to the normal population. In summary, it appears that modified expression of several 'key genes' may be required for the development of glioblastoma. Further studies are needed to validate these 'key genes' as useful tools for early detection and novel therapeutic options for these tumors.

  1. Nonlinear transient analysis of joint dominated structures

    NASA Technical Reports Server (NTRS)

    Chapman, J. M.; Shaw, F. H.; Russell, W. C.

    1987-01-01

    A residual force technique is presented that can perform the transient analyses of large, flexible, and joint dominated structures. The technique permits substantial size reduction in the number of degrees of freedom describing the nonlinear structural models and can account for such nonlinear joint phenomena as free-play and hysteresis. In general, joints can have arbitrary force-state map representations but these are used in the form of residual force maps. One essential feature of the technique is to replace the arbitrary force-state maps describing the nonlinear joints with residual force maps describing the truss links. The main advantage of this replacement is that the incrementally small relative displacements and velocities across a joint are not monitored directly thereby avoiding numerical difficulties. Instead, very small and 'soft' residual forces are defined giving a numerically attractive form for the equations of motion and thereby permitting numerically stable integration algorithms. The technique was successfully applied to the transient analyses of a large 58 bay, 60 meter truss having nonlinear joints. A method to perform link testing is also presented.

  2. Nonlinear dynamic characteristic analysis of jointed beam with clearance

    NASA Astrophysics Data System (ADS)

    Zhang, Jing; Guo, Hong-Wei; Liu, Rong-Qiang; Wu, Juan; Kou, Zi-Ming; Deng, Zong-Quan

    2016-12-01

    The impact and elasticity of discontinuous beams with clearance frequently affect the dynamic response of structures used in space missions. This study investigates the dynamic response of jointed beams which are the periodic units of deployable structures. The vibration process of jointed beams includes free-play and impact stages. A method for the dynamic analysis of jointed beams with clearance is proposed based on mode superposition and instantaneous static deformation. Transfer matrix, which expresses the relationship of the responses before and after the impact of jointed beams, is derived to calculate the response of the jointed beams after a critical position. The dynamic responses of jointed beams are then simulated. The effects of various parameters on the displacement and velocity of beams are investigated.

  3. Bayesian estimation of dynamic matching function for U-V analysis in Japan

    NASA Astrophysics Data System (ADS)

    Kyo, Koki; Noda, Hideo; Kitagawa, Genshiro

    2012-05-01

    In this paper we propose a Bayesian method for analyzing unemployment dynamics. We derive a Beveridge curve for unemployment and vacancy (U-V) analysis from a Bayesian model based on a labor market matching function. In our framework, the efficiency of matching and the elasticities of new hiring with respect to unemployment and vacancy are regarded as time-varying parameters. To construct a flexible model and obtain reasonable estimates in an underdetermined estimation problem, we treat the time-varying parameters as random variables and introduce smoothness priors. The model is then described in a state space representation, enabling the parameter estimation to be carried out using the Kalman filter and fixed-interval smoothing. In such a representation, dynamic features of the cyclic unemployment rate and the structural-frictional unemployment rate can be accurately captured.
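
    A minimal sketch of the state-space machinery (synthetic data; the paper's actual model, priors, and variances differ): treat the matching-function coefficients as a random walk and track them with a Kalman filter.

      import numpy as np

      rng = np.random.default_rng(8)
      # Synthetic matching data: log hires = a_t + b_t*log U + c_t*log V, with
      # coefficients drifting as random walks (a smoothness prior). Illustrative only.
      T = 200
      logU, logV = rng.normal(2, 0.3, T), rng.normal(1.5, 0.3, T)
      beta_true = np.cumsum(rng.normal(0, 0.01, (T, 3)), axis=0) + [0.5, 0.6, 0.4]
      H = np.column_stack([np.ones(T), logU, logV])
      y = np.sum(H * beta_true, axis=1) + rng.normal(0, 0.05, T)

      # Kalman filter with state = (a_t, b_t, c_t) and random-walk transition.
      x, P = np.zeros(3), np.eye(3)
      Q, r = np.eye(3) * 1e-4, 0.05**2
      filtered = []
      for t in range(T):
          P = P + Q                                 # predict (state mean is unchanged)
          h = H[t]
          S = h @ P @ h + r                         # innovation variance
          K = P @ h / S                             # Kalman gain
          x = x + K * (y[t] - h @ x)                # measurement update
          P = P - np.outer(K, h @ P)
          filtered.append(x.copy())
      print("final coefficients:", np.round(filtered[-1], 2),
            "true:", np.round(beta_true[-1], 2))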

  4. MorePower 6.0 for ANOVA with relational confidence intervals and Bayesian analysis.

    PubMed

    Campbell, Jamie I D; Thompson, Valerie A

    2012-12-01

    MorePower 6.0 is a flexible freeware statistical calculator that computes sample size, effect size, and power statistics for factorial ANOVA designs. It also calculates relational confidence intervals for ANOVA effects based on formulas from Jarmasz and Hollands (Canadian Journal of Experimental Psychology 63:124-138, 2009), as well as Bayesian posterior probabilities for the null and alternative hypotheses based on formulas in Masson (Behavior Research Methods 43:679-690, 2011). The program is unique in affording direct comparison of these three approaches to the interpretation of ANOVA tests. Its high numerical precision and ability to work with complex ANOVA designs could facilitate researchers' attention to issues of statistical power, Bayesian analysis, and the use of confidence intervals for data interpretation. MorePower 6.0 is available at https://wiki.usask.ca/pages/viewpageattachments.action?pageId=420413544 .
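
    MorePower's own formulas are not reproduced here, but the standard one-way ANOVA power computation this kind of calculator automates can be sketched with the noncentral F distribution (illustrative inputs).

      from scipy.stats import f, ncf

      def anova_power(alpha, groups, n_per_group, f_effect):
          """Power of a one-way ANOVA: critical value from the central F
          distribution, power from the noncentral F with lambda = f^2 * N."""
          dfn, dfd = groups - 1, groups * (n_per_group - 1)
          lam = f_effect**2 * groups * n_per_group   # noncentrality (Cohen's f)
          f_crit = f.ppf(1 - alpha, dfn, dfd)
          return 1 - ncf.cdf(f_crit, dfn, dfd, lam)

      # Medium effect (f = 0.25), 4 groups, 20 participants per group:
      print(f"power = {anova_power(0.05, 4, 20, 0.25):.3f}")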

  5. Bayesian Propensity Score Analysis: Simulation and Case Study

    ERIC Educational Resources Information Center

    Kaplan, David; Chen, Cassie J. S.

    2011-01-01

    Propensity score analysis (PSA) has been used in a variety of settings, such as education, epidemiology, and sociology. Most typically, propensity score analysis has been implemented within the conventional frequentist perspective of statistics. This perspective, as is well known, does not account for uncertainty in either the parameters of the…

  6. Applications of Bayesian Procrustes shape analysis to ensemble radar reflectivity nowcast verification

    NASA Astrophysics Data System (ADS)

    Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang

    2016-07-01

    This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
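
    As a minimal non-Bayesian illustration of the Procrustes step (invented object outlines), scipy's ordinary Procrustes superimposition aligns two precipitation-object shapes and returns a residual shape disparity.

      import numpy as np
      from scipy.spatial import procrustes

      rng = np.random.default_rng(9)
      # Two hypothetical forecast precipitation-object outlines (x, y vertices):
      # the second is a rotated, scaled, shifted, noisy copy of the first.
      shape_a = rng.normal(size=(20, 2))
      theta = 0.4
      rot = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
      shape_b = 1.5 * shape_a @ rot.T + [3.0, -1.0] + rng.normal(0, 0.02, (20, 2))

      # Procrustes superimposition removes translation, scale and rotation and
      # leaves a disparity: a shape-based distance between the two objects.
      mtx1, mtx2, disparity = procrustes(shape_a, shape_b)
      print(f"shape disparity after alignment: {disparity:.4f}")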

  7. Mapping joint grey and white matter reductions in Alzheimer's disease using joint independent component analysis.

    PubMed

    Guo, Xiaojuan; Han, Yuan; Chen, Kewei; Wang, Yan; Yao, Li

    2012-12-07

    Alzheimer's disease (AD) is a neurodegenerative disease concomitant with grey and white matter damages. However, the interrelationship of volumetric changes between grey and white matter remains poorly understood in AD. Using joint independent component analysis, this study identified joint grey and white matter volume reductions based on structural magnetic resonance imaging data to construct the covariant networks in twelve AD patients and fourteen normal controls (NC). We found that three networks showed significant volume reductions in joint grey-white matter sources in AD patients, including (1) frontal/parietal/temporal-superior longitudinal fasciculus/corpus callosum, (2) temporal/parietal/occipital-frontal/occipital, and (3) temporal-precentral/postcentral. The corresponding expression scores distinguished AD patients from NC with 85.7%, 100% and 85.7% sensitivity for joint sources 1, 2 and 3, respectively; 75.0%, 66.7% and 75.0% specificity for joint sources 1, 2 and 3, respectively. Furthermore, the combined source of three significant joint sources best predicted the AD/NC group membership with 92.9% sensitivity and 83.3% specificity. Our findings revealed joint grey and white matter loss in AD patients, and these results can help elucidate the mechanism of grey and white matter reductions in the development of AD.

  8. Bayesian methodology to estimate and update safety performance functions under limited data conditions: a sensitivity analysis.

    PubMed

    Heydari, Shahram; Miranda-Moreno, Luis F; Lord, Dominique; Fu, Liping

    2014-03-01

    In road safety studies, decision makers must often cope with limited data conditions. In such circumstances, the maximum likelihood estimation (MLE), which relies on asymptotic theory, is unreliable and prone to bias. Moreover, it has been reported in the literature that (a) Bayesian estimates might be significantly biased when using non-informative prior distributions under limited data conditions, and that (b) the calibration of limited data is plausible when existing evidence in the form of proper priors is introduced into analyses. Although the Highway Safety Manual (2010) (HSM) and other research studies provide calibration and updating procedures, the data requirements can be very taxing. This paper presents a practical and sound Bayesian method to estimate and/or update safety performance function (SPF) parameters combining the information available from limited data with the SPF parameters reported in the HSM. The proposed Bayesian updating approach has the advantage of requiring fewer observations to get reliable estimates. This paper documents this procedure. The adopted technique is validated by conducting a sensitivity analysis through an extensive simulation study with 15 different models, which include various prior combinations. This sensitivity analysis contributes to our understanding of the comparative aspects of a large number of prior distributions. Furthermore, the proposed method contributes to unification of the Bayesian updating process for SPFs. The results demonstrate the accuracy of the developed methodology. Therefore, the suggested approach offers considerable promise as a methodological tool to estimate and/or update baseline SPFs and to evaluate the efficacy of road safety countermeasures under limited data conditions.

  9. Joint regression analysis of correlated data using Gaussian copulas.

    PubMed

    Song, Peter X-K; Li, Mingyao; Yuan, Ying

    2009-03-01

    This article concerns a new joint modeling approach for correlated data analysis. Utilizing Gaussian copulas, we present a unified and flexible machinery to integrate separate one-dimensional generalized linear models (GLMs) into a joint regression analysis of continuous, discrete, and mixed correlated outcomes. This essentially leads to a multivariate analogue of the univariate GLM theory and hence an efficiency gain in the estimation of regression coefficients. The availability of joint probability models enables us to develop a full maximum likelihood inference. Numerical illustrations are focused on regression models for discrete correlated data, including multidimensional logistic regression models and a joint model for mixed normal and binary outcomes. In the simulation studies, the proposed copula-based joint model is compared to the popular generalized estimating equations, which is a moment-based estimating equation method to join univariate GLMs. Two real-world data examples are used in the illustration.
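
    A minimal sketch of the copula construction itself (parameters invented): draw correlated latent normals, push them through the normal CDF to get dependent uniforms, then through each margin's inverse CDF to obtain mixed continuous and binary outcomes.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(10)
      n, rho = 5000, 0.6
      cov = np.array([[1.0, rho], [rho, 1.0]])
      z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
      u = stats.norm.cdf(z)                      # correlated uniform marginals

      y_cont = stats.norm.ppf(u[:, 0], loc=2.0, scale=1.5)   # N(2, 1.5^2) margin
      y_bin = (u[:, 1] < 0.3).astype(int)                    # Bernoulli(0.3) margin
      print("empirical association:", np.corrcoef(y_cont, y_bin)[0, 1].round(2))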

  10. A Bayesian Framework for Functional Mapping through Joint Modeling of Longitudinal and Time-to-Event Data

    PubMed Central

    Das, Kiranmoy; Li, Runze; Huang, Zhongwen; Gai, Junyi; Wu, Rongling

    2012-01-01

    The most powerful and comprehensive approach in modern biology is to study the whole process of development and all events of importance to development that occur in the process. As a consequence, joint modeling of developmental processes and events has become one of the most demanding tasks in statistical research. Here, we propose a joint modeling framework for functional mapping of specific quantitative trait loci (QTLs) which controls developmental processes and the timing of development and their causal correlation over time. The joint model contains two submodels, one for a developmental process, known as a longitudinal trait, and the other for a developmental event, known as the time to event, which are connected through a QTL mapping framework. A nonparametric approach is used to model the mean and covariance function of the longitudinal trait while the traditional Cox proportional hazard (PH) model is used to model the event time. The joint model is applied to map QTLs that control whole-plant vegetative biomass growth and time to first flower in soybeans. Results show that this model should be broadly useful for detecting genes controlling physiological and pathological processes and other events of interest in biomedicine. PMID:22685454

  11. Latent structure analysis in pharmaceutical formulations using Kohonen's self-organizing map and a Bayesian network.

    PubMed

    Kikuchi, Shingo; Onuki, Yoshinori; Yasuda, Akihito; Hayashi, Yoshihiro; Takayama, Kozo

    2011-03-01

    A latent structure analysis of pharmaceutical formulations was performed using Kohonen's self-organizing map (SOM) and a Bayesian network. A hydrophilic matrix tablet containing diltiazem hydrochloride (DTZ), a highly water-soluble model drug, was used as a model formulation. Nonlinear correlations among formulation factors (oppositely charged dextran derivatives and hydroxypropyl methylcellulose), latent variables (turbidity and viscosity of the polymer mixtures and binding affinity of DTZ to polymers), and release properties [50% dissolution times (t50s) and similarity factor] were clearly visualized by self-organizing feature maps. The quantities of dextran derivatives forming polyion complexes were strongly related to the binding affinity of DTZ to polymers and t50s. The latent variables were classified into five characteristic clusters with similar properties by SOM clustering. The probabilistic graphical model of the latent structure was successfully constructed using a Bayesian network. The causal relationships among the factors were quantitatively estimated by inferring conditional probability distributions. Moreover, these causal relationships estimated by the Bayesian network coincided well with estimations by SOM clustering, and the probabilistic graphical model was reflected in the characteristics of SOM clusters. These techniques provide a better understanding of the latent structure between formulation factors and responses in DTZ hydrophilic matrix tablet formulations.

  12. Bayesian GWAS and network analysis revealed new candidate genes for number of teats in pigs.

    PubMed

    Verardo, L L; Silva, F F; Varona, L; Resende, M D V; Bastiaansen, J W M; Lopes, P S; Guimarães, S E F

    2015-02-01

    The genetic improvement of reproductive traits such as the number of teats is essential to the success of the pig industry. In contrast to most SNP association studies, which consider continuous phenotypes under Gaussian assumptions, this trait is characterized as a discrete variable, which could potentially follow other distributions, such as the Poisson. Therefore, in order to accommodate the complexity of a count-data random regression considering all SNPs simultaneously as covariates in a GWAS model, Bayesian inference tools become necessary. Currently, another point that deserves to be highlighted in GWAS is the genetic dissection of complex phenotypes through candidate gene networks derived from significant SNPs. We present a full Bayesian treatment of SNP association analysis for number of teats, assuming alternatively Gaussian and Poisson distributions for this trait. Under this framework, significant SNP effects were identified by hypothesis tests using 95% highest posterior density intervals. These SNPs were used to construct an associated candidate gene network aiming to explain the genetic mechanism behind this reproductive trait. Bayesian model comparisons based on the deviance posterior distribution indicated the superiority of the Gaussian model. In general, our results suggest the presence of 19 significant SNPs, which mapped to 13 genes. Besides, we predicted gene interactions through networks that are consistent with known mammalian breast biology (e.g., development of prolactin receptor signaling, and cell proliferation), captured known regulatory binding sites, and provided candidate genes for the trait (e.g., TINAGL1 and ICK).

  13. Bayesian analysis of fingerprint, face and signature evidences with automatic biometric systems.

    PubMed

    Gonzalez-Rodriguez, Joaquin; Fierrez-Aguilar, Julian; Ramos-Castro, Daniel; Ortega-Garcia, Javier

    2005-12-20

    The Bayesian approach provides a unified and logical framework for the analysis of evidence and for providing results in the form of likelihood ratios (LR) from the forensic laboratory to court. In this contribution we want to clarify how the biometric scientist or laboratory can adapt their conventional biometric systems or technologies to work according to this Bayesian approach. Forensic systems providing their results in the form of LR will be assessed through Tippett plots, which give a clear representation of the LR-based performance both for targets (the suspect is the author/source of the test pattern) and non-targets. However, the computation procedures of the LR values, especially with biometric evidence, are still an open issue. Reliable estimation techniques showing good generalization properties for the estimation of the between- and within-source variabilities of the test pattern are required, as are variance restriction techniques in the within-source density estimation to account for the variability of the source over time. Fingerprint, face and on-line signature recognition systems will be adapted to work according to this Bayesian approach, showing both the likelihood ratio range in each application and the adequacy of these biometric techniques for daily forensic work.
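
    A hypothetical score-based sketch of the LR computation and of the quantities a Tippett plot summarizes (Gaussian score models with invented parameters; real forensic systems require careful calibration).

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(11)
      # Model target (same-source) and non-target (different-source) comparison
      # scores with Gaussians fitted on background data.
      target_scores = rng.normal(3.0, 1.0, 1000)
      nontarget_scores = rng.normal(0.0, 1.0, 1000)
      mu_t, sd_t = target_scores.mean(), target_scores.std()
      mu_n, sd_n = nontarget_scores.mean(), nontarget_scores.std()

      def llr(score):
          """log10 likelihood ratio for a new comparison score."""
          return (stats.norm.logpdf(score, mu_t, sd_t)
                  - stats.norm.logpdf(score, mu_n, sd_n)) / np.log(10)

      # Tippett-plot summary: proportion of misleading evidence in each direction.
      print("targets with LR < 1:    ", np.mean(llr(target_scores) < 0))
      print("non-targets with LR > 1:", np.mean(llr(nontarget_scores) > 0))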

  14. Bayesian latent variable models for the analysis of experimental psychology data.

    PubMed

    Merkle, Edgar C; Wang, Ting

    2016-03-18

    In this paper, we address the use of Bayesian factor analysis and structural equation models to draw inferences from experimental psychology data. While such application is non-standard, the models are generally useful for the unified analysis of multivariate data that stem from, e.g., subjects' responses to multiple experimental stimuli. We first review the models and the parameter identification issues inherent in the models. We then provide details on model estimation via JAGS and on Bayes factor estimation. Finally, we use the models to re-analyze experimental data on risky choice, comparing the approach to simpler, alternative methods.

  15. Highly efficient Bayesian joint inversion for receiver-based data and its application to lithospheric structure beneath the southern Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Kim, Seongryong; Dettmer, Jan; Rhie, Junkee; Tkalčić, Hrvoje

    2016-07-01

    With the deployment of extensive seismic arrays, systematic and efficient parameter and uncertainty estimation is of increasing importance and can provide reliable, regional models for crustal and upper-mantle structure. We present an efficient Bayesian method for the joint inversion of surface-wave dispersion and receiver-function data that combines trans-dimensional (trans-D) model selection in an optimization phase with subsequent rigorous parameter uncertainty estimation. Parameter and uncertainty estimation depend strongly on the chosen parametrization such that meaningful regional comparison requires quantitative model selection that can be carried out efficiently at several sites. While significant progress has been made for model selection (e.g. trans-D inference) at individual sites, the lack of efficiency can prohibit application to large data volumes or cause questionable results due to lack of convergence. Studies that address large numbers of data sets have mostly ignored model selection in favour of more efficient/simple estimation techniques (i.e. focusing on uncertainty estimation but employing ad-hoc model choices). Our approach consists of a two-phase inversion that combines trans-D optimization to select the most probable parametrization with subsequent Bayesian sampling for uncertainty estimation given that parametrization. The trans-D optimization is implemented here by replacing the likelihood function with the Bayesian information criterion (BIC). The BIC provides constraints on model complexity that facilitate the search for an optimal parametrization. Parallel tempering (PT) is applied as an optimization algorithm. After optimization, the optimal model choice is identified by the minimum BIC value from all PT chains. Uncertainty estimation is then carried out in fixed dimension. Data errors are estimated as part of the inference problem by a combination of empirical and hierarchical estimation. Data covariance matrices are estimated from

  16. Bayesian Finite Mixtures for Nonlinear Modeling of Educational Data.

    ERIC Educational Resources Information Center

    Tirri, Henry; And Others

    A Bayesian approach for finding latent classes in data is discussed. The approach uses finite mixture models to describe the underlying structure in the data and demonstrates that the possibility of using full joint probability models raises interesting new prospects for exploratory data analysis. The concepts and methods discussed are illustrated…

  17. Diagnostic accuracy of a bayesian latent group analysis for the detection of malingering-related poor effort.

    PubMed

    Ortega, Alonso; Labrenz, Stephan; Markowitsch, Hans J; Piefke, Martina

    2013-01-01

    In the last decade, different statistical techniques have been introduced to improve assessment of malingering-related poor effort. In this context, we have recently shown preliminary evidence that a Bayesian latent group model may help to optimize classification accuracy using a simulation research design. In the present study, we conducted two analyses. Firstly, we evaluated how accurately this Bayesian approach can distinguish between participants answering in an honest way (honest response group) and participants feigning cognitive impairment (experimental malingering group). Secondly, we tested the accuracy of our model in the differentiation between patients who had real cognitive deficits (cognitively impaired group) and participants who belonged to the experimental malingering group. All Bayesian analyses were conducted using the raw scores of a visual recognition forced-choice task (2AFC), the Test of Memory Malingering (TOMM, Trial 2), and the Word Memory Test (WMT, primary effort subtests). The first analysis showed 100% accuracy for the Bayesian model in distinguishing participants of both groups with all effort measures. The second analysis showed outstanding overall accuracy of the Bayesian model when estimates were obtained from the 2AFC and the TOMM raw scores. Diagnostic accuracy of the Bayesian model diminished when using the WMT total raw scores. Despite this, overall diagnostic accuracy can still be considered excellent. The most plausible explanation for this decrement is the low performance in verbal recognition and fluency tasks of some patients of the cognitively impaired group. Additionally, the Bayesian model provides individual estimates, p(z_i | D), of examinees' effort levels. In conclusion, both high classification accuracy levels and Bayesian individual estimates of effort may be very useful for clinicians when assessing for effort in medico-legal settings.
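
    The latent-group idea can be sketched for a 50-item forced-choice task: with assumed (not estimated) group-level accuracies and a uniform prior over groups, the posterior probability of group membership given a raw score follows from Bayes' rule; all numbers below are hypothetical:

      import numpy as np
      from scipy.stats import binom

      n_items = 50
      p_honest, p_malinger = 0.95, 0.45   # assumed group-level accuracies
      prior = np.array([0.5, 0.5])        # uniform prior over the two groups

      def posterior_group(score):
          """p(group | observed raw score) by Bayes' rule."""
          like = np.array([binom.pmf(score, n_items, p_honest),
                           binom.pmf(score, n_items, p_malinger)])
          post = prior * like
          return post / post.sum()

      print(posterior_group(48))  # high score: honest group dominates
      print(posterior_group(24))  # near-chance score: malingering group dominates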

  18. Hierarchical models and Bayesian analysis of bird survey information

    USGS Publications Warehouse

    Sauer, J.R.; Link, W.A.; Royle, J. Andrew; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Summary of bird survey information is a critical component of conservation activities, but often our summaries rely on statistical methods that do not accommodate the limitations of the information. Prioritization of species requires ranking and analysis of species by magnitude of population trend, but often magnitude of trend is a misleading measure of actual decline when trend is poorly estimated. Aggregation of population information among regions is also complicated by varying quality of estimates among regions. Hierarchical models provide a reasonable means of accommodating concerns about aggregation and ranking of quantities of varying precision. In these models the need to consider multiple scales is accommodated by placing distributional assumptions on collections of parameters. For collections of species trends, this allows probability statements to be made about the collections of species-specific parameters, rather than about the estimates. We define and illustrate hierarchical models for two commonly encountered situations in bird conservation: (1) estimating attributes of collections of species estimates, including ranking of trends, estimating the number of species with increasing populations, and assessing population stability with regard to predefined trend magnitudes; and (2) estimating regional population change by aggregating information from bird surveys over strata. User-friendly computer software makes hierarchical models readily accessible to scientists.
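
    The shrinkage behaviour that makes hierarchical ranking robust can be sketched with an empirical-Bayes normal-normal calculation (the models in the paper are fully Bayesian; the trend values and standard errors below are invented):

      import numpy as np

      trend = np.array([-4.1, 0.3, 2.2, -0.8, 6.0])   # hypothetical trends (%/yr)
      se = np.array([0.5, 0.4, 1.8, 0.3, 4.0])        # their standard errors

      mu = np.average(trend, weights=1 / se**2)        # collection-level mean
      tau2 = max(trend.var() - np.mean(se**2), 0.01)   # crude between-species variance
      shrink = tau2 / (tau2 + se**2)                   # weight on each species' own estimate
      posterior_mean = shrink * trend + (1 - shrink) * mu

      # Poorly estimated extreme trends are pulled toward mu, so they no
      # longer dominate the ranking:
      order = np.argsort(posterior_mean)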

  19. Majorana Demonstrator Bolted Joint Mechanical and Thermal Analysis

    SciTech Connect

    Aguayo Navarrete, Estanislao; Reid, Douglas J.; Fast, James E.

    2012-06-01

    The MAJORANA DEMONSTRATOR is designed to probe for neutrinoless double-beta decay, an extremely rare process with a half-life on the order of 10^26 years. The experiment uses an ultra-low background, high-purity germanium detector array. The germanium crystals are both the source and the detector in this experiment. Operating these crystals as ionizing radiation detectors requires having them under cryogenic conditions (below 90 K). A liquid nitrogen thermosyphon is used to extract the heat from the detectors. The detector channels are arranged in strings and thermally coupled to the thermosyphon through a cold plate. The cold plate is joined to the thermosyphon by a bolted joint. This circular plate is housed inside the cryostat can. This document provides a detailed study of the bolted joint that connects the cold plate and the thermosyphon. An analysis of the mechanical and thermal properties of this bolted joint is presented. The force applied to the joint is derived from the torque applied to each one of the six bolts that form the joint. The thermal conductivity of the joint is measured as a function of applied force. The required heat conductivity for a successful experiment is the combination of the thermal conductivity of the detector string and this joint. The thermal behavior of the joint is experimentally implemented and analyzed in this study.
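
    The torque-to-force step mentioned above is commonly handled with the short-form bolt relation F = T / (K * d); the sketch below applies it and combines joint and string conductances in series. All numerical values are illustrative assumptions, not Demonstrator design values:

      # Bolt preload from applied torque, F = T / (K * d):
      torque = 5.0     # N*m per bolt (assumed)
      K = 0.2          # dimensionless nut factor, typical dry value (assumed)
      d = 0.006        # bolt diameter in m (assumed)
      n_bolts = 6      # six bolts form the joint

      force_per_bolt = torque / (K * d)
      total_force = n_bolts * force_per_bolt   # clamping force on the joint

      # The joint conductance G_joint at this force would be read off the
      # measured conductance-vs-force curve; the string and joint then act
      # as thermal resistances in series:
      G_joint, G_string = 2.0, 1.5             # W/K, illustrative
      G_total = 1.0 / (1.0 / G_joint + 1.0 / G_string)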

  20. Bayesian Methods in Cosmology

    NASA Astrophysics Data System (ADS)

    Hobson, Michael P.; Jaffe, Andrew H.; Liddle, Andrew R.; Mukherjee, Pia; Parkinson, David

    2009-12-01

    Preface; Part I. Methods: 1. Foundations and algorithms John Skilling; 2. Simple applications of Bayesian methods D. S. Sivia and Steve Rawlings; 3. Parameter estimation using Monte Carlo sampling Antony Lewis and Sarah Bridle; 4. Model selection and multi-model inference Andrew R. Liddle, Pia Mukherjee and David Parkinson; 5. Bayesian experimental design and model selection forecasting Roberto Trotta, Martin Kunz, Pia Mukherjee and David Parkinson; 6. Signal separation in cosmology M. P. Hobson, M. A. J. Ashdown and V. Stolyarov; Part II. Applications: 7. Bayesian source extraction M. P. Hobson, Graça Rocha and R. Savage; 8. Flux measurement Daniel Mortlock; 9. Gravitational wave astronomy Neil Cornish; 10. Bayesian analysis of cosmic microwave background data Andrew H. Jaffe; 11. Bayesian multilevel modelling of cosmological populations Thomas J. Loredo and Martin A. Hendry; 12. A Bayesian approach to galaxy evolution studies Stefano Andreon; 13. Photometric redshift estimation: methods and applications Ofer Lahav, Filipe B. Abdalla and Manda Banerji; Index.

  2. Crash risk analysis for Shanghai urban expressways: A Bayesian semi-parametric modeling approach.

    PubMed

    Yu, Rongjie; Wang, Xuesong; Yang, Kui; Abdel-Aty, Mohamed

    2016-10-01

    Urban expressway systems have been developed rapidly in recent years in China, and they have become a key part of city roadway networks, carrying large traffic volumes at high travel speeds. Along with the increase of traffic volume, traffic safety has become a major issue for Chinese urban expressways due to frequent crashes and the non-recurrent congestion they cause. For the purpose of unveiling crash occurrence mechanisms and further developing Active Traffic Management (ATM) control strategies to improve traffic safety, this study developed disaggregate crash risk analysis models with loop detector traffic data and historical crash data. Bayesian random effects logistic regression models were utilized as they can account for the unobserved heterogeneity among crashes. However, previous crash risk analysis studies formulated random effects distributions in a parametric approach, assigning them to follow normal distributions. Given the limited information known about random effects distributions, such a subjective parametric setting may be incorrect. In order to construct more flexible and robust random effects to capture the unobserved heterogeneity, the Bayesian semi-parametric inference technique was introduced to crash risk analysis in this study. Models with both inference techniques were developed for total crashes; semi-parametric models were shown to provide substantially better model goodness-of-fit, while the two models shared consistent coefficient estimations. Later on, Bayesian semi-parametric random effects logistic regression models were developed for weekday peak hour crashes, weekday non-peak hour crashes, and weekend non-peak hour crashes to investigate different crash occurrence scenarios. Significant factors that affect crash risk were revealed and crash occurrence mechanisms were summarized.
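
    The semi-parametric random effects rest on the Dirichlet-process stick-breaking construction; a short sketch of one draw from a truncated stick-breaking prior (truncation level, concentration and base measure are arbitrary choices here, not values from the paper):

      import numpy as np

      rng = np.random.default_rng(9)
      alpha, K = 1.0, 50                     # concentration; truncation level
      betas = rng.beta(1, alpha, K)          # stick-breaking fractions
      weights = betas * np.concatenate(([1.0], np.cumprod(1 - betas)[:-1]))
      atoms = rng.normal(0.0, 1.0, K)        # atoms from a N(0, 1) base measure

      # A random effect is an atom picked with the stick-breaking weights:
      effect = rng.choice(atoms, p=weights / weights.sum())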

  3. Bayesian design and analysis of computer experiments: Use of derivatives in surface prediction

    SciTech Connect

    Morris, M.D.; Mitchell, T.J.; Ylvisaker, D.

    1991-06-01

    The work of Currin et al. and others in developing "fast predictive approximations" of computer models is extended for the case in which derivatives of the output variable of interest with respect to input variables are available. In addition to describing the calculations required for the Bayesian analysis, the issue of experimental design is also discussed, and an algorithm is described for constructing "maximin distance" designs. An example is given based on a demonstration model of eight inputs and one output, in which predictions based on a maximin design, a Latin hypercube design, and two "compromise" designs are evaluated and compared. 12 refs., 2 figs., 6 tabs.
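
    A crude random-search version of the maximin-distance criterion for a 10-run design in eight inputs; the paper describes a dedicated construction algorithm, whereas this sketch only illustrates the objective of maximizing the smallest pairwise distance:

      import numpy as np

      rng = np.random.default_rng(2)

      def min_dist(X):
          """Smallest pairwise Euclidean distance among design points."""
          d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
          return d[np.triu_indices(len(X), k=1)].min()

      best, best_score = None, -np.inf
      for _ in range(2000):
          X = rng.uniform(size=(10, 8))      # candidate 10-point design in [0, 1]^8
          s = min_dist(X)
          if s > best_score:
              best, best_score = X, s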

  4. Nonparametric Bayesian Dictionary Learning for Analysis of Noisy and Incomplete Images

    DTIC Science & Technology

    2010-04-01

    Fragments of tabulated results only: PSNR comparisons of KSVD and BPFA for image denoising and interpolation (patch sizes 8×8 and 7×7) on standard test images (Cameraman, House, Peppers, Lena, Barbara, Boats, Fingerprint, Couple, Hill); no abstract text was recovered.

  5. Reusable Solid Rocket Motor Nozzle Joint-4 Thermal Analysis

    NASA Technical Reports Server (NTRS)

    Clayton, J. Louie

    2001-01-01

    This study provides for development and test verification of a thermal model used for prediction of joint heating environments, structural temperatures and seal erosions in the Space Shuttle Reusable Solid Rocket Motor (RSRM) Nozzle Joint-4. The heating environments are a result of rapid pressurization of the joint free volume assuming a leak path has occurred in the filler material used for assembly gap close out. Combustion gases flow along the leak path from nozzle environment to joint O-ring gland resulting in local heating to the metal housing and erosion of seal materials. Analysis of this condition was based on usage of the NASA Joint Pressurization Routine (JPR) for environment determination and the Systems Improved Numerical Differencing Analyzer (SINDA) for structural temperature prediction. Model-generated temperatures, pressures and seal erosions are compared to hot fire test data for several different leak path situations. Investigated in the hot fire test program were nozzle joint-4 O-ring erosion sensitivities to leak path width in both open and confined joint geometries. Model predictions were in generally good agreement with the test data for the confined leak path cases. Worst case flight predictions are provided using the test-calibrated model. Analysis issues are discussed based on model calibration procedures.

  6. Fuzzy Bayesian Network-Bow-Tie Analysis of Gas Leakage during Biomass Gasification

    PubMed Central

    Yan, Fang; Xu, Kaili; Yao, Xiwen; Li, Yang

    2016-01-01

    Biomass gasification technology has developed rapidly in recent years, but fire and poisoning accidents caused by gas leakage restrict the development and promotion of biomass gasification. Therefore, probabilistic safety assessment (PSA) is necessary for biomass gasification systems. Accordingly, Bayesian network-bow-tie (BN-bow-tie) analysis was proposed by mapping bow-tie analysis into a Bayesian network (BN). Causes of gas leakage and the accidents triggered by gas leakage can be obtained by bow-tie analysis, and the BN was used to identify the critical nodes of accidents by introducing three corresponding importance measures. Meanwhile, the occurrence probabilities of failures are needed in PSA. In view of the insufficient failure data for biomass gasification, occurrence probabilities of failure that cannot be obtained from standard reliability data sources were determined by fuzzy methods based on expert judgment. An improved approach that considers expert weighting to aggregate fuzzy numbers, including triangular and trapezoidal numbers, was proposed, and the occurrence probabilities of failure were obtained. Finally, safety measures were indicated based on the identified critical nodes. The theoretical occurrence probabilities in one year of gas leakage and of the accidents caused by it were reduced to 1/10.3 of their original values by these safety measures. PMID:27463975
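
    The expert-weighted aggregation step can be sketched for triangular fuzzy numbers: a weighted average of the (low, mode, high) triples followed by centroid defuzzification. Opinions and weights below are invented, and the paper's improved aggregation scheme is more elaborate than this plain weighted mean:

      import numpy as np

      opinions = np.array([[1e-4, 5e-4, 1e-3],    # expert 1: (low, mode, high)
                           [2e-4, 8e-4, 2e-3],    # expert 2
                           [5e-5, 3e-4, 9e-4]])   # expert 3
      weights = np.array([0.5, 0.3, 0.2])         # expert weights, summing to 1

      a, m, b = weights @ opinions                # aggregated triangular number
      crisp = (a + m + b) / 3                     # centroid defuzzification
      # 'crisp' serves as an occurrence probability fed into the BN-bow-tie model.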

  7. Drug-drug interaction prediction: a Bayesian meta-analysis approach.

    PubMed

    Li, Lang; Yu, Menggang; Chin, Raymond; Lucksiri, Aroonrut; Flockhart, David A; Hall, Stephen D

    2007-09-10

    In drug-drug interaction (DDI) research, a two-drug interaction is usually predicted from the individual drugs' pharmacokinetics (PK). Although subject-specific drug concentration data from clinical PK studies of an inhibitor's/inducer's or substrate's PK are not usually published, sample mean plasma drug concentrations and their standard deviations are routinely reported. In this paper, an innovative DDI prediction method based on a three-level hierarchical Bayesian meta-analysis model is developed. The first-level model is a study-specific sample mean model; the second-level model is a random effect model connecting different PK studies; and all priors of PK parameters are specified in the third-level model. A Markov chain Monte Carlo (MCMC) PK parameter estimation procedure is developed, and DDI prediction for a future study is conducted based on the PK models of the two drugs and posterior distributions of the PK parameters. The performance of Bayesian meta-analysis in DDI prediction is demonstrated through a ketoconazole-midazolam example. The biases of DDI prediction are evaluated through statistical simulation studies. The DDI marker, the ratio of areas under the concentration curves, is predicted with little bias (less than 5 per cent), and its 90 per cent credible interval coverage rate is close to the nominal level. Sensitivity analysis is conducted to justify prior distribution selections.

  8. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    PubMed

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.

  9. A Two-Step Bayesian Approach for Propensity Score Analysis: Simulations and Case Study

    ERIC Educational Resources Information Center

    Kaplan, David; Chen, Jianshen

    2012-01-01

    A two-step Bayesian propensity score approach is introduced that incorporates prior information in the propensity score equation and outcome equation without the problems associated with simultaneous Bayesian propensity score approaches. The corresponding variance estimators are also provided. The two-step Bayesian propensity score is provided for…

  10. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-02-28

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry.

  11. A Bayesian analysis of uncertainties on lung doses resulting from occupational exposures to uranium.

    PubMed

    Puncher, M; Birchall, A; Bull, R K

    2013-09-01

    In a recent epidemiological study, Bayesian estimates of lung doses were calculated in order to determine a possible association between lung dose and lung cancer incidence resulting from occupational exposures to uranium. These calculations, which produce probability distributions of doses, used the human respiratory tract model (HRTM) published by the International Commission on Radiological Protection (ICRP) with a revised particle transport clearance model. In addition to the Bayesian analyses, point estimates (PEs) of doses were also provided for that study using the existing HRTM as it is described in ICRP Publication 66. The PEs are to be used in a preliminary analysis of risk. To explain the differences between the PEs and Bayesian analysis, in this paper the methodology was applied to former UK nuclear workers who constituted a subset of the study cohort. The resulting probability distributions of lung doses calculated using the Bayesian methodology were compared with the PEs obtained for each worker. Mean posterior lung doses were on average 8-fold higher than PEs and the uncertainties on doses varied over a wide range, being greater than two orders of magnitude for some lung tissues. It is shown that it is the prior distributions of the parameters describing absorption from the lungs to blood that are responsible for the large difference between posterior mean doses and PEs. Furthermore, it is the large prior uncertainties on these parameters that are mainly responsible for the large uncertainties on lung doses. It is concluded that accurate determination of the chemical form of inhaled uranium, as well as the absorption parameter values for these materials, is important for obtaining unbiased estimates of lung doses from occupational exposures to uranium for epidemiological studies. Finally, it should be noted that the inferences regarding the PEs described here apply only to the assessments of cases provided for the epidemiological study, where central

  12. Evolutionary Analysis of Dengue Serotype 2 Viruses Using Phylogenetic and Bayesian Methods from New Delhi, India

    PubMed Central

    Afreen, Nazia; Naqvi, Irshad H.; Broor, Shobha; Ahmed, Anwar; Kazim, Syed Naqui; Dohare, Ravins; Kumar, Manoj; Parveen, Shama

    2016-01-01

    Dengue fever is the most important arboviral disease in the tropical and sub-tropical countries of the world. Delhi, the metropolitan capital state of India, has reported many dengue outbreaks, with the last outbreak occurring in 2013. We have recently reported predominance of dengue virus serotype 2 during 2011–2014 in Delhi. In the present study, we report molecular characterization and evolutionary analysis of dengue serotype 2 viruses which were detected in 2011–2014 in Delhi. Envelope genes of 42 DENV-2 strains were sequenced in the study. All DENV-2 strains grouped within the Cosmopolitan genotype and further clustered into three lineages: Lineage I, II and III. Lineage III replaced Lineage I during the dengue fever outbreak of 2013. Further, a novel mutation Thr404Ile was detected in the stem region of the envelope protein of a single DENV-2 strain in 2014. Nucleotide substitution rate and time to the most recent common ancestor were determined by molecular clock analysis using Bayesian methods. A change in effective population size of Indian DENV-2 viruses was investigated through a Bayesian skyline plot. The study will be a vital road map for investigation of the epidemiology and evolutionary pattern of dengue viruses in India. PMID:26977703

  13. Evolutionary Analysis of Dengue Serotype 2 Viruses Using Phylogenetic and Bayesian Methods from New Delhi, India.

    PubMed

    Afreen, Nazia; Naqvi, Irshad H; Broor, Shobha; Ahmed, Anwar; Kazim, Syed Naqui; Dohare, Ravins; Kumar, Manoj; Parveen, Shama

    2016-03-01

    Dengue fever is the most important arboviral disease in the tropical and sub-tropical countries of the world. Delhi, the metropolitan capital state of India, has reported many dengue outbreaks, with the last outbreak occurring in 2013. We have recently reported predominance of dengue virus serotype 2 during 2011-2014 in Delhi. In the present study, we report molecular characterization and evolutionary analysis of dengue serotype 2 viruses which were detected in 2011-2014 in Delhi. Envelope genes of 42 DENV-2 strains were sequenced in the study. All DENV-2 strains grouped within the Cosmopolitan genotype and further clustered into three lineages: Lineage I, II and III. Lineage III replaced Lineage I during the dengue fever outbreak of 2013. Further, a novel mutation Thr404Ile was detected in the stem region of the envelope protein of a single DENV-2 strain in 2014. Nucleotide substitution rate and time to the most recent common ancestor were determined by molecular clock analysis using Bayesian methods. A change in effective population size of Indian DENV-2 viruses was investigated through a Bayesian skyline plot. The study will be a vital road map for investigation of the epidemiology and evolutionary pattern of dengue viruses in India.

  14. Bayesian decision analysis for evaluating management options to promote recovery of a depleted salmon population.

    PubMed

    Pestes, Lynsey R; Peterman, Randall M; Bradford, Michael J; Wood, Chris C

    2008-04-01

    The endangered population of sockeye salmon (Oncorhynchus nerka) in Cultus Lake, British Columbia, Canada, migrates through commercial fishing areas along with other, much more abundant sockeye salmon populations, but it is not feasible to selectively harvest only the latter, abundant populations. This situation creates controversial trade-offs between recovery actions and economic revenue. We conducted a Bayesian decision analysis to evaluate options for recovery of Cultus Lake sockeye salmon. We used a stochastic population model that included 2 sources of uncertainty that are often omitted from such analyses: structural uncertainty in the magnitude of a potential Allee effect and implementation uncertainty (the deviation between targets and actual outcomes of management actions). Numerous state-dependent, time-independent management actions meet recovery objectives. These actions prescribe limitations on commercial harvest rates as a function of abundance of Cultus Lake sockeye salmon. We also quantified how much reduction in economic value of commercial harvests of the more abundant sockeye salmon populations would be expected for a given increase in the probability of recovery of the Cultus population. Such results illustrate how Bayesian decision analysis can rank options for dealing with conservation risks and can help inform trade-off discussions among decision makers and among groups that have competing objectives.

  15. Risk analysis of emergent water pollution accidents based on a Bayesian Network.

    PubMed

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Sun, Jie

    2016-01-01

    To guarantee the security of water quality in water transfer channels, especially in open channels, analysis of potential emergent pollution sources in the water transfer process is critical. It is also indispensable for forewarning and protection against emergent pollution accidents. Bridges above open channels with large amounts of truck traffic are the main locations where emergent accidents could occur. A Bayesian Network model, which consists of six root nodes and three middle-layer nodes, was developed in this paper and employed to identify the possibility of potential pollution risk. Dianbei Bridge is reviewed as a typical bridge on an open channel of the Middle Route of the South to North Water Transfer Project where emergent traffic accidents could occur. This study focuses on the risk of water pollution caused by leakage of pollutants into the water. The risk of potential traffic accidents at the Dianbei Bridge implies a risk of water pollution in the canal. Based on survey data, statistical analysis, and domain specialist knowledge, a Bayesian Network model was established. The human factor in emergent accidents has been considered in this model. Additionally, this model has been employed to describe the probability of accidents and the risk level. The factors to which pollution accidents are most sensitive have been deduced, and the scenario in which these factors take the states most likely to lead to accidents has also been simulated.
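
    A miniature illustration of inference by enumeration in a three-node chain (Accident -> Leakage -> Pollution); the conditional probabilities are invented placeholders, not the survey-based values of the study:

      p_accident = 0.02
      p_leak_given = {True: 0.30, False: 0.001}   # P(leakage | accident status)
      p_poll_given = {True: 0.70, False: 0.0}     # P(pollution | leakage status)

      # Marginal probability of pollution, summing over the hidden states:
      p_pollution = sum(
          (p_accident if a else 1 - p_accident)
          * (p_leak_given[a] if l else 1 - p_leak_given[a])
          * p_poll_given[l]
          for a in (True, False)
          for l in (True, False)
      )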

  16. A Bayesian framework for cell-level protein network analysis for multivariate proteomics image data

    NASA Astrophysics Data System (ADS)

    Kovacheva, Violet N.; Sirinukunwattana, Korsuk; Rajpoot, Nasir M.

    2014-03-01

    The recent development of multivariate imaging techniques, such as the Toponome Imaging System (TIS), has facilitated the analysis of multiple co-localisation of proteins. This could hold the key to understanding complex phenomena such as protein-protein interaction in cancer. In this paper, we propose a Bayesian framework for cell-level network analysis allowing the identification of several protein pairs having significantly higher co-expression levels in cancerous tissue samples when compared to normal colon tissue. It involves segmenting the DAPI-labeled image into cells and determining the cell phenotypes according to their protein-protein dependence profile. The cells are phenotyped using Gaussian Bayesian hierarchical clustering (GBHC) after feature selection is performed. The phenotypes are then analysed using Difference in Sums of Weighted cO-dependence Profiles (DiSWOP), which detects differences in the co-expression patterns of protein pairs. We demonstrate that the pairs highlighted by the proposed framework have high concordance with recent results using a different phenotyping method. This demonstrates that the results are independent of the clustering method used. In addition, the highlighted protein pairs are further analysed via protein interaction pathway databases and by considering the localization of high protein-protein dependence within individual samples. This suggests that the proposed approach could identify potentially functional protein complexes active in cancer progression and cell differentiation.

  17. Bayesian meta-analysis of Cronbach's coefficient alpha to evaluate informative hypotheses.

    PubMed

    Okada, Kensuke

    2015-12-01

    This paper proposes a new method to evaluate informative hypotheses for meta-analysis of Cronbach's coefficient alpha using a Bayesian approach. The coefficient alpha is one of the most widely used reliability indices. In meta-analyses of reliability, researchers typically form specific informative hypotheses beforehand, such as 'alpha of this test is greater than 0.8' or 'alpha of one form of a test is greater than the others.' The proposed method enables direct evaluation of these informative hypotheses. To this end, a Bayes factor is calculated to evaluate the informative hypothesis against its complement. It allows researchers to summarize the evidence provided by previous studies in favor of their informative hypothesis. The proposed approach can be seen as a natural extension of the Bayesian meta-analysis of coefficient alpha recently proposed in this journal (Brannick and Zhang, 2013). The proposed method is illustrated through two meta-analyses of real data that evaluate different kinds of informative hypotheses about the superpopulation: one is that the alpha of a particular test is above a criterion value, and the other is that alphas among different test versions have an ordered relationship. The informative hypotheses are supported by the data in both cases, suggesting that the proposed approach is promising for application.
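
    One common way to compute such a Bayes factor is the encompassing-prior approach: the ratio of posterior to prior odds that the constraint holds. A sketch with stand-in draws (the Beta 'posterior' below is a placeholder for real MCMC output):

      import numpy as np

      rng = np.random.default_rng(3)
      prior_draws = rng.uniform(0, 1, 100_000)       # flat prior on alpha (assumed)
      posterior_draws = rng.beta(80, 12, 100_000)    # stand-in posterior draws

      def odds(draws, cut=0.8):
          p = (draws > cut).mean()
          return p / (1 - p)

      # Bayes factor for H: alpha > 0.8 against its complement:
      bf = odds(posterior_draws) / odds(prior_draws)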

  18. BayGO: Bayesian analysis of ontology term enrichment in microarray data

    PubMed Central

    Vêncio, Ricardo ZN; Koide, Tie; Gomes, Suely L; de B Pereira, Carlos A

    2006-01-01

    Background The search for enriched (aka over-represented or enhanced) ontology terms in a list of genes obtained from microarray experiments is becoming a standard procedure for a system-level analysis. This procedure tries to summarize the information focussing on classification designs such as Gene Ontology, KEGG pathways, and so on, instead of focussing on individual genes. Although it is well known in statistics that association and significance are distinct concepts, only the former approach has been used to deal with the ontology term enrichment problem. Results BayGO implements a Bayesian approach to search for enriched terms from microarray data. The R source code is freely available in three versions: Linux, which can be easily incorporated into pre-existing pipelines; Windows, to be controlled interactively; and as a web-tool. The software was validated using a bacterial heat shock response dataset, since this stress triggers known system-level responses. Conclusion The Bayesian model accounts for the fact that, eventually, not all the genes from a given category are observable in microarray data due to low intensity signal, quality filters, genes that were not spotted and so on. Moreover, BayGO allows one to measure the statistical association between generic ontology terms and differential expression, instead of working only with the common significance analysis. PMID:16504085
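
    The association-versus-significance contrast can be sketched with a Beta-Binomial comparison for a single term (invented counts; BayGO's actual model additionally accounts for genes that are unobservable in the microarray data):

      import numpy as np

      k_de, n_de = 12, 80        # 12 of 80 differentially expressed genes carry the term
      k_bg, n_bg = 40, 1200      # 40 of 1200 background genes carry the term

      rng = np.random.default_rng(4)
      p_de = rng.beta(1 + k_de, 1 + n_de - k_de, 50_000)   # Beta(1,1) priors
      p_bg = rng.beta(1 + k_bg, 1 + n_bg - k_bg, 50_000)
      prob_enriched = (p_de > p_bg).mean()   # posterior P(term enriched in DE list)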

  19. Integrative Bayesian Analysis of Neuroimaging-Genetic Data with Application to Cocaine Dependence

    PubMed Central

    Azadeh, Shabnam; Hobbs, Brian P.; Ma, Liangsuo; Nielsen, David A.; Moeller, F. Gerard; Baladandayuthapani, Veerabhadran

    2016-01-01

    Neuroimaging and genetic studies provide distinct and complementary information about the structural and biological aspects of a disease. Integrating the two sources of data facilitates the investigation of the links between genetic variability and brain mechanisms among different individuals for various medical disorders. This article presents a general statistical framework for integrative Bayesian analysis of neuroimaging-genetic (iBANG) data, which is motivated by a neuroimaging-genetic study in cocaine dependence. Statistical inference necessitated the integration of spatially dependent voxel-level measurements with various patient-level genetic and demographic characteristics under an appropriate probability model to account for the multiple inherent sources of variation. Our framework uses Bayesian model averaging to integrate genetic information into the analysis of voxel-wise neuroimaging data, accounting for spatial correlations in the voxels. Using multiplicity controls based on the false discovery rate, we delineate voxels associated with genetic and demographic features that may impact diffusion as measured by fractional anisotropy (FA) obtained from DTI images. We demonstrate the benefits of accounting for model uncertainties in both model fit and prediction. Our results suggest that cocaine consumption is associated with FA reduction in most white matter regions of interest in the brain. Additionally, gene polymorphisms associated with GABAergic, serotonergic and dopaminergic neurotransmitters and receptors were associated with FA. PMID:26484829

  20. Critical composite joint subcomponents: Analysis and test results

    NASA Technical Reports Server (NTRS)

    Bunin, B. L.

    1983-01-01

    This program has been conducted to develop the technology for critical structural joints of a composite wing structure meeting design requirements for a 1990 commercial transport aircraft. A prime objective of the program was to demonstrate the ability to reliably predict the strength of large bolted composite joints. Load sharing between bolts in multirow joints was computed by a nonlinear analysis program (A4FJ) which was used both to assess the efficiency of different joint design concepts and to predict the strengths of large test articles representing a section from a wing root chord-wise splice. In most cases, the predictions were accurate to within a few percent of the test results. A highlight of these tests was the consistent ability to achieve gross-section failure strains on the order of 0.005, which represents a considerable improvement over the state of the art. The improvement was attained largely as the result of a better understanding of the load sharing in multirow joints provided by the analysis. The typical load intensity on the structural joints was about 40 to 45 thousand pounds per inch in laminates having interspersed 37.5-percent 0° plies, 50-percent ±45° plies and 12.5-percent 90° plies. The composite material was Toray 300 fiber and Ciba-Geigy 914 resin, in the form of 0.010-inch thick unidirectional tape.

  1. Bayesian sensitivity analysis of incomplete data: bridging pattern-mixture and selection models.

    PubMed

    Kaciroti, Niko A; Raghunathan, Trivellore

    2014-11-30

    Pattern-mixture models (PMM) and selection models (SM) are alternative approaches for statistical analysis when faced with incomplete data and a nonignorable missing-data mechanism. Both models make empirically unverifiable assumptions and need additional constraints to identify the parameters. Here, we first introduce intuitive parameterizations to identify PMM for different types of outcome with distribution in the exponential family; then we translate these to their equivalent SM approach. This provides a unified framework for performing sensitivity analysis under either setting. These new parameterizations are transparent and easy to use, and they provide a dual interpretation from both the PMM and SM perspectives. A Bayesian approach is used to perform sensitivity analysis, deriving inferences using informative prior distributions on the sensitivity parameters. These models can be fitted using software that implements Gibbs sampling.

  2. Bayesian analysis for OPC modeling with film stack properties and posterior predictive checking

    NASA Astrophysics Data System (ADS)

    Burbine, Andrew; Fenger, Germain; Sturtevant, John; Fryer, David

    2016-10-01

    The use of optical proximity correction (OPC) demands increasingly accurate models of the photolithographic process. Model building and analysis techniques in the data science community have seen great strides in the past two decades which make better use of available information. This paper expands upon Bayesian analysis methods for parameter selection in lithographic models by increasing the parameter set and employing posterior predictive checks. Work continues with a Markov chain Monte Carlo (MCMC) search algorithm to generate posterior distributions of parameters. Models now include wafer film stack refractive indices, n and k, as parameters, recognizing the uncertainties associated with these values. Posterior predictive checks are employed as a method to validate parameter vectors discovered by the analysis, akin to cross validation.
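
    A posterior predictive check in its generic form: push posterior draws through the data-generating model, then compare a test statistic of the replicates against the observed value. The sketch uses stand-in Gaussian draws rather than a lithographic model:

      import numpy as np

      rng = np.random.default_rng(5)
      y = rng.normal(1.0, 0.3, 40)      # stand-in measured data

      # Stand-in posterior draws (in practice these come from the MCMC search):
      mus = rng.normal(y.mean(), y.std() / np.sqrt(len(y)), 1000)
      sigmas = np.abs(rng.normal(y.std(), 0.02, 1000))

      t_obs = y.max()
      t_rep = np.array([rng.normal(m, s, len(y)).max()
                        for m, s in zip(mus, sigmas)])
      ppp = (t_rep >= t_obs).mean()     # values near 0 or 1 flag model misfit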

  3. Methods for the Joint Meta-Analysis of Multiple Tests

    ERIC Educational Resources Information Center

    Trikalinos, Thomas A.; Hoaglin, David C.; Small, Kevin M.; Terrin, Norma; Schmid, Christopher H.

    2014-01-01

    Existing methods for meta-analysis of diagnostic test accuracy focus primarily on a single index test. We propose models for the joint meta-analysis of studies comparing multiple index tests on the same participants in paired designs. These models respect the grouping of data by studies, account for the within-study correlation between the tests'…

  4. Finite element analysis of human joints

    SciTech Connect

    Bossart, P.L.; Hollerbach, K.

    1996-09-01

    Our work focuses on the development of finite element models (FEMs) that describe the biomechanics of human joints. Finite element modeling is becoming a standard tool in industrial applications. In highly complex problems such as those found in biomechanics research, however, the full potential of FEMs is just beginning to be explored, due to the absence of precise, high resolution medical data and the difficulties encountered in converting these enormous datasets into a form that is usable in FEMs. With increasing computing speed and memory available, it is now feasible to address these challenges. We address the former by acquiring data with a high resolution X-ray CT scanner and the latter by developing a semi-automated method for generating the volumetric meshes used in the FEM. Issues related to tomographic reconstruction, volume segmentation, the use of extracted surfaces to generate volumetric hexahedral meshes, and applications of the FEM are described.

  5. A Preliminary Bayesian Analysis of Incomplete Longitudinal Data from a Small Sample: Methodological Advances in an International Comparative Study of Educational Inequality

    ERIC Educational Resources Information Center

    Hsieh, Chueh-An; Maier, Kimberly S.

    2009-01-01

    The capacity of Bayesian methods in estimating complex statistical models is undeniable. Bayesian data analysis is seen as having a range of advantages, such as an intuitive probabilistic interpretation of the parameters of interest, the efficient incorporation of prior information into empirical data analysis, model averaging and model selection…

  6. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach

    PubMed Central

    Mohammadi, Tayeb; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donation and blood deferral has a major impact on blood transfusion. There is a positive correlation between the variables “number of blood donation” and “number of blood deferral”: as the number of return for donation increases, so does the number of blood deferral. On the other hand, due to the fact that many donors never return to donate, there is an extra zero frequency for both of the above-mentioned variables. In this study, in order to apply the correlation and to explain the frequency of the excessive zero, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donation and number of blood deferral. The data was analyzed using the Bayesian approach applying noninformative priors at the presence and absence of covariates. Estimating the parameters of the model, that is, correlation, zero-inflation parameter, and regression coefficients, was done through MCMC simulation. Eventually double-Poisson model, bivariate Poisson model, and bivariate zero-inflated Poisson model were fitted on the data and were compared using the deviance information criteria (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models. PMID:27703493

  7. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach.

    PubMed

    Mohammadi, Tayeb; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donation and blood deferral has a major impact on blood transfusion. There is a positive correlation between the variables "number of blood donation" and "number of blood deferral": as the number of return for donation increases, so does the number of blood deferral. On the other hand, due to the fact that many donors never return to donate, there is an extra zero frequency for both of the above-mentioned variables. In this study, in order to apply the correlation and to explain the frequency of the excessive zero, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donation and number of blood deferral. The data was analyzed using the Bayesian approach applying noninformative priors at the presence and absence of covariates. Estimating the parameters of the model, that is, correlation, zero-inflation parameter, and regression coefficients, was done through MCMC simulation. Eventually double-Poisson model, bivariate Poisson model, and bivariate zero-inflated Poisson model were fitted on the data and were compared using the deviance information criteria (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models.
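
    The structure both records describe is easy to simulate via trivariate reduction: a shared Poisson component induces the positive correlation between the two counts, and a point mass at (0, 0) supplies the excess zeros. Rates and the zero-inflation probability below are illustrative:

      import numpy as np

      rng = np.random.default_rng(6)
      n, pi = 10_000, 0.4                 # pi = zero-inflation probability (assumed)
      lam1, lam2, lam0 = 1.2, 0.8, 0.5    # illustrative rates; lam0 is shared

      zero = rng.random(n) < pi           # never-returning donors
      w = rng.poisson(lam0, n)            # shared component
      x = np.where(zero, 0, rng.poisson(lam1, n) + w)   # number of donations
      y = np.where(zero, 0, rng.poisson(lam2, n) + w)   # number of deferrals

      print(np.corrcoef(x, y)[0, 1], (x == 0).mean())   # correlation, zero share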

  8. Spatial-Temporal Epidemiology of Tuberculosis in Mainland China: An Analysis Based on Bayesian Theory

    PubMed Central

    Cao, Kai; Yang, Kun; Wang, Chao; Guo, Jin; Tao, Lixin; Liu, Qingrong; Gehendra, Mahara; Zhang, Yingjie; Guo, Xiuhua

    2016-01-01

    Objective: To explore the spatial-temporal interaction effect within a Bayesian framework and to probe the ecological influential factors for tuberculosis. Methods: Six different statistical models containing parameters of time, space, spatial-temporal interaction and their combinations were constructed based on a Bayesian framework. The optimum model was selected according to the deviance information criterion (DIC) value. Coefficients of climate variables were then estimated using the best-fitting model. Results: The model containing the spatial-temporal interaction parameter was the best-fitting one, with the smallest DIC value (−4,508,660). Ecological analysis results showed the relative risks (RRs) of average temperature, rainfall, wind speed, humidity, and air pressure were 1.00324 (95% CI, 1.00150–1.00550), 1.01010 (95% CI, 1.01007–1.01013), 0.83518 (95% CI, 0.93732–0.96138), 0.97496 (95% CI, 0.97181–1.01386), and 1.01007 (95% CI, 1.01003–1.01011), respectively. Conclusions: The spatial-temporal interaction was statistically meaningful and the prevalence of tuberculosis was influenced by the time and space interaction effect. Average temperature, rainfall, wind speed, and air pressure influenced tuberculosis. Average humidity had no influence on tuberculosis. PMID:27164117

  9. Hierarchical Bayesian analysis of censored microbiological contamination data for use in risk assessment and mitigation.

    PubMed

    Busschaert, P; Geeraerd, A H; Uyttendaele, M; Van Impe, J F

    2011-06-01

    Microbiological contamination data are often censored because of the presence of non-detects or because measurement outcomes are known only to be smaller than, greater than, or between certain boundary values imposed by the laboratory procedures. Therefore, it is not straightforward to fit distributions that summarize contamination data for use in quantitative microbiological risk assessment, especially when variability and uncertainty are to be characterized separately. In this paper, distributions are fit using Bayesian analysis, and results are compared to results obtained with a methodology based on maximum likelihood estimation and the non-parametric bootstrap method. The Bayesian model is also extended hierarchically to estimate the effects of the individual elements of a covariate such as, for example, on a national level, the food processing company where the analyzed food samples were processed, or, on an international level, the geographical origin of contamination data. Including this extra information allows a risk assessor to differentiate between several scenarios and increase the specificity of the estimate of risk of illness, or to compare different scenarios to each other. Furthermore, inference is made on the predictive importance of several different covariates while taking into account uncertainty, allowing one to indicate which covariates are influential factors determining contamination.
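
    The censoring mechanics can be sketched in a likelihood: detected values contribute density terms and non-detects contribute CDF terms below the limit. For brevity the sketch fits by maximum likelihood, whereas the paper's analysis is Bayesian and hierarchical; all data are invented:

      import numpy as np
      from scipy.stats import norm
      from scipy.optimize import minimize

      detect = np.log(np.array([1.8, 2.4, 3.1, 0.9, 5.6]))   # detected log counts
      L = np.log(0.5)                                         # detection limit
      n_nondetect = 7                                         # samples below the limit

      def negloglik(theta):
          mu, log_sd = theta
          sd = np.exp(log_sd)
          ll = norm.logpdf(detect, mu, sd).sum()       # exact measurements
          ll += n_nondetect * norm.logcdf(L, mu, sd)   # non-detects: P(X < L)
          return -ll

      fit = minimize(negloglik, x0=np.array([0.0, 0.0]))
      mu_hat, sd_hat = fit.x[0], np.exp(fit.x[1])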

  10. Two waves of diversification in mammals and reptiles of Baja California revealed by hierarchical Bayesian analysis.

    PubMed

    Leaché, Adam D; Crews, Sarah C; Hickerson, Michael J

    2007-12-22

    Many species inhabiting the Peninsular Desert of Baja California demonstrate a phylogeographic break at the mid-peninsula, and previous researchers have attributed this shared pattern to a single vicariant event, a mid-peninsular seaway. However, previous studies have not explicitly considered the inherent stochasticity associated with the gene-tree coalescence for species preceding the time of the putative mid-peninsular divergence. We use a Bayesian analysis of a hierarchical model to test for simultaneous vicariance across co-distributed sister lineages sharing a genealogical break at the mid-peninsula. This Bayesian method is advantageous over traditional phylogenetic interpretations of biogeography because it considers the genetic variance associated with the coalescent and mutational processes, as well as the among-lineage demographic differences that affect gene-tree coalescent patterns. Mitochondrial DNA data from six small mammals and six squamate reptiles do not support the perception of a shared vicariant history among lineages exhibiting a north-south divergence at the mid-peninsula, and instead support two events differentially structuring genetic diversity in this region.

  11. Critically evaluating the theory and performance of Bayesian analysis of macroevolutionary mixtures

    PubMed Central

    Moore, Brian R.; Höhna, Sebastian; May, Michael R.; Rannala, Bruce; Huelsenbeck, John P.

    2016-01-01

    Bayesian analysis of macroevolutionary mixtures (BAMM) has recently taken the study of lineage diversification by storm. BAMM estimates the diversification-rate parameters (speciation and extinction) for every branch of a study phylogeny and infers the number and location of diversification-rate shifts across branches of a tree. Our evaluation of BAMM reveals two major theoretical errors: (i) the likelihood function (which estimates the model parameters from the data) is incorrect, and (ii) the compound Poisson process prior model (which describes the prior distribution of diversification-rate shifts across branches) is incoherent. Using simulation, we demonstrate that these theoretical issues cause statistical pathologies; posterior estimates of the number of diversification-rate shifts are strongly influenced by the assumed prior, and estimates of diversification-rate parameters are unreliable. Moreover, the inability to correctly compute the likelihood or to correctly specify the prior for rate-variable trees precludes the use of Bayesian approaches for testing hypotheses regarding the number and location of diversification-rate shifts using BAMM. PMID:27512038

  12. A Bayesian analysis of the 69 highest energy cosmic rays detected by the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Khanin, Alexander; Mortlock, Daniel J.

    2016-08-01

    The origins of ultrahigh energy cosmic rays (UHECRs) remain an open question. Several attempts have been made to cross-correlate the arrival directions of the UHECRs with catalogues of potential sources, but no definite conclusion has been reached. We report a Bayesian analysis of the 69 events from the Pierre Auger Observatory (PAO) that aims to determine the fraction of the UHECRs that originate from known AGNs in the Véron-Cetty & Véron (VCV) catalogue, as well as AGNs detected with the Swift Burst Alert Telescope (Swift-BAT), galaxies from the 2MASS Redshift Survey (2MRS), and an additional volume-limited sample of 17 nearby AGNs. The study makes use of a multilevel Bayesian model of UHECR injection, propagation and detection. We find that for reasonable ranges of prior parameters the Bayes factors disfavour a purely isotropic model. For fiducial values of the model parameters, we report 68 per cent credible intervals for the fraction of source-originating UHECRs of 0.09^{+0.05}_{-0.04}, 0.25^{+0.09}_{-0.08}, 0.24^{+0.12}_{-0.10}, and 0.08^{+0.04}_{-0.03} for the VCV, Swift-BAT and 2MRS catalogues, and the sample of 17 AGNs, respectively.

  13. Bayesian analysis of response to selection: a case study using litter size in Danish Yorkshire pigs.

    PubMed Central

    Sorensen, D; Vernersen, A; Andersen, S

    2000-01-01

    Implementation of a Bayesian analysis of a selection experiment is illustrated using litter size [total number of piglets born (TNB)] in Danish Yorkshire pigs. Other traits studied include average litter weight at birth (WTAB) and proportion of piglets born dead (PRBD). Response to selection for TNB was analyzed with a number of models, which differed in their level of hierarchy, in their prior distributions, and in the parametric form of the likelihoods. A model assessment study favored a particular form of an additive genetic model. With this model, the Monte Carlo estimate of the 95% probability interval of response to selection was (0.23; 0.60), with a posterior mean of 0.43 piglets. WTAB showed a correlated response of -7.2 g, with a 95% probability interval equal to (-33.1; 18.9). The posterior mean of the genetic correlation between TNB and WTAB was -0.23 with a 95% probability interval equal to (-0.46; -0.01). PRBD was studied informally; it increases with larger litters, when litter size is >7 piglets born. A number of methodological issues related to the Bayesian model assessment study are discussed, as well as the genetic consequences of inferring response to selection using additive genetic models. PMID:10978292

  14. Assessing State Nuclear Weapons Proliferation: Using Bayesian Network Analysis of Social Factors

    SciTech Connect

    Coles, Garill A.; Brothers, Alan J.; Olson, Jarrod; Whitney, Paul D.

    2010-04-16

    A Bayesian network (BN) model of social factors can support proliferation assessments by estimating the likelihood that a state will pursue a nuclear weapon. Social factors, including political, economic, nuclear capability, security, and national identity and psychology factors, may play as important a role in whether a State pursues nuclear weapons as more physical factors. This paper will show how Bayesian reasoning on a generic case of a would-be proliferator State can be used to combine evidence that supports proliferation assessment. Theories and analysis by political scientists can be leveraged in a quantitative and transparent way to indicate proliferation risk. BN models facilitate diagnosis and inference in a probabilistic environment by using a network of nodes and acyclic directed arcs between the nodes, whose connections, or absence thereof, indicate probabilistic relevance or independence. We propose a BN model that would use information from both traditional safeguards and the strengthened safeguards associated with the Additional Protocol to indicate countries with a high risk of proliferating nuclear weapons. This model could be used in a variety of applications, such as a prioritization tool and as a component of state safeguards evaluations. This paper will discuss the benefits of BN reasoning, the development of Pacific Northwest National Laboratory's (PNNL) BN state proliferation model, and how it could be employed as an analytical tool.

  15. Fast Bayesian whole-brain fMRI analysis with spatial 3D priors.

    PubMed

    Sidén, Per; Eklund, Anders; Bolin, David; Villani, Mattias

    2017-02-01

    Spatial whole-brain Bayesian modeling of task-related functional magnetic resonance imaging (fMRI) is a great computational challenge. Most of the currently proposed methods therefore do inference in subregions of the brain separately or do approximate inference without comparison to the true posterior distribution. A popular such method, which is now the standard method for Bayesian single subject analysis in the SPM software, is introduced in Penny et al. (2005b). The method processes the data slice-by-slice and uses an approximate variational Bayes (VB) estimation algorithm that enforces posterior independence between activity coefficients in different voxels. We introduce a fast and practical Markov chain Monte Carlo (MCMC) scheme for exact inference in the same model, both slice-wise and for the whole brain using a 3D prior on activity coefficients. The algorithm exploits sparsity and uses modern techniques for efficient sampling from high-dimensional Gaussian distributions, leading to speed-ups without which MCMC would not be a practical option. Using MCMC, we are for the first time able to evaluate the approximate VB posterior against the exact MCMC posterior, and show that VB can lead to spurious activation. In addition, we develop an improved VB method that drops the assumption of independent voxels a posteriori. This algorithm is shown to be much faster than both MCMC and the original VB for large datasets, with negligible error compared to the MCMC posterior.
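
    The sampling step such schemes exploit: when a Gaussian is specified by its precision matrix Q = L L^T, a draw is mu + L^{-T} z with z standard normal, so only triangular solves are needed. Shown dense below for brevity; the speed-ups described in the paper come from doing the same with sparse Cholesky factors:

      import numpy as np
      from scipy.linalg import cholesky, solve_triangular

      rng = np.random.default_rng(7)
      n = 200
      # A simple 1-D random-walk precision matrix (tridiagonal) plus a ridge:
      Q = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1) + 0.1 * np.eye(n)
      mu = np.zeros(n)

      L = cholesky(Q, lower=True)                        # Q = L @ L.T
      z = rng.standard_normal(n)
      x = mu + solve_triangular(L.T, z, lower=False)     # draw from N(mu, Q^{-1})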

  16. Developing a Validation Methodology for Expert-Informed Bayesian Network Models Supporting Nuclear Nonproliferation Analysis

    SciTech Connect

    White, Amanda M.; Gastelum, Zoe N.; Whitney, Paul D.

    2014-05-13

    Under the auspices of Pacific Northwest National Laboratory’s Signature Discovery Initiative (SDI), the research team developed a series of Bayesian network models to assess multi-source signatures of nuclear programs. A Bayesian network is a mathematical model that can be used to marshal evidence to assess competing hypotheses. The purpose of the models was to allow non-expert analysts to benefit from expert-informed mathematical models when assessing nuclear programs, because such assessments require significant technical expertise ranging from the nuclear fuel cycle to construction, engineering, and imagery analysis. One such model developed under this research was aimed at assessing the consistency of open-source information about a nuclear facility with the facility’s declared use. The model incorporates factors such as location and security and safety features, among others identified by subject matter experts as crucial to their assessments. The model includes key features, observables, and their relationships. The model also provides documentation, which serves as training material for the non-experts.

  17. Bayesian analysis of a multivariate null intercept errors-in-variables regression model.

    PubMed

    Aoki, Reiko; Bolfarine, Heleno; Achcar, Jorge A; Dorival, Leão P Júnior

    2003-11-01

    Longitudinal data are of great interest in the analysis of clinical trials. In many practical situations the covariate cannot be measured precisely, and a natural alternative is an errors-in-variables regression model. In this paper we study a null intercept errors-in-variables regression model with a structure of dependency between the response variables within the same group. We apply the model to real data presented in Hadgu and Koch (Hadgu, A., Koch, G. (1999). Application of generalized estimating equations to a dental randomized clinical trial. J. Biopharmaceutical Statistics 9(1):161-178). In that study, volunteers with preexisting dental plaque were randomized to two experimental mouth rinses (A and B) or a control mouth rinse with double blinding. The dental plaque index was measured for each subject at the beginning of the study and at two follow-up times, which leads to the presence of an intraclass correlation. We propose a Bayesian approach to fit a multivariate null intercept errors-in-variables regression model to the longitudinal data. The proposed Bayesian approach accommodates the correlated measurements and incorporates the restriction that the slopes must lie in the (0, 1) interval. A Gibbs sampler is used to perform the computations.

  18. Bayesian analysis of an admixture model with mutations and arbitrarily linked markers.

    PubMed

    Excoffier, Laurent; Estoup, Arnaud; Cornuet, Jean-Marie

    2005-03-01

    We introduce here a Bayesian analysis of a classical admixture model in which all parameters are simultaneously estimated. Our approach follows the approximate Bayesian computation (ABC) framework, relying on massive simulations and a rejection-regression algorithm. Although computationally intensive, this approach can easily deal with complex mutation models and partially linked loci, and it can be thoroughly validated without much additional computational cost. Compared to a recent maximum-likelihood (ML) method, the ABC approach leads to similarly accurate estimates of admixture proportions in the case of recent admixture events, but it is found superior when the admixture is more ancient. All other parameters of the admixture model, such as the divergence time between parental populations, the admixture time, and the population sizes, are also well estimated, unlike with the ML method. The use of partially linked markers does not introduce any particular bias in the estimation of admixture, but ML confidence intervals are found to be too narrow if linkage is not specifically accounted for. The application of our method to an artificially admixed domestic bee population from northwest Italy suggests that the admixture occurred in the last 10-40 generations and that the parental Apis mellifera and A. ligustica populations had been completely separated since the last glacial maximum.

  19. Bayesian analysis of radiocarbon chronologies: examples from the European Late-glacial

    NASA Astrophysics Data System (ADS)

    Blockley, S. P. E.; Lowe, J. J.; Walker, M. J. C.; Asioli, A.; Trincardi, F.; Coope, G. R.; Donahue, R. E.

    2004-02-01

    Although there are many Late-glacial (ca. 15 000-11 000 cal. yr BP) proxy climate records from northwest Europe, some analysed at a very high temporal resolution (decadal to century scale), attempts to establish time-stratigraphical correlations between sequences are constrained by problems of radiocarbon dating. In an attempt to overcome some of these difficulties, we have used a Bayesian approach to the analysis of radiocarbon chronologies for two Late-glacial sites in the British Isles and one in the Adriatic Sea. The palaeoclimatic records from the three sites were then compared with that from the GRIP Greenland ice-core. Although there are some apparent differences in the timing of climatic events during the early part of the Late-glacial (pre-14 000 cal. yr BP), the results suggest that regional climatic changes appear to have been broadly comparable between Greenland, the British Isles and the Adriatic during the major part of the Late-glacial (i.e. between 14 000 and 11 000 cal. yr BP). The advantage of using the Bayesian approach is that it provides a means of testing the reliability of Late-glacial radiocarbon chronologies that is independent of regional chronostratigraphical (climatostratigraphical) frameworks. It also uses the full radiocarbon inventory available for each sequence and makes explicit any data selection applied. Potentially, therefore, it offers a more objective basis for comparing regional radiocarbon chronologies than the conventional approaches that have been used hitherto.

  20. BASiCS: Bayesian Analysis of Single-Cell Sequencing Data.

    PubMed

    Vallejos, Catalina A; Marioni, John C; Richardson, Sylvia

    2015-06-01

    Single-cell mRNA sequencing can uncover novel cell-to-cell heterogeneity in gene expression levels in seemingly homogeneous populations of cells. However, these experiments are prone to high levels of unexplained technical noise, creating new challenges for identifying genes that show genuine heterogeneous expression within the population of cells under study. BASiCS (Bayesian Analysis of Single-Cell Sequencing data) is an integrated Bayesian hierarchical model where: (i) cell-specific normalisation constants are estimated as part of the model parameters, (ii) technical variability is quantified based on spike-in genes that are artificially introduced to each analysed cell's lysate and (iii) the total variability of the expression counts is decomposed into technical and biological components. BASiCS also provides an intuitive detection criterion for highly (or lowly) variable genes within the population of cells under study. This is formalised by means of tail posterior probabilities associated with high (or low) biological cell-to-cell variance contributions, quantities that can be easily interpreted by users. We demonstrate our method using gene expression measurements from mouse Embryonic Stem Cells. Cross-validation and meaningful enrichment of gene ontology categories within genes classified as highly (or lowly) variable support the efficacy of our approach.

  2. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling

    PubMed Central

    Onisko, Agnieszka; Druzdzel, Marek J.; Austin, R. Marshall

    2016-01-01

    Background: Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. Aim: The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. Materials and Methods: This paper offers a comparison of two approaches to analysis of medical time series data: (1) classical statistical approaches, such as the Kaplan–Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. Results: The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Conclusion: Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort, and (2) offers an individualized risk assessment, which is more cumbersome to obtain with classical statistical approaches. PMID:28163973

  3. Analysis of the shearout failure mode in composite bolted joints

    NASA Technical Reports Server (NTRS)

    Wilson, D. W.; Pipes, R. B.

    1981-01-01

    A semi-empirical shearout strength model has been formulated for the analysis of composite bolted joints with allowance for the effects of joint geometry. The model employs a polynomial stress function in conjunction with a point stress failure criterion to predict strength as a function of fastener size, edge distance, and half spacing. The stress function is obtained by two-dimensional plane-stress finite element analysis using quadrilateral elements with orthotropic material properties. Comparison of experimentally determined shearout strength data with model-predicted failures has substantiated the accuracy of the model.
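
    The point stress criterion named above reduces to a short computation once a stress distribution is available. The sketch below is purely illustrative, not the paper's model: a polynomial is fit to a made-up finite-element shear-stress profile ahead of the bolt hole, evaluated at an assumed characteristic distance d0, and compared with an assumed in-plane shear allowable.

      import numpy as np

      # Hypothetical FE shear stress ahead of the hole edge under a unit bearing load.
      r = np.array([0.0, 0.5, 1.0, 1.5, 2.0])          # distance from hole edge, mm
      tau = np.array([120.0, 95.0, 78.0, 66.0, 58.0])  # shear stress, MPa

      poly = np.polynomial.Polynomial.fit(r, tau, deg=3)   # polynomial stress function
      d0, tau_allow = 0.8, 85.0   # assumed characteristic distance (mm) and allowable (MPa)

      tau_at_d0 = poly(d0)                  # stress at d0 per unit load
      print(tau_allow / tau_at_d0)          # predicted strength, as a load multiplier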

  4. Stress analysis of bolted joints under centrifugal force

    NASA Astrophysics Data System (ADS)

    Imura, Makoto; Iizuka, Motonobu; Nakae, Shigeki; Mori, Takeshi; Koyama, Takayuki

    2014-06-01

    Our objective is to develop a long-life rotary machine for synchronous generators and motors. To do this, it is necessary to design a high-strength bolted joint, which is responsible for fixing a salient pole on a rotor shaft. While the rotary machine is in operation, not only a centrifugal force but also a moment is loaded on the bolted joint, because the point of load application is eccentric to the centre of the bolt. We applied the theory proposed in VDI2230-Blatt1 to evaluate the bolted joint under eccentric force, estimate the limiting centrifugal force at which partial separation between the pole and the rotor shaft begins, and then evaluate the additional tension of the bolt after partial separation has occurred. We analyzed the bolted joint by FEM and defined the load introduction factor for that case. Additionally, we investigated the effect of the variation of bolt preload on the partial separation. We performed a full-scale experiment with a prototype rotor to reveal the variation of bolt preload with tightening torque. After that, we verified the limiting centrifugal force and the strength of the bolted joint by the VDI2230-Blatt1 theory and FEM, considering the variation of bolt preload. Finally, we were able to design a high-strength bolted joint verified by the theoretical study and FEM analysis.

  5. Computational Modelling and Movement Analysis of Hip Joint with Muscles

    NASA Astrophysics Data System (ADS)

    Siswanto, W. A.; Yoon, C. C.; Salleh, S. Md.; Ngali, M. Z.; Yusup, Eliza M.

    2017-01-01

    In this study, the hip joint and the main muscles are modelled by finite elements. The parts included in the model are the hip joint, hemipelvis, gluteus maximus, quadratus femoris, and gemellus inferior. The materials used in this model are isotropic elastic, Mooney-Rivlin, and neo-Hookean. The resultant hip forces of normal gait and stair climbing are applied to the model of the hip joint. The responses of displacement, stress, and strain of the muscles are then recorded. The FEBio non-linear solver for biomechanics is employed to conduct the simulation of the hip joint model with muscles. The contact interfaces used in this model are sliding contact and tied contact. From the analysis results, the gluteus maximus has the maximum displacement, stress, and strain in stair climbing. The quadratus femoris and gemellus inferior have the maximum displacement and strain in normal gait but the maximum stress in stair climbing. In addition, the computational model of the hip joint with muscles is produced as a research and investigation platform. The model can be used as a visualization platform of the hip joint.

  6. A Bayesian Approach for Instrumental Variable Analysis with Censored Time-to-Event Outcome

    PubMed Central

    Li, Gang; Lu, Xuyang

    2014-01-01

    Instrumental variable (IV) analysis has been widely used in economics, epidemiology, and other fields to estimate the causal effects of covariates on outcomes, in the presence of unobserved confounders and/or measurement errors in covariates. However, IV methods for time-to-event outcome with censored data remain underdeveloped. This paper proposes a Bayesian approach for IV analysis with censored time-to-event outcome by using a two-stage linear model. A Markov Chain Monte Carlo sampling method is developed for parameter estimation for both normal and non-normal linear models with elliptically contoured error distributions. Performance of our method is examined by simulation studies. Our method largely reduces bias and greatly improves coverage probability of the estimated causal effect, compared to the method that ignores the unobserved confounders and measurement errors. We illustrate our method on the Women's Health Initiative Observational Study and the Atherosclerosis Risk in Communities Study. PMID:25393617
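
    As a compact illustration of why a two-stage structure removes confounding bias, the simulation below contrasts a naive regression with an instrumental-variable fit. It uses plain two-stage least squares on a continuous outcome, standing in for the paper's Bayesian sampler and censored-outcome machinery, so it sketches only the underlying idea.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 5000
      u = rng.normal(size=n)                   # unobserved confounder
      z = rng.normal(size=n)                   # instrument: affects x, not y directly
      x = 0.8 * z + u + rng.normal(size=n)     # stage 1: exposure
      y = 1.5 * x + 2.0 * u + rng.normal(size=n)   # true causal effect = 1.5

      naive = np.polyfit(x, y, 1)[0]           # biased upward by the confounder u
      b1, b0 = np.polyfit(z, x, 1)             # first-stage fit
      iv = np.polyfit(b1 * z + b0, y, 1)[0]    # second stage: regress y on fitted x
      print(naive, iv)                         # ~2.3 (biased) vs ~1.5 (recovered)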

  7. Results and Analysis from Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Matty, Jennifer E.; Aitchison, Lindsay

    2009-01-01

    A space suit's mobility is critical to an astronaut's ability to perform work efficiently. As mobility increases, the astronaut can perform tasks for longer durations with less fatigue. The term mobility, with respect to space suits, is defined in terms of two key components: joint range of motion and joint torque. Individually, these measures describe the path through which a joint travels and the force required to move it through that path. Previous space suit mobility requirements were defined as the collective result of these two measures and verified by the completion of discrete functional tasks. While a valid way to impose mobility requirements, such a method does necessitate a solid understanding of the operational scenarios in which the final suit will be performing. Because the Constellation space suit system requirements are being finalized with a relatively immature concept of operations, the Space Suit Element team elected to define mobility in terms of its constituent parts to increase the likelihood that the future pressure garment will be mobile enough to enable a broad scope of undefined exploration activities. The range of motion requirements were defined by measuring the ranges of motion test subjects achieved while performing a series of joint-maximizing tasks in a variety of flight and prototype space suits. The definition of joint torque requirements has proved more elusive. NASA evaluated several different approaches to the problem before deciding to generate requirements based on unmanned joint torque evaluations of six different space suit configurations being articulated through 16 separate joint movements. This paper discusses the experiment design, data analysis and results, and the process used to determine the final values for the Constellation pressure garment joint torque requirements.

  8. Joint linkage and segregation analysis under multiallelic trait inheritance: simplifying interpretations for complex traits.

    PubMed

    Rosenthal, Elisabeth A; Wijsman, Ellen M

    2010-05-01

    Identification of the genetic basis of common traits may be hindered by underlying complex genetic architectures that are inadequately captured by existing models, including both multiallelic and multilocus modes of inheritance (MOI). One useful approach for localizing genes underlying continuous complex traits is the joint oligogenic linkage and segregation analysis implemented in the package Loki. The method uses reversible jump Markov chain Monte Carlo to eliminate the need to prespecify the number of quantitative trait loci (QTLs) in the trait model, thus providing posterior distributions for the number of QTLs in a Bayesian framework. The current implementation assumes QTLs are diallelic, and therefore can overestimate the number of linked QTLs in the presence of a multiallelic QTL. To address the possibility of multiple alleles, we extended the QTL model to allow for a variable number of additive alleles at each locus. Application to simulated data shows that, under a diallelic MOI, the multiallelic and diallelic analysis models give similar results. Under a multiallelic MOI, the multiallelic analysis model provides better mixing and improved convergence, and leads to a more accurate estimate of the underlying trait MOI and model parameter values, than does the diallelic model. Application to real data shows the multiallelic model results in fewer estimated linked QTLs and that the predominant QTL model is similar to one of two predominant models estimated from the diallelic analysis. Our results indicate that use of a multiallelic analysis model can lead to better understanding of the genetic architecture underlying complex traits.

  9. Dynamics analysis of space robot manipulator with joint clearance

    NASA Astrophysics Data System (ADS)

    Zhao, Yang; Bai, Zheng Feng

    2011-04-01

    A computational methodology for the analysis of space robot manipulator systems, considering the effects of clearances in the joints, is presented. The contact dynamics model for the joint clearance is established using a nonlinear equivalent spring-damper model, and the friction effect is considered using the Coulomb friction model. The dynamic equation of the space robot manipulator system with clearance is established. The dynamics simulation is then presented and the dynamic characteristics of the robot manipulator with clearance are analyzed. This work provides a practical method to analyze the dynamic characteristics of a space robot manipulator with joint clearance and supports engineering applications. The computational methodology can effectively predict the effects of clearance on the space robot manipulator, which is the basis of space robot manipulator design, precision analysis, and ground testing.
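
    A minimal sketch of the kind of nonlinear equivalent spring-damper contact force such clearance models use is given below. The exponent and hysteresis damping term follow the widely used Hunt-Crossley/Lankarani-Nikravesh form, and every parameter value is hypothetical rather than taken from the paper.

      def contact_force(delta, delta_dot, K=5e9, n=1.5, c_r=0.9, v0=0.1):
          """Normal contact force during journal-bearing penetration.

          delta      penetration depth [m] (<= 0 means no contact)
          delta_dot  penetration velocity [m/s]
          K          generalized contact stiffness [N/m^n]
          c_r, v0    restitution coefficient and initial impact velocity [m/s]
          """
          if delta <= 0.0:
              return 0.0
          chi = 3.0 * K * (1.0 - c_r ** 2) / (4.0 * v0)  # hysteresis damping factor
          return K * delta ** n + chi * delta ** n * delta_dot

      print(contact_force(1e-6, 0.05))   # force for a 1-micron penetration, in N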

  10. An analysis of a joint shear model for jointed media with orthogonal joint sets; Yucca Mountain Site Characterization Project

    SciTech Connect

    Koteras, J.R.

    1991-10-01

    This report describes a joint shear model used in conjunction with a computational model for jointed media with orthogonal joint sets. The joint shear model allows nonlinear behavior for both joint sets. Because nonlinear behavior is allowed for both joint sets, a great many cases must be considered to fully describe the joint shear behavior of the jointed medium. An extensive set of equations is required to describe the joint shear stress and slip displacements that can occur for all the various cases. This report examines possible methods for simplifying this set of equations so that the model can be implemented efficiently from a computational standpoint. The shear model must be examined carefully to obtain a computationally efficient implementation that does not lead to numerical problems. The application to fractures in rock is discussed. 5 refs., 4 figs.

  11. Biomechanical study of tarsometatarsal joint fusion using finite element analysis.

    PubMed

    Wang, Yan; Li, Zengyong; Zhang, Ming

    2014-11-01

    Complications of foot and ankle surgeries cause patients severe suffering. Sufficient understanding of internal biomechanical information such as stress distribution, contact pressure, and deformation is critical to estimate the effectiveness of surgical treatments and avoid complications. The foot and ankle form an intricate and synergetic system, and localized intervention may alter the functions of the adjacent components. The aim of this study was to estimate the biomechanical effects of TMT joint fusion using comprehensive finite element (FE) analysis. A foot and ankle model was constructed from 28 bones, 72 ligaments, and the plantar fascia, with soft tissues embracing all the segments. Kinematic information and ground reaction forces during gait were obtained from motion analysis. Three gait instants, namely the first peak, second peak, and mid-stance, were simulated in a normal foot and a foot with TMT joint fusion. It was found that contact pressure on the plantar foot increased by 0.42%, 19%, and 37%, respectively, after TMT fusion compared with normal foot walking. The naviculocuneiform and fifth metatarsal-cuboid joints sustained 27% and 40% increases in contact pressure at the second peak, implying a potential risk of joint problems such as arthritis. Von Mises stress in the second metatarsal bone increased by 22% at mid-stance, making it susceptible to stress fracture. This study provides biomechanical information for understanding the possible consequences of TMT joint fusion.

  12. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    PubMed

    Karabatsos, George

    2017-02-01

    Most applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that makes minimal assumptions that are not violated by the data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone, menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, built with the MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected

  13. Space Shuttle RTOS Bayesian Network

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

    With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network is extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores

  14. Benchmark Composite Wing Design Including Joint Analysis and Optimization

    NASA Astrophysics Data System (ADS)

    Albers, Robert G.

    A composite wing panel software package, named WING Joint OpTimization and Analysis (WINGJOTA) and featuring bolted joint analysis, is created and presented in this research. Three areas of focus were the development of an analytic composite bolted joint analysis suitable for fast evaluation; a more realistic wing design than what has been considered in the open literature; and the application of two optimization algorithms for composite wing design. Optimization results from 14 wing load cases applied to a composite wing panel with joints are presented. The composite bolted joint analysis consists of an elasticity solution that provides the stress state at a characteristic distance away from the bolt holes. The stresses at the characteristic distance are compared to a failure criterion on a ply-by-ply basis that not only determines first ply failure but also the failure mode. The loads in the multi-fastener joints used in this study were determined by an iterative scheme that provides the bearing-bypass loads to the elasticity analysis. A preliminary design of a composite subsonic transport wing was developed, based around a mid-size, twin-aisle aircraft. The benchmark design includes the leading and trailing edge structures and the center box inside the fuselage. Wing masses were included as point loads, and fuel loads were incorporated as distributed loads. The side-of-body boundary condition was modeled using high stiffness springs, and the aerodynamic loads were applied using an approximate point load scheme. The entire wing structure was modeled using the finite element code ANSYS to provide the internal loads needed as boundary conditions for the wing panel analyzed by WINGJOTA. The software package WINGJOTA combines the composite bolted joint analysis, a composite plate finite element analysis, a wing aeroelastic cycle, and two optimization algorithms to form the basis of a computer code for analysis and optimization. Both the Improving Hit-and-Run (IHR) and

  15. Analysis of Knee Joint Line Obliquity after High Tibial Osteotomy.

    PubMed

    Oh, Kwang-Jun; Ko, Young Bong; Bae, Ji Hoon; Yoon, Suk Tae; Kim, Jae Gyoon

    2016-11-01

    The aim of this study was to evaluate which lower extremity alignment (knee and ankle joint) parameters affect knee joint line obliquity (KJLO) in the coronal plane after open wedge high tibial osteotomy (OWHTO). Overall, 69 knees of patients that underwent OWHTO were evaluated using radiographs obtained preoperatively and from 6 weeks to 3 months postoperatively. We measured multiple parameters of knee and ankle joint alignment (hip-knee-ankle angle [HKA], joint line height [JLH], posterior tibial slope [PS], femoral condyle-tibial plateau angle [FCTP], medial proximal tibial angle [MPTA], mechanical lateral distal femoral angle [mLDFA], KJLO, talar tilt angle [TTA], ankle joint obliquity [AJO], and the lateral distal tibial ground surface angle [LDTGA]; preoperative [-pre], postoperative [-post], and the difference between -pre and -post values [-Δ]). We categorized patients into two groups according to the KJLO-post value (the normal group [within ± 4 degrees, 56 knees] and the abnormal group [greater than ± 4 degrees, 13 knees]), and compared their -pre parameters. Multiple logistic regression analysis was used to examine the contribution of the -pre parameters to abnormal KJLO-post. The mean HKA-Δ (-9.4 ± 4.7 degrees) was larger than the mean KJLO-Δ (-2.1 ± 3.2 degrees). The knee joint alignment parameters (the HKA-pre, FCTP-pre) differed significantly between the two groups (p < 0.05). In addition, the HKA-pre (odds ratio [OR] = 1.27, p = 0.006) and FCTP-pre (OR = 2.13, p = 0.006) were significant predictors of abnormal KJLO-post. However, -pre ankle joint parameters (TTA, AJO, and LDTGA) did not differ significantly between the two groups and were not significantly associated with the abnormal KJLO-post. The -pre knee joint alignment and knee joint convergence angle evaluated by HKA-pre and FCTP-pre angle, respectively, were significant predictors of abnormal KJLO after OWHTO. However, -pre ankle joint parameters

  16. Fermi's paradox, extraterrestrial life and the future of humanity: a Bayesian analysis

    NASA Astrophysics Data System (ADS)

    Verendel, Vilhelm; Häggström, Olle

    2017-01-01

    The Great Filter interpretation of Fermi's great silence asserts that Npq is not a very large number, where N is the number of potentially life-supporting planets in the observable universe, p is the probability that a randomly chosen such planet develops intelligent life to the level of present-day human civilization, and q is the conditional probability that it then goes on to develop a technological supercivilization visible all over the observable universe. Evidence suggests that N is huge, which implies that pq is very small. Hanson (1998) and Bostrom (2008) have argued that the discovery of extraterrestrial life would point towards p not being small and therefore a very small q, which can be seen as bad news for humanity's prospects of colonizing the universe. Here we investigate whether a Bayesian analysis supports their argument, and the answer turns out to depend critically on the choice of prior distribution.
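
    The Bayesian mechanics of the argument can be shown in miniature: put a prior over two hypothetical values of p and update on a single observation of life on a surveyed planet. The numbers below are illustrative only, not the priors analyzed in the paper.

      # Two competing hypotheses about p, the probability that a planet
      # develops intelligent life (values chosen only for illustration).
      hypotheses = {"p_small": 1e-10, "p_large": 1e-2}
      prior = {"p_small": 0.5, "p_large": 0.5}

      # Likelihood of observing life on one surveyed planet under each hypothesis:
      unnorm = {h: prior[h] * hypotheses[h] for h in hypotheses}
      Z = sum(unnorm.values())
      posterior = {h: v / Z for h, v in unnorm.items()}
      print(posterior)   # mass shifts overwhelmingly to p_large; since pq must
                         # remain small, q is pushed correspondingly down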

  17. Statistical analysis using the Bayesian nonparametric method for irradiation embrittlement of reactor pressure vessels

    NASA Astrophysics Data System (ADS)

    Takamizawa, Hisashi; Itoh, Hiroto; Nishiyama, Yutaka

    2016-10-01

    In order to understand neutron irradiation embrittlement in high fluence regions, statistical analysis using the Bayesian nonparametric (BNP) method was performed on the Japanese surveillance and material test reactor irradiation database. The BNP method is essentially expressed as an infinite summation of normal distributions, with the input data subdivided into clusters that share statistical parameters, such as mean and standard deviation; these clusters are used to estimate shifts in ductile-to-brittle transition temperature (DBTT). The clusters typically depend on chemical compositions, irradiation conditions, and the irradiation embrittlement. Specific variables contributing to the irradiation embrittlement include the content of Cu, Ni, P, Si, and Mn in the pressure vessel steels, neutron flux, neutron fluence, and irradiation temperature. It was found that the measured shifts of DBTT correlated well with the calculated ones. Data associated with the same materials were subdivided into the same clusters even when neutron fluence increased.

  18. PSF decomposition of nanoscopy images via Bayesian analysis unravels distinct molecular organization of the cell membrane

    PubMed Central

    Manzo, Carlo; van Zanten, Thomas S.; Saha, Suvrajit; Torreno-Pina, Juan A.; Mayor, Satyajit; Garcia-Parajo, Maria F.

    2014-01-01

    The spatial organization of membrane receptors at the nanoscale has major implications in cellular function and signaling. The advent of super-resolution techniques has greatly contributed to our understanding of the cellular membrane. Yet, despite the increased resolution, unbiased quantification of highly dense features, such as molecular aggregates, remains challenging. Here we describe an algorithm based on Bayesian inference of the marker intensity distribution that improves the determination of molecular positions inside dense nanometer-scale molecular aggregates. We tested the performance of the method on synthetic images representing a broad range of experimental conditions, demonstrating its wide applicability. We further applied this approach to STED images of GPI-anchored and model transmembrane proteins expressed in mammalian cells. The analysis revealed subtle differences in the organization of these receptors, emphasizing the role of cortical actin in the compartmentalization of the cell membrane. PMID:24619088

  19. Bayesian network meta-analysis for unordered categorical outcomes with incomplete data.

    PubMed

    Schmid, Christopher H; Trikalinos, Thomas A; Olkin, Ingram

    2014-06-01

    We develop a Bayesian multinomial network meta-analysis model for unordered (nominal) categorical outcomes that allows for partially observed data in which exact event counts may not be known for each category. This model properly accounts for correlations of counts in mutually exclusive categories and enables proper comparison and ranking of treatment effects across multiple treatments and multiple outcome categories. We apply the model to analyze 17 trials, each of which compares two of three treatments (high and low dose statins and standard care/control) for three outcomes for which data are complete: cardiovascular death, non-cardiovascular death and no death. We also analyze the cardiovascular death category divided into the three subcategories (coronary heart disease, stroke and other cardiovascular diseases) that are not completely observed. The multinomial and network representations show that high dose statins are effective in reducing the risk of cardiovascular disease.

  20. A Bayesian approach to probabilistic sensitivity analysis in structured benefit-risk assessment.

    PubMed

    Waddingham, Ed; Mt-Isa, Shahrul; Nixon, Richard; Ashby, Deborah

    2016-01-01

    Quantitative decision models such as multiple criteria decision analysis (MCDA) can be used in benefit-risk assessment to formalize trade-offs between benefits and risks, providing transparency to the assessment process. There is, however, no well-established method for propagating uncertainty in treatment-effects data through such models to provide a sense of the variability of the benefit-risk balance. Here, we present a Bayesian statistical method that directly models the outcomes observed in randomized placebo-controlled trials and uses this to infer indirect comparisons between competing active treatments. The resulting treatment effects estimates are suitable for use within the MCDA setting, and it is possible to derive the distribution of the overall benefit-risk balance through Markov Chain Monte Carlo simulation. The method is illustrated using a case study of natalizumab for relapsing-remitting multiple sclerosis.
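
    The propagation step itself is simple once posterior draws of the treatment effects are available. In the sketch below, simulated normal draws stand in for MCMC output and are pushed through a weighted MCDA value function, yielding a distribution for the overall benefit-risk balance; the outcomes and weights are hypothetical.

      import numpy as np

      rng = np.random.default_rng(11)
      benefit = rng.normal(0.30, 0.05, 10000)   # e.g. reduction in relapse rate
      risk = rng.normal(0.08, 0.03, 10000)      # e.g. serious adverse event rate

      w_benefit, w_risk = 0.7, 0.3              # elicited MCDA weights (assumed)
      score = w_benefit * benefit - w_risk * risk

      print(score.mean(), np.percentile(score, [2.5, 97.5]))
      print((score > 0).mean())                 # P(balance favours the treatment)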

  1. Bayesian estimation of semiparametric nonlinear dynamic factor analysis models using the Dirichlet process prior.

    PubMed

    Chow, Sy-Miin; Tang, Niansheng; Yuan, Ying; Song, Xinyuan; Zhu, Hongtu

    2011-02-01

    Parameters in time series and other dynamic models often show complex range restrictions and their distributions may deviate substantially from multivariate normal or other standard parametric distributions. We use the truncated Dirichlet process (DP) as a non-parametric prior for such dynamic parameters in a novel nonlinear Bayesian dynamic factor analysis model. This is equivalent to specifying the prior distribution to be a mixture distribution composed of an unknown number of discrete point masses (or clusters). The stick-breaking prior and the blocked Gibbs sampler are used to enable efficient simulation of posterior samples. Using a series of empirical and simulation examples, we illustrate the flexibility of the proposed approach in approximating distributions of very diverse shapes.
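
    For readers unfamiliar with the construction, the truncated stick-breaking representation of the DP prior mentioned above can be written in a few lines; the concentration alpha and truncation level K below are arbitrary illustrative choices.

      import numpy as np

      def stick_breaking(alpha, K, rng):
          """Truncated stick-breaking weights: pi_k = v_k * prod_{j<k}(1 - v_j)."""
          v = rng.beta(1.0, alpha, size=K)
          v[-1] = 1.0                   # truncation: the last stick takes the rest
          remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
          return v * remaining

      rng = np.random.default_rng(1)
      pi = stick_breaking(alpha=2.0, K=10, rng=rng)
      print(pi, pi.sum())               # weights sum to 1 under truncation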

  2. Bayesian Inference in Probabilistic Risk Assessment -- The Current State of the Art

    SciTech Connect

    Dana L. Kelly; Curtis L. Smith

    2009-02-01

    Markov chain Monte Carlo approaches to sampling directly from the joint posterior distribution of aleatory model parameters have led to tremendous advances in Bayesian inference capability in a wide variety of fields, including probabilistic risk analysis. The advent of freely available software coupled with inexpensive computing power has catalyzed this advance. This paper examines where the risk assessment community is with respect to implementing modern computational-based Bayesian approaches to inference. Through a series of examples in different topical areas, it introduces salient concepts and illustrates the practical application of Bayesian inference via Markov chain Monte Carlo sampling to a variety of important problems.
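
    As a self-contained example of the kind of computation the paper surveys (not an example taken from it), the sketch below runs a Metropolis-Hastings sampler for a Poisson failure rate with a lognormal prior, a standard problem in probabilistic risk assessment; the data and prior parameters are hypothetical.

      import numpy as np

      x, T = 2, 1000.0                 # hypothetical: 2 failures in 1000 component-years
      mu0, sigma0 = np.log(1e-3), 1.4  # hypothetical lognormal prior on the rate

      def log_post(lam):
          if lam <= 0:
              return -np.inf
          log_lik = x * np.log(lam * T) - lam * T                 # Poisson kernel
          log_prior = -np.log(lam) - (np.log(lam) - mu0) ** 2 / (2 * sigma0 ** 2)
          return log_lik + log_prior

      rng = np.random.default_rng(42)
      lam, chain = 1e-3, []
      for _ in range(20000):
          prop = lam * np.exp(0.5 * rng.standard_normal())        # log-scale walk
          # A multiplicative proposal requires the Jacobian term log(prop/lam).
          if np.log(rng.uniform()) < log_post(prop) - log_post(lam) + np.log(prop / lam):
              lam = prop
          chain.append(lam)
      print(np.mean(chain[5000:]))      # posterior mean failure rate per year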

  3. Video-analysis Interface for Angular Joint Measurement

    NASA Astrophysics Data System (ADS)

    Mondani, M.; Ghersi, I.; Miralles, M. T.

    2016-04-01

    Real-time quantification of joint articular movement is instrumental in the comprehensive assessment of significant biomechanical gestures. The development of an interface, based on an automatic algorithm for 3D-motion analysis, is presented in this work. The graphical interface uses open-source libraries for video processing, and its use is intuitive. The proposed method is low-cost, of acceptable precision (|ε_θ| < 1°), and minimally invasive. It allows the angular movement of joints in different planes to be obtained, synchronized with the video of the gesture, and enables comparisons and the calculation of parameters of interest from the acquired angular kinematics.

  4. Type Ia Supernova Light-Curve Inference: Hierarchical Bayesian Analysis in the Near-Infrared

    NASA Astrophysics Data System (ADS)

    Mandel, Kaisey S.; Wood-Vasey, W. Michael; Friedman, Andrew S.; Kirshner, Robert P.

    2009-10-01

    We present a comprehensive statistical analysis of the properties of Type Ia supernova (SN Ia) light curves in the near-infrared using recent data from the Peters Automated InfraRed Imaging TELescope and the literature. We construct a hierarchical Bayesian framework, incorporating several uncertainties including photometric error, peculiar velocities, dust extinction, and intrinsic variations, for principled and coherent statistical inference. SN Ia light-curve inferences are drawn from the global posterior probability of parameters describing both individual supernovae and the population conditioned on the entire SN Ia NIR data set. The logical structure of the hierarchical model is represented by a directed acyclic graph. Fully Bayesian analysis of the model and data is enabled by an efficient Markov Chain Monte Carlo algorithm exploiting the conditional probabilistic structure using Gibbs sampling. We apply this framework to the JHK_s SN Ia light-curve data. A new light-curve model captures the observed J-band light-curve shape variations. The marginal intrinsic variances in peak absolute magnitudes are σ(M_J) = 0.17 ± 0.03, σ(M_H) = 0.11 ± 0.03, and σ(M_Ks) = 0.19 ± 0.04. We describe the first quantitative evidence for correlations between the NIR absolute magnitudes and J-band light-curve shapes, and demonstrate their utility for distance estimation. The average residual in the Hubble diagram for the training set SNe at cz > 2000 km s^-1 is 0.10 mag. The new application of bootstrap cross-validation to SN Ia light-curve inference tests the sensitivity of the statistical model fit to the finite sample and estimates the prediction error at 0.15 mag. These results demonstrate that SN Ia NIR light curves are as effective as corrected optical light curves, and, because they are less vulnerable to dust absorption, they have great potential as precise and accurate cosmological distance indicators.

  5. Predictive distributions for between-study heterogeneity and simple methods for their application in Bayesian meta-analysis.

    PubMed

    Turner, Rebecca M; Jackson, Dan; Wei, Yinghui; Thompson, Simon G; Higgins, Julian P T

    2015-03-15

    Numerous meta-analyses in healthcare research combine results from only a small number of studies, for which the variance representing between-study heterogeneity is estimated imprecisely. A Bayesian approach to estimation allows external evidence on the expected magnitude of heterogeneity to be incorporated. The aim of this paper is to provide tools that improve the accessibility of Bayesian meta-analysis. We present two methods for implementing Bayesian meta-analysis, using numerical integration and importance sampling techniques. Based on 14,886 binary outcome meta-analyses in the Cochrane Database of Systematic Reviews, we derive a novel set of predictive distributions for the degree of heterogeneity expected in 80 settings depending on the outcomes assessed and comparisons made. These can be used as prior distributions for heterogeneity in future meta-analyses. The two methods are implemented in R, for which code is provided. Both methods produce equivalent results to standard but more complex Markov chain Monte Carlo approaches. The priors are derived as log-normal distributions for the between-study variance, applicable to meta-analyses of binary outcomes on the log odds-ratio scale. The methods are applied to two example meta-analyses, incorporating the relevant predictive distributions as prior distributions for between-study heterogeneity. We have provided resources to facilitate Bayesian meta-analysis, in a form accessible to applied researchers, which allow relevant prior information on the degree of heterogeneity to be incorporated.
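
    The numerical-integration route described above can be sketched directly: a grid posterior for the between-study variance tau^2 on the log odds-ratio scale, with the mean effect integrated out under a flat prior and a log-normal prior placed on tau^2. The study data and prior parameters below are hypothetical, not values from the paper's tables.

      import numpy as np

      y = np.array([-0.4, -0.1, -0.6, 0.2])      # study log odds-ratios (made up)
      s2 = np.array([0.10, 0.08, 0.20, 0.15])    # within-study variances (made up)
      mu_p, sd_p = -2.5, 1.7                     # assumed log-normal prior on tau^2

      tau2 = np.linspace(1e-4, 2.0, 2000)
      log_post = np.empty_like(tau2)
      for i, t2 in enumerate(tau2):
          w = 1.0 / (s2 + t2)
          mu_hat = np.sum(w * y) / np.sum(w)
          # Likelihood with the mean effect integrated out under a flat prior.
          log_lik = 0.5 * np.sum(np.log(w)) - 0.5 * np.log(np.sum(w)) \
                    - 0.5 * np.sum(w * (y - mu_hat) ** 2)
          log_prior = -np.log(t2) - (np.log(t2) - mu_p) ** 2 / (2 * sd_p ** 2)
          log_post[i] = log_lik + log_prior

      dt = tau2[1] - tau2[0]
      post = np.exp(log_post - log_post.max())
      post /= post.sum() * dt                    # normalize on the grid
      print(np.sum(tau2 * post) * dt)            # posterior mean of tau^2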

  6. Adhesive Characterization and Progressive Damage Analysis of Bonded Composite Joints

    NASA Technical Reports Server (NTRS)

    Girolamo, Donato; Davila, Carlos G.; Leone, Frank A.; Lin, Shih-Yung

    2014-01-01

    The results of an experimental/numerical campaign aimed at developing progressive damage analysis (PDA) tools for predicting the strength of a composite bonded joint under tensile loads are presented. The PDA is based on continuum damage mechanics (CDM) to account for intralaminar damage, and cohesive laws to account for interlaminar and adhesive damage. The adhesive response is characterized using standard fracture specimens and digital image correlation (DIC). The displacement fields measured by DIC are used to calculate the J-integrals, from which the associated cohesive laws of the structural adhesive can be derived. A finite element model of a sandwich conventional splice joint (CSJ) under tensile loads was developed. The simulations indicate that the model is capable of predicting the interactions of damage modes that lead to the failure of the joint.

  7. Results and Analysis from Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Matty, Jennifer

    2010-01-01

    A space suit's mobility is critical to an astronaut's ability to perform work efficiently. As mobility increases, the astronaut can perform tasks for longer durations with less fatigue. Mobility can be broken down into two parts: range of motion (ROM) and torque. These two measurements describe how the suit moves and how much force it takes to move. Two methods were chosen to define mobility requirements for the Constellation Space Suit Element (CSSE). One method focuses on range of motion and the second method centers on joint torque. A joint torque test was conducted to determine a baseline for current advanced space suit joint torques. This test utilized the following space suits: Extravehicular Mobility Unit (EMU), Advanced Crew Escape Suit (ACES), I-Suit, D-Suit, Enhanced Mobility (EM)-ACES, and Mark III (MK-III). Data was collected from 16 different joint movements of each suit. The results were then reviewed and CSSE joint torque requirement values were selected. The focus of this paper is to discuss trends observed during data analysis.

  8. Sea-level variability in tide-gauge and geological records: An empirical Bayesian analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Kopp, R. E.; Hay, C.; Morrow, E.; Mitrovica, J. X.; Horton, B.; Kemp, A.

    2013-12-01

    Sea level varies at a range of temporal and spatial scales, and understanding all its significant sources of variability is crucial to building sea-level rise projections relevant to local decision-making. In the twentieth-century record, sites along the U.S. east coast have exhibited typical year-to-year variability of several centimeters. A faster-than-global increase in sea-level rise in the northeastern United States since about 1990 has led some to hypothesize a 'sea-level rise hot spot' in this region, perhaps driven by a trend in the Atlantic Meridional Overturning Circulation related to anthropogenic climate change [1]. However, such hypotheses must be evaluated in the context of natural variability, as revealed by observational and paleo-records. Bayesian and empirical Bayesian statistical approaches are well suited for assimilating data from diverse sources, such as tide gauges and peats, with differing data availability and uncertainties, and for identifying regionally covarying patterns within these data. We present empirical Bayesian analyses of twentieth-century tide gauge data [2]. We find that the mid-Atlantic region of the United States has experienced a clear acceleration of sea level relative to the global average since about 1990, but this acceleration does not appear to be unprecedented in the twentieth-century record. The rate and extent of this acceleration instead appears comparable to an acceleration observed in the 1930s and 1940s. Both during the earlier episode of acceleration and today, the effect appears to be significantly positively correlated with the Atlantic Multidecadal Oscillation and likely negatively correlated with the North Atlantic Oscillation [2]. The Holocene and Common Era database of geological sea-level rise proxies [3,4] may allow these relationships to be assessed beyond the span of the direct observational record. At a global scale, similar approaches can be employed to look for the spatial fingerprints of land ice

  9. Bayesian uncertainty analysis for advanced seismic imaging - Application to the Mentelle Basin, Australia

    NASA Astrophysics Data System (ADS)

    Michelioudakis, Dimitrios G.; Hobbs, Richard W.; Caiado, Camila C. S.

    2016-04-01

    multivariate posterior distribution. The novelty of our approach and the major difference compared to the traditional semblance spectrum velocity analysis procedure is the calculation of uncertainty of the output model. As the model is able to estimate the credibility intervals of the corresponding interval velocities, we can produce the most probable PSDM images in an iterative manner. The depths extracted using our statistical algorithm are in very good agreement with the key horizons retrieved from the drilled core DSDP-258, showing that the Bayesian model is able to control the depth migration of the seismic data and estimate the uncertainty in depth to the drilling targets.

  10. Joint modality fusion and temporal context exploitation for semantic video analysis

    NASA Astrophysics Data System (ADS)

    Papadopoulos, Georgios Th; Mezaris, Vasileios; Kompatsiaris, Ioannis; Strintzis, Michael G.

    2011-12-01

    In this paper, a multi-modal context-aware approach to semantic video analysis is presented. Overall, the examined video sequence is initially segmented into shots and for every resulting shot appropriate color, motion and audio features are extracted. Then, Hidden Markov Models (HMMs) are employed for performing an initial association of each shot with the semantic classes that are of interest separately for each modality. Subsequently, a graphical modeling-based approach is proposed for jointly performing modality fusion and temporal context exploitation. Novelties of this work include the combined use of contextual information and multi-modal fusion, and the development of a new representation for providing motion distribution information to HMMs. Specifically, an integrated Bayesian Network is introduced for simultaneously performing information fusion of the individual modality analysis results and exploitation of temporal context, contrary to the usual practice of performing each task separately. Contextual information is in the form of temporal relations among the supported classes. Additionally, a new computationally efficient method for providing motion energy distribution-related information to HMMs, which supports the incorporation of motion characteristics from previous frames to the currently examined one, is presented. The final outcome of this overall video analysis framework is the association of a semantic class with every shot. Experimental results as well as comparative evaluation from the application of the proposed approach to four datasets belonging to the domains of tennis, news and volleyball broadcast video are presented.

  11. A Bayesian Approach to Person Fit Analysis in Item Response Theory Models. Research Report.

    ERIC Educational Resources Information Center

    Glas, Cees A. W.; Meijer, Rob R.

    A Bayesian approach to the evaluation of person fit in item response theory (IRT) models is presented. In a posterior predictive check, the observed value on a discrepancy variable is positioned in its posterior distribution. In a Bayesian framework, a Markov Chain Monte Carlo procedure can be used to generate samples of the posterior distribution…

  12. Bayesian Analysis and Characterization of Multiple Populations in Galactic Globular Clusters

    NASA Astrophysics Data System (ADS)

    Wagner-Kaiser, Rachel A.; Stenning, David; Sarajedini, Ata; von Hippel, Ted; van Dyk, David A.; Robinson, Elliot; Stein, Nathan; Jefferys, William H.; BASE-9, HST UVIS Globular Cluster Treasury Program

    2017-01-01

    Globular clusters have long been important tools to unlock the early history of galaxies. Thus, it is crucial we understand the formation and characteristics of the globular clusters (GCs) themselves. Historically, GCs were thought to be simple and largely homogeneous populations, formed via collapse of a single molecular cloud. However, this classical view has been overwhelmingly invalidated by recent work. It is now clear that the vast majority of globular clusters in our Galaxy host two or more chemically distinct populations of stars, with variations in helium and light elements at discrete abundance levels. No coherent story has arisen that is able to fully explain the formation of multiple populations in globular clusters or the mechanisms that drive stochastic variations from cluster to cluster. We use Cycle 21 Hubble Space Telescope (HST) observations and HST archival ACS Treasury observations of 30 Galactic Globular Clusters to characterize two distinct stellar populations. A sophisticated Bayesian technique is employed to simultaneously sample the joint posterior distribution of age, distance, and extinction for each cluster, as well as unique helium values for two populations within each cluster and the relative proportion of those populations. We find the helium differences between the two populations in the clusters fall in the range of 0.04 to 0.11. Because adequate models varying in CNO are not presently available, we view these spreads as upper limits and present them with statistical rather than observational uncertainties. Evidence supports previous studies suggesting an increase in helium content concurrent with increasing mass of the cluster. We also find that the proportion of the first population of stars increases with mass. Our results are examined in the context of proposed globular cluster formation scenarios.

  13. Combined target factor analysis and Bayesian soft-classification of interference-contaminated samples: forensic fire debris analysis.

    PubMed

    Williams, Mary R; Sigman, Michael E; Lewis, Jennifer; Pitan, Kelly McHugh

    2012-10-10

    A Bayesian soft classification method combined with target factor analysis (TFA) is described and tested for the analysis of fire debris data. The method relies on analysis of the average mass spectrum across the chromatographic profile (i.e., the total ion spectrum, TIS) from multiple samples taken from a single fire scene. A library of TIS from reference ignitable liquids with assigned ASTM classification is used as the target factors in TFA. The class-conditional distributions of correlations between the target and predicted factors for each ASTM class are represented by kernel functions and analyzed by Bayesian decision theory. The soft classification approach assists in assessing the probability that ignitable liquid residue from a specific ASTM E1618 class is present in a set of samples from a single fire scene, even in the presence of unspecified background contributions from pyrolysis products. The method is demonstrated with sample data sets and then tested on laboratory-scale burn data and large-scale field test burns. The overall performance achieved in laboratory and field tests of the method is approximately 80% correct classification of fire debris samples.
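
    The decision step itself can be sketched compactly: kernel density estimates of the class-conditional correlation distributions are combined through Bayes' rule into a posterior ("soft") class probability. The correlation samples below are simulated stand-ins, not fire-debris data.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(3)
      # Hypothetical training correlations between target and predicted factors:
      corr_present = rng.beta(8, 2, 200)    # when the ASTM class is truly present
      corr_absent = rng.beta(3, 6, 200)     # when it is absent

      kde_present = gaussian_kde(corr_present)
      kde_absent = gaussian_kde(corr_absent)

      def posterior_present(r, prior_present=0.5):
          """Soft classification: P(class present | observed correlation r)."""
          num = prior_present * kde_present(r)[0]
          return num / (num + (1 - prior_present) * kde_absent(r)[0])

      for r in (0.4, 0.7, 0.9):
          print(r, round(posterior_present(r), 3))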

  14. Integrated survival analysis using an event-time approach in a Bayesian framework

    USGS Publications Warehouse

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, including interval-censored times, with information from those whose fates are unknown, and we model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits the necessary parameter estimation. We provide the Bayesian model and its derivation, and we use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piecewise-constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the requirement that fates be known.
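
    The likelihood building block of such an event-time model can be sketched briefly. The Python fragment below is a minimal illustration, not the authors' integrated model: it evaluates a piecewise-constant hazard log-likelihood with right-censored observations only and omits the detection submodel for unknown fates; the interval cut points and data are invented.

    ```python
    import numpy as np

    # Piecewise-constant hazard over age intervals [0, 5), [5, 15), [15, inf)
    # days; the cut points are arbitrary choices for this illustration.
    cuts = np.array([0.0, 5.0, 15.0])

    def cum_hazard(t, lam):
        """Integrated hazard H(t) for interval rates lam."""
        H = 0.0
        for k, lo in enumerate(cuts):
            hi = cuts[k + 1] if k + 1 < len(cuts) else np.inf
            H += lam[k] * max(0.0, min(t, hi) - lo)
        return H

    def loglik(lam, times, died):
        """log f(t) = log h(t) - H(t) for deaths; log S(t) = -H(t) if censored."""
        ll = 0.0
        for t, d in zip(times, died):
            k = np.searchsorted(cuts, t, side="right") - 1
            ll += d * np.log(lam[k]) - cum_hazard(t, lam)
        return ll

    times = np.array([2.0, 7.0, 20.0, 4.0])   # hypothetical chick ages in days
    died = np.array([1, 0, 0, 1])             # 1 = death observed, 0 = censored
    print(loglik(np.array([0.2, 0.05, 0.01]), times, died))
    ```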

  15. Analysis of regression confidence intervals and Bayesian credible intervals for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Lu, Dan; Ye, Ming; Hill, Mary C.

    2012-09-01

    Confidence intervals based on classical regression theories augmented to include prior information and credible intervals based on Bayesian theories are conceptually different ways to quantify parametric and predictive uncertainties. Because both confidence and credible intervals are used in environmental modeling, we seek to understand their differences and similarities. This is of interest in part because calculating confidence intervals typically requires tens to thousands of model runs, while Bayesian credible intervals typically require tens of thousands to millions of model runs. Given multi-Gaussian distributed observation errors, our theoretical analysis shows that, for linear or linearized-nonlinear models, confidence and credible intervals are always numerically identical when consistent prior information is used. For nonlinear models, nonlinear confidence and credible intervals can be numerically identical if parameter confidence regions defined using the approximate likelihood method and parameter credible regions estimated using Markov chain Monte Carlo realizations are numerically identical and predictions are a smooth, monotonic function of the parameters. Both occur if intrinsic model nonlinearity is small. While the conditions of Gaussian errors and small intrinsic model nonlinearity are violated by many environmental models, heuristic tests using analytical and numerical models suggest that linear and nonlinear confidence intervals can be useful approximations of uncertainty even under significantly nonideal conditions. In the context of epistemic model error for a complex synthetic nonlinear groundwater problem, the linear and nonlinear confidence and credible intervals for individual models performed similarly enough to indicate that the computationally frugal confidence intervals can be useful in many circumstances. Experiences with these groundwater models are expected to be broadly applicable to many environmental models. We suggest that for
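
    The central comparison can be reproduced on a toy linear model. The sketch below, a minimal illustration under a flat (noninformative) prior rather than any model from the paper, computes a classical 95% confidence interval for a regression slope and a Bayesian credible interval by Monte Carlo sampling of the conjugate posterior; for this linear-Gaussian case the two intervals should nearly coincide, as the theoretical analysis predicts.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 50
    x = np.linspace(0, 10, n)
    y = 2.0 + 0.5 * x + rng.normal(0, 1.0, n)       # synthetic linear data

    X = np.column_stack([np.ones(n), x])
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta_hat
    s2 = resid @ resid / (n - 2)
    XtX_inv = np.linalg.inv(X.T @ X)
    se = np.sqrt(s2 * XtX_inv[1, 1])

    # Classical 95% confidence interval for the slope.
    tcrit = stats.t.ppf(0.975, n - 2)
    ci = (beta_hat[1] - tcrit * se, beta_hat[1] + tcrit * se)

    # Credible interval under the flat prior p(beta, log sigma) = const:
    # draw sigma^2 from its scaled inverse-chi-square posterior, then beta.
    slope_draws = np.empty(10000)
    for i in range(slope_draws.size):
        sig2 = (n - 2) * s2 / rng.chisquare(n - 2)
        slope_draws[i] = rng.multivariate_normal(beta_hat, sig2 * XtX_inv)[1]
    cri = np.percentile(slope_draws, [2.5, 97.5])
    print(ci, cri)    # the two intervals should nearly coincide
    ```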

  16. Integrated survival analysis using an event-time approach in a Bayesian framework

    PubMed Central

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, including interval-censored times, with information from those whose fates are unknown, and we model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits the necessary parameter estimation. We provide the Bayesian model and its derivation, and we use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piecewise-constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the requirement that fates be known.

  17. Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel

    2005-01-01

    Contents include the following: JWST/ISIM introduction. Design and analysis challenges for ISIM bonded joints. JWST/ISIM joint designs. Bonded joint analysis. Finite element modeling. Failure criteria and margin calculation. Analysis/test correlation procedure. Example of test data and analysis.

  18. Bayesian hierarchical analysis of within-units variances in repeated measures experiments.

    PubMed

    Ten Have, T R; Chinchilli, V M

    1994-09-30

    We develop hierarchical Bayesian models for biomedical data that consist of multiple measurements on each individual under each of several conditions. The focus is on investigating differences in within-subject variation between conditions. We present both population-level and individual-level comparisons. We extend the partial likelihood models of Chinchilli et al. with a unique Bayesian hierarchical framework for variance components and associated degrees of freedom. We use the Gibbs sampler to estimate posterior marginal distributions for the parameters of the Bayesian hierarchical models. The application involves a comparison of two cholesterol analysers each applied repeatedly to a sample of subjects. Both the partial likelihood and Bayesian approaches yield similar results, although confidence limits tend to be wider under the Bayesian models.
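
    A conjugate Gibbs step for subject-level variances conveys the flavor of such a sampler. The sketch below is a simplified stand-in for the authors' hierarchical model: it assumes a common inverse-gamma prior on each subject's within-subject variance and flat priors on the subject means, with all hyperparameters and data invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    # Hypothetical repeated measures: rows = subjects, columns = replicates,
    # with true within-subject standard deviations that differ across subjects.
    y = rng.normal(loc=200, scale=rng.uniform(5, 15, size=(20, 1)), size=(20, 8))

    a0, b0 = 2.0, 100.0                 # assumed inverse-gamma hyperparameters
    n_sub, n_rep = y.shape
    means = y.mean(axis=1)              # initialize subject means at sample means

    draws = []
    for it in range(2000):
        # Conjugate update: sigma_i^2 | rest ~ InvGamma(a0 + n/2, b0 + SS_i/2),
        # drawn as the reciprocal of a gamma variate.
        ss = ((y - means[:, None]) ** 2).sum(axis=1)
        sig2 = 1.0 / rng.gamma(a0 + n_rep / 2, 1.0 / (b0 + ss / 2))
        # Conjugate update for each subject mean given its variance (flat prior).
        means = rng.normal(y.mean(axis=1), np.sqrt(sig2 / n_rep))
        draws.append(sig2)

    post = np.array(draws)[500:]        # discard burn-in
    print(post.mean(axis=0)[:5])        # posterior mean variances, first 5 subjects
    ```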

  19. Joint association analysis of bivariate quantitative and qualitative traits.

    PubMed

    Yuan, Mengdie; Diao, Guoqing

    2011-11-29

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for testing the genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait of disease status at SNPs whose minor allele frequency is not too small.
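
    The joint likelihood described here factors into the marginal density of the quantitative trait times a conditional probit probability for the qualitative trait. The sketch below is a minimal Python rendering of that factorization under assumed parameter names, not the authors' implementation; the genotype data are simulated.

    ```python
    import numpy as np
    from scipy.stats import norm

    def joint_loglik(params, g, y_quant, y_qual):
        """Joint log-likelihood at one SNP: a quantitative trait plus a binary
        trait generated by a latent normal liability with threshold 0
        (parameter names are assumptions for this sketch)."""
        b0, b1, g0, g1, sigma, rho = params
        mu_q = b0 + b1 * g                    # mean of the quantitative trait
        mu_l = g0 + g1 * g                    # mean of the latent liability
        # Bivariate normality: latent | quantitative trait is again normal.
        cond_mu = mu_l + rho * (y_quant - mu_q) / sigma
        cond_sd = np.sqrt(1.0 - rho ** 2)
        p1 = norm.cdf(cond_mu / cond_sd)      # P(qualitative = 1 | quantitative)
        ll = norm.logpdf(y_quant, mu_q, sigma).sum()
        ll += np.where(y_qual == 1, np.log(p1), np.log(1.0 - p1)).sum()
        return ll

    rng = np.random.default_rng(3)
    g = rng.integers(0, 3, 500)               # genotypes coded 0/1/2
    y_q = 1.0 + 0.3 * g + rng.normal(0, 1, 500)
    y_b = (0.5 * g + rng.normal(0, 1, 500) > 1.0).astype(int)
    print(joint_loglik([1.0, 0.3, -1.0, 0.5, 1.0, 0.2], g, y_q, y_b))
    ```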

  20. The Phylogeographic History of the New World Screwworm Fly, Inferred by Approximate Bayesian Computation Analysis

    PubMed Central

    Azeredo-Espin, Ana Maria L.

    2013-01-01

    Insect pest phylogeography might be shaped both by biogeographic events and by human influence. Here, we conducted an approximate Bayesian computation (ABC) analysis to investigate the phylogeography of the New World screwworm fly, Cochliomyia hominivorax, with the aim of understanding its population history and its order and time of divergence. Our ABC analysis supports the view that populations spread from north to south in the Americas in at least two distinct episodes. The first split occurred between the North/Central American and South American populations at the end of the Last Glacial Maximum (15,300-19,000 YBP). The second split occurred between the North and South Amazonian populations in the transition between the Pleistocene and the Holocene eras (9,100-11,000 YBP). The species also experienced population expansion. Phylogenetic analysis likewise suggests this north-to-south colonization, and Maxent models suggest an increase in the number of suitable areas in South America from the past to the present. We found that the phylogeographic patterns observed in C. hominivorax cannot be explained by climatic oscillations alone and can be connected to host population histories. Interestingly, these patterns coincide closely with general patterns of ancient human movements in the Americas, suggesting that humans might have played a crucial role in shaping the distribution and population structure of this insect pest. This work presents the first hypothesis test regarding the processes that shaped the current phylogeographic structure of C. hominivorax and represents an alternate perspective on investigating the problem of insect pests. PMID:24098436
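
    The ABC machinery itself is simple to sketch. The following toy rejection sampler is purely illustrative; the summary statistic and its relation to split time are invented, not a real coalescent model: draw split times from a prior, simulate a summary statistic, and keep the draws whose simulated statistic falls within a tolerance of the observed value.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def simulate_stat(split_times):
        """Toy stand-in for a coalescent simulator: the summary statistic grows
        with split time, plus simulation noise (not a real genetic model)."""
        return 1 - np.exp(-split_times / 20000) + rng.normal(0, 0.02, split_times.size)

    observed = 0.55                                 # hypothetical observed statistic
    prior_draws = rng.uniform(5000, 30000, 100000)  # uniform prior on split time (YBP)

    # ABC rejection: keep draws whose simulated statistic is close to the observed one.
    sims = simulate_stat(prior_draws)
    accepted = prior_draws[np.abs(sims - observed) < 0.01]
    print(f"posterior mean split: {accepted.mean():.0f} YBP; "
          f"95% interval: {np.percentile(accepted, [2.5, 97.5]).round()}")
    ```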

  1. Integrating diagnostic data analysis for W7-AS using Bayesian graphical models

    SciTech Connect

    Svensson, J.; Dinklage, A.; Geiger, J.; Werner, A.; Fischer, R

    2004-10-01

    Analysis of diagnostic data in fusion experiments is usually dealt with separately for each diagnostic, in spite of the existence of a large number of interdependencies between global physics parameters and measurements from different diagnostics. In this article, we demonstrate an integrated data analysis model, applied to the W7-AS stellarator, in which diagnostic interdependencies have been modeled in a novel way by using so-called Bayesian graphical models. A Thomson scattering system, interferometer, diamagnetic loop, and neutral particle analyzer are combined with an equilibrium reconstruction, together forming a single model for the determination of quantities such as density and temperature profiles, directly in magnetic coordinates. The magnetic coordinate transformation is itself inferred from the measurements. The influence of both statistical and systematic uncertainties on quantities from equilibrium calculations, such as the position of flux surfaces, can therefore be readily estimated together with the uncertainties of profile estimates. The model allows for the modular addition of further diagnostics. A software architecture for such integrated analysis, in which a possibly large number of diagnostic and theoretical codes needs to be combined, is also discussed.

  2. The phylogeographic history of the new world screwworm fly, inferred by approximate bayesian computation analysis.

    PubMed

    Fresia, Pablo; Azeredo-Espin, Ana Maria L; Lyra, Mariana L

    2013-01-01

    Insect pest phylogeography might be shaped both by biogeographic events and by human influence. Here, we conducted an approximate Bayesian computation (ABC) analysis to investigate the phylogeography of the New World screwworm fly, Cochliomyia hominivorax, with the aim of understanding its population history and its order and time of divergence. Our ABC analysis supports the view that populations spread from north to south in the Americas in at least two distinct episodes. The first split occurred between the North/Central American and South American populations at the end of the Last Glacial Maximum (15,300-19,000 YBP). The second split occurred between the North and South Amazonian populations in the transition between the Pleistocene and the Holocene eras (9,100-11,000 YBP). The species also experienced population expansion. Phylogenetic analysis likewise suggests this north-to-south colonization, and Maxent models suggest an increase in the number of suitable areas in South America from the past to the present. We found that the phylogeographic patterns observed in C. hominivorax cannot be explained by climatic oscillations alone and can be connected to host population histories. Interestingly, these patterns coincide closely with general patterns of ancient human movements in the Americas, suggesting that humans might have played a crucial role in shaping the distribution and population structure of this insect pest. This work presents the first hypothesis test regarding the processes that shaped the current phylogeographic structure of C. hominivorax and represents an alternate perspective on investigating the problem of insect pests.

  3. Bayesian Hierarchical Spatially Correlated Functional Data Analysis with Application to Colon Carcinogenesis

    PubMed Central

    Baladandayuthapani, Veerabhadran; Mallick, Bani K.; Hong, Mee Young; Lupton, Joanne R.; Turner, Nancy D.; Carroll, Raymond J.

    2009-01-01

    In this article, we present new methods to analyze data from an experiment using rodent models to investigate the role of p27, an important cell-cycle mediator, in early colon carcinogenesis. The responses modeled here are essentially functions nested within a two-stage hierarchy. Standard functional data analysis literature focuses on a single stage of hierarchy and conditionally independent functions with near white noise. However, in our experiment, there is substantial biological motivation for the existence of spatial correlation among the functions, which arise from the locations of biological structures called colonic crypts: this possible functional correlation is a phenomenon we term crypt signaling. Thus, as a point of general methodology, we require an analysis that allows for functions to be correlated at the deepest level of the hierarchy. Our approach is fully Bayesian and uses Markov chain Monte Carlo methods for inference and estimation. Analysis of this data set gives new insights into the structure of p27 expression in early colon carcinogenesis and suggests the existence of significant crypt signaling. Our methodology uses regression splines, and because of the hierarchical nature of the data, dimension reduction of the covariance matrix of the spline coefficients is important: we suggest simple methods for overcoming this problem. PMID:17608780

  4. Application of Bayesian and cost benefit risk analysis in water resources management

    NASA Astrophysics Data System (ADS)

    Varouchakis, E. A.; Palogos, I.; Karatzas, G. P.

    2016-03-01

    Decision making is a significant tool in water resources management applications. This technical note approaches a decision dilemma that has not yet been considered in the water resources management of a watershed. A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined imposes a fine that varies parabolically with the scale of over-pumping violations, in contrast to common practices that usually consider short-term fines. The methodological steps are presented analytically, together with originally developed code. An application of this kind, presented in this level of detail, is new. The results indicate that the uncertainty in the probabilities is the driving issue that determines the optimal decision under each methodology, and depending on how the unknown probabilities are handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause within an audit interval. In contrast to practices that assess the effect of each proposed action separately, considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits.
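
    The Bayesian decision step reduces to comparing expected costs under a prior over states of nature. The fragment below is a deliberately tiny stand-in for the note's analysis, with invented states, priors, and costs.

    ```python
    import numpy as np

    # Hypothetical states of nature: future water deficit (low, medium, high),
    # with prior probabilities elicited from historical records (all invented).
    prior = np.array([0.5, 0.3, 0.2])

    # Cost (monetary units) of each action under each state: the reservoir has a
    # fixed construction cost, while doing nothing incurs state-dependent fines.
    cost = np.array([
        [120.0, 120.0, 120.0],   # build the reservoir
        [ 10.0, 100.0, 400.0],   # no reservoir: over-pumping fines grow with deficit
    ])

    expected = cost @ prior
    actions = ["build reservoir", "no reservoir"]
    print(dict(zip(actions, expected)), "->", actions[int(np.argmin(expected))])
    ```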

  5. Bayesian meta-analysis of diagnostic tests allowing for imperfect reference standards.

    PubMed

    Menten, J; Boelaert, M; Lesaffre, E

    2013-12-30

    There is an increasing interest in meta-analyses of rapid diagnostic tests (RDTs) for infectious diseases. To avoid spectrum bias, these meta-analyses should focus on phase IV studies performed in the target population. For many infectious diseases, these target populations attend primary health care centers in resource-constrained settings where it is difficult to perform gold standard diagnostic tests. As a consequence, phase IV diagnostic studies often use imperfect reference standards, which may result in biased meta-analyses of the diagnostic accuracy of novel RDTs. We extend the standard bivariate model for the meta-analysis of diagnostic studies to correct for differing and imperfect reference standards in the primary studies and to accommodate data from studies that try to overcome the absence of a true gold standard through the use of latent class analysis. Using Bayesian methods, improved estimates of sensitivity and specificity are possible, especially when prior information is available on the diagnostic accuracy of the reference test. In this analysis, the deviance information criterion can be used to detect conflicts between the prior information and observed data. When applying the model to a dataset of the diagnostic accuracy of an RDT for visceral leishmaniasis, the standard meta-analytic methods appeared to underestimate the specificity of the RDT.

  6. Bayesian inference for neural electromagnetic source localization: analysis of MEG visual evoked activity

    NASA Astrophysics Data System (ADS)

    Schmidt, David M.; George, John S.; Wood, C. C.

    1999-05-01

    We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface, within a sphere centered on some location in cortex. The number and radii of active regions can vary to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented.

  7. Bayesian Inference for Neural Electromagnetic Source Localization: Analysis of MEG Visual Evoked Activity

    SciTech Connect

    George, J.S.; Schmidt, D.M.; Wood, C.C.

    1999-02-01

    We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface, within a sphere centered on some location in cortex. The number and radii of active regions can vary to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented.

  8. A Bayesian analysis of mouse infectivity data to evaluate the effectiveness of using ultraviolet light as a drinking water disinfectant.

    PubMed

    Qian, Song S; Linden, Karl; Donnelly, Maureen

    2005-10-01

    Modelling disinfectant performance using Bayesian hierarchical methods can overcome problems with traditional methods and lead to improved estimates. Animal and cell-culture assays are used to estimate the degree of inactivation of a microorganism produced by a given disinfectant dose. Assay data traditionally are analyzed with a logistic model or the most probable number (MPN) method. These methods are limited, particularly when assays show all (or no) animals or cells to be infected: estimates are then reported as greater than (or less than) a measurement limit (i.e., censored data). The proposed Bayesian approach (1) properly models the propagation of uncertainty through the data analysis/modelling process, resulting in reduced model uncertainty, and (2) uses appropriate probability distribution models for the response variables, avoiding the censored data problem and more accurately describing statistical error when estimating dose-response behavior. This paper applies the Bayesian hierarchical models to logistic and MPN data from published papers for the ultraviolet (UV) inactivation of Cryptosporidium. Results are compared to those from three alternative models. The Bayesian model estimates a significantly lower UV dose for a given level of Cryptosporidium inactivation than the alternative models, due mainly to the reduced model uncertainty.

  9. Bayesian sequential meta-analysis design in evaluating cardiovascular risk in a new antidiabetic drug development program.

    PubMed

    Chen, Ming-Hui; Ibrahim, Joseph G; Amy Xia, H; Liu, Thomas; Hennessey, Violeta

    2014-04-30

    Recently, the Center for Drug Evaluation and Research at the Food and Drug Administration released a guidance that makes recommendations about how to demonstrate that a new antidiabetic therapy to treat type 2 diabetes is not associated with an unacceptable increase in cardiovascular risk. One of the recommendations from the guidance is that phases II and III trials should be appropriately designed and conducted so that a meta-analysis can be performed. In addition, the guidance implies that a sequential meta-analysis strategy could be adopted. That is, the initial meta-analysis could aim at demonstrating the upper bound of a 95% confidence interval (CI) for the estimated hazard ratio to be < 1.8 for the purpose of enabling a new drug application or a biologics license application. Subsequently after the marketing authorization, a final meta-analysis would need to show the upper bound to be < 1.3. In this context, we develop a new Bayesian sequential meta-analysis approach using survival regression models to assess whether the size of a clinical development program is adequate to evaluate a particular safety endpoint. We propose a Bayesian sample size determination methodology for sequential meta-analysis clinical trial design with a focus on controlling the familywise type I error rate and power. We use the partial borrowing power prior to incorporate the historical survival meta-data into the Bayesian design. We examine various properties of the proposed methodology, and simulation-based computational algorithms are developed to generate predictive data at various interim analyses, sample from the posterior distributions, and compute various quantities such as the power and the type I error in the Bayesian sequential meta-analysis trial design. We apply the proposed methodology to the design of a hypothetical antidiabetic drug development program for evaluating cardiovascular risk.

  10. A BAYESIAN HIERARCHICAL SPATIAL POINT PROCESS MODEL FOR MULTI-TYPE NEUROIMAGING META-ANALYSIS

    PubMed Central

    Kang, Jian; Nichols, Thomas E.; Wager, Tor D.; Johnson, Timothy D.

    2014-01-01

    Neuroimaging meta-analysis is an important tool for finding consistent effects over studies that each usually have 20 or fewer subjects. Interest in meta-analysis in brain mapping is also driven by a recent focus on so-called “reverse inference”: whereas traditional “forward inference” identifies the regions of the brain involved in a task, a reverse inference identifies the cognitive processes that a task engages. Such reverse inferences, however, require a set of meta-analyses, one for each possible cognitive domain. However, existing methods for neuroimaging meta-analysis have significant limitations. Commonly used methods for neuroimaging meta-analysis are not model based, do not provide interpretable parameter estimates, and only produce null hypothesis inferences; further, they are generally designed for a single group of studies and cannot produce reverse inferences. In this work we address these limitations by adopting a non-parametric Bayesian approach for the meta-analysis of data from multiple classes or types of studies. In particular, foci from each type of study are modeled as a cluster process driven by a random intensity function that is modeled as a kernel convolution of a gamma random field. The type-specific gamma random fields are linked and modeled as a realization of a common gamma random field, shared by all types, that induces correlation between study types and mimics the behavior of a univariate mixed effects model. We illustrate our model on simulation studies and a meta-analysis of five emotions from 219 studies and check model fit by a posterior predictive assessment. In addition, we implement reverse inference by using the model to predict study type from a newly presented study. We evaluate this predictive performance via leave-one-out cross validation that is efficiently implemented using importance sampling techniques. PMID:25426185

  11. Analysis of Bolted Flanged Panel Joint for GRP Sectional Tanks

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, S. M.; Dyer, B.; Kashtalyan, M.; Akisanya, A. R.; Guz, I.; Wilkinson, C.

    2014-02-01

    The performance of flanged panel bolted joints used in Glass Reinforced Plastic (GRP) sectional tanks is investigated using a combination of experimental and computational methods. A four-panel bolted assembly is subjected to varying pressure in a rupture test rig to study damage development at the intersection of the four panels. It is found that cracking initiates at a panel corner at the four-panel intersection at a pressure of 35 kPa and propagates to the other panel corners with increasing pressure. This is attributed to the excessive deformation at the four-panel intersection. The effect of bolt spacing, varying end distances, and bolt pre-tension in decreasing the localized deformation and maximum induced stresses is investigated using finite element analysis. It is found that varying the bolt spacing and end distances had a considerable influence on the joint performance, whereas varying the bolt pre-tension had a negligible effect. Consequently, this study establishes the maximum pressure that the GRP panel joint can withstand without failure and the corresponding optimum joint parameters.

  12. A bayesian approach to classification criteria for spectacled eiders

    USGS Publications Warehouse

    Taylor, B.L.; Wade, P.R.; Stehn, R.A.; Cochrane, J.F.

    1996-01-01

    To facilitate decisions to classify species according to risk of extinction, we used Bayesian methods to analyze trend data for the Spectacled Eider, an arctic sea duck. Trend data from three independent surveys of the Yukon-Kuskokwim Delta were analyzed individually and in combination to yield posterior distributions for population growth rates. We used classification criteria developed by the recovery team for Spectacled Eiders that seek to equalize the errors of under- or overprotecting the species. We conducted both a Bayesian decision analysis and a frequentist (classical statistical inference) decision analysis. Bayesian decision analyses are computationally easier, yield essentially the same results, and produce results that are easier to explain to nonscientists. With the exception of the aerial survey analysis of the 10 most recent years, both Bayesian and frequentist methods indicated that an endangered classification is warranted. The discrepancy between surveys warrants further research. Although the trend data are abundance indices, we used a preliminary estimate of absolute abundance to demonstrate how to calculate extinction distributions using the joint probability distributions for population growth rate and variance in growth rate generated by the Bayesian analysis. Recent apparent increases in abundance highlight the need for models that apply to declining and then recovering species.
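
    The classification logic can be sketched with a toy posterior. The snippet below assumes a Gaussian posterior for the annual growth rate (invented numbers, not the paper's estimates), computes the posterior probability of decline, and illustrates an extinction-style summary using an assumed absolute abundance.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical posterior for the annual population growth rate lambda; the
    # paper combines several surveys, here a single Gaussian summarizes them.
    post = stats.norm(loc=0.95, scale=0.03)

    # Classification in the spirit of criteria that balance the errors of under-
    # and overprotection: classify as endangered if P(decline) exceeds 0.5.
    p_decline = post.cdf(1.0)                  # P(lambda < 1 | data)
    print(f"P(decline) = {p_decline:.3f} ->",
          "endangered" if p_decline > 0.5 else "not endangered")

    # Extinction-style summary from an assumed absolute abundance of 8000 birds:
    # project 20 years ahead with each posterior draw of lambda.
    draws = post.rvs(size=100000, random_state=0)
    n20 = 8000 * draws ** 20
    print("P(N_20 < 1000) =", np.mean(n20 < 1000))
    ```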

  13. The Bayesian approach to reporting GSR analysis results: some first-hand experiences

    NASA Astrophysics Data System (ADS)

    Charles, Sebastien; Nys, Bart

    2010-06-01

    The use of Bayesian principles in the reporting of forensic findings has been a matter of interest for some years. Recently, the GSR community too has begun gradually exploring the advantages of this method, or rather approach, for writing reports. Since last year, our GSR group has been adapting its reporting procedures to the use of Bayesian principles. The police and magistrates find the reports more directly accessible and useful in their part of the criminal investigation. In the lab, we find that by applying Bayesian principles, unnecessary analyses can be eliminated and instrument time can thus be freed.
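
    The Bayesian reporting logic rests on the likelihood-ratio form of Bayes' theorem. A minimal sketch follows, with placeholder probabilities that are not real GSR transfer rates.

    ```python
    # Likelihood-ratio form of Bayes' theorem used in evaluative reporting:
    # posterior odds = LR * prior odds, where LR = P(E | H_p) / P(E | H_d).
    p_e_given_hp = 0.60   # placeholder: probability of the GSR findings if H_p holds
    p_e_given_hd = 0.05   # placeholder: probability of the same findings under H_d

    lr = p_e_given_hp / p_e_given_hd
    print(f"LR = {lr:.0f}: the findings are {lr:.0f} times more probable "
          "under H_p than under H_d; prior odds are left to the court.")
    ```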

  14. Asymmetric joint multifractal analysis in Chinese stock markets

    NASA Astrophysics Data System (ADS)

    Chen, Yuwen; Zheng, Tingting

    2017-04-01

    In this paper, an asymmetric joint multifractal analysis method based on statistical physics is proposed to explore the asymmetric correlation between daily returns and trading volumes in Chinese stock markets. The results show that asymmetric multifractal correlations exist between returns and trading volumes in Chinese stock markets. Moreover, when the stock indexes are moving upward, the fluctuations of returns are always weaker than when they are moving downward, regardless of whether trading volumes are high or low.

  15. Joint analysis of the seismic data and velocity gravity model

    NASA Astrophysics Data System (ADS)

    Belyakov, A. S.; Lavrov, V. S.; Muchamedov, V. A.; Nikolaev, A. V.

    2016-03-01

    We performed a joint analysis of the seismic noise recorded at the Japanese Ogasawara station, located on Titijima Island in the Philippine Sea, using the STS-2 seismograph at the OSW station in the winter period of January 1-15, 2015, against the background of a velocity gravity model. The graphs demonstrate a cause-and-effect relation between the seismic noise and gravity and allow us to treat the noise as a desired signal.

  16. Bayesian Covariate Selection in Mixed-Effects Models For Longitudinal Shape Analysis

    PubMed Central

    Muralidharan, Prasanna; Fishbaugh, James; Kim, Eun Young; Johnson, Hans J.; Paulsen, Jane S.; Gerig, Guido; Fletcher, P. Thomas

    2016-01-01

    The goal of longitudinal shape analysis is to understand how anatomical shape changes over time in response to biological processes, including growth, aging, or disease. In many imaging studies, it is also critical to understand how these shape changes are affected by other factors, such as sex, disease diagnosis, IQ, etc. Current approaches to longitudinal shape analysis have focused on modeling age-related shape changes but have not included the ability to handle covariates. In this paper, we present a novel Bayesian mixed-effects shape model that incorporates into the model simultaneous relationships between longitudinal shape data and multiple predictors or covariates. Moreover, we place an Automatic Relevance Determination (ARD) prior on the parameters, which lets us automatically select the covariates most relevant to the model based on the observed data. We evaluate our proposed model and inference procedure on a longitudinal study of Huntington's disease from PREDICT-HD. We first show the utility of the ARD prior for model selection in a univariate modeling of striatal volume, and next we apply the full high-dimensional longitudinal shape model to putamen shapes. PMID:28090246

  17. Bayesian analysis of equivalent sound sources for a military jet aircraft

    NASA Astrophysics Data System (ADS)

    Hart, David

    2012-10-01

    Radiated jet noise is believed to be generated by a mixture of fine-scale turbulent structures (FSS) and large-scale turbulent structures (LSS). In previous work, the noise from an F-22A Raptor has been modeled as two sets of monopole sources whose characteristics account for both FSS and LSS sound propagation [Morgan, J. Acoust. Soc. Am. 129, 2442 (2011)]. The source parameters are manually adjusted until the calculations reproduce the measured field along a surface. Once this has been done, the equivalent source of monopoles can be used to further analyze the sound field around the jet. In order to automate this process, parameters are selected based on Bayesian methods implemented with simulated annealing and fast Gibbs sampler algorithms. This method yields the best-fit parameters and, from the generated posterior probability distributions (PPDs), the sensitivity of the solution. For example, the analysis has shown that the peak source region of the LSS is more important than the peak source region of the FSS. Further analysis of the generated PPDs will give greater insight into the nature of the radiated jet noise.

  18. Bayesian inverse analysis of neuromagnetic data using cortically constrained multiple dipoles.

    PubMed

    Auranen, Toni; Nummenmaa, Aapo; Hämäläinen, Matti S; Jääskeläinen, Iiro P; Lampinen, Jouko; Vehtari, Aki; Sams, Mikko

    2007-10-01

    A recently introduced Bayesian model for magnetoencephalographic (MEG) data consistently localized multiple simulated dipoles with the help of marginalization of the spatiotemporal background noise covariance structure in the analysis [Jun et al. (2005) Neuroimage 28:84-98]. Here, we elaborated this model to include subjects' individual brain surface reconstructions with cortical location and orientation constraints. To enable efficient Markov chain Monte Carlo sampling of the dipole locations, we adopted a parametrization of the source space surfaces with two continuous variables (i.e., spherical angle coordinates). Prior to analysis, we simplified the likelihood by exploiting only a small set of independent measurement combinations obtained by singular value decomposition of the gain matrix, which also makes the sampler significantly faster. We analyzed both realistically simulated and empirical MEG data recorded during simple auditory and visual stimulation. The results show that our model produces reasonable solutions and adequate data fits without much manual interaction. However, the rigid cortical constraints seemed to make the utilized scheme challenging, as the sampler did not switch between the modes of the dipoles efficiently. This is problematic in the presence of an evidently highly multimodal posterior distribution, and especially in the relative quantitative comparison of the different modes. To overcome the difficulties with the present model, we propose the use of loose orientation constraints and a combined model of prelocalization utilizing the hierarchical minimum-norm estimate and a multiple dipole sampling scheme.
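
    The SVD-based likelihood simplification mentioned here is easy to illustrate: projecting the data onto the leading left singular vectors of the gain matrix gives a small set of independent measurement combinations that carry the same information about the sources. The sketch below uses random stand-in matrices; the dimensions and truncation rule are assumptions, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_sensors, n_sources = 306, 5000             # assumed problem dimensions
    G = rng.normal(size=(n_sensors, n_sources))  # stand-in gain (lead field) matrix
    b = rng.normal(size=n_sensors)               # one time slice of sensor data

    # SVD of the gain matrix; keep the leading components (assumed rule).
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    k = int(np.sum(s > 0.01 * s[0]))
    Uk, sk, Vk = U[:, :k], s[:k], Vt[:k]

    # Reduced data and reduced forward model: for source amplitudes j,
    # Uk.T @ (G @ j) equals diag(sk) @ (Vk @ j) exactly, so the likelihood can
    # be evaluated with the k-vector Uk.T @ b instead of every sensor channel.
    b_red = Uk.T @ b

    def forward_reduced(j):
        return sk * (Vk @ j)

    j = rng.normal(size=n_sources)
    print(np.allclose(Uk.T @ (G @ j), forward_reduced(j)))  # True by construction
    ```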

  19. Analysis of traffic accidents on rural highways using Latent Class Clustering and Bayesian Networks.

    PubMed

    de Oña, Juan; López, Griselda; Mujalli, Randa; Calvo, Francisco J

    2013-03-01

    One of the principal objectives of traffic accident analyses is to identify the key factors that affect the severity of an accident. However, in the presence of heterogeneity in the raw data used, the analysis of traffic accidents becomes difficult. In this paper, Latent Class Clustering (LCC) is used as a preliminary tool for segmentation of 3229 accidents on rural highways in Granada (Spain) between 2005 and 2008. Next, Bayesian Networks (BNs) are used to identify the main factors involved in accident severity, both for the entire database (EDB) and for the clusters previously obtained by LCC. The results of these cluster-based analyses are compared with the results of a full-data analysis. The results show that the combined use of both techniques is very interesting, as it reveals further information that would not have been obtained without prior segmentation of the data. BN inference is used to obtain the variables that best identify accidents involving people killed or seriously injured. Accident type and sight distance are identified in all the cases analysed; other variables, such as time, occupants involved, or age, are identified in the EDB and in only one cluster; whereas vehicles involved, number of injuries, atmospheric factors, pavement markings, and pavement width are identified in only one cluster.

  20. On preserving original variables in Bayesian PCA with application to image analysis.

    PubMed

    Li, Jun; Tao, Dacheng

    2012-12-01

    Principal component analysis (PCA) computes a succinct data representation by converting the data to a few new variables while retaining maximum variation. However, the new variables are difficult to interpret, because each one is combined with all of the original input variables and has obscure semantics. Under the umbrella of Bayesian data analysis, this paper presents a new prior to explicitly regularize combinations of input variables. In particular, the prior penalizes pair-wise products of the coefficients of PCA and encourages a sparse model. Compared to the commonly used l1 regularizer, the proposed prior encourages the sparsity pattern in the resultant coefficients to be consistent with the intrinsic groups in the original input variables. Moreover, the proposed prior can be explained as recovering a robust estimation of the covariance matrix for PCA. The proposed model is suited for analyzing visual data, where it encourages the output variables to correspond to meaningful parts in the data. We demonstrate the characteristics and effectiveness of the proposed technique through experiments on both synthetic and real data.
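
    The contrast with the l1 regularizer can be made concrete. The sketch below implements one plausible reading of a pairwise-product penalty (an assumption; the paper's exact prior may differ): two coefficient vectors with identical l1 norm receive different pairwise-product penalties, so the pairwise prior can prefer the concentrated pattern where l1 is indifferent.

    ```python
    import numpy as np

    def log_prior_pairwise(w, lam):
        """One plausible form: penalize |w_i * w_j| for all i != j (assumption)."""
        outer = np.abs(np.outer(w, w))
        return -lam * (outer.sum() - np.trace(outer))

    def log_prior_l1(w, lam):
        return -lam * np.abs(w).sum()

    w_sparse = np.array([1.4, 0.0, 0.0, 0.0])  # loading on a single input variable
    w_dense = np.full(4, 0.35)                 # same l1 norm, spread over variables

    for w in (w_sparse, w_dense):
        print(log_prior_l1(w, 1.0), log_prior_pairwise(w, 1.0))
    # The l1 prior scores both vectors identically; the pairwise-product prior
    # favors the concentrated pattern, mimicking group-consistent sparsity.
    ```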

  1. What’s in a Name: A Bayesian Hierarchical Analysis of the Name-Letter Effect

    PubMed Central

    Dyjas, Oliver; Grasman, Raoul P. P. P.; Wetzels, Ruud; van der Maas, Han L. J.; Wagenmakers, Eric-Jan

    2012-01-01

    People generally prefer their initials to the other letters of the alphabet, a phenomenon known as the name-letter effect. This effect, researchers have argued, makes people move to certain cities, buy particular brands of consumer products, and choose particular professions (e.g., Angela moves to Los Angeles, Phil buys a Philips TV, and Dennis becomes a dentist). In order to establish such associations between people’s initials and their behavior, researchers typically carry out statistical analyses of large databases. Current methods of analysis ignore the hierarchical structure of the data, do not naturally handle order-restrictions, and are fundamentally incapable of confirming the null hypothesis. Here we outline a Bayesian hierarchical analysis that avoids these limitations and allows coherent inference both on the level of the individual and on the level of the group. To illustrate our method, we re-analyze two data sets that address the question of whether people are disproportionately likely to live in cities that resemble their name. PMID:23055989

  2. Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.

    PubMed

    Verde, Pablo E

    2010-12-30

    In the last decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity, and they can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. The meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows substantial model checking, model diagnostics, and model selection. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples.

  3. Reliable evaluation of the quantal determinants of synaptic efficacy using Bayesian analysis

    PubMed Central

    Beato, M.

    2013-01-01

    Communication between neurones in the central nervous system depends on synaptic transmission. The efficacy of synapses is determined by pre- and postsynaptic factors that can be characterized using quantal parameters such as the probability of neurotransmitter release, the number of release sites, and the quantal size. Existing methods of estimating the quantal parameters based on multiple probability fluctuation analysis (MPFA) are limited by their requirement for long recordings to acquire substantial data sets. We therefore devised an algorithm, termed Bayesian Quantal Analysis (BQA), that can yield accurate estimates of the quantal parameters from data sets as small as 60 observations for each of only two conditions of release probability. Computer simulations are used to compare its accuracy with that of MPFA while varying the number of observations and the simulated range in release probability. We challenge BQA with realistic complexities characteristic of complex synapses, such as increases in the intra- or intersite variances, and heterogeneity in release probabilities. Finally, we validate the method using experimental data obtained from electrophysiological recordings to show that the effect of an antagonist on postsynaptic receptors is correctly characterized by BQA by a specific reduction in the estimates of quantal size. Since BQA routinely yields reliable estimates of the quantal parameters from small data sets, it is ideally suited to identifying the locus of synaptic plasticity in experiments in which repeated manipulations of the recording environment are unfeasible. PMID:23076101

  4. Bayesian Analysis Diagnostics: Diagnosing Predictive and Parameter Uncertainty for Hydrological Models

    NASA Astrophysics Data System (ADS)

    Thyer, Mark; Kavetski, Dmitri; Evin, Guillaume; Kuczera, George; Renard, Ben; McInerney, David

    2015-04-01

    All scientific and statistical analysis, particularly in the natural sciences, is based on approximations and assumptions. For example, the calibration of hydrological models using approaches such as Nash-Sutcliffe efficiency and/or simple least squares (SLS) objective functions may appear to be 'assumption-free'. However, this is a naïve point of view, as SLS assumes that the model residuals (residuals = observed - predictions) are independent, homoscedastic and Gaussian. If these assumptions are poor, parameter inference and model predictions will be correspondingly poor. An essential step in model development is therefore to verify the assumptions and approximations made in the modeling process. Diagnostics play a key role in verifying modeling assumptions. An important advantage of the formal Bayesian approach is that the modeler is required to make the assumptions explicit. Specialized diagnostics can then be developed and applied to test and verify those assumptions. This paper presents a suite of statistical and modeling diagnostics that can be used by environmental modelers to test their calibration assumptions and diagnose model deficiencies. Three major types of diagnostics are presented. Residual diagnostics test whether the assumptions of the residual error model within the likelihood function are compatible with the data; this includes testing for statistical independence, homoscedasticity, unbiasedness, Gaussianity and any distributional assumptions. Parameter uncertainty and MCMC diagnostics address another important part of Bayesian analysis, the assessment of parameter uncertainty; Markov chain Monte Carlo (MCMC) methods are a powerful numerical tool for estimating these uncertainties, and diagnostics based on posterior parameter distributions can be used to assess parameter identifiability, interactions and correlations, providing a very useful tool for detecting and remedying model deficiencies. In addition, numerical diagnostics are
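
    A few of the residual checks named above can be sketched in a handful of lines. The fragment below is an illustrative mini-suite, not the paper's diagnostics: lag-1 autocorrelation for independence, a Spearman correlation of absolute residuals with predictions for homoscedasticity, and a Shapiro-Wilk test for Gaussianity, run on deliberately heteroscedastic synthetic data.

    ```python
    import numpy as np
    from scipy import stats

    def residual_diagnostics(obs, pred):
        """Quick checks of the SLS assumptions on residuals = obs - pred."""
        r = obs - pred
        # Independence: lag-1 autocorrelation should be near zero.
        lag1 = np.corrcoef(r[:-1], r[1:])[0, 1]
        # Homoscedasticity: |residual| should not grow with the prediction.
        het_rho, het_p = stats.spearmanr(np.abs(r), pred)
        # Gaussianity: Shapiro-Wilk test.
        sw_stat, sw_p = stats.shapiro(r)
        return {"lag1_autocorr": lag1,
                "heteroscedasticity (rho, p)": (het_rho, het_p),
                "shapiro (W, p)": (sw_stat, sw_p)}

    rng = np.random.default_rng(6)
    pred = np.linspace(1, 100, 200)
    obs = pred + rng.normal(0, 0.1 * pred)   # heteroscedastic errors, on purpose
    print(residual_diagnostics(obs, pred))
    ```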

  5. A comparison of Bayesian and Monte Carlo sensitivity analysis for unmeasured confounding.

    PubMed

    McCandless, Lawrence C; Gustafson, Paul

    2017-04-06

    Bias from unmeasured confounding is a persistent concern in observational studies, and sensitivity analysis has been proposed as a solution. In recent years, probabilistic sensitivity analysis using either Monte Carlo sensitivity analysis (MCSA) or Bayesian sensitivity analysis (BSA) has emerged as a practical analytic strategy when there are multiple bias parameter inputs. BSA uses Bayes theorem to formally combine evidence from the prior distribution and the data. In contrast, MCSA samples bias parameters directly from the prior distribution. Intuitively, one would think that BSA and MCSA ought to give similar results. Both methods use similar models and the same (prior) probability distributions for the bias parameters. In this paper, we illustrate the surprising finding that BSA and MCSA can give very different results. Specifically, we demonstrate that MCSA can give inaccurate uncertainty assessments (e.g. 95% intervals) that do not reflect the data's influence on uncertainty about unmeasured confounding. Using a data example from epidemiology and simulation studies, we show that certain combinations of data and prior distributions can result in dramatic prior-to-posterior changes in uncertainty about the bias parameters. This occurs because the application of Bayes theorem in a non-identifiable model can sometimes rule out certain patterns of unmeasured confounding that are not compatible with the data. Consequently, the MCSA approach may give 95% intervals that are either too wide or too narrow and that do not have 95% frequentist coverage probability. Based on our findings, we recommend that analysts use BSA for probabilistic sensitivity analysis.
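
    The qualitative difference between the two approaches can be seen in a toy model. In the sketch below (a simplified stand-in, not the paper's epidemiological example), MCSA propagates prior draws of the bias parameter unchanged, while the BSA-style step reweights those draws by a likelihood; the external validation datum used for that likelihood is an invented device to make the contrast visible.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    # Toy setup: observed log odds ratio with standard error, plus an unmeasured
    # confounder whose additive bias on the log-OR scale is u (values invented).
    obs_log_or, se = 0.8, 0.1
    prior_u = stats.norm(0.5, 0.5)

    u = prior_u.rvs(50000, random_state=7)
    adjusted = obs_log_or - u                  # bias-corrected effect per draw

    # MCSA: intervals come straight from prior draws of u; data never update u.
    mcsa = np.percentile(adjusted + rng.normal(0, se, u.size), [2.5, 97.5])

    # BSA-style step (stand-in): reweight the prior draws by a likelihood for u,
    # here an invented external validation estimate u_hat = 0.2 (SE 0.15).
    w = stats.norm.pdf(u, 0.2, 0.15)
    idx = rng.choice(u.size, u.size, p=w / w.sum())
    bsa = np.percentile(adjusted[idx] + rng.normal(0, se, u.size), [2.5, 97.5])
    print("MCSA:", mcsa, " BSA:", bsa)   # the intervals can differ markedly
    ```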

  6. BAYESIAN ANALYSIS TO IDENTIFY NEW STAR CANDIDATES IN NEARBY YOUNG STELLAR KINEMATIC GROUPS

    SciTech Connect

    Malo, Lison; Doyon, Rene; Lafreniere, David; Artigau, Etienne; Gagne, Jonathan; Baron, Frederique; Riedel, Adric

    2013-01-10

    We present a new method based on a Bayesian analysis to identify new members of nearby young kinematic groups. The analysis minimally takes into account the position, proper motion, magnitude, and color of a star, but other observables can be readily added (e.g., radial velocity, distance). We use this method to find new young low-mass stars in the β Pictoris and AB Doradus moving groups and in the TW Hydrae, Tucana-Horologium, Columba, Carina, and Argus associations. Starting from a sample of 758 mid-K to mid-M (K5V-M5V) stars showing youth indicators such as Hα and X-ray emission, our analysis yields 214 new highly probable low-mass members of the kinematic groups analyzed. One is in TW Hydrae, 37 in β Pictoris, 17 in Tucana-Horologium, 20 in Columba, 6 in Carina, 50 in Argus, 32 in AB Doradus, and the remaining 51 candidates are likely young but have an ambiguous membership to more than one association. The false alarm rate for new candidates is estimated to be 5% for β Pictoris and TW Hydrae, 10% for Tucana-Horologium, Columba, Carina, and Argus, and 14% for AB Doradus. Our analysis confirms the membership of 58 stars proposed in the literature. Firm membership confirmation of our new candidates will require measurement of their radial velocity (predicted by our analysis), parallax, and lithium 6708 Å equivalent width. We have initiated these follow-up observations for a number of candidates, and we have identified two stars (2MASSJ01112542+1526214, 2MASSJ05241914-1601153) as very strong candidate members of the β Pictoris moving group and one strong candidate member (2MASSJ05332558-5117131) of the Tucana-Horologium association; these three stars have radial velocity measurements confirming their membership and lithium detections consistent with a young age.
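
    The core membership computation is ordinary Bayesian classification over kinematic observables. The sketch below is a heavily simplified stand-in (two invented 2-D Gaussian group models plus a broad field population, with assumed priors), not the paper's multi-observable model.

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal

    # Invented 2-D observables (e.g., proper-motion components in mas/yr) with a
    # Gaussian model per group and a broad Gaussian for field stars.
    groups = {
        "beta Pictoris": multivariate_normal([45.0, -65.0], np.diag([60.0, 60.0])),
        "AB Doradus":    multivariate_normal([90.0, -50.0], np.diag([80.0, 80.0])),
        "field":         multivariate_normal([0.0, 0.0], np.diag([4000.0, 4000.0])),
    }
    priors = {"beta Pictoris": 0.01, "AB Doradus": 0.01, "field": 0.98}  # assumed

    def membership(x):
        """Posterior membership probabilities via Bayes' rule."""
        joint = {g: d.pdf(x) * priors[g] for g, d in groups.items()}
        z = sum(joint.values())
        return {g: p / z for g, p in joint.items()}

    star = np.array([48.0, -62.0])       # a hypothetical candidate's proper motion
    print(membership(star))
    ```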

  7. An automated land-use mapping comparison of the Bayesian maximum likelihood and linear discriminant analysis algorithms

    NASA Technical Reports Server (NTRS)

    Tom, C. H.; Miller, L. D.

    1984-01-01

    The Bayesian maximum likelihood parametric classifier has been tested against the data-based formulation designated 'linear discriminant analysis', using the 'GLIKE' decision and 'CLASSIFY' classification algorithms in the Landsat Mapping System. Identical supervised training sets, USGS land use/land cover classes, and various combinations of Landsat image and ancillary geodata variables were used to compare the algorithms' thematic mapping accuracy on a single-date summer subscene, with a cellularized USGS land use map of the same time frame furnishing the ground truth reference. CLASSIFY, which accepts a priori class probabilities, is found to be more accurate than GLIKE, which assumes equal class occurrences, for all three mapping variable sets and both levels of detail. These results may be generalized to direct accuracy, time, cost, and flexibility advantages of linear discriminant analysis over Bayesian methods.
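
    The GLIKE/CLASSIFY distinction, equal versus a priori class probabilities in a Gaussian maximum likelihood classifier, can be mimicked with scikit-learn. The snippet below is an analogy on synthetic data, not the Landsat Mapping System: quadratic discriminant analysis with uniform priors plays the role of GLIKE, the same classifier with empirical priors plays the role of CLASSIFY, and linear discriminant analysis is included for comparison.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                               QuadraticDiscriminantAnalysis)

    rng = np.random.default_rng(8)
    # Synthetic "pixels": three land-cover classes with unequal abundance.
    counts = [600, 300, 100]
    X = np.vstack([rng.normal(mu, 1.0, size=(n, 4))
                   for mu, n in zip([0.0, 1.5, 3.0], counts)])
    y = np.repeat([0, 1, 2], counts)

    models = {
        "GLIKE-like (equal priors)": QuadraticDiscriminantAnalysis(priors=[1/3] * 3),
        "CLASSIFY-like (a priori)":  QuadraticDiscriminantAnalysis(),  # empirical priors
        "linear discriminant":       LinearDiscriminantAnalysis(),
    }
    for name, model in models.items():
        acc = model.fit(X, y).score(X, y)
        print(f"{name}: training accuracy {acc:.3f}")
    ```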

  8. A Two-Step Bayesian Approach for Propensity Score Analysis: Simulations and Case Study.

    PubMed

    Kaplan, David; Chen, Jianshen

    2012-07-01

    A two-step Bayesian propensity score approach is introduced that incorporates prior information in the propensity score equation and outcome equation without the problems associated with simultaneous Bayesian propensity score approaches. The corresponding variance estimators are also provided. The two-step Bayesian propensity score is provided for three methods of implementation: propensity score stratification, weighting, and optimal full matching. Three simulation studies and one case study are presented to elaborate the proposed two-step Bayesian propensity score approach. Results of the simulation studies reveal that greater precision in the propensity score equation yields better recovery of the frequentist-based treatment effect. A slight advantage is shown for the Bayesian approach in small samples. Results also reveal that greater precision around the wrong treatment effect can lead to seriously distorted results. However, greater precision around the correct treatment effect parameter yields quite good results, with slight improvement seen with greater precision in the propensity score equation. A comparison of coverage rates for the conventional frequentist approach and proposed Bayesian approach is also provided. The case study reveals that credible intervals are wider than frequentist confidence intervals when priors are non-informative.

  9. Joint linkage and association analysis with exome sequence data implicates SLC25A40 in hypertriglyceridemia.

    PubMed

    Rosenthal, Elisabeth A; Ranchalis, Jane; Crosslin, David R; Burt, Amber; Brunzell, John D; Motulsky, Arno G; Nickerson, Deborah A; Wijsman, Ellen M; Jarvik, Gail P

    2013-12-05

    Hypertriglyceridemia (HTG) is a heritable risk factor for cardiovascular disease. Investigating the genetics of HTG may identify new drug targets. There are ~35 known single-nucleotide variants (SNVs) that explain only ~10% of variation in triglyceride (TG) level. Because of the genetic heterogeneity of HTG, a family study design is optimal for identification of rare genetic variants with large effect size because the same mutation can be observed in many relatives and cosegregation with TG can be tested. We considered HTG in a five-generation family of European American descent (n = 121), ascertained for familial combined hyperlipidemia. By using Bayesian Markov chain Monte Carlo joint oligogenic linkage and association analysis, we detected linkage to chromosomes 7 and 17. Whole-exome sequence data revealed shared, highly conserved, private missense SNVs in both SLC25A40 on chr7 and PLD2 on chr17. Jointly, these SNVs explained 49% of the genetic variance in TG; however, only the SLC25A40 SNV was significantly associated with TG (p = 0.0001). This SNV, c.374A>G, causes a highly disruptive p.Tyr125Cys substitution just outside the second helical transmembrane region of the SLC25A40 inner mitochondrial membrane transport protein. Whole-gene testing in subjects from the Exome Sequencing Project confirmed the association between TG and SLC25A40 rare, highly conserved, coding variants (p = 0.03). These results suggest a previously undescribed pathway for HTG and illustrate the power of large pedigrees in the search for rare, causal variants.

  10. The Analysis of Adhesively Bonded Advanced Composite Joints Using Joint Finite Elements

    NASA Technical Reports Server (NTRS)

    Stapleton, Scott E.; Waas, Anthony M.

    2012-01-01

    The design and sizing of adhesively bonded joints has always been a major bottleneck in the design of composite vehicles. Dense finite element (FE) meshes are required to capture the full behavior of a joint numerically, but these dense meshes are impractical in vehicle-scale models, where a coarse mesh is more desirable for making quick assessments and comparisons of different joint geometries. Analytical models are often helpful in sizing, but difficulties arise in coupling these models with full-vehicle FE models. Therefore, a joint FE was created which can be used within structural FE models to make quick assessments of bonded composite joints. The shape functions of the joint FE were found by solving the governing equations for a structural model of a joint. By analytically determining the shape functions of the joint FE, the complex joint behavior can be captured with very few elements. This joint FE was modified and used to consider adhesives with functionally graded material properties to reduce the peel stress concentrations located near adherend discontinuities. Several practical concerns impede the actual use of such adhesives. These include increased manufacturing complications, alterations to the grading due to adhesive flow during manufacturing, and whether changing the loading conditions significantly impacts the effectiveness of the grading. An analytical study is conducted to address these three concerns. Furthermore, proof-of-concept testing is conducted to show the potential advantages of functionally graded adhesives. In this study, grading is achieved by strategically placing glass beads within the adhesive layer at different densities along the joint. Furthermore, the capability to model non-linear adhesive constitutive behavior with large rotations was developed, and progressive failure of the adhesive was modeled by re-meshing the joint as the adhesive fails. Results predicted using the joint FE were compared with experimental results for various …

  11. Experimental analysis of a joint free space cryptosystem

    NASA Astrophysics Data System (ADS)

    Ramírez, John Fredy Barrera; Osorio, Alexis Jaramillo; Zea, Alejandro Vélez; Torroba, Roberto

    2016-08-01

    In this paper, we analyze a joint free space cryptosystem scheme implemented in an actual laboratory environment. In this encrypting architecture, the object to be encoded and the security key are placed side by side in the input plane, without optical elements between the input and the output planes. In order to obtain the encrypted information, the joint Fresnel power distribution (JFPD) coming from the input plane is registered on a CMOS camera. The information of the encrypting key is registered with an off-axis Fresnel holographic setup. The data registered with the experimental setup are digitally filtered to obtain the encrypted object and the encryption key. In addition, we explore the performance of the experimental system as a function of the object-camera and key-camera distances, which are two new parameters of interest. These parameters become available as a result of developing this encrypting scheme. The theoretical and experimental analysis shows the validity and applicability of the cryptosystem.

  12. A unified method for inference of tokamak equilibria and validation of force-balance models based on Bayesian analysis

    NASA Astrophysics Data System (ADS)

    von Nessi, G. T.; Hole, M. J.; the MAST Team

    2013-05-01

    A new method, based on Bayesian analysis, is presented which unifies the inference of plasma equilibria parameters in a tokamak with the ability to quantify differences between inferred equilibria and Grad-Shafranov (GS) force-balance solutions. At the heart of this technique is the new concept of weak observation, which allows multiple forward models to be associated with a single diagnostic observation. This new idea subsequently provides a means by which the space of GS solutions can be efficiently characterized via a prior distribution. The posterior evidence (a normalization constant of the inferred posterior distribution) is also inferred in the analysis and is used as a proxy for determining how relatively close inferred equilibria are to force-balance for different discharges/times. These points have been implemented in a code called BEAST (Bayesian equilibrium analysis and simulation tool), which uses a special implementation of Skilling’s nested sampling algorithm (Skilling 2006 Bayesian Anal. 1 833-59) to perform sampling and evidence calculations on high-dimensional, non-Gaussian posteriors. Initial BEAST equilibrium inference results are presented for two high-performance MAST discharges.

  13. Foliar interception of radionuclides in dry conditions: a meta-analysis using a Bayesian modeling approach.

    PubMed

    Sy, Mouhamadou Moustapha; Ancelet, Sophie; Henner, Pascale; Hurtevent, Pierre; Simon-Cornu, Marie

    2015-09-01

    Uncertainty in the parameters that describe the transfer of radioactive materials into the (terrestrial) environment may be characterized thanks to datasets such as those compiled within International Atomic Energy Agency (IAEA) documents. Nevertheless, the information included in these documents is too poor to derive a relevant and informative uncertainty distribution for the dry interception of radionuclides by pasture grass and the leaves of vegetables. In this paper, 145 sets of dry interception measurements by the aboveground biomass of specific plants were collected from published scientific papers. A Bayesian meta-analysis was performed to derive the posterior probability distributions of the parameters, reflecting their uncertainty given the collected data. Four competing models were compared in terms of both fitting performance and predictive ability to reproduce plausible dry interception data. The asymptotic interception factor, applicable whatever the species and radionuclide at the highest aboveground biomass values (e.g. mature leafy vegetables), was estimated with the best model to be 0.87, with a 95% credible interval of (0.85, 0.89).

  14. Integration of Bayesian analysis for eutrophication prediction and assessment in a landscape lake.

    PubMed

    Yang, Likun; Zhao, Xinhua; Peng, Sen; Zhou, Guangyu

    2015-01-01

    Eutrophication models have been widely used to assess water quality in landscape lakes. Because the flow rate in landscape lakes is relatively low and similar to that of natural lakes, eutrophication is more dominant in landscape lakes. To assess the risk of eutrophication in landscape lakes, a set of dynamic equations was developed to simulate lake water quality for total nitrogen (TN), total phosphorus (TP), dissolved oxygen (DO) and chlorophyll a (Chl a). First, the Bayesian calibration results are described, and the ability of the model to adequately reproduce the observed mean patterns and major cause-effect relationships for water quality conditions in landscape lakes is presented. Two loading scenarios were used. A Monte Carlo algorithm was applied to calculate the predicted water quality distributions, which were used in the established hierarchical assessment system for lake water quality risk. The important factors affecting lake water quality risk were identified using linear regression analysis. The results indicated that variations in the quality of the recharge water received by the landscape lake caused considerable water quality risk in the surrounding area. Moreover, the Chl a concentration in lake water was significantly affected by the TP and TN concentrations; the lake TP concentration was the limiting factor for the growth of plankton, while the lake TN concentration provided the basic nutritional requirements. Lastly, lower TN and TP concentrations in the receiving recharge water caused increased lake water quality risk.
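
    A minimal sketch of the Monte Carlo risk step described above, with a hypothetical log-linear Chl a model and invented "posterior" draws standing in for the calibrated model:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_draws = 10_000

    # Hypothetical posterior draws for log(Chl a) = b0 + b1*log(TP) + b2*log(TN) + noise.
    b0 = rng.normal(4.0, 0.3, n_draws)
    b1 = rng.normal(0.9, 0.05, n_draws)   # TP as the limiting factor (largest slope)
    b2 = rng.normal(0.3, 0.05, n_draws)
    sigma = 0.2

    tp, tn = 0.08, 1.2                    # scenario recharge loading (mg/L)
    chla = np.exp(b0 + b1 * np.log(tp) + b2 * np.log(tn)
                  + rng.normal(0, sigma, n_draws))

    threshold = 10.0                      # eutrophication threshold (mg/m^3)
    print("P(Chl a > threshold) =", np.mean(chla > threshold))
    ```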

  15. Multimodel Bayesian analysis of data-worth applied to unsaturated fractured tuffs

    NASA Astrophysics Data System (ADS)

    Lu, Dan; Ye, Ming; Neuman, Shlomo P.; Xue, Liang

    2012-01-01

    Managing water resource and environmental systems effectively requires suitable data. The worth of collecting such data depends on their potential benefit and cost, including the expected cost (risk) of failing to take an appropriate decision. Evaluating this risk calls for a probabilistic approach to data-worth assessment. Recently we [39] developed a multimodel approach to optimum value-of-information or data-worth analysis based on model averaging within a maximum likelihood Bayesian framework. Adopting a two-dimensional synthetic example, we implemented our approach using Monte Carlo (MC) simulations with and without lead order approximations, finding that the former approach was almost equally accurate but computationally more efficient. Here we apply our methodology to pneumatic permeability data from vertical and inclined boreholes drilled into unsaturated fractured tuff near Superior, Arizona. In an attempt to improve computational efficiency, we introduce three new approximations that require less computational effort and compare results with those obtained by the original Monte Carlo method. The first approximation disregards uncertainty in model parameter estimates, the second does so for estimates of potential new data, and the third disregards both uncertainties. We find that only the first approximation yields reliable quantitative assessments of reductions in predictive uncertainty brought about by the collection of new data. We conclude that, whereas parameter uncertainty may sometimes be disregarded for purposes of analyzing data worth, the same does not generally apply to uncertainty in estimates of potential new data.

  16. Bayesian time series analysis of segments of the Rocky Mountain trumpeter swan population

    USGS Publications Warehouse

    Wright, Christopher K.; Sojda, Richard S.; Goodman, Daniel

    2002-01-01

    A Bayesian time series analysis technique, the dynamic linear model, was used to analyze counts of Trumpeter Swans (Cygnus buccinator) summering in Idaho, Montana, and Wyoming from 1931 to 2000. For the Yellowstone National Park segment of white birds (sub-adults and adults combined) the estimated probability of a positive growth rate is 0.01. The estimated probability of achieving the Subcommittee on Rocky Mountain Trumpeter Swans 2002 population goal of 40 white birds for the Yellowstone segment is less than 0.01. Outside of Yellowstone National Park, Wyoming white birds are estimated to have a 0.79 probability of a positive growth rate with a 0.05 probability of achieving the 2002 objective of 120 white birds. In the Centennial Valley in southwest Montana, results indicate a probability of 0.87 that the white bird population is growing at a positive rate with considerable uncertainty. The estimated probability of achieving the 2002 Centennial Valley objective of 160 white birds is 0.14 but under an alternative model falls to 0.04. The estimated probability that the Targhee National Forest segment of white birds has a positive growth rate is 0.03. In Idaho outside of the Targhee National Forest, white birds are estimated to have a 0.97 probability of a positive growth rate with a 0.18 probability of attaining the 2002 goal of 150 white birds.
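
    The dynamic linear model can be sketched as a local linear trend run through the Kalman recursions; the snippet below uses synthetic counts and illustrative noise variances (not the study's data or priors) to show how a posterior probability of positive growth is read off the filtered slope:

    ```python
    import numpy as np
    from scipy.stats import norm

    def kalman_trend(y, var_obs=25.0, var_level=4.0, var_slope=0.25):
        """Filter a local linear trend DLM; return final state mean and covariance."""
        m = np.array([y[0], 0.0])                  # state: [level, slope]
        P = np.diag([100.0, 1.0])                  # vague initial covariance
        F = np.array([[1.0, 1.0], [0.0, 1.0]])     # the level evolves by the slope
        Q = np.diag([var_level, var_slope])
        H = np.array([1.0, 0.0])                   # we observe the level only
        for obs in y[1:]:
            m, P = F @ m, F @ P @ F.T + Q          # predict
            S = H @ P @ H + var_obs
            K = P @ H / S                          # Kalman gain
            m = m + K * (obs - H @ m)              # update
            P = P - np.outer(K, H @ P)
        return m, P

    counts = np.array([112, 118, 109, 121, 117, 125, 122, 130, 128, 135], float)
    m, P = kalman_trend(counts)                    # synthetic annual counts
    p_growth = 1 - norm.cdf(0, loc=m[1], scale=np.sqrt(P[1, 1]))
    print(f"P(growth rate > 0 | counts) = {p_growth:.2f}")
    ```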

  17. Improving Bayesian analysis for LISA Pathfinder using an efficient Markov Chain Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ferraioli, Luigi; Porter, Edward K.; Armano, Michele; Audley, Heather; Congedo, Giuseppe; Diepholz, Ingo; Gibert, Ferran; Hewitson, Martin; Hueller, Mauro; Karnesis, Nikolaos; Korsakova, Natalia; Nofrarias, Miquel; Plagnol, Eric; Vitale, Stefano

    2014-02-01

    We present a parameter estimation procedure based on a Bayesian framework, applying a Markov Chain Monte Carlo algorithm to the calibration of the dynamical parameters of the LISA Pathfinder satellite. The method is based on the Metropolis-Hastings algorithm and a two-stage annealing treatment in order to ensure an effective exploration of the parameter space at the beginning of the chain. We compare two versions of the algorithm with an application to a LISA Pathfinder data analysis problem. The two algorithms share the same heating strategy, but one moves in coordinate directions using proposals from a multivariate Gaussian distribution, while the other uses the natural logarithm of some parameters and proposes jumps in the eigen-space of the Fisher information matrix. The algorithm proposing jumps in the eigen-space of the Fisher information matrix demonstrates a higher acceptance rate and a slightly better convergence towards the equilibrium parameter distributions in the application to LISA Pathfinder data. For this experiment, we return parameter values that are all within ~1σ of the injected values. When we analyse the accuracy of our parameter estimation in terms of the effect it has on the force-per-unit-mass noise, we find that the induced errors are three orders of magnitude less than the expected experimental uncertainty in the power spectral density.
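
    A toy illustration of the two proposal strategies being compared, on a correlated Gaussian target whose Fisher information matrix is simply its inverse covariance (a generic Metropolis-Hastings sketch, not the LISA Pathfinder pipeline, and without the annealing stage):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    cov = np.array([[1.0, 0.9], [0.9, 1.0]])       # strongly correlated target
    icov = np.linalg.inv(cov)                      # Fisher information here
    logpost = lambda x: -0.5 * x @ icov @ x

    def mh(proposal, n=20_000):
        x, acc = np.zeros(2), 0
        for _ in range(n):
            y = x + proposal()
            if np.log(rng.uniform()) < logpost(y) - logpost(x):
                x, acc = y, acc + 1
        return acc / n

    # Version 1: jumps along the coordinate directions.
    rate_axis = mh(lambda: rng.normal(0.0, 0.5, 2))

    # Version 2: jumps along the eigenvectors of the Fisher information,
    # scaled by the local curvature in each eigen-direction.
    vals, vecs = np.linalg.eigh(icov)
    scales = 1.0 / np.sqrt(vals)
    rate_eig = mh(lambda: vecs @ (scales * rng.normal(0.0, 0.5, 2)))

    print(f"acceptance rates: axis-aligned {rate_axis:.2f}, eigen-space {rate_eig:.2f}")
    ```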

  18. Bayesian Analysis of Sparse Anisotropic Universe Models and Application to the Five-Year WMAP Data

    NASA Astrophysics Data System (ADS)

    Groeneboom, Nicolaas E.; Eriksen, Hans Kristian

    2009-01-01

    We extend the previously described cosmic microwave background Gibbs sampling framework to allow for exact Bayesian analysis of anisotropic universe models, and apply this method to the five-year Wilkinson Microwave Anisotropy Probe (WMAP) temperature observations. This involves adding support for nondiagonal signal covariance matrices, and implementing a general spectral parameter Monte Carlo Markov chain sampler. As a working example, we apply these techniques to the model recently introduced by Ackerman et al., describing, for instance, violations of rotational invariance during the inflationary epoch. After verifying the code with simulated data, we analyze the foreground-reduced five-year WMAP temperature sky maps. For ℓ ≤ 400 and the W-band data, we find tentative evidence for a preferred direction pointing toward (l, b) = (110°, 10°) with an anisotropy amplitude of g* = 0.15 ± 0.039. Similar results are obtained from the V-band data (g* = 0.11 ± 0.039; (l, b) = (130°, 20°)). Further, the preferred direction is stable with respect to multipole range, seen independently in both ℓ = [2, 100] and [100, 400], although at lower statistical significance. We have not yet been able to establish a fully satisfactory explanation for the observations in terms of known systematics, such as noncosmological foregrounds, correlated noise, or asymmetric beams, but stress that further study of all these issues is warranted before a cosmological interpretation can be supported.

  19. Uncertainty in the Bayesian meta-analysis of normally distributed surrogate endpoints.

    PubMed

    Bujkiewicz, Sylwia; Thompson, John R; Spata, Enti; Abrams, Keith R

    2015-08-13

    We investigate the effect of the choice of parameterisation of meta-analytic models, and the related uncertainty, on the validation of surrogate endpoints. Different meta-analytical approaches take into account different levels of uncertainty, which may impact the accuracy of predictions of the treatment effect on the target outcome from the treatment effect on a surrogate endpoint obtained from these models. A range of Bayesian as well as frequentist meta-analytical methods are implemented using illustrative examples in relapsing-remitting multiple sclerosis, where the treatment effect on disability worsening is the primary outcome of interest in healthcare evaluation while the effect on relapse rate is considered as a potential surrogate for the effect on disability progression, and in gastric cancer, where disease-free survival has been shown to be a good surrogate endpoint for overall survival. Sensitivity analysis was carried out to assess the impact of distributional assumptions on the predictions. Sensitivity to modelling assumptions and the performance of the models were also investigated by simulation. Although different methods can predict the mean true outcome almost equally well, inclusion of uncertainty around all relevant parameters of the model may lead to less certain and hence more conservative predictions. When investigating endpoints as candidate surrogate outcomes, a careful choice of the meta-analytical approach has to be made. Models underestimating the uncertainty of the available evidence may lead to overoptimistic predictions, which can then affect decisions made on their basis.

  20. Novel information theoretic and Bayesian approach to fMRI data analysis

    NASA Astrophysics Data System (ADS)

    Reddy, Chandan K.; Terrazas, Alejandro

    2003-05-01

    Functional Magnetic Resonance Imaging (fMRI) is a powerful technique for studying the working of the human brain. The overall goal of the project is to develop a novel method for the analysis of fMRI data in order to discover the activation of a network of regions, most likely involving the hippocampus, parietal cortex and cerebellum, as a person navigates in a virtual environment. Spatially sensitive voxels are extracted by selecting voxels that have high mutual information. Each of these extracted voxels is then used to create a response curve for the stimulus of interest, in this case spatial location. Following the voxel extraction stage, the set of extracted voxel time series is treated as a population and used to predict the location of the subject at any randomly selected time in the experiment. The population of voxels essentially "votes" with its current activity. The approach used for prediction is the Bayesian reconstruction method. The ability to predict the location of a subject in the virtual environment based on brain signals will be useful in developing a physiological understanding of spatial cognition in virtual environments.
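
    The two stages, mutual-information voxel selection followed by Bayesian reconstruction, can be sketched on simulated data as follows (tuning, rates, and sizes are all invented for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_bins, n_voxels, n_samples = 8, 50, 400
    loc = rng.integers(0, n_bins, n_samples)          # location at each scan

    pref = np.arange(n_voxels) % n_bins               # preferred locations
    tuned = np.arange(n_voxels) < 25                  # half the voxels are tuned
    rate = np.where(tuned[None, :] & (loc[:, None] == pref[None, :]), 5.0, 1.0)
    resp = rng.poisson(rate)                          # simulated voxel activity

    # Stage 1: rank voxels by empirical mutual information with location.
    def mutual_info(v):
        mi = 0.0
        for r in np.unique(v):
            for s in range(n_bins):
                pxy = np.mean((v == r) & (loc == s))
                if pxy > 0:
                    mi += pxy * np.log(pxy / (np.mean(v == r) * np.mean(loc == s)))
        return mi

    mi = np.array([mutual_info(resp[:, j]) for j in range(n_voxels)])
    keep = mi > np.median(mi)                         # spatially sensitive voxels

    # Stage 2: Bayesian reconstruction (Poisson likelihood, flat prior).
    lam = np.array([resp[loc == s].mean(axis=0) for s in range(n_bins)]) + 0.01

    def decode(v):
        loglik = (v[keep] * np.log(lam[:, keep]) - lam[:, keep]).sum(axis=1)
        return int(np.argmax(loglik))

    test = rng.poisson(np.where(tuned & (pref == 3), 5.0, 1.0))
    print("decoded location:", decode(test), "(true location: 3)")
    ```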

  1. A Bayesian statistical analysis of behavioral facilitation associated with deep brain stimulation

    PubMed Central

    Smith, Anne C; Shah, Sudhin A; Hudson, Andrew E; Purpura, Keith P; Victor, Jonathan D; Brown, Emery N; Schiff, Nicholas D

    2009-01-01

    Deep brain stimulation (DBS) is an established therapy for Parkinson’s Disease and is being investigated as a treatment for chronic depression, obsessive compulsive disorder and for facilitating functional recovery of patients in minimally conscious states following brain injury. For all of these applications, quantitative assessments of the behavioral effects of DBS are crucial to determine whether the therapy is effective and, if so, how stimulation parameters can be optimized. Behavioral analyses for DBS are challenging because subject performance is typically assessed from only a small set of discrete measurements made on a discrete rating scale, the time course of DBS effects is unknown, and between-subject differences are often large. We demonstrate how Bayesian state-space methods can be used to characterize the relationship between DBS and behavior comparing our approach with logistic regression in two experiments: the effects of DBS on attention of a macaque monkey performing a reaction-time task, and the effects of DBS on motor behavior of a human patient in a minimally conscious state. The state-space analysis can assess the magnitude of DBS behavioral facilitation (positive or negative) at specific time points and has important implications for developing principled strategies to optimize DBS paradigms. PMID:19576932

  2. Hierarchical Bayesian approaches for detecting inconsistency in network meta-analysis.

    PubMed

    Zhao, Hong; Hodges, James S; Ma, Haijun; Jiang, Qi; Carlin, Bradley P

    2016-09-10

    Network meta-analysis (NMA), also known as multiple treatment comparisons, is commonly used to incorporate direct and indirect evidence comparing treatments. With recent advances in methods and software, Bayesian approaches to NMA have become quite popular and allow models of previously unanticipated complexity. However, when direct and indirect evidence differ in an NMA, the model is said to suffer from inconsistency. Current inconsistency detection in NMA is usually based on contrast-based (CB) models; however, this approach has certain limitations. In this work, we propose an arm-based random effects model, where we detect discrepancy of direct and indirect evidence for comparing two treatments using the fixed effects in the model while flagging extreme trials using the random effects. We define discrepancy factors to characterize evidence of inconsistency for particular treatment comparisons, which is novel in NMA research. Our approaches permit users to address issues previously tackled via CB models. We compare sources of inconsistency identified by our approach and existing loop-based CB methods using real and simulated datasets and demonstrate that our methods can offer powerful inconsistency detection. Copyright © 2016 John Wiley & Sons, Ltd.

  3. Bayesian analysis for erosion modelling of sediments in combined sewer systems.

    PubMed

    Kanso, A; Chebbo, G; Tassin, B

    2005-01-01

    Previous research has confirmed that the sediments at the bed of combined sewer systems are the main source of particulate and organic pollution during rain events contributing to combined sewer overflows. However, existing urban stormwater models utilize inappropriate sediment transport formulas originally developed for alluvial hydrodynamics. Recently, a model has been formulated and thoroughly assessed, based on laboratory experiments, to simulate the erosion of sediments in sewer pipes, taking into account the increase in strength with depth in the weak layer of deposits. In order to evaluate this model objectively, this paper presents a Bayesian analysis of the model using field data collected in sewer pipes in Paris under known hydraulic conditions. The test has been performed using an MCMC sampling method for calibration and uncertainty assessment. Results demonstrate the capacity of the model to reproduce erosion as a direct response to the increase in bed shear stress. This is due to the model's description of the erosional strength in the deposits and to the shape of the measured bed shear stress. However, large uncertainties in some of the model parameters suggest that the model could be over-parameterised and requires a large amount of informative data for its calibration.

  4. Bayesian network meta-analysis comparing five contemporary treatment strategies for newly diagnosed acute promyelocytic leukaemia.

    PubMed

    Wu, Fenfang; Wu, Di; Ren, Yong; Duan, Chongyang; Chen, Shangwu; Xu, Anlong

    2016-07-26

    Acute promyelocytic leukemia (APL) is a curable subtype of acute myeloid leukemia. The optimum regimen for newly diagnosed APL remains inconclusive. In this Bayesian network meta-analysis, we compared the effectiveness of five regimens: arsenic trioxide (ATO) + all-trans retinoic acid (ATRA); realgar-indigo naturalis formula (RIF, which contains arsenic tetrasulfide) + ATRA; ATRA + anthracycline-based chemotherapy (CT); ATO alone; and ATRA alone, based on fourteen randomized controlled trials (RCTs) that included 1407 newly diagnosed APL patients. According to the results, the treatments ranked as follows for induction-stage efficacy (early death and complete remission): 1. ATO/RIF + ATRA; 2. ATRA + CT; 3. ATO; and 4. ATRA. For long-term benefit, ATO/RIF + ATRA significantly improved overall survival (OS) (hazard ratio = 0.35, 95%CI 0.15-0.82, p = 0.02) and event-free survival (EFS) (hazard ratio = 0.32, 95%CI 0.16-0.61, p = 0.001) over the ATRA + CT regimen for low-to-intermediate-risk patients. Thus, ATO + ATRA and RIF + ATRA might be considered the optimum treatments for newly diagnosed APL and should be recommended as the standard of care for frontline therapy.

  5. Data analysis using scale-space filtering and Bayesian probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Kutulakos, Kiriakos; Robinson, Peter

    1991-01-01

    This paper describes a program for the analysis of output curves from a Differential Thermal Analyzer (DTA). The program first extracts probabilistic qualitative features from a DTA curve of a soil sample, and then uses Bayesian probabilistic reasoning to infer the minerals in the soil. The qualifier module employs a simple and efficient extension of scale-space filtering suitable for handling DTA data. We have observed that points can vanish from contours in the scale-space image when filtering operations are not highly accurate. To handle the problem of vanishing points, perceptual organization heuristics are used to group the points into lines. Next, these lines are grouped into contours by using additional heuristics. Probabilities are associated with these contours using domain-specific correlations. A Bayes tree classifier processes the probabilistic features to infer the presence of different minerals in the soil. Experiments show that the algorithm that uses domain-specific correlations to infer qualitative features outperforms a domain-independent algorithm that does not.
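
    The core scale-space idea, coarser Gaussian smoothing progressively removing zero-crossings (contour points) of the second derivative, fits in a few lines; the curve and scales below are illustrative rather than real DTA data:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    t = np.linspace(0, 1, 500)
    peaks = np.exp(-((t - 0.3) / 0.02) ** 2) - 0.5 * np.exp(-((t - 0.7) / 0.05) ** 2)
    noisy = peaks + np.random.default_rng(4).normal(0, 0.05, t.size)

    for sigma in [2, 4, 8, 16]:                        # increasing scale
        d2 = gaussian_filter1d(noisy, sigma, order=2)  # smoothed 2nd derivative
        zc = np.flatnonzero(np.diff(np.sign(d2)))      # zero-crossing positions
        print(f"sigma={sigma:2d}: {zc.size} zero-crossings survive")
    ```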

  6. Bayesian analysis of stage-fall-discharge rating curves and their uncertainties

    NASA Astrophysics Data System (ADS)

    Mansanarez, V.; Le Coz, J.; Renard, B.; Lang, M.; Pierrefeu, G.; Vauchel, P.

    2016-09-01

    Stage-fall-discharge (SFD) rating curves are traditionally used to compute streamflow records at sites where the energy slope of the flow is variable due to variable backwater effects. We introduce a model with hydraulically interpretable parameters for estimating SFD rating curves and their uncertainties. Conventional power functions for channel and section controls are used. The transition to a backwater-affected channel control is computed based on a continuity condition, solved either analytically or numerically. The practical use of the method is demonstrated with two real twin-gauge stations, the Rhône River at Valence, France, and the Guthusbekken stream at station 0003·0033, Norway. Those stations are typical of a channel control and a section control, respectively, when backwater-unaffected conditions apply. The performance of the method is investigated through sensitivity analysis to prior information on controls and to observations (i.e., available gaugings) for the station of Valence. These analyses suggest that precisely identifying SFD rating curves requires an adapted gauging strategy and/or informative priors. The Madeira River, one of the largest tributaries of the Amazon, provides a challenging case typical of large, flat, tropical river networks, where bed roughness can be variable in addition to slope. In this case, the difference in staff gauge reference levels must be estimated as another uncertain parameter of the SFD model. The proposed Bayesian method is a valuable alternative to the graphical and empirical techniques still proposed in hydrometry guidance and standards.
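
    Schematically, an SFD rating combines power-function controls with a fall-based correction; the sketch below assumes a square-root fall correction and invented coefficients purely for illustration (the paper estimates such parameters, and their uncertainties, by Bayesian inference):

    ```python
    import numpy as np

    a, b, c = 25.0, 0.4, 1.6           # power-function control (invented values)

    def discharge(h, fall, fall_ref=0.8):
        """Backwater-corrected rating: Q = a*(h-b)^c * sqrt(fall/fall_ref)."""
        return a * np.maximum(h - b, 0.0) ** c * np.sqrt(fall / fall_ref)

    for h, f in [(1.5, 0.8), (1.5, 0.3), (2.0, 0.8)]:
        print(f"stage {h:.1f} m, fall {f:.1f} m -> Q = {discharge(h, f):6.1f} m^3/s")
    ```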

  7. Bayesian Modeling of MPSS Data: Gene Expression Analysis of Bovine Salmonella Infection

    PubMed Central

    Dhavala, Soma S.; Datta, Sujay; Mallick, Bani K.; Carroll, Raymond J.; Khare, Sangeeta; Lawhon, Sara D.; Adams, L. Garry

    2010-01-01

    Massively Parallel Signature Sequencing (MPSS) is a high-throughput counting-based technology available for gene expression profiling. It produces output that is similar to Serial Analysis of Gene Expression (SAGE) and is ideal for building complex relational databases for gene expression. Our goal is to compare the in vivo global gene expression profiles of tissues infected with different strains of Salmonella obtained using the MPSS technology. In this article, we develop an exact ANOVA-type model for these count data using a zero-inflated Poisson (ZIP) distribution, different from existing methods that assume continuous densities. We adopt two Bayesian hierarchical models: one parametric and the other semiparametric with a Dirichlet process prior that has the ability to "borrow strength" across related signatures, where a signature is a specific arrangement of nucleotides, usually 16-21 base pairs long. We utilize the discreteness of the Dirichlet process prior to cluster signatures that exhibit similar differential expression profiles. Tests for differential expression are carried out using nonparametric approaches, while controlling the false discovery rate. We identify several differentially expressed genes that have important biological significance and conclude with a summary of the biological discoveries. PMID:21165171
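
    The zero-inflated Poisson building block can be sketched as follows; for brevity the two ZIP parameters are fit by maximum likelihood on simulated counts, whereas the article places Bayesian hierarchical (parametric or Dirichlet process) priors over them:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit, gammaln

    rng = np.random.default_rng(5)
    n, pi_true, lam_true = 2000, 0.3, 4.0
    counts = rng.poisson(lam_true, n) * (rng.uniform(size=n) > pi_true)

    def neg_loglik(theta):
        pi, lam = expit(theta[0]), np.exp(theta[1])     # keep parameters in range
        ll_zero = np.log(pi + (1 - pi) * np.exp(-lam))  # structural or Poisson zero
        ll_pos = np.log1p(-pi) - lam + counts * np.log(lam) - gammaln(counts + 1)
        return -np.sum(np.where(counts == 0, ll_zero, ll_pos))

    fit = minimize(neg_loglik, x0=[0.0, 0.0])
    print(f"pi = {expit(fit.x[0]):.3f}, lambda = {np.exp(fit.x[1]):.3f}")
    ```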

  8. A Bayesian Approach to the Design and Analysis of Computer Experiments

    SciTech Connect

    Currin, C.

    1988-01-01

    We consider the problem of designing and analyzing experiments for prediction of the function y(t), t ∈ T, where y is evaluated by means of a computer code (typically by solving complicated equations that model a physical system), and T represents the domain of inputs to the code. We use a Bayesian approach, in which uncertainty about y is represented by a spatial stochastic process (random function); here we restrict attention to stationary Gaussian processes. The posterior mean function can be used as an interpolating function, with uncertainties given by the posterior standard deviations. Instead of completely specifying the prior process, we consider several families of priors, and suggest some cross-validational methods for choosing one that performs relatively well on the function at hand. As a design criterion, we use the expected reduction in the entropy of the random vector y(T*), where T* ⊂ T is a given finite set of "sites" (input configurations) at which predictions are to be made. We describe an exchange algorithm for constructing designs that are optimal with respect to this criterion. To demonstrate the use of these design and analysis methods, several examples are given, including one experiment on a computer model of a thermal energy storage device and another on an integrated circuit simulator.
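
    For a fixed stationary Gaussian process prior, the prediction step described above reduces to the usual conditioning formulas; a compact sketch with a squared-exponential covariance (an assumption chosen for illustration) and a cheap stand-in for the computer code:

    ```python
    import numpy as np

    def k(a, b, ell=0.3, var=1.0):
        """Squared-exponential covariance between two sets of 1-D sites."""
        return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

    code = lambda t: np.sin(6 * t) + t              # stand-in for an expensive code
    T_design = np.linspace(0, 1, 7)                 # design sites
    y = code(T_design)

    T_star = np.linspace(0, 1, 101)                 # prediction sites
    K = k(T_design, T_design) + 1e-10 * np.eye(7)   # jitter for numerical stability
    Ks = k(T_star, T_design)

    mean = Ks @ np.linalg.solve(K, y)               # posterior mean (interpolator)
    cov = k(T_star, T_star) - Ks @ np.linalg.solve(K, Ks.T)
    sd = np.sqrt(np.clip(np.diag(cov), 0.0, None))  # posterior standard deviation
    print("max posterior sd between design sites:", sd.max().round(3))
    ```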

  9. Revisiting thermodynamics and kinetic diffusivities of uranium–niobium with Bayesian uncertainty analysis

    DOE PAGES

    Duong, Thien C.; Hackenberg, Robert Errol; Landa, Alex; ...

    2016-09-20

    In this paper, thermodynamic and kinetic diffusivities of uranium–niobium (U–Nb) are re-assessed by means of the CALPHAD (CALculation of PHAse Diagram) methodology. In order to improve the consistency and reliability of the assessments, first-principles calculations are coupled with CALPHAD. In particular, heats of formation of γ-U–Nb are estimated and verified using various density-functional theory (DFT) approaches. These thermochemistry data are then used as constraints to guide the thermodynamic optimization process in such a way that the mutual consistency between first-principles calculations and the CALPHAD assessment is satisfactory. In addition, long-term aging experiments are conducted in order to generate new phase equilibria data at the γ2/α+γ2 boundary. These data are meant to verify the thermodynamic model. Assessment results are generally in good agreement with experiments and previous calculations, without showing the artifacts that were observed in previous modeling. The mutually consistent thermodynamic description is then used to evaluate the atomic mobility and diffusivity of γ-U–Nb. Finally, Bayesian analysis is conducted to evaluate the uncertainty of the thermodynamic model and its impact on the system's phase stability.

  10. Iteration of Partially-Specified Target Matrices: Applications in Exploratory and Bayesian Confirmatory Factor Analysis

    PubMed Central

    Moore, Tyler M.; Reise, Steven P.; Depaoli, Sarah; Haviland, Mark G.

    2015-01-01

    We describe and evaluate a factor rotation algorithm, iterated target rotation (ITR). Whereas target rotation (Browne, 2001) requires a user to specify a target matrix a priori based on theory or prior research, ITR begins with a standard analytic factor rotation (i.e., an empirically-informed target) followed by an iterative search procedure to update the target matrix. Monte Carlo simulations were conducted to evaluate the performance of ITR relative to analytic rotations from the Crawford-Ferguson family with population factor structures varying in complexity. Simulation results: (a) suggested that ITR analyses will be particularly useful when evaluating data with complex structures (i.e., multiple cross-loadings) and (b) showed that the rotation method used to define an initial target matrix did not materially affect the accuracy of the various ITRs. In Study 2, we: (a) demonstrated the application of ITR as a way to determine empirically-informed priors in a Bayesian confirmatory factor analysis (BCFA; Muthén & Asparouhov, 2012) of a rater-report alexithymia measure (Haviland, Warren, & Riggs, 2000) and (b) highlighted some of the challenges when specifying empirically-based priors and assessing item and overall model fit. PMID:26609875

  11. Iteration of Partially Specified Target Matrices: Applications in Exploratory and Bayesian Confirmatory Factor Analysis.

    PubMed

    Moore, Tyler M; Reise, Steven P; Depaoli, Sarah; Haviland, Mark G

    2015-01-01

    We describe and evaluate a factor rotation algorithm, iterated target rotation (ITR). Whereas target rotation (Browne, 2001) requires a user to specify a target matrix a priori based on theory or prior research, ITR begins with a standard analytic factor rotation (i.e., an empirically informed target) followed by an iterative search procedure to update the target matrix. In Study 1, Monte Carlo simulations were conducted to evaluate the performance of ITR relative to analytic rotations from the Crawford-Ferguson family with population factor structures varying in complexity. Simulation results: (a) suggested that ITR analyses will be particularly useful when evaluating data with complex structures (i.e., multiple cross-loadings) and (b) showed that the rotation method used to define an initial target matrix did not materially affect the accuracy of the various ITRs. In Study 2, we: (a) demonstrated the application of ITR as a way to determine empirically informed priors in a Bayesian confirmatory factor analysis (BCFA; Muthén & Asparouhov, 2012) of a rater-report alexithymia measure (Haviland, Warren, & Riggs, 2000) and (b) highlighted some of the challenges when specifying empirically based priors and assessing item and overall model fit.

  12. Revisiting thermodynamics and kinetic diffusivities of uranium–niobium with Bayesian uncertainty analysis

    SciTech Connect

    Duong, Thien C.; Hackenberg, Robert Errol; Landa, Alex; Honarmandi, Pejman; Talapatra, Anjana; Volz, Heather Michelle; Llobet, Anna Megias; Smith, Alice Iulia; King, Graham Missell; Bajaj, Saurabh; Ruban, Andrei; Vitos, Levente; Turchi, Patrice E. A.; Arroyave, Raymundo

    2016-09-20

    In this paper, thermodynamic and kinetic diffusivities of uranium–niobium (U–Nb) are re-assessed by means of the CALPHAD (CALculation of PHAse Diagram) methodology. In order to improve the consistency and reliability of the assessments, first-principles calculations are coupled with CALPHAD. In particular, heats of formation of γ-U–Nb are estimated and verified using various density-functional theory (DFT) approaches. These thermochemistry data are then used as constraints to guide the thermodynamic optimization process in such a way that the mutual consistency between first-principles calculations and the CALPHAD assessment is satisfactory. In addition, long-term aging experiments are conducted in order to generate new phase equilibria data at the γ2/α+γ2 boundary. These data are meant to verify the thermodynamic model. Assessment results are generally in good agreement with experiments and previous calculations, without showing the artifacts that were observed in previous modeling. The mutually consistent thermodynamic description is then used to evaluate the atomic mobility and diffusivity of γ-U–Nb. Finally, Bayesian analysis is conducted to evaluate the uncertainty of the thermodynamic model and its impact on the system's phase stability.

  13. Cross-validation analysis of bias models in Bayesian multi-model projections of climate

    NASA Astrophysics Data System (ADS)

    Huttunen, J. M. J.; Räisänen, J.; Nissinen, A.; Lipponen, A.; Kolehmainen, V.

    2017-03-01

    Climate change projections are commonly based on multi-model ensembles of climate simulations. In this paper we consider the choice of bias models in Bayesian multimodel predictions. Buser et al. (Clim Res 44(2-3):227-241, 2010a) introduced a hybrid bias model which combines commonly used constant bias and constant relation bias assumptions. The hybrid model includes a weighting parameter which balances these bias models. In this study, we use a cross-validation approach to study which bias model or bias parameter leads to, in a specific sense, optimal climate change projections. The analysis is carried out for summer and winter season means of 2 m-temperatures spatially averaged over the IPCC SREX regions, using 19 model runs from the CMIP5 data set. The cross-validation approach is applied to calculate optimal bias parameters (in the specific sense) for projecting the temperature change from the control period (1961-2005) to the scenario period (2046-2090). The results are compared to the results of the Buser et al. (Clim Res 44(2-3):227-241, 2010a) method which includes the bias parameter as one of the unknown parameters to be estimated from the data.

  14. Bayesian analysis of nonlinear mixed-effects mixture models for longitudinal data with heterogeneity and skewness.

    PubMed

    Lu, Xiaosun; Huang, Yangxin

    2014-07-20

    It is common practice to analyze complex longitudinal data using nonlinear mixed-effects (NLME) models with a normality assumption. NLME models with normal distributions provide the most popular framework for modeling continuous longitudinal outcomes, assuming individuals are from a homogeneous population and relying on random effects to accommodate inter-individual variation. However, two issues stand out: (i) the normality assumption for model errors may cause a lack of robustness and subsequently lead to invalid inference and unreasonable estimates, particularly if the data exhibit skewness, and (ii) a homogeneous population assumption may unrealistically obscure important features of between-subject and within-subject variations, which may result in unreliable modeling results. There have been relatively few studies concerning longitudinal data with both heterogeneity and skewness features. In the last two decades, skew distributions have proven beneficial in dealing with asymmetric data in various applications. In this article, our objective is to address the simultaneous impact of both features arising in longitudinal data by developing a flexible finite mixture of NLME models with skew distributions under a Bayesian framework that allows estimation of both model parameters and class membership probabilities. Simulation studies are conducted to assess the performance of the proposed models and methods, and a real example from an AIDS clinical trial illustrates the methodology by modeling the viral dynamics to compare potential models with different distribution specifications; the analysis results are reported.

  15. Bayesian analysis for nonlinear mixed-effects models under heavy-tailed distributions.

    PubMed

    De la Cruz, Rolando

    2014-01-01

    A common assumption in nonlinear mixed-effects models is the normality of both random effects and within-subject errors. However, such assumptions make inferences vulnerable to the presence of outliers. More flexible distributions are therefore necessary for modeling both sources of variability in this class of models. In the present paper, I consider an extension of nonlinear mixed-effects models in which random effects and within-subject errors are assumed to be distributed according to a rich class of parametric models that are often used for robust inference. The class of distributions I consider is the scale mixture of multivariate normal distributions, which comprises a wide range of symmetric, continuous distributions, including heavy-tailed multivariate distributions such as the Student's t, slash, and contaminated normal. With the scale mixture of multivariate normal distributions, robustification is achieved through the tail behavior of the different distributions. A Bayesian framework is adopted, and MCMC is used to carry out the posterior analysis. Model comparison using different criteria was considered. The procedures are illustrated using a real dataset from a pharmacokinetic study. I contrast results from the normal and robust models and show how the implementation can be used to detect outliers.
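
    The scale-mixture representation is easy to see by simulation: dividing a normal draw by the square root of a Gamma mixing variable yields a Student's t draw, which is what gives these models their heavy tails (values below are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    nu, n = 10, 200_000
    w = rng.gamma(nu / 2, 2 / nu, size=n)   # mixing variable, Gamma(nu/2, rate nu/2)
    x = rng.normal(size=n) / np.sqrt(w)     # marginally Student's t with nu dof

    kurt = np.mean((x - x.mean()) ** 4) / x.var() ** 2
    print(f"sample kurtosis: {kurt:.2f} (normal: 3, t with 10 dof: 4)")
    ```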

  16. Bayesian nonparametric regression analysis of data with random effects covariates from longitudinal measurements.

    PubMed

    Ryu, Duchwan; Li, Erning; Mallick, Bani K

    2011-06-01

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves.

  17. Composite behavior analysis for video surveillance using hierarchical dynamic Bayesian networks

    NASA Astrophysics Data System (ADS)

    Cheng, Huanhuan; Shan, Yong; Wang, Runsheng

    2011-03-01

    Analyzing composite behaviors involving objects from multiple categories in surveillance videos is a challenging task due to the complicated relationships among humans and objects. This paper presents a novel behavior analysis framework using a hierarchical dynamic Bayesian network (DBN) for video surveillance systems. The model is built for extracting objects' behaviors and their relationships by representing behaviors using spatial-temporal characteristics. The recognition of object behaviors is processed by the DBN at multiple levels: features of objects at the low level, objects and their relationships at the middle level, and events at the high level, where an event refers to behaviors of a single type of object as well as behaviors involving several types of objects, such as "a person getting into a car." Furthermore, to reduce complexity, a simple model selection criterion is introduced, by which the appropriate model is picked out from a pool of candidate models. Experiments demonstrate that the proposed framework can efficiently recognize and semantically describe composite object and human activities in surveillance videos.

  18. Bayesian analysis of spatial-dependent cosmic-ray propagation: Astrophysical background of antiprotons and positrons

    NASA Astrophysics Data System (ADS)

    Feng, Jie; Tomassetti, Nicola; Oliva, Alberto

    2016-12-01

    The AMS-02 experiment has reported a new measurement of the antiproton/proton ratio in Galactic cosmic rays (CRs). In the energy range E ~ 60–450 GeV, this ratio is found to be remarkably constant. Using recent data on CR proton, helium, and carbon fluxes, 10Be/9Be and B/C ratios, we have performed a global Bayesian analysis based on a Markov chain Monte Carlo sampling algorithm under a "two halo model" of CR propagation. In this model, CRs are allowed to experience a different type of diffusion when they propagate in the region close to the Galactic disk. We found that the vertical extent of this region is about 900 pc above and below the disk, and the corresponding diffusion coefficient scales with energy as D ∝ E^0.15, describing well the observations on primary CR spectra, secondary/primary ratios, and anisotropy. Under this model, we have carried out improved calculations of antiparticle spectra arising from secondary CR production and their corresponding uncertainties. We made use of Monte Carlo generators and accelerator data to assess the antiproton production cross sections and their uncertainties. While the positron excess requires the contribution of additional unknown sources, we found that the new AMS-02 antiproton data are consistent, within the estimated uncertainties, with our calculations based on secondary production.

  19. Assessment of occupational safety risks in Floridian solid waste systems using Bayesian analysis.

    PubMed

    Bastani, Mehrad; Celik, Nurcin

    2015-10-01

    Safety risks embedded within solid waste management systems continue to be a significant issue and are prevalent at every step in the solid waste management process. To recognise and address these occupational hazards, it is necessary to discover the potential safety concerns that cause them, as well as their direct and/or indirect impacts on the different types of solid waste workers. In this research, our goal is to statistically assess occupational safety risks to solid waste workers in the state of Florida. Here, we first review the standard industrial codes related to the major solid waste management methods, including recycling, incineration, landfilling, and composting. Then, a quantitative assessment of the major risks is conducted based on the collected data, using Bayesian data analysis and predictive methods. The risks estimated in this study for the period 2005-2012 are then compared with historical statistics (1993-1997) from previous assessment studies. The results show that the rates of musculoskeletal and dermal injuries among refuse collectors have decreased from 88 and 15 injuries per 1000 workers to 16 and 3 injuries per 1000 workers, respectively. However, a contrasting trend is observed for recycling workers, for whom musculoskeletal and dermal injuries have increased from 13 and 4 injuries per 1000 workers to 14 and 6 injuries per 1000 workers, respectively. Lastly, a linear regression model is proposed to identify the major contributors to the high numbers of musculoskeletal and dermal injuries.

  20. Dissecting High-Dimensional Phenotypes with Bayesian Sparse Factor Analysis of Genetic Covariance Matrices

    PubMed Central

    Runcie, Daniel E.; Mukherjee, Sayan

    2013-01-01

    Quantitative genetic studies that model complex, multivariate phenotypes are important for both evolutionary prediction and artificial selection. For example, changes in gene expression can provide insight into developmental and physiological mechanisms that link genotype and phenotype. However, classical analytical techniques are poorly suited to quantitative genetic studies of gene expression where the number of traits assayed per individual can reach many thousand. Here, we derive a Bayesian genetic sparse factor model for estimating the genetic covariance matrix (G-matrix) of high-dimensional traits, such as gene expression, in a mixed-effects model. The key idea of our model is that we need consider only G-matrices that are biologically plausible. An organism’s entire phenotype is the result of processes that are modular and have limited complexity. This implies that the G-matrix will be highly structured. In particular, we assume that a limited number of intermediate traits (or factors, e.g., variations in development or physiology) control the variation in the high-dimensional phenotype, and that each of these intermediate traits is sparse – affecting only a few observed traits. The advantages of this approach are twofold. First, sparse factors are interpretable and provide biological insight into mechanisms underlying the genetic architecture. Second, enforcing sparsity helps prevent sampling errors from swamping out the true signal in high-dimensional data. We demonstrate the advantages of our model on simulated data and in an analysis of a published Drosophila melanogaster gene expression data set. PMID:23636737

  1. Guideline for bolted joint design and analysis: version 1.0.

    SciTech Connect

    Brown, Kevin H.; Morrow, Charles W.; Durbin, Samuel; Baca, Allen

    2008-01-01

    This document provides general guidance for the design and analysis of bolted joint connections. An overview of the current methods used to analyze bolted joint connections is given. Several methods for the design and analysis of bolted joint connections are presented. Guidance is provided for general bolted joint design, computation of preload uncertainty and preload loss, and the calculation of the bolted joint factor of safety. Axial loads, shear loads, thermal loads, and thread tear-out are used in factor of safety calculations. Additionally, limited guidance is provided for fatigue considerations. An overview of an associated Mathcad© worksheet containing all bolted joint design formulae presented is also provided.
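
    As a flavor of the calculations such a guideline covers, the sketch below chains textbook formulas for preload from installation torque, bolt stress under an external axial load, a yield factor of safety, and the joint separation load; every number is invented for illustration and none is taken from the guideline's worksheet:

    ```python
    d = 0.0095      # bolt nominal diameter, m (~3/8 in)
    K = 0.2         # torque (nut) factor, dimensionless
    T = 30.0        # installation torque, N*m
    A_t = 5.03e-5   # tensile stress area, m^2
    S_y = 634e6     # bolt yield strength, Pa
    P_ext = 8000.0  # external axial load on the joint, N
    C = 0.25        # joint stiffness (load) factor

    F_i = T / (K * d)                  # preload estimate from torque
    sigma_b = (F_i + C * P_ext) / A_t  # bolt stress under the applied load
    FS_yield = S_y / sigma_b           # factor of safety against yield
    F_sep = F_i / (1 - C)              # axial load at which the joint separates

    print(f"preload {F_i / 1e3:.1f} kN, FS against yield {FS_yield:.2f}, "
          f"separation load {F_sep / 1e3:.1f} kN vs applied {P_ext / 1e3:.1f} kN")
    ```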

  2. Bayesian spatiotemporal analysis of zero-inflated biological population density data by a delta-normal spatiotemporal additive model.

    PubMed

    Arcuti, Simona; Pollice, Alessio; Ribecco, Nunziata; D'Onghia, Gianfranco

    2016-03-01

    We evaluate the spatiotemporal changes in the density of a particular species of crustacean, the deep-water rose shrimp, Parapenaeus longirostris, based on biological sample data collected during trawl surveys carried out from 1995 to 2006 as part of the international project MEDITS (MEDiterranean International Trawl Surveys). As is the case for many biological variables, density data are continuous and characterized by unusually large numbers of zeros, accompanied by a skewed distribution of the remaining values. Here we analyze the normalized density data by a Bayesian delta-normal semiparametric additive model including the effects of covariates, using penalized regression with low-rank thin-plate splines for nonlinear spatial and temporal effects. Modeling the zero and nonzero values by two joint processes, as we propose in this work, allows great flexibility and easy handling of complex likelihood functions, avoiding inaccurate statistical inferences due to misclassification of the high proportion of exact zeros in the model. Bayesian model estimation is obtained by Markov chain Monte Carlo simulations, suitably specifying the complex likelihood function of the zero-inflated density data. The study highlights relevant nonlinear spatial and temporal effects and the influence of the annual Mediterranean Oscillation index and of the sea surface temperature on the distribution of deep-water rose shrimp density.
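
    The delta idea itself, modeling occurrence and positive density as two processes and combining them, shows up clearly even on synthetic data (the paper's spatiotemporal splines and covariate effects are not reproduced here):

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    n = 5000
    occurs = rng.uniform(size=n) < 0.35            # presence/absence process
    density = np.where(occurs, rng.lognormal(1.0, 0.8, n), 0.0)

    p_pos = np.mean(density > 0)                   # occurrence probability
    mean_pos = density[density > 0].mean()         # mean of the positive densities
    print(f"delta estimate of mean density: {p_pos * mean_pos:.3f}")
    print(f"direct sample mean:             {density.mean():.3f}")
    ```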

  3. Bayesian Statistics: A Place in Educational Research?

    ERIC Educational Resources Information Center

    Diamond, James

    The use of Bayesian statistics as the basis of classical analysis of data is described. Bayesian analysis is a set of procedures for changing opinions about a given phenomenon based upon rational observation of a set of data. The Bayesian arrives at a set of prior beliefs regarding some states of nature; he observes data in a study and then…
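
    The prior-to-posterior updating described here has a classic closed-form example, a Beta prior on a success probability updated by binomial data (numbers below are illustrative):

    ```python
    from scipy.stats import beta

    a0, b0 = 2, 2                    # prior belief centred on 0.5
    successes, failures = 17, 3      # observed data
    post = beta(a0 + successes, b0 + failures)

    lo, hi = post.interval(0.95)
    print(f"posterior mean {post.mean():.2f}, 95% interval ({lo:.2f}, {hi:.2f})")
    ```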

  4. IMU-based joint angle measurement for gait analysis.

    PubMed

    Seel, Thomas; Raisch, Jörg; Schauer, Thomas

    2014-04-16

    This contribution is concerned with joint angle calculation based on inertial measurement data in the context of human motion analysis. Unlike most robotic devices, the human body lacks even surfaces and right angles. Therefore, we focus on methods that avoid assuming certain orientations in which the sensors are mounted with respect to the body segments. After a review of available methods that may cope with this challenge, we present a set of new methods for: (1) joint axis and position identification; and (2) flexion/extension joint angle measurement. In particular, we propose methods that use only gyroscopes and accelerometers and, therefore, do not rely on a homogeneous magnetic field. We provide results from gait trials of a transfemoral amputee in which we compare the inertial measurement unit (IMU)-based methods to an optical 3D motion capture system. Unlike most authors, we place the optical markers on anatomical landmarks instead of attaching them to the IMUs. Root mean square errors of the knee flexion/extension angles are found to be less than 1° on the prosthesis and about 3° on the human leg. For the plantar/dorsiflexion of the ankle, both deviations are about 1°.
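
    One ingredient of such methods, integrating the difference of the two segments' angular rates about an already-identified joint axis, can be sketched as below; the axis identification and accelerometer-based drift correction central to the paper are omitted, and all signals are simulated:

    ```python
    import numpy as np

    fs = 100.0                                       # sample rate, Hz
    t = np.arange(0.0, 5.0, 1 / fs)
    true_angle = 30 * np.sin(2 * np.pi * t)          # knee-like motion, deg
    true_rate = np.gradient(true_angle, 1 / fs)      # deg/s

    rng = np.random.default_rng(6)
    # Gyroscope rates of thigh and shank projected onto the joint axis (deg/s).
    g_thigh = 0.3 * true_rate + rng.normal(0, 1, t.size)
    g_shank = -0.7 * true_rate + rng.normal(0, 1, t.size)

    rel_rate = g_thigh - g_shank                     # relative angular rate
    angle = np.cumsum(rel_rate) / fs                 # integrate to a joint angle
    rmse = np.sqrt(np.mean((angle - true_angle) ** 2))
    print(f"RMSE vs. truth: {rmse:.2f} deg (drift grows without correction)")
    ```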

  5. Analysis Method for Inelastic, Adhesively Bonded Joints with Anisotropic Adherends

    NASA Technical Reports Server (NTRS)

    Smeltzer, Stanley S., III; Klang, Eric C.

    2003-01-01

    A one-dimensional analysis method for evaluating adhesively bonded joints composed of anisotropic adherends and adhesives with nonlinear material behavior is presented in this paper. The strain and resulting stress field in a general bonded joint overlap are determined by using a variable-step, finite-difference solution algorithm to iteratively solve a system of first-order differential equations. The applied loading is given by a system of combined extensional, bending, and shear forces applied to the edge of the joint overlap. The adherends are assumed to behave as linear, cylindrically bent plates, using classical laminated plate theory that includes the effects of first-order transverse shear deformation. Using the deformation theory of plasticity and a modified von Mises yield criterion, inelastic material behavior is modeled in the adhesive layer. Results for the proposed method are verified against previous results from the literature and shown to be in excellent agreement. An additional case that highlights the effects of transverse shear deformation between similar adherends is also presented.

  6. Analysis of preloaded bolted joints under exponentially decaying pressure

    SciTech Connect

    Esmailzadeh, E.; Chorashi, M.; Ohadi, A.R.

    1996-11-01

    The dynamic properties of joints must be considered when designing complex structures. A good deal of investigation has been carried out toward a better understanding of the dynamic behavior of mechanical joints. It is suitable initially to identify the parameters of a mechanical joint by using either experimental modal analysis or an accurate finite element model, and then to predict the behavior of the closure bolting system by means of a spring-mass-damper model. The effect of bolt prestress on the maximum bolt displacement and stress is treated. The loading is assumed to be an initially peaked, exponentially decaying internal pressure pulse acting on the closure. The dependence of peak bolt stresses and deflections on the bolt prestress level and system damping is investigated. It is shown that the derived formulas, if damping is neglected, reduce to those reported in the literature. Furthermore, the damping effect is shown to be most important for large natural frequencies, longer loading durations, and lower levels of prestress. The existence of damping, which reduces the maximum bolt displacement and stress, is shown to be beneficial, especially for longer loading durations. The importance of reducing bolt displacement from the viewpoint of fatigue life, vibration loosening, and sealing, especially for lower values of prestress, is fully emphasized.
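
    The closure bolting model described above is, in essence, a single-degree-of-freedom oscillator driven by an initially peaked, exponentially decaying pulse; the sketch below uses invented parameters to show how damping lowers the peak displacement:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    m, k = 2.0, 5.0e6       # effective mass (kg) and stiffness (N/m), invented
    F0, tau = 2.0e4, 0.01   # pulse peak (N) and decay time constant (s)

    def peak_displacement(zeta):
        c = 2 * zeta * np.sqrt(k * m)                # damping coefficient
        rhs = lambda t, y: [y[1],
                            (F0 * np.exp(-t / tau) - c * y[1] - k * y[0]) / m]
        sol = solve_ivp(rhs, [0.0, 0.1], [0.0, 0.0], max_step=1e-4)
        return np.max(np.abs(sol.y[0]))

    for zeta in [0.0, 0.05, 0.2]:                    # damping ratios
        print(f"zeta = {zeta:.2f}: peak displacement "
              f"{peak_displacement(zeta) * 1e3:.2f} mm")
    ```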

  7. A Bayesian SIRS model for the analysis of respiratory syncytial virus in the region of Valencia, Spain.

    PubMed

    Corberán-Vallet, Ana; Santonja, Francisco J

    2014-09-01

    We present a Bayesian stochastic susceptible-infected-recovered-susceptible (SIRS) model in discrete time to understand respiratory syncytial virus (RSV) dynamics in the region of Valencia, Spain. A SIRS model based on ordinary differential equations has also been proposed to describe RSV dynamics in the region of Valencia. However, this continuous-time deterministic model is not suitable when the initial number of infected individuals is small. Stochastic epidemic models based on a probability of disease transmission provide a more natural description of the spread of infectious diseases. In addition, by allowing the transmission rate to vary stochastically over time, the proposed model provides an improved description of RSV dynamics. The Bayesian analysis of the model allows us to calculate both the posterior distribution of the model parameters and the posterior predictive distribution, which facilitates the computation of point forecasts and prediction intervals for future observations.
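
    A discrete-time stochastic SIRS chain of this kind, with binomial transitions and a stochastically varying transmission rate, can be simulated in a few lines (parameter values are illustrative, not the fitted RSV values):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    N, weeks = 100_000, 156
    S, I, R = N - 10, 10, 0
    beta, gamma, xi = 0.6, 0.5, 0.01        # transmission, recovery, waning

    history = []
    for _ in range(weeks):
        beta_t = beta * np.exp(rng.normal(0, 0.05))   # stochastic transmission rate
        p_inf = 1 - np.exp(-beta_t * I / N)           # per-susceptible weekly risk
        new_inf = rng.binomial(S, p_inf)
        new_rec = rng.binomial(I, 1 - np.exp(-gamma))
        new_sus = rng.binomial(R, 1 - np.exp(-xi))
        S += new_sus - new_inf
        I += new_inf - new_rec
        R += new_rec - new_sus
        history.append(I)

    print("peak weekly number of infectives:", max(history))
    ```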

  8. A Bayesian Approach for Evaluation of Determinants of Health System Efficiency Using Stochastic Frontier Analysis and Beta Regression

    PubMed Central

    2016-01-01

    Public expenditure on health is one of the most important issues for governments, and rising expenditures are putting pressure on public budgets. Health policy makers have therefore focused on the performance of their health systems, and many countries have introduced reforms to improve that performance. This study investigates the most important determinants of healthcare efficiency for OECD countries using a two-stage Bayesian stochastic frontier analysis (BSFA). First, we measure the healthcare efficiency of 29 OECD countries by BSFA using data from the OECD Health Database. In the second stage, we model the relationships between healthcare efficiency and the characteristics of healthcare systems across OECD countries using Bayesian beta regression. PMID:27118987
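
    A minimal sketch of a second-stage Bayesian beta regression on synthetic efficiency scores, using a hand-rolled random-walk Metropolis sampler rather than the authors' actual implementation; the covariate, priors and step size are all assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic data: efficiency scores in (0,1) regressed on one covariate.
n = 29
x = rng.normal(size=n)                         # standardized health-system covariate
mu_true = 1 / (1 + np.exp(-(1.0 + 0.5 * x)))   # logit link
y = rng.beta(mu_true * 50, (1 - mu_true) * 50)

def log_post(theta):
    b0, b1, log_phi = theta
    phi = np.exp(log_phi)                      # precision > 0
    mu = 1 / (1 + np.exp(-(b0 + b1 * x)))
    loglik = stats.beta.logpdf(y, mu * phi, (1 - mu) * phi).sum()
    logprior = (stats.norm.logpdf([b0, b1], 0, 10).sum()
                + stats.norm.logpdf(log_phi, 0, 2))
    return loglik + logprior

# Random-walk Metropolis over (b0, b1, log phi).
theta, lp = np.array([0.0, 0.0, 2.0]), -np.inf
draws = []
for it in range(20_000):
    prop = theta + 0.05 * rng.normal(size=3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    draws.append(theta.copy())
draws = np.array(draws[5_000:])                # discard burn-in
print("posterior means (b0, b1, log phi):", draws.mean(axis=0))
```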

  9. A Bayesian Meta-Analysis on Prevalence of Hepatitis B Virus Infection among Chinese Volunteer Blood Donors

    PubMed Central

    Liu, Guang-ying; Zheng, Yang; Deng, Yan; Gao, Yan-yan; Wang, Lie

    2013-01-01

    Background Although transfusion-transmitted infection of hepatitis B virus (HBV) threatens the blood safety of China, the nationwide circumstances of HBV infection among blood donors are still unclear. Objectives To comprehensively estimate the prevalence of HBsAg positivity and occult HBV infection (OBI) among Chinese volunteer blood donors through Bayesian meta-analysis. Methods We performed an electronic search in PubMed, Web of Knowledge, Medline, Wanfang Data and CNKI, complemented by a hand search of relevant reference lists. Two authors independently extracted data from the eligible studies. Two Bayesian random-effects meta-analyses were then performed, followed by Bayesian meta-regressions. Results 5,957,412 and 571,227 donors were identified in the HBsAg group and the OBI group, respectively. The pooled prevalences in the HBsAg and OBI groups were 1.085% (95% credible interval [CI] 0.859%-1.398%) and 0.094% (95% CI 0.0578%-0.1655%), respectively. For the HBsAg group, subgroup analysis shows that more developed areas have a lower prevalence than less developed areas; meta-regression indicates a significant decreasing trend in HBsAg-positive prevalence with sampling year (beta = -0.1202, 95% CI -0.2081 to -0.0312). Conclusion Blood safety against HBV infection in China faces serious threats, and the government should take effective measures to improve the situation. PMID:24236110

  10. Quantifying model-structure- and parameter-driven uncertainties in spring wheat phenology prediction with Bayesian analysis

    DOE PAGES

    Alderman, Phillip D.; Stanfill, Bryan

    2016-10-06

    Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. Here, this study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.

  11. Quantifying model-structure- and parameter-driven uncertainties in spring wheat phenology prediction with Bayesian analysis

    SciTech Connect

    Alderman, Phillip D.; Stanfill, Bryan

    2016-10-06

    Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. Here, this study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
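
    One simple way to frame this kind of decomposition, assuming an ensemble of predictions indexed by model structure and posterior parameter draw, is the law of total variance; the sketch below uses synthetic numbers and is not the paper's actual analysis:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic ensemble: 9 phenology model structures x 500 posterior parameter
# draws, each giving a predicted days-to-heading value (illustrative only).
n_models, n_draws = 9, 500
model_bias = rng.normal(0, 3, size=n_models)             # structural differences
pred = 60 + model_bias[:, None] + rng.normal(0, 2, size=(n_models, n_draws))

# Law of total variance: Var(Y) = Var_m(E[Y|m]) + E_m(Var[Y|m]).
var_structure = pred.mean(axis=1).var(ddof=0)    # between-model variance
var_parameter = pred.var(axis=1, ddof=0).mean()  # mean within-model variance
total = var_structure + var_parameter
print(f"model-structure share: {var_structure / total:.1%}")
print(f"parameter share:       {var_parameter / total:.1%}")
```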

  12. Bayesian Statistical Analysis of Historical and Late Holocene Rates of Sea-Level Change

    NASA Astrophysics Data System (ADS)

    Cahill, Niamh; Parnell, Andrew; Kemp, Andrew; Horton, Benjamin

    2014-05-01

    A fundamental concern associated with climate change is the rate at which sea levels are rising. Studies of past sea level (particularly beyond the instrumental data range) allow modern sea-level rise to be placed in a more complete context. Considering this, we perform a Bayesian statistical analysis on historical and late Holocene rates of sea-level change. The data that form the input to the statistical model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. The aims are to estimate rates of sea-level rise, to determine when modern rates of sea-level rise began and to observe how these rates have been changing over time. Many of the current methods for doing this use simple linear regression to estimate rates. This is often inappropriate as it is too rigid and it can ignore uncertainties that arise as part of the data collection exercise. This can lead to overconfidence in the sea-level trends being characterized. The proposed Bayesian model places a Gaussian process prior on the rate process (i.e. the process that determines how rates of sea-level change vary over time). The likelihood of the observed data is the integral of this process. When dealing with proxy reconstructions, this is set in an errors-in-variables framework so as to take account of age uncertainty. It is also necessary, in this case, for the model to account for glacio-isostatic adjustment, which introduces a covariance between individual age and sea-level observations. This method provides a flexible fit and it allows for the direct estimation of the rate process with full consideration of all sources of uncertainty. Analysis of tide-gauge datasets and proxy reconstructions in this way means that changing rates of sea level can be estimated more comprehensively and accurately than previously possible. The model captures the continuous and dynamic evolution of sea-level change and results show that not only are modern sea levels rising but that the rates
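
    A heavily simplified sketch of the core idea, assuming a zero-mean Gaussian process prior on the rate and a discretized integral (cumulative sum) as the observation operator; conditioning is then a standard Gaussian linear-model update. All kernels, noise levels and data are invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# Discretized version of "sea level = integral of a GP rate process + noise".
t = np.linspace(0, 100, 60)                      # years
dt = t[1] - t[0]
K = 4.0 * np.exp(-0.5 * ((t[:, None] - t[None, :]) / 30.0) ** 2)  # rate kernel (mm/yr)^2
L = np.tril(np.ones((t.size, t.size))) * dt      # integration operator: s = L r
true_rate = 1.0 + 0.02 * t                       # accelerating rise (mm/yr), illustrative
y = L @ true_rate + rng.normal(0, 5.0, t.size)   # noisy sea-level observations (mm)

# Posterior mean of the rate given y (Gaussian linear model, zero-mean prior):
#   E[r | y] = K L^T (L K L^T + s2 I)^(-1) y
s2 = 25.0
G = L @ K @ L.T + s2 * np.eye(t.size)
post_rate = K @ L.T @ np.linalg.solve(G, y)
print("posterior mean rate, first/last decade (mm/yr):",
      post_rate[:6].mean().round(2), post_rate[-6:].mean().round(2))
```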

  13. Reduction of a Whole-Body Physiologically Based Pharmacokinetic Model to Stabilise the Bayesian Analysis of Clinical Data.

    PubMed

    Wendling, Thierry; Tsamandouras, Nikolaos; Dumitras, Swati; Pigeolet, Etienne; Ogungbenro, Kayode; Aarons, Leon

    2016-01-01

    Whole-body physiologically based pharmacokinetic (PBPK) models are increasingly used in drug development for their ability to predict drug concentrations in clinically relevant tissues and to extrapolate across species, experimental conditions and sub-populations. A whole-body PBPK model can be fitted to clinical data using a Bayesian population approach. However, the analysis might be time consuming and numerically unstable if prior information on the model parameters is too vague given the complexity of the system. We suggest an approach where (i) a whole-body PBPK model is formally reduced using a Bayesian proper lumping method to retain the mechanistic interpretation of the system and account for parameter uncertainty, (ii) the simplified model is fitted to clinical data using Markov Chain Monte Carlo techniques and (iii) the optimised reduced PBPK model is used for extrapolation. A previously developed 16-compartment whole-body PBPK model for mavoglurant was reduced to 7 compartments while preserving plasma concentration-time profiles (median and variance) and giving emphasis to the brain (target site) and the liver (elimination site). The reduced model was numerically more stable than the whole-body model for the Bayesian analysis of mavoglurant pharmacokinetic data in healthy adult volunteers. Finally, the reduced yet mechanistic model could easily be scaled from adults to children and predict mavoglurant pharmacokinetics in children aged from 3 to 11 years with similar performance compared with the whole-body model. This study is a first example of the practicality of formal reduction of complex mechanistic models for Bayesian inference in drug development.

  14. Bayesian analysis of the dynamic cosmic web in the SDSS galaxy survey

    NASA Astrophysics Data System (ADS)

    Leclercq, Florent; Jasche, Jens; Wandelt, Benjamin

    2015-06-01

    Recent application of the Bayesian algorithm BORG to the Sloan Digital Sky Survey (SDSS) main sample galaxies resulted in the physical inference of the formation history of the observed large-scale structure from its origin to the present epoch. In this work, we use these inferences as inputs for a detailed probabilistic cosmic web-type analysis. To do so, we generate a large set of data-constrained realizations of the large-scale structure using a fast, fully non-linear gravitational model. We then perform a dynamic classification of the cosmic web into four distinct components (voids, sheets, filaments, and clusters) on the basis of the tidal field. Our inference framework automatically and self-consistently propagates typical observational uncertainties to web-type classification. As a result, this study produces accurate cosmographic classification of large-scale structure elements in the SDSS volume. By also providing the history of these structure maps, the approach allows an analysis of the origin and growth of the early traces of the cosmic web present in the initial density field and of the evolution of global quantities such as the volume and mass filling fractions of different structures. For the problem of web-type classification, the results described in this work constitute the first connection between theory and observations at non-linear scales including a physical model of structure formation and the demonstrated capability of uncertainty quantification. A connection between cosmology and information theory using real data also naturally emerges from our probabilistic approach. Our results constitute quantitative chrono-cosmography of the complex web-like patterns underlying the observed galaxy distribution.

  15. Bayesian analysis of the dynamic cosmic web in the SDSS galaxy survey

    SciTech Connect

    Leclercq, Florent; Wandelt, Benjamin

    2015-06-01

    Recent application of the Bayesian algorithm BORG to the Sloan Digital Sky Survey (SDSS) main sample galaxies resulted in the physical inference of the formation history of the observed large-scale structure from its origin to the present epoch. In this work, we use these inferences as inputs for a detailed probabilistic cosmic web-type analysis. To do so, we generate a large set of data-constrained realizations of the large-scale structure using a fast, fully non-linear gravitational model. We then perform a dynamic classification of the cosmic web into four distinct components (voids, sheets, filaments, and clusters) on the basis of the tidal field. Our inference framework automatically and self-consistently propagates typical observational uncertainties to web-type classification. As a result, this study produces accurate cosmographic classification of large-scale structure elements in the SDSS volume. By also providing the history of these structure maps, the approach allows an analysis of the origin and growth of the early traces of the cosmic web present in the initial density field and of the evolution of global quantities such as the volume and mass filling fractions of different structures. For the problem of web-type classification, the results described in this work constitute the first connection between theory and observations at non-linear scales including a physical model of structure formation and the demonstrated capability of uncertainty quantification. A connection between cosmology and information theory using real data also naturally emerges from our probabilistic approach. Our results constitute quantitative chrono-cosmography of the complex web-like patterns underlying the observed galaxy distribution.

  16. Health at the borders: Bayesian multilevel analysis of women's malnutrition determinants in Ethiopia

    PubMed Central

    Delbiso, Tefera Darge; Rodriguez-Llanes, Jose Manuel; Altare, Chiara; Masquelier, Bruno; Guha-Sapir, Debarati

    2016-01-01

    Background Women's malnutrition, particularly undernutrition, remains an important public health challenge in Ethiopia. Although various studies have examined the levels and determinants of women's nutritional status, the influence of living close to an international border on women's nutrition has not been investigated. Yet Ethiopian borders are regularly affected by conflict and refugee flows, which might ultimately impact health. Objective To investigate the impact of living close to borders on the nutritional status of women in Ethiopia, while considering other important covariates. Design Our analysis was based on the body mass index (BMI) of 6,334 adult women aged 20-49 years, obtained from the 2011 Ethiopian Demographic and Health Survey (EDHS). A Bayesian multilevel multinomial logistic regression analysis was used to capture the clustered structure of the data and the possible correlation that may exist within and between clusters. Results After controlling for potential confounders, women living close to borders (i.e. ≤100 km) in Ethiopia were 59% more likely to be underweight (posterior odds ratio [OR]=1.59; 95% credible interval [CrI]: 1.32-1.90) than their counterparts living far from the borders. This result was robust to different choices of border delineation (i.e. ≤50, ≤75, ≤125, and ≤150 km). Poverty, lack of access to improved toilets, lowland residence, and Muslim religion were independently associated with underweight. In contrast, greater wealth, higher education, older age, access to improved toilets, being married, and living in urban or lowland areas were independently associated with overweight. Conclusions The problem of undernutrition among women in Ethiopia is most worrisome in the border areas. Targeted interventions to improve nutritional status in these areas, such as improved access to sanitation and economic and livelihood support, are recommended. PMID:27388539

  17. Reconstruction of a beech population bottleneck using archival demographic information and Bayesian analysis of genetic data.

    PubMed

    Lander, Tonya A; Oddou-Muratorio, Sylvie; Prouillet-Leplat, Helene; Klein, Etienne K

    2011-12-01

    Range expansion and contraction has occurred in the history of most species and can seriously impact patterns of genetic diversity. Historical data about range change are rare and generally appropriate for studies at large scales, whereas the individual pollen and seed dispersal events that form the basis of gene flow and colonization generally occur at a local scale. In this study, we investigated range change in Fagus sylvatica on Mont Ventoux, France, using historical data from 1838 to the present and approximate Bayesian computation (ABC) analyses of genetic data. From the historical data, we identified a population minimum in 1845 and located remnant populations at least 200 years old. The ABC analysis selected a demographic scenario with three populations, corresponding to two remnant populations and one area of recent expansion. It also identified expansion from a smaller ancestral population but did not find that this expansion followed a population bottleneck, as suggested by the historical data. Despite strong support for the selected scenario for our data set, the ABC approach showed low power to discriminate among scenarios on average and a low ability to accurately estimate effective population sizes and divergence dates, probably due to the temporal scale of the study. This study provides an unusual opportunity to test ABC analysis in a system with a well-documented demographic history and to identify discrepancies between the results of historical, classical population genetic and ABC analyses. The results also provide valuable insights into genetic processes at work at fine spatial and temporal scales in range change and colonization.
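
    A toy sketch of ABC rejection in this spirit, inferring a bottleneck population size from heterozygosity decay under pure drift, H_t = H_0(1 - 1/(2N))^t; the prior, tolerance and "observed" value are all invented and this is far simpler than the study's actual scenarios:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy ABC rejection sampler for a (constant) bottleneck population size N.
H0, generations, H_obs, eps = 0.30, 40, 0.27, 0.003

N = rng.uniform(20, 2000, size=200_000)              # prior draws on N
H_sim = (H0 * (1 - 1 / (2 * N)) ** generations
         + rng.normal(0, 0.005, N.size))             # stochastic summary statistic
post = N[np.abs(H_sim - H_obs) < eps]                # accept close simulations

print(f"accepted {post.size} of {N.size} draws")
print(f"posterior median N = {np.median(post):.0f}, "
      f"95% CrI = {np.percentile(post, [2.5, 97.5]).round(0)}")
```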

  18. Bayesian Belief Network to support conflict analysis for groundwater protection: the case of the Apulia region.

    PubMed

    Giordano, Raffaele; D'Agostino, Daniela; Apollonio, Ciro; Lamaddalena, Nicola; Vurro, Michele

    2013-01-30

    Water resource management is often characterized by conflicts, as a result of the heterogeneity of interests associated with a shared resource. Many water conflicts arise on a global scale and, in particular, an increasing level of conflict can be observed in the Mediterranean basin, which is characterized by water scarcity. In the present work, in order to assist the conflict analysis process and thus support proper groundwater management, stakeholders were involved in the process and suitable tools were used in a Mediterranean area (the Apulia region, in Italy). In particular, this paper seeks to elicit and structure the mental models influencing farmers' decisions over the main water source for irrigation. The more crucial groundwater is to farmers' objectives, the more controversial the groundwater protection strategy becomes. Bayesian Belief Networks were developed to simulate farmers' behavior with regard to groundwater management and to assess the impacts of the protection strategy. These results were used to calculate the degree of conflict in the study area arising from the introduction of policies for the reduction of groundwater exploitation for irrigation purposes. The less acceptable the policy is, the more likely it is that conflict will develop between farmers and the Regional Authority. The results of the conflict analysis were also used to contribute to the debate concerning potential conflict mitigation measures. The approach adopted in this work has been discussed with a number of experts in groundwater management policies and irrigation management, and its main strengths and weaknesses have been identified. Increasing awareness of the existence of potential conflicts and the need to deal with them can be seen as an interesting initial shift in the Apulia region's water management regime, which is still grounded in merely technical approaches.

  19. Personality and coping traits: A joint factor analysis.

    PubMed

    Ferguson, Eamonn

    2001-11-01

    OBJECTIVES: The main objective of this paper is to explore the structural similarities between Eysenck's model of personality and the dimensions of the dispositional COPE. Costa et al. (Costa, P., Somerfield, M., & McCrae, R. (1996). Personality and coping: A reconceptualisation. In Handbook of coping: Theory, research and applications (pp. 44-61). New York: Wiley) suggest that personality and coping behaviour are part of a continuum based on adaptation. If this is the case, there should be structural similarities between measures of personality and coping behaviour. This is tested using a joint factor analysis of personality and coping measures. DESIGN: Cross-sectional survey. METHODS: The EPQ-R and the dispositional COPE were administered to 154 participants, and the data were analysed using joint factor analysis and bivariate associations. RESULTS: The joint factor analysis indicated that these data were best explained by a four-factor model. One factor was primarily unrelated to personality. There was a COPE-neurotic-introvert factor (NI-COPE) containing coping behaviours such as denial, a COPE-extroversion (E-COPE) factor containing behaviours such as seeking social support and a COPE-psychoticism factor (P-COPE) containing behaviours such as alcohol use. This factor pattern, especially for NI- and E-COPE, was interpreted in terms of Gray's model of personality (Gray, J. A. (1987). The psychology of fear and stress. Cambridge: Cambridge University Press). NI-, E-, and P-COPE were shown to be related, in a theoretically consistent manner, to perceived coping success and perceived coping functions. CONCLUSIONS: The results indicate that there are indeed conceptual links between models of personality and coping. It is argued that future research should focus on identifying coping 'trait complexes'. Implications for practice are discussed.

  20. Finite Element Analysis of the Maximum Stress at the Joints of the Transmission Tower

    NASA Astrophysics Data System (ADS)

    Itam, Zarina; Beddu, Salmia; Liyana Mohd Kamal, Nur; Bamashmos, Khaled H.

    2016-03-01

    Transmission towers are tall structures, usually steel lattice towers, used to support overhead power lines. Transmission towers are usually analyzed as frame-truss systems, with members assumed to be pin-connected and without explicitly considering the effects of joints on tower behavior. In this research, an engineering example of a joint is analyzed with consideration of the joint detailing to investigate how it affects the tower analysis. A static analysis using STAAD Pro was conducted to identify the joint with the maximum stress. This joint was then explicitly analyzed in ANSYS using the finite element method. Three approaches were used in the software: a simple plate model, bonded contact with no bolts, and beam-element bolts. Results from the joint analysis show that stress values increased when joint details were considered. This demonstrates that joints and connections play an important role in the distribution of stress within the transmission tower.

  1. A hybrid Bayesian hierarchical model combining cohort and case-control studies for meta-analysis of diagnostic tests: Accounting for partial verification bias.

    PubMed

    Ma, Xiaoye; Chen, Yong; Cole, Stephen R; Chu, Haitao

    2016-12-01

    To account for between-study heterogeneity in meta-analysis of diagnostic accuracy studies, bivariate random effects models have been recommended to jointly model the sensitivities and specificities. As study design and population vary, the definition of disease status or severity could differ across studies. Consequently, sensitivity and specificity may be correlated with disease prevalence. To account for this dependence, a trivariate random effects model had been proposed. However, the proposed approach can only include cohort studies with information estimating study-specific disease prevalence. In addition, some diagnostic accuracy studies only select a subset of samples to be verified by the reference test. It is known that ignoring unverified subjects may lead to partial verification bias in the estimation of prevalence, sensitivities, and specificities in a single study. However, the impact of this bias on a meta-analysis has not been investigated. In this paper, we propose a novel hybrid Bayesian hierarchical model combining cohort and case-control studies and correcting partial verification bias at the same time. We investigate the performance of the proposed methods through a set of simulation studies. Two case studies on assessing the diagnostic accuracy of gadolinium-enhanced magnetic resonance imaging in detecting lymph node metastases and of adrenal fluorine-18 fluorodeoxyglucose positron emission tomography in characterizing adrenal masses are presented.

  2. Bayesian Meta-Analysis of the Accuracy of a Test for Tuberculous Pleuritis in the Absence of a Gold Standard Reference

    PubMed Central

    Dendukuri, Nandini; Schiller, Ian; Joseph, Lawrence; Pai, Madhukar

    2013-01-01

    Summary Absence of a perfect reference test is an acknowledged source of bias in diagnostic studies. In the case of tuberculous pleuritis, standard reference tests such as smear microscopy, culture and biopsy have poor sensitivity. Yet meta-analyses of new tests for this disease have always assumed the reference standard is perfect, leading to biased estimates of the new test’s accuracy. We describe a method for joint meta-analysis of sensitivity and specificity of the diagnostic test under evaluation, while considering the imperfect nature of the reference standard. We use a Bayesian hierarchical model that takes into account within- and between-study variability. We show how to obtain pooled estimates of sensitivity and specificity, and how to plot a hierarchical summary receiver operating characteristic curve. We describe extensions of the model to situations where multiple reference tests are used, and where index and reference tests are conditionally dependent. The performance of the model is evaluated using simulations and illustrated using data from a meta-analysis of nucleic acid amplification tests (NAATs) for tuberculous pleuritis. The estimate of NAAT specificity was higher and the sensitivity lower compared to a model that assumed that the reference test was perfect. PMID:22568612

  3. Use of Bayesian Inference in Crystallographic Structure Refinement via Full Diffraction Profile Analysis

    PubMed Central

    Fancher, Chris M.; Han, Zhen; Levin, Igor; Page, Katharine; Reich, Brian J.; Smith, Ralph C.; Wilson, Alyson G.; Jones, Jacob L.

    2016-01-01

    A Bayesian inference method for refining crystallographic structures is presented. The distribution of model parameters is stochastically sampled using Markov chain Monte Carlo. Posterior probability distributions are constructed for all model parameters to properly quantify uncertainty by appropriately modeling the heteroskedasticity and correlation of the error structure. The proposed method is demonstrated by analyzing a National Institute of Standards and Technology silicon standard reference material. The results obtained by Bayesian inference are compared with those determined by Rietveld refinement. Posterior probability distributions of model parameters provide both estimates and uncertainties. The new method better estimates the true uncertainties in the model as compared to the Rietveld method. PMID:27550221

  4. Use of Bayesian Inference in Crystallographic Structure Refinement via Full Diffraction Profile Analysis.

    PubMed

    Fancher, Chris M; Han, Zhen; Levin, Igor; Page, Katharine; Reich, Brian J; Smith, Ralph C; Wilson, Alyson G; Jones, Jacob L

    2016-08-23

    A Bayesian inference method for refining crystallographic structures is presented. The distribution of model parameters is stochastically sampled using Markov chain Monte Carlo. Posterior probability distributions are constructed for all model parameters to properly quantify uncertainty by appropriately modeling the heteroskedasticity and correlation of the error structure. The proposed method is demonstrated by analyzing a National Institute of Standards and Technology silicon standard reference material. The results obtained by Bayesian inference are compared with those determined by Rietveld refinement. Posterior probability distributions of model parameters provide both estimates and uncertainties. The new method better estimates the true uncertainties in the model as compared to the Rietveld method.

  5. Bayesian Analysis for Risk Assessment of Selected Medical Events in Support of the Integrated Medical Model Effort

    NASA Technical Reports Server (NTRS)

    Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.

    2012-01-01

    The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them to make informed decisions about medical preparations and supplies needed for combating and treating various medical events using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical events occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical analysis is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical events occurring in flight. Bayes theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. The IMM team performed Bayesian updates on the following medical events: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.
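
    A minimal sketch of the kind of conjugate Bayesian update described here, combining a prior built from an analog population with observed data; all counts are made up for the demo, not IMM figures:

```python
from scipy import stats

# Beta-binomial conjugacy: Beta(a, b) prior + Binomial data -> Beta posterior.
prior_events, prior_n = 12, 10_000   # analog population (e.g., submariners), assumed
obs_events, obs_n = 1, 800           # observed astronaut person-missions, assumed

a0, b0 = prior_events + 1, prior_n - prior_events + 1   # prior from analog data
a1, b1 = a0 + obs_events, b0 + (obs_n - obs_events)     # posterior update
posterior = stats.beta(a1, b1)

print(f"posterior mean event probability: {posterior.mean():.2e}")
lo, hi = posterior.ppf([0.025, 0.975])
print(f"95% credible interval: ({lo:.2e}, {hi:.2e})")
```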

  6. Modeling of joints for the dynamic analysis of truss structures

    NASA Technical Reports Server (NTRS)

    Belvin, W. Keith

    1987-01-01

    An experimentally-based method for determining the stiffness and damping of truss joints is described. The analytical models use springs and both viscous and friction dampers to simulate joint load-deflection behavior. A least-squares algorithm is developed to identify the stiffness and damping coefficients of the analytical joint models from test data. The effects of nonlinear joint stiffness such as joint dead band are also studied. Equations for predicting the sensitivity of beam deformations to changes in joint stiffness are derived and used to show the level of joint stiffness required for nearly rigid joint behavior. Finally, the global frequency sensitivity of a truss structure to random perturbations in joint stiffness is discussed.
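
    A minimal sketch of least-squares identification of joint stiffness and viscous damping from (synthetic) load-deflection test data; a Coulomb friction damper would add a sign-of-velocity regressor. All values are invented:

```python
import numpy as np

rng = np.random.default_rng(5)

# Identify k and c in F = k*x + c*v from a harmonic displacement test.
k_true, c_true = 2.0e6, 150.0
t = np.linspace(0, 1, 400)
x = 1e-3 * np.sin(2 * np.pi * 5 * t)           # imposed displacement [m]
v = np.gradient(x, t)                          # velocity [m/s]
F = k_true * x + c_true * v + rng.normal(0, 2.0, t.size)  # measured force [N]

A = np.column_stack([x, v])                    # regressor matrix
# (a friction term would append np.sign(v) as a third column)
(k_hat, c_hat), *_ = np.linalg.lstsq(A, F, rcond=None)
print(f"identified k = {k_hat:.3e} N/m, c = {c_hat:.1f} N.s/m")
```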

  7. A method of spherical harmonic analysis in the geosciences via hierarchical Bayesian inference

    NASA Astrophysics Data System (ADS)

    Muir, J. B.; Tkalčić, H.

    2015-11-01

    The problem of decomposing irregular data on the sphere into a set of spherical harmonics is common in many fields of geosciences where it is necessary to build a quantitative understanding of a globally varying field. For example, in global seismology, a compressional or shear wave speed that emerges from tomographic images is used to interpret current state and composition of the mantle, and in geomagnetism, secular variation of magnetic field intensity measured at the surface is studied to better understand the changes in the Earth's core. Optimization methods are widely used for spherical harmonic analysis of irregular data, but they typically do not treat the dependence of the uncertainty estimates on the imposed regularization. This can cause significant difficulties in interpretation, especially when the best-fit model requires more variables as a result of underestimating data noise. Here, with the above limitations in mind, the problem of spherical harmonic expansion of irregular data is treated within the hierarchical Bayesian framework. The hierarchical approach significantly simplifies the problem by removing the need for regularization terms and user-supplied noise estimates. The use of the corrected Akaike Information Criterion for picking the optimal maximum degree of spherical harmonic expansion and the resulting spherical harmonic analyses are first illustrated on a noisy synthetic data set. Subsequently, the method is applied to two global data sets sensitive to the Earth's inner core and lowermost mantle, consisting of PKPab-df and PcP-P differential traveltime residuals relative to a spherically symmetric Earth model. The posterior probability distributions for each spherical harmonic coefficient are calculated via Markov Chain Monte Carlo sampling; the uncertainty obtained for the coefficients thus reflects the noise present in the real data and the imperfections in the spherical harmonic expansion.
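
    As a simplified, non-hierarchical stand-in for the method described (the paper removes explicit regularization via hierarchical priors and samples posteriors by MCMC), the sketch below fits spherical-harmonic coefficients to irregular synthetic data by least squares and uses the corrected Akaike Information Criterion to pick the truncation degree:

```python
import numpy as np
from scipy.special import sph_harm

rng = np.random.default_rng(11)

# Irregularly sampled synthetic field with a single Y_2^0 component.
npts = 300
theta = rng.uniform(0, 2 * np.pi, npts)        # azimuth
phi = np.arccos(rng.uniform(-1, 1, npts))      # polar angle, area-uniform
data = (1.0 + 0.8 * np.real(sph_harm(0, 2, theta, phi))
        + rng.normal(0, 0.05, npts))

def design(lmax):
    # Real-valued basis from real/imag parts of complex spherical harmonics.
    cols = []
    for l in range(lmax + 1):
        for m in range(0, l + 1):
            Y = sph_harm(m, l, theta, phi)
            cols.append(np.real(Y))
            if m > 0:
                cols.append(np.imag(Y))
    return np.column_stack(cols)

for lmax in range(1, 6):
    A = design(lmax)
    coef, *_ = np.linalg.lstsq(A, data, rcond=None)
    rss = np.sum((data - A @ coef) ** 2)
    k = A.shape[1] + 1                          # parameters + noise variance
    aicc = npts * np.log(rss / npts) + 2 * k + 2 * k * (k + 1) / (npts - k - 1)
    print(f"lmax={lmax}: {A.shape[1]} coefficients, AICc={aicc:.1f}")
```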

  8. Bayesian Analysis of Non-Gaussian Long-Range Dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, T.; Franzke, C.; Gramacy, R. B.; Watkins, N. W.

    2012-12-01

    Recent studies have strongly suggested that surface temperatures exhibit long-range dependence (LRD). The presence of LRD would hamper the identification of deterministic trends and the quantification of their significance. It is well established that LRD processes exhibit stochastic trends over rather long periods of time. Thus, accurate methods for discriminating between physical processes that possess long memory and those that do not are an important adjunct to climate modeling. We have used Markov Chain Monte Carlo algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average (ARFIMA) processes, which are capable of modeling LRD. Our principal aim is to obtain inference about the long memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series such as the Central England Temperature. Many physical processes, for example the Faraday time series from Antarctica, are highly non-Gaussian. We have therefore extended this work by weakening the Gaussianity assumption. Specifically, we assume a symmetric α-stable distribution for the innovations. Such processes provide good, flexible, initial models for non-Gaussian processes with long memory. We will present a study of the dependence of the posterior variance σ_d of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other measures of d.
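
    A minimal sketch of the fractional-differencing mechanics behind ARFIMA(0, d, 0): the MA(∞) weights follow the recursion psi_k = psi_{k-1}(k-1+d)/k, and the resulting series shows the slow autocorrelation decay characteristic of LRD. Settings are illustrative, and this is a simulation, not the paper's MCMC inference:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate ARFIMA(0, d, 0) via a truncated expansion of (1 - B)^(-d).
d, n = 0.3, 5000
psi = np.ones(n)
for k in range(1, n):
    psi[k] = psi[k - 1] * (k - 1 + d) / k      # MA(inf) weights, truncated at n
eps = rng.normal(size=2 * n)
x = np.array([psi @ eps[t:t + n][::-1] for t in range(n)])

def acf(series, lag):
    s = series - series.mean()
    return (s[:-lag] @ s[lag:]) / (s @ s)

# Under LRD the sample autocorrelation decays like a power law (slowly).
for lag in (1, 10, 100):
    print(f"lag {lag:>3}: acf = {acf(x, lag):.3f}")
```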

  9. Mapping, bayesian geostatistical analysis and spatial prediction of lymphatic filariasis prevalence in Africa.

    PubMed

    Slater, Hannah; Michael, Edwin

    2013-01-01

    There is increasing interest to control or eradicate the major neglected tropical diseases. Accurate modelling of the geographic distributions of parasitic infections will be crucial to this endeavour. We used 664 community level infection prevalence data collated from the published literature in conjunction with eight environmental variables, altitude and population density, and a multivariate Bayesian generalized linear spatial model that allows explicit accounting for spatial autocorrelation and incorporation of uncertainty in input data and model parameters, to construct the first spatially-explicit map describing LF prevalence distribution in Africa. We also ran the best-fit model against predictions made by the HADCM3 and CCCMA climate models for 2050 to predict the likely distributions of LF under future climate and population changes. We show that LF prevalence is strongly influenced by spatial autocorrelation between locations but is only weakly associated with environmental covariates. Infection prevalence, however, is found to be related to variations in population density. All associations with key environmental/demographic variables appear to be complex and non-linear. LF prevalence is predicted to be highly heterogenous across Africa, with high prevalences (>20%) estimated to occur primarily along coastal West and East Africa, and lowest prevalences predicted for the central part of the continent. Error maps, however, indicate a need for further surveys to overcome problems with data scarcity in the latter and other regions. Analysis of future changes in prevalence indicates that population growth rather than climate change per se will represent the dominant factor in the predicted increase/decrease and spread of LF on the continent. We indicate that these results could play an important role in aiding the development of strategies that are best able to achieve the goals of parasite elimination locally and globally in a manner that may also account

  10. A Bayesian re-analysis of HD 11964 extrasolar planet data

    NASA Astrophysics Data System (ADS)

    Gregory, Philip C.

    2007-05-01

    A Bayesian multi-planet Kepler periodogram has been developed for the analysis of precision radial velocity data (Gregory, ApJ, 631, 1198, 2005 and astro-ph/0609229). The periodogram employs a parallel tempering Markov chain Monte Carlo algorithm. The HD 11964 data (Butler et al., ApJ, 646, 505, 2006) have been re-analyzed using 1, 2, 3 and 4 planet models. Each model incorporates an extra noise parameter which can allow for additional independent Gaussian noise beyond the known measurement uncertainties. The most probable model exhibits three periods of 38.02 (+0.06/-0.05), 360 (+4/-4) and 1924 (+44/-43) d, and eccentricities of 0.22 (+0.11/-0.22), 0.63 (+0.34/-0.17) and 0.05 (+0.03/-0.05), respectively. Assuming the three signals (each one consistent with a Keplerian orbit) are caused by planets, the corresponding limits on planetary mass (M sin i) and semi-major axis are (0.090 (+0.15/-0.14) MJ, 0.253 (+0.009/-0.009) au), (0.21 (+0.05/-0.02) MJ, 1.13 (+0.04/-0.04) au), and (0.77 (+0.08/-0.08) MJ, 3.46 (+0.13/-0.13) au), respectively. The small difference (1.3 sigma) between the 360 day period and one year suggests that it might be worth investigating the barycentric correction for the HD 11964 data. This research was supported in part by a grant from the Natural Sciences and Engineering Research Council of Canada at the University of British Columbia.

  11. BAYESIAN META-ANALYSIS ON MEDICAL DEVICES: APPLICATION TO IMPLANTABLE CARDIOVERTER DEFIBRILLATORS

    PubMed Central

    Youn, Ji-Hee; Lord, Joanne; Hemming, Karla; Girling, Alan; Buxton, Martin

    2012-01-01

    Objectives: The aim of this study is to describe and illustrate a method to obtain early estimates of the effectiveness of a new version of a medical device. Methods: In the absence of empirical data, expert opinion may be elicited on the expected difference between the conventional and modified devices. Bayesian Mixed Treatment Comparison (MTC) meta-analysis can then be used to combine this expert opinion with existing trial data on earlier versions of the device. We illustrate this approach for a new four-pole implantable cardioverter defibrillator (ICD) compared with conventional ICDs, Class III anti-arrhythmic drugs, and conventional drug therapy for the prevention of sudden cardiac death in high risk patients. Existing RCTs were identified from a published systematic review, and we elicited opinion on the difference between four-pole and conventional ICDs from experts recruited at a cardiology conference. Results: Twelve randomized controlled trials were identified. Seven experts provided valid probability distributions for the new ICDs compared with current devices. The MTC model resulted in estimated relative risks of mortality of 0.74 (0.60–0.89) (predictive relative risk [RR] = 0.77 [0.41–1.26]) and 0.83 (0.70–0.97) (predictive RR = 0.84 [0.55–1.22]) with the new ICD therapy compared to Class III anti-arrhythmic drug therapy and conventional drug therapy, respectively. These results showed negligible differences from the preliminary results for the existing ICDs. Conclusions: The proposed method incorporating expert opinion to adjust for a modification made to an existing device may play a useful role in assisting decision makers to make early informed judgments on the effectiveness of frequently modified healthcare technologies. PMID:22559753

  12. MASSIVE: A Bayesian analysis of giant planet populations around low-mass stars

    NASA Astrophysics Data System (ADS)

    Lannier, J.; Delorme, P.; Lagrange, A. M.; Borgniet, S.; Rameau, J.; Schlieder, J. E.; Gagné, J.; Bonavita, M. A.; Malo, L.; Chauvin, G.; Bonnefoy, M.; Girard, J. H.

    2016-12-01

    Context. Direct imaging has led to the discovery of several giant planet and brown dwarf companions. These imaged companions populate a mass, separation and age domain (mass >1 MJup, orbits > 5 AU, age < 1 Gyr) quite distinct from the one occupied by exoplanets discovered by the radial velocity or transit methods. This distinction could indicate that different formation mechanisms are at play. Aims: We aim to investigate correlations between the host star's mass and the presence of wide-orbit giant planets, and to provide new observational constraints on planetary formation models. Methods: We observed 58 young and nearby M-type dwarfs in L'-band with the VLT/NaCo instrument and used angular differential imaging algorithms to optimize the sensitivity to planetary-mass companions and to derive the best detection limits. We estimate the probability of detecting a planet as a function of its mass and physical separation around each target. We conduct a Bayesian analysis to determine the frequency of substellar companions orbiting low-mass stars, using a homogeneous sub-sample of 54 stars. Results: We derive a frequency of for companions with masses in the range of 2-80 MJup, and % for planetary mass companions (2-14 MJup), at physical separations of 8 to 400 AU for both cases. Comparing our results with a previous survey targeting more massive stars, we find evidence that substellar companions more massive than 1 MJup with a low mass ratio Q with respect to their host star (Q < 1%) are less frequent around low-mass stars. This may represent observational evidence that the frequency of imaged wide-orbit substellar companions is correlated with stellar mass, corroborating theoretical expectations. In contrast, we show statistical evidence that intermediate-mass-ratio (1% < Q < 5%) companions with masses >2 MJup might be independent of the mass of the host star.

  13. New class of hybrid EoS and Bayesian M - R data analysis

    NASA Astrophysics Data System (ADS)

    Alvarez-Castillo, D.; Ayriyan, A.; Benic, S.; Blaschke, D.; Grigorian, H.; Typel, S.

    2016-03-01

    We explore systematically a new class of two-phase equations of state (EoS) for hybrid stars that is characterized by three main features: 1) stiffening of the nuclear EoS at supersaturation densities due to quark exchange effects (Pauli blocking) between hadrons, modelled by an excluded volume correction; 2) stiffening of the quark matter EoS at high densities due to multiquark interactions; and 3) possibility for a strong first-order phase transition with an early onset and large density jump. The third feature results from a Maxwell construction for the possible transition from the nuclear to a quark matter phase and its properties depend on the two parameters used for 1) and 2), respectively. Varying these two parameters, one obtains a class of hybrid EoS that yields solutions of the Tolman-Oppenheimer-Volkoff (TOV) equations for sequences of hadronic and hybrid stars in the mass-radius diagram which cover the full range of patterns according to the Alford-Han-Prakash classification following which a hybrid star branch can be either absent, connected or disconnected with the hadronic one. The latter case often includes a tiny connected branch. The disconnected hybrid star branch, also called "third family", corresponds to high-mass twin stars characterized by the same gravitational mass but different radii. We perform a Bayesian analysis and demonstrate that the observation of such a pair of high-mass twin stars would have a sufficient discriminating power to favor hybrid EoS with a strong first-order phase transition over alternative EoS.
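
    For reference, a minimal statement of the TOV equations from which such mass-radius sequences are computed (standard textbook form in units with G = c = 1; not specific to this paper's notation):

```latex
\frac{dP}{dr} = -\frac{\left[\varepsilon(r)+P(r)\right]\left[m(r)+4\pi r^{3}P(r)\right]}{r\left[r-2m(r)\right]},
\qquad
\frac{dm}{dr} = 4\pi r^{2}\,\varepsilon(r),
```

    closed by the EoS relation P = P(ε); integrating outward from a central pressure to P = 0 yields one point on the mass-radius curve, and scanning over central pressures traces out the hadronic and hybrid branches.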

  14. Mapping, Bayesian Geostatistical Analysis and Spatial Prediction of Lymphatic Filariasis Prevalence in Africa

    PubMed Central

    Slater, Hannah; Michael, Edwin

    2013-01-01

    There is increasing interest to control or eradicate the major neglected tropical diseases. Accurate modelling of the geographic distributions of parasitic infections will be crucial to this endeavour. We used 664 community level infection prevalence data collated from the published literature in conjunction with eight environmental variables, altitude and population density, and a multivariate Bayesian generalized linear spatial model that allows explicit accounting for spatial autocorrelation and incorporation of uncertainty in input data and model parameters, to construct the first spatially-explicit map describing LF prevalence distribution in Africa. We also ran the best-fit model against predictions made by the HADCM3 and CCCMA climate models for 2050 to predict the likely distributions of LF under future climate and population changes. We show that LF prevalence is strongly influenced by spatial autocorrelation between locations but is only weakly associated with environmental covariates. Infection prevalence, however, is found to be related to variations in population density. All associations with key environmental/demographic variables appear to be complex and non-linear. LF prevalence is predicted to be highly heterogenous across Africa, with high prevalences (>20%) estimated to occur primarily along coastal West and East Africa, and lowest prevalences predicted for the central part of the continent. Error maps, however, indicate a need for further surveys to overcome problems with data scarcity in the latter and other regions. Analysis of future changes in prevalence indicates that population growth rather than climate change per se will represent the dominant factor in the predicted increase/decrease and spread of LF on the continent. We indicate that these results could play an important role in aiding the development of strategies that are best able to achieve the goals of parasite elimination locally and globally in a manner that may also account

  15. Predictive uncertainty analysis of plume distribution for geological carbon sequestration using sparse-grid Bayesian method

    NASA Astrophysics Data System (ADS)

    Shi, X.; Zhang, G.

    2013-12-01

    Because of the extensive computational burden, parametric uncertainty analyses are rarely conducted for process-based multi-phase models of geological carbon sequestration (GCS). The difficulty of predictive uncertainty analysis for CO2 plume migration in realistic GCS models stems not only from the spatial distribution of the caprock and reservoir (i.e. heterogeneous model parameters), but also from the fact that the GCS optimization estimation problem has multiple local minima, owing to the complex nonlinear multi-phase (gas and aqueous), multi-component (water, CO2, salt) transport equations. The geological model built by Doughty and Pruess (2004) for the Frio pilot site (Texas) was selected and assumed to represent the 'true' system; it is composed of seven different facies (geological units) distributed among 10 layers. We chose to calibrate the permeabilities of these facies. Pressure and gas saturation values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. Each simulation of the model takes about 2 hours. In this study, we develop a new approach that improves the computational efficiency of Bayesian inference by constructing a surrogate system based on an adaptive sparse-grid stochastic collocation method. This surrogate response-surface global optimization algorithm is first used to calibrate the model parameters; the prediction uncertainty of the CO2 plume position propagated from parametric uncertainty is then quantified in the numerical experiments and compared to the actual plume from the 'true' model. Results prove that the approach is computationally efficient for multi-modal optimization and prediction uncertainty quantification with computationally expensive simulation models. Both our inverse methodology and findings are broadly applicable to GCS in heterogeneous storage formations.

  16. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines: including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where difference in opinion between response team members contributes to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the
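
    A minimal sketch of the arithmetic at the heart of an event tree: outcome probabilities are products of conditional node probabilities along a branch. The numbers below are invented for illustration, not VDAP estimates:

```python
# Conditional probabilities at each node of a toy volcanic event tree.
tree = {
    "unrest": 1.0,                 # starting condition for the forecast window
    "magmatic | unrest": 0.6,      # unrest is magmatic (vs hydrothermal/tectonic)
    "eruption | magmatic": 0.4,    # magmatic unrest culminates in eruption
    "VEI>=3 | eruption": 0.25,     # eruption is large
}

# Multiply conditional probabilities along the branch of interest.
p_eruption = (tree["unrest"]
              * tree["magmatic | unrest"]
              * tree["eruption | magmatic"])
p_large = p_eruption * tree["VEI>=3 | eruption"]
print(f"P(eruption in window) = {p_eruption:.2f}")
print(f"P(VEI>=3 eruption)    = {p_large:.3f}")
```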

  17. Cancer mortality inequalities in urban areas: a Bayesian small area analysis in Spanish cities

    PubMed Central

    2011-01-01

    Background Intra-urban inequalities in mortality have been infrequently analysed in European contexts. The aim of the present study was to analyse patterns of cancer mortality and their relationship with socioeconomic deprivation in small areas in 11 Spanish cities. Methods This is a cross-sectional ecological study using mortality data (years 1996-2003). The units of analysis were census tracts, and a deprivation index was calculated for each tract. In order to control the variability in estimating the risk of dying, we used Bayesian models. We present the RR of the census tract with the highest deprivation vs. the census tract with the lowest deprivation. Results Among men, socioeconomic inequalities are observed in total cancer mortality in all cities except Castellon, Cordoba and Vigo, with Barcelona (RR = 1.53, 95%CI 1.42-1.67), Madrid (RR = 1.57, 95%CI 1.49-1.65) and Seville (RR = 1.53, 95%CI 1.36-1.74) presenting the greatest inequalities. In general, Barcelona and Madrid present inequalities for most types of cancer. Among women, inequalities in total cancer mortality have only been found in Barcelona and Zaragoza. The excess number of cancer deaths due to socioeconomic deprivation was 16,413 for men and 1,142 for women. Conclusion This study has analysed inequalities in cancer mortality in small areas of cities in Spain, not only relating this mortality to socioeconomic deprivation, but also calculating the excess mortality which may be attributed to such deprivation. This knowledge is particularly useful for determining which geographical areas in each city need intersectorial policies in order to promote a healthy environment. PMID:21232096

  18. Identification of QTLs of resistance to white mold in common bean from multiple markers by using Bayesian analysis.

    PubMed

    Lara, L A C; Santos, J B; Balestre, M; Lima, I A; Pamplona, A K A; Veloso, J S; Silva, P H

    2015-02-06

    In this study, we identified simple sequence repeat, amplified fragment length polymorphism, and sequence-related amplified polymorphism markers linked to quantitative trait loci (QTLs) for resistance to white mold disease in common bean progenies derived from a cross between lines CNFC 9506 and RP-2, evaluated using the oxalic acid test and Bayesian analysis. DNA was extracted from 186 F₂ plants and their parental lines for molecular analysis. Fifteen experiments were carried out for phenotypic analysis, which included 186 F₂:₄ progenies, the F₁ generation, the F₂ generation, and the lines CNFC 9506, RP-2, and G122 as common treatments. A completely randomized experimental design with 3 replications was used in controlled environments. The adjusted means for the F₂:₄ generation were used to identify QTLs by Bayesian shrinkage analysis. Significant differences were observed among the progenies for the reaction to white mold. The moving away method under the Bayesian approach was effective for identifying QTLs when it was not possible to obtain a genetic map because of low marker density. Using the Wald test, 25 markers identified QTLs for resistance to white mold: 16 simple sequence repeats, 7 amplified fragment length polymorphisms, and 2 sequence-related amplified polymorphisms. The markers BM184, BM211, and PV-gaat001 showed low distances from QTLs related to white mold resistance. In addition, these markers showed signal effects that increase resistance to white mold and high heritability in the analysis with oxalic acid, and thus are promising for marker-assisted selection.

  19. Comparing energy sources for surgical ablation of atrial fibrillation: a Bayesian network meta-analysis of randomized, controlled trials.

    PubMed

    Phan, Kevin; Xie, Ashleigh; Kumar, Narendra; Wong, Sophia; Medi, Caroline; La Meir, Mark; Yan, Tristan D

    2015-08-01

    Simplified maze procedures involving radiofrequency, cryoenergy and microwave energy sources have been increasingly utilized for the surgical treatment of atrial fibrillation as an alternative to the traditional cut-and-sew approach. In the absence of direct comparisons, a Bayesian network meta-analysis is an alternative means of assessing the relative effects of different treatments using indirect evidence. A Bayesian meta-analysis of indirect evidence was performed using 16 published randomized trials identified from 6 databases. Rank probability analysis was used to rank each intervention in terms of its probability of having the best outcome. Sinus rhythm prevalence beyond the 12-month follow-up was similar between the cut-and-sew, microwave and radiofrequency approaches, all of which ranked better than cryoablation (39, 36, and 25 vs 1%, respectively). The cut-and-sew maze ranked worst in terms of mortality outcomes compared with microwave, radiofrequency and cryoenergy (2 vs 19, 34, and 24%, respectively). The cut-and-sew maze procedure was associated with significantly lower stroke rates compared with microwave ablation [odds ratio <0.01; 95% confidence interval 0.00, 0.82], and ranked best in terms of pacemaker requirements compared with microwave, radiofrequency and cryoenergy (81 vs 14, 1, and <0.01%, respectively). Bayesian rank probability analysis shows that the cut-and-sew approach is associated with the best outcomes in terms of sinus rhythm prevalence and stroke outcomes, and remains the gold standard approach for AF treatment. Given the limitations of indirect comparison analysis, these results should be viewed with caution and not over-interpreted.
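
    A minimal sketch of rank-probability analysis: given posterior draws of a summary outcome for each treatment, count how often each treatment ranks best. The draws below are synthetic, not derived from the 16 trials:

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic posterior draws of an outcome (say, log-odds of sinus rhythm)
# for four treatments; larger is better.
treatments = ["cut-and-sew", "microwave", "radiofrequency", "cryoablation"]
draws = rng.normal(loc=[0.50, 0.45, 0.30, -0.20], scale=0.3, size=(10_000, 4))

best = np.argmax(draws, axis=1)                 # index of best treatment per draw
p_best = np.bincount(best, minlength=4) / draws.shape[0]
for name, p in zip(treatments, p_best):
    print(f"P({name} is best) = {p:.2f}")
```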

  20. Genome-wide association study of swine farrowing traits. Part II: Bayesian analysis of marker data.

    PubMed

    Schneider, J F; Rempel, L A; Snelling, W M; Wiedmann, R T; Nonneman, D J; Rohrer, G A

    2012-10-01

    Reproductive efficiency has a great impact on the economic success of pork (Sus scrofa) production. Number born alive (NBA) and average piglet birth weight (ABW) contribute greatly to reproductive efficiency. To better understand the underlying genetics of birth traits, a genome-wide association study (GWAS) was undertaken. Samples of DNA were collected and tested using the Illumina PorcineSNP60 BeadChip from 1,152 first-parity gilts. Traits included total number born (TNB), NBA, number born dead (NBD), number stillborn (NSB), number of mummies (MUM), total litter birth weight (LBW), and ABW. A total of 41,151 SNP were tested using a Bayesian approach. Beginning with the first 5 SNP on SSC1 and ending with the last 5 SNP on SSCX, SNP were assigned to groups of 5 consecutive SNP by chromosome-position order and analyzed again using a Bayesian approach. From that analysis, 5-SNP groups were selected having no overlap with other 5-SNP groups and no overlap across chromosomes. These selected 5-SNP non-overlapping groups were defined as QTL. Of the available 8,814 QTL, 124 were found to be statistically significant (P < 0.01). Multiple testing was considered using the probability of false positives. Eleven QTL were found for TNB: 3 on SSC1, 3 on SSC4, 1 on SSC13, 1 on SSC14, 2 on SSC15, and 1 on SSC17. Statistical testing for NBA identified 14 QTL: 4 on SSC1, 1 on SSC4, 1 on SSC6, 1 on SSC10, 1 on SSC13, 3 on SSC15, and 3 on SSC17. A single NBD QTL was found on SSC11. No QTL were identified for NSB or MUM. Thirty-three QTL were found for LBW: 3 on SSC1, 1 on SSC2, 1 on SSC3, 5 on SSC4, 2 on SSC5, 5 on SSC6, 3 on SSC7, 2 on SSC9, 1 on SSC10, 2 on SSC14, 6 on SSC15, and 2 on SSC17. A total of 65 QTL were found for ABW: 9 on SSC1, 3 on SSC2, 9 on SSC5, 5 on SSC6, 1 on SSC7, 2 on SSC8, 2 on SSC9, 3 on SSC10, 1 on SSC11, 3 on SSC12, 2 on SSC13, 8 on SSC14, 8 on SSC15, 1 on SSC17, and 8 on SSC18. Several candidate genes have been identified that overlap QTL locations.
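
    The 5-SNP grouping step can be reproduced mechanically. A small sketch with a hypothetical SNP map (names and positions invented), assuming pandas is available:

```python
import pandas as pd

# Hypothetical SNP map: chromosome and position for each tested marker.
snps = pd.DataFrame({
    "snp": [f"rs{i}" for i in range(23)],
    "chrom": ["SSC1"] * 12 + ["SSC2"] * 11,
    "pos": list(range(12)) + list(range(11)),
})

# Assign consecutive, non-overlapping 5-SNP windows within each chromosome,
# mirroring the paper's QTL definition; trailing groups of <5 SNPs are dropped
# so no group spans a chromosome boundary.
def five_snp_groups(df: pd.DataFrame) -> pd.DataFrame:
    df = df.sort_values(["chrom", "pos"]).copy()
    df["group"] = df.groupby("chrom").cumcount() // 5
    sizes = df.groupby(["chrom", "group"])["snp"].transform("size")
    return df[sizes == 5]

print(five_snp_groups(snps).groupby(["chrom", "group"])["snp"].apply(list))
```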

  1. Selected aspects of prior and likelihood information for a Bayesian classifier in a road safety analysis.

    PubMed

    Nowakowska, Marzena

    2017-04-01

    The development of a Bayesian logistic regression model classifying road accident severity is discussed. Informative priors already exploited in the literature (method of moments, maximum likelihood estimation, and two-stage Bayesian updating), along with an original Boot prior proposal, are investigated for the case in which no expert opinion is available. In addition, two possible approaches to updating the priors, in the form of unbalanced and balanced training data sets, are presented. The obtained Bayesian logistic models are assessed on the basis of the deviance information criterion (DIC), highest probability density (HPD) intervals, and coefficients of variation estimated for the model parameters. The verification of model accuracy is based on sensitivity, specificity and the harmonic mean of sensitivity and specificity, all calculated from a test data set. The models obtained from the balanced training data set have a better classification quality than the ones obtained from the unbalanced training data set. The two-stage Bayesian updating prior model and the Boot prior model, both identified with the use of the balanced training data set, outperform the non-informative, method of moments, and maximum likelihood estimation prior models. It is important to note that one should be careful when interpreting the parameters, since different priors can lead to different models.
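
    A bare-bones version of Bayesian logistic regression with a Gaussian (non-expert) prior, fit by random-walk Metropolis and scored by sensitivity, specificity and their harmonic mean, might look like this (synthetic data, not the paper's models):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical balanced training set: x = crash features, y = severe (1) or not (0).
n, p = 400, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([-0.5, 1.2, -0.8])
y = rng.random(n) < 1 / (1 + np.exp(-X @ beta_true))

def log_post(beta, tau2=100.0):
    """Log posterior: Bernoulli likelihood + N(0, tau2) prior on each coefficient."""
    eta = X @ beta
    return np.sum(y * eta - np.log1p(np.exp(eta))) - 0.5 * np.sum(beta**2) / tau2

# Random-walk Metropolis over the coefficient vector.
beta, lp = np.zeros(p), log_post(np.zeros(p))
draws = []
for _ in range(20_000):
    prop = beta + rng.normal(scale=0.1, size=p)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    draws.append(beta)
draws = np.array(draws[5_000:])  # discard burn-in

# Classify with the posterior-mean coefficients, then report sensitivity,
# specificity, and their harmonic mean as in the abstract.
p_hat = 1 / (1 + np.exp(-X @ draws.mean(axis=0)))
pred = p_hat > 0.5
sens, spec = np.mean(pred[y]), np.mean(~pred[~y])
print(sens, spec, 2 * sens * spec / (sens + spec))
```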

  2. Bayesian Analysis of Structural Equation Models with Nonlinear Covariates and Latent Variables

    ERIC Educational Resources Information Center

    Song, Xin-Yuan; Lee, Sik-Yum

    2006-01-01

    In this article, we formulate a nonlinear structural equation model (SEM) that can accommodate covariates in the measurement equation and nonlinear terms of covariates and exogenous latent variables in the structural equation. The covariates can come from continuous or discrete distributions. A Bayesian approach is developed to analyze the…

  3. Bayesian Analysis for Linearized Multi-Stage Models in Quantal Bioassay.

    ERIC Educational Resources Information Center

    Kuo, Lynn; Cohen, Michael P.

    Bayesian methods for estimating dose response curves in quantal bioassay are studied. A linearized multi-stage model is assumed for the shape of the curves. A Gibbs sampling approach with data augmentation is employed to compute the Bayes estimates. In addition, estimation of the "relative additional risk" and the "risk specific…

  4. Analysis of Bonded Joints Between the Facesheet and Flange of Corrugated Composite Panels

    NASA Technical Reports Server (NTRS)

    Yarrington, Phillip W.; Collier, Craig S.; Bednarcyk, Brett A.

    2008-01-01

    This paper outlines a method for the stress analysis of bonded composite corrugated panel facesheet to flange joints. The method relies on the existing HyperSizer Joints software, which analyzes the bonded joint, along with a beam analogy model that provides the necessary boundary loading conditions to the joint analysis. The method is capable of predicting the full multiaxial stress and strain fields within the flange to facesheet joint and thus can determine ply-level margins and evaluate delamination. Results comparing the method to NASTRAN finite element model stress fields are provided illustrating the accuracy of the method.

  5. SRB Environment Evaluation and Analysis. Volume 2: RSRB Joint Filling Test/Analysis Improvements

    NASA Technical Reports Server (NTRS)

    Knox, E. C.; Woods, G. Hamilton

    1991-01-01

    Following the Challenger accident a very comprehensive solid rocket booster (SRB) redesign program was initiated. One objective of the program was to develop expertise at NASA/MSFC in the techniques for analyzing the flow of hot gases in the SRB joints. Several test programs were undertaken to provide a data base of joint performance with manufactured defects in the joints to allow hot gases to fill the joints. This data base was used also to develop the analytical techniques. Some of the test programs were Joint Environment Simulator (JES), Nozzle Joint Environment Simulator (NJES), Transient Pressure Test Article (TPTA), and Seventy-Pound Charge (SPC). In 1988 the TPTA test hardware was moved from the Utah site to MSFC and several RSRM tests were scheduled, to be followed by tests for the ASRM program. REMTECH Inc. supported these activities with pretest estimates of the flow conditions in the test joints, and post-test analysis and evaluation of the measurements. During this support REMTECH identified deficiencies in the gas-measurement instrumentation that existed in the TPTA hardware, made recommendations for its replacement, and identified improvements to the analytical tools used in the test support. Only one test was completed under the TPTA RSRM test program, and those scheduled for the ASRM were rescheduled to a time after the expiration of this contract. The attention of this effort was directed toward improvements in the analytical techniques in preparation for when the ASRM program begins.

  6. Bayesian Information-Gap Decision Analysis Applied to a CO2 Leakage Problem

    NASA Astrophysics Data System (ADS)

    O'Malley, D.; Vesselinov, V. V.

    2014-12-01

    We describe a decision analysis in the presence of uncertainty that combines a non-probabilistic approach (information-gap decision theory) with a probabilistic approach (Bayes' theorem). Bayes' theorem is one of the most popular techniques for probabilistic uncertainty quantification (UQ). It is effective in many situations, because it updates our understanding of the uncertainties by conditioning on real data using a mathematically rigorous technique. However, the application of Bayes' theorem in science and engineering is not always rigorous. There are two reasons for this: (1) We can enumerate the possible outcomes of dice-rolling, but not the possible outcomes of real-world contamination remediation; (2) We can precisely determine conditional probabilities for coin-tossing, but substantial uncertainty surrounds the conditional probabilities for real-world contamination remediation. Of course, Bayes' theorem is rigorously applicable beyond dice-rolling and coin-tossing, but even in cases that are constructed to be simple with ostensibly good probabilistic models, applying Bayes' theorem to the real world may not work as well as one might expect. Bayes' theorem is rigorously applicable only if all possible events can be described, and their conditional probabilities can be derived rigorously. Outside of this domain, it may still be useful, but its use lacks at least some rigor. The information-gap approach allows us to circumvent some of the highlighted shortcomings of Bayes' theorem. In particular, it provides a way to account for possibilities beyond those described by our models, and a way to deal with uncertainty in the conditional distribution that forms the core of Bayesian analysis. We have developed a three-tiered technique that enables one to make scientifically defensible decisions in the face of severe uncertainty such as is found in many geologic problems. To demonstrate the applicability, we apply the technique to a CO2 leakage problem. The goal is to
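
    A toy info-gap robustness calculation in the spirit described, with all numbers hypothetical: the robustness of a decision is the largest uncertainty horizon h around a nominal probability estimate under which its worst-case expected cost still satisfies the requirement.

```python
import numpy as np

# Hypothetical setup: expected cost of a decision depends on an uncertain
# leak probability p; a Bayesian analysis supplies the point estimate p0.
p0 = 0.2                  # nominal (posterior-mean) leak probability
cost_leak, cost_fix = 100.0, 20.0
critical_cost = 30.0      # largest expected cost we can tolerate

def expected_cost(p, act):
    return cost_fix if act else p * cost_leak

def robustness(act, h_grid=np.linspace(0, 1, 1001)):
    """Largest horizon h such that the decision's worst-case expected cost
    over U(h) = [p0*(1-h), min(1, p0*(1+h))] still meets the requirement."""
    best_h = 0.0
    for h in h_grid:
        worst_p = min(1.0, p0 * (1 + h))      # worst case: leak more likely
        if expected_cost(worst_p, act) <= critical_cost:
            best_h = h
        else:
            break
    return best_h

print("robustness(do nothing) =", robustness(False))
print("robustness(remediate)  =", robustness(True))
```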

  7. Reliability Analysis of a Glacier Lake Warning System Using a Bayesian Net

    NASA Astrophysics Data System (ADS)

    Sturny, Rouven A.; Bründl, Michael

    2013-04-01

    Beside structural mitigation measures like avalanche defense structures, dams and galleries, warning and alarm systems have become important measures for dealing with Alpine natural hazards. Integrating them into risk mitigation strategies and comparing their effectiveness with structural measures requires quantification of the reliability of these systems. However, little is known about how the reliability of warning systems can be quantified and which methods are suitable for comparing their contribution to risk reduction with that of structural mitigation measures. We present a reliability analysis of a warning system located in Grindelwald, Switzerland. The warning system was built for warning and protecting residents and tourists from glacier outburst floods as a consequence of a rapid drain of the glacier lake. We have set up a Bayesian Net (BN) that allowed for a qualitative and quantitative reliability analysis. The Conditional Probability Tables (CPT) of the BN were determined according to the manufacturer's reliability data for each component of the system, as well as by assigning weights for specific BN nodes accounting for information flows and decision-making processes of the local safety service. The presented results focus on the two alerting units 'visual acoustic signal' (VAS) and 'alerting of the intervention entities' (AIE). For the summer of 2009, the reliability was determined to be 94 % for the VAS and 83 % for the AIE. The probability of occurrence of a major event was calculated as 0.55 % per day, resulting in an overall reliability of 99.967 % for the VAS and 99.906 % for the AIE. We concluded that a failure of the VAS alerting unit would be the consequence of a simultaneous failure of the four probes located in the lake and the gorge. Similarly, we deduced that the AIE would fail either if there were a simultaneous connectivity loss of the mobile and fixed network in Grindelwald, an Internet access loss or a failure of the regional operations
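
    The overall reliabilities reported above follow from combining the conditional reliability of each alerting unit with the daily event probability; this short check reproduces the abstract's numbers:

```python
# Conditional reliabilities from the abstract (summer 2009).
r_vas, r_aie = 0.94, 0.83
p_event = 0.0055          # probability of a major event per day

# The system only "fails" on a given day if a major event occurs AND the
# alerting unit fails at the same time.
for name, r in [("VAS", r_vas), ("AIE", r_aie)]:
    print(f"{name}: overall reliability = {1 - p_event * (1 - r):.6f}")
# -> VAS 0.999670 and AIE 0.999065, i.e. 99.967 % and 99.906 % as reported.
```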

  8. Bayesian Multiscale Analysis of X-Ray Jet Features in High Redshift Quasars

    NASA Astrophysics Data System (ADS)

    McKeough, Kathryn; Siemiginowska, A.; Kashyap, V.; Stein, N.

    2014-01-01

    X-ray emission of powerful quasar jets may be a result of the inverse Compton (IC) process in which the Cosmic Microwave Background (CMB) photons gain energy by interactions with the jet's relativistic electrons. However, there is no definite evidence that the IC/CMB process is responsible for the observed X-ray emission of large-scale jets. A step toward understanding the X-ray emission process is to study the radio and X-ray morphologies of the jet. We implement a sophisticated Bayesian image analysis program, Low-count Image Reconstruction and Analysis (LIRA) (Esch et al. 2004; Connors & van Dyk 2007), to analyze jet features in 11 Chandra images of high redshift quasars (z ~ 2 - 4.8). Out of the 36 regions where knots are visible in the radio jets, nine showed detectable X-ray emission. We measured the ratios of the X-ray and radio luminosities of the detected features and found that they are consistent with the CMB radiation relationship. We derived a range of the bulk Lorentz factor (Γ) for detected jet features under the CMB jet emission model. There is no discernible trend of Γ with redshift within the sample. The efficiency of the X-ray emission between the detected jet feature and the corresponding quasar also shows no correlation with redshift. This work is supported in part by the National Science Foundation REU and the Department of Defense ASSURE programs under NSF Grant no. 1262851 and by the Smithsonian Institution, and by NASA Contract NAS8-39073 to the Chandra X-ray Center (CXC). This research has made use of data obtained from the Chandra Data Archive and Chandra Source Catalog, and software provided by the CXC in the application packages CIAO, ChIPS, and Sherpa. We thank Teddy Cheung for providing the VLA radio images. Connors, A., & van Dyk, D. A. 2007, Statistical Challenges in Modern Astronomy IV, 371, 101. Esch, D. N., Connors, A., Karovska, M., & van Dyk, D. A. 2004, ApJ, 610, 1213.

  9. Rating locomotive crew diesel emission exposure profiles using statistics and Bayesian Decision Analysis.

    PubMed

    Hewett, Paul; Bullock, William H

    2014-01-01

    For more than 20 years CSX Transportation (CSXT) has collected exposure measurements from locomotive engineers and conductors who are potentially exposed to diesel emissions. The database included measurements for elemental and total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, carbon monoxide, and nitrogen dioxide. This database was statistically analyzed and summarized, and the resulting statistics and exposure profiles were compared to relevant occupational exposure limits (OELs) using both parametric and non-parametric descriptive and compliance statistics. Exposure ratings, using the American Industrial Hygiene Association (AIHA) exposure categorization scheme, were determined using both the compliance statistics and Bayesian Decision Analysis (BDA). The statistical analysis of the elemental carbon data (a marker for diesel particulate) strongly suggests that the majority of levels in the cabs of the lead locomotives (n = 156) were less than the California guideline of 0.020 mg/m³. The sample 95th percentile was roughly half the guideline, resulting in an AIHA exposure rating of category 2/3 (determined using BDA). The elemental carbon (EC) levels in the trailing locomotives tended to be greater than those in the lead locomotive; however, locomotive crews rarely ride in the trailing locomotive. Lead locomotive EC levels were similar to those reported by other investigators studying locomotive crew exposures and to levels measured in urban areas. Lastly, both the EC sample mean and 95% UCL were less than the Environmental Protection Agency (EPA) reference concentration of 0.005 mg/m³. With the exception of nitrogen dioxide, the overwhelming majority of the measurements for total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, and combustion gases in the cabs of CSXT locomotives were either non-detects or considerably less than the working OELs for the years represented in the database. When compared to the previous American
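
    A minimal sketch of a BDA-style exposure rating (lognormal model, grid posterior, AIHA category probabilities for the 95th percentile); the measurements here are simulated, not CSXT data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical elemental-carbon measurements (mg/m^3) from lead-locomotive cabs.
x = rng.lognormal(mean=np.log(0.008), sigma=0.6, size=30)
oel = 0.020  # California guideline used as the OEL in the abstract

logx = np.log(x)
n = len(x)

# Grid posterior over (mu, sigma) of the lognormal, with flat priors.
mu = np.linspace(logx.mean() - 1, logx.mean() + 1, 200)
sigma = np.linspace(0.05, 2.0, 200)
M, S = np.meshgrid(mu, sigma)
loglik = -n * np.log(S) - ((logx[:, None, None] - M) ** 2).sum(0) / (2 * S**2)
post = np.exp(loglik - loglik.max())
post /= post.sum()

# Rate the exposure via the 95th percentile: X0.95 = exp(mu + 1.645*sigma).
p95 = np.exp(M + 1.645 * S)
bands = [(0, 0.01), (0.01, 0.1), (0.1, 0.5), (0.5, 1.0), (1.0, np.inf)]  # fractions of OEL
for k, (lo, hi) in enumerate(bands):
    prob = post[(p95 / oel > lo) & (p95 / oel <= hi)].sum()
    print(f"AIHA category {k}: P = {prob:.2f}")
```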

  10. Nonlinear Analysis of Bonded Composite Single-LAP Joints

    NASA Technical Reports Server (NTRS)

    Oterkus, E.; Barut, A.; Madenci, E.; Smeltzer, S. S.; Ambur, D. R.

    2004-01-01

    This study presents a semi-analytical solution method to analyze the geometrically nonlinear response of bonded composite single-lap joints with tapered adherend edges under uniaxial tension. The solution method provides the transverse shear and normal stresses in the adhesive and in-plane stress resultants and bending moments in the adherends. The method utilizes the principle of virtual work in conjunction with von Karman's nonlinear plate theory to model the adherends and the shear lag model to represent the kinematics of the thin adhesive layer between the adherends. Furthermore, the method accounts for the bilinear elastic material behavior of the adhesive while maintaining a linear stress-strain relationship in the adherends. In order to account for the stiffness changes due to thickness variation of the adherends along the tapered edges, their in-plane and bending stiffness matrices are varied as a function of thickness along the tapered region. The combination of these complexities results in a system of nonlinear governing equilibrium equations. This approach represents a computationally efficient alternative to the finite element method. Comparisons are made with corresponding results obtained from finite-element analysis. The results confirm the validity of the solution method. The numerical results present the effects of taper angle, adherend overlap length, and the bilinear adhesive material on the stress fields in the adherends, as well as the adhesive, of a single-lap joint.

  11. JAMIE: joint analysis of multiple ChIP-chip experiments

    PubMed Central

    Wu, Hao; Ji, Hongkai

    2010-01-01

    Motivation: Chromatin immunoprecipitation followed by genome tiling array hybridization (ChIP-chip) is a powerful approach to identify transcription factor binding sites (TFBSs) in target genomes. When multiple related ChIP-chip datasets are available, analyzing them jointly allows one to borrow information across datasets to improve peak detection. This is particularly useful for analyzing noisy datasets. Results: We propose a hierarchical mixture model and develop an R package JAMIE to perform the joint analysis. The genome is assumed to consist of background and potential binding regions (PBRs). PBRs have context-dependent probabilities to become bona fide binding sites in individual datasets. This model captures the correlation among datasets, which provides basis for sharing information across experiments. Real data tests illustrate the advantage of JAMIE over a strategy that analyzes individual datasets separately. Availability: JAMIE is freely available from http://www.biostat.jhsph.edu/∼hji/jamie Contact: hji@jhsph.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20551135

  12. Bayesian clinical trials in action.

    PubMed

    Lee, J Jack; Chu, Caleb T

    2012-11-10

    Although the frequentist paradigm has been the predominant approach to clinical trial design since the 1940s, it has several notable limitations. Advancements in computational algorithms and computer hardware have greatly enhanced the alternative Bayesian paradigm. Compared with its frequentist counterpart, the Bayesian framework has several unique advantages, and its incorporation into clinical trial design is occurring more frequently. Using an extensive literature review to assess how Bayesian methods are used in clinical trials, we find them most commonly used for dose finding, efficacy monitoring, toxicity monitoring, diagnosis/decision making, and studying pharmacokinetics/pharmacodynamics. The additional infrastructure required for implementing Bayesian methods in clinical trials may include specialized software programs to run the study design, simulation and analysis, and web-based applications, all of which are particularly useful for timely data entry and analysis. Trial success requires not only the development of proper tools but also timely and accurate execution of data entry, quality control, adaptive randomization, and Bayesian computation. The relative merit of the Bayesian and frequentist approaches continues to be the subject of debate in statistics. However, more evidence can be found showing the convergence of the two camps, at least at the practical level. Ultimately, better clinical trial methods lead to more efficient designs, lower sample sizes, more accurate conclusions, and better outcomes for patients enrolled in the trials. Bayesian methods offer attractive alternatives for better trials. More Bayesian trials should be designed and conducted to refine the approach and demonstrate their real benefit in action.

  13. Limitations of cytochrome oxidase I for the barcoding of Neritidae (Mollusca: Gastropoda) as revealed by Bayesian analysis.

    PubMed

    Chee, S Y

    2015-05-25

    The mitochondrial DNA (mtDNA) cytochrome oxidase I (COI) gene has been universally and successfully utilized as a barcoding gene, mainly because it can be amplified easily, applied across a wide range of taxa, and results can be obtained cheaply and quickly. However, in rare cases, the gene can fail to distinguish between species, particularly when exposed to highly sensitive methods of data analysis, such as the Bayesian method, or when taxa have undergone introgressive hybridization, over-splitting, or incomplete lineage sorting. Such cases require the use of alternative markers, and nuclear DNA markers are commonly used. In this study, a dendrogram produced by Bayesian analysis of an mtDNA COI dataset was compared with that of a nuclear DNA ATPS-α dataset, in order to evaluate the efficiency of COI in barcoding Malaysian nerites (Neritidae). In the COI dendrogram, most of the species were in individual clusters, except for two species: Nerita chamaeleon and N. histrio. These two species were placed in the same subcluster, whereas in the ATPS-α dendrogram they were in their own subclusters. Analysis of the ATPS-α gene also placed the two genera of nerites (Nerita and Neritina) in separate clusters, whereas COI gene analysis placed both genera in the same cluster. Therefore, in the case of the Neritidae, the ATPS-α gene is a better barcoding gene than the COI gene.

  14. Towards a Fuzzy Bayesian Network Based Approach for Safety Risk Analysis of Tunnel-Induced Pipeline Damage.

    PubMed

    Zhang, Limao; Wu, Xianguo; Qin, Yawei; Skibniewski, Miroslaw J; Liu, Wenli

    2016-02-01

    Tunneling excavation is bound to produce significant disturbances to surrounding environments, and the tunnel-induced damage to adjacent underground buried pipelines is of considerable importance for geotechnical practice. A fuzzy Bayesian network (FBN) based approach for safety risk analysis is developed in this article with detailed step-by-step procedures, consisting of risk mechanism analysis, the FBN model establishment, fuzzification, FBN-based inference, defuzzification, and decision making. In accordance with the failure mechanism analysis, a tunnel-induced pipeline damage model is proposed to reveal the cause-effect relationships between the pipeline damage and its influential variables. In terms of the fuzzification process, an expert confidence indicator is proposed to reveal the reliability of the data when determining the fuzzy probability of occurrence of basic events, with both the judgment ability level and the subjectivity reliability level taken into account. By means of fuzzy Bayesian inference, the approach proposed in this article is capable of calculating the probability distribution of potential safety risks and identifying the most likely potential causes of accidents under both prior knowledge and given evidence circumstances. A case concerning the safety analysis of underground buried pipelines adjacent to the construction of the Wuhan Yangtze River Tunnel is presented. The results demonstrate the feasibility of the proposed FBN approach and its application potential. The proposed approach can be used as a decision tool to provide support for safety assurance and management in tunnel construction, and thus increase the likelihood of a successful project in a complex project environment.
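
    A toy version of the fuzzification and defuzzification steps, with an invented triangular fuzzy probability and a deliberately crude stand-in for the paper's expert confidence indicator:

```python
import numpy as np

# Hypothetical fuzzified expert estimate for a basic event's probability of
# occurrence, expressed as a triangular fuzzy number (low, mode, high).
tri = (0.01, 0.05, 0.12)

def membership(x, tri):
    a, m, b = tri
    return np.where(x < m, (x - a) / (m - a), (b - x) / (b - m)).clip(0)

# Crude confidence adjustment (0..1): lower confidence widens the support
# around the mode; this is only a stand-in for the paper's indicator.
def widen(tri, confidence):
    a, m, b = tri
    w = 1.0 / max(confidence, 1e-6)
    return (max(0.0, m - (m - a) * w), m, min(1.0, m + (b - m) * w))

# Centroid defuzzification back to a crisp probability for the BN's CPT.
x = np.linspace(0, 1, 10_001)
mu = membership(x, widen(tri, confidence=0.7))
crisp = np.trapz(x * mu, x) / np.trapz(mu, x)
print(f"crisp P(basic event) = {crisp:.4f}")
```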

  15. Bayesian Block Analysis of Terrestrial Gamma-ray Flashes Detected by the Gamma-ray Burst Monitor

    NASA Astrophysics Data System (ADS)

    Roberts, O.; Fitzpatrick, G.; McBreen, S.; Briggs, M. S.

    2014-12-01

    The Gamma-ray Burst Monitor (GBM) is one of two instruments aboard the Fermi Gamma-ray Space Telescope. Since the launch of the spacecraft in 2008, a sequence of flight software enhancements and new observing modes have resulted in the detection of over 2500 Terrestrial Gamma-ray Flashes (TGFs) by GBM. As a result, a catalogue of TGFs will be published and released online to provide the community with information on the most important characteristics of these TGFs. We will present a Bayesian Block analysis of the TGFs of this catalogue, obtaining for this large sample size the durations, peak times, hardness ratios, and delays between soft and hard counts.
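
    The Bayesian Blocks segmentation itself is available in astropy; a small sketch on simulated photon arrival times (the event list here is synthetic, not GBM data):

```python
import numpy as np
from astropy.stats import bayesian_blocks

rng = np.random.default_rng(3)

# Synthetic photon arrival times (ms): flat background plus a brief,
# intense pulse, loosely mimicking a TGF light curve.
background = rng.uniform(0.0, 50.0, 200)
pulse = rng.normal(25.0, 0.3, 150)
t = np.sort(np.concatenate([background, pulse]))

# Bayesian Blocks for event data; p0 is the false-alarm probability for
# adding a change point.
edges = bayesian_blocks(t, fitness='events', p0=0.01)
print("block edges (ms):", np.round(edges, 2))

# Crude duration estimate: span of blocks whose rate is well above typical.
rates = np.histogram(t, bins=edges)[0] / np.diff(edges)
hot = rates > 5 * np.median(rates)
if hot.any():
    print("pulse duration ≈", round(edges[1:][hot].max() - edges[:-1][hot].min(), 2), "ms")
```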

  16. Medical diagnosis aboard submarines. Use of a computer-based Bayesian method of analysis in an abdominal pain diagnostic program.

    PubMed

    Osborne, S F

    1984-02-01

    The medical issues that arise in the isolated environment of a submarine can occasionally be grave. While crewmembers are carefully screened for health problems, they are still susceptible to serious acute illness. Currently, the submarine medical department representative, the hospital corpsman, utilizes a history and physical examination, clinical acumen, and limited laboratory testing in diagnosis. The application of a Bayesian method of analysis to an abdominal pain diagnostic system utilizing an onboard microcomputer is described herein. Early results from sea trials show an appropriate diagnosis in eight of 10 cases of abdominal pain, but the program should still be viewed as an extended "laboratory test" until proved effective at sea.
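
    Such systems are typically naive-Bayes updates over a table of conditional probabilities. A minimal sketch with invented findings, diagnoses and probabilities:

```python
import numpy as np

# Hypothetical knowledge base: P(finding | diagnosis) and prior prevalences,
# in the spirit of computer-aided abdominal-pain diagnosis.
diagnoses = ["appendicitis", "gastroenteritis", "renal colic"]
priors = np.array([0.25, 0.55, 0.20])

# Rows: findings; columns: diagnoses.
likelihood = np.array([
    [0.80, 0.15, 0.10],   # P(RLQ pain | dx)
    [0.60, 0.05, 0.05],   # P(rebound tenderness | dx)
    [0.50, 0.70, 0.40],   # P(vomiting | dx)
])

observed = np.array([True, True, False])  # which findings are present

# Naive-Bayes update: multiply priors by P(finding|dx) for present findings
# and 1 - P(finding|dx) for absent ones, then renormalize.
post = priors.copy()
for f, present in enumerate(observed):
    post *= likelihood[f] if present else (1 - likelihood[f])
post /= post.sum()

for dx, p in zip(diagnoses, post):
    print(f"P({dx} | findings) = {p:.3f}")
```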

  17. Failure Analysis in Space: International Space Station (ISS) Starboard Solar Alpha Rotary Joint (SARJ) Debris Analysis

    NASA Technical Reports Server (NTRS)

    Long, V. S.; Wright, M. C.; McDanels, S. J.; Lubas, D.; Tucker, B.; Marciniak, P. J.

    2010-01-01

    This slide presentation reviews the debris analysis of the Starboard Solar Alpha Rotary Joint (SARJ), a mechanism that is designed to keep the solar arrays facing the sun. The goal of this was to identify the failure mechanism based on surface morphology and to determine the source of debris through elemental and particle analysis.

  18. Elastic-plastic analysis of crack in ductile adhesive joint

    SciTech Connect

    Ikeda, Toru; Miyazaki, Noriyuki; Yamashita, Akira; Munakata, Tsuyoshi

    1995-11-01

    The fracture of a crack in adhesive is important to the structural integrity of adhesive structures and composite materials. Though the fracture toughness of a material should be constant according to fracture mechanics, it is said that the fracture toughness of a crack in an adhesive joint depends on the bond thickness. In the present study, elastic-plastic stress analyses of a crack in a thin adhesive layer are performed by a combination of the boundary element method and the finite element method. The effects of adhesive thickness on the J-integral, the Q′-factor (a modified version of the Q-factor), and the crack tip opening displacement (CTOD) are investigated. It is found from the analyses that the CTOD begins to decrease at very small bond thicknesses, the Q′-factor remaining almost constant. The decrease of fracture toughness for very thin adhesive layers is thus expected from the present analysis.

  19. Magellan/Galileo solder joint failure analysis and recommendations

    NASA Technical Reports Server (NTRS)

    Ross, Ronald G., Jr.

    1989-01-01

    On or about November 10, 1988 an open circuit solder joint was discovered in the Magellan Radar digital unit (DFU) during integration testing at Kennedy Space Center (KSC). A detailed analysis of the cause of the failure was conducted at the Jet Propulsion Laboratory, leading to the successful repair of many pieces of affected electronic hardware on both the Magellan and Galileo spacecraft. The problem was caused by the presence of high-coefficient-of-thermal-expansion heat sink and conformal coating materials located in the large (0.055 inch) gap between the dual inline packages (DIPs) and the printed wiring board. The details of the observed problems are described and recommendations are made for improved design and testing activities in the future.

  20. Finite element stress analysis of some ankle joint prostheses.

    PubMed

    Falsig, J; Hvid, I; Jensen, N C

    1986-05-01

    A three-dimensional finite element stress analysis was employed to calculate stresses in a distal tibia modelled with three simple total ankle joint replacement tibial components. The bone was modelled as a composite structure consisting of cortical and trabecular bone in which the trabecular bone was either homogeneous with a constant modulus of elasticity or heterogeneous with experimentally determined heterogeneity. The results were sensitive to variations in trabecular bone material property distributions, with lower stresses being calculated in the heterogeneous model. An anterolateral application of load, which proved the least favourable, was used in comparing the prosthetic variants. Normal and shear stresses at the trabecular bone-cement interface and supporting trabecular bone were slightly reduced by addition of metal backing to the polyethylene articular surface, and a further reduction to very low values was obtained by addition of a long intramedullary peg bypassing stresses to the cortical bone.

  1. Bayesian Analysis of Hmi Images and Comparison to Tsi Variations and MWO Image Observables

    NASA Astrophysics Data System (ADS)

    Parker, D. G.; Ulrich, R. K.; Beck, J.; Tran, T. V.

    2015-12-01

    We have previously applied the Bayesian automatic classification system AutoClass to solar magnetogram and intensity images from the 150 Foot Solar Tower at Mount Wilson to identify classes of solar surface features associated with variations in total solar irradiance (TSI) and, using those identifications, modeled TSI time series with improved accuracy (r > 0.96) (Ulrich et al., 2010). AutoClass identifies classes by a two-step process in which it: (1) finds, without human supervision, a set of class definitions based on specified attributes of a sample of the image data pixels, such as magnetic field and intensity in the case of MWO images, and (2) applies the class definitions thus found to new data sets to identify automatically in them the classes found in the sample set. HMI high-resolution images capture four observables (magnetic field, continuum intensity, line depth and line width), in contrast to MWO's two observables (magnetic field and intensity). In this study, we apply AutoClass to the HMI observables for images from June 2010 to December 2014 to identify solar surface feature classes. We use contemporaneous TSI measurements to determine whether and how variations in the HMI classes are related to TSI variations and compare the characteristic statistics of the HMI classes to those found from MWO images. We also attempt to derive scale factors between the HMI and MWO magnetic and intensity observables. The ability to categorize automatically surface features in the HMI images holds out the promise of consistent, relatively quick and manageable analysis of the large quantity of data available in these images. Given that the classes found in MWO images using AutoClass have been found to improve modeling of TSI, application of AutoClass to the more complex HMI images should enhance understanding of the physical processes at work in solar surface features and their implications for the solar-terrestrial environment. Ulrich, R.K., Parker, D., Bertello, L. and

  2. Bayesian Analysis Of HMI Solar Image Observables And Comparison To TSI Variations And MWO Image Observables

    NASA Astrophysics Data System (ADS)

    Parker, D. G.; Ulrich, R. K.; Beck, J.

    2014-12-01

    We have previously applied the Bayesian automatic classification system AutoClass to solar magnetogram and intensity images from the 150 Foot Solar Tower at Mount Wilson to identify classes of solar surface features associated with variations in total solar irradiance (TSI) and, using those identifications, modeled TSI time series with improved accuracy (r > 0.96) (Ulrich et al., 2010). AutoClass identifies classes by a two-step process in which it: (1) finds, without human supervision, a set of class definitions based on specified attributes of a sample of the image data pixels, such as magnetic field and intensity in the case of MWO images, and (2) applies the class definitions thus found to new data sets to identify automatically in them the classes found in the sample set. HMI high-resolution images capture four observables (magnetic field, continuum intensity, line depth and line width), in contrast to MWO's two observables (magnetic field and intensity). In this study, we apply AutoClass to the HMI observables for images from May 2010 to June 2014 to identify solar surface feature classes. We use contemporaneous TSI measurements to determine whether and how variations in the HMI classes are related to TSI variations and compare the characteristic statistics of the HMI classes to those found from MWO images. We also attempt to derive scale factors between the HMI and MWO magnetic and intensity observables. The ability to categorize automatically surface features in the HMI images holds out the promise of consistent, relatively quick and manageable analysis of the large quantity of data available in these images. Given that the classes found in MWO images using AutoClass have been found to improve modeling of TSI, application of AutoClass to the more complex HMI images should enhance understanding of the physical processes at work in solar surface features and their implications for the solar-terrestrial environment. Ulrich, R.K., Parker, D., Bertello, L. and

  3. Performance of Canonical Correlation Analysis (CCA) and Bayesian Hierarchical Modelling (BHM) for European temperature field reconstructions

    NASA Astrophysics Data System (ADS)

    Werner, J. P.; Smerdon, J. E.; Luterbacher, J.

    2011-12-01

    A pseudoproxy comparison is presented for two statistical methods used to derive annual climate field reconstructions (CFR) for Europe. The employed methods use the canonical correlation analysis (CCA) procedure presented by Smerdon et al. (2010, J. Climate) and the Bayesian Hierarchical Model (BHM) based method adopted from Tingley and Huybers (2010a,b, J. Climate). Pseudoproxy experiments are constructed from modelled temperature data sampled from the 1250-year paleo-run of the NCAR CCSM 1.4 model (Ammann et al. 2007, PNAS). The pseudoproxies approximate the distribution of the Mann et al. (1998, Nature) multi-proxy network and use Gaussian white noise to mimic the combined signal and noise properties of real-world proxies. The derived CFRs are tested by comparing the mean temperature bias, the reconstructed temperature variability and two error measures: the cross correlation and the root mean square error. The results show that the BHM method performs much better than the CCA method in areas with good proxy coverage. The BHM method also delivers added value over the more traditional CCA method by providing objective error estimates. Reconstructions of key years are also analysed. While CCA returns estimates for the full climate field even for areas with sparse data, the more flexible model used in the BHM method returns results that are closer to the target for most of the reconstruction area, albeit with higher uncertainties in data-sparse regions. Based on the success of these current BHM results, the algorithm will be extended to make use of proxies with different temporal resolution (cf. Li et al. 2010) in order to reconstruct the temperature and precipitation fields over Europe and the Mediterranean covering much of the common-era period. Ammann, C. et al. (2007), PNAS 104, 3713-3718. Li, B. et al. (2010), J. Am. Stat. Assoc. 105, 883-911. Mann, M. et al. (1998), Nature 392, 779-787. Smerdon, J. et al. (2010), J. Climate 24, 1284-1309. Tingley, M. and

  4. Multi-site identification of a distributed hydrological nitrogen model using Bayesian uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Jiang, Sanyuan; Jomaa, Seifeddine; Büttner, Olaf; Meon, Günter; Rode, Michael

    2015-10-01

    For capturing spatial variations of runoff and nutrient fluxes attributed to catchment heterogeneity, multi-site hydrological water quality monitoring strategies are increasingly put into practice. This study aimed to investigate the impacts of spatially distributed streamflow and streamwater inorganic nitrogen (IN) concentration observations on the identification of a continuous-time, spatially semi-distributed and process-based hydrological water quality model, HYPE (HYdrological Predictions for the Environment). A Bayesian inference based approach, DREAM(ZS) (DiffeRential Evolution Adaptive Metropolis algorithm), was combined with HYPE to implement model optimisation and uncertainty analysis on streamflow and streamwater IN concentration simulations at a nested meso-scale catchment in central Germany. To this end, a 10-year period (1994-1999 for calibration and 1999-2004 for validation) was utilised. We compared the parameters' posterior distributions, modelling performance using the best estimated parameter set, and 95% prediction confidence intervals at the catchment outlet for the calibration period that were derived from single-site calibration (SSC) and multi-site calibration (MSC) modes. For SSC, streamflow and streamwater IN concentration observations at only the catchment outlet were used, while for MSC, streamflow and streamwater IN concentration observations from both the catchment outlet and two internal sites were considered. Results showed that the uncertainty intervals of the hydrological water quality parameters' posterior distributions estimated from MSC were narrower than those obtained from SSC. In addition, it was found that MSC outperformed SSC on streamwater IN concentration simulations at internal sites for both calibration and validation periods, while the influence on streamflow modelling performance was small. This can be explained by the "nested" nature of the catchment and the high correlation between discharge observations from different sites.

  5. Apples and oranges: avoiding different priors in Bayesian DNA sequence analysis

    PubMed Central

    2010-01-01

    Background One of the challenges of bioinformatics remains the recognition of short signal sequences in genomic DNA such as donor or acceptor splice sites, splicing enhancers or silencers, translation initiation sites, transcription start sites, transcription factor binding sites, nucleosome binding sites, miRNA binding sites, or insulator binding sites. During the last decade, a wealth of algorithms for the recognition of such DNA sequences has been developed and compared with the goal of improving their performance and deepening our understanding of the underlying cellular processes. Most of these algorithms are based on statistical models belonging to the family of Markov random fields such as position weight matrix models, weight array matrix models, Markov models of higher order, or moral Bayesian networks. While in many comparative studies different learning principles or different statistical models have been compared, the influence of choosing different prior distributions for the model parameters when using different learning principles has been overlooked, possibly leading to questionable conclusions. Results With the goal of allowing direct comparisons of different learning principles for models from the family of Markov random fields based on the same a-priori information, we derive a generalization of the commonly used product-Dirichlet prior. We find that the derived prior behaves like a Gaussian prior close to the maximum and like a Laplace prior in the far tails. In two case studies, we illustrate the utility of the derived prior for a direct comparison of different learning principles with different models for the recognition of binding sites of the transcription factor Sp1 and human donor splice sites. Conclusions We find that comparisons of different learning principles using the same a-priori information can lead to conclusions different from those of previous studies in which the effect resulting from different priors has been neglected. We
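
    The practical effect of the equivalent sample size (ess) in a product-Dirichlet prior can be seen in a few lines; the counts and base composition here are invented:

```python
import numpy as np

# Hypothetical nucleotide counts at one motif position (A, C, G, T).
counts = np.array([12, 3, 1, 4])

# Product-Dirichlet prior: pseudocounts = ess * base frequencies. Comparing
# learning principles is only fair if both use the same a-priori information,
# i.e. the same equivalent sample size and base composition.
base = np.full(4, 0.25)
for ess in (0.0, 1.0, 4.0, 16.0):
    theta = (counts + ess * base) / (counts.sum() + ess)
    print(f"ess={ess:>4}: {np.round(theta, 3)}")
```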

  6. A Bayesian ridge regression analysis of congestion's impact on urban expressway safety.

    PubMed

    Shi, Qi; Abdel-Aty, Mohamed; Lee, Jaeyoung

    2016-03-01

    With the rapid growth of traffic in urban areas, concerns about congestion and traffic safety have been heightened. This study leveraged both Automatic Vehicle Identification (AVI) system and Microwave Vehicle Detection System (MVDS) installed on an expressway in Central Florida to explore how congestion impacts the crash occurrence in urban areas. Multiple congestion measures from the two systems were developed. To ensure more precise estimates of the congestion's effects, the traffic data were aggregated into peak and non-peak hours. Multicollinearity among traffic parameters was examined. The results showed the presence of multicollinearity especially during peak hours. As a response, ridge regression was introduced to cope with this issue. Poisson models with uncorrelated random effects, correlated random effects, and both correlated random effects and random parameters were constructed within the Bayesian framework. It was proven that correlated random effects could significantly enhance model performance. The random parameters model has similar goodness-of-fit compared with the model with only correlated random effects. However, by accounting for the unobserved heterogeneity, more variables were found to be significantly related to crash frequency. The models indicated that congestion increased crash frequency during peak hours while during non-peak hours it was not a major crash contributing factor. Using the random parameter model, the three congestion measures were compared. It was found that all congestion indicators had similar effects while Congestion Index (CI) derived from MVDS data was a better congestion indicator for safety analysis. Also, analyses showed that the segments with higher congestion intensity could not only increase property damage only (PDO) crashes, but also more severe crashes. In addition, the issues regarding the necessity to incorporate specific congestion indicator for congestion's effects on safety and to take care of the
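
    As a loose stand-in for the paper's approach (which fits Poisson models with random effects in a Bayesian framework), a Bayesian ridge fit on deliberately collinear congestion measures shows the shrinkage idea; all data are synthetic:

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(5)

# Hypothetical, highly correlated peak-hour congestion measures for 200
# expressway segments (multicollinearity is the motivation for ridge).
base = rng.normal(size=200)
X = np.column_stack([base + rng.normal(scale=0.1, size=200) for _ in range(3)])
log_crash_rate = 0.5 * base + rng.normal(scale=0.3, size=200)

# Bayesian ridge: Gaussian priors on the coefficients shrink the unstable,
# collinear estimates toward zero.
model = BayesianRidge().fit(X, log_crash_rate)
print("coefficients:", np.round(model.coef_, 3))
print("estimated prior precision (lambda):", round(model.lambda_, 2))
```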

  7. Bayesian Analysis of Step-Stress Accelerated Life Test with Exponential Distribution

    SciTech Connect

    Lee, J.; Pan, R.

    2012-04-01

    In this article, we propose a general Bayesian inference approach to the step-stress accelerated life test with type II censoring. We assume that the failure times at each stress level are exponentially distributed and the test units are tested in an increasing order of stress levels. We formulate the prior distribution of the parameters of life-stress function and integrate the engineering knowledge of product failure rate and acceleration factor into the prior. The posterior distribution and the point estimates for the parameters of interest are provided. Through the Markov chain Monte Carlo technique, we demonstrate a nonconjugate prior case using an industrial example. It is shown that with the Bayesian approach, the statistical precision of parameter estimation is improved and, consequently, the required number of failures could be reduced.
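
    For a single stress level with exponential lifetimes, type-II censoring, and a gamma prior encoding engineering knowledge of the failure rate, the posterior is available in closed form; a small sketch with invented numbers (the paper's full step-stress model adds a life-stress function on top of this):

```python
import numpy as np
from scipy import stats

# Hypothetical type-II censored test at one stress level: n units on test,
# stop at the r-th failure.
n, r = 20, 8
fail_times = np.sort(stats.expon(scale=100).rvs(n, random_state=6))[:r]
total_time = fail_times.sum() + (n - r) * fail_times[-1]  # total time on test

# With an exponential lifetime and a Gamma(a0, b0) prior on the failure
# rate, the posterior is conjugate: Gamma(a0 + r, b0 + total_time).
a0, b0 = 2.0, 150.0
posterior = stats.gamma(a=a0 + r, scale=1.0 / (b0 + total_time))
print("posterior mean rate:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```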

  8. Bayesian decision sequential analysis with survival endpoint in phase II clinical trials.

    PubMed

    Zhao, Lili; Woodworth, George

    2009-04-30

    Chen and Chaloner (Statist. Med. 2006; 25:2956-2966. DOI: 10.1002/sim.2429) present a Bayesian stopping rule for a single-arm clinical trial with a binary endpoint. In some cases, earlier stopping may be possible by basing the stopping rule on the time to a binary event. We investigate the feasibility of computing exact, Bayesian, decision-theoretic time-to-event stopping rules for a single-arm group sequential non-inferiority trial relative to an objective performance criterion. For a conjugate prior distribution, exponential failure time distribution, and linear and threshold loss structures, we obtain the optimal Bayes stopping rule by backward induction. We compute frequentist operating characteristics of including Type I error, statistical power, and expected run length. We also briefly address design issues.

  9. Bayesian Analysis of the Power Spectrum of the Cosmic Microwave Background

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey B.; Eriksen, H. K.; O'Dwyer, I. J.; Wandelt, B. D.

    2005-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background. The sky, when viewed in the microwave, is very uniform, with a nearly perfect blackbody spectrum at 2.7 K. Very small amplitude brightness fluctuations (roughly one part in 100,000!) trace small density perturbations in the early universe (roughly 300,000 years after the Big Bang), which later grow through gravitational instability to the large-scale structure seen in redshift surveys... In this talk, I will discuss a Bayesian formulation of this problem, a Gibbs sampling approach to numerically sampling from the Bayesian posterior, and the application of this approach to the first-year data from the Wilkinson Microwave Anisotropy Probe. I will also comment on recent algorithmic developments for this approach to be tractable for the even more massive data set to be returned from the Planck satellite.
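
    A one-dimensional toy version of the Gibbs scheme, alternating between drawing the signal given the power spectrum and the power spectrum given the signal (all quantities synthetic; real CMB analysis works on spherical-harmonic modes):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy 1-D analogue: data = signal + noise in independent modes.
n_modes = 64
C_true = 10.0 / (np.arange(n_modes) + 1) ** 2      # toy power spectrum
s_true = rng.normal(0, np.sqrt(C_true))
noise_var = 0.01
d = s_true + rng.normal(0, np.sqrt(noise_var), n_modes)

C = np.ones(n_modes)
for step in range(2000):
    # 1) s | C, d ~ N(Wiener-filter mean, (1/C + 1/N)^-1), mode by mode.
    var = 1.0 / (1.0 / C + 1.0 / noise_var)
    mean = var * d / noise_var
    s = mean + rng.normal(0, np.sqrt(var))
    # 2) C | s: scaled inverse chi-square draw per mode (Jeffreys prior).
    C = s**2 / rng.chisquare(1, n_modes)
print("one posterior draw of C (first 5 modes):", np.round(C[:5], 2))
```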

  10. Coarse master equation from Bayesian analysis of replica molecular dynamics simulations.

    PubMed

    Sriraman, Saravanapriyan; Kevrekidis, Ioannis G; Hummer, Gerhard

    2005-04-14

    We use Bayesian inference to derive the rate coefficients of a coarse master equation from molecular dynamics simulations. Results from multiple short simulation trajectories are used to estimate propagators. A likelihood function constructed as a product of the propagators provides a posterior distribution of the free coefficients in the rate matrix determining the Markovian master equation. Extensions to non-Markovian dynamics are discussed, using the trajectory "paths" as observations. The Markovian approach is illustrated for the filling and emptying transitions of short carbon nanotubes dissolved in water. We show that accurate thermodynamic and kinetic properties, such as free energy surfaces and kinetic rate coefficients, can be computed from coarse master equations obtained through Bayesian inference.
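
    For a two-state system such as the filled/empty nanotube, the posterior over rate coefficients can be sketched by sampling the propagator from Dirichlet posteriors over transition counts and converting each draw to rates; the counts below are invented:

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical two-state transition counts observed over a lag time tau,
# aggregated from many short MD trajectories.
tau = 10.0  # ps
counts = np.array([[900, 100],   # from state 0: stayed, left
                   [ 60, 540]])  # from state 1: left, stayed

# Dirichlet(1,1) priors on each propagator row give Dirichlet posteriors.
n_draws = 5000
k01 = np.empty(n_draws)
for i in range(n_draws):
    P = np.vstack([rng.dirichlet(counts[0] + 1),
                   rng.dirichlet(counts[1] + 1)])
    # For a 2-state master equation the nontrivial eigenvalue of P gives the
    # total rate, split between the two directions by the stationary weights.
    lam2 = np.trace(P) - 1.0
    ktot = -np.log(lam2) / tau           # k01 + k10
    pi1 = P[0, 1] / (P[0, 1] + P[1, 0])  # stationary weight of state 1
    k01[i] = ktot * pi1
print(f"k(0->1) = {k01.mean():.4f} ± {k01.std():.4f} per ps")
```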

  11. Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel

    1990-01-01

    A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite bonded joints below the cryogenic temperature of 30 K (-405 °F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings, and composite gusset/clip joints, all bonded to M55J/954-6 and T300/954-6 hybrid composite tubes (75 mm square). Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited, and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Increasing this challenge is the difficulty in testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce required analytical tools and develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate anticipated stress states in the flight joints; subsequently, the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on developed techniques and properties.

  12. A Bayesian Nonparametric Approach for the Analysis of Multiple Categorical Item Responses

    PubMed Central

    Waters, Andrew; Fronczyk, Kassandra; Guindani, Michele; Baraniuk, Richard G.; Vannucci, Marina

    2014-01-01

    We develop a modeling framework for joint factor and cluster analysis of datasets where multiple categorical response items are collected on a heterogeneous population of individuals. We introduce a latent factor multinomial probit model and employ prior constructions that allow inference on the number of factors as well as clustering of the subjects into homogeneous groups according to their relevant factors. Clustering, in particular, allows us to borrow strength across subjects, therefore helping in the estimation of the model parameters, particularly when the number of observations is small. We employ Markov chain Monte Carlo techniques and obtain tractable posterior inference for our objectives, including sampling of missing data. We demonstrate the effectiveness of our method on simulated data. We also analyze two real-world educational datasets and show that our method outperforms state-of-the-art methods. In the analysis of the real-world data, we uncover hidden relationships between the questions and the underlying educational concepts, while simultaneously partitioning the students into groups of similar educational mastery. PMID:26500388

  13. A tutorial on Bayesian bivariate meta-analysis of mixed binary-continuous outcomes with missing treatment effects.

    PubMed

    Gajic-Veljanoski, Olga; Cheung, Angela M; Bayoumi, Ahmed M; Tomlinson, George

    2016-05-30

    Bivariate random-effects meta-analysis (BVMA) is a method of data synthesis that accounts for treatment effects measured on two outcomes. BVMA gives more precise estimates of the population mean and predicted values than two univariate random-effects meta-analyses (UVMAs). BVMA also addresses bias from incomplete reporting of outcomes. A few tutorials have covered technical details of BVMA of categorical or continuous outcomes. Limited guidance is available on how to analyze datasets that include trials with mixed continuous-binary outcomes where treatment effects on one outcome or the other are not reported. Given the advantages of Bayesian BVMA for handling missing outcomes, we present a tutorial for Bayesian BVMA of incompletely reported treatment effects on mixed bivariate outcomes. This step-by-step approach can serve as a model for our intended audience, the methodologist familiar with Bayesian meta-analysis, looking for practical advice on fitting bivariate models. To facilitate application of the proposed methods, we include our WinBUGS code. As an example, we use aggregate-level data from published trials to demonstrate the estimation of the effects of vitamin K and bisphosphonates on two correlated bone outcomes: fracture and bone mineral density. We present datasets where reporting of the pairs of treatment effects on both outcomes was 'partially' complete (i.e., pairs completely reported in some trials), and we outline steps for modeling the incompletely reported data. To assess what is gained from the additional work required by BVMA, we compare the resulting estimates to those from separate UVMAs. We discuss methodological findings and make four recommendations. Copyright © 2015 John Wiley & Sons, Ltd.

  14. Evaluating Google, Twitter, and Wikipedia as Tools for Influenza Surveillance Using Bayesian Change Point Analysis: A Comparative Analysis

    PubMed Central

    Hopkins, Richard S; Cook, Robert L; Striley, Catherine W

    2016-01-01

    Background Traditional influenza surveillance relies on influenza-like illness (ILI) syndrome that is reported by health care providers. It primarily captures individuals who seek medical care and misses those who do not. Recently, Web-based data sources have been studied for application to public health surveillance, as there is a growing number of people who search, post, and tweet about their illnesses before seeking medical care. Existing research has shown some promise of using data from Google, Twitter, and Wikipedia to complement traditional surveillance for ILI. However, past studies have evaluated these Web-based sources individually or dually without comparing all 3 of them, and it would be beneficial to know which of the Web-based sources performs best in order to be considered to complement traditional methods. Objective The objective of this study is to comparatively analyze Google, Twitter, and Wikipedia by examining which best corresponds with Centers for Disease Control and Prevention (CDC) ILI data. It was hypothesized that Wikipedia would best correspond with CDC ILI data, as previous research found it to be least influenced by high media coverage in comparison with Google and Twitter. Methods Publicly available, deidentified data were collected from the CDC, Google Flu Trends, HealthTweets, and Wikipedia for the 2012-2015 influenza seasons. Bayesian change point analysis was used to detect seasonal changes, or change points, in each of the data sources. Change points in Google, Twitter, and Wikipedia that occurred during the exact week, 1 preceding week, or 1 week after the CDC's change points were compared with the CDC data as the gold standard. All analyses were conducted using the R package "bcp" version 4.0.0 in RStudio version 0.99.484 (RStudio Inc). In addition, sensitivity and positive predictive values (PPV) were calculated for Google, Twitter, and Wikipedia. Results During the 2012-2015 influenza seasons, a high sensitivity of 92
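
    A bare-bones single-change-point posterior (a crude stand-in for the bcp package's product-partition model), applied to a synthetic weekly series:

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical weekly ILI-proxy series with a level shift at season onset.
y = np.concatenate([rng.normal(1.0, 0.3, 20), rng.normal(2.5, 0.3, 15)])
n = len(y)

# Posterior over a single change point k (flat prior), Gaussian likelihood
# with a variance assumed known for simplicity.
sigma2 = 0.3**2
logpost = np.full(n, -np.inf)
for k in range(2, n - 2):
    left, right = y[:k], y[k:]
    # profile log-likelihood at each segment's sample mean
    logpost[k] = (-((left - left.mean())**2).sum() / (2 * sigma2)
                  - ((right - right.mean())**2).sum() / (2 * sigma2))
logpost -= logpost.max()
post = np.exp(logpost)
post /= post.sum()
print("most probable change week:", post.argmax(), " P =", round(post.max(), 2))
```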

  15. Analysis of a Preloaded Bolted Joint in a Ceramic Composite Combustor

    NASA Technical Reports Server (NTRS)

    Hissam, D. Andy; Bower, Mark V.

    2003-01-01

    This paper presents the detailed analysis of a preloaded bolted joint incorporating ceramic materials. The objective of this analysis is to determine the suitability of a joint design for a ceramic combustor. The analysis addresses critical factors in bolted joint design including preload, preload uncertainty, and load factor. The relationship between key joint variables is also investigated. The analysis is based on four key design criteria, each addressing an anticipated failure mode. The criteria are defined in terms of margin of safety, which must be greater than zero for the design criteria to be satisfied. Since the proposed joint has positive margins of safety, the design criteria are satisfied. Therefore, the joint design is acceptable.
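
    The margin-of-safety bookkeeping is simple to mechanize. A sketch with invented loads and allowables (not the combustor's numbers), in the style of the criteria described:

```python
# Each criterion passes only if its margin of safety is positive.
def margin_of_safety(allowable, applied, factor_of_safety):
    return allowable / (applied * factor_of_safety) - 1.0

# Assumed values for illustration only.
preload_nominal = 8000.0       # N
preload_uncertainty = 0.25     # +/-25% scatter from torquing method
external_load = 3000.0         # N, load factor already applied

criteria = {
    # worst-case maximum preload vs bolt strength
    "bolt yield":       margin_of_safety(12_000.0, preload_nominal * (1 + preload_uncertainty), 1.1),
    # worst-case minimum preload must exceed the separating load
    "joint separation": margin_of_safety(preload_nominal * (1 - preload_uncertainty), external_load, 1.2),
    # ceramic bearing stress under maximum preload
    "ceramic crushing": margin_of_safety(15_000.0, preload_nominal * (1 + preload_uncertainty), 1.4),
}
for name, ms in criteria.items():
    print(f"{name:18s} MS = {ms:+.2f}  {'OK' if ms > 0 else 'FAIL'}")
```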

  16. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis in the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all 4 variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and the methodology is then applied to the NASA Langley Uncertainty Quantification Challenge problem.
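
    The variance-based step can be illustrated with a pick-freeze estimator of first-order Sobol indices on a toy model (the model and its variables are invented):

```python
import numpy as np

rng = np.random.default_rng(10)

def model(x):
    # Invented system response with deliberately unequal variable importance.
    return x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 0] * x[:, 2] + 0.01 * x[:, 3]

# First-order Sobol indices via a pick-freeze (Saltelli-style) estimator:
# S_i is the share of output variance explained by variable i alone.
n, d = 100_000, 4
A = rng.uniform(-1, 1, (n, d))
B = rng.uniform(-1, 1, (n, d))
yA = model(A)
var_y = yA.var()
for i in range(d):
    Ci = B.copy()
    Ci[:, i] = A[:, i]          # freeze x_i, resample all other variables
    yC = model(Ci)
    S_i = (np.mean(yA * yC) - yA.mean() * yC.mean()) / var_y
    print(f"S_{i + 1} = {S_i:.3f}")
# The variables with the largest S_i are the ones worth refining first.
```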

  17. Bayesian reliability analysis in connection with the evaluation of multiple-site damage

    NASA Astrophysics Data System (ADS)

    Itagaki, Hiroshi; Asada, H.; Ito, S.

    A rivet splice model for analyzing multiple-site damage is presented. The effectiveness of the periodic inspection of multiple-site fatigue cracks along a rivet joint is discussed. The relative frequency distribution of visible crack length and joint posterior probability density are described.
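
    As a hedged illustration of the kind of inspection update such an analysis involves, the sketch below applies Bayes' rule to a crack-length distribution after an inspection that finds no crack, using a probability-of-detection (POD) curve. The POD form and all parameters are hypothetical, not from the paper.

      import math

      lengths = [0.5 * i for i in range(1, 41)]           # crack lengths, mm
      prior = [math.exp(-a / 4.0) for a in lengths]       # hypothetical prior shape

      def pod(a, a50=5.0, slope=1.5):
          """Hypothetical logistic probability of detecting a crack of length a."""
          return 1.0 / (1.0 + math.exp(-slope * (a - a50)))

      # No crack was found, so the likelihood of each length is (1 - POD).
      post = [p * (1.0 - pod(a)) for p, a in zip(prior, lengths)]
      norm = sum(post)
      post = [p / norm for p in post]

      mean_prior = sum(a * p for a, p in zip(lengths, prior)) / sum(prior)
      mean_post = sum(a * p for a, p in zip(lengths, post))
      print(f"mean crack length: prior {mean_prior:.2f} mm -> posterior {mean_post:.2f} mm")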

  18. Do people with benign joint hypermobility syndrome (BJHS) have reduced joint proprioception? A systematic review and meta-analysis.

    PubMed

    Smith, Toby O; Jerman, Emma; Easton, Victoria; Bacon, Holly; Armon, Kate; Poland, Fiona; Macgregor, Alex J

    2013-11-01

    Joint proprioceptive deficit is documented in a variety of musculoskeletal conditions including osteoarthritis, ligament and meniscal injuries, and individuals with increased joint hypermobility, such as those with Ehlers-Danlos syndrome. No systematic reviews have assessed joint proprioception in people with benign joint hypermobility syndrome (BJHS). This study addresses that gap to determine whether people with BJHS exhibit reduced joint proprioception and, if so, whether this is evident in all age groups. The search was conducted on 31st January 2013. The published literature was assessed using the databases AMED, CINAHL, MEDLINE, EMBASE, PubMed and the Cochrane Library. Unpublished literature and trial registries were assessed, including OpenGrey, the WHO International Clinical Trials Registry Platform, Current Controlled Trials and the UK National Research Register Archive. All studies comparing the proprioceptive capability of people with and without BJHS were included. Study methodological quality was assessed using the CASP appraisal tool. Meta-analysis techniques were used when study homogeneity permitted. Five studies including 254 people were identified. People with BJHS demonstrated statistically significantly poorer lower limb joint position sense (JPS) (p < 0.001) and threshold detection to movement (p < 0.001) than those without BJHS. The evidence for an upper limb proprioceptive difference was less clear, with no statistically significant difference between the cohorts for shoulder JPS (p = 0.10) but a statistically significant difference in finger JPS (p < 0.001). One study assessing childhood BJHS reported reduced knee proprioceptive capability in those with BJHS (p < 0.001). To conclude, lower limb joint proprioception is reduced in those with BJHS compared with non-BJHS cohorts, whilst the evidence for the upper limb remains unclear.
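
    For readers unfamiliar with the pooling step, the sketch below shows standard fixed-effect inverse-variance meta-analysis, the usual technique when homogeneity permits pooling. The effect sizes and standard errors are hypothetical, not the review's data.

      def pool_fixed_effect(effects, std_errors):
          """Inverse-variance weighted mean and its standard error."""
          weights = [1.0 / se ** 2 for se in std_errors]
          pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
          pooled_se = (1.0 / sum(weights)) ** 0.5
          return pooled, pooled_se

      # Hypothetical standardized mean differences in joint position sense error
      # (BJHS minus control) from three studies.
      effects = [0.8, 1.1, 0.6]
      ses = [0.30, 0.25, 0.40]
      est, se = pool_fixed_effect(effects, ses)
      print(f"pooled SMD = {est:.2f} (95% CI {est - 1.96*se:.2f} to {est + 1.96*se:.2f})")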

  19. Bayesian analysis of the astrobiological implications of life's early emergence on Earth.

    PubMed

    Spiegel, David S; Turner, Edwin L

    2012-01-10

    Life arose on Earth sometime in the first few hundred million years after the young planet had cooled to the point that it could support water-based organisms on its surface. The early emergence of life on Earth has been taken as evidence that the probability of abiogenesis is high, if starting from young Earth-like conditions. We revisit this argument quantitatively in a Bayesian statistical framework. By constructing a simple model of the probability of abiogenesis, we calculate a Bayesian estimate of its posterior probability, given the data that life emerged fairly early in Earth's history and that, billions of years later, curious creatures noted this fact and considered its implications. We find that, given only this very limited empirical information, the choice of Bayesian prior for the abiogenesis probability parameter has a dominant influence on the computed posterior probability. Although terrestrial life's early emergence provides evidence that life might be abundant in the universe if early-Earth-like conditions are common, the evidence is inconclusive and indeed is consistent with an arbitrarily low intrinsic probability of abiogenesis for plausible uninformative priors. Finding a single case of life arising independently of our lineage (on Earth, elsewhere in the solar system, or on an extrasolar planet) would provide much stronger evidence that abiogenesis is not extremely rare in the universe.
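
    The prior-sensitivity point can be made concrete with a small numerical sketch: modelling abiogenesis as a Poisson process with unknown rate (a common simplification; the paper's model is richer) and conditioning on life emerging within an early window, the posterior probability that the rate is very low differs by orders of magnitude between a uniform and a log-uniform prior. The grid bounds and the "rare" threshold below are hypothetical.

      import math

      N = 4000
      lams = [10 ** (-6 + 9 * i / (N - 1)) for i in range(N)]          # rate grid
      widths = [(lams[min(i + 1, N - 1)] - lams[max(i - 1, 0)]) / 2     # cell widths
                for i in range(N)]

      t = 1.0                                            # early-emergence window
      like = [1 - math.exp(-lam * t) for lam in lams]    # P(life by t | rate)

      def p_rare(prior_density, threshold=1e-2):
          """Posterior probability that the abiogenesis rate is below `threshold`."""
          post = [prior_density(lam) * l * w for lam, l, w in zip(lams, like, widths)]
          return sum(p for p, lam in zip(post, lams) if lam < threshold) / sum(post)

      print(f"uniform prior:     P(rate < 0.01 | data) = {p_rare(lambda lam: 1.0):.2e}")
      print(f"log-uniform prior: P(rate < 0.01 | data) = {p_rare(lambda lam: 1 / lam):.2e}")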

  20. A Primer on Bayesian Decision Analysis With an Application to a Kidney Transplant Decision.

    PubMed

    Neapolitan, Richard; Jiang, Xia; Ladner, Daniela P; Kaplan, Bruce

    2016-03-01

    A clinical decision support system (CDSS) is a computer program designed to assist health care professionals with decision-making tasks. A well-developed CDSS weighs the benefits of therapy against the costs in terms of loss of quality of life and financial loss, and recommends the decision that can be expected to provide maximum overall benefit. This article provides an introduction to developing CDSSs using Bayesian networks; such CDSSs can help with the often complex decisions involving transplants. First, we review Bayes theorem in the context of medical decision making. Then, we introduce Bayesian networks, which can model probabilistic relationships among many related variables and are based on Bayes theorem. Next, we discuss influence diagrams, which are Bayesian networks augmented with decision and value nodes and which can be used to develop CDSSs that recommend decisions maximizing the expected utility of the predicted outcomes to the patient. By way of comparison, we examine the benefits and challenges of using the Kidney Donor Risk Index as the sole decision tool. Finally, we develop a schema for an influence diagram that models generalized kidney transplant decisions and show how the influence diagram approach can provide the clinician and the potential transplant recipient with a valuable decision support tool.
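
    A minimal sketch of the expected-utility computation that an influence diagram automates is shown below for a two-option transplant decision. All probabilities and utilities are hypothetical illustrations, not clinical estimates.

      # P(outcome | decision) and utility of each outcome (e.g. QALYs).
      model = {
          "accept offered kidney": {"good graft function": (0.80, 9.0),
                                    "graft failure":       (0.20, 3.0)},
          "wait for better match": {"transplanted later":  (0.55, 8.0),
                                    "remains on dialysis": (0.45, 4.0)},
      }

      def expected_utility(decision):
          """Sum of outcome utilities weighted by their probabilities."""
          return sum(p * u for p, u in model[decision].values())

      best = max(model, key=expected_utility)
      for d in model:
          print(f"{d}: EU = {expected_utility(d):.2f}")
      print(f"recommended decision: {best}")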

  1. Bayesian analysis of the astrobiological implications of life’s early emergence on Earth

    PubMed Central

    Spiegel, David S.; Turner, Edwin L.

    2012-01-01

    Life arose on Earth sometime in the first few hundred million years after the young planet had cooled to the point that it could support water-based organisms on its surface. The early emergence of life on Earth has been taken as evidence that the probability of abiogenesis is high, if starting from young Earth-like conditions. We revisit this argument quantitatively in a Bayesian statistical framework. By constructing a simple model of the probability of abiogenesis, we calculate a Bayesian estimate of its posterior probability, given the data that life emerged fairly early in Earth’s history and that, billions of years later, curious creatures noted this fact and considered its implications. We find that, given only this very limited empirical information, the choice of Bayesian prior for the abiogenesis probability parameter has a dominant influence on the computed posterior probability. Although terrestrial life's early emergence provides evidence that life might be abundant in the universe if early-Earth-like conditions are common, the evidence is inconclusive and indeed is consistent with an arbitrarily low intrinsic probability of abiogenesis for plausible uninformative priors. Finding a single case of life arising independently of our lineage (on Earth, elsewhere in the solar system, or on an extrasolar planet) would provide much stronger evidence that abiogenesis is not extremely rare in the universe. PMID:22198766

  2. Kinematic and dynamic analysis of an anatomically based knee joint.

    PubMed

    Lee, Kok-Meng; Guo, Jiajie

    2010-05-07

    This paper presents a knee-joint model to provide a better understanding of the interaction between natural joints and artificial mechanisms for the design and control of rehabilitation exoskeletons. The anatomically based knee model relaxes several commonly made assumptions that approximate the human knee as an engineering pin joint in exoskeleton design. Based on published MRI data, we formulate the kinematics of the knee joint and compare three mathematical approximations: one model is based on two sequential circles rolling on a flat plane, and the other two are mathematically differentiable ellipse-based models, with and without sliding at the contact. The ellipse-based model that accounts for sliding contact shows that the rolling-sliding ratio of a knee joint is not constant but has an average value consistent with published measurements. This knee-joint kinematics leads to a physically more accurate contact-point trajectory than methods based on multiple circles or lines, and provides a basis for deriving a knee-joint kinetic model with which the effects of a planar exoskeleton mechanism on the internal joint forces and torque during flexion can be numerically investigated. Two different knee-joint kinetic models (a pin-joint approximation and the anatomically based model) are compared against a condition with no exoskeleton. The leg and exoskeleton form a closed kinematic chain that has a significant effect on the joint forces in the knee. The human knee is more tolerant than a pin joint in negotiating a singularity, but its internal forces increase with the exoskeleton mass-to-length ratio. An oversimplified pin-joint approximation cannot capture the finite change in the knee forces due to the singularity effect.
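
    As a rough geometric illustration of the rolling-sliding distinction, the sketch below compares how far the ground contact point advances under pure rolling versus rolling with a constant slide fraction for an elliptical profile. The semi-axes and slide fraction are hypothetical, not the paper's MRI-derived geometry, and a constant slide fraction is a stand-in for the paper's varying rolling-sliding ratio.

      import math

      a, b = 0.030, 0.022                      # hypothetical semi-axes, metres
      n = 1000
      thetas = [i * (math.pi / 2) / (n - 1) for i in range(n)]
      pts = [(a * math.cos(t), b * math.sin(t)) for t in thetas]

      # Cumulative arc length along the ellipse surface over 90 degrees.
      arc = 0.0
      for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
          arc += math.hypot(x1 - x0, y1 - y0)

      # Pure rolling: ground contact advances exactly with surface arc length.
      pure_roll = arc

      # Rolling with sliding: contact advances less than the arc length.
      slide_fraction = 0.3                     # hypothetical
      roll_slide = (1 - slide_fraction) * arc

      print(f"contact advance over 90 deg flexion: pure rolling {pure_roll*1000:.1f} mm, "
            f"with sliding {roll_slide*1000:.1f} mm")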

  3. Joint Modeling Compliance and Outcome for Causal Analysis in Longitudinal Studies

    PubMed Central

    Gao, Xin; Brown, Gregory K.; Elliott, Michael R.

    2013-01-01

    This article discusses joint modeling of compliance and outcome for longitudinal studies when noncompliance is present. We focus on two-arm randomized longitudinal studies in which subjects are randomized at baseline, treatment is applied repeatedly over time, and compliance behaviors and clinical outcomes are measured and recorded repeatedly over time. In the proposed Markov compliance and outcome model, we use the potential outcome framework to define pre-randomization principal strata from the joint distribution of compliance under the treatment and control arms, and estimate the effect of treatment within each principal stratum. Besides the causal effect of the treatment, our proposed model can estimate the impact of the causal effect of the treatment at a given time on future compliance. Bayesian methods are used to estimate the parameters. The results are illustrated using a study assessing the effect of cognitive behavior therapy on depression. A simulation study is used to assess the repeated sampling properties of the proposed model. PMID:23576159
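
    A minimal sketch of the principal-stratification idea follows: joint potential compliance under the two arms defines pre-randomization strata, and causal effects are defined within strata. The simulated population and the stratum-specific effects are hypothetical; the paper's actual model is a longitudinal Markov model estimated with Bayesian methods.

      import random
      from collections import Counter

      random.seed(0)

      def stratum(would_comply_treated, would_comply_control):
          """Classify a subject by joint potential compliance under both arms."""
          if would_comply_treated and would_comply_control:
              return "always-taker"
          if would_comply_treated:
              return "complier"
          if would_comply_control:
              return "defier"
          return "never-taker"

      # Simulate joint potential compliance for a hypothetical population.
      population = [stratum(random.random() < 0.7, random.random() < 0.1)
                    for _ in range(10_000)]
      counts = Counter(population)
      print(counts)

      # Hypothetical stratum-specific causal effects; the population-average
      # effect is their mixture over the strata.
      effect = {"complier": 1.5, "always-taker": 0.4, "never-taker": 0.0, "defier": 0.0}
      avg = sum(effect[s] * n for s, n in counts.items()) / len(population)
      print(f"population-average causal effect: {avg:.3f}")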

  4. Joint modeling compliance and outcome for causal analysis in longitudinal studies.

    PubMed

    Gao, Xin; Brown, Gregory K; Elliott, Michael R

    2014-09-10

    This article discusses joint modeling of compliance and outcome for longitudinal studies when noncompliance is present. We focus on two-arm randomized longitudinal studies in which subjects are randomized at baseline, treatment is applied repeatedly over time, and compliance behaviors and clinical outcomes are measured and recorded repeatedly over time. In the proposed Markov compliance and outcome model, we use the potential outcome framework to define pre-randomization principal strata from the joint distribution of compliance under the treatment and control arms, and estimate the effect of treatment within each principal stratum. Besides the causal effect of the treatment, our proposed model can estimate the impact of the causal effect of the treatment at a given time on future compliance. Bayesian methods are used to estimate the parameters. The results are illustrated using a study assessing the effect of cognitive behavior therapy on depression. A simulation study is used to assess the repeated sampling properties of the proposed model.

  5. A Bayesian Hierarchical Non-Linear Regression Model in Receiver Operating Characteristic Analysis of Clustered Continuous Diagnostic Data

    PubMed Central

    Zou, Kelly H.; O’Malley, A. James

    2005-01-01

    Receiver operating characteristic (ROC) analysis is a useful evaluative method of diagnostic accuracy. A Bayesian hierarchical nonlinear regression model for ROC analysis was developed. A validation analysis of diagnostic accuracy was conducted using prospective multi-center clinical trial prostate cancer biopsy data collected from three participating centers. The gold standard was based on radical prostatectomy to determine local and advanced disease. To evaluate the diagnostic performance of prostate-specific antigen (PSA) level at fixed levels of Gleason score, a normality transformation was applied to the outcome data. A hierarchical regression analysis incorporating the effects of cluster (clinical center) and cancer risk (low, intermediate, and high) was performed, and the area under the ROC curve (AUC) was estimated. PMID:16161801
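
    After a normality transformation, the binormal ROC model gives the AUC in closed form, AUC = Phi((mu1 - mu0) / sqrt(s0^2 + s1^2)), which the sketch below evaluates. The group means and standard deviations are hypothetical, not the trial's PSA estimates.

      import math

      def binormal_auc(mu0, s0, mu1, s1):
          """AUC when scores are normal in each group (1 = diseased, 0 = not)."""
          z = (mu1 - mu0) / math.sqrt(s0 ** 2 + s1 ** 2)
          return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

      # Hypothetical transformed score means/SDs for the two disease states.
      print(f"AUC = {binormal_auc(mu0=0.0, s0=1.0, mu1=1.2, s1=1.1):.3f}")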

  6. Analysis of space crane articulated-truss joints