Science.gov

Sample records for joint Bayesian analysis

  1. JBASE: Joint Bayesian Analysis of Subphenotypes and Epistasis

    PubMed Central

    Colak, Recep; Kim, TaeHyung; Kazan, Hilal; Oh, Yoomi; Cruz, Miguel; Valladares-Salgado, Adan; Peralta, Jesus; Escobedo, Jorge; Parra, Esteban J.; Kim, Philip M.; Goldenberg, Anna

    2016-01-01

    Motivation: Rapid advances in genotyping and genome-wide association studies have enabled the discovery of many new genotype–phenotype associations at the resolution of individual markers. However, these associations explain only a small proportion of the theoretically estimated heritability of most diseases. In this work, we propose an integrative mixture model called JBASE: joint Bayesian analysis of subphenotypes and epistasis. JBASE explores two major sources of missing heritability: interactions between genetic variants (a phenomenon known as epistasis) and phenotypic heterogeneity, addressed via subphenotyping. Results: Our extensive simulations in a wide range of scenarios repeatedly demonstrate that JBASE can identify true underlying subphenotypes, including their associated variants and their interactions, with high precision. In the presence of phenotypic heterogeneity, JBASE has higher power and lower Type I error than five state-of-the-art approaches. We applied our method to a sample of individuals from Mexico with Type 2 diabetes and discovered two novel epistatic modules, each comprising two loci, that define two subphenotypes characterized by differences in body mass index and waist-to-hip ratio. We successfully replicated these subphenotypes and epistatic modules in an independent dataset from Mexico genotyped with a different platform. Availability and implementation: JBASE is implemented in C++, supported on Linux and is available at http://www.cs.toronto.edu/~goldenberg/JBASE/jbase.tar.gz. The genotype data underlying this study are available upon approval by the ethics review board of the Medical Centre Siglo XXI. Please contact Dr Miguel Cruz at mcruzl@yahoo.com for assistance with the application. Contact: anna.goldenberg@utoronto.ca Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26411870
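
    To make the mixture idea above concrete, the following toy sketch (our illustration in Python, not the JBASE C++ implementation; all penetrance values are invented) uses Bayes' rule to compute the posterior probability that an affected individual belongs to each latent subphenotype, where each subphenotype carries its own epistatic two-locus penetrance table.

```python
import numpy as np

# Toy illustration (not the JBASE implementation): posterior "responsibility"
# of each latent subphenotype for one case, where each subphenotype k has its
# own penetrance table f_k over a pair of interacting loci (epistasis).
# All numbers below are invented for illustration.

pi = np.array([0.6, 0.4])            # prior weights of the two subphenotypes

# Penetrance f[k, g1, g2] = P(disease | genotypes g1, g2, subphenotype k),
# with genotypes coded 0/1/2 (copies of the minor allele).
f = np.array([
    [[0.05, 0.05, 0.05],             # subphenotype 1: risk only when both
     [0.05, 0.05, 0.30],             # loci carry minor alleles (epistasis)
     [0.05, 0.30, 0.60]],
    [[0.05, 0.10, 0.20],             # subphenotype 2: mostly marginal effects
     [0.10, 0.15, 0.25],
     [0.20, 0.25, 0.35]],
])

def responsibilities(g1, g2, affected=True):
    """P(subphenotype k | genotypes, disease status) via Bayes' rule."""
    lik = f[:, g1, g2] if affected else 1.0 - f[:, g1, g2]
    post = pi * lik
    return post / post.sum()

print(responsibilities(2, 2))        # an affected double-homozygote case
```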

  2. A Bayesian approach to joint analysis of longitudinal measurements and competing risks failure time data.

    PubMed

    Hu, Wenhua; Li, Gang; Li, Ning

    2009-05-15

    In this paper, we develop a Bayesian method for the joint analysis of longitudinal measurements and competing risks failure time data. The model allows one to analyze the longitudinal outcome with nonignorable missing data induced by multiple types of events, to analyze survival data with dependent censoring for the key event, and to draw inferences on multiple endpoints simultaneously. Compared with the likelihood approach, the Bayesian method has several advantages. It is computationally more tractable for high-dimensional random effects, and it is convenient for drawing inferences. Moreover, it provides a means to incorporate prior information that may help to improve estimation accuracy. An illustration is given using data from a clinical trial of scleroderma lung disease. The performance of our method is evaluated by simulation studies. PMID:19308919

  3. Joint Bayesian analysis of birthweight and censored gestational age using finite mixture models

    PubMed Central

    Schwartz, Scott L.; Gelfand, Alan E.; Miranda, Marie L.

    2016-01-01

    Birthweight and gestational age are closely related and represent important indicators of a healthy pregnancy. Customary modeling for birthweight is conditional on gestational age. However, joint modeling directly addresses the relationship between gestational age and birthweight, and provides increased flexibility and interpretation as well as a strategy to avoid using gestational age as an intermediate variable. Previous proposals have utilized finite mixtures of bivariate regression models to incorporate well-established risk factors into analysis (e.g. sex and birth order of the baby, maternal age, race, and tobacco use) while examining the non-Gaussian shape of the joint birthweight and gestational age distribution. We build on this approach by demonstrating the inferential (prognostic) benefits of joint modeling (e.g. investigation of 'age-inappropriate' outcomes like small for gestational age) and hence re-emphasize the importance of capturing the non-Gaussian distributional shapes. We additionally extend current models through a latent specification which admits interval-censored gestational age. We work within a Bayesian framework which enables inference beyond customary parameter estimation and prediction as well as exact uncertainty assessment. The model is applied to a portion of the 2003–2006 North Carolina Detailed Birth Record data (n = 336,129) available through the Children's Environmental Health Initiative and is fitted using the Bayesian methodology and Markov chain Monte Carlo approaches. PMID:20575047
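
    A minimal sketch of the interval-censoring device described above (our notation and values, not the authors' code): when gestational age is only recorded in completed weeks, the latent-Gaussian likelihood contribution is a difference of normal CDFs rather than a point density.

```python
import numpy as np
from scipy.stats import norm

# Interval-censored likelihood sketch (assumed notation): when the latent
# gestational age is only known to lie in [lo, hi], its contribution is
# Phi((hi - mu) / sigma) - Phi((lo - mu) / sigma) instead of a density value.

def interval_loglik(lo, hi, mu, sigma):
    p = norm.cdf(hi, mu, sigma) - norm.cdf(lo, mu, sigma)
    return np.log(p)

# Gestational age recorded as "39 weeks" -> latent value in [39, 40).
print(interval_loglik(39.0, 40.0, mu=39.2, sigma=1.5))
```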

  4. JAM: A Scalable Bayesian Framework for Joint Analysis of Marginal SNP Effects.

    PubMed

    Newcombe, Paul J; Conti, David V; Richardson, Sylvia

    2016-04-01

    Recently, large scale genome-wide association study (GWAS) meta-analyses have boosted the number of known signals for some traits into the tens and hundreds. Typically, however, variants are only analysed one-at-a-time. This complicates the ability of fine-mapping to identify a small set of SNPs for further functional follow-up. We describe a new and scalable algorithm, joint analysis of marginal summary statistics (JAM), for the re-analysis of published marginal summary statistics under joint multi-SNP models. The correlation is accounted for according to estimates from a reference dataset, and models and SNPs that best explain the complete joint pattern of marginal effects are highlighted via an integrated Bayesian penalized regression framework. We provide both enumerated and Reversible Jump MCMC implementations of JAM and present some comparisons of performance. In a series of realistic simulation studies, JAM demonstrated identical performance to various alternatives designed for single region settings. In multi-region settings, where the only multivariate alternative involves stepwise selection, JAM offered greater power and specificity. We also present an application to real published results from MAGIC (meta-analysis of glucose and insulin related traits consortium) - a GWAS meta-analysis of more than 15,000 people. We re-analysed several genomic regions that produced multiple significant signals with glucose levels 2 hr after oral stimulation. Through joint multivariate modelling, JAM was able to formally rule out many SNPs, and for one gene, ADCY5, suggests that an additional SNP, which transpired to be more biologically plausible, should be followed up with equal priority to the reported index. PMID:27027514
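
    A minimal sketch of the core identity that JAM-style re-analysis rests on (our illustration, not the published algorithm): for standardized genotypes, marginal effects satisfy beta_marginal ≈ R · beta_joint, where R is the SNP correlation (LD) matrix estimated from a reference dataset, so approximate joint multi-SNP effects can be recovered from published summaries.

```python
import numpy as np

# Illustrative only (not the JAM implementation): with standardized
# genotypes, the marginal effect vector satisfies beta_m ≈ R @ beta_joint,
# where R is the LD (correlation) matrix from a reference panel. Solving
# the linear system recovers approximate joint effects from summaries.

R = np.array([[1.0, 0.8, 0.1],       # assumed reference-panel LD matrix
              [0.8, 1.0, 0.1],
              [0.1, 0.1, 1.0]])
beta_marginal = np.array([0.20, 0.18, 0.05])

beta_joint = np.linalg.solve(R, beta_marginal)
print(beta_joint)   # SNP 2's signal is largely explained by LD with SNP 1
```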

  5. Joint high-dimensional Bayesian variable and covariance selection with an application to eQTL analysis.

    PubMed

    Bhadra, Anindya; Mallick, Bani K

    2013-06-01

    We describe a Bayesian technique to (a) perform a sparse joint selection of significant predictor variables and significant inverse covariance matrix elements of the response variables in a high-dimensional linear Gaussian sparse seemingly unrelated regression (SSUR) setting and (b) perform an association analysis between the high-dimensional sets of predictors and responses in such a setting. To search the high-dimensional model space, where both the number of predictors and the number of possibly correlated responses can be larger than the sample size, we demonstrate that a marginalization-based collapsed Gibbs sampler, in combination with spike and slab type of priors, offers a computationally feasible and efficient solution. As an example, we apply our method to an expression quantitative trait loci (eQTL) analysis on publicly available single nucleotide polymorphism (SNP) and gene expression data for humans where the primary interest lies in finding the significant associations between the sets of SNPs and possibly correlated genetic transcripts. Our method also allows for inference on the sparse interaction network of the transcripts (response variables) after accounting for the effect of the SNPs (predictor variables). We exploit properties of Gaussian graphical models to make statements concerning conditional independence of the responses. Our method compares favorably to existing Bayesian approaches developed for this purpose. PMID:23607608
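
    A minimal spike-and-slab Gibbs sampler for a single-response linear regression (a deliberately simplified toy with fixed hyperparameters, far simpler than the collapsed sampler for the SSUR model described above) illustrates how inclusion indicators are updated from analytically integrated odds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy spike-and-slab Gibbs sampler (assumed hyperparameters, one response):
# beta_j = gamma_j * b_j, b_j ~ N(0, tau2), gamma_j ~ Bern(p_incl),
# y ~ N(X beta, sigma2 I). Inclusion odds come from integrating b_j out.

n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:2] = [2.0, -1.5]
y = X @ beta_true + rng.standard_normal(n)

sigma2, tau2, p_incl = 1.0, 10.0, 0.1
beta = np.zeros(p)
gamma = np.zeros(p, dtype=bool)
incl = np.zeros(p)
n_iter, burn = 2000, 500

for it in range(n_iter):
    for j in range(p):
        r = y - X @ beta + X[:, j] * beta[j]   # residual excluding term j
        v = 1.0 / (X[:, j] @ X[:, j] / sigma2 + 1.0 / tau2)
        m = v * (X[:, j] @ r) / sigma2
        log_odds = (np.log(p_incl / (1 - p_incl))
                    + 0.5 * np.log(v / tau2) + 0.5 * m * m / v)
        prob = 1.0 / (1.0 + np.exp(-np.clip(log_odds, -35.0, 35.0)))
        gamma[j] = rng.random() < prob
        beta[j] = rng.normal(m, np.sqrt(v)) if gamma[j] else 0.0
    if it >= burn:
        incl += gamma

print(np.round(incl / (n_iter - burn), 2))  # posterior inclusion probabilities
```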

  6. Bayesian Mediation Analysis

    ERIC Educational Resources Information Center

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
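
    A hedged sketch of the Bayesian view of mediation (our illustration, not the authors' model): approximate the posteriors of the two paths a (X to M) and b (M to Y given X) as normals around their estimates, then form the posterior of the indirect effect a·b by multiplying draws, which requires no normality assumption on the product itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sketch (assumed estimates, flat priors, Gaussian errors):
# the posteriors of path a (X -> M) and path b (M -> Y adjusting for X)
# are approximately normal around their least-squares estimates; posterior
# draws of a*b give a credible interval for the indirect effect.

a_hat, se_a = 0.40, 0.10     # assumed estimates from the X -> M regression
b_hat, se_b = 0.30, 0.12     # assumed estimates from the M -> Y regression

a = rng.normal(a_hat, se_a, 50_000)
b = rng.normal(b_hat, se_b, 50_000)
ab = a * b                   # posterior draws of the indirect effect

lo, hi = np.percentile(ab, [2.5, 97.5])
print(f"indirect effect: {ab.mean():.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```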

  7. A Bayesian Approach for Multigroup Nonlinear Factor Analysis.

    ERIC Educational Resources Information Center

    Song, Xin-Yuan; Lee, Sik-Yum

    2002-01-01

    Developed a Bayesian approach for a general multigroup nonlinear factor analysis model that simultaneously obtains joint Bayesian estimates of the factor scores and the structural parameters subjected to some constraints across different groups. (SLD)

  8. Bayesian Exploratory Factor Analysis

    PubMed Central

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  9. Bayesian joint modeling of longitudinal and spatial survival AIDS data.

    PubMed

    Martins, Rui; Silva, Giovani L; Andreozzi, Valeska

    2016-08-30

    Joint analysis of longitudinal and survival data has received increasing attention in recent years, especially for analyzing cancer and AIDS data. As both repeated measurements (longitudinal) and time-to-event (survival) outcomes are observed in an individual, a joint model is more appropriate because it takes into account the dependence between the two types of responses, which are often analyzed separately. We propose a Bayesian hierarchical model for jointly modeling longitudinal and survival data considering functional time effects and spatial frailty effects, respectively. That is, the proposed model deals with non-linear longitudinal effects and spatial survival effects accounting for the unobserved heterogeneity among individuals living in the same region. This joint approach is applied to a cohort study of patients with HIV/AIDS in Brazil during the years 2002-2006. Our Bayesian joint model presents considerable improvements in the estimation of survival times of the Brazilian HIV/AIDS patients when compared with those obtained through a separate survival model and shows that the spatial risk of death is the same across the different Brazilian states. PMID:26990773

  10. Road network safety evaluation using Bayesian hierarchical joint model.

    PubMed

    Wang, Jie; Huang, Helai

    2016-05-01

    Safety and efficiency are commonly regarded as two significant performance indicators of transportation systems. In practice, road network planning has focused on road capacity and transport efficiency, whereas the safety level of a road network has received little attention in the planning stage. This study develops a Bayesian hierarchical joint model for road network safety evaluation to help planners take traffic safety into account when planning a road network. The proposed model establishes relationships between road network risk and micro-level variables related to road entities and traffic volume, as well as socioeconomic, trip generation and network density variables at the macro level which are generally used for long-term transportation plans. In addition, network spatial correlation between intersections and their connected road segments is also considered in the model. A road network is carefully selected in order to compare the proposed hierarchical joint model with a previous joint model and a negative binomial model. According to the results of the model comparison, the hierarchical joint model outperforms the joint model and negative binomial model in terms of goodness-of-fit and predictive performance, which indicates the reasonableness of considering the hierarchical data structure in crash prediction and analysis. Moreover, both random effects at the traffic analysis zone (TAZ) level and the spatial correlation between intersections and their adjacent segments are found to be significant, supporting the employment of the hierarchical joint model as an alternative in road-network-level safety modeling as well. PMID:26945109

  11. Bayesian Joint Modelling for Object Localisation in Weakly Labelled Images.

    PubMed

    Shi, Zhiyuan; Hospedales, Timothy M; Xiang, Tao

    2015-10-01

    We address the problem of localisation of objects as bounding boxes in images and videos with weak labels. This weakly supervised object localisation problem has been tackled in the past using discriminative models where each object class is localised independently from other classes. In this paper, a novel framework based on Bayesian joint topic modelling is proposed, which differs significantly from the existing ones in that: (1) All foreground object classes are modelled jointly in a single generative model that encodes multiple object co-existence so that "explaining away" inference can resolve ambiguity and lead to better learning and localisation. (2) Image backgrounds are shared across classes to better learn varying surroundings and "push out" objects of interest. (3) Our model can be learned with a mixture of weakly labelled and unlabelled data, allowing the large volume of unlabelled images on the Internet to be exploited for learning. Moreover, the Bayesian formulation enables the exploitation of various types of prior knowledge to compensate for the limited supervision offered by weakly labelled data, as well as Bayesian domain adaptation for transfer learning. Extensive experiments on the PASCAL VOC, ImageNet and YouTube-Object videos datasets demonstrate the effectiveness of our Bayesian joint model for weakly supervised object localisation. PMID:26340253

  12. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
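
    A minimal sketch of the rejection-sampling reinterpretation that BUS builds on (our toy with an assumed Gaussian prior, likelihood and threshold; not the BUS implementation): draw theta from the prior and u uniformly, accept theta when u ≤ L(theta)/c with c an upper bound on the likelihood, and estimate the tail-event probability from the accepted posterior samples.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Rejection-sampling view of Bayesian updating (assumed toy problem):
# accept prior draws theta when u <= L(theta)/c, where c >= max L.
# Accepted samples follow the posterior; rare/tail-event probabilities
# are then estimated from them.

def likelihood(theta):                     # one assumed observation x = 1.0
    return norm.pdf(1.0, loc=theta, scale=0.5)

c = norm.pdf(0.0, 0.0, 0.5)                # upper bound on the likelihood
theta = rng.standard_normal(200_000)       # prior: theta ~ N(0, 1)
u = rng.random(200_000)
post = theta[u <= likelihood(theta) / c]   # rejection step

print(post.size, np.mean(post > 2.0))      # posterior P(theta > 2.0)
```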

  13. Bayesian Model Averaging for Propensity Score Analysis

    ERIC Educational Resources Information Center

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  14. Analysis of COSIMA spectra: Bayesian approach

    NASA Astrophysics Data System (ADS)

    Lehto, H. J.; Zaprudin, B.; Lehto, K. M.; Lönnberg, T.; Silén, J.; Rynö, J.; Krüger, H.; Hilchenbach, M.; Kissel, J.

    2015-06-01

    We describe the use of Bayesian analysis methods applied to time-of-flight secondary ion mass spectrometer (TOF-SIMS) spectra. The method is applied to the COmetary Secondary Ion Mass Analyzer (COSIMA) TOF-SIMS mass spectra where the analysis can be broken into subgroups of lines close to integer mass values. The effects of the instrumental dead time are discussed in a new way. The method finds the joint probability density functions of measured line parameters (number of lines, and their widths, peak amplitudes, integrated amplitudes and positions). In the case of two or more lines, these distributions can take complex forms. The derived line parameters can be used to further calibrate the mass scaling of TOF-SIMS and to feed the results into other analysis methods such as multivariate analyses of spectra. We intend to use the method, first as a comprehensive tool to perform quantitative analysis of spectra, and second as a fast tool for studying interesting targets for obtaining additional TOF-SIMS measurements of the sample, a property unique to COSIMA. Finally, we point out that the Bayesian method can be thought of as a means of solving inverse problems with forward calculations only, with no iterative corrections or other manipulation of the observed data.

  15. Bayesian Statistics for Biological Data: Pedigree Analysis

    ERIC Educational Resources Information Center

    Stanfield, William D.; Carlton, Matthew A.

    2004-01-01

    The use of Bayes' formula is applied to the biological problem of pedigree analysis to show that Bayes' formula and non-Bayesian or "classical" methods of probability calculation give different answers. First-year college biology students can be introduced to Bayesian statistics.
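
    A worked example in the spirit of the article (the pedigree and numbers are the classic textbook case, not necessarily the authors'): a woman whose brother has an X-linked recessive condition has prior carrier probability 1/2; three unaffected sons update this via Bayes' formula to 1/9, whereas a naive "classical" reading would leave it at 1/2.

```python
from fractions import Fraction

# Classic pedigree calculation (assumed example): the woman's mother is an
# obligate carrier, so the woman's prior carrier probability is 1/2. Each
# son of a carrier is unaffected with probability 1/2; sons of a
# non-carrier are always unaffected. Three unaffected sons are observed.

prior_carrier = Fraction(1, 2)
lik_carrier = Fraction(1, 2) ** 3       # three unaffected sons, if carrier
lik_noncarrier = Fraction(1, 1)         # certain, if not a carrier

posterior = (prior_carrier * lik_carrier) / (
    prior_carrier * lik_carrier + (1 - prior_carrier) * lik_noncarrier)
print(posterior)   # 1/9
```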

  16. A Bayesian joint model of menstrual cycle length and fecundity.

    PubMed

    Lum, Kirsten J; Sundaram, Rajeshwari; Buck Louis, Germaine M; Louis, Thomas A

    2016-03-01

    Menstrual cycle length (MCL) has been shown to play an important role in couple fecundity, which is the biologic capacity for reproduction irrespective of pregnancy intentions. However, a comprehensive assessment of its role requires a fecundity model that accounts for male and female attributes and the couple's intercourse pattern relative to the ovulation day. To this end, we employ a Bayesian joint model for MCL and pregnancy. MCLs follow a scale multiplied (accelerated) mixture model with Gaussian and Gumbel components; the pregnancy model includes MCL as a covariate and computes the cycle-specific probability of pregnancy in a menstrual cycle conditional on the pattern of intercourse and no previous fertilization. Day-specific fertilization probability is modeled using natural cubic splines. We analyze data from the Longitudinal Investigation of Fertility and the Environment Study (the LIFE Study), a couple-based prospective pregnancy study, and find a statistically significant quadratic relation between fecundity and menstrual cycle length, after adjustment for intercourse pattern and other attributes, including male semen quality, both partners' ages, and active smoking status (determined by a baseline cotinine level above 100 ng/mL). We compare results to those produced by a more basic model and show the advantages of a more comprehensive approach. PMID:26295923

  17. Bayesian Analysis of Underground Flooding

    NASA Astrophysics Data System (ADS)

    Bogardi, Istvan; Duckstein, Lucien; Szidarovszky, Ferenc

    1982-08-01

    An event-based stochastic model is used to describe the spatial phenomenon of water inrush into underground works located under a karstic aquifer, and a Bayesian analysis is performed because of high parameter uncertainty. The random variables of the model are inrush yield per event, distance between events, number of events per unit underground space, maximum yield, and total yield over mine lifetime. Physically based hypotheses on the types of distributions are made and reinforced by observations. High parameter uncertainty stems from the random characteristics of karstic limestone and the limited amount of observation data. Thus, during the design stage, only indirect data such as regional information and geological analogies are available; updating of this information should then be done as the construction progresses and inrush events are observed and recorded. A Bayes simulation algorithm is developed and applied to estimate the probability distributions of inrush event characteristics used in the design of water control facilities in underground mining. A real-life example in the Transdanubian region of Hungary is used to illustrate the methodology.

  18. Bayesian analysis of volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Ho, Chih-Hsiang

    1990-10-01

    The simple Poisson model generally gives a good fit to many volcanoes for volcanic eruption forecasting. Nonetheless, empirical evidence suggests that volcanic activity in successive equal time-periods tends to be more variable than a simple Poisson with constant eruptive rate. An alternative model is therefore examined in which the eruptive rate (λ) for a given volcano or cluster(s) of volcanoes is described by a gamma distribution (prior) rather than treated as a constant value as in the assumptions of a simple Poisson model. Bayesian analysis is performed to link the two distributions together to give the aggregate behavior of the volcanic activity. When the Poisson process is expanded to accommodate a gamma mixing distribution on λ, a consequence of this mixed (or compound) Poisson model is that the frequency distribution of eruptions in any given time-period of equal length follows the negative binomial distribution (NBD). Applications of the proposed model and comparisons between the generalized model and the simple Poisson model are discussed based on the historical eruptive count data of volcanoes Mauna Loa (Hawaii) and Etna (Italy). Several relevant facts lead to the conclusion that the generalized model is preferable for practical use both in space and time.
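
    A minimal numeric sketch of the gamma-Poisson mixture described above (parameter values assumed): mixing a Poisson(λt) count with λ ~ Gamma(α, β) yields a negative binomial marginal with n = α and p = β/(β + t), whose tail is visibly heavier than that of a simple Poisson with the same mean.

```python
import numpy as np
from scipy.stats import nbinom, poisson

# Gamma-Poisson mixture sketch (assumed prior parameters): counts are
# Poisson(lambda * t) with lambda ~ Gamma(alpha, rate beta), so the
# marginal count distribution is NB with n = alpha, p = beta / (beta + t).

alpha, beta, t = 4.0, 2.0, 1.0          # assumed gamma prior; unit period
k = np.arange(10)

nbd = nbinom.pmf(k, alpha, beta / (beta + t))
pois = poisson.pmf(k, alpha / beta * t)  # simple Poisson at the mean rate

print(np.round(nbd, 3))                  # heavier tail ...
print(np.round(pois, 3))                 # ... than the simple Poisson
```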

  19. Jointly modeling time-to-event and longitudinal data: A Bayesian approach.

    PubMed

    Huang, Yangxin; Hu, X Joan; Dagne, Getachew A

    2014-03-01

    This article explores Bayesian joint models of event times and longitudinal measures with an attempt to overcome departures from normality of the longitudinal response, measurement errors, and shortages of confidence in specifying a parametric time-to-event model. We allow the longitudinal response to have a skew distribution in the presence of measurement errors, and assume the time-to-event variable to have a nonparametric prior distribution. Posterior distributions of the parameters are attained simultaneously for inference based on the Bayesian approach. An example from a recent AIDS clinical trial illustrates the methodology by jointly modeling the viral dynamics and the time to decrease in CD4/CD8 ratio in the presence of CD4 counts with measurement errors, and by comparing potential models with various scenarios and different distribution specifications. The analysis outcome indicates that the time-varying CD4 covariate is closely related to the first-phase viral decay rate, but the time to CD4/CD8 decrease is not highly associated with either the two viral decay rates or the CD4 changing rate over time. These findings may provide some quantitative guidance to better understand the relationship of the virological and immunological responses to antiretroviral treatments. PMID:24611039

  20. Joint Bayesian Component Separation and CMB Power Spectrum Estimation

    NASA Technical Reports Server (NTRS)

    Eriksen, H. K.; Jewell, J. B.; Dickinson, C.; Banday, A. J.; Gorski, K. M.; Lawrence, C. R.

    2008-01-01

    We describe and implement an exact, flexible, and computationally efficient algorithm for joint component separation and CMB power spectrum estimation, building on a Gibbs sampling framework. Two essential new features are (1) conditional sampling of foreground spectral parameters and (2) joint sampling of all amplitude-type degrees of freedom (e.g., CMB, foreground pixel amplitudes, and global template amplitudes) given spectral parameters. Given a parametric model of the foreground signals, we estimate efficiently and accurately the exact joint foreground-CMB posterior distribution and, therefore, all marginal distributions such as the CMB power spectrum or foreground spectral index posteriors. The main limitation of the current implementation is the requirement of identical beam responses at all frequencies, which restricts the analysis to the lowest resolution of a given experiment. We outline a future generalization to multiresolution observations. To verify the method, we analyze simple models and compare the results to analytical predictions. We then analyze a realistic simulation with properties similar to the 3 yr WMAP data, downgraded to a common resolution of 3 deg FWHM. The results from the actual 3 yr WMAP temperature analysis are presented in a companion Letter.
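
    A one-dimensional toy of the Gibbs scheme sketched above (our illustration, not the authors' pipeline): alternate between (1) drawing the signal given its power and the data, the amplitude-type step, and (2) drawing the power given the signal realization, here with a flat prior so the conditional is inverse-gamma.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy Gibbs sampler (assumed one-dimensional model): data = signal + noise
# with signal_i ~ N(0, S) and noise_i ~ N(0, N_var). Alternate
# (1) signal | S, d  -- Wiener-filter mean plus fluctuation -- and
# (2) S | signal     -- inverse-gamma draw under a flat prior on S.

n, N_var, S_true = 500, 1.0, 4.0
d = rng.normal(0, np.sqrt(S_true), n) + rng.normal(0, np.sqrt(N_var), n)

S, samples = 1.0, []
for it in range(3000):
    var = 1.0 / (1.0 / S + 1.0 / N_var)            # conditional variance
    s = rng.normal(var * d / N_var, np.sqrt(var))  # (1) sample the signal
    S = np.sum(s * s) / rng.chisquare(n - 2)       # (2) sample its power
    if it >= 500:
        samples.append(S)

print(np.mean(samples))   # fluctuates around the true signal power, 4.0
```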

  21. Bayesian analysis for kaon photoproduction

    SciTech Connect

    Marsainy, T.; Mart, T.

    2014-09-25

    We have investigated the contribution of the nucleon resonances in the kaon photoproduction process by using an established statistical decision-making method, i.e. the Bayesian method. This method not only evaluates the model over its entire parameter space, but also takes the prior information and experimental data into account. The result indicates that certain resonances have larger probabilities to contribute to the process.

  22. An Integrated Bayesian Model for DIF Analysis

    ERIC Educational Resources Information Center

    Soares, Tufi M.; Goncalves, Flavio B.; Gamerman, Dani

    2009-01-01

    In this article, an integrated Bayesian model for differential item functioning (DIF) analysis is proposed. The model is integrated in the sense of modeling the responses along with the DIF analysis. This approach allows DIF detection and explanation in a simultaneous setup. Previous empirical studies and/or subjective beliefs about the item…

  23. Heterogeneous Factor Analysis Models: A Bayesian Approach.

    ERIC Educational Resources Information Center

    Ansari, Asim; Jedidi, Kamel; Dube, Laurette

    2002-01-01

    Developed Markov Chain Monte Carlo procedures to perform Bayesian inference, model checking, and model comparison in heterogeneous factor analysis. Tested the approach with synthetic data and data from a consumption emotion study involving 54 consumers. Results show that traditional psychometric methods cannot fully capture the heterogeneity in…

  24. hiHMM: Bayesian non-parametric joint inference of chromatin state maps

    PubMed Central

    Sohn, Kyung-Ah; Ho, Joshua W. K.; Djordjevic, Djordje; Jeong, Hyun-hwan; Park, Peter J.; Kim, Ju Han

    2015-01-01

    Motivation: Genome-wide mapping of chromatin states is essential for defining regulatory elements and inferring their activities in eukaryotic genomes. A number of hidden Markov model (HMM)-based methods have been developed to infer chromatin state maps from genome-wide histone modification data for an individual genome. To perform a principled comparison of evolutionarily distant epigenomes, we must consider species-specific biases such as differences in genome size, strength of signal enrichment and co-occurrence patterns of histone modifications. Results: Here, we present a new Bayesian non-parametric method called hierarchically linked infinite HMM (hiHMM) to jointly infer chromatin state maps in multiple genomes (different species, cell types and developmental stages) using genome-wide histone modification data. This flexible framework provides a new way to learn a consistent definition of chromatin states across multiple genomes, thus facilitating a direct comparison among them. We demonstrate the utility of this method using synthetic data as well as multiple modENCODE ChIP-seq datasets. Conclusion: The hierarchical and Bayesian non-parametric formulation in our approach is an important extension to the current set of methodologies for comparative chromatin landscape analysis. Availability and implementation: Source codes are available at https://github.com/kasohn/hiHMM. Chromatin data are available at http://encode-x.med.harvard.edu/data_sets/chromatin/. Contact: peter_park@harvard.edu or juhan@snu.ac.kr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25725496

  25. Bayesian and Frequentist Methods for Estimating Joint Uncertainty of Freundlich Adsorption Isotherm Fitting Parameters

    EPA Science Inventory

    In this paper, we present methods for estimating Freundlich isotherm fitting parameters (K and N) and their joint uncertainty, which have been implemented into the freeware software platforms R and WinBUGS. These estimates were determined by both Frequentist and Bayesian analyse...
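
    A hedged sketch of both estimation routes (synthetic data; the work above used R and WinBUGS, so this Python version is only illustrative): the Freundlich isotherm q = K·C^N is fitted by nonlinear least squares and, for the joint posterior of (K, N), by a short random-walk Metropolis sampler with lognormal errors.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)

# Freundlich isotherm sketch (assumed values): sorbed concentration
# q = K * C**N at aqueous concentration C.

def freundlich(C, K, N):
    return K * C ** N

C = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
q = freundlich(C, 4.0, 0.6) * np.exp(rng.normal(0, 0.05, C.size))

# Frequentist route: nonlinear least squares.
(K_hat, N_hat), pcov = curve_fit(freundlich, C, q, p0=(1.0, 1.0))

# Bayesian route: random-walk Metropolis on (log K, N) with flat priors
# and lognormal measurement errors; yields the joint posterior of (K, N).
def logpost(logK, N, s=0.05):
    resid = np.log(q) - np.log(freundlich(C, np.exp(logK), N))
    return -0.5 * np.sum(resid ** 2) / s ** 2

x, lp = np.array([np.log(K_hat), N_hat]), logpost(np.log(K_hat), N_hat)
draws = []
for it in range(20_000):
    prop = x + rng.normal(0, 0.02, 2)
    lp_prop = logpost(prop[0], prop[1])
    if np.log(rng.random()) < lp_prop - lp:
        x, lp = prop, lp_prop
    draws.append(x)
draws = np.array(draws)[5_000:]
print(K_hat, N_hat, np.exp(draws[:, 0]).mean(), draws[:, 1].mean())
```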

  26. A SAS Interface for Bayesian Analysis with WinBUGS

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; McArdle, John J.; Wang, Lijuan; Hamagami, Fumiaki

    2008-01-01

    Bayesian methods are becoming very popular despite some practical difficulties in implementation. To assist in the practical application of Bayesian methods, we show how to implement Bayesian analysis with WinBUGS as part of a standard set of SAS routines. This implementation procedure is first illustrated by fitting a multiple regression model…

  27. A note on variational Bayesian factor analysis.

    PubMed

    Zhao, Jian-hua; Yu, Philip L H

    2009-09-01

    Existing works on the variational Bayesian (VB) treatment of the factor analysis (FA) model, such as [Ghahramani, Z., & Beal, M. (2000). Variational inference for Bayesian mixture of factor analysers. In Advances in neural information processing systems. Cambridge, MA: MIT Press; Nielsen, F. B. (2004). Variational approach to factor analysis and related models. Master's thesis, The Institute of Informatics and Mathematical Modelling, Technical University of Denmark.], are found theoretically and empirically to suffer from two problems: (1) they penalize the model more heavily than BIC, and (2) they perform unsatisfactorily in low-noise cases, as redundant factors cannot be effectively suppressed. A novel VB treatment is proposed in this paper to resolve the two problems, and a simulation study is conducted to verify its improved performance over existing treatments. PMID:19135337

  28. Semiparametric Bayesian inference on skew-normal joint modeling of multivariate longitudinal and survival data.

    PubMed

    Tang, An-Min; Tang, Nian-Sheng

    2015-02-28

    We propose a semiparametric multivariate skew-normal joint model for multivariate longitudinal and multivariate survival data. One main feature of the posited model is that we relax the commonly used normality assumption for random effects and within-subject error by using a centered Dirichlet process prior to specify the random effects distribution and using a multivariate skew-normal distribution to specify the within-subject error distribution and model trajectory functions of longitudinal responses semiparametrically. A Bayesian approach is proposed to simultaneously obtain Bayesian estimates of unknown parameters, random effects and nonparametric functions by combining the Gibbs sampler and the Metropolis-Hastings algorithm. Particularly, a Bayesian local influence approach is developed to assess the effect of minor perturbations to within-subject measurement error and random effects. Several simulation studies and an example are presented to illustrate the proposed methodologies. PMID:25404574

  29. Bayesian Analysis of Individual Level Personality Dynamics

    PubMed Central

    Cripps, Edward; Wood, Robert E.; Beckmann, Nadin; Lau, John; Beckmann, Jens F.; Cripps, Sally Ann

    2016-01-01

    A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques. PMID:27486415

  30. Bayesian model selection analysis of WMAP3

    SciTech Connect

    Parkinson, David; Mukherjee, Pia; Liddle, Andrew R.

    2006-06-15

    We present a Bayesian model selection analysis of WMAP3 data using our code CosmoNest. We focus on the density perturbation spectral index n_S and the tensor-to-scalar ratio r, which define the plane of slow-roll inflationary models. We find that while the Bayesian evidence supports the conclusion that n_S ≠ 1, the data are not yet powerful enough to do so at a strong or decisive level. If tensors are assumed absent, the current odds are approximately 8 to 1 in favor of n_S ≠ 1 under our assumptions, when WMAP3 data is used together with external data sets. WMAP3 data on its own is unable to distinguish between the two models. Further, inclusion of r as a parameter weakens the conclusion against the Harrison-Zel'dovich case (n_S = 1, r = 0), albeit in a prior-dependent way. In appendices we describe the CosmoNest code in detail, noting its ability to supply posterior samples as well as to accurately compute the Bayesian evidence. We make a first public release of CosmoNest, now available at www.cosmonest.org.

  31. Bayesian Approach to the Joint Inversion of Gravity and Magnetic Data, with Application to the Ismenius Area of Mars

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey B.; Raymond, C.; Smrekar, S.; Millbury, C.

    2004-01-01

    This viewgraph presentation reviews a Bayesian approach to the inversion of gravity and magnetic data with specific application to the Ismenius Area of Mars. Many inverse problems encountered in geophysics and planetary science are well known to be non-unique (e.g. inversion of gravity for the density structure of a body). In hopes of reducing the non-uniqueness of solutions, there has been interest in the joint analysis of data. An example is the joint inversion of gravity and magnetic data, with the assumption that the same physical anomalies generate both the observed magnetic and gravitational anomalies. In this talk, we formulate the joint analysis of different types of data in a Bayesian framework and apply the formalism to the inference of the density and remanent magnetization structure for a local region in the Ismenius area of Mars. The Bayesian approach allows prior information or constraints on the solutions to be incorporated in the inversion, with the "best" solutions those whose forward predictions most closely match the data while remaining consistent with assumed constraints. The application of this framework to the inversion of gravity and magnetic data on Mars reveals two typical challenges - the forward predictions of the data have a linear dependence on some of the quantities of interest, and non-linear dependence on others (termed the "linear" and "non-linear" variables, respectively). For observations with Gaussian noise, a Bayesian approach to inversion for "linear" variables reduces to a linear filtering problem, with an explicitly computable "error" matrix. However, for models whose forward predictions have non-linear dependencies, inference is no longer given by such a simple linear problem, and moreover, the uncertainty in the solution is no longer completely specified by a computable "error matrix". It is therefore important to develop methods for sampling from the full Bayesian posterior to provide a complete and statistically consistent
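
    A minimal sketch of the "linear variables" case described above (our notation, not the presentation's code): with a linear forward model, a Gaussian prior and Gaussian noise, the posterior mean and error matrix are available in closed form, which is exactly the linear filtering problem mentioned.

```python
import numpy as np

rng = np.random.default_rng(5)

# Bayesian linear-Gaussian inversion sketch (assumed operator and
# covariances): forward model d = A m + noise, prior m ~ N(m0, P),
# noise ~ N(0, C). The posterior is Gaussian with closed-form mean and
# "error" (covariance) matrix.

A = rng.standard_normal((20, 5))          # assumed linear forward operator
P = np.eye(5)                             # prior covariance of the model
C = 0.1 * np.eye(20)                      # data noise covariance
m0 = np.zeros(5)
m_true = rng.standard_normal(5)
d = A @ m_true + rng.multivariate_normal(np.zeros(20), C)

K = P @ A.T @ np.linalg.inv(A @ P @ A.T + C)   # gain ("filter") matrix
m_post = m0 + K @ (d - A @ m0)                 # posterior mean
cov_post = (np.eye(5) - K @ A) @ P             # posterior error matrix

print(np.round(m_post - m_true, 2), np.round(np.diag(cov_post), 3))
```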

  32. A Bayesian mixture of semiparametric mixed-effects joint models for skewed-longitudinal and time-to-event data.

    PubMed

    Chen, Jiaqing; Huang, Yangxin

    2015-09-10

    In longitudinal studies, it is of interest to investigate how repeatedly measured markers in time are associated with the time to an event of interest, while the repeated measurements are often observed with the features of a heterogeneous population, non-normality, and covariates measured with error, owing to their longitudinal nature. Statistical analysis may become dramatically more complicated when one analyzes longitudinal-survival data with these features together. Recently, a mixture of skewed distributions has received increasing attention in the treatment of heterogeneous data involving asymmetric behaviors across subclasses, but relatively few studies simultaneously accommodate heterogeneity, non-normality, and covariate measurement error in the longitudinal-survival data setting. Under the umbrella of Bayesian inference, this article explores a finite mixture of semiparametric mixed-effects joint models with skewed distributions for longitudinal measures with an attempt to mediate homogeneous characteristics, adjust departures from normality, and tailor accuracy from measurement error in covariates, as well as overcome shortages of confidence in specifying a time-to-event model. The Bayesian mixture of joint modeling offers an appropriate avenue to estimate not only all parameters of the mixture joint models but also the probabilities of class membership. Simulation studies are conducted to assess the performance of the proposed method, and a real example is analyzed to demonstrate the methodology. The results are reported by comparing potential models under various scenarios. PMID:25924891

  33. Bayesian PET image reconstruction incorporating anato-functional joint entropy

    NASA Astrophysics Data System (ADS)

    Tang, Jing; Rahmim, Arman

    2009-12-01

    We developed a maximum a posteriori (MAP) reconstruction method for positron emission tomography (PET) image reconstruction incorporating magnetic resonance (MR) image information, with the joint entropy between the PET and MR image features serving as the regularization constraint. A non-parametric method was used to estimate the joint probability density of the PET and MR images. Using realistically simulated PET and MR human brain phantoms, the quantitative performance of the proposed algorithm was investigated. Incorporation of the anatomic information via this technique, after parameter optimization, was seen to dramatically improve the noise versus bias tradeoff in every region of interest, compared to the result from using conventional MAP reconstruction. In particular, hot lesions in the FDG PET image, which had no anatomical correspondence in the MR image, also had an improved contrast versus noise tradeoff.
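
    A minimal sketch of the joint-entropy regularizer (our histogram-based stand-in for the authors' non-parametric density estimate): compute H(PET, MR) from the joint intensity histogram; a MAP objective penalizes high joint entropy between the PET estimate and the MR image, favoring reconstructions whose intensities co-vary with the anatomy.

```python
import numpy as np

rng = np.random.default_rng(6)

# Joint entropy of two images from their joint intensity histogram
# (a simple stand-in for the paper's non-parametric density estimate).

def joint_entropy(img1, img2, bins=32):
    h, _, _ = np.histogram2d(img1.ravel(), img2.ravel(), bins=bins)
    p = h / h.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

mr = rng.random((64, 64))                  # stand-ins for real images
pet_aligned = 2.0 * mr + rng.normal(0, 0.05, mr.shape)
pet_unrelated = rng.random((64, 64))

print(joint_entropy(pet_aligned, mr))      # lower joint entropy ...
print(joint_entropy(pet_unrelated, mr))    # ... than an unrelated image
```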

  34. A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research

    PubMed Central

    van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B; Neyer, Franz J; van Aken, Marcel AG

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are introduced using a simplified example. Thereafter, the advantages and pitfalls of the specification of prior knowledge are discussed. To illustrate Bayesian methods explained in this study, in a second example a series of studies that examine the theoretical framework of dynamic interactionism are considered. In the Discussion the advantages and disadvantages of using Bayesian statistics are reviewed, and guidelines on how to report on Bayesian statistics are provided. PMID:24116396

  35. Bayesian Nonparametric Models for Multiway Data Analysis.

    PubMed

    Xu, Zenglin; Yan, Feng; Qi, Yuan

    2015-02-01

    Tensor decomposition is a powerful computational tool for multiway data analysis. Many popular tensor decomposition approaches, such as the Tucker decomposition and CANDECOMP/PARAFAC (CP), amount to multi-linear factorization. They are insufficient to model (i) complex interactions between data entities, (ii) various data types (e.g., missing data and binary data), and (iii) noisy observations and outliers. To address these issues, we propose tensor-variate latent nonparametric Bayesian models for multiway data analysis. We name these models InfTucker. These new models essentially conduct Tucker decomposition in an infinite feature space. Unlike classical tensor decomposition models, our new approaches handle both continuous and binary data in a probabilistic framework. Unlike previous Bayesian models on matrices and tensors, our models are based on latent Gaussian or t processes with nonlinear covariance functions. Moreover, on network data, our models reduce to nonparametric stochastic blockmodels and can be used to discover latent groups and predict missing interactions. To learn the models efficiently from data, we develop a variational inference technique and explore properties of the Kronecker product for computational efficiency. Compared with a classical variational implementation, this technique reduces both time and space complexities by several orders of magnitude. On real multiway and network data, our new models achieved significantly higher prediction accuracy than state-of-the-art tensor decomposition methods and blockmodels. PMID:26353255

  36. Bayesian analysis of factors associated with fibromyalgia syndrome subjects

    NASA Astrophysics Data System (ADS)

    Jayawardana, Veroni; Mondal, Sumona; Russek, Leslie

    2015-01-01

    Factors contributing to movement-related fear in subjects with fibromyalgia (FM) were assessed by Russek et al. (2014), based on data collected by a national internet survey of community-based individuals. The study focused on the variables Activities-Specific Balance Confidence scale (ABC), Primary Care Post-Traumatic Stress Disorder screen (PC-PTSD), Tampa Scale of Kinesiophobia (TSK), a Joint Hypermobility Syndrome screen (JHS), Vertigo Symptom Scale (VSS-SF), Obsessive-Compulsive Personality Disorder (OCPD), and pain, work status and physical activity drawn from the "Revised Fibromyalgia Impact Questionnaire" (FIQR). The study presented in this paper revisits the same data with a Bayesian analysis in which appropriate priors are introduced for the variables selected in Russek's paper.

  37. Multimodel Bayesian analysis of groundwater data worth

    NASA Astrophysics Data System (ADS)

    Xue, Liang; Zhang, Dongxiao; Guadagnini, Alberto; Neuman, Shlomo P.

    2014-11-01

    We explore the way in which uncertain descriptions of aquifer heterogeneity and groundwater flow impact one's ability to assess the worth of collecting additional data. We do so on the basis of Maximum Likelihood Bayesian Model Averaging (MLBMA) by accounting jointly for uncertainties in geostatistical and flow model structures and parameter (hydraulic conductivity) as well as system state (hydraulic head) estimates, given uncertain measurements of one or both variables. A previous description of our approach was limited to geostatistical models based solely on hydraulic conductivity data. Here we implement the approach on a synthetic example of steady state flow in a two-dimensional random log hydraulic conductivity field with and without recharge by embedding an inverse stochastic moment solution of groundwater flow in MLBMA. A moment-equations-based geostatistical inversion method is utilized to circumvent the need for computationally expensive numerical Monte Carlo simulations. The approach is compatible with either deterministic or stochastic flow models and consistent with modern statistical methods of parameter estimation, admitting but not requiring prior information about the parameters. It allows but does not require approximating leading predictive statistical moments of system states by linearization while updating model posterior probabilities and parameter estimates on the basis of potential new data both before and after such data are actually collected.
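
    A minimal sketch of the MLBMA weighting step (numbers assumed): each model M_k receives a posterior probability proportional to its prior times exp(-ΔKIC_k/2), where KIC is the Kashyap information criterion, and predictions are then averaged under these weights.

```python
import numpy as np

# MLBMA-style model averaging sketch (all numbers assumed): posterior
# model weights from Kashyap information criterion (KIC) values and
# priors, then a model-averaged prediction.

KIC = np.array([103.2, 105.9, 110.4])     # assumed KIC of three models
prior = np.array([1/3, 1/3, 1/3])

d = KIC - KIC.min()                       # work with KIC differences
w = prior * np.exp(-0.5 * d)
w /= w.sum()

pred = np.array([1.10, 1.25, 0.90])       # each model's prediction (assumed)
print(w, float(w @ pred))                 # posterior weights and BMA mean
```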

  38. Optimal sequential Bayesian analysis for degradation tests.

    PubMed

    Rodríguez-Narciso, Silvia; Christen, J Andrés

    2016-07-01

    Degradation tests are especially difficult to conduct for items with high reliability. Test costs, caused mainly by prolonged item duration and item destruction costs, establish the necessity of sequential degradation test designs. We propose a methodology that sequentially selects the optimal observation times to measure the degradation, using a convenient rule that maximizes the inference precision and minimizes test costs. In particular our objective is to estimate a quantile of the time to failure distribution, where the degradation process is modelled as a linear model using Bayesian inference. The proposed sequential analysis is based on an index that measures the expected discrepancy between the estimated quantile and its corresponding prediction, using Monte Carlo methods. The procedure was successfully implemented for simulated and real data. PMID:26307336

  39. A Hierarchical Bayesian Procedure for Two-Mode Cluster Analysis

    ERIC Educational Resources Information Center

    DeSarbo, Wayne S.; Fong, Duncan K. H.; Liechty, John; Saxton, M. Kim

    2004-01-01

    This manuscript introduces a new Bayesian finite mixture methodology for the joint clustering of row and column stimuli/objects associated with two-mode asymmetric proximity, dominance, or profile data. That is, common clusters are derived which partition both the row and column stimuli/objects simultaneously into the same derived set of clusters.…

  40. Bayesian analysis on gravitational waves and exoplanets

    NASA Astrophysics Data System (ADS)

    Deng, Xihao

    Attempts to detect gravitational waves using a pulsar timing array (PTA), i.e., a collection of pulsars in our Galaxy, have become more organized over the last several years. PTAs act to detect gravitational waves generated from very distant sources by observing the small and correlated effect the waves have on pulse arrival times at the Earth. In this thesis, I present advanced Bayesian analysis methods that can be used to search for gravitational waves in pulsar timing data. These methods were also applied to analyze a set of radial velocity (RV) data collected by the Hobby-Eberly Telescope on observing a K0 giant star. They confirmed the presence of two Jupiter-mass planets around a K0 giant star and also characterized the stellar p-mode oscillation. The first part of the thesis investigates the effect of wavefront curvature on a pulsar's response to a gravitational wave. In it we show that we can assume the gravitational wave phasefront is planar across the array only if the source luminosity distance ≫ 2πL²/λ, where L is the pulsar distance to the Earth (~kpc) and λ is the radiation wavelength (~pc) in the PTA waveband. Correspondingly, for a point gravitational wave source closer than ~100 Mpc, we should take into account the effect of wavefront curvature across the pulsar-Earth line of sight, which depends on the luminosity distance to the source, when evaluating the pulsar timing response. As a consequence, if a PTA can detect a gravitational wave from a source closer than ~100 Mpc, the effects of wavefront curvature on the response allow us to determine the source luminosity distance. The second and third parts of the thesis propose a new analysis method based on Bayesian nonparametric regression to search for gravitational wave bursts and a gravitational wave background in PTA data. Unlike the conventional Bayesian analysis that introduces a signal model with a fixed number of parameters, Bayesian nonparametric regression sets

  5. Evaluation of a Partial Genome Screening of Two Asthma Susceptibility Regions Using Bayesian Network Based Bayesian Multilevel Analysis of Relevance

    PubMed Central

    Antal, Péter; Kiszel, Petra Sz.; Gézsi, András; Hadadi, Éva; Virág, Viktor; Hajós, Gergely; Millinghoffer, András; Nagy, Adrienne; Kiss, András; Semsei, Ágnes F.; Temesi, Gergely; Melegh, Béla; Kisfali, Péter; Széll, Márta; Bikov, András; Gálffy, Gabriella; Tamási, Lilla; Falus, András; Szalai, Csaba

    2012-01-01

    Genetic studies indicate a high number of potential factors related to asthma. Based on earlier linkage analyses, we selected the 11q13 and 14q22 asthma susceptibility regions, for which we designed a partial genome screening study using 145 SNPs in 1201 individuals (436 asthmatic children and 765 controls). The results were evaluated with traditional frequentist methods and we applied a new statistical method, called Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA). This method uses Bayesian network representation to provide detailed characterization of the relevance of factors, such as joint significance, the type of dependency, and multi-target aspects. We estimated posteriors for these relations within the Bayesian statistical framework, in order to assess whether a variable is directly relevant or its association is only mediated. With frequentist methods one SNP (rs3751464 in the FRMD6 gene) provided evidence for an association with asthma (OR = 1.43(1.2–1.8); p = 3×10−4). The possible role of the FRMD6 gene in asthma was also confirmed in an animal model and human asthmatics. In the BN-BMLA analysis altogether 5 SNPs in 4 genes were found relevant in connection with the asthma phenotype: PRPF19 on chromosome 11, and FRMD6, PTGER2 and PTGDR on chromosome 14. In a subsequent step a partial dataset containing rhinitis and further clinical parameters was used, which allowed the analysis of relevance of SNPs for asthma and multiple targets. These analyses suggested that SNPs in the AHNAK and MS4A2 genes were indirectly associated with asthma. This paper indicates that BN-BMLA explores the relevant factors more comprehensively than traditional statistical methods and extends the scope of strong-relevance-based methods to include partial relevance, global characterization of relevance and multi-target relevance. PMID:22432035

  6. A Bayesian Hierarchical Approach to Regional Frequency Analysis of Extremes

    NASA Astrophysics Data System (ADS)

    Renard, B.

    2010-12-01

    Rainfall and runoff frequency analysis is a major issue for the hydrological community. The distribution of hydrological extremes varies in space and possibly in time. Describing and understanding this spatiotemporal variability are primary challenges to improve hazard quantification and risk assessment. This presentation proposes a general approach based on a Bayesian hierarchical model, following previous work by Cooley et al. [2007], Micevski [2007], Aryal et al. [2009] or Lima and Lall [2009; 2010]. Such a hierarchical model is made up of two levels: (1) a data level modeling the distribution of observations, and (2) a process level describing the fluctuation of the distribution parameters in space and possibly in time. At the first level of the model, at-site data (e.g., annual maxima series) are modeled with a chosen distribution (e.g., a GEV distribution). Since data from several sites are considered, the joint distribution of a vector of (spatial) observations needs to be derived. This is challenging because data are in general not spatially independent, especially for nearby sites. An elliptical copula is therefore used to formally account for spatial dependence between at-site data. This choice might be questionable in the context of extreme value distributions. However, it is motivated by its applicability in highly dimensional spatial problems, where the joint pdf of a vector of n observations is required to derive the likelihood function (with n possibly amounting to hundreds of sites). At the second level of the model, parameters of the chosen at-site distribution are then modeled by a Gaussian spatial process, whose mean may depend on covariates (e.g. elevation, distance to sea, weather pattern, time). In particular, this spatial process allows estimating parameters at ungauged sites, and deriving the predictive distribution of rainfall/runoff at every pixel/catchment of the studied domain. An application to extreme rainfall series from the French

  7. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.

  8. Transdimensional Bayesian Joint Inversion of Complementary Seismic Observables with Realistic Data Uncertainties

    NASA Astrophysics Data System (ADS)

    Gao, C.; Lekic, V.

    2014-12-01

    Due to their different and complementary sensitivities to structure, multiple seismic observables are often combined to image the Earth's deep interior. We use a reversible jump Markov chain Monte Carlo (rjMCMC) algorithm to incorporate surface wave dispersion, particle motion ellipticity (HZ ratio), and receiver functions into transdimensional, Bayesian inversion for the profiles of shear velocity (Vs), compressional velocity (Vp), and density beneath a seismic station. While traditional inversion approaches seek a single best-fit model, a Bayesian approach yields an ensemble of models, allowing us to fully quantify uncertainty and trade-offs between model parameters. Furthermore, we show that by treating the number of model parameters as an unknown to be estimated from the data, we both eliminate the need for a fixed parameterization based on prior information, and obtain better model estimates with reduced trade-offs. Optimal weighting of disparate datasets is paramount for maximizing the resolving power of joint inversions. In a Bayesian framework, data uncertainty directly determines the variance of the model posterior probability distribution; therefore, characteristics of the uncertainties on the observables become even more important in the inversion (Bodin et al., 2011). To properly account for the noise characteristics of the different seismic observables, we compute covariance matrices of data errors for each data type by generating realistic synthetic noise using noise covariance matrices computed from thousands of noise samples, and then measuring the seismic observables of interest from synthetic waveforms contaminated by many different realizations of noise. We find large non-diagonal terms in the covariance matrices for different data types, indicating that typical assumptions of uncorrelated data errors are unjustified. We quantify how the use of realistic data covariance matrices in the joint inversion affects the retrieval of seismic structure under
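
    A minimal sketch of the noise-covariance estimation step described above: generate many synthetic noise realizations, measure the observables on each, and form the empirical covariance of the measurement errors. The AR(1) noise model and the band-average "observables" are simplified stand-ins for the paper's seismic noise and measured quantities:

      import numpy as np

      rng = np.random.default_rng(0)

      # Stand-in noise model: many AR(1) time-domain noise realizations.
      n_samples, n_t, phi = 5000, 256, 0.8
      eps = rng.standard_normal((n_samples, n_t))
      noise = np.zeros_like(eps)
      for i in range(1, n_t):
          noise[:, i] = phi * noise[:, i - 1] + eps[:, i]

      # "Measure" a set of observables on every realization (here, simple
      # band averages stand in for dispersion/ellipticity measurements)...
      bands = noise.reshape(n_samples, 8, n_t // 8).mean(axis=2)

      # ...and form the empirical covariance matrix of the measurement
      # errors.  Sizeable off-diagonal terms mean the usual assumption of
      # uncorrelated data errors would be unjustified.
      C = np.cov(bands, rowvar=False)
      off = C - np.diag(np.diag(C))
      print("max |off-diagonal| / max diagonal: %.2f"
            % (np.abs(off).max() / np.diag(C).max()))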

  9. Bayesian data analysis in population ecology: motivations, methods, and benefits

    USGS Publications Warehouse

    Dorazio, Robert

    2016-01-01

    During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.

  10. Learning Bayesian networks from big meteorological spatial datasets. An alternative to complex network analysis

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Jose Manuel; San Martín, Daniel; Herrera, Sixto; Santiago Cofiño, Antonio

    2016-04-01

    The growing availability of spatial datasets (observations, reanalysis, and regional and global climate models) demands efficient multivariate spatial modeling techniques for many problems of interest (e.g. teleconnection analysis, multi-site downscaling, etc.). Complex networks have recently been applied in this context using graphs built from pairwise correlations between the different stations (or grid boxes) forming the dataset. However, this analysis does not take into account the full dependence structure underlying the data, given by all possible marginal and conditional dependencies among the stations, and does not allow a probabilistic analysis of the dataset. In this talk we introduce Bayesian networks as an alternative data-driven technique for multivariate analysis and modeling, which allows building a joint probability distribution of the stations including all relevant dependencies in the dataset. Bayesian networks are a sound machine learning technique that uses a graph (1) to encode the main dependencies among the variables and (2) to obtain a factorization of the joint probability distribution of the stations given by a reduced number of parameters. For a particular problem, the resulting graph provides a qualitative analysis of the spatial relationships in the dataset (an alternative to complex network analysis), and the resulting model allows for a probabilistic analysis of the dataset. Bayesian networks have been widely applied in many fields, but their use in climate problems is hampered by the large number of variables (stations) involved, since the complexity of the existing algorithms for learning the graphical structure from data grows nonlinearly with the number of variables. In this contribution we present a modified local learning algorithm for Bayesian networks adapted to this problem, which allows inferring the graphical structure for thousands of stations (from observations) and/or gridboxes (from model simulations) thus providing new

  11. Ockham's razor and Bayesian analysis. [statistical theory for systems evaluation

    NASA Technical Reports Server (NTRS)

    Jefferys, William H.; Berger, James O.

    1992-01-01

    'Ockham's razor', the ad hoc principle enjoining the greatest possible simplicity in theoretical explanations, is presently shown to be justifiable as a consequence of Bayesian inference; Bayesian analysis can, moreover, clarify the nature of the 'simplest' hypothesis consistent with the given data. By choosing the prior probabilities of hypotheses, it becomes possible to quantify the scientific judgment that simpler hypotheses are more likely to be correct. Bayesian analysis also shows that a hypothesis with fewer adjustable parameters intrinsically possesses an enhanced posterior probability, due to the clarity of its predictions.
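
    A worked toy example of this automatic Occam effect, assuming Gaussian data with known noise: integrating out the free parameter of the more flexible model spreads its predictions over many possible data sets, so moderately small observed means favour the fixed-parameter hypothesis; all numbers are illustrative:

      import numpy as np
      from scipy.stats import norm

      # Toy setting: n Gaussian measurements with known noise sd; xbar is
      # the sample mean.  M0 fixes the location at 0; M1 gives it a
      # N(0, tau^2) prior.
      xbar, sigma, n = 0.3, 1.0, 10
      se = sigma / np.sqrt(n)

      # Evidence of M0: no free parameter, sharp prediction for xbar.
      ev0 = norm.pdf(xbar, loc=0.0, scale=se)

      # Evidence of M1: the free parameter integrates out, spreading the
      # predictive distribution and diluting the probability of any xbar.
      tau = 3.0
      ev1 = norm.pdf(xbar, loc=0.0, scale=np.sqrt(se**2 + tau**2))

      print("Bayes factor B01 =", ev0 / ev1)   # > 1: simpler model favored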

  12. Enhancing the Modeling of PFOA Pharmacokinetics with Bayesian Analysis

    EPA Science Inventory

    The detail sufficient to describe the pharmacokinetics (PK) for perfluorooctanoic acid (PFOA) and the methods necessary to combine information from multiple data sets are both subjects of ongoing investigation. Bayesian analysis provides tools to accommodate these goals. We exa...

  13. Bayesian Analysis of the Cosmic Microwave Background

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey

    2007-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background. Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. Bayesian formulation of this problem allows consistent treatment of many complexities including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analyses (as demonstrated on WMAP 1 and 3 year temperature and polarization data). Development continues for Planck, where the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to total uncertainty in cosmological parameters.

  14. Bayesian mixture analysis for metagenomic community profiling

    PubMed Central

    Morfopoulou, Sofia; Plagnol, Vincent

    2015-01-01

    Motivation: Deep sequencing of clinical samples is now an established tool for the detection of infectious pathogens, with direct medical applications. The large amount of data generated produces an opportunity to detect species even at very low levels, provided that computational tools can effectively profile the relevant metagenomic communities. Data interpretation is complicated by the fact that short sequencing reads can match multiple organisms and by the lack of completeness of existing databases, in particular for viral pathogens. Here we present metaMix, a Bayesian mixture model framework for resolving complex metagenomic mixtures. We show that the use of parallel Markov chain Monte Carlo (MCMC) sampling for the exploration of the species space enables the identification of the set of species most likely to contribute to the mixture. Results: We demonstrate the greater accuracy of metaMix compared with relevant methods, particularly for profiling complex communities consisting of several related species. We designed metaMix specifically for the analysis of deep transcriptome sequencing datasets, with a focus on viral pathogen detection; however, the principles are generally applicable to all types of metagenomic mixtures. Availability and implementation: metaMix is implemented as a user friendly R package, freely available on CRAN: http://cran.r-project.org/web/packages/metaMix Contact: sofia.morfopoulou.10@ucl.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26002885

  15. Bayesian analysis of the backreaction models

    SciTech Connect

    Kurek, Aleksandra; Bolejko, Krzysztof; Szydlowski, Marek

    2010-03-15

    We present a Bayesian analysis of four different types of backreaction models, which are based on the Buchert equations. In this approach, one considers a solution to the Einstein equations for a general matter distribution and then an average of various observable quantities is taken. Such an approach became of considerable interest when it was shown that it could lead to agreement with observations without resorting to dark energy. In this paper we compare the {Lambda}CDM model and the backreaction models with type Ia supernovae, baryon acoustic oscillations, and cosmic microwave background data, and find that the former is favored. However, the tested models were based on some particular assumptions about the relation between the average spatial curvature and the backreaction, as well as the relation between the curvature and curvature index. In this paper we modified the latter assumption, leaving the former unchanged. We find that, by varying the relation between the curvature and curvature index, we can obtain a better fit. Therefore, some further work is still needed; in particular, the relation between the backreaction and the curvature should be revisited in order to fully determine the feasibility of the backreaction models to mimic dark energy.

  16. Bayesian analysis of a disability model for lung cancer survival.

    PubMed

    Armero, C; Cabras, S; Castellanos, M E; Perra, S; Quirós, A; Oruezábal, M J; Sánchez-Rubio, J

    2016-02-01

    Bayesian reasoning, survival analysis and multi-state models are used to assess survival times for Stage IV non-small-cell lung cancer patients and the evolution of the disease over time. Bayesian estimation is done using minimum informative priors for the Weibull regression survival model, leading to an automatic inferential procedure. Markov chain Monte Carlo methods have been used for approximating posterior distributions and the Bayesian information criterion has been considered for covariate selection. In particular, the posterior distribution of the transition probabilities, resulting from the multi-state model, constitutes a very interesting tool which could be useful to help oncologists and patients make efficient and effective decisions. PMID:22767866

  17. A fully Bayesian method for jointly fitting instrumental calibration and X-ray spectral models

    SciTech Connect

    Xu, Jin; Yu, Yaming; Van Dyk, David A.; Kashyap, Vinay L.; Siemiginowska, Aneta; Drake, Jeremy; Ratzlaff, Pete; Connors, Alanna; Meng, Xiao-Li

    2014-10-20

    Owing to a lack of robust principled methods, systematic instrumental uncertainties have generally been ignored in astrophysical data analysis despite wide recognition of the importance of including them. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. We previously introduced a pragmatic Bayesian method to address this problem. The method is 'pragmatic' in that it introduced an ad hoc technique that simplified computation by neglecting the potential information in the data for narrowing the uncertainty for the calibration product. Following that work, we use a principal component analysis to efficiently represent the uncertainty of the effective area of an X-ray (or γ-ray) telescope. Here, however, we leverage this representation to enable a principled, fully Bayesian method that coherently accounts for the calibration uncertainty in high-energy spectral analysis. In this setting, the method is compared with standard analysis techniques and the pragmatic Bayesian method. The advantage of the fully Bayesian method is that it allows the data to provide information not only for estimation of the source parameters but also for the calibration product—here the effective area, conditional on the adopted spectral model. In this way, it can yield more accurate and efficient estimates of the source parameters along with valid estimates of their uncertainty. Provided that the source spectrum can be accurately described by a parameterized model, this method allows rigorous inference about the effective area by quantifying which possible curves are most consistent with the data.
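
    A minimal sketch of the principal-component representation of calibration uncertainty, using a synthetic ensemble of effective-area curves; the curve shapes, energy grid and component count are illustrative, not actual telescope calibration products:

      import numpy as np

      rng = np.random.default_rng(2)

      # Hypothetical ensemble of calibration products: effective-area
      # curves on a common energy grid (rows = replicates, cols = energies).
      E = np.linspace(0.3, 8.0, 200)
      base = 500.0 * np.exp(-0.5 * ((E - 1.5) / 1.2) ** 2)
      modes = np.vstack([np.sin(k * E) for k in (0.5, 1.0, 2.0)])
      curves = base + 10.0 * rng.standard_normal((1000, 3)) @ modes

      # PCA of the deviations from the mean curve: a few components
      # capture the calibration uncertainty compactly.
      mean_curve = curves.mean(axis=0)
      U, s, Vt = np.linalg.svd(curves - mean_curve, full_matrices=False)
      frac = (s**2) / (s**2).sum()
      print("variance captured by 3 components: %.4f" % frac[:3].sum())

      # Any plausible effective area is then the mean curve plus a short
      # linear combination of components, whose coefficients can be
      # sampled alongside the spectral parameters in an MCMC.
      coef = rng.standard_normal(3) * s[:3] / np.sqrt(curves.shape[0])
      sample_curve = mean_curve + coef @ Vt[:3]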

  18. Joint Segmentation and Deconvolution of Ultrasound Images Using a Hierarchical Bayesian Model Based on Generalized Gaussian Priors.

    PubMed

    Zhao, Ningning; Basarab, Adrian; Kouame, Denis; Tourneret, Jean-Yves

    2016-08-01

    This paper proposes a joint segmentation and deconvolution Bayesian method for medical ultrasound (US) images. Contrary to piecewise homogeneous images, US images exhibit heavy characteristic speckle patterns correlated with the tissue structures. The generalized Gaussian distribution (GGD) has been shown to be one of the most relevant distributions for characterizing the speckle in US images. Thus, we propose a GGD-Potts model defined by a label map coupling US image segmentation and deconvolution. The Bayesian estimators of the unknown model parameters, including the US image, the label map, and all the hyperparameters, are difficult to express in closed form. Thus, we investigate a Gibbs sampler to generate samples distributed according to the posterior of interest. These generated samples are finally used to compute the Bayesian estimators of the unknown parameters. The performance of the proposed Bayesian model is compared with the existing approaches via several experiments conducted on realistic synthetic data and in vivo US images. PMID:27187959

  19. Semiparametric Bayesian joint modeling of a binary and continuous outcome with applications in toxicological risk assessment.

    PubMed

    Hwang, Beom Seuk; Pennell, Michael L

    2014-03-30

    Many dose-response studies collect data on correlated outcomes. For example, in developmental toxicity studies, uterine weight and presence of malformed pups are measured on the same dam. Joint modeling can result in more efficient inferences than independent models for each outcome. Most methods for joint modeling assume standard parametric response distributions. However, in toxicity studies, it is possible that response distributions vary in location and shape with dose, which may not be easily captured by standard models. To address this issue, we propose a semiparametric Bayesian joint model for a binary and continuous response. In our model, a kernel stick-breaking process prior is assigned to the distribution of a random effect shared across outcomes, which allows flexible changes in distribution shape with dose. The model also includes outcome-specific fixed effects to allow different location effects. In simulation studies, we found that the proposed model provides accurate estimates of toxicological risk when the data do not satisfy assumptions of standard parametric models. We apply our method to data from a developmental toxicity study of ethylene glycol diethyl ether. PMID:24123309

  20. Bayesian analysis of MEG visual evoked responses

    SciTech Connect

    Schmidt, D.M.; George, J.S.; Wood, C.C.

    1999-04-01

    The authors developed a method for analyzing neural electromagnetic data that allows probabilistic inferences to be drawn about regions of activation. The method involves the generation of a large number of possible solutions which both fit the data and prior expectations about the nature of probable solutions made explicit by a Bayesian formalism. In addition, they have introduced a model for the current distributions that produce MEG (and EEG) data that allows extended regions of activity, and can easily incorporate prior information such as anatomical constraints from MRI. To evaluate the feasibility and utility of the Bayesian approach with actual data, they analyzed MEG data from a visual evoked response experiment. They compared Bayesian analyses of MEG responses to visual stimuli in the left and right visual fields, in order to examine the sensitivity of the method to detect known features of human visual cortex organization. They also examined the changing pattern of cortical activation as a function of time.

  1. A Joint Bayesian Inversion for Glacial Isostatic Adjustment in North America and Greenland

    NASA Astrophysics Data System (ADS)

    Davis, J. L.; Wang, L.

    2014-12-01

    We have previously presented joint inversions of geodetic data for glacial isostatic adjustment (GIA) fields that employ a Bayesian framework for the combination of data and models. Data sets used include GNSS, GRACE gravity, and tide-gauge data, in order to estimate three-dimensional crustal deformation, geoid rate, and relative sea-level change (RSLC). The benefit of this approach is that solutions are less dependent on any particular Earth/ice model used to calculate the GIA fields, and instead employ a suite of GIA predictions that are then used to calculate statistical constraints. This approach was used both for the determination of the SNARF geodetic reference frame for North America, and for a study of GIA in Fennoscandia (Hill et al., 2010). One challenge to the method we developed is that the inherent reduction in resolution of, and correlation among, GRACE Stokes coefficients caused by the destriping procedure (Swenson and Wahr, 2006; Duan et al., 2009) was not accounted for. This important obstacle has been overcome by developing a Bayesian approach to destriping (Wang et al., in prep.). However, important issues of mixed resolution of these data types still remain. In this presentation, we report on the progress of this effort, and present a new GIA field for North America. For the first time, the region used in the solution includes Greenland, in order to provide internally consistent solutions for GIA, the spatial and temporal variability of present-day sea-level change, and present-day melting in Greenland.

  2. Quantifying predictive uncertainty of streamflow forecasts based on a Bayesian joint probability model

    NASA Astrophysics Data System (ADS)

    Zhao, Tongtiegang; Wang, Q. J.; Bennett, James C.; Robertson, David E.; Shao, Quanxi; Zhao, Jianshi

    2015-09-01

    Uncertainty is inherent in streamflow forecasts and is an important determinant of the utility of forecasts for water resources management. However, predictions by deterministic models provide only single values without uncertainty attached. This study presents a method for using a Bayesian joint probability (BJP) model to post-process deterministic streamflow forecasts by quantifying predictive uncertainty. The BJP model comprises a log-sinh transformation that normalises hydrological data, and a bivariate Gaussian distribution that characterises the dependence relationship. The parameters of the transformation and the distribution are estimated through Bayesian inference with a Markov chain Monte Carlo (MCMC) algorithm. The BJP model produces, from a raw deterministic forecast, an ensemble of values to represent forecast uncertainty. The model is applied to raw deterministic forecasts of inflows to the Three Gorges Reservoir in China as a case study. The heteroscedasticity and non-Gaussianity of forecast uncertainty are effectively addressed. The ensemble spread accounts for the forecast uncertainty and leads to considerable improvement in terms of the continuous ranked probability score. The forecasts become less accurate as lead time increases, and the ensemble spread provides reliable information on the forecast uncertainty. We conclude that the BJP model is a useful tool to quantify predictive uncertainty in post-processing deterministic streamflow forecasts.
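
    A minimal sketch of the log-sinh transformation at the core of the BJP model; the parameter values and flows are illustrative, and in the actual model (a, b) are inferred by MCMC jointly with the bivariate Gaussian linking the transformed forecast and observation:

      import numpy as np

      def log_sinh(y, a, b):
          """Forward log-sinh transform used to normalise skewed flows."""
          return np.log(np.sinh(a + b * y)) / b

      def log_sinh_inv(z, a, b):
          """Inverse transform, mapping Gaussian-space values to flows."""
          return (np.arcsinh(np.exp(b * z)) - a) / b

      # Illustrative parameter values and hypothetical inflows.
      a, b = 0.01, 0.002
      flows = np.array([50.0, 200.0, 800.0, 3200.0])
      z = log_sinh(flows, a, b)
      print("transformed:", np.round(z, 1))
      print("round trip ok:", np.allclose(log_sinh_inv(z, a, b), flows))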

  3. Efficient Bayesian Joint Models for Group Randomized Trials with Multiple Observation Times and Multiple Outcomes

    PubMed Central

    Xu, Xinyi; Pennell, Michael L.; Lu, Bo; Murray, David M.

    2013-01-01

    In this paper, we propose a Bayesian method for Group Randomized Trials (GRTs) with multiple observation times and multiple outcomes of different types. We jointly model these outcomes using latent multivariate normal linear regression, which allows treatment effects to change with time and accounts for (1) intra-class correlation (ICC) within groups, (2) the correlation between different outcomes measured on the same subject, and (3) the over-time correlation (OTC) of each outcome. Moreover we develop a set of innovative priors for the variance components which yield direct inference on the correlations, avoid undesirable constraints, and allow utilization of information from previous studies. We illustrate through simulations that our model can improve estimation efficiency (lower posterior standard deviations) of ICCs and treatment effects relative to single outcome models and models with diffuse priors on the variance components. We also demonstrate the methodology using body composition data collected in the Trial of Activity in Adolescent Girls (TAAG). PMID:22733563

  4. Joint prediction of multiple quantitative traits using a Bayesian multivariate antedependence model

    PubMed Central

    Jiang, J; Zhang, Q; Ma, L; Li, J; Wang, Z; Liu, J-F

    2015-01-01

    Predicting organismal phenotypes from genotype data is important for preventive and personalized medicine as well as plant and animal breeding. Although genome-wide association studies (GWAS) for complex traits have discovered a large number of trait- and disease-associated variants, phenotype prediction based on associated variants usually has low accuracy, even for a high-heritability trait, because these variants can typically account for a limited fraction of total genetic variance. In comparison with GWAS, whole-genome prediction (WGP) methods can increase prediction accuracy by making use of a huge number of variants simultaneously. Among various statistical methods for WGP, multiple-trait models and antedependence models show their respective advantages. To take advantage of both strategies within a unified framework, we proposed a novel multivariate antedependence-based method for joint prediction of multiple quantitative traits using a Bayesian algorithm via modeling a linear relationship of effect vector between each pair of adjacent markers. Through both simulation and real-data analyses, our studies demonstrated that the proposed antedependence-based multiple-trait WGP method is more accurate and robust than corresponding traditional counterparts (Bayes A and multi-trait Bayes A) under various scenarios. Our method can be readily extended to deal with missing phenotypes and resequence data with rare variants, offering a feasible way to jointly predict phenotypes for multiple complex traits in human genetic epidemiology as well as plant and livestock breeding. PMID:25873147

  5. Bayesian analysis of the modified Omori law

    NASA Astrophysics Data System (ADS)

    Holschneider, M.; Narteau, C.; Shebalin, P.; Peng, Z.; Schorlemmer, D.

    2012-06-01

    In order to examine variations in aftershock decay rate, we propose a Bayesian framework to estimate the {K, c, p}-values of the modified Omori law (MOL), λ(t) = K(c + t)^-p. The Bayesian setting allows us not only to produce a point estimator of these three parameters but also to assess their uncertainties and posterior dependencies with respect to the observed aftershock sequences. Using a new parametrization of the MOL, we identify the trade-off between the c and p-value estimates and discuss its dependence on the number of aftershocks. Then, we analyze the influence of the catalog completeness interval [tstart, tstop] on the various estimates. To test this Bayesian approach on natural aftershock sequences, we use two independent and non-overlapping aftershock catalogs of the same earthquakes in Japan. Taking into account the posterior uncertainties, we show that both the handpicked (short times) and the instrumental (long times) catalogs predict the same ranges of parameter values. We therefore conclude that the same MOL may be valid over short and long times.
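
    A minimal sketch of such a Bayesian fit, assuming an inhomogeneous Poisson process for the aftershock times; K is marginalised analytically under a 1/K prior and (c, p) are estimated on a grid. The simulated catalogue, prior ranges and grid are illustrative, not the paper's procedure:

      import numpy as np

      rng = np.random.default_rng(3)

      # Simulate an aftershock sequence with rate lambda(t) = K (c+t)^(-p)
      # on [t0, t1] days, by thinning a homogeneous Poisson process.
      K, c, p = 100.0, 0.5, 1.1
      t0, t1 = 0.01, 100.0
      lam = lambda tt: K * (c + tt) ** (-p)
      n_cand = rng.poisson(lam(t0) * (t1 - t0))
      cand = rng.uniform(t0, t1, n_cand)
      times = np.sort(cand[rng.random(n_cand) < lam(cand) / lam(t0)])

      # Posterior over (c, p) on a grid.  With a 1/K prior, K integrates
      # out analytically: p(c, p | data) is proportional to
      # I^(-N) * prod_i (c + t_i)^(-p), where N is the event count and
      # I is the integral of (c + t)^(-p) over [t0, t1].
      N = times.size
      cs = np.linspace(0.05, 2.0, 120)
      ps = np.linspace(0.8, 1.6, 120)
      logpost = np.empty((cs.size, ps.size))
      for i, cc in enumerate(cs):
          for j, pp in enumerate(ps):
              I = ((cc + t0) ** (1 - pp) - (cc + t1) ** (1 - pp)) / (pp - 1)
              logpost[i, j] = -N * np.log(I) - pp * np.log(cc + times).sum()
      ci, pj = np.unravel_index(logpost.argmax(), logpost.shape)
      print("posterior mode: c ~ %.2f, p ~ %.2f (true 0.5, 1.1)"
            % (cs[ci], ps[pj]))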

  6. A Bayesian analysis of plutonium exposures in Sellafield workers.

    PubMed

    Puncher, M; Riddell, A E

    2016-03-01

    The joint Russian (Mayak Production Association) and British (Sellafield) plutonium worker epidemiological analysis, undertaken as part of the European Union Framework Programme 7 (FP7) SOLO project, aims to investigate potential associations between cancer incidence and occupational exposures to plutonium using estimates of organ/tissue doses. The dose reconstruction protocol derived for the study makes best use of the most recent biokinetic models derived by the International Commission on Radiological Protection (ICRP), including a recent update to the human respiratory tract model (HRTM). This protocol was used to derive the final point estimates of absorbed doses for the study. Although uncertainties on the dose estimates were not included in the final epidemiological analysis, a separate Bayesian analysis has been performed for each of the 11 808 Sellafield plutonium workers included in the study in order to assess (a) the reliability of the point estimates provided to the epidemiologists and (b) the magnitude of the uncertainty on dose estimates. This analysis, which accounts for uncertainties in biokinetic model parameters, intakes and measurement uncertainties, is described in the present paper. The results show that there is excellent agreement between the point estimates of dose and posterior mean values of dose. However, it is also evident that there are significant uncertainties associated with these dose estimates: the geometric range of the 97.5%:2.5% posterior values is a factor of 100 for lung dose, 30 for doses to liver and red bone marrow, and 40 for intakes; these uncertainties are not reflected in estimates of risk when point doses are used to assess them. It is also shown that better estimates of certain key HRTM absorption parameters could significantly reduce the uncertainties on lung dose in future studies. PMID:26584413

  7. Time-varying nonstationary multivariate risk analysis using a dynamic Bayesian copula

    NASA Astrophysics Data System (ADS)

    Sarhadi, Ali; Burn, Donald H.; Concepción Ausín, María; Wiper, Michael P.

    2016-03-01

    A time-varying risk analysis is proposed for an adaptive design framework in nonstationary conditions arising from climate change. A Bayesian, dynamic conditional copula is developed for modeling the time-varying dependence structure between mixed continuous and discrete multiattributes of multidimensional hydrometeorological phenomena. Joint Bayesian inference is carried out to fit the marginals and copula in an illustrative example using an adaptive Gibbs Markov chain Monte Carlo (MCMC) sampler. Posterior mean estimates and credible intervals are provided for the model parameters and the Deviance Information Criterion (DIC) is used to select the model that best captures different forms of nonstationarity over time. This study also introduces a fully Bayesian, time-varying joint return period for multivariate time-dependent risk analysis in nonstationary environments. The results demonstrate that the nature and the risk of extreme-climate multidimensional processes change over time under the impact of climate change, and accordingly long-term decision-making strategies should be updated based on the anomalies of the nonstationary environment.

  8. Nested sampling applied in Bayesian room-acoustics decay analysis.

    PubMed

    Jasa, Tomislav; Xiang, Ning

    2012-11-01

    Room-acoustic energy decays often exhibit single-rate or multiple-rate characteristics in a wide variety of rooms/halls. Both the energy decay order and decay parameter estimation are of practical significance in architectural acoustics applications, representing two different levels of Bayesian probabilistic inference. This paper discusses a model-based sound energy decay analysis within a Bayesian framework utilizing the nested sampling algorithm. The nested sampling algorithm is specifically developed to evaluate the Bayesian evidence required for determining the energy decay order with decay parameter estimates as a secondary result. Taking the energy decay analysis in architectural acoustics as an example, this paper demonstrates that two different levels of inference, decay model-selection and decay parameter estimation, can be cohesively accomplished by the nested sampling algorithm. PMID:23145609
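
    A self-contained toy nested-sampling sketch in the spirit of the analysis above: it estimates the Bayesian evidence of a single-rate decay model from a simulated decay curve, replacing the worst live point by rejection sampling from the likelihood-constrained prior. The data, prior range and tuning constants are illustrative:

      import numpy as np
      from scipy.special import logsumexp

      rng = np.random.default_rng(4)

      # Toy single-rate decay curve with known noise sd.
      t = np.linspace(0.0, 1.0, 50)
      data = np.exp(-3.0 * t) + 0.05 * rng.standard_normal(t.size)

      def loglike(k):
          """Gaussian log-likelihood for decay rate k (noise sd fixed)."""
          r = data - np.exp(-k * t)
          return -0.5 * np.sum((r / 0.05) ** 2)

      # Nested sampling with a uniform prior k ~ U(0, 10): repeatedly
      # discard the worst live point, credit it a shrinking slab of prior
      # volume, and replace it with a constrained prior draw.
      n_live, n_iter = 100, 600
      live = rng.uniform(0.0, 10.0, n_live)
      live_L = np.array([loglike(k) for k in live])
      logZ = -np.inf
      logw = np.log(1.0 - np.exp(-1.0 / n_live))   # first slab X0 - X1
      for i in range(n_iter):
          worst = live_L.argmin()
          logZ = np.logaddexp(logZ, logw - i / n_live + live_L[worst])
          while True:   # rejection sampling from the constrained prior
              k_new = rng.uniform(0.0, 10.0)
              if loglike(k_new) > live_L[worst]:
                  live[worst], live_L[worst] = k_new, loglike(k_new)
                  break
      # Credit the remaining prior volume to the surviving live points.
      logZ = np.logaddexp(logZ, logsumexp(live_L) - np.log(n_live)
                          - n_iter / n_live)
      print("log evidence ~", logZ)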

  9. APPLICATION OF BAYESIAN MONTE CARLO ANALYSIS TO A LAGRANGIAN PHOTOCHEMICAL AIR QUALITY MODEL. (R824792)

    EPA Science Inventory

    Uncertainties in ozone concentrations predicted with a Lagrangian photochemical air quality model have been estimated using Bayesian Monte Carlo (BMC) analysis. Bayesian Monte Carlo analysis provides a means of combining subjective "prior" uncertainty estimates developed ...

  10. Phycas: software for Bayesian phylogenetic analysis.

    PubMed

    Lewis, Paul O; Holder, Mark T; Swofford, David L

    2015-05-01

    Phycas is open source, freely available Bayesian phylogenetics software written primarily in C++ but with a Python interface. Phycas specializes in Bayesian model selection for nucleotide sequence data, particularly the estimation of marginal likelihoods, central to computing Bayes Factors. Marginal likelihoods can be estimated using newer methods (Thermodynamic Integration and Generalized Steppingstone) that are more accurate than the widely used Harmonic Mean estimator. In addition, Phycas supports two posterior predictive approaches to model selection: Gelfand-Ghosh and Conditional Predictive Ordinates. The General Time Reversible family of substitution models, as well as a codon model, are available, and data can be partitioned with all parameters unlinked except tree topology and edge lengths. Phycas provides for analyses in which the prior on tree topologies allows polytomous trees as well as fully resolved trees, and provides for several choices for edge length priors, including a hierarchical model as well as the recently described compound Dirichlet prior, which helps avoid overly informative induced priors on tree length. PMID:25577605

  11. Bayesian networks as a tool for epidemiological systems analysis

    NASA Astrophysics Data System (ADS)

    Lewis, F. I.

    2012-11-01

    Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter attempts not only to identify statistically associated variables, but to additionally, and empirically, separate these into those directly and indirectly dependent with one or more outcome variables. Such discrimination is vastly more ambitious but has the potential to reveal far more about key features of complex disease systems. Applying Bayesian network modeling to biological and medical data has considerable computational demands, combined with the need to ensure robust model selection given the vast model space of possible DAGs. These challenges require the use of approximation techniques, such as the Laplace approximation, Markov chain Monte Carlo simulation and parametric bootstrapping, along with computational parallelization. A case study in structure discovery - identification of an optimal DAG for given data - is presented which uses additive Bayesian networks to explore veterinary disease data of industrial and medical relevance.

  12. Methods for the joint meta-analysis of multiple tests.

    PubMed

    Trikalinos, Thomas A; Hoaglin, David C; Small, Kevin M; Terrin, Norma; Schmid, Christopher H

    2014-12-01

    Existing methods for meta-analysis of diagnostic test accuracy focus primarily on a single index test. We propose models for the joint meta-analysis of studies comparing multiple index tests on the same participants in paired designs. These models respect the grouping of data by studies, account for the within-study correlation between the tests' true-positive rates (TPRs) and between their false-positive rates (FPRs) (induced because tests are applied to the same participants), and allow for between-study correlations between TPRs and FPRs (such as those induced by threshold effects). We estimate models in the Bayesian setting. We demonstrate using a meta-analysis of screening for Down syndrome with two tests: shortened humerus (arm bone), and shortened femur (thigh bone). Separate and joint meta-analyses yielded similar TPR and FPR estimates. For example, the summary TPR for a shortened humerus was 35.3% (95% credible interval (CrI): 26.9, 41.8%) versus 37.9% (27.7, 50.3%) with joint versus separate meta-analysis. Joint meta-analysis is more efficient when calculating comparative accuracy: the difference in the summary TPRs was 0.0% (-8.9, 9.5%; TPR higher for shortened humerus) with joint versus 2.6% (-14.7, 19.8%) with separate meta-analyses. Simulation and empirical analyses are needed to refine the role of the proposed methodology. PMID:26052954

  13. On Bayesian analysis of on-off measurements

    NASA Astrophysics Data System (ADS)

    Nosek, Dalibor; Nosková, Jana

    2016-06-01

    We propose an analytical solution to the on-off problem within the framework of Bayesian statistics. Both the statistical significance for the discovery of new phenomena and credible intervals on model parameters are presented in a consistent way. We use a large enough family of prior distributions of relevant parameters. The proposed analysis is designed to provide Bayesian solutions that can be used for any number of observed on-off events, including zero. The procedure is checked using Monte Carlo simulations. The usefulness of the method is demonstrated on examples from γ-ray astronomy.
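
    A crude Monte Carlo counterpart of the on-off inference described above (the paper's treatment is analytic): with flat priors the off-window counts give a gamma posterior for the background, and a signal posterior follows by differencing; the positivity constraint on the signal is deliberately ignored here. The counts and exposure ratio are illustrative:

      import numpy as np
      from scipy.stats import gamma

      # Hypothetical counts: N_on events in the on-source window, N_off
      # in an off-source window whose exposure is 1/alpha times larger.
      N_on, N_off, alpha = 12, 15, 0.2

      # Flat priors give gamma posteriors for the Poisson means; the
      # background in the on window is alpha times the off-window mean.
      b = alpha * gamma.rvs(N_off + 1, size=200000, random_state=7)
      mu = gamma.rvs(N_on + 1, size=b.size, random_state=8)

      # Implied signal posterior (s >= 0 not enforced in this sketch).
      s = mu - b
      print("P(s > 0 | data) ~", np.mean(s > 0))
      print("posterior median signal:", np.median(s))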

  14. Factor analysis models for structuring covariance matrices of additive genetic effects: a Bayesian implementation

    PubMed Central

    de los Campos, Gustavo; Gianola, Daniel

    2007-01-01

    Multivariate linear models are increasingly important in quantitative genetics. In high dimensional specifications, factor analysis (FA) may provide an avenue for structuring (co)variance matrices, thus reducing the number of parameters needed for describing (co)dispersion. We describe how FA can be used to model genetic effects in the context of a multivariate linear mixed model. An orthogonal common factor structure is used to model genetic effects under Gaussian assumption, so that the marginal likelihood is multivariate normal with a structured genetic (co)variance matrix. Under standard prior assumptions, all fully conditional distributions have closed form, and samples from the joint posterior distribution can be obtained via Gibbs sampling. The model and the algorithm developed for its Bayesian implementation were used to describe five repeated records of milk yield in dairy cattle, and a model with one common factor was compared with a standard multiple-trait model. The Bayesian Information Criterion favored the FA model. PMID:17897592

  15. A Bayesian QTL linkage analysis of the common dataset from the 12th QTLMAS workshop

    PubMed Central

    Bink, Marco CAM; van Eeuwijk, Fred A

    2009-01-01

    Background To compare the power of various QTL mapping methodologies, a dataset was simulated within the framework of 12th QTLMAS workshop. A total of 5865 diploid individuals was simulated, spanning seven generations, with known pedigree. Individuals were genotyped for 6000 SNPs across six chromosomes. We present an illustration of a Bayesian QTL linkage analysis, as implemented in the special purpose software FlexQTL. Most importantly, we treated the number of bi-allelic QTL as a random variable and used Bayes Factors to infer plausible QTL models. We investigated the power of our analysis in relation to the number of phenotyped individuals and SNPs. Results We report clear posterior evidence for 12 QTL that jointly explained 30% of the phenotypic variance, which was very close to the total of included simulation effects, when using all phenotypes and a set of 600 SNPs. Decreasing the number of phenotyped individuals from 4665 to 1665 and/or the number of SNPs in the analysis from 600 to 120 dramatically reduced the power to identify and locate QTL. Posterior estimates of genome-wide breeding values for a small set of individuals were given. Conclusion We presented a successful Bayesian linkage analysis of a simulated dataset with a pedigree spanning several generations. Our analysis identified all regions that contained QTL with effects explaining more than one percent of the phenotypic variance. We showed how the results of a Bayesian QTL mapping can be used in genomic prediction. PMID:19278543

  16. A Comparison of Imputation Methods for Bayesian Factor Analysis Models

    ERIC Educational Resources Information Center

    Merkle, Edgar C.

    2011-01-01

    Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amiable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…

  17. Bayesian Analysis of Nonlinear Structural Equation Models with Nonignorable Missing Data

    ERIC Educational Resources Information Center

    Lee, Sik-Yum

    2006-01-01

    A Bayesian approach is developed for analyzing nonlinear structural equation models with nonignorable missing data. The nonignorable missingness mechanism is specified by a logistic regression model. A hybrid algorithm that combines the Gibbs sampler and the Metropolis-Hastings algorithm is used to produce the joint Bayesian estimates of…

  18. Simultaneous Bayesian analysis of contingency tables in genetic association studies.

    PubMed

    Dickhaus, Thorsten

    2015-08-01

    Genetic association studies lead to simultaneous categorical data analysis. The sample for every genetic locus consists of a contingency table containing the numbers of observed genotype-phenotype combinations. Under case-control design, the row counts of every table are identical and fixed, while column counts are random. The aim of the statistical analysis is to test independence of the phenotype and the genotype at every locus. We present an objective Bayesian methodology for these association tests, which relies on the conjugacy of Dirichlet and multinomial distributions. Being based on the likelihood principle, the Bayesian tests avoid looping over all tables with given marginals. Making use of data generated by The Wellcome Trust Case Control Consortium (WTCCC), we illustrate that the ordering of the Bayes factors shows a good agreement with that of frequentist p-values. Furthermore, we deal with specifying prior probabilities for the validity of the null hypotheses, by taking linkage disequilibrium structure into account and exploiting the concept of effective numbers of tests. Application of a Bayesian decision theoretic multiple test procedure to the WTCCC data illustrates the proposed methodology. Finally, we discuss two methods for reconciling frequentist and Bayesian approaches to the multiple association test problem. PMID:26215535
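
    A minimal sketch of the Dirichlet-multinomial conjugacy the method relies on, for a single hypothetical 2x3 genotype-phenotype table with fixed row totals: the multinomial coefficients cancel, so the Bayes factor reduces to ratios of Dirichlet normalising constants. The table counts and flat prior are illustrative:

      import numpy as np
      from scipy.special import gammaln

      def log_B(a):
          """log of the multivariate beta function B(a)."""
          return gammaln(a).sum() - gammaln(a.sum())

      # Hypothetical 2x3 genotype table (rows: cases/controls).
      table = np.array([[120.0, 230.0, 86.0],
                        [145.0, 210.0, 81.0]])
      prior = np.ones(3)   # Dirichlet(1, 1, 1) on genotype frequencies

      # H0: one shared genotype distribution for both rows; H1: one per
      # row.  Multinomial coefficients cancel in the ratio, so the Bayes
      # factor is a ratio of Dirichlet normalising constants.
      log_m0 = log_B(table.sum(axis=0) + prior) - log_B(prior)
      log_m1 = sum(log_B(row + prior) - log_B(prior) for row in table)
      print("Bayes factor B01 (independence vs association):",
            np.exp(log_m0 - log_m1))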

  19. Multiple quantitative trait analysis using bayesian networks.

    PubMed

    Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian

    2014-09-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness. PMID:25236454

  20. Application of Bayesian graphs to SN Ia data analysis and compression

    NASA Astrophysics Data System (ADS)

    Ma, Cong; Corasaniti, Pier-Stefano; Bassett, Bruce A.

    2016-08-01

    Bayesian graphical models are an efficient tool for modelling complex data and derive self-consistent expressions of the posterior distribution of model parameters. We apply Bayesian graphs to perform statistical analyses of Type Ia supernova (SN Ia) luminosity distance measurements from the Joint Light-curve Analysis (JLA) dataset (Betoule et al. 2014). In contrast to the χ2 approach used in previous studies, the Bayesian inference allows us to fully account for the standard-candle parameter dependence of the data covariance matrix. Comparing with χ2 analysis results we find a systematic offset of the marginal model parameter bounds. We demonstrate that the bias is statistically significant in the case of the SN Ia standardization parameters with a maximal 6σ shift of the SN light-curve colour correction. In addition, we find that the evidence for a host galaxy correction is now only 2.4σ. Systematic offsets on the cosmological parameters remain small, but may increase by combining constraints from complementary cosmological probes. The bias of the χ2 analysis is due to neglecting the parameter-dependent log-determinant of the data covariance, which gives more statistical weight to larger values of the standardization parameters. We find a similar effect on compressed distance modulus data. To this end we implement a fully consistent compression method of the JLA dataset that uses a Gaussian approximation of the posterior distribution for fast generation of compressed data. Overall, the results of our analysis emphasize the need for a fully consistent Bayesian statistical approach in the analysis of future large SN Ia datasets.
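
    A toy illustration of the log-determinant effect discussed above, assuming a single standardisation-like parameter that both enters the residuals and rescales the data covariance; names and numbers are illustrative and this is not the JLA likelihood:

      import numpy as np

      rng = np.random.default_rng(5)

      # Toy standardisation problem: residuals r(theta) = d - theta * x,
      # with a covariance that also depends on theta.
      n = 40
      x = rng.standard_normal(n)
      d = 0.1 * x + 0.05 * rng.standard_normal(n)   # true theta = 0.1

      def cov(theta):
          return np.diag(np.full(n, 0.05**2 + (0.3 * theta) ** 2))

      def chi2(theta):
          r = d - theta * x
          return r @ np.linalg.solve(cov(theta), r)

      def neg2_loglike(theta):
          # Full Gaussian -2 log L = chi2 + log det C(theta) (+ const).
          _, logdet = np.linalg.slogdet(cov(theta))
          return chi2(theta) + logdet

      thetas = np.linspace(-0.2, 0.5, 701)
      print("chi2-only minimum:     ",
            thetas[np.argmin([chi2(t) for t in thetas])])
      print("full-likelihood minimum:",
            thetas[np.argmin([neg2_loglike(t) for t in thetas])])
      # The chi2-only fit tends to drift toward larger |theta|, where the
      # covariance is inflated and weighted residuals shrink; the log-det
      # term counteracts exactly this bias.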

  1. Bayesian joint inversion of surface deformation and hydraulic data for aquifer characterization

    NASA Astrophysics Data System (ADS)

    Hesse, M. A.; Stadler, G.

    2013-12-01

    Remote sensing and geodetic measurements are providing a wealth of new, spatially-distributed, time-series data that promise to improve the characterization of regional aquifers. The integration of these geodetic measurements with other hydrological observations has the potential to aid the sustainable management of groundwater resources through improved characterization of the spatial variation of aquifer properties. The joint inversion of geomechanical and hydrological data is challenging, because it requires fully-coupled hydrogeophysical inversion for the aquifer parameters, based on a coupled geomechanical and hydrological process model. We formulate a Bayesian inverse problem to infer the lateral permeability variation in an aquifer from geodetic and hydraulic data, and from prior information. We compute the maximum a posteriori (MAP) estimate of the posterior permeability distribution, and use a local Gaussian approximation around the MAP point to characterize the uncertainty. For two-dimensional test cases we also explore the full posterior permeability distribution through Markov-Chain Monte Carlo (MCMC) sampling. To cope with the large parameter space dimension, we use local Gaussian approximations as proposal densities in the MCMC algorithm. Using increasingly complex model problems, based on the work of Mandel (1953) and Segall (1985), we find the following general properties of poroelastic inversions: (1) Augmenting standard hydraulic well data by surface deformation data improves the aquifer characterization. (2) Surface deformation contributes the most in shallow aquifers, but provides useful information even for the characterization of aquifers down to 1 km. (3) In general, it is more difficult to infer high permeability regions, and their characterization requires frequent measurement to resolve the associated short response time scales. (4) In horizontal aquifers, the vertical component of the surface deformation provides a smoothed image of the

  2. An Overview of Bayesian Methods for Neural Spike Train Analysis

    PubMed Central

    2013-01-01

    Neural spike train analysis is an important task in computational neuroscience which aims to understand neural mechanisms and gain insights into neural circuits. With the advancement of multielectrode recording and imaging technologies, it has become increasingly demanding to develop statistical tools for analyzing large neuronal ensemble spike activity. Here we present a tutorial overview of Bayesian methods and their representative applications in neural spike train analysis, at both single neuron and population levels. On the theoretical side, we focus on various approximate Bayesian inference techniques as applied to latent state and parameter estimation. On the application side, the topics include spike sorting, tuning curve estimation, neural encoding and decoding, deconvolution of spike trains from calcium imaging signals, and inference of neuronal functional connectivity and synchrony. Some research challenges and opportunities for neural spike train analysis are discussed. PMID:24348527

  3. Bayesian Shrinkage Analysis of Quantitative Trait Loci for Dynamic Traits

    PubMed Central

    Yang, Runqing; Xu, Shizhong

    2007-01-01

    Many quantitative traits are measured repeatedly during the life of an organism. Such traits are called dynamic traits. The pattern of the changes of a dynamic trait is called the growth trajectory. Studying the growth trajectory may enhance our understanding of its genetic architecture. Recently, we developed an interval-mapping procedure to map QTL for dynamic traits under the maximum-likelihood framework. We fit the growth trajectory by Legendre polynomials. The method was intended to map one QTL at a time, and the entire QTL analysis involved scanning the entire genome by fitting multiple single-QTL models. In this study, we propose a Bayesian shrinkage analysis for estimating and mapping multiple QTL in a single model. The method is a combination between the shrinkage mapping for individual quantitative traits and the Legendre polynomial analysis for dynamic traits. The multiple-QTL model is implemented in two ways: (1) a fixed-interval approach where a QTL is placed in each marker interval and (2) a moving-interval approach where the position of a QTL can be searched in a range that covers many marker intervals. Simulation study shows that the Bayesian shrinkage method generates much better signals for QTL than the interval-mapping approach. We propose several alternative methods to present the results of the Bayesian shrinkage analysis. In particular, we found that the Wald test-statistic profile can serve as a mechanism to test the significance of a putative QTL. PMID:17435239

  4. Bayesian Variable Selection in Cost-Effectiveness Analysis

    PubMed Central

    Negrín, Miguel A.; Vázquez-Polo, Francisco J.; Martel, María; Moreno, Elías; Girón, Francisco J.

    2010-01-01

    Linear regression models are often used to represent the cost and effectiveness of medical treatment. The covariates used may include sociodemographic variables, such as age, gender or race; clinical variables, such as initial health status, years of treatment or the existence of concomitant illnesses; and a binary variable indicating the treatment received. However, most studies estimate only one model, which usually includes all the covariates. This procedure ignores the question of uncertainty in model selection. In this paper, we examine four alternative Bayesian variable selection methods that have been proposed. In this analysis, we estimate the inclusion probability of each covariate in the real model conditional on the data. Variable selection can be useful for estimating incremental effectiveness and incremental cost, through Bayesian model averaging, as well as for subgroup analysis. PMID:20617047
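
    A minimal sketch of variable selection by model enumeration, using the BIC as a rough proxy for the marginal likelihood (the paper's methods are more refined): weights proportional to exp(-BIC/2) yield posterior inclusion probabilities and a basis for model averaging. The simulated cost data are illustrative:

      import itertools
      import numpy as np

      rng = np.random.default_rng(6)

      # Simulated cost data: two informative covariates, one spurious.
      n = 200
      X = rng.standard_normal((n, 3))
      cost = 5.0 + 1.5 * X[:, 0] + 0.8 * X[:, 1] + rng.standard_normal(n)

      def bic(y, cols):
          Xd = np.column_stack([np.ones(len(y)), X[:, cols]])
          beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
          rss = float(((y - Xd @ beta) ** 2).sum())
          return len(y) * np.log(rss / len(y)) + Xd.shape[1] * np.log(len(y))

      # Enumerate all covariate subsets; exp(-BIC/2) approximates each
      # model's marginal likelihood, giving model weights and inclusion
      # probabilities for every covariate.
      subsets = list(itertools.product([0, 1], repeat=3))
      w = np.array([np.exp(-0.5 * bic(cost, [j for j in range(3) if s[j]]))
                    for s in subsets])
      w /= w.sum()
      incl = [sum(w[i] for i, s in enumerate(subsets) if s[j])
              for j in range(3)]
      print("posterior inclusion probabilities:", np.round(incl, 3))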

  5. Bayesian Analysis Toolkit: 1.0 and beyond

    NASA Astrophysics Data System (ADS)

    Beaujean, Frederik; Caldwell, Allen; Greenwald, D.; Kluth, S.; Kröninger, Kevin; Schulz, O.

    2015-12-01

    The Bayesian Analysis Toolkit is a C++ package centered around Markov-chain Monte Carlo sampling. It is used in high-energy physics analyses by experimentalists and theorists alike. The software has matured over the last few years. We present new features to enter version 1.0, then summarize some of the software-engineering lessons learned and give an outlook on future versions.

  6. BAYESIAN ANALYSIS OF MULTIPLE HARMONIC OSCILLATIONS IN THE SOLAR CORONA

    SciTech Connect

    Arregui, I.; Asensio Ramos, A.; Diaz, A. J.

    2013-03-01

    The detection of multiple mode harmonic kink oscillations in coronal loops enables us to obtain information on coronal density stratification and magnetic field expansion using seismology inversion techniques. The inference is based on the measurement of the period ratio between the fundamental mode and the first overtone and theoretical results for the period ratio under the hypotheses of coronal density stratification and magnetic field expansion of the wave guide. We present a Bayesian analysis of multiple mode harmonic oscillations for the inversion of the density scale height and magnetic flux tube expansion under each of the hypotheses. The two models are then compared using a Bayesian model comparison scheme to assess how plausible each one is given our current state of knowledge.
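
    The model-comparison step can be sketched as two one-dimensional evidence integrals and a Bayes factor; the forward relations below (period ratio falling with stratification, rising with expansion) are schematic placeholders, not the paper's coronal-loop physics, and the measured ratio and its error are invented.

```python
# Schematic Bayes-factor computation for two hypotheses about a measured
# period ratio; forward models, parameter ranges and data are invented.
import numpy as np

obs_ratio, sigma = 0.91, 0.04      # assumed measured P1/(2*P2) and error

def trapezoid(y, x):
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def evidence(predict, grid):
    """Marginal likelihood with a flat prior over the parameter grid."""
    like = (np.exp(-0.5 * ((obs_ratio - predict(grid)) / sigma) ** 2)
            / (np.sqrt(2.0 * np.pi) * sigma))
    prior = 1.0 / (grid[-1] - grid[0])
    return trapezoid(like * prior, grid)

# Placeholder forward relations: stratification pulls the ratio below one,
# tube expansion pushes it above one.
ev_strat = evidence(lambda H: 1.0 - 0.15 * H, np.linspace(0.0, 2.0, 1000))
ev_expand = evidence(lambda G: 1.0 + 0.10 * (G - 1.0), np.linspace(1.0, 2.0, 1000))
print("Bayes factor (stratification : expansion) =", round(ev_strat / ev_expand, 2))
```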

  7. Analysis of NSTX TF Joint Voltage Measurements

    SciTech Connect

    R, Woolley

    2005-10-07

    This report presents findings of analyses of recorded current and voltage data associated with 72 electrical joints operating at high current and high mechanical stress. The analysis goal was to characterize the mechanical behavior of each joint and thus evaluate its mechanical supports. The joints are part of the toroidal field (TF) magnet system of the National Spherical Torus Experiment (NSTX) pulsed plasma device operating at the Princeton Plasma Physics Laboratory (PPPL). Since there is not sufficient space near the joints for much traditional mechanical instrumentation, small voltage probes were installed on each joint and their voltage waveforms have been recorded on sampling digitizers during each NSTX "shot".

  8. Risk analysis using a hybrid Bayesian-approximate reasoning methodology.

    SciTech Connect

    Bott, T. F.; Eisenhawer, S. W.

    2001-01-01

    Analysts are sometimes asked to make frequency estimates for specific accidents in which the accident frequency is determined primarily by safety controls. Under these conditions, frequency estimates use considerable expert belief in determining how the controls affect the accident frequency. To evaluate and document beliefs about control effectiveness, we have modified a traditional Bayesian approach by using approximate reasoning (AR) to develop prior distributions. Our method produces accident frequency estimates that separately express the probabilistic results produced in Bayesian analysis and possibilistic results that reflect uncertainty about the prior estimates. Based on our experience using traditional methods, we feel that the AR approach better documents beliefs about the effectiveness of controls than if the beliefs are buried in Bayesian prior distributions. We have performed numerous expert elicitations in which probabilistic information was sought from subject matter experts not trained in probability. We find it much easier to elicit the linguistic variables and fuzzy set membership values used in AR than to obtain prior probability distributions directly from these experts, because AR better captures their beliefs and better expresses their uncertainties.

  9. Spectral Analysis of B Stars: An Application of Bayesian Statistics

    NASA Astrophysics Data System (ADS)

    Mugnes, J.-M.; Robert, C.

    2012-12-01

    To better understand the processes involved in stellar physics, it is necessary to obtain accurate stellar parameters (effective temperature, surface gravity, abundances…). Spectral analysis is a powerful tool for investigating stars, but it is also vital to reduce uncertainties at a decent computational cost. Here we present a spectral analysis method based on a combination of Bayesian statistics and grids of synthetic spectra obtained with TLUSTY. This method simultaneously constrains the stellar parameters by using all the lines accessible in observed spectra and thus greatly reduces uncertainties and improves the overall spectrum fitting. Preliminary results are shown using spectra from the Observatoire du Mont-Mégantic.
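
    A minimal sketch of the grid-based idea: a Gaussian likelihood compares one observed spectrum against a precomputed grid of synthetic spectra and yields a joint posterior over (Teff, log g). The toy spectrum generator below is a placeholder, not TLUSTY output.

```python
# Grid posterior over (Teff, log g) with a Gaussian likelihood; the
# "synthetic spectrum" generator is a toy placeholder.
import numpy as np

rng = np.random.default_rng(3)
wave = np.linspace(4000.0, 4500.0, 500)

def synthetic(teff, logg):
    """Toy stand-in for a grid spectrum: one line whose depth and width
    depend on the stellar parameters."""
    depth = 0.5 * (20000.0 / teff)
    width = 1.0 + 0.5 * logg
    return 1.0 - depth * np.exp(-0.5 * ((wave - 4340.0) / width) ** 2)

teff_grid = np.arange(15000.0, 30001.0, 500.0)
logg_grid = np.arange(3.0, 4.51, 0.1)

sigma = 0.01
obs = synthetic(22000.0, 3.8) + rng.normal(0.0, sigma, wave.size)

logpost = np.array([[-0.5 * np.sum((obs - synthetic(T, g)) ** 2) / sigma ** 2
                     for g in logg_grid] for T in teff_grid])
post = np.exp(logpost - logpost.max())
post /= post.sum()
i, j = np.unravel_index(post.argmax(), post.shape)
print("MAP estimate: Teff =", teff_grid[i], " log g =", round(logg_grid[j], 2))
```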

  10. Bayesian sensitivity analysis of bifurcating nonlinear models

    NASA Astrophysics Data System (ADS)

    Becker, W.; Worden, K.; Rowson, J.

    2013-01-01

    Sensitivity analysis allows one to investigate how changes in input parameters to a system affect the output. When computational expense is a concern, metamodels such as Gaussian processes can offer considerable computational savings over Monte Carlo methods, albeit at the expense of introducing a data modelling problem. In particular, Gaussian processes assume a smooth, non-bifurcating response surface. This work highlights a recent extension to Gaussian processes which uses a decision tree to partition the input space into homogeneous regions and then fits separate Gaussian processes to each region. In this way, bifurcations can be modelled at region boundaries and different regions can have different covariance properties. To test this method, both the treed and standard methods were applied to the bifurcating response of a Duffing oscillator and a bifurcating FE model of a heart valve. It was found that the treed Gaussian process provides a practical way of performing uncertainty and sensitivity analysis on large, potentially bifurcating models that cannot be handled by a single GP, although how to manage bifurcation boundaries that are not parallel to coordinate axes remains an open problem.
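
    A hedged sketch of the treed idea using off-the-shelf scikit-learn pieces: a shallow regression tree partitions the input space and a separate Gaussian process is fitted in each leaf, so a jump in the response can sit on a region boundary. This mimics the concept only; it is not the authors' treed-GP implementation.

```python
# Tree-partitioned GPs: a regression tree splits the input space, and one
# GP per leaf handles a response with a jump at x = 0. Illustrative only.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)
X = rng.uniform(-2.0, 2.0, size=(300, 1))
y = np.where(X[:, 0] < 0.0, np.sin(3.0 * X[:, 0]),
             2.0 + np.sin(3.0 * X[:, 0]))              # bifurcation-like jump
y = y + rng.normal(0.0, 0.05, y.size)

tree = DecisionTreeRegressor(max_leaf_nodes=4, min_samples_leaf=30).fit(X, y)
leaves = tree.apply(X)

gps = {}
for leaf in np.unique(leaves):                         # one GP per region
    mask = leaves == leaf
    gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(1e-2))
    gps[leaf] = gp.fit(X[mask], y[mask])

Xnew = np.linspace(-2.0, 2.0, 9).reshape(-1, 1)
pred = [float(gps[l].predict(x.reshape(1, -1))[0])
        for l, x in zip(tree.apply(Xnew), Xnew)]
print(np.round(pred, 2))
```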

  11. Bayesian Dose-Finding in Two Treatment Cycles Based on the Joint Utility of Efficacy and Toxicity

    PubMed Central

    Lee, Juhee; Thall, Peter F.; Ji, Yuan; Müller, Peter

    2014-01-01

    A phase I/II clinical trial design is proposed for adaptively and dynamically optimizing each patient's dose in each of two cycles of therapy based on the joint binary efficacy and toxicity outcomes in each cycle. A dose-outcome model is assumed that includes a Bayesian hierarchical latent variable structure to induce association among the outcomes and also facilitate posterior computation. Doses are chosen in each cycle based on posteriors of a model-based objective function, similar to a reinforcement learning or Q-learning function, defined in terms of numerical utilities of the joint outcomes in each cycle. For each patient, the procedure outputs a sequence of two actions, one for each cycle, with each action being the decision to either treat the patient at a chosen dose or not to treat. The cycle 2 action depends on the individual patient's cycle 1 dose and outcomes. In addition, decisions are based on posterior inference using other patients’ data, and therefore the proposed method is adaptive both within and between patients. A simulation study of the method is presented, including comparison to two-cycle extensions of the conventional 3+3 algorithm, continual reassessment method, and a Bayesian model-based design, and evaluation of robustness. PMID:26366026

  12. Bayesian analysis for extreme climatic events: A review

    NASA Astrophysics Data System (ADS)

    Chu, Pao-Shin; Zhao, Xin

    2011-11-01

    This article reviews Bayesian analysis methods applied to extreme climatic data. We particularly focus on applications to three different problems related to extreme climatic events: detection of abrupt regime shifts, clustering tropical cyclone tracks, and statistical forecasting of seasonal tropical cyclone activity. For identifying potential change points in an extreme event count series, a hierarchical Bayesian framework involving three layers - data, parameter, and hypothesis - is formulated to estimate the posterior probability of shifts over time. For the data layer, a Poisson process with a gamma-distributed rate is presumed. For the hypothesis layer, multiple candidate hypotheses with different change points are considered. To calculate the posterior probability for each hypothesis and its associated parameters we developed an exact analytical formula, a Markov chain Monte Carlo (MCMC) algorithm, and a more sophisticated reversible jump Markov chain Monte Carlo (RJMCMC) algorithm. The algorithms are applied to several rare event series: the annual tropical cyclone or typhoon counts over the central, eastern, and western North Pacific; the annual extremely heavy rainfall event counts at Manoa, Hawaii; and the annual heat wave frequency in France. Using an Expectation-Maximization (EM) algorithm, a Bayesian clustering method built on a mixture Gaussian model is applied to objectively classify historical, spaghetti-like tropical cyclone tracks (1945-2007) over the western North Pacific and the South China Sea into eight distinct track types. Finally, a regression-based approach to forecasting seasonal tropical cyclone frequency in a region is developed: using large-scale environmental conditions prior to the tropical cyclone season, a Poisson regression model is built for predicting seasonal tropical cyclone counts, and a probit regression model is alternatively developed for the corresponding binary classification problem.
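
    The "exact analytical formula" mentioned for the change-point layer has a compact form in the Poisson-gamma case: each segment's marginal likelihood is closed form, so the posterior over a single change point can be computed on a grid. The sketch below uses synthetic counts and assumed hyperparameters.

```python
# Closed-form segment evidence for a single Poisson change point with a
# Gamma(a, b) rate prior; counts and hyperparameters are synthetic.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(5)
y = np.concatenate([rng.poisson(2.0, 30), rng.poisson(6.0, 20)])  # shift at 30

a, b = 1.0, 1.0                       # Gamma prior on the Poisson rate

def log_marginal(seg):
    """log m(seg) for iid Poisson counts under a Gamma(a, b) rate prior."""
    n, s = seg.size, seg.sum()
    return (a * np.log(b) - gammaln(a) + gammaln(a + s)
            - (a + s) * np.log(b + n) - gammaln(seg + 1).sum())

taus = np.arange(5, y.size - 5)       # candidate change points (uniform prior)
logp = np.array([log_marginal(y[:t]) + log_marginal(y[t:]) for t in taus])
post = np.exp(logp - logp.max())
post /= post.sum()
print("posterior mode of the change point:", taus[post.argmax()])
```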

  13. The Bayesian Analysis Software Developed At Washington University

    NASA Astrophysics Data System (ADS)

    Marutyan, Karen R.; Bretthorst, G. Larry

    2009-12-01

    Over the last few years there has been an ongoing effort at the Biomedical Magnetic Resonance Laboratory within Washington University to develop data analysis applications using Bayesian probability theory. A few of these applications are specific to Magnetic Resonance data; however, most are general and can analyze data from a wide variety of sources. These data analysis applications are server based and they have been written in such a way as to allow them to utilize as many processors as are available. The interface to these Bayesian applications is a client-based Java interface. The client, usually a Windows PC, runs the interface, sets up an analysis, sends the analysis to the server, fetches the results and displays the appropriate plots on the user's client machine. Together, the client and server software can be used to solve a host of interesting problems that occur regularly in the sciences. In this paper, we describe both the client and server software and briefly discuss how to acquire, install and maintain this software.

  14. Bayesian analysis to detect abrupt changes in extreme hydrological processes

    NASA Astrophysics Data System (ADS)

    Jo, Seongil; Kim, Gwangsu; Jeon, Jong-June

    2016-07-01

    In this study, we develop a new method for Bayesian change point analysis. The proposed method is easy to implement and can be extended to a wide class of distributions. Using a generalized extreme-value distribution, we investigate the annual maximum precipitation observed at stations on the South Korean Peninsula, and find significant changes at the considered sites. We evaluate the hydrological risk in predictions using the estimated return levels. In addition, we show that misspecification of the probability model can lead to a bias in the number of detected change points and, using a simple example, that this problem is difficult to avoid through technical data transformation.

  15. A Bayesian analysis of pentaquark signals from CLAS data

    SciTech Connect

    David Ireland; Bryan McKinnon; Dan Protopopescu; Pawel Ambrozewicz; Marco Anghinolfi; G. Asryan; Harutyun Avakian; H. Bagdasaryan; Nathan Baillie; Jacques Ball; Nathan Baltzell; V. Batourine; Marco Battaglieri; Ivan Bedlinski; Ivan Bedlinskiy; Matthew Bellis; Nawal Benmouna; Barry Berman; Angela Biselli; Lukasz Blaszczyk; Sylvain Bouchigny; Sergey Boyarinov; Robert Bradford; Derek Branford; William Briscoe; William Brooks; Volker Burkert; Cornel Butuceanu; John Calarco; Sharon Careccia; Daniel Carman; Liam Casey; Shifeng Chen; Lu Cheng; Philip Cole; Patrick Collins; Philip Coltharp; Donald Crabb; Volker Crede; Natalya Dashyan; Rita De Masi; Raffaella De Vita; Enzo De Sanctis; Pavel Degtiarenko; Alexandre Deur; Richard Dickson; Chaden Djalali; Gail Dodge; Joseph Donnelly; David Doughty; Michael Dugger; Oleksandr Dzyubak; Hovanes Egiyan; Kim Egiyan; Lamiaa Elfassi; Latifa Elouadrhiri; Paul Eugenio; Gleb Fedotov; Gerald Feldman; Ahmed Fradi; Herbert Funsten; Michel Garcon; Gagik Gavalian; Nerses Gevorgyan; Gerard Gilfoyle; Kevin Giovanetti; Francois-Xavier Girod; John Goetz; Wesley Gohn; Atilla Gonenc; Ralf Gothe; Keith Griffioen; Michel Guidal; Nevzat Guler; Lei Guo; Vardan Gyurjyan; Kawtar Hafidi; Hayk Hakobyan; Charles Hanretty; Neil Hassall; F. Hersman; Ishaq Hleiqawi; Maurik Holtrop; Charles Hyde; Yordanka Ilieva; Boris Ishkhanov; Eugeny Isupov; D. Jenkins; Hyon-Suk Jo; John Johnstone; Kyungseon Joo; Henry Juengst; Narbe Kalantarians; James Kellie; Mahbubul Khandaker; Wooyoung Kim; Andreas Klein; Franz Klein; Mikhail Kossov; Zebulun Krahn; Laird Kramer; Valery Kubarovsky; Joachim Kuhn; Sergey Kuleshov; Viacheslav Kuznetsov; Jeff Lachniet; Jean Laget; Jorn Langheinrich; D. Lawrence; Kenneth Livingston; Haiyun Lu; Marion MacCormick; Nikolai Markov; Paul Mattione; Bernhard Mecking; Mac Mestayer; Curtis Meyer; Tsutomu Mibe; Konstantin Mikhaylov; Marco Mirazita; Rory Miskimen; Viktor Mokeev; Brahim Moreno; Kei Moriya; Steven Morrow; Maryam Moteabbed; Edwin Munevar Espitia; Gordon Mutchler; Pawel Nadel-Turonski; Rakhsha Nasseripour; Silvia Niccolai; Gabriel Niculescu; Maria-Ioana Niculescu; Bogdan Niczyporuk; Megh Niroula; Rustam Niyazov; Mina Nozar; Mikhail Osipenko; Alexander Ostrovidov; Kijun Park; Evgueni Pasyuk; Craig Paterson; Sergio Pereira; Joshua Pierce; Nikolay Pivnyuk; Oleg Pogorelko; Sergey Pozdnyakov; John Price; Sebastien Procureur; Yelena Prok; Brian Raue; Giovanni Ricco; Marco Ripani; Barry Ritchie; Federico Ronchetti; Guenther Rosner; Patrizia Rossi; Franck Sabatie; Julian Salamanca; Carlos Salgado; Joseph Santoro; Vladimir Sapunenko; Reinhard Schumacher; Vladimir Serov; Youri Sharabian; Dmitri Sharov; Nikolay Shvedunov; Elton Smith; Lee Smith; Daniel Sober; Daria Sokhan; Aleksey Stavinskiy; Samuel Stepanyan; Stepan Stepanyan; Burnham Stokes; Paul Stoler; Steffen Strauch; Mauro Taiuti; David Tedeschi; Ulrike Thoma; Avtandil Tkabladze; Svyatoslav Tkachenko; Clarisse Tur; Maurizio Ungaro; Michael Vineyard; Alexander Vlassov; Daniel Watts; Lawrence Weinstein; Dennis Weygand; M. Williams; Elliott Wolin; M.H. Wood; Amrit Yegneswaran; Lorenzo Zana; Jixie Zhang; Bo Zhao; Zhiwen Zhao

    2008-02-01

    We examine the results of two measurements by the CLAS collaboration, one of which claimed evidence for a $\Theta^{+}$ pentaquark, whilst the other found no such evidence. The unique feature of these two experiments was that they were performed with the same experimental setup. Using a Bayesian analysis we find that the results of the two experiments are in fact compatible with each other, but that the first measurement did not contain sufficient information to determine unambiguously the existence of a $\Theta^{+}$. Further, we suggest a means by which the existence of a new candidate particle can be tested in a rigorous manner.

  16. Direct message passing for hybrid Bayesian networks and performance analysis

    NASA Astrophysics Data System (ADS)

    Sun, Wei; Chang, K. C.

    2010-04-01

    Probabilistic inference for hybrid Bayesian networks, which involve both discrete and continuous variables, has been an important research topic over recent years. This is not only because a number of efficient inference algorithms have been developed and are used maturely for simple types of networks such as the pure discrete model, but also because of a practical need: continuous variables are inevitable in modeling complex systems. Pearl's message passing algorithm provides a simple framework to compute posterior distributions by propagating messages between nodes and can provide exact answers for polytree models with purely discrete or continuous variables. In addition, applying Pearl's message passing to networks with loops usually converges and results in good approximations. However, for hybrid models, a general message passing algorithm between different types of variables is needed. In this paper, we develop a method called Direct Message Passing (DMP) for exchanging messages between discrete and continuous variables. Based on Pearl's algorithm, we derive formulae to compute messages for variables in various dependence relationships encoded in conditional probability distributions. A mixture of Gaussians is used to represent continuous messages, with the number of mixture components up to the size of the joint state space of all discrete parents. For polytree Conditional Linear Gaussian (CLG) Bayesian networks, DMP has the same computational requirements as the Junction Tree (JT) algorithm and provides the same exact solution. However, while JT can only work for the CLG model, DMP can be applied to general nonlinear, non-Gaussian hybrid models to produce approximate solutions using the unscented transformation and loopy propagation. Furthermore, we can scale the algorithm by restricting the number of mixture components in the messages. Empirically, we found that the resulting approximation errors are relatively small.
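
    One ingredient of scaling DMP, restricting the number of mixture components in a message, can be illustrated with a moment-matching collapse of a one-dimensional Gaussian mixture; DMP's actual component management is more general than this sketch.

```python
# Moment-matched collapse of a 1-D Gaussian mixture to one component,
# the kind of reduction used to cap message sizes.
import numpy as np

def collapse(weights, means, variances):
    """Replace a Gaussian mixture by a single moment-matched Gaussian."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    mu = np.sum(w * np.asarray(means))
    # law of total variance: mean of variances + variance of means
    var = np.sum(w * (np.asarray(variances) + (np.asarray(means) - mu) ** 2))
    return mu, var

mu, var = collapse([0.3, 0.7], [-1.0, 2.0], [0.5, 1.0])
print("collapsed mean, variance:", round(mu, 3), round(var, 3))
```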

  17. Bayesian analysis of inflationary features in Planck and SDSS data

    NASA Astrophysics Data System (ADS)

    Benetti, Micol; Alcaniz, Jailson S.

    2016-07-01

    We perform a Bayesian analysis to study possible features in the primordial inflationary power spectrum of scalar perturbations. In particular, we analyze the possibility of detecting the imprint of these primordial features in the anisotropy temperature power spectrum of the cosmic microwave background (CMB) and also in the matter power spectrum P(k). We use the most recent CMB data provided by the Planck Collaboration and P(k) measurements from the 11th data release of the Sloan Digital Sky Survey. We focus our analysis on a class of potentials whose features are localized at different intervals of angular scales, corresponding to multipoles in the ranges 10 < ℓ < 60 (Oscill-1) and 150 < ℓ < 300 (Oscill-2). Our results show that one of the step potentials (Oscill-1) provides a better fit to the CMB data than does the featureless ΛCDM scenario, with moderate Bayesian evidence in favor of the former. Adding the P(k) data to the analysis weakens the evidence of the Oscill-1 potential relative to the standard model and strengthens the evidence of this latter scenario with respect to the Oscill-2 model.

  18. Implementation of a Bayesian Engine for Uncertainty Analysis

    SciTech Connect

    Leng Vang; Curtis Smith; Steven Prescott

    2014-08-01

    In probabilistic risk assessment, it is important to have an environment where analysts have access to shared and secured high-performance computing and a statistical analysis tool package. As part of the advanced small modular reactor probabilistic risk analysis framework implementation, we have identified the need for advanced Bayesian computations. However, in order to make this technology available to non-specialists, there is also a need for a simplified tool that allows users to author models and evaluate them within this framework. As a proof of concept, we have implemented an advanced open source Bayesian inference tool, OpenBUGS, within the browser-based cloud risk analysis framework that is under development at the Idaho National Laboratory. This development, the “OpenBUGS Scripter”, has been implemented as a client-side, visual, web-based and integrated development environment for creating OpenBUGS language scripts. It depends on the shared server environment to execute the generated scripts and to transmit results back to the user. The visual models are in the form of linked diagrams, from which we automatically create the applicable OpenBUGS script that matches the diagram. These diagrams can be saved locally or stored on the server environment to be shared with other users.

  19. Analysis of magnetic field fluctuation thermometry using Bayesian inference

    NASA Astrophysics Data System (ADS)

    Wübbeler, G.; Schmähling, F.; Beyer, J.; Engert, J.; Elster, C.

    2012-12-01

    A Bayesian approach is proposed for the analysis of magnetic field fluctuation thermometry. The approach addresses the estimation of temperature from the measurement of a noise power spectrum as well as the analysis of previous calibration measurements. A key aspect is the reliable determination of uncertainties associated with the obtained temperature estimates, and the proposed approach naturally accounts for both the uncertainties in the calibration stage and the noise in the temperature measurement. Erlang distributions are employed to model the fluctuations of thermal noise power spectra and we show that such a procedure is justified in the light of the data. We describe in detail the Bayesian approach and briefly refer to Markov Chain Monte Carlo techniques used in the numerical calculation of the results. The MATLAB® software package we used for calculating our results is provided. The proposed approach is validated using magnetic field fluctuation power spectra recorded in the sub-kelvin region for which an independently determined reference temperature is available. As a result, the obtained temperature estimates were found to be fully consistent with the reference temperature.
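
    A minimal sketch of the estimation idea, assuming averaged periodogram bins are Erlang distributed with mean proportional to temperature: a grid posterior over temperature follows directly. The calibration constant, Erlang shape and data below are invented stand-ins for the paper's calibrated setup.

```python
# Grid posterior for temperature with an Erlang (integer-shape Gamma)
# likelihood; shape k, calibration c and the data are assumed values.
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(6)
k = 50                      # number of averaged periodograms (Erlang shape)
c = 2.0e-3                  # assumed calibration: mean power per kelvin
T_true = 0.080              # kelvin

S = gamma.rvs(k, scale=c * T_true / k, size=20, random_state=rng)

T_grid = np.linspace(0.05, 0.12, 701)
loglik = np.array([gamma.logpdf(S, k, scale=c * T / k).sum() for T in T_grid])
post = np.exp(loglik - loglik.max())
post /= post.sum()

mean_T = float(np.sum(T_grid * post))
sd_T = float(np.sqrt(np.sum((T_grid - mean_T) ** 2 * post)))
print(f"T = {mean_T * 1e3:.1f} mK +/- {sd_T * 1e3:.1f} mK")
```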

  20. Bayesian probability analysis for acoustic-seismic landmine detection

    NASA Astrophysics Data System (ADS)

    Xiang, Ning; Sabatier, James M.; Goggans, Paul M.

    2002-11-01

    Landmines buried in the subsurface induce distinct changes in the seismic vibration of the ground surface when an acoustic source insonifies the ground. A scanning laser Doppler vibrometer (SLDV) senses the acoustically induced seismic vibration of the ground surface in a noncontact, remote manner. The SLDV-based acoustic-to-seismic coupling technology exhibits significant advantages over conventional sensors due to its capability for detecting both metal and nonmetal mines and its stand-off distance. The seismic vibration data scanned from the SLDV are preprocessed to form images. The detection of landmines relies primarily on an analysis of the target amplitude, size, shape, and frequency range. A parametric model has been established [Xiang and Sabatier, J. Acoust. Soc. Am. 110, 2740 (2001)] to describe the amplified surface vibration velocity induced by buried landmines within an appropriate frequency range. This model incorporates vibrational amplitude, size, position of landmines, and the background amplitude into a model-based analysis process in which Bayesian target detection and parameter estimation have been applied. Based on recent field measurement results, the landmine detection procedure within a Bayesian framework will be discussed. [Work supported by the United States Army Communications-Electronics Command, Night Vision and Electronic Sensors Directorate.]

  1. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    PubMed

    Hack, C Eric

    2006-04-17

    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate, and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least-squares approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach. PMID:16466842
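
    To make the MCMC calibration idea concrete, the sketch below calibrates a deliberately simple one-compartment kinetic model (not the bromate PBTK model) with a random-walk Metropolis sampler, synthetic concentration data, and loosely chosen priors.

```python
# Random-walk Metropolis calibration of a one-compartment bolus model
# C(t) = (D/V) * exp(-ke * t); data, priors and step sizes are invented.
import numpy as np

rng = np.random.default_rng(7)
times = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 12.0, 24.0])

def conc(d_over_v, ke, t):
    return d_over_v * np.exp(-ke * t)

obs = conc(10.0, 0.3, times) * np.exp(rng.normal(0.0, 0.1, times.size))

def log_post(theta):
    d, ke = theta
    if d <= 0 or ke <= 0:
        return -np.inf                                 # positivity constraint
    resid = np.log(obs) - np.log(conc(d, ke, times))   # lognormal errors
    loglik = -0.5 * np.sum(resid ** 2) / 0.1 ** 2
    logprior = -0.5 * (np.log(d) ** 2 + np.log(ke) ** 2) / 4.0  # weak prior
    return loglik + logprior

theta, lp = np.array([5.0, 0.5]), -np.inf
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [0.3, 0.02])        # symmetric proposal
    lpp = log_post(prop)
    if np.log(rng.uniform()) < lpp - lp:               # Metropolis accept
        theta, lp = prop, lpp
    samples.append(theta)
post = np.array(samples[5000:])
print("posterior medians (D/V, ke):", np.round(np.median(post, axis=0), 3))
```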

  2. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    SciTech Connect

    Keselman, Dmitry; Tompkins, George H; Leishman, Deborah A

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e. the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
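
    One common reading of uncertain evidence, virtual (likelihood) evidence introduced through an augmented child node, can be sketched on a two-node network; whether this matches the paper's exact formulation is an assumption of the example.

```python
# Soft evidence on a two-node network H -> E via a virtual child of E.
# Numbers are illustrative only.
import numpy as np

p_h = np.array([0.2, 0.8])                  # prior P(H=0), P(H=1)
p_e_given_h = np.array([[0.9, 0.1],         # P(E | H=0)
                        [0.3, 0.7]])        # P(E | H=1)

# The user reports uncertain evidence on E as the likelihood message
# lambda(E); a virtual child V with P(V=1 | E) proportional to these
# values encodes the same information as an augmented node.
lam_e = np.array([0.4, 0.6])

# P(H | V=1) is proportional to P(H) * sum_e P(e | H) * lambda(e)
post_h = p_h * (p_e_given_h @ lam_e)
post_h /= post_h.sum()
print("posterior over H given soft evidence:", np.round(post_h, 3))
```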

  3. Inference algorithms and learning theory for Bayesian sparse factor analysis

    NASA Astrophysics Data System (ADS)

    Rattray, Magnus; Stegle, Oliver; Sharp, Kevin; Winn, John

    2009-12-01

    Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the results of the MCMC and VB/EP algorithms on simulated data with the theoretical prediction. The results for MCMC agree closely with the theory, as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations, and the VB/EP algorithm then provides a very useful, computationally efficient alternative.
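
    The slab and spike prior itself is easiest to see in a simpler setting than the factor model: the hedged sketch below runs a small Gibbs sampler for spike-and-slab linear regression, with all hyperparameters assumed, to show how posterior inclusion probabilities arise.

```python
# Gibbs sampler for spike-and-slab linear regression; hyperparameters
# (noise/slab variances, inclusion prior) are assumed, data synthetic.
import numpy as np

rng = np.random.default_rng(8)
n, p = 150, 6
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0, 0.0])
y = X @ beta_true + rng.normal(size=n)

sigma2, tau2, pi = 1.0, 4.0, 0.5      # noise var, slab var, prior P(include)
z = np.ones(p, dtype=int)
beta = np.zeros(p)

keep = []
for it in range(3000):
    for j in range(p):
        r = y - X @ beta + X[:, j] * beta[j]        # partial residual
        v = 1.0 / (X[:, j] @ X[:, j] / sigma2 + 1.0 / tau2)
        m = v * (X[:, j] @ r) / sigma2
        # marginal-likelihood ratio of slab vs spike gives inclusion odds
        log_odds = np.log(pi / (1 - pi)) + 0.5 * np.log(v / tau2) + 0.5 * m * m / v
        z[j] = rng.uniform() < 1.0 / (1.0 + np.exp(-np.clip(log_odds, -30, 30)))
        beta[j] = rng.normal(m, np.sqrt(v)) if z[j] else 0.0
    keep.append(z.copy())
print("posterior inclusion probabilities:",
      np.round(np.mean(keep[500:], axis=0), 2))
```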

  4. Joint modeling of survival and longitudinal non-survival data: current methods and issues. Report of the DIA Bayesian joint modeling working group.

    PubMed

    Lawrence Gould, A; Boye, Mark Ernest; Crowther, Michael J; Ibrahim, Joseph G; Quartey, George; Micallef, Sandrine; Bois, Frederic Y

    2015-06-30

    Explicitly modeling underlying relationships between a survival endpoint and processes that generate longitudinal measured or reported outcomes potentially could improve the efficiency of clinical trials and provide greater insight into the various dimensions of the clinical effect of interventions included in the trials. Various strategies have been proposed for using longitudinal findings to elucidate intervention effects on clinical outcomes such as survival. The application of specifically Bayesian approaches for constructing models that address longitudinal and survival outcomes explicitly has been recently addressed in the literature. We review currently available methods for carrying out joint analyses, including issues of implementation and interpretation, identify software tools that can be used to carry out the necessary calculations, and review applications of the methodology. PMID:24634327

  5. Bayesian modelling of the emission spectrum of the Joint European Torus Lithium Beam Emission Spectroscopy system.

    PubMed

    Kwak, Sehyun; Svensson, J; Brix, M; Ghim, Y-C

    2016-02-01

    A Bayesian model of the emission spectrum of the JET lithium beam has been developed to infer the intensity of the Li I (2p-2s) line radiation and associated uncertainties. The detected spectrum for each channel of the lithium beam emission spectroscopy system is here modelled by a single Li line modified by an instrumental function, Bremsstrahlung background, instrumental offset, and interference filter curve. Both the instrumental function and the interference filter curve are modelled with non-parametric Gaussian processes. All free parameters of the model, the intensities of the Li line, Bremsstrahlung background, and instrumental offset, are inferred using Bayesian probability theory with a Gaussian likelihood for photon statistics and electronic background noise. The prior distributions of the free parameters are chosen as Gaussians. Given these assumptions, the intensity of the Li line and corresponding uncertainties are analytically available using a Bayesian linear inversion technique. The proposed approach makes it possible to extract the intensity of Li line without doing a separate background subtraction through modulation of the Li beam. PMID:26931843
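
    The closing step, a Bayesian linear inversion with Gaussian prior and likelihood, has a closed-form posterior. The sketch below uses a toy smoothing-kernel forward matrix in place of the instrument model; only the algebra matches the abstract.

```python
# Closed-form Gaussian linear inversion; the forward matrix A is a toy
# smoothing kernel, not the spectroscopy instrument model.
import numpy as np

rng = np.random.default_rng(9)
m, n = 40, 10                                  # data points, unknown amplitudes
centers = np.linspace(0, m - 1, n)
A = np.exp(-0.5 * ((np.arange(m)[:, None] - centers[None, :]) / 2.0) ** 2)

x_true = rng.uniform(1.0, 5.0, n)
y = A @ x_true + rng.normal(0.0, 0.2, m)

Sn_inv = np.eye(m) / 0.2 ** 2                  # inverse noise covariance
Sp_inv = np.eye(n) / 10.0                      # inverse prior covariance

# posterior: S = (A' Sn^-1 A + Sp^-1)^-1, mu = S A' Sn^-1 y
S = np.linalg.inv(A.T @ Sn_inv @ A + Sp_inv)
mu = S @ A.T @ Sn_inv @ y
print("max abs error:", round(float(np.abs(mu - x_true).max()), 2))
```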

  6. A Bayesian analysis of two probability models describing thunderstorm activity at Cape Kennedy, Florida

    NASA Technical Reports Server (NTRS)

    Williford, W. O.; Hsieh, P.; Carter, M. C.

    1974-01-01

    A Bayesian analysis of the two discrete probability models, the negative binomial and the modified negative binomial distributions, which have been used to describe thunderstorm activity at Cape Kennedy, Florida, is presented. The Bayesian approach with beta prior distributions is compared to the classical approach which uses a moment method of estimation or a maximum-likelihood method. The accuracy and simplicity of the Bayesian method is demonstrated.
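
    For the negative binomial model with known shape r, a Beta prior on the success probability is conjugate, so the Bayesian update the authors exploit is closed form; the sketch below uses synthetic counts rather than the Cape Kennedy record.

```python
# Conjugate Beta update for negative binomial data with known r;
# synthetic counts stand in for the thunderstorm series.
import numpy as np
from scipy.stats import beta, nbinom

rng = np.random.default_rng(10)
r, p_true = 3, 0.4
y = nbinom.rvs(r, p_true, size=25, random_state=rng)   # failures before r-th success

a0, b0 = 1.0, 1.0                                      # Beta prior on p
a_post = a0 + r * y.size                               # add r per observation
b_post = b0 + y.sum()                                  # add observed failures
lo, hi = beta.ppf([0.025, 0.975], a_post, b_post)
print(f"posterior mean p = {a_post / (a_post + b_post):.3f}, "
      f"95% CI [{lo:.3f}, {hi:.3f}]")
```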

  7. Bayesian regression analysis of data with random effects covariates from nonlinear longitudinal measurements

    PubMed Central

    De la Cruz, Rolando; Meza, Cristian; Arribas-Gil, Ana; Carroll, Raymond J.

    2016-01-01

    Joint models for a wide class of response variables and longitudinal measurements consist of a mixed-effects model to fit longitudinal trajectories whose random effects enter as covariates in a generalized linear model for the primary response. They provide a useful way to assess association between these two kinds of data, which in clinical studies are often collected jointly on a series of individuals and may help in understanding, for instance, the mechanisms of recovery of a certain disease or the efficacy of a given therapy. When a nonlinear mixed-effects model is used to fit the longitudinal trajectories, the existing estimation strategies based on likelihood approximations have been shown to exhibit some computational efficiency problems (De la Cruz et al., 2011). In this article we consider a Bayesian estimation procedure for the joint model with a nonlinear mixed-effects model for the longitudinal data and a generalized linear model for the primary response. The proposed prior structure allows for the implementation of an MCMC sampler. Moreover, we consider that the errors in the longitudinal model may be correlated. We apply our method to the analysis of hormone levels measured at the early stages of pregnancy that can be used to predict normal versus abnormal pregnancy outcomes. We also conduct a simulation study to assess the importance of modelling correlated errors and quantify the consequences of model misspecification. PMID:27274601
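
    The joint-model structure is easy to simulate: subject-specific random effects drive a nonlinear longitudinal trajectory and also enter, as covariates, a logistic model for the primary response. The sketch below only generates data from such a model; the paper's MCMC estimation is not reproduced.

```python
# Simulating the joint structure: random effects drive nonlinear
# trajectories and also the primary binary response. Coefficients invented.
import numpy as np

rng = np.random.default_rng(18)
n = 200
times = np.linspace(0.0, 1.0, 8)

b = rng.normal(0.0, 1.0, size=(n, 2))                      # random effects
traj = (2.0 + b[:, :1]) * np.exp(-(1.0 + 0.5 * b[:, 1:]) * times)
long_data = traj + rng.normal(0.0, 0.1, traj.shape)        # longitudinal data

eta = -0.5 + 1.2 * b[:, 0] - 0.8 * b[:, 1]                 # GLM predictor
primary = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-eta)) # binary response
print("longitudinal panel:", long_data.shape,
      "| mean primary response:", round(float(primary.mean()), 3))
```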

  8. Bayesian informative dropout model for longitudinal binary data with random effects using conditional and joint modeling approaches.

    PubMed

    Chan, Jennifer S K

    2016-05-01

    Dropouts are common in longitudinal studies. If the dropout probability depends on the missing observations at or after dropout, this type of dropout is called informative (or nonignorable) dropout (ID). Failure to accommodate such a dropout mechanism in the model will bias the parameter estimates. We propose a conditional autoregressive model for longitudinal binary data with an ID model such that the probabilities of positive outcomes as well as the dropout indicator on each occasion are logit linear in some covariates and outcomes. This model, adopting a marginal model for outcomes and a conditional model for dropouts, is called a selection model. To allow for heterogeneity and clustering effects, the outcome model is extended to incorporate mixture and random effects. Lastly, the model is further extended to a novel model that models the outcome and dropout jointly such that their dependency is formulated through an odds ratio function. Parameters are estimated by a Bayesian approach implemented using the user-friendly Bayesian software WinBUGS. A methadone clinic dataset is analyzed to illustrate the proposed models. The results show that the treatment time effect is still significant but weaker after allowing for an ID process in the data. Finally, the effect of dropout on parameter estimates is evaluated through simulation studies. PMID:26467236
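
    The following hedged simulation shows why informative dropout matters in this setting: with a subject-level random intercept, a dropout probability that is logit linear in the current outcome depletes the high-response subjects and biases naive occasion means downward. All coefficients are invented.

```python
# Simulate informative dropout: a random intercept links outcomes over
# time, and dropout depends on the current outcome. Coefficients invented.
import numpy as np

rng = np.random.default_rng(17)
n, T = 4000, 6
logistic = lambda x: 1.0 / (1.0 + np.exp(-x))

observed = []
for _ in range(n):
    bi = rng.normal(0.0, 1.5)                          # subject random effect
    ys = []
    for t in range(T):
        y = rng.uniform() < logistic(0.5 + bi - 0.3 * t)   # outcome model
        ys.append(y)
        if rng.uniform() < logistic(-2.5 + 2.0 * y):       # informative dropout
            break
    observed.append(ys)

# Naive occasion means are biased low at later times: high-response
# subjects (large bi) leave the study preferentially.
bgrid = rng.normal(0.0, 1.5, 200000)
for t in range(T):
    vals = [o[t] for o in observed if len(o) > t]
    pop = logistic(0.5 + bgrid - 0.3 * t).mean()           # population truth
    print(f"t={t}: observed {np.mean(vals):.3f} vs population {pop:.3f}")
```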

  9. A Bayesian Framework for Reliability Analysis of Spacecraft Deployments

    NASA Technical Reports Server (NTRS)

    Evans, John W.; Gallo, Luis; Kaminsky, Mark

    2012-01-01

    Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a two-stage sequential Bayesian framework for reliability estimation of spacecraft deployments was developed for this purpose. This process was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the Optical Telescope Element. Initially, detailed studies of NASA deployment history, "heritage information", were conducted, extending over 45 years of spacecraft launches. This information was then coupled to a non-informative prior and a binomial likelihood function to create a posterior distribution for deployments of various subsystems using Markov chain Monte Carlo sampling. Select distributions were then coupled to a subsequent analysis, using test data and anomaly occurrences on successive ground test deployments of scale model test articles of JWST hardware, to update the NASA heritage data. This allowed for a realistic prediction for the reliability of the complex Sunshield deployment, with credibility limits, within this two-stage Bayesian framework.
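
    Stripped of the MCMC machinery, the two-stage logic reduces to sequential conjugate updates of a Beta distribution by binomial deployment data: heritage history first, ground tests second. The counts below are invented; the JWST analysis itself used MCMC sampling.

```python
# Sequential conjugate Beta-binomial updates standing in for the two-stage
# framework; success/failure counts are invented.
from scipy.stats import beta

# Stage 1: Jeffreys prior Beta(0.5, 0.5) updated with heritage deployments.
s1, n1 = 97, 100
a1, b1 = 0.5 + s1, 0.5 + (n1 - s1)

# Stage 2: the stage-1 posterior becomes the prior for ground-test data.
s2, n2 = 24, 25
a2, b2 = a1 + s2, b1 + (n2 - s2)

lo, hi = beta.ppf([0.025, 0.975], a2, b2)
print(f"posterior mean reliability: {a2 / (a2 + b2):.4f}")
print(f"95% credible interval: [{lo:.4f}, {hi:.4f}]")
```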

  10. Bayesian Models for fMRI Data Analysis

    PubMed Central

    Zhang, Linlin; Guindani, Michele; Vannucci, Marina

    2015-01-01

    Functional magnetic resonance imaging (fMRI), a noninvasive neuroimaging method that provides an indirect measure of neuronal activity by detecting blood flow changes, has experienced an explosive growth in the past years. Statistical methods play a crucial role in understanding and analyzing fMRI data. Bayesian approaches, in particular, have shown great promise in applications. A remarkable feature of fully Bayesian approaches is that they allow a flexible modeling of spatial and temporal correlations in the data. This paper provides a review of the most relevant models developed in recent years. We divide methods according to the objective of the analysis. We start from spatio-temporal models for fMRI data that detect task-related activation patterns. We then address the very important problem of estimating brain connectivity. We also touch upon methods that focus on making predictions of an individual's brain activity or a clinical or behavioral response. We conclude with a discussion of recent integrative models that aim at combining fMRI data with other imaging modalities, such as EEG/MEG and DTI data, measured on the same subjects. We also briefly discuss the emerging field of imaging genetics. PMID:25750690

  11. Bayesian robust analysis for genetic architecture of quantitative traits

    PubMed Central

    Yang, Runqing; Wang, Xin; Li, Jian; Deng, Hongwen

    2009-01-01

    Motivation: In most quantitative trait locus (QTL) mapping studies, phenotypes are assumed to follow normal distributions. Deviations from this assumption may affect the accuracy of QTL detection and lead to detection of spurious QTLs. To improve the robustness of QTL mapping methods, we replaced the normal distribution for residuals in multiple interacting QTL models with the normal/independent distributions that are a class of symmetric and long-tailed distributions and are able to accommodate residual outliers. Subsequently, we developed a Bayesian robust analysis strategy for dissecting genetic architecture of quantitative traits and for mapping genome-wide interacting QTLs in line crosses. Results: Through computer simulations, we showed that our strategy had a similar power for QTL detection compared with traditional methods assuming normal-distributed traits, but had a substantially increased power for non-normal phenotypes. When this strategy was applied to a group of traits associated with physical/chemical characteristics and quality in rice, more main and epistatic QTLs were detected than traditional Bayesian model analyses under the normal assumption. Contact: runqingyang@sjtu.edu.cn; dengh@umkc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18974168
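
    The normal/independent construction is simple to demonstrate: a gamma-distributed scale mixing variable turns Gaussian residuals into heavy-tailed (here Student-t) residuals, which is what absorbs outliers. The parameters below are arbitrary.

```python
# Student-t residuals as a scale mixture of normals (a normal/independent
# distribution); nu and sigma are arbitrary illustration values.
import numpy as np

rng = np.random.default_rng(11)
nu, sigma, n = 8.0, 1.0, 200000

lam = rng.gamma(nu / 2.0, 2.0 / nu, size=n)       # mixing variable, E[lam] = 1
e = rng.normal(0.0, sigma / np.sqrt(lam))         # e | lam ~ N(0, sigma^2/lam)

# Marginally e is Student-t with nu degrees of freedom (scaled by sigma);
# its heavy tails absorb outliers that would distort a Gaussian fit.
kurt = np.mean(e ** 4) / np.mean(e ** 2) ** 2 - 3.0
print(f"sample excess kurtosis: {kurt:.2f} (theory for t(8): {6 / (nu - 4):.2f})")
```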

  12. Bayesian imperfect information analysis for clinical recurrent data

    PubMed Central

    Chang, Chih-Kuang; Chang, Chi-Chang

    2015-01-01

    In medical research, clinical practice must often be undertaken with imperfect information from limited resources. This study applied Bayesian imperfect-information value analysis to a clinical decision-making problem for recurrent events, producing likelihood functions and posterior distributions for realistic situations. Three kinds of failure models are considered, and our methods are illustrated with an analysis of imperfect information from a trial of immunotherapy in the treatment of chronic granulomatous disease. In addition, we present evidence toward a better understanding of the differing behaviors along with concomitant variables. Based on the results of simulations, the imperfect information value of the concomitant variables was evaluated and different realistic situations were compared to see which could yield more accurate results for medical decision-making. PMID:25565853

  13. Risk analysis of dust explosion scenarios using Bayesian networks.

    PubMed

    Yuan, Zhi; Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2015-02-01

    In this study, a methodology has been proposed for risk analysis of dust explosion scenarios based on Bayesian networks. Our methodology also benefits from a bow-tie diagram to better represent the logical relationships existing among contributing factors and consequences of dust explosions. In this study, the risks of dust explosion scenarios are evaluated, taking into account common cause failures and dependencies among root events and possible consequences. Using a diagnostic analysis, dust particle properties, oxygen concentration, and safety training of staff are identified as the most critical root events leading to dust explosions. The probability adaptation concept is also used for sequential updating and thus learning from past dust explosion accidents, which is of great importance in dynamic risk assessment and management. We also apply the proposed methodology to a case study to model dust explosion scenarios, to estimate the envisaged risks, and to identify the vulnerable parts of the system that need additional safety measures. PMID:25264172

  14. Bayesian Model Selection in 'Big Data' Spectral Analysis

    NASA Astrophysics Data System (ADS)

    Fischer, Travis C.; Crenshaw, D. Michael; Baron, Fabien; Kloppenborg, Brian K.; Pope, Crystal L.

    2015-01-01

    As IFU observations and large spectral surveys continue to become more prevalent, the handling of thousands of spectra has become commonplace. Astronomers look at objects with increasingly complex emission-line structures, so establishing a method that will easily allow for multiple-component analysis of these features in an automated fashion would be of great use to the community. Already used in exoplanet detection and interferometric image reconstruction, we present a new application of Bayesian model selection in 'big data' spectral analysis. This technique fits multiple emission-line components in an automated fashion while simultaneously determining the correct number of components in each spectrum, streamlining the line measurements for a large number of spectra into a single process.
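
    A hedged sketch of the selection task: Gaussian mixtures of increasing order are fitted to a synthetic emission line and the component count is chosen by BIC, used here as a cheap stand-in for the Bayesian evidence the abstract describes.

```python
# Fit 1..3 Gaussian components to a synthetic emission line and pick the
# count by BIC; line parameters and noise level are invented.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(15)
x = np.linspace(-30.0, 30.0, 400)

def model(x, *pars):
    """Sum of Gaussians; pars = (amp, centre, width) per component."""
    y = np.zeros_like(x)
    for a, m, s in zip(*[iter(pars)] * 3):
        y += a * np.exp(-0.5 * ((x - m) / s) ** 2)
    return y

y = model(x, 1.0, -5.0, 3.0, 0.6, 6.0, 2.0) + rng.normal(0.0, 0.05, x.size)

for ncomp in (1, 2, 3):
    p0 = sum(([0.5, m0, 2.0] for m0 in np.linspace(-10.0, 10.0, ncomp)), [])
    try:
        popt, _ = curve_fit(model, x, y, p0=p0, maxfev=20000)
    except RuntimeError:
        continue                         # fit failed to converge; skip
    rss = np.sum((y - model(x, *popt)) ** 2)
    bic = x.size * np.log(rss / x.size) + 3 * ncomp * np.log(x.size)
    print(f"{ncomp} component(s): BIC = {bic:9.1f}")
```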

  15. Structural dynamic analysis of a ball joint

    NASA Astrophysics Data System (ADS)

    Hwang, Seok-Cheol; Lee, Kwon-Hee

    2012-11-01

    A ball joint is a rotating and swiveling element that is typically installed at the interface between two parts. In an automobile, the ball joint is the component that connects the control arms to the steering knuckle. The ball joint can also be installed in linkage systems for motion control applications. This paper describes the simulation strategy for a ball joint analysis that takes the manufacturing process into account. The manufacturing process can be divided into plugging and spinning. The response of interest is the stress distribution generated between the ball and its bearing. In this paper, the commercial code NX DAFUL, which uses an implicit integration method, is introduced to calculate the response. In addition, a gap analysis is performed to investigate the fit, focusing on the displacement response of the ball stud. Also, an optimum design is suggested through case studies.

  16. Bayesian Analysis of Peak Ground Acceleration Attenuation Relationship

    SciTech Connect

    Mu Heqing; Yuen Kaveng

    2010-05-21

    Estimation of peak ground acceleration is one of the main issues in civil and earthquake engineering practice. The Boore-Joyner-Fumal empirical formula is well known for this purpose. In this paper we propose to use the Bayesian probabilistic model class selection approach to obtain the most suitable prediction model class for the seismic attenuation formula. The optimal model class is robust in the sense that it has balance between the data fitting capability and the sensitivity to noise. A database of strong-motion records is utilized for the analysis. It turns out that the optimal model class is simpler than the full order attenuation model suggested by Boore, Joyner and Fumal (1993).
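
    The fit-versus-robustness trade-off behind model class selection can be caricatured with nested polynomial regressions scored by a BIC-type evidence approximation; the attenuation models and strong-motion database of the paper are not reproduced.

```python
# Nested polynomial model classes scored by BIC; the true curve is
# quadratic, so higher orders fit noise and are penalized. Synthetic data.
import numpy as np

rng = np.random.default_rng(12)
n = 300
x = rng.uniform(0.0, 1.0, n)
y = 1.0 + 2.0 * x - 1.0 * x ** 2 + rng.normal(0.0, 0.3, n)

for order in range(1, 6):
    Z = np.vander(x, order + 1)                      # polynomial design matrix
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    rss = np.sum((y - Z @ beta) ** 2)
    bic = n * np.log(rss / n) + (order + 1) * np.log(n)
    print(f"order {order}: BIC = {bic:8.1f}")
```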

  17. BASE-9: Bayesian Analysis for Stellar Evolution with nine variables

    NASA Astrophysics Data System (ADS)

    Robinson, Elliot; von Hippel, Ted; Stein, Nathan; Stenning, David; Wagner-Kaiser, Rachel; Si, Shijing; van Dyk, David

    2016-08-01

    The BASE-9 (Bayesian Analysis for Stellar Evolution with nine variables) software suite recovers star cluster and stellar parameters from photometry and is useful for analyzing single-age, single-metallicity star clusters, binaries, or single stars, and for simulating such systems. BASE-9 uses a Markov chain Monte Carlo (MCMC) technique along with brute force numerical integration to estimate the posterior probability distribution for the age, metallicity, helium abundance, distance modulus, line-of-sight absorption, and parameters of the initial-final mass relation (IFMR) for a cluster, and for the primary mass, secondary mass (if a binary), and cluster probability for every potential cluster member. The MCMC technique is used for the cluster quantities (the first six items listed above) and numerical integration is used for the stellar quantities (the last three items in the above list).

  18. Bayesian Library for the Analysis of Neutron Diffraction Data

    NASA Astrophysics Data System (ADS)

    Ratcliff, William; Lesniewski, Joseph; Quintana, Dylan

    During this talk, I will introduce the Bayesian Library for the Analysis of Neutron Diffraction Data. In this library we use the DREAM algorithm to effectively sample parameter space. This offers several advantages over traditional least squares fitting approaches. It gives us more robust estimates of the fitting parameters, their errors, and their correlations. It is also more stable than least squares methods and provides more confidence in finding a global minimum. I will discuss the algorithm and its application to several materials. I will show applications to both structural and magnetic diffraction patterns. I will present examples of fitting both powder and single crystal data. We would like to acknowledge support from the Department of Commerce and the NSF.
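
    The differential-evolution proposal at the heart of DREAM can be sketched in a few lines: each chain jumps along the difference of two randomly chosen other chains. The code below is plain DE-MC on a toy Gaussian target, an assumed simplification of the library's full DREAM sampler.

```python
# Plain differential-evolution MCMC (DE-MC) on a toy correlated Gaussian;
# DREAM adds adaptive crossover and outlier handling on top of this idea.
import numpy as np

rng = np.random.default_rng(13)
cov = np.array([[1.0, 0.9], [0.9, 1.0]])
cov_inv = np.linalg.inv(cov)

def log_post(x):
    return -0.5 * x @ cov_inv @ x

d, nchains = 2, 8
chains = rng.normal(size=(nchains, d))
lp = np.array([log_post(c) for c in chains])
gamma = 2.38 / np.sqrt(2 * d)                 # standard DE-MC jump scale

draws = []
for _ in range(4000):
    for i in range(nchains):
        others = [j for j in range(nchains) if j != i]
        a, b = rng.choice(others, size=2, replace=False)
        prop = chains[i] + gamma * (chains[a] - chains[b]) + rng.normal(0.0, 1e-4, d)
        lpp = log_post(prop)
        if np.log(rng.uniform()) < lpp - lp[i]:
            chains[i], lp[i] = prop, lpp
    draws.append(chains.copy())
post = np.concatenate(draws[1000:])
print("sample covariance:\n", np.round(np.cov(post.T), 2))
```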

  19. Testing Hardy-Weinberg equilibrium: an objective Bayesian analysis.

    PubMed

    Consonni, Guido; Moreno, Elías; Venturini, Sergio

    2011-01-15

    We analyze the general (multiallelic) Hardy-Weinberg equilibrium problem from an objective Bayesian testing standpoint. We argue that for small or moderate sample sizes the answer is rather sensitive to the prior chosen, and this suggests to carry out a sensitivity analysis with respect to the prior. This goal is achieved through the identification of a class of priors specifically designed for this testing problem. In this paper, we consider the class of intrinsic priors under the full model, indexed by a tuning quantity, the training sample size. These priors are objective, satisfy Savage's continuity condition and have proved to behave extremely well for many statistical testing problems. We compute the posterior probability of the Hardy-Weinberg equilibrium model for the class of intrinsic priors, assess robustness over the range of plausible answers, as well as stability of the decision in favor of either hypothesis. PMID:20963736

  20. Geometrically nonlinear analysis of adhesively bonded joints

    NASA Technical Reports Server (NTRS)

    Dattaguru, B.; Everett, R. A., Jr.; Whitcomb, J. D.; Johnson, W. S.

    1982-01-01

    A geometrically nonlinear finite element analysis of cohesive failure in typical joints is presented. Cracked-lap-shear joints were chosen for analysis. Results obtained from linear and nonlinear analysis show that nonlinear effects, due to large rotations, significantly affect the calculated mode 1 (crack opening) and mode 2 (in-plane shear) strain-energy-release rates. The ratio of the mode 1 to mode 2 strain-energy-release rates (G1/G2) was found to be strongly affected by the adhesive modulus and the adherend thickness. Ratios between 0.2 and 0.8 can be obtained by varying adherend thickness and using either a single or double cracked-lap-shear specimen configuration. Debond growth rate data, together with the analysis, indicate that the mode 1 strain-energy-release rate governs debond growth. Results from the present analysis agree well with experimentally measured joint opening displacements.

  1. A Bayesian Seismic Hazard Analysis for the city of Naples

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years many studies have focused on the determination and definition of the seismic, volcanic and tsunamigenic hazard in the city of Naples. The reason is that Naples and its neighboring area form one of the most densely populated places in Italy; in addition, the risk is increased by the type and condition of buildings and monuments in the city. It is crucial, therefore, to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present), and for each seismic event an active tectonic structure has been associated. Furthermore, a set of active faults located around the study area, well known from geological investigations but not associated with any recorded earthquake, that could nevertheless shake the city has also been taken into account. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure to include the information of past earthquakes into the probabilistic seismic hazard analysis. This strategy allows us, on the one hand, to enlarge the information used in the evaluation of the hazard, from alternative models for the earthquake generation process to past shaking, and on the other hand to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times, e.g., 50, 10 and 5 years, on a dense grid that covers the municipality of Naples, considering bedrock soil conditions.

  2. Discrete Dynamic Bayesian Network Analysis of fMRI Data

    PubMed Central

    Burge, John; Lane, Terran; Link, Hamilton; Qiu, Shibin; Clark, Vincent P.

    2010-01-01

    We examine the efficacy of using discrete Dynamic Bayesian Networks (dDBNs), a data-driven modeling technique employed in machine learning, to identify functional correlations among neuroanatomical regions of interest. Unlike many neuroimaging analysis techniques, this method is not limited by linear and/or Gaussian noise assumptions. It achieves this by modeling the time series of neuroanatomical regions as discrete, as opposed to continuous, random variables with multinomial distributions. We demonstrate this method using an fMRI dataset collected from healthy and demented elderly subjects and identify correlates based on a diagnosis of dementia. The results are validated in three ways. First, the elicited correlates are shown to be robust over leave-one-out cross-validation and, via a Fourier bootstrapping method, unlikely to be due to random chance. Second, the dDBNs identified correlates that would be expected given the experimental paradigm. Third, the dDBN's ability to predict dementia is competitive with two commonly employed machine-learning classifiers: the support vector machine and the Gaussian naïve Bayesian network. We also verify that the dDBN selects correlates based on non-linear criteria. Finally, we provide a brief analysis of the correlates elicited from Buckner et al.'s data that suggests that demented elderly subjects have reduced involvement of entorhinal and occipital cortex and greater involvement of the parietal lobe and amygdala in brain activity compared with healthy elderly (as measured via functional correlations among BOLD measurements). Limitations and extensions to the dDBN method are discussed. PMID:17990301
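
    A hedged sketch of the scoring idea that underlies such discrete-network learning: with multinomial nodes and Dirichlet priors, the marginal likelihood of a child given a candidate parent set is closed form (a BDe-type score). The discretized "regions" below are synthetic.

```python
# BDe-type family score: closed-form Dirichlet-multinomial marginal
# likelihood of a discretized child series given candidate parents.
import numpy as np
from scipy.special import gammaln

def family_score(child, parents, alpha=1.0):
    """log marginal likelihood of child counts given parent configurations."""
    r = int(child.max()) + 1                       # child arity
    if parents.size:
        _, cfg = np.unique(parents, axis=1, return_inverse=True)
    else:
        cfg = np.zeros(child.size, dtype=int)      # single empty configuration
    score = 0.0
    for c in np.unique(cfg):
        counts = np.bincount(child[cfg == c], minlength=r)
        score += (gammaln(r * alpha) - gammaln(r * alpha + counts.sum())
                  + np.sum(gammaln(alpha + counts) - gammaln(alpha)))
    return score

rng = np.random.default_rng(16)
a = rng.integers(0, 3, 500)                        # "region A" states at t-1
b = (a + rng.integers(0, 2, 500)) % 3              # "region B" at t depends on A
print("score with parent A:  ", round(family_score(b, a[None, :]), 1))
print("score with no parents:", round(family_score(b, np.empty((1, 0), dtype=int)), 1))
```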

  3. Modal analysis of jointed structures

    NASA Astrophysics Data System (ADS)

    Quinn, D. Dane

    2012-01-01

    Structural systems are often composed of multiple components joined together at localized interfaces. Compared to a corresponding monolithic system these interfaces are designed to have little influence on the load carrying capability of the system, and the resulting change in the overall system mass and stiffness is minimal. Hence, under nominal operating conditions the mode shapes and frequencies of the dominant structural modes are relatively insensitive to the presence of the interfaces. However, the energy dissipation in such systems is strongly dependent on the joints. The microslip that occurs at each interface couples together the structural modes of the system and introduces nonlinear damping into the system, effectively altering the observed damping of the structural modes, which can then significantly alter the amplitude of the response at the resonant modal frequencies. This work develops equations of motion for a jointed structure in terms of the structural modal coordinates and implements a reduced-order description of the microslip that occurs at the interface between components. The interface is incorporated into the modal description of the system through an existing decomposition of a series-series Iwan interface model and a continuum approximation for microslip of an elastic rod. The developed framework is illustrated on several examples, including a discrete three degree-of-freedom system as well as the longitudinal deformation of a continuum beam.
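
    The microslip behavior described here can be reproduced in miniature with a parallel bank of Jenkins elements (spring plus Coulomb slider), the building block of the Iwan model; the stiffness, slip strengths and weights below are arbitrary choices for illustration.

```python
# Parallel bank of Jenkins elements (Iwan model): springs with Coulomb
# sliders of distributed strength produce amplitude-dependent hysteresis.
import numpy as np

k = 1.0                                     # spring stiffness per element
phi = np.linspace(0.2, 2.0, 50)             # slider strengths (assumed)
w = np.full(phi.size, 1.0 / phi.size)       # element weights

def force(u_path):
    """Quasi-static joint force along a prescribed displacement path."""
    s = np.zeros(phi.size)                  # slider positions (internal state)
    out = []
    for u in u_path:
        f_el = k * (u - s)                  # trial elastic force per element
        sliding = np.abs(f_el) > phi
        # sliding elements relocate so their force saturates at +/- phi
        s = np.where(sliding, u - np.sign(f_el) * phi / k, s)
        out.append(np.sum(w * np.clip(f_el, -phi, phi)))
    return np.array(out)

u = 1.5 * np.sin(np.linspace(0.0, 4.0 * np.pi, 401))   # two loading cycles
f = force(u)
# dissipation per cycle = area of the hysteresis loop (second cycle)
u2, f2 = u[200:], f[200:]
energy = np.sum(0.5 * (f2[1:] + f2[:-1]) * np.diff(u2))
print("energy dissipated per cycle:", round(float(energy), 4))
```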

  4. Multiple SNP Set Analysis for Genome-Wide Association Studies Through Bayesian Latent Variable Selection.

    PubMed

    Lu, Zhao-Hua; Zhu, Hongtu; Knickmeyer, Rebecca C; Sullivan, Patrick F; Williams, Stephanie N; Zou, Fei

    2015-12-01

    The power of genome-wide association studies (GWAS) for mapping complex traits with single-SNP analysis (where SNP is single-nucleotide polymorphism) may be undermined by modest SNP effect sizes, unobserved causal SNPs, correlation among adjacent SNPs, and SNP-SNP interactions. Alternative approaches for testing the association between a single SNP set and individual phenotypes have been shown to be promising for improving the power of GWAS. We propose a Bayesian latent variable selection (BLVS) method to simultaneously model the joint association mapping between a large number of SNP sets and complex traits. Compared with single SNP set analysis, such joint association mapping not only accounts for the correlation among SNP sets but also is capable of detecting causal SNP sets that are marginally uncorrelated with traits. The spike-and-slab prior assigned to the effects of SNP sets can greatly reduce the dimension of effective SNP sets, while speeding up computation. An efficient Markov chain Monte Carlo algorithm is developed. Simulations demonstrate that BLVS outperforms several competing variable selection methods in some important scenarios. PMID:26515609

  6. RECONSTRUCTING EXPOSURE SCENARIOS USING DOSE BIOMARKERS - AN APPLICATION OF BAYESIAN UNCERTAINTY ANALYSIS

    EPA Science Inventory

    We use Bayesian uncertainty analysis to explore how to estimate pollutant exposures from biomarker concentrations. The growing number of national databases with exposure data makes such an analysis possible. They contain datasets of pharmacokinetic biomarkers for many polluta...

  7. Unsupervised Transient Light Curve Analysis via Hierarchical Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Sanders, N. E.; Betancourt, M.; Soderberg, A. M.

    2015-02-01

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST.
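
    A toy example of the partial-pooling structure described above: each object's latent parameter is drawn from a population distribution whose hyperparameters are fit jointly with the objects. This is a deliberately simplified stand-in (linear declines, Gaussian noise, flat hyperpriors) for the authors' light curve model; in practice the log posterior would be sampled with Hamiltonian Monte Carlo, as in the paper.

```python
import numpy as np

def log_posterior(params, data):
    """Joint log posterior for a toy hierarchical model: object i has a
    latent decline rate b_i ~ N(mu, tau); each light curve is observed with
    Gaussian errors.  Fitting all objects jointly lets well-sampled light
    curves constrain (mu, tau), which shrinks the estimates for poorly
    sampled objects.  Flat hyperpriors are assumed for brevity."""
    mu, log_tau = params[0], params[1]
    b = np.asarray(params[2:])
    tau = np.exp(log_tau)
    # population level: b_i ~ N(mu, tau)
    lp = -0.5 * np.sum((b - mu) ** 2) / tau**2 - b.size * log_tau
    # observation level: y = b_i * t + noise (illustrative linear decline)
    for b_i, (t, y, err) in zip(b, data):
        lp += -0.5 * np.sum(((y - b_i * t) / err) ** 2)
    return lp

# three objects with very different sampling densities
rng = np.random.default_rng(0)
data = [(t := np.sort(rng.uniform(0, 50, n)),
         0.02 * t + 0.1 * rng.standard_normal(n),
         np.full(n, 0.1)) for n in (40, 12, 3)]
print(log_posterior(np.array([0.02, np.log(0.005), 0.02, 0.02, 0.02]), data))
```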

  8. UNSUPERVISED TRANSIENT LIGHT CURVE ANALYSIS VIA HIERARCHICAL BAYESIAN INFERENCE

    SciTech Connect

    Sanders, N. E.; Soderberg, A. M.; Betancourt, M.

    2015-02-10

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST.

  9. Multivariate meta-analysis of mixed outcomes: a Bayesian approach.

    PubMed

    Bujkiewicz, Sylwia; Thompson, John R; Sutton, Alex J; Cooper, Nicola J; Harrison, Mark J; Symmons, Deborah P M; Abrams, Keith R

    2013-09-30

    Multivariate random effects meta-analysis (MRMA) is an appropriate way of synthesizing data from studies reporting multiple correlated outcomes. In a Bayesian framework, it has great potential for integrating evidence from a variety of sources. In this paper, we propose a Bayesian model for MRMA of mixed outcomes, which extends previously developed bivariate models to the trivariate case and also allows for combination of multiple outcomes that are both continuous and binary. We have constructed informative prior distributions for the correlations by using external evidence. Prior distributions for the within-study correlations were constructed by employing external individual patient data and using a double bootstrap method to obtain the correlations between mixed outcomes. The between-study model of MRMA was parameterized in the form of a product of a series of univariate conditional normal distributions. This allowed us to place explicit prior distributions on the between-study correlations, which were constructed using external summary data. Traditionally, independent 'vague' prior distributions are placed on all parameters of the model. In contrast to this approach, we constructed prior distributions for the between-study model parameters in a way that takes into account the inter-relationships between them. This is a flexible method that can be extended to incorporate mixed outcomes other than continuous and binary and beyond the trivariate case. We have applied this model to a motivating example in rheumatoid arthritis with the aim of incorporating all available evidence in the synthesis and potentially reducing uncertainty around the estimate of interest. PMID:23630081
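
    The product-of-conditionals parameterization mentioned above can be illustrated in the bivariate case: writing p(th1, th2) = p(th1) p(th2 | th1) places the between-study correlation in the conditional regression slope, where an explicit prior can be put on it. A minimal sketch with illustrative values:

```python
import numpy as np

rng = np.random.default_rng(0)

# between-study means/SDs for two outcomes, and one draw of the correlation
mu1, sd1 = 0.0, 1.0
mu2, sd2 = 0.0, 0.8
rho = 0.6            # in the full model this receives an informative prior

# p(th1, th2) = p(th1) * p(th2 | th1): the conditional regression slope
# rho*sd2/sd1 carries the between-study correlation, so a prior on rho is
# a prior on that slope
th1 = rng.normal(mu1, sd1, 10_000)
th2 = rng.normal(mu2 + rho * sd2 / sd1 * (th1 - mu1),
                 sd2 * np.sqrt(1 - rho**2))
print(np.corrcoef(th1, th2)[0, 1])   # ~0.6, recovering rho
```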

  10. A Bayesian latent group analysis for detecting poor effort in the assessment of malingering.

    PubMed

    Ortega, Alonso; Wagenmakers, Eric-Jan; Lee, Michael D; Markowitsch, Hans J; Piefke, Martina

    2012-06-01

    Despite their theoretical appeal, Bayesian methods for the assessment of poor effort and malingering are still rarely used in neuropsychological research and clinical diagnosis. In this article, we outline a novel and easy-to-use Bayesian latent group analysis of malingering whose goal is to identify participants displaying poor effort when tested. Our Bayesian approach also quantifies the confidence with which each participant is classified and estimates the base rates of malingering from the observed data. We implement our Bayesian approach and compare its utility in effort assessment to that of the classic below-chance criterion of symptom validity testing (SVT). In two experiments, we evaluate the accuracy of both a Bayesian latent group analysis and the below-chance criterion of SVT in recovering the membership of participants assigned to the malingering group. Experiment 1 uses a simulation research design, whereas Experiment 2 involves the differentiation of patients with a history of stroke from coached malingerers. In both experiments, sensitivity levels are high for the Bayesian method, but low for the below-chance criterion of SVT. Additionally, the Bayesian approach proves to be resistant to possible effects of coaching. We conclude that Bayesian latent group methods complement existing methods in making more informed choices about malingering. PMID:22543568

  11. Highly efficient Bayesian joint inversion for receiver based data and its application to lithospheric structure beneath the southern Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Kim, Seongryong; Dettmer, Jan; Rhie, Junkee; Tkalčić, Hrvoje

    2016-04-01

    With the deployment of extensive seismic arrays, systematic and efficient parameter and uncertainty estimation is of increasing importance and can provide reliable, regional models for crustal and upper-mantle structure. We present an efficient Bayesian method for the joint inversion of surface-wave dispersion and receiver-function data that combines trans-dimensional (trans-D) model selection in an optimisation phase with subsequent rigorous parameter uncertainty estimation. Parameter and uncertainty estimation depend strongly on the chosen parametrization such that meaningful regional comparison requires quantitative model selection that can be carried out efficiently at several sites. While significant progress has been made for model selection (e.g. trans-D inference) at individual sites, the lack of efficiency can prohibit application to large data volumes or cause questionable results due to lack of convergence. Studies that address large numbers of data sets have mostly ignored model selection in favour of more efficient/simple estimation techniques (i.e. focusing on uncertainty estimation but employing ad-hoc model choices). Our approach consists of a two-phase inversion that combines trans-D optimisation to select the most probable parametrization with subsequent Bayesian sampling for uncertainty estimation given that parametrization. The trans-D optimisation is implemented here by replacing the likelihood function with the Bayesian information criterion (BIC). The BIC provides constraints on model complexity that facilitate the search for an optimal parametrization. Parallel tempering (PT) is applied as an optimisation algorithm. After optimisation, the optimal model choice is identified by the minimum BIC value from all PT chains. Uncertainty estimation is then carried out in fixed dimension. Data errors are estimated as part of the inference problem by a combination of empirical and hierarchical estimation. Data covariance matrices are estimated from
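
    The BIC substitution in the optimisation phase trades goodness of fit against model complexity. A generic sketch of the criterion (the likelihood values and layer counts are illustrative, not from the study):

```python
import numpy as np

def bic(max_log_likelihood, n_params, n_data):
    """Bayesian information criterion: lower is better.  Adding layers to
    the model must improve the fit enough to offset the k*log(n) term."""
    return n_params * np.log(n_data) - 2.0 * max_log_likelihood

# comparing parametrizations with 3, 4 and 5 layers (illustrative numbers)
for k, ll in [(9, -1250.0), (12, -1231.0), (15, -1229.5)]:
    print(k, bic(ll, k, n_data=600))
```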

  12. Analysis of minor fractures associated with joints and faulted joints

    NASA Astrophysics Data System (ADS)

    Cruikshank, Kenneth M.; Zhao, Guozhu; Johnson, Arvid M.

    In this paper, we use fracture mechanics to interpret conditions responsible for secondary cracks that adorn joints and faulted joints in the Entrada Sandstone in Arches National Park, U.S.A. Because the joints in most places accommodated shearing offsets of a few mm to perhaps 1 dm, and thus became faulted joints, some of the minor cracks are due to faulting. However, in a few places where the shearing was zero, one can examine minor cracks due solely to interaction of joint segments at the time they formed. We recognize several types of minor cracks associated with subsequent faulting of the joints. One is the kink, a crack that occurs at the termination of a straight joint and whose trend is abruptly different from that of the joint. Kinks are common and should be studied because they contain a great deal of information about conditions during fracturing. The sense of kinking indicates the sense of shear during faulting: a kink that turns clockwise with respect to the direction of the main joint is a result of right-lateral shear, and a kink that turns counterclockwise is a result of left-lateral shear. Furthermore, the kink angle is related to the ratio of the shear stress responsible for the kinking to the normal stress responsible for the opening of the joint. The amount of opening of a joint at the time it faulted or even at the time the joint itself formed can be estimated by measuring the kink angle and the amount of strike-slip at some point along the faulted joint. Other fractures that form near terminations of pre-existing joints in response to shearing along the joint are horsetail fractures. Similar short fractures can occur anywhere along the length of the joints. The primary value in recognizing these fractures is that they indicate the sense of faulting accommodated by the host fracture and the direction of maximum tension. Even where there has been insignificant regional shearing in the Garden Area, the joints can have ornate terminations. Perhaps

  13. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods used to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment that is sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  14. BEAST 2: A Software Platform for Bayesian Evolutionary Analysis

    PubMed Central

    Bouckaert, Remco; Heled, Joseph; Kühnert, Denise; Vaughan, Tim; Wu, Chieh-Hsi; Xie, Dong; Suchard, Marc A.; Rambaut, Andrew; Drummond, Alexei J.

    2014-01-01

    We present a new open source, extensible and flexible software platform for Bayesian evolutionary analysis called BEAST 2. This software platform is a re-design of the popular BEAST 1 platform to correct structural deficiencies that became evident as the BEAST 1 software evolved. Key among those deficiencies was the lack of post-deployment extensibility. BEAST 2 now has a fully developed package management system that allows third-party developers to write additional functionality that can be directly installed to the BEAST 2 analysis platform via a package manager without requiring a new software release of the platform. This package architecture is showcased with a number of recently published new models encompassing birth-death-sampling tree priors, phylodynamics and model averaging for substitution models and site partitioning. A second major improvement is the ability to read/write the entire state of the MCMC chain to/from disk, allowing it to be easily shared between multiple instances of the BEAST software. This facilitates checkpointing and better support for multi-processor and high-end computing extensions. Finally, the functionality in new packages can be easily added to the user interface (BEAUti 2) by a simple XML template-based mechanism because BEAST 2 has been re-designed to provide greater integration between the analysis engine and the user interface so that, for example, BEAST and BEAUti use exactly the same XML file format. PMID:24722319

  15. Bayesian Model Selection with Network Based Diffusion Analysis.

    PubMed

    Whalen, Andrew; Hoppitt, William J E

    2016-01-01

    A number of recent studies have used Network Based Diffusion Analysis (NBDA) to detect the role of social transmission in the spread of a novel behavior through a population. In this paper we present a unified framework for performing NBDA in a Bayesian setting, and demonstrate how the Watanabe Akaike Information Criteria (WAIC) can be used for model selection. We present a specific example of applying this method to Time to Acquisition Diffusion Analysis (TADA). To examine the robustness of this technique, we performed a large scale simulation study and found that NBDA using WAIC could recover the correct model of social transmission under a wide range of cases, including under the presence of random effects, individual level variables, and alternative models of social transmission. This work suggests that NBDA is an effective and widely applicable tool for uncovering whether social transmission underpins the spread of a novel behavior, and may still provide accurate results even when key model assumptions are relaxed. PMID:27092089
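
    WAIC is computed from the pointwise log-likelihood of posterior samples. A minimal sketch, assuming a (samples x observations) matrix of log-likelihood evaluations; competing NBDA variants (e.g. social versus asocial transmission) would be compared by their WAIC values, lower being better:

```python
import numpy as np

def waic(log_lik):
    """WAIC from a matrix of pointwise log-likelihoods,
    shape (n_posterior_samples, n_observations)."""
    s = log_lik.shape[0]
    # log pointwise predictive density, computed stably per observation
    lppd = np.sum(np.logaddexp.reduce(log_lik, axis=0) - np.log(s))
    # effective number of parameters: pointwise posterior variance
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic)

# illustrative: log-likelihood draws for two models on the same 50 observations
rng = np.random.default_rng(0)
ll_social = rng.normal(-1.0, 0.1, size=(1000, 50))
ll_asocial = rng.normal(-1.3, 0.1, size=(1000, 50))
print(waic(ll_social), waic(ll_asocial))   # lower WAIC is preferred
```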

  16. New Ephemeris for LSI+61 303, A Bayesian Analysis

    NASA Astrophysics Data System (ADS)

    Gregory, P. C.

    1997-12-01

    The luminous early-type binary LSI+61 303 is an interesting radio, X-ray and possible gamma-ray source. At radio wavelengths it exhibits periodic outbursts with an approximate period of 26.5 days, as well as a longer-term modulation of the outburst peaks of approximately 4 years. Recently, from an analysis of RXTE all-sky monitoring data, Paredes et al. found evidence that the X-ray outbursts very likely recur with the same period as the radio outbursts. The system has been observed by many groups at all wavelengths, but the energy source powering the radio outbursts and their relation to the high-energy emission remain a mystery. For more details see the "LSI+61 303 Resource Page" at http://www.srl.caltech.edu/personnel/paulr/lsi.html . There has been increasing evidence for a change in the period of the system. We will present a new ephemeris for the system based on a Bayesian analysis of 20 years of radio observations, including the GBI-NASA radio monitoring data.

  17. A procedure for seiche analysis with Bayesian information criterion

    NASA Astrophysics Data System (ADS)

    Aichi, Masaatsu

    2016-04-01

    Seiche is a standing wave in an enclosed or semi-enclosed water body. Its amplitude changes irregularly in time owing to weather conditions, among other factors, so extracting the seiche signal is difficult with the usual methods for time-series analysis, such as the fast Fourier transform (FFT). In this study, a new method for time-series analysis with a Bayesian information criterion was developed to decompose time series from tide stations into seiche, tide, long-term trend, and residual components. The method is based on maximum marginal likelihood estimation of the tide amplitudes, the seiche amplitude, and the trend components. The seiche amplitude and trend components were assumed to vary gradually, with second derivatives in time close to zero. These assumptions were incorporated as prior distributions, whose variances were estimated by minimizing the Akaike-Bayes information criterion (ABIC). The seiche frequency was determined by Newton's method, with an initial guess from the FFT. The accuracy of the proposed method was checked by analyzing synthetic time series composed of known components; the original components were reproduced well. The proposed method was also applied to actual time series of sea level observed by a tide station and of the strain of coastal rock masses observed by a fiber Bragg grating sensor in Aburatsubo Bay, Japan. The seiche in the bay and the response of the rock masses were successfully extracted.
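
    A stripped-down sketch of the decomposition step, omitting the smoothness priors and ABIC machinery of the paper: tidal constituents at known frequencies plus one seiche sinusoid are fit by least squares, scanning candidate seiche frequencies around an initial FFT guess (all names and values illustrative):

```python
import numpy as np

def decompose(t, y, tide_freqs, f_grid):
    """Fit y(t) = trend + tidal sinusoids + one seiche sinusoid by least
    squares, scanning candidate seiche frequencies (e.g. around an FFT
    peak) and keeping the frequency with the smallest residual."""
    best = None
    for f in f_grid:
        cols = [np.ones_like(t), t]                      # linear trend
        for ff in list(tide_freqs) + [f]:
            cols += [np.cos(2 * np.pi * ff * t), np.sin(2 * np.pi * ff * t)]
        A = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        rss = np.sum((y - A @ coef) ** 2)
        if best is None or rss < best[0]:
            best = (rss, f, coef)
    return best  # (rss, seiche frequency, fitted coefficients)

t = np.linspace(0, 3, 2000)                              # days
y = (0.5 * np.sin(2 * np.pi * 1.93 * t)                  # M2-like tide
     + 0.1 * np.sin(2 * np.pi * 48.0 * t)                # seiche
     + 0.02 * t + np.random.default_rng(0).normal(0, 0.02, t.size))
rss, f_seiche, _ = decompose(t, y, tide_freqs=[1.93],
                             f_grid=np.linspace(47, 49, 41))
print(f_seiche)   # ~48 cycles/day
```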

  18. Microcanonical thermostatistics analysis without histograms: Cumulative distribution and Bayesian approaches

    NASA Astrophysics Data System (ADS)

    Alves, Nelson A.; Morero, Lucas D.; Rizzi, Leandro G.

    2015-06-01

    Microcanonical thermostatistics analysis has become an important tool to reveal essential aspects of phase transitions in complex systems. An efficient way to estimate the microcanonical inverse temperature β(E) and the microcanonical entropy S(E) is achieved with the statistical temperature weighted histogram analysis method (ST-WHAM). The strength of this method lies in its flexibility, as it can be used to analyse data produced by algorithms with generalised sampling weights. However, for any sampling weight, ST-WHAM requires the calculation of derivatives of energy histograms H(E), which leads to non-trivial and tedious binning tasks for models with a continuous energy spectrum, such as those for biomolecular and colloidal systems. Here, we discuss two alternative methods that avoid the need for such energy binning and yield continuous estimates of H(E) for evaluating β(E) via ST-WHAM: (i) a series expansion to estimate probability densities from the empirical cumulative distribution function (CDF), and (ii) a Bayesian approach to model this CDF. A comparison with a simple linear regression method is also carried out. The performance of these approaches is evaluated on coarse-grained protein models for folding and peptide aggregation.

  19. Bayesian Model Selection with Network Based Diffusion Analysis

    PubMed Central

    Whalen, Andrew; Hoppitt, William J. E.

    2016-01-01

    A number of recent studies have used Network Based Diffusion Analysis (NBDA) to detect the role of social transmission in the spread of a novel behavior through a population. In this paper we present a unified framework for performing NBDA in a Bayesian setting, and demonstrate how the Watanabe Akaike Information Criteria (WAIC) can be used for model selection. We present a specific example of applying this method to Time to Acquisition Diffusion Analysis (TADA). To examine the robustness of this technique, we performed a large scale simulation study and found that NBDA using WAIC could recover the correct model of social transmission under a wide range of cases, including under the presence of random effects, individual level variables, and alternative models of social transmission. This work suggests that NBDA is an effective and widely applicable tool for uncovering whether social transmission underpins the spread of a novel behavior, and may still provide accurate results even when key model assumptions are relaxed. PMID:27092089

  20. Using Bayesian analysis in repeated preclinical in vivo studies for a more effective use of animals.

    PubMed

    Walley, Rosalind; Sherington, John; Rastrick, Joe; Detrait, Eric; Hanon, Etienne; Watt, Gillian

    2016-05-01

    Whilst innovative Bayesian approaches are increasingly used in clinical studies, in the preclinical area Bayesian methods appear to be rarely used in the reporting of pharmacology data. This is particularly surprising in the context of regularly repeated in vivo studies where there is a considerable amount of data from historical control groups, which has potential value. This paper describes our experience with introducing Bayesian analysis for such studies using a Bayesian meta-analytic predictive approach. This leads naturally either to an informative prior for a control group as part of a full Bayesian analysis of the next study or using a predictive distribution to replace a control group entirely. We use quality control charts to illustrate study-to-study variation to the scientists and describe informative priors in terms of their approximate effective numbers of animals. We describe two case studies of animal models: the lipopolysaccharide-induced cytokine release model used in inflammation and the novel object recognition model used to screen cognitive enhancers, both of which show the advantage of a Bayesian approach over the standard frequentist analysis. We conclude that using Bayesian methods in stable repeated in vivo studies can result in a more effective use of animals, either by reducing the total number of animals used or by increasing the precision of key treatment differences. This will lead to clearer results and supports the "3Rs initiative" to Refine, Reduce and Replace animals in research. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27028721
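
    The paper describes informative priors in terms of their approximate effective numbers of animals. For a normal prior on a group mean, a common back-of-the-envelope version of that calculation is sketched below; the values are illustrative and this is not the authors' exact derivation.

```python
def prior_effective_n(prior_sd, residual_sd):
    """Approximate number of animals 'worth' of information in a normal
    prior: a prior with SD prior_sd on a group mean carries about as much
    information as n = residual_sd**2 / prior_sd**2 animals whose
    individual responses have standard deviation residual_sd."""
    return (residual_sd / prior_sd) ** 2

# e.g. a meta-analytic predictive prior with SD 0.35 on the control mean,
# animal-to-animal SD 1.0 (illustrative values)
print(prior_effective_n(0.35, 1.0))   # ~8 animals
```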

  1. Toward a Behavioral Analysis of Joint Attention

    ERIC Educational Resources Information Center

    Dube, William V.; MacDonald, Rebecca P. F.; Mansfield, Renee C.; Holcomb, William L.; Ahearn, William H.

    2004-01-01

    Joint attention (JA) initiation is defined in cognitive-developmental psychology as a child's actions that verify or produce simultaneous attending by that child and an adult to some object or event in the environment so that both may experience the object or event together. This paper presents a contingency analysis of gaze shift in JA…

  2. Nuclear stockpile stewardship and Bayesian image analysis (DARHT and the BIE)

    SciTech Connect

    Carroll, James L

    2011-01-11

    Since the end of nuclear testing, the reliability of our nation's nuclear weapon stockpile has been assessed using sub-critical hydrodynamic testing. These tests involve extremely demanding radiography. We discuss the challenges these problems pose and the solutions provided by DARHT (the world's premier hydrodynamic testing facility) and by the BIE, or Bayesian Inference Engine (a powerful radiography analysis software tool), including the application of Bayesian image analysis techniques to this important and difficult problem.

  3. Bayesian analysis of multimodal data and brain imaging

    NASA Astrophysics Data System (ADS)

    Assadi, Amir H.; Eghbalnia, Hamid; Backonja, Miroslav; Wakai, Ronald T.; Rutecki, Paul; Haughton, Victor

    2000-06-01

    It is often the case that information about a process can be obtained using a variety of methods. Each method is employed because of specific advantages over the competing alternatives. An example in medical neuro-imaging is the choice between fMRI and MEG modes, where fMRI can provide high spatial resolution in comparison to the superior temporal resolution of MEG. The combination of data from varying modes provides the opportunity to infer results that may not be possible by means of any one mode alone. We discuss a Bayesian and learning-theoretic framework for enhanced feature extraction that is particularly suited to multi-modal investigations of massive data sets from multiple experiments. In the following Bayesian approach, acquired knowledge (information) regarding various aspects of the process is directly incorporated into the formulation. This information can come from a variety of sources; in our case, it represents statistical information obtained from other modes of data collection. The information is used to train a learning machine to estimate a probability distribution, which is used in turn by a second machine as a prior, in order to produce a more refined estimation of the distribution of events. The computational demand of the algorithm is handled by a proposed distributed parallel implementation on a cluster of workstations that can be scaled to address real-time needs if required. We provide a simulation of these methods on a set of synthetically generated MEG and EEG data and show how spatial and temporal resolutions improve by using prior distributions. Applied to fMRI signals, the method permits construction of the probability distribution of the nonlinear hemodynamics of the human brain (real data). These computational results are in agreement with biologically based measurements from other labs, as reported to us by researchers from the UK. We also provide preliminary analysis involving multi-electrode cortical recording that accompanies

  4. Bayesian semiparametric nonlinear mixed-effects joint models for data with skewness, missing responses, and measurement errors in covariates.

    PubMed

    Huang, Yangxin; Dagne, Getachew

    2012-09-01

    It is common practice to analyze complex longitudinal data using semiparametric nonlinear mixed-effects (SNLME) models with a normal distribution. The assumption of normal model errors may unrealistically obscure important features of subject variation. To partially explain between- and within-subject variation, covariates are usually introduced in such models, but some covariates may be measured with substantial error. Moreover, the responses may be missing, and the missingness may be nonignorable. Inferential procedures become dramatically complicated when skewness, missing values, and measurement error are all present in the observed data. In the literature, there has been considerable interest in accommodating skewness, incompleteness, or covariate measurement error individually in such models, but relatively little work has addressed all three features simultaneously. In this article, our objective is to address the simultaneous impact of skewness, missingness, and covariate measurement error by jointly modeling the response and covariate processes with a flexible Bayesian SNLME model. The method is illustrated using a real AIDS data set to compare potential models under various scenarios and different distribution specifications. PMID:22150787

  5. Interacting Agricultural Pests and Their Effect on Crop Yield: Application of a Bayesian Decision Theory Approach to the Joint Management of Bromus tectorum and Cephus cinctus

    PubMed Central

    Keren, Ilai N.; Menalled, Fabian D.; Weaver, David K.; Robison-Cox, James F.

    2015-01-01

    Worldwide, the landscape homogeneity of extensive monocultures that characterizes conventional agriculture has resulted in the development of specialized and interacting multitrophic pest complexes. While integrated pest management emphasizes the need to consider the ecological context where multiple species coexist, management recommendations are often based on single-species tactics. This approach may not provide satisfactory solutions when confronted with the complex interactions occurring between organisms at the same or different trophic levels. Replacement of the single-species management model with more sophisticated, multi-species programs requires an understanding of the direct and indirect interactions occurring between the crop and all categories of pests. We evaluated a modeling framework to make multi-pest management decisions taking into account direct and indirect interactions among species belonging to different trophic levels. We adopted a Bayesian decision theory approach in combination with path analysis to evaluate interactions between Bromus tectorum (downy brome, cheatgrass) and Cephus cinctus (wheat stem sawfly) in wheat (Triticum aestivum) systems. We assessed their joint responses to weed management tactics, seeding rates, and cultivar tolerance to insect stem boring or competition. Our results indicated that C. cinctus oviposition behavior varied as a function of B. tectorum pressure. Crop responses were more readily explained by the joint effects of management tactics on both categories of pests and their interactions than just by the direct impact of any particular management scheme on yield. Accordingly, a C. cinctus-tolerant variety should be planted at a low seeding rate under high insect pressure. However, as B. tectorum levels increase, the C. cinctus-tolerant variety should be replaced by a competitive and drought-tolerant cultivar at high seeding rates despite C. cinctus infestation. This study exemplifies the necessity of

  6. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives.

    PubMed

    Green, Adam W; Bailey, Larissa L

    2015-01-01

    Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools) using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies. PMID:26658734
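
    A stripped-down sketch of the quasi-extinction calculation: simulate pool occupancy forward with persistence and colonization probabilities and count how often the metapopulation falls below a threshold. The authors' BMPVA propagates posterior parameter uncertainty through such simulations; here the parameter values are fixed and purely illustrative.

```python
import numpy as np

def quasi_extinction_prob(n_pools, years=50, phi=0.85, gamma=0.2,
                          threshold=2, n_sims=5000, seed=0):
    """P(number of occupied pools ever drops below `threshold` within
    `years`), for pools that persist with probability phi and are
    colonized with probability gamma (illustrative values)."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        occ = np.ones(n_pools, dtype=bool)        # start fully occupied
        for _ in range(years):
            stay = rng.random(n_pools) < phi
            colonize = rng.random(n_pools) < gamma
            occ = (occ & stay) | (~occ & colonize)
            if occ.sum() < threshold:
                hits += 1
                break
    return hits / n_sims

# more pools -> lower quasi-extinction risk
print(quasi_extinction_prob(10), quasi_extinction_prob(50))
```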

  7. Bayesian analysis of input uncertainty in hydrological modeling: 2. Application

    NASA Astrophysics Data System (ADS)

    Kavetski, Dmitri; Kuczera, George; Franks, Stewart W.

    2006-03-01

    The Bayesian total error analysis (BATEA) methodology directly addresses both input and output errors in hydrological modeling, requiring the modeler to make explicit, rather than implicit, assumptions about the likely extent of data uncertainty. This study considers a BATEA assessment of two North American catchments: (1) the French Broad River and (2) the Potomac basins. It assesses the performance of the conceptual Variable Infiltration Capacity (VIC) model with and without accounting for input (precipitation) uncertainty. The results show the considerable effects of precipitation errors on the predicted hydrographs (especially the prediction limits) and on the calibrated parameters. In addition, the performance of BATEA in the presence of severe model errors is analyzed. While BATEA allows a very direct treatment of input uncertainty and yields some limited insight into model errors, it requires the specification of valid error models, which are currently poorly understood and require further work. Moreover, it leads to computationally challenging, high-dimensional problems. For some types of models, including the VIC implemented using robust numerical methods, the computational cost of BATEA can be reduced using Newton-type methods.

  8. A Bayesian Model for the Analysis of Transgenerational Epigenetic Variation

    PubMed Central

    Varona, Luis; Munilla, Sebastián; Mouresan, Elena Flavia; González-Rodríguez, Aldemar; Moreno, Carlos; Altarriba, Juan

    2015-01-01

    Epigenetics has become one of the major areas of biological research. However, the degree of phenotypic variability that is explained by epigenetic processes still remains unclear. From a quantitative genetics perspective, the estimation of variance components is achieved by means of the information provided by the resemblance between relatives. In a previous study, this resemblance was described as a function of the epigenetic variance component and a reset coefficient that indicates the rate of dissipation of epigenetic marks across generations. Given these assumptions, we propose a Bayesian mixed model methodology that allows the estimation of epigenetic variance from a genealogical and phenotypic database. The methodology is based on the development of a T matrix of epigenetic relationships that depends on the reset coefficient. In addition, we present a simple procedure for the calculation of the inverse of this matrix (T−1) and a Gibbs sampler algorithm that obtains posterior estimates of all the unknowns in the model. The new procedure was used with two simulated data sets and with a beef cattle database. In the simulated populations, the results of the analysis provided marginal posterior distributions that included the population parameters in the regions of highest posterior density. In the case of the beef cattle dataset, the posterior estimate of transgenerational epigenetic variability was very low, and a model comparison test indicated that a model that did not include it was the most plausible. PMID:25617408

  9. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks [Scargle 1998], that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
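
    The optimal segmentation is found by a dynamic program over candidate change points. A compact sketch for event data, using the Scargle et al. 2013 fitness with a constant per-block penalty (assuming distinct event times; the ncp_prior value is illustrative):

```python
import numpy as np

def bayesian_blocks(t, ncp_prior=4.0):
    """Optimal piecewise-constant segmentation of event times t via the
    dynamic program of Scargle et al. (2013); returns block left edges."""
    t = np.sort(np.asarray(t, dtype=float))
    n = t.size
    # candidate block edges: midpoints between events, plus the data span
    edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for r in range(n):
        # fitness of a block spanning events k..r, for every candidate k
        width = edges[r + 1] - edges[:r + 1]
        count = r + 1 - np.arange(r + 1)
        fit = count * (np.log(count) - np.log(width)) - ncp_prior
        fit[1:] += best[:r]              # add best partition before block
        last[r] = np.argmax(fit)
        best[r] = np.max(fit)
    # backtrack to recover the change points
    cps, i = [], n
    while i > 0:
        cps.append(last[i - 1])
        i = last[i - 1]
    return edges[np.array(cps[::-1])]

rng = np.random.default_rng(0)
events = np.sort(np.concatenate([rng.uniform(0, 10, 50),
                                 rng.uniform(4, 5, 80)]))  # flare at t~4-5
print(bayesian_blocks(events))   # block edges bracketing the rate change
```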

  10. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives

    PubMed Central

    Green, Adam W.; Bailey, Larissa L.

    2015-01-01

    Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools) using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies. PMID:26658734

  11. Cepheid light curve demography via Bayesian functional data analysis

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas J.; Hendry, Martin; Kowal, Daniel; Ruppert, David

    2016-01-01

    Synoptic time-domain surveys provide astronomers, not simply more data, but a different kind of data: large ensembles of multivariate, irregularly and asynchronously sampled light curves. We describe a statistical framework for light curve demography—optimal accumulation and extraction of information, not only along individual light curves as conventional methods do, but also across large ensembles of related light curves. We build the framework using tools from functional data analysis (FDA), a rapidly growing area of statistics that addresses inference from datasets that sample ensembles of related functions. Our Bayesian FDA framework builds hierarchical models that describe light curve ensembles using multiple levels of randomness: upper levels describe the source population, and lower levels describe the observation process, including measurement errors and selection effects. Roughly speaking, a particular object's light curve is modeled as the sum of a parameterized template component (modeling population-averaged behavior) and a peculiar component (modeling variability across the population), subsequently subjected to an observation model. A functional shrinkage adjustment to individual light curves emerges—an adaptive, functional generalization of the kind of adjustments made for Eddington or Malmquist bias in single-epoch photometric surveys. We describe ongoing work applying the framework to improved estimation of Cepheid variable star luminosities via FDA-based refinement and generalization of the Cepheid period-luminosity relation.

  12. Light curve demography via Bayesian functional data analysis

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas; Budavari, Tamas; Hendry, Martin A.; Kowal, Daniel; Ruppert, David

    2015-08-01

    Synoptic time-domain surveys provide astronomers, not simply more data, but a different kind of data: large ensembles of multivariate, irregularly and asynchronously sampled light curves. We describe a statistical framework for light curve demography—optimal accumulation and extraction of information, not only along individual light curves as conventional methods do, but also across large ensembles of related light curves. We build the framework using tools from functional data analysis (FDA), a rapidly growing area of statistics that addresses inference from datasets that sample ensembles of related functions. Our Bayesian FDA framework builds hierarchical models that describe light curve ensembles using multiple levels of randomness: upper levels describe the source population, and lower levels describe the observation process, including measurement errors and selection effects. Schematically, a particular object's light curve is modeled as the sum of a parameterized template component (modeling population-averaged behavior) and a peculiar component (modeling variability across the population), subsequently subjected to an observation model. A functional shrinkage adjustment to individual light curves emerges—an adaptive, functional generalization of the kind of adjustments made for Eddington or Malmquist bias in single-epoch photometric surveys. We are applying the framework to a variety of problems in synoptic time-domain survey astronomy, including optimal detection of weak sources in multi-epoch data, and improved estimation of Cepheid variable star luminosities from detailed demographic modeling of ensembles of Cepheid light curves.

  13. Nonparametric survival analysis using Bayesian Additive Regression Trees (BART).

    PubMed

    Sparapani, Rodney A; Logan, Brent R; McCulloch, Robert E; Laud, Purushottam W

    2016-07-20

    Bayesian additive regression trees (BART) provide a framework for flexible nonparametric modeling of relationships of covariates to outcomes. Recently, BART models have been shown to provide excellent predictive performance, for both continuous and binary outcomes, often exceeding that of competing methods. Software is also readily available for such outcomes. In this article, we introduce modeling that extends the usefulness of BART in medical applications by addressing needs arising in survival analysis. Simulation studies of one-sample and two-sample scenarios, in comparison with long-standing traditional methods, establish face validity of the new approach. We then demonstrate the model's ability to accommodate data from complex regression models with a simulation study of a nonproportional hazards scenario with crossing survival functions, and with survival function estimation in a scenario where hazards are multiplicatively modified by a highly nonlinear function of the covariates. Using data from a recently published study of patients undergoing hematopoietic stem cell transplantation, we illustrate the use and some advantages of the proposed method in medical investigations. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26854022
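
    One way to adapt binary-outcome BART to survival data, in the spirit of the approach described above, is to recast (time, event) pairs as binary person-period records on a grid of distinct event times; a binary BART model fit to the expanded data then estimates discrete-time hazards. A hedged sketch of the expansion step only (names illustrative):

```python
import numpy as np

def person_period(times, events, covariates):
    """Expand (time, event, x) survival data into person-period form: one
    row per subject per distinct event time <= that subject's follow-up,
    with y = 1 only at the subject's own event time.  A binary BART fit
    to (y | time, x) then yields discrete-time hazard estimates."""
    grid = np.unique(times[events == 1])          # distinct event times
    rows, ys = [], []
    for t_i, d_i, x_i in zip(times, events, covariates):
        for t in grid[grid <= t_i]:               # periods subject i is at risk
            rows.append(np.concatenate([[t], np.atleast_1d(x_i)]))
            ys.append(1 if (d_i == 1 and t == t_i) else 0)
    return np.array(rows), np.array(ys)

X, y = person_period(np.array([3., 5., 5., 9.]),   # follow-up times
                     np.array([1, 0, 1, 1]),        # 1 = event, 0 = censored
                     np.array([0.2, -1.0, 0.5, 1.3]))
print(X.shape, y.sum())
```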

  14. Dynamic sensor action selection with Bayesian decision analysis

    NASA Astrophysics Data System (ADS)

    Kristensen, Steen; Hansen, Volker; Kondak, Konstantin

    1998-10-01

    The aim of this work is to create a framework for the dynamic planning of sensor actions for an autonomous mobile robot. The framework uses Bayesian decision analysis, i.e., a decision-theoretic method, to evaluate possible sensor actions and select the most appropriate ones given the available sensors and what is currently known about the state of the world. Since sensing changes the knowledge of the system, and since the current state of the robot (task, position, etc.) determines what knowledge is relevant, the evaluation and selection of sensing actions is an on-going process that effectively determines the behavior of the robot. The framework has been implemented on a real mobile robot and has proven able to control the sensor actions of the system in real time. In current work we are investigating methods to reduce or automatically generate the model information needed by the decision-theoretic method to select the appropriate sensor actions.
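
    A minimal decision-theoretic sketch of sensor selection, assuming a discrete world state, per-sensor confusion matrices, and an expected-posterior-entropy-plus-cost loss; this is an illustration of the general idea, not the authors' implementation:

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def best_sensor(prior, likelihoods, costs, trade_off=1.0):
    """Pick the sensor action minimizing expected posterior entropy plus
    cost.  likelihoods[a] is an (n_obs, n_states) matrix p(z | s) for
    sensor a; prior is the current belief p(s)."""
    scores = []
    for L, c in zip(likelihoods, costs):
        p_z = L @ prior                            # predictive p(z)
        exp_H = 0.0
        for z, pz in enumerate(p_z):
            if pz > 0:
                post = L[z] * prior / pz           # Bayes update given z
                exp_H += pz * entropy(post)
        scores.append(exp_H + trade_off * c)
    return int(np.argmin(scores)), scores

# two illustrative sensors over a 3-state world: accurate-but-costly vs noisy
prior = np.array([0.5, 0.3, 0.2])
sharp = np.array([[0.9, 0.05, 0.05], [0.05, 0.9, 0.05], [0.05, 0.05, 0.9]])
noisy = np.full((3, 3), 1 / 3)
print(best_sensor(prior, [sharp, noisy], costs=[0.3, 0.05]))
```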

  15. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    SciTech Connect

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks, that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  16. Bayesian belief network analysis applied to determine the progression of temporomandibular disorders using MRI

    PubMed Central

    2015-01-01

    Objectives: This study investigated the applicability of a Bayesian belief network (BBN) to MR images to diagnose temporomandibular disorders (TMDs). Our aim was to determine the progression of TMDs, focusing on how each finding affects the other. Methods: We selected 1.5-T MRI findings (33 variables) and diagnoses (bone changes and disc displacement) of patients with TMD from 2007 to 2008. There were a total of 295 cases with 590 sides of temporomandibular joints (TMJs). The data were modified according to the research diagnostic criteria of TMD. We compared the accuracy of the BBN using 11 algorithms (necessary path condition, path condition, greedy search-and-score with Bayesian information criterion, Chow–Liu tree, Rebane–Pearl poly tree, tree augmented naïve Bayes model, maximum log likelihood, Akaike information criterion, minimum description length, K2 and C4.5), a multiple regression analysis and an artificial neural network using resubstitution validation and 10-fold cross-validation. Results: There were 191 TMJs (32.4%) with bone changes and 340 (57.6%) with articular disc displacement. The BBN path condition algorithm using resubstitution validation and 10-fold cross-validation was >99% accurate. However, the main advantage of a BBN is that it can represent the causal relationships between different findings and assign conditional probabilities, which can then be used to interpret the progression of TMD. Conclusions: Osteoarthritic bone changes progressed from condyle to articular fossa and finally to mandibular bone contours. Disc displacement was directly related to severe bone changes. Early bone changes were not directly related to disc displacement. TMJ functional factors (condylar translation, bony space and disc form) and age mediated between bone changes and disc displacement. PMID:25472616

  17. ChIP-BIT: Bayesian inference of target genes using a novel joint probabilistic model of ChIP-seq profiles.

    PubMed

    Chen, Xi; Jung, Jin-Gyoung; Shajahan-Haq, Ayesha N; Clarke, Robert; Shih, Ie-Ming; Wang, Yue; Magnani, Luca; Wang, Tian-Li; Xuan, Jianhua

    2016-04-20

    Chromatin immunoprecipitation with massively parallel DNA sequencing (ChIP-seq) has greatly improved the reliability with which transcription factor binding sites (TFBSs) can be identified from genome-wide profiling studies. Many computational tools have been developed to detect binding events or peaks; however, the robust detection of weak binding events remains a challenge for current peak calling tools. We have developed a novel Bayesian approach (ChIP-BIT) to reliably detect TFBSs and their target genes by jointly modeling binding signal intensities and binding locations of TFBSs. Specifically, a Gaussian mixture model is used to capture both binding and background signals in sample data. As a unique feature of ChIP-BIT, background signals are modeled by a local Gaussian distribution that is accurately estimated from the input data. Extensive simulation studies showed a significantly improved performance of ChIP-BIT in target gene prediction, particularly for detecting weak binding signals at gene promoter regions. We applied ChIP-BIT to find target genes from NOTCH3 and PBX1 ChIP-seq data acquired from MCF-7 breast cancer cells. TF knockdown experiments have initially validated about 30% of the co-regulated target genes identified by ChIP-BIT as being differentially expressed in MCF-7 cells. Functional analysis of these genes further revealed the existence of crosstalk between the Notch and Wnt signaling pathways. PMID:26704972
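
    The binding-versus-background separation rests on a two-component Gaussian mixture over signal intensities. A minimal EM sketch of that idea follows; ChIP-BIT additionally models binding locations and estimates the background locally from input data, which is omitted here.

```python
import numpy as np
from scipy.stats import norm

def em_binding(x, n_iter=200):
    """EM for a 1-D two-component Gaussian mixture of signal intensities:
    component 0 = background, component 1 = binding (higher mean).
    Returns P(binding | intensity) for each region."""
    mu = np.array([np.quantile(x, 0.25), np.quantile(x, 0.9)])
    sd = np.array([x.std(), x.std()])
    w = np.array([0.8, 0.2])
    for _ in range(n_iter):
        dens = w * norm.pdf(x[:, None], mu, sd)        # (n, 2)
        resp = dens / dens.sum(axis=1, keepdims=True)  # E-step
        n_k = resp.sum(axis=0)
        w = n_k / x.size                               # M-step
        mu = (resp * x[:, None]).sum(axis=0) / n_k
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)
    return resp[:, 1]

# illustrative intensities: mostly background, a minority of bound regions
x = np.concatenate([np.random.default_rng(0).normal(2.0, 1.0, 900),
                    np.random.default_rng(1).normal(5.0, 1.0, 100)])
print((em_binding(x) > 0.5).sum())   # regions called as bound
```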

  18. ChIP-BIT: Bayesian inference of target genes using a novel joint probabilistic model of ChIP-seq profiles

    PubMed Central

    Chen, Xi; Jung, Jin-Gyoung; Shajahan-Haq, Ayesha N.; Clarke, Robert; Shih, Ie-Ming; Wang, Yue; Magnani, Luca; Wang, Tian-Li; Xuan, Jianhua

    2016-01-01

    Chromatin immunoprecipitation with massively parallel DNA sequencing (ChIP-seq) has greatly improved the reliability with which transcription factor binding sites (TFBSs) can be identified from genome-wide profiling studies. Many computational tools have been developed to detect binding events or peaks; however, the robust detection of weak binding events remains a challenge for current peak calling tools. We have developed a novel Bayesian approach (ChIP-BIT) to reliably detect TFBSs and their target genes by jointly modeling binding signal intensities and binding locations of TFBSs. Specifically, a Gaussian mixture model is used to capture both binding and background signals in sample data. As a unique feature of ChIP-BIT, background signals are modeled by a local Gaussian distribution that is accurately estimated from the input data. Extensive simulation studies showed a significantly improved performance of ChIP-BIT in target gene prediction, particularly for detecting weak binding signals at gene promoter regions. We applied ChIP-BIT to find target genes from NOTCH3 and PBX1 ChIP-seq data acquired from MCF-7 breast cancer cells. TF knockdown experiments have initially validated about 30% of the co-regulated target genes identified by ChIP-BIT as being differentially expressed in MCF-7 cells. Functional analysis of these genes further revealed the existence of crosstalk between the Notch and Wnt signaling pathways. PMID:26704972

  19. Bayesian Joint Selection of Genes and Pathways: Applications in Multiple Myeloma Genomics

    PubMed Central

    Zhang, Lin; Morris, Jeffrey S; Zhang, Jiexin; Orlowski, Robert Z; Baladandayuthapani, Veerabhadran

    2014-01-01

    It is well-established that the development of a disease, especially cancer, is a complex process that results from the joint effects of multiple genes involved in various molecular signaling pathways. In this article, we propose methods to discover genes and molecular pathways significantly associated with clinical outcomes in cancer samples. We exploit the natural hierarchal structure of genes related to a given pathway as a group of interacting genes to conduct selection of both pathways and genes. We posit the problem in a hierarchical structured variable selection (HSVS) framework to analyze the corresponding gene expression data. HSVS methods conduct simultaneous variable selection at the pathway (group level) and the gene (within-group) level. To adapt to the overlapping group structure present in the pathway–gene hierarchy of the data, we developed an overlap-HSVS method that introduces latent partial effect variables that partition the marginal effect of the covariates and corresponding weights for a proportional shrinkage of the partial effects. Combining gene expression data with prior pathway information from the KEGG databases, we identified several gene–pathway combinations that are significantly associated with clinical outcomes of multiple myeloma. Biological discoveries support this relationship for the pathways and the corresponding genes we identified. PMID:25520554

  20. Simultaneous Bayesian Estimation of Alignment and Phylogeny under a Joint Model of Protein Sequence and Structure

    PubMed Central

    Herman, Joseph L.; Challis, Christopher J.; Novák, Ádám; Hein, Jotun; Schmidler, Scott C.

    2014-01-01

    For sequences that are highly divergent, there is often insufficient information to infer accurate alignments, and phylogenetic uncertainty may be high. One way to address this issue is to make use of protein structural information, since structures generally diverge more slowly than sequences. In this work, we extend a recently developed stochastic model of pairwise structural evolution to multiple structures on a tree, analytically integrating over ancestral structures to permit efficient likelihood computations under the resulting joint sequence–structure model. We observe that the inclusion of structural information significantly reduces alignment and topology uncertainty, and reduces the number of topology and alignment errors in cases where the true trees and alignments are known. In some cases, the inclusion of structure results in changes to the consensus topology, indicating that structure may contain additional information beyond that which can be obtained from sequences. We use the model to investigate the order of divergence of cytoglobins, myoglobins, and hemoglobins and observe a stabilization of phylogenetic inference: although a sequence-based inference assigns significant posterior probability to several different topologies, the structural model strongly favors one of these over the others and is more robust to the choice of data set. PMID:24899668

  1. Bayesian Analysis of Multiple Populations in Galactic Globular Clusters

    NASA Astrophysics Data System (ADS)

    Wagner-Kaiser, Rachel A.; Sarajedini, Ata; von Hippel, Ted; Stenning, David; Piotto, Giampaolo; Milone, Antonino; van Dyk, David A.; Robinson, Elliot; Stein, Nathan

    2016-01-01

    We use GO 13297 Cycle 21 Hubble Space Telescope (HST) observations and archival GO 10775 Cycle 14 HST ACS Treasury observations of Galactic Globular Clusters to find and characterize multiple stellar populations. Determining how globular clusters are able to create and retain enriched material to produce several generations of stars is key to understanding how these objects formed and how they have affected the structural, kinematic, and chemical evolution of the Milky Way. We employ a sophisticated Bayesian technique with an adaptive MCMC algorithm to simultaneously fit the age, distance, absorption, and metallicity for each cluster. At the same time, we also fit unique helium values to two distinct populations of the cluster and determine the relative proportions of those populations. Our unique numerical approach allows objective and precise analysis of these complicated clusters, providing posterior distribution functions for each parameter of interest. We use these results to gain a better understanding of multiple populations in these clusters and their role in the history of the Milky Way. Support for this work was provided by NASA through grant numbers HST-GO-10775 and HST-GO-13297 from the Space Telescope Science Institute, which is operated by AURA, Inc., under NASA contract NAS5-26555. This material is based upon work supported by the National Aeronautics and Space Administration under Grant NNX11AF34G issued through the Office of Space Science. This project was supported by the National Aeronautics & Space Administration through the University of Central Florida's NASA Florida Space Grant Consortium.

  2. Semiparametric Thurstonian Models for Recurrent Choices: A Bayesian Analysis

    ERIC Educational Resources Information Center

    Ansari, Asim; Iyengar, Raghuram

    2006-01-01

    We develop semiparametric Bayesian Thurstonian models for analyzing repeated choice decisions involving multinomial, multivariate binary or multivariate ordinal data. Our modeling framework has multiple components that together yield considerable flexibility in modeling preference utilities, cross-sectional heterogeneity and parameter-driven…

  3. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    PubMed

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

    This study searches for economies of size and scope in the Portuguese water sector by applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). It demonstrates the usefulness and advantages of Bayesian inference in SFA over traditional SFA, which relies on classical statistics alone. The Bayesian methods overcome some problems that arise in traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we find important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to substantial advantages in undertaking mergers that join the retail and wholesale components and the drinking water and wastewater services. PMID:26674686

  4. Bayesian network representing system dynamics in risk analysis of nuclear systems

    NASA Astrophysics Data System (ADS)

    Varuttamaseni, Athi

    2011-12-01

    A dynamic Bayesian network (DBN) model is used in conjunction with the alternating conditional expectation (ACE) regression method to analyze the risk associated with the loss of feedwater accident coupled with a subsequent initiation of the feed and bleed operation in the Zion-1 nuclear power plant. The use of the DBN allows the joint probability distribution to be factorized, enabling the analysis to be done on many simpler network structures rather than on one complicated structure. The construction of the DBN model assumes conditional independence relations among certain key reactor parameters. The choice of parameters to model is based on considerations of the macroscopic balance statements governing the behavior of the reactor under a quasi-static assumption. The DBN is used to relate the peak clad temperature to a set of independent variables that are known to be important in determining the success of the feed and bleed operation. A simple linear relationship is then used to relate the clad temperature to the core damage probability. To obtain a quantitative relationship among different nodes in the DBN, surrogates of the RELAP5 reactor transient analysis code are used. These surrogates are generated by applying the ACE algorithm to output data obtained from about 50 RELAP5 cases covering a wide range of the selected independent variables. These surrogates allow important safety parameters such as the fuel clad temperature to be expressed as functions of key reactor parameters such as the coolant temperature and pressure, together with important independent variables such as the scram delay time. The time-dependent core damage probability is calculated by sampling the independent variables from their probability distributions and propagating the information up through the Bayesian network to give the clad temperature. With the knowledge of the clad temperature and the assumption that the core damage probability has a one-to-one relationship to it, we have
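
    As a rough illustration of the propagation step described above, the sketch below samples hypothetical independent variables, pushes them through a stand-in surrogate (playing the role of the ACE regressions fitted to the RELAP5 runs), and thresholds the resulting clad temperature. All distributions, coefficients and limits are invented for illustration, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical surrogate for the thermal-hydraulics code: peak clad
    # temperature (K) as a function of scram delay (s) and coolant pressure
    # (MPa). In the paper this role is played by ACE regressions fitted to
    # about 50 RELAP5 runs; the coefficients here are invented.
    def clad_temp_surrogate(scram_delay, pressure):
        return 600.0 + 120.0 * scram_delay - 40.0 * (pressure - 7.0)

    # Step 1: sample the independent variables from assumed distributions.
    n = 100_000
    scram_delay = rng.lognormal(mean=1.0, sigma=0.4, size=n)
    pressure = rng.normal(loc=7.0, scale=0.5, size=n)

    # Step 2: propagate the samples through the surrogate.
    temps = clad_temp_surrogate(scram_delay, pressure)

    # Step 3: map temperature to core damage with a simple threshold rule
    # (1478 K is an illustrative clad temperature limit).
    print(f"P(clad temperature > limit) = {np.mean(temps > 1478.0):.2e}")
    ```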

  5. Bayesian analysis of heavy-tailed and long-range dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, Timothy; Watkins, Nick; Gramacy, Robert; Franzke, Christian

    2014-05-01

    We have used MCMC algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average ARFIMA(p,d,q) processes, which are capable of modelling long range dependence (e.g. Beran et al, 2013). Our principal aim is to obtain inference about the long memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series. We have extended the ARFIMA model by weakening the Gaussianity assumption, assuming an alpha-stable, heavy tailed, distribution for the innovations, and performing joint inference on d and alpha. We will present a study of the dependence of the posterior variance of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other popular measures of d.
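
    A minimal sketch of Bayesian inference for the long memory parameter d, using a grid posterior under the Whittle likelihood of an ARFIMA(0,d,0) process rather than the authors' reversible-jump MCMC; all settings are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic ARFIMA(0, d, 0) series via its truncated MA(inf) representation,
    # psi_k = psi_{k-1} * (k - 1 + d) / k. d_true and n are illustrative.
    n, d_true = 2048, 0.3
    psi = np.cumprod(np.r_[1.0, (np.arange(1, n) - 1 + d_true) / np.arange(1, n)])
    x = np.convolve(rng.standard_normal(n), psi)[:n]

    # Periodogram at the positive Fourier frequencies.
    lam = 2 * np.pi * np.arange(1, n // 2 + 1) / n
    I = np.abs(np.fft.fft(x)[1:n // 2 + 1]) ** 2 / (2 * np.pi * n)

    # Grid posterior for d: flat prior, Whittle likelihood with the innovation
    # variance profiled out at each grid point.
    d_grid = np.linspace(-0.45, 0.45, 181)
    loglik = np.empty_like(d_grid)
    for k, d in enumerate(d_grid):
        g = np.abs(2 * np.sin(lam / 2)) ** (-2 * d)   # spectral shape
        s2 = 2 * np.pi * np.mean(I / g)               # profiled scale
        f = s2 * g / (2 * np.pi)
        loglik[k] = -np.sum(np.log(f) + I / f)

    post = np.exp(loglik - loglik.max())
    post /= np.trapz(post, d_grid)
    print(f"posterior mean of d = {np.trapz(d_grid * post, d_grid):.3f}"
          f" (true value {d_true})")
    ```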

  6. Bayesian analysis of the neuromagnetic inverse problem with l(p)-norm priors.

    PubMed

    Auranen, Toni; Nummenmaa, Aapo; Hämäläinen, Matti S; Jääskeläinen, Iiro P; Lampinen, Jouko; Vehtari, Aki; Sams, Mikko

    2005-07-01

    Magnetoencephalography (MEG) allows millisecond-scale non-invasive measurement of magnetic fields generated by neural currents in the brain. However, localization of the underlying current sources is ambiguous due to the so-called inverse problem. The most widely used source localization methods (i.e., minimum-norm and minimum-current estimates (MNE and MCE) and equivalent current dipole (ECD) fitting) require ad hoc determination of the cortical current distribution (l(2)-, l(1)-norm priors and point-sized dipolar, respectively). In this article, we perform a Bayesian analysis of the MEG inverse problem with l(p)-norm priors for the current sources. This way, we circumvent the arbitrary choice between l(1)- and l(2)-norm prior, which is instead rendered automatically based on the data. By obtaining numerical samples from the joint posterior probability distribution of the source current parameters and model hyperparameters (such as the l(p)-norm order p) using Markov chain Monte Carlo (MCMC) methods, we calculated the spatial inverse estimates as expectation values of the source current parameters integrated over the hyperparameters. Real MEG data and simulated (known) source currents with realistic MRI-based cortical geometry and 306-channel MEG sensor array were used. While the proposed model is sensitive to source space discretization size and computationally rather heavy, it is mathematically straightforward, thus allowing incorporation of, for instance, a priori functional magnetic resonance imaging (fMRI) information. PMID:15955497
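
    The flavor of this approach can be reproduced on a toy linear inverse problem: a random-walk Metropolis sampler exploring source amplitudes jointly with the norm order p under an l(p) prior. The forward matrix, noise level and prior scale below are invented stand-ins for the MEG lead field, not the authors' model.

    ```python
    import numpy as np
    from scipy.special import gammaln

    rng = np.random.default_rng(2)

    # Toy linear inverse problem y = L s + noise, standing in for the MEG
    # forward model (L: lead field, s: source amplitudes). Sizes illustrative.
    m_sens, n_src, sigma, scale = 20, 40, 0.05, 0.3
    L = rng.standard_normal((m_sens, n_src)) / np.sqrt(m_sens)
    s_true = np.zeros(n_src); s_true[[3, 17, 31]] = [1.0, -0.8, 0.6]
    y = L @ s_true + sigma * rng.standard_normal(m_sens)

    def log_post(s, p):
        resid = y - L @ s
        # l(p)-norm prior, including its p-dependent normalizing constant.
        log_prior = (-np.sum(np.abs(s / scale) ** p)
                     - n_src * (np.log(2 * scale) + gammaln(1 + 1 / p)))
        return -0.5 * np.sum(resid ** 2) / sigma ** 2 + log_prior

    # Random-walk Metropolis jointly over the sources and the norm order p,
    # with p restricted to [1, 2] (flat prior; proposals outside are rejected).
    s, p = np.zeros(n_src), 1.5
    lp, keep_p = log_post(s, p), []
    for it in range(20000):
        s_new = s + 0.05 * rng.standard_normal(n_src)
        p_new = p + 0.05 * rng.standard_normal()
        if 1.0 <= p_new <= 2.0:
            lp_new = log_post(s_new, p_new)
            if np.log(rng.random()) < lp_new - lp:
                s, p, lp = s_new, p_new, lp_new
        if it >= 5000:
            keep_p.append(p)
    print(f"posterior mean of p = {np.mean(keep_p):.2f}")
    ```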

  7. Bayesian Geostatistical Analysis and Prediction of Rhodesian Human African Trypanosomiasis

    PubMed Central

    Wardrop, Nicola A.; Atkinson, Peter M.; Gething, Peter W.; Fèvre, Eric M.; Picozzi, Kim; Kakembo, Abbas S. L.; Welburn, Susan C.

    2010-01-01

    Background The persistent spread of Rhodesian human African trypanosomiasis (HAT) in Uganda in recent years has increased concerns of a potential overlap with the Gambian form of the disease. Recent research has aimed to increase the evidence base for targeting control measures by focusing on the environmental and climatic factors that control the spatial distribution of the disease. Objectives One recent study used simple logistic regression methods to explore the relationship between prevalence of Rhodesian HAT and several social, environmental and climatic variables in two of the most recently affected districts of Uganda, and suggested the disease had spread into the study area due to the movement of infected, untreated livestock. Here we extend this study to account for spatial autocorrelation, incorporate uncertainty in input data and model parameters and undertake predictive mapping for risk of high HAT prevalence in future. Materials and Methods We use a spatial analysis in which a generalised linear geostatistical model is embedded in a Bayesian framework to account explicitly for spatial autocorrelation and to incorporate uncertainty in input data and model parameters. This more rigorous analytical approach potentially yields more accurate parameter and significance estimates and increased predictive accuracy, allowing an assessment of the validity of the livestock movement hypothesis given more robust parameter estimation and appropriate assessment of covariate effects. Results The analysis strongly supports the theory that Rhodesian HAT was imported to the study area via the movement of untreated, infected livestock from endemic areas. The confounding effect of health care accessibility on the spatial distribution of Rhodesian HAT and the linkages between the disease's distribution and minimum land surface temperature have also been confirmed via the application of these methods. Conclusions Predictive mapping indicates an

  8. A Bayesian Solution for Two-Way Analysis of Variance. ACT Technical Bulletin No. 8.

    ERIC Educational Resources Information Center

    Lindley, Dennis V.

    The standard statistical analysis of data classified in two ways (say into rows and columns) is through an analysis of variance that splits the total variation of the data into the main effect of rows, the main effect of columns, and the interaction between rows and columns. This paper presents an alternative Bayesian analysis of the same…

  9. Bayesian Analysis of Two Stellar Populations in Galactic Globular Clusters III: Analysis of 30 Clusters

    NASA Astrophysics Data System (ADS)

    Wagner-Kaiser, R.; Stenning, D. C.; Sarajedini, A.; von Hippel, T.; van Dyk, D. A.; Robinson, E.; Stein, N.; Jefferys, W. H.

    2016-09-01

    We use Cycle 21 Hubble Space Telescope (HST) observations and HST archival ACS Treasury observations of 30 Galactic Globular Clusters to characterize two distinct stellar populations. A sophisticated Bayesian technique is employed to simultaneously sample the joint posterior distribution of age, distance, and extinction for each cluster, as well as unique helium values for two populations within each cluster and the relative proportion of those populations. We find that the helium differences between the two populations in the clusters fall in the range of ˜0.04 to 0.11. Because adequate models varying in CNO are not presently available, we view these spreads as upper limits and present them with statistical rather than observational uncertainties. Our evidence supports previous studies suggesting an increase in helium content with increasing cluster mass, and we also find that the proportion of the first population of stars increases with mass. Our results are examined in the context of proposed globular cluster formation scenarios. Additionally, we leverage our Bayesian technique to shed light on inconsistencies between the theoretical models and the observed data.

  10. Joint Bayesian and N-body Analyses of the 55 Cancri and GJ 876 Planetary Systems

    NASA Astrophysics Data System (ADS)

    Nelson, Benjamin E.; Ford, Eric B; Wright, Jason; Fischer, Debra

    2014-05-01

    We present the latest dynamical models for the 55 Cancri and GJ 876 systems based on 1,418 and 367 radial velocity (RV) observations, respectively. We apply our Radial velocity Using N-body Differential evolution Markov chain Monte Carlo code (RUN DMC; B. Nelson et al. 2014) to these two landmark systems and perform long-term (10^8 year) dynamical integrations using the Mercury symplectic integrator. For 55 Cancri, we find the transiting planet "e" cannot be misaligned with the outer four planets by more than 60 degrees and has a relativistic precession timescale on the order of the secular interactions. Based on a statistical analysis, we conclude planets "b" and "c" are apsidally aligned about 180 degrees but not in a mean-motion resonance. For GJ 876, we derive a set of 3-dimensional (non-coplanar) dynamical models based solely on RVs.

  11. Analysis of mechanical joint in composite cylinder

    NASA Astrophysics Data System (ADS)

    Hong, C. S.; Kim, Y. W.; Park, J. S.

    Joining techniques for composite materials are of great interest for cylindrical structures, as composites are widely used in weight-sensitive structures. Little information on the mechanical fastening joints of laminated shell structures is available in the literature. In this study, a finite element program based on the first-order shear deformation theory was developed for the analysis of mechanical joints in laminated composite structures. The failure of the mechanical fastening joint for a laminated graphite/epoxy cylinder subject to internal pressure was analyzed using the developed program. Modeling of the bolt head in the composite cylinder was studied, and the effect of steel reinforcement outside the composite cylinder on the failure was investigated. The stress component near the bolt head was influenced by the size of the bolt head. The failure load and the failure mode were dependent on the bolt diameter, the number of bolts, and the fiber orientation. The failure load was constant when the edge distance exceeded three times the bolt diameter.

  12. Objective Bayesian fMRI analysis-a pilot study in different clinical environments.

    PubMed

    Magerkurth, Joerg; Mancini, Laura; Penny, William; Flandin, Guillaume; Ashburner, John; Micallef, Caroline; De Vita, Enrico; Daga, Pankaj; White, Mark J; Buckley, Craig; Yamamoto, Adam K; Ourselin, Sebastien; Yousry, Tarek; Thornton, John S; Weiskopf, Nikolaus

    2015-01-01

    Functional MRI (fMRI) used for neurosurgical planning delineates functionally eloquent brain areas by time-series analysis of task-induced BOLD signal changes. Commonly used frequentist statistics protect against false positive results based on a p-value threshold. In surgical planning, false negative results are equally if not more harmful, potentially masking true brain activity leading to erroneous resection of eloquent regions. Bayesian statistics provides an alternative framework, categorizing areas as activated, deactivated, non-activated or with low statistical confidence. This approach has not yet found wide clinical application partly due to the lack of a method to objectively define an effect size threshold. We implemented a Bayesian analysis framework for neurosurgical planning fMRI. It entails an automated effect-size threshold selection method for posterior probability maps accounting for inter-individual BOLD response differences, which was calibrated based on the frequentist results maps thresholded by two clinical experts. We compared Bayesian and frequentist analysis of passive-motor fMRI data from 10 healthy volunteers measured on a pre-operative 3T and an intra-operative 1.5T MRI scanner. As a clinical case study, we tested passive motor task activation in a brain tumor patient at 3T under clinical conditions. With our novel effect size threshold method, the Bayesian analysis revealed regions of all four categories in the 3T data. Activated region foci and extent were consistent with the frequentist analysis results. In the lower signal-to-noise ratio 1.5T intra-operative scanner data, Bayesian analysis provided improved brain-activation detection sensitivity compared with the frequentist analysis, although the spatial extents of the activations were smaller than at 3T. Bayesian analysis of fMRI data using operator-independent effect size threshold selection may improve the sensitivity and certainty of information available to guide neurosurgery.

  13. Hyper-efficient model-independent Bayesian method for the analysis of pulsar timing data

    NASA Astrophysics Data System (ADS)

    Lentati, Lindley; Alexander, P.; Hobson, M. P.; Taylor, S.; Gair, J.; Balan, S. T.; van Haasteren, R.

    2013-05-01

    A new model-independent method is presented for the analysis of pulsar timing data and the estimation of the spectral properties of an isotropic gravitational wave background (GWB). Taking a Bayesian approach, we show that by rephrasing the likelihood we are able to eliminate the most costly aspects of computation normally associated with this type of data analysis. When applied to the International Pulsar Timing Array Mock Data Challenge data sets this results in speedups of approximately 2-3 orders of magnitude compared to established methods, in the most extreme cases reducing the run time from several hours on the high performance computer “DARWIN” to less than a minute on a normal workstation. Because of the versatility of this approach, we present three applications of the new likelihood. In the low signal-to-noise regime we sample directly from the power spectrum coefficients of the GWB signal realization. In the high signal-to-noise regime, where the data can support a large number of coefficients, we sample from the joint probability density of the power spectrum coefficients for the individual pulsars and the GWB signal realization using a “guided Hamiltonian sampler” to sample efficiently from this high-dimensional (˜1000) space. Critically, in both these cases we make no assumptions about the form of the power spectrum of the GWB, or the individual pulsars. Finally, we show that, if desired, a power-law model can still be fitted during sampling. We then apply this method to a more complex data set designed to represent better a future International Pulsar Timing Array or European Pulsar Timing Array data release. We show that even in challenging cases where the data feature large jumps of the order of 5 years, with observations spanning between 4 and 18 years for different pulsars and including steep red noise processes, we are able to parametrize the underlying GWB signal correctly. Finally we present a method for characterizing the spatial

  14. A Gibbs sampler for Bayesian analysis of site-occupancy data

    USGS Publications Warehouse

    Dorazio, Robert M.; Rodriguez, Daniel Taylor

    2012-01-01

    1. A Bayesian analysis of site-occupancy data containing covariates of species occurrence and species detection probabilities is usually completed using Markov chain Monte Carlo methods in conjunction with software programs that can implement those methods for any statistical model, not just site-occupancy models. Although these software programs are quite flexible, considerable experience is often required to specify a model and to initialize the Markov chain so that summaries of the posterior distribution can be estimated efficiently and accurately. 2. As an alternative to these programs, we develop a Gibbs sampler for Bayesian analysis of site-occupancy data that include covariates of species occurrence and species detection probabilities. This Gibbs sampler is based on a class of site-occupancy models in which probabilities of species occurrence and detection are specified as probit-regression functions of site- and survey-specific covariate measurements. 3. To illustrate the Gibbs sampler, we analyse site-occupancy data of the blue hawker, Aeshna cyanea (Odonata, Aeshnidae), a common dragonfly species in Switzerland. Our analysis includes a comparison of results based on Bayesian and classical (non-Bayesian) methods of inference. We also provide code (based on the R software program) for conducting Bayesian and classical analyses of site-occupancy data.
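
    A minimal sketch of such a Gibbs sampler, simplified to constant occupancy and detection probabilities with conjugate Beta updates; the paper's probit-regression version adds an Albert-Chib data-augmentation step for the covariate coefficients. All numbers are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Simulated detection histories: n sites, J repeat surveys (illustrative).
    n, J, psi_true, p_true = 200, 4, 0.6, 0.4
    z_true = rng.random(n) < psi_true
    y = rng.binomial(J, p_true * z_true)      # detections per site

    draws, psi, p = [], 0.5, 0.5
    for it in range(6000):
        # 1. Latent occupancy state for sites with no detections.
        num = psi * (1 - p) ** J
        z = np.where(y > 0, 1.0, rng.random(n) < num / (num + 1 - psi))
        # 2. Conjugate Beta updates under flat Beta(1, 1) priors.
        psi = rng.beta(1 + z.sum(), 1 + n - z.sum())
        occ = z == 1
        p = rng.beta(1 + y[occ].sum(), 1 + J * occ.sum() - y[occ].sum())
        if it >= 1000:
            draws.append((psi, p))

    psi_s, p_s = np.array(draws).T
    print(f"psi = {psi_s.mean():.2f}, p = {p_s.mean():.2f} (true 0.6, 0.4)")
    ```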

  15. Bayesian Factor Analysis When Only a Sample Covariance Matrix Is Available

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Arav, Marina

    2006-01-01

    In traditional factor analysis, the variance-covariance matrix or the correlation matrix has often been a form of inputting data. In contrast, in Bayesian factor analysis, the entire data set is typically required to compute the posterior estimates, such as Bayes factor loadings and Bayes unique variances. We propose a simple method for computing…

  16. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis: A Cross-National Investigation of Schwartz Values

    ERIC Educational Resources Information Center

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the model parameters and demonstrates the consequences…

  17. Cyclist activity and injury risk analysis at signalized intersections: a Bayesian modelling approach.

    PubMed

    Strauss, Jillian; Miranda-Moreno, Luis F; Morency, Patrick

    2013-10-01

    This study proposes a two-equation Bayesian modelling approach to simultaneously study cyclist injury occurrence and bicycle activity at signalized intersections as joint outcomes. This approach deals with the potential presence of endogeneity and unobserved heterogeneities and is used to identify factors associated with both cyclist injuries and volumes. Its application to identify high-risk corridors is also illustrated. Montreal, Quebec, Canada is the application environment, using an extensive inventory of a large sample of signalized intersections containing disaggregate motor-vehicle traffic volumes and bicycle flows, geometric design, traffic control and built environment characteristics in the vicinity of the intersections. Cyclist injury data for the period 2003-2008 are used in this study. Also, manual bicycle counts were standardized using temporal and weather adjustment factors to obtain average annual daily volumes. Results confirm and quantify the effects of both bicycle and motor-vehicle flows on cyclist injury occurrence. Accordingly, more cyclists at an intersection translate into more cyclist injuries but lower injury rates due to the non-linear association between bicycle volume and injury occurrence. Furthermore, the results emphasize the importance of turning motor-vehicle movements. The presence of bus stops and total crosswalk length increase cyclist injury occurrence whereas the presence of a raised median has the opposite effect. Bicycle activity through intersections was found to increase as employment, number of metro stations, land use mix, area of commercial land use type, length of bicycle facilities and the presence of schools within 50-800 m of the intersection increase. Intersections with three approaches are expected to have fewer cyclists than those with four. Using Bayesian analysis, expected injury frequency and injury rates were estimated for each intersection and used to rank corridors. Corridors with high bicycle volumes

  18. Wear analysis of revolute joints with clearance in multibody systems

    NASA Astrophysics Data System (ADS)

    Bai, ZhengFeng; Zhao, Yang; Wang, XingGui

    2013-08-01

    In this work, the prediction of wear for revolute joints with clearance in multibody systems is investigated using a computational methodology. The contact model in the clearance joint is established using a new hybrid nonlinear contact force model, and the friction effect is considered using a modified Coulomb friction model. The dynamics model of the multibody system with clearance is established using a dynamic segmentation modeling method, and the computational process for wear analysis of clearance joints in multibody systems is presented. The main computational process includes two steps: dynamics analysis and wear analysis. The dynamics simulation of the multibody system with a revolute clearance joint is carried out, and the contact forces are extracted and used to calculate the wear amount of the revolute clearance joint based on Archard's wear model. Finally, a four-bar mechanical system with a revolute clearance joint is used as a numerical example to perform the simulation and show the dynamics responses and wear characteristics of multibody systems with revolute clearance joints. The main results indicate that the contact between the joint elements is wider and more frequent in some specific regions, and that wear is not uniform around the joint surface, causing the clearance size to increase non-uniformly as the joint wears. This work presents an effective method to predict wear of revolute joints with clearance in multibody systems.
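
    A toy sketch of the wear-accumulation step based on an Archard-type law, with the contact history replaced by synthetic dwell regions; the wear coefficient, hardness and loads are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Archard-type wear update: incremental depth dh = k * p * ds / H,
    # accumulated in 1-degree sectors of the bearing surface. Values assumed.
    k_wear, hardness = 5e-9, 2.1e9       # wear coefficient (-), hardness (Pa)
    wear_depth = np.zeros(360)           # accumulated depth per sector (m)

    def wear_step(angle_rad, pressure, slip):
        """Deposit one step of wear into the sector under the contact."""
        i = int(np.degrees(angle_rad) % 360)
        wear_depth[i] += k_wear * pressure * slip / hardness

    # Toy stand-in for the dynamics output: the contact dwells near two
    # regions of the journal, mimicking the non-uniform contact reported
    # for the four-bar example.
    for t in range(100_000):
        angle = rng.choice([0.5, 3.6]) + 0.2 * rng.standard_normal()
        wear_step(angle, pressure=40e6, slip=1e-6)

    print(f"max / min sector wear: {wear_depth.max():.2e} / "
          f"{wear_depth.min():.2e} m")
    ```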

  19. OBJECTIVE BAYESIAN ANALYSIS OF "ON/OFF" MEASUREMENTS

    SciTech Connect

    Casadei, Diego

    2015-01-01

    In high-energy astrophysics, it is common practice to account for the background overlaid with counts from the source of interest with the help of auxiliary measurements carried out by pointing off-source. In this "on/off" measurement, one knows the number of photons detected while pointing toward the source, the number of photons collected while pointing away from the source, and how to estimate the background counts in the source region from the flux observed in the auxiliary measurements. For very faint sources, the number of photons detected is so low that the approximations that hold asymptotically are not valid. On the other hand, an analytical solution exists for the Bayesian statistical inference, which is valid at low and high counts. Here we illustrate the objective Bayesian solution based on the reference posterior, compare the result with the approach recently proposed by Knoetig, and discuss its most delicate points. In addition, we propose to compute the significance of the excess with respect to the background-only expectation with a method that is able to account for any uncertainty on the background and is valid for any photon count. This method is compared to the widely used significance formula by Li and Ma, which is based on asymptotic properties.
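
    For reference, the Li and Ma significance (their Eq. 17) is straightforward to compute; the sketch below implements it, whereas the reference-posterior solution discussed in the paper requires more machinery. The example counts are invented.

    ```python
    import numpy as np

    def li_ma_significance(n_on, n_off, alpha):
        """Eq. (17) of Li & Ma (1983); alpha is the on/off exposure ratio."""
        total = n_on + n_off
        term_on = n_on * np.log((1 + alpha) / alpha * n_on / total)
        term_off = n_off * np.log((1 + alpha) * n_off / total)
        return np.sqrt(2.0 * (term_on + term_off))

    # Example: 29 on-source counts, 60 off-source counts with 3x the exposure.
    print(f"S = {li_ma_significance(29, 60, 1/3):.2f} sigma")
    ```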

  20. Exclusive breastfeeding practice in Nigeria: a bayesian stepwise regression analysis.

    PubMed

    Gayawan, Ezra; Adebayo, Samson B; Chitekwe, Stanley

    2014-11-01

    Despite the importance of breast milk, the prevalence of exclusive breastfeeding (EBF) in Nigeria is far lower than what has been recommended for developing countries. Worse still, the practice has recently been on a downward trend in the country. This study investigated the determinants and geographical variations of EBF in Nigeria. Any intervention programme requires a good knowledge of the factors that enhance the practice. Pooled data from the Nigeria Demographic and Health Surveys conducted in 1999, 2003, and 2008 were analyzed using a Bayesian stepwise approach that involves simultaneous selection of variables and smoothing parameters. Further, the approach allows geographical variations to be investigated at the highly disaggregated level of states. Within a Bayesian context, appropriate priors are assigned to all parameters and functions. Findings reveal that education of women and their partners, place of delivery, mother's age at birth, and current age of child are associated with increasing prevalence of EBF. However, antenatal care visits during pregnancy are not associated with EBF in Nigeria. Further, results reveal considerable geographical variations in the practice of EBF. The likelihood of exclusively breastfeeding children is significantly higher in Kwara, Kogi, Osun, and Oyo states but lower in Jigawa, Katsina, and Yobe. Intensive interventions that can lead to improved practice are required in all states in Nigeria. The importance of breastfeeding needs to be emphasized to women during antenatal visits, as this can encourage and enhance the practice after delivery. PMID:24619227

  1. Results and Analysis from Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Matty, Jennifer

    2010-01-01

    This joint mobility KC lecture included information from two papers, "A Method for and Issues Associated with the Determination of Space Suit Joint Requirements" and "Results and Analysis from Space Suit Joint Torque Testing," as presented for the International Conference on Environmental Systems in 2009 and 2010, respectively. The first paper discusses historical joint torque testing methodologies and approaches that were tested in 2008 and 2009. The second paper discusses the testing that was completed in 2009 and 2010.

  2. Calibration of crash risk models on freeways with limited real-time traffic data using Bayesian meta-analysis and Bayesian inference approach.

    PubMed

    Xu, Chengcheng; Wang, Wei; Liu, Pan; Li, Zhibin

    2015-12-01

    This study aimed to develop a real-time crash risk model with limited data in China by using a Bayesian meta-analysis and Bayesian inference approach. A systematic review was first conducted using three different Bayesian meta-analyses: the fixed effect meta-analysis, the random effect meta-analysis, and the meta-regression. The meta-analyses provided a numerical summary of the effects of traffic variables on crash risk by quantitatively synthesizing results from previous studies. The random effect meta-analysis and the meta-regression produced more conservative estimates of the effects of traffic variables than the fixed effect meta-analysis. The meta-analysis results were then used as informative priors for developing crash risk models with limited data. The choice among the three meta-analyses significantly affected model fit and prediction accuracy: the model based on the meta-regression increased prediction accuracy by about 15% compared with the model developed directly from the limited data. Finally, Bayesian predictive density analysis was used to identify outliers in the limited data, further improving prediction accuracy by 5.0%. PMID:26468977
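
    The core mechanism here, using a meta-analysis summary as an informative prior for a model fitted to limited data, can be illustrated with a conjugate normal-normal update; all numbers below are invented.

    ```python
    import numpy as np

    # Conjugate normal-normal update: the pooled effect of a traffic variable
    # from the meta-analysis (prior) combined with a coefficient estimate from
    # the small local sample (likelihood). All numbers are invented.
    mu0, se0 = 0.32, 0.10     # prior mean and standard error (meta-analysis)
    bhat, se = 0.55, 0.25     # estimate and standard error from limited data

    w0, w = 1 / se0**2, 1 / se**2
    post_mean = (w0 * mu0 + w * bhat) / (w0 + w)
    post_se = (w0 + w) ** -0.5
    print(f"posterior: {post_mean:.3f} +/- {post_se:.3f}")
    # The informative prior pulls the noisy local estimate toward the pooled
    # value; this shrinkage is the mechanism behind the reported accuracy gain.
    ```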

  3. Bayesian model-averaged benchmark dose analysis via reparameterized quantal-response models.

    PubMed

    Fang, Q; Piegorsch, W W; Simmons, S J; Li, X; Chen, C; Wang, Y

    2015-12-01

    An important objective in biomedical and environmental risk assessment is estimation of minimum exposure levels that induce a pre-specified adverse response in a target population. The exposure points in such settings are typically referred to as benchmark doses (BMDs). Parametric Bayesian estimation for finding BMDs has grown in popularity, and a large variety of candidate dose-response models is available for applying these methods. Each model can possess potentially different parametric interpretation(s), however. We present reparameterized dose-response models that allow for explicit use of prior information on the target parameter of interest, the BMD. We also enhance our Bayesian estimation technique for BMD analysis by applying Bayesian model averaging to produce point estimates and (lower) credible bounds, overcoming associated questions of model adequacy when multimodel uncertainty is present. An example from carcinogenicity testing illustrates the calculations. PMID:26102570

  4. Coronal joint spaces of the Temporomandibular joint: Systematic review and meta-analysis

    PubMed Central

    Silva, Joana-Cristina; Pires, Carlos A.; Ponces-Ramalhão, Maria-João-Feio; Lopes, Jorge-Dias

    2015-01-01

    Introduction The joint space measurements of the temporomandibular joint have been used to determine variation in condyle position. The aim of this study is therefore to perform a systematic review and meta-analysis of coronal joint space measurements of the temporomandibular joint. Material and Methods An electronic database search was performed with the terms "condylar position"; "joint space" AND "TMJ". Inclusion criteria were: tomographic 3D imaging of the TMJ and presentation of at least two joint space measurements on the coronal plane. Exclusion criteria were: mandibular fractures, animal studies, surgery, presence of genetic or chronic diseases, case reports, opinion or debate articles, or unpublished material. The risk of bias of each study was judged as high, moderate or low according to the "Cochrane risk of bias tool". The values used in the meta-analysis were the medial, superior and lateral joint space measurements and their differences between the right and left joint. Results From the initial search, 2706 articles were retrieved. After excluding duplicates and all studies that did not match the eligibility criteria, four articles qualified for final review. All the retrieved articles were judged as low level of evidence. All of the reviewed studies were included in the meta-analysis, which found mean coronal joint space values of 2.94 mm (medial), 2.55 mm (superior) and 2.16 mm (lateral). Conclusions The analysis also showed high levels of heterogeneity. Right and left comparison did not show statistically significant differences. Key words: Temporomandibular joint, systematic review, meta-analysis. PMID:26330944

  5. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...

  6. Using Discrete Loss Functions and Weighted Kappa for Classification: An Illustration Based on Bayesian Network Analysis

    ERIC Educational Resources Information Center

    Zwick, Rebecca; Lenaburg, Lubella

    2009-01-01

    In certain data analyses (e.g., multiple discriminant analysis and multinomial log-linear modeling), classification decisions are made based on the estimated posterior probabilities that individuals belong to each of several distinct categories. In the Bayesian network literature, this type of classification is often accomplished by assigning…
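
    A minimal sketch of classification under a discrete loss function: with ordered categories, a weighted-kappa-style loss can make the minimum-risk decision differ from the maximum-posterior one. The posterior probabilities and loss matrix are illustrative.

    ```python
    import numpy as np

    # Minimum-expected-loss classification with a loss matrix that penalizes
    # distant misclassifications of ordered categories more heavily, in the
    # spirit of a weighted-kappa-style loss. Numbers are illustrative.
    posterior = np.array([0.36, 0.34, 0.30])   # P(category k | evidence)
    loss = np.array([[0.0, 1.0, 4.0],          # loss[true category, decision]
                     [1.0, 0.0, 1.0],
                     [4.0, 1.0, 0.0]])

    risk = posterior @ loss                    # expected loss of each decision
    print("expected loss per decision:", risk)
    print("max-probability choice:", posterior.argmax())   # category 0
    print("minimum-risk choice:", risk.argmin())           # category 1
    ```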

  7. Application of a data-mining method based on Bayesian networks to lesion-deficit analysis

    NASA Technical Reports Server (NTRS)

    Herskovits, Edward H.; Gerring, Joan P.

    2003-01-01

    Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics.

  8. Symptoms of Depression and Challenging Behaviours in People with Intellectual Disability: A Bayesian Analysis. Brief Report

    ERIC Educational Resources Information Center

    Tsiouris, John; Mann, Rachel; Patti, Paul; Sturmey, Peter

    2004-01-01

    Clinicians need to know the likelihood of a condition given a positive or negative diagnostic test. In this study a Bayesian analysis of the Clinical Behavior Checklist for Persons with Intellectual Disabilities (CBCPID) to predict depression in people with intellectual disability was conducted. The CBCPID was administered to 92 adults with…

  9. Family Background Variables as Instruments for Education in Income Regressions: A Bayesian Analysis

    ERIC Educational Resources Information Center

    Hoogerheide, Lennart; Block, Joern H.; Thurik, Roy

    2012-01-01

    The validity of family background variables instrumenting education in income regressions has been much criticized. In this paper, we use data from the 2004 German Socio-Economic Panel and Bayesian analysis to analyze to what degree violations of the strict validity assumption affect the estimation results. We show that, in case of moderate direct…

  10. Bayesian Factor Analysis as a Variable-Selection Problem: Alternative Priors and Consequences.

    PubMed

    Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric

    2016-01-01

    Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor-loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, Muthén & Asparouhov proposed a Bayesian structural equation modeling (BSEM) approach to explore the presence of cross loadings in CFA models. We show that the issue of determining factor-loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov's approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike-and-slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set is used to demonstrate our approach. PMID:27314566
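
    A minimal sketch of the spike-and-slab idea in a small linear-regression setting, computing posterior inclusion probabilities by enumerating all models in closed form; the BSEM-SSP approach instead samples loading patterns by MCMC. All settings are illustrative.

    ```python
    import numpy as np
    from itertools import product
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(5)

    # Spike-and-slab selection by brute-force enumeration: each coefficient is
    # "in" (slab: N(0, tau^2)) or "out" (spike at zero), so y given a pattern
    # gamma is marginally Gaussian and each model's evidence is closed-form.
    n, p, tau, sigma = 100, 4, 1.0, 0.5
    X = rng.standard_normal((n, p))
    beta_true = np.array([0.8, 0.0, -0.6, 0.0])
    y = X @ beta_true + sigma * rng.standard_normal(n)

    log_post = {}
    for gamma in product([0, 1], repeat=p):
        Xg = X[:, np.array(gamma, dtype=bool)]
        cov = sigma**2 * np.eye(n) + tau**2 * (Xg @ Xg.T)
        log_ev = multivariate_normal.logpdf(y, mean=np.zeros(n), cov=cov)
        log_post[gamma] = log_ev + p * np.log(0.5)   # Bernoulli(0.5) prior

    logZ = np.logaddexp.reduce(list(log_post.values()))
    incl = sum(np.array(g) * np.exp(lp - logZ) for g, lp in log_post.items())
    print("posterior inclusion probabilities:", incl.round(3))
    ```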

  11. Applying Bayesian Modeling and Receiver Operating Characteristic Methodologies for Test Utility Analysis

    ERIC Educational Resources Information Center

    Wang, Qiu; Diemer, Matthew A.; Maier, Kimberly S.

    2013-01-01

    This study integrated Bayesian hierarchical modeling and receiver operating characteristic analysis (BROCA) to evaluate how interest strength (IS) and interest differentiation (ID) predicted low–socioeconomic status (SES) youth's interest-major congruence (IMC). Using large-scale Kuder Career Search online-assessment data, this study fit three…

  12. Variational Bayesian causal connectivity analysis for fMRI

    PubMed Central

    Luessi, Martin; Babacan, S. Derin; Molina, Rafael; Booth, James R.; Katsaggelos, Aggelos K.

    2014-01-01

    The ability to accurately estimate effective connectivity among brain regions from neuroimaging data could help answering many open questions in neuroscience. We propose a method which uses causality to obtain a measure of effective connectivity from fMRI data. The method uses a vector autoregressive model for the latent variables describing neuronal activity in combination with a linear observation model based on a convolution with a hemodynamic response function. Due to the employed modeling, it is possible to efficiently estimate all latent variables of the model using a variational Bayesian inference algorithm. The computational efficiency of the method enables us to apply it to large scale problems with high sampling rates and several hundred regions of interest. We use a comprehensive empirical evaluation with synthetic and real fMRI data to evaluate the performance of our method under various conditions. PMID:24847244

  13. Bayesian Analysis of Foraging by Pigeons (Columba livia)

    PubMed Central

    Killeen, Peter R.; Palombo, Gina-Marie; Gottlob, Lawrence R.; Beam, Jon

    2008-01-01

    In this article, the authors combine models of timing and Bayesian revision of information concerning patch quality to predict foraging behavior. Pigeons earned food by pecking on 2 keys (patches) in an experimental chamber. Food was primed for only 1 of the patches on each trial. There was a constant probability of finding food in a primed patch, but it accumulated only while the animals searched there. The optimal strategy was to choose the better patch first and remain for a fixed duration, thereafter alternating evenly between the patches. Pigeons were nonoptimal in 3 ways: (a) they departed too early, (b) their departure times were variable, and (c) they were biased in their choices after initial departure. The authors review various explanations of these data. PMID:8865614

  14. Micronutrients in HIV: A Bayesian Meta-Analysis

    PubMed Central

    Carter, George M.; Indyk, Debbie; Johnson, Matthew; Andreae, Michael; Suslov, Kathryn; Busani, Sudharani; Esmaeili, Aryan; Sacks, Henry S.

    2015-01-01

    Background Approximately 28.5 million people living with HIV are eligible for treatment (CD4<500), but currently have no access to antiretroviral therapy. Reduced serum level of micronutrients is common in HIV disease. Micronutrient supplementation (MNS) may mitigate disease progression and mortality. Objectives We synthesized evidence on the effect of micronutrient supplementation on mortality and rate of disease progression in HIV disease. Methods We searched MEDLINE, EMBASE, the Cochrane Central, AMED and CINAHL databases through December 2014, without language restriction, for studies of greater than 3 micronutrients versus any or no comparator. We built a hierarchical Bayesian random effects model to synthesize results. Inferences are based on the posterior distribution of the population effects; posterior distributions were approximated by Markov chain Monte Carlo in OpenBugs. Principal Findings From 2166 initial references, we selected 49 studies for full review and identified eight reporting on disease progression and/or mortality. Bayesian synthesis of data from 2,249 adults in three studies estimated the relative risk of disease progression in subjects on MNS vs. control as 0.62 (95% credible interval, 0.37, 0.96). Median number needed to treat is 8.4 (4.8, 29.9) and the Bayes Factor 53.4. Based on data reporting on 4,095 adults reporting mortality in 7 randomized controlled studies, the RR was 0.84 (0.38, 1.85), NNT is 25 (4.3, ∞). Conclusions MNS significantly and substantially slows disease progression in HIV+ adults not on ARV, and possibly reduces mortality. Micronutrient supplements are effective in reducing progression with a posterior probability of 97.9%. Considering MNS low cost and lack of adverse effects, MNS should be standard of care for HIV+ adults not yet on ARV. PMID:25830916

  15. Bayesian approach to the analysis of neutron Brillouin scattering data on liquid metals.

    PubMed

    De Francesco, A; Guarini, E; Bafile, U; Formisano, F; Scaccia, L

    2016-08-01

    When the dynamics of liquids and disordered systems at mesoscopic level is investigated by means of inelastic scattering (e.g., neutron or x ray), spectra are often characterized by a poor definition of the excitation lines and spectroscopic features in general and one important issue is to establish how many of these lines need to be included in the modeling function and to estimate their parameters. Furthermore, when strongly damped excitations are present, commonly used and widespread fitting algorithms are particularly affected by the choice of initial values of the parameters. An inadequate choice may lead to an inefficient exploration of the parameter space, resulting in the algorithm getting stuck in a local minimum. In this paper, we present a Bayesian approach to the analysis of neutron Brillouin scattering data in which the number of excitation lines is treated as unknown and estimated along with the other model parameters. We propose a joint estimation procedure based on a reversible-jump Markov chain Monte Carlo algorithm, which efficiently explores the parameter space, producing a probabilistic measure to quantify the uncertainty on the number of excitation lines as well as reliable parameter estimates. The method proposed could turn out of great importance in extracting physical information from experimental data, especially when the detection of spectral features is complicated not only because of the properties of the sample, but also because of the limited instrumental resolution and count statistics. The approach is tested on generated data set and then applied to real experimental spectra of neutron Brillouin scattering from a liquid metal, previously analyzed in a more traditional way. PMID:27627410

  16. Bayesian Analysis of Non-Gaussian Long-Range Dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, Timothy; Watkins, Nicholas; Franzke, Christian; Gramacy, Robert

    2013-04-01

    Recent studies [e.g. the Antarctic study of Franzke, J. Climate, 2010] have strongly suggested that surface temperatures exhibit long-range dependence (LRD). The presence of LRD would hamper the identification of deterministic trends and the quantification of their significance. It is well established that LRD processes exhibit stochastic trends over rather long periods of time. Thus, accurate methods for discriminating between physical processes that possess long memory and those that do not are an important adjunct to climate modeling. As we briefly review, the LRD idea originated at the same time as H-self-similarity, so it is often not realised that a model does not have to be H-self-similar to show LRD [e.g. Watkins, GRL Frontiers, 2013]. We have used Markov Chain Monte Carlo algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average ARFIMA(p,d,q) processes, which are capable of modeling LRD. Our principal aim is to obtain inference about the long memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series. Many physical processes, for example the Faraday Antarctic time series, are significantly non-Gaussian. We have therefore extended this work by weakening the Gaussianity assumption, assuming an alpha-stable distribution for the innovations, and performing joint inference on d and alpha. Such a modified FARIMA(p,d,q) process is a flexible, initial model for non-Gaussian processes with long memory. We will present a study of the dependence of the posterior variance of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other measures of d.

  17. Bayesian approach to the analysis of neutron Brillouin scattering data on liquid metals

    NASA Astrophysics Data System (ADS)

    De Francesco, A.; Guarini, E.; Bafile, U.; Formisano, F.; Scaccia, L.

    2016-08-01

    When the dynamics of liquids and disordered systems at mesoscopic level is investigated by means of inelastic scattering (e.g., neutron or x ray), spectra are often characterized by a poor definition of the excitation lines and spectroscopic features in general and one important issue is to establish how many of these lines need to be included in the modeling function and to estimate their parameters. Furthermore, when strongly damped excitations are present, commonly used and widespread fitting algorithms are particularly affected by the choice of initial values of the parameters. An inadequate choice may lead to an inefficient exploration of the parameter space, resulting in the algorithm getting stuck in a local minimum. In this paper, we present a Bayesian approach to the analysis of neutron Brillouin scattering data in which the number of excitation lines is treated as unknown and estimated along with the other model parameters. We propose a joint estimation procedure based on a reversible-jump Markov chain Monte Carlo algorithm, which efficiently explores the parameter space, producing a probabilistic measure to quantify the uncertainty on the number of excitation lines as well as reliable parameter estimates. The method proposed could turn out of great importance in extracting physical information from experimental data, especially when the detection of spectral features is complicated not only because of the properties of the sample, but also because of the limited instrumental resolution and count statistics. The approach is tested on generated data set and then applied to real experimental spectra of neutron Brillouin scattering from a liquid metal, previously analyzed in a more traditional way.

  18. Uncertainty Analysis in Fatigue Life Prediction of Gas Turbine Blades Using Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Li, Yan-Feng; Zhu, Shun-Peng; Li, Jing; Peng, Weiwen; Huang, Hong-Zhong

    2015-12-01

    This paper investigates Bayesian model selection for fatigue life estimation of gas turbine blades considering model uncertainty and parameter uncertainty. Fatigue life estimation of gas turbine blades is a critical issue for the operation and health management of modern aircraft engines. Since many life prediction models have been proposed for the fatigue life of gas turbine blades, model uncertainty and model selection among these models have consequently become an important issue in the lifecycle management of turbine blades. In this paper, fatigue life estimation is carried out by considering model uncertainty and parameter uncertainty simultaneously. It is formulated as the joint posterior distribution of a fatigue life prediction model and its model parameters using the Bayesian inference method. The Bayes factor is incorporated to implement the model selection with the quantified model uncertainty. A Markov chain Monte Carlo method is used to facilitate the calculation. A pictorial framework and a step-by-step procedure of the Bayesian inference method for fatigue life estimation considering model uncertainty are presented. Fatigue life estimation of a gas turbine blade is implemented to demonstrate the proposed method.

  19. A Pragmatic Bayesian Perspective on Correlation Analysis - The exoplanetary gravity - stellar activity case

    NASA Astrophysics Data System (ADS)

    Figueira, P.; Faria, J. P.; Adibekyan, V. Zh.; Oshagh, M.; Santos, N. C.

    2016-05-01

    We apply the Bayesian framework to assess the presence of a correlation between two quantities. To do so, we estimate the probability distribution of the parameter of interest, ρ, characterizing the strength of the correlation. We provide an implementation of these ideas and concepts using the Python programming language and the pyMC module in a very short (˜130 lines of code, heavily commented) and user-friendly program. We used this tool to assess the presence and properties of the correlation between planetary surface gravity and stellar activity level as measured by the log(R'_HK) indicator. The results of the Bayesian analysis are qualitatively similar to those obtained via p-value analysis, and support the presence of a correlation in the data. The results are more robust in their derivation and more informative, revealing interesting features such as asymmetric posterior distributions or markedly different credible intervals, and allowing for a deeper exploration. We encourage readers interested in this kind of problem to apply our code to their own scientific problems. A full understanding of the Bayesian framework can only be gained through the insight that comes from handling priors, assessing the convergence of Monte Carlo runs, and a multitude of other practical problems. We hope to contribute so that Bayesian analysis becomes a tool in the toolkit of researchers, and that they understand by experience its advantages and limitations.
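
    A grid-based version of the same inference, the posterior for the correlation ρ of a bivariate normal under a uniform prior, is compact enough to sketch without MCMC; the synthetic data and the assumption of known unit variances are illustrative simplifications of the authors' pyMC program.

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(6)

    # Grid posterior for the correlation rho of a standardized bivariate
    # normal, uniform prior on (-1, 1). Synthetic stand-ins for the
    # (surface gravity, activity index) pairs; unit variances assumed known.
    n, rho_true = 39, -0.46
    xy = rng.multivariate_normal([0, 0], [[1, rho_true], [rho_true, 1]], size=n)

    rho_grid = np.linspace(-0.99, 0.99, 397)
    loglik = np.array([multivariate_normal.logpdf(xy, mean=[0, 0],
                                                  cov=[[1, r], [r, 1]]).sum()
                       for r in rho_grid])
    post = np.exp(loglik - loglik.max())
    post /= np.trapz(post, rho_grid)

    cdf = np.cumsum(post) * (rho_grid[1] - rho_grid[0])
    lo, hi = rho_grid[np.searchsorted(cdf, [0.025, 0.975])]
    print(f"mean rho = {np.trapz(rho_grid * post, rho_grid):.2f}, "
          f"95% credible interval [{lo:.2f}, {hi:.2f}]")
    ```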

  20. A fully Bayesian before-after analysis of permeable friction course (PFC) pavement wet weather safety.

    PubMed

    Buddhavarapu, Prasad; Smit, Andre F; Prozzi, Jorge A

    2015-07-01

    Permeable friction course (PFC), a porous hot-mix asphalt, is typically applied to improve wet weather safety on high-speed roadways in Texas. In order to warrant expensive PFC construction, a statistical evaluation of its safety benefits is essential. Generally, the literature on the effectiveness of porous mixes in reducing wet-weather crashes is limited and often inconclusive. In this study, the safety effectiveness of PFC was evaluated using a fully Bayesian before-after safety analysis. First, two groups of road segments overlaid with PFC and non-PFC material were identified across Texas; the non-PFC or reference road segments selected were similar to their PFC counterparts in terms of site specific features. Second, a negative binomial data generating process was assumed to model the underlying distribution of crash counts of PFC and reference road segments to perform Bayesian inference on the safety effectiveness. A data-augmentation based computationally efficient algorithm was employed for a fully Bayesian estimation. The statistical analysis shows that PFC is not effective in reducing wet weather crashes. It should be noted that the findings of this study are in agreement with the existing literature, although these studies were not based on a fully Bayesian statistical analysis. Our study suggests that the safety effectiveness of PFC road surfaces, or any other safety infrastructure, largely relies on its interrelationship with the road user. The results suggest that the safety infrastructure must be properly used to reap the benefits of the substantial investments. PMID:25897515

  1. APPLICATION OF PRINCIPAL COMPONENT ANALYSIS AND BAYESIAN DECOMPOSITION TO RELAXOGRAPHIC IMAGING

    SciTech Connect

    OCHS,M.F.; STOYANOVA,R.S.; BROWN,T.R.; ROONEY,W.D.; LI,X.; LEE,J.H.; SPRINGER,C.S.

    1999-05-22

    Recent developments in high field imaging have made possible the acquisition of high quality, low noise relaxographic data in reasonable imaging times. The datasets comprise a huge amount of information (>>1 million points), which makes rigorous analysis daunting. Here, the authors present results demonstrating that Principal Component Analysis (PCA) and Bayesian Decomposition (BD) provide powerful methods for relaxographic analysis of T1 recovery curves and editing of tissue type in resulting images.

  2. Competing risk models in reliability systems, a weibull distribution model with bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may be from many causes depending upon the age and the environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an apriori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another changes the standard deviation in the opposite direction. For perfect information on the prior distribution, the Bayesian estimates are better than those of the maximum likelihood. The sensitivity analyses show some amount of sensitivity over the shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range

  3. Toward an ecological analysis of Bayesian inferences: how task characteristics influence responses

    PubMed Central

    Hafenbrädl, Sebastian; Hoffrage, Ulrich

    2015-01-01

    In research on Bayesian inferences, the specific tasks, with their narratives and characteristics, are typically seen as exchangeable vehicles that merely transport the structure of the problem to research participants. In the present paper, we explore whether, and possibly how, task characteristics that are usually ignored influence participants’ responses in these tasks. We focus on quantitative dimensions of the tasks, such as their base rates, hit rates, and false-alarm rates, as well as on qualitative characteristics, such as whether the task involves a norm violation, whether the stakes are high or low, and whether the focus is on the individual case or on the numbers. Using a data set of 19 different tasks presented to 500 different participants who provided a total of 1,773 responses, we analyze these responses in two ways: first, on the level of the numerical estimates themselves, and second, on the level of the various response strategies, Bayesian and non-Bayesian, that might have produced the estimates. We identified various contingencies, and most of the task characteristics had an influence on participants’ responses. Typically, this influence was stronger when the numerical information in the tasks was presented in terms of probabilities or percentages rather than natural frequencies, and this effect cannot be fully explained by a higher proportion of Bayesian responses when natural frequencies were used. One characteristic that did not seem to influence participants’ response strategy was the numerical value of the Bayesian solution itself. Our exploratory study is a first step toward an ecological analysis of Bayesian inferences, and it highlights new avenues for future research. PMID:26300791

  4. Candidate gene association study in pediatric acute lymphoblastic leukemia evaluated by Bayesian network based Bayesian multilevel analysis of relevance

    PubMed Central

    2012-01-01

    Background We carried out a candidate gene association study in pediatric acute lymphoblastic leukemia (ALL) to identify possible genetic risk factors in a Hungarian population. Methods The results were evaluated with traditional statistical methods and with our newly developed Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA) method. We collected genomic DNA and clinical data from 543 children, who underwent chemotherapy due to ALL, and 529 healthy controls. Altogether 66 single nucleotide polymorphisms (SNPs) in 19 candidate genes were genotyped. Results With logistic regression, we identified 6 SNPs in the ARID5B and IKZF1 genes associated with increased risk of B-cell ALL, and two SNPs in the STAT3 gene that decreased the risk of hyperdiploid ALL. Because the associated SNPs were in linkage in each gene, these associations corresponded to one signal per gene. The odds ratios (ORs) associated with the tag SNPs were: OR = 1.69, P = 2.22 × 10^-7 for rs4132601 (IKZF1), OR = 1.53, P = 1.95 × 10^-5 for rs10821936 (ARID5B) and OR = 0.64, P = 2.32 × 10^-4 for rs12949918 (STAT3). With the BN-BMLA we confirmed the findings of the frequentist-based method and obtained additional information about the nature of the relations between the SNPs and the disease. For example, rs10821936 in ARID5B and rs17405722 in STAT3 showed a weak interaction, and in the T-cell lineage sample group, gender showed a weak interaction with three SNPs in three genes. In the hyperdiploid patient group the BN-BMLA detected a strong interaction among SNPs in the NOTCH1, STAT1, STAT3 and BCL2 genes. Evaluating the survival rate of the patients with ALL, the BN-BMLA showed that besides risk groups and subtypes, genetic variations in the BAX and CEBPA genes might also influence the probability of survival of the patients. Conclusions In the present study we confirmed the roles of genetic variations in ARID5B and IKZF1 in the susceptibility to B-cell ALL

  5. Doubly Bayesian Analysis of Confidence in Perceptual Decision-Making

    PubMed Central

    Bahrami, Bahador; Latham, Peter E.

    2015-01-01

    Humans stand out from other animals in that they are able to explicitly report on the reliability of their internal operations. This ability, which is known as metacognition, is typically studied by asking people to report their confidence in the correctness of some decision. However, the computations underlying confidence reports remain unclear. In this paper, we present a fully Bayesian method for directly comparing models of confidence. Using a visual two-interval forced-choice task, we tested whether confidence reports reflect heuristic computations (e.g. the magnitude of sensory data) or Bayes optimal ones (i.e. how likely a decision is to be correct given the sensory data). In a standard design in which subjects were first asked to make a decision, and only then gave their confidence, subjects were mostly Bayes optimal. In contrast, in a less-commonly used design in which subjects indicated their confidence and decision simultaneously, they were roughly equally likely to use the Bayes optimal strategy or to use a heuristic but suboptimal strategy. Our results suggest that, while people’s confidence reports can reflect Bayes optimal computations, even a small unusual twist or additional element of complexity can prevent optimality. PMID:26517475

  6. Bayesian Analysis of Cosmic Ray Propagation: Evidence against Homogeneous Diffusion

    NASA Astrophysics Data System (ADS)

    Jóhannesson, G.; Ruiz de Austri, R.; Vincent, A. C.; Moskalenko, I. V.; Orlando, E.; Porter, T. A.; Strong, A. W.; Trotta, R.; Feroz, F.; Graff, P.; Hobson, M. P.

    2016-06-01

    We present the results of the most complete scan of the parameter space for cosmic ray (CR) injection and propagation. We perform a Bayesian search of the main GALPROP parameters, using the MultiNest nested sampling algorithm, augmented by the BAMBI neural network machine-learning package. This is the first study to separate out low-mass isotopes (p, p̄, and He) from the usual light elements (Be, B, C, N, and O). We find that the propagation parameters that best fit the p, p̄, and He data are significantly different from those that fit the light elements, including the B/C and ¹⁰Be/⁹Be secondary-to-primary ratios normally used to calibrate propagation parameters. This suggests that each set of species is probing a very different interstellar medium, and that the standard approach of calibrating propagation parameters using B/C can lead to incorrect results. We present posterior distributions and best-fit parameters for the propagation of both sets of nuclei, as well as for the injection abundances of elements from H to Si. The input GALDEF files with these new parameters will be included in an upcoming public GALPROP update.

  7. fMRI data analysis with nonstationary noise models: a Bayesian approach.

    PubMed

    Luo, Huaien; Puthusserypady, Sadasivan

    2007-09-01

    The assumption of noise stationarity in functional magnetic resonance imaging (fMRI) data analysis may lead to the loss of crucial dynamic features of the data and thus result in inaccurate activation detection. In this paper, a Bayesian approach is proposed to analyze fMRI data with two nonstationary noise models (the time-varying variance noise model and the fractional noise model). The covariance matrices of the time-varying variance noise and the fractional noise after wavelet transform are diagonal matrices, a property that is investigated under the Bayesian framework. The Bayesian estimator not only gives an accurate estimate of the weights in the general linear model, but also provides the posterior probability of activation in a voxel and, hence, avoids the limitations of the classical methods (i.e., reliance on hypothesis testing alone). The performance of the proposed Bayesian methods, under the assumption of different noise models, is compared with the ordinary least squares (OLS) and weighted least squares (WLS) methods. Results from the simulation studies validate the superiority of the proposed approach over the OLS and WLS methods given the complex noise structures in fMRI data. PMID:17867354

  8. Bayesian approach for counting experiment statistics applied to a neutrino point source analysis

    NASA Astrophysics Data System (ADS)

    Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.

    2013-12-01

    In this paper we present a model-independent analysis method, following Bayesian statistics, to analyse data from a generic counting experiment, and we apply it to the search for neutrinos from point sources. We discuss a test statistic, defined within a Bayesian framework, that is used in the search for a signal. If no signal is found, we derive an upper limit without the introduction of approximations. The Bayesian approach gives us the full probability density function for both the background and the signal rate; as such, we have direct access to any signal upper limit. The upper limit derivation compares directly with a frequentist approach and is robust in the case of low-counting observations. Furthermore, it makes it possible to account for previous upper limits obtained by other analyses as prior information, without the ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and obtained a flux upper limit in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube using the same data set.
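
    The essence of the procedure is a Poisson posterior for the signal rate above a known background, from which the upper limit is read directly off the posterior CDF. A minimal sketch follows; the observed count, background expectation, and flat prior are illustrative assumptions rather than the analysis's actual inputs.

```python
# Minimal sketch: Bayesian upper limit for a counting experiment with a
# Poisson likelihood, known expected background b, and unknown signal rate s.
# The numbers are illustrative placeholders.
import numpy as np
from scipy import stats

n_obs, b = 4, 3.2                    # hypothetical observed count and background
s = np.linspace(0, 20, 4001)         # grid over the signal rate
ds = s[1] - s[0]

post = stats.poisson.pmf(n_obs, s + b)   # flat prior on s >= 0
post /= post.sum() * ds                  # normalize on the grid

cdf = np.cumsum(post) * ds               # posterior CDF
upper = s[np.searchsorted(cdf, 0.90)]    # 90% credible upper limit
print(f"90% Bayesian upper limit on the signal rate: {upper:.2f} events")
```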

  9. Progressive Damage Analysis of Bonded Composite Joints

    NASA Technical Reports Server (NTRS)

    Leone, Frank A., Jr.; Girolamo, Donato; Davila, Carlos G.

    2012-01-01

    The present work is related to the development and application of progressive damage modeling techniques to bonded joint technology. The joint designs studied in this work include a conventional composite splice joint and a NASA-patented durable redundant joint. Both designs involve honeycomb sandwich structures with carbon/epoxy facesheets joined using adhesively bonded doublers. Progressive damage modeling allows for the prediction of the initiation and evolution of damage within a structure. For structures that include multiple material systems, such as the joint designs under consideration, the number of potential failure mechanisms that must be accounted for drastically increases the complexity of the analyses. Potential failure mechanisms include fiber fracture, intraply matrix cracking, delamination, core crushing, adhesive failure, and their interactions. The bonded joints were modeled using highly parametric, explicitly solved finite element models, with damage modeling implemented via custom user-written subroutines. Each ply was discretely meshed using three-dimensional solid elements. Layers of cohesive elements were included between each ply to account for the possibility of delaminations and were used to model the adhesive layers forming the joint. Good correlation with experimental results was achieved in terms of both the load-displacement history and the predicted failure mechanism(s).

  10. Analysis and design of advanced composite bonded joints

    NASA Technical Reports Server (NTRS)

    Hart-Smith, L. J.

    1974-01-01

    Advances in the analysis of adhesive-bonded joints are presented with particular emphasis on advanced composite structures. The joints analyzed are of double-lap, single-lap, scarf, stepped-lap and tapered-lap configurations. Tensile, compressive, and in-plane shear loads are covered. In addition to the usual geometric variables, the theory accounts for the strength increases attributable to adhesive plasticity (in terms of the elastic-plastic adhesive model) and the joint strength reductions imposed by imbalances between the adherends. The solutions are largely closed-form analytical results, employing iterative solutions on a digital computer for the more complicated joint configurations. In assessing the joint efficiency, three potential failure modes are considered. These are adherend failure outside the joint, adhesive failure in shear, and adherend interlaminar tension failure (or adhesive failure in peel). Each mode is governed by a distinct mathematical analysis and each prevails throughout different ranges of geometric sizes and proportions.

  11. Bayesian extreme rainfall analysis using informative prior: A case study of Alor Setar

    NASA Astrophysics Data System (ADS)

    Eli, Annazirin; Zin, Wan Zawiah Wan; Ibrahim, Kamarulzaman; Jemain, Abdul Aziz

    2014-09-01

    Bayesian analysis is an alternative approach to statistical inference. The ability to include other information regarding the parameters of the model is one of its key capabilities. In extreme rainfall analysis, expert opinion can be used as prior information to model extreme events; considering previous or expert knowledge about the parameter of interest reduces the uncertainty of the model. In this study, the annual maximum (AM) rainfall data of the Alor Setar rain gauge station are modeled with the Generalized Extreme Value (GEV) distribution, and Bayesian Markov Chain Monte Carlo (MCMC) simulation is used for parameter estimation. Comparison of the outcomes between non-informative and informative priors is our main interest. The results show a reduction in the estimated values, which is due to the informative priors.
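
    A minimal sketch of the comparison described above: fit a GEV to simulated annual maxima by Metropolis sampling, once with vague priors and once with an informative normal prior on the location parameter. The data, prior values, and proposal scales are invented and do not represent the Alor Setar series.

```python
# Minimal sketch: Bayesian GEV fit to annual maxima with and without an
# informative prior on the location parameter mu. Note scipy's genextreme
# uses shape c = -xi. All numbers are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
am = stats.genextreme.rvs(-0.1, loc=80, scale=20, size=40, random_state=rng)

def log_post(xi, mu, sigma, informative):
    if sigma <= 0:
        return -np.inf
    ll = stats.genextreme.logpdf(am, -xi, loc=mu, scale=sigma).sum()
    if informative:
        ll += stats.norm.logpdf(mu, 80, 5)   # e.g. expert opinion on location
    return ll

def sample(informative, n=20000):
    x, lp, out = np.array([0.0, 70.0, 15.0]), -np.inf, []
    for _ in range(n):
        prop = x + rng.normal(0, [0.03, 1.0, 0.8])
        lp_p = log_post(*prop, informative)
        if np.log(rng.uniform()) < lp_p - lp:
            x, lp = prop, lp_p
        out.append(x.copy())
    return np.array(out[n // 2:])

for flag in (False, True):
    xi, mu, sigma = sample(flag).mean(axis=0)
    print(f"informative={flag}: xi={xi:.2f} mu={mu:.1f} sigma={sigma:.1f}")
```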

  12. Bayesian Statistical Analysis Applied to NAA Data for Neutron Flux Spectrum Determination

    NASA Astrophysics Data System (ADS)

    Chiesa, D.; Previtali, E.; Sisti, M.

    2014-04-01

    In this paper, we present a statistical method, based on Bayesian statistics, to evaluate the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation analysis (NAA) experiment [A. Borio di Tigliole et al., Absolute flux measurement by NAA at the Pavia University TRIGA Mark II reactor facilities, ENC 2012 - Transactions Research Reactors, ISBN 978-92-95064-14-0, 22 (2012)] performed at the TRIGA Mark II reactor of Pavia University (Italy). In order to evaluate the neutron flux spectrum, subdivided into energy groups, we must solve a system of linear equations containing the grouped cross sections and the activation rate data. We solve this problem with Bayesian statistical analysis, including the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov Chain Monte Carlo (MCMC) simulations, is used to define the statistical model of the problem and solve it. The energy group fluxes and their uncertainties are then determined with great accuracy, and the correlations between the groups are analyzed. Finally, the dependence of the results on the prior distribution choice and on the group cross section data is investigated to confirm the reliability of the analysis.
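
    When the priors and measurement errors are taken as Gaussian, the underlying linear system admits a closed-form posterior, which conveys the structure of the inference even though the paper itself solves a hierarchical model by MCMC. The sketch below uses invented placeholder matrices, not the TRIGA cross-section data.

```python
# Minimal sketch: linear-Gaussian view of unfolding group fluxes phi from
# activation rates R = A @ phi. A, R, and all covariances are invented
# placeholders; the paper solves a hierarchical model by MCMC instead.
import numpy as np

A = np.array([[0.90, 0.10, 0.01],      # grouped cross sections (isotope x group)
              [0.20, 0.70, 0.10],
              [0.05, 0.25, 0.90]])
R = np.array([1.1, 0.9, 1.3])          # measured activation rates
Sigma = np.diag([0.02, 0.02, 0.03]) ** 2   # measurement covariance
m0 = np.ones(3)                        # a priori flux guess
S0 = np.diag([0.5, 0.5, 0.5]) ** 2     # prior uncertainty

# Conjugate update: Sn = (S0^-1 + A^T Sigma^-1 A)^-1,
#                   mn = Sn (S0^-1 m0 + A^T Sigma^-1 R)
S0i, Si = np.linalg.inv(S0), np.linalg.inv(Sigma)
Sn = np.linalg.inv(S0i + A.T @ Si @ A)
mn = Sn @ (S0i @ m0 + A.T @ Si @ R)

corr = Sn / np.sqrt(np.outer(np.diag(Sn), np.diag(Sn)))
print("posterior group fluxes:", np.round(mn, 3))
print("posterior correlations:\n", np.round(corr, 2))
```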

  13. Environmental Modeling and Bayesian Analysis for Assessing Human Health Impacts from Radioactive Waste Disposal

    NASA Astrophysics Data System (ADS)

    Stockton, T.; Black, P.; Tauxe, J.; Catlett, K.

    2004-12-01

    Bayesian decision analysis provides a unified framework for coherent decision-making. Two key components of Bayesian decision analysis are probability distributions and utility functions. Calculating posterior distributions and performing decision analysis can be computationally challenging, especially for complex environmental models. In addition, probability distributions and utility functions for environmental models must be specified through expert elicitation, stakeholder consensus, or data collection, all of which have their own set of technical and political challenges. Nevertheless, a grand appeal of the Bayesian approach for environmental decision-making is the explicit treatment of uncertainty, including expert judgment. The impact of expert judgment on the environmental decision process, though integral, goes largely unassessed. Regulations and orders of the Environmental Protection Agency, Department of Energy, and Nuclear Regulatory Commission require assessing the impact on human health of radioactive waste contamination over periods of up to ten thousand years. Towards this end, complex environmental simulation models are used to assess "risk" to human and ecological health from migration of radioactive waste. As the computational burden of environmental modeling is continually reduced, probabilistic process modeling using Monte Carlo simulation is becoming routinely used to propagate uncertainty from model inputs through model predictions. The utility of a Bayesian approach to environmental decision-making is discussed within the context of a buried radioactive waste example. This example highlights the desirability and difficulties of merging the cost of monitoring, the cost of the decision analysis, the cost and viability of clean up, and the probability of human health impacts within a rigorous decision framework.

  14. Bayesian uncertainty analysis for complex physical systems modelled by computer simulators with applications to tipping points

    NASA Astrophysics Data System (ADS)

    Caiado, C. C. S.; Goldstein, M.

    2015-09-01

    In this paper we present and illustrate basic Bayesian techniques for the uncertainty analysis of complex physical systems modelled by computer simulators. We focus on emulation and history matching and also discuss the treatment of observational errors and structural discrepancies in time series. We exemplify such methods using a four-box model for the thermohaline circulation. We show how these methods may be applied to systems containing tipping points and how to treat possible discontinuities using multiple emulators.
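
    History matching rules out simulator inputs whose emulated output lies implausibly far from the observation, with emulator, observation, and structural-discrepancy variances all entering the denominator of the implausibility measure. The sketch below uses a trivial stand-in emulator and invented numbers to show the mechanics.

```python
# Minimal sketch: history matching with an implausibility measure
# I(x) = |E[f(x)] - z| / sqrt(var_em + var_obs + var_disc).
# The "emulator" is a trivial stand-in; all numbers are invented.
import numpy as np

z, var_obs, var_disc = 0.7, 0.1**2, 0.15**2  # observation, obs. error, discrepancy

def emulator(x):
    """Stand-in for a trained emulator: predictive mean and variance."""
    return 3.0 * x * (1 - x), np.full_like(x, 0.05**2)

x = np.linspace(0, 1, 1001)
mu, var_em = emulator(x)
impl = np.abs(mu - z) / np.sqrt(var_em + var_obs + var_disc)

not_ruled_out = x[impl < 3.0]                # conventional cutoff I(x) < 3
print(f"non-implausible inputs: [{not_ruled_out.min():.3f}, {not_ruled_out.max():.3f}]")
```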

  15. Joint Analysis of Multiple Metagenomic Samples

    PubMed Central

    Baran, Yael; Halperin, Eran

    2012-01-01

    The availability of metagenomic sequencing data, generated by sequencing DNA pooled from multiple microbes living jointly, has increased sharply in the last few years with developments in sequencing technology. Characterizing the contents of metagenomic samples is a challenging task, which has been extensively attempted by both supervised and unsupervised techniques, each with its own limitations. Common to practically all the methods is the processing of single samples only; when multiple samples are sequenced, each is analyzed separately and the results are combined. In this paper we propose to perform a combined analysis of a set of samples in order to obtain a better characterization of each of the samples, and we provide two applications of this principle. First, we use an unsupervised probabilistic mixture model to infer hidden components shared across metagenomic samples. We incorporate the model in a novel framework for studying the association of microbial sequence elements with phenotypes, analogous to the genome-wide association studies performed on human genomes: We demonstrate that stratification may result in false discoveries of such associations, and that the components inferred by the model can be used to correct for this stratification. Second, we propose a novel read clustering (also termed “binning”) algorithm which operates on multiple samples simultaneously, leveraging the assumption that the different samples contain the same microbial species, possibly in different proportions. We show that integrating information across multiple samples yields more precise binning on each of the samples. Moreover, for both applications we demonstrate that given a fixed depth of coverage, the average per-sample performance generally increases with the number of sequenced samples as long as the per-sample coverage is high enough. PMID:22359490

  16. Fully Bayesian hierarchical modelling in two stages, with application to meta-analysis

    PubMed Central

    Lunn, David; Barrett, Jessica; Sweeting, Michael; Thompson, Simon

    2013-01-01

    Meta-analysis is often undertaken in two stages, with each study analysed separately in stage 1 and estimates combined across studies in stage 2. The study-specific estimates are assumed to arise from normal distributions with known variances equal to their corresponding estimates. In contrast, a one-stage analysis estimates all parameters simultaneously. A Bayesian one-stage approach offers additional advantages, such as the acknowledgement of uncertainty in all parameters and greater flexibility. However, there are situations when a two-stage strategy is compelling, e.g. when study-specific analyses are complex and/or time consuming. We present a novel method for fitting the full Bayesian model in two stages, hence benefiting from its advantages while retaining the convenience and flexibility of a two-stage approach. Using Markov chain Monte Carlo methods, posteriors for the parameters of interest are derived separately for each study. These are then used as proposal distributions in a computationally efficient second stage. We illustrate these ideas on a small binomial data set; we also analyse motivating data on the growth and rupture of abdominal aortic aneurysms. The two-stage Bayesian approach closely reproduces a one-stage analysis when it can be undertaken, but can also be easily carried out when a one-stage approach is difficult or impossible. PMID:24223435
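
    The two-stage trick described above can be conveyed in a few lines: per-study posteriors are drawn under a flat working prior in stage 1, then recycled as independence proposals in a stage-2 sampler for the hierarchical model, where the working prior cancels from the acceptance ratio. In the sketch below the data are simulated binomial studies and the between-study standard deviation is held fixed for brevity, which the full method would also estimate.

```python
# Minimal sketch: two-stage Bayesian meta-analysis. Stage 1 draws per-study
# log-odds posteriors under a flat working prior; stage 2 reuses those draws
# as independence-MH proposals for a normal hierarchical model. tau is fixed
# for brevity; all data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = np.array([50, 80, 40, 120, 60])              # study sizes
theta_true = rng.normal(-1.0, 0.5, 5)            # true per-study log-odds
r = rng.binomial(n, 1 / (1 + np.exp(-theta_true)))

# Stage 1: grid-based posterior draws for each study.
grid = np.linspace(-4, 2, 2001)
stage1 = []
for ri, ni in zip(r, n):
    w = stats.binom.logpmf(ri, ni, 1 / (1 + np.exp(-grid)))
    w = np.exp(w - w.max()); w /= w.sum()
    stage1.append(rng.choice(grid, size=5000, p=w))

# Stage 2: Gibbs update for mu plus independence MH for each theta_i;
# the flat working prior cancels, leaving only the hierarchical-prior ratio.
tau = 0.5
theta = np.array([s[0] for s in stage1])
keep = []
for it in range(5000):
    mu = rng.normal(theta.mean(), tau / np.sqrt(len(theta)))
    for i, s in enumerate(stage1):
        prop = rng.choice(s)
        logr = stats.norm.logpdf(prop, mu, tau) - stats.norm.logpdf(theta[i], mu, tau)
        if np.log(rng.uniform()) < logr:
            theta[i] = prop
    keep.append(mu)

print("posterior mean of the overall log-odds mu:", round(float(np.mean(keep[1000:])), 2))
```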

  17. Copula models for frequency analysis: what can be learned from a Bayesian perspective?

    NASA Astrophysics Data System (ADS)

    Parent, Eric; Favre, Anne-Catherine; Bernier, Jacques; Perreault, Luc

    2014-01-01

    Large spring floods in the Québec region exhibit correlated peakflow, duration and volume. Consequently, traditional univariate hydrological frequency analyses must be complemented by multivariate probabilistic assessment to provide a meaningful design flood level, as requested in hydrological engineering (based on return period evaluation of a single quantity of interest). In this paper we study 47 years of peak/volume data for the Romaine River with a parametric copula model. The margins are modeled with a normal or gamma distribution and the dependence is depicted through a parametric family of copulas (Arch 12 or Arch 14). Joint parameter inference and model selection are performed under the Bayesian paradigm. This approach brings to light features of specific interest for hydrological engineering: (i) cross-correlations between margin parameters are stronger than expected, (ii) marginal distributions cannot be neglected in the model selection process, and (iii) special attention must be paid to model validation as far as extreme values are concerned.

  18. Labour and residential accessibility: a Bayesian analysis based on Poisson gravity models with spatial effects

    NASA Astrophysics Data System (ADS)

    Alonso, M. P.; Beamonte, M. A.; Gargallo, P.; Salvador, M. J.

    2014-10-01

    In this study, we measure jointly the labour and the residential accessibility of a basic spatial unit using a Bayesian Poisson gravity model with spatial effects. The accessibility measures are broken down into two components: the attractiveness component, which is related to its socio-economic and demographic characteristics, and the impedance component, which reflects the ease of communication within and between basic spatial units. For illustration purposes, the methodology is applied to a data set containing information about commuters from the Spanish region of Aragón. We identify the areas with better labour and residential accessibility, and we also analyse the attractiveness and the impedance components of a set of chosen localities which allows us to better understand their mobility patterns.

  19. Inference of posterior inclusion probability of QTLs in Bayesian shrinkage analysis.

    PubMed

    Yang, Deguang; Han, Shanshan; Jiang, Dan; Yang, Runqing; Fang, Ming

    2015-01-01

    Bayesian shrinkage analysis estimates all QTL effects simultaneously, shrinking the effects of "insignificant" QTLs close to zero so that no special model selection is needed. Bayesian shrinkage estimation usually performs excellently in multiple-QTL mapping, but it cannot give a probabilistic statement of how often a QTL is included in the model, the so-called posterior inclusion probability, which is important for assessing the importance of a QTL. In this research, two methods, FitMix and SimMix, are proposed to approximate the posterior probabilities. Under the assumption of a mixture distribution for the estimated QTL effect, FitMix and SimMix fit the mixture distribution mathematically and intuitively, respectively. The simulation results showed that both methods gave very reasonable estimates of the posterior probabilities. We also applied the two methods to map QTLs with the North American Barley Genome Mapping Project data. PMID:25857576

  20. Bayesian estimation of dynamic matching function for U-V analysis in Japan

    NASA Astrophysics Data System (ADS)

    Kyo, Koki; Noda, Hideo; Kitagawa, Genshiro

    2012-05-01

    In this paper we propose a Bayesian method for analyzing unemployment dynamics. We derive a Beveridge curve for unemployment and vacancy (U-V) analysis from a Bayesian model based on a labor market matching function. In our framework, the efficiency of matching and the elasticities of new hiring with respect to unemployment and vacancies are regarded as time-varying parameters. To construct a flexible model and obtain reasonable estimates in an underdetermined estimation problem, we treat the time-varying parameters as random variables and introduce smoothness priors. The model is then cast in a state space representation, enabling the parameter estimation to be carried out using the Kalman filter and fixed-interval smoothing. In this representation, dynamic features of the cyclical unemployment rate and the structural-frictional unemployment rate can be accurately captured.
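
    A minimal sketch of the state-space idea follows: a matching function with a time-varying (log) efficiency tracked by a Kalman filter, where the state noise variance plays the role of the smoothness prior. The simulated series and the fixed elasticities of 0.5 are assumptions for illustration, not estimates for Japan.

```python
# Minimal sketch: matching function with a time-varying log efficiency a_t,
# tracked by a local-level Kalman filter. Elasticities are fixed at 0.5 and
# all series are simulated; the smoothness prior is the state variance q.
import numpy as np

rng = np.random.default_rng(3)
T = 120
eff = 1.0 + np.cumsum(rng.normal(0, 0.02, T))        # true log efficiency
u = rng.uniform(2, 4, T)                             # unemployment
v = rng.uniform(1, 3, T)                             # vacancies
y = eff + 0.5 * np.log(u) + 0.5 * np.log(v) + rng.normal(0, 0.05, T)

q, r = 0.02**2, 0.05**2                 # state / observation variances
a, p, filtered = 1.0, 1.0, []
for t in range(T):
    p = p + q                           # predict step
    resid = y[t] - (a + 0.5 * np.log(u[t]) + 0.5 * np.log(v[t]))
    k = p / (p + r)                     # Kalman gain
    a, p = a + k * resid, (1 - k) * p   # update step
    filtered.append(a)

rmse = np.sqrt(np.mean((np.array(filtered) - eff) ** 2))
print(f"RMSE of filtered log efficiency: {rmse:.3f}")
```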

  1. Bayesian Propensity Score Analysis: Simulation and Case Study

    ERIC Educational Resources Information Center

    Kaplan, David; Chen, Cassie J. S.

    2011-01-01

    Propensity score analysis (PSA) has been used in a variety of settings, such as education, epidemiology, and sociology. Most typically, propensity score analysis has been implemented within the conventional frequentist perspective of statistics. This perspective, as is well known, does not account for uncertainty in either the parameters of the…

  2. Applications of Bayesian Procrustes shape analysis to ensemble radar reflectivity nowcast verification

    NASA Astrophysics Data System (ADS)

    Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang

    2016-07-01

    This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.

  3. Bayesian calibration of a soil organic carbon model with radiocarbon measurements of heterotrophic respiration and soil organic carbon as joint constraints

    NASA Astrophysics Data System (ADS)

    Ahrens, B.; Borken, W.; Muhr, J.; Savage, K.; Wutzler, T.; Trumbore, S.; Reichstein, M.

    2012-04-01

    Soils of temperate forests store significant amounts of organic matter and are considered to be net sinks of atmospheric CO₂. Soil organic carbon (SOC) dynamics have been studied using the Δ¹⁴C signature of bulk SOC or of different SOC fractions as observational constraints in SOC models. Further, the Δ¹⁴C signature of CO₂ evolved during the incubation of soil and roots has been widely used, together with the Δ¹⁴C of total soil respiration, to partition soil respiration into heterotrophic respiration (HR) and rhizosphere respiration. However, these data have not been used as joint observational constraints to determine SOC turnover times. Thus, we present: (1) how different combinations of observational constraints help to narrow estimates of turnover times and other parameters of a simple two-pool model, ICBM; and (2) whether a multiple-constraints approach makes it possible to determine if a forest soil has been storing or losing SOC. To this end, ICBM was adapted to model SOC and SO¹⁴C in parallel, with litterfall and the Δ¹⁴C signature of litterfall as driving variables. The Δ¹⁴C signature of the atmosphere, with its prominent bomb peak, was used as a proxy for the Δ¹⁴C signature of litterfall. Data from three spruce-dominated temperate forests in Germany and the USA (Coulissenhieb II, Solling D0 and Howland Tower site) were used to estimate the parameters of ICBM via Bayesian calibration. Key findings are: (1) the joint use of all four observational constraints helped to considerably narrow the turnover times of the young pool (primarily via Δ¹⁴C of HR) and the old pool (primarily via Δ¹⁴C of SOC). Furthermore, the joint use of all observational constraints allowed constraining the humification factor in ICBM, which describes the fraction of the annual outflux from the young pool that enters the old pool. The Bayesian parameter estimation yielded the following turnover times (median ± interquartile range) for SOC in the young pool: Coulissenhieb II 2.9 ± 2.1 years, Solling D0 8.4 ± 1

  4. Bayesian sensitivity analysis of a nonlinear finite element model

    NASA Astrophysics Data System (ADS)

    Becker, W.; Oakley, J. E.; Surace, C.; Gili, P.; Rowson, J.; Worden, K.

    2012-10-01

    A major problem in uncertainty and sensitivity analysis is that the computational cost of propagating probabilistic uncertainty through large nonlinear models can be prohibitive when using conventional methods (such as Monte Carlo methods). A powerful solution to this problem is to use an emulator, which is a mathematical representation of the model built from a small set of model runs at specified points in input space. Such emulators are massively cheaper to run and can be used to mimic the "true" model, with the result that uncertainty analysis and sensitivity analysis can be performed for a greatly reduced computational cost. The work here investigates the use of an emulator known as a Gaussian process (GP), which is an advanced probabilistic form of regression. The GP is particularly suited to uncertainty analysis since it is able to emulate a wide class of models, and accounts for its own emulation uncertainty. Additionally, uncertainty and sensitivity measures can be estimated analytically, given certain assumptions. The GP approach is explained in detail here, and a case study of a finite element model of an airship is used to demonstrate the method. It is concluded that the GP is a very attractive way of performing uncertainty and sensitivity analysis on large models, provided that the dimensionality is not too high.
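
    A minimal sketch of the emulation step follows: a Gaussian-process regression with a squared-exponential kernel fitted to a handful of runs of a toy function standing in for the finite element model, with fixed, hand-picked hyperparameters rather than estimated ones.

```python
# Minimal sketch: Gaussian-process emulator of an expensive simulator, with a
# squared-exponential kernel and fixed hyperparameters. The "simulator" is a
# cheap toy function used only to check the emulator's accuracy.
import numpy as np

def model(x):                            # toy stand-in for the expensive model
    return np.sin(3 * x) + 0.5 * x

def kern(a, b, ell=0.4, var=1.0):        # squared-exponential covariance
    return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

X = np.linspace(0, 2, 8)                 # small design of simulator runs
y = model(X)
Xs = np.linspace(0, 2, 200)              # prediction points

K = kern(X, X) + 1e-8 * np.eye(len(X))   # jitter for numerical stability
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
mean = kern(X, Xs).T @ alpha             # emulator posterior mean
v = np.linalg.solve(L, kern(X, Xs))
var = np.maximum(np.diag(kern(Xs, Xs)) - np.sum(v * v, axis=0), 0)

print("max abs emulation error:", float(np.abs(mean - model(Xs)).max().round(4)))
print("max predictive sd:", float(np.sqrt(var).max().round(4)))
```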

  5. A Bayesian Framework for Functional Mapping through Joint Modeling of Longitudinal and Time-to-Event Data

    PubMed Central

    Das, Kiranmoy; Li, Runze; Huang, Zhongwen; Gai, Junyi; Wu, Rongling

    2012-01-01

    The most powerful and comprehensive approach to study in modern biology is to understand the whole process of development and all developmentally important events that occur within it. As a consequence, joint modeling of developmental processes and events has become one of the most demanding tasks in statistical research. Here, we propose a joint modeling framework for functional mapping of specific quantitative trait loci (QTLs) that control developmental processes, the timing of development, and their causal correlation over time. The joint model contains two submodels, one for a developmental process, known as a longitudinal trait, and the other for a developmental event, known as the time to event, which are connected through a QTL mapping framework. A nonparametric approach is used to model the mean and covariance function of the longitudinal trait, while the traditional Cox proportional hazards (PH) model is used to model the event time. The joint model is applied to map QTLs that control whole-plant vegetative biomass growth and time to first flower in soybeans. Results show that this model should be broadly useful for detecting genes controlling physiological and pathological processes and other events of interest in biomedicine. PMID:22685454

  6. Analysis and experimental study of the spherical joint clearance

    NASA Astrophysics Data System (ADS)

    Zhao, Peng; Hu, Penghao; Bao, Xinxin; Li, Shuaipeng

    2013-10-01

    Spherical joint clearance is a key error factor that influences and restricts the application of parallel mechanisms in high-precision fields. This paper discusses the regularity of spherical joint clearance in parallel mechanisms and its influence on the accuracy of the parallel mechanism, from both theoretical and experimental perspectives. A spherical joint clearance measuring instrument is introduced and used to measure the joint clearance, and the relationship between the clearance and the work pose is revealed. Based on the theoretical and experimental analysis, it is concluded that the clearance of the spherical joint is nearly linearly proportional to the applied load, and that the clearances in different poses obey the Rayleigh distribution approximately under the same load.

  7. An intake prior for the Bayesian analysis of plutonium and uranium exposures in an epidemiology study.

    PubMed

    Puncher, M; Birchall, A; Bull, R K

    2014-12-01

    In Bayesian inference, the initial knowledge regarding the value of a parameter, before additional data are considered, is represented as a prior probability distribution. This paper describes the derivation of a prior distribution of intake that was used for the Bayesian analysis of plutonium and uranium worker doses in a recent epidemiology study. The chosen distribution is log-normal with a geometric standard deviation of 6 and a median value that is derived for each worker based on the duration of the work history and the number of reported acute intakes. The median value is a function of the work history and a constant related to activity in air concentration, M, which is derived separately for uranium and plutonium. The value of M is based primarily on measurements of plutonium and uranium in air derived from historical personal air sampler (PAS) data. However, there is significant uncertainty on the value of M that results from the paucity of PAS data and from extrapolating these measurements to actual intakes. This paper compares posterior and prior distributions of intake and investigates the sensitivity of the Bayesian analyses to the assumed value of M. It is found that varying M by a factor of 10 results in a much smaller factor of 2 variation in mean intake and lung dose for both plutonium and uranium. It is concluded that if a log-normal distribution is considered to adequately represent worker intakes, then the Bayesian posterior distribution of dose is relatively insensitive to the assumed value of M. PMID:24191121
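
    The parameterization is easy to mishandle: a log-normal with geometric standard deviation 6 has underlying normal scale ln 6, and its median enters scipy's lognorm via the scale argument. The sketch below builds such a prior and probes, with an invented Gaussian bioassay likelihood and an invented median rule, how the posterior mean moves when M is scaled tenfold.

```python
# Minimal sketch: a log-normal intake prior with GSD 6 and a worker-specific
# median, updated on a grid with an invented "bioassay" likelihood to probe
# sensitivity to the air-concentration constant M. All numbers are illustrative.
import numpy as np
from scipy import stats

def posterior_mean_intake(M, years=20.0, n_acute=2, meas=5.0, meas_sd=2.0):
    median = M * (years + n_acute)        # invented rule linking median to history
    prior = stats.lognorm(s=np.log(6.0), scale=median)   # GSD 6 -> sigma = ln 6
    intake = np.linspace(0.01, 200, 20000)
    post = prior.pdf(intake) * stats.norm.pdf(meas, 0.5 * intake, meas_sd)
    post /= post.sum()
    return float((intake * post).sum())

for M in (0.1, 1.0):                      # a factor-10 change in M
    print(f"M = {M}: posterior mean intake = {posterior_mean_intake(M):.2f}")
```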

  8. Highly efficient Bayesian joint inversion for receiver-based data and its application to lithospheric structure beneath the southern Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Kim, Seongryong; Dettmer, Jan; Rhie, Junkee; Tkalčić, Hrvoje

    2016-07-01

    With the deployment of extensive seismic arrays, systematic and efficient parameter and uncertainty estimation is of increasing importance and can provide reliable, regional models for crustal and upper-mantle structure. We present an efficient Bayesian method for the joint inversion of surface-wave dispersion and receiver-function data that combines trans-dimensional (trans-D) model selection in an optimization phase with subsequent rigorous parameter uncertainty estimation. Parameter and uncertainty estimation depend strongly on the chosen parametrization such that meaningful regional comparison requires quantitative model selection that can be carried out efficiently at several sites. While significant progress has been made for model selection (e.g. trans-D inference) at individual sites, the lack of efficiency can prohibit application to large data volumes or cause questionable results due to lack of convergence. Studies that address large numbers of data sets have mostly ignored model selection in favour of more efficient/simple estimation techniques (i.e. focusing on uncertainty estimation but employing ad-hoc model choices). Our approach consists of a two-phase inversion that combines trans-D optimization to select the most probable parametrization with subsequent Bayesian sampling for uncertainty estimation given that parametrization. The trans-D optimization is implemented here by replacing the likelihood function with the Bayesian information criterion (BIC). The BIC provides constraints on model complexity that facilitate the search for an optimal parametrization. Parallel tempering (PT) is applied as an optimization algorithm. After optimization, the optimal model choice is identified by the minimum BIC value from all PT chains. Uncertainty estimation is then carried out in fixed dimension. Data errors are estimated as part of the inference problem by a combination of empirical and hierarchical estimation. Data covariance matrices are estimated from

  9. Majorana Demonstrator Bolted Joint Mechanical and Thermal Analysis

    SciTech Connect

    Aguayo Navarrete, Estanislao; Reid, Douglas J.; Fast, James E.

    2012-06-01

    The MAJORANA DEMONSTRATOR is designed to probe for neutrinoless double-beta decay, an extremely rare process with a half-life on the order of 10^26 years. The experiment uses an ultra-low background, high-purity germanium detector array. The germanium crystals are both the source and the detector in this experiment. Operating these crystals as ionizing radiation detectors requires keeping them under cryogenic conditions (below 90 K). A liquid nitrogen thermosyphon is used to extract the heat from the detectors. The detector channels are arranged in strings and thermally coupled to the thermosyphon through a cold plate. The cold plate is joined to the thermosyphon by a bolted joint, and this circular plate is housed inside the cryostat can. This document provides a detailed study of the bolted joint that connects the cold plate and the thermosyphon, presenting an analysis of its mechanical and thermal properties. The force applied to the joint is derived from the torque applied to each of the six bolts that form the joint. The thermal conductivity of the joint is measured as a function of applied force. The heat conductivity required for a successful experiment is the combination of the thermal conductivity of the detector string and this joint. The thermal behavior of the joint is experimentally implemented and analyzed in this study.

  10. Reusable Solid Rocket Motor Nozzle Joint-4 Thermal Analysis

    NASA Technical Reports Server (NTRS)

    Clayton, J. Louie

    2001-01-01

    This study provides for development and test verification of a thermal model used for prediction of joint heating environments, structural temperatures and seal erosions in the Space Shuttle Reusable Solid Rocket Motor (RSRM) Nozzle Joint-4. The heating environments are a result of rapid pressurization of the joint free volume assuming a leak path has occurred in the filler material used for assembly gap close out. Combustion gases flow along the leak path from nozzle environment to joint O-ring gland resulting in local heating to the metal housing and erosion of seal materials. Analysis of this condition was based on usage of the NASA Joint Pressurization Routine (JPR) for environment determination and the Systems Improved Numerical Differencing Analyzer (SINDA) for structural temperature prediction. Model generated temperatures, pressures and seal erosions are compared to hot fire test data for several different leak path situations. Investigated in the hot fire test program were nozzle joint-4 O-ring erosion sensitivities to leak path width in both open and confined joint geometries. Model predictions were in generally good agreement with the test data for the confined leak path cases. Worst case flight predictions are provided using the test-calibrated model. Analysis issues are discussed based on model calibration procedures.

  11. Bayesian Analysis and Segmentation of Multichannel Image Sequences

    NASA Astrophysics Data System (ADS)

    Chang, Michael Ming Hsin

    This thesis is concerned with the segmentation and analysis of multichannel image sequence data. In particular, we use the maximum a posteriori probability (MAP) criterion and Gibbs random fields (GRFs) to formulate the problems. We start by reviewing the significance of MAP estimation with GRF priors and study the feasibility of various optimization methods for implementing the MAP estimator. We proceed to investigate three areas where image data and parameter estimates are present in multiple channels and multiple frames, and are interrelated in complicated manners. These areas of study include color image segmentation, multislice MR image segmentation, and optical flow estimation and segmentation in multiframe temporal sequences. Besides developing novel algorithms in each of these areas, we demonstrate how to exploit the potential of MAP estimation and GRFs, and we propose practical and efficient implementations. Illustrative examples and relevant experimental results are included.

  12. Analysis of adhesively bonded composite lap joints

    SciTech Connect

    Tong, L.; Kuruppu, M.; Kelly, D.

    1994-12-31

    A new nonlinear formulation is developed for the governing equations for the shear and peel stresses in adhesively bonded composite double lap joints. The new formulation allows arbitrary nonlinear stress-strain characteristics in both shear and peel behavior. The equations are numerically integrated using a shooting technique and the Newton-Raphson method behind a user-friendly interface. The failure loads are predicted by utilizing the maximum stress criterion, an interlaminar delamination criterion, and energy density failure criteria. Numerical examples are presented to demonstrate the effect of the nonlinear adhesive behavior on the stress distribution and to predict the failure load and the associated failure mode.

  13. Viscoelastic analysis of adhesively bonded joints

    NASA Technical Reports Server (NTRS)

    Delale, F.; Erdogan, F.

    1981-01-01

    In this paper an adhesively bonded lap joint is analyzed by assuming that the adherends are elastic and the adhesive is linearly viscoelastic. After formulating the general problem a specific example for two identical adherends bonded through a three parameter viscoelastic solid adhesive is considered. The standard Laplace transform technique is used to solve the problem. The stress distribution in the adhesive layer is calculated for three different external loads namely, membrane loading, bending, and transverse shear loading. The results indicate that the peak value of the normal stress in the adhesive is not only consistently higher than the corresponding shear stress but also decays slower.

  14. Viscoelastic analysis of adhesively bonded joints

    NASA Technical Reports Server (NTRS)

    Delale, F.; Erdogan, F.

    1980-01-01

    An adhesively bonded lap joint is analyzed by assuming that the adherends are elastic and the adhesive is linearly viscoelastic. After formulating the general problem a specific example for two identical adherends bonded through a three parameter viscoelastic solid adhesive is considered. The standard Laplace transform technique is used to solve the problem. The stress distribution in the adhesive layer is calculated for three different external loads, namely, membrane loading, bending, and transverse shear loading. The results indicate that the peak value of the normal stress in the adhesive is not only consistently higher than the corresponding shear stress but also decays slower.

  15. Transdimensional Bayesian approach to pulsar timing noise analysis

    NASA Astrophysics Data System (ADS)

    Ellis, J. A.; Cornish, N. J.

    2016-04-01

    The modeling of intrinsic noise in pulsar timing residual data is of crucial importance for gravitational wave detection and pulsar timing (astro)physics in general. The noise budget in pulsars is a collection of several well-studied effects, including radiometer noise, pulse-phase jitter noise, dispersion measure variations, and low-frequency spin noise. However, as pulsar timing data continue to improve, nonstationary and non-power-law noise terms are beginning to manifest that are not well modeled by current noise analysis techniques. In this work, we use a transdimensional approach to model these nonstationary and non-power-law effects through the use of a wavelet basis and an interpolation-based adaptive spectral modeling. In both cases, the number of wavelets and the number of control points in the interpolated spectrum are free parameters that are constrained by the data and then marginalized over in the final inferences, thus fully incorporating our ignorance of the noise model. We show that these new methods outperform standard techniques when nonstationary and non-power-law noise is present. We also show that these methods return results consistent with the standard analyses when no such signals are present.

  16. A Bayesian Analysis of the Correlations Among Sunspot Cycles

    NASA Astrophysics Data System (ADS)

    Yu, Y.; van Dyk, D. A.; Kashyap, V. L.; Young, C. A.

    2012-12-01

    Sunspot numbers form a comprehensive, long-duration proxy of solar activity and have been used numerous times to empirically investigate the properties of the solar cycle. A number of correlations have been discovered over the 24 cycles for which observational records are available. Here we carry out a sophisticated statistical analysis of the sunspot record that reaffirms these correlations, and sets up an empirical predictive framework for future cycles. An advantage of our approach is that it allows for rigorous assessment of both the statistical significance of various cycle features and the uncertainty associated with predictions. We summarize the data into three sequential relations that estimate the amplitude, duration, and time of rise to maximum for any cycle, given the values from the previous cycle. We find that there is no indication of a persistence in predictive power beyond one cycle, and we conclude that the dynamo does not retain memory beyond one cycle. Based on sunspot records up to October 2011, we obtain, for Cycle 24, an estimated maximum smoothed monthly sunspot number of 97±15, to occur in January - February 2014 ± six months.

  17. Hierarchical models and Bayesian analysis of bird survey information

    USGS Publications Warehouse

    Sauer, J.R.; Link, W.A.; Royle, J. Andrew

    2005-01-01

    Summary of bird survey information is a critical component of conservation activities, but often our summaries rely on statistical methods that do not accommodate the limitations of the information. Prioritization of species requires ranking and analysis of species by magnitude of population trend, but often magnitude of trend is a misleading measure of actual decline when trend is poorly estimated. Aggregation of population information among regions is also complicated by varying quality of estimates among regions. Hierarchical models provide a reasonable means of accommodating concerns about aggregation and ranking of quantities of varying precision. In these models the need to consider multiple scales is accommodated by placing distributional assumptions on collections of parameters. For collections of species trends, this allows probability statements to be made about the collections of species-specific parameters, rather than about the estimates. We define and illustrate hierarchical models for two commonly encountered situations in bird conservation: (1) Estimating attributes of collections of species estimates, including ranking of trends, estimating number of species with increasing populations, and assessing population stability with regard to predefined trend magnitudes; and (2) estimation of regional population change, aggregating information from bird surveys over strata. User-friendly computer software makes hierarchical models readily accessible to scientists.

  18. Bayesian Finite Mixtures for Nonlinear Modeling of Educational Data.

    ERIC Educational Resources Information Center

    Tirri, Henry; And Others

    A Bayesian approach for finding latent classes in data is discussed. The approach uses finite mixture models to describe the underlying structure in the data and demonstrate that the possibility of using full joint probability models raises interesting new prospects for exploratory data analysis. The concepts and methods discussed are illustrated…

  19. Crash risk analysis for Shanghai urban expressways: A Bayesian semi-parametric modeling approach.

    PubMed

    Yu, Rongjie; Wang, Xuesong; Yang, Kui; Abdel-Aty, Mohamed

    2016-10-01

    Urban expressway systems have been developed rapidly in recent years in China; they have become a key part of city roadway networks, carrying large traffic volumes and providing high travel speeds. Along with the increase in traffic volume, traffic safety has become a major issue for Chinese urban expressways due to frequent crash occurrence and the non-recurrent congestion caused by crashes. For the purpose of unveiling crash occurrence mechanisms and further developing Active Traffic Management (ATM) control strategies to improve traffic safety, this study developed disaggregate crash risk analysis models with loop detector traffic data and historical crash data. Bayesian random effects logistic regression models were utilized, as they can account for unobserved heterogeneity among crashes. However, previous crash risk analysis studies formulated the random effects distributions in a parametric way, assigning them normal distributions; given how little is known about the random effects distributions, such a subjective parametric choice may be incorrect. In order to construct more flexible and robust random effects that capture the unobserved heterogeneity, a Bayesian semi-parametric inference technique was introduced to crash risk analysis in this study. Models with both inference techniques were developed for total crashes; the semi-parametric models provided substantially better goodness-of-fit, while the two models shared consistent coefficient estimates. Bayesian semi-parametric random effects logistic regression models were then developed for weekday peak hour crashes, weekday non-peak hour crashes, and weekend non-peak hour crashes to investigate different crash occurrence scenarios. Significant factors that affect crash risk are revealed and crash occurrence mechanisms are discussed. PMID:26847949
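
    A minimal sketch of the disaggregate modeling idea follows: a Bayesian logistic regression for crash occurrence on simulated loop-detector-style features, sampled by Metropolis. The random effects and their Dirichlet-process (semi-parametric) treatment, the heart of the paper, are omitted here for brevity.

```python
# Minimal sketch: Bayesian logistic regression for crash occurrence with
# simulated detector features and weak normal priors, sampled by Metropolis.
# The paper's (semi-parametric) random effects are omitted for brevity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 2000
X = np.column_stack([np.ones(n),
                     rng.normal(0, 1, n),   # standardized speed variance
                     rng.normal(0, 1, n)])  # standardized volume
beta_true = np.array([-3.0, 0.8, 0.4])
y = rng.uniform(size=n) < 1 / (1 + np.exp(-X @ beta_true))

def log_post(b):
    eta = X @ b
    ll = np.where(y, -np.log1p(np.exp(-eta)), -np.log1p(np.exp(eta))).sum()
    return ll + stats.norm.logpdf(b, 0, 10).sum()      # weak N(0, 100) priors

b, lp, draws = np.zeros(3), log_post(np.zeros(3)), []
for _ in range(20000):
    prop = b + rng.normal(0, 0.05, 3)
    lp_p = log_post(prop)
    if np.log(rng.uniform()) < lp_p - lp:
        b, lp = prop, lp_p
    draws.append(b.copy())

print("posterior means:", np.round(np.mean(draws[5000:], axis=0), 2))
```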

  20. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network

    PubMed Central

    2011-01-01

    Background Genome-scale metabolic network models have contributed to elucidating biological phenomena and to predicting gene targets to engineer for biotechnological applications. With their increasing importance, their precise network characterization has also been crucial for a better understanding of cellular physiology. Results We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals the direction of influences among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to a specific perturbation, using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic modular level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism, an lpdA gene knockout mutant, using its genome-scale metabolic network model. Conclusions In the end, the framework provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis. PMID:22784571

  1. Life-threatening arrhythmia verification in ICU patients using the joint cardiovascular dynamical model and a Bayesian filter.

    PubMed

    Sayadi, Omid; Shamsollahi, Mohammad B

    2011-10-01

    In this paper, a novel nonlinear joint dynamical model is presented, which is based on a set of coupled ordinary differential equations of motion and a Gaussian mixture model representation of pulsatile cardiovascular (CV) signals. In the proposed framework, the joint interdependences of CV signals are incorporated by assuming a unique angular frequency that controls the limit cycle of the heart rate. Moreover, the time consequence of CV signals is controlled by the same phase parameter, which results in a reduction of the space dimensionality. These joint equations, together with linear assignments to the observations, are further used in the Kalman filter structure for estimation and tracking. Moreover, we propose a measure of signal fidelity by monitoring the covariance matrix of the innovation signals throughout the filtering procedure. Five categories of life-threatening arrhythmias were verified by simultaneously tracking the signal fidelity and the polar representation of the CV signal estimates. We analyzed data from Physiobank multiparameter databases (MIMIC I and II). Performance evaluation results demonstrated that the sensitivity of the detection ranges between 93.50% and 100.00%. In particular, the addition of more CV signals improved the positive predictivity of the proposed method to 99.27% for the total arrhythmic types. The method was also used for suppression of false arrhythmia alarms issued by ICU monitors, with the overall false suppression rate reduced from 42.3% to 9.9%; false critical ECG arrhythmia alarm rates were found to be, on average, 42.3%, with individual rates varying between 16.7% and 86.5%. The results illustrate that the method can contribute to, and enhance the performance of, clinical life-threatening arrhythmia detection. PMID:21324772

  2. Crystalline nucleation in undercooled liquids: A Bayesian data-analysis approach for a nonhomogeneous Poisson process

    NASA Astrophysics Data System (ADS)

    Filipponi, A.; Di Cicco, A.; Principi, E.

    2012-12-01

    A Bayesian data-analysis approach to data sets of maximum undercooling temperatures recorded in repeated melting-cooling cycles of high-purity samples is proposed. The crystallization phenomenon is described in terms of a nonhomogeneous Poisson process driven by a temperature-dependent sample nucleation rate J(T). The method was extensively tested by computer simulations and applied to real data for undercooled liquid Ge. It proved to be particularly useful in the case of scarce data sets, where the use of binned data would degrade the available experimental information.
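
    In the nonhomogeneous Poisson picture, the probability of remaining liquid down to temperature T during steady cooling at rate q is S(T) = exp(-(1/q) ∫ from T to Tm of J(T') dT'), and maximum-undercooling data are draws from the implied distribution. The sketch below simulates such data with an arbitrary illustrative J(T), not a rate fitted to the germanium measurements.

```python
# Minimal sketch: simulate maximum-undercooling temperatures from a
# nonhomogeneous Poisson process with survival S(T) = exp(-cum(T)/q).
# J(T) is an arbitrary illustrative nucleation rate.
import numpy as np

rng = np.random.default_rng(5)
Tm, q = 1211.0, 1.0                        # melting point (K), cooling rate (K/s)
T = np.linspace(Tm, Tm - 400, 4000)        # temperatures scanned while cooling

def J(temp):
    """Illustrative nucleation rate (1/s), growing with undercooling."""
    return 1e-9 * np.exp(0.05 * (Tm - temp))

cum = np.cumsum(J(T)) * (T[0] - T[1]) / q  # integral of J from Tm down to T
S = np.exp(-cum)                           # survival (still liquid at T)

# Inverse-CDF sampling: P(T_nuc <= T) = S(T), so T_nuc = S^{-1}(u), u ~ U(0,1).
u = rng.uniform(size=200)
idx = np.minimum(np.searchsorted(-S, -u), len(T) - 1)
T_nuc = T[idx]

print(f"mean undercooling {Tm - T_nuc.mean():.1f} K, spread {T_nuc.std():.1f} K")
```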

  3. A Bayesian analysis of the chromosome architecture of human disorders by integrating reductionist data.

    PubMed

    Emmert-Streib, Frank; de Matos Simoes, Ricardo; Tripathi, Shailesh; Glazko, Galina V; Dehmer, Matthias

    2012-01-01

    In this paper, we present a Bayesian approach to estimate a chromosome network and a disorder network from the Online Mendelian Inheritance in Man (OMIM) database. In contrast to other approaches, we obtain statistical rather than deterministic networks, enabling parametric control of the uncertainty in the underlying disorder-disease gene associations contained in OMIM, on which the networks are based. From a structural investigation of the chromosome network, we identify three chromosome subgroups that reflect architectural differences in chromosome-disorder associations and that can be exploited predictively for a functional analysis of diseases. PMID:22822426
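
    One way to obtain statistical rather than deterministic networks is to bootstrap-resample the underlying gene-disorder associations and retain disorder-disorder links according to their resampling support. The toy association list and confidence threshold below are illustrative assumptions, not OMIM content or the authors' exact procedure.

```python
# A minimal sketch of a bootstrap-based statistical disorder network built
# from (hypothetical) disorder-gene association pairs.
import numpy as np
from itertools import combinations
from collections import Counter

rng = np.random.default_rng(3)
associations = [("disorderA", "g1"), ("disorderA", "g2"), ("disorderB", "g2"),
                ("disorderB", "g3"), ("disorderC", "g3"), ("disorderC", "g1")]

edge_counts, B = Counter(), 500
for _ in range(B):
    idx = rng.integers(0, len(associations), size=len(associations))
    genes = {}
    for k in idx:                              # resampled association list
        disorder, gene = associations[k]
        genes.setdefault(disorder, set()).add(gene)
    # Link two disorders if they share at least one associated gene.
    for d1, d2 in combinations(sorted(genes), 2):
        if genes[d1] & genes[d2]:
            edge_counts[(d1, d2)] += 1

# Keep edges whose bootstrap support exceeds a chosen confidence level.
network = {edge: count / B for edge, count in edge_counts.items()
           if count / B > 0.5}
print(network)
```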

  4. Bayesian design and analysis of computer experiments: Use of derivatives in surface prediction

    SciTech Connect

    Morris, M.D.; Mitchell, T.J.; Ylvisaker, D.

    1991-06-01

    The work of Currin et al. and others in developing "fast predictive approximations" of computer models is extended to the case in which derivatives of the output variable of interest with respect to input variables are available. In addition to describing the calculations required for the Bayesian analysis, the issue of experimental design is also discussed, and an algorithm is described for constructing "maximin distance" designs. An example is given based on a demonstration model of eight inputs and one output, in which predictions based on a maximin design, a Latin hypercube design, and two "compromise" designs are evaluated and compared. 12 refs., 2 figs., 6 tabs.
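
    The flavor of a maximin-distance construction can be conveyed by a greedy sketch over random candidate points, with each new point chosen to maximize its distance to the nearest point already in the design. The paper describes a more sophisticated algorithm, so treat this as an illustrative approximation only.

```python
# A greedy sketch of a maximin-distance design on the unit hypercube; an
# approximation, not the algorithm of the paper.
import numpy as np

def greedy_maximin(n_points, dim, n_candidates=2000, seed=0):
    rng = np.random.default_rng(seed)
    candidates = rng.random((n_candidates, dim))
    design = [candidates[0]]                   # arbitrary starting point
    for _ in range(n_points - 1):
        # Distance from every candidate to its nearest design point.
        diffs = candidates[:, None, :] - np.array(design)[None, :, :]
        nearest = np.linalg.norm(diffs, axis=2).min(axis=1)
        design.append(candidates[np.argmax(nearest)])  # maximin choice
    return np.array(design)

print(greedy_maximin(n_points=8, dim=2))
```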

  5. Bayesian Methods in Cosmology

    NASA Astrophysics Data System (ADS)

    Hobson, Michael P.; Jaffe, Andrew H.; Liddle, Andrew R.; Mukherjee, Pia; Parkinson, David

    2014-02-01

    Preface; Part I. Methods: 1. Foundations and algorithms John Skilling; 2. Simple applications of Bayesian methods D. S. Sivia and Steve Rawlings; 3. Parameter estimation using Monte Carlo sampling Antony Lewis and Sarah Bridle; 4. Model selection and multi-model inference Andrew R. Liddle, Pia Mukherjee and David Parkinson; 5. Bayesian experimental design and model selection forecasting Roberto Trotta, Martin Kunz, Pia Mukherjee and David Parkinson; 6. Signal separation in cosmology M. P. Hobson, M. A. J. Ashdown and V. Stolyarov; Part II. Applications: 7. Bayesian source extraction M. P. Hobson, Graça Rocha and R. Savage; 8. Flux measurement Daniel Mortlock; 9. Gravitational wave astronomy Neil Cornish; 10. Bayesian analysis of cosmic microwave background data Andrew H. Jaffe; 11. Bayesian multilevel modelling of cosmological populations Thomas J. Loredo and Martin A. Hendry; 12. A Bayesian approach to galaxy evolution studies Stefano Andreon; 13. Photometric redshift estimation: methods and applications Ofer Lahav, Filipe B. Abdalla and Manda Banerji; Index.

  6. Bayesian analysis of RNA sequencing data by estimating multiple shrinkage priors.

    PubMed

    Van De Wiel, Mark A; Leday, Gwenaël G R; Pardo, Luba; Rue, Håvard; Van Der Vaart, Aad W; Van Wieringen, Wessel N

    2013-01-01

    Next generation sequencing is quickly replacing microarrays as a technique to probe different molecular levels of the cell, such as DNA or RNA. The technology provides higher resolution while reducing bias. RNA sequencing results in counts of RNA strands, and this type of data imposes new statistical challenges. We present a novel, generic approach to model and analyze such data. Our approach aims at large flexibility of the likelihood (count) model and the regression model alike. Hence, a variety of count models is supported, such as the popular negative binomial (NB) model, which accounts for overdispersion. In addition, complex, non-balanced designs and random effects are accommodated. Like some other methods, our method provides shrinkage of dispersion-related parameters. However, we extend it by enabling joint shrinkage of parameters, including those for which inference is desired; we argue that this is essential for Bayesian multiplicity correction. Shrinkage is effectuated by empirically estimating priors. We discuss several parametric (mixture) and non-parametric priors and develop procedures to estimate those priors and their parameters. Inference is provided by means of local and Bayesian false discovery rates. We illustrate our method on several simulations and two data sets, and compare it with other methods. Model- and data-based simulations show substantial improvements in sensitivity at a given specificity. The data motivate the use of the zero-inflated NB (ZI-NB) as a powerful alternative to the NB, which results in higher detection rates for low-count data. Finally, compared with other methods, the results on small sample subsets are more reproducible when validated on their large-sample complements, illustrating the importance of the type of shrinkage. PMID:22988280
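
    The joint-shrinkage idea can be caricatured in a few lines: estimate a dispersion per gene, fit a prior to the collection of estimates empirically, and pull each estimate toward that prior. The simulated counts, method-of-moments estimator, and assumed sampling variance below are illustrative simplifications of the paper's procedure.

```python
# A minimal empirical-Bayes sketch: shrink per-gene NB dispersion estimates
# toward an empirically estimated prior (simulated data, illustrative only).
import numpy as np

rng = np.random.default_rng(4)
n_genes, n_samples, mu = 500, 8, 50.0
true_disp = rng.lognormal(mean=-1.0, sigma=0.5, size=n_genes)
counts = rng.negative_binomial(n=1.0 / true_disp[:, None],
                               p=1.0 / (1.0 + mu * true_disp[:, None]),
                               size=(n_genes, n_samples))

# Method-of-moments dispersion per gene, from Var = mu + disp * mu^2.
m = counts.mean(axis=1)
v = counts.var(axis=1, ddof=1)
raw_disp = np.clip((v - m) / np.maximum(m, 1e-8) ** 2, 1e-4, None)

# Empirically estimated prior in log space, then linear shrinkage toward it.
log_d = np.log(raw_disp)
sampling_var = 0.5                     # assumed noise variance of each estimate
prior_mean = log_d.mean()
prior_var = max(log_d.var() - sampling_var, 1e-6)
weight = prior_var / (prior_var + sampling_var)
shrunk_disp = np.exp(prior_mean + weight * (log_d - prior_mean))
print(shrunk_disp[:5])
```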

  7. Finite element analysis of human joints

    SciTech Connect

    Bossart, P.L.; Hollerbach, K.

    1996-09-01

    Our work focuses on the development of finite element models (FEMs) that describe the biomechanics of human joints. Finite element modeling is becoming a standard tool in industrial applications. In highly complex problems such as those found in biomechanics research, however, the full potential of FEMs is just beginning to be explored, due to the absence of precise, high-resolution medical data and the difficulties encountered in converting these enormous datasets into a form usable in FEMs. With increasing computing speed and memory available, it is now feasible to address these challenges. We address the first by acquiring data with a high-resolution X-ray CT scanner and the second by developing a semi-automated method for generating the volumetric meshes used in the FEMs. Issues related to tomographic reconstruction, volume segmentation, the use of extracted surfaces to generate volumetric hexahedral meshes, and applications of the FEMs are described.

  8. Analysis of molecular expression patterns and integration with other knowledge bases using probabilistic Bayesian network models

    SciTech Connect

    Moler, Edward J.; Mian, I.S.

    2000-03-01

    How can molecular expression experiments be interpreted with more than 10,000 measurements per chip? How can one extract the most quantitative information possible from the experimental data with good confidence? These are important questions whose solutions require an interdisciplinary combination of molecular and cellular biology, computer science, statistics, and complex systems analysis. The explosion of data from microarray techniques presents the problem of interpreting the experiments. The availability of large-scale knowledge bases provides the opportunity to maximize the information extracted from these experiments. We have developed new methods of discovering biological function, metabolic pathways, and regulatory networks from these data and knowledge bases. These techniques are applicable to analyses for biomedical engineering, clinical, and fundamental cell and molecular biology studies. Our approach uses probabilistic, computational methods that give quantitative interpretations of data in a biological context. We have selected Bayesian statistical models with graphical network representations as a framework for our methods. As a first step, we use a naive Bayes classifier to identify statistically significant patterns in gene expression data. We have developed methods that allow us to (a) characterize which genes or experiments distinguish each class from the others, (b) cross-index the resulting classes with other databases to assess the biological meaning of the classes, and (c) display a gross overview of cellular dynamics. We have developed a number of visualization tools to convey the results. We report here our methods of classification and our first attempts at integrating the data and other knowledge bases together with new visualization tools. We demonstrate the utility of these methods and tools by analyzing a series of yeast cDNA microarray data and a set of cancerous/normal sample data from colon cancer patients. We discuss
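
    The classification step can be sketched with a Gaussian naive Bayes classifier built from scratch on simulated expression data; the data, labels, and gene-ranking heuristic are hypothetical stand-ins for the microarray analyses described above.

```python
# A from-scratch Gaussian naive Bayes sketch on simulated expression data.
import numpy as np

rng = np.random.default_rng(5)
n_samples, n_genes = 60, 100
labels = rng.integers(0, 2, size=n_samples)
X = rng.normal(size=(n_samples, n_genes))
X[labels == 1, :5] += 2.0              # five informative genes

classes = np.unique(labels)
means = np.array([X[labels == c].mean(axis=0) for c in classes])
stds = np.array([X[labels == c].std(axis=0, ddof=1) for c in classes])
priors = np.array([(labels == c).mean() for c in classes])

def predict(x):
    # Per-class sum of per-gene Gaussian log-likelihoods plus log prior.
    loglik = -0.5 * np.sum(((x - means) / stds) ** 2
                           + np.log(2 * np.pi * stds**2), axis=1)
    return classes[np.argmax(loglik + np.log(priors))]

accuracy = np.mean([predict(x) == y for x, y in zip(X, labels)])
# Genes with the largest standardized mean difference distinguish the classes.
top = np.argsort(-np.abs(means[1] - means[0]) / stds.mean(axis=0))[:5]
print("training accuracy:", accuracy, "top genes:", top)
```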

  9. Calculation of joint reaction force and joint moments using a wearable walking analysis system.

    PubMed

    Adachi, Wataru; Tsujiuchi, Nobutaka; Koizumi, Takayuki; Shiojima, Kouzou; Tsuchiya, Youtaro; Inoue, Yoshio

    2012-01-01

    In gait analysis, which is one useful method for efficient physical rehabilitation, the ground reaction force, the center of pressure, and body orientation data are measured during walking. In the past, these data were measured by a 3D motion analysis system consisting of high-speed cameras and force plates installed in the floor. However, a conventional 3D motion analysis system can measure the ground reaction force and the center of pressure only on the force plates, and thus for just a few steps. In addition, the subjects' stride lengths are constrained because they have to step on the center of a force plate. These problems can be resolved by converting the conventional devices into wearable ones. We used a measuring device consisting of portable force plates and motion sensors, developed a walking analysis system that calculates the ground reaction force, the center of pressure, and the body orientations, and measured a walking subject to evaluate this system. We simultaneously used a conventional 3D motion analysis system for comparison with our developed system and showed its validity for measurements of the ground reaction force and the center of pressure. Moreover, we calculated the joint reaction force and joint moment at each joint. PMID:23365940
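
    The inverse-dynamics step can be sketched in two dimensions: with the measured ground reaction force and center of pressure, a quasi-static Newton-Euler balance on the foot segment yields the ankle joint reaction force and joint moment. The segment geometry, mass, and quasi-static assumption below are illustrative, not the authors' full model.

```python
# A 2-D quasi-static sketch of the ankle joint reaction force and moment
# computed from the ground reaction force and center of pressure.
import numpy as np

grf = np.array([20.0, 700.0])          # ground reaction force (Fx, Fy), N
cop = np.array([0.12, 0.0])            # center of pressure, m
ankle = np.array([0.10, 0.08])         # ankle joint center, m
foot_com = np.array([0.11, 0.03])      # foot center of mass, m
foot_mass, g = 1.0, 9.81               # illustrative segment parameters

# Force balance on the foot segment (accelerations assumed negligible).
weight = np.array([0.0, -foot_mass * g])
joint_reaction = -(grf + weight)       # force the shank applies to the foot

def cross2d(r, f):
    return r[0] * f[1] - r[1] * f[0]

# Moment balance about the ankle: M_joint + r_cop x GRF + r_com x W = 0.
joint_moment = -(cross2d(cop - ankle, grf) + cross2d(foot_com - ankle, weight))
print("joint reaction force [N]:", joint_reaction)
print("ankle joint moment [N*m]:", joint_moment)
```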