Science.gov

Sample records for joint Bayesian analysis

  1. JBASE: Joint Bayesian Analysis of Subphenotypes and Epistasis

    PubMed Central

    Colak, Recep; Kim, TaeHyung; Kazan, Hilal; Oh, Yoomi; Cruz, Miguel; Valladares-Salgado, Adan; Peralta, Jesus; Escobedo, Jorge; Parra, Esteban J.; Kim, Philip M.; Goldenberg, Anna

    2016-01-01

Motivation: Rapid advances in genotyping and genome-wide association studies have enabled the discovery of many new genotype–phenotype associations at the resolution of individual markers. However, these associations explain only a small proportion of the theoretically estimated heritability of most diseases. In this work, we propose an integrative mixture model called JBASE: joint Bayesian analysis of subphenotypes and epistasis. JBASE explores two major sources of missing heritability: interactions between genetic variants (a phenomenon known as epistasis) and phenotypic heterogeneity (addressed via subphenotyping). Results: Our extensive simulations across a wide range of scenarios repeatedly demonstrate that JBASE can identify true underlying subphenotypes, including their associated variants and their interactions, with high precision. In the presence of phenotypic heterogeneity, JBASE has higher power and a lower type I error rate than five state-of-the-art approaches. We applied our method to a sample of individuals from Mexico with type 2 diabetes and discovered two novel epistatic modules, of two loci each, that define two subphenotypes characterized by differences in body mass index and waist-to-hip ratio. We successfully replicated these subphenotypes and epistatic modules in an independent dataset from Mexico genotyped on a different platform. Availability and implementation: JBASE is implemented in C++, supported on Linux and available at http://www.cs.toronto.edu/∼goldenberg/JBASE/jbase.tar.gz. The genotype data underlying this study are available upon approval by the ethics review board of the Medical Centre Siglo XXI. Please contact Dr Miguel Cruz at mcruzl@yahoo.com for assistance with the application. Contact: anna.goldenberg@utoronto.ca Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26411870

  2. A Bayesian approach to joint analysis of longitudinal measurements and competing risks failure time data.

    PubMed

    Hu, Wenhua; Li, Gang; Li, Ning

    2009-05-15

In this paper, we develop a Bayesian method for the joint analysis of longitudinal measurements and competing risks failure time data. The model allows one to analyze the longitudinal outcome with nonignorable missing data induced by multiple types of events, to analyze survival data with dependent censoring for the key event, and to draw inferences on multiple endpoints simultaneously. Compared with the likelihood approach, the Bayesian method has several advantages. It is computationally more tractable for high-dimensional random effects, and it makes drawing inferences convenient. Moreover, it provides a means to incorporate prior information that may help to improve estimation accuracy. An illustration is given using data from a clinical trial on scleroderma lung disease. The performance of our method is evaluated by simulation studies. PMID:19308919
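
    Joint models of this kind are typically built around shared random effects that link the longitudinal trajectory to the cause-specific hazards. A schematic of that generic structure (a hedged sketch, not necessarily the authors' exact specification):

        y_i(t) = x_i(t)^T \beta + z_i(t)^T b_i + \epsilon_i(t),   \epsilon_i(t) ~ N(0, \sigma^2)        (longitudinal submodel)
        \lambda_k(t | b_i) = \lambda_{0k}(t) \exp( w_i^T \gamma_k + \nu_k^T b_i ),   k = 1, ..., K      (competing-risks submodel)
        b_i ~ N(0, \Sigma_b)                                                                            (shared random effects)

    Because b_i enters both submodels, dropout caused by any of the K event types is informative about the longitudinal outcome, and MCMC yields the joint posterior over (\beta, \gamma_k, \nu_k, \Sigma_b) for simultaneous inference on all endpoints.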

  3. Joint Bayesian analysis of birthweight and censored gestational age using finite mixture models

    PubMed Central

    Schwartz, Scott L.; Gelfand, Alan E.; Miranda, Marie L.

    2016-01-01

Birthweight and gestational age are closely related and represent important indicators of a healthy pregnancy. Customary modeling for birthweight is conditional on gestational age. However, joint modeling directly addresses the relationship between gestational age and birthweight, and provides increased flexibility and interpretation as well as a strategy to avoid using gestational age as an intermediate variable. Previous proposals have utilized finite mixtures of bivariate regression models to incorporate well-established risk factors into the analysis (e.g. sex and birth order of the baby, maternal age, race, and tobacco use) while examining the non-Gaussian shape of the joint birthweight and gestational age distribution. We build on this approach by demonstrating the inferential (prognostic) benefits of joint modeling (e.g. investigation of 'age-inappropriate' outcomes such as small for gestational age) and hence re-emphasize the importance of capturing the non-Gaussian distributional shapes. We additionally extend current models through a latent specification which admits interval-censored gestational age. We work within a Bayesian framework which enables inference beyond customary parameter estimation and prediction as well as exact uncertainty assessment. The model is applied to a portion of the 2003–2006 North Carolina Detailed Birth Record data (n = 336,129) available through the Children's Environmental Health Initiative and is fitted using Bayesian methodology and Markov chain Monte Carlo approaches. PMID:20575047
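
    Schematically, the joint finite-mixture specification with interval-censored gestational age can be written as follows (a hedged reconstruction, not the authors' exact notation):

        (bw_i, ga_i^*) | x_i ~ \sum_{k=1}^{K} \pi_k(x_i) \, N_2( \mu_k(x_i), \Sigma_k ),
        ga_i^* latent, observed only as ga_i^* \in [a_i, b_i)       (gestational age recorded in completed weeks)

    Each component is a bivariate Gaussian regression on the risk factors x_i, and the latent exact gestational age ga_i^* is imputed within the MCMC sampler, so censoring is handled by data augmentation rather than by rounding.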

  4. JAM: A Scalable Bayesian Framework for Joint Analysis of Marginal SNP Effects

    PubMed Central

Newcombe, Paul J.; Conti, David V.; Richardson, Sylvia

    2016-01-01

Recently, large scale genome-wide association study (GWAS) meta-analyses have boosted the number of known signals for some traits into the tens and hundreds. Typically, however, variants are only analysed one-at-a-time. This complicates the ability of fine-mapping to identify a small set of SNPs for further functional follow-up. We describe a new and scalable algorithm, joint analysis of marginal summary statistics (JAM), for the re-analysis of published marginal summary statistics under joint multi-SNP models. The correlation is accounted for according to estimates from a reference dataset, and models and SNPs that best explain the complete joint pattern of marginal effects are highlighted via an integrated Bayesian penalized regression framework. We provide both enumerated and Reversible Jump MCMC implementations of JAM and present some comparisons of performance. In a series of realistic simulation studies, JAM demonstrated identical performance to various alternatives designed for single region settings. In multi-region settings, where the only multivariate alternative involves stepwise selection, JAM offered greater power and specificity. We also present an application to real published results from MAGIC (meta-analysis of glucose and insulin related traits consortium) - a GWAS meta-analysis of more than 15,000 people. We re-analysed several genomic regions that produced multiple significant signals with glucose levels 2 hr after oral stimulation. Through joint multivariate modelling, JAM was able to formally rule out many SNPs, and for one gene, ADCY5, suggests that an additional SNP, which transpired to be more biologically plausible, should be followed up with equal priority to the reported index. PMID:27027514

  5. JAM: A Scalable Bayesian Framework for Joint Analysis of Marginal SNP Effects.

    PubMed

    Newcombe, Paul J; Conti, David V; Richardson, Sylvia

    2016-04-01

    Recently, large scale genome-wide association study (GWAS) meta-analyses have boosted the number of known signals for some traits into the tens and hundreds. Typically, however, variants are only analysed one-at-a-time. This complicates the ability of fine-mapping to identify a small set of SNPs for further functional follow-up. We describe a new and scalable algorithm, joint analysis of marginal summary statistics (JAM), for the re-analysis of published marginal summary statistics under joint multi-SNP models. The correlation is accounted for according to estimates from a reference dataset, and models and SNPs that best explain the complete joint pattern of marginal effects are highlighted via an integrated Bayesian penalized regression framework. We provide both enumerated and Reversible Jump MCMC implementations of JAM and present some comparisons of performance. In a series of realistic simulation studies, JAM demonstrated identical performance to various alternatives designed for single region settings. In multi-region settings, where the only multivariate alternative involves stepwise selection, JAM offered greater power and specificity. We also present an application to real published results from MAGIC (meta-analysis of glucose and insulin related traits consortium) - a GWAS meta-analysis of more than 15,000 people. We re-analysed several genomic regions that produced multiple significant signals with glucose levels 2 hr after oral stimulation. Through joint multivariate modelling, JAM was able to formally rule out many SNPs, and for one gene, ADCY5, suggests that an additional SNP, which transpired to be more biologically plausible, should be followed up with equal priority to the reported index. PMID:27027514
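
    The key identity behind re-analysis from summary statistics (used by JAM and related methods, stated here as a hedged sketch) is that, for genotypes standardized to unit variance, the marginal estimates satisfy \hat{\beta}_{marg} \approx X^T y / n, so the joint least-squares solution can be recovered without individual-level data:

        \hat{\beta}_{joint} = (X^T X)^{-1} X^T y \approx R^{-1} \hat{\beta}_{marg},

    where R is the SNP correlation (LD) matrix estimated from a reference panel. JAM wraps this reconstruction in a Bayesian variable-selection (penalized regression) framework rather than inverting R directly for all SNPs at once.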

  6. Joint high-dimensional Bayesian variable and covariance selection with an application to eQTL analysis.

    PubMed

    Bhadra, Anindya; Mallick, Bani K

    2013-06-01

We describe a Bayesian technique to (a) perform a sparse joint selection of significant predictor variables and significant inverse covariance matrix elements of the response variables in a high-dimensional linear Gaussian sparse seemingly unrelated regression (SSUR) setting and (b) perform an association analysis between the high-dimensional sets of predictors and responses in such a setting. To search the high-dimensional model space, where both the number of predictors and the number of possibly correlated responses can be larger than the sample size, we demonstrate that a marginalization-based collapsed Gibbs sampler, in combination with spike-and-slab priors, offers a computationally feasible and efficient solution. As an example, we apply our method to an expression quantitative trait loci (eQTL) analysis on publicly available single nucleotide polymorphism (SNP) and gene expression data for humans, where the primary interest lies in finding the significant associations between the sets of SNPs and possibly correlated genetic transcripts. Our method also allows for inference on the sparse interaction network of the transcripts (response variables) after accounting for the effect of the SNPs (predictor variables). We exploit properties of Gaussian graphical models to make statements concerning conditional independence of the responses. Our method compares favorably to existing Bayesian approaches developed for this purpose. PMID:23607608
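
    A schematic of the sparse seemingly unrelated regression setup with spike-and-slab priors (a sketch of the model class, not the paper's exact notation):

        Y = X B + E,   rows of E ~ N_q(0, \Omega^{-1})
        B_{jk} ~ (1 - \gamma_{jk}) \delta_0 + \gamma_{jk} N(0, \tau^2)          (variable selection)
        \omega_{kl} ~ (1 - z_{kl}) \delta_0 + z_{kl} (slab),   k \ne l          (covariance selection)

    A zero off-diagonal entry \omega_{kl} of the precision matrix \Omega encodes conditional independence of responses k and l given the others, which is what licenses the Gaussian-graphical-model statements about the transcript network; analytically marginalizing out B is what makes the collapsed Gibbs sampler feasible in this dimensionality.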

  7. Bayesian Mediation Analysis

    ERIC Educational Resources Information Center

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  8. A Bayesian Approach for Multigroup Nonlinear Factor Analysis.

    ERIC Educational Resources Information Center

    Song, Xin-Yuan; Lee, Sik-Yum

    2002-01-01

Developed a Bayesian approach for a general multigroup nonlinear factor analysis model that simultaneously obtains joint Bayesian estimates of the factor scores and of the structural parameters subject to some constraints across different groups. (SLD)

  9. Bayesian Exploratory Factor Analysis

    PubMed Central

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  10. Bayesian joint modeling of longitudinal and spatial survival AIDS data.

    PubMed

    Martins, Rui; Silva, Giovani L; Andreozzi, Valeska

    2016-08-30

Joint analysis of longitudinal and survival data has received increasing attention in recent years, especially for analyzing cancer and AIDS data. As both repeated measurements (longitudinal) and time-to-event (survival) outcomes are observed in an individual, a joint model is more appropriate because it takes into account the dependence between the two types of responses, which are often analyzed separately. We propose a Bayesian hierarchical model for jointly modeling longitudinal and survival data considering functional time and spatial frailty effects, respectively. That is, the proposed model deals with non-linear longitudinal effects and spatial survival effects, accounting for the unobserved heterogeneity among individuals living in the same region. This joint approach is applied to a cohort study of patients with HIV/AIDS in Brazil during the years 2002-2006. Our Bayesian joint model presents considerable improvements in the estimation of survival times of the Brazilian HIV/AIDS patients when compared with those obtained through a separate survival model, and shows that the spatial risk of death is the same across the different Brazilian states. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26990773

  11. Road network safety evaluation using Bayesian hierarchical joint model.

    PubMed

    Wang, Jie; Huang, Helai

    2016-05-01

Safety and efficiency are commonly regarded as two significant performance indicators of transportation systems. In practice, road network planning has focused on road capacity and transport efficiency, whereas the safety level of a road network has received little attention in the planning stage. This study develops a Bayesian hierarchical joint model for road network safety evaluation to help planners take traffic safety into account when planning a road network. The proposed model establishes relationships between road network risk and micro-level variables related to road entities and traffic volume, as well as socioeconomic, trip generation and network density variables at the macro level, which are generally used for long term transportation plans. In addition, network spatial correlation between intersections and their connected road segments is also considered in the model. A road network is carefully selected in order to compare the proposed hierarchical joint model with a previous joint model and a negative binomial model. According to the results of the model comparison, the hierarchical joint model outperforms the joint model and negative binomial model in terms of goodness-of-fit and predictive performance, which indicates the reasonableness of considering the hierarchical data structure in crash prediction and analysis. Moreover, both random effects at the traffic analysis zone (TAZ) level and the spatial correlation between intersections and their adjacent segments are found to be significant, supporting the employment of the hierarchical joint model as an alternative in road-network-level safety modeling as well. PMID:26945109

  12. Bayesian Joint Modelling for Object Localisation in Weakly Labelled Images.

    PubMed

    Shi, Zhiyuan; Hospedales, Timothy M; Xiang, Tao

    2015-10-01

    We address the problem of localisation of objects as bounding boxes in images and videos with weak labels. This weakly supervised object localisation problem has been tackled in the past using discriminative models where each object class is localised independently from other classes. In this paper, a novel framework based on Bayesian joint topic modelling is proposed, which differs significantly from the existing ones in that: (1) All foreground object classes are modelled jointly in a single generative model that encodes multiple object co-existence so that "explaining away" inference can resolve ambiguity and lead to better learning and localisation. (2) Image backgrounds are shared across classes to better learn varying surroundings and "push out" objects of interest. (3) Our model can be learned with a mixture of weakly labelled and unlabelled data, allowing the large volume of unlabelled images on the Internet to be exploited for learning. Moreover, the Bayesian formulation enables the exploitation of various types of prior knowledge to compensate for the limited supervision offered by weakly labelled data, as well as Bayesian domain adaptation for transfer learning. Extensive experiments on the PASCAL VOC, ImageNet and YouTube-Object videos datasets demonstrate the effectiveness of our Bayesian joint model for weakly supervised object localisation. PMID:26340253

  13. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
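
    The core of BUS is the classical rejection-sampling view of Bayesian updating: draw from the prior and accept with probability proportional to the likelihood, so that the acceptance event becomes a "failure domain" that FORM, IS or SuS can handle. A minimal sketch in Python (the toy Gaussian model and all names are illustrative assumptions, not the authors' code):

        import numpy as np

        rng = np.random.default_rng(0)
        data = np.array([1.1, 0.9, 1.3])
        sigma = 0.5

        def likelihood(theta):
            # Gaussian measurement model: each observation ~ N(theta, sigma^2)
            return np.exp(-0.5 * np.sum((data - theta) ** 2) / sigma**2)

        c = likelihood(data.mean())  # upper bound on L(theta), attained at the sample mean

        samples = []
        while len(samples) < 2000:
            theta = rng.normal(0.0, 1.0)                  # prior draw, theta ~ N(0, 1)
            if rng.uniform() <= likelihood(theta) / c:    # accept with probability L(theta)/c
                samples.append(theta)

        # Accepted draws are posterior samples; the acceptance probability itself is a
        # (possibly rare) event probability that FORM/IS/SuS can estimate far more
        # efficiently than the brute-force rejection loop above.
        print(f"posterior mean ~= {np.mean(samples):.3f}")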

  14. Bayesian Model Averaging for Propensity Score Analysis

    ERIC Educational Resources Information Center

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  15. Analysis of COSIMA spectra: Bayesian approach

    NASA Astrophysics Data System (ADS)

    Lehto, H. J.; Zaprudin, B.; Lehto, K. M.; Lönnberg, T.; Silén, J.; Rynö, J.; Krüger, H.; Hilchenbach, M.; Kissel, J.

    2015-06-01

We describe the use of Bayesian analysis methods applied to time-of-flight secondary ion mass spectrometer (TOF-SIMS) spectra. The method is applied to the COmetary Secondary Ion Mass Analyzer (COSIMA) TOF-SIMS mass spectra where the analysis can be broken into subgroups of lines close to integer mass values. The effects of the instrumental dead time are discussed in a new way. The method finds the joint probability density functions of measured line parameters (number of lines, and their widths, peak amplitudes, integrated amplitudes and positions). In the case of two or more lines, these distributions can take complex forms. The derived line parameters can be used to further calibrate the mass scaling of TOF-SIMS and to feed the results into other analysis methods such as multivariate analyses of spectra. We intend to use the method, first as a comprehensive tool to perform quantitative analysis of spectra, and second as a fast tool for studying interesting targets for obtaining additional TOF-SIMS measurements of the sample, a property unique to COSIMA. Finally, we point out that the Bayesian method can be thought of as a means to solve inverse problems, but with forward calculations only, with no iterative corrections or other manipulation of the observed data.

  16. Bayesian Statistics for Biological Data: Pedigree Analysis

    ERIC Educational Resources Information Center

    Stanfield, William D.; Carlton, Matthew A.

    2004-01-01

The use of Bayes' formula is applied to the biological problem of pedigree analysis to show that Bayes' formula and non-Bayesian or "classical" methods of probability calculation give different answers. First-year college biology students can thus be introduced to Bayesian statistics.

  17. A Bayesian joint model of menstrual cycle length and fecundity.

    PubMed

    Lum, Kirsten J; Sundaram, Rajeshwari; Buck Louis, Germaine M; Louis, Thomas A

    2016-03-01

Menstrual cycle length (MCL) has been shown to play an important role in couple fecundity, which is the biologic capacity for reproduction irrespective of pregnancy intentions. However, a comprehensive assessment of its role requires a fecundity model that accounts for male and female attributes and the couple's intercourse pattern relative to the ovulation day. To this end, we employ a Bayesian joint model for MCL and pregnancy. MCLs follow a scale multiplied (accelerated) mixture model with Gaussian and Gumbel components; the pregnancy model includes MCL as a covariate and computes the cycle-specific probability of pregnancy in a menstrual cycle conditional on the pattern of intercourse and no previous fertilization. Day-specific fertilization probability is modeled using natural cubic splines. We analyze data from the Longitudinal Investigation of Fertility and the Environment Study (the LIFE Study), a couple-based prospective pregnancy study, and find a statistically significant quadratic relation between fecundity and menstrual cycle length, after adjustment for intercourse pattern and other attributes, including male semen quality, both partners' ages, and active smoking status (determined by a baseline cotinine threshold of 100 ng/mL). We compare results to those produced by a more basic model and show the advantages of a more comprehensive approach. PMID:26295923
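
    Conditional on the intercourse pattern, such fecundity models typically build on a Barrett-Marshall/Schwartz-type structure; a hedged sketch of the cycle-specific pregnancy probability is

        P(Y_{ij} = 1 | X_{ij}, no prior fertilization) = 1 - \prod_{d} (1 - p_d)^{X_{ijd}},

    where X_{ijd} indicates intercourse on day d relative to ovulation in couple i's cycle j, and the day-specific fertilization probabilities p_d are the quantities modeled here with natural cubic splines (with MCL entering as a covariate).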

  18. A Bayesian Joint Model of Menstrual Cycle Length and Fecundity

    PubMed Central

    Lum, Kirsten J.; Sundaram, Rajeshwari; Louis, Germaine M. Buck; Louis, Thomas A.

    2015-01-01

Menstrual cycle length (MCL) has been shown to play an important role in couple fecundity, which is the biologic capacity for reproduction irrespective of pregnancy intentions. However, a comprehensive assessment of its role requires a fecundity model that accounts for male and female attributes and the couple's intercourse pattern relative to the ovulation day. To this end, we employ a Bayesian joint model for MCL and pregnancy. MCLs follow a scale multiplied (accelerated) mixture model with Gaussian and Gumbel components; the pregnancy model includes MCL as a covariate and computes the cycle-specific probability of pregnancy in a menstrual cycle conditional on the pattern of intercourse and no previous fertilization. Day-specific fertilization probability is modeled using natural cubic splines. We analyze data from the Longitudinal Investigation of Fertility and the Environment Study (the LIFE Study), a couple-based prospective pregnancy study, and find a statistically significant quadratic relation between fecundity and menstrual cycle length, after adjustment for intercourse pattern and other attributes, including male semen quality, both partners' ages, and active smoking status (determined by a baseline cotinine threshold of 100 ng/mL). We compare results to those produced by a more basic model and show the advantages of a more comprehensive approach. PMID:26295923

  19. Bayesian Analysis of Underground Flooding

    NASA Astrophysics Data System (ADS)

    Bogardi, Istvan; Duckstein, Lucien; Szidarovszky, Ferenc

    1982-08-01

    An event-based stochastic model is used to describe the spatial phenomenon of water inrush into underground works located under a karstic aquifer, and a Bayesian analysis is performed because of high parameter uncertainty. The random variables of the model are inrush yield per event, distance between events, number of events per unit underground space, maximum yield, and total yield over mine lifetime. Physically based hypotheses on the types of distributions are made and reinforced by observations. High parameter uncertainty stems from the random characteristics of karstic limestone and the limited amount of observation data. Thus, during the design stage, only indirect data such as regional information and geological analogies are available; updating of this information should then be done as the construction progresses and inrush events are observed and recorded. A Bayes simulation algorithm is developed and applied to estimate the probability distributions of inrush event characteristics used in the design of water control facilities in underground mining. A real-life example in the Transdanubian region of Hungary is used to illustrate the methodology.

  20. Bayesian analysis of volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Ho, Chih-Hsiang

    1990-10-01

The simple Poisson model generally gives a good fit to many volcanoes for volcanic eruption forecasting. Nonetheless, empirical evidence suggests that volcanic activity in successive equal time-periods tends to be more variable than a simple Poisson process with constant eruptive rate. An alternative model is therefore examined in which the eruptive rate (λ) for a given volcano or cluster(s) of volcanoes is described by a gamma distribution (prior) rather than treated as a constant value, as in the assumptions of a simple Poisson model. Bayesian analysis is performed to link the two distributions together to give the aggregate behavior of the volcanic activity. When the Poisson process is expanded to accommodate a gamma mixing distribution on λ, a consequence of this mixed (or compound) Poisson model is that the frequency distribution of eruptions in any given time-period of equal length follows the negative binomial distribution (NBD). Applications of the proposed model and comparisons between the generalized model and the simple Poisson model are discussed based on the historical eruptive count data of volcanoes Mauna Loa (Hawaii) and Etna (Italy). Several relevant facts lead to the conclusion that the generalized model is preferable for practical use both in space and time.
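
    The negative binomial result follows by integrating the Poisson rate against its gamma prior; for counts over a period of length t, with \lambda ~ Gamma(\alpha, \beta):

        P(N = n) = \int_0^\infty \frac{e^{-\lambda t} (\lambda t)^n}{n!} \cdot \frac{\beta^\alpha \lambda^{\alpha-1} e^{-\beta\lambda}}{\Gamma(\alpha)} \, d\lambda
                 = \frac{\Gamma(n+\alpha)}{n! \, \Gamma(\alpha)} \left( \frac{\beta}{\beta+t} \right)^{\alpha} \left( \frac{t}{\beta+t} \right)^{n},

    so E(N) = \alpha t / \beta and Var(N) = E(N)(1 + t/\beta) > E(N), which is exactly the extra-Poisson variability the abstract attributes to a fluctuating eruptive rate.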

  1. Jointly modeling time-to-event and longitudinal data: A Bayesian approach.

    PubMed

    Huang, Yangxin; Hu, X Joan; Dagne, Getachew A

    2014-03-01

This article explores Bayesian joint models of event times and longitudinal measures with an attempt to overcome departures from normality of the longitudinal response, measurement errors, and shortages of confidence in specifying a parametric time-to-event model. We allow the longitudinal response to have a skew distribution in the presence of measurement errors, and assume the time-to-event variable to have a nonparametric prior distribution. Posterior distributions of the parameters are obtained simultaneously for inference via a Bayesian approach. An example from a recent AIDS clinical trial illustrates the methodology by jointly modeling the viral dynamics and the time to decrease in the CD4/CD8 ratio in the presence of CD4 counts with measurement errors, and by comparing potential models under various scenarios and different distribution specifications. The analysis outcome indicates that the time-varying CD4 covariate is closely related to the first-phase viral decay rate, but the time to CD4/CD8 decrease is not highly associated with either the two viral decay rates or the CD4 changing rate over time. These findings may provide some quantitative guidance to better understand the relationship of the virological and immunological responses to antiretroviral treatments. PMID:24611039

  2. Joint Bayesian Component Separation and CMB Power Spectrum Estimation

    NASA Technical Reports Server (NTRS)

    Eriksen, H. K.; Jewell, J. B.; Dickinson, C.; Banday, A. J.; Gorski, K. M.; Lawrence, C. R.

    2008-01-01

We describe and implement an exact, flexible, and computationally efficient algorithm for joint component separation and CMB power spectrum estimation, building on a Gibbs sampling framework. Two essential new features are (1) conditional sampling of foreground spectral parameters and (2) joint sampling of all amplitude-type degrees of freedom (e.g., CMB, foreground pixel amplitudes, and global template amplitudes) given spectral parameters. Given a parametric model of the foreground signals, we estimate efficiently and accurately the exact joint foreground-CMB posterior distribution and, therefore, all marginal distributions such as the CMB power spectrum or foreground spectral index posteriors. The main limitation of the current implementation is the requirement of identical beam responses at all frequencies, which restricts the analysis to the lowest resolution of a given experiment. We outline a future generalization to multiresolution observations. To verify the method, we analyze simple models and compare the results to analytical predictions. We then analyze a realistic simulation with properties similar to the 3 yr WMAP data, downgraded to a common resolution of 3 deg FWHM. The results from the actual 3 yr WMAP temperature analysis are presented in a companion Letter.

  3. A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research

    ERIC Educational Resources Information Center

    van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A. G.

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are…

  4. Bayesian analysis for kaon photoproduction

    SciTech Connect

Marsainy, T.; Mart, T.

    2014-09-25

We have investigated the contribution of nucleon resonances to the kaon photoproduction process by using an established statistical decision-making method, i.e., the Bayesian method. This method not only evaluates the model over its entire parameter space, but also takes prior information and experimental data into account. The result indicates that certain resonances have larger probabilities of contributing to the process.

  5. An Integrated Bayesian Model for DIF Analysis

    ERIC Educational Resources Information Center

    Soares, Tufi M.; Goncalves, Flavio B.; Gamerman, Dani

    2009-01-01

    In this article, an integrated Bayesian model for differential item functioning (DIF) analysis is proposed. The model is integrated in the sense of modeling the responses along with the DIF analysis. This approach allows DIF detection and explanation in a simultaneous setup. Previous empirical studies and/or subjective beliefs about the item…

  6. Heterogeneous Factor Analysis Models: A Bayesian Approach.

    ERIC Educational Resources Information Center

    Ansari, Asim; Jedidi, Kamel; Dube, Laurette

    2002-01-01

    Developed Markov Chain Monte Carlo procedures to perform Bayesian inference, model checking, and model comparison in heterogeneous factor analysis. Tested the approach with synthetic data and data from a consumption emotion study involving 54 consumers. Results show that traditional psychometric methods cannot fully capture the heterogeneity in…

  7. hiHMM: Bayesian non-parametric joint inference of chromatin state maps

    PubMed Central

    Sohn, Kyung-Ah; Ho, Joshua W. K.; Djordjevic, Djordje; Jeong, Hyun-hwan; Park, Peter J.; Kim, Ju Han

    2015-01-01

    Motivation: Genome-wide mapping of chromatin states is essential for defining regulatory elements and inferring their activities in eukaryotic genomes. A number of hidden Markov model (HMM)-based methods have been developed to infer chromatin state maps from genome-wide histone modification data for an individual genome. To perform a principled comparison of evolutionarily distant epigenomes, we must consider species-specific biases such as differences in genome size, strength of signal enrichment and co-occurrence patterns of histone modifications. Results: Here, we present a new Bayesian non-parametric method called hierarchically linked infinite HMM (hiHMM) to jointly infer chromatin state maps in multiple genomes (different species, cell types and developmental stages) using genome-wide histone modification data. This flexible framework provides a new way to learn a consistent definition of chromatin states across multiple genomes, thus facilitating a direct comparison among them. We demonstrate the utility of this method using synthetic data as well as multiple modENCODE ChIP-seq datasets. Conclusion: The hierarchical and Bayesian non-parametric formulation in our approach is an important extension to the current set of methodologies for comparative chromatin landscape analysis. Availability and implementation: Source codes are available at https://github.com/kasohn/hiHMM. Chromatin data are available at http://encode-x.med.harvard.edu/data_sets/chromatin/. Contact: peter_park@harvard.edu or juhan@snu.ac.kr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25725496

  8. Bayesian and Frequentist Methods for Estimating Joint Uncertainty of Freundlich Adsorption Isotherm Fitting Parameters

    EPA Science Inventory

    In this paper, we present methods for estimating Freundlich isotherm fitting parameters (K and N) and their joint uncertainty, which have been implemented into the freeware software platforms R and WinBUGS. These estimates were determined by both Frequentist and Bayesian analyse...
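
    As a point of reference, the Freundlich isotherm is q = K C^N. A minimal frequentist sketch of fitting K and N and their joint covariance in Python (the data values are made up for illustration; the EPA implementation referenced above is in R and WinBUGS):

        import numpy as np
        from scipy.optimize import curve_fit

        def freundlich(C, K, N):
            # Freundlich isotherm: sorbed concentration as a power law of aqueous concentration
            return K * C**N

        C = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # aqueous concentration (illustrative)
        q = np.array([1.2, 1.9, 3.1, 4.8, 7.6])   # sorbed concentration (illustrative)

        (K, N), pcov = curve_fit(freundlich, C, q, p0=[1.0, 0.8])
        print(f"K = {K:.2f}, N = {N:.2f}")
        print("joint covariance of (K, N):\n", pcov)  # frequentist analogue of the joint posterior spread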

  9. A SAS Interface for Bayesian Analysis with WinBUGS

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; McArdle, John J.; Wang, Lijuan; Hamagami, Fumiaki

    2008-01-01

    Bayesian methods are becoming very popular despite some practical difficulties in implementation. To assist in the practical application of Bayesian methods, we show how to implement Bayesian analysis with WinBUGS as part of a standard set of SAS routines. This implementation procedure is first illustrated by fitting a multiple regression model…

  10. A note on variational Bayesian factor analysis.

    PubMed

    Zhao, Jian-hua; Yu, Philip L H

    2009-09-01

Existing works on the variational Bayesian (VB) treatment of the factor analysis (FA) model, such as [Ghahramani, Z., & Beal, M. (2000). Variational inference for Bayesian mixture of factor analysers. In Advances in neural information processing systems. Cambridge, MA: MIT Press; Nielsen, F. B. (2004). Variational approach to factor analysis and related models. Master's thesis, The Institute of Informatics and Mathematical Modelling, Technical University of Denmark.], have been found, theoretically and empirically, to suffer from two problems: (1) they penalize the model more heavily than BIC, and (2) they perform unsatisfactorily in low-noise cases because redundant factors cannot be effectively suppressed. A novel VB treatment is proposed in this paper to resolve the two problems, and a simulation study is conducted to demonstrate its improved performance over existing treatments. PMID:19135337

  11. Semiparametric Bayesian inference on skew-normal joint modeling of multivariate longitudinal and survival data.

    PubMed

    Tang, An-Min; Tang, Nian-Sheng

    2015-02-28

    We propose a semiparametric multivariate skew-normal joint model for multivariate longitudinal and multivariate survival data. One main feature of the posited model is that we relax the commonly used normality assumption for random effects and within-subject error by using a centered Dirichlet process prior to specify the random effects distribution and using a multivariate skew-normal distribution to specify the within-subject error distribution and model trajectory functions of longitudinal responses semiparametrically. A Bayesian approach is proposed to simultaneously obtain Bayesian estimates of unknown parameters, random effects and nonparametric functions by combining the Gibbs sampler and the Metropolis-Hastings algorithm. Particularly, a Bayesian local influence approach is developed to assess the effect of minor perturbations to within-subject measurement error and random effects. Several simulation studies and an example are presented to illustrate the proposed methodologies. PMID:25404574

  12. Bayesian Analysis of Individual Level Personality Dynamics

    PubMed Central

    Cripps, Edward; Wood, Robert E.; Beckmann, Nadin; Lau, John; Beckmann, Jens F.; Cripps, Sally Ann

    2016-01-01

    A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques. PMID:27486415

  13. Bayesian Analysis of Individual Level Personality Dynamics.

    PubMed

    Cripps, Edward; Wood, Robert E; Beckmann, Nadin; Lau, John; Beckmann, Jens F; Cripps, Sally Ann

    2016-01-01

    A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques. PMID:27486415

  14. Bayesian model selection analysis of WMAP3

    SciTech Connect

    Parkinson, David; Mukherjee, Pia; Liddle, Andrew R.

    2006-06-15

We present a Bayesian model selection analysis of WMAP3 data using our code CosmoNest. We focus on the density perturbation spectral index n_S and the tensor-to-scalar ratio r, which define the plane of slow-roll inflationary models. We find that while the Bayesian evidence supports the conclusion that n_S ≠ 1, the data are not yet powerful enough to do so at a strong or decisive level. If tensors are assumed absent, the current odds are approximately 8 to 1 in favor of n_S ≠ 1 under our assumptions, when WMAP3 data is used together with external data sets. WMAP3 data on its own is unable to distinguish between the two models. Further, inclusion of r as a parameter weakens the conclusion against the Harrison-Zel'dovich case (n_S = 1, r = 0), albeit in a prior-dependent way. In appendices we describe the CosmoNest code in detail, noting its ability to supply posterior samples as well as to accurately compute the Bayesian evidence. We make a first public release of CosmoNest, now available at www.cosmonest.org.
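
    The "odds" quoted above are posterior odds built from Bayes factors, i.e. ratios of evidences

        Z_i = \int L_i(\theta) \, \pi_i(\theta) \, d\theta,   \qquad \text{posterior odds} = \frac{Z_0}{Z_1} \times \text{prior odds};

    odds of 8 to 1 correspond to \ln B \approx 2.1, which on commonly used Jeffreys-type scales counts as significant but falls short of the "strong" (\ln B > 2.5) and "decisive" (\ln B > 5) thresholds the abstract alludes to. CosmoNest computes the evidences Z_i by nested sampling.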

  15. Bayesian Approach to the Joint Inversion of Gravity and Magnetic Data, with Application to the Ismenius Area of Mars

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey B.; Raymond, C.; Smrekar, S.; Millbury, C.

    2004-01-01

    This viewgraph presentation reviews a Bayesian approach to the inversion of gravity and magnetic data with specific application to the Ismenius Area of Mars. Many inverse problems encountered in geophysics and planetary science are well known to be non-unique (i.e. inversion of gravity the density structure of a body). In hopes of reducing the non-uniqueness of solutions, there has been interest in the joint analysis of data. An example is the joint inversion of gravity and magnetic data, with the assumption that the same physical anomalies generate both the observed magnetic and gravitational anomalies. In this talk, we formulate the joint analysis of different types of data in a Bayesian framework and apply the formalism to the inference of the density and remanent magnetization structure for a local region in the Ismenius area of Mars. The Bayesian approach allows prior information or constraints in the solutions to be incorporated in the inversion, with the "best" solutions those whose forward predictions most closely match the data while remaining consistent with assumed constraints. The application of this framework to the inversion of gravity and magnetic data on Mars reveals two typical challenges - the forward predictions of the data have a linear dependence on some of the quantities of interest, and non-linear dependence on others (termed the "linear" and "non-linear" variables, respectively). For observations with Gaussian noise, a Bayesian approach to inversion for "linear" variables reduces to a linear filtering problem, with an explicitly computable "error" matrix. However, for models whose forward predictions have non-linear dependencies, inference is no longer given by such a simple linear problem, and moreover, the uncertainty in the solution is no longer completely specified by a computable "error matrix". It is therefore important to develop methods for sampling from the full Bayesian posterior to provide a complete and statistically consistent

  16. Bayesian PET image reconstruction incorporating anato-functional joint entropy

    NASA Astrophysics Data System (ADS)

    Tang, Jing; Rahmim, Arman

    2009-12-01

We developed a maximum a posteriori (MAP) reconstruction method for positron emission tomography (PET) image reconstruction incorporating magnetic resonance (MR) image information, with the joint entropy between the PET and MR image features serving as the regularization constraint. A non-parametric method was used to estimate the joint probability density of the PET and MR images. Using realistically simulated PET and MR human brain phantoms, the quantitative performance of the proposed algorithm was investigated. Incorporation of the anatomic information via this technique, after parameter optimization, was seen to dramatically improve the noise versus bias tradeoff in every region of interest, compared to the result from using conventional MAP reconstruction. In particular, hot lesions in the FDG PET image, which had no anatomical correspondence in the MR image, also had an improved contrast versus noise tradeoff.
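
    Schematically, the reconstruction maximizes a penalized likelihood of the form (a hedged sketch of the objective, up to sign and normalization conventions):

        \hat{f} = \arg\max_f \; \{ \ln p(g | f) - \beta \, H(f, a) \},   \qquad H(f, a) = - \iint p(f, a) \ln p(f, a) \, df \, da,

    where g is the measured PET data, a the co-registered MR image, and the joint density p(f, a) is estimated non-parametrically (e.g. by Parzen windowing) from the current image pair. Penalizing the joint entropy favors PET images whose joint histogram with the MR image is sharp, without forcing anatomical boundaries onto functional features.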

  17. A Bayesian mixture of semiparametric mixed-effects joint models for skewed-longitudinal and time-to-event data.

    PubMed

    Chen, Jiaqing; Huang, Yangxin

    2015-09-10

In longitudinal studies, it is of interest to investigate how repeatedly measured markers are associated with the time to an event of interest; meanwhile, the repeated measurements often exhibit the features of a heterogeneous population, non-normality, and covariates measured with error, owing to their longitudinal nature. Statistical analysis can become dramatically more complicated when longitudinal-survival data with all of these features are analyzed together. Recently, mixtures of skewed distributions have received increasing attention in the treatment of heterogeneous data involving asymmetric behaviors across subclasses, but relatively few studies accommodate heterogeneity, non-normality, and covariate measurement error simultaneously in the longitudinal-survival data setting. Under the umbrella of Bayesian inference, this article explores a finite mixture of semiparametric mixed-effects joint models with skewed distributions for longitudinal measures, in an attempt to accommodate heterogeneous characteristics, adjust for departures from normality, and correct for measurement error in a covariate, as well as to overcome shortages of confidence in specifying a time-to-event model. The Bayesian mixture joint modeling offers an appropriate avenue to estimate not only all parameters of the mixture joint models but also the probabilities of class membership. Simulation studies are conducted to assess the performance of the proposed method, and a real example is analyzed to demonstrate the methodology. The results are reported by comparing potential models under various scenarios. PMID:25924891

  18. A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research

    PubMed Central

    van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B; Neyer, Franz J; van Aken, Marcel AG

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are introduced using a simplified example. Thereafter, the advantages and pitfalls of the specification of prior knowledge are discussed. To illustrate Bayesian methods explained in this study, in a second example a series of studies that examine the theoretical framework of dynamic interactionism are considered. In the Discussion the advantages and disadvantages of using Bayesian statistics are reviewed, and guidelines on how to report on Bayesian statistics are provided. PMID:24116396

  19. Bayesian Nonparametric Models for Multiway Data Analysis.

    PubMed

    Xu, Zenglin; Yan, Feng; Qi, Yuan

    2015-02-01

Tensor decomposition is a powerful computational tool for multiway data analysis. Many popular tensor decomposition approaches, such as the Tucker decomposition and CANDECOMP/PARAFAC (CP), amount to multi-linear factorization. They are insufficient to model (i) complex interactions between data entities, (ii) various data types (e.g., missing data and binary data), and (iii) noisy observations and outliers. To address these issues, we propose tensor-variate latent nonparametric Bayesian models for multiway data analysis. We name these models InfTucker. These new models essentially conduct Tucker decomposition in an infinite feature space. Unlike classical tensor decomposition models, our new approaches handle both continuous and binary data in a probabilistic framework. Unlike previous Bayesian models on matrices and tensors, our models are based on latent Gaussian or t processes with nonlinear covariance functions. Moreover, on network data, our models reduce to nonparametric stochastic blockmodels and can be used to discover latent groups and predict missing interactions. To learn the models efficiently from data, we develop a variational inference technique and explore properties of the Kronecker product for computational efficiency. Compared with a classical variational implementation, this technique reduces both time and space complexities by several orders of magnitude. On real multiway and network data, our new models achieved significantly higher prediction accuracy than state-of-the-art tensor decomposition methods and blockmodels. PMID:26353255
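
    For orientation, the classical Tucker decomposition of a three-way tensor is the multilinear map

        \mathcal{X} \approx \mathcal{G} \times_1 U^{(1)} \times_2 U^{(2)} \times_3 U^{(3)},   \qquad x_{ijk} = \sum_{pqr} g_{pqr} \, u^{(1)}_{ip} u^{(2)}_{jq} u^{(3)}_{kr};

    InfTucker replaces this fixed multilinear form with latent Gaussian (or t) processes over the factors, so interactions between entities can be nonlinear and the observation model can be swapped (Gaussian for continuous data, Bernoulli-type for binary data) within one probabilistic framework.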

  20. Bayesian analysis of factors associated with fibromyalgia syndrome subjects

    NASA Astrophysics Data System (ADS)

    Jayawardana, Veroni; Mondal, Sumona; Russek, Leslie

    2015-01-01

Factors contributing to movement-related fear in subjects with fibromyalgia (FM) were assessed by Russek et al. (2014), based on data collected through a national internet survey of community-based individuals. The study focused on the variables Activities-Specific Balance Confidence scale (ABC), Primary Care Post-Traumatic Stress Disorder screen (PC-PTSD), Tampa Scale of Kinesiophobia (TSK), a Joint Hypermobility Syndrome screen (JHS), Vertigo Symptom Scale (VSS-SF), Obsessive-Compulsive Personality Disorder (OCPD), and pain, work status and physical activity drawn from the Revised Fibromyalgia Impact Questionnaire (FIQR). The study presented in this paper revisits the same data with a Bayesian analysis in which appropriate priors are introduced for the variables selected in Russek's paper.

  1. Multimodel Bayesian analysis of groundwater data worth

    NASA Astrophysics Data System (ADS)

    Xue, Liang; Zhang, Dongxiao; Guadagnini, Alberto; Neuman, Shlomo P.

    2014-11-01

    We explore the way in which uncertain descriptions of aquifer heterogeneity and groundwater flow impact one's ability to assess the worth of collecting additional data. We do so on the basis of Maximum Likelihood Bayesian Model Averaging (MLBMA) by accounting jointly for uncertainties in geostatistical and flow model structures and parameter (hydraulic conductivity) as well as system state (hydraulic head) estimates, given uncertain measurements of one or both variables. Previous description of our approach was limited to geostatistical models based solely on hydraulic conductivity data. Here we implement the approach on a synthetic example of steady state flow in a two-dimensional random log hydraulic conductivity field with and without recharge by embedding an inverse stochastic moment solution of groundwater flow in MLBMA. A moment-equations-based geostatistical inversion method is utilized to circumvent the need for computationally expensive numerical Monte Carlo simulations. The approach is compatible with either deterministic or stochastic flow models and consistent with modern statistical methods of parameter estimation, admitting but not requiring prior information about the parameters. It allows but does not require approximating lead predictive statistical moments of system states by linearization while updating model posterior probabilities and parameter estimates on the basis of potential new data both before and after such data are actually collected.
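
    In MLBMA the posterior model weights are computed from Kashyap's information criterion (KIC); schematically,

        p(M_k | D) = \frac{ p(M_k) \exp(-\tfrac{1}{2}\Delta KIC_k) }{ \sum_l p(M_l) \exp(-\tfrac{1}{2}\Delta KIC_l) },   \qquad E[\Delta | D] = \sum_k E[\Delta | M_k, D] \, p(M_k | D),

    so the data-worth question becomes how prospective measurements are expected to sharpen both the within-model predictive moments E[\Delta | M_k, D] and the spread of the weights p(M_k | D) across the candidate geostatistical and flow models.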

  2. Optimal sequential Bayesian analysis for degradation tests.

    PubMed

    Rodríguez-Narciso, Silvia; Christen, J Andrés

    2016-07-01

Degradation tests are especially difficult to conduct for items with high reliability. Test costs, driven mainly by prolonged test duration and item destruction, make sequential degradation test designs necessary. We propose a methodology that sequentially selects the optimal observation times to measure the degradation, using a convenient rule that maximizes the inference precision and minimizes test costs. In particular, our objective is to estimate a quantile of the time-to-failure distribution, where the degradation process is modelled as a linear model using Bayesian inference. The proposed sequential analysis is based on an index that measures the expected discrepancy between the estimated quantile and its corresponding prediction, using Monte Carlo methods. The procedure was successfully implemented for simulated and real data. PMID:26307336

  3. A Hierarchical Bayesian Procedure for Two-Mode Cluster Analysis

    ERIC Educational Resources Information Center

    DeSarbo, Wayne S.; Fong, Duncan K. H.; Liechty, John; Saxton, M. Kim

    2004-01-01

    This manuscript introduces a new Bayesian finite mixture methodology for the joint clustering of row and column stimuli/objects associated with two-mode asymmetric proximity, dominance, or profile data. That is, common clusters are derived which partition both the row and column stimuli/objects simultaneously into the same derived set of clusters.…

  4. Bayesian analysis on gravitational waves and exoplanets

    NASA Astrophysics Data System (ADS)

    Deng, Xihao

Attempts to detect gravitational waves using a pulsar timing array (PTA), i.e., a collection of pulsars in our Galaxy, have become more organized over the last several years. PTAs act to detect gravitational waves generated from very distant sources by observing the small and correlated effect the waves have on pulse arrival times at the Earth. In this thesis, I present advanced Bayesian analysis methods that can be used to search for gravitational waves in pulsar timing data. These methods were also applied to analyze a set of radial velocity (RV) data collected by the Hobby-Eberly Telescope on observing a K0 giant star. They confirmed the presence of two Jupiter-mass planets around the K0 giant star and also characterized the stellar p-mode oscillation. The first part of the thesis investigates the effect of wavefront curvature on a pulsar's response to a gravitational wave. In it we show that we can assume the gravitational wave phasefront is planar across the array only if the source luminosity distance ≫ 2πL²/λ, where L is the pulsar distance to the Earth (~ kpc) and λ is the radiation wavelength (~ pc) in the PTA waveband. Correspondingly, for a point gravitational wave source closer than ~100 Mpc, we should take into account the effect of wavefront curvature across the pulsar-Earth line of sight, which depends on the luminosity distance to the source, when evaluating the pulsar timing response. As a consequence, if a PTA can detect a gravitational wave from a source closer than ~100 Mpc, the effect of wavefront curvature on the response allows us to determine the source luminosity distance. The second and third parts of the thesis propose a new analysis method based on Bayesian nonparametric regression to search for gravitational wave bursts and a gravitational wave background in PTA data. Unlike the conventional Bayesian analysis that introduces a signal model with a fixed number of parameters, Bayesian nonparametric regression sets
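
    To put numbers on the plane-wave condition (a worked example consistent with the scales quoted above): for a pulsar at L = 4 kpc observing radiation with λ = 1 pc,

        2\pi L^2 / \lambda = 2\pi \, (4 \times 10^3 \,\mathrm{pc})^2 / (1 \,\mathrm{pc}) \approx 1.0 \times 10^8 \,\mathrm{pc} \approx 100 \,\mathrm{Mpc},

    so any source inside roughly 100 Mpc violates the planar-phasefront approximation for kpc-distance pulsars, which is precisely why detecting such a source lets the wavefront curvature encode its luminosity distance.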

  5. Evaluation of a Partial Genome Screening of Two Asthma Susceptibility Regions Using Bayesian Network Based Bayesian Multilevel Analysis of Relevance

    PubMed Central

    Antal, Péter; Kiszel, Petra Sz.; Gézsi, András; Hadadi, Éva; Virág, Viktor; Hajós, Gergely; Millinghoffer, András; Nagy, Adrienne; Kiss, András; Semsei, Ágnes F.; Temesi, Gergely; Melegh, Béla; Kisfali, Péter; Széll, Márta; Bikov, András; Gálffy, Gabriella; Tamási, Lilla; Falus, András; Szalai, Csaba

    2012-01-01

    Genetic studies indicate a high number of potential factors related to asthma. Based on earlier linkage analyses we selected the 11q13 and 14q22 asthma susceptibility regions, for which we designed a partial genome screening study using 145 SNPs in 1201 individuals (436 asthmatic children and 765 controls). The results were evaluated with traditional frequentist methods and we applied a new statistical method, called Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA). This method uses Bayesian network representation to provide detailed characterization of the relevance of factors, such as joint significance, the type of dependency, and multi-target aspects. We estimated posteriors for these relations within the Bayesian statistical framework, in order to assess whether a variable is directly relevant or whether its association is only mediated. With frequentist methods one SNP (rs3751464 in the FRMD6 gene) provided evidence for an association with asthma (OR = 1.43 (1.2–1.8); p = 3×10^-4). The possible role of the FRMD6 gene in asthma was also confirmed in an animal model and in human asthmatics. In the BN-BMLA analysis altogether 5 SNPs in 4 genes were found relevant in connection with the asthma phenotype: PRPF19 on chromosome 11, and FRMD6, PTGER2 and PTGDR on chromosome 14. In a subsequent step a partial dataset containing rhinitis and further clinical parameters was used, which allowed the analysis of the relevance of SNPs for asthma and multiple targets. These analyses suggested that SNPs in the AHNAK and MS4A2 genes were indirectly associated with asthma. This paper indicates that BN-BMLA explores the relevant factors more comprehensively than traditional statistical methods and extends the scope of strong-relevance-based methods to include partial relevance, global characterization of relevance and multi-target relevance. PMID:22432035

  6. A Bayesian Hierarchical Approach to Regional Frequency Analysis of Extremes

    NASA Astrophysics Data System (ADS)

    Renard, B.

    2010-12-01

    Rainfall and runoff frequency analysis is a major issue for the hydrological community. The distribution of hydrological extremes varies in space and possibly in time. Describing and understanding this spatiotemporal variability are primary challenges to improve hazard quantification and risk assessment. This presentation proposes a general approach based on a Bayesian hierarchical model, following previous work by Cooley et al. [2007], Micevski [2007], Aryal et al. [2009] or Lima and Lall [2009; 2010]. Such a hierarchical model is made up of two levels: (1) a data level modeling the distribution of observations, and (2) a process level describing the fluctuation of the distribution parameters in space and possibly in time. At the first level of the model, at-site data (e.g., annual maxima series) are modeled with a chosen distribution (e.g., a GEV distribution). Since data from several sites are considered, the joint distribution of a vector of (spatial) observations needs to be derived. This is challenging because data are in general not spatially independent, especially for nearby sites. An elliptical copula is therefore used to formally account for spatial dependence between at-site data. This choice might be questionable in the context of extreme value distributions. However, it is motivated by its applicability in highly dimensional spatial problems, where the joint pdf of a vector of n observations is required to derive the likelihood function (with n possibly amounting to hundreds of sites). At the second level of the model, parameters of the chosen at-site distribution are then modeled by a Gaussian spatial process, whose mean may depend on covariates (e.g. elevation, distance to sea, weather pattern, time). In particular, this spatial process allows estimating parameters at ungauged sites, and deriving the predictive distribution of rainfall/runoff at every pixel/catchment of the studied domain. An application to extreme rainfall series from the French
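
    A minimal sketch of this two-level structure is given below, assuming GEV at-site distributions and a Gaussian process-level regression of the GEV location parameter on a covariate; the elliptical copula for spatial dependence is omitted for brevity, and all names and values are invented for illustration.

    ```python
    # Sketch of a two-level hierarchical model: at-site annual maxima follow a
    # GEV(mu_s, sigma, xi); the locations mu_s follow a Gaussian process-level
    # regression on a covariate (elevation). The copula used in the presented
    # work to couple sites is omitted here for brevity.
    import numpy as np
    from scipy import stats

    def log_posterior(params, maxima_by_site, elevation):
        n = len(maxima_by_site)
        mu = params[:n]                                   # site-specific GEV locations
        log_sigma, xi, beta0, beta1, log_tau = params[n:]
        sigma, tau = np.exp(log_sigma), np.exp(log_tau)

        # Data level; scipy's shape is c = -xi relative to the common GEV convention.
        lp = sum(stats.genextreme.logpdf(x, c=-xi, loc=m, scale=sigma).sum()
                 for x, m in zip(maxima_by_site, mu))
        # Process level: fluctuation of the location parameter in space.
        lp += stats.norm.logpdf(mu, loc=beta0 + beta1 * elevation, scale=tau).sum()
        return lp                                          # flat priors elsewhere

    maxima = [np.array([31.0, 44.0, 38.0]), np.array([55.0, 61.0, 49.0])]
    elev = np.array([100.0, 800.0])
    p0 = np.array([40.0, 55.0, np.log(8.0), 0.1, 35.0, 0.02, np.log(5.0)])
    print(log_posterior(p0, maxima, elev))   # feed this to any MCMC sampler
    ```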

  7. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.

  8. Transdimensional Bayesian Joint Inversion of Complementary Seismic Observables with Realistic Data Uncertainties

    NASA Astrophysics Data System (ADS)

    Gao, C.; Lekic, V.

    2014-12-01

    Due to their different and complementary sensitivities to structure, multiple seismic observables are often combined to image the Earth's deep interior. We use a reversible jump Markov chain Monte Carlo (rjMCMC) algorithm to incorporate surface wave dispersion, particle motion ellipticity (H/Z ratio), and receiver functions into transdimensional, Bayesian inversion for the profiles of shear velocity (Vs), compressional velocity (Vp), and density beneath a seismic station. While traditional inversion approaches seek a single best-fit model, a Bayesian approach yields an ensemble of models, allowing us to fully quantify uncertainty and trade-offs between model parameters. Furthermore, we show that by treating the number of model parameters as an unknown to be estimated from the data, we both eliminate the need for a fixed parameterization based on prior information, and obtain better model estimates with reduced trade-offs. Optimal weighting of disparate datasets is paramount for maximizing the resolving power of joint inversions. In a Bayesian framework, data uncertainty directly determines the variance of the model posterior probability distribution; therefore, the characteristics of the uncertainties on the observables become even more important in the inversion (Bodin et al., 2011). To properly account for the noise characteristics of the different seismic observables, we compute covariance matrices of data errors for each data type by generating realistic synthetic noise using noise covariance matrices computed from thousands of noise samples, and then measuring the seismic observables of interest from synthetic waveforms contaminated by many different realizations of noise. We find large non-diagonal terms in the covariance matrices for different data types, indicating that typical assumptions of uncorrelated data errors are unjustified. We quantify how the use of realistic data covariance matrices in the joint inversion affects the retrieval of seismic structure under

  9. Bayesian data analysis in population ecology: motivations, methods, and benefits

    USGS Publications Warehouse

    Dorazio, Robert

    2016-01-01

    During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.

  10. Learning Bayesian networks from big meteorological spatial datasets. An alternative to complex network analysis

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Jose Manuel; San Martín, Daniel; Herrera, Sixto; Santiago Cofiño, Antonio

    2016-04-01

    The growing availability of spatial datasets (observations, reanalysis, and regional and global climate models) demands efficient multivariate spatial modeling techniques for many problems of interest (e.g. teleconnection analysis, multi-site downscaling, etc.). Complex networks have been recently applied in this context using graphs built from pairwise correlations between the different stations (or grid boxes) forming the dataset. However, this analysis does not take into account the full dependence structure underlying the data, given by all possible marginal and conditional dependencies among the stations, and does not allow a probabilistic analysis of the dataset. In this talk we introduce Bayesian networks as an alternative data-driven technique for multivariate analysis and modeling, which allows building a joint probability distribution of the stations that includes all relevant dependencies in the dataset. Bayesian networks are a sound machine learning technique that uses a graph to (1) encode the main dependencies among the variables and (2) obtain a factorization of the joint probability distribution of the stations given by a reduced number of parameters. For a particular problem, the resulting graph provides a qualitative analysis of the spatial relationships in the dataset (an alternative to complex network analysis), and the resulting model allows for a probabilistic analysis of the dataset. Bayesian networks have been widely applied in many fields, but their use in climate problems is hampered by the large number of variables (stations) involved, since the complexity of the existing algorithms for learning the graphical structure from data grows nonlinearly with the number of variables. In this contribution we present a modified local learning algorithm for Bayesian networks adapted to this problem, which allows inferring the graphical structure for thousands of stations (from observations) and/or gridboxes (from model simulations) thus providing new
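
    To make the factorization concrete, here is a minimal sketch with three binary "stations" and invented conditional probability tables; real networks over thousands of stations are learned from data, as described above.

    ```python
    # Sketch of the factorization a Bayesian network encodes:
    # P(A, B, C) = P(A) * P(B | A) * P(C | A) for the toy DAG A -> B, A -> C.
    # This needs 5 parameters instead of the 2**3 - 1 = 7 of a full joint table.
    from itertools import product

    parents = {"A": [], "B": ["A"], "C": ["A"]}
    cpt = {                                   # P(var = 1 | parent values), invented
        "A": {(): 0.3},
        "B": {(0,): 0.2, (1,): 0.7},
        "C": {(0,): 0.1, (1,): 0.6},
    }

    def joint(assign):
        prob = 1.0
        for var in ["A", "B", "C"]:
            pa = tuple(assign[q] for q in parents[var])
            p1 = cpt[var][pa]
            prob *= p1 if assign[var] == 1 else 1.0 - p1
        return prob

    print(joint({"A": 1, "B": 0, "C": 1}))    # one cell of the joint: 0.3*0.3*0.6
    total = sum(joint(dict(zip("ABC", v))) for v in product((0, 1), repeat=3))
    print(total)                              # sums to 1.0: a valid distribution
    ```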

  11. Ockham's razor and Bayesian analysis. [statistical theory for systems evaluation

    NASA Technical Reports Server (NTRS)

    Jefferys, William H.; Berger, James O.

    1992-01-01

    'Ockham's razor', the ad hoc principle enjoining the greatest possible simplicity in theoretical explanations, is presently shown to be justifiable as a consequence of Bayesian inference; Bayesian analysis can, moreover, clarify the nature of the 'simplest' hypothesis consistent with the given data. By choosing the prior probabilities of hypotheses, it becomes possible to quantify the scientific judgment that simpler hypotheses are more likely to be correct. Bayesian analysis also shows that a hypothesis with fewer adjustable parameters intrinsically possesses an enhanced posterior probability, due to the clarity of its predictions.

  12. Enhancing the Modeling of PFOA Pharmacokinetics with Bayesian Analysis

    EPA Science Inventory

    The detail sufficient to describe the pharmacokinetics (PK) for perfluorooctanoic acid (PFOA) and the methods necessary to combine information from multiple data sets are both subjects of ongoing investigation. Bayesian analysis provides tools to accommodate these goals. We exa...

  13. Bayesian analysis of the backreaction models

    SciTech Connect

    Kurek, Aleksandra; Bolejko, Krzysztof; Szydlowski, Marek

    2010-03-15

    We present a Bayesian analysis of four different types of backreaction models, which are based on the Buchert equations. In this approach, one considers a solution to the Einstein equations for a general matter distribution and then an average of various observable quantities is taken. Such an approach became of considerable interest when it was shown that it could lead to agreement with observations without resorting to dark energy. In this paper we compare the ΛCDM model and the backreaction models with type Ia supernovae, baryon acoustic oscillations, and cosmic microwave background data, and find that the former is favored. However, the tested models were based on some particular assumptions about the relation between the average spatial curvature and the backreaction, as well as the relation between the curvature and curvature index. In this paper we modified the latter assumption, leaving the former unchanged. We find that, by varying the relation between the curvature and curvature index, we can obtain a better fit. Therefore, some further work is still needed; in particular, the relation between the backreaction and the curvature should be revisited in order to fully determine the feasibility of the backreaction models to mimic dark energy.

  14. Bayesian mixture analysis for metagenomic community profiling

    PubMed Central

    Morfopoulou, Sofia; Plagnol, Vincent

    2015-01-01

    Motivation: Deep sequencing of clinical samples is now an established tool for the detection of infectious pathogens, with direct medical applications. The large amount of data generated produces an opportunity to detect species even at very low levels, provided that computational tools can effectively profile the relevant metagenomic communities. Data interpretation is complicated by the fact that short sequencing reads can match multiple organisms and by the lack of completeness of existing databases, in particular for viral pathogens. Here we present metaMix, a Bayesian mixture model framework for resolving complex metagenomic mixtures. We show that the use of parallel Markov chain Monte Carlo (MCMC) chains for the exploration of the species space enables the identification of the set of species most likely to contribute to the mixture. Results: We demonstrate the greater accuracy of metaMix compared with relevant methods, particularly for profiling complex communities consisting of several related species. We designed metaMix specifically for the analysis of deep transcriptome sequencing datasets, with a focus on viral pathogen detection; however, the principles are generally applicable to all types of metagenomic mixtures. Availability and implementation: metaMix is implemented as a user friendly R package, freely available on CRAN: http://cran.r-project.org/web/packages/metaMix Contact: sofia.morfopoulou.10@ucl.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26002885

  15. Bayesian Analysis of the Cosmic Microwave Background

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey

    2007-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background! Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. Bayesian formulation of this problem allows consistent treatment of many complexities, including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analyses (as demonstrated on WMAP 1- and 3-year temperature and polarization data). Development continues for Planck; the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to total uncertainty in cosmological parameters.

  16. Bayesian analysis of a disability model for lung cancer survival.

    PubMed

    Armero, C; Cabras, S; Castellanos, M E; Perra, S; Quirós, A; Oruezábal, M J; Sánchez-Rubio, J

    2016-02-01

    Bayesian reasoning, survival analysis and multi-state models are used to assess survival times for Stage IV non-small-cell lung cancer patients and the evolution of the disease over time. Bayesian estimation is done using minimum informative priors for the Weibull regression survival model, leading to an automatic inferential procedure. Markov chain Monte Carlo methods have been used for approximating posterior distributions and the Bayesian information criterion has been considered for covariate selection. In particular, the posterior distribution of the transition probabilities, resulting from the multi-state model, constitutes a very interesting tool which could be useful to help oncologists and patients make efficient and effective decisions. PMID:22767866

  17. A fully Bayesian method for jointly fitting instrumental calibration and X-ray spectral models

    SciTech Connect

    Xu, Jin; Yu, Yaming; Van Dyk, David A.; Kashyap, Vinay L.; Siemiginowska, Aneta; Drake, Jeremy; Ratzlaff, Pete; Connors, Alanna; Meng, Xiao-Li

    2014-10-20

    Owing to a lack of robust principled methods, systematic instrumental uncertainties have generally been ignored in astrophysical data analysis despite wide recognition of the importance of including them. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. We previously introduced a pragmatic Bayesian method to address this problem. The method is 'pragmatic' in that it introduced an ad hoc technique that simplified computation by neglecting the potential information in the data for narrowing the uncertainty for the calibration product. Following that work, we use a principal component analysis to efficiently represent the uncertainty of the effective area of an X-ray (or γ-ray) telescope. Here, however, we leverage this representation to enable a principled, fully Bayesian method that coherently accounts for the calibration uncertainty in high-energy spectral analysis. In this setting, the method is compared with standard analysis techniques and the pragmatic Bayesian method. The advantage of the fully Bayesian method is that it allows the data to provide information not only for estimation of the source parameters but also for the calibration product—here the effective area, conditional on the adopted spectral model. In this way, it can yield more accurate and efficient estimates of the source parameters along with valid estimates of their uncertainty. Provided that the source spectrum can be accurately described by a parameterized model, this method allows rigorous inference about the effective area by quantifying which possible curves are most consistent with the data.

  18. Joint Segmentation and Deconvolution of Ultrasound Images Using a Hierarchical Bayesian Model Based on Generalized Gaussian Priors.

    PubMed

    Zhao, Ningning; Basarab, Adrian; Kouame, Denis; Tourneret, Jean-Yves

    2016-08-01

    This paper proposes a joint segmentation and deconvolution Bayesian method for medical ultrasound (US) images. Contrary to piecewise homogeneous images, US images exhibit heavy characteristic speckle patterns correlated with the tissue structures. The generalized Gaussian distribution (GGD) has been shown to be one of the most relevant distributions for characterizing the speckle in US images. Thus, we propose a GGD-Potts model defined by a label map coupling US image segmentation and deconvolution. The Bayesian estimators of the unknown model parameters, including the US image, the label map, and all the hyperparameters, are difficult to express in closed form. Thus, we investigate a Gibbs sampler to generate samples distributed according to the posterior of interest. These generated samples are finally used to compute the Bayesian estimators of the unknown parameters. The performance of the proposed Bayesian model is compared with existing approaches via several experiments conducted on realistic synthetic data and in vivo US images. PMID:27187959

  19. Semiparametric Bayesian joint modeling of a binary and continuous outcome with applications in toxicological risk assessment.

    PubMed

    Hwang, Beom Seuk; Pennell, Michael L

    2014-03-30

    Many dose-response studies collect data on correlated outcomes. For example, in developmental toxicity studies, uterine weight and presence of malformed pups are measured on the same dam. Joint modeling can result in more efficient inferences than independent models for each outcome. Most methods for joint modeling assume standard parametric response distributions. However, in toxicity studies, it is possible that response distributions vary in location and shape with dose, which may not be easily captured by standard models. To address this issue, we propose a semiparametric Bayesian joint model for a binary and continuous response. In our model, a kernel stick-breaking process prior is assigned to the distribution of a random effect shared across outcomes, which allows flexible changes in distribution shape with dose. The model also includes outcome-specific fixed effects to allow different location effects. In simulation studies, we found that the proposed model provides accurate estimates of toxicological risk when the data do not satisfy the assumptions of standard parametric models. We apply our method to data from a developmental toxicity study of ethylene glycol diethyl ether. PMID:24123309

  20. Bayesian analysis of MEG visual evoked responses

    SciTech Connect

    Schmidt, D.M.; George, J.S.; Wood, C.C.

    1999-04-01

    The authors developed a method for analyzing neural electromagnetic data that allows probabilistic inferences to be drawn about regions of activation. The method involves the generation of a large number of possible solutions which both fit the data and conform to prior expectations about the nature of probable solutions, made explicit by a Bayesian formalism. In addition, they have introduced a model for the current distributions that produce MEG (and EEG) data that allows extended regions of activity, and can easily incorporate prior information such as anatomical constraints from MRI. To evaluate the feasibility and utility of the Bayesian approach with actual data, they analyzed MEG data from a visual evoked response experiment. They compared Bayesian analyses of MEG responses to visual stimuli in the left and right visual fields, in order to examine the sensitivity of the method to detect known features of human visual cortex organization. They also examined the changing pattern of cortical activation as a function of time.

  1. A Joint Bayesian Inversion for Glacial Isostatic Adjustment in North America and Greenland

    NASA Astrophysics Data System (ADS)

    Davis, J. L.; Wang, L.

    2014-12-01

    We have previously presented joint inversions of geodetic data for glacial isostatic adjustment (GIA) fields that employ a Bayesian framework for the combination of data and models. Data sets used include GNSS, GRACE gravity, and tide-gauge data, in order to estimate three-dimensional crustal deformation, geoid rate, and relative sea-level change (RSLC). The benefit of this approach is that solutions are less dependent on any particular Earth/ice model used to calculate the GIA fields, and instead employ a suite of GIA predictions that are then used to calculate statistical constraints. This approach was used both for the determination of the SNARF geodetic reference frame for North America, and for a study of GIA in Fennoscandia (Hill et al., 2010). One challenge to the method we developed is that the inherent reduction in resolution of, and correlation among, GRACE Stokes coefficients caused by the destriping procedure (Swenson and Wahr, 2006; Duan et al., 2009) was not accounted for. This important obstacle has been overcome by developing a Bayesian approach to destriping (Wang et al., in prep.). However, important issues of mixed resolution of these data types still remain. In this presentation, we report on the progress of this effort, and present a new GIA field for North America. For the first time, the region used in the solution includes Greenland, in order to provide internally consistent solutions for GIA, the spatial and temporal variability of present-day sea-level change, and present-day melting in Greenland.

  2. Quantifying predictive uncertainty of streamflow forecasts based on a Bayesian joint probability model

    NASA Astrophysics Data System (ADS)

    Zhao, Tongtiegang; Wang, Q. J.; Bennett, James C.; Robertson, David E.; Shao, Quanxi; Zhao, Jianshi

    2015-09-01

    Uncertainty is inherent in streamflow forecasts and is an important determinant of the utility of forecasts for water resources management. However, predictions by deterministic models provide only single values without uncertainty attached. This study presents a method for using a Bayesian joint probability (BJP) model to post-process deterministic streamflow forecasts by quantifying predictive uncertainty. The BJP model comprises a log-sinh transformation that normalises hydrological data, and a bivariate Gaussian distribution that characterises the dependence relationship. The parameters of the transformation and the distribution are estimated through Bayesian inference with a Markov chain Monte Carlo (MCMC) algorithm. The BJP model produces, from a raw deterministic forecast, an ensemble of values to represent forecast uncertainty. The model is applied to raw deterministic forecasts of inflows to the Three Gorges Reservoir in China as a case study. The heteroscedasticity and non-Gaussianity of forecast uncertainty are effectively addressed. The ensemble spread accounts for the forecast uncertainty and leads to considerable improvement in terms of the continuous ranked probability score. The forecasts become less accurate as lead time increases, and the ensemble spread provides reliable information on the forecast uncertainty. We conclude that the BJP model is a useful tool for quantifying predictive uncertainty when post-processing deterministic streamflow forecasts.
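
    The log-sinh transformation has a simple closed form. The sketch below shows the forward and inverse maps, assuming the standard log-sinh form z = log(sinh(a + b*y))/b; the parameter values are illustrative, since the BJP model estimates them by MCMC.

    ```python
    # Sketch of the log-sinh normalising transformation and its inverse.
    # Parameter values are illustrative; the BJP model infers them from data.
    import numpy as np

    def log_sinh(y, a, b):
        return np.log(np.sinh(a + b * y)) / b

    def log_sinh_inv(z, a, b):
        return (np.arcsinh(np.exp(b * z)) - a) / b

    a, b = 0.1, 0.01
    y = np.array([0.5, 5.0, 50.0, 500.0])                # e.g. reservoir inflows
    z = log_sinh(y, a, b)                                # approximately Gaussianised
    print(np.allclose(log_sinh_inv(z, a, b), y))         # True: exact round trip
    ```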

  3. Efficient Bayesian Joint Models for Group Randomized Trials with Multiple Observation Times and Multiple Outcomes

    PubMed Central

    Xu, Xinyi; Pennell, Michael L.; Lu, Bo; Murray, David M.

    2013-01-01

    In this paper, we propose a Bayesian method for Group Randomized Trials (GRTs) with multiple observation times and multiple outcomes of different types. We jointly model these outcomes using latent multivariate normal linear regression, which allows treatment effects to change with time and accounts for (1) intra-class correlation (ICC) within groups, (2) the correlation between different outcomes measured on the same subject, and (3) the over-time correlation (OTC) of each outcome. Moreover, we develop a set of innovative priors for the variance components which yield direct inference on the correlations, avoid undesirable constraints, and allow utilization of information from previous studies. We illustrate through simulations that our model can improve estimation efficiency (lower posterior standard deviations) of ICCs and treatment effects relative to single-outcome models and models with diffuse priors on the variance components. We also demonstrate the methodology using body composition data collected in the Trial of Activity in Adolescent Girls (TAAG). PMID:22733563

  4. Joint prediction of multiple quantitative traits using a Bayesian multivariate antedependence model

    PubMed Central

    Jiang, J; Zhang, Q; Ma, L; Li, J; Wang, Z; Liu, J-F

    2015-01-01

    Predicting organismal phenotypes from genotype data is important for preventive and personalized medicine as well as plant and animal breeding. Although genome-wide association studies (GWAS) for complex traits have discovered a large number of trait- and disease-associated variants, phenotype prediction based on associated variants usually has low accuracy even for a high-heritability trait, because these variants can typically account for only a limited fraction of the total genetic variance. In comparison with GWAS, whole-genome prediction (WGP) methods can increase prediction accuracy by making use of a huge number of variants simultaneously. Among various statistical methods for WGP, multiple-trait models and antedependence models show their respective advantages. To take advantage of both strategies within a unified framework, we proposed a novel multivariate antedependence-based method for joint prediction of multiple quantitative traits using a Bayesian algorithm via modeling a linear relationship of the effect vector between each pair of adjacent markers. Through both simulation and real-data analyses, our studies demonstrated that the proposed antedependence-based multiple-trait WGP method is more accurate and robust than its traditional counterparts (Bayes A and multi-trait Bayes A) under various scenarios. Our method can be readily extended to deal with missing phenotypes and resequencing data with rare variants, offering a feasible way to jointly predict phenotypes for multiple complex traits in human genetic epidemiology as well as plant and livestock breeding. PMID:25873147

  5. Bayesian analysis of the modified Omori law

    NASA Astrophysics Data System (ADS)

    Holschneider, M.; Narteau, C.; Shebalin, P.; Peng, Z.; Schorlemmer, D.

    2012-06-01

    In order to examine variations in aftershock decay rate, we propose a Bayesian framework to estimate the {K, c, p}-values of the modified Omori law (MOL), λ(t) = K(c + t)^(-p). The Bayesian setting allows us not only to produce point estimates of these three parameters but also to assess their uncertainties and posterior dependencies with respect to the observed aftershock sequences. Using a new parametrization of the MOL, we identify the trade-off between the c and p-value estimates and discuss its dependence on the number of aftershocks. Then, we analyze the influence of the catalog completeness interval [t_start, t_stop] on the various estimates. To test this Bayesian approach on natural aftershock sequences, we use two independent and non-overlapping aftershock catalogs of the same earthquakes in Japan. Taking into account the posterior uncertainties, we show that both the handpicked (short times) and the instrumental (long times) catalogs predict the same ranges of parameter values. We therefore conclude that the same MOL may be valid over short and long times.
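
    A minimal sketch of the likelihood underlying such an analysis treats the aftershock times as an inhomogeneous Poisson process with the MOL rate; the aftershock times, completeness window and parameter grid below are invented for illustration.

    ```python
    # Sketch of a Bayesian posterior for the modified Omori law
    # lambda(t) = K * (c + t)**(-p), treating aftershock times as an
    # inhomogeneous Poisson process on [0, T]:
    #   log L = sum_i log lambda(t_i) - integral_0^T lambda(t) dt,
    # with the integral available in closed form for p != 1.
    import numpy as np

    def log_likelihood(times, T, K, c, p):
        rate_term = np.sum(np.log(K) - p * np.log(c + times))
        integral = K * ((c + T)**(1 - p) - c**(1 - p)) / (1 - p)
        return rate_term - integral

    # Illustrative aftershock times (days) and a coarse grid posterior over
    # (c, p) with K fixed; flat priors, so the posterior is the normalised likelihood.
    times = np.array([0.05, 0.1, 0.3, 0.7, 1.5, 3.0, 6.0, 12.0])
    T, K = 30.0, 5.0
    cs, ps = np.linspace(0.01, 1.0, 60), np.linspace(0.7, 1.6, 60)
    logpost = np.array([[log_likelihood(times, T, K, c, p) for c in cs] for p in ps])
    post = np.exp(logpost - logpost.max())
    post /= post.sum()
    ip, ic = np.unravel_index(post.argmax(), post.shape)
    print(f"MAP estimate: c = {cs[ic]:.2f}, p = {ps[ip]:.2f}")
    ```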

  6. A Bayesian analysis of plutonium exposures in Sellafield workers.

    PubMed

    Puncher, M; Riddell, A E

    2016-03-01

    The joint Russian (Mayak Production Association) and British (Sellafield) plutonium worker epidemiological analysis, undertaken as part of the European Union Framework Programme 7 (FP7) SOLO project, aims to investigate potential associations between cancer incidence and occupational exposures to plutonium using estimates of organ/tissue doses. The dose reconstruction protocol derived for the study makes best use of the most recent biokinetic models derived by the International Commission on Radiological Protection (ICRP), including a recent update to the human respiratory tract model (HRTM). This protocol was used to derive the final point estimates of absorbed doses for the study. Although uncertainties on the dose estimates were not included in the final epidemiological analysis, a separate Bayesian analysis has been performed for each of the 11 808 Sellafield plutonium workers included in the study in order to assess (a) the reliability of the point estimates provided to the epidemiologists and (b) the magnitude of the uncertainty on the dose estimates. This analysis, which accounts for uncertainties in biokinetic model parameters, intakes and measurement uncertainties, is described in the present paper. The results show that there is excellent agreement between the point estimates of dose and posterior mean values of dose. However, it is also evident that there are significant uncertainties associated with these dose estimates: the geometric range of the 97.5%:2.5% posterior values is a factor of 100 for lung dose, 30 for doses to liver and red bone marrow, and 40 for intakes; these uncertainties are not reflected in estimates of risk when point doses are used to assess them. It is also shown that better estimates of certain key HRTM absorption parameters could significantly reduce the uncertainties on lung dose in future studies. PMID:26584413

  7. Time-varying nonstationary multivariate risk analysis using a dynamic Bayesian copula

    NASA Astrophysics Data System (ADS)

    Sarhadi, Ali; Burn, Donald H.; Concepción Ausín, María.; Wiper, Michael P.

    2016-03-01

    A time-varying risk analysis is proposed for an adaptive design framework in nonstationary conditions arising from climate change. A Bayesian, dynamic conditional copula is developed for modeling the time-varying dependence structure between mixed continuous and discrete multiattributes of multidimensional hydrometeorological phenomena. Joint Bayesian inference is carried out to fit the marginals and copula in an illustrative example using an adaptive, Gibbs Markov Chain Monte Carlo (MCMC) sampler. Posterior mean estimates and credible intervals are provided for the model parameters and the Deviance Information Criterion (DIC) is used to select the model that best captures different forms of nonstationarity over time. This study also introduces a fully Bayesian, time-varying joint return period for multivariate time-dependent risk analysis in nonstationary environments. The results demonstrate that the nature and the risk of extreme-climate multidimensional processes are changed over time under the impact of climate change, and accordingly the long-term decision making strategies should be updated based on the anomalies of the nonstationary environment.

  8. Nested sampling applied in Bayesian room-acoustics decay analysis.

    PubMed

    Jasa, Tomislav; Xiang, Ning

    2012-11-01

    Room-acoustic energy decays often exhibit single-rate or multiple-rate characteristics in a wide variety of rooms/halls. Both the energy decay order and decay parameter estimation are of practical significance in architectural acoustics applications, representing two different levels of Bayesian probabilistic inference. This paper discusses a model-based sound energy decay analysis within a Bayesian framework utilizing the nested sampling algorithm. The nested sampling algorithm is specifically developed to evaluate the Bayesian evidence required for determining the energy decay order with decay parameter estimates as a secondary result. Taking the energy decay analysis in architectural acoustics as an example, this paper demonstrates that two different levels of inference, decay model-selection and decay parameter estimation, can be cohesively accomplished by the nested sampling algorithm. PMID:23145609
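
    For readers unfamiliar with the algorithm, the following toy sketch applies nested sampling to a one-parameter exponential-decay problem (not the paper's room-acoustics decay-model formulation): the worst live point is repeatedly replaced by a prior draw of higher likelihood while the prior volume shrinks geometrically, and the evidence accumulates as a weighted sum.

    ```python
    # Toy nested-sampling sketch: log-evidence of a single-rate exponential decay
    # model for synthetic data. Real room-acoustics decay analysis uses the full
    # multi-rate decay model; data, prior range and sizes here are illustrative.
    import numpy as np

    rng = np.random.default_rng(2)
    t_obs = np.linspace(0.0, 1.0, 50)
    y_obs = np.exp(-3.0 * t_obs) + 0.1 * rng.standard_normal(50)

    def loglike(rate):
        return -0.5 * np.sum((y_obs - np.exp(-rate * t_obs)) ** 2) / 0.1**2

    N, n_iter = 100, 500                        # live points, iterations
    live = rng.uniform(0.0, 10.0, N)            # prior: rate ~ Uniform(0, 10)
    live_ll = np.array([loglike(r) for r in live])
    log_shell = np.log(1.0 - np.exp(-1.0 / N))  # width of the first prior shell
    logZ = -np.inf
    for i in range(n_iter):
        worst = live_ll.argmin()
        logZ = np.logaddexp(logZ, log_shell - i / N + live_ll[worst])
        while True:                             # replace worst point by rejection
            cand = rng.uniform(0.0, 10.0)
            if loglike(cand) > live_ll[worst]:
                live[worst], live_ll[worst] = cand, loglike(cand)
                break
    print("log-evidence ~", logZ)               # remaining live-point mass omitted
    ```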

  9. APPLICATION OF BAYESIAN MONTE CARLO ANALYSIS TO A LAGRANGIAN PHOTOCHEMICAL AIR QUALITY MODEL. (R824792)

    EPA Science Inventory

    Uncertainties in ozone concentrations predicted with a Lagrangian photochemical air quality model have been estimated using Bayesian Monte Carlo (BMC) analysis. Bayesian Monte Carlo analysis provides a means of combining subjective "prior" uncertainty estimates developed ...

  10. Phycas: software for Bayesian phylogenetic analysis.

    PubMed

    Lewis, Paul O; Holder, Mark T; Swofford, David L

    2015-05-01

    Phycas is open source, freely available Bayesian phylogenetics software written primarily in C++ but with a Python interface. Phycas specializes in Bayesian model selection for nucleotide sequence data, particularly the estimation of marginal likelihoods, central to computing Bayes Factors. Marginal likelihoods can be estimated using newer methods (Thermodynamic Integration and Generalized Steppingstone) that are more accurate than the widely used Harmonic Mean estimator. In addition, Phycas supports two posterior predictive approaches to model selection: Gelfand-Ghosh and Conditional Predictive Ordinates. The General Time Reversible family of substitution models, as well as a codon model, are available, and data can be partitioned with all parameters unlinked except tree topology and edge lengths. Phycas provides for analyses in which the prior on tree topologies allows polytomous trees as well as fully resolved trees, and provides for several choices for edge length priors, including a hierarchical model as well as the recently described compound Dirichlet prior, which helps avoid overly informative induced priors on tree length. PMID:25577605

  11. Bayesian networks as a tool for epidemiological systems analysis

    NASA Astrophysics Data System (ADS)

    Lewis, F. I.

    2012-11-01

    Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter attempts not only to identify statistically associated variables, but additionally, and empirically, to separate these into those directly and indirectly dependent on one or more outcome variables. Such discrimination is vastly more ambitious but has the potential to reveal far more about key features of complex disease systems. Applying Bayesian network modeling to biological and medical data has considerable computational demands, combined with the need to ensure robust model selection given the vast model space of possible DAGs. These challenges require the use of approximation techniques, such as the Laplace approximation, Markov chain Monte Carlo simulation and parametric bootstrapping, along with computational parallelization. A case study in structure discovery (identification of an optimal DAG for given data) is presented which uses additive Bayesian networks to explore veterinary disease data of industrial and medical relevance.

  12. Methods for the joint meta-analysis of multiple tests.

    PubMed

    Trikalinos, Thomas A; Hoaglin, David C; Small, Kevin M; Terrin, Norma; Schmid, Christopher H

    2014-12-01

    Existing methods for meta-analysis of diagnostic test accuracy focus primarily on a single index test. We propose models for the joint meta-analysis of studies comparing multiple index tests on the same participants in paired designs. These models respect the grouping of data by studies, account for the within-study correlation between the tests' true-positive rates (TPRs) and between their false-positive rates (FPRs) (induced because tests are applied to the same participants), and allow for between-study correlations between TPRs and FPRs (such as those induced by threshold effects). We estimate models in the Bayesian setting. We demonstrate using a meta-analysis of screening for Down syndrome with two tests: shortened humerus (arm bone), and shortened femur (thigh bone). Separate and joint meta-analyses yielded similar TPR and FPR estimates. For example, the summary TPR for a shortened humerus was 35.3% (95% credible interval (CrI): 26.9, 41.8%) versus 37.9% (27.7, 50.3%) with joint versus separate meta-analysis. Joint meta-analysis is more efficient when calculating comparative accuracy: the difference in the summary TPRs was 0.0% (-8.9, 9.5%; TPR higher for shortened humerus) with joint versus 2.6% (-14.7, 19.8%) with separate meta-analyses. Simulation and empirical analyses are needed to refine the role of the proposed methodology. PMID:26052954

  13. On Bayesian analysis of on-off measurements

    NASA Astrophysics Data System (ADS)

    Nosek, Dalibor; Nosková, Jana

    2016-06-01

    We propose an analytical solution to the on-off problem within the framework of Bayesian statistics. Both the statistical significance for the discovery of new phenomena and credible intervals on model parameters are presented in a consistent way. We use a large enough family of prior distributions of relevant parameters. The proposed analysis is designed to provide Bayesian solutions that can be used for any number of observed on-off events, including zero. The procedure is checked using Monte Carlo simulations. The usefulness of the method is demonstrated on examples from γ-ray astronomy.
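
    As a flavour of the setting (not the analytical solution the paper derives), the sketch below computes a posterior-predictive tail probability for an on-source excess by Monte Carlo, assuming flat priors; the counts and exposure ratio are invented.

    ```python
    # Monte Carlo sketch for the on-off problem: N_on counts on-source, N_off
    # off-source, exposure ratio alpha = t_on / t_off. With a flat prior the
    # off-source rate posterior is Gamma(N_off + 1); we then ask how often
    # background alone would reproduce the observed on-source counts.
    import numpy as np

    rng = np.random.default_rng(1)
    N_on, N_off, alpha = 24, 120, 0.1               # illustrative observation

    b = alpha * rng.gamma(N_off + 1, size=200_000)  # posterior draws of on-source bkg
    pred = rng.poisson(b)                           # posterior-predictive counts
    p_tail = np.mean(pred >= N_on)
    print(f"P(counts >= {N_on} | background only) ~ {p_tail:.1e}")
    ```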

  14. Factor analysis models for structuring covariance matrices of additive genetic effects: a Bayesian implementation

    PubMed Central

    de los Campos, Gustavo; Gianola, Daniel

    2007-01-01

    Multivariate linear models are increasingly important in quantitative genetics. In high dimensional specifications, factor analysis (FA) may provide an avenue for structuring (co)variance matrices, thus reducing the number of parameters needed for describing (co)dispersion. We describe how FA can be used to model genetic effects in the context of a multivariate linear mixed model. An orthogonal common factor structure is used to model genetic effects under Gaussian assumption, so that the marginal likelihood is multivariate normal with a structured genetic (co)variance matrix. Under standard prior assumptions, all fully conditional distributions have closed form, and samples from the joint posterior distribution can be obtained via Gibbs sampling. The model and the algorithm developed for its Bayesian implementation were used to describe five repeated records of milk yield in dairy cattle, and a one common FA model was compared with a standard multiple trait model. The Bayesian Information Criterion favored the FA model. PMID:17897592

  15. A Bayesian QTL linkage analysis of the common dataset from the 12th QTLMAS workshop

    PubMed Central

    Bink, Marco CAM; van Eeuwijk, Fred A

    2009-01-01

    Background To compare the power of various QTL mapping methodologies, a dataset was simulated within the framework of 12th QTLMAS workshop. A total of 5865 diploid individuals was simulated, spanning seven generations, with known pedigree. Individuals were genotyped for 6000 SNPs across six chromosomes. We present an illustration of a Bayesian QTL linkage analysis, as implemented in the special purpose software FlexQTL. Most importantly, we treated the number of bi-allelic QTL as a random variable and used Bayes Factors to infer plausible QTL models. We investigated the power of our analysis in relation to the number of phenotyped individuals and SNPs. Results We report clear posterior evidence for 12 QTL that jointly explained 30% of the phenotypic variance, which was very close to the total of included simulation effects, when using all phenotypes and a set of 600 SNPs. Decreasing the number of phenotyped individuals from 4665 to 1665 and/or the number of SNPs in the analysis from 600 to 120 dramatically reduced the power to identify and locate QTL. Posterior estimates of genome-wide breeding values for a small set of individuals were given. Conclusion We presented a successful Bayesian linkage analysis of a simulated dataset with a pedigree spanning several generations. Our analysis identified all regions that contained QTL with effects explaining more than one percent of the phenotypic variance. We showed how the results of a Bayesian QTL mapping can be used in genomic prediction. PMID:19278543

  16. A Comparison of Imputation Methods for Bayesian Factor Analysis Models

    ERIC Educational Resources Information Center

    Merkle, Edgar C.

    2011-01-01

    Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amiable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…

  17. Bayesian Analysis of Nonlinear Structural Equation Models with Nonignorable Missing Data

    ERIC Educational Resources Information Center

    Lee, Sik-Yum

    2006-01-01

    A Bayesian approach is developed for analyzing nonlinear structural equation models with nonignorable missing data. The nonignorable missingness mechanism is specified by a logistic regression model. A hybrid algorithm that combines the Gibbs sampler and the Metropolis-Hastings algorithm is used to produce the joint Bayesian estimates of…

  18. Simultaneous Bayesian analysis of contingency tables in genetic association studies.

    PubMed

    Dickhaus, Thorsten

    2015-08-01

    Genetic association studies lead to simultaneous categorical data analysis. The sample for every genetic locus consists of a contingency table containing the numbers of observed genotype-phenotype combinations. Under case-control design, the row counts of every table are identical and fixed, while column counts are random. The aim of the statistical analysis is to test independence of the phenotype and the genotype at every locus. We present an objective Bayesian methodology for these association tests, which relies on the conjugacy of Dirichlet and multinomial distributions. Being based on the likelihood principle, the Bayesian tests avoid looping over all tables with given marginals. Making use of data generated by The Wellcome Trust Case Control Consortium (WTCCC), we illustrate that the ordering of the Bayes factors shows a good agreement with that of frequentist p-values. Furthermore, we deal with specifying prior probabilities for the validity of the null hypotheses, by taking linkage disequilibrium structure into account and exploiting the concept of effective numbers of tests. Application of a Bayesian decision theoretic multiple test procedure to the WTCCC data illustrates the proposed methodology. Finally, we discuss two methods for reconciling frequentist and Bayesian approaches to the multiple association test problem. PMID:26215535
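
    A minimal sketch of the Dirichlet-multinomial conjugacy at work, assuming uniform Dirichlet(1, 1, 1) priors and an invented 2×3 genotype-by-phenotype table; the paper's full procedure additionally calibrates prior probabilities using linkage disequilibrium structure.

    ```python
    # Sketch of a Bayes factor for independence in a genotype-by-phenotype table
    # under case-control sampling (row totals fixed). H1: each row has its own
    # multinomial with a Dirichlet(1,1,1) prior; H0: both rows share one multinomial.
    import numpy as np
    from scipy.special import gammaln

    def log_dirichlet_mult(counts, alpha):
        # Log marginal likelihood of multinomial counts under a Dirichlet prior,
        # omitting the multinomial coefficient (it cancels in the Bayes factor).
        a = np.full_like(counts, alpha, dtype=float)
        return (gammaln(a.sum()) - gammaln(a.sum() + counts.sum())
                + np.sum(gammaln(a + counts) - gammaln(a)))

    table = np.array([[30, 50, 20],    # cases:    genotype counts (illustrative)
                      [45, 40, 15]])   # controls
    log_m1 = sum(log_dirichlet_mult(row, 1.0) for row in table)
    log_m0 = log_dirichlet_mult(table.sum(axis=0), 1.0)
    print(f"log Bayes factor (H1 vs H0): {log_m1 - log_m0:.2f}")
    ```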

  19. Multiple quantitative trait analysis using bayesian networks.

    PubMed

    Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian

    2014-09-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness. PMID:25236454

  20. Application of Bayesian graphs to SN Ia data analysis and compression

    NASA Astrophysics Data System (ADS)

    Ma, Cong; Corasaniti, Pier-Stefano; Bassett, Bruce A.

    2016-08-01

    Bayesian graphical models are an efficient tool for modelling complex data and deriving self-consistent expressions for the posterior distribution of model parameters. We apply Bayesian graphs to perform statistical analyses of Type Ia supernova (SN Ia) luminosity distance measurements from the Joint Light-curve Analysis (JLA) dataset (Betoule et al. 2014). In contrast to the χ2 approach used in previous studies, the Bayesian inference allows us to fully account for the standard-candle parameter dependence of the data covariance matrix. Comparing with χ2 analysis results, we find a systematic offset of the marginal model parameter bounds. We demonstrate that the bias is statistically significant in the case of the SN Ia standardization parameters, with a maximal 6σ shift of the SN light-curve colour correction. In addition, we find that the evidence for a host galaxy correction is now only 2.4σ. Systematic offsets on the cosmological parameters remain small, but may increase by combining constraints from complementary cosmological probes. The bias of the χ2 analysis is due to neglecting the parameter-dependent log-determinant of the data covariance, which gives more statistical weight to larger values of the standardization parameters. We find a similar effect on compressed distance modulus data. To this end we implement a fully consistent compression method for the JLA dataset that uses a Gaussian approximation of the posterior distribution for fast generation of compressed data. Overall, the results of our analysis emphasize the need for a fully consistent Bayesian statistical approach in the analysis of future large SN Ia datasets.
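
    The mechanism is easy to state: for a Gaussian likelihood with parameter-dependent covariance, -2 log L = r^T C^{-1} r + log det C + const, and minimizing the χ2 term alone rewards parameter values that inflate C. A toy sketch with invented numbers:

    ```python
    # Sketch of the chi^2 vs full Gaussian log-likelihood point: the log-determinant
    # term penalises parameter values that inflate the covariance. Numbers invented.
    import numpy as np

    def neg2_loglike(r, C):
        sign, logdet = np.linalg.slogdet(C)
        return r @ np.linalg.solve(C, r) + logdet   # chi^2 alone drops logdet

    r = np.array([0.10, -0.20, 0.15])               # residuals at some parameters
    C_small, C_large = 0.02 * np.eye(3), 0.08 * np.eye(3)

    # chi^2 alone prefers the inflated covariance ...
    print(r @ np.linalg.solve(C_small, r), r @ np.linalg.solve(C_large, r))
    # ... while the full likelihood, with log det C, prefers the smaller one.
    print(neg2_loglike(r, C_small), neg2_loglike(r, C_large))
    ```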

  1. Bayesian joint inversion of surface deformation and hydraulic data for aquifer characterization

    NASA Astrophysics Data System (ADS)

    Hesse, M. A.; Stadler, G.

    2013-12-01

    Remote sensing and geodetic measurements are providing a wealth of new, spatially-distributed, time-series data that promise to improve the characterization of regional aquifers. The integration of these geodetic measurements with other hydrological observations has the potential to aid the sustainable management of groundwater resources through improved characterization of the spatial variation of aquifer properties. The joint inversion of geomechanical and hydrological data is challenging, because it requires fully-coupled hydrogeophysical inversion for the aquifer parameters, based on a coupled geomechanical and hydrological process model. We formulate a Bayesian inverse problem to infer the lateral permeability variation in an aquifer from geodetic and hydraulic data, and from prior information. We compute the maximum a posteriori (MAP) estimate of the posterior permeability distribution, and use a local Gaussian approximation around the MAP point to characterize the uncertainty. For two-dimensional test cases we also explore the full posterior permeability distribution through Markov-Chain Monte Carlo (MCMC) sampling. To cope with the large parameter space dimension, we use local Gaussian approximations as proposal densities in the MCMC algorithm. Using increasingly complex model problems, based on the work of Mandel (1953) and Segall (1985), we find the following general properties of poroelastic inversions: (1) Augmenting standard hydraulic well data by surface deformation data improves the aquifer characterization. (2) Surface deformation contributes the most in shallow aquifers, but provides useful information even for the characterization of aquifers down to 1 km. (3) In general, it is more difficult to infer high permeability regions, and their characterization requires frequent measurement to resolve the associated short response time scales. (4) In horizontal aquifers, the vertical component of the surface deformation provides a smoothed image of the

  2. An Overview of Bayesian Methods for Neural Spike Train Analysis

    PubMed Central

    2013-01-01

    Neural spike train analysis is an important task in computational neuroscience which aims to understand neural mechanisms and gain insights into neural circuits. With the advancement of multielectrode recording and imaging technologies, it has become increasingly demanding to develop statistical tools for analyzing large neuronal ensemble spike activity. Here we present a tutorial overview of Bayesian methods and their representative applications in neural spike train analysis, at both single neuron and population levels. On the theoretical side, we focus on various approximate Bayesian inference techniques as applied to latent state and parameter estimation. On the application side, the topics include spike sorting, tuning curve estimation, neural encoding and decoding, deconvolution of spike trains from calcium imaging signals, and inference of neuronal functional connectivity and synchrony. Some research challenges and opportunities for neural spike train analysis are discussed. PMID:24348527

  3. Bayesian Shrinkage Analysis of Quantitative Trait Loci for Dynamic Traits

    PubMed Central

    Yang, Runqing; Xu, Shizhong

    2007-01-01

    Many quantitative traits are measured repeatedly during the life of an organism. Such traits are called dynamic traits. The pattern of the changes of a dynamic trait is called the growth trajectory. Studying the growth trajectory may enhance our understanding of the genetic architecture underlying growth. Recently, we developed an interval-mapping procedure to map QTL for dynamic traits under the maximum-likelihood framework. We fitted the growth trajectory with Legendre polynomials. The method was intended to map one QTL at a time, and the entire QTL analysis involved scanning the entire genome by fitting multiple single-QTL models. In this study, we propose a Bayesian shrinkage analysis for estimating and mapping multiple QTL in a single model. The method is a combination of the shrinkage mapping for individual quantitative traits and the Legendre polynomial analysis for dynamic traits. The multiple-QTL model is implemented in two ways: (1) a fixed-interval approach where a QTL is placed in each marker interval and (2) a moving-interval approach where the position of a QTL can be searched in a range that covers many marker intervals. Simulation study shows that the Bayesian shrinkage method generates much better signals for QTL than the interval-mapping approach. We propose several alternative methods to present the results of the Bayesian shrinkage analysis. In particular, we found that the Wald test-statistic profile can serve as a mechanism to test the significance of a putative QTL. PMID:17435239
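
    For concreteness, here is a short sketch of the Legendre-polynomial trajectory fit that underlies such dynamic-trait models, with invented measurements; the QTL machinery then operates on quantities like the fitted coefficients.

    ```python
    # Sketch of a Legendre-polynomial description of a growth trajectory:
    # fit low-order Legendre coefficients to repeated measurements on one
    # individual. All measurement values below are illustrative.
    import numpy as np

    ages = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])     # measurement times
    weight = np.array([2.1, 4.0, 7.5, 12.2, 17.8, 21.5])  # one individual's trait

    # Map ages to [-1, 1], the Legendre domain, then fit a degree-2 trajectory.
    x = 2 * (ages - ages.min()) / (ages.max() - ages.min()) - 1
    coef = np.polynomial.legendre.legfit(x, weight, 2)
    print(coef)                                    # trajectory coefficients
    print(np.polynomial.legendre.legval(x, coef))  # fitted growth curve
    ```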

  4. Bayesian Variable Selection in Cost-Effectiveness Analysis

    PubMed Central

    Negrín, Miguel A.; Vázquez-Polo, Francisco J.; Martel, María; Moreno, Elías; Girón, Francisco J.

    2010-01-01

    Linear regression models are often used to represent the cost and effectiveness of medical treatment. The covariates used may include sociodemographic variables, such as age, gender or race; clinical variables, such as initial health status, years of treatment or the existence of concomitant illnesses; and a binary variable indicating the treatment received. However, most studies estimate only one model, which usually includes all the covariates. This procedure ignores the question of uncertainty in model selection. In this paper, we examine four alternative Bayesian variable selection methods that have been proposed. In this analysis, we estimate the inclusion probability of each covariate in the real model conditional on the data. Variable selection can be useful for estimating incremental effectiveness and incremental cost, through Bayesian model averaging, as well as for subgroup analysis. PMID:20617047

  5. Bayesian Analysis Toolkit: 1.0 and beyond

    NASA Astrophysics Data System (ADS)

    Beaujean, Frederik; Caldwell, Allen; Greenwald, D.; Kluth, S.; Kröninger, Kevin; Schulz, O.

    2015-12-01

    The Bayesian Analysis Toolkit is a C++ package centered around Markov-chain Monte Carlo sampling. It is used in high-energy physics analyses by experimentalists and theorists alike. The software has matured over the last few years. We present the new features entering version 1.0, summarize some of the software-engineering lessons learned, and give an outlook on future versions.
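
    BAT itself is a C++ package; as a language-neutral sketch of the random-walk Metropolis step at the core of such MCMC toolkits, the following fragment samples a toy two-dimensional Gaussian posterior.

        import numpy as np

        rng = np.random.default_rng(42)

        def log_posterior(theta):
            # Toy target: standard bivariate normal, up to a constant.
            return -0.5 * theta @ theta

        theta = np.zeros(2)
        lp = log_posterior(theta)
        samples = []
        for _ in range(5000):
            proposal = theta + 0.5 * rng.normal(size=2)
            lp_new = log_posterior(proposal)
            if np.log(rng.uniform()) < lp_new - lp:   # Metropolis accept/reject
                theta, lp = proposal, lp_new
            samples.append(theta.copy())

        samples = np.array(samples)
        print("posterior mean estimate:", samples.mean(axis=0).round(2))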

  6. BAYESIAN ANALYSIS OF MULTIPLE HARMONIC OSCILLATIONS IN THE SOLAR CORONA

    SciTech Connect

    Arregui, I.; Asensio Ramos, A.; Diaz, A. J.

    2013-03-01

    The detection of multiple mode harmonic kink oscillations in coronal loops enables us to obtain information on coronal density stratification and magnetic field expansion using seismology inversion techniques. The inference is based on the measurement of the period ratio between the fundamental mode and the first overtone and theoretical results for the period ratio under the hypotheses of coronal density stratification and magnetic field expansion of the wave guide. We present a Bayesian analysis of multiple mode harmonic oscillations for the inversion of the density scale height and magnetic flux tube expansion under each of the hypotheses. The two models are then compared using a Bayesian model comparison scheme to assess how plausible each one is given our current state of knowledge.
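
    A toy version of the model comparison, assuming invented forward mappings from density scale height and tube expansion to the period ratio (the actual seismological relations are not reproduced here); each evidence is a Gaussian likelihood for the measured ratio averaged over a flat prior on a parameter grid.

        import numpy as np

        # Hypothetical measured period ratio P1 / (2 P2) and its Gaussian error.
        r_obs, sigma = 0.91, 0.04

        H = np.linspace(0.1, 10, 500)     # density scale height grid (model A)
        G = np.linspace(1.0, 3.0, 500)    # tube expansion factor grid (model B)
        r_A = 1.0 - 0.3 * np.exp(-H)      # invented: stratification lowers the ratio
        r_B = 1.0 + 0.2 * (G - 1.0)       # invented: expansion raises the ratio

        def evidence(pred):
            # Flat prior on a uniform grid: the marginal likelihood is the
            # grid average of the likelihood of the measured ratio.
            like = np.exp(-0.5 * ((r_obs - pred) / sigma) ** 2)
            return (like / (sigma * np.sqrt(2 * np.pi))).mean()

        bf = evidence(r_A) / evidence(r_B)
        print(f"Bayes factor, stratification vs expansion: {bf:.2f}")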

  7. Analysis of NSTX TF Joint Voltage Measurements

    SciTech Connect

    R, Woolley

    2005-10-07

    This report presents findings of analyses of recorded current and voltage data associated with 72 electrical joints operating at high current and high mechanical stress. The analysis goal was to characterize the mechanical behavior of each joint and thus evaluate its mechanical supports. The joints are part of the toroidal field (TF) magnet system of the National Spherical Torus Experiment (NSTX) pulsed plasma device operating at the Princeton Plasma Physics Laboratory (PPPL). Since there is not sufficient space near the joints for much traditional mechanical instrumentation, small voltage probes were installed on each joint, and their voltage waveforms were recorded on sampling digitizers during each NSTX "shot".

  8. Risk analysis using a hybrid Bayesian-approximate reasoning methodology.

    SciTech Connect

    Bott, T. F.; Eisenhawer, S. W.

    2001-01-01

    Analysts are sometimes asked to make frequency estimates for specific accidents in which the accident frequency is determined primarily by safety controls. Under these conditions, frequency estimates use considerable expert belief in determining how the controls affect the accident frequency. To evaluate and document beliefs about control effectiveness, we have modified a traditional Bayesian approach by using approximate reasoning (AR) to develop prior distributions. Our method produces accident frequency estimates that separately express the probabilistic results produced in Bayesian analysis and possibilistic results that reflect uncertainty about the prior estimates. Based on our experience using traditional methods, we feel that the AR approach better documents beliefs about the effectiveness of controls than if the beliefs are buried in Bayesian prior distributions. We have performed numerous expert elicitations in which probabilistic information was sought from subject matter experts not trained in probability. We find it much easier to elicit the linguistic variables and fuzzy set membership values used in AR than to obtain the probability distributions used in prior distributions directly from these experts, because it better captures their beliefs and better expresses their uncertainties.

  9. Spectral Analysis of B Stars: An Application of Bayesian Statistics

    NASA Astrophysics Data System (ADS)

    Mugnes, J.-M.; Robert, C.

    2012-12-01

    To better understand the processes involved in stellar physics, it is necessary to obtain accurate stellar parameters (effective temperature, surface gravity, abundances…). Spectral analysis is a powerful tool for investigating stars, but it is also vital to reduce uncertainties at a decent computational cost. Here we present a spectral analysis method based on a combination of Bayesian statistics and grids of synthetic spectra obtained with TLUSTY. This method simultaneously constrains the stellar parameters by using all the lines accessible in observed spectra and thus greatly reduces uncertainties and improves the overall spectrum fitting. Preliminary results are shown using spectra from the Observatoire du Mont-Mégantic.
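
    A schematic of the grid-based inference, with a two-parameter toy spectrum standing in for the TLUSTY grid: a Gaussian likelihood over every pixel combines all lines at once into a joint posterior for the stellar parameters. All numbers are hypothetical.

        import numpy as np

        rng = np.random.default_rng(3)

        teff = np.linspace(15000, 30000, 40)   # effective temperature grid (K)
        logg = np.linspace(3.0, 4.5, 30)       # surface gravity grid
        wave = np.linspace(0, 1, 200)          # normalized wavelength axis

        def synth(T, g):
            # Toy synthetic spectrum: two lines whose depths track T and g.
            d1, d2 = (T - 15000) / 15000, (g - 3.0) / 1.5
            return (1 - 0.5 * d1 * np.exp(-((wave - 0.3) / 0.02) ** 2)
                      - 0.4 * d2 * np.exp(-((wave - 0.7) / 0.02) ** 2))

        # "Observed" spectrum: true parameters (22000, 3.8) plus noise.
        obs = synth(22000, 3.8) + rng.normal(0, 0.01, wave.size)

        # Log-likelihood over every pixel, i.e. all lines used simultaneously.
        logpost = np.array([[-0.5 * np.sum(((obs - synth(T, g)) / 0.01) ** 2)
                             for g in logg] for T in teff])
        post = np.exp(logpost - logpost.max())
        post /= post.sum()

        iT, ig = np.unravel_index(post.argmax(), post.shape)
        print(f"MAP estimate: Teff = {teff[iT]:.0f} K, logg = {logg[ig]:.2f}")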

  10. Bayesian Dose-Finding in Two Treatment Cycles Based on the Joint Utility of Efficacy and Toxicity

    PubMed Central

    Lee, Juhee; Thall, Peter F.; Ji, Yuan; Müller, Peter

    2014-01-01

    A phase I/II clinical trial design is proposed for adaptively and dynamically optimizing each patient's dose in each of two cycles of therapy based on the joint binary efficacy and toxicity outcomes in each cycle. A dose-outcome model is assumed that includes a Bayesian hierarchical latent variable structure to induce association among the outcomes and also facilitate posterior computation. Doses are chosen in each cycle based on posteriors of a model-based objective function, similar to a reinforcement learning or Q-learning function, defined in terms of numerical utilities of the joint outcomes in each cycle. For each patient, the procedure outputs a sequence of two actions, one for each cycle, with each action being the decision to either treat the patient at a chosen dose or not to treat. The cycle 2 action depends on the individual patient's cycle 1 dose and outcomes. In addition, decisions are based on posterior inference using other patients’ data, and therefore the proposed method is adaptive both within and between patients. A simulation study of the method is presented, including comparison to two-cycle extensions of the conventional 3+3 algorithm, continual reassessment method, and a Bayesian model-based design, and evaluation of robustness. PMID:26366026

  11. Bayesian sensitivity analysis of bifurcating nonlinear models

    NASA Astrophysics Data System (ADS)

    Becker, W.; Worden, K.; Rowson, J.

    2013-01-01

    Sensitivity analysis allows one to investigate how changes in input parameters to a system affect the output. When computational expense is a concern, metamodels such as Gaussian processes can offer considerable computational savings over Monte Carlo methods, albeit at the expense of introducing a data modelling problem. In particular, Gaussian processes assume a smooth, non-bifurcating response surface. This work highlights a recent extension to Gaussian processes which uses a decision tree to partition the input space into homogeneous regions, and then fits separate Gaussian processes to each region. In this way, bifurcations can be modelled at region boundaries and different regions can have different covariance properties. To test this method, both the treed and standard methods were applied to the bifurcating response of a Duffing oscillator and a bifurcating FE model of a heart valve. It was found that the treed Gaussian process provides a practical way of performing uncertainty and sensitivity analysis on large, potentially bifurcating models, which cannot be dealt with using a single GP, although how to manage bifurcation boundaries that are not parallel to coordinate axes remains an open problem.

  12. Bayesian analysis for extreme climatic events: A review

    NASA Astrophysics Data System (ADS)

    Chu, Pao-Shin; Zhao, Xin

    2011-11-01

    This article reviews Bayesian analysis methods applied to extreme climatic data. We particularly focus on applications to three different problems related to extreme climatic events: detection of abrupt regime shifts, clustering of tropical cyclone tracks, and statistical forecasting of seasonal tropical cyclone activity. For identifying potential change points in an extreme event count series, a hierarchical Bayesian framework involving three layers - data, parameter, and hypothesis - is formulated to demonstrate the posterior probability of the shifts over time. For the data layer, a Poisson process with a gamma distributed rate is presumed. For the hypothesis layer, multiple candidate hypotheses with different change points are considered. To calculate the posterior probability for each hypothesis and its associated parameters we developed an exact analytical formula, a Markov chain Monte Carlo (MCMC) algorithm, and a more sophisticated reversible jump Markov chain Monte Carlo (RJMCMC) algorithm. The algorithms are applied to several rare event series: the annual tropical cyclone or typhoon counts over the central, eastern, and western North Pacific; the annual extremely heavy rainfall event counts at Manoa, Hawaii; and the annual heat wave frequency in France. Using an Expectation-Maximization (EM) algorithm, a Bayesian clustering method built on a Gaussian mixture model is applied to objectively classify historical, spaghetti-like tropical cyclone tracks (1945-2007) over the western North Pacific and the South China Sea into eight distinct track types. A regression based approach to forecasting seasonal tropical cyclone frequency in a region is developed. Specifically, by adopting large-scale environmental conditions prior to the tropical cyclone season, a Poisson regression model is built for predicting seasonal tropical cyclone counts, and a probit regression model is alternatively developed toward a binary classification problem. With a non
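
    For the Poisson-gamma data layer described above, the rate can be integrated out analytically, so the posterior over a single change point has a closed form. A minimal sketch with invented annual counts:

        import numpy as np
        from scipy.special import gammaln

        rng = np.random.default_rng(7)

        # Hypothetical annual event counts whose rate shifts after year 15.
        x = np.concatenate([rng.poisson(2.0, 15), rng.poisson(5.0, 15)])
        a, b = 1.0, 1.0    # Gamma(a, b) prior on the Poisson rate

        def log_marginal(seg):
            # Log marginal likelihood of a count segment, rate integrated out.
            s, m = seg.sum(), len(seg)
            return (a * np.log(b) - gammaln(a) + gammaln(a + s)
                    - (a + s) * np.log(b + m) - gammaln(seg + 1).sum())

        # Posterior over the change-point location k, uniform prior on k.
        ks = np.arange(1, len(x))
        logp = np.array([log_marginal(x[:k]) + log_marginal(x[k:]) for k in ks])
        post = np.exp(logp - logp.max())
        post /= post.sum()
        print("most probable change point: after year", ks[post.argmax()])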

  13. The Bayesian Analysis Software Developed At Washington University

    NASA Astrophysics Data System (ADS)

    Marutyan, Karen R.; Bretthorst, G. Larry

    2009-12-01

    Over the last few years there has been an ongoing effort at the Biomedical Magnetic Resonance Laboratory within Washington University to develop data analysis applications using Bayesian probability theory. A few of these applications are specific to Magnetic Resonance data; however, most are general and can analyze data from a wide variety of sources. These data analysis applications are server based and they have been written in such a way as to allow them to utilize as many processors as are available. The interface to these Bayesian applications is a client-based Java interface. The client, usually a Windows PC, runs the interface, sets up an analysis, sends the analysis to the server, fetches the results, and displays the appropriate plots on the user's client machine. Together, the client and server software can be used to solve a host of interesting problems that occur regularly in the sciences. In this paper, we describe both the client and server software and briefly discuss how to acquire, install and maintain this software.

  14. Bayesian analysis to detect abrupt changes in extreme hydrological processes

    NASA Astrophysics Data System (ADS)

    Jo, Seongil; Kim, Gwangsu; Jeon, Jong-June

    2016-07-01

    In this study, we develop a new method for Bayesian change point analysis. The proposed method is easy to implement and can be extended to a wide class of distributions. Using a generalized extreme-value distribution, we investigate the annual maxima of precipitation observed at stations in the South Korean Peninsula, and find significant changes at the considered sites. We evaluate the hydrological risk in predictions using the estimated return levels. In addition, we show that misspecification of the probability model can bias the estimated number of change points, and, using a simple example, that this problem is difficult to avoid by technical data transformation.

  15. A Bayesian analysis of pentaquark signals from CLAS data

    SciTech Connect

    David Ireland; Bryan McKinnon; Dan Protopopescu; Pawel Ambrozewicz; Marco Anghinolfi; G. Asryan; Harutyun Avakian; H. Bagdasaryan; Nathan Baillie; Jacques Ball; Nathan Baltzell; V. Batourine; Marco Battaglieri; Ivan Bedlinski; Ivan Bedlinskiy; Matthew Bellis; Nawal Benmouna; Barry Berman; Angela Biselli; Lukasz Blaszczyk; Sylvain Bouchigny; Sergey Boyarinov; Robert Bradford; Derek Branford; William Briscoe; William Brooks; Volker Burkert; Cornel Butuceanu; John Calarco; Sharon Careccia; Daniel Carman; Liam Casey; Shifeng Chen; Lu Cheng; Philip Cole; Patrick Collins; Philip Coltharp; Donald Crabb; Volker Crede; Natalya Dashyan; Rita De Masi; Raffaella De Vita; Enzo De Sanctis; Pavel Degtiarenko; Alexandre Deur; Richard Dickson; Chaden Djalali; Gail Dodge; Joseph Donnelly; David Doughty; Michael Dugger; Oleksandr Dzyubak; Hovanes Egiyan; Kim Egiyan; Lamiaa Elfassi; Latifa Elouadrhiri; Paul Eugenio; Gleb Fedotov; Gerald Feldman; Ahmed Fradi; Herbert Funsten; Michel Garcon; Gagik Gavalian; Nerses Gevorgyan; Gerard Gilfoyle; Kevin Giovanetti; Francois-Xavier Girod; John Goetz; Wesley Gohn; Atilla Gonenc; Ralf Gothe; Keith Griffioen; Michel Guidal; Nevzat Guler; Lei Guo; Vardan Gyurjyan; Kawtar Hafidi; Hayk Hakobyan; Charles Hanretty; Neil Hassall; F. Hersman; Ishaq Hleiqawi; Maurik Holtrop; Charles Hyde; Yordanka Ilieva; Boris Ishkhanov; Eugeny Isupov; D. Jenkins; Hyon-Suk Jo; John Johnstone; Kyungseon Joo; Henry Juengst; Narbe Kalantarians; James Kellie; Mahbubul Khandaker; Wooyoung Kim; Andreas Klein; Franz Klein; Mikhail Kossov; Zebulun Krahn; Laird Kramer; Valery Kubarovsky; Joachim Kuhn; Sergey Kuleshov; Viacheslav Kuznetsov; Jeff Lachniet; Jean Laget; Jorn Langheinrich; D. Lawrence; Kenneth Livingston; Haiyun Lu; Marion MacCormick; Nikolai Markov; Paul Mattione; Bernhard Mecking; Mac Mestayer; Curtis Meyer; Tsutomu Mibe; Konstantin Mikhaylov; Marco Mirazita; Rory Miskimen; Viktor Mokeev; Brahim Moreno; Kei Moriya; Steven Morrow; Maryam Moteabbed; Edwin Munevar Espitia; Gordon Mutchler; Pawel Nadel-Turonski; Rakhsha Nasseripour; Silvia Niccolai; Gabriel Niculescu; Maria-Ioana Niculescu; Bogdan Niczyporuk; Megh Niroula; Rustam Niyazov; Mina Nozar; Mikhail Osipenko; Alexander Ostrovidov; Kijun Park; Evgueni Pasyuk; Craig Paterson; Sergio Pereira; Joshua Pierce; Nikolay Pivnyuk; Oleg Pogorelko; Sergey Pozdnyakov; John Price; Sebastien Procureur; Yelena Prok; Brian Raue; Giovanni Ricco; Marco Ripani; Barry Ritchie; Federico Ronchetti; Guenther Rosner; Patrizia Rossi; Franck Sabatie; Julian Salamanca; Carlos Salgado; Joseph Santoro; Vladimir Sapunenko; Reinhard Schumacher; Vladimir Serov; Youri Sharabian; Dmitri Sharov; Nikolay Shvedunov; Elton Smith; Lee Smith; Daniel Sober; Daria Sokhan; Aleksey Stavinskiy; Samuel Stepanyan; Stepan Stepanyan; Burnham Stokes; Paul Stoler; Steffen Strauch; Mauro Taiuti; David Tedeschi; Ulrike Thoma; Avtandil Tkabladze; Svyatoslav Tkachenko; Clarisse Tur; Maurizio Ungaro; Michael Vineyard; Alexander Vlassov; Daniel Watts; Lawrence Weinstein; Dennis Weygand; M. Williams; Elliott Wolin; M.H. Wood; Amrit Yegneswaran; Lorenzo Zana; Jixie Zhang; Bo Zhao; Zhiwen Zhao

    2008-02-01

    We examine the results of two measurements by the CLAS collaboration, one of which claimed evidence for a $\Theta^{+}$ pentaquark, whilst the other found no such evidence. The unique feature of these two experiments was that they were performed with the same experimental setup. Using a Bayesian analysis we find that the results of the two experiments are in fact compatible with each other, but that the first measurement did not contain sufficient information to determine unambiguously the existence of a $\Theta^{+}$. Further, we suggest a means by which the existence of a new candidate particle can be tested in a rigorous manner.

  16. Direct message passing for hybrid Bayesian networks and performance analysis

    NASA Astrophysics Data System (ADS)

    Sun, Wei; Chang, K. C.

    2010-04-01

    Probabilistic inference for hybrid Bayesian networks, which involve both discrete and continuous variables, has been an important research topic in recent years. This is not only because a number of efficient inference algorithms have been developed and are used routinely for simple types of networks such as purely discrete models, but also because continuous variables are inevitable in modeling complex, practical systems. Pearl's message passing algorithm provides a simple framework to compute posterior distributions by propagating messages between nodes, and it provides exact answers for polytree models with purely discrete or continuous variables. In addition, applying Pearl's message passing to networks with loops usually converges and yields a good approximation. However, for hybrid models, a general message passing algorithm between different types of variables is needed. In this paper, we develop a method called Direct Message Passing (DMP) for exchanging messages between discrete and continuous variables. Based on Pearl's algorithm, we derive formulae to compute messages for variables in the various dependence relationships encoded in conditional probability distributions. A mixture of Gaussians is used to represent continuous messages, with the number of mixture components up to the size of the joint state space of all discrete parents. For polytree Conditional Linear Gaussian (CLG) Bayesian networks, DMP has the same computational requirements as the Junction Tree (JT) algorithm and provides the same exact solution. However, while JT can only work for the CLG model, DMP can be applied to general nonlinear, non-Gaussian hybrid models to produce approximate solutions using the unscented transformation and loopy propagation. Furthermore, we can scale the algorithm by restricting the number of mixture components in the messages. Empirically, we found that the approximation errors are relatively small, especially for nodes that are far away from

  17. Bayesian analysis of inflationary features in Planck and SDSS data

    NASA Astrophysics Data System (ADS)

    Benetti, Micol; Alcaniz, Jailson S.

    2016-07-01

    We perform a Bayesian analysis to study possible features in the primordial inflationary power spectrum of scalar perturbations. In particular, we analyze the possibility of detecting the imprint of these primordial features in the anisotropy temperature power spectrum of the cosmic microwave background (CMB) and also in the matter power spectrum P(k). We use the most recent CMB data provided by the Planck Collaboration and P(k) measurements from the 11th data release of the Sloan Digital Sky Survey. We focus our analysis on a class of potentials whose features are localized at different intervals of angular scales, corresponding to multipoles in the ranges 10 < ℓ < 60 (Oscill-1) and 150 < ℓ < 300 (Oscill-2). Our results show that one of the step potentials (Oscill-1) provides a better fit to the CMB data than does the featureless ΛCDM scenario, with moderate Bayesian evidence in favor of the former. Adding the P(k) data to the analysis weakens the evidence of the Oscill-1 potential relative to the standard model and strengthens the evidence of this latter scenario with respect to the Oscill-2 model.

  18. Implementation of a Bayesian Engine for Uncertainty Analysis

    SciTech Connect

    Leng Vang; Curtis Smith; Steven Prescott

    2014-08-01

    In probabilistic risk assessment, it is important to have an environment where analysts have access to a shared and secured high performance computing and a statistical analysis tool package. As part of the advanced small modular reactor probabilistic risk analysis framework implementation, we have identified the need for advanced Bayesian computations. However, in order to make this technology available to non-specialists, there is also a need for a simplified tool that allows users to author models and evaluate them within this framework. As a proof-of-concept, we have implemented an advanced open source Bayesian inference tool, OpenBUGS, within the browser-based cloud risk analysis framework that is under development at the Idaho National Laboratory. This development, the “OpenBUGS Scripter”, has been implemented as a client-side, visual, web-based integrated development environment for creating OpenBUGS language scripts. It depends on the shared server environment to execute the generated scripts and to transmit results back to the user. The visual models are in the form of linked diagrams, from which we automatically create the applicable OpenBUGS script that matches the diagram. These diagrams can be saved locally or stored on the server environment to be shared with other users.

  19. Joint modeling of survival and longitudinal non-survival data: current methods and issues. Report of the DIA Bayesian joint modeling working group.

    PubMed

    Lawrence Gould, A; Boye, Mark Ernest; Crowther, Michael J; Ibrahim, Joseph G; Quartey, George; Micallef, Sandrine; Bois, Frederic Y

    2015-06-30

    Explicitly modeling underlying relationships between a survival endpoint and processes that generate longitudinal measured or reported outcomes potentially could improve the efficiency of clinical trials and provide greater insight into the various dimensions of the clinical effect of interventions included in the trials. Various strategies have been proposed for using longitudinal findings to elucidate intervention effects on clinical outcomes such as survival. The application of specifically Bayesian approaches for constructing models that address longitudinal and survival outcomes explicitly has been recently addressed in the literature. We review currently available methods for carrying out joint analyses, including issues of implementation and interpretation, identify software tools that can be used to carry out the necessary calculations, and review applications of the methodology. PMID:24634327

  20. Analysis of magnetic field fluctuation thermometry using Bayesian inference

    NASA Astrophysics Data System (ADS)

    Wübbeler, G.; Schmähling, F.; Beyer, J.; Engert, J.; Elster, C.

    2012-12-01

    A Bayesian approach is proposed for the analysis of magnetic field fluctuation thermometry. The approach addresses the estimation of temperature from the measurement of a noise power spectrum as well as the analysis of previous calibration measurements. A key aspect is the reliable determination of uncertainties associated with the obtained temperature estimates, and the proposed approach naturally accounts for both the uncertainties in the calibration stage and the noise in the temperature measurement. Erlang distributions are employed to model the fluctuations of thermal noise power spectra and we show that such a procedure is justified in the light of the data. We describe in detail the Bayesian approach and briefly refer to Markov Chain Monte Carlo techniques used in the numerical calculation of the results. The MATLAB® software package we used for calculating our results is provided. The proposed approach is validated using magnetic field fluctuation power spectra recorded in the sub-kelvin region for which an independently determined reference temperature is available. As a result, the obtained temperature estimates were found to be fully consistent with the reference temperature.
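
    A sketch of the estimation step, assuming a hypothetical transfer function g(f) and treating each averaged spectral bin as Erlang distributed (gamma with integer shape) with mean proportional to temperature; the real calibration model is more involved, but the grid posterior below mirrors the paper's use of the Erlang likelihood.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)

        f = np.linspace(0.1, 100, 60)
        g = 1.0 / (1.0 + (f / 30.0) ** 2)   # invented calibration curve g(f)
        T_true, k = 0.25, 20                # temperature; k periodograms per bin

        # Simulated averaged noise power spectrum: Erlang with mean T * g(f).
        S = stats.gamma.rvs(k, scale=T_true * g / k, random_state=rng)

        # Grid posterior over T with a flat prior: product of Erlang likelihoods.
        T_grid = np.linspace(0.1, 0.5, 400)
        loglik = np.array([stats.gamma.logpdf(S, k, scale=T * g / k).sum()
                           for T in T_grid])
        post = np.exp(loglik - loglik.max())
        post /= post.sum()
        mean = np.sum(T_grid * post)
        sd = np.sqrt(np.sum((T_grid - mean) ** 2 * post))
        print(f"T = {mean:.3f} +/- {sd:.3f} (true value {T_true})")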

  1. Bayesian probability analysis for acoustic-seismic landmine detection

    NASA Astrophysics Data System (ADS)

    Xiang, Ning; Sabatier, James M.; Goggans, Paul M.

    2002-11-01

    Landmines buried in the subsurface induce distinct changes in the seismic vibration of the ground surface when an acoustic source insonifies the ground. A scanning laser Doppler vibrometer (SLDV) senses the acoustically-induced seismic vibration of the ground surface in a noncontact, remote manner. The SLDV-based acoustic-to-seismic coupling technology exhibits significant advantages over conventional sensors due to its capability for detecting both metal and nonmetal mines and its stand-off distance. The seismic vibration data scanned from the SLDV are preprocessed to form images. The detection of landmines relies primarily on an analysis of the target amplitude, size, shape, and frequency range. A parametric model has been established [Xiang and Sabatier, J. Acoust. Soc. Am. 110, 2740 (2001)] to describe the amplified surface vibration velocity induced by buried landmines within an appropriate frequency range. This model incorporates vibrational amplitude, size, position of landmines, and the background amplitude into a model-based analysis process in which Bayesian target detection and parameter estimation have been applied. Based on recent field measurement results, the landmine detection procedure within a Bayesian framework will be discussed. [Work supported by the United States Army Communications-Electronics Command, Night Vision and Electronic Sensors Directorate.]

  2. Inference algorithms and learning theory for Bayesian sparse factor analysis

    NASA Astrophysics Data System (ADS)

    Rattray, Magnus; Stegle, Oliver; Sharp, Kevin; Winn, John

    2009-12-01

    Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results with simulated data to the theoretical prediction. The results for MCMC agree closely with the theory as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations and the VB/EP algorithm then provides a very useful computationally efficient alternative.

  3. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    PubMed

    Hack, C Eric

    2006-04-17

    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate, and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least-squares approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach. PMID:16466842

  4. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    SciTech Connect

    Keselman, Dmitry; Tompkins, George H; Leishman, Deborah A

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e., the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.

  5. Bayesian modelling of the emission spectrum of the Joint European Torus Lithium Beam Emission Spectroscopy system.

    PubMed

    Kwak, Sehyun; Svensson, J; Brix, M; Ghim, Y-C

    2016-02-01

    A Bayesian model of the emission spectrum of the JET lithium beam has been developed to infer the intensity of the Li I (2p-2s) line radiation and associated uncertainties. The detected spectrum for each channel of the lithium beam emission spectroscopy system is here modelled by a single Li line modified by an instrumental function, Bremsstrahlung background, instrumental offset, and interference filter curve. Both the instrumental function and the interference filter curve are modelled with non-parametric Gaussian processes. All free parameters of the model, the intensities of the Li line, Bremsstrahlung background, and instrumental offset, are inferred using Bayesian probability theory with a Gaussian likelihood for photon statistics and electronic background noise. The prior distributions of the free parameters are chosen as Gaussians. Given these assumptions, the intensity of the Li line and corresponding uncertainties are analytically available using a Bayesian linear inversion technique. The proposed approach makes it possible to extract the intensity of the Li line without doing a separate background subtraction through modulation of the Li beam. PMID:26931843
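
    The closed-form step is standard Gaussian linear inversion: with a linear forward model, Gaussian noise, and Gaussian priors, the posterior mean and covariance are analytic. The sketch below uses an invented three-parameter forward matrix (line profile, linear background, constant offset), not the actual JET instrument model.

        import numpy as np

        rng = np.random.default_rng(11)

        # Invented forward model: data = A @ [line amp, background slope, offset].
        pix = np.linspace(-3, 3, 50)
        A = np.column_stack([np.exp(-0.5 * pix**2),   # stand-in line profile
                             pix,                     # linear background
                             np.ones(pix.size)])      # constant offset
        truth = np.array([10.0, 0.5, 2.0])
        noise_sd = 0.3
        y = A @ truth + rng.normal(0, noise_sd, pix.size)

        # Gaussian prior N(mu0, S0) plus Gaussian likelihood gives the posterior
        # in closed form; no sampling is needed.
        mu0 = np.zeros(3)
        S0_inv = np.diag([0.01, 0.01, 0.01])          # weak prior precision
        noise_prec = 1.0 / noise_sd**2

        S_post = np.linalg.inv(noise_prec * A.T @ A + S0_inv)
        mu_post = S_post @ (noise_prec * A.T @ y + S0_inv @ mu0)
        print("posterior means:", mu_post.round(2))
        print("posterior sds:  ", np.sqrt(np.diag(S_post)).round(3))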

  6. A Bayesian analysis of two probability models describing thunderstorm activity at Cape Kennedy, Florida

    NASA Technical Reports Server (NTRS)

    Williford, W. O.; Hsieh, P.; Carter, M. C.

    1974-01-01

    A Bayesian analysis of the two discrete probability models, the negative binomial and the modified negative binomial distributions, which have been used to describe thunderstorm activity at Cape Kennedy, Florida, is presented. The Bayesian approach with beta prior distributions is compared to the classical approach, which uses a moment method of estimation or a maximum-likelihood method. The accuracy and simplicity of the Bayesian method are demonstrated.
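
    The modified negative binomial is not reproduced here, but for the standard negative binomial with known shape r the beta prior is conjugate, and the update is a one-liner, as sketched below with invented counts.

        import numpy as np
        from scipy import stats

        # Hypothetical thunderstorm-event counts (failures before r successes).
        x = np.array([0, 2, 1, 0, 0, 3, 1, 0, 1, 2, 0, 0, 1, 0, 2])
        r = 2                      # known negative binomial shape
        alpha0, beta0 = 1.0, 1.0   # Beta prior on the success probability p

        # Conjugacy: posterior is Beta(alpha0 + n * r, beta0 + sum(x)).
        n = len(x)
        post = stats.beta(alpha0 + n * r, beta0 + x.sum())
        print(f"posterior mean of p: {post.mean():.3f}")
        print("95% credible interval:", post.interval(0.95))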

  7. Bayesian informative dropout model for longitudinal binary data with random effects using conditional and joint modeling approaches.

    PubMed

    Chan, Jennifer S K

    2016-05-01

    Dropouts are common in longitudinal studies. If the dropout probability depends on the missing observations at or after dropout, this type of dropout is called informative (or nonignorable) dropout (ID). Failure to accommodate such a dropout mechanism in the model will bias the parameter estimates. We propose a conditional autoregressive model for longitudinal binary data with an ID model such that the probabilities of positive outcomes as well as the dropout indicator on each occasion are logit linear in some covariates and outcomes. This model, which adopts a marginal model for outcomes and a conditional model for dropouts, is called a selection model. To allow for heterogeneity and clustering effects, the outcome model is extended to incorporate mixture and random effects. Lastly, the model is further extended to a novel model that models the outcome and dropout jointly such that their dependency is formulated through an odds ratio function. Parameters are estimated by a Bayesian approach implemented using the user-friendly Bayesian software WinBUGS. A methadone clinic dataset is analyzed to illustrate the proposed models. Results show that the treatment time effect is still significant but weaker after allowing for an ID process in the data. Finally, the effect of dropout on parameter estimates is evaluated through simulation studies. PMID:26467236

  8. Bayesian regression analysis of data with random effects covariates from nonlinear longitudinal measurements

    PubMed Central

    De la Cruz, Rolando; Meza, Cristian; Arribas-Gil, Ana; Carroll, Raymond J.

    2016-01-01

    Joint models for a wide class of response variables and longitudinal measurements consist of a mixed-effects model to fit longitudinal trajectories whose random effects enter as covariates in a generalized linear model for the primary response. They provide a useful way to assess association between these two kinds of data, which in clinical studies are often collected jointly on a series of individuals, and may help in understanding, for instance, the mechanisms of recovery of a certain disease or the efficacy of a given therapy. When a nonlinear mixed-effects model is used to fit the longitudinal trajectories, the existing estimation strategies based on likelihood approximations have been shown to exhibit some computational efficiency problems (De la Cruz et al., 2011). In this article we consider a Bayesian estimation procedure for the joint model with a nonlinear mixed-effects model for the longitudinal data and a generalized linear model for the primary response. The proposed prior structure allows for the implementation of an MCMC sampler. Moreover, we consider that the errors in the longitudinal model may be correlated. We apply our method to the analysis of hormone levels measured at the early stages of pregnancy that can be used to predict normal versus abnormal pregnancy outcomes. We also conduct a simulation study to assess the importance of modelling correlated errors and quantify the consequences of model misspecification. PMID:27274601

  9. A Bayesian Framework for Reliability Analysis of Spacecraft Deployments

    NASA Technical Reports Server (NTRS)

    Evans, John W.; Gallo, Luis; Kaminsky, Mark

    2012-01-01

    Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a two stage sequential Bayesian framework for reliability estimation of spacecraft deployment was developed for this purpose. This process was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the Optical Telescope Element. Initially, detailed studies of NASA deployment history, "heritage information", were conducted, extending over 45 years of spacecraft launches. This information was then coupled to a non-informative prior and a binomial likelihood function to create a posterior distribution for deployments of various subsystems using Markov chain Monte Carlo sampling. Select distributions were then coupled to a subsequent analysis, using test data and anomaly occurrences on successive ground test deployments of scale model test articles of JWST hardware, to update the NASA heritage data. This allowed for a realistic prediction for the reliability of the complex Sunshield deployment, with credibility limits, within this two stage Bayesian framework.
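
    A compressed sketch of the two-stage idea using conjugate beta-binomial updating in place of the paper's MCMC, with invented heritage and test counts: the posterior from the heritage stage becomes the prior for the ground-test stage.

        from scipy import stats

        # Stage 1: hypothetical heritage record, 52 successes in 54 deployments,
        # updating a noninformative Jeffreys Beta(0.5, 0.5) prior.
        a, b = 0.5 + 52, 0.5 + (54 - 52)

        # Stage 2: hypothetical ground tests, 7 clean deployments in 8 trials
        # of the scale-model article, updating the heritage posterior.
        a, b = a + 7, b + (8 - 7)

        post = stats.beta(a, b)
        print(f"deployment reliability estimate: {post.mean():.3f}")
        print("90% credible interval:", post.interval(0.90))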

  10. Bayesian Models for fMRI Data Analysis

    PubMed Central

    Zhang, Linlin; Guindani, Michele; Vannucci, Marina

    2015-01-01

    Functional magnetic resonance imaging (fMRI), a noninvasive neuroimaging method that provides an indirect measure of neuronal activity by detecting blood flow changes, has experienced an explosive growth in the past years. Statistical methods play a crucial role in understanding and analyzing fMRI data. Bayesian approaches, in particular, have shown great promise in applications. A remarkable feature of fully Bayesian approaches is that they allow a flexible modeling of spatial and temporal correlations in the data. This paper provides a review of the most relevant models developed in recent years. We divide methods according to the objective of the analysis. We start from spatio-temporal models for fMRI data that detect task-related activation patterns. We then address the very important problem of estimating brain connectivity. We also touch upon methods that focus on making predictions of an individual's brain activity or a clinical or behavioral response. We conclude with a discussion of recent integrative models that aim at combining fMRI data with other imaging modalities, such as EEG/MEG and DTI data, measured on the same subjects. We also briefly discuss the emerging field of imaging genetics. PMID:25750690

  11. Bayesian robust analysis for genetic architecture of quantitative traits

    PubMed Central

    Yang, Runqing; Wang, Xin; Li, Jian; Deng, Hongwen

    2009-01-01

    Motivation: In most quantitative trait locus (QTL) mapping studies, phenotypes are assumed to follow normal distributions. Deviations from this assumption may affect the accuracy of QTL detection and lead to detection of spurious QTLs. To improve the robustness of QTL mapping methods, we replaced the normal distribution for residuals in multiple interacting QTL models with the normal/independent distributions that are a class of symmetric and long-tailed distributions and are able to accommodate residual outliers. Subsequently, we developed a Bayesian robust analysis strategy for dissecting genetic architecture of quantitative traits and for mapping genome-wide interacting QTLs in line crosses. Results: Through computer simulations, we showed that our strategy had a similar power for QTL detection compared with traditional methods assuming normal-distributed traits, but had a substantially increased power for non-normal phenotypes. When this strategy was applied to a group of traits associated with physical/chemical characteristics and quality in rice, more main and epistatic QTLs were detected than traditional Bayesian model analyses under the normal assumption. Contact: runqingyang@sjtu.edu.cn; dengh@umkc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18974168

  12. Bayesian imperfect information analysis for clinical recurrent data

    PubMed Central

    Chang, Chih-Kuang; Chang, Chi-Chang

    2015-01-01

    In medical research, clinical practice must often be undertaken with imperfect information from limited resources. This study applied Bayesian imperfect-information value analysis to a clinical decision-making problem for recurrent events, producing likelihood functions and posterior distributions for realistic situations. Three kinds of failure models are considered, and our methods are illustrated with an analysis of imperfect information from a trial of immunotherapy in the treatment of chronic granulomatous disease. In addition, we present evidence toward a better understanding of the differing behaviors along with concomitant variables. Based on the results of simulations, the imperfect-information value of the concomitant variables was evaluated and different realistic situations were compared to see which could yield more accurate results for medical decision-making. PMID:25565853

  13. Risk analysis of dust explosion scenarios using Bayesian networks.

    PubMed

    Yuan, Zhi; Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2015-02-01

    In this study, a methodology has been proposed for risk analysis of dust explosion scenarios based on Bayesian networks. Our methodology also benefits from a bow-tie diagram to better represent the logical relationships existing among contributing factors and consequences of dust explosions. The risks of dust explosion scenarios are evaluated, taking into account common cause failures and dependencies among root events and possible consequences. Using a diagnostic analysis, dust particle properties, oxygen concentration, and safety training of staff are identified as the most critical root events leading to dust explosions. The probability adaptation concept is also used for sequential updating and thus learning from past dust explosion accidents, which is of great importance in dynamic risk assessment and management. We also apply the proposed methodology to a case study to model dust explosion scenarios, to estimate the envisaged risks, and to identify the vulnerable parts of the system that need additional safety measures. PMID:25264172

  14. Bayesian Model Selection in 'Big Data' Spectral Analysis

    NASA Astrophysics Data System (ADS)

    Fischer, Travis C.; Crenshaw, D. Michael; Baron, Fabien; Kloppenborg, Brian K.; Pope, Crystal L.

    2015-01-01

    As IFU observations and large spectral surveys continue to become more prevalent, the handling of thousands of spectra has become commonplace. Astronomers look at objects with increasingly complex emission-line structures, so establishing a method that easily allows for multiple-component analysis of these features in an automated fashion would be of great use to the community. Already used in exoplanet detection and interferometric image reconstruction, we present a new application of Bayesian model selection in 'big data' spectral analysis. With this technique, multiple emission-line components are fitted in an automated fashion while the correct number of components in each spectrum is determined simultaneously, streamlining the line measurements for a large number of spectra into a single process.
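
    A minimal version of the idea, with the BIC standing in for the Bayesian evidence: fit models with an increasing number of Gaussian components and keep the count the criterion prefers. The line profile and the initial guesses are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(9)

        # Hypothetical emission-line profile built from two blended components.
        v = np.linspace(-600, 600, 200)
        flux = (1.0 * np.exp(-0.5 * ((v + 120) / 80) ** 2)
                + 0.6 * np.exp(-0.5 * ((v - 150) / 60) ** 2)
                + rng.normal(0, 0.03, v.size))

        def model(v, *p):
            # Sum of Gaussians; p holds (amplitude, centre, width) per component.
            return sum(p[i] * np.exp(-0.5 * ((v - p[i + 1]) / p[i + 2]) ** 2)
                       for i in range(0, len(p), 3))

        def bic(n_comp):
            p0 = [0.8, -100, 90, 0.5, 100, 70][:3 * n_comp]
            popt, _ = curve_fit(model, v, flux, p0=p0)
            resid = flux - model(v, *popt)
            return v.size * np.log(np.mean(resid**2)) + 3 * n_comp * np.log(v.size)

        scores = {k: bic(k) for k in (1, 2)}
        print("BIC per component count:", {k: round(s, 1) for k, s in scores.items()})
        print("selected number of components:", min(scores, key=scores.get))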

  15. Structural dynamic analysis of a ball joint

    NASA Astrophysics Data System (ADS)

    Hwang, Seok-Cheol; Lee, Kwon-Hee

    2012-11-01

    A ball joint is a rotating and swiveling element that is typically installed at the interface between two parts. In an automobile, the ball joint is the component that connects the control arms to the steering knuckle. The ball joint can also be installed in linkage systems for motion control applications. This paper describes a simulation strategy for ball joint analysis that takes the manufacturing process into account. The manufacturing process can be divided into plugging and spinning. The response of interest is the stress distribution generated between the ball and its bearing. In this paper, the commercial code NX DAFUL, which uses an implicit integration method, is introduced to calculate the response. In addition, a gap analysis is performed to investigate the fit, focusing on the displacement of the ball stud. An optimum design is also suggested through case studies.

  16. Geometrically nonlinear analysis of adhesively bonded joints

    NASA Technical Reports Server (NTRS)

    Dattaguru, B.; Everett, R. A., Jr.; Whitcomb, J. D.; Johnson, W. S.

    1982-01-01

    A geometrically nonlinear finite element analysis of cohesive failure in typical joints is presented. Cracked-lap-shear joints were chosen for analysis. Results obtained from linear and nonlinear analysis show that nonlinear effects, due to large rotations, significantly affect the calculated mode 1, crack opening, and mode 2, inplane shear, strain-energy-release rates. The ratio of the mode 1 to mode 2 strain-energy-release rates (G1/G2) was found to be strongly affected by the adhesive modulus and the adherend thickness. Ratios between 0.2 and 0.8 can be obtained by varying adherend thickness and using either a single or double cracked-lap-shear specimen configuration. Debond growth rate data, together with the analysis, indicate that the mode 1 strain-energy-release rate governs debond growth. Results from the present analysis agree well with experimentally measured joint opening displacements.

  17. Bayesian Analysis of Peak Ground Acceleration Attenuation Relationship

    SciTech Connect

    Mu Heqing; Yuen Kaveng

    2010-05-21

    Estimation of peak ground acceleration is one of the main issues in civil and earthquake engineering practice. The Boore-Joyner-Fumal empirical formula is well known for this purpose. In this paper we propose to use the Bayesian probabilistic model class selection approach to obtain the most suitable prediction model class for the seismic attenuation formula. The optimal model class is robust in the sense that it has balance between the data fitting capability and the sensitivity to noise. A database of strong-motion records is utilized for the analysis. It turns out that the optimal model class is simpler than the full order attenuation model suggested by Boore, Joyner and Fumal (1993).

  18. BASE-9: Bayesian Analysis for Stellar Evolution with nine variables

    NASA Astrophysics Data System (ADS)

    Robinson, Elliot; von Hippel, Ted; Stein, Nathan; Stenning, David; Wagner-Kaiser, Rachel; Si, Shijing; van Dyk, David

    2016-08-01

    The BASE-9 (Bayesian Analysis for Stellar Evolution with nine variables) software suite recovers star cluster and stellar parameters from photometry and is useful for analyzing single-age, single-metallicity star clusters, binaries, or single stars, and for simulating such systems. BASE-9 uses a Markov chain Monte Carlo (MCMC) technique along with brute force numerical integration to estimate the posterior probability distribution for the age, metallicity, helium abundance, distance modulus, line-of-sight absorption, and parameters of the initial-final mass relation (IFMR) for a cluster, and for the primary mass, secondary mass (if a binary), and cluster probability for every potential cluster member. The MCMC technique is used for the cluster quantities (the first six items listed above) and numerical integration is used for the stellar quantities (the last three items in the above list).

  19. Bayesian Library for the Analysis of Neutron Diffraction Data

    NASA Astrophysics Data System (ADS)

    Ratcliff, William; Lesniewski, Joseph; Quintana, Dylan

    During this talk, I will introduce the Bayesian Library for the Analysis of Neutron Diffraction Data. In this library we use the DREAM algorithm to effectively sample parameter space. This offers several advantages over traditional least squares fitting approaches. It gives us more robust estimates of the fitting parameters, their errors, and their correlations. It is also more stable than least squares methods and provides more confidence in finding a global minimum. I will discuss the algorithm and its application to several materials. I will show applications to both structural and magnetic diffraction patterns. I will present examples of fitting both powder and single crystal data. We would like to acknowledge support from the Department of Commerce and the NSF.

  20. Testing Hardy-Weinberg equilibrium: an objective Bayesian analysis.

    PubMed

    Consonni, Guido; Moreno, Elías; Venturini, Sergio

    2011-01-15

    We analyze the general (multiallelic) Hardy-Weinberg equilibrium problem from an objective Bayesian testing standpoint. We argue that for small or moderate sample sizes the answer is rather sensitive to the prior chosen, and this suggests carrying out a sensitivity analysis with respect to the prior. This goal is achieved through the identification of a class of priors specifically designed for this testing problem. In this paper, we consider the class of intrinsic priors under the full model, indexed by a tuning quantity, the training sample size. These priors are objective, satisfy Savage's continuity condition and have proved to behave extremely well for many statistical testing problems. We compute the posterior probability of the Hardy-Weinberg equilibrium model for the class of intrinsic priors, assess robustness over the range of plausible answers, as well as stability of the decision in favor of either hypothesis. PMID:20963736

  1. Modal analysis of jointed structures

    NASA Astrophysics Data System (ADS)

    Quinn, D. Dane

    2012-01-01

    Structural systems are often composed of multiple components joined together at localized interfaces. Compared to a corresponding monolithic system these interfaces are designed to have little influence on the load carrying capability of the system, and the resulting change in the overall system mass and stiffness is minimal. Hence, under nominal operating conditions the mode shapes and frequencies of the dominant structural modes are relatively insensitive to the presence of the interfaces. However, the energy dissipation in such systems is strongly dependent on the joints. The microslip that occurs at each interface couples together the structural modes of the system and introduces nonlinear damping into the system, effectively altering the observed damping of the structural modes, which can then significantly alter the amplitude of the response at the resonant modal frequencies. This work develops equations of motion for a jointed structure in terms of the structural modal coordinates and implements a reduced-order description of the microslip that occurs at the interface between components. The interface is incorporated into the modal description of the system through an existing decomposition of a series-series Iwan interface model and a continuum approximation for microslip of an elastic rod. The developed framework is illustrated on several examples, including a discrete three degree-of-freedom system as well as the longitudinal deformation of a continuum beam.

  2. A Bayesian Seismic Hazard Analysis for the city of Naples

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years, many studies have focused on determining and defining the seismic, volcanic, and tsunamigenic hazard in the city of Naples. The reason is that the town of Naples with its neighboring area is one of the most densely populated places in Italy. In addition, the risk is increased by the type and condition of buildings and monuments in the city. It is therefore crucial to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present). An active tectonic structure has been associated with each seismic event. Furthermore, a set of active faults located around the study area, well known from geological investigations but not associated with any recorded earthquake, that could nevertheless shake the city has been taken into account in our study. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure to include the information from past earthquakes in the probabilistic seismic hazard analysis. This strategy allows us, on the one hand, to enlarge the information used in the evaluation of the hazard, from alternative models for the earthquake generation process to past shaking, and, on the other hand, to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times, e.g., 50, 10 and 5 years, on a dense grid that covers the municipality of Naples, considering bedrock soil

  3. Discrete Dynamic Bayesian Network Analysis of fMRI Data

    PubMed Central

    Burge, John; Lane, Terran; Link, Hamilton; Qiu, Shibin; Clark, Vincent P.

    2010-01-01

    We examine the efficacy of using discrete Dynamic Bayesian Networks (dDBNs), a data-driven modeling technique employed in machine learning, to identify functional correlations among neuroanatomical regions of interest. Unlike many neuroimaging analysis techniques, this method is not limited by linear and/or Gaussian noise assumptions. It achieves this by modeling the time series of neuroanatomical regions as discrete, as opposed to continuous, random variables with multinomial distributions. We demonstrate this method using an fMRI dataset collected from healthy and demented elderly subjects and identify correlates based on a diagnosis of dementia. The results are validated in three ways. First, the elicited correlates are shown to be robust over leave-one-out cross-validation and, via a Fourier bootstrapping method, unlikely to be due to random chance. Second, the dDBNs identified correlates that would be expected given the experimental paradigm. Third, the dDBN's ability to predict dementia is competitive with two commonly employed machine-learning classifiers: the support vector machine and the Gaussian naïve Bayesian network. We also verify that the dDBN selects correlates based on non-linear criteria. Finally, we provide a brief analysis of the correlates elicited from Buckner et al.'s data that suggests that demented elderly subjects have reduced involvement of entorhinal and occipital cortex and greater involvement of the parietal lobe and amygdala in brain activity compared with healthy elderly (as measured via functional correlations among BOLD measurements). Limitations and extensions to the dDBN method are discussed. PMID:17990301

  4. Multiple SNP Set Analysis for Genome-Wide Association Studies Through Bayesian Latent Variable Selection.

    PubMed

    Lu, Zhao-Hua; Zhu, Hongtu; Knickmeyer, Rebecca C; Sullivan, Patrick F; Williams, Stephanie N; Zou, Fei

    2015-12-01

    The power of genome-wide association studies (GWAS) for mapping complex traits with single-SNP analysis (where SNP is single-nucleotide polymorphism) may be undermined by modest SNP effect sizes, unobserved causal SNPs, correlation among adjacent SNPs, and SNP-SNP interactions. Alternative approaches for testing the association between a single SNP set and individual phenotypes have been shown to be promising for improving the power of GWAS. We propose a Bayesian latent variable selection (BLVS) method to simultaneously model the joint association mapping between a large number of SNP sets and complex traits. Compared with single SNP set analysis, such joint association mapping not only accounts for the correlation among SNP sets but also is capable of detecting causal SNP sets that are marginally uncorrelated with traits. The spike-and-slab prior assigned to the effects of SNP sets can greatly reduce the dimension of effective SNP sets, while speeding up computation. An efficient Markov chain Monte Carlo algorithm is developed. Simulations demonstrate that BLVS outperforms several competing variable selection methods in some important scenarios. PMID:26515609
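
    Not the BLVS algorithm itself, but a minimal single-site Gibbs sampler for a spike-and-slab linear model with fixed hyperparameters, run on simulated data, makes the prior's variable-selection behavior concrete: each sweep flips the inclusion indicators according to their conditional Bayes factors.

        import numpy as np

        rng = np.random.default_rng(2)

        # Simulated data: 100 subjects, 10 predictors, only 0 and 3 are causal.
        n, p = 100, 10
        X = rng.normal(size=(n, p))
        y = 1.5 * X[:, 0] - 1.0 * X[:, 3] + rng.normal(0, 1, n)

        sigma2, tau2, pi = 1.0, 1.0, 0.2   # noise var, slab var, prior inclusion
        beta = np.zeros(p)
        gamma = np.zeros(p, dtype=bool)
        inclusion = np.zeros(p)

        for it in range(2000):
            for j in range(p):
                # Partial residual with coefficient j removed.
                r = y - X @ beta + X[:, j] * beta[j]
                a = X[:, j] @ X[:, j] / sigma2 + 1.0 / tau2
                b = X[:, j] @ r / sigma2
                # Bayes factor of slab versus spike for this coefficient.
                log_bf = -0.5 * np.log(tau2 * a) + 0.5 * b**2 / a
                prob = 1.0 / (1.0 + (1 - pi) / pi * np.exp(np.clip(-log_bf, -50, 50)))
                gamma[j] = rng.uniform() < prob
                beta[j] = rng.normal(b / a, 1 / np.sqrt(a)) if gamma[j] else 0.0
            if it >= 500:   # accumulate after burn-in
                inclusion += gamma

        print("posterior inclusion probabilities:", (inclusion / 1500).round(2))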

  6. RECONSTRUCTING EXPOSURE SCENARIOS USING DOSE BIOMARKERS - AN APPLICATION OF BAYESIAN UNCERTAINTY ANALYSIS

    EPA Science Inventory

    We use Bayesian uncertainty analysis to explore how to estimate pollutant exposures from biomarker concentrations. The growing number of national databases with exposure data makes such an analysis possible. They contain datasets of pharmacokinetic biomarkers for many polluta...

  7. UNSUPERVISED TRANSIENT LIGHT CURVE ANALYSIS VIA HIERARCHICAL BAYESIAN INFERENCE

    SciTech Connect

    Sanders, N. E.; Soderberg, A. M.; Betancourt, M.

    2015-02-10

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST.
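    Schematically, a population-level light curve model of this kind has a two-level form; the sketch below is generic, and the paper's Type IIP parameterization is richer.

```latex
\theta_i \sim \mathcal{N}(\mu, \Sigma), \qquad
y_{ij} \mid \theta_i \sim \mathcal{N}\!\big(f(t_{ij}; \theta_i),\, \sigma_{ij}^2\big)
```

    Here θ_i collects the light curve parameters of SN i, f is the light curve model evaluated at the observation times t_ij, and the hyperparameters (μ, Σ) describe the population. Hamiltonian Monte Carlo samples all θ_i and (μ, Σ) jointly, which is how sparsely observed objects borrow strength from the well-observed ones.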

  8. Unsupervised Transient Light Curve Analysis via Hierarchical Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Sanders, N. E.; Betancourt, M.; Soderberg, A. M.

    2015-02-01

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST.

  9. Multivariate meta-analysis of mixed outcomes: a Bayesian approach.

    PubMed

    Bujkiewicz, Sylwia; Thompson, John R; Sutton, Alex J; Cooper, Nicola J; Harrison, Mark J; Symmons, Deborah P M; Abrams, Keith R

    2013-09-30

    Multivariate random effects meta-analysis (MRMA) is an appropriate way of synthesizing data from studies reporting multiple correlated outcomes. In a Bayesian framework, it has great potential for integrating evidence from a variety of sources. In this paper, we propose a Bayesian model for MRMA of mixed outcomes, which extends previously developed bivariate models to the trivariate case and also allows for combination of multiple outcomes that are both continuous and binary. We have constructed informative prior distributions for the correlations by using external evidence. Prior distributions for the within-study correlations were constructed by employing external individual patient data and using a double bootstrap method to obtain the correlations between mixed outcomes. The between-study model of MRMA was parameterized in the form of a product of a series of univariate conditional normal distributions. This allowed us to place explicit prior distributions on the between-study correlations, which were constructed using external summary data. Traditionally, independent 'vague' prior distributions are placed on all parameters of the model. In contrast to this approach, we constructed prior distributions for the between-study model parameters in a way that takes into account the inter-relationship between them. This is a flexible method that can be extended to incorporate mixed outcomes other than continuous and binary, and beyond the trivariate case. We have applied this model to a motivating example in rheumatoid arthritis, with the aim of incorporating all available evidence in the synthesis and potentially reducing uncertainty around the estimate of interest. PMID:23630081
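    The product-of-conditionals parameterization of the between-study model takes the following generic trivariate form (our notation, not necessarily the paper's):

```latex
p(\delta_1, \delta_2, \delta_3) \;=\; p(\delta_1)\, p(\delta_2 \mid \delta_1)\, p(\delta_3 \mid \delta_1, \delta_2)
```

    with each factor a univariate normal whose mean is linear in the conditioning effects. The regression slopes and residual variances of these conditionals map one-to-one onto the between-study correlations and variances, which is what makes it possible to place explicit, externally informed priors on the correlations.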

  10. A Bayesian latent group analysis for detecting poor effort in the assessment of malingering.

    PubMed

    Ortega, Alonso; Wagenmakers, Eric-Jan; Lee, Michael D; Markowitsch, Hans J; Piefke, Martina

    2012-06-01

    Despite their theoretical appeal, Bayesian methods for the assessment of poor effort and malingering are still rarely used in neuropsychological research and clinical diagnosis. In this article, we outline a novel and easy-to-use Bayesian latent group analysis of malingering whose goal is to identify participants displaying poor effort when tested. Our Bayesian approach also quantifies the confidence with which each participant is classified and estimates the base rates of malingering from the observed data. We implement our Bayesian approach and compare its utility in effort assessment to that of the classic below-chance criterion of symptom validity testing (SVT). In two experiments, we evaluate the accuracy of both a Bayesian latent group analysis and the below-chance criterion of SVT in recovering the membership of participants assigned to the malingering group. Experiment 1 uses a simulation research design, whereas Experiment 2 involves the differentiation of patients with a history of stroke from coached malingerers. In both experiments, sensitivity levels are high for the Bayesian method, but low for the below-chance criterion of SVT. Additionally, the Bayesian approach proves to be resistant to possible effects of coaching. We conclude that Bayesian latent group methods complement existing methods in making more informed choices about malingering. PMID:22543568
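    The latent-group idea can be illustrated with a toy two-component binomial mixture on forced-choice scores; all numbers below are made up, and this is not the authors' full model, which additionally places priors on the base rate and group-level parameters.

```python
import numpy as np
from scipy.stats import binom

n_items = 40
scores = np.array([35, 33, 12, 21, 38])  # hypothetical forced-choice scores

# Two latent groups with illustrative accuracy rates: honest responders and
# malingerers suppressing performance toward (or below) chance.
p_honest, p_maling = 0.85, 0.45
base_rate = 0.5                           # prior probability of malingering

lik_h = binom.pmf(scores, n_items, p_honest)
lik_m = binom.pmf(scores, n_items, p_maling)
post_maling = base_rate * lik_m / (base_rate * lik_m + (1 - base_rate) * lik_h)
print(post_maling.round(3))               # per-participant classification confidence
```

    The posterior membership probabilities quantify the confidence of each classification; in the full hierarchical version the base rate is itself estimated from the observed data rather than fixed.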

  11. Analysis of minor fractures associated with joints and faulted joints

    NASA Astrophysics Data System (ADS)

    Cruikshank, Kenneth M.; Zhao, Guozhu; Johnson, Arvid M.

    In this paper, we use fracture mechanics to interpret conditions responsible for secondary cracks that adorn joints and faulted joints in the Entrada Sandstone in Arches National Park, U.S.A. Because the joints in most places accommodated shearing offsets of a few mm to perhaps 1 dm, and thus became faulted joints, some of the minor cracks are due to faulting. However, in a few places where the shearing was zero, one can examine minor cracks due solely to interaction of joint segments at the time they formed. We recognize several types of minor cracks associated with subsequent faulting of the joints. One is the kink, a crack that occurs at the termination of a straight joint and whose trend is abruptly different from that of the joint. Kinks are common and should be studied because they contain a great deal of information about conditions during fracturing. The sense of kinking indicates the sense of shear during faulting: a kink that turns clockwise with respect to the direction of the main joint is a result of right-lateral shear, and a kink that turns counterclockwise is a result of left-lateral shear. Furthermore, the kink angle is related to the ratio of the shear stress responsible for the kinking to the normal stress responsible for the opening of the joint. The amount of opening of a joint at the time it faulted or even at the time the joint itself formed can be estimated by measuring the kink angle and the amount of strike-slip at some point along the faulted joint. Other fractures that form near terminations of pre-existing joints in response to shearing along the joint are horsetail fractures. Similar short fractures can occur anywhere along the length of the joints. The primary value in recognizing these fractures is that they indicate the sense of faulting accommodated by the host fracture and the direction of maximum tension. Even where there has been insignificant regional shearing in the Garden Area, the joints can have ornate terminations. Perhaps

  12. Highly efficient Bayesian joint inversion for receiver based data and its application to lithospheric structure beneath the southern Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Kim, Seongryong; Dettmer, Jan; Rhie, Junkee; Tkalčić, Hrvoje

    2016-04-01

    With the deployment of extensive seismic arrays, systematic and efficient parameter and uncertainty estimation is of increasing importance and can provide reliable, regional models for crustal and upper-mantle structure. We present an efficient Bayesian method for the joint inversion of surface-wave dispersion and receiver-function data that combines trans-dimensional (trans-D) model selection in an optimisation phase with subsequent rigorous parameter uncertainty estimation. Parameter and uncertainty estimation depend strongly on the chosen parametrization such that meaningful regional comparison requires quantitative model selection that can be carried out efficiently at several sites. While significant progress has been made for model selection (e.g. trans-D inference) at individual sites, the lack of efficiency can prohibit application to large data volumes or cause questionable results due to lack of convergence. Studies that address large numbers of data sets have mostly ignored model selection in favour of more efficient/simple estimation techniques (i.e. focusing on uncertainty estimation but employing ad-hoc model choices). Our approach consists of a two-phase inversion that combines trans-D optimisation to select the most probable parametrization with subsequent Bayesian sampling for uncertainty estimation given that parametrization. The trans-D optimisation is implemented here by replacing the likelihood function with the Bayesian information criterion (BIC). The BIC provides constraints on model complexity that facilitate the search for an optimal parametrization. Parallel tempering (PT) is applied as an optimisation algorithm. After optimisation, the optimal model choice is identified by the minimum BIC value from all PT chains. Uncertainty estimation is then carried out in fixed dimension. Data errors are estimated as part of the inference problem by a combination of empirical and hierarchical estimation. Data covariance matrices are estimated from
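    The BIC that stands in for the likelihood during the optimisation phase is the standard penalized-fit criterion; the minimal illustration below selects between two nested Gaussian parametrizations and is unrelated to the seismic forward problem.

```python
import numpy as np

def bic(loglik, k, n):
    """Bayesian information criterion, k*ln(n) - 2*loglik; lower is better."""
    return k * np.log(n) - 2.0 * loglik

rng = np.random.default_rng(1)
y = rng.normal(2.0, 1.0, size=500)

sig0 = np.mean(y**2)                 # model A: mean fixed at 0, variance free (k = 1)
ll_a = -0.5 * y.size * (np.log(2 * np.pi * sig0) + 1.0)
sig1 = y.var()                       # model B: mean and variance both free (k = 2)
ll_b = -0.5 * y.size * (np.log(2 * np.pi * sig1) + 1.0)

print(bic(ll_a, 1, y.size), bic(ll_b, 2, y.size))  # the lower BIC wins (model B here)
```

    In the two-phase inversion the analogous comparison runs over trans-dimensional parametrizations inside parallel-tempering chains, with the minimum-BIC model then fixed for the uncertainty-estimation phase.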

  13. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk- and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. The document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods needed to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision-making environment sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  14. BEAST 2: A Software Platform for Bayesian Evolutionary Analysis

    PubMed Central

    Bouckaert, Remco; Heled, Joseph; Kühnert, Denise; Vaughan, Tim; Wu, Chieh-Hsi; Xie, Dong; Suchard, Marc A.; Rambaut, Andrew; Drummond, Alexei J.

    2014-01-01

    We present a new open source, extensible and flexible software platform for Bayesian evolutionary analysis called BEAST 2. This software platform is a re-design of the popular BEAST 1 platform to correct structural deficiencies that became evident as the BEAST 1 software evolved. Key among those deficiencies was the lack of post-deployment extensibility. BEAST 2 now has a fully developed package management system that allows third-party developers to write additional functionality that can be installed directly into the BEAST 2 analysis platform via a package manager, without requiring a new software release of the platform. This package architecture is showcased with a number of recently published new models encompassing birth-death-sampling tree priors, phylodynamics and model averaging for substitution models and site partitioning. A second major improvement is the ability to read/write the entire state of the MCMC chain to/from disk, allowing it to be easily shared between multiple instances of the BEAST software. This facilitates checkpointing and better support for multi-processor and high-end computing extensions. Finally, the functionality in new packages can be easily added to the user interface (BEAUti 2) by a simple XML template-based mechanism, because BEAST 2 has been re-designed to provide greater integration between the analysis engine and the user interface so that, for example, BEAST and BEAUti use exactly the same XML file format. PMID:24722319

  15. Bayesian Model Selection with Network Based Diffusion Analysis.

    PubMed

    Whalen, Andrew; Hoppitt, William J E

    2016-01-01

    A number of recent studies have used Network Based Diffusion Analysis (NBDA) to detect the role of social transmission in the spread of a novel behavior through a population. In this paper we present a unified framework for performing NBDA in a Bayesian setting, and demonstrate how the Watanabe-Akaike information criterion (WAIC) can be used for model selection. We present a specific example of applying this method to Time to Acquisition Diffusion Analysis (TADA). To examine the robustness of this technique, we performed a large-scale simulation study and found that NBDA using WAIC could recover the correct model of social transmission under a wide range of cases, including in the presence of random effects, individual-level variables, and alternative models of social transmission. This work suggests that NBDA is an effective and widely applicable tool for uncovering whether social transmission underpins the spread of a novel behavior, and may still provide accurate results even when key model assumptions are relaxed. PMID:27092089
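    WAIC can be computed directly from the matrix of pointwise log-likelihoods that an MCMC fit exposes; the sketch below is generic (the array shapes and fake draws are illustrative, independent of any NBDA package).

```python
import numpy as np
from scipy.special import logsumexp

def waic(loglik):
    """WAIC from an S x N matrix of pointwise log-likelihoods (S posterior draws)."""
    S = loglik.shape[0]
    lppd = np.sum(logsumexp(loglik, axis=0) - np.log(S))  # log pointwise pred. density
    p_waic = np.sum(np.var(loglik, axis=0, ddof=1))       # effective parameter count
    return -2.0 * (lppd - p_waic)                         # lower = better expected fit

rng = np.random.default_rng(2)
print(waic(rng.normal(-1.0, 0.1, size=(1000, 50))))       # smoke test on fake draws
```

    Model selection then amounts to fitting each candidate transmission model and preferring the one with the smallest WAIC.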

  16. New Ephemeris for LSI+61 303, A Bayesian Analysis

    NASA Astrophysics Data System (ADS)

    Gregory, P. C.

    1997-12-01

    The luminous early-type binary LSI+61 303 is an interesting radio and X-ray source, and a possible gamma-ray source. At radio wavelengths it exhibits periodic outbursts with an approximate period of 26.5 days, as well as a longer-term modulation of the outburst peaks of approximately 4 years. Recently, from an analysis of RXTE all-sky monitoring data, Paredes et al. found evidence that the X-ray outbursts are very likely to recur with the same period as the radio outbursts. The system has been observed by many groups at all wavelengths, but the energy source powering the radio outbursts and their relation to the high-energy emission remain a mystery. For more details see the "LSI+61 303 Resource Page" at http://www.srl.caltech.edu/personnel/paulr/lsi.html . There has been increasing evidence for a change in the period of the system. We will present a new ephemeris for the system based on a Bayesian analysis of 20 years of radio observations, including the GBI-NASA radio monitoring data.

  17. Microcanonical thermostatistics analysis without histograms: Cumulative distribution and Bayesian approaches

    NASA Astrophysics Data System (ADS)

    Alves, Nelson A.; Morero, Lucas D.; Rizzi, Leandro G.

    2015-06-01

    Microcanonical thermostatistics analysis has become an important tool to reveal essential aspects of phase transitions in complex systems. An efficient way to estimate the microcanonical inverse temperature β(E) and the microcanonical entropy S(E) is achieved with the statistical temperature weighted histogram analysis method (ST-WHAM). The strength of this method lies in its flexibility, as it can be used to analyse data produced by algorithms with generalised sampling weights. However, for any sampling weight, ST-WHAM requires the calculation of derivatives of energy histograms H(E), which leads to non-trivial and tedious binning tasks for models with a continuous energy spectrum, such as those for biomolecular and colloidal systems. Here, we discuss two alternative methods that avoid the need for such energy binning to obtain continuous estimates for H(E) in order to evaluate β(E) by using ST-WHAM: (i) a series expansion to estimate probability densities from the empirical cumulative distribution function (CDF), and (ii) a Bayesian approach to model this CDF. Comparison with a simple linear regression method is also carried out. The performance of these approaches is evaluated considering coarse-grained protein models for folding and peptide aggregation.
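    For reference, the microcanonical quantities at issue are (with k_B = 1):

```latex
S(E) = \ln \Omega(E), \qquad \beta(E) = \frac{\partial S(E)}{\partial E}
```

    ST-WHAM reaches β(E) through derivatives of the energy histograms H(E); that differentiation of binned estimates is precisely the step the CDF-based series expansion and the Bayesian CDF model replace with continuous estimates.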

  18. A procedure for seiche analysis with Bayesian information criterion

    NASA Astrophysics Data System (ADS)

    Aichi, Masaatsu

    2016-04-01

    A seiche is a standing wave in an enclosed or semi-enclosed water body. Its amplitude changes irregularly in time owing to weather conditions, so extracting the seiche signal is difficult with the usual methods for time series analysis, such as the fast Fourier transform (FFT). In this study, a new method for time series analysis with a Bayesian information criterion was developed to decompose time series data from tide stations into seiche, tide, long-term trend, and residual components. The method is based on maximum marginal likelihood estimation of the tide amplitudes, seiche amplitude, and trend components. The seiche amplitude and trend components were assumed to change gradually, with second derivatives in time close to zero; these assumptions were incorporated as prior distributions. The variances of the prior distributions were estimated by minimizing the Akaike Bayesian information criterion (ABIC). The seiche frequency was determined by Newton's method with an initial guess from the FFT. The accuracy of the proposed method was checked by analyzing synthetic time series data composed of known components, and the reproducibility of the original components was quite good. The proposed method was also applied to actual time series data of sea level observed by a tide station and of the strain of coastal rock masses observed by a fiber Bragg grating sensor in Aburatsubo Bay, Japan. The seiche in the bay and the response of the rock masses were successfully extracted.

  19. Bayesian Model Selection with Network Based Diffusion Analysis

    PubMed Central

    Whalen, Andrew; Hoppitt, William J. E.

    2016-01-01

    A number of recent studies have used Network Based Diffusion Analysis (NBDA) to detect the role of social transmission in the spread of a novel behavior through a population. In this paper we present a unified framework for performing NBDA in a Bayesian setting, and demonstrate how the Watanabe-Akaike information criterion (WAIC) can be used for model selection. We present a specific example of applying this method to Time to Acquisition Diffusion Analysis (TADA). To examine the robustness of this technique, we performed a large-scale simulation study and found that NBDA using WAIC could recover the correct model of social transmission under a wide range of cases, including in the presence of random effects, individual-level variables, and alternative models of social transmission. This work suggests that NBDA is an effective and widely applicable tool for uncovering whether social transmission underpins the spread of a novel behavior, and may still provide accurate results even when key model assumptions are relaxed. PMID:27092089

  20. Toward a Behavioral Analysis of Joint Attention

    ERIC Educational Resources Information Center

    Dube, William V.; MacDonald, Rebecca P. F.; Mansfield, Renee C.; Holcomb, William L.; Ahearn, William H.

    2004-01-01

    Joint attention (JA) initiation is defined in cognitive-developmental psychology as a child's actions that verify or produce simultaneous attending by that child and an adult to some object or event in the environment so that both may experience the object or event together. This paper presents a contingency analysis of gaze shift in JA…

  1. Using Bayesian analysis in repeated preclinical in vivo studies for a more effective use of animals.

    PubMed

    Walley, Rosalind; Sherington, John; Rastrick, Joe; Detrait, Eric; Hanon, Etienne; Watt, Gillian

    2016-05-01

    Whilst innovative Bayesian approaches are increasingly used in clinical studies, in the preclinical area Bayesian methods appear to be rarely used in the reporting of pharmacology data. This is particularly surprising in the context of regularly repeated in vivo studies, where there is a considerable amount of data from historical control groups that has potential value. This paper describes our experience with introducing Bayesian analysis for such studies using a Bayesian meta-analytic predictive approach. This leads naturally either to an informative prior for a control group as part of a full Bayesian analysis of the next study, or to a predictive distribution that replaces a control group entirely. We use quality control charts to illustrate study-to-study variation to the scientists and describe informative priors in terms of their approximate effective numbers of animals. We describe two case studies of animal models: the lipopolysaccharide-induced cytokine release model used in inflammation and the novel object recognition model used to screen cognitive enhancers, both of which show the advantage of a Bayesian approach over the standard frequentist analysis. We conclude that using Bayesian methods in stable repeated in vivo studies can result in a more effective use of animals, either by reducing the total number of animals used or by increasing the precision of key treatment differences. This will lead to clearer results and supports the "3Rs initiative" to Refine, Reduce and Replace animals in research. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27028721
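    A moment-based sketch of the meta-analytic predictive idea is given below: historical control summaries yield a predictive distribution for the next control group, whose precision can be read as an approximate effective number of animals. All numbers are invented, and the paper's approach is fully Bayesian rather than this DerSimonian-Laird approximation.

```python
import numpy as np

means = np.array([10.2, 9.6, 11.1, 10.8, 9.9])  # historical control-group means
sems = np.array([0.8, 0.7, 0.9, 0.8, 0.7])      # their standard errors

w = 1.0 / sems**2                               # fixed-effect weights
mu_fe = np.sum(w * means) / np.sum(w)
q = np.sum(w * (means - mu_fe) ** 2)            # heterogeneity statistic
tau2 = max(0.0, (q - (means.size - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_star = 1.0 / (sems**2 + tau2)                 # random-effects weights
mu_map = np.sum(w_star * means) / np.sum(w_star)
var_pred = 1.0 / np.sum(w_star) + tau2          # predictive variance for a new study

sigma2_animal = 4.0                             # assumed per-animal variance
print(mu_map, var_pred, sigma2_animal / var_pred)  # prior mean, spread, effective n
```

    When study-to-study variation (τ²) is small, the predictive distribution is tight and the prior is worth many animals; when it is large, the prior is appropriately discounted.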

  2. Bayesian semiparametric nonlinear mixed-effects joint models for data with skewness, missing responses, and measurement errors in covariates.

    PubMed

    Huang, Yangxin; Dagne, Getachew

    2012-09-01

    It is common practice to analyze complex longitudinal data using semiparametric nonlinear mixed-effects (SNLME) models with a normal distribution. However, the assumption of normal model errors may be unrealistic and can obscure important features of subject variation. To partially explain between- and within-subject variations, covariates are usually introduced in such models, but some covariates may be measured with substantial errors. Moreover, the responses may be missing, and the missingness may be nonignorable. Inferential procedures become considerably more complicated when skewness, missing values, and measurement error are all present in the data. In the literature, there has been considerable interest in accommodating either skewness, incompleteness, or covariate measurement error in such models, but relatively little work addresses all three features simultaneously. In this article, our objective is to address the simultaneous impact of skewness, missingness, and covariate measurement error by jointly modeling the response and covariate processes based on a flexible Bayesian SNLME model. The method is illustrated using a real AIDS data set to compare potential models under various scenarios and different distribution specifications. PMID:22150787

  3. Nuclear stockpile stewardship and Bayesian image analysis (DARHT and the BIE)

    SciTech Connect

    Carroll, James L

    2011-01-11

    Since the end of nuclear testing, assessment of the reliability of our nation's nuclear weapon stockpile has been performed using sub-critical hydrodynamic testing. These tests involve exceptionally demanding radiography. We discuss the challenges these tests pose and the solutions provided by DARHT (the world's premier hydrodynamic testing facility) and the BIE, or Bayesian Inference Engine (a powerful radiography analysis software tool), including the application of Bayesian image analysis techniques to this important and difficult problem.

  4. Interacting Agricultural Pests and Their Effect on Crop Yield: Application of a Bayesian Decision Theory Approach to the Joint Management of Bromus tectorum and Cephus cinctus

    PubMed Central

    Keren, Ilai N.; Menalled, Fabian D.; Weaver, David K.; Robison-Cox, James F.

    2015-01-01

    Worldwide, the landscape homogeneity of extensive monocultures that characterizes conventional agriculture has resulted in the development of specialized and interacting multitrophic pest complexes. While integrated pest management emphasizes the need to consider the ecological context where multiple species coexist, management recommendations are often based on single-species tactics. This approach may not provide satisfactory solutions when confronted with the complex interactions occurring between organisms at the same or different trophic levels. Replacement of the single-species management model with more sophisticated, multi-species programs requires an understanding of the direct and indirect interactions occurring between the crop and all categories of pests. We evaluated a modeling framework to make multi-pest management decisions taking into account direct and indirect interactions among species belonging to different trophic levels. We adopted a Bayesian decision theory approach in combination with path analysis to evaluate interactions between Bromus tectorum (downy brome, cheatgrass) and Cephus cinctus (wheat stem sawfly) in wheat (Triticum aestivum) systems. We assessed their joint responses to weed management tactics, seeding rates, and cultivar tolerance to insect stem boring or competition. Our results indicated that C. cinctus oviposition behavior varied as a function of B. tectorum pressure. Crop responses were more readily explained by the joint effects of management tactics on both categories of pests and their interactions than by the direct impact of any particular management scheme on yield. Accordingly, a C. cinctus-tolerant variety should be planted at a low seeding rate under high insect pressure. However, as B. tectorum levels increase, the C. cinctus-tolerant variety should be replaced by a competitive and drought-tolerant cultivar at high seeding rates despite C. cinctus infestation. This study exemplifies the necessity of

  5. Bayesian analysis of multimodal data and brain imaging

    NASA Astrophysics Data System (ADS)

    Assadi, Amir H.; Eghbalnia, Hamid; Backonja, Miroslav; Wakai, Ronald T.; Rutecki, Paul; Haughton, Victor

    2000-06-01

    It is often the case that information about a process can be obtained using a variety of methods. Each method is employed because of specific advantages over the competing alternatives. An example in medical neuro-imaging is the choice between fMRI and MEG modes, where fMRI can provide high spatial resolution in comparison to the superior temporal resolution of MEG. The combination of data from varying modes provides the opportunity to infer results that may not be possible by means of any one mode alone. We discuss a Bayesian and learning-theoretic framework for enhanced feature extraction that is particularly suited to multi-modal investigations of massive data sets from multiple experiments. In the following Bayesian approach, acquired knowledge (information) regarding various aspects of the process is directly incorporated into the formulation. This information can come from a variety of sources; in our case, it represents statistical information obtained from other modes of data collection. The information is used to train a learning machine to estimate a probability distribution, which is used in turn by a second machine as a prior, in order to produce a more refined estimation of the distribution of events. The computational demand of the algorithm is handled by a proposed distributed parallel implementation on a cluster of workstations that can be scaled to address real-time needs if required. We provide a simulation of these methods on a set of synthetically generated MEG and EEG data and show how spatial and temporal resolutions improve by using prior distributions. Applied to fMRI signals, the method permits one to construct the probability distribution of the non-linear hemodynamics of the human brain (real data). These computational results are in agreement with biologically based measurements from other labs, as reported to us by researchers from the UK. We also provide preliminary analysis involving multi-electrode cortical recording that accompanies

  6. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it (an improved and generalized version of Bayesian Blocks [Scargle 1998]) that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
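    For event data, the core of the algorithm is an O(N²) dynamic program over change points. The sketch below follows the published structure under two simplifying assumptions flagged in comments: distinct event times and a constant per-block prior penalty.

```python
import numpy as np

def bayesian_blocks(t, ncp_prior=4.0):
    """Optimal piecewise-constant segmentation of event times (event-data fitness).
    Assumes distinct times; ncp_prior is the per-block penalty (a tunable constant)."""
    t = np.sort(np.asarray(t, dtype=float))
    n = t.size
    edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
    best = np.empty(n)
    last = np.empty(n, dtype=int)
    for r in range(n):
        widths = edges[r + 1] - edges[:r + 1]      # widths of blocks starting at i = 0..r
        counts = np.arange(r + 1, 0, -1.0)         # event counts of those blocks
        fitness = counts * np.log(counts / widths) - ncp_prior
        total = fitness + np.concatenate([[0.0], best[:r]])
        last[r] = np.argmax(total)                 # best start of the final block
        best[r] = total[last[r]]
    cp, i = [], n                                  # backtrack the change points
    while i > 0:
        cp.append(last[i - 1])
        i = last[i - 1]
    return edges[np.array(cp[::-1] + [n])]         # block edges, left to right

rng = np.random.default_rng(3)
print(bayesian_blocks(np.sort(rng.uniform(0.0, 10.0, 100))))
```

    Each pass extends the optimum over the data cells seen so far, which is what permits the real-time trigger mode: the current best segmentation is always available as new data arrive.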

  7. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives

    PubMed Central

    Green, Adam W.; Bailey, Larissa L.

    2015-01-01

    Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools) using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies. PMID:26658734

  8. Bayesian analysis of input uncertainty in hydrological modeling: 2. Application

    NASA Astrophysics Data System (ADS)

    Kavetski, Dmitri; Kuczera, George; Franks, Stewart W.

    2006-03-01

    The Bayesian total error analysis (BATEA) methodology directly addresses both input and output errors in hydrological modeling, requiring the modeler to make explicit, rather than implicit, assumptions about the likely extent of data uncertainty. This study considers a BATEA assessment of two North American catchments: (1) French Broad River and (2) Potomac basins. It assesses the performance of the conceptual Variable Infiltration Capacity (VIC) model with and without accounting for input (precipitation) uncertainty. The results show the considerable effects of precipitation errors on the predicted hydrographs (especially the prediction limits) and on the calibrated parameters. In addition, the performance of BATEA in the presence of severe model errors is analyzed. While BATEA allows a very direct treatment of input uncertainty and yields some limited insight into model errors, it requires the specification of valid error models, which are currently poorly understood and require further work. Moreover, it leads to computationally challenging, high-dimensional problems. For some types of models, including the VIC implemented using robust numerical methods, the computational cost of BATEA can be reduced using Newton-type methods.

  9. A Bayesian Model for the Analysis of Transgenerational Epigenetic Variation

    PubMed Central

    Varona, Luis; Munilla, Sebastián; Mouresan, Elena Flavia; González-Rodríguez, Aldemar; Moreno, Carlos; Altarriba, Juan

    2015-01-01

    Epigenetics has become one of the major areas of biological research. However, the degree of phenotypic variability that is explained by epigenetic processes still remains unclear. From a quantitative genetics perspective, the estimation of variance components is achieved by means of the information provided by the resemblance between relatives. In a previous study, this resemblance was described as a function of the epigenetic variance component and a reset coefficient that indicates the rate of dissipation of epigenetic marks across generations. Given these assumptions, we propose a Bayesian mixed model methodology that allows the estimation of epigenetic variance from a genealogical and phenotypic database. The methodology is based on the development of a T matrix of epigenetic relationships that depends on the reset coefficient. In addition, we present a simple procedure for the calculation of the inverse of this matrix (T−1) and a Gibbs sampler algorithm that obtains posterior estimates of all the unknowns in the model. The new procedure was used with two simulated data sets and with a beef cattle database. In the simulated populations, the results of the analysis provided marginal posterior distributions that included the population parameters in the regions of highest posterior density. In the case of the beef cattle dataset, the posterior estimate of transgenerational epigenetic variability was very low, and a model comparison test indicated that a model that did not include it was the most plausible. PMID:25617408

  10. Dynamic sensor action selection with Bayesian decision analysis

    NASA Astrophysics Data System (ADS)

    Kristensen, Steen; Hansen, Volker; Kondak, Konstantin

    1998-10-01

    The aim of this work is to create a framework for the dynamic planning of sensor actions for an autonomous mobile robot. The framework uses Bayesian decision analysis, i.e., a decision-theoretic method, to evaluate possible sensor actions and select the most appropriate ones given the available sensors and what is currently known about the state of the world. Since sensing changes the knowledge of the system, and since the current state of the robot (task, position, etc.) determines what knowledge is relevant, the evaluation and selection of sensing actions is an ongoing process that effectively determines the behavior of the robot. The framework has been implemented on a real mobile robot and has proven able to control the sensor actions of the system in real time. In current work we are investigating methods to reduce or automatically generate the model information needed by the decision-theoretic method to select the appropriate sensor actions.
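    At its core, the selection step is a maximization of Bayesian expected utility over the current belief state; the toy below (states, actions, and utilities all invented) shows the computation.

```python
import numpy as np

belief = np.array([0.7, 0.3])   # belief over world states, e.g. path clear / blocked

# utility[a, s]: value of sensing action a if the true state is s, net of the
# action's cost; all numbers are illustrative.
utility = np.array([
    [4.0, -2.0],                # cheap, fast scan: good if clear, misleading if blocked
    [3.0,  5.0],                # costly, robust check: pays off when blocked
    [0.0,  0.0],                # sense nothing
])

expected = utility @ belief     # expected utility of each action under the belief
print("choose action", int(np.argmax(expected)))
```

    Because sensing updates the belief, the argmax is recomputed continually, and the resulting sequence of selections effectively constitutes the robot's behavior.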

  11. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    SciTech Connect

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it (an improved and generalized version of Bayesian Blocks) that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  12. Nonparametric survival analysis using Bayesian Additive Regression Trees (BART).

    PubMed

    Sparapani, Rodney A; Logan, Brent R; McCulloch, Robert E; Laud, Purushottam W

    2016-07-20

    Bayesian additive regression trees (BART) provide a framework for flexible nonparametric modeling of relationships of covariates to outcomes. Recently, BART models have been shown to provide excellent predictive performance for both continuous and binary outcomes, often exceeding that of competing methods. Software is also readily available for such outcomes. In this article, we introduce modeling that extends the usefulness of BART in medical applications by addressing needs arising in survival analysis. Simulation studies of one-sample and two-sample scenarios, in comparison with long-standing traditional methods, establish face validity of the new approach. We then demonstrate the model's ability to accommodate data from complex regression models with a simulation study of a nonproportional hazards scenario with crossing survival functions, and with survival function estimation in a scenario where hazards are multiplicatively modified by a highly nonlinear function of the covariates. Using data from a recently published study of patients undergoing hematopoietic stem cell transplantation, we illustrate the use and some advantages of the proposed method in medical investigations. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26854022

  13. Cepheid light curve demography via Bayesian functional data analysis

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas J.; Hendry, Martin; Kowal, Daniel; Ruppert, David

    2016-01-01

    Synoptic time-domain surveys provide astronomers, not simply more data, but a different kind of data: large ensembles of multivariate, irregularly and asynchronously sampled light curves. We describe a statistical framework for light curve demography—optimal accumulation and extraction of information, not only along individual light curves as conventional methods do, but also across large ensembles of related light curves. We build the framework using tools from functional data analysis (FDA), a rapidly growing area of statistics that addresses inference from datasets that sample ensembles of related functions. Our Bayesian FDA framework builds hierarchical models that describe light curve ensembles using multiple levels of randomness: upper levels describe the source population, and lower levels describe the observation process, including measurement errors and selection effects. Roughly speaking, a particular object's light curve is modeled as the sum of a parameterized template component (modeling population-averaged behavior) and a peculiar component (modeling variability across the population), subsequently subjected to an observation model. A functional shrinkage adjustment to individual light curves emerges—an adaptive, functional generalization of the kind of adjustments made for Eddington or Malmquist bias in single-epoch photometric surveys. We describe ongoing work applying the framework to improved estimation of Cepheid variable star luminosities via FDA-based refinement and generalization of the Cepheid period-luminosity relation.
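    In symbols, the decomposition described above can be written generically (our notation, not the paper's exact formulation):

```latex
y_i(t_{ij}) \;=\; f(t_{ij}; \beta) \;+\; g_i(t_{ij}) \;+\; \epsilon_{ij}
```

    where f(·; β) is the population-averaged template, g_i the peculiar component of object i, and ε_ij measurement error. Posterior shrinkage of g_i toward zero is the functional analogue of the Eddington/Malmquist-style corrections mentioned above.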

  14. Light curve demography via Bayesian functional data analysis

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas; Budavari, Tamas; Hendry, Martin A.; Kowal, Daniel; Ruppert, David

    2015-08-01

    Synoptic time-domain surveys provide astronomers, not simply more data, but a different kind of data: large ensembles of multivariate, irregularly and asynchronously sampled light curves. We describe a statistical framework for light curve demography—optimal accumulation and extraction of information, not only along individual light curves as conventional methods do, but also across large ensembles of related light curves. We build the framework using tools from functional data analysis (FDA), a rapidly growing area of statistics that addresses inference from datasets that sample ensembles of related functions. Our Bayesian FDA framework builds hierarchical models that describe light curve ensembles using multiple levels of randomness: upper levels describe the source population, and lower levels describe the observation process, including measurement errors and selection effects. Schematically, a particular object's light curve is modeled as the sum of a parameterized template component (modeling population-averaged behavior) and a peculiar component (modeling variability across the population), subsequently subjected to an observation model. A functional shrinkage adjustment to individual light curves emerges—an adaptive, functional generalization of the kind of adjustments made for Eddington or Malmquist bias in single-epoch photometric surveys. We are applying the framework to a variety of problems in synoptic time-domain survey astronomy, including optimal detection of weak sources in multi-epoch data, and improved estimation of Cepheid variable star luminosities from detailed demographic modeling of ensembles of Cepheid light curves.

  15. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives.

    PubMed

    Green, Adam W; Bailey, Larissa L

    2015-01-01

    Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools) using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies. PMID:26658734

  16. Bayesian belief network analysis applied to determine the progression of temporomandibular disorders using MRI

    PubMed Central

    2015-01-01

    Objectives: This study investigated the applicability of a Bayesian belief network (BBN) to MR images to diagnose temporomandibular disorders (TMDs). Our aim was to determine the progression of TMDs, focusing on how each finding affects the other. Methods: We selected 1.5-T MRI findings (33 variables) and diagnoses (bone changes and disc displacement) of patients with TMD from 2007 to 2008. There were a total of 295 cases with 590 sides of temporomandibular joints (TMJs). The data were modified according to the research diagnostic criteria of TMD. We compared the accuracy of the BBN using 11 algorithms (necessary path condition, path condition, greedy search-and-score with Bayesian information criterion, Chow–Liu tree, Rebane–Pearl poly tree, tree augmented naïve Bayes model, maximum log likelihood, Akaike information criterion, minimum description length, K2 and C4.5), a multiple regression analysis and an artificial neural network using resubstitution validation and 10-fold cross-validation. Results: There were 191 TMJs (32.4%) with bone changes and 340 (57.6%) with articular disc displacement. The BBN path condition algorithm using resubstitution validation and 10-fold cross-validation was >99% accurate. However, the main advantage of a BBN is that it can represent the causal relationships between different findings and assign conditional probabilities, which can then be used to interpret the progression of TMD. Conclusions: Osteoarthritic bone changes progressed from condyle to articular fossa and finally to mandibular bone contours. Disc displacement was directly related to severe bone changes. Early bone changes were not directly related to disc displacement. TMJ functional factors (condylar translation, bony space and disc form) and age mediated between bone changes and disc displacement. PMID:25472616

  17. ChIP-BIT: Bayesian inference of target genes using a novel joint probabilistic model of ChIP-seq profiles.

    PubMed

    Chen, Xi; Jung, Jin-Gyoung; Shajahan-Haq, Ayesha N; Clarke, Robert; Shih, Ie-Ming; Wang, Yue; Magnani, Luca; Wang, Tian-Li; Xuan, Jianhua

    2016-04-20

    Chromatin immunoprecipitation with massively parallel DNA sequencing (ChIP-seq) has greatly improved the reliability with which transcription factor binding sites (TFBSs) can be identified from genome-wide profiling studies. Many computational tools have been developed to detect binding events or peaks; however, robust detection of weak binding events remains a challenge for current peak-calling tools. We have developed a novel Bayesian approach (ChIP-BIT) to reliably detect TFBSs and their target genes by jointly modeling binding signal intensities and binding locations of TFBSs. Specifically, a Gaussian mixture model is used to capture both binding and background signals in sample data. As a unique feature of ChIP-BIT, background signals are modeled by a local Gaussian distribution that is accurately estimated from the input data. Extensive simulation studies showed a significantly improved performance of ChIP-BIT in target gene prediction, particularly for detecting weak binding signals at gene promoter regions. We applied ChIP-BIT to find target genes from NOTCH3 and PBX1 ChIP-seq data acquired from MCF-7 breast cancer cells. TF knockdown experiments have initially validated about 30% of the co-regulated target genes identified by ChIP-BIT as being differentially expressed in MCF-7 cells. Functional analysis of these genes further revealed crosstalk between the Notch and Wnt signaling pathways. PMID:26704972
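    The binding-versus-background separation rests on a two-component Gaussian mixture. The generic EM sketch below illustrates the decomposition on synthetic intensities; it is not the ChIP-BIT code, which additionally models binding locations and estimates the background component from the input data.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
# Synthetic log-intensities: 80% background plus 20% true binding at higher signal
x = np.concatenate([rng.normal(0.0, 1.0, 800), rng.normal(3.0, 1.0, 200)])

pi, mu, sd = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(200):
    # E-step: responsibility of the binding component for each site
    num = pi * norm.pdf(x, mu[1], sd[1])
    r = num / (num + (1 - pi) * norm.pdf(x, mu[0], sd[0]))
    # M-step: update the mixing weight, means, and standard deviations
    pi = r.mean()
    mu = np.array([np.sum((1 - r) * x) / np.sum(1 - r), np.sum(r * x) / np.sum(r)])
    sd = np.sqrt(np.array([np.sum((1 - r) * (x - mu[0]) ** 2) / np.sum(1 - r),
                           np.sum(r * (x - mu[1]) ** 2) / np.sum(r)]))
print(pi, mu, sd)   # weight and parameters recovered for the binding component
```

    The responsibilities r play the role of per-site binding probabilities; a fully Bayesian treatment additionally propagates uncertainty in the mixture parameters instead of fixing them at their EM estimates.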

  18. ChIP-BIT: Bayesian inference of target genes using a novel joint probabilistic model of ChIP-seq profiles

    PubMed Central

    Chen, Xi; Jung, Jin-Gyoung; Shajahan-Haq, Ayesha N.; Clarke, Robert; Shih, Ie-Ming; Wang, Yue; Magnani, Luca; Wang, Tian-Li; Xuan, Jianhua

    2016-01-01

    Chromatin immunoprecipitation with massively parallel DNA sequencing (ChIP-seq) has greatly improved the reliability with which transcription factor binding sites (TFBSs) can be identified from genome-wide profiling studies. Many computational tools have been developed to detect binding events or peaks; however, robust detection of weak binding events remains a challenge for current peak-calling tools. We have developed a novel Bayesian approach (ChIP-BIT) to reliably detect TFBSs and their target genes by jointly modeling binding signal intensities and binding locations of TFBSs. Specifically, a Gaussian mixture model is used to capture both binding and background signals in sample data. As a unique feature of ChIP-BIT, background signals are modeled by a local Gaussian distribution that is accurately estimated from the input data. Extensive simulation studies showed a significantly improved performance of ChIP-BIT in target gene prediction, particularly for detecting weak binding signals at gene promoter regions. We applied ChIP-BIT to find target genes from NOTCH3 and PBX1 ChIP-seq data acquired from MCF-7 breast cancer cells. TF knockdown experiments have initially validated about 30% of the co-regulated target genes identified by ChIP-BIT as being differentially expressed in MCF-7 cells. Functional analysis of these genes further revealed crosstalk between the Notch and Wnt signaling pathways. PMID:26704972

  19. Bayesian Joint Selection of Genes and Pathways: Applications in Multiple Myeloma Genomics

    PubMed Central

    Zhang, Lin; Morris, Jeffrey S; Zhang, Jiexin; Orlowski, Robert Z; Baladandayuthapani, Veerabhadran

    2014-01-01

    It is well established that the development of a disease, especially cancer, is a complex process that results from the joint effects of multiple genes involved in various molecular signaling pathways. In this article, we propose methods to discover genes and molecular pathways significantly associated with clinical outcomes in cancer samples. We exploit the natural hierarchical structure of genes related to a given pathway, as a group of interacting genes, to conduct selection of both pathways and genes. We pose the problem in a hierarchical structured variable selection (HSVS) framework to analyze the corresponding gene expression data. HSVS methods conduct simultaneous variable selection at the pathway (group) level and the gene (within-group) level. To adapt to the overlapping group structure present in the pathway–gene hierarchy of the data, we developed an overlap-HSVS method that introduces latent partial-effect variables that partition the marginal effect of the covariates, together with corresponding weights for proportional shrinkage of the partial effects. Combining gene expression data with prior pathway information from the KEGG databases, we identified several gene–pathway combinations that are significantly associated with clinical outcomes of multiple myeloma. Biological discoveries support this relationship for the pathways and the corresponding genes we identified. PMID:25520554

  20. Simultaneous Bayesian Estimation of Alignment and Phylogeny under a Joint Model of Protein Sequence and Structure

    PubMed Central

    Herman, Joseph L.; Challis, Christopher J.; Novák, Ádám; Hein, Jotun; Schmidler, Scott C.

    2014-01-01

    For sequences that are highly divergent, there is often insufficient information to infer accurate alignments, and phylogenetic uncertainty may be high. One way to address this issue is to make use of protein structural information, since structures generally diverge more slowly than sequences. In this work, we extend a recently developed stochastic model of pairwise structural evolution to multiple structures on a tree, analytically integrating over ancestral structures to permit efficient likelihood computations under the resulting joint sequence–structure model. We observe that the inclusion of structural information significantly reduces alignment and topology uncertainty, and reduces the number of topology and alignment errors in cases where the true trees and alignments are known. In some cases, the inclusion of structure results in changes to the consensus topology, indicating that structure may contain additional information beyond that which can be obtained from sequences. We use the model to investigate the order of divergence of cytoglobins, myoglobins, and hemoglobins and observe a stabilization of phylogenetic inference: although a sequence-based inference assigns significant posterior probability to several different topologies, the structural model strongly favors one of these over the others and is more robust to the choice of data set. PMID:24899668

  1. Bayesian Analysis of Multiple Populations in Galactic Globular Clusters

    NASA Astrophysics Data System (ADS)

    Wagner-Kaiser, Rachel A.; Sarajedini, Ata; von Hippel, Ted; Stenning, David; Piotto, Giampaolo; Milone, Antonino; van Dyk, David A.; Robinson, Elliot; Stein, Nathan

    2016-01-01

    We use GO 13297 Cycle 21 Hubble Space Telescope (HST) observations and archival GO 10775 Cycle 14 HST ACS Treasury observations of Galactic Globular Clusters to find and characterize multiple stellar populations. Determining how globular clusters are able to create and retain enriched material to produce several generations of stars is key to understanding how these objects formed and how they have affected the structural, kinematic, and chemical evolution of the Milky Way. We employ a sophisticated Bayesian technique with an adaptive MCMC algorithm to simultaneously fit the age, distance, absorption, and metallicity for each cluster. At the same time, we also fit unique helium values to two distinct populations of the cluster and determine the relative proportions of those populations. Our unique numerical approach allows objective and precise analysis of these complicated clusters, providing posterior distribution functions for each parameter of interest. We use these results to gain a better understanding of multiple populations in these clusters and their role in the history of the Milky Way. Support for this work was provided by NASA through grant numbers HST-GO-10775 and HST-GO-13297 from the Space Telescope Science Institute, which is operated by AURA, Inc., under NASA contract NAS5-26555. This material is based upon work supported by the National Aeronautics and Space Administration under Grant NNX11AF34G issued through the Office of Space Science. This project was supported by the National Aeronautics & Space Administration through the University of Central Florida's NASA Florida Space Grant Consortium.

  2. Semiparametric Thurstonian Models for Recurrent Choices: A Bayesian Analysis

    ERIC Educational Resources Information Center

    Ansari, Asim; Iyengar, Raghuram

    2006-01-01

    We develop semiparametric Bayesian Thurstonian models for analyzing repeated choice decisions involving multinomial, multivariate binary or multivariate ordinal data. Our modeling framework has multiple components that together yield considerable flexibility in modeling preference utilities, cross-sectional heterogeneity and parameter-driven…

  3. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    PubMed

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

    This study searches for economies of size and scope in the Portuguese water sector, applying both Bayesian and classical statistics to inference in stochastic frontier analysis (SFA). It demonstrates the usefulness and advantages of Bayesian inference in SFA over traditional SFA, which relies solely on classical statistics. The Bayesian methods overcome some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we find important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to substantial advantages of mergers that join the retail and wholesale components and the drinking water and wastewater services. PMID:26674686

  4. Bayesian network representing system dynamics in risk analysis of nuclear systems

    NASA Astrophysics Data System (ADS)

    Varuttamaseni, Athi

    2011-12-01

    A dynamic Bayesian network (DBN) model is used in conjunction with the alternating conditional expectation (ACE) regression method to analyze the risk associated with the loss-of-feedwater accident coupled with a subsequent initiation of the feed-and-bleed operation in the Zion-1 nuclear power plant. The use of the DBN allows the joint probability distribution to be factorized, enabling the analysis to be done on many simpler network structures rather than on one complicated structure. The construction of the DBN model assumes conditional independence relations among certain key reactor parameters. The choice of parameters to model is based on the macroscopic balance statements governing the behavior of the reactor under a quasi-static assumption. The DBN is used to relate the peak clad temperature to a set of independent variables that are known to be important in determining the success of the feed-and-bleed operation. A simple linear relationship is then used to relate the clad temperature to the core damage probability. To obtain a quantitative relationship among different nodes in the DBN, surrogates of the RELAP5 reactor transient analysis code are used. These surrogates are generated by applying the ACE algorithm to output data obtained from about 50 RELAP5 cases covering a wide range of the selected independent variables. They allow important safety parameters such as the fuel clad temperature to be expressed as functions of key reactor parameters such as the coolant temperature and pressure, together with important independent variables such as the scram delay time. The time-dependent core damage probability is calculated by sampling the independent variables from their probability distributions and propagating the information up through the Bayesian network to give the clad temperature; with the assumed one-to-one relationship between clad temperature and core damage probability, this yields the risk estimate directly.
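
    A hedged sketch of the propagation step described above: sample the independent variables, push them through a surrogate of the transient code, and map peak clad temperature to a damage probability. The surrogate regression surface and the linear temperature-to-probability map below are invented placeholders, not the ACE-fitted RELAP5 surrogates of the study.

    ```python
    # Monte Carlo propagation of uncertain inputs through a (placeholder)
    # surrogate model to a core damage probability estimate.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    scram_delay = rng.uniform(0.0, 120.0, n)        # s, hypothetical range
    coolant_temp = rng.normal(560.0, 10.0, n)       # K, hypothetical
    coolant_press = rng.normal(15.0, 0.5, n)        # MPa, hypothetical

    def surrogate_clad_temp(delay, t_cool, p_cool):
        # Placeholder regression surface standing in for the ACE surrogate.
        return 800.0 + 2.0 * delay + 0.5 * (t_cool - 560.0) - 20.0 * (p_cool - 15.0)

    t_clad = surrogate_clad_temp(scram_delay, coolant_temp, coolant_press)

    # Simple linear clad-temperature -> damage-probability map, clipped to [0, 1].
    p_damage = np.clip((t_clad - 1000.0) / 500.0, 0.0, 1.0)
    print("mean core damage probability:", p_damage.mean())
    ```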

  5. Bayesian analysis of the neuromagnetic inverse problem with l(p)-norm priors.

    PubMed

    Auranen, Toni; Nummenmaa, Aapo; Hämäläinen, Matti S; Jääskeläinen, Iiro P; Lampinen, Jouko; Vehtari, Aki; Sams, Mikko

    2005-07-01

    Magnetoencephalography (MEG) allows millisecond-scale non-invasive measurement of the magnetic fields generated by neural currents in the brain. However, localization of the underlying current sources is ambiguous due to the so-called inverse problem. The most widely used source localization methods, i.e., minimum-norm and minimum-current estimates (MNE and MCE) and equivalent current dipole (ECD) fitting, require an ad hoc determination of the cortical current distribution (l(2)- and l(1)-norm priors and point-sized dipoles, respectively). In this article, we perform a Bayesian analysis of the MEG inverse problem with l(p)-norm priors for the current sources. This way, we circumvent the arbitrary choice between l(1)- and l(2)-norm priors; the choice is instead made automatically based on the data. By obtaining numerical samples from the joint posterior probability distribution of the source current parameters and model hyperparameters (such as the l(p)-norm order p) using Markov chain Monte Carlo (MCMC) methods, we calculated the spatial inverse estimates as expectation values of the source current parameters integrated over the hyperparameters. Real MEG data and simulated (known) source currents with realistic MRI-based cortical geometry and a 306-channel MEG sensor array were used. While the proposed model is sensitive to the source space discretization size and computationally rather heavy, it is mathematically straightforward, thus allowing incorporation of, for instance, a priori functional magnetic resonance imaging (fMRI) information. PMID:15955497

  6. Bayesian analysis of heavy-tailed and long-range dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, Timothy; Watkins, Nick; Gramacy, Robert; Franzke, Christian

    2014-05-01

    We have used MCMC algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average ARFIMA(p,d,q) processes, which are capable of modelling long-range dependence (e.g., Beran et al., 2013). Our principal aim is to obtain inference about the long-memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short-memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series. We have extended the ARFIMA model by weakening the Gaussianity assumption, assuming an alpha-stable, heavy-tailed distribution for the innovations, and performing joint inference on d and alpha. We will present a study of the dependence of the posterior variance of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other popular measures of d.
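
    As a concrete illustration of posterior inference for the long-memory parameter, the sketch below runs a Metropolis sampler on d for an ARFIMA(0,d,0) model using the Whittle (spectral) approximation to the likelihood; this is a common device in the long-memory literature, though not necessarily the exact likelihood the authors use. The test series is white noise, so the posterior for d should concentrate near zero.

    ```python
    # Whittle-likelihood Metropolis sampler for the memory parameter d.
    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.normal(size=1024)                     # synthetic series, true d = 0
    n = x.size
    j = np.arange(1, n // 2)                      # positive Fourier frequencies
    lam = 2 * np.pi * j / n
    I = np.abs(np.fft.fft(x)[j]) ** 2 / (2 * np.pi * n)   # periodogram

    def whittle_loglik(d, log_s2):
        # ARFIMA(0,d,0) spectral density: f ~ s2/(2 pi) * (2 sin(lam/2))^(-2d)
        f = np.exp(log_s2) / (2 * np.pi) * (2 * np.sin(lam / 2)) ** (-2 * d)
        return -np.sum(np.log(f) + I / f)

    d, log_s2 = 0.1, 0.0
    samples = []
    for _ in range(20000):
        d_p = d + rng.normal(0, 0.05)
        s_p = log_s2 + rng.normal(0, 0.1)
        if -0.5 < d_p < 0.5:                      # flat prior on the stationary range
            if np.log(rng.uniform()) < whittle_loglik(d_p, s_p) - whittle_loglik(d, log_s2):
                d, log_s2 = d_p, s_p
        samples.append(d)

    post = np.array(samples[5000:])
    print(f"posterior mean of d: {post.mean():.3f} +/- {post.std():.3f}")
    ```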

  7. Bayesian Geostatistical Analysis and Prediction of Rhodesian Human African Trypanosomiasis

    PubMed Central

    Wardrop, Nicola A.; Atkinson, Peter M.; Gething, Peter W.; Fèvre, Eric M.; Picozzi, Kim; Kakembo, Abbas S. L.; Welburn, Susan C.

    2010-01-01

    Background The persistent spread of Rhodesian human African trypanosomiasis (HAT) in Uganda in recent years has increased concerns of a potential overlap with the Gambian form of the disease. Recent research has aimed to increase the evidence base for targeting control measures by focusing on the environmental and climatic factors that control the spatial distribution of the disease. Objectives One recent study used simple logistic regression methods to explore the relationship between prevalence of Rhodesian HAT and several social, environmental and climatic variables in two of the most recently affected districts of Uganda, and suggested the disease had spread into the study area due to the movement of infected, untreated livestock. Here we extend this study to account for spatial autocorrelation, incorporate uncertainty in input data and model parameters, and undertake predictive mapping for the risk of high HAT prevalence in the future. Materials and Methods We use a generalised linear geostatistical model within a Bayesian framework to account explicitly for spatial autocorrelation and to incorporate uncertainty in input data and model parameters. This more rigorous analytical approach potentially yields more accurate parameter and significance estimates and greater predictive accuracy, allowing the validity of the livestock movement hypothesis to be assessed on the basis of more robust parameter estimation and an appropriate assessment of covariate effects. Results The analysis strongly supports the theory that Rhodesian HAT was imported to the study area via the movement of untreated, infected livestock from endemic areas. The confounding effect of health care accessibility on the spatial distribution of Rhodesian HAT and the linkages between the disease's distribution and minimum land surface temperature have also been confirmed via the application of these methods. Conclusions Predictive mapping indicates areas at risk of high HAT prevalence in the future.

  8. A Bayesian Solution for Two-Way Analysis of Variance. ACT Technical Bulletin No. 8.

    ERIC Educational Resources Information Center

    Lindley, Dennis V.

    The standard statistical analysis of data classified in two ways (say into rows and columns) is through an analysis of variance that splits the total variation of the data into the main effect of rows, the main effect of columns, and the interaction between rows and columns. This paper presents an alternative Bayesian analysis of the same…

  9. Bayesian Analysis of Two Stellar Populations in Galactic Globular Clusters III: Analysis of 30 Clusters

    NASA Astrophysics Data System (ADS)

    Wagner-Kaiser, R.; Stenning, D. C.; Sarajedini, A.; von Hippel, T.; van Dyk, D. A.; Robinson, E.; Stein, N.; Jefferys, W. H.

    2016-09-01

    We use Cycle 21 Hubble Space Telescope (HST) observations and HST archival ACS Treasury observations of 30 Galactic Globular Clusters to characterize two distinct stellar populations. A sophisticated Bayesian technique is employed to simultaneously sample the joint posterior distribution of age, distance, and extinction for each cluster, as well as unique helium values for two populations within each cluster and the relative proportion of those populations. We find the helium differences among the two populations in the clusters fall in the range of ˜0.04 to 0.11. Because adequate models varying in CNO are not presently available, we view these spreads as upper limits and present them with statistical rather than observational uncertainties. Evidence supports previous studies suggesting an increase in helium content concurrent with increasing mass of the cluster and also find that the proportion of the first population of stars increases with mass as well. Our results are examined in the context of proposed globular cluster formation scenarios. Additionally, we leverage our Bayesian technique to shed light on inconsistencies between the theoretical models and the observed data.

  10. Joint Bayesian and N-body Analyses of the 55 Cancri and GJ 876 Planetary Systems

    NASA Astrophysics Data System (ADS)

    Nelson, Benjamin E.; Ford, Eric B; Wright, Jason; Fischer, Debra

    2014-05-01

    We present the latest dynamical models for the 55 Cancri and GJ 876 systems based on 1,418 and 367 radial velocity (RV) observations, respectively. We apply our Radial velocity Using N-body Differential evolution Markov chain Monte Carlo code (RUN DMC; B. Nelson et al. 2014) to these two landmark systems and perform long-term (10^8 year) dynamical integrations using the Mercury symplectic integrator. For 55 Cancri, we find the transiting planet "e" cannot be misaligned with the outer four planets by more than 60 degrees and has a relativistic precession timescale on the order of the secular interactions. Based on a statistical analysis, we conclude planets "b" and "c" are apsidally aligned about 180 degrees but not in a mean-motion resonance. For GJ 876, we derive a set of 3-dimensional (non-coplanar) dynamical models based solely on RVs.

  11. Analysis of mechanical joint in composite cylinder

    NASA Astrophysics Data System (ADS)

    Hong, C. S.; Kim, Y. W.; Park, J. S.

    Joining techniques for composite materials are of great interest in cylindrical structures, as composites are widely used in weight-sensitive structures. Little information on mechanical fastening joints for laminated shell structures is available in the literature. In this study, a finite element program based on the first-order shear deformation theory was developed for the analysis of mechanical joints in laminated composite structures. The failure of the mechanical fastening joint for a laminated graphite/epoxy cylinder subject to internal pressure was analyzed using the developed program. Modeling of the bolt head in the composite cylinder was studied, and the effect of steel reinforcement outside the composite cylinder on the failure was investigated. The stress component near the bolt head was influenced by the size of the bolt head. The failure load and the failure mode depended on the bolt diameter, the number of bolts, and the fiber orientation. The failure load was constant when the edge distance exceeded three times the bolt diameter.

  12. Objective Bayesian fMRI analysis-a pilot study in different clinical environments.

    PubMed

    Magerkurth, Joerg; Mancini, Laura; Penny, William; Flandin, Guillaume; Ashburner, John; Micallef, Caroline; De Vita, Enrico; Daga, Pankaj; White, Mark J; Buckley, Craig; Yamamoto, Adam K; Ourselin, Sebastien; Yousry, Tarek; Thornton, John S; Weiskopf, Nikolaus

    2015-01-01

    Functional MRI (fMRI) used for neurosurgical planning delineates functionally eloquent brain areas by time-series analysis of task-induced BOLD signal changes. Commonly used frequentist statistics protect against false positive results based on a p-value threshold. In surgical planning, false negative results are equally harmful, if not more so, potentially masking true brain activity and leading to erroneous resection of eloquent regions. Bayesian statistics provides an alternative framework, categorizing areas as activated, deactivated, non-activated or with low statistical confidence. This approach has not yet found wide clinical application, partly due to the lack of a method to objectively define an effect size threshold. We implemented a Bayesian analysis framework for neurosurgical planning fMRI. It entails an automated effect-size threshold selection method for posterior probability maps accounting for inter-individual BOLD response differences, which was calibrated based on the frequentist results maps thresholded by two clinical experts. We compared Bayesian and frequentist analysis of passive-motor fMRI data from 10 healthy volunteers measured on a pre-operative 3T and an intra-operative 1.5T MRI scanner. As a clinical case study, we tested passive motor task activation in a brain tumor patient at 3T under clinical conditions. With our novel effect size threshold method, the Bayesian analysis revealed regions of all four categories in the 3T data. Activated region foci and extent were consistent with the frequentist analysis results. In the lower signal-to-noise ratio 1.5T intra-operative scanner data, Bayesian analysis provided improved brain-activation detection sensitivity compared with the frequentist analysis, although the spatial extents of the activations were smaller than at 3T. Bayesian analysis of fMRI data using operator-independent effect size threshold selection may improve the sensitivity and certainty of information available to guide neurosurgery.
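
    The categorization described above rests on posterior tail probabilities relative to an effect-size threshold. A minimal sketch, assuming a Gaussian posterior per voxel and illustrative numbers rather than the paper's calibrated thresholds:

    ```python
    # Posterior-probability-map logic: classify each voxel by how much
    # posterior mass lies beyond an effect-size threshold gamma.
    from scipy.stats import norm

    def classify_voxel(post_mean, post_sd, gamma=0.5, p_crit=0.95):
        p_act = 1.0 - norm.cdf(gamma, loc=post_mean, scale=post_sd)   # P(beta > +gamma)
        p_deact = norm.cdf(-gamma, loc=post_mean, scale=post_sd)      # P(beta < -gamma)
        p_null = 1.0 - p_act - p_deact                                # P(|beta| < gamma)
        if p_act >= p_crit:
            return "activated"
        if p_deact >= p_crit:
            return "deactivated"
        if p_null >= p_crit:
            return "non-activated"
        return "low confidence"

    # Illustrative voxels: (posterior mean, posterior sd).
    for m, s in [(1.2, 0.3), (-0.9, 0.2), (0.05, 0.1), (0.6, 0.5)]:
        print(m, s, "->", classify_voxel(m, s))
    ```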

  13. Hyper-efficient model-independent Bayesian method for the analysis of pulsar timing data

    NASA Astrophysics Data System (ADS)

    Lentati, Lindley; Alexander, P.; Hobson, M. P.; Taylor, S.; Gair, J.; Balan, S. T.; van Haasteren, R.

    2013-05-01

    A new model-independent method is presented for the analysis of pulsar timing data and the estimation of the spectral properties of an isotropic gravitational wave background (GWB). Taking a Bayesian approach, we show that by rephrasing the likelihood we are able to eliminate the most costly aspects of computation normally associated with this type of data analysis. When applied to the International Pulsar Timing Array Mock Data Challenge data sets, this results in speedups of approximately 2-3 orders of magnitude compared to established methods, in the most extreme cases reducing the run time from several hours on the high performance computer “DARWIN” to less than a minute on a normal workstation. Because of the versatility of this approach, we present three applications of the new likelihood. In the low signal-to-noise regime, we sample directly from the power spectrum coefficients of the GWB signal realization. In the high signal-to-noise regime, where the data can support a large number of coefficients, we sample from the joint probability density of the power spectrum coefficients for the individual pulsars and the GWB signal realization, using a “guided Hamiltonian sampler” to sample efficiently from this high-dimensional (˜1000) space. Critically, in both these cases we need make no assumptions about the form of the power spectrum of the GWB or the individual pulsars. Finally, we show that, if desired, a power-law model can still be fitted during sampling. We then apply this method to a more complex data set designed to better represent a future International Pulsar Timing Array or European Pulsar Timing Array data release. We show that even in challenging cases, where the data feature large jumps of order 5 years, with observations spanning between 4 and 18 years for different pulsars and including steep red noise processes, we are able to parametrize the underlying GWB signal correctly. Finally, we present a method for characterizing the spatial correlation of the GWB signal between pulsars.

  14. A Gibbs sampler for Bayesian analysis of site-occupancy data

    USGS Publications Warehouse

    Dorazio, Robert M.; Rodriguez, Daniel Taylor

    2012-01-01

    1. A Bayesian analysis of site-occupancy data containing covariates of species occurrence and species detection probabilities is usually completed using Markov chain Monte Carlo methods in conjunction with software programs that can implement those methods for any statistical model, not just site-occupancy models. Although these software programs are quite flexible, considerable experience is often required to specify a model and to initialize the Markov chain so that summaries of the posterior distribution can be estimated efficiently and accurately. 2. As an alternative to these programs, we develop a Gibbs sampler for Bayesian analysis of site-occupancy data that include covariates of species occurrence and species detection probabilities. This Gibbs sampler is based on a class of site-occupancy models in which probabilities of species occurrence and detection are specified as probit-regression functions of site- and survey-specific covariate measurements. 3. To illustrate the Gibbs sampler, we analyse site-occupancy data of the blue hawker, Aeshna cyanea (Odonata, Aeshnidae), a common dragonfly species in Switzerland. Our analysis includes a comparison of results based on Bayesian and classical (non-Bayesian) methods of inference. We also provide code (based on the R software program) for conducting Bayesian and classical analyses of site-occupancy data.
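
    The core building block of such a Gibbs sampler is Albert-Chib data augmentation for probit regression, sketched below on simulated data; the full site-occupancy sampler adds latent occupancy indicators on top of this step, which is omitted here for brevity.

    ```python
    # Albert-Chib Gibbs sampler for probit regression: alternate between
    # truncated-normal draws of latent utilities and a conjugate Gaussian
    # draw of the regression coefficients.
    import numpy as np
    from scipy.stats import truncnorm

    rng = np.random.default_rng(3)
    n, p = 500, 2
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    beta_true = np.array([-0.3, 1.0])
    y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)

    B0_inv = np.eye(p) / 100.0            # weak N(0, 100 I) prior on beta
    V = np.linalg.inv(B0_inv + X.T @ X)   # posterior covariance (unit latent variance)
    beta = np.zeros(p)
    draws = []
    for _ in range(2000):
        # 1. Draw latent utilities w_i | beta, y from truncated normals.
        mu = X @ beta
        lo = np.where(y == 1, -mu, -np.inf)   # standardized bounds: w > 0 when y = 1
        hi = np.where(y == 1, np.inf, -mu)    # w < 0 when y = 0
        w = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
        # 2. Draw beta | w from its conjugate Gaussian full conditional.
        beta = rng.multivariate_normal(V @ (X.T @ w), V)
        draws.append(beta)

    print(np.mean(draws[500:], axis=0))       # should be near beta_true
    ```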

  15. Bayesian Factor Analysis When Only a Sample Covariance Matrix Is Available

    ERIC Educational Resources Information Center

    Hayashi, Kentaro; Arav, Marina

    2006-01-01

    In traditional factor analysis, the variance-covariance matrix or the correlation matrix has often served as the input data. In contrast, in Bayesian factor analysis, the entire data set is typically required to compute the posterior estimates, such as Bayes factor loadings and Bayes unique variances. We propose a simple method for computing…

  16. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis: A Cross-National Investigation of Schwartz Values

    ERIC Educational Resources Information Center

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the model parameters and demonstrates the consequences…

  17. Cyclist activity and injury risk analysis at signalized intersections: a Bayesian modelling approach.

    PubMed

    Strauss, Jillian; Miranda-Moreno, Luis F; Morency, Patrick

    2013-10-01

    This study proposes a two-equation Bayesian modelling approach to simultaneously study cyclist injury occurrence and bicycle activity at signalized intersections as joint outcomes. This approach deals with the potential presence of endogeneity and unobserved heterogeneities and is used to identify factors associated with both cyclist injuries and volumes. Its application to identify high-risk corridors is also illustrated. Montreal, Quebec, Canada is the application environment, using an extensive inventory of a large sample of signalized intersections containing disaggregate motor-vehicle traffic volumes and bicycle flows, geometric design, traffic control and built environment characteristics in the vicinity of the intersections. Cyclist injury data for the period 2003-2008 are used in this study. Also, manual bicycle counts were standardized using temporal and weather adjustment factors to obtain average annual daily volumes. Results confirm and quantify the effects of both bicycle and motor-vehicle flows on cyclist injury occurrence. Accordingly, more cyclists at an intersection translate into more cyclist injuries but lower injury rates due to the non-linear association between bicycle volume and injury occurrence. Furthermore, the results emphasize the importance of turning motor-vehicle movements. The presence of bus stops and total crosswalk length increase cyclist injury occurrence whereas the presence of a raised median has the opposite effect. Bicycle activity through intersections was found to increase as employment, number of metro stations, land use mix, area of commercial land use type, length of bicycle facilities and the presence of schools within 50-800 m of the intersection increase. Intersections with three approaches are expected to have fewer cyclists than those with four. Using Bayesian analysis, expected injury frequency and injury rates were estimated for each intersection and used to rank corridors. Corridors with high bicycle volumes…

  18. Wear analysis of revolute joints with clearance in multibody systems

    NASA Astrophysics Data System (ADS)

    Bai, ZhengFeng; Zhao, Yang; Wang, XingGui

    2013-08-01

    In this work, the prediction of wear for revolute joints with clearance in multibody systems is investigated using a computational methodology. The contact model in the clearance joint is established using a new hybrid nonlinear contact force model, and the friction effect is considered using a modified Coulomb friction model. The dynamics model of a multibody system with clearance is established using a dynamic segmentation modeling method, and the computational process for wear analysis of clearance joints in multibody systems is presented. The computational process comprises two steps: dynamics analysis and wear analysis. The dynamics simulation of a multibody system with a revolute clearance joint is carried out, and the contact forces are extracted and used to calculate the wear amount of the joint based on Archard's wear model. Finally, a four-bar mechanism with a revolute clearance joint is used as a numerical example to demonstrate the dynamic responses and wear characteristics of multibody systems with revolute clearance joints. The main results indicate that contact between the joint elements is wider and more frequent in specific regions, and that wear is not uniform around the joint surface, causing the clearance size to increase non-uniformly as the joint wears. This work presents an effective method to predict wear of revolute joints with clearance in multibody systems.
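
    Archard's wear model referenced above states that the worn volume is proportional to the normal contact force times the sliding distance. A minimal sketch of the wear-accumulation step, with an invented wear coefficient and force/slip histories standing in for the outputs of the dynamics simulation:

    ```python
    # Archard wear accumulation: dV = k_w * F_n * ds over the contact history.
    import numpy as np

    k_w = 5.0e-13          # dimensional wear coefficient, m^3/(N m) (K/H in Archard's law)
    dt = 1.0e-4            # s, integration step from the dynamics simulation
    contact_force = np.abs(np.random.default_rng(4).normal(50.0, 20.0, 10000))  # N, placeholder
    slip_speed = np.full(10000, 0.02)                                           # m/s, placeholder

    wear_volume = np.sum(k_w * contact_force * slip_speed * dt)   # m^3
    contact_area = 1.0e-5   # m^2, nominal area over which wear is spread
    print("wear depth estimate:", wear_volume / contact_area, "m")
    ```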

  19. Results and Analysis from Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Matty, Jennifer

    2010-01-01

    This joint mobility KC lecture included information from two papers, "A Method for and Issues Associated with the Determination of Space Suit Joint Requirements" and "Results and Analysis from Space Suit Joint Torque Testing," as presented for the International Conference on Environmental Systems in 2009 and 2010, respectively. The first paper discusses historical joint torque testing methodologies and approaches that were tested in 2008 and 2009. The second paper discusses the testing that was completed in 2009 and 2010.

  20. OBJECTIVE BAYESIAN ANALYSIS OF ''ON/OFF'' MEASUREMENTS

    SciTech Connect

    Casadei, Diego

    2015-01-01

    In high-energy astrophysics, it is common practice to account for the background overlaid with counts from the source of interest with the help of auxiliary measurements carried out by pointing off-source. In this ''on/off'' measurement, one knows the number of photons detected while pointing toward the source, the number of photons collected while pointing away from the source, and how to estimate the background counts in the source region from the flux observed in the auxiliary measurements. For very faint sources, the number of photons detected is so low that the approximations that hold asymptotically are not valid. On the other hand, an analytical solution exists for the Bayesian statistical inference, which is valid at low and high counts. Here we illustrate the objective Bayesian solution based on the reference posterior and compare the result with the approach very recently proposed by Knoetig, and discuss its most delicate points. In addition, we propose to compute the significance of the excess with respect to the background-only expectation with a method that is able to account for any uncertainty on the background and is valid for any photon count. This method is compared to the widely used significance formula by Li and Ma, which is based on asymptotic properties.
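
    For concreteness, the sketch below evaluates the Li and Ma significance (their equation 17) for made-up counts, alongside a simple Monte Carlo tail probability that marginalizes the background uncertainty, in the spirit of, but not identical to, the reference-posterior solution discussed in the paper:

    ```python
    # On/off significance: classical Li & Ma formula plus a background-
    # marginalized Monte Carlo check.
    import numpy as np

    n_on, n_off, alpha = 28, 90, 0.2   # illustrative counts; alpha = t_on / t_off

    # Li & Ma (1983), eq. 17.
    s_lima = np.sqrt(2.0 * (
        n_on * np.log((1 + alpha) / alpha * n_on / (n_on + n_off))
        + n_off * np.log((1 + alpha) * n_off / (n_on + n_off))
    ))
    print(f"Li & Ma significance: {s_lima:.2f} sigma")

    # Draw the background rate from its Gamma posterior given n_off (flat
    # prior), scale to the on-region exposure, and ask how often a pure-
    # background realization reaches n_on.
    rng = np.random.default_rng(5)
    b = rng.gamma(n_off + 1.0, 1.0, size=1_000_000) * alpha
    p = np.mean(rng.poisson(b) >= n_on)
    print(f"background-only tail probability: {p:.2e}")
    ```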

  1. Exclusive breastfeeding practice in Nigeria: a bayesian stepwise regression analysis.

    PubMed

    Gayawan, Ezra; Adebayo, Samson B; Chitekwe, Stanley

    2014-11-01

    Despite the importance of breast milk, the prevalence of exclusive breastfeeding (EBF) in Nigeria is far lower than what has been recommended for developing countries. Worse still, the practice has recently been on a downward trend in the country. This study aimed to investigate the determinants and geographical variations of EBF in Nigeria; any intervention programme would require a good knowledge of the factors that enhance the practice. A pooled data set from the Nigeria Demographic and Health Surveys conducted in 1999, 2003, and 2008 was analyzed using a Bayesian stepwise approach that involves simultaneous selection of variables and smoothing parameters. Further, the approach allows geographical variations to be investigated at the highly disaggregated level of states. Within a Bayesian context, appropriate priors are assigned to all the parameters and functions. Findings reveal that education of women and their partners, place of delivery, mother's age at birth, and current age of child are associated with increasing prevalence of EBF. However, visits for antenatal care during pregnancy are not associated with EBF in Nigeria. Further, results reveal considerable geographical variation in the practice of EBF. The likelihood of exclusively breastfeeding children is significantly higher in Kwara, Kogi, Osun, and Oyo states but lower in Jigawa, Katsina, and Yobe. Intensive interventions that can lead to improved practice are required in all states in Nigeria. The importance of breastfeeding needs to be emphasized to women during antenatal visits, as this can encourage and enhance the practice after delivery. PMID:24619227

  2. Calibration of crash risk models on freeways with limited real-time traffic data using Bayesian meta-analysis and Bayesian inference approach.

    PubMed

    Xu, Chengcheng; Wang, Wei; Liu, Pan; Li, Zhibin

    2015-12-01

    This study aimed to develop a real-time crash risk model for freeways in China with limited data by using a Bayesian meta-analysis and a Bayesian inference approach. A systematic review was first conducted using three different Bayesian meta-analyses: a fixed effect meta-analysis, a random effect meta-analysis, and a meta-regression. The meta-analyses provided a numerical summary of the effects of traffic variables on crash risk by quantitatively synthesizing results from previous studies. The random effect meta-analysis and the meta-regression produced more conservative estimates of the effects of traffic variables than the fixed effect meta-analysis. The meta-analysis results were then used as informative priors for developing crash risk models with limited data. The choice among the three meta-analyses significantly affected model fit and prediction accuracy: the model based on the meta-regression increased prediction accuracy by about 15% compared with a model developed directly from the limited data. Finally, Bayesian predictive density analysis was used to identify outliers in the limited data, further improving prediction accuracy by 5.0%. PMID:26468977
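
    The two-stage idea, synthesizing previous studies into a prior and then updating it with a limited local sample, can be shown with conjugate normal-normal updating. The sketch below is conceptual: the study fits full crash-risk models, and all numbers here are invented.

    ```python
    # Stage 1: fixed-effect meta-analysis (inverse-variance weighting) of
    # hypothetical published estimates of one traffic variable's effect.
    # Stage 2: use the pooled result as an informative prior for a noisy
    # estimate from the limited local data.
    import numpy as np

    effects = np.array([0.42, 0.31, 0.55, 0.38])     # published coefficients (invented)
    ses = np.array([0.10, 0.15, 0.20, 0.12])         # their standard errors
    w = 1.0 / ses**2
    prior_mean = np.sum(w * effects) / np.sum(w)
    prior_var = 1.0 / np.sum(w)

    local_est, local_se = 0.20, 0.25                 # limited-data estimate (invented)
    post_var = 1.0 / (1.0 / prior_var + 1.0 / local_se**2)
    post_mean = post_var * (prior_mean / prior_var + local_est / local_se**2)
    print(f"informative prior: {prior_mean:.3f} (sd {prior_var**0.5:.3f})")
    print(f"posterior with limited data: {post_mean:.3f} (sd {post_var**0.5:.3f})")
    ```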

  3. Coronal joint spaces of the Temporomandibular joint: Systematic review and meta-analysis

    PubMed Central

    Silva, Joana-Cristina; Pires, Carlos A.; Ponces-Ramalhão, Maria-João-Feio; Lopes, Jorge-Dias

    2015-01-01

    Introduction Joint space measurements of the temporomandibular joint (TMJ) have been used to determine variation in condyle position. The aim of this study was therefore to perform a systematic review and meta-analysis of coronal joint space measurements of the TMJ. Material and Methods An electronic database search was performed with the terms “condylar position”; “joint space” AND “TMJ”. Inclusion criteria were: tomographic 3D imaging of the TMJ and presentation of at least two joint space measurements in the coronal plane. Exclusion criteria were: mandibular fractures, animal studies, surgery, presence of genetic or chronic diseases, case reports, opinion or debate articles, or unpublished material. The risk of bias of each study was judged as high, moderate or low according to the Cochrane risk of bias tool. The values used in the meta-analysis were the medial, superior and lateral joint space measurements and their differences between the right and left joints. Results From the initial search, 2706 articles were retrieved. After excluding duplicates and all studies that did not match the eligibility criteria, four articles qualified for final review. All of the retrieved articles were judged to provide a low level of evidence. All of the reviewed studies were included in the meta-analysis, which found mean coronal joint space values of 2.94 mm (medial), 2.55 mm (superior) and 2.16 mm (lateral). Conclusions The analysis also showed high levels of heterogeneity. Right and left comparisons did not show statistically significant differences. Key words: Temporomandibular joint, systematic review, meta-analysis. PMID:26330944

  4. Bayesian model-averaged benchmark dose analysis via reparameterized quantal-response models.

    PubMed

    Fang, Q; Piegorsch, W W; Simmons, S J; Li, X; Chen, C; Wang, Y

    2015-12-01

    An important objective in biomedical and environmental risk assessment is estimation of minimum exposure levels that induce a pre-specified adverse response in a target population. The exposure points in such settings are typically referred to as benchmark doses (BMDs). Parametric Bayesian estimation for finding BMDs has grown in popularity, and a large variety of candidate dose-response models is available for applying these methods. Each model can possess potentially different parametric interpretation(s), however. We present reparameterized dose-response models that allow for explicit use of prior information on the target parameter of interest, the BMD. We also enhance our Bayesian estimation technique for BMD analysis by applying Bayesian model averaging to produce point estimates and (lower) credible bounds, overcoming associated questions of model adequacy when multimodel uncertainty is present. An example from carcinogenicity testing illustrates the calculations. PMID:26102570
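
    The averaging step can be illustrated with a simplified version of the workflow: fit two standard quantal models by maximum likelihood, convert BIC values into approximate posterior model probabilities, and average the per-model BMDs. The paper's method is fully Bayesian with priors placed on the BMD itself; the BIC-weighted sketch below, on invented data, only conveys the model-averaging idea.

    ```python
    # BIC-weighted model averaging of benchmark doses from two quantal models.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit

    dose = np.array([0.0, 10.0, 50.0, 100.0, 400.0])
    n = np.array([50, 50, 50, 50, 50])
    y = np.array([2, 4, 9, 14, 34])              # responders (invented)
    BMR = 0.10                                   # benchmark response (extra risk)

    def nll(p_fun, theta):
        prob = np.clip(p_fun(dose, theta), 1e-9, 1 - 1e-9)
        return -np.sum(y * np.log(prob) + (n - y) * np.log(1 - prob))

    # Model 1: quantal-linear  P(d) = g + (1 - g)(1 - exp(-b d)).
    ql = lambda d, t: t[0] + (1 - t[0]) * (1 - np.exp(-t[1] * d))
    fit1 = minimize(lambda t: nll(ql, t), x0=[0.05, 0.001],
                    bounds=[(1e-6, 0.5), (1e-8, 1.0)])
    bmd1 = -np.log(1 - BMR) / fit1.x[1]          # closed-form extra-risk BMD

    # Model 2: logistic  P(d) = expit(a + b d); solve extra risk = BMR.
    lg = lambda d, t: expit(t[0] + t[1] * d)
    fit2 = minimize(lambda t: nll(lg, t), x0=[-3.0, 0.01],
                    bounds=[(-10, 0), (1e-8, 1.0)])
    a, b = fit2.x
    p0 = expit(a)
    target = p0 + BMR * (1 - p0)                 # response level at the BMD
    bmd2 = (np.log(target / (1 - target)) - a) / b

    # Approximate posterior model probabilities from BIC (equal model priors).
    k = np.array([2, 2])                         # parameters per model
    bic = 2 * np.array([fit1.fun, fit2.fun]) + k * np.log(n.sum())
    wts = np.exp(-0.5 * (bic - bic.min()))
    wts /= wts.sum()
    print("model weights:", wts, " model-averaged BMD:", wts @ np.array([bmd1, bmd2]))
    ```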

  5. Application of a data-mining method based on Bayesian networks to lesion-deficit analysis

    NASA Technical Reports Server (NTRS)

    Herskovits, Edward H.; Gerring, Joan P.

    2003-01-01

    Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics.

  6. Symptoms of Depression and Challenging Behaviours in People with Intellectual Disability: A Bayesian Analysis. Brief Report

    ERIC Educational Resources Information Center

    Tsiouris, John; Mann, Rachel; Patti, Paul; Sturmey, Peter

    2004-01-01

    Clinicians need to know the likelihood of a condition given a positive or negative diagnostic test. In this study a Bayesian analysis of the Clinical Behavior Checklist for Persons with Intellectual Disabilities (CBCPID) to predict depression in people with intellectual disability was conducted. The CBCPID was administered to 92 adults with…

  7. Family Background Variables as Instruments for Education in Income Regressions: A Bayesian Analysis

    ERIC Educational Resources Information Center

    Hoogerheide, Lennart; Block, Joern H.; Thurik, Roy

    2012-01-01

    The validity of family background variables instrumenting education in income regressions has been much criticized. In this paper, we use data from the 2004 German Socio-Economic Panel and Bayesian analysis to analyze to what degree violations of the strict validity assumption affect the estimation results. We show that, in case of moderate direct…

  8. Using Discrete Loss Functions and Weighted Kappa for Classification: An Illustration Based on Bayesian Network Analysis

    ERIC Educational Resources Information Center

    Zwick, Rebecca; Lenaburg, Lubella

    2009-01-01

    In certain data analyses (e.g., multiple discriminant analysis and multinomial log-linear modeling), classification decisions are made based on the estimated posterior probabilities that individuals belong to each of several distinct categories. In the Bayesian network literature, this type of classification is often accomplished by assigning…

  9. Applying Bayesian Modeling and Receiver Operating Characteristic Methodologies for Test Utility Analysis

    ERIC Educational Resources Information Center

    Wang, Qiu; Diemer, Matthew A.; Maier, Kimberly S.

    2013-01-01

    This study integrated Bayesian hierarchical modeling and receiver operating characteristic analysis (BROCA) to evaluate how interest strength (IS) and interest differentiation (ID) predicted low–socioeconomic status (SES) youth's interest-major congruence (IMC). Using large-scale Kuder Career Search online-assessment data, this study fit three…

  10. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...

  11. Bayesian Factor Analysis as a Variable-Selection Problem: Alternative Priors and Consequences.

    PubMed

    Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric

    2016-01-01

    Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor-loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, Muthén & Asparouhov proposed a Bayesian structural equation modeling (BSEM) approach to explore the presence of cross loadings in CFA models. We show that the issue of determining factor-loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov's approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike-and-slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set is used to demonstrate our approach. PMID:27314566

  12. Variational Bayesian causal connectivity analysis for fMRI

    PubMed Central

    Luessi, Martin; Babacan, S. Derin; Molina, Rafael; Booth, James R.; Katsaggelos, Aggelos K.

    2014-01-01

    The ability to accurately estimate effective connectivity among brain regions from neuroimaging data could help answering many open questions in neuroscience. We propose a method which uses causality to obtain a measure of effective connectivity from fMRI data. The method uses a vector autoregressive model for the latent variables describing neuronal activity in combination with a linear observation model based on a convolution with a hemodynamic response function. Due to the employed modeling, it is possible to efficiently estimate all latent variables of the model using a variational Bayesian inference algorithm. The computational efficiency of the method enables us to apply it to large scale problems with high sampling rates and several hundred regions of interest. We use a comprehensive empirical evaluation with synthetic and real fMRI data to evaluate the performance of our method under various conditions. PMID:24847244

  13. Bayesian Analysis of Foraging by Pigeons (Columba livia)

    PubMed Central

    Killeen, Peter R.; Palombo, Gina-Marie; Gottlob, Lawrence R.; Beam, Jon

    2008-01-01

    In this article, the authors combine models of timing and Bayesian revision of information concerning patch quality to predict foraging behavior. Pigeons earned food by pecking on 2 keys (patches) in an experimental chamber. Food was primed for only 1 of the patches on each trial. There was a constant probability of finding food in a primed patch, but it accumulated only while the animals searched there. The optimal strategy was to choose the better patch first and remain for a fixed duration, thereafter alternating evenly between the patches. Pigeons were nonoptimal in 3 ways: (a) they departed too early, (b) their departure times were variable, and (c) they were biased in their choices after initial departure. The authors review various explanations of these data. PMID:8865614

  14. Micronutrients in HIV: A Bayesian Meta-Analysis

    PubMed Central

    Carter, George M.; Indyk, Debbie; Johnson, Matthew; Andreae, Michael; Suslov, Kathryn; Busani, Sudharani; Esmaeili, Aryan; Sacks, Henry S.

    2015-01-01

    Background Approximately 28.5 million people living with HIV are eligible for treatment (CD4<500) but currently have no access to antiretroviral therapy. Reduced serum levels of micronutrients are common in HIV disease. Micronutrient supplementation (MNS) may mitigate disease progression and mortality. Objectives We synthesized evidence on the effect of micronutrient supplementation on mortality and the rate of disease progression in HIV disease. Methods We searched the MEDLINE, EMBASE, Cochrane Central, AMED and CINAHL databases through December 2014, without language restriction, for studies of more than three micronutrients versus any or no comparator. We built a hierarchical Bayesian random effects model to synthesize results. Inferences are based on the posterior distribution of the population effects; posterior distributions were approximated by Markov chain Monte Carlo in OpenBUGS. Principal Findings From 2166 initial references, we selected 49 studies for full review and identified eight reporting on disease progression and/or mortality. Bayesian synthesis of data from 2,249 adults in three studies estimated the relative risk of disease progression in subjects on MNS versus control as 0.62 (95% credible interval, 0.37, 0.96). The median number needed to treat is 8.4 (4.8, 29.9) and the Bayes factor 53.4. Based on data from 4,095 adults reporting mortality in 7 randomized controlled studies, the RR was 0.84 (0.38, 1.85) and the NNT 25 (4.3, ∞). Conclusions MNS significantly and substantially slows disease progression in HIV+ adults not on ARV, and possibly reduces mortality; micronutrient supplements reduce progression with a posterior probability of 97.9%. Considering the low cost of MNS and its lack of adverse effects, MNS should be standard of care for HIV+ adults not yet on ARV. PMID:25830916
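
    Posterior summaries like the credible interval and NNT above can be reproduced in miniature by simulating event probabilities from conjugate Beta posteriors. The counts below are invented, and the paper fits a hierarchical random-effects model across studies rather than pooling arms directly, so this is only a sketch of the reporting step.

    ```python
    # Posterior of the relative risk and NNT from pooled two-arm event counts.
    import numpy as np

    rng = np.random.default_rng(6)
    # events / totals in supplemented and control arms (hypothetical pooled data)
    e_t, n_t, e_c, n_c = 90, 1100, 140, 1149

    p_t = rng.beta(e_t + 1, n_t - e_t + 1, 200_000)   # Beta(1,1) priors
    p_c = rng.beta(e_c + 1, n_c - e_c + 1, 200_000)
    rr = p_t / p_c
    arr = p_c - p_t                                   # absolute risk reduction
    nnt = 1.0 / arr

    print("RR median, 95% CrI:", np.percentile(rr, [50, 2.5, 97.5]).round(2))
    print("P(RR < 1):", np.mean(rr < 1).round(3))
    print("NNT median:", np.median(nnt).round(1))
    ```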

  15. Bayesian Analysis of Non-Gaussian Long-Range Dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, Timothy; Watkins, Nicholas; Franzke, Christian; Gramacy, Robert

    2013-04-01

    Recent studies [e.g. the Antarctic study of Franzke, J. Climate, 2010] have strongly suggested that surface temperatures exhibit long-range dependence (LRD). The presence of LRD would hamper the identification of deterministic trends and the quantification of their significance. It is well established that LRD processes exhibit stochastic trends over rather long periods of time. Thus, accurate methods for discriminating between physical processes that possess long memory and those that do not are an important adjunct to climate modeling. As we briefly review, the LRD idea originated at the same time as H-self-similarity, so it is often not realised that a model does not have to be H-self-similar to show LRD [e.g. Watkins, GRL Frontiers, 2013]. We have used Markov chain Monte Carlo algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average ARFIMA(p,d,q) processes, which are capable of modeling LRD. Our principal aim is to obtain inference about the long-memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short-memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series. Many physical processes, for example the Faraday Antarctic time series, are significantly non-Gaussian. We have therefore extended this work by weakening the Gaussianity assumption, assuming an alpha-stable distribution for the innovations, and performing joint inference on d and alpha. Such a modified ARFIMA(p,d,q) process is a flexible, initial model for non-Gaussian processes with long memory. We will present a study of the dependence of the posterior variance of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other measures of d.

  16. Bayesian approach to the analysis of neutron Brillouin scattering data on liquid metals.

    PubMed

    De Francesco, A; Guarini, E; Bafile, U; Formisano, F; Scaccia, L

    2016-08-01

    When the dynamics of liquids and disordered systems at the mesoscopic level is investigated by means of inelastic scattering (e.g., neutron or x ray), spectra are often characterized by a poor definition of the excitation lines and spectroscopic features in general, and one important issue is to establish how many of these lines need to be included in the modeling function and to estimate their parameters. Furthermore, when strongly damped excitations are present, commonly used and widespread fitting algorithms are particularly affected by the choice of initial values of the parameters. An inadequate choice may lead to an inefficient exploration of the parameter space, resulting in the algorithm getting stuck in a local minimum. In this paper, we present a Bayesian approach to the analysis of neutron Brillouin scattering data in which the number of excitation lines is treated as unknown and estimated along with the other model parameters. We propose a joint estimation procedure based on a reversible-jump Markov chain Monte Carlo algorithm, which efficiently explores the parameter space, producing a probabilistic measure to quantify the uncertainty on the number of excitation lines as well as reliable parameter estimates. The method proposed could turn out to be of great importance in extracting physical information from experimental data, especially when the detection of spectral features is complicated not only because of the properties of the sample, but also because of the limited instrumental resolution and count statistics. The approach is tested on a generated data set and then applied to real experimental spectra of neutron Brillouin scattering from a liquid metal, previously analyzed in a more traditional way. PMID:27627410

  17. Bayesian approach to the analysis of neutron Brillouin scattering data on liquid metals

    NASA Astrophysics Data System (ADS)

    De Francesco, A.; Guarini, E.; Bafile, U.; Formisano, F.; Scaccia, L.

    2016-08-01

    When the dynamics of liquids and disordered systems at the mesoscopic level is investigated by means of inelastic scattering (e.g., neutron or x ray), spectra are often characterized by a poor definition of the excitation lines and spectroscopic features in general, and one important issue is to establish how many of these lines need to be included in the modeling function and to estimate their parameters. Furthermore, when strongly damped excitations are present, commonly used and widespread fitting algorithms are particularly affected by the choice of initial values of the parameters. An inadequate choice may lead to an inefficient exploration of the parameter space, resulting in the algorithm getting stuck in a local minimum. In this paper, we present a Bayesian approach to the analysis of neutron Brillouin scattering data in which the number of excitation lines is treated as unknown and estimated along with the other model parameters. We propose a joint estimation procedure based on a reversible-jump Markov chain Monte Carlo algorithm, which efficiently explores the parameter space, producing a probabilistic measure to quantify the uncertainty on the number of excitation lines as well as reliable parameter estimates. The method proposed could turn out to be of great importance in extracting physical information from experimental data, especially when the detection of spectral features is complicated not only because of the properties of the sample, but also because of the limited instrumental resolution and count statistics. The approach is tested on a generated data set and then applied to real experimental spectra of neutron Brillouin scattering from a liquid metal, previously analyzed in a more traditional way.

  18. Uncertainty Analysis in Fatigue Life Prediction of Gas Turbine Blades Using Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Li, Yan-Feng; Zhu, Shun-Peng; Li, Jing; Peng, Weiwen; Huang, Hong-Zhong

    2015-12-01

    This paper investigates Bayesian model selection for fatigue life estimation of gas turbine blades, considering model uncertainty and parameter uncertainty simultaneously. Fatigue life estimation of gas turbine blades is a critical issue for the operation and health management of modern aircraft engines. Because many life prediction models have been proposed for gas turbine blades, model uncertainty and model selection among these models have become important issues in the lifecycle management of turbine blades. In this paper, fatigue life estimation is formulated as the joint posterior distribution of a fatigue life prediction model and its model parameters, obtained using Bayesian inference. The Bayes factor is incorporated to implement model selection with the quantified model uncertainty, and the Markov chain Monte Carlo method is used to facilitate the calculation. A pictorial framework and a step-by-step procedure of the Bayesian inference method for fatigue life estimation under model uncertainty are presented. Fatigue life estimation of a gas turbine blade is carried out to demonstrate the proposed method.
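
    The Bayes-factor comparison can be sketched with the common BIC approximation to the marginal likelihood, rather than the full MCMC computation used in the paper. The two competing models below (a Basquin-type power law versus a log-linear S-N relation) and the data are toy stand-ins.

    ```python
    # Approximate Bayes factor between two fatigue-life regression models
    # via the BIC approximation: ln BF_12 ~ -(BIC_1 - BIC_2) / 2.
    import numpy as np

    rng = np.random.default_rng(7)
    stress = np.linspace(300.0, 700.0, 12)                        # MPa
    log_life = 20.0 - 5.0 * np.log(stress) + rng.normal(0, 0.2, 12)

    def bic_linear(x, yv):
        # Gaussian linear regression y = c0 + c1 x with MLE variance; k = 3.
        A = np.column_stack([np.ones_like(x), x])
        coef, *_ = np.linalg.lstsq(A, yv, rcond=None)
        rss = np.sum((yv - A @ coef) ** 2)
        nn = len(yv)
        loglik = -0.5 * nn * (np.log(2 * np.pi * rss / nn) + 1)
        return -2 * loglik + 3 * np.log(nn)

    bic1 = bic_linear(np.log(stress), log_life)   # model 1: power law in stress
    bic2 = bic_linear(stress, log_life)           # model 2: log-life linear in stress
    log_bf_12 = -0.5 * (bic1 - bic2)
    print(f"ln Bayes factor (power law vs. linear): {log_bf_12:.2f}")
    ```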

  19. A fully Bayesian before-after analysis of permeable friction course (PFC) pavement wet weather safety.

    PubMed

    Buddhavarapu, Prasad; Smit, Andre F; Prozzi, Jorge A

    2015-07-01

    Permeable friction course (PFC), a porous hot-mix asphalt, is typically applied to improve wet weather safety on high-speed roadways in Texas. In order to warrant expensive PFC construction, a statistical evaluation of its safety benefits is essential. Generally, the literature on the effectiveness of porous mixes in reducing wet-weather crashes is limited and often inconclusive. In this study, the safety effectiveness of PFC was evaluated using a fully Bayesian before-after safety analysis. First, two groups of road segments overlaid with PFC and non-PFC material were identified across Texas; the non-PFC or reference road segments selected were similar to their PFC counterparts in terms of site specific features. Second, a negative binomial data generating process was assumed to model the underlying distribution of crash counts of PFC and reference road segments to perform Bayesian inference on the safety effectiveness. A data-augmentation based computationally efficient algorithm was employed for a fully Bayesian estimation. The statistical analysis shows that PFC is not effective in reducing wet weather crashes. It should be noted that the findings of this study are in agreement with the existing literature, although these studies were not based on a fully Bayesian statistical analysis. Our study suggests that the safety effectiveness of PFC road surfaces, or any other safety infrastructure, largely relies on its interrelationship with the road user. The results suggest that the safety infrastructure must be properly used to reap the benefits of the substantial investments. PMID:25897515

  20. A Pragmatic Bayesian Perspective on Correlation Analysis - The exoplanetary gravity - stellar activity case

    NASA Astrophysics Data System (ADS)

    Figueira, P.; Faria, J. P.; Adibekyan, V. Zh.; Oshagh, M.; Santos, N. C.

    2016-05-01

    We apply the Bayesian framework to assess the presence of a correlation between two quantities. To do so, we estimate the probability distribution of the parameter of interest, ρ, which characterizes the strength of the correlation. We provide an implementation of these ideas and concepts using the Python programming language and the pyMC module, in a very short (~130 lines of code, heavily commented) and user-friendly program. We used this tool to assess the presence and properties of the correlation between planetary surface gravity and stellar activity level as measured by the log(R'_HK) indicator. The results of the Bayesian analysis are qualitatively similar to those obtained via p-value analysis, and support the presence of a correlation in the data. The results are more robust in their derivation and more informative, revealing interesting features such as asymmetric posterior distributions or markedly different credible intervals, and allowing for a deeper exploration. We encourage readers interested in this kind of problem to apply our code to their own scientific problems. A full understanding of the Bayesian framework can only be gained through the insight that comes from handling priors, assessing the convergence of Monte Carlo runs, and a multitude of other practical problems. We hope to contribute so that Bayesian analysis becomes a tool in the toolkit of researchers, and that they understand by experience its advantages and limitations.
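
    A minimal numpy version of the analysis described above, Metropolis sampling of the posterior of ρ for a bivariate Gaussian under a uniform prior on (-1, 1), is sketched below. The paper's program uses pyMC; plain numpy is used here to keep the example self-contained, and the data are synthetic stand-ins for the gravity-activity measurements.

    ```python
    # Metropolis sampler for the correlation parameter rho of a standardized
    # bivariate Gaussian, with a flat prior on (-1, 1).
    import numpy as np

    rng = np.random.default_rng(8)
    n, true_rho = 40, -0.5
    cov = np.array([[1.0, true_rho], [true_rho, 1.0]])
    xy = rng.multivariate_normal([0, 0], cov, size=n)   # synthetic data
    x, y = xy[:, 0], xy[:, 1]

    def loglik(rho):
        # Bivariate normal log-likelihood with unit variances, zero means.
        q = (x**2 - 2 * rho * x * y + y**2) / (1 - rho**2)
        return -0.5 * np.sum(q) - n * 0.5 * np.log(1 - rho**2)

    rho, samples = 0.0, []
    for _ in range(50000):
        prop = rho + rng.normal(0, 0.1)
        if -1 < prop < 1 and np.log(rng.uniform()) < loglik(prop) - loglik(rho):
            rho = prop
        samples.append(rho)

    post = np.array(samples[5000:])
    print(f"rho: {post.mean():.2f}, 95% credible interval "
          f"[{np.percentile(post, 2.5):.2f}, {np.percentile(post, 97.5):.2f}]")
    ```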

  1. APPLICATION OF PRINCIPAL COMPONENT ANALYSIS AND BAYESIAN DECOMPOSITION TO RELAXOGRAPHIC IMAGING

    SciTech Connect

    Ochs, M. F.; Stoyanova, R. S.; Brown, T. R.; Rooney, W. D.; Li, X.; Lee, J. H.; Springer, C. S.

    1999-05-22

    Recent developments in high field imaging have made possible the acquisition of high quality, low noise relaxographic data in reasonable imaging times. The datasets comprise a huge amount of information (>>1 million points), which makes rigorous analysis daunting. Here, the authors present results demonstrating that Principal Component Analysis (PCA) and Bayesian Decomposition (BD) provide powerful methods for relaxographic analysis of T1 recovery curves and editing of tissue type in resulting images.
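
    The PCA step can be illustrated on synthetic relaxographic data: decompose a set of noisy T1 recovery curves and inspect how much variance the leading components capture. The curve parameters below are invented.

    ```python
    # PCA via SVD on simulated inversion-recovery curves from two "tissues".
    import numpy as np

    rng = np.random.default_rng(10)
    t = np.linspace(0.05, 5.0, 40)                       # s, recovery times
    T1s = rng.choice([0.8, 1.4], size=200)               # two tissue T1 values, s
    curves = 1 - 2 * np.exp(-t[None, :] / T1s[:, None])  # ideal recovery curves
    curves += rng.normal(0, 0.02, curves.shape)          # measurement noise

    X = curves - curves.mean(axis=0)                     # center before PCA
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    print("variance explained by first 3 PCs:", explained[:3].round(3))
    ```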

  2. Competing risk models in reliability systems, a weibull distribution model with bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described simply as functioning or failed, whereas in many real situations failures may arise from many causes, depending on the age and environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems, and Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of a prior distribution, with life test data to make inferences about the parameter of interest. In this paper, we investigate the application of Bayesian estimation to competing-risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using both Bayesian and maximum likelihood analyses. The simulation results show that changing the true value of one parameter relative to another changes the standard deviation in the opposite direction. With perfect information on the prior distribution, Bayesian estimation outperforms maximum likelihood estimation. The sensitivity analyses show some sensitivity to shifts of the prior locations, and they also show the robustness of the Bayesian analysis within the range considered.
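
    A hedged sketch of Bayesian estimation for a two-cause competing-risk system with independent Weibull failure modes: each unit fails at the minimum of the two latent times and the failing cause is recorded. A random-walk Metropolis sampler with weak log-normal priors stands in for whatever estimation scheme one prefers; the data are simulated.

    ```python
    # Competing risks: each unit's likelihood contribution is the hazard of
    # the observed cause times the joint survival of both causes.
    import numpy as np

    rng = np.random.default_rng(9)
    true = [(2.0, 100.0), (1.2, 150.0)]            # (shape, scale) per cause
    t_lat = np.column_stack([rng.weibull(b, 300) * e for b, e in true])
    t = t_lat.min(axis=1)                          # observed failure time
    cause = t_lat.argmin(axis=1)                   # observed failing cause

    def logpost(theta):
        b1, e1, b2, e2 = np.exp(theta)             # log scale keeps parameters positive
        h = np.where(cause == 0,
                     (b1 / e1) * (t / e1) ** (b1 - 1),   # hazard of observed cause
                     (b2 / e2) * (t / e2) ** (b2 - 1))
        surv = -(t / e1) ** b1 - (t / e2) ** b2          # joint log-survival
        # Weak log-normal priors centered at shape 1, scale 100 (an assumption).
        prior = -0.5 * np.sum((theta - np.log([1, 100, 1, 100])) ** 2 / 4.0)
        return np.sum(np.log(h) + surv) + prior

    theta = np.log([1.0, 100.0, 1.0, 100.0])
    draws = []
    for _ in range(30000):
        prop = theta + rng.normal(0, 0.03, 4)
        if np.log(rng.uniform()) < logpost(prop) - logpost(theta):
            theta = prop
        draws.append(np.exp(theta))

    print(np.mean(draws[10000:], axis=0))          # shapes/scales near the truth
    ```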

  3. Toward an ecological analysis of Bayesian inferences: how task characteristics influence responses

    PubMed Central

    Hafenbrädl, Sebastian; Hoffrage, Ulrich

    2015-01-01

    In research on Bayesian inferences, the specific tasks, with their narratives and characteristics, are typically seen as exchangeable vehicles that merely transport the structure of the problem to research participants. In the present paper, we explore whether, and possibly how, task characteristics that are usually ignored influence participants’ responses in these tasks. We focus both on quantitative dimensions of the tasks, such as their base rates, hit rates, and false-alarm rates, and on qualitative characteristics, such as whether the task involves a norm violation, whether the stakes are high or low, and whether the focus is on the individual case or on the numbers. Using a data set of 19 different tasks presented to 500 different participants who provided a total of 1,773 responses, we analyze these responses in two ways: first, on the level of the numerical estimates themselves, and second, on the level of various response strategies, Bayesian and non-Bayesian, that might have produced the estimates. We identified various contingencies, and most of the task characteristics had an influence on participants’ responses. Typically, this influence was stronger when the numerical information in the tasks was presented in terms of probabilities or percentages rather than natural frequencies, and this effect cannot be fully explained by a higher proportion of Bayesian responses when natural frequencies were used. One characteristic that did not seem to influence participants’ response strategy was the numerical value of the Bayesian solution itself. Our exploratory study is a first step toward an ecological analysis of Bayesian inferences, and highlights new avenues for future research. PMID:26300791
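
    For readers unfamiliar with the task structure: the normative answer in these problems follows from the base rate, hit rate, and false-alarm rate alone. A minimal sketch of both presentation formats (the mammography numbers are the classic textbook values, and the sample size of 1,000 is arbitrary):

      def posterior(base_rate, hit_rate, false_alarm_rate):
          # Bayes' rule in the probability format.
          p_positive = base_rate * hit_rate + (1 - base_rate) * false_alarm_rate
          return base_rate * hit_rate / p_positive

      def natural_frequencies(base_rate, hit_rate, false_alarm_rate, n=1000):
          # The same computation phrased as counts out of n people.
          sick = base_rate * n
          true_pos = hit_rate * sick
          false_pos = false_alarm_rate * (n - sick)
          return true_pos / (true_pos + false_pos)

      print(posterior(0.01, 0.80, 0.096))             # ~0.078
      print(natural_frequencies(0.01, 0.80, 0.096))   # same answer, easier to track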

  4. Candidate gene association study in pediatric acute lymphoblastic leukemia evaluated by Bayesian network based Bayesian multilevel analysis of relevance

    PubMed Central

    2012-01-01

    Background We carried out a candidate gene association study in pediatric acute lymphoblastic leukemia (ALL) to identify possible genetic risk factors in a Hungarian population. Methods The results were evaluated with traditional statistical methods and with our newly developed Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA) method. We collected genomic DNA and clinical data from 543 children, who underwent chemotherapy due to ALL, and 529 healthy controls. Altogether, 66 single nucleotide polymorphisms (SNPs) in 19 candidate genes were genotyped. Results With logistic regression, we identified 6 SNPs in the ARID5B and IKZF1 genes associated with increased risk of B-cell ALL, and two SNPs in the STAT3 gene that decreased the risk of hyperdiploid ALL. Because the associated SNPs were in linkage within each gene, these associations corresponded to one signal per gene. The odds ratios (ORs) associated with the tag SNPs were: OR = 1.69, P = 2.22×10^-7 for rs4132601 (IKZF1); OR = 1.53, P = 1.95×10^-5 for rs10821936 (ARID5B); and OR = 0.64, P = 2.32×10^-4 for rs12949918 (STAT3). With the BN-BMLA we confirmed the findings of the frequentist-based method and obtained additional information about the nature of the relations between the SNPs and the disease. For example, rs10821936 in ARID5B and rs17405722 in STAT3 showed a weak interaction, and in the T-cell lineage sample group, gender showed a weak interaction with three SNPs in three genes. In the hyperdiploid patient group, the BN-BMLA detected a strong interaction among SNPs in the NOTCH1, STAT1, STAT3 and BCL2 genes. Evaluating the survival rate of the patients with ALL, the BN-BMLA showed that besides risk groups and subtypes, genetic variations in the BAX and CEBPA genes might also influence the probability of survival of the patients. Conclusions In the present study we confirmed the roles of genetic variations in ARID5B and IKZF1 in the susceptibility to B-cell ALL
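
    For a single tag SNP, the frequentist side of such a test reduces to a 2x2 contingency table of allele carriage by case/control status; a sketch with invented counts (not the study's data):

      from scipy.stats import fisher_exact

      # Rows: carriers / non-carriers of the risk allele; columns: cases / controls.
      table = [[180, 110],
               [363, 419]]
      odds_ratio, p_value = fisher_exact(table)
      print(f"OR = {odds_ratio:.2f}, p = {p_value:.2e}")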

  5. Doubly Bayesian Analysis of Confidence in Perceptual Decision-Making

    PubMed Central

    Bahrami, Bahador; Latham, Peter E.

    2015-01-01

    Humans stand out from other animals in that they are able to explicitly report on the reliability of their internal operations. This ability, which is known as metacognition, is typically studied by asking people to report their confidence in the correctness of some decision. However, the computations underlying confidence reports remain unclear. In this paper, we present a fully Bayesian method for directly comparing models of confidence. Using a visual two-interval forced-choice task, we tested whether confidence reports reflect heuristic computations (e.g. the magnitude of sensory data) or Bayes optimal ones (i.e. how likely a decision is to be correct given the sensory data). In a standard design in which subjects were first asked to make a decision, and only then gave their confidence, subjects were mostly Bayes optimal. In contrast, in a less-commonly used design in which subjects indicated their confidence and decision simultaneously, they were roughly equally likely to use the Bayes optimal strategy or to use a heuristic but suboptimal strategy. Our results suggest that, while people’s confidence reports can reflect Bayes optimal computations, even a small unusual twist or additional element of complexity can prevent optimality. PMID:26517475
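
    Under the usual Gaussian evidence model for such tasks (an assumption of this sketch, not necessarily the paper's exact specification), the Bayes optimal confidence has a closed form that a heuristic based on raw evidence magnitude does not reproduce:

      import numpy as np

      def bayes_confidence(x, mu=1.0, sigma=1.0):
          # Evidence x ~ N(+mu, sigma^2) under one alternative and N(-mu, sigma^2)
          # under the other; the observer reports sign(x). The likelihood ratio is
          # exp(2*mu*x/sigma^2), so the posterior probability of being correct is:
          return 1.0 / (1.0 + np.exp(-2.0 * mu * np.abs(x) / sigma ** 2))

      print(bayes_confidence(0.3), bayes_confidence(2.0))  # weak vs. strong evidence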

  6. Bayesian Analysis of Cosmic Ray Propagation: Evidence against Homogeneous Diffusion

    NASA Astrophysics Data System (ADS)

    Jóhannesson, G.; Ruiz de Austri, R.; Vincent, A. C.; Moskalenko, I. V.; Orlando, E.; Porter, T. A.; Strong, A. W.; Trotta, R.; Feroz, F.; Graff, P.; Hobson, M. P.

    2016-06-01

    We present the results of the most complete scan of the parameter space for cosmic ray (CR) injection and propagation. We perform a Bayesian search of the main GALPROP parameters, using the MultiNest nested sampling algorithm, augmented by the BAMBI neural network machine-learning package. This is the first study to separate out low-mass isotopes (p, p̄, and He) from the usual light elements (Be, B, C, N, and O). We find that the propagation parameters that best fit the p, p̄, and He data are significantly different from those that fit the light elements, including the B/C and 10Be/9Be secondary-to-primary ratios normally used to calibrate propagation parameters. This suggests that each set of species is probing a very different interstellar medium, and that the standard approach of calibrating propagation parameters using B/C can lead to incorrect results. We present posterior distributions and best-fit parameters for the propagation of both sets of nuclei, as well as for the injection abundances of elements from H to Si. The input GALDEF files with these new parameters will be included in an upcoming public GALPROP update.

  7. Progressive Damage Analysis of Bonded Composite Joints

    NASA Technical Reports Server (NTRS)

    Leone, Frank A., Jr.; Girolamo, Donato; Davila, Carlos G.

    2012-01-01

    The present work is related to the development and application of progressive damage modeling techniques to bonded joint technology. The joint designs studied in this work include a conventional composite splice joint and a NASA-patented durable redundant joint. Both designs involve honeycomb sandwich structures with carbon/epoxy facesheets joined using adhesively bonded doublers. Progressive damage modeling allows for the prediction of the initiation and evolution of damage within a structure. For structures that include multiple material systems, such as the joint designs under consideration, the number of potential failure mechanisms that must be accounted for drastically increases the complexity of the analyses. Potential failure mechanisms include fiber fracture, intraply matrix cracking, delamination, core crushing, adhesive failure, and their interactions. The bonded joints were modeled using highly parametric, explicitly solved finite element models, with damage modeling implemented via custom user-written subroutines. Each ply was discretely meshed using three-dimensional solid elements. Layers of cohesive elements were included between each ply to account for the possibility of delaminations and were used to model the adhesive layers forming the joint. Good correlation with experimental results was achieved both in terms of load-displacement history and the predicted failure mechanism(s).

  8. fMRI data analysis with nonstationary noise models: a Bayesian approach.

    PubMed

    Luo, Huaien; Puthusserypady, Sadasivan

    2007-09-01

    The assumption of noise stationarity in the functional magnetic resonance imaging (fMRI) data analysis may lead to the loss of crucial dynamic features of the data and thus result in inaccurate activation detection. In this paper, a Bayesian approach is proposed to analyze the fMRI data with two nonstationary noise models (the time-varying variance noise model and the fractional noise model). The covariance matrices of the time-varying variance noise and the fractional noise after wavelet transform are diagonal matrices. This property is investigated under the Bayesian framework. The Bayesian estimator not only gives an accurate estimate of the weights in the general linear model, but also provides the posterior probability of activation in a voxel and, hence, avoids the limitations of the classical methods (i.e., their reliance on hypothesis testing alone). The performance of the proposed Bayesian methods (under the assumption of different noise models) is compared with the ordinary least squares (OLS) and the weighted least squares (WLS) methods. Results from the simulation studies validate the superiority of the proposed approach to the OLS and WLS methods, considering the complex noise structures in the fMRI data. PMID:17867354

  9. Bayesian approach for counting experiment statistics applied to a neutrino point source analysis

    NASA Astrophysics Data System (ADS)

    Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.

    2013-12-01

    In this paper we present a model-independent analysis method, following Bayesian statistics, to analyse data from a generic counting experiment, and apply it to the search for neutrinos from point sources. We discuss a test statistic defined within a Bayesian framework that will be used in the search for a signal. In case no signal is found, we derive an upper limit without the introduction of approximations. The Bayesian approach allows us to obtain the full probability density function for both the background and the signal rate; as such, we have direct access to any signal upper limit. The upper limit derivation compares directly with a frequentist approach and is robust in the case of low-counting observations. Furthermore, it also allows one to account for previous upper limits obtained by other analyses via the concept of prior information, without the need for the ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and obtained a flux upper limit, which is in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube using the same data set.
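
    The core of such a counting analysis fits in a few lines: with an observed count n and known background rate b, the posterior for the signal rate s under a flat prior is proportional to Poisson(n; s + b), and the upper limit is a posterior quantile. A minimal numerical sketch (the flat prior and grid integration are simplifying assumptions; prior information from earlier analyses would enter as a non-flat prior):

      import numpy as np

      def bayesian_upper_limit(n_obs, b, cl=0.90, s_max=50.0, n_grid=5001):
          # Posterior on the signal rate s: p(s | n) is proportional to
          # (s + b)^n * exp(-(s + b)); normalize on a grid and invert the CDF.
          s = np.linspace(0.0, s_max, n_grid)
          log_post = n_obs * np.log(s + b) - (s + b)
          post = np.exp(log_post - log_post.max())
          cdf = np.cumsum(post)
          cdf /= cdf[-1]
          return s[np.searchsorted(cdf, cl)]

      print(bayesian_upper_limit(n_obs=5, b=3.2))   # 90% credible upper limit on s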

  10. Analysis and design of advanced composite bonded joints

    NASA Technical Reports Server (NTRS)

    Hart-Smith, L. J.

    1974-01-01

    Advances in the analysis of adhesive-bonded joints are presented with particular emphasis on advanced composite structures. The joints analyzed are of double-lap, single-lap, scarf, stepped-lap and tapered-lap configurations. Tensile, compressive, and in-plane shear loads are covered. In addition to the usual geometric variables, the theory accounts for the strength increases attributable to adhesive plasticity (in terms of the elastic-plastic adhesive model) and the joint strength reductions imposed by imbalances between the adherends. The solutions are largely closed-form analytical results, employing iterative solutions on a digital computer for the more complicated joint configurations. In assessing the joint efficiency, three potential failure modes are considered. These are adherend failure outside the joint, adhesive failure in shear, and adherend interlaminar tension failure (or adhesive failure in peel). Each mode is governed by a distinct mathematical analysis and each prevails throughout different ranges of geometric sizes and proportions.

  11. Bayesian extreme rainfall analysis using informative prior: A case study of Alor Setar

    NASA Astrophysics Data System (ADS)

    Eli, Annazirin; Zin, Wan Zawiah Wan; Ibrahim, Kamarulzaman; Jemain, Abdul Aziz

    2014-09-01

    Bayesian analysis is an alternative approach to statistical inference. The ability to include additional information about the model parameters is one of its key capabilities. In extreme rainfall analysis, expert opinion can be used as prior information to model extreme events; considering previous or expert knowledge about the parameter of interest reduces the uncertainty of the model. In this study, the annual maximum (AM) rainfall data of the Alor Setar rain gauge station are modeled by the Generalized Extreme Value (GEV) distribution. Bayesian Markov Chain Monte Carlo (MCMC) simulation is used for parameter estimation. Comparison of the outcomes between non-informative and informative priors is our main interest. The results show a reduction in the estimated values, which is due to the informative priors.
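
    A bare-bones version of such an estimation, with synthetic data standing in for the Alor Setar series and flat priors (an informative prior would simply add its log-density in log_post below), can be written with a random-walk Metropolis sampler:

      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(1)
      # Synthetic annual maxima; scipy's shape c corresponds to -xi in the
      # usual GEV convention. All parameter values here are invented.
      data = genextreme.rvs(c=-0.1, loc=100.0, scale=25.0, size=60, random_state=rng)

      def log_post(theta):
          c, loc, scale = theta
          if scale <= 0:
              return -np.inf        # flat priors elsewhere
          return genextreme.logpdf(data, c=c, loc=loc, scale=scale).sum()

      theta = np.array([0.0, data.mean(), data.std()])
      lp = log_post(theta)
      draws = []
      for _ in range(20000):
          prop = theta + rng.normal(0.0, [0.05, 2.0, 1.5])   # random-walk proposal
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:           # Metropolis accept step
              theta, lp = prop, lp_prop
          draws.append(theta)
      print(np.array(draws[5000:]).mean(axis=0))             # posterior means after burn-in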

  12. Bayesian Statistical Analysis Applied to NAA Data for Neutron Flux Spectrum Determination

    NASA Astrophysics Data System (ADS)

    Chiesa, D.; Previtali, E.; Sisti, M.

    2014-04-01

    In this paper, we present a statistical method, based on Bayesian statistics, to evaluate the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation analysis (NAA) experiment [A. Borio di Tigliole et al., Absolute flux measurement by NAA at the Pavia University TRIGA Mark II reactor facilities, ENC 2012 - Transactions Research Reactors, ISBN 978-92-95064-14-0, 22 (2012)] performed at the TRIGA Mark II reactor of Pavia University (Italy). In order to evaluate the neutron flux spectrum, subdivided into energy groups, we must solve a system of linear equations containing the grouped cross sections and the activation rate data. We solve this problem with Bayesian statistical analysis, including the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov Chain Monte Carlo (MCMC) simulations, is used to define the statistical model of the problem and solve it. The energy group fluxes and their uncertainties are then determined with great accuracy, and the correlations between the groups are analyzed. Finally, the dependence of the results on the choice of prior distribution and on the group cross section data is investigated to confirm the reliability of the analysis.
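
    Because the activation equations are linear in the group fluxes, a Gaussian toy version of the problem has a closed-form posterior; the sketch below (all matrices and values invented) shows the conjugate update that the MCMC treatment generalizes:

      import numpy as np

      rng = np.random.default_rng(2)
      A = rng.uniform(0.1, 1.0, size=(8, 3))        # 8 isotopes x 3 groups (toy cross sections)
      phi_true = np.array([5.0, 2.0, 1.0])          # "true" group fluxes
      y = A @ phi_true + rng.normal(0.0, 0.05, 8)   # simulated activation rates

      Sigma_inv = np.eye(8) / 0.05 ** 2             # measurement precision
      mu0 = np.array([4.0, 3.0, 1.5])               # a priori flux guess
      P_inv = np.eye(3) / 2.0 ** 2                  # prior precision

      # Conjugate Gaussian update: posterior precision and mean in closed form.
      post_prec = A.T @ Sigma_inv @ A + P_inv
      post_mean = np.linalg.solve(post_prec, A.T @ Sigma_inv @ y + P_inv @ mu0)
      post_sd = np.sqrt(np.diag(np.linalg.inv(post_prec)))
      print(post_mean, post_sd)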

  13. Joint Analysis of Multiple Metagenomic Samples

    PubMed Central

    Baran, Yael; Halperin, Eran

    2012-01-01

    The availability of metagenomic sequencing data, generated by sequencing DNA pooled from multiple microbes living jointly, has increased sharply in the last few years with developments in sequencing technology. Characterizing the contents of metagenomic samples is a challenging task, which has been extensively attempted by both supervised and unsupervised techniques, each with its own limitations. Common to practically all the methods is the processing of single samples only; when multiple samples are sequenced, each is analyzed separately and the results are combined. In this paper we propose to perform a combined analysis of a set of samples in order to obtain a better characterization of each of the samples, and provide two applications of this principle. First, we use an unsupervised probabilistic mixture model to infer hidden components shared across metagenomic samples. We incorporate the model in a novel framework for studying association of microbial sequence elements with phenotypes, analogous to the genome-wide association studies performed on human genomes: We demonstrate that stratification may result in false discoveries of such associations, and that the components inferred by the model can be used to correct for this stratification. Second, we propose a novel read clustering (also termed “binning”) algorithm which operates on multiple samples simultaneously, leveraging the assumption that the different samples contain the same microbial species, possibly in different proportions. We show that integrating information across multiple samples yields more precise binning on each of the samples. Moreover, for both applications we demonstrate that given a fixed depth of coverage, the average per-sample performance generally increases with the number of sequenced samples as long as the per-sample coverage is high enough. PMID:22359490

  14. Environmental Modeling and Bayesian Analysis for Assessing Human Health Impacts from Radioactive Waste Disposal

    NASA Astrophysics Data System (ADS)

    Stockton, T.; Black, P.; Tauxe, J.; Catlett, K.

    2004-12-01

    Bayesian decision analysis provides a unified framework for coherent decision-making. Two key components of Bayesian decision analysis are probability distributions and utility functions. Calculating posterior distributions and performing decision analysis can be computationally challenging, especially for complex environmental models. In addition, probability distributions and utility functions for environmental models must be specified through expert elicitation, stakeholder consensus, or data collection, all of which have their own set of technical and political challenges. Nevertheless, a grand appeal of the Bayesian approach for environmental decision-making is the explicit treatment of uncertainty, including expert judgment. The impact of expert judgment on the environmental decision process, though integral, goes largely unassessed. Regulations and orders of the Environmental Protection Agency, Department of Energy, and Nuclear Regulatory Commission require assessing the impact on human health of radioactive waste contamination over periods of up to ten thousand years. Towards this end, complex environmental simulation models are used to assess "risk" to human and ecological health from the migration of radioactive waste. As the computational burden of environmental modeling is continually reduced, probabilistic process modeling using Monte Carlo simulation is becoming a routine way to propagate uncertainty from model inputs through model predictions. The utility of a Bayesian approach to environmental decision-making is discussed within the context of a buried radioactive waste example. This example highlights the desirability and difficulties of merging the cost of monitoring, the cost of the decision analysis, the cost and viability of clean up, and the probability of human health impacts within a rigorous decision framework.

  15. Bayesian uncertainty analysis for complex physical systems modelled by computer simulators with applications to tipping points

    NASA Astrophysics Data System (ADS)

    Caiado, C. C. S.; Goldstein, M.

    2015-09-01

    In this paper we present and illustrate basic Bayesian techniques for the uncertainty analysis of complex physical systems modelled by computer simulators. We focus on emulation and history matching and also discuss the treatment of observational errors and structural discrepancies in time series. We exemplify such methods using a four-box model for the thermohaline circulation. We show how these methods may be applied to systems containing tipping points and how to treat possible discontinuities using multiple emulators.

  16. Fully Bayesian hierarchical modelling in two stages, with application to meta-analysis

    PubMed Central

    Lunn, David; Barrett, Jessica; Sweeting, Michael; Thompson, Simon

    2013-01-01

    Meta-analysis is often undertaken in two stages, with each study analysed separately in stage 1 and estimates combined across studies in stage 2. The study-specific estimates are assumed to arise from normal distributions with known variances equal to their corresponding estimates. In contrast, a one-stage analysis estimates all parameters simultaneously. A Bayesian one-stage approach offers additional advantages, such as the acknowledgement of uncertainty in all parameters and greater flexibility. However, there are situations when a two-stage strategy is compelling, e.g. when study-specific analyses are complex and/or time consuming. We present a novel method for fitting the full Bayesian model in two stages, hence benefiting from its advantages while retaining the convenience and flexibility of a two-stage approach. Using Markov chain Monte Carlo methods, posteriors for the parameters of interest are derived separately for each study. These are then used as proposal distributions in a computationally efficient second stage. We illustrate these ideas on a small binomial data set; we also analyse motivating data on the growth and rupture of abdominal aortic aneurysms. The two-stage Bayesian approach closely reproduces a one-stage analysis when it can be undertaken, but can also be easily carried out when a one-stage approach is difficult or impossible. PMID:24223435

  17. Copula models for frequency analysis what can be learned from a Bayesian perspective?

    NASA Astrophysics Data System (ADS)

    Parent, Eric; Favre, Anne-Catherine; Bernier, Jacques; Perreault, Luc

    2014-01-01

    Large spring floods in the Québec region exhibit correlated peakflow, duration and volume. Consequently, traditional univariate hydrological frequency analyses must be complemented by multivariate probabilistic assessment to provide a meaningful design flood level, as requested in hydrological engineering (based on return period evaluation of a single quantity of interest). In this paper we study 47 years of a peak/volume dataset for the Romaine River with a parametric copula model. The margins are modeled with a normal or gamma distribution and the dependence is depicted through a parametric family of copulas (Arch 12 or Arch 14). Joint parameter inference and model selection are performed under the Bayesian paradigm. This approach highlights specific features of interest for hydrological engineering: (i) cross correlations between margin parameters are stronger than expected, (ii) marginal distributions cannot be neglected in the model selection process, and (iii) special attention must be paid to model validation where extreme values are concerned.
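
    The paper works with Archimedean copulas (Arch 12/14); as a simpler stand-in, the sketch below draws correlated peak/volume pairs through a Gaussian copula with illustrative margins, which conveys the same separation between marginal distributions and dependence structure:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      rho = 0.7                                 # illustrative dependence strength
      z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=1000)
      u = stats.norm.cdf(z)                     # uniform margins: the copula itself
      peak = stats.gamma.ppf(u[:, 0], a=3.0, scale=150.0)        # gamma margin (peakflow)
      volume = stats.norm.ppf(u[:, 1], loc=5000.0, scale=800.0)  # normal margin (volume)
      print(np.corrcoef(peak, volume)[0, 1])    # dependence survives the margin transforms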

  18. Labour and residential accessibility: a Bayesian analysis based on Poisson gravity models with spatial effects

    NASA Astrophysics Data System (ADS)

    Alonso, M. P.; Beamonte, M. A.; Gargallo, P.; Salvador, M. J.

    2014-10-01

    In this study, we measure jointly the labour and the residential accessibility of a basic spatial unit using a Bayesian Poisson gravity model with spatial effects. The accessibility measures are broken down into two components: the attractiveness component, which is related to its socio-economic and demographic characteristics, and the impedance component, which reflects the ease of communication within and between basic spatial units. For illustration purposes, the methodology is applied to a data set containing information about commuters from the Spanish region of Aragón. We identify the areas with better labour and residential accessibility, and we also analyse the attractiveness and the impedance components of a set of chosen localities which allows us to better understand their mobility patterns.

  19. Inference of posterior inclusion probability of QTLs in Bayesian shrinkage analysis.

    PubMed

    Yang, Deguang; Han, Shanshan; Jiang, Dan; Yang, Runqing; Fang, Ming

    2015-01-01

    Bayesian shrinkage analysis estimates all QTL effects simultaneously, shrinking the effects of "insignificant" QTLs close to zero so that no special model selection is needed. Bayesian shrinkage estimation usually performs excellently in multiple-QTL mapping, but it cannot give a probabilistic statement of how often a QTL is included in the model, also called the posterior inclusion probability, which is important for assessing the significance of a QTL. In this research, two methods, FitMix and SimMix, are proposed to approximate the posterior probabilities. Under the assumption that the estimated QTL effects follow a mixture distribution, FitMix and SimMix fit the mixture distribution mathematically and intuitively, respectively. The simulation results showed that both methods gave very reasonable estimates for the posterior probabilities. We also applied the two methods to map QTLs for the North American Barley Genome Mapping Project data. PMID:25857576

  20. Bayesian estimation of dynamic matching function for U-V analysis in Japan

    NASA Astrophysics Data System (ADS)

    Kyo, Koki; Noda, Hideo; Kitagawa, Genshiro

    2012-05-01

    In this paper we propose a Bayesian method for analyzing unemployment dynamics. We derive a Beveridge curve for unemployment and vacancy (U-V) analysis from a Bayesian model based on a labor market matching function. In our framework, the efficiency of matching and the elasticities of new hiring with respect to unemployment and vacancy are regarded as time-varying parameters. To construct a flexible model and obtain reasonable estimates in an underdetermined estimation problem, we treat the time-varying parameters as random variables and introduce smoothness priors. The model is then described in a state space representation, enabling the parameter estimation to be carried out using the Kalman filter and fixed-interval smoothing. In such a representation, dynamic features of the cyclical unemployment rate and the structural-frictional unemployment rate can be accurately captured.
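
    In its simplest scalar form, this kind of state space treatment reduces to a local-level model: the smoothness prior is a random walk on the time-varying parameter, and the Kalman filter runs in a few lines (the noise variances below are invented; the full model adds the matching-function structure):

      import numpy as np

      def local_level_filter(y, q=0.01, r=1.0):
          # State: x_t = x_{t-1} + w_t, w ~ N(0, q)  (the smoothness prior)
          # Observation: y_t = x_t + v_t, v ~ N(0, r)
          x, P = y[0], r                    # crude initialization
          out = np.empty(len(y))
          for i, obs in enumerate(y):
              P += q                        # predict
              K = P / (P + r)               # Kalman gain
              x += K * (obs - x)            # update
              P *= 1 - K
              out[i] = x
          return out

      rng = np.random.default_rng(4)
      y = np.cumsum(rng.normal(0.0, 0.1, 200)) + rng.normal(0.0, 1.0, 200)
      print(local_level_filter(y)[-5:])     # filtered estimates of the drifting level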

  1. Bayesian calibration of a soil organic carbon model with radiocarbon measurements of heterotrophic respiration and soil organic carbon as joint constraints

    NASA Astrophysics Data System (ADS)

    Ahrens, B.; Borken, W.; Muhr, J.; Savage, K.; Wutzler, T.; Trumbore, S.; Reichstein, M.

    2012-04-01

    Soils of temperate forests store significant amounts of organic matter and are considered to be net sinks of atmospheric CO2. Soil organic carbon (SOC) dynamics have been studied using the Δ14C signature of bulk SOC or of different SOC fractions as observational constraints in SOC models. Further, the Δ14C signature of CO2 evolved during the incubation of soil and roots has been widely used, together with the Δ14C of total soil respiration, to partition soil respiration into heterotrophic respiration (HR) and rhizosphere respiration. However, these data have not been used as joint observational constraints to determine SOC turnover times. Thus, we present: (1) how different combinations of observational constraints help to narrow estimates of turnover times and other parameters of a simple two-pool model, ICBM; and (2) whether a multiple-constraints approach allows one to determine if a forest soil has been storing or losing SOC. To this end, ICBM was adapted to model SOC and SO14C in parallel, with litterfall and the Δ14C signature of litterfall as driving variables. The Δ14C signature of the atmosphere, with its prominent bomb peak, was used as a proxy for the Δ14C signature of litterfall. Data from three spruce-dominated temperate forests in Germany and the USA (Coulissenhieb II, Solling D0 and Howland Tower site) were used to estimate the parameters of ICBM via Bayesian calibration. Key findings are: (1) the joint use of all four observational constraints helped to considerably narrow the turnover times of the young pool (primarily via Δ14C of HR) and the old pool (primarily via Δ14C of SOC). Furthermore, the joint use of all observational constraints allowed constraining the humification factor in ICBM, which describes the fraction of the annual outflux from the young pool that enters the old pool. The Bayesian parameter estimation yielded the following turnover times (median ± interquartile range) for SOC in the young pool: Coulissenhieb II 2.9 ± 2.1 years, Solling D0 8.4 ± 1

  2. Bayesian Propensity Score Analysis: Simulation and Case Study

    ERIC Educational Resources Information Center

    Kaplan, David; Chen, Cassie J. S.

    2011-01-01

    Propensity score analysis (PSA) has been used in a variety of settings, such as education, epidemiology, and sociology. Most typically, propensity score analysis has been implemented within the conventional frequentist perspective of statistics. This perspective, as is well known, does not account for uncertainty in either the parameters of the…

  3. Applications of Bayesian Procrustes shape analysis to ensemble radar reflectivity nowcast verification

    NASA Astrophysics Data System (ADS)

    Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang

    2016-07-01

    This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.

  4. Bayesian sensitivity analysis of a nonlinear finite element model

    NASA Astrophysics Data System (ADS)

    Becker, W.; Oakley, J. E.; Surace, C.; Gili, P.; Rowson, J.; Worden, K.

    2012-10-01

    A major problem in uncertainty and sensitivity analysis is that the computational cost of propagating probabilistic uncertainty through large nonlinear models can be prohibitive when using conventional methods (such as Monte Carlo methods). A powerful solution to this problem is to use an emulator, which is a mathematical representation of the model built from a small set of model runs at specified points in input space. Such emulators are massively cheaper to run and can be used to mimic the "true" model, with the result that uncertainty analysis and sensitivity analysis can be performed for a greatly reduced computational cost. The work here investigates the use of an emulator known as a Gaussian process (GP), which is an advanced probabilistic form of regression. The GP is particularly suited to uncertainty analysis since it is able to emulate a wide class of models, and accounts for its own emulation uncertainty. Additionally, uncertainty and sensitivity measures can be estimated analytically, given certain assumptions. The GP approach is explained in detail here, and a case study of a finite element model of an airship is used to demonstrate the method. It is concluded that the GP is a very attractive way of performing uncertainty and sensitivity analysis on large models, provided that the dimensionality is not too high.
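
    A compact illustration of the emulator idea with scikit-learn (the toy function stands in for an expensive simulator such as the airship finite element model; the kernel choice and design size are arbitrary):

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      def expensive_model(x):
          # Stand-in for a costly simulator run (assumed 1-D input).
          return np.sin(3 * x) + 0.5 * x ** 2

      X_train = np.linspace(0.0, 2.0, 12).reshape(-1, 1)   # small design of model runs
      y_train = expensive_model(X_train).ravel()

      gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
      gp.fit(X_train, y_train)

      X_test = np.linspace(0.0, 2.0, 200).reshape(-1, 1)
      mean, std = gp.predict(X_test, return_std=True)  # prediction plus emulation uncertainty
      print(std.max())   # the GP reports where it is least certain about the simulator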

  5. A Bayesian Framework for Functional Mapping through Joint Modeling of Longitudinal and Time-to-Event Data

    PubMed Central

    Das, Kiranmoy; Li, Runze; Huang, Zhongwen; Gai, Junyi; Wu, Rongling

    2012-01-01

    The most powerful and comprehensive approach to study in modern biology is to understand the whole process of development and all the important events that occur within it. As a consequence, joint modeling of developmental processes and events has become one of the most demanding tasks in statistical research. Here, we propose a joint modeling framework for functional mapping of specific quantitative trait loci (QTLs) that control developmental processes, the timing of development, and their causal correlation over time. The joint model contains two submodels: one for a developmental process, known as a longitudinal trait, and the other for a developmental event, known as the time to event, which are connected through a QTL mapping framework. A nonparametric approach is used to model the mean and covariance function of the longitudinal trait, while the traditional Cox proportional hazards (PH) model is used to model the event time. The joint model is applied to map QTLs that control whole-plant vegetative biomass growth and time to first flower in soybeans. Results show that this model should be broadly useful for detecting genes controlling physiological and pathological processes and other events of interest in biomedicine. PMID:22685454

  6. Analysis and experimental study of the spherical joint clearance

    NASA Astrophysics Data System (ADS)

    Zhao, Peng; Hu, Penghao; Bao, Xinxin; Li, Shuaipeng

    2013-10-01

    Spherical joint clearance is a key error factor that influences and restricts the application of parallel mechanisms in high-precision fields. This paper discusses the regularity of spherical joint clearance in parallel mechanisms and its influence on the accuracy of the parallel mechanism, from both theoretical and experimental perspectives. A spherical joint clearance measuring instrument is introduced and used to measure the joint clearance, and the relationship between the clearance and the working pose is revealed. Based on the theoretical and experimental analysis, it is concluded that the clearance of the spherical joint is nearly linearly proportional to the applied load, and that the clearances in different poses approximately obey a Rayleigh distribution under the same load.

  7. An intake prior for the Bayesian analysis of plutonium and uranium exposures in an epidemiology study.

    PubMed

    Puncher, M; Birchall, A; Bull, R K

    2014-12-01

    In Bayesian inference, the initial knowledge regarding the value of a parameter, before additional data are considered, is represented as a prior probability distribution. This paper describes the derivation of a prior distribution of intake that was used for the Bayesian analysis of plutonium and uranium worker doses in a recent epidemiology study. The chosen distribution is log-normal with a geometric standard deviation of 6 and a median value that is derived for each worker based on the duration of the work history and the number of reported acute intakes. The median value is a function of the work history and a constant related to activity in air concentration, M, which is derived separately for uranium and plutonium. The value of M is based primarily on measurements of plutonium and uranium in air derived from historical personal air sampler (PAS) data. However, there is significant uncertainty on the value of M, resulting from the paucity of PAS data and from extrapolating these measurements to actual intakes. This paper compares posterior and prior distributions of intake and investigates the sensitivity of the Bayesian analyses to the assumed value of M. It is found that varying M by a factor of 10 results in a much smaller factor of 2 variation in mean intake and lung dose for both plutonium and uranium. It is concluded that if a log-normal distribution is considered to adequately represent worker intakes, then the Bayesian posterior distribution of dose is relatively insensitive to the assumed value of M. PMID:24191121
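
    Constructing such a prior is straightforward; the sketch below builds a log-normal intake prior with a geometric standard deviation of 6 and scales its median with the constant M (the baseline median of 1 is a placeholder, and the posterior insensitivity reported in the paper additionally requires the bioassay likelihood, which is not modelled here):

      import numpy as np

      def intake_prior(median, gsd=6.0, size=100_000, seed=0):
          # Log-normal prior: exp(mean) is the median, sigma = ln(geometric SD).
          rng = np.random.default_rng(seed)
          return rng.lognormal(mean=np.log(median), sigma=np.log(gsd), size=size)

      # Scaling M scales the whole prior linearly; the paper's point is that the
      # posterior mean intake moves far less once monitoring data are included.
      for m_factor in (0.1, 1.0, 10.0):
          samples = intake_prior(median=m_factor)
          print(f"M x {m_factor:>4}: prior mean intake {samples.mean():.2f}")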

  8. Highly efficient Bayesian joint inversion for receiver-based data and its application to lithospheric structure beneath the southern Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Kim, Seongryong; Dettmer, Jan; Rhie, Junkee; Tkalčić, Hrvoje

    2016-07-01

    With the deployment of extensive seismic arrays, systematic and efficient parameter and uncertainty estimation is of increasing importance and can provide reliable, regional models for crustal and upper-mantle structure. We present an efficient Bayesian method for the joint inversion of surface-wave dispersion and receiver-function data that combines trans-dimensional (trans-D) model selection in an optimization phase with subsequent rigorous parameter uncertainty estimation. Parameter and uncertainty estimation depend strongly on the chosen parametrization such that meaningful regional comparison requires quantitative model selection that can be carried out efficiently at several sites. While significant progress has been made for model selection (e.g. trans-D inference) at individual sites, the lack of efficiency can prohibit application to large data volumes or cause questionable results due to lack of convergence. Studies that address large numbers of data sets have mostly ignored model selection in favour of more efficient/simple estimation techniques (i.e. focusing on uncertainty estimation but employing ad-hoc model choices). Our approach consists of a two-phase inversion that combines trans-D optimization to select the most probable parametrization with subsequent Bayesian sampling for uncertainty estimation given that parametrization. The trans-D optimization is implemented here by replacing the likelihood function with the Bayesian information criterion (BIC). The BIC provides constraints on model complexity that facilitate the search for an optimal parametrization. Parallel tempering (PT) is applied as an optimization algorithm. After optimization, the optimal model choice is identified by the minimum BIC value from all PT chains. Uncertainty estimation is then carried out in fixed dimension. Data errors are estimated as part of the inference problem by a combination of empirical and hierarchical estimation. Data covariance matrices are estimated from

  9. Majorana Demonstrator Bolted Joint Mechanical and Thermal Analysis

    SciTech Connect

    Aguayo Navarrete, Estanislao; Reid, Douglas J.; Fast, James E.

    2012-06-01

    The MAJORANA DEMONSTRATOR is designed to probe for neutrinoless double-beta decay, an extremely rare process with a half-life on the order of 10^26 years. The experiment uses an ultra-low background, high-purity germanium detector array. The germanium crystals are both the source and the detector in this experiment. Operating these crystals as ionizing radiation detectors requires having them under cryogenic conditions (below 90 K). A liquid nitrogen thermosyphon is used to extract the heat from the detectors. The detector channels are arranged in strings and thermally coupled to the thermosyphon through a cold plate. The cold plate is joined to the thermosyphon by a bolted joint. This circular plate is housed inside the cryostat can. This document provides a detailed study of the bolted joint that connects the cold plate and the thermosyphon. An analysis of the mechanical and thermal properties of this bolted joint is presented. The force applied to the joint is derived from the torque applied to each one of the six bolts that form the joint. The thermal conductivity of the joint is measured as a function of applied force. The required heat conductivity for a successful experiment is the combination of the thermal conductivity of the detector string and this joint. The thermal behavior of the joint is experimentally implemented and analyzed in this study.

  10. Reusable Solid Rocket Motor Nozzle Joint-4 Thermal Analysis

    NASA Technical Reports Server (NTRS)

    Clayton, J. Louie

    2001-01-01

    This study provides for development and test verification of a thermal model used for prediction of joint heating environments, structural temperatures and seal erosions in the Space Shuttle Reusable Solid Rocket Motor (RSRM) Nozzle Joint-4. The heating environments are a result of rapid pressurization of the joint free volume assuming a leak path has occurred in the filler material used for assembly gap close out. Combustion gases flow along the leak path from nozzle environment to joint O-ring gland resulting in local heating to the metal housing and erosion of seal materials. Analysis of this condition was based on usage of the NASA Joint Pressurization Routine (JPR) for environment determination and the Systems Improved Numerical Differencing Analyzer (SINDA) for structural temperature prediction. Model generated temperatures, pressures and seal erosions are compared to hot fire test data for several different leak path situations. Investigated in the hot fire test program were nozzle joint-4 O-ring erosion sensitivities to leak path width in both open and confined joint geometries. Model predictions were in generally good agreement with the test data for the confined leak path cases. Worst case flight predictions are provided using the test-calibrated model. Analysis issues are discussed based on model calibration procedures.

  11. Bayesian Analysis and Segmentation of Multichannel Image Sequences

    NASA Astrophysics Data System (ADS)

    Chang, Michael Ming Hsin

    This thesis is concerned with the segmentation and analysis of multichannel image sequence data. In particular, we use the maximum a posteriori probability (MAP) criterion and Gibbs random fields (GRFs) to formulate the problems. We start by reviewing the significance of MAP estimation with GRF priors and study the feasibility of various optimization methods for implementing the MAP estimator. We proceed to investigate three areas where image data and parameter estimates are present in multiple channels and multiple frames, and are interrelated in complicated ways. These areas of study include color image segmentation, multislice MR image segmentation, and optical flow estimation and segmentation in multiframe temporal sequences. Besides developing novel algorithms in each of these areas, we demonstrate how to exploit the potential of MAP estimation and GRFs, and we propose practical and efficient implementations. Illustrative examples and relevant experimental results are included.

  12. Viscoelastic analysis of adhesively bonded joints

    NASA Technical Reports Server (NTRS)

    Delale, F.; Erdogan, F.

    1981-01-01

    In this paper an adhesively bonded lap joint is analyzed by assuming that the adherends are elastic and the adhesive is linearly viscoelastic. After formulating the general problem, a specific example for two identical adherends bonded through a three parameter viscoelastic solid adhesive is considered. The standard Laplace transform technique is used to solve the problem. The stress distribution in the adhesive layer is calculated for three different external loads, namely membrane loading, bending, and transverse shear loading. The results indicate that the peak value of the normal stress in the adhesive is not only consistently higher than the corresponding shear stress but also decays slower.

  13. Viscoelastic analysis of adhesively bonded joints

    NASA Technical Reports Server (NTRS)

    Delale, F.; Erdogan, F.

    1980-01-01

    An adhesively bonded lap joint is analyzed by assuming that the adherends are elastic and the adhesive is linearly viscoelastic. After formulating the general problem a specific example for two identical adherends bonded through a three parameter viscoelastic solid adhesive is considered. The standard Laplace transform technique is used to solve the problem. The stress distribution in the adhesive layer is calculated for three different external loads, namely, membrane loading, bending, and transverse shear loading. The results indicate that the peak value of the normal stress in the adhesive is not only consistently higher than the corresponding shear stress but also decays slower.

  14. Analysis of adhesively bonded composite lap joints

    SciTech Connect

    Tong, L.; Kuruppu, M.; Kelly, D.

    1994-12-31

    A new nonlinear formulation is developed for the governing equations for the shear and peel stresses in adhesively bonded composite double lap joints. The new formulation allows arbitrary nonlinear stress-strain characteristics in both shear and peel behavior. The equations are numerically integrated using a shooting technique and the Newton-Raphson method behind a user-friendly interface. The failure loads are predicted by utilizing the maximum stress criterion, interlaminar delamination, and energy density failure criteria. Numerical examples are presented to demonstrate the effect of the nonlinear adhesive behavior on the stress distribution and to predict the failure load and the associated mode.

  15. A Bayesian Analysis of the Correlations Among Sunspot Cycles

    NASA Astrophysics Data System (ADS)

    Yu, Y.; van Dyk, D. A.; Kashyap, V. L.; Young, C. A.

    2012-12-01

    Sunspot numbers form a comprehensive, long-duration proxy of solar activity and have been used numerous times to empirically investigate the properties of the solar cycle. A number of correlations have been discovered over the 24 cycles for which observational records are available. Here we carry out a sophisticated statistical analysis of the sunspot record that reaffirms these correlations, and sets up an empirical predictive framework for future cycles. An advantage of our approach is that it allows for rigorous assessment of both the statistical significance of various cycle features and the uncertainty associated with predictions. We summarize the data into three sequential relations that estimate the amplitude, duration, and time of rise to maximum for any cycle, given the values from the previous cycle. We find that there is no indication of a persistence in predictive power beyond one cycle, and we conclude that the dynamo does not retain memory beyond one cycle. Based on sunspot records up to October 2011, we obtain, for Cycle 24, an estimated maximum smoothed monthly sunspot number of 97±15, to occur in January - February 2014 ± six months.

  16. Hierarchical models and Bayesian analysis of bird survey information

    USGS Publications Warehouse

    Sauer, J.R.; Link, W.A.; Royle, J. Andrew

    2005-01-01

    Summary of bird survey information is a critical component of conservation activities, but often our summaries rely on statistical methods that do not accommodate the limitations of the information. Prioritization of species requires ranking and analysis of species by magnitude of population trend, but often magnitude of trend is a misleading measure of actual decline when trend is poorly estimated. Aggregation of population information among regions is also complicated by varying quality of estimates among regions. Hierarchical models provide a reasonable means of accommodating concerns about aggregation and ranking of quantities of varying precision. In these models the need to consider multiple scales is accommodated by placing distributional assumptions on collections of parameters. For collections of species trends, this allows probability statements to be made about the collections of species-specific parameters, rather than about the estimates. We define and illustrate hierarchical models for two commonly encountered situations in bird conservation: (1) Estimating attributes of collections of species estimates, including ranking of trends, estimating number of species with increasing populations, and assessing population stability with regard to predefined trend magnitudes; and (2) estimation of regional population change, aggregating information from bird surveys over strata. User-friendly computer software makes hierarchical models readily accessible to scientists.

  17. Transdimensional Bayesian approach to pulsar timing noise analysis

    NASA Astrophysics Data System (ADS)

    Ellis, J. A.; Cornish, N. J.

    2016-04-01

    The modeling of intrinsic noise in pulsar timing residual data is of crucial importance for gravitational wave detection and pulsar timing (astro)physics in general. The noise budget in pulsars is a collection of several well-studied effects including radiometer noise, pulse-phase jitter noise, dispersion measure variations, and low-frequency spin noise. However, as pulsar timing data continue to improve, nonstationary and non-power-law noise terms are beginning to manifest which are not well modeled by current noise analysis techniques. In this work, we use a transdimensional approach to model these nonstationary and non-power-law effects through the use of a wavelet basis and an interpolation-based adaptive spectral modeling. In both cases, the number of wavelets and the number of control points in the interpolated spectrum are free parameters that are constrained by the data and then marginalized over in the final inferences, thus fully incorporating our ignorance of the noise model. We show that these new methods outperform standard techniques when nonstationary and non-power-law noise is present. We also show that these methods return results consistent with the standard analyses when no such signals are present.

  18. Bayesian Finite Mixtures for Nonlinear Modeling of Educational Data.

    ERIC Educational Resources Information Center

    Tirri, Henry; And Others

    A Bayesian approach for finding latent classes in data is discussed. The approach uses finite mixture models to describe the underlying structure in the data and demonstrate that the possibility of using full joint probability models raises interesting new prospects for exploratory data analysis. The concepts and methods discussed are illustrated…

  19. Crash risk analysis for Shanghai urban expressways: A Bayesian semi-parametric modeling approach.

    PubMed

    Yu, Rongjie; Wang, Xuesong; Yang, Kui; Abdel-Aty, Mohamed

    2016-10-01

    Urban expressway systems have developed rapidly in recent years in China and have become a key part of city roadway networks, carrying large traffic volumes and providing high travel speeds. Along with the increase in traffic volume, traffic safety has become a major issue for Chinese urban expressways due to frequent crash occurrence and the non-recurrent congestion crashes cause. For the purpose of unveiling crash occurrence mechanisms and further developing Active Traffic Management (ATM) control strategies to improve traffic safety, this study developed disaggregate crash risk analysis models with loop detector traffic data and historical crash data. Bayesian random effects logistic regression models were utilized, as they can account for the unobserved heterogeneity among crashes. However, previous crash risk analysis studies formulated the random effects distributions parametrically, assigning them to follow normal distributions. Given the limited information known about random effects distributions, such a subjective parametric setting may be incorrect. In order to construct more flexible and robust random effects to capture the unobserved heterogeneity, a Bayesian semi-parametric inference technique was introduced to crash risk analysis in this study. Models with both inference techniques were developed for total crashes; the semi-parametric models proved to provide substantially better model goodness-of-fit, while the two models shared consistent coefficient estimations. Bayesian semi-parametric random effects logistic regression models were then developed for weekday peak hour crashes, weekday non-peak hour crashes, and weekend non-peak hour crashes to investigate different crash occurrence scenarios. Significant factors that affect crash risk have been revealed and crash mechanisms have been summarized. PMID:26847949

  20. Life-threatening arrhythmia verification in ICU patients using the joint cardiovascular dynamical model and a Bayesian filter.

    PubMed

    Sayadi, Omid; Shamsollahi, Mohammad B

    2011-10-01

    In this paper, a novel nonlinear joint dynamical model is presented, which is based on a set of coupled ordinary differential equations of motion and a Gaussian mixture model representation of pulsatile cardiovascular (CV) signals. In the proposed framework, the joint interdependences of CV signals are incorporated by assuming a unique angular frequency that controls the limit cycle of the heart rate. Moreover, the time course of the CV signals is controlled by the same phase parameter, which results in a reduction of the space dimensionality. These joint equations, together with linear observation assignments, are further used in a Kalman filter structure for estimation and tracking. We also propose a measure of signal fidelity based on monitoring the covariance matrix of the innovation signals throughout the filtering procedure. Five categories of life-threatening arrhythmias were verified by simultaneously tracking the signal fidelity and the polar representation of the CV signal estimates. We analyzed data from the PhysioBank multiparameter databases (MIMIC I and II). Performance evaluation results demonstrated that the sensitivity of the detection ranges between 93.50% and 100.00%. In particular, the addition of more CV signals improved the positive predictivity of the proposed method to 99.27% over all arrhythmic types. The method was also used to suppress false arrhythmia alarms issued by ICU monitors, with the overall false suppression rate reduced from 42.3% to 9.9%. False critical ECG arrhythmia alarm rates were found to be, on average, 42.3%, with individual rates varying between 16.7% and 86.5%. The results illustrate that the method can contribute to, and enhance the performance of, clinical life-threatening arrhythmia detection. PMID:21324772

  1. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network

    PubMed Central

    2011-01-01

    Background Genome-scale metabolic network models have contributed to elucidating biological phenomena and to predicting gene targets to engineer for biotechnological applications. With their increasing importance, their precise network characterization has also become crucial for a better understanding of cellular physiology. Results We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals the direction of influences among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to a specific perturbation, using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic modular level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism of an lpdA gene knockout mutant, using its genome-scale metabolic network model. Conclusions In the end, the framework provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis. PMID:22784571

  2. Bayesian design and analysis of computer experiments: Use of derivatives in surface prediction

    SciTech Connect

    Morris, M.D.; Mitchell, T.J.; Ylvisaker, D. (Dept. of Mathematics)

    1991-06-01

    The work of Currin et al. and others in developing "fast predictive approximations" of computer models is extended for the case in which derivatives of the output variable of interest with respect to input variables are available. In addition to describing the calculations required for the Bayesian analysis, the issue of experimental design is also discussed, and an algorithm is described for constructing "maximin distance" designs. An example is given based on a demonstration model of eight inputs and one output, in which predictions based on a maximin design, a Latin hypercube design, and two "compromise" designs are evaluated and compared. 12 refs., 2 figs., 6 tabs.

  3. Crystalline nucleation in undercooled liquids: A Bayesian data-analysis approach for a nonhomogeneous Poisson process

    NASA Astrophysics Data System (ADS)

    Filipponi, A.; Di Cicco, A.; Principi, E.

    2012-12-01

    A Bayesian data-analysis approach to data sets of maximum undercooling temperatures recorded in repeated melting-cooling cycles of high-purity samples is proposed. The crystallization phenomenon is described in terms of a nonhomogeneous Poisson process driven by a temperature-dependent sample nucleation rate J(T). The method was extensively tested by computer simulations and applied to real data for undercooled liquid Ge. It proved to be particularly useful in the case of scarce data sets where the usage of binned data would degrade the available experimental information.
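
    Under the nonhomogeneous Poisson description, the probability that a sample survives uncrystallized down to a given undercooling fixes the likelihood of the recorded maximum-undercooling temperatures. The sketch below fits an illustrative exponential rate law by maximum likelihood; the functional form of J, the unit volume and cooling rate, and the fake data are all assumptions, and a full Bayesian treatment would add priors and posterior sampling.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Likelihood sketch for maximum-undercooling data under a
    # nonhomogeneous Poisson process. J(dT) = exp(a + b*dT) is an
    # illustrative nucleation rate (per unit volume and time), not the
    # physical J(T) of the paper; sample volume V and cooling rate q
    # are set to 1 in these units.
    def neg_log_like(theta, dT, V=1.0, q=1.0):
        a, b = theta
        cum = (np.exp(a + b * dT) - np.exp(a)) / b   # integral of J to dT
        # pdf of dT: (V/q) J(dT) exp(-(V/q) * cum)
        return -np.sum(np.log(V / q) + a + b * dT - (V / q) * cum)

    rng = np.random.default_rng(0)
    dT_obs = 40 + 5 * rng.standard_normal(30)        # fake undercoolings (K)
    fit = minimize(neg_log_like, x0=[-10.0, 0.2], args=(dT_obs,))
    print(fit.x)   # maximum-likelihood (a, b); a Bayesian version would
                   # multiply by a prior and sample the posterior instead
    ```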

  4. A Bayesian analysis of the chromosome architecture of human disorders by integrating reductionist data.

    PubMed

    Emmert-Streib, Frank; de Matos Simoes, Ricardo; Tripathi, Shailesh; Glazko, Galina V; Dehmer, Matthias

    2012-01-01

    In this paper, we present a Bayesian approach to estimating a chromosome and a disorder network from the Online Mendelian Inheritance in Man (OMIM) database. In contrast to other approaches, we obtain statistical rather than deterministic networks, enabling parametric control of the uncertainty in the underlying disorder-disease gene associations contained in OMIM, on which the networks are based. From a structural investigation of the chromosome network, we identify three chromosome subgroups that reflect architectural differences in chromosome-disorder associations that are predictively exploitable for a functional analysis of diseases. PMID:22822426

  5. Bayesian Methods in Cosmology

    NASA Astrophysics Data System (ADS)

    Hobson, Michael P.; Jaffe, Andrew H.; Liddle, Andrew R.; Mukherjee, Pia; Parkinson, David

    2014-02-01

    Preface; Part I. Methods: 1. Foundations and algorithms John Skilling; 2. Simple applications of Bayesian methods D. S. Sivia and Steve Rawlings; 3. Parameter estimation using Monte Carlo sampling Antony Lewis and Sarah Bridle; 4. Model selection and multi-model inference Andrew R. Liddle, Pia Mukherjee and David Parkinson; 5. Bayesian experimental design and model selection forecasting Roberto Trotta, Martin Kunz, Pia Mukherjee and David Parkinson; 6. Signal separation in cosmology M. P. Hobson, M. A. J. Ashdown and V. Stolyarov; Part II. Applications: 7. Bayesian source extraction M. P. Hobson, Graça Rocha and R. Savage; 8. Flux measurement Daniel Mortlock; 9. Gravitational wave astronomy Neil Cornish; 10. Bayesian analysis of cosmic microwave background data Andrew H. Jaffe; 11. Bayesian multilevel modelling of cosmological populations Thomas J. Loredo and Martin A. Hendry; 12. A Bayesian approach to galaxy evolution studies Stefano Andreon; 13. Photometric redshift estimation: methods and applications Ofer Lahav, Filipe B. Abdalla and Manda Banerji; Index.

  6. Bayesian Methods in Cosmology

    NASA Astrophysics Data System (ADS)

    Hobson, Michael P.; Jaffe, Andrew H.; Liddle, Andrew R.; Mukherjee, Pia; Parkinson, David

    2009-12-01

    Preface; Part I. Methods: 1. Foundations and algorithms John Skilling; 2. Simple applications of Bayesian methods D. S. Sivia and Steve Rawlings; 3. Parameter estimation using Monte Carlo sampling Antony Lewis and Sarah Bridle; 4. Model selection and multi-model inference Andrew R. Liddle, Pia Mukherjee and David Parkinson; 5. Bayesian experimental design and model selection forecasting Roberto Trotta, Martin Kunz, Pia Mukherjee and David Parkinson; 6. Signal separation in cosmology M. P. Hobson, M. A. J. Ashdown and V. Stolyarov; Part II. Applications: 7. Bayesian source extraction M. P. Hobson, Graça Rocha and R. Savage; 8. Flux measurement Daniel Mortlock; 9. Gravitational wave astronomy Neil Cornish; 10. Bayesian analysis of cosmic microwave background data Andrew H. Jaffe; 11. Bayesian multilevel modelling of cosmological populations Thomas J. Loredo and Martin A. Hendry; 12. A Bayesian approach to galaxy evolution studies Stefano Andreon; 13. Photometric redshift estimation: methods and applications Ofer Lahav, Filipe B. Abdalla and Manda Banerji; Index.

  7. Finite element analysis of human joints

    SciTech Connect

    Bossart, P.L.; Hollerbach, K.

    1996-09-01

    Our work focuses on the development of finite element models (FEMs) that describe the biomechanics of human joints. Finite element modeling is becoming a standard tool in industrial applications. In highly complex problems such as those found in biomechanics research, however, the full potential of FEMs is just beginning to be explored, due to the absence of precise, high-resolution medical data and the difficulties encountered in converting these enormous datasets into a form that is usable in FEMs. With increasing computing speed and memory available, it is now feasible to address these challenges. We address the first by acquiring data with a high-resolution X-ray CT scanner and the latter by developing a semi-automated method for generating the volumetric meshes used in the FEM. Issues related to tomographic reconstruction, volume segmentation, the use of extracted surfaces to generate volumetric hexahedral meshes, and applications of the FEM are described.

  8. Bayesian analysis of RNA sequencing data by estimating multiple shrinkage priors.

    PubMed

    Van De Wiel, Mark A; Leday, Gwenaël G R; Pardo, Luba; Rue, Håvard; Van Der Vaart, Aad W; Van Wieringen, Wessel N

    2013-01-01

    Next generation sequencing is quickly replacing microarrays as a technique to probe different molecular levels of the cell, such as DNA or RNA. The technology provides higher resolution, while reducing bias. RNA sequencing results in counts of RNA strands. This type of data imposes new statistical challenges. We present a novel, generic approach to model and analyze such data. Our approach aims at large flexibility of the likelihood (count) model and the regression model alike. Hence, a variety of count models is supported, such as the popular NB model, which accounts for overdispersion. In addition, complex, non-balanced designs and random effects are accommodated. Like some other methods, our method provides shrinkage of dispersion-related parameters. However, we extend it by enabling joint shrinkage of parameters, including those for which inference is desired. We argue that this is essential for Bayesian multiplicity correction. Shrinkage is effectuated by empirically estimating priors. We discuss several parametric (mixture) and non-parametric priors and develop procedures to estimate (parameters of) those. Inference is provided by means of local and Bayesian false discovery rates. We illustrate our method on several simulations and two data sets, also to compare it with other methods. Model- and data-based simulations show substantial improvements in the sensitivity at the given specificity. The data motivate the use of the ZI-NB as a powerful alternative to the NB, which results in higher detection rates for low-count data. Finally, compared with other methods, the results on small sample subsets are more reproducible when validated on their large sample complements, illustrating the importance of the type of shrinkage. PMID:22988280
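
    The shrinkage idea can be caricatured for a single gene: place a normal prior on the log-dispersion of a negative binomial likelihood, imagining the prior's parameters to have been estimated empirically across all genes. This toy MAP estimate is not the authors' procedure, which jointly shrinks several parameters and derives Bayesian false discovery rates; the counts and prior values below are invented.

    ```python
    import numpy as np
    from scipy.special import gammaln
    from scipy.optimize import minimize_scalar

    # NB parameterized by mean mu and dispersion phi: Var = mu + phi * mu^2.
    def nb_loglik(counts, mu, phi):
        r = 1.0 / phi
        return np.sum(gammaln(counts + r) - gammaln(r) - gammaln(counts + 1)
                      + r * np.log(r / (r + mu))
                      + counts * np.log(mu / (r + mu)))

    def map_log_phi(counts, mu, prior_mean, prior_sd):
        # Penalized (MAP) estimate of log-dispersion under a normal prior;
        # in practice the prior would be estimated empirically across genes.
        obj = lambda lp: -(nb_loglik(counts, mu, np.exp(lp))
                           - 0.5 * ((lp - prior_mean) / prior_sd) ** 2)
        return np.exp(minimize_scalar(obj, bounds=(-8, 3),
                                      method="bounded").x)

    counts = np.array([3, 7, 0, 12, 5, 9, 2, 6])   # invented per-gene counts
    print(map_log_phi(counts, mu=counts.mean(), prior_mean=-1.0, prior_sd=0.5))
    ```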

  9. Calculation of joint reaction forces and joint moments using a wearable walking analysis system.

    PubMed

    Adachi, Wataru; Tsujiuchi, Nobutaka; Koizumi, Takayuki; Shiojima, Kouzou; Tsuchiya, Youtaro; Inoue, Yoshio

    2012-01-01

    In gait analysis, which is one useful method for efficient physical rehabilitation, the ground reaction force, the center of pressure, and the body orientation data are measured during walking. In the past, these data were measured by a 3D motion analysis system consisting of high-speed cameras and force plates, which must be installed in the floor. However, a conventional 3D motion analysis system can measure the ground reaction force and the center of pressure only on the force plates, during a few steps. In addition, the subjects' stride lengths are limited because they have to walk on the center of the force plate. These problems can be resolved by converting conventional devices into wearable devices. We used a measuring device consisting of portable force plates and motion sensors. We developed a walking analysis system that calculates the ground reaction force, the center of pressure, and the body orientations, and measured a walking subject to evaluate this system. We simultaneously used a conventional 3D motion analysis system for comparison with our developed system and showed its validity for measurements of the ground reaction force and the center of pressure. Moreover, we calculated the joint reaction forces and joint moments of each joint. PMID:23365940

  10. Analysis of molecular expression patterns and integration with other knowledge bases using probabilistic Bayesian network models

    SciTech Connect

    Moler, Edward J.; Mian, I.S.

    2000-03-01

    How can molecular expression experiments be interpreted with greater than ten to the fourth measurements per chip? How can one get the most quantitative information possible from the experimental data with good confidence? These are important questions whose solutions require an interdisciplinary combination of molecular and cellular biology, computer science, statistics, and complex systems analysis. The explosion of data from microarray techniques presents the problem of interpreting the experiments. The availability of large-scale knowledge bases provides the opportunity to maximize the information extracted from these experiments. We have developed new methods of discovering biological function, metabolic pathways, and regulatory networks from these data and knowledge bases. These techniques are applicable to analyses for biomedical engineering, clinical, and fundamental cell and molecular biology studies. Our approach uses probabilistic, computational methods that give quantitative interpretations of data in a biological context. We have selected Bayesian statistical models with graphical network representations as a framework for our methods. As a first step, we use a naïve Bayesian classifier to identify statistically significant patterns in gene expression data. We have developed methods which allow us to (a) characterize which genes or experiments distinguish each class from the others, (b) cross-index the resulting classes with other databases to assess the biological meaning of the classes, and (c) display a gross overview of cellular dynamics. We have developed a number of visualization tools to convey the results. We report here our methods of classification and our first attempts at integrating the data and other knowledge bases together with new visualization tools. We demonstrate the utility of these methods and tools by analysis of a series of yeast cDNA microarray data and of a set of cancerous/normal sample data from colon cancer patients. We discuss
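
    The first step described — a naïve Bayesian classifier for expression patterns — looks roughly like the sketch below on synthetic stand-in data; the class structure, sample counts, and gene counts are invented for illustration.

    ```python
    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    # Synthetic stand-in for microarray data: two classes of samples whose
    # expression means differ, classified by a Gaussian naive Bayes model.
    rng = np.random.default_rng(0)
    X_class0 = rng.normal(0.0, 1.0, size=(40, 100))   # 40 samples, 100 genes
    X_class1 = rng.normal(0.8, 1.0, size=(40, 100))
    X = np.vstack([X_class0, X_class1])
    y = np.array([0] * 40 + [1] * 40)

    clf = GaussianNB().fit(X, y)
    print("training accuracy:", clf.score(X, y))

    # Genes whose class-conditional means differ most "characterize" a
    # class, echoing step (a) in the abstract.
    top = np.argsort(np.abs(clf.theta_[1] - clf.theta_[0]))[::-1][:5]
    print("most discriminative genes:", top)
    ```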

  11. Critical composite joint subcomponents: Analysis and test results

    NASA Technical Reports Server (NTRS)

    Bunin, B. L.

    1983-01-01

    This program has been conducted to develop the technology for critical structural joints of a composite wing structure meeting design requirements for a 1990 commercial transport aircraft. A prime objective of the program was to demonstrate the ability to reliably predict the strength of large bolted composite joints. Load sharing between bolts in multirow joints was computed by a nonlinear analysis program (A4FJ), which was used both to assess the efficiency of different joint design concepts and to predict the strengths of large test articles representing a section from a wing root chord-wise splice. In most cases, the predictions were accurate to within a few percent of the test results. A highlight of these tests was the consistent ability to achieve gross-section failure strains on the order of 0.005, which represents a considerable improvement over the state of the art. The improvement was attained largely as the result of a better understanding of the load sharing in multirow joints provided by the analysis. The typical load intensity on the structural joints was about 40 to 45 thousand pounds per inch in laminates having interspersed 37.5-percent 0-degree plies, 50-percent ±45-degree plies, and 12.5-percent 90-degree plies. The composite material was Toray 300 fiber and Ciba-Geigy 914 resin, in the form of 0.010-inch thick unidirectional tape.

  12. Fuzzy Bayesian Network-Bow-Tie Analysis of Gas Leakage during Biomass Gasification.

    PubMed

    Yan, Fang; Xu, Kaili; Yao, Xiwen; Li, Yang

    2016-01-01

    Biomass gasification technology has developed rapidly in recent years, but fire and poisoning accidents caused by gas leakage restrict the development and promotion of biomass gasification. Therefore, probabilistic safety assessment (PSA) is necessary for biomass gasification systems. Accordingly, Bayesian network-bow-tie (BN-bow-tie) analysis was proposed by mapping bow-tie analysis into a Bayesian network (BN). The causes of gas leakage and the accidents triggered by gas leakage can be obtained by bow-tie analysis, and the BN was used to identify the critical nodes of accidents by introducing three corresponding importance measures. Meanwhile, PSA requires the occurrence probabilities of failures. In view of the insufficient failure data for biomass gasification, the occurrence probabilities of failures that cannot be obtained from standard reliability data sources were determined by fuzzy methods based on expert judgment. An improved approach that considers expert weighting when aggregating fuzzy numbers, including triangular and trapezoidal numbers, was proposed, and the occurrence probabilities of failures were obtained. Finally, safety measures were indicated based on the identified critical nodes. The theoretical occurrence probabilities in one year of gas leakage and of the accidents caused by it were reduced to 1/10.3 of their original values by these safety measures. PMID:27463975
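
    The aggregation step can be sketched as follows: each expert supplies a triangular or trapezoidal fuzzy number for a failure probability, the numbers are combined by weighted vertex averaging, and centroid defuzzification yields a crisp value. The expert weights and fuzzy numbers below are invented, and the paper's improved aggregation scheme is more elaborate than this plain weighted average.

    ```python
    import numpy as np

    # Trapezoidal membership (a, b, c, d); a triangle has b == c.
    def trapezoid_membership(x, abcd):
        a, b, c, d = abcd
        return np.clip(np.minimum((x - a) / max(b - a, 1e-12),
                                  (d - x) / max(d - c, 1e-12)), 0.0, 1.0)

    experts = np.array([[0.02, 0.05, 0.05, 0.09],    # triangular
                        [0.03, 0.06, 0.08, 0.10],    # trapezoidal
                        [0.01, 0.04, 0.04, 0.07]])   # triangular
    weights = np.array([0.5, 0.3, 0.2])              # invented expert weights

    aggregated = weights @ experts                   # weighted vertex average
    x = np.linspace(0.0, 0.15, 2001)
    mu = trapezoid_membership(x, aggregated)
    crisp = np.sum(mu * x) / np.sum(mu)              # centroid defuzzification
    print(f"crisp failure probability ~ {crisp:.4f}")
    ```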

  13. Fuzzy Bayesian Network-Bow-Tie Analysis of Gas Leakage during Biomass Gasification

    PubMed Central

    Yan, Fang; Xu, Kaili; Yao, Xiwen; Li, Yang

    2016-01-01

    Biomass gasification technology has developed rapidly in recent years, but fire and poisoning accidents caused by gas leakage restrict the development and promotion of biomass gasification. Therefore, probabilistic safety assessment (PSA) is necessary for biomass gasification systems. Accordingly, Bayesian network-bow-tie (BN-bow-tie) analysis was proposed by mapping bow-tie analysis into a Bayesian network (BN). The causes of gas leakage and the accidents triggered by gas leakage can be obtained by bow-tie analysis, and the BN was used to identify the critical nodes of accidents by introducing three corresponding importance measures. Meanwhile, PSA requires the occurrence probabilities of failures. In view of the insufficient failure data for biomass gasification, the occurrence probabilities of failures that cannot be obtained from standard reliability data sources were determined by fuzzy methods based on expert judgment. An improved approach that considers expert weighting when aggregating fuzzy numbers, including triangular and trapezoidal numbers, was proposed, and the occurrence probabilities of failures were obtained. Finally, safety measures were indicated based on the identified critical nodes. The theoretical occurrence probabilities in one year of gas leakage and of the accidents caused by it were reduced to 1/10.3 of their original values by these safety measures. PMID:27463975

  14. A Two-Step Bayesian Approach for Propensity Score Analysis: Simulations and Case Study

    ERIC Educational Resources Information Center

    Kaplan, David; Chen, Jianshen

    2012-01-01

    A two-step Bayesian propensity score approach is introduced that incorporates prior information in the propensity score equation and outcome equation without the problems associated with simultaneous Bayesian propensity score approaches. The corresponding variance estimators are also provided. The two-step Bayesian propensity score is provided for…

  15. A Bayesian analysis of uncertainties on lung doses resulting from occupational exposures to uranium.

    PubMed

    Puncher, M; Birchall, A; Bull, R K

    2013-09-01

    In a recent epidemiological study, Bayesian estimates of lung doses were calculated in order to determine a possible association between lung dose and lung cancer incidence resulting from occupational exposures to uranium. These calculations, which produce probability distributions of doses, used the human respiratory tract model (HRTM) published by the International Commission on Radiological Protection (ICRP) with a revised particle transport clearance model. In addition to the Bayesian analyses, point estimates (PEs) of doses were also provided for that study using the existing HRTM as it is described in ICRP Publication 66. The PEs are to be used in a preliminary analysis of risk. To explain the differences between the PEs and Bayesian analysis, in this paper the methodology was applied to former UK nuclear workers who constituted a subset of the study cohort. The resulting probability distributions of lung doses calculated using the Bayesian methodology were compared with the PEs obtained for each worker. Mean posterior lung doses were on average 8-fold higher than PEs and the uncertainties on doses varied over a wide range, being greater than two orders of magnitude for some lung tissues. It is shown that it is the prior distributions of the parameters describing absorption from the lungs to blood that are responsible for the large difference between posterior mean doses and PEs. Furthermore, it is the large prior uncertainties on these parameters that are mainly responsible for the large uncertainties on lung doses. It is concluded that accurate determination of the chemical form of inhaled uranium, as well as the absorption parameter values for these materials, is important for obtaining unbiased estimates of lung doses from occupational exposures to uranium for epidemiological studies. Finally, it should be noted that the inferences regarding the PEs described here apply only to the assessments of cases provided for the epidemiological study, where central

  16. A Bayesian fingerprinting analysis for detection and attribution of changes in extreme flows

    NASA Astrophysics Data System (ADS)

    Hundecha, Yeshewatesfa; Merz, Bruno; Perdigão, Rui A. P.; Vorogushyn, Sergiy; Viglione, Alberto; Blöschl, Günter

    2014-05-01

    Fingerprinting analysis has been widely used for the detection and attribution problem within the climate community over the past several decades. In this approach, a field of a certain observed climate indicator is represented as a linear model of a signal pattern (fingerprint) that is simulated by a climate model under external forcing, plus a noise field, which represents a realisation of the internal climate variability. A scaling factor is introduced to adjust the amplitude of the signal pattern so that it matches the observations well. The scaling factor is optimally estimated to maximise the signal-to-noise ratio, thereby increasing the detectability of the signal due to a forced climate change. Many of the fingerprinting analyses reported in the literature are framed within classical statistical theory. Such an approach can give reliable results under the condition that the natural variability of the system and the uncertainties in the predicted signals under a given forcing can be quantified. If these uncertainties cannot be objectively estimated, interpretation of the results will mainly be guided by subjective judgement. Recent analyses have made a shift towards a Bayesian approach, which provides a quantitative framework for the integration of subjective prior information on the uncertainties into the statistical detection and attribution problem. Hasselmann (1998) reviews the fingerprinting approach based on the classical statistical framework and presents a generalisation of the approach to a Bayesian framework. Berliner et al. (2000) also present a formal Bayesian fingerprinting analytical framework for the detection and attribution problem. The potential applicability of the fingerprinting approach to the detection and attribution problem for extreme flows has been discussed in the opinion paper by Merz et al. (2012). Hundecha and Merz (2012) have also implemented an approach that is similar to the fingerprinting approach
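
    The core regression behind fingerprinting — observations modeled as a scaled signal pattern plus internal-variability noise — fits in a few lines. The sketch below compares the classical generalized-least-squares estimate of the scaling factor with its conjugate Bayesian counterpart on synthetic data; the signal, noise covariance, and prior are all illustrative assumptions.

    ```python
    import numpy as np

    # Observations y = beta * s + noise, with noise covariance C standing
    # in for internal climate variability. All inputs are synthetic.
    rng = np.random.default_rng(0)
    n = 50
    s = np.linspace(0, 1, n)                  # model-simulated signal pattern
    C = 0.2 * np.exp(-np.abs(np.subtract.outer(range(n), range(n))) / 5.0)
    y = 0.8 * s + rng.multivariate_normal(np.zeros(n), C)  # true beta = 0.8

    Ci = np.linalg.inv(C)
    beta_gls = (s @ Ci @ y) / (s @ Ci @ s)    # classical optimal fingerprint
    var_gls = 1.0 / (s @ Ci @ s)

    # Bayesian version: prior beta ~ N(m0, v0) gives a conjugate normal posterior
    m0, v0 = 0.0, 1.0
    v_post = 1.0 / (1.0 / v0 + s @ Ci @ s)
    m_post = v_post * (m0 / v0 + s @ Ci @ y)
    print(f"GLS: {beta_gls:.2f} +/- {np.sqrt(var_gls):.2f}; "
          f"posterior: {m_post:.2f} +/- {np.sqrt(v_post):.2f}")
    ```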

  17. Bayesian calibration of a soil organic carbon model using Δ14C measurements of soil organic carbon and heterotrophic respiration as joint constraints

    NASA Astrophysics Data System (ADS)

    Ahrens, B.; Reichstein, M.; Borken, W.; Muhr, J.; Trumbore, S. E.; Wutzler, T.

    2013-08-01

    Soils of temperate forests store significant amounts of organic matter and are considered to be net sinks of atmospheric CO2. Soil organic carbon (SOC) turnover has been studied using the Δ14C values of bulk SOC or different SOC fractions as observational constraints in SOC models. Further, the Δ14C values of CO2 evolved during the incubation of soil and roots have been widely used together with the Δ14C of total soil respiration to partition soil respiration into heterotrophic respiration (HR) and rhizosphere respiration. However, these data have not been used as joint observational constraints to determine SOC turnover times. Thus, we focus on: (1) how different combinations of observational constraints help to narrow estimates of turnover times and other parameters of a simple two-pool model, ICBM; (2) if a multiple constraints approach allows determining whether the soil has been storing or losing SOC. To this end ICBM was adapted to model SOC and SO14C in parallel with litterfall and the Δ14C of litterfall as driving variables. The Δ14C of the atmosphere with its prominent bomb peak was used as a proxy for the Δ14C of litterfall. Data from three spruce-dominated temperate forests in Germany and the USA (Coulissenhieb II, Solling D0 and Howland Tower site) were used to estimate the parameters of ICBM via Bayesian calibration. Key findings are: (1) the joint use of all 4 observational constraints (SOC stock and its Δ14C, HR flux and its Δ14C) helped to considerably narrow turnover times of the young pool (primarily by Δ14C of HR) and the old pool (primarily by Δ14C of SOC). Furthermore, the joint use of all observational constraints allowed constraining the humification factor in ICBM, which describes the fraction of the annual outflux from the young pool that enters the old pool. The Bayesian parameter estimation yielded the following turnover times (mean ± standard deviation) for SOC in the young pool: Coulissenhieb II 1.7 ± 0.5 yr, Solling D0 5.7 ± 0

  18. Bayesian calibration of a soil organic carbon model using Δ14C measurements of soil organic carbon and heterotrophic respiration as joint constraints

    NASA Astrophysics Data System (ADS)

    Ahrens, B.; Reichstein, M.; Borken, W.; Muhr, J.; Trumbore, S. E.; Wutzler, T.

    2014-04-01

    Soils of temperate forests store significant amounts of organic matter and are considered to be net sinks of atmospheric CO2. Soil organic carbon (SOC) turnover has been studied using the Δ14C values of bulk SOC or different SOC fractions as observational constraints in SOC models. Further, the Δ14C values of CO2 that evolved during the incubation of soil and roots have been widely used together with Δ14C of total soil respiration to partition soil respiration into heterotrophic respiration (HR) and rhizosphere respiration. However, these data have not been used as joint observational constraints to determine SOC turnover times. Thus, we focus on (1) how different combinations of observational constraints help to narrow estimates of turnover times and other parameters of a simple two-pool model, the Introductory Carbon Balance Model (ICBM); (2) whether relaxing the steady-state assumption in a multiple constraints approach allows the source/sink strength of the soil to be determined while estimating turnover times at the same time. To this end ICBM was adapted to model SOC and SO14C in parallel with litterfall and the Δ14C of litterfall as driving variables. The Δ14C of the atmosphere with its prominent bomb peak was used as a proxy for the Δ14C of litterfall. Data from three spruce-dominated temperate forests in Germany and the USA (Coulissenhieb II, Solling D0 and Howland Tower site) were used to estimate the parameters of ICBM via Bayesian calibration. Key findings are as follows: (1) the joint use of all four observational constraints (SOC stock and its Δ14C, HR flux and its Δ14C) helped to considerably narrow turnover times of the young pool (primarily by Δ14C of HR) and the old pool (primarily by Δ14C of SOC). Furthermore, the joint use of all observational constraints made it possible to constrain the humification factor in ICBM, which describes the fraction of the annual outflux from the young pool that enters the old pool. The Bayesian parameter

  19. Joint aspiration and injection and synovial fluid analysis.

    PubMed

    Courtney, Philip; Doherty, Michael

    2009-04-01

    Joint aspiration/injection and synovial fluid (SF) analysis are both invaluable procedures for the diagnosis and treatment of joint disease. This chapter addresses: (1) the indications, the technical principles and the expected benefits and risks of aspiration and injection of intra-articular corticosteroid; and (2) practical aspects relating to SF analysis, especially in relation to crystal identification. Intra-articular injection of long-acting insoluble corticosteroids is a well-established procedure that produces rapid pain relief and resolution of inflammation in most injected joints. The knee is the most common site to require aspiration, although any non-axial joint is accessible for obtaining SF. The technique requires a knowledge of basic anatomy and should not be unduly painful for the patient. Provided sterile equipment and a sensible, aseptic approach are used, it is very safe. Analysis of aspirated SF is helpful in the differential diagnosis of arthritis and is the definitive method for diagnosis of septic arthritis and crystal arthritis. The gross appearance of SF can provide useful diagnostic information in terms of the degree of joint inflammation and presence of haemarthrosis. Microbiological studies of SF are the key to the confirmation of infectious conditions. Increasing joint inflammation is associated with increased SF volume, reduced viscosity, increasing turbidity and cell count, and an increasing ratio of polymorphonuclear to mononuclear cells, but such changes are non-specific and must be interpreted in the clinical setting. However, detection of SF monosodium urate and calcium pyrophosphate dihydrate crystals, even from un-inflamed joints during intercritical periods, allows a precise diagnosis of gout and of calcium pyrophosphate crystal-related arthritis. PMID:19393565

  20. A Bayesian Approach to the Overlap Analysis of Epidemiologically Linked Traits

    PubMed Central

    Panoutsopoulou, Kalliope; Wheeler, Eleanor; Berndt, Sonja I.; Cordell, Heather J.; Morris, Andrew P.; Zeggini, Eleftheria; Barroso, Inês

    2015-01-01

    Diseases often co-occur in individuals more often than expected by chance, and may be explained by shared underlying genetic etiology. A common approach to genetic overlap analyses is to use summary genome-wide association study data to identify single-nucleotide polymorphisms (SNPs) that are associated with multiple traits at a selected P-value threshold. However, P-values do not account for differences in power, whereas Bayes' factors (BFs) do, and may be approximated using summary statistics. We use simulation studies to compare the power of frequentist and Bayesian approaches to overlap analyses, and to decide on appropriate thresholds for comparison between the two methods. It is empirically illustrated that BFs have the advantage over P-values of a decreasing type I error rate as study size increases for single-disease associations. Consequently, the overlap analysis of traits from different-sized studies encounters issues in fair P-value threshold selection, whereas BFs are adjusted automatically. Extensive simulations show that Bayesian overlap analyses tend to have higher power than those that assess association strength with P-values, particularly in low-power scenarios. Calibration tables between BFs and P-values are provided for a range of sample sizes, as well as an approximation approach for sample sizes that are not in the calibration table. Although P-values are sometimes thought more intuitive, these tables assist in removing the opaqueness of Bayesian thresholds and may also be used in the selection of a BF threshold to meet a certain type I error rate. An application of our methods is used to identify variants associated with both obesity and osteoarthritis. PMID:26411566
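
    The BF approximation from summary statistics that this comparison relies on is commonly computed as Wakefield's approximate Bayes factor, which needs only the effect estimate, its standard error, and a prior effect-size variance W. The sketch below (with an assumed W and invented summary statistics) shows how two studies with identical z-scores but different sizes yield different evidence.

    ```python
    import numpy as np

    # Wakefield-style approximate Bayes factor: under H1 the effect beta
    # has prior N(0, W); under H0 it is zero. Only beta_hat and its
    # standard error are needed, so power differences between studies
    # are accounted for automatically. W = 0.04 is an assumed prior
    # effect-size variance (sd 0.2).
    def approx_bayes_factor(beta_hat, se, W=0.04):
        V = se ** 2
        z2 = (beta_hat / se) ** 2
        # BF in favour of association (H1 over H0)
        return np.sqrt(V / (V + W)) * np.exp(z2 * W / (2 * (V + W)))

    # Same z-score (z = 4), different study sizes -> different evidence
    print(approx_bayes_factor(0.10, se=0.025))   # large study
    print(approx_bayes_factor(0.40, se=0.100))   # small study
    ```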

  1. A Bayesian Approach to Pathway Analysis by Integrating Gene–Gene Functional Directions and Microarray Data

    PubMed Central

    Zhao, Yifang; Chen, Ming-Hui; Pei, Baikang; Rowe, David; Shin, Dong-Guk; Xie, Wangang; Yu, Fang; Kuo, Lynn

    2012-01-01

    Many statistical methods have been developed to screen for differentially expressed genes associated with specific phenotypes in microarray data. However, it remains a major challenge to synthesize the observed expression patterns with abundant biological knowledge for a more complete understanding of the biological functions among genes. Various methods, including clustering analysis on genes, neural networks, Bayesian networks and pathway analysis, have been developed toward this goal. In most of these procedures, the activation and inhibition relationships among genes have hardly been utilized in the modeling steps. We propose two novel Bayesian models to integrate the microarray data with the putative pathway structures obtained from the KEGG database and the directional gene–gene interactions in the medical literature. We define the symmetric Kullback–Leibler divergence of a pathway, and use it to identify the pathway(s) most supported by the microarray data. A Markov chain Monte Carlo sampling algorithm is given for posterior computation in the hierarchical model. The proposed method is shown to select the most supported pathway in an illustrative example. Finally, we apply the methodology to a real microarray data set to understand the gene expression profile of osteoblast lineage at defined stages of differentiation. We observe that our method correctly identifies the pathways that are reported to play essential roles in modulating bone mass. PMID:23482678
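
    The pathway score at the heart of the method is the symmetric Kullback–Leibler divergence; a minimal discrete version is below. The paper applies it to fitted pathway models rather than raw probability vectors, so this is only the bare ingredient.

    ```python
    import numpy as np

    def sym_kl(p, q, eps=1e-12):
        """Symmetric KL divergence D(P||Q) + D(Q||P) for discrete
        distributions; eps guards against zero probabilities."""
        p = np.asarray(p, float) + eps
        q = np.asarray(q, float) + eps
        p, q = p / p.sum(), q / q.sum()
        return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

    print(sym_kl([0.7, 0.2, 0.1], [0.5, 0.3, 0.2]))
    ```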

  2. Evolutionary Analysis of Dengue Serotype 2 Viruses Using Phylogenetic and Bayesian Methods from New Delhi, India.

    PubMed

    Afreen, Nazia; Naqvi, Irshad H; Broor, Shobha; Ahmed, Anwar; Kazim, Syed Naqui; Dohare, Ravins; Kumar, Manoj; Parveen, Shama

    2016-03-01

    Dengue fever is the most important arboviral disease in the tropical and sub-tropical countries of the world. Delhi, the metropolitan capital state of India, has reported many dengue outbreaks, with the last outbreak occurring in 2013. We have recently reported the predominance of dengue virus serotype 2 during 2011-2014 in Delhi. In the present study, we report the molecular characterization and evolutionary analysis of dengue serotype 2 viruses detected in 2011-2014 in Delhi. Envelope genes of 42 DENV-2 strains were sequenced in the study. All DENV-2 strains grouped within the Cosmopolitan genotype and further clustered into three lineages: Lineage I, II and III. Lineage III replaced Lineage I during the dengue fever outbreak of 2013. Further, a novel mutation, Thr404Ile, was detected in the stem region of the envelope protein of a single DENV-2 strain in 2014. The nucleotide substitution rate and the time to the most recent common ancestor were determined by molecular clock analysis using Bayesian methods. A change in the effective population size of Indian DENV-2 viruses was investigated through a Bayesian skyline plot. The study will be a vital road map for the investigation of the epidemiology and evolutionary pattern of dengue viruses in India. PMID:26977703

  3. A Bayesian framework for cell-level protein network analysis for multivariate proteomics image data

    NASA Astrophysics Data System (ADS)

    Kovacheva, Violet N.; Sirinukunwattana, Korsuk; Rajpoot, Nasir M.

    2014-03-01

    The recent development of multivariate imaging techniques, such as the Toponome Imaging System (TIS), has facilitated the analysis of multiple co-localisation of proteins. This could hold the key to understanding complex phenomena such as protein-protein interaction in cancer. In this paper, we propose a Bayesian framework for cell level network analysis allowing the identification of several protein pairs having significantly higher co-expression levels in cancerous tissue samples when compared to normal colon tissue. It involves segmenting the DAPI-labeled image into cells and determining the cell phenotypes according to their protein-protein dependence profile. The cells are phenotyped using Gaussian Bayesian hierarchical clustering (GBHC) after feature selection is performed. The phenotypes are then analysed using Difference in Sums of Weighted cO-dependence Profiles (DiSWOP), which detects differences in the co-expression patterns of protein pairs. We demonstrate that the pairs highlighted by the proposed framework have high concordance with recent results using a different phenotyping method. This demonstrates that the results are independent of the clustering method used. In addition, the highlighted protein pairs are further analysed via protein interaction pathway databases and by considering the localization of high protein-protein dependence within individual samples. This suggests that the proposed approach could identify potentially functional protein complexes active in cancer progression and cell differentiation.

  4. Quantitative Bayesian model-based analysis of amide proton transfer MRI.

    PubMed

    Chappell, Michael A; Donahue, Manus J; Tee, Yee Kai; Khrapitchev, Alexandre A; Sibson, Nicola R; Jezzard, Peter; Payne, Stephen J

    2013-08-01

    Amide Proton Transfer (APT) reports on contrast derived from the exchange of protons between amide groups and water. Commonly, APT contrast is quantified by asymmetry analysis, providing an ensemble contrast of both amide proton concentration and exchange rate. An alternative is to sample the off-resonant spectrum and fit an exchange model, permitting the APT effect to be quantified while correcting automatically for the confounding effects of spillover, field inhomogeneity, and magnetization transfer. Additionally, it should permit amide concentration and exchange rate to be quantified independently. Here, a Bayesian method is applied to this problem, allowing pertinent prior information to be specified. A three-pool model was used, incorporating water protons, amide protons, and the magnetization transfer effect. The method is demonstrated in simulations, in creatine phantoms with varying pH, and in vivo (n = 7). The Bayesian model-based approach was able to quantify the APT effect accurately (root-mean-square error < 2%) even when subject to confounding field variation and magnetization transfer effect, unlike traditional asymmetry analysis. The in vivo results gave approximate APT concentration (relative to water) and exchange rate values of 3 × 10(-3) and 15 s(-1). A degree of correlation was observed between these parameters, making the latter difficult to quantify with absolute accuracy and suggesting that more optimal sampling strategies might be required. PMID:23008121

  5. Evolutionary Analysis of Dengue Serotype 2 Viruses Using Phylogenetic and Bayesian Methods from New Delhi, India

    PubMed Central

    Afreen, Nazia; Naqvi, Irshad H.; Broor, Shobha; Ahmed, Anwar; Kazim, Syed Naqui; Dohare, Ravins; Kumar, Manoj; Parveen, Shama

    2016-01-01

    Dengue fever is the most important arboviral disease in the tropical and sub-tropical countries of the world. Delhi, the metropolitan capital state of India, has reported many dengue outbreaks, with the last outbreak occurring in 2013. We have recently reported the predominance of dengue virus serotype 2 during 2011–2014 in Delhi. In the present study, we report the molecular characterization and evolutionary analysis of dengue serotype 2 viruses detected in 2011–2014 in Delhi. Envelope genes of 42 DENV-2 strains were sequenced in the study. All DENV-2 strains grouped within the Cosmopolitan genotype and further clustered into three lineages: Lineage I, II and III. Lineage III replaced Lineage I during the dengue fever outbreak of 2013. Further, a novel mutation, Thr404Ile, was detected in the stem region of the envelope protein of a single DENV-2 strain in 2014. The nucleotide substitution rate and the time to the most recent common ancestor were determined by molecular clock analysis using Bayesian methods. A change in the effective population size of Indian DENV-2 viruses was investigated through a Bayesian skyline plot. The study will be a vital road map for the investigation of the epidemiology and evolutionary pattern of dengue viruses in India. PMID:26977703

  6. Risk analysis of emergent water pollution accidents based on a Bayesian Network.

    PubMed

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Sun, Jie

    2016-01-01

    To guarantee the security of water quality in water transfer channels, especially in open channels, analysis of potential emergent pollution sources in the water transfer process is critical. It is also indispensable for forewarning and protection against emergent pollution accidents. Bridges above open channels with large amounts of truck traffic are the main locations where emergent accidents could occur. A Bayesian Network model, which consists of six root nodes and three middle-layer nodes, was developed in this paper and employed to identify the possibility of potential pollution risk. Dianbei Bridge is reviewed as a typical bridge on an open channel of the Middle Route of the South to North Water Transfer Project where emergent traffic accidents could occur. This study focuses on the risk of water pollution caused by the leakage of pollutants into the water. The risk of potential traffic accidents at the Dianbei Bridge implies a risk of water pollution in the canal. Based on survey data, statistical analysis, and domain specialist knowledge, a Bayesian Network model was established. The human factor in emergent accidents has been considered in this model. Additionally, this model has been employed to describe the probability of accidents and the risk level. The sensitive causes of pollution accidents have been deduced, and the case in which the sensitive factors are in the states most likely to lead to accidents has also been simulated. PMID:26433361

  7. Bayesian flux balance analysis applied to a skeletal muscle metabolic model

    PubMed Central

    Heino, Jenni; Tunyan, Knarik; Calvetti, Daniela; Somersalo, Erkki

    2007-01-01

    In this article, the steady-state condition for multi-compartment models of cellular metabolism is considered. The problem is to estimate the reaction and transport fluxes, as well as the concentrations in venous blood, when the stoichiometry and bound constraints for the fluxes and the concentrations are given. The problem has been addressed previously by a number of authors, and optimization-based approaches as well as extreme pathway analysis have been proposed. These approaches are briefly discussed here. The main emphasis of this work is a Bayesian statistical approach to flux balance analysis (FBA). We show how the bound constraints and optimality conditions, such as maximizing the oxidative phosphorylation flux, can be incorporated into the model in the Bayesian framework by proper construction of the prior densities. We propose an effective Markov chain Monte Carlo (MCMC) scheme to explore the posterior densities, and compare the results with those obtained via the previously studied linear programming (LP) approach. The proposed methodology, which is applied here to a two-compartment model for skeletal muscle metabolism, can be extended to more complex models. PMID:17568615
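
    A toy version of the Bayesian flux balance idea: give the fluxes a flat prior on their bounds, keep S·v near zero with a tight Gaussian term, and tilt the posterior toward a favored objective flux, then explore with Metropolis sampling. The three-reaction network, bounds, and tuning constants below are invented, and the paper's MCMC scheme is considerably more careful than this random-walk sketch.

    ```python
    import numpy as np

    S = np.array([[1.0, -1.0, -1.0],      # toy stoichiometry: v0 -> v1 + v2
                  [0.0,  1.0, -1.0]])     #                     v1 -> v2
    lo, hi = np.zeros(3), np.array([10.0, 10.0, 10.0])

    def log_post(v, sigma=0.05, reward=0.3):
        # Flat prior on the box; tight Gaussian keeps S @ v near steady
        # state; exponential tilt rewards the "objective" flux v2.
        if np.any(v < lo) or np.any(v > hi):
            return -np.inf
        return -0.5 * np.sum((S @ v) ** 2) / sigma**2 + reward * v[2]

    rng = np.random.default_rng(0)
    v = np.array([2.0, 1.0, 1.0])
    samples = []
    for it in range(20000):                # random-walk Metropolis
        prop = v + 0.1 * rng.standard_normal(3)
        if np.log(rng.random()) < log_post(prop) - log_post(v):
            v = prop
        if it > 5000:                      # discard burn-in
            samples.append(v.copy())
    print("posterior mean fluxes:", np.mean(samples, axis=0))
    ```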

  8. Integrative Bayesian analysis of neuroimaging-genetic data with application to cocaine dependence.

    PubMed

    Azadeh, Shabnam; Hobbs, Brian P; Ma, Liangsuo; Nielsen, David A; Moeller, F Gerard; Baladandayuthapani, Veerabhadran

    2016-01-15

    Neuroimaging and genetic studies provide distinct and complementary information about the structural and biological aspects of a disease. Integrating the two sources of data facilitates the investigation of the links between genetic variability and brain mechanisms among different individuals for various medical disorders. This article presents a general statistical framework for integrative Bayesian analysis of neuroimaging-genetic (iBANG) data, which is motivated by a neuroimaging-genetic study in cocaine dependence. Statistical inference necessitated the integration of spatially dependent voxel-level measurements with various patient-level genetic and demographic characteristics under an appropriate probability model to account for the multiple inherent sources of variation. Our framework uses Bayesian model averaging to integrate genetic information into the analysis of voxel-wise neuroimaging data, accounting for spatial correlations in the voxels. Using multiplicity controls based on the false discovery rate, we delineate voxels associated with genetic and demographic features that may impact diffusion as measured by fractional anisotropy (FA) obtained from DTI images. We demonstrate the benefits of accounting for model uncertainties in both model fit and prediction. Our results suggest that cocaine consumption is associated with FA reduction in most white matter regions of interest in the brain. Additionally, gene polymorphisms associated with GABAergic, serotonergic and dopaminergic neurotransmitters and receptors were associated with FA. PMID:26484829

  9. Analysis of Blast Wave Interaction with a Rock Joint

    NASA Astrophysics Data System (ADS)

    Li, Jianchun; Ma, Guowei

    2010-11-01

    The interaction between rock joints and blast waves is crucial in rock engineering when a rock mass is subjected to artificial or accidental explosions, bursts or weapon attacks. Based on the conservation of momentum at the wave fronts and the displacement discontinuity method, a quantitative analysis of the interaction between an obliquely incident P- or S-blast wave and a linear elastic rock joint is carried out in the present study, so as to deduce a wave propagation equation. For some special cases, such as normal or tangential incidence and rigid or weak joints, the analytical solution of the stress wave interaction with a rock joint is obtained by simplifying the wave propagation equation. Verification shows that the transmission and reflection coefficients from the wave propagation equation agree very well with existing results. Parametric studies are then conducted to evaluate the effects of the joint stiffness and incident waves on wave transmission and reflection. The wave propagation equation derived in the present study can be straightforwardly extended to different incident waveforms and nonlinear rock joints to calculate the transmitted and reflected waves without mathematical methods such as the Fourier and inverse Fourier transforms.

  10. Bayesian Gibbs Markov chain: MRF-based Stochastic Joint Inversion of Hydrological and Geophysical Datasets for Improved Characterization of Aquifer Heterogeneities.

    NASA Astrophysics Data System (ADS)

    Oware, E. K.

    2015-12-01

    Modeling aquifer heterogeneities (AH) is a complex, multidimensional problem that mostly requires stochastic imaging strategies for tractability. While traditional Bayesian Markov chain Monte Carlo (McMC) provides a powerful framework to model AH, generic McMC is computationally prohibitive and, thus, unappealing for large-scale problems. An innovative variant of the McMC scheme that imposes a priori spatial statistical constraints on model parameter updates, for improved characterization in a computationally efficient manner, is proposed. The proposed algorithm (PA) is based on Markov random field (MRF) modeling, which is an image processing technique that infers the global behavior of a random field from its local properties, making the MRF approach well suited for imaging AH. MRF-based modeling leverages the equivalence of the Gibbs (or Boltzmann) distribution (GD) and the MRF to identify the local properties of an MRF in terms of the easily quantifiable Gibbs energy. The PA employs a two-step approach to model the lithological structure of the aquifer and the hydraulic properties within the identified lithologies simultaneously. It performs local Gibbs energy minimizations along a random path, which requires the parameters of the GD (spatial statistics) to be specified. A PA that implicitly infers site-specific GD parameters within a Bayesian framework is also presented. The PA is illustrated with a synthetic binary facies aquifer with a lognormal heterogeneity simulated within each facies. GD parameters of 2.6, 1.2, -0.4, and -0.2 were estimated for the horizontal, vertical, NE-SW, and NW-SE directions, respectively. Most of the high hydraulic conductivity zones (facies 2) were fairly well resolved, with facies identification accuracy rates of 81%, 89%, and 90% for the inversions conditioned on concentration (R1), resistivity (R2), and joint data (R3), respectively. The incorporation of the conditioning datasets improved on the root mean square error (RMSE
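
    The local Gibbs-energy step can be illustrated with iterated conditional modes (ICM) on a binary facies image: each cell on a random path takes the label minimizing data misfit plus a direction-weighted Potts penalty, the weights playing the role of the direction-dependent GD parameters quoted above. Everything below — image, noise level, weights — is an invented stand-in, and deterministic ICM is a simplification of the stochastic sampler described in the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    true = np.zeros((30, 30), int)
    true[8:22, 10:20] = 1                          # high-conductivity body
    data = true + 0.8 * rng.standard_normal(true.shape)  # noisy "geophysics"

    beta_h, beta_v = 2.6, 1.2                      # horizontal/vertical weights
    img = (data > 0.5).astype(int)                 # crude starting model

    def local_energy(img, i, j, label):
        e = (data[i, j] - label) ** 2 / (2 * 0.8**2)     # data misfit
        for di, dj, b in [(0, 1, beta_h), (0, -1, beta_h),
                          (1, 0, beta_v), (-1, 0, beta_v)]:
            ni, nj = i + di, j + dj
            if 0 <= ni < img.shape[0] and 0 <= nj < img.shape[1]:
                e += b * (label != img[ni, nj])          # Potts disagreement
        return e

    for _ in range(5):                             # a few ICM sweeps
        path = rng.permutation([(i, j) for i in range(30) for j in range(30)])
        for i, j in path:                          # random visiting order
            img[i, j] = min((0, 1), key=lambda l: local_energy(img, i, j, l))
    print("facies match:", np.mean(img == true))
    ```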

  11. Results and Analysis from Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Matty, Jennifer E.; Aitchison, Lindsay

    2009-01-01

    A space suit's mobility is critical to an astronaut's ability to perform work efficiently. As mobility increases, the astronaut can perform tasks for longer durations with less fatigue. The term mobility, with respect to space suits, is defined in terms of two key components: joint range of motion and joint torque. Individually, these measures describe the path through which a joint travels and the force required to move it through that path. Previous space suit mobility requirements were defined as the collective result of these two measures and verified by the completion of discrete functional tasks. While a valid way to impose mobility requirements, such a method does necessitate a solid understanding of the operational scenarios in which the final suit will be performing. Because the Constellation space suit system requirements are being finalized with a relatively immature concept of operations, the Space Suit Element team elected to define mobility in terms of its constituent parts to increase the likelihood that the future pressure garment will be mobile enough to enable a broad scope of undefined exploration activities. The range of motion requirements were defined by measuring the ranges of motion test subjects achieved while performing a series of joint-maximizing tasks in a variety of flight and prototype space suits. The definition of joint torque requirements has proved more elusive. NASA evaluated several different approaches to the problem before deciding to generate requirements based on unmanned joint torque evaluations of six different space suit configurations being articulated through 16 separate joint movements. This paper discusses the experiment design, data analysis and results, and the process used to determine the final values for the Constellation pressure garment joint torque requirements.

  12. Dynamics analysis of space robot manipulator with joint clearance

    NASA Astrophysics Data System (ADS)

    Zhao, Yang; Bai, Zheng Feng

    2011-04-01

    A computational methodology for the analysis of space robot manipulator systems, considering the effects of clearances in the joints, is presented. The contact dynamics model for joint clearance is established using a nonlinear equivalent spring-damper model, and the friction effect is considered using the Coulomb friction model. The dynamic equation of the space robot manipulator with clearance is established. The dynamics simulation is then presented and the dynamic characteristics of the robot manipulator with clearance are analyzed. This work provides a practical method to analyze the dynamic characteristics of a space robot manipulator with joint clearance and improves its engineering application. The computational methodology can effectively predict the effects of clearance on the space robot manipulator, which is the basis of space robot manipulator design, precision analysis and ground testing.

  13. An analysis of a joint shear model for jointed media with orthogonal joint sets; Yucca Mountain Site Characterization Project

    SciTech Connect

    Koteras, J.R.

    1991-10-01

    This report describes a joint shear model used in conjunction with a computational model for jointed media with orthogonal joint sets. The joint shear model allows nonlinear behavior for both joint sets. Because nonlinear behavior is allowed for both joint sets, a great many cases must be considered to fully describe the joint shear behavior of the jointed medium. An extensive set of equations is required to describe the joint shear stress and slip displacements that can occur for all the various cases. This report examines possible methods for simplifying this set of equations so that the model can be implemented efficiently from a computational standpoint. The shear model must be examined carefully to obtain a computationally efficient implementation that does not lead to numerical problems. The application to fractures in rock is discussed. 5 refs., 4 figs.

  14. A Preliminary Bayesian Analysis of Incomplete Longitudinal Data from a Small Sample: Methodological Advances in an International Comparative Study of Educational Inequality

    ERIC Educational Resources Information Center

    Hsieh, Chueh-An; Maier, Kimberly S.

    2009-01-01

    The capacity of Bayesian methods in estimating complex statistical models is undeniable. Bayesian data analysis is seen as having a range of advantages, such as an intuitive probabilistic interpretation of the parameters of interest, the efficient incorporation of prior information to empirical data analysis, model averaging and model selection.…

  15. Information-theoretic analysis of iterated Bayesian acoustic source localization in a static ocean waveguide.

    PubMed

    Hayward, Thomas J

    2015-05-01

    Fundamental constructs of information theory are applied to quantify the performance of iterated (sequential) Bayesian localization of a time-harmonic source in a range- and time-invariant acoustic waveguide using the segmented Fourier transforms of the received pressure time series. The nonlinear relation, defined by acoustic propagation, between the source location and the received narrowband spectral components is treated as a nonlinear communication channel. The performance analysis includes mismatch between the acoustic channel and the model channel on which the Bayesian inference is based. Source location uncertainty is quantified by the posterior probability density of source location, by the posterior entropy and associated uncertainty area, by the information gain (relative entropy) at each iteration, and by large-ensemble limits of these quantities. A computational example for a vertical receiver array in a shallow-water waveguide is presented with acoustic propagation represented by normal modes and ambient noise represented by a Kuperman-Ingenito model. Performance degradation due to noise-model mismatch is quantified in an example. Potential extensions to uncertain and stochastic environments are discussed. PMID:25994704
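
    A grid-based caricature of the iterated Bayesian loop helps fix ideas: each snapshot updates a posterior over candidate source locations, and the posterior entropy and per-iteration relative entropy (information gain) quantify localization performance. The scalar Gaussian "channel" below stands in for the full waveguide propagation model, and all numbers are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    grid = np.linspace(0.0, 10.0, 201)         # candidate source ranges (km)
    truth = 6.3
    post = np.full(grid.size, 1.0 / grid.size)  # uniform prior

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    for snapshot in range(10):
        obs = truth + 0.8 * rng.standard_normal()   # noisy "received" feature
        like = np.exp(-0.5 * ((obs - grid) / 0.8) ** 2)
        new = post * like                            # Bayes update
        new /= new.sum()
        # Information gain: relative entropy of the new posterior w.r.t.
        # the old one, as in the performance measures described above.
        gain = np.sum(new * np.log2(new / post))
        post = new
        print(f"iter {snapshot}: H = {entropy(post):.2f} bits, "
              f"gain = {gain:.2f} bits")
    ```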

  16. Variational Bayesian mixture of experts models and sensitivity analysis for nonlinear dynamical systems

    NASA Astrophysics Data System (ADS)

    Baldacchino, Tara; Cross, Elizabeth J.; Worden, Keith; Rowson, Jennifer

    2016-01-01

    Most physical systems in reality exhibit a nonlinear relationship between input and output variables. This nonlinearity can manifest itself in terms of piecewise continuous functions or bifurcations, between some or all of the variables. The aims of this paper are two-fold. Firstly, a mixture of experts (MoE) model was trained on different physical systems exhibiting these types of nonlinearities. MoE models separate the input space into homogeneous regions and a different expert is responsible for the different regions. In this paper, the experts were low order polynomial regression models, thus avoiding the need for high-order polynomials. The model was trained within a Bayesian framework using variational Bayes, whereby a novel approach within the MoE literature was used in order to determine the number of experts in the model. Secondly, Bayesian sensitivity analysis (SA) of the systems under investigation was performed using the identified probabilistic MoE model in order to assess how uncertainty in the output can be attributed to uncertainty in the different inputs. The proposed methodology was first tested on a bifurcating Duffing oscillator, and it was then applied to real data sets obtained from the Tamar and Z24 bridges. In all cases, the MoE model was successful in identifying bifurcations and different physical regimes in the data by accurately dividing the input space; including identifying boundaries that were not parallel to coordinate axes.

  17. Bayesian soft x-ray tomography and MHD mode analysis on HL-2A

    NASA Astrophysics Data System (ADS)

    Li, Dong; Liu, Yi; Svensson, J.; Liu, Y. Q.; Song, X. M.; Yu, L. M.; Mao, Rui; Fu, B. Z.; Deng, Wei; Yuan, B. S.; Ji, X. Q.; Xu, Yuan; Chen, Wei; Zhou, Yan; Yang, Q. W.; Duan, X. R.; Liu, Yong; HL-2A Team

    2016-03-01

    A Bayesian tomography method using so-called Gaussian processes (GPs) for the emission model has been applied to the soft x-ray (SXR) diagnostics on the HL-2A tokamak. To improve the accuracy of the reconstructions, the standard GP is extended to a non-stationary version so that different smoothness between the plasma center and the edge can be taken into account in the algorithm. The uncertainty in the reconstruction arising from measurement errors and incapability can be fully analyzed by the use of Bayesian probability theory. In this work, the SXR reconstructions by this non-stationary Gaussian process tomography (NSGPT) method have been compared with the equilibrium magnetic flux surfaces, generally achieving a satisfactory agreement in terms of both shape and position. In addition, singular-value-decomposition (SVD) and Fast Fourier Transform (FFT) techniques have been applied for the analysis of the SXR and magnetic diagnostics, in order to explore the spatial and temporal features of the saturated long-lived magnetohydrodynamic (MHD) instability induced by energetic particles during neutral beam injection (NBI) on HL-2A. The result shows that this ideal internal kink instability has a dominant m/n = 1/1 mode structure along with a harmonic m/n = 2/2, which are coupled near the q = 1 surface with a rotation frequency of 12 kHz.

  18. A Bayesian analysis of the 69 highest energy cosmic rays detected by the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Khanin, Alexander; Mortlock, Daniel J.

    2016-08-01

    The origins of ultrahigh energy cosmic rays (UHECRs) remain an open question. Several attempts have been made to cross-correlate the arrival directions of the UHECRs with catalogues of potential sources, but no definite conclusion has been reached. We report a Bayesian analysis of the 69 events from the Pierre Auger Observatory (PAO) that aims to determine the fraction of the UHECRs that originate from known AGNs in the Véron-Cetty & Véron (VCV) catalogue, as well as AGNs detected with the Swift Burst Alert Telescope (Swift-BAT), galaxies from the 2MASS Redshift Survey (2MRS), and an additional volume-limited sample of 17 nearby AGNs. The study makes use of a multilevel Bayesian model of UHECR injection, propagation and detection. We find that for reasonable ranges of prior parameters the Bayes factors disfavour a purely isotropic model. For fiducial values of the model parameters, we report 68 per cent credible intervals for the fraction of source-originating UHECRs of 0.09^{+0.05}_{-0.04}, 0.25^{+0.09}_{-0.08}, 0.24^{+0.12}_{-0.10}, and 0.08^{+0.04}_{-0.03} for the VCV, Swift-BAT and 2MRS catalogues, and the sample of 17 AGNs, respectively.

  19. Bayesian analysis of response to selection: a case study using litter size in Danish Yorkshire pigs.

    PubMed Central

    Sorensen, D; Vernersen, A; Andersen, S

    2000-01-01

    Implementation of a Bayesian analysis of a selection experiment is illustrated using litter size [total number of piglets born (TNB)] in Danish Yorkshire pigs. Other traits studied include average litter weight at birth (WTAB) and proportion of piglets born dead (PRBD). Response to selection for TNB was analyzed with a number of models, which differed in their level of hierarchy, in their prior distributions, and in the parametric form of the likelihoods. A model assessment study favored a particular form of an additive genetic model. With this model, the Monte Carlo estimate of the 95% probability interval of response to selection was (0.23; 0.60), with a posterior mean of 0.43 piglets. WTAB showed a correlated response of -7.2 g, with a 95% probability interval equal to (-33.1; 18.9). The posterior mean of the genetic correlation between TNB and WTAB was -0.23 with a 95% probability interval equal to (-0.46; -0.01). PRBD was studied informally; it increases with larger litters, when litter size is >7 piglets born. A number of methodological issues related to the Bayesian model assessment study are discussed, as well as the genetic consequences of inferring response to selection using additive genetic models. PMID:10978292

  20. Developing a Validation Methodology for Expert-Informed Bayesian Network Models Supporting Nuclear Nonproliferation Analysis

    SciTech Connect

    White, Amanda M.; Gastelum, Zoe N.; Whitney, Paul D.

    2014-05-13

    Under the auspices of Pacific Northwest National Laboratory’s Signature Discovery Initiative (SDI), the research team developed a series of Bayesian network models to assess multi-source signatures of nuclear programs. A Bayesian network is a mathematical model that can be used to marshal evidence to assess competing hypotheses. The purpose of the models was to allow non-expert analysts to benefit from expert-informed mathematical models when assessing nuclear programs, because such assessments require significant technical expertise ranging from the nuclear fuel cycle to construction and engineering to imagery analysis. One such model developed under this research was aimed at assessing the consistency of open-source information about a nuclear facility with the facility’s declared use. The model incorporates factors such as location and security and safety features, among others identified by subject matter experts as crucial to their assessments. The model includes key features, observables and their relationships. The model also provides documentation, which serves as training material for the non-experts.

  1. Bayesian analysis of nanodosimetric ionisation distributions due to alpha particles and protons.

    PubMed

    De Nardo, L; Ferretti, A; Colautti, P; Grosswendt, B

    2011-02-01

    Track-nanodosimetry has the objective of investigating the stochastic aspects of ionisation events in particle tracks, by evaluating the probability distribution of the number of ionisations produced in a nanometric target volume positioned at distance d from a particle track. Such measurements make use of electron (or ion) gas detectors with detection efficiencies that are non-uniformly distributed inside the target volume. This fact makes the reconstruction of the true ionisation distributions, which correspond to an ideal efficiency of 100%, non-trivial. Bayesian unfolding has been applied to ionisation distributions produced by 5.4 MeV alpha particles and 20 MeV protons in cylindrical volumes of propane of 20 nm equivalent size, positioned at different impact parameters with respect to the primary beam. It will be shown that a Bayesian analysis performed by subdividing the target volume into sub-regions of different detection efficiencies is able to provide a good reconstruction of the true nanodosimetric ionisation distributions. PMID:21112893
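
    A minimal sketch of efficiency-corrected Bayesian unfolding in the style of D'Agostini's iterative method, which is the standard formulation of this kind of reconstruction; the response matrix and counts below are invented for illustration:

```python
import numpy as np

def bayesian_unfold(measured, response, iterations=10):
    """Iterative Bayesian (D'Agostini) unfolding.

    measured : observed counts per measured bin, shape (m,)
    response : P(measured bin j | true bin i), shape (m, n); column sums
               below 1 encode detection inefficiency
    """
    n = response.shape[1]
    eff = response.sum(axis=0)               # detection efficiency per true bin
    prior = np.full(n, 1.0 / n)              # start from a flat prior
    for _ in range(iterations):
        # P(true i | measured j) via Bayes' theorem
        joint = response * prior             # shape (m, n)
        post = joint / joint.sum(axis=1, keepdims=True)
        # Re-estimate the true distribution, correcting for efficiency
        unfolded = (post * measured[:, None]).sum(axis=0) / eff
        prior = unfolded / unfolded.sum()
    return unfolded

# Toy 3-bin example with smearing and non-uniform efficiency
response = np.array([[0.60, 0.10, 0.00],
                     [0.20, 0.50, 0.15],
                     [0.00, 0.20, 0.55]])
measured = np.array([120.0, 300.0, 180.0])
print(bayesian_unfold(measured, response))
```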

  2. Spatial-Temporal Epidemiology of Tuberculosis in Mainland China: An Analysis Based on Bayesian Theory

    PubMed Central

    Cao, Kai; Yang, Kun; Wang, Chao; Guo, Jin; Tao, Lixin; Liu, Qingrong; Gehendra, Mahara; Zhang, Yingjie; Guo, Xiuhua

    2016-01-01

    Objective: To explore the spatial-temporal interaction effect within a Bayesian framework and to probe the ecological influential factors for tuberculosis. Methods: Six different statistical models containing parameters of time, space, spatial-temporal interaction and their combinations were constructed based on a Bayesian framework. The optimal model was selected according to the deviance information criterion (DIC) value. Coefficients of climate variables were then estimated using the best-fitting model. Results: The model containing the spatial-temporal interaction parameter was the best-fitting one, with the smallest DIC value (−4,508,660). Ecological analysis results showed the relative risks (RRs) of average temperature, rainfall, wind speed, humidity, and air pressure were 1.00324 (95% CI, 1.00150–1.00550), 1.01010 (95% CI, 1.01007–1.01013), 0.83518 (95% CI, 0.93732–0.96138), 0.97496 (95% CI, 0.97181–1.01386), and 1.01007 (95% CI, 1.01003–1.01011), respectively. Conclusions: The spatial-temporal interaction was statistically meaningful and the prevalence of tuberculosis was influenced by the time-space interaction effect. Average temperature, rainfall, wind speed, and air pressure influenced tuberculosis; average humidity had no influence on tuberculosis. PMID:27164117

  4. Bayesian Neural Networks for Uncertainty Analysis of Hydrologic Modeling: A Comparison of Two Schemes

    SciTech Connect

    Zhang, Xuesong; Zhao, Kaiguang

    2012-06-01

    Bayesian Neural Networks (BNNs) have been shown to be useful tools for analyzing the modeling uncertainty of Neural Networks (NNs). This research focuses on the comparison of two BNN schemes. The first (BNN-I) uses statistical methods to describe the characteristics of different uncertainty sources (input, parameter, and model structure) and integrates these uncertainties into a Markov Chain Monte Carlo (MCMC) framework to estimate total uncertainty. The second (BNN-II) lumps all uncertainties into a single error term (i.e. the residual between model prediction and measurement). In this study, we propose a simple BNN-II, which uses Genetic Algorithms (GA) and Bayesian Model Averaging (BMA) to calibrate Neural Networks with different structures (numbers of hidden units) and combines the predictions from the different NNs to derive predictions and an uncertainty analysis. We tested the two BNNs in two watersheds for daily and monthly hydrologic simulation. The BMA-based BNN developed in this study outperforms BNN-I in both watersheds in terms of both prediction accuracy and uncertainty estimation. These results show that, given incomplete understanding of the characteristics associated with each uncertainty source, the simple lumped-error approach may yield better prediction and uncertainty estimation.
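
    The BMA combination step described here has a compact generic form: model weights derived from each network's likelihood on calibration data, a weighted predictive mean, and a total variance that adds within-model and between-model terms. A sketch under those assumptions (the GA calibration stage is omitted, and all data are simulated):

```python
import numpy as np

def bma_combine(preds, sigmas, train_preds, y_train):
    """Bayesian model averaging of K model predictions.

    preds       : (K, n) predictions from each model on new data
    sigmas      : (K,) per-model residual standard deviations
    train_preds : (K, m) predictions on a calibration set
    y_train     : (m,) calibration observations
    """
    # Gaussian log-likelihood of each model on calibration data -> BMA weights
    ll = np.array([
        -0.5 * np.sum((y_train - p) ** 2 / s**2 + np.log(2 * np.pi * s**2))
        for p, s in zip(train_preds, sigmas)
    ])
    w = np.exp(ll - ll.max())
    w /= w.sum()

    mean = w @ preds                                  # mixture mean
    # Total variance = within-model + between-model components
    var = w @ (sigmas[:, None] ** 2 + (preds - mean) ** 2)
    return mean, var

rng = np.random.default_rng(1)
y_train = rng.normal(1.0, 0.3, 50)
train_preds = np.stack([np.full(50, 0.9), np.full(50, 1.2)])
new_preds = np.array([[0.8, 1.1], [1.3, 1.4]])
mean, var = bma_combine(new_preds, np.array([0.3, 0.4]), train_preds, y_train)
print(mean, var)
```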

  5. GO-Bayes: Gene Ontology-based overrepresentation analysis using a Bayesian approach

    PubMed Central

    Zhang, Song; Cao, Jing; Kong, Y. Megan; Scheuermann, Richard H.

    2010-01-01

    Motivation: A typical approach for the interpretation of high-throughput experiments, such as gene expression microarrays, is to produce groups of genes based on certain criteria (e.g. genes that are differentially expressed). To gain more mechanistic insights into the underlying biology, overrepresentation analysis (ORA) is often conducted to investigate whether gene sets associated with particular biological functions, for example, as represented by Gene Ontology (GO) annotations, are statistically overrepresented in the identified gene groups. However, the standard ORA, which is based on the hypergeometric test, analyzes each GO term in isolation and does not take into account the dependence structure of the GO-term hierarchy. Results: We have developed a Bayesian approach (GO-Bayes) to measure overrepresentation of GO terms that incorporates the GO dependence structure by taking into account evidence not only from individual GO terms, but also from their related terms (i.e. parents, children, siblings, etc.). The Bayesian framework borrows information across related GO terms to strengthen the detection of overrepresentation signals. As a result, this method tends to identify sets of closely related GO terms rather than individual isolated GO terms. The advantage of the GO-Bayes approach is demonstrated with a simulation study and an application example. Contact: song.zhang@utsouthwestern.edu; richard.scheuermann@utsouthwestern.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20176581
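
    For contrast, the standard per-term ORA that GO-Bayes improves upon is a one-sided hypergeometric test, which can be computed directly; the gene counts below are made up:

```python
from scipy.stats import hypergeom

def ora_pvalue(n_genome, n_annotated, n_selected, n_overlap):
    """P(observing >= n_overlap annotated genes in the selected group)
    under the hypergeometric null of the standard ORA."""
    # sf(k - 1) gives P(X >= k) for the discrete hypergeometric distribution
    return hypergeom.sf(n_overlap - 1, n_genome, n_annotated, n_selected)

# 20,000 genes; 150 carry the GO term; 400 are differentially expressed,
# 12 of which carry the term
print(ora_pvalue(20_000, 150, 400, 12))   # small p -> term overrepresented
```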

  6. Critically evaluating the theory and performance of Bayesian analysis of macroevolutionary mixtures

    PubMed Central

    Moore, Brian R.; Höhna, Sebastian; May, Michael R.; Rannala, Bruce; Huelsenbeck, John P.

    2016-01-01

    Bayesian analysis of macroevolutionary mixtures (BAMM) has recently taken the study of lineage diversification by storm. BAMM estimates the diversification-rate parameters (speciation and extinction) for every branch of a study phylogeny and infers the number and location of diversification-rate shifts across branches of a tree. Our evaluation of BAMM reveals two major theoretical errors: (i) the likelihood function (which estimates the model parameters from the data) is incorrect, and (ii) the compound Poisson process prior model (which describes the prior distribution of diversification-rate shifts across branches) is incoherent. Using simulation, we demonstrate that these theoretical issues cause statistical pathologies; posterior estimates of the number of diversification-rate shifts are strongly influenced by the assumed prior, and estimates of diversification-rate parameters are unreliable. Moreover, the inability to correctly compute the likelihood or to correctly specify the prior for rate-variable trees precludes the use of Bayesian approaches for testing hypotheses regarding the number and location of diversification-rate shifts using BAMM. PMID:27512038

  7. Assessing State Nuclear Weapons Proliferation: Using Bayesian Network Analysis of Social Factors

    SciTech Connect

    Coles, Garill A.; Brothers, Alan J.; Olson, Jarrod; Whitney, Paul D.

    2010-04-16

    A Bayesian network (BN) model of social factors can support proliferation assessments by estimating the likelihood that a state will pursue a nuclear weapon. Social factors, including political, economic, nuclear capability, security, and national identity and psychology factors, may play as important a role in whether a state pursues nuclear weapons as more physical factors. This paper shows how Bayesian reasoning applied to a generic case of a would-be proliferator state can be used to combine evidence in support of proliferation assessment. Theories and analysis by political scientists can be leveraged in a quantitative and transparent way to indicate proliferation risk. BN models facilitate diagnosis and inference in a probabilistic environment using a network of nodes and directed acyclic arcs between the nodes, whose connections, or absence thereof, indicate probabilistic relevance or independence. We propose a BN model that would use information from both traditional safeguards and the strengthened safeguards associated with the Additional Protocol to indicate countries with a high risk of proliferating nuclear weapons. This model could be used in a variety of applications, such as a prioritization tool and as a component of state safeguards evaluations. This paper discusses the benefits of BN reasoning, the development of Pacific Northwest National Laboratory’s (PNNL) BN state proliferation model, and how it could be employed as an analytical tool.
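
    Mechanically, marshalling evidence in such a BN is Bayes' rule applied over the network's joint distribution. A toy two-indicator network with invented probabilities (the actual PNNL model structure and conditional probability tables are not reproduced here):

```python
# Toy network: Pursuit -> Security posture, Pursuit -> Acquisition activity
p_pursuit = {True: 0.05, False: 0.95}
p_security = {True: {True: 0.8, False: 0.2},    # P(security | pursuit)
              False: {True: 0.3, False: 0.7}}
p_acquis = {True: {True: 0.6, False: 0.4},      # P(acquisition | pursuit)
            False: {True: 0.1, False: 0.9}}

def posterior_pursuit(security_obs, acquis_obs):
    """P(pursuit | observed security posture, observed acquisition activity)."""
    joint = {}
    for pursuit in (True, False):
        joint[pursuit] = (p_pursuit[pursuit]
                          * p_security[pursuit][security_obs]
                          * p_acquis[pursuit][acquis_obs])
    z = sum(joint.values())
    return joint[True] / z

# Both indicators observed "high": posterior rises well above the 5% prior
print(posterior_pursuit(True, True))
```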

  8. Two waves of diversification in mammals and reptiles of Baja California revealed by hierarchical Bayesian analysis.

    PubMed

    Leaché, Adam D; Crews, Sarah C; Hickerson, Michael J

    2007-12-22

    Many species inhabiting the Peninsular Desert of Baja California demonstrate a phylogeographic break at the mid-peninsula, and previous researchers have attributed this shared pattern to a single vicariant event, a mid-peninsular seaway. However, previous studies have not explicitly considered the inherent stochasticity associated with the gene-tree coalescence for species preceding the time of the putative mid-peninsular divergence. We use a Bayesian analysis of a hierarchical model to test for simultaneous vicariance across co-distributed sister lineages sharing a genealogical break at the mid-peninsula. This Bayesian method is advantageous over traditional phylogenetic interpretations of biogeography because it considers the genetic variance associated with the coalescent and mutational processes, as well as the among-lineage demographic differences that affect gene-tree coalescent patterns. Mitochondrial DNA data from six small mammals and six squamate reptiles do not support the perception of a shared vicariant history among lineages exhibiting a north-south divergence at the mid-peninsula, and instead support two events differentially structuring genetic diversity in this region. PMID:17698443

  9. Multi-level Bayesian safety analysis with unprocessed Automatic Vehicle Identification data for an urban expressway.

    PubMed

    Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie

    2016-03-01

    In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework was developed in an effort to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, the processed data capped at the speed limit and the unprocessed data retaining the original speeds, were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data were superior. Both the multi-level models and the random parameters models outperformed the Negative Binomial model, and the models with random parameters achieved the best model fit. The contributing factors identified imply that on the urban expressway lower speeds and higher speed variation could significantly increase the crash likelihood. Other significant geometric factors included auxiliary lanes and horizontal curvature. PMID:26722989

  11. Benchmark Composite Wing Design Including Joint Analysis and Optimization

    NASA Astrophysics Data System (ADS)

    Albers, Robert G.

    A composite wing panel software package named WING Joint OpTimization and Analysis (WINGJOTA), featuring bolted joint analysis, is created and presented in this research. The three areas of focus were the development of an analytic composite bolted joint analysis suitable for fast evaluation; a more realistic wing design than what has been considered in the open literature; and the application of two optimization algorithms to composite wing design. Optimization results from 14 wing load cases applied to a composite wing panel with joints are presented. The composite bolted joint analysis consists of an elasticity solution that provides the stress state at a characteristic distance away from the bolt holes. The stresses at the characteristic distance are compared to a failure criterion on a ply-by-ply basis that determines not only first-ply failure but also the failure mode. The loads in the multi-fastener joints used in this study were determined by an iterative scheme that provides the bearing-bypass loads to the elasticity analysis. A preliminary design of a composite subsonic transport wing was developed, based around a mid-size, twin-aisle aircraft. The benchmark design includes the leading- and trailing-edge structures and the center box inside the fuselage. Wing masses were included as point loads, and fuel loads were incorporated as distributed loads. The side-of-body boundary condition was modeled using high-stiffness springs, and the aerodynamic loads were applied using an approximate point-load scheme. The entire wing structure was modeled using the finite element code ANSYS to provide the internal loads needed as boundary conditions for the wing panel analyzed by WINGJOTA. The software package WINGJOTA combines the composite bolted joint analysis, a composite plate finite element analysis, a wing aeroelastic cycle, and two optimization algorithms to form the basis of a computer code for analysis and optimization. Both the Improving Hit-and-Run (IHR) and

  12. Reverse Engineering of Modified Genes by Bayesian Network Analysis Defines Molecular Determinants Critical to the Development of Glioblastoma

    PubMed Central

    Kunkle, Brian W.; Yoo, Changwon; Roy, Deodutta

    2013-01-01

    In this study we have identified key genes that are critical in the development of astrocytic tumors. Meta-analysis of microarray studies that compared normal tissue to astrocytoma revealed a set of 646 differentially expressed genes in the majority of astrocytomas. Reverse engineering of these 646 genes using Bayesian network analysis produced a gene network for each grade of astrocytoma (Grade I–IV), and ‘key genes’ within each grade were identified. The genes found to be most influential in the development of the highest grade of astrocytoma, glioblastoma multiforme, were: COL4A1, EGFR, BTF3, MPP2, RAB31, CDK4, CD99, ANXA2, TOP2A, and SERBP1. All of these genes were up-regulated, except MPP2 (down-regulated). These 10 genes were able to predict tumor status with 96–100% confidence when using logistic regression, cross-validation, and support vector machine analysis. The Markov genes interact with NF-κB, ERK, MAPK, VEGF, growth hormone and collagen to produce a network whose top biological functions are cancer, neurological disease, and cellular movement. Three of the 10 genes in particular - EGFR, COL4A1, and CDK4 - seemed to be potential ‘hubs of activity’. Modified expression of these 10 Markov blanket genes increases the lifetime risk of developing glioblastoma compared to the normal population, and the risk estimates increased dramatically with the joint effects of 4 or more genes: joint interaction effects of 4, 5, 6, 7, 8, 9 or 10 Markov blanket genes produced increases in lifetime risk of developing glioblastoma of 9, 13, 20.9, 26.7, 52.8, 53.2, 78.1 or 85.9%, respectively, compared to the normal population. In summary, it appears that modified expression of several ‘key genes’ may be required for the development of glioblastoma. Further studies are needed to validate these ‘key genes’ as useful tools for early detection and novel therapeutic options for these tumors. PMID:23737970

  13. Bayesian Analysis for Exponential Random Graph Models Using the Adaptive Exchange Sampler*

    PubMed Central

    Jin, Ick Hoon; Yuan, Ying; Liang, Faming

    2014-01-01

    Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of their intractable normalizing constant and model degeneracy. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the intractable normalizing constant and model degeneracy issues encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as an MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, a molecule synthetic network, and the dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency. PMID:24653788

  14. Application of Bayesian statistical techniques in the analysis of spacecraft pointing errors

    NASA Astrophysics Data System (ADS)

    Dungate, D. G.

    1993-09-01

    A key problem in the statistical analysis of spacecraft pointing performance is the justifiable identification of a Probability Density Function (PDF) for each contributing error source. The drawbacks of Gaussian distributions are well known, and more flexible families of distributions have been identified, but often only limited data is available to support PDF assignment. Two methods based on Bayesian statistical principles, each working from alternative viewpoints, are applied to the problem here, and appear to offer significant advantages in the analysis of many error types. In particular, errors such as time-varying thermal distortions, where data is only available via a small number of Finite Element Analyses, appear to be satisfactorily dealt with via one of these methods, which also explicitly allows for the inclusion of estimated errors in quantities formed from the data available for a particular error source.

  15. A Bayesian Approach for Instrumental Variable Analysis with Censored Time-to-Event Outcome

    PubMed Central

    Li, Gang; Lu, Xuyang

    2014-01-01

    Instrumental variable (IV) analysis has been widely used in economics, epidemiology, and other fields to estimate the causal effects of covariates on outcomes, in the presence of unobserved confounders and/or measurement errors in covariates. However, IV methods for time-to-event outcome with censored data remain underdeveloped. This paper proposes a Bayesian approach for IV analysis with censored time-to-event outcome by using a two-stage linear model. A Markov Chain Monte Carlo sampling method is developed for parameter estimation for both normal and non-normal linear models with elliptically contoured error distributions. Performance of our method is examined by simulation studies. Our method largely reduces bias and greatly improves coverage probability of the estimated causal effect, compared to the method that ignores the unobserved confounders and measurement errors. We illustrate our method on the Women's Health Initiative Observational Study and the Atherosclerosis Risk in Communities Study. PMID:25393617

  16. Video-analysis Interface for Angular Joint Measurement

    NASA Astrophysics Data System (ADS)

    Mondani, M.; Ghersi, I.; Miralles, M. T.

    2016-04-01

    Real-time quantification of joint articular movement is instrumental in the comprehensive assessment of significant biomechanical gestures. The development of an interface, based on an automatic algorithm for 3D motion analysis, is presented in this work. The graphical interface uses open-source libraries for video processing, and its use is intuitive. The proposed method is low-cost, of acceptable precision (|εθ| < 1°), and minimally invasive. It makes it possible to obtain the angular movement of joints in different planes, synchronized with the video of the gesture, as well as to make comparisons and to calculate parameters of interest from the acquired angular kinematics.

  17. Bayesian analysis of complex interacting mutations in HIV drug resistance and cross-resistance.

    PubMed

    Kozyryev, Ivan; Zhang, Jing

    2015-01-01

    A successful treatment of AIDS world-wide is severely hindered by the HIV virus' drug resistance capability resulting from complicated mutation patterns of viral proteins. Such a system of mutations enables the virus to survive and reproduce despite the presence of various antiretroviral drugs by disrupting their binding capability. Although these interacting mutation patterns are extremely difficult to efficiently uncover and interpret, they contribute valuable information to personalized therapeutic regimen design. The use of Bayesian statistical modeling provides an unprecedented opportunity in the field of anti-HIV therapy to understand detailed interaction structures of drug resistant mutations. Multiple Bayesian models equipped with Markov Chain Monte Carlo (MCMC) methods have been recently proposed in this field (Zhang et al. in PNAS 107:1321, 2010 [1]; Zhang et al. in J Proteome Sci Comput Biol 1:2, 2012 [2]; Svicher et al. in Antiviral Res 93(1):86-93, 2012 [3]; Svicher et al. in Antiviral Therapy 16(7):1035-1045, 2011 [4]; Svicher et al. in Antiviral Ther 16(4):A14-A14, 2011 [5]; Svicher et al. in Antiviral Ther 16(4):A85-A85, 2011 [6]; Alteri et al. in Signature mutations in V3 and bridging sheet domain of HIV-1 gp120 HIV-1 are specifically associated with dual tropism and modulate the interaction with CCR5 N-Terminus, 2011 [7]). Probabilistically modeling mutations in the HIV-1 protease or reverse transcriptase (RT) isolated from drug-treated patients provides a powerful statistical procedure that first detects mutation combinations associated with single or multiple-drug resistance, and then infers detailed dependence structures among the interacting mutations in viral proteins (Zhang et al. in PNAS 107:1321, 2010 [1]; Zhang et al. in J Proteome Sci Comput Biol 1:2, 2012 [2]). Combined with molecular dynamics simulations and free energy calculations, Bayesian analysis predictions help to uncover genetic and structural mechanisms in the HIV treatment

  18. Adhesive Characterization and Progressive Damage Analysis of Bonded Composite Joints

    NASA Technical Reports Server (NTRS)

    Girolamo, Donato; Davila, Carlos G.; Leone, Frank A.; Lin, Shih-Yung

    2014-01-01

    The results of an experimental/numerical campaign aimed at developing progressive damage analysis (PDA) tools for predicting the strength of a composite bonded joint under tensile loads are presented. The PDA is based on continuum damage mechanics (CDM) to account for intralaminar damage, and on cohesive laws to account for interlaminar and adhesive damage. The adhesive response is characterized using standard fracture specimens and digital image correlation (DIC). The displacement fields measured by DIC are used to calculate the J-integrals, from which the associated cohesive laws of the structural adhesive can be derived. A finite element model of a sandwich conventional splice joint (CSJ) under tensile loads was developed. The simulations indicate that the model is capable of predicting the interactions of damage modes that lead to the failure of the joint.

  19. Results and Analysis from Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Matty, Jennifer

    2010-01-01

    A space suit's mobility is critical to an astronaut's ability to perform work efficiently. As mobility increases, the astronaut can perform tasks for longer durations with less fatigue. Mobility can be broken down into two parts: range of motion (ROM) and torque. These two measurements describe how the suit moves and how much force it takes to move. Two methods were chosen to define mobility requirements for the Constellation Space Suit Element (CSSE). One method focuses on range of motion and the second centers on joint torque. A joint torque test was conducted to determine a baseline for current advanced space suit joint torques. This test utilized the following space suits: Extravehicular Mobility Unit (EMU), Advanced Crew Escape Suit (ACES), I-Suit, D-Suit, Enhanced Mobility (EM)-ACES, and Mark III (MK-III). Data was collected from 16 different joint movements of each suit. The results were then reviewed and CSSE joint torque requirement values were selected. The focus of this paper is to discuss trends observed during data analysis.

  20. A New Bayesian Lasso

    PubMed Central

    Mallick, Himel; Yi, Nengjun

    2016-01-01

    Park and Casella (2008) provided the Bayesian lasso for linear models by assigning scale mixture of normal (SMN) priors on the parameters and independent exponential priors on their variances. In this paper, we propose an alternative Bayesian analysis of the lasso problem. A different hierarchical formulation of the Bayesian lasso is introduced by utilizing the scale mixture of uniform (SMU) representation of the Laplace density. We consider a fully Bayesian treatment that leads to a new Gibbs sampler with tractable full conditional posterior distributions. Empirical results and real data analyses show that the new algorithm has good mixing properties and performs comparably to the existing Bayesian method in terms of both prediction accuracy and variable selection. An ECM algorithm is provided to compute the MAP estimates of the parameters. Easy extension to general models is also briefly discussed.
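
    For reference, the Park and Casella (2008) SMN formulation that this paper departs from yields the following short Gibbs sampler. This sketch implements the standard SMN sampler, not the SMU variant proposed in the paper, and the data are simulated:

```python
import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=2000, seed=0):
    """Gibbs sampler for the Park & Casella (2008) Bayesian lasso."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    sigma2, tau2 = 1.0, np.ones(p)
    XtX, Xty = X.T @ X, X.T @ y
    draws = []
    for _ in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 A^{-1}),  A = X'X + diag(1/tau2)
        A_inv = np.linalg.inv(XtX + np.diag(1.0 / tau2))
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
        # sigma2 | rest ~ Inverse-Gamma
        resid = y - X @ beta
        shape = 0.5 * (n - 1 + p)
        rate = 0.5 * (resid @ resid + beta @ (beta / tau2))
        sigma2 = 1.0 / rng.gamma(shape, 1.0 / rate)
        # 1/tau2_j | rest ~ Inverse-Gaussian (numpy's 'wald' distribution)
        mu = np.sqrt(lam**2 * sigma2 / beta**2)
        tau2 = 1.0 / rng.wald(mu, lam**2)
        draws.append(beta.copy())
    return np.array(draws)

rng = np.random.default_rng(42)
X = rng.standard_normal((100, 5))
y = X @ np.array([2.0, 0.0, -1.5, 0.0, 0.0]) + rng.standard_normal(100)
samples = bayesian_lasso_gibbs(X, y)
print(samples[1000:].mean(axis=0))   # posterior means shrink the null coefficients
```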

  1. Joint modality fusion and temporal context exploitation for semantic video analysis

    NASA Astrophysics Data System (ADS)

    Papadopoulos, Georgios Th; Mezaris, Vasileios; Kompatsiaris, Ioannis; Strintzis, Michael G.

    2011-12-01

    In this paper, a multi-modal context-aware approach to semantic video analysis is presented. Overall, the examined video sequence is initially segmented into shots and for every resulting shot appropriate color, motion and audio features are extracted. Then, Hidden Markov Models (HMMs) are employed for performing an initial association of each shot with the semantic classes that are of interest separately for each modality. Subsequently, a graphical modeling-based approach is proposed for jointly performing modality fusion and temporal context exploitation. Novelties of this work include the combined use of contextual information and multi-modal fusion, and the development of a new representation for providing motion distribution information to HMMs. Specifically, an integrated Bayesian Network is introduced for simultaneously performing information fusion of the individual modality analysis results and exploitation of temporal context, contrary to the usual practice of performing each task separately. Contextual information is in the form of temporal relations among the supported classes. Additionally, a new computationally efficient method for providing motion energy distribution-related information to HMMs, which supports the incorporation of motion characteristics from previous frames to the currently examined one, is presented. The final outcome of this overall video analysis framework is the association of a semantic class with every shot. Experimental results as well as comparative evaluation from the application of the proposed approach to four datasets belonging to the domains of tennis, news and volleyball broadcast video are presented.

  2. Exposure models for the prior distribution in bayesian decision analysis for occupational hygiene decision making.

    PubMed

    Lee, Eun Gyung; Kim, Seung Won; Feigley, Charles E; Harper, Martin

    2013-01-01

    This study introduces two semi-quantitative methods, Structured Subjective Assessment (SSA) and Control of Substances Hazardous to Health (COSHH) Essentials, in conjunction with two-dimensional Monte Carlo simulations for determining prior probabilities. Prior distribution using expert judgment was included for comparison. Practical applications of the proposed methods were demonstrated using personal exposure measurements of isoamyl acetate in an electronics manufacturing facility and of isopropanol in a printing shop. Applicability of these methods in real workplaces was discussed based on the advantages and disadvantages of each method. Although these methods could not be completely independent of expert judgments, this study demonstrated a methodological improvement in the estimation of the prior distribution for the Bayesian decision analysis tool. The proposed methods provide a logical basis for the decision process by considering determinants of worker exposure. PMID:23252451

  3. Bayesian network meta-analysis for unordered categorical outcomes with incomplete data.

    PubMed

    Schmid, Christopher H; Trikalinos, Thomas A; Olkin, Ingram

    2014-06-01

    We develop a Bayesian multinomial network meta-analysis model for unordered (nominal) categorical outcomes that allows for partially observed data in which exact event counts may not be known for each category. This model properly accounts for correlations of counts in mutually exclusive categories and enables proper comparison and ranking of treatment effects across multiple treatments and multiple outcome categories. We apply the model to analyze 17 trials, each of which compares two of three treatments (high and low dose statins and standard care/control) for three outcomes for which data are complete: cardiovascular death, non-cardiovascular death and no death. We also analyze the cardiovascular death category divided into the three subcategories (coronary heart disease, stroke and other cardiovascular diseases) that are not completely observed. The multinomial and network representations show that high dose statins are effective in reducing the risk of cardiovascular disease. PMID:26052655

  4. Bayesian analysis of the effective charge from spectroscopic bremsstrahlung measurement in fusion plasmas

    NASA Astrophysics Data System (ADS)

    Krychowiak, M.; König, R.; Klinger, T.; Fischer, R.

    2004-11-01

    At the stellarator Wendelstein 7-AS (W7-AS), a spectrally resolving two-channel system for the measurement of line-of-sight averaged Zeff values has been tested in preparation for its planned installation as a multichannel Zeff-profile measurement system on the stellarator Wendelstein 7-X (W7-X), which is presently under construction. The measurement uses the bremsstrahlung intensity in the wavelength region from the ultraviolet to the near infrared. The spectrally resolved measurement makes it possible to eliminate signal contamination by line radiation. For statistical data analysis, a procedure based on Bayesian probability theory has been developed. With this method it is possible to estimate the bremsstrahlung background in the measured signal, and its error, without the necessity of fitting the spectral lines. To evaluate the random error in Zeff, the signal noise has been investigated. Furthermore, the linearity and the saturation behavior of the charge-coupled device detector have been analyzed.

  5. Bayesian semiparametric power spectral density estimation with applications in gravitational wave data analysis

    NASA Astrophysics Data System (ADS)

    Edwards, Matthew C.; Meyer, Renate; Christensen, Nelson

    2015-09-01

    The standard noise model in gravitational wave (GW) data analysis assumes detector noise is stationary and Gaussian distributed, with a known power spectral density (PSD) that is usually estimated using clean off-source data. Real GW data often depart from these assumptions, and misspecified parametric models of the PSD could result in misleading inferences. We propose a Bayesian semiparametric approach to improve this. We use a nonparametric Bernstein polynomial prior on the PSD, with weights attained via a Dirichlet process distribution, and update this using the Whittle likelihood. Posterior samples are obtained using a blocked Metropolis-within-Gibbs sampler. We simultaneously estimate the reconstruction parameters of a rotating core collapse supernova GW burst that has been embedded in simulated Advanced LIGO noise. We also discuss an approach to deal with nonstationary data by breaking longer data streams into smaller and locally stationary components.
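
    The Whittle likelihood used for the posterior update has a simple closed form, -Σ_j [log S(f_j) + I(f_j)/S(f_j)], where I is the periodogram and S the PSD. A minimal check on simulated white noise (the Bernstein-polynomial prior and the Gibbs sampler are beyond a short sketch):

```python
import numpy as np

def whittle_log_likelihood(data, psd, dt=1.0):
    """Whittle log-likelihood of `data` given a one-sided PSD evaluated
    at the positive Fourier frequencies (DC and Nyquist excluded)."""
    n = len(data)
    # One-sided periodogram at the positive frequencies
    fft = np.fft.rfft(data)[1:-1]
    periodogram = 2.0 * dt * np.abs(fft) ** 2 / n
    return -np.sum(np.log(psd) + periodogram / psd)

rng = np.random.default_rng(0)
n, dt = 4096, 1.0 / 1024
noise = rng.standard_normal(n)
freqs = np.fft.rfftfreq(n, dt)[1:-1]

flat_psd = np.full_like(freqs, 2.0 * dt)    # true one-sided PSD of unit white noise
wrong_psd = flat_psd * 10.0
print(whittle_log_likelihood(noise, flat_psd, dt) >
      whittle_log_likelihood(noise, wrong_psd, dt))   # True: correct PSD preferred
```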

  6. Intuitive Logic Revisited: New Data and a Bayesian Mixed Model Meta-Analysis

    PubMed Central

    Singmann, Henrik; Klauer, Karl Christoph; Kellen, David

    2014-01-01

    Recent research on syllogistic reasoning suggests that the logical status (valid vs. invalid) of even difficult syllogisms can be intuitively detected via differences in conceptual fluency between logically valid and invalid syllogisms when participants are asked to rate how much they like a conclusion following from a syllogism (Morsanyi & Handley, 2012). These claims of an intuitive logic are at odds with most theories of syllogistic reasoning, which posit that detecting the logical status of difficult syllogisms requires effortful and deliberate cognitive processes. We present new data replicating the effects reported by Morsanyi and Handley, but show that this effect is eliminated when controlling for a possible confound in terms of conclusion content. Additionally, we reanalyze three studies without this confound with a Bayesian mixed model meta-analysis (i.e., controlling for participant and item effects), which provides evidence for the null hypothesis and against Morsanyi and Handley's claim. PMID:24755777

  8. pyBLoCXS: Bayesian Low-Count X-ray Spectral analysis

    NASA Astrophysics Data System (ADS)

    Siemiginowska, Aneta; Kashyap, Vinay; Refsdal, Brian; van Dyk, David; Connors, Alanna; Park, Taeyoung

    2012-04-01

    pyBLoCXS is a sophisticated Markov chain Monte Carlo (MCMC) based algorithm designed to carry out Bayesian Low-Count X-ray Spectral (BLoCXS) analysis in the Sherpa environment. The code is a Python extension to Sherpa that explores the parameter space at a suspected minimum of a predefined Sherpa model fit to high-energy X-ray spectral data. pyBLoCXS includes a flexible definition of priors and allows for variations in the calibration information. It can be used to compute posterior predictive p-values for the likelihood ratio test. The pyBLoCXS code has been tested with a number of simple single-component spectral models; it should be used with great care in more complex settings.

  9. Space Shuttle RTOS Bayesian Network

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

    With shrinking budgets and requirements to increase the reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high-priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems, such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, a joint venture between Boeing and Lockheed Martin and the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure of the network and the selection of its prior probabilities are elicited from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores

  10. TYPE Ia SUPERNOVA LIGHT-CURVE INFERENCE: HIERARCHICAL BAYESIAN ANALYSIS IN THE NEAR-INFRARED

    SciTech Connect

    Mandel, Kaisey S.; Friedman, Andrew S.; Kirshner, Robert P.; Wood-Vasey, W. Michael

    2009-10-10

    We present a comprehensive statistical analysis of the properties of Type Ia supernova (SN Ia) light curves in the near-infrared using recent data from the Peters Automated InfraRed Imaging TELescope and the literature. We construct a hierarchical Bayesian framework, incorporating several uncertainties including photometric error, peculiar velocities, dust extinction, and intrinsic variations, for principled and coherent statistical inference. SN Ia light-curve inferences are drawn from the global posterior probability of parameters describing both individual supernovae and the population, conditioned on the entire SN Ia NIR data set. The logical structure of the hierarchical model is represented by a directed acyclic graph. Fully Bayesian analysis of the model and data is enabled by an efficient Markov Chain Monte Carlo algorithm exploiting the conditional probabilistic structure using Gibbs sampling. We apply this framework to the JHK_s SN Ia light-curve data. A new light-curve model captures the observed J-band light-curve shape variations. The marginal intrinsic variances in peak absolute magnitudes are σ(M_J) = 0.17 ± 0.03, σ(M_H) = 0.11 ± 0.03, and σ(M_Ks) = 0.19 ± 0.04. We describe the first quantitative evidence for correlations between the NIR absolute magnitudes and J-band light-curve shapes, and demonstrate their utility for distance estimation. The average residual in the Hubble diagram for the training-set SNe at cz > 2000 km s^-1 is 0.10 mag. The new application of bootstrap cross-validation to SN Ia light-curve inference tests the sensitivity of the statistical model fit to the finite sample and estimates the prediction error at 0.15 mag. These results demonstrate that SN Ia NIR light curves are as effective as corrected optical light curves, and, because they are less vulnerable to dust absorption, they have great potential as precise and accurate cosmological distance indicators.

  11. Application of Bayesian Approach to Cost-Effectiveness Analysis of Antiviral Treatments in Chronic Hepatitis B

    PubMed Central

    Zhang, Hua; Huo, Mingdong; Chao, Jianqian; Liu, Pei

    2016-01-01

    Background: Hepatitis B virus (HBV) infection is a major public health problem; timely antiviral treatment can significantly prevent the progression of liver damage from HBV by slowing down or stopping viral replication. In this study we applied a Bayesian approach to cost-effectiveness analysis, using Markov Chain Monte Carlo (MCMC) simulation methods for the relevant evidence input into the model, to evaluate the cost-effectiveness of entecavir (ETV) and lamivudine (LVD) therapy for chronic hepatitis B (CHB) in Jiangsu, China, thus providing information to the public health system on CHB therapy. Methods: An eight-stage Markov model was developed, and a hypothetical cohort of 35-year-old HBeAg-positive patients with CHB was entered into the model. Treatment regimens were LVD 100 mg daily and ETV 0.5 mg daily. The transition parameters were derived either from systematic reviews of the literature or from previous economic studies. The outcome measures were life-years, quality-adjusted life-years (QALYs), and the expected costs associated with the treatments and disease progression. All Bayesian analyses were implemented using WinBUGS version 1.4. Results: Expected cost, life expectancy, and QALYs decreased with age, while cost-effectiveness increased with age. The expected cost of ETV was less than that of LVD, while its life expectancy and QALYs were higher; the ETV strategy was therefore more cost-effective. The costs and benefits from the Monte Carlo simulation were very close to the exact results at the group level, but the standard deviation of each group indicated large differences between individual patients. Conclusions: Compared with lamivudine, entecavir is the more cost-effective option. CHB patients should accept antiviral treatment as soon as possible, as treatment is more cost-effective at younger ages. The Monte Carlo simulation yielded cost and effectiveness distributions, indicating that our Markov model is robust. PMID:27574976
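
    The engine of such an analysis is a discrete-time Markov cohort model: propagate a cohort through health states, accumulating discounted costs and QALYs. A stripped-down three-state sketch with invented transition probabilities, costs, and utilities (not the study's eight-stage model):

```python
import numpy as np

def markov_cohort(trans, costs, utilities, cycles=40, disc=0.03):
    """Run a Markov cohort model; return total discounted cost and QALYs."""
    n_states = trans.shape[0]
    state = np.zeros(n_states)
    state[0] = 1.0                      # whole cohort starts in state 0
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        df = 1.0 / (1.0 + disc) ** t    # annual discount factor
        total_cost += df * state @ costs
        total_qaly += df * state @ utilities
        state = state @ trans           # advance the cohort one cycle
    return total_cost, total_qaly

# States: CHB, compensated cirrhosis, death (absorbing)
trans = np.array([[0.92, 0.06, 0.02],
                  [0.00, 0.90, 0.10],
                  [0.00, 0.00, 1.00]])
costs = np.array([1200.0, 4500.0, 0.0])     # per-cycle costs (placeholder units)
utils = np.array([0.85, 0.60, 0.0])         # per-cycle utilities
cost, qaly = markov_cohort(trans, costs, utils)
print(f"cost = {cost:.0f}, QALYs = {qaly:.2f}")
```

    Comparing two strategies then reduces to running the model once per treatment's transition matrix and taking the ratio of cost and QALY differences.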

  12. From "weight of evidence" to quantitative data integration using multicriteria decision analysis and Bayesian methods.

    PubMed

    Linkov, Igor; Massey, Olivia; Keisler, Jeff; Rusyn, Ivan; Hartung, Thomas

    2015-01-01

    "Weighing" available evidence in the process of decision-making is unavoidable, yet it is one step that routinely raises suspicions: what evidence should be used, how much does it weigh, and whose thumb may be tipping the scales? This commentary aims to evaluate the current state and future roles of various types of evidence for hazard assessment as it applies to environmental health. In its recent evaluation of the US Environmental Protection Agency's Integrated Risk Information System assessment process, the National Research Council committee singled out the term "weight of evidence" (WoE) for critique, deeming the process too vague and detractive to the practice of evaluating human health risks of chemicals. Moving the methodology away from qualitative, vague and controversial methods towards generalizable, quantitative and transparent methods for appropriately managing diverse lines of evidence is paramount for both regulatory and public acceptance of the hazard assessments. The choice of terminology notwithstanding, a number of recent Bayesian WoE-based methods, the emergence of multi criteria decision analysis for WoE applications, as well as the general principles behind the foundational concepts of WoE, show promise in how to move forward and regain trust in the data integration step of the assessments. We offer our thoughts on the current state of WoE as a whole and while we acknowledge that many WoE applications have been largely qualitative and subjective in nature, we see this as an opportunity to turn WoE towards a quantitative direction that includes Bayesian and multi criteria decision analysis. PMID:25592482

  13. Predictive distributions for between-study heterogeneity and simple methods for their application in Bayesian meta-analysis.

    PubMed

    Turner, Rebecca M; Jackson, Dan; Wei, Yinghui; Thompson, Simon G; Higgins, Julian P T

    2015-03-15

    Numerous meta-analyses in healthcare research combine results from only a small number of studies, for which the variance representing between-study heterogeneity is estimated imprecisely. A Bayesian approach to estimation allows external evidence on the expected magnitude of heterogeneity to be incorporated. The aim of this paper is to provide tools that improve the accessibility of Bayesian meta-analysis. We present two methods for implementing Bayesian meta-analysis, using numerical integration and importance sampling techniques. Based on 14,886 binary outcome meta-analyses in the Cochrane Database of Systematic Reviews, we derive a novel set of predictive distributions for the degree of heterogeneity expected in 80 settings depending on the outcomes assessed and comparisons made. These can be used as prior distributions for heterogeneity in future meta-analyses. The two methods are implemented in R, for which code is provided. Both methods produce equivalent results to standard but more complex Markov chain Monte Carlo approaches. The priors are derived as log-normal distributions for the between-study variance, applicable to meta-analyses of binary outcomes on the log odds-ratio scale. The methods are applied to two example meta-analyses, incorporating the relevant predictive distributions as prior distributions for between-study heterogeneity. We have provided resources to facilitate Bayesian meta-analysis, in a form accessible to applied researchers, which allow relevant prior information on the degree of heterogeneity to be incorporated. PMID:25475839
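
    The numerical-integration route described above can be sketched directly: a log-normal prior on the between-study variance, a vague prior on the pooled effect, and a grid evaluation of the posterior. The prior hyperparameters and data below are illustrative, not the paper's derived values:

```python
import numpy as np
from scipy.stats import norm, lognorm

# Log odds-ratios and standard errors from a small meta-analysis (made up)
y = np.array([-0.4, -0.1, -0.6, 0.1])
se = np.array([0.30, 0.25, 0.40, 0.35])

# Grids over the pooled effect mu and the between-study variance tau^2
mu_grid = np.linspace(-2.0, 2.0, 201)
tau2_grid = np.linspace(1e-4, 2.0, 200)

# Illustrative informative prior: log(tau^2) ~ N(-2.0, 1.5^2)
tau2_prior = lognorm.pdf(tau2_grid, s=1.5, scale=np.exp(-2.0))
mu_prior = norm.pdf(mu_grid, 0.0, 10.0)          # vague prior on mu

post = np.zeros((len(mu_grid), len(tau2_grid)))
for j, tau2 in enumerate(tau2_grid):
    sd = np.sqrt(se**2 + tau2)                   # marginal sd of each study
    loglik = norm.logpdf(y[None, :], mu_grid[:, None], sd[None, :]).sum(axis=1)
    post[:, j] = np.exp(loglik) * mu_prior * tau2_prior[j]
post /= post.sum()

mu_marginal = post.sum(axis=1)
print("posterior mean of mu:", np.sum(mu_grid * mu_marginal))
```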

  14. Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel

    2005-01-01

    Contents include the following: JWST/ISIM introduction. Design and analysis challenges for ISIM bonded joints. JWST/ISIM joint designs. Bonded joint analysis. Finite element modeling. Failure criteria and margin calculation. Analysis/test correlation procedure. Example of test data and analysis.

  15. Bayesian Inference in Probabilistic Risk Assessment -- The Current State of the Art

    SciTech Connect

    Dana L. Kelly; Curtis L. Smith

    2009-02-01

    Markov chain Monte Carlo approaches to sampling directly from the joint posterior distribution of aleatory model parameters have led to tremendous advances in Bayesian inference capability in a wide variety of fields, including probabilistic risk analysis. The advent of freely available software coupled with inexpensive computing power has catalyzed this advance. This paper examines where the risk assessment community is with respect to implementing modern computational-based Bayesian approaches to inference. Through a series of examples in different topical areas, it introduces salient concepts and illustrates the practical application of Bayesian inference via Markov chain Monte Carlo sampling to a variety of important problems.

  16. Bayesian uncertainty analysis for advanced seismic imaging - Application to the Mentelle Basin, Australia

    NASA Astrophysics Data System (ADS)

    Michelioudakis, Dimitrios G.; Hobbs, Richard W.; Caiado, Camila C. S.

    2016-04-01

    multivariate posterior distribution. The novelty of our approach, and the major difference compared with the traditional semblance-spectrum velocity analysis procedure, is the calculation of the uncertainty of the output model. As the model is able to estimate the credibility intervals of the corresponding interval velocities, we can produce the most probable PSDM images in an iterative manner. The depths extracted using our statistical algorithm are in very good agreement with the key horizons retrieved from the drilled core DSDP-258, showing that the Bayesian model is able to control the depth migration of the seismic data and to estimate the uncertainty in depth to the drilling targets.

  17. A parametric Bayesian combination of local and regional information in flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Seidou, O.; Ouarda, T. B. M. J.; Barbet, M.; Bruneau, P.; BobéE, B.

    2006-11-01

    Because of their impact on hydraulic structure design as well as on floodplain management, flood quantiles must be estimated with the highest precision given the available information. If the site of interest has been monitored for a sufficiently long period (more than 30-40 years), at-site frequency analysis can be used to estimate flood quantiles with fair precision. Otherwise, regional estimation may be used to mitigate the lack of data, but local information is then ignored. A commonly used approach to combining at-site and regional information is linear empirical Bayes estimation: under the assumption that both the local and the regional flood quantile estimators have a normal distribution, the empirical Bayes estimator of the true quantile is the weighted average of the two estimates, with the weighting factor for each estimator inversely proportional to its variance (see the sketch below). We propose in this paper an alternative Bayesian method for combining local and regional information which provides the full probability density of quantiles and parameters. The application of the method is made with the generalized extreme value (GEV) distribution, but it can be extended to other types of extreme value distributions. In this method the prior distributions are obtained using a regional log-linear regression model, and then local observations are used within a Markov chain Monte Carlo algorithm to infer the posterior distributions of parameters and quantiles. Unlike the empirical Bayes approach, the proposed method works even with a single local observation. It also relaxes the hypothesis of normality of the local quantiles' probability distribution. The performance of the proposed methodology is compared to that of the local, regional, and empirical Bayes estimators on three generated regional data sets with different statistical characteristics. The results show that (1) when the regional log-linear model is unbiased, the proposed method gives better estimations of the GEV quantiles and
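
    The linear empirical Bayes baseline described above is a one-line computation: the combined quantile is the variance-weighted average of the local and regional estimates. A minimal sketch with invented numbers:

```python
import numpy as np

def empirical_bayes_quantile(q_local, var_local, q_regional, var_regional):
    """Linear empirical Bayes combination: weights inversely
    proportional to each estimator's variance."""
    w_local = (1.0 / var_local) / (1.0 / var_local + 1.0 / var_regional)
    q = w_local * q_local + (1.0 - w_local) * q_regional
    var = 1.0 / (1.0 / var_local + 1.0 / var_regional)
    return q, var

# Short local record (high variance) vs a precise regional regression estimate
q, var = empirical_bayes_quantile(q_local=850.0, var_local=90.0**2,
                                  q_regional=780.0, var_regional=40.0**2)
print(f"combined 100-year flood: {q:.0f} m^3/s (sd {np.sqrt(var):.0f})")
```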

  18. Sea-level variability in tide-gauge and geological records: An empirical Bayesian analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Kopp, R. E.; Hay, C.; Morrow, E.; Mitrovica, J. X.; Horton, B.; Kemp, A.

    2013-12-01

    Sea level varies at a range of temporal and spatial scales, and understanding all its significant sources of variability is crucial to building sea-level rise projections relevant to local decision-making. In the twentieth-century record, sites along the U.S. east coast have exhibited typical year-to-year variability of several centimeters. A faster-than-global increase in sea-level rise in the northeastern United States since about 1990 has led some to hypothesize a 'sea-level rise hot spot' in this region, perhaps driven by a trend in the Atlantic Meridional Overturning Circulation related to anthropogenic climate change [1]. However, such hypotheses must be evaluated in the context of natural variability, as revealed by observational and paleo-records. Bayesian and empirical Bayesian statistical approaches are well suited for assimilating data from diverse sources, such as tide-gauges and peats, with differing data availability and uncertainties, and for identifying regionally covarying patterns within these data. We present empirical Bayesian analyses of twentieth-century tide gauge data [2]. We find that the mid-Atlantic region of the United States has experienced a clear acceleration of sea level relative to the global average since about 1990, but this acceleration does not appear to be unprecedented in the twentieth-century record. The rate and extent of this acceleration instead appears comparable to an acceleration observed in the 1930s and 1940s. Both during the earlier episode of acceleration and today, the effect appears to be significantly positively correlated with the Atlantic Multidecadal Oscillation and likely negatively correlated with the North Atlantic Oscillation [2]. The Holocene and Common Era database of geological sea-level rise proxies [3,4] may allow these relationships to be assessed beyond the span of the direct observational record. At a global scale, similar approaches can be employed to look for the spatial fingerprints of land ice

  19. Motion analysis of knee joint using dynamic volume images

    NASA Astrophysics Data System (ADS)

    Haneishi, Hideaki; Kohno, Takahiro; Suzuki, Masahiko; Moriya, Hideshige; Mori, Sin-ichiro; Endo, Masahiro

    2006-03-01

    Acquisition and analysis of the three-dimensional movement of the knee joint is desired in orthopedic surgery. We have developed two methods to obtain dynamic volume images of the knee joint. One is a 2D/3D registration method combining bi-plane dynamic X-ray fluoroscopy with a static three-dimensional CT; the other uses so-called 4D-CT, with a cone beam and a wide 2D detector. In this paper, we present two analyses of knee joint movement obtained by these methods: (1) transition of the nearest points between the femur and tibia, and (2) principal component analysis (PCA) of six parameters representing the three-dimensional movement of the knee. As preprocessing for the analysis, the femur and tibia regions are first extracted from the volume data at each time frame; the tibia is then registered between frames by an affine transformation consisting of rotation and translation. The same transformation is applied to the femur as well. Using those image data, the movement of the femur relative to the tibia can be analyzed. Six movement parameters of the femur, consisting of three translation parameters and three rotation parameters, are obtained from those images. In analysis (1), the axis of each bone is first found and the flexion angle of the knee joint is calculated. For each flexion angle, the minimum distance between femur and tibia and the location giving the minimum distance are found for both the lateral condyle and the medial condyle. As a result, it was observed that the movement of the lateral condyle is larger than that of the medial condyle. In analysis (2), it was found that the movement of the knee can be represented by the first three principal components with a precision of 99.58%, and those three components appear to relate strongly to three major movements of the femur in the knee bend known in orthopedic surgery.
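
    A sketch of the PCA step in analysis (2), with random placeholder data standing in for the six movement parameters; the cumulative explained variance of the first three components is the quantity reported above.

    ```python
    import numpy as np

    # Rows are time frames, columns the six movement parameters
    # (3 translations, 3 rotations) of the femur relative to the tibia.
    X = np.random.default_rng(1).normal(size=(120, 6))   # placeholder data
    Xc = X - X.mean(axis=0)                              # center each parameter
    _, s, _ = np.linalg.svd(Xc, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    print(np.cumsum(explained)[:3])   # variance captured by the first 3 PCs
    ```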

  20. Joint association analysis of bivariate quantitative and qualitative traits.

    PubMed

    Yuan, Mengdie; Diao, Guoqing

    2011-01-01

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs whose minor allele frequency (MAF) is not too small. PMID:22373162
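
    To make the latent-variable formulation concrete, here is a small simulation sketch of the joint model: the genotype affects both the quantitative trait and a latent liability, and the qualitative trait is the thresholded liability. The effect sizes, MAF, correlation, and threshold are all hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n, maf, beta_q, beta_l, rho, thresh = 2000, 0.3, 0.2, 0.15, 0.5, 0.0
    g = rng.binomial(2, maf, size=n)                  # SNP genotype (0/1/2)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    e = rng.multivariate_normal([0, 0], cov, size=n)  # correlated residuals
    q = beta_q * g + e[:, 0]                          # quantitative trait
    d = (beta_l * g + e[:, 1] > thresh).astype(int)   # probit qualitative trait
    print(q.mean(), d.mean())
    ```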

  1. A Bayesian Approach to Person Fit Analysis in Item Response Theory Models. Research Report.

    ERIC Educational Resources Information Center

    Glas, Cees A. W.; Meijer, Rob R.

    A Bayesian approach to the evaluation of person fit in item response theory (IRT) models is presented. In a posterior predictive check, the observed value on a discrepancy variable is positioned in its posterior distribution. In a Bayesian framework, a Markov Chain Monte Carlo procedure can be used to generate samples of the posterior distribution…

  2. Integrated survival analysis using an event-time approach in a Bayesian framework.

    PubMed

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-02-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, which may be interval-censored, with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits the necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piecewise-constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on the mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the

  3. Integrated survival analysis using an event-time approach in a Bayesian framework

    PubMed Central

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, which may be interval-censored, with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits the necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piecewise-constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on the mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the

  4. Analysis of regression confidence intervals and Bayesian credible intervals for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Lu, Dan; Ye, Ming; Hill, Mary C.

    2012-09-01

    Confidence intervals based on classical regression theories augmented to include prior information and credible intervals based on Bayesian theories are conceptually different ways to quantify parametric and predictive uncertainties. Because both confidence and credible intervals are used in environmental modeling, we seek to understand their differences and similarities. This is of interest in part because calculating confidence intervals typically requires tens to thousands of model runs, while Bayesian credible intervals typically require tens of thousands to millions of model runs. Given multi-Gaussian distributed observation errors, our theoretical analysis shows that, for linear or linearized-nonlinear models, confidence and credible intervals are always numerically identical when consistent prior information is used. For nonlinear models, nonlinear confidence and credible intervals can be numerically identical if parameter confidence regions defined using the approximate likelihood method and parameter credible regions estimated using Markov chain Monte Carlo realizations are numerically identical and predictions are a smooth, monotonic function of the parameters. Both occur if intrinsic model nonlinearity is small. While the conditions of Gaussian errors and small intrinsic model nonlinearity are violated by many environmental models, heuristic tests using analytical and numerical models suggest that linear and nonlinear confidence intervals can be useful approximations of uncertainty even under significantly nonideal conditions. In the context of epistemic model error for a complex synthetic nonlinear groundwater problem, the linear and nonlinear confidence and credible intervals for individual models performed similarly enough to indicate that the computationally frugal confidence intervals can be useful in many circumstances. Experiences with these groundwater models are expected to be broadly applicable to many environmental models. We suggest that for

  5. A joint analysis of AMI and CARMA observations of the recently discovered SZ galaxy cluster system AMI-CL J0300+2613

    NASA Astrophysics Data System (ADS)

    AMI Consortium; Shimwell, Timothy W.; Carpenter, John M.; Feroz, Farhan; Grainge, Keith J. B.; Hobson, Michael P.; Hurley-Walker, Natasha; Lasenby, Anthony N.; Olamaie, Malak; Perrott, Yvette C.; Pooley, Guy G.; Rodríguez-Gonzálvez, Carmen; Rumsey, Clare; Saunders, Richard D. E.; Schammel, Michel P.; Scott, Paul F.; Titterington, David J.; Waldram, Elizabeth M.

    2013-08-01

    We present Combined Array for Research in Millimeter-wave Astronomy (CARMA) observations of a massive galaxy cluster discovered in the Arcminute Microkelvin Imager (AMI) blind Sunyaev-Zel'dovich (SZ) survey. Without knowledge of the cluster redshift, a Bayesian analysis of the AMI, CARMA and joint AMI and CARMA uv-data is used to quantify the detection significance and parametrize both the physical and observational properties of the cluster whilst accounting for the statistics of primary cosmic microwave background anisotropies, receiver noise and radio sources. The joint analysis of the AMI and CARMA uv-data was performed with two parametric physical cluster models: the β-model; and the model described in Olamaie et al. with the pressure profile fixed according to Arnaud et al. The cluster mass derived from these different models is comparable but our Bayesian evidences indicate a preference for the β-profile which we, therefore, use throughout our analysis. From the CARMA data alone we obtain a formal Bayesian probability of detection ratio of 12.8:1 when assuming that a cluster exists within our search area; alternatively, assuming that Jenkins et al. accurately predict the number of clusters as a function of mass and redshift, the formal Bayesian probability of detection is 0.29:1. From the Bayesian analysis of the AMI or AMI and CARMA data the probability of detection ratio exceeds 4.5 × 10³:1. Performing a joint analysis of the AMI and CARMA data with a physical cluster model we derive the total mass internal to r_200 as M_T,200 = (4.1 ± 1.1) × 10¹⁴ M⊙. Using a phenomenological β-model to quantify the temperature decrement as a function of angular distance we find a central SZ temperature decrement of 170 ± 24 μK in the AMI and CARMA data. The SZ decrement in the CARMA data is weaker than expected and we speculate that this is a consequence of the cluster morphology. In a forthcoming study the pipeline that we have developed for the analyses of these

  6. Joint analysis of the seismic data and velocity gravity model

    NASA Astrophysics Data System (ADS)

    Belyakov, A. S.; Lavrov, V. S.; Muchamedov, V. A.; Nikolaev, A. V.

    2016-03-01

    We performed a joint analysis of the seismic noise recorded at the Japanese Ogasawara station (OSW), located on Titijima Island in the Philippine Sea, using an STS-2 seismograph during the winter period of January 1-15, 2015, against the background of a velocity gravity model. The graphs prove the existence of a cause-and-effect relation between the seismic noise and gravity and allow us to consider it as a desired signal.

  7. The Phylogeographic History of the New World Screwworm Fly, Inferred by Approximate Bayesian Computation Analysis

    PubMed Central

    Azeredo-Espin, Ana Maria L.

    2013-01-01

    Insect pest phylogeography might be shaped both by biogeographic events and by human influence. Here, we conducted an approximate Bayesian computation (ABC) analysis to investigate the phylogeography of the New World screwworm fly, Cochliomyia hominivorax, with the aim of understanding its population history and its order and time of divergence. Our ABC analysis supports the view that populations spread from north to south in the Americas in at least two different periods. The first split occurred between the North/Central American and South American populations at the end of the Last Glacial Maximum (15,300-19,000 YBP). The second split occurred between the North and South Amazonian populations at the transition between the Pleistocene and the Holocene (9,100-11,000 YBP). The species also experienced population expansion. Phylogenetic analysis likewise suggests this north-to-south colonization, and Maxent models suggest an increase in the number of suitable areas in South America from the past to the present. We found that the phylogeographic patterns observed in C. hominivorax cannot be explained by climatic oscillations alone and can be connected to host population histories. Interestingly, these patterns coincide closely with general patterns of ancient human movements in the Americas, suggesting that humans might have played a crucial role in shaping the distribution and population structure of this insect pest. This work presents the first hypothesis test regarding the processes that shaped the current phylogeographic structure of C. hominivorax and represents an alternate perspective on investigating the problem of insect pests. PMID:24098436

  8. Bayesian Inference for Neural Electromagnetic Source Localization: Analysis of MEG Visual Evoked Activity

    SciTech Connect

    George, J.S.; Schmidt, D.M.; Wood, C.C.

    1999-02-01

    We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e., consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented.

  9. A Bayesian Semi-parametric Approach for the Differential Analysis of Sequence Counts Data.

    PubMed

    Guindani, Michele; Sepúlveda, Nuno; Paulino, Carlos Daniel; Müller, Peter

    2014-04-01

    Data obtained using modern sequencing technologies are often summarized by recording the frequencies of observed sequences. Examples include the analysis of T cell counts in immunological research and studies of gene expression based on counts of RNA fragments. In both cases the items being counted are sequences, of proteins and base pairs, respectively. The resulting sequence-abundance distribution is usually characterized by overdispersion. We propose a Bayesian semi-parametric approach to implement inference for such data. Besides modeling the overdispersion, the approach also takes into account two related sources of bias that are usually associated with sequence counts data: some sequence types may not be recorded during the experiment, and the total count may differ from one experiment to another. We illustrate our methodology with two data sets, one regarding the analysis of CD4+ T cell counts in healthy and diabetic mice and another concerning the comparison of mRNA fragments recorded in a Serial Analysis of Gene Expression (SAGE) experiment with gastrointestinal tissue of healthy and cancer patients. PMID:24833809

  10. Efficient Methods for Bayesian Uncertainty Analysis and Global Optimization of Computationally Expensive Environmental Models

    NASA Astrophysics Data System (ADS)

    Shoemaker, Christine; Espinet, Antoine; Pang, Min

    2015-04-01

    Models of complex environmental systems can be computationally expensive because they must describe the dynamic interactions of many components over a sizeable time period. Diagnostics of these systems can include forward simulations of calibrated models under uncertainty and analysis of alternatives for systems management. This discussion will focus on applications of new surrogate optimization and uncertainty analysis methods to environmental models that can enhance our ability to extract information and understanding. For complex models, optimization and especially uncertainty analysis can require a large number of model simulations, which is not feasible for computationally expensive models. Surrogate response surfaces can be used in global optimization and uncertainty methods to obtain accurate answers with far fewer model evaluations, making the methods practical for computationally expensive models for which conventional methods are not feasible. In this paper we discuss the application of the SOARS surrogate method for estimating Bayesian posterior density functions of model parameters for a TOUGH2 model of geologic carbon sequestration. We also briefly discuss a new parallel surrogate global optimization algorithm, applied to two groundwater remediation sites, that was implemented on a supercomputer with up to 64 processors. The applications illustrate the use of these methods to predict the impact of monitoring and management on subsurface contaminants.

  11. A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis.

    PubMed

    Down, Thomas A; Rakyan, Vardhman K; Turner, Daniel J; Flicek, Paul; Li, Heng; Kulesha, Eugene; Gräf, Stefan; Johnson, Nathan; Herrero, Javier; Tomazou, Eleni M; Thorne, Natalie P; Bäckdahl, Liselotte; Herberth, Marlis; Howe, Kevin L; Jackson, David K; Miretti, Marcos M; Marioni, John C; Birney, Ewan; Hubbard, Tim J P; Durbin, Richard; Tavaré, Simon; Beck, Stephan

    2008-07-01

    DNA methylation is an indispensable epigenetic modification required for regulating the expression of mammalian genomes. Immunoprecipitation-based methods for DNA methylome analysis are rapidly shifting the bottleneck in this field from data generation to data analysis, necessitating the development of better analytical tools. In particular, an inability to estimate absolute methylation levels remains a major analytical difficulty associated with immunoprecipitation-based DNA methylation profiling. To address this issue, we developed a cross-platform algorithm, the Bayesian tool for methylation analysis (Batman), for analyzing methylated DNA immunoprecipitation (MeDIP) profiles generated using oligonucleotide arrays (MeDIP-chip) or next-generation sequencing (MeDIP-seq). We developed the latter approach to provide a high-resolution whole-genome DNA methylation profile (DNA methylome) of a mammalian genome. The strong correlation of our data, obtained using mature human spermatozoa, with those obtained using bisulfite sequencing suggests that combining MeDIP-seq or MeDIP-chip with Batman provides a robust, quantitative and cost-effective functional genomic strategy for elucidating the function of DNA methylation. PMID:18612301

  12. Application of Bayesian and cost benefit risk analysis in water resources management

    NASA Astrophysics Data System (ADS)

    Varouchakis, E. A.; Palogos, I.; Karatzas, G. P.

    2016-03-01

    Decision making is a significant tool in water resources management applications. This technical note approaches a decision dilemma that has not previously been considered in the water resources management of a watershed. A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined is a scaled parabolic fine that varies with the degree of over-pumping violation, in contrast to common practices that usually consider short-term fines. The methodological steps are presented analytically, together with originally developed code; such an application, in this level of detail, is new. The results indicate that uncertainty in the probabilities is the driving issue that determines the optimal decision under each methodology, and depending on how the unknown probabilities are handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause inside an audit interval. In contrast to practices that assess the effect of each proposed action separately, considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits.
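
    A toy expected-cost comparison in the spirit of the Bayesian decision analysis described above; all costs, fines, and prior probabilities are hypothetical placeholders, not values from the study.

    ```python
    import numpy as np

    p_shortage = np.array([0.2, 0.5, 0.8])   # scenario probabilities of a violation
    prior = np.array([0.3, 0.5, 0.2])        # prior weights over the scenarios

    cost_build = 5.0e6                        # hypothetical reservoir construction cost
    fine_per_violation, n_audits = 2.0e4, 300
    cost_no_build = p_shortage * fine_per_violation * n_audits

    exp_cost_build = cost_build               # fines assumed avoided if the reservoir is built
    exp_cost_no_build = np.sum(prior * cost_no_build)
    print("build" if exp_cost_build < exp_cost_no_build else "do not build",
          exp_cost_build, exp_cost_no_build)
    ```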

  13. Bayesian sequential meta-analysis design in evaluating cardiovascular risk in a new antidiabetic drug development program.

    PubMed

    Chen, Ming-Hui; Ibrahim, Joseph G; Amy Xia, H; Liu, Thomas; Hennessey, Violeta

    2014-04-30

    Recently, the Center for Drug Evaluation and Research at the Food and Drug Administration released a guidance that makes recommendations about how to demonstrate that a new antidiabetic therapy to treat type 2 diabetes is not associated with an unacceptable increase in cardiovascular risk. One of the recommendations from the guidance is that phases II and III trials should be appropriately designed and conducted so that a meta-analysis can be performed. In addition, the guidance implies that a sequential meta-analysis strategy could be adopted. That is, the initial meta-analysis could aim at demonstrating the upper bound of a 95% confidence interval (CI) for the estimated hazard ratio to be < 1.8 for the purpose of enabling a new drug application or a biologics license application. Subsequently after the marketing authorization, a final meta-analysis would need to show the upper bound to be < 1.3. In this context, we develop a new Bayesian sequential meta-analysis approach using survival regression models to assess whether the size of a clinical development program is adequate to evaluate a particular safety endpoint. We propose a Bayesian sample size determination methodology for sequential meta-analysis clinical trial design with a focus on controlling the familywise type I error rate and power. We use the partial borrowing power prior to incorporate the historical survival meta-data into the Bayesian design. We examine various properties of the proposed methodology, and simulation-based computational algorithms are developed to generate predictive data at various interim analyses, sample from the posterior distributions, and compute various quantities such as the power and the type I error in the Bayesian sequential meta-analysis trial design. We apply the proposed methodology to the design of a hypothetical antidiabetic drug development program for evaluating cardiovascular risk. PMID:24343859
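
    A sketch of the interim decision rule described above: with a normal approximation for the estimated log hazard ratio and a simple normal prior standing in (loosely) for the partial-borrowing power prior, check whether the upper bound of the 95% credible interval falls below 1.8. All numbers are hypothetical.

    ```python
    import numpy as np

    obs_loghr, obs_se = np.log(1.10), 0.15   # hypothetical meta-analytic estimate and SE
    prior_mean, prior_sd = 0.0, 0.5          # hypothetical prior on the true log HR

    # Conjugate normal-normal update for the log hazard ratio
    post_var = 1 / (1 / prior_sd**2 + 1 / obs_se**2)
    post_mean = post_var * (prior_mean / prior_sd**2 + obs_loghr / obs_se**2)
    upper95 = post_mean + 1.96 * np.sqrt(post_var)
    print(np.exp(upper95), np.exp(upper95) < 1.8)   # HR upper bound vs. the 1.8 margin
    ```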

  14. The analysis of adhesively bonded advanced composite joints using joint finite elements

    NASA Astrophysics Data System (ADS)

    Stapleton, Scott E.

    The design and sizing of adhesively bonded joints has always been a major bottleneck in the design of composite vehicles. Dense finite element (FE) meshes are required to capture the full behavior of a joint numerically, but these dense meshes are impractical in vehicle-scale models, where a coarse mesh is more desirable for making quick assessments and comparisons of different joint geometries. Analytical models are often helpful in sizing, but difficulties arise in coupling these models with full-vehicle FE models. Therefore, a joint FE was created which can be used within structural FE models to make quick assessments of bonded composite joints. The shape functions of the joint FE were found by solving the governing equations for a structural model of a joint. By analytically determining the shape functions of the joint FE, the complex joint behavior can be captured with very few elements. This joint FE was modified and used to consider adhesives with functionally graded material properties to reduce the peel stress concentrations located near adherend discontinuities. Several practical concerns impede the actual use of such adhesives. These include increased manufacturing complications, alterations to the grading due to adhesive flow during manufacturing, and whether changing the loading conditions significantly impacts the effectiveness of the grading. An analytical study is conducted to address these three concerns. Furthermore, proof-of-concept testing is conducted to show the potential advantages of functionally graded adhesives. In this study, grading is achieved by strategically placing glass beads within the adhesive layer at different densities along the joint. Furthermore, the capability to model non-linear adhesive constitutive behavior with large rotations was developed, and progressive failure of the adhesive was modeled by re-meshing the joint as the adhesive fails. Results predicted using the joint FE were compared with experimental results for various

  15. The Analysis of Adhesively Bonded Advanced Composite Joints Using Joint Finite Elements

    NASA Technical Reports Server (NTRS)

    Stapleton, Scott E.; Waas, Anthony M.

    2012-01-01

    The design and sizing of adhesively bonded joints has always been a major bottleneck in the design of composite vehicles. Dense finite element (FE) meshes are required to capture the full behavior of a joint numerically, but these dense meshes are impractical in vehicle-scale models, where a coarse mesh is more desirable for making quick assessments and comparisons of different joint geometries. Analytical models are often helpful in sizing, but difficulties arise in coupling these models with full-vehicle FE models. Therefore, a joint FE was created which can be used within structural FE models to make quick assessments of bonded composite joints. The shape functions of the joint FE were found by solving the governing equations for a structural model of a joint. By analytically determining the shape functions of the joint FE, the complex joint behavior can be captured with very few elements. This joint FE was modified and used to consider adhesives with functionally graded material properties to reduce the peel stress concentrations located near adherend discontinuities. Several practical concerns impede the actual use of such adhesives. These include increased manufacturing complications, alterations to the grading due to adhesive flow during manufacturing, and whether changing the loading conditions significantly impacts the effectiveness of the grading. An analytical study is conducted to address these three concerns. Furthermore, proof-of-concept testing is conducted to show the potential advantages of functionally graded adhesives. In this study, grading is achieved by strategically placing glass beads within the adhesive layer at different densities along the joint. Furthermore, the capability to model non-linear adhesive constitutive behavior with large rotations was developed, and progressive failure of the adhesive was modeled by re-meshing the joint as the adhesive fails. Results predicted using the joint FE were compared with experimental results for various

  16. A BAYESIAN HIERARCHICAL SPATIAL POINT PROCESS MODEL FOR MULTI-TYPE NEUROIMAGING META-ANALYSIS

    PubMed Central

    Kang, Jian; Nichols, Thomas E.; Wager, Tor D.; Johnson, Timothy D.

    2014-01-01

    Neuroimaging meta-analysis is an important tool for finding consistent effects over studies that each usually have 20 or fewer subjects. Interest in meta-analysis in brain mapping is also driven by a recent focus on so-called “reverse inference”: whereas traditional “forward inference” identifies the regions of the brain involved in a task, a reverse inference identifies the cognitive processes that a task engages. Such reverse inferences, however, require a set of meta-analyses, one for each possible cognitive domain. Existing methods for neuroimaging meta-analysis have significant limitations: commonly used methods are not model based, do not provide interpretable parameter estimates, and only produce null hypothesis inferences; further, they are generally designed for a single group of studies and cannot produce reverse inferences. In this work we address these limitations by adopting a non-parametric Bayesian approach for meta-analysis of data from multiple classes or types of studies. In particular, foci from each type of study are modeled as a cluster process driven by a random intensity function that is modeled as a kernel convolution of a gamma random field. The type-specific gamma random fields are linked and modeled as a realization of a common gamma random field, shared by all types, that induces correlation between study types and mimics the behavior of a univariate mixed effects model. We illustrate our model on simulation studies and a meta-analysis of five emotions from 219 studies and check model fit by a posterior predictive assessment. In addition, we implement reverse inference by using the model to predict study type from a newly presented study. We evaluate this predictive performance via leave-one-out cross-validation, which is efficiently implemented using importance sampling techniques. PMID:25426185

  17. Joint optimization of algorithmic suites for EEG analysis.

    PubMed

    Santana, Eder; Brockmeier, Austin J; Principe, Jose C

    2014-01-01

    Electroencephalogram (EEG) data analysis algorithms consist of multiple processing steps each with a number of free parameters. A joint optimization methodology can be used as a wrapper to fine-tune these parameters for the patient or application. This approach is inspired by deep learning neural network models, but differs because the processing layers for EEG are heterogeneous with different approaches used for processing space and time. Nonetheless, we treat the processing stages as a neural network and apply backpropagation to jointly optimize the parameters. This approach outperforms previous results on the BCI Competition II - dataset IV; additionally, it outperforms the common spatial patterns (CSP) algorithm on the BCI Competition III dataset IV. In addition, the optimized parameters in the architecture are still interpretable. PMID:25570621
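
    A minimal sketch of the core idea, treating heterogeneous EEG stages (spatial filter, temporal filter, feature, classifier) as one differentiable pipeline and backpropagating through all stages jointly. Shapes, stage choices, and hyperparameters are illustrative, not the paper's exact suite.

    ```python
    import torch

    n_trials, n_channels, n_samples = 64, 22, 250
    x = torch.randn(n_trials, n_channels, n_samples)   # placeholder EEG trials
    y = torch.randint(0, 2, (n_trials,)).float()       # placeholder labels

    spatial = torch.nn.Linear(n_channels, 4, bias=False)              # spatial filters
    temporal = torch.nn.Conv1d(4, 4, kernel_size=15, padding=7, groups=4)  # temporal filters
    classifier = torch.nn.Linear(4, 1)

    params = [*spatial.parameters(), *temporal.parameters(), *classifier.parameters()]
    opt = torch.optim.Adam(params, lr=1e-2)
    for _ in range(50):
        z = spatial(x.transpose(1, 2)).transpose(1, 2)   # (trials, 4, samples)
        z = temporal(z)
        feat = torch.log(z.pow(2).mean(dim=2) + 1e-6)    # log band power per filter
        loss = torch.nn.functional.binary_cross_entropy_with_logits(
            classifier(feat).squeeze(1), y)
        opt.zero_grad(); loss.backward(); opt.step()     # joint update of all stages
    ```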

  18. Experimental analysis of a joint free space cryptosystem

    NASA Astrophysics Data System (ADS)

    Ramírez, John Fredy Barrera; Osorio, Alexis Jaramillo; Zea, Alejandro Vélez; Torroba, Roberto

    2016-08-01

    In this paper, we analyze a joint free space cryptosystem scheme implemented in an actual laboratory environment. In this encrypting architecture, the object to be encoded and the security key are placed side by side in the input plane, without optical elements between the input and the output planes. In order to obtain the encrypted information, the joint Fresnel power distribution (JFPD) coming from the input plane is registered with a CMOS camera. The information of the encrypting key is registered with an off-axis Fresnel holographic setup. The data registered with the experimental setup are digitally filtered to obtain the encrypted object and the encryption key. In addition, we explore the performance of the experimental system as a function of the object-camera and key-camera distances, two new parameters of interest that become available as a result of this encrypting scheme. The theoretical and experimental analysis shows the validity and applicability of the cryptosystem.

  19. The Bayesian approach to reporting GSR analysis results: some first-hand experiences

    NASA Astrophysics Data System (ADS)

    Charles, Sebastien; Nys, Bart

    2010-06-01

    The use of Bayesian principles in the reporting of forensic findings has been a matter of interest for some years. Recently, the GSR community has also been gradually exploring the advantages of this method, or rather approach, for writing reports. Since last year, our GSR group has been adapting its reporting procedures to the use of Bayesian principles. The police and magistrates find the reports more directly accessible and useful in their part of the criminal investigation. In the lab, we find that applying the Bayesian principles allows unnecessary analyses to be eliminated, freeing time on the instruments.

  20. A three-dimensional finite element analysis of finger joint stresses in the MCP joint while performing common tasks.

    PubMed

    Butz, Kent D; Merrell, Greg; Nauman, Eric A

    2012-09-01

    The goal of this study was to develop a three-dimensional finite element model of the metacarpophalangeal (MCP) joint to characterize joint contact stresses incurred during common daily activities. The metacarpal and proximal phalanx were modeled using a COMSOL-based finite element analysis. Muscle forces determined from a static force analysis of two common activities (pen grip and carrying a weight) were applied to the simulation to characterize the surface stress distributions at the MCP joint. The finite element analysis predicted that stresses as high as 1.9 MPa, similar in magnitude to stresses experienced at the hip, may be experienced by the subchondral bone in the MCP joint. The internal structure and material properties of the phalanges were found to play a significant role in both the magnitude and distribution of stresses, but the dependence on cancellous bone modulus was not as severe as predicted by previous two dimensional models. PMID:23997746

  1. Bayesian analysis of equivalent sound sources for a military jet aircraft

    NASA Astrophysics Data System (ADS)

    Hart, David

    2012-10-01

    Radiated jet noise is believed to be generated by a mixture of fine-scale turbulent structures (FSS) and large-scale turbulent structures (LSS). In previous work, the noise from an F-22A Raptor has been modeled as two sets of monopole sources whose characteristics account for both FSS and LSS sound propagation [Morgan, J. Acoust. Soc. Am. 129, 2442 (2011)]. The source parameters are manually adjusted until the calculations reproduce the measured field along a surface. Once this has been done, the equivalent source of monopoles can be used to further analyze the sound field around the jet. In order to automate this process, parameters are selected based on Bayesian methods implemented with simulated annealing and fast Gibbs sampler algorithms. This method yields the best-fit parameters and the sensitivity of the solution, based on the generated posterior probability distributions (PPDs). For example, analysis has shown that the peak source region of the LSS is more important than the peak source region of the FSS. Further analysis of the generated PPDs will give greater insight into the nature of the radiated jet noise.

  2. Bayesian analysis of uncertainty in predisposing and triggering factors for landslides hazard assessment

    NASA Astrophysics Data System (ADS)

    Sandric, I.; Petropoulos, Y.; Chitu, Z.; Mihai, B.

    2012-04-01

    The landslide hazard analysis model takes into consideration both predisposing and triggering factors, combined into a Bayesian temporal network with uncertainty propagation. The model uses as predisposing factors the first and second derivatives of the DEM, the effective precipitation, runoff, lithology and land use. The latter is expressed not as land use classes, as for example CORINE, but as leaf area index (LAI). The LAI offers the advantage of modelling not just the changes between different time periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI index was derived from Landsat time series images from 1984 up to 2011. All the images available for the Panatau administrative unit in Buzau County, Romania, have been downloaded from http://earthexplorer.usgs.gov, including the images with cloud cover. The model is run at a monthly time step, and for each time step all the parameter values and the a priori, conditional and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that have occurred during the period up to the active time step and compares the recorded probabilities and parameter values for those time steps with the values of the active time step. Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created

  3. Bayesian Normalization Model for Label-Free Quantitative Analysis by LC-MS

    PubMed Central

    Nezami Ranjbar, Mohammad R.; Tadesse, Mahlet G.; Wang, Yue; Ressom, Habtom W.

    2016-01-01

    We introduce a new method for normalization of data acquired by liquid chromatography coupled with mass spectrometry (LC-MS) in label-free differential expression analysis. Normalization of LC-MS data is desired prior to subsequent statistical analysis to adjust variabilities in ion intensities that are caused not by biological differences but by experimental bias. There are different sources of bias, including variabilities during sample collection and sample storage, poor experimental design, noise, etc. In addition, instrument variability in experiments involving a large number of LC-MS runs leads to a significant drift in intensity measurements. Although various methods have been proposed for normalization of LC-MS data, there is no universally applicable approach. In this paper, we propose a Bayesian normalization model (BNM) that utilizes scan-level information from LC-MS data. Specifically, the proposed method uses peak shapes to model the scan-level data acquired from extracted ion chromatograms (EIC), with parameters treated within a linear mixed effects model. We extended the model into BNM with drift (BNMD) to compensate for the variability in intensity measurements due to long LC-MS runs. We evaluated the performance of our method using synthetic and experimental data. In comparison with several existing methods, the proposed BNM and BNMD yielded significant improvement. PMID:26357332

  4. Bayesian inference in physics

    NASA Astrophysics Data System (ADS)

    von Toussaint, Udo

    2011-07-01

    Bayesian inference provides a consistent method for the extraction of information from physics experiments even in ill-conditioned circumstances. The approach provides a unified rationale for data analysis, which both justifies many of the commonly used analysis procedures and reveals some of the implicit underlying assumptions. This review summarizes the general ideas of Bayesian probability theory with emphasis on the application to the evaluation of experimental data. As case studies for Bayesian parameter estimation techniques, examples are discussed ranging from extra-solar planet detection to the deconvolution of apparatus functions for improving energy resolution, and to change-point estimation in time series. Special attention is paid to the numerical techniques suited for Bayesian analysis, with a focus on recent developments of Markov chain Monte Carlo algorithms for high-dimensional integration problems. Bayesian model comparison, the quantitative ranking of models for the explanation of a given data set, is illustrated with examples collected from cosmology, mass spectroscopy, and surface physics, covering problems such as background subtraction and automated outlier detection. Additionally, Bayesian inference techniques for the design and optimization of future experiments are introduced. Experiments, instead of being merely passive recording devices, can now be designed to adapt to measured data and to change the measurement strategy on the fly to maximize the information gained from an experiment. The key concepts and numerical tools which provide the means of designing such inference chains, together with the crucial aspects of data fusion, are summarized, and some of the expected implications are highlighted.

  5. Bayesian Statistics.

    ERIC Educational Resources Information Center

    Meyer, Donald L.

    Bayesian statistical methodology and its possible uses in the behavioral sciences are discussed in relation to the solution of problems in both the use and teaching of fundamental statistical methods, including confidence intervals, significance tests, and sampling. The Bayesian model explains these statistical methods and offers a consistent…

  6. Bayesian value-of-information analysis. An application to a policy model of Alzheimer's disease.

    PubMed

    Claxton, K; Neumann, P J; Araki, S; Weinstein, M C

    2001-01-01

    A framework is presented that distinguishes the conceptually separate decisions of which treatment strategy is optimal from the question of whether more information is required to inform this choice in the future. The authors argue that the choice of treatment strategy should be based on expected utility, and that the only valid reason to characterize the uncertainty surrounding outcomes of interest is to establish the value of acquiring additional information. A Bayesian decision-theoretic approach is demonstrated through a probabilistic analysis of a published policy model of Alzheimer's disease. The expected value of perfect information is estimated for the decision to adopt a new pharmaceutical for the population of patients with Alzheimer's disease in the United States. This provides an upper bound on the value of additional research. The value of information is also estimated for each of the model inputs. This analysis can focus future research by identifying those parameters where more precise estimates would be most valuable and indicating whether an experimental design would be required. We also discuss how this type of analysis can be used to design experimental research efficiently (identifying the optimal sample size and optimal sample allocation) based on the marginal cost and marginal benefit of sample information. Value-of-information analysis can provide a measure of the expected payoff from proposed research, which can be used to set priorities in research and development. It can also inform an efficient regulatory framework for new healthcare technologies: an analysis of the value of information would define when a claim for a new technology should be deemed substantiated and when evidence should be considered competent and reliable because it is not cost-effective to gather any more information. PMID:11329844
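
    The expected value of perfect information has a direct Monte Carlo form: EVPI = E[max_a NB(a, θ)] − max_a E[NB(a, θ)], where NB is net benefit and θ the uncertain parameters. A sketch with a hypothetical two-option net-benefit model:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100000
    eff_new = rng.normal(0.30, 0.10, n)   # uncertain incremental effectiveness
    cost_new = rng.normal(4000, 800, n)   # uncertain incremental cost
    wtp = 50000                           # willingness to pay per unit of effect

    nb = np.column_stack([np.zeros(n),                 # NB of current care (reference)
                          wtp * eff_new - cost_new])   # NB of the new treatment
    evpi = np.mean(nb.max(axis=1)) - nb.mean(axis=0).max()
    print(evpi)   # expected payoff of resolving all parameter uncertainty
    ```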

  7. BAYESIAN ANALYSIS TO IDENTIFY NEW STAR CANDIDATES IN NEARBY YOUNG STELLAR KINEMATIC GROUPS

    SciTech Connect

    Malo, Lison; Doyon, Rene; Lafreniere, David; Artigau, Etienne; Gagne, Jonathan; Baron, Frederique; Riedel, Adric

    2013-01-10

    We present a new method based on a Bayesian analysis to identify new members of nearby young kinematic groups. The analysis minimally takes into account the position, proper motion, magnitude, and color of a star, but other observables can be readily added (e.g., radial velocity, distance). We use this method to find new young low-mass stars in the β Pictoris and AB Doradus moving groups and in the TW Hydrae, Tucana-Horologium, Columba, Carina, and Argus associations. Starting from a sample of 758 mid-K to mid-M (K5V-M5V) stars showing youth indicators such as Hα and X-ray emission, our analysis yields 214 new highly probable low-mass members of the kinematic groups analyzed. One is in TW Hydrae, 37 in β Pictoris, 17 in Tucana-Horologium, 20 in Columba, 6 in Carina, 50 in Argus, 32 in AB Doradus, and the remaining 51 candidates are likely young but have an ambiguous membership to more than one association. The false alarm rate for new candidates is estimated to be 5% for β Pictoris and TW Hydrae, 10% for Tucana-Horologium, Columba, Carina, and Argus, and 14% for AB Doradus. Our analysis confirms the membership of 58 stars proposed in the literature. Firm membership confirmation of our new candidates will require measurement of their radial velocity (predicted by our analysis), parallax, and lithium 6708 Å equivalent width. We have initiated these follow-up observations for a number of candidates, and we have identified two stars (2MASSJ01112542+1526214, 2MASSJ05241914-1601153) as very strong candidate members of the β Pictoris moving group and one strong candidate member (2MASSJ05332558-5117131) of the Tucana-Horologium association; these three stars have radial velocity measurements confirming their membership and lithium detections consistent with young age.

  8. Joint analysis of spikes and local field potentials using copula.

    PubMed

    Hu, Meng; Li, Mingyao; Li, Wu; Liang, Hualou

    2016-06-01

    Recent technological advances, which allow for simultaneous recording of spikes and local field potentials (LFPs) at multiple sites in a given cortical area or across different areas, have greatly increased our understanding of signal processing in brain circuits. Joint analysis of simultaneously collected spike and LFP signals is an important step to explicate how the brain orchestrates information processing. In this contribution, we present a novel statistical framework based on Gaussian copula to jointly model spikes and LFP. In our approach, we use copula to link separate, marginal regression models to construct a joint regression model, in which the binary-valued spike train data are modeled using generalized linear model (GLM) and the continuous-valued LFP data are modeled using linear regression. Model parameters can be efficiently estimated via maximum-likelihood. In particular, we show that our model offers a means to statistically detect directional influence between spikes and LFP, akin to Granger causality measure, and that we are able to assess its statistical significance by conducting a Wald test. Through extensive simulations, we also show that our method is able to reliably recover the true model used to generate the data. To demonstrate the effectiveness of our approach in real setting, we further apply the method to a mixed neural dataset, consisting of spikes and LFP simultaneously recorded from the visual cortex of a monkey performing a contour detection task. PMID:27012500
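
    An illustrative simulation of the copula construction described above: a latent bivariate normal supplies the dependence, with a probit-GLM margin for the binary spike train and a Gaussian linear-regression margin for the LFP. All parameters are hypothetical.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(4)
    n, rho = 5000, 0.4                           # time bins and latent correlation
    z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)

    stim = rng.normal(size=n)                    # shared covariate (e.g. stimulus)
    p_spike = norm.cdf(-1.0 + 0.8 * stim)        # probit-GLM marginal spike probability
    u = norm.cdf(z[:, 0])                        # uniform margin from the copula
    spikes = (u < p_spike).astype(int)           # Bernoulli margin for the spike train
    lfp = 0.5 * stim + 0.3 * z[:, 1]             # Gaussian linear-regression margin for LFP
    print(spikes.mean(), np.corrcoef(spikes, lfp)[0, 1])
    ```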

  9. IMU-Based Joint Angle Measurement for Gait Analysis

    PubMed Central

    Seel, Thomas; Raisch, Jorg; Schauer, Thomas

    2014-01-01

    This contribution is concerned with joint angle calculation based on inertial measurement data in the context of human motion analysis. Unlike most robotic devices, the human body lacks even surfaces and right angles. Therefore, we focus on methods that avoid assuming certain orientations in which the sensors are mounted with respect to the body segments. After a review of available methods that may cope with this challenge, we present a set of new methods for: (1) joint axis and position identification; and (2) flexion/extension joint angle measurement. In particular, we propose methods that use only gyroscopes and accelerometers and, therefore, do not rely on a homogeneous magnetic field. We provide results from gait trials of a transfemoral amputee in which we compare the inertial measurement unit (IMU)-based methods to an optical 3D motion capture system. Unlike most authors, we place the optical markers on anatomical landmarks instead of attaching them to the IMUs. Root mean square errors of the knee flexion/extension angles are found to be less than 1° on the prosthesis and about 3° on the human leg. For the plantar/dorsiflexion of the ankle, both deviations are about 1°. PMID:24743160
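
    A sketch of the gyroscope-only flexion/extension idea: project each segment's angular rate onto its (previously identified) joint-axis coordinates and integrate the difference. The axes and signals below are synthetic stand-ins, not the paper's data or exact algorithm.

    ```python
    import numpy as np
    from scipy.integrate import cumulative_trapezoid

    fs = 100.0                                   # sample rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    g_thigh = np.random.default_rng(5).normal(0, 0.01, (t.size, 3))  # rad/s, noise
    g_shank = g_thigh + np.outer(np.sin(2 * np.pi * 1.0 * t), [0, 0, 1])

    j_thigh = np.array([0.0, 0.0, 1.0])          # joint axis in thigh-sensor frame
    j_shank = np.array([0.0, 0.0, 1.0])          # joint axis in shank-sensor frame
    rel_rate = g_shank @ j_shank - g_thigh @ j_thigh   # relative angular rate about the axis
    angle = np.degrees(cumulative_trapezoid(rel_rate, t, initial=0.0))
    print(angle.min(), angle.max())
    ```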

  10. Analysis Method for Inelastic, Adhesively Bonded Joints with Anisotropic Adherends

    NASA Technical Reports Server (NTRS)

    Smeltzer, Stanley S., III; Klang, Eric C.

    2003-01-01

    A one-dimensional analysis method for evaluating adhesively bonded joints composed of anisotropic adherends and adhesives with nonlinear material behavior is presented. The strain and resulting stress field in a general bonded joint overlap are determined by using a variable-step, finite-difference solution algorithm to iteratively solve a system of first-order differential equations. Applied loading is given by a system of combined extensional, bending, and shear forces applied to the edge of the joint overlap. Adherends are assumed to behave as linear, cylindrically bent plates using classical laminated plate theory that includes the effects of first-order transverse shear deformation. Using the deformation theory of plasticity and a modified von Mises yield criterion, inelastic material behavior is modeled in the adhesive layer. Results for the proposed method are verified against previous results from the literature and shown to be in excellent agreement. An additional case that highlights the effects of transverse shear deformation between similar adherends is also presented.

  11. Guideline for bolted joint design and analysis : version 1.0.

    SciTech Connect

    Brown, Kevin H.; Morrow, Charles W.; Durbin, Samuel; Baca, Allen

    2008-01-01

    This document provides general guidance for the design and analysis of bolted joint connections. An overview of the current methods used to analyze bolted joint connections is given. Several methods for the design and analysis of bolted joint connections are presented. Guidance is provided for general bolted joint design, computation of preload uncertainty and preload loss, and the calculation of the bolted joint factor of safety. Axial loads, shear loads, thermal loads, and thread tear-out are used in factor of safety calculations. Additionally, limited guidance is provided for fatigue considerations. An overview of an associated Mathcad© worksheet containing all bolted joint design formulae presented is also provided.

  12. An automated land-use mapping comparison of the Bayesian maximum likelihood and linear discriminant analysis algorithms

    NASA Technical Reports Server (NTRS)

    Tom, C. H.; Miller, L. D.

    1984-01-01

    The Bayesian maximum likelihood parametric classifier has been tested against the data-based formulation designated 'linear discriminant analysis', using the 'GLIKE' decision and 'CLASSIFY' classification algorithms in the Landsat Mapping System. Identical supervised training sets, USGS land use/land cover classes, and various combinations of Landsat image and ancillary geodata variables were used to compare the algorithms' thematic mapping accuracy on a single-date summer subscene, with a cellularized USGS land use map of the same time frame furnishing the ground truth reference. CLASSIFY, which accepts a priori class probabilities, is found to be more accurate than GLIKE, which assumes equal class occurrences, for all three mapping variable sets and both levels of detail. These results may be generalized to direct accuracy, time, cost, and flexibility advantages of linear discriminant analysis over Bayesian methods.

  13. Personality and coping traits: A joint factor analysis.

    PubMed

    Ferguson, Eamonn

    2001-11-01

    OBJECTIVES: The main objective of this paper is to explore the structural similarities between Eysenck's model of personality and the dimensions of the dispositional COPE. Costa et al. (Costa, P., Somerfield, M., & McCrae, R. (1996). Personality and coping: A reconceptualisation. In Handbook of coping: Theory, research and applications (pp. 44-61). New York: Wiley) suggest that personality and coping behaviour are part of a continuum based on adaptation. If this is the case, there should be structural similarities between measures of personality and coping behaviour. This is tested using a joint factor analysis of personality and coping measures. DESIGN: Cross-sectional survey. METHODS: The EPQ-R and the dispositional COPE were administered to 154 participants, and the data were analysed using joint factor analysis and bivariate associations. RESULTS: The joint factor analysis indicated that these data were best explained by a four-factor model. One factor was primarily unrelated to personality. There was a COPE-neurotic-introvert factor (NI-COPE) containing coping behaviours such as denial, a COPE-extroversion (E-COPE) factor containing behaviours such as seeking social support, and a COPE-psychoticism factor (P-COPE) containing behaviours such as alcohol use. This factor pattern, especially for NI- and E-COPE, was interpreted in terms of Gray's model of personality (Gray, J. A. (1987). The psychology of fear and stress. Cambridge: Cambridge University Press). NI-, E-, and P-COPE were shown to be related, in a theoretically consistent manner, to perceived coping success and perceived coping functions. CONCLUSIONS: The results indicate that there are indeed conceptual links between models of personality and coping. It is argued that future research should focus on identifying coping 'trait complexes'. Implications for practice are discussed. PMID:12614507

  14. Synthetic aperture sonar imaging using joint time-frequency analysis

    NASA Astrophysics Data System (ADS)

    Wang, Genyuan; Xia, Xiang-Gen

    1999-03-01

    The non-ideal motion of the hydrophone usually induces aperture errors in the synthetic aperture sonar (SAS), which are among the most important factors degrading SAS imaging quality. In SAS imaging, the return signals are usually nonstationary due to the non-ideal hydrophone motion. In this paper, joint time-frequency analysis (JTFA), as a good technique for analyzing nonstationary signals, is used in SAS imaging. Based on the JTFA of the sonar return signals, a novel SAS imaging algorithm is proposed. The algorithm is verified by simulation examples.
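
    As a generic illustration of the JTFA building block (not the authors' imaging algorithm), the sketch below computes a short-time Fourier spectrogram of a linear chirp, the kind of nonstationary signal non-ideal platform motion produces. All signal parameters are invented.

```python
# Spectrogram of a nonstationary (chirp) signal via scipy's STFT tools.
import numpy as np
from scipy.signal import chirp, spectrogram

fs = 8000.0                                   # sampling rate [Hz]
t = np.arange(0, 1.0, 1.0 / fs)
sig = chirp(t, f0=100.0, t1=1.0, f1=1500.0)   # frequency drifts over time
f, tt, Sxx = spectrogram(sig, fs=fs, nperseg=256, noverlap=192)
ridge = f[Sxx.argmax(axis=0)]                 # dominant frequency per time slice
print(ridge[:3], ridge[-3:])                  # rises from ~100 Hz toward 1500 Hz
```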

  15. Flexible multisensor inspection system for solder-joint analysis

    NASA Astrophysics Data System (ADS)

    Lacey, Gerard; Waldron, Ronan; Dinten, Jean-Marc; Lilley, Francis

    1993-08-01

    This paper describes the design and construction of an open, automated, solder bond verification machine for the electronics manufacturing industry. The application domain is the higher-end assembly technologies, with an emphasis on fine-pitch surface mount components. The system serves a measurement function, quantifying the solder bonds, and interfaces with the manufacturing process to close the manufacturing loop. A geometric model of the solder in a joint, coupled with a finite element analysis of the physical properties of solder, leads to objective measurement of the solder. Principal illumination systems are laser, X-ray and noncoherent lighting. Open, object-oriented design and implementation practices enable a forward-looking system to be developed.

  16. A unified method for inference of tokamak equilibria and validation of force-balance models based on Bayesian analysis

    NASA Astrophysics Data System (ADS)

    von Nessi, G. T.; Hole, M. J.; the MAST Team

    2013-05-01

    A new method, based on Bayesian analysis, is presented which unifies the inference of plasma equilibria parameters in a tokamak with the ability to quantify differences between inferred equilibria and Grad-Shafranov (GS) force-balance solutions. At the heart of this technique is the new concept of weak observation, which allows multiple forward models to be associated with a single diagnostic observation. This new idea subsequently provides a means by which the space of GS solutions can be efficiently characterized via a prior distribution. The posterior evidence (a normalization constant of the inferred posterior distribution) is also inferred in the analysis and is used as a proxy for determining how relatively close inferred equilibria are to force-balance for different discharges/times. These points have been implemented in a code called BEAST (Bayesian equilibrium analysis and simulation tool), which uses a special implementation of Skilling’s nested sampling algorithm (Skilling 2006 Bayesian Anal. 1 833-59) to perform sampling and evidence calculations on high-dimensional, non-Gaussian posteriors. Initial BEAST equilibrium inference results are presented for two high-performance MAST discharges.
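
    Because the evidence calculation rests on Skilling's nested sampling, a minimal toy version of that algorithm is sketched below for a one-dimensional problem (uniform prior on [0, 1], Gaussian likelihood). BEAST's actual implementation is far more sophisticated; this only illustrates the prior-volume bookkeeping.

```python
# Toy nested sampling: estimate the evidence Z = integral of L over the prior.
import numpy as np

rng = np.random.default_rng(1)

def loglike(theta):
    return -0.5 * ((theta - 0.3) / 0.05) ** 2      # unnormalised Gaussian

n_live, n_iter = 200, 1500
live = rng.uniform(0.0, 1.0, n_live)               # draws from the prior
live_ll = loglike(live)
log_z = -np.inf
log_shell = np.log(1.0 - np.exp(-1.0 / n_live))    # width of each prior shell
for i in range(n_iter):
    worst = live_ll.argmin()                       # current likelihood floor
    log_x = -i / n_live                            # remaining prior volume
    log_z = np.logaddexp(log_z, log_shell + log_x + live_ll[worst])
    while True:                                    # replace worst live point by
        cand = rng.uniform(0.0, 1.0)               # a prior draw above the floor
        if loglike(cand) > live_ll[worst]:
            live[worst], live_ll[worst] = cand, loglike(cand)
            break
# analytic answer is log(0.05 * sqrt(2*pi)) ~ -2.08 (final live points ignored)
print("log-evidence ~", round(log_z, 3))
```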

  17. Bayesian analysis of the kinetics of quantal transmitter secretion at the neuromuscular junction.

    PubMed

    Saveliev, Anatoly; Khuzakhmetova, Venera; Samigullin, Dmitry; Skorinkin, Andrey; Kovyazina, Irina; Nikolsky, Eugeny; Bukharaeva, Ellya

    2015-10-01

    The timing of transmitter release from nerve endings is nowadays considered one of the factors determining the plasticity and efficacy of synaptic transmission. In the neuromuscular junction, the moments of release of individual acetylcholine quanta are related to the synaptic delays of uniquantal endplate currents recorded under conditions of lowered extracellular calcium. Using Bayesian modelling, we performed a statistical analysis of synaptic delays in the mouse neuromuscular junction under different patterns of rhythmic nerve stimulation and when the entry of calcium ions into the nerve terminal was modified. We obtained a statistical model of release timing represented as the sum of two independent statistical distributions. The first of these is the exponentially modified Gaussian distribution. The mixture of normal and exponential components in this distribution can be interpreted as a two-stage mechanism of early and late periods of phasic synchronous secretion. The parameters of this distribution depend on both the stimulation frequency of the motor nerve and the calcium entry conditions. The second distribution was modelled as quasi-uniform, with parameters independent of nerve stimulation frequency and calcium entry. The two different probability density functions for the distribution of synaptic delays suggest at least two independent processes controlling the time course of secretion, one of them potentially involving two stages. The relative contribution of these processes to the total number of transmitter quanta released depends differently on the motor nerve stimulation pattern and on calcium ion entry into the nerve endings. PMID:26129670
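
    The two-component delay model described above is easy to sketch by direct simulation: an exponentially modified Gaussian (the phasic component) mixed with a quasi-uniform tail (delayed release). Every parameter value below is illustrative, not a fitted value from the study.

```python
# Simulate synaptic delays: ex-Gaussian phasic release + quasi-uniform tail.
# Normal + independent exponential gives the exponentially modified Gaussian
# (scipy.stats.exponnorm is the equivalent named distribution).
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, tau = 0.5, 0.1, 0.3          # ms: Gaussian stage + exponential stage
w, n = 0.9, 10_000                      # fraction of quanta released phasically
n_phasic = rng.binomial(n, w)
phasic = rng.normal(mu, sigma, n_phasic) + rng.exponential(tau, n_phasic)
delayed = rng.uniform(0.0, 5.0, n - n_phasic)   # quasi-uniform late component
delays = np.concatenate([phasic, delayed])
print(f"mean delay = {delays.mean():.3f} ms")
```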

  18. Bayesian analysis of nonlinear mixed-effects mixture models for longitudinal data with heterogeneity and skewness.

    PubMed

    Lu, Xiaosun; Huang, Yangxin

    2014-07-20

    It is common practice to analyze complex longitudinal data using nonlinear mixed-effects (NLME) models with a normality assumption. NLME models with normal distributions provide the most popular framework for modeling continuous longitudinal outcomes, assuming individuals are from a homogeneous population and relying on random effects to accommodate inter-individual variation. However, two issues may stand out: (i) the normality assumption for model errors may cause a lack of robustness and subsequently lead to invalid inference and unreasonable estimates, particularly if the data exhibit skewness, and (ii) a homogeneous population assumption may unrealistically obscure important features of between-subject and within-subject variations, which may result in unreliable modeling results. There have been relatively few studies concerning longitudinal data with both heterogeneity and skewness features. In the last two decades, skew distributions have proven beneficial in dealing with asymmetric data in various applications. In this article, our objective is to address the simultaneous impact of both features arising in longitudinal data by developing a flexible finite mixture of NLME models with skew distributions under a Bayesian framework that allows estimates of both model parameters and class membership probabilities for longitudinal data. Simulation studies are conducted to assess the performance of the proposed models and methods, and a real example from an AIDS clinical trial illustrates the methodology by modeling the viral dynamics to compare potential models with different distribution specifications; the analysis results are reported. PMID:24623529

  19. A fully Bayesian hidden Ising model for ChIP-seq data analysis.

    PubMed

    Mo, Qianxing

    2012-01-01

    Chromatin immunoprecipitation followed by next generation sequencing (ChIP-seq) is a powerful technique that is being used in a wide range of biological studies including genome-wide measurements of protein-DNA interactions, DNA methylation, and histone modifications. The vast amount of data and biases introduced by sequencing and/or genome mapping pose new challenges and call for effective methods and fast computer programs for statistical analysis. To systematically model ChIP-seq data, we build a dynamic signal profile for each chromosome and then model the profile using a fully Bayesian hidden Ising model. The proposed model naturally takes into account spatial dependency and global and local distributions of sequence tags. It can be used for one-sample and two-sample analyses. Through model diagnosis, the proposed method can detect falsely enriched regions caused by sequencing and/or mapping errors, which is usually not offered by the existing hypothesis-testing-based methods. The proposed method is illustrated using 3 transcription factor (TF) ChIP-seq data sets and 2 mixed ChIP-seq data sets and compared with 4 popular and/or well-documented methods: MACS, CisGenome, BayesPeak, and SISSRs. The results indicate that the proposed method achieves equivalent or higher sensitivity and spatial resolution in detecting TF binding sites with false discovery rate at a much lower level. PMID:21914728

  20. Quantitative analysis of dynamic contrast-enhanced MR images based on Bayesian P-splines.

    PubMed

    Schmid, Volker J; Whitcher, Brandon; Padhani, Anwar R; Yang, Guang-Zhong

    2009-06-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is an important tool for detecting subtle kinetic changes in cancerous tissue. Quantitative analysis of DCE-MRI typically involves the convolution of an arterial input function (AIF) with a nonlinear pharmacokinetic model of the contrast agent concentration. Parameters of the kinetic model are biologically meaningful, but the optimization of the nonlinear model has significant computational issues. In practice, convergence of the optimization algorithm is not guaranteed and the accuracy of the model fitting may be compromised. To overcome these problems, this paper proposes a semi-parametric penalized spline smoothing approach, where the AIF is convolved with a set of B-splines to produce a design matrix using locally adaptive smoothing parameters based on Bayesian penalized spline models (P-splines). It has been shown that kinetic parameter estimation can be obtained from the resulting deconvolved response function, which also includes the onset of contrast enhancement. Detailed validation of the method, both with simulated and in vivo data, is provided. PMID:19272996

  1. Bayesian linear regression with skew-symmetric error distributions with applications to survival analysis.

    PubMed

    Rubio, Francisco J; Genton, Marc G

    2016-06-30

    We study Bayesian linear regression models with skew-symmetric scale mixtures of normal error distributions. These kinds of models can be used to capture departures from the usual assumption of normality of the errors in terms of heavy tails and asymmetry. We propose a general noninformative prior structure for these regression models and show that the corresponding posterior distribution is proper under mild conditions. We extend these propriety results to cases where the response variables are censored. The latter scenario is of interest in the context of accelerated failure time models, which are relevant in survival analysis. We present a simulation study that demonstrates good frequentist properties of the posterior credible intervals associated with the proposed priors. This study also sheds some light on the trade-off between increased model flexibility and the risk of over-fitting. We illustrate the performance of the proposed models with real data. Although we focus on models with univariate response variables, we also present some extensions to the multivariate case in the Supporting Information. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26856806

  2. Bayesian probability analysis: a prospective demonstration of its clinical utility in diagnosing coronary disease

    SciTech Connect

    Detrano, R.; Yiannikas, J.; Salcedo, E.E.; Rincon, G.; Go, R.T.; Williams, G.; Leatherman, J.

    1984-03-01

    One hundred fifty-four patients referred for coronary arteriography were prospectively studied with stress electrocardiography, stress thallium scintigraphy, cine fluoroscopy (for coronary calcifications), and coronary angiography. Pretest probabilities of coronary disease were determined based on age, sex, and type of chest pain. These and pooled literature values for the conditional probabilities of test results based on disease state were used in Bayes theorem to calculate posttest probabilities of disease. The results of the three noninvasive tests were compared for statistical independence, a necessary condition for their simultaneous use in Bayes theorem. The test results were found to demonstrate pairwise independence in patients with and those without disease. Some dependencies that were observed between the test results and the clinical variables of age and sex were not sufficient to invalidate application of the theorem. Sixty-eight of the study patients had at least one major coronary artery obstruction of greater than 50%. When these patients were divided into low-, intermediate-, and high-probability subgroups according to their pretest probabilities, noninvasive test results analyzed by Bayesian probability analysis appropriately advanced 17 of them by at least one probability subgroup while only seven were moved backward. Of the 76 patients without disease, 34 were appropriately moved into a lower probability subgroup while 10 were incorrectly moved up. We conclude that posttest probabilities calculated from Bayes theorem more accurately classified patients with and without disease than did pretest probabilities, thus demonstrating the utility of the theorem in this application.
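
    The core calculation in this study is Bayes' theorem applied sequentially to (assumed independent) test results. The sketch below reproduces that arithmetic with invented sensitivities and specificities, not the pooled literature values the authors used.

```python
# Post-test probability of disease from a pretest probability and one test.
def posttest_probability(pretest, sensitivity, specificity, positive=True):
    if positive:
        num = pretest * sensitivity
        den = num + (1.0 - pretest) * (1.0 - specificity)
    else:
        num = pretest * (1.0 - sensitivity)
        den = num + (1.0 - pretest) * specificity
    return num / den

p = 0.30                                  # pretest: age, sex, chest-pain type
p = posttest_probability(p, 0.70, 0.85)   # positive stress ECG
p = posttest_probability(p, 0.85, 0.90)   # positive thallium scintigraphy
print(f"posttest probability = {p:.2f}")  # sequential use assumes independence
```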

  3. A Bayesian Approach to the Design and Analysis of Computer Experiments

    SciTech Connect

    Currin, C.

    1988-01-01

    We consider the problem of designing and analyzing experiments for prediction of the function y(t), t ∈ T, where y is evaluated by means of a computer code (typically by solving complicated equations that model a physical system), and T represents the domain of inputs to the code. We use a Bayesian approach, in which uncertainty about y is represented by a spatial stochastic process (random function); here we restrict attention to stationary Gaussian processes. The posterior mean function can be used as an interpolating function, with uncertainties given by the posterior standard deviations. Instead of completely specifying the prior process, we consider several families of priors, and suggest some cross-validational methods for choosing one that performs relatively well on the function at hand. As a design criterion, we use the expected reduction in the entropy of the random vector y(T*), where T* ⊂ T is a given finite set of 'sites' (input configurations) at which predictions are to be made. We describe an exchange algorithm for constructing designs that are optimal with respect to this criterion. To demonstrate the use of these design and analysis methods, several examples are given, including one experiment on a computer model of a thermal energy storage device and another on an integrated circuit simulator.
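
    A minimal numpy sketch of the stationary-Gaussian-process machinery described above follows: the posterior mean interpolates the code runs and the posterior standard deviation quantifies prediction uncertainty. The squared-exponential kernel and its hyperparameters are assumptions for illustration.

```python
# Gaussian-process emulator: posterior mean and sd at new input sites.
import numpy as np

def kern(a, b, ell=0.3, var=1.0):
    return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])      # design sites (code inputs)
y = np.sin(2 * np.pi * x)                      # code outputs (cheap stand-in)
xs = np.linspace(0.0, 1.0, 101)                # prediction sites

K = kern(x, x) + 1e-9 * np.eye(len(x))         # jitter for numerical stability
alpha = np.linalg.solve(K, y)
mean = kern(xs, x) @ alpha                     # interpolates the design points
cov = kern(xs, xs) - kern(xs, x) @ np.linalg.solve(K, kern(x, xs))
sd = np.sqrt(np.clip(np.diag(cov), 0.0, None))
print(mean[50], sd[50])                        # at a design site, sd ~ 0
```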

  4. Sex Differences in the Development of Neuroanatomical Functional Connectivity underlying Intelligence found using Bayesian Connectivity Analysis

    PubMed Central

    Schmithorst, Vincent J.; Holland, Scott K.

    2007-01-01

    A Bayesian method for functional connectivity analysis was adapted to investigate between-group differences. This method was applied in a large cohort of almost 300 children to investigate differences in boys and girls in the relationship between intelligence and functional connectivity for the task of narrative comprehension. For boys, a greater association was shown between intelligence and the functional connectivity linking Broca’s area to auditory processing areas, including Wernicke’s areas and the right posterior superior temporal gyrus. For girls, a greater association was shown between intelligence and the functional connectivity linking the left posterior superior temporal gyrus to Wernicke’s areas bilaterally. A developmental effect was also seen, with girls displaying a positive correlation with age in the association between intelligence and the functional connectivity linking the right posterior superior temporal gyrus to Wernicke’s areas bilaterally. Our results demonstrate a sexual dimorphism in the relationship of functional connectivity to intelligence in children and an increasing reliance on inter-hemispheric connectivity in girls with age. PMID:17223578

  6. Iteration of Partially-Specified Target Matrices: Applications in Exploratory and Bayesian Confirmatory Factor Analysis

    PubMed Central

    Moore, Tyler M.; Reise, Steven P.; Depaoli, Sarah; Haviland, Mark G.

    2015-01-01

    We describe and evaluate a factor rotation algorithm, iterated target rotation (ITR). Whereas target rotation (Browne, 2001) requires a user to specify a target matrix a priori based on theory or prior research, ITR begins with a standard analytic factor rotation (i.e., an empirically-informed target) followed by an iterative search procedure to update the target matrix. Monte Carlo simulations were conducted to evaluate the performance of ITR relative to analytic rotations from the Crawford-Ferguson family with population factor structures varying in complexity. Simulation results: (a) suggested that ITR analyses will be particularly useful when evaluating data with complex structures (i.e., multiple cross-loadings) and (b) showed that the rotation method used to define an initial target matrix did not materially affect the accuracy of the various ITRs. In Study 2, we: (a) demonstrated the application of ITR as a way to determine empirically-informed priors in a Bayesian confirmatory factor analysis (BCFA; Muthén & Asparouhov, 2012) of a rater-report alexithymia measure (Haviland, Warren, & Riggs, 2000) and (b) highlighted some of the challenges when specifying empirically-based priors and assessing item and overall model fit. PMID:26609875

  7. Hierarchical Bayesian approaches for detecting inconsistency in network meta-analysis.

    PubMed

    Zhao, Hong; Hodges, James S; Ma, Haijun; Jiang, Qi; Carlin, Bradley P

    2016-09-10

    Network meta-analysis (NMA), also known as multiple treatment comparisons, is commonly used to incorporate direct and indirect evidence comparing treatments. With recent advances in methods and software, Bayesian approaches to NMA have become quite popular and allow models of previously unanticipated complexity. However, when direct and indirect evidence differ in an NMA, the model is said to suffer from inconsistency. Current inconsistency detection in NMA is usually based on contrast-based (CB) models; however, this approach has certain limitations. In this work, we propose an arm-based random effects model, where we detect discrepancy of direct and indirect evidence for comparing two treatments using the fixed effects in the model while flagging extreme trials using the random effects. We define discrepancy factors to characterize evidence of inconsistency for particular treatment comparisons, which is novel in NMA research. Our approaches permit users to address issues previously tackled via CB models. We compare sources of inconsistency identified by our approach and existing loop-based CB methods using real and simulated datasets and demonstrate that our methods can offer powerful inconsistency detection. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27037506

  8. Bayesian time series analysis of segments of the Rocky Mountain trumpeter swan population

    USGS Publications Warehouse

    Wright, Christopher K.; Sojda, Richard S.; Goodman, Daniel

    2002-01-01

    A Bayesian time series analysis technique, the dynamic linear model, was used to analyze counts of Trumpeter Swans (Cygnus buccinator) summering in Idaho, Montana, and Wyoming from 1931 to 2000. For the Yellowstone National Park segment of white birds (sub-adults and adults combined) the estimated probability of a positive growth rate is 0.01. The estimated probability of achieving the Subcommittee on Rocky Mountain Trumpeter Swans 2002 population goal of 40 white birds for the Yellowstone segment is less than 0.01. Outside of Yellowstone National Park, Wyoming white birds are estimated to have a 0.79 probability of a positive growth rate, with a 0.05 probability of achieving the 2002 objective of 120 white birds. In the Centennial Valley in southwest Montana, results indicate a probability of 0.87 that the white bird population is growing at a positive rate, albeit with considerable uncertainty. The estimated probability of achieving the 2002 Centennial Valley objective of 160 white birds is 0.14, but under an alternative model falls to 0.04. The estimated probability that the Targhee National Forest segment of white birds has a positive growth rate is 0.03. In Idaho outside of the Targhee National Forest, white birds are estimated to have a 0.97 probability of a positive growth rate, with a 0.18 probability of attaining the 2002 goal of 150 white birds.
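
    The dynamic linear model used here can be illustrated in much-simplified form by a local-level model run through a Kalman filter; the counts and variances below are invented, not the swan data.

```python
# Local-level DLM: filtered population level from noisy annual counts.
import numpy as np

y = np.array([52, 55, 49, 61, 58, 64, 70, 66, 73, 75], float)  # annual counts
obs_var, state_var = 25.0, 4.0        # observation and evolution variances
m, C = y[0], 100.0                    # initial level mean and variance
levels = []
for yt in y:
    a, R = m, C + state_var           # predict: the level random-walks
    K = R / (R + obs_var)             # Kalman gain
    m, C = a + K * (yt - a), (1 - K) * R   # update with this year's count
    levels.append(m)
print(np.round(levels, 1))            # filtered population-level trajectory
```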

  9. Development of Bayesian analysis program for extraction of polarisation observables at CLAS

    SciTech Connect

    Lewis, Stefanie; Ireland, David; Vanderbauwhede, W.

    2014-06-01

    At the mass scale of a proton, the strong force is not well understood. Various quark models exist, but it is important to determine which quark model(s) are most accurate. Experimentally, finding resonances predicted by some models and not others would give valuable insight into this fundamental interaction. Several labs around the world use photoproduction experiments to find these missing resonances. The aim of this work is to develop a robust Bayesian data analysis program for extracting polarisation observables from pseudoscalar meson photoproduction experiments using CLAS at Jefferson Lab. This method, known as nested sampling, has been compared to traditional methods and has incorporated data parallelisation and GPU programming. It involves an event-by-event likelihood function, which has no associated loss of information from histogram binning, and results can be easily constrained to the physical region. One of the most important advantages of the nested sampling approach is that data from different experiments can be combined and analysed simultaneously. Results on both simulated and previously analysed experimental data for the K⁺Λ channel will be discussed.

  10. Foliar interception of radionuclides in dry conditions: a meta-analysis using a Bayesian modeling approach.

    PubMed

    Sy, Mouhamadou Moustapha; Ancelet, Sophie; Henner, Pascale; Hurtevent, Pierre; Simon-Cornu, Marie

    2015-09-01

    Uncertainty in the parameters that describe the transfer of radioactive materials into the (terrestrial) environment may be characterized thanks to datasets such as those compiled within International Atomic Energy Agency (IAEA) documents. Nevertheless, the information included in these documents is too poor to derive a relevant and informative uncertainty distribution for the dry interception of radionuclides by pasture grass and the leaves of vegetables. In this paper, 145 sets of dry interception measurements by the aboveground biomass of specific plants were collected from published scientific papers. A Bayesian meta-analysis was performed to derive the posterior probability distributions of the parameters, reflecting their uncertainty given the collected data. Four competing models were compared in terms of both fitting performance and predictive ability to reproduce plausible dry interception data. The asymptotic interception factor, applicable whatever the species and radionuclide to the highest aboveground biomass values (e.g. mature leafy vegetables), was estimated with the best model to be 0.87, with a 95% credible interval of (0.85, 0.89). PMID:26043277

  11. Problems and Cautions With Sequence Mismatch Analysis and Bayesian Skyline Plots to Infer Historical Demography.

    PubMed

    Grant, William Stewart

    2015-01-01

    Sequence mismatch analysis (MMA) and Bayesian skyline plots (BSP) are commonly used to reconstruct historical demography. A survey of 173 research articles (2009-2014), which included estimates of historical population sizes from mtDNA or cpDNA, shows a widespread genetic signature of demographic or spatial population expansion in species of all major taxonomic groups. Associating these expansions with climatic events can provide insights into the origins of lineage diversity, range expansions (or contractions), and speciation. However, several variables can introduce error into reconstructions of demographic history, including levels of sequence polymorphism, sampling scheme, sample size, natural selection, and estimates of mutation rate. Most researchers use substitution rates estimated from divergences in phylogenetic trees dated with fossils, or geological events. Recent studies show that molecular clocks calibrated with phylogenetic divergences can overestimate the timings of population-level events by an order of magnitude. Overestimates disconnect historical population reconstructions from climatic history and confound our understanding of the factors influencing genetic variability. If mismatch distributions and BSPs largely reflect demographic history, the widespread signature of population expansion in vertebrate, invertebrate, and plant populations appears to reflect responses to postglacial climate warming. PMID:25926628

  12. Integration of Bayesian analysis for eutrophication prediction and assessment in a landscape lake.

    PubMed

    Yang, Likun; Zhao, Xinhua; Peng, Sen; Zhou, Guangyu

    2015-01-01

    Eutrophication models have been widely used to assess water quality in landscape lakes. Because the flow rate in landscape lakes is relatively low, similar to that of natural lakes, eutrophication is more dominant in landscape lakes. To assess the risk of eutrophication in landscape lakes, a set of dynamic equations was developed to simulate lake water quality for total nitrogen (TN), total phosphorus (TP), dissolved oxygen (DO) and chlorophyll a (Chl a). First, the Bayesian calibration results are described. Moreover, the ability of the model to adequately reproduce the observed mean patterns and major cause-effect relationships for water quality conditions in landscape lakes is demonstrated. Two loading scenarios were used. A Monte Carlo algorithm was applied to calculate the predicted water quality distributions, which were used in the established hierarchical assessment system for lake water quality risk. The important factors affecting the lake water quality risk were identified using linear regression analysis. The results indicated that variations in the quality of the recharge water received by the landscape lake caused considerable lake water quality risk in the surrounding area. Moreover, the Chl a concentration in lake water was significantly affected by the TP and TN concentrations; the lake TP concentration was the limiting factor for the growth of plankton in lake water, while the TN concentration provided the basic nutritional requirements. Lastly, lower TN and TP concentrations in the receiving recharge water caused increased lake water quality risk. PMID:25467413

  13. Data analysis using scale-space filtering and Bayesian probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Kutulakos, Kiriakos; Robinson, Peter

    1991-01-01

    This paper describes a program for the analysis of output curves from a Differential Thermal Analyzer (DTA). The program first extracts probabilistic qualitative features from a DTA curve of a soil sample, and then uses Bayesian probabilistic reasoning to infer the minerals in the soil. The qualifier module employs a simple and efficient extension of scale-space filtering suitable for handling DTA data. We have observed that points can vanish from contours in the scale-space image when filtering operations are not highly accurate. To handle the problem of vanishing points, perceptual organization heuristics are used to group the points into lines. Next, these lines are grouped into contours by using additional heuristics. Probabilities are associated with these contours using domain-specific correlations. A Bayes tree classifier processes the probabilistic features to infer the presence of different minerals in the soil. Experiments show that the algorithm that uses domain-specific correlations to infer qualitative features outperforms a domain-independent algorithm that does not.

  14. Composite behavior analysis for video surveillance using hierarchical dynamic Bayesian networks

    NASA Astrophysics Data System (ADS)

    Cheng, Huanhuan; Shan, Yong; Wang, Runsheng

    2011-03-01

    Analyzing composite behaviors involving objects from multiple categories in surveillance videos is a challenging task due to the complicated relationships among humans and objects. This paper presents a novel behavior analysis framework using a hierarchical dynamic Bayesian network (DBN) for video surveillance systems. The model is built for extracting objects' behaviors and their relationships by representing behaviors using spatial-temporal characteristics. The recognition of object behaviors is processed by the DBN at multiple levels: features of objects at the low level, objects and their relationships at the middle level, and events at the high level, where an event refers to behaviors of a single type of object as well as behaviors involving several types of objects, such as "a person getting in a car." Furthermore, to reduce the complexity, a simple model selection criterion is introduced, by which the appropriate model is picked out from a pool of candidate models. Experiments demonstrate that the proposed framework can efficiently recognize and semantically describe composite object and human activities in surveillance videos.

  15. Volume measurement of cryogenic deuterium pellets by Bayesian analysis of single shadowgraphy images.

    PubMed

    Szepesi, T; Kálvin, S; Kocsis, G; Lang, P T; Wittmann, C

    2008-03-01

    In situ commissioning of the Blower-gun injector for launching cryogenic deuterium pellets at the ASDEX Upgrade tokamak was performed. This injector is designed for highly repetitive launch of small pellets for edge-localised-mode pacing experiments. During the investigation the final injection geometry was simulated, with pellets passing to the torus through a 5.5 m long guiding tube. To investigate pellet quality at launch and after tube passage, laser flash camera shadowgraphy diagnostic units were installed before and after the tube. As an indicator of pellet quality we adopted the pellet mass, represented by the volume of the main remaining pellet fragment. Since only two-dimensional (2D) shadow images were obtained, a reconstruction of the full three-dimensional pellet body had to be performed. For this, the image was first converted into a 1-bit version prescribing an exact 2D contour. From this contour the expected value of the volume was calculated by Bayesian analysis, taking into account the likely cylindrical shape of the pellet. Under appropriate injection conditions sound pellets with more than half of their nominal mass are detected after acceleration; the passage causes on average an additional loss of about 40% of the launched mass. Analyzing pellets arriving at the tube exit allowed the injector's optimized operational conditions to be derived: under these conditions more than 90% of the pellets arrived with sound quality when operating in the 5-50 Hz frequency range. PMID:18377004

  16. Improving Bayesian analysis for LISA Pathfinder using an efficient Markov Chain Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ferraioli, Luigi; Porter, Edward K.; Armano, Michele; Audley, Heather; Congedo, Giuseppe; Diepholz, Ingo; Gibert, Ferran; Hewitson, Martin; Hueller, Mauro; Karnesis, Nikolaos; Korsakova, Natalia; Nofrarias, Miquel; Plagnol, Eric; Vitale, Stefano

    2014-02-01

    We present a parameter estimation procedure based on a Bayesian framework, applying a Markov chain Monte Carlo algorithm to the calibration of the dynamical parameters of the LISA Pathfinder satellite. The method is based on the Metropolis-Hastings algorithm and a two-stage annealing treatment in order to ensure an effective exploration of the parameter space at the beginning of the chain. We compare two versions of the algorithm with an application to a LISA Pathfinder data analysis problem. The two algorithms share the same heating strategy, but one moves in coordinate directions using proposals from a multivariate Gaussian distribution, while the other uses the natural logarithm of some parameters and proposes jumps in the eigen-space of the Fisher information matrix. The algorithm proposing jumps in the eigen-space of the Fisher information matrix demonstrates a higher acceptance rate and a slightly better convergence towards the equilibrium parameter distributions in the application to LISA Pathfinder data. For this experiment, we return parameter values that are all within ∼1σ of the injected values. When we analyse the accuracy of our parameter estimation in terms of its effect on the force-per-unit-mass noise, we find that the induced errors are three orders of magnitude less than the expected experimental uncertainty in the power spectral density.
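
    The heating idea is easy to show in miniature: a Metropolis-Hastings chain whose acceptance ratio is tempered during an early hot phase so the chain explores widely before settling. The toy two-parameter Gaussian target below merely stands in for the LISA Pathfinder model.

```python
# Metropolis-Hastings with a simple two-stage annealing (heating) phase.
import numpy as np

rng = np.random.default_rng(3)
target_mean = np.array([1.0, -2.0])

def logpost(theta):
    return -0.5 * np.sum((theta - target_mean) ** 2 / 0.1)

theta = np.zeros(2)
lp = logpost(theta)
chain = []
for i in range(5000):
    T = 10.0 if i < 500 else 1.0             # hot early chain explores widely
    prop = theta + rng.normal(0.0, 0.2, 2)   # isotropic Gaussian proposal
    lp_prop = logpost(prop)
    if np.log(rng.random()) < (lp_prop - lp) / T:
        theta, lp = prop, lp_prop
    if i >= 1000:                            # discard burn-in/heating samples
        chain.append(theta)
print(np.mean(chain, axis=0))                # ~ [1.0, -2.0]
```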

  17. Adverse and Advantageous Selection in the Medicare Supplemental Market: A Bayesian Analysis of Prescription drug Expenditure.

    PubMed

    Li, Qian; Trivedi, Pravin K

    2016-02-01

    This paper develops an extended specification of the two-part model, which controls for unobservable self-selection and heterogeneity of health insurance, and analyzes the impact of Medicare supplemental plans on the prescription drug expenditure of the elderly, using a linked data set based on the Medicare Current Beneficiary Survey data for 2003-2004. The econometric analysis is conducted within a Bayesian econometric framework. We estimate the treatment effects for different counterfactuals and find significant evidence of endogeneity in plan choice and the presence of both adverse and advantageous selection in the supplemental insurance market. The average incentive effect is estimated to be $757 (2004 dollars), a 41% increase, per person per year for the elderly enrolled in supplemental plans with drug coverage against the Medicare fee-for-service counterfactual, and $350, or 21%, against the counterfactual of supplemental plans without drug coverage. The incentive effect varies by source of drug coverage: it is highest for employer-sponsored insurance plans, followed by Medigap and managed Medicare plans. PMID:25504934

  18. Assessment of occupational safety risks in Floridian solid waste systems using Bayesian analysis.

    PubMed

    Bastani, Mehrad; Celik, Nurcin

    2015-10-01

    Safety risks embedded within solid waste management systems continue to be a significant issue and are prevalent at every step in the solid waste management process. To recognise and address these occupational hazards, it is necessary to discover the potential safety concerns that cause them, as well as their direct and/or indirect impacts on the different types of solid waste workers. In this research, our goal is to statistically assess occupational safety risks to solid waste workers in the state of Florida. Here, we first review the related standard industrial codes for major solid waste management methods, including recycling, incineration, landfilling, and composting. Then, a quantitative assessment of major risks is conducted based on the data collected, using Bayesian data analysis and predictive methods. The risks estimated in this study for the period 2005-2012 are then compared with historical statistics (1993-1997) from previous assessment studies. The results show that among refuse collectors the rates of musculoskeletal and dermal injuries have decreased from 88 and 15 to 16 and 3 injuries per 1000 workers, respectively. However, a contrasting trend is observed among recycling workers, for whom musculoskeletal and dermal injuries have increased from 13 and 4 to 14 and 6 injuries per 1000 workers, respectively. Lastly, a linear regression model has been proposed to identify the major contributors to the high numbers of musculoskeletal and dermal injuries. PMID:26219294

  19. Cross-validation analysis of bias models in Bayesian multi-model projections of climate

    NASA Astrophysics Data System (ADS)

    Huttunen, J. M. J.; Räisänen, J.; Nissinen, A.; Lipponen, A.; Kolehmainen, V.

    2016-05-01

    Climate change projections are commonly based on multi-model ensembles of climate simulations. In this paper we consider the choice of bias models in Bayesian multi-model predictions. Buser et al. (Clim Res 44(2-3):227-241, 2010a) introduced a hybrid bias model which combines the commonly used constant bias and constant relation bias assumptions. The hybrid model includes a weighting parameter which balances these bias models. In this study, we use a cross-validation approach to study which bias model or bias parameter leads to, in a specific sense, optimal climate change projections. The analysis is carried out for summer and winter season means of 2 m temperatures spatially averaged over the IPCC SREX regions, using 19 model runs from the CMIP5 data set. The cross-validation approach is applied to calculate optimal bias parameters (in this specific sense) for projecting the temperature change from the control period (1961-2005) to the scenario period (2046-2090). The results are compared to those of the Buser et al. (Clim Res 44(2-3):227-241, 2010a) method, which includes the bias parameter as one of the unknown parameters to be estimated from the data.

  20. Joint Analysis of Multiple Traits in Rare Variant Association Studies.

    PubMed

    Wang, Zhenchuan; Wang, Xuexia; Sha, Qiuying; Zhang, Shuanglin

    2016-05-01

    The joint analysis of multiple traits has recently become popular since it can increase statistical power to detect genetic variants and there is increasing evidence showing that pleiotropy is a widespread phenomenon in complex diseases. Currently, the majority of existing methods for the joint analysis of multiple traits test association between one common variant and multiple traits. However, the variant-by-variant methods for common variant association studies may not be optimal for rare variant association studies due to the allelic heterogeneity as well as the extreme rarity of individual variants. Current statistical methods for rare variant association studies are for one single trait only. In this paper, we propose an adaptive weighting reverse regression (AWRR) method to test association between multiple traits and rare variants in a genomic region. AWRR is robust to the directions of effects of causal variants and is also robust to the directions of association of traits. Using extensive simulation studies, we compare the performance of AWRR with canonical correlation analysis (CCA), Single-TOW, and the weighted sum reverse regression (WSRR). Our results show that, in all of the simulation scenarios, AWRR is consistently more powerful than CCA. In most scenarios, AWRR is more powerful than Single-TOW and WSRR. PMID:26990300

  1. Joint regression analysis and AMMI model applied to oat improvement

    NASA Astrophysics Data System (ADS)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

    In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board: a sample of 22 different genotypes grown during the years 2002, 2003 and 2004 at six locations. Ferreira et al. (2006) state the relevance of regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients, and the agricolae package available in R for the AMMI model analysis.
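
    A bare-bones numpy version of the AMMI decomposition (additive main effects plus an SVD of the doubly-centred interaction residuals) is sketched below; the yield matrix is randomly generated, standing in for the 22-genotype, six-location data.

```python
# AMMI skeleton: main effects + SVD of the genotype-environment interaction.
import numpy as np

rng = np.random.default_rng(4)
Y = rng.normal(5.0, 0.5, size=(22, 6))        # genotypes x environments (t/ha)
grand = Y.mean()
g = Y.mean(axis=1, keepdims=True) - grand     # genotype main effects
e = Y.mean(axis=0, keepdims=True) - grand     # environment main effects
interaction = Y - grand - g - e               # doubly-centred residuals
U, s, Vt = np.linalg.svd(interaction, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
print("interaction variance on IPCA1, IPCA2:", explained[:2].round(3))
```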

  2. Finite Element Analysis of the Maximum Stress at the Joints of the Transmission Tower

    NASA Astrophysics Data System (ADS)

    Itam, Zarina; Beddu, Salmia; Liyana Mohd Kamal, Nur; Bamashmos, Khaled H.

    2016-03-01

    Transmission towers are tall structures, usually steel lattice towers, used to support overhead power lines. Usually, transmission towers are analyzed as frame-truss systems and the members are assumed to be pin-connected, without explicitly considering the effects of joints on the tower behavior. In this research, an engineering example of a joint is analyzed with consideration of the joint detailing, to investigate how it affects the tower analysis. A static analysis using STAAD Pro was conducted to identify the joint with the maximum stress. This joint was then explicitly analyzed in ANSYS using the finite element method. Three approaches were used in the software: a simple plate model, bonded contact with no bolts, and beam-element bolts. Results from the joint analysis show that stress values increased when joint details were considered. This proves that joints and connections play an important role in the distribution of stress within the transmission tower.

  3. Bayesian spatiotemporal analysis of zero-inflated biological population density data by a delta-normal spatiotemporal additive model.

    PubMed

    Arcuti, Simona; Pollice, Alessio; Ribecco, Nunziata; D'Onghia, Gianfranco

    2016-03-01

    We evaluate the spatiotemporal changes in the density of a particular species of crustacean known as the deep-water rose shrimp, Parapenaeus longirostris, based on biological sample data collected during trawl surveys carried out from 1995 to 2006 as part of the international project MEDITS (MEDiterranean International Trawl Surveys). As is the case for many biological variables, density data are continuous and characterized by unusually large amounts of zeros, accompanied by a skewed distribution of the remaining values. Here we analyze the normalized density data by a Bayesian delta-normal semiparametric additive model including the effects of covariates, using penalized regression with low-rank thin-plate splines for nonlinear spatial and temporal effects. Modeling the zero and nonzero values by two joint processes, as we propose in this work, provides great flexibility and easy handling of complex likelihood functions, avoiding inaccurate statistical inferences due to misclassification of the high proportion of exact zeros in the model. Bayesian model estimation is obtained by Markov chain Monte Carlo simulations, suitably specifying the complex likelihood function of the zero-inflated density data. The study highlights relevant nonlinear spatial and temporal effects and the influence of the annual Mediterranean oscillations index and of the sea surface temperature on the distribution of the deep-water rose shrimp density. PMID:26418888
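
    Stripped of covariates and splines, the delta (two-part) idea reduces to modelling presence/absence and positive density separately. The sketch below fits a Bernoulli part and a lognormal part to simulated zero-inflated densities; all values are invented.

```python
# Two-part ("delta") model for zero-inflated continuous density data.
import numpy as np

rng = np.random.default_rng(5)
dens = np.where(rng.random(500) < 0.6, 0.0,
                rng.lognormal(mean=1.0, sigma=0.7, size=500))

p_zero = np.mean(dens == 0.0)                 # Bernoulli (occurrence) part
logpos = np.log(dens[dens > 0.0])             # lognormal (positive) part
mu, sd = logpos.mean(), logpos.std(ddof=1)
mean_density = (1.0 - p_zero) * np.exp(mu + 0.5 * sd ** 2)
print(f"P(zero) = {p_zero:.2f}, unconditional mean density = {mean_density:.2f}")
```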

  4. Modeling of joints for the dynamic analysis of truss structures

    NASA Technical Reports Server (NTRS)

    Belvin, W. Keith

    1987-01-01

    An experimentally-based method for determining the stiffness and damping of truss joints is described. The analytical models use springs and both viscous and friction dampers to simulate joint load-deflection behavior. A least-squares algorithm is developed to identify the stiffness and damping coefficients of the analytical joint models from test data. The effects of nonlinear joint stiffness such as joint dead band are also studied. Equations for predicting the sensitivity of beam deformations to changes in joint stiffness are derived and used to show the level of joint stiffness required for nearly rigid joint behavior. Finally, the global frequency sensitivity of a truss structure to random perturbations in joint stiffness is discussed.
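
    The least-squares identification step lends itself to a compact sketch: fitting a linear stiffness and a viscous damping coefficient to joint load-deflection data, F ≈ kx + c·ẋ. The friction-damper terms of the full model are omitted here, and the data are synthetic.

```python
# Identify joint stiffness k and viscous damping c by linear least squares.
import numpy as np

t = np.linspace(0.0, 1.0, 200)
x = 1e-3 * np.sin(2 * np.pi * 5 * t)          # measured deflection [m]
xdot = np.gradient(x, t)                      # deflection rate [m/s]
F = 2.0e6 * x + 150.0 * xdot                  # synthetic measured load [N]

A = np.column_stack([x, xdot])
(k, c), *_ = np.linalg.lstsq(A, F, rcond=None)
print(f"k = {k:.3e} N/m, c = {c:.1f} N*s/m")
```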

  6. A Bayesian Approach for Evaluation of Determinants of Health System Efficiency Using Stochastic Frontier Analysis and Beta Regression.

    PubMed

    Şenel, Talat; Cengiz, Mehmet Ali

    2016-01-01

    In today's world, public expenditures on health are one of the most important issues for governments. These increased expenditures are putting pressure on public budgets. Therefore, health policy makers have focused on the performance of their health systems, and many countries have introduced reforms to improve that performance. This study investigates the most important determinants of healthcare efficiency for OECD countries using a second-stage approach for Bayesian Stochastic Frontier Analysis (BSFA). There are two steps in this study. First, we measure the healthcare efficiency of 29 OECD countries by BSFA using data from the OECD Health Database. In the second stage, we examine the multiple relationships between healthcare efficiency and the characteristics of healthcare systems across OECD countries using Bayesian beta regression. PMID:27118987

  7. A Bayesian Meta-Analysis on Prevalence of Hepatitis B Virus Infection among Chinese Volunteer Blood Donors

    PubMed Central

    Liu, Guang-ying; Zheng, Yang; Deng, Yan; Gao, Yan-yan; Wang, Lie

    2013-01-01

    Background: Although transfusion-transmitted infection of hepatitis B virus (HBV) threatens blood safety in China, the nationwide circumstances of HBV infection among blood donors are still unclear. Objectives: To comprehensively estimate the prevalence of HBsAg positivity and HBV occult infection (OBI) among Chinese volunteer blood donors through Bayesian meta-analysis. Methods: We performed an electronic search in PubMed, Web of Knowledge, Medline, Wanfang Data and CNKI, complemented by a hand search of relevant reference lists. Two authors independently extracted data from the eligible studies. Then two Bayesian random-effects meta-analyses were performed, followed by Bayesian meta-regressions. Results: 5,957,412 and 571,227 donors were identified in the HBsAg group and the OBI group, respectively. The pooled prevalences among donors are 1.085% (95% credible interval [CI] 0.859%-1.398%) for the HBsAg group and 0.094% (95% CI 0.0578%-0.1655%) for the OBI group. For the HBsAg group, subgroup analysis shows that more developed areas have a lower prevalence than less developed areas; meta-regression indicates a significant decreasing trend in HBsAg-positive prevalence with sampling year (beta = -0.1202, 95% CI -0.2081 to -0.0312). Conclusion: Blood safety against HBV infection in China faces serious threats, and the government should take effective measures to improve this situation. PMID:24236110

  8. Bayesian Statistical Analysis of Historical and Late Holocene Rates of Sea-Level Change

    NASA Astrophysics Data System (ADS)

    Cahill, Niamh; Parnell, Andrew; Kemp, Andrew; Horton, Benjamin

    2014-05-01

    A fundamental concern associated with climate change is the rate at which sea levels are rising. Studies of past sea level (particularly beyond the instrumental data range) allow modern sea-level rise to be placed in a more complete context. Considering this, we perform a Bayesian statistical analysis on historical and late Holocene rates of sea-level change. The data that form the input to the statistical model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. The aims are to estimate rates of sea-level rise, to determine when modern rates of sea-level rise began and to observe how these rates have been changing over time. Many of the current methods for doing this use simple linear regression to estimate rates. This is often inappropriate as it is too rigid and can ignore uncertainties that arise as part of the data collection exercise. This can lead to overconfidence in the sea-level trends being characterized. The proposed Bayesian model places a Gaussian process prior on the rate process (i.e. the process that determines how rates of sea-level change are changing over time). The likelihood of the observed data is the integral of this process. When dealing with proxy reconstructions, this is set in an errors-in-variables framework so as to take account of age uncertainty. It is also necessary, in this case, for the model to account for glacio-isostatic adjustment, which introduces a covariance between individual age and sea-level observations. This method provides a flexible fit and allows for the direct estimation of the rate process with full consideration of all sources of uncertainty. Analysis of tide-gauge datasets and proxy reconstructions in this way means that changing rates of sea level can be estimated more comprehensively and accurately than previously possible. The model captures the continuous and dynamic evolution of sea-level change, and results show not only that modern sea levels are rising but that the rates of rise have been changing over time.

  9. Reduction of a Whole-Body Physiologically Based Pharmacokinetic Model to Stabilise the Bayesian Analysis of Clinical Data.

    PubMed

    Wendling, Thierry; Tsamandouras, Nikolaos; Dumitras, Swati; Pigeolet, Etienne; Ogungbenro, Kayode; Aarons, Leon

    2016-01-01

    Whole-body physiologically based pharmacokinetic (PBPK) models are increasingly used in drug development for their ability to predict drug concentrations in clinically relevant tissues and to extrapolate across species, experimental conditions and sub-populations. A whole-body PBPK model can be fitted to clinical data using a Bayesian population approach. However, the analysis might be time consuming and numerically unstable if prior information on the model parameters is too vague given the complexity of the system. We suggest an approach where (i) a whole-body PBPK model is formally reduced using a Bayesian proper lumping method to retain the mechanistic interpretation of the system and account for parameter uncertainty, (ii) the simplified model is fitted to clinical data using Markov Chain Monte Carlo techniques and (iii) the optimised reduced PBPK model is used for extrapolation. A previously developed 16-compartment whole-body PBPK model for mavoglurant was reduced to 7 compartments while preserving plasma concentration-time profiles (median and variance) and giving emphasis to the brain (target site) and the liver (elimination site). The reduced model was numerically more stable than the whole-body model for the Bayesian analysis of mavoglurant pharmacokinetic data in healthy adult volunteers. Finally, the reduced yet mechanistic model could easily be scaled from adults to children and predict mavoglurant pharmacokinetics in children aged from 3 to 11 years with similar performance compared with the whole-body model. This study is a first example of the practicality of formal reduction of complex mechanistic models for Bayesian inference in drug development. PMID:26538125

  10. Bayesian Belief Network to support conflict analysis for groundwater protection: the case of the Apulia region.

    PubMed

    Giordano, Raffaele; D'Agostino, Daniela; Apollonio, Ciro; Lamaddalena, Nicola; Vurro, Michele

    2013-01-30

    Water resource management is often characterized by conflicts, as a result of the heterogeneity of interests associated with a shared resource. Many water conflicts arise on a global scale and, in particular, an increasing level of conflicts can be observed in the Mediterranean basin, characterized by water scarcity. In the present work, in order to assist the conflict analysis process, and thus outline a proper groundwater management, stakeholders were involved in the process and suitable tools were used in a Mediterranean area (the Apulia region, in Italy). In particular, this paper seeks to elicit and structure farmers' mental models influencing their decision over the main water source for irrigation. The more crucial groundwater is for farmers' objectives, the more controversial is the groundwater protection strategy. Bayesian Belief Networks were developed to simulate farmers' behavior with regard to groundwater management and to assess the impacts of protection strategy. These results have been used to calculate the conflict degree in the study area, derived from the introduction of policies for the reduction of groundwater exploitation for irrigation purposes. The less acceptable the policy is, the more likely it is that conflict will develop between farmers and the Regional Authority. The results of conflict analysis were also used to contribute to the debate concerning potential conflict mitigation measures. The approach adopted in this work has been discussed with a number of experts in groundwater management policies and irrigation management, and its main strengths and weaknesses have been identified. Increasing awareness of the existence of potential conflicts and the need to deal with them can be seen as an interesting initial shift in the Apulia region's water management regime, which is still grounded in merely technical approaches. PMID:23246906

  11. Bayesian network modeling: A case study of an epidemiologic system analysis of cardiovascular risk.

    PubMed

    Fuster-Parra, P; Tauler, P; Bennasar-Veny, M; Ligęza, A; López-González, A A; Aguiló, A

    2016-04-01

    An extensive, in-depth study of cardiovascular risk factors (CVRF) seems to be of crucial importance in research on cardiovascular disease (CVD) in order to prevent (or reduce) the chance of developing or dying from CVD. The main focus of data analysis is on the use of models able to discover and understand the relationships between different CVRF. This paper reports on applying Bayesian network (BN) modeling to discover the relationships among thirteen relevant epidemiological features of the heart-age domain in order to analyze cardiovascular lost years (CVLY), cardiovascular risk score (CVRS), and metabolic syndrome (MetS). Furthermore, the induced BN was used to make inferences taking into account three reasoning patterns: causal reasoning, evidential reasoning, and intercausal reasoning. Application of BN tools has led to the discovery of several direct and indirect relationships between different CVRF. The BN analysis showed several interesting results, among them: CVLY was highly influenced by smoking, with men being the group at highest risk in CVLY; MetS was highly influenced by physical activity (PA), with men again being the group at highest risk in MetS, while smoking showed no influence. BNs produce an intuitive, transparent, graphical representation of the relationships between different CVRF. The ability of BNs to predict new scenarios when hypothetical information is introduced makes BN modeling an Artificial Intelligence (AI) tool of special interest in epidemiological studies. As CVD is multifactorial, the use of BNs seems to be an adequate modeling tool. PMID:26777431

  12. Bayesian hierarchical multi-subject multiscale analysis of functional MRI data.

    PubMed

    Sanyal, Nilotpal; Ferreira, Marco A R

    2012-11-15

    We develop a methodology for Bayesian hierarchical multi-subject multiscale analysis of functional Magnetic Resonance Imaging (fMRI) data. We begin by modeling the brain images temporally with a standard general linear model. After that, we transform the resulting estimated standardized regression coefficient maps through a discrete wavelet transformation to obtain a sparse representation in the wavelet space. Subsequently, we assign to the wavelet coefficients a prior that is a mixture of a point mass at zero and a Gaussian distribution. In this mixture prior for the wavelet coefficients, the mixture probabilities are related to the pattern of brain activity across different resolutions. To incorporate this information, we assume that the mixture probabilities for wavelet coefficients at the same location and level are common across subjects. Furthermore, we assign to the mixture probabilities a prior that depends on a few hyperparameters. We develop an empirical Bayes methodology to estimate the hyperparameters and, as these hyperparameters are shared by all subjects, we obtain precise estimated values. We then carry out inference in the wavelet space and obtain smoothed images of the regression coefficients by applying the inverse wavelet transform to the posterior means of the wavelet coefficients. An application to computer-simulated synthetic data shows that, compared with single-subject analysis, our multi-subject methodology performs better in terms of mean squared error. Finally, we illustrate the utility and flexibility of our multi-subject methodology with an application to an event-related fMRI dataset generated by Postle (2005) through a multi-subject fMRI study of working-memory-related brain activation. PMID:22951257
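
    For readers who want the mixture-prior mechanics, the following sketch computes the posterior shrinkage of empirical wavelet coefficients under a spike-and-slab prior. The values of pi, tau, and sigma are placeholders; in the paper the mixture probabilities are tied across subjects by location and level and estimated by empirical Bayes.

      import numpy as np
      from scipy.stats import norm

      # Prior: pi * N(0, tau^2) + (1 - pi) * point mass at 0; noise: N(0, sigma^2).
      # pi, tau, sigma are illustrative assumptions, not the paper's estimates.
      def shrink(d, pi=0.2, tau=2.0, sigma=1.0):
          p_slab  = pi * norm.pdf(d, 0.0, np.sqrt(sigma**2 + tau**2))
          p_spike = (1.0 - pi) * norm.pdf(d, 0.0, sigma)
          w = p_slab / (p_slab + p_spike)            # posterior P(coef != 0 | d)
          return w * d * tau**2 / (sigma**2 + tau**2)  # posterior mean of the coefficient

      d = np.array([0.3, 1.0, 4.0])
      print(shrink(d))   # small coefficients are shrunk hard, large ones survive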

  13. Bayesian Monte Carlo analysis applied to regional-scale inverse emission modeling for reactive trace gases

    NASA Astrophysics Data System (ADS)

    Deguillaume, L.; Beekmann, M.; Menut, L.

    2007-01-01

    The purpose of this article is inverse modeling of emissions at the regional scale for photochemical applications. The study is performed for the Ile-de-France region over two summer periods (1998 and 1999). This area represents an ideal framework, since concentrated anthropogenic emissions in the Paris region frequently lead to the formation of urban plumes. The inversion method is based on a Bayesian Monte Carlo analysis applied to a regional-scale chemistry transport model, CHIMERE. The method consists of performing a large number of successive simulations with the same model but with a distinct set of model input parameters each time. A posteriori weights are then attributed to individual Monte Carlo simulations by comparing them with observations from the AIRPARIF network: urban NO and O3 concentrations and rural O3 concentrations around the Paris area. For both NO and O3 measurements, the observations used to constrain the Monte Carlo simulations are additionally averaged over the time period considered for analysis. The observational constraints strongly reduce the a priori uncertainties in anthropogenic NOx and volatile organic compound (VOC) emissions: (1) the a posteriori probability density function (pdf) for NOx emissions is unchanged in its average, but its standard deviation is decreased to around 20% (40% for the a priori pdf); (2) VOC emissions are enhanced (+16%) in the a posteriori pdf, with a standard deviation of around 30% (40% for the a priori pdf). Uncertainties in the simulated urban NO, urban O3, and O3 production within the plume are reduced by factors of 3.2, 2.4, and 1.7, respectively.
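
    The weighting step can be sketched in a few lines: run the model repeatedly with perturbed emissions, then weight each run by its Gaussian likelihood against the averaged observations. The linear surrogate "model" and all numbers below are stand-ins for CHIMERE, not values from the study.

      import numpy as np

      rng = np.random.default_rng(0)

      # A priori emission scaling factors, perturbed for each Monte Carlo run
      n = 5000
      e_nox = rng.normal(1.0, 0.4, n)
      e_voc = rng.normal(1.0, 0.4, n)
      sim_o3 = 40 + 25 * e_voc - 10 * e_nox      # hypothetical surrogate for simulated O3 (ppb)
      obs_o3, sigma = 55.0, 5.0                  # averaged observation and its error

      w = np.exp(-0.5 * ((sim_o3 - obs_o3) / sigma) ** 2)
      w /= w.sum()                               # a posteriori weights

      mean = np.sum(w * e_voc)
      std = np.sqrt(np.sum(w * (e_voc - mean) ** 2))
      print(mean, std)   # a posteriori pdf of the VOC factor: shifted mean, reduced spread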

  14. Health at the borders: Bayesian multilevel analysis of women's malnutrition determinants in Ethiopia

    PubMed Central

    Delbiso, Tefera Darge; Rodriguez-Llanes, Jose Manuel; Altare, Chiara; Masquelier, Bruno; Guha-Sapir, Debarati

    2016-01-01

    Background: Women's malnutrition, particularly undernutrition, remains an important public health challenge in Ethiopia. Although various studies have examined the levels and determinants of women's nutritional status, the influence of living close to an international border on women's nutrition has not been investigated. Yet Ethiopian borders are regularly affected by conflict and refugee flows, which might ultimately impact health. Objective: To investigate the impact of living close to borders on the nutritional status of women in Ethiopia, while considering other important covariates. Design: Our analysis was based on the body mass index (BMI) of 6,334 adult women aged 20-49 years, obtained from the 2011 Ethiopian Demographic and Health Survey (EDHS). A Bayesian multilevel multinomial logistic regression analysis was used to capture the clustered structure of the data and the possible correlation that may exist within and between clusters. Results: After controlling for potential confounders, women living close to borders (i.e. ≤100 km) in Ethiopia were 59% more likely to be underweight (posterior odds ratio [OR]=1.59; 95% credible interval [CrI]: 1.32–1.90) than their counterparts living farther from the borders. This result was robust to different choices of border delineation (i.e. ≤50, ≤75, ≤125, and ≤150 km). Coming from a poor family, lacking access to improved toilets, residing in lowland areas, and being Muslim were independently associated with underweight. In contrast, greater wealth, higher education, older age, access to improved toilets, being married, and living in urban areas or lowlands were independently associated with overweight. Conclusions: The problem of undernutrition among women in Ethiopia is most worrisome in the border areas. Targeted interventions to improve nutritional status in these areas, such as improved access to sanitation and economic and livelihood support, are recommended. PMID:27388539

  15. An inelastic analysis of a welded aluminum joint

    NASA Astrophysics Data System (ADS)

    Vaughan, Robert E.; Schonberg, William P.

    1995-02-01

    Butt-weld joints are most commonly designed into pressure vessels using weld material properties determined from a tensile test. These properties are provided to the stress analyst in the form of a stress vs. strain diagram. Variations in properties through the thickness and along the width of the weld have been suspected but not explored, because of inaccessibility and cost. The purpose of this study is to investigate analytical and computational methods for the analysis of multiple-pass aluminum 2219-T87 butt welds. The weld specimens are analyzed using classical plasticity theory to provide a basis for modeling the inelastic properties in a finite element solution. The results of the analysis are compared with experimental data to determine the weld behavior and the accuracy of currently available numerical prediction methods.

  16. Multi-component joint analysis of surface waves

    NASA Astrophysics Data System (ADS)

    Dal Moro, Giancarlo; Moura, Rui Miguel Marques; Moustafa, Sayed S. R.

    2015-08-01

    Propagation of surface waves can occur with complex energy distribution amongst the various modes. It is shown that even simple VS (shear-wave velocity) profiles can generate velocity spectra that, because of complex mode excitation, can be quite difficult to interpret in terms of modal dispersion curves. In some cases, Rayleigh waves show relevant differences depending on the considered component (radial or vertical) and the kind of source (vertical impact or explosive). Contrary to several simplistic assumptions often proposed, it is shown, via both synthetic and field datasets, that the fundamental mode of Rayleigh waves can be almost completely absent. This sort of evidence demonstrates the importance of a multi-component analysis capable of providing the elements necessary to properly interpret the data and adequately constrain the subsurface model. It is also shown, through the sole use of horizontal geophones, how both Love and Rayleigh (radial-component) waves can be acquired efficiently and quickly. The presented field dataset reports a case where Rayleigh waves (both their vertical and radial components) appear largely dominated by higher modes, with little or no evidence of the fundamental mode. The radial and vertical components of Rayleigh waves are inverted jointly with Love waves by adopting a multi-objective inversion scheme based on the computation of synthetic seismograms for the three considered components and the minimization of the misfits of the whole velocity spectra (Full Velocity Spectra, FVS, inversion). Such an FVS multi-component joint inversion can better handle complex velocity spectra, thus providing a more robust subsurface model that is not affected by erroneous velocity-spectra interpretations and non-uniqueness of the solution.

  17. Bayesian Analysis to Identify New Star Candidates in Nearby Young Stellar Kinematic Groups

    NASA Astrophysics Data System (ADS)

    Malo, Lison; Doyon, René; Lafrenière, David; Artigau, Étienne; Gagné, Jonathan; Baron, Frédérique; Riedel, Adric

    2013-01-01

    We present a new method based on a Bayesian analysis to identify new members of nearby young kinematic groups. The analysis minimally takes into account the position, proper motion, magnitude, and color of a star, but other observables can be readily added (e.g., radial velocity, distance). We use this method to find new young low-mass stars in the β Pictoris and AB Doradus moving groups and in the TW Hydrae, Tucana-Horologium, Columba, Carina, and Argus associations. Starting from a sample of 758 mid-K to mid-M (K5V-M5V) stars showing youth indicators such as Hα and X-ray emission, our analysis yields 214 new highly probable low-mass members of the kinematic groups analyzed: 1 in TW Hydrae, 37 in β Pictoris, 17 in Tucana-Horologium, 20 in Columba, 6 in Carina, 50 in Argus, and 32 in AB Doradus, while the remaining 51 candidates are likely young but have ambiguous membership in more than one association. The false-alarm rate for new candidates is estimated to be 5% for β Pictoris and TW Hydrae, 10% for Tucana-Horologium, Columba, Carina, and Argus, and 14% for AB Doradus. Our analysis confirms the membership of 58 stars proposed in the literature. Firm membership confirmation of our new candidates will require measurement of their radial velocity (predicted by our analysis), parallax, and lithium 6708 Å equivalent width. We have initiated these follow-up observations for a number of candidates, and we have identified two stars (2MASS J01112542+1526214, 2MASS J05241914-1601153) as very strong candidate members of the β Pictoris moving group and one strong candidate member (2MASS J05332558-5117131) of the Tucana-Horologium association; these three stars have radial velocity measurements confirming their membership and lithium detections consistent with a young age. Based on observations obtained at the Canada-France-Hawaii Telescope (CFHT), which is operated by the National Research Council of Canada, the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique of France, and the University of Hawaii.
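
    The core Bayesian step can be illustrated with a two-hypothesis toy version: a Gaussian model for one observable per group, combined through Bayes' theorem into a membership probability. The means, covariances, and priors below are invented; the actual analysis uses position, proper motion, magnitude, and color.

      import numpy as np
      from scipy.stats import multivariate_normal

      # Hypothetical Gaussian models: (mean, covariance, prior) per hypothesis
      groups = {
          "beta Pic": (np.array([ 45.0, -60.0]), np.diag([15.0, 15.0])**2, 0.01),
          "field":    (np.array([  0.0,   0.0]), np.diag([80.0, 80.0])**2, 0.99),
      }
      x = np.array([40.0, -55.0])   # observed proper motion (mas/yr), invented

      # Bayes' theorem: posterior proportional to prior * likelihood
      post = {g: pri * multivariate_normal.pdf(x, mu, cov)
              for g, (mu, cov, pri) in groups.items()}
      z = sum(post.values())
      for g in post:
          print(g, round(post[g] / z, 3))   # normalized membership probabilities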

  18. Use of Bayesian Inference in Crystallographic Structure Refinement via Full Diffraction Profile Analysis.

    PubMed

    Fancher, Chris M; Han, Zhen; Levin, Igor; Page, Katharine; Reich, Brian J; Smith, Ralph C; Wilson, Alyson G; Jones, Jacob L

    2016-01-01

    A Bayesian inference method for refining crystallographic structures is presented. The distribution of model parameters is stochastically sampled using Markov chain Monte Carlo. Posterior probability distributions are constructed for all model parameters to properly quantify uncertainty by appropriately modeling the heteroskedasticity and correlation of the error structure. The proposed method is demonstrated by analyzing a National Institute of Standards and Technology silicon standard reference material. The results obtained by Bayesian inference are compared with those determined by Rietveld refinement. Posterior probability distributions of model parameters provide both estimates and uncertainties. The new method better estimates the true uncertainties in the model as compared to the Rietveld method. PMID:27550221

  20. Bayesian Analysis for Risk Assessment of Selected Medical Events in Support of the Integrated Medical Model Effort

    NASA Technical Reports Server (NTRS)

    Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.

    2012-01-01

    The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them make informed decisions about the medical preparations and supplies needed for combating and treating various medical events, using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical events occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical analysis is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical events occurring in flight. Bayes' theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. The IMM team performed Bayesian updates on the following medical events: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.
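
    A minimal sketch of such a Bayesian update, using the conjugate Beta-binomial rule with invented counts (the actual IMM updates use condition-specific priors and data):

      from scipy.stats import beta

      # Prior built from an analog population: ~3 events per 1000 person-years
      a0, b0 = 3, 997
      # Hypothetical in-flight observations: 1 event in 200 person-years
      events, exposure = 1, 200

      a1, b1 = a0 + events, b0 + exposure - events   # posterior is Beta(a1, b1)
      lo, hi = beta.ppf([0.025, 0.975], a1, b1)
      print(a1 / (a1 + b1), (lo, hi))   # posterior mean and 95% credible interval

    The posterior mean sits between the analog-population rate and the sparse flight data, which is exactly the pooling behavior the abstract describes.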

  1. Nonlinear Analysis of Bonded Composite Single-Lap Joints

    NASA Technical Reports Server (NTRS)

    Oterkus, E.; Barut, A.; Madenci, E.; Smeltzer, S. S.; Ambur, D. R.

    2004-01-01

    This study presents a semi-analytical solution method to analyze the geometrically nonlinear response of bonded composite single-lap joints with tapered adherend edges under uniaxial tension. The solution method provides the transverse shear and normal stresses in the adhesive, and the in-plane stress resultants and bending moments in the adherends. The method utilizes the principle of virtual work in conjunction with von Karman's nonlinear plate theory to model the adherends, and the shear-lag model to represent the kinematics of the thin adhesive layer between the adherends. Furthermore, the method accounts for the bilinear elastic material behavior of the adhesive while maintaining a linear stress-strain relationship in the adherends. In order to account for the stiffness changes due to thickness variation of the adherends along the tapered edges, their in-plane and bending stiffness matrices are varied as a function of thickness along the tapered region. The combination of these complexities results in a system of nonlinear governing equilibrium equations. This approach represents a computationally efficient alternative to the finite element method. Comparisons are made with corresponding results obtained from finite-element analysis. The results confirm the validity of the solution method. The numerical results show the effects of taper angle, adherend overlap length, and the bilinear adhesive material on the stress fields in the adherends, as well as in the adhesive, of a single-lap joint.

  2. BAYESIAN ANALYSIS OF WHITE NOISE LEVELS IN THE FIVE-YEAR WMAP DATA

    SciTech Connect

    Groeneboom, N. E.; Eriksen, H. K.; Gorski, K.; Huey, G.; Jewell, J.; Wandelt, B.

    2009-09-01

    We develop a new Bayesian method for estimating white noise levels in CMB sky maps, and apply this algorithm to the five-year Wilkinson Microwave Anisotropy Probe (WMAP) data. We assume that the amplitude of the noise rms is scaled by a constant value, α, relative to a pre-specified noise level. We then derive the corresponding conditional density, P(α | s, C_ℓ, d), which is subsequently integrated into a general CMB Gibbs sampler. We first verify our code by analyzing simulated data sets, and then apply the framework to the WMAP data. For the foreground-reduced five-year WMAP sky maps and the nominal noise levels initially provided in the five-year data release, we find that the posterior means typically range between α = 1.005 ± 0.001 and α = 1.010 ± 0.001, depending on differencing assembly, indicating that the noise levels of these maps are biased low by 0.5%-1.0%. The same problem is not observed for the uncorrected WMAP sky maps. After the preprint version of this letter appeared on astro-ph, the WMAP team corrected the values presented on their web page, noting that the initially provided values were in fact estimates from the three-year data release, not the five-year release. However, the correct noise values were used internally in their five-year analysis, and no cosmological results are therefore compromised by this error. Thus, our method has already been demonstrated in practice to be both useful and accurate.
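
    The conditional density for a global noise amplitude can be sketched with synthetic data: if d = s + α·n with known signal s and per-pixel rms σ, the posterior for α follows from the Gaussian likelihood. The grid evaluation below is a stand-in for the Gibbs-sampler step described above; all numbers are illustrative.

      import numpy as np

      rng = np.random.default_rng(1)

      npix, sigma, alpha_true = 10000, 1.0, 1.007
      s = rng.normal(0.0, 3.0, npix)                 # known "signal" map
      d = s + alpha_true * rng.normal(0.0, sigma, npix)

      # log P(alpha | s, d) up to a constant: -N log(alpha) - r2 / (2 alpha^2)
      alphas = np.linspace(0.98, 1.04, 601)
      r2 = np.sum((d - s) ** 2) / sigma**2
      loglike = -npix * np.log(alphas) - 0.5 * r2 / alphas**2
      p = np.exp(loglike - loglike.max()); p /= p.sum()

      mean = np.sum(p * alphas)
      std = np.sqrt(np.sum(p * (alphas - mean) ** 2))
      print(mean, std)    # recovers alpha near 1.007 with spread ~ alpha / sqrt(2 N)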

  3. Impact of breed and sex on porcine endocrine transcriptome: a bayesian biometrical analysis

    PubMed Central

    Pérez-Enciso, Miguel; Ferraz, André LJ; Ojeda, Ana; López-Béjar, Manel

    2009-01-01

    Background: Transcriptome variability is due to genetic and environmental causes, much like any other complex phenotype. Ascertaining the transcriptome differences between individuals is an important step toward understanding how selection and genetic drift may affect gene expression. To that end, extant divergent livestock breeds offer ideal genetic material. Results: We analyzed with microarrays five tissues from the endocrine axis (hypothalamus, adenohypophysis, thyroid gland, gonads, and fat tissue) of 16 pigs of both sexes belonging to four extreme breeds (Duroc, Large White, Iberian, and a cross with a Sino-European hybrid line). Using a Bayesian linear model approach, we observed that the largest breed variability corresponded to the male gonads and was larger than at the remaining tissues, including ovaries. Measurement of sex hormones in peripheral blood at slaughter did not detect any breed-related differences. Not unexpectedly, the gonads were the tissue with the largest number of sex-biased genes. There was a strong correlation between sex- and breed-biased expression, although the most breed-biased genes were not the most sex-biased genes. A combined analysis of connectivity and differential expression suggested three biological processes as being primarily different between breeds: spermatogenesis, muscle differentiation, and several metabolic processes. Conclusion: These results suggest that differences across breeds in gene expression of the male gonads are larger than in other endocrine tissues in the pig. Nevertheless, the strong presence of breed-biased genes in the male gonads cannot be explained solely by changes in spermatogenesis or by differences in reproductive tract development. PMID:19239697

  4. Bayesian analysis of stage-fall-discharge rating curves and their uncertainties

    NASA Astrophysics Data System (ADS)

    Mansanarez, Valentin; Le Coz, Jérôme; Renard, Benjamin; Lang, Michel; Pierrefeu, Gilles; Le Boursicaud, Raphaël; Pobanz, Karine

    2016-04-01

    Stage-fall-discharge (SFD) rating curves are traditionally used to compute streamflow records at sites where the energy slope of the flow is variable due to variable backwater effects. Building on existing Bayesian approaches, we introduce an original hydraulics-based method for developing SFD rating curves at twin gauge stations and estimating their uncertainties. Conventional power functions for channel and section controls are used, and the transition to a backwater-affected channel control is computed based on a continuity condition, solved either analytically or numerically. The difference between the reference levels at the two stations is estimated as another uncertain parameter of the SFD model. The proposed method incorporates information from both hydraulic knowledge (equations of channel and section controls) and the stage-fall-discharge observations (gauging data). The resulting total uncertainty combines the parametric uncertainty and the remnant uncertainty related to the rating curve model. The method provides a direct estimation of the physical inputs of the rating curve (roughness, width, bed slope, distance between twin gauges, etc.). The performance of the new method is tested on an application case affected by the variable backwater of a run-of-the-river dam: the Rhône River at Valence, France. In particular, a sensitivity analysis to the prior information and to the gauging dataset is performed. At that site, the stage-fall-discharge domain is well documented, with gaugings conducted over a range of backwater-affected and unaffected conditions. The performance of the new model was deemed satisfactory. Notably, the transition to uniform flow when the overall range of the auxiliary stage is gauged is correctly simulated. The resulting curves are in good agreement with the observations (gaugings), and their uncertainty envelopes are acceptable for computing streamflow records. Similar…
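
    A sketch of the conventional power-function control with parametric uncertainty propagated by sampling; the full SFD model adds the backwater-affected control and the remnant error term, and all parameter values below are illustrative, not those of the Valence station.

      import numpy as np

      rng = np.random.default_rng(2)

      # Posterior-like samples of the control parameters in Q = a * (h - b)^c
      a = rng.normal(30.0, 3.0, 20000)    # coefficient
      b = rng.normal(0.2, 0.05, 20000)    # offset: stage of zero flow (m)
      c = rng.normal(1.6, 0.05, 20000)    # exponent (~5/3 for a wide channel control)

      h = 1.5                             # stage at which to evaluate the curve (m)
      q = a * np.clip(h - b, 0.0, None) ** c
      print(np.percentile(q, [2.5, 50, 97.5]))   # discharge with a 95% uncertainty band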

  5. Mapping, Bayesian Geostatistical Analysis and Spatial Prediction of Lymphatic Filariasis Prevalence in Africa

    PubMed Central

    Slater, Hannah; Michael, Edwin

    2013-01-01

    There is increasing interest in controlling or eradicating the major neglected tropical diseases. Accurate modelling of the geographic distributions of parasitic infections will be crucial to this endeavour. We used 664 community-level infection prevalence data points collated from the published literature, in conjunction with eight environmental variables, altitude, and population density, and a multivariate Bayesian generalized linear spatial model that allows explicit accounting for spatial autocorrelation and incorporation of uncertainty in input data and model parameters, to construct the first spatially explicit map describing the distribution of lymphatic filariasis (LF) prevalence in Africa. We also ran the best-fit model against predictions made by the HADCM3 and CCCMA climate models for 2050 to predict the likely distributions of LF under future climate and population changes. We show that LF prevalence is strongly influenced by spatial autocorrelation between locations but is only weakly associated with environmental covariates. Infection prevalence, however, is found to be related to variations in population density. All associations with key environmental/demographic variables appear to be complex and non-linear. LF prevalence is predicted to be highly heterogeneous across Africa, with high prevalences (>20%) estimated to occur primarily along coastal West and East Africa, and the lowest prevalences predicted for the central part of the continent. Error maps, however, indicate a need for further surveys to overcome problems with data scarcity in the latter and other regions. Analysis of future changes in prevalence indicates that population growth, rather than climate change per se, will represent the dominant factor in the predicted increase/decrease and spread of LF on the continent. We indicate that these results could play an important role in aiding the development of strategies that are best able to achieve the goals of parasite elimination locally and globally in a manner that may also account…

  6. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. The approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon the existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimates may be large. Our 'hybrid' approach of combining local and global monitoring data with expert judgment facilitates discussion and constructive debate across disciplines, including seismology, gas geochemistry, geodesy, petrology, physical volcanology, and technology/engineering, where differences of opinion between response team members contribute to defining the uncertainty in the probability estimates. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that a rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the…
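
    The arithmetic of an event tree is simply the multiplication of consensus conditional probabilities along a branch, as the toy calculation below shows; the node names and values are invented for illustration, not VDAP numbers.

      # Hypothetical consensus probabilities for one branch of a volcanic event tree
      p_magmatic = 0.6            # P(unrest is magmatic | unrest)
      p_eruption = 0.4            # P(eruption | magmatic unrest), within the time frame
      p_explosive = 0.7           # P(VEI >= 3 | eruption)

      p_eruption_branch = p_magmatic * p_eruption            # 0.24
      p_explosive_branch = p_eruption_branch * p_explosive   # 0.168
      print(p_eruption_branch, p_explosive_branch)

    In practice each node value is revisited as monitoring data change, and the whole tree is re-evaluated for the new time frame.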

  7. New class of hybrid EoS and Bayesian M-R data analysis

    NASA Astrophysics Data System (ADS)

    Alvarez-Castillo, D.; Ayriyan, A.; Benic, S.; Blaschke, D.; Grigorian, H.; Typel, S.

    2016-03-01

    We explore systematically a new class of two-phase equations of state (EoS) for hybrid stars that is characterized by three main features: (1) stiffening of the nuclear EoS at supersaturation densities due to quark exchange effects (Pauli blocking) between hadrons, modelled by an excluded-volume correction; (2) stiffening of the quark matter EoS at high densities due to multiquark interactions; and (3) the possibility of a strong first-order phase transition with an early onset and a large density jump. The third feature results from a Maxwell construction for the possible transition from the nuclear to a quark matter phase, and its properties depend on the two parameters used for (1) and (2), respectively. Varying these two parameters, one obtains a class of hybrid EoS that yields solutions of the Tolman-Oppenheimer-Volkoff (TOV) equations for sequences of hadronic and hybrid stars in the mass-radius diagram which cover the full range of patterns of the Alford-Han-Prakash classification, according to which a hybrid star branch can be either absent, connected, or disconnected from the hadronic one. The latter case often includes a tiny connected branch. The disconnected hybrid star branch, also called the "third family", corresponds to high-mass twin stars characterized by the same gravitational mass but different radii. We perform a Bayesian analysis and demonstrate that the observation of such a pair of high-mass twin stars would have sufficient discriminating power to favor hybrid EoS with a strong first-order phase transition over alternative EoS.

  8. Bayesian Analysis of Non-Gaussian Long-Range Dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, T.; Franzke, C.; Gramacy, R. B.; Watkins, N. W.

    2012-12-01

    Recent studies have strongly suggested that surface temperatures exhibit long-range dependence (LRD). The presence of LRD would hamper the identification of deterministic trends and the quantification of their significance. It is well established that LRD processes exhibit stochastic trends over rather long periods of time. Thus, accurate methods for discriminating between physical processes that possess long memory and those that do not are an important adjunct to climate modeling. We have used Markov chain Monte Carlo algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average (ARFIMA) processes, which are capable of modeling LRD. Our principal aim is to obtain inference about the long-memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short-memory component. We initially assume Gaussianity and have tested the method on both synthetic and physical time series, such as the Central England Temperature. Many physical processes, for example the Faraday time series from Antarctica, are highly non-Gaussian. We have therefore extended this work by weakening the Gaussianity assumption; specifically, we assume a symmetric α-stable distribution for the innovations. Such processes provide good, flexible initial models for non-Gaussian processes with long memory. We will present a study of the dependence of the posterior variance σ_d² of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other measures of d.
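
    A quick way to see the long-memory behaviour being modeled is to simulate an ARFIMA(0, d, 0) process from the truncated MA(∞) expansion of (1-B)^(-d), using the standard recursion ψ_0 = 1, ψ_k = ψ_{k-1}·(k-1+d)/k. This is a sketch of the process itself, not the authors' reversible-jump sampler.

      import numpy as np

      rng = np.random.default_rng(3)

      def arfima_0d0(n, d, k_max=1000):
          # MA(inf) coefficients of (1 - B)^(-d), truncated at k_max lags
          k = np.arange(1, k_max + 1)
          psi = np.concatenate(([1.0], np.cumprod((k - 1 + d) / k)))
          eps = rng.normal(size=n + k_max)
          # x_t = sum_k psi_k * eps_{t-k}; keep only fully initialized samples
          return np.convolve(eps, psi, mode="full")[k_max:k_max + n]

      x = arfima_0d0(5000, d=0.3)
      # slowly decaying autocorrelation is the signature of long-range dependence
      print([round(np.corrcoef(x[:-l], x[l:])[0, 1], 3) for l in (1, 10, 100)])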

  9. BAYESIAN META-ANALYSIS ON MEDICAL DEVICES: APPLICATION TO IMPLANTABLE CARDIOVERTER DEFIBRILLATORS

    PubMed Central

    Youn, Ji-Hee; Lord, Joanne; Hemming, Karla; Girling, Alan; Buxton, Martin

    2012-01-01

    Objectives: The aim of this study is to describe and illustrate a method to obtain early estimates of the effectiveness of a new version of a medical device. Methods: In the absence of empirical data, expert opinion may be elicited on the expected difference between the conventional and modified devices. Bayesian Mixed Treatment Comparison (MTC) meta-analysis can then be used to combine this expert opinion with existing trial data on earlier versions of the device. We illustrate this approach for a new four-pole implantable cardioverter defibrillator (ICD) compared with conventional ICDs, Class III anti-arrhythmic drugs, and conventional drug therapy for the prevention of sudden cardiac death in high-risk patients. Existing RCTs were identified from a published systematic review, and we elicited opinion on the difference between four-pole and conventional ICDs from experts recruited at a cardiology conference. Results: Twelve randomized controlled trials were identified. Seven experts provided valid probability distributions for the new ICDs compared with current devices. The MTC model resulted in estimated relative risks of mortality of 0.74 (0.60–0.89) (predictive relative risk [RR] = 0.77 [0.41–1.26]) and 0.83 (0.70–0.97) (predictive RR = 0.84 [0.55–1.22]) with the new ICD therapy compared to Class III anti-arrhythmic drug therapy and conventional drug therapy, respectively. These results showed negligible differences from the preliminary results for the existing ICDs. Conclusions: The proposed method incorporating expert opinion to adjust for a modification made to an existing device may play a useful role in assisting decision makers to make early informed judgments on the effectiveness of frequently modified healthcare technologies. PMID:22559753
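
    The flavour of combining elicited opinion with trial evidence can be sketched on the log relative-risk scale with a normal-normal shortcut; the numbers below are illustrative, and the study itself fits a full Bayesian MTC model rather than this two-line approximation.

      import numpy as np

      # Trial evidence for the conventional device vs a comparator (log-RR scale)
      m_trials, s_trials = np.log(0.83), 0.08
      # Elicited expert increment for the modified device vs the conventional one
      m_expert, s_expert = np.log(0.95), 0.10

      # New-device effect = trial effect + elicited increment (independent normals)
      m_new = m_trials + m_expert
      s_new = np.hypot(s_trials, s_expert)
      lo, hi = np.exp(m_new - 1.96 * s_new), np.exp(m_new + 1.96 * s_new)
      print(np.exp(m_new), (lo, hi))   # RR of the new device with a 95% interval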

  10. Cancer mortality inequalities in urban areas: a Bayesian small area analysis in Spanish cities

    PubMed Central

    2011-01-01

    Background: Intra-urban inequalities in mortality have been infrequently analysed in European contexts. The aim of the present study was to analyse patterns of cancer mortality and their relationship with socioeconomic deprivation in small areas of 11 Spanish cities. Methods: This is a cross-sectional ecological design using mortality data (years 1996-2003). The units of analysis were census tracts. A deprivation index was calculated for each census tract. To control the variability in estimating the risk of dying, we used Bayesian models. We present the RR of the census tract with the highest deprivation vs. the census tract with the lowest deprivation. Results: Among men, socioeconomic inequalities are observed in total cancer mortality in all cities except Castellón, Córdoba and Vigo, with Barcelona (RR = 1.53, 95% CI 1.42-1.67), Madrid (RR = 1.57, 95% CI 1.49-1.65) and Seville (RR = 1.53, 95% CI 1.36-1.74) presenting the greatest inequalities. In general, Barcelona and Madrid present inequalities for most types of cancer. Among women, inequalities in total cancer mortality were found only in Barcelona and Zaragoza. The excess number of cancer deaths attributable to socioeconomic deprivation was 16,413 for men and 1,142 for women. Conclusion: This study analysed inequalities in cancer mortality in small areas of Spanish cities, not only relating this mortality to socioeconomic deprivation but also calculating the excess mortality that may be attributed to such deprivation. This knowledge is particularly useful for determining which geographical areas in each city need intersectoral policies in order to promote a healthy environment. PMID:21232096

  11. Analysis of Bonded Joints Between the Facesheet and Flange of Corrugated Composite Panels

    NASA Technical Reports Server (NTRS)

    Yarrington, Phillip W.; Collier, Craig S.; Bednarcyk, Brett A.

    2008-01-01

    This paper outlines a method for the stress analysis of bonded composite corrugated panel facesheet to flange joints. The method relies on the existing HyperSizer Joints software, which analyzes the bonded joint, along with a beam analogy model that provides the necessary boundary loading conditions to the joint analysis. The method is capable of predicting the full multiaxial stress and strain fields within the flange to facesheet joint and thus can determine ply-level margins and evaluate delamination. Results comparing the method to NASTRAN finite element model stress fields are provided illustrating the accuracy of the method.

  12. SRB environment evaluation and analysis. Volume 2: RSRB joint filling test/analysis improvements

    NASA Astrophysics Data System (ADS)

    Knox, E. C.; Woods, G. Hamilton

    1991-09-01

    Following the Challenger accident, a very comprehensive solid rocket booster (SRB) redesign program was initiated. One objective of the program was to develop expertise at NASA/MSFC in techniques for analyzing the flow of hot gases in the SRB joints. Several test programs were undertaken to provide a database of joint performance with manufactured defects that allow hot gases to fill the joints. This database was also used to develop the analytical techniques. Among the test programs were the Joint Environment Simulator (JES), Nozzle Joint Environment Simulator (NJES), Transient Pressure Test Article (TPTA), and Seventy-Pound Charge (SPC). In 1988 the TPTA test hardware was moved from the Utah site to MSFC, and several RSRM tests were scheduled, to be followed by tests for the ASRM program. REMTECH Inc. supported these activities with pretest estimates of the flow conditions in the test joints and post-test analysis and evaluation of the measurements. During this support, REMTECH identified deficiencies in the gas-measurement instrumentation in the TPTA hardware, made recommendations for its replacement, and identified improvements to the analytical tools used in the test support. Only one test was completed under the TPTA RSRM test program, and those scheduled for the ASRM were rescheduled to a time after the expiration of this contract. The attention of this effort was directed toward improvements in the analytical techniques in preparation for the start of the ASRM program.

  14. Genome-wide association study of swine farrowing traits. Part II: Bayesian analysis of marker data.

    PubMed

    Schneider, J F; Rempel, L A; Snelling, W M; Wiedmann, R T; Nonneman, D J; Rohrer, G A

    2012-10-01

    Reproductive efficiency has a great impact on the economic success of pork (Sus scrofa) production. Number born alive (NBA) and average piglet birth weight (ABW) contribute greatly to reproductive efficiency. To better understand the underlying genetics of birth traits, a genome-wide association study (GWAS) was undertaken. DNA samples were collected from 1,152 first-parity gilts and genotyped using the Illumina PorcineSNP60 BeadChip. Traits included total number born (TNB), NBA, number born dead (NBD), number stillborn (NSB), number of mummies (MUM), total litter birth weight (LBW), and ABW. A total of 41,151 SNP were tested using a Bayesian approach. Beginning with the first 5 SNP on SSC1 and ending with the last 5 SNP on SSCX, SNP were assigned to groups of 5 consecutive SNP in chromosome-position order and analyzed again using a Bayesian approach. From that analysis, 5-SNP groups were selected that had no overlap with other 5-SNP groups and no overlap across chromosomes. These selected non-overlapping 5-SNP groups were defined as QTL. Of the available 8,814 QTL, 124 were found to be statistically significant (P < 0.01). Multiple testing was considered using the probability of false positives. Eleven QTL were found for TNB: 3 on SSC1, 3 on SSC4, 1 on SSC13, 1 on SSC14, 2 on SSC15, and 1 on SSC17. Statistical testing for NBA identified 14 QTL: 4 on SSC1, 1 on SSC4, 1 on SSC6, 1 on SSC10, 1 on SSC13, 3 on SSC15, and 3 on SSC17. A single NBD QTL was found on SSC11. No QTL were identified for NSB or MUM. Thirty-three QTL were found for LBW: 3 on SSC1, 1 on SSC2, 1 on SSC3, 5 on SSC4, 2 on SSC5, 5 on SSC6, 3 on SSC7, 2 on SSC9, 1 on SSC10, 2 on SSC14, 6 on SSC15, and 2 on SSC17. A total of 65 QTL were found for ABW: 9 on SSC1, 3 on SSC2, 9 on SSC5, 5 on SSC6, 1 on SSC7, 2 on SSC8, 2 on SSC9, 3 on SSC10, 1 on SSC11, 3 on SSC12, 2 on SSC13, 8 on SSC14, 8 on SSC15, 1 on SSC17, and 8 on SSC18. Several candidate genes have been identified that overlap QTL locations.

  15. Bayesian Analysis of Structural Equation Models with Nonlinear Covariates and Latent Variables

    ERIC Educational Resources Information Center

    Song, Xin-Yuan; Lee, Sik-Yum

    2006-01-01

    In this article, we formulate a nonlinear structural equation model (SEM) that can accommodate covariates in the measurement equation and nonlinear terms of covariates and exogenous latent variables in the structural equation. The covariates can come from continuous or discrete distributions. A Bayesian approach is developed to analyze the…

  16. Exemplars, Prototypes, Similarities, and Rules in Category Representation: An Example of Hierarchical Bayesian Analysis

    ERIC Educational Resources Information Center

    Lee, Michael D.; Vanpaemel, Wolf

    2008-01-01

    This article demonstrates the potential of using hierarchical Bayesian methods to relate models and data in the cognitive sciences. This is done using a worked example that considers an existing model of category representation, the Varying Abstraction Model (VAM), which attempts to infer the representations people use from their behavior in…

  17. GLOBAL RELATIONSHIPS OF BEMISIA TABACI (HEMIPTERA: ALEYRODIDAE) REVEALED USING BAYESIAN ANALYSIS OF MITOCHONDRIAL COI DNA SEQUENCE

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Global phylogenetic relationships of the major races of B. tabaci remain unresolved; thus, a Bayesian phylogenetic technique was utilized to elucidate affinities. All COI DNA sequence data available in GenBank for B. tabaci worldwide (369 specimens) were obtained, and the first well resolved phylogen…

  18. IMPLICATIONS OF USING ROBUST BAYESIAN ANALYSIS TO REPRESENT DIVERSE SOURCES OF UNCERTAINTY IN INTEGRATED ASSESSMENT

    EPA Science Inventory

    In our previous research, we showed that robust Bayesian methods can be used in environmental modeling to define a set of probability distributions for key parameters that captures the effects of expert disagreement, ambiguity, or ignorance. This entire set can then be updated…

  19. Elastic-plastic analysis of crack in ductile adhesive joint

    SciTech Connect

    Ikeda, Toru; Miyazaki, Noriyuki; Yamashita, Akira; Munakata, Tsuyoshi

    1995-11-01

    The fracture of a crack in an adhesive layer is important to the structural integrity of adhesive structures and composite materials. Although the fracture toughness of a material should be constant according to fracture mechanics, the fracture toughness of a crack in an adhesive joint is said to depend on the bond thickness. In the present study, elastic-plastic stress analyses of a crack in a thin adhesive layer are performed with a combination of the boundary element method and the finite element method. The effect of adhesive thickness on the J-integral, the Q′-factor (a modified version of the Q-factor), and the crack tip opening displacement (CTOD) is investigated. It is found from the analyses that the CTOD begins to decrease at very small bond thicknesses, while the Q′-factor remains almost constant. The decrease of fracture toughness for very thin adhesive layers is thus expected from the present analysis.

  20. Magellan/Galileo solder joint failure analysis and recommendations

    NASA Technical Reports Server (NTRS)

    Ross, Ronald G., Jr.

    1989-01-01

    On or about November 10, 1988, an open-circuit solder joint was discovered in the Magellan Radar digital unit (DFU) during integration testing at Kennedy Space Center (KSC). A detailed analysis of the cause of the failure was conducted at the Jet Propulsion Laboratory, leading to the successful repair of many pieces of affected electronic hardware on both the Magellan and Galileo spacecraft. The problem was caused by the presence of high-thermal-coefficient-of-expansion heat sink and conformal coating materials located in the large (0.055 inch) gap between dual inline packages (DIPs) and the printed wiring board. The details of the observed problems are described, and recommendations are made for improved design and testing activities in the future.

  1. Failure Analysis in Space: International Space Station (ISS) Starboard Solar Alpha Rotary Joint (SARJ) Debris Analysis

    NASA Technical Reports Server (NTRS)

    Long, V. S.; Wright, M. C.; McDanels, S. J.; Lubas, D.; Tucker, B.; Marciniak, P. J.

    2010-01-01

    This slide presentation reviews the debris analysis of the Starboard Solar Alpha Rotary Joint (SARJ), a mechanism designed to keep the solar arrays facing the sun. The goal of the analysis was to identify the failure mechanism based on surface morphology and to determine the source of the debris through elemental and particle analysis.

  2. Kinematic and dynamic analysis of an anatomically based knee joint.

    PubMed

    Lee, Kok-Meng; Guo, Jiajie

    2010-05-01

    This paper presents a knee-joint model to provide a better understanding of the interaction between natural joints and artificial mechanisms for the design and control of rehabilitation exoskeletons. The anatomically based knee model relaxes several assumptions commonly made in exoskeleton design that approximate the human knee as an engineering pin joint. Based on published MRI data, we formulate the kinematics of the knee joint and compare three mathematical approximations: one model is based on two sequential circles rolling on a flat plane, and the other two are mathematically differentiable ellipse-based models, with and without sliding at the contact. The ellipse-based model that takes sliding contact into account shows that the rolling-sliding ratio of a knee joint is not constant but has an average value consistent with published measurements. This knee-joint kinematics leads to a physically more accurate contact-point trajectory than methods based on multiple circles or lines, and it provides a basis for deriving a knee-joint kinetic model with which the effects of a planar exoskeleton mechanism on the internal joint forces and torque during flexion can be numerically investigated. Two different knee-joint kinetic models (a pin-joint approximation and the anatomically based model) are compared against a condition with no exoskeleton. The leg and exoskeleton form a closed kinematic chain that has a significant effect on the joint forces in the knee. The human knee is more tolerant than a pin joint in negotiating around a singularity, but its internal forces increase with the exoskeleton mass-to-length ratio. An oversimplified pin-joint approximation cannot capture the finite change in the knee forces due to the singularity effect. PMID:20189182

  3. Bayesian Information-Gap Decision Analysis Applied to a CO2 Leakage Problem

    NASA Astrophysics Data System (ADS)

    O'Malley, D.; Vesselinov, V. V.

    2014-12-01

    We describe a decision analysis in the presence of uncertainty that combines a non-probabilistic approach (information-gap decision theory) with a probabilistic approach (Bayes' theorem). Bayes' theorem is one of the most popular techniques for probabilistic uncertainty quantification (UQ). It is effective in many situations because it updates our understanding of the uncertainties by conditioning on real data using a mathematically rigorous technique. However, the application of Bayes' theorem in science and engineering is not always rigorous, for two reasons: (1) we can enumerate the possible outcomes of dice-rolling, but not the possible outcomes of real-world contamination remediation; (2) we can precisely determine conditional probabilities for coin-tossing, but substantial uncertainty surrounds the conditional probabilities for real-world contamination remediation. Of course, Bayes' theorem is rigorously applicable beyond dice-rolling and coin-tossing, but even in cases that are constructed to be simple, with ostensibly good probabilistic models, applying Bayes' theorem to the real world may not work as well as one might expect. Bayes' theorem is rigorously applicable only if all possible events can be described and their conditional probabilities derived rigorously. Outside of this domain it may still be useful, but its use lacks at least some rigor. The information-gap approach allows us to circumvent some of these shortcomings of Bayes' theorem. In particular, it provides a way to account for possibilities beyond those described by our models, and a way to deal with uncertainty in the conditional distribution that forms the core of Bayesian analysis. We have developed a three-tiered technique that enables one to make scientifically defensible decisions in the face of severe uncertainty, such as is found in many geologic problems. To demonstrate its applicability, we apply the technique to a CO2 leakage problem. The goal is to…
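
    A minimal sketch of the information-gap layer described above: starting from a Bayesian point estimate, expand a nested uncertainty set and find the largest horizon at which the worst case still meets a critical requirement. The leakage "model", parameter, and threshold below are entirely hypothetical.

      import numpy as np

      theta_hat = 1.0e-3     # Bayesian posterior estimate of a leak parameter (hypothetical)
      q_crit = 5.0           # maximum tolerable leakage (arbitrary units)

      def worst_case_leak(h):
          # Nested uncertainty set: theta in [theta_hat*(1-h), theta_hat*(1+h)];
          # for a monotone model, the worst case sits at the upper edge.
          return 2.0e3 * theta_hat * (1.0 + h)   # hypothetical leakage model

      hs = np.linspace(0.0, 3.0, 3001)
      ok = np.array([worst_case_leak(h) <= q_crit for h in hs])
      robustness = hs[ok].max() if ok.any() else 0.0
      print(robustness)   # decisions with larger robustness tolerate more surprise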

  4. Rating locomotive crew diesel emission exposure profiles using statistics and Bayesian Decision Analysis.

    PubMed

    Hewett, Paul; Bullock, William H

    2014-01-01

    For more than 20 years, CSX Transportation (CSXT) has collected exposure measurements from locomotive engineers and conductors who are potentially exposed to diesel emissions. The database includes measurements for elemental and total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, carbon monoxide, and nitrogen dioxide. This database was statistically analyzed and summarized, and the resulting statistics and exposure profiles were compared to relevant occupational exposure limits (OELs) using both parametric and non-parametric descriptive and compliance statistics. Exposure ratings, using the American Industrial Hygiene Association (AIHA) exposure categorization scheme, were determined using both the compliance statistics and Bayesian Decision Analysis (BDA). The statistical analysis of the elemental carbon data (a marker for diesel particulate) strongly suggests that the majority of levels in the cabs of the lead locomotives (n = 156) were less than the California guideline of 0.020 mg/m³. The sample 95th percentile was roughly half the guideline, resulting in an AIHA exposure rating of category 2/3 (determined using BDA). The elemental carbon (EC) levels in the trailing locomotives tended to be greater than those in the lead locomotive; however, locomotive crews rarely ride in the trailing locomotive. Lead-locomotive EC levels were similar to those reported by other investigators studying locomotive crew exposures and to levels measured in urban areas. Lastly, both the EC sample mean and the 95% UCL were less than the Environmental Protection Agency (EPA) reference concentration of 0.005 mg/m³. With the exception of nitrogen dioxide, the overwhelming majority of the measurements for total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, and combustion gases in the cabs of CSXT locomotives were either non-detects or considerably less than the working OELs for the years represented in the database. When compared to the previous American…
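
    The rating logic can be sketched as follows: fit a lognormal to the measurements, estimate the 95th percentile, and assess how confidently it falls below a fraction of the OEL. Here a simple bootstrap stands in for the full Bayesian Decision Analysis, and the data are simulated, not CSXT measurements.

      import numpy as np

      rng = np.random.default_rng(4)

      # Simulated elemental-carbon measurements (mg/m^3); invented, not CSXT data
      x = rng.lognormal(mean=np.log(0.008), sigma=0.7, size=50)
      oel = 0.020                      # guideline used above, mg/m^3

      logx = np.log(x)
      p95 = np.exp(logx.mean() + 1.645 * logx.std(ddof=1))   # lognormal 95th percentile

      # Bootstrap stand-in for the posterior uncertainty of the 95th percentile
      boots = []
      for _ in range(2000):
          b = np.log(rng.choice(x, x.size))
          boots.append(np.exp(b.mean() + 1.645 * b.std(ddof=1)))
      boots = np.array(boots)

      # AIHA-style question: how confident are we the 95th percentile is below 50% of the OEL?
      print(p95, (boots < 0.5 * oel).mean())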

  5. Bayesian Multiscale Analysis of X-Ray Jet Features in High Redshift Quasars

    NASA Astrophysics Data System (ADS)

    McKeough, Kathryn; Siemiginowska, A.; Kashyap, V.; Stein, N.

    2014-01-01

    X-ray emission of powerful quasar jets may be a result of the inverse Compton (IC) process, in which Cosmic Microwave Background (CMB) photons gain energy through interactions with the jet's relativistic electrons. However, there is no definitive evidence that the IC/CMB process is responsible for the observed X-ray emission of large-scale jets. A step toward understanding the X-ray emission process is to study the radio and X-ray morphologies of the jet. We implement a sophisticated Bayesian image analysis program, Low-count Image Reconstruction and Analysis (LIRA) (Esch et al. 2004; Connors & van Dyk 2007), to analyze jet features in 11 Chandra images of high-redshift quasars (z ~ 2-4.8). Of the 36 regions where knots are visible in the radio jets, nine showed detectable X-ray emission. We measured the ratios of the X-ray and radio luminosities of the detected features and found that they are consistent with the IC/CMB relationship. We derived a range of the bulk Lorentz factor (Γ) for the detected jet features under the IC/CMB jet emission model. There is no discernible trend of Γ with redshift within the sample. The efficiency of the X-ray emission between the detected jet feature and the corresponding quasar also shows no correlation with redshift. This work is supported in part by the National Science Foundation REU and the Department of Defense ASSURE programs under NSF Grant no. 1262851, by the Smithsonian Institution, and by NASA Contract NAS8-39073 to the Chandra X-ray Center (CXC). This research has made use of data obtained from the Chandra Data Archive and Chandra Source Catalog, and software provided by the CXC in the application packages CIAO, ChIPS, and Sherpa. We thank Teddy Cheung for providing the VLA radio images. Connors, A., & van Dyk, D. A. 2007, Statistical Challenges in Modern Astronomy IV, 371, 101. Esch, D. N., Connors, A., Karovska, M., & van Dyk, D. A. 2004, ApJ, 610, 1213.

  6. Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini,Benjamin; Young, Daniel

    1990-01-01

    A major design and analysis challenge for the JWST ISIM structure is the thermal survivability of metal/composite bonded joints below the cryogenic temperature of 30 K (-405 F). Current bonded joint concepts include internal invar plug fittings, external titanium/invar saddle fittings, and composite gusset/clip joints, all bonded to M55J/954-6 and T300/954-6 hybrid composite tubes (75 mm square). Analytical experience and design work on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited, and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Adding to this challenge is the difficulty of testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce the required analytical tools and to develop a composite failure criterion for bonded joint strength at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate the anticipated stress states in the flight joints; subsequently, the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on the developed techniques and properties.

  7. Analysis of a Preloaded Bolted Joint in a Ceramic Composite Combustor

    NASA Technical Reports Server (NTRS)

    Hissam, D. Andy; Bower, Mark V.

    2003-01-01

    This paper presents the detailed analysis of a preloaded bolted joint incorporating ceramic materials. The objective of this analysis is to determine the suitability of a joint design for a ceramic combustor. The analysis addresses critical factors in bolted joint design including preload, preload uncertainty, and load factor. The relationship between key joint variables is also investigated. The analysis is based on four key design criteria, each addressing an anticipated failure mode. The criteria are defined in terms of margin of safety, which must be greater than zero for the design criteria to be satisfied. Since the proposed joint has positive margins of safety, the design criteria are satisfied. Therefore, the joint design is acceptable.
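
    The preload and margin bookkeeping described above can be sketched in a few lines; all numbers (torque, nut factor, load factor, allowables) are invented for illustration and are not the combustor joint's actual values.

      # Preloaded bolted joint sketch: preload from torque, preload uncertainty,
      # a load factor, and margins of safety that must all be positive.
      t_nom = 10.0          # installation torque, N*m (hypothetical)
      k, d = 0.2, 0.006     # nut factor and bolt diameter (m)
      u = 0.25              # preload uncertainty from torque control
      p_ext = 2000.0        # external load per bolt (N)
      phi = 0.3             # load factor: fraction of external load carried by the bolt
      fs = 1.4              # factor of safety
      p_allow = 12000.0     # bolt tensile allowable (N)

      f_i = t_nom / (k * d)                 # nominal preload, T = K * d * F_i
      f_max = f_i * (1 + u)                 # maximum preload (bolt-strength check)
      f_min = f_i * (1 - u)                 # minimum preload (separation check)

      ms_strength = p_allow / (f_max + fs * phi * p_ext) - 1.0
      ms_separation = f_min / (fs * (1 - phi) * p_ext) - 1.0
      print(ms_strength, ms_separation)     # both margins must exceed zero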

  8. An inelastic analysis of a welded aluminum joint

    NASA Technical Reports Server (NTRS)

    Vaughan, R. E.

    1994-01-01

    Butt-weld joints are most commonly designed into pressure vessels, which then become as reliable as the weakest increment in the weld chain. In practice, weld material properties are determined from tensile test specimens and provided to the stress analyst in the form of a stress versus strain diagram. Variations in properties through the thickness of the weld and along the width of the weld have been suspect but not explored because of inaccessibility and cost. The purpose of this study is to investigate analytical and computational methods used for analysis of welds. The weld specimens are analyzed using classical elastic and plastic theory to provide a basis for modeling the inelastic properties in a finite-element solution. The results of the analysis are compared to experimental data to determine the weld behavior and the accuracy of prediction methods. The weld considered in this study is a multiple-pass aluminum 2219-T87 butt weld with a thickness of 1.40 in. The weld specimen is modeled using the finite-element code ABAQUS. The finite-element model is used to produce the stress-strain behavior in the elastic and plastic regimes and to determine Poisson's ratio in the plastic region. The value of Poisson's ratio in the plastic regime is then compared to experimental data. The results of the comparisons are used to explain multipass weld behavior and to make recommendations concerning the analysis and testing of welds.
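
    The elastic and plastic regimes that the finite element model reproduces are often summarized in closed form with a Ramberg-Osgood fit, which can serve as a sanity check on a model like the one described. The sketch below uses hypothetical constants, not measured 2219-T87 weld properties.

      # Ramberg-Osgood fit of elastic-plastic stress-strain behavior, the kind
      # of closed-form baseline one might compare against a weld FE model.
      # Constants are hypothetical, not measured 2219-T87 weld properties.
      import numpy as np

      E = 10.3e6     # elastic modulus, psi (typical aluminum order of magnitude)
      K = 60.0e3     # strength coefficient, psi (hypothetical)
      n = 0.1        # strain-hardening exponent (hypothetical)

      stress = np.linspace(1.0, 40e3, 5)
      strain = stress / E + (stress / K) ** (1.0 / n)   # elastic + plastic parts
      for s, e in zip(stress, strain):
          print(f"sigma = {s/1e3:6.1f} ksi  ->  eps = {e:.5f}")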

  9. Inelastic Strain Analysis of Solder Joint in NASA Fatigue Specimen

    NASA Technical Reports Server (NTRS)

    Dasgupta, Abhijit; Oyan, Chen

    1991-01-01

    The solder fatigue specimen designed by NASA-GSFC/UNISYS is analyzed in order to obtain the inelastic strain history during two different representative temperature cycles specified by UNISYS. In previous reports (dated July 25, 1990, and November 15, 1990), results were presented for the elastic-plastic and creep analyses, respectively, of the delta T = 31 C cycle. Subsequent results, obtained during the current phase from a viscoplastic finite element analysis of the solder fatigue specimen for the delta T = 113 C cycle, are summarized here. Some common information is repeated for completeness. Large-deformation continuum formulations in conjunction with a standard linear solid model are utilized for modeling the solder constitutive creep-plasticity behavior. Relevant material properties are obtained from the literature. Strain amplitudes, mean strains, and residual strains (as well as stresses) accumulated over a representative complete temperature cycle are obtained as a result of this analysis. The partitioning between elastic strains, time-independent inelastic (plastic) strains, and time-dependent inelastic (creep) strains is also explicitly obtained for the two representative cycles. Detailed plots are presented for the two representative temperature cycles. This information forms an important input for fatigue damage models when predicting the fatigue life of solder joints under thermal cycling.
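
    The inelastic strain range that such an analysis produces is typically fed into a strain-based fatigue damage model. A minimal Coffin-Manson style sketch is shown below; the coefficients are hypothetical placeholders rather than calibrated solder constants.

      # Coffin-Manson style life estimate driven by the inelastic strain range
      # that an analysis like the one above produces. Coefficients are
      # hypothetical placeholders, not calibrated solder constants.

      def cycles_to_failure(delta_eps_inelastic, eps_f=0.325, c=-0.5):
          """Coffin-Manson: delta_eps/2 = eps_f * (2*Nf)**c, solved for Nf."""
          return 0.5 * (delta_eps_inelastic / (2.0 * eps_f)) ** (1.0 / c)

      for d_eps in (0.005, 0.01, 0.02):   # inelastic strain ranges per cycle
          print(f"delta_eps = {d_eps:.3f} -> Nf ~ {cycles_to_failure(d_eps):,.0f} cycles")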

  10. An inelastic analysis of a welded aluminum joint

    NASA Astrophysics Data System (ADS)

    Vaughan, R. E.

    1994-09-01

    Butt-weld joints are most commonly designed into pressure vessels, which then become as reliable as the weakest increment in the weld chain. In practice, weld material properties are determined from tensile test specimens and provided to the stress analyst in the form of a stress versus strain diagram. Variations in properties through the thickness of the weld and along the width of the weld have been suspect but not explored because of inaccessibility and cost. The purpose of this study is to investigate analytical and computational methods used for analysis of welds. The weld specimens are analyzed using classical elastic and plastic theory to provide a basis for modeling the inelastic properties in a finite-element solution. The results of the analysis are compared to experimental data to determine the weld behavior and the accuracy of prediction methods. The weld considered in this study is a multiple-pass aluminum 2219-T87 butt weld with a thickness of 1.40 in. The weld specimen is modeled using the finite-element code ABAQUS. The finite-element model is used to produce the stress-strain behavior in the elastic and plastic regimes and to determine Poisson's ratio in the plastic region. The value of Poisson's ratio in the plastic regime is then compared to experimental data. The results of the comparisons are used to explain multipass weld behavior and to make recommendations concerning the analysis and testing of welds.

  11. Wave-height hazard analysis in Eastern Coast of Spain - Bayesian approach using generalized Pareto distribution

    NASA Astrophysics Data System (ADS)

    Egozcue, J. J.; Pawlowsky-Glahn, V.; Ortego, M. I.

    2005-03-01

    Standard practice of wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates normally masks the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where recent years have been extremely disastrous. Thus, it is possible to compare the hazard assessment based on data prior to those years with the analysis including them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. Event occurrence in time is assumed to be Poisson distributed. The wave-height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave-heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (Poisson rate, shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean sea and experience with these phenomena. The posterior distribution of the parameters yields posterior distributions of derived quantities such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
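
    Under this Poisson-GPD model the T-year return level has a closed form, z_T = u + (sigma/xi)((lambda*T)^xi - 1) for xi != 0, which in the Bayesian setting would be evaluated over posterior draws of the parameters. The sketch below uses a single hypothetical parameter set in place of such draws.

      # Closed-form T-year return level for the Poisson-GPD hazard model
      # described above. In the Bayesian setting one would evaluate this over
      # posterior draws of (lam, sigma, xi); the values below are hypothetical.

      def return_level(T, u, lam, sigma, xi):
          """Wave height exceeded on average once every T years (xi != 0)."""
          return u + (sigma / xi) * ((lam * T) ** xi - 1.0)

      u, lam, sigma, xi = 3.0, 4.2, 0.8, 0.15   # threshold (m), events/yr, GPD scale/shape
      for T in (10, 50, 100):
          print(f"{T:3d}-yr return level ~ {return_level(T, u, lam, sigma, xi):.2f} m")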

  12. Bayesian Analysis Of HMI Solar Image Observables And Comparison To TSI Variations And MWO Image Observables

    NASA Astrophysics Data System (ADS)

    Parker, D. G.; Ulrich, R. K.; Beck, J.

    2014-12-01

    We have previously applied the Bayesian automatic classification system AutoClass to solar magnetogram and intensity images from the 150 Foot Solar Tower at Mount Wilson to identify classes of solar surface features associated with variations in total solar irradiance (TSI) and, using those identifications, modeled TSI time series with improved accuracy (r > 0.96) (Ulrich et al., 2010). AutoClass identifies classes by a two-step process in which it: (1) finds, without human supervision, a set of class definitions based on specified attributes of a sample of the image data pixels, such as magnetic field and intensity in the case of MWO images, and (2) applies the class definitions thus found to new data sets to identify automatically in them the classes found in the sample set. HMI high resolution images capture four observables (magnetic field, continuum intensity, line depth, and line width), in contrast to MWO's two observables (magnetic field and intensity). In this study, we apply AutoClass to the HMI observables for images from May 2010 to June 2014 to identify solar surface feature classes. We use contemporaneous TSI measurements to determine whether and how variations in the HMI classes are related to TSI variations and compare the characteristic statistics of the HMI classes to those found from MWO images. We also attempt to derive scale factors between the HMI and MWO magnetic and intensity observables. The ability to categorize automatically surface features in the HMI images holds out the promise of consistent, relatively quick and manageable analysis of the large quantity of data available in these images. Given that the classes found in MWO images using AutoClass have been found to improve modeling of TSI, application of AutoClass to the more complex HMI images should enhance understanding of the physical processes at work in solar surface features and their implications for the solar-terrestrial environment. Ulrich, R.K., Parker, D, Bertello, L. and

  13. Bayesian Analysis of Hmi Images and Comparison to Tsi Variations and MWO Image Observables

    NASA Astrophysics Data System (ADS)

    Parker, D. G.; Ulrich, R. K.; Beck, J.; Tran, T. V.

    2015-12-01

    We have previously applied the Bayesian automatic classification system AutoClass to solar magnetogram and intensity images from the 150 Foot Solar Tower at Mount Wilson to identify classes of solar surface features associated with variations in total solar irradiance (TSI) and, using those identifications, modeled TSI time series with improved accuracy (r > 0.96) (Ulrich et al., 2010). AutoClass identifies classes by a two-step process in which it: (1) finds, without human supervision, a set of class definitions based on specified attributes of a sample of the image data pixels, such as magnetic field and intensity in the case of MWO images, and (2) applies the class definitions thus found to new data sets to identify automatically in them the classes found in the sample set. HMI high resolution images capture four observables (magnetic field, continuum intensity, line depth, and line width), in contrast to MWO's two observables (magnetic field and intensity). In this study, we apply AutoClass to the HMI observables for images from June 2010 to December 2014 to identify solar surface feature classes. We use contemporaneous TSI measurements to determine whether and how variations in the HMI classes are related to TSI variations and compare the characteristic statistics of the HMI classes to those found from MWO images. We also attempt to derive scale factors between the HMI and MWO magnetic and intensity observables. The ability to categorize automatically surface features in the HMI images holds out the promise of consistent, relatively quick and manageable analysis of the large quantity of data available in these images. Given that the classes found in MWO images using AutoClass have been found to improve modeling of TSI, application of AutoClass to the more complex HMI images should enhance understanding of the physical processes at work in solar surface features and their implications for the solar-terrestrial environment. Ulrich, R.K., Parker, D, Bertello, L. and
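
    The two-step classing procedure described above can be illustrated with an ordinary Gaussian mixture: learn class definitions from a sample of pixels, then assign new pixels to those classes. The sketch below uses scikit-learn's GaussianMixture as a stand-in for AutoClass (which, unlike this sketch, also selects the number of classes itself), and the pixel attributes are synthetic.

      # Unsupervised two-step classing in the spirit of AutoClass, sketched
      # with a Gaussian mixture: (1) learn class definitions on a pixel
      # sample, (2) apply them to new pixels. GaussianMixture is a stand-in
      # for AutoClass; the pixel attributes here are synthetic.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      # Synthetic (magnetic field, continuum intensity) pixel attributes.
      quiet = rng.normal([0.0, 1.00], [0.05, 0.01], size=(2000, 2))
      spots = rng.normal([1.5, 0.85], [0.30, 0.03], size=(200, 2))
      sample = np.vstack([quiet, spots])

      gm = GaussianMixture(n_components=2, random_state=0).fit(sample)  # step 1
      new_pixels = rng.normal([1.4, 0.86], [0.3, 0.03], size=(5, 2))
      print(gm.predict(new_pixels))                                     # step 2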

  14. A Bayesian ridge regression analysis of congestion's impact on urban expressway safety.

    PubMed

    Shi, Qi; Abdel-Aty, Mohamed; Lee, Jaeyoung

    2016-03-01

    With the rapid growth of traffic in urban areas, concerns about congestion and traffic safety have been heightened. This study leveraged both the Automatic Vehicle Identification (AVI) system and the Microwave Vehicle Detection System (MVDS) installed on an expressway in Central Florida to explore how congestion impacts crash occurrence in urban areas. Multiple congestion measures from the two systems were developed. To ensure more precise estimates of congestion's effects, the traffic data were aggregated into peak and non-peak hours. Multicollinearity among traffic parameters was examined. The results showed the presence of multicollinearity, especially during peak hours. In response, ridge regression was introduced to cope with this issue. Poisson models with uncorrelated random effects, correlated random effects, and both correlated random effects and random parameters were constructed within the Bayesian framework. It was shown that correlated random effects could significantly enhance model performance. The random parameters model has goodness-of-fit similar to the model with only correlated random effects. However, by accounting for the unobserved heterogeneity, more variables were found to be significantly related to crash frequency. The models indicated that congestion increased crash frequency during peak hours, while during non-peak hours it was not a major crash contributing factor. Using the random parameter model, the three congestion measures were compared. It was found that all congestion indicators had similar effects, and that the Congestion Index (CI) derived from MVDS data was a better congestion indicator for safety analysis. Also, analyses showed that segments with higher congestion intensity could see not only more property damage only (PDO) crashes but also more severe crashes. In addition, the issues regarding the necessity to incorporate specific congestion indicator for congestion's effects on safety and to take care of the
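
    Ridge regression can be read as a Bayesian MAP estimate under a Gaussian prior on the coefficients, which is exactly what makes it useful when congestion measures are highly correlated. The closed-form sketch below, on synthetic collinear data with a hypothetical penalty, shows the stabilising effect.

      # Ridge regression viewed as a Bayesian MAP estimate: a Gaussian prior
      # on the coefficients shrinks and stabilises them under
      # multicollinearity. Synthetic data; lam is a hypothetical choice.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 200
      x1 = rng.normal(size=n)
      x2 = x1 + rng.normal(scale=0.05, size=n)     # nearly collinear with x1
      X = np.column_stack([x1, x2])
      y = 1.0 * x1 + 1.0 * x2 + rng.normal(scale=0.5, size=n)

      lam = 10.0                                    # ridge penalty
      I = np.eye(X.shape[1])
      beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
      beta_ridge = np.linalg.solve(X.T @ X + lam * I, X.T @ y)
      print("OLS  :", np.round(beta_ols, 2))        # unstable under collinearity
      print("ridge:", np.round(beta_ridge, 2))      # shrunken, stable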

  15. Multi-site identification of a distributed hydrological nitrogen model using Bayesian uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Jiang, Sanyuan; Jomaa, Seifeddine; Büttner, Olaf; Meon, Günter; Rode, Michael

    2015-10-01

    For capturing spatial variations of runoff and nutrient fluxes attributed to catchment heterogeneity, multi-site hydrological water quality monitoring strategies are increasingly put into practice. This study aimed to investigate the impacts of spatially distributed streamflow and streamwater Inorganic Nitrogen (IN) concentration observations on the identification of HYPE (HYdrological Predictions for the Environment), a continuous-time, spatially semi-distributed, process-based hydrological water quality model. A Bayesian inference based approach, DREAM(ZS) (DiffeRential Evolution Adaptive Metropolis algorithm), was combined with HYPE to implement model optimisation and uncertainty analysis of streamflow and streamwater IN concentration simulations at a nested meso-scale catchment in central Germany. To this end, a 10-year period (1994-1999 for calibration and 1999-2004 for validation) was utilised. We compared the parameters' posterior distributions, modelling performance using the best estimated parameter set, and 95% prediction confidence intervals at the catchment outlet for the calibration period, as derived from single-site calibration (SSC) and multi-site calibration (MSC) modes. For SSC, streamflow and streamwater IN concentration observations at only the catchment outlet were used, while for MSC, observations from both the catchment outlet and two internal sites were considered. Results showed that the uncertainty intervals of the hydrological water quality parameters' posterior distributions estimated from MSC were narrower than those obtained from SSC. In addition, it was found that MSC outperformed SSC on streamwater IN concentration simulations at internal sites for both calibration and validation periods, while the influence on streamflow modelling performance was small. This can be explained by the "nested" nature of the catchment and the high correlation between discharge observations from different sites.

  16. Spatial Intensity Duration Frequency Relationships Using Hierarchical Bayesian Analysis for Urban Areas

    NASA Astrophysics Data System (ADS)

    Rupa, Chandra; Mujumdar, Pradeep

    2016-04-01

    In urban areas, quantification of extreme precipitation is important in the design of storm water drains and other infrastructure. Intensity Duration Frequency (IDF) relationships are generally used to obtain the design return level for a given duration and return period. Due to the lack of extreme precipitation data for a sufficiently large number of years, estimating the probability of extreme events is difficult. Typically, data from a single station are used to obtain the design return levels for various durations and return periods, which are then used in the design of urban infrastructure for the entire city. In an urban setting, the spatial variation of precipitation can be high; the precipitation amounts and patterns often vary within short distances of less than 5 km. Therefore, it is crucial to study the uncertainties in the spatial variation of return levels for various durations. In this work, extreme precipitation is modeled spatially using Bayesian hierarchical analysis and the spatial variation of return levels is studied. The analysis is carried out with the Block Maxima approach for defining extreme precipitation, using the Generalized Extreme Value (GEV) distribution, for Bangalore city, Karnataka state, India. Daily data for nineteen stations in and around Bangalore city are considered in the study. The analysis is carried out for summer maxima (March - May), monsoon maxima (June - September) and the annual maxima rainfall. In the hierarchical analysis, the statistical model is specified in three layers. The data layer models the block maxima, pooling the extreme precipitation from all the stations. In the process layer, the latent spatial process that drives the extreme precipitation, characterized by geographical and climatological covariates (latitude-longitude, elevation, mean temperature, etc.), is modeled, and in the prior level, the prior distributions that govern the latent process are specified. Markov Chain Monte Carlo (MCMC) algorithm (Metropolis Hastings
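
    A single-station, non-hierarchical version of the block-maxima step is easy to sketch: fit a GEV distribution to annual maxima and read off the T-year return level. The example below uses synthetic data and scipy's genextreme as a stand-in for the full spatial Bayesian model.

      # Block-maxima sketch for an IDF-style return level: fit a GEV to
      # annual maximum rainfall and read off the T-year return level. The
      # synthetic "annual maxima" below are hypothetical.
      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(2)
      annual_max_mm = genextreme.rvs(c=-0.1, loc=80, scale=20,
                                     size=40, random_state=rng)   # fake record

      c, loc, scale = genextreme.fit(annual_max_mm)
      for T in (10, 50, 100):
          # isf(1/T) is the level exceeded with annual probability 1/T.
          print(f"{T:3d}-yr return level ~ {genextreme.isf(1.0/T, c, loc, scale):.0f} mm")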

  17. Bayesian Analysis of Step-Stress Accelerated Life Test with Exponential Distribution

    SciTech Connect

    Lee, J.; Pan, R.

    2012-04-01

    In this article, we propose a general Bayesian inference approach to the step-stress accelerated life test with type II censoring. We assume that the failure times at each stress level are exponentially distributed and the test units are tested in an increasing order of stress levels. We formulate the prior distribution of the parameters of life-stress function and integrate the engineering knowledge of product failure rate and acceleration factor into the prior. The posterior distribution and the point estimates for the parameters of interest are provided. Through the Markov chain Monte Carlo technique, we demonstrate a nonconjugate prior case using an industrial example. It is shown that with the Bayesian approach, the statistical precision of parameter estimation is improved and, consequently, the required number of failures could be reduced.
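
    For a single stress level, the exponential/type II censoring setup has a conjugate structure that makes the Bayesian update transparent: with a Gamma(a, b) prior on the failure rate, the posterior is Gamma(a + r, b + total time on test). The sketch below shows only this one-level building block, with hypothetical data and prior; the article's full model additionally links stress levels through a life-stress function.

      # Conjugate sketch for one stress level of an exponential life test with
      # type II censoring. The data and prior below are hypothetical.
      from scipy.stats import gamma

      n, r = 20, 8                                     # units on test, observed failures
      fail_times = [12, 30, 41, 55, 67, 80, 95, 110]   # hours (hypothetical)
      t_total = sum(fail_times) + (n - r) * fail_times[-1]  # censored units run to t_(r)

      a, b = 2.0, 1000.0                               # engineering-knowledge prior
      post = gamma(a=a + r, scale=1.0 / (b + t_total))
      print(f"posterior mean failure rate = {post.mean():.2e} per hour")
      print(f"95% credible interval = {post.interval(0.95)}")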

  18. Bayesian Sparse Regression Analysis Documents the Diversity of Spinal Inhibitory Interneurons.

    PubMed

    Gabitto, Mariano I; Pakman, Ari; Bikoff, Jay B; Abbott, L F; Jessell, Thomas M; Paninski, Liam

    2016-03-24

    Documenting the extent of cellular diversity is a critical step in defining the functional organization of tissues and organs. To infer cell-type diversity from partial or incomplete transcription factor expression data, we devised a sparse Bayesian framework that is able to handle estimation uncertainty and can incorporate diverse cellular characteristics to optimize experimental design. Focusing on spinal V1 inhibitory interneurons, for which the spatial expression of 19 transcription factors has been mapped, we infer the existence of ~50 candidate V1 neuronal types, many of which localize in compact spatial domains in the ventral spinal cord. We have validated the existence of inferred cell types by direct experimental measurement, establishing this Bayesian framework as an effective platform for cell-type characterization in the nervous system and elsewhere. PMID:26949187

  19. Bayesian Analysis of the Power Spectrum of the Cosmic Microwave Background

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey B.; Eriksen, H. K.; O'Dwyer, I. J.; Wandelt, B. D.

    2005-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background. The sky, when viewed in the microwave, is very uniform, with a nearly perfect blackbody spectrum at 2.7 kelvin. Very small amplitude brightness fluctuations (to one part in a million!) trace small density perturbations in the early universe (roughly 300,000 years after the Big Bang), which later grow through gravitational instability to the large-scale structure seen in redshift surveys... In this talk, I will discuss a Bayesian formulation of this problem, a Gibbs sampling approach to numerically sampling from the Bayesian posterior, and the application of this approach to the first-year data from the Wilkinson Microwave Anisotropy Probe. I will also comment on recent algorithmic developments needed for this approach to be tractable for the even more massive data set to be returned from the Planck satellite.
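
    The alternation at the heart of the Gibbs approach, drawing the signal given the power spectrum and then the power spectrum given the signal, can be caricatured with a one-parameter normal model: draw a mean given a variance, then a variance given the mean. The toy sampler below, with synthetic data and flat/Jeffreys-style priors, is an illustration of the scheme, not the CMB implementation.

      # Toy Gibbs sampler: alternate a "signal" draw (normal mean given
      # variance) and a "power" draw (inverse-gamma variance given mean).
      # A one-parameter caricature of CMB Gibbs sampling, with synthetic data.
      import numpy as np

      rng = np.random.default_rng(3)
      data = rng.normal(2.7, 1.0, size=500)         # synthetic observations
      n, ybar = data.size, data.mean()

      mu, var = 0.0, 1.0
      samples = []
      for _ in range(2000):
          # signal step: mu | var, data ~ Normal(ybar, var/n) under a flat prior
          mu = rng.normal(ybar, np.sqrt(var / n))
          # power step: var | mu, data ~ Inverse-Gamma(n/2, sum((y-mu)^2)/2)
          var = 1.0 / rng.gamma(n / 2.0, 2.0 / np.sum((data - mu) ** 2))
          samples.append((mu, var))

      mu_draws = np.array([s[0] for s in samples[500:]])
      print(f"posterior mean of mu ~ {mu_draws.mean():.3f}")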

  20. A tutorial on Bayesian bivariate meta-analysis of mixed binary-continuous outcomes with missing treatment effects.

    PubMed

    Gajic-Veljanoski, Olga; Cheung, Angela M; Bayoumi, Ahmed M; Tomlinson, George

    2016-05-30

    Bivariate random-effects meta-analysis (BVMA) is a method of data synthesis that accounts for treatment effects measured on two outcomes. BVMA gives more precise estimates of the population mean and predicted values than two univariate random-effects meta-analyses (UVMAs). BVMA also addresses bias from incomplete reporting of outcomes. A few tutorials have covered technical details of BVMA of categorical or continuous outcomes. Limited guidance is available on how to analyze datasets that include trials with mixed continuous-binary outcomes where treatment effects on one outcome or the other are not reported. Given the advantages of Bayesian BVMA for handling missing outcomes, we present a tutorial for Bayesian BVMA of incompletely reported treatment effects on mixed bivariate outcomes. This step-by-step approach can serve as a model for our intended audience, the methodologist familiar with Bayesian meta-analysis, looking for practical advice on fitting bivariate models. To facilitate application of the proposed methods, we include our WinBUGS code. As an example, we use aggregate-level data from published trials to demonstrate the estimation of the effects of vitamin K and bisphosphonates on two correlated bone outcomes, fracture, and bone mineral density. We present datasets where reporting of the pairs of treatment effects on both outcomes was 'partially' complete (i.e., pairs completely reported in some trials), and we outline steps for modeling the incompletely reported data. To assess what is gained from the additional work required by BVMA, we compare the resulting estimates to those from separate UVMAs. We discuss methodological findings and make four recommendations. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26553369

  1. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis in the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem, which deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limit, specified in the NASA-LUQC, reflects practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained and illustrated on the NASA Langley Uncertainty Quantification Challenge problem.
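
    The variance-based ranking step can be sketched with a pick-freeze Monte Carlo estimate of first-order Sobol indices, S_i = Var(E[Y|X_i]) / Var(Y). The example below runs the estimator on a toy model (not the GTM), chosen so the exact indices are known: roughly 0.19, 0.75, and 0 for the three inputs.

      # Monte Carlo estimate of variance-based (Sobol) first-order sensitivity
      # indices via the pick-freeze (Saltelli) estimator, on a toy model.
      import numpy as np

      def model(x):                       # toy stand-in for the system model
          return x[:, 0] + 2.0 * x[:, 1] + x[:, 0] * x[:, 2]

      rng = np.random.default_rng(4)
      N, d = 100_000, 3
      A = rng.uniform(-1, 1, size=(N, d))
      B = rng.uniform(-1, 1, size=(N, d))
      fA, fB = model(A), model(B)
      varY = np.var(np.concatenate([fA, fB]))

      for i in range(d):
          ABi = A.copy()
          ABi[:, i] = B[:, i]             # freeze all inputs except x_i
          S_i = np.mean(fB * (model(ABi) - fA)) / varY
          print(f"S_{i+1} ~ {S_i:.3f}")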

  2. A Primer on Bayesian Decision Analysis With an Application to a Kidney Transplant Decision.

    PubMed

    Neapolitan, Richard; Jiang, Xia; Ladner, Daniela P; Kaplan, Bruce

    2016-03-01

    A clinical decision support system (CDSS) is a computer program designed to assist health care professionals with decision making tasks. A well-developed CDSS weighs the benefits of therapy against the costs in terms of loss of quality of life and financial loss, and recommends the decision that can be expected to provide maximum overall benefit. This article provides an introduction to developing CDSSs using Bayesian networks; such CDSSs can help with the often complex decisions involving transplants. First, we review Bayes' theorem in the context of medical decision making. Then, we introduce Bayesian networks, which can model probabilistic relationships among many related variables and are based on Bayes' theorem. Next, we discuss influence diagrams, which are Bayesian networks augmented with decision and value nodes and which can be used to develop CDSSs that are able to recommend decisions that maximize the expected utility of the predicted outcomes to the patient. By way of comparison, we examine the benefits and challenges of using the Kidney Donor Risk Index as the sole decision tool. Finally, we develop a schema for an influence diagram that models generalized kidney transplant decisions and show how the influence diagram approach can provide the clinician and the potential transplant recipient with a valuable decision support tool. PMID:26900809
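
    The recommendation step of an influence diagram reduces to comparing expected utilities across decisions. The sketch below makes that computation explicit for a two-option toy version of the transplant decision; all probabilities and utilities are hypothetical illustrations, not clinical estimates.

      # Minimal expected-utility calculation of the kind an influence diagram
      # automates. All numbers are hypothetical, not clinical estimates.

      def expected_utility(outcomes):
          """outcomes: list of (probability, utility) pairs for one decision."""
          return sum(p * u for p, u in outcomes)

      decisions = {
          "transplant now": [(0.85, 9.0), (0.10, 4.0), (0.05, 0.0)],
          "wait on dialysis": [(0.60, 6.0), (0.30, 3.0), (0.10, 0.0)],
      }
      for name, outs in decisions.items():
          print(f"{name:18s} EU = {expected_utility(outs):.2f}")
      best = max(decisions, key=lambda k: expected_utility(decisions[k]))
      print("recommend:", best)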

  3. Bayesian analysis of the astrobiological implications of life’s early emergence on Earth

    PubMed Central

    Spiegel, David S.; Turner, Edwin L.

    2012-01-01

    Life arose on Earth sometime in the first few hundred million years after the young planet had cooled to the point that it could support water-based organisms on its surface. The early emergence of life on Earth has been taken as evidence that the probability of abiogenesis is high, if starting from young Earth-like conditions. We revisit this argument quantitatively in a Bayesian statistical framework. By constructing a simple model of the probability of abiogenesis, we calculate a Bayesian estimate of its posterior probability, given the data that life emerged fairly early in Earth’s history and that, billions of years later, curious creatures noted this fact and considered its implications. We find that, given only this very limited empirical information, the choice of Bayesian prior for the abiogenesis probability parameter has a dominant influence on the computed posterior probability. Although terrestrial life's early emergence provides evidence that life might be abundant in the universe if early-Earth-like conditions are common, the evidence is inconclusive and indeed is consistent with an arbitrarily low intrinsic probability of abiogenesis for plausible uninformative priors. Finding a single case of life arising independently of our lineage (on Earth, elsewhere in the solar system, or on an extrasolar planet) would provide much stronger evidence that abiogenesis is not extremely rare in the universe. PMID:22198766

  4. Bayesian analysis of the astrobiological implications of life's early emergence on Earth

    NASA Astrophysics Data System (ADS)

    Spiegel, David S.; Turner, Edwin L.

    2012-01-01

    Life arose on Earth sometime in the first few hundred million years after the young planet had cooled to the point that it could support water-based organisms on its surface. The early emergence of life on Earth has been taken as evidence that the probability of abiogenesis is high, if starting from young Earth-like conditions. We revisit this argument quantitatively in a Bayesian statistical framework. By constructing a simple model of the probability of abiogenesis, we calculate a Bayesian estimate of its posterior probability, given the data that life emerged fairly early in Earth’s history and that, billions of years later, curious creatures noted this fact and considered its implications. We find that, given only this very limited empirical information, the choice of Bayesian prior for the abiogenesis probability parameter has a dominant influence on the computed posterior probability. Although terrestrial life's early emergence provides evidence that life might be abundant in the universe if early-Earth-like conditions are common, the evidence is inconclusive and indeed is consistent with an arbitrarily low intrinsic probability of abiogenesis for plausible uninformative priors. Finding a single case of life arising independently of our lineage (on Earth, elsewhere in the solar system, or on an extrasolar planet) would provide much stronger evidence that abiogenesis is not extremely rare in the universe.

  5. Bayesian analysis of the astrobiological implications of life's early emergence on Earth.

    PubMed

    Spiegel, David S; Turner, Edwin L

    2012-01-10

    Life arose on Earth sometime in the first few hundred million years after the young planet had cooled to the point that it could support water-based organisms on its surface. The early emergence of life on Earth has been taken as evidence that the probability of abiogenesis is high, if starting from young Earth-like conditions. We revisit this argument quantitatively in a Bayesian statistical framework. By constructing a simple model of the probability of abiogenesis, we calculate a Bayesian estimate of its posterior probability, given the data that life emerged fairly early in Earth's history and that, billions of years later, curious creatures noted this fact and considered its implications. We find that, given only this very limited empirical information, the choice of Bayesian prior for the abiogenesis probability parameter has a dominant influence on the computed posterior probability. Although terrestrial life's early emergence provides evidence that life might be abundant in the universe if early-Earth-like conditions are common, the evidence is inconclusive and indeed is consistent with an arbitrarily low intrinsic probability of abiogenesis for plausible uninformative priors. Finding a single case of life arising independently of our lineage (on Earth, elsewhere in the solar system, or on an extrasolar planet) would provide much stronger evidence that abiogenesis is not extremely rare in the universe. PMID:22198766

  6. Pathological Knee Joint Motion Analysis By High Speed Cinephotography

    NASA Astrophysics Data System (ADS)

    Baumann, Jurg U.

    1985-02-01

    The use of cinephotography for the evaluation of disturbed knee joint function was compared in three groups of patients. While a sampling rate of 50 images per second was adequate for patients with neuromuscular disorders, a higher rate of around 300 i.p.s. is necessary in osteoarthritis and ligamentous knee joint injuries, but the task of digitizing is prohibitive unless automated.

  7. In vivo analysis of trapeziometacarpal joint kinematics during pinch tasks.

    PubMed

    Kuo, Li-Chieh; Lin, Chien-Ju; Chen, Guan-Po; Jou, I-Ming; Wang, Chien-Kuo; Goryacheva, Irina G; Dosaev, Marat Z; Su, Fong-Chin

    2014-01-01

    This study investigated how the posture of the thumb while performing common pinch movements and the levels of pinch force applied by the thumb affect the arthrokinematics of the trapeziometacarpal joint in vivo. Fifteen subjects performed pinch tasks at the distal phalange (DP), proximal interphalangeal (PIP) joint, and metacarpophalangeal (MP) joint of the index finger with 0%, 50%, and 80% of maximal pinch force, measured with a single-axis load cell. 3D images of the thumb were obtained using computed tomography. The results show that the reference points moved from the central region to the dorsal-radial region when changing from pinching the DP to the MP joint without pinching force being applied. Pinching with 80% of the maximum pinching force resulted in reference points being the closest to the volar-ulnar direction. Significant differences were seen between 0% and 50% of maximum pinch force, as well as between 0% and 80%, when pinching the MP joint in the distal-proximal direction. The effects of thumb posture and applied pinch force on the arthrokinematics of the joint were investigated with a 3D model of the trapeziometacarpal joint. Pinching with more than 50% of maximum pinch force might subject this joint to extreme displacement. PMID:24683540

  8. In Vivo Analysis of Trapeziometacarpal Joint Kinematics during Pinch Tasks

    PubMed Central

    Chen, Guan-Po; Jou, I-Ming; Goryacheva, Irina G.; Dosaev, Marat Z.; Su, Fong-Chin

    2014-01-01

    This study investigated how the posture of the thumb while performing common pinch movements and the levels of pinch force applied by the thumb affect the arthrokinematics of the trapeziometacarpal joint in vivo. Fifteen subjects performed pinch tasks at the distal phalange (DP), proximal interphalangeal (PIP) joint, and metacarpophalangeal (MP) joint of the index finger with 0%, 50%, and 80% of maximal pinch force, measured with a single-axis load cell. 3D images of the thumb were obtained using computed tomography. The results show that the reference points moved from the central region to the dorsal-radial region when changing from pinching the DP to the MP joint without pinching force being applied. Pinching with 80% of the maximum pinching force resulted in reference points being the closest to the volar-ulnar direction. Significant differences were seen between 0% and 50% of maximum pinch force, as well as between 0% and 80%, when pinching the MP joint in the distal-proximal direction. The effects of thumb posture and applied pinch force on the arthrokinematics of the joint were investigated with a 3D model of the trapeziometacarpal joint. Pinching with more than 50% of maximum pinch force might subject this joint to extreme displacement. PMID:24683540

  9. Scarf Joints of Composite Materials: Testing and Analysis

    NASA Astrophysics Data System (ADS)

    Kwon, Y. W.; Marrón, A.

    2009-12-01

    The objective of this study is to develop a reliable computational model to investigate the joint strengths of scarf joint configurations constructed from carbon-fiber and glass-fiber woven fabric laminates with different material combinations (glass/glass, glass/carbon, carbon/glass, and carbon/carbon) under various loading conditions such as axial, bending moment, and shear loading. Both experimental and computational studies are conducted. For the experimental study, specimens made of hybrid scarf joints using carbon-fiber and glass-fiber woven fabrics are tested under compressive loadings to determine their joint failure strengths. Computational models are then developed using the discrete resin layer model along with fracture mechanics and virtual crack closure techniques. The numerical models are validated against the experimental data. The validated models are used to predict the joint strengths under different loading conditions such as axial, shear, and bending moment loadings.

  10. Design and Dynamic Analysis of a Novel Biomimetic Robotics Hip Joint.

    PubMed

    Cui, Bingyan; Chen, Liwen; Wang, Zhijun; Zhao, Yuanhao; Li, Zhanxian; Jin, Zhenlin

    2015-01-01

    In order to increase the workspace and the carrying capacity of a biomimetic robotics hip joint, a novel biomimetic robotics hip joint was developed. The joint is mainly composed of a moving platform, a frame, and 3-RRR orthogonal spherical parallel mechanism branched chains, and has the characteristics of compact structure, large bearing capacity, high positioning accuracy, and good controllability. The functions of the biomimetic robotics hip joint are introduced, including the technical parameters, the structure, and the driving mode. The model of the joint is established, the kinematics equations are described, and the dynamics are then analyzed and simulated with ADAMS software. The proposed analysis methodology provides a theoretical basis for servo motor selection and structural design of the biomimetic robotics hip joint. The designed hip joint can be applied in serial and parallel robots or any other mechanisms. PMID:27018226

  11. Design and Dynamic Analysis of a Novel Biomimetic Robotics Hip Joint

    PubMed Central

    Cui, Bingyan; Chen, Liwen; Wang, Zhijun; Zhao, Yuanhao; Li, Zhanxian; Jin, Zhenlin

    2015-01-01

    In order to increase the workspace and the carrying capacity of a biomimetic robotics hip joint, a novel biomimetic robotics hip joint was developed. The joint is mainly composed of a moving platform, a frame, and 3-RRR orthogonal spherical parallel mechanism branched chains, and has the characteristics of compact structure, large bearing capacity, high positioning accuracy, and good controllability. The functions of the biomimetic robotics hip joint are introduced, including the technical parameters, the structure, and the driving mode. The model of the joint is established, the kinematics equations are described, and the dynamics are then analyzed and simulated with ADAMS software. The proposed analysis methodology provides a theoretical basis for servo motor selection and structural design of the biomimetic robotics hip joint. The designed hip joint can be applied in serial and parallel robots or any other mechanisms. PMID:27018226

  12. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    NASA Astrophysics Data System (ADS)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

    A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while keeping most of the variance of the dataset explained. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies: PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. Recovering and separating the different sources that generate the observed ground deformation is a fundamental task in order to give the possible sources a physical meaning. PCA fails in the BSS problem because it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelatedness condition is not strong enough, and it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources.
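
    The BSS problem the abstract describes can be demonstrated with standard ICA on synthetic mixed signals: a periodic source and a step-like transient are mixed at two stations and then recovered. The sketch below uses scikit-learn's FastICA as a stand-in for vbICA, which additionally models each source pdf as a mixture of Gaussians.

      # Blind source separation sketch: recover independent sources from mixed
      # displacement-like signals with FastICA, a stand-in for the vbICA
      # method described above. All signals here are synthetic.
      import numpy as np
      from sklearn.decomposition import FastICA

      t = np.linspace(0, 10, 2000)
      s1 = np.sin(2 * np.pi * 0.5 * t)              # periodic (seasonal-like) source
      s2 = np.where(t > 5, 1.0, 0.0)                # step (transient-like) source
      S = np.column_stack([s1, s2])
      A = np.array([[1.0, 0.4], [0.6, 1.0]])        # mixing at two "stations"
      X = S @ A.T + 0.05 * np.random.default_rng(5).normal(size=(2000, 2))

      ica = FastICA(n_components=2, random_state=0)
      S_est = ica.fit_transform(X)                  # sources, up to order and scale
      print("recovered shape:", S_est.shape)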

  13. Bayesian Attractor Learning

    NASA Astrophysics Data System (ADS)

    Wiegerinck, Wim; Schoenaker, Christiaan; Duane, Gregory

    2016-04-01

    Recently, methods for model fusion by dynamically combining model components in an interactive ensemble have been proposed. In these proposals, fusion parameters have to be learned from data. One can view these systems as parametrized dynamical systems. We address the question of learnability of dynamical systems with respect to both short-term (vector field) and long-term (attractor) behavior. In particular, we are interested in learning in the imperfect model class setting, in which the ground truth has a higher complexity than the models, e.g., due to unresolved scales. We take a Bayesian point of view and define a joint log-likelihood that consists of two terms: one is the vector field error, and the other is the attractor error, for which we take the L1 distance between the stationary distributions of the model and the assumed ground truth. In the context of linear models (like so-called weighted supermodels), and assuming a Gaussian error model in the vector fields, vector field learning leads to a tractable Gaussian solution. This solution can then be used as a prior for the next step, Bayesian attractor learning, in which the attractor error is used as a log-likelihood term. Bayesian attractor learning is implemented by elliptical slice sampling, a sampling method for systems with a Gaussian prior and a non-Gaussian likelihood. Simulations with a partially observed driven Lorenz 63 system illustrate the approach.
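
    Elliptical slice sampling itself is compact enough to sketch in full: propose on the ellipse defined by the current state and a fresh prior draw, and shrink the angle bracket until the slice level is exceeded. The implementation below follows Murray, Adams & MacKay (2010); the 1-D prior and Laplace-like likelihood are toy illustrations, not the supermodel attractor likelihood.

      # Elliptical slice sampling (Murray, Adams & MacKay 2010): valid for any
      # Gaussian prior with a non-Gaussian likelihood. Toy 1-D demonstration.
      import numpy as np

      rng = np.random.default_rng(6)

      def ess_step(f, log_lik, chol_prior):
          """One elliptical slice sampling update for prior N(0, L L^T)."""
          nu = chol_prior @ rng.normal(size=f.shape)     # fresh prior draw
          log_y = log_lik(f) + np.log(rng.uniform())     # slice level
          theta = rng.uniform(0.0, 2.0 * np.pi)
          t_min, t_max = theta - 2.0 * np.pi, theta
          while True:
              f_new = f * np.cos(theta) + nu * np.sin(theta)
              if log_lik(f_new) > log_y:
                  return f_new
              if theta < 0.0:                            # shrink the bracket
                  t_min = theta
              else:
                  t_max = theta
              theta = rng.uniform(t_min, t_max)

      log_lik = lambda f: -np.abs(f - 2.0).sum()         # non-Gaussian likelihood
      f = np.zeros(1)
      L = np.eye(1)                                      # unit Gaussian prior
      draws = []
      for _ in range(3000):
          f = ess_step(f, log_lik, L)
          draws.append(f[0])
      print(f"posterior mean ~ {np.mean(draws[500:]):.2f}")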

  14. Joint analysis of BICEP2/keck array and Planck Data.

    PubMed

    Ade, P A R; Aghanim, N; Ahmed, Z; Aikin, R W; Alexander, K D; Arnaud, M; Aumont, J; Baccigalupi, C; Banday, A J; Barkats, D; Barreiro, R B; Bartlett, J G; Bartolo, N; Battaner, E; Benabed, K; Benoît, A; Benoit-Lévy, A; Benton, S J; Bernard, J-P; Bersanelli, M; Bielewicz, P; Bischoff, C A; Bock, J J; Bonaldi, A; Bonavera, L; Bond, J R; Borrill, J; Bouchet, F R; Boulanger, F; Brevik, J A; Bucher, M; Buder, I; Bullock, E; Burigana, C; Butler, R C; Buza, V; Calabrese, E; Cardoso, J-F; Catalano, A; Challinor, A; Chary, R-R; Chiang, H C; Christensen, P R; Colombo, L P L; Combet, C; Connors, J; Couchot, F; Coulais, A; Crill, B P; Curto, A; Cuttaia, F; Danese, L; Davies, R D; Davis, R J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Delouis, J-M; Désert, F-X; Dickinson, C; Diego, J M; Dole, H; Donzelli, S; Doré, O; Douspis, M; Dowell, C D; Duband, L; Ducout, A; Dunkley, J; Dupac, X; Dvorkin, C; Efstathiou, G; Elsner, F; Enßlin, T A; Eriksen, H K; Falgarone, E; Filippini, J P; Finelli, F; Fliescher, S; Forni, O; Frailis, M; Fraisse, A A; Franceschi, E; Frejsel, A; Galeotta, S; Galli, S; Ganga, K; Ghosh, T; Giard, M; Gjerløw, E; Golwala, S R; González-Nuevo, J; Górski, K M; Gratton, S; Gregorio, A; Gruppuso, A; Gudmundsson, J E; Halpern, M; Hansen, F K; Hanson, D; Harrison, D L; Hasselfield, M; Helou, G; Henrot-Versillé, S; Herranz, D; Hildebrandt, S R; Hilton, G C; Hivon, E; Hobson, M; Holmes, W A; Hovest, W; Hristov, V V; Huffenberger, K M; Hui, H; Hurier, G; Irwin, K D; Jaffe, A H; Jaffe, T R; Jewell, J; Jones, W C; Juvela, M; Karakci, A; Karkare, K S; Kaufman, J P; Keating, B G; Kefeli, S; Keihänen, E; Kernasovskiy, S A; Keskitalo, R; Kisner, T S; Kneissl, R; Knoche, J; Knox, L; Kovac, J M; Krachmalnicoff, N; Kunz, M; Kuo, C L; Kurki-Suonio, H; Lagache, G; Lähteenmäki, A; Lamarre, J-M; Lasenby, A; Lattanzi, M; Lawrence, C R; Leitch, E M; Leonardi, R; Levrier, F; Lewis, A; Liguori, M; Lilje, P B; Linden-Vørnle, M; López-Caniego, M; Lubin, P M; Lueker, M; Macías-Pérez, J F; Maffei, B; Maino, D; Mandolesi, N; Mangilli, A; Maris, M; Martin, P G; Martínez-González, E; Masi, S; Mason, P; Matarrese, S; Megerian, K G; Meinhold, P R; Melchiorri, A; Mendes, L; Mennella, A; Migliaccio, M; Mitra, S; Miville-Deschênes, M-A; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Moss, A; Munshi, D; Murphy, J A; Naselsky, P; Nati, F; Natoli, P; Netterfield, C B; Nguyen, H T; Nørgaard-Nielsen, H U; Noviello, F; Novikov, D; Novikov, I; O'Brient, R; Ogburn, R W; Orlando, A; Pagano, L; Pajot, F; Paladini, R; Paoletti, D; Partridge, B; Pasian, F; Patanchon, G; Pearson, T J; Perdereau, O; Perotto, L; Pettorino, V; Piacentini, F; Piat, M; Pietrobon, D; Plaszczynski, S; Pointecouteau, E; Polenta, G; Ponthieu, N; Pratt, G W; Prunet, S; Pryke, C; Puget, J-L; Rachen, J P; Reach, W T; Rebolo, R; Reinecke, M; Remazeilles, M; Renault, C; Renzi, A; Richter, S; Ristorcelli, I; Rocha, G; Rossetti, M; Roudier, G; Rowan-Robinson, M; Rubiño-Martín, J A; Rusholme, B; Sandri, M; Santos, D; Savelainen, M; Savini, G; Schwarz, R; Scott, D; Seiffert, M D; Sheehy, C D; Spencer, L D; Staniszewski, Z K; Stolyarov, V; Sudiwala, R; Sunyaev, R; Sutton, D; Suur-Uski, A-S; Sygnet, J-F; Tauber, J A; Teply, G P; Terenzi, L; Thompson, K L; Toffolatti, L; Tolan, J E; Tomasi, M; Tristram, M; Tucci, M; Turner, A D; Valenziano, L; Valiviita, J; Van Tent, B; Vibert, L; Vielva, P; Vieregg, A G; Villa, F; Wade, L A; Wandelt, B D; Watson, R; Weber, A C; Wehus, I K; White, M; White, S D M; Willmert, J; Wong, C L; Yoon, K W; Yvon, D; Zacchei, A; 
Zonca, A

    2015-03-13

    We report the results of a joint analysis of data from BICEP2/Keck Array and Planck. BICEP2 and Keck Array have observed the same approximately 400  deg^{2} patch of sky centered on RA 0 h, Dec. -57.5°. The combined maps reach a depth of 57 nK deg in Stokes Q and U in a band centered at 150 GHz. Planck has observed the full sky in polarization at seven frequencies from 30 to 353 GHz, but much less deeply in any given region (1.2  μK deg in Q and U at 143 GHz). We detect 150×353 cross-correlation in B modes at high significance. We fit the single- and cross-frequency power spectra at frequencies ≥150  GHz to a lensed-ΛCDM model that includes dust and a possible contribution from inflationary gravitational waves (as parametrized by the tensor-to-scalar ratio r), using a prior on the frequency spectral behavior of polarized dust emission from previous Planck analysis of other regions of the sky. We find strong evidence for dust and no statistically significant evidence for tensor modes. We probe various model variations and extensions, including adding a synchrotron component in combination with lower frequency data, and find that these make little difference to the r constraint. Finally, we present an alternative analysis which is similar to a map-based cleaning of the dust contribution, and show that this gives similar constraints. The final result is expressed as a likelihood curve for r, and yields an upper limit r_{0.05}<0.12 at 95% confidence. Marginalizing over dust and r, lensing B modes are detected at 7.0σ significance. PMID:25815919

  15. Joint Analysis of BICEP2/Keck Array and Planck Data

    NASA Astrophysics Data System (ADS)

    BICEP2/Keck and Planck Collaborations; Ade, P. A. R.; Aghanim, N.; Ahmed, Z.; Aikin, R. W.; Alexander, K. D.; Arnaud, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barkats, D.; Barreiro, R. B.; Bartlett, J. G.; Bartolo, N.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Benton, S. J.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bischoff, C. A.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Brevik, J. A.; Bucher, M.; Buder, I.; Bullock, E.; Burigana, C.; Butler, R. C.; Buza, V.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Challinor, A.; Chary, R.-R.; Chiang, H. C.; Christensen, P. R.; Colombo, L. P. L.; Combet, C.; Connors, J.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; De Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dowell, C. D.; Duband, L.; Ducout, A.; Dunkley, J.; Dupac, X.; Dvorkin, C.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Falgarone, E.; Filippini, J. P.; Finelli, F.; Fliescher, S.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Ghosh, T.; Giard, M.; Gjerløw, E.; Golwala, S. R.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Halpern, M.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Hasselfield, M.; Helou, G.; Henrot-Versillé, S.; Herranz, D.; Hildebrandt, S. R.; Hilton, G. C.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hovest, W.; Hristov, V. V.; Huffenberger, K. M.; Hui, H.; Hurier, G.; Irwin, K. D.; Jaffe, A. H.; Jaffe, T. R.; Jewell, J.; Jones, W. C.; Juvela, M.; Karakci, A.; Karkare, K. S.; Kaufman, J. P.; Keating, B. G.; Kefeli, S.; Keihänen, E.; Kernasovskiy, S. A.; Keskitalo, R.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kovac, J. M.; Krachmalnicoff, N.; Kunz, M.; Kuo, C. L.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leitch, E. M.; Leonardi, R.; Levrier, F.; Lewis, A.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P. M.; Lueker, M.; Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Mason, P.; Matarrese, S.; Megerian, K. G.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nguyen, H. T.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Brient, R.; Ogburn, R. W.; Orlando, A.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Pettorino, V.; Piacentini, F.; Piat, M.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Pratt, G. W.; Prunet, S.; Pryke, C.; Puget, J.-L.; Rachen, J. P.; Reach, W. T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Richter, S.; Ristorcelli, I.; Rocha, G.; Rossetti, M.; Roudier, G.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Savini, G.; Schwarz, R.; Scott, D.; Seiffert, M. D.; Sheehy, C. D.; Spencer, L. D.; Staniszewski, Z. 
K.; Stolyarov, V.; Sudiwala, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Teply, G. P.; Terenzi, L.; Thompson, K. L.; Toffolatti, L.; Tolan, J. E.; Tomasi, M.; Tristram, M.; Tucci, M.; Turner, A. D.; Valenziano, L.; Valiviita, J.; van Tent, B.; Vibert, L.; Vielva, P.; Vieregg, A. G.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Watson, R.; Weber, A. C.; Wehus, I. K.; White, M.; White, S. D. M.; Willmert, J.; Wong, C. L.; Yoon, K. W.; Yvon, D.; Zacchei, A.; Zonca, A.; Bicep2/Keck; Planck Collaborations

    2015-03-01

    We report the results of a joint analysis of data from BICEP2/Keck Array and Planck. BICEP2 and Keck Array have observed the same approximately 400 deg^2 patch of sky centered on RA 0 h, Dec. -57.5°. The combined maps reach a depth of 57 nK deg in Stokes Q and U in a band centered at 150 GHz. Planck has observed the full sky in polarization at seven frequencies from 30 to 353 GHz, but much less deeply in any given region (1.2 μK deg in Q and U at 143 GHz). We detect 150×353 cross-correlation in B modes at high significance. We fit the single- and cross-frequency power spectra at frequencies ≥150 GHz to a lensed-ΛCDM model that includes dust and a possible contribution from inflationary gravitational waves (as parametrized by the tensor-to-scalar ratio r), using a prior on the frequency spectral behavior of polarized dust emission from previous Planck analysis of other regions of the sky. We find strong evidence for dust and no statistically significant evidence for tensor modes. We probe various model variations and extensions, including adding a synchrotron component in combination with lower frequency data, and find that these make little difference to the r constraint. Finally, we present an alternative analysis which is similar to a map-based cleaning of the dust contribution, and show that this gives similar constraints. The final result is expressed as a likelihood curve for r, and yields an upper limit r_{0.05} < 0.12 at 95% confidence. Marginalizing over dust and r, lensing B modes are detected at 7.0σ significance.

  16. Order-Constrained Reference Priors with Implications for Bayesian Isotonic Regression, Analysis of Covariance and Spatial Models

    NASA Astrophysics Data System (ADS)

    Gong, Maozhen

    Selecting an appropriate prior distribution is a fundamental issue in Bayesian statistics. In this dissertation, under the framework provided by Berger and Bernardo, I derive the reference priors for several models, including Analysis of Variance (ANOVA)/Analysis of Covariance (ANCOVA) models with a categorical variable under common ordering constraints, and the conditionally autoregressive (CAR) and simultaneous autoregressive (SAR) models with a spatial autoregression parameter ρ. The performance of the reference priors for the ANOVA/ANCOVA models is evaluated in simulation studies, with comparisons to Jeffreys' prior and Least Squares Estimation (LSE). The priors are then illustrated in a Bayesian model of the "Risk of Type 2 Diabetes in New Mexico" data, where the relationship between type 2 diabetes risk (through Hemoglobin A1c) and different smoking levels is investigated. In both the simulation studies and the real data set modeling, the reference priors that incorporate internal order information show good performance and can be used as default priors. The reference priors for the CAR and SAR models are also illustrated on the "1999 SAT State Average Verbal Scores" data, with a comparison to a Uniform prior distribution. Due to the complexity of the reference priors for both CAR and SAR models, only a portion (12 states in the Midwest) of the original data set is considered. The reference priors can give a different marginal posterior distribution than a Uniform prior, which provides an alternative for prior specification for areal data in spatial statistics.

  17. Japanese Encephalitis Risk and Contextual Risk Factors in Southwest China: A Bayesian Hierarchical Spatial and Spatiotemporal Analysis

    PubMed Central

    Zhao, Xing; Cao, Mingqin; Feng, Hai-Huan; Fan, Heng; Chen, Fei; Feng, Zijian; Li, Xiaosong; Zhou, Xiao-Hua

    2014-01-01

    It is valuable to study the spatiotemporal pattern of Japanese encephalitis (JE) and its association with the contextual risk factors in southwest China, which is the most endemic area in China. Using data from 2004 to 2009, we applied GIS mapping and spatial autocorrelation analysis to analyze reported incidence data of JE in 438 counties in southwest China, finding that JE cases were not randomly distributed, and a Bayesian hierarchical spatiotemporal model identified the east part of southwest China as a high risk area. Meanwhile, the Bayesian hierarchical spatial model in 2006 demonstrated a statistically significant association between JE and the agricultural and climatic variables, including the proportion of rural population, the pig-to-human ratio, the monthly precipitation and the monthly mean minimum and maximum temperatures. Particular emphasis was placed on the time-lagged effect for climatic factors. The regression method and the Spearman correlation analysis both identified a two-month lag for the precipitation, while the regression method found a one-month lag for temperature. The results show that the high risk area in the east part of southwest China may be connected to the agricultural and climatic factors. The routine surveillance and the allocation of health resources should be given more attention in this area. Moreover, the meteorological variables might be considered as possible predictors of JE in southwest China. PMID:24739769

  18. Solid rocket motor aft field joint flow field analysis

    NASA Technical Reports Server (NTRS)

    Sabnis, Jayant S.; Gibeling, Edward J.; Mcdonald, Henry

    1987-01-01

    An efficient Navier-Stokes analysis was successfully applied to simulate the complex flow field in the vicinity of a slot in a solid rocket motor with segment joints. The capability of the computer code to resolve the flow near solid surfaces without using a wall function assumption was demonstrated. In view of the complex nature of the flow field in the vicinity of the slot, this approach is considered essential. The results obtained from these calculations provide valuable design information, which would otherwise be extremely difficult to obtain. The results of the axisymmetric calculations indicate the presence of a region of reversed axial flow at the aft-edge of the slot and show the over-pressure in the slot to be only about 10 psi. The results of the asymmetric calculations indicate that a pressure asymmetry more than two diameters downstream of the slot has no noticeable effect on the flow field in the slot. They also indicate that the circumferential pressure differential caused in the slot due to failure of a 15 deg section of the castable inhibitor will be approximately 1 psi.

  19. Analysis and testing of a space crane articulating joint testbed

    NASA Technical Reports Server (NTRS)

    Sutter, Thomas R.; Wu, K. Chauncey

    1992-01-01

    The topics are presented in viewgraph form and include: space crane concept with mobile base; mechanical versus structural articulating joint; articulating joint test bed and reference truss; static and dynamic characterization completed for space crane reference truss configuration; improved linear actuators reduce articulating joint test bed backlash; 1-DOF space crane slew maneuver; boom 2 tip transient response finite element dynamic model; boom 2 tip transient response shear-corrected component modes torque driver profile; peak root member force vs. slew time torque driver profile; and open loop control of space crane motion.

  20. Using Bayesian analysis and Gaussian processes to infer electron temperature and density profiles on the Mega-Ampere Spherical Tokamak experiment

    SciTech Connect

    Nessi, G. T. von; Hole, M. J.

    2013-06-15

    A unified, Bayesian inference of midplane electron temperature and density profiles using both Thomson scattering (TS) and interferometric data is presented. Beyond the Bayesian nature of the analysis, novel features of the inference are the use of a Gaussian process prior to infer a mollification length-scale of inferred profiles and the use of Gauss-Laguerre quadratures to directly calculate the depolarisation term associated with the TS forward model. Results are presented from an application of the method to data from the high resolution TS system on the Mega-Ampere Spherical Tokamak, along with a comparison to profiles coming from the standard analysis carried out on that system.

  1. Using Bayesian analysis and Gaussian processes to infer electron temperature and density profiles on the Mega-Ampere Spherical Tokamak experiment.

    PubMed

    von Nessi, G T; Hole, M J

    2013-06-01

    A unified, Bayesian inference of midplane electron temperature and density profiles using both Thomson scattering (TS) and interferometric data is presented. Beyond the Bayesian nature of the analysis, novel features of the inference are the use of a Gaussian process prior to infer a mollification length-scale of inferred profiles and the use of Gauss-Laguerre quadratures to directly calculate the depolarisation term associated with the TS forward model. Results are presented from an application of the method to data from the high resolution TS system on the Mega-Ampere Spherical Tokamak, along with a comparison to profiles coming from the standard analysis carried out on that system. PMID:23822343

  2. Bayesian network learning for natural hazard assessments

    NASA Astrophysics Data System (ADS)

    Vogel, Kristin

    2016-04-01

    Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature, as well as lacking knowledge about their driving forces and potential effects, makes their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify the uncertainties involved, but also to express and communicate them in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise fall back on familiar (mostly deterministic) procedures. Within the scope of the DFG research training group "NatRiskChange" we apply the probabilistic framework of Bayesian networks to diverse natural hazard and vulnerability studies. The great potential of Bayesian networks has already been shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables. Hence, any conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely data-driven or be given by experts; even a combination of both is possible. By translating the (in-)dependencies into a graph structure, Bayesian networks provide direct insights into the workings of the system and allow learning about the underlying processes. Despite numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake-induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables.
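
    As an illustration of the basic mechanics (not of the NatRiskChange models themselves), the sketch below builds a three-node discrete Bayesian network with invented conditional probability tables and reads off a conditional of interest, the effect of a precautionary measure on damage, by enumerating the joint distribution.

        # Minimal sketch with invented CPTs: hazard intensity H, precaution P,
        # damage D; the joint factorises as P(H) P(P) P(D | H, P), and any
        # conditional is obtained by summing the joint.
        import itertools

        p_H = {0: 0.7, 1: 0.3}                      # hazard intensity: low/high
        p_P = {0: 0.5, 1: 0.5}                      # precautionary measure: no/yes
        p_D = {(0, 0): 0.2, (0, 1): 0.05,           # P(D=1 | H, P)
               (1, 0): 0.8, (1, 1): 0.4}

        def joint(h, p, d):
            pd1 = p_D[(h, p)]
            return p_H[h] * p_P[p] * (pd1 if d == 1 else 1.0 - pd1)

        # Effect of precaution on damage: P(D=1 | P=p) by enumeration.
        for p in (0, 1):
            num = sum(joint(h, p, 1) for h in (0, 1))
            den = sum(joint(h, p, d)
                      for h, d in itertools.product((0, 1), (0, 1)))
            print(f"P(damage | precaution={p}) = {num / den:.3f}")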

  3. Neural network modeling and an uncertainty analysis in Bayesian framework: A case study from the KTB borehole site

    NASA Astrophysics Data System (ADS)

    Maiti, Saumen; Tiwari, Ram Krishna

    2010-10-01

    A new probabilistic approach based on the concept of Bayesian neural network (BNN) learning theory is proposed for decoding litho-facies boundaries from well-log data. We show how a multi-layer perceptron neural network model can be employed in a Bayesian framework to classify changes in litho-log successions. The method is then applied to the German Continental Deep Drilling Program (KTB) well-log data for classification and uncertainty estimation of the litho-facies boundaries. In this framework, the posterior distribution of the network parameters is estimated via the principles of Bayesian probability theory, and an objective function is minimized following the scaled conjugate gradient optimization scheme. For the model development, we impose a suitable criterion, which provides probabilistic information by emulating different combinations of synthetic data. Uncertainty in the relationship between the data and the model space is appropriately taken care of by assuming a Gaussian prior distribution of the network parameters (e.g., synaptic weights and biases). Prior to applying the new method to the real KTB data, we tested it on synthetic examples to examine the sensitivity of the neural network hyperparameters in prediction. Within this framework, we examine the stability and efficiency of this new probabilistic approach using different kinds of synthetic data with different levels of correlated noise. Our data analysis suggests that the designed network topology based on the Bayesian paradigm is stable up to nearly 40% correlated noise; however, adding more noise (~50% or more) degrades the results. We perform uncertainty analyses on training, validation, and test data sets with and without intrinsic noise by making a Gaussian approximation of the posterior distribution about the peak model. We present a standard-deviation error map at the network output corresponding to the three types of litho-facies present over the entire litho-log.
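
    The core of the approach, a Gaussian prior over network weights turning training into penalised-likelihood estimation, can be sketched compactly. The snippet below fits a one-hidden-layer classifier to synthetic two-class "log" data by maximum a posteriori estimation; the architecture, prior precision alpha and data are illustrative assumptions, not the paper's KTB setup.

        # Sketch of the Bayesian-regularisation idea (not the paper's full
        # BNN): a Gaussian prior on the weights makes MAP estimation a
        # penalised likelihood, optimised here with SciPy's BFGS.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(3)
        X = rng.normal(size=(200, 4))                    # 4 synthetic log channels
        y = (X[:, 0] - 0.5 * X[:, 1] > 0).astype(float)  # two litho-facies classes

        H = 6                                            # hidden units
        n_w = 4 * H + H + H + 1                          # weights + biases

        def unpack(w):
            i = 0
            W1 = w[i:i + 4 * H].reshape(4, H); i += 4 * H
            b1 = w[i:i + H]; i += H
            W2 = w[i:i + H]; i += H
            return W1, b1, W2, w[i]

        def neg_log_posterior(w, alpha=1.0):
            W1, b1, W2, b2 = unpack(w)
            h = np.tanh(X @ W1 + b1)
            p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
            p = np.clip(p, 1e-9, 1 - 1e-9)
            nll = -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
            return nll + 0.5 * alpha * np.sum(w ** 2)    # Gaussian weight prior

        res = minimize(neg_log_posterior, rng.normal(0, 0.1, n_w), method="BFGS")
        print("converged:", res.success, "MAP objective:", round(res.fun, 2))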

  4. Elasto-Plastic Analysis of Tee Joints Using HOT-SMAC

    NASA Technical Reports Server (NTRS)

    Arnold, Steve M. (Technical Monitor); Bednarcyk, Brett A.; Yarrington, Phillip W.

    2004-01-01

    The Higher Order Theory - Structural/Micro Analysis Code (HOT-SMAC) software package is applied to analyze the linearly elastic and elasto-plastic response of adhesively bonded tee joints. Joints of this type are finding an increasing number of applications with the increased use of composite materials within advanced aerospace vehicles, and improved tools for the design and analysis of these joints are needed. The linearly elastic results of the code are validated against finite element analysis results from the literature under different loading and boundary conditions, and new results are generated to investigate the inelastic behavior of the tee joint. The comparison with the finite element results indicates that HOT-SMAC is an efficient and accurate alternative to the finite element method and has a great deal of potential as an analysis tool for a wide range of bonded joints.

  5. Development of design and analysis methodology for composite bolted joints

    NASA Astrophysics Data System (ADS)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/±45/90) family of laminates using filled hole and unnotched test data.

  6. Spatial component analysis of MRI data for Alzheimer's disease diagnosis: a Bayesian network approach

    PubMed Central

    Illan, Ignacio A.; Górriz, Juan M.; Ramírez, Javier; Meyer-Base, Anke

    2014-01-01

    This work presents a spatial-component (SC) based approach to aid the diagnosis of Alzheimer's disease (AD) using magnetic resonance images. In this approach, the whole brain image is subdivided into regions or spatial components, and a Bayesian network is used to model the dependencies between regions affected by AD. The structure of relations between affected regions allows neurodegeneration to be detected with an estimated performance of 88% on more than 400 subjects and predicted with 80% accuracy, supporting the conclusion that modeling the dependencies between components improves the recognition of different patterns of brain degeneration in AD. PMID:25505408

  7. Bayesian analysis of non-linear differential equation models with application to a gut microbial ecosystem.

    PubMed

    Lawson, Daniel J; Holtrop, Grietje; Flint, Harry

    2011-07-01

    Process models specified by non-linear dynamic differential equations contain many parameters, which often must be inferred from a limited amount of data. We discuss a hierarchical Bayesian approach that combines data from multiple related experiments in a meaningful way and permits more powerful inference than treating each experiment as independent. The approach is illustrated with a simulation study and example data from experiments replicating aspects of the human gut microbial ecosystem. A predictive model is obtained that contains prediction uncertainty caused by uncertainty in the parameters, and we extend the model to capture situations of interest that cannot easily be studied experimentally. PMID:21681780
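
    A minimal sketch of the hierarchical pooling idea follows, with an analytically solvable exponential-decay model standing in for the gut-ecosystem differential equations: each experiment has its own rate k_i, the rates share a Gaussian population distribution with unknown mean and spread, and everything is sampled jointly by Metropolis. All numbers are invented.

        # Sketch of partial pooling across related experiments, using
        # y = exp(-k t) in place of the paper's ODE system; rates k_i
        # share a common Gaussian population distribution N(mu, tau^2).
        import numpy as np

        rng = np.random.default_rng(7)
        t = np.linspace(0.0, 5.0, 12)
        true_k = [0.9, 1.1, 1.0]                   # three related experiments
        data = [np.exp(-k * t) + rng.normal(0, 0.05, t.size) for k in true_k]

        def log_post(theta):
            mu, log_tau, ks = theta[0], theta[1], theta[2:]
            tau = np.exp(log_tau)
            lp = -0.5 * np.sum((ks - mu) ** 2) / tau**2 - ks.size * log_tau
            for y, k in zip(data, ks):
                lp += -0.5 * np.sum((y - np.exp(-k * t)) ** 2) / 0.05**2
            return lp

        theta = np.array([1.0, np.log(0.2), 1.0, 1.0, 1.0])
        lp = log_post(theta)
        draws = []
        for it in range(30000):
            prop = theta + rng.normal(0, 0.01, theta.size)
            lp_p = log_post(prop)
            if np.log(rng.uniform()) < lp_p - lp:
                theta, lp = prop, lp_p
            if it >= 10000:
                draws.append(theta.copy())

        draws = np.array(draws)
        print("population mean rate:", draws[:, 0].mean().round(3))
        print("per-experiment rates:", draws[:, 2:].mean(axis=0).round(3))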

  8. Bayesian analysis of recurrent event with dependent termination: an application to a heart transplant study.

    PubMed

    Ouyang, Bichun; Sinha, Debajyoti; Slate, Elizabeth H; Van Bakel, Adrian B

    2013-07-10

    For a heart transplant patient, the risk of graft rejection and risk of death are likely to be associated. Two fully specified Bayesian models for recurrent events with dependent termination are applied to investigate the potential relationships between these two types of risk as well as association with risk factors. We particularly focus on the choice of priors, selection of the appropriate prediction model, and prediction methods for these two types of risk for an individual patient. Our prediction tools can be easily implemented and helpful to physicians for setting heart transplant patients' biopsy schedule. PMID:23280968

  9. A Bayesian approach to the analysis of quantal bioassay studies using nonparametric mixture models.

    PubMed

    Fronczyk, Kassandra; Kottas, Athanasios

    2014-03-01

    We develop a Bayesian nonparametric mixture modeling framework for quantal bioassay settings. The approach is built upon modeling dose-dependent response distributions. We adopt a structured nonparametric prior mixture model, which induces a monotonicity restriction for the dose-response curve. Particular emphasis is placed on the key risk assessment goal of calibration for the dose level that corresponds to a specified response. The proposed methodology yields flexible inference for the dose-response relationship as well as for other inferential objectives, as illustrated with two data sets from the literature. PMID:24354490

  10. Joint venture versus outreach: a financial analysis of case studies.

    PubMed

    Forsman, R W

    2001-01-01

    Medical centers across the country are facing cost challenges, and national commercial laboratories are experiencing financial declines that necessitate their capturing market share in any way possible. Many laboratories are turning to joint ventures or partnerships for financial relief. However, it often is in the best interest of the patient and the medical center to integrate laboratory services across the continuum of care. This article analyzes two hypothetical joint ventures involving a laboratory management agreement and full laboratory outsourcing. PMID:11490651

  11. Bayesian analysis of stage-discharge relationships affected by hysteresis and quantification of the associated uncertainties

    NASA Astrophysics Data System (ADS)

    Mansanarez, Valentin; Le Coz, Jérôme; Renard, Benjamin; Lang, Michel; Birgand, François

    2015-04-01

    The hysteresis effect is a hydraulic phenomenon associated with transient flow in a relatively flat channel. Hysteresis leads to non-univocal stage-discharge relationships: for a given stage, discharge during the rising limb is greater than during the recession. Hysteresis occurs in open-channel flows because the velocity wave usually propagates faster than the pressure (stage) wave. In practice, hysteresis is often ignored when developing hydrometric rating curves, leading to biased flood hydrographs. When hysteresis is not ignored, the most common practice is to correct the univocal rating curve by using the simple Jones formula. This formula requires the estimation of several physical variables through numerical modelling and/or expertise, and the estimation of the associated discharge uncertainty is still an open question. The Bayesian method proposed in this presentation incorporates information from both hydraulic knowledge (equations of channel controls based on geometry and roughness estimates) and stage-discharge observations (gauging data). The resulting total uncertainty combines parametric uncertainty (unknown rating curve parameters) and structural uncertainty (imperfection of the rating curve model). The method provides a direct estimation of the physical inputs of the rating curve (roughness, bed slope, kinematic wave celerity, etc.). Two hysteresis formulas were used: the widely used Jones formula and its expansion to the 3rd order, known as the Fenton formula. The wave celerity may be either constant or expressed as a simple function of stage based on the kinematic wave assumption. The method has been applied to one data set, and sensitivity tests allowed us to draw the following conclusions. As expected, more precise hydraulic priors and/or less uncertain gaugings produce rating curves that agree well with discharge measurements and have a smaller uncertainty. The simple Jones formula yields results as good as those of the more complex Fenton formula.
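
    The Jones correction at the heart of the rating-curve model is simple enough to state in code. The sketch below applies it with assumed values for the steady-rating coefficients, bed slope S0 and kinematic wave celerity c; only the formula itself comes from the literature.

        # Sketch of the Jones correction: discharge on a looped rating is
        # the steady-flow value scaled by a term in the stage rate of
        # change dh/dt, the bed slope S0 and the kinematic wave celerity c.
        import numpy as np

        def jones_discharge(h, dh_dt, a=30.0, b=1.7, S0=2e-4, c=2.0):
            """Q = Qs(h) * sqrt(1 + dh/dt / (S0 * c)), with an assumed
            power-law steady rating Qs = a * h**b; a, b, S0, c are
            illustrative values, not fitted ones."""
            return a * h**b * np.sqrt(1.0 + dh_dt / (S0 * c))

        h = 2.0                                    # stage (m)
        for dh_dt, limb in ((+2e-4, "rising"), (0.0, "steady"),
                            (-2e-4, "falling")):
            print(f"{limb:>7}: Q = {jones_discharge(h, dh_dt):.1f} m^3/s")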

  12. A Bayesian self-clustering analysis of the highest energy cosmic rays detected by the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Khanin, Alexander; Mortlock, Daniel J.

    2014-10-01

    Cosmic rays are protons and atomic nuclei that flow into our Solar system and reach the Earth with energies of up to ~10^21 eV. The sources of ultrahigh energy cosmic rays (UHECRs) with E ≳ 10^19 eV remain unknown, although there are theoretical reasons to think that at least some come from active galactic nuclei (AGNs). One way to assess the different hypotheses is by analysing the arrival directions of UHECRs, in particular their self-clustering. We have developed a fully Bayesian approach to analysing the self-clustering of points on the sphere, which we apply to the UHECR arrival directions. The analysis is based on a multistep approach that enables the application of Bayesian model comparison to cases with weak prior information. We have applied this approach to the 69 highest energy events recorded by the Pierre Auger Observatory, which is the largest current UHECR data set. We do not detect self-clustering, but simulations show that this is consistent with the AGN-sourced model for a data set of this size. Data sets of several hundred UHECRs would be sufficient to detect clustering in the AGN model. Samples of this magnitude are expected to be produced by future experiments, such as the Japanese Experiment Module Extreme Universe Space Observatory.

  13. Genotype-Based Bayesian Analysis of Gene-Environment Interactions with Multiple Genetic Markers and Misclassification in Environmental Factors

    PubMed Central

    Lobach, Iryna; Fan, Ruzong

    2015-01-01

    A key component of understanding the etiology of complex diseases, such as cancer, diabetes and alcohol dependence, is to investigate gene-environment interactions. This work is motivated by the following two concerns in the analysis of gene-environment interactions. First, multiple genetic markers in moderate linkage disequilibrium may be involved in susceptibility to a complex disease. Second, environmental factors may be subject to misclassification. We develop a genotype-based Bayesian pseudolikelihood approach that accommodates linkage disequilibrium in the genetic markers and misclassification in the environmental factors. Since our approach is genotype-based, it allows the observed genetic information to enter the model directly, eliminating the need to infer haplotype phase and simplifying computations. The Bayesian approach allows shrinking parameter estimates toward the prior distribution to improve estimation and inference when environmental factors are subject to misclassification. Simulation experiments demonstrated that our method produces parameter estimates that are nearly unbiased even for small sample sizes. An application of our method is illustrated using a case-control study of the interaction between early onset of drinking and genes involved in the dopamine pathway. PMID:26180529
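
    The misclassification component can be isolated in a short sketch: the likelihood of an observed exposure indicator marginalises over the unobserved true exposure using an assumed sensitivity and specificity. The logistic risk model, coefficients and rates below are invented for illustration and are not the paper's pseudolikelihood.

        # Sketch of a misclassification-adjusted likelihood with invented
        # sensitivity/specificity: sum over the unobserved true exposure E
        # given the observed indicator E*.
        import numpy as np

        sens, spec = 0.85, 0.90      # P(E*=1|E=1), P(E*=0|E=0) (assumed known)
        p_exposure = 0.3             # assumed prevalence of true exposure

        def p_disease(e, g, beta=(-2.0, 0.5, 0.4, 0.8)):
            """Logistic disease risk with main effects and a gene-environment
            interaction; g is a genotype count (0, 1, 2)."""
            b0, b_e, b_g, b_eg = beta
            eta = b0 + b_e * e + b_g * g + b_eg * e * g
            return 1.0 / (1.0 + np.exp(-eta))

        def lik_observed(d, e_star, g):
            """P(D=d, E*=e_star | g), marginalising the true exposure E."""
            total = 0.0
            for e in (0, 1):
                p_e = p_exposure if e == 1 else 1.0 - p_exposure
                p_obs = ((sens if e_star == 1 else 1 - sens) if e == 1
                         else ((1 - spec) if e_star == 1 else spec))
                p_d = p_disease(e, g)
                total += p_e * p_obs * (p_d if d == 1 else 1.0 - p_d)
            return total

        print("P(case, reported-exposed | g=2):", round(lik_observed(1, 1, 2), 4))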

  14. Bayesian analysis and classification of two Enzyme-Linked Immunosorbent Assay (ELISA) tests without a gold standard

    PubMed Central

    Zhang, Jingyang; Chaloner, Kathryn; McLinden, James H.; Stapleton, Jack T.

    2013-01-01

    Reconciling two quantitative ELISA tests for an antibody to an RNA virus, in a situation without a gold standard and where false negatives may occur, is the motivation for this work. False negatives occur when access of the antibody to the binding site is blocked. Based on the mechanism of the assay, a mixture of four bivariate normal distributions is proposed with the mixture probabilities depending on a two-stage latent variable model including the prevalence of the antibody in the population and the probabilities of blocking on each test. There is prior information on the prevalence of the antibody, and also on the probability of false negatives, and so a Bayesian analysis is used. The dependence between the two tests is modeled to be consistent with the biological mechanism. Bayesian decision theory is utilized for classification. The proposed method is applied to the motivating data set to classify the data into two groups: those with and those without the antibody. Simulation studies describe the properties of the estimation and the classification. Sensitivity to the choice of the prior distribution is also addressed by simulation. The same model with two levels of latent variables is applicable in other testing procedures such as quantitative polymerase chain reaction tests where false negatives occur when there is a mutation in the primer sequence. PMID:23592433
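
    One plausible reading of the mixture structure (with invented parameters throughout) is sketched below: the component weights follow from the antibody prevalence pi and the per-test blocking probabilities b1 and b2, each component is a bivariate normal over the two ELISA readings, and classification uses the posterior probability of carrying the antibody.

        # Sketch of a four-component bivariate normal mixture; all weights,
        # means and covariances are invented. The "blocked on both tests"
        # mass is folded into the both-negative component.
        import numpy as np
        from scipy.stats import multivariate_normal as mvn

        pi, b1, b2 = 0.6, 0.1, 0.15   # prevalence, blocking probs (assumed)

        components = [
            # (weight, mean, covariance)
            (1 - pi + pi * b1 * b2,
             [0.5, 0.5], [[0.2, 0.02], [0.02, 0.2]]),   # negative on both
            (pi * (1 - b1) * (1 - b2),
             [3.0, 3.2], [[0.5, 0.30], [0.30, 0.5]]),   # detected on both
            (pi * b1 * (1 - b2),
             [0.6, 3.1], [[0.2, 0.00], [0.00, 0.5]]),   # blocked on test 1
            (pi * (1 - b1) * b2,
             [3.1, 0.6], [[0.5, 0.00], [0.00, 0.2]]),   # blocked on test 2
        ]

        def density(x):
            return sum(w * mvn.pdf(x, mean=m, cov=c) for w, m, c in components)

        # Classify by the posterior probability of carrying the antibody.
        x = np.array([2.8, 0.7])      # discordant pair of readings
        w_pos = sum(w * mvn.pdf(x, mean=m, cov=c) for w, m, c in components[1:])
        print("P(antibody | x) =", round(w_pos / density(x), 3))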

  15. Analysis of in-situ rock joint strength using digital borehole scanner images

    SciTech Connect

    Thapa, B.B.

    1994-09-01

    The availability of high resolution digital images of borehole walls from the Borehole Scanner System has made it possible to develop new methods of in-situ rock characterization. This thesis addresses, in particular, new approaches to the characterization of in-situ joint strength arising from surface roughness. An image processing technique is used to extract the roughness profile from joints in the unrolled image of the borehole wall. A method for estimating in-situ Rengers envelopes from these data is presented, along with results from applying the method to joints in a borehole in porphyritic granite. Next, an analysis of the joint dilation angle anisotropy is described and applied to the porphyritic granite joints. The results indicate that the dilation angles of the joints studied are anisotropic at small scales and tend to reflect joint waviness as the scale increases. A procedure to unroll the opposing roughness profiles to obtain a two-dimensional sample is presented. The measurement of apertures during this process is shown to produce an error that increases with the dip of the joint. The two-dimensional sample of opposing profiles is used in a new kinematic analysis of the joint shear stress-shear deformation behavior. Examples of applying these methods to the porphyritic granite joints are presented. The unrolled opposing profiles were used in a numerical simulation of a direct shear test using Discontinuous Deformation Analysis, and the results were compared to laboratory test results using core samples containing the same joints. The simulated dilatancy and shear stress-shear deformation curves were close to the laboratory curves in the case of a joint in porphyritic granite.
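
    The scale dependence of the dilation angle can be illustrated directly. For a digitised roughness profile (synthetic here), the sketch below takes the dilation angle at each base length as the steepest chord inclination over the profile; small base lengths pick up surface roughness while long ones converge toward the waviness.

        # Sketch of scale-dependent dilation angles from a synthetic
        # roughness profile: at each base length L, take the steepest
        # positive chord inclination over the whole profile.
        import numpy as np

        rng = np.random.default_rng(5)
        dx = 0.5                                   # sampling step (mm)
        x = np.arange(0, 200.0, dx)
        z = (2.0 * np.sin(2 * np.pi * x / 80.0)    # large-scale waviness
             + 0.03 * rng.normal(size=x.size).cumsum())  # small-scale roughness

        for L in (1.0, 5.0, 20.0, 50.0):           # base lengths (mm)
            n = int(L / dx)
            slopes = (z[n:] - z[:-n]) / (x[n:] - x[:-n])
            angle = np.degrees(np.arctan(slopes.max()))
            print(f"base length {L:5.1f} mm: dilation angle = {angle:5.1f} deg")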

  16. Bayesian learning

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1989-01-01

    In 1983 and 1984, the Infrared Astronomical Satellite (IRAS) detected 5,425 stellar objects and measured their infrared spectra. In 1987 a program called AUTOCLASS used Bayesian inference methods to discover the classes present in these data and determine the most probable class of each object, revealing unknown phenomena in astronomy. AUTOCLASS has rekindled the old debate on the suitability of Bayesian methods, which are computationally intensive, interpret probabilities as plausibility measures rather than frequencies, and appear to depend on a subjective assessment of the probability of a hypothesis before the data were collected. Modern statistical methods have, however, recently been shown to also depend on subjective elements. These debates bring into question the whole tradition of scientific objectivity and offer scientists a new way to take responsibility for their findings and conclusions.

  17. Bayesian Analysis of Two Stellar Populations in Galactic Globular Clusters. II. NGC 5024, NGC 5272, and NGC 6352

    NASA Astrophysics Data System (ADS)

    Wagner-Kaiser, R.; Stenning, D. C.; Robinson, E.; von Hippel, T.; Sarajedini, A.; van Dyk, D. A.; Stein, N.; Jefferys, W. H.

    2016-07-01

    We use Cycle 21 Hubble Space Telescope (HST) observations and HST archival Advanced Camera for Surveys Treasury observations of Galactic Globular Clusters to find and characterize two stellar populations in NGC 5024 (M53), NGC 5272 (M3), and NGC 6352. For these three clusters, both single- and double-population analyses are used to determine best-fit isochrones. We employ a sophisticated Bayesian analysis technique to simultaneously fit the cluster parameters (age, distance, absorption, and metallicity) that characterize each cluster. For the two-population analysis, unique population-level helium values are also fit to each distinct population of the cluster and the relative proportions of the populations are determined. We find differences in helium ranging from ∼0.05 to 0.11 for these three clusters. Model grids with solar α-element abundances ([α/Fe] = 0.0) and enhanced α-elements ([α/Fe] = 0.4) are adopted.

  18. Stresses In Vehicle Chassis Joints - A Comparison Of SPATE With Other Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Loader, A. J.; Turner, W. B.; Harwood, N.

    1987-04-01

    Joints in ladder frame chassis have been studied as part of an SERC Teaching Company Scheme. The joints between the cross members and side members are complex structures involving bolts, welds and/or rivets, as the cross member section can be tubular, box or C-section. It is therefore difficult to apply simple analytical methods to such joints. This paper compares the stresses obtained by brittle lacquer, strain gauge and SPATE measurements with those found from a finite element analysis of the joints.

  19. Bayesian analysis of a morphological supermatrix sheds light on controversial fossil hominin relationships

    PubMed Central

    Dembo, Mana; Matzke, Nicholas J.; Mooers, Arne Ø.; Collard, Mark

    2015-01-01

    The phylogenetic relationships of several hominin species remain controversial. Two methodological issues contribute to the uncertainty—use of partial, inconsistent datasets and reliance on phylogenetic methods that are ill-suited to testing competing hypotheses. Here, we report a study designed to overcome these issues. We first compiled a supermatrix of craniodental characters for all widely accepted hominin species. We then took advantage of recently developed Bayesian methods for building trees of serially sampled tips to test among hypotheses that have been put forward in three of the most important current debates in hominin phylogenetics—the relationship between Australopithecus sediba and Homo, the taxonomic status of the Dmanisi hominins, and the place of the so-called hobbit fossils from Flores, Indonesia, in the hominin tree. Based on our results, several published hypotheses can be statistically rejected. For example, the data do not support the claim that Dmanisi hominins and all other early Homo specimens represent a single species, nor that the hobbit fossils are the remains of small-bodied modern humans, one of whom had Down syndrome. More broadly, our study provides a new baseline dataset for future work on hominin phylogeny and illustrates the promise of Bayesian approaches for understanding hominin phylogenetic relationships. PMID:26202999

  1. Bayesian change point analysis of abundance trends for pelagic fishes in the upper San Francisco Estuary

    USGS Publications Warehouse

    Thompson, James R.; Kimmerer, Wim J.; Brown, Larry R.; Newman, Ken B.; Mac Nally, Ralph; Bennett, William A.; Feyrer, Frederick; Fleishman, Erica

    2010-01-01

    We examined trends in abundance of four pelagic fish species (delta smelt, longfin smelt, striped bass, and threadfin shad) in the upper San Francisco Estuary, California, USA, over 40 years using Bayesian change point models. Change point models identify times of abrupt or unusual changes in absolute abundance (step changes) or in rates of change in abundance (trend changes). We coupled Bayesian model selection with linear regression splines to identify biotic or abiotic covariates with the strongest associations with abundances of each species. We then refitted change point models conditional on the selected covariates to explore whether those covariates could explain statistical trends or change points in species abundances. We also fitted a multispecies change point model that identified change points common to all species. All models included hierarchical structures to model data uncertainties, including observation errors and missing covariate values. There were step declines in abundances of all four species in the early 2000s, with a likely common decline in 2002. Abiotic variables, including water clarity, position of the 2‰ isohaline (X2), and the volume of freshwater exported from the estuary, explained some variation in species' abundances over the time series, but no selected covariates could explain statistically the post-2000 change points for any species.
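
    A stripped-down version of the step-change detection is sketched below: with a flat prior on each segment mean and a known error standard deviation, the segment means integrate out analytically, leaving a discrete posterior over the change year. The synthetic series, with a step in 2002, and all parameters are invented; the published models are far richer (trend changes, hierarchical observation error, covariates).

        # Minimal single step-change sketch: with a flat prior on each
        # segment mean and known sigma, the marginal likelihood of a
        # segment of n points with centred sum of squares SS is
        # (2 pi sigma^2)^(-(n-1)/2) * n^(-1/2) * exp(-SS / (2 sigma^2)).
        import numpy as np

        rng = np.random.default_rng(11)
        years = np.arange(1967, 2007)
        abundance = (np.where(years < 2002, 10.0, 6.0)
                     + rng.normal(0, 1.5, years.size))
        sigma = 1.5

        def log_marginal(seg):
            n = seg.size
            ss = np.sum((seg - seg.mean()) ** 2)
            return (-0.5 * (n - 1) * np.log(2 * np.pi * sigma**2)
                    - 0.5 * np.log(n) - ss / (2 * sigma**2))

        taus = np.arange(2, years.size - 2)        # candidate change indices
        logp = np.array([log_marginal(abundance[:t]) + log_marginal(abundance[t:])
                         for t in taus])
        post = np.exp(logp - logp.max())
        post /= post.sum()
        print("most probable change year:", years[taus[post.argmax()]])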

  2. Bayesian analysis of a morphological supermatrix sheds light on controversial fossil hominin relationships.

    PubMed

    Dembo, Mana; Matzke, Nicholas J; Mooers, Arne Ø; Collard, Mark

    2015-08-01

    The phylogenetic relationships of several hominin species remain controversial. Two methodological issues contribute to the uncertainty: use of partial, inconsistent datasets and reliance on phylogenetic methods that are ill-suited to testing competing hypotheses. Here, we report a study designed to overcome these issues. We first compiled a supermatrix of craniodental characters for all widely accepted hominin species. We then took advantage of recently developed Bayesian methods for building trees of serially sampled tips to test among hypotheses that have been put forward in three of the most important current debates in hominin phylogenetics: the relationship between Australopithecus sediba and Homo, the taxonomic status of the Dmanisi hominins, and the place of the so-called hobbit fossils from Flores, Indonesia, in the hominin tree. Based on our results, several published hypotheses can be statistically rejected. For example, the data do not support the claim that Dmanisi hominins and all other early Homo specimens represent a single species, nor that the hobbit fossils are the remains of small-bodied modern humans, one of whom had Down syndrome. More broadly, our study provides a new baseline dataset for future work on hominin phylogeny and illustrates the promise of Bayesian approaches for understanding hominin phylogenetic relationships. PMID:26202999

  3. The GRB Golenetskii Correlation as a Cosmological Probe via Bayesian Analysis

    NASA Astrophysics Data System (ADS)

    Burgess, Michael

    2016-07-01

    Gamma-ray bursts (GRBs) are characterized by a strong correlation between the instantaneous luminosity and the spectral peak energy within a burst. This correlation, which is known as the hardness-intensity correlation or the Golenetskii correlation, not only holds important clues to the physics of GRBs but is thought to have the potential to determine redshifts of bursts. In this paper, I use a hierarchical Bayesian model to study the universality of the rest-frame Golenetskii correlation and in particular I assess its use as a redshift estimator for GRBs. I find that, using a power-law prescription of the correlation, the power-law indices cluster near a common value, but have a broader variance than previously reported (~1-2). Furthermore, I find evidence that there is spread in the intrinsic rest-frame correlation normalizations for the GRBs in our sample (~10^51-10^53 erg/s). This points towards variable physical settings of the emission (magnetic field strength, number of emitting electrons, photospheric radius, viewing angle, etc.). Consequently, these results eliminate the Golenetskii correlation as a useful tool for redshift determination and hence a cosmological probe. Nevertheless, the Bayesian method introduced in this paper allows for a better determination of the rest-frame properties of the correlation, which in turn allows more stringent limitations for physical models of the emission to be set.

  4. A Bayesian analysis of HAT-P-7b using the EXONEST algorithm

    SciTech Connect

    Placek, Ben; Knuth, Kevin H.

    2015-01-13

    The study of exoplanets (planets orbiting other stars) is revolutionizing the way we view our universe. High-precision photometric data provided by the Kepler Space Telescope (Kepler) enable not only the detection of such planets, but also their characterization. This presents a unique opportunity to apply Bayesian methods to better characterize the multitude of previously confirmed exoplanets. This paper focuses on applying the EXONEST algorithm to characterize the transiting short-period hot Jupiter HAT-P-7b (also referred to as Kepler-2b). EXONEST evaluates a suite of exoplanet photometric models by applying Bayesian model selection, implemented with the MultiNest algorithm. These models take into account planetary effects, such as reflected light and thermal emission, as well as the effects of the planetary motion on the host star, such as Doppler beaming, or boosting, of light from the reflex motion of the host star, and photometric variations due to the planet-induced ellipsoidal shape of the host star. By calculating model evidences, one can determine which model best describes the observed data, thus identifying which effects dominate the planetary system. Presented are parameter estimates and model evidences for HAT-P-7b.

  5. Hierarchical Bayesian model for rare variant association analysis integrating genotype uncertainty in human sequence data.

    PubMed

    He, Liang; Pitkäniemi, Janne; Sarin, Antti-Pekka; Salomaa, Veikko; Sillanpää, Mikko J; Ripatti, Samuli

    2015-02-01

    Next-generation sequencing (NGS) has enabled the study of rare genetic variants, which may explain the missing heritability of complex diseases. Most existing methods for rare variant (RV) association detection do not account for the common presence of sequencing errors in NGS data. These errors can largely affect the power and perturb the accuracy of association tests due to rare observations of minor alleles. We developed a hierarchical Bayesian approach to estimate the association between RVs and complex diseases. Our integrated framework combines the misclassification probability with shrinkage-based Bayesian variable selection. It allows for flexibility in handling neutral and protective RVs with measurement error, and is robust enough to detect causal RVs across a wide spectrum of minor allele frequency (MAF). Imputation uncertainty and MAF are incorporated into the integrated framework to achieve the optimal statistical power. We demonstrate that sequencing error does significantly affect the findings, and that our proposed model can take advantage of it to improve statistical power in both simulated and real data. We further show that our model outperforms existing methods, such as the sequence kernel association test (SKAT). Finally, we illustrate the behavior of the proposed method using a Finnish low-density lipoprotein cholesterol study, and show that it identifies an RV known as FH North Karelia in the LDLR gene with three carriers among 1,155 individuals, which is missed by both SKAT and Granvil. PMID:25395270

  6. pyblocxs: Bayesian Low-Counts X-ray Spectral Analysis in Sherpa

    NASA Astrophysics Data System (ADS)

    Siemiginowska, A.; Kashyap, V.; Refsdal, B.; van Dyk, D.; Connors, A.; Park, T.

    2011-07-01

    Typical X-ray spectra have low counts and should be modeled using the Poisson distribution. However, the χ2 statistic is often applied as an alternative, with the data assumed to follow a Gaussian distribution. A variety of weightings of the statistic or binnings of the data are used to overcome the low-count issues; however, such modifications introduce biases and/or a loss of information. Standard modeling packages such as XSPEC and Sherpa provide the Poisson likelihood and allow computation of rudimentary MCMC chains, but so far do not allow a full Bayesian model to be set up. We have implemented a sophisticated Bayesian MCMC-based algorithm to carry out spectral fitting of low-count sources in the Sherpa environment. The code is a Python extension to Sherpa that allows one to fit a predefined Sherpa model to high-energy X-ray spectral data and other generic data. We present the algorithm and discuss several issues related to the implementation, including the flexible definition of priors and allowing for variations in the calibration information.
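
    The motivation for the Poisson likelihood is easy to demonstrate. In the sketch below, a constant source rate is fitted to simulated low-count data by minimising the Cash statistic (-2 log Poisson likelihood, up to a constant) and, for contrast, a chi-square statistic with data-based sqrt(N) errors, which is biased low; the example is generic rather than Sherpa-specific.

        # Sketch of Poisson (Cash) versus Gaussian chi-square fitting at
        # low counts: the chi-square fit with data-based errors is biased
        # low because low-count bins receive small error bars.
        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(2)
        true_rate = 2.0
        counts = rng.poisson(true_rate, size=100)  # low-count "spectrum"

        def cash(mu):
            return 2.0 * np.sum(mu - counts * np.log(mu))

        def chi2(mu):
            err2 = np.maximum(counts, 1)           # common data-based weighting
            return np.sum((counts - mu) ** 2 / err2)

        for name, stat in (("Cash", cash), ("chi-square", chi2)):
            fit = minimize_scalar(stat, bounds=(0.01, 10.0), method="bounded")
            print(f"{name:>10}: fitted rate = {fit.x:.2f} (true {true_rate})")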

  7. Reconstruction of a windborne insect invasion using a particle dispersal model, historical wind data, and Bayesian analysis of genetic data

    PubMed Central

    Lander, Tonya A; Klein, Etienne K; Oddou-Muratorio, Sylvie; Candau, Jean-Noël; Gidoin, Cindy; Chalon, Alain; Roig, Anne; Fallour, Delphine; Auger-Rozenberg, Marie-Anne; Boivin, Thomas

    2014-01-01

    Understanding how invasive species establish and spread is vital for developing effective management strategies for invaded areas and for identifying new areas where the risk of invasion is highest. We investigated the explanatory power of dispersal histories reconstructed from local-scale wind data and a regional-scale wind-dispersed particle trajectory model for the invasive seed chalcid wasp Megastigmus schimitscheki (Hymenoptera: Torymidae) in France. The explanatory power was tested by: (1) survival analysis of empirical data on M. schimitscheki presence, absence and year of arrival at 52 stands of the wasp's obligate hosts, Cedrus (true cedar trees); and (2) approximate Bayesian analysis of M. schimitscheki genetic data using a coalescence model. The Bayesian demographic modeling and traditional population genetic analysis suggested that the initial invasion across the range was the result of long-distance dispersal from the longest-established sites. The survival analyses of the windborne expansion patterns derived from a particle dispersal model indicated an informative correlation between the M. schimitscheki presence/absence data from the annual surveys and the scenarios based on regional-scale wind data. These three very different analyses produced highly congruent results supporting our proposal that wind is the most probable vector for passive long-distance dispersal of this invasive seed wasp. This result confirms that long-distance dispersal from introduction areas is a likely driver of secondary expansion of alien invasive species. Based on our results, management programs for this and other windborne invasive species may consider (1) focusing effort at the longest-established sites and (2) monitoring outlying populations, which remain critically important due to their influence on rates of spread. We also suggest that there is a distinct need for new analysis methods that can combine empirical spatiotemporal field data with dispersal models and genetic analyses of this kind.

  8. Towards a Fuzzy Bayesian Network Based Approach for Safety Risk Analysis of Tunnel-Induced Pipeline Damage.

    PubMed

    Zhang, Limao; Wu, Xianguo; Qin, Yawei; Skibniewski, Miroslaw J; Liu, Wenli

    2016-02-01

    Tunneling excavation is bound to produce significant disturbances to surrounding environments, and the tunnel-induced damage to adjacent underground buried pipelines is of considerable importance for geotechnical practice. A fuzzy Bayesian network (FBN) based approach for safety risk analysis is developed in this article with detailed step-by-step procedures, consisting of risk mechanism analysis, FBN model establishment, fuzzification, FBN-based inference, defuzzification, and decision making. In accordance with the failure mechanism analysis, a tunnel-induced pipeline damage model is proposed to reveal the cause-effect relationships between the pipeline damage and its influential variables. For the fuzzification process, an expert confidence indicator is proposed to reflect the reliability of the data when determining the fuzzy probability of occurrence of basic events, with both the judgment ability level and the subjectivity reliability level taken into account. By means of fuzzy Bayesian inference, the proposed approach is capable of calculating the probability distribution of potential safety risks and identifying the most likely potential causes of accidents under both prior-knowledge and given-evidence circumstances. A case concerning the safety analysis of underground buried pipelines adjacent to the construction of the Wuhan Yangtze River Tunnel is presented. The results demonstrate the feasibility of the proposed FBN approach and its application potential. The proposed approach can be used as a decision tool to provide support for safety assurance and management in tunnel construction, and thus increase the likelihood of a successful project in a complex project environment. PMID:26224125
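
    The fuzzification/inference/defuzzification loop can be miniaturised as follows, with invented numbers: expert-elicited basic-event probabilities are triangular fuzzy numbers, an OR gate is propagated by alpha-cut interval arithmetic (valid because the gate is monotone in its arguments), and a crisp value is recovered by averaging the alpha-cut midpoints.

        # Sketch of fuzzy probability propagation with invented numbers:
        # triangular fuzzy basic-event probabilities, an OR gate evaluated
        # on alpha-cut intervals, and a simple centroid-style defuzzifier.
        import numpy as np

        # Triangular fuzzy probabilities (low, mode, high) for two events.
        p1 = (0.01, 0.03, 0.06)
        p2 = (0.02, 0.05, 0.09)

        def alpha_cut(tri, a):
            lo, m, hi = tri
            return lo + a * (m - lo), hi - a * (hi - m)

        alphas = np.linspace(0.0, 1.0, 101)
        lows, highs = [], []
        for a in alphas:
            (l1, h1), (l2, h2) = alpha_cut(p1, a), alpha_cut(p2, a)
            # OR gate: P = 1 - (1-p1)(1-p2), monotone in both arguments,
            # so interval endpoints map to endpoints.
            lows.append(1 - (1 - l1) * (1 - l2))
            highs.append(1 - (1 - h1) * (1 - h2))

        # Defuzzify: average the alpha-cut midpoints (a crude centroid proxy).
        crisp = np.mean((np.array(lows) + np.array(highs)) / 2.0)
        print(f"crisp failure probability of the top event: {crisp:.4f}")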

  9. Computed tomography analysis of radiostereometric data to determine flexion axes after total joint replacement: application to the elbow joint.

    PubMed

    Ericson, A; Olivecrona, H; Stark, A; Noz, M E; Maguire, G Q; Zeleznik, M P; Arndt, A

    2010-07-20

    Kinematic analysis for in vivo assessment of elbow endoprostheses requires knowledge of the exact positions of motion axes relative to bony landmarks or the prosthesis. A prosthesis-based reference system is required for comparison between individuals and studies. The primary aim of this study was to further develop an earlier described algorithm for the fusion of radiostereometric analysis (RSA) data and data obtained by 3D computed tomography (CT) for application to the elbow after total joint replacement. The secondary aim was to propose a method for marking of prostheses in 3D CT, enabling the definition of a prosthesis-based reference system. Six patients with elbow endoprostheses were investigated. The fusion of data made it possible to visualize the motion axes in relation to the prostheses in the 3D CT volume. The differences between two repeated positionings of the longitudinal prosthesis axis were less than 0.6 degrees in the frontal and sagittal planes. Corresponding values for the transverse axis were less than 0.6 degrees in the frontal plane and less than 1.4 degrees (in four out of six, less than 0.6 degrees) in the horizontal plane. This study shows that by fusing CT and RSA data it is possible to determine the position of the flexion axes of the elbow joint accurately after total joint replacement in vivo. The proposed method for implant marking and registration of reference axes enables comparison of prosthesis function between patients and studies. PMID:20394932

  10. Training image analysis for model error assessment and dimension reduction in Bayesian-MCMC solutions to inverse problems

    NASA Astrophysics Data System (ADS)

    Koepke, C.; Irving, J.

    2015-12-01

    Bayesian solutions to inverse problems in near-surface geophysics and hydrology have gained increasing popularity as a means of estimating not only subsurface model parameters, but also their corresponding uncertainties, which can be used in probabilistic forecasting and risk analysis. In particular, Markov-chain Monte Carlo (MCMC) methods have attracted much recent attention as a means of statistically sampling from the Bayesian posterior distribution. In this regard, two approaches are commonly used to improve the computational tractability of the Bayesian-MCMC approach: (i) forward models involving a simplification of the underlying physics are employed, which offer a significant reduction in the time required to calculate data, but generally at the expense of model accuracy; and (ii) the model parameter space is represented using a limited set of spatially correlated basis functions, as opposed to a more intuitive high-dimensional pixel-based parameterization. It has become well understood that model inaccuracies resulting from (i) can lead to posterior parameter distributions that are highly biased and overly confident. Further, when performing model reduction as described in (ii), it is not clear how the prior distribution for the basis weights should be defined, because simple (e.g., Gaussian or uniform) priors that may be suitable for a pixel-based parameterization may result in a strong prior bias when used for the weights. To address the issue of model error resulting from known forward model approximations, we generate a set of error training realizations and analyze them with principal component analysis (PCA) in order to generate a sparse basis. The latter is used in the MCMC inversion to account for the model error.
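
    The error-basis construction described above reduces to a PCA of stacked error realisations. The sketch below, on synthetic smooth error signals, extracts the orthonormal modes that capture 99% of the model-error variance; the signal family and variance threshold are assumptions made for illustration.

        # Sketch of the error-basis construction: realisations of
        # (accurate minus approximate) forward-model error are decomposed
        # with PCA (via SVD), and the leading modes form a sparse basis.
        import numpy as np

        rng = np.random.default_rng(4)
        t = np.linspace(0, 1, 200)
        # Hypothetical model-error realisations: smooth, correlated shapes.
        realisations = np.array([
            a * np.sin(2 * np.pi * t) + b * t**2 + rng.normal(0, 0.01, t.size)
            for a, b in rng.normal(0, 1, size=(500, 2))])

        mean_err = realisations.mean(axis=0)
        U, s, Vt = np.linalg.svd(realisations - mean_err, full_matrices=False)

        var = s**2 / np.sum(s**2)
        k = int(np.searchsorted(np.cumsum(var), 0.99)) + 1
        print(f"{k} components capture 99% of the model-error variance")
        basis = Vt[:k]   # rows: orthonormal error modes carried into the MCMC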