Science.gov

Sample records for joint Bayesian analysis

  1. JBASE: Joint Bayesian Analysis of Subphenotypes and Epistasis

    PubMed Central

    Colak, Recep; Kim, TaeHyung; Kazan, Hilal; Oh, Yoomi; Cruz, Miguel; Valladares-Salgado, Adan; Peralta, Jesus; Escobedo, Jorge; Parra, Esteban J.; Kim, Philip M.; Goldenberg, Anna

    2016-01-01

    Motivation: Rapid advances in genotyping and genome-wide association studies have enabled the discovery of many new genotype–phenotype associations at the resolution of individual markers. However, these associations explain only a small proportion of the theoretically estimated heritability of most diseases. In this work, we propose an integrative mixture model called JBASE: joint Bayesian analysis of subphenotypes and epistasis. JBASE explores two major reasons for missing heritability: interactions between genetic variants, a phenomenon known as epistasis, and phenotypic heterogeneity, addressed via subphenotyping. Results: Our extensive simulations in a wide range of scenarios repeatedly demonstrate that JBASE can identify true underlying subphenotypes, including their associated variants and their interactions, with high precision. In the presence of phenotypic heterogeneity, JBASE has higher power and a lower Type 1 error rate than five state-of-the-art approaches. We applied our method to a sample of individuals from Mexico with Type 2 diabetes and discovered two novel epistatic modules, comprising two loci each, that define two subphenotypes characterized by differences in body mass index and waist-to-hip ratio. We successfully replicated these subphenotypes and epistatic modules in an independent dataset from Mexico genotyped on a different platform. Availability and implementation: JBASE is implemented in C++, supported on Linux, and is available at http://www.cs.toronto.edu/~goldenberg/JBASE/jbase.tar.gz. The genotype data underlying this study are available upon approval by the ethics review board of the Medical Centre Siglo XXI. Please contact Dr Miguel Cruz at mcruzl@yahoo.com for assistance with the application. Contact: anna.goldenberg@utoronto.ca Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26411870
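
    A minimal, hypothetical sketch of the latent-subphenotype idea (not the authors' C++ implementation): case genotypes follow a finite mixture, and a Gibbs sampler alternates between per-subphenotype genotype frequencies and latent subphenotype labels. Epistasis would enter by modeling locus pairs jointly; for brevity, loci are treated independently here.

    ```python
    # Toy Gibbs sampler for latent subphenotypes (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 500, 2                          # individuals, latent subphenotypes
    G = rng.integers(0, 3, size=(n, 4))    # toy genotypes at 4 loci, coded 0/1/2

    alpha = np.ones(3)                     # symmetric Dirichlet prior on genotype frequencies
    z = rng.integers(0, m, size=n)         # initial latent subphenotype labels

    for sweep in range(200):
        # sample per-subphenotype genotype frequencies from their Dirichlet posteriors
        theta = np.empty((m, G.shape[1], 3))
        for k in range(m):
            for j in range(G.shape[1]):
                counts = np.bincount(G[z == k, j], minlength=3)
                theta[k, j] = rng.dirichlet(alpha + counts)
        # sample each individual's subphenotype label given the frequencies
        for i in range(n):
            loglik = np.array([np.log(theta[k, np.arange(G.shape[1]), G[i]]).sum()
                               for k in range(m)])
            p = np.exp(loglik - loglik.max())
            z[i] = rng.choice(m, p=p / p.sum())
    ```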

  2. Joint Bayesian analysis of birthweight and censored gestational age using finite mixture models

    PubMed Central

    Schwartz, Scott L.; Gelfand, Alan E.; Miranda, Marie L.

    2016-01-01

    Birthweight and gestational age are closely related and represent important indicators of a healthy pregnancy. Customary modeling for birthweight is conditional on gestational age. However, joint modeling directly addresses the relationship between gestational age and birthweight, and provides increased flexibility and interpretation as well as a strategy to avoid using gestational age as an intermediate variable. Previous proposals have utilized finite mixtures of bivariate regression models to incorporate well-established risk factors into analysis (e.g. sex and birth order of the baby, maternal age, race, and tobacco use) while examining the non-Gaussian shape of the joint birthweight and gestational age distribution. We build on this approach by demonstrating the inferential (prognostic) benefits of joint modeling (e.g. investigation of `age inappropriate' outcomes like small for gestational age) and hence re-emphasize the importance of capturing the non-Gaussian distributional shapes. We additionally extend current models through a latent specification which admits interval-censored gestational age. We work within a Bayesian framework which enables inference beyond customary parameter estimation and prediction as well as exact uncertainty assessment. The model is applied to a portion of the 2003–2006 North Carolina Detailed Birth Record data (n=336129) available through the Children's Environmental Health Initiative and is fitted using the Bayesian methodology and Markov chain Monte Carlo approaches. PMID:20575047

  3. JAM: A Scalable Bayesian Framework for Joint Analysis of Marginal SNP Effects

    PubMed Central

    Conti, David V.; Richardson, Sylvia

    2016-01-01

    Recently, large-scale genome-wide association study (GWAS) meta-analyses have boosted the number of known signals for some traits into the tens and hundreds. Typically, however, variants are only analysed one at a time. This complicates the ability of fine-mapping to identify a small set of SNPs for further functional follow-up. We describe a new and scalable algorithm, joint analysis of marginal summary statistics (JAM), for the re-analysis of published marginal summary statistics under joint multi-SNP models. The correlation is accounted for according to estimates from a reference dataset, and models and SNPs that best explain the complete joint pattern of marginal effects are highlighted via an integrated Bayesian penalized regression framework. We provide both enumerated and reversible-jump MCMC implementations of JAM and present some comparisons of performance. In a series of realistic simulation studies, JAM demonstrated identical performance to various alternatives designed for single-region settings. In multi-region settings, where the only multivariate alternative involves stepwise selection, JAM offered greater power and specificity. We also present an application to real published results from MAGIC (the meta-analysis of glucose and insulin related traits consortium) - a GWAS meta-analysis of more than 15,000 people. We re-analysed several genomic regions that produced multiple significant signals with glucose levels 2 hr after oral stimulation. Through joint multivariate modelling, JAM was able to formally rule out many SNPs and, for one gene, ADCY5, suggests that an additional SNP, which transpired to be more biologically plausible, should be followed up with equal priority to the reported index. PMID:27027514
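
    The core trick that JAM-style re-analysis exploits can be illustrated in a few lines: under a linear model, approximate joint effects can be recovered from published marginal effects plus a correlation (LD) matrix estimated from a reference panel. A toy sketch under assumed Gaussian predictors (all names are illustrative, not JAM's API):

    ```python
    # Recover joint effects from marginal effects + reference correlations.
    import numpy as np

    rng = np.random.default_rng(1)
    n_ref, p = 2000, 5
    X = rng.normal(size=(n_ref, p))
    X[:, 1] = 0.9 * X[:, 0] + 0.44 * X[:, 1]     # induce LD between SNPs 0 and 1

    beta_true = np.array([0.5, 0.0, 0.3, 0.0, 0.0])
    y = X @ beta_true + rng.normal(size=n_ref)

    # "published" one-SNP-at-a-time marginal estimates
    beta_marg = (X.T @ y) / (X ** 2).sum(axis=0)

    # a reference correlation matrix stands in for the original genotypes
    R = np.corrcoef(X, rowvar=False)
    D = X.std(axis=0)
    beta_joint = np.linalg.solve(R * np.outer(D, D), (D ** 2) * beta_marg)
    print(np.round(beta_joint, 2))   # ~beta_true; SNP 1's marginal signal is explained away
    ```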

  4. A Bayesian approach to joint analysis of multivariate longitudinal data and parametric accelerated failure time

    PubMed Central

    Luo, Sheng

    2013-01-01

    Impairment caused by Parkinson's disease (PD) is multidimensional (e.g., sensoria, functions, and cognition) and progressive. Its multidimensional nature precludes a single outcome to measure disease progression. Clinical trials of PD use multiple categorical and continuous longitudinal outcomes to assess the treatment effects on overall improvement. A terminal event such as death or dropout can stop the follow-up process. Moreover, the time to the terminal event may be dependent on the multivariate longitudinal measurements. In this article, we consider a joint random-effects model for the correlated outcomes. A multilevel item response theory model is used for the multivariate longitudinal outcomes, and a parametric accelerated failure time model is used for the failure time because of the violation of the proportional hazards assumption. These two models are linked via random effects. The Bayesian inference via MCMC is implemented in the 'BUGS' language. Our proposed method is evaluated by a simulation study and is applied to the DATATOP study, a motivating clinical trial to determine if deprenyl slows the progression of PD. PMID:24009073

  5. A Bayesian approach to joint analysis of multivariate longitudinal data and parametric accelerated failure time.

    PubMed

    Luo, Sheng

    2014-02-20

    Impairment caused by Parkinson's disease (PD) is multidimensional (e.g., sensoria, functions, and cognition) and progressive. Its multidimensional nature precludes a single outcome to measure disease progression. Clinical trials of PD use multiple categorical and continuous longitudinal outcomes to assess the treatment effects on overall improvement. A terminal event such as death or dropout can stop the follow-up process. Moreover, the time to the terminal event may be dependent on the multivariate longitudinal measurements. In this article, we consider a joint random-effects model for the correlated outcomes. A multilevel item response theory model is used for the multivariate longitudinal outcomes, and a parametric accelerated failure time model is used for the failure time because of the violation of the proportional hazards assumption. These two models are linked via random effects. The Bayesian inference via MCMC is implemented in the 'BUGS' language. Our proposed method is evaluated by a simulation study and is applied to the DATATOP study, a motivating clinical trial to determine if deprenyl slows the progression of PD. PMID:24009073

  6. Joint high-dimensional Bayesian variable and covariance selection with an application to eQTL analysis.

    PubMed

    Bhadra, Anindya; Mallick, Bani K

    2013-06-01

    We describe a Bayesian technique to (a) perform a sparse joint selection of significant predictor variables and significant inverse covariance matrix elements of the response variables in a high-dimensional linear Gaussian sparse seemingly unrelated regression (SSUR) setting and (b) perform an association analysis between the high-dimensional sets of predictors and responses in such a setting. To search the high-dimensional model space, where both the number of predictors and the number of possibly correlated responses can be larger than the sample size, we demonstrate that a marginalization-based collapsed Gibbs sampler, in combination with spike-and-slab priors, offers a computationally feasible and efficient solution. As an example, we apply our method to an expression quantitative trait loci (eQTL) analysis on publicly available single nucleotide polymorphism (SNP) and gene expression data for humans where the primary interest lies in finding the significant associations between the sets of SNPs and possibly correlated genetic transcripts. Our method also allows for inference on the sparse interaction network of the transcripts (response variables) after accounting for the effect of the SNPs (predictor variables). We exploit properties of Gaussian graphical models to make statements concerning conditional independence of the responses. Our method compares favorably to existing Bayesian approaches developed for this purpose. PMID:23607608
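
    For a flavor of the spike-and-slab machinery, here is a self-contained Gibbs sampler for variable selection in a single-response linear regression - a deliberately simplified stand-in for the paper's SSUR sampler, which additionally samples the inverse covariance of the responses (hyperparameters are fixed for brevity):

    ```python
    # Spike-and-slab Gibbs sampler for sparse linear regression (sketch).
    import numpy as np

    rng = np.random.default_rng(2)
    n, p = 200, 10
    X = rng.normal(size=(n, p))
    beta_true = np.zeros(p); beta_true[[0, 3]] = [1.0, -0.8]
    y = X @ beta_true + rng.normal(size=n)

    tau2, sigma2, pi = 1.0, 1.0, 0.2   # slab variance, noise variance, prior inclusion prob
    gamma = np.zeros(p, bool)
    beta = np.zeros(p)
    incl = np.zeros(p)

    for sweep in range(1000):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]      # residual with predictor j removed
            v = 1.0 / (X[:, j] @ X[:, j] / sigma2 + 1.0 / tau2)
            m = v * (X[:, j] @ r) / sigma2
            # log Bayes factor of slab vs spike for predictor j
            log_bf = 0.5 * (np.log(v / tau2) + m ** 2 / v)
            p_incl = pi / (pi + (1 - pi) * np.exp(-log_bf))
            gamma[j] = rng.random() < p_incl
            beta[j] = rng.normal(m, np.sqrt(v)) if gamma[j] else 0.0
        if sweep >= 200:
            incl += gamma
    print(np.round(incl / 800, 2))   # posterior inclusion probabilities
    ```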

  7. Bayesian Mediation Analysis

    ERIC Educational Resources Information Center

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  8. Bayesian Exploratory Factor Analysis

    PubMed Central

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  9. Road network safety evaluation using Bayesian hierarchical joint model.

    PubMed

    Wang, Jie; Huang, Helai

    2016-05-01

    Safety and efficiency are commonly regarded as two significant performance indicators of transportation systems. In practice, road network planning has focused on road capacity and transport efficiency, whereas the safety level of a road network has received little attention in the planning stage. This study develops a Bayesian hierarchical joint model for road network safety evaluation to help planners take traffic safety into account when planning a road network. The proposed model establishes relationships between road network risk and micro-level variables related to road entities and traffic volume, as well as socioeconomic, trip generation and network density variables at the macro level, which are generally used for long-term transportation plans. In addition, network spatial correlation between intersections and their connected road segments is also considered in the model. A road network is carefully selected in order to compare the proposed hierarchical joint model with a previous joint model and a negative binomial model. According to the results of the model comparison, the hierarchical joint model outperforms the joint model and the negative binomial model in terms of goodness-of-fit and predictive performance, which indicates the reasonableness of considering the hierarchical data structure in crash prediction and analysis. Moreover, both the random effects at the traffic analysis zone (TAZ) level and the spatial correlation between intersections and their adjacent segments are found to be significant, supporting the employment of the hierarchical joint model as an alternative in road-network-level safety modeling.
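
    A stripped-down sketch of the hierarchical idea (not the paper's full model): crash counts at sites follow a Poisson regression, with a shared random effect per zone, and a random-walk Metropolis sampler explores the posterior. The zone-effect standard deviation is fixed at its simulated value for brevity.

    ```python
    # Hierarchical Poisson crash model with zone-level random effects (sketch).
    import numpy as np

    rng = np.random.default_rng(3)
    n_zones, sites = 8, 25
    zone = np.repeat(np.arange(n_zones), sites)
    x = rng.normal(0, 0.5, size=zone.size)        # centered log traffic volume per site
    u_true = rng.normal(0, 0.4, size=n_zones)     # zone-level random effects
    y = rng.poisson(np.exp(0.3 + 0.7 * x + u_true[zone]))

    def log_post(theta):
        b0, b1, u = theta[0], theta[1], theta[2:]
        eta = b0 + b1 * x + u[zone]
        # Poisson log-likelihood (constant dropped) + N(0, 0.4^2) prior on u + vague priors
        return (y * eta - np.exp(eta)).sum() - (u @ u) / (2 * 0.4 ** 2) - (b0 ** 2 + b1 ** 2) / 200

    theta = np.zeros(2 + n_zones)
    lp = log_post(theta)
    keep = []
    for step in range(30000):                     # random-walk Metropolis
        prop = theta + rng.normal(0, 0.03, size=theta.size)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        if step >= 10000 and step % 10 == 0:
            keep.append(theta[:2].copy())
    print(np.mean(keep, axis=0))                  # posterior means, near (0.3, 0.7)
    ```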

  10. Bayesian Joint Modelling for Object Localisation in Weakly Labelled Images.

    PubMed

    Shi, Zhiyuan; Hospedales, Timothy M; Xiang, Tao

    2015-10-01

    We address the problem of localisation of objects as bounding boxes in images and videos with weak labels. This weakly supervised object localisation problem has been tackled in the past using discriminative models where each object class is localised independently from other classes. In this paper, a novel framework based on Bayesian joint topic modelling is proposed, which differs significantly from the existing ones in that: (1) All foreground object classes are modelled jointly in a single generative model that encodes multiple object co-existence so that "explaining away" inference can resolve ambiguity and lead to better learning and localisation. (2) Image backgrounds are shared across classes to better learn varying surroundings and "push out" objects of interest. (3) Our model can be learned with a mixture of weakly labelled and unlabelled data, allowing the large volume of unlabelled images on the Internet to be exploited for learning. Moreover, the Bayesian formulation enables the exploitation of various types of prior knowledge to compensate for the limited supervision offered by weakly labelled data, as well as Bayesian domain adaptation for transfer learning. Extensive experiments on the PASCAL VOC, ImageNet and YouTube-Object videos datasets demonstrate the effectiveness of our Bayesian joint model for weakly supervised object localisation. PMID:26340253

  11. Reasoning Under Uncertainty in Seismic Hazard Analysis: Modeling the Joint Probability of Earthquake, Site and Ground-Motion Parameters Using Bayesian Networks

    NASA Astrophysics Data System (ADS)

    Kuehn, N. M.; Carsten, R.; Frank, S.

    2008-12-01

    Empirical ground-motion models for use in seismic hazard analysis are commonly described by regression models, where the ground-motion parameter is assumed to be dependent on earthquake- and site-specific parameters such as magnitude, distance or the local VS30. In regression analysis only the target is treated as a random variable, while the predictors are not; they are implicitly assumed to be complete and error-free, which is not the case for magnitudes or distances in earthquake catalogs. However, techniques to overcome these issues exist in research areas such as machine learning and artificial intelligence. Borrowing from these fields, we present a novel multivariate approach to ground-motion estimation by means of the Bayesian network (BN) formalism. This elegant and intuitively appealing framework allows for reasoning under uncertainty by directly modeling the joint probability distribution of all variables, while at the same time offering explicit insight into the probabilistic relationships between variables. The formalism provides us with efficient methods for computing any marginal or conditional distribution of any subset of variables. In particular, if some earthquake- or site-related parameters are unknown, the distribution of the ground-motion parameter of interest can still be calculated; in this case, the associated uncertainty is incorporated in the model framework. Here, we explore the use of BNs in the development of ground-motion models. To this end, we construct BNs for both a synthetic dataset and the NGA dataset, the most comprehensive strong ground motion dataset currently available. The analysis shows that BNs are able to capture the probabilistic dependencies between the different variables of interest. Comparison of the learned BN with the NGA model of Boore and Atkinson (2008) shows reasonable agreement in distance and magnitude ranges with good data coverage.
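
    The BN machinery can be illustrated with a toy discrete network: specify the joint distribution over magnitude, distance and ground motion, then obtain any conditional by summing out the unobserved variables. All probability tables below are invented for illustration.

    ```python
    # Toy discrete Bayesian network: P(M, R, Y) and a conditional query.
    import numpy as np

    p_m = np.array([0.7, 0.3])      # P(M): small, large magnitude
    p_r = np.array([0.5, 0.5])      # P(R): near, far
    # P(Y | M, R): first index M, second index R, columns Y in {weak, strong}
    p_y = np.array([[[0.90, 0.10], [0.97, 0.03]],
                    [[0.40, 0.60], [0.80, 0.20]]])

    joint = p_m[:, None, None] * p_r[None, :, None] * p_y   # P(M, R, Y)

    # conditional P(Y | M=large) with distance unknown: marginalize R out
    cond = joint[1].sum(axis=0) / joint[1].sum()
    print(cond)   # ground-motion distribution when only the magnitude is known
    ```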

  12. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
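
    The rejection-sampling reinterpretation at the heart of BUS is easy to state in code: draw parameters from the prior and accept each draw with probability proportional to its likelihood; the accepted draws are posterior samples. A toy Gaussian sketch follows (BUS itself replaces this naive accept/reject step with FORM, IS or SuS to cope with small acceptance probabilities):

    ```python
    # Bayesian updating by rejection sampling from the prior (toy sketch).
    import numpy as np

    rng = np.random.default_rng(4)
    obs = 1.2                                        # one noisy measurement, sd = 0.5

    def likelihood(theta):
        return np.exp(-0.5 * ((obs - theta) / 0.5) ** 2)

    c = 1.0                                          # upper bound on the likelihood
    theta = rng.normal(0, 1, size=200_000)           # draws from the N(0, 1) prior
    accepted = theta[rng.random(theta.size) < likelihood(theta) / c]
    print(accepted.mean(), accepted.std())           # ~ analytic posterior N(0.96, 0.45**2)
    ```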

  13. Bayesian Model Averaging for Propensity Score Analysis

    ERIC Educational Resources Information Center

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  14. Voxelwise Bayesian Lesion Deficit Analysis

    PubMed Central

    Chen, Rong; Hillis, Argye E.; Pawlak, Mikolaj; Herskovits, Edward H.

    2008-01-01

    Relating cognitive deficits to the presence of lesions has been an important means of delineating structure-function associations in the human brain. We propose a voxel-based Bayesian method for lesion-deficit analysis, which identifies complex linear or nonlinear associations among brain-lesion locations, and neurological status. We validated this method using a simulated data set, and we applied this algorithm to data obtained from an acute-stroke study to identify associations among voxels with infarct or hypoperfusion, and impaired word reading. We found that a distributed region involving Brodmann areas (BA) 22, 37, 39, and 40 was implicated in word reading. PMID:18328733

  15. A Bayesian Joint Model of Menstrual Cycle Length and Fecundity

    PubMed Central

    Lum, Kirsten J.; Sundaram, Rajeshwari; Louis, Germaine M. Buck; Louis, Thomas A.

    2015-01-01

    Menstrual cycle length (MCL) has been shown to play an important role in couple fecundity, which is the biologic capacity for reproduction irrespective of pregnancy intentions. However, a comprehensive assessment of its role requires a fecundity model that accounts for male and female attributes and the couple's intercourse pattern relative to the ovulation day. To this end, we employ a Bayesian joint model for MCL and pregnancy. MCLs follow a scale-multiplied (accelerated) mixture model with Gaussian and Gumbel components; the pregnancy model includes MCL as a covariate and computes the cycle-specific probability of pregnancy in a menstrual cycle conditional on the pattern of intercourse and no previous fertilization. The day-specific fertilization probability is modeled using natural cubic splines. We analyze data from the Longitudinal Investigation of Fertility and the Environment Study (the LIFE Study), a couple-based prospective pregnancy study, and find a statistically significant quadratic relation between fecundity and menstrual cycle length, after adjustment for intercourse pattern and other attributes, including male semen quality, both partners' ages, and active smoking status (determined by a baseline cotinine level of 100 ng/mL). We compare results to those produced by a more basic model and show the advantages of a more comprehensive approach. PMID:26295923
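
    Cycle-specific conception probabilities in models of this kind are typically built from day-specific fertilization probabilities, with each intercourse act contributing an independent chance of fertilization. A sketch with invented day-specific probabilities (the paper estimates these via natural cubic splines):

    ```python
    # Barrett-Marshall-Schwartz-style cycle probability (illustrative values).
    import numpy as np

    days = np.arange(-5, 2)                          # days relative to ovulation
    p_day = np.array([0.05, 0.10, 0.15, 0.25, 0.30, 0.15, 0.05])
    intercourse = np.array([0, 1, 0, 1, 1, 0, 0])    # the couple's pattern this cycle

    p_cycle = 1.0 - np.prod((1.0 - p_day) ** intercourse)
    print(round(p_cycle, 3))                         # probability of conception this cycle
    ```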

  16. Joint Bayesian Component Separation and CMB Power Spectrum Estimation

    NASA Technical Reports Server (NTRS)

    Eriksen, H. K.; Jewell, J. B.; Dickinson, C.; Banday, A. J.; Gorski, K. M.; Lawrence, C. R.

    2008-01-01

    We describe and implement an exact, flexible, and computationally efficient algorithm for joint component separation and CMB power spectrum estimation, building on a Gibbs sampling framework. Two essential new features are (1) conditional sampling of foreground spectral parameters and (2) joint sampling of all amplitude-type degrees of freedom (e.g., CMB, foreground pixel amplitudes, and global template amplitudes) given spectral parameters. Given a parametric model of the foreground signals, we estimate efficiently and accurately the exact joint foreground-CMB posterior distribution and, therefore, all marginal distributions such as the CMB power spectrum or foreground spectral index posteriors. The main limitation of the current implementation is the requirement of identical beam responses at all frequencies, which restricts the analysis to the lowest resolution of a given experiment. We outline a future generalization to multiresolution observations. To verify the method, we analyze simple models and compare the results to analytical predictions. We then analyze a realistic simulation with properties similar to the 3 yr WMAP data, downgraded to a common resolution of 3 deg FWHM. The results from the actual 3 yr WMAP temperature analysis are presented in a companion Letter.

  17. Jointly modeling time-to-event and longitudinal data: A Bayesian approach.

    PubMed

    Huang, Yangxin; Hu, X Joan; Dagne, Getachew A

    2014-03-01

    This article explores Bayesian joint models of event times and longitudinal measures, with an attempt to overcome departures from normality of the longitudinal response, measurement errors, and shortages of confidence in specifying a parametric time-to-event model. We allow the longitudinal response to have a skew distribution in the presence of measurement errors, and assume the time-to-event variable to have a nonparametric prior distribution. Posterior distributions of the parameters are attained simultaneously for inference based on the Bayesian approach. An example from a recent AIDS clinical trial illustrates the methodology by jointly modeling the viral dynamics and the time to decrease in the CD4/CD8 ratio in the presence of CD4 counts with measurement errors, and compares potential models under various scenarios and different distribution specifications. The analysis outcome indicates that the time-varying CD4 covariate is closely related to the first-phase viral decay rate, but the time to CD4/CD8 decrease is not highly associated with either the two viral decay rates or the CD4 changing rate over time. These findings may provide some quantitative guidance to better understand the relationship of the virological and immunological responses to antiretroviral treatments. PMID:24611039

  18. Determination of the EEDF using a Bayesian analysis framework

    NASA Astrophysics Data System (ADS)

    Poznic, Dominic; Samarian, Alex; James, Brian

    2013-10-01

    A statistical analysis framework is presented that determines the electron energy distribution function (EEDF) of an argon discharge plasma from optical emission spectra and Langmuir probe data. The analysis framework is based on Bayesian inference, in which data are treated in a rigorously statistical manner, that naturally includes all sources of uncertainty. The framework is designed to allow models describing different data sets from the same system to be combined in a straightforward manner. Spectral line intensities are described using a collisional-radiative model, while Langmuir probe data are described with a simple 1D Langmuir probe model. The models are inverted and combined using Bayesian probability theory in a joint analysis of both data sets. This framework was tested using data simulated by the two models from a known set of plasma conditions. The testing confirmed the ability of the framework to determine non-Maxwellian EEDFs and use multiple data sets to increase the accuracy of results.
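
    The joint-analysis idea reduces to multiplying the likelihoods of the independent data sets before normalizing. A toy grid-based sketch with two Gaussian likelihoods standing in for the spectroscopic and Langmuir probe models (all numbers invented):

    ```python
    # Combining two data sets that constrain the same parameter (sketch).
    import numpy as np

    grid = np.linspace(0, 10, 1001)                       # candidate parameter values
    lik_spectra = np.exp(-0.5 * ((grid - 4.2) / 1.5) ** 2)
    lik_probe = np.exp(-0.5 * ((grid - 5.0) / 1.0) ** 2)

    post = lik_spectra * lik_probe                        # flat prior
    dx = grid[1] - grid[0]
    post /= post.sum() * dx
    mean = np.sum(grid * post) * dx
    sd = np.sqrt(np.sum((grid - mean) ** 2 * post) * dx)
    print(mean, sd)                                       # tighter than either data set alone
    ```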

  19. Bayesian analysis for kaon photoproduction

    SciTech Connect

    Marsainy, T.; Mart, T.

    2014-09-25

    We have investigated the contribution of nucleon resonances to the kaon photoproduction process by using an established statistical decision-making method, i.e., the Bayesian method. This method not only evaluates the model over its entire parameter space, but also takes the prior information and experimental data into account. The result indicates that certain resonances have larger probabilities of contributing to the process.

  20. A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research

    ERIC Educational Resources Information Center

    van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A. G.

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are…

  1. An Integrated Bayesian Model for DIF Analysis

    ERIC Educational Resources Information Center

    Soares, Tufi M.; Goncalves, Flavio B.; Gamerman, Dani

    2009-01-01

    In this article, an integrated Bayesian model for differential item functioning (DIF) analysis is proposed. The model is integrated in the sense of modeling the responses along with the DIF analysis. This approach allows DIF detection and explanation in a simultaneous setup. Previous empirical studies and/or subjective beliefs about the item…

  2. Heterogeneous Factor Analysis Models: A Bayesian Approach.

    ERIC Educational Resources Information Center

    Ansari, Asim; Jedidi, Kamel; Dube, Laurette

    2002-01-01

    Developed Markov Chain Monte Carlo procedures to perform Bayesian inference, model checking, and model comparison in heterogeneous factor analysis. Tested the approach with synthetic data and data from a consumption emotion study involving 54 consumers. Results show that traditional psychometric methods cannot fully capture the heterogeneity in…

  3. Bayesian and Frequentist Methods for Estimating Joint Uncertainty of Freundlich Adsorption Isotherm Fitting Parameters

    EPA Science Inventory

    In this paper, we present methods for estimating Freundlich isotherm fitting parameters (K and N) and their joint uncertainty, which have been implemented in the freeware software platforms R and WinBUGS. These estimates were determined by both frequentist and Bayesian analyses...
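
    For context, the Freundlich isotherm is q = K * C**N. A hedged sketch of Bayesian estimation of (K, N) via a simple Metropolis sampler with lognormal errors - a stand-in for the R/WinBUGS implementations the abstract describes, with invented data and a noise level assumed known:

    ```python
    # Metropolis sampler for the joint posterior of Freundlich (K, N) (sketch).
    import numpy as np

    rng = np.random.default_rng(12)
    C = np.array([0.5, 1, 2, 5, 10, 20.0])                     # concentrations
    q = 2.0 * C ** 0.6 * np.exp(rng.normal(0, 0.05, C.size))   # sorbed amounts (K=2, N=0.6)

    def log_post(logK, N):
        resid = np.log(q) - (logK + N * np.log(C))             # lognormal errors, sd 0.05
        return -0.5 * (resid @ resid) / 0.05 ** 2 - (logK ** 2 + N ** 2) / 200

    theta = np.array([0.0, 1.0])
    lp = log_post(*theta)
    draws = []
    for step in range(30000):
        prop = theta + rng.normal(0, 0.02, 2)
        lp_prop = log_post(*prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        if step >= 10000:
            draws.append(theta.copy())
    draws = np.array(draws)
    print(np.exp(draws[:, 0].mean()), draws[:, 1].mean())      # posterior for (K, N)
    ```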

  4. Bayesian Correlation Analysis for Sequence Count Data

    PubMed Central

    Lau, Nelson; Perkins, Theodore J.

    2016-01-01

    Evaluating the similarity of different measured variables is a fundamental task of statistics, and a key part of many bioinformatics algorithms. Here we propose a Bayesian scheme for estimating the correlation between different entities’ measurements based on high-throughput sequencing data. These entities could be different genes or miRNAs whose expression is measured by RNA-seq, different transcription factors or histone marks whose expression is measured by ChIP-seq, or even combinations of different types of entities. Our Bayesian formulation accounts for both measured signal levels and uncertainty in those levels, due to varying sequencing depth in different experiments and to varying absolute levels of individual entities, both of which affect the precision of the measurements. In comparison with a traditional Pearson correlation analysis, we show that our Bayesian correlation analysis retains high correlations when measurement confidence is high, but suppresses correlations when measurement confidence is low—especially for entities with low signal levels. In addition, we consider the influence of priors on the Bayesian correlation estimate. Perhaps surprisingly, we show that naive, uniform priors on entities’ signal levels can lead to highly biased correlation estimates, particularly when different experiments have widely varying sequencing depths. However, we propose two alternative priors that provably mitigate this problem. We also prove that, like traditional Pearson correlation, our Bayesian correlation calculation constitutes a kernel in the machine learning sense, and thus can be used as a similarity measure in any kernel-based machine learning algorithm. We demonstrate our approach on two RNA-seq datasets and one miRNA-seq dataset. PMID:27701449
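
    A sketch of the general idea (not the paper's exact estimator): model counts as Poisson with a per-experiment depth, place a conjugate Gamma prior on each underlying rate, and average the Pearson correlation over posterior draws, so that low-count, high-uncertainty measurements are automatically down-weighted:

    ```python
    # Bayesian correlation for counts via Gamma-Poisson posteriors (sketch).
    import numpy as np

    rng = np.random.default_rng(5)
    depth = np.array([1.0, 2.0, 0.5, 4.0, 1.5])          # per-experiment sequencing depth
    counts_a = np.array([3, 9, 1, 17, 6])                # gene A across 5 experiments
    counts_b = np.array([2, 8, 0, 15, 5])                # gene B

    a0, b0 = 1.0, 1.0                                    # Gamma(shape, rate) prior on rates
    draws = 4000
    # conjugate posterior: rate | count ~ Gamma(a0 + count, b0 + depth)
    rate_a = rng.gamma(a0 + counts_a, 1.0 / (b0 + depth), size=(draws, 5))
    rate_b = rng.gamma(a0 + counts_b, 1.0 / (b0 + depth), size=(draws, 5))

    r = [np.corrcoef(ra, rb)[0, 1] for ra, rb in zip(rate_a, rate_b)]
    print(np.mean(r))   # posterior-mean correlation, shrunk relative to plain Pearson
    ```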

  5. A Bayesian nonparametric meta-analysis model.

    PubMed

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G

    2015-03-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall effect size, such models may be adequate, but for prediction, they surely are not if the effect-size distribution exhibits non-normal behavior. To address this issue, we propose a Bayesian nonparametric meta-analysis model, which can describe a wider range of effect-size distributions, including unimodal symmetric distributions, as well as skewed and more multimodal distributions. We demonstrate our model through the analysis of real meta-analytic data arising from behavioral-genetic research. We compare the predictive performance of the Bayesian nonparametric model against various conventional and more modern normal fixed-effects and random-effects models.

  6. A SAS Interface for Bayesian Analysis with WinBUGS

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; McArdle, John J.; Wang, Lijuan; Hamagami, Fumiaki

    2008-01-01

    Bayesian methods are becoming very popular despite some practical difficulties in implementation. To assist in the practical application of Bayesian methods, we show how to implement Bayesian analysis with WinBUGS as part of a standard set of SAS routines. This implementation procedure is first illustrated by fitting a multiple regression model…

  7. Joint Lung CT Image Segmentation: A Hierarchical Bayesian Approach

    PubMed Central

    Cheng, Wenjun; Ma, Luyao; Yang, Tiejun; Liang, Jiali

    2016-01-01

    Accurate lung CT image segmentation is of great clinical value, especially when it comes to delineating pathological regions, including lung tumors. In this paper, we present a novel framework that jointly segments multiple lung computed tomography (CT) images via a hierarchical Dirichlet process (HDP). Specifically, based on the assumption that lung CT images from different patients share a similar image structure (organ sets and relative positioning), we derive a mathematical model to segment them simultaneously, so that shared information across patients can be utilized to regularize each individual segmentation. Moreover, compared to many conventional models, the algorithm requires little manual involvement due to the nonparametric nature of the Dirichlet process (DP). We validated the proposed model on clinical data consisting of healthy and abnormal (lung cancer) patients, and demonstrate that, because of the joint segmentation fashion, more accurate and consistent segmentations can be obtained. PMID:27611188
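
    The nonparametric ingredient that lets the number of segment classes adapt to the data is the Dirichlet process; its stick-breaking construction fits in a few lines (illustration only - the paper uses a hierarchical DP shared across patients):

    ```python
    # Stick-breaking construction of Dirichlet process weights (sketch).
    import numpy as np

    rng = np.random.default_rng(10)
    alpha, K = 1.0, 20                         # concentration; truncation level
    v = rng.beta(1, alpha, size=K)
    w = v * np.concatenate(([1.0], np.cumprod(1 - v)[:-1]))
    print(np.round(w[:8], 3), w.sum())         # a few dominant weights; sums to ~1 (truncated)
    ```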

  8. Joint Lung CT Image Segmentation: A Hierarchical Bayesian Approach.

    PubMed

    Cheng, Wenjun; Ma, Luyao; Yang, Tiejun; Liang, Jiali; Zhang, Yan

    2016-01-01

    Accurate lung CT image segmentation is of great clinical value, especially when it comes to delineating pathological regions, including lung tumors. In this paper, we present a novel framework that jointly segments multiple lung computed tomography (CT) images via a hierarchical Dirichlet process (HDP). Specifically, based on the assumption that lung CT images from different patients share a similar image structure (organ sets and relative positioning), we derive a mathematical model to segment them simultaneously, so that shared information across patients can be utilized to regularize each individual segmentation. Moreover, compared to many conventional models, the algorithm requires little manual involvement due to the nonparametric nature of the Dirichlet process (DP). We validated the proposed model on clinical data consisting of healthy and abnormal (lung cancer) patients, and demonstrate that, because of the joint segmentation fashion, more accurate and consistent segmentations can be obtained. PMID:27611188

  9. Semiparametric Bayesian inference on skew-normal joint modeling of multivariate longitudinal and survival data.

    PubMed

    Tang, An-Min; Tang, Nian-Sheng

    2015-02-28

    We propose a semiparametric multivariate skew-normal joint model for multivariate longitudinal and multivariate survival data. One main feature of the posited model is that we relax the commonly used normality assumption for random effects and within-subject error by using a centered Dirichlet process prior to specify the random effects distribution and using a multivariate skew-normal distribution to specify the within-subject error distribution and model trajectory functions of longitudinal responses semiparametrically. A Bayesian approach is proposed to simultaneously obtain Bayesian estimates of unknown parameters, random effects and nonparametric functions by combining the Gibbs sampler and the Metropolis-Hastings algorithm. Particularly, a Bayesian local influence approach is developed to assess the effect of minor perturbations to within-subject measurement error and random effects. Several simulation studies and an example are presented to illustrate the proposed methodologies. PMID:25404574

  10. Bayesian Analysis of Individual Level Personality Dynamics.

    PubMed

    Cripps, Edward; Wood, Robert E; Beckmann, Nadin; Lau, John; Beckmann, Jens F; Cripps, Sally Ann

    2016-01-01

    A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques.

  11. Bayesian model selection analysis of WMAP3

    SciTech Connect

    Parkinson, David; Mukherjee, Pia; Liddle, Andrew R.

    2006-06-15

    We present a Bayesian model selection analysis of WMAP3 data using our code CosmoNest. We focus on the density perturbation spectral index n_S and the tensor-to-scalar ratio r, which define the plane of slow-roll inflationary models. We find that while the Bayesian evidence supports the conclusion that n_S ≠ 1, the data are not yet powerful enough to do so at a strong or decisive level. If tensors are assumed absent, the current odds are approximately 8 to 1 in favor of n_S ≠ 1 under our assumptions, when WMAP3 data is used together with external data sets. WMAP3 data on its own is unable to distinguish between the two models. Further, inclusion of r as a parameter weakens the conclusion against the Harrison-Zel'dovich case (n_S = 1, r = 0), albeit in a prior-dependent way. In appendices we describe the CosmoNest code in detail, noting its ability to supply posterior samples as well as to accurately compute the Bayesian evidence. We make a first public release of CosmoNest, now available at www.cosmonest.org.
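
    The "odds" quoted above are ratios of Bayesian evidences. A toy illustration comparing a model with a parameter fixed at 1 against one where it is free under a uniform prior, integrating the likelihood over a grid (data and widths invented):

    ```python
    # Bayes factor between a fixed-parameter and a free-parameter model (sketch).
    import numpy as np

    rng = np.random.default_rng(6)
    data = rng.normal(0.96, 0.02, size=10)          # pretend measurements of a spectral index

    def lik(ns):
        return np.prod(np.exp(-0.5 * ((data - ns) / 0.02) ** 2))

    grid = np.linspace(0.9, 1.1, 2001)
    evidence_free = np.mean([lik(v) for v in grid]) # average likelihood under a uniform prior
    evidence_fixed = lik(1.0)                       # no free parameter
    print(evidence_free / evidence_fixed)           # odds favouring the free index
    ```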

  12. Bayesian Analysis of Individual Level Personality Dynamics

    PubMed Central

    Cripps, Edward; Wood, Robert E.; Beckmann, Nadin; Lau, John; Beckmann, Jens F.; Cripps, Sally Ann

    2016-01-01

    A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques. PMID:27486415

  13. Bayesian Approach to the Joint Inversion of Gravity and Magnetic Data, with Application to the Ismenius Area of Mars

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey B.; Raymond, C.; Smrekar, S.; Millbury, C.

    2004-01-01

    This viewgraph presentation reviews a Bayesian approach to the inversion of gravity and magnetic data, with specific application to the Ismenius area of Mars. Many inverse problems encountered in geophysics and planetary science are well known to be non-unique (i.e., inversion of gravity for the density structure of a body). In hopes of reducing the non-uniqueness of solutions, there has been interest in the joint analysis of data. An example is the joint inversion of gravity and magnetic data, with the assumption that the same physical anomalies generate both the observed magnetic and gravitational anomalies. In this talk, we formulate the joint analysis of different types of data in a Bayesian framework and apply the formalism to the inference of the density and remanent magnetization structure for a local region in the Ismenius area of Mars. The Bayesian approach allows prior information or constraints on the solutions to be incorporated in the inversion, with the "best" solutions those whose forward predictions most closely match the data while remaining consistent with the assumed constraints. The application of this framework to the inversion of gravity and magnetic data on Mars reveals two typical challenges: the forward predictions of the data have a linear dependence on some of the quantities of interest, and a non-linear dependence on others (termed the "linear" and "non-linear" variables, respectively). For observations with Gaussian noise, a Bayesian approach to inversion for "linear" variables reduces to a linear filtering problem, with an explicitly computable "error" matrix. However, for models whose forward predictions have non-linear dependencies, inference is no longer given by such a simple linear problem, and moreover, the uncertainty in the solution is no longer completely specified by a computable "error matrix". It is therefore important to develop methods for sampling from the full Bayesian posterior to provide a complete and statistically consistent
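
    The "linear variables" case mentioned above is worth seeing concretely: with Gaussian noise and a Gaussian prior, Bayesian inversion reduces to a linear filter with an explicit posterior covariance (the "error" matrix). A toy sketch with a random forward operator, not the talk's actual Mars model:

    ```python
    # Linear-Gaussian Bayesian inversion with explicit posterior covariance (sketch).
    import numpy as np

    rng = np.random.default_rng(11)
    G = rng.normal(size=(30, 8))                # forward model: parameters -> observed anomaly
    m_true = rng.normal(size=8)
    d = G @ m_true + 0.1 * rng.normal(size=30)  # noisy data, noise sd 0.1

    noise_prec = 1 / 0.1 ** 2
    post_cov = np.linalg.inv(noise_prec * G.T @ G + np.eye(8))   # prior: m ~ N(0, I)
    post_mean = post_cov @ (noise_prec * G.T @ d)
    print(np.abs(post_mean - m_true).max())     # posterior mean close to the truth
    ```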

  14. Bayesian analysis of genetic differentiation between populations.

    PubMed Central

    Corander, Jukka; Waldmann, Patrik; Sillanpää, Mikko J

    2003-01-01

    We introduce a Bayesian method for estimating hidden population substructure using multilocus molecular markers and geographical information provided by the sampling design. The joint posterior distribution of the substructure and allele frequencies of the respective populations is available in an analytical form when the number of populations is small, whereas an approximation based on a Markov chain Monte Carlo simulation approach can be obtained for a moderate or large number of populations. Using the joint posterior distribution, posteriors can also be derived for any evolutionary population parameters, such as the traditional fixation indices. A major advantage compared to most earlier methods is that the number of populations is treated here as an unknown parameter. What is traditionally considered as two genetically distinct populations, either recently founded or connected by considerable gene flow, is here considered as one panmictic population with a certain probability based on marker data and prior information. Analyses of previously published data on the Moroccan argan tree (Argania spinosa) and of simulated data sets suggest that our method is capable of estimating a population substructure, while not artificially enforcing a substructure when it does not exist. The software (BAPS) used for the computations is freely available from http://www.rni.helsinki.fi/~mjs. PMID:12586722
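
    The one-population-versus-two comparison can be sketched with Dirichlet-multinomial marginal likelihoods at a single locus (illustrative counts and a uniform prior; the method itself handles many loci, an unknown number of populations, and MCMC):

    ```python
    # Bayes factor for "two populations" vs "one panmictic population" (sketch).
    import numpy as np
    from scipy.special import gammaln

    def log_marg(counts, alpha=1.0):
        # log marginal likelihood of allele counts under a symmetric Dirichlet prior
        a = np.full(len(counts), alpha)
        return (gammaln(a.sum()) - gammaln(a.sum() + counts.sum())
                + (gammaln(a + counts) - gammaln(a)).sum())

    pop1 = np.array([40, 10])        # allele counts at one locus, sample 1
    pop2 = np.array([12, 38])        # sample 2

    log_bf = (log_marg(pop1) + log_marg(pop2)) - log_marg(pop1 + pop2)
    print(log_bf)                    # > 0 favours two distinct populations
    ```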

  15. A Bayesian mixture of semiparametric mixed-effects joint models for skewed-longitudinal and time-to-event data.

    PubMed

    Chen, Jiaqing; Huang, Yangxin

    2015-09-10

    In longitudinal studies, it is of interest to investigate how repeatedly measured markers are associated with the time to an event of interest; at the same time, the repeated measurements are often observed with the features of a heterogeneous population, non-normality, and covariates measured with error, owing to their longitudinal nature. Statistical analysis may become dramatically more complicated when one analyzes longitudinal-survival data with these features together. Recently, mixtures of skewed distributions have received increasing attention in the treatment of heterogeneous data involving asymmetric behaviors across subclasses, but relatively few studies accommodate the heterogeneity, non-normality, and covariate measurement error that arise simultaneously in the longitudinal-survival data setting. Under the umbrella of Bayesian inference, this article explores a finite mixture of semiparametric mixed-effects joint models with skewed distributions for the longitudinal measures, in an attempt to accommodate heterogeneous characteristics, adjust for departures from normality, and account for measurement error in covariates, as well as to overcome shortages of confidence in specifying a time-to-event model. The Bayesian mixture of joint modeling offers an appropriate avenue to estimate not only all parameters of the mixture joint models but also the probabilities of class membership. Simulation studies are conducted to assess the performance of the proposed method, and a real example is analyzed to demonstrate the methodology. The results are reported by comparing potential models under various scenarios.

  16. A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research

    PubMed Central

    van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B; Neyer, Franz J; van Aken, Marcel AG

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are introduced using a simplified example. Thereafter, the advantages and pitfalls of the specification of prior knowledge are discussed. To illustrate Bayesian methods explained in this study, in a second example a series of studies that examine the theoretical framework of dynamic interactionism are considered. In the Discussion the advantages and disadvantages of using Bayesian statistics are reviewed, and guidelines on how to report on Bayesian statistics are provided. PMID:24116396

  17. Bayesian global analysis of neutrino oscillation data

    NASA Astrophysics Data System (ADS)

    Bergström, Johannes; Gonzalez-Garcia, M. C.; Maltoni, Michele; Schwetz, Thomas

    2015-09-01

    We perform a Bayesian analysis of current neutrino oscillation data. When estimating the oscillation parameters we find that the results generally agree with those of the χ² method, with some differences involving sin²θ₂₃ and CP-violating effects. We discuss the additional subtleties caused by the circular nature of the CP-violating phase, and how it is possible to obtain correlation coefficients with sin²θ₂₃. When performing model comparison, we find that there is no significant evidence for any mass ordering, any octant of sin²θ₂₃ or a deviation from maximal mixing, nor the presence of CP-violation.

  18. Bayesian residual analysis for beta-binomial regression models

    NASA Astrophysics Data System (ADS)

    Pires, Rubiane Maria; Diniz, Carlos Alberto Ribeiro

    2012-10-01

    The beta-binomial regression model is an alternative model for the sum of a sequence of equicorrelated binary variables with common probability of success p. In this work a Bayesian perspective on this model is presented, considering different link functions and different correlation structures. A general Bayesian residual analysis for this model, an issue which is often neglected in Bayesian analysis, is presented in order to check the assumptions of the model, using residuals based on the predicted values obtained by the conditional predictive ordinate [1], residuals based on the posterior distribution of the model parameters [2], and the Bayesian deviance residual [3].

  19. Bayesian analysis of factors associated with fibromyalgia syndrome subjects

    NASA Astrophysics Data System (ADS)

    Jayawardana, Veroni; Mondal, Sumona; Russek, Leslie

    2015-01-01

    Factors contributing to movement-related fear in subjects with fibromyalgia (FM) were assessed by Russek et al. (2014), based on data collected by a national internet survey of community-based individuals. The study focused on the Activities-Specific Balance Confidence scale (ABC), the Primary Care Post-Traumatic Stress Disorder screen (PC-PTSD), the Tampa Scale of Kinesiophobia (TSK), a Joint Hypermobility Syndrome screen (JHS), the Vertigo Symptom Scale (VSS-SF), Obsessive-Compulsive Personality Disorder (OCPD), and pain, work status and physical activity variables from the Revised Fibromyalgia Impact Questionnaire (FIQR). The study presented in this paper revisits the same data with a Bayesian analysis in which appropriate priors are introduced for the variables selected in Russek's paper.

  1. The Application of Bayesian Analysis to Issues in Developmental Research

    ERIC Educational Resources Information Center

    Walker, Lawrence J.; Gustafson, Paul; Frimer, Jeremy A.

    2007-01-01

    This article reviews the concepts and methods of Bayesian statistical analysis, which can offer innovative and powerful solutions to some challenging analytical problems that characterize developmental research. In this article, we demonstrate the utility of Bayesian analysis, explain its unique adeptness in some circumstances, address some…

  2. Bayesian Analysis of High Dimensional Classification

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Subhadeep; Liang, Faming

    2009-12-01

    Modern data mining and bioinformatics have presented an important playground for statistical learning techniques, where the number of input variables is possibly much larger than the sample size of the training data. In supervised learning, logistic regression or probit regression can be used to model a binary output and form perceptron classification rules based on Bayesian inference. In these cases, there is much interest in searching for sparse models in the high-dimensional regression/classification setup. We first discuss two common challenges for analyzing high-dimensional data. The first is the curse of dimensionality: the complexity of many existing algorithms scales exponentially with the dimensionality of the space, so the algorithms soon become computationally intractable and therefore inapplicable in many real applications. The second is multicollinearity among the predictors, which severely slows down the algorithms. In order to make Bayesian analysis operational in high dimensions, we propose a novel hierarchical stochastic approximation Monte Carlo (HSAMC) algorithm, which overcomes the curse of dimensionality and the multicollinearity of predictors in high dimensions, and also possesses a self-adjusting mechanism to avoid local minima separated by high energy barriers. Models and methods are illustrated by simulations inspired by the field of genomics. Numerical results indicate that HSAMC can work as a general model selection sampler in high-dimensional complex model spaces.

  3. Bayesian analysis on gravitational waves and exoplanets

    NASA Astrophysics Data System (ADS)

    Deng, Xihao

    Attempts to detect gravitational waves using a pulsar timing array (PTA), i.e., a collection of pulsars in our Galaxy, have become more organized over the last several years. PTAs act to detect gravitational waves generated from very distant sources by observing the small and correlated effect the waves have on pulse arrival times at the Earth. In this thesis, I present advanced Bayesian analysis methods that can be used to search for gravitational waves in pulsar timing data. These methods were also applied to analyze a set of radial velocity (RV) data collected by the Hobby-Eberly Telescope from observations of a K0 giant star. They confirmed the presence of two Jupiter-mass planets around the star and also characterized its stellar p-mode oscillation. The first part of the thesis investigates the effect of wavefront curvature on a pulsar's response to a gravitational wave. In it we show that we can assume the gravitational wave phasefront is planar across the array only if the source luminosity distance ≫ 2πL²/λ, where L is the pulsar distance to the Earth (~ kpc) and λ is the radiation wavelength (~ pc) in the PTA waveband. Correspondingly, for a point gravitational wave source closer than ~ 100 Mpc, we should take into account the effect of wavefront curvature across the pulsar-Earth line of sight, which depends on the luminosity distance to the source, when evaluating the pulsar timing response. As a consequence, if a PTA can detect a gravitational wave from a source closer than ~ 100 Mpc, the effect of wavefront curvature on the response allows us to determine the source luminosity distance. The second and third parts of the thesis propose a new analysis method based on Bayesian nonparametric regression to search for gravitational wave bursts and a gravitational wave background in PTA data. Unlike a conventional Bayesian analysis that introduces a signal model with a fixed number of parameters, Bayesian nonparametric regression sets

  4. Optimal sequential Bayesian analysis for degradation tests.

    PubMed

    Rodríguez-Narciso, Silvia; Christen, J Andrés

    2016-07-01

    Degradation tests are especially difficult to conduct for items with high reliability. Test costs, caused mainly by prolonged item duration and item destruction costs, establish the necessity of sequential degradation test designs. We propose a methodology that sequentially selects the optimal observation times to measure the degradation, using a convenient rule that maximizes the inference precision and minimizes test costs. In particular our objective is to estimate a quantile of the time to failure distribution, where the degradation process is modelled as a linear model using Bayesian inference. The proposed sequential analysis is based on an index that measures the expected discrepancy between the estimated quantile and its corresponding prediction, using Monte Carlo methods. The procedure was successfully implemented for simulated and real data.
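
    A sketch of the underlying inference (the paper's contribution, the optimal choice of observation times, is omitted here): a linear degradation path with known noise yields a conjugate posterior for the slope, which induces a posterior for the time to first crossing of a failure threshold and hence for its quantiles.

    ```python
    # Posterior of a time-to-failure quantile from a linear degradation path (sketch).
    import numpy as np

    rng = np.random.default_rng(8)
    t = np.array([0.0, 50.0, 100.0, 150.0])
    y = 0.02 * t + rng.normal(0, 0.1, size=t.size)     # observed degradation (no intercept)
    threshold = 5.0                                    # failure when degradation exceeds this

    # conjugate posterior for the slope: known noise sd 0.1, N(0, 1) prior
    prec = (t @ t) / 0.1 ** 2 + 1.0
    mean = (t @ y) / 0.1 ** 2 / prec
    slopes = rng.normal(mean, 1 / np.sqrt(prec), size=10000)

    ttf = threshold / slopes[slopes > 0]               # posterior draws of the failure time
    print(np.quantile(ttf, 0.1))                       # lower quantile of time to failure
    ```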

  5. Bayesian spatial joint modeling of traffic crashes on an urban road network.

    PubMed

    Zeng, Qiang; Huang, Helai

    2014-06-01

    This study proposes a Bayesian spatial joint model of crash prediction including both road segments and intersections located in an urban road network, through which the spatial correlations between heterogeneous types of entities could be considered. A road network in Hillsborough, Florida, with crash, road, and traffic characteristics data for a three-year period was selected in order to compare the proposed joint model with three site-level crash prediction models, that is, the Poisson, negative binomial (NB), and conditional autoregressive (CAR) models. According to the results, the CAR and Joint models outperform the Poisson and NB models in terms of model fitting and predictive performance, which indicates the reasonableness of considering cross-entity spatial correlations. Although the goodness-of-fit and predictive performance of the CAR and Joint models are equivalent in this case study, spatial correlations between segments and the connected intersections are found to be more significant than those solely between segments or between intersections, which supports the employment of the Joint model as an alternative in road-network-level safety modeling.

  6. A Bayesian Hierarchical Approach to Regional Frequency Analysis of Extremes

    NASA Astrophysics Data System (ADS)

    Renard, B.

    2010-12-01

    Rainfall and runoff frequency analysis is a major issue for the hydrological community. The distribution of hydrological extremes varies in space and possibly in time. Describing and understanding this spatiotemporal variability are primary challenges to improve hazard quantification and risk assessment. This presentation proposes a general approach based on a Bayesian hierarchical model, following previous work by Cooley et al. [2007], Micevski [2007], Aryal et al. [2009] or Lima and Lall [2009; 2010]. Such a hierarchical model is made up of two levels: (1) a data level modeling the distribution of observations, and (2) a process level describing the fluctuation of the distribution parameters in space and possibly in time. At the first level of the model, at-site data (e.g., annual maxima series) are modeled with a chosen distribution (e.g., a GEV distribution). Since data from several sites are considered, the joint distribution of a vector of (spatial) observations needs to be derived. This is challenging because data are in general not spatially independent, especially for nearby sites. An elliptical copula is therefore used to formally account for spatial dependence between at-site data. This choice might be questionable in the context of extreme value distributions. However, it is motivated by its applicability in highly dimensional spatial problems, where the joint pdf of a vector of n observations is required to derive the likelihood function (with n possibly amounting to hundreds of sites). At the second level of the model, parameters of the chosen at-site distribution are then modeled by a Gaussian spatial process, whose mean may depend on covariates (e.g. elevation, distance to sea, weather pattern, time). In particular, this spatial process allows estimating parameters at ungauged sites, and deriving the predictive distribution of rainfall/runoff at every pixel/catchment of the studied domain. An application to extreme rainfall series from the French
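
The copula construction in this record can be sketched for the Gaussian (elliptical) case: GEV margins at each site tied together through a multivariate normal on the probability scale. The correlation matrix, GEV parameters, and data below are made-up placeholders.

```python
import numpy as np
from scipy import stats

def gev_gauss_copula_loglik(y, mu, sigma, xi, corr):
    """Joint log-likelihood of one vector of at-site annual maxima y
    (one value per site) under GEV margins tied together by a Gaussian
    copula with site-to-site correlation matrix `corr`."""
    # scipy's genextreme uses c = -xi relative to the usual GEV shape xi.
    marg = stats.genextreme(c=-xi, loc=mu, scale=sigma)
    u = np.clip(marg.cdf(y), 1e-12, 1 - 1e-12)
    z = stats.norm.ppf(u)
    mvn = stats.multivariate_normal(mean=np.zeros(len(y)), cov=corr)
    # Copula density: MVN log-density of z minus independent-normal
    # log-densities, plus the GEV marginal log-densities.
    return (mvn.logpdf(z)
            - stats.norm.logpdf(z).sum()
            + marg.logpdf(y).sum())

# Three hypothetical sites with decaying spatial correlation.
corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.6],
                 [0.3, 0.6, 1.0]])
y = np.array([42.0, 55.0, 38.0])   # one year's annual maxima (made up)
print(gev_gauss_copula_loglik(y, mu=40.0, sigma=10.0, xi=0.1, corr=corr))
```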

  7. Bayesian joint modeling of longitudinal measurements and time-to-event data using robust distributions.

    PubMed

    Baghfalaki, T; Ganjali, M; Hashemi, R

    2014-01-01

    Distributional assumptions of most existing methods for the joint modeling of longitudinal measurements and time-to-event data cannot accommodate outlier robustness. In this article, we develop and implement a joint model for longitudinal and time-to-event data using a family of distributions well suited to robust analysis, known as normal/independent distributions. These include univariate and multivariate versions of the Student's t, the slash, and the contaminated normal distributions. The proposed model implements a linear mixed effects model under a normal/independent distribution assumption for both random effects and residuals of the longitudinal process. For the time-to-event process, a parametric proportional hazards model with a Weibull baseline hazard is used. A Bayesian approach using Markov chain Monte Carlo methods is adopted for parameter estimation. Simulation studies are performed to investigate the performance of the proposed method in the presence and absence of outliers. The proposed methods are also applied to a real AIDS clinical trial comparing the efficiency and safety of two antiretroviral drugs, in which CD4 count measurements are gathered as longitudinal outcomes and time to death or dropout is the time-to-event outcome of interest. Different model structures are developed for analyzing these data, with model selection performed by the deviance information criterion (DIC), the expected Akaike information criterion (EAIC), and the expected Bayesian information criterion (EBIC).

  8. Multi-view TWRI scene reconstruction using a joint Bayesian sparse approximation model

    NASA Astrophysics Data System (ADS)

    Tang, V. H.; Bouzerdoum, A.; Phung, S. L.; Tivive, F. H. C.

    2015-05-01

    This paper addresses the problem of scene reconstruction in conjunction with wall-clutter mitigation for compressed multi-view through-the-wall radar imaging (TWRI). We consider the problem where the scene behind the wall is illuminated from different vantage points using a different set of frequencies at each antenna. First, a joint Bayesian sparse recovery model is employed to estimate the antenna signal coefficients simultaneously, by exploiting the sparsity and inter-signal correlations among antenna signals. Then, a subspace-projection technique is applied to suppress the signal coefficients related to the wall returns. Furthermore, a multi-task linear model is developed to relate the target coefficients to the image of the scene. The composite image is reconstructed using a joint Bayesian sparse framework, taking into account the inter-view dependencies. Experimental results are presented which demonstrate the effectiveness of the proposed approach for multi-view imaging of indoor scenes using a reduced set of measurements at each view.

  9. Bayesian data analysis in population ecology: motivations, methods, and benefits

    USGS Publications Warehouse

    Dorazio, Robert

    2016-01-01

    During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.

  10. Learning Bayesian networks from big meteorological spatial datasets. An alternative to complex network analysis

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Jose Manuel; San Martín, Daniel; Herrera, Sixto; Santiago Cofiño, Antonio

    2016-04-01

    The growing availability of spatial datasets (observations, reanalysis, and regional and global climate models) demands efficient multivariate spatial modeling techniques for many problems of interest (e.g. teleconnection analysis, multi-site downscaling, etc.). Complex networks have been recently applied in this context using graphs built from pairwise correlations between the different stations (or grid boxes) forming the dataset. However, this analysis does not take into account the full dependence structure underlying the data, given by all possible marginal and conditional dependencies among the stations, and does not allow a probabilistic analysis of the dataset. In this talk we introduce Bayesian networks as an alternative multivariate analysis and modeling data-driven technique which allows building a joint probability distribution of the stations including all relevant dependencies in the dataset. Bayesian networks are a sound machine learning technique that uses a graph to (1) encode the main dependencies among the variables and (2) obtain a factorization of the joint probability distribution of the stations given by a reduced number of parameters. For a particular problem, the resulting graph provides a qualitative analysis of the spatial relationships in the dataset (an alternative to complex network analysis), and the resulting model allows for a probabilistic analysis of the dataset. Bayesian networks have been widely applied in many fields, but their use in climate problems is hampered by the large number of variables (stations) involved in this field, since the complexity of the existing algorithms for learning the graphical structure from data grows nonlinearly with the number of variables. In this contribution we present a modified local learning algorithm for Bayesian networks adapted to this problem, which allows inferring the graphical structure for thousands of stations (from observations) and/or gridboxes (from model simulations) thus providing new

  11. Ockham's razor and Bayesian analysis. [statistical theory for systems evaluation

    NASA Technical Reports Server (NTRS)

    Jefferys, William H.; Berger, James O.

    1992-01-01

    'Ockham's razor', the ad hoc principle enjoining the greatest possible simplicity in theoretical explanations, is presently shown to be justifiable as a consequence of Bayesian inference; Bayesian analysis can, moreover, clarify the nature of the 'simplest' hypothesis consistent with the given data. By choosing the prior probabilities of hypotheses, it becomes possible to quantify the scientific judgment that simpler hypotheses are more likely to be correct. Bayesian analysis also shows that a hypothesis with fewer adjustable parameters intrinsically possesses an enhanced posterior probability, due to the clarity of its predictions.
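
A worked miniature of this automatic Ockham effect, assuming normal data with known variance: the marginal likelihood of a model with an adjustable mean mu ~ N(0, tau^2) spreads probability over data sets the fixed-mean model never predicts, and is penalized accordingly when the simple model suffices. All numbers are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Data generated by the *simple* hypothesis H0: y ~ N(0, 1).
y = rng.normal(0.0, 1.0, size=50)
n, s2 = len(y), 1.0

# H0 has no adjustable parameters: marginal likelihood = likelihood.
log_m0 = stats.norm.logpdf(y, 0.0, np.sqrt(s2)).sum()

# H1 adds an adjustable mean mu ~ N(0, tau^2). Its marginal likelihood
# integrates over mu; for normal models this is available in closed form
# by factorizing the data into residuals (mu-free) and the sample mean,
# with ybar ~ N(0, tau^2 + s2/n) marginally.
tau2 = 10.0
ybar = y.mean()
log_m1 = (stats.norm.logpdf(y, ybar, np.sqrt(s2)).sum()
          - stats.norm.logpdf(ybar, ybar, np.sqrt(s2 / n))
          + stats.norm.logpdf(ybar, 0.0, np.sqrt(tau2 + s2 / n)))

# Positive values favor the simpler hypothesis: the Ockham penalty.
print(f"log Bayes factor (H0 over H1): {log_m0 - log_m1:.2f}")
```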

  12. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.

  13. Coherent Bayesian analysis of inspiral signals

    NASA Astrophysics Data System (ADS)

    Röver, Christian; Meyer, Renate; Guidi, Gianluca M.; Viceré, Andrea; Christensen, Nelson

    2007-10-01

    In this paper we present a Bayesian parameter estimation method for the analysis of interferometric gravitational wave observations of an inspiral of binary compact objects using data recorded simultaneously by a network of several interferometers at different sites. We consider neutron star or black hole inspirals that are modeled to 3.5 post-Newtonian (PN) order in phase and 2.5 PN in amplitude. Inference is facilitated using Markov chain Monte Carlo (MCMC) methods that are adapted in order to efficiently explore the particular parameter space. Examples are shown to illustrate how and what information about the different parameters can be derived from the data. This study uses simulated signals and data with noise characteristics that are assumed to be defined by the LIGO and Virgo detectors operating at their design sensitivities. Nine parameters are estimated, including those associated with the binary system plus its location on the sky. We explain how this technique will be part of a detection pipeline for binary systems of compact objects with masses up to 20 M⊙, including cases where the ratio of the individual masses can be extreme.
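
The samplers in this record are specially adapted to the inspiral parameter space; as a generic stand-in, here is a plain random-walk Metropolis-Hastings sketch for a two-parameter toy signal (a sinusoid in white noise). The signal model, priors, and proposal scales are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for signal parameter estimation: a sinusoid with unknown
# amplitude A and frequency f buried in white noise of known sigma.
t = np.linspace(0.0, 1.0, 512)
true_A, true_f, sigma = 1.0, 7.0, 0.8
data = true_A * np.sin(2 * np.pi * true_f * t) + rng.normal(0, sigma, t.size)

def log_post(A, f):
    if not (0 < A < 10 and 0 < f < 20):      # flat prior on a bounded box
        return -np.inf
    resid = data - A * np.sin(2 * np.pi * f * t)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Random-walk Metropolis-Hastings, started near the mode for simplicity
# (the frequency posterior is multimodal, as in the GW problem).
theta = np.array([2.0, 6.5])
lp = log_post(*theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.05, 0.01])
    lp_prop = log_post(*prop)
    if np.log(rng.random()) < lp_prop - lp:   # Metropolis acceptance
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

chain = np.asarray(chain[5000:])
print("posterior means (A, f):", chain.mean(axis=0))
```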

  14. Bayesian Analysis of the Cosmic Microwave Background

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey

    2007-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background! Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. Bayesian formulation of this problem allows consistent treatment of many complexities including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution (as demonstrated on WMAP 1 and 3 year temperature and polarization data). Continuing development for Planck - the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to total uncertainty in cosmological parameters.
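
A scalar miniature of the Gibbs scheme described here, under the simplifying assumption that one "signal variance" stands in for the power spectrum: the sampler alternates between drawing the signal given the spectrum and the spectrum given the signal. All quantities are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)

# Data = signal + noise, where the signal variance C (the "power
# spectrum") is itself unknown. Gibbs sampling alternates s | C, d
# and C | s, mirroring the map/spectrum alternation in CMB analysis.
n_pix, noise_var, true_C = 1000, 4.0, 2.0
signal = rng.normal(0.0, np.sqrt(true_C), n_pix)
d = signal + rng.normal(0.0, np.sqrt(noise_var), n_pix)

C = 1.0                      # initial guess for the signal variance
C_samples = []
for _ in range(5000):
    # s | C, d: product of two Gaussians, i.e. Wiener-filter mean plus
    # a random fluctuation with the conditional variance.
    var_s = 1.0 / (1.0 / C + 1.0 / noise_var)
    mean_s = var_s * d / noise_var
    s = mean_s + rng.normal(0.0, np.sqrt(var_s), n_pix)
    # C | s: with a Jeffreys prior 1/C, an inverse-gamma draw.
    C = np.sum(s ** 2) / rng.chisquare(n_pix)
    C_samples.append(C)

print("posterior mean of signal variance:", np.mean(C_samples[1000:]))
```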

  15. Bayesian analysis of the backreaction models

    SciTech Connect

    Kurek, Aleksandra; Bolejko, Krzysztof; Szydlowski, Marek

    2010-03-15

    We present a Bayesian analysis of four different types of backreaction models, which are based on the Buchert equations. In this approach, one considers a solution to the Einstein equations for a general matter distribution and then an average of various observable quantities is taken. Such an approach became of considerable interest when it was shown that it could lead to agreement with observations without resorting to dark energy. In this paper we compare the {Lambda}CDM model and the backreaction models with type Ia supernovae, baryon acoustic oscillations, and cosmic microwave background data, and find that the former is favored. However, the tested models were based on some particular assumptions about the relation between the average spatial curvature and the backreaction, as well as the relation between the curvature and curvature index. In this paper we modified the latter assumption, leaving the former unchanged. We find that, by varying the relation between the curvature and curvature index, we can obtain a better fit. Therefore, some further work is still needed--in particular, the relation between the backreaction and the curvature should be revisited in order to fully determine the feasibility of the backreaction models to mimic dark energy.

  16. A fully Bayesian method for jointly fitting instrumental calibration and X-ray spectral models

    SciTech Connect

    Xu, Jin; Yu, Yaming; Van Dyk, David A.; Kashyap, Vinay L.; Siemiginowska, Aneta; Drake, Jeremy; Ratzlaff, Pete; Connors, Alanna; Meng, Xiao-Li E-mail: yamingy@ics.uci.edu E-mail: vkashyap@cfa.harvard.edu E-mail: jdrake@cfa.harvard.edu E-mail: meng@stat.harvard.edu

    2014-10-20

    Owing to a lack of robust principled methods, systematic instrumental uncertainties have generally been ignored in astrophysical data analysis despite wide recognition of the importance of including them. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. We previously introduced a pragmatic Bayesian method to address this problem. The method is 'pragmatic' in that it introduced an ad hoc technique that simplified computation by neglecting the potential information in the data for narrowing the uncertainty for the calibration product. Following that work, we use a principal component analysis to efficiently represent the uncertainty of the effective area of an X-ray (or γ-ray) telescope. Here, however, we leverage this representation to enable a principled, fully Bayesian method that coherently accounts for the calibration uncertainty in high-energy spectral analysis. In this setting, the method is compared with standard analysis techniques and the pragmatic Bayesian method. The advantage of the fully Bayesian method is that it allows the data to provide information not only for estimation of the source parameters but also for the calibration product—here the effective area, conditional on the adopted spectral model. In this way, it can yield more accurate and efficient estimates of the source parameters along with valid estimates of their uncertainty. Provided that the source spectrum can be accurately described by a parameterized model, this method allows rigorous inference about the effective area by quantifying which possible curves are most consistent with the data.
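
A sketch of the principal component step, assuming a library of simulated effective-area curves is available (the curves below are synthetic, not any instrument's calibration product): an SVD of the mean-subtracted library gives a low-dimensional representation from which plausible calibration curves can be drawn.

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-in "calibration sample": a library of plausible effective-area
# curves A(E) (rows) on an energy grid, e.g. from instrument simulations.
n_curves, n_energy = 200, 300
E = np.linspace(0.3, 8.0, n_energy)
nominal = 400 * np.exp(-0.5 * ((E - 1.5) / 1.2) ** 2)   # fake nominal A(E)
library = nominal * (1 + 0.05 * rng.normal(size=(n_curves, 1))
                     + 0.03 * rng.normal(size=(n_curves, 1)) * (E - 2) / 8)

# PCA via SVD of the mean-subtracted curves: a few components capture
# most of the calibration variance, giving a compact representation.
mean_curve = library.mean(axis=0)
dev = library - mean_curve
U, s, Vt = np.linalg.svd(dev, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
k = int(np.searchsorted(np.cumsum(explained), 0.99)) + 1
print(f"{k} components explain 99% of the calibration variance")

# Draw a new plausible effective-area curve from the representation:
# A = mean + sum_j e_j * sigma_j * v_j, with e_j standard normal.
e = rng.normal(size=k)
new_curve = mean_curve + (e * s[:k] / np.sqrt(n_curves - 1)) @ Vt[:k]
```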

  17. A Fully Bayesian Method for Jointly Fitting Instrumental Calibration and X-Ray Spectral Models

    NASA Astrophysics Data System (ADS)

    Xu, Jin; van Dyk, David A.; Kashyap, Vinay L.; Siemiginowska, Aneta; Connors, Alanna; Drake, Jeremy; Meng, Xiao-Li; Ratzlaff, Pete; Yu, Yaming

    2014-10-01

    Owing to a lack of robust principled methods, systematic instrumental uncertainties have generally been ignored in astrophysical data analysis despite wide recognition of the importance of including them. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. We previously introduced a pragmatic Bayesian method to address this problem. The method is "pragmatic" in that it introduced an ad hoc technique that simplified computation by neglecting the potential information in the data for narrowing the uncertainty for the calibration product. Following that work, we use a principal component analysis to efficiently represent the uncertainty of the effective area of an X-ray (or γ-ray) telescope. Here, however, we leverage this representation to enable a principled, fully Bayesian method that coherently accounts for the calibration uncertainty in high-energy spectral analysis. In this setting, the method is compared with standard analysis techniques and the pragmatic Bayesian method. The advantage of the fully Bayesian method is that it allows the data to provide information not only for estimation of the source parameters but also for the calibration product—here the effective area, conditional on the adopted spectral model. In this way, it can yield more accurate and efficient estimates of the source parameters along with valid estimates of their uncertainty. Provided that the source spectrum can be accurately described by a parameterized model, this method allows rigorous inference about the effective area by quantifying which possible curves are most consistent with the data.

  18. Bayesian analysis of a disability model for lung cancer survival.

    PubMed

    Armero, C; Cabras, S; Castellanos, M E; Perra, S; Quirós, A; Oruezábal, M J; Sánchez-Rubio, J

    2016-02-01

    Bayesian reasoning, survival analysis and multi-state models are used to assess survival times for Stage IV non-small-cell lung cancer patients and the evolution of the disease over time. Bayesian estimation is done using minimum informative priors for the Weibull regression survival model, leading to an automatic inferential procedure. Markov chain Monte Carlo methods have been used for approximating posterior distributions and the Bayesian information criterion has been considered for covariate selection. In particular, the posterior distribution of the transition probabilities, resulting from the multi-state model, constitutes a very interesting tool which could be useful to help oncologists and patients make efficient and effective decisions.

  19. Semiparametric Bayesian joint modeling of a binary and continuous outcome with applications in toxicological risk assessment.

    PubMed

    Hwang, Beom Seuk; Pennell, Michael L

    2014-03-30

    Many dose-response studies collect data on correlated outcomes. For example, in developmental toxicity studies, uterine weight and presence of malformed pups are measured on the same dam. Joint modeling can result in more efficient inferences than independent models for each outcome. Most methods for joint modeling assume standard parametric response distributions. However, in toxicity studies, it is possible that response distributions vary in location and shape with dose, which may not be easily captured by standard models. To address this issue, we propose a semiparametric Bayesian joint model for a binary and continuous response. In our model, a kernel stick-breaking process prior is assigned to the distribution of a random effect shared across outcomes, which allows flexible changes in distribution shape with dose shared across outcomes. The model also includes outcome-specific fixed effects to allow different location effects. In simulation studies, we found that the proposed model provides accurate estimates of toxicological risk when the data do not satisfy assumptions of standard parametric models. We apply our method to data from a developmental toxicity study of ethylene glycol diethyl ether. PMID:24123309

  20. Using a Bayesian Model to Jointly Estimate the Flaw Size Distribution and the Pod Function

    NASA Astrophysics Data System (ADS)

    Li, M.; Meeker, W. Q.; Hovey, P.

    2010-02-01

    In this paper we extend previous work by the authors to jointly estimate the flaw size distribution and the POD function from simulated field inspection data. Similar to our previous work, we assume that when a crack is above a detection threshold, both the signal amplitude and the flaw size are recorded. For a signal that is above the noise floor, but below the detection threshold, only the amplitude is recorded. At all other locations we know only that the signal is below the noise floor, i.e. left censored. Now our model allows different airplanes to have different crack growth rates, and the distribution of crack growth rates is to be estimated from the data. To estimate the parameters of the model, we use a Bayesian formulation that provides a convenient structure for estimating the plane-to-plane differences. The Bayesian formulation also allows the use of prior information based on knowledge of physics or previous experience with similar inspection situations. For example, there may be useful information about crack growth rates and about the slope in the amplitude and crack size relationship. Use of such information can importantly improve estimation precision.
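
The joint flaw-size/amplitude model in this record is richer than can be shown briefly, but the censoring structure of the likelihood can be sketched. The lognormal amplitude distribution, thresholds, and counts below are hypothetical placeholders.

```python
import numpy as np
from scipy import stats

# Three observation regimes, as described above: amplitude and size both
# recorded above the detection threshold; amplitude only, between the
# noise floor and the threshold; left-censored below the noise floor.
noise_floor, det_threshold = 0.5, 2.0

def log_lik(params, amps_detected, amps_subthreshold, n_below_floor):
    mu, sigma = params
    dist = stats.lognorm(s=sigma, scale=np.exp(mu))
    ll = dist.logpdf(amps_detected).sum()        # fully observed
    ll += dist.logpdf(amps_subthreshold).sum()   # amplitude known only
    # Left-censored sites: only "signal below the noise floor" is known.
    ll += n_below_floor * dist.logcdf(noise_floor)
    return ll

amps_detected = np.array([2.3, 3.1, 2.8])
amps_subthreshold = np.array([0.9, 1.4])
print(log_lik((0.0, 1.0), amps_detected, amps_subthreshold, n_below_floor=40))
```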

  1. Bayesian analysis of MEG visual evoked responses

    SciTech Connect

    Schmidt, D.M.; George, J.S.; Wood, C.C.

    1999-04-01

    The authors developed a method for analyzing neural electromagnetic data that allows probabilistic inferences to be drawn about regions of activation. The method involves the generation of a large number of possible solutions which both fit the data and conform to prior expectations about the nature of probable solutions, made explicit by a Bayesian formalism. In addition, they have introduced a model for the current distributions that produce MEG (and EEG) data that allows extended regions of activity, and can easily incorporate prior information such as anatomical constraints from MRI. To evaluate the feasibility and utility of the Bayesian approach with actual data, they analyzed MEG data from a visual evoked response experiment. They compared Bayesian analyses of MEG responses to visual stimuli in the left and right visual fields, in order to examine the sensitivity of the method to detect known features of human visual cortex organization. They also examined the changing pattern of cortical activation as a function of time.

  2. Efficient Bayesian Joint Models for Group Randomized Trials with Multiple Observation Times and Multiple Outcomes

    PubMed Central

    Xu, Xinyi; Pennell, Michael L.; Lu, Bo; Murray, David M.

    2013-01-01

    In this paper, we propose a Bayesian method for Group Randomized Trials (GRTs) with multiple observation times and multiple outcomes of different types. We jointly model these outcomes using latent multivariate normal linear regression, which allows treatment effects to change with time and accounts for (1) the intra-class correlation (ICC) within groups, (2) the correlation between different outcomes measured on the same subject, and (3) the over-time correlation (OTC) of each outcome. Moreover, we develop a set of innovative priors for the variance components which yield direct inference on the correlations, avoid undesirable constraints, and allow utilization of information from previous studies. We illustrate through simulations that our model can improve estimation efficiency (lower posterior standard deviations) of ICCs and treatment effects relative to single-outcome models and models with diffuse priors on the variance components. We also demonstrate the methodology using body composition data collected in the Trial of Activity in Adolescent Girls (TAAG). PMID:22733563

  3. A predictive Bayesian approach to risk analysis in health care

    PubMed Central

    Aven, Terje; Eidesen, Karianne

    2007-01-01

    Background The Bayesian approach is now widely recognised as a proper framework for analysing risk in health care. However, the traditional text-book Bayesian approach is in many cases difficult to implement, as it is based on abstract concepts and modelling. Methods The essential points of risk analyses conducted according to the predictive Bayesian approach are the identification of observable quantities, and prediction and uncertainty assessments of these quantities, using all the relevant information. The risk analysis summarizes the knowledge and lack of knowledge concerning critical operations and other activities, and in this way gives a basis for making rational decisions. Results It is shown that Bayesian risk analysis can be significantly simplified and made more accessible compared to the traditional text-book Bayesian approach by focusing on predictions of observable quantities and performing uncertainty assessments of these quantities using subjective probabilities. Conclusion The predictive Bayesian approach provides a framework for ensuring quality of risk analysis. The approach acknowledges that risk cannot be adequately described and evaluated simply by reference to summarising probabilities. Risk is defined by the combination of possible consequences and associated uncertainties. PMID:17714597

  4. Bayesian methods for the design and analysis of noninferiority trials.

    PubMed

    Gamalo-Siebers, Margaret; Gao, Aijun; Lakshminarayanan, Mani; Liu, Guanghan; Natanegara, Fanni; Railkar, Radha; Schmidli, Heinz; Song, Guochen

    2016-01-01

    The gold standard for evaluating treatment efficacy of a medical product is a placebo-controlled trial. However, when the use of placebo is considered to be unethical or impractical, a viable alternative for evaluating treatment efficacy is through a noninferiority (NI) study where a test treatment is compared to an active control treatment. The minimal objective of such a study is to determine whether the test treatment is superior to placebo. An assumption is made that if the active control treatment remains efficacious, as was observed when it was compared against placebo, then a test treatment that has comparable efficacy with the active control, within a certain range, must also be superior to placebo. Because of this assumption, the design, implementation, and analysis of NI trials present challenges for sponsors and regulators. In designing and analyzing NI trials, substantial historical data are often required on the active control treatment and placebo. Bayesian approaches provide a natural framework for synthesizing the historical data in the form of prior distributions that can effectively be used in design and analysis of a NI clinical trial. Despite a flurry of recent research activities in the area of Bayesian approaches in medical product development, there are still substantial gaps in recognition and acceptance of Bayesian approaches in NI trial design and analysis. The Bayesian Scientific Working Group of the Drug Information Association provides a coordinated effort to target the education and implementation issues on Bayesian approaches for NI trials. In this article, we provide a review of both frequentist and Bayesian approaches in NI trials, and elaborate on the implementation for two common Bayesian methods including hierarchical prior method and meta-analytic-predictive approach. Simulations are conducted to investigate the properties of the Bayesian methods, and some real clinical trial examples are presented for illustration.

  5. Joint prediction of multiple quantitative traits using a Bayesian multivariate antedependence model

    PubMed Central

    Jiang, J; Zhang, Q; Ma, L; Li, J; Wang, Z; Liu, J-F

    2015-01-01

    Predicting organismal phenotypes from genotype data is important for preventive and personalized medicine as well as plant and animal breeding. Although genome-wide association studies (GWAS) for complex traits have discovered a large number of trait- and disease-associated variants, phenotype prediction based on associated variants is usually in low accuracy even for a high-heritability trait because these variants can typically account for a limited fraction of total genetic variance. In comparison with GWAS, the whole-genome prediction (WGP) methods can increase prediction accuracy by making use of a huge number of variants simultaneously. Among various statistical methods for WGP, multiple-trait model and antedependence model show their respective advantages. To take advantage of both strategies within a unified framework, we proposed a novel multivariate antedependence-based method for joint prediction of multiple quantitative traits using a Bayesian algorithm via modeling a linear relationship of effect vector between each pair of adjacent markers. Through both simulation and real-data analyses, our studies demonstrated that the proposed antedependence-based multiple-trait WGP method is more accurate and robust than corresponding traditional counterparts (Bayes A and multi-trait Bayes A) under various scenarios. Our method can be readily extended to deal with missing phenotypes and resequence data with rare variants, offering a feasible way to jointly predict phenotypes for multiple complex traits in human genetic epidemiology as well as plant and livestock breeding. PMID:25873147

  6. Joint prediction of multiple quantitative traits using a Bayesian multivariate antedependence model.

    PubMed

    Jiang, J; Zhang, Q; Ma, L; Li, J; Wang, Z; Liu, J-F

    2015-07-01

    Predicting organismal phenotypes from genotype data is important for preventive and personalized medicine as well as plant and animal breeding. Although genome-wide association studies (GWAS) for complex traits have discovered a large number of trait- and disease-associated variants, phenotype prediction based on associated variants is usually in low accuracy even for a high-heritability trait because these variants can typically account for a limited fraction of total genetic variance. In comparison with GWAS, the whole-genome prediction (WGP) methods can increase prediction accuracy by making use of a huge number of variants simultaneously. Among various statistical methods for WGP, multiple-trait model and antedependence model show their respective advantages. To take advantage of both strategies within a unified framework, we proposed a novel multivariate antedependence-based method for joint prediction of multiple quantitative traits using a Bayesian algorithm via modeling a linear relationship of effect vector between each pair of adjacent markers. Through both simulation and real-data analyses, our studies demonstrated that the proposed antedependence-based multiple-trait WGP method is more accurate and robust than corresponding traditional counterparts (Bayes A and multi-trait Bayes A) under various scenarios. Our method can be readily extended to deal with missing phenotypes and resequence data with rare variants, offering a feasible way to jointly predict phenotypes for multiple complex traits in human genetic epidemiology as well as plant and livestock breeding.

  7. Time-varying nonstationary multivariate risk analysis using a dynamic Bayesian copula

    NASA Astrophysics Data System (ADS)

    Sarhadi, Ali; Burn, Donald H.; Concepción Ausín, María.; Wiper, Michael P.

    2016-03-01

    A time-varying risk analysis is proposed for an adaptive design framework in nonstationary conditions arising from climate change. A Bayesian, dynamic conditional copula is developed for modeling the time-varying dependence structure between mixed continuous and discrete multiattributes of multidimensional hydrometeorological phenomena. Joint Bayesian inference is carried out to fit the marginals and copula in an illustrative example using an adaptive, Gibbs Markov Chain Monte Carlo (MCMC) sampler. Posterior mean estimates and credible intervals are provided for the model parameters and the Deviance Information Criterion (DIC) is used to select the model that best captures different forms of nonstationarity over time. This study also introduces a fully Bayesian, time-varying joint return period for multivariate time-dependent risk analysis in nonstationary environments. The results demonstrate that the nature and the risk of extreme-climate multidimensional processes are changed over time under the impact of climate change, and accordingly the long-term decision making strategies should be updated based on the anomalies of the nonstationary environment.
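
Since model choice in this record rests on the Deviance Information Criterion, a small helper may clarify how the DIC is computed from posterior samples; the normal likelihood and fake posterior draws below are placeholders.

```python
import numpy as np
from scipy import stats

def dic(log_lik_fn, posterior_draws, data):
    """Deviance Information Criterion from posterior samples.

    DIC = D(theta_bar) + 2 * pD, with deviance D = -2 log L and
    effective parameter count pD = mean(D) - D(theta_bar); lower is better.
    """
    devs = np.array([-2.0 * log_lik_fn(th, data) for th in posterior_draws])
    theta_bar = posterior_draws.mean(axis=0)
    d_at_mean = -2.0 * log_lik_fn(theta_bar, data)
    p_d = devs.mean() - d_at_mean
    return d_at_mean + 2.0 * p_d, p_d

# Example: normal likelihood with fake posterior draws of (mu, sigma).
rng = np.random.default_rng(6)
data = rng.normal(1.0, 2.0, size=100)
draws = np.column_stack([rng.normal(1.0, 0.2, 2000),
                         np.abs(rng.normal(2.0, 0.15, 2000))])

def loglik(theta, y):
    return stats.norm.logpdf(y, theta[0], theta[1]).sum()

print(dic(loglik, draws, data))
```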

  8. Elite Athletes Refine Their Internal Clocks: A Bayesian Analysis.

    PubMed

    Chen, Yin-Hua; Verdinelli, Isabella; Cesari, Paola

    2016-07-01

    This paper carries out a full Bayesian analysis for a data set examined in Chen & Cesari (2015). These data were collected for assessing people's ability in evaluating short intervals of time. Chen & Cesari (2015) showed evidence of the existence of two independent internal clocks for evaluating time intervals below and above the second. Here we reexamine the same question by performing a complete statistical Bayesian analysis of the data. The Bayesian approach can be used to analyze these data thanks to the specific trial design. Data were obtained from evaluation of time ranges from two groups of individuals. More specifically, information gathered from a nontrained group (considered as baseline) allowed us to build a prior distribution for the parameter(s) of interest, and data from the trained group determined the likelihood function. This paper's main goals are (i) showing how the Bayesian inferential method can be used in statistical analyses and (ii) showing that the Bayesian methodology gives additional support to the findings presented in Chen & Cesari (2015) regarding the existence of two internal clocks in assessing duration of time intervals.
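
A conjugate sketch of the design just described, assuming normal errors with plugged-in variances: the nontrained group fixes the prior and the trained group supplies the likelihood. All numbers are invented.

```python
import numpy as np

# Hypothetical timing-error data (ratio of estimated to true duration).
baseline = np.array([0.98, 1.05, 1.10, 0.93, 1.02])   # nontrained group
trained = np.array([0.99, 1.01, 1.00, 0.98, 1.02])    # trained group

# Prior for the mean timing error, built from the baseline group.
prior_mean = baseline.mean()
prior_var = baseline.var(ddof=1) / len(baseline)

# Normal-normal conjugate update with the trained group's data.
lik_var = trained.var(ddof=1) / len(trained)
post_var = 1.0 / (1.0 / prior_var + 1.0 / lik_var)
post_mean = post_var * (prior_mean / prior_var + trained.mean() / lik_var)
print(f"posterior mean {post_mean:.3f}, sd {np.sqrt(post_var):.3f}")
```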

  9. Bayesian Analysis of the Pattern Informatics Technique

    NASA Astrophysics Data System (ADS)

    Cho, N.; Tiampo, K.; Klein, W.; Rundle, J.

    2007-12-01

    Pattern informatics (PI) [Rundle et al., 2000; Tiampo et al., 2002; Holliday et al., 2005] is a technique that uses phase dynamics to quantify temporal variations in seismicity patterns. This technique has shown interesting results for forecasting earthquakes with magnitude greater than or equal to 5 in southern California from 2000 to 2010 [Rundle et al., 2002]. In this work, a Bayesian approach is used to obtain a modified, updated version of the PI called Bayesian pattern informatics (BPI). This alternative method uses the PI result as a prior probability and models such as ETAS [Ogata, 1988, 2004; Helmstetter and Sornette, 2002] or BASS [Turcotte et al., 2007] to obtain the likelihood. Its result is similar to that obtained by the PI: the determination of regions, known as hotspots, that are most susceptible to the occurrence of events with M = 5 and larger during the forecast period. As an initial test, retrospective forecasts for the southern California region from 1990 to 2000 were made with both the BPI and the PI techniques, and the results are discussed in this work.

  10. APPLICATION OF BAYESIAN MONTE CARLO ANALYSIS TO A LAGRANGIAN PHOTOCHEMICAL AIR QUALITY MODEL. (R824792)

    EPA Science Inventory

    Uncertainties in ozone concentrations predicted with a Lagrangian photochemical air quality model have been estimated using Bayesian Monte Carlo (BMC) analysis. Bayesian Monte Carlo analysis provides a means of combining subjective "prior" uncertainty estimates developed ...
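
A minimal sketch of Bayesian Monte Carlo under stated assumptions: parameters are drawn from a subjective prior, and each model run is weighted by the likelihood of one observation. The "model" and all numbers here are stand-ins, not the Lagrangian photochemical model of the record.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Stand-in forward model, e.g. predicted ozone vs. one rate constant k.
def model(k):
    return 40.0 + 25.0 * k

k_prior = rng.lognormal(mean=0.0, sigma=0.5, size=50000)  # prior draws
pred = model(k_prior)                                     # prior model runs

# Weight each prior run by the likelihood of an observed concentration.
obs, obs_sd = 62.0, 5.0
w = stats.norm.pdf(obs, loc=pred, scale=obs_sd)
w /= w.sum()

# Posterior summaries are weighted averages over the prior sample.
post_mean = np.sum(w * k_prior)
print("prior mean of k:", k_prior.mean(), " posterior mean:", post_mean)
```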

  11. Joint Bayesian inversion of free oscillations and body waves for Earth's radial velocity and density structure

    NASA Astrophysics Data System (ADS)

    De Wit, R. W.; Valentine, A. P.; Trampert, J.

    2013-12-01

    Existing 1-D seismic models are the basis for 3-D seismic tomography and most earthquake location algorithms. It is therefore important to evaluate the quality of such 1-D reference models, yet quantifying uncertainties in seismological models is challenging. Ideally, quality assessment is an integral part of the inverse method. Our aim in this study is two-fold: (i) we show one approach to solving a Bayesian non-linear inverse problem and quantifying model uncertainties, and (ii) we analyse the constraint on radial P and S-wave velocity and density structure provided by normal mode and travel time data. We jointly invert body-wave travel times from the EHB bulletin (phases Pn, P, PP, PKP, Sn and S) and centre frequencies of recently measured splitting functions for the Earth's free oscillations up to 10 mHz. We adopt a machine learning approach to extract all information on the 1-D model that is available in the seismic data. We use artificial neural networks, which are very common in pattern recognition problems and can be used to approximate an arbitrary function. We obtain probability density functions (pdfs), which provide a quantitative description of our knowledge of the individual earth model parameters.

  12. Bayesian networks as a tool for epidemiological systems analysis

    NASA Astrophysics Data System (ADS)

    Lewis, F. I.

    2012-11-01

    Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter attempts not only to identify statistically associated variables, but to additionally, and empirically, separate these into those directly and indirectly dependent with one or more outcome variables. Such discrimination is vastly more ambitious but has the potential to reveal far more about key features of complex disease systems. Applying Bayesian network modeling to biological and medical data has considerable computational demands, combined with the need to ensure robust model selection given the vast model space of possible DAGs. These challenges require the use of approximation techniques, such as the Laplace approximation, Markov chain Monte Carlo simulation and parametric bootstrapping, along with computational parallelization. A case study in structure discovery - identification of an optimal DAG for given data - is presented which uses additive Bayesian networks to explore veterinary disease data of industrial and medical relevance.

  13. On Bayesian analysis of on-off measurements

    NASA Astrophysics Data System (ADS)

    Nosek, Dalibor; Nosková, Jana

    2016-06-01

    We propose an analytical solution to the on-off problem within the framework of Bayesian statistics. Both the statistical significance for the discovery of new phenomena and credible intervals on model parameters are presented in a consistent way. We use a large enough family of prior distributions of relevant parameters. The proposed analysis is designed to provide Bayesian solutions that can be used for any number of observed on-off events, including zero. The procedure is checked using Monte Carlo simulations. The usefulness of the method is demonstrated on examples from γ-ray astronomy.
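
The record derives analytical solutions; a brute-force numerical counterpart, assuming flat priors on the source and background rates (the paper works with a wider family of priors), shows the structure of the on-off posterior.

```python
import numpy as np
from scipy import stats

# On-off setup: n_on counts in the on-source region, n_off in an
# off-source region with alpha times the on-source exposure; the
# background rate b is a nuisance parameter to marginalize out.
n_on, n_off, alpha = 12, 30, 5.0

s = np.linspace(0.0, 20.0, 400)       # source-rate grid
b = np.linspace(0.01, 20.0, 400)      # background-rate grid
S, B = np.meshgrid(s, b, indexing="ij")

log_post = (stats.poisson.logpmf(n_on, S + B)
            + stats.poisson.logpmf(n_off, alpha * B))
post = np.exp(log_post - log_post.max())

post_s = post.sum(axis=1)             # marginalize out the background
ds = s[1] - s[0]
post_s /= post_s.sum() * ds           # normalize to a density in s

print("P(s > 1 | data) =", post_s[s > 1.0].sum() * ds)
```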

  14. A Comparison of Imputation Methods for Bayesian Factor Analysis Models

    ERIC Educational Resources Information Center

    Merkle, Edgar C.

    2011-01-01

    Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amenable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…

  15. Methods for the joint meta-analysis of multiple tests.

    PubMed

    Trikalinos, Thomas A; Hoaglin, David C; Small, Kevin M; Terrin, Norma; Schmid, Christopher H

    2014-12-01

    Existing methods for meta-analysis of diagnostic test accuracy focus primarily on a single index test. We propose models for the joint meta-analysis of studies comparing multiple index tests on the same participants in paired designs. These models respect the grouping of data by studies, account for the within-study correlation between the tests' true-positive rates (TPRs) and between their false-positive rates (FPRs) (induced because tests are applied to the same participants), and allow for between-study correlations between TPRs and FPRs (such as those induced by threshold effects). We estimate models in the Bayesian setting. We demonstrate using a meta-analysis of screening for Down syndrome with two tests: shortened humerus (arm bone), and shortened femur (thigh bone). Separate and joint meta-analyses yielded similar TPR and FPR estimates. For example, the summary TPR for a shortened humerus was 35.3% (95% credible interval (CrI): 26.9, 41.8%) versus 37.9% (27.7, 50.3%) with joint versus separate meta-analysis. Joint meta-analysis is more efficient when calculating comparative accuracy: the difference in the summary TPRs was 0.0% (-8.9, 9.5%; TPR higher for shortened humerus) with joint versus 2.6% (-14.7, 19.8%) with separate meta-analyses. Simulation and empirical analyses are needed to refine the role of the proposed methodology.

  16. Bayesian analysis of the flutter margin method in aeroelasticity

    DOE PAGES Beta

    Khalil, Mohammad; Poirel, Dominique; Sarkar, Abhijit

    2016-08-27

    A Bayesian statistical framework is presented for the Zimmerman and Weissenburger flutter margin method which considers the uncertainties in aeroelastic modal parameters. The proposed methodology overcomes the limitations of the previously developed least-square based estimation technique which relies on the Gaussian approximation of the flutter margin probability density function (pdf). Using the measured free-decay responses at subcritical (preflutter) airspeeds, the joint non-Gaussian posterior pdf of the modal parameters is sampled using the Metropolis-Hastings (MH) Markov chain Monte Carlo (MCMC) algorithm. The posterior MCMC samples of the modal parameters are then used to obtain the flutter margin pdfs and finally the flutter speed pdf. The usefulness of the Bayesian flutter margin method is demonstrated using synthetic data generated from a two-degree-of-freedom pitch-plunge aeroelastic model. The robustness of the statistical framework is demonstrated using different sets of measurement data. In conclusion, it is shown that the probabilistic (Bayesian) approach reduces the number of test points required in providing a flutter speed estimate for a given accuracy and precision.

  17. Bayesian Analysis of Nonlinear Structural Equation Models with Nonignorable Missing Data

    ERIC Educational Resources Information Center

    Lee, Sik-Yum

    2006-01-01

    A Bayesian approach is developed for analyzing nonlinear structural equation models with nonignorable missing data. The nonignorable missingness mechanism is specified by a logistic regression model. A hybrid algorithm that combines the Gibbs sampler and the Metropolis-Hastings algorithm is used to produce the joint Bayesian estimates of…

  18. Application of Bayesian graphs to SN Ia data analysis and compression

    NASA Astrophysics Data System (ADS)

    Ma, Cong; Corasaniti, Pier-Stefano; Bassett, Bruce A.

    2016-08-01

    Bayesian graphical models are an efficient tool for modelling complex data and derive self-consistent expressions of the posterior distribution of model parameters. We apply Bayesian graphs to perform statistical analyses of Type Ia supernova (SN Ia) luminosity distance measurements from the Joint Light-curve Analysis (JLA) dataset (Betoule et al. 2014). In contrast to the χ2 approach used in previous studies, the Bayesian inference allows us to fully account for the standard-candle parameter dependence of the data covariance matrix. Comparing with χ2 analysis results we find a systematic offset of the marginal model parameter bounds. We demonstrate that the bias is statistically significant in the case of the SN Ia standardization parameters with a maximal 6σ shift of the SN light-curve colour correction. In addition, we find that the evidence for a host galaxy correction is now only 2.4σ. Systematic offsets on the cosmological parameters remain small, but may increase by combining constraints from complementary cosmological probes. The bias of the χ2 analysis is due to neglecting the parameter-dependent log-determinant of the data covariance, which gives more statistical weight to larger values of the standardization parameters. We find a similar effect on compressed distance modulus data. To this end we implement a fully consistent compression method of the JLA dataset that uses a Gaussian approximation of the posterior distribution for fast generation of compressed data. Overall, the results of our analysis emphasize the need for a fully consistent Bayesian statistical approach in the analysis of future large SN Ia datasets.
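
The effect attributed here to the neglected log-determinant can be seen in a toy comparison: for a standardization-like parameter that rescales both the residuals and the covariance, the χ²-only objective is flat while the full Gaussian -2 log-likelihood is not. Everything below is schematic, not the JLA likelihood.

```python
import numpy as np

rng = np.random.default_rng(8)
y = rng.normal(0.0, 1.0, 20)          # schematic residual vector

def chi2_only(a):
    """chi^2 objective with the parameter-dependent normalization dropped."""
    r = a * y
    C = (a ** 2) * np.eye(len(y))
    return r @ np.linalg.solve(C, r)

def neg2_loglike(a):
    """Full Gaussian -2 log-likelihood, including log det C(a)."""
    r = a * y
    C = (a ** 2) * np.eye(len(y))
    sign, logdet = np.linalg.slogdet(C)
    return r @ np.linalg.solve(C, r) + logdet

# chi2_only is constant in a, so it cannot penalize parameter values
# that inflate the covariance; the log-det term in neg2_loglike can.
for a in (0.5, 1.0, 2.0):
    print(f"a={a}:  chi2={chi2_only(a):.2f}  -2lnL={neg2_loglike(a):.2f}")
```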

  19. Comparing models for perfluorooctanoic acid pharmacokinetics using Bayesian analysis.

    PubMed

    Wambaugh, John F; Barton, Hugh A; Setzer, R Woodrow

    2008-12-01

    Selecting the appropriate pharmacokinetic (PK) model given the available data is investigated for perfluorooctanoic acid (PFOA), which has been widely analyzed with an empirical, one-compartment model. This research examined the results of experiments [Kemper R. A., DuPont Haskell Laboratories, USEPA Administrative Record AR-226.1499 (2003)] that administered single oral or iv doses of PFOA to adult male and female rats. PFOA concentration was observed over time; in plasma for some animals and in fecal and urinary excretion for others. There were four rats per dose group, for a total of 36 males and 36 females. Assuming that the PK parameters for each individual within a gender were drawn from the same, biologically varying population, plasma and excretion data were jointly analyzed using a hierarchical framework to separate uncertainty due to measurement error from actual biological variability. Bayesian analysis using Markov Chain Monte Carlo (MCMC) provides tools to perform such an analysis as well as quantitative diagnostics to evaluate and discriminate between models. Starting from a one-compartment PK model with separate clearances to urine and feces, the model was incrementally expanded using Bayesian measures to assess if the expansion was supported by the data. PFOA excretion is sexually dimorphic in rats; male rats have bi-phasic elimination that is roughly 40 times slower than that of the females, which appear to have a single elimination phase. The male and female data were analyzed separately, keeping only the parameters describing the measurement process in common. For male rats, including excretion data initially decreased certainty in the one-compartment parameter estimates compared to an analysis using plasma data only. Allowing a third, unspecified clearance improved agreement and increased certainty when all the data was used, however a significant amount of eliminated PFOA was estimated to be missing from the excretion data. Adding an additional
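
A sketch of the base one-compartment structure named in this record, with separate urinary and fecal clearances, written as the deterministic forward model that a hierarchical Bayesian fit would wrap; all parameter values are invented.

```python
import numpy as np

def one_compartment_iv(t, dose, V, cl_urine, cl_feces):
    """One-compartment model with separate clearances to urine and feces:
    plasma concentration after an iv bolus, plus cumulative amounts
    excreted by each route."""
    cl = cl_urine + cl_feces
    ke = cl / V                                # elimination rate constant
    conc = (dose / V) * np.exp(-ke * t)        # plasma concentration
    eliminated = dose * (1.0 - np.exp(-ke * t))
    urine = eliminated * cl_urine / cl         # route split by clearance
    feces = eliminated * cl_feces / cl
    return conc, urine, feces

t = np.linspace(0.0, 240.0, 5)                 # hours
print(one_compartment_iv(t, dose=1.0, V=0.2, cl_urine=0.004, cl_feces=0.001))
```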

  20. An Overview of Bayesian Methods for Neural Spike Train Analysis

    PubMed Central

    2013-01-01

    Neural spike train analysis is an important task in computational neuroscience which aims to understand neural mechanisms and gain insights into neural circuits. With the advancement of multielectrode recording and imaging technologies, it has become increasingly demanding to develop statistical tools for analyzing large neuronal ensemble spike activity. Here we present a tutorial overview of Bayesian methods and their representative applications in neural spike train analysis, at both single neuron and population levels. On the theoretical side, we focus on various approximate Bayesian inference techniques as applied to latent state and parameter estimation. On the application side, the topics include spike sorting, tuning curve estimation, neural encoding and decoding, deconvolution of spike trains from calcium imaging signals, and inference of neuronal functional connectivity and synchrony. Some research challenges and opportunities for neural spike train analysis are discussed. PMID:24348527

  1. Integrated Data Analysis for Fusion: A Bayesian Tutorial for Fusion Diagnosticians

    SciTech Connect

    Dinklage, Andreas; Dreier, Heiko; Preuss, Roland; Fischer, Rainer; Gori, Silvio; Toussaint, Udo von

    2008-03-12

    Integrated Data Analysis (IDA) offers a unified way of combining information relevant to fusion experiments. Thereby, IDA meets with typical issues arising in fusion data analysis. In IDA, all information is consistently formulated as probability density functions quantifying uncertainties in the analysis within the Bayesian probability theory. For a single diagnostic, IDA allows the identification of faulty measurements and improvements in the setup. For a set of diagnostics, IDA gives joint error distributions allowing the comparison and integration of different diagnostics results. Validation of physics models can be performed by model comparison techniques. Typical data analysis applications benefit from IDA capabilities of nonlinear error propagation, the inclusion of systematic effects and the comparison of different physics models. Applications range from outlier detection, background discrimination, model assessment and design of diagnostics. In order to cope with next step fusion device requirements, appropriate techniques are explored for fast analysis applications.

  2. Bayesian tomography and integrated data analysis in fusion diagnostics

    NASA Astrophysics Data System (ADS)

    Li, Dong; Dong, Y. B.; Deng, Wei; Shi, Z. B.; Fu, B. Z.; Gao, J. M.; Wang, T. B.; Zhou, Yan; Liu, Yi; Yang, Q. W.; Duan, X. R.

    2016-11-01

    In this article, a Bayesian tomography method using a non-stationary Gaussian process as a prior is introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within the assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The application of this method to a soft X-ray diagnostic on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-size inference framework, aiming at an integrated analysis of heterogeneous diagnostics.

  3. BAYESIAN ANALYSIS OF MULTIPLE HARMONIC OSCILLATIONS IN THE SOLAR CORONA

    SciTech Connect

    Arregui, I.; Asensio Ramos, A.; Diaz, A. J.

    2013-03-01

    The detection of multiple mode harmonic kink oscillations in coronal loops enables us to obtain information on coronal density stratification and magnetic field expansion using seismology inversion techniques. The inference is based on the measurement of the period ratio between the fundamental mode and the first overtone and theoretical results for the period ratio under the hypotheses of coronal density stratification and magnetic field expansion of the wave guide. We present a Bayesian analysis of multiple mode harmonic oscillations for the inversion of the density scale height and magnetic flux tube expansion under each of the hypotheses. The two models are then compared using a Bayesian model comparison scheme to assess how plausible each one is given our current state of knowledge.

  4. Bayesian statistical analysis of protein side-chain rotamer preferences.

    PubMed Central

    Dunbrack, R. L.; Cohen, F. E.

    1997-01-01

    We present a Bayesian statistical analysis of the conformations of side chains in proteins from the Protein Data Bank. This is an extension of the backbone-dependent rotamer library, and includes rotamer populations and average chi angles for a full range of phi, psi values. The Bayesian analysis used here provides a rigorous statistical method for taking account of varying amounts of data. Bayesian statistics requires the assumption of a prior distribution for parameters over their range of possible values. This prior distribution can be derived from previous data or from pooling some of the present data. The prior distribution is combined with the data to form the posterior distribution, which is a compromise between the prior distribution and the data. For the chi 2, chi 3, and chi 4 rotamer prior distributions, we assume that the probability of each rotamer type is dependent only on the previous chi rotamer in the chain. For the backbone-dependence of the chi 1 rotamers, we derive prior distributions from the product of the phi-dependent and psi-dependent probabilities. Molecular mechanics calculations with the CHARMM22 potential show a strong similarity with the experimental distributions, indicating that proteins attain their lowest energy rotamers with respect to local backbone-side-chain interactions. The new library is suitable for use in homology modeling, protein folding simulations, and the refinement of X-ray and NMR structures. PMID:9260279
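
The varying-amounts-of-data compromise described in this record behaves like Dirichlet-multinomial shrinkage. A hedged sketch follows; the counts, prior probabilities, and strength K are hypothetical, not values from the library itself.

```python
import numpy as np

def rotamer_posterior(counts, prior_probs, strength):
    """Dirichlet-multinomial shrinkage of rotamer frequencies.

    Posterior mean = (n_i + K * p0_i) / (N + K): with little data the
    estimate stays near the prior p0; with much data it follows the counts.
    """
    counts = np.asarray(counts, dtype=float)
    prior_probs = np.asarray(prior_probs, dtype=float)
    return (counts + strength * prior_probs) / (counts.sum() + strength)

# Hypothetical chi1 rotamer counts for one (phi, psi) bin, with a prior
# pooled over neighbouring bins.
counts = [3, 1, 0]                     # g+, t, g- observations in the bin
prior = [0.5, 0.3, 0.2]                # pooled prior probabilities
print(rotamer_posterior(counts, prior, strength=10.0))
```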

  5. A Bayesian Nonparametric Meta-Analysis Model

    ERIC Educational Resources Information Center

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G.

    2015-01-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall…

  6. Risk analysis using a hybrid Bayesian-approximate reasoning methodology.

    SciTech Connect

    Bott, T. F.; Eisenhawer, S. W.

    2001-01-01

    Analysts are sometimes asked to make frequency estimates for specific accidents in which the accident frequency is determined primarily by safety controls. Under these conditions, frequency estimates use considerable expert belief in determining how the controls affect the accident frequency. To evaluate and document beliefs about control effectiveness, we have modified a traditional Bayesian approach by using approximate reasoning (AR) to develop prior distributions. Our method produces accident frequency estimates that separately express the probabilistic results produced in Bayesian analysis and possibilistic results that reflect uncertainty about the prior estimates. Based on our experience using traditional methods, we feel that the AR approach better documents beliefs about the effectiveness of controls than if the beliefs are buried in Bayesian prior distributions. We have performed numerous expert elicitations in which probabilistic information was sought from subject matter experts not trained in probability. We find it much easier to elicit the linguistic variables and fuzzy set membership values used in AR than to obtain the probability distributions used in prior distributions directly from these experts because it better captures their beliefs and better expresses their uncertainties.

  7. Bayesian analysis of botanical epidemics using stochastic compartmental models.

    PubMed

    Gibson, G J; Kleczkowski, A; Gilligan, C A

    2004-08-17

    A stochastic model for an epidemic, incorporating susceptible, latent, and infectious states, is developed. The model represents primary and secondary infection rates and a time-varying host susceptibility with applications to a wide range of epidemiological systems. A Markov chain Monte Carlo algorithm is presented that allows the model to be fitted to experimental observations within a Bayesian framework. The approach allows the uncertainty in unobserved aspects of the process to be represented in the parameter posterior densities. The methods are applied to experimental observations of damping-off of radish (Raphanus sativus) caused by the fungal pathogen Rhizoctonia solani, in the presence and absence of the antagonistic fungus Trichoderma viride, a biological control agent that has previously been shown to affect the rate of primary infection by using a maximum-likelihood estimate for a simpler model with no allowance for a latent period. Using the Bayesian analysis, we are able to estimate the latent period from population data, even when there is uncertainty in discriminating infectious from latently infected individuals in data collection. We also show that the inference that T. viride can control primary, but not secondary, infection is robust to inclusion of the latent period in the model, although the absolute values of the parameters change. Some refinements and potential difficulties with the Bayesian approach in this context, when prior information on parameters is lacking, are discussed along with broader applications of the methods to a wide range of epidemiological systems.
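
    The compartmental forward model can be conveyed by a short stochastic simulation. The sketch below is a Gillespie-style simulation of susceptible-latent-infectious dynamics with primary and secondary infection; the rates are hypothetical, and the paper's time-varying host susceptibility is omitted for brevity.

        import numpy as np

        def simulate_sli(S, L, I, rp, rs, gamma, t_end, rng):
            """Simulate S -> L (rate (rp + rs*I)*S) and L -> I (rate gamma*L)."""
            t, path = 0.0, [(0.0, S, L, I)]
            while t < t_end:
                rates = np.array([(rp + rs * I) * S,   # primary + secondary infection
                                  gamma * L])          # end of latent period
                total = rates.sum()
                if total == 0.0:
                    break
                t += rng.exponential(1.0 / total)
                if rng.random() < rates[0] / total:
                    S, L = S - 1, L + 1
                else:
                    L, I = L - 1, I + 1
                path.append((t, S, L, I))
            return path

        path = simulate_sli(S=50, L=0, I=0, rp=0.02, rs=0.01, gamma=0.5,
                            t_end=20.0, rng=np.random.default_rng(1))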

  8. Analysis of NSTX TF Joint Voltage Measurements

    SciTech Connect

    R, Woolley

    2005-10-07

    This report presents findings of analyses of recorded current and voltage data associated with 72 electrical joints operating at high current and high mechanical stress. The analysis goal was to characterize the mechanical behavior of each joint and thus evaluate its mechanical supports. The joints are part of the toroidal field (TF) magnet system of the National Spherical Torus Experiment (NSTX) pulsed plasma device operating at the Princeton Plasma Physics Laboratory (PPPL). Since there is not sufficient space near the joints for much traditional mechanical instrumentation, small voltage probes were installed on each joint and their voltage monitoring waveforms have been recorded on sampling digitizers during each NSTX "shot".

  9. Bayesian Dose-Finding in Two Treatment Cycles Based on the Joint Utility of Efficacy and Toxicity

    PubMed Central

    Lee, Juhee; Thall, Peter F.; Ji, Yuan; Müller, Peter

    2014-01-01

    A phase I/II clinical trial design is proposed for adaptively and dynamically optimizing each patient's dose in each of two cycles of therapy based on the joint binary efficacy and toxicity outcomes in each cycle. A dose-outcome model is assumed that includes a Bayesian hierarchical latent variable structure to induce association among the outcomes and also facilitate posterior computation. Doses are chosen in each cycle based on posteriors of a model-based objective function, similar to a reinforcement learning or Q-learning function, defined in terms of numerical utilities of the joint outcomes in each cycle. For each patient, the procedure outputs a sequence of two actions, one for each cycle, with each action being the decision to either treat the patient at a chosen dose or not to treat. The cycle 2 action depends on the individual patient's cycle 1 dose and outcomes. In addition, decisions are based on posterior inference using other patients’ data, and therefore the proposed method is adaptive both within and between patients. A simulation study of the method is presented, including comparison to two-cycle extensions of the conventional 3+3 algorithm, continual reassessment method, and a Bayesian model-based design, and evaluation of robustness. PMID:26366026

  10. Bayesian sensitivity analysis of bifurcating nonlinear models

    NASA Astrophysics Data System (ADS)

    Becker, W.; Worden, K.; Rowson, J.

    2013-01-01

    Sensitivity analysis allows one to investigate how changes in input parameters to a system affect the output. When computational expense is a concern, metamodels such as Gaussian processes can offer considerable computational savings over Monte Carlo methods, albeit at the expense of introducing a data modelling problem. In particular, Gaussian processes assume a smooth, non-bifurcating response surface. This work highlights a recent extension to Gaussian processes which uses a decision tree to partition the input space into homogeneous regions, and then fits separate Gaussian processes to each region. In this way, bifurcations can be modelled at region boundaries and different regions can have different covariance properties. To test this method, both the treed and standard methods were applied to the bifurcating response of a Duffing oscillator and a bifurcating FE model of a heart valve. It was found that the treed Gaussian process provides a practical way of performing uncertainty and sensitivity analysis on large, potentially-bifurcating models, which cannot be dealt with by using a single GP, although an open problem remains how to manage bifurcation boundaries that are not parallel to coordinate axes.

  11. Bayesian analysis for extreme climatic events: A review

    NASA Astrophysics Data System (ADS)

    Chu, Pao-Shin; Zhao, Xin

    2011-11-01

    This article reviews Bayesian analysis methods applied to extreme climatic data. We particularly focus on applications to three different problems related to extreme climatic events including detection of abrupt regime shifts, clustering tropical cyclone tracks, and statistical forecasting for seasonal tropical cyclone activity. For identifying potential change points in an extreme event count series, a hierarchical Bayesian framework involving three layers - data, parameter, and hypothesis - is formulated to demonstrate the posterior probability of the shifts throughout the time. For the data layer, a Poisson process with a gamma distributed rate is presumed. For the hypothesis layer, multiple candidate hypotheses with different change-points are considered. To calculate the posterior probability for each hypothesis and its associated parameters we developed an exact analytical formula, a Markov Chain Monte Carlo (MCMC) algorithm, and a more sophisticated reversible jump Markov Chain Monte Carlo (RJMCMC) algorithm. The algorithms are applied to several rare event series: the annual tropical cyclone or typhoon counts over the central, eastern, and western North Pacific; the annual extremely heavy rainfall event counts at Manoa, Hawaii; and the annual heat wave frequency in France. Using an Expectation-Maximization (EM) algorithm, a Bayesian clustering method built on a mixture Gaussian model is applied to objectively classify historical, spaghetti-like tropical cyclone tracks (1945-2007) over the western North Pacific and the South China Sea into eight distinct track types. A regression based approach to forecasting seasonal tropical cyclone frequency in a region is developed. Specifically, by adopting large-scale environmental conditions prior to the tropical cyclone season, a Poisson regression model is built for predicting seasonal tropical cyclone counts, and a probit regression model is alternatively developed toward a binary classification problem. With a non
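
    The data layer described above, Poisson counts with a gamma-distributed rate, is conjugate, which is what makes an exact analytical change-point formula possible. A minimal sketch with hypothetical counts and one candidate change point:

        import numpy as np

        counts = np.array([2, 3, 1, 2, 4, 6, 7, 5, 8, 6])   # hypothetical annual counts
        a, b = 2.0, 1.0                                      # gamma prior shape and rate
        split = 5                                            # candidate change point

        # The posterior for a Poisson rate under a Gamma(a, b) prior is
        # Gamma(a + sum(counts), b + n), segment by segment.
        for name, seg in [("before", counts[:split]), ("after", counts[split:])]:
            a_post, b_post = a + seg.sum(), b + len(seg)
            print(name, "posterior mean rate:", a_post / b_post)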

  12. The Bayesian Analysis Software Developed At Washington University

    NASA Astrophysics Data System (ADS)

    Marutyan, Karen R.; Bretthorst, G. Larry

    2009-12-01

    Over the last few years there has been an ongoing effort at the Biomedical Magnetic Resonance Laboratory within Washington University to develop data analysis applications using Bayesian probability theory. A few of these applications are specific to Magnetic Resonance data; however, most are general and can analyze data from a wide variety of sources. These data analysis applications are server based and they have been written in such a way as to allow them to utilize as many processors as are available. The interface to these Bayesian applications is a client-based Java interface. The client, usually a Windows PC, runs the interface, sets up an analysis, sends the analysis to the server, fetches the results and displays the appropriate plots on the user's client machine. Together, the client and server software can be used to solve a host of interesting problems that occur regularly in the sciences. In this paper, we describe both the client and server software and briefly discuss how to acquire, install and maintain this software.

  13. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  14. Bayesian analysis to detect abrupt changes in extreme hydrological processes

    NASA Astrophysics Data System (ADS)

    Jo, Seongil; Kim, Gwangsu; Jeon, Jong-June

    2016-07-01

    In this study, we develop a new method for Bayesian change point analysis. The proposed method is easy to implement and can be extended to a wide class of distributions. Using a generalized extreme-value distribution, we investigate the annual maxima of precipitation observed at stations in the South Korean Peninsula, and find significant changes at the considered sites. We evaluate the hydrological risk in predictions using the estimated return levels. In addition, we explain that misspecification of the probability model can lead to a bias in the number of change points and, using a simple example, show that this problem is difficult to avoid by technical data transformation.

  15. A Bayesian analysis of pentaquark signals from CLAS data

    SciTech Connect

    David Ireland; Bryan McKinnon; Dan Protopopescu; Pawel Ambrozewicz; Marco Anghinolfi; G. Asryan; Harutyun Avakian; H. Bagdasaryan; Nathan Baillie; Jacques Ball; Nathan Baltzell; V. Batourine; Marco Battaglieri; Ivan Bedlinski; Ivan Bedlinskiy; Matthew Bellis; Nawal Benmouna; Barry Berman; Angela Biselli; Lukasz Blaszczyk; Sylvain Bouchigny; Sergey Boyarinov; Robert Bradford; Derek Branford; William Briscoe; William Brooks; Volker Burkert; Cornel Butuceanu; John Calarco; Sharon Careccia; Daniel Carman; Liam Casey; Shifeng Chen; Lu Cheng; Philip Cole; Patrick Collins; Philip Coltharp; Donald Crabb; Volker Crede; Natalya Dashyan; Rita De Masi; Raffaella De Vita; Enzo De Sanctis; Pavel Degtiarenko; Alexandre Deur; Richard Dickson; Chaden Djalali; Gail Dodge; Joseph Donnelly; David Doughty; Michael Dugger; Oleksandr Dzyubak; Hovanes Egiyan; Kim Egiyan; Lamiaa Elfassi; Latifa Elouadrhiri; Paul Eugenio; Gleb Fedotov; Gerald Feldman; Ahmed Fradi; Herbert Funsten; Michel Garcon; Gagik Gavalian; Nerses Gevorgyan; Gerard Gilfoyle; Kevin Giovanetti; Francois-Xavier Girod; John Goetz; Wesley Gohn; Atilla Gonenc; Ralf Gothe; Keith Griffioen; Michel Guidal; Nevzat Guler; Lei Guo; Vardan Gyurjyan; Kawtar Hafidi; Hayk Hakobyan; Charles Hanretty; Neil Hassall; F. Hersman; Ishaq Hleiqawi; Maurik Holtrop; Charles Hyde; Yordanka Ilieva; Boris Ishkhanov; Eugeny Isupov; D. Jenkins; Hyon-Suk Jo; John Johnstone; Kyungseon Joo; Henry Juengst; Narbe Kalantarians; James Kellie; Mahbubul Khandaker; Wooyoung Kim; Andreas Klein; Franz Klein; Mikhail Kossov; Zebulun Krahn; Laird Kramer; Valery Kubarovsky; Joachim Kuhn; Sergey Kuleshov; Viacheslav Kuznetsov; Jeff Lachniet; Jean Laget; Jorn Langheinrich; D. Lawrence; Kenneth Livingston; Haiyun Lu; Marion MacCormick; Nikolai Markov; Paul Mattione; Bernhard Mecking; Mac Mestayer; Curtis Meyer; Tsutomu Mibe; Konstantin Mikhaylov; Marco Mirazita; Rory Miskimen; Viktor Mokeev; Brahim Moreno; Kei Moriya; Steven Morrow; Maryam Moteabbed; Edwin Munevar Espitia; Gordon Mutchler; Pawel Nadel-Turonski; Rakhsha Nasseripour; Silvia Niccolai; Gabriel Niculescu; Maria-Ioana Niculescu; Bogdan Niczyporuk; Megh Niroula; Rustam Niyazov; Mina Nozar; Mikhail Osipenko; Alexander Ostrovidov; Kijun Park; Evgueni Pasyuk; Craig Paterson; Sergio Pereira; Joshua Pierce; Nikolay Pivnyuk; Oleg Pogorelko; Sergey Pozdnyakov; John Price; Sebastien Procureur; Yelena Prok; Brian Raue; Giovanni Ricco; Marco Ripani; Barry Ritchie; Federico Ronchetti; Guenther Rosner; Patrizia Rossi; Franck Sabatie; Julian Salamanca; Carlos Salgado; Joseph Santoro; Vladimir Sapunenko; Reinhard Schumacher; Vladimir Serov; Youri Sharabian; Dmitri Sharov; Nikolay Shvedunov; Elton Smith; Lee Smith; Daniel Sober; Daria Sokhan; Aleksey Stavinskiy; Samuel Stepanyan; Stepan Stepanyan; Burnham Stokes; Paul Stoler; Steffen Strauch; Mauro Taiuti; David Tedeschi; Ulrike Thoma; Avtandil Tkabladze; Svyatoslav Tkachenko; Clarisse Tur; Maurizio Ungaro; Michael Vineyard; Alexander Vlassov; Daniel Watts; Lawrence Weinstein; Dennis Weygand; M. Williams; Elliott Wolin; M.H. Wood; Amrit Yegneswaran; Lorenzo Zana; Jixie Zhang; Bo Zhao; Zhiwen Zhao

    2008-02-01

    We examine the results of two measurements by the CLAS collaboration, one of which claimed evidence for a $\Theta^{+}$ pentaquark, whilst the other found no such evidence. The unique feature of these two experiments was that they were performed with the same experimental setup. Using a Bayesian analysis we find that the results of the two experiments are in fact compatible with each other, but that the first measurement did not contain sufficient information to determine unambiguously the existence of a $\Theta^{+}$. Further, we suggest a means by which the existence of a new candidate particle can be tested in a rigorous manner.

  16. Bayesian analysis of inflationary features in Planck and SDSS data

    NASA Astrophysics Data System (ADS)

    Benetti, Micol; Alcaniz, Jailson S.

    2016-07-01

    We perform a Bayesian analysis to study possible features in the primordial inflationary power spectrum of scalar perturbations. In particular, we analyze the possibility of detecting the imprint of these primordial features in the anisotropy temperature power spectrum of the cosmic microwave background (CMB) and also in the matter power spectrum P (k ) . We use the most recent CMB data provided by the Planck Collaboration and P (k ) measurements from the 11th data release of the Sloan Digital Sky Survey. We focus our analysis on a class of potentials whose features are localized at different intervals of angular scales, corresponding to multipoles in the ranges 10 <ℓ<60 (Oscill-1) and 150 <ℓ<300 (Oscill-2). Our results show that one of the step potentials (Oscill-1) provides a better fit to the CMB data than does the featureless Λ CDM scenario, with moderate Bayesian evidence in favor of the former. Adding the P (k ) data to the analysis weakens the evidence of the Oscill-1 potential relative to the standard model and strengthens the evidence of this latter scenario with respect to the Oscill-2 model.

  17. Implementation of a Bayesian Engine for Uncertainty Analysis

    SciTech Connect

    Leng Vang; Curtis Smith; Steven Prescott

    2014-08-01

    In probabilistic risk assessment, it is important to have an environment where analysts have access to shared and secured high-performance computing and a statistical analysis tool package. As part of the advanced small modular reactor probabilistic risk analysis framework implementation, we have identified the need for advanced Bayesian computations. However, in order to make this technology available to non-specialists, there is also a need for a simplified tool that allows users to author models and evaluate them within this framework. As a proof-of-concept, we have implemented an advanced open source Bayesian inference tool, OpenBUGS, within the browser-based cloud risk analysis framework that is under development at the Idaho National Laboratory. This development, the “OpenBUGS Scripter”, has been implemented as a client-side, visual, web-based and integrated development environment for creating OpenBUGS language scripts. It depends on the shared server environment to execute the generated scripts and to transmit results back to the user. The visual models are in the form of linked diagrams, from which we automatically create the applicable OpenBUGS script that matches the diagram. These diagrams can be saved locally or stored on the server environment to be shared with other users.

  18. Joint modeling of survival and longitudinal non-survival data: current methods and issues. Report of the DIA Bayesian joint modeling working group

    PubMed Central

    Gould, A. Lawrence; Boye, Mark Ernest; Crowther, Michael J.; Ibrahim, Joseph G.; Quartey, George; Micallef, Sandrine; Bois, Frederic Y.

    2015-01-01

    Explicitly modeling underlying relationships between a survival endpoint and processes that generate longitudinal measured or reported outcomes potentially could improve the efficiency of clinical trials and provide greater insight into the various dimensions of the clinical effect of interventions included in the trials. Various strategies have been proposed for using longitudinal findings to elucidate intervention effects on clinical outcomes such as survival. The application of specifically Bayesian approaches for constructing models that address longitudinal and survival outcomes explicitly has been recently addressed in the literature. We review currently available methods for carrying out joint analyses, including issues of implementation and interpretation, identify software tools that can be used to carry out the necessary calculations, and review applications of the methodology. PMID:24634327

  19. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    SciTech Connect

    Keselman, Dmitry; Tompkins, George H; Leishman, Deborah A

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e. the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
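
    The Markov property and the treatment of uncertain evidence can be made concrete with a three-node toy network. The numbers are invented, the augmentation technique itself is not reproduced, and the uncertain-evidence weighting below follows the common soft-evidence convention.

        # Network A -> B, A -> C: given its parent A, node C is independent of
        # its non-descendant B, so P(A, B, C) = P(A) P(B|A) P(C|A).
        pA = {0: 0.7, 1: 0.3}
        pB_given_A = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}
        pC_given_A = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}

        def joint(a, b, c):
            return pA[a] * pB_given_A[a][b] * pC_given_A[a][c]

        # Hard evidence B = 1: posterior of the hypothesis node A.
        num = {a: sum(joint(a, 1, c) for c in (0, 1)) for a in (0, 1)}
        z = sum(num.values())
        posterior_A = {a: num[a] / z for a in (0, 1)}

        # Uncertain evidence: the state of B is described by a distribution.
        pB_evidence = {0: 0.2, 1: 0.8}
        num_soft = {a: sum(pB_evidence[b] * sum(joint(a, b, c) for c in (0, 1))
                           for b in (0, 1)) for a in (0, 1)}
        z_soft = sum(num_soft.values())
        posterior_A_soft = {a: num_soft[a] / z_soft for a in (0, 1)}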

  20. Bayesian semiparametric meta-analysis for genetic association studies.

    PubMed

    De Iorio, Maria; Newcombe, Paul J; Tachmazidou, Ioanna; Verzilli, Claudio J; Whittaker, John C

    2011-07-01

    We present a Bayesian semiparametric model for the meta-analysis of candidate gene studies with a binary outcome. Such studies often report results from association tests for different, possibly study-specific and non-overlapping genetic markers in the same genetic region. Meta-analyses of the results at each marker in isolation are seldom appropriate as they ignore the correlation that may exist between markers due to linkage disequilibrium (LD) and cannot assess the relative importance of variants at each marker. Also, such marker-wise meta-analyses are restricted to only those studies that have typed the marker in question, with a potential loss of power. A better strategy is one which incorporates information about the LD between markers so that any combined estimate of the effect of each variant is corrected for the effect of other variants, as in multiple regression. Here we develop a Bayesian semiparametric model which models the observed genotype group frequencies conditional on the case/control status and uses pairwise LD measurements between markers as prior information to make posterior inference on adjusted effects. The approach allows borrowing of strength across studies and across markers. The analysis is based on a mixture of Dirichlet processes model as the underlying semiparametric model. Full posterior inference is performed through Markov chain Monte Carlo algorithms. The approach is demonstrated on simulated and real data. PMID:21400586

  1. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    PubMed

    Hack, C Eric

    2006-04-17

    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate, and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least-squares approaches. The Bayesian approach of Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach.
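
    To give the non-mathematician promised above something concrete, here is a minimal random-walk Metropolis sampler applied to a deliberately simplified one-parameter dose model; the data, model and prior are hypothetical stand-ins for a real PBTK calibration, which involves many correlated parameters.

        import numpy as np

        def metropolis(log_post, theta0, step, n_iter, rng):
            """Random-walk Metropolis: returns a chain of posterior draws,
            giving full distributions rather than point estimates."""
            theta, lp = theta0, log_post(theta0)
            chain = np.empty(n_iter)
            for i in range(n_iter):
                prop = theta + step * rng.standard_normal()
                lp_prop = log_post(prop)
                if np.log(rng.random()) < lp_prop - lp:
                    theta, lp = prop, lp_prop
                chain[i] = theta
            return chain

        # Hypothetical calibration: internal dose = k * administered dose.
        doses = np.array([1.0, 2.0, 5.0, 10.0])
        y = np.array([0.9, 2.3, 4.6, 11.2])

        def log_post(log_k):
            resid = y - np.exp(log_k) * doses
            return -0.5 * np.sum(resid**2) / 0.25 - 0.5 * log_k**2  # likelihood + N(0,1) prior

        chain = metropolis(log_post, 0.0, 0.2, 5000, np.random.default_rng(0))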

  2. An Operant Analysis of Joint Attention Skills

    ERIC Educational Resources Information Center

    Holth, Per

    2005-01-01

    Joint attention, a synchronizing of the attention of two or more persons, has been an increasing focus of research in cognitive developmental psychology. Research in this area has progressed mainly outside of behavior analysis, and behavior-analytic research and theory has tended to ignore the work on joint attention. It is argued here, on the one…

  3. Bayesian regression analysis of data with random effects covariates from nonlinear longitudinal measurements

    PubMed Central

    De la Cruz, Rolando; Meza, Cristian; Arribas-Gil, Ana; Carroll, Raymond J.

    2016-01-01

    Joint models for a wide class of response variables and longitudinal measurements consist of a mixed-effects model to fit longitudinal trajectories whose random effects enter as covariates in a generalized linear model for the primary response. They provide a useful way to assess association between these two kinds of data, which in clinical studies are often collected jointly on a series of individuals and may help in understanding, for instance, the mechanisms of recovery of a certain disease or the efficacy of a given therapy. When a nonlinear mixed-effects model is used to fit the longitudinal trajectories, the existing estimation strategies based on likelihood approximations have been shown to exhibit some computational efficiency problems (De la Cruz et al., 2011). In this article we consider a Bayesian estimation procedure for the joint model with a nonlinear mixed-effects model for the longitudinal data and a generalized linear model for the primary response. The proposed prior structure allows for the implementation of an MCMC sampler. Moreover, we consider that the errors in the longitudinal model may be correlated. We apply our method to the analysis of hormone levels measured at the early stages of pregnancy that can be used to predict normal versus abnormal pregnancy outcomes. We also conduct a simulation study to assess the importance of modelling correlated errors and quantify the consequences of model misspecification. PMID:27274601

  4. Bayesian robust analysis for genetic architecture of quantitative traits

    PubMed Central

    Yang, Runqing; Wang, Xin; Li, Jian; Deng, Hongwen

    2009-01-01

    Motivation: In most quantitative trait locus (QTL) mapping studies, phenotypes are assumed to follow normal distributions. Deviations from this assumption may affect the accuracy of QTL detection and lead to detection of spurious QTLs. To improve the robustness of QTL mapping methods, we replaced the normal distribution for residuals in multiple interacting QTL models with the normal/independent distributions that are a class of symmetric and long-tailed distributions and are able to accommodate residual outliers. Subsequently, we developed a Bayesian robust analysis strategy for dissecting genetic architecture of quantitative traits and for mapping genome-wide interacting QTLs in line crosses. Results: Through computer simulations, we showed that our strategy had a similar power for QTL detection compared with traditional methods assuming normal-distributed traits, but had a substantially increased power for non-normal phenotypes. When this strategy was applied to a group of traits associated with physical/chemical characteristics and quality in rice, more main and epistatic QTLs were detected than traditional Bayesian model analyses under the normal assumption. Contact: runqingyang@sjtu.edu.cn; dengh@umkc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18974168

  5. Bayesian Models for fMRI Data Analysis

    PubMed Central

    Zhang, Linlin; Guindani, Michele; Vannucci, Marina

    2015-01-01

    Functional magnetic resonance imaging (fMRI), a noninvasive neuroimaging method that provides an indirect measure of neuronal activity by detecting blood flow changes, has experienced an explosive growth in the past years. Statistical methods play a crucial role in understanding and analyzing fMRI data. Bayesian approaches, in particular, have shown great promise in applications. A remarkable feature of fully Bayesian approaches is that they allow a flexible modeling of spatial and temporal correlations in the data. This paper provides a review of the most relevant models developed in recent years. We divide methods according to the objective of the analysis. We start from spatio-temporal models for fMRI data that detect task-related activation patterns. We then address the very important problem of estimating brain connectivity. We also touch upon methods that focus on making predictions of an individual's brain activity or a clinical or behavioral response. We conclude with a discussion of recent integrative models that aim at combining fMRI data with other imaging modalities, such as EEG/MEG and DTI data, measured on the same subjects. We also briefly discuss the emerging field of imaging genetics. PMID:25750690

  6. A Bayesian Framework for Reliability Analysis of Spacecraft Deployments

    NASA Technical Reports Server (NTRS)

    Evans, John W.; Gallo, Luis; Kaminsky, Mark

    2012-01-01

    Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a two-stage sequential Bayesian framework for reliability estimation of spacecraft deployment was developed for this purpose. This process was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the Optical Telescope Element. Initially, detailed studies of NASA deployment history, "heritage information", were conducted, extending over 45 years of spacecraft launches. This information was then coupled to a non-informative prior and a binomial likelihood function to create a posterior distribution for deployments of various subsystems using Markov chain Monte Carlo sampling. Select distributions were then coupled to a subsequent analysis, using test data and anomaly occurrences on successive ground test deployments of scale model test articles of JWST hardware, to update the NASA heritage data. This allowed for a realistic prediction for the reliability of the complex Sunshield deployment, with credibility limits, within this two-stage Bayesian framework.
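
    Treating each deployment as a binomial trial gives the two-stage scheme a simple conjugate skeleton. The counts below are hypothetical placeholders for the heritage and ground-test data, and the MCMC machinery of the actual study is replaced by the closed-form beta-binomial update.

        from scipy import stats

        # Stage 1: heritage data combined with a Jeffreys Beta(0.5, 0.5) prior
        # (hypothetical: 120 successes in 123 comparable deployments).
        a, b = 0.5 + 120, 0.5 + (123 - 120)

        # Stage 2: update with ground-test deployments of scale-model articles
        # (hypothetical: 11 successes in 12 trials).
        a, b = a + 11, b + (12 - 11)

        posterior = stats.beta(a, b)
        print("reliability estimate:", posterior.mean())
        print("90% credibility interval:", posterior.interval(0.90))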

  7. Structural analysis of Aircraft fuselage splice joint

    NASA Astrophysics Data System (ADS)

    Udaya Prakash, R.; Kumar, G. Raj; Vijayanandh, R.; Senthil Kumar, M.; Ramganesh, T.

    2016-09-01

    In the aviation sector, composite materials and their application to each component are prime considerations: their high strength-to-weight ratio, design flexibility and corrosion resistance mean that composites are widely used in low-weight constructions and can be treated as a suitable alternative to metals. The objective of this paper is to estimate and compare the suitability of a composite skin joint in an aircraft fuselage with different joints by simulating the displacement, normal stress, von Mises stress and shear stress with the help of numerical solution methods. The reference Z-stringer component of this paper is modeled in CATIA, and numerical simulation is carried out in ANSYS for the splice joint present in the aircraft fuselage with three combinations of joints: riveted, bonded and hybrid. Stringers are used to avoid buckling of the fuselage skin; they are joined together by rivets and connected end to end by splice joints. Design and static analysis of three-dimensional models of the bonded, riveted and hybrid joints are carried out and the results are compared.

  8. A computational analysis of the neural bases of Bayesian inference.

    PubMed

    Kolossa, Antonio; Kopp, Bruno; Fingscheidt, Tim

    2015-02-01

    Empirical support for the Bayesian brain hypothesis, although of major theoretical importance for cognitive neuroscience, is surprisingly scarce. This hypothesis posits simply that neural activities code and compute Bayesian probabilities. Here, we introduce an urn-ball paradigm to relate event-related potentials (ERPs) such as the P300 wave to Bayesian inference. Bayesian model comparison is conducted to compare various models in terms of their ability to explain trial-by-trial variation in ERP responses at different points in time and over different regions of the scalp. Specifically, we are interested in dissociating specific ERP responses in terms of Bayesian updating and predictive surprise. Bayesian updating refers to changes in probability distributions given new observations, while predictive surprise equals the surprise about observations under current probability distributions. Components of the late positive complex (P3a, P3b, Slow Wave) provide dissociable measures of Bayesian updating and predictive surprise. Specifically, the updating of beliefs about hidden states yields the best fit for the anteriorly distributed P3a, whereas the updating of predictions of observations accounts best for the posteriorly distributed Slow Wave. In addition, parietally distributed P3b responses are best fit by predictive surprise. These results indicate that the three components of the late positive complex reflect distinct neural computations. As such they are consistent with the Bayesian brain hypothesis, but these neural computations seem to be subject to nonlinear probability weighting. We integrate these findings with the free-energy principle that instantiates the Bayesian brain hypothesis.
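
    The two quantities being dissociated have standard formal definitions, sketched here for a toy urn-ball setup (the paper's exact parameterization may differ): predictive surprise is the negative log prior-predictive probability of the observation, and Bayesian updating is the divergence between posterior and prior beliefs.

        import numpy as np

        prior = np.array([0.5, 0.5])              # belief over two hidden urns
        likelihood = np.array([[0.8, 0.2],        # urn 0: P(colour 0), P(colour 1)
                               [0.3, 0.7]])       # urn 1
        obs = 1                                   # observed ball colour

        posterior = prior * likelihood[:, obs]
        posterior /= posterior.sum()

        # Predictive surprise: -log of the prior predictive probability.
        predictive = prior @ likelihood
        surprise = -np.log(predictive[obs])

        # Bayesian updating: KL divergence from prior to posterior beliefs.
        kl_updating = np.sum(posterior * np.log(posterior / prior))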

  9. BASE-9: Bayesian Analysis for Stellar Evolution with nine variables

    NASA Astrophysics Data System (ADS)

    Robinson, Elliot; von Hippel, Ted; Stein, Nathan; Stenning, David; Wagner-Kaiser, Rachel; Si, Shijing; van Dyk, David

    2016-08-01

    The BASE-9 (Bayesian Analysis for Stellar Evolution with nine variables) software suite recovers star cluster and stellar parameters from photometry and is useful for analyzing single-age, single-metallicity star clusters, binaries, or single stars, and for simulating such systems. BASE-9 uses a Markov chain Monte Carlo (MCMC) technique along with brute force numerical integration to estimate the posterior probability distribution for the age, metallicity, helium abundance, distance modulus, line-of-sight absorption, and parameters of the initial-final mass relation (IFMR) for a cluster, and for the primary mass, secondary mass (if a binary), and cluster probability for every potential cluster member. The MCMC technique is used for the cluster quantities (the first six items listed above) and numerical integration is used for the stellar quantities (the last three items in the above list).

  10. Testing Hardy-Weinberg equilibrium: an objective Bayesian analysis.

    PubMed

    Consonni, Guido; Moreno, Elías; Venturini, Sergio

    2011-01-15

    We analyze the general (multiallelic) Hardy-Weinberg equilibrium problem from an objective Bayesian testing standpoint. We argue that for small or moderate sample sizes the answer is rather sensitive to the prior chosen, and this suggests carrying out a sensitivity analysis with respect to the prior. This goal is achieved through the identification of a class of priors specifically designed for this testing problem. In this paper, we consider the class of intrinsic priors under the full model, indexed by a tuning quantity, the training sample size. These priors are objective, satisfy Savage's continuity condition and have proved to behave extremely well for many statistical testing problems. We compute the posterior probability of the Hardy-Weinberg equilibrium model for the class of intrinsic priors, assess robustness over the range of plausible answers, as well as stability of the decision in favor of either hypothesis.

  11. Bayesian Library for the Analysis of Neutron Diffraction Data

    NASA Astrophysics Data System (ADS)

    Ratcliff, William; Lesniewski, Joseph; Quintana, Dylan

    During this talk, I will introduce the Bayesian Library for the Analysis of Neutron Diffraction Data. In this library we use the DREAM algorithm to effectively sample parameter space. This offers several advantages over traditional least-squares fitting approaches. It gives us more robust estimates of the fitting parameters, their errors, and their correlations. It is also more stable than least-squares methods and provides more confidence in finding a global minimum. I will discuss the algorithm and its application to several materials. I will show applications to both structural and magnetic diffraction patterns. I will present examples of fitting both powder and single crystal data. We would like to acknowledge support from the Department of Commerce and the NSF.

  12. A Bayesian Seismic Hazard Analysis for the city of Naples

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years, many studies have focused on the determination and definition of the seismic, volcanic and tsunamigenic hazard in the city of Naples. The reason is that the town of Naples and its neighboring area is one of the most densely populated places in Italy. In addition, the risk is increased by the type and condition of buildings and monuments in the city. It is therefore crucial to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present), and an active tectonic structure has been associated with each seismic event. Furthermore, a set of active faults around the study area, well known from geological investigations but not associated with any recorded earthquake, that could nevertheless shake the city has been taken into account in our study. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure to include the information of past earthquakes in the probabilistic seismic hazard analysis. This strategy allows us, on the one hand, to enlarge the information used in the evaluation of the hazard, from alternative models for the earthquake generation process to past shaking, and on the other hand to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times, e.g., 50, 10 and 5 years, on a dense grid that covers the municipality of Naples, considering bedrock soil

  13. An Exploratory Study Examining the Feasibility of Using Bayesian Networks to Predict Circuit Analysis Understanding

    ERIC Educational Resources Information Center

    Chung, Gregory K. W. K.; Dionne, Gary B.; Kaiser, William J.

    2006-01-01

    Our research question was whether we could develop a feasible technique, using Bayesian networks, to diagnose gaps in student knowledge. Thirty-four college-age participants completed tasks designed to measure conceptual knowledge, procedural knowledge, and problem-solving skills related to circuit analysis. A Bayesian network was used to model…

  14. Multiple SNP Set Analysis for Genome-Wide Association Studies Through Bayesian Latent Variable Selection.

    PubMed

    Lu, Zhao-Hua; Zhu, Hongtu; Knickmeyer, Rebecca C; Sullivan, Patrick F; Williams, Stephanie N; Zou, Fei

    2015-12-01

    The power of genome-wide association studies (GWAS) for mapping complex traits with single-SNP analysis (where SNP is single-nucleotide polymorphism) may be undermined by modest SNP effect sizes, unobserved causal SNPs, correlation among adjacent SNPs, and SNP-SNP interactions. Alternative approaches for testing the association between a single SNP set and individual phenotypes have been shown to be promising for improving the power of GWAS. We propose a Bayesian latent variable selection (BLVS) method to simultaneously model the joint association mapping between a large number of SNP sets and complex traits. Compared with single SNP set analysis, such joint association mapping not only accounts for the correlation among SNP sets but also is capable of detecting causal SNP sets that are marginally uncorrelated with traits. The spike-and-slab prior assigned to the effects of SNP sets can greatly reduce the dimension of effective SNP sets, while speeding up computation. An efficient Markov chain Monte Carlo algorithm is developed. Simulations demonstrate that BLVS outperforms several competing variable selection methods in some important scenarios. PMID:26515609
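
    The essence of the spike-and-slab prior can be shown for a single coefficient with known noise variance, where the posterior inclusion probability is available in closed form; BLVS itself operates on SNP sets with latent variables and MCMC, so this is only the one-dimensional core.

        import numpy as np
        from scipy import stats

        def inclusion_probability(x, y, sigma2, tau2, pi):
            """P(effect is nonzero | data) under the prior: beta = 0 with
            probability 1 - pi (spike), beta ~ N(0, tau2) otherwise (slab)."""
            n = len(y)
            cov_spike = sigma2 * np.eye(n)
            cov_slab = cov_spike + tau2 * np.outer(x, x)
            log_m0 = stats.multivariate_normal.logpdf(y, np.zeros(n), cov_spike)
            log_m1 = stats.multivariate_normal.logpdf(y, np.zeros(n), cov_slab)
            log_odds = np.log(pi) - np.log1p(-pi) + log_m1 - log_m0
            return 1.0 / (1.0 + np.exp(-log_odds))

        rng = np.random.default_rng(0)
        x = rng.standard_normal(100)
        y = 0.4 * x + rng.standard_normal(100)     # a true effect is present
        print(inclusion_probability(x, y, sigma2=1.0, tau2=0.25, pi=0.1))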

  15. Structural dynamic analysis of a ball joint

    NASA Astrophysics Data System (ADS)

    Hwang, Seok-Cheol; Lee, Kwon-Hee

    2012-11-01

    A ball joint is a rotating and swiveling element that is typically installed at the interface between two parts. In an automobile, the ball joint is the component that connects the control arms to the steering knuckle. The ball joint can also be installed in linkage systems for motion control applications. This paper describes the simulation strategy for a ball joint analysis, considering its manufacturing process, which can be divided into plugging and spinning. The response of interest is the stress distribution generated between the ball and the bearing. In this paper, the commercial code NX DAFUL, using an implicit integration method, is introduced to calculate the response. In addition, a gap analysis is performed to investigate the fit, focusing on the displacement response of the ball stud. An optimum design is also suggested through case studies.

  16. Joint Bayesian inference of risk variants and tissue-specific epigenomic enrichments across multiple complex human diseases

    PubMed Central

    Li, Yue; Kellis, Manolis

    2016-01-01

    Genome wide association studies (GWAS) provide a powerful approach for uncovering disease-associated variants in human, but fine-mapping the causal variants remains a challenge. This is partly remedied by prioritization of disease-associated variants that overlap GWAS-enriched epigenomic annotations. Here, we introduce a new Bayesian model RiVIERA (Risk Variant Inference using Epigenomic Reference Annotations) for inference of driver variants from summary statistics across multiple traits using hundreds of epigenomic annotations. In simulation, RiVIERA showed promising power in detecting causal variants and causal annotations, and the multi-trait joint inference further improved the detection power. We applied RiVIERA to model the existing GWAS summary statistics of 9 autoimmune diseases and Schizophrenia by jointly harnessing the potential causal enrichments among 848 tissue-specific epigenomic annotations from the ENCODE/Roadmap consortium covering 127 cell/tissue types and 8 major epigenomic marks. RiVIERA identified meaningful tissue-specific enrichments for enhancer regions defined by H3K4me1 and H3K27ac for Blood T-Cell specifically in the nine autoimmune diseases and Brain-specific enhancer activities exclusively in Schizophrenia. Moreover, the variants from the 95% credible sets exhibited high conservation and enrichment for GTEx whole-blood eQTLs located within transcription-factor-binding-sites and DNA-hypersensitive-sites. Furthermore, jointly modeling the nine immune traits by simultaneously inferring and exploiting the underlying epigenomic correlation between traits further improved the functional enrichments compared to single-trait models. PMID:27407109

  17. A Bayesian Analysis of Finite Mixtures in the LISREL Model.

    ERIC Educational Resources Information Center

    Zhu, Hong-Tu; Lee, Sik-Yum

    2001-01-01

    Proposes a Bayesian framework for estimating finite mixtures of the LISREL model. The model augments the observed data of the manifest variables with the latent variables and allocation variables and uses the Gibbs sampler to obtain the Bayesian solution. Discusses other associated statistical inferences. (SLD)

  18. UNSUPERVISED TRANSIENT LIGHT CURVE ANALYSIS VIA HIERARCHICAL BAYESIAN INFERENCE

    SciTech Connect

    Sanders, N. E.; Soderberg, A. M.; Betancourt, M.

    2015-02-10

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST.

  19. RECONSTRUCTING EXPOSURE SCENARIOS USING DOSE BIOMARKERS - AN APPLICATION OF BAYESIAN UNCERTAINTY ANALYSIS

    EPA Science Inventory

    We use Bayesian uncertainty analysis to explore how to estimate pollutant exposures from biomarker concentrations. The growing number of national databases with exposure data makes such an analysis possible. They contain datasets of pharmacokinetic biomarkers for many polluta...

  20. BEAST 2: a software platform for Bayesian evolutionary analysis.

    PubMed

    Bouckaert, Remco; Heled, Joseph; Kühnert, Denise; Vaughan, Tim; Wu, Chieh-Hsi; Xie, Dong; Suchard, Marc A; Rambaut, Andrew; Drummond, Alexei J

    2014-04-01

    We present a new open source, extensible and flexible software platform for Bayesian evolutionary analysis called BEAST 2. This software platform is a re-design of the popular BEAST 1 platform to correct structural deficiencies that became evident as the BEAST 1 software evolved. Key among those deficiencies was the lack of post-deployment extensibility. BEAST 2 now has a fully developed package management system that allows third party developers to write additional functionality that can be directly installed to the BEAST 2 analysis platform via a package manager without requiring a new software release of the platform. This package architecture is showcased with a number of recently published new models encompassing birth-death-sampling tree priors, phylodynamics and model averaging for substitution models and site partitioning. A second major improvement is the ability to read/write the entire state of the MCMC chain to/from disk allowing it to be easily shared between multiple instances of the BEAST software. This facilitates checkpointing and better support for multi-processor and high-end computing extensions. Finally, the functionality in new packages can be easily added to the user interface (BEAUti 2) by a simple XML template-based mechanism because BEAST 2 has been re-designed to provide greater integration between the analysis engine and the user interface so that, for example BEAST and BEAUti use exactly the same XML file format. PMID:24722319

  1. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both: A broad perspective on data analysis collection and evaluation issues. A narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  2. Bayesian oligogenic analysis of quantitative and qualitative traits in general pedigrees.

    PubMed

    Uimari, P; Sillanpää, M J

    2001-11-01

    A Bayesian method for multipoint oligogenic analysis of quantitative and qualitative traits is presented. This method can be applied to general pedigrees, which do not necessarily have to be "peelable" and can have large numbers of markers. The number of quantitative/qualitative trait loci (QTL), their map positions in the genome, and phenotypic effects (mode of inheritances) are all estimated simultaneously within the same framework. The summaries of the estimated parameters are based on the marginal posterior distributions that are obtained through Markov chain Monte Carlo (MCMC) methods. The method uses founder alleles together with segregation indicators in order to determine the genotypes of the trait loci of all individuals in the pedigree. To improve mixing properties of the sampler, we propose (1) joint sampling of map position and segregation indicators, (2) omitting data augmentation for untyped or uninformative markers (homozygous parent), and (3) updating several markers jointly within a single block. The performance of the method was tested with two replicate GAW10 data sets (considering two levels of available marker information). The results were concordant and similar to those presented earlier with other methods. These analyses clearly illustrate the utility and wide applicability of the method. PMID:11668579

  3. Bayesian methods for design and analysis of safety trials.

    PubMed

    Price, Karen L; Xia, H Amy; Lakshminarayanan, Mani; Madigan, David; Manner, David; Scott, John; Stamey, James D; Thompson, Laura

    2014-01-01

    Safety assessment is essential throughout medical product development. There has been increased awareness of the importance of safety trials recently, in part due to recent US Food and Drug Administration guidance related to thorough assessment of cardiovascular risk in the treatment of type 2 diabetes. Bayesian methods provide great promise for improving the conduct of safety trials. In this paper, the safety subteam of the Drug Information Association Bayesian Scientific Working Group evaluates challenges associated with current methods for designing and analyzing safety trials and provides an overview of several suggested Bayesian opportunities that may increase efficiency of safety trials along with relevant case examples.

  4. Analysis of minor fractures associated with joints and faulted joints

    NASA Astrophysics Data System (ADS)

    Cruikshank, Kenneth M.; Zhao, Guozhu; Johnson, Arvid M.

    In this paper, we use fracture mechanics to interpret conditions responsible for secondary cracks that adorn joints and faulted joints in the Entrada Sandstone in Arches National Park, U.S.A. Because the joints in most places accommodated shearing offsets of a few mm to perhaps 1 dm, and thus became faulted joints, some of the minor cracks are due to faulting. However, in a few places where the shearing was zero, one can examine minor cracks due solely to interaction of joint segments at the time they formed. We recognize several types of minor cracks associated with subsequent faulting of the joints. One is the kink, a crack that occurs at the termination of a straight joint and whose trend is abruptly different from that of the joint. Kinks are common and should be studied because they contain a great deal of information about conditions during fracturing. The sense of kinking indicates the sense of shear during faulting: a kink that turns clockwise with respect to the direction of the main joint is a result of right-lateral shear, and a kink that turns counterclockwise is a result of left-lateral shear. Furthermore, the kink angle is related to the ratio of the shear stress responsible for the kinking to the normal stress responsible for the opening of the joint. The amount of opening of a joint at the time it faulted or even at the time the joint itself formed can be estimated by measuring the kink angle and the amount of strike-slip at some point along the faulted joint. Other fractures that form near terminations of pre-existing joints in response to shearing along the joint are horsetail fractures. Similar short fractures can occur anywhere along the length of the joints. The primary value in recognizing these fractures is that they indicate the sense of faulting accommodated by the host fracture and the direction of maximum tension. Even where there has been insignificant regional shearing in the Garden Area, the joints can have ornate terminations. Perhaps

  5. A procedure for seiche analysis with Bayesian information criterion

    NASA Astrophysics Data System (ADS)

    Aichi, Masaatsu

    2016-04-01

    Seiche is a standing wave in an enclosed or semi-enclosed water body. Its amplitude changes irregularly in time due to weather conditions and other factors, so extracting the seiche signal is not easy with the usual methods for time-series analysis, such as the fast Fourier transform (FFT). In this study, a new method for time-series analysis with a Bayesian information criterion was developed to decompose seiche, tide, long-term trend and residual components from the time-series data of tide stations. The method is based on maximum marginal likelihood estimation of the tide amplitudes, seiche amplitude, and trend components. The seiche amplitude and trend components were assumed to change gradually, with second derivatives in time close to zero; these assumptions were incorporated as prior distributions. The variances of the prior distributions were estimated by minimizing the Akaike-Bayes information criterion (ABIC). The frequency of the seiche was determined by Newton's method, with an initial guess from the FFT. The accuracy of the proposed method was checked by analyzing synthetic time-series data composed of known components; the reproducibility of the original components was quite good. The proposed method was also applied to actual time-series data of the sea level observed by a tide station and the strain of coastal rock masses observed by a fiber Bragg grating sensor in Aburatsubo Bay, Japan. The seiche in the bay and the response of the rock masses to it were successfully extracted.
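
    The smoothness assumption, second derivatives in time close to zero, corresponds to a second-difference penalty on the trend component. The sketch below fixes the penalty weight by hand instead of choosing it by minimizing ABIC as the paper does, and uses synthetic data.

        import numpy as np

        def smooth_trend(y, lam):
            """Minimize ||y - x||^2 + lam * ||D2 x||^2, i.e. extract a slowly
            varying component whose second difference is pushed toward zero."""
            n = len(y)
            D2 = np.diff(np.eye(n), n=2, axis=0)    # second-difference operator
            return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

        t = np.linspace(0.0, 10.0, 500)
        rng = np.random.default_rng(0)
        y = 0.3 * t + np.sin(2 * np.pi * 1.7 * t) + 0.1 * rng.standard_normal(500)

        trend = smooth_trend(y, lam=1e4)
        residual = y - trend                        # retains the oscillatory part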

  6. Analysis of human mini-exome sequencing data from Genetic Analysis Workshop 17 using a Bayesian hierarchical mixture model

    PubMed Central

    2011-01-01

    Next-generation sequencing technologies are rapidly changing the field of genetic epidemiology and enabling exploration of the full allele frequency spectrum underlying complex diseases. Although sequencing technologies have shifted our focus toward rare genetic variants, statistical methods traditionally used in genetic association studies are inadequate for estimating effects of low minor allele frequency variants. For our study we use the Genetic Analysis Workshop 17 data from 697 unrelated individuals (genotypes for 24,487 autosomal variants from 3,205 genes). We apply a Bayesian hierarchical mixture model to identify genes associated with a simulated binary phenotype using a transformed genotype design matrix weighted by allele frequencies. A Metropolis–Hastings algorithm is used to jointly sample each indicator variable and additive genetic effect pair from its conditional posterior distribution, and the remaining parameters are sampled by Gibbs sampling. This method identified 58 genes with a posterior probability greater than 0.8 of being associated with the phenotype. One of these 58 genes, PIK3C2B, was correctly identified as being associated with affected status based on the simulation process. This project demonstrates the utility of Bayesian hierarchical mixture models using a transformed genotype matrix to detect genes containing rare and common variants associated with a binary phenotype. PMID:22373180
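
    The joint indicator/effect update is the interesting moving part. The following is a minimal sketch of that flavor of sampler (our simplification, with a Gaussian rather than the paper's binary-phenotype likelihood, and made-up prior settings): each (indicator, effect) pair is proposed jointly, with the effect drawn from its slab prior so the acceptance ratio reduces to a likelihood ratio times the prior odds of inclusion, and the residual variance is Gibbs-sampled.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def spike_slab_mcmc(y, X, iters=2000, pi0=0.05, slab=1.0, a0=1.0, b0=1.0):
        """Joint MH update of each (indicator, effect) pair plus a Gibbs update
        of the residual variance. Simplified: Gaussian likelihood, and active
        effects are refreshed only through off/on flips."""
        n, p = X.shape
        gamma = np.zeros(p, dtype=int)
        beta = np.zeros(p)
        sigma2 = 1.0
        resid = y.astype(float).copy()   # y - X @ (gamma * beta); all-zero model
        incl = np.zeros(p)
        for it in range(iters):
            for j in range(p):
                if gamma[j] == 0:
                    # Propose switching on; drawing the effect from the slab prior
                    # makes the slab density cancel between prior and proposal.
                    b_new = rng.normal(0.0, np.sqrt(slab))
                    r_new = resid - X[:, j] * b_new
                    log_r = (-0.5 * (r_new @ r_new - resid @ resid) / sigma2
                             + np.log(pi0 / (1.0 - pi0)))
                    if np.log(rng.uniform()) < log_r:
                        gamma[j], beta[j], resid = 1, b_new, r_new
                else:
                    # Propose switching off.
                    r_new = resid + X[:, j] * beta[j]
                    log_r = (-0.5 * (r_new @ r_new - resid @ resid) / sigma2
                             + np.log((1.0 - pi0) / pi0))
                    if np.log(rng.uniform()) < log_r:
                        gamma[j], beta[j], resid = 0, 0.0, r_new
            # Gibbs step: conjugate inverse-gamma update of the noise variance.
            sigma2 = 1.0 / rng.gamma(a0 + n / 2.0, 1.0 / (b0 + 0.5 * resid @ resid))
            if it >= iters // 2:
                incl += gamma
        return incl / (iters - iters // 2)   # posterior inclusion probabilities
    ```

    Posterior inclusion probabilities from such a sampler play the role of the per-gene posterior probabilities reported in the abstract.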

  7. Bayesian semiparametric nonlinear mixed-effects joint models for data with skewness, missing responses, and measurement errors in covariates.

    PubMed

    Huang, Yangxin; Dagne, Getachew

    2012-09-01

    It is common practice to analyze complex longitudinal data using semiparametric nonlinear mixed-effects (SNLME) models with a normal distribution. The normality assumption for model errors, however, may unrealistically obscure important features of subject variation. To partially explain between- and within-subject variation, covariates are usually introduced into such models, but some covariates may be measured with substantial error. Moreover, responses may be missing, and the missingness may be nonignorable. Inferential procedures become dramatically more complicated when skewness, missing values, and measurement error are all present in the data. In the literature, there has been considerable interest in accommodating skewness, incompleteness or covariate measurement error individually in such models, but relatively little work addresses all three features simultaneously. In this article, our objective is to address the simultaneous impact of skewness, missingness, and covariate measurement error by jointly modeling the response and covariate processes within a flexible Bayesian SNLME model. The method is illustrated using a real AIDS data set, comparing potential models under various scenarios and different distributional specifications.

  8. Using Bayesian analysis in repeated preclinical in vivo studies for a more effective use of animals.

    PubMed

    Walley, Rosalind; Sherington, John; Rastrick, Joe; Detrait, Eric; Hanon, Etienne; Watt, Gillian

    2016-05-01

    Whilst innovative Bayesian approaches are increasingly used in clinical studies, in the preclinical area Bayesian methods appear to be rarely used in the reporting of pharmacology data. This is particularly surprising in the context of regularly repeated in vivo studies where there is a considerable amount of data from historical control groups, which has potential value. This paper describes our experience with introducing Bayesian analysis for such studies using a Bayesian meta-analytic predictive approach. This leads naturally either to an informative prior for a control group as part of a full Bayesian analysis of the next study or using a predictive distribution to replace a control group entirely. We use quality control charts to illustrate study-to-study variation to the scientists and describe informative priors in terms of their approximate effective numbers of animals. We describe two case studies of animal models: the lipopolysaccharide-induced cytokine release model used in inflammation and the novel object recognition model used to screen cognitive enhancers, both of which show the advantage of a Bayesian approach over the standard frequentist analysis. We conclude that using Bayesian methods in stable repeated in vivo studies can result in a more effective use of animals, either by reducing the total number of animals used or by increasing the precision of key treatment differences. This will lead to clearer results and supports the "3Rs initiative" to Refine, Reduce and Replace animals in research. Copyright © 2016 John Wiley & Sons, Ltd.
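
    As a rough illustration of the meta-analytic predictive idea (an empirical-Bayes stand-in; the paper's approach is fully Bayesian, and every name below is ours), one can estimate the between-study variance from historical control groups, form the predictive distribution for the next control mean, and express its informativeness as an approximate effective number of animals:

    ```python
    import numpy as np

    def map_prior_from_history(means, sems, sigma2_animal):
        """Approximate meta-analytic predictive (MAP) prior for the next
        control group: DerSimonian-Laird estimate of the between-study
        variance tau^2, random-effects pooled mean, and predictive variance.
        'Effective animals' = per-animal variance / prior variance."""
        means, sems = np.asarray(means, float), np.asarray(sems, float)
        w = 1.0 / sems ** 2
        ybar = np.sum(w * means) / np.sum(w)
        Q = np.sum(w * (means - ybar) ** 2)
        k = len(means)
        tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
        # Random-effects pooled mean and its variance.
        ws = 1.0 / (sems ** 2 + tau2)
        mu = np.sum(ws * means) / np.sum(ws)
        var_mu = 1.0 / np.sum(ws)
        # Predictive (MAP) variance for a new study's control-group mean.
        prior_var = tau2 + var_mu
        ess = sigma2_animal / prior_var   # approximate effective number of animals
        return mu, prior_var, ess
    ```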

  9. Interacting Agricultural Pests and Their Effect on Crop Yield: Application of a Bayesian Decision Theory Approach to the Joint Management of Bromus tectorum and Cephus cinctus

    PubMed Central

    Keren, Ilai N.; Menalled, Fabian D.; Weaver, David K.; Robison-Cox, James F.

    2015-01-01

    Worldwide, the landscape homogeneity of extensive monocultures that characterizes conventional agriculture has resulted in the development of specialized and interacting multitrophic pest complexes. While integrated pest management emphasizes the need to consider the ecological context where multiple species coexist, management recommendations are often based on single-species tactics. This approach may not provide satisfactory solutions when confronted with the complex interactions occurring between organisms at the same or different trophic levels. Replacing the single-species management model with more sophisticated, multi-species programs requires an understanding of the direct and indirect interactions occurring between the crop and all categories of pests. We evaluated a modeling framework for multi-pest management decisions that takes into account direct and indirect interactions among species belonging to different trophic levels. We adopted a Bayesian decision theory approach in combination with path analysis to evaluate interactions between Bromus tectorum (downy brome, cheatgrass) and Cephus cinctus (wheat stem sawfly) in wheat (Triticum aestivum) systems. We assessed their joint responses to weed management tactics, seeding rates, and cultivar tolerance to insect stem boring or competition. Our results indicated that C. cinctus oviposition behavior varied as a function of B. tectorum pressure. Crop responses were more readily explained by the joint effects of management tactics on both categories of pests and their interactions than by the direct impact of any particular management scheme on yield. Accordingly, a C. cinctus tolerant variety should be planted at a low seeding rate under high insect pressure. However, as B. tectorum levels increase, the C. cinctus tolerant variety should be replaced by a competitive and drought tolerant cultivar at high seeding rates despite C. cinctus infestation. This study exemplifies the necessity of
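
    Stripped of the path-analysis machinery, the decision-theoretic core is averaging a utility (yield) model over posterior parameter draws and choosing the action with the highest expectation. The sketch below is entirely illustrative: the action set, posterior draws and utility function are placeholders, not the fitted model from the study.

    ```python
    import numpy as np

    def best_action(actions, posterior_draws, utility):
        """Bayesian decision rule: pick the management action with the highest
        expected utility, averaging the utility over posterior draws."""
        expected = {a: np.mean([utility(a, th) for th in posterior_draws])
                    for a in actions}
        return max(expected, key=expected.get), expected

    # Toy usage with hypothetical numbers: actions are seeding rates, draws are
    # sampled pest-pressure parameters, and u() is a stand-in yield model.
    rng = np.random.default_rng(0)
    draws = rng.normal(1.0, 0.2, size=500)
    u = lambda a, th: a * 10.0 - th * a ** 1.5
    print(best_action([1.0, 2.0, 3.0], draws, u)[0])
    ```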

  10. Bayesian analysis of multimodal data and brain imaging

    NASA Astrophysics Data System (ADS)

    Assadi, Amir H.; Eghbalnia, Hamid; Backonja, Miroslav; Wakai, Ronald T.; Rutecki, Paul; Haughton, Victor

    2000-06-01

    It is often the case that information about a process can be obtained using a variety of methods, each employed because of specific advantages over the competing alternatives. An example in medical neuroimaging is the choice between fMRI and MEG, where fMRI provides high spatial resolution in comparison with the superior temporal resolution of MEG. Combining data from different modes provides the opportunity to infer results that may not be possible by means of any one mode alone. We discuss a Bayesian and learning-theoretic framework for enhanced feature extraction that is particularly suited to multi-modal investigations of massive data sets from multiple experiments. In this Bayesian approach, acquired knowledge regarding various aspects of the process is directly incorporated into the formulation. This information can come from a variety of sources; in our case, it represents statistical information obtained from other modes of data collection. The information is used to train a learning machine to estimate a probability distribution, which is used in turn by a second machine as a prior, in order to produce a more refined estimate of the distribution of events. The computational demand of the algorithm is handled by a distributed parallel implementation on a cluster of workstations that can be scaled to address real-time needs if required. We provide a simulation of these methods on a set of synthetically generated MEG and EEG data and show how spatial and temporal resolutions improve by using prior distributions. Applied to fMRI signals, the method permits construction of the probability distribution of the nonlinear hemodynamics of the human brain (real data). These computational results are in agreement with biologically based measurements of other labs, as reported to us by researchers from the UK. We also provide preliminary analysis involving multi-electrode cortical recording that accompanies

  11. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    SciTech Connect

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it (an improved and generalized version of Bayesian Blocks) that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
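
    The dynamic program at the heart of the method is compact enough to sketch. Below is the event-data version, following the published formulation (block fitness N(log N - log T) for N events in a block of duration T, with a constant prior penalty per block); it assumes sorted, distinct event times. A production implementation is available in astropy.

    ```python
    import numpy as np

    def bayesian_blocks_events(t, ncp_prior=4.0):
        """O(N^2) dynamic program for the optimal segmentation of event times,
        sketching the published Bayesian Blocks formulation for event data."""
        t = np.asarray(t, dtype=float)
        n = len(t)
        # Cell edges: data range plus midpoints between consecutive events.
        edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
        block_length = t[-1] - edges        # duration from each edge to the end
        best = np.zeros(n)
        last = np.zeros(n, dtype=int)
        for k in range(n):
            # Duration and event count of every candidate final block ending at k.
            width = block_length[: k + 1] - block_length[k + 1]
            count = np.arange(k + 1, 0, -1, dtype=float)
            fit = count * (np.log(count) - np.log(width)) - ncp_prior
            fit[1:] += best[:k]             # add best fitness of the prefix
            last[k] = int(np.argmax(fit))
            best[k] = fit[last[k]]
        # Backtrack from the end to recover the change points.
        cps, ind = [], n
        while ind > 0:
            cps.append(ind)
            ind = last[ind - 1]
        cps.append(0)
        return edges[np.array(cps[::-1])]   # optimal block edges
    ```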

  12. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it (an improved and generalized version of Bayesian Blocks [Scargle 1998]) that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. 2008] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.

  13. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives.

    PubMed

    Green, Adam W; Bailey, Larissa L

    2015-01-01

    Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools) using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies.
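
    The quasi-extinction computation itself reduces to simulating occupancy dynamics forward and counting the runs that fall below a threshold. The following sketch uses generic, illustrative patch-occupancy dynamics and parameter names of our own; it is not the fitted wood frog model.

    ```python
    import numpy as np

    def quasi_extinction_prob(n_pools, p_persist, p_colonize, years=100,
                              threshold=2, n_sims=5000, seed=1):
        """Monte Carlo quasi-extinction probability for a stochastic
        patch-occupancy metapopulation: each occupied pool persists with
        probability p_persist; each empty pool is colonized with probability
        scaled by the current fraction of occupied pools."""
        rng = np.random.default_rng(seed)
        extinct = 0
        for _ in range(n_sims):
            occ = np.ones(n_pools, dtype=bool)   # start fully occupied
            for _ in range(years):
                frac = occ.mean()
                stay = rng.random(n_pools) < p_persist
                gain = rng.random(n_pools) < p_colonize * frac
                occ = (occ & stay) | (~occ & gain)
                if occ.sum() < threshold:        # quasi-extinction event
                    extinct += 1
                    break
        return extinct / n_sims

    # e.g. compare quasi_extinction_prob(50, 0.85, 0.3)
    #      with    quasi_extinction_prob(20, 0.85, 0.3)
    ```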

  14. A Bayesian Model for the Analysis of Transgenerational Epigenetic Variation

    PubMed Central

    Varona, Luis; Munilla, Sebastián; Mouresan, Elena Flavia; González-Rodríguez, Aldemar; Moreno, Carlos; Altarriba, Juan

    2015-01-01

    Epigenetics has become one of the major areas of biological research. However, the degree of phenotypic variability that is explained by epigenetic processes still remains unclear. From a quantitative genetics perspective, the estimation of variance components is achieved by means of the information provided by the resemblance between relatives. In a previous study, this resemblance was described as a function of the epigenetic variance component and a reset coefficient that indicates the rate of dissipation of epigenetic marks across generations. Given these assumptions, we propose a Bayesian mixed model methodology that allows the estimation of epigenetic variance from a genealogical and phenotypic database. The methodology is based on the development of a T matrix of epigenetic relationships that depends on the reset coefficient. In addition, we present a simple procedure for the calculation of the inverse of this matrix (T−1) and a Gibbs sampler algorithm that obtains posterior estimates of all the unknowns in the model. The new procedure was used with two simulated data sets and with a beef cattle database. In the simulated populations, the results of the analysis provided marginal posterior distributions that included the population parameters in the regions of highest posterior density. In the case of the beef cattle dataset, the posterior estimate of transgenerational epigenetic variability was very low and a model comparison test indicated that a model that did not include it was the most plausible. PMID:25617408

  15. A Bayesian model for the analysis of transgenerational epigenetic variation.

    PubMed

    Varona, Luis; Munilla, Sebastián; Mouresan, Elena Flavia; González-Rodríguez, Aldemar; Moreno, Carlos; Altarriba, Juan

    2015-01-23

    Epigenetics has become one of the major areas of biological research. However, the degree of phenotypic variability that is explained by epigenetic processes still remains unclear. From a quantitative genetics perspective, the estimation of variance components is achieved by means of the information provided by the resemblance between relatives. In a previous study, this resemblance was described as a function of the epigenetic variance component and a reset coefficient that indicates the rate of dissipation of epigenetic marks across generations. Given these assumptions, we propose a Bayesian mixed model methodology that allows the estimation of epigenetic variance from a genealogical and phenotypic database. The methodology is based on the development of a T matrix of epigenetic relationships that depends on the reset coefficient. In addition, we present a simple procedure for the calculation of the inverse of this matrix (T−1) and a Gibbs sampler algorithm that obtains posterior estimates of all the unknowns in the model. The new procedure was used with two simulated data sets and with a beef cattle database. In the simulated populations, the results of the analysis provided marginal posterior distributions that included the population parameters in the regions of highest posterior density. In the case of the beef cattle dataset, the posterior estimate of transgenerational epigenetic variability was very low and a model comparison test indicated that a model that did not include it was the most plausible.

  16. Light curve demography via Bayesian functional data analysis

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas; Budavari, Tamas; Hendry, Martin A.; Kowal, Daniel; Ruppert, David

    2015-08-01

    Synoptic time-domain surveys provide astronomers, not simply more data, but a different kind of data: large ensembles of multivariate, irregularly and asynchronously sampled light curves. We describe a statistical framework for light curve demography—optimal accumulation and extraction of information, not only along individual light curves as conventional methods do, but also across large ensembles of related light curves. We build the framework using tools from functional data analysis (FDA), a rapidly growing area of statistics that addresses inference from datasets that sample ensembles of related functions. Our Bayesian FDA framework builds hierarchical models that describe light curve ensembles using multiple levels of randomness: upper levels describe the source population, and lower levels describe the observation process, including measurement errors and selection effects. Schematically, a particular object's light curve is modeled as the sum of a parameterized template component (modeling population-averaged behavior) and a peculiar component (modeling variability across the population), subsequently subjected to an observation model. A functional shrinkage adjustment to individual light curves emerges—an adaptive, functional generalization of the kind of adjustments made for Eddington or Malmquist bias in single-epoch photometric surveys. We are applying the framework to a variety of problems in synoptic time-domain survey astronomy, including optimal detection of weak sources in multi-epoch data, and improved estimation of Cepheid variable star luminosities from detailed demographic modeling of ensembles of Cepheid light curves.

  17. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives

    PubMed Central

    Green, Adam W.; Bailey, Larissa L.

    2015-01-01

    Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools) using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies. PMID:26658734

  18. BAYESIAN ANGULAR POWER SPECTRUM ANALYSIS OF INTERFEROMETRIC DATA

    SciTech Connect

    Sutter, P. M.; Wandelt, Benjamin D.; Malu, Siddarth S.

    2012-09-15

    We present a Bayesian angular power spectrum and signal map inference engine which can be adapted to interferometric observations of anisotropies in the cosmic microwave background (CMB), 21 cm emission line mapping of galactic brightness fluctuations, or 21 cm absorption line mapping of neutral hydrogen in the dark ages. The method uses Gibbs sampling to generate a sampled representation of the angular power spectrum posterior and the posterior of signal maps given a set of measured visibilities in the uv-plane. We use a mock interferometric CMB observation to demonstrate the validity of this method in the flat-sky approximation when adapted to take into account arbitrary coverage of the uv-plane, mode-mode correlations due to observations on a finite patch, and heteroscedastic visibility errors. The computational requirements scale as O(n_p log n_p), where n_p measures the ratio of the size of the detector array to the inter-detector spacing, meaning that Gibbs sampling is a promising technique for meeting the data analysis requirements of future cosmology missions.

  19. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives.

    PubMed

    Green, Adam W; Bailey, Larissa L

    2015-01-01

    Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools) using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies. PMID:26658734

  20. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    NASA Astrophysics Data System (ADS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-02-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it—an improved and generalized version of Bayesian Blocks—that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  1. Bayesian analysis of inflation: Parameter estimation for single field models

    SciTech Connect

    Mortonson, Michael J.; Peiris, Hiranya V.; Easther, Richard

    2011-02-15

    Future astrophysical data sets promise to strengthen constraints on models of inflation, and extracting these constraints requires methods and tools commensurate with the quality of the data. In this paper we describe ModeCode, a new, publicly available code that computes the primordial scalar and tensor power spectra for single-field inflationary models. ModeCode solves the inflationary mode equations numerically, avoiding the slow roll approximation. It is interfaced with CAMB and CosmoMC to compute cosmic microwave background angular power spectra and perform likelihood analysis and parameter estimation. ModeCode is easily extendable to additional models of inflation, and future updates will include Bayesian model comparison. Errors from ModeCode contribute negligibly to the error budget for analyses of data from Planck or other next generation experiments. We constrain representative single-field models (φ^n with n = 2/3, 1, 2, and 4; natural inflation; and 'hilltop' inflation) using current data, and provide forecasts for Planck. From current data, we obtain weak but nontrivial limits on the post-inflationary physics, which is a significant source of uncertainty in the predictions of inflationary models, while we find that Planck will dramatically improve these constraints. In particular, Planck will link the inflationary dynamics with the post-inflationary growth of the horizon, and thus begin to probe the 'primordial dark ages' between TeV and grand unified theory scale energies.

  2. Cepheid light curve demography via Bayesian functional data analysis

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas J.; Hendry, Martin; Kowal, Daniel; Ruppert, David

    2016-01-01

    Synoptic time-domain surveys provide astronomers, not simply more data, but a different kind of data: large ensembles of multivariate, irregularly and asynchronously sampled light curves. We describe a statistical framework for light curve demography—optimal accumulation and extraction of information, not only along individual light curves as conventional methods do, but also across large ensembles of related light curves. We build the framework using tools from functional data analysis (FDA), a rapidly growing area of statistics that addresses inference from datasets that sample ensembles of related functions. Our Bayesian FDA framework builds hierarchical models that describe light curve ensembles using multiple levels of randomness: upper levels describe the source population, and lower levels describe the observation process, including measurement errors and selection effects. Roughly speaking, a particular object's light curve is modeled as the sum of a parameterized template component (modeling population-averaged behavior) and a peculiar component (modeling variability across the population), subsequently subjected to an observation model. A functional shrinkage adjustment to individual light curves emerges—an adaptive, functional generalization of the kind of adjustments made for Eddington or Malmquist bias in single-epoch photometric surveys. We describe ongoing work applying the framework to improved estimation of Cepheid variable star luminosities via FDA-based refinement and generalization of the Cepheid period-luminosity relation.

  3. Dynamic sensor action selection with Bayesian decision analysis

    NASA Astrophysics Data System (ADS)

    Kristensen, Steen; Hansen, Volker; Kondak, Konstantin

    1998-10-01

    The aim of this work is to create a framework for the dynamic planning of sensor actions for an autonomous mobile robot. The framework uses Bayesian decision analysis, i.e., a decision-theoretic method, to evaluate possible sensor actions and select the most appropriate ones given the available sensors and what is currently known about the state of the world. Since sensing changes the knowledge of the system, and since the current state of the robot (task, position, etc.) determines what knowledge is relevant, the evaluation and selection of sensing actions is an ongoing process that effectively determines the behavior of the robot. The framework has been implemented on a real mobile robot and has been shown to control the sensor actions of the system in real time. In current work we are investigating methods to reduce or automatically generate the model information needed by the decision-theoretic method to select the appropriate sensor actions.
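
    The evaluation step described here amounts to a myopic value-of-information computation: for each sensor, average the best achievable posterior expected utility over the observations that sensor might return, and subtract the sensing cost. A generic sketch follows (not the authors' implementation; all structures are placeholders):

    ```python
    import numpy as np

    def pick_sensor(prior, likelihoods, utility, costs):
        """Myopic Bayesian decision analysis for sensing-action selection.
        prior[x] = P(state x); likelihoods[s][o, x] = P(obs o | state x) for
        sensor s; utility[x, a] = payoff of action a in state x."""
        scores = {}
        for s, L in likelihoods.items():
            p_obs = L @ prior                           # P(o) under the prior
            score = 0.0
            for o, po in enumerate(p_obs):
                if po == 0:
                    continue
                post = L[o] * prior / po                # P(x | o) by Bayes' rule
                score += po * max(post @ utility[:, a]  # best action given o
                                  for a in range(utility.shape[1]))
            scores[s] = score - costs[s]
        return max(scores, key=scores.get), scores
    ```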

  4. Nonparametric survival analysis using Bayesian Additive Regression Trees (BART).

    PubMed

    Sparapani, Rodney A; Logan, Brent R; McCulloch, Robert E; Laud, Purushottam W

    2016-07-20

    Bayesian additive regression trees (BART) provide a framework for flexible nonparametric modeling of the relationships of covariates to outcomes. Recently, BART models have been shown to provide excellent predictive performance for both continuous and binary outcomes, often exceeding that of competing methods, and software is readily available for such outcomes. In this article, we introduce modeling that extends the usefulness of BART in medical applications by addressing needs arising in survival analysis. Simulation studies of one-sample and two-sample scenarios, in comparison with long-standing traditional methods, establish face validity of the new approach. We then demonstrate the model's ability to accommodate data from complex regression models with a simulation study of a nonproportional hazards scenario with crossing survival functions, and with survival function estimation in a scenario where hazards are multiplicatively modified by a highly nonlinear function of the covariates. Using data from a recently published study of patients undergoing hematopoietic stem cell transplantation, we illustrate the use and some advantages of the proposed method in medical investigations. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26854022
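
    A key ingredient in casting survival analysis as a binary-outcome problem for BART is the discrete-time person-period expansion. The sketch below shows that expansion step alone, with hypothetical variable names; the fitted model itself (BART on the expanded data) is omitted.

    ```python
    import numpy as np

    def person_period(times, events, covariates):
        """Expand (time, event, x) survival data onto the grid of distinct
        event times: subject i contributes one row per grid time t <= t_i,
        with y = 1 only at its own event time. The binary outcome can then
        be fitted with BART (or any classifier) as a discrete-time hazard."""
        times = np.asarray(times, float)
        events = np.asarray(events, int)
        grid = np.unique(times[events == 1])
        rows, ys = [], []
        for t_i, d_i, x_i in zip(times, events, covariates):
            for t in grid[grid <= t_i]:
                rows.append(np.concatenate([[t], np.atleast_1d(x_i)]))
                ys.append(int(d_i == 1 and t == t_i))
        return np.array(rows), np.array(ys)
    ```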

  5. Nuclear stockpile stewardship and Bayesian image analysis (DARHT and the BIE)

    SciTech Connect

    Carroll, James L

    2011-01-11

    Since the end of nuclear testing, assessment of the reliability of the nation's nuclear weapon stockpile has relied on sub-critical hydrodynamic testing. These tests involve extremely demanding radiography. We discuss the challenges involved and the solutions provided by DARHT (the world's premier hydrodynamic testing facility) and the BIE, or Bayesian Inference Engine (a powerful radiography analysis software tool), and the application of Bayesian image analysis techniques to this important and difficult problem.

  6. Toward a Behavioral Analysis of Joint Attention

    ERIC Educational Resources Information Center

    Dube, William V.; MacDonald, Rebecca P. F.; Mansfield, Renee C.; Holcomb, William L.; Ahearn, William H.

    2004-01-01

    Joint attention (JA) initiation is defined in cognitive-developmental psychology as a child's actions that verify or produce simultaneous attending by that child and an adult to some object or event in the environment so that both may experience the object or event together. This paper presents a contingency analysis of gaze shift in JA…

  7. Bayesian belief network analysis applied to determine the progression of temporomandibular disorders using MRI

    PubMed Central

    2015-01-01

    Objectives: This study investigated the applicability of a Bayesian belief network (BBN) to MR images to diagnose temporomandibular disorders (TMDs). Our aim was to determine the progression of TMDs, focusing on how each finding affects the other. Methods: We selected 1.5-T MRI findings (33 variables) and diagnoses (bone changes and disc displacement) of patients with TMD from 2007 to 2008. There were a total of 295 cases with 590 sides of temporomandibular joints (TMJs). The data were modified according to the research diagnostic criteria of TMD. We compared the accuracy of the BBN using 11 algorithms (necessary path condition, path condition, greedy search-and-score with Bayesian information criterion, Chow–Liu tree, Rebane–Pearl poly tree, tree augmented naïve Bayes model, maximum log likelihood, Akaike information criterion, minimum description length, K2 and C4.5), a multiple regression analysis and an artificial neural network using resubstitution validation and 10-fold cross-validation. Results: There were 191 TMJs (32.4%) with bone changes and 340 (57.6%) with articular disc displacement. The BBN path condition algorithm using resubstitution validation and 10-fold cross-validation was >99% accurate. However, the main advantage of a BBN is that it can represent the causal relationships between different findings and assign conditional probabilities, which can then be used to interpret the progression of TMD. Conclusions: Osteoarthritic bone changes progressed from condyle to articular fossa and finally to mandibular bone contours. Disc displacement was directly related to severe bone changes. Early bone changes were not directly related to disc displacement. TMJ functional factors (condylar translation, bony space and disc form) and age mediated between bone changes and disc displacement. PMID:25472616

  8. ChIP-BIT: Bayesian inference of target genes using a novel joint probabilistic model of ChIP-seq profiles.

    PubMed

    Chen, Xi; Jung, Jin-Gyoung; Shajahan-Haq, Ayesha N; Clarke, Robert; Shih, Ie-Ming; Wang, Yue; Magnani, Luca; Wang, Tian-Li; Xuan, Jianhua

    2016-04-20

    Chromatin immunoprecipitation with massively parallel DNA sequencing (ChIP-seq) has greatly improved the reliability with which transcription factor binding sites (TFBSs) can be identified from genome-wide profiling studies. Many computational tools have been developed to detect binding events or peaks; however, robust detection of weak binding events remains a challenge for current peak-calling tools. We have developed a novel Bayesian approach (ChIP-BIT) to reliably detect TFBSs and their target genes by jointly modeling the binding signal intensities and binding locations of TFBSs. Specifically, a Gaussian mixture model is used to capture both binding and background signals in the sample data. As a unique feature of ChIP-BIT, background signals are modeled by a local Gaussian distribution that is accurately estimated from the input data. Extensive simulation studies showed significantly improved performance of ChIP-BIT in target gene prediction, particularly for detecting weak binding signals at gene promoter regions. We applied ChIP-BIT to find target genes from NOTCH3 and PBX1 ChIP-seq data acquired from MCF-7 breast cancer cells. TF knockdown experiments have initially validated about 30% of the co-regulated target genes identified by ChIP-BIT as being differentially expressed in MCF-7 cells. Functional analysis of these genes further revealed crosstalk between the Notch and Wnt signaling pathways.
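
    The core distributional assumption, binding plus background captured by a Gaussian mixture, can be illustrated with a plain EM fit of a two-component mixture. This is generic EM, not the ChIP-BIT estimator (which, per the abstract, ties the background component to the input data):

    ```python
    import numpy as np

    def em_two_gaussians(x, iters=200):
        """EM for a two-component 1-D Gaussian mixture (background + binding).
        Returns weights, means, variances, and the posterior probability that
        each observation came from the higher-mean ('binding') component."""
        x = np.asarray(x, float)
        w = np.array([0.5, 0.5])
        mu = np.array([x.min(), x.max()])
        var = np.array([x.var(), x.var()])
        for _ in range(iters):
            # E-step: responsibilities under the current parameters.
            dens = (w / np.sqrt(2 * np.pi * var)
                    * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var))
            r = dens / dens.sum(axis=1, keepdims=True)
            # M-step: weighted updates of weights, means and variances.
            nk = r.sum(axis=0)
            w = nk / len(x)
            mu = (r * x[:, None]).sum(axis=0) / nk
            var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        return w, mu, var, r[:, np.argmax(mu)]
    ```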

  9. Bayesian methods for quantitative trait loci mapping based on model selection: approximate analysis using the Bayesian information criterion.

    PubMed Central

    Ball, R D

    2001-01-01

    We describe an approximate method for the analysis of quantitative trait loci (QTL) based on model selection from multiple regression models with trait values regressed on marker genotypes, using a modification of the easily calculated Bayesian information criterion to estimate the posterior probability of models with various subsets of markers as variables. The BIC-delta criterion, with the parameter delta increasing the penalty for additional variables in a model, is further modified to incorporate prior information, and missing values are handled by multiple imputation. Marginal probabilities for model sizes are calculated, and the posterior probability of nonzero model size is interpreted as the posterior probability of existence of a QTL linked to one or more markers. The method is demonstrated on analysis of associations between wood density and markers on two linkage groups in Pinus radiata. Selection bias, which is the bias that results from using the same data to both select the variables in a model and estimate the coefficients, is shown to be a problem for commonly used non-Bayesian methods for QTL mapping, which do not average over alternative possible models that are consistent with the data. PMID:11729175
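
    Computationally, the criterion is cheap to apply: enumerate small marker subsets, compute the modified BIC for each, and convert to model probabilities. The sketch below follows the abstract's description of the delta-penalized criterion, with the prior-information and multiple-imputation details omitted; the exact penalty form shown is our reading, not a transcription of the paper.

    ```python
    import numpy as np
    from itertools import combinations

    def bic_delta_probs(y, X, delta=2.0, max_size=2):
        """Approximate posterior model probabilities from a BIC-delta
        criterion: BIC_delta = n*log(RSS/n) + delta*k*log(n) for a model with
        k markers, P(model) proportional to exp(-BIC_delta / 2)."""
        n, p = X.shape
        models, bics = [], []
        for k in range(max_size + 1):
            for subset in combinations(range(p), k):
                Z = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
                beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
                rss = np.sum((y - Z @ beta) ** 2)
                models.append(subset)
                bics.append(n * np.log(rss / n) + delta * k * np.log(n))
        bics = np.array(bics)
        wts = np.exp(-0.5 * (bics - bics.min()))
        probs = wts / wts.sum()
        # Posterior probability of nonzero model size = P(a linked QTL exists).
        p_qtl = probs[[len(m) > 0 for m in models]].sum()
        return models, probs, p_qtl
    ```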

  10. Bayesian Analysis of Multiple Populations in Galactic Globular Clusters

    NASA Astrophysics Data System (ADS)

    Wagner-Kaiser, Rachel A.; Sarajedini, Ata; von Hippel, Ted; Stenning, David; Piotto, Giampaolo; Milone, Antonino; van Dyk, David A.; Robinson, Elliot; Stein, Nathan

    2016-01-01

    We use GO 13297 Cycle 21 Hubble Space Telescope (HST) observations and archival GO 10775 Cycle 14 HST ACS Treasury observations of Galactic Globular Clusters to find and characterize multiple stellar populations. Determining how globular clusters are able to create and retain enriched material to produce several generations of stars is key to understanding how these objects formed and how they have affected the structural, kinematic, and chemical evolution of the Milky Way. We employ a sophisticated Bayesian technique with an adaptive MCMC algorithm to simultaneously fit the age, distance, absorption, and metallicity for each cluster. At the same time, we also fit unique helium values to two distinct populations of the cluster and determine the relative proportions of those populations. Our unique numerical approach allows objective and precise analysis of these complicated clusters, providing posterior distribution functions for each parameter of interest. We use these results to gain a better understanding of multiple populations in these clusters and their role in the history of the Milky Way. Support for this work was provided by NASA through grant numbers HST-GO-10775 and HST-GO-13297 from the Space Telescope Science Institute, which is operated by AURA, Inc., under NASA contract NAS5-26555. This material is based upon work supported by the National Aeronautics and Space Administration under Grant NNX11AF34G issued through the Office of Space Science. This project was supported by the National Aeronautics & Space Administration through the University of Central Florida's NASA Florida Space Grant Consortium.

  11. Bayesian Statistical Analysis of Circadian Oscillations in Fibroblasts

    PubMed Central

    Cohen, Andrew L.; Leise, Tanya L.; Welsh, David K.

    2012-01-01

    Precise determination of a noisy biological oscillator’s period from limited experimental data can be challenging. The common practice is to calculate a single number (a point estimate) for the period of a particular time course. Uncertainty is inherent in any statistical estimator applied to noisy data, so our confidence in such point estimates depends on the quality and quantity of the data. Ideally, a period estimation method should both produce an accurate point estimate of the period and measure the uncertainty in that point estimate. A variety of period estimation methods are known, but few assess the uncertainty of the estimates, and a measure of uncertainty is rarely reported in the experimental literature. We compare the accuracy of point estimates using six common methods, only one of which can also produce uncertainty measures. We then illustrate the advantages of a new Bayesian method for estimating period, which outperforms the other six methods in accuracy of point estimates for simulated data and also provides a measure of uncertainty. We apply this method to analyze circadian oscillations of gene expression in individual mouse fibroblast cells and compute the number of cells and sampling duration required to reduce the uncertainty in period estimates to a desired level. This analysis indicates that, due to the stochastic variability of noisy intracellular oscillators, achieving a narrow margin of error can require an impractically large number of cells. In addition, we use a hierarchical model to determine the distribution of intrinsic cell periods, thereby separating the variability due to stochastic gene expression within each cell from the variability in period across the population of cells. PMID:22982138

  12. Bayesian Analysis of Order-Statistics Models for Ranking Data.

    ERIC Educational Resources Information Center

    Yu, Philip L. H.

    2000-01-01

    Studied the order-statistics models, extending the usual normal order-statistics model into one in which the underlying random variables followed a multivariate normal distribution. Used a Bayesian approach and the Gibbs sampling technique. Applied the proposed method to analyze presidential election data from the American Psychological…

  13. Semiparametric Thurstonian Models for Recurrent Choices: A Bayesian Analysis

    ERIC Educational Resources Information Center

    Ansari, Asim; Iyengar, Raghuram

    2006-01-01

    We develop semiparametric Bayesian Thurstonian models for analyzing repeated choice decisions involving multinomial, multivariate binary or multivariate ordinal data. Our modeling framework has multiple components that together yield considerable flexibility in modeling preference utilities, cross-sectional heterogeneity and parameter-driven…

  14. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    PubMed

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

    This study searches for economies of size and scope in the Portuguese water sector by applying Bayesian and classical statistics to inference in stochastic frontier analysis (SFA). It demonstrates the usefulness and advantages of Bayesian inference in SFA over traditional SFA, which relies solely on classical statistics. The Bayesian methods overcome several problems that arise in traditional SFA, such as bias in small samples and skewness of residuals, and in the present case study of the Portuguese water sector they provide more plausible and acceptable results. Based on the results obtained, we find important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to substantial advantages in undertaking mergers that join the retail and wholesale components and the drinking water and wastewater services.

  15. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    PubMed

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

    This study searches for economies of size and scope in the Portuguese water sector by applying Bayesian and classical statistics to inference in stochastic frontier analysis (SFA). It demonstrates the usefulness and advantages of Bayesian inference in SFA over traditional SFA, which relies solely on classical statistics. The Bayesian methods overcome several problems that arise in traditional SFA, such as bias in small samples and skewness of residuals, and in the present case study of the Portuguese water sector they provide more plausible and acceptable results. Based on the results obtained, we find important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to substantial advantages in undertaking mergers that join the retail and wholesale components and the drinking water and wastewater services. PMID:26674686

  16. Bayesian Geostatistical Analysis and Prediction of Rhodesian Human African Trypanosomiasis

    PubMed Central

    Wardrop, Nicola A.; Atkinson, Peter M.; Gething, Peter W.; Fèvre, Eric M.; Picozzi, Kim; Kakembo, Abbas S. L.; Welburn, Susan C.

    2010-01-01

    Background The persistent spread of Rhodesian human African trypanosomiasis (HAT) in Uganda in recent years has increased concerns of a potential overlap with the Gambian form of the disease. Recent research has aimed to increase the evidence base for targeting control measures by focusing on the environmental and climatic factors that control the spatial distribution of the disease. Objectives One recent study used simple logistic regression methods to explore the relationship between prevalence of Rhodesian HAT and several social, environmental and climatic variables in two of the most recently affected districts of Uganda, and suggested the disease had spread into the study area due to the movement of infected, untreated livestock. Here we extend this study to account for spatial autocorrelation, incorporate uncertainty in input data and model parameters, and undertake predictive mapping for risk of high HAT prevalence in the future. Materials and Methods Using a spatial analysis in which a generalised linear geostatistical model is applied in a Bayesian framework to account explicitly for spatial autocorrelation and to incorporate uncertainty in input data and model parameters, we demonstrate a more rigorous analytical approach, potentially resulting in more accurate parameter and significance estimates and increased predictive accuracy. This allows an assessment of the validity of the livestock movement hypothesis given more robust parameter estimation and appropriate assessment of covariate effects. Results The analysis strongly supports the theory that Rhodesian HAT was imported to the study area via the movement of untreated, infected livestock from endemic areas. The confounding effect of health care accessibility on the spatial distribution of Rhodesian HAT and the linkages between the disease's distribution and minimum land surface temperature have also been confirmed via the application of these methods. Conclusions Predictive mapping indicates an

  17. Bayesian network representing system dynamics in risk analysis of nuclear systems

    NASA Astrophysics Data System (ADS)

    Varuttamaseni, Athi

    2011-12-01

    A dynamic Bayesian network (DBN) model is used in conjunction with the alternating conditional expectation (ACE) regression method to analyze the risk associated with the loss of feedwater accident coupled with a subsequent initiation of the feed and bleed operation in the Zion-1 nuclear power plant. The use of the DBN allows the joint probability distribution to be factorized, enabling the analysis to be done on many simpler network structures rather than on one complicated structure. The construction of the DBN model assumes conditional independence relations among certain key reactor parameters. The choice of parameters to model is based on considerations of the macroscopic balance statements governing the behavior of the reactor under a quasi-static assumption. The DBN is used to relate the peak clad temperature to a set of independent variables that are known to be important in determining the success of the feed and bleed operation. A simple linear relationship is then used to relate the clad temperature to the core damage probability. To obtain a quantitative relationship among different nodes in the DBN, surrogates of the RELAP5 reactor transient analysis code are used. These surrogates are generated by applying the ACE algorithm to output data obtained from about 50 RELAP5 cases covering a wide range of the selected independent variables. These surrogates allow important safety parameters such as the fuel clad temperature to be expressed as a function of key reactor parameters such as the coolant temperature and pressure together with important independent variables such as the scram delay time. The time-dependent core damage probability is calculated by sampling the independent variables from their probability distributions and propagating the information up through the Bayesian network to give the clad temperature. With the knowledge of the clad temperature and the assumption that the core damage probability has a one-to-one relationship to it, we have

  18. Bayesian analysis of heavy-tailed and long-range dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, Timothy; Watkins, Nick; Gramacy, Robert; Franzke, Christian

    2014-05-01

    We have used MCMC algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average ARFIMA(p,d,q) processes, which are capable of modelling long-range dependence (e.g. Beran et al., 2013). Our principal aim is to obtain inference about the long-memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short-memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series. We have extended the ARFIMA model by weakening the Gaussianity assumption, assuming an alpha-stable, heavy-tailed distribution for the innovations, and performing joint inference on d and alpha. We will present a study of the dependence of the posterior variance of the memory parameter d on the length of the time series considered, compared with equivalent error diagnostics for other popular measures of d.
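
    The long-memory machinery rests on the fractional-differencing weights of (1-B)^d; a conditional (truncated-AR) Gaussian log-likelihood built from them is the quantity an MCMC sampler over d would evaluate repeatedly. This is the standard construction rather than the authors' reversible-jump code:

    ```python
    import numpy as np

    def frac_diff_weights(d, n):
        """AR(infinity) weights of (1 - B)^d for an ARFIMA(0,d,0) process:
        pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
        w = np.empty(n)
        w[0] = 1.0
        for k in range(1, n):
            w[k] = w[k - 1] * (k - 1 - d) / k
        return w

    def loglik_arfima0d0(x, d, sigma):
        """Conditional (truncated-AR) Gaussian log-likelihood: filter the
        series with the fractional-difference weights and score the
        residuals as N(0, sigma^2)."""
        x = np.asarray(x, float)
        w = frac_diff_weights(d, len(x))
        resid = np.array([w[: k + 1][::-1] @ x[: k + 1] for k in range(len(x))])
        return (-0.5 * np.sum(resid ** 2) / sigma ** 2
                - len(x) * np.log(sigma)
                - 0.5 * len(x) * np.log(2 * np.pi))
    ```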

  19. Major gene detection for fusiform rust resistance using Bayesian complex segregation analysis in loblolly pine.

    PubMed

    Li, Hua; Ghosh, Sujit; Amerson, Henry; Li, Bailian

    2006-09-01

    The presence of major genes affecting fusiform rust resistance in loblolly pine was investigated in a progeny population generated with a half-diallel mating of six parents. A Bayesian complex segregation analysis was used to make inference about a mixed inheritance model (MIM) that included polygenic effects and a single major gene effect. Marginalizations were achieved using a Gibbs sampler. A parent-block sampling scheme, in which the genotypes of a parent and its offspring were sampled jointly, was implemented to improve mixing. The MIM was compared with a pure polygenic model (PM) using Bayes factors. Results showed that the MIM explained the inheritance of rust resistance in the diallel population better than the pure PM. A large major gene variance component estimate (>50% of total variance) indicated the existence of major genes for rust resistance in the studied loblolly pine population. Based on estimates of the parental genotypes, it appears that two or more major genes may affect disease phenotypes in this diallel population.

  20. Bayesian Analysis of Two Stellar Populations in Galactic Globular Clusters III: Analysis of 30 Clusters

    NASA Astrophysics Data System (ADS)

    Wagner-Kaiser, R.; Stenning, D. C.; Sarajedini, A.; von Hippel, T.; van Dyk, D. A.; Robinson, E.; Stein, N.; Jefferys, W. H.

    2016-09-01

    We use Cycle 21 Hubble Space Telescope (HST) observations and HST archival ACS Treasury observations of 30 Galactic Globular Clusters to characterize two distinct stellar populations. A sophisticated Bayesian technique is employed to simultaneously sample the joint posterior distribution of age, distance, and extinction for each cluster, as well as unique helium values for the two populations within each cluster and the relative proportion of those populations. We find that the helium differences between the two populations in the clusters fall in the range of ~0.04 to 0.11. Because adequate models varying in CNO are not presently available, we view these spreads as upper limits and present them with statistical rather than observational uncertainties. The evidence supports previous studies suggesting an increase in helium content concurrent with increasing cluster mass, and we also find that the proportion of first-population stars increases with mass. Our results are examined in the context of proposed globular cluster formation scenarios. Additionally, we leverage our Bayesian technique to shed light on inconsistencies between the theoretical models and the observed data.

  1. Objective Bayesian fMRI analysis-a pilot study in different clinical environments.

    PubMed

    Magerkurth, Joerg; Mancini, Laura; Penny, William; Flandin, Guillaume; Ashburner, John; Micallef, Caroline; De Vita, Enrico; Daga, Pankaj; White, Mark J; Buckley, Craig; Yamamoto, Adam K; Ourselin, Sebastien; Yousry, Tarek; Thornton, John S; Weiskopf, Nikolaus

    2015-01-01

    Functional MRI (fMRI) used for neurosurgical planning delineates functionally eloquent brain areas by time-series analysis of task-induced BOLD signal changes. Commonly used frequentist statistics protect against false positive results based on a p-value threshold. In surgical planning, false negative results are equally if not more harmful, potentially masking true brain activity and leading to erroneous resection of eloquent regions. Bayesian statistics provides an alternative framework, categorizing areas as activated, deactivated, non-activated or of low statistical confidence. This approach has not yet found wide clinical application, partly due to the lack of a method to objectively define an effect size threshold. We implemented a Bayesian analysis framework for neurosurgical planning fMRI. It entails an automated effect-size threshold selection method for posterior probability maps that accounts for inter-individual BOLD response differences and was calibrated against frequentist results maps thresholded by two clinical experts. We compared Bayesian and frequentist analyses of passive-motor fMRI data from 10 healthy volunteers measured on a pre-operative 3T and an intra-operative 1.5T MRI scanner. As a clinical case study, we tested passive motor task activation in a brain tumor patient at 3T under clinical conditions. With our novel effect-size threshold method, the Bayesian analysis revealed regions of all four categories in the 3T data. Activated region foci and extent were consistent with the frequentist analysis results. In the lower signal-to-noise ratio 1.5T intra-operative scanner data, Bayesian analysis provided improved brain-activation detection sensitivity compared with the frequentist analysis, albeit with smaller spatial extents of activation than at 3T. Bayesian analysis of fMRI data using operator-independent effect-size threshold selection may improve the sensitivity and certainty of information available to guide neurosurgery.
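
    The four-way categorization comes from thresholding posterior probabilities against an effect size. The sketch below assumes an approximately Gaussian posterior per voxel and placeholder thresholds; the abstract's actual contribution, the automated calibration of the effect-size threshold, is not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import norm

    def classify_voxels(post_mean, post_sd, gamma, p=0.95):
        """Four-way posterior probability map labels, per voxel:
        activated if P(beta > gamma) > p, deactivated if P(beta < -gamma) > p,
        non-activated if P(|beta| < gamma) > p, else low statistical
        confidence. gamma and p are placeholder thresholds."""
        p_act = 1.0 - norm.cdf(gamma, post_mean, post_sd)
        p_deact = norm.cdf(-gamma, post_mean, post_sd)
        p_null = (norm.cdf(gamma, post_mean, post_sd)
                  - norm.cdf(-gamma, post_mean, post_sd))
        labels = np.full(np.shape(post_mean), "low-confidence", dtype=object)
        labels[p_act > p] = "activated"
        labels[p_deact > p] = "deactivated"
        labels[p_null > p] = "non-activated"
        return labels
    ```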

  2. A Gibbs sampler for Bayesian analysis of site-occupancy data

    USGS Publications Warehouse

    Dorazio, Robert M.; Rodriguez, Daniel Taylor

    2012-01-01

    1. A Bayesian analysis of site-occupancy data containing covariates of species occurrence and species detection probabilities is usually completed using Markov chain Monte Carlo methods in conjunction with software programs that can implement those methods for any statistical model, not just site-occupancy models. Although these software programs are quite flexible, considerable experience is often required to specify a model and to initialize the Markov chain so that summaries of the posterior distribution can be estimated efficiently and accurately. 2. As an alternative to these programs, we develop a Gibbs sampler for Bayesian analysis of site-occupancy data that include covariates of species occurrence and species detection probabilities. This Gibbs sampler is based on a class of site-occupancy models in which probabilities of species occurrence and detection are specified as probit-regression functions of site- and survey-specific covariate measurements. 3. To illustrate the Gibbs sampler, we analyse site-occupancy data of the blue hawker, Aeshna cyanea (Odonata, Aeshnidae), a common dragonfly species in Switzerland. Our analysis includes a comparison of results based on Bayesian and classical (non-Bayesian) methods of inference. We also provide code (based on the R software program) for conducting Bayesian and classical analyses of site-occupancy data.
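
    The probit-regression formulation is what makes a clean Gibbs sampler possible here, via the Albert-Chib latent-variable trick. The sketch below is a simplified, hypothetical version (not the authors' R code): occurrence follows a probit regression, but detection is reduced to a single constant probability with a Beta prior so that every full conditional stays in closed form.

```python
import numpy as np
from scipy.stats import norm, truncnorm

rng = np.random.default_rng(1)
n_sites, n_surveys = 200, 4
X = np.column_stack([np.ones(n_sites), rng.normal(size=n_sites)])  # occurrence design
beta_true, p_true = np.array([0.2, 1.0]), 0.4
z_true = rng.random(n_sites) < norm.cdf(X @ beta_true)             # latent occupancy
y = (rng.random((n_sites, n_surveys)) < p_true) & z_true[:, None]  # detection histories

detected = y.any(axis=1)
XtX_inv = np.linalg.inv(X.T @ X)
beta, p, keep = np.zeros(2), 0.5, []
for it in range(3000):
    # 1. occupancy state for sites with no detections (Bernoulli full conditional)
    psi = norm.cdf(X @ beta)
    q = psi * (1 - p) ** n_surveys
    q = q / (q + 1 - psi)
    z = np.where(detected, True, rng.random(n_sites) < q)
    # 2. Albert-Chib latent normals, then beta (flat prior)
    mu = X @ beta
    a, b = np.where(z, -mu, -np.inf), np.where(z, np.inf, -mu)
    u = truncnorm.rvs(a, b, loc=mu, scale=1.0, random_state=rng)
    beta = rng.multivariate_normal(XtX_inv @ X.T @ u, XtX_inv)
    # 3. detection probability (conjugate Beta update, occupied sites only)
    succ, trials = y[z].sum(), z.sum() * n_surveys
    p = rng.beta(1 + succ, 1 + trials - succ)
    if it >= 1000:
        keep.append([beta[0], beta[1], p])
print(np.mean(keep, axis=0))  # compare with beta_true and p_true
```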

  3. Joint feature selection and classification using a Bayesian neural network with automatic relevance determination priors: potential use in CAD of medical imaging

    NASA Astrophysics Data System (ADS)

    Chen, Weijie; Zur, Richard M.; Giger, Maryellen L.

    2007-03-01

    A Bayesian neural network (BNN) with automatic relevance determination (ARD) priors has the ability to assess the relevance of each input feature during network training. Our purpose is to investigate the potential use of BNNs with ARD priors for joint feature selection and classification in computer-aided diagnosis (CAD) of medical imaging. With ARD priors, each group of weights that connects an input feature to the hidden units is associated with a hyperparameter controlling the magnitudes of the weights. The hyperparameters and the weights are updated simultaneously during neural network training. A smaller hyperparameter will likely result in larger weight values, and the corresponding feature will likely be more relevant to the output and thus to the classification task. For our study, a multivariate normal feature space is designed to include one feature with high classification performance in terms of both the ideal observer and a linear observer, two features with high ideal observer performance but low linear observer performance, and seven useless features. An exclusive-OR (XOR) feature space is designed to include two XOR features and eight useless features. Our simulation results show that the ARD-BNN approach has the ability to select the optimal subset of features on the designed nonlinear feature spaces on which the linear approach fails. ARD-BNN has the ability to recognize features that have high ideal observer performance. Stepwise linear discriminant analysis (SWLDA) has the ability to select features that have high linear observer performance but fails to select features that have high ideal observer performance and low linear observer performance. The cross-validation results on clinical breast MRI data show that ARD-BNN yields statistically significantly better performance than does the SWLDA-LDA approach. We believe that ARD-BNN is a promising method for pattern recognition in computer-aided diagnosis of medical imaging.

  4. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis: A Cross-National Investigation of Schwartz Values

    ERIC Educational Resources Information Center

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the model parameters and demonstrates the consequences…

  5. Pooled Bayesian meta-analysis of two Polish studies on radiation-induced cancers.

    PubMed

    Fornalski, Krzysztof W

    2015-11-01

    The robust Bayesian regression method was applied to perform a meta-analysis of two independent studies on the influence of low ionising radiation doses on the occurrence of fatal cancers. The re-analysed data come from an occupational exposure analysis of nuclear workers in Świerk (Poland) and from an ecological study of cancer risk from natural background radiation in Poland. These two different types of data were analysed, and three popular models were tested: constant, linear and quadratic dose-response dependencies. The Bayesian model selection algorithm was used for all models. The Bayesian statistics clearly indicate that the popular linear no-threshold (LNT) assumption is not valid for the presented cancer risks in the range of low doses of ionising radiation. The use of the LNT hypothesis in radiation risk prediction and assessment is also discussed. PMID:25956788

  6. Cyclist activity and injury risk analysis at signalized intersections: a Bayesian modelling approach.

    PubMed

    Strauss, Jillian; Miranda-Moreno, Luis F; Morency, Patrick

    2013-10-01

    This study proposes a two-equation Bayesian modelling approach to simultaneously study cyclist injury occurrence and bicycle activity at signalized intersections as joint outcomes. This approach deals with the potential presence of endogeneity and unobserved heterogeneities and is used to identify factors associated with both cyclist injuries and volumes. Its application to identify high-risk corridors is also illustrated. Montreal, Quebec, Canada is the application environment, using an extensive inventory of a large sample of signalized intersections containing disaggregate motor-vehicle traffic volumes and bicycle flows, geometric design, traffic control and built environment characteristics in the vicinity of the intersections. Cyclist injury data for the period of 2003-2008 is used in this study. Also, manual bicycle counts were standardized using temporal and weather adjustment factors to obtain average annual daily volumes. Results confirm and quantify the effects of both bicycle and motor-vehicle flows on cyclist injury occurrence. Accordingly, more cyclists at an intersection translate into more cyclist injuries but lower injury rates due to the non-linear association between bicycle volume and injury occurrence. Furthermore, the results emphasize the importance of turning motor-vehicle movements. The presence of bus stops and total crosswalk length increase cyclist injury occurrence whereas the presence of a raised median has the opposite effect. Bicycle activity through intersections was found to increase as employment, number of metro stations, land use mix, area of commercial land use type, length of bicycle facilities and the presence of schools within 50-800 m of the intersection increase. Intersections with three approaches are expected to have fewer cyclists than those with four. Using Bayesian analysis, expected injury frequency and injury rates were estimated for each intersection and used to rank corridors. Corridors with high bicycle volumes

  7. A Bayesian Analysis of the Ages of Four Open Clusters

    NASA Astrophysics Data System (ADS)

    Jeffery, Elizabeth J.; von Hippel, Ted; van Dyk, David A.; Stenning, David C.; Robinson, Elliot; Stein, Nathan; Jefferys, William H.

    2016-09-01

    In this paper we apply a Bayesian technique to determine the best fit of stellar evolution models to find the main sequence turn-off age and other cluster parameters of four intermediate-age open clusters: NGC 2360, NGC 2477, NGC 2660, and NGC 3960. Our algorithm utilizes a Markov chain Monte Carlo technique to fit these various parameters, objectively finding the best-fit isochrone for each cluster. The result is a high-precision isochrone fit. We compare these results with those of traditional “by-eye” isochrone fitting methods. By applying this Bayesian technique to NGC 2360, NGC 2477, NGC 2660, and NGC 3960, we determine the ages of these clusters to be 1.35 ± 0.05, 1.02 ± 0.02, 1.64 ± 0.04, and 0.860 ± 0.04 Gyr, respectively. The results of this paper continue our effort to determine cluster ages to a higher precision than that offered by traditional methods of isochrone fitting.

  8. OBJECTIVE BAYESIAN ANALYSIS OF “ON/OFF” MEASUREMENTS

    SciTech Connect

    Casadei, Diego

    2015-01-01

    In high-energy astrophysics, it is common practice to account for the background overlaid with counts from the source of interest with the help of auxiliary measurements carried out by pointing off-source. In this “on/off” measurement, one knows the number of photons detected while pointing toward the source, the number of photons collected while pointing away from the source, and how to estimate the background counts in the source region from the flux observed in the auxiliary measurements. For very faint sources, the number of photons detected is so low that the approximations that hold asymptotically are not valid. On the other hand, an analytical solution exists for the Bayesian statistical inference, which is valid at low and high counts. Here we illustrate the objective Bayesian solution based on the reference posterior and compare the result with the approach very recently proposed by Knoetig, and discuss its most delicate points. In addition, we propose to compute the significance of the excess with respect to the background-only expectation with a method that is able to account for any uncertainty on the background and is valid for any photon count. This method is compared to the widely used significance formula by Li and Ma, which is based on asymptotic properties.
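
    The Li and Ma formula mentioned at the end is short enough to state concretely. A minimal sketch of equation (17) of Li & Ma (1983), where alpha is the ratio of on-source to off-source exposure; as the abstract stresses, this asymptotic formula degrades for very faint sources, which is where the Bayesian reference-posterior solution matters.

```python
import numpy as np

def li_ma_significance(n_on, n_off, alpha):
    """Asymptotic significance of an on-source excess (Li & Ma 1983, eq. 17)."""
    ratio = n_on / (n_on + n_off)
    term_on = n_on * np.log(((1 + alpha) / alpha) * ratio)
    term_off = n_off * np.log((1 + alpha) * (1 - ratio))
    return np.sqrt(2.0 * (term_on + term_off))

# 30 counts on-source vs. 0.25 * 60 = 15 expected from the background
print(li_ma_significance(n_on=30, n_off=60, alpha=0.25))
```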

  9. S-PLUS Library For Nonlinear Bayesian Regression Analysis

    SciTech Connect

    Heasler, Patrick G. ); Anderson, Kevin K. ); Hylden, Jeff L. )

    2002-09-25

    This document describes a library of S-PLUS functions used for nonlinear Bayesian regression in general and IR estimation in particular. This library has been developed to solve a general class of problems described by the nonlinear regression model: Y = F(beta, data) + E, where Y represents a vector of measurements and F(beta, data) represents an S-PLUS function that has been constructed to describe the measurements. The function F(beta, data) depends upon beta, a vector of parameters to be estimated, while data is an S-PLUS object containing any other information needed by the model. The errors, E, are assumed to be independent, normal, unbiased, and to have known standard deviations, stdev(E) = sd.E. The components of beta are split into two groups: estimation parameters and nuisance parameters. The Bayesian prior on the estimation parameters will generally be non-informative, while the prior on the nuisance parameters will be constructed to reflect the information we have about them. We hope an extended beta distribution is general enough to adequately represent this information. While we expect these functions to be improved and revised, this library is mature enough to be used without major modification.
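
    Although the library itself is written for S-PLUS, the model class it targets is easy to sketch in Python. The following minimal example (with a hypothetical F and simulated data, not part of the library) draws from the posterior of beta in Y = F(beta, data) + E with known error standard deviations, using a random-walk Metropolis sampler in place of the library's machinery.

```python
import numpy as np

rng = np.random.default_rng(42)

def F(beta, t):
    """Hypothetical nonlinear response model, a stand-in for the library's F(beta, data)."""
    return beta[0] * np.exp(-beta[1] * t)

t = np.linspace(0.0, 5.0, 40)
sd_E = 0.05                                    # known measurement error, as the model assumes
y = F([2.0, 0.8], t) + rng.normal(scale=sd_E, size=t.size)

def log_post(beta):
    if beta[1] <= 0:                           # simple prior constraint: decay rate positive
        return -np.inf
    resid = y - F(beta, t)
    return -0.5 * np.sum((resid / sd_E) ** 2)  # flat (non-informative) prior otherwise

beta, samples = np.array([1.0, 1.0]), []
lp = log_post(beta)
for _ in range(20000):
    prop = beta + rng.normal(scale=0.02, size=2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:    # Metropolis accept/reject
        beta, lp = prop, lp_prop
    samples.append(beta.copy())
post = np.array(samples[5000:])
print(post.mean(axis=0), post.std(axis=0))     # posterior mean and spread for beta
```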

  10. Wear analysis of revolute joints with clearance in multibody systems

    NASA Astrophysics Data System (ADS)

    Bai, ZhengFeng; Zhao, Yang; Wang, XingGui

    2013-08-01

    In this work, the prediction of wear for a revolute joint with clearance in multibody systems is investigated using a computational methodology. The contact model in the clearance joint is established using a new hybrid nonlinear contact force model, and the friction effect is considered by using a modified Coulomb friction model. The dynamics model of a multibody system with clearance is established using a dynamic segmentation modeling method, and the computational process for wear analysis of the clearance joint is presented. This process comprises two steps: dynamics analysis and wear analysis. The dynamics simulation of the multibody system with a revolute clearance joint is carried out, and the resulting contact forces are used to calculate the wear amount of the joint based on Archard's wear model. Finally, a four-bar mechanism with a revolute clearance joint is used as a numerical example to demonstrate the dynamic responses and wear characteristics of such systems. The main results indicate that contact between the joint elements is wider and more frequent in certain regions, so that wear is not uniform around the joint surface and the clearance size grows irregularly as the joint wears. This work presents an effective method to predict wear of revolute joints with clearance in multibody systems.
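
    Archard's wear model referenced here relates wear volume to load and sliding distance, V = k·F·s/H, so wear depth over a contact area A accumulates as dh = k·F·ds/(H·A). A minimal sketch of the wear-analysis step, with hypothetical force and sliding histories standing in for the outputs of the dynamics simulation:

```python
import numpy as np

def archard_wear_depth(force, sliding_inc, k_wear, hardness, area):
    """Accumulated wear depth: sum of k * F * ds / (H * A) over the time steps.
    force [N] and sliding_inc [m] per step; hardness [Pa]; contact area [m^2]."""
    return np.sum(k_wear * force * sliding_inc / (hardness * area))

rng = np.random.default_rng(0)
force = np.abs(rng.normal(50.0, 10.0, size=1000))  # hypothetical joint contact forces
ds = np.full(1000, 1e-4)                           # sliding distance per step
print(archard_wear_depth(force, ds, k_wear=1e-4, hardness=2.1e9, area=1e-5))
```

    Because the contact forces concentrate in specific regions of the joint, applying this update region by region (rather than summing globally, as above) reproduces the non-uniform wear the authors report.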

  11. Calibration of crash risk models on freeways with limited real-time traffic data using Bayesian meta-analysis and Bayesian inference approach.

    PubMed

    Xu, Chengcheng; Wang, Wei; Liu, Pan; Li, Zhibin

    2015-12-01

    This study aimed to develop a real-time crash risk model with limited data in China by using a Bayesian meta-analysis and Bayesian inference approach. A systematic review was first conducted using three different Bayesian meta-analyses: a fixed effect meta-analysis, a random effect meta-analysis, and a meta-regression. The meta-analyses provided a numerical summary of the effects of traffic variables on crash risks by quantitatively synthesizing results from previous studies. The random effect meta-analysis and the meta-regression produced more conservative estimates of the effects of traffic variables than the fixed effect meta-analysis. The meta-analysis results were then used as informative priors for developing crash risk models with limited data. The choice among the three meta-analyses significantly affected model fit and prediction accuracy: the model based on the meta-regression increased prediction accuracy by about 15% compared with the model developed directly from the limited data. Finally, Bayesian predictive density analysis was used to identify outliers in the limited data, which further improved prediction accuracy by 5.0%.
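
    The fixed effect and random effect pooling steps referred to above have standard closed forms; the meta-regression additionally conditions on study-level covariates. A minimal sketch (with hypothetical effect estimates) of inverse-variance pooling with a DerSimonian-Laird between-study variance, whose output could serve as an informative Normal prior:

```python
import numpy as np

def pool(effects, variances, random_effect=True):
    """Inverse-variance pooled mean and variance; DerSimonian-Laird tau^2 if random_effect."""
    w = 1.0 / variances
    theta_fixed = np.sum(w * effects) / np.sum(w)
    if not random_effect:
        return theta_fixed, 1.0 / np.sum(w)
    q = np.sum(w * (effects - theta_fixed) ** 2)       # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)      # between-study variance
    w_star = 1.0 / (variances + tau2)
    return np.sum(w_star * effects) / np.sum(w_star), 1.0 / np.sum(w_star)

effects = np.array([0.12, 0.25, 0.08, 0.30])       # hypothetical coefficients from past studies
variances = np.array([0.010, 0.020, 0.015, 0.050])
mean, var = pool(effects, variances)
print(mean, var)  # use as a Normal(mean, var) informative prior for the new model
```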

  12. An Automated Bayesian Framework for Integrative Gene Expression Analysis and Predictive Medicine

    PubMed Central

    Parikh, Neena; Zollanvari, Amin; Alterovitz, Gil

    2012-01-01

    Motivation: This work constructs a closed loop Bayesian Network framework for predictive medicine via integrative analysis of publicly available gene expression findings pertaining to various diseases. Results: An automated pipeline was successfully constructed. Integrative models were made based on gene expression data obtained from GEO experiments relating to four different diseases using Bayesian statistical methods. Many of these models demonstrated a high level of accuracy and predictive ability. The approach described in this paper can be applied to any complex disorder and can include any number and type of genome-scale studies. PMID:22779059

  13. Results and Analysis from Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Matty, Jennifer

    2010-01-01

    This joint mobility KC lecture included information from two papers, "A Method for and Issues Associated with the Determination of Space Suit Joint Requirements" and "Results and Analysis from Space Suit Joint Torque Testing," as presented for the International Conference on Environmental Systems in 2009 and 2010, respectively. The first paper discusses historical joint torque testing methodologies and approaches that were tested in 2008 and 2009. The second paper discusses the testing that was completed in 2009 and 2010.

  14. Bayesian Analysis of Foraging by Pigeons (Columba livia)

    PubMed Central

    Killeen, Peter R.; Palombo, Gina-Marie; Gottlob, Lawrence R.; Beam, Jon

    2008-01-01

    In this article, the authors combine models of timing and Bayesian revision of information concerning patch quality to predict foraging behavior. Pigeons earned food by pecking on 2 keys (patches) in an experimental chamber. Food was primed for only 1 of the patches on each trial. There was a constant probability of finding food in a primed patch, but it accumulated only while the animals searched there. The optimal strategy was to choose the better patch first and remain for a fixed duration, thereafter alternating evenly between the patches. Pigeons were nonoptimal in 3 ways: (a) they departed too early, (b) their departure times were variable, and (c) they were biased in their choices after initial departure. The authors review various explanations of these data. PMID:8865614
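
    The Bayesian revision component of the model can be illustrated with a small calculation. Assuming (as the task description implies) that food turns up in a primed patch with a constant hazard while the bird searches there, the posterior that the current patch is the primed one decays with unrewarded search time; the numbers below are purely illustrative.

```python
import numpy as np

def posterior_primed(t, prior=0.5, hazard=0.2):
    """P(current patch is primed | no food after searching it for time t)."""
    like_primed = np.exp(-hazard * t)  # still no food, despite priming
    like_unprimed = 1.0                # no food is certain in an unprimed patch
    return prior * like_primed / (prior * like_primed + (1 - prior) * like_unprimed)

for t in [0.0, 2.0, 5.0, 10.0]:
    print(t, round(posterior_primed(t), 3))  # declining belief sets the optimal departure time
```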

  15. Variational Bayesian causal connectivity analysis for fMRI.

    PubMed

    Luessi, Martin; Babacan, S Derin; Molina, Rafael; Booth, James R; Katsaggelos, Aggelos K

    2014-01-01

    The ability to accurately estimate effective connectivity among brain regions from neuroimaging data could help answer many open questions in neuroscience. We propose a method which uses causality to obtain a measure of effective connectivity from fMRI data. The method uses a vector autoregressive model for the latent variables describing neuronal activity in combination with a linear observation model based on a convolution with a hemodynamic response function. Due to the employed modeling, it is possible to efficiently estimate all latent variables of the model using a variational Bayesian inference algorithm. The computational efficiency of the method enables us to apply it to large-scale problems with high sampling rates and several hundred regions of interest. We use a comprehensive empirical evaluation with synthetic and real fMRI data to evaluate the performance of our method under various conditions.
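
    The generative model described here, latent neuronal time series following a vector autoregression, observed through convolution with a hemodynamic response function, can be sketched directly; inverting it with variational Bayes is the paper's contribution and is omitted. All sizes and the connectivity matrix below are hypothetical.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)
T, n = 400, 3                      # time points and regions of interest
A = np.array([[0.5, 0.3, 0.0],     # hypothetical VAR(1) connectivity:
              [0.0, 0.4, 0.0],     # entry (i, j) = influence of region j
              [0.2, 0.0, 0.6]])    # on region i at the next time step
x = np.zeros((T, n))               # latent neuronal activity
for t in range(1, T):
    x[t] = A @ x[t - 1] + rng.normal(scale=0.5, size=n)

t_hrf = np.arange(0.0, 30.0, 1.0)  # canonical double-gamma HRF, TR = 1 s assumed
hrf = gamma.pdf(t_hrf, 6) - gamma.pdf(t_hrf, 16) / 6.0
hrf /= hrf.sum()

# observed BOLD: per-region convolution of latent activity with the HRF, plus noise
y = np.column_stack([np.convolve(x[:, i], hrf)[:T] for i in range(n)])
y += rng.normal(scale=0.1, size=y.shape)
```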

  16. Variational Bayesian causal connectivity analysis for fMRI

    PubMed Central

    Luessi, Martin; Babacan, S. Derin; Molina, Rafael; Booth, James R.; Katsaggelos, Aggelos K.

    2014-01-01

    The ability to accurately estimate effective connectivity among brain regions from neuroimaging data could help answer many open questions in neuroscience. We propose a method which uses causality to obtain a measure of effective connectivity from fMRI data. The method uses a vector autoregressive model for the latent variables describing neuronal activity in combination with a linear observation model based on a convolution with a hemodynamic response function. Due to the employed modeling, it is possible to efficiently estimate all latent variables of the model using a variational Bayesian inference algorithm. The computational efficiency of the method enables us to apply it to large-scale problems with high sampling rates and several hundred regions of interest. We use a comprehensive empirical evaluation with synthetic and real fMRI data to evaluate the performance of our method under various conditions. PMID:24847244

  17. Application of a data-mining method based on Bayesian networks to lesion-deficit analysis

    NASA Technical Reports Server (NTRS)

    Herskovits, Edward H.; Gerring, Joan P.

    2003-01-01

    Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics.

  18. Bayesian Factor Analysis as a Variable-Selection Problem: Alternative Priors and Consequences.

    PubMed

    Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric

    2016-01-01

    Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor-loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, Muthén & Asparouhov proposed a Bayesian structural equation modeling (BSEM) approach to explore the presence of cross loadings in CFA models. We show that the issue of determining factor-loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov's approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike-and-slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set is used to demonstrate our approach.
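
    The spike-and-slab idea can be made concrete in a one-loading toy case: the loading is either exactly zero (spike) or drawn from a diffuse normal (slab), and the posterior inclusion probability follows from the two marginal likelihoods. A minimal sketch with simulated, hypothetical data; this is an illustration of the prior, not the authors' BSEM-SSP estimator.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(3)
n = 50
x = rng.normal(size=n)                # standardized factor scores (hypothetical)
y = 0.3 * x + rng.normal(size=n)      # item scores; true cross-loading = 0.3

sigma2, v_slab, prior_inc = 1.0, 1.0, 0.5
# marginal likelihood with the loading fixed at 0 (spike) ...
m0 = multivariate_normal.logpdf(y, mean=np.zeros(n), cov=sigma2 * np.eye(n))
# ... and with the loading integrated over the N(0, v_slab) slab
m1 = multivariate_normal.logpdf(y, mean=np.zeros(n),
                                cov=sigma2 * np.eye(n) + v_slab * np.outer(x, x))
log_odds = np.log(prior_inc / (1 - prior_inc)) + m1 - m0
print(1.0 / (1.0 + np.exp(-log_odds)))  # posterior P(cross-loading is nonzero)
```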

  19. Monte Carlo Algorithms for a Bayesian Analysis of the Cosmic Microwave Background

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey B.; Eriksen, H. K.; ODwyer, I. J.; Wandelt, B. D.; Gorski, K.; Knox, L.; Chu, M.

    2006-01-01

    This viewgraph presentation reviews the Bayesian approach to Cosmic Microwave Background (CMB) analysis and its numerical implementation with Gibbs sampling, summarizes its application to WMAP I, and describes work in progress on generalizations to polarization, foregrounds, asymmetric beams, and 1/f noise.

  20. Application of a data-mining method based on Bayesian networks to lesion-deficit analysis.

    PubMed

    Herskovits, Edward H; Gerring, Joan P

    2003-08-01

    Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics. PMID:12948721

  1. Using Discrete Loss Functions and Weighted Kappa for Classification: An Illustration Based on Bayesian Network Analysis

    ERIC Educational Resources Information Center

    Zwick, Rebecca; Lenaburg, Lubella

    2009-01-01

    In certain data analyses (e.g., multiple discriminant analysis and multinomial log-linear modeling), classification decisions are made based on the estimated posterior probabilities that individuals belong to each of several distinct categories. In the Bayesian network literature, this type of classification is often accomplished by assigning…

  2. Bayesian Network Meta-Analysis for Unordered Categorical Outcomes with Incomplete Data

    ERIC Educational Resources Information Center

    Schmid, Christopher H.; Trikalinos, Thomas A.; Olkin, Ingram

    2014-01-01

    We develop a Bayesian multinomial network meta-analysis model for unordered (nominal) categorical outcomes that allows for partially observed data in which exact event counts may not be known for each category. This model properly accounts for correlations of counts in mutually exclusive categories and enables proper comparison and ranking of…

  3. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...

  4. Family Background Variables as Instruments for Education in Income Regressions: A Bayesian Analysis

    ERIC Educational Resources Information Center

    Hoogerheide, Lennart; Block, Joern H.; Thurik, Roy

    2012-01-01

    The validity of family background variables instrumenting education in income regressions has been much criticized. In this paper, we use data from the 2004 German Socio-Economic Panel and Bayesian analysis to analyze to what degree violations of the strict validity assumption affect the estimation results. We show that, in case of moderate direct…

  5. Bayesian Meta-Analysis of Cronbach's Coefficient Alpha to Evaluate Informative Hypotheses

    ERIC Educational Resources Information Center

    Okada, Kensuke

    2015-01-01

    This paper proposes a new method to evaluate informative hypotheses for meta-analysis of Cronbach's coefficient alpha using a Bayesian approach. The coefficient alpha is one of the most widely used reliability indices. In meta-analyses of reliability, researchers typically form specific informative hypotheses beforehand, such as "alpha of…

  6. Applying Bayesian Modeling and Receiver Operating Characteristic Methodologies for Test Utility Analysis

    ERIC Educational Resources Information Center

    Wang, Qiu; Diemer, Matthew A.; Maier, Kimberly S.

    2013-01-01

    This study integrated Bayesian hierarchical modeling and receiver operating characteristic analysis (BROCA) to evaluate how interest strength (IS) and interest differentiation (ID) predicted low–socioeconomic status (SES) youth's interest-major congruence (IMC). Using large-scale Kuder Career Search online-assessment data, this study fit three…

  7. Bayesian Factor Analysis as a Variable-Selection Problem: Alternative Priors and Consequences.

    PubMed

    Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric

    2016-01-01

    Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor-loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, Muthén & Asparouhov proposed a Bayesian structural equation modeling (BSEM) approach to explore the presence of cross loadings in CFA models. We show that the issue of determining factor-loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov's approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike-and-slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set is used to demonstrate our approach. PMID:27314566

  8. Symptoms of Depression and Challenging Behaviours in People with Intellectual Disability: A Bayesian Analysis. Brief Report

    ERIC Educational Resources Information Center

    Tsiouris, John; Mann, Rachel; Patti, Paul; Sturmey, Peter

    2004-01-01

    Clinicians need to know the likelihood of a condition given a positive or negative diagnostic test. In this study a Bayesian analysis of the Clinical Behavior Checklist for Persons with Intellectual Disabilities (CBCPID) to predict depression in people with intellectual disability was conducted. The CBCPID was administered to 92 adults with…

  9. Coronal joint spaces of the Temporomandibular joint: Systematic review and meta-analysis

    PubMed Central

    Silva, Joana-Cristina; Pires, Carlos A.; Ponces-Ramalhão, Maria-João-Feio; Lopes, Jorge-Dias

    2015-01-01

    Introduction The joint space measurements of the temporomandibular joint have been used to determine variation in condyle position. The aim of this study is therefore to perform a systematic review and meta-analysis of coronal joint space measurements of the temporomandibular joint. Material and Methods An electronic database search was performed with the terms “condylar position”; “joint space” AND “TMJ”. Inclusion criteria were: tomographic 3D imaging of the TMJ and presentation of at least two joint space measurements on the coronal plane. Exclusion criteria were: mandibular fractures, animal studies, surgery, presence of genetic or chronic diseases, case reports, opinion or debate articles, or unpublished material. The risk of bias of each study was judged as high, moderate or low according to the Cochrane risk of bias tool. The values used in the meta-analysis were the medial, superior and lateral joint space measurements and their differences between the right and left joint. Results The initial search retrieved 2706 articles. After excluding duplicates and all studies that did not match the eligibility criteria, 4 articles qualified for final review. All retrieved articles were judged to provide a low level of evidence. All of the reviewed studies were included in the meta-analysis, which found mean coronal joint space values of 2.94 mm (medial), 2.55 mm (superior) and 2.16 mm (lateral). Conclusions The analysis also showed high levels of heterogeneity. Right and left comparison did not show statistically significant differences. Key words: Temporomandibular joint, systematic review, meta-analysis. PMID:26330944

  10. Bayesian approach to the analysis of neutron Brillouin scattering data on liquid metals

    NASA Astrophysics Data System (ADS)

    De Francesco, A.; Guarini, E.; Bafile, U.; Formisano, F.; Scaccia, L.

    2016-08-01

    When the dynamics of liquids and disordered systems at the mesoscopic level is investigated by means of inelastic scattering (e.g., neutron or x ray), spectra are often characterized by a poor definition of the excitation lines and spectroscopic features in general, and one important issue is to establish how many of these lines need to be included in the modeling function and to estimate their parameters. Furthermore, when strongly damped excitations are present, commonly used and widespread fitting algorithms are particularly affected by the choice of initial parameter values. An inadequate choice may lead to an inefficient exploration of the parameter space, resulting in the algorithm getting stuck in a local minimum. In this paper, we present a Bayesian approach to the analysis of neutron Brillouin scattering data in which the number of excitation lines is treated as unknown and estimated along with the other model parameters. We propose a joint estimation procedure based on a reversible-jump Markov chain Monte Carlo algorithm, which efficiently explores the parameter space, producing a probabilistic measure to quantify the uncertainty on the number of excitation lines as well as reliable parameter estimates. The method proposed could turn out to be of great importance in extracting physical information from experimental data, especially when the detection of spectral features is complicated not only by the properties of the sample, but also by the limited instrumental resolution and count statistics. The approach is tested on a generated data set and then applied to real experimental spectra of neutron Brillouin scattering from a liquid metal, previously analyzed in a more traditional way.

  11. Bayesian approach to the analysis of neutron Brillouin scattering data on liquid metals.

    PubMed

    De Francesco, A; Guarini, E; Bafile, U; Formisano, F; Scaccia, L

    2016-08-01

    When the dynamics of liquids and disordered systems at the mesoscopic level is investigated by means of inelastic scattering (e.g., neutron or x ray), spectra are often characterized by a poor definition of the excitation lines and spectroscopic features in general, and one important issue is to establish how many of these lines need to be included in the modeling function and to estimate their parameters. Furthermore, when strongly damped excitations are present, commonly used and widespread fitting algorithms are particularly affected by the choice of initial parameter values. An inadequate choice may lead to an inefficient exploration of the parameter space, resulting in the algorithm getting stuck in a local minimum. In this paper, we present a Bayesian approach to the analysis of neutron Brillouin scattering data in which the number of excitation lines is treated as unknown and estimated along with the other model parameters. We propose a joint estimation procedure based on a reversible-jump Markov chain Monte Carlo algorithm, which efficiently explores the parameter space, producing a probabilistic measure to quantify the uncertainty on the number of excitation lines as well as reliable parameter estimates. The method proposed could turn out to be of great importance in extracting physical information from experimental data, especially when the detection of spectral features is complicated not only by the properties of the sample, but also by the limited instrumental resolution and count statistics. The approach is tested on a generated data set and then applied to real experimental spectra of neutron Brillouin scattering from a liquid metal, previously analyzed in a more traditional way.

  12. Bayesian approach to the analysis of neutron Brillouin scattering data on liquid metals.

    PubMed

    De Francesco, A; Guarini, E; Bafile, U; Formisano, F; Scaccia, L

    2016-08-01

    When the dynamics of liquids and disordered systems at the mesoscopic level is investigated by means of inelastic scattering (e.g., neutron or x ray), spectra are often characterized by a poor definition of the excitation lines and spectroscopic features in general, and one important issue is to establish how many of these lines need to be included in the modeling function and to estimate their parameters. Furthermore, when strongly damped excitations are present, commonly used and widespread fitting algorithms are particularly affected by the choice of initial parameter values. An inadequate choice may lead to an inefficient exploration of the parameter space, resulting in the algorithm getting stuck in a local minimum. In this paper, we present a Bayesian approach to the analysis of neutron Brillouin scattering data in which the number of excitation lines is treated as unknown and estimated along with the other model parameters. We propose a joint estimation procedure based on a reversible-jump Markov chain Monte Carlo algorithm, which efficiently explores the parameter space, producing a probabilistic measure to quantify the uncertainty on the number of excitation lines as well as reliable parameter estimates. The method proposed could turn out to be of great importance in extracting physical information from experimental data, especially when the detection of spectral features is complicated not only by the properties of the sample, but also by the limited instrumental resolution and count statistics. The approach is tested on a generated data set and then applied to real experimental spectra of neutron Brillouin scattering from a liquid metal, previously analyzed in a more traditional way. PMID:27627410

  13. Bayesian Analysis of Non-Gaussian Long-Range Dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, Timothy; Watkins, Nicholas; Franzke, Christian; Gramacy, Robert

    2013-04-01

    Recent studies [e.g. the Antarctic study of Franzke, J. Climate, 2010] have strongly suggested that surface temperatures exhibit long-range dependence (LRD). The presence of LRD would hamper the identification of deterministic trends and the quantification of their significance. It is well established that LRD processes exhibit stochastic trends over rather long periods of time. Thus, accurate methods for discriminating between physical processes that possess long memory and those that do not are an important adjunct to climate modeling. As we briefly review, the LRD idea originated at the same time as H-self-similarity, so it is often not realised that a model does not have to be H-self-similar to show LRD [e.g. Watkins, GRL Frontiers, 2013]. We have used Markov Chain Monte Carlo algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average ARFIMA(p,d,q) processes, which are capable of modeling LRD. Our principal aim is to obtain inference about the long memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series. Many physical processes, for example the Faraday Antarctic time series, are significantly non-Gaussian. We have therefore extended this work by weakening the Gaussianity assumption, assuming an alpha-stable distribution for the innovations, and performing joint inference on d and alpha. Such a modified ARFIMA(p,d,q) process is a flexible, initial model for non-Gaussian processes with long memory. We will present a study of the dependence of the posterior variance of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other measures of d.
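
    As a frequentist point of comparison for the Bayesian machinery described above, the memory parameter d can be estimated with the classical Geweke-Porter-Hudak (GPH) log-periodogram regression. A minimal sketch that simulates an ARFIMA(0,d,0) series from its truncated MA(∞) expansion and recovers d; all tuning choices below are conventional, not the authors'.

```python
import numpy as np

def gph_estimate(x, m=None):
    """GPH estimate of d: regress the log-periodogram on -log(4 sin^2(w/2))."""
    n = len(x)
    m = m or int(n ** 0.5)                         # common bandwidth choice
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
    pgram = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / n
    regressor = -np.log(4.0 * np.sin(freqs / 2.0) ** 2)
    slope, _ = np.polyfit(regressor, np.log(pgram), 1)
    return slope

rng = np.random.default_rng(7)
d_true, n_coef, n_obs = 0.3, 2000, 4096
psi = np.ones(n_coef)                              # MA(inf) weights of ARFIMA(0,d,0)
for k in range(1, n_coef):
    psi[k] = psi[k - 1] * (k - 1 + d_true) / k     # psi_k = Gamma(k+d) / (Gamma(d) k!)
eps = rng.normal(size=n_obs + n_coef)
x = np.array([psi @ eps[t:t + n_coef][::-1] for t in range(n_obs)])
print(gph_estimate(x))                             # should land near d_true = 0.3
```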

  14. Joint spatial analysis of gastrointestinal infectious diseases.

    PubMed

    Held, Leonhard; Graziano, Giusi; Frank, Christina; Rue, Håvard

    2006-10-01

    A major obstacle in the spatial analysis of infectious disease surveillance data is the problem of under-reporting. This article investigates the possibility of inferring reporting rates through joint statistical modelling of several infectious diseases with different aetiologies. Once variation in under-reporting can be estimated, geographic risk patterns for infections associated with specific food vehicles may be discerned. We adapt the shared component model, proposed by Knorr-Held and Best for two chronic diseases and extended to more than two chronic diseases by Held et al. (Towards joint disease mapping. Statistical Methods in Medical Research 2005b; 14: 61-82), to the infectious disease setting. Our goal is to estimate a shared component, common to all diseases, which may be interpreted as representing the spatial variation in reporting rates. Additional components are introduced to describe the real spatial variation of the different diseases. Of course, this interpretation is only allowed under specific assumptions; in particular, the geographical variation in under-reporting should be similar for the diseases considered. In addition, it is vital that the data do not contain large local outbreaks, so an adjustment based on a time series method recently proposed by Held et al. (A statistical framework for the analysis of multivariate infectious disease surveillance data. Statistical Modelling 2005a; 5: 187-99) is made at a preliminary stage. We illustrate our approach through the analysis of gastrointestinal disease notification data obtained from the German infectious disease surveillance system, administered by the Robert Koch Institute in Berlin.

  15. Candidate gene association study in pediatric acute lymphoblastic leukemia evaluated by Bayesian network based Bayesian multilevel analysis of relevance

    PubMed Central

    2012-01-01

    Background We carried out a candidate gene association study in pediatric acute lymphoblastic leukemia (ALL) to identify possible genetic risk factors in a Hungarian population. Methods The results were evaluated with traditional statistical methods and with our newly developed Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA) method. We collected genomic DNA and clinical data from 543 children, who underwent chemotherapy due to ALL, and 529 healthy controls. Altogether 66 single nucleotide polymorphisms (SNPs) in 19 candidate genes were genotyped. Results With logistic regression, we identified 6 SNPs in the ARID5B and IKZF1 genes associated with increased risk of B-cell ALL, and two SNPs in the STAT3 gene, which decreased the risk of hyperdiploid ALL. Because the associated SNPs were in linkage in each gene, these associations corresponded to one signal per gene. The odds ratios (OR) associated with the tag SNPs were: OR = 1.69, P = 2.22×10⁻⁷ for rs4132601 (IKZF1), OR = 1.53, P = 1.95×10⁻⁵ for rs10821936 (ARID5B) and OR = 0.64, P = 2.32×10⁻⁴ for rs12949918 (STAT3). With the BN-BMLA we confirmed the findings of the frequentist-based method and obtained additional information about the nature of the relations between the SNPs and the disease. For example, rs10821936 in ARID5B and rs17405722 in STAT3 showed a weak interaction, and in the T-cell lineage sample group, gender showed a weak interaction with three SNPs in three genes. In the hyperdiploid patient group the BN-BMLA detected a strong interaction among SNPs in the NOTCH1, STAT1, STAT3 and BCL2 genes. Evaluating the survival rate of the patients with ALL, the BN-BMLA showed that besides risk groups and subtypes, genetic variations in the BAX and CEBPA genes might also influence the probability of survival of the patients. Conclusions In the present study we confirmed the roles of genetic variations in ARID5B and IKZF1 in the susceptibility to B-cell ALL

  16. A fully Bayesian before-after analysis of permeable friction course (PFC) pavement wet weather safety.

    PubMed

    Buddhavarapu, Prasad; Smit, Andre F; Prozzi, Jorge A

    2015-07-01

    Permeable friction course (PFC), a porous hot-mix asphalt, is typically applied to improve wet weather safety on high-speed roadways in Texas. In order to warrant expensive PFC construction, a statistical evaluation of its safety benefits is essential. Generally, the literature on the effectiveness of porous mixes in reducing wet-weather crashes is limited and often inconclusive. In this study, the safety effectiveness of PFC was evaluated using a fully Bayesian before-after safety analysis. First, two groups of road segments overlaid with PFC and non-PFC material were identified across Texas; the non-PFC or reference road segments selected were similar to their PFC counterparts in terms of site specific features. Second, a negative binomial data generating process was assumed to model the underlying distribution of crash counts of PFC and reference road segments to perform Bayesian inference on the safety effectiveness. A data-augmentation based computationally efficient algorithm was employed for a fully Bayesian estimation. The statistical analysis shows that PFC is not effective in reducing wet weather crashes. It should be noted that the findings of this study are in agreement with the existing literature, although these studies were not based on a fully Bayesian statistical analysis. Our study suggests that the safety effectiveness of PFC road surfaces, or any other safety infrastructure, largely relies on its interrelationship with the road user. The results suggest that the safety infrastructure must be properly used to reap the benefits of the substantial investments. PMID:25897515

  17. A Pragmatic Bayesian Perspective on Correlation Analysis - The exoplanetary gravity - stellar activity case

    NASA Astrophysics Data System (ADS)

    Figueira, P.; Faria, J. P.; Adibekyan, V. Zh.; Oshagh, M.; Santos, N. C.

    2016-05-01

    We apply the Bayesian framework to assess the presence of a correlation between two quantities. To do so, we estimate the probability distribution of the parameter of interest, ρ, characterizing the strength of the correlation. We provide an implementation of these ideas and concepts using the Python programming language and the pyMC module in a very short (~130 lines of code, heavily commented) and user-friendly program. We used this tool to assess the presence and properties of the correlation between planetary surface gravity and stellar activity level as measured by the log(R'_HK) indicator. The results of the Bayesian analysis are qualitatively similar to those obtained via p-value analysis, and support the presence of a correlation in the data. The results are more robust in their derivation and more informative, revealing interesting features such as asymmetric posterior distributions or markedly different credible intervals, and allowing for a deeper exploration. We encourage the reader interested in this kind of problem to apply our code to his/her own scientific problems. A full understanding of the Bayesian framework can only be gained through the insight that comes from handling priors, assessing the convergence of Monte Carlo runs, and a multitude of other practical problems. We hope to contribute so that Bayesian analysis becomes a tool in the toolkit of researchers, and that they come to understand its advantages and limitations by experience.
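
    The paper's own program uses pyMC; to keep the idea self-contained and independent of any particular pyMC version, the sketch below does the same job for standardized data with a plain random-walk Metropolis sampler: a flat prior on ρ over (-1, 1) and a bivariate-normal likelihood with unit variances. All data are simulated and hypothetical.

```python
import numpy as np

rng = np.random.default_rng(11)
true_rho = -0.6
data = rng.multivariate_normal([0, 0], [[1, true_rho], [true_rho, 1]], size=40)

def log_post(rho, xy):
    """Flat prior on (-1, 1) plus bivariate-normal log likelihood (unit variances)."""
    if not -1.0 < rho < 1.0:
        return -np.inf
    x, y = xy[:, 0], xy[:, 1]
    quad = np.sum(x ** 2 - 2.0 * rho * x * y + y ** 2) / (1.0 - rho ** 2)
    return -0.5 * len(x) * np.log(1.0 - rho ** 2) - 0.5 * quad

rho, lp, draws = 0.0, log_post(0.0, data), []
for _ in range(20000):
    prop = rho + rng.normal(scale=0.1)
    lp_prop = log_post(prop, data)
    if np.log(rng.random()) < lp_prop - lp:
        rho, lp = prop, lp_prop
    draws.append(rho)
draws = np.array(draws[4000:])
print(draws.mean(), np.percentile(draws, [2.5, 97.5]))  # posterior mean and 95% interval
```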

  18. A fully Bayesian before-after analysis of permeable friction course (PFC) pavement wet weather safety.

    PubMed

    Buddhavarapu, Prasad; Smit, Andre F; Prozzi, Jorge A

    2015-07-01

    Permeable friction course (PFC), a porous hot-mix asphalt, is typically applied to improve wet weather safety on high-speed roadways in Texas. In order to warrant expensive PFC construction, a statistical evaluation of its safety benefits is essential. Generally, the literature on the effectiveness of porous mixes in reducing wet-weather crashes is limited and often inconclusive. In this study, the safety effectiveness of PFC was evaluated using a fully Bayesian before-after safety analysis. First, two groups of road segments overlaid with PFC and non-PFC material were identified across Texas; the non-PFC or reference road segments selected were similar to their PFC counterparts in terms of site specific features. Second, a negative binomial data generating process was assumed to model the underlying distribution of crash counts of PFC and reference road segments to perform Bayesian inference on the safety effectiveness. A data-augmentation based computationally efficient algorithm was employed for a fully Bayesian estimation. The statistical analysis shows that PFC is not effective in reducing wet weather crashes. It should be noted that the findings of this study are in agreement with the existing literature, although these studies were not based on a fully Bayesian statistical analysis. Our study suggests that the safety effectiveness of PFC road surfaces, or any other safety infrastructure, largely relies on its interrelationship with the road user. The results suggest that the safety infrastructure must be properly used to reap the benefits of the substantial investments.

  19. A Pragmatic Bayesian Perspective on Correlation Analysis. The exoplanetary gravity - stellar activity case

    NASA Astrophysics Data System (ADS)

    Figueira, P.; Faria, J. P.; Adibekyan, V. Zh.; Oshagh, M.; Santos, N. C.

    2016-11-01

    We apply the Bayesian framework to assess the presence of a correlation between two quantities. To do so, we estimate the probability distribution of the parameter of interest, ρ, characterizing the strength of the correlation. We provide an implementation of these ideas and concepts using the Python programming language and the pyMC module in a very short (~130 lines of code, heavily commented) and user-friendly program. We used this tool to assess the presence and properties of the correlation between planetary surface gravity and stellar activity level as measured by the log(R'_HK) indicator. The results of the Bayesian analysis are qualitatively similar to those obtained via p-value analysis, and support the presence of a correlation in the data. The results are more robust in their derivation and more informative, revealing interesting features such as asymmetric posterior distributions or markedly different credible intervals, and allowing for a deeper exploration. We encourage the reader interested in this kind of problem to apply our code to his/her own scientific problems. A full understanding of the Bayesian framework can only be gained through the insight that comes from handling priors, assessing the convergence of Monte Carlo runs, and a multitude of other practical problems. We hope to contribute so that Bayesian analysis becomes a tool in the toolkit of researchers, and that they come to understand its advantages and limitations by experience.

  20. Bayesian dose-response analysis for epidemiological studies with complex uncertainty in dose estimation.

    PubMed

    Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian E; Simon, Steven L

    2016-02-10

    Most conventional risk analysis methods rely on a single best estimate of exposure per person, which does not allow for adjustment for exposure-related uncertainty. Here, we propose a Bayesian model averaging method to properly quantify the relationship between radiation dose and disease outcomes by accounting for shared and unshared uncertainty in estimated dose. Our Bayesian risk analysis method utilizes multiple realizations of sets (vectors) of doses generated by a two-dimensional Monte Carlo simulation method that properly separates shared and unshared errors in dose estimation. The exposure model used in this work is taken from a study of the risk of thyroid nodules among a cohort of 2376 subjects who were exposed to fallout from nuclear testing in Kazakhstan. We assessed the performance of our method through an extensive series of simulations and comparisons against conventional regression risk analysis methods. When the estimated doses contain relatively small amounts of uncertainty, the Bayesian method using multiple a priori plausible draws of dose vectors gave similar results to the conventional regression-based methods of dose-response analysis. However, when large and complex mixtures of shared and unshared uncertainties are present, the Bayesian method using multiple dose vectors had significantly lower relative bias than conventional regression-based risk analysis methods and better coverage, that is, a markedly increased capability to include the true risk coefficient within the 95% credible interval of the Bayesian-based risk estimate. An evaluation of the dose-response using our method is presented for an epidemiological study of thyroid disease following radiation exposure.
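
    The two-dimensional Monte Carlo step, the separation of shared and unshared dose errors, is the crux of the method and is easy to sketch. In the toy version below (all error magnitudes hypothetical, lognormal errors assumed), a shared multiplicative error is drawn once per dose-vector realization and applied to everyone, while unshared errors are drawn per person; each resulting vector would then be fed to the risk model and the fits combined by Bayesian model averaging.

```python
import numpy as np

rng = np.random.default_rng(5)
n_subjects, n_realizations = 2376, 100  # cohort size from the study; 100 dose vectors

true_dose = rng.lognormal(np.log(0.1), 1.0, n_subjects)  # illustrative "true" doses (Gy)
gsd_shared, gsd_unshared = 1.3, 1.5     # hypothetical geometric SDs of the two error types

dose_vectors = np.empty((n_realizations, n_subjects))
for r in range(n_realizations):
    shared = rng.lognormal(0.0, np.log(gsd_shared))                  # one draw for the cohort
    unshared = rng.lognormal(0.0, np.log(gsd_unshared), n_subjects)  # one draw per person
    dose_vectors[r] = true_dose * shared * unshared
# next step: fit the dose-response model to each row and average the fits (BMA)
```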

  1. Competing risk models in reliability systems, a Weibull distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described and explained as simply functioning or failed, whereas in many real situations failures may arise from many causes, depending on the age and environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed by Bayesian and maximum likelihood methods. The simulation results show that changing the true value of one parameter relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimates are better than the maximum likelihood ones. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
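
    The likelihood at the heart of such an analysis is worth writing out: with independent Weibull causes, a unit failing at time t from cause j contributes that cause's hazard at t times the joint survival of all causes. A minimal sketch with simulated, hypothetical parameters; maximum likelihood is shown, and a Bayesian version would add priors to the same log likelihood.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
shape, scale = np.array([1.5, 2.5]), np.array([100.0, 120.0])  # two hypothetical causes
t_causes = scale * rng.weibull(shape, size=(500, 2))
t_obs, cause = t_causes.min(axis=1), t_causes.argmin(axis=1)   # observe the earliest failure

def neg_log_lik(params):
    k, lam = np.exp(params[:2]), np.exp(params[2:])  # log-parameterized for positivity
    haz = (k[cause] / lam[cause]) * (t_obs / lam[cause]) ** (k[cause] - 1)
    log_surv = -np.sum((t_obs[:, None] / lam) ** k, axis=1)  # survival w.r.t. both causes
    return -np.sum(np.log(haz) + log_surv)

fit = minimize(neg_log_lik, x0=np.log([1.0, 1.0, 50.0, 50.0]),
               method="Nelder-Mead", options={"maxiter": 5000})
print(np.exp(fit.x))  # estimates of (shape1, shape2, scale1, scale2)
```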

  2. Toward an ecological analysis of Bayesian inferences: how task characteristics influence responses

    PubMed Central

    Hafenbrädl, Sebastian; Hoffrage, Ulrich

    2015-01-01

    In research on Bayesian inferences, the specific tasks, with their narratives and characteristics, are typically seen as exchangeable vehicles that merely transport the structure of the problem to research participants. In the present paper, we explore whether, and possibly how, task characteristics that are usually ignored influence participants’ responses in these tasks. We focus on both quantitative dimensions of the tasks, such as their base rates, hit rates, and false-alarm rates, as well as qualitative characteristics, such as whether the task involves a norm violation or not, whether the stakes are high or low, and whether the focus is on the individual case or on the numbers. Using a data set of 19 different tasks presented to 500 different participants who provided a total of 1,773 responses, we analyze these responses in two ways: first, on the level of the numerical estimates themselves, and second, on the level of various response strategies, Bayesian and non-Bayesian, that might have produced the estimates. We identified various contingencies, and most of the task characteristics had an influence on participants’ responses. Typically, this influence has been stronger when the numerical information in the tasks was presented in terms of probabilities or percentages, compared to natural frequencies – and this effect cannot be fully explained by a higher proportion of Bayesian responses when natural frequencies were used. One characteristic that did not seem to influence participants’ response strategy was the numerical value of the Bayesian solution itself. Our exploratory study is a first step toward an ecological analysis of Bayesian inferences, and highlights new avenues for future research. PMID:26300791
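
    The quantitative task dimensions discussed here (base rate, hit rate, false-alarm rate) fully determine the Bayesian solution. The following toy computation, with hypothetical mammography-style numbers, contrasts the probability-format calculation with the natural-frequency shortcut that participants tend to find easier.

```python
base_rate, hit_rate, false_alarm = 0.01, 0.80, 0.096  # hypothetical task parameters

# Bayes' rule in probability format
posterior = (base_rate * hit_rate) / (
    base_rate * hit_rate + (1 - base_rate) * false_alarm)
print(round(posterior, 3))  # ~0.078

# The same reasoning in natural frequencies: of 1000 people, 10 have the
# condition and 8 of them test positive; of the 990 without it, about 95
# test positive, so P(condition | positive) is 8 out of 103.
print(round(8 / (8 + 95), 3))
```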

  3. Doubly Bayesian Analysis of Confidence in Perceptual Decision-Making.

    PubMed

    Aitchison, Laurence; Bang, Dan; Bahrami, Bahador; Latham, Peter E

    2015-10-01

    Humans stand out from other animals in that they are able to explicitly report on the reliability of their internal operations. This ability, which is known as metacognition, is typically studied by asking people to report their confidence in the correctness of some decision. However, the computations underlying confidence reports remain unclear. In this paper, we present a fully Bayesian method for directly comparing models of confidence. Using a visual two-interval forced-choice task, we tested whether confidence reports reflect heuristic computations (e.g. the magnitude of sensory data) or Bayes optimal ones (i.e. how likely a decision is to be correct given the sensory data). In a standard design in which subjects were first asked to make a decision, and only then gave their confidence, subjects were mostly Bayes optimal. In contrast, in a less-commonly used design in which subjects indicated their confidence and decision simultaneously, they were roughly equally likely to use the Bayes optimal strategy or to use a heuristic but suboptimal strategy. Our results suggest that, while people's confidence reports can reflect Bayes optimal computations, even a small unusual twist or additional element of complexity can prevent optimality. PMID:26517475
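
    A minimal sketch of the contrast the study tests, assuming the standard equal-variance Gaussian model for a two-alternative task (Python): Bayes optimal confidence is the posterior probability that the decision is correct given the sensory sample, while a heuristic observer reads confidence off the raw magnitude of the sample. All parameter values are illustrative.

      import numpy as np

      mu, sigma = 1.0, 1.5            # hypothetical signal mean and sensory noise
      rng = np.random.default_rng(0)
      s = rng.choice([-mu, mu], size=5)          # true stimulus on each trial
      x = s + rng.normal(0.0, sigma, size=5)     # noisy internal sample
      choice = np.sign(x)                        # Bayes optimal decision rule

      # Bayes optimal confidence: P(choice correct | x) from the posterior odds
      conf_bayes = 1.0 / (1.0 + np.exp(-2.0 * mu * np.abs(x) / sigma**2))

      # heuristic confidence: a monotone function of |x| that ignores mu and sigma
      conf_heur = np.abs(x) / (np.abs(x) + 1.0)

      for xi, c, cb, ch in zip(x, choice, conf_bayes, conf_heur):
          print(f"x={xi:+.2f} choice={int(c):+d}  bayes={cb:.2f}  heuristic={ch:.2f}")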

  4. Analysis of multiple-view Bayesian classification for SAR ATR

    NASA Astrophysics Data System (ADS)

    Brown, Myron Z.

    2003-09-01

    Classification of targets in high-resolution synthetic aperture radar imagery is a challenging problem in practice, due to extended operating conditions such as obscuration, articulation, varied configurations and a host of camouflage, concealment and deception tactics. Due to radar cross-section variability, the ability to discriminate between targets also varies greatly with target aspect. Potential space-borne and air-borne sensor systems may eventually be exploited to provide products to the warfighter at tactically relevant timelines. With such potential systems in place, multiple views of a given target area may be available to support targeting. In this paper, we examine the aspect dependence of SAR target classification and develop a Bayesian classification approach that exploits multiple incoherent views of a target. We further examine several practical issues in the design of such a classifier and consider sensitivities and their implications for sensor planning. Experimental results indicating the benefits of aspect diversity for improving performance under extended operating conditions are shown using publicly released 1-foot SAR data from DARPA's MSTAR program.
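
    A minimal sketch of the fusion rule underlying the multiple-view classifier described above (Python): if the views are assumed conditionally independent given the target class, per-view log-likelihoods simply add before normalization. The class priors and log-likelihoods below are invented for illustration.

      import numpy as np

      log_prior = np.log(np.array([0.5, 0.3, 0.2]))   # three candidate target classes

      # log p(x_v | class) for each incoherent view v (illustrative values)
      log_lik = np.array([[-2.1, -3.0, -2.8],         # view at aspect 1
                          [-2.5, -2.2, -3.5],         # view at aspect 2
                          [-1.9, -2.9, -3.1]])        # view at aspect 3

      log_post = log_prior + log_lik.sum(axis=0)      # fuse the views
      post = np.exp(log_post - log_post.max())        # stabilize, then normalize
      post /= post.sum()
      print("fused posterior:", np.round(post, 3), "-> class", post.argmax())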

  5. Bayesian Analysis of Cosmic Ray Propagation: Evidence against Homogeneous Diffusion

    NASA Astrophysics Data System (ADS)

    Jóhannesson, G.; Ruiz de Austri, R.; Vincent, A. C.; Moskalenko, I. V.; Orlando, E.; Porter, T. A.; Strong, A. W.; Trotta, R.; Feroz, F.; Graff, P.; Hobson, M. P.

    2016-06-01

    We present the results of the most complete scan of the parameter space for cosmic ray (CR) injection and propagation. We perform a Bayesian search of the main GALPROP parameters, using the MultiNest nested sampling algorithm, augmented by the BAMBI neural network machine-learning package. This is the first study to separate out low-mass isotopes (p, p̄, and He) from the usual light elements (Be, B, C, N, and O). We find that the propagation parameters that best fit p, p̄, and He data are significantly different from those that fit light elements, including the B/C and ¹⁰Be/⁹Be secondary-to-primary ratios normally used to calibrate propagation parameters. This suggests that each set of species is probing a very different interstellar medium, and that the standard approach of calibrating propagation parameters using B/C can lead to incorrect results. We present posterior distributions and best-fit parameters for propagation of both sets of nuclei, as well as for the injection abundances of elements from H to Si. The input GALDEF files with these new parameters will be included in an upcoming public GALPROP update.

  6. Bayesian analysis of Lidar signals with multiple returns.

    PubMed

    Hernández-Marín, Sergio; Wallace, Andrew M; Gibson, Gavin J

    2007-12-01

    Time-Correlated Single Photon Counting and Burst Illumination Laser data can be used for range profiling and target classification. In general, the problem is to analyse the response from a histogram of either photon counts or integrated intensities to assess the number, positions and amplitudes of the reflected returns from object surfaces. The goal of our work is a complete characterisation of the 3D surfaces viewed by the laser imaging system. The authors present a unified theory of pixel processing that is applicable to both approaches based on a Bayesian framework which allows for careful and thorough treatment of all types of uncertainties associated with the data. We use reversible jump Markov chain Monte Carlo (RJMCMC) techniques to evaluate the posterior distribution of the parameters and to explore spaces with different dimensionality. Further, we use a delayed rejection step to allow the generated Markov chain to mix better through the use of different proposal distributions. The approach is demonstrated on simulated and real data, showing that the return parameters can be estimated to a high degree of accuracy. We also show some practical examples from both near and far range depth imaging. PMID:17934226

  7. Individual organisms as units of analysis: Bayesian-clustering alternatives in population genetics.

    PubMed

    Mank, Judith E; Avise, John C

    2004-12-01

    Population genetic analyses traditionally focus on the frequencies of alleles or genotypes in 'populations' that are delimited a priori. However, there are potential drawbacks of amalgamating genetic data into such composite attributes of assemblages of specimens: genetic information on individual specimens is lost or submerged as an inherent part of the analysis. A potential also exists for circular reasoning when a population's initial identification and subsequent genetic characterization are coupled. In principle, these problems are circumvented by some newer methods of population identification and individual assignment based on statistical clustering of specimen genotypes. Here we evaluate a recent method in this genre--Bayesian clustering--using four genotypic data sets involving different types of molecular markers in non-model organisms from nature. As expected, measures of population genetic structure (F_ST and Φ_ST) tended to be significantly greater in Bayesian a posteriori data treatments than in analyses where populations were delimited a priori. In the four biological contexts examined, which involved both geographic population structures and hybrid zones, Bayesian clustering was able to recover differentiated populations, and Bayesian assignments were able to identify likely population sources of specific individuals.

  8. An intake prior for the Bayesian analysis of plutonium and uranium exposures in an epidemiology study.

    PubMed

    Puncher, M; Birchall, A; Bull, R K

    2014-12-01

    In Bayesian inference, the initial knowledge regarding the value of a parameter, before additional data are considered, is represented as a prior probability distribution. This paper describes the derivation of a prior distribution of intake that was used for the Bayesian analysis of plutonium and uranium worker doses in a recent epidemiology study. The chosen distribution is log-normal with a geometric standard deviation of 6 and a median value that is derived for each worker based on the duration of the work history and the number of reported acute intakes. The median value is a function of the work history and a constant related to activity in air concentration, M, which is derived separately for uranium and plutonium. The value of M is based primarily on measurements of plutonium and uranium in air derived from historical personal air sampler (PAS) data. However, there is significant uncertainty on the value of M that results from paucity of PAS data and from extrapolating these measurements to actual intakes. This paper compares posterior and prior distributions of intake and investigates the sensitivity of the Bayesian analyses to the assumed value of M. It is found that varying M by a factor of 10 results in a much smaller factor of 2 variation in mean intake and lung dose for both plutonium and uranium. It is concluded that if a log-normal distribution is considered to adequately represent worker intakes, then the Bayesian posterior distribution of dose is relatively insensitive to the value assumed of M.
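
    A minimal sketch of the prior described above (Python): log-normal with geometric standard deviation 6 and a worker-specific median. The way the median is scaled from M and exposure duration below is a placeholder assumption; the paper derives it from the individual work history.

      import numpy as np

      gsd = 6.0
      sigma = np.log(gsd)                  # log-normal shape parameter

      M = 1.0                              # air-concentration constant (hypothetical units)
      years_exposed = 20.0
      median_intake = M * years_exposed    # assumed scaling of the worker median

      rng = np.random.default_rng(42)
      draws = rng.lognormal(mean=np.log(median_intake), sigma=sigma, size=100_000)

      lo, hi = np.percentile(draws, [2.5, 97.5])
      print(f"prior median ~{np.median(draws):.0f}, 95% interval {lo:.1f}-{hi:.0f}")
      print(f"the interval spans a factor of ~{hi / lo:.0f}")   # GSD 6 is very diffuse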

  9. Progressive Damage Analysis of Bonded Composite Joints

    NASA Technical Reports Server (NTRS)

    Leone, Frank A., Jr.; Girolamo, Donato; Davila, Carlos G.

    2012-01-01

    The present work is related to the development and application of progressive damage modeling techniques to bonded joint technology. The joint designs studied in this work include a conventional composite splice joint and a NASA-patented durable redundant joint. Both designs involve honeycomb sandwich structures with carbon/epoxy facesheets joined using adhesively bonded doublers. Progressive damage modeling allows for the prediction of the initiation and evolution of damage within a structure. For structures that include multiple material systems, such as the joint designs under consideration, the number of potential failure mechanisms that must be accounted for drastically increases the complexity of the analyses. Potential failure mechanisms include fiber fracture, intraply matrix cracking, delamination, core crushing, adhesive failure, and their interactions. The bonded joints were modeled using highly parametric, explicitly solved finite element models, with damage modeling implemented via custom user-written subroutines. Each ply was discretely meshed using three-dimensional solid elements. Layers of cohesive elements were included between each ply to account for the possibility of delaminations and were used to model the adhesive layers forming the joint. Good correlation with experimental results was achieved both in terms of load-displacement history and the predicted failure mechanism(s).

  10. Joint custody: a critical analysis and appraisal.

    PubMed

    Derdeyn, A P; Scott, E

    1984-04-01

    Joint custody, now in the statutes of over half the states, is becoming the legally preferred custody arrangement. It eases the burden on the courts, and is thought to benefit all family members. Research has indicated that children generally gain from maintaining a relationship with each parent; however, the assumption that joint custody promotes cooperation between divorced parents is a research question that has yet to be answered. There is a disparity between the strength of the joint custody movement and the sufficiency of evidence that this is the optimum approach for all children.

  11. Enhanced characterization of solid solitary pulmonary nodules with Bayesian analysis-based computer-aided diagnosis

    PubMed Central

    Perandini, Simone; Soardi, Gian Alberto; Motton, Massimiliano; Augelli, Raffaele; Dallaserra, Chiara; Puntel, Gino; Rossi, Arianna; Sala, Giuseppe; Signorini, Manuel; Spezia, Laura; Zamboni, Federico; Montemezzi, Stefania

    2016-01-01

    The aim of this study was to prospectively assess the accuracy gain of Bayesian analysis-based computer-aided diagnosis (CAD) vs human judgment alone in characterizing solitary pulmonary nodules (SPNs) at computed tomography (CT). The study included 100 randomly selected SPNs with a definitive diagnosis. Nodule features at first and follow-up CT scans as well as clinical data were evaluated individually on a 1 to 5 points risk chart by 7 radiologists, firstly blinded then aware of Bayesian Inference Malignancy Calculator (BIMC) model predictions. Raters’ predictions were evaluated by means of receiver operating characteristic (ROC) curve analysis and decision analysis. Overall ROC area under the curve was 0.758 before and 0.803 after the disclosure of CAD predictions (P = 0.003). A net gain in diagnostic accuracy was found in 6 out of 7 readers. Mean risk class of benign nodules dropped from 2.48 to 2.29, while mean risk class of malignancies rose from 3.66 to 3.92. Awareness of CAD predictions also determined a significant drop on mean indeterminate SPNs (15 vs 23.86 SPNs) and raised the mean number of correct and confident diagnoses (mean 39.57 vs 25.71 SPNs). This study provides evidence supporting the integration of the Bayesian analysis-based BIMC model in SPN characterization. PMID:27648166
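
    A minimal sketch of the reader-study ROC comparison described above (Python, scikit-learn), with synthetic 1-to-5 risk ratings standing in for the radiologists' actual scores; only the analysis pattern, not the numbers, mirrors the study.

      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(8)
      truth = np.repeat([0, 1], 50)    # 50 benign, 50 malignant nodules (synthetic)
      blinded = np.clip(np.round(2.5 + 1.0 * truth + rng.normal(0, 1.1, 100)), 1, 5)
      with_cad = np.clip(np.round(2.3 + 1.6 * truth + rng.normal(0, 1.0, 100)), 1, 5)

      print(f"AUC blinded:  {roc_auc_score(truth, blinded):.3f}")
      print(f"AUC with CAD: {roc_auc_score(truth, with_cad):.3f}")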

  12. Bayesian Statistical Analysis Applied to NAA Data for Neutron Flux Spectrum Determination

    NASA Astrophysics Data System (ADS)

    Chiesa, D.; Previtali, E.; Sisti, M.

    2014-04-01

    In this paper, we present a statistical method, based on Bayesian statistics, to evaluate the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation analysis (NAA) experiment [A. Borio di Tigliole et al., Absolute flux measurement by NAA at the Pavia University TRIGA Mark II reactor facilities, ENC 2012 - Transactions Research Reactors, ISBN 978-92-95064-14-0, 22 (2012)] performed at the TRIGA Mark II reactor of Pavia University (Italy). In order to evaluate the neutron flux spectrum, subdivided into energy groups, we must solve a system of linear equations containing the grouped cross sections and the activation rate data. We solve this problem with Bayesian statistical analysis, including the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov Chain Monte Carlo (MCMC) simulations, is used to define the statistical model of the problem and solve it. The energy group fluxes and their uncertainties are then determined with great accuracy and the correlations between the groups are analyzed. Finally, the dependence of the results on the prior distribution choice and on the group cross section data is investigated to confirm the reliability of the analysis.
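
    The inverse problem described above reduces to inferring group fluxes phi from activation rates R = A·phi, with A the grouped cross sections. The paper solves a hierarchical model by MCMC; the sketch below (Python) shows the conjugate linear-Gaussian special case, where the posterior mean, uncertainties, and group correlations come out in closed form. All matrices and values are illustrative, not reactor data.

      import numpy as np

      rng = np.random.default_rng(3)
      A = rng.uniform(0.1, 1.0, size=(6, 3))   # 6 isotopes x 3 energy groups
      phi_true = np.array([5.0, 2.0, 1.0])
      noise_sd = 0.05
      R = A @ phi_true + rng.normal(0, noise_sd, 6)   # measured activation rates

      mu0 = np.full(3, 4.0)                    # a priori flux guess
      S0inv = np.eye(3) / 4.0**2               # weak prior precision
      Ninv = np.eye(6) / noise_sd**2           # measurement precision

      Spost = np.linalg.inv(S0inv + A.T @ Ninv @ A)
      mupost = Spost @ (S0inv @ mu0 + A.T @ Ninv @ R)
      sd = np.sqrt(np.diag(Spost))
      print("posterior flux:", np.round(mupost, 2))
      print("posterior sd:  ", np.round(sd, 3))
      print("group correlations:\n", np.round(Spost / np.outer(sd, sd), 2))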

  13. A Bayesian Multinomial Probit Model for the Analysis of Panel Choice Data.

    PubMed

    Fong, Duncan K H; Kim, Sunghoon; Chen, Zhe; DeSarbo, Wayne S

    2016-03-01

    A new Bayesian multinomial probit model is proposed for the analysis of panel choice data. Using a parameter expansion technique, we are able to devise a Markov Chain Monte Carlo algorithm to compute our Bayesian estimates efficiently. We also show that the proposed procedure enables the estimation of individual level coefficients for the single-period multinomial probit model even when the available prior information is vague. We apply our new procedure to consumer purchase data and reanalyze a well-known scanner panel dataset, which reveals new substantive insights. In addition, we delineate a number of advantageous features of our proposed procedure over several benchmark models. Finally, through a simulation analysis employing a fractional factorial design, we demonstrate that the results from our proposed model are quite robust with respect to differing factors across various conditions.

  14. Development of a clinical pathways analysis system with adaptive Bayesian nets and data mining techniques.

    PubMed

    Kopec, D; Shagas, G; Reinharth, D; Tamang, S

    2004-01-01

    The use and development of software in the medical field offer tremendous opportunities for making health care delivery more efficient, more effective, and less error-prone. We discuss and explore the use of clinical pathways analysis with adaptive Bayesian networks and data mining techniques to perform such analyses. The computation of "lift" (a measure of the improvement potential of completed pathways) leads us to optimism regarding the potential of this approach.
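
    For reference, "lift" in the association-rule sense is P(B|A)/P(B): how much more likely outcome B becomes once pathway step A is present, relative to the base rate. A worked toy example (Python), with invented record counts:

      n_records = 1000
      n_A = 200      # records containing pathway step A
      n_B = 300      # records reaching outcome B
      n_AB = 90      # records containing both

      p_B = n_B / n_records
      p_B_given_A = n_AB / n_A
      lift = p_B_given_A / p_B
      print(f"P(B)={p_B:.2f}  P(B|A)={p_B_given_A:.2f}  lift={lift:.2f}")  # 1.50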

  15. Environmental Modeling and Bayesian Analysis for Assessing Human Health Impacts from Radioactive Waste Disposal

    NASA Astrophysics Data System (ADS)

    Stockton, T.; Black, P.; Tauxe, J.; Catlett, K.

    2004-12-01

    Bayesian decision analysis provides a unified framework for coherent decision-making. Two key components of Bayesian decision analysis are probability distributions and utility functions. Calculating posterior distributions and performing decision analysis can be computationally challenging, especially for complex environmental models. In addition, probability distributions and utility functions for environmental models must be specified through expert elicitation, stakeholder consensus, or data collection, all of which have their own set of technical and political challenges. Nevertheless, a grand appeal of the Bayesian approach for environmental decision-making is the explicit treatment of uncertainty, including expert judgment. The impact of expert judgment on the environmental decision process, though integral, goes largely unassessed. Regulations and orders of the Environmental Protection Agency, Department of Energy, and Nuclear Regulatory Commission require assessing the impact on human health of radioactive waste contamination over periods of up to ten thousand years. Toward this end, complex environmental simulation models are used to assess "risk" to human and ecological health from migration of radioactive waste. As the computational burden of environmental modeling is continually reduced, probabilistic process modeling using Monte Carlo simulation is becoming routinely used to propagate uncertainty from model inputs through model predictions. The utility of a Bayesian approach to environmental decision-making is discussed within the context of a buried radioactive waste example. This example highlights the desirability and difficulties of merging the cost of monitoring, the cost of the decision analysis, the cost and viability of clean up, and the probability of human health impacts within a rigorous decision framework.

  16. Nonlinear transient analysis of joint dominated structures

    NASA Technical Reports Server (NTRS)

    Chapman, J. M.; Shaw, F. H.; Russell, W. C.

    1987-01-01

    A residual force technique is presented that can perform the transient analyses of large, flexible, and joint dominated structures. The technique permits substantial size reduction in the number of degrees of freedom describing the nonlinear structural models and can account for such nonlinear joint phenomena as free-play and hysteresis. In general, joints can have arbitrary force-state map representations but these are used in the form of residual force maps. One essential feature of the technique is to replace the arbitrary force-state maps describing the nonlinear joints with residual force maps describing the truss links. The main advantage of this replacement is that the incrementally small relative displacements and velocities across a joint are not monitored directly thereby avoiding numerical difficulties. Instead, very small and 'soft' residual forces are defined giving a numerically attractive form for the equations of motion and thereby permitting numerically stable integration algorithms. The technique was successfully applied to the transient analyses of a large 58 bay, 60 meter truss having nonlinear joints. A method to perform link testing is also presented.

  1. Ultraviolet light inactivation of protozoa in drinking water: a Bayesian meta-analysis.

    PubMed

    Qian, Song S; Donnelly, Maureen; Schmelling, Daniel C; Messner, Michael; Linden, Karl G; Cotton, Christine

    2004-01-01

    To assess the dose of UV light needed to achieve specified levels of Giardia spp. cysts and Cryptosporidium spp. oocysts inactivation in drinking water, a Bayesian meta-analysis is used to analyze experimental data from several studies. Of the 20 studies identified by an extensive data collection effort, 14 (five reported experiments on Giardia and nine on Cryptosporidium) were selected for analysis based on a set of criteria. A substantial amount of the log inactivation data are reported as greater than a given inactivation level (i.e., censored data). The Bayesian hierarchical modeling approach used in this study not only properly addresses the common concerns in a meta-analysis but also provides a robust method for incorporating censored data. Different statistical models will result in different estimates of the UV doses needed to achieve a specific inactivation level. The Bayesian approach allows us to present the uncertainty in terms of risk, which is better suited for supporting US EPA in developing regulations.
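
    A minimal sketch of how the censored ("greater than") inactivation measurements enter the likelihood (Python): an exact observation contributes a density term, a censored one a tail probability. Normal errors around a linear UV dose-response are assumed here purely for illustration; the paper's hierarchical model is richer.

      import numpy as np
      from scipy.stats import norm

      def log_lik(slope, sigma, dose, y, censored):
          """y holds log10 inactivation; censored[i] marks y[i] as a lower bound."""
          mu = slope * dose                        # assumed linear dose-response
          ll_exact = norm.logpdf(y, mu, sigma)     # fully observed points
          ll_cens = norm.logsf(y, mu, sigma)       # P(Y > reported bound)
          return np.where(censored, ll_cens, ll_exact).sum()

      dose = np.array([5.0, 10.0, 20.0, 40.0])     # mJ/cm^2, illustrative
      y = np.array([1.2, 2.1, 3.9, 4.0])           # last value reported as ">4 log"
      censored = np.array([False, False, False, True])
      print(log_lik(slope=0.2, sigma=0.4, dose=dose, y=y, censored=censored))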

  2. 3D joint dynamics analysis of healthy children's gait.

    PubMed

    Samson, William; Desroches, Guillaume; Cheze, Laurence; Dumas, Raphaël

    2009-11-13

    The 3D joint moments and 2D joint powers have been widely explored in the literature on healthy children's gait, in particular to compare them with pathologic subjects' gait. However, no study has reported on 3D joint power in children, which could be due to the difficulty of interpreting the results. Recently, the analysis of the 3D angle between the joint moment and the joint angular velocity vectors has been proposed in order to help 3D joint power interpretation. Our hypothesis is that this 3D angle may help in characterizing the level of gait maturation. The present study explores 3D joint moments, 3D joint power and the proposed 3D angle for both children's and adults' gaits to highlight differences in the strategies used. The results seem to confirm that children have an alternative strategy of mainly ankle stabilization and hip propulsion, compared to the adults' strategy of mainly ankle resistance and propulsion and hip stabilization. In the future, the same 3D angle analysis should be applied to different age groups to better describe the evolution of 3D joint dynamic strategies during growth.
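
    The quantity at the center of the study is simple to compute: joint power is the dot product of the joint moment M and the joint angular velocity w, and the 3D angle between the two vectors is what separates propulsion, resistance, and stabilization. A worked example with invented vectors (Python):

      import numpy as np

      M = np.array([40.0, 5.0, -3.0])   # joint moment, N*m (illustrative)
      w = np.array([1.2, -0.4, 0.1])    # joint angular velocity, rad/s

      power = M @ w                     # 3D joint power, W
      cos_a = power / (np.linalg.norm(M) * np.linalg.norm(w))
      angle = np.degrees(np.arccos(cos_a))
      print(f"power = {power:.1f} W, 3D angle = {angle:.1f} deg")
      # angle near 0 deg: concentric propulsion; near 180 deg: eccentric
      # resistance; near 90 deg: little power flow, mainly joint stabilization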

  3. Bayesian Propensity Score Analysis: Simulation and Case Study

    ERIC Educational Resources Information Center

    Kaplan, David; Chen, Cassie J. S.

    2011-01-01

    Propensity score analysis (PSA) has been used in a variety of settings, such as education, epidemiology, and sociology. Most typically, propensity score analysis has been implemented within the conventional frequentist perspective of statistics. This perspective, as is well known, does not account for uncertainty in either the parameters of the…

  4. Bayesian estimation of dynamic matching function for U-V analysis in Japan

    NASA Astrophysics Data System (ADS)

    Kyo, Koki; Noda, Hideo; Kitagawa, Genshiro

    2012-05-01

    In this paper we propose a Bayesian method for analyzing unemployment dynamics. We derive a Beveridge curve for unemployment and vacancy (U-V) analysis from a Bayesian model based on a labor market matching function. In our framework, the efficiency of matching and the elasticities of new hiring with respect to unemployment and vacancy are regarded as time varying parameters. To construct a flexible model and obtain reasonable estimates in an underdetermined estimation problem, we treat the time varying parameters as random variables and introduce smoothness priors. The model is then described in a state space representation, enabling the parameter estimation to be carried out using Kalman filter and fixed interval smoothing. In such a representation, dynamic features of the cyclic unemployment rate and the structural-frictional unemployment rate can be accurately captured.
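
    A minimal random-walk Kalman filter in the spirit of the state space treatment described above (Python): a log-linear matching function log H = a + b·log U + c·log V whose coefficients are time-varying states with random-walk smoothness priors. The data and all variances below are synthetic placeholders, not Japanese labor statistics.

      import numpy as np

      rng = np.random.default_rng(7)
      T = 60
      logU = rng.normal(2.0, 0.2, T)               # log unemployment (synthetic)
      logV = rng.normal(1.5, 0.2, T)               # log vacancies (synthetic)
      logH = 0.3 + 0.5 * logU + 0.4 * logV + rng.normal(0, 0.05, T)  # log new hires

      x = np.zeros(3)                              # state: [a_t, b_t, c_t]
      P = np.eye(3)                                # state covariance
      Q = np.eye(3) * 1e-4                         # random-walk (smoothness) variance
      r = 0.05**2                                  # observation variance

      for t in range(T):
          P = P + Q                                # time update: random walk
          H = np.array([1.0, logU[t], logV[t]])    # observation vector
          S = H @ P @ H + r                        # innovation variance
          K = P @ H / S                            # Kalman gain
          x = x + K * (logH[t] - H @ x)            # measurement update
          P = P - np.outer(K, H @ P)

      print("filtered [a, b, c]:", np.round(x, 2))   # should be near 0.3, 0.5, 0.4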

  5. MorePower 6.0 for ANOVA with relational confidence intervals and Bayesian analysis.

    PubMed

    Campbell, Jamie I D; Thompson, Valerie A

    2012-12-01

    MorePower 6.0 is a flexible freeware statistical calculator that computes sample size, effect size, and power statistics for factorial ANOVA designs. It also calculates relational confidence intervals for ANOVA effects based on formulas from Jarmasz and Hollands (Canadian Journal of Experimental Psychology 63:124-138, 2009), as well as Bayesian posterior probabilities for the null and alternative hypotheses based on formulas in Masson (Behavior Research Methods 43:679-690, 2011). The program is unique in affording direct comparison of these three approaches to the interpretation of ANOVA tests. Its high numerical precision and ability to work with complex ANOVA designs could facilitate researchers' attention to issues of statistical power, Bayesian analysis, and the use of confidence intervals for data interpretation. MorePower 6.0 is available at https://wiki.usask.ca/pages/viewpageattachments.action?pageId=420413544 .

  6. Inference of posterior inclusion probability of QTLs in Bayesian shrinkage analysis.

    PubMed

    Yang, Deguang; Han, Shanshan; Jiang, Dan; Yang, Runqing; Fang, Ming

    2015-01-01

    Bayesian shrinkage analysis estimates all QTL effects simultaneously, shrinking the effects of "insignificant" QTLs close to zero so that no special model selection is needed. Bayesian shrinkage estimation usually performs excellently in multiple-QTL mapping, but it cannot give a probabilistic statement of how often a QTL is included in the model, the so-called posterior inclusion probability, which is important for assessing the importance of a QTL. In this research, two methods, FitMix and SimMix, are proposed to approximate these posterior probabilities. Under the assumption that the estimated QTL effects follow a mixture distribution, FitMix and SimMix fit the mixture distribution mathematically and intuitively, respectively. The simulation results showed that both methods gave very reasonable estimates of the posterior probabilities. We also applied the two methods to map QTLs in the North American Barley Genome Mapping Project data. PMID:25857576
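
    A rough sketch in the spirit of FitMix as summarized above, not the authors' code (Python, scikit-learn): fit a two-component mixture to the shrunken effect estimates and read each marker's posterior responsibility for the non-null component as an approximate inclusion probability. The effect sizes are simulated.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(21)
      null_fx = rng.normal(0.0, 0.02, 95)          # effects shrunk toward zero
      qtl_fx = rng.normal(0.6, 0.10, 5)            # a few real QTL effects
      effects = np.concatenate([null_fx, qtl_fx]).reshape(-1, 1)

      gm = GaussianMixture(n_components=2, random_state=0).fit(effects)
      null_comp = np.argmin(np.abs(gm.means_.ravel()))    # component nearest zero
      incl_prob = gm.predict_proba(effects)[:, 1 - null_comp]

      print("mean inclusion prob, null markers:", incl_prob[:95].mean().round(3))
      print("mean inclusion prob, true QTLs:  ", incl_prob[95:].mean().round(3))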

  7. Bayesian analysis of binary prediction tree models for retrospectively sampled outcomes.

    PubMed

    Pittman, Jennifer; Huang, Erich; Nevins, Joseph; Wang, Quanli; West, Mike

    2004-10-01

    Classification tree models are flexible analysis tools which have the ability to evaluate interactions among predictors as well as generate predictions for responses of interest. We describe Bayesian analysis of a specific class of tree models in which binary response data arise from a retrospective case-control design. We are also particularly interested in problems with potentially very many candidate predictors. This scenario is common in studies concerning gene expression data, which is a key motivating example context. Innovations here include the introduction of tree models that explicitly address and incorporate the retrospective design, and the use of nonparametric Bayesian models involving Dirichlet process priors on the distributions of predictor variables. The model specification influences the generation of trees through Bayes' factor based tests of association that determine significant binary partitions of nodes during a process of forward generation of trees. We describe this constructive process and discuss questions of generating and combining multiple trees via Bayesian model averaging for prediction. Additional discussion of parameter selection and sensitivity is given in the context of an example which concerns prediction of breast tumour status utilizing high-dimensional gene expression data; the example demonstrates the exploratory/explanatory uses of such models as well as their primary utility in prediction. Shortcomings of the approach and comparison with alternative tree modelling algorithms are also discussed, as are issues of modelling and computational extensions.

  8. Applications of Bayesian Procrustes shape analysis to ensemble radar reflectivity nowcast verification

    NASA Astrophysics Data System (ADS)

    Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang

    2016-07-01

    This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
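
    A minimal, non-Bayesian sketch of the Procrustes step underlying the method described above (Python, SciPy): align each ensemble member's object outline to a reference shape, removing shift, scale, and rotation, then average the aligned coordinates so the mean retains the object's morphology. The outlines are synthetic.

      import numpy as np
      from scipy.spatial import procrustes

      rng = np.random.default_rng(11)
      theta = np.linspace(0, 2 * np.pi, 20, endpoint=False)
      ref = np.column_stack([np.cos(theta), 0.6 * np.sin(theta)])  # reference outline

      aligned = []
      for _ in range(5):                           # five hypothetical ensemble members
          ang = rng.normal(0, 0.3)
          rot = np.array([[np.cos(ang), -np.sin(ang)],
                          [np.sin(ang), np.cos(ang)]])
          member = 1.5 * ref @ rot.T + rng.normal(0, 0.03, ref.shape) + [2.0, 1.0]
          _, m2, _ = procrustes(ref, member)       # remove shift/scale/rotation
          aligned.append(m2)

      mean_shape = np.mean(aligned, axis=0)        # shape-preserving ensemble mean
      print("mean outline, first 3 vertices:\n", np.round(mean_shape[:3], 3))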

  9. Bayesian methodology to estimate and update safety performance functions under limited data conditions: a sensitivity analysis.

    PubMed

    Heydari, Shahram; Miranda-Moreno, Luis F; Lord, Dominique; Fu, Liping

    2014-03-01

    In road safety studies, decision makers must often cope with limited data conditions. In such circumstances, the maximum likelihood estimation (MLE), which relies on asymptotic theory, is unreliable and prone to bias. Moreover, it has been reported in the literature that (a) Bayesian estimates might be significantly biased when using non-informative prior distributions under limited data conditions, and that (b) the calibration of limited data is plausible when existing evidence in the form of proper priors is introduced into analyses. Although the Highway Safety Manual (2010) (HSM) and other research studies provide calibration and updating procedures, the data requirements can be very taxing. This paper presents a practical and sound Bayesian method to estimate and/or update safety performance function (SPF) parameters combining the information available from limited data with the SPF parameters reported in the HSM. The proposed Bayesian updating approach has the advantage of requiring fewer observations to get reliable estimates. This paper documents this procedure. The adopted technique is validated by conducting a sensitivity analysis through an extensive simulation study with 15 different models, which include various prior combinations. This sensitivity analysis contributes to our understanding of the comparative aspects of a large number of prior distributions. Furthermore, the proposed method contributes to unification of the Bayesian updating process for SPFs. The results demonstrate the accuracy of the developed methodology. Therefore, the suggested approach offers considerable promise as a methodological tool to estimate and/or update baseline SPFs and to evaluate the efficacy of road safety countermeasures under limited data conditions.

  10. Bayesian sensitivity analysis of a nonlinear finite element model

    NASA Astrophysics Data System (ADS)

    Becker, W.; Oakley, J. E.; Surace, C.; Gili, P.; Rowson, J.; Worden, K.

    2012-10-01

    A major problem in uncertainty and sensitivity analysis is that the computational cost of propagating probabilistic uncertainty through large nonlinear models can be prohibitive when using conventional methods (such as Monte Carlo methods). A powerful solution to this problem is to use an emulator, which is a mathematical representation of the model built from a small set of model runs at specified points in input space. Such emulators are massively cheaper to run and can be used to mimic the "true" model, with the result that uncertainty analysis and sensitivity analysis can be performed for a greatly reduced computational cost. The work here investigates the use of an emulator known as a Gaussian process (GP), which is an advanced probabilistic form of regression. The GP is particularly suited to uncertainty analysis since it is able to emulate a wide class of models, and accounts for its own emulation uncertainty. Additionally, uncertainty and sensitivity measures can be estimated analytically, given certain assumptions. The GP approach is explained in detail here, and a case study of a finite element model of an airship is used to demonstrate the method. It is concluded that the GP is a very attractive way of performing uncertainty and sensitivity analysis on large models, provided that the dimensionality is not too high.
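
    A minimal sketch of the emulator workflow described above (Python, with scikit-learn's GP regression standing in for the paper's GP machinery): fit the emulator on a handful of expensive model runs, then push Monte Carlo input uncertainty through the cheap surrogate. The "expensive model" is a stand-in function.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      def expensive_model(x):                      # placeholder for the FE model
          return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

      rng = np.random.default_rng(5)
      X_train = rng.uniform(-1, 1, size=(25, 2))   # 25 "model runs"
      y_train = expensive_model(X_train)

      gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.5, 0.5]),
                                    normalize_y=True).fit(X_train, y_train)

      X_mc = rng.normal(0.0, 0.3, size=(50_000, 2))   # uncertain inputs
      y_mc, y_sd = gp.predict(X_mc, return_std=True)

      print(f"output mean {y_mc.mean():.3f}, output sd {y_mc.std():.3f}")
      print(f"mean emulation (code) uncertainty: {y_sd.mean():.3f}")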

  11. Bayesian meta-analysis, with application to studies of ETS and lung cancer.

    PubMed

    Tweedie, R L; Scott, D J; Biggerstaff, B J; Mengersen, K L

    1996-03-01

    Meta-analysis enables researchers to combine the results of several studies to assess the information they provide as a whole. It has been used to give a systematic overview of many areas in which data on a possible association between an exposure and an outcome have been collected in a number of studies but where the overall picture remains obscure, both as to the existence or size of the effect. This paper outlines some innovations in meta-analysis, based on using Markov chain Monte Carlo (MCMC) techniques for implementing Bayesian hierarchical models, and compares these with a more well-known random effects (RE) model. The new techniques allow different aspects of variation to be incorporated into descriptions of the association, and in particular enable researchers to better quantify differences between studies. Both the classical and Bayesian methods are applied, in this paper, to the current collection of studies of the association between incidence of lung cancer in female never-smokers and exposure to environmental tobacco smoke (ETS), both in the home through spousal smoking and in the workplace. In this paper it is demonstrated that compared with the RE model, the Bayesian methods: (a) allow more detailed modeling of study heterogeneity to be incorporated; (b) are relatively robust against a wide choice of specifications of such information on heterogeneity; (c) allow for more detailed and satisfactory statements to be made, not only about the overall risk but about the individual studies, on the basis of the combined information. For the workplace exposure data set, the Bayesian methods give a somewhat lower overall estimate of relative risk of lung cancer associated with ETS, indicating the care that needs to be taken in using point estimates based on any one method of analysis. On the larger spousal data set the methods give similar answers. Some of the other concerns with meta-analysis are also considered. These include: consistency between different

  12. Highly efficient Bayesian joint inversion for receiver-based data and its application to lithospheric structure beneath the southern Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Kim, Seongryong; Dettmer, Jan; Rhie, Junkee; Tkalčić, Hrvoje

    2016-07-01

    With the deployment of extensive seismic arrays, systematic and efficient parameter and uncertainty estimation is of increasing importance and can provide reliable, regional models for crustal and upper-mantle structure. We present an efficient Bayesian method for the joint inversion of surface-wave dispersion and receiver-function data that combines trans-dimensional (trans-D) model selection in an optimization phase with subsequent rigorous parameter uncertainty estimation. Parameter and uncertainty estimation depend strongly on the chosen parametrization such that meaningful regional comparison requires quantitative model selection that can be carried out efficiently at several sites. While significant progress has been made for model selection (e.g. trans-D inference) at individual sites, the lack of efficiency can prohibit application to large data volumes or cause questionable results due to lack of convergence. Studies that address large numbers of data sets have mostly ignored model selection in favour of more efficient/simple estimation techniques (i.e. focusing on uncertainty estimation but employing ad-hoc model choices). Our approach consists of a two-phase inversion that combines trans-D optimization to select the most probable parametrization with subsequent Bayesian sampling for uncertainty estimation given that parametrization. The trans-D optimization is implemented here by replacing the likelihood function with the Bayesian information criterion (BIC). The BIC provides constraints on model complexity that facilitate the search for an optimal parametrization. Parallel tempering (PT) is applied as an optimization algorithm. After optimization, the optimal model choice is identified by the minimum BIC value from all PT chains. Uncertainty estimation is then carried out in fixed dimension. Data errors are estimated as part of the inference problem by a combination of empirical and hierarchical estimation. Data covariance matrices are estimated from
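
    The BIC device described above is simple to state: during the optimization phase the likelihood is replaced by BIC = -2 log L + k log n, and the parametrization minimizing it wins. A toy illustration (Python), with polynomial order standing in for the number of model layers:

      import numpy as np

      rng = np.random.default_rng(4)
      n = 80
      x = np.linspace(-1, 1, n)
      y = 1.0 - 0.5 * x + 0.8 * x**2 + rng.normal(0, 0.1, n)   # truth: order 2

      for k in range(1, 7):                        # candidate model complexities
          coef = np.polyfit(x, y, k)
          resid = y - np.polyval(coef, x)
          sigma2 = resid @ resid / n
          log_l = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)  # Gaussian max log-lik
          bic = -2 * log_l + (k + 1) * np.log(n)
          print(f"order {k}: BIC = {bic:.1f}")     # minimum expected at order 2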

  13. Bayesian analysis of fingerprint, face and signature evidences with automatic biometric systems.

    PubMed

    Gonzalez-Rodriguez, Joaquin; Fierrez-Aguilar, Julian; Ramos-Castro, Daniel; Ortega-Garcia, Javier

    2005-12-20

    The Bayesian approach provides a unified and logical framework for the analysis of evidence and for delivering results in the form of likelihood ratios (LR) from the forensic laboratory to court. In this contribution we want to clarify how the biometric scientist or laboratory can adapt their conventional biometric systems or technologies to work according to this Bayesian approach. Forensic systems providing their results in the form of LR will be assessed through Tippett plots, which give a clear representation of the LR-based performance both for targets (the suspect is the author/source of the test pattern) and non-targets. However, the computation procedures of the LR values, especially with biometric evidence, are still an open issue. Reliable estimation techniques showing good generalization properties are required for the estimation of the between- and within-source variabilities of the test pattern, as are variance restriction techniques in the within-source density estimation to account for the variability of the source over time. Fingerprint, face and on-line signature recognition systems will be adapted to work according to this Bayesian approach, showing both the likelihood ratio range in each application and the adequacy of these biometric techniques for daily forensic work.
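
    A minimal sketch of score-based LR computation in the sense described above (Python, SciPy): model the same-source and different-source score densities, here with kernel density estimates on synthetic scores, and report their ratio at the score of the disputed comparison.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(9)
      same_scores = rng.normal(2.0, 0.7, 400)      # genuine (same-source) comparisons
      diff_scores = rng.normal(-1.0, 1.0, 4000)    # impostor (different-source)

      f_same = gaussian_kde(same_scores)           # within-source score density
      f_diff = gaussian_kde(diff_scores)           # between-source score density

      evidence_score = 1.2                         # score of the disputed comparison
      lr = f_same(evidence_score)[0] / f_diff(evidence_score)[0]
      print(f"LR = {lr:.1f}")                      # >1 supports same-source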

  14. Bayesian analysis of longitudinal Johne's disease diagnostic data without a gold standard test.

    PubMed

    Wang, C; Turnbull, B W; Nielsen, S S; Gröhn, Y T

    2011-05-01

    A Bayesian methodology was developed based on a latent change-point model to evaluate the performance of milk ELISA and fecal culture tests for longitudinal Johne's disease diagnostic data. The situation of no perfect reference test was considered; that is, no "gold standard." A change-point process with a Weibull survival hazard function was used to model the progression of the hidden disease status. The model adjusted for the fixed effects of covariate variables and random effects of subject on the diagnostic testing procedure. Markov chain Monte Carlo methods were used to compute the posterior estimates of the model parameters that provide the basis for inference concerning the accuracy of the diagnostic procedure. Based on the Bayesian approach, the posterior probability distribution of the change-point onset time can be obtained and used as a criterion for infection diagnosis. An application is presented to an analysis of ELISA and fecal culture test outcomes in the diagnostic testing of paratuberculosis (Johne's disease) for a Danish longitudinal study from January 2000 to March 2003. The posterior probability criterion based on the Bayesian model with 4 repeated observations has an area under the receiver operating characteristic curve (AUC) of 0.984, and is superior to the raw ELISA (AUC=0.911) and fecal culture (sensitivity=0.358, specificity=0.980) tests for Johne's disease diagnosis. PMID:21524521

  15. Inference on the Univariate Frailty Model: A Bayesian Reference Analysis Approach

    NASA Astrophysics Data System (ADS)

    Tomazella, Vera Lucia D.; Martins, Camila Bertini; Bernardo, Jose Miguel

    2008-11-01

    In this work we present an approach involving objective Bayesian reference analysis for the frailty model with univariate survival time and sources of heterogeneity that are not captured by covariates. The derivation of the unconditional hazard and survival functions leads to the Lomax distribution, also known as the Pareto distribution of the second kind. This distribution has an important place in life testing, for example in fitting data on business failures. Reference analysis, introduced by Bernardo (1979), produces a new solution for this problem. The results are illustrated with survival data analyzed in the literature and with simulated data.
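
    The Lomax link can be made explicit with a one-line marginalization; a sketch in LaTeX, assuming a constant baseline hazard \lambda and a Gamma(\alpha, \beta) frailty z acting multiplicatively on it:

      S(t) = \int_0^\infty e^{-z\lambda t}\,
             \frac{\beta^{\alpha} z^{\alpha-1} e^{-\beta z}}{\Gamma(\alpha)}\,dz
           = \left(\frac{\beta}{\beta+\lambda t}\right)^{\alpha}
           = \left(1+\frac{\lambda t}{\beta}\right)^{-\alpha},

    which is the survival function of the Lomax (Pareto type II) distribution.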

  16. A Bayesian 3D data fusion and unsupervised joint segmentation approach for stochastic geological modelling using Hidden Markov random fields

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Wellmann, Florian

    2016-04-01

    It is generally accepted that 3D geological models inferred from observed data will contain a certain amount of uncertainty. Uncertainty quantification and stochastic sampling methods are essential for gaining insight into the geological variability of subsurface structures. In the community of deterministic or traditional modelling techniques, classical geo-statistical methods using boreholes (hard data sets) are still the most widely accepted, although they suffer certain drawbacks. Modern geophysical measurements provide us with regional data sets in 2D or 3D spaces, either directly from sensors or indirectly from inverse problem solving using observed signals (soft data sets). We propose a stochastic modelling framework to extract subsurface heterogeneity from multiple and complementary types of data. In the presented work, subsurface heterogeneity is considered as the "hidden link" among multiple spatial data sets as well as inversion results. Hidden Markov random field models are employed to perform 3D segmentation, which is the representation of the "hidden link". Finite Gaussian mixture models are adopted to characterize the statistical parameters of the multiple data sets. The uncertainties are quantified via a Gibbs sampling process under the Bayesian inferential framework. The proposed modelling framework is validated using two numerical examples, and the model behavior and convergence are well examined. It is shown that the presented stochastic modelling framework is a promising tool for 3D data fusion in the communities of geological modelling and geophysics.

  17. Transdimensional Bayesian approach to pulsar timing noise analysis

    NASA Astrophysics Data System (ADS)

    Ellis, J. A.; Cornish, N. J.

    2016-04-01

    The modeling of intrinsic noise in pulsar timing residual data is of crucial importance for gravitational wave detection and pulsar timing (astro)physics in general. The noise budget in pulsars is a collection of several well-studied effects including radiometer noise, pulse-phase jitter noise, dispersion measure variations, and low-frequency spin noise. However, as pulsar timing data continue to improve, nonstationary and non-power-law noise terms are beginning to manifest which are not well modeled by current noise analysis techniques. In this work, we use a transdimensional approach to model these nonstationary and non-power-law effects through the use of a wavelet basis and an interpolation-based adaptive spectral modeling. In both cases, the number of wavelets and the number of control points in the interpolated spectrum are free parameters that are constrained by the data and then marginalized over in the final inferences, thus fully incorporating our ignorance of the noise model. We show that these new methods outperform standard techniques when nonstationary and non-power-law noise is present. We also show that these methods return results consistent with the standard analyses when no such signals are present.

  18. A Bayesian Analysis of the Correlations Among Sunspot Cycles

    NASA Astrophysics Data System (ADS)

    Yu, Y.; van Dyk, D. A.; Kashyap, V. L.; Young, C. A.

    2012-12-01

    Sunspot numbers form a comprehensive, long-duration proxy of solar activity and have been used numerous times to empirically investigate the properties of the solar cycle. A number of correlations have been discovered over the 24 cycles for which observational records are available. Here we carry out a sophisticated statistical analysis of the sunspot record that reaffirms these correlations, and sets up an empirical predictive framework for future cycles. An advantage of our approach is that it allows for rigorous assessment of both the statistical significance of various cycle features and the uncertainty associated with predictions. We summarize the data into three sequential relations that estimate the amplitude, duration, and time of rise to maximum for any cycle, given the values from the previous cycle. We find that there is no indication of a persistence in predictive power beyond one cycle, and we conclude that the dynamo does not retain memory beyond one cycle. Based on sunspot records up to October 2011, we obtain, for Cycle 24, an estimated maximum smoothed monthly sunspot number of 97±15, to occur in January - February 2014 ± six months.

  19. Hierarchical models and Bayesian analysis of bird survey information

    USGS Publications Warehouse

    Sauer, J.R.; Link, W.A.; Royle, J. Andrew; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Summary of bird survey information is a critical component of conservation activities, but often our summaries rely on statistical methods that do not accommodate the limitations of the information. Prioritization of species requires ranking and analysis of species by magnitude of population trend, but often magnitude of trend is a misleading measure of actual decline when trend is poorly estimated. Aggregation of population information among regions is also complicated by varying quality of estimates among regions. Hierarchical models provide a reasonable means of accommodating concerns about aggregation and ranking of quantities of varying precision. In these models the need to consider multiple scales is accommodated by placing distributional assumptions on collections of parameters. For collections of species trends, this allows probability statements to be made about the collections of species-specific parameters, rather than about the estimates. We define and illustrate hierarchical models for two commonly encountered situations in bird conservation: (1) Estimating attributes of collections of species estimates, including ranking of trends, estimating number of species with increasing populations, and assessing population stability with regard to predefined trend magnitudes; and (2) estimation of regional population change, aggregating information from bird surveys over strata. User-friendly computer software makes hierarchical models readily accessible to scientists.

  20. Majorana Demonstrator Bolted Joint Mechanical and Thermal Analysis

    SciTech Connect

    Aguayo Navarrete, Estanislao; Reid, Douglas J.; Fast, James E.

    2012-06-01

    The MAJORANA DEMONSTRATOR is designed to probe for neutrinoless double-beta decay, an extremely rare process with a half-life on the order of 10²⁶ years. The experiment uses an ultra-low background, high-purity germanium detector array. The germanium crystals are both the source and the detector in this experiment. Operating these crystals as ionizing radiation detectors requires keeping them under cryogenic conditions (below 90 K). A liquid nitrogen thermosyphon is used to extract the heat from the detectors. The detector channels are arranged in strings and thermally coupled to the thermosyphon through a cold plate. The cold plate is joined to the thermosyphon by a bolted joint. This circular plate is housed inside the cryostat can. This document provides a detailed study of the bolted joint that connects the cold plate and the thermosyphon. An analysis of the mechanical and thermal properties of this bolted joint is presented. The force applied to the joint is derived from the torque applied to each of the six bolts that form the joint. The thermal conductivity of the joint is measured as a function of applied force. The heat conductivity required for a successful experiment is the combination of the thermal conductivity of the detector string and of this joint. The thermal behavior of the joint is experimentally measured and analyzed in this study.
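
    A minimal sketch of the torque-to-force step mentioned above, using the standard short-form bolt relation T = K·F·d with empirical nut factor K (Python); every number below is an illustrative placeholder, not the DEMONSTRATOR's hardware values.

      torque = 5.0      # applied torque per bolt, N*m (hypothetical)
      K = 0.2           # assumed nut factor (dry threads)
      d = 0.006         # nominal bolt diameter, m (M6, assumed)
      n_bolts = 6

      F_bolt = torque / (K * d)        # preload per bolt, N
      F_joint = n_bolts * F_bolt       # total clamping force on the joint
      print(f"preload per bolt ~{F_bolt / 1e3:.1f} kN, joint total ~{F_joint / 1e3:.1f} kN")

      # thermal side: given a measured force-dependent joint conductance G(F) in W/K,
      # the temperature drop across the joint under heat load Q is dT = Q / G
      G = 0.8           # hypothetical conductance at this preload, W/K
      Q = 2.0           # heat load through the joint, W
      print(f"joint temperature drop ~{Q / G:.1f} K")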

  1. Bayesian population analysis of a harmonized physiologically based pharmacokinetic model of trichloroethylene and its metabolites.

    PubMed

    Hack, C Eric; Chiu, Weihsueh A; Jay Zhao, Q; Clewell, Harvey J

    2006-10-01

    Bayesian population analysis of a harmonized physiologically based pharmacokinetic (PBPK) model for trichloroethylene (TCE) and its metabolites was performed. In the Bayesian framework, prior information about the PBPK model parameters is updated using experimental kinetic data to obtain posterior parameter estimates. Experimental kinetic data measured in mice, rats, and humans were available for this analysis, and the resulting posterior model predictions were in better agreement with the kinetic data than prior model predictions. Uncertainty in the prediction of the kinetics of TCE, trichloroacetic acid (TCA), and trichloroethanol (TCOH) was reduced, while the kinetics of other key metabolites dichloroacetic acid (DCA), chloral hydrate (CHL), and dichlorovinyl mercaptan (DCVSH) remain relatively uncertain due to sparse kinetic data for use in this analysis. To help focus future research to further reduce uncertainty in model predictions, a sensitivity analysis was conducted to help identify the parameters that have the greatest impact on various internal dose metric predictions. For application to a risk assessment for TCE, the model provides accurate estimates of TCE, TCA, and TCOH kinetics. This analysis provides an important step toward estimating uncertainty of dose-response relationships in noncancer and cancer risk assessment, improving the extrapolation of toxic TCE doses from experimental animals to humans.

  2. Bayesian Finite Mixtures for Nonlinear Modeling of Educational Data.

    ERIC Educational Resources Information Center

    Tirri, Henry; And Others

    A Bayesian approach for finding latent classes in data is discussed. The approach uses finite mixture models to describe the underlying structure in the data and demonstrates that the possibility of using full joint probability models raises interesting new prospects for exploratory data analysis. The concepts and methods discussed are illustrated…

  3. Reusable Solid Rocket Motor Nozzle Joint-4 Thermal Analysis

    NASA Technical Reports Server (NTRS)

    Clayton, J. Louie

    2001-01-01

    This study provides for development and test verification of a thermal model used for prediction of joint heating environments, structural temperatures and seal erosions in the Space Shuttle Reusable Solid Rocket Motor (RSRM) Nozzle Joint-4. The heating environments are a result of rapid pressurization of the joint free volume assuming a leak path has occurred in the filler material used for assembly gap close out. Combustion gases flow along the leak path from nozzle environment to joint O-ring gland resulting in local heating to the metal housing and erosion of seal materials. Analysis of this condition was based on usage of the NASA Joint Pressurization Routine (JPR) for environment determination and the Systems Improved Numerical Differencing Analyzer (SINDA) for structural temperature prediction. Model generated temperatures, pressures and seal erosions are compared to hot fire test data for several different leak path situations. Investigated in the hot fire test program were nozzle joint-4 O-ring erosion sensitivities to leak path width in both open and confined joint geometries. Model predictions were in generally good agreement with the test data for the confined leak path cases. Worst case flight predictions are provided using the test-calibrated model. Analysis issues are discussed based on model calibration procedures.

  4. Bayesian Methods in Cosmology

    NASA Astrophysics Data System (ADS)

    Hobson, Michael P.; Jaffe, Andrew H.; Liddle, Andrew R.; Mukherjee, Pia; Parkinson, David

    2014-02-01

    Preface; Part I. Methods: 1. Foundations and algorithms John Skilling; 2. Simple applications of Bayesian methods D. S. Sivia and Steve Rawlings; 3. Parameter estimation using Monte Carlo sampling Antony Lewis and Sarah Bridle; 4. Model selection and multi-model inference Andrew R. Liddle, Pia Mukherjee and David Parkinson; 5. Bayesian experimental design and model selection forecasting Roberto Trotta, Martin Kunz, Pia Mukherjee and David Parkinson; 6. Signal separation in cosmology M. P. Hobson, M. A. J. Ashdown and V. Stolyarov; Part II. Applications: 7. Bayesian source extraction M. P. Hobson, Graça Rocha and R. Savage; 8. Flux measurement Daniel Mortlock; 9. Gravitational wave astronomy Neil Cornish; 10. Bayesian analysis of cosmic microwave background data Andrew H. Jaffe; 11. Bayesian multilevel modelling of cosmological populations Thomas J. Loredo and Martin A. Hendry; 12. A Bayesian approach to galaxy evolution studies Stefano Andreon; 13. Photometric redshift estimation: methods and applications Ofer Lahav, Filipe B. Abdalla and Manda Banerji; Index.

  6. Crash risk analysis for Shanghai urban expressways: A Bayesian semi-parametric modeling approach.

    PubMed

    Yu, Rongjie; Wang, Xuesong; Yang, Kui; Abdel-Aty, Mohamed

    2016-10-01

    Urban expressway systems have developed rapidly in recent years in China and have become a key part of city roadway networks, carrying large traffic volumes at high travel speeds. Along with the increase in traffic volume, traffic safety has become a major issue for Chinese urban expressways due to frequent crash occurrence and the non-recurrent congestion crashes cause. For the purpose of unveiling crash occurrence mechanisms and further developing Active Traffic Management (ATM) control strategies to improve traffic safety, this study developed disaggregate crash risk analysis models with loop detector traffic data and historical crash data. Bayesian random effects logistic regression models were utilized, as they can account for the unobserved heterogeneity among crashes. However, previous crash risk analysis studies formulated random effects distributions in a parametric approach, which assigned them to follow normal distributions. Given the limited information known about random effects distributions, such a subjective parametric setting may be incorrect. In order to construct more flexible and robust random effects to capture the unobserved heterogeneity, a Bayesian semi-parametric inference technique was introduced to crash risk analysis in this study. Models with both inference techniques were developed for total crashes; the semi-parametric models provided substantially better model goodness-of-fit, while the two models shared consistent coefficient estimations. Bayesian semi-parametric random effects logistic regression models were then developed for weekday peak hour crashes, weekday non-peak hour crashes, and weekend non-peak hour crashes to investigate different crash occurrence scenarios. Significant factors that affect crash risk are revealed and crash mechanisms are discussed. PMID:26847949

  8. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network

    PubMed Central

    2011-01-01

    Background Genome-scale metabolic network models have contributed to elucidating biological phenomena and to predicting gene targets to engineer for biotechnological applications. With their increasing importance, their precise network characterization has also been crucial for better understanding of cellular physiology. Results We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals the direction of influences among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to a specific perturbation using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic modular level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism, an lpdA gene knockout mutant, using its genome-scale metabolic network model. Conclusions Ultimately, it provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis. PMID:22784571

  9. Bayesian design and analysis of computer experiments: Use of derivatives in surface prediction

    SciTech Connect

    Morris, M.D.; Mitchell, T.J.; Ylvisaker, D.

    1991-06-01

    The work of Currin et al. and others in developing "fast predictive approximations" of computer models is extended for the case in which derivatives of the output variable of interest with respect to input variables are available. In addition to describing the calculations required for the Bayesian analysis, the issue of experimental design is also discussed, and an algorithm is described for constructing "maximin distance" designs. An example is given based on a demonstration model of eight inputs and one output, in which predictions based on a maximin design, a Latin hypercube design, and two "compromise" designs are evaluated and compared. 12 refs., 2 figs., 6 tabs.
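    A crude stand-in for a maximin distance construction is a random search that keeps the candidate design whose smallest pairwise distance is largest; real maximin algorithms use exchange or optimization steps, so the sketch below is only illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def maximin_design(n_runs, n_inputs, n_candidates=2000):
    """Pick the random design whose smallest pairwise distance is largest.

    A simple stand-in for a maximin construction algorithm; production
    methods use exchange/optimization rather than pure random search.
    """
    best, best_score = None, -np.inf
    for _ in range(n_candidates):
        X = rng.uniform(size=(n_runs, n_inputs))
        # minimum pairwise Euclidean distance of this candidate design
        d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        np.fill_diagonal(d2, np.inf)
        score = np.sqrt(d2.min())
        if score > best_score:
            best, best_score = X, score
    return best, best_score

X, score = maximin_design(n_runs=10, n_inputs=8)
print(f"10-run design in 8 inputs, min pairwise distance = {score:.3f}")
```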

  10. Bayesian analysis of heat pipe life test data for reliability demonstration testing

    SciTech Connect

    Bartholomew, R.J.; Martz, H.F.

    1985-01-01

    The demonstration testing duration requirements to establish a quantitative measure of assurance of expected lifetime for heat pipes were determined. The heat pipes are candidate devices for transporting heat generated in a nuclear reactor core to thermoelectric converters for use as a space-based electric power plant. A Bayesian analysis technique is employed, utilizing a limited Delphi survey and a geometric mean accelerated test criterion involving heat pipe power (P) and temperature (T). The resulting calculations indicate considerable test savings can be achieved by employing the method, but development testing to determine heat pipe failure mechanisms should not be circumvented.
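    The logic of a Bayesian reliability demonstration can be sketched with a conjugate gamma-exponential model and a zero-failure test; the prior and target rate below are illustrative, and the Delphi-elicited prior and power-temperature accelerated-test criterion of the study are not reproduced.

```python
from scipy.stats import gamma

# Conjugate sketch of a reliability demonstration argument. Exponential
# lifetimes with rate lam; prior lam ~ Gamma(a0, rate=b0), b0 in pipe-hours.
# All numbers are illustrative assumptions, not values from the study.
a0, b0 = 2.0, 2e5            # prior mean rate = a0/b0 = 1e-5 per hour

def assurance(test_hours, failures=0):
    """Posterior P(failure rate < 1e-5/h) after an accumulated test."""
    a, b = a0 + failures, b0 + test_hours
    # posterior is Gamma(a, rate=b); scipy parameterizes by scale = 1/b
    return gamma.cdf(1e-5, a, scale=1.0 / b)

# More accumulated failure-free pipe-hours -> higher assurance, so the
# required demonstration duration falls out of inverting this curve.
for hours in (0, 1e5, 5e5, 2e6):
    print(f"{hours:>9.0f} pipe-hours: P(lam < 1e-5/h) = {assurance(hours):.3f}")
```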

  11. Analysis of adhesively bonded composite lap joints

    SciTech Connect

    Tong, L.; Kuruppu, M.; Kelly, D.

    1994-12-31

    A new nonlinear formulation is developed for the governing equations for the shear and peel stresses in adhesively bonded composite double lap joints. The new formulation allows arbitrary nonlinear stress-strain characteristics in both shear and peel behavior. The equations are numerically integrated using a shooting technique and the Newton-Raphson method behind a user-friendly interface. Failure loads are predicted using the maximum stress, interlaminar delamination and energy density failure criteria. Numerical examples are presented to demonstrate the effect of the nonlinear adhesive behavior on the stress distribution and to predict the failure load and the associated mode.
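    The shooting technique named above can be illustrated on a simplified Volkersen-type shear-lag equation with a nonlinear (softening) adhesive law; the geometry, stiffness values, and boundary slopes below are hypothetical, and the actual formulation couples shear and peel stresses.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Illustrative shooting-method solution of a shear-lag equation for the
# adhesive shear strain g(x) along a bonded overlap:
#   g'' = (G(g) / k) * g, with a softening secant modulus G(g) standing in
# for an arbitrary nonlinear adhesive law. All values are hypothetical.
L, k = 25.0, 40.0                       # overlap length (mm), stiffness constant
G = lambda g: 1.0 / (1.0 + 5.0 * abs(g))  # nonlinear (softening) modulus
q = 0.02                                # load-proportional boundary slope

def rhs(x, y):
    g, dg = y
    return [dg, (G(g) / k) * g]

def residual(g0):
    """Shoot from x=0 with g(0)=g0, g'(0)=-q; return the miss in g'(L)=+q."""
    sol = solve_ivp(rhs, (0, L), [g0, -q], rtol=1e-8, atol=1e-10)
    return sol.y[1, -1] - q

g0 = brentq(residual, 1e-6, 1.0)        # starting strain matching both ends
sol = solve_ivp(rhs, (0, L), [g0, -q], dense_output=True, rtol=1e-8)
x = np.linspace(0, L, 6)
print("adhesive shear strain along the overlap:", sol.sol(x)[0].round(4))
```

    The strain profile peaks at the overlap ends and dips at mid-bond, the classic shear-lag shape; a full analysis would iterate this with Newton-Raphson on the coupled shear and peel equations.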

  12. Fuzzy Bayesian Network-Bow-Tie Analysis of Gas Leakage during Biomass Gasification

    PubMed Central

    Yan, Fang; Xu, Kaili; Yao, Xiwen; Li, Yang

    2016-01-01

    Biomass gasification technology has developed rapidly in recent years, but fire and poisoning accidents caused by gas leakage restrict the development and promotion of biomass gasification. Therefore, probabilistic safety assessment (PSA) is necessary for biomass gasification systems. Accordingly, Bayesian network-bow-tie (BN-bow-tie) analysis was proposed by mapping bow-tie analysis into a Bayesian network (BN). Causes of gas leakage and the accidents triggered by gas leakage can be obtained by bow-tie analysis, and the BN was used to confirm the critical nodes of accidents by introducing three corresponding importance measures. Meanwhile, occurrence probabilities of failure were needed for the PSA. In view of the insufficient failure data for biomass gasification, occurrence probabilities that cannot be obtained from standard reliability data sources were estimated by fuzzy methods based on expert judgment. An improved approach that considers expert weighting when aggregating fuzzy numbers, including triangular and trapezoidal numbers, was proposed, and the occurrence probabilities of failure were obtained. Finally, safety measures were indicated based on the identified critical nodes. The theoretical occurrence probabilities in one year of gas leakage and the accidents caused by it were reduced to 1/10.3 of their original values by these safety measures. PMID:27463975
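    The aggregation step can be sketched as a weighted combination of expert-elicited fuzzy numbers followed by a simple defuzzification; the opinions, weights, and four-point average rule below are illustrative, and the paper's improved aggregation approach is more involved.

```python
import numpy as np

# Weighted aggregation of expert-elicited fuzzy numbers into a crisp failure
# probability. Each opinion is stored as a trapezoid (a, m1, m2, b); a
# triangular number is the special case m1 == m2. Opinions, weights, and the
# defuzzification rule are illustrative assumptions.
opinions = np.array([
    [1e-4, 5e-4, 9e-4, 9e-4],   # expert 1, trapezoidal (core m1..m2)
    [2e-4, 4e-4, 4e-4, 8e-4],   # expert 2, triangular (m1 == m2)
    [1e-4, 3e-4, 3e-4, 6e-4],   # expert 3, triangular
])
weights = np.array([0.5, 0.3, 0.2])          # expert weighting

agg = weights @ opinions                     # weighted fuzzy aggregation
a, m1, m2, b = agg
# Simple four-point average defuzzification -> crisp probability
crisp = (a + m1 + m2 + b) / 4.0
print(f"aggregated fuzzy number: {agg.round(6)}")
print(f"defuzzified failure probability: {crisp:.2e}")
```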

  15. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    PubMed

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.

  16. Bayesian robustness in meta-analysis for studies with zero responses.

    PubMed

    Vázquez, F J; Moreno, E; Negrín, M A; Martel, M

    2016-05-01

    Statistical meta-analysis is mostly carried out with the help of the random effect normal model, including the case of discrete random variables. We argue that the normal approximation is not always able to adequately capture the underlying uncertainty of the original discrete data. Furthermore, when we examine the influence of the prior distributions considered, in the presence of rare events, the results from this approximation can be very poor. In order to assess the robustness of the quantities of interest in meta-analysis with respect to the choice of priors, this paper proposes an alternative Bayesian model for binomial random variables with several zero responses. Particular attention is paid to the coherence between the prior distributions of the study model parameters and the meta-parameter. Thus, our method introduces a simple way to examine the sensitivity of these quantities to the structure dependence selected for study. For illustrative purposes, an example with real data is analysed, using the proposed Bayesian meta-analysis model for binomial sparse data. PMID:26913715
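    The central point, that an exact binomial likelihood lets zero-event studies contribute without continuity corrections, can be sketched with a common-effect log-odds evaluated on a grid; the data and the normal prior below are illustrative, and the paper's hierarchical prior structure is richer.

```python
import numpy as np
from scipy.special import expit

# Exact-binomial sketch: zero-event studies enter the likelihood directly,
# with no continuity correction. A common-effect log-odds on a grid stands
# in for the paper's hierarchical model; data and prior are illustrative.
events = np.array([0, 1, 0, 2, 0])
n      = np.array([25, 30, 40, 28, 33])

theta = np.linspace(-9, 0, 2001)               # common log-odds grid
p = expit(theta)[:, None]
loglik = (events * np.log(p) + (n - events) * np.log1p(-p)).sum(axis=1)
logpost = loglik - 0.5 * (theta / 2.0) ** 2    # N(0, 2^2) prior (assumed)
post = np.exp(logpost - logpost.max())
post /= post.sum()                             # discrete posterior on grid

mean = (theta * post).sum()
cdf = np.cumsum(post)
lo = theta[np.searchsorted(cdf, 0.025)]
hi = theta[np.searchsorted(cdf, 0.975)]
print(f"posterior log-odds {mean:.2f} (95% CrI {lo:.2f} to {hi:.2f})")
print(f"implied event probability: {expit(mean):.4f}")
```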

  17. A Two-Step Bayesian Approach for Propensity Score Analysis: Simulations and Case Study

    ERIC Educational Resources Information Center

    Kaplan, David; Chen, Jianshen

    2012-01-01

    A two-step Bayesian propensity score approach is introduced that incorporates prior information in the propensity score equation and outcome equation without the problems associated with simultaneous Bayesian propensity score approaches. The corresponding variance estimators are also provided. The two-step Bayesian propensity score is provided for…

  18. A regressed phase analysis for coupled joint systems.

    PubMed

    Wininger, Michael

    2011-01-01

    This study aims to address shortcomings of the relative phase analysis, a widely used method for assessment of coupling among joints of the lower limb. Goniometric data from 15 individuals with spastic diplegic cerebral palsy were recorded from the hip and knee joints during ambulation on a flat surface, and from a single healthy individual with no known motor impairment, over at least 10 gait cycles. The minimum relative phase (MRP) revealed substantial disparity in the timing and severity of the instance of maximum coupling, depending on which reference frame was selected: MRP(knee-hip) differed from MRP(hip-knee) by 16.1±14% of the gait cycle in timing and by 50.6±77% in scale. Additionally, several relative phase portraits contained discontinuities which may contribute to error in phase feature extraction. These vagaries can be attributed to the predication of relative phase analysis on a transformation into the velocity-position phase plane, and the extraction of phase angle by the discontinuous arc-tangent operator. Here, an alternative phase analysis is proposed, wherein kinematic data is transformed into a profile of joint coupling across the entire gait cycle. By comparing joint velocities directly via a standard linear regression in the velocity-velocity phase plane, this regressed phase analysis provides several key advantages over relative phase analysis including continuity, commutativity between reference frames, and generalizability to many-joint systems.
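    The proposed idea, regressing joint velocities directly in the velocity-velocity plane, can be sketched on synthetic hip and knee traces; the signals and window length below are illustrative, not patient data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Sketch of a regressed phase analysis: compare joint velocities directly by
# linear regression in the velocity-velocity plane, yielding a coupling
# profile across the gait cycle. Synthetic angle traces stand in for
# goniometric recordings.
t = np.linspace(0, 1, 101)                     # one gait cycle
hip = 30 * np.sin(2 * np.pi * t) + rng.normal(0, 0.3, t.size)
knee = 60 * np.sin(2 * np.pi * t - 0.6) + rng.normal(0, 0.3, t.size)

vhip, vknee = np.gradient(hip, t), np.gradient(knee, t)

# Coupling profile: regression slope in a short sliding window. Windows
# where the hip velocity is nearly constant give unstable slopes, as any
# regression-based coupling measure would.
win = 10
slopes = np.array([
    np.polyfit(vhip[i:i + win], vknee[i:i + win], 1)[0]
    for i in range(t.size - win)
])
print("coupling (dknee/dhip) over the cycle, every 10th window:")
print(slopes[::10].round(2))
```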

  19. Finite element analysis of human joints

    SciTech Connect

    Bossart, P.L.; Hollerbach, K.

    1996-09-01

    Our work focuses on the development of finite element models (FEMs) that describe the biomechanics of human joints. Finite element modeling is becoming a standard tool in industrial applications. In highly complex problems such as those found in biomechanics research, however, the full potential of FEMs is just beginning to be explored, due to the absence of precise, high resolution medical data and the difficulties encountered in converting these enormous datasets into a form that is usable in FEMs. With increasing computing speed and memory available, it is now feasible to address these challenges. We address the first by acquiring data with a high resolution X-ray CT scanner and the latter by developing a semi-automated method for generating the volumetric meshes used in the FEM. Issues related to tomographic reconstruction, volume segmentation, the use of extracted surfaces to generate volumetric hexahedral meshes, and applications of the FEM are described.

  20. A Bayesian fingerprinting analysis for detection and attribution of changes in extreme flows

    NASA Astrophysics Data System (ADS)

    Hundecha, Yeshewatesfa; Merz, Bruno; Perdigão, Rui A. P.; Vorogushyn, Sergiy; Viglione, Alberto; Blöschl, Günter

    2014-05-01

    Fingerprinting analysis has been widely used in the detection and attribution problem within the climate community over the past several decades. In the approach, a field of a certain observed climate indicator is represented as a linear model of a signal pattern (fingerprint) that is simulated by a climate model under external forcing, plus a noise field, which represents a realisation of the internal climate variability. A scaling factor is introduced to adjust the amplitude of the signal pattern so that it matches the observations well. The scaling factor is optimally estimated to maximise the signal-to-noise ratio, thereby increasing the detectability of the signal due to a forced climate change. Many of the fingerprinting analyses reported in the literature are framed on classical statistical theory. Such an approach can give reliable results under the condition that the natural variability of the system and the uncertainties in the predicted signals under a given forcing can be quantified. If these uncertainties cannot be objectively estimated, interpretation of the results will mainly be guided by subjective judgement. Recent analyses have made a shift towards a Bayesian approach, which provides a quantitative framework for the integration of subjective prior information on the uncertainties into the statistical detection and attribution problem. Hasselmann (1998) reviews the fingerprinting approach based on the classical statistical framework and presents a generalisation of the approach to a Bayesian framework. Berliner et al. (2000) also present a formal Bayesian fingerprinting analytical framework for the detection and attribution problem. The potential applicability of the fingerprinting approach to the detection and attribution problem of extreme flows has been discussed in the opinion paper by Merz et al. (2012). Hundecha and Merz (2012) have also implemented an approach that is similar to the fingerprinting approach
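    The core regression, observation = scaling factor × fingerprint + noise, has a closed-form conjugate posterior when the noise level is treated as known; the pattern, noise level, and prior below are illustrative assumptions.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(7)

# Conjugate-normal sketch of the Bayesian fingerprinting regression
# y = beta * s + noise: the posterior of the scaling factor beta combines
# a model-simulated signal pattern s with a prior that encodes subjective
# uncertainty. Pattern, noise level, and prior values are illustrative.
n = 50
s = np.linspace(0, 1, n) ** 2                 # simulated fingerprint pattern
y = 0.8 * s + rng.normal(0, 0.15, n)          # "observed" indicator field

sigma2 = 0.15 ** 2                            # internal variability (known here)
b0, v0 = 0.0, 1.0                             # prior: beta ~ N(b0, v0)

v_post = 1.0 / (s @ s / sigma2 + 1.0 / v0)    # posterior variance of beta
b_post = v_post * (s @ y / sigma2 + b0 / v0)  # posterior mean of beta

# "Detection": posterior probability that the forced signal is present
p_detect = 0.5 * (1.0 + erf(b_post / sqrt(2.0 * v_post)))
print(f"posterior beta = {b_post:.2f} +/- {np.sqrt(v_post):.2f}")
print(f"P(beta > 0 | data) = {p_detect:.4f}")
```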

  1. Bayesian Gibbs Markov chain: MRF-based Stochastic Joint Inversion of Hydrological and Geophysical Datasets for Improved Characterization of Aquifer Heterogeneities.

    NASA Astrophysics Data System (ADS)

    Oware, E. K.

    2015-12-01

    Modeling aquifer heterogeneities (AH) is a complex, multidimensional problem that mostly requires stochastic imaging strategies for tractability. While the traditional Bayesian Markov chain Monte Carlo (McMC) provides a powerful framework to model AH, the generic McMC is computationally prohibitive and, thus, unappealing for large-scale problems. An innovative variant of the McMC scheme that imposes a priori spatial statistical constraints on model parameter updates, for improved characterization in a computationally efficient manner, is proposed. The proposed algorithm (PA) is based on Markov random field (MRF) modeling, which is an image processing technique that infers the global behavior of a random field from its local properties, making the MRF approach well suited for imaging AH. MRF-based modeling leverages the equivalence of the Gibbs (or Boltzmann) distribution (GD) and the MRF to identify the local properties of an MRF in terms of the easily quantifiable Gibbs energy. The PA employs a two-step approach to model the lithological structure of the aquifer and the hydraulic properties within the identified lithologies simultaneously. It performs local Gibbs energy minimizations along a random path, which requires the parameters of the GD (spatial statistics) to be specified. A PA that implicitly infers site-specific GD parameters within a Bayesian framework is also presented. The PA is illustrated with a synthetic binary facies aquifer with a lognormal heterogeneity simulated within each facies. GD parameters of 2.6, 1.2, -0.4, and -0.2 were estimated for the horizontal, vertical, NE-SW, and NW-SE directions, respectively. Most of the high hydraulic conductivity zones (facies 2) were fairly resolved, with facies identification accuracy rates of 81%, 89%, and 90% for the inversions conditioned on concentration (R1), resistivity (R2), and joint data (R3), respectively. The incorporation of the conditioning datasets improved on the root mean square error (RMSE
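    The local Gibbs update along a random path can be sketched for a binary facies grid with direction-specific interaction weights; the weights, grid size, and number of sweeps below are illustrative, and the full algorithm couples this step to hydraulic-property simulation and the conditioning data.

```python
import numpy as np

rng = np.random.default_rng(11)

# Sketch of the MRF/Gibbs idea: sweep a random path and update each cell
# from its local conditional, which depends only on neighbour agreement
# through directional Gibbs-energy weights. Weights and grid are illustrative
# (the study estimates direction-specific parameters within the inversion).
beta = {"H": 2.6, "V": 1.2}          # horizontal/vertical interaction weights
nx, nz = 40, 20
facies = rng.integers(0, 2, size=(nz, nx))

def local_energy(f, i, j, value):
    """Gibbs energy of placing `value` at (i, j): lower when neighbours agree."""
    e = 0.0
    if j > 0:      e -= beta["H"] * (f[i, j - 1] == value)
    if j < nx - 1: e -= beta["H"] * (f[i, j + 1] == value)
    if i > 0:      e -= beta["V"] * (f[i - 1, j] == value)
    if i < nz - 1: e -= beta["V"] * (f[i + 1, j] == value)
    return e

for sweep in range(50):
    order = rng.permutation(nx * nz)         # random path through the grid
    for cell in order:
        i, j = divmod(cell, nx)
        # Gibbs draw from the local conditional p(value) ~ exp(-E(value))
        w = np.exp([-local_energy(facies, i, j, v) for v in (0, 1)])
        facies[i, j] = rng.uniform() < w[1] / w.sum()

print("facies proportions:", np.bincount(facies.ravel()) / facies.size)
```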

  2. Critical composite joint subcomponents: Analysis and test results

    NASA Technical Reports Server (NTRS)

    Bunin, B. L.

    1983-01-01

    This program was conducted to develop the technology for critical structural joints of a composite wing structure meeting design requirements for a 1990 commercial transport aircraft. A prime objective of the program was to demonstrate the ability to reliably predict the strength of large bolted composite joints. Load sharing between bolts in multirow joints was computed by a nonlinear analysis program (A4FJ), which was used both to assess the efficiency of different joint design concepts and to predict the strengths of large test articles representing a section from a wing root chord-wise splice. In most cases, the predictions were accurate to within a few percent of the test results. A highlight of these tests was the consistent ability to achieve gross-section failure strains on the order of 0.005, which represents a considerable improvement over the state of the art. The improvement was attained largely as the result of the better understanding of the load sharing in multirow joints provided by the analysis. The typical load intensity on the structural joints was about 40 to 45 thousand pounds per inch in laminates having interspersed 37.5-percent 0-degree plies, 50-percent ±45-degree plies and 12.5-percent 90-degree plies. The composite material was Toray 300 fiber and Ciba-Geigy 914 resin, in the form of 0.010-inch thick unidirectional tape.

  3. Methods for the Joint Meta-Analysis of Multiple Tests

    ERIC Educational Resources Information Center

    Trikalinos, Thomas A.; Hoaglin, David C.; Small, Kevin M.; Terrin, Norma; Schmid, Christopher H.

    2014-01-01

    Existing methods for meta-analysis of diagnostic test accuracy focus primarily on a single index test. We propose models for the joint meta-analysis of studies comparing multiple index tests on the same participants in paired designs. These models respect the grouping of data by studies, account for the within-study correlation between the tests'…

  4. Evolutionary Analysis of Dengue Serotype 2 Viruses Using Phylogenetic and Bayesian Methods from New Delhi, India.

    PubMed

    Afreen, Nazia; Naqvi, Irshad H; Broor, Shobha; Ahmed, Anwar; Kazim, Syed Naqui; Dohare, Ravins; Kumar, Manoj; Parveen, Shama

    2016-03-01

    Dengue fever is the most important arboviral disease in the tropical and sub-tropical countries of the world. Delhi, the metropolitan capital state of India, has reported many dengue outbreaks, with the last outbreak occurring in 2013. We have recently reported a predominance of dengue virus serotype 2 during 2011-2014 in Delhi. In the present study, we report the molecular characterization and evolutionary analysis of dengue serotype 2 viruses detected in 2011-2014 in Delhi. Envelope genes of 42 DENV-2 strains were sequenced in the study. All DENV-2 strains grouped within the Cosmopolitan genotype and further clustered into three lineages: Lineage I, II and III. Lineage III replaced Lineage I during the dengue fever outbreak of 2013. Further, a novel mutation, Thr404Ile, was detected in the stem region of the envelope protein of a single DENV-2 strain in 2014. The nucleotide substitution rate and time to the most recent common ancestor were determined by molecular clock analysis using Bayesian methods. A change in the effective population size of Indian DENV-2 viruses was investigated through a Bayesian skyline plot. The study will be a vital road map for investigation of the epidemiology and evolutionary pattern of dengue viruses in India. PMID:26977703

  6. Bayesian flux balance analysis applied to a skeletal muscle metabolic model.

    PubMed

    Heino, Jenni; Tunyan, Knarik; Calvetti, Daniela; Somersalo, Erkki

    2007-09-01

    In this article, the steady state condition for the multi-compartment models for cellular metabolism is considered. The problem is to estimate the reaction and transport fluxes, as well as the concentrations in venous blood when the stoichiometry and bound constraints for the fluxes and the concentrations are given. The problem has been addressed previously by a number of authors, and optimization-based approaches as well as extreme pathway analysis have been proposed. These approaches are briefly discussed here. The main emphasis of this work is a Bayesian statistical approach to the flux balance analysis (FBA). We show how the bound constraints and optimality conditions such as maximizing the oxidative phosphorylation flux can be incorporated into the model in the Bayesian framework by proper construction of the prior densities. We propose an effective Markov chain Monte Carlo (MCMC) scheme to explore the posterior densities, and compare the results with those obtained via the previously studied linear programming (LP) approach. The proposed methodology, which is applied here to a two-compartment model for skeletal muscle metabolism, can be extended to more complex models. PMID:17568615
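    The steady-state constraint S v = 0 confines feasible fluxes to the null space of the stoichiometric matrix, so a bound-constrained prior can be explored there; the toy network, prior, and rejection step below are illustrative and much simpler than the MCMC scheme proposed in the article.

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(2)

# Flux vectors v satisfying steady-state stoichiometry S v = 0 live in the
# null space of S; a prior restricted to bound constraints can be explored
# by rejection sampling there. The toy network (3 metabolites, 5 reactions)
# and the Gaussian prior scale are illustrative assumptions.
S = np.array([[ 1, -1,  0, -1,  0],    # metabolite A: made by v1, used by v2, v4
              [ 0,  1, -1,  0,  0],    # metabolite B: made by v2, used by v3
              [ 0,  0,  0,  1, -1]],   # metabolite C: made by v4, used by v5
             dtype=float)
lb, ub = np.zeros(5), np.full(5, 10.0)

N = null_space(S)                            # basis of steady-state directions
samples = []
while len(samples) < 2000:
    v = N @ rng.normal(0, 5.0, N.shape[1])   # Gaussian prior on coefficients
    if np.all(v >= lb) and np.all(v <= ub):  # keep draws inside the bounds
        samples.append(v)
samples = np.array(samples)
print("flux means under the constrained prior:", samples.mean(axis=0).round(2))
print("corr(v1, v2):", np.corrcoef(samples[:, 0], samples[:, 1])[0, 1].round(2))
```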

  7. Risk analysis of emergent water pollution accidents based on a Bayesian Network.

    PubMed

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Sun, Jie

    2016-01-01

    To guarantee the security of water quality in water transfer channels, especially in open channels, analysis of potential emergent pollution sources in the water transfer process is critical. It is also indispensable for forewarning and protection against emergent pollution accidents. Bridges above open channels with large amounts of truck traffic are the main locations where emergent accidents could occur. A Bayesian Network model, which consists of six root nodes and three middle-layer nodes, was developed in this paper and was employed to identify the possibility of potential pollution risk. Dianbei Bridge is reviewed as a typical bridge on an open channel of the Middle Route of the South to North Water Transfer Project where emergent traffic accidents could occur. This study focuses on the risk of water pollution caused by leakage of pollutants into the water. The risk of potential traffic accidents at the Dianbei Bridge implies a risk of water pollution in the canal. Based on survey data, statistical analysis, and domain specialist knowledge, a Bayesian Network model was established. The human factor in emergent accidents has been considered in this model. Additionally, this model has been employed to describe the probability of accidents and the risk level. The sensitive causes of pollution accidents have been deduced, and the case in which sensitive factors are in the state most likely to lead to accidents has also been simulated. PMID:26433361
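    Inference in a small Bayesian network of this kind can be sketched by full enumeration; the structure and conditional probability tables below are illustrative, not the six-root-node model of the paper.

```python
import itertools

# Toy Bayesian network in the spirit of the bridge-accident model: two root
# causes feed an accident node, which drives pollutant leakage. The CPT
# numbers and structure are illustrative assumptions.
P_truck = {True: 0.3, False: 0.7}             # heavy truck traffic present
P_human = {True: 0.1, False: 0.9}             # human error (driver factor)
P_acc   = {(True, True): 0.20, (True, False): 0.02,
           (False, True): 0.05, (False, False): 0.001}
P_leak  = {True: 0.4, False: 0.0001}          # leakage given accident state

def joint(truck, human, acc, leak):
    """Joint probability of one full assignment of the four variables."""
    pa = P_acc[(truck, human)] if acc else 1 - P_acc[(truck, human)]
    pl = P_leak[acc] if leak else 1 - P_leak[acc]
    return P_truck[truck] * P_human[human] * pa * pl

# P(leak) by full enumeration, and diagnostic P(human error | leak)
states = list(itertools.product([True, False], repeat=3))
p_leak = sum(joint(t, h, a, True) for t, h, a in states)
p_h_leak = sum(joint(t, True, a, True)
               for t, a in itertools.product([True, False], repeat=2))
print(f"P(leak) = {p_leak:.5f}")
print(f"P(human error | leak) = {p_h_leak / p_leak:.3f}")
```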

  8. Bayesian data analysis of severe fatal accident risk in the oil chain.

    PubMed

    Eckle, Petrissa; Burgherr, Peter

    2013-01-01

    We analyze the risk of severe fatal accidents, causing five or more fatalities, for nine different activities covering the entire oil chain. Included are exploration and extraction, transport by different modes, refining, and final end use in power plants, heating or gas stations. The risks are quantified separately for OECD and non-OECD countries and trends are calculated. Risk is analyzed by employing a Bayesian hierarchical model yielding analytical functions for both frequency (Poisson) and severity distributions (Generalized Pareto) as well as frequency trends. This approach addresses a key problem in risk estimation, namely the scarcity of data, which results in high uncertainties in particular for the risk of extreme events, where the risk is extrapolated beyond the historically most severe accidents. Bayesian data analysis allows the pooling of information from different data sets covering, for example, the different stages of the energy chains or different modes of transportation. In addition, it also inherently delivers a measure of uncertainty. This approach provides a framework that comprehensively covers risk throughout the oil chain, allowing the allocation of risk in sustainability assessments. It also permits the progressive addition of new data to refine the risk estimates. Frequency, severity, and trends show substantial differences between the activities, emphasizing the need for detailed risk analysis. PMID:22642363
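    The two named building blocks can be sketched on synthetic accident records: a conjugate gamma posterior for the Poisson frequency, and a Generalized Pareto fit for severities above a five-fatality threshold. The severity fit here is a maximum-likelihood stand-in for the paper's Bayesian treatment, and all numbers are illustrative.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(5)

# Frequency: Poisson rate with conjugate Gamma prior -> Gamma posterior.
counts = np.array([3, 1, 4, 2, 0, 3, 2, 5, 1, 2])    # accidents per year
a0, b0 = 1.0, 0.5                                    # weak Gamma(a0, rate b0) prior
a_post, b_post = a0 + counts.sum(), b0 + counts.size
rate_draws = rng.gamma(a_post, 1.0 / b_post, 10000)
print(f"accident rate/yr: {rate_draws.mean():.2f} "
      f"(95% CrI {np.percentile(rate_draws, 2.5):.2f}"
      f"-{np.percentile(rate_draws, 97.5):.2f})")

# Severity: fatalities above the 5-fatality threshold, Generalized Pareto
# fit by maximum likelihood (a stand-in for the Bayesian severity model).
excess = np.array([2, 7, 1, 15, 3, 1, 44, 5, 2, 9])  # fatalities minus 5
shape, loc, scale = genpareto.fit(excess, floc=0.0)
# Extrapolated exceedance: P(fatalities > 100 | severe accident)
p_100 = genpareto.sf(100 - 5, shape, loc=0.0, scale=scale)
print(f"GPD shape={shape:.2f}, scale={scale:.1f}; P(>100 deaths)={p_100:.4f}")
```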

  11. A Semi-parametric Bayesian Approach for Differential Expression Analysis of RNA-seq Data

    PubMed Central

    Liu, Fangfang; Wang, Chong

    2016-01-01

    RNA-sequencing (RNA-seq) technologies have revolutionized the way agricultural biologists study gene expression as well as generated a tremendous amount of data waiting for analysis. Detecting differentially expressed genes is one of the fundamental steps in RNA-seq data analysis. In this paper, we model the count data from RNA-seq experiments with a Poisson-Gamma hierarchical model, or equivalently, a negative binomial (NB) model. We derive a semi-parametric Bayesian approach with a Dirichlet process as the prior model for the distribution of fold changes between the two treatment means. An inference strategy using Gibbs algorithm is developed for differential expression analysis. The results of several simulation studies show that our proposed method outperforms other methods including the popularly applied edgeR and DESeq methods. We also discuss an application of our method to a dataset that compares gene expression between bundle sheath and mesophyll cells in maize leaves. PMID:27570441

  12. Experimental and failure analysis of the prosthetic finger joint implants

    NASA Astrophysics Data System (ADS)

    Naidu, Sanjiv H.

    Small joint replacement arthroplasty of the hand is a well accepted surgical procedure to restore function and cosmesis in an individual with a crippled hand. Silicone elastomers have been used as prosthetic material in various small hand joints for well over three decades. Although the clinical science aspects of silicone elastomer failure are well known, the physical science aspects of prosthetic failure are scant and vague. In the following thesis, using both an animal model and actual retrieved specimens that have failed in human service, experimental and failure analyses of silicone finger joints are presented. Fractured surfaces of retrieved silicone trapezial implants and silicone finger joint implants were studied with both FESEM and SEM; the mode of failure for the silicone trapezium is wear polishing, whereas the finger joint implants failed either by fatigue fracture or tearing of the elastomer, or a combination of both. Thermal analysis revealed that the retrieved elastomer implants maintained their viscoelastic properties throughout the service period. In order to provide for a more functional and physiologic arthroplasty, a novel finger joint (Rolamite prosthesis) is proposed using more recently developed thermoplastic polymers. The thesis also addresses the outcome of experimental studies of the Rolamite prosthesis in a rabbit animal model, in addition to the failure analysis of the thermoplastic polymers while in service in an in vivo synovial environment. Results from retrieved Rolamite specimens suggest that the use of thermoplastic elastomers such as block-copolymer-based elastomers in a synovial environment such as a mammalian joint may very well be limited.

  13. Joint aspiration and injection and synovial fluid analysis.

    PubMed

    Courtney, Philip; Doherty, Michael

    2009-04-01

    Joint aspiration/injection and synovial fluid (SF) analysis are both invaluable procedures for the diagnosis and treatment of joint disease. This chapter addresses: (1) the indications, the technical principles and the expected benefits and risks of aspiration and injection of intra-articular corticosteroid; and (2) practical aspects relating to SF analysis, especially in relation to crystal identification. Intra-articular injection of long-acting insoluble corticosteroids is a well-established procedure that produces rapid pain relief and resolution of inflammation in most injected joints. The knee is the most common site to require aspiration, although any non-axial joint is accessible for obtaining SF. The technique requires a knowledge of basic anatomy and should not be unduly painful for the patient. Provided sterile equipment and a sensible, aseptic approach are used, it is very safe. Analysis of aspirated SF is helpful in the differential diagnosis of arthritis and is the definitive method for diagnosis of septic arthritis and crystal arthritis. The gross appearance of SF can provide useful diagnostic information in terms of the degree of joint inflammation and presence of haemarthrosis. Microbiological studies of SF are the key to the confirmation of infectious conditions. Increasing joint inflammation is associated with increased SF volume, reduced viscosity, increasing turbidity and cell count, and increasing ratio of polymorphonuclear: mononuclear cells, but such changes are non-specific and must be interpreted in the clinical setting. However, detection of SF monosodium urate and calcium pyrophosphate dihydrate crystals, even from un-inflamed joints during intercritical periods, allow a precise diagnosis of gout and of calcium pyrophosphate crystal-related arthritis. PMID:19393565

  14. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach

    PubMed Central

    Mohammadi, Tayeb; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donations and the number of blood deferrals has a major impact on blood transfusion. There is a positive correlation between the variables “number of blood donations” and “number of blood deferrals”: as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an excess zero frequency for both of the above-mentioned variables. In this study, in order to model the correlation and to account for the excess zeros, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using the Bayesian approach, applying noninformative priors in the presence and absence of covariates. Estimation of the model parameters, that is, the correlation, zero-inflation parameter, and regression coefficients, was done through MCMC simulation. Finally, the double-Poisson model, bivariate Poisson model, and bivariate zero-inflated Poisson model were fitted to the data and compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models. PMID:27703493

  15. Results and Analysis from Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Matty, Jennifer E.; Aitchison, Lindsay

    2009-01-01

    A space suit's mobility is critical to an astronaut's ability to perform work efficiently. As mobility increases, the astronaut can perform tasks for longer durations with less fatigue. The term mobility, with respect to space suits, is defined in terms of two key components: joint range of motion and joint torque. Individually these measures describe the path through which a joint travels and the force required to move it through that path. Previous space suit mobility requirements were defined as the collective result of these two measures and verified by the completion of discrete functional tasks. While a valid way to impose mobility requirements, such a method does necessitate a solid understanding of the operational scenarios in which the final suit will be performing. Because the Constellation space suit system requirements are being finalized with a relatively immature concept of operations, the Space Suit Element team elected to define mobility in terms of its constituent parts to increase the likelihood that the future pressure garment will be mobile enough to enable a broad scope of undefined exploration activities. The range of motion requirements were defined by measuring the ranges of motion test subjects achieved while performing a series of joint-maximizing tasks in a variety of flight and prototype space suits. The definition of joint torque requirements has proved more elusive. NASA evaluated several different approaches to the problem before deciding to generate requirements based on unmanned joint torque evaluations of six different space suit configurations being articulated through 16 separate joint movements. This paper discusses the experiment design, data analysis and results, and the process used to determine the final values for the Constellation pressure garment joint torque requirements.

  16. A Preliminary Bayesian Analysis of Incomplete Longitudinal Data from a Small Sample: Methodological Advances in an International Comparative Study of Educational Inequality

    ERIC Educational Resources Information Center

    Hsieh, Chueh-An; Maier, Kimberly S.

    2009-01-01

    The capacity of Bayesian methods in estimating complex statistical models is undeniable. Bayesian data analysis is seen as having a range of advantages, such as an intuitive probabilistic interpretation of the parameters of interest, the efficient incorporation of prior information to empirical data analysis, model averaging and model selection.…

  17. Bayesian analysis of response to selection: a case study using litter size in Danish Yorkshire pigs.

    PubMed Central

    Sorensen, D; Vernersen, A; Andersen, S

    2000-01-01

    Implementation of a Bayesian analysis of a selection experiment is illustrated using litter size [total number of piglets born (TNB)] in Danish Yorkshire pigs. Other traits studied include average litter weight at birth (WTAB) and the proportion of piglets born dead (PRBD). Response to selection for TNB was analyzed with a number of models, which differed in their level of hierarchy, in their prior distributions, and in the parametric form of the likelihoods. A model assessment study favored a particular form of an additive genetic model. With this model, the Monte Carlo estimate of the 95% probability interval of response to selection was (0.23; 0.60), with a posterior mean of 0.43 piglets. WTAB showed a correlated response of -7.2 g, with a 95% probability interval equal to (-33.1; 18.9). The posterior mean of the genetic correlation between TNB and WTAB was -0.23, with a 95% probability interval equal to (-0.46; -0.01). PRBD was studied informally; it increases with larger litters once litter size exceeds 7 piglets. A number of methodological issues related to the Bayesian model assessment study are discussed, as well as the genetic consequences of inferring response to selection using additive genetic models. PMID:10978292

  18. A Bayesian analysis of the 69 highest energy cosmic rays detected by the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Khanin, Alexander; Mortlock, Daniel J.

    2016-08-01

    The origins of ultrahigh energy cosmic rays (UHECRs) remain an open question. Several attempts have been made to cross-correlate the arrival directions of the UHECRs with catalogues of potential sources, but no definite conclusion has been reached. We report a Bayesian analysis of the 69 events from the Pierre Auger Observatory (PAO) that aims to determine the fraction of the UHECRs that originate from known AGNs in the Véron-Cetty & Véron (VCV) catalogue, as well as AGNs detected with the Swift Burst Alert Telescope (Swift-BAT), galaxies from the 2MASS Redshift Survey (2MRS), and an additional volume-limited sample of 17 nearby AGNs. The study makes use of a multilevel Bayesian model of UHECR injection, propagation and detection. We find that for reasonable ranges of prior parameters the Bayes factors disfavour a purely isotropic model. For fiducial values of the model parameters, we report 68 per cent credible intervals for the fraction of source-originating UHECRs of 0.09^{+0.05}_{-0.04}, 0.25^{+0.09}_{-0.08}, 0.24^{+0.12}_{-0.10}, and 0.08^{+0.04}_{-0.03} for the VCV, Swift-BAT and 2MRS catalogues, and the sample of 17 AGNs, respectively.

  19. Variational Bayesian mixture of experts models and sensitivity analysis for nonlinear dynamical systems

    NASA Astrophysics Data System (ADS)

    Baldacchino, Tara; Cross, Elizabeth J.; Worden, Keith; Rowson, Jennifer

    2016-01-01

    Most physical systems in reality exhibit a nonlinear relationship between input and output variables. This nonlinearity can manifest itself in terms of piecewise continuous functions or bifurcations, between some or all of the variables. The aims of this paper are two-fold. Firstly, a mixture of experts (MoE) model was trained on different physical systems exhibiting these types of nonlinearities. MoE models separate the input space into homogeneous regions and a different expert is responsible for the different regions. In this paper, the experts were low order polynomial regression models, thus avoiding the need for high-order polynomials. The model was trained within a Bayesian framework using variational Bayes, whereby a novel approach within the MoE literature was used in order to determine the number of experts in the model. Secondly, Bayesian sensitivity analysis (SA) of the systems under investigation was performed using the identified probabilistic MoE model in order to assess how uncertainty in the output can be attributed to uncertainty in the different inputs. The proposed methodology was first tested on a bifurcating Duffing oscillator, and it was then applied to real data sets obtained from the Tamar and Z24 bridges. In all cases, the MoE model was successful in identifying bifurcations and different physical regimes in the data by accurately dividing the input space; including identifying boundaries that were not parallel to coordinate axes.

  20. Critically evaluating the theory and performance of Bayesian analysis of macroevolutionary mixtures

    PubMed Central

    Moore, Brian R.; Höhna, Sebastian; May, Michael R.; Rannala, Bruce; Huelsenbeck, John P.

    2016-01-01

    Bayesian analysis of macroevolutionary mixtures (BAMM) has recently taken the study of lineage diversification by storm. BAMM estimates the diversification-rate parameters (speciation and extinction) for every branch of a study phylogeny and infers the number and location of diversification-rate shifts across branches of a tree. Our evaluation of BAMM reveals two major theoretical errors: (i) the likelihood function (which estimates the model parameters from the data) is incorrect, and (ii) the compound Poisson process prior model (which describes the prior distribution of diversification-rate shifts across branches) is incoherent. Using simulation, we demonstrate that these theoretical issues cause statistical pathologies; posterior estimates of the number of diversification-rate shifts are strongly influenced by the assumed prior, and estimates of diversification-rate parameters are unreliable. Moreover, the inability to correctly compute the likelihood or to correctly specify the prior for rate-variable trees precludes the use of Bayesian approaches for testing hypotheses regarding the number and location of diversification-rate shifts using BAMM. PMID:27512038

  1. Assessing State Nuclear Weapons Proliferation: Using Bayesian Network Analysis of Social Factors

    SciTech Connect

    Coles, Garill A.; Brothers, Alan J.; Olson, Jarrod; Whitney, Paul D.

    2010-04-16

    A Bayesian network (BN) model of social factors can support proliferation assessments by estimating the likelihood that a state will pursue a nuclear weapon. Social factors, including political, economic, nuclear capability, security, and national identity and psychology factors, may play as important a role in whether a state pursues nuclear weapons as more physical factors. This paper will show how Bayesian reasoning applied to a generic case of a would-be proliferator state can be used to combine evidence that supports proliferation assessment. Theories and analysis by political scientists can be leveraged in a quantitative and transparent way to indicate proliferation risk. BN models facilitate diagnosis and inference in a probabilistic environment by using a network of nodes and acyclic directed arcs between them, whose presence or absence indicates probabilistic relevance or independence. We propose a BN model that would use information from both traditional safeguards and the strengthened safeguards associated with the Additional Protocol to indicate countries with a high risk of proliferating nuclear weapons. This model could be used in a variety of applications, such as a prioritization tool or a component of state safeguards evaluations. This paper will discuss the benefits of BN reasoning, the development of Pacific Northwest National Laboratory's (PNNL) BN state proliferation model and how it could be employed as an analytical tool.
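
    A minimal sketch of the kind of inference such a model performs, using a hypothetical three-node network (motivation and capability as parents of pursuit) with invented conditional probabilities; the posterior is computed by brute-force enumeration rather than with a BN library.

      # Hypothetical three-node network: Motivation (M) and Capability (C)
      # are parents of Pursuit (P).  All numbers are invented for illustration.
      P_M = {True: 0.3, False: 0.7}                        # P(M)
      P_C = {True: 0.2, False: 0.8}                        # P(C)
      P_P = {(True, True): 0.80, (True, False): 0.30,      # P(P=1 | M, C)
             (False, True): 0.10, (False, False): 0.02}

      def joint(m, c, p):
          pp = P_P[(m, c)]
          return P_M[m] * P_C[c] * (pp if p else 1.0 - pp)

      # Posterior P(P | C = True), marginalising M by enumeration.
      num = sum(joint(m, True, True) for m in (True, False))
      den = sum(joint(m, True, p) for m in (True, False) for p in (True, False))
      print(f"P(pursuit | capability observed) = {num / den:.3f}")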

  2. Bayesian Neural Networks for Uncertainty Analysis of Hydrologic Modeling: A Comparison of Two Schemes

    SciTech Connect

    Zhang, Xuesong; Zhao, Kaiguang

    2012-06-01

    Bayesian Neural Networks (BNNs) have been shown to be useful tools for analyzing the modeling uncertainty of Neural Networks (NNs). This research focuses on the comparison of two BNNs. The first (BNN-I) uses statistical methods to describe the characteristics of different uncertainty sources (input, parameter, and model structure) and integrates these uncertainties into a Markov Chain Monte Carlo (MCMC) framework to estimate total uncertainty. The second (BNN-II) lumps all uncertainties into a single error term (i.e. the residual between model prediction and measurement). In this study, we propose a simple BNN-II, which uses Genetic Algorithms (GA) and Bayesian Model Averaging (BMA) to calibrate Neural Networks with different structures (numbers of hidden units) and combines the predictions from the different NNs to derive predictions and uncertainty estimates. We tested these two BNNs in two watersheds for daily and monthly hydrologic simulation. The BMA-based BNN-II developed in this study outperforms BNN-I in the two watersheds in terms of both prediction accuracy and uncertainty estimation. These results show that, given incomplete understanding of the characteristics associated with each uncertainty source, the simple lumped error approach may yield better prediction and uncertainty estimation.
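
    The sketch below illustrates only the BMA combination step, with polynomial regressions of different orders standing in for NNs with different numbers of hidden units, and BIC-based weights standing in for the GA-calibrated ensemble; the data generator is invented.

      import numpy as np

      rng = np.random.default_rng(2)

      # Toy "rainfall-runoff" data; polynomials of different order stand in
      # for neural networks of different complexity.
      x = rng.uniform(0, 1, 120)
      y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(x.size)

      models, bic = [], []
      for order in (1, 3, 5, 7):
          c = np.polyfit(x, y, order)
          resid = y - np.polyval(c, x)
          sigma2 = resid @ resid / x.size                  # MLE noise variance
          ll = -0.5 * x.size * (np.log(2 * np.pi * sigma2) + 1)
          bic.append(-2 * ll + (order + 2) * np.log(x.size))
          models.append(c)

      # BMA weights from BIC (approximate posterior model probabilities).
      w = np.exp(-0.5 * (np.array(bic) - min(bic)))
      w /= w.sum()

      x_new = np.linspace(0, 1, 5)
      preds = np.array([np.polyval(c, x_new) for c in models])
      mean = w @ preds                                     # BMA point prediction
      spread = np.sqrt(w @ (preds - mean) ** 2)            # between-model spread
      print("model weights:", np.round(w, 3))
      print("BMA mean:", np.round(mean, 2), " model-spread sd:", np.round(spread, 2))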

  3. Spatial-Temporal Epidemiology of Tuberculosis in Mainland China: An Analysis Based on Bayesian Theory

    PubMed Central

    Cao, Kai; Yang, Kun; Wang, Chao; Guo, Jin; Tao, Lixin; Liu, Qingrong; Gehendra, Mahara; Zhang, Yingjie; Guo, Xiuhua

    2016-01-01

    Objective: To explore the spatial-temporal interaction effect within a Bayesian framework and to probe the ecological influential factors for tuberculosis. Methods: Six different statistical models containing parameters of time, space, spatial-temporal interaction and their combination were constructed based on a Bayesian framework. The optimum model was selected according to the deviance information criterion (DIC) value. Coefficients of climate variables were then estimated using the best fitting model. Results: The model containing spatial-temporal interaction parameter was the best fitting one, with the smallest DIC value (−4,508,660). Ecological analysis results showed the relative risks (RRs) of average temperature, rainfall, wind speed, humidity, and air pressure were 1.00324 (95% CI, 1.00150–1.00550), 1.01010 (95% CI, 1.01007–1.01013), 0.83518 (95% CI, 0.93732–0.96138), 0.97496 (95% CI, 0.97181–1.01386), and 1.01007 (95% CI, 1.01003–1.01011), respectively. Conclusions: The spatial-temporal interaction was statistically meaningful and the prevalence of tuberculosis was influenced by the time and space interaction effect. Average temperature, rainfall, wind speed, and air pressure influenced tuberculosis. Average humidity had no influence on tuberculosis. PMID:27164117
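
    As a reminder of how the model-selection criterion works, the sketch below computes DIC = Dbar + pD from posterior draws for a toy Poisson count model; the data and conjugate posterior are invented and unrelated to the paper's spatial-temporal models.

      import numpy as np
      from scipy.stats import poisson

      rng = np.random.default_rng(3)

      # DIC = Dbar + pD, with pD = Dbar - D(theta_bar): posterior-mean deviance
      # penalised by the effective number of parameters.
      counts = rng.poisson(4.0, size=50)                  # toy case counts
      # Conjugate Gamma(1, 1) prior gives a Gamma posterior for the rate.
      draws = rng.gamma(counts.sum() + 1, 1 / (len(counts) + 1), size=4000)

      def deviance(lam):
          return -2 * poisson.logpmf(counts, lam).sum()

      D = np.array([deviance(l) for l in draws])
      D_bar = D.mean()
      pD = D_bar - deviance(draws.mean())
      print(f"Dbar={D_bar:.1f}  pD={pD:.2f}  DIC={D_bar + pD:.1f}")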

  4. Critically evaluating the theory and performance of Bayesian analysis of macroevolutionary mixtures.

    PubMed

    Moore, Brian R; Höhna, Sebastian; May, Michael R; Rannala, Bruce; Huelsenbeck, John P

    2016-08-23

    Bayesian analysis of macroevolutionary mixtures (BAMM) has recently taken the study of lineage diversification by storm. BAMM estimates the diversification-rate parameters (speciation and extinction) for every branch of a study phylogeny and infers the number and location of diversification-rate shifts across branches of a tree. Our evaluation of BAMM reveals two major theoretical errors: (i) the likelihood function (which estimates the model parameters from the data) is incorrect, and (ii) the compound Poisson process prior model (which describes the prior distribution of diversification-rate shifts across branches) is incoherent. Using simulation, we demonstrate that these theoretical issues cause statistical pathologies; posterior estimates of the number of diversification-rate shifts are strongly influenced by the assumed prior, and estimates of diversification-rate parameters are unreliable. Moreover, the inability to correctly compute the likelihood or to correctly specify the prior for rate-variable trees precludes the use of Bayesian approaches for testing hypotheses regarding the number and location of diversification-rate shifts using BAMM. PMID:27512038

  5. Bayesian soft x-ray tomography and MHD mode analysis on HL-2A

    NASA Astrophysics Data System (ADS)

    Li, Dong; Liu, Yi; Svensson, J.; Liu, Y. Q.; Song, X. M.; Yu, L. M.; Mao, Rui; Fu, B. Z.; Deng, Wei; Yuan, B. S.; Ji, X. Q.; Xu, Yuan; Chen, Wei; Zhou, Yan; Yang, Q. W.; Duan, X. R.; Liu, Yong; HL-2A Team

    2016-03-01

    A Bayesian tomography method using so-called Gaussian processes (GPs) for the emission model has been applied to the soft x-ray (SXR) diagnostics on the HL-2A tokamak. To improve the accuracy of reconstructions, the standard GP is extended to a non-stationary version so that different smoothness between the plasma center and the edge can be taken into account in the algorithm. The uncertainty in the reconstruction arising from measurement errors and model incapability can be fully analyzed by use of Bayesian probability theory. In this work, the SXR reconstructions obtained by this non-stationary Gaussian process tomography (NSGPT) method have been compared with the equilibrium magnetic flux surfaces, generally achieving a satisfactory agreement in terms of both shape and position. In addition, singular-value-decomposition (SVD) and Fast Fourier Transform (FFT) techniques have been applied to the analysis of SXR and magnetic diagnostics, in order to explore the spatial and temporal features of the saturated long-lived magnetohydrodynamic (MHD) instability induced by energetic particles during neutral beam injection (NBI) on HL-2A. The result shows that this ideal internal kink instability has a dominant m/n = 1/1 mode structure along with an m/n = 2/2 harmonic, which are coupled near the q = 1 surface with a rotation frequency of 12 kHz.
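
    The SVD step can be illustrated on synthetic data: a rotating mode on a ring of SXR channels appears as a degenerate pair of leading singular values, and an FFT of the leading temporal singular vector recovers the rotation frequency. Channel count, sampling rate and noise level below are invented.

      import numpy as np

      rng = np.random.default_rng(4)

      # (channels x time) signal matrix: a single rotating mode at 12 kHz,
      # which SVD separates into spatial "topos" and temporal "chronos".
      n_ch, fs = 32, 200e3
      t = np.arange(2048) / fs
      phase = np.linspace(0, 2 * np.pi, n_ch, endpoint=False)
      f_mode = 12e3
      signal = (np.sin(2 * np.pi * f_mode * t)[None, :] * np.cos(phase)[:, None]
                + np.cos(2 * np.pi * f_mode * t)[None, :] * np.sin(phase)[:, None]
                + 0.2 * rng.standard_normal((n_ch, len(t))))

      U, s, Vt = np.linalg.svd(signal - signal.mean(axis=1, keepdims=True),
                               full_matrices=False)
      # The rotating mode shows up as a degenerate pair of singular values.
      print("leading singular values:", np.round(s[:4], 1))
      freqs = np.fft.rfftfreq(len(t), 1 / fs)
      peak = freqs[np.abs(np.fft.rfft(Vt[0])).argmax()]
      print(f"dominant chronos frequency ~ {peak / 1e3:.1f} kHz")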

  6. Multi-level Bayesian safety analysis with unprocessed Automatic Vehicle Identification data for an urban expressway.

    PubMed

    Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie

    2016-03-01

    In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework has been developed in an effort to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, which are the processed data capped at speed limit and the unprocessed data retaining the original speed were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data was superior. Both multi-level models and random parameters models outperformed the Negative Binomial model and the models with random parameters achieved the best model fitting. The contributing factors identified imply that on the urban expressway lower speed and higher speed variation could significantly increase the crash likelihood. Other geometric factors were significant including auxiliary lanes and horizontal curvature.
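
    A minimal sketch of the baseline crash-frequency model follows: a Negative Binomial regression of segment crash counts on speed-related covariates, fitted by maximum likelihood on simulated data. The paper's multilevel and random-parameters Bayesian structure is not reproduced, and the data generator is hypothetical.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import nbinom

      rng = np.random.default_rng(5)

      # Hypothetical segment data: mean speed and speed variation as covariates.
      n_seg = 300
      speed = rng.normal(100, 10, n_seg)            # km/h
      speed_sd = rng.gamma(2, 2, n_seg)             # within-segment variation
      X = np.column_stack([np.ones(n_seg), (speed - 100) / 10, speed_sd - 4])
      beta_true, alpha_true = np.array([0.5, -0.3, 0.2]), 0.4
      mu = np.exp(X @ beta_true)
      r_true = 1 / alpha_true
      crashes = rng.negative_binomial(r_true, r_true / (r_true + mu))

      def negloglik(theta):
          beta, log_alpha = theta[:-1], theta[-1]
          m = np.exp(X @ beta)
          r = np.exp(-log_alpha)                    # NB "size" parameter
          return -nbinom.logpmf(crashes, r, r / (r + m)).sum()

      fit = minimize(negloglik, np.zeros(4), method="Nelder-Mead")
      print("beta_hat:", np.round(fit.x[:-1], 2),
            " alpha_hat:", round(float(np.exp(fit.x[-1])), 2))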

  7. Multi-level Bayesian safety analysis with unprocessed Automatic Vehicle Identification data for an urban expressway.

    PubMed

    Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie

    2016-03-01

    In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework has been developed in an effort to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, which are the processed data capped at speed limit and the unprocessed data retaining the original speed were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data was superior. Both multi-level models and random parameters models outperformed the Negative Binomial model and the models with random parameters achieved the best model fitting. The contributing factors identified imply that on the urban expressway lower speed and higher speed variation could significantly increase the crash likelihood. Other geometric factors were significant including auxiliary lanes and horizontal curvature. PMID:26722989

  8. Bayesian analysis of radiocarbon chronologies: examples from the European Late-glacial

    NASA Astrophysics Data System (ADS)

    Blockley, S. P. E.; Lowe, J. J.; Walker, M. J. C.; Asioli, A.; Trincardi, F.; Coope, G. R.; Donahue, R. E.

    2004-02-01

    Although there are many Late-glacial (ca. 15 000-11 000 cal. yr BP) proxy climate records from northwest Europe, some analysed at a very high temporal resolution (decadal to century scale), attempts to establish time-stratigraphical correlations between sequences are constrained by problems of radiocarbon dating. In an attempt to overcome some of these difficulties, we have used a Bayesian approach to the analysis of radiocarbon chronologies for two Late-glacial sites in the British Isles and one in the Adriatic Sea. The palaeoclimatic records from the three sites were then compared with that from the GRIP Greenland ice-core. Although there are some apparent differences in the timing of climatic events during the early part of the Late-glacial (pre-14 000 cal. yr BP), the results suggest that regional climatic changes appear to have been broadly comparable between Greenland, the British Isles and the Adriatic during the major part of the Late-glacial (i.e. between 14 000 and 11 000 cal. yr BP). The advantage of using the Bayesian approach is that it provides a means of testing the reliability of Late-glacial radiocarbon chronologies that is independent of regional chronostratigraphical (climatostratigraphical) frameworks. It also uses the full radiocarbon inventory available for each sequence and makes explicit any data selection applied. Potentially, therefore, it offers a more objective basis for comparing regional radiocarbon chronologies than the conventional approaches that have been used hitherto.
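
    The core calibration step can be sketched as a grid posterior over calendar age: the likelihood of the measured radiocarbon age is evaluated against a calibration curve, combining curve and measurement errors. The locally linear curve and all numbers below are toy values; real analyses use IntCal plus the stratigraphic ordering constraints that the Bayesian approach makes explicit.

      import numpy as np

      # Grid calibration of a single radiocarbon determination against a
      # toy, locally linear calibration curve (hypothetical parameters).
      cal_age = np.arange(11000, 15001)             # calendar years BP
      curve_mu = 0.95 * cal_age + 350.0             # toy curve: 14C age vs cal age
      curve_sd = 40.0

      c14_age, c14_err = 12100.0, 60.0              # measured 14C age and error
      sd = np.hypot(curve_sd, c14_err)
      post = np.exp(-0.5 * ((c14_age - curve_mu) / sd) ** 2)
      post /= post.sum()

      cdf = np.cumsum(post)
      lo, hi = np.interp([0.025, 0.975], cdf, cal_age)
      print(f"95% calibrated range: {lo:.0f}-{hi:.0f} cal. yr BP")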

  9. An analysis of a joint shear model for jointed media with orthogonal joint sets; Yucca Mountain Site Characterization Project

    SciTech Connect

    Koteras, J.R.

    1991-10-01

    This report describes a joint shear model used in conjunction with a computational model for jointed media with orthogonal joint sets. The joint shear model allows nonlinear behavior for both joint sets. Because nonlinear behavior is allowed for both joint sets, a great many cases must be considered to fully describe the joint shear behavior of the jointed medium. An extensive set of equations is required to describe the joint shear stress and slip displacements that can occur for all the various cases. This report examines possible methods for simplifying this set of equations so that the model can be implemented efficiently from a computational standpoint. The shear model must be examined carefully to obtain a computationally efficient implementation that does not lead to numerical problems. The application to fractures in rock is discussed. 5 refs., 4 figs.

  10. Bayesian Analysis for Exponential Random Graph Models Using the Adaptive Exchange Sampler*

    PubMed Central

    Jin, Ick Hoon; Yuan, Ying; Liang, Faming

    2014-01-01

    Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the intractable normalizing constant and model degeneracy. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the intractable normalizing constant and model degeneracy issues encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as a MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, molecule synthetic network, and dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency. PMID:24653788
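
    For intuition, here is the plain (non-adaptive) exchange move on a toy edges-only random graph model, where the auxiliary network can be simulated exactly (each dyad is an independent Bernoulli draw), so the intractable normalizing constants cancel in the acceptance ratio; the adaptive sampler above replaces exact simulation with importance sampling from a parallel chain. The observed statistic and prior are invented.

      import numpy as np

      rng = np.random.default_rng(6)

      def sigmoid(z):
          return 1.0 / (1.0 + np.exp(-z))

      # Edges-only model: q(y | theta) prop. to exp(theta * s(y)), s = edge count.
      n_nodes = 16
      n_dyads = n_nodes * (n_nodes - 1) // 2
      s_obs = 38.0                                  # observed edge count (toy)

      theta, draws = 0.0, []
      for _ in range(5000):
          theta_prop = theta + 0.3 * rng.standard_normal()
          # Exact auxiliary draw: dyads are independent Bernoulli(sigmoid(theta')).
          s_aux = rng.binomial(n_dyads, sigmoid(theta_prop))
          # Exchange acceptance ratio; normalizing constants cancel.
          # Prior: theta ~ Normal(0, variance 10).
          log_a = ((theta_prop - theta) * s_obs + (theta - theta_prop) * s_aux
                   + (theta**2 - theta_prop**2) / 20.0)
          if np.log(rng.random()) < log_a:
              theta = theta_prop
          draws.append(theta)

      print(f"posterior mean theta ~ {np.mean(draws[1000:]):.2f}")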

  11. A Bayesian Approach for Instrumental Variable Analysis with Censored Time-to-Event Outcome

    PubMed Central

    Li, Gang; Lu, Xuyang

    2014-01-01

    Instrumental variable (IV) analysis has been widely used in economics, epidemiology, and other fields to estimate the causal effects of covariates on outcomes, in the presence of unobserved confounders and/or measurement errors in covariates. However, IV methods for time-to-event outcome with censored data remain underdeveloped. This paper proposes a Bayesian approach for IV analysis with censored time-to-event outcome by using a two-stage linear model. A Markov Chain Monte Carlo sampling method is developed for parameter estimation for both normal and non-normal linear models with elliptically contoured error distributions. Performance of our method is examined by simulation studies. Our method largely reduces bias and greatly improves coverage probability of the estimated causal effect, compared to the method that ignores the unobserved confounders and measurement errors. We illustrate our method on the Women's Health Initiative Observational Study and the Atherosclerosis Risk in Communities Study. PMID:25393617

  12. Bayesian analysis of complex interacting mutations in HIV drug resistance and cross-resistance.

    PubMed

    Kozyryev, Ivan; Zhang, Jing

    2015-01-01

    A successful treatment of AIDS world-wide is severely hindered by the HIV virus' drug resistance capability resulting from complicated mutation patterns of viral proteins. Such a system of mutations enables the virus to survive and reproduce despite the presence of various antiretroviral drugs by disrupting their binding capability. Although these interacting mutation patterns are extremely difficult to efficiently uncover and interpret, they contribute valuable information to personalized therapeutic regimen design. The use of Bayesian statistical modeling provides an unprecedented opportunity in the field of anti-HIV therapy to understand detailed interaction structures of drug resistant mutations. Multiple Bayesian models equipped with Markov Chain Monte Carlo (MCMC) methods have been recently proposed in this field (Zhang et al. in PNAS 107:1321, 2010 [1]; Zhang et al. in J Proteome Sci Comput Biol 1:2, 2012 [2]; Svicher et al. in Antiviral Res 93(1):86-93, 2012 [3]; Svicher et al. in Antiviral Therapy 16(7):1035-1045, 2011 [4]; Svicher et al. in Antiviral Ther 16(4):A14-A14, 2011 [5]; Svicher et al. in Antiviral Ther 16(4):A85-A85, 2011 [6]; Alteri et al. in Signature mutations in V3 and bridging sheet domain of HIV-1 gp120 HIV-1 are specifically associated with dual tropism and modulate the interaction with CCR5 N-Terminus, 2011 [7]). Probabilistically modeling mutations in the HIV-1 protease or reverse transcriptase (RT) isolated from drug-treated patients provides a powerful statistical procedure that first detects mutation combinations associated with single or multiple-drug resistance, and then infers detailed dependence structures among the interacting mutations in viral proteins (Zhang et al. in PNAS 107:1321, 2010 [1]; Zhang et al. in J Proteome Sci Comput Biol 1:2, 2012 [2]). Combined with molecular dynamics simulations and free energy calculations, Bayesian analysis predictions help to uncover genetic and structural mechanisms in the HIV treatment

  13. Benchmark Composite Wing Design Including Joint Analysis and Optimization

    NASA Astrophysics Data System (ADS)

    Albers, Robert G.

    A composite wing panel software package, named WING Joint OpTimization and Analysis (WINGJOTA) featuring bolted joint analysis, is created and presented in this research. Three areas of focus were the development of an analytic composite bolted joint analysis suitable for fast evaluation; a more realistic wing design than what has been considered in the open literature; and the application of two optimization algorithms for composite wing design. Optimization results from 14 wing load cases applied to a composite wing panel with joints are presented. The composite bolted joint analysis consists of an elasticity solution that provides the stress state at a characteristic distance away from the bolt holes. The stresses at the characteristic distance are compared to a failure criterion on a ply-by-ply basis that not only determines first ply failure but also the failure mode. The loads in the multi-fastener joints used in this study were determined by an iterative scheme that provides the bearing-bypass loads to the elasticity analysis. A preliminary design of a composite subsonic transport wing was developed, based around a mid-size, twin-aisle aircraft. The benchmark design includes the leading and trailing edge structures and the center box inside the fuselage. Wing masses were included as point loads, and fuel loads were incorporated as distributed loads. The side-of-body boundary condition was modeled using high stiffness springs, and the aerodynamic loads were applied using an approximate point load scheme. The entire wing structure was modeled using the finite element code ANSYS to provide the internal loads needed as boundary conditions for the wing panel analyzed by WINGJOTA. The software package WINGJOTA combines the composite bolted joint analysis, a composite plate finite element analysis, a wing aeroelastic cycle, and two optimization algorithms to form the basis of a computer code for analysis and optimization. Both the Improving Hit-and-Run (IHR) and

  14. System for three-dimensional biomechanical analysis of joints

    NASA Astrophysics Data System (ADS)

    Siebert, Markus; Englmeier, Karl-Hans; von Eisenhart-Rothe, Ruediger; Bringmann, Christoph; Eckstein, Felix; Bonel, H.; Reiser, Maximilian; Graichen, Heiko

    2002-04-01

    We developed 3D MR-based image processing methods for the biomechanical analysis of joints. These methods provide quantitative data on the morphological distribution of the joint cartilage as well as biomechanical analysis of the relative translation and rotation of joints. After image data acquisition in an open MR system, the segmentation of the different joint structures was performed by a semi-automatic technique based on a gray-value-oriented region-growing algorithm. After segmentation, 3D reconstructions of cartilage and bone surfaces were performed. Principal axis decomposition is used to calculate a reproducible tibia-plateau-based coordinate system that allows the determination of relative rotation and translation of the condyles and menisci in relation to the tibia plateau. The analysis of the femoral movement is based on a reproducible, semi-automatically calculated epicondylar axis. The analysis showed a posterior translation of the meniscus, and an even greater translation of the femoral condyles, in healthy knees and in knees with an insufficiency of the anterior cruciate ligament (ACL).

  15. Space Shuttle RTOS Bayesian Network

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

    With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network is extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores

  16. Adhesive Characterization and Progressive Damage Analysis of Bonded Composite Joints

    NASA Technical Reports Server (NTRS)

    Girolamo, Donato; Davila, Carlos G.; Leone, Frank A.; Lin, Shih-Yung

    2014-01-01

    The results of an experimental/numerical campaign aimed to develop progressive damage analysis (PDA) tools for predicting the strength of a composite bonded joint under tensile loads are presented. The PDA is based on continuum damage mechanics (CDM) to account for intralaminar damage, and cohesive laws to account for interlaminar and adhesive damage. The adhesive response is characterized using standard fracture specimens and digital image correlation (DIC). The displacement fields measured by DIC are used to calculate the J-integrals, from which the associated cohesive laws of the structural adhesive can be derived. A finite element model of a sandwich conventional splice joint (CSJ) under tensile loads was developed. The simulations indicate that the model is capable of predicting the interactions of damage modes that lead to the failure of the joint.

  17. Results and Analysis from Space Suit Joint Torque Testing

    NASA Technical Reports Server (NTRS)

    Matty, Jennifer

    2010-01-01

    A space suit's mobility is critical to an astronaut's ability to perform work efficiently. As mobility increases, the astronaut can perform tasks for longer durations with less fatigue. Mobility can be broken down into two parts: range of motion (ROM) and torque. These two measurements describe how the suit moves and how much force it takes to move. Two methods were chosen to define mobility requirements for the Constellation Space Suit Element (CSSE). One method focuses on range of motion and the second method centers on joint torque. A joint torque test was conducted to determine a baseline for current advanced space suit joint torques. This test utilized the following space suits: Extravehicular Mobility Unit (EMU), Advanced Crew Escape Suit (ACES), I-Suit, D-Suit, Enhanced Mobility (EM)-ACES, and Mark III (MK-III). Data were collected from 16 different joint movements of each suit. The results were then reviewed and CSSE joint torque requirement values were selected. The focus of this paper is to discuss trends observed during data analysis.

  18. Intuitive logic revisited: new data and a Bayesian mixed model meta-analysis.

    PubMed

    Singmann, Henrik; Klauer, Karl Christoph; Kellen, David

    2014-01-01

    Recent research on syllogistic reasoning suggests that the logical status (valid vs. invalid) of even difficult syllogisms can be intuitively detected via differences in conceptual fluency between logically valid and invalid syllogisms when participants are asked to rate how much they like a conclusion following from a syllogism (Morsanyi & Handley, 2012). These claims of an intuitive logic are at odds with most theories on syllogistic reasoning which posit that detecting the logical status of difficult syllogisms requires effortful and deliberate cognitive processes. We present new data replicating the effects reported by Morsanyi and Handley, but show that this effect is eliminated when controlling for a possible confound in terms of conclusion content. Additionally, we reanalyze three studies (n = 287) without this confound with a Bayesian mixed model meta-analysis (i.e., controlling for participant and item effects) which provides evidence for the null-hypothesis and against Morsanyi and Handley's claim.

  19. Exposure models for the prior distribution in bayesian decision analysis for occupational hygiene decision making.

    PubMed

    Lee, Eun Gyung; Kim, Seung Won; Feigley, Charles E; Harper, Martin

    2013-01-01

    This study introduces two semi-quantitative methods, Structured Subjective Assessment (SSA) and Control of Substances Hazardous to Health (COSHH) Essentials, in conjunction with two-dimensional Monte Carlo simulations for determining prior probabilities. Prior distribution using expert judgment was included for comparison. Practical applications of the proposed methods were demonstrated using personal exposure measurements of isoamyl acetate in an electronics manufacturing facility and of isopropanol in a printing shop. Applicability of these methods in real workplaces was discussed based on the advantages and disadvantages of each method. Although these methods could not be completely independent of expert judgments, this study demonstrated a methodological improvement in the estimation of the prior distribution for the Bayesian decision analysis tool. The proposed methods provide a logical basis for the decision process by considering determinants of worker exposure. PMID:23252451
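
    A minimal sketch of the two-dimensional Monte Carlo step: sample uncertainty about the lognormal exposure parameters, compute the 95th percentile for each draw, and convert the draws into prior probabilities for exposure-control categories defined by X95/OEL. The parameter ranges and OEL below are hypothetical.

      import numpy as np

      rng = np.random.default_rng(7)

      # Outer loop samples *uncertainty* about the lognormal exposure
      # parameters; the 95th percentile captures between-shift *variability*.
      oel = 100.0                                   # ppm (assumed)
      gm = rng.uniform(5.0, 40.0, 20000)            # uncertain geometric mean
      gsd = rng.uniform(1.5, 3.0, 20000)            # uncertain geometric s.d.
      x95 = gm * gsd ** 1.645                       # lognormal 95th percentile

      edges = [0.0, 0.01, 0.10, 0.50, 1.0, np.inf]  # X95/OEL category bounds
      counts, _ = np.histogram(x95 / oel, bins=edges)
      prior = counts / counts.sum()
      for label, p in zip(["<1%", "1-10%", "10-50%", "50-100%", ">OEL"], prior):
          print(f"category {label:>8}: prior probability {p:.2f}")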

  20. Bayesian semiparametric power spectral density estimation with applications in gravitational wave data analysis

    NASA Astrophysics Data System (ADS)

    Edwards, Matthew C.; Meyer, Renate; Christensen, Nelson

    2015-09-01

    The standard noise model in gravitational wave (GW) data analysis assumes detector noise is stationary and Gaussian distributed, with a known power spectral density (PSD) that is usually estimated using clean off-source data. Real GW data often depart from these assumptions, and misspecified parametric models of the PSD could result in misleading inferences. We propose a Bayesian semiparametric approach to improve this. We use a nonparametric Bernstein polynomial prior on the PSD, with weights attained via a Dirichlet process distribution, and update this using the Whittle likelihood. Posterior samples are obtained using a blocked Metropolis-within-Gibbs sampler. We simultaneously estimate the reconstruction parameters of a rotating core collapse supernova GW burst that has been embedded in simulated Advanced LIGO noise. We also discuss an approach to deal with nonstationary data by breaking longer data streams into smaller and locally stationary components.
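
    The Whittle likelihood at the heart of the update can be sketched in a few lines: periodogram ordinates are treated as approximately independent exponentials with means given by the PSD. Below, a simple parametric AR(1)-shaped PSD stands in for the Bernstein-polynomial prior, just to show the likelihood computation; the data are white noise and all parameters are invented.

      import numpy as np

      rng = np.random.default_rng(8)

      n, fs = 4096, 1024.0
      noise = rng.standard_normal(n)                # toy "detector noise"

      # Two-sided periodogram, dropping DC and Nyquist ordinates.
      freq = np.fft.rfftfreq(n, 1 / fs)[1:-1]
      pgram = (np.abs(np.fft.rfft(noise)[1:-1]) ** 2) / (n * fs)

      def whittle_loglik(sigma2, phi):
          # Two-sided PSD of an AR(1) process sampled at fs; reduces to
          # sigma2/fs for white noise, matching the periodogram normalisation.
          S = (sigma2 / fs) / np.abs(1 - phi * np.exp(-2j * np.pi * freq / fs)) ** 2
          # log L ~ -sum_j [ log S(f_j) + I(f_j) / S(f_j) ]
          return -np.sum(np.log(S) + pgram / S)

      print("white model :", round(whittle_loglik(1.0, 0.0), 1))
      print("AR(1) model :", round(whittle_loglik(1.0, 0.5), 1))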

  1. A Bayesian approach to probabilistic sensitivity analysis in structured benefit-risk assessment.

    PubMed

    Waddingham, Ed; Mt-Isa, Shahrul; Nixon, Richard; Ashby, Deborah

    2016-01-01

    Quantitative decision models such as multiple criteria decision analysis (MCDA) can be used in benefit-risk assessment to formalize trade-offs between benefits and risks, providing transparency to the assessment process. There is however no well-established method for propagating uncertainty of treatment effects data through such models to provide a sense of the variability of the benefit-risk balance. Here, we present a Bayesian statistical method that directly models the outcomes observed in randomized placebo-controlled trials and uses this to infer indirect comparisons between competing active treatments. The resulting treatment effects estimates are suitable for use within the MCDA setting, and it is possible to derive the distribution of the overall benefit-risk balance through Markov Chain Monte Carlo simulation. The method is illustrated using a case study of natalizumab for relapsing-remitting multiple sclerosis.

  2. Bayesian network meta-analysis for unordered categorical outcomes with incomplete data.

    PubMed

    Schmid, Christopher H; Trikalinos, Thomas A; Olkin, Ingram

    2014-06-01

    We develop a Bayesian multinomial network meta-analysis model for unordered (nominal) categorical outcomes that allows for partially observed data in which exact event counts may not be known for each category. This model properly accounts for correlations of counts in mutually exclusive categories and enables proper comparison and ranking of treatment effects across multiple treatments and multiple outcome categories. We apply the model to analyze 17 trials, each of which compares two of three treatments (high and low dose statins and standard care/control) for three outcomes for which data are complete: cardiovascular death, non-cardiovascular death and no death. We also analyze the cardiovascular death category divided into the three subcategories (coronary heart disease, stroke and other cardiovascular diseases) that are not completely observed. The multinomial and network representations show that high dose statins are effective in reducing the risk of cardiovascular disease. PMID:26052655

  3. Intuitive Logic Revisited: New Data and a Bayesian Mixed Model Meta-Analysis

    PubMed Central

    Singmann, Henrik; Klauer, Karl Christoph; Kellen, David

    2014-01-01

    Recent research on syllogistic reasoning suggests that the logical status (valid vs. invalid) of even difficult syllogisms can be intuitively detected via differences in conceptual fluency between logically valid and invalid syllogisms when participants are asked to rate how much they like a conclusion following from a syllogism (Morsanyi & Handley, 2012). These claims of an intuitive logic are at odds with most theories on syllogistic reasoning which posit that detecting the logical status of difficult syllogisms requires effortful and deliberate cognitive processes. We present new data replicating the effects reported by Morsanyi and Handley, but show that this effect is eliminated when controlling for a possible confound in terms of conclusion content. Additionally, we reanalyze three studies (n = 287) without this confound with a Bayesian mixed model meta-analysis (i.e., controlling for participant and item effects) which provides evidence for the null-hypothesis and against Morsanyi and Handley's claim. PMID:24755777

  4. Exposure models for the prior distribution in bayesian decision analysis for occupational hygiene decision making.

    PubMed

    Lee, Eun Gyung; Kim, Seung Won; Feigley, Charles E; Harper, Martin

    2013-01-01

    This study introduces two semi-quantitative methods, Structured Subjective Assessment (SSA) and Control of Substances Hazardous to Health (COSHH) Essentials, in conjunction with two-dimensional Monte Carlo simulations for determining prior probabilities. Prior distribution using expert judgment was included for comparison. Practical applications of the proposed methods were demonstrated using personal exposure measurements of isoamyl acetate in an electronics manufacturing facility and of isopropanol in a printing shop. Applicability of these methods in real workplaces was discussed based on the advantages and disadvantages of each method. Although these methods could not be completely independent of expert judgments, this study demonstrated a methodological improvement in the estimation of the prior distribution for the Bayesian decision analysis tool. The proposed methods provide a logical basis for the decision process by considering determinants of worker exposure.

  5. Joint Tensor Feature Analysis For Visual Object Recognition.

    PubMed

    Wong, Wai Keung; Lai, Zhihui; Xu, Yong; Wen, Jiajun; Ho, Chu Po

    2015-11-01

    Tensor-based object recognition has been widely studied in the past several years. This paper focuses on the issue of joint feature selection from tensor data and proposes a novel method called joint tensor feature analysis (JTFA) for tensor feature extraction and recognition. In order to obtain a set of jointly sparse projections for tensor feature extraction, we define the modified within-class tensor scatter value and the modified between-class tensor scatter value for regression. The k-mode optimization technique and the L(2,1)-norm jointly sparse regression are combined together to compute the optimal solutions. The convergence analysis, computational complexity analysis and the essence of the proposed method/model are also presented. It is interesting to note that the proposed method is very similar to singular value decomposition on the scatter matrix but with a sparsity constraint on the right singular value matrix, or to eigen-decomposition on the scatter matrix performed in a sparse manner. Experimental results on some tensor datasets indicate that JTFA outperforms some well-known tensor feature extraction and selection algorithms. PMID:26470058

  6. Application of Bayesian Approach to Cost-Effectiveness Analysis of Antiviral Treatments in Chronic Hepatitis B

    PubMed Central

    Zhang, Hua; Huo, Mingdong; Chao, Jianqian; Liu, Pei

    2016-01-01

    Background Hepatitis B virus (HBV) infection is a major public health problem; timely antiviral treatment can significantly prevent the progression of liver damage from HBV by slowing down or stopping the virus from reproducing. In this study we applied a Bayesian approach to cost-effectiveness analysis, using Markov Chain Monte Carlo (MCMC) simulation methods for the relevant evidence input into the model, to evaluate the cost-effectiveness of entecavir (ETV) and lamivudine (LVD) therapy for chronic hepatitis B (CHB) in Jiangsu, China, thus providing information to the public health system on CHB therapy. Methods An eight-stage Markov model was developed, and a hypothetical cohort of 35-year-old HBeAg-positive patients with CHB was entered into the model. Treatment regimens were LVD 100 mg daily and ETV 0.5 mg daily. The transition parameters were derived either from systematic reviews of the literature or from previous economic studies. The outcome measures were life-years, quality-adjusted life-years (QALYs), and expected costs associated with the treatments and disease progression. For the Bayesian models, all analyses were implemented using WinBUGS version 1.4. Results Expected cost, life expectancy, and QALYs decreased with age. Cost-effectiveness increased with age. The expected cost of ETV was less than that of LVD, while its life expectancy and QALYs were higher, so the ETV strategy was more cost-effective. Costs and benefits from the Monte Carlo simulation were very close to the exact results at the group level, but the standard deviation of each group indicated large differences between individual patients. Conclusions Compared with lamivudine, entecavir is the more cost-effective option. CHB patients should begin antiviral treatment as soon as possible, since treatment is more cost-effective the younger the age at which it starts. The cost and effectiveness distributions obtained by Monte Carlo simulation indicate that our Markov model is robust. PMID:27574976
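
    A minimal sketch of the cohort-style Markov calculation (three states rather than eight; all transition probabilities, costs, utilities and the discount rate are invented): each annual cycle advances the state-occupancy vector and accumulates discounted costs and QALYs. Comparing two such runs with treatment-specific transition matrices gives the incremental cost per QALY.

      import numpy as np

      # States: 0 = CHB, 1 = compensated cirrhosis, 2 = death (toy model).
      P = np.array([[0.90, 0.08, 0.02],        # yearly transition probabilities
                    [0.00, 0.85, 0.15],
                    [0.00, 0.00, 1.00]])
      cost = np.array([1200.0, 5000.0, 0.0])   # per-year cost in each state
      utility = np.array([0.85, 0.60, 0.0])    # per-year QALY weight
      disc = 0.03                              # annual discount rate

      occ = np.array([1.0, 0.0, 0.0])          # cohort starts in CHB
      total_cost = total_qaly = 0.0
      for year in range(40):
          d = 1.0 / (1 + disc) ** year
          total_cost += d * occ @ cost
          total_qaly += d * occ @ utility
          occ = occ @ P                        # advance the cohort one cycle

      print(f"expected cost {total_cost:.0f}, QALYs {total_qaly:.2f}, "
            f"cost/QALY {total_cost / total_qaly:.0f}")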

  7. From "weight of evidence" to quantitative data integration using multicriteria decision analysis and Bayesian methods.

    PubMed

    Linkov, Igor; Massey, Olivia; Keisler, Jeff; Rusyn, Ivan; Hartung, Thomas

    2015-01-01

    "Weighing" available evidence in the process of decision-making is unavoidable, yet it is one step that routinely raises suspicions: what evidence should be used, how much does it weigh, and whose thumb may be tipping the scales? This commentary aims to evaluate the current state and future roles of various types of evidence for hazard assessment as it applies to environmental health. In its recent evaluation of the US Environmental Protection Agency's Integrated Risk Information System assessment process, the National Research Council committee singled out the term "weight of evidence" (WoE) for critique, deeming the process too vague and detractive to the practice of evaluating human health risks of chemicals. Moving the methodology away from qualitative, vague and controversial methods towards generalizable, quantitative and transparent methods for appropriately managing diverse lines of evidence is paramount for both regulatory and public acceptance of the hazard assessments. The choice of terminology notwithstanding, a number of recent Bayesian WoE-based methods, the emergence of multi criteria decision analysis for WoE applications, as well as the general principles behind the foundational concepts of WoE, show promise in how to move forward and regain trust in the data integration step of the assessments. We offer our thoughts on the current state of WoE as a whole and while we acknowledge that many WoE applications have been largely qualitative and subjective in nature, we see this as an opportunity to turn WoE towards a quantitative direction that includes Bayesian and multi criteria decision analysis.

  8. From "weight of evidence" to quantitative data integration using multicriteria decision analysis and Bayesian methods.

    PubMed

    Linkov, Igor; Massey, Olivia; Keisler, Jeff; Rusyn, Ivan; Hartung, Thomas

    2015-01-01

    "Weighing" available evidence in the process of decision-making is unavoidable, yet it is one step that routinely raises suspicions: what evidence should be used, how much does it weigh, and whose thumb may be tipping the scales? This commentary aims to evaluate the current state and future roles of various types of evidence for hazard assessment as it applies to environmental health. In its recent evaluation of the US Environmental Protection Agency's Integrated Risk Information System assessment process, the National Research Council committee singled out the term "weight of evidence" (WoE) for critique, deeming the process too vague and detractive to the practice of evaluating human health risks of chemicals. Moving the methodology away from qualitative, vague and controversial methods towards generalizable, quantitative and transparent methods for appropriately managing diverse lines of evidence is paramount for both regulatory and public acceptance of the hazard assessments. The choice of terminology notwithstanding, a number of recent Bayesian WoE-based methods, the emergence of multi criteria decision analysis for WoE applications, as well as the general principles behind the foundational concepts of WoE, show promise in how to move forward and regain trust in the data integration step of the assessments. We offer our thoughts on the current state of WoE as a whole and while we acknowledge that many WoE applications have been largely qualitative and subjective in nature, we see this as an opportunity to turn WoE towards a quantitative direction that includes Bayesian and multi criteria decision analysis. PMID:25592482

  9. Bayesian analysis of exoplanet and binary orbits. Demonstrated using astrometric and radial-velocity data of Mizar A

    NASA Astrophysics Data System (ADS)

    Schulze-Hartung, T.; Launhardt, R.; Henning, T.

    2012-09-01

    Aims: We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A. Methods: With the Bayesian approach to data analysis we can incorporate prior knowledge and draw extensive posterior inferences about model parameters and derived quantities. This was implemented in BASE by Markov chain Monte Carlo (MCMC) sampling, using a combination of the Metropolis-Hastings, hit-and-run, and parallel-tempering algorithms to explore the whole parameter space. Nonconvergence to the posterior was tested by means of the Gelman-Rubin statistic (potential scale reduction). The samples were used directly and transformed into marginal densities by means of kernel density estimation, a "smooth" alternative to histograms. We derived the relevant observable models from Newton's law of gravitation, showing that the motion of Earth and the target can be neglected. Results: With our methods we can provide more detailed information about the parameters than a frequentist analysis does. Still, a comparison with the Mizar A literature shows that both approaches are compatible within the uncertainties. Conclusions: We show that the Bayesian approach to inference has been implemented successfully in BASE, a flexible tool for analysing astrometric and radial-velocity data. BASE, the computer program introduced in this article, can be downloaded at http://www.mpia.de/homes/schulze/base.html.

  10. Joint modality fusion and temporal context exploitation for semantic video analysis

    NASA Astrophysics Data System (ADS)

    Papadopoulos, Georgios Th; Mezaris, Vasileios; Kompatsiaris, Ioannis; Strintzis, Michael G.

    2011-12-01

    In this paper, a multi-modal context-aware approach to semantic video analysis is presented. Overall, the examined video sequence is initially segmented into shots and for every resulting shot appropriate color, motion and audio features are extracted. Then, Hidden Markov Models (HMMs) are employed for performing an initial association of each shot with the semantic classes that are of interest separately for each modality. Subsequently, a graphical modeling-based approach is proposed for jointly performing modality fusion and temporal context exploitation. Novelties of this work include the combined use of contextual information and multi-modal fusion, and the development of a new representation for providing motion distribution information to HMMs. Specifically, an integrated Bayesian Network is introduced for simultaneously performing information fusion of the individual modality analysis results and exploitation of temporal context, contrary to the usual practice of performing each task separately. Contextual information is in the form of temporal relations among the supported classes. Additionally, a new computationally efficient method for providing motion energy distribution-related information to HMMs, which supports the incorporation of motion characteristics from previous frames to the currently examined one, is presented. The final outcome of this overall video analysis framework is the association of a semantic class with every shot. Experimental results as well as comparative evaluation from the application of the proposed approach to four datasets belonging to the domains of tennis, news and volleyball broadcast video are presented.

  11. Bayesian Inference in Probabilistic Risk Assessment -- The Current State of the Art

    SciTech Connect

    Dana L. Kelly; Curtis L. Smith

    2009-02-01

    Markov chain Monte Carlo approaches to sampling directly from the joint posterior distribution of aleatory model parameters have led to tremendous advances in Bayesian inference capability in a wide variety of fields, including probabilistic risk analysis. The advent of freely available software coupled with inexpensive computing power has catalyzed this advance. This paper examines where the risk assessment community is with respect to implementing modern computational-based Bayesian approaches to inference. Through a series of examples in different topical areas, it introduces salient concepts and illustrates the practical application of Bayesian inference via Markov chain Monte Carlo sampling to a variety of important problems.
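
    A minimal example in the same spirit: Metropolis sampling of the posterior for a failure rate lambda with Poisson event counts and a Gamma prior, a standard aleatory model in probabilistic risk analysis. The counts, exposure time and prior parameters are invented, and the conjugate posterior makes the answer easy to check.

      import numpy as np
      from scipy.stats import gamma, poisson

      rng = np.random.default_rng(9)

      events, exposure_time = 3, 1000.0            # toy operating experience
      prior_a, prior_b = 0.5, 100.0                # Gamma(shape, rate) prior

      def log_post(lam):
          if lam <= 0:
              return -np.inf
          return (gamma.logpdf(lam, prior_a, scale=1 / prior_b)
                  + poisson.logpmf(events, lam * exposure_time))

      lam, chain = 0.005, []
      for _ in range(20000):
          prop = lam * np.exp(0.5 * rng.standard_normal())   # log-scale walk
          # Jacobian term log(prop/lam) for the multiplicative proposal.
          log_a = log_post(prop) - log_post(lam) + np.log(prop / lam)
          if np.log(rng.random()) < log_a:
              lam = prop
          chain.append(lam)

      post = np.array(chain[5000:])
      print(f"posterior mean {post.mean():.2e}, 90% interval "
            f"[{np.quantile(post, 0.05):.2e}, {np.quantile(post, 0.95):.2e}]")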

  12. Bayesian uncertainty analysis for advanced seismic imaging - Application to the Mentelle Basin, Australia

    NASA Astrophysics Data System (ADS)

    Michelioudakis, Dimitrios G.; Hobbs, Richard W.; Caiado, Camila C. S.

    2016-04-01

    multivariate posterior distribution. The novelty of our approach and the major difference compared to the traditional semblance spectrum velocity analysis procedure is the calculation of the uncertainty of the output model. As the model is able to estimate the credibility intervals of the corresponding interval velocities, we can produce the most probable PSDM images in an iterative manner. The depths extracted using our statistical algorithm are in very good agreement with the key horizons retrieved from the drilled core DSDP-258, showing that the Bayesian model is able to control the depth migration of the seismic data and estimate the uncertainty in the depths to the drilling targets.

  13. Sea-level variability in tide-gauge and geological records: An empirical Bayesian analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Kopp, R. E.; Hay, C.; Morrow, E.; Mitrovica, J. X.; Horton, B.; Kemp, A.

    2013-12-01

    Sea level varies at a range of temporal and spatial scales, and understanding all its significant sources of variability is crucial to building sea-level rise projections relevant to local decision-making. In the twentieth-century record, sites along the U.S. east coast have exhibited typical year-to-year variability of several centimeters. A faster-than-global increase in sea-level rise in the northeastern United States since about 1990 has led some to hypothesize a 'sea-level rise hot spot' in this region, perhaps driven by a trend in the Atlantic Meridional Overturning Circulation related to anthropogenic climate change [1]. However, such hypotheses must be evaluated in the context of natural variability, as revealed by observational and paleo-records. Bayesian and empirical Bayesian statistical approaches are well suited for assimilating data from diverse sources, such as tide-gauges and peats, with differing data availability and uncertainties, and for identifying regionally covarying patterns within these data. We present empirical Bayesian analyses of twentieth-century tide gauge data [2]. We find that the mid-Atlantic region of the United States has experienced a clear acceleration of sea level relative to the global average since about 1990, but this acceleration does not appear to be unprecedented in the twentieth-century record. The rate and extent of this acceleration instead appears comparable to an acceleration observed in the 1930s and 1940s. Both during the earlier episode of acceleration and today, the effect appears to be significantly positively correlated with the Atlantic Multidecadal Oscillation and likely negatively correlated with the North Atlantic Oscillation [2]. The Holocene and Common Era database of geological sea-level rise proxies [3,4] may allow these relationships to be assessed beyond the span of the direct observational record. At a global scale, similar approaches can be employed to look for the spatial fingerprints of land ice

  14. Hip Fracture in the Elderly: A Re-Analysis of the EPIDOS Study with Causal Bayesian Networks

    PubMed Central

    Caillet, Pascal; Klemm, Sarah; Ducher, Michel; Aussem, Alexandre; Schott, Anne-Marie

    2015-01-01

    Objectives Hip fractures commonly result in permanent disability, institutionalization or death in the elderly. Existing hip-fracture prediction tools are underused in clinical practice, partly due to their lack of intuitive interpretation. By use of a graphical layer, Bayesian network models could increase the attractiveness of fracture prediction tools. Our aim was to study the potential contribution of a causal Bayesian network in this clinical setting. A logistic regression was performed as a standard control approach to check the robustness of the causal Bayesian network approach. Setting EPIDOS is a multicenter study, conducted in an ambulatory care setting in five French cities between 1992 and 1996 and updated in 2010. The study included 7598 women aged 75 years or older, in whom fractures were assessed quarterly during 4 years. A causal Bayesian network and a logistic regression were performed on EPIDOS data to describe the major variables involved in hip fracture occurrence. Results Both models had similar association estimates and predictive performances. They detected gait speed and bone mineral density as the variables most involved in the fracture process. The causal Bayesian network showed that gait speed and bone mineral density were directly connected to fracture and seem to mediate the influence of all the other variables included in our model. The logistic regression approach detected multiple interactions involving psychotropic drug use, age and bone mineral density. Conclusion Both approaches retrieved similar variables as predictors of hip fractures. However, the Bayesian network highlighted the whole web of relations among the variables involved in the analysis, suggesting a possible mechanism leading to hip fracture. According to the latter results, intervention focusing concomitantly on gait speed and bone mineral density may be necessary for an optimal prevention of hip fracture occurrence in elderly people. PMID:25822373

  15. Thermographic Analysis of Stress Distribution in Welded Joints

    NASA Astrophysics Data System (ADS)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, an irregular welded surface and the weld toe radius, is quite complex, so the local (structural) stress concept has been adopted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values of stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined using three different methods: strain gauge measurement, thermal stress analysis and FEM. The obtained results show good agreement: TSA confirmed both the FEM model and the stresses measured by strain gauges. According to the obtained results, it may be stated that TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints, and can help to develop more accurate numerical tools for fatigue life prediction.

  16. A Bayesian Approach to Person Fit Analysis in Item Response Theory Models. Research Report.

    ERIC Educational Resources Information Center

    Glas, Cees A. W.; Meijer, Rob R.

    A Bayesian approach to the evaluation of person fit in item response theory (IRT) models is presented. In a posterior predictive check, the observed value on a discrepancy variable is positioned in its posterior distribution. In a Bayesian framework, a Markov Chain Monte Carlo procedure can be used to generate samples of the posterior distribution…

  17. Integrated survival analysis using an event-time approach in a Bayesian framework

    USGS Publications Warehouse

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly, within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and times of which are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the
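
    A sketch of the piecewise-constant hazard machinery, with hypothetical age cutpoints and hazard rates echoing the finding that mortality hazard is highest before day 5: the survival function is the exponential of the negative cumulative hazard, and an interval-censored observation contributes S(t1) - S(t2) to the likelihood.

      import numpy as np

      # Hypothetical age intervals (days) and per-day hazards within each.
      cuts = np.array([0.0, 5.0, 15.0, 30.0])
      haz = np.array([0.09, 0.03, 0.01])

      def survival(t):
          """S(t) = exp(-H(t)), with H(t) the integrated piecewise hazard."""
          t = np.atleast_1d(np.asarray(t, dtype=float))
          width = np.clip(t[:, None] - cuts[:-1], 0.0, np.diff(cuts))
          return np.exp(-(width * haz).sum(axis=1))

      for age in (5, 15, 30):
          print(f"P(chick survives to day {age:2d}) = {survival(age)[0]:.2f}")

      # Interval-censored likelihood contribution: alive at day 3, found dead
      # by day 9, contributes S(3) - S(9).
      print("interval-censored term:", float(survival(3) - survival(9)))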

  20. Bayesian Rainfall Variability Analysis in West Africa along Cross Sections in Space-Time Grid Boxes.

    NASA Astrophysics Data System (ADS)

    Tapsoba, Dominique; Haché, Mario; Perreault, Luc; Bobée, Bernard

    2004-03-01

    This paper proposes an approach for analyzing rainfall variability over West Africa during the 1950–90 period. Three grid boxes, corresponding to three selected areas over West Africa, have been constructed. For each candidate area, the set of annual grid maps is stored in 3D matrices reflecting time and geographical position, called here space-time grid boxes. Each space-time grid box contains grid points corresponding to a given gauging year. The Bayesian procedure, based on a single-shift model, is applied to grid points extracted from mean meridional and latitudinal cross sections of each space-time grid box. Two different problems are considered: the first is the detection of a change, while the second is the estimation of the changepoint and its amplitude under the assumption that a change has occurred. The Bayesian single-shift model is applied to grid points extracted from each cross section. A latitude-by-latitude and longitude-by-longitude analysis of the rainfall climatology changes is thus carried out. It is pointed out that the most significant rainfall climatological changes in the Sahel most probably occurred between 1965 and 1970, with a decrease in the mean level of annual rainfall. This deficit is very high over the coastal region of Senegal (25%) and over the central region of the Sahel (15%–20%). Below approximately 9°–10°N, over the humid West Africa region, a zone without any significant change extending from 6° to 10°N was highlighted. A similar zone with nonsignificant rainfall change was identified along the cross section at 1.5°E, which follows the border of Togo and Benin. However, over the zones along the coast of Ivory Coast, a deficit of about 17% is observed.
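
    A minimal sketch of single-changepoint detection on one grid-point series, with a known noise level and plug-in segment means (a simplification of the full Bayesian single-shift model, which integrates the means out); the rainfall series is simulated, not taken from the record:

        import numpy as np

        rng = np.random.default_rng(1)

        # Simulated annual rainfall (mm) with a shift in the mean around
        # 1968, mimicking the Sahelian drying reported in the abstract.
        years = np.arange(1950, 1991)
        y = np.where(years < 1968, 800.0, 650.0) + rng.normal(0, 60, years.size)

        sigma = 60.0  # assume a known observation std dev for simplicity

        # Posterior over the change year tau under a uniform prior, using
        # plug-in (profile) segment means for a compact illustration.
        log_post = np.full(years.size, -np.inf)
        for k in range(1, years.size - 1):
            mu1, mu2 = y[:k].mean(), y[k:].mean()
            resid = np.concatenate([y[:k] - mu1, y[k:] - mu2])
            log_post[k] = -0.5 * np.sum(resid**2) / sigma**2

        post = np.exp(log_post - log_post.max())
        post /= post.sum()
        k_hat = int(np.argmax(post))
        print("most probable change year:", years[k_hat])
        print(f"shift amplitude estimate: {y[k_hat:].mean() - y[:k_hat].mean():.1f} mm")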

  1. Sensitivity analysis of structural parameters to measurement noninvariance: A Bayesian approach

    NASA Astrophysics Data System (ADS)

    Kang, Yoon Jeong

    Most previous studies have argued that the validity of group comparisons of structural parameters depends on the extent to which measurement invariance is met. Although some researchers have supported the concept of partial invariance, there is still no clear-cut level of partial invariance that is sufficient for valid group comparisons. In addition, relatively little attention has been paid to the implications of failing to achieve measurement invariance (e.g., partial measurement invariance) for group comparisons of the underlying latent constructs in the multiple-group confirmatory factor analysis (MGCFA) framework. Given this, the purpose of the current study was to examine the extent to which measurement noninvariance affects structural parameter comparisons across populations in the MGCFA framework. In particular, this study takes a Bayesian approach to investigate the sensitivity of the posterior distribution of the structural parameter difference to varying types and magnitudes of noninvariance across two populations. A Monte Carlo simulation was performed to investigate this sensitivity empirically. In order to assess the sensitivity to the noninvariance conditions, three outcome variables were evaluated: (1) accuracy of the statistical conclusion on the structural parameter difference, (2) precision of the estimated structural parameter difference, and (3) bias in the posterior mean of the structural parameter difference. Inconsistent with findings of previous studies, the results showed that the three outcome variables were not sensitive to the type or magnitude of noninvariance across all conditions. Instead, the three outcome variables were sensitive to sample size, factor loading size, and prior distribution. These results indicate that even under a large magnitude of measurement noninvariance, accurate conclusions and

  2. Motion analysis of knee joint using dynamic volume images

    NASA Astrophysics Data System (ADS)

    Haneishi, Hideaki; Kohno, Takahiro; Suzuki, Masahiko; Moriya, Hideshige; Mori, Sin-ichiro; Endo, Masahiro

    2006-03-01

    Acquisition and analysis of the three-dimensional movement of the knee joint is desired in orthopedic surgery. We have developed two methods to obtain dynamic volume images of the knee joint. One is a 2D/3D registration method combining bi-plane dynamic X-ray fluoroscopy with a static three-dimensional CT; the other uses so-called 4D-CT, with a cone beam and a wide 2D detector. In this paper, we present two analyses of knee joint movement obtained by these methods: (1) the transition of the nearest points between the femur and tibia, and (2) a principal component analysis (PCA) of six parameters representing the three-dimensional movement of the knee. As preprocessing for the analysis, the femur and tibia regions are first extracted from the volume data at each time frame, and the tibia is then registered between frames by an affine transformation consisting of rotation and translation. The same transformation is applied to the femur as well. Using those image data, the movement of the femur relative to the tibia can be analyzed. Six movement parameters of the femur, consisting of three translation parameters and three rotation parameters, are obtained from those images. In analysis (1), the axis of each bone is first found and the flexion angle of the knee joint is calculated. For each flexion angle, the minimum distance between femur and tibia, and the location giving that minimum distance, are found for both the lateral and medial condyles. As a result, it was observed that the movement of the lateral condyle is larger than that of the medial condyle. In analysis (2), it was found that the movement of the knee can be represented by the first three principal components with a precision of 99.58%, and those three components seem to relate strongly to three major movements of the femur in the knee bend known in orthopedic surgery.
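
    A minimal sketch of the PCA step in analysis (2): decompose a frames-by-parameters motion matrix and report the variance captured by the first three components. The motion data below are random stand-ins, not measurements:

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical knee pose trajectory: 50 time frames x 6 parameters
        # (3 translations, 3 rotations), with most variance in 3 directions.
        latent = rng.normal(size=(50, 3)) * np.array([5.0, 2.0, 1.0])
        mixing = rng.normal(size=(3, 6))
        X = latent @ mixing + rng.normal(scale=0.05, size=(50, 6))

        # PCA via SVD of the centered data matrix.
        Xc = X - X.mean(axis=0)
        _, s, _ = np.linalg.svd(Xc, full_matrices=False)
        explained = s**2 / np.sum(s**2)

        print("variance explained by first 3 PCs:",
              f"{100 * explained[:3].sum():.2f}%")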

  3. Joint analysis of the seismic data and velocity gravity model

    NASA Astrophysics Data System (ADS)

    Belyakov, A. S.; Lavrov, V. S.; Muchamedov, V. A.; Nikolaev, A. V.

    2016-03-01

    We performed a joint analysis of the seismic noise recorded at the Japanese Ogasawara station, located on Titijima Island in the Philippine Sea, using the STS-2 seismograph at the OSW station during the winter period of January 1-15, 2015, against the background of a velocity gravity model. The graphs support the existence of a cause-and-effect relation between the seismic noise and gravity and allow us to consider it as a desired signal.

  4. Bayesian meta-analysis of diagnostic tests allowing for imperfect reference standards.

    PubMed

    Menten, J; Boelaert, M; Lesaffre, E

    2013-12-30

    There is an increasing interest in meta-analyses of rapid diagnostic tests (RDTs) for infectious diseases. To avoid spectrum bias, these meta-analyses should focus on phase IV studies performed in the target population. For many infectious diseases, these target populations attend primary health care centers in resource-constrained settings where it is difficult to perform gold standard diagnostic tests. As a consequence, phase IV diagnostic studies often use imperfect reference standards, which may result in biased meta-analyses of the diagnostic accuracy of novel RDTs. We extend the standard bivariate model for the meta-analysis of diagnostic studies to correct for differing and imperfect reference standards in the primary studies and to accommodate data from studies that try to overcome the absence of a true gold standard through the use of latent class analysis. Using Bayesian methods, improved estimates of sensitivity and specificity are possible, especially when prior information is available on the diagnostic accuracy of the reference test. In this analysis, the deviance information criterion can be used to detect conflicts between the prior information and observed data. When applying the model to a dataset of the diagnostic accuracy of an RDT for visceral leishmaniasis, the standard meta-analytic methods appeared to underestimate the specificity of the RDT. PMID:24003003
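
    A minimal sketch of the bias mechanism the model corrects for: the apparent accuracy of an index test judged against an imperfect reference, derived under conditional independence of the two tests given true disease status. All values are invented:

        # Apparent sensitivity/specificity of an index RDT evaluated
        # against an imperfect reference, assuming conditional
        # independence given true status (illustrative values only).
        se_t, sp_t = 0.95, 0.90   # true accuracy of the index RDT
        se_r, sp_r = 0.90, 0.95   # accuracy of the imperfect reference
        prev = 0.30               # prevalence in the target population

        # Joint probabilities of (index, reference) results,
        # marginalized over the unobserved true disease status.
        p_pp = prev * se_t * se_r + (1 - prev) * (1 - sp_t) * (1 - sp_r)
        p_pn = prev * se_t * (1 - se_r) + (1 - prev) * (1 - sp_t) * sp_r
        p_np = prev * (1 - se_t) * se_r + (1 - prev) * sp_t * (1 - sp_r)
        p_nn = prev * (1 - se_t) * (1 - se_r) + (1 - prev) * sp_t * sp_r

        apparent_se = p_pp / (p_pp + p_np)   # conditioned on reference +
        apparent_sp = p_nn / (p_nn + p_pn)   # conditioned on reference -
        print(f"apparent sensitivity {apparent_se:.3f} (true {se_t})")
        print(f"apparent specificity {apparent_sp:.3f} (true {sp_t})")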

  6. Bayesian Inference for Neural Electromagnetic Source Localization: Analysis of MEG Visual Evoked Activity

    SciTech Connect

    George, J.S.; Schmidt, D.M.; Wood, C.C.

    1999-02-01

    We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e., consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented.

  7. Application of Bayesian and cost benefit risk analysis in water resources management

    NASA Astrophysics Data System (ADS)

    Varouchakis, E. A.; Palogos, I.; Karatzas, G. P.

    2016-03-01

    Decision making is a significant tool in water resources management applications. This technical note approaches a decision dilemma that has not previously been considered in the water resources management of a watershed. A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined applies a scaled parabolic fine that varies with the magnitude of over-pumping violations, in contrast to common practices that usually consider short-term fines. The methodological steps are presented analytically, together with originally developed code. Such an application, presented at this level of detail, is new. The results indicate that uncertainty in the probabilities is the driving issue that determines the optimal decision under each methodology, and depending on how the unknown probabilities are handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water-table limit violations can incur within an audit interval. In contrast to practices that assess the effect of each proposed action separately, considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits.
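
    A minimal sketch of the expected-cost comparison at the core of such a decision analysis; the probabilities and costs are invented placeholders, not values from the note:

        # Hypothetical two-action decision analysis (values illustrative).
        p_shortage = 0.35            # prior probability of a water shortage
        cost_reservoir = 4.0e6       # construction cost, always paid
        fine_if_shortage = 9.0e6     # expected over-pumping fines if dry
        fine_if_normal = 1.0e6       # smaller fines even in a normal year

        ev_build = cost_reservoir
        ev_no_build = (p_shortage * fine_if_shortage
                       + (1 - p_shortage) * fine_if_normal)

        print(f"E[cost | build]    = {ev_build:,.0f}")
        print(f"E[cost | no build] = {ev_no_build:,.0f}")
        print("optimal:", "build" if ev_build < ev_no_build else "do not build")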

  8. The Phylogeographic History of the New World Screwworm Fly, Inferred by Approximate Bayesian Computation Analysis

    PubMed Central

    Azeredo-Espin, Ana Maria L.

    2013-01-01

    Insect pest phylogeography might be shaped both by biogeographic events and by human influence. Here, we conducted an approximate Bayesian computation (ABC) analysis to investigate the phylogeography of the New World screwworm fly, Cochliomyia hominivorax, with the aim of understanding its population history and its order and time of divergence. Our ABC analysis supports the view that populations spread from North to South in the Americas in at least two different periods. The first split occurred between the North/Central American and South American populations at the end of the Last Glacial Maximum (15,300-19,000 YBP). The second split occurred between the North and South Amazonian populations at the transition between the Pleistocene and Holocene epochs (9,100-11,000 YBP). The species also experienced population expansion. Phylogenetic analysis likewise suggests this north-to-south colonization, and Maxent models suggest an increase in the number of suitable areas in South America from the past to the present. We found that the phylogeographic patterns observed in C. hominivorax cannot be explained by climatic oscillations alone and can be connected to host population histories. Interestingly, we found that these patterns coincide closely with general patterns of ancient human movements in the Americas, suggesting that humans might have played a crucial role in shaping the distribution and population structure of this insect pest. This work presents the first hypothesis test regarding the processes that shaped the current phylogeographic structure of C. hominivorax and represents an alternative perspective on investigating the problem of insect pests. PMID:24098436
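
    A minimal sketch of ABC rejection sampling, the engine behind an analysis like this one; the one-statistic "simulator", prior range, and tolerance are all invented for illustration:

        import numpy as np

        rng = np.random.default_rng(3)

        # Toy ABC target: infer a divergence time T (years before present)
        # from one observed summary statistic. The simulator is invented:
        # divergence accumulates linearly in T plus noise.
        obs = 30.0    # "observed" summary statistic
        eps = 0.5     # acceptance tolerance

        # 1) draw parameters from the prior
        prior_draws = rng.uniform(5_000, 25_000, size=200_000)
        # 2) simulate a summary statistic for each draw
        sims = 0.002 * prior_draws + rng.normal(0, 0.5, prior_draws.size)
        # 3) keep draws whose simulated summary is close to the observed
        accepted = prior_draws[np.abs(sims - obs) < eps]

        print(f"accepted {accepted.size} of {prior_draws.size} draws")
        print(f"posterior mean T ≈ {accepted.mean():,.0f}")
        print(f"95% credible interval ≈ "
              f"[{np.percentile(accepted, 2.5):,.0f}, "
              f"{np.percentile(accepted, 97.5):,.0f}]")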

  9. Estimation of protein folding free energy barriers from calorimetric data by multi-model Bayesian analysis.

    PubMed

    Naganathan, Athi N; Perez-Jimenez, Raul; Muñoz, Victor; Sanchez-Ruiz, Jose M

    2011-10-14

    The realization that folding free energy barriers can be small enough to result in significant population of the species at the barrier top has spawned several methods to estimate folding barriers from equilibrium experiments. Some of these approaches are based on fitting the experimental thermogram measured by differential scanning calorimetry (DSC) to a one-dimensional representation of the folding free-energy surface (FES). Different physical models have been used to represent the FES: (1) a Landau quartic polynomial as a function of the total enthalpy, which acts as an order parameter; (2) the projection onto a structural order parameter (i.e., number of native residues or native contacts) of the free energy of all the conformations generated by Ising-like statistical mechanical models; and (3) mean-field models that define conformational entropy and stabilization energy as functions of a continuous local order parameter. The fundamental question that emerges is how to obtain robust, model-independent estimates of the thermodynamic folding barrier from the analysis of DSC experiments. Here we address this issue by comparing the performance of various FES models in interpreting the thermogram of a protein with a marginal folding barrier. We chose the small α-helical protein PDD, which folds and unfolds in microseconds, crossing a free energy barrier previously estimated at ~1 RT. The fits of the PDD thermogram to the various models and assumptions produce FES with a consistently small free energy barrier separating the folded and unfolded ensembles. However, the fits vary in quality as well as in the estimated barrier. Applying Bayesian probabilistic analysis, we rank the fit performance using a statistically rigorous criterion that leads to a global estimate of the folding barrier and its precision, which for PDD is 1.3 ± 0.4 kJ mol(-1). This result confirms that PDD folds over a minor barrier, consistent with the downhill folding regime. We have further
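
    A minimal sketch of ranking competing fits and pooling their barrier estimates; BIC-based weights are used here as a crude stand-in for the marginal-likelihood comparison in the paper, and every number is invented:

        import numpy as np

        # Hypothetical fits of one DSC thermogram with three FES models:
        # maximized log-likelihood, parameter count, and the barrier each
        # fit implies (all values invented).
        logL = np.array([512.0, 515.5, 514.0])
        k = np.array([4, 6, 5])               # fitted parameters per model
        barrier = np.array([1.0, 1.6, 1.3])   # kJ/mol
        n = 200                               # data points in thermogram

        # BIC as a crude proxy for each model's marginal likelihood.
        bic = -2 * logL + k * np.log(n)
        w = np.exp(-0.5 * (bic - bic.min()))
        w /= w.sum()

        print("model weights:", np.round(w, 3))
        print(f"model-averaged barrier: {np.sum(w * barrier):.2f} kJ/mol")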

  10. A BAYESIAN HIERARCHICAL SPATIAL POINT PROCESS MODEL FOR MULTI-TYPE NEUROIMAGING META-ANALYSIS

    PubMed Central

    Kang, Jian; Nichols, Thomas E.; Wager, Tor D.; Johnson, Timothy D.

    2014-01-01

    Neuroimaging meta-analysis is an important tool for finding consistent effects over studies that each usually have 20 or fewer subjects. Interest in meta-analysis in brain mapping is also driven by a recent focus on so-called “reverse inference”: whereas traditional “forward inference” identifies the regions of the brain involved in a task, a reverse inference identifies the cognitive processes that a task engages. Such reverse inferences, however, require a set of meta-analyses, one for each possible cognitive domain. However, existing methods for neuroimaging meta-analysis have significant limitations. Commonly used methods for neuroimaging meta-analysis are not model based, do not provide interpretable parameter estimates, and only produce null hypothesis inferences; further, they are generally designed for a single group of studies and cannot produce reverse inferences. In this work we address these limitations by adopting a non-parametric Bayesian approach for meta-analysis of data from multiple classes or types of studies. In particular, foci from each type of study are modeled as a cluster process driven by a random intensity function that is modeled as a kernel convolution of a gamma random field. The type-specific gamma random fields are linked and modeled as a realization of a common gamma random field, shared by all types, that induces correlation between study types and mimics the behavior of a univariate mixed effects model. We illustrate our model on simulation studies and a meta-analysis of five emotions from 219 studies, and check model fit by a posterior predictive assessment. In addition, we implement reverse inference by using the model to predict study type from a newly presented study. We evaluate this predictive performance via leave-one-out cross-validation, efficiently implemented using importance sampling techniques. PMID:25426185

  12. Bayesian Sequential Meta-analysis Design in Evaluating Cardiovascular Risk in a New Antidiabetic Drug Development Program

    PubMed Central

    Chen, Ming-Hui; Ibrahim, Joseph G.; Xia, H. Amy; Liu, Thomas; Hennessey, Violeta

    2014-01-01

    Recently, the Center for Drug Evaluation and Research at the Food and Drug Administration (FDA) released a guidance that makes recommendations about how to demonstrate that a new antidiabetic therapy to treat Type 2 diabetes is not associated with an unacceptable increase in cardiovascular risk. One of the recommendations from the guidance is that Phase II and III trials should be appropriately designed and conducted so that a meta-analysis can be performed. In addition, the guidance implies that a sequential meta-analysis strategy could be adopted. That is, the initial meta-analysis could aim to demonstrate that the upper bound of a 95% confidence interval (CI) for the estimated hazard ratio is < 1.8, for the purpose of enabling a new drug application (NDA) or a biologics license application (BLA). Subsequently, after marketing authorization, a final meta-analysis would need to show the upper bound to be < 1.3. In this context, we develop a new Bayesian sequential meta-analysis approach using survival regression models to assess whether the size of a clinical development program is adequate to evaluate a particular safety endpoint. We propose a Bayesian sample size determination methodology for sequential meta-analysis clinical trial design, with a focus on controlling the familywise Type I error rate and power. The partial borrowing power prior is used to incorporate the historical survival meta-data into the Bayesian design. Various properties of the proposed methodology are examined, and simulation-based computational algorithms are developed to generate predictive data at various interim analyses, sample from the posterior distributions, and compute quantities such as the power and the Type I error in the Bayesian sequential meta-analysis trial design. The proposed methodology is applied to the design of a hypothetical antidiabetic drug development program for evaluating cardiovascular risk. PMID:24343859
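
    A minimal sketch of the guidance's two-margin criterion under a normal approximation to the log hazard ratio; the pooled estimate and standard error are invented, and the actual paper works with survival regression models and power priors rather than this shortcut:

        import numpy as np
        from scipy import stats

        # Hypothetical pooled estimate from an interim meta-analysis:
        # log hazard ratio and its standard error (invented numbers).
        log_hr, se = np.log(1.05), 0.12

        upper95 = np.exp(log_hr + stats.norm.ppf(0.975) * se)
        print(f"HR upper 95% bound: {upper95:.2f}")
        for margin in (1.8, 1.3):
            print(f"  margin {margin}: "
                  f"{'met' if upper95 < margin else 'not met'}")

        # Bayesian reading of the same approximation: posterior
        # probability that the true HR lies below each margin.
        for margin in (1.8, 1.3):
            p = stats.norm.cdf((np.log(margin) - log_hr) / se)
            print(f"P(HR < {margin}) = {p:.3f}")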

  13. What’s in a Name: A Bayesian Hierarchical Analysis of the Name-Letter Effect

    PubMed Central

    Dyjas, Oliver; Grasman, Raoul P. P. P.; Wetzels, Ruud; van der Maas, Han L. J.; Wagenmakers, Eric-Jan

    2012-01-01

    People generally prefer their initials to the other letters of the alphabet, a phenomenon known as the name-letter effect. This effect, researchers have argued, makes people move to certain cities, buy particular brands of consumer products, and choose particular professions (e.g., Angela moves to Los Angeles, Phil buys a Philips TV, and Dennis becomes a dentist). In order to establish such associations between people’s initials and their behavior, researchers typically carry out statistical analyses of large databases. Current methods of analysis ignore the hierarchical structure of the data, do not naturally handle order-restrictions, and are fundamentally incapable of confirming the null hypothesis. Here we outline a Bayesian hierarchical analysis that avoids these limitations and allows coherent inference both on the level of the individual and on the level of the group. To illustrate our method, we re-analyze two data sets that address the question of whether people are disproportionately likely to live in cities that resemble their name. PMID:23055989

  14. Built environment and Property Crime in Seattle, 1998–2000: A Bayesian Analysis

    PubMed Central

    Matthews, Stephen A.; Yang, Tse-chuan; Hayslett-McCall, Karen L.; Ruback, R. Barry

    2014-01-01

    The past decade has seen a rapid growth in the use of a spatial perspective in studies of crime. In part this growth has been driven by the availability of georeferenced data and the tools to analyze and visualize them: geographic information systems (GIS), spatial analysis, and spatial statistics. In this paper we use exploratory spatial data analysis (ESDA) tools and Bayesian models to help better understand the spatial patterning and predictors of property crime in Seattle, Washington for 1998–2000, including a focus on built environment variables. We present results for aggregate property crime data as well as models for specific property crime types: residential burglary, nonresidential burglary, theft, auto theft, and arson. ESDA confirms the presence of spatial clustering of property crime, and we seek to explain these patterns using spatial Poisson models implemented in WinBUGS. Our results indicate that built environment variables were significant predictors of property crime, especially the effect of the presence of a highway on auto theft and burglary. PMID:24737924

  15. Computational model of an infant brain subjected to periodic motion: simplified modelling and Bayesian sensitivity analysis.

    PubMed

    Batterbee, D C; Sims, N D; Becker, W; Worden, K; Rowson, J

    2011-11-01

    Non-accidental head injury in infants, or shaken baby syndrome, is a highly controversial and disputed topic. Biomechanical studies often suggest that shaking alone cannot cause the classical symptoms, yet many medical experts believe the contrary. Researchers have turned to finite element modelling for a more detailed understanding of the interactions between the brain, skull, cerebrospinal fluid (CSF), and surrounding tissues. However, the uncertainties in such models are significant; these can arise from theoretical approximations, lack of information, and inherent variability. Consequently, this study presents an uncertainty analysis of a finite element model of a human head subjected to shaking. Although the model geometry was greatly simplified, fluid-structure-interaction techniques were used to model the brain, skull, and CSF using an Eulerian mesh formulation with penalty-based coupling. Uncertainty and sensitivity measurements were obtained using Bayesian sensitivity analysis, a technique that is relatively new to the engineering community. Uncertainty in nine different model parameters was investigated for two different shaking excitations: sinusoidal translation only, and sinusoidal translation plus rotation about the base of the head. The level and type of sensitivity in the results were found to be highly dependent on the excitation type.

  16. BayesPeak: Bayesian analysis of ChIP-seq data

    PubMed Central

    Spyrou, Christiana; Stark, Rory; Lynch, Andy G; Tavaré, Simon

    2009-01-01

    Background High-throughput sequencing technology has become popular and widely used to study protein and DNA interactions. Chromatin immunoprecipitation, followed by sequencing of the resulting samples, produces large amounts of data that can be used to map genomic features such as transcription factor binding sites and histone modifications. Methods Our proposed statistical algorithm, BayesPeak, uses a fully Bayesian hidden Markov model to detect enriched locations in the genome. The structure accommodates the natural features of the Solexa/Illumina sequencing data and allows for overdispersion in the abundance of reads in different regions. Moreover, a control sample can be incorporated in the analysis to account for experimental and sequence biases. Markov chain Monte Carlo algorithms are applied to estimate the posterior distributions of the model parameters, and posterior probabilities are used to detect the sites of interest. Conclusion We have presented a flexible approach for identifying peaks from ChIP-seq reads, suitable for use on both transcription factor binding and histone modification data. Our method estimates probabilities of enrichment that can be used in downstream analysis. The method is assessed using experimentally verified data and is shown to provide high-confidence calls with low false positive rates. PMID:19772557
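
    A minimal sketch of the hidden Markov machinery behind a peak caller of this kind: a two-state (background/enriched) HMM with Poisson emissions over binned read counts, decoded with the forward-backward algorithm. This is a simplification (BayesPeak's actual model is fully Bayesian with overdispersed emissions), and every parameter below is invented:

        import numpy as np
        from scipy.stats import poisson

        rng = np.random.default_rng(4)

        # Simulated read counts in 300 genomic bins: background rate 2,
        # one enriched region (bins 120-139) with rate 12 (all invented).
        truth = np.zeros(300, dtype=int)
        truth[120:140] = 1
        counts = rng.poisson(np.where(truth == 1, 12.0, 2.0))

        # Two-state HMM: state 0 = background, state 1 = enriched.
        pi = np.array([0.95, 0.05])                  # initial distribution
        A = np.array([[0.98, 0.02], [0.10, 0.90]])   # transition matrix
        lam = np.array([2.0, 12.0])                  # Poisson emission rates
        B = poisson.pmf(counts[:, None], lam)        # emission likelihoods

        # Scaled forward-backward recursion for posterior state probs.
        T = counts.size
        alpha = np.zeros((T, 2))
        beta = np.ones((T, 2))
        alpha[0] = pi * B[0] / np.sum(pi * B[0])
        for t in range(1, T):
            a = (alpha[t - 1] @ A) * B[t]
            alpha[t] = a / a.sum()
        for t in range(T - 2, -1, -1):
            b = A @ (B[t + 1] * beta[t + 1])
            beta[t] = b / b.sum()

        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)
        called = np.where(gamma[:, 1] > 0.5)[0]      # bins called enriched
        if called.size:
            print("enriched bins:", called.min(), "to", called.max())
        else:
            print("no bins called enriched")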

  17. Bayesian analysis of uncertainty in predisposing and triggering factors for landslides hazard assessment

    NASA Astrophysics Data System (ADS)

    Sandric, I.; Petropoulos, Y.; Chitu, Z.; Mihai, B.

    2012-04-01

    The landslide hazard analysis model takes into consideration both predisposing and triggering factors, combined into a Bayesian temporal network with uncertainty propagation. The model uses as predisposing factors the first and second derivatives of the DEM, effective precipitation, runoff, lithology and land use. The last of these is expressed not as land-use classes, as in CORINE for example, but as leaf area index (LAI). The LAI offers the advantage of modelling not just changes between different periods expressed in years, but also seasonal changes in land use throughout a year. The LAI index was derived from a Landsat time series of images from 1984 up to 2011. All the images available for the Panatau administrative unit in Buzau County, Romania, were downloaded from http://earthexplorer.usgs.gov, including images with cloud cover. The model is run at a monthly time step, and for each time step all the parameter values and the a priori, conditional and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that occurred during the period up to the active time step and checks the recorded probabilities and parameter values for those time steps against the values of the active time step. Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created.
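
    A minimal sketch of the monthly updating cycle described: for one cell, each month's triggering observation updates the prior probability via Bayes' rule, and the posterior is carried forward as the next month's prior. All probabilities are invented:

        # Hypothetical sequential Bayesian update for one map cell.
        # States: prone to sliding vs not. Monthly evidence: whether
        # effective precipitation exceeded a threshold (values invented).
        p_s = 0.10                 # initial (a priori) P(prone)
        p_evid_given_s = 0.70      # P(high precipitation | prone)
        p_evid_given_not_s = 0.20  # P(high precipitation | not prone)

        monthly_high_precip = [True, True, False, True]

        for month, high in enumerate(monthly_high_precip, start=1):
            like_s = p_evid_given_s if high else 1 - p_evid_given_s
            like_n = p_evid_given_not_s if high else 1 - p_evid_given_not_s
            p_s = like_s * p_s / (like_s * p_s + like_n * (1 - p_s))
            print(f"month {month}: posterior P(prone) = {p_s:.3f}")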

  18. The Analysis of Adhesively Bonded Advanced Composite Joints Using Joint Finite Elements

    NASA Technical Reports Server (NTRS)

    Stapleton, Scott E.; Waas, Anthony M.

    2012-01-01

    The design and sizing of adhesively bonded joints has always been a major bottleneck in the design of composite vehicles. Dense finite element (FE) meshes are required to capture the full behavior of a joint numerically, but these dense meshes are impractical in vehicle-scale models, where a coarse mesh is more desirable for making quick assessments and comparisons of different joint geometries. Analytical models are often helpful in sizing, but difficulties arise in coupling these models with full-vehicle FE models. Therefore, a joint FE was created which can be used within structural FE models to make quick assessments of bonded composite joints. The shape functions of the joint FE were found by solving the governing equations for a structural model of a joint. By analytically determining the shape functions of the joint FE, the complex joint behavior can be captured with very few elements. This joint FE was modified and used to consider adhesives with functionally graded material properties to reduce the peel stress concentrations located near adherend discontinuities. Several practical concerns impede the actual use of such adhesives. These include increased manufacturing complications, alterations to the grading due to adhesive flow during manufacturing, and whether changing the loading conditions significantly impacts the effectiveness of the grading. An analytical study is conducted to address these three concerns. Furthermore, proof-of-concept testing is conducted to show the potential advantages of functionally graded adhesives. In this study, grading is achieved by strategically placing glass beads within the adhesive layer at different densities along the joint. Furthermore, the capability to model non-linear adhesive constitutive behavior with large rotations was developed, and progressive failure of the adhesive was modeled by re-meshing the joint as the adhesive fails. Results predicted using the joint FE were compared with experimental results for various

  19. The analysis of adhesively bonded advanced composite joints using joint finite elements

    NASA Astrophysics Data System (ADS)

    Stapleton, Scott E.

    The design and sizing of adhesively bonded joints has always been a major bottleneck in the design of composite vehicles. Dense finite element (FE) meshes are required to capture the full behavior of a joint numerically, but these dense meshes are impractical in vehicle-scale models, where a coarse mesh is more desirable for making quick assessments and comparisons of different joint geometries. Analytical models are often helpful in sizing, but difficulties arise in coupling these models with full-vehicle FE models. Therefore, a joint FE was created which can be used within structural FE models to make quick assessments of bonded composite joints. The shape functions of the joint FE were found by solving the governing equations for a structural model of a joint. By analytically determining the shape functions of the joint FE, the complex joint behavior can be captured with very few elements. This joint FE was modified and used to consider adhesives with functionally graded material properties to reduce the peel stress concentrations located near adherend discontinuities. Several practical concerns impede the actual use of such adhesives. These include increased manufacturing complications, alterations to the grading due to adhesive flow during manufacturing, and whether changing the loading conditions significantly impacts the effectiveness of the grading. An analytical study is conducted to address these three concerns. Furthermore, proof-of-concept testing is conducted to show the potential advantages of functionally graded adhesives. In this study, grading is achieved by strategically placing glass beads within the adhesive layer at different densities along the joint. Furthermore, the capability to model non-linear adhesive constitutive behavior with large rotations was developed, and progressive failure of the adhesive was modeled by re-meshing the joint as the adhesive fails. Results predicted using the joint FE were compared with experimental results for various

  20. Experimental analysis of a joint free space cryptosystem

    NASA Astrophysics Data System (ADS)

    Ramírez, John Fredy Barrera; Osorio, Alexis Jaramillo; Zea, Alejandro Vélez; Torroba, Roberto

    2016-08-01

    In this paper, we analyze a joint free space cryptosystem scheme implemented in an actual laboratory environment. In this encrypting architecture, the object to be encoded and the security key are placed side by side in the input plane, without optical elements between the input and the output planes. To obtain the encrypted information, the joint Fresnel power distribution (JFPD) coming from the input plane is registered with a CMOS camera. The information of the encrypting key is registered with an off-axis Fresnel holographic setup. The data registered with the experimental setup are digitally filtered to obtain the encrypted object and the encryption key. In addition, we explore the performance of the experimental system as a function of the object-camera and key-camera distances, two new parameters of interest that become available as a result of this encrypting scheme. The theoretical and experimental analysis shows the validity and applicability of the cryptosystem.

  1. Joint optimization of algorithmic suites for EEG analysis.

    PubMed

    Santana, Eder; Brockmeier, Austin J; Principe, Jose C

    2014-01-01

    Electroencephalogram (EEG) data analysis algorithms consist of multiple processing steps, each with a number of free parameters. A joint optimization methodology can be used as a wrapper to fine-tune these parameters for the patient or application. This approach is inspired by deep learning neural network models, but differs because the processing layers for EEG are heterogeneous, with different approaches used for processing space and time. Nonetheless, we treat the processing stages as a neural network and apply backpropagation to jointly optimize the parameters. This approach outperforms previous results on the BCI Competition II dataset IV; additionally, it outperforms the common spatial patterns (CSP) algorithm on the BCI Competition III dataset IV. In addition, the optimized parameters in the architecture remain interpretable. PMID:25570621

  2. BAYESIAN ANALYSIS TO IDENTIFY NEW STAR CANDIDATES IN NEARBY YOUNG STELLAR KINEMATIC GROUPS

    SciTech Connect

    Malo, Lison; Doyon, Rene; Lafreniere, David; Artigau, Etienne; Gagne, Jonathan; Baron, Frederique; Riedel, Adric

    2013-01-10

    We present a new method based on a Bayesian analysis to identify new members of nearby young kinematic groups. The analysis minimally takes into account the position, proper motion, magnitude, and color of a star, but other observables can be readily added (e.g., radial velocity, distance). We use this method to find new young low-mass stars in the β Pictoris and AB Doradus moving groups and in the TW Hydrae, Tucana-Horologium, Columba, Carina, and Argus associations. Starting from a sample of 758 mid-K to mid-M (K5V-M5V) stars showing youth indicators such as Hα and X-ray emission, our analysis yields 214 new highly probable low-mass members of the kinematic groups analyzed. One is in TW Hydrae, 37 in β Pictoris, 17 in Tucana-Horologium, 20 in Columba, 6 in Carina, 50 in Argus, 32 in AB Doradus, and the remaining 51 candidates are likely young but have an ambiguous membership in more than one association. The false alarm rate for new candidates is estimated to be 5% for β Pictoris and TW Hydrae, 10% for Tucana-Horologium, Columba, Carina, and Argus, and 14% for AB Doradus. Our analysis confirms the membership of 58 stars proposed in the literature. Firm membership confirmation of our new candidates will require measurement of their radial velocity (predicted by our analysis), parallax, and lithium 6708 Å equivalent width. We have initiated these follow-up observations for a number of candidates, and we have identified two stars (2MASSJ01112542+1526214, 2MASSJ05241914-1601153) as very strong candidate members of the β Pictoris moving group and one strong candidate member (2MASSJ05332558-5117131) of the Tucana-Horologium association; these three stars have radial velocity measurements confirming their membership and lithium detections consistent with young age.
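
    A minimal sketch of the membership computation, reduced (hypothetically) to a single two-component observable: each group is a Gaussian in proper-motion space with a prior weight, and Bayes' rule gives the membership probability. All parameters are invented:

        import numpy as np
        from scipy.stats import multivariate_normal

        # Hypothetical 2D observable (proper motion in RA/Dec, mas/yr)
        # models for two groups plus the field; all parameters invented.
        groups = {
            "beta Pic": (np.array([20.0, -15.0]), np.diag([25.0, 25.0]), 0.05),
            "AB Dor":   (np.array([60.0, -90.0]), np.diag([40.0, 40.0]), 0.05),
            "field":    (np.array([0.0, 0.0]),    np.diag([900.0, 900.0]), 0.90),
        }

        star = np.array([22.0, -18.0])  # measured proper motion of one star

        # Posterior membership: P(group | obs) ∝ P(obs | group) P(group).
        post = {name: multivariate_normal.pdf(star, mu, cov) * prior
                for name, (mu, cov, prior) in groups.items()}
        total = sum(post.values())
        for name, p in post.items():
            print(f"P({name} | obs) = {p / total:.3f}")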

  3. Bayesian value-of-information analysis. An application to a policy model of Alzheimer's disease.

    PubMed

    Claxton, K; Neumann, P J; Araki, S; Weinstein, M C

    2001-01-01

    A framework is presented that distinguishes the conceptually separate decisions of which treatment strategy is optimal from the question of whether more information is required to inform this choice in the future. The authors argue that the choice of treatment strategy should be based on expected utility, and that the only valid reason to characterize the uncertainty surrounding outcomes of interest is to establish the value of acquiring additional information. A Bayesian decision-theoretic approach is demonstrated through a probabilistic analysis of a published policy model of Alzheimer's disease. The expected value of perfect information is estimated for the decision to adopt a new pharmaceutical for the population of patients with Alzheimer's disease in the United States. This provides an upper bound on the value of additional research. The value of information is also estimated for each of the model inputs. This analysis can focus future research by identifying those parameters where more precise estimates would be most valuable and by indicating whether an experimental design would be required. We also discuss how this type of analysis can be used to design experimental research efficiently (identifying the optimal sample size and optimal sample allocation) based on the marginal cost and marginal benefit of sample information. Value-of-information analysis can provide a measure of the expected payoff from proposed research, which can be used to set priorities in research and development. It can also inform an efficient regulatory framework for new healthcare technologies: an analysis of the value of information would define when a claim for a new technology should be deemed substantiated, and when evidence should be considered competent and reliable because it is not cost-effective to gather any more information. PMID:11329844
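
    A minimal Monte Carlo sketch of the expected value of perfect information (EVPI) for a two-strategy adoption decision of the kind analyzed here; the net-benefit distributions are invented stand-ins for a policy model's outputs:

        import numpy as np

        rng = np.random.default_rng(5)

        # Hypothetical net-benefit samples for two strategies (e.g.,
        # adopt the new drug vs standard care), from invented ranges.
        n = 100_000
        nb_adopt = rng.normal(10_000, 4_000, n)    # per-patient net benefit
        nb_standard = rng.normal(9_000, 1_000, n)

        # Decide now: pick the strategy with the best expected net benefit.
        ev_current = max(nb_adopt.mean(), nb_standard.mean())
        # With perfect information, pick the best arm in every draw.
        ev_perfect = np.maximum(nb_adopt, nb_standard).mean()

        evpi = ev_perfect - ev_current
        print(f"EVPI per patient = {evpi:,.0f}")  # bound on research value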

  4. Joint linkage and association analysis with exome sequence data implicates SLC25A40 in hypertriglyceridemia.

    PubMed

    Rosenthal, Elisabeth A; Ranchalis, Jane; Crosslin, David R; Burt, Amber; Brunzell, John D; Motulsky, Arno G; Nickerson, Deborah A; Wijsman, Ellen M; Jarvik, Gail P

    2013-12-01

    Hypertriglyceridemia (HTG) is a heritable risk factor for cardiovascular disease. Investigating the genetics of HTG may identify new drug targets. There are ~35 known single-nucleotide variants (SNVs) that explain only ~10% of variation in triglyceride (TG) level. Because of the genetic heterogeneity of HTG, a family study design is optimal for identification of rare genetic variants with large effect size because the same mutation can be observed in many relatives and cosegregation with TG can be tested. We considered HTG in a five-generation family of European American descent (n = 121), ascertained for familial combined hyperlipidemia. By using Bayesian Markov chain Monte Carlo joint oligogenic linkage and association analysis, we detected linkage to chromosomes 7 and 17. Whole-exome sequence data revealed shared, highly conserved, private missense SNVs in both SLC25A40 on chr7 and PLD2 on chr17. Jointly, these SNVs explained 49% of the genetic variance in TG; however, only the SLC25A40 SNV was significantly associated with TG (p = 0.0001). This SNV, c.374A>G, causes a highly disruptive p.Tyr125Cys substitution just outside the second helical transmembrane region of the SLC25A40 inner mitochondrial membrane transport protein. Whole-gene testing in subjects from the Exome Sequencing Project confirmed the association between TG and SLC25A40 rare, highly conserved, coding variants (p = 0.03). These results suggest a previously undescribed pathway for HTG and illustrate the power of large pedigrees in the search for rare, causal variants.

  6. Development of Hierarchical Bayesian Model Based on Regional Frequency Analysis and Its Application to Estimate Areal Rainfall in South Korea

    NASA Astrophysics Data System (ADS)

    Kim, J.; Kwon, H. H.

    2014-12-01

    Existing regional frequency analysis has the disadvantage that it is difficult to take geographical characteristics into account when estimating areal rainfall. In this regard, this study aims to develop a hierarchical-Bayesian-model-based regional frequency analysis in which spatial patterns of the design rainfall are explicitly incorporated through geographical information. This study assumes that the parameters of the Gumbel distribution are a function of geographical characteristics (e.g., altitude, latitude and longitude) within a general linear regression framework. Posterior distributions of the regression parameters are estimated by a Bayesian Markov chain Monte Carlo (MCMC) method, and the identified functional relationship is used to spatially interpolate the parameters of the Gumbel distribution using digital elevation models (DEM) as inputs. The proposed model is applied to derive design rainfalls over the entire Han River watershed. It was found that the proposed Bayesian regional frequency analysis model showed results similar to L-moment-based regional frequency analysis. In addition, the model showed an advantage in quantifying the uncertainty of the design rainfall and in estimating areal rainfall with geographical information taken into account. Acknowledgement: This research was supported by a grant (14AWMP-B079364-01) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
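
    A minimal sketch of the interpolation step, assuming (hypothetically) that both Gumbel parameters are linear in altitude alone; in the actual model the coefficients would be posterior draws from MCMC and more covariates would enter:

        import numpy as np

        # Hypothetical regression linking Gumbel parameters to geography:
        # mu = a0 + a1*altitude_km, sigma = b0 + b1*altitude_km (invented).
        a = np.array([45.0, 12.0])
        b = np.array([14.0, 3.0])

        def design_rainfall(altitude_km, return_period=100):
            """T-year design rainfall from a Gumbel distribution whose
            parameters are predicted from site covariates."""
            mu = a[0] + a[1] * altitude_km
            sigma = b[0] + b[1] * altitude_km
            p = 1.0 - 1.0 / return_period
            return mu - sigma * np.log(-np.log(p))  # Gumbel quantile

        for alt in (0.1, 0.5, 1.2):  # altitudes in km (e.g., from a DEM)
            print(f"altitude {alt:.1f} km: 100-yr rainfall ≈ "
                  f"{design_rainfall(alt):.1f} mm")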

  7. Bayesian model accounting for within-class biological variability in Serial Analysis of Gene Expression (SAGE)

    PubMed Central

    Vêncio, Ricardo ZN; Brentani, Helena; Patrão, Diogo FC; Pereira, Carlos AB

    2004-01-01

    Background An important challenge for transcript counting methods such as Serial Analysis of Gene Expression (SAGE), "Digital Northern" or Massively Parallel Signature Sequencing (MPSS) is to carry out statistical analyses that account for within-class variability, i.e., variability due to the intrinsic biological differences among sampled individuals of the same class, and not only variability due to technical sampling error. Results We introduce a Bayesian model that accounts for within-class variability by means of a mixture distribution. We show that the previously available approaches of aggregation in pools ("pseudo-libraries") and the Beta-Binomial model are particular cases of the mixture model. We illustrate our method with a brain tumor vs. normal comparison using SAGE data from public databases. We show examples of tags regarded as differentially expressed with high significance if within-class variability is ignored, but clearly not so significant if one accounts for it. Conclusion Using available information about biological replicates, one can transform a list of candidate transcripts showing differential expression into a more reliable one. Our method is freely available, under the GPL/GNU copyleft, through a user-friendly web-based online tool or as R-language scripts at the supplemental website. PMID:15339345
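
    A minimal sketch of the point the abstract makes: the same tag count looks far less significant once between-individual (within-class) variability is modeled, here contrasting a binomial with a beta-binomial tail. Counts and hyperparameters are invented, and this is only the general phenomenon, not the paper's mixture model:

        import numpy as np
        from scipy import stats

        # Hypothetical SAGE-like counts: a tag seen 30 times among 10,000
        # tags in a tumor library, vs an expected proportion of 0.002.
        k, n, p0 = 30, 10_000, 0.002

        # Ignoring within-class variability: plain binomial tail.
        p_binom = stats.binom.sf(k - 1, n, p0)

        # Accounting for it: beta-binomial with the same mean proportion
        # but extra between-individual spread (a/(a+b) = p0).
        a, b = 2.0, 998.0
        p_betabinom = stats.betabinom.sf(k - 1, n, a, b)

        print(f"P(X >= {k}) binomial      = {p_binom:.2e}")
        print(f"P(X >= {k}) beta-binomial = {p_betabinom:.2e}")
        # The beta-binomial tail is much heavier, so the same count is
        # far less "significant" once biological variability is allowed.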

  8. Cross-validation analysis of bias models in Bayesian multi-model projections of climate

    NASA Astrophysics Data System (ADS)

    Huttunen, J. M. J.; Räisänen, J.; Nissinen, A.; Lipponen, A.; Kolehmainen, V.

    2016-05-01

    Climate change projections are commonly based on multi-model ensembles of climate simulations. In this paper we consider the choice of bias models in Bayesian multi-model predictions. Buser et al. (Clim Res 44(2-3):227-241, 2010a) introduced a hybrid bias model which combines the commonly used constant bias and constant relation bias assumptions. The hybrid model includes a weighting parameter which balances these two bias models. In this study, we use a cross-validation approach to determine which bias model or bias parameter leads to, in a specific sense, optimal climate change projections. The analysis is carried out for summer and winter season means of 2 m temperatures spatially averaged over the IPCC SREX regions, using 19 model runs from the CMIP5 data set. The cross-validation approach is applied to calculate optimal bias parameters (in the specific sense) for projecting the temperature change from the control period (1961-2005) to the scenario period (2046-2090). The results are compared to those of the Buser et al. method, which includes the bias parameter as one of the unknown parameters to be estimated from the data.

  9. iBAG: integrative Bayesian analysis of high-dimensional multiplatform genomics data

    PubMed Central

    Wang, Wenting; Baladandayuthapani, Veerabhadran; Morris, Jeffrey S.; Broom, Bradley M.; Manyam, Ganiraju; Do, Kim-Anh

    2013-01-01

    Motivation: Analyzing data from multi-platform genomics experiments combined with patients’ clinical outcomes helps us understand the complex biological processes that characterize a disease, as well as how these processes relate to the development of the disease. Current data integration approaches are limited in that they do not consider the fundamental biological relationships that exist among the data obtained from different platforms. Statistical Model: We propose an integrative Bayesian analysis of genomics data (iBAG) framework for identifying important genes/biomarkers that are associated with clinical outcome. This framework uses hierarchical modeling to combine the data obtained from multiple platforms into one model. Results: We assess the performance of our methods using several synthetic and real examples. Simulations show our integrative methods to have higher power to detect disease-related genes than non-integrative methods. Using the Cancer Genome Atlas glioblastoma dataset, we apply the iBAG model to integrate gene expression and methylation data to study their associations with patient survival. Our proposed method discovers multiple methylation-regulated genes that are related to patient survival, most of which have important biological functions in other diseases but have not been previously studied in glioblastoma. Availability: http://odin.mdacc.tmc.edu/∼vbaladan/. Contact: veera@mdanderson.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23142963

  10. Composite behavior analysis for video surveillance using hierarchical dynamic Bayesian networks

    NASA Astrophysics Data System (ADS)

    Cheng, Huanhuan; Shan, Yong; Wang, Runsheng

    2011-03-01

    Analyzing composite behaviors involving objects from multiple categories in surveillance videos is a challenging task due to the complicated relationships among humans and objects. This paper presents a novel behavior analysis framework using a hierarchical dynamic Bayesian network (DBN) for video surveillance systems. The model is built to extract objects' behaviors and their relationships by representing behaviors through spatial-temporal characteristics. The recognition of object behaviors is processed by the DBN at multiple levels: features of objects at the low level, objects and their relationships at the middle level, and events at the high level, where an event refers to behaviors of a single type of object as well as behaviors involving several types of objects, such as "a person getting in a car." Furthermore, to reduce complexity, a simple model selection criterion is introduced, by which the appropriate model is picked out from a pool of candidate models. Experiments demonstrate that the proposed framework can efficiently recognize and semantically describe composite object and human activities in surveillance videos.

  11. Data analysis using scale-space filtering and Bayesian probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Kutulakos, Kiriakos; Robinson, Peter

    1991-01-01

    This paper describes a program for the analysis of output curves from a differential thermal analyzer (DTA). The program first extracts probabilistic qualitative features from the DTA curve of a soil sample, and then uses Bayesian probabilistic reasoning to infer the minerals in the soil. The qualifier module employs a simple and efficient extension of scale-space filtering suitable for handling DTA data. We have observed that points can vanish from contours in the scale-space image when filtering operations are not highly accurate. To handle the problem of vanishing points, perceptual organization heuristics are used to group the points into lines. Next, these lines are grouped into contours by using additional heuristics. Probabilities are associated with these contours using domain-specific correlations. A Bayes tree classifier processes the probabilistic features to infer the presence of different minerals in the soil. Experiments show that the algorithm that uses domain-specific correlations to infer qualitative features outperforms a domain-independent algorithm that does not.

  12. Bayesian probability analysis: a prospective demonstration of its clinical utility in diagnosing coronary disease

    SciTech Connect

    Detrano, R.; Yiannikas, J.; Salcedo, E.E.; Rincon, G.; Go, R.T.; Williams, G.; Leatherman, J.

    1984-03-01

    One hundred fifty-four patients referred for coronary arteriography were prospectively studied with stress electrocardiography, stress thallium scintigraphy, cine fluoroscopy (for coronary calcifications), and coronary angiography. Pretest probabilities of coronary disease were determined based on age, sex, and type of chest pain. These and pooled literature values for the conditional probabilities of test results based on disease state were used in Bayes theorem to calculate posttest probabilities of disease. The results of the three noninvasive tests were compared for statistical independence, a necessary condition for their simultaneous use in Bayes theorem. The test results were found to demonstrate pairwise independence in patients with and those without disease. Some dependencies that were observed between the test results and the clinical variables of age and sex were not sufficient to invalidate application of the theorem. Sixty-eight of the study patients had at least one major coronary artery obstruction of greater than 50%. When these patients were divided into low-, intermediate-, and high-probability subgroups according to their pretest probabilities, noninvasive test results analyzed by Bayesian probability analysis appropriately advanced 17 of them by at least one probability subgroup while only seven were moved backward. Of the 76 patients without disease, 34 were appropriately moved into a lower probability subgroup while 10 were incorrectly moved up. We conclude that posttest probabilities calculated from Bayes theorem more accurately classified patients with and without disease than did pretest probabilities, thus demonstrating the utility of the theorem in this application.
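
    The core computation is the familiar Bayes-theorem update from pretest to posttest probability. A minimal sketch, with purely illustrative sensitivities and specificities (not the pooled literature values used in the study):

```python
# Minimal sketch of the Bayes-theorem update used in the study: combine a
# pretest (prior) disease probability with test sensitivity and specificity
# to obtain a posttest probability. All numeric values are illustrative.
def posttest_probability(pretest, sensitivity, specificity, positive_result=True):
    """Posterior P(disease | test result) via Bayes theorem."""
    if positive_result:
        p_pos = sensitivity * pretest + (1 - specificity) * (1 - pretest)
        return sensitivity * pretest / p_pos
    p_neg = (1 - sensitivity) * pretest + specificity * (1 - pretest)
    return (1 - sensitivity) * pretest / p_neg

# Sequential use for several (assumed conditionally independent) tests,
# e.g. stress ECG followed by thallium scintigraphy:
p = 0.30                                   # pretest probability from age/sex/pain type
p = posttest_probability(p, 0.70, 0.80)    # positive stress ECG (illustrative values)
p = posttest_probability(p, 0.85, 0.90)    # positive thallium scan (illustrative values)
print(f"posttest probability: {p:.2f}")
```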

  13. Adverse and Advantageous Selection in the Medicare Supplemental Market: A Bayesian Analysis of Prescription drug Expenditure.

    PubMed

    Li, Qian; Trivedi, Pravin K

    2016-02-01

    This paper develops an extended specification of the two-part model, which controls for unobservable self-selection and heterogeneity of health insurance, and analyzes the impact of Medicare supplemental plans on the prescription drug expenditure of the elderly, using a linked data set based on the Medicare Current Beneficiary Survey data for 2003-2004. The econometric analysis is conducted within a Bayesian framework. We estimate treatment effects for different counterfactuals and find significant evidence of endogeneity in plan choice and the presence of both adverse and advantageous selection in the supplemental insurance market. The average incentive effect is estimated to be $757 (2004 value), a 41% increase per person per year, for the elderly enrolled in supplemental plans with drug coverage relative to the Medicare fee-for-service counterfactual, and $350, a 21% increase, relative to the counterfactual of supplemental plans without drug coverage. The incentive effect varies by source of drug coverage: it is highest for employer-sponsored insurance plans, followed by Medigap and managed Medicare plans. PMID:25504934

  14. Bayesian time series analysis of segments of the Rocky Mountain trumpeter swan population

    USGS Publications Warehouse

    Wright, Christopher K.; Sojda, Richard S.; Goodman, Daniel

    2002-01-01

    A Bayesian time series analysis technique, the dynamic linear model, was used to analyze counts of Trumpeter Swans (Cygnus buccinator) summering in Idaho, Montana, and Wyoming from 1931 to 2000. For the Yellowstone National Park segment of white birds (sub-adults and adults combined), the estimated probability of a positive growth rate is 0.01. The estimated probability of achieving the Subcommittee on Rocky Mountain Trumpeter Swans 2002 population goal of 40 white birds for the Yellowstone segment is less than 0.01. Outside of Yellowstone National Park, Wyoming white birds are estimated to have a 0.79 probability of a positive growth rate, with a 0.05 probability of achieving the 2002 objective of 120 white birds. In the Centennial Valley in southwest Montana, results indicate a probability of 0.87 that the white bird population is growing at a positive rate, albeit with considerable uncertainty. The estimated probability of achieving the 2002 Centennial Valley objective of 160 white birds is 0.14, but under an alternative model it falls to 0.04. The estimated probability that the Targhee National Forest segment of white birds has a positive growth rate is 0.03. In Idaho outside of the Targhee National Forest, white birds are estimated to have a 0.97 probability of a positive growth rate, with a 0.18 probability of attaining the 2002 goal of 150 white birds.
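
    As a much simpler stand-in for the dynamic linear model used here, the sketch below fits a linear trend to log counts under a flat prior, in which case the slope's posterior is a scaled Student-t and the probability of a positive growth rate follows directly; the counts are synthetic, not the swan data.

```python
# Greatly simplified stand-in for a dynamic linear model: fit a linear trend
# to log counts under a flat prior, where the slope's posterior is a scaled
# Student-t, and report P(growth rate > 0). The data are synthetic.
import numpy as np
from scipy import stats

years = np.arange(1990, 2001)
counts = np.array([52, 55, 49, 60, 63, 58, 66, 70, 68, 75, 77])  # synthetic counts

X = np.column_stack([np.ones_like(years, dtype=float), years - years.mean()])
y = np.log(counts)
beta, res, _, _ = np.linalg.lstsq(X, y, rcond=None)
n, k = X.shape
sigma2 = res[0] / (n - k)                       # residual variance estimate
cov = sigma2 * np.linalg.inv(X.T @ X)           # classical slope covariance
slope, slope_se = beta[1], np.sqrt(cov[1, 1])

# Under a noninformative prior, the slope posterior is t_{n-k}(slope, slope_se).
p_positive_growth = stats.t.sf(0.0, df=n - k, loc=slope, scale=slope_se)
print(f"P(positive growth rate) = {p_positive_growth:.3f}")
```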

  15. A Bayesian Approach to the Design and Analysis of Computer Experiments

    SciTech Connect

    Currin, C.

    1988-01-01

    We consider the problem of designing and analyzing experiments for prediction of the function y(t), t ∈ T, where y is evaluated by means of a computer code (typically by solving complicated equations that model a physical system), and T represents the domain of inputs to the code. We use a Bayesian approach, in which uncertainty about y is represented by a spatial stochastic process (random function); here we restrict attention to stationary Gaussian processes. The posterior mean function can be used as an interpolating function, with uncertainties given by the posterior standard deviations. Instead of completely specifying the prior process, we consider several families of priors, and suggest some cross-validational methods for choosing one that performs relatively well on the function at hand. As a design criterion, we use the expected reduction in the entropy of the random vector y(T*), where T* ⊂ T is a given finite set of "sites" (input configurations) at which predictions are to be made. We describe an exchange algorithm for constructing designs that are optimal with respect to this criterion. To demonstrate the use of these design and analysis methods, several examples are given, including one experiment on a computer model of a thermal energy storage device and another on an integrated circuit simulator.
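
    A compact sketch of the prediction step described above: a stationary Gaussian-process prior with a squared-exponential covariance (an assumed kernel choice, with illustrative hyperparameters), the posterior mean serving as the interpolator and the posterior standard deviation as its uncertainty.

```python
# Sketch of Gaussian-process prediction: posterior mean as an interpolator
# with posterior standard deviations as uncertainties. Kernel choice and
# hyperparameters are illustrative assumptions; the "code output" is a toy.
import numpy as np

def sq_exp_kernel(a, b, length=0.5, var=1.0):
    """Squared-exponential covariance between input arrays a and b."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

t_train = np.array([0.0, 0.3, 0.5, 0.8, 1.0])
y_train = np.sin(2 * np.pi * t_train)            # stand-in for computer-code output
t_new = np.linspace(0.0, 1.0, 9)

K = sq_exp_kernel(t_train, t_train) + 1e-8 * np.eye(len(t_train))  # jitter
K_star = sq_exp_kernel(t_new, t_train)
K_ss = sq_exp_kernel(t_new, t_new)

alpha = np.linalg.solve(K, y_train)
post_mean = K_star @ alpha                        # interpolating function
post_cov = K_ss - K_star @ np.linalg.solve(K, K_star.T)
post_sd = np.sqrt(np.clip(np.diag(post_cov), 0.0, None))

for t, m, s in zip(t_new, post_mean, post_sd):
    print(f"t={t:.2f}  mean={m:+.3f}  sd={s:.3f}")
```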

  16. Bayesian analysis of experimental epidemics of foot-and-mouth disease.

    PubMed

    Streftaris, George; Gibson, Gavin J

    2004-06-01

    We investigate the transmission dynamics of a certain type of foot-and-mouth disease (FMD) virus under experimental conditions. Previous analyses of experimental data from FMD outbreaks in non-homogeneously mixing populations of sheep have suggested a decline in viraemic level through serial passage of the virus, but these do not take into account possible variation in the length of the chain of viral transmission for each animal, which is implicit in the non-observed transmission process. We consider a susceptible-exposed-infectious-removed non-Markovian compartmental model for partially observed epidemic processes, and we employ powerful methodology (Markov chain Monte Carlo) for statistical inference, to address epidemiological issues under a Bayesian framework that accounts for all available information and associated uncertainty in a coherent approach. The analysis allows us to investigate the posterior distribution of the hidden transmission history of the epidemic, and thus to determine the effect of the length of the infection chain on the recorded viraemic levels, based on the posterior distribution of a p-value. Parameter estimates of the epidemiological characteristics of the disease are also obtained. The results reveal a possible decline in viraemia in one of the two experimental outbreaks. Our model also suggests that individual infectivity is related to the level of viraemia.

  17. Assessment of occupational safety risks in Floridian solid waste systems using Bayesian analysis.

    PubMed

    Bastani, Mehrad; Celik, Nurcin

    2015-10-01

    Safety risks embedded within solid waste management systems continue to be a significant issue and are prevalent at every step in the solid waste management process. To recognise and address these occupational hazards, it is necessary to discover the potential safety concerns that cause them, as well as their direct and/or indirect impacts on the different types of solid waste workers. In this research, our goal is to statistically assess occupational safety risks to solid waste workers in the state of Florida. Here, we first review the related standard industrial codes for the major solid waste management methods, including recycling, incineration, landfilling, and composting. Then, a quantitative assessment of the major risks is conducted on the collected data using Bayesian data analysis and predictive methods. The risks estimated in this study for the period 2005-2012 are then compared with historical statistics (1993-1997) from previous assessment studies. The results show that the rates of both musculoskeletal and dermal injuries among refuse collectors have decreased, from 88 and 15 to 16 and 3 injuries per 1000 workers, respectively. However, a contrasting trend is observed for recycling workers, for whom musculoskeletal and dermal injury rates have increased, from 13 and 4 to 14 and 6 injuries per 1000 workers, respectively. Lastly, a linear regression model is proposed to identify the major factors behind the high numbers of musculoskeletal and dermal injuries.
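
    One simple way to carry out a Bayesian rate assessment of this kind is a conjugate Gamma-Poisson update for injuries per 1000 worker-years. The sketch below is a hedged illustration with invented counts, not the study's data or its exact model.

```python
# Hedged sketch of a Bayesian injury-rate assessment: conjugate Gamma-Poisson
# update for injuries per 1000 worker-years. The prior is loosely anchored to
# historical figures; all numbers are illustrative, not the study's data.
from scipy import stats

# Gamma(a, b) prior on the rate per 1000 worker-years (prior mean a/b).
a_prior, b_prior = 16.0, 1.0     # e.g. centred near 16 injuries / 1000 workers

# New observation period: injuries counted over an exposure measured in
# units of 1000 worker-years.
injuries, exposure = 45, 2.8     # hypothetical count and exposure

a_post = a_prior + injuries
b_post = b_prior + exposure
posterior = stats.gamma(a_post, scale=1.0 / b_post)

print(f"posterior mean rate: {posterior.mean():.1f} per 1000 worker-years")
lo, hi = posterior.ppf([0.025, 0.975])
print(f"95% credible interval: ({lo:.1f}, {hi:.1f})")
```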

  18. Adverse and Advantageous Selection in the Medicare Supplemental Market: A Bayesian Analysis of Prescription drug Expenditure.

    PubMed

    Li, Qian; Trivedi, Pravin K

    2016-02-01

    This paper develops an extended specification of the two-part model, which controls for unobservable self-selection and heterogeneity of health insurance, and analyzes the impact of Medicare supplemental plans on the prescription drug expenditure of the elderly, using a linked data set based on the Medicare Current Beneficiary Survey data for 2003-2004. The econometric analysis is conducted using a Bayesian econometric framework. We estimate the treatment effects for different counterfactuals and find significant evidence of endogeneity in plan choice and the presence of both adverse and advantageous selections in the supplemental insurance market. The average incentive effect is estimated to be $757 (2004 value) or 41% increase per person per year for the elderly enrolled in supplemental plans with drug coverage against the Medicare fee-for-service counterfactual and is $350 or 21% against the supplemental plans without drug coverage counterfactual. The incentive effect varies by different sources of drug coverage: highest for employer-sponsored insurance plans, followed by Medigap and managed medicare plans.

  19. Integration of Bayesian analysis for eutrophication prediction and assessment in a landscape lake.

    PubMed

    Yang, Likun; Zhao, Xinhua; Peng, Sen; Zhou, Guangyu

    2015-01-01

    Eutrophication models have been widely used to assess water quality in landscape lakes. Because the flow rate in landscape lakes is relatively low, similar to that of natural lakes, eutrophication is more dominant in landscape lakes. To assess the risk of eutrophication in landscape lakes, a set of dynamic equations was developed to simulate lake water quality for total nitrogen (TN), total phosphorus (TP), dissolved oxygen (DO) and chlorophyll a (Chl a). First, the Bayesian calibration results are described, and the ability of the model to adequately reproduce the observed mean patterns and major cause-effect relationships for water quality conditions in landscape lakes is demonstrated. Two loading scenarios were used. A Monte Carlo algorithm was applied to calculate the predicted water quality distributions, which were used in the established hierarchical assessment system for lake water quality risk. The important factors affecting the lake water quality risk were identified using linear regression analysis. The results indicated that variations in the quality of the recharge water received by the landscape lake caused considerable water quality risk in the surrounding area. Moreover, the Chl a concentration in lake water was significantly affected by the TP and TN concentrations; the lake TP concentration was the limiting factor for the growth of plankton, while the TN concentration provided the basic nutritional requirements. Lastly, lower TN and TP concentrations in the receiving recharge water caused increased lake water quality risk.

  20. Iteration of Partially-Specified Target Matrices: Applications in Exploratory and Bayesian Confirmatory Factor Analysis

    PubMed Central

    Moore, Tyler M.; Reise, Steven P.; Depaoli, Sarah; Haviland, Mark G.

    2015-01-01

    We describe and evaluate a factor rotation algorithm, iterated target rotation (ITR). Whereas target rotation (Browne, 2001) requires a user to specify a target matrix a priori based on theory or prior research, ITR begins with a standard analytic factor rotation (i.e., an empirically-informed target) followed by an iterative search procedure to update the target matrix. Monte Carlo simulations were conducted to evaluate the performance of ITR relative to analytic rotations from the Crawford-Ferguson family with population factor structures varying in complexity. Simulation results: (a) suggested that ITR analyses will be particularly useful when evaluating data with complex structures (i.e., multiple cross-loadings) and (b) showed that the rotation method used to define an initial target matrix did not materially affect the accuracy of the various ITRs. In Study 2, we: (a) demonstrated the application of ITR as a way to determine empirically-informed priors in a Bayesian confirmatory factor analysis (BCFA; Muthén & Asparouhov, 2012) of a rater-report alexithymia measure (Haviland, Warren, & Riggs, 2000) and (b) highlighted some of the challenges when specifying empirically-based priors and assessing item and overall model fit. PMID:26609875

  1. A Bayesian analysis of redshifted 21-cm H I signal and foregrounds: simulations for LOFAR

    NASA Astrophysics Data System (ADS)

    Ghosh, Abhik; Koopmans, Léon V. E.; Chapman, E.; Jelić, V.

    2015-09-01

    Observations of the epoch of reionization (EoR) using the 21-cm hyperfine emission of neutral hydrogen (H I) promise to open an entirely new window on the formation of the first stars, galaxies and accreting black holes. In order to characterize the weak 21-cm signal, we need to develop imaging techniques that can reconstruct the extended emission very precisely. Here, we present an inversion technique for LOw Frequency ARray (LOFAR) baselines at the North Celestial Pole (NCP), based on a Bayesian formalism with optimal spatial regularization, which is used to reconstruct the diffuse foreground map directly from the simulated visibility data. We notice that the spatial regularization de-noises the images to a large extent, allowing one to recover the 21-cm power spectrum over a considerable k⊥-k∥ space in the range 0.03 Mpc-1 < k⊥ < 0.19 Mpc-1 and 0.14 Mpc-1 < k∥ < 0.35 Mpc-1 without subtracting the noise power spectrum. We find that, in combination with using generalized morphological component analysis (GMCA), a non-parametric foreground removal technique, we can mostly recover the spherical average power spectrum within 2σ statistical fluctuations for an input Gaussian random root-mean-square noise level of 60 mK in the maps after 600 h of integration over a 10-MHz bandwidth.

  2. Guideline for bolted joint design and analysis : version 1.0.

    SciTech Connect

    Brown, Kevin H.; Morrow, Charles W.; Durbin, Samuel; Baca, Allen

    2008-01-01

    This document provides general guidance for the design and analysis of bolted joint connections. An overview of the current methods used to analyze bolted joint connections is given. Several methods for the design and analysis of bolted joint connections are presented. Guidance is provided for general bolted joint design, computation of preload uncertainty and preload loss, and the calculation of the bolted joint factor of safety. Axial loads, shear loads, thermal loads, and thread tear out are used in factor of safety calculations. Additionally, limited guidance is provided for fatigue considerations. An overview of an associated Mathcad© worksheet containing all bolted joint design formulae presented is also provided.
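
    The flavour of the calculations covered by such a guideline can be illustrated with textbook-style relations: preload estimated from tightening torque via a nut factor, and factors of safety against bolt yield and joint separation. The formulas below follow common machine-design conventions and the numbers are illustrative; this is a sketch, not the guideline's Mathcad worksheet.

```python
# Sketch of textbook-style bolted-joint calculations: preload from tightening
# torque via the nut factor, and factors of safety against bolt yield and
# joint separation. Conventions follow common machine-design texts; all
# numbers are illustrative assumptions.
T = 50.0          # tightening torque, N*m
K = 0.2           # nut (torque) factor, dimensionless (typical assumption)
d = 0.010         # nominal bolt diameter, m
Sp = 600e6        # proof strength, Pa
At = 58.0e-6      # tensile stress area, m^2 (M10 coarse, approx.)
C = 0.25          # joint stiffness constant (bolt's share of external load)
P = 8000.0        # external axial load per bolt, N

F_i = T / (K * d)                      # preload from torque, N
n_yield = (Sp * At - F_i) / (C * P)    # load factor against bolt proof load
n_sep = F_i / (P * (1.0 - C))          # factor of safety against separation

print(f"preload           : {F_i / 1000:.1f} kN")
print(f"FoS (bolt yield)  : {n_yield:.2f}")
print(f"FoS (separation)  : {n_sep:.2f}")
```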

  3. A unified method for inference of tokamak equilibria and validation of force-balance models based on Bayesian analysis

    NASA Astrophysics Data System (ADS)

    von Nessi, G. T.; Hole, M. J.; the MAST Team

    2013-05-01

    A new method, based on Bayesian analysis, is presented which unifies the inference of plasma equilibria parameters in a tokamak with the ability to quantify differences between inferred equilibria and Grad-Shafranov (GS) force-balance solutions. At the heart of this technique is the new concept of weak observation, which allows multiple forward models to be associated with a single diagnostic observation. This new idea subsequently provides a means by which the space of GS solutions can be efficiently characterized via a prior distribution. The posterior evidence (a normalization constant of the inferred posterior distribution) is also inferred in the analysis and is used as a proxy for determining how relatively close inferred equilibria are to force-balance for different discharges/times. These points have been implemented in a code called BEAST (Bayesian equilibrium analysis and simulation tool), which uses a special implementation of Skilling’s nested sampling algorithm (Skilling 2006 Bayesian Anal. 1 833-59) to perform sampling and evidence calculations on high-dimensional, non-Gaussian posteriors. Initial BEAST equilibrium inference results are presented for two high-performance MAST discharges.

  4. Analysis Method for Inelastic, Adhesively Bonded Joints with Anisotropic Adherends

    NASA Technical Reports Server (NTRS)

    Smeltzer, Stanley S., III; Klang, Eric C.

    2003-01-01

    A one-dimensional analysis method for evaluating adhesively bonded joints composed of anisotropic adherends and adhesives with nonlinear material behavior is presented. The strain and resulting stress field in a general bonded joint overlap are determined by using a variable-step, finite-difference solution algorithm to iteratively solve a system of first-order differential equations. Applied loading is given by a system of combined extensional, bending, and shear forces applied to the edge of the joint overlap. Adherends are assumed to behave as linear, cylindrically bent plates using classical laminated plate theory that includes the effects of first-order transverse shear deformation. Using the deformation theory of plasticity and a modified von Mises yield criterion, inelastic material behavior is modeled in the adhesive layer. Results for the proposed method are verified against previous results from the literature and shown to be in excellent agreement. An additional case that highlights the effects of transverse shear deformation between similar adherends is also presented.

  5. Bayesian spatiotemporal analysis of zero-inflated biological population density data by a delta-normal spatiotemporal additive model.

    PubMed

    Arcuti, Simona; Pollice, Alessio; Ribecco, Nunziata; D'Onghia, Gianfranco

    2016-03-01

    We evaluate the spatiotemporal changes in the density of a particular species of crustacean known as the deep-water rose shrimp, Parapenaeus longirostris, based on biological sample data collected during trawl surveys carried out from 1995 to 2006 as part of the international project MEDITS (MEDiterranean International Trawl Surveys). As is the case for many biological variables, density data are continuous and characterized by an unusually large proportion of zeros, accompanied by a skewed distribution of the remaining values. Here we analyze the normalized density data with a Bayesian delta-normal semiparametric additive model including the effects of covariates, using penalized regression with low-rank thin-plate splines for nonlinear spatial and temporal effects. Modeling the zero and nonzero values as two joint processes, as we propose in this work, provides great flexibility and easy handling of complex likelihood functions, avoiding the inaccurate statistical inferences that result from misclassification of the high proportion of exact zeros in the model. Bayesian model estimation is obtained by Markov chain Monte Carlo simulations, suitably specifying the complex likelihood function of the zero-inflated density data. The study highlights relevant nonlinear spatial and temporal effects and the influence of the annual Mediterranean oscillations index and of the sea surface temperature on the distribution of deep-water rose shrimp density. PMID:26418888
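
    A minimal two-part ("delta") model sketch in the spirit of the abstract: a Bernoulli part for presence/absence and a lognormal part for the positive densities, fit by maximum likelihood. The Bayesian spatiotemporal spline structure of the paper is not reproduced, and the data are synthetic.

```python
# Minimal two-part ("delta") model sketch: Bernoulli part for presence/absence,
# lognormal part for the positive densities, both fit by maximum likelihood.
# Synthetic data; the paper's spatiotemporal spline structure is omitted.
import numpy as np

rng = np.random.default_rng(1)
# Synthetic zero-inflated density data: ~40% exact zeros, lognormal otherwise.
occupied = rng.random(500) < 0.6
density = np.where(occupied, rng.lognormal(mean=1.0, sigma=0.7, size=500), 0.0)

# Part 1: probability of a nonzero observation (Bernoulli MLE).
p_nonzero = np.mean(density > 0)

# Part 2: lognormal parameters from the positive values only (MLE).
pos = np.log(density[density > 0])
mu, sigma = pos.mean(), pos.std(ddof=1)

# The unconditional mean density combines both parts:
mean_density = p_nonzero * np.exp(mu + 0.5 * sigma**2)
print(f"P(presence) = {p_nonzero:.2f}, mean density = {mean_density:.2f}")
```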

  7. Personality and coping traits: A joint factor analysis.

    PubMed

    Ferguson, Eamonn

    2001-11-01

    OBJECTIVES: The main objective of this paper is to explore the structural similarities between Eysenck's model of personality and the dimensions of the dispositional COPE. Costa et al. {Costa, P., Somerfield, M., & McCrae, R. (1996). Personality and coping: A reconceptualisation. In Handbook of coping: Theory, research and applications (pp. 44-61). New York: Wiley} suggest that personality and coping behaviour are part of a continuum based on adaptation. If this is the case, there should be structural similarities between measures of personality and coping behaviour. This is tested using a joint factor analysis of personality and coping measures. DESIGN: Cross-sectional survey. METHODS: The EPQ-R and the dispositional COPE were administered to 154 participants, and the data were analysed using joint factor analysis and bivariate associations. RESULTS: The joint factor analysis indicated that these data were best explained by a four-factor model. One factor was primarily unrelated to personality. There was a COPE-neurotic-introvert factor (NI-COPE) containing coping behaviours such as denial, a COPE-extroversion factor (E-COPE) containing behaviours such as seeking social support, and a COPE-psychoticism factor (P-COPE) containing behaviours such as alcohol use. This factor pattern, especially for NI- and E-COPE, was interpreted in terms of Gray's model of personality {Gray, J. A. (1987). The psychology of fear and stress. Cambridge: Cambridge University Press}. NI-, E-, and P-COPE were shown to be related, in a theoretically consistent manner, to perceived coping success and perceived coping functions. CONCLUSIONS: The results indicate that there are indeed conceptual links between models of personality and coping. It is argued that future research should focus on identifying coping 'trait complexes'. Implications for practice are discussed. PMID:12614507

  8. A Bayesian Approach for Evaluation of Determinants of Health System Efficiency Using Stochastic Frontier Analysis and Beta Regression

    PubMed Central

    2016-01-01

    Public expenditures on health are one of the most important issues for governments, and these increasing expenditures are putting pressure on public budgets. Health policy makers have therefore focused on the performance of their health systems, and many countries have introduced reforms to improve it. This study investigates the most important determinants of healthcare efficiency for OECD countries using a two-stage Bayesian approach. First, we measure the healthcare efficiency of 29 OECD countries by Bayesian stochastic frontier analysis (BSFA), using data from the OECD Health Database. Second, we examine the relationships between healthcare efficiency and the characteristics of healthcare systems across OECD countries using Bayesian beta regression. PMID:27118987

  9. A Bayesian Meta-Analysis on Prevalence of Hepatitis B Virus Infection among Chinese Volunteer Blood Donors

    PubMed Central

    Liu, Guang-ying; Zheng, Yang; Deng, Yan; Gao, Yan-yan; Wang, Lie

    2013-01-01

    Background Although transfusion-transmitted infection with hepatitis B virus (HBV) threatens the blood safety of China, the nationwide picture of HBV infection among blood donors is still unclear. Objectives To comprehensively estimate the prevalence of HBsAg positivity and occult HBV infection (OBI) among Chinese volunteer blood donors through Bayesian meta-analysis. Methods We performed an electronic search in PubMed, Web of Knowledge, Medline, Wanfang Data and CNKI, complemented by a hand search of relevant reference lists. Two authors independently extracted data from the eligible studies. Two Bayesian random-effects meta-analyses were then performed, followed by Bayesian meta-regressions. Results 5,957,412 and 571,227 donors were identified in the HBsAg group and the OBI group, respectively. The pooled prevalence in the HBsAg group and the OBI group is 1.085% (95% credible interval [CI] 0.859%-1.398%) and 0.094% (95% CI 0.0578%-0.1655%), respectively. For the HBsAg group, subgroup analysis shows that more developed areas have a lower prevalence than less developed areas; meta-regression indicates a significant decreasing trend in HBsAg-positive prevalence with sampling year (beta = -0.1202, 95% CI -0.2081 to -0.0312). Conclusion Blood safety against HBV infection in China faces serious threats, and the government should take effective measures to improve this situation. PMID:24236110
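
    For intuition, a compact frequentist analogue of the random-effects pooling (DerSimonian-Laird on the logit scale) is sketched below. The study counts are invented, and the paper's actual Bayesian hierarchical model is richer than this.

```python
# Frequentist analogue (DerSimonian-Laird on the logit scale) of Bayesian
# random-effects pooling of prevalence estimates. Study counts are synthetic.
import numpy as np

cases = np.array([120, 95, 210, 60, 330])        # HBsAg-positive donors (synthetic)
n = np.array([10000, 9000, 18000, 7000, 30000])  # donors screened (synthetic)

p = cases / n
logit = np.log(p / (1 - p))
var = 1.0 / cases + 1.0 / (n - cases)            # approx. variance of the logit
w = 1.0 / var

# DerSimonian-Laird between-study variance tau^2
fixed = np.sum(w * logit) / np.sum(w)
Q = np.sum(w * (logit - fixed) ** 2)
tau2 = max(0.0, (Q - (len(p) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_star = 1.0 / (var + tau2)
pooled_logit = np.sum(w_star * logit) / np.sum(w_star)
pooled = 1.0 / (1.0 + np.exp(-pooled_logit))
print(f"pooled prevalence = {100 * pooled:.3f}%")
```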

  10. Bayesian Statistical Analysis of Historical and Late Holocene Rates of Sea-Level Change

    NASA Astrophysics Data System (ADS)

    Cahill, Niamh; Parnell, Andrew; Kemp, Andrew; Horton, Benjamin

    2014-05-01

    A fundamental concern associated with climate change is the rate at which sea levels are rising. Studies of past sea level (particularly beyond the instrumental data range) allow modern sea-level rise to be placed in a more complete context. Considering this, we perform a Bayesian statistical analysis on historical and late Holocene rates of sea-level change. The data that form the input to the statistical model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. The aims are to estimate rates of sea-level rise, to determine when modern rates of sea-level rise began, and to observe how these rates have been changing over time. Many current methods use simple linear regression to estimate rates. This is often inappropriate, as it is too rigid and can ignore uncertainties that arise as part of the data collection exercise, leading to overconfidence in the sea-level trends being characterized. The proposed Bayesian model places a Gaussian process prior on the rate process (i.e. the process that determines how rates of sea-level change evolve over time); the likelihood of the observed data is the integral of this process. When dealing with proxy reconstructions, this is set in an errors-in-variables framework so as to take account of age uncertainty. It is also necessary, in this case, for the model to account for glacio-isostatic adjustment, which introduces a covariance between individual age and sea-level observations. This method provides a flexible fit and allows direct estimation of the rate process with full consideration of all sources of uncertainty. Analysing tide-gauge datasets and proxy reconstructions in this way means that changing rates of sea level can be estimated more comprehensively and accurately than previously possible. The model captures the continuous and dynamic evolution of sea-level change, and the results show not only that modern sea levels are rising but that the rates of rise are increasing.

  11. Finite Element Analysis of the Maximum Stress at the Joints of the Transmission Tower

    NASA Astrophysics Data System (ADS)

    Itam, Zarina; Beddu, Salmia; Liyana Mohd Kamal, Nur; Bamashmos, Khaled H.

    2016-03-01

    Transmission towers are tall structures, usually steel lattice towers, used to support overhead power lines. They are typically analyzed as frame-truss systems, with members assumed to be pin-connected and without explicit consideration of the effects of the joints on tower behavior. In this research, an engineering example of a joint is analyzed in detail to investigate how joint detailing affects the tower analysis. A static analysis using STAAD Pro was conducted to identify the joint with the maximum stress. This joint was then explicitly analyzed in ANSYS using the finite element method. Three approaches were used in the software: a simple plate model, bonded contact with no bolts, and beam-element bolts. Results from the joint analysis show that stress values increase when joint details are considered. This demonstrates that joints and connections play an important role in the distribution of stress within the transmission tower.

  12. Health at the borders: Bayesian multilevel analysis of women's malnutrition determinants in Ethiopia

    PubMed Central

    Delbiso, Tefera Darge; Rodriguez-Llanes, Jose Manuel; Altare, Chiara; Masquelier, Bruno; Guha-Sapir, Debarati

    2016-01-01

    Background Women's malnutrition, particularly undernutrition, remains an important public health challenge in Ethiopia. Although various studies have examined the levels and determinants of women's nutritional status, the influence of living close to an international border on women's nutrition has not been investigated. Yet Ethiopian borders are regularly affected by conflict and refugee flows, which might ultimately impact health. Objective To investigate the impact of living close to borders on the nutritional status of women in Ethiopia, while considering other important covariates. Design Our analysis was based on the body mass index (BMI) of 6,334 adult women aged 20–49 years, obtained from the 2011 Ethiopian Demographic and Health Survey (EDHS). A Bayesian multilevel multinomial logistic regression analysis was used to capture the clustered structure of the data and the possible correlation that may exist within and between clusters. Results After controlling for potential confounders, women living close to borders (i.e. ≤100 km) in Ethiopia were 59% more likely to be underweight (posterior odds ratio [OR]=1.59; 95% credible interval [CrI]: 1.32–1.90) than their counterparts living far from the borders. This result was robust to different choices of border delineation (i.e. ≤50, ≤75, ≤125, and ≤150 km). Being from a poor family, lacking access to improved toilets, residing in lowland areas, and being Muslim were each independently associated with underweight. In contrast, greater wealth, higher education, older age, access to improved toilets, being married, and living in urban or lowland areas were independently associated with overweight. Conclusions The problem of undernutrition among women in Ethiopia is most worrisome in the border areas. Targeted interventions to improve nutritional status in these areas, such as improved access to sanitation and economic and livelihood support, are recommended. PMID:27388539

  13. Bayesian network modeling: A case study of an epidemiologic system analysis of cardiovascular risk.

    PubMed

    Fuster-Parra, P; Tauler, P; Bennasar-Veny, M; Ligęza, A; López-González, A A; Aguiló, A

    2016-04-01

    An extensive, in-depth study of cardiovascular risk factors (CVRF) seems to be of crucial importance in cardiovascular disease (CVD) research in order to prevent (or reduce) the chance of developing or dying from CVD. The main focus of data analysis is on the use of models able to discover and understand the relationships between different CVRF. This paper reports on the application of Bayesian network (BN) modeling to discover the relationships among thirteen relevant epidemiological features of the heart-age domain in order to analyze cardiovascular lost years (CVLY), cardiovascular risk score (CVRS), and metabolic syndrome (MetS). Furthermore, the induced BN was used to make inferences using three reasoning patterns: causal reasoning, evidential reasoning, and intercausal reasoning. Application of BN tools led to the discovery of several direct and indirect relationships between different CVRF. The BN analysis showed several interesting results, among them: CVLY was strongly influenced by smoking, with men at highest risk; MetS was strongly influenced by physical activity (PA), with men again at highest risk, while smoking showed no influence. BNs produce an intuitive, transparent, graphical representation of the relationships between different CVRF. The ability of BNs to predict new scenarios when hypothetical information is introduced makes BN modeling an Artificial Intelligence (AI) tool of special interest in epidemiological studies. As CVD is multifactorial, the use of BNs seems an adequate modeling tool. PMID:26777431

  14. Reduction of a Whole-Body Physiologically Based Pharmacokinetic Model to Stabilise the Bayesian Analysis of Clinical Data.

    PubMed

    Wendling, Thierry; Tsamandouras, Nikolaos; Dumitras, Swati; Pigeolet, Etienne; Ogungbenro, Kayode; Aarons, Leon

    2016-01-01

    Whole-body physiologically based pharmacokinetic (PBPK) models are increasingly used in drug development for their ability to predict drug concentrations in clinically relevant tissues and to extrapolate across species, experimental conditions and sub-populations. A whole-body PBPK model can be fitted to clinical data using a Bayesian population approach. However, the analysis might be time consuming and numerically unstable if prior information on the model parameters is too vague given the complexity of the system. We suggest an approach where (i) a whole-body PBPK model is formally reduced using a Bayesian proper lumping method to retain the mechanistic interpretation of the system and account for parameter uncertainty, (ii) the simplified model is fitted to clinical data using Markov Chain Monte Carlo techniques and (iii) the optimised reduced PBPK model is used for extrapolation. A previously developed 16-compartment whole-body PBPK model for mavoglurant was reduced to 7 compartments while preserving plasma concentration-time profiles (median and variance) and giving emphasis to the brain (target site) and the liver (elimination site). The reduced model was numerically more stable than the whole-body model for the Bayesian analysis of mavoglurant pharmacokinetic data in healthy adult volunteers. Finally, the reduced yet mechanistic model could easily be scaled from adults to children and predict mavoglurant pharmacokinetics in children aged from 3 to 11 years with similar performance compared with the whole-body model. This study is a first example of the practicality of formal reduction of complex mechanistic models for Bayesian inference in drug development. PMID:26538125

  16. Joint regression analysis and AMMI model applied to oat improvement

    NASA Astrophysics Data System (ADS)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

    In this work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, joint regression analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board on 22 genotypes grown in six locations during the years 2002, 2003 and 2004. Ferreira et al. (2006) state the relevance of regression models and of the additive main effects and multiplicative interactions (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques, we use the zigzag algorithm to estimate the regression coefficients and the agricolae package available in R for the AMMI model analysis.
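
    A bare-bones AMMI computation, for orientation: remove additive genotype and environment main effects, then take an SVD of the interaction residuals to obtain the multiplicative (IPCA) terms. The yield matrix below is synthetic, not the Portuguese trial data, and the R agricolae workflow is not reproduced.

```python
# Bare-bones AMMI sketch: additive genotype/environment main effects from
# two-way means, then an SVD of the interaction residuals for the
# multiplicative terms. The yield matrix is synthetic.
import numpy as np

rng = np.random.default_rng(2)
G, E = 22, 6                                   # genotypes x environments
yield_ge = 4.0 + rng.normal(0, 0.3, (G, E))    # synthetic grain yields (t/ha)

grand = yield_ge.mean()
g_eff = yield_ge.mean(axis=1) - grand          # genotype main effects
e_eff = yield_ge.mean(axis=0) - grand          # environment main effects

# Interaction residuals after removing the additive effects
resid = yield_ge - grand - g_eff[:, None] - e_eff[None, :]

# Multiplicative interaction terms from the SVD; the first two axes ~ AMMI2
U, s, Vt = np.linalg.svd(resid, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(f"IPCA1, IPCA2 share of GxE: {explained[0]:.2f}, {explained[1]:.2f}")

# Genotype scores on IPCA1 (used in biplots; near zero = more stable genotype)
ipca1_scores = U[:, 0] * np.sqrt(s[0])
print("most stable genotype index:", np.argmin(np.abs(ipca1_scores)))
```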

  17. Modeling of joints for the dynamic analysis of truss structures

    NASA Technical Reports Server (NTRS)

    Belvin, W. Keith

    1987-01-01

    An experimentally-based method for determining the stiffness and damping of truss joints is described. The analytical models use springs and both viscous and friction dampers to simulate joint load-deflection behavior. A least-squares algorithm is developed to identify the stiffness and damping coefficients of the analytical joint models from test data. The effects of nonlinear joint stiffness such as joint dead band are also studied. Equations for predicting the sensitivity of beam deformations to changes in joint stiffness are derived and used to show the level of joint stiffness required for nearly rigid joint behavior. Finally, the global frequency sensitivity of a truss structure to random perturbations in joint stiffness is discussed.
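
    The least-squares identification step can be sketched for the simplest joint model, a linear spring with viscous damping, F = kx + cẋ. The friction damping and dead-band nonlinearities treated in the report are omitted here, and the time histories are synthetic.

```python
# Hedged sketch of least-squares joint identification: given displacement,
# velocity and force time histories, solve F = k*x + c*v for stiffness k and
# viscous damping c. Friction and dead-band effects are omitted; synthetic data.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 500)
x = 1e-3 * np.sin(2 * np.pi * 5 * t)            # displacement, m
v = np.gradient(x, t)                           # velocity, m/s

k_true, c_true = 2.0e6, 150.0                   # "unknown" stiffness and damping
force = k_true * x + c_true * v + rng.normal(0, 5.0, t.size)  # measured load, N

# Model F = k x + c v  ->  linear least squares for [k, c]
A = np.column_stack([x, v])
(k_hat, c_hat), *_ = np.linalg.lstsq(A, force, rcond=None)
print(f"identified stiffness k = {k_hat:.3e} N/m, damping c = {c_hat:.1f} N*s/m")
```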

  18. Use of Bayesian Inference in Crystallographic Structure Refinement via Full Diffraction Profile Analysis

    PubMed Central

    Fancher, Chris M.; Han, Zhen; Levin, Igor; Page, Katharine; Reich, Brian J.; Smith, Ralph C.; Wilson, Alyson G.; Jones, Jacob L.

    2016-01-01

    A Bayesian inference method for refining crystallographic structures is presented. The distribution of model parameters is stochastically sampled using Markov chain Monte Carlo. Posterior probability distributions are constructed for all model parameters to properly quantify uncertainty by appropriately modeling the heteroskedasticity and correlation of the error structure. The proposed method is demonstrated by analyzing a National Institute of Standards and Technology silicon standard reference material. The results obtained by Bayesian inference are compared with those determined by Rietveld refinement. Posterior probability distributions of model parameters provide both estimates and uncertainties. The new method better estimates the true uncertainties in the model as compared to the Rietveld method. PMID:27550221

  20. A hybrid Bayesian hierarchical model combining cohort and case-control studies for meta-analysis of diagnostic tests: Accounting for partial verification bias.

    PubMed

    Ma, Xiaoye; Chen, Yong; Cole, Stephen R; Chu, Haitao

    2014-05-26

    To account for between-study heterogeneity in meta-analysis of diagnostic accuracy studies, bivariate random effects models have been recommended to jointly model the sensitivities and specificities. As study design and population vary, the definition of disease status or severity can differ across studies. Consequently, sensitivity and specificity may be correlated with disease prevalence. To account for this dependence, a trivariate random effects model has been proposed. However, that approach can only include cohort studies with information for estimating study-specific disease prevalence. In addition, some diagnostic accuracy studies only select a subset of samples to be verified by the reference test. It is known that ignoring unverified subjects may lead to partial verification bias in the estimation of prevalence, sensitivities, and specificities in a single study. However, the impact of this bias on a meta-analysis has not been investigated. In this paper, we propose a novel hybrid Bayesian hierarchical model that combines cohort and case-control studies while correcting for partial verification bias. We investigate the performance of the proposed methods through a set of simulation studies. Two case studies are presented, on assessing the diagnostic accuracy of gadolinium-enhanced magnetic resonance imaging in detecting lymph node metastases and of adrenal fluorine-18 fluorodeoxyglucose positron emission tomography in characterizing adrenal masses.

  1. Bayesian Analysis for Risk Assessment of Selected Medical Events in Support of the Integrated Medical Model Effort

    NASA Technical Reports Server (NTRS)

    Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.

    2012-01-01

    The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them to make informed decisions about medical preparations and supplies needed for combating and treating various medical events using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical events occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical analysis is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical events occurring in flight. Bayes theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. The IMM team performed Bayesian updates on the following medical events: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.
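
    A minimal sketch of a Bayesian update of this kind: a conjugate beta-binomial model combining prior event counts from an analog population with observed mission data. All counts are illustrative, not IMM inputs.

```python
# Minimal sketch of a conjugate beta-binomial Bayesian update: prior event
# counts from an analog population combined with observed in-flight data.
# All counts are illustrative assumptions, not IMM inputs.
from scipy import stats

# Prior: in an analog population, 3 events observed in 1200 exposure units
a_prior, b_prior = 3.0, 1197.0    # Beta(a, b) prior on the per-unit event probability

# Observed: 1 event in 400 comparable exposure units during missions
events, trials = 1, 400

posterior = stats.beta(a_prior + events, b_prior + trials - events)
print(f"posterior mean event probability: {posterior.mean():.5f}")
lo, hi = posterior.ppf([0.025, 0.975])
print(f"95% credible interval: ({lo:.5f}, {hi:.5f})")
```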

  2. New class of hybrid EoS and Bayesian M - R data analysis

    NASA Astrophysics Data System (ADS)

    Alvarez-Castillo, D.; Ayriyan, A.; Benic, S.; Blaschke, D.; Grigorian, H.; Typel, S.

    2016-03-01

    We explore systematically a new class of two-phase equations of state (EoS) for hybrid stars that is characterized by three main features: 1) stiffening of the nuclear EoS at supersaturation densities due to quark exchange effects (Pauli blocking) between hadrons, modelled by an excluded volume correction; 2) stiffening of the quark matter EoS at high densities due to multiquark interactions; and 3) possibility for a strong first-order phase transition with an early onset and large density jump. The third feature results from a Maxwell construction for the possible transition from the nuclear to a quark matter phase and its properties depend on the two parameters used for 1) and 2), respectively. Varying these two parameters, one obtains a class of hybrid EoS that yields solutions of the Tolman-Oppenheimer-Volkoff (TOV) equations for sequences of hadronic and hybrid stars in the mass-radius diagram which cover the full range of patterns according to the Alford-Han-Prakash classification following which a hybrid star branch can be either absent, connected or disconnected with the hadronic one. The latter case often includes a tiny connected branch. The disconnected hybrid star branch, also called "third family", corresponds to high-mass twin stars characterized by the same gravitational mass but different radii. We perform a Bayesian analysis and demonstrate that the observation of such a pair of high-mass twin stars would have a sufficient discriminating power to favor hybrid EoS with a strong first-order phase transition over alternative EoS.

  3. A method of spherical harmonic analysis in the geosciences via hierarchical Bayesian inference

    NASA Astrophysics Data System (ADS)

    Muir, J. B.; Tkalčić, H.

    2015-11-01

    The problem of decomposing irregular data on the sphere into a set of spherical harmonics is common in many fields of geosciences where it is necessary to build a quantitative understanding of a globally varying field. For example, in global seismology, a compressional or shear wave speed that emerges from tomographic images is used to interpret current state and composition of the mantle, and in geomagnetism, secular variation of magnetic field intensity measured at the surface is studied to better understand the changes in the Earth's core. Optimization methods are widely used for spherical harmonic analysis of irregular data, but they typically do not treat the dependence of the uncertainty estimates on the imposed regularization. This can cause significant difficulties in interpretation, especially when the best-fit model requires more variables as a result of underestimating data noise. Here, with the above limitations in mind, the problem of spherical harmonic expansion of irregular data is treated within the hierarchical Bayesian framework. The hierarchical approach significantly simplifies the problem by removing the need for regularization terms and user-supplied noise estimates. The use of the corrected Akaike Information Criterion for picking the optimal maximum degree of spherical harmonic expansion and the resulting spherical harmonic analyses are first illustrated on a noisy synthetic data set. Subsequently, the method is applied to two global data sets sensitive to the Earth's inner core and lowermost mantle, consisting of PKPab-df and PcP-P differential traveltime residuals relative to a spherically symmetric Earth model. The posterior probability distributions for each spherical harmonic coefficient are calculated via Markov Chain Monte Carlo sampling; the uncertainty obtained for the coefficients thus reflects the noise present in the real data and the imperfections in the spherical harmonic expansion.
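
    A simplified, non-hierarchical sketch of the expansion step: least-squares fitting of real spherical-harmonic basis functions to irregularly sampled data on the sphere. The hierarchical treatment of noise and the information-criterion degree selection are not reproduced; the data and maximum degree are illustrative.

```python
# Simplified, non-hierarchical sketch: least-squares fit of real spherical
# harmonics to irregular data on the sphere. The paper's hierarchical noise
# treatment and degree selection are omitted; data and lmax are illustrative.
import numpy as np
from scipy.special import sph_harm

rng = np.random.default_rng(4)
npts, lmax = 300, 4
theta = rng.uniform(0.0, 2.0 * np.pi, npts)       # azimuth (longitude)
phi = np.arccos(rng.uniform(-1.0, 1.0, npts))     # colatitude, area-uniform

def real_sh_basis(theta, phi, lmax):
    """Columns of real-valued spherical harmonics up to degree lmax."""
    cols = []
    for l in range(lmax + 1):
        for m in range(-l, l + 1):
            Y = sph_harm(abs(m), l, theta, phi)
            if m < 0:
                cols.append(np.sqrt(2.0) * Y.imag)
            elif m == 0:
                cols.append(Y.real)
            else:
                cols.append(np.sqrt(2.0) * Y.real)
    return np.column_stack(cols)

A = real_sh_basis(theta, phi, lmax)
truth = A @ rng.normal(0.0, 1.0, A.shape[1])      # synthetic field on the sphere
data = truth + rng.normal(0.0, 0.1, npts)         # add observational noise

coef, *_ = np.linalg.lstsq(A, data, rcond=None)
print(f"{A.shape[1]} coefficients fitted; leading ones: {np.round(coef[:4], 2)}")
```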

  4. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. It allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these sources depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon the existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in the probability estimates may be large. Our 'hybrid' approach of combining local and global monitoring data with expert judgment facilitates discussion and constructive debate between disciplines, including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where differences in opinion among response team members contribute to defining the uncertainty in the probability estimates. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the public.
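
    The arithmetic of an event tree is simple: each node carries a probability conditional on its parent, and the probability of a branch is the product along the path. A toy illustration with invented structure and numbers (not a VDAP product):

```python
# Toy event-tree illustration: each node holds a probability conditional on
# its parent; a branch probability is the product along the path. Structure
# and numbers are invented for illustration.
event_tree = {
    "unrest": 1.00,             # crisis already under way
    "magmatic": 0.60,           # P(magmatic | unrest)
    "eruption": 0.40,           # P(eruption | magmatic)
    "explosive": 0.50,          # P(explosive | eruption)
    "large (VEI>=4)": 0.15,     # P(VEI>=4 | explosive)
}

p = 1.0
print("cumulative branch probabilities:")
for node, conditional in event_tree.items():
    p *= conditional
    print(f"  {node:<16s} {p:.3f}")
```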

  5. Mapping, Bayesian Geostatistical Analysis and Spatial Prediction of Lymphatic Filariasis Prevalence in Africa

    PubMed Central

    Slater, Hannah; Michael, Edwin

    2013-01-01

    There is increasing interest in controlling or eradicating the major neglected tropical diseases, and accurate modelling of the geographic distributions of parasitic infections will be crucial to this endeavour. We used 664 community-level infection prevalence records collated from the published literature, in conjunction with eight environmental variables, altitude and population density, and a multivariate Bayesian generalized linear spatial model that allows explicit accounting for spatial autocorrelation and incorporation of uncertainty in input data and model parameters, to construct the first spatially explicit map describing LF prevalence distribution in Africa. We also ran the best-fit model against predictions made by the HADCM3 and CCCMA climate models for 2050 to predict the likely distributions of LF under future climate and population changes. We show that LF prevalence is strongly influenced by spatial autocorrelation between locations but is only weakly associated with environmental covariates. Infection prevalence, however, is found to be related to variations in population density. All associations with key environmental/demographic variables appear to be complex and non-linear. LF prevalence is predicted to be highly heterogeneous across Africa, with high prevalences (>20%) estimated to occur primarily along coastal West and East Africa, and the lowest prevalences predicted for the central part of the continent. Error maps, however, indicate a need for further surveys to overcome problems with data scarcity in the latter and other regions. Analysis of future changes in prevalence indicates that population growth rather than climate change per se will represent the dominant factor in the predicted increase/decrease and spread of LF on the continent. We indicate that these results could play an important role in aiding the development of strategies that are best able to achieve the goals of parasite elimination locally and globally, in a manner that may also account for future environmental and demographic change.

  6. Bayesian Analysis of Non-Gaussian Long-Range Dependent Processes

    NASA Astrophysics Data System (ADS)

    Graves, T.; Franzke, C.; Gramacy, R. B.; Watkins, N. W.

    2012-12-01

    Recent studies have strongly suggested that surface temperatures exhibit long-range dependence (LRD). The presence of LRD would hamper the identification of deterministic trends and the quantification of their significance. It is well established that LRD processes exhibit stochastic trends over rather long periods of time. Thus, accurate methods for discriminating between physical processes that possess long memory and those that do not are an important adjunct to climate modeling. We have used Markov Chain Monte Carlo algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average (ARFIMA) processes, which are capable of modeling LRD. Our principal aim is to obtain inference about the long memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series such as the Central England Temperature. Many physical processes, for example the Faraday time series from Antarctica, are highly non-Gaussian. We have therefore extended this work by weakening the Gaussianity assumption. Specifically, we assume a symmetric α-stable distribution for the innovations. Such processes provide good, flexible, initial models for non-Gaussian processes with long memory. We will present a study of the dependence of the posterior variance σ_d of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other measures of d.
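
    For comparison, the long-memory parameter d can also be estimated by a classical (non-Bayesian) method, the GPH log-periodogram regression. The sketch below simulates an ARFIMA(0, d, 0) series by fractional differencing and recovers d; everything here is illustrative, not the authors' MCMC machinery.

```python
# Classical (non-Bayesian) GPH log-periodogram estimate of the long-memory
# parameter d, for comparison with a posterior-based approach. The
# ARFIMA(0, d, 0) series is simulated by fractional differencing.
import numpy as np

rng = np.random.default_rng(5)
n, d_true = 4096, 0.3

# Simulate ARFIMA(0, d, 0): x_t = (1 - B)^(-d) eps_t via the MA(inf) weights,
# psi_j = Gamma(j + d) / (Gamma(d) Gamma(j + 1)), computed recursively.
j = np.arange(1, n)
psi = np.concatenate([[1.0], np.cumprod((j - 1 + d_true) / j)])
x = np.convolve(rng.normal(size=n), psi)[:n]

# GPH: regress the log periodogram on log(4 sin^2(w/2)) at low frequencies;
# the slope estimates -d.
m = int(n ** 0.5)                                  # common bandwidth choice
freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2.0 * np.pi * n)
X = np.column_stack([np.ones(m), np.log(4.0 * np.sin(freqs / 2.0) ** 2)])
beta, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
print(f"GPH estimate of d: {-beta[1]:.3f} (true value {d_true})")
```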

  7. Bayesian analysis of stage-fall-discharge rating curves and their uncertainties

    NASA Astrophysics Data System (ADS)

    Mansanarez, Valentin; Le Coz, Jérôme; Renard, Benjamin; Lang, Michel; Pierrefeu, Gilles; Le Boursicaud, Raphaël; Pobanz, Karine

    2016-04-01

    Stage-fall-discharge (SFD) rating curves are traditionally used to compute streamflow records at sites where the energy slope of the flow is variable due to variable backwater effects. Building on existing Bayesian approaches, we introduce an original hydraulics-based method for developing SFD rating curves used at twin gauge stations and estimating their uncertainties. Conventional power functions for channel and section controls are used, and the transition to a backwater-affected channel control is computed based on a continuity condition, solved either analytically or numerically. The difference between the reference levels at the two stations is estimated as another uncertain parameter of the SFD model. The method proposed in this presentation incorporates information from both hydraulic knowledge (equations of channel or section controls) and the information available in the stage-fall-discharge observations (gauging data). The obtained total uncertainty combines the parametric uncertainty and the remnant uncertainty related to the rating curve model. This method provides a direct estimation of the physical inputs of the rating curve (roughness, width, bed slope, distance between twin gauges, etc.). The performance of the new method is tested using an application case affected by the variable backwater of a run-of-the-river dam: the Rhône river at Valence, France. In particular, a sensitivity analysis to the prior information and to the gauging dataset is performed. At that site, the stage-fall-discharge domain is well documented with gaugings conducted over a range of backwater-affected and unaffected conditions. The performance of the new model was deemed to be satisfactory. Notably, the transition to uniform flow when the overall range of the auxiliary stage is gauged is correctly simulated. The resulting curves are in good agreement with the observations (gaugings) and their uncertainty envelopes are acceptable for computing streamflow records. Similar
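
    A minimal sketch of the Bayesian machinery involved, reduced to a single-control power-law rating curve Q = a(h - b)^c fitted to synthetic gaugings by random-walk Metropolis; the fall term and backwater transition of the full SFD model are omitted, and all numbers are assumptions.

      import numpy as np

      rng = np.random.default_rng(1)
      # Synthetic gaugings from a single-control power law Q = a*(h - b)^c
      a_t, b_t, c_t = 20.0, 0.5, 1.6
      h = np.sort(rng.uniform(1.0, 5.0, 40))
      Q = a_t * (h - b_t) ** c_t * np.exp(0.05 * rng.standard_normal(40))  # ~5% noise

      def log_post(th):
          a, b, c, ls = th                      # ls: log of the log-space residual std
          if a <= 0 or c <= 0 or b >= h.min():  # flat priors restricted to a valid support
              return -np.inf
          s = np.exp(ls)
          resid = np.log(Q) - np.log(a * (h - b) ** c)
          return -len(h) * ls - 0.5 * np.sum(resid ** 2) / s ** 2

      th = np.array([10.0, 0.0, 1.5, np.log(0.1)])
      lp = log_post(th)
      step = np.array([0.5, 0.05, 0.05, 0.1])
      draws = []
      for _ in range(20000):                    # random-walk Metropolis
          prop = th + step * rng.standard_normal(4)
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:
              th, lp = prop, lp_prop
          draws.append(th.copy())
      post = np.array(draws[5000:])             # discard burn-in
      print("posterior medians (a, b, c):", np.median(post[:, :3], axis=0))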

  8. Multi-component joint analysis of surface waves

    NASA Astrophysics Data System (ADS)

    Dal Moro, Giancarlo; Moura, Rui Miguel Marques; Moustafa, Sayed S. R.

    2015-08-01

    Propagation of surface waves can occur with complex energy distribution amongst the various modes. It is shown that even simple VS (shear-wave velocity) profiles can generate velocity spectra that, because of a complex mode excitation, can be quite difficult to interpret in terms of modal dispersion curves. In some cases, Rayleigh waves show relevant differences depending on the considered component (radial or vertical) and the kind of source (vertical impact or explosive). Contrary to several simplistic assumptions often proposed, it is shown, both via synthetic and field datasets, that the fundamental mode of Rayleigh waves can be almost completely absent. This sort of evidence demonstrates the importance of a multi-component analysis capable of providing the necessary elements to properly interpret the data and adequately constrain the subsurface model. It is purposely shown, also through the sole use of horizontal geophones, how it can be possible to efficiently and quickly acquire both Love and Rayleigh (radial-component) waves. The presented field dataset reports a case where Rayleigh waves (both their vertical and radial components) appear largely dominated by higher modes with little or no evidence of the fundamental mode. The joint inversion of the radial and vertical components of Rayleigh waves jointly with Love waves is performed by adopting a multi-objective inversion scheme based on the computation of synthetic seismograms for the three considered components and the minimization of the whole velocity spectra misfits (Full Velocity Spectra - FVS - inversion). Such a FVS multi-component joint inversion can better handle complex velocity spectra thus providing a more robust subsurface model not affected by erroneous velocity spectra interpretations and non-uniqueness of the solution.

  9. Comparing energy sources for surgical ablation of atrial fibrillation: a Bayesian network meta-analysis of randomized, controlled trials.

    PubMed

    Phan, Kevin; Xie, Ashleigh; Kumar, Narendra; Wong, Sophia; Medi, Caroline; La Meir, Mark; Yan, Tristan D

    2015-08-01

    Simplified maze procedures involving radiofrequency, cryoenergy and microwave energy sources have been increasingly utilized for the surgical treatment of atrial fibrillation as an alternative to the traditional cut-and-sew approach. In the absence of direct comparisons, a Bayesian network meta-analysis is an alternative way to assess the relative effect of different treatments, using indirect evidence. A Bayesian meta-analysis of indirect evidence was performed using 16 published randomized trials identified from 6 databases. Rank probability analysis was used to rank each intervention in terms of its probability of having the best outcome. Sinus rhythm prevalence beyond the 12-month follow-up was similar between the cut-and-sew, microwave and radiofrequency approaches, which were all ranked better than cryoablation (respectively, 39, 36, and 25 vs 1%). The cut-and-sew maze was ranked worst in terms of mortality outcomes compared with microwave, radiofrequency and cryoenergy (2 vs 19, 34, and 24%, respectively). The cut-and-sew maze procedure was associated with significantly lower stroke rates compared with microwave ablation [odds ratio <0.01; 95% confidence interval 0.00, 0.82], and ranked best in terms of pacemaker requirements compared with microwave, radiofrequency and cryoenergy (81 vs 14, 1 and <0.01%, respectively). Bayesian rank probability analysis shows that the cut-and-sew approach is associated with the best outcomes in terms of sinus rhythm prevalence and stroke outcomes, and remains the gold standard approach for AF treatment. Given the limitations of indirect comparison analysis, these results should be viewed with caution and not over-interpreted.
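
    The rank-probability step can be sketched as follows on hypothetical posterior draws; the means and spreads below are invented for illustration and are not the study's estimates.

      import numpy as np

      rng = np.random.default_rng(2)
      # Hypothetical posterior draws of a treatment effect (higher = better here)
      treatments = ["cut-and-sew", "radiofrequency", "microwave", "cryoenergy"]
      means, sds = [0.9, 0.8, 0.85, 0.3], [0.3, 0.25, 0.35, 0.3]
      draws = np.column_stack([rng.normal(m, s, 50000) for m, s in zip(means, sds)])

      # P(best) = share of posterior draws in which each treatment ranks first
      best = np.argmax(draws, axis=1)
      for j, t in enumerate(treatments):
          print(f"P({t} is best) = {np.mean(best == j):.3f}")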

  10. Identification of QTLs of resistance to white mold in common bean from multiple markers by using Bayesian analysis.

    PubMed

    Lara, L A C; Santos, J B; Balestre, M; Lima, I A; Pamplona, A K A; Veloso, J S; Silva, P H

    2015-02-06

    In this study, we identified simple sequence repeat, amplified fragment length polymorphism, and sequence-related amplified polymorphism markers linked to quantitative trait loci (QTLs) for resistance to white mold disease in common bean progenies derived from a cross between lines CNFC 9506 and RP-2, evaluated using the oxalic acid test and Bayesian analysis. DNA was extracted from 186 F₂ plants and their parental lines for molecular analysis. Fifteen experiments were carried out for phenotypic analysis, which included 186 F₂:₄ progenies, the F₁ generation, the F₂ generation, and the lines CNFC 9506, RP-2, and G122 as common treatments. A completely randomized experimental design with 3 replications was used in controlled environments. The adjusted means for the F₂:₄ generation were used to identify QTLs by Bayesian shrinkage analysis. Significant differences were observed among the progenies for the reaction to white mold. The moving away method under the Bayesian approach was effective for identifying QTLs when it was not possible to obtain a genetic map because of low marker density. Using the Wald test, 25 markers identified QTLs for resistance to white mold: 16 simple sequence repeats, 7 amplified fragment length polymorphisms, and 2 sequence-related amplified polymorphisms. The markers BM184, BM211, and PV-gaat001 showed short distances from QTLs related to white mold resistance. In addition, these markers showed signal effects with increasing resistance to white mold and high heritability in the analysis with oxalic acid, and thus are promising for marker-assisted selection.

  11. Genome-wide association study of swine farrowing traits. Part II: Bayesian analysis of marker data.

    PubMed

    Schneider, J F; Rempel, L A; Snelling, W M; Wiedmann, R T; Nonneman, D J; Rohrer, G A

    2012-10-01

    Reproductive efficiency has a great impact on the economic success of pork (Sus scrofa) production. Number born alive (NBA) and average piglet birth weight (ABW) contribute greatly to reproductive efficiency. To better understand the underlying genetics of birth traits, a genome-wide association study (GWAS) was undertaken. Samples of DNA were collected and tested using the Illumina PorcineSNP60 BeadChip from 1,152 first parity gilts. Traits included total number born (TNB), NBA, number born dead (NBD), number stillborn (NSB), number of mummies (MUM), total litter birth weight (LBW), and ABW. A total of 41,151 SNP were tested using a Bayesian approach. Beginning with the first 5 SNP on SSC1 and ending with the last 5 SNP on SSCX, SNP were assigned to groups of 5 consecutive SNP by chromosome-position order and analyzed again using a Bayesian approach. From that analysis, 5-SNP groups were selected having no overlap with other 5-SNP groups and no overlap across chromosomes. These selected 5-SNP non-overlapping groups were defined as QTL. Of the available 8,814 QTL, 124 were found to be statistically significant (P < 0.01). Multiple testing was considered using the probability of false positives. Eleven QTL were found for TNB, 3 on SSC1, 3 on SSC4, 1 on SSC13, 1 on SSC14, 2 on SSC15, and 1 on SSC17. Statistical testing for NBA identified 14 QTL, 4 on SSC1, 1 on SSC4, 1 on SSC6, 1 on SSC10, 1 on SSC13, 3 on SSC15, and 3 on SSC17. A single NBD QTL was found on SSC11. No QTL were identified for NSB or MUM. Thirty-three QTL were found for LBW, 3 on SSC1, 1 on SSC2, 1 on SSC3, 5 on SSC4, 2 on SSC5, 5 on SSC6, 3 on SSC7, 2 on SSC9, 1 on SSC10, 2 on SSC14, 6 on SSC15, and 2 on SSC17. A total of 65 QTL were found for ABW, 9 on SSC1, 3 on SSC2, 9 on SSC5, 5 on SSC6, 1 on SSC7, 2 on SSC8, 2 on SSC9, 3 on SSC10, 1 on SSC11, 3 on SSC12, 2 on SSC13, 8 on SSC14, 8 on SSC15, 1 on SSC17, and 8 on SSC18. Several candidate genes have been identified that overlap QTL locations
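
    The 5-SNP grouping step described above can be sketched as follows, on a toy SNP map with assumed column names (not the PorcineSNP60 data).

      import pandas as pd

      # Hypothetical SNP map in chromosome-position order, as in the abstract
      snps = pd.DataFrame({
          "snp": [f"rs{i}" for i in range(23)],
          "chrom": ["SSC1"] * 12 + ["SSC2"] * 11,
          "pos": list(range(12)) + list(range(11)),
      }).sort_values(["chrom", "pos"])

      # Assign consecutive, non-overlapping 5-SNP windows within each chromosome;
      # trailing groups of fewer than 5 SNP are dropped here for simplicity.
      snps["window"] = snps.groupby("chrom").cumcount() // 5
      counts = snps.groupby(["chrom", "window"])["snp"].transform("size")
      qtl_windows = snps[counts == 5]
      print(qtl_windows.groupby(["chrom", "window"])["snp"].apply(list))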

  12. Bayesian Analysis of Structural Equation Models with Nonlinear Covariates and Latent Variables

    ERIC Educational Resources Information Center

    Song, Xin-Yuan; Lee, Sik-Yum

    2006-01-01

    In this article, we formulate a nonlinear structural equation model (SEM) that can accommodate covariates in the measurement equation and nonlinear terms of covariates and exogenous latent variables in the structural equation. The covariates can come from continuous or discrete distributions. A Bayesian approach is developed to analyze the…

  13. Bayesian analysis of censored response data in family-based genetic association studies.

    PubMed

    Del Greco M, Fabiola; Pattaro, Cristian; Minelli, Cosetta; Thompson, John R

    2016-09-01

    Biomarkers are subject to censoring whenever some measurements are not quantifiable given a laboratory detection limit. Methods for handling censoring have received less attention in genetic epidemiology, and censored data are still often replaced with a fixed value. We compared different strategies for handling a left-censored continuous biomarker in a family-based study, where the biomarker is tested for association with a genetic variant, S, adjusting for a covariate, X. Allowing different correlations between X and S, we compared simple substitution of censored observations with the detection limit followed by a linear mixed effect model (LMM), a Bayesian model with noninformative priors, a Tobit model with robust standard errors, and multiple imputation (MI) with and without S in the imputation model, followed by an LMM. Our comparison was based on real and simulated data in which 20% and 40% censoring were artificially induced. The complete data were also analyzed with an LMM. In the MICROS study, the Bayesian model gave results closest to those obtained with the complete data. In the simulations, simple substitution was always the most biased method, the Tobit approach gave the least biased estimates at all censoring levels and correlation values, and the Bayesian model and both MI approaches gave slightly biased estimates but smaller root mean square errors. On the basis of these results, the Bayesian approach is highly recommended for candidate gene studies; however, the computationally simpler Tobit and the MI without S are both good options for genome-wide studies.
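
    A minimal sketch of the Tobit likelihood for a left-censored outcome, one of the strategies compared above; this is a plain fixed-effects version on synthetic data, with the study's family structure and mixed effects omitted.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      rng = np.random.default_rng(3)
      n, dl = 500, 0.0                      # dl = detection limit
      X = np.column_stack([np.ones(n), rng.standard_normal(n)])
      y_latent = X @ np.array([0.5, 1.0]) + rng.standard_normal(n)
      y = np.maximum(y_latent, dl)          # observed values are censored at dl
      cens = y_latent <= dl

      def neg_loglik(theta):
          b, log_s = theta[:2], theta[2]
          s = np.exp(log_s)
          mu = X @ b
          # observed points contribute the density; censored points the left tail mass
          ll_obs = norm.logpdf(y[~cens], mu[~cens], s)
          ll_cen = norm.logcdf((dl - mu[cens]) / s)
          return -(ll_obs.sum() + ll_cen.sum())

      fit = minimize(neg_loglik, x0=np.zeros(3), method="BFGS")
      print("Tobit estimates (b0, b1, sigma):", fit.x[0], fit.x[1], np.exp(fit.x[2]))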

  14. Exemplars, Prototypes, Similarities, and Rules in Category Representation: An Example of Hierarchical Bayesian Analysis

    ERIC Educational Resources Information Center

    Lee, Michael D.; Vanpaemel, Wolf

    2008-01-01

    This article demonstrates the potential of using hierarchical Bayesian methods to relate models and data in the cognitive sciences. This is done using a worked example that considers an existing model of category representation, the Varying Abstraction Model (VAM), which attempts to infer the representations people use from their behavior in…

  15. Analysis of Bonded Joints Between the Facesheet and Flange of Corrugated Composite Panels

    NASA Technical Reports Server (NTRS)

    Yarrington, Phillip W.; Collier, Craig S.; Bednarcyk, Brett A.

    2008-01-01

    This paper outlines a method for the stress analysis of bonded composite corrugated panel facesheet to flange joints. The method relies on the existing HyperSizer Joints software, which analyzes the bonded joint, along with a beam analogy model that provides the necessary boundary loading conditions to the joint analysis. The method is capable of predicting the full multiaxial stress and strain fields within the flange to facesheet joint and thus can determine ply-level margins and evaluate delamination. Results comparing the method to NASTRAN finite element model stress fields are provided illustrating the accuracy of the method.

  16. Nonlinear Analysis of Bonded Composite Single-LAP Joints

    NASA Technical Reports Server (NTRS)

    Oterkus, E.; Barut, A.; Madenci, E.; Smeltzer, S. S.; Ambur, D. R.

    2004-01-01

    This study presents a semi-analytical solution method to analyze the geometrically nonlinear response of bonded composite single-lap joints with tapered adherend edges under uniaxial tension. The solution method provides the transverse shear and normal stresses in the adhesive and the in-plane stress resultants and bending moments in the adherends. The method utilizes the principle of virtual work in conjunction with von Karman's nonlinear plate theory to model the adherends and the shear lag model to represent the kinematics of the thin adhesive layer between the adherends. Furthermore, the method accounts for the bilinear elastic material behavior of the adhesive while maintaining a linear stress-strain relationship in the adherends. In order to account for the stiffness changes due to thickness variation of the adherends along the tapered edges, their in-plane and bending stiffness matrices are varied as a function of thickness along the tapered region. The combination of these complexities results in a system of nonlinear governing equilibrium equations. This approach represents a computationally efficient alternative to the finite element method. Comparisons are made with corresponding results obtained from finite-element analysis. The results confirm the validity of the solution method. The numerical results present the effects of taper angle, adherend overlap length, and the bilinear adhesive material on the stress fields in the adherends, as well as in the adhesive, of a single-lap joint.

  17. SRB Environment Evaluation and Analysis. Volume 2: RSRB Joint Filling Test/Analysis Improvements

    NASA Technical Reports Server (NTRS)

    Knox, E. C.; Woods, G. Hamilton

    1991-01-01

    Following the Challenger accident, a very comprehensive solid rocket booster (SRB) redesign program was initiated. One objective of the program was to develop expertise at NASA/MSFC in the techniques for analyzing the flow of hot gases in the SRB joints. Several test programs were undertaken to provide a database of joint performance, with manufactured defects in the joints to allow hot gases to fill the joints. This database was also used to develop the analytical techniques. Some of the test programs were the Joint Environment Simulator (JES), Nozzle Joint Environment Simulator (NJES), Transient Pressure Test Article (TPTA), and Seventy-Pound Charge (SPC). In 1988 the TPTA test hardware was moved from the Utah site to MSFC and several RSRM tests were scheduled, to be followed by tests for the ASRM program. REMTECH Inc. supported these activities with pretest estimates of the flow conditions in the test joints, and with post-test analysis and evaluation of the measurements. During this support, REMTECH identified deficiencies in the gas-measurement instrumentation that existed in the TPTA hardware, made recommendations for its replacement, and identified improvements to the analytical tools used in the test support. Only one test was completed under the TPTA RSRM test program, and those scheduled for the ASRM were rescheduled to a time after the expiration of this contract. The attention of this effort was directed toward improvements in the analytical techniques in preparation for the time when the ASRM program begins.

  18. Reliability Analysis of a Glacier Lake Warning System Using a Bayesian Net

    NASA Astrophysics Data System (ADS)

    Sturny, Rouven A.; Bründl, Michael

    2013-04-01

    Besides structural mitigation measures like avalanche defense structures, dams and galleries, warning and alarm systems have become important measures for dealing with Alpine natural hazards. Integrating them into risk mitigation strategies and comparing their effectiveness with structural measures requires quantification of the reliability of these systems. However, little is known about how the reliability of warning systems can be quantified and which methods are suitable for comparing their contribution to risk reduction with that of structural mitigation measures. We present a reliability analysis of a warning system located in Grindelwald, Switzerland. The warning system was built for warning and protecting residents and tourists from glacier outburst floods resulting from a rapid drainage of the glacier lake. We set up a Bayesian Net (BN) that allowed for a qualitative and quantitative reliability analysis. The Conditional Probability Tables (CPT) of the BN were determined according to the manufacturer's reliability data for each component of the system, as well as by assigning weights to specific BN nodes accounting for information flows and decision-making processes of the local safety service. The presented results focus on the two alerting units 'visual acoustic signal' (VAS) and 'alerting of the intervention entities' (AIE). For the summer of 2009, the reliability was determined to be 94% for the VAS and 83% for the AIE. The probability of occurrence of a major event was calculated as 0.55% per day, resulting in an overall reliability of 99.967% for the VAS and 99.906% for the AIE. We concluded that a failure of the VAS alerting unit would be the consequence of a simultaneous failure of the four probes located in the lake and the gorge. Similarly, we deduced that the AIE would fail either if there were a simultaneous connectivity loss of the mobile and fixed network in Grindelwald, an Internet access loss or a failure of the regional operations
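
    The reported overall reliabilities are consistent with combining the per-event unit reliability with the daily probability of a major event; the following back-of-the-envelope check is one reading of the numbers, not the authors' Bayesian net.

      # Overall daily reliability = 1 - P(major event) * P(alerting unit fails | event)
      p_event = 0.0055                        # 0.55% per day, as reported
      for name, r_unit in [("VAS", 0.94), ("AIE", 0.83)]:
          overall = 1 - p_event * (1 - r_unit)
          print(f"{name}: {overall:.5f}")     # ~0.99967 and ~0.99907, matching up to rounding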

  19. Rating locomotive crew diesel emission exposure profiles using statistics and Bayesian Decision Analysis.

    PubMed

    Hewett, Paul; Bullock, William H

    2014-01-01

    For more than 20 years, CSX Transportation (CSXT) has collected exposure measurements from locomotive engineers and conductors who are potentially exposed to diesel emissions. The database included measurements for elemental and total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, carbon monoxide, and nitrogen dioxide. This database was statistically analyzed and summarized, and the resulting statistics and exposure profiles were compared to relevant occupational exposure limits (OELs) using both parametric and non-parametric descriptive and compliance statistics. Exposure ratings, using the American Industrial Hygiene Association (AIHA) exposure categorization scheme, were determined using both the compliance statistics and Bayesian Decision Analysis (BDA). The statistical analysis of the elemental carbon data (a marker for diesel particulate) strongly suggests that the majority of levels in the cabs of the lead locomotives (n = 156) were less than the California guideline of 0.020 mg/m(3). The sample 95th percentile was roughly half the guideline, resulting in an AIHA exposure rating of category 2/3 (determined using BDA). The elemental carbon (EC) levels in the trailing locomotives tended to be greater than those in the lead locomotive; however, locomotive crews rarely ride in the trailing locomotive. Lead locomotive EC levels were similar to those reported by other investigators studying locomotive crew exposures and to levels measured in urban areas. Lastly, both the EC sample mean and 95% UCL were less than the Environmental Protection Agency (EPA) reference concentration of 0.005 mg/m(3). With the exception of nitrogen dioxide, the overwhelming majority of the measurements for total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, and combustion gases in the cabs of CSXT locomotives were either non-detects or considerably less than the working OELs for the years represented in the database. When compared to the previous American
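
    The parametric side of such a rating can be sketched by fitting a lognormal to the measurements and comparing the 95th percentile against the guideline; the data below are synthetic, and the BDA step that converts this into category probabilities is not reproduced.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(4)
      oel = 0.020                                    # mg/m^3, California EC guideline
      x = rng.lognormal(mean=np.log(0.004), sigma=0.5, size=156)   # synthetic EC data

      mu, s = np.log(x).mean(), np.log(x).std(ddof=1)
      p95 = np.exp(mu + norm.ppf(0.95) * s)          # lognormal 95th percentile
      exceed = 1 - norm.cdf((np.log(oel) - mu) / s)  # P(exposure > OEL)
      print(f"P95 = {p95:.4f} mg/m^3 ({p95 / oel:.0%} of the OEL), exceedance = {exceed:.2%}")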

  20. Elastic-plastic analysis of crack in ductile adhesive joint

    SciTech Connect

    Ikeda, Toru; Miyazaki, Noriyuki; Yamashita, Akira; Munakata, Tsuyoshi

    1995-11-01

    The fracture of a crack in adhesive is important to the structural integrity of adhesive structures and composite materials. Though the fracture toughness of a material should be constant according to fracture mechanics, it is said that the fracture toughness of a crack in an adhesive joint depends on the bond thickness. In the present study, elastic-plastic stress analyses of a crack in a thin adhesive layer are performed by a combination of the boundary element method and the finite element method. The effect of adhesive thickness on the J-integral, the Q′-factor, which is a modified version of the Q-factor, and the crack tip opening displacement (CTOD) is investigated. It is found from the analyses that the CTOD begins to decrease at very small bond thicknesses, while the Q′-factor remains almost constant. The decrease in fracture toughness for very thin adhesive layers is thus expected from the present analysis.

  1. Magellan/Galileo solder joint failure analysis and recommendations

    NASA Technical Reports Server (NTRS)

    Ross, Ronald G., Jr.

    1989-01-01

    On or about November 10, 1988, an open-circuit solder joint was discovered in the Magellan Radar digital unit (DFU) during integration testing at Kennedy Space Center (KSC). A detailed analysis of the cause of the failure was conducted at the Jet Propulsion Laboratory, leading to the successful repair of many pieces of affected electronic hardware on both the Magellan and Galileo spacecraft. The problem was caused by the presence of high thermal coefficient of expansion heat sink and conformal coating materials located in the large (0.055 inch) gap between Dual Inline Packages (DIPs) and the printed wiring board. The details of the observed problems are described and recommendations are made for improved design and testing activities in the future.

  2. Limitations of cytochrome oxidase I for the barcoding of Neritidae (Mollusca: Gastropoda) as revealed by Bayesian analysis.

    PubMed

    Chee, S Y

    2015-05-25

    The mitochondrial DNA (mtDNA) cytochrome oxidase I (COI) gene has been universally and successfully utilized as a barcoding gene, mainly because it can be amplified easily, applied across a wide range of taxa, and results can be obtained cheaply and quickly. However, in rare cases, the gene can fail to distinguish between species, particularly when exposed to highly sensitive methods of data analysis, such as the Bayesian method, or when taxa have undergone introgressive hybridization, over-splitting, or incomplete lineage sorting. Such cases require the use of alternative markers, and nuclear DNA markers are commonly used. In this study, a dendrogram produced by Bayesian analysis of an mtDNA COI dataset was compared with that of a nuclear DNA ATPS-α dataset, in order to evaluate the efficiency of COI in barcoding Malaysian nerites (Neritidae). In the COI dendrogram, most of the species were in individual clusters, except for two species: Nerita chamaeleon and N. histrio. These two species were placed in the same subcluster, whereas in the ATPS-α dendrogram they were in their own subclusters. Analysis of the ATPS-α gene also placed the two genera of nerites (Nerita and Neritina) in separate clusters, whereas COI gene analysis placed both genera in the same cluster. Therefore, in the case of the Neritidae, the ATPS-α gene is a better barcoding gene than the COI gene.

  3. Failure Analysis in Space: International Space Station (ISS) Starboard Solar Alpha Rotary Joint (SARJ) Debris Analysis

    NASA Technical Reports Server (NTRS)

    Long, V. S.; Wright, M. C.; McDanels, S. J.; Lubas, D.; Tucker, B.; Marciniak, P. J.

    2010-01-01

    This slide presentation reviews the debris analysis of the Starboard Solar Alpha Rotary Joint (SARJ), a mechanism that is designed to keep the solar arrays facing the sun. The goal of this was to identify the failure mechanism based on surface morphology and to determine the source of debris through elemental and particle analysis.

  4. Medical diagnosis aboard submarines. Use of a computer-based Bayesian method of analysis in an abdominal pain diagnostic program.

    PubMed

    Osborne, S F

    1984-02-01

    The medical issues that arise in the isolated environment of a submarine can occasionally be grave. While crewmembers are carefully screened for health problems, they are still susceptible to serious acute illness. Currently, the submarine medical department representative, the hospital corpsman, utilizes a history and physical examination, clinical acumen, and limited laboratory testing in diagnosis. The application of a Bayesian method of analysis to an abdominal pain diagnostic system utilizing an onboard microcomputer is described herein. Early results from sea trials show an appropriate diagnosis in eight of 10 cases of abdominal pain, but the program should still be viewed as an extended "laboratory test" until proved effective at sea.
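
    The Bayesian core of such a diagnostic program is a posterior over diagnoses given observed findings; the following is a toy naive-Bayes sketch with invented diagnoses, findings, and probabilities, not the program's actual knowledge base.

      import numpy as np

      # Invented conditional probabilities P(finding | diagnosis); columns are the findings
      diagnoses = ["appendicitis", "non-specific pain", "cholecystitis"]
      prior = np.array([0.15, 0.70, 0.15])
      p_finding = np.array([       # findings: RLQ tenderness, fever, nausea
          [0.85, 0.60, 0.70],
          [0.20, 0.10, 0.30],
          [0.10, 0.55, 0.65],
      ])
      observed = np.array([1, 1, 0])   # tenderness and fever present, no nausea

      # Naive-Bayes update: P(f|d) for present findings, 1 - P(f|d) for absent ones
      lik = np.prod(np.where(observed == 1, p_finding, 1 - p_finding), axis=1)
      post = prior * lik
      post /= post.sum()
      for d, pr in zip(diagnoses, post):
          print(f"P({d} | findings) = {pr:.3f}")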

  5. Kinematic and dynamic analysis of an anatomically based knee joint.

    PubMed

    Lee, Kok-Meng; Guo, Jiajie

    2010-05-01

    This paper presents a knee-joint model to provide a better understanding of the interaction between natural joints and artificial mechanisms for the design and control of rehabilitation exoskeletons. The anatomically based knee model relaxes several commonly made assumptions that approximate a human knee as an engineering pin-joint in exoskeleton design. Based on published MRI data, we formulate the kinematics of a knee-joint and compare three mathematical approximations: one model is based on two sequential circles rolling on a flat plane, and the other two are mathematically differentiable ellipses-based models with and without sliding at the contact. The ellipses-based model taking sliding contact into account shows that the rolling-sliding ratio of a knee-joint is not constant but has an average value consistent with published measurements. This knee-joint kinematics leads to a physically more accurate contact-point trajectory than methods based on multiple circles or lines, and provides a basis to derive a knee-joint kinetic model upon which the effects of a planar exoskeleton mechanism on the internal joint forces and torque during flexion can be numerically investigated. Two different knee-joint kinetic models (a pin-joint approximation and the anatomically based model) are compared against a condition with no exoskeleton. The leg and exoskeleton form a closed kinematic chain that has a significant effect on the joint forces in the knee. The human knee is more tolerant than a pin-joint in negotiating around a singularity, but its internal forces increase with the exoskeleton mass-to-length ratio. An oversimplifying pin-joint approximation cannot capture the finite change in the knee forces due to the singularity effect.

  6. Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini,Benjamin; Young, Daniel

    1990-01-01

    A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite bonded joints below the cryogenic temperature of 30K (-405 F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings, and composite gusset/clip joints, all bonded to M55J/954-6 and T300/954-6 hybrid composite tubes (75 mm square). Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited, and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Increasing this challenge is the difficulty of testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce the required analytical tools and develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate anticipated stress states in the flight joints; subsequently, the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on the developed techniques and properties.

  7. Bayesian Analysis of HMI Images and Comparison to TSI Variations and MWO Image Observables

    NASA Astrophysics Data System (ADS)

    Parker, D. G.; Ulrich, R. K.; Beck, J.; Tran, T. V.

    2015-12-01

    We have previously applied the Bayesian automatic classification system AutoClass to solar magnetogram and intensity images from the 150 Foot Solar Tower at Mount Wilson to identify classes of solar surface features associated with variations in total solar irradiance (TSI) and, using those identifications, modeled TSI time series with improved accuracy (r > 0.96) (Ulrich et al., 2010). AutoClass identifies classes by a two-step process in which it: (1) finds, without human supervision, a set of class definitions based on specified attributes of a sample of the image data pixels, such as magnetic field and intensity in the case of MWO images, and (2) applies the class definitions thus found to new data sets to identify automatically in them the classes found in the sample set. HMI high resolution images capture four observables (magnetic field, continuum intensity, line depth and line width), in contrast to MWO's two observables (magnetic field and intensity). In this study, we apply AutoClass to the HMI observables for images from June, 2010 to December, 2014 to identify solar surface feature classes. We use contemporaneous TSI measurements to determine whether and how variations in the HMI classes are related to TSI variations and compare the characteristic statistics of the HMI classes to those found from MWO images. We also attempt to derive scale factors between the HMI and MWO magnetic and intensity observables. The ability to categorize automatically surface features in the HMI images holds out the promise of consistent, relatively quick and manageable analysis of the large quantity of data available in these images. Given that the classes found in MWO images using AutoClass have been found to improve modeling of TSI, application of AutoClass to the more complex HMI images should enhance understanding of the physical processes at work in solar surface features and their implications for the solar-terrestrial environment. Ulrich, R.K., Parker, D, Bertello, L. and

  8. Bayesian Analysis Of HMI Solar Image Observables And Comparison To TSI Variations And MWO Image Observables

    NASA Astrophysics Data System (ADS)

    Parker, D. G.; Ulrich, R. K.; Beck, J.

    2014-12-01

    We have previously applied the Bayesian automatic classification system AutoClass to solar magnetogram and intensity images from the 150 Foot Solar Tower at Mount Wilson to identify classes of solar surface features associated with variations in total solar irradiance (TSI) and, using those identifications, modeled TSI time series with improved accuracy (r > 0.96) (Ulrich et al., 2010). AutoClass identifies classes by a two-step process in which it: (1) finds, without human supervision, a set of class definitions based on specified attributes of a sample of the image data pixels, such as magnetic field and intensity in the case of MWO images, and (2) applies the class definitions thus found to new data sets to identify automatically in them the classes found in the sample set. HMI high resolution images capture four observables (magnetic field, continuum intensity, line depth and line width), in contrast to MWO's two observables (magnetic field and intensity). In this study, we apply AutoClass to the HMI observables for images from May, 2010 to June, 2014 to identify solar surface feature classes. We use contemporaneous TSI measurements to determine whether and how variations in the HMI classes are related to TSI variations and compare the characteristic statistics of the HMI classes to those found from MWO images. We also attempt to derive scale factors between the HMI and MWO magnetic and intensity observables. The ability to categorize automatically surface features in the HMI images holds out the promise of consistent, relatively quick and manageable analysis of the large quantity of data available in these images. Given that the classes found in MWO images using AutoClass have been found to improve modeling of TSI, application of AutoClass to the more complex HMI images should enhance understanding of the physical processes at work in solar surface features and their implications for the solar-terrestrial environment. Ulrich, R.K., Parker, D, Bertello, L. and

  9. A Bayesian ridge regression analysis of congestion's impact on urban expressway safety.

    PubMed

    Shi, Qi; Abdel-Aty, Mohamed; Lee, Jaeyoung

    2016-03-01

    With the rapid growth of traffic in urban areas, concerns about congestion and traffic safety have been heightened. This study leveraged both an Automatic Vehicle Identification (AVI) system and a Microwave Vehicle Detection System (MVDS) installed on an expressway in Central Florida to explore how congestion impacts crash occurrence in urban areas. Multiple congestion measures from the two systems were developed. To ensure more precise estimates of congestion's effects, the traffic data were aggregated into peak and non-peak hours. Multicollinearity among traffic parameters was examined. The results showed the presence of multicollinearity, especially during peak hours. In response, ridge regression was introduced to cope with this issue. Poisson models with uncorrelated random effects, correlated random effects, and both correlated random effects and random parameters were constructed within the Bayesian framework. It was shown that correlated random effects could significantly enhance model performance. The random parameters model has similar goodness-of-fit compared with the model with only correlated random effects. However, by accounting for the unobserved heterogeneity, more variables were found to be significantly related to crash frequency. The models indicated that congestion increased crash frequency during peak hours, while during non-peak hours it was not a major crash contributing factor. Using the random parameter model, the three congestion measures were compared. It was found that all congestion indicators had similar effects, while the Congestion Index (CI) derived from MVDS data was a better congestion indicator for safety analysis. Also, analyses showed that segments with higher congestion intensity could increase not only property damage only (PDO) crashes, but also more severe crashes. In addition, the issues regarding the necessity to incorporate specific congestion indicator for congestion's effects on safety and to take care of the
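
    Ridge regression itself has a closed form, equivalent to placing a Gaussian prior on the coefficients; the following is a minimal sketch of that building block on synthetic collinear data, separate from the study's random-effects Poisson models.

      import numpy as np

      rng = np.random.default_rng(5)
      n, p = 200, 5
      X = rng.standard_normal((n, p))
      X[:, 1] = X[:, 0] + 0.05 * rng.standard_normal(n)   # induce multicollinearity
      y = X @ np.array([1.0, 1.0, 0.5, 0.0, 0.0]) + rng.standard_normal(n)

      def ridge(X, y, k):
          # Posterior mode under beta ~ N(0, (sigma^2/k) I): (X'X + kI)^{-1} X'y
          return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

      print("OLS  :", np.round(ridge(X, y, 0.0), 2))    # unstable under collinearity
      print("ridge:", np.round(ridge(X, y, 10.0), 2))   # shrunken and stabilised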

  11. Multi-site identification of a distributed hydrological nitrogen model using Bayesian uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Jiang, Sanyuan; Jomaa, Seifeddine; Büttner, Olaf; Meon, Günter; Rode, Michael

    2015-10-01

    For capturing spatial variations of runoff and nutrient fluxes attributed to catchment heterogeneity, multi-site hydrological water quality monitoring strategies are increasingly put into practice. This study aimed to investigate the impacts of spatially distributed streamflow and streamwater Inorganic Nitrogen (IN) concentration observations on the identification of a continuous-time, spatially semi-distributed and process-based hydrological water quality model, HYPE (HYdrological Predictions for the Environment). A Bayesian inference based approach, DREAM(ZS) (DiffeRential Evolution Adaptive Metropolis algorithm), was combined with HYPE to implement model optimisation and uncertainty analysis of streamflow and streamwater IN concentration simulations at a nested meso scale catchment in central Germany. To this end, a 10-year period (1994-1999 for calibration and 1999-2004 for validation) was utilised. We compared the parameters' posterior distributions, modelling performance using the best estimated parameter set, and 95% prediction confidence intervals at the catchment outlet for the calibration period that were derived from single-site calibration (SSC) and multi-site calibration (MSC) modes. For SSC, streamflow and streamwater IN concentration observations at only the catchment outlet were used, while for MSC, observations from both the catchment outlet and two internal sites were considered. Results showed that the uncertainty intervals of the hydrological water quality parameters' posterior distributions estimated from MSC were narrower than those obtained from SSC. In addition, it was found that MSC outperformed SSC on streamwater IN concentration simulations at internal sites for both calibration and validation periods, while the influence on streamflow modelling performance was small. This can be explained by the "nested" nature of the catchment and the high correlation between discharge observations from different sites

  12. Spatial Intensity Duration Frequency Relationships Using Hierarchical Bayesian Analysis for Urban Areas

    NASA Astrophysics Data System (ADS)

    Rupa, Chandra; Mujumdar, Pradeep

    2016-04-01

    In urban areas, quantification of extreme precipitation is important in the design of storm water drains and other infrastructure. Intensity Duration Frequency (IDF) relationships are generally used to obtain the design return level for a given duration and return period. Due to the lack of availability of extreme precipitation data for a sufficiently large number of years, estimating the probability of extreme events is difficult. Typically, data from a single station are used to obtain the design return levels for various durations and return periods, which are used in the design of urban infrastructure for the entire city. In an urban setting, the spatial variation of precipitation can be high; the precipitation amounts and patterns often vary within short distances of less than 5 km. Therefore, it is crucial to study the uncertainties in the spatial variation of return levels for various durations. In this work, the extreme precipitation is modeled spatially using Bayesian hierarchical analysis and the spatial variation of return levels is studied. The analysis is carried out with the Block Maxima approach for defining the extreme precipitation, using the Generalized Extreme Value (GEV) distribution, for Bangalore city, Karnataka state, India. Daily data for nineteen stations in and around Bangalore city are considered in the study. The analysis is carried out for summer maxima (March - May), monsoon maxima (June - September) and the annual maxima rainfall. In the hierarchical analysis, the statistical model is specified in three layers. The data layer models the block maxima, pooling the extreme precipitation from all the stations. In the process layer, the latent spatial process characterized by geographical and climatological covariates (lat-lon, elevation, mean temperature, etc.) which drives the extreme precipitation is modeled, and in the prior layer, the prior distributions that govern the latent process are modeled. Markov Chain Monte Carlo (MCMC) algorithm (Metropolis Hastings
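
    The single-station building block of such an analysis, fitting a GEV to block maxima and reading off return levels, can be sketched as follows on synthetic annual maxima; the hierarchical spatial pooling is omitted, and note that scipy's shape parameter c corresponds to -xi in the usual GEV convention.

      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(6)
      # Synthetic annual maximum rainfall (mm); scipy's shape c = -xi
      annual_max = genextreme.rvs(c=-0.1, loc=60.0, scale=15.0, size=50, random_state=rng)

      c, loc, scale = genextreme.fit(annual_max)
      for T in (2, 10, 50, 100):
          # T-year return level = quantile with non-exceedance probability 1 - 1/T
          z = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
          print(f"{T:>3}-year return level: {z:6.1f} mm")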

  13. Bayesian Analysis of Step-Stress Accelerated Life Test with Exponential Distribution

    SciTech Connect

    Lee, J.; Pan, R.

    2012-04-01

    In this article, we propose a general Bayesian inference approach to the step-stress accelerated life test with type II censoring. We assume that the failure times at each stress level are exponentially distributed and the test units are tested in an increasing order of stress levels. We formulate the prior distribution of the parameters of life-stress function and integrate the engineering knowledge of product failure rate and acceleration factor into the prior. The posterior distribution and the point estimates for the parameters of interest are provided. Through the Markov chain Monte Carlo technique, we demonstrate a nonconjugate prior case using an industrial example. It is shown that with the Bayesian approach, the statistical precision of parameter estimation is improved and, consequently, the required number of failures could be reduced.
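
    For exponential lifetimes the gamma prior is conjugate, which gives the simplest version of this machinery; the following sketch treats a single stress level under type II censoring, with the life-stress link across stress levels omitted and all numbers assumed.

      import numpy as np
      from scipy.stats import gamma

      rng = np.random.default_rng(7)
      lam_true, n, r = 0.02, 20, 12              # type II: test stops at the r-th failure
      t = np.sort(rng.exponential(1 / lam_true, n))
      ttt = t[:r].sum() + (n - r) * t[r - 1]     # total time on test with censoring

      a0, b0 = 2.0, 100.0                        # Gamma(shape a0, rate b0) prior on the rate
      post = gamma(a=a0 + r, scale=1 / (b0 + ttt))   # conjugate posterior for the failure rate
      print(f"posterior mean rate: {post.mean():.4f}")
      print(f"95% credible interval: ({post.ppf(0.025):.4f}, {post.ppf(0.975):.4f})")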

  14. Bayesian Analysis of the Power Spectrum of the Cosmic Microwave Background

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey B.; Eriksen, H. K.; O'Dwyer, I. J.; Wandelt, B. D.

    2005-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background. The sky, when viewed in the microwave, is very uniform, with a nearly perfect blackbody spectrum at 2.7 K. Very small amplitude brightness fluctuations (to one part in a million!) trace small density perturbations in the early universe (roughly 300,000 years after the Big Bang), which later grow through gravitational instability to the large-scale structure seen in redshift surveys. In this talk, I will discuss a Bayesian formulation of this problem, a Gibbs sampling approach to numerically sampling from the Bayesian posterior, and the application of this approach to the first-year data from the Wilkinson Microwave Anisotropy Probe. I will also comment on recent algorithmic developments for this approach to be tractable for the even more massive data set to be returned from the Planck satellite.

  15. Bayesian Sparse Regression Analysis Documents the Diversity of Spinal Inhibitory Interneurons.

    PubMed

    Gabitto, Mariano I; Pakman, Ari; Bikoff, Jay B; Abbott, L F; Jessell, Thomas M; Paninski, Liam

    2016-03-24

    Documenting the extent of cellular diversity is a critical step in defining the functional organization of tissues and organs. To infer cell-type diversity from partial or incomplete transcription factor expression data, we devised a sparse Bayesian framework that is able to handle estimation uncertainty and can incorporate diverse cellular characteristics to optimize experimental design. Focusing on spinal V1 inhibitory interneurons, for which the spatial expression of 19 transcription factors has been mapped, we infer the existence of ~50 candidate V1 neuronal types, many of which localize in compact spatial domains in the ventral spinal cord. We have validated the existence of inferred cell types by direct experimental measurement, establishing this Bayesian framework as an effective platform for cell-type characterization in the nervous system and elsewhere. PMID:26949187

  16. Analysis of a Preloaded Bolted Joint in a Ceramic Composite Combustor

    NASA Technical Reports Server (NTRS)

    Hissam, D. Andy; Bower, Mark V.

    2003-01-01

    This paper presents the detailed analysis of a preloaded bolted joint incorporating ceramic materials. The objective of this analysis is to determine the suitability of a joint design for a ceramic combustor. The analysis addresses critical factors in bolted joint design including preload, preload uncertainty, and load factor. The relationship between key joint variables is also investigated. The analysis is based on four key design criteria, each addressing an anticipated failure mode. The criteria are defined in terms of margin of safety, which must be greater than zero for the design criteria to be satisfied. Since the proposed joint has positive margins of safety, the design criteria are satisfied. Therefore, the joint design is acceptable.

  17. Decision-theoretic analysis of forensic sampling criteria using bayesian decision networks.

    PubMed

    Biedermann, A; Bozza, S; Garbolino, P; Taroni, F

    2012-11-30

    Sampling issues represent a topic of ongoing interest to the forensic science community, essentially because of their crucial role in laboratory planning and working protocols. For this purpose, the forensic literature has described thorough (bayesian) probabilistic sampling approaches. These are now widely implemented in practice. They allow one, for instance, to obtain probability statements that parameters of interest (e.g., the proportion of a seizure of items that present particular features, such as an illegal substance) satisfy particular criteria (e.g., a threshold or an otherwise limiting value). Currently, there are many approaches that allow one to derive probability statements relating to a population proportion, but questions on how a forensic decision maker--typically a client of a forensic examination or a scientist acting on behalf of a client--ought actually to decide about a proportion or a sample size have remained largely unexplored to date. The research presented here addresses methodology from decision theory that may help to cope usefully with the wide range of sampling issues typically encountered in forensic science applications. The procedures explored in this paper enable scientists to address a variety of concepts such as the (net) value of sample information, the (expected) value of sample information or the (expected) decision loss. All of these aspects directly relate to questions that are regularly encountered in casework. Besides probability theory and bayesian inference, the proposed approach requires some additional elements from decision theory that may increase the effort needed for practical implementation. In view of this challenge, the present paper emphasises the merits of graphical modelling concepts, such as decision trees and bayesian decision networks. These can support forensic scientists in applying the methodology in practice. How this may be achieved is illustrated with several examples. The graphical devices invoked

  18. Joint Modeling Compliance and Outcome for Causal Analysis in Longitudinal Studies

    PubMed Central

    Gao, Xin; Brown, Gregory K.; Elliott, Michael R.

    2013-01-01

    This article discusses joint modeling of compliance and outcome for longitudinal studies when noncompliance is present. We focus on two-arm randomized longitudinal studies in which subjects are randomized at baseline, treatment is applied repeatedly over time, and compliance behaviors and clinical outcomes are measured and recorded repeatedly over time. In the proposed Markov compliance and outcome model, we use the potential outcome framework to define pre-randomization principal strata from the joint distribution of compliance under treatment and control arms, and estimate the effect of treatment within each principal strata. Besides the causal effect of the treatment, our proposed model can estimate the impact of the causal effect of the treatment at a given time on the future compliance. Bayesian methods are used to estimate the parameters. The results are illustrated using a study assessing the effect of cognitive behavior therapy on depression. A simulation study is used to assess the repeated sampling properties of the proposed model. PMID:23576159

  19. Bayesian analysis of the astrobiological implications of life's early emergence on Earth

    NASA Astrophysics Data System (ADS)

    Spiegel, David S.; Turner, Edwin L.

    2012-01-01

    Life arose on Earth sometime in the first few hundred million years after the young planet had cooled to the point that it could support water-based organisms on its surface. The early emergence of life on Earth has been taken as evidence that the probability of abiogenesis is high, if starting from young Earth-like conditions. We revisit this argument quantitatively in a Bayesian statistical framework. By constructing a simple model of the probability of abiogenesis, we calculate a Bayesian estimate of its posterior probability, given the data that life emerged fairly early in Earth’s history and that, billions of years later, curious creatures noted this fact and considered its implications. We find that, given only this very limited empirical information, the choice of Bayesian prior for the abiogenesis probability parameter has a dominant influence on the computed posterior probability. Although terrestrial life's early emergence provides evidence that life might be abundant in the universe if early-Earth-like conditions are common, the evidence is inconclusive and indeed is consistent with an arbitrarily low intrinsic probability of abiogenesis for plausible uninformative priors. Finding a single case of life arising independently of our lineage (on Earth, elsewhere in the solar system, or on an extrasolar planet) would provide much stronger evidence that abiogenesis is not extremely rare in the universe.
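
    The prior-sensitivity point can be reproduced in a few lines: with P(life by t | lambda) = 1 - exp(-lambda t) as the likelihood, a prior uniform in lambda and one uniform in log lambda put very different posterior weight on small rates. The time window and rate grid below are illustrative assumptions, not the paper's calibrated values.

      import numpy as np

      t_obs = 0.4                      # Gyr available before life emerged (illustrative)
      lam = np.logspace(-6, 3, 4000)   # candidate abiogenesis rates, per Gyr
      lik = 1 - np.exp(-lam * t_obs)   # P(life emerges by t_obs | lambda)

      for name, prior in [("uniform in lambda", np.ones_like(lam)),
                          ("uniform in log lambda", 1 / lam)]:
          post = prior * lik
          post /= np.trapz(post, lam)
          p_small = np.trapz(post[lam < 1], lam[lam < 1])   # P(rate < 1 per Gyr)
          print(f"{name:>22}: P(rate < 1/Gyr) = {p_small:.4f}")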

  1. Bayesian analysis of the astrobiological implications of life's early emergence on Earth.

    PubMed

    Spiegel, David S; Turner, Edwin L

    2012-01-10

    Life arose on Earth sometime in the first few hundred million years after the young planet had cooled to the point that it could support water-based organisms on its surface. The early emergence of life on Earth has been taken as evidence that the probability of abiogenesis is high, if starting from young Earth-like conditions. We revisit this argument quantitatively in a bayesian statistical framework. By constructing a simple model of the probability of abiogenesis, we calculate a bayesian estimate of its posterior probability, given the data that life emerged fairly early in Earth's history and that, billions of years later, curious creatures noted this fact and considered its implications. We find that, given only this very limited empirical information, the choice of bayesian prior for the abiogenesis probability parameter has a dominant influence on the computed posterior probability. Although terrestrial life's early emergence provides evidence that life might be abundant in the universe if early-Earth-like conditions are common, the evidence is inconclusive and indeed is consistent with an arbitrarily low intrinsic probability of abiogenesis for plausible uninformative priors. Finding a single case of life arising independently of our lineage (on Earth, elsewhere in the solar system, or on an extrasolar planet) would provide much stronger evidence that abiogenesis is not extremely rare in the universe. PMID:22198766

  2. Bayesian analysis of the astrobiological implications of life’s early emergence on Earth

    PubMed Central

    Spiegel, David S.; Turner, Edwin L.

    2012-01-01

    Life arose on Earth sometime in the first few hundred million years after the young planet had cooled to the point that it could support water-based organisms on its surface. The early emergence of life on Earth has been taken as evidence that the probability of abiogenesis is high, if starting from young Earth-like conditions. We revisit this argument quantitatively in a Bayesian statistical framework. By constructing a simple model of the probability of abiogenesis, we calculate a Bayesian estimate of its posterior probability, given the data that life emerged fairly early in Earth’s history and that, billions of years later, curious creatures noted this fact and considered its implications. We find that, given only this very limited empirical information, the choice of Bayesian prior for the abiogenesis probability parameter has a dominant influence on the computed posterior probability. Although terrestrial life's early emergence provides evidence that life might be abundant in the universe if early-Earth-like conditions are common, the evidence is inconclusive and indeed is consistent with an arbitrarily low intrinsic probability of abiogenesis for plausible uninformative priors. Finding a single case of life arising independently of our lineage (on Earth, elsewhere in the solar system, or on an extrasolar planet) would provide much stronger evidence that abiogenesis is not extremely rare in the universe. PMID:22198766

  3. An inelastic analysis of a welded aluminum joint

    NASA Technical Reports Server (NTRS)

    Vaughan, R. E.

    1994-01-01

    Butt-weld joints are most commonly designed into pressure vessels, which then become as reliable as the weakest increment in the weld chain. In practice, weld material properties are determined from tensile test specimens and provided to the stress analyst in the form of a stress versus strain diagram. Variations in properties through the thickness of the weld and along the width of the weld have been suspected but not explored because of inaccessibility and cost. The purpose of this study is to investigate analytical and computational methods used for the analysis of welds. The weld specimens are analyzed using classical elastic and plastic theory to provide a basis for modeling the inelastic properties in a finite-element solution. The results of the analysis are compared to experimental data to determine the weld behavior and the accuracy of prediction methods. The weld considered in this study is a multiple-pass aluminum 2219-T87 butt weld with a thickness of 1.40 in. The weld specimen is modeled using the finite-element code ABAQUS. The finite-element model is used to produce the stress-strain behavior in the elastic and plastic regimes and to determine Poisson's ratio in the plastic region. The value of Poisson's ratio in the plastic regime is then compared to experimental data. The results of the comparisons are used to explain multipass weld behavior and to make recommendations concerning the analysis and testing of welds.

  4. Inelastic Strain Analysis of Solder Joint in NASA Fatigue Specimen

    NASA Technical Reports Server (NTRS)

    Dasgupta, Abhijit; Oyan, Chen

    1991-01-01

    The solder fatigue specimen designed by NASA-GSFC/UNISYS is analyzed in order to obtain the inelastic strain history during two different representative temperature cycles specified by UNISYS. In previous reports (dated July 25, 1990, and November 15, 1990), results of the elastic-plastic and creep analyses for the delta T = 31 C cycle were presented, respectively. Results obtained during the current phase, from viscoplastic finite element analysis of the solder fatigue specimen for the delta T = 113 C cycle, are summarized here. Some common information is repeated to keep the report self-contained. Large-deformation continuum formulations in conjunction with a standard linear solid model are utilized for modeling the solder constitutive creep-plasticity behavior. Relevant material properties are obtained from the literature. Strain amplitudes, mean strains, and residual strains (as well as stresses) accumulated due to a representative complete temperature cycle are obtained as a result of this analysis. The partitioning between elastic strains, time-independent inelastic (plastic) strains, and time-dependent inelastic (creep) strains is also explicitly obtained for two representative cycles. Detailed plots are presented for two representative temperature cycles. This information forms an important input for fatigue damage models when predicting the fatigue life of solder joints under thermal cycling.

  5. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained and demonstrated on the NASA Langley Uncertainty Quantification Challenge problem.
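
    A minimal sketch of the sequential idea, with an invented four-variable toy model standing in for the GTM and interval halving standing in for the Bayesian refinement step (the paper refines with subsystem-level data instead):

        import numpy as np

        rng = np.random.default_rng(0)

        def performance(x):
            # Hypothetical stand-in for the system-level performance model.
            return x[:, 0] ** 2 + 0.5 * x[:, 1] + 0.1 * x[:, 2] * x[:, 3]

        intervals = [(-1.0, 1.0)] * 4   # epistemic variables as intervals

        def first_order_indices(intervals, n=4000, grid=25):
            # Crude first-order sensitivity estimate: Var(E[Y | X_i]) / Var(Y).
            X = np.column_stack([rng.uniform(lo, hi, n) for lo, hi in intervals])
            total = performance(X).var()
            out = []
            for i, (lo, hi) in enumerate(intervals):
                means = []
                for g in np.linspace(lo, hi, grid):
                    Xg = X.copy()
                    Xg[:, i] = g
                    means.append(performance(Xg).mean())
                out.append(np.var(means) / total)
            return np.array(out)

        for step in range(4):
            s = first_order_indices(intervals)
            top = int(np.argmax(s))
            lo, hi = intervals[top]
            mid, w = 0.5 * (lo + hi), 0.25 * (hi - lo)
            # Idealized "refinement": halve the chosen interval; in the paper
            # this step is Bayesian inference on new subsystem-level data.
            intervals[top] = (mid - w, mid + w)
            print(f"step {step}: refine x{top}, indices = {np.round(s, 3)}")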

  6. Bayesian Attractor Learning

    NASA Astrophysics Data System (ADS)

    Wiegerinck, Wim; Schoenaker, Christiaan; Duane, Gregory

    2016-04-01

    Recently, methods for model fusion by dynamically combining model components in an interactive ensemble have been proposed. In these proposals, fusion parameters have to be learned from data. One can view these systems as parametrized dynamical systems. We address the question of learnability of dynamical systems with respect to both short term (vector field) and long term (attractor) behavior. In particular we are interested in learning in the imperfect model class setting, in which the ground truth has a higher complexity than the models, e.g. due to unresolved scales. We take a Bayesian point of view and define a joint log-likelihood that consists of two terms: one is the vector field error, and the other is the attractor error, for which we take the L1 distance between the stationary distributions of the model and the assumed ground truth. In the context of linear models (like so-called weighted supermodels), and assuming a Gaussian error model in the vector fields, vector field learning leads to a tractable Gaussian solution. This solution can then be used as a prior for the next step, Bayesian attractor learning, in which the attractor error is used as a log-likelihood term. Bayesian attractor learning is implemented by elliptical slice sampling, a sampling method for systems with a Gaussian prior and a non-Gaussian likelihood. Simulations with a partially observed driven Lorenz 63 system illustrate the approach.
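
    Elliptical slice sampling itself is compact enough to show in full. A minimal sketch under the stated setting, a Gaussian prior (here the vector-field posterior would play that role) and an arbitrary log-likelihood (here, minus the attractor error); the toy usage at the bottom is mine:

        import numpy as np

        def elliptical_slice(f, chol_sigma, log_lik, rng):
            # One elliptical slice sampling update (Murray, Adams & MacKay, 2010)
            # for a zero-mean Gaussian prior with Cholesky factor chol_sigma.
            nu = chol_sigma @ rng.normal(size=f.shape)   # auxiliary prior draw
            log_y = log_lik(f) + np.log(rng.random())    # slice level
            theta = rng.uniform(0.0, 2.0 * np.pi)
            lo, hi = theta - 2.0 * np.pi, theta
            while True:
                f_new = f * np.cos(theta) + nu * np.sin(theta)
                if log_lik(f_new) > log_y:
                    return f_new                         # accepted point on the ellipse
                if theta < 0.0:                          # otherwise shrink the bracket
                    lo = theta
                else:
                    hi = theta
                theta = rng.uniform(lo, hi)

        # Toy use: N(0, I) prior with a likelihood pulling the state toward 3.
        rng = np.random.default_rng(0)
        f = np.zeros(2)
        for _ in range(1000):
            f = elliptical_slice(f, np.eye(2),
                                 lambda v: -0.5 * np.sum((v - 3.0) ** 2), rng)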

  7. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    NASA Astrophysics Data System (ADS)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

    A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while keeping most of the variance of the dataset explained. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies: PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. Recovering and separating the different sources that generate the observed ground deformation is a fundamental task in order to provide a physical meaning to the possible different sources. PCA fails in the BSS problem because it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelatedness condition is not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources.

  8. Development of a fingerprint reduction approach for Bayesian similarity searching based on Kullback-Leibler divergence analysis.

    PubMed

    Nisius, Britta; Vogt, Martin; Bajorath, Jürgen

    2009-06-01

    The contribution of individual fingerprint bit positions to similarity search performance is systematically evaluated. A method is introduced to determine bit significance on the basis of Kullback-Leibler divergence analysis of bit distributions in active and database compounds. Bit divergence analysis and Bayesian compound screening share a common methodological foundation. Hence, given the significance ranking of all individual bit positions comprising a fingerprint, subsets of bits are evaluated in the context of Bayesian screening, and minimal fingerprint representations are determined that meet or exceed the search performance of unmodified fingerprints. For fingerprints of different design evaluated on many compound activity classes, we consistently find that small subsets of fingerprint bit positions are responsible for search performance; in some cases these subsets contain only a few fingerprint bit positions. Structural or pharmacophore patterns captured by preferred bit positions can often be directly associated with characteristic features of active compounds. In some cases, reduced fingerprint representations clearly exceed the search performance of the original fingerprints. Thus, fingerprint reduction likely represents a promising approach for practical applications.
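
    The bit-significance step can be sketched directly: treat each bit as a Bernoulli variable, estimate its frequency in active versus database compounds, and rank bits by the Kullback-Leibler divergence between the two distributions. A minimal sketch with synthetic fingerprints (my own function names; not the authors' implementation):

        import numpy as np

        def rank_bits_by_kl(active_fps, database_fps, eps=1e-6):
            # Per-bit Bernoulli frequencies in active vs. database compounds.
            p = np.clip(active_fps.mean(axis=0), eps, 1 - eps)
            q = np.clip(database_fps.mean(axis=0), eps, 1 - eps)
            # KL divergence between the two Bernoulli distributions, per bit.
            kl = p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))
            return np.argsort(kl)[::-1]   # most significant bit positions first

        # Toy demo: bit 7 is enriched in actives, so it should rank first.
        rng = np.random.default_rng(0)
        db = (rng.random((5000, 64)) < 0.2).astype(int)
        act = (rng.random((100, 64)) < 0.2).astype(int)
        act[:, 7] = (rng.random(100) < 0.9).astype(int)
        print(rank_bits_by_kl(act, db)[:5])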

  9. Order-Constrained Reference Priors with Implications for Bayesian Isotonic Regression, Analysis of Covariance and Spatial Models

    NASA Astrophysics Data System (ADS)

    Gong, Maozhen

    Selecting an appropriate prior distribution is a fundamental issue in Bayesian statistics. In this dissertation, under the framework provided by Berger and Bernardo, I derive the reference priors for several models, including Analysis of Variance (ANOVA)/Analysis of Covariance (ANCOVA) models with a categorical variable under common ordering constraints, and the conditionally autoregressive (CAR) and simultaneous autoregressive (SAR) models with a spatial autoregression parameter rho. The performances of the reference priors for the ANOVA/ANCOVA models are evaluated by simulation studies with comparisons to Jeffreys' prior and Least Squares Estimation (LSE). The priors are then illustrated in a Bayesian model of the "Risk of Type 2 Diabetes in New Mexico" data, where the relationship between type 2 diabetes risk (through Hemoglobin A1c) and different smoking levels is investigated. In both the simulation studies and the real data modeling, the reference priors that incorporate internal order information show good performance and can be used as default priors. The reference priors for the CAR and SAR models are also illustrated in the "1999 SAT State Average Verbal Scores" data with a comparison to a uniform prior distribution. Due to the complexity of the reference priors for both CAR and SAR models, only a portion (12 states in the Midwest) of the original data set is considered. The reference priors can give a different marginal posterior distribution compared to a uniform prior, which provides an alternative for prior specification for areal data in spatial statistics.

  10. Japanese Encephalitis Risk and Contextual Risk Factors in Southwest China: A Bayesian Hierarchical Spatial and Spatiotemporal Analysis

    PubMed Central

    Zhao, Xing; Cao, Mingqin; Feng, Hai-Huan; Fan, Heng; Chen, Fei; Feng, Zijian; Li, Xiaosong; Zhou, Xiao-Hua

    2014-01-01

    It is valuable to study the spatiotemporal pattern of Japanese encephalitis (JE) and its association with contextual risk factors in southwest China, which is the most endemic area in China. Using data from 2004 to 2009, we applied GIS mapping and spatial autocorrelation analysis to analyze reported incidence data of JE in 438 counties in southwest China, finding that JE cases were not randomly distributed, and a Bayesian hierarchical spatiotemporal model identified the east part of southwest China as a high risk area. Meanwhile, the Bayesian hierarchical spatial model in 2006 demonstrated a statistically significant association between JE and the agricultural and climatic variables, including the proportion of rural population, the pig-to-human ratio, the monthly precipitation and the monthly mean minimum and maximum temperatures. Particular emphasis was placed on the time-lagged effect for climatic factors. The regression method and the Spearman correlation analysis both identified a two-month lag for precipitation, while the regression method found a one-month lag for temperature. The results show that the high risk area in the east part of southwest China may be connected to the agricultural and climatic factors. The routine surveillance and the allocation of health resources should be given more attention in this area. Moreover, the meteorological variables might be considered as possible predictors of JE in southwest China. PMID:24739769

  11. Pathological Knee Joint Motion Analysis By High Speed Cinephotography

    NASA Astrophysics Data System (ADS)

    Baumann, Jurg U.

    1985-02-01

    The use of cinephotography for evaluation of disturbed knee joint function was compared in three groups of patients. While a sampling rate of 50 images per second was adequate for patients with neuromuscular disorders, a higher frequency of around 300 i.p.s. is necessary in osteoarthritis and ligamentous knee joint injuries, but the task of digitizing is prohibitive unless automated.

  12. Scarf Joints of Composite Materials: Testing and Analysis

    NASA Astrophysics Data System (ADS)

    Kwon, Y. W.; Marrón, A.

    2009-12-01

    The objective of this study is to develop a reliable computational model in order to investigate the joint strengths of scarf joint configurations constructed from carbon-fiber and glass-fiber woven fabric laminates with different material combinations (glass/glass, glass/carbon, carbon/glass, and carbon/carbon) under various loading conditions such as axial, bending moment, and shear loading. Both experimental and computational studies are conducted. For the experimental study, specimens made of hybrid scarf joints using carbon-fiber and glass-fiber woven fabrics are tested under compressive loadings to determine their joint failure strengths. Computational models are then developed using the discrete resin layer model along with fracture mechanics and virtual crack closure techniques. The numerical models are validated against the experimental data. The validated models are used to predict the joint strengths under different loading conditions such as axial, shear, and bending moment loadings.

  13. Bayesian network learning for natural hazard assessments

    NASA Astrophysics Data System (ADS)

    Vogel, Kristin

    2016-04-01

    Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature as well as lacking knowledge about their driving forces and potential effects make their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify the uncertainties involved, but also to express and communicate them in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise return to familiar (mostly deterministic) procedures. In the scope of the DFG research training group "NatRiskChange" we apply the probabilistic framework of Bayesian networks for diverse natural hazard and vulnerability studies. The great potential of Bayesian networks was already shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables. Hence, each conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely data-driven or be given by experts; even a combination of both is possible. By translating the (in-)dependences into a graph structure, Bayesian networks provide direct insights into the workings of the system and allow one to learn about the underlying processes. Besides numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables
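
    A minimal sketch of the basic machinery, a three-node discrete network with invented conditional probability tables (a real study would learn these from data or experts), showing how a conditional of interest such as the effect of precautionary measures on damage is read off the joint distribution:

        import numpy as np

        # Invented toy tables: Hazard -> Damage <- Precaution.
        p_h = np.array([0.7, 0.3])                  # P(Hazard = no, yes)
        p_p = np.array([0.5, 0.5])                  # P(Precaution = no, yes)
        p_d = np.empty((2, 2, 2))                   # P(Damage | Hazard, Precaution)
        p_d[0] = [[0.99, 0.01], [0.995, 0.005]]     # no hazard
        p_d[1] = [[0.30, 0.70], [0.70, 0.30]]       # hazard: precaution helps

        joint = p_h[:, None, None] * p_p[None, :, None] * p_d   # P(H, P, D)

        # Effect of precautionary measures on damage, given a hazard occurs:
        p_damage = joint[1, :, 1] / joint[1].sum(axis=1)
        print("P(damage | hazard, precaution = no/yes):", np.round(p_damage, 3))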

  14. A microstructural analysis of solder joints from the electronic assemblies of dismantled nuclear weapons

    SciTech Connect

    Vianco, P.T.; Rejent, J.A.

    1997-05-01

    MC1814 Interconnection Boxes from dismantled B57 bombs, and MC2839 Firing Sets from retired W70-1 warheads, were obtained from the Pantex facility. Printed circuit boards were selected from these components for microstructural analysis of their solder joints. The analysis included a qualitative examination of the solder joints and quantitative assessments of (1) the thickness of the intermetallic compound layer that formed between the solder and circuit board Cu features, and (2) the Pb-rich phase particle distribution within the solder joint microstructure. The MC2839 solder joints had very good workmanship qualities. The intermetallic compound layer stoichiometry was determined to be that of Cu6Sn5. The mean intermetallic compound layer thickness for all solder joints was 0.885 µm. The magnitude of these values did not indicate significant growth over the weapon lifetime. The size distributions of the Pb-rich phase particles for the joints had a mean area of 9.85 × 10^-6 mm^2; assuming a spherical geometry, the mean particle diameter would be 3.54 µm. The joint-to-joint difference in intermetallic compound layer thickness and Pb-rich particle size distribution was not caused by varying thermal environments but rather was a result of natural variations in the joint microstructure that probably existed at the time of manufacture. The microstructural evaluation of the through-hole solder joints from the MC2839 and MC1814 components indicated that the environmental conditions to which these electronic units were exposed in the stockpile were benign regarding solder joint aging. There was an absence of thermal fatigue damage in the MC2839 circuit board through-hole solder joints. The damage to the eyelet solder joints of the MC1814 more likely represented infant mortality failures at or very near the time of manufacture, resulting from the marginal design of this type of solder joint.

  15. Bayesian learning

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1989-01-01

    In 1983 and 1984, the Infrared Astronomical Satellite (IRAS) detected 5,425 stellar objects and measured their infrared spectra. In 1987 a program called AUTOCLASS used Bayesian inference methods to discover the classes present in these data and determine the most probable class of each object, revealing unknown phenomena in astronomy. AUTOCLASS has rekindled the old debate on the suitability of Bayesian methods, which are computationally intensive, interpret probabilities as plausibility measures rather than frequencies, and appear to depend on a subjective assessment of the probability of a hypothesis before the data were collected. Modern statistical methods have, however, recently been shown to also depend on subjective elements. These debates bring into question the whole tradition of scientific objectivity and offer scientists a new way to take responsibility for their findings and conclusions.

  16. Neural network modeling and an uncertainty analysis in Bayesian framework: A case study from the KTB borehole site

    NASA Astrophysics Data System (ADS)

    Maiti, Saumen; Tiwari, Ram Krishna

    2010-10-01

    A new probabilistic approach based on the concept of Bayesian neural network (BNN) learning theory is proposed for decoding litho-facies boundaries from well-log data. We show how a multi-layer perceptron neural network model can be employed in a Bayesian framework to classify changes in litho-log successions. The method is then applied to the German Continental Deep Drilling Program (KTB) well-log data for classification and uncertainty estimation in the litho-facies boundaries. In this framework, the a posteriori distribution of the network parameters is estimated via the principle of Bayesian probabilistic theory, and an objective function is minimized following the scaled conjugate gradient optimization scheme. For the model development, we impose a suitable criterion, which provides probabilistic information by emulating different combinations of synthetic data. Uncertainty in the relationship between the data and the model space is appropriately taken care of by assuming a Gaussian a priori distribution of network parameters (e.g., synaptic weights and biases). Prior to applying the new method to the real KTB data, we tested the proposed method on synthetic examples to examine the sensitivity of neural network hyperparameters in prediction. Within this framework, we examine the stability and efficiency of this new probabilistic approach using different kinds of synthetic data with different levels of correlated noise. Our data analysis suggests that the designed network topology based on the Bayesian paradigm is stable up to nearly 40% correlated noise; however, adding more noise (˜50% or more) degrades the results. We perform uncertainty analyses on training, validation, and test data sets with and without intrinsic noise by making the Gaussian approximation of the a posteriori distribution about the peak model. We present a standard deviation error-map at the network output corresponding to the three types of the litho-facies present over the entire litho

  17. Population-level differences in disease transmission: A Bayesian analysis of multiple smallpox epidemics

    PubMed Central

    Elderd, Bret D.; Dwyer, Greg; Dukic, Vanja

    2013-01-01

    Estimates of a disease’s basic reproductive rate R0 play a central role in understanding outbreaks and planning intervention strategies. In many calculations of R0, a simplifying assumption is that different host populations have effectively identical transmission rates. This assumption can lead to an underestimate of the overall uncertainty associated with R0, which, due to the non-linearity of epidemic processes, may result in a mis-estimate of epidemic intensity and miscalculated expenditures associated with public-health interventions. In this paper, we utilize a Bayesian method for quantifying the overall uncertainty arising from differences in population-specific basic reproductive rates. Using this method, we fit spatial and non-spatial susceptible-exposed-infected-recovered (SEIR) models to a series of 13 smallpox outbreaks. Five outbreaks occurred in populations that had been previously exposed to smallpox, while the remaining eight occurred in Native-American populations that were naïve to the disease at the time. The Native-American outbreaks were close in a spatial and temporal sense. Using Bayesian Information Criterion (BIC), we show that the best model includes population-specific R0 values. These differences in R0 values may, in part, be due to differences in genetic background, social structure, or food and water availability. As a result of these inter-population differences, the overall uncertainty associated with the “population average” value of smallpox R0 is larger, a finding that can have important consequences for controlling epidemics. In general, Bayesian hierarchical models are able to properly account for the uncertainty associated with multiple epidemics, provide a clearer understanding of variability in epidemic dynamics, and yield a better assessment of the range of potential risks and consequences that decision makers face. PMID:24021521

  18. A Bayesian approach to the analysis of quantal bioassay studies using nonparametric mixture models.

    PubMed

    Fronczyk, Kassandra; Kottas, Athanasios

    2014-03-01

    We develop a Bayesian nonparametric mixture modeling framework for quantal bioassay settings. The approach is built upon modeling dose-dependent response distributions. We adopt a structured nonparametric prior mixture model, which induces a monotonicity restriction for the dose-response curve. Particular emphasis is placed on the key risk assessment goal of calibration for the dose level that corresponds to a specified response. The proposed methodology yields flexible inference for the dose-response relationship as well as for other inferential objectives, as illustrated with two data sets from the literature. PMID:24354490

  19. Joint Analysis of BICEP2/Keck Array and Planck Data.

    PubMed

    Ade, P A R; Aghanim, N; Ahmed, Z; Aikin, R W; Alexander, K D; Arnaud, M; Aumont, J; Baccigalupi, C; Banday, A J; Barkats, D; Barreiro, R B; Bartlett, J G; Bartolo, N; Battaner, E; Benabed, K; Benoît, A; Benoit-Lévy, A; Benton, S J; Bernard, J-P; Bersanelli, M; Bielewicz, P; Bischoff, C A; Bock, J J; Bonaldi, A; Bonavera, L; Bond, J R; Borrill, J; Bouchet, F R; Boulanger, F; Brevik, J A; Bucher, M; Buder, I; Bullock, E; Burigana, C; Butler, R C; Buza, V; Calabrese, E; Cardoso, J-F; Catalano, A; Challinor, A; Chary, R-R; Chiang, H C; Christensen, P R; Colombo, L P L; Combet, C; Connors, J; Couchot, F; Coulais, A; Crill, B P; Curto, A; Cuttaia, F; Danese, L; Davies, R D; Davis, R J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Delouis, J-M; Désert, F-X; Dickinson, C; Diego, J M; Dole, H; Donzelli, S; Doré, O; Douspis, M; Dowell, C D; Duband, L; Ducout, A; Dunkley, J; Dupac, X; Dvorkin, C; Efstathiou, G; Elsner, F; Enßlin, T A; Eriksen, H K; Falgarone, E; Filippini, J P; Finelli, F; Fliescher, S; Forni, O; Frailis, M; Fraisse, A A; Franceschi, E; Frejsel, A; Galeotta, S; Galli, S; Ganga, K; Ghosh, T; Giard, M; Gjerløw, E; Golwala, S R; González-Nuevo, J; Górski, K M; Gratton, S; Gregorio, A; Gruppuso, A; Gudmundsson, J E; Halpern, M; Hansen, F K; Hanson, D; Harrison, D L; Hasselfield, M; Helou, G; Henrot-Versillé, S; Herranz, D; Hildebrandt, S R; Hilton, G C; Hivon, E; Hobson, M; Holmes, W A; Hovest, W; Hristov, V V; Huffenberger, K M; Hui, H; Hurier, G; Irwin, K D; Jaffe, A H; Jaffe, T R; Jewell, J; Jones, W C; Juvela, M; Karakci, A; Karkare, K S; Kaufman, J P; Keating, B G; Kefeli, S; Keihänen, E; Kernasovskiy, S A; Keskitalo, R; Kisner, T S; Kneissl, R; Knoche, J; Knox, L; Kovac, J M; Krachmalnicoff, N; Kunz, M; Kuo, C L; Kurki-Suonio, H; Lagache, G; Lähteenmäki, A; Lamarre, J-M; Lasenby, A; Lattanzi, M; Lawrence, C R; Leitch, E M; Leonardi, R; Levrier, F; Lewis, A; Liguori, M; Lilje, P B; Linden-Vørnle, M; López-Caniego, M; Lubin, P M; Lueker, M; Macías-Pérez, J F; Maffei, B; Maino, D; Mandolesi, N; Mangilli, A; Maris, M; Martin, P G; Martínez-González, E; Masi, S; Mason, P; Matarrese, S; Megerian, K G; Meinhold, P R; Melchiorri, A; Mendes, L; Mennella, A; Migliaccio, M; Mitra, S; Miville-Deschênes, M-A; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Moss, A; Munshi, D; Murphy, J A; Naselsky, P; Nati, F; Natoli, P; Netterfield, C B; Nguyen, H T; Nørgaard-Nielsen, H U; Noviello, F; Novikov, D; Novikov, I; O'Brient, R; Ogburn, R W; Orlando, A; Pagano, L; Pajot, F; Paladini, R; Paoletti, D; Partridge, B; Pasian, F; Patanchon, G; Pearson, T J; Perdereau, O; Perotto, L; Pettorino, V; Piacentini, F; Piat, M; Pietrobon, D; Plaszczynski, S; Pointecouteau, E; Polenta, G; Ponthieu, N; Pratt, G W; Prunet, S; Pryke, C; Puget, J-L; Rachen, J P; Reach, W T; Rebolo, R; Reinecke, M; Remazeilles, M; Renault, C; Renzi, A; Richter, S; Ristorcelli, I; Rocha, G; Rossetti, M; Roudier, G; Rowan-Robinson, M; Rubiño-Martín, J A; Rusholme, B; Sandri, M; Santos, D; Savelainen, M; Savini, G; Schwarz, R; Scott, D; Seiffert, M D; Sheehy, C D; Spencer, L D; Staniszewski, Z K; Stolyarov, V; Sudiwala, R; Sunyaev, R; Sutton, D; Suur-Uski, A-S; Sygnet, J-F; Tauber, J A; Teply, G P; Terenzi, L; Thompson, K L; Toffolatti, L; Tolan, J E; Tomasi, M; Tristram, M; Tucci, M; Turner, A D; Valenziano, L; Valiviita, J; Van Tent, B; Vibert, L; Vielva, P; Vieregg, A G; Villa, F; Wade, L A; Wandelt, B D; Watson, R; Weber, A C; Wehus, I K; White, M; White, S D M; Willmert, J; Wong, C L; Yoon, K W; Yvon, D; Zacchei, A; Zonca, A

    2015-03-13

    We report the results of a joint analysis of data from BICEP2/Keck Array and Planck. BICEP2 and Keck Array have observed the same approximately 400  deg^{2} patch of sky centered on RA 0 h, Dec. -57.5°. The combined maps reach a depth of 57 nK deg in Stokes Q and U in a band centered at 150 GHz. Planck has observed the full sky in polarization at seven frequencies from 30 to 353 GHz, but much less deeply in any given region (1.2  μK deg in Q and U at 143 GHz). We detect 150×353 cross-correlation in B modes at high significance. We fit the single- and cross-frequency power spectra at frequencies ≥150  GHz to a lensed-ΛCDM model that includes dust and a possible contribution from inflationary gravitational waves (as parametrized by the tensor-to-scalar ratio r), using a prior on the frequency spectral behavior of polarized dust emission from previous Planck analysis of other regions of the sky. We find strong evidence for dust and no statistically significant evidence for tensor modes. We probe various model variations and extensions, including adding a synchrotron component in combination with lower frequency data, and find that these make little difference to the r constraint. Finally, we present an alternative analysis which is similar to a map-based cleaning of the dust contribution, and show that this gives similar constraints. The final result is expressed as a likelihood curve for r, and yields an upper limit r_{0.05}<0.12 at 95% confidence. Marginalizing over dust and r, lensing B modes are detected at 7.0σ significance. PMID:25815919
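
    To make the final step concrete: given a likelihood curve for r on a grid, the quoted upper limit is the 95th percentile of the normalized curve under a flat prior on r >= 0. A minimal sketch with an invented Gaussian-shaped curve standing in for the actual BICEP2/Keck/Planck likelihood:

        import numpy as np

        r = np.linspace(0.0, 0.5, 2001)
        loglike = -0.5 * ((r - 0.02) / 0.05) ** 2   # hypothetical stand-in curve
        like = np.exp(loglike - loglike.max())
        cdf = np.cumsum(like)                       # flat prior, uniform grid
        cdf /= cdf[-1]
        print(f"95% upper limit on r: {r[np.searchsorted(cdf, 0.95)]:.3f}")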

  1. Statistical analysis using the Bayesian nonparametric method for irradiation embrittlement of reactor pressure vessels

    NASA Astrophysics Data System (ADS)

    Takamizawa, Hisashi; Itoh, Hiroto; Nishiyama, Yutaka

    2016-10-01

    In order to understand neutron irradiation embrittlement in high fluence regions, statistical analysis using the Bayesian nonparametric (BNP) method was performed for the Japanese surveillance and material test reactor irradiation database. The BNP method is essentially expressed as an infinite summation of normal distributions, with input data being subdivided into clusters with identical statistical parameters, such as mean and standard deviation, for each cluster to estimate shifts in ductile-to-brittle transition temperature (DBTT). The clusters typically depend on chemical compositions, irradiation conditions, and the irradiation embrittlement. Specific variables contributing to the irradiation embrittlement include the content of Cu, Ni, P, Si, and Mn in the pressure vessel steels, neutron flux, neutron fluence, and irradiation temperatures. It was found that the measured shifts of DBTT correlated well with the calculated ones. Data associated with the same materials were subdivided into the same clusters even when neutron fluences increased. Comparing cluster IDs 2 and 6, embrittlement of high-Cu-bearing materials (>0.07 wt.%) was larger than that of low-Cu-bearing materials (<0.07 wt.%). This is attributed to irradiation-induced Cu-enriched clusters, as well as those that are irradiation-enhanced [4]. A similar feature is recognized for cluster IDs 5 and 8 in materials with a higher Ni content. A flux effect was demonstrated for cluster ID 3, comprising MTR irradiation in a high flux region (≥1 × 10^13 n/cm^2/s) [44]. For cluster ID 10, classification is rendered based upon the flux effect, where embrittlement is accelerated in high-Cu-bearing materials irradiated at lower flux levels (less than 5 × 10^9 n/cm^2/s). This is possibly due to increased thermal equilibrium vacancies [44,45]. Based on all the above considerations, it was ascertained that data belonging to identical cluster ID
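
    The "infinite summation of normal distributions" is, in practice, approximated by a truncated Dirichlet-process mixture, in which unused components receive near-zero weight so the number of clusters is inferred from the data. A minimal sketch with invented DBTT-shift data, using scikit-learn's BayesianGaussianMixture rather than the authors' implementation:

        import numpy as np
        from sklearn.mixture import BayesianGaussianMixture

        # Invented toy "clusters": three materials/irradiation regimes.
        rng = np.random.default_rng(1)
        shifts = np.concatenate([rng.normal(20, 5, 200),
                                 rng.normal(60, 8, 150),
                                 rng.normal(120, 15, 100)])[:, None]

        # Truncated Dirichlet-process mixture of normals: components the data
        # do not need are shrunk toward zero weight.
        bnp = BayesianGaussianMixture(
            n_components=10,
            weight_concentration_prior_type="dirichlet_process",
            max_iter=500, random_state=0).fit(shifts)
        print(np.round(bnp.weights_, 3))         # effective cluster weights
        print(np.round(bnp.means_.ravel(), 1))   # cluster means (DBTT shift, C)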

  2. Joint Analysis of Survival Time and Longitudinal Categorical Outcomes

    PubMed Central

    Choi, Jaeun; Cai, Jianwen; Zeng, Donglin; Olshan, Andrew F.

    2013-01-01

    In biomedical or public health research, it is common for both survival time and longitudinal categorical outcomes to be collected for a subject, along with the subject’s characteristics or risk factors. Investigators are often interested in finding important variables for predicting both survival time and longitudinal outcomes which could be correlated within the same subject. Existing approaches for such joint analyses deal with continuous longitudinal outcomes. New statistical methods need to be developed for categorical longitudinal outcomes. We propose to simultaneously model the survival time with a stratified Cox proportional hazards model and the longitudinal categorical outcomes with a generalized linear mixed model. Random effects are introduced to account for the dependence between survival time and longitudinal outcomes due to unobserved factors. The Expectation-Maximization (EM) algorithm is used to derive the point estimates for the model parameters, and the observed information matrix is adopted to estimate their asymptotic variances. Asymptotic properties for our proposed maximum likelihood estimators are established using the theory of empirical processes. The method is demonstrated to perform well in finite samples via simulation studies. We illustrate our approach with data from the Carolina Head and Neck Cancer Study (CHANCE) and compare the results based on our simultaneous analysis and the separately conducted analyses using the generalized linear mixed model and the Cox proportional hazards model. Our proposed method identifies more predictors than by separate analyses. PMID:26052353

  3. Joint decorrelation, a versatile tool for multichannel data analysis.

    PubMed

    de Cheveigné, Alain; Parra, Lucas C

    2014-09-01

    We review a simple yet versatile approach for the analysis of multichannel data, focusing in particular on brain signals measured with EEG, MEG, ECoG, LFP or optical imaging. Sensors are combined linearly with weights that are chosen to provide optimal signal-to-noise ratio. Signal and noise can be variably defined to match the specific need, e.g. reproducibility over trials, frequency content, or differences between stimulus conditions. We demonstrate how the method can be used to remove power line or cardiac interference, enhance stimulus-evoked or stimulus-induced activity, isolate narrow-band cortical activity, and so on. The approach involves decorrelating both the original and filtered data by joint diagonalization of their covariance matrices. We trace its origins; offer an easy-to-understand explanation; review a range of applications; and chart failure scenarios that might lead to misleading results, in particular due to overfitting. In addition to its flexibility and effectiveness, a major appeal of the method is that it is easy to understand. PMID:24990357
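
    The core computation is small: with covariance matrices estimated from "signal" and "noise" segments of the data, the sensor weights that maximize the signal-to-noise variance ratio come from a generalized eigendecomposition, which jointly diagonalizes the two matrices. A minimal sketch on toy data (not the authors' toolbox):

        import numpy as np
        from scipy.linalg import eigh

        def jd_weights(signal_cov, noise_cov):
            # Generalized eigendecomposition: directions ordered by their
            # signal-to-noise variance ratio; jointly diagonalizes both matrices.
            evals, evecs = eigh(signal_cov, noise_cov)
            order = np.argsort(evals)[::-1]
            return evecs[:, order], evals[order]

        # Toy: 5 "sensors" sharing one evoked component plus sensor noise.
        rng = np.random.default_rng(0)
        mix = rng.normal(size=(5, 1))
        evoked = np.sin(np.linspace(0, 4 * np.pi, 1000))
        data = mix @ evoked[None, :] + 0.5 * rng.normal(size=(5, 1000))
        noise = 0.5 * rng.normal(size=(5, 1000))   # e.g. pre-stimulus baseline

        W, ratios = jd_weights(np.cov(data), np.cov(noise))
        component = W[:, 0] @ data   # best-SNR linear combination of sensors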

  4. [Kinetic radiography and functional analysis of the temporomandibular joint (TMJ)].

    PubMed

    Bandai, Natsuko; Sanada, Shigeru; Ueki, Koichiro; Funabasama, Shintaro; Matsui, Takeshi; Tsuduki, Shinji

    2003-03-01

    To develop a method of kinetic radiography and a computer-aided diagnosis (CAD) system for quantitative evaluation of the temporomandibular joint (TMJ), dynamic images of the TMJ from one healthy volunteer were obtained by fluoroscopy in the lateral view on the right and left sides. An accumulated image subtraction technique extracted the condyle in each image. A sequential similarity detection algorithm (SSDA) was employed to trace the movement path and the velocity of the condyle. The shape of the path of the right condyle was smoother than that of the left condyle. The maximum vertical and horizontal movements of the condyle were 4.6+/-0.1 mm and 15.0+/-0.2 mm, respectively. The velocity of the condyle was higher in the area close to the articular eminence than in any other area during the opening and closing of the mouth. This CAD system will contribute to the kinetic analysis of the TMJ for screening, follow-up study, and informed consent, providing speed, quantitation, and cost-effectiveness.
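
    The SSDA step can be sketched in a few lines: accumulate absolute differences between a template (the condyle) and each candidate image patch, visiting pixels in random order, and abandon a position as soon as the partial sum exceeds a threshold. A minimal sketch with my own variable names and threshold policy:

        import numpy as np

        def ssda_match(frame, template, threshold, rng=np.random.default_rng(0)):
            # Sequential similarity detection: the early exit on the partial
            # error sum is what makes SSDA cheap enough for per-frame tracking.
            th, tw = template.shape
            H, W = frame.shape
            order = rng.permutation(th * tw)
            t = template.astype(float).ravel()
            best, best_pos = np.inf, None
            for y in range(H - th + 1):
                for x in range(W - tw + 1):
                    patch = frame[y:y + th, x:x + tw].astype(float).ravel()
                    s, completed = 0.0, True
                    for k in order:
                        s += abs(patch[k] - t[k])
                        if s > threshold:       # abandon this position early
                            completed = False
                            break
                    if completed and s < best:
                        best, best_pos = s, (y, x)
            return best_pos, best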

  5. Joint Analysis of Multiple Traits Using "Optimal" Maximum Heritability Test.

    PubMed

    Wang, Zhenchuan; Sha, Qiuying; Zhang, Shuanglin

    2016-01-01

    The joint analysis of multiple traits has recently become popular since it can increase statistical power to detect genetic variants, and there is increasing evidence showing that pleiotropy is a widespread phenomenon in complex diseases. Currently, most existing methods use all of the traits for testing the association between multiple traits and a single variant. However, those methods may lose power in the presence of a large number of noise traits. In this paper, we propose an "optimal" maximum heritability test (MHT-O) to test the association between multiple traits and a single variant. MHT-O includes a procedure for deleting traits that have weak or no association with the variant. Using extensive simulation studies, we compare the performance of MHT-O with MHT, the Trait-based Association Test that uses Extended Simes procedure (TATES), SUM_SCORE and MANOVA. Our results show that, in all of the simulation scenarios, MHT-O is either the most powerful test or comparable to the most powerful test among the five tests we compared. PMID:26950849

  6. Bayesian Clinical Trials in Action

    PubMed Central

    Lee, J. Jack; Chu, Caleb T.

    2012-01-01

    Although the frequentist paradigm has been the predominant approach to clinical trial design since the 1940s, it has several notable limitations. The alternative Bayesian paradigm has been greatly enhanced by advancements in computational algorithms and computer hardware. Compared to its frequentist counterpart, the Bayesian framework has several unique advantages, and its incorporation into clinical trial design is occurring more frequently. Using an extensive literature review to assess how Bayesian methods are used in clinical trials, we find them most commonly used for dose finding, efficacy monitoring, toxicity monitoring, diagnosis/decision making, and for studying pharmacokinetics/pharmacodynamics. The additional infrastructure required for implementing Bayesian methods in clinical trials may include specialized software programs to run the study design, simulation, and analysis, and Web-based applications, which are particularly useful for timely data entry and analysis. Trial success requires not only the development of proper tools but also timely and accurate execution of data entry, quality control, adaptive randomization, and Bayesian computation. The relative merit of the Bayesian and frequentist approaches continues to be the subject of debate in statistics. However, more evidence can be found showing the convergence of the two camps, at least at the practical level. Ultimately, better clinical trial methods lead to more efficient designs, lower sample sizes, more accurate conclusions, and better outcomes for patients enrolled in the trials. Bayesian methods offer attractive alternatives for better trials. More such trials should be designed and conducted to refine the approach and demonstrate its real benefit in action. PMID:22711340

  7. Elasto-Plastic Analysis of Tee Joints Using HOT-SMAC

    NASA Technical Reports Server (NTRS)

    Arnold, Steve M. (Technical Monitor); Bednarcyk, Brett A.; Yarrington, Phillip W.

    2004-01-01

    The Higher Order Theory - Structural/Micro Analysis Code (HOT-SMAC) software package is applied to analyze the linearly elastic and elasto-plastic response of adhesively bonded tee joints. Joints of this type are finding an increasing number of applications with the increased use of composite materials within advanced aerospace vehicles, and improved tools for the design and analysis of these joints are needed. The linearly elastic results of the code are validated against finite element analysis results from the literature under different loading and boundary conditions, and new results are generated to investigate the inelastic behavior of the tee joint. The comparison with the finite element results indicates that HOT-SMAC is an efficient and accurate alternative to the finite element method and has a great deal of potential as an analysis tool for a wide range of bonded joints.

  8. Bayesian change point analysis of abundance trends for pelagic fishes in the upper San Francisco Estuary.

    PubMed

    Thomson, James R; Kimmerer, Wim J; Brown, Larry R; Newman, Ken B; Mac Nally, Ralph; Bennett, William A; Feyrer, Frederick; Fleishman, Erica

    2010-07-01

    We examined trends in abundance of four pelagic fish species (delta smelt, longfin smelt, striped bass, and threadfin shad) in the upper San Francisco Estuary, California, USA, over 40 years using Bayesian change point models. Change point models identify times of abrupt or unusual changes in absolute abundance (step changes) or in rates of change in abundance (trend changes). We coupled Bayesian model selection with linear regression splines to identify biotic or abiotic covariates with the strongest associations with abundances of each species. We then refitted change point models conditional on the selected covariates to explore whether those covariates could explain statistical trends or change points in species abundances. We also fitted a multispecies change point model that identified change points common to all species. All models included hierarchical structures to model data uncertainties, including observation errors and missing covariate values. There were step declines in abundances of all four species in the early 2000s, with a likely common decline in 2002. Abiotic variables, including water clarity, position of the 2 per thousand isohaline (X2), and the volume of freshwater exported from the estuary, explained some variation in species' abundances over the time series, but no selected covariates could explain statistically the post-2000 change points for any species. PMID:20666259
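
    The step-change idea reduces, in its simplest form, to a discrete posterior over the change time. A minimal single-change-point sketch in a normal model with segment means profiled out, far simpler than the paper's hierarchical models but showing the mechanics:

        import numpy as np

        def step_change_posterior(y, sigma=1.0):
            # Grid posterior over a single step-change time tau:
            # y_t ~ N(mu1, sigma^2) for t < tau, N(mu2, sigma^2) for t >= tau,
            # with mu1, mu2 set to segment means (an empirical-Bayes shortcut).
            n = len(y)
            logpost = np.full(n, -np.inf)
            for tau in range(2, n - 1):
                m1, m2 = y[:tau].mean(), y[tau:].mean()
                resid = np.concatenate([y[:tau] - m1, y[tau:] - m2])
                logpost[tau] = -0.5 * np.sum(resid ** 2) / sigma ** 2
            p = np.exp(logpost - logpost.max())
            return p / p.sum()

        # Toy abundance index with a decline after t = 30.
        rng = np.random.default_rng(2)
        y = np.concatenate([rng.normal(5, 1, 30), rng.normal(2, 1, 20)])
        print("most probable change point:", step_change_posterior(y).argmax())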

  9. A Bayesian approach to the classification and analysis of silicon clusters

    NASA Astrophysics Data System (ADS)

    Carter-Schwendler, Carl; Subbaswamy, K. R.

    1997-03-01

    A relatively large database of locally stable 15- and 16-atom silicon clusters is analyzed to reveal attributes characteristic of low-energy isomers. We generate the structures with the help of a real-space genetic algorithm similar to that of Deaven and Ho (D. M. Deaven and K. M. Ho, Molecular geometry optimization with a genetic algorithm, Phys. Rev. Lett. 75, 288 (1995)), using a transferable tight-binding scheme of Menon and Subbaswamy (Madhu Menon and K. R. Subbaswamy, Transferable nonorthogonal tight-binding scheme for silicon, Phys. Rev. B 50, 11577 (1994)). The characteristics are then processed with AutoClass (Robin Hanson, John Stutz, and Peter Cheeseman, Bayesian classification theory, Technical Report FIA-90-12-7-01, NASA Ames Research Center, Artificial Intelligence Branch, May 1991), an unsupervised Bayesian classification scheme, to identify families of clusters. Implications for a systematic study with increasing cluster size and for identifying particularly stable (i.e., unreactive) clusters will be examined. For more information see http://www.pa.uky.edu/~schwendler.

  10. Bayesian change point analysis of abundance trends for pelagic fishes in the upper San Francisco Estuary

    USGS Publications Warehouse

    Thomson, James R.; Kimmerer, Wim J.; Brown, Larry R.; Newman, Ken B.; Mac Nally, Ralph; Bennett, William A.; Feyrer, Frederick; Fleishman, Erica

    2010-01-01

    We examined trends in abundance of four pelagic fish species (delta smelt, longfin smelt, striped bass, and threadfin shad) in the upper San Francisco Estuary, California, USA, over 40 years using Bayesian change point models. Change point models identify times of abrupt or unusual changes in absolute abundance (step changes) or in rates of change in abundance (trend changes). We coupled Bayesian model selection with linear regression splines to identify biotic or abiotic covariates with the strongest associations with abundances of each species. We then refitted change point models conditional on the selected covariates to explore whether those covariates could explain statistical trends or change points in species abundances. We also fitted a multispecies change point model that identified change points common to all species. All models included hierarchical structures to model data uncertainties, including observation errors and missing covariate values. There were step declines in abundances of all four species in the early 2000s, with a likely common decline in 2002. Abiotic variables, including water clarity, position of the 2‰ isohaline (X2), and the volume of freshwater exported from the estuary, explained some variation in species' abundances over the time series, but no selected covariates could explain statistically the post-2000 change points for any species.

  11. Bayesian analysis of a morphological supermatrix sheds light on controversial fossil hominin relationships

    PubMed Central

    Dembo, Mana; Matzke, Nicholas J.; Mooers, Arne Ø.; Collard, Mark

    2015-01-01

    The phylogenetic relationships of several hominin species remain controversial. Two methodological issues contribute to the uncertainty—use of partial, inconsistent datasets and reliance on phylogenetic methods that are ill-suited to testing competing hypotheses. Here, we report a study designed to overcome these issues. We first compiled a supermatrix of craniodental characters for all widely accepted hominin species. We then took advantage of recently developed Bayesian methods for building trees of serially sampled tips to test among hypotheses that have been put forward in three of the most important current debates in hominin phylogenetics—the relationship between Australopithecus sediba and Homo, the taxonomic status of the Dmanisi hominins, and the place of the so-called hobbit fossils from Flores, Indonesia, in the hominin tree. Based on our results, several published hypotheses can be statistically rejected. For example, the data do not support the claim that Dmanisi hominins and all other early Homo specimens represent a single species, nor that the hobbit fossils are the remains of small-bodied modern humans, one of whom had Down syndrome. More broadly, our study provides a new baseline dataset for future work on hominin phylogeny and illustrates the promise of Bayesian approaches for understanding hominin phylogenetic relationships. PMID:26202999

  12. Bayesian models for cost-effectiveness analysis in the presence of structural zero costs.

    PubMed

    Baio, Gianluca

    2014-05-20

    Bayesian modelling for cost-effectiveness data has received much attention in both the health economics and the statistical literature, in recent years. Cost-effectiveness data are characterised by a relatively complex structure of relationships linking a suitable measure of clinical benefit (e.g. quality-adjusted life years) and the associated costs. Simplifying assumptions, such as (bivariate) normality of the underlying distributions, are usually not granted, particularly for the cost variable, which is characterised by markedly skewed distributions. In addition, individual-level data sets are often characterised by the presence of structural zeros in the cost variable. Hurdle models can be used to account for the presence of excess zeros in a distribution and have been applied in the context of cost data. We extend their application to cost-effectiveness data, defining a full Bayesian specification, which consists of a model for the individual probability of null costs, a marginal model for the costs and a conditional model for the measure of effectiveness (given the observed costs). We presented the model using a working example to describe its main features.
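
    The hurdle structure factorizes the cost likelihood into a Bernoulli "null cost" part and a model for the positive costs. A minimal sketch of that likelihood with a Gamma component and invented parameters (the full specification in the paper is Bayesian and also models effectiveness conditional on costs):

        import numpy as np
        from scipy import stats

        def hurdle_loglik(costs, p_zero, shape, scale):
            # Bernoulli part for structural zeros, Gamma part for positive costs.
            zero = costs == 0
            ll = np.sum(np.log(p_zero) * zero)
            ll += np.sum(np.log1p(-p_zero) * ~zero)
            ll += stats.gamma.logpdf(costs[~zero], a=shape, scale=scale).sum()
            return ll

        # Invented data: 30% null costs, skewed positive costs.
        rng = np.random.default_rng(3)
        costs = np.where(rng.random(500) < 0.3, 0.0, rng.gamma(2.0, 800.0, 500))
        print(hurdle_loglik(costs, 0.3, 2.0, 800.0))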

  13. The GRB Golenetskii Correlation as a Cosmological Probe via Bayesian Analysis

    NASA Astrophysics Data System (ADS)

    Burgess, Michael

    2016-07-01

    Gamma-ray bursts (GRBs) are characterized by a strong correlation between the instantaneous luminosity and the spectral peak energy within a burst. This correlation, which is known as the hardness-intensity correlation or the Golenetskii correlation, not only holds important clues to the physics of GRBs but is thought to have the potential to determine redshifts of bursts. In this paper, I use a hierarchical Bayesian model to study the universality of the rest-frame Golenetskii correlation and in particular I assess its use as a redshift estimator for GRBs. I find that, using a power-law prescription of the correlation, the power-law indices cluster near a common value, but have a broader variance than previously reported (~1-2). Furthermore, I find evidence that there is spread in intrinsic rest-frame correlation normalizations for the GRBs in our sample (~10^{51}-10^{53} erg/s). This points towards variable physical settings of the emission (magnetic field strength, number of emitting electrons, photospheric radius, viewing angle, etc.). Subsequently, these results eliminate the Golenetskii correlation as a useful tool for redshift determination and hence as a cosmological probe. Nevertheless, the Bayesian method introduced in this paper allows for a better determination of the rest-frame properties of the correlation, which in turn allows more stringent limitations for physical models of the emission to be set.

  14. A Bayesian analysis of HAT-P-7b using the EXONEST algorithm

    SciTech Connect

    Placek, Ben; Knuth, Kevin H.

    2015-01-13

    The study of exoplanets (planets orbiting other stars) is revolutionizing the way we view our universe. High-precision photometric data provided by the Kepler Space Telescope (Kepler) enables not only the detection of such planets, but also their characterization. This presents a unique opportunity to apply Bayesian methods to better characterize the multitude of previously confirmed exoplanets. This paper focuses on applying the EXONEST algorithm to characterize the transiting short-period hot Jupiter HAT-P-7b (also referred to as Kepler-2b). EXONEST evaluates a suite of exoplanet photometric models by applying Bayesian Model Selection, which is implemented with the MultiNest algorithm. These models take into account planetary effects, such as reflected light and thermal emissions, as well as the effect of the planetary motion on the host star, such as Doppler beaming, or boosting, of light from the reflex motion of the host star, and photometric variations due to the planet-induced ellipsoidal shape of the host star. By calculating model evidences, one can determine which model best describes the observed data, thus identifying which effects dominate the planetary system. Presented are parameter estimates and model evidences for HAT-P-7b.
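
    As a toy sketch of evidence-based model selection in this spirit (EXONEST itself uses MultiNest for the integration; everything below is invented), the code estimates log-evidences for a flat versus a sinusoidally modulated light-curve model by brute-force averaging of the likelihood over prior draws, which only works in tiny problems like this one.

      import numpy as np
      rng = np.random.default_rng(3)

      # Synthetic phase-folded light curve with a tiny sinusoidal modulation
      # (hypothetical, roughly ppm-level like Kepler photometry).
      phase = np.linspace(0, 1, 200)
      flux = 1.0 + 60e-6 * np.cos(2 * np.pi * phase) + rng.normal(0, 50e-6, 200)
      sig = 50e-6

      def loglike(model):
          resid = (flux - model) / sig
          return -0.5 * np.sum(resid ** 2) - flux.size * np.log(sig * np.sqrt(2 * np.pi))

      # Evidence Z = E_prior[likelihood], estimated by plain prior sampling.
      def log_evidence(draw_model, n=20000):
          lls = np.array([loglike(draw_model()) for _ in range(n)])
          m = lls.max()
          return m + np.log(np.mean(np.exp(lls - m)))     # log-sum-exp trick

      logZ_flat = log_evidence(lambda: rng.uniform(0.999, 1.001))
      logZ_mod = log_evidence(lambda: rng.uniform(0.999, 1.001)
                              + rng.uniform(0, 200e-6) * np.cos(2 * np.pi * phase))
      print("log Bayes factor (modulated vs flat):", logZ_mod - logZ_flat)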

  15. Bayesian models for cost-effectiveness analysis in the presence of structural zero costs

    PubMed Central

    Baio, Gianluca

    2014-01-01

    Bayesian modelling for cost-effectiveness data has received much attention in both the health economics and the statistical literature in recent years. Cost-effectiveness data are characterised by a relatively complex structure of relationships linking a suitable measure of clinical benefit (e.g. quality-adjusted life years) and the associated costs. Simplifying assumptions, such as (bivariate) normality of the underlying distributions, are usually not granted, particularly for the cost variable, which is characterised by markedly skewed distributions. In addition, individual-level data sets are often characterised by the presence of structural zeros in the cost variable. Hurdle models can be used to account for the presence of excess zeros in a distribution and have been applied in the context of cost data. We extend their application to cost-effectiveness data, defining a full Bayesian specification, which consists of a model for the individual probability of null costs, a marginal model for the costs and a conditional model for the measure of effectiveness (given the observed costs). We present the model using a working example to describe its main features. © 2013 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd. PMID:24343868

  16. Genetic basis of climatic adaptation in Scots pine by Bayesian quantitative trait locus analysis.

    PubMed Central

    Hurme, P; Sillanpää, M J; Arjas, E; Repo, T; Savolainen, O

    2000-01-01

    We examined the genetic basis of large adaptive differences in timing of bud set and frost hardiness between natural populations of Scots pine. As a mapping population, we considered an "open-pollinated backcross" progeny by collecting seeds of a single F(1) tree (cross between trees from southern and northern Finland) growing in southern Finland. Due to the special features of the design (no marker information available on grandparents or the father), we applied a Bayesian quantitative trait locus (QTL) mapping method developed previously for outcrossed offspring. We found four potential QTL for timing of bud set and seven for frost hardiness. Bayesian analyses detected more QTL than ANOVA for frost hardiness, but the opposite was true for bud set. These QTL included alleles with rather large effects, and additionally smaller QTL were supported. The largest QTL for bud set date accounted for about a fourth of the mean difference between populations. Thus, natural selection during adaptation has resulted in selection of at least some alleles of rather large effect. PMID:11063704

  17. A Bayesian analysis of the 2009 decline in tuberculosis morbidity in the United States.

    PubMed

    Chen, Michael P; Shang, Nong; Winston, Carla A; Becerra, Jose E

    2012-11-30

    Although annual data are commonly used to model linear trends and changes in trends of disease incidence, monthly data could provide additional resolution for statistical inferences. Because monthly data may exhibit seasonal patterns, we need to consider seasonally adjusted models, which can be theoretically complex and computationally intensive. We propose a combination of methods to reduce the complexity of modeling seasonal data and to provide estimates for a change in trend when the timing and magnitude of the change are unknown. To assess potential changes in trend, we first used autoregressive integrated moving average (ARIMA) models to analyze the residuals and forecast errors, followed by multiple ARIMA intervention models to estimate the timing and magnitude of the change. Because the variable corresponding to time of change is not a statistical parameter, its confidence bounds cannot be estimated by intervention models. To model timing of change and its credible interval, we developed a Bayesian technique. We avoided the need for computationally intensive simulations by deriving a closed form for the posterior distribution of the time of change. Using a combination of ARIMA and Bayesian methods, we estimated the timing and magnitude of change in trend for tuberculosis cases in the United States. Published 2012. This article is a US Government work and is in the public domain in the USA. PMID:22415632
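
    A minimal sketch of the change-point posterior idea (simplified relative to the paper: no trend or seasonality, Poisson counts, uniform prior on the change month; all numbers invented): for each candidate change month, evaluate a profile likelihood with segment-wise rates and normalize over candidates to obtain a discrete posterior.

      import numpy as np
      from scipy.stats import poisson
      rng = np.random.default_rng(4)

      # Hypothetical monthly counts with a drop in level at an unknown month.
      true_k = 60
      y = np.concatenate([rng.poisson(100, true_k), rng.poisson(85, 120 - true_k)])

      # Profile likelihood over the change month k, with segment-wise MLE rates;
      # normalising over k under a uniform prior gives a discrete posterior.
      logL = np.full(120, -np.inf)
      for k in range(12, 108):                 # keep both segments non-trivial
          l1, l2 = y[:k].mean(), y[k:].mean()
          logL[k] = poisson.logpmf(y[:k], l1).sum() + poisson.logpmf(y[k:], l2).sum()
      post = np.exp(logL - logL.max())
      post /= post.sum()

      order = np.argsort(post)[::-1]
      ncred = np.searchsorted(np.cumsum(post[order]), 0.95) + 1
      cred = np.sort(order[:ncred])
      print("posterior mode:", int(np.argmax(post)),
            "95% credible months:", int(cred.min()), "to", int(cred.max()))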

  18. Bayesian analysis of a morphological supermatrix sheds light on controversial fossil hominin relationships.

    PubMed

    Dembo, Mana; Matzke, Nicholas J; Mooers, Arne Ø; Collard, Mark

    2015-08-01

    The phylogenetic relationships of several hominin species remain controversial. Two methodological issues contribute to the uncertainty: use of partial, inconsistent datasets and reliance on phylogenetic methods that are ill-suited to testing competing hypotheses. Here, we report a study designed to overcome these issues. We first compiled a supermatrix of craniodental characters for all widely accepted hominin species. We then took advantage of recently developed Bayesian methods for building trees of serially sampled tips to test among hypotheses that have been put forward in three of the most important current debates in hominin phylogenetics: the relationship between Australopithecus sediba and Homo, the taxonomic status of the Dmanisi hominins, and the place of the so-called hobbit fossils from Flores, Indonesia, in the hominin tree. Based on our results, several published hypotheses can be statistically rejected. For example, the data do not support the claim that Dmanisi hominins and all other early Homo specimens represent a single species, nor that the hobbit fossils are the remains of small-bodied modern humans, one of whom had Down syndrome. More broadly, our study provides a new baseline dataset for future work on hominin phylogeny and illustrates the promise of Bayesian approaches for understanding hominin phylogenetic relationships.

  19. Bayesian Analysis of Two Stellar Populations in Galactic Globular Clusters. II. NGC 5024, NGC 5272, and NGC 6352

    NASA Astrophysics Data System (ADS)

    Wagner-Kaiser, R.; Stenning, D. C.; Robinson, E.; von Hippel, T.; Sarajedini, A.; van Dyk, D. A.; Stein, N.; Jefferys, W. H.

    2016-07-01

    We use Cycle 21 Hubble Space Telescope (HST) observations and HST archival Advanced Camera for Surveys Treasury observations of Galactic Globular Clusters to find and characterize two stellar populations in NGC 5024 (M53), NGC 5272 (M3), and NGC 6352. For these three clusters, both single- and double-population analyses are used to determine the best-fit isochrone(s). We employ a sophisticated Bayesian analysis technique to simultaneously fit the cluster parameters (age, distance, absorption, and metallicity) that characterize each cluster. For the two-population analysis, unique population-level helium values are also fit to each distinct population of the cluster and the relative proportions of the populations are determined. We find differences in helium ranging from ~0.05 to 0.11 for these three clusters. Model grids with solar α-element abundances ([α/Fe] = 0.0) and enhanced α-elements ([α/Fe] = 0.4) are adopted.

  20. Reconstruction of a windborne insect invasion using a particle dispersal model, historical wind data, and Bayesian analysis of genetic data.

    PubMed

    Lander, Tonya A; Klein, Etienne K; Oddou-Muratorio, Sylvie; Candau, Jean-Noël; Gidoin, Cindy; Chalon, Alain; Roig, Anne; Fallour, Delphine; Auger-Rozenberg, Marie-Anne; Boivin, Thomas

    2014-12-01

    Understanding how invasive species establish and spread is vital for developing effective management strategies for invaded areas and identifying new areas where the risk of invasion is highest. We investigated the explanatory power of dispersal histories reconstructed based on local-scale wind data and a regional-scale wind-dispersed particle trajectory model for the invasive seed chalcid wasp Megastigmus schimitscheki (Hymenoptera: Torymidae) in France. The explanatory power was tested by: (1) survival analysis of empirical data on M. schimitscheki presence, absence and year of arrival at 52 stands of the wasp's obligate hosts, Cedrus (true cedar trees); and (2) Approximate Bayesian analysis of M. schimitscheki genetic data using a coalescence model. The Bayesian demographic modeling and traditional population genetic analysis suggested that initial invasion across the range was the result of long-distance dispersal from the longest established sites. The survival analyses of the windborne expansion patterns derived from a particle dispersal model indicated that there was an informative correlation between the M. schimitscheki presence/absence data from the annual surveys and the scenarios based on regional-scale wind data. These three very different analyses produced highly congruent results supporting our proposal that wind is the most probable vector for passive long-distance dispersal of this invasive seed wasp. This result confirms that long-distance dispersal from introduction areas is a likely driver of secondary expansion of alien invasive species. Based on our results, management programs for this and other windborne invasive species may consider (1) focusing effort at the longest established sites and (2) monitoring outlying populations, which remain critically important due to their influence on rates of spread. We also suggest that there is a distinct need for new analysis methods that have the capacity to combine empirical spatiotemporal field data

  1. A new African fossil caprin and a combined molecular and morphological Bayesian phylogenetic analysis of Caprini (Mammalia: Bovidae).

    PubMed

    Bibi, F; Vrba, E; Fack, F

    2012-09-01

    Given that most species that have ever existed on Earth are extinct, no evolutionary history can ever be complete without the inclusion of fossil taxa. Bovids (antelopes and relatives) are one of the most diverse clades of large mammals alive today, with over a hundred living species and hundreds of documented fossil species. With the advent of molecular phylogenetics, major advances have been made in the phylogeny of this clade; however, there has been little attempt to integrate the fossil record into the developing phylogenetic picture. We here describe a new large fossil caprin species from ca. 1.9-Ma deposits from the Middle Awash, Ethiopia. To place the new species phylogenetically, we perform a Bayesian analysis of a combined molecular (cytochrome b) and morphological (osteological) character supermatrix. We include all living species of Caprini, the new fossil species, a fossil takin from the Pliocene of Ethiopia (Budorcas churcheri), and the insular subfossil Myotragus balearicus. The combined analysis demonstrates successful incorporation of both living and fossil species within a single phylogeny based on both molecular and morphological evidence. Analysis of the combined supermatrix produces better resolution than either the molecular or the morphological data set considered alone. Parsimony and Bayesian analyses of the data set are also compared and shown to produce similar results. The combined phylogenetic analysis indicates that the new fossil species is nested within Capra, making it one of the earliest representatives of this clade, with implications for molecular clock calibration. Geographical optimization indicates no fewer than four independent dispersals into Africa by caprins since the Pliocene.

  2. Towards a Fuzzy Bayesian Network Based Approach for Safety Risk Analysis of Tunnel-Induced Pipeline Damage.

    PubMed

    Zhang, Limao; Wu, Xianguo; Qin, Yawei; Skibniewski, Miroslaw J; Liu, Wenli

    2016-02-01

    Tunneling excavation is bound to produce significant disturbances to surrounding environments, and the tunnel-induced damage to adjacent underground buried pipelines is of considerable importance for geotechnical practice. A fuzzy Bayesian networks (FBNs) based approach for safety risk analysis is developed in this article with detailed step-by-step procedures, consisting of risk mechanism analysis, the FBN model establishment, fuzzification, FBN-based inference, defuzzification, and decision making. In accordance with the failure mechanism analysis, a tunnel-induced pipeline damage model is proposed to reveal the cause-effect relationships between the pipeline damage and its influential variables. In terms of the fuzzification process, an expert confidence indicator is proposed to reveal the reliability of the data when determining the fuzzy probability of occurrence of basic events, with both the judgment ability level and the subjectivity reliability level taken into account. By means of the fuzzy Bayesian inference, the approach proposed in this article is capable of calculating the probability distribution of potential safety risks and identifying the most likely potential causes of accidents under both prior knowledge and given evidence circumstances. A case concerning the safety analysis of underground buried pipelines adjacent to the construction of the Wuhan Yangtze River Tunnel is presented. The results demonstrate the feasibility of the proposed FBN approach and its application potential. The proposed approach can be used as a decision tool to provide support for safety assurance and management in tunnel construction, and thus increase the likelihood of a successful project in a complex project environment. PMID:26224125

  3. Reconstruction of a windborne insect invasion using a particle dispersal model, historical wind data, and Bayesian analysis of genetic data

    PubMed Central

    Lander, Tonya A; Klein, Etienne K; Oddou-Muratorio, Sylvie; Candau, Jean-Noël; Gidoin, Cindy; Chalon, Alain; Roig, Anne; Fallour, Delphine; Auger-Rozenberg, Marie-Anne; Boivin, Thomas

    2014-01-01

    Understanding how invasive species establish and spread is vital for developing effective management strategies for invaded areas and identifying new areas where the risk of invasion is highest. We investigated the explanatory power of dispersal histories reconstructed based on local-scale wind data and a regional-scale wind-dispersed particle trajectory model for the invasive seed chalcid wasp Megastigmus schimitscheki (Hymenoptera: Torymidae) in France. The explanatory power was tested by: (1) survival analysis of empirical data on M. schimitscheki presence, absence and year of arrival at 52 stands of the wasp's obligate hosts, Cedrus (true cedar trees); and (2) Approximate Bayesian analysis of M. schimitscheki genetic data using a coalescence model. The Bayesian demographic modeling and traditional population genetic analysis suggested that initial invasion across the range was the result of long-distance dispersal from the longest established sites. The survival analyses of the windborne expansion patterns derived from a particle dispersal model indicated that there was an informative correlation between the M. schimitscheki presence/absence data from the annual surveys and the scenarios based on regional-scale wind data. These three very different analyses produced highly congruent results supporting our proposal that wind is the most probable vector for passive long-distance dispersal of this invasive seed wasp. This result confirms that long-distance dispersal from introduction areas is a likely driver of secondary expansion of alien invasive species. Based on our results, management programs for this and other windborne invasive species may consider (1) focusing effort at the longest established sites and (2) monitoring outlying populations, which remain critically important due to their influence on rates of spread. We also suggest that there is a distinct need for new analysis methods that have the capacity to combine empirical spatiotemporal field data

  4. Preloaded joint analysis methodology for space flight systems

    NASA Technical Reports Server (NTRS)

    Chambers, Jeffrey A.

    1995-01-01

    This report contains a compilation of some of the most basic equations governing simple preloaded joint systems and discusses the more common modes of failure associated with such hardware. It is intended to provide the mechanical designer with the tools necessary for designing a basic bolted joint. Although the information presented is intended to aid in the engineering of space flight structures, the fundamentals are equally applicable to other forms of mechanical design.
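
    For illustration, a back-of-envelope sketch of the two most basic preloaded-joint relations such a report covers: the torque-preload estimate F_p = T/(K d), and load sharing through the joint stiffness factor C = k_b/(k_b + k_m). All numbers are invented, not taken from the report.

      # Illustrative values only, not from the report.
      T = 30.0                 # installation torque, N*m
      K = 0.2                  # nut factor (typical dry assembly, highly uncertain)
      d = 0.008                # nominal bolt diameter, m
      kb, km = 2.0e8, 8.0e8    # bolt and clamped-member stiffnesses, N/m
      P = 5000.0               # applied external tensile load, N

      Fp = T / (K * d)         # preload from the torque-preload relation
      C = kb / (kb + km)       # joint stiffness factor: bolt's share of P
      Fb = Fp + C * P          # total bolt load under the external load
      P_sep = Fp / (1 - C)     # external load at which the joint separates

      print(f"preload {Fp:.0f} N, bolt load {Fb:.0f} N, separation at {P_sep:.0f} N")
      print(f"separation margin: {P_sep / P - 1:.2f}")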

  5. Joint venture versus outreach: a financial analysis of case studies.

    PubMed

    Forsman, R W

    2001-01-01

    Medical centers across the country are facing cost challenges, and national commercial laboratories are experiencing financial declines that necessitate their capturing market share in any way possible. Many laboratories are turning to joint ventures or partnerships for financial relief. However, it often is in the best interest of the patient and the medical center to integrate laboratory services across the continuum of care. This article analyzes two hypothetical joint ventures involving a laboratory management agreement and full laboratory outsourcing.

  6. Analysis of in-situ rock joint strength using digital borehole scanner images

    SciTech Connect

    Thapa, B.B.

    1994-09-01

    The availability of high resolution digital images of borehole walls using the Borehole Scanner System has made it possible to develop new methods of in-situ rock characterization. In particular, this thesis addresses new approaches to the characterization of in-situ joint strength arising from surface roughness. An image processing technique is used to extract the roughness profile from joints in the unrolled image of the borehole wall. A method for estimating in-situ Rengers envelopes using these data is presented, along with results from applying the method to joints in a borehole in porphyritic granite. Next, an analysis of joint dilation angle anisotropy is described and applied to the porphyritic granite joints. The results indicate that the dilation angles of the joints studied are anisotropic at small scales and tend to reflect joint waviness as scale increases. A procedure to unroll the opposing roughness profiles to obtain a two-dimensional sample is presented. The measurement of apertures during this process is shown to produce an error that increases with the dip of the joint. The two-dimensional sample of opposing profiles is used in a new kinematic analysis of the joint shear stress-shear deformation behavior. Examples of applying these methods to the porphyritic granite joints are presented. The unrolled opposing profiles were used in a numerical simulation of a direct shear test using Discontinuous Deformation Analysis. Results were compared to laboratory test results using core samples containing the same joints. The simulated dilatancy and shear stress-shear deformation curves were close to the laboratory curves in the case of a joint in porphyritic granite.
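
    The scale effect described here is easy to reproduce. The sketch below (synthetic random-walk profile, not scanner data) computes the steepest inclination between profile points a given base length apart, the quantity tabulated in a Rengers envelope, and shows it shrinking as the base length grows.

      import numpy as np
      rng = np.random.default_rng(6)

      # Synthetic joint roughness profile: heights in mm sampled every 1 mm,
      # a stand-in for a profile extracted from a borehole-wall image.
      x = np.arange(0, 500.0)                        # along-joint distance, mm
      z = np.cumsum(rng.normal(0, 0.2, x.size))      # random-walk roughness

      def max_dilation_angle(x, z, base):
          # Steepest inclination (deg) between points 'base' mm apart.
          step = int(base)                           # 1 mm sampling assumed
          slopes = (z[step:] - z[:-step]) / (x[step:] - x[:-step])
          return np.degrees(np.arctan(np.abs(slopes).max()))

      # Envelope: dilation angle decreases as the base length (scale) grows,
      # reflecting waviness taking over from small-scale roughness.
      for base in [1, 2, 5, 10, 20, 50, 100]:
          print(f"base {base:4d} mm -> max dilation angle "
                f"{max_dilation_angle(x, z, base):5.1f} deg")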

  7. Training image analysis for model error assessment and dimension reduction in Bayesian-MCMC solutions to inverse problems

    NASA Astrophysics Data System (ADS)

    Koepke, C.; Irving, J.

    2015-12-01

    Bayesian solutions to inverse problems in near-surface geophysics and hydrology have gained increasing popularity as a means of estimating not only subsurface model parameters, but also their corresponding uncertainties that can be used in probabilistic forecasting and risk analysis. In particular, Markov-chain-Monte-Carlo (MCMC) methods have attracted much recent attention as a means of statistically sampling from the Bayesian posterior distribution. In this regard, two approaches are commonly used to improve the computational tractability of the Bayesian-MCMC approach: (i) Forward models involving a simplification of the underlying physics are employed, which offer a significant reduction in the time required to calculate data, but generally at the expense of model accuracy, and (ii) the model parameter space is represented using a limited set of spatially correlated basis functions as opposed to a more intuitive high-dimensional pixel-based parameterization. It has become well understood that model inaccuracies resulting from (i) can lead to posterior parameter distributions that are highly biased and overly confident. Further, when performing model reduction as described in (ii), it is not clear how the prior distribution for the basis weights should be defined because simple (e.g., Gaussian or uniform) priors that may be suitable for a pixel-based parameterization may result in a strong prior bias when used for the weights. To address the issue of model error resulting from known forward model approximations, we generate a set of error training realizations and analyze them with principal component analysis (PCA) in order to generate a sparse basis. The latter is used in the MCMC inversion to remove the main model-error component from the residuals. To improve issues related to prior bias when performing model reduction, we also use a training realization approach, but this time models are simulated from the prior distribution and analyzed using independent
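
    A compact sketch of the training-realization idea for model error (everything below is synthetic and hypothetical): build a PCA basis from a set of error realizations, then remove from a residual the component lying in that basis, so misfit is judged on what the dominant model-error directions cannot explain.

      import numpy as np
      rng = np.random.default_rng(7)

      # Hypothetical setup: 500 training realizations of the model error
      # (fine-physics forward model minus simplified model), 200 data points each;
      # faked here with smooth random oscillations.
      t = np.linspace(0, 1, 200)
      train = np.array([np.sin(2 * np.pi * (3 + rng.normal()) * t + rng.uniform(0, 6))
                        * rng.normal(1, 0.3) for _ in range(500)])

      # PCA of the error realizations -> sparse basis for the model error.
      mean_err = train.mean(axis=0)
      U, s, Vt = np.linalg.svd(train - mean_err, full_matrices=False)
      k = np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.95) + 1
      B = Vt[:k]                                   # leading error components

      # Inside MCMC one would project each residual onto B and subtract the
      # explained part before evaluating the likelihood.
      residual = train[0] + rng.normal(0, 0.05, 200)   # stand-in residual
      cleaned = residual - mean_err - B.T @ (B @ (residual - mean_err))
      print("residual norm before/after:", np.linalg.norm(residual),
            np.linalg.norm(cleaned))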

  8. Stresses in Vehicle Chassis Joints - A Comparison of SPATE with Other Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Loader, A. J.; Turner, W. B.; Harwood, N.

    1987-04-01

    Joints in ladder frame chassis have been studied as part of an SERC Teaching Company Scheme. The joints between the cross members and side members are complex structures involving bolts, welds and/or rivets, as the cross member section can be tubular, box or C-section. It is therefore difficult to apply simple analytical methods to such joints. This paper compares the stresses obtained by brittle lacquer, strain gauge and SPATE measurements with those found from a finite element analysis of the joints.

  9. Sparsity and the Bayesian perspective

    NASA Astrophysics Data System (ADS)

    Starck, J.-L.; Donoho, D. L.; Fadili, M. J.; Rassat, A.

    2013-04-01

    Sparsity has recently been introduced in cosmology for weak-lensing and cosmic microwave background (CMB) data analysis for different applications such as denoising, component separation, or inpainting (i.e., filling the missing data or the mask). Although it gives very nice numerical results, CMB sparse inpainting has been severely criticized by top researchers in cosmology using arguments derived from a Bayesian perspective. In an attempt to understand their point of view, we realize that interpreting a regularization penalty term as a prior in a Bayesian framework can lead to erroneous conclusions. This paper is by no means against the Bayesian approach, which has proven to be very useful for many applications, but warns against a Bayesian-only interpretation in data analysis, which can be misleading in some cases.
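
    A tiny numerical illustration of the point at issue (one-dimensional, invented): an l1 regularization penalty coincides with a Laplace prior only at the MAP estimate; other Bayesian summaries of the very same posterior, such as the mean, behave differently, so reading the penalty as "the" prior commits you to more than the MAP.

      import numpy as np

      # One observation y = x + noise, noise ~ N(0,1); Laplace prior exp(-lam*|x|).
      # The l1-penalised estimate is the MAP: soft thresholding.
      y, lam = 1.2, 1.0
      x_map = np.sign(y) * max(abs(y) - lam, 0.0)

      # The posterior mean under the same prior, by 1-D numerical integration.
      x = np.linspace(-10, 10, 200001)
      logp = -0.5 * (y - x) ** 2 - lam * np.abs(x)
      w = np.exp(logp - logp.max())
      x_mean = np.sum(x * w) / np.sum(w)

      print(f"MAP (soft threshold): {x_map:.3f}   posterior mean: {x_mean:.3f}")
      # The two summaries disagree, and the posterior mean is not sparse:
      # the penalty pins down the MAP, not the full Bayesian answer.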

  10. Crack propagation analysis of welded thin-walled joints using boundary element method

    NASA Astrophysics Data System (ADS)

    Mashiri, F. R.; Zhao, Xiao-Ling; Grundy, P.

    Tube-to-plate nodal joints under cyclic bending are widely used in the road transport and agricultural industry. The square hollow sections (SHS) used in these constructions are thin-walled and cold formed, and they have thicknesses of less than 4 mm. Some fatigue failures have been observed. The weld undercut may affect the fatigue life of welded tubular joints, especially for thin-walled sections. The undercut dimensions were measured using the silicon imprint technique. Modelling of thin-walled cruciform joints, as a simplification of welded tubular joints, is described in this paper to determine the effect of weld undercut on fatigue propagation life. The Boundary Element Analysis System Software (BEASY) is used. The effect of weld toe undercut found in this analysis is compared with results from previous research to determine the relative reduction in fatigue life between thin-walled joints (T = 3 mm) and those made of thicker sections (T = 20 mm). The loss in fatigue strength of the thin-walled joints is found to be proportionally greater than that for thicker-walled joints. A 3D model of a tube-to-plate T-joint is also modelled using the boundary element software, BEASY. The nodal joint consists of a square hollow section, 50×50×3 SHS, fillet welded to a 10-mm thick plate, and subjected to cyclic bending stress. Fatigue analyses are carried out and the results are compared with the only available S-N design curve.
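
    To illustrate the kind of sensitivity studied here, the sketch below integrates a generic Paris-law crack-growth life from an undercut-sized initial flaw (this is a textbook fracture-mechanics calculation, not the BEASY boundary-element model; all constants are hypothetical).

      import numpy as np

      # Hypothetical Paris-law constants and geometry (not from the paper).
      C, m = 2.0e-12, 3.0      # da/dN = C * dK^m, a in m, dK in MPa*sqrt(m)
      Y = 1.12                 # geometry factor, held constant for simplicity
      dS = 80.0                # cyclic stress range at the weld toe, MPa

      def life(a0, af, n=100000):
          # N = integral of da / (C * dK(a)^m) from a0 to af (trapezoidal rule).
          a = np.linspace(a0, af, n)
          dK = Y * dS * np.sqrt(np.pi * a)
          f = 1.0 / (C * dK ** m)
          return np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(a))

      # A deeper undercut means a larger initial crack and a shorter life, and
      # a thin wall (small final crack size af) amplifies the relative loss.
      for a0_mm in [0.05, 0.1, 0.3]:
          print(f"a0 = {a0_mm:.2f} mm: N = {life(a0_mm * 1e-3, 3e-3):.2e} cycles")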

  11. Bayesian segmentation of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Mohammadpour, Adel; Féron, Olivier; Mohammad-Djafari, Ali

    2004-11-01

    In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on Hidden Markov Modeling (HMM) of the images with a common segmentation, or equivalently with common hidden classification label variables, which are modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.
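
    A toy version of the approach, shrunk to a few lines (known class means, fixed noise, small image; not the authors' implementation): a Gibbs sampler updates one label field shared by all spectral bands under a Potts prior and per-band Gaussian likelihoods.

      import numpy as np
      rng = np.random.default_rng(10)

      # Small "hyperspectral" cube: B bands, one common 2-D label field.
      H, W, B, K, beta = 40, 40, 4, 2, 1.0
      truth = (np.add.outer(np.arange(H), np.arange(W)) > 40).astype(int)
      means = np.array([[0.0] * B, [1.0] * B])              # class means per band
      img = means[truth] + rng.normal(0, 0.6, (H, W, B))    # noisy observations

      labels = rng.integers(0, K, (H, W))
      for sweep in range(30):                               # Gibbs sweeps
          for i in range(H):
              for j in range(W):
                  ll = -0.5 * np.sum((img[i, j] - means) ** 2 / 0.6**2, axis=1)
                  nb = [labels[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                        if 0 <= a < H and 0 <= b < W]
                  potts = beta * np.array([np.sum(np.array(nb) == k) for k in range(K)])
                  p = np.exp(ll + potts - (ll + potts).max())
                  labels[i, j] = rng.choice(K, p=p / p.sum())
      print("pixel accuracy:", np.mean(labels == truth))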

  12. Team performance and collective efficacy in the dynamic psychology of competitive team: a Bayesian network analysis.

    PubMed

    Fuster-Parra, P; García-Mas, A; Ponseti, F J; Leo, F M

    2015-04-01

    The purpose of this paper was to discover the relationships among 22 relevant psychological features in semi-professional football players in order to study team performance and collective efficacy via a Bayesian network (BN). The paper includes optimization of team performance and collective efficacy using the intercausal reasoning pattern, which constitutes a very common pattern in human reasoning. The BN is used to make inferences regarding our problem, yielding several conclusions, among them: maximizing team performance causes a decrease in collective efficacy, and when team performance reaches its minimum value it causes an increase in moderate/high values of collective efficacy. Similarly, we may instead optimize team collective efficacy. The BN also allows us to determine the features that have the strongest influence on performance and which have the strongest influence on collective efficacy. From the BN, two different coaching styles were differentiated, taking into account the local Markov property: training leadership and autocratic leadership.
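
    The intercausal pattern the paper relies on ("explaining away") can be demonstrated with a three-node toy network; the variables and probabilities below are invented for illustration and computed by exact enumeration rather than with the study's learned BN.

      import itertools

      # performance <- {talent, efficacy}; all variables binary, CPTs invented.
      p_talent, p_eff = 0.3, 0.5                      # P(talent=1), P(efficacy=1)
      p_perf = {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.5, (1, 1): 0.9}  # P(perf=1|t,e)

      def joint(t, e, perf):
          pt = p_talent if t else 1 - p_talent
          pe = p_eff if e else 1 - p_eff
          pp = p_perf[(t, e)] if perf else 1 - p_perf[(t, e)]
          return pt * pe * pp

      def p_eff_given(perf, talent=None):
          # P(efficacy=1 | performance=perf [, talent]) by exact enumeration.
          num = den = 0.0
          for t, e in itertools.product([0, 1], repeat=2):
              if talent is not None and t != talent:
                  continue
              w = joint(t, e, perf)
              den += w
              num += w * e
          return num / den

      # Conditioning on the alternative cause lowers belief in efficacy:
      print("P(eff | good perf)         =", round(p_eff_given(1), 3))      # ~0.714
      print("P(eff | good perf, talent) =", round(p_eff_given(1, 1), 3))   # ~0.643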

  13. Bayesian analysis of size-dependent overwinter mortality from size-frequency distributions.

    PubMed

    Carlson, Stephanie M; Kottas, Athanasios; Mangel, Marc

    2010-04-01

    Understanding the relationship between body size and mortality is an important problem in ecology. We introduce a novel Bayesian method that can be used to quantify this relationship when the only data available are size-frequency distributions of unmarked individuals measured at two successive time periods. The inverse Gaussian distribution provides a parametric form for the statistical model development, and we use Markov chain Monte Carlo methods to evaluate posterior distributions. We illustrate the method using data on threespine stickleback (Gasterosteus aculeatus) collected before and after the winter season in an Alaskan lake. Our method allows us to compare the intensity of size-biased mortality in different years. We discuss generalizations that include more complicated relationships between size and survival as well as time-series modeling.

  14. Bayesian analysis of the structural equation models with application to a longitudinal myopia trial.

    PubMed

    Wang, Yi-Fu; Fan, Tsai-Hung

    2012-01-30

    Myopia is becoming a significant public health problem, affecting more and more people. Studies indicate that there are two main factors, hereditary and environmental, suspected to have a strong impact on myopia. Motivated by the increase in the number of people affected by this problem, this paper focuses primarily on the utilization of mathematical methods to gain further insight into the relationship of these factors with myopia. Accordingly, utilizing multidimensional longitudinal myopia data with correlation between both eyes, we develop a Bayesian structural equation model including random effects. With the aid of the MCMC method, the model is capable of expressing the correlation between repeated measurements as well as the two-eye correlation and can be used to explore the relational structure among the variables in the model. We consider four observed factors, including intraocular pressure, anterior chamber depth, lens thickness, and axial length. The results indicate that the genetic effect has a much greater influence on myopia than the environmental effects.

  15. Nonparametric Bayesian Dictionary Learning for Analysis of Noisy and Incomplete Images

    PubMed Central

    Zhou, Mingyuan; Chen, Haojun; Paisley, John; Ren, Lu; Li, Lingbo; Xing, Zhengming; Dunson, David; Sapiro, Guillermo; Carin, Lawrence

    2013-01-01

    Nonparametric Bayesian methods are considered for recovery of imagery based upon compressive, incomplete, and/or noisy measurements. A truncated beta-Bernoulli process is employed to infer an appropriate dictionary for the data under test and also for image recovery. In the context of compressive sensing, significant improvements in image recovery are manifested using learned dictionaries, relative to using standard orthonormal image expansions. The compressive-measurement projections are also optimized for the learned dictionary. Additionally, we consider simpler (incomplete) measurements, defined by measuring a subset of image pixels, uniformly selected at random. Spatial interrelationships within imagery are exploited through use of the Dirichlet and probit stick-breaking processes. Several example results are presented, with comparisons to other methods in the literature. PMID:21693421

  16. Team performance and collective efficacy in the dynamic psychology of competitive team: a Bayesian network analysis.

    PubMed

    Fuster-Parra, P; García-Mas, A; Ponseti, F J; Leo, F M

    2015-04-01

    The purpose of this paper was to discover the relationships among 22 relevant psychological features in semi-professional football players in order to study team performance and collective efficacy via a Bayesian network (BN). The paper includes optimization of team performance and collective efficacy using the intercausal reasoning pattern, which constitutes a very common pattern in human reasoning. The BN is used to make inferences regarding our problem, yielding several conclusions, among them: maximizing team performance causes a decrease in collective efficacy, and when team performance reaches its minimum value it causes an increase in moderate/high values of collective efficacy. Similarly, we may instead optimize team collective efficacy. The BN also allows us to determine the features that have the strongest influence on performance and which have the strongest influence on collective efficacy. From the BN, two different coaching styles were differentiated, taking into account the local Markov property: training leadership and autocratic leadership. PMID:25546263

  17. Analysis of housing price by means of STAR models with neighbourhood effects: a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Beamonte, Asuncion; Gargallo, Pilar; Salvador, Manuel

    2010-06-01

    In this paper, we extend the Bayesian methodology introduced by Beamonte et al. (Stat Modelling 8:285-311, 2008) for the estimation and comparison of spatio-temporal autoregressive models (STAR) with neighbourhood effects, providing a more general treatment that uses larger and denser nets for the number of spatial and temporal influential neighbours and continuous distributions for their smoothing weights. This new treatment also reduces the computational time and memory requirements of the estimation algorithm in Beamonte et al. (Stat Modelling 8:285-311, 2008). The procedure is illustrated by an application to the Zaragoza (Spain) real estate market, improving the goodness of fit and the out-of-sample behaviour of the model thanks to a more flexible estimation of the neighbourhood parameters.

  18. Sizing aerosolized fractal nanoparticle aggregates through Bayesian analysis of wide-angle light scattering (WALS) data

    NASA Astrophysics Data System (ADS)

    Huber, Franz J. T.; Will, Stefan; Daun, Kyle J.

    2016-11-01

    Inferring the size distribution of aerosolized fractal aggregates from the angular distribution of elastically scattered light is a mathematically ill-posed problem. This paper presents a procedure for analyzing Wide-Angle Light Scattering (WALS) data using Bayesian inference. The outcome is probability densities for the recovered size distribution and aggregate morphology parameters. This technique is applied to both synthetic data and experimental data collected on soot-laden aerosols, using a measurement equation derived from Rayleigh-Debye-Gans fractal aggregate (RDG-FA) theory. In the case of experimental data, the recovered aggregate size distribution parameters are generally consistent with TEM-derived values, but the estimates are affected by the well-known limited accuracy of RDG-FA theory. Finally, we show how this bias could potentially be avoided using the approximation error technique.

  19. A Bayesian Approach to a Multiple-Group Latent Class-Profile Analysis: The Timing of Drinking Onset and Subsequent Drinking Behaviors among U.S. Adolescents

    ERIC Educational Resources Information Center

    Chung, Hwan; Anthony, James C.

    2013-01-01

    This article presents a multiple-group latent class-profile analysis (LCPA) by taking a Bayesian approach in which a Markov chain Monte Carlo simulation is employed to achieve more robust estimates for latent growth patterns. This article describes and addresses a label-switching problem that involves the LCPA likelihood function, which has…

  20. What a Bayesian Analysis Can Do for SLA: New Tools for the Sociolinguistic Study of Subject Expression in L2 Spanish

    ERIC Educational Resources Information Center

    Gudmestad, Aarnes; House, Leanna; Geeslin, Kimberly L.

    2013-01-01

    This study constitutes the first statistical analysis to employ a Bayesian multinomial probit model in the investigation of subject expression in first and second language (L2) Spanish. The study analyzes the use of third-person subject-expression forms and demonstrates that the following variables are important for subject expression:…

  1. Bayesian Geostatistical Analysis and Ecoclimatic Determinants of Corynebacterium pseudotuberculosis Infection among Horses.

    PubMed

    Boysen, Courtney; Davis, Elizabeth G; Beard, Laurie A; Lubbers, Brian V; Raghavan, Ram K

    2015-01-01

    Kansas witnessed an unprecedented outbreak of Corynebacterium pseudotuberculosis infection among horses, a disease commonly referred to as pigeon fever, during fall 2012. Bayesian geostatistical models were developed to identify key environmental and climatic risk factors associated with C. pseudotuberculosis infection in horses. Positive infection status among horses (cases) was determined by positive test results for characteristic abscess formation, positive bacterial culture on purulent material obtained from a lanced abscess (n = 82), or positive serologic evidence of exposure to the organism (≥1:512) (n = 11). Horses negative for these tests (n = 172) (controls) were considered free of infection. Information pertaining to horse demographics and stabled location was obtained through review of medical records and/or contact with horse owners via telephone. Covariate information for environmental and climatic determinants was obtained from USDA (soil attributes), USGS (land use/land cover), and NASA MODIS and NASA Prediction of Worldwide Renewable Resources (climate). Candidate covariates were screened using univariate regression models followed by Bayesian geostatistical models with and without covariates. The best performing model indicated a protective effect for higher soil moisture content (OR = 0.53, 95% CrI = 0.25, 0.71), and detrimental effects for higher land surface temperature (≥35°C) (OR = 2.81, 95% CrI = 2.21, 3.85) and habitat fragmentation (OR = 1.31, 95% CrI = 1.27, 2.22) on C. pseudotuberculosis infection status in horses, while age, gender and breed had no effect. The preventative and ecoclimatic significance of these findings is discussed.

  2. Analysis of Energy Spectrum with Low Photon Counts via Bayesian Posterior Simulation

    NASA Astrophysics Data System (ADS)

    van Dyk, David A.; Protassov, Rostislav; Kashyap, Vinay L.; Siemiginowska, Aneta; Connors, Alanna

    1999-04-01

    Recently Bayesian methods have grown rapidly in popularity in many scientific disciplines as several computationally intensive statistical algorithms have become feasible with modern computer power. In this paper, we demonstrate how we have employed these state-of-the-art techniques (e.g., Gibbs sampler and Metropolis-Hastings) to fit today's high-quality, high-resolution astrophysical spectral data. These algorithms are very flexible and can be used to fit models that account for the highly hierarchical structure in the collection of high-quality spectra and thus can keep pace with the accelerating progress of new telescope designs. We explicitly model photon arrivals as a Poisson process and, thus, have no difficulty with high-resolution, low-count X-ray and gamma-ray data. These methods will be useful not only for the soon-to-be-launched Chandra X-ray observatory but also for such new-generation telescopes as XMM, Constellation X, and GLAST. We also explicitly incorporate the instrument response (e.g. via a response matrix and effective area vector), plus background contamination of the data. In particular, we appropriately model the background as the realization of a second Poisson process, thereby eliminating the need to directly subtract off the background counts and the rather embarrassing problem of negative photon counts. The source energy spectrum is modeled as a mixture of a Generalized Linear Model, which accounts for the continuum plus absorption, and several (Gaussian) line profiles. Generalized Linear Models are the standard method for incorporating covariate information (as in regression) into non-Gaussian models and are thus an obvious but innovative choice in this setting. Using several examples, we illustrate how Bayesian posterior sampling can be used to compute point (i.e., "best") estimates of the various model parameters as well as compute error bars on these estimates and construct statistical tests.
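
    The key modelling move, treating background as a second Poisson process rather than subtracting it, can be shown in a few lines (one spectral channel, flat priors, invented counts; the paper's Gibbs/Metropolis machinery is replaced by a brute-force grid).

      import numpy as np

      # "On/off" Poisson modelling: the off-region exposure is r times the on-region.
      on, off, r = 23, 40, 4.0

      # Grid posterior over source rate s and background rate b, flat priors:
      # on ~ Poisson(s + b), off ~ Poisson(r * b).
      s = np.linspace(0.01, 40, 400)[:, None]
      b = np.linspace(0.01, 20, 200)[None, :]
      logp = on * np.log(s + b) - (s + b) + off * np.log(r * b) - r * b
      p = np.exp(logp - logp.max())
      post_s = p.sum(axis=1)                  # marginalise the background rate out
      post_s /= post_s.sum()
      sv = s[:, 0]
      print("posterior mean source counts:", round(float((sv * post_s).sum()), 2))
      print("P(source > 10 | data):", round(float(post_s[sv > 10].sum()), 3))

    The posterior for the source rate is supported on non-negative values by construction, which is exactly what naive background subtraction fails to guarantee.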

  3. Bayesian Geostatistical Analysis and Ecoclimatic Determinants of Corynebacterium pseudotuberculosis Infection among Horses

    PubMed Central

    Boysen, Courtney; Davis, Elizabeth G.; Beard, Laurie A.; Lubbers, Brian V.; Raghavan, Ram K.

    2015-01-01

    Kansas witnessed an unprecedented outbreak of Corynebacterium pseudotuberculosis infection among horses, a disease commonly referred to as pigeon fever, during fall 2012. Bayesian geostatistical models were developed to identify key environmental and climatic risk factors associated with C. pseudotuberculosis infection in horses. Positive infection status among horses (cases) was determined by positive test results for characteristic abscess formation, positive bacterial culture on purulent material obtained from a lanced abscess (n = 82), or positive serologic evidence of exposure to the organism (≥1:512) (n = 11). Horses negative for these tests (n = 172) (controls) were considered free of infection. Information pertaining to horse demographics and stabled location was obtained through review of medical records and/or contact with horse owners via telephone. Covariate information for environmental and climatic determinants was obtained from USDA (soil attributes), USGS (land use/land cover), and NASA MODIS and NASA Prediction of Worldwide Renewable Resources (climate). Candidate covariates were screened using univariate regression models followed by Bayesian geostatistical models with and without covariates. The best performing model indicated a protective effect for higher soil moisture content (OR = 0.53, 95% CrI = 0.25, 0.71), and detrimental effects for higher land surface temperature (≥35°C) (OR = 2.81, 95% CrI = 2.21, 3.85) and habitat fragmentation (OR = 1.31, 95% CrI = 1.27, 2.22) on C. pseudotuberculosis infection status in horses, while age, gender and breed had no effect. The preventative and ecoclimatic significance of these findings is discussed. PMID:26473728

  4. Rapid sphere sizing using a Bayesian analysis of reciprocal space imaging data.

    PubMed

    Ziovas, K; Sederman, A J; Gehin-Delval, C; Gunes, D Z; Hughes, E; Mantle, M D

    2016-01-15

    Dispersed systems are important in many applications in a wide range of industries, such as the petroleum, pharmaceutical and food industries. The ability to control and non-invasively measure the physical properties of these systems, such as the dispersed phase size distribution, is therefore of significant interest, in particular for concentrated systems, where microscopy or scattering techniques may not apply or may yield output of very limited quality. In this paper we show how reciprocal space data acquired using both 1D magnetic resonance imaging (MRI) and 2D X-ray micro-tomography (X-ray μCT) can be analysed, using a Bayesian statistical model, to extract the sphere size distribution (SSD) from model sphere systems and dispersed food foam samples. Glass spheres in xanthan gels were used as model samples with sphere diameters (D) in the range 45 μm ⩽ D ⩽ 850 μm. The results show that the SSD was successfully estimated from both the NMR and X-ray μCT data with a good degree of accuracy for the entire range of glass spheres, in times as short as two seconds. After validating the technique using model samples, the Bayesian sphere sizing method was successfully applied to air/water foam samples generated using a microfluidics apparatus with 160 μm ⩽ D ⩽ 400 μm. The effect of different experimental parameters, such as the standard deviation of the bubble size distribution and the volume fraction of the dispersed phase, is discussed. PMID:26439290

  5. Bayesian analysis of multi-state data with individual covariates for estimating genetic effects on demography

    USGS Publications Warehouse

    Converse, Sarah J.; Royle, J. Andrew; Urbanek, Richard P.

    2012-01-01

    Inbreeding depression is frequently a concern of managers interested in restoring endangered species. Decisions to reduce the potential for inbreeding depression by balancing genotypic contributions to reintroduced populations may exact a cost on long-term demographic performance of the population if those decisions result in reduced numbers of animals released and/or restriction of particularly successful genotypes (i.e., heritable traits of particular family lines). As part of an effort to restore a migratory flock of Whooping Cranes (Grus americana) to eastern North America using the offspring of captive breeders, we obtained a unique dataset which includes post-release mark-recapture data, as well as the pedigree of each released individual. We developed a Bayesian formulation of a multi-state model to analyze radio-telemetry, band-resight, and dead recovery data on reintroduced individuals, in order to track survival and breeding state transitions. We used studbook-based individual covariates to examine the comparative evidence for and degree of effects of inbreeding, genotype, and genotype quality on post-release survival of reintroduced individuals. We demonstrate implementation of the Bayesian multi-state model, which allows for the integration of imperfect detection, multiple data types, random effects, and individual- and time-dependent covariates. Our results provide only weak evidence for an effect of the quality of an individual's genotype in captivity on post-release survival as well as for an effect of inbreeding on post-release survival. We plan to integrate our results into a decision-analytic modeling framework that can explicitly examine tradeoffs between the effects of inbreeding and the effects of genotype and demographic stochasticity on population establishment.

  6. Bayesian Geostatistical Analysis and Ecoclimatic Determinants of Corynebacterium pseudotuberculosis Infection among Horses.

    PubMed

    Boysen, Courtney; Davis, Elizabeth G; Beard, Laurie A; Lubbers, Brian V; Raghavan, Ram K

    2015-01-01

    Kansas witnessed an unprecedented outbreak of Corynebacterium pseudotuberculosis infection among horses, a disease commonly referred to as pigeon fever, during fall 2012. Bayesian geostatistical models were developed to identify key environmental and climatic risk factors associated with C. pseudotuberculosis infection in horses. Positive infection status among horses (cases) was determined by positive test results for characteristic abscess formation, positive bacterial culture on purulent material obtained from a lanced abscess (n = 82), or positive serologic evidence of exposure to the organism (≥1:512) (n = 11). Horses negative for these tests (n = 172) (controls) were considered free of infection. Information pertaining to horse demographics and stabled location was obtained through review of medical records and/or contact with horse owners via telephone. Covariate information for environmental and climatic determinants was obtained from USDA (soil attributes), USGS (land use/land cover), and NASA MODIS and NASA Prediction of Worldwide Renewable Resources (climate). Candidate covariates were screened using univariate regression models followed by Bayesian geostatistical models with and without covariates. The best performing model indicated a protective effect for higher soil moisture content (OR = 0.53, 95% CrI = 0.25, 0.71), and detrimental effects for higher land surface temperature (≥35°C) (OR = 2.81, 95% CrI = 2.21, 3.85) and habitat fragmentation (OR = 1.31, 95% CrI = 1.27, 2.22) on C. pseudotuberculosis infection status in horses, while age, gender and breed had no effect. The preventative and ecoclimatic significance of these findings is discussed. PMID:26473728

  7. Analysis of neural interaction in motor cortex during reach-to-grasp task based on Dynamic Bayesian Networks.

    PubMed

    Sang, Dong; Lv, Bin; He, Huiguang; He, Jiping; Wang, Feiyue

    2010-01-01

    In this work, we analyzed neural interactions based on data recorded from the motor cortex of a monkey trained to complete multi-target reach-to-grasp tasks. Dynamic Bayesian Networks (DBNs), a recently proven effective tool, were applied to model and infer dependence interactions between neurons. The resulting networks of neural interactions, corresponding to tasks with different directions and orientations, indicated that target information is not encoded by neuronal networks in a simple way. We also explored the difference in neural interactions between the delayed period and the peri-movement period of the reach-to-grasp task. We found that the motor control process always led to more complex neural interaction networks than the planning process. PMID:21096882

  8. Advances in computer simulation of genome evolution: toward more realistic evolutionary genomics analysis by approximate bayesian computation.

    PubMed

    Arenas, Miguel

    2015-04-01

    NGS technologies enable the fast and cheap generation of genomic data. Nevertheless, ancestral genome inference is not so straightforward, owing to complex evolutionary processes acting on this material such as inversions, translocations, and other genome rearrangements that, in addition to their implicit complexity, can co-occur and confound ancestral inferences. Models of genome evolution that accommodate such complex genomic events have recently begun to emerge. This letter explores these novel evolutionary models and proposes their incorporation into robust statistical approaches based on computer simulations, such as approximate Bayesian computation, that may produce a more realistic evolutionary analysis of genomic data. Advantages and pitfalls in using these analytical methods are discussed. Potential applications of these ancestral genomic inferences are also pointed out.
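
    A minimal rejection-ABC sketch of the kind of analysis advocated (the simulator, summary statistic, and all numbers are invented for illustration): draw a parameter from the prior, simulate data, keep the draw if the simulated summary lands close to the observed one.

      import numpy as np
      rng = np.random.default_rng(13)

      # Toy simulator: Poisson numbers of rearrangement events per lineage,
      # reduced to a single summary statistic (the mean count).
      def simulate_summary(rate, n_lineages=50, t=10.0):
          events = rng.poisson(rate * t, n_lineages)
          return events.mean()

      obs = 3.1                                   # "observed" mean count, invented
      eps, draws = 0.2, []
      while len(draws) < 2000:
          rate = rng.uniform(0, 1)                # prior on the rearrangement rate
          if abs(simulate_summary(rate) - obs) < eps:
              draws.append(rate)                  # accepted draws form the posterior
      post = np.array(draws)
      print(f"ABC posterior for rate: mean {post.mean():.3f}, "
            f"95% CI [{np.quantile(post, 0.025):.3f}, {np.quantile(post, 0.975):.3f}]")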

  9. Testing the Storm et al. (2010) meta-analysis using Bayesian and frequentist approaches: reply to Rouder et al. (2013).

    PubMed

    Storm, Lance; Tressoldi, Patrizio E; Utts, Jessica

    2013-01-01

    Rouder, Morey, and Province (2013) stated that (a) the evidence-based case for psi in Storm, Tressoldi, and Di Risio's (2010) meta-analysis is supported only by a number of studies that used manual randomization, and (b) when these studies are excluded so that only investigations using automatic randomization are evaluated (and some additional studies previously omitted by Storm et al., 2010, are included), the evidence for psi is "unpersuasive." Rouder et al. used a Bayesian approach, and we adopted the same methodology, finding that our case is upheld. Because of recent updates and corrections, we reassessed the free-response databases of Storm et al. using a frequentist approach. We discuss and critique the assumptions and findings of Rouder et al.

  10. Dynamic analysis for planar beam with clearance joint