Fundamentals and Recent Developments in Approximate Bayesian Computation
Lintusaari, Jarno; Gutmann, Michael U.; Dutta, Ritabrata; Kaski, Samuel; Corander, Jukka
2017-01-01
Bayesian inference plays an important role in phylogenetics, evolutionary biology, and in many other branches of science. It provides a principled framework for dealing with uncertainty and quantifying how it changes in the light of new evidence. For many complex models and inference problems, however, only approximate quantitative answers are obtainable. Approximate Bayesian computation (ABC) refers to a family of algorithms for approximate inference that makes a minimal set of assumptions by only requiring that sampling from a model is possible. We explain here the fundamentals of ABC, review the classical algorithms, and highlight recent developments. [ABC; approximate Bayesian computation; Bayesian inference; likelihood-free inference; phylogenetics; simulator-based models; stochastic simulation models; tree-based models.]
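The rejection algorithm at the heart of ABC is compact enough to sketch directly. The following minimal Python example is illustrative only (the Gaussian toy simulator, sample-mean summary, uniform prior and tolerance are assumptions, not taken from the paper): draw a parameter from the prior, simulate data, and keep the draw if its summary statistic lands within a tolerance of the observed one.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=100):
    """Toy simulator: Gaussian data with unknown mean theta."""
    return rng.normal(theta, 1.0, size=n)

def summary(x):
    """Summary statistic: the sample mean."""
    return x.mean()

observed = simulate(2.0)   # "observed" data with a known true parameter
s_obs = summary(observed)

def abc_rejection(s_obs, n_draws=50_000, tol=0.05):
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-10, 10)      # draw from the prior
        s_sim = summary(simulate(theta))  # simulate and summarize
        if abs(s_sim - s_obs) < tol:      # keep draws close to the data
            accepted.append(theta)
    return np.array(accepted)

posterior = abc_rejection(s_obs)
print(posterior.mean(), posterior.std())  # approximate posterior mean and sd
```

Shrinking the tolerance tightens the approximation at the cost of more rejected simulations, which is the trade-off many of the later entries in this list try to improve on.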
Mertens, Ulf Kai; Voss, Andreas; Radev, Stefan
2018-01-01
We give an overview of the basic principles of approximate Bayesian computation (ABC), a class of stochastic methods that enable flexible and likelihood-free model comparison and parameter estimation. Our new open-source software called ABrox is used to illustrate ABC for model comparison on two prominent statistical tests, the two-sample t-test and the Levene test. We further highlight the flexibility of ABC compared to classical Bayesian hypothesis testing by computing an approximate Bayes factor for two multinomial processing tree models. Last but not least, throughout the paper, we introduce ABrox using the accompanying graphical user interface.
al3c: high-performance software for parameter inference using Approximate Bayesian Computation.
Stram, Alexander H; Marjoram, Paul; Chen, Gary K
2015-11-01
The development of Approximate Bayesian Computation (ABC) algorithms for parameter inference which are both computationally efficient and scalable in parallel computing environments is an important area of research. Monte Carlo rejection sampling, a fundamental component of ABC algorithms, is trivial to distribute over multiple processors but is inherently inefficient. While the development of algorithms such as ABC Sequential Monte Carlo (ABC-SMC) helps address the inherent inefficiencies of rejection sampling, such approaches are not as easily scaled on multiple processors. As a result, current Bayesian inference software offerings that use ABC-SMC lack the ability to scale in parallel computing environments. We present al3c, a C++ framework for implementing ABC-SMC in parallel. By requiring only that users define essential functions such as the simulation model and prior distribution function, al3c abstracts the user from both the complexities of parallel programming and the details of the ABC-SMC algorithm. By using the al3c framework, the user is able to scale the ABC-SMC algorithm in parallel computing environments for his or her specific application, with minimal programming overhead. al3c is offered as a static binary for Linux and OS-X computing environments. The user completes an XML configuration file and C++ plug-in template for the specific application, which are used by al3c to obtain the desired results. Users can download the static binaries, source code, reference documentation and examples (including those in this article) by visiting https://github.com/ahstram/al3c.
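For orientation, here is a schematic Python sketch of the ABC-SMC scheme that al3c implements in C++ (the toy Gaussian model, tolerance schedule, and Gaussian perturbation kernel are assumptions; the weighting follows the standard sequential importance-weighting scheme). The proposal-and-check loop inside each generation is the part that distributes naturally over processors.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta):
    """Toy simulator returning a single summary statistic."""
    return rng.normal(theta, 1.0, size=50).mean()

s_obs = 2.0

def prior_pdf(t):
    return 1.0 / 20 if -10 <= t <= 10 else 0.0  # Uniform(-10, 10) prior

def abc_smc(tolerances=(1.0, 0.5, 0.1), n_particles=1000, sigma=0.5):
    # Generation 0: plain rejection sampling from the prior.
    particles = []
    while len(particles) < n_particles:
        theta = rng.uniform(-10, 10)
        if abs(simulate(theta) - s_obs) < tolerances[0]:
            particles.append(theta)
    particles = np.array(particles)
    weights = np.full(n_particles, 1.0 / n_particles)

    for eps in tolerances[1:]:
        new_particles, new_weights = [], []
        while len(new_particles) < n_particles:
            # Resample a particle and perturb it (this loop parallelizes).
            theta = rng.choice(particles, p=weights) + rng.normal(0, sigma)
            if prior_pdf(theta) == 0 or abs(simulate(theta) - s_obs) >= eps:
                continue
            # Importance weight corrects for the resample-and-perturb proposal.
            kernel = np.exp(-(particles - theta) ** 2 / (2 * sigma**2))
            new_particles.append(theta)
            new_weights.append(prior_pdf(theta) / np.sum(weights * kernel))
        particles = np.array(new_particles)
        weights = np.array(new_weights)
        weights /= weights.sum()
    return particles, weights

particles, weights = abc_smc()
print(np.average(particles, weights=weights))  # weighted posterior mean
```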
cosmoabc: Likelihood-free inference for cosmology
NASA Astrophysics Data System (ADS)
Ishida, Emille E. O.; Vitenti, Sandro D. P.; Penna-Lima, Mariana; Trindade, Arlindo M. M.; Cisewski, Jessi; de Souza, Rafael; Cameron, Ewan; Busti, Vinicius C.
2015-05-01
Approximate Bayesian Computation (ABC) enables parameter inference for complex physical systems in cases where the true likelihood function is unknown, unavailable, or computationally too expensive. It relies on the forward simulation of mock data and comparison between observed and synthetic catalogs. cosmoabc is a Python ABC sampler featuring a Population Monte Carlo variation of the original ABC algorithm, which uses an adaptive importance sampling scheme. The code can be coupled to an external simulator to allow incorporation of arbitrary distance and prior functions. When coupled with the numcosmo library, it has been used to estimate posterior probability distributions over cosmological parameters based on measurements of galaxy cluster number counts without computing the likelihood function.
Parameter Estimation for a Turbulent Buoyant Jet Using Approximate Bayesian Computation
NASA Astrophysics Data System (ADS)
Christopher, Jason D.; Wimer, Nicholas T.; Hayden, Torrey R. S.; Lapointe, Caelan; Grooms, Ian; Rieker, Gregory B.; Hamlington, Peter E.
2016-11-01
Approximate Bayesian Computation (ABC) is a powerful tool that allows sparse experimental or other "truth" data to be used for the prediction of unknown model parameters in numerical simulations of real-world engineering systems. In this presentation, we introduce the ABC approach and then use ABC to predict unknown inflow conditions in simulations of a two-dimensional (2D) turbulent, high-temperature buoyant jet. For this test case, truth data are obtained from a simulation with known boundary conditions and problem parameters. Using spatially-sparse temperature statistics from the 2D buoyant jet truth simulation, we show that the ABC method provides accurate predictions of the true jet inflow temperature. The success of the ABC approach in the present test suggests that ABC is a useful and versatile tool for engineering fluid dynamics research.
NASA Astrophysics Data System (ADS)
Ben Abdessalem, Anis; Dervilis, Nikolaos; Wagg, David; Worden, Keith
2018-01-01
This paper will introduce the use of the approximate Bayesian computation (ABC) algorithm for model selection and parameter estimation in structural dynamics. ABC is a likelihood-free method typically used when the likelihood function is either intractable or cannot be approached in a closed form. To circumvent the evaluation of the likelihood function, simulation from a forward model is at the core of the ABC algorithm. The algorithm offers the possibility to use different metrics and summary statistics representative of the data to carry out Bayesian inference. The efficacy of the algorithm in structural dynamics is demonstrated through three different illustrative examples of nonlinear system identification: cubic and cubic-quintic models, the Bouc-Wen model and the Duffing oscillator. The obtained results suggest that ABC is a promising alternative to deal with model selection and parameter estimation issues, specifically for systems with complex behaviours.
Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A; Robert, Christian P; Marin, Jean-Michel; Balding, David J; Guillemaud, Thomas; Estoup, Arnaud
2008-12-01
Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and provides its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. The software DIY ABC is freely available at http://www.montpellier.inra.fr/CBGP/diyabc.
Parameter Estimation for a Pulsating Turbulent Buoyant Jet Using Approximate Bayesian Computation
NASA Astrophysics Data System (ADS)
Christopher, Jason; Wimer, Nicholas; Lapointe, Caelan; Hayden, Torrey; Grooms, Ian; Rieker, Greg; Hamlington, Peter
2017-11-01
Approximate Bayesian Computation (ABC) is a powerful tool that allows sparse experimental or other "truth" data to be used for the prediction of unknown parameters, such as flow properties and boundary conditions, in numerical simulations of real-world engineering systems. Here we introduce the ABC approach and then use ABC to predict unknown inflow conditions in simulations of a two-dimensional (2D) turbulent, high-temperature buoyant jet. For this test case, truth data are obtained from a direct numerical simulation (DNS) with known boundary conditions and problem parameters, while the ABC procedure utilizes lower fidelity large eddy simulations. Using spatially-sparse statistics from the 2D buoyant jet DNS, we show that the ABC method provides accurate predictions of true jet inflow parameters. The success of the ABC approach in the present test suggests that ABC is a useful and versatile tool for predicting flow information, such as boundary conditions, that can be difficult to determine experimentally.
Approximate Bayesian computation in large-scale structure: constraining the galaxy-halo connection
NASA Astrophysics Data System (ADS)
Hahn, ChangHoon; Vakili, Mohammadjavad; Walsh, Kilian; Hearin, Andrew P.; Hogg, David W.; Campbell, Duncan
2017-08-01
Standard approaches to Bayesian parameter inference in large-scale structure assume a Gaussian functional form (chi-squared form) for the likelihood. This assumption, in detail, cannot be correct. Likelihood-free inference methods such as approximate Bayesian computation (ABC) relax these restrictions and make inference possible without making any assumptions on the likelihood. Instead, ABC relies on a forward generative model of the data and a metric for measuring the distance between the model and data. In this work, we demonstrate that ABC is feasible for LSS parameter inference by using it to constrain parameters of the halo occupation distribution (HOD) model for populating dark matter haloes with galaxies. Using a specific implementation of ABC supplemented with population Monte Carlo importance sampling, a generative forward model using the HOD and a distance metric based on the galaxy number density, two-point correlation function and galaxy group multiplicity function, we constrain the HOD parameters of a mock observation generated from selected 'true' HOD parameters. The parameter constraints we obtain from ABC are consistent with the 'true' HOD parameters, demonstrating that ABC can be reliably used for parameter inference in LSS. Furthermore, we compare our ABC constraints to constraints we obtain using a pseudo-likelihood function of Gaussian form with MCMC and find consistent HOD parameter constraints. Ultimately, our results suggest that ABC can and should be applied in parameter inference for LSS analyses.
Bayesian experimental design for models with intractable likelihoods.
Drovandi, Christopher C; Pettitt, Anthony N
2013-12-01
In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables.
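The reference-table trick described above, forming the ABC posterior for many hypothetical datasets from one batch of pre-computed simulations per design, can be sketched as follows. Everything concrete here (the toy simulator, the design's effect on noise, the tolerance) is an illustrative assumption, not the paper's epidemic models.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(theta, design):
    """Toy model: a better design value shrinks the observation noise."""
    return rng.normal(theta, 1.0 / design, size=10).mean()

prior_draws = rng.uniform(0, 5, size=20_000)

def expected_utility(design, n_outer=200, eps=0.05):
    # One pre-computed reference table per design, reused for every
    # hypothetical dataset, so no likelihood evaluations are needed.
    table = np.array([simulate(t, design) for t in prior_draws])
    utils = []
    for _ in range(n_outer):
        theta_true = rng.uniform(0, 5)        # hypothetical truth
        s_obs = simulate(theta_true, design)  # hypothetical experiment
        accepted = prior_draws[np.abs(table - s_obs) < eps]
        if len(accepted) > 1:
            utils.append(1.0 / accepted.var())  # ABC posterior precision
    return np.mean(utils)

for design in (0.5, 1.0, 2.0):
    print(design, expected_utility(design))  # choose the utility-maximizing design
```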
Taking error into account when fitting models using Approximate Bayesian Computation.
van der Vaart, Elske; Prangle, Dennis; Sibly, Richard M
2018-03-01
Stochastic computer simulations are often the only practical way of answering questions relating to ecological management. However, due to their complexity, such models are difficult to calibrate and evaluate. Approximate Bayesian Computation (ABC) offers an increasingly popular approach to this problem, widely applied across a variety of fields. However, ensuring the accuracy of ABC's estimates has been difficult. Here, we obtain more accurate estimates by incorporating estimation of error into the ABC protocol. We show how this can be done where the data consist of repeated measures of the same quantity and errors may be assumed to be normally distributed and independent. We then derive the correct acceptance probabilities for a probabilistic ABC algorithm, and update the coverage test with which accuracy is assessed. We apply this method, which we call error-calibrated ABC, to a toy example and a realistic 14-parameter simulation model of earthworms that is used in environmental risk assessment. A comparison with exact methods and the diagnostic coverage test shows that our approach improves estimation of parameter values and their credible intervals for both models.
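The flavor of a probabilistic acceptance step can be conveyed with a generic sketch (this is not the authors' exact derivation; the Gaussian error model with known standard deviation, the toy simulator and the prior are assumptions): instead of a hard tolerance, each draw is accepted with a probability given by the error density of the mismatch.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

def simulate(theta):
    return rng.normal(theta, 1.0, size=20).mean()

s_obs = 2.0       # observed summary statistic
sigma_err = 0.2   # assumed known standard deviation of the measurement error

accepted = []
for _ in range(50_000):
    theta = rng.uniform(-5, 5)
    s_sim = simulate(theta)
    # Accept with probability equal to the Gaussian error density of the
    # mismatch, scaled so that a perfect match is always accepted.
    p_accept = norm.pdf(s_sim - s_obs, scale=sigma_err) / norm.pdf(0, scale=sigma_err)
    if rng.uniform() < p_accept:
        accepted.append(theta)

print(np.mean(accepted), np.std(accepted))
```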
Kwon, Deukwoo; Reis, Isildinha M
2015-08-12
When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions. The corresponding average relative error (ARE) approaches zero as sample size increases. In data generated from the normal distribution, our ABC method performs well, although the Wan et al. method is best for estimating the standard deviation under the normal distribution. In the estimation of the mean, our ABC method is best regardless of the assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can be applied using other reported summary statistics such as the posterior mean and 95% credible interval when Bayesian analysis has been employed.
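A sketch of how such a simulation-based estimate might look when a study reports only a median and quartiles. The reported values, sample size, tolerance and the normal sampling model are all illustrative assumptions; the paper itself considers several reporting scenarios and distributions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Reported summaries from a hypothetical study: median, 1st and 3rd quartile.
obs = np.array([10.0, 7.0, 14.0])
n_study = 120  # reported sample size

def summaries(x):
    return np.percentile(x, [50, 25, 75])

accepted = []
for _ in range(100_000):
    mu = rng.uniform(0, 20)        # candidate mean
    sigma = rng.uniform(0.1, 10)   # candidate standard deviation
    x = rng.normal(mu, sigma, size=n_study)  # assumed parametric family
    if np.linalg.norm(summaries(x) - obs) < 1.0:  # accept close matches
        accepted.append((mu, sigma))

accepted = np.array(accepted)
print("mean estimate:", accepted[:, 0].mean())
print("sd estimate:  ", accepted[:, 1].mean())
```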
Approximate Bayesian computation for spatial SEIR(S) epidemic models.
Brown, Grant D; Porter, Aaron T; Oleson, Jacob J; Hinman, Jessica A
2018-02-01
Approximate Bayesian Computation (ABC) provides an attractive approach to estimation in complex Bayesian inferential problems for which evaluation of the kernel of the posterior distribution is impossible or computationally expensive. These highly parallelizable techniques have been successfully applied to many fields, particularly in cases where more traditional approaches such as Markov chain Monte Carlo (MCMC) are impractical. In this work, we demonstrate the application of approximate Bayesian inference to spatially heterogeneous Susceptible-Exposed-Infectious-Removed (SEIR) stochastic epidemic models. Although these models have a tractable posterior distribution, MCMC techniques nevertheless become computationally infeasible for moderately sized problems. We discuss the practical implementation of these techniques via the open source ABSEIR package for R. The performance of ABC relative to traditional MCMC methods in a small problem is explored under simulation, as well as in the spatially heterogeneous context of the 2014 epidemic of Chikungunya in the Americas.
Approximate Bayesian computation for forward modeling in cosmology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akeret, Joël; Refregier, Alexandre; Amara, Adam
Bayesian inference is often used in cosmology and astrophysics to derive constraints on model parameters from observations. This approach relies on the ability to compute the likelihood of the data given a choice of model parameters. In many practical situations, however, the likelihood function may be unavailable or intractable due to non-Gaussian errors, non-linear measurement processes, or complex data formats such as catalogs and maps. In these cases, the simulation of mock data sets can often be made through forward modeling. We discuss how Approximate Bayesian Computation (ABC) can be used in these cases to derive an approximation to the posterior constraints using simulated data sets. This technique relies on the sampling of the parameter set, a distance metric to quantify the difference between the observation and the simulations, and summary statistics to compress the information in the data. We first review the principles of ABC and discuss its implementation using a Population Monte-Carlo (PMC) algorithm and the Mahalanobis distance metric. We test the performance of the implementation using a Gaussian toy model. We then apply the ABC technique to the practical case of the calibration of image simulations for wide field cosmological surveys. We find that the ABC analysis is able to provide reliable parameter constraints for this problem and is therefore a promising technique for other applications in cosmology and astrophysics. Our implementation of the ABC PMC method is made available via a public code release.
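The Mahalanobis distance mentioned above compares multi-dimensional summary vectors while accounting for their scales and correlations. A minimal sketch, with the covariance estimated from a pilot batch of prior-predictive simulations (one plausible choice, assumed here rather than taken from the paper):

```python
import numpy as np

def mahalanobis(s_sim, s_obs, cov):
    """Mahalanobis distance between simulated and observed summary vectors."""
    diff = s_sim - s_obs
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

# Covariance estimated from a pilot set of prior-predictive simulations.
pilot = np.random.default_rng(5).normal(size=(500, 3))
cov = np.cov(pilot, rowvar=False)
print(mahalanobis(pilot[0], pilot[1], cov))
```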
COSMOABC: Likelihood-free inference via Population Monte Carlo Approximate Bayesian Computation
NASA Astrophysics Data System (ADS)
Ishida, E. E. O.; Vitenti, S. D. P.; Penna-Lima, M.; Cisewski, J.; de Souza, R. S.; Trindade, A. M. M.; Cameron, E.; Busti, V. C.; COIN Collaboration
2015-11-01
Approximate Bayesian Computation (ABC) enables parameter inference for complex physical systems in cases where the true likelihood function is unknown, unavailable, or computationally too expensive. It relies on the forward simulation of mock data and comparison between observed and synthetic catalogues. Here we present COSMOABC, a Python ABC sampler featuring a Population Monte Carlo variation of the original ABC algorithm, which uses an adaptive importance sampling scheme. The code is very flexible and can be easily coupled to an external simulator, while allowing the user to incorporate arbitrary distance and prior functions. As an example of practical application, we coupled COSMOABC with the NUMCOSMO library and demonstrate how it can be used to estimate posterior probability distributions over cosmological parameters based on measurements of galaxy cluster number counts without computing the likelihood function. COSMOABC is published under the GPLv3 license on PyPI and GitHub and documentation is available at http://goo.gl/SmB8EX.
NASA Astrophysics Data System (ADS)
Kopka, P.; Wawrzynczak, A.; Borysiewicz, M.
2015-09-01
In many areas of application, a central problem is the solution of an inverse problem, in particular the estimation of the unknown model parameters needed to model the underlying dynamics of a physical system precisely. In this situation, Bayesian inference is a powerful tool for combining observed data with prior knowledge to obtain the probability distribution of the searched parameters. We have applied the methodology known as Sequential Approximate Bayesian Computation (S-ABC) to the problem of tracing an atmospheric contaminant source. ABC is a technique commonly used in the Bayesian analysis of complex models and dynamic systems, and sequential methods can significantly increase the efficiency of ABC. In the presented algorithm, the input data are the on-line arriving concentrations of the released substance registered by a distributed sensor network from the OVER-LAND ATMOSPHERIC DISPERSION (OLAD) experiment. The algorithm outputs are the probability distributions of the contamination source parameters, i.e., its location, release rate, speed and direction of movement, start time and duration. The stochastic approach presented in this paper is completely general and can be used in other fields where the parameters of a model must be fitted to observable data.
Approximate Bayesian Computation by Subset Simulation using hierarchical state-space models
NASA Astrophysics Data System (ADS)
Vakilzadeh, Majid K.; Huang, Yong; Beck, James L.; Abrahamsson, Thomas
2017-02-01
A new multi-level Markov Chain Monte Carlo algorithm for Approximate Bayesian Computation, ABC-SubSim, has recently appeared that exploits the Subset Simulation method for efficient rare-event simulation. ABC-SubSim adaptively creates a nested decreasing sequence of data-approximating regions in the output space that correspond to increasingly closer approximations of the observed output vector in this output space. At each level, multiple samples of the model parameter vector are generated by a component-wise Metropolis algorithm so that the predicted output corresponding to each parameter value falls in the current data-approximating region. Theoretically, if continued to the limit, the sequence of data-approximating regions would converge onto the observed output vector and the approximate posterior distributions, which are conditional on the data-approximation region, would become exact, but this is not practically feasible. In this paper we study the performance of the ABC-SubSim algorithm for Bayesian updating of the parameters of dynamical systems using a general hierarchical state-space model. We note that the ABC methodology gives an approximate posterior distribution that actually corresponds to an exact posterior where a uniformly distributed combined measurement and modeling error is added. We also note that ABC algorithms have a problem with learning the uncertain error variances in a stochastic state-space model and so we treat them as nuisance parameters and analytically integrate them out of the posterior distribution. In addition, the statistical efficiency of the original ABC-SubSim algorithm is improved by developing a novel strategy to regulate the proposal variance for the component-wise Metropolis algorithm at each level. We demonstrate that Self-regulated ABC-SubSim is well suited for Bayesian system identification by first applying it successfully to model updating of a two degree-of-freedom linear structure for three cases: globally identifiable, locally identifiable and unidentifiable model classes, and then to model updating of a two degree-of-freedom nonlinear structure with Duffing nonlinearities in its interstory force-deflection relationship.
Selecting summary statistics in approximate Bayesian computation for calibrating stochastic models.
Burr, Tom; Skurikhin, Alexei
2013-01-01
Approximate Bayesian computation (ABC) is an approach for using measurement data to calibrate stochastic computer models, which are common in biology applications. ABC is becoming the "go-to" option when the data and/or parameter dimension is large because it relies on user-chosen summary statistics rather than the full data and is therefore computationally feasible. One technical challenge with ABC is that the quality of the approximation to the posterior distribution of model parameters depends on the user-chosen summary statistics. In this paper, the user requirement to choose effective summary statistics in order to accurately estimate the posterior distribution of model parameters is investigated and illustrated by example, using a model and corresponding real data of mitochondrial DNA population dynamics. We show that for some choices of summary statistics, the posterior distribution of model parameters is closely approximated and for other choices of summary statistics, the posterior distribution is not closely approximated. A strategy to choose effective summary statistics is suggested in cases where the stochastic computer model can be run at many trial parameter settings, as in the example.
NASA Astrophysics Data System (ADS)
Kacprzak, T.; Herbel, J.; Amara, A.; Réfrégier, A.
2018-02-01
Approximate Bayesian Computation (ABC) is a method to obtain a posterior distribution without a likelihood function, using simulations and a set of distance metrics. For that reason, it has recently been gaining popularity as an analysis tool in cosmology and astrophysics. Its drawback, however, is a slow convergence rate. We propose a novel method, which we call qABC, to accelerate ABC with quantile regression. In this method, we create a model of quantiles of the distance measure as a function of input parameters. This model is trained on a small number of simulations and estimates which regions of the prior space are likely to be accepted into the posterior. Other regions are then immediately rejected. This procedure is then repeated as more simulations become available. We apply it to the practical problem of estimating the redshift distribution of cosmological samples, using forward modelling developed in previous work. The qABC method converges to nearly the same posterior as the basic ABC. It uses, however, only 20% of the number of simulations compared to basic ABC, achieving a fivefold gain in execution time for our problem. For other problems the acceleration rate may vary; it depends on how close the prior is to the final posterior. We discuss possible improvements and extensions to this method.
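A rough sketch of the two-stage idea, with scikit-learn's quantile-loss gradient boosting standing in for the quantile-regression model (the paper's model, thresholds and problem differ; the toy distance function below is an assumption): predict a low quantile of the ABC distance from the parameters, and only spend simulations where that quantile is small.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(6)

def distance(theta):
    """Toy ABC distance: smallest near the 'true' parameter value 2.0."""
    return abs(rng.normal(theta, 0.5) - 2.0)

# Stage 1: a small training set of (parameter, distance) pairs.
theta_train = rng.uniform(-10, 10, size=500)
d_train = np.array([distance(t) for t in theta_train])

# Model the 10th percentile of the distance as a function of theta.
q_model = GradientBoostingRegressor(loss="quantile", alpha=0.1)
q_model.fit(theta_train[:, None], d_train)

# Stage 2: run the (expensive) simulator only where the predicted low
# quantile of the distance suggests the draw could plausibly be accepted.
theta_cand = rng.uniform(-10, 10, size=5000)
promising = theta_cand[q_model.predict(theta_cand[:, None]) < 0.5]
accepted = [t for t in promising if distance(t) < 0.2]
print(len(promising), len(accepted))
```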
NASA Astrophysics Data System (ADS)
Krishnanathan, Kirubhakaran; Anderson, Sean R.; Billings, Stephen A.; Kadirkamanathan, Visakan
2016-11-01
In this paper, we derive a system identification framework for continuous-time nonlinear systems, for the first time using a simulation-focused computational Bayesian approach. Simulation approaches to nonlinear system identification have been shown to outperform regression methods under certain conditions, such as non-persistently exciting inputs and fast-sampling. We use the approximate Bayesian computation (ABC) algorithm to perform simulation-based inference of model parameters. The framework has the following main advantages: (1) parameter distributions are intrinsically generated, giving the user a clear description of uncertainty, (2) the simulation approach avoids the difficult problem of estimating signal derivatives as is common with other continuous-time methods, and (3) as noted above, the simulation approach improves identification under conditions of non-persistently exciting inputs and fast-sampling. Term selection is performed by judging parameter significance using parameter distributions that are intrinsically generated as part of the ABC procedure. The results from a numerical example demonstrate that the method performs well in noisy scenarios, especially in comparison to competing techniques that rely on signal derivative estimation.
Automating approximate Bayesian computation by local linear regression.
Thornton, Kevin R
2009-07-07
In several biological contexts, parameter inference often relies on computationally-intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC based on using a linear regression to approximate the posterior distribution of the parameters, conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone, and fully-documented. 2. The program will automatically process multiple data sets, and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data, or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and therefore is a general tool for processing the results from any simulation. 6. The code is open-source, and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local linear regression.
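The local linear-regression adjustment that ABCreg automates can be sketched in a few lines of Python (the toy simulator, the single summary statistic and the acceptance fraction are illustrative assumptions, and the tool itself adds transformations and diagnostics): regress the accepted parameters on their summaries, then shift each draw toward the observed summary.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate(theta):
    return rng.normal(theta, 1.0, size=30)

s_obs = simulate(2.0).mean()  # summary of the "observed" data

# Rejection step: keep the draws whose summary is closest to the data.
theta = rng.uniform(-10, 10, size=100_000)
s_sim = np.array([simulate(t).mean() for t in theta])
idx = np.argsort(np.abs(s_sim - s_obs))[:1000]
theta_acc, s_acc = theta[idx], s_sim[idx]

# Local linear regression of the parameter on the summary statistic,
# then adjust each accepted draw toward the observed summary.
X = np.column_stack([np.ones_like(s_acc), s_acc])
coef, *_ = np.linalg.lstsq(X, theta_acc, rcond=None)
theta_adj = theta_acc - coef[1] * (s_acc - s_obs)

print(theta_acc.mean(), theta_adj.mean())  # posterior mean before/after adjustment
```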
Approximate Bayesian Computation in the estimation of the parameters of the Forbush decrease model
NASA Astrophysics Data System (ADS)
Wawrzynczak, A.; Kopka, P.
2017-12-01
Realistic modeling of complicated phenomena such as the Forbush decrease of the galactic cosmic ray intensity is a quite challenging task. One aspect is the numerical solution of the Fokker-Planck equation in five-dimensional space (three spatial variables, time and particle energy). The second difficulty arises from a lack of detailed knowledge about the spatial and temporal profiles of the parameters responsible for the creation of the Forbush decrease. Among these parameters, the diffusion coefficient plays the central role. Assessment of the correctness of the proposed model can be done only by comparison of the model output with experimental observations of the galactic cosmic ray intensity. We apply the Approximate Bayesian Computation (ABC) methodology to match the Forbush decrease model to experimental data. The ABC method is becoming increasingly exploited for complex dynamic problems in which the likelihood function is costly to compute. The main idea of all ABC methods is to accept a sample as an approximate posterior draw if its associated modeled data are close enough to the observed data. In this paper, we present an application of the Sequential Monte Carlo Approximate Bayesian Computation algorithm scanning the space of the diffusion coefficient parameters. The proposed algorithm is used to create the model of the Forbush decrease observed by neutron monitors at the Earth in March 2002. The model of the Forbush decrease is based on the stochastic approach to the solution of the Fokker-Planck equation.
Using Approximate Bayesian Computation to Probe Multiple Transiting Planet Systems
NASA Astrophysics Data System (ADS)
Morehead, Robert C.
2015-08-01
The large number of multiple transiting planet systems (MTPSs) uncovered with Kepler suggests a population of well-aligned planetary systems. Previously, the distribution of transit duration ratios in MTPSs has been used to place constraints on the distributions of mutual orbital inclinations and orbital eccentricities in these systems. However, degeneracies with the underlying number of planets in these systems pose added challenges and make explicit likelihood functions intractable. Approximate Bayesian computation (ABC) offers an intriguing path forward. In its simplest form, ABC proposes from a prior on the population parameters to produce synthetic datasets via a physically-motivated model. Samples are accepted or rejected based on how close they come to reproducing the actual observed dataset to some tolerance. The accepted samples then form a robust and useful approximation of the true posterior distribution of the underlying population parameters. We will demonstrate the utility of ABC in exoplanet populations by presenting new constraints on the mutual inclination and eccentricity distributions in the Kepler MTPSs. We will also introduce Simple-ABC, a new open-source Python package designed for ease of use and rapid specification of general models, suitable for use in a wide variety of applications in both exoplanet science and astrophysics as a whole.
Autonomic Closure for Turbulent Flows Using Approximate Bayesian Computation
NASA Astrophysics Data System (ADS)
Doronina, Olga; Christopher, Jason; Hamlington, Peter; Dahm, Werner
2017-11-01
Autonomic closure is a new technique for achieving fully adaptive and physically accurate closure of coarse-grained turbulent flow governing equations, such as those solved in large eddy simulations (LES). Although autonomic closure has been shown in recent a priori tests to more accurately represent unclosed terms than do dynamic versions of traditional LES models, the computational cost of the approach makes it challenging to implement for simulations of practical turbulent flows at realistically high Reynolds numbers. The optimization step used in the approach introduces large matrices that must be inverted and is highly memory intensive. In order to reduce memory requirements, here we propose to use approximate Bayesian computation (ABC) in place of the optimization step, thereby yielding a computationally-efficient implementation of autonomic closure that trades memory-intensive for processor-intensive computations. The latter challenge can be overcome as co-processors such as general purpose graphical processing units become increasingly available on current generation petascale and exascale supercomputers. In this work, we outline the formulation of ABC-enabled autonomic closure and present initial results demonstrating the accuracy and computational cost of the approach.
Improving the Accuracy of Planet Occurrence Rates from Kepler Using Approximate Bayesian Computation
NASA Astrophysics Data System (ADS)
Hsu, Danley C.; Ford, Eric B.; Ragozzine, Darin; Morehead, Robert C.
2018-05-01
We present a new framework to characterize the occurrence rates of planet candidates identified by Kepler based on hierarchical Bayesian modeling, approximate Bayesian computing (ABC), and sequential importance sampling. For this study, we adopt a simple 2D grid in planet radius and orbital period as our model and apply our algorithm to estimate occurrence rates for Q1–Q16 planet candidates orbiting solar-type stars. We arrive at significantly increased planet occurrence rates for small planet candidates (R_p < 1.25 R_⊕) at larger orbital periods (P > 80 days) compared to the rates estimated by the more common inverse detection efficiency method (IDEM). Our improved methodology estimates that the occurrence rate density of small planet candidates in the habitable zone of solar-type stars is 1.6^{+1.2}_{-0.5} per factor of 2 in planet radius and orbital period. Additionally, we observe a local minimum in the occurrence rate for strong planet candidates marginalized over orbital period between 1.5 and 2 R_⊕ that is consistent with previous studies. For future improvements, the forward modeling approach of ABC is ideally suited to incorporating multiple populations, such as planets, astrophysical false positives, and pipeline false alarms, to provide accurate planet occurrence rates and uncertainties. Furthermore, ABC provides a practical statistical framework for answering complex questions (e.g., frequency of different planetary architectures) and providing sound uncertainties, even in the face of complex selection effects, observational biases, and follow-up strategies. In summary, ABC offers a powerful tool for accurately characterizing a wide variety of astrophysical populations.
NASA Astrophysics Data System (ADS)
Kopka, Piotr; Wawrzynczak, Anna; Borysiewicz, Mieczyslaw
2016-11-01
In this paper the Bayesian methodology known as Approximate Bayesian Computation (ABC) is applied to the problem of atmospheric contamination source identification. The algorithm input data are on-line arriving concentrations of the released substance registered by the distributed sensor network. This paper presents the Sequential ABC algorithm in detail and tests its efficiency in estimating the probability distributions of the atmospheric release parameters of a mobile contamination source. The developed algorithms are tested using the data from the Over-Land Atmospheric Diffusion (OLAD) field tracer experiment. The paper demonstrates estimation of seven parameters characterizing the contamination source, i.e.: contamination source starting position (x,y), the direction of the motion of the source (d), its velocity (v), release rate (q), start time of release (ts) and its duration (td). The on-line arriving new concentrations dynamically update the probability distributions of the search parameters. The atmospheric dispersion Second-order Closure Integrated PUFF (SCIPUFF) model is used as the forward model to predict the concentrations at the sensors' locations.
NASA Astrophysics Data System (ADS)
Jennings, E.; Madigan, M.
2017-04-01
Given the complexity of modern cosmological parameter inference, where we are faced with non-Gaussian data and noise, correlated systematics and multi-probe correlated datasets, the Approximate Bayesian Computation (ABC) method is a promising alternative to traditional Markov Chain Monte Carlo approaches in the case where the likelihood is intractable or unknown. The ABC method is called "likelihood-free" as it avoids explicit evaluation of the likelihood by using a forward model simulation of the data which can include systematics. We introduce astroABC, an open source ABC Sequential Monte Carlo (SMC) sampler for parameter estimation. A key challenge in astrophysics is the efficient use of large multi-probe datasets to constrain high dimensional, possibly correlated parameter spaces. With this in mind astroABC allows for massive parallelization using MPI, a framework that handles spawning of processes across multiple nodes. A key new feature of astroABC is the ability to create MPI groups with different communicators, one for the sampler and several others for the forward model simulation, which speeds up sampling time considerably. For smaller jobs the Python multiprocessing option is also available. Other key features of this new sampler include: a Sequential Monte Carlo sampler; a method for iteratively adapting tolerance levels; local covariance estimation using scikit-learn's KDTree; modules for specifying an optimal covariance matrix for a component-wise or multivariate normal perturbation kernel and a weighted covariance metric; restart files output frequently so an interrupted sampling run can be resumed at any iteration; output and restart files backed up at every iteration; user-defined distance metric and simulation methods; a module for specifying heterogeneous parameter priors including non-standard prior PDFs; a module for specifying a constant, linear, log or exponential tolerance level; and well-documented examples and sample scripts. This code is hosted online at https://github.com/EliseJ/astroABC.
Sandoval-Castellanos, Edson; Palkopoulou, Eleftheria; Dalén, Love
2014-01-01
Inference of population demographic history has vastly improved in recent years due to a number of technological and theoretical advances including the use of ancient DNA. Approximate Bayesian computation (ABC) stands among the most promising methods due to its simple theoretical fundament and exceptional flexibility. However, limited availability of user-friendly programs that perform ABC analysis renders it difficult to implement, and hence programming skills are frequently required. In addition, there is limited availability of programs able to deal with heterochronous data. Here we present the software BaySICS: Bayesian Statistical Inference of Coalescent Simulations. BaySICS provides an integrated and user-friendly platform that performs ABC analyses by means of coalescent simulations from DNA sequence data. It estimates historical demographic population parameters and performs hypothesis testing by means of Bayes factors obtained from model comparisons. Although providing specific features that improve inference from datasets with heterochronous data, BaySICS also has several capabilities making it a suitable tool for analysing contemporary genetic datasets. Those capabilities include joint analysis of independent tables, a graphical interface and the implementation of Markov-chain Monte Carlo without likelihoods.
Architectures of Kepler Planet Systems with Approximate Bayesian Computation
NASA Astrophysics Data System (ADS)
Morehead, Robert C.; Ford, Eric B.
2015-12-01
The distribution of period normalized transit duration ratios among Kepler’s multiple transiting planet systems constrains the distributions of mutual orbital inclinations and orbital eccentricities. However, degeneracies in these parameters tied to the underlying number of planets in these systems complicate their interpretation. To untangle the true architecture of planet systems, the mutual inclination, eccentricity, and underlying planet number distributions must be considered simultaneously. The complexities of target selection, transit probability, detection biases, vetting, and follow-up observations make it impractical to write an explicit likelihood function. Approximate Bayesian computation (ABC) offers an intriguing path forward. In its simplest form, ABC generates a sample of trial population parameters from a prior distribution to produce synthetic datasets via a physically-motivated forward model. Samples are then accepted or rejected based on how close they come to reproducing the actual observed dataset to some tolerance. The accepted samples form a robust and useful approximation of the true posterior distribution of the underlying population parameters. We build on the considerable progress from the field of statistics to develop sequential algorithms for performing ABC in an efficient and flexible manner. We demonstrate the utility of ABC in exoplanet populations and present new constraints on the distributions of mutual orbital inclinations, eccentricities, and the relative number of short-period planets per star. We conclude with a discussion of the implications for other planet occurrence rate calculations, such as eta-Earth.
Lopes, J S; Arenas, M; Posada, D; Beaumont, M A
2014-03-01
The estimation of parameters in molecular evolution may be biased when some processes are not considered. For example, the estimation of selection at the molecular level using codon-substitution models can have an upward bias when recombination is ignored. Here we address the joint estimation of recombination, molecular adaptation and substitution rates from coding sequences using approximate Bayesian computation (ABC). We describe the implementation of a regression-based strategy for choosing subsets of summary statistics for coding data, and show that this approach can accurately infer recombination allowing for intracodon recombination breakpoints, molecular adaptation and codon substitution rates. We demonstrate that our ABC approach can outperform other analytical methods under a variety of evolutionary scenarios. We also show that although the choice of the codon-substitution model is important, our inferences are robust to a moderate degree of model misspecification. In addition, we demonstrate that our approach can accurately choose the evolutionary model that best fits the data, providing an alternative for when the use of full-likelihood methods is impracticable. Finally, we applied our ABC method to co-estimate recombination, substitution and molecular adaptation rates from 24 published human immunodeficiency virus 1 coding data sets.
Slater, Graham J; Harmon, Luke J; Wegmann, Daniel; Joyce, Paul; Revell, Liam J; Alfaro, Michael E
2012-03-01
In recent years, a suite of methods has been developed to fit multiple rate models to phylogenetic comparative data. However, most methods have limited utility at broad phylogenetic scales because they typically require complete sampling of both the tree and the associated phenotypic data. Here, we develop and implement a new, tree-based method called MECCA (Modeling Evolution of Continuous Characters using ABC) that uses a hybrid likelihood/approximate Bayesian computation (ABC)-Markov-Chain Monte Carlo approach to simultaneously infer rates of diversification and trait evolution from incompletely sampled phylogenies and trait data. We demonstrate via simulation that MECCA has considerable power to choose among single versus multiple evolutionary rate models, and thus can be used to test hypotheses about changes in the rate of trait evolution across an incomplete tree of life. We finally apply MECCA to an empirical example of body size evolution in carnivores, and show that there is no evidence for an elevated rate of body size evolution in the pinnipeds relative to terrestrial carnivores. ABC approaches can provide a useful alternative set of tools for future macroevolutionary studies where likelihood-dependent approaches are lacking.
USDA-ARS?s Scientific Manuscript database
Technical Abstract. Molecular markers can provide clear insight into the introduction history of invasive species. However, inferences about recent introduction histories remain challenging, because of the stochastic demographic processes often involved. Approximate Bayesian computation (ABC) can he...
Deciphering the Routes of invasion of Drosophila suzukii by Means of ABC Random Forest.
Fraimout, Antoine; Debat, Vincent; Fellous, Simon; Hufbauer, Ruth A; Foucaud, Julien; Pudlo, Pierre; Marin, Jean-Michel; Price, Donald K; Cattel, Julien; Chen, Xiao; Deprá, Marindia; François Duyck, Pierre; Guedot, Christelle; Kenis, Marc; Kimura, Masahito T; Loeb, Gregory; Loiseau, Anne; Martinez-Sañudo, Isabel; Pascual, Marta; Polihronakis Richmond, Maxi; Shearer, Peter; Singh, Nadia; Tamura, Koichiro; Xuéreb, Anne; Zhang, Jinping; Estoup, Arnaud
2017-04-01
Deciphering invasion routes from molecular data is crucial to understanding biological invasions, including identifying bottlenecks in population size and admixture among distinct populations. Here, we unravel the invasion routes of the invasive pest Drosophila suzukii using a multi-locus microsatellite dataset (25 loci on 23 worldwide sampling locations). To do this, we use approximate Bayesian computation (ABC), which has improved the reconstruction of invasion routes, but can be computationally expensive. We use our study to illustrate the use of a new, more efficient, ABC method, ABC random forest (ABC-RF) and compare it to a standard ABC method (ABC-LDA). We find that Japan emerges as the most probable source of the earliest recorded invasion into Hawaii. Southeast China and Hawaii together are the most probable sources of populations in western North America, which in turn served as sources for those in eastern North America. European populations are genetically more homogeneous than North American populations, and their most probable source is northeast China, with evidence of limited gene flow from the eastern US as well. All introduced populations passed through bottlenecks, and analyses reveal five distinct admixture events. These findings can inform hypotheses concerning how this species evolved between different and independent source and invasive populations. Methodological comparisons indicate that ABC-RF and ABC-LDA show concordant results if ABC-LDA is based on a large number of simulated datasets, but that ABC-RF outperforms ABC-LDA when using a comparable and more manageable number of simulated datasets, especially when analyzing complex introduction scenarios.
Phylodynamic Inference with Kernel ABC and Its Application to HIV Epidemiology.
Poon, Art F Y
2015-09-01
The shapes of phylogenetic trees relating virus populations are determined by the adaptation of viruses within each host, and by the transmission of viruses among hosts. Phylodynamic inference attempts to reverse this flow of information, estimating parameters of these processes from the shape of a virus phylogeny reconstructed from a sample of genetic sequences from the epidemic. A key challenge to phylodynamic inference is quantifying the similarity between two trees in an efficient and comprehensive way. In this study, I demonstrate that a new distance measure, based on a subset tree kernel function from computational linguistics, confers a significant improvement over previous measures of tree shape for classifying trees generated under different epidemiological scenarios. Next, I incorporate this kernel-based distance measure into an approximate Bayesian computation (ABC) framework for phylodynamic inference. ABC bypasses the need for an analytical solution of model likelihood, as it only requires the ability to simulate data from the model. I validate this "kernel-ABC" method for phylodynamic inference by estimating parameters from data simulated under a simple epidemiological model. Results indicate that kernel-ABC attained greater accuracy for parameters associated with virus transmission than leading software on the same data sets. Finally, I apply the kernel-ABC framework to study a recent outbreak of a recombinant HIV subtype in China. Kernel-ABC provides a versatile framework for phylodynamic inference because it can fit a broader range of models than methods that rely on the computation of exact likelihoods.
Approximate Bayesian estimation of extinction rate in the Finnish Daphnia magna metapopulation.
Robinson, John D; Hall, David W; Wares, John P
2013-05-01
Approximate Bayesian computation (ABC) is useful for parameterizing complex models in population genetics. In this study, ABC was applied to simultaneously estimate parameter values for a model of metapopulation coalescence and test two alternatives to a strict metapopulation model in the well-studied network of Daphnia magna populations in Finland. The models shared four free parameters: the subpopulation genetic diversity (θS), the rate of gene flow among patches (4Nm), the founding population size (N0) and the metapopulation extinction rate (e) but differed in the distribution of extinction rates across habitat patches in the system. The three models had either a constant extinction rate in all populations (strict metapopulation), one population that was protected from local extinction (i.e. a persistent source), or habitat-specific extinction rates drawn from a distribution with specified mean and variance. Our model selection analysis favoured the model including a persistent source population over the two alternative models. Of the closest 750,000 data sets in Euclidean space, 78% were simulated under the persistent source model (estimated posterior probability = 0.769). This fraction increased to more than 85% when only the closest 150,000 data sets were considered (estimated posterior probability = 0.774). Approximate Bayesian computation was then used to estimate parameter values that might produce the observed set of summary statistics. Our analysis provided posterior distributions for e that included the point estimate obtained from previous data from the Finnish D. magna metapopulation. Our results support the use of ABC and population genetic data for testing the strict metapopulation model and parameterizing complex models of demography.
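The model-comparison step used here, reading posterior model probabilities off the model labels of the simulations closest to the data, can be sketched generically; the two toy models, the prior over models and the retained fraction below are illustrative assumptions, not the study's coalescent models.

```python
import numpy as np

rng = np.random.default_rng(8)

def simulate(model, n=50):
    """Two competing toy models for the same observed summary."""
    draws = rng.normal(0, 1, n) if model == 0 else rng.normal(0.5, 2, n)
    return draws.mean()

s_obs = 0.4

models = rng.integers(0, 2, size=200_000)  # equal prior model probabilities
dists = np.array([abs(simulate(m) - s_obs) for m in models])

# Posterior model probabilities: the model mix among the closest simulations.
closest = models[np.argsort(dists)[:2000]]
print("P(model 0 | data) ~", np.mean(closest == 0))
print("P(model 1 | data) ~", np.mean(closest == 1))
```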
Statistical Hypothesis Testing in Intraspecific Phylogeography: NCPA versus ABC
Templeton, Alan R.
2009-01-01
Nested clade phylogeographic analysis (NCPA) and approximate Bayesian computation (ABC) have been used to test phylogeographic hypotheses. Multilocus NCPA tests null hypotheses, whereas ABC discriminates among a finite set of alternatives. The interpretive criteria of NCPA are explicit and allow complex models to be built from simple components. The interpretive criteria of ABC are ad hoc and require the specification of a complete phylogeographic model. The conclusions from ABC are often influenced by implicit assumptions arising from the many parameters needed to specify a complex model. These complex models confound many assumptions so that biological interpretations are difficult. Sampling error is accounted for in NCPA, but ABC ignores important sources of sampling error, which creates pseudo-statistical power. NCPA generates the full sampling distribution of its statistics, but ABC only yields local probabilities, which in turn make it impossible to distinguish between a good fitting model, a non-informative model, and an over-determined model. Both NCPA and ABC use approximations, but convergences of the approximations used in NCPA are well defined whereas those in ABC are not. NCPA can analyze a large number of locations, but ABC cannot. Finally, the dimensionality of the tested hypothesis is known in NCPA, but not for ABC. As a consequence, the "probabilities" generated by ABC are not true probabilities and are statistically non-interpretable. Accordingly, ABC should not be used for hypothesis testing, but simulation approaches are valuable when used in conjunction with NCPA or other methods that do not rely on highly parameterized models.
An ABC estimate of pedigree error rate: application in dog, sheep and cattle breeds.
Leroy, G; Danchin-Burge, C; Palhiere, I; Baumung, R; Fritz, S; Mériaux, J C; Gautier, M
2012-06-01
On the basis of correlations between pairwise individual genealogical kinship coefficients and allele sharing distances computed from genotyping data, we propose an approximate Bayesian computation (ABC) approach to assess pedigree file reliability through gene-dropping simulations. We explore the features of the method using simulated data sets and show that precision increases with the number of markers. An application is further made with five dog breeds, four sheep breeds and one cattle breed raised in France and displaying various characteristics and population sizes, using microsatellite or SNP markers. Depending on the breed, estimated pedigree error rates range between 1% and 9% in the dog breeds and between 1% and 10% in the sheep breeds, and are 4% in the cattle breed. © 2011 The Authors, Animal Genetics © 2011 Stichting International Foundation for Animal Genetics.
A novel approach for choosing summary statistics in approximate Bayesian computation.
Aeschbacher, Simon; Beaumont, Mark A; Futschik, Andreas
2012-11-01
The choice of summary statistics is a crucial step in approximate Bayesian computation (ABC). Since statistics are often not sufficient, this choice involves a trade-off between loss of information and reduction of dimensionality. The latter may increase the efficiency of ABC. Here, we propose an approach for choosing summary statistics based on boosting, a technique from the machine-learning literature. We consider different types of boosting and compare them to partial least-squares regression as an alternative. To mitigate the lack of sufficiency, we also propose an approach for choosing summary statistics locally, in the putative neighborhood of the true parameter value. We study a demographic model motivated by the reintroduction of Alpine ibex (Capra ibex) into the Swiss Alps. The parameters of interest are the mean and standard deviation across microsatellites of the scaled ancestral mutation rate (θanc = 4Neu) and the proportion of males obtaining access to matings per breeding season (ω). By simulation, we assess the properties of the posterior distribution obtained with the various methods. According to our criteria, ABC with summary statistics chosen locally via boosting with the L2-loss performs best. Applying that method to the ibex data, we estimate θ̂anc ≈ 1.288 and find that most of the variation across loci of the ancestral mutation rate u is between 7.7 × 10⁻⁴ and 3.5 × 10⁻³ per locus per generation. The proportion of males with access to matings is estimated as ω̂ ≈ 0.21, which is in good agreement with recent independent estimates.
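One way to picture the boosting idea is to fit a boosted regression from candidate summary statistics to the parameter on simulated data, then rank the statistics by their contribution to the fit. The sketch below does this with a gradient-boosted regressor under squared-error (L2) loss; the simulator and the informative statistics are hypothetical stand-ins, and the authors' actual procedure (including the local variant) is more elaborate.

```python
# Rank candidate summary statistics by their usefulness for predicting
# the parameter, using boosting with squared-error (L2) loss.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n = 5_000
theta = rng.uniform(0.1, 5.0, size=n)      # parameter draws from the prior

# ten candidate summary statistics; only the first two carry signal here
stats = rng.normal(size=(n, 10))
stats[:, 0] += np.log(theta)
stats[:, 1] += 0.5 * theta

# default loss is squared error, i.e. the L2-loss considered in the paper
gb = GradientBoostingRegressor(n_estimators=200).fit(stats, theta)
ranking = np.argsort(gb.feature_importances_)[::-1]
print("statistics ranked by importance:", ranking)
```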
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jennings, Elise; Wolf, Rachel; Sako, Masao
2016-11-09
Cosmological parameter estimation techniques that robustly account for systematic measurement uncertainties will be crucial for the next generation of cosmological surveys. We present a new analysis method, superABC, for obtaining cosmological constraints from Type Ia supernova (SN Ia) light curves using Approximate Bayesian Computation (ABC) without any likelihood assumptions. The ABC method works by using a forward model simulation of the data where systematic uncertainties can be simulated and marginalized over. A key feature of the method presented here is the use of two distinct metrics, the `Tripp' and `Light Curve' metrics, which allow us to compare the simulated data to the observed data set. The Tripp metric takes as input the parameters of models fit to each light curve with the SALT-II method, whereas the Light Curve metric uses the measured fluxes directly without model fitting. We apply the superABC sampler to a simulated data set of ~1000 SNe corresponding to the first season of the Dark Energy Survey Supernova Program. Varying Ωm, w0, α and β and a magnitude offset parameter, with no systematics we obtain Δ(w0) = w0^true − w0^best fit = −0.036 ± 0.109 (a ~11% 1σ uncertainty) using the Tripp metric and Δ(w0) = −0.055 ± 0.068 (a ~7% 1σ uncertainty) using the Light Curve metric. Including 1% calibration uncertainties in four passbands, adding 4 more parameters, we obtain Δ(w0) = −0.062 ± 0.132 (a ~14% 1σ uncertainty) using the Tripp metric. Overall we find a 17% increase in the uncertainty on w0 with systematics compared to without. We contrast this with an MCMC approach where systematic effects are approximately included. We find that the MCMC method slightly underestimates the impact of calibration uncertainties for this simulated data set.
Bayesian parameter estimation for the Wnt pathway: an infinite mixture models approach.
Koutroumpas, Konstantinos; Ballarini, Paolo; Votsi, Irene; Cournède, Paul-Henry
2016-09-01
Likelihood-free methods, like Approximate Bayesian Computation (ABC), have been extensively used in model-based statistical inference with intractable likelihood functions. When combined with Sequential Monte Carlo (SMC) algorithms they constitute a powerful approach for parameter estimation and model selection of mathematical models of complex biological systems. A crucial step in the ABC-SMC algorithms, significantly affecting their performance, is the propagation of a set of parameter vectors through a sequence of intermediate distributions using Markov kernels. In this article, we employ Dirichlet process mixtures (DPMs) to design optimal transition kernels and we present an ABC-SMC algorithm with DPM kernels. We illustrate the use of the proposed methodology using real data for the canonical Wnt signaling pathway. A multi-compartment model of the pathway is developed and compared to an existing model. The results indicate that DPMs are more efficient in the exploration of the parameter space and can significantly improve ABC-SMC performance. In comparison to commonly used alternative sampling schemes, the proposed approach can bring potential benefits in the estimation of complex multimodal distributions. The method is used to estimate the parameters and the initial state of two models of the Wnt pathway, and it is shown that the multi-compartment model fits the experimental data better. Python scripts for the Dirichlet Process Gaussian Mixture model and the Gibbs sampler are available at https://sites.google.com/site/kkoutroumpas/software konstantinos.koutroumpas@ecp.fr. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
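The role of the transition kernel can be illustrated with one ABC-SMC propagation step in which a variational Dirichlet-process mixture is fitted to the current particle population and used to propose new parameters. This is a sketch under toy assumptions, not the authors' implementation; the importance weights that a full ABC-SMC step requires are omitted for brevity.

```python
# One ABC-SMC propagation step with a Dirichlet-process-style mixture
# kernel (scikit-learn's variational BayesianGaussianMixture as a stand-in).
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(2)

def simulate(theta, rng):
    # toy 2-D simulator: data = parameter + observation noise
    return theta + rng.normal(0.0, 0.2, size=theta.shape)

observed = np.array([1.0, -0.5])
particles = rng.normal(0.0, 2.0, size=(300, 2))  # previous SMC population
eps = 0.5                                        # current tolerance

# mixture kernel fitted to the current population
dpm = BayesianGaussianMixture(
    n_components=10, weight_concentration_prior_type="dirichlet_process"
).fit(particles)

new_particles = []
while len(new_particles) < 300:
    theta, _ = dpm.sample(1)                     # propose from the mixture kernel
    x = simulate(theta[0], rng)
    if np.linalg.norm(x - observed) < eps:       # ABC acceptance step
        new_particles.append(theta[0])
new_particles = np.array(new_particles)          # next population (weights omitted)
```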
Bray, T C; Hall, S J G; Bruford, M W
2014-02-01
Investigation of historic population processes using molecular data has been facilitated by the use of approximate Bayesian computation (ABC), which enables the consideration of multiple alternative demographic scenarios. The Lincoln Red cattle breed provides a relatively simple example of two well-documented admixture events. Using molecular data for this breed, we found that STRUCTURE did not resolve very low (<5%) levels of introgression, possibly due to sampling limitations. We evaluated the performance of two ABC approaches (2BAD and DIYABC) against those of two earlier methodologies, ADMIX and LEADMIX, by comparing their interpretations with the conclusions drawn from herdbook analysis. The ABC methods gave credible values for the proportions of the Lincoln Red genotype that are attributable to Aberdeen Angus and Limousin, although estimates of effective population size and event timing were not realistic. We suggest ABC methods are a valuable supplement to pedigree-based studies but that the accuracy of admixture determination is likely to diminish with increasing complexity of the admixture scenario. © 2013 Blackwell Verlag GmbH.
Melanoma Cell Colony Expansion Parameters Revealed by Approximate Bayesian Computation
Vo, Brenda N.; Drovandi, Christopher C.; Pettitt, Anthony N.; Pettet, Graeme J.
2015-01-01
In vitro studies and mathematical models are now being widely used to study the underlying mechanisms driving the expansion of cell colonies. This can improve our understanding of cancer formation and progression. Although much progress has been made in terms of developing and analysing mathematical models, far less progress has been made in terms of understanding how to estimate model parameters using experimental in vitro image-based data. To address this issue, a new approximate Bayesian computation (ABC) algorithm is proposed to estimate key parameters governing the expansion of melanoma cell (MM127) colonies, including cell diffusivity, D, cell proliferation rate, λ, and cell-to-cell adhesion, q, in two experimental scenarios, namely with and without a chemical treatment to suppress cell proliferation. Even when little prior biological knowledge about the parameters is assumed, all parameters are precisely inferred with a small posterior coefficient of variation, approximately 2–12%. The ABC analyses reveal that the posterior distributions of D and q depend on the experimental elapsed time, whereas the posterior distribution of λ does not. The posterior mean values of D are in the ranges 226–268 µm² h⁻¹ and 311–351 µm² h⁻¹, and those of q in the ranges 0.23–0.39 and 0.32–0.61, for the experimental periods of 0–24 h and 24–48 h, respectively. Furthermore, we found that the posterior distribution of q also depends on the initial cell density, whereas the posterior distributions of D and λ do not. The ABC approach also enables information from the two experiments to be combined, resulting in greater precision for all estimates of D and λ. PMID:26642072
Inferring epidemiological parameters from phylogenies using regression-ABC: A comparative study
Gascuel, Olivier
2017-01-01
Inferring epidemiological parameters such as the R0 from time-scaled phylogenies is a timely challenge. Most current approaches rely on likelihood functions, which raise specific issues that range from computing these functions to finding their maxima numerically. Here, we present a new regression-based Approximate Bayesian Computation (ABC) approach, which we base on a large variety of summary statistics intended to capture the information contained in the phylogeny and its corresponding lineage-through-time plot. The regression step involves the Least Absolute Shrinkage and Selection Operator (LASSO) method, which is a robust machine learning technique. It allows us to readily deal with the large number of summary statistics, while avoiding resorting to Markov Chain Monte Carlo (MCMC) techniques. To compare our approach to existing ones, we simulated target trees under a variety of epidemiological models and settings, and inferred parameters of interest using the same priors. We found that, for large phylogenies, the accuracy of our regression-ABC is comparable to that of likelihood-based approaches involving birth-death processes implemented in BEAST2. Our approach even outperformed these when inferring the host population size with a Susceptible-Infected-Removed epidemiological model. It also clearly outperformed a recent kernel-ABC approach when assuming a Susceptible-Infected epidemiological model with two host types. Lastly, by re-analyzing data from the early stages of the recent Ebola epidemic in Sierra Leone, we showed that regression-ABC provides more realistic estimates for the duration parameters (latency and infectiousness) than the likelihood-based method. Overall, ABC based on a large variety of summary statistics and a regression method able to perform variable selection and avoid overfitting is a promising approach to analyze large phylogenies. PMID:28263987
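The core of regression-ABC can be condensed to a few lines: simulate parameters from the prior, compute many summary statistics, regress the parameter on the statistics with LASSO, and evaluate the fitted regression at the observed statistics. The sketch below uses a hypothetical simulator and summaries; the paper's pipeline (phylogeny simulation, lineage-through-time statistics) is of course richer.

```python
# Regression-ABC sketch: LASSO handles a large set of candidate summaries,
# performing variable selection and regression in one step.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(3)
n = 20_000
r0 = rng.uniform(1.0, 5.0, size=n)         # prior draws of the parameter (here R0)

# fifty candidate summaries, a few of which carry signal about R0
summaries = rng.normal(size=(n, 50))
summaries[:, 0] += r0
summaries[:, 1] += np.sqrt(r0)
summaries[:, 2] -= 0.3 * r0

lasso = LassoCV(cv=5).fit(summaries, r0)   # variable selection and regression in one
observed = np.concatenate([[3.2, np.sqrt(3.2), -0.96], np.zeros(47)])
print("point estimate of R0:", lasso.predict(observed[None, :])[0])
print("summaries retained by LASSO:", int(np.sum(lasso.coef_ != 0)))
```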
Boitard, Simon; Rodríguez, Willy; Jay, Flora; Mona, Stefano; Austerlitz, Frédéric
2016-01-01
Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have been recently developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. Besides, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can be easily computed from unphased and unpolarized SNP data. Our approach provides accurate estimations of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles. PMID:26943927
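Of the two summary classes used by PopSizeABC, the folded allele frequency spectrum is the simpler to compute from unphased, unpolarized data. A minimal sketch follows, under the assumption of a diploid genotype matrix coded 0/1/2; this is illustrative and not PopSizeABC's own code.

```python
# Folded allele frequency spectrum from unpolarized SNP genotypes.
import numpy as np

def folded_afs(genotypes):
    """Folded AFS from a (diploid individuals x SNPs) 0/1/2 genotype matrix."""
    n_chrom = 2 * genotypes.shape[0]
    counts = genotypes.sum(axis=0)                 # allele counts per SNP
    minor = np.minimum(counts, n_chrom - counts)   # fold onto the minor allele
    # entry k = number of SNPs whose minor allele count is k+1 (k = 0..n_chrom//2 - 1)
    return np.bincount(minor, minlength=n_chrom // 2 + 1)[1:]

rng = np.random.default_rng(4)
g = rng.integers(0, 3, size=(25, 10_000))          # 25 genomes, toy data
print(folded_afs(g))
```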
Zaikin, Alexey; Míguez, Joaquín
2017-01-01
We compare three state-of-the-art Bayesian inference methods for the estimation of the unknown parameters in a stochastic model of a genetic network. In particular, we introduce a stochastic version of the paradigmatic synthetic multicellular clock model proposed by Ullner et al., 2007. By introducing dynamical noise in the model and assuming that the partial observations of the system are contaminated by additive noise, we enable a principled mechanism to represent experimental uncertainties in the synthesis of the multicellular system and pave the way for the design of probabilistic methods for the estimation of any unknowns in the model. Within this setup, we tackle the Bayesian estimation of a subset of the model parameters. Specifically, we compare three Monte Carlo based numerical methods for the approximation of the posterior probability density function of the unknown parameters given a set of partial and noisy observations of the system. The schemes we assess are the particle Metropolis-Hastings (PMH) algorithm, the nonlinear population Monte Carlo (NPMC) method and the approximate Bayesian computation sequential Monte Carlo (ABC-SMC) scheme. We present an extensive numerical simulation study, which shows that while the three techniques can effectively solve the problem there are significant differences both in estimation accuracy and computational efficiency. PMID:28797087
Using Approximate Bayesian Computation to infer sex ratios from acoustic data.
Lehnen, Lisa; Schorcht, Wigbert; Karst, Inken; Biedermann, Martin; Kerth, Gerald; Puechmaille, Sebastien J
2018-01-01
Population sex ratios are of high ecological relevance, but are challenging to determine in species lacking conspicuous external cues indicating their sex. Acoustic sexing is an option if vocalizations differ between sexes, but is precluded by overlapping distributions of the values of male and female vocalizations in many species. A method allowing the inference of sex ratios despite such an overlap will therefore greatly increase the information extractable from acoustic data. To meet this demand, we developed a novel approach using Approximate Bayesian Computation (ABC) to infer the sex ratio of populations from acoustic data. Additionally, parameters characterizing the male and female distribution of acoustic values (mean and standard deviation) are inferred. This information is then used to probabilistically assign a sex to a single acoustic signal. We furthermore develop a simpler means of sex ratio estimation based on the exclusion of calls from the overlap zone. Applying our methods to simulated data demonstrates that sex ratio and acoustic parameter characteristics of males and females are reliably inferred by the ABC approach. Applying both the ABC and the exclusion method to empirical datasets (echolocation calls recorded in colonies of lesser horseshoe bats, Rhinolophus hipposideros) provides sex ratios similar to those obtained by molecular sexing. Our methods aim to facilitate evidence-based conservation, and to benefit scientists investigating ecological or conservation questions related to sex- or group-specific behaviour across a wide range of organisms emitting acoustic signals. The developed methodology is non-invasive, low-cost and time-efficient, thus allowing the study of many sites and individuals. We provide an R-script for the easy application of the method and discuss potential future extensions and fields of applications. The script can be easily adapted to account for numerous biological systems by adjusting the type and number of groups to be distinguished (e.g. age, social rank, cryptic species) and the acoustic parameters investigated.
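The inference problem can be mimicked with a two-component Gaussian mixture in which the mixture weight is the sex ratio and the male and female call distributions overlap. The rejection-ABC sketch below matches quantiles of simulated and observed call values; all priors and acoustic values are toy assumptions, and the published R-script should be consulted for the real procedure.

```python
# Rejection ABC for a sex ratio from overlapping acoustic distributions.
import numpy as np

rng = np.random.default_rng(5)

def simulate_calls(p_male, mu_m, mu_f, sd, n, rng):
    male = rng.random(n) < p_male
    return np.where(male, rng.normal(mu_m, sd, n), rng.normal(mu_f, sd, n))

def summary(calls):
    return np.quantile(calls, [0.1, 0.25, 0.5, 0.75, 0.9])

obs = simulate_calls(0.4, 110.0, 113.0, 2.0, 500, rng)  # pseudo-observed colony
s_obs = summary(obs)

draws, dists = [], []
for _ in range(10_000):
    p = rng.uniform(0.0, 1.0)                 # prior on the sex ratio
    mu_m = rng.uniform(105.0, 112.0)          # prior on the male call mean (toy kHz)
    mu_f = mu_m + rng.uniform(0.5, 5.0)       # females assumed higher, avoiding label switching
    s = summary(simulate_calls(p, mu_m, mu_f, 2.0, 500, rng))
    draws.append(p)
    dists.append(np.linalg.norm(s - s_obs))
keep = np.argsort(dists)[:200]                # rejection step
print("posterior mean sex ratio:", np.mean(np.array(draws)[keep]))
```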
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jennings, E.; Madigan, M.
Given the complexity of modern cosmological parameter inference where we are faced with non-Gaussian data and noise, correlated systematics and multi-probe correlated data sets, the Approximate Bayesian Computation (ABC) method is a promising alternative to traditional Markov Chain Monte Carlo approaches in the case where the Likelihood is intractable or unknown. The ABC method is called "Likelihood free" as it avoids explicit evaluation of the Likelihood by using a forward model simulation of the data which can include systematics. We introduce astroABC, an open source ABC Sequential Monte Carlo (SMC) sampler for parameter estimation. A key challenge in astrophysics is the efficient use of large multi-probe datasets to constrain high dimensional, possibly correlated parameter spaces. With this in mind astroABC allows for massive parallelization using MPI, a framework that handles spawning of jobs across multiple nodes. A key new feature of astroABC is the ability to create MPI groups with different communicators, one for the sampler and several others for the forward model simulation, which speeds up sampling time considerably. For smaller jobs the Python multiprocessing option is also available. Other key features include: a Sequential Monte Carlo sampler; a method for iteratively adapting tolerance levels; local covariance estimation using scikit-learn's KDTree; modules for specifying an optimal covariance matrix for a component-wise or multivariate normal perturbation kernel; output and restart files that are backed up every iteration; user-defined metric and simulation methods; a module for specifying heterogeneous parameter priors including non-standard prior PDFs; a module for specifying a constant, linear, log or exponential tolerance level; and well-documented examples and sample scripts. This code is hosted online at https://github.com/EliseJ/astroABC
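The tolerance-adaptation feature mentioned above is commonly implemented by shrinking each iteration's tolerance to a low quantile of the previous iteration's accepted distances. The sketch below shows only that generic idea; it is not astroABC's API, and the distances are simulated stand-ins.

```python
# Generic quantile-based tolerance adaptation for an ABC-SMC loop.
import numpy as np

def next_tolerance(accepted_distances, quantile=0.3):
    """Shrink the tolerance to a low quantile of the accepted distances."""
    return np.quantile(accepted_distances, quantile)

rng = np.random.default_rng(6)
eps = np.inf
for t in range(5):
    distances = rng.exponential(1.0, size=1_000)  # stand-in for one iteration's ABC distances
    accepted = distances[distances < eps]
    eps = next_tolerance(accepted)
    print(f"iteration {t}: tolerance -> {eps:.4f}")
```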
Integrating Crop Growth Models with Whole Genome Prediction through Approximate Bayesian Computation
Technow, Frank; Messina, Carlos D.; Totir, L. Radu; Cooper, Mark
2015-01-01
Genomic selection, enabled by whole genome prediction (WGP) methods, is revolutionizing plant breeding. Existing WGP methods have been shown to deliver accurate predictions in the most common settings, such as prediction of across environment performance for traits with additive gene effects. However, prediction of traits with non-additive gene effects and prediction of genotype by environment interaction (G×E), continues to be challenging. Previous attempts to increase prediction accuracy for these particularly difficult tasks employed prediction methods that are purely statistical in nature. Augmenting the statistical methods with biological knowledge has been largely overlooked thus far. Crop growth models (CGMs) attempt to represent the impact of functional relationships between plant physiology and the environment in the formation of yield and similar output traits of interest. Thus, they can explain the impact of G×E and certain types of non-additive gene effects on the expressed phenotype. Approximate Bayesian computation (ABC), a novel and powerful computational procedure, allows the incorporation of CGMs directly into the estimation of whole genome marker effects in WGP. Here we provide a proof of concept study for this novel approach and demonstrate its use with synthetic data sets. We show that this novel approach can be considerably more accurate than the benchmark WGP method GBLUP in predicting performance in environments represented in the estimation set as well as in previously unobserved environments for traits determined by non-additive gene effects. We conclude that this proof of concept demonstrates that using ABC for incorporating biological knowledge in the form of CGMs into WGP is a very promising and novel approach to improving prediction accuracy for some of the most challenging scenarios in plant breeding and applied genetics. PMID:26121133
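The essential pattern (simulate marker effects from a prior, push the implied genetic values through a crop growth model, accept draws whose simulated phenotypes match the observations) can be sketched in a few lines. Everything below, the saturating CGM, sizes and priors, is a hypothetical stand-in, and plain rejection in this many dimensions is far too inefficient for real use; it only illustrates the CGM-in-the-loop idea.

```python
# ABC with a crop growth model (CGM) inside the simulator loop.
import numpy as np

rng = np.random.default_rng(7)
n_lines, n_markers = 200, 50
X = rng.integers(0, 2, size=(n_lines, n_markers)).astype(float)  # marker matrix

def cgm(trait, water=1.0):
    # hypothetical crop growth model: saturating yield response to a physiological trait
    return 10.0 * water * trait / (trait + 5.0)

beta_true = rng.normal(0.0, 0.3, size=n_markers)
y_obs = cgm(np.exp(X @ beta_true)) + rng.normal(0.0, 0.1, size=n_lines)

n_sims = 20_000
betas = rng.normal(0.0, 0.3, size=(n_sims, n_markers))   # prior on marker effects
sims = cgm(np.exp(betas @ X.T))                          # CGM applied to genetic values
d = np.linalg.norm(sims - y_obs, axis=1)
keep = np.argsort(d)[:100]                               # plain rejection step
post_mean = betas[keep].mean(axis=0)
print("corr(posterior mean, true effects):", np.corrcoef(post_mean, beta_true)[0, 1])
```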
Digest: Demographic inferences accounting for selection at linked sites†.
Simon, Alexis; Duranton, Maud
2018-05-16
Complex demography and selection at linked sites can generate spurious signatures of divergent selection. Unfortunately, many attempts at demographic inference consider overly simple models and neglect the effect of selection at linked sites. In this issue, Rougemont and Bernatchez (2018) applied an approximate Bayesian computation (ABC) framework that accounts for indirect selection to reveal a complex history of secondary contacts in Atlantic salmon (Salmo salar) that might explain a high rate of latitudinal clines in this species. © 2018 The Author(s). Evolution © 2018 The Society for the Study of Evolution.
In defence of model-based inference in phylogeography
Beaumont, Mark A.; Nielsen, Rasmus; Robert, Christian; Hey, Jody; Gaggiotti, Oscar; Knowles, Lacey; Estoup, Arnaud; Panchal, Mahesh; Corander, Jukka; Hickerson, Mike; Sisson, Scott A.; Fagundes, Nelson; Chikhi, Lounès; Beerli, Peter; Vitalis, Renaud; Cornuet, Jean-Marie; Huelsenbeck, John; Foll, Matthieu; Yang, Ziheng; Rousset, Francois; Balding, David; Excoffier, Laurent
2017-01-01
Recent papers have promoted the view that model-based methods in general, and those based on Approximate Bayesian Computation (ABC) in particular, are flawed in a number of ways, and are therefore inappropriate for the analysis of phylogeographic data. These papers further argue that Nested Clade Phylogeographic Analysis (NCPA) offers the best approach in statistical phylogeography. In order to remove the confusion and misconceptions introduced by these papers, we justify and explain the reasoning behind model-based inference. We argue that ABC is a statistically valid approach, alongside other computational statistical techniques that have been successfully used to infer parameters and compare models in population genetics. We also examine the NCPA method and highlight numerous deficiencies, whether it is used with single or multiple loci. We further show that the ages of clades are carelessly used to infer ages of demographic events, that these ages are estimated under a simple model of panmixia and population stationarity but are then used under different and unspecified models to test hypotheses, a usage that invalidates these testing procedures. We conclude by encouraging researchers to study and use model-based inference in population genetics. PMID:29284924
The Scientific Method, Diagnostic Bayes, and How to Detect Epistemic Errors
NASA Astrophysics Data System (ADS)
Vrugt, J. A.
2015-12-01
In the past decades, Bayesian methods have found widespread application and use in environmental systems modeling. Bayes' theorem states that the posterior probability, P(H|D̂), of a hypothesis H is proportional to the product of the prior probability, P(H), of this hypothesis and the likelihood, L(H|D̂), of the same hypothesis given the new/incoming observations, D̂. In science and engineering, H often constitutes some numerical simulation model, D = F(x,·), which summarizes, using algebraic, empirical, and differential equations, state variables and fluxes, all our theoretical and/or practical knowledge of the system of interest, and x are the d unknown parameters which are subject to inference using some data, D̂, of the observed system response. The Bayesian approach is intimately related to the scientific method and uses an iterative cycle of hypothesis formulation (model), experimentation and data collection, and theory/hypothesis refinement to elucidate the rules that govern the natural world. Unfortunately, model refinement has proven to be very difficult, in large part because of the poor diagnostic power of residual-based likelihood functions (Gupta et al., 2008). This has inspired Vrugt and Sadegh (2013) to advocate the use of 'likelihood-free' inference using approximate Bayesian computation (ABC). This approach uses one or more summary statistics, S(D̂), of the original data, D̂, designed ideally to be sensitive only to one particular process in the model. Any mismatch between the observed and simulated summary metrics is then easily linked to a specific model component. A recurrent issue with the application of ABC is self-sufficiency of the summary statistics. In theory, S(·) should contain as much information as the original data itself, yet complex systems rarely admit sufficient statistics. In this article, we propose to combine the ideas of ABC and regular Bayesian inference to guarantee that no information is lost in diagnostic model evaluation. This hybrid approach, coined diagnostic Bayes, uses the summary metrics as prior distribution and the original data in the likelihood function, or P(x|D̂) ∝ P(x|S(D̂)) L(x|D̂). A case study illustrates the ability of the proposed methodology to diagnose epistemic errors and provide guidance on model refinement.
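Numerically, the diagnostic-Bayes construction P(x|D̂) ∝ P(x|S(D̂)) L(x|D̂) can be realized in two stages: draw from an ABC posterior conditioned on the summary statistic, then importance-weight those draws by the conventional likelihood of the full data. The toy one-parameter sketch below assumes a Gaussian model with the sample mean as summary; it illustrates the construction, not the author's hydrologic application.

```python
# Two-stage diagnostic-Bayes sketch: ABC-on-summary as the prior,
# full-data likelihood as the importance weight.
import numpy as np

rng = np.random.default_rng(8)
data = rng.normal(2.0, 1.0, size=50)     # observed data (toy, unit variance known)
s_obs = data.mean()                      # summary statistic S(D)

# stage 1: ABC on the summary statistic gives draws from P(x | S(D))
x = rng.uniform(-5.0, 5.0, size=200_000)
s_sim = x + rng.normal(0.0, 1.0 / np.sqrt(50), size=x.size)  # sampling dist. of the mean
abc_draws = x[np.abs(s_sim - s_obs) < 0.05]

# stage 2: weight those draws by the full Gaussian likelihood L(x | D)
loglik = np.array([-0.5 * np.sum((data - xi) ** 2) for xi in abc_draws])
w = np.exp(loglik - loglik.max())
print("diagnostic-Bayes posterior mean:", np.sum(w * abc_draws) / w.sum())
```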
Cornille, A; Salcedo, A; Kryvokhyzha, D; Glémin, S; Holm, K; Wright, S I; Lascoux, M
2016-01-01
Polyploidization is a dominant feature of flowering plant evolution. However, detailed genomic analyses of the interpopulation diversification of polyploids following genome duplication are still in their infancy, mainly because of methodological limits, both in terms of sequencing and computational analyses. The shepherd's purse (Capsella bursa-pastoris) is one of the most common weed species in the world. It is highly self-fertilizing, and recent genomic data indicate that it is an allopolyploid, resulting from hybridization between the ancestors of the diploid species Capsella grandiflora and Capsella orientalis. Here, we investigated the genomic diversity of C. bursa-pastoris, its population structure and demographic history, following allopolyploidization in Eurasia. To that end, we genotyped 261 C. bursa-pastoris accessions spread across Europe, the Middle East and Asia, using genotyping-by-sequencing, leading to a total of 4274 SNPs after quality control. Bayesian clustering analyses revealed three distinct genetic clusters in Eurasia: one cluster grouping samples from Western Europe and Southeastern Siberia, the second one centred on Eastern Asia and the third one in the Middle East. Approximate Bayesian computation (ABC) supported the hypothesis that C. bursa-pastoris underwent a typical colonization history involving low gene flow among colonizing populations, likely starting from the Middle East towards Europe and followed by successive human-mediated expansions into Eastern Asia. Altogether, these findings bring new insights into the recent multistage colonization history of the allotetraploid C. bursa-pastoris and highlight ABC and genotyping-by-sequencing data as promising but still challenging tools to infer demographic histories of selfing allopolyploids. © 2015 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Sadegh, M.; Vrugt, J. A.
2013-12-01
The ever-increasing pace of computational power, along with continued advances in measurement technologies and improvements in process understanding, has stimulated the development of increasingly complex hydrologic models that simulate soil moisture flow, groundwater recharge, surface runoff, root water uptake, and river discharge at increasingly finer spatial and temporal scales. Reconciling these system models with field and remote sensing data is a difficult task, particularly because average measures of model/data similarity inherently lack the power to provide a meaningful comparative evaluation of the consistency in model form and function. The very construction of the likelihood function - as a summary variable of the (usually averaged) properties of the error residuals - dilutes and mixes the available information into an index having little remaining correspondence to specific behaviors of the system (Gupta et al., 2008). The quest for a more powerful method for model evaluation has inspired Vrugt and Sadegh [2013] to introduce "likelihood-free" inference as a vehicle for diagnostic model evaluation. This class of methods is also referred to as Approximate Bayesian Computation (ABC) and relaxes the need for an explicit likelihood function in favor of one or multiple different summary statistics rooted in hydrologic theory that together have a much stronger and more compelling diagnostic power than some aggregated measure of the size of the error residuals. Here, we will introduce an efficient ABC sampling method that is orders of magnitude faster in exploring the posterior parameter distribution than commonly used rejection and Population Monte Carlo (PMC) samplers. Our methodology uses Markov Chain Monte Carlo simulation with DREAM, and takes advantage of a simple computational trick to resolve discontinuity problems with the application of set-theoretic summary statistics. We will also demonstrate a set of summary statistics that are rather insensitive to errors in the forcing data. This enhances prospects of detecting model structural deficiencies.
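One plausible reading of the computational trick mentioned above (the original paper should be consulted for the exact device) is to replace the discontinuous 0/1 ABC acceptance rule with a smooth kernel of the summary-statistic mismatch, so that the resulting pseudo-likelihood is continuous and a DREAM-style MCMC chain can move through parameter space. A minimal sketch under that assumption:

```python
# Smooth pseudo-likelihood replacing the hard ABC acceptance indicator.
import numpy as np

def abc_pseudo_loglik(sim_stats, obs_stats, eps=0.1):
    """Gaussian-kernel relaxation of the hard |S_sim - S_obs| < eps acceptance rule."""
    mismatch = np.asarray(sim_stats) - np.asarray(obs_stats)
    return -0.5 * np.sum((mismatch / eps) ** 2)

# inside any Metropolis step, a proposal theta' is then accepted with probability
# min(1, exp(abc_pseudo_loglik(S(theta'), S_obs) - abc_pseudo_loglik(S(theta), S_obs))),
# which varies smoothly with the summary-statistic mismatch.
```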
Irvine, Michael A; Hollingsworth, T Déirdre
2018-05-26
Fitting complex models to epidemiological data is a challenging problem: methodologies can be inaccessible to all but specialists, there may be challenges in adequately describing uncertainty in model fitting, the complex models may take a long time to run, and it can be difficult to fully capture the heterogeneity in the data. We develop an adaptive approximate Bayesian computation scheme to fit a variety of epidemiologically relevant data with minimal hyper-parameter tuning by using an adaptive tolerance scheme. We implement a novel kernel density estimation scheme to capture both dispersed and multi-dimensional data, and directly compare this technique to standard Bayesian approaches. We then apply the procedure to a complex individual-based simulation of lymphatic filariasis, a human parasitic disease. The procedure and examples are released alongside this article as an open access library, with examples to aid researchers to rapidly fit models to data. This demonstrates that an adaptive ABC scheme with a general summary and distance metric is capable of performing model fitting for a variety of epidemiological data. It also does not require significant theoretical background to use and can be made accessible to the diverse epidemiological research community. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
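The use of a kernel density estimate as the comparison device can be sketched as a distance between the fitted densities of the simulated and observed data sets, which remains informative for dispersed count data where a few moments would not be. The metric below (an L2 distance between 1-D KDEs on a fixed grid) is a toy assumption; the article's scheme handles multi-dimensional data and is paired with an adaptive tolerance.

```python
# KDE-based ABC distance for overdispersed data.
import numpy as np
from scipy.stats import gaussian_kde

def kde_distance(sim, obs, grid):
    """L2 distance between kernel density estimates of two 1-D data sets."""
    f_sim, f_obs = gaussian_kde(sim)(grid), gaussian_kde(obs)(grid)
    return np.sqrt(np.trapz((f_sim - f_obs) ** 2, grid))

rng = np.random.default_rng(9)
obs = rng.negative_binomial(2, 0.10, size=300).astype(float)  # overdispersed counts
sim = rng.negative_binomial(2, 0.12, size=300).astype(float)  # one candidate simulation
grid = np.linspace(0.0, obs.max() * 1.5, 200)
print("KDE distance:", kde_distance(sim, obs, grid))
```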
Inferring the mode of origin of polyploid species from next-generation sequence data.
Roux, Camille; Pannell, John R
2015-03-01
Many eukaryote organisms are polyploid. However, despite their importance, evolutionary inference of polyploid origins and modes of inheritance has been limited by a need for analyses of allele segregation at multiple loci using crosses. The increasing availability of sequence data for nonmodel species now allows the application of established approaches for the analysis of genomic data in polyploids. Here, we ask whether approximate Bayesian computation (ABC), applied to realistic traditional and next-generation sequence data, allows correct inference of the evolutionary and demographic history of polyploids. Using simulations, we evaluate the robustness of evolutionary inference by ABC for tetraploid species as a function of the number of individuals and loci sampled, and the presence or absence of an outgroup. We find that ABC adequately retrieves the recent evolutionary history of polyploid species on the basis of both old and new sequencing technologies. The application of ABC to sequence data from diploid and polyploid species of the plant genus Capsella confirms its utility. Our analysis strongly supports an allopolyploid origin of C. bursa-pastoris about 80 000 years ago. This conclusion runs contrary to previous findings based on the same data set but using an alternative approach and is in agreement with recent findings based on whole-genome sequencing. Our results indicate that ABC is a promising and powerful method for revealing the evolution of polyploid species, without the need to attribute alleles to a homeologous chromosome pair. The approach can readily be extended to more complex scenarios involving higher ploidy levels. © 2015 John Wiley & Sons Ltd.
Chan, Yvonne L; Schanzenbach, David; Hickerson, Michael J
2014-09-01
Methods that integrate population-level sampling from multiple taxa into a single community-level analysis are an essential addition to the comparative phylogeographic toolkit. Detecting how species within communities have demographically tracked each other in space and time is important for understanding the effects of future climate and landscape changes and the resulting acceleration of extinctions, biological invasions, and potential surges in adaptive evolution. Here, we present a statistical framework for such an analysis based on hierarchical approximate Bayesian computation (hABC) with the goal of detecting concerted demographic histories across an ecological assemblage. Our method combines population genetic data sets from multiple taxa into a single analysis to estimate: 1) the proportion of a community sample that demographically expanded in a temporally clustered pulse and 2) when the pulse occurred. To validate the accuracy and utility of this new approach, we use simulation cross-validation experiments and subsequently analyze an empirical data set of 32 avian populations from Australia that are hypothesized to have expanded from smaller refugia populations in the late Pleistocene. The method can accommodate data set heterogeneity such as variability in effective population size, mutation rates, and sample sizes across species and exploits the statistical strength from the simultaneous analysis of multiple species. This hABC framework used in a multitaxa demographic context can increase our understanding of the impact of historical climate change by determining what proportion of the community responded in concert or independently and can be used with a wide variety of comparative phylogeographic data sets as biota-wide DNA barcoding data sets accumulate. © The Author 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
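A stripped-down version of the hABC estimator can be written with one expansion-sensitive statistic per taxon, a hyperparameter ζ for the fraction of taxa expanding in a synchronous pulse, and a shared pulse time τ. All distributions in the sketch below are illustrative assumptions, and real analyses operate on coalescent simulations rather than the Gaussian stand-in used here.

```python
# Toy hierarchical ABC: fraction of taxa (zeta) co-expanding at time tau.
import numpy as np

rng = np.random.default_rng(12)
n_taxa = 32

def simulate_community(zeta, tau, rng):
    sync = rng.random(n_taxa) < zeta                    # taxa joining the synchronous pulse
    times = np.where(sync, tau, rng.uniform(0.0, 1.0, n_taxa))
    return times + rng.normal(0.0, 0.05, n_taxa)        # noisy per-taxon expansion statistic

obs = simulate_community(0.75, 0.2, rng)                # pseudo-observed assemblage

records = []
for _ in range(20_000):
    zeta, tau = rng.uniform(0.0, 1.0), rng.uniform(0.0, 1.0)   # hyperpriors
    sim = simulate_community(zeta, tau, rng)
    # sort both vectors so the comparison ignores taxon labels
    d = np.linalg.norm(np.sort(sim) - np.sort(obs))
    records.append((zeta, tau, d))
records = np.array(records)
keep = records[np.argsort(records[:, 2])[:500]]
print("posterior mean zeta:", keep[:, 0].mean(), " tau:", keep[:, 1].mean())
```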
NASA Astrophysics Data System (ADS)
Rohmer, Jeremy; Rousseau, Marie; Lemoine, Anne; Pedreros, Rodrigo; Lambert, Jerome; benki, Aalae
2017-04-01
Recent tsunami events, including the 2004 Indian Ocean tsunami and the 2011 Tohoku tsunami, have caused many casualties and damages to structures. Advances in numerical simulation of tsunami-induced wave processes have tremendously improved forecasting, hazard and risk assessment, and the design of early warning systems for tsunamis. Among the major challenges, several studies have underlined uncertainties in earthquake slip distributions and rupture processes as major contributors to tsunami wave height and inundation extent. These uncertainties can be constrained by taking advantage of observations either of tsunami waves (using networks of water level gauges) or of inundation characteristics (using field evidence and eyewitness accounts). Despite these successful applications, combining tsunami observations and simulations still faces several limitations when the problem is addressed for past tsunami events like the 1755 Lisbon tsunami. 1) While recent inversion studies can benefit from modern networks (e.g., tide gauges, sea bottom pressure gauges, GPS-mounted buoys), tide gauge records can be very scarce and testimonies on tsunami observations can be limited, incomplete and imprecise for past tsunami events. These observations are often restricted to eyewitness accounts of wave heights (e.g., the maximum wave height reached at the coast) instead of the full observed waveforms. 2) Tsunami phenomena involve a large span of spatial scales (from ocean basin scales to local coastal wave interactions), which can make the modelling very demanding: the computational cost of a tsunami simulation can be prohibitive, often reaching several hours. This often limits the number of allowable long-running simulations for performing the inversion, especially when the problem is addressed from a Bayesian inference perspective. The objective of the present study is to overcome both of the aforementioned difficulties so as to combine historical observations of past tsunami-induced waves with numerical simulations. To quantify the uncertainty in the source parameters, we treat the problem within the Bayesian setting, which enables the different uncertainty sources to be incorporated in a flexible manner. We propose to rely on an emerging technique called Approximate Bayesian Computation (ABC), which has been developed to estimate the posterior distribution in modelling scenarios where the likelihood function is either unknown or cannot be explicitly defined. To overcome the computational issue, we combine ABC with statistical emulators (aka meta-models). We apply the proposed approach to the case study of the 1887 Ligurian (northwest Italy) tsunami and discuss the results with special attention paid to the impact of observational error.
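The emulator idea can be sketched as a two-stage procedure: train a Gaussian-process surrogate on a small design of expensive simulator runs, then run rejection ABC against the cheap surrogate. The 1-D toy simulator below stands in for the tsunami model; the slip parameter, observation and tolerance are illustrative assumptions.

```python
# Emulator-accelerated ABC: a GP surrogate replaces the expensive simulator.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(10)

def expensive_simulator(slip):
    # toy stand-in for the tsunami model: wave height as a function of fault slip
    return 3.0 * np.tanh(slip / 4.0) + 0.5 * np.sin(slip)

design = rng.uniform(0.0, 10.0, size=(40, 1))        # 40 affordable simulator runs
emulator = GaussianProcessRegressor().fit(design, expensive_simulator(design[:, 0]))

obs_height, eps = 2.4, 0.05                          # observed wave height and tolerance
candidates = rng.uniform(0.0, 10.0, size=(100_000, 1))
pred = emulator.predict(candidates)                  # cheap surrogate evaluations
posterior = candidates[np.abs(pred - obs_height) < eps, 0]
print(f"posterior slip: {posterior.mean():.2f} +/- {posterior.std():.2f}")
```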
Gehara, Marcelo; Garda, Adrian A; Werneck, Fernanda P; Oliveira, Eliana F; da Fonseca, Emanuel M; Camurugi, Felipe; Magalhães, Felipe de M; Lanna, Flávia M; Sites, Jack W; Marques, Ricardo; Silveira-Filho, Ricardo; São Pedro, Vinícius A; Colli, Guarino R; Costa, Gabriel C; Burbrink, Frank T
2017-09-01
Many studies propose that Quaternary climatic cycles contracted and/or expanded the ranges of species and biomes. Strong expansion-contraction dynamics of biomes presume concerted demographic changes of associated fauna. The analysis of temporal concordance of demographic changes can be used to test the influence of Quaternary climate on diversification processes. Hierarchical approximate Bayesian computation (hABC) is a powerful and flexible approach that models genetic data from multiple species, and can be used to estimate the temporal concordance of demographic processes. Using available single-locus data, we can now perform large-scale analyses, both in terms of number of species and geographic scope. Here, we first compared the power of four alternative hABC models for a collection of single-locus data. We found that the model incorporating an a priori hypothesis about the timing of simultaneous demographic change had the best performance. Second, we applied the hABC models to a data set of seven squamate and four amphibian species occurring in the Seasonally Dry Tropical Forests (Caatinga) in northeastern Brazil, which, according to paleoclimatic evidence, experienced an increase in aridity during the Pleistocene. If this increase was important for the diversification of associated xeric-adapted species, simultaneous population expansions should be evident at the community level. We found a strong signal of synchronous population expansion in the Late Pleistocene, supporting the increase of the Caatinga during this time. This expansion likely enhanced the formation of communities adapted to high aridity and seasonality and caused regional extirpation of taxa adapted to wet forest. © 2017 John Wiley & Sons Ltd.
The ABC of non-inferiority margin setting from indirect comparisons.
Julious, Steven A
2011-01-01
In a non-inferiority trial to assess a new investigative treatment, there may need to be consideration of an indirect comparison with placebo using the active control in the current trial. We can, therefore, use the fact that there is a common active control in the comparisons of the investigative treatment and placebo. In analysing a non-inferiority trial, the ABC of: Assay sensitivity, Bias minimisation and Constancy assumption needs to be considered. It is highlighted how the ABC assumptions can potentially fail when there is placebo creep or a patient population shift. In this situation, the belief about the placebo response expressed in terms of a prior probability in Bayesian formulation could be used with the observed treatment effects to set the non-inferiority limit. Copyright © 2011 John Wiley & Sons, Ltd.
Bayesian Models for Astrophysical Data Using R, JAGS, Python, and Stan
NASA Astrophysics Data System (ADS)
Hilbe, Joseph M.; de Souza, Rafael S.; Ishida, Emille E. O.
2017-05-01
This comprehensive guide to Bayesian methods in astronomy enables hands-on work by supplying complete R, JAGS, Python, and Stan code, to use directly or to adapt. It begins by examining the normal model from both frequentist and Bayesian perspectives and then progresses to a full range of Bayesian generalized linear and mixed or hierarchical models, as well as additional types of models such as ABC and INLA. The book provides code that is largely unavailable elsewhere and includes details on interpreting and evaluating Bayesian models. Initial discussions offer models in synthetic form so that readers can easily adapt them to their own data; later the models are applied to real astronomical data. The consistent focus is on hands-on modeling, analysis of data, and interpretations that address scientific questions. A must-have for astronomers, its concrete approach will also be attractive to researchers in the sciences more generally.
A Bayesian Approach to Genome/Linguistic Relationships in Native South Americans
Amorim, Carlos Eduardo Guerra; Bisso-Machado, Rafael; Ramallo, Virginia; Bortolini, Maria Cátira; Bonatto, Sandro Luis; Salzano, Francisco Mauro; Hünemeier, Tábita
2013-01-01
The relationship between the evolution of genes and languages has been studied for over three decades. These studies rely on the assumption that languages, as many other cultural traits, evolve in a gene-like manner, accumulating heritable diversity through time and being subjected to evolutionary mechanisms of change. In the present work we used genetic data to evaluate South American linguistic classifications. We compared discordant models of language classifications to the current Native American genome-wide variation using realistic demographic models analyzed under an Approximate Bayesian Computation (ABC) framework. Data on 381 STRs spread along the autosomes were gathered from the literature for populations representing the five main South Amerindian linguistic groups: Andean, Arawakan, Chibchan-Paezan, Macro-Jê, and Tupí. The results indicated a higher posterior probability for the classification proposed by J.H. Greenberg in 1987, although L. Campbell's 1997 classification cannot be ruled out. Based on Greenberg's classification, it was possible to date the time of Tupí-Arawakan divergence (2.8 kya), and the time of emergence of the structure between present day major language groups in South America (3.1 kya). PMID:23696865
EggLib: processing, analysis and simulation tools for population genetics and genomics.
De Mita, Stéphane; Siol, Mathieu
2012-04-11
With the considerable growth of available nucleotide sequence data over the last decade, integrated and flexible analytical tools have become a necessity. In particular, in the field of population genetics, there is a strong need for automated and reliable procedures to conduct repeatable and rapid polymorphism analyses, coalescent simulations, data manipulation and estimation of demographic parameters under a variety of scenarios. In this context, we present EggLib (Evolutionary Genetics and Genomics Library), a flexible and powerful C++/Python software package providing efficient and easy to use computational tools for sequence data management and extensive population genetic analyses on nucleotide sequence data. EggLib is a multifaceted project involving several integrated modules: an underlying computationally efficient C++ library (which can be used independently in pure C++ applications); two C++ programs; a Python package providing, among other features, a high level Python interface to the C++ library; and the egglib script which provides direct access to pre-programmed Python applications. EggLib has been designed aiming to be both efficient and easy to use. A wide array of methods are implemented, including file format conversion, sequence alignment edition, coalescent simulations, neutrality tests and estimation of demographic parameters by Approximate Bayesian Computation (ABC). Classes implementing different demographic scenarios for ABC analyses can easily be developed by the user and included to the package. EggLib source code is distributed freely under the GNU General Public License (GPL) from its website http://egglib.sourceforge.net/ where a full documentation and a manual can also be found and downloaded.
NASA Astrophysics Data System (ADS)
Svensson, Andreas; Schön, Thomas B.; Lindsten, Fredrik
2018-05-01
Probabilistic (or Bayesian) modeling and learning offers interesting possibilities for systematic representation of uncertainty using probability theory. However, probabilistic learning often leads to computationally challenging problems. Some problems of this type that were previously intractable can now be solved on standard personal computers thanks to recent advances in Monte Carlo methods. In particular, for learning of unknown parameters in nonlinear state-space models, methods based on the particle filter (a Monte Carlo method) have proven very useful. A notoriously challenging problem, however, still occurs when the observations in the state-space model are highly informative, i.e. when there is very little or no measurement noise present, relative to the amount of process noise. The particle filter will then struggle in estimating one of the basic components for probabilistic learning, namely the likelihood p(data | parameters). To this end we suggest an algorithm which initially assumes that there is a substantial amount of artificial measurement noise present. The variance of this noise is sequentially decreased in an adaptive fashion such that we, in the end, recover the original problem or possibly a very close approximation of it. The main component in our algorithm is a sequential Monte Carlo (SMC) sampler, which gives our proposed method a clear resemblance to the SMC2 method. Another natural link is also made to the ideas underlying approximate Bayesian computation (ABC). We illustrate the method with numerical examples, and in particular show promising results for a challenging Wiener-Hammerstein benchmark problem.
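The annealing idea can be condensed to a tempered sequential Monte Carlo loop in which the artificial measurement-noise standard deviation is decreased stage by stage, with incremental reweighting and resampling in between. The sketch below uses a trivial static model and a crude jitter in place of a proper MCMC move kernel, so it is only a schematic of the approach.

```python
# Annealed artificial measurement noise in a simple SMC loop.
import numpy as np

rng = np.random.default_rng(11)
y_obs = 1.37                                   # essentially noise-free observation of theta

def loglik(theta, sigma):
    return -0.5 * ((y_obs - theta) / sigma) ** 2 - np.log(sigma)

particles = rng.normal(0.0, 3.0, size=5_000)   # draws from the prior
prev_sigma = None
for sigma in [3.0, 1.5, 0.7, 0.3, 0.1]:        # decreasing artificial noise levels
    lw = loglik(particles, sigma)
    if prev_sigma is not None:
        lw -= loglik(particles, prev_sigma)    # incremental weight between stages
    w = np.exp(lw - lw.max()); w /= w.sum()
    idx = rng.choice(particles.size, size=particles.size, p=w)
    particles = particles[idx]                 # resample
    particles += rng.normal(0.0, 0.02, size=particles.size)  # crude jitter, not a proper MCMC move
    prev_sigma = sigma
print(f"posterior mean {particles.mean():.3f}, std {particles.std():.3f}")
```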
Tseng, Shu-Ping; Li, Shou-Hsien; Hsieh, Chia-Hung; Wang, Hurng-Yi; Lin, Si-Min
2014-10-01
Dating the time of divergence and understanding speciation processes are central to the study of the evolutionary history of organisms but are notoriously difficult. The difficulty is largely rooted in variations in the ancestral population size or in the genealogy variation across loci. To depict the speciation processes and divergence histories of three monophyletic Takydromus species endemic to Taiwan, we sequenced 20 nuclear loci and combined with one mitochondrial locus published in GenBank. They were analysed by a multispecies coalescent approach within a Bayesian framework. Divergence dating based on the gene-tree approach showed high variation among loci, and the divergence was estimated at an earlier date than when derived by the species-tree approach. To test whether variations in the ancestral population size accounted for the majority of this variation, we conducted computational inference using isolation-with-migration (IM) and approximate Bayesian computation (ABC) frameworks. The results revealed that gene flow during the early stage of speciation was strongly favoured over the isolation model, and the initiation of the speciation process was far earlier than the dates estimated by gene- and species-based divergence dating. Due to their limited dispersal ability, it is suggested that geographical isolation may have played a major role in the divergence of these Takydromus species. Nevertheless, this study reveals a more complex situation and demonstrates that gene flow during the speciation process cannot be overlooked and may have a great impact on divergence dating. By using multilocus data and incorporating Bayesian coalescence approaches, we provide a more biologically realistic framework for delineating the divergence history of Takydromus. © 2014 John Wiley & Sons Ltd.
Evidence for cryptic northern refugia in the last glacial period in Cryptomeria japonica
Kimura, Megumi K.; Uchiyama, Kentaro; Nakao, Katsuhiro; Moriguchi, Yoshinari; San Jose-Maldia, Lerma; Tsumura, Yoshihiko
2014-01-01
Background and Aims Distribution shifts and natural selection during past climatic changes are important factors in determining the genetic structure of forest species. In particular, climatic fluctuations during the Quaternary appear to have caused changes in the distribution ranges of plants, and thus strongly affected their genetic structure. This study was undertaken to identify the responses of the conifer Cryptomeria japonica, endemic to the Japanese Archipelago, to past climatic changes using a combination of phylogeography and species distribution modelling (SDM) methods. Specifically, this study focused on the locations of refugia during the last glacial maximum (LGM). Methods Genetic diversity and structure were examined using 20 microsatellite markers in 37 populations of C. japonica. The locations of glacial refugia were assessed using STRUCTURE analysis, and potential habitats under current and past climate conditions were predicted using SDM. The process of genetic divergence was also examined using the approximate Bayesian computation procedure (ABC) in DIY ABC to test the divergence time between the gene pools detected by the STRUCTURE analysis. Key Results STRUCTURE analysis identified four gene pools: northern Tohoku district; from Chubu to Chugoku district; from Tohoku to Shikoku district on the Pacific Ocean side of the Archipelago; and Yakushima Island. DIY ABC analysis indicated that the four gene pools diverged at the same time before the LGM. SDM also indicated potential northern cryptic refugia. Conclusions The combined evidence from microsatellites and SDM clearly indicates that climatic changes have shaped the genetic structure of C. japonica. The gene pool detected in northern Tohoku district is likely to have been established by cryptic northern refugia on the coast of the Japan Sea to the west of the Archipelago. The gene pool in Yakushima Island can probably be explained simply by long-term isolation from the other gene pools since the LGM. These results are supported by those of SDM and the predicted divergence time determined using ABC analysis. PMID:25355521
Recent colonization of the Galápagos by the tree Geoffroea spinosa Jacq. (Leguminosae).
Caetano, S; Currat, M; Pennington, R T; Prado, D; Excoffier, L; Naciri, Y
2012-06-01
This study combines genetic data with an approximate Bayesian computation (ABC) approach to infer the time at which the tree Geoffroea spinosa colonized the Galápagos Islands. The genetic diversity and differentiation between Peru and Galápagos population samples, estimated using three chloroplast spacers and six microsatellite loci, reveal significant differences between two mainland regions separated by the Andes mountains (Inter-Andean vs. Pacific Coast) as well as significant genetic differentiation of the island populations. Microsatellites identify two distinct geographical clusters, the Galápagos and the mainland, and chloroplast markers show a private haplotype in the Galápagos. The nuclear distinctiveness of the Inter-Andean populations suggests currently restricted pollen flow, but the chloroplast data point to cross-Andean dispersal via seeds, indicating that the Andes might not be an effective biogeographical barrier. The ABC analyses clearly point to colonization of the Galápagos within the last 160,000 years, possibly as recently as 4750 years ago (475 generations). Founder events associated with the colonization of the two islands where the species occurs are detected, with Española having been colonized after Floreana. We discuss two non-mutually exclusive possibilities for the colonization of the Galápagos: recent natural dispersal vs. human introduction. © 2012 Blackwell Publishing Ltd.
Athrey, Giridhar; Barr, Kelly R; Lance, Richard F; Leberg, Paul L
2012-01-01
Anthropogenic alterations in the natural environment can be a potent evolutionary force. For species that have specific habitat requirements, habitat loss can result in substantial genetic effects, potentially impeding future adaptability and evolution. The endangered black-capped vireo (Vireo atricapilla) suffered a substantial contraction of breeding habitat and population size during much of the 20th century. In a previous study, we reported significant differentiation between remnant populations but failed to recover a strong genetic signal of bottlenecks. In this study, we used a combination of historical and contemporary sampling from Oklahoma and Texas to (i) determine whether population structure and genetic diversity have changed over time and (ii) evaluate alternative demographic hypotheses using approximate Bayesian computation (ABC). We found lower genetic diversity and increased differentiation in contemporary samples compared to historical samples, indicating nontrivial impacts of fragmentation. ABC analysis suggests a bottleneck occurred in the early part of the 20th century, resulting in a substantial decline in effective population size. Genetic monitoring with temporally spaced samples, such as that used in this study, can be highly informative for assessing the genetic impacts of anthropogenic fragmentation on threatened or endangered species, as well as for revealing the dynamics of small populations over time. PMID:23028396
Ursenbacher, Sylvain; Guillon, Michaël; Cubizolle, Hervé; Dupoué, Andréaz; Blouin-Demers, Gabriel; Lourdais, Olivier
2015-07-01
Understanding the impact of postglacial recolonization on genetic diversity is essential in explaining current patterns of genetic variation. The central-marginal hypothesis (CMH) predicts a reduction in genetic diversity from the core of the distribution to peripheral populations, as well as reduced connectivity between peripheral populations. While the CMH has received considerable empirical support, its broad applicability is still debated and alternative hypotheses predict different spatial patterns of genetic diversity. Using microsatellite markers, we analysed the genetic diversity of the adder (Vipera berus) in western Europe to reconstruct postglacial recolonization. Approximate Bayesian Computation (ABC) analyses suggested a postglacial recolonization from two routes: a western route from the Atlantic Coast up to Belgium and a central route from the Massif Central to the Alps. This cold-adapted species likely used two isolated glacial refugia in southern France, in permafrost-free areas during the last glacial maximum. Adder populations further from putative glacial refugia had lower genetic diversity and reduced connectivity; therefore, our results support the predictions of the CMH. Our study also illustrates the utility of highly variable nuclear markers, such as microsatellites, and ABC to test competing recolonization hypotheses. © 2015 John Wiley & Sons Ltd.
Modified artificial bee colony for the vehicle routing problems with time windows.
Alzaqebah, Malek; Abdullah, Salwani; Jawarneh, Sana
2016-01-01
The natural behaviour of the honeybee has attracted the attention of researchers in recent years, and several algorithms have been developed that mimic swarm behaviour to solve optimisation problems. This paper introduces an artificial bee colony (ABC) algorithm for the vehicle routing problem with time windows (VRPTW). A Modified ABC algorithm is proposed to improve the solution quality of the original ABC. The high exploration ability of the ABC slows down its convergence speed, which may be due to the mechanism used by scout bees to replace abandoned (unimproved) solutions with new ones. In the Modified ABC, the scout bees maintain a list of abandoned solutions; a scout bee then selects a solution from this list by roulette-wheel selection and replaces it with a new solution built from random routes drawn from the best solution, as sketched below. The performance of the Modified ABC is evaluated on the Solomon benchmark datasets and compared with the original ABC. The computational results demonstrate that the Modified ABC outperforms the original ABC and also produces good solutions when compared with the best-known results in the literature. Computational investigations show that the proposed algorithm is a good and promising approach for the VRPTW.
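As an illustration of the modified scout phase just described, the following sketch implements roulette-wheel selection over a memorised list of abandoned solutions. The replacement step that copies random routes from the best solution is a hypothetical reading of the abstract under our own naming, not the authors' code.

```python
import random

def roulette_select(items, weights):
    """Roulette-wheel selection: pick an item with probability
    proportional to its weight (here, solution quality)."""
    r = random.uniform(0.0, sum(weights))
    acc = 0.0
    for item, w in zip(items, weights):
        acc += w
        if acc >= r:
            return item
    return items[-1]

def scout_phase(abandoned, quality, best_solution):
    """Modified scout phase (sketch): choose one abandoned solution from
    the memorised list via roulette wheel, then rebuild it from routes
    drawn at random from the current best solution."""
    chosen = roulette_select(abandoned, quality)
    n_routes = random.randint(1, len(best_solution))
    rebuilt = [route[:] for route in random.sample(best_solution, n_routes)]
    return chosen, rebuilt

# toy usage: a solution is a list of routes, a route a list of customers
best = [[1, 4, 2], [3, 5], [6, 7, 8]]
old = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]
print(scout_phase(old, quality=[0.3, 0.7], best_solution=best))
```

The point of the roulette wheel here is that better abandoned solutions are more likely to be revisited, rather than being discarded uniformly at random as in the original scout-bee rule.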
Artificial Boundary Conditions Based on the Difference Potentials Method
NASA Technical Reports Server (NTRS)
Tsynkov, Semyon V.
1996-01-01
While numerically solving a problem initially formulated on an unbounded domain, one typically truncates this domain, which necessitates setting artificial boundary conditions (ABC's) at the newly formed external boundary. The issue of setting the ABC's is significant in many areas of scientific computing, for example, in problems originating from acoustics, electrodynamics, solid mechanics, and fluid dynamics. In particular, in computational fluid dynamics (where external problems present a wide class of practically important formulations) the proper treatment of external boundaries may have a profound impact on the overall quality and performance of numerical algorithms. Most of the currently used techniques for setting the ABC's can basically be classified into two groups. The methods from the first group (global ABC's) usually provide high accuracy and robustness of the numerical procedure but often appear to be fairly cumbersome and (computationally) expensive. The methods from the second group (local ABC's) are, as a rule, algorithmically simple, numerically cheap, and geometrically universal; however, they usually lack accuracy. In this paper we first present a survey and provide a comparative assessment of different existing methods for constructing ABC's. Then, we describe a relatively new ABC technique of ours and review the corresponding results. This technique, in our opinion, is currently one of the most promising in the field. It enables one to construct ABC's that combine the advantages of the two aforementioned classes of existing methods. Our approach is based on application of the difference potentials method attributable to V. S. Ryaben'kii. This approach allows us to obtain highly accurate ABC's in the form of certain (nonlocal) boundary operator equations. The operators involved are analogous to the pseudodifferential boundary projections first introduced by A. P. Calderon and then also studied by R. T. Seeley. The apparatus of boundary pseudodifferential equations, which has formerly been used mostly in the qualitative theory of integral equations and PDEs, is now effectively employed for developing numerical methods in different fields of scientific computing.
Bennett, Kelly Louise; Shija, Fortunate; Linton, Yvonne-Marie; Misinzo, Gerald; Kaddumukasa, Martha; Djouaka, Rousseau; Anyaele, Okorie; Harris, Angela; Irish, Seth; Hlaing, Thaung; Prakash, Anil; Lutwama, Julius; Walton, Catherine
2016-09-01
Increasing globalization has promoted the spread of exotic species, including disease vectors. Understanding the evolutionary processes involved in such colonizations is both of intrinsic biological interest and important to predict and mitigate future disease risks. The Aedes aegypti mosquito is a major vector of dengue, chikungunya and Zika, the worldwide spread of which has been facilitated by Ae. aegypti's adaptation to human-modified environments. Understanding the evolutionary processes involved in this invasion requires characterization of the genetic make-up of the source population(s). The application of approximate Bayesian computation (ABC) to sequence data from four nuclear and one mitochondrial marker revealed that African populations of Ae. aegypti best fit a demographic model of lineage diversification, historical admixture and recent population structuring. As ancestral Ae. aegypti were dependent on forests, this population history is consistent with the effects of forest fragmentation and expansion driven by Pleistocene climatic change. Alternatively, or additionally, historical human movement across the continent may have facilitated their recent spread and mixing. ABC analysis and haplotype networks support earlier inferences of a single out-of-Africa colonization event, while a cline of decreasing genetic diversity indicates that Ae. aegypti moved first from Africa to the Americas and then to Asia. ABC analysis was unable to verify this colonization route, possibly because the genetic signal of admixture obscures the true colonization pathway. By increasing genetic diversity and forming novel allelic combinations, divergence and historical admixture within Africa could have provided the adaptive potential needed for the successful worldwide spread of Ae. aegypti. © 2016 The Authors. Molecular Ecology Published by John Wiley & Sons Ltd.
A new model to predict weak-lensing peak counts. II. Parameter constraint strategies
NASA Astrophysics Data System (ADS)
Lin, Chieh-An; Kilbinger, Martin
2015-11-01
Context. Peak counts have been shown to be an excellent tool for extracting the non-Gaussian part of the weak-lensing signal. Recently, we developed a fast stochastic forward model to predict weak-lensing peak counts. Our model is able to reconstruct the underlying distribution of observables for analysis. Aims: In this work, we explore and compare various strategies for constraining parameters using our model, focusing on the matter density Ωm and the density fluctuation amplitude σ8. Methods: First, we examine the impact of the cosmological dependency of covariances (CDC). Second, we perform the analysis with the copula likelihood, a technique that makes a weaker assumption than does the Gaussian likelihood. Third, direct, non-analytic parameter estimations are applied using the full information of the distribution. Fourth, we obtain constraints with approximate Bayesian computation (ABC), an efficient, robust, and likelihood-free algorithm based on accept-reject sampling. Results: We find that neglecting the CDC effect enlarges parameter contours by 22%, and that the covariance-varying copula likelihood is a very good approximation to the true likelihood. The direct techniques work well in spite of noisier contours. Concerning ABC, the iterative process converges quickly to a posterior distribution that is in excellent agreement with results from our other analyses, while the time cost for ABC is reduced by two orders of magnitude. Conclusions: The stochastic nature of our weak-lensing peak count model allows us to use various techniques that approach the true underlying probability distribution of observables, without making simplifying assumptions. Our work can be generalized to other observables where forward simulations provide samples of the underlying distribution.
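Since the ABC variant named above is accept-reject sampling, the core loop is easy to state generically. The sketch below is a textbook rejection-ABC sampler with a toy Gaussian model of our own choosing, not the weak-lensing pipeline; all names and tolerances are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def abc_rejection(observed_stat, simulate, prior_draw,
                  n_draws=20000, epsilon=0.05):
    """Basic accept-reject ABC: keep parameter draws whose simulated
    summary statistic falls within epsilon of the observed one."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_draw()
        if abs(simulate(theta) - observed_stat) < epsilon:
            accepted.append(theta)
    return np.array(accepted)

# toy example: infer the mean of a Gaussian from its sample mean
post = abc_rejection(
    observed_stat=0.8,
    simulate=lambda th: rng.normal(th, 1.0, 100).mean(),
    prior_draw=lambda: rng.uniform(-3.0, 3.0),
)
print(len(post), post.mean() if len(post) else None)
```

Shrinking epsilon tightens the approximation to the true posterior at the cost of a lower acceptance rate, which is why iterative schemes that adapt the tolerance converge so much faster than a single-pass rejection run.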
Inference of Transmission Network Structure from HIV Phylogenetic Trees
Giardina, Federica; Romero-Severson, Ethan Obie; Albert, Jan; ...
2017-01-13
Phylogenetic inference is an attractive means to reconstruct transmission histories and epidemics. However, there is not a perfect correspondence between transmission history and virus phylogeny. Both node height and topological differences may occur, depending on the interaction between within-host evolutionary dynamics and between-host transmission patterns. To investigate these interactions, we added a within-host evolutionary model in epidemiological simulations and examined if the resulting phylogeny could recover different types of contact networks. To further improve realism, we also introduced patient-specific differences in infectivity across disease stages, and on the epidemic level we considered incomplete sampling and the age of the epidemic. Second, we implemented an inference method based on approximate Bayesian computation (ABC) to discriminate among three well-studied network models and jointly estimate both network parameters and key epidemiological quantities such as the infection rate. Our ABC framework used both topological and distance-based tree statistics for comparison between simulated and observed trees. Overall, our simulations showed that a virus time-scaled phylogeny (genealogy) may be substantially different from the between-host transmission tree. This has important implications for the interpretation of what a phylogeny reveals about the underlying epidemic contact network. In particular, we found that while the within-host evolutionary process obscures the transmission tree, the diversification process and infectivity dynamics also add discriminatory power to differentiate between different types of contact networks. We also found that the possibility to differentiate contact networks depends on how far an epidemic has progressed, where distance-based tree statistics have more power early in an epidemic. Finally, we applied our ABC inference on two different outbreaks from the Swedish HIV-1 epidemic.
2011-01-01
Background Bacteria have evolved a rich set of mechanisms for sensing and adapting to adverse conditions in their environment. These are crucial for their survival, which requires them to react to extracellular stresses such as heat shock, ethanol treatment or phage infection. Here we focus on studying the phage shock protein (Psp) stress response in Escherichia coli induced by a phage infection or other damage to the bacterial membrane. This system has not yet been theoretically modelled or analysed in silico. Results We develop a model of the Psp response system, and illustrate how such models can be constructed and analyzed in light of available sparse and qualitative information in order to generate novel biological hypotheses about their dynamical behaviour. We analyze this model using tools from Petri-net theory and study its dynamical range that is consistent with currently available knowledge by conditioning model parameters on the available data in an approximate Bayesian computation (ABC) framework. Within this ABC approach we analyze stochastic and deterministic dynamics. This analysis allows us to identify different types of behaviour and these mechanistic insights can in turn be used to design new, more detailed and time-resolved experiments. Conclusions We have developed the first mechanistic model of the Psp response in E. coli. This model allows us to predict the possible qualitative stochastic and deterministic dynamic behaviours of key molecular players in the stress response. Our inferential approach can be applied to stress response and signalling systems more generally: in the ABC framework we can condition mathematical models on qualitative data in order to delimit e.g. parameter ranges or the qualitative system dynamics in light of available end-point or qualitative information. PMID:21569396
Ratmann, Oliver; Andrieu, Christophe; Wiuf, Carsten; Richardson, Sylvia
2009-06-30
Mathematical models are an important tool to explain and comprehend complex phenomena, and unparalleled computational advances enable us to explore them easily, even with little or no understanding of their global properties. In fact, the likelihood of the data under complex stochastic models is often analytically or numerically intractable in many areas of science. This makes it even more important to simultaneously investigate the adequacy of these models (in absolute terms, against the data, rather than relative to the performance of other models), but no such procedure has been formally discussed when the likelihood is intractable. We provide a statistical interpretation to current developments in likelihood-free Bayesian inference that explicitly accounts for discrepancies between the model and the data, termed Approximate Bayesian Computation under model uncertainty (ABCμ). We augment the likelihood of the data with unknown error terms that correspond to freely chosen checking functions, and provide Monte Carlo strategies for sampling from the associated joint posterior distribution without the need to evaluate the likelihood. We discuss the benefit of incorporating model diagnostics within an ABC framework, and demonstrate how this method diagnoses model mismatch and guides model refinement by contrasting three qualitative models of protein network evolution with the protein interaction datasets of Helicobacter pylori and Treponema pallidum. Our results make a number of model deficiencies explicit, and suggest that the T. pallidum network topology is inconsistent with evolution dominated by link turnover or lateral gene transfer alone.
Cornejo-Romero, Amelia; Vargas-Mendoza, Carlos Fabián; Aguilar-Martínez, Gustavo F.; Medina-Sánchez, Javier; Rendón-Aguilar, Beatriz; Valverde, Pedro Luis; Zavala-Hurtado, Jose Alejandro; Serrato, Alejandra; Rivas-Arancibia, Sombra; Pérez-Hernández, Marco Aurelio; López-Ortega, Gerardo; Jiménez-Sierra, Cecilia
2017-01-01
Historic demography changes of plant species adapted to New World arid environments could be consistent with either the Glacial Refugium Hypothesis (GRH), which posits that populations contracted to refuges during cold-dry glacial periods and expanded in warm-humid interglacials, or with the Interglacial Refugium Hypothesis (IRH), which suggests that populations contracted during interglacials and expanded in glacial times. These contrasting hypotheses are evaluated in the present study for the giant columnar cactus Cephalocereus columna-trajani in the intertropical Mexican drylands, where the effects of Late Quaternary climatic changes on the phylogeography of cacti remain largely unknown. In order to determine if the historic demography and phylogeographic structure of the species are consistent with either hypothesis, sequences of the chloroplast regions psbA-trnH and trnT-trnL from 110 individuals from 10 populations comprising the full distribution range of this species were analysed. Standard estimators of genetic diversity and structure were calculated. The historic demography was analysed using a Bayesian approach, and the palaeodistribution was derived from ecological niche modelling to determine if, in the arid environments of south-central Mexico, glacial-interglacial cycles drove the genetic divergence and diversification of this species. Results reveal low but statistically significant population differentiation (FST = 0.124, P < 0.001), although no clear geographic clusters are formed. Genetic diversity, haplotype network and Approximate Bayesian Computation (ABC) demographic analyses suggest a population expansion estimated to have taken place in the Last Interglacial (123.04 kya, 95% CI 115.3–130.03). The species palaeodistribution is consistent with the ABC analyses and indicates that the potential area of palaeodistribution and climatic suitability were larger during the Last Interglacial and Holocene than in the Last Glacial Maximum. Overall, these results suggest that C. columna-trajani experienced an expansion following the warm conditions of interglacials, in accordance with the GRH. PMID:28426818
Hadjithomas, Michalis; Chen, I-Min Amy; Chu, Ken; Ratner, Anna; Palaniappan, Krishna; Szeto, Ernest; Huang, Jinghua; Reddy, T B K; Cimermančič, Peter; Fischbach, Michael A; Ivanova, Natalia N; Markowitz, Victor M; Kyrpides, Nikos C; Pati, Amrita
2015-07-14
In the discovery of secondary metabolites, analysis of sequence data is a promising exploration path that remains largely underutilized due to the lack of computational platforms that enable such a systematic approach on a large scale. In this work, we present IMG-ABC (https://img.jgi.doe.gov/abc), an atlas of biosynthetic gene clusters within the Integrated Microbial Genomes (IMG) system, which is aimed at harnessing the power of "big" genomic data for discovering small molecules. IMG-ABC relies on IMG's comprehensive integrated structural and functional genomic data for the analysis of biosynthetic gene clusters (BCs) and associated secondary metabolites (SMs). SMs and BCs serve as the two main classes of objects in IMG-ABC, each with a rich collection of attributes. A unique feature of IMG-ABC is the incorporation of both experimentally validated and computationally predicted BCs in genomes as well as metagenomes, thus identifying BCs in uncultured populations and rare taxa. We demonstrate the strength of IMG-ABC's focused integrated analysis tools in enabling the exploration of microbial secondary metabolism on a global scale, through the discovery of phenazine-producing clusters for the first time in Alphaproteobacteria. IMG-ABC strives to fill the long-existent void of resources for computational exploration of the secondary metabolism universe; its underlying scalable framework enables traversal of uncovered phylogenetic and chemical structure space, serving as a doorway to a new era in the discovery of novel molecules. IMG-ABC is the largest publicly available database of predicted and experimental biosynthetic gene clusters and the secondary metabolites they produce. The system also includes powerful search and analysis tools that are integrated with IMG's extensive genomic/metagenomic data and analysis tool kits. As new research on biosynthetic gene clusters and secondary metabolites is published and more genomes are sequenced, IMG-ABC will continue to expand, with the goal of becoming an essential component of any bioinformatic exploration of the secondary metabolism world. Copyright © 2015 Hadjithomas et al.
Technical Note: Approximate Bayesian parameterization of a process-based tropical forest model
NASA Astrophysics Data System (ADS)
Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.
2014-02-01
Inverse parameter estimation of process-based models is a long-standing problem in many scientific disciplines. A key question for inverse parameter estimation is how to define the metric that quantifies how well model predictions fit to the data. This metric can be expressed by general cost or objective functions, but statistical inversion methods require a particular metric, the probability of observing the data given the model parameters, known as the likelihood. For technical and computational reasons, likelihoods for process-based stochastic models are usually based on general assumptions about variability in the observed data, and not on the stochasticity generated by the model. Only in recent years have new methods become available that allow the generation of likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional Markov chain Monte Carlo (MCMC) sampler, performs well in retrieving known parameter values from virtual inventory data generated by the forest model. We analyze the results of the parameter estimation, examine its sensitivity to the choice and aggregation of model outputs and observed data (summary statistics), and demonstrate the application of this method by fitting the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss how this approach differs from approximate Bayesian computation (ABC), another method commonly used to generate simulation-based likelihood approximations. Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can be successfully applied to process-based models of high complexity. The methodology is particularly suitable for heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models.
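The general pattern described above, a parametric likelihood approximation inside a conventional Metropolis sampler, can be sketched compactly. The code below is in the spirit of, but far simpler than, the FORMIND application: a Gaussian is fitted to summaries of repeated stochastic simulations and used as the likelihood. The toy model and all settings are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def synthetic_loglik(theta, simulate, s_obs, n_sim=100):
    """Approximate log-likelihood of an observed summary s_obs by fitting
    a Gaussian to summaries of repeated model simulations at theta."""
    sims = np.array([simulate(theta) for _ in range(n_sim)])
    mu, sd = sims.mean(), sims.std(ddof=1) + 1e-9  # guard against sd = 0
    return -0.5 * ((s_obs - mu) / sd) ** 2 - np.log(sd)

def metropolis(s_obs, simulate, theta0=0.0, n_iter=2000, step=0.2):
    """Random-walk Metropolis with the simulation-based likelihood
    (a flat prior is assumed for brevity)."""
    theta, ll = theta0, synthetic_loglik(theta0, simulate, s_obs)
    chain = []
    for _ in range(n_iter):
        prop = theta + rng.normal(0.0, step)
        ll_prop = synthetic_loglik(prop, simulate, s_obs)
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        chain.append(theta)
    return np.array(chain)

# toy model: the summary is the sample mean of a noisy process
sim = lambda th: rng.normal(th, 1.0, 50).mean()
chain = metropolis(s_obs=1.2, simulate=sim)
print("posterior mean estimate:", chain[500:].mean())
```

Unlike rejection-based ABC, this approach never discards simulations; the trade-off is the parametric (here Gaussian) assumption about the distribution of the summary statistics.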
Romero-Severson, Ethan O.; Bulla, Ingo; Hengartner, Nick; Bártolo, Inês; Abecasis, Ana; Azevedo-Pereira, José M.; Taveira, Nuno; Leitner, Thomas
2017-01-01
Diversity of the founding population of Human Immunodeficiency Virus Type 1 (HIV-1) transmissions raises many important biological, clinical, and epidemiological issues. In up to 40% of sexual infections, there is clear evidence for multiple founding variants, which can influence the efficacy of putative prevention methods, and the reconstruction of epidemiologic histories. To infer who-infected-whom, and to compute the probability of alternative transmission scenarios while explicitly taking phylogenetic uncertainty into account, we created an approximate Bayesian computation (ABC) method based on a set of statistics measuring phylogenetic topology, branch lengths, and genetic diversity. We applied our method to a suspected heterosexual transmission case involving three individuals, showing a complex monophyletic-paraphyletic-polyphyletic phylogenetic topology. We detected that seven phylogenetic lineages had been transmitted between two of the individuals based on the available samples, implying that many more unsampled lineages had also been transmitted. Testing whether the lineages had been transmitted at one time or over some length of time suggested that an ongoing superinfection process over several years was most likely. While one individual was found unlinked to the other two, surprisingly, when evaluating two competing epidemiological priors, the donor of the two that did infect each other was not identified by the host root-label, and was also not the primary suspect in that transmission. This highlights that it is important to take epidemiological information into account when analyzing support for one transmission hypothesis over another, as results may be nonintuitive and sensitive to details about sampling dates relative to possible infection dates. Our study provides a formal inference framework to include information on infection and sampling times, and to investigate ancestral node-label states, transmission direction, transmitted genetic diversity, and frequency of transmission. PMID:28912340
Bayesian analysis of factors associated with fibromyalgia syndrome subjects
NASA Astrophysics Data System (ADS)
Jayawardana, Veroni; Mondal, Sumona; Russek, Leslie
2015-01-01
Factors contributing to movement-related fear in subjects with fibromyalgia (FM) were assessed by Russek et al. (2014) based on data collected through a national internet survey of community-based individuals. The study focused on the variables Activities-Specific Balance Confidence scale (ABC), Primary Care Post-Traumatic Stress Disorder screen (PC-PTSD), Tampa Scale of Kinesiophobia (TSK), a Joint Hypermobility Syndrome screen (JHS), Vertigo Symptom Scale (VSS-SF), Obsessive-Compulsive Personality Disorder (OCPD), and pain, work status and physical activity derived from the Revised Fibromyalgia Impact Questionnaire (FIQR). The study presented in this paper revisits the same data with a Bayesian analysis in which appropriate priors are introduced for the variables selected in Russek's paper.
Artificial Boundary Conditions for Computation of Oscillating External Flows
NASA Technical Reports Server (NTRS)
Tsynkov, S. V.
1996-01-01
In this paper, we propose a new technique for the numerical treatment of external flow problems with oscillatory behavior of the solution in time. Specifically, we consider the case of unbounded compressible viscous plane flow past a finite body (airfoil). Oscillations of the flow in time may be caused by the time-periodic injection of fluid into the boundary layer, which, in accordance with experimental data, may substantially increase the performance of the airfoil. To conduct the actual computations, we have to somehow restrict the original unbounded domain, that is, to introduce an artificial (external) boundary and to further consider only a finite computational domain. Consequently, we will need to formulate some artificial boundary conditions (ABC's) at the introduced external boundary. The ABC's we are aiming to obtain must meet a fundamental requirement: one should be able to uniquely complement the solution calculated inside the finite computational domain to its infinite exterior so that the original problem is solved within the desired accuracy. Our construction of such ABC's for oscillating flows is based on an essential assumption: the Navier-Stokes equations can be linearized in the far field against the free-stream background. To actually compute the ABC's, we represent the far-field solution as a Fourier series in time and then apply the Difference Potentials Method (DPM) of V. S. Ryaben'kii. This paper contains a general theoretical description of the algorithm for setting the DPM-based ABC's for time-periodic external flows. Based on our experience in implementing analogous ABC's for steady-state problems (a simpler case), we expect that these boundary conditions will become an effective tool for constructing robust numerical methods to calculate oscillatory flows.
Zhang, Yunyun; Yan, Jing; Fu, Yi; Chen, Shengdi
2013-01-01
Objective To compare the accuracy of the formula 1/2ABC with 2/3SH for volume estimation of hypertensive infratentorial hematoma. Methods One hundred and forty-seven CT scans diagnosed as hypertensive infratentorial hemorrhage were reviewed. Based on shape, hematomas were categorized as regular or irregular, with multilobular defined as a special case of irregular. Hematoma volume was calculated employing computer-assisted volumetric analysis (CAVA), 1/2ABC and 2/3SH, respectively. Results The correlation coefficients between 1/2ABC (or 2/3SH) and CAVA were greater than 0.900 in all subgroups. There were no significant differences in the absolute values of volume deviation or percentage deviation between 1/2ABC and 2/3SH for regular hemorrhage (P>0.05), while for cerebellar, brainstem and irregular hemorrhages, the absolute values of volume deviation and percentage deviation for 1/2ABC were greater than for 2/3SH (P<0.05). 1/2ABC and 2/3SH underestimated hematoma volume by 10% and 5% for cerebellar hemorrhage, 14% and 9% for brainstem hemorrhage, 19% and 16% for regular hemorrhage, and 9% and 3% for irregular hemorrhage, respectively. In addition, for multilobular hemorrhage, 1/2ABC underestimated the volume by 9% while 2/3SH overestimated it by 2%. Conclusions For regular hemorrhage volume calculation, the accuracy of 2/3SH is similar to that of 1/2ABC, while for cerebellar, brainstem or irregular hemorrhages (including multilobular), 2/3SH is more accurate than 1/2ABC. PMID:23638025
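For reference, both bedside formulas compared above reduce to a line of arithmetic. The sketch below assumes the standard definitions (A and B are the largest perpendicular diameters on the axial slice with the largest hematoma area, C is the vertical extent of the hematoma, S is the area of the largest slice, and H is the vertical extent); the function names and example numbers are illustrative.

```python
def volume_half_abc(a_cm, b_cm, c_cm):
    """1/2ABC (ABC/2) ellipsoid approximation of hematoma volume in mL:
    A, B = largest perpendicular diameters (cm) on the largest axial
    slice, C = vertical extent (cm)."""
    return a_cm * b_cm * c_cm / 2.0

def volume_two_thirds_sh(s_cm2, h_cm):
    """2/3SH approximation: S = area of the largest hematoma slice (cm^2),
    H = vertical extent (cm)."""
    return 2.0 / 3.0 * s_cm2 * h_cm

# hypothetical example: a 4 x 3 cm hematoma spanning 3 cm of slices
print(volume_half_abc(4, 3, 3))       # 18.0 mL
print(volume_two_thirds_sh(9.4, 3))   # 18.8 mL
```

The two formulas agree closely for roughly ellipsoidal (regular) hematomas, which matches the paper's finding that their accuracy diverges mainly for irregular and multilobular shapes.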
Creating an iPhone Application for Collecting Continuous ABC Data
ERIC Educational Resources Information Center
Whiting, Seth W.; Dixon, Mark R.
2012-01-01
This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to an e-mail account after observations have concluded. Further suggestions are provided to customize the ABC data-collection system for individual preferences and clinical needs.
Palero, Ferran; Lopes, Joao; Abelló, Pere; Macpherson, Enrique; Pascual, Marta; Beaumont, Mark A
2009-11-09
Molecular tools may help to uncover closely related and still diverging species from a wide variety of taxa and provide insight into the mechanisms, pace and geography of marine speciation. There is some controversy over the phylogeography and speciation modes of species-groups with an Eastern Atlantic-Western Indian Ocean distribution, with previous studies suggesting that older events (Miocene) and/or more recent (Pleistocene) oceanographic processes could have influenced the phylogeny of marine taxa. The spiny lobster genus Palinurus allows for testing among these speciation hypotheses, since it has a particular distribution with two groups of three species each in the Northeastern Atlantic (P. elephas, P. mauritanicus and P. charlestoni) and the Southeastern Atlantic and Southwestern Indian Oceans (P. gilchristi, P. delagoae and P. barbarae). In the present study, we obtain a more complete understanding of the phylogenetic relationships among these species through a combined dataset with both nuclear and mitochondrial markers, testing alternative hypotheses on both the mutation rate and tree topology under recently developed approximate Bayesian computation (ABC) methods. Our analyses support a North-to-South speciation pattern in Palinurus, with all the South African species forming a monophyletic clade nested within the Northern Hemisphere species. Coalescent-based ABC methods allowed us to reject the previously proposed hypothesis of a Middle Miocene speciation event related to the closure of the Tethyan Seaway. Instead, divergence times obtained for Palinurus species using the combined mtDNA-microsatellite dataset and standard mutation rates for mtDNA agree with known glaciation-related processes occurring during the last 2 million years. The Palinurus speciation pattern is a typical example of a series of rapid speciation events occurring within a group, with very short branches separating different species. Our results support the hypothesis that recent climate-change-related oceanographic processes have influenced the phylogeny of marine taxa, with most Palinurus species originating during the last two million years. The present study highlights the value of new coalescent-based statistical methods such as ABC for testing different speciation hypotheses using molecular data.
Okumura, Yuki; Kobayashi, Ryohei; Onishi, Takako; Shoyama, Yoshinari; Barret, Olivier; Alagille, David; Jennings, Danna; Marek, Kenneth; Seibyl, John; Tamagnan, Gilles; Tanaka, Akihiro; Shirakami, Yoshifumi
2016-01-01
Non-invasive imaging of amyloid-β in the brain, a hallmark of Alzheimer's disease, may support earlier and more accurate diagnosis of the disease. In this study, we assessed the novel single photon emission computed tomography tracer 123I-ABC577 as a potential imaging biomarker for amyloid-β in the brain. The radio-iodinated imidazopyridine derivative 123I-ABC577 was designed as a candidate for a novel amyloid-β imaging agent. The binding affinity of 123I-ABC577 for amyloid-β was evaluated by saturation binding assay and in vitro autoradiography using post-mortem Alzheimer's disease brain tissue. Biodistribution experiments using normal rats were performed to evaluate the biokinetics of 123I-ABC577. Furthermore, to validate 123I-ABC577 as a biomarker for Alzheimer's disease, we performed a clinical study to compare the brain uptake of 123I-ABC577 in three patients with Alzheimer's disease and three healthy control subjects. 123I-ABC577 binding was quantified by use of the standardized uptake value ratio, which was calculated for the cortex using the cerebellum as a reference region. Standardized uptake value ratio images were visually scored as positive or negative. As a result, 123I-ABC577 showed high binding affinity for amyloid-β and desirable pharmacokinetics in the preclinical studies. In the clinical study, 123I-ABC577 was an effective marker for discriminating patients with Alzheimer's disease from healthy control subjects based on visual images or the ratio of cortical-to-cerebellar binding. In patients with Alzheimer's disease, 123I-ABC577 demonstrated clear retention in cortical regions known to accumulate amyloid, such as the frontal cortex, temporal cortex, and posterior cingulate. In contrast, less, more diffuse, and non-specific uptake without localization to these key regions was observed in healthy controls. At 150 min after injection, the cortical standardized uptake value ratio increased by ∼60% in patients with Alzheimer's disease relative to healthy control subjects. Both healthy control subjects and patients with Alzheimer's disease showed minimal 123I-ABC577 retention in the white matter. These observations indicate that 123I-ABC577 may be a useful single photon emission computed tomography imaging marker to identify amyloid-β in the human brain. The availability of an amyloid-β tracer for single photon emission computed tomography might increase the accessibility of diagnostic imaging for Alzheimer's disease. PMID:26490333
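The quantification step described above reduces to a simple ratio of regional uptake values. A minimal sketch follows, with hypothetical numbers chosen purely for illustration.

```python
def suvr(cortical_suv, cerebellar_suv):
    """Standardized uptake value ratio: target-region SUV divided by the
    SUV of a reference region (here, the cerebellum)."""
    return cortical_suv / cerebellar_suv

# hypothetical uptake values for illustration only
print(suvr(cortical_suv=1.6, cerebellar_suv=1.0))  # 1.6
```

Using the cerebellum as the reference region works because it shows little amyloid accumulation, so the ratio isolates disease-related cortical retention from global tracer kinetics.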
Row, Jeffery R.; Oyler-McCance, Sara J.; Fedy, Brad C.
2016-01-01
The distribution of spatial genetic variation across a region can shape evolutionary dynamics and impact population persistence. Local population dynamics and among-population dispersal rates are strong drivers of this spatial genetic variation, yet for many species we lack a clear understanding of how these population processes interact in space to shape within-species genetic variation. Here, we used extensive genetic and demographic data from 10 subpopulations of greater sage-grouse to parameterize a simulated approximate Bayesian computation (ABC) model and (i) test for regional differences in population density and dispersal rates for greater sage-grouse subpopulations in Wyoming, and (ii) quantify how these differences impact subpopulation regional influence on genetic variation. We found a close match between observed and simulated data under our parameterized model and strong variation in density and dispersal rates across Wyoming. Sensitivity analyses suggested that changes in dispersal (via landscape resistance) had a greater influence on regional differentiation, whereas changes in density had a greater influence on mean diversity across all subpopulations. Local subpopulations, however, varied in their regional influence on genetic variation. Decreases in the size and dispersal rates of central populations with low overall and net immigration (i.e. population sources) had the greatest negative impact on genetic variation. Overall, our results provide insight into the interactions among demography, dispersal and genetic variation and highlight the potential of ABC to disentangle the complexity of regional population dynamics and project the genetic impact of changing conditions.
JPRS Report, West Europe, Reference Aid, Glossary of Acronyms and Abbreviations of Norway
1989-04-04
ABC = nuclear-biological-chemical [use NBC]; ABC/S = Forsvarets ABC-sekretariat (Defense NBC Secretariat); adj. = adjutant; adm = administrasjon; kjem = kjemi(sk) (chemical).
Maynard, Andrew J.; Ambrose, Luke; Cooper, Robert D.; Chow, Weng K.; Davis, Joseph B.; Muzari, Mutizwa O.; van den Hurk, Andrew F.; Hall-Mendelin, Sonja; Hasty, Jeomhee M.; Burkot, Thomas R.; Bangs, Michael J.; Reimer, Lisa J.; Butafa, Charles; Lobo, Neil F.; Syafruddin, Din; Maung Maung, Yan Naung; Ahmad, Rohani; Beebe, Nigel W.
2017-01-01
Background Within the last century, increases in human movement and globalization of trade have facilitated the establishment of several highly invasive mosquito species in new geographic locations with concurrent major environmental, economic and health consequences. The Asian tiger mosquito, Aedes albopictus, is an extremely invasive and aggressive daytime-biting mosquito that is a major public health threat throughout its expanding range. Methodology/Principal findings We used 13 nuclear microsatellite loci (on 911 individuals) and mitochondrial COI sequences to gain a better understanding of the historical and contemporary movements of Ae. albopictus in the Indo-Pacific region and to characterize its population structure. Approximate Bayesian computation (ABC) was employed to test competing historical routes of invasion of Ae. albopictus within the Southeast (SE) Asian/Australasian region. Our ABC results show that Ae. albopictus was most likely introduced to New Guinea via mainland Southeast Asia, before colonizing the Solomon Islands via either Papua New Guinea or SE Asia. The analysis also supported that the recent incursion into northern Australia’s Torres Strait Islands was seeded chiefly from Indonesia. For the first time documented in this invasive species, we provide evidence of a recently colonized population (the Torres Strait Islands) that has undergone rapid temporal changes in its genetic makeup, which could be the result of genetic drift or represent a secondary invasion from an unknown source. Conclusions/Significance There appears to be high spatial genetic structure and high gene flow between some geographically distant populations. The species' genetic structure in the region tends to favour a dispersal pattern driven mostly by human movements. Importantly, this study provides a more widespread sampling distribution of the species’ native range, revealing more spatial population structure than previously shown. Additionally, we present the most probable invasion history of this species in the Australasian region using ABC analysis. PMID:28410388
A Novel Artificial Bee Colony Approach of Live Virtual Machine Migration Policy Using Bayes Theorem
Xu, Gaochao; Hu, Liang; Fu, Xiaodong
2013-01-01
Green cloud data center has become a research hotspot of virtualized cloud computing architecture. Since live virtual machine (VM) migration technology is widely used and studied in cloud computing, we have focused on the VM placement selection of live migration for power saving. We present a novel heuristic approach called PS-ABC. Its algorithm includes two parts. One combines the artificial bee colony (ABC) idea with uniform random initialization, binary search, and a Boltzmann selection policy to achieve an improved ABC-based approach with better global exploration and local exploitation abilities. The other uses Bayes' theorem to further optimize the improved ABC-based process and reach the final optimal solution faster. As a result, the whole approach achieves a longer-term efficient optimization for power saving. The experimental results demonstrate that PS-ABC evidently reduces the total incremental power consumption and better protects the performance of VM running and migrating compared with existing research. This makes live VM migration more efficient and meaningful. PMID:24385877
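Of the ingredients named above, the Boltzmann selection policy is the easiest to illustrate in isolation. The sketch below is a generic textbook version under our own naming, not the PS-ABC code.

```python
import numpy as np

def boltzmann_select(fitness, temperature, rng):
    """Boltzmann selection: sample an index with probability proportional
    to exp(fitness / T); lowering T sharpens preference for the best."""
    f = np.asarray(fitness, dtype=float)
    p = np.exp((f - f.max()) / temperature)  # shift for numerical stability
    p /= p.sum()
    return rng.choice(len(f), p=p)

rng = np.random.default_rng(0)
print(boltzmann_select([0.2, 0.5, 0.9], temperature=0.1, rng=rng))
```

The temperature parameter trades off exploration (high T, nearly uniform choices) against exploitation (low T, greedy choices), which is precisely the balance the abstract says the modified algorithm tries to improve.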
Eldon, Bjarki; Birkner, Matthias; Blath, Jochen; Freund, Fabian
2015-01-01
We focus here on the ability of the site-frequency spectrum (SFS) to reflect the particularities of gene genealogies that exhibit multiple mergers of ancestral lines, as opposed to genealogies obtained in the presence of population growth. An excess of singletons is a well-known characteristic of both population growth and multiple mergers. Other aspects of the SFS, in particular the weight of the right tail, are, however, affected in specific ways by the two model classes. Using an approximate likelihood method and minimum-distance statistics, our estimates of statistical power indicate that exponential and algebraic growth can indeed be distinguished from multiple-merger coalescents, even for moderate sample sizes, if the number of segregating sites is high enough. A normalized version of the SFS (nSFS) is also used as a summary statistic in an approximate Bayesian computation (ABC) approach. The results give further positive evidence as to the general eligibility of the SFS to distinguish between the different histories. PMID:25575536
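For readers unfamiliar with the statistic: given n sampled sequences, the unfolded SFS counts, for each i = 1, ..., n-1, the segregating sites at which the derived allele occurs in exactly i samples, and the nSFS divides these counts by the total number of segregating sites. A minimal sketch, with a random genotype matrix standing in for real sequence data:

```python
import numpy as np

rng = np.random.default_rng(2)

def site_frequency_spectrum(genotypes):
    """Unfolded SFS from a (sites x samples) 0/1 derived-allele matrix."""
    n = genotypes.shape[1]
    counts = genotypes.sum(axis=1)
    counts = counts[(counts > 0) & (counts < n)]   # keep segregating sites only
    return np.bincount(counts, minlength=n)[1:n]   # classes 1..n-1

genotypes = (rng.random((500, 20)) < 0.15).astype(int)  # toy data
sfs = site_frequency_spectrum(genotypes)
nsfs = sfs / sfs.sum()   # normalized SFS used as an ABC summary statistic
print(sfs[:5], nsfs[:5].round(3))
```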
Barrès, B; Carlier, J; Seguin, M; Fenouillet, C; Cilas, C; Ravigné, V
2012-01-01
Understanding the processes by which new diseases are introduced in previously healthy areas is of major interest in elaborating prevention and management policies, as well as in understanding the dynamics of pathogen diversity at large spatial scale. In this study, we aimed to decipher the dispersal processes that have led to the emergence of the plant pathogenic fungus Microcyclus ulei, which is responsible for the South American Leaf Blight (SALB). This fungus has devastated rubber tree plantations across Latin America since the beginning of the twentieth century. As only imprecise historical information is available, the study of population evolutionary history based on population genetics appeared most appropriate. The distribution of genetic diversity in a continental sampling of four countries (Brazil, Ecuador, Guatemala and French Guiana) was studied using a set of 16 microsatellite markers developed specifically for this purpose. A very strong genetic structure was found (Fst=0.70), demonstrating that there has been no regular gene flow between Latin American M. ulei populations. Strong bottlenecks probably occurred at the foundation of each population. The most likely scenario of colonization identified by the Approximate Bayesian Computation (ABC) method implemented in DIYABC suggested two independent sources from the Amazonian endemic area. The Brazilian, Ecuadorian and Guatemalan populations might stem from serial introductions through human-mediated movement of infected plant material from an unsampled source population, whereas the French Guiana population seems to have arisen from an independent colonization event through spore dispersal. PMID:22828899
First MRI application of an active breathing coordinator
NASA Astrophysics Data System (ADS)
Kaza, E.; Symonds-Tayler, R.; Collins, D. J.; McDonald, F.; McNair, H. A.; Scurr, E.; Koh, D.-M.; Leach, M. O.
2015-02-01
A commercial active breathing coordinator (ABC) device, employed to hold respiration at a specific level for a predefined duration, was successfully adapted for magnetic resonance imaging (MRI) use for the first time. Potential effects of the necessary modifications were assessed and taken into account. Automatic MR acquisition during ABC breath holding was achieved. The feasibility of MR-ABC thoracic and abdominal examinations together with the advantages of imaging in repeated ABC-controlled breath holds were demonstrated on healthy volunteers. Five lung cancer patients were imaged under MR-ABC, visually confirming the very good intra-session reproducibility of organ position in images acquired with the same patient positioning as used for computed tomography (CT). Using identical ABC settings, good MR-CT inter-modality registration was achieved. This demonstrates the value of ABC, since application of T1, T2 and diffusion weighted MR sequences provides a wider range of contrast mechanisms and additional diagnostic information compared to CT, thus improving radiotherapy treatment planning and assessment.
Global Discrete Artificial Boundary Conditions for Time-Dependent Wave Propagation
NASA Technical Reports Server (NTRS)
Ryabenkii, V. S.; Tsynkov, S. V.; Turchaninov, V. I.; Bushnell, Dennis M. (Technical Monitor)
2001-01-01
We construct global artificial boundary conditions (ABCs) for the numerical simulation of wave processes on unbounded domains using a special non-deteriorating algorithm that has been developed previously for the long-term computation of wave-radiation solutions. The ABCs are obtained directly for the discrete formulation of the problem; in so doing, neither a rational approximation of 'non-reflecting kernels,' nor discretization of the continuous boundary conditions is required. The extent of temporal nonlocality of the new ABCs appears fixed and limited; in addition, the ABCs can handle artificial boundaries of irregular shape on regular grids with no fitting/adaptation needed and no accuracy loss induced. The non-deteriorating algorithm, which is the core of the new ABCs, is inherently three-dimensional; it guarantees temporally uniform grid convergence of the solution driven by a continuously operating source on arbitrarily long time intervals, and provides unimprovable linear computational complexity with respect to the grid dimension. The algorithm is based on the presence of lacunae, i.e., aft fronts of the waves, in wave-type solutions in odd-dimensional spaces. It can, in fact, be built as a modification on top of any consistent and stable finite-difference scheme, making its grid convergence uniform in time and at the same time keeping the rate of convergence the same as that of the non-modified scheme. In the paper, we delineate the construction of the global lacunae-based ABCs in the framework of a discretized wave equation. The ABCs are obtained for the most general formulation of the problem that involves radiation of waves by moving sources (e.g., radiation of acoustic waves by a maneuvering aircraft). We also present systematic numerical results that corroborate the theoretical design properties of the ABC algorithm.
Human-facilitated metapopulation dynamics in an emerging pest species, Cimex lectularius
Fountain, Toby; Duvaux, Ludovic; Horsburgh, Gavin; Reinhardt, Klaus; Butlin, Roger K
2014-01-01
The number and demographic history of colonists can have dramatic consequences for the way in which genetic diversity is distributed and maintained in a metapopulation. The bed bug (Cimex lectularius) is a re-emerging pest species whose close association with humans has led to frequent local extinction and colonization, that is, to metapopulation dynamics. Pest control limits the lifespan of subpopulations, causing frequent local extinctions, and human-facilitated dispersal allows the colonization of empty patches. Founder events often result in drastic reductions in diversity and an increased influence of genetic drift. Coupled with restricted migration, this can lead to rapid population differentiation. We therefore predicted strong population structuring. Here, using 21 newly characterized microsatellite markers and approximate Bayesian computation (ABC), we investigate simplified versions of two classical models of metapopulation dynamics, in a coalescent framework, to estimate the number and genetic composition of founders in the common bed bug. We found very limited diversity within infestations but high degrees of structuring across the city of London, with extreme levels of genetic differentiation between infestations (FST = 0.59). ABC results suggest a common origin of all founders of a given subpopulation and that the numbers of colonists were low, implying that even a single mated female is enough to found a new infestation successfully. These patterns of colonization are close to the predictions of the propagule pool model, where all founders originate from the same parental infestation. These results show that aspects of metapopulation dynamics can be captured in simple models and provide insights that are valuable for the future targeted control of bed bug infestations. PMID:24446663
Macher, Jan-Niklas; Rozenberg, Andrey; Pauls, Steffen U; Tollrian, Ralph; Wagner, Rüdiger; Leese, Florian
2015-01-01
Repeated Quaternary glaciations have significantly shaped the present distribution and diversity of several European species in aquatic and terrestrial habitats. To study the phylogeography of freshwater invertebrates, patterns of intraspecific variation have been examined primarily using mitochondrial DNA markers that may yield results unrepresentative of the true species history. Here, population genetic parameters were inferred for a montane aquatic caddisfly, Thremma gallicum, by sequencing a 658-bp fragment of the mitochondrial CO1 gene, and 12,514 nuclear RAD loci. T. gallicum has a highly disjunct distribution in southern and central Europe, with known populations in the Cantabrian Mountains, Pyrenees, Massif Central, and Black Forest. Both datasets represented rangewide sampling of T. gallicum. For the CO1 dataset, this included 352 specimens from 26 populations, and for the RAD dataset, 17 specimens from eight populations. We tested 20 competing phylogeographic scenarios using approximate Bayesian computation (ABC) and estimated genetic diversity patterns. Support for phylogeographic scenarios and diversity estimates differed between datasets with the RAD data favouring a southern origin of extant populations and indicating the Cantabrian Mountains and Massif Central populations to represent highly diverse populations as compared with the Pyrenees and Black Forest populations. The CO1 data supported a vicariance scenario (north–south) and yielded inconsistent diversity estimates. Permutation tests suggest that a few hundred polymorphic RAD SNPs are necessary for reliable parameter estimates. Our results highlight the potential of RAD and ABC-based hypothesis testing to complement phylogeographic studies on non-model species. PMID:25691988
Carroll, E L; Alderman, R; Bannister, J L; Bérubé, M; Best, P B; Boren, L; Baker, C S; Constantine, R; Findlay, K; Harcourt, R; Lemaire, L; Palsbøll, P J; Patenaude, N J; Rowntree, V J; Seger, J; Steel, D; Valenzuela, L O; Watson, M; Gaggiotti, O E
2018-05-03
Understanding how dispersal and gene flow link geographically separated populations over evolutionary history is challenging, particularly in migratory marine species. In southern right whales (SRWs, Eubalaena australis), patterns of genetic diversity are likely influenced by the glacial climate cycle and the recent history of whaling. Here we use a dataset of mitochondrial DNA (mtDNA) sequences (n = 1327) and nuclear markers (17 microsatellite loci, n = 222) from major wintering grounds to investigate circumpolar population structure, historical demography and effective population size. Analyses of nuclear genetic variation identify two population clusters that correspond to the South Atlantic and Indo-Pacific ocean basins and that have similar effective breeder estimates. In contrast, all wintering grounds show significant differentiation for mtDNA, but no sex-biased dispersal was detected using the microsatellite genotypes. An approximate Bayesian computation (ABC) approach with microsatellite markers compared scenarios of continuous gene flow through time with scenarios of isolation followed by secondary contact between ocean basins, while modelling declines in abundance linked to whaling. Secondary-contact scenarios yield the highest posterior probabilities, implying that populations in different ocean basins were largely isolated and came into secondary contact within the last 25,000 years, but the role of whaling in changes in genetic diversity and gene flow over recent generations could not be resolved. We hypothesise that these findings are driven by factors that promote isolation, such as female philopatry, and factors that could promote dispersal, such as oceanographic changes. These findings highlight the utility of ABC approaches for inferring connectivity in mobile species with complex population histories and, currently, low levels of differentiation.
Past climate change drives current genetic structure of an endangered freshwater mussel species.
Inoue, Kentaro; Lang, Brian K; Berg, David J
2015-04-01
Historical-to-recent climate change and anthropogenic disturbance affect species distributions and genetic structure. The Rio Grande watershed of the United States and Mexico encompasses ecosystems that are intensively exploited, resulting in substantial degradation of aquatic habitats. While significant anthropogenic disturbances in the Rio Grande are recent, inhospitable conditions for freshwater organisms likely existed prior to such disturbances. A combination of anthropogenic and past climate factors may contribute to current distributions of aquatic fauna in the Rio Grande basin. We used mitochondrial DNA and 18 microsatellite loci to infer evolutionary history and genetic structure of an endangered freshwater mussel, Popenaias popeii, throughout the Rio Grande drainage. We estimated spatial connectivity and gene flow across extant populations of P. popeii and used ecological niche models (ENMs) and approximate Bayesian computation (ABC) to infer its evolutionary history during the Pleistocene. STRUCTURE results recovered regional and local population clusters in the Rio Grande. ENMs predicted drastic reductions in suitable habitat during the last glacial maximum. ABC analyses suggested that regional population structure likely arose in this species during the mid-to-late Pleistocene and was followed by a late Pleistocene population bottleneck in New Mexico populations. The local population structure arose relatively recently, perhaps due to anthropogenic factors. Popenaias popeii, one of the few freshwater mussel species native to the Rio Grande basin, is a case study for understanding how both geological and anthropogenic factors shape current population genetic structure. Conservation strategies for this species should account for the fragmented nature of contemporary populations. © 2015 John Wiley & Sons Ltd.
External Boundary Conditions for Three-Dimensional Problems of Computational Aerodynamics
NASA Technical Reports Server (NTRS)
Tsynkov, Semyon V.
1997-01-01
We consider an unbounded steady-state flow of viscous fluid over a three-dimensional finite body or configuration of bodies. For the purpose of solving this flow problem numerically, we discretize the governing equations (Navier-Stokes) on a finite-difference grid. The grid obviously cannot stretch from the body up to infinity, because the number of the discrete variables in that case would not be finite. Therefore, prior to the discretization we truncate the original unbounded flow domain by introducing some artificial computational boundary at a finite distance from the body. Typically, the artificial boundary is introduced in a natural way as the external boundary of the domain covered by the grid. The flow problem formulated only on the finite computational domain rather than on the original infinite domain is clearly subdefinite unless some artificial boundary conditions (ABC's) are specified at the external computational boundary. Similarly, the discretized flow problem is subdefinite (i.e., lacks equations with respect to unknowns) unless a special closing procedure is implemented at this artificial boundary. The closing procedure in the discrete case is called the ABC's as well. In this paper, we present an innovative approach to constructing highly accurate ABC's for three-dimensional flow computations. The approach extends our previous technique developed for the two-dimensional case; it employs the finite-difference counterparts to Calderon's pseudodifferential boundary projections calculated in the framework of the difference potentials method (DPM) by Ryaben'kii. The resulting ABC's appear spatially nonlocal but particularly easy to implement along with the existing solvers. The new boundary conditions have been successfully combined with the NASA-developed production code TLNS3D and used for the analysis of wing-shaped configurations in subsonic (including incompressible limit) and transonic flow regimes. As demonstrated by the computational experiments and comparisons with the standard (local) methods, the DPM-based ABC's allow one to greatly reduce the size of the computational domain while still maintaining high accuracy of the numerical solution. Moreover, they may provide for a noticeable increase of the convergence rate of multigrid iterations.
Yurtkuran, Alkın; Emel, Erdal
2016-01-01
The artificial bee colony (ABC) algorithm is a popular swarm-based technique inspired by the intelligent foraging behavior of honeybee swarms. This paper proposes a new variant of the ABC algorithm, namely enhanced ABC with a solution acceptance rule and probabilistic multisearch (ABC-SA), to address global optimization problems. A new solution acceptance rule is proposed where, instead of greedy selection between the old solution and the new candidate solution, worse candidate solutions have a probability of being accepted. Additionally, the acceptance probability of worse candidates is nonlinearly decreased throughout the search process in an adaptive manner. Moreover, in order to improve the performance of the ABC and balance intensification and diversification, a probabilistic multisearch strategy is presented. Three different search equations with distinct characteristics are employed using predetermined search probabilities. By implementing a new solution acceptance rule and a probabilistic multisearch approach, the intensification and diversification performance of the ABC algorithm is improved. The proposed algorithm has been tested on well-known benchmark functions of varying dimensions by comparing against novel ABC variants, as well as several recent state-of-the-art algorithms. Computational results show that the proposed ABC-SA outperforms other ABC variants and is superior to state-of-the-art algorithms proposed in the literature.
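The acceptance rule can be made concrete: improvements are always kept, while a worse candidate is accepted with a probability that shrinks nonlinearly as the search progresses. The published ABC-SA schedule is not reproduced here; the sketch below assumes a simple exponential decay for illustration.

```python
import math
import random

def accept(old_cost, new_cost, iteration, max_iter, p0=0.3, decay=5.0):
    """Solution acceptance rule: always take improvements; accept a worse
    candidate with a probability that shrinks nonlinearly as the search
    progresses (assumed exponential schedule, not the published one)."""
    if new_cost <= old_cost:
        return True
    p_worse = p0 * math.exp(-decay * iteration / max_iter)
    return random.random() < p_worse

random.seed(3)
# Early in the search, worse moves are sometimes accepted; later, rarely.
print([accept(1.0, 1.2, t, 100) for t in (0, 50, 99)])
```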
ABC estimation of unit costs for emergency department services.
Holmes, R L; Schroeder, R E
1996-04-01
Rapid evolution of the health care industry forces managers to make cost-effective decisions. Typical hospital cost accounting systems do not provide emergency department managers with the information needed, but emergency department settings are so complex and dynamic as to make the more accurate activity-based costing (ABC) system prohibitively expensive. Through judicious use of the available traditional cost accounting information and simple computer spreadsheets, managers may approximate the decision-guiding information that would result from the much more costly and time-consuming implementation of ABC.
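The spreadsheet approximation amounts to tracing pooled costs to services through activity drivers. A toy calculation, with invented cost pools and driver volumes rather than figures from the article:

```python
# Hypothetical cost pools and activity drivers for an emergency department.
cost_pools = {"triage": 90_000.0, "treatment": 240_000.0, "discharge": 30_000.0}
driver_volume = {"triage": 12_000, "treatment": 8_000, "discharge": 12_000}  # events/yr
rate = {k: cost_pools[k] / driver_volume[k] for k in cost_pools}  # cost per event

# Assumed driver consumption of one service type (e.g. a minor-injury visit).
visit = {"triage": 1, "treatment": 0.5, "discharge": 1}
unit_cost = sum(rate[k] * visit[k] for k in visit)
print(rate, round(unit_cost, 2))  # approximate ABC unit cost per visit
```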
Bayesian Latent Class Analysis Tutorial.
Li, Yuelin; Lord-Bessen, Jennifer; Shiyko, Mariya; Loeb, Rebecca
2018-01-01
This article is a how-to guide on Bayesian computation using Gibbs sampling, demonstrated in the context of Latent Class Analysis (LCA). It is written for students in quantitative psychology or related fields who have a working knowledge of Bayes Theorem and conditional probability and have experience in writing computer programs in the statistical language R. The overall goals are to provide an accessible and self-contained tutorial, along with a practical computation tool. We begin with how Bayesian computation is typically described in academic articles. Technical difficulties are addressed by a hypothetical, worked-out example. We show how Bayesian computation can be broken down into a series of simpler calculations, which can then be assembled together to complete a computationally more complex model. The details are described much more explicitly than what is typically available in elementary introductions to Bayesian modeling so that readers are not overwhelmed by the mathematics. Moreover, the provided computer program shows how Bayesian LCA can be implemented with relative ease. The computer program is then applied in a large, real-world data set and explained line-by-line. We outline the general steps in how to extend these considerations to other methodological applications. We conclude with suggestions for further readings.
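To convey the flavor of the approach, the following is a minimal Gibbs sampler for a latent class model with binary items, written in Python rather than the article's R, ignoring label switching, and using toy data and flat conjugate priors; it is a sketch of the standard conditional updates, not the tutorial's program.

```python
import numpy as np

rng = np.random.default_rng(4)

def gibbs_lca(Y, K=2, iters=1000):
    """Gibbs sampler for a latent class model with binary items.
    Y: (N x J) 0/1 data. Returns post-burn-in draws of class weights
    and item-response probabilities. Label switching is ignored."""
    N, J = Y.shape
    z = rng.integers(K, size=N)                      # initial class labels
    pi_draws, theta_draws = [], []
    for it in range(iters):
        # 1. Class weights from their conjugate Dirichlet(1 + counts) posterior.
        pi = rng.dirichlet(1 + np.bincount(z, minlength=K))
        # 2. Item probabilities from conjugate Beta posteriors per class/item.
        theta = np.empty((K, J))
        for k in range(K):
            yk = Y[z == k]
            theta[k] = rng.beta(1 + yk.sum(axis=0), 1 + len(yk) - yk.sum(axis=0))
        # 3. Class memberships from their full conditional distribution.
        logp = np.log(pi) + Y @ np.log(theta).T + (1 - Y) @ np.log(1 - theta).T
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        z = np.array([rng.choice(K, p=row) for row in p])
        if it >= iters // 2:                         # discard burn-in
            pi_draws.append(pi); theta_draws.append(theta)
    return np.array(pi_draws), np.array(theta_draws)

# Toy data: two classes with distinct response profiles on four items.
z_true = rng.integers(2, size=300)
probs = np.where(z_true[:, None] == 0, 0.8, 0.2)
Y = (rng.random((300, 4)) < probs).astype(int)
pi_draws, theta_draws = gibbs_lca(Y)
print(pi_draws.mean(axis=0), theta_draws.mean(axis=0).round(2))
```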
Zhou, Y.; Ojeda-May, P.; Nagaraju, M.; Pu, J.
2016-01-01
Adenosine triphosphate (ATP)-binding cassette (ABC) transporters are ubiquitous ATP-dependent membrane proteins involved in translocations of a wide variety of substrates across cellular membranes. To understand the chemomechanical coupling mechanism as well as functional asymmetry in these systems, a quantitative description of how ABC transporters hydrolyze ATP is needed. Complementary to experimental approaches, computer simulations based on combined quantum mechanical and molecular mechanical (QM/MM) potentials have provided new insights into the catalytic mechanism in ABC transporters. Quantitatively reliable determination of the free energy requirement for enzymatic ATP hydrolysis, however, requires substantial statistical sampling on QM/MM potential. A case study shows that brute force sampling of ab initio QM/MM (AI/MM) potential energy surfaces is computationally impractical for enzyme simulations of ABC transporters. On the other hand, existing semiempirical QM/MM (SE/MM) methods, although affordable for free energy sampling, are unreliable for studying ATP hydrolysis. To close this gap, a multiscale QM/MM approach named reaction path–force matching (RP–FM) has been developed. In RP–FM, specific reaction parameters for a selected SE method are optimized against AI reference data along reaction paths by employing the force matching technique. The feasibility of the method is demonstrated for a proton transfer reaction in the gas phase and in solution. The RP–FM method may offer a general tool for simulating complex enzyme systems such as ABC transporters. PMID:27498639
Technical Note: Approximate Bayesian parameterization of a complex tropical forest model
NASA Astrophysics Data System (ADS)
Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.
2013-08-01
Inverse parameter estimation of process-based models is a long-standing problem in ecology and evolution. A key problem of inverse parameter estimation is to define a metric that quantifies how well model predictions fit to the data. Such a metric can be expressed by general cost or objective functions, but statistical inversion approaches are based on a particular metric, the probability of observing the data given the model, known as the likelihood. Deriving likelihoods for dynamic models requires making assumptions about the probability for observations to deviate from mean model predictions. For technical reasons, these assumptions are usually derived without explicit consideration of the processes in the simulation. Only in recent years have new methods become available that allow generating likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional MCMC, performs well in retrieving known parameter values from virtual field data generated by the forest model. We analyze the results of the parameter estimation, examine the sensitivity towards the choice and aggregation of model outputs and observed data (summary statistics), and show results from using this method to fit the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss differences of this approach to Approximate Bayesian Computation (ABC), another commonly used method to generate simulation-based likelihood approximations. Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can successfully be applied to process-based models of high complexity. The methodology is particularly suited to heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models in ecology and evolution.
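The core idea, fitting a parametric distribution to summaries of repeated stochastic simulations and using it as a likelihood inside a conventional Metropolis sampler, can be sketched compactly. The simulator and summaries below are hypothetical stand-ins for FORMIND and its outputs, and a flat prior is assumed:

```python
import numpy as np

rng = np.random.default_rng(5)

def simulator(theta, size=100):
    # Hypothetical stochastic model standing in for the forest simulator.
    return rng.normal(theta, 1.0, size=size)

def summaries(x):
    return np.array([x.mean(), x.var()])

def log_parametric_likelihood(theta, s_obs, k=40):
    """Fit a Gaussian to summaries of k simulator replicates at theta and
    evaluate the observed summaries under it (parametric approximation)."""
    S = np.array([summaries(simulator(theta)) for _ in range(k)])
    mu, cov = S.mean(axis=0), np.cov(S.T) + 1e-6 * np.eye(S.shape[1])
    diff = s_obs - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (diff @ np.linalg.solve(cov, diff) + logdet)

def metropolis(s_obs, n=500, step=0.3):
    theta, ll, chain = 0.0, log_parametric_likelihood(0.0, s_obs), []
    for _ in range(n):
        prop = theta + rng.normal(0, step)
        ll_prop = log_parametric_likelihood(prop, s_obs)
        if np.log(rng.random()) < ll_prop - ll:   # flat prior assumed
            theta, ll = prop, ll_prop
        chain.append(theta)
    return np.array(chain)

s_obs = summaries(rng.normal(1.0, 1.0, size=100))  # "virtual field data"
chain = metropolis(s_obs)
print(chain[250:].mean())   # posterior mean should sit near 1.0
```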
Bayesian Parameter Inference and Model Selection by Population Annealing in Systems Biology
Murakami, Yohei
2014-01-01
Parameter inference and model selection are very important for mathematical modeling in systems biology. Bayesian statistics can be used to conduct both parameter inference and model selection. In particular, the framework named approximate Bayesian computation is often used for parameter inference and model selection in systems biology. However, Monte Carlo methods need to be used to compute Bayesian posterior distributions. In addition, the posterior distributions of parameters are sometimes almost uniform or very similar to their prior distributions. In such cases, it is difficult to choose one specific value of a parameter with high credibility as the representative value of the distribution. To overcome these problems, we introduced one of the population Monte Carlo algorithms, population annealing. Although population annealing is usually used in statistical mechanics, we showed that population annealing can be used to compute Bayesian posterior distributions in the approximate Bayesian computation framework. To deal with the un-identifiability of the representative values of parameters, we proposed to run the simulations with the parameter ensemble sampled from the posterior distribution, named the "posterior parameter ensemble". We showed that population annealing is an efficient and convenient algorithm for generating a posterior parameter ensemble. We also showed that simulations with the posterior parameter ensemble can not only reproduce the data used for parameter inference, but also capture and predict data that were not used for parameter inference. Lastly, we introduced the marginal likelihood in the approximate Bayesian computation framework for Bayesian model selection. We showed that population annealing enables us to compute the marginal likelihood in the approximate Bayesian computation framework and conduct model selection depending on the Bayes factor. PMID:25089832
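In outline, population annealing propagates a particle ensemble through a schedule of decreasing ABC tolerances, reweighting, resampling, and perturbing at each level; the final ensemble is the posterior parameter ensemble described above. A heavily simplified sketch, in which a crude Gaussian perturbation stands in for the proper Metropolis moves of the algorithm:

```python
import numpy as np

rng = np.random.default_rng(6)

def simulate(theta):
    # Hypothetical stochastic model; systems-biology ODE models are far richer.
    return rng.normal(theta, 1.0, size=30).mean()

def pa_abc(obs, eps_schedule=(2.0, 1.0, 0.5, 0.25), n_particles=1000):
    """Population-annealing-style ABC: anneal the tolerance while
    resampling and perturbing a particle ensemble at each level."""
    theta = rng.uniform(-5, 5, n_particles)           # draws from the prior
    dist = np.abs(np.array([simulate(t) for t in theta]) - obs)
    for eps in eps_schedule:
        w = (dist < eps).astype(float)                # reweight by tolerance
        if w.sum() == 0:
            raise RuntimeError("tolerance too tight for this ensemble")
        idx = rng.choice(n_particles, n_particles, p=w / w.sum())  # resample
        theta = theta[idx] + rng.normal(0, 0.2, n_particles)  # crude move step
        dist = np.abs(np.array([simulate(t) for t in theta]) - obs)
    return theta                                       # posterior parameter ensemble

ensemble = pa_abc(obs=1.0)
print(ensemble.mean(), ensemble.std())
```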
Bobo-Pinilla, Javier; Barrios de León, Sara B; Seguí Colomar, Jaume; Fenu, Giuseppe; Bacchetta, Gianluigi; Peñas de Giles, Julio; Martínez-Ortega, María Montserrat
2016-01-01
Although it has been traditionally accepted that Arenaria balearica (Caryophyllaceae) could be a relict Tertiary plant species, this has never been experimentally tested. Nor have the palaeohistorical reasons underlying the highly fragmented distribution of the species in the Western Mediterranean region been investigated. We have analysed AFLP data (213) and plastid DNA sequences (226) from a total of 250 plants from 29 populations sampled throughout the entire distribution range of the species in Majorca, Corsica, Sardinia, and the Tuscan Archipelago. The AFLP data analyses indicate very low geographic structure and population differentiation. Based on plastid DNA data, six alternative phylogeographic hypotheses were tested using Approximate Bayesian Computation (ABC). These analyses revealed ancient area fragmentation as the most probable scenario, which is in accordance with the star-like topology of the parsimony network that suggests a pattern of long term survival and subsequent in situ differentiation. Overall low levels of genetic diversity and plastid DNA variation were found, reflecting evolutionary stasis of a species preserved in locally long-term stable habitats.
Distinguishing between Selective Sweeps from Standing Variation and from a De Novo Mutation
Peter, Benjamin M.; Huerta-Sanchez, Emilia; Nielsen, Rasmus
2012-01-01
An outstanding question in human genetics has been the degree to which adaptation occurs from standing genetic variation or from de novo mutations. Here, we combine several common statistics used to detect selection in an Approximate Bayesian Computation (ABC) framework, with the goal of discriminating between models of selection and providing estimates of the age of selected alleles and the selection coefficients acting on them. We use simulations to assess the power and accuracy of our method and apply it to seven of the strongest sweeps currently known in humans. We identify two genes, ASPM and PSCA, that are most likely affected by selection on standing variation; and we find three genes, ADH1B, LCT, and EDAR, in which the adaptive alleles seem to have swept from a new mutation. We also confirm evidence of selection for one further gene, TRPV6. In one gene, G6PD, neither neutral models nor models of selective sweeps fit the data, presumably because this locus has been subject to balancing selection. PMID:23071458
On Certain Topological Indices of Boron Triangular Nanotubes
NASA Astrophysics Data System (ADS)
Aslam, Adnan; Ahmad, Safyan; Gao, Wei
2017-08-01
The topological index gives information about the whole structure of a chemical graph; degree-based topological indices in particular are very useful. Boron triangular nanotubes are now replacing conventional carbon nanotubes due to their excellent properties. We have computed the general Randić (Rα), first Zagreb (M1) and second Zagreb (M2), atom-bond connectivity (ABC), and geometric-arithmetic (GA) indices of boron triangular nanotubes. We have also computed the fourth version of the atom-bond connectivity (ABC4) and the fifth version of the geometric-arithmetic (GA5) indices of boron triangular nanotubes.
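These degree-based indices all reduce to sums over edges or vertices of functions of vertex degrees; for an edge uv, ABC contributes sqrt((d_u + d_v - 2)/(d_u * d_v)) and GA contributes 2*sqrt(d_u * d_v)/(d_u + d_v). The sketch below computes them for an arbitrary edge list; the example graph is a toy, not a boron triangular nanotube lattice:

```python
import math
from collections import defaultdict

def degree_based_indices(edges, alpha=-0.5):
    """Compute degree-based topological indices of a simple graph
    given as a list of edges (u, v)."""
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    randic = sum((deg[u] * deg[v]) ** alpha for u, v in edges)  # general Randic
    m1 = sum(d * d for d in deg.values())                       # first Zagreb
    m2 = sum(deg[u] * deg[v] for u, v in edges)                 # second Zagreb
    abc = sum(math.sqrt((deg[u] + deg[v] - 2) / (deg[u] * deg[v]))
              for u, v in edges)                                # atom-bond connectivity
    ga = sum(2 * math.sqrt(deg[u] * deg[v]) / (deg[u] + deg[v])
             for u, v in edges)                                 # geometric-arithmetic
    return {"R_alpha": randic, "M1": m1, "M2": m2, "ABC": abc, "GA": ga}

# Toy graph (a 4-cycle with one chord), not an actual nanotube lattice.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(degree_based_indices(edges))
```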
Zhu, Liping; Aono, Masashi; Kim, Song-Ju; Hara, Masahiko
2013-04-01
A single-celled, multi-nucleated amoeboid organism, a plasmodium of the true slime mold Physarum polycephalum, can perform sophisticated computing by exhibiting complex spatiotemporal oscillatory dynamics while deforming its amorphous body. We previously devised an "amoeba-based computer (ABC)" to quantitatively evaluate the optimization capability of the amoeboid organism in searching for a solution to the traveling salesman problem (TSP) under optical feedback control. In ABC, the organism changes its shape to find a high quality solution (a relatively shorter TSP route) by alternately expanding and contracting its pseudopod-like branches that exhibit local photoavoidance behavior. The quality of the solution serves as a measure of the optimality with which the organism maximizes its global body area (nutrient absorption) while minimizing the risk of being illuminated (exposure to aversive stimuli). ABC found a high quality solution for the 8-city TSP with a high probability. However, it remains unclear whether intracellular communication among the branches of the organism is essential for computing. In this study, we conducted a series of control experiments using two individual cells (two single-celled organisms) to perform parallel searches in the absence of intercellular communication. We found that ABC drastically lost its ability to find a solution when it used two independent individuals. However, interestingly, when two individuals were prepared by dividing one individual, they found a solution for a few tens of minutes. That is, the two divided individuals remained correlated even though they were spatially separated. These results suggest the presence of a long-term memory in the intrinsic dynamics of this organism and its significance in performing sophisticated computing. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
BCM: toolkit for Bayesian analysis of Computational Models using samplers.
Thijssen, Bram; Dijkstra, Tjeerd M H; Heskes, Tom; Wessels, Lodewyk F A
2016-10-21
Computational models in biology are characterized by a large degree of uncertainty. This uncertainty can be analyzed with Bayesian statistics; however, the sampling algorithms that are frequently used for calculating Bayesian statistical estimates are computationally demanding, and each algorithm has unique advantages and disadvantages. It is typically unclear, before starting an analysis, which algorithm will perform well on a given computational model. We present BCM, a toolkit for the Bayesian analysis of Computational Models using samplers. It provides efficient, multithreaded implementations of eleven algorithms for sampling from posterior probability distributions and for calculating marginal likelihoods. BCM includes tools to simplify the process of model specification and scripts for visualizing the results. The flexible architecture allows it to be used on diverse types of biological computational models. In an example inference task using a model of the cell cycle based on ordinary differential equations, BCM is significantly more efficient than existing software packages, allowing more challenging inference problems to be solved. BCM represents an efficient one-stop-shop for computational modelers wishing to use sampler-based Bayesian statistics.
The aggregate site frequency spectrum for comparative population genomic inference.
Xue, Alexander T; Hickerson, Michael J
2015-12-01
Understanding how assemblages of species responded to past climate change is a central goal of comparative phylogeography and comparative population genomics, an endeavour that has increasing potential to integrate with community ecology. New sequencing technology now provides the potential to perform complex demographic inference at unprecedented resolution across assemblages of nonmodel species. To this end, we introduce the aggregate site frequency spectrum (aSFS), an expansion of the site frequency spectrum to use single nucleotide polymorphism (SNP) data sets collected from multiple, co-distributed species for assemblage-level demographic inference. We describe how the aSFS is constructed over an arbitrary number of independent population samples and then demonstrate how the aSFS can differentiate various multispecies demographic histories under a wide range of sampling configurations while allowing effective population sizes and expansion magnitudes to vary independently. We subsequently couple the aSFS with a hierarchical approximate Bayesian computation (hABC) framework to estimate degree of temporal synchronicity in expansion times across taxa, including an empirical demonstration with a data set consisting of five populations of the threespine stickleback (Gasterosteus aculeatus). Corroborating what is generally understood about the recent postglacial origins of these populations, the joint aSFS/hABC analysis strongly suggests that the stickleback data are most consistent with synchronous expansion after the Last Glacial Maximum (posterior probability = 0.99). The aSFS will have general application for multilevel statistical frameworks to test models involving assemblages and/or communities, and as large-scale SNP data from nonmodel species become routine, the aSFS expands the potential for powerful next-generation comparative population genomic inference. © 2015 The Authors. Molecular Ecology Published by John Wiley & Sons Ltd.
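One plausible reading of the construction (our sketch, not the authors' released code) is that each taxon's SFS is normalized to proportions, and the values within each frequency class are then sorted across taxa so that taxon identity is discarded before concatenation into a single aggregate vector:

```python
import numpy as np

def aggregate_sfs(sfs_list):
    """Aggregate per-taxon SFS vectors into one summary vector. Sketch only:
    normalize each SFS to proportions, sort each frequency class across taxa
    (descending) so taxon identity is exchangeable, then concatenate class by
    class. This follows one reading of the aSFS construction, and assumes
    all taxa were already projected to a common sample size."""
    S = np.array([s / s.sum() for s in sfs_list], dtype=float)  # taxa x classes
    S_sorted = -np.sort(-S, axis=0)       # sort within each class across taxa
    return S_sorted.T.ravel()             # concatenate class by class

# Toy SFS vectors for three co-distributed taxa (hypothetical counts).
sfs_list = [np.array([50, 20, 10, 5]),
            np.array([80, 15, 8, 2]),
            np.array([30, 25, 12, 6])]
print(aggregate_sfs(sfs_list).round(3))
```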
Prates, Ivan; Rivera, Danielle; Rodrigues, Miguel T; Carnaval, Ana C
2016-10-01
Shifts in the geographic distribution of habitats over time can promote dispersal and vicariance, thereby influencing large-scale biogeographic patterns and ecological processes. An example is that of transient corridors of suitable habitat across disjunct but ecologically similar regions, which have been associated with climate change over time. Such connections likely played a role in the assembly of tropical communities, especially within the highly diverse Amazonian and Atlantic rainforests of South America. Although these forests are presently separated by open and dry ecosystems, paleoclimatic and phylogenetic evidence suggest that they have been transiently connected in the past. However, little is known about the timing, magnitude and the distribution of former forest connections. We employ sequence data at multiple loci from three codistributed arboreal lizards (Anolis punctatus, Anolis ortonii and Polychrus marmoratus) to infer the phylogenetic relationships among Amazonian and Atlantic Forest populations and to test alternative historical demographic scenarios of colonization and vicariance using coalescent simulations and approximate Bayesian computation (ABC). Data from the better-sampled Anolis species support colonization of the Atlantic Forest from eastern Amazonia. Hierarchical ABC indicates that the three species colonized the Atlantic Forest synchronously during the mid-Pleistocene. We find support for population bottlenecks associated with founder events in the two Anolis, but not in P. marmoratus, consistent with their distinct ecological tolerances. Our findings support that climatic fluctuations provided key opportunities for dispersal and forest colonization in eastern South America through the cessation of environmental barriers. Evidence of species-specific histories strengthens assertions that biological attributes play a role in responses to shared environmental change. © 2016 John Wiley & Sons Ltd.
Trujillo-Arias, Natalia; Dantas, Gisele P M; Arbeláez-Cortés, Enrique; Naoki, Kazuya; Gómez, Maria I; Santos, Fabricio R; Miyaki, Cristina Y; Aleixo, Alexandre; Tubaro, Pablo L; Cabanne, Gustavo S
2017-07-01
The Atlantic Forest is separated from the Andean tropical forest by dry and open vegetation biomes (Chaco and Cerrado). Despite this isolation, both rainforests share closely related lineages, which suggest a past connection. This connection could have been important for forest taxa evolution. In this study, we used the Saffron-billed Sparrow (Arremon flavirostris) as a model to evaluate whether the Andean and the Atlantic forests act as a refugia system, as well as to test for a history of biogeographic connection between them. In addition, we evaluated the molecular systematics of intraspecific lineages of the studied species. We modeled the current and past distribution of A. flavirostris, performed phylogeographic analyses based on mitochondrial and nuclear genes, and used Approximate Bayesian Computation (ABC) analyses to test for biogeographic scenarios. The major phylogeographic disjunction within A. flavirostris was found between the Andean and the Atlantic forests, with a divergence that occurred during the Mid-Pleistocene. Our paleodistribution models indicated a connection between these forest domains in different periods and through both the Chaco and Cerrado. Additionally, the phylogeographic and ABC analyses supported that the Cerrado was the main route of connection between these rainforests, but without giving decisive evidence against a Chaco connection. Our study with A. flavirostris suggests that the biodiversity of the Andean and of the Atlantic forests could have been impacted (and perhaps enriched?) by cycles of connections through the Cerrado and Chaco. This recurrent cycle of connection between the Andean and the Atlantic Forest could have been important for the evolution of Neotropical forest taxa. In addition, we discussed taxonomic implications of the results and proposed to split the studied taxon into two full species. Copyright © 2017 Elsevier Inc. All rights reserved.
Bemmels, Jordan B; Title, Pascal O; Ortego, Joaquín; Knowles, L Lacey
2016-10-01
Past climate change has caused shifts in species distributions and undoubtedly impacted patterns of genetic variation, but the biological processes mediating responses to climate change, and their genetic signatures, are often poorly understood. We test six species-specific biologically informed hypotheses about such processes in canyon live oak (Quercus chrysolepis) from the California Floristic Province. These hypotheses encompass the potential roles of climatic niche, niche multidimensionality, physiological trade-offs in functional traits, and local-scale factors (microsites and local adaptation within ecoregions) in structuring genetic variation. Specifically, we use ecological niche models (ENMs) to construct temporally dynamic landscapes where the processes invoked by each hypothesis are reflected by differences in local habitat suitabilities. These landscapes are used to simulate expected patterns of genetic variation under each model and evaluate the fit of empirical data from 13 microsatellite loci genotyped in 226 individuals from across the species range. Using approximate Bayesian computation (ABC), we obtain very strong support for two statistically indistinguishable models: a trade-off model in which growth rate and drought tolerance drive habitat suitability and genetic structure, and a model based on the climatic niche estimated from a generic ENM, in which the variables found to make the most important contribution to the ENM have strong conceptual links to drought stress. The two most probable models for explaining the patterns of genetic variation thus share a common component, highlighting the potential importance of seasonal drought in driving historical range shifts in a temperate tree from a Mediterranean climate where summer drought is common. © 2016 John Wiley & Sons Ltd.
Fountain, Emily D; Kang, Jung Koo; Tempel, Douglas J; Palsbøll, Per J; Pauli, Jonathan N; Zachariah Peery, M
2018-01-01
Understanding how habitat quality in heterogeneous landscapes governs the distribution and fitness of individuals is a fundamental aspect of ecology. While mean individual fitness is generally considered a key to assessing habitat quality, a comprehensive understanding of habitat quality in heterogeneous landscapes requires estimates of dispersal rates among habitat types. The increasing accessibility of genomic approaches, combined with field-based demographic methods, provides novel opportunities for incorporating dispersal estimation into assessments of habitat quality. In this study, we integrated genomic kinship approaches with field-based estimates of fitness components and approximate Bayesian computation (ABC) procedures to estimate habitat-specific dispersal rates and characterize habitat quality in two-toed sloths (Choloepus hoffmanni) occurring in a Costa Rican agricultural ecosystem. Field-based observations indicated that birth and survival rates were similar in a sparsely shaded cacao farm and adjacent cattle pasture-forest mosaic. Sloth density was threefold higher in pasture compared with cacao, whereas home range size and overlap were greater in cacao compared with pasture. Dispersal rates were similar between the two habitats, as estimated using ABC procedures applied to the spatial distribution of pairs of related individuals identified using 3,431 single nucleotide polymorphism and 11 microsatellite locus genotypes. Our results indicate that crops produced under a sparse overstorey can, in some cases, constitute lower-quality habitat than pasture-forest mosaics for sloths, perhaps because of differences in food resources or predator communities. Finally, our study demonstrates that integrating field-based demographic approaches with genomic methods can provide a powerful means for characterizing habitat quality for animal populations occurring in heterogeneous landscapes. © 2017 John Wiley & Sons Ltd.
IMG-ABC: An Atlas of Biosynthetic Gene Clusters to Fuel the Discovery of Novel Secondary Metabolites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, I-Min; Chu, Ken; Ratner, Anna
2014-10-28
In the discovery of secondary metabolites (SMs), large-scale analysis of sequence data is a promising exploration path that remains largely underutilized due to the lack of relevant computational resources. We present IMG-ABC (https://img.jgi.doe.gov/abc/) -- An Atlas of Biosynthetic gene Clusters within the Integrated Microbial Genomes (IMG) system. IMG-ABC is a rich repository of both validated and predicted biosynthetic clusters (BCs) in cultured isolates, single-cells and metagenomes linked with the SM chemicals they produce and enhanced with focused analysis tools within IMG. The underlying scalable framework enables traversal of phylogenetic dark matter and chemical structure space -- serving as a doorway to a new era in the discovery of novel molecules.
Bayesian models: A statistical primer for ecologists
Hobbs, N. Thompson; Hooten, Mevin B.
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. The book presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians; covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more; deemphasizes computer coding in favor of basic principles; and explains how to write out properly factored statistical expressions representing Bayesian models.
ERIC Educational Resources Information Center
Tan, Andrea; Ferreira, Aldónio
2012-01-01
This study investigates how the use of accounting software in teaching activity-based costing (ABC) influences the learning process. It draws upon the Theory of Planned Behaviour and uses the end-user computer satisfaction (EUCS) framework to examine students' satisfaction with the ABC software. The study examines students' satisfaction with…
Bayesian inference based on dual generalized order statistics from the exponentiated Weibull model
NASA Astrophysics Data System (ADS)
Al Sobhi, Mashail M.
2015-02-01
Bayesian estimation for the two parameters and the reliability function of the exponentiated Weibull model are obtained based on dual generalized order statistics (DGOS). Also, Bayesian prediction bounds for future DGOS from exponentiated Weibull model are obtained. The symmetric and asymmetric loss functions are considered for Bayesian computations. The Markov chain Monte Carlo (MCMC) methods are used for computing the Bayes estimates and prediction bounds. The results have been specialized to the lower record values. Comparisons are made between Bayesian and maximum likelihood estimators via Monte Carlo simulation.
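Once MCMC draws from the posterior are available, the Bayes estimates under the two kinds of loss follow directly: the posterior mean for squared-error (symmetric) loss, and -(1/c) log E[exp(-c*theta)] for LINEX (asymmetric) loss with shape c. A sketch with stand-in posterior draws rather than the exponentiated Weibull posterior of the paper (whose loss functions are not specified in the abstract; squared-error and LINEX are assumed here as the common choices):

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in posterior draws for a positive parameter (e.g. from an MCMC run);
# here just a toy Gamma sample, not the exponentiated Weibull posterior.
draws = rng.gamma(shape=4.0, scale=0.5, size=10_000)

# Bayes estimate under squared-error (symmetric) loss: the posterior mean.
theta_se = draws.mean()

# Bayes estimate under LINEX (asymmetric) loss with shape c:
#   theta_hat = -(1/c) * log E[exp(-c * theta)]
def linex_estimate(draws, c):
    return -np.log(np.mean(np.exp(-c * draws))) / c

# c > 0 penalizes overestimation more; c < 0 penalizes underestimation more.
print(theta_se, linex_estimate(draws, c=1.0), linex_estimate(draws, c=-1.0))
```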
DOE Office of Scientific and Technical Information (OSTI.GOV)
Claude, Line; Malet, Claude; Pommier, Pascal
2007-04-01
Purpose: The challenge in early Hodgkin's disease (HD) in children is to maintain good survival rates while sparing organs at risk. This study assesses the feasibility of active breathing control (ABC) in children, and compares normal tissue irradiation with and without ABC. Methods and Materials: Between May 2003 and June 2004, seven children with HD with mediastinal involvement, median age 15, were treated by chemotherapy and involved-field radiation therapy. A free-breathing computed tomography simulation scan and one additional scan during deep inspiration using ABC were performed. A comparison between planning treatment with clinical target volume including supraclavicular regions, mediastinum, and hila was performed, both in free breathing and using ABC. Results: For a prescription of 36 Gy, pulmonary dose-volume histograms revealed a mean reduction in lung volume irradiated at more than 20 Gy (V20) and 30 Gy (V30) of 25% and 26%, respectively, using ABC (p = 0.016). The mean volume of heart irradiated at 30 Gy or more decreased from 15% to 12% (nonsignificant). The mean dose delivered to breasts in girls was small in both situations (less than 2 Gy) and stable with or without ABC. Considering axillary irradiation, the mean dose delivered to breasts remained low (<9 Gy), without significant difference using ABC or not. The mean radiation dose delivered to thyroid was stable using ABC or not. Conclusions: Using ABC is feasible in childhood. The use of ABC decreases normal lung tissue irradiation. Concerning heart irradiation, a minimal gain is also shown. No significant change has been demonstrated concerning breast and thyroid irradiation.
NASA Astrophysics Data System (ADS)
Vrugt, Jasper A.; Beven, Keith J.
2018-04-01
This essay illustrates some recent developments to the DiffeRential Evolution Adaptive Metropolis (DREAM) MATLAB toolbox of Vrugt (2016) to delineate and sample the behavioural solution space of set-theoretic likelihood functions used within the GLUE (Limits of Acceptability) framework (Beven and Binley, 1992, 2014; Beven and Freer, 2001; Beven, 2006). This work builds on the DREAM(ABC) algorithm of Sadegh and Vrugt (2014) and significantly enhances the accuracy and CPU efficiency of Bayesian inference with GLUE. In particular, it is shown how a lack of adequate sampling in the model space might lead to unjustified model rejection.
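The Limits of Acceptability criterion itself is simple to state: a parameter set is behavioural only if its simulated output lies within observation-specific acceptability bounds at every observation. The sketch below applies that test by brute-force rejection to a hypothetical model; the point of the DREAM developments is precisely to replace such brute-force sampling with adaptive search of the behavioural space.

```python
import numpy as np

rng = np.random.default_rng(8)

def model(theta, t):
    # Hypothetical two-parameter model output over time (not a real
    # rainfall-runoff model).
    return theta[0] * np.exp(-theta[1] * t)

t = np.linspace(0, 5, 20)
obs = model((2.0, 0.7), t) + rng.normal(0, 0.05, t.size)
lower, upper = obs - 0.15, obs + 0.15   # observation-specific acceptability limits

def behavioural(theta):
    """GLUE limits-of-acceptability test: retain a parameter set only if
    its simulation lies inside the limits at every observation."""
    sim = model(theta, t)
    return np.all((sim >= lower) & (sim <= upper))

samples = rng.uniform([0.5, 0.1], [4.0, 2.0], size=(50_000, 2))
kept = samples[[behavioural(th) for th in samples]]
print(len(kept), kept.mean(axis=0) if len(kept) else None)
```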
Comparison of mechanistic transport cycle models of ABC exporters.
Szöllősi, Dániel; Rose-Sperling, Dania; Hellmich, Ute A; Stockner, Thomas
2018-04-01
ABC (ATP binding cassette) transporters, ubiquitous in all kingdoms of life, carry out essential substrate transport reactions across cell membranes. Their transmembrane domains bind and translocate substrates and are connected to a pair of nucleotide binding domains, which bind and hydrolyze ATP to energize import or export of substrates. Over four decades of investigations into ABC transporters have revealed numerous details from atomic-level structural insights to their functional and physiological roles. Despite all these advances, a comprehensive understanding of the mechanistic principles of ABC transporter function remains elusive. The human multidrug resistance transporter ABCB1, also referred to as P-glycoprotein (P-gp), is one of the most intensively studied ABC exporters. Using ABCB1 as the reference point, we aim to compare the dominating mechanistic models of substrate transport and ATP hydrolysis for ABC exporters and to highlight the experimental and computational evidence in their support. In particular, we point out in silico studies that enhance and complement available biochemical data. "This article is part of a Special Issue entitled: Beyond the Structure-Function Horizon of Membrane Proteins edited by Ute Hellmich, Rupak Doshi and Benjamin McIlwain." Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Wu, Xiao-Lin; Sun, Chuanyu; Beissinger, Timothy M; Rosa, Guilherme Jm; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel
2012-09-25
Most Bayesian models for the analysis of complex traits are not analytically tractable, and inferences are based on computationally intensive techniques. This is true of Bayesian models for genome-enabled selection, which uses whole-genome molecular data to predict the genetic merit of candidate animals for breeding purposes. In this regard, parallel computing can overcome the bottlenecks that arise from serial computing. Hence, a major goal of the present study is to bridge the gap to high-performance Bayesian computation in the context of animal breeding and genetics. Parallel Markov chain Monte Carlo algorithms and strategies are described in the context of animal breeding and genetics. Parallel Monte Carlo algorithms are introduced as a starting point, including their applications to computing single-parameter and certain multiple-parameter models. Then, two basic approaches for parallel Markov chain Monte Carlo are described: one aims at parallelization within a single chain; the other is based on running multiple chains, and some variants are discussed as well. Features and strategies of parallel Markov chain Monte Carlo are illustrated using real data, including a large beef cattle dataset with 50K SNP genotypes. Parallel Markov chain Monte Carlo algorithms are useful for computing complex Bayesian models, not only leading to a dramatic speedup in computing but also allowing model parameters of complex Bayesian models to be optimized. Hence, we anticipate that the use of parallel Markov chain Monte Carlo will have a profound impact on the computational tools for genomic selection programs.
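The multiple-chains strategy in particular maps naturally onto parallel hardware. Below is a minimal, self-contained Python sketch (not the study's software; the single-parameter posterior is a hypothetical stand-in) that runs several independent random-walk Metropolis chains in parallel and pools their post-burn-in samples:

    import numpy as np
    from multiprocessing import Pool

    def log_post(theta):
        # Hypothetical single-parameter posterior: standard normal
        return -0.5 * theta ** 2

    def run_chain(seed, n_iter=10000, step=1.0):
        # One independent random-walk Metropolis chain
        rng = np.random.default_rng(seed)
        theta = rng.normal()
        chain = np.empty(n_iter)
        for i in range(n_iter):
            prop = theta + step * rng.normal()
            if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
                theta = prop
            chain[i] = theta
        return chain

    if __name__ == "__main__":
        with Pool(4) as pool:                  # one chain per worker
            chains = pool.map(run_chain, [1, 2, 3, 4])
        samples = np.concatenate([c[2000:] for c in chains])  # drop burn-in
        print(samples.mean(), samples.std())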
Evolutionary Trajectories of Entomopathogenic Fungi ABC Transporters.
Baral, Bikash
2017-01-01
The ABC protein superfamily, also called traffic ATPases, consists of energy-dependent, ubiquitous proteins and represents one of the most important and largest families in fungal genomes. The ATP-binding cassette comprises a characteristic 200-250 amino acids and is omnipresent in organisms ranging from prokaryotes to eukaryotes. Unlike their bacterial counterparts with nutrient import functions, ABC transporters in fungal entomopathogens serve as effective efflux pumps that are largely involved in shuttling metabolites across biological membranes. Thus, the search for ABC proteins may prove of immense importance in elucidating the functional and molecular mechanisms at the host-pathogen (insect-fungus) interface. Sequence homology, domain topology, and functional traits have led to the identification of nine different families in fungal entomopathogens. Evolutionary relationships within the ABC superfamily are discussed, concentrating on computational approaches for comparative identification of ABC transporters in insect-pathogenic fungi (entomopathogens) with those of animals, plants, and their bacterial orthologs. Ancestors of some fungal candidates have duplicated extensively in some phyla, while others were lost in one lineage or another, and predictions for the causes of their duplications and/or losses in some phyla are made. ABC transporters of fungal insect-pathogens serve both defensive and offensive functions effective against land-dwelling and ground-foraging voracious insects. This study may help to unravel the molecular cascades of ABC proteins and illuminate the means through which insects cope with fungal infection and fungus-related diseases. Copyright © 2017 Elsevier Inc. All rights reserved.
Applying activity-based costing to the nuclear medicine unit.
Suthummanon, Sakesun; Omachonu, Vincent K; Akcin, Mehmet
2005-08-01
Previous studies have shown the feasibility of using activity-based costing (ABC) in hospital environments. However, many of these studies discuss only general applications of ABC in health-care organizations. This research explores the potential application of ABC to the nuclear medicine unit (NMU) at a teaching hospital. The findings indicate that the current cost averages 236.11 US dollars across all procedures, which differs considerably from the costs computed using ABC. The difference is most pronounced for the positron emission tomography scan, at 463 US dollars (an increase of 96%), and for the bone and thyroid scans, at 114 US dollars (a decrease of 52%). The ABC analysis demonstrates that operational time (machine time and direct labour time) and the cost of drugs have the greatest influence on cost per procedure. Clearly, to reduce the cost per procedure for the NMU, reductions in operational time and drug costs should be analysed. The results also indicate that ABC can be used to improve resource allocation and management. It can be an important aid in making management decisions, particularly for improving pricing practices by making costing more accurate. It also facilitates the identification of underutilized resources and related costs, leading to cost reduction. The ABC system will also help hospitals control costs, improve the quality and efficiency of the care they provide, and manage their resources better.
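The core ABC calculation is simple arithmetic: each procedure's cost is the sum, over activities, of the activity's unit rate times the procedure's consumption of that activity's driver. The sketch below illustrates the mechanics with hypothetical rates and driver volumes; the dollar figures quoted in the abstract are not reproduced here:

    # Hypothetical activity rates (cost per unit of driver)
    activity_rates = {
        "machine_time_min": 5.0,   # $ per minute
        "labour_min": 1.5,         # $ per minute
        "drug_dose": 120.0,        # $ per dose
    }

    # Hypothetical driver consumption per procedure
    procedures = {
        "PET_scan":  {"machine_time_min": 45, "labour_min": 30, "drug_dose": 2},
        "bone_scan": {"machine_time_min": 15, "labour_min": 10, "drug_dose": 1},
    }

    def abc_cost(consumption):
        # Cost per procedure = sum over activities of rate x driver volume
        return sum(activity_rates[a] * v for a, v in consumption.items())

    for name, use in procedures.items():
        print(name, abc_cost(use))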
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zellars, Richard, E-mail: zellari@jhmi.edu; Bravo, Paco E.; Tryggestad, Erik
2014-03-15
Purpose: Cardiac muscle perfusion, as determined by single-photon emission computed tomography (SPECT), decreases after breast and/or chest wall (BCW) irradiation. The active breathing coordinator (ABC) enables radiation delivery when the BCW is farther from the heart, thereby decreasing cardiac exposure. We hypothesized that ABC would prevent radiation-induced cardiac toxicity and conducted a randomized controlled trial evaluating myocardial perfusion changes after radiation for left-sided breast cancer with or without ABC. Methods and Materials: Stages I to III left breast cancer patients requiring adjuvant radiation therapy (XRT) were randomized to ABC or No-ABC. Myocardial perfusion was evaluated by SPECT scans (before and 6 months after BCW radiation) using 2 methods: (1) fully automated quantitative polar mapping; and (2) semiquantitative visual assessment. The left ventricle was divided into 20 segments for the polar map and 17 segments for the visual method. Segments were grouped by anatomical rings (apical, mid, basal) or by coronary artery distribution. For the visual method, 2 nuclear medicine physicians, blinded to treatment groups, scored each segment's perfusion. Scores were analyzed with nonparametric tests and linear regression. Results: Between 2006 and 2010, 57 patients were enrolled and 43 were available for analysis. The cohorts were well matched. The apical and left anterior descending coronary artery segments had significant decreases in perfusion on SPECT scans in both ABC and No-ABC cohorts. In unadjusted and adjusted analyses, controlling for pretreatment perfusion score, age, and chemotherapy, ABC was not significantly associated with prevention of perfusion deficits. Conclusions: In this randomized controlled trial, ABC does not appear to prevent radiation-induced cardiac perfusion deficits.
NASA Astrophysics Data System (ADS)
Davies, Frederick B.; Hennawi, Joseph F.; Eilers, Anna-Christina; Lukić, Zarija
2018-03-01
The amplitude of the ionizing background that pervades the intergalactic medium (IGM) at the end of the epoch of reionization provides a valuable constraint on the emissivity of the sources that reionized the universe. While measurements of the ionizing background at lower redshifts rely on a simulation-calibrated mapping between the photoionization rate and the mean transmission of the Lyα forest, at z ≳ 6 the IGM becomes increasingly opaque and transmission arises solely in narrow spikes separated by saturated Gunn–Peterson troughs. In this regime, the traditional approach of measuring the average transmission over large ∼50 Mpc/h regions is less sensitive and suboptimal. In addition, the five times smaller oscillator strength of the Lyβ transition implies that the Lyβ forest is considerably more transparent at z ≳ 6, even in the presence of contamination by foreground z ∼ 5 Lyα forest absorption. In this work we present a novel statistical approach to analyze the joint distribution of transmission spikes in the cospatial z ∼ 6 Lyα and Lyβ forests. Our method relies on approximate Bayesian computation (ABC), which circumvents the necessity of computing the intractable likelihood function describing the highly correlated Lyα and Lyβ transmission. We apply ABC to mock data generated from a large-volume hydrodynamical simulation combined with a state-of-the-art model of ionizing background fluctuations in the post-reionization IGM and show that it is sensitive to higher IGM neutral hydrogen fractions than previous techniques. As a proof of concept, we apply this methodology to a real spectrum of a z = 6.54 quasar and measure the ionizing background from 5.4 ≤ z ≤ 6.4 along this sightline with ∼0.2 dex statistical uncertainties. Some of the data presented herein were obtained at the W. M. Keck Observatory, which is operated as a scientific partnership among the California Institute of Technology, the University of California, and the National Aeronautics and Space Administration. The Observatory was made possible by the generous financial support of the W. M. Keck Foundation.
2005-12-09
decision making logic that responds to the environment (concentration of operands - the state vector), and bias or "mood" as established by its history of... mentioned in the chart, there is no need for file management in an ABC Machine. Information is distributed; no history is maintained. The instruction set... PostgreSQL) for collection of cluster samples/snapshots over intervals of time. A prototypical example of an XML file to configure and launch the ABC
Zhou, Chao; Yin, Kunlong; Cao, Ying; Ahmed, Bayes; Fu, Xiaolin
2018-05-08
Landslide displacement prediction is considered an essential component of early warning systems. Conventional forecasting methods require enormous amounts of monitoring data, which limits their application. To conduct accurate displacement prediction with limited data, a novel method is proposed and applied by integrating three computational intelligence algorithms, namely the wavelet transform (WT), the artificial bee colony (ABC), and the kernel-based extreme learning machine (KELM). First, the total displacement was decomposed into several sub-sequences with different frequencies using the WT. Next, each sub-sequence was predicted separately by the KELM, whose parameters were optimized by the ABC. Finally, the predicted total displacement was obtained by adding all the predicted sub-sequences. The Shuping landslide in the Three Gorges Reservoir area in China was taken as a case study. The performance of the new method was compared with the WT-ELM, ABC-KELM, ELM, and support vector machine (SVM) methods. Results show that the prediction accuracy can be improved by decomposing the total displacement into sub-sequences of various frequencies and predicting them separately. The ABC-KELM algorithm shows the highest prediction capacity, followed by the ELM and SVM. Overall, the proposed method achieved excellent performance in terms of both accuracy and stability.
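A minimal sketch of this decompose-predict-recombine pipeline is given below. It assumes the PyWavelets package for the wavelet step and substitutes plain RBF kernel ridge regression for the KELM, with hyperparameters fixed rather than tuned by an artificial bee colony; the synthetic series and all parameter values are hypothetical:

    import numpy as np
    import pywt  # PyWavelets, assumed available

    rng = np.random.default_rng(0)
    displacement = np.cumsum(rng.normal(0.5, 1.0, 256))  # synthetic series

    # 1) Wavelet decomposition into components of different frequency
    coeffs = pywt.wavedec(displacement, "db4", level=3)
    components = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        components.append(pywt.waverec(kept, "db4")[: len(displacement)])

    def fit_predict(series, lags=5, gamma=0.1, lam=1e-2):
        # One-step-ahead prediction with RBF kernel ridge regression,
        # standing in for the KELM; hyperparameters are fixed here,
        # whereas the paper tunes them with an artificial bee colony
        X = np.array([series[i - lags:i] for i in range(lags, len(series))])
        y = series[lags:]
        K = np.exp(-gamma * ((X[:, None] - X[None]) ** 2).sum(-1))
        alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
        k_star = np.exp(-gamma * ((X - series[-lags:]) ** 2).sum(-1))
        return k_star @ alpha

    # 2) Predict each sub-sequence separately, 3) sum the predictions
    forecast = sum(fit_predict(c) for c in components)
    print("next-step displacement forecast:", forecast)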
An Application of the Difference Potentials Method to Solving External Problems in CFD
NASA Technical Reports Server (NTRS)
Ryaben'kii, Victor S.; Tsynkov, Semyon V.
1997-01-01
Numerical solution of infinite-domain boundary-value problems requires special techniques to make the problem tractable on a computer. Indeed, the problem must be discretized in a way that lets the computer operate with only a finite amount of information. Therefore, the original infinite-domain formulation must be altered and/or augmented so that, on one hand, the solution is not changed (or is changed only slightly) and, on the other hand, a finite discrete formulation becomes available. One widely used approach to constructing such discretizations consists of truncating the unbounded original domain and then setting artificial boundary conditions (ABC's) at the newly formed external boundary. The role of the ABC's is to close the truncated problem and, at the same time, to ensure that the solution found inside the finite computational domain is maximally close to (in the ideal case, exactly the same as) the corresponding fragment of the original infinite-domain solution. Let us emphasize that the proper treatment of artificial boundaries may have a profound impact on the overall quality and performance of numerical algorithms. The latter statement is corroborated by numerous computational experiments and especially concerns the area of CFD, in which external problems constitute a wide class of practically important formulations. In this paper, we review some work that has been done over recent years on constructing highly accurate nonlocal ABC's for the calculation of compressible external flows. The approach is based on implementation of the generalized potentials and pseudodifferential boundary projection operators analogous to those first proposed by Calderon. The difference potentials method (DPM) of Ryaben'kii is used for the effective computation of the generalized potentials and projections. The resulting ABC's clearly outperform existing methods from the standpoints of accuracy and robustness, in many cases noticeably speed up multigrid convergence, and at the same time are quite comparable to other methods from the standpoints of geometric universality and simplicity of implementation.
Children Can Solve Bayesian Problems: The Role of Representation in Mental Computation
ERIC Educational Resources Information Center
Zhu, Liqi; Gigerenzer, Gerd
2006-01-01
Can children reason the Bayesian way? We argue that the answer to this question depends on how numbers are represented, because a representation can do part of the computation. We test, for the first time, whether Bayesian reasoning can be elicited in children by means of natural frequencies. We show that when information was presented to fourth,…
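The representational point can be made with a worked example (the numbers are illustrative, not taken from the study): with natural frequencies the posterior is a simple ratio of counts, which is exactly what Bayes' rule yields when applied to the corresponding probabilities.

    # Illustrative numbers (not from the study): out of 1,000 people,
    # 10 carry a condition; 8 of the 10 carriers test positive, and
    # 95 of the 990 non-carriers also test positive.
    carriers_positive, noncarriers_positive = 8, 95

    # With natural frequencies the posterior is a simple count ratio ...
    p_natural = carriers_positive / (carriers_positive + noncarriers_positive)

    # ... which matches Bayes' rule applied to the probability format
    p_c, p_pos_c, p_pos_nc = 10 / 1000, 8 / 10, 95 / 990
    p_bayes = (p_c * p_pos_c) / (p_c * p_pos_c + (1 - p_c) * p_pos_nc)

    print(round(p_natural, 4), round(p_bayes, 4))  # both ~0.0777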
A PhoPQ-Regulated ABC Transporter System Exports Tetracycline in Pseudomonas aeruginosa.
Chen, Lin; Duan, Kangmin
2016-05-01
Pseudomonas aeruginosa is an important human pathogen whose infections are difficult to treat due to its high intrinsic resistance to many antibiotics. Here, we show that the disruption of PA4456, encoding the ATP binding component of a putative ATP-binding cassette (ABC) transporter, increased the bacterium's susceptibility to tetracycline and other antibiotics or toxic chemicals. Fluorescence spectroscopy and antibiotic accumulation tests showed that the interruption of the ABC transporter caused increased intracellular accumulation of tetracycline, demonstrating a role of the ABC transporter in tetracycline expulsion. Site-directed mutagenesis proved that the conserved residues E170 in the Walker B motif and H203 in the H-loop, which are important for ATP hydrolysis, were essential for the function of PA4456. Through a genome-wide search, the PhoPQ two-component system was identified as a regulator of the computationally predicted PA4456-4452 operon that encodes the ABC transporter system. A >5-fold increase in the expression of this operon was observed in the phoQ mutant. The results obtained also show that the expression of the phzA1B1C1D1E1 operon and the production of pyocyanin were significantly higher in the ABC transporter mutant, signifying a connection between the ABC transporter and pyocyanin production. These results indicate that the PhoPQ-regulated ABC transporter is associated with intrinsic resistance to antibiotics and other adverse compounds in P. aeruginosa, probably by extruding them out of the cell. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
With or without you: predictive coding and Bayesian inference in the brain
Aitchison, Laurence; Lengyel, Máté
2018-01-01
Two theoretical ideas have emerged recently with the ambition to provide a unifying functional explanation of neural population coding and dynamics: predictive coding and Bayesian inference. Here, we describe the two theories and their combination into a single framework: Bayesian predictive coding. We clarify how the two theories can be distinguished, despite sharing core computational concepts and addressing an overlapping set of empirical phenomena. We argue that predictive coding is an algorithmic / representational motif that can serve several different computational goals of which Bayesian inference is but one. Conversely, while Bayesian inference can utilize predictive coding, it can also be realized by a variety of other representations. We critically evaluate the experimental evidence supporting Bayesian predictive coding and discuss how to test it more directly. PMID:28942084
ERIC Educational Resources Information Center
ALTMANN, BERTHOLD; BROWN, WILLIAM G.
The first-generation Approach by Concept (ABC) storage and retrieval method, a method which utilizes as a subject approach appropriate standardized English-language statements processed and printed in a permuted index format, underwent a performance test, the primary objective of which was to spot deficiencies and to develop a second-generation…
Multidrug resistance in parasites: ABC transporters, P-glycoproteins and molecular modelling.
Jones, P M; George, A M
2005-04-30
Parasitic diseases, caused by protozoa, helminths and arthropods, rank among the most important problems in human and veterinary medicine, and in agriculture, leading to debilitating sicknesses and loss of life. In the absence of vaccines and with the general failure of vector eradication programs, drugs are the main line of defence, but the newest drugs are being tracked by the emergence of resistance in parasites, sharing ominous parallels with multidrug resistance in bacterial pathogens. Any of a number of mechanisms will elicit a drug resistance phenotype in parasites, including: active efflux, reduced uptake, target modification, drug modification, drug sequestration, by-pass shunting, or substrate competition. The role of ABC transporters in parasitic multidrug resistance mechanisms is being subjected to more scrutiny, due in part to the established roles of certain ABC transporters in human diseases, and also to an increasing portfolio of ABC transporters from parasite genome sequencing projects. For example, over 100 ABC transporters have been identified in the Escherichia coli genome, but to date only about 65 in all parasitic genomes. Long established laboratory investigations are now being assisted by molecular biology, bioinformatics, and computational modelling, and it is in these areas that the role of ABC transporters in parasitic multidrug resistance mechanisms may be defined and put in perspective with that of other proteins. We discuss ABC transporters in parasites, and conclude with an example of molecular modelling that identifies a new interaction between the structural domains of a parasite P-glycoprotein.
Artificial bee colony in neuro - Symbolic integration
NASA Astrophysics Data System (ADS)
Kasihmuddin, Mohd Shareduwan Mohd; Sathasivam, Saratha; Mansor, Mohd. Asyraf
2017-08-01
Swarm intelligence is a research area that models the population of a swarm based on natural computation. The artificial bee colony (ABC) algorithm is a swarm-based metaheuristic algorithm introduced by Karaboga to optimize numerical problems. Pattern-SAT is a pattern reconstruction paradigm that utilizes a 2SAT logical rule to represent the behavior of the desired pattern. The information of the desired pattern in terms of 2SAT logic is embedded in a Hopfield neural network (HNN-P2SAT), and the desired pattern is reconstructed during the retrieval phase. Since the performance of HNN-P2SAT in Pattern-SAT deteriorates as the number of 2SAT clauses increases, a newly improved ABC is used to reduce the computational burden during the learning phase of HNN-P2SAT (HNN-P2SATABC). The aim of this study is to investigate the performance of Pattern-SAT produced by ABC incorporated with HNN-P2SAT and compare it with the conventional standalone HNN. The comparison is examined by using Microsoft Visual Basic C++ 2013 software. The detailed comparison in doing Pattern-SAT is discussed based on global Pattern-SAT, the ratio of activated clauses, and computation time. The results obtained from computer simulation indicate the beneficial features of HNN-P2SATABC in doing Pattern-SAT. This finding is expected to have a significant implication on the choice of searching method used to do Pattern-SAT.
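For readers unfamiliar with the artificial bee colony metaheuristic, the following minimal Python sketch shows its three standard phases (employed bees, onlookers, scouts) on a hypothetical test objective; it is a generic illustration of Karaboga's scheme, not the improved variant or the Hopfield-network coupling used in the study:

    import numpy as np

    rng = np.random.default_rng(42)

    def objective(x):
        # Hypothetical cost to minimise (sphere function)
        return float(np.sum(x ** 2))

    def abc_optimize(dim=5, n_sources=20, limit=30, iters=200, bound=5.0):
        X = rng.uniform(-bound, bound, (n_sources, dim))  # food sources
        f = np.array([objective(x) for x in X])
        trials = np.zeros(n_sources, dtype=int)

        def try_neighbour(i):
            # Perturb one dimension towards/away from a random partner
            k = rng.choice([j for j in range(n_sources) if j != i])
            v = X[i].copy()
            d = rng.integers(dim)
            v[d] += rng.uniform(-1.0, 1.0) * (X[i, d] - X[k, d])
            fv = objective(v)
            if fv < f[i]:              # greedy replacement
                X[i], f[i], trials[i] = v, fv, 0
            else:
                trials[i] += 1

        for _ in range(iters):
            for i in range(n_sources):             # employed bee phase
                try_neighbour(i)
            probs = (1.0 / (1.0 + f)) / np.sum(1.0 / (1.0 + f))
            for i in rng.choice(n_sources, n_sources, p=probs):  # onlookers
                try_neighbour(i)
            for i in np.where(trials > limit)[0]:  # scout phase
                X[i] = rng.uniform(-bound, bound, dim)
                f[i], trials[i] = objective(X[i]), 0

        return X[np.argmin(f)], f.min()

    best_x, best_f = abc_optimize()
    print(best_f)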
An Asymmetric Birdcage Coil for Small-animal MR Imaging at 7T
Kim, Kyoung-Nam; Han, Sang-Doc; Seo, Jeung-Hoon; Heo, Phil; Yoo, Dongkyeom; Im, Geun Ho; Lee, Jung Hee
2017-01-01
The birdcage (BC) coil is currently utilized for uniform radiofrequency (RF) transmit/receive (Tx/Rx) or Tx-only configurations in many magnetic resonance (MR) imaging applications, but insufficient magnetic flux (|B1|) density and non-uniform distribution still exist in high-field (HF) environments. We demonstrate that the asymmetric birdcage (ABC) Tx/Rx volume coil, a standard birdcage (SBC) coil modified by splitting the end ring into two halves, is suitable for improving |B1| sensitivity in 7T small-animal MR imaging. Cylindrical SBC and ABC coils with 35 mm diameter were constructed and bench tested for mouse body MR imaging at 300 MHz using a 7T scanner. To assess the ABC coil performance, computational electromagnetic (EM) simulation and 7T MR experiments were performed using a cylindrical phantom and an in vivo mouse body, and quantitatively compared with the SBC coil in terms of |B1| distribution, RF transmit (|B1+|) field, and signal-to-noise ratio (SNR). The bench measurements of the two BC coils are similar, yielding a quality value (Q-value) of 74.42 for the SBC coil and 77.06 for the ABC coil. The computational results clearly show that the proposed ABC coil offers superior |B1| field and |B1+| field sensitivity in the central axial slice compared with the SBC coil. There was also high SNR and a uniformly distributed flip angle (FA) under the loaded condition of the mouse body in the 7T experiment. Although the ABC geometry allows a further increase in |B1| field and |B1+| field sensitivity only in the central axial slice, this geometrical modification of the SBC coil makes a high-performance RF coil feasible in the central axial slice and also makes target imaging possible in the diagonal direction. PMID:27725573
Ornelas, Juan Francisco; Gándara, Etelvina; Vásquez-Aguilar, Antonio Acini; Ramírez-Barahona, Santiago; Ortiz-Rodriguez, Andrés Ernesto; González, Clementina; Mejía Saules, María Teresa; Ruiz-Sanchez, Eduardo
2016-04-12
Ecological adaptation to host taxa is thought to result in mistletoe speciation via race formation. However, historical and ecological factors could also contribute to explaining genetic structuring, particularly when mistletoe host races are distributed allopatrically. Using sequence data from nuclear (ITS) and chloroplast (trnL-F) DNA, we investigated the genetic differentiation of 31 Psittacanthus schiedeanus (Loranthaceae) populations across the Mesoamerican species range. We conducted phylogenetic, population and spatial genetic analyses on 274 individuals of P. schiedeanus to gain insight into the evolutionary history of these populations. Species distribution modeling, isolation-with-migration and Bayesian inference methods were used to infer the evolutionary transition of mistletoe invasion, in which evolutionary scenarios were compared through posterior probabilities. Our analyses revealed shallow levels of population structure, with three genetic groups present across the sample area. Nine haplotypes were identified after sequencing the trnL-F intergenic spacer. These haplotypes showed phylogeographic structure, with three groups with restricted gene flow corresponding to the distribution of individuals/populations separated by habitat (cloud forest localities from San Luis Potosí to northwestern Oaxaca and Chiapas, localities with xeric vegetation in central Oaxaca, and localities with tropical deciduous forests in Chiapas), with post-glacial population expansions potentially corresponding to post-glacial invasion types. Similarly, 44 ITS ribotypes suggest phylogeographic structure, despite the fact that the most frequent ribotypes are widespread, indicating effective nuclear gene flow via pollen. Gene flow estimates, a significant genetic signal of demographic expansion, and range shifts under past climatic conditions predicted by species distribution modeling suggest post-glacial invasion of P. schiedeanus mistletoes to cloud forests. However, Approximate Bayesian Computation (ABC) analyses strongly supported a scenario of simultaneous divergence among the three groups isolated recently. Our results provide support for the predominant role of isolation and environmental factors in driving genetic differentiation of Mesoamerican parrot-flower mistletoes. The ABC results are consistent with a scenario of post-glacial mistletoe invasion, independent of host identity, in which habitat types recently isolated P. schiedeanus populations, accumulating slight phenotypic differences among genetic groups due to recent migration across habitats. Under this scenario, climatic fluctuations throughout the Pleistocene would have altered the distribution of suitable habitat for mistletoes throughout Mesoamerica, leading to variation in population continuity and isolation. Our findings add to an understanding of the role of recent isolation and colonization in shaping cloud forest communities in the region.
Computational Neuropsychology and Bayesian Inference.
Parr, Thomas; Rees, Geraint; Friston, Karl J
2018-01-01
Computational theories of brain function have become very influential in neuroscience. They have facilitated the growth of formal approaches to disease, particularly in psychiatric research. In this paper, we provide a narrative review of the body of computational research addressing neuropsychological syndromes, and focus on those that employ Bayesian frameworks. Bayesian approaches to understanding brain function formulate perception and action as inferential processes. These inferences combine 'prior' beliefs with a generative (predictive) model to explain the causes of sensations. Under this view, neuropsychological deficits can be thought of as false inferences that arise due to aberrant prior beliefs (that are poor fits to the real world). This draws upon the notion of a Bayes optimal pathology - optimal inference with suboptimal priors - and provides a means for computational phenotyping. In principle, any given neuropsychological disorder could be characterized by the set of prior beliefs that would make a patient's behavior appear Bayes optimal. We start with an overview of some key theoretical constructs and use these to motivate a form of computational neuropsychology that relates anatomical structures in the brain to the computations they perform. Throughout, we draw upon computational accounts of neuropsychological syndromes. These are selected to emphasize the key features of a Bayesian approach, and the possible types of pathological prior that may be present. They range from visual neglect through hallucinations to autism. Through these illustrative examples, we review the use of Bayesian approaches to understand the link between biology and computation that is at the heart of neuropsychology.
A Large Animal Model that Recapitulates the Spectrum of Human Intervertebral Disc Degeneration
Gullbrand, Sarah E.; Malhotra, Neil R.; Schaer, Thomas P.; Zawacki, Zosia; Martin, John T.; Bendigo, Justin R.; Milby, Andrew H.; Dodge, George R.; Vresilovic, Edward J.; Elliott, Dawn M.; Mauck, Robert L.; Smith, Lachlan J.
2016-01-01
Objective The objective of this study was to establish a large animal model that recapitulates the spectrum of intervertebral disc degeneration that occurs in humans and which is suitable for pre-clinical evaluation of a wide range of experimental therapeutics. Design Degeneration was induced in the lumbar intervertebral discs of large frame goats by either intradiscal injection of chondroitinase ABC (ChABC) over a range of dosages (0.1U, 1U or 5U) or subtotal nucleotomy. Radiographs were used to assess disc height changes over 12 weeks. Degenerative changes to the discs and endplates were assessed via magnetic resonance imaging (MRI), semi-quantitative histological grading, micro-computed tomography (µCT), and measurement of disc biomechanical properties. Results Degenerative changes were observed for all interventions, ranging from mild (0.1U ChABC) to moderate (1U ChABC and nucleotomy) to severe (5U ChABC). All groups showed progressive reductions in disc height over 12 weeks. Histological scores were significantly increased in the 1U and 5U ChABC groups. Reductions in T2 and T1ρ, and increased Pfirrmann grade were observed on MRI. Resorption and remodeling of the cortical bony endplate adjacent to ChABC-injected discs also occurred. Spine segment range of motion was greater and compressive modulus was lower in 1U ChABC and nucleotomy discs compared to intact discs. Conclusions A large animal model of disc degeneration was established that recapitulates the spectrum of structural, compositional and biomechanical features of human disc degeneration. This model may serve as a robust platform for evaluating the efficacy of therapeutics targeted towards varying degrees of disc degeneration. PMID:27568573
High order local absorbing boundary conditions for acoustic waves in terms of farfield expansions
NASA Astrophysics Data System (ADS)
Villamizar, Vianey; Acosta, Sebastian; Dastrup, Blake
2017-03-01
We devise a new high order local absorbing boundary condition (ABC) for radiating problems and scattering of time-harmonic acoustic waves from obstacles of arbitrary shape. By introducing an artificial boundary S enclosing the scatterer, the original unbounded domain Ω is decomposed into a bounded computational domain Ω- and an exterior unbounded domain Ω+. Then, we define interface conditions at the artificial boundary S, from truncated versions of the well-known Wilcox and Karp farfield expansion representations of the exact solution in the exterior region Ω+. As a result, we obtain a new local absorbing boundary condition (ABC) for a bounded problem on Ω-, which effectively accounts for the outgoing behavior of the scattered field. Contrary to the low order absorbing conditions previously defined, the error at the artificial boundary induced by this novel ABC can be easily reduced to reach any accuracy within the limits of the computational resources. We accomplish this by simply adding as many terms as needed to the truncated farfield expansions of Wilcox or Karp. The convergence of these expansions guarantees that the order of approximation of the new ABC can be increased arbitrarily without having to enlarge the radius of the artificial boundary. We include numerical results in two and three dimensions which demonstrate the improved accuracy and simplicity of this new formulation when compared to other absorbing boundary conditions.
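For reference, the Wilcox farfield expansion underlying this construction can be written (in one common normalization, for the 3-D Helmholtz equation with wavenumber k) as

    \[
      u_{\mathrm{scat}}(r,\theta,\varphi)
        = \frac{e^{ikr}}{r}\sum_{l=0}^{\infty}\frac{f_l(\theta,\varphi)}{r^{\,l}},
      \qquad r \ge R,
    \]

and truncating the series after L terms at the artificial boundary r = R yields a local condition whose order of approximation grows with L, as described above. The exact form used in the paper, including the Karp expansion for two dimensions, may differ in normalization.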
Nessler, Bernhard; Pfeiffer, Michael; Buesing, Lars; Maass, Wolfgang
2013-01-01
The principles by which networks of neurons compute, and how spike-timing dependent plasticity (STDP) of synaptic weights generates and maintains their computational function, are unknown. Preceding work has shown that soft winner-take-all (WTA) circuits, where pyramidal neurons inhibit each other via interneurons, are a common motif of cortical microcircuits. We show through theoretical analysis and computer simulations that Bayesian computation is induced in these network motifs through STDP in combination with activity-dependent changes in the excitability of neurons. The fundamental components of this emergent Bayesian computation are priors that result from adaptation of neuronal excitability and implicit generative models for hidden causes that are created in the synaptic weights through STDP. In fact, a surprising result is that STDP is able to approximate a powerful principle for fitting such implicit generative models to high-dimensional spike inputs: Expectation Maximization. Our results suggest that the experimentally observed spontaneous activity and trial-to-trial variability of cortical neurons are essential features of their information processing capability, since their functional role is to represent probability distributions rather than static neural codes. Furthermore it suggests networks of Bayesian computation modules as a new model for distributed information processing in the cortex. PMID:23633941
NASA Astrophysics Data System (ADS)
Rahimi Dalkhani, Amin; Javaherian, Abdolrahim; Mahdavi Basir, Hadi
2018-04-01
Wave propagation modeling, a vital tool in seismology, can be carried out with several different numerical methods, among them the finite-difference, finite-element, and spectral-element methods (FDM, FEM, and SEM). Some advanced applications in seismic exploration benefit from frequency-domain modeling. Given their flexibility for complex geological models and their handling of the free-surface boundary condition, we studied the frequency-domain acoustic wave equation using the FEM and SEM. The results demonstrated that the frequency-domain FEM and SEM have good accuracy and numerical efficiency with second-order interpolation polynomials. Furthermore, we developed the second-order Clayton and Engquist absorbing boundary condition (CE-ABC2) and compared it with the perfectly matched layer (PML) for the frequency-domain FEM and SEM. Unlike the PML method, CE-ABC2 does not add any computational cost to the modeling beyond assembling the boundary matrices. As a result, CE-ABC2 is more efficient than the PML for frequency-domain acoustic wave propagation modeling, especially when the computational cost is high and high-level absorbing performance is unnecessary.
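For orientation, the second-order Clayton-Engquist condition for the frequency-domain (Helmholtz) acoustic equation is commonly written, on a straight artificial boundary with outward normal n and tangential coordinate τ, as

    \[
      \frac{\partial u}{\partial n} \;-\; i k\, u \;-\; \frac{i}{2k}\,\frac{\partial^2 u}{\partial \tau^2} \;=\; 0,
    \]

up to the sign conventions of the assumed time dependence; the discrete form assembled into the FEM/SEM boundary matrices in the paper may differ in detail.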
Wang, Jiali; Zhang, Qingnian; Ji, Wenfeng
2014-01-01
The computation of an objective Bayesian network requires a large amount of data, which is often hard to obtain in practice. The calculation method of the Bayesian network was improved in this paper, yielding a fuzzy-precise Bayesian network. The fuzzy-precise Bayesian network was then used to reason over the Bayesian network model when data are limited. The safety of passengers during shipping is affected by various factors and is hard to predict and control. An index system of factors affecting passenger safety during shipping was established on the basis of multifield coupling theory. The fuzzy-precise Bayesian network was then applied to monitor passenger safety in the shipping process. The model was applied to monitor passenger safety during shipping at a shipping company in Hainan, and the effectiveness of the model was examined. This research work provides guidance for guaranteeing the safety of passengers during shipping.
Quantifying the effect of experimental design choices for in vitro scratch assays.
Johnston, Stuart T; Ross, Joshua V; Binder, Benjamin J; McElwain, D L Sean; Haridas, Parvathi; Simpson, Matthew J
2016-07-07
Scratch assays are often used to investigate potential drug treatments for chronic wounds and cancer. Interpreting these experiments with a mathematical model allows us to estimate the cell diffusivity, D, and the cell proliferation rate, λ. However, the influence of the experimental design on the estimates of D and λ is unclear. Here we apply an approximate Bayesian computation (ABC) parameter inference method, which produces a posterior distribution of D and λ, to new sets of synthetic data, generated from an idealised mathematical model, and experimental data for a non-adhesive mesenchymal population of fibroblast cells. The posterior distribution allows us to quantify the amount of information obtained about D and λ. We investigate two types of scratch assay, as well as varying the number and timing of the experimental observations captured. Our results show that a scrape assay, involving one cell front, provides more precise estimates of D and λ, and is more computationally efficient to interpret than a wound assay, with two opposingly directed cell fronts. We find that recording two observations, after making the initial observation, is sufficient to estimate D and λ, and that the final observation time should correspond to the time taken for the cell front to move across the field of view. These results provide guidance for estimating D and λ, while simultaneously minimising the time and cost associated with performing and interpreting the experiment. Copyright © 2016 Elsevier Ltd. All rights reserved.
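The ABC rejection step itself is compact. The following Python sketch is a toy stand-in, not the authors' code: the "simulator" reduces to two closed-form summary statistics motivated by Fisher-KPP front dynamics (front displacement ≈ 2√(Dλ)·T and log fold-growth ≈ λT), and the priors, tolerance, and parameter values are hypothetical:

    import numpy as np

    rng = np.random.default_rng(7)

    def summaries(D, lam, T=48.0, noise=0.0):
        # Toy stand-in for the scratch-assay simulator: Fisher-KPP front
        # displacement and log fold-growth as the two summary statistics
        s = np.array([2.0 * np.sqrt(D * lam) * T,  # front displacement
                      lam * T])                    # log fold-growth
        return s + noise * rng.normal(size=2)

    s_obs = summaries(D=500.0, lam=0.05, noise=1.0)  # 'observed' data

    # ABC rejection: keep prior draws whose simulated summaries fall
    # within tolerance eps of the observed summaries
    n_draws, eps, accepted = 50_000, 5.0, []
    for _ in range(n_draws):
        D = rng.uniform(100.0, 1000.0)    # vague but proper priors
        lam = rng.uniform(0.01, 0.10)
        if np.linalg.norm(summaries(D, lam) - s_obs) < eps:
            accepted.append((D, lam))

    posterior = np.array(accepted)
    print(posterior.shape[0], posterior.mean(axis=0))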
MapReduce Based Parallel Bayesian Network for Manufacturing Quality Control
NASA Astrophysics Data System (ADS)
Zheng, Mao-Kuan; Ming, Xin-Guo; Zhang, Xian-Yu; Li, Guo-Ming
2017-09-01
The increasing complexity of industrial products and manufacturing processes has challenged conventional statistics-based quality management approaches under dynamic production conditions. A Bayesian network and big data analytics integrated approach for manufacturing process quality analysis and control is proposed. Based on the Hadoop distributed architecture and the MapReduce parallel computing model, the high-volume, high-variety quality-related data generated during the manufacturing process can be handled. Artificial intelligence algorithms, including Bayesian network learning, classification, and reasoning, are embedded into the Reduce process. Relying on the ability of the Bayesian network to deal with dynamic and uncertain problems and on the parallel computing power of MapReduce, Bayesian networks of impact factors on quality are built based on prior probability distributions and modified with posterior probability distributions. A case study on hull segment manufacturing precision management for ship and offshore platform building shows that computing speed accelerates almost in direct proportion to the number of computing nodes. It is also shown that the proposed model is feasible for locating and reasoning about root causes, forecasting manufacturing outcomes, and intelligent decision making for precision problem solving. The integration of big data analytics and the BN method offers a whole new perspective on manufacturing quality control.
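The MapReduce fit is natural because the sufficient statistics of a discrete Bayesian network are counts of (child, parents) configurations, which can be accumulated independently on each data partition and then merged. A minimal Python sketch with hypothetical quality records is shown below; a real deployment would express map_fn and reduce_fn as Hadoop jobs:

    from collections import Counter
    from functools import reduce

    # Hypothetical quality records: (process_stage, machine, defect)
    records = [
        ("cutting", "M1", "none"), ("cutting", "M2", "warp"),
        ("welding", "M1", "crack"), ("welding", "M1", "none"),
    ]

    def map_fn(record):
        # Map: emit one count per (child, parents) configuration needed
        # for the network's conditional probability tables
        stage, machine, defect = record
        return Counter({(defect, stage, machine): 1})

    def reduce_fn(c1, c2):
        # Reduce: merge partial counts from different data partitions
        return c1 + c2

    counts = reduce(reduce_fn, map(map_fn, records))

    # Normalise counts into CPT entries, e.g. P(defect | stage, machine)
    totals = Counter()
    for (defect, stage, machine), n in counts.items():
        totals[(stage, machine)] += n
    cpt = {k: n / totals[(k[1], k[2])] for k, n in counts.items()}
    print(cpt)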
Nadachowska-Brzyska, Krystyna; Burri, Reto; Olason, Pall I.; Kawakami, Takeshi; Smeds, Linnéa; Ellegren, Hans
2013-01-01
Profound knowledge of demographic history is a prerequisite for the understanding and inference of processes involved in the evolution of population differentiation and speciation. Together with new coalescent-based methods, the recent availability of genome-wide data enables investigation of differentiation and divergence processes at unprecedented depth. We combined two powerful approaches, full Approximate Bayesian Computation analysis (ABC) and pairwise sequentially Markovian coalescent modeling (PSMC), to reconstruct the demographic history of the split between two avian speciation model species, the pied flycatcher and collared flycatcher. Using whole-genome re-sequencing data from 20 individuals, we investigated 15 demographic models including different levels and patterns of gene flow, and changes in effective population size over time. ABC provided high support for recent (mode 0.3 my, range <0.7 my) species divergence, declines in effective population size of both species since their initial divergence, and unidirectional recent gene flow from pied flycatcher into collared flycatcher. The estimated divergence time and population size changes, supported by PSMC results, suggest that the ancestral species persisted through one of the glacial periods of middle Pleistocene and then split into two large populations that first increased in size before going through severe bottlenecks and expanding into their current ranges. Secondary contact appears to have been established after the last glacial maximum. The severity of the bottlenecks at the last glacial maximum is indicated by the discrepancy between current effective population sizes (20,000–80,000) and census sizes (5–50 million birds) of the two species. The recent divergence time challenges the supposition that avian speciation is a relatively slow process with extended times for intrinsic postzygotic reproductive barriers to evolve. Our study emphasizes the importance of using genome-wide data to unravel tangled demographic histories. Moreover, it constitutes one of the first examples of the inference of divergence history from genome-wide data in non-model species. PMID:24244198
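In ABC model choice of this kind, posterior model probabilities are approximated by the acceptance proportions of simulations from each candidate model. The Python sketch below illustrates the principle with two hypothetical one-statistic "demographic models"; real analyses such as this one use many summary statistics and far richer simulators:

    import numpy as np

    rng = np.random.default_rng(3)

    # Two hypothetical demographic models producing a single summary
    # statistic; real analyses use many summaries and rich simulators
    def simulate(model):
        if model == "isolation":
            return rng.normal(1.0, 0.3)
        return rng.normal(0.6, 0.3)      # "migration" model

    s_obs, n_draws, eps, kept = 0.65, 100_000, 0.05, []
    for _ in range(n_draws):
        m = rng.choice(["isolation", "migration"])  # uniform model prior
        if abs(simulate(m) - s_obs) < eps:
            kept.append(m)

    # Posterior model probabilities ~ acceptance proportions per model
    kept = np.array(kept)
    for m in ("isolation", "migration"):
        print(m, round(float((kept == m).mean()), 3))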
Laboratory Colonisation and Genetic Bottlenecks in the Tsetse Fly Glossina pallidipes
Ciosi, Marc
2014-01-01
Background The IAEA colony is the only one available for mass rearing of Glossina pallidipes, a vector of human and animal African trypanosomiasis in eastern Africa. This colony is the source for Sterile Insect Technique (SIT) programs in East Africa. The source population of this colony is unclear and its genetic diversity has not previously been evaluated and compared to field populations. Methodology/Principal Findings We examined the genetic variation within and between the IAEA colony and its potential source populations in north Zimbabwe and the Kenya/Uganda border at 9 microsatellite loci to retrace the demographic history of the IAEA colony. We performed classical population genetics analyses and also combined historical and genetic data in a quantitative analysis using Approximate Bayesian Computation (ABC). There is no evidence of introgression from the north Zimbabwean population into the IAEA colony. Moreover, the ABC analyses revealed that the foundation and establishment of the colony were associated with a genetic bottleneck that has resulted in a loss of 35.7% of alleles and 54% of expected heterozygosity compared to its source population. Also, we show that tsetse control carried out in the 1990s likely reduced the effective population size of the Kenya/Uganda border population. Conclusions/Significance All the analyses indicate that the area of origin of the IAEA colony is the Kenya/Uganda border and that a genetic bottleneck was associated with the foundation and establishment of the colony. Genetic diversity associated with traits that are important for SIT may potentially have been lost during this genetic bottleneck, which could lead to suboptimal competitiveness of the colony males in the field. The genetic diversity of the colony is lower than that of field populations and so studies using colony flies should be interpreted with caution when drawing general conclusions about G. pallidipes biology. PMID:24551260
Global Artificial Boundary Conditions for Computation of External Flow Problems with Propulsive Jets
NASA Technical Reports Server (NTRS)
Tsynkov, Semyon; Abarbanel, Saul; Nordstrom, Jan; Ryabenkii, Viktor; Vatsa, Veer
1998-01-01
We propose new global artificial boundary conditions (ABC's) for computation of flows with propulsive jets. The algorithm is based on application of the difference potentials method (DPM). Previously, similar boundary conditions have been implemented for calculation of external compressible viscous flows around finite bodies. The proposed modification substantially extends the applicability range of the DPM-based algorithm. In the paper, we present the general formulation of the problem, describe our numerical methodology, and discuss the corresponding computational results. The particular configuration that we analyze is a slender three-dimensional body with boat-tail geometry and supersonic jet exhaust in a subsonic external flow under zero angle of attack. Similarly to the results obtained earlier for the flows around airfoils and wings, current results for the jet flow case corroborate the superiority of the DPM-based ABC's over standard local methodologies from the standpoints of accuracy, overall numerical performance, and robustness.
Strategies for improving approximate Bayesian computation tests for synchronous diversification.
Overcast, Isaac; Bagley, Justin C; Hickerson, Michael J
2017-08-24
Estimating the variability in isolation times across co-distributed taxon pairs that may have experienced the same allopatric isolating mechanism is a core goal of comparative phylogeography. The use of hierarchical Approximate Bayesian Computation (ABC) and coalescent models to infer temporal dynamics of lineage co-diversification has been a contentious topic in recent years. Key issues that remain unresolved include the choice of an appropriate prior on the number of co-divergence events (Ψ), as well as the optimal strategies for data summarization. Through simulation-based cross validation we explore the impact of the strategy for sorting summary statistics and the choice of prior on Ψ on the estimation of co-divergence variability. We also introduce a new setting (β) that can potentially improve estimation of Ψ by enforcing a minimal temporal difference between pulses of co-divergence. We apply this new method to three empirical datasets: one dataset each of co-distributed taxon pairs of Panamanian frogs and freshwater fishes, and a large set of Neotropical butterfly sister-taxon pairs. We demonstrate that the choice of prior on Ψ has little impact on inference, but that sorting summary statistics yields substantially more reliable estimates of co-divergence variability despite violations of assumptions about exchangeability. We find the implementation of β improves estimation of Ψ, with improvement being most dramatic given larger numbers of taxon pairs. We find equivocal support for synchronous co-divergence for both of the Panamanian groups, but we find considerable support for asynchronous divergence among the Neotropical butterflies. Our simulation experiments demonstrate that using sorted summary statistics results in improved estimates of the variability in divergence times, whereas the choice of hyperprior on Ψ has negligible effect. Additionally, we demonstrate that estimating the number of pulses of co-divergence across co-distributed taxon-pairs is improved by applying a flexible buffering regime over divergence times. This improves the correlation between Ψ and the true variability in isolation times and allows for more meaningful interpretation of this hyperparameter. This will allow for more accurate identification of the number of temporally distinct pulses of co-divergence that generated the diversification pattern of a given regional assemblage of sister-taxon-pairs.
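The effect of sorting can be seen in a small Python sketch (values hypothetical): because co-distributed taxon pairs are exchangeable, sorting each summary statistic across pairs aligns distributions rather than arbitrary pair labels before the ABC distance is computed.

    import numpy as np

    # Summary statistics for co-distributed taxon pairs, one row per
    # pair (values hypothetical). Pairs are exchangeable, so comparing
    # raw rows between observed and simulated data is arbitrary.
    obs = np.array([[0.12, 0.8], [0.31, 0.5], [0.22, 0.6]])
    sim = np.array([[0.30, 0.5], [0.13, 0.8], [0.21, 0.6]])

    def distance(a, b, sort=True):
        # Sorting each statistic across pairs aligns the distributions
        # rather than the (meaningless) pair labels before computing
        # the Euclidean distance used in ABC rejection
        if sort:
            a, b = np.sort(a, axis=0), np.sort(b, axis=0)
        return np.linalg.norm(a - b)

    print(distance(obs, sim, sort=False))  # inflated by label mismatch
    print(distance(obs, sim, sort=True))   # near zero: same distributions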
Understanding exoplanet populations with simulation-based methods
NASA Astrophysics Data System (ADS)
Morehead, Robert Charles
The Kepler candidate catalog represents an unprecedented sample of exoplanet host stars. This dataset is ideal for probing the populations of exoplanet systems and exploring their architectures. Confirming transiting exoplanet candidates through traditional follow-up methods is challenging, especially for faint host stars. Most of Kepler's validated planets relied on statistical methods to separate true planets from false positives. Multiple transiting planet systems (MTPSs) have previously been shown to have low false-positive rates, and over 850 planets in MTPSs have been statistically validated so far. We show that the period-normalized transit duration ratio (ξ) offers additional information that can be used to establish the planetary nature of these systems. We briefly discuss the observed distribution of ξ for the Q1-Q17 Kepler Candidate Search. We also use ξ to develop a Bayesian statistical framework, combined with Monte Carlo methods, to determine which pairs of planet candidates in an MTPS are consistent with the planet hypothesis, for a sample of 862 MTPSs that include candidate planets, confirmed planets, and known false positives. This analysis proves to be efficient and advantageous in that it requires only catalog-level bulk candidate properties and galactic population modeling to compute the probabilities of a myriad of feasible scenarios composed of background and companion stellar blends in the photometric aperture, without needing additional observational follow-up. Our results agree with previous findings of a low false-positive rate in the Kepler MTPSs. This implies, independently of any other estimates, that most of the MTPSs detected by Kepler are planetary in nature, but that a substantial fraction could be orbiting stars other than the putative target star, and therefore may be subject to significant error in the inferred planet parameters resulting from unknown or mismeasured stellar host attributes. We also apply approximate Bayesian computation (ABC), using forward simulations of the Kepler planet catalog, to simultaneously constrain the distributions of mutual inclination between the planets, orbital eccentricity, the underlying number of planets per planetary system, and the fraction of stars that host planet systems in a subsample of Kepler candidate planets, using SimpleABC, a Python package we developed as a general-purpose framework for ABC analysis. For our investigation into planet architectures, we limit the analysis to candidates in orbits from 10 to 320 days, where the false-positive contamination rate is expected to be low. We test two models. The first is an independent eccentricity (e) model, where mutual inclination and e are drawn from Rayleigh distributions with dispersions σ_im and σ_e, the number of planets per planetary system is drawn from a Poisson distribution with mean λ, and the fraction of stars with planetary systems is drawn from a two-state categorical distribution parameterized by η_p. We also test an Equipartition Model, identical to the Independent e Model except that σ_e is linked to σ_im by a scaling factor γ_e. For the Independent e Model, we find σ_im = 5.51° (+8.00, −3.35), σ_e = 0.03 (+0.05, −0.01), λ = 6.62 (+7.74, −3.36), and η_p = 0.20 (+0.18, −0.11). For the Equipartition Model, we find σ_im = 1.15° (+0.56, −0.33), γ_e = 1.38 (+1.89, −0.93), λ = 2.25 (+0.56, −0.29), and η_p = 0.56 (+0.08, −0.11). These results, especially for the Equipartition Model, are in good agreement with previous studies.
However, deficiencies in our single-population models suggest that at least one additional subpopulation of planet systems is needed to explain the Kepler sample, providing further confirmation of the so-called "Kepler Dichotomy".
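As a hedged illustration of the kind of forward-simulation ABC analysis described above: the rejection loop below is the textbook core that general-purpose packages such as SimpleABC automate. The function names and the toy Rayleigh example are ours for illustration, not the package's actual API.

```python
import numpy as np

def abc_rejection(observed_stats, simulate, prior_sample, distance,
                  n_draws=100_000, epsilon=0.1):
    """Basic ABC rejection sampling.

    simulate(theta)   -> summary statistics of one synthetic catalog
    prior_sample()    -> one draw of the model parameters
    distance(s1, s2)  -> scalar discrepancy between summary statistics
    """
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if distance(simulate(theta), observed_stats) < epsilon:
            accepted.append(theta)
    return np.array(accepted)  # samples from the approximate posterior

# Toy usage: infer a Rayleigh dispersion from the mean of simulated draws.
rng = np.random.default_rng(0)
obs = np.mean(rng.rayleigh(scale=2.0, size=500))          # "observed" summary
post = abc_rejection(
    observed_stats=obs,
    simulate=lambda s: np.mean(rng.rayleigh(scale=s, size=500)),
    prior_sample=lambda: rng.uniform(0.1, 10.0),
    distance=lambda a, b: abs(a - b),
    n_draws=20_000, epsilon=0.05)
print(post.mean(), np.percentile(post, [16, 84]))
```

In practice the choice of summary statistics, distance, and tolerance schedule dominates the quality of the approximation.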
NASA Astrophysics Data System (ADS)
Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad
2016-05-01
Bayesian inference has traditionally been conceived as the proper framework for the formal incorporation of expert knowledge in parameter estimation of groundwater models. However, conventional Bayesian inference is incapable of taking into account the imprecision essentially embedded in expert-provided information. To address this problem, a number of extensions to conventional Bayesian inference have been introduced in recent years. One of these extensions is 'fuzzy Bayesian inference', which results from integrating fuzzy techniques into Bayesian statistics. Fuzzy Bayesian inference has a number of desirable features which make it an attractive approach for incorporating expert knowledge into the parameter estimation process of groundwater models: (1) it is well adapted to the nature of expert-provided information, (2) it can represent uncertainty and imprecision as distinct quantities, and (3) it presents a framework for fusing expert-provided information regarding the various inputs of the Bayesian inference algorithm. However, an important obstacle to employing fuzzy Bayesian inference in groundwater numerical modeling applications is the computational burden, as the required number of numerical model simulations often becomes prohibitively large and computationally infeasible. In this paper, a novel approach to accelerating the fuzzy Bayesian inference algorithm is proposed, based on using approximate posterior distributions derived from surrogate modeling as a screening tool in the computations. The proposed approach is first applied to a synthetic test case of seawater intrusion (SWI) in a coastal aquifer. It is shown that for this synthetic test case, the proposed approach decreases the number of required numerical simulations by an order of magnitude. The proposed approach is then applied to a real-world test case involving three-dimensional numerical modeling of SWI in Kish Island, located in the Persian Gulf. An expert elicitation methodology is developed and applied to the real-world test case in order to provide a road map for the use of fuzzy Bayesian inference in groundwater modeling applications.
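The screening idea lends itself to a compact sketch. Under assumed interfaces (a cheap surrogate log-posterior and an expensive exact one; all names here are ours, not the paper's code), the surrogate discards the bulk of candidate parameter sets so that only promising ones reach the costly numerical model:

```python
import numpy as np

def screened_posterior_sampling(candidates, surrogate_log_post, exact_log_post,
                                keep_quantile=0.2):
    """Use a cheap surrogate to screen candidates before the costly model runs.

    candidates         : (n, d) array of parameter sets (e.g., from the prior)
    surrogate_log_post : cheap approximate log-posterior (surrogate model)
    exact_log_post     : expensive log-posterior (full numerical simulation)
    """
    approx = np.array([surrogate_log_post(c) for c in candidates])
    cutoff = np.quantile(approx, 1.0 - keep_quantile)
    survivors = candidates[approx >= cutoff]          # screened-in candidates
    exact = np.array([exact_log_post(c) for c in survivors])  # few costly runs
    weights = np.exp(exact - exact.max())             # importance reweighting
    return survivors, weights / weights.sum()
```

The keep fraction trades accuracy for cost: the saving in full-model runs reported above comes from exactly this kind of pruning.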
Slok, Annerika H M; in ’t Veen, Johannes C C M; Chavannes, Niels H; van der Molen, Thys; Rutten-van Mölken, Maureen P M H; Kerstjens, Huib A M; Salomé, Philippe L; Holverda, Sebastiaan; Dekhuijzen, PN Richard; Schuiten, Denise; Asijee, Guus M; van Schayck, Onno C P
2014-01-01
In deciding on the treatment plan for patients with chronic obstructive pulmonary disease (COPD), the burden of COPD as experienced by patients should be the core focus. It is therefore important for daily practice to develop a tool that can both assess the burden of COPD and facilitate communication with patients in clinical practice. This paper describes the development of an integrated tool to assess the burden of COPD in daily practice. A definition of the burden of COPD was formulated by a Dutch expert team. Interviews showed that patients and health-care providers agreed on this definition. We found no existing instruments that fully measured burden of disease according to this definition. However, the Clinical COPD Questionnaire meets most requirements, and was therefore used and adapted. The adapted questionnaire is called the Assessment of Burden of COPD (ABC) scale. In addition, the ABC tool was developed, of which the ABC scale is the core part. The ABC tool is a computer program with an algorithm that visualises outcomes and provides treatment advice. The next step in the development of the tool is to test the validity and effectiveness of both the ABC scale and tool in daily practice. PMID:25010353
NASA Astrophysics Data System (ADS)
Wang, Li; Li, Feng; Xing, Jian
2017-10-01
In this paper, a hybrid of the artificial bee colony (ABC) algorithm and the pattern search (PS) method is proposed and applied to the recovery of particle size distributions (PSDs) from spectral extinction data. To make the method more useful and practical, the size distribution function is modeled with the general Johnson's S_B function, which overcomes the difficulty, common in real circumstances, of not knowing the exact distribution type beforehand. The proposed hybrid algorithm is evaluated through simulated examples involving unimodal, bimodal and trimodal PSDs with different widths and mean particle diameters. For comparison, all examples are additionally validated with the ABC algorithm alone. In addition, the performance of the proposed algorithm is further tested with actual extinction measurements of real standard polystyrene samples immersed in water. Simulation and experimental results illustrate that the hybrid algorithm can be used as an effective technique to retrieve PSDs with high reliability and accuracy. Compared with the ABC algorithm alone, our proposed algorithm produces more accurate and robust inversion results while requiring comparable CPU time. By striking a better balance between estimation accuracy and computational effort, the ABC-PS hybridization strategy shows strong potential as an inversion technique for reliable and efficient measurement of PSDs.
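The division of labor the authors describe, global exploration by the bee colony followed by local refinement by pattern search, can be sketched as follows. This Hooke-Jeeves-style polish step is an illustration of the strategy under our own simplifications, not the paper's implementation:

```python
import numpy as np

def pattern_search(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=500):
    """Hooke-Jeeves-style coordinate pattern search for local refinement."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(x.size):              # probe each coordinate +/- step
            for direction in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += direction * step
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= shrink                   # no move helped: tighten the mesh
            if step < tol:
                break
    return x, fx

# Typical hybrid usage: take x_best from the bee-colony global phase, then
# x_refined, f_refined = pattern_search(objective, x_best)
```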
A Multiuser Detector Based on Artificial Bee Colony Algorithm for DS-UWB Systems
Liu, Xiaohui
2013-01-01
The Artificial Bee Colony (ABC) algorithm is an optimization algorithm based on the intelligent behavior of honey bee swarms. The ABC algorithm was developed to solve numerical optimization problems and has shown promising results in processing time and solution quality. In ABC, a colony of artificial bees searches for rich artificial food sources; the numerical optimization problem is converted into the problem of finding the parameters that minimize an objective function. The artificial bees start from a random population of initial solutions and iteratively improve them, moving toward better solutions through a neighbor-search mechanism while abandoning poor ones. In this paper, an efficient multiuser detector based on a suboptimal code mapping multiuser detector and the artificial bee colony algorithm (SCM-ABC-MUD) is proposed and implemented in direct-sequence ultra-wideband (DS-UWB) systems under the additive white Gaussian noise (AWGN) channel. The simulation results demonstrate that the BER and near-far effect resistance performances of the proposed algorithm are quite close to those of the optimum multiuser detector (OMD), while its computational complexity is much lower than that of the OMD. Furthermore, the BER performance of SCM-ABC-MUD is not sensitive to the number of active users and can support a large system capacity. PMID:23983638
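A minimal sketch of the bee-colony mechanism summarized above (employed bees, fitness-proportional onlookers, and scouts that abandon exhausted sources) might look as follows; the objective f, bounds, and all constants are illustrative, not tied to the detector in the paper:

```python
import numpy as np

def abc_minimize(f, bounds, n_food=20, limit=30, max_cycles=200, seed=0):
    """Minimal artificial bee colony: employed, onlooker, and scout phases."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    foods = rng.uniform(lo, hi, size=(n_food, lo.size))
    fit = np.array([f(x) for x in foods])
    trials = np.zeros(n_food, dtype=int)

    def neighbor(i):
        k = rng.integers(n_food)              # random partner source (may be i)
        j = rng.integers(lo.size)             # random dimension to perturb
        x = foods[i].copy()
        x[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
        return np.clip(x, lo, hi)

    def try_improve(i):
        x = neighbor(i)
        fx = f(x)
        if fx < fit[i]:                       # greedy selection
            foods[i], fit[i], trials[i] = x, fx, 0
        else:
            trials[i] += 1

    for _ in range(max_cycles):
        for i in range(n_food):               # employed bees: one per source
            try_improve(i)
        p = fit.max() - fit + 1e-12
        p /= p.sum()                          # onlookers favour better sources
        for i in rng.choice(n_food, size=n_food, p=p):
            try_improve(i)
        worn = trials > limit                 # scouts replace exhausted sources
        foods[worn] = rng.uniform(lo, hi, size=(int(worn.sum()), lo.size))
        fit[worn] = [f(x) for x in foods[worn]]
        trials[worn] = 0
    best = int(fit.argmin())
    return foods[best], fit[best]
```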
Kawahara, Kunimitsu; Kawasumi, Hiromi; Nagano, Teruaki; Sasada, Shinji; Okamoto, Norio
2008-04-01
More than 1 asbestos body (AB) per ml of bronchoalveolar lavage fluid (BALF) under light microscopy is defined as AB positive (ABP) and suggests occupational asbestos exposure. We microscopically evaluated the number of ABs per ml of BALF, which we defined as the AB concentration (ABC), using bronchoalveolar lavage (BAL) cytocentrifuge slides obtained from 35 patients with pulmonary nodular lesions (20 with carcinoma and 15 with nonneoplastic disease), and examined the correlation between ABC and clinicopathological data, including findings on helical computed tomography scans (HCTS) and occupational history of asbestos exposure (OHAE). BAL was performed by the standard technique without removing mucus with a gauze filter. An AB was microscopically defined as a structure consisting of a core of transparent asbestos surrounded by an iron-protein coat. Twenty of 35 patients were ABP (ABP rate, 57%), and ABC ranged from 0 to 207.98/ml (mean ABC, 11.33/ml). Mean ABC was significantly higher in patients with OHAE (15.04/ml) than in patients without OHAE (3.23/ml). Twenty-two of 35 patients (63%) lacked abnormality on HCTS, and among these, 12 patients (55%) were ABP. In 20 pulmonary carcinoma patients, the ABP rate was 85% and ABC ranged from 0 to 31.1/ml (mean ABC, 2.99/ml). The ABP rate of pulmonary carcinoma patients was 40% (8 patients), and among these, 5 patients (63%) did not show any abnormality on HCTS. In conclusion, our method is simple and useful and should be applied to patients with pulmonary nodular lesions and OHAE, even if there are no abnormalities on HCTS.
Bayesian modeling of flexible cognitive control
Jiang, Jiefeng; Heller, Katherine; Egner, Tobias
2014-01-01
“Cognitive control” describes endogenous guidance of behavior in situations where routine stimulus-response associations are suboptimal for achieving a desired goal. The computational and neural mechanisms underlying this capacity remain poorly understood. We examine recent advances stemming from the application of a Bayesian learner perspective that provides optimal prediction for control processes. In reviewing the application of Bayesian models to cognitive control, we note that an important limitation in current models is the lack of a plausible mechanism for the flexible adjustment of control over conflict levels changing at varying temporal scales. We then show that flexible cognitive control can be achieved by a Bayesian model with a volatility-driven learning mechanism that dynamically modulates the relative dependence on recent and remote experiences in its prediction of future control demand. We conclude that the emergent Bayesian perspective on computational mechanisms of cognitive control holds considerable promise, especially if future studies can identify neural substrates of the variables encoded by these models, and determine the nature (Bayesian or otherwise) of their neural implementation. PMID:24929218
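The key mechanism, volatility controlling how steeply past observations are discounted, can be caricatured in a few lines. This exponential-forgetting toy is our simplification of a fully Bayesian volatility learner, and all names and numbers are illustrative:

```python
import numpy as np

def volatility_weighted_prediction(conflict_history, volatility):
    """Predict next-trial control demand; volatility sets the forgetting rate.

    conflict_history : sequence of 0/1 conflict indicators, oldest first
    volatility       : in (0, 1]; high values weight recent trials heavily
    """
    h = np.asarray(conflict_history, dtype=float)
    ages = np.arange(len(h))[::-1]               # 0 = most recent trial
    w = (1.0 - volatility) ** ages               # exponential forgetting
    return float(np.sum(w * h) / np.sum(w))      # predicted conflict level

# A volatile environment (volatility near 1) makes the learner track only the
# last few trials; a stable one (near 0) averages over the whole history.
print(volatility_weighted_prediction([0, 0, 1, 1, 1], volatility=0.8))  # ~0.99
print(volatility_weighted_prediction([0, 0, 1, 1, 1], volatility=0.1))  # ~0.66
```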
Paris, Josephine R; King, R Andrew; Stevens, Jamie R
2015-01-01
Humans have exploited the earth's metal resources for thousands of years leaving behind a legacy of toxic metal contamination and poor water quality. The southwest of England provides a well-defined example, with a rich history of metal mining dating to the Bronze Age. Mine water washout continues to negatively impact water quality across the region where brown trout (Salmo trutta L.) populations exist in both metal-impacted and relatively clean rivers. We used microsatellites to assess the genetic impact of mining practices on trout populations in this region. Our analyses demonstrated that metal-impacted trout populations have low genetic diversity and have experienced severe population declines. Metal-river trout populations are genetically distinct from clean-river populations, and also from one another, despite being geographically proximate. Using approximate Bayesian computation (ABC), we dated the origins of these genetic patterns to periods of intensive mining activity. The historical split of contemporary metal-impacted populations from clean-river fish dated to the Medieval period. Moreover, we observed two distinct genetic populations of trout within a single catchment and dated their divergence to the Industrial Revolution. Our investigation thus provides an evaluation of contemporary population genetics in showing how human-altered landscapes can change the genetic makeup of a species. PMID:26136823
Huang, Lei; Liao, Li; Wu, Cathy H.
2016-01-01
Revealing the underlying evolutionary mechanism plays an important role in understanding protein interaction networks in the cell. While many evolutionary models have been proposed, applying these models to real network data, and in particular determining which model best describes the evolutionary process behind an observed network, remains a pressing challenge. The traditional way is to use a model with presumed parameters to generate a network and then evaluate the fit with summary statistics, which, however, cannot capture complete network-structure information or estimate parameter distributions. In this work we developed a novel method based on Approximate Bayesian Computation and modified Differential Evolution (ABC-DEP) that is capable of conducting model selection and parameter estimation simultaneously and of detecting the underlying evolutionary mechanisms more accurately. We tested our method for its power in differentiating models and estimating parameters on simulated data and found significant improvement in performance benchmarks compared with a previous method. We further applied our method to real data of protein interaction networks in human and yeast. Our results show the Duplication Attachment model as the predominant evolutionary mechanism for human PPI networks and the Scale-Free model as the predominant mechanism for yeast PPI networks. PMID:26357273
Tanaka, Ray; Hayashi, Takafumi; Ike, Makiko; Noto, Yoshiyuki; Goto, Tazuko K
2013-06-01
The aim of this study was to evaluate the usefulness of hypothetical monoenergetic images from dual-energy computed tomography (DECT) for assessment of the bone encircling dental implant bodies. Seventy-two axial images of implantation sites, clipped out from image data scanned using DECT in dual-energy mode, were used. Subjective assessment of the reduction of dark-band-like artifacts (R-DBAs) and the diagnosability of adjacent bone condition (D-ABC) in 3 sets of DECT images - a fused image set (DE120) and 2 sets of hypothetical monoenergetic images (ME100, ME190) - was performed, and the results were statistically analyzed. With regard to R-DBAs and D-ABC, significant differences among DE120, ME100, and ME190 were observed. The ME100 and ME190 images showed greater artifact reduction and diagnosability than DE120. DECT imaging followed by hypothetical monoenergetic image construction can reduce dark-band-like artifacts and increase the diagnosability of the adjacent bone condition, and may potentially be used for the evaluation of postoperative changes in the bone encircling implant bodies. Copyright © 2013 Elsevier Inc. All rights reserved.
The Bayesian Revolution Approaches Psychological Development
ERIC Educational Resources Information Center
Shultz, Thomas R.
2007-01-01
This commentary reviews five articles that apply Bayesian ideas to psychological development, some with psychology experiments, some with computational modeling, and some with both experiments and modeling. The reviewed work extends the current Bayesian revolution into tasks often studied in children, such as causal learning and word learning, and…
A Tutorial in Bayesian Potential Outcomes Mediation Analysis.
Miočević, Milica; Gonzalez, Oscar; Valente, Matthew J; MacKinnon, David P
2018-01-01
Statistical mediation analysis is used to investigate intermediate variables in the relation between independent and dependent variables. Causal interpretation of mediation analyses is challenging because randomization of subjects to levels of the independent variable does not rule out the possibility of unmeasured confounders of the mediator to outcome relation. Furthermore, commonly used frequentist methods for mediation analysis compute the probability of the data given the null hypothesis, which is not the probability of a hypothesis given the data as in Bayesian analysis. Under certain assumptions, applying the potential outcomes framework to mediation analysis allows for the computation of causal effects, and statistical mediation in the Bayesian framework gives indirect effects probabilistic interpretations. This tutorial combines causal inference and Bayesian methods for mediation analysis so the indirect and direct effects have both causal and probabilistic interpretations. Steps in Bayesian causal mediation analysis are shown in the application to an empirical example.
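As a sketch of the computation such a tutorial walks through: under diffuse priors, the posterior of the indirect effect a·b can be approximated by sampling each path coefficient from its large-sample normal posterior and propagating the product. The helper below is our illustration, not the authors' code:

```python
import numpy as np

def indirect_effect_posterior(x, m, y, n_draws=50_000, seed=0):
    """Approximate posterior of the indirect effect a*b (diffuse priors).

    Each path coefficient's posterior is approximated by a normal centred on
    the OLS estimate with its standard error; a*b is propagated by Monte Carlo.
    """
    rng = np.random.default_rng(seed)

    def coef_and_se(columns, response):
        X = np.column_stack([np.ones(len(response))] + columns)
        beta = np.linalg.lstsq(X, response, rcond=None)[0]
        resid = response - X @ beta
        sigma2 = resid @ resid / (len(response) - X.shape[1])
        cov = sigma2 * np.linalg.inv(X.T @ X)
        return beta[1], np.sqrt(cov[1, 1])   # first non-intercept column

    a_hat, a_se = coef_and_se([x], m)        # a path: x -> m
    b_hat, b_se = coef_and_se([m, x], y)     # b path: m -> y, adjusting for x
    ab = rng.normal(a_hat, a_se, n_draws) * rng.normal(b_hat, b_se, n_draws)
    return ab.mean(), np.percentile(ab, [2.5, 97.5])
```

The returned interval is a credible interval for a*b, so it carries the direct probabilistic interpretation the tutorial emphasizes.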
Robson, Barry
2003-01-01
New scientific problems, arising from the human genome project, are challenging the classical means of using statistics. Yet quantified knowledge in the form of rules and rule strengths based on real relationships in data, as opposed to expert opinion, is urgently required for researcher and physician decision support. The problem is that with many parameters, the space to be analyzed is highly dimensional. That is, the combinations of data to examine are subject to a combinatorial explosion as the number of possible events (entries, items, sub-records) (a),(b),(c),... per record (a,b,c,...) increases, and hence much of the space is sparsely populated. These combinatorial considerations are particularly problematic for identifying those associations called "Unicorn Events", which occur significantly less often than expected, to the extent that they are never seen to be counted. To cope with the combinatorial explosion, a novel numerical "bookkeeping" approach is taken to generate information terms relating to the combinatorial subsets of events (a,b,c,...), and, most importantly, the zeta (ζ) function is employed. The incomplete zeta function ζ(s,n) with s = 1, in which frequencies of occurrence such as n = n(a,b,c,...) determine the range of summation n, is argued to be the natural choice of information function. It emerges from Bayesian integration, taken over the distribution of possible values of information measures for sparse and ample data alike. Expected mutual information I(a;b;c) in nats (i.e., natural units analogous to bits but based on the natural logarithm), such as is available to the observer, is measured as, e.g., the difference ζ(s, o(a,b,c,...)) − ζ(s, e(a,b,c,...)), where o(a,b,c,...) and e(a,b,c,...) are, or relate to, the observed and expected frequencies of occurrence, respectively. For real values of s > 1, the qualitative impact of strongly (positively or negatively) ranked data is preserved despite several numerical approximations. As real s increases, the outputs of the information functions converge to the three values +1, 0, and −1 nats, representing a trinary logic system. For quantitative data, a useful ad hoc method to report sigma-normalized covariations, in a manner analogous to mutual information for significance-comparison purposes, is demonstrated. Finally, the potential ability to make use of mutual information in a complex biomedical study, and to include Bayesian prior information derived from statistical, tabular, anecdotal, and expert opinion, is briefly illustrated.
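A minimal rendering of the estimator described above, assuming o and e are plain observed and expected counts (the paper allows quantities that merely "relate to" them):

```python
from math import fsum

def zeta_incomplete(s, n):
    """Incomplete zeta: sum_{k=1..n} 1/k**s (the harmonic number when s=1)."""
    return fsum(1.0 / k**s for k in range(1, int(n) + 1))

def zeta_information(observed, expected, s=1):
    """Expected information in nats via the zeta-based estimator:
    I ~ zeta(s, observed) - zeta(s, expected)."""
    return zeta_incomplete(s, observed) - zeta_incomplete(s, expected)

# An event seen 20 times against an expectation of 5 carries positive
# information; one seen less often than expected carries negative information.
print(zeta_information(20, 5))    # ~ +1.31 nats
print(zeta_information(2, 10))    # ~ -1.43 nats
```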
Multiple sequence alignment using multi-objective based bacterial foraging optimization algorithm.
Rani, R Ranjani; Ramyachitra, D
2016-12-01
Multiple sequence alignment (MSA) is a widespread approach in computational biology and bioinformatics. MSA deals with how sequences of nucleotides and amino acids are aligned with the minimum number of gaps between them, which points to the functional, evolutionary and structural relationships among the sequences. Still, computing an MSA that provides accurate and statistically significant alignments efficiently remains a challenging task. In this work, the Bacterial Foraging Optimization Algorithm was employed to align biological sequences, resulting in a non-dominated optimal solution. It employs multiple objectives: maximization of similarity, non-gap percentage and conserved blocks, and minimization of gap penalty. The BAliBASE 3.0 benchmark database was utilized to examine the proposed algorithm against other methods. In this paper, two algorithms are proposed: a Hybrid Genetic Algorithm with Artificial Bee Colony (GA-ABC) and the Bacterial Foraging Optimization Algorithm. The Hybrid Genetic Algorithm with Artificial Bee Colony performed better than existing optimization algorithms, but conserved blocks could not be obtained with GA-ABC; BFO was therefore used for the alignment, and the conserved blocks were obtained. The proposed Multi-Objective Bacterial Foraging Optimization Algorithm (MO-BFO) was compared with the widely used MSA methods Clustal Omega, Kalign, MUSCLE, MAFFT, Genetic Algorithm (GA), Ant Colony Optimization (ACO), Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO) and Hybrid Genetic Algorithm with Artificial Bee Colony (GA-ABC). The final results show that the proposed MO-BFO algorithm yields better alignments than the most widely used methods. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Bayes' theorem application in the measure information diagnostic value assessment
NASA Astrophysics Data System (ADS)
Orzechowski, Piotr D.; Makal, Jaroslaw; Nazarkiewicz, Andrzej
2006-03-01
The paper presents the application of Bayesian methods to assessing the diagnostic value of measurement information in a computer-aided diagnosis system. The computer system described here is based on a Bayesian network and is used in the diagnosis of Benign Prostatic Hyperplasia (BPH). The graphical diagnostic model makes it possible to juxtapose expert knowledge with data.
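At each node, such a diagnostic network ultimately performs the elementary Bayes update below; the sensitivity, specificity, and prior here are illustrative numbers, not values from the BPH system:

```python
def posterior_probability(prior, sensitivity, specificity):
    """Bayes' theorem for a binary diagnostic finding.

    prior       : P(disease) before the finding
    sensitivity : P(finding | disease)
    specificity : P(no finding | no disease)
    """
    p_finding = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_finding

# Illustrative numbers only: a positive finding with 80% sensitivity and
# 90% specificity raises a 10% prior to roughly 47%.
print(posterior_probability(prior=0.10, sensitivity=0.80, specificity=0.90))
```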
Arnold Diffusion of Charged Particles in ABC Magnetic Fields
NASA Astrophysics Data System (ADS)
Luque, Alejandro; Peralta-Salas, Daniel
2017-06-01
We prove the existence of diffusing solutions in the motion of a charged particle in the presence of ABC magnetic fields. The equations of motion are modeled by a 3DOF Hamiltonian system depending on two parameters. For small values of these parameters, we obtain a normally hyperbolic invariant manifold and we apply the so-called geometric methods for a priori unstable systems developed by A. Delshams, R. de la Llave and T.M. Seara. We characterize explicitly sufficient conditions for the existence of a transition chain of invariant tori having heteroclinic connections, thus obtaining global instability (Arnold diffusion). We also check the obtained conditions in a computer-assisted proof. ABC magnetic fields are the simplest force-free-type solutions of the magnetohydrodynamics equations with periodic boundary conditions, and can be considered as an elementary model for the motion of plasma-charged particles in a tokamak.
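For reference (a standard fact about these fields, not a detail specific to this paper), the Arnold-Beltrami-Childress field behind the acronym is, in its usual normalization, the Beltrami field

```latex
\mathbf{B}(x,y,z) =
\begin{pmatrix}
A\sin z + C\cos y \\
B\sin x + A\cos z \\
C\sin y + B\cos x
\end{pmatrix},
\qquad \nabla \times \mathbf{B} = \mathbf{B},
```

with the constants A, B, C playing the role of the parameters whose small values the abstract refers to.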
Bayesian Asymmetric Regression as a Means to Estimate and Evaluate Oral Reading Fluency Slopes
ERIC Educational Resources Information Center
Solomon, Benjamin G.; Forsberg, Ole J.
2017-01-01
Bayesian techniques have become increasingly present in the social sciences, fueled by advances in computer speed and the development of user-friendly software. In this paper, we forward the use of Bayesian Asymmetric Regression (BAR) to monitor intervention responsiveness when using Curriculum-Based Measurement (CBM) to assess oral reading…
Performance advantages of CPML over UPML absorbing boundary conditions in FDTD algorithm
NASA Astrophysics Data System (ADS)
Gvozdic, Branko D.; Djurdjevic, Dusan Z.
2017-01-01
The implementation of absorbing boundary conditions (ABCs) plays a very important role in the performance and accuracy of the finite-difference time-domain (FDTD) method. The perfectly matched layer (PML) is the most efficient type of ABC. The aim of this paper is to give detailed insight into, and discussion of, boundary conditions, and hence to simplify the choice of the PML used for termination of the computational domain in the FDTD method. In particular, we demonstrate that using the convolutional PML (CPML) has significant advantages over the uniaxial PML (UPML) in terms of ease of implementation in the FDTD method and reduced computational resources. An extensive number of numerical experiments has been performed, and the results show that CPML is more efficient at absorbing electromagnetic waves. Numerical code is prepared, several problems are analyzed, and the relative error is calculated and presented.
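The contrast between the two PMLs traces back to the coordinate-stretching factor. CPML uses the complex-frequency-shifted form

```latex
s_w = \kappa_w + \frac{\sigma_w}{\alpha_w + j\omega\varepsilon_0},
\qquad w \in \{x, y, z\},
```

whereas UPML corresponds to the choice α_w = 0. The nonzero α_w improves absorption of low-frequency and evanescent waves, and in the time domain the frequency dependence is implemented as a recursive convolution with auxiliary variables stored only inside the absorbing layer.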
Drift of Phase Fluctuations in the ABC Model
NASA Astrophysics Data System (ADS)
Bertini, Lorenzo; Buttà, Paolo
2013-07-01
In a recent work, Bodineau and Derrida analyzed the phase fluctuations in the ABC model. In particular, they computed the asymptotic variance and, on the basis of numerical simulations, they conjectured the presence of a drift, which they guessed to be an antisymmetric function of the three densities. By assuming the validity of the fluctuating hydrodynamic approximation, we prove the presence of such a drift, providing an analytical expression for it. This expression is then shown to be an antisymmetric function of the three densities. The antisymmetry of the drift can also be inferred from a symmetry property of the underlying microscopic dynamics.
Ali, Sajid; Gladieux, Pierre; Leconte, Marc; Gautier, Angélique; Justesen, Annemarie F.; Hovmøller, Mogens S.; Enjalbert, Jérôme; de Vallavieille-Pope, Claude
2014-01-01
Analyses of large-scale population structure of pathogens enable the identification of migration patterns, diversity reservoirs or longevity of populations, the understanding of current evolutionary trajectories and the anticipation of future ones. This is particularly important for long-distance migrating fungal pathogens such as Puccinia striiformis f.sp. tritici (PST), capable of rapid spread to new regions and crop varieties. Although a range of recent PST invasions at continental scales are well documented, the worldwide population structure and the center of origin of the pathogen were still unknown. In this study, we used multilocus microsatellite genotyping to infer worldwide population structure of PST and the origin of new invasions based on 409 isolates representative of distribution of the fungus on six continents. Bayesian and multivariate clustering methods partitioned the set of multilocus genotypes into six distinct genetic groups associated with their geographical origin. Analyses of linkage disequilibrium and genotypic diversity indicated a strong regional heterogeneity in levels of recombination, with clear signatures of recombination in the Himalayan (Nepal and Pakistan) and near-Himalayan regions (China) and a predominant clonal population structure in other regions. The higher genotypic diversity, recombinant population structure and high sexual reproduction ability in the Himalayan and neighboring regions suggests this area as the putative center of origin of PST. We used clustering methods and approximate Bayesian computation (ABC) to compare different competing scenarios describing ancestral relationship among ancestral populations and more recently founded populations. Our analyses confirmed the Middle East-East Africa as the most likely source of newly spreading, high-temperature-adapted strains; Europe as the source of South American, North American and Australian populations; and Mediterranean-Central Asian populations as the origin of South African populations. Although most geographic populations are not markedly affected by recent dispersal events, this study emphasizes the influence of human activities on recent long-distance spread of the pathogen. PMID:24465211
Zigouris, Joanna; Schaefer, James A.; Fortin, Clément; Kyle, Christopher J.
2013-01-01
Interglacial-glacial cycles of the Quaternary are widely recognized in shaping phylogeographic structure. Patterns from cold-adapted species can be especially informative - in particular, uncovering additional glacial refugia, identifying likely recolonization patterns, and increasing our understanding of species’ responses to climate change. We investigated the phylogenetic structure of the wolverine, a wide-ranging cold-adapted carnivore, using 318 bp of the mitochondrial DNA control region for 983 wolverines (n = 209 this study, n = 774 from GenBank) from across their full Holarctic distribution. Bayesian phylogenetic tree reconstruction and the distribution of observed pairwise haplotype differences (mismatch distribution) provided evidence of a single rapid population expansion across the wolverine’s Holarctic range. Even though molecular evidence corroborated a single refugium, significant subdivisions of population genetic structure (0.01 < ΦST < 0.99, P < 0.05) were detected. Pairwise ΦST estimates separated Scandinavia from Russia and Mongolia, and identified five main divisions within North America - the Central Arctic, a western region, an eastern region consisting of Ontario and Quebec/Labrador, Manitoba, and California. These data are in contrast to the nearly panmictic structure observed in northwestern North America using nuclear microsatellites, but largely support the nuclear DNA separation of contemporary Manitoba and Ontario wolverines from northern populations. Historic samples (c. 1900) from the functionally extirpated eastern population of Quebec/Labrador displayed genetic similarities to contemporary Ontario wolverines. To understand these divergence patterns, four hypotheses were tested using Approximate Bayesian Computation (ABC). The most supported hypothesis was a single Beringia incursion during the last glacial maximum that established the northwestern population, followed by a west-to-east colonization during the Holocene. This pattern is suggestive of colonization occurring in accordance with glacial retreat, and supports expansion from a single refugium. These data are significant relative to current discussions on the conservation status of this species across its range. PMID:24386287
NASA Astrophysics Data System (ADS)
Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.
2015-03-01
We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.
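The heart of TMCMC is its adaptive tempering schedule. A common rule, sketched here under our own naming and not as Π4U's actual code, picks the next exponent so that the coefficient of variation of the plausibility weights stays near a target:

```python
import numpy as np
from scipy.optimize import brentq

def next_tempering_exponent(log_liks, beta, target_cov=1.0):
    """Pick the next TMCMC exponent so plausibility weights stay balanced.

    Solves for d_beta such that the coefficient of variation of
    w_i = exp(d_beta * log_lik_i) hits target_cov, capping beta at 1.
    """
    ll = np.asarray(log_liks) - np.max(log_liks)   # stabilize exponentials

    def cov_gap(d_beta):
        w = np.exp(d_beta * ll)
        return np.std(w) / np.mean(w) - target_cov

    if cov_gap(1.0 - beta) <= 0:                   # can jump straight to 1
        return 1.0
    return beta + brentq(cov_gap, 1e-12, 1.0 - beta)
```

Small increments keep successive tempered posteriors close enough that resampled chains remain well mixed, which is what makes the stage-wise evaluations embarrassingly parallel.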
ERIC Educational Resources Information Center
West, Patti; Rutstein, Daisy Wise; Mislevy, Robert J.; Liu, Junhui; Choi, Younyoung; Levy, Roy; Crawford, Aaron; DiCerbo, Kristen E.; Chappel, Kristina; Behrens, John T.
2010-01-01
A major issue in the study of learning progressions (LPs) is linking student performance on assessment tasks to the progressions. This report describes the challenges faced in making this linkage using Bayesian networks to model LPs in the field of computer networking. The ideas are illustrated with exemplar Bayesian networks built on Cisco…
Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Leemput, Koen Van
2013-01-01
Many segmentation algorithms in medical image analysis use Bayesian modeling to augment local image appearance with prior anatomical knowledge. Such methods often contain a large number of free parameters that are first estimated and then kept fixed during the actual segmentation process. However, a faithful Bayesian analysis would marginalize over such parameters, accounting for their uncertainty by considering all possible values they may take. Here we propose to incorporate this uncertainty into Bayesian segmentation methods in order to improve the inference process. In particular, we approximate the required marginalization over model parameters using computationally efficient Markov chain Monte Carlo techniques. We illustrate the proposed approach using a recently developed Bayesian method for the segmentation of hippocampal subfields in brain MRI scans, showing a significant improvement in an Alzheimer’s disease classification task. As an additional benefit, the technique also allows one to compute informative “error bars” on the volume estimates of individual structures. PMID:23773521
Bayesian evidence computation for model selection in non-linear geoacoustic inference problems.
Dettmer, Jan; Dosso, Stan E; Osler, John C
2010-12-01
This paper applies a general Bayesian inference approach, based on Bayesian evidence computation, to geoacoustic inversion of interface-wave dispersion data. Quantitative model selection is carried out by computing the evidence (normalizing constants) for several model parameterizations using annealed importance sampling. The resulting posterior probability density estimate is compared to estimates obtained from Metropolis-Hastings sampling to ensure consistent results. The approach is applied to invert interface-wave dispersion data collected on the Scotian Shelf, off the east coast of Canada for the sediment shear-wave velocity profile. Results are consistent with previous work on these data but extend the analysis to a rigorous approach including model selection and uncertainty analysis. The results are also consistent with core samples and seismic reflection measurements carried out in the area.
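Annealed importance sampling, the family of evidence estimators used here, admits a short generic sketch. This one-dimensional toy (our simplification: a single Metropolis move per temperature) returns an estimate of the log evidence:

```python
import numpy as np

def ais_log_evidence(log_prior, log_lik, prior_sample, n_chains=200,
                     betas=np.linspace(0, 1, 51), step=0.5, seed=0):
    """Annealed importance sampling estimate of log evidence (1-D sketch).

    Each chain is annealed from the prior (beta=0) to the posterior (beta=1);
    one Metropolis move per temperature keeps the chain near p_beta.
    """
    rng = np.random.default_rng(seed)
    x = np.array([prior_sample() for _ in range(n_chains)], dtype=float)
    ll = np.array([log_lik(xi) for xi in x])
    log_w = np.zeros(n_chains)
    for b0, b1 in zip(betas[:-1], betas[1:]):
        log_w += (b1 - b0) * ll                      # weight increment
        prop = x + rng.normal(0, step, n_chains)     # Metropolis move at b1
        ll_prop = np.array([log_lik(xi) for xi in prop])
        log_a = (b1 * (ll_prop - ll)
                 + np.array([log_prior(p) - log_prior(q)
                             for p, q in zip(prop, x)]))
        accept = np.log(rng.uniform(size=n_chains)) < log_a
        x[accept], ll[accept] = prop[accept], ll_prop[accept]
    m = log_w.max()                                  # log-mean-exp of weights
    return m + np.log(np.mean(np.exp(log_w - m)))
```

Ratios of such evidence values between parameterizations give the Bayes factors on which the model selection in the paper rests.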
High-order Two-way Artificial Boundary Conditions for Nonlinear Wave Propagation with Backscattering
NASA Technical Reports Server (NTRS)
Fibich, Gadi; Tsynkov, Semyon
2000-01-01
When solving linear scattering problems, one typically first solves for the impinging wave in the absence of obstacles. Then, by linear superposition, the original problem is reduced to one that involves only the scattered waves driven by the values of the impinging field at the surface of the obstacles. In addition, when the original domain is unbounded, special artificial boundary conditions (ABCs) that would guarantee the reflectionless propagation of waves have to be set at the outer boundary of the finite computational domain. The situation becomes conceptually different when the propagation equation is nonlinear. In this case the impinging and scattered waves can no longer be separated, and the problem has to be solved in its entirety. In particular, the boundary on which the incoming field values are prescribed, should transmit the given incoming waves in one direction and simultaneously be transparent to all the outgoing waves that travel in the opposite direction. We call this type of boundary conditions two-way ABCs. In the paper, we construct the two-way ABCs for the nonlinear Helmholtz equation that models the laser beam propagation in a medium with nonlinear index of refraction. In this case, the forward propagation is accompanied by backscattering, i.e., generation of waves in the direction opposite to that of the incoming signal. Our two-way ABCs generate no reflection of the backscattered waves and at the same time impose the correct values of the incoming wave. The ABCs are obtained for a fourth-order accurate discretization to the Helmholtz operator; the fourth-order grid convergence is corroborated experimentally by solving linear model problems. We also present solutions in the nonlinear case using the two-way ABC which, unlike the traditional Dirichlet boundary condition, allows for direct calculation of the magnitude of backscattering.
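The model equation in question is commonly written, for a Kerr-type nonlinear index of refraction (our rendering of the standard form, not a quotation from the paper), as

```latex
\Delta E + k_0^2 \left( 1 + \epsilon \, |E|^2 \right) E = 0,
```

where the small parameter ε measures the strength of the nonlinearity; it is this |E|² coupling that prevents the separation of impinging and scattered waves and motivates the two-way ABCs.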
Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach
NASA Technical Reports Server (NTRS)
Warner, James E.; Hochhalter, Jacob D.
2016-01-01
This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
Nonlinear and non-Gaussian Bayesian based handwriting beautification
NASA Astrophysics Data System (ADS)
Shi, Cao; Xiao, Jianguo; Xu, Canhui; Jia, Wenhua
2013-03-01
A framework is proposed in this paper to effectively and efficiently beautify handwriting by means of a novel nonlinear and non-Gaussian Bayesian algorithm. In the proposed framework, the format and size of the handwriting image are first normalized, and a typeface from the computer system is then applied to optimize the visual effect of the handwriting. Bayesian statistics is exploited to characterize the handwriting beautification process as a Bayesian dynamic model. The model parameters that translate, rotate and scale the typeface are controlled by the state equation, and the optimization of the match between the handwriting and the transformed typeface is handled by the measurement equation. Finally, the new typeface, transformed from the original one to achieve the best nonlinear and non-Gaussian optimization, is the beautification result. Experimental results demonstrate that the proposed framework provides a creative handwriting beautification methodology with improved visual acceptance.
Computational statistics using the Bayesian Inference Engine
NASA Astrophysics Data System (ADS)
Weinberg, Martin D.
2013-09-01
This paper introduces the Bayesian Inference Engine (BIE), a general parallel, optimized software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organize and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasizes hybrid tempered Markov chain Monte Carlo schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialization system that stores the full byte-level image of the running inference and previously characterized posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible object-oriented and easily extended framework that implements every aspect of the Bayesian inference. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical details and download details are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU General Public License.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef
Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling *feasible* in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale sequential data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.
Computer Simulation of the Virulome of Bacillus anthracis Using Proteomics
2006-07-31
[Table excerpt from the proteomics output; only protein annotations and accession numbers are recoverable: hypothetical protein; spermidine/putrescine ABC transporter (spermidine/putrescine-binding protein); oligoendopeptidase F, putative; glutamyl-tRNA(Gln) amidotransferase, A subunit; aspartate aminotransferase; spermidine synthase; gi accessions 47526566, 47526625, 50196927, 50196970.]
Identification of transmissivity fields using a Bayesian strategy and perturbative approach
NASA Astrophysics Data System (ADS)
Zanini, Andrea; Tanda, Maria Giovanna; Woodbury, Allan D.
2017-10-01
The paper deals with the crucial problem of groundwater parameter estimation, which is the basis for efficient modeling and reclamation activities. A hierarchical Bayesian approach is developed: it uses Akaike's Bayesian Information Criterion to estimate the hyperparameters (related to the chosen covariance model) and to quantify the unknown noise variance. The transmissivity identification proceeds in two steps: the first, called empirical Bayesian interpolation, uses Y* (Y = lnT) observations to interpolate Y values on a specified grid; the second, called empirical Bayesian update, improves the previous Y estimate through the addition of hydraulic head observations. The relationship between head and lnT has been linearized through a perturbative solution of the flow equation. In order to test the proposed approach, synthetic aquifers from the literature have been considered. The aquifers in question contain a variety of boundary conditions (both Dirichlet and Neumann type) and scales of heterogeneity (σ_Y² = 1.0 and σ_Y² = 5.3). The estimated transmissivity fields were compared to the true ones. The joint use of Y* and head measurements improves the estimation of Y for both degrees of heterogeneity. Even though the variance of the strongly heterogeneous transmissivity field can be considered high for the application of the perturbative approach, the results show the same order of approximation as the non-linear methods proposed in the literature. The procedure makes it possible to compute the posterior probability distribution of the target quantities and to quantify the uncertainty in the model prediction. Bayesian updating combines advantages of both Monte Carlo (MC) and non-MC approaches: like MC methods, it computes the posterior probability distribution of the target quantities directly, and like non-MC methods, it has computational times on the order of seconds.
Bayesian Inference: with ecological applications
Link, William A.; Barker, Richard J.
2010-01-01
This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference, with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies, as well as students in advanced undergraduate statistics. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.
Incorporating approximation error in surrogate based Bayesian inversion
NASA Astrophysics Data System (ADS)
Zhang, J.; Zeng, L.; Li, W.; Wu, L.
2015-12-01
There is increasing interest in applying surrogates in inverse Bayesian modeling to reduce repetitive evaluations of the original model, with the expectation of saving computational cost. However, the approximation error of the surrogate model is usually overlooked, partly because it is difficult to evaluate for many surrogates. Previous studies have shown that the direct combination of surrogates and Bayesian methods (e.g., Markov Chain Monte Carlo, MCMC) may lead to biased estimations when the surrogate cannot emulate the highly nonlinear original system. This problem can be alleviated by implementing MCMC in a two-stage manner, but the computational cost remains high since a relatively large number of original model simulations is still required. In this study, we illustrate the importance of incorporating approximation error in inverse Bayesian modeling. A Gaussian process (GP) is chosen to construct the surrogate because its approximation error is convenient to evaluate. Numerical cases of Bayesian experimental design and parameter estimation for contaminant source identification are used to illustrate this idea. It is shown that, once the surrogate approximation error is properly incorporated into the Bayesian framework, promising results can be obtained even when the surrogate is used directly, with no further original model simulations required.
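The central device is easy to state in code: if the GP surrogate returns a predictive variance along with its mean, that variance can be added to the measurement-error variance inside the likelihood, so the posterior automatically discounts regions where the surrogate is unsure. A minimal sketch under assumed inputs (names are ours):

```python
import numpy as np

def log_likelihood_with_surrogate_error(y_obs, gp_mean, gp_var, obs_var):
    """Gaussian log-likelihood that folds the surrogate's own uncertainty in.

    y_obs           : observed data vector
    gp_mean, gp_var : GP surrogate prediction and predictive variance at theta
    obs_var         : measurement-error variance
    Treating the approximation error as extra Gaussian noise simply adds the
    two variances, which keeps the posterior from over-trusting the surrogate.
    """
    var = obs_var + gp_var
    resid = y_obs - gp_mean
    return -0.5 * np.sum(resid**2 / var + np.log(2 * np.pi * var))
```

An MCMC run over this likelihood needs no further original-model simulations, at the price of slightly wider posteriors wherever the surrogate is uncertain.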
Sparse-grid, reduced-basis Bayesian inversion: Nonaffine-parametric nonlinear equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Peng, E-mail: peng@ices.utexas.edu; Schwab, Christoph, E-mail: christoph.schwab@sam.math.ethz.ch
2016-07-01
We extend the reduced basis (RB) accelerated Bayesian inversion methods for affine-parametric, linear operator equations which are considered in [16,17] to non-affine, nonlinear parametric operator equations. We generalize the analysis of sparsity of parametric forward solution maps in [20] and of Bayesian inversion in [48,49] to the fully discrete setting, including Petrov–Galerkin high-fidelity (“HiFi”) discretization of the forward maps. We develop adaptive, stochastic collocation based reduction methods for the efficient computation of reduced bases on the parametric solution manifold. The nonaffinity and nonlinearity with respect to (w.r.t.) the distributed, uncertain parameters and the unknown solution is collocated; specifically, by the so-called Empirical Interpolation Method (EIM). For the corresponding Bayesian inversion problems, computational efficiency is enhanced in two ways: first, expectations w.r.t. the posterior are computed by adaptive quadratures with dimension-independent convergence rates proposed in [49]; the present work generalizes [49] to account for the impact of the PG discretization in the forward maps on the convergence rates of the Quantities of Interest (QoI for short). Second, we propose to perform the Bayesian estimation only w.r.t. a parsimonious, RB approximation of the posterior density. Based on the approximation results in [49], the infinite-dimensional parametric, deterministic forward map and operator admit N-term RB and EIM approximations which converge at rates which depend only on the sparsity of the parametric forward map. In several numerical experiments, the proposed algorithms exhibit dimension-independent convergence rates which equal, at least, the currently known rate estimates for N-term approximation. We propose to accelerate Bayesian estimation by first offline construction of reduced basis surrogates of the Bayesian posterior density. The parsimonious surrogates can then be employed for online data assimilation and for Bayesian estimation. They also open a perspective for optimal experimental design.
Li, Yanhe; Guo, Xianwu; Chen, Liping; Bai, Xiaohui; Wei, Xinlan; Zhou, Xiaoyun; Huang, Songqian; Wang, Weimin
2015-01-01
Identifying the dispersal pathways of an invasive species is useful for adopting the appropriate strategies to prevent and control its spread. However, these processes are exceedingly complex. So, it is necessary to apply new technology and collect representative samples for analysis. This study used Approximate Bayesian Computation (ABC) in combination with traditional genetic tools to examine extensive sample data and historical records to infer the invasion history of the red swamp crayfish, Procambarus clarkii, in China. The sequences of the mitochondrial control region and the proPOx intron in the nuclear genome of samples from 37 sites (35 in China and one each in Japan and the USA) were analyzed. The results of combined scenarios testing and historical records revealed a much more complex invasion history in China than previously believed. P. clarkii was most likely originally introduced into China from Japan from an unsampled source, and the species then expanded its range primarily into the middle and lower reaches and, to a lesser extent, into the upper reaches of the Changjiang River in China. No transfer was observed from the upper reaches to the middle and lower reaches of the Changjiang River. Human-mediated jump dispersal was an important dispersal pathway for P. clarkii. The results provide a better understanding of the evolutionary scenarios involved in the rapid invasion of P. clarkii in China. PMID:26132567
Wang, Baosheng; Khalili Mahani, Marjan; Ng, Wei Lun; Kusumi, Junko; Phi, Hai Hong; Inomata, Nobuyuki; Wang, Xiao-Ru; Szmidt, Alfred E
2014-01-01
Pinus krempfii Lecomte is a morphologically and ecologically unique pine, endemic to Vietnam. It is regarded as a vulnerable species, with a distribution limited to just two provinces: Khanh Hoa and Lam Dong. Although a few phylogenetic studies have included this species, almost nothing is known about its genetic features. In particular, there are no studies addressing the levels and patterns of genetic variation in natural populations of P. krempfii. In this study, we sampled 57 individuals from six natural populations of P. krempfii and analyzed their sequence variation in ten nuclear gene regions (approximately 9 kb) and 14 mitochondrial (mt) DNA regions (approximately 10 kb). We also analyzed variation at seven chloroplast (cp) microsatellite (SSR) loci. We found very low haplotype and nucleotide diversity at nuclear loci compared with other pine species. Furthermore, all investigated populations were monomorphic across all mtDNA regions included in our study, which are polymorphic in other pine species. Population differentiation at nuclear loci was low (5.2%) but significant. However, structure analysis of nuclear loci did not detect genetically differentiated groups of populations. Approximate Bayesian computation (ABC) using nuclear sequence data and mismatch distribution analysis for cpSSR loci suggested recent expansion of the species. The implications of these findings for the management and conservation of P. krempfii genetic resources are discussed. PMID:25360263
Application of Bayesian Approach in Cancer Clinical Trial
Bhattacharjee, Atanu
2014-01-01
The Bayesian approach to clinical trials offers advantages over classical methods, from the design phase through to analysis. It yields direct probability statements about drug treatment effects, and complex computational problems become simpler to handle with Bayesian techniques. The approach requires prior information about the data, and inference is established through posterior estimates. Some limitations, however, are present in this method. The objective of this work was to explore the merits and demerits of the Bayesian approach in cancer research. The review will help clinical researchers in oncology appreciate both the limitations and the power of Bayesian techniques. PMID:29147387
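A one-line illustration of the prior-to-posterior mechanics the abstract alludes to: with a Beta prior on a tumour response rate and binomial trial data, the posterior is available in closed form. A hedged sketch (all numbers invented for illustration):

```python
from scipy import stats

a0, b0 = 2, 2                 # Beta(2,2) prior: response rate centred at 0.5
responders, n = 14, 30        # hypothetical interim trial data
posterior = stats.beta(a0 + responders, b0 + n - responders)

print("posterior mean response rate:", posterior.mean())
print("P(rate > 0.4 | data):", 1 - posterior.cdf(0.4))
```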
Bayesian Nonlinear Assimilation of Eulerian and Lagrangian Coastal Flow Data
2015-09-30
Dr. Pierre F.J. Lermusiaux, Department of Mechanical Engineering, Center for Ocean Science and Engineering, Massachusetts Institute of Technology. Objectives: develop and apply theory, schemes, and computational systems for rigorous Bayesian nonlinear assimilation of Eulerian and Lagrangian coastal flow data; estimate coastal ocean fields in both Eulerian and Lagrangian forms; further develop and implement GMM-DO schemes for robust Bayesian nonlinear estimation.
NASA Astrophysics Data System (ADS)
Farrell, Kathryn; Oden, J. Tinsley; Faghihi, Danial
2015-08-01
A general adaptive modeling algorithm for selection and validation of coarse-grained models of atomistic systems is presented. A Bayesian framework is developed to address uncertainties in parameters, data, and model selection. Algorithms for computing output sensitivities to parameter variances, model evidence and posterior model plausibilities for given data, and for computing what are referred to as Occam Categories in reference to a rough measure of model simplicity, make up components of the overall approach. Computational results are provided for representative applications.
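The model evidence and posterior plausibilities mentioned here have a compact computational core. A minimal sketch follows (one-dimensional parameter, quadrature on a grid; the paper's setting is higher-dimensional, and its "Occam Categories" are not reproduced here). It also shows the Occam effect qualitatively: a sharper prior can win the evidence comparison even when both models fit the data.

```python
import numpy as np

def log_evidence(log_prior, log_like, grid):
    """log p(y | M) = log of the integral of p(y|theta,M) p(theta|M)
    over theta, via trapezoidal quadrature on a 1-D grid."""
    lp = log_prior(grid) + log_like(grid)
    m = lp.max()
    return m + np.log(np.trapz(np.exp(lp - m), grid))

y = np.array([1.1, 0.9, 1.3])
grid = np.linspace(-5.0, 5.0, 2001)
loglike = lambda mus: np.array([-0.5 * np.sum((y - m) ** 2) for m in mus])

logZ = np.array([
    # model 1: standard normal prior on the mean
    log_evidence(lambda t: -0.5 * t ** 2 - 0.5 * np.log(2 * np.pi), loglike, grid),
    # model 2: flat prior with density 0.1 on (-5, 5)
    log_evidence(lambda t: np.full_like(t, np.log(0.1)), loglike, grid),
])
plaus = np.exp(logZ - np.logaddexp(logZ[0], logZ[1]))  # equal prior model odds
print("posterior model plausibilities:", plaus)
```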
Tieleman, D Peter
2006-10-01
A key function of biological membranes is to provide mechanisms for the controlled transport of ions, nutrients, metabolites, peptides and proteins between a cell and its environment. We are using computer simulations to study several processes involved in transport. In model membranes, the distribution of small molecules can be accurately calculated; we are making progress towards understanding the factors that determine the partitioning behaviour in the inhomogeneous lipid environment, with implications for drug distribution, membrane protein folding and the energetics of voltage gating. Lipid bilayers can be simulated at a scale that is sufficiently large to study significant defects, such as those caused by electroporation. Computer simulations of complex membrane proteins, such as potassium channels and ATP-binding cassette (ABC) transporters, can give detailed information about the atomistic dynamics that form the basis of ion transport, selectivity, conformational change and the molecular mechanism of ATP-driven transport. This is illustrated in the present review with recent simulation studies of the voltage-gated potassium channel KvAP and the ABC transporter BtuCD.
Skull removal in MR images using a modified artificial bee colony optimization algorithm.
Taherdangkoo, Mohammad
2014-01-01
Removal of the skull from brain Magnetic Resonance (MR) images is an important preprocessing step required for other image analysis techniques such as brain tissue segmentation. In this paper, we propose a new algorithm based on the Artificial Bee Colony (ABC) optimization algorithm to remove the skull region from brain MR images. We modify the ABC algorithm using a different strategy for initializing the coordinates of scout bees and their direction of search. Moreover, we impose an additional constraint on the ABC algorithm to avoid the creation of discontinuous regions. Our algorithm successfully removed all bony skull tissue from a sample of de-identified MR brain images acquired from scanners of different models. Compared with well-known optimization algorithms such as Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO), the proposed algorithm demonstrates superior results and computational performance, suggesting its potential for clinical applications.
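For readers unfamiliar with the underlying metaheuristic, a stripped-down ABC minimizer follows. It is a sketch of the standard algorithm only (employed-bee and scout phases; the onlooker phase, as well as the paper's modified scout initialization and continuity constraint, are omitted), with all parameter values chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(1)

def abc_minimize(f, bounds, n_bees=20, limit=10, iters=200):
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    x = rng.uniform(lo, hi, (n_bees, dim))        # food sources
    fit = np.apply_along_axis(f, 1, x)
    trials = np.zeros(n_bees, dtype=int)
    for _ in range(iters):
        for i in range(n_bees):                   # employed-bee phase
            k = rng.integers(n_bees - 1)
            k += k >= i                           # random partner != i
            j = rng.integers(dim)
            cand = x[i].copy()
            cand[j] += rng.uniform(-1, 1) * (x[i, j] - x[k, j])
            cand = np.clip(cand, lo, hi)
            fc = f(cand)
            if fc < fit[i]:
                x[i], fit[i], trials[i] = cand, fc, 0
            else:
                trials[i] += 1
        stale = trials > limit                    # scout phase: abandon
        if stale.any():                           # exhausted sources
            x[stale] = rng.uniform(lo, hi, (stale.sum(), dim))
            fit[stale] = np.apply_along_axis(f, 1, x[stale])
            trials[stale] = 0
    best = fit.argmin()
    return x[best], fit[best]

print(abc_minimize(lambda v: np.sum(v ** 2), [(-5, 5)] * 3))
```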
NASA Technical Reports Server (NTRS)
Volakis, J. L.; Kempel, L. C.; Sliva, R.; Wang, H. T. G.; Woo, A. G.
1994-01-01
The goal of this project was to develop analysis codes for computing the scattering and radiation of antennas on cylindrically and doubly conformal platforms. The finite element-boundary integral (FE-BI) method has been shown to accurately model the scattering and radiation of cavity-backed patch antennas. Unfortunately extension of this rigorous technique to coated or doubly curved platforms is cumbersome and inefficient. An alternative approximate approach is to employ an absorbing boundary condition (ABC) for terminating the finite element mesh thus avoiding use of a Green's function. A FE-ABC method is used to calculate the radar cross section (RCS) and radiation pattern of a cavity-backed patch antenna which is recessed within a metallic surface. It is shown that this approach is accurate for RCS and antenna pattern calculations with an ABC surface displaced as little as 0.3 lambda from the cavity aperture. These patch antennas may have a dielectric overlay which may also be modeled with this technique.
Narimani, Zahra; Beigy, Hamid; Ahmad, Ashar; Masoudi-Nejad, Ali; Fröhlich, Holger
2017-01-01
Inferring the structure of molecular networks from time series protein or gene expression data provides valuable information about the complex biological processes of the cell. Causal network structure inference has been approached using different methods in the past. Most causal network inference techniques, such as Dynamic Bayesian Networks and ordinary differential equations, are limited by their computational complexity and thus make large scale inference infeasible. This is specifically true if a Bayesian framework is applied in order to deal with the unavoidable uncertainty about the correct model. We devise a novel Bayesian network reverse engineering approach using ordinary differential equations with the ability to include non-linearity. Besides modeling arbitrary, possibly combinatorial and time dependent perturbations with unknown targets, one of our main contributions is the use of Expectation Propagation, an algorithm for approximate Bayesian inference over large scale network structures in short computation time. We further explore the possibility of integrating prior knowledge into network inference. We evaluate the proposed model on DREAM4 and DREAM8 data and find it competitive against several state-of-the-art existing network inference methods.
Calculation of Crystallographic Texture of BCC Steels During Cold Rolling
NASA Astrophysics Data System (ADS)
Das, Arpan
2017-05-01
BCC alloys commonly develop strong fibre textures, which are often represented as isointensity diagrams in φ1 sections or as fibre diagrams. The alpha fibre in bcc steels is generally characterised by a <110> crystallographic axis parallel to the rolling direction. The objective of the present research is to correlate carbon content, carbide dispersion, rolling reduction and the Euler angle ϕ (at φ1 = 0° and φ2 = 45° along the alpha fibre) with the resulting alpha fibre texture orientation intensity. Bayesian neural computation has been employed to correlate these variables and is compared comprehensively with an existing feed-forward neural network model. Other researchers have already shown that the feed-forward neural network model matches the measured texture data excellently within the bounding box of the texture training data set, whereas its predictions outside the bounds of the training data deviate from the expected values. The Bayesian computation applied here confirms that the predictions are reasonable in the context of basic metallurgical principles and that they match better outside the bounds of the training texture data set than the reported feed-forward neural network. Bayesian computation puts error bars on predicted values and allows the significance of each individual parameter to be estimated. It also makes it possible to estimate the isolated influence of a particular variable, such as carbon concentration, which in practice cannot be varied independently. This demonstrates the ability of the Bayesian neural network to examine new phenomena in situations where data cannot be obtained through experiments.
Application of Bayesian networks to real-time flood risk estimation
NASA Astrophysics Data System (ADS)
Garrote, L.; Molina, M.; Blasco, G.
2003-04-01
This paper presents the application of a computational paradigm taken from the field of artificial intelligence - the Bayesian network - to model the behaviour of hydrologic basins during floods. The final goal of this research is to develop representation techniques for hydrologic simulation models in order to define, develop and validate a mechanism, supported by a software environment, oriented to building decision models for the prediction and management of river floods in real time. The emphasis is placed on providing decision makers with tools to incorporate their knowledge of basin behaviour, usually formulated in terms of rainfall-runoff models, into the process of real-time decision making during floods. A rainfall-runoff model is only a step in the process of decision making. If a reliable rainfall forecast is available and the rainfall-runoff model is well calibrated, decisions can be based mainly on model results. However, in most practical situations, uncertainties in rainfall forecasts or model performance have to be incorporated into the decision process. The computational paradigm adopted for the simulation of hydrologic processes is the Bayesian network: a directed acyclic graph that represents causal influences between linked variables. Under this representation, uncertain qualitative variables are related through causal relations quantified with conditional probabilities. The solution algorithm allows the computation of the expected probability distribution of unknown variables conditioned on the observations. An approach to representing hydrologic processes by Bayesian networks with temporal and spatial extensions is presented in this paper, together with a methodology for the development of Bayesian models using results produced by deterministic hydrologic simulation models.
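The conditional-probability mechanics described above fit in a few lines. A toy sketch (structure and numbers invented, far simpler than a real flood network): heavy rain R influences high runoff F, which influences flood damage D, and the posterior of D given an observation of R is obtained by enumerating the hidden variable.

```python
P_R = {True: 0.3, False: 0.7}                      # P(R)
P_F = {True: {True: 0.8, False: 0.2},              # P(F | R): outer key is R
       False: {True: 0.1, False: 0.9}}
P_D = {True: {True: 0.9, False: 0.1},              # P(D | F): outer key is F
       False: {True: 0.05, False: 0.95}}

def joint(r, f, d):
    return P_R[r] * P_F[r][f] * P_D[f][d]

# P(D = True | R = True), summing out the hidden runoff variable F
num = sum(joint(True, f, True) for f in (True, False))
den = sum(joint(True, f, d) for f in (True, False) for d in (True, False))
print("P(flood damage | heavy rain) =", num / den)   # 0.73
```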
NASA Astrophysics Data System (ADS)
Ketabchi, Hamed; Ataie-Ashtiani, Behzad
2015-01-01
This paper surveys the literature on the application of evolutionary algorithms (EAs) to coastal groundwater management problems (CGMPs). The review demonstrates that previous studies mostly relied on the application of a few particular EAs, mainly the genetic algorithm (GA) and its variants, to a number of specific problems. These problems alone often do not represent the variety of processes that may occur in coastal aquifers. In this study, eight EAs are evaluated for CGMPs: GA, continuous ant colony optimization (CACO), particle swarm optimization (PSO), differential evolution (DE), artificial bee colony optimization (ABC), harmony search (HS), shuffled complex evolution (SCE), and simplex simulated annealing (SIMPSA). The first application of PSO, ABC, HS, and SCE to CGMPs is reported here. Moreover, four benchmark problems of varying difficulty and character are considered to address the important issues of groundwater resources in coastal regions; they cover a wide range of popular objective functions and constraints, with the number of decision variables ranging from 4 to 15. These benchmark problems are applied in a combined simulation-optimization model to examine the optimization scenarios. Some preliminary experiments are performed to select the most efficient parameter values for each EA, to ensure a fair comparison. The capabilities of each EA for CGMPs are then compared in terms of solution quality and required computational time. The evaluation highlights the applicability of EAs to CGMPs, along with their notable strengths and weaknesses. The comparisons show that SCE, CACO, and PSO yield superior solutions among the EAs in terms of solution quality, whereas ABC performs worst. CACO provides solutions up to 17% better than the worst EA (ABC) on the problem with the most decision variables and the greatest complexity. In terms of computational time, PSO and SIMPSA are the fastest, while SCE needs the highest computational time, up to four times that of the fastest EAs. Considering both criteria, CACO and PSO can be recommended for application to CGMPs.
Dorazio, R.M.; Johnson, F.A.
2003-01-01
Bayesian inference and decision theory may be used in the solution of relatively complex problems of natural resource management, owing to recent advances in statistical theory and computing. In particular, Markov chain Monte Carlo algorithms provide a computational framework for fitting models of adequate complexity and for evaluating the expected consequences of alternative management actions. We illustrate these features using an example based on management of waterfowl habitat.
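The MCMC machinery the authors refer to can be illustrated with the simplest random-walk Metropolis sampler. A hedged toy example (a survival rate with a Beta prior; the data are invented, and real waterfowl-habitat models are hierarchical and far richer):

```python
import numpy as np

rng = np.random.default_rng(42)

def metropolis(log_post, x0, step=0.3, n=50_000):
    """Random-walk Metropolis: accept a proposal with probability
    min(1, posterior ratio)."""
    x, lp = x0, log_post(x0)
    draws = np.empty(n)
    for t in range(n):
        prop = x + step * rng.standard_normal()
        lpp = log_post(prop)
        if np.log(rng.uniform()) < lpp - lp:
            x, lp = prop, lpp
        draws[t] = x
    return draws

def log_post(p):                 # Beta(2,2) prior, 7 survivors of 10 birds
    if not 0.0 < p < 1.0:        # posterior is Beta(9,5), up to a constant
        return -np.inf
    return 8 * np.log(p) + 4 * np.log(1 - p)

samples = metropolis(log_post, 0.5)
print("posterior mean:", samples[10_000:].mean())   # exact: 9/14 ~ 0.643
```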
St. Onge, K. R.; Palmé, A. E.; Wright, S. I.; Lascoux, M.
2012-01-01
Most species have at least some level of genetic structure. Recent simulation studies have shown that it is important to consider population structure when sampling individuals to infer past population history. The relevance of the results of these computer simulations for empirical studies, however, remains unclear. In the present study, we use DNA sequence datasets collected from two closely related species with very different histories, the selfing species Capsella rubella and its outcrossing relative C. grandiflora, to assess the impact of different sampling strategies on summary statistics and the inference of historical demography. Sampling strategy did not strongly influence the mean values of Tajima’s D in either species, but it had some impact on the variance. The general conclusions about demographic history were comparable across sampling schemes even when resampled data were analyzed with approximate Bayesian computation (ABC). We used simulations to explore the effects of sampling scheme under different demographic models. We conclude that when sequences from modest numbers of loci (<60) are analyzed, the sampling strategy is generally of limited importance. The same is true under intermediate or high levels of gene flow (4Nm > 2–10) in models in which global expansion is combined with either local expansion or hierarchical population structure. Although we observe a less severe effect of sampling than predicted under some earlier simulation models, our results should not be seen as an encouragement to neglect this issue. In general, a good coverage of the natural range, both within and between populations, will be needed to obtain a reliable reconstruction of a species’s demographic history, and in fact, the effect of sampling scheme on polymorphism patterns may itself provide important information about demographic history. PMID:22870403
Hayashi, Tomohiko; Chiba, Shuntaro; Kaneta, Yusuke; Furuta, Tadaomi; Sakurai, Minoru
2014-11-06
ATP binding cassette (ABC) proteins belong to a superfamily of active transporters. Recent experimental and computational studies have shown that binding of ATP to the nucleotide binding domains (NBDs) of ABC proteins drives the dimerization of NBDs, which, in turn, causes large conformational changes within the transmembrane domains (TMDs). To elucidate the active substrate transport mechanism of ABC proteins, it is first necessary to understand how the NBD dimerization is driven by ATP binding. In this study, we selected MalKs (NBDs of a maltose transporter) as a representative NBD and calculated the free-energy change upon dimerization using molecular mechanics calculations combined with a statistical thermodynamic theory of liquids, as well as a method to calculate the translational, rotational, and vibrational entropy change. This combined method is applied to a large number of snapshot structures obtained from molecular dynamics simulations containing explicit water molecules. The results suggest that the NBD dimerization proceeds with a large gain of water entropy when ATP molecules bind to the NBDs. The energetic gain arising from direct NBD-NBD interactions is canceled by the dehydration penalty and the configurational-entropy loss. ATP hydrolysis induces a loss of the shape complementarity between the NBDs, which leads to the dissociation of the dimer, due to a decrease in the water-entropy gain and an increase in the configurational-entropy loss. This interpretation of the NBD dimerization mechanism in concert with ATP, especially focused on the water-mediated entropy force, is potentially applicable to a wide variety of the ABC transporters.
Bayesian statistics in medicine: a 25 year review.
Ashby, Deborah
2006-11-15
This review examines the state of Bayesian thinking as Statistics in Medicine was launched in 1982, reflecting particularly on its applicability and uses in medical research. It then looks at each subsequent five-year epoch, with a focus on papers appearing in Statistics in Medicine, putting these in the context of major developments in Bayesian thinking and computation with reference to important books, landmark meetings and seminal papers. It charts the growth of Bayesian statistics as it is applied to medicine and makes predictions for the future. From sparse beginnings, where Bayesian statistics was barely mentioned, Bayesian statistics has now permeated all the major areas of medical statistics, including clinical trials, epidemiology, meta-analyses and evidence synthesis, spatial modelling, longitudinal modelling, survival modelling, molecular genetics and decision-making in respect of new technologies.
A Bayesian approach for parameter estimation and prediction using a computationally intensive model
Higdon, Dave; McDonnell, Jordan D.; Schunck, Nicolas; ...
2015-02-05
Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model η(θ), where θ denotes the uncertain, best input setting. Hence the statistical model is of the form y = η(θ) + ε, where ε accounts for measurement, and possibly other, error sources. When nonlinearity is present in η(·), the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model η(·). This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. Lastly, we also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory.
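The emulator idea is easy to state in code: fit a Gaussian-process response surface to a handful of expensive model runs, then do all posterior evaluations against the cheap surrogate. A minimal one-parameter sketch (kernel, design points, and the toy "physics model" are our own; the paper's emulator and sampler are more elaborate):

```python
import numpy as np

def rbf(a, b, ell=0.3):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

theta_d = np.linspace(0.0, 1.0, 8)     # design points for the ensemble
eta_d = theta_d ** 2                   # stand-in for expensive model runs

K = rbf(theta_d, theta_d) + 1e-8 * np.eye(len(theta_d))
alpha = np.linalg.solve(K, eta_d)

def emulator(theta):
    """GP posterior mean conditioned on the ensemble runs."""
    return rbf(np.atleast_1d(theta), theta_d) @ alpha

y_obs, sigma = 0.49, 0.05              # one noisy measurement of eta(theta)

def log_post(theta):                   # flat prior on [0, 1]
    if not 0.0 <= theta <= 1.0:
        return -np.inf
    return -0.5 * ((y_obs - emulator(theta)[0]) / sigma) ** 2

grid = np.linspace(0.0, 1.0, 401)      # posterior via the cheap emulator
post = np.exp([log_post(t) for t in grid])
post /= np.trapz(post, grid)
print("posterior mean of theta:", np.trapz(grid * post, grid))  # near 0.7
```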
A Variational Bayes Genomic-Enabled Prediction Model with Genotype × Environment Interaction
Montesinos-López, Osval A.; Montesinos-López, Abelardo; Crossa, José; Montesinos-López, José Cricelio; Luna-Vázquez, Francisco Javier; Salinas-Ruiz, Josafhat; Herrera-Morales, José R.; Buenrostro-Mariscal, Raymundo
2017-01-01
There are Bayesian and non-Bayesian genomic models that take into account G×E interactions. However, the computational cost of implementing Bayesian models is high, and implementation becomes almost impossible when the number of genotypes, environments, and traits is very large, while, in non-Bayesian models, there are often important and unsolved convergence problems. The variational Bayes method is popular in machine learning, and, by approximating the probability distributions through optimization, it tends to be faster than Markov Chain Monte Carlo methods. For this reason, in this paper, we propose a new genomic variational Bayes version of the Bayesian genomic model with G×E using half-t priors on each standard deviation (SD) term to guarantee highly noninformative priors and posterior inferences that are not sensitive to the choice of hyper-parameters. We show the complete theoretical derivation of the full conditional and the variational posterior distributions, and their implementations. We used eight experimental genomic maize and wheat data sets to illustrate the new proposed variational Bayes approximation, and compared its predictions and implementation time with a standard Bayesian genomic model with G×E. Results indicated that prediction accuracies are slightly higher in the standard Bayesian model with G×E than in its variational counterpart, but, in terms of computation time, the variational Bayes genomic model with G×E is, in general, 10 times faster than the conventional Bayesian genomic model with G×E. For this reason, the proposed model may be a useful tool for researchers who need to predict and select genotypes in several environments. PMID:28391241
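The variational Bayes speed-up comes from replacing MCMC with coordinate-ascent updates of an approximate posterior. The genomic G×E model itself is much larger, so the following is only a toy illustration of the mechanics: the textbook mean-field update for a Gaussian with unknown mean and precision (all priors and data invented):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(2.0, 1.0, 200)            # stand-in for centred phenotypes
N, xbar = len(x), x.mean()

# model: x_i ~ N(mu, 1/tau), mu ~ N(mu0, 1/(lam0*tau)), tau ~ Gamma(a0, b0)
mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3

E_tau = 1.0
for _ in range(50):                      # coordinate-ascent VB (CAVI)
    lam_n = (lam0 + N) * E_tau           # q(mu) = N(mu_n, 1/lam_n)
    mu_n = (lam0 * mu0 + N * xbar) / (lam0 + N)
    a_n = a0 + (N + 1) / 2               # q(tau) = Gamma(a_n, b_n)
    E_sq = np.sum((x - mu_n) ** 2) + N / lam_n \
         + lam0 * ((mu_n - mu0) ** 2 + 1 / lam_n)
    b_n = b0 + 0.5 * E_sq
    E_tau = a_n / b_n

print("q(mu) mean:", mu_n, " E[tau]:", E_tau)    # near 2.0 and 1.0
```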
Bayesian reconstruction and use of anatomical a priori information for emission tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowsher, J.E.; Johnson, V.E.; Turkington, T.G.
1996-10-01
A Bayesian method is presented for simultaneously segmenting and reconstructing emission computed tomography (ECT) images and for incorporating high-resolution, anatomical information into those reconstructions. The anatomical information is often available from other imaging modalities such as computed tomography (CT) or magnetic resonance imaging (MRI). The Bayesian procedure models the ECT radiopharmaceutical distribution as consisting of regions, such that radiopharmaceutical activity is similar throughout each region. It estimates the number of regions, the mean activity of each region, and the region classification and mean activity of each voxel. Anatomical information is incorporated by assigning higher prior probabilities to ECT segmentations in which each ECT region stays within a single anatomical region. This approach is effective because anatomical tissue type often strongly influences radiopharmaceutical uptake. The Bayesian procedure is evaluated using physically acquired single-photon emission computed tomography (SPECT) projection data and MRI for the three-dimensional (3-D) Hoffman brain phantom. A clinically realistic count level is used. A cold lesion within the brain phantom is created during the SPECT scan but not during the MRI to demonstrate that the estimation procedure can detect ECT structure that is not present anatomically.
Bayesian data analysis in population ecology: motivations, methods, and benefits
Dorazio, Robert
2016-01-01
During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.
Results of cement augmentation and curettage in aneurysmal bone cyst of spine
Basu, Saumyajit; Patel, Dharmesh R; Dhakal, Gaurav; Sarangi, T
2016-01-01
Aneurysmal bone cyst (ABC) is a vascular tumor of the spine. Management of spinal ABC still remains controversial because of its location, vascular nature and incidence of recurrence. In this manuscript, we hereby describe two cases of ABC spine treated by curettage, vertebral cement augmentation for control of bleeding and internal stabilization with two years followup. To the best of our knowledge, this is the first case report in the literature describing the role of cement augmentation in spinal ABC in controlling vascular bleeding in curettage of ABC of spine. Case 1: A 22 year old male patient presented with chronic back pain. On radiological investigation, there were multiple, osteolytic septite lesions at L3 vertebral body without neural compression or instability. Percutaneous transpedicular biopsy of L3 from involved pedicle was done. This was followed by cement augmentation through the uninvolved pedicle. Next, transpedicular complete curettage was done through involved pedicle. Case 2: A 15-year-old female presented with nonradiating back pain and progressive myelopathy. On radiological investigation, there was an osteolytic lesion at D9. At surgery, decompression, pedicle screw-rod fixation and posterolateral fusion from D7 to D11 was done. At D9 level, through normal pedicle cement augmentation was added to provide anterior column support and to control the expected bleeding following curettage. Transpedicular complete curettage was done through the involved pedicle with controlled bleeding at the surgical field. Cement augmentation was providing controlled bleeding at surgical field during curettage, internal stabilization and control of pain. On 2 years followup, pain was relieved and there was a stable spinal segment with well filled cement without any sign of recurrence in computed tomography scan. In selected cases of spinal ABC with single vertebral, single pedicle involvement; cement augmentation of vertebra through normal pedicle has an important role in surgery aimed for curettage of vertebra. PMID:26955184
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheung, Y; Rahimi, A; Sawant, A
Purpose: Active breathing control (ABC) has been used to reduce treatment margin due to respiratory organ motion by enforcing temporary breath-holds. However, in practice, even if the ABC device indicates constant lung volume during breath-hold, the patient may still exhibit minor chest motion. Consequently, therapists are given a false sense of security that the patient is immobilized. This study aims at quantifying such motion during ABC breath-holds by monitoring the patient chest motion using a surface photogrammetry system, VisionRT. Methods: A female patient with breast cancer was selected to evaluate chest motion during ABC breath-holds. During the entire course of treatment, the patient’s chest surface was monitored by a surface photogrammetry system, VisionRT. Specifically, a user-defined region-of-interest (ROI) on the chest surface was selected for the system to track at a rate of ∼3 Hz. The surface motion was estimated by rigid image registration between the current ROI image captured and a reference image. The translational and rotational displacements computed were saved in a log file. Results: A total of 20 fractions of radiation treatment were monitored by VisionRT. After removing noisy data, we obtained chest motion of 79 breath-hold sessions. Mean chest motion in the AP direction during breath-holds is 1.31 mm with 0.62 mm standard deviation. Of the 79 sessions, the patient exhibited motion ranging from 0–1 mm (30 sessions), 1–2 mm (37 sessions), 2–3 mm (11 sessions) and >3 mm (1 session). Conclusion: Contrary to popular assumptions, the patient is not completely still during ABC breath-hold sessions. In this particular case studied, the patient exhibited chest motion over 2 mm in 14 out of 79 breath-holds. Underestimating treatment margin for radiation therapy with ABC could reduce treatment effectiveness due to geometric miss or overdose of critical organs. The senior author receives research funding from NIH, VisionRT, Varian Medical Systems and Elekta.
Artificial blood circulation: stabilization, physiological control, and optimization.
Lerner, A Y
1990-04-01
The requirements for creating an efficient Artificial Blood Circulation System (ABCS) have been determined. A hierarchical three-level adaptive control system is suggested for the ABCS to solve the following problems: stabilization of the circulation conditions, left and right pump coordination, physiological control for maintaining a proper relation between the cardiac output and the level of gas exchange required for metabolism, and optimization of the system behavior. Adaptation to varying load and body parameters is accomplished using signals that characterize real-time, computer-processed correlations between changes in the hydraulic resistance of blood vessels, or changes in aortic pressure, and the oxygen (or carbon dioxide) concentration.
On Topological Indices of Certain Dendrimer Structures
NASA Astrophysics Data System (ADS)
Aslam, Adnan; Bashir, Yasir; Ahmad, Safyan; Gao, Wei
2017-05-01
A topological index can be considered a transformation of a chemical structure into a real number. In QSAR/QSPR studies, physicochemical properties and topological indices such as the Randić, Zagreb, atom-bond connectivity (ABC), and geometric-arithmetic (GA) indices are used to predict the bioactivity of chemical compounds. Dendrimers are highly branched, star-shaped macromolecules with nanometer-scale dimensions, defined by three components: a central core, an interior dendritic structure (the branches), and an exterior surface with functional surface groups. In this paper we determine the generalised Randić, general Zagreb, and general sum-connectivity indices of poly(propyl) ether imine, porphyrin, and zinc-porphyrin dendrimers. We also compute the ABC and GA indices of these families of dendrimers.
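For a graph with edge set E and vertex degrees d_u, the two indices named last are ABC(G) = sum over uv in E of sqrt((d_u + d_v - 2)/(d_u d_v)) and GA(G) = sum over uv in E of 2 sqrt(d_u d_v)/(d_u + d_v). A small self-contained sketch computing both for an arbitrary edge list (the example tree is an invented stand-in, not an actual dendrimer unit):

```python
import math

def degrees(edges):
    d = {}
    for u, v in edges:
        d[u] = d.get(u, 0) + 1
        d[v] = d.get(v, 0) + 1
    return d

def abc_index(edges):
    d = degrees(edges)
    return sum(math.sqrt((d[u] + d[v] - 2) / (d[u] * d[v])) for u, v in edges)

def ga_index(edges):
    d = degrees(edges)
    return sum(2 * math.sqrt(d[u] * d[v]) / (d[u] + d[v]) for u, v in edges)

edges = [(0, 1), (1, 2), (1, 3), (3, 4), (3, 5)]   # a small tree
print("ABC:", abc_index(edges), " GA:", ga_index(edges))
```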
Bayes in biological anthropology.
Konigsberg, Lyle W; Frankenberg, Susan R
2013-12-01
In this article, we both contend and illustrate that biological anthropologists, particularly in the Americas, often think like Bayesians but act like frequentists when it comes to analyzing a wide variety of data. In other words, while our research goals and perspectives are rooted in probabilistic thinking and rest on prior knowledge, we often proceed to use statistical hypothesis tests and confidence interval methods unrelated (or tenuously related) to the research questions of interest. We advocate for applying Bayesian analyses to a number of different bioanthropological questions, especially since many of the programming and computational challenges to doing so have been overcome in the past two decades. To facilitate such applications, this article explains Bayesian principles and concepts, and provides concrete examples of Bayesian computer simulations and statistics that address questions relevant to biological anthropology, focusing particularly on bioarchaeology and forensic anthropology. It also simultaneously reviews the use of Bayesian methods and inference within the discipline to date. This article is intended to act as primer to Bayesian methods and inference in biological anthropology, explaining the relationships of various methods to likelihoods or probabilities and to classical statistical models. Our contention is not that traditional frequentist statistics should be rejected outright, but that there are many situations where biological anthropology is better served by taking a Bayesian approach. To this end it is hoped that the examples provided in this article will assist researchers in choosing from among the broad array of statistical methods currently available. Copyright © 2013 Wiley Periodicals, Inc.
Yang, Jingjing; Cox, Dennis D; Lee, Jong Soo; Ren, Peng; Choi, Taeryon
2017-12-01
Functional data are defined as realizations of random functions (mostly smooth functions) varying over a continuum, which are usually collected on discretized grids with measurement errors. In order to accurately smooth noisy functional observations and deal with the issue of high-dimensional observation grids, we propose a novel Bayesian method based on the Bayesian hierarchical model with a Gaussian-Wishart process prior and basis function representations. We first derive an induced model for the basis-function coefficients of the functional data, and then use this model to conduct posterior inference through Markov chain Monte Carlo methods. Compared to the standard Bayesian inference that suffers serious computational burden and instability in analyzing high-dimensional functional data, our method greatly improves the computational scalability and stability, while inheriting the advantage of simultaneously smoothing raw observations and estimating the mean-covariance functions in a nonparametric way. In addition, our method can naturally handle functional data observed on random or uncommon grids. Simulation and real studies demonstrate that our method produces similar results to those obtainable by the standard Bayesian inference with low-dimensional common grids, while efficiently smoothing and estimating functional data with random and high-dimensional observation grids when the standard Bayesian inference fails. In conclusion, our method can efficiently smooth and estimate high-dimensional functional data, providing one way to resolve the curse of dimensionality for Bayesian functional data analysis with Gaussian-Wishart processes. © 2017, The International Biometric Society.
Two Approaches to Calibration in Metrology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campanelli, Mark
2014-04-01
Inferring mathematical relationships with quantified uncertainty from measurement data is common to computational science and metrology. Sufficient knowledge of measurement process noise enables Bayesian inference. Otherwise, an alternative approach is required, here termed compartmentalized inference, because collection of uncertain data and model inference occur independently. Bayesian parameterized model inference is compared to a Bayesian-compatible compartmentalized approach for ISO-GUM compliant calibration problems in renewable energy metrology. In either approach, model evidence can help reduce model discrepancy.
Bayesian Inference for Functional Dynamics Exploring in fMRI Data.
Guo, Xuan; Liu, Bing; Chen, Le; Chen, Guantao; Pan, Yi; Zhang, Jing
2016-01-01
This paper aims to review state-of-the-art Bayesian-inference-based methods applied to functional magnetic resonance imaging (fMRI) data. Particularly, we focus on one specific long-standing challenge in the computational modeling of fMRI datasets: how to effectively explore typical functional interactions from fMRI time series and the corresponding boundaries of temporal segments. Bayesian inference is a method of statistical inference which has been shown to be a powerful tool to encode dependence relationships among the variables with uncertainty. Here we provide an introduction to a group of Bayesian-inference-based methods for fMRI data analysis, which were designed to detect magnitude or functional connectivity change points and to infer their functional interaction patterns based on corresponding temporal boundaries. We also provide a comparison of three popular Bayesian models, that is, Bayesian Magnitude Change Point Model (BMCPM), Bayesian Connectivity Change Point Model (BCCPM), and Dynamic Bayesian Variable Partition Model (DBVPM), and give a summary of their applications. We envision that more delicate Bayesian inference models will be emerging and play increasingly important roles in modeling brain functions in the years to come.
Schepens, Stacey; Goldberg, Allon; Wallace, Melissa
2010-01-01
A shortened version of the ABC 16-item scale (ABC-16), the ABC-6, has been proposed as an alternative balance confidence measure. We investigated whether the ABC-6 is a valid and reliable measure of balance confidence and examined its relationship to balance impairment and falls in older adults. Thirty-five community-dwelling older adults completed the ABC-16, including the 6 questions of the ABC-6. They also completed the following clinical balance tests: unipedal stance time (UST), functional reach (FR), Timed Up and Go (TUG), and maximum step length (MSL). Participants reported 12-month falls history. Balance confidence on the ABC-6 was significantly lower than on the ABC-16; however, the scores were highly correlated. Fallers reported lower balance confidence than non-fallers as measured by the ABC-6 scale, but confidence did not differ between the groups with the ABC-16. The ABC-6 significantly correlated with all balance tests assessed and number of falls. The ABC-16 significantly correlated with all balance tests assessed, but not with number of falls. Test-retest reliability for the ABC-16 and ABC-6 was good to excellent. The ABC-6 is a valid and reliable measure of balance confidence in community-dwelling older adults, and shows stronger relationships to falls than does the ABC-16. The ABC-6 may be a more useful balance confidence assessment tool than the ABC-16. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
On the Bayesian Treed Multivariate Gaussian Process with Linear Model of Coregionalization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Konomi, Bledar A.; Karagiannis, Georgios; Lin, Guang
2015-02-01
The Bayesian treed Gaussian process (BTGP) has gained popularity in recent years because it provides a straightforward mechanism for modeling non-stationary data and can alleviate computational demands by fitting models to less data. The extension of BTGP to the multivariate setting requires us to model the cross-covariance and to propose efficient algorithms that can deal with trans-dimensional MCMC moves. In this paper we extend the cross-covariance of the Bayesian treed multivariate Gaussian process (BTMGP) to that of the linear model of coregionalization (LMC) cross-covariances. Different strategies have been developed to improve the MCMC mixing and invert smaller matrices in the Bayesian inference. Moreover, we compare the proposed BTMGP with existing multiple BTGP and BTMGP in test cases and a multiphase flow computer experiment in a full scale regenerator of a carbon capture unit. The use of the BTMGP with LMC cross-covariance helped to predict the computer experiments relatively better than existing competitors. The proposed model has a wide variety of applications, such as computer experiments and environmental data. In the case of computer experiments we also develop an adaptive sampling strategy for the BTMGP with LMC cross-covariance function.
Moving beyond qualitative evaluations of Bayesian models of cognition.
Hemmer, Pernille; Tauber, Sean; Steyvers, Mark
2015-06-01
Bayesian models of cognition provide a powerful way to understand the behavior and goals of individuals from a computational point of view. Much of the focus in the Bayesian cognitive modeling approach has been on qualitative model evaluations, where predictions from the models are compared to data that is often averaged over individuals. In many cognitive tasks, however, there are pervasive individual differences. We introduce an approach to directly infer individual differences related to subjective mental representations within the framework of Bayesian models of cognition. In this approach, Bayesian data analysis methods are used to estimate cognitive parameters and motivate the inference process within a Bayesian cognitive model. We illustrate this integrative Bayesian approach on a model of memory. We apply the model to behavioral data from a memory experiment involving the recall of heights of people. A cross-validation analysis shows that the Bayesian memory model with inferred subjective priors predicts withheld data better than a Bayesian model where the priors are based on environmental statistics. In addition, the model with inferred priors at the individual subject level led to the best overall generalization performance, suggesting that individual differences are important to consider in Bayesian models of cognition.
Image steganalysis using Artificial Bee Colony algorithm
NASA Astrophysics Data System (ADS)
Sajedi, Hedieh
2017-09-01
Steganography is the science of secure communication where the presence of the communication cannot be detected, while steganalysis is the art of discovering the existence of the secret communication. Processing a huge amount of information usually requires extensive execution time and computational resources, so a preprocessing phase is needed to moderate both. In this paper, we propose a new feature-based blind steganalysis method for distinguishing stego images from cover (clean) images in JPEG format. In this regard, we present a feature selection technique based on an improved Artificial Bee Colony (ABC) algorithm, which is inspired by honeybees' social behaviour in their search for food sources. In the proposed method, classifier performance and the dimension of the selected feature vector are evaluated using wrapper-based methods. The experiments are performed using two large data-sets of JPEG images. Experimental results demonstrate the effectiveness of the proposed steganalysis technique compared to other existing techniques.
Bayesian analysis of caustic-crossing microlensing events
NASA Astrophysics Data System (ADS)
Cassan, A.; Horne, K.; Kains, N.; Tsapras, Y.; Browne, P.
2010-06-01
Aims: Caustic-crossing binary-lens microlensing events are important anomalous events because they are capable of detecting an extrasolar planet companion orbiting the lens star. Fast and robust modelling methods are thus of prime interest in helping to decide whether a planet is detected by an event. Cassan introduced a new set of parameters to model binary-lens events, which are closely related to properties of the light curve. In this work, we explain how Bayesian priors can be added to this framework, and investigate on interesting options. Methods: We develop a mathematical formulation that allows us to compute analytically the priors on the new parameters, given some previous knowledge about other physical quantities. We explicitly compute the priors for a number of interesting cases, and show how this can be implemented in a fully Bayesian, Markov chain Monte Carlo algorithm. Results: Using Bayesian priors can accelerate microlens fitting codes by reducing the time spent considering physically implausible models, and helps us to discriminate between alternative models based on the physical plausibility of their parameters.
Gilet, Estelle; Diard, Julien; Bessière, Pierre
2011-01-01
In this paper, we study the collaboration of perception and action representations involved in cursive letter recognition and production. We propose a mathematical formulation for the whole perception–action loop, based on probabilistic modeling and Bayesian inference, which we call the Bayesian Action–Perception (BAP) model. Being a model of both perception and action processes, the purpose of this model is to study the interaction of these processes. More precisely, the model includes a feedback loop from motor production, which implements an internal simulation of movement. Motor knowledge can therefore be involved during perception tasks. In this paper, we formally define the BAP model and show how it solves the following six varied cognitive tasks using Bayesian inference: i) letter recognition (purely sensory), ii) writer recognition, iii) letter production (with different effectors), iv) copying of trajectories, v) copying of letters, and vi) letter recognition (with internal simulation of movements). We present computer simulations of each of these cognitive tasks, and discuss experimental predictions and theoretical developments. PMID:21674043
NASA Astrophysics Data System (ADS)
Reis, D. S.; Stedinger, J. R.; Martins, E. S.
2005-10-01
This paper develops a Bayesian approach to analysis of a generalized least squares (GLS) regression model for regional analyses of hydrologic data. The new approach allows computation of the posterior distributions of the parameters and the model error variance using a quasi-analytic approach. Two regional skew estimation studies illustrate the value of the Bayesian GLS approach for regional statistical analysis of a shape parameter and demonstrate that regional skew models can be relatively precise with effective record lengths in excess of 60 years. With Bayesian GLS the marginal posterior distribution of the model error variance and the corresponding mean and variance of the parameters can be computed directly, thereby providing a simple but important extension of the regional GLS regression procedures popularized by Tasker and Stedinger (1989), which is sensitive to the likely values of the model error variance when it is small relative to the sampling error in the at-site estimator.
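A minimal sketch of the quasi-analytic idea: with a flat prior on the regression coefficients, the coefficients integrate out in closed form, leaving a one-dimensional posterior over the model-error variance that can be evaluated on a grid. The priors here are our simplification, not necessarily the paper's choices.

```python
import numpy as np

def bgls_posterior(y, X, Sigma, s2_grid):
    """Posterior over the model-error variance s2 in y = X b + delta + eps,
    with total covariance s2 * I + Sigma (Sigma = sampling covariance),
    the coefficients b integrated out under a flat prior, and a flat
    prior on s2."""
    logp = np.empty(len(s2_grid))
    for k, s2 in enumerate(s2_grid):
        Lam = s2 * np.eye(len(y)) + Sigma
        Li = np.linalg.inv(Lam)
        XtLiX = X.T @ Li @ X
        beta = np.linalg.solve(XtLiX, X.T @ Li @ y)   # GLS estimate at this s2
        r = y - X @ beta
        logp[k] = -0.5 * (np.linalg.slogdet(Lam)[1]
                          + np.linalg.slogdet(XtLiX)[1]
                          + r @ Li @ r)
    w = np.exp(logp - logp.max())
    return w / np.trapz(w, s2_grid)   # posterior mean: np.trapz(s2_grid * w, s2_grid)
```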
NASA Astrophysics Data System (ADS)
Xu, T.; Valocchi, A. J.; Ye, M.; Liang, F.
2016-12-01
Due to simplification and/or misrepresentation of the real aquifer system, numerical groundwater flow and solute transport models are usually subject to model structural error. During model calibration, the hydrogeological parameters may be overly adjusted to compensate for unknown structural error. This may result in biased predictions when models are used to forecast aquifer response to new forcing. In this study, we extend a fully Bayesian method [Xu and Valocchi, 2015] to calibrate a real-world, regional groundwater flow model. The method uses a data-driven error model to describe model structural error and jointly infers model parameters and structural error. In this study, Bayesian inference is facilitated using high performance computing and fast surrogate models. The surrogate models are constructed using machine learning techniques to emulate the response simulated by the computationally expensive groundwater model. We demonstrate in the real-world case study that explicitly accounting for model structural error yields parameter posterior distributions that are substantially different from those derived by the classical Bayesian calibration that does not account for model structural error. In addition, the Bayesian with error model method gives significantly more accurate prediction along with reasonable credible intervals.
BUMPER: the Bayesian User-friendly Model for Palaeo-Environmental Reconstruction
NASA Astrophysics Data System (ADS)
Holden, Phil; Birks, John; Brooks, Steve; Bush, Mark; Hwang, Grace; Matthews-Bird, Frazer; Valencia, Bryan; van Woesik, Robert
2017-04-01
We describe the Bayesian User-friendly Model for Palaeo-Environmental Reconstruction (BUMPER), a Bayesian transfer function for inferring past climate and other environmental variables from microfossil assemblages. The principal motivation for a Bayesian approach is that the palaeoenvironment is treated probabilistically, and can be updated as additional data become available. Bayesian approaches therefore provide a reconstruction-specific quantification of the uncertainty in the data and in the model parameters. BUMPER is fully self-calibrating, straightforward to apply, and computationally fast, requiring 2 seconds to build a 100-taxon model from a 100-site training-set on a standard personal computer. We apply the model's probabilistic framework to generate thousands of artificial training-sets under ideal assumptions. We then use these to demonstrate both the general applicability of the model and the sensitivity of reconstructions to the characteristics of the training-set, considering assemblage richness, taxon tolerances, and the number of training sites. We demonstrate general applicability to real data, considering three different organism types (chironomids, diatoms, pollen) and different reconstructed variables. In all of these applications an identically configured model is used, the only change being the input files that provide the training-set environment and taxon-count data.
Bayesian networks and statistical analysis application to analyze the diagnostic test accuracy
NASA Astrophysics Data System (ADS)
Orzechowski, P.; Makal, Jaroslaw; Onisko, A.
2005-02-01
A computer-aided BPH diagnosis system based on a Bayesian network is described in the paper, and its first results are compared with those of an established statistical method. Various statistical methods have been used successfully in medicine for years. However, the undoubted advantages of probabilistic methods make them useful in newly created systems, which are frequent in medicine but often lack full and reliable domain knowledge. The article presents the advantages of the computer-aided BPH diagnosis system in clinical practice for urologists.
Wang, Chih-Wei; Liu, Yi-Jui; Lee, Yi-Hsiung; Hueng, Dueng-Yuan; Fan, Hueng-Chuen; Yang, Fu-Chi; Hsueh, Chun-Jen; Kao, Hung-Wen; Juan, Chun-Jung; Hsu, Hsian-He
2014-01-01
Purpose To investigate the performance of hematoma shape, hematoma size, Glasgow coma scale (GCS) score, and intracerebral hematoma (ICH) score in predicting the 30-day mortality for ICH patients. To examine the influence of the estimation error of hematoma size on the prediction of 30-day mortality. Materials and Methods This retrospective study, approved by a local institutional review board with written informed consent waived, recruited 106 patients diagnosed with ICH by non-enhanced computed tomography study. The hematoma shape, hematoma size measured by computer-assisted volumetric analysis (CAVA) and estimated by the ABC/2 formula, ICH score and GCS score were examined. The performance of the aforementioned variables in predicting 30-day mortality was evaluated. Statistical analysis was performed using Kolmogorov-Smirnov tests, paired t test, nonparametric test, linear regression analysis, and binary logistic regression. The receiver operating characteristic curves were plotted and areas under curve (AUC) were calculated for 30-day mortality. A P value less than 0.05 was considered statistically significant. Results The overall 30-day mortality rate was 15.1% of ICH patients. The hematoma shape, hematoma size, ICH score, and GCS score all significantly predicted the 30-day mortality for ICH patients, with an AUC of 0.692 (P = 0.0018), 0.715 (P = 0.0008) (by ABC/2) to 0.738 (P = 0.0002) (by CAVA), 0.877 (P<0.0001) (by ABC/2) to 0.882 (P<0.0001) (by CAVA), and 0.912 (P<0.0001), respectively. Conclusion Our study shows that hematoma shape, hematoma size, ICH score and GCS score all significantly predict the 30-day mortality, in an increasing order of AUC. The effect of overestimation of hematoma size by the ABC/2 formula in predicting the 30-day mortality could be remedied by using the ICH score. PMID:25029592
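For reference, the bedside ABC/2 estimate mentioned above approximates the hematoma as an ellipsoid: volume is roughly A x B x C / 2, with A the largest axial diameter, B the diameter perpendicular to A on the same slice, and C the number of slices showing hemorrhage times the slice thickness. This is the formula's simplest form; some variants weight partially involved slices. A worked toy example:

```python
def abc_over_2(a_cm, b_cm, n_slices, slice_thickness_cm):
    """Simplest-form ABC/2 hematoma volume estimate in mL (1 cm^3 = 1 mL)."""
    c_cm = n_slices * slice_thickness_cm
    return a_cm * b_cm * c_cm / 2.0

# hypothetical hematoma: 4 cm x 3 cm, visible on 6 slices of 0.5 cm
print(abc_over_2(4.0, 3.0, 6, 0.5), "mL")   # 18.0 mL
```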
Cortical Hierarchies Perform Bayesian Causal Inference in Multisensory Perception
Rohe, Tim; Noppeney, Uta
2015-01-01
To form a veridical percept of the environment, the brain needs to integrate sensory signals from a common source but segregate those from independent sources. Thus, perception inherently relies on solving the “causal inference problem.” Behaviorally, humans solve this problem optimally as predicted by Bayesian Causal Inference; yet, the underlying neural mechanisms are unexplored. Combining psychophysics, Bayesian modeling, functional magnetic resonance imaging (fMRI), and multivariate decoding in an audiovisual spatial localization task, we demonstrate that Bayesian Causal Inference is performed by a hierarchy of multisensory processes in the human brain. At the bottom of the hierarchy, in auditory and visual areas, location is represented on the basis that the two signals are generated by independent sources (= segregation). At the next stage, in posterior intraparietal sulcus, location is estimated under the assumption that the two signals are from a common source (= forced fusion). Only at the top of the hierarchy, in anterior intraparietal sulcus, the uncertainty about the causal structure of the world is taken into account and sensory signals are combined as predicted by Bayesian Causal Inference. Characterizing the computational operations of signal interactions reveals the hierarchical nature of multisensory perception in human neocortex. It unravels how the brain accomplishes Bayesian Causal Inference, a statistical computation fundamental for perception and cognition. Our results demonstrate how the brain combines information in the face of uncertainty about the underlying causal structure of the world. PMID:25710328
USDA-ARS?s Scientific Manuscript database
Plastoglobules (PGs) are plastid lipid-protein particles. This study examines the function of PG-localized kinases ABC1K1 and ABC1K3 in Arabidopsis thaliana. Several lines of evidence suggested that ABC1K1 and ABC1K3 form a protein complex. Null mutants for both genes (abc1k1 and abc1k3) and the dou...
A FAST BAYESIAN METHOD FOR UPDATING AND FORECASTING HOURLY OZONE LEVELS
A Bayesian hierarchical space-time model is proposed by combining information from real-time ambient AIRNow air monitoring data, and output from a computer simulation model known as the Community Multi-scale Air Quality (Eta-CMAQ) forecast model. A model validation analysis shows...
2014-10-02
intervals (Neil, Tailor, Marquez, Fenton, & Hear, 2007). This is cumbersome, error prone and usually inaccurate. Even though a universal framework... Science. Neil, M., Tailor, M., Marquez, D., Fenton, N., & Hear. (2007). Inference in Bayesian networks using dynamic discretisation. Statistics
Teaching with Technology: Literature and Software.
ERIC Educational Resources Information Center
Allen, Denise
1994-01-01
Reviews five computer programs and compact disc-read only memory (CD-ROM) products designed to improve students' reading and problem-solving skills: (1) "Reading Realities" (Teacher Support Software); (2) "Kid Rhymes" (Creative Pursuits); (3) "First-Start Biographies" (Troll Associates); (4) "My Silly CD of ABCs" (Discis Classroom Editions); and…
The ABCs of Writing a Technical Glossary.
ERIC Educational Resources Information Center
Gray, Evie; Ingram, William; Bodson, Dennis
1998-01-01
Explains format, style rules, and lexicographic conventions that improve clarity and precision in a technical glossary. Discusses general rules, rules of style, rules of grammar and syntax, and rules for figures. Describes the computer display techniques and file management system used to develop such a glossary. (SR)
A sub-space greedy search method for efficient Bayesian Network inference.
Zhang, Qing; Cao, Yong; Li, Yong; Zhu, Yanming; Sun, Samuel S M; Guo, Dianjing
2011-09-01
Bayesian networks (BNs) have been successfully used to infer the regulatory relationships of genes from microarray datasets. However, one major limitation of the BN approach is its computational cost, because the calculation time grows more than exponentially with the dimension of the dataset. In this paper, we propose a sub-space greedy search method for efficient Bayesian network inference. In particular, this method limits the greedy search space by selecting only gene pairs with higher partial correlation coefficients. Using both synthetic and real data, we demonstrate that the proposed method achieved results comparable with the standard greedy search method yet saved ∼50% of the computational time. We believe that the sub-space search method can be widely used for efficient BN inference in systems biology. Copyright © 2011 Elsevier Ltd. All rights reserved.
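The core of the sub-space idea, prescreening gene pairs by partial correlation before the greedy search, can be illustrated in a few lines. The threshold and the require-all-controls criterion below are assumptions for illustration, not the authors' exact rule.

```python
import numpy as np

def partial_corr(x, y, z):
    """First-order partial correlation of x and y controlling for z."""
    rxy = np.corrcoef(x, y)[0, 1]
    rxz = np.corrcoef(x, z)[0, 1]
    ryz = np.corrcoef(y, z)[0, 1]
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

def candidate_edges(data, threshold=0.3):
    """Restrict the greedy-search space to gene pairs whose partial
    correlation stays above a threshold when controlling for each
    third gene in turn (illustrative criterion)."""
    n_genes = data.shape[1]
    edges = []
    for i in range(n_genes):
        for j in range(i + 1, n_genes):
            others = [k for k in range(n_genes) if k not in (i, j)]
            if all(abs(partial_corr(data[:, i], data[:, j], data[:, k])) > threshold
                   for k in others):
                edges.append((i, j))
    return edges

rng = np.random.default_rng(0)
expr = rng.normal(size=(100, 5))                      # 100 arrays x 5 toy genes
expr[:, 1] = expr[:, 0] + 0.3 * rng.normal(size=100)  # make genes 0 and 1 dependent
print(candidate_edges(expr))                          # expected: [(0, 1)]
```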
Validation of the thermal challenge problem using Bayesian Belief Networks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McFarland, John; Swiler, Laura Painton
The thermal challenge problem has been developed at Sandia National Laboratories as a testbed for demonstrating various types of validation approaches and prediction methods. This report discusses one particular methodology to assess the validity of a computational model given experimental data. This methodology is based on Bayesian Belief Networks (BBNs) and can incorporate uncertainty in experimental measurements, in physical quantities, and model uncertainties. The approach uses the prior and posterior distributions of model output to compute a validation metric based on Bayesian hypothesis testing (a Bayes' factor). This report discusses various aspects of the BBN, specifically in the context of the thermal challenge problem. A BBN is developed for a given set of experimental data in a particular experimental configuration. The development of the BBN and the method for "solving" the BBN to develop the posterior distribution of model output through Markov chain Monte Carlo sampling is discussed in detail. The use of the BBN to compute a Bayes' factor is demonstrated.
The Bayesian reader: explaining word recognition as an optimal Bayesian decision process.
Norris, Dennis
2006-04-01
This article presents a theory of visual word recognition that assumes that, in the tasks of word identification, lexical decision, and semantic categorization, human readers behave as optimal Bayesian decision makers. This leads to the development of a computational model of word recognition, the Bayesian reader. The Bayesian reader successfully simulates some of the most significant data on human reading. The model accounts for the nature of the function relating word frequency to reaction time and identification threshold, the effects of neighborhood density and its interaction with frequency, and the variation in the pattern of neighborhood density effects seen in different experimental tasks. Both the general behavior of the model and the way the model predicts different patterns of results in different tasks follow entirely from the assumption that human readers approximate optimal Bayesian decision makers. ((c) 2006 APA, all rights reserved).
MUMPS Based Integration of Disparate Computer-Assisted Medical Diagnosis Modules
1989-12-12
The Abdominal and Chest Pain modules use a Bayesian approach, while the Ophthalmology module uses a Rule Based approach. In the current effort, MUMPS is used to develop an...
Hierarchical Bayesian Models of Subtask Learning
ERIC Educational Resources Information Center
Anglim, Jeromy; Wynton, Sarah K. A.
2015-01-01
The current study used Bayesian hierarchical methods to challenge and extend previous work on subtask learning consistency. A general model of individual-level subtask learning was proposed focusing on power and exponential functions with constraints to test for inconsistency. To study subtask learning, we developed a novel computer-based booking…
USDA-ARS's Scientific Manuscript database
Data assimilation and regression are two commonly used methods for predicting agricultural yield from remote sensing observations. Data assimilation is a generative approach because it requires explicit approximations of the Bayesian prior and likelihood to compute the probability density function...
Nonparametric Bayesian Modeling for Automated Database Schema Matching
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferragut, Erik M; Laska, Jason A
2015-01-01
The problem of merging databases arises in many government and commercial applications. Schema matching, a common first step, identifies equivalent fields between databases. We introduce a schema matching framework that builds nonparametric Bayesian models for each field and compares them by computing the probability that a single model could have generated both fields. Our experiments show that our method is more accurate and faster than the existing instance-based matching algorithms in part because of the use of nonparametric Bayesian models.
Shedding Light on the Grey Zone of Speciation along a Continuum of Genomic Divergence.
Roux, Camille; Fraïsse, Christelle; Romiguier, Jonathan; Anciaux, Yoann; Galtier, Nicolas; Bierne, Nicolas
2016-12-01
Speciation results from the progressive accumulation of mutations that decrease the probability of mating between parental populations or reduce the fitness of hybrids-the so-called species barriers. The speciation genomic literature, however, is mainly a collection of case studies, each with its own approach and specificities, such that a global view of the gradual process of evolution from one to two species is currently lacking. Of primary importance is the prevalence of gene flow between diverging entities, which is central in most species concepts and has been widely discussed in recent years. Here, we explore the continuum of speciation thanks to a comparative analysis of genomic data from 61 pairs of populations/species of animals with variable levels of divergence. Gene flow between diverging gene pools is assessed under an approximate Bayesian computation (ABC) framework. We show that the intermediate "grey zone" of speciation, in which taxonomy is often controversial, spans from 0.5% to 2% of net synonymous divergence, irrespective of species life history traits or ecology. Thanks to appropriate modeling of among-locus variation in genetic drift and introgression rate, we clarify the status of the majority of ambiguous cases and uncover a number of cryptic species. Our analysis also reveals the high incidence in animals of semi-isolated species (when some but not all loci are affected by barriers to gene flow) and highlights the intrinsic difficulty, both statistical and conceptual, of delineating species in the grey zone of speciation.
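The ABC framework used in this and several of the following studies reduces, in its simplest form, to rejection sampling against a summary statistic. Below is a minimal sketch with a toy Gaussian simulator standing in for the coalescent/divergence model; the simulator, prior, summary, and tolerance are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, n=200):
    """Stand-in simulator: in the speciation setting this would be a
    coalescent simulation parameterized by, e.g., a migration rate;
    a toy Gaussian model keeps the sketch self-contained."""
    return rng.normal(theta, 1.0, size=n)

def abc_rejection(observed, prior_draw, summary, eps, n_sims=20000):
    """Basic ABC rejection: keep parameter draws whose simulated
    summary statistic lands within eps of the observed summary."""
    s_obs = summary(observed)
    kept = []
    for _ in range(n_sims):
        theta = prior_draw()
        if abs(summary(simulate(theta)) - s_obs) < eps:
            kept.append(theta)
    return np.array(kept)

observed = rng.normal(0.7, 1.0, size=200)
post = abc_rejection(observed,
                     prior_draw=lambda: rng.uniform(-2, 2),
                     summary=np.mean, eps=0.05)
print(post.mean(), post.std())  # approximate posterior mean and sd
```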
Ravinet, Mark; Harrod, Chris; Eizaguirre, Christophe; Prodöhl, Paulo A
2014-06-01
Repeated recolonization of freshwater environments following Pleistocene glaciations has played a major role in the evolution and adaptation of anadromous taxa. Located at the western fringe of Europe, Ireland and Britain were likely recolonized rapidly by anadromous fishes from the North Atlantic following the last glacial maximum (LGM). While the presence of unique mitochondrial haplotypes in Ireland suggests that a cryptic northern refugium may have played a role in recolonization, no explicit test of this hypothesis has been conducted. The three-spined stickleback is native and ubiquitous to aquatic ecosystems throughout Ireland, making it an excellent model species with which to examine the biogeographical history of anadromous fishes in the region. We used mitochondrial and microsatellite markers to examine the presence of divergent evolutionary lineages and to assess broad-scale patterns of geographical clustering among postglacially isolated populations. Our results confirm that Ireland is a region of secondary contact for divergent mitochondrial lineages and that endemic haplotypes occur in populations in Central and Southern Ireland. To test whether a putative Irish lineage arose from a cryptic Irish refugium, we used approximate Bayesian computation (ABC). However, we found no support for this hypothesis. Instead, the Irish lineage likely diverged from the European lineage as a result of postglacial isolation of freshwater populations by rising sea levels. These findings emphasize the need to rigorously test biogeographical hypotheses and contribute further evidence that postglacial processes may have shaped genetic diversity in temperate fauna. PMID:25360281
Demographic History of a Recent Invasion of House Mice on the Isolated Island of Gough
Gray, Melissa M.; Wegmann, Daniel; Haasl, Ryan J.; White, Michael A.; Gabriel, Sofia I.; Searle, Jeremy B.; Cuthbert, Richard J.; Ryan, Peter G.; Payseur, Bret A.
2014-01-01
Island populations provide natural laboratories for studying key contributors to evolutionary change, including natural selection, population size, and the colonization of new environments. The demographic histories of island populations can be reconstructed from patterns of genetic diversity. House mice (Mus musculus) inhabit islands throughout the globe, making them an attractive system for studying island colonization from a genetic perspective. Gough Island, in the central South Atlantic Ocean, is one of the remotest islands in the world. House mice were introduced to Gough Island by sealers during the 19th century, and display unusual phenotypes, including exceptionally large body size and carnivorous feeding behavior. We describe genetic variation in Gough Island mice using mitochondrial sequences, nuclear sequences, and microsatellites. Phylogenetic analysis of mitochondrial sequences suggested that Gough Island mice belong to Mus musculus domesticus, with the maternal lineage possibly originating in England or France. Cluster analyses of microsatellites revealed genetic membership for Gough Island mice in multiple coastal populations in Western Europe, suggesting admixed ancestry. Gough Island mice showed substantial reductions in mitochondrial and nuclear sequence variation and weak reductions in microsatellite diversity compared with Western European populations, consistent with a population bottleneck. Approximate Bayesian Computation (ABC) estimated that mice recently colonized Gough Island (~100 years ago) and experienced a 98% reduction in population size followed by a rapid expansion. Our results indicate that the unusual phenotypes of Gough Island mice evolved rapidly, positioning these mice as useful models for understanding rapid phenotypic evolution. PMID:24617968
Generative inference for cultural evolution.
Kandler, Anne; Powell, Adam
2018-04-05
One of the major challenges in cultural evolution is to understand why and how various forms of social learning are used in human populations, both now and in the past. To date, much of the theoretical work on social learning has been done in isolation of data, and consequently many insights focus on revealing the learning processes or the distributions of cultural variants that are expected to have evolved in human populations. In population genetics, recent methodological advances have allowed a greater understanding of the explicit demographic and/or selection mechanisms that underlie observed allele frequency distributions across the globe, and their change through time. In particular, generative frameworks-often using coalescent-based simulation coupled with approximate Bayesian computation (ABC)-have provided robust inferences on the human past, with no reliance on a priori assumptions of equilibrium. Here, we demonstrate the applicability and utility of generative inference approaches to the field of cultural evolution. The framework advocated here uses observed population-level frequency data directly to establish the likely presence or absence of particular hypothesized learning strategies. In this context, we discuss the problem of equifinality and argue that, in the light of sparse cultural data and the multiplicity of possible social learning processes, the exclusion of those processes inconsistent with the observed data might be the most instructive outcome. Finally, we summarize the findings of generative inference approaches applied to a number of case studies. This article is part of the theme issue 'Bridging cultural gaps: interdisciplinary studies in human cultural evolution'. © 2018 The Author(s).
Scribner, Kim T.; Soiseth, Chad; McGuire, Jeffrey J.; Sage, Kevin; Thorsteinson, Lyman K.; Nielsen, J. L.; Knudsen, E.
2017-01-01
Measures of genetic diversity within and among populations and historical geomorphological data on stream landscapes were used in model simulations based on approximate Bayesian computation (ABC) to examine hypotheses of the relative importance of stream features (geomorphology and age) associated with colonization events and gene flow for coho salmon Oncorhynchus kisutch breeding in recently deglaciated streams (50-240 years b.p.) in Glacier Bay National Park (GBNP), Alaska. Population estimates of genetic diversity, including heterozygosity and allelic richness, declined significantly and monotonically from the oldest and largest to the youngest and smallest GBNP streams. Interpopulation variance in allele frequency increased with increasing distance between streams (r = 0.435, P < 0.01) and was inversely related to stream age (r = -0.281, P < 0.01). The most supported model of colonization involved ongoing or recent (<10 generations before sampling) colonization originating from large populations outside Glacier Bay proper into all other GBNP streams sampled. The results here show that sustained gene flow from large source populations is important to recently established O. kisutch metapopulations. Studies that document how genetic and demographic characteristics of newly founded populations vary with successional changes in stream habitat are of particular importance to, and have significant implications for, the restoration of declining populations or the repatriation of extirpated populations in other regions of the species' native range.
[The ABC transporters of Saccharomyces cerevisiae].
Wawrzycka, Donata
2011-01-01
The ABC (ATP Binding Cassette) transporters compose one of the biggest protein families, with great medical, industrial, and economic impact. They are found in all organisms, from bacteria to man. ABC proteins are responsible for the resistance of microorganisms to antibiotics and fungicides and for the multidrug resistance of cancer cells. Mutations in ABC transporter genes cause serious diseases such as cystic fibrosis, adrenoleukodystrophy, or ataxia. Transport catalyzed by ABC proteins is powered by energy from ATP hydrolysis. The ABC superfamily contains transporters, channels, and receptors. Analysis of the Saccharomyces cerevisiae genome allowed 30 potential ABC proteins to be distinguished, which are classified into six subfamilies. The structural and functional similarity of the yeast and human ABC proteins allows S. cerevisiae to be used as a model organism for the characterization of ABC transporters. In this work, the present state of knowledge on the ABC proteins of the yeast S. cerevisiae is summarized.
Döll, Katharina; Karlovsky, Petr; Deising, Holger B.; Wirsel, Stefan G. R.
2013-01-01
Fusarium graminearum is a plant pathogen infecting several important cereals, resulting in substantial yield losses and mycotoxin contamination of the grain. Triazole fungicides are used to control diseases caused by this fungus on a worldwide scale. Our previous microarray study indicated that 15 ABC transporter genes were transcriptionally upregulated in response to tebuconazole treatment. Here, we deleted four ABC transporter genes in two genetic backgrounds of F. graminearum representing the DON (deoxynivalenol) and the NIV (nivalenol) trichothecene chemotypes. Deletion of FgABC3 and FgABC4, belonging to group I of the ABC-G and group V of the ABC-C subfamilies of ABC transporters, respectively, considerably increased the sensitivity to the class I sterol biosynthesis inhibitors triazoles and fenarimol. Such effects were specific since they did not occur with any other fungicide class tested. Assessing the contribution of the four ABC transporters to virulence of F. graminearum revealed that, irrespective of their chemotypes, deletion mutants of FgABC1 (ABC-C subfamily group V) and FgABC3 were impeded in virulence on wheat, barley and maize. Phylogenetic context and analyses of mycotoxin production suggest that FgABC3 may encode a transporter protecting the fungus from host-derived antifungal molecules. In contrast, FgABC1 may encode a transporter responsible for the secretion of fungal secondary metabolites alleviating defence of the host. Our results show that ABC transporters play important and diverse roles in both fungicide resistance and pathogenesis of F. graminearum. PMID:24244413
26 CFR 1.6655-6 - Methods of accounting.
Code of Federal Regulations, 2010 CFR
2010-04-01
... of accounting method. Corporation ABC, a calendar year taxpayer, uses an accrual method of accounting... Methods of accounting. (a) In general. In computing any required installment, a corporation must use the...
NASA Technical Reports Server (NTRS)
Warner, James E.; Zubair, Mohammad; Ranjan, Desh
2017-01-01
This work investigates novel approaches to probabilistic damage diagnosis that utilize surrogate modeling and high performance computing (HPC) to achieve substantial computational speedup. Motivated by Digital Twin, a structural health management (SHM) paradigm that integrates vehicle-specific characteristics with continual in-situ damage diagnosis and prognosis, the methods studied herein yield near real-time damage assessments that could enable monitoring of a vehicle's health while it is operating (i.e. online SHM). High-fidelity modeling and uncertainty quantification (UQ), both critical to Digital Twin, are incorporated using finite element method simulations and Bayesian inference, respectively. The crux of the proposed Bayesian diagnosis methods, however, is the reformulation of the numerical sampling algorithms (e.g. Markov chain Monte Carlo) used to generate the resulting probabilistic damage estimates. To this end, three distinct methods are demonstrated for rapid sampling that utilize surrogate modeling and exploit various degrees of parallelism for leveraging HPC. The accuracy and computational efficiency of the methods are compared on the problem of strain-based crack identification in thin plates. While each approach has inherent problem-specific strengths and weaknesses, all approaches are shown to provide accurate probabilistic damage diagnoses and several orders of magnitude computational speedup relative to a baseline Bayesian diagnosis implementation.
Regulation of Expression of abcA and Its Response to Environmental Conditions
Villet, Regis A.; Truong-Bolduc, Que Chi; Wang, Yin; Estabrooks, Zoe; Medeiros, Heidi
2014-01-01
The ATP-dependent transporter gene abcA in Staphylococcus aureus confers resistance to hydrophobic β-lactams. In strain ISP794, abcA is regulated by the transcriptional regulators MgrA and NorG and shares a 420-nucleotide intercistronic region with the divergently transcribed pbp4 gene, which encodes the transpeptidase Pbp4. Exposure of exponentially growing cells to iron-limited media, oxidative stress, and acidic pH (5.5) for 0.5 to 2 h had no effect on abcA expression. In contrast, nutrient limitation produced a significant increase in abcA transcripts. We identified three additional regulators (SarA, SarZ, and Rot) that bind to the overlapping promoter region of abcA and pbp4 in strain MW2 and investigated their role in the regulation of abcA expression. Expression of abcA is decreased by 10.0-fold in vivo in a subcutaneous abscess model. In vitro, abcA expression depends on rot and sarZ regulators. Moenomycin A exposure of strain MW2 produced an increase in abcA transcripts. Relative to MW2, the MIC of moenomycin was decreased 8-fold for MW2ΔabcA and increased 10-fold for the MW2 abcA overexpresser, suggesting that moenomycin is a substrate of AbcA. PMID:24509312
Radiation dose reduction in computed tomography perfusion using spatial-temporal Bayesian methods
NASA Astrophysics Data System (ADS)
Fang, Ruogu; Raj, Ashish; Chen, Tsuhan; Sanelli, Pina C.
2012-03-01
In current computed tomography (CT) examinations, the associated X-ray radiation dose is of significant concern to patients and operators, especially in CT perfusion (CTP) imaging, which has a higher radiation dose due to its cine scanning technique. A simple and cost-effective means to perform the examinations is to lower the milliampere-seconds (mAs) parameter as low as reasonably achievable in data acquisition. However, lowering the mAs parameter will unavoidably increase data noise and degrade CT perfusion maps greatly if no adequate noise control is applied during image reconstruction. To capture the essential dynamics of CT perfusion, a simple spatial-temporal Bayesian method that uses a piecewise parametric model of the residual function is used, and the model parameters are then estimated from a Bayesian formulation of prior smoothness constraints on perfusion parameters. From the fitted residual function, reliable CTP parameter maps are obtained from low-dose CT data. The merit of this scheme lies in the combination of an analytical piecewise residual function with a Bayesian framework using a simple spatial prior constraint for the CT perfusion application. On a dataset of 22 patients, this dynamic spatial-temporal Bayesian model yielded an increase in signal-to-noise ratio (SNR) of 78% and a decrease in mean-squared error (MSE) of 40% at a low radiation dose of 43 mA.
McClelland, James L.
2013-01-01
This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered. PMID:23970868
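The claimed equivalence between softmax units and Bayes' rule is easy to verify numerically: set the bias of each unit to the log prior of its hypothesis and its weights to log likelihoods, and the softmax output equals the posterior. A small sketch with made-up probabilities:

```python
import numpy as np

# Generative model: hidden "word" w with prior p(w); each binary feature f_j
# is emitted with probability p(f_j = 1 | w). Setting bias_w = log p(w) and
# weights to the log-likelihood terms makes softmax(net) = p(w | features).
prior = np.array([0.6, 0.3, 0.1])        # p(w) for three hypothetical words
lik = np.array([[0.9, 0.2, 0.5],         # rows: words; columns: p(f_j=1 | w)
                [0.1, 0.8, 0.5],
                [0.5, 0.5, 0.9]])
features = np.array([1, 0, 1])           # observed binary input

# Net input: bias plus summed log-likelihoods of the observed feature values
net = np.log(prior) + (features * np.log(lik)
                       + (1 - features) * np.log(1 - lik)).sum(axis=1)
posterior_softmax = np.exp(net) / np.exp(net).sum()

# Direct Bayes' rule for comparison
joint = prior * np.prod(lik**features * (1 - lik)**(1 - features), axis=1)
posterior_bayes = joint / joint.sum()
print(np.allclose(posterior_softmax, posterior_bayes))  # True
```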
Bayesian design of decision rules for failure detection
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
NASA Astrophysics Data System (ADS)
Beck, Joakim; Dia, Ben Mansour; Espath, Luis F. R.; Long, Quan; Tempone, Raúl
2018-06-01
In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized according to the desired error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a more recent single-loop Monte Carlo method that uses the Laplace method as an approximation of the return value of the inner loop. The first example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.
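The classical double-loop Monte Carlo estimator that the paper improves upon nests a marginal-likelihood average inside an outer expectation over prior draws and simulated data. A sketch for a toy linear-Gaussian experiment (the model, noise level, and designs are assumptions for illustration; note how a small inner sample would make the log of the inner mean unreliable, the problem the Laplace-based importance sampling addresses):

```python
import numpy as np

rng = np.random.default_rng(2)

def dlmc_eig(design, n_outer=500, n_inner=500, sigma=0.1):
    """Double-loop Monte Carlo estimate of expected information gain
    for y = design * theta + noise with prior theta ~ N(0, 1).
    Gaussian normalizing constants cancel between the two log terms."""
    eig = 0.0
    for _ in range(n_outer):
        theta = rng.normal()                        # prior draw
        y = design * theta + sigma * rng.normal()   # simulated observation
        log_like = -0.5 * ((y - design * theta) / sigma)**2
        # inner loop: marginal p(y) averaged over fresh prior draws
        thetas_in = rng.normal(size=n_inner)
        like_in = np.exp(-0.5 * ((y - design * thetas_in) / sigma)**2)
        eig += log_like - np.log(like_in.mean())
    return eig / n_outer

for d in (0.1, 1.0, 5.0):
    print(d, dlmc_eig(d))   # EIG grows with the design's sensitivity
```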
Lévy flight artificial bee colony algorithm
NASA Astrophysics Data System (ADS)
Sharma, Harish; Bansal, Jagdish Chand; Arya, K. V.; Yang, Xin-She
2016-08-01
Artificial bee colony (ABC) optimisation algorithm is a relatively simple and recent population-based probabilistic approach for global optimisation. The solution search equation of ABC is significantly influenced by a random quantity which helps exploration at the cost of exploitation of the search space. Because of its large step sizes, there is a high chance that ABC skips the true solution. In order to balance diversity and convergence in ABC, a Lévy flight inspired search strategy is proposed and integrated with ABC. The proposed strategy, named Lévy Flight ABC (LFABC), has both local and global search capability simultaneously, which can be achieved by tuning the Lévy flight parameters and thus automatically tuning the step sizes. In the LFABC, new solutions are generated around the best solution, which helps to enhance the exploitation capability of ABC. Furthermore, to improve the exploration capability, the number of scout bees is increased. The experiments on 20 test problems of different complexities and five real-world engineering optimisation problems show that the proposed strategy outperforms the basic ABC and recent variants of ABC, namely Gbest-guided ABC, best-so-far ABC and modified ABC, in most of the experiments.
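Lévy-distributed step sizes of the kind LFABC relies on are commonly generated with Mantegna's algorithm. The sketch below shows such a step driving a best-solution-centred update; the update equation is an illustrative form, not necessarily the authors' exact one.

```python
import numpy as np
from math import gamma, pi, sin

rng = np.random.default_rng(3)

def levy_step(beta=1.5):
    """Mantegna's algorithm for a heavy-tailed Levy-stable step;
    beta controls the tail heaviness (occasional long jumps)."""
    num = gamma(1 + beta) * sin(pi * beta / 2)
    den = gamma((1 + beta) / 2) * beta * 2**((beta - 1) / 2)
    sigma_u = (num / den)**(1 / beta)
    u = rng.normal(0, sigma_u)
    v = rng.normal(0, 1)
    return u / abs(v)**(1 / beta)

def lfabc_update(x_i, x_best, beta=1.5, scale=0.01):
    """Sketch of an LFABC-style position update: perturb a food source
    toward/around the best solution with Levy-distributed step sizes,
    mixing small local moves with rare long exploratory jumps."""
    return x_i + scale * levy_step(beta) * (x_best - x_i)

x = np.array([2.0, -1.0])
best = np.array([0.5, 0.3])
print(lfabc_update(x, best))
```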
Natanegara, Fanni; Neuenschwander, Beat; Seaman, John W; Kinnersley, Nelson; Heilmann, Cory R; Ohlssen, David; Rochester, George
2014-01-01
Bayesian applications in medical product development have recently gained popularity. Despite many advances in Bayesian methodology and computations, increase in application across the various areas of medical product development has been modest. The DIA Bayesian Scientific Working Group (BSWG), which includes representatives from industry, regulatory agencies, and academia, has adopted the vision to ensure Bayesian methods are well understood, accepted more broadly, and appropriately utilized to improve decision making and enhance patient outcomes. As Bayesian applications in medical product development are wide ranging, several sub-teams were formed to focus on various topics such as patient safety, non-inferiority, prior specification, comparative effectiveness, joint modeling, program-wide decision making, analytical tools, and education. The focus of this paper is on the recent effort of the BSWG Education sub-team to administer a Bayesian survey to statisticians across 17 organizations involved in medical product development. We summarize results of this survey, from which we provide recommendations on how to accelerate progress in Bayesian applications throughout medical product development. The survey results support findings from the literature and provide additional insight on regulatory acceptance of Bayesian methods and information on the need for a Bayesian infrastructure within an organization. The survey findings support the claim that only modest progress in areas of education and implementation has been made recently, despite substantial progress in Bayesian statistical research and software availability. Copyright © 2013 John Wiley & Sons, Ltd.
Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory
ERIC Educational Resources Information Center
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…
A baker's dozen of new particle flows for nonlinear filters, Bayesian decisions and transport
NASA Astrophysics Data System (ADS)
Daum, Fred; Huang, Jim
2015-05-01
We describe a baker's dozen of new particle flows to compute Bayes' rule for nonlinear filters, Bayesian decisions and learning as well as transport. Several of these new flows were inspired by transport theory, but others were inspired by physics or statistics or Markov chain Monte Carlo methods.
Merlé, Y; Mentré, F
1995-02-01
In this paper, three criteria for designing experiments for the Bayesian estimation of the parameters of models that are nonlinear with respect to their parameters, when a prior distribution is available, are presented: the determinant of the Bayesian information matrix, the determinant of the pre-posterior covariance matrix, and the expected information provided by an experiment. A procedure to simplify the computation of these criteria is proposed in the case of continuous prior distributions and is compared with the criterion obtained from a linearization of the model about the mean of the prior distribution for the parameters. This procedure is applied to two models commonly encountered in the area of pharmacokinetics and pharmacodynamics: the one-compartment open model with bolus intravenous single-dose injection and the Emax model. They both involve two parameters. Additive as well as multiplicative Gaussian measurement errors are considered, with normal prior distributions. Various combinations of the variances of the prior distribution and of the measurement error are studied. Our attention is restricted to designs with limited numbers of measurements (one or two measurements). This situation often occurs in practice when Bayesian estimation is performed. The optimal Bayesian designs that result vary with the variances of the parameter distribution and with the measurement error. The two-point optimal designs sometimes differ from the D-optimal designs for the mean of the prior distribution and may consist of replicated measurements. For the studied cases, the determinant of the Bayesian information matrix and its linearized form lead to the same optimal designs. In some cases, the pre-posterior covariance matrix can be far from its lower bound, namely the inverse of the Bayesian information matrix, especially for the Emax model with a multiplicative measurement error. The expected information provided by the experiment and the determinant of the pre-posterior covariance matrix generally lead to the same designs, except for the Emax model with the multiplicative measurement error. Results show that these criteria can be easily computed and that they could be incorporated in modules for designing experiments.
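The first criterion can be approximated for the one-compartment bolus model by averaging the Fisher information over prior draws. The sketch below makes exactly that simplification (it omits the prior-precision term that the full Bayesian information matrix adds) and uses an illustrative dose, prior, and candidate two-point sampling designs.

```python
import numpy as np

rng = np.random.default_rng(7)

def expected_info_det(times, prior_draws, sigma=0.1):
    """Mean determinant of the Fisher information over prior draws for the
    one-compartment bolus model y = (D/V) * exp(-k*t) with parameters
    (V, k) and additive Gaussian error (simplified design criterion)."""
    D = 100.0                       # assumed dose
    dets = []
    for V, k in prior_draws:
        y = (D / V) * np.exp(-k * times)
        # sensitivities dy/dV and dy/dk at each sampling time
        J = np.column_stack([-y / V, -times * y])
        dets.append(np.linalg.det(J.T @ J / sigma**2))
    return np.mean(dets)

# Illustrative normal priors on volume V (L) and elimination rate k (1/h)
prior = list(zip(rng.normal(20, 2, 200), rng.normal(0.3, 0.03, 200)))
for design in ([0.5, 8.0], [1.0, 2.0], [4.0, 4.5]):
    print(design, expected_info_det(np.array(design), prior))
```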
Calibrating Bayesian Network Representations of Social-Behavioral Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitney, Paul D.; Walsh, Stephen J.
2010-04-08
While human behavior has long been studied, recent and ongoing advances in computational modeling present opportunities for recasting research outcomes in human behavior. In this paper we describe how Bayesian networks can represent outcomes of human behavior research. We demonstrate a Bayesian network that represents political radicalization research, and show a corresponding visual representation of aspects of this research outcome. Since Bayesian networks can be quantitatively compared with external observations, the representation can also be used for empirical assessments of the research which the network summarizes. For a political radicalization model based on published research, we show this empirical comparison with data taken from the Minorities at Risk Organizational Behaviors database.
New insights into faster computation of uncertainties
NASA Astrophysics Data System (ADS)
Bhattacharya, Atreyee
2012-11-01
Heavy computation power, lengthy simulations, and an exhaustive number of model runs—often these seem like the only statistical tools that scientists have at their disposal when computing uncertainties associated with predictions, particularly in cases of environmental processes such as groundwater movement. However, calculation of uncertainties need not be as lengthy, a new study shows. Comparing two approaches—the classical Bayesian “credible interval” and a less commonly used regression-based “confidence interval” method—Lu et al. show that for many practical purposes both methods provide similar estimates of uncertainties. The advantage of the regression method is that it demands 10-1000 model runs, whereas the classical Bayesian approach requires 10,000 to millions of model runs.
Arenas, Miguel
2015-04-01
NGS technologies enable the fast and cheap generation of genomic data. Nevertheless, ancestral genome inference is not so straightforward due to complex evolutionary processes acting on this material, such as inversions, translocations, and other genome rearrangements that, in addition to their implicit complexity, can co-occur and confound ancestral inferences. Recently, models of genome evolution that accommodate such complex genomic events are emerging. This letter explores these novel evolutionary models and proposes their incorporation into robust statistical approaches based on computer simulations, such as approximate Bayesian computation, that may produce a more realistic evolutionary analysis of genomic data. Advantages and pitfalls in using these analytical methods are discussed. Potential applications of these ancestral genomic inferences are also pointed out.
Bayesian analyses of time-interval data for environmental radiation monitoring.
Luo, Peng; Sharp, Julia L; DeVol, Timothy A
2013-01-01
Time-interval (time difference between two consecutive pulses) analysis based on the principles of Bayesian inference was investigated for online radiation monitoring. Using experimental and simulated data, Bayesian analysis of time-interval data [Bayesian (ti)] was compared with Bayesian and a conventional frequentist analysis of counts in a fixed count time [Bayesian (cnt) and single interval test (SIT), respectively]. The performances of the three methods were compared in terms of average run length (ARL) and detection probability for several simulated detection scenarios. Experimental data were acquired with a DGF-4C system in list mode. Simulated data were obtained using Monte Carlo techniques to obtain a random sampling of the Poisson distribution. All statistical algorithms were developed using the R Project for statistical computing. Bayesian analysis of time-interval information provided a detection probability similar to that of Bayesian analysis of count information, but allowed a decision to be made with fewer pulses at relatively higher radiation levels. In addition, for cases with a very short source presence (< count time), time-interval information is more sensitive for detecting a change than count information, since the source data are averaged with the background data over the entire count time. The relationships of the source time, change points, and modifications to the Bayesian approach for increasing detection probability are presented.
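The advantage of time-interval data stems from pulse inter-arrival times being exponentially distributed with the total count rate, so each observed interval updates the odds that a source is present on top of background. A sketch of that updating (a simplified formulation for fixed, known rates, not the paper's exact algorithm, which was implemented in R):

```python
import numpy as np

rng = np.random.default_rng(4)

def posterior_source_present(intervals, bkg_rate, src_rate, prior=0.5):
    """Posterior probability that a source is present, from inter-pulse
    intervals t ~ Exponential(rate): log pdf = log(rate) - rate * t.
    Compares background-only vs. background-plus-source rates."""
    log_like_bkg = np.sum(np.log(bkg_rate) - bkg_rate * intervals)
    total = bkg_rate + src_rate
    log_like_src = np.sum(np.log(total) - total * intervals)
    log_odds = np.log(prior / (1 - prior)) + log_like_src - log_like_bkg
    return 1 / (1 + np.exp(-log_odds))

# Simulated pulse train: 5 cps background plus a 10 cps source
intervals = rng.exponential(1 / 15.0, size=50)
print(posterior_source_present(intervals, bkg_rate=5.0, src_rate=10.0))
```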
Liu, Xiang; Li, Shangqi; Peng, Wenzhu; Feng, Shuaisheng; Feng, Jianxin; Mahboob, Shahid; Al-Ghanim, Khalid A; Xu, Peng
2016-01-01
The ATP-binding cassette (ABC) gene family is considered to be one of the largest gene families in all forms of prokaryotic and eukaryotic life. Although the ABC transporter genes have been annotated in some species, detailed information about the ABC superfamily and the evolutionary characterization of ABC genes in common carp (Cyprinus carpio) is still lacking. In this research, we identified 61 ABC transporter genes in the common carp genome. Phylogenetic analysis revealed that they could be classified into seven subfamilies, namely 11 ABCAs, six ABCBs, 19 ABCCs, eight ABCDs, two ABCEs, four ABCFs, and 11 ABCGs. Comparative analysis of the ABC genes in seven vertebrate species including common carp showed that at least 10 common carp genes were retained from the third round of whole genome duplication, while 12 duplicated ABC genes may have come from the fourth round of whole genome duplication. Gene losses were also observed for 14 ABC genes. Expression profiles of the 61 ABC genes in six common carp tissues (brain, heart, spleen, kidney, intestine, and gill) revealed extensive functional divergence among the ABC genes. Different copies of some genes had tissue-specific expression patterns, which may indicate some gene function specialization. This study provides essential genomic resources for future studies in common carp. PMID:27058731
Bayesian Inference in the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
DeLoach, Richard
2008-01-01
This paper provides an elementary tutorial overview of Bayesian inference and its potential for application in aerospace experimentation in general and wind tunnel testing in particular. Bayes Theorem is reviewed and examples are provided to illustrate how it can be applied to objectively revise prior knowledge by incorporating insights subsequently obtained from additional observations, resulting in new (posterior) knowledge that combines information from both sources. A logical merger of Bayesian methods and certain aspects of Response Surface Modeling is explored. Specific applications to wind tunnel testing, computational code validation, and instrumentation calibration are discussed.
Bayesian linkage and segregation analysis: factoring the problem.
Matthysse, S
2000-01-01
Complex segregation analysis and linkage methods are mathematical techniques for the genetic dissection of complex diseases. They are used to delineate complex modes of familial transmission and to localize putative disease susceptibility loci to specific chromosomal locations. The computational problem of Bayesian linkage and segregation analysis is one of integration in high-dimensional spaces. In this paper, three available techniques for Bayesian linkage and segregation analysis are discussed: Markov Chain Monte Carlo (MCMC), importance sampling, and exact calculation. The contribution of each to the overall integration will be explicitly discussed.
NASA Technical Reports Server (NTRS)
Buntine, Wray
1991-01-01
Algorithms for learning classification trees have had successes in artificial intelligence and statistics over many years. How a tree learning algorithm can be derived from Bayesian decision theory is outlined. This introduces Bayesian techniques for splitting, smoothing, and tree averaging. The splitting rule turns out to be similar to Quinlan's information gain splitting rule, while smoothing and averaging replace pruning. Comparative experiments with reimplementations of a minimum encoding approach, Quinlan's C4, and Breiman et al.'s CART show that the full Bayesian algorithm is consistently as good as, or more accurate than, these other approaches, though at a computational price.
A local approach for focussed Bayesian fusion
NASA Astrophysics Data System (ADS)
Sander, Jennifer; Heizmann, Michael; Goussev, Igor; Beyerer, Jürgen
2009-04-01
Local Bayesian fusion approaches aim to reduce the high storage and computational costs of Bayesian fusion, which is separated from fixed modeling assumptions. Using the small world formalism, we argue why this procedure conforms to Bayesian theory. Then, we concentrate on the realization of local Bayesian fusion by focussing the fusion process solely on local regions that are task relevant with a high probability. The resulting local models then correspond to restricted versions of the original one. In a previous publication, we used bounds for the probability of misleading evidence to show the validity of the pre-evaluation of task-specific knowledge and prior information which we perform to build local models. In this paper, we prove the validity of this procedure using information-theoretic arguments. For additional efficiency, local Bayesian fusion can be realized in a distributed manner. Here, several local Bayesian fusion tasks are evaluated and unified after the actual fusion process. For the practical realization of distributed local Bayesian fusion, software agents are predestined. There is a natural analogy between the resulting agent-based architecture and criminal investigations in real life. We show how this analogy can be used to further improve the efficiency of distributed local Bayesian fusion. Using a landscape model, we present an experimental study of distributed local Bayesian fusion in the field of reconnaissance, which highlights its high potential.
ERIC Educational Resources Information Center
Bruno, Sam J., Ed.; Pettit, John D., Jr., Ed.
These conference proceedings contain the following 23 presentations: "Development of a Communication Skill Model Using Interpretive Structural Modeling" (Karen S. Nantz and Linda Gammill); "The Coincidence of Needs: An Inventional Model for Audience Analysis" (Gina Burchard); "A Computer Algorithm for Measuring Readability" (Terry D. Lundgren);…
A Bayesian test for Hardy–Weinberg equilibrium of biallelic X-chromosomal markers
Puig, X; Ginebra, J; Graffelman, J
2017-01-01
The X chromosome is a relatively large chromosome, harboring a lot of genetic information. Much of the statistical analysis of X-chromosomal information is complicated by the fact that males only have one copy. Recently, frequentist statistical tests for Hardy–Weinberg equilibrium have been proposed specifically for dealing with markers on the X chromosome. Bayesian test procedures for Hardy–Weinberg equilibrium for the autosomes have been described, but Bayesian work on the X chromosome in this context is lacking. This paper gives the first Bayesian approach for testing Hardy–Weinberg equilibrium with biallelic markers at the X chromosome. Marginal and joint posterior distributions for the inbreeding coefficient in females and the male to female allele frequency ratio are computed, and used for statistical inference. The paper gives a detailed account of the proposed Bayesian test, and illustrates it with data from the 1000 Genomes project. In that implementation, a novel approach to tackle multiple testing from a Bayesian perspective through posterior predictive checks is used. PMID:28900292
Bayesian estimation inherent in a Mexican-hat-type neural network
NASA Astrophysics Data System (ADS)
Takiyama, Ken
2016-05-01
Brain functions, such as perception, motor control and learning, and decision making, have been explained based on a Bayesian framework, i.e., to decrease the effects of noise inherent in the human nervous system or external environment, our brain integrates sensory and a priori information in a Bayesian optimal manner. However, it remains unclear how Bayesian computations are implemented in the brain. Herein, I address this issue by analyzing a Mexican-hat-type neural network, which was used as a model of the visual cortex, motor cortex, and prefrontal cortex. I analytically demonstrate that the dynamics of an order parameter in the model corresponds exactly to a variational inference of a linear Gaussian state-space model, a Bayesian estimation, when the strength of recurrent synaptic connectivity is appropriately stronger than that of an external stimulus, a plausible condition in the brain. This exact correspondence can reveal the relationship between the parameters in the Bayesian estimation and those in the neural network, providing insight for understanding brain functions.
Uses and misuses of Bayes' rule and Bayesian classifiers in cybersecurity
NASA Astrophysics Data System (ADS)
Bard, Gregory V.
2017-12-01
This paper will discuss the applications of Bayes' Rule and Bayesian Classifiers in Cybersecurity. While the most elementary form of Bayes' rule occurs in undergraduate coursework, there are more complicated forms as well. As an extended example, Bayesian spam filtering is explored, and it is in many ways the most triumphant accomplishment of Bayesian reasoning in computer science, as nearly everyone with an email address has a spam folder. Bayesian Classifiers have also been responsible for significant cybersecurity research results; yet, because they are not part of the standard curriculum, few in the mathematics or information-technology communities have seen the exact definitions, requirements, and proofs that comprise the subject. Moreover, numerous errors have been made by researchers (described in this paper), due to some mathematical misunderstandings dealing with conditional independence, or other badly chosen assumptions. Finally, to provide instructors and researchers with real-world examples, 25 published cybersecurity papers that use Bayesian reasoning are given, with 2-4 sentence summaries of the focus and contributions of each paper.
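The Bayesian spam filter mentioned above is, in its textbook form, a Bernoulli naive Bayes classifier; its defining and easily violated assumption is the conditional independence of words given the class, precisely the kind of assumption the paper warns about misusing. A minimal sketch with a toy vocabulary:

```python
import numpy as np

def train_naive_bayes(docs, labels, vocab, alpha=1.0):
    """Bernoulli naive Bayes with Laplace smoothing: estimates the class
    prior and, per class, the probability that each vocabulary word
    appears in a document. Assumes word occurrences are conditionally
    independent given the class (spam=1, ham=0)."""
    params = {}
    for c in (0, 1):
        idx = [i for i, y in enumerate(labels) if y == c]
        params[c] = {
            "prior": len(idx) / len(labels),
            "p_word": {w: (sum(w in docs[i] for i in idx) + alpha) /
                          (len(idx) + 2 * alpha) for w in vocab},
        }
    return params

def p_spam(doc, params, vocab):
    """Posterior P(spam | doc) via Bayes' rule in log space."""
    logp = {}
    for c in (0, 1):
        logp[c] = np.log(params[c]["prior"])
        for w in vocab:
            p = params[c]["p_word"][w]
            logp[c] += np.log(p if w in doc else 1 - p)
    return 1 / (1 + np.exp(logp[0] - logp[1]))

vocab = {"free", "meeting", "winner"}
docs = [{"free", "winner"}, {"meeting"}, {"free"}, {"meeting", "free"}]
labels = [1, 0, 1, 0]       # 1 = spam
model = train_naive_bayes(docs, labels, vocab)
print(p_spam({"free", "winner"}, model, vocab))   # 0.9 on this toy corpus
```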
Win-Stay, Lose-Sample: a simple sequential algorithm for approximating Bayesian inference.
Bonawitz, Elizabeth; Denison, Stephanie; Gopnik, Alison; Griffiths, Thomas L
2014-11-01
People can behave in a way that is consistent with Bayesian models of cognition, despite the fact that performing exact Bayesian inference is computationally challenging. What algorithms could people be using to make this possible? We show that a simple sequential algorithm "Win-Stay, Lose-Sample", inspired by the Win-Stay, Lose-Shift (WSLS) principle, can be used to approximate Bayesian inference. We investigate the behavior of adults and preschoolers on two causal learning tasks to test whether people might use a similar algorithm. These studies use a "mini-microgenetic method", investigating how people sequentially update their beliefs as they encounter new evidence. Experiment 1 investigates a deterministic causal learning scenario and Experiments 2 and 3 examine how people make inferences in a stochastic scenario. The behavior of adults and preschoolers in these experiments is consistent with our Bayesian version of the WSLS principle. This algorithm provides both a practical method for performing Bayesian inference and a new way to understand people's judgments. Copyright © 2014 Elsevier Inc. All rights reserved.
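A sketch of Win-Stay, Lose-Sample following the paper's verbal description: the learner keeps its current hypothesis with probability tied to how well that hypothesis explains each new datum, and otherwise resamples a hypothesis from the posterior over all data seen so far. The task, likelihood, and probabilities below are illustrative, not the paper's experimental stimuli.

```python
import numpy as np

rng = np.random.default_rng(5)

def win_stay_lose_sample(data, hypotheses, likelihood, prior):
    """Win-Stay, Lose-Sample: stay with the current hypothesis with
    probability equal to the likelihood of the new datum (a simple
    proportional scheme); on a 'loss', resample from the posterior
    given all data observed so far."""
    h = rng.choice(hypotheses, p=prior)
    seen = []
    for d in data:
        seen.append(d)
        if rng.random() > likelihood(d, h):        # "lose" w.p. 1 - p(d|h)
            post = prior * np.array([np.prod([likelihood(x, hh) for x in seen])
                                     for hh in hypotheses])
            h = rng.choice(hypotheses, p=post / post.sum())
    return h

# Toy causal-learning task: which of two blocks activates the detector?
hypotheses = np.array([0, 1])
prior = np.array([0.5, 0.5])
likelihood = lambda d, h: 0.9 if d == h else 0.1   # datum points to the cause
data = [1, 1, 0, 1, 1]
print(win_stay_lose_sample(data, hypotheses, likelihood=likelihood, prior=prior))
```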
Schiffmann, Christoph; Sebastiani, Daniel
2011-05-10
We present an algorithmic extension of a numerical optimization scheme for analytic capping potentials for use in mixed quantum-classical (quantum mechanical/molecular mechanical, QM/MM) ab initio calculations. Our goal is to minimize bond-cleavage-induced perturbations in the electronic structure, measured by means of a suitable penalty functional. The optimization algorithm, a variant of the artificial bee colony (ABC) algorithm that relies on swarm intelligence, couples deterministic (downhill gradient) and stochastic elements to avoid local minimum trapping. The ABC algorithm outperforms the conventional downhill gradient approach if the penalty hypersurface exhibits wiggles that prevent a straight minimization pathway. We characterize the optimized capping potentials by computing NMR chemical shifts. This approach will increase the accuracy of QM/MM calculations of complex biomolecules.
Laminar fMRI and computational theories of brain function.
Stephan, K E; Petzschner, F H; Kasper, L; Bayer, J; Wellstein, K V; Stefanics, G; Pruessmann, K P; Heinzle, J
2017-11-02
Recently developed methods for functional MRI at the resolution of cortical layers (laminar fMRI) offer a novel window into neurophysiological mechanisms of cortical activity. Beyond physiology, laminar fMRI also offers an unprecedented opportunity to test influential theories of brain function. Specifically, hierarchical Bayesian theories of brain function, such as predictive coding, assign specific computational roles to different cortical layers. Combined with computational models, laminar fMRI offers a unique opportunity to test these proposals noninvasively in humans. This review provides a brief overview of predictive coding and related hierarchical Bayesian theories, summarises their predictions with regard to layered cortical computations, examines how these predictions could be tested by laminar fMRI, and considers methodological challenges. We conclude by discussing the potential of laminar fMRI for clinically useful computational assays of layer-specific information processing. Copyright © 2017 Elsevier Inc. All rights reserved.
Computational modelling of cellular level metabolism
NASA Astrophysics Data System (ADS)
Calvetti, D.; Heino, J.; Somersalo, E.
2008-07-01
The steady and stationary state inverse problems consist of estimating the reaction and transport fluxes, blood concentrations, and possibly the rates of change of some of the concentrations based on data which are often scarce, noisy, and sampled over a population. The Bayesian framework provides a natural setting for the solution of this inverse problem, because a priori knowledge about the system itself and the unknown reaction fluxes and transport rates can compensate for the insufficiency of measured data, provided that the computational costs do not become prohibitive. This article identifies the computational challenges which have to be met when analyzing the steady and stationary states of a multicompartment model for cellular metabolism and suggests stable and efficient ways to handle the computations. The outline of a computational tool based on the Bayesian paradigm for the simulation and analysis of complex cellular metabolic systems is also presented.
Understanding the Scalability of Bayesian Network Inference using Clique Tree Growth Curves
NASA Technical Reports Server (NTRS)
Mengshoel, Ole Jakob
2009-01-01
Bayesian networks (BNs) are used to represent and efficiently compute with multi-variate probability distributions in a wide range of disciplines. One of the main approaches to performing computation in BNs is clique tree clustering and propagation. In this approach, BN computation consists of propagation in a clique tree compiled from a Bayesian network. There is a lack of understanding of how clique tree computation time, and BN computation time more generally, depend on variations in BN size and structure. On the one hand, complexity results tell us that many interesting BN queries are NP-hard or worse to answer, and it is not hard to find application BNs where the clique tree approach in practice cannot be used. On the other hand, it is well-known that tree-structured BNs can be used to answer probabilistic queries in polynomial time. In this article, we develop an approach to characterizing clique tree growth as a function of parameters that can be computed in polynomial time from BNs, specifically: (i) the ratio of the number of a BN's non-root nodes to the number of root nodes, or (ii) the expected number of moral edges in their moral graphs. Our approach is based on combining analytical and experimental results. Analytically, we partition the set of cliques in a clique tree into different sets, and introduce a growth curve for each set. For the special case of bipartite BNs, we consequently have two growth curves, a mixed clique growth curve and a root clique growth curve. In experiments, we systematically increase the degree of the root nodes in bipartite Bayesian networks, and find that root clique growth is well-approximated by Gompertz growth curves. We believe that this research improves the understanding of the scaling behavior of clique tree clustering, provides a foundation for benchmarking and developing improved BN inference and machine learning algorithms, and presents an aid for analytical trade-off studies of clique tree clustering using growth curves.
NASA Astrophysics Data System (ADS)
Li, L.; Xu, C.-Y.; Engeland, K.
2012-04-01
With respect to model calibration, parameter estimation, and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method is one of the most widely used methods for uncertainty assessment of hydrological models, as it incorporates different sources of information into a single analysis through Bayes' theorem. However, existing applications do not treat the uncertainty in the extreme flows of hydrological simulations well. This study proposes a Bayesian modularization approach to the uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian models: the AR(1) plus Normal, time-period-independent model (Model 1); the AR(1) plus Normal, time-period-dependent model (Model 2); and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of entire flows and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian inference. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
Aneurysmal Bone Cyst: An Analysis of 38 Cases and Report of Four Unusual Surface Ones
Shooshtarizadeh, Tina; Movahedinia, Sajjadeh; Mostafavi, Hassan; Jamshidi, Khodamorad; Sami, Sam Hajialiloo
2016-01-01
Aneurysmal bone cyst (ABC) is a benign expansile bone tumor, most commonly involving the medulla of long bones. ABC rarely arises within the cortex or in the subperiosteal region, radiographically mimicking other conditions, in particular surface osteosarcoma, which is low-grade in nature and may undergo secondary ABC changes, and telangiectatic osteosarcoma. Both of these are sometimes mistaken microscopically for primary ABC. We review the characteristics of ABC cases in our center and report four unusual surface ABCs arising in the subperiosteal or cortical region of long bones, identified among 38 histologically proven ABCs during a four-year period in our center. The surface ABCs occurred at an older age, with a predilection for the diaphysis of the femur, tibia, and humerus. PMID:27200397
Torres-Quesada, Omar; Millán, Vicenta; Nisa-Martínez, Rafael; Bardou, Florian; Crespi, Martín; Toro, Nicolás; Jiménez-Zurdo, José I.
2013-01-01
The legume symbiont Sinorhizobium meliloti expresses a plethora of small noncoding RNAs (sRNAs) whose function is mostly unknown. Here, we have functionally characterized two tandemly encoded S. meliloti Rm1021 sRNAs that are similar in sequence and structure. Homologous sRNAs (designated AbcR1 and AbcR2) have been shown to regulate several ABC transporters in the related α-proteobacteria Agrobacterium tumefaciens and Brucella abortus. In Rm1021, AbcR1 and AbcR2 exhibit divergent unlinked regulation and are stabilized by the RNA chaperone Hfq. AbcR1 is transcribed in actively dividing bacteria, either in culture, rhizosphere or within the invasion zone of mature alfalfa nodules. Conversely, AbcR2 expression is induced upon entry into stationary phase and under abiotic stress. Only deletion of AbcR1 resulted in a discrete growth delay in rich medium, but both are dispensable for symbiosis. Periplasmic proteome profiling revealed down-regulation of the branched-chain amino acid binding protein LivK by AbcR1, but not by AbcR2. A double-plasmid reporter assay confirmed the predicted specific targeting of the 5′-untranslated region of the livK mRNA by AbcR1 in vivo. Our findings provide evidence of independent regulatory functions of these sRNAs, probably to fine-tune nutrient uptake in free-living and undifferentiated symbiotic rhizobia. PMID:23869210
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, J; Hill, G; Spiegel, J
Purpose: To investigate the clinical and dosimetric benefits of automatic gating of the left breast combined with the breath-hold technique. Methods: Two Active Breathing Control systems, ABC2.0 and ABC3.0, were used during simulation and treatment delivery. The two systems differ in that ABC2.0 is a breath-hold system without beam control capability, while ABC3.0 supports both breath-hold and beam gating. At simulation, each patient was scanned twice: once with free breathing (FB) and once with breath hold through ABC. The treatment plan was generated on the CT with ABC. The same plan was also recalculated on the CT with FB. These two plans were compared to assess plan quality. For treatments with ABC2.0, beams with MU > 55 were manually split into multiple subfields. All subfields were identical and shared the total MU. For treatment with ABC3.0, beam splitting was unnecessary. Instead, treatment was delivered in gating mode combined with the breath-hold technique. Treatment delivery efficiency using the two systems was compared. Results: The prescribed dose was 50.4 Gy at 1.8 Gy/fraction. The maximum heart dose averaged over 10 patients was 46.0±2.5 Gy and 24.5±12.2 Gy for treatments with FB and with ABC, respectively. The corresponding heart V10 was 13.2±3.6% and 1.0±1.6%, respectively. The averaged MUs were 99.8±7.5 for LMT and 99.2±9.4 for LLT. For treatment with ABC2.0, the original beam was normally split into 2 subfields. The averaged total time to deliver all beams was 4.3±0.4 min for treatments with ABC2.0 and 3.3±0.6 min for treatments with ABC3.0 in gating mode. Conclusion: Treatment with ABC greatly reduced heart dose. Compared to treatments with ABC2.0, gating with ABC3.0 reduced the total treatment time by 23%. Use of ABC3.0 improved delivery efficiency and eliminated the possibility of mistreatments, which may occur with ABC2.0 because the beam is not terminated when the breath signal falls outside of the treatment window.
Bayesian Factor Analysis When Only a Sample Covariance Matrix Is Available
ERIC Educational Resources Information Center
Hayashi, Kentaro; Arav, Marina
2006-01-01
In traditional factor analysis, the variance-covariance matrix or the correlation matrix has often been a form of inputting data. In contrast, in Bayesian factor analysis, the entire data set is typically required to compute the posterior estimates, such as Bayes factor loadings and Bayes unique variances. We propose a simple method for computing…
QUEST - A Bayesian adaptive psychometric method
NASA Technical Reports Server (NTRS)
Watson, A. B.; Pelli, D. G.
1983-01-01
An adaptive psychometric procedure that places each trial at the current most probable Bayesian estimate of threshold is described. The procedure takes advantage of the common finding that the human psychometric function is invariant in form when expressed as a function of log intensity. The procedure is simple, fast, and efficient, and may be easily implemented on any computer.
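A QUEST-like procedure can be sketched as a grid posterior over log threshold that places each trial at the current posterior mode, as described above. The Weibull slope, guess and lapse rates, and the simulated observer below are illustrative assumptions.

```python
import numpy as np

# QUEST-style sketch: maintain a posterior over log threshold on a grid and
# test each trial at the posterior mode (all parameter values illustrative).
grid = np.linspace(-2.0, 2.0, 401)               # candidate log thresholds
posterior = np.ones_like(grid) / grid.size       # flat prior

def p_correct(log_intensity, log_threshold, beta=3.5, gamma=0.5, delta=0.01):
    # Weibull psychometric function of log intensity, with guess and lapse rates
    p = 1 - (1 - gamma) * np.exp(-10 ** (beta * (log_intensity - log_threshold)))
    return (1 - delta) * p + delta * gamma

rng = np.random.default_rng(1)
true_log_threshold = 0.3                         # simulated observer
for trial in range(64):
    x = grid[np.argmax(posterior)]               # place trial at posterior mode
    correct = rng.random() < p_correct(x, true_log_threshold)
    like = p_correct(x, grid) if correct else 1 - p_correct(x, grid)
    posterior *= like
    posterior /= posterior.sum()
print("threshold estimate:", grid[np.argmax(posterior)])
```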
Designing a Mobile Training System in Rural Areas with Bayesian Factor Models
ERIC Educational Resources Information Center
Omidi Najafabadi, Maryam; Mirdamadi, Seyed Mehdi; Payandeh Najafabadi, Amir Teimour
2014-01-01
The facts that the wireless technologies (1) are more convenient; and (2) need less skill than desktop computers, play a crucial role to decrease digital gap in rural areas. This study employed the Bayesian Confirmatory Factor Analysis (CFA) to design a mobile training system in rural areas of Iran. It categorized challenges, potential, and…
Understanding the Scalability of Bayesian Network Inference Using Clique Tree Growth Curves
NASA Technical Reports Server (NTRS)
Mengshoel, Ole J.
2010-01-01
One of the main approaches to performing computation in Bayesian networks (BNs) is clique tree clustering and propagation. The clique tree approach consists of propagation in a clique tree compiled from a Bayesian network, and while it was introduced in the 1980s, there is still a lack of understanding of how clique tree computation time depends on variations in BN size and structure. In this article, we improve this understanding by developing an approach to characterizing clique tree growth as a function of parameters that can be computed in polynomial time from BNs, specifically: (i) the ratio of the number of a BN's non-root nodes to the number of root nodes, and (ii) the expected number of moral edges in their moral graphs. Analytically, we partition the set of cliques in a clique tree into different sets, and introduce a growth curve for the total size of each set. For the special case of bipartite BNs, there are two sets and two growth curves, a mixed clique growth curve and a root clique growth curve. In experiments, where random bipartite BNs generated using the BPART algorithm are studied, we systematically increase the out-degree of the root nodes in bipartite Bayesian networks, by increasing the number of leaf nodes. Surprisingly, root clique growth is well-approximated by Gompertz growth curves, an S-shaped family of curves that has previously been used to describe growth processes in biology, medicine, and neuroscience. We believe that this research improves the understanding of the scaling behavior of clique tree clustering for a certain class of Bayesian networks; presents an aid for trade-off studies of clique tree clustering using growth curves; and ultimately provides a foundation for benchmarking and developing improved BN inference and machine learning algorithms.
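For readers who want to reproduce the curve-fitting step, the following sketch fits a Gompertz growth curve y = a*exp(-b*exp(-c*x)) with SciPy; the synthetic "clique size" data and starting values are made up, not the BPART experiments.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a Gompertz growth curve to synthetic clique-size data.
def gompertz(x, a, b, c):
    return a * np.exp(-b * np.exp(-c * x))

x = np.arange(1, 11, dtype=float)              # e.g. root node out-degree
y = gompertz(x, a=100.0, b=5.0, c=0.6)         # synthetic "root clique size"
y_noisy = y + np.random.default_rng(0).normal(0, 2.0, x.size)

params, _ = curve_fit(gompertz, x, y_noisy, p0=(80.0, 3.0, 0.5))
print("fitted (a, b, c):", params)
```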
A combined Fuzzy and Naive Bayesian strategy can be used to assign event codes to injury narratives.
Marucci-Wellman, H; Lehto, M; Corns, H
2011-12-01
Bayesian methods show promise for classifying injury narratives from large administrative datasets into cause groups. This study examined a combined approach where two Bayesian models (Fuzzy and Naïve) were used to either classify a narrative or select it for manual review. Injury narratives were extracted from claims filed with a worker's compensation insurance provider between January 2002 and December 2004. Narratives were separated into a training set (n=11,000) and prediction set (n=3,000). Expert coders assigned two-digit Bureau of Labor Statistics Occupational Injury and Illness Classification event codes to each narrative. Fuzzy and Naïve Bayesian models were developed using manually classified cases in the training set. Two semi-automatic machine coding strategies were evaluated. The first strategy assigned cases for manual review if the Fuzzy and Naïve models disagreed on the classification. The second strategy selected additional cases for manual review from the Agree dataset using prediction strength to reach a level of 50% computer coding and 50% manual coding. When agreement alone was used as the filtering strategy, the majority were coded by the computer (n=1,928, 64%) leaving 36% for manual review. The overall combined (human plus computer) sensitivity was 0.90 and positive predictive value (PPV) was >0.90 for 11 of 18 2-digit event categories. Implementing the 2nd strategy improved results with an overall sensitivity of 0.95 and PPV >0.90 for 17 of 18 categories. A combined Naïve-Fuzzy Bayesian approach can classify some narratives with high accuracy and identify others most beneficial for manual review, reducing the burden on human coders.
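The agreement-filter strategy is easy to state in code: machine-code a narrative when the two models agree, otherwise route it for manual review. The two stand-in classifiers below are keyword toys, not the paper's Fuzzy and Naive Bayesian models.

```python
# Agreement-based routing sketch: two classifiers vote; agreeing cases are
# machine-coded, disagreements go to a human coder (classifiers are stand-ins).
def route(narrative, fuzzy_model, naive_model):
    code_f, code_n = fuzzy_model(narrative), naive_model(narrative)
    if code_f == code_n:
        return ("computer", code_f)     # models agree: accept machine code
    return ("manual_review", None)      # disagreement: send to human coder

fuzzy = lambda text: "fall" if "slipped" in text else "struck_by"
naive = lambda text: "fall" if "fell" in text or "slipped" in text else "struck_by"
for txt in ["worker slipped on wet floor", "worker fell from ladder"]:
    print(txt, "->", route(txt, fuzzy, naive))
```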
Bayesian Computation for Log-Gaussian Cox Processes: A Comparative Analysis of Methods
Teng, Ming; Nathoo, Farouk S.; Johnson, Timothy D.
2017-01-01
The Log-Gaussian Cox Process is a commonly used model for the analysis of spatial point pattern data. Fitting this model is difficult because of its doubly-stochastic property, i.e., it is a hierarchical combination of a Poisson process at the first level and a Gaussian process at the second level. Various methods have been proposed to estimate such a process, including traditional likelihood-based approaches as well as Bayesian methods. We focus here on Bayesian methods and several approaches that have been considered for model fitting within this framework, including Hamiltonian Monte Carlo, the integrated nested Laplace approximation, and variational Bayes. We consider these approaches and make comparisons with respect to statistical and computational efficiency. These comparisons are made through several simulation studies as well as through two applications, the first examining ecological data and the second involving neuroimaging data. PMID:29200537
Catalytic and transport cycles of ABC exporters.
Al-Shawi, Marwan K
2011-09-07
ABC (ATP-binding cassette) transporters are arguably the most important family of ATP-driven transporters in biology. Despite considerable effort and advances in determining the structures and physiology of these transporters, their fundamental molecular mechanisms remain elusive and highly controversial. How does ATP hydrolysis by ABC transporters drive their transport function? Part of the problem in answering this question appears to be a perceived need to formulate a universal mechanism. Although it has been generally hoped and assumed that the whole superfamily of ABC transporters would exhibit similar conserved mechanisms, this is proving not to be the case. Structural considerations alone suggest that there are three overall types of coupling mechanisms related to ABC exporters, small ABC importers and large ABC importers. Biochemical and biophysical characterization leads us to the conclusion that, even within these three classes, the catalytic and transport mechanisms are not fully conserved, but continue to evolve. ABC transporters also exhibit unusual characteristics not observed in other primary transporters, such as uncoupled basal ATPase activity, that severely complicate mechanistic studies by established methods. In this chapter, I review these issues as related to ABC exporters in particular. A consensus view has emerged that ABC exporters follow alternating-access switch transport mechanisms. However, some biochemical data suggest that alternating catalytic site transport mechanisms are more appropriate for fully symmetrical ABC exporters. Heterodimeric and asymmetrical ABC exporters appear to conform to simple alternating-access-type mechanisms.
An ATP-driven efflux pump is a novel pathogenicity factor in rice blast disease.
Urban, M; Bhargava, T; Hamer, J E
1999-01-01
Cells tolerate exposure to cytotoxic compounds through the action of ATP-driven efflux pumps belonging to the ATP-binding cassette (ABC) superfamily of membrane transporters. Phytopathogenic fungi encounter toxic environments during plant invasion as a result of the plant defense response. Here we demonstrate the requirement for an ABC transporter during host infection by the fungal plant pathogen Magnaporthe grisea. The ABC1 gene was identified in an insertional mutagenesis screen for pathogenicity mutants. The ABC1 insertional mutant and a gene-replacement mutant arrest growth and die shortly after penetrating either rice or barley epidermal cells. The ABC1-encoded protein is similar to yeast ABC transporters implicated in multidrug resistance, and ABC1 gene transcripts are inducible by toxic drugs and a rice phytoalexin. However, abc1 mutants are not hypersensitive to antifungal compounds. The non-pathogenic, insertional mutation in ABC1 occurs in the promoter region and dramatically reduces transcript induction by metabolic poisons. These data strongly suggest that M. grisea requires the up-regulation of specific ABC transporters for pathogenesis, most likely to protect itself against plant defense mechanisms. PMID:9927411
A Comparison of the β-Substitution Method and a Bayesian Method for Analyzing Left-Censored Data
Huynh, Tran; Quick, Harrison; Ramachandran, Gurumurthy; Banerjee, Sudipto; Stenzel, Mark; Sandler, Dale P.; Engel, Lawrence S.; Kwok, Richard K.; Blair, Aaron; Stewart, Patricia A.
2016-01-01
Classical statistical methods for analyzing exposure data with values below the detection limits are well described in the occupational hygiene literature, but an evaluation of a Bayesian approach for handling such data is currently lacking. Here, we first describe a Bayesian framework for analyzing censored data. We then present the results of a simulation study conducted to compare the β-substitution method with a Bayesian method for exposure datasets drawn from lognormal distributions and mixed lognormal distributions with varying sample sizes, geometric standard deviations (GSDs), and censoring for single and multiple limits of detection. For each set of factors, estimates for the arithmetic mean (AM), geometric mean, GSD, and the 95th percentile (X0.95) of the exposure distribution were obtained. We evaluated the performance of each method using relative bias, the root mean squared error (rMSE), and coverage (the proportion of the computed 95% uncertainty intervals containing the true value). The Bayesian method using non-informative priors and the β-substitution method were generally comparable in bias and rMSE when estimating the AM and GM. For the GSD and the 95th percentile, the Bayesian method with non-informative priors was more biased and had a higher rMSE than the β-substitution method, but use of more informative priors generally improved the Bayesian method’s performance, making both the bias and the rMSE more comparable to the β-substitution method. An advantage of the Bayesian method is that it provided estimates of uncertainty for these parameters of interest and good coverage, whereas the β-substitution method only provided estimates of uncertainty for the AM, and coverage was not as consistent. Selection of one or the other method depends on the needs of the practitioner, the availability of prior information, and the distribution characteristics of the measurement data. We suggest the use of Bayesian methods if the practitioner has the computational resources and prior information, as the method would generally provide accurate estimates and also provides the distributions of all of the parameters, which could be useful for making decisions in some applications. PMID:26209598
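The Bayesian side of such a comparison can be sketched with a grid posterior for left-censored lognormal data, where each non-detect contributes P(X < LOD) to the likelihood. The sample, detection limit, parameter grids, and flat priors below are illustrative; the beta-substitution formulas themselves are not reproduced here.

```python
import numpy as np
from scipy.stats import norm

# Grid-posterior sketch for left-censored lognormal exposure data: values
# below the detection limit (LOD) contribute P(X < LOD) to the likelihood.
rng = np.random.default_rng(0)
x = rng.lognormal(mean=np.log(1.0), sigma=0.7, size=40)
lod = 0.5
detected, censored = x[x >= lod], x[x < lod]

mu_g = np.linspace(-1.5, 1.5, 121)              # grid over log-scale mean
sg_g = np.linspace(0.2, 1.5, 121)               # grid over log-scale sd
MU, SG = np.meshgrid(mu_g, sg_g, indexing="ij")

logL = norm.logcdf(np.log(lod), MU, SG) * censored.size   # censored part
for v in np.log(detected):                                 # detected part
    logL += norm.logpdf(v, MU, SG)
post = np.exp(logL - logL.max())
post /= post.sum()
gm = np.exp((post * MU).sum())     # GM estimate from the posterior mean of mu
gsd = np.exp((post * SG).sum())    # GSD estimate from the posterior mean of sigma
print("GM ~", round(gm, 3), " GSD ~", round(gsd, 3))
```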
NASA Astrophysics Data System (ADS)
Granade, Christopher; Combes, Joshua; Cory, D. G.
2016-03-01
In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we address all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby (2012 Phys. Rev. A 85 052120) and by Ferrie (2014 New J. Phys. 16 093035), to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first priors on quantum states and channels that allow for including useful experimental insight. Finally, we develop a method that allows tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.
Dynamic Bayesian network modeling for longitudinal brain morphometry
Chen, Rong; Resnick, Susan M; Davatzikos, Christos; Herskovits, Edward H
2011-01-01
Identifying interactions among brain regions from structural magnetic-resonance images presents one of the major challenges in computational neuroanatomy. We propose a Bayesian data-mining approach to the detection of longitudinal morphological changes in the human brain. Our method uses a dynamic Bayesian network to represent evolving inter-regional dependencies. The major advantage of dynamic Bayesian network modeling is that it can represent complicated interactions among temporal processes. We validated our approach by analyzing a simulated atrophy study, and found that this approach requires only a small number of samples to detect the ground-truth temporal model. We further applied dynamic Bayesian network modeling to a longitudinal study of normal aging and mild cognitive impairment — the Baltimore Longitudinal Study of Aging. We found that interactions among regional volume-change rates for the mild cognitive impairment group are different from those for the normal-aging group. PMID:21963916
Markov Chain Monte Carlo Methods for Bayesian Data Analysis in Astronomy
NASA Astrophysics Data System (ADS)
Sharma, Sanjib
2017-08-01
Markov Chain Monte Carlo based Bayesian data analysis has now become the method of choice for analyzing and interpreting data in almost all disciplines of science. In astronomy, over the last decade, we have also seen a steady increase in the number of papers that employ Monte Carlo based Bayesian analysis. New, efficient Monte Carlo based methods are continuously being developed and explored. In this review, we first explain the basics of Bayesian theory and discuss how to set up data analysis problems within this framework. Next, we provide an overview of various Monte Carlo based methods for performing Bayesian data analysis. Finally, we discuss advanced ideas that enable us to tackle complex problems and thus hold great promise for the future. We also distribute downloadable computer software (available at https://github.com/sanjibs/bmcmc/ ) that implements some of the algorithms and examples discussed here.
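The workhorse behind most of the analyses surveyed above is a random-walk Metropolis sampler, sketched below for a one-dimensional target; the standard-normal posterior and proposal scale are illustrative.

```python
import numpy as np

# Minimal random-walk Metropolis sampler for a 1-D posterior.
rng = np.random.default_rng(0)

def log_post(theta):                      # unnormalised: standard normal target
    return -0.5 * theta ** 2

chain, theta = [], 0.0
for _ in range(10000):
    prop = theta + rng.normal(0, 1.0)     # symmetric proposal
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop                      # accept; otherwise keep current state
    chain.append(theta)
print("posterior mean ~", np.mean(chain[2000:]))   # discard burn-in
```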
Bayesian methods in reliability
NASA Astrophysics Data System (ADS)
Sander, P.; Badoux, R.
1991-11-01
The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
Free will in Bayesian and inverse Bayesian inference-driven endo-consciousness.
Gunji, Yukio-Pegio; Minoura, Mai; Kojima, Kei; Horry, Yoichi
2017-12-01
How can we link challenging issues related to consciousness and/or qualia with natural science? The introduction of the endo-perspective, instead of the exo-perspective, as proposed by Matsuno, Rössler, and Gunji, is considered one of the most promising candidate approaches. Here, we distinguish the endo- from the exo-perspective in terms of whether the external is or is not directly operated. In the endo-perspective, the external can be neither perceived nor recognized directly; rather, one can only indirectly summon something outside of the perspective, which can be illustrated by a causation-reversal pair. On one hand, causation logically proceeds from the cause to the effect. On the other hand, a reversal from the effect to the cause is non-logical and is equipped with a metaphorical structure. We argue that the differences in exo- and endo-perspectives result not from the difference between Western and Eastern cultures, but from differences between modernism and animism. Here, a causation-reversal pair is described using a pair of upward (from premise to consequence) and downward (from consequence to premise) causation, and a pair of Bayesian and inverse Bayesian inference (BIB inference). Accordingly, the notion of endo-consciousness is proposed as an agent equipped with BIB inference. We also argue that BIB inference can yield both highly efficient computations through Bayesian inference and robust computations through inverse Bayesian inference. By adapting a logical model of the free will theorem to BIB inference, we show that endo-consciousness can explain free will as a regression of the controllability of voluntary action. Copyright © 2017. Published by Elsevier Ltd.
Bayesian analysis of rare events
NASA Astrophysics Data System (ADS)
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
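The rejection-sampling reinterpretation at the heart of BUS can be sketched in a few lines: accept a prior sample theta with probability L(theta)/c, where c bounds the likelihood. The Gaussian prior, likelihood, and constant c below are illustrative, not the FORM/IS/SuS machinery of the full framework.

```python
import numpy as np

# Rejection-sampling view of Bayesian updating: accept a prior sample when
# u < L(theta) / c, with c an upper bound on the likelihood (all values toy).
rng = np.random.default_rng(0)
obs, sigma = 1.2, 0.5

def likelihood(theta):
    return np.exp(-0.5 * ((obs - theta) / sigma) ** 2)

c = 1.0                                   # upper bound of this likelihood
theta = rng.normal(0.0, 1.0, 100000)      # samples from the N(0, 1) prior
accepted = theta[rng.random(theta.size) < likelihood(theta) / c]
print("posterior mean ~", accepted.mean(),
      " acceptance rate:", accepted.size / theta.size)
```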
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
Untch, Michael; Würstlein, Rachel; Marschner, Norbert; Lüftner, Diana; Augustin, Doris; Briest, Susanne; Ettl, Johannes; Haidinger, Renate; Müller, Lothar; Müller, Volkmar; Ruckhäberle, Eugen; Harbeck, Nadia; Thomssen, Christoph
2018-05-01
The fourth international advanced breast cancer consensus conference (ABC4) on the diagnosis and treatment of advanced breast cancer (ABC) headed by Professor Fatima Cardoso was once again held in Lisbon on November 2 - 4, 2017. To simplify matters, the abbreviation ABC will be used hereinafter in the text. In clinical practice, the abbreviation corresponds to metastatic breast cancer or locally far-advanced disease. This year the focus was on new developments in the treatment of ABC. Topics discussed included the importance of CDK4/6 inhibition in hormone receptor (HR)-positive ABC, the use of dual antibody blockade to treat HER2-positive ABC, PARP inhibition in triple-negative ABC and the potential therapeutic outcomes. Another major area discussed at the conference was BRCA-associated breast cancer, the treatment of cerebral metastasis, and individualized treatment decisions based on molecular testing (so-called precision medicine). As in previous years, close cooperation with representatives from patient organizations from around the world is an important aspect of the ABC conference. This cooperation was reinforced and expanded at the ABC4 conference. A global alliance was founded at the conclusion of the consensus conference, which aims to promote and coordinate the measures considered necessary by patient advocates worldwide. Because the panel of experts was composed of specialists from all over the world, it was inevitable that the ABC consensus also reflected country-specific features. As in previous years, a team of German breast cancer specialists who closely followed the consensus voting of the ABC panelists in Lisbon and intensively discussed the votes has therefore commented on the consensus in the context of the current German guidelines on the diagnosis and treatment of breast cancer 1, 2 used in clinical practice in Germany. The ABC consensus is based on the votes of the ABC panelists in Lisbon.
Liu, Fang; Eugenio, Evercita C
2018-04-01
Beta regression is an increasingly popular statistical technique in medical research for modeling of outcomes that assume values in (0, 1), such as proportions and patient reported outcomes. When outcomes take values in the intervals [0,1), (0,1], or [0,1], zero-or-one-inflated beta (zoib) regression can be used. We provide a thorough review on beta regression and zoib regression in the modeling, inferential, and computational aspects via the likelihood-based and Bayesian approaches. We demonstrate the statistical and practical importance of correctly modeling the inflation at zero/one rather than ad hoc replacing them with values close to zero/one via simulation studies; the latter approach can lead to biased estimates and invalid inferences. We show via simulation studies that the likelihood-based approach is computationally faster in general than MCMC algorithms used in the Bayesian inferences, but runs the risk of non-convergence, large biases, and sensitivity to starting values in the optimization algorithm especially with clustered/correlated data, data with sparse inflation at zero and one, and data that warrant regularization of the likelihood. The disadvantages of the regular likelihood-based approach make the Bayesian approach an attractive alternative in these cases. Software packages and tools for fitting beta and zoib regressions in both the likelihood-based and Bayesian frameworks are also reviewed.
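As a concrete instance of the likelihood-based approach reviewed above, the sketch below fits a plain (non-inflated) beta regression by maximum likelihood, with a logit mean link and a common precision phi; the synthetic data and optimizer settings are assumptions, not the paper's simulation design.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit, gammaln

# Maximum-likelihood beta regression sketch: logit(mu) = X @ b, common
# precision phi, and y ~ Beta(mu * phi, (1 - mu) * phi). Data are synthetic.
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
mu_true = expit(X @ np.array([0.2, 0.8]))
phi_true = 30.0
y = rng.beta(mu_true * phi_true, (1 - mu_true) * phi_true)

def negloglik(params):
    b, log_phi = params[:2], params[2]
    mu, phi = expit(X @ b), np.exp(log_phi)
    a, c = mu * phi, (1 - mu) * phi
    # Beta log-density: gammaln(a + c) - gammaln(a) - gammaln(c) + ...
    return -np.sum(gammaln(phi) - gammaln(a) - gammaln(c)
                   + (a - 1) * np.log(y) + (c - 1) * np.log1p(-y))

fit = minimize(negloglik, x0=np.zeros(3), method="BFGS")
print("coefficients:", fit.x[:2], " phi:", np.exp(fit.x[2]))
```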
Bayesian statistics in radionuclide metrology: measurement of a decaying source
NASA Astrophysics Data System (ADS)
Bochud, François O.; Bailat, Claude J.; Laedermann, Jean-Pascal
2007-08-01
The most intuitive way of defining a probability is perhaps through the frequency at which it appears when a large number of trials are realized in identical conditions. The probability derived from the obtained histogram characterizes the so-called frequentist or conventional statistical approach. In this sense, probability is defined as a physical property of the observed system. By contrast, in Bayesian statistics, a probability is not a physical property or a directly observable quantity, but a degree of belief or an element of inference. The goal of this paper is to show how Bayesian statistics can be used in radionuclide metrology and what its advantages and disadvantages are compared with conventional statistics. This is performed through the example of a yttrium-90 source typically encountered in environmental surveillance measurement. Because of the very low activity of this kind of source and the short half-life of the radionuclide, this measurement takes several days, during which the source decays significantly. Several methods are proposed to compute simultaneously the number of unstable nuclei at a given reference time, the decay constant and the background. Asymptotically, all approaches give the same result. However, Bayesian statistics produces coherent estimates and confidence intervals in a much smaller number of measurements. Apart from the conceptual understanding of statistics, the main difficulty that could deter radionuclide metrologists from using Bayesian statistics is the complexity of the computation.
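A minimal version of such a computation is a grid posterior for the source activity and background, with Poisson counts whose rate decays at the yttrium-90 rate (half-life about 64.1 h). The measurement times, counts, grids, and flat priors below are invented for illustration.

```python
import numpy as np
from scipy.stats import poisson

# Grid-based posterior sketch for a decaying-source measurement:
# counts_i ~ Poisson(A0 * exp(-lambda * t_i) + b), lambda fixed from Y-90.
half_life_h = 64.1
lam = np.log(2) / half_life_h                       # decay constant, per hour
t = np.array([0.0, 24.0, 48.0, 72.0])               # measurement times (h)
counts = np.array([520, 410, 330, 270])             # hypothetical counts

A0_grid = np.linspace(100, 900, 161)                # initial-activity grid
b_grid = np.linspace(0, 200, 81)                    # background grid
A0, B = np.meshgrid(A0_grid, b_grid, indexing="ij")

log_post = np.zeros_like(A0)                        # flat prior on the grid
for ti, ci in zip(t, counts):
    rate = A0 * np.exp(-lam * ti) + B
    log_post += poisson.logpmf(ci, rate)
post = np.exp(log_post - log_post.max())
post /= post.sum()
i, j = np.unravel_index(post.argmax(), post.shape)
print("MAP A0:", A0_grid[i], " MAP background:", b_grid[j])
```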
Can Bayesian Theories of Autism Spectrum Disorder Help Improve Clinical Practice?
Haker, Helene; Schneebeli, Maya; Stephan, Klaas Enno
2016-01-01
Diagnosis and individualized treatment of autism spectrum disorder (ASD) represent major problems for contemporary psychiatry. Tackling these problems requires guidance by a pathophysiological theory. In this paper, we consider recent theories that re-conceptualize ASD from a “Bayesian brain” perspective, which posit that the core abnormality of ASD resides in perceptual aberrations due to a disbalance in the precision of prediction errors (sensory noise) relative to the precision of predictions (prior beliefs). This results in percepts that are dominated by sensory inputs and less guided by top-down regularization and shifts the perceptual focus to detailed aspects of the environment with difficulties in extracting meaning. While these Bayesian theories have inspired ongoing empirical studies, their clinical implications have not yet been carved out. Here, we consider how this Bayesian perspective on disease mechanisms in ASD might contribute to improving clinical care for affected individuals. Specifically, we describe a computational strategy, based on generative (e.g., hierarchical Bayesian) models of behavioral and functional neuroimaging data, for establishing diagnostic tests. These tests could provide estimates of specific cognitive processes underlying ASD and delineate pathophysiological mechanisms with concrete treatment targets. Written with a clinical audience in mind, this article outlines how the development of computational diagnostics applicable to behavioral and functional neuroimaging data in routine clinical practice could not only fundamentally alter our concept of ASD but eventually also transform the clinical management of this disorder. PMID:27378955
NASA Astrophysics Data System (ADS)
Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.; Amerjeed, Mansoor
2018-02-01
Bayesian inference using Markov Chain Monte Carlo (MCMC) provides an explicit framework for stochastic calibration of hydrogeologic models accounting for uncertainties; however, the MCMC sampling entails a large number of model calls, and could easily become computationally unwieldy if the high-fidelity hydrogeologic model simulation is time consuming. This study proposes a surrogate-based Bayesian framework to address this notorious issue, and illustrates the methodology by inverse modeling a regional MODFLOW model. The high-fidelity groundwater model is approximated by a fast statistical model using Bagging Multivariate Adaptive Regression Spline (BMARS) algorithm, and hence the MCMC sampling can be efficiently performed. In this study, the MODFLOW model is developed to simulate the groundwater flow in an arid region of Oman consisting of mountain-coast aquifers, and used to run representative simulations to generate training dataset for BMARS model construction. A BMARS-based Sobol' method is also employed to efficiently calculate input parameter sensitivities, which are used to evaluate and rank their importance for the groundwater flow model system. According to sensitivity analysis, insensitive parameters are screened out of Bayesian inversion of the MODFLOW model, further saving computing efforts. The posterior probability distribution of input parameters is efficiently inferred from the prescribed prior distribution using observed head data, demonstrating that the presented BMARS-based Bayesian framework is an efficient tool to reduce parameter uncertainties of a groundwater system.
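The surrogate-based idea can be sketched generically: train a cheap approximation of an expensive forward model, then run MCMC against the surrogate. Below, a polynomial fit stands in for the BMARS surrogate and a one-parameter toy function stands in for the MODFLOW simulator; everything is an illustrative assumption.

```python
import numpy as np

# Surrogate-based MCMC sketch: replace an "expensive" forward model with a
# cheap polynomial fit, then run Metropolis on the surrogate misfit.
rng = np.random.default_rng(0)
expensive = lambda k: np.sqrt(k) + 0.1 * np.sin(3 * k)   # stand-in simulator

k_train = np.linspace(0.5, 4.0, 30)                      # training design
coef = np.polyfit(k_train, expensive(k_train), deg=4)    # cheap surrogate
surrogate = lambda k: np.polyval(coef, k)

obs, sigma = expensive(2.0) + 0.01, 0.05                 # noisy observation
def log_post(k):
    if not 0.5 <= k <= 4.0:                              # uniform prior box
        return -np.inf
    return -0.5 * ((obs - surrogate(k)) / sigma) ** 2

k, chain = 1.0, []
for _ in range(5000):
    prop = k + rng.normal(0, 0.2)
    if np.log(rng.random()) < log_post(prop) - log_post(k):
        k = prop
    chain.append(k)
print("posterior mean of k ~", np.mean(chain[1000:]))
```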
Lee, Hyunjung; McKeon, Robert J; Bellamkonda, Ravi V
2010-02-23
Chondroitin sulfate proteoglycans (CSPGs) are a major class of axon growth inhibitors that are up-regulated after spinal cord injury (SCI) and contribute to regenerative failure. Chondroitinase ABC (chABC) digests glycosaminoglycan chains on CSPGs and can thereby overcome CSPG-mediated inhibition. But chABC loses its enzymatic activity rapidly at 37 degrees C, necessitating the use of repeated injections or local infusions for a period of days to weeks. These infusion systems are invasive, infection-prone, and clinically problematic. To overcome this limitation, we have thermostabilized chABC and developed a system for its sustained local delivery in vivo, obviating the need for chronically implanted catheters and pumps. Thermostabilized chABC remained active at 37 degrees C in vitro for up to 4 weeks. CSPG levels remained low in vivo up to 6 weeks post-SCI when thermostabilized chABC was delivered by a hydrogel-microtube scaffold system. Axonal growth and functional recovery following the sustained local release of thermostabilized chABC versus a single treatment of unstabilized chABC demonstrated significant differences in CSPG digestion. Animals treated with thermostabilized chABC in combination with sustained neurotrophin-3 delivery showed significant improvement in locomotor function and enhanced growth of cholera toxin B subunit-positive sensory axons and sprouting of serotonergic fibers. Therefore, improving chABC thermostability facilitates minimally invasive, sustained, local delivery of chABC that is potentially effective in overcoming CSPG-mediated regenerative failure. Combination therapy with thermostabilized chABC with neurotrophic factors enhances axonal regrowth, sprouting, and functional recovery after SCI.
Hyde, B B; Liesa, M; Elorza, A A; Qiu, W; Haigh, S E; Richey, L; Mikkola, H K; Schlaeger, T M; Shirihai, O S
2012-07-01
The mitochondrial transporter ATP binding cassette mitochondrial erythroid (ABC-me/ABCB10) is highly induced during erythroid differentiation by GATA-1 and its overexpression increases hemoglobin production rates in vitro. However, the role of ABC-me in erythropoiesis in vivo is unknown. Here we report for the first time that erythrocyte development in mice requires ABC-me. ABC-me-/- mice die at day 12.5 of gestation, showing nearly complete eradication of primitive erythropoiesis and lack of hemoglobinized cells at day 10.5. ABC-me-/- erythroid cells fail to differentiate because they exhibit a marked increase in apoptosis, both in vivo and ex vivo. Erythroid precursors are particularly sensitive to oxidative stress and ABC-me in the heart and its yeast ortholog multidrug resistance-like 1 have been shown to protect against oxidative stress. Thus, we hypothesized that increased apoptosis in ABC-me-/- erythroid precursors was caused by oxidative stress. Within this context, ABC-me deletion causes an increase in mitochondrial superoxide production and protein carbonylation in erythroid precursors. Furthermore, treatment of ABC-me-/- erythroid progenitors with the mitochondrial antioxidant MnTBAP (superoxide dismutase 2 mimetic) supports survival, ex vivo differentiation and increased hemoglobin production. Altogether, our findings demonstrate that ABC-me is essential for erythropoiesis in vivo.
75 FR 49549 - ABC & D Recycling, Inc.-Lease and Operation Exemption-a Line of Railroad in Ware, MA
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-13
DEPARTMENT OF TRANSPORTATION, Surface Transportation Board [Docket No. FD 35397]: ABC & D Recycling, Inc.--Lease and Operation Exemption--a Line of Railroad in Ware, MA.
a Novel Discrete Optimal Transport Method for Bayesian Inverse Problems
NASA Astrophysics Data System (ADS)
Bui-Thanh, T.; Myers, A.; Wang, K.; Thiery, A.
2017-12-01
We present the Augmented Ensemble Transform (AET) method for generating approximate samples from a high-dimensional posterior distribution as a solution to Bayesian inverse problems. Solving large-scale inverse problems is critical for some of the most relevant and impactful scientific endeavors of our time. Therefore, constructing novel methods for solving the Bayesian inverse problem in more computationally efficient ways can have a profound impact on the science community. This research derives the novel AET method for exploring a posterior by solving a sequence of linear programming problems, resulting in a series of transport maps which map prior samples to posterior samples, allowing for the computation of moments of the posterior. We show both theoretical and numerical results, indicating this method can offer superior computational efficiency when compared to other SMC methods. Most of this efficiency is derived from matrix scaling methods to solve the linear programming problem and derivative-free optimization for particle movement. We use this method to determine inter-well connectivity in a reservoir and the associated uncertainty related to certain parameters. The attached file shows the difference between the true parameter and the AET parameter in an example 3D reservoir problem. The error is within the Morozov discrepancy allowance with lower computational cost than other particle methods.
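The core linear-programming step of an ensemble-transform method can be sketched on a tiny problem: solve for a transport plan between equally weighted prior particles and importance-weighted posterior weights. The particle locations, weights, and squared-distance cost below are toy assumptions, not the AET algorithm itself.

```python
import numpy as np
from scipy.optimize import linprog

# Tiny discrete optimal-transport sketch: minimize sum(C * T) subject to
# row sums = prior weights and column sums = posterior weights, T >= 0.
x = np.array([0.0, 1.0, 2.0])                 # particle locations
a = np.full(3, 1 / 3)                         # equal prior weights
w = np.array([0.1, 0.3, 0.6])                 # posterior (importance) weights
C = (x[:, None] - x[None, :]) ** 2            # squared-distance cost matrix

n = 3
A_eq, b_eq = [], []
for i in range(n):                            # row-sum constraints
    row = np.zeros((n, n)); row[i, :] = 1
    A_eq.append(row.ravel()); b_eq.append(a[i])
for j in range(n):                            # column-sum constraints
    col = np.zeros((n, n)); col[:, j] = 1
    A_eq.append(col.ravel()); b_eq.append(w[j])

res = linprog(C.ravel(), A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
print("transport plan:\n", res.x.reshape(n, n).round(3))
```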
NASA Astrophysics Data System (ADS)
Volkov, D.
2017-12-01
We introduce an algorithm for the simultaneous reconstruction of faults and slip fields on those faults. We define a regularized functional to be minimized for the reconstruction. We prove that the minimum of that functional converges to the unique solution of the related fault inverse problem. Due to inherent uncertainties in measurements, rather than seeking a deterministic solution to the fault inverse problem, we consider a Bayesian approach. The advantage of such an approach is that we obtain a way of quantifying uncertainties as part of our final answer. On the downside, this Bayesian approach leads to a very large computation. To contend with the size of this computation we developed an algorithm for the numerical solution to the stochastic minimization problem which can be easily implemented on a parallel multi-core platform and we discuss techniques to save on computational time. After showing how this algorithm performs on simulated data and assessing the effect of noise, we apply it to measured data. The data was recorded during a slow slip event in Guerrero, Mexico.
Cole, Michael H; Rippey, Jodi; Naughton, Geraldine A; Silburn, Peter A
2016-01-01
To assess whether the 16-item Activities-specific Balance Confidence scale (ABC-16) and short-form 6-item Activities-specific Balance Confidence scale (ABC-6) could predict future recurrent falls in people with Parkinson disease (PD) and to validate the robustness of their predictive capacities. Twelve-month prospective cohort study. General community. People with idiopathic PD (N=79). Clinical tests were conducted to assess symptom severity, balance confidence, and medical history. Over the subsequent 12 months, participants recorded any falls on daily fall calendars, which they returned monthly by reply paid post. Logistic regression and receiver operating characteristic analyses estimated the sensitivities and specificities of the ABC-16 and ABC-6 for predicting future recurrent falls in this cohort, and "leave-one-out" validation was used to assess their robustness. Of the 79 patients who completed follow-up, 28 (35.4%) fell more than once during the 12-month period. Both the ABC-16 and ABC-6 were significant predictors of future recurrent falls, and moderate sensitivities (ABC-16: 75.0%; ABC-6: 71.4%) and specificities (ABC-16: 76.5%; ABC-6: 74.5%) were reported for each tool for a cutoff score of 77.5 and 65.8, respectively. The results have significant implications and demonstrate that the ABC-16 and ABC-6 independently identify patients with PD at risk of future recurrent falls. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
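The analysis pattern above (logistic regression plus an ROC-derived cutoff) looks roughly like the following sketch; the synthetic scores, group sizes, and Youden-index cutoff rule are assumptions, not the study data or its exact procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve

# Logistic regression of recurrent-fall status on a balance-confidence score,
# with a Youden-style cutoff from the ROC curve (synthetic data).
rng = np.random.default_rng(0)
score = np.concatenate([rng.normal(60, 12, 28), rng.normal(82, 10, 51)])
fell = np.concatenate([np.ones(28), np.zeros(51)])   # 1 = recurrent faller

model = LogisticRegression().fit(score.reshape(-1, 1), fell)
prob = model.predict_proba(score.reshape(-1, 1))[:, 1]
fpr, tpr, thr = roc_curve(fell, prob)
best = np.argmax(tpr - fpr)                          # maximize Youden's J
print("sensitivity %.2f  specificity %.2f" % (tpr[best], 1 - fpr[best]))
```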
A new prior for bayesian anomaly detection: application to biosurveillance.
Shen, Y; Cooper, G F
2010-01-01
Bayesian anomaly detection computes posterior probabilities of anomalous events by combining prior beliefs and evidence from data. However, the specification of prior probabilities can be challenging. This paper describes a Bayesian prior in the context of disease outbreak detection. The goal is to provide a meaningful, easy-to-use prior that yields a posterior probability of an outbreak that performs at least as well as a standard frequentist approach. If this goal is achieved, the resulting posterior could be usefully incorporated into a decision analysis about how to act in light of a possible disease outbreak. This paper describes a Bayesian method for anomaly detection that combines learning from data with a semi-informative prior probability over patterns of anomalous events. A univariate version of the algorithm is presented here for ease of illustration of the essential ideas. The paper describes the algorithm in the context of disease-outbreak detection, but it is general and can be used in other anomaly detection applications. For this application, the semi-informative prior specifies that an increased count over baseline is expected for the variable being monitored, such as the number of respiratory chief complaints per day at a given emergency department. The semi-informative prior is derived from the baseline prior, which is estimated using historical data. The evaluation reported here used semi-synthetic data to evaluate the detection performance of the proposed Bayesian method and a control chart method, which is a standard frequentist algorithm that is closest to the Bayesian method in terms of the type of data it uses. The disease-outbreak detection performance of the Bayesian method was statistically significantly better than that of the control chart method when proper baseline periods were used to estimate the baseline behavior to avoid seasonal effects. When using longer baseline periods, the Bayesian method performed as well as the control chart method. The time complexity of the Bayesian algorithm is linear in the number of the observed events being monitored, due to a novel, closed-form derivation that is introduced in the paper. This paper introduces a novel prior probability for Bayesian outbreak detection that is expressive, easy-to-apply, computationally efficient, and performs as well or better than a standard frequentist method.
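A stripped-down version of such a detector compares a baseline model against an elevated-count model, with the baseline rate prior learned from historical counts. The historical data, the 1.5x rate multiplier, and the prior outbreak probability below are illustrative assumptions; the paper's closed-form derivation is not reproduced.

```python
import numpy as np
from scipy.stats import gamma, poisson

# Posterior outbreak probability sketch: compare a baseline Poisson model
# against an elevated-rate model, with a Gamma posterior over the baseline
# rate estimated from historical daily counts (all numbers illustrative).
history = np.array([18, 22, 20, 19, 25, 21, 23])   # past daily counts
a, b = history.sum() + 1, len(history) + 1          # Gamma(a, rate=b) posterior
today = 38

def marginal(count, mult):
    # integrate Poisson(count | mult * rate) over the Gamma rate posterior
    rates = np.linspace(5, 60, 1000)
    prior = gamma.pdf(rates, a, scale=1 / b)
    like = poisson.pmf(count, mult * rates)
    return (prior * like).sum() * (rates[1] - rates[0])

p_outbreak_prior = 0.01
m_base = marginal(today, mult=1.0)                  # baseline model evidence
m_out = marginal(today, mult=1.5)                   # elevated-count model
post = (p_outbreak_prior * m_out /
        (p_outbreak_prior * m_out + (1 - p_outbreak_prior) * m_base))
print("P(outbreak | today) ~", round(post, 4))
```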
ERIC Educational Resources Information Center
Tsutakawa, Robert K.; Lin, Hsin Ying
Item response curves for a set of binary responses are studied from a Bayesian viewpoint of estimating the item parameters. For the two-parameter logistic model with normally distributed ability, restricted bivariate beta priors are used to illustrate the computation of the posterior mode via the EM algorithm. The procedure is illustrated by data…
Virtual Representation of IID Observations in Bayesian Belief Networks
1994-04-01
Programs for structuring and using Bayesian inference include ERGO (Noetic Systems, Inc., 1991) and HUGIN (Andersen, Jensen, Olesen, & Jensen, 1989).
Bayesian methods for outliers detection in GNSS time series
NASA Astrophysics Data System (ADS)
Qianqian, Zhang; Qingming, Gui
2013-07-01
This article is concerned with the problem of detecting outliers in GNSS time series based on Bayesian statistical theory. Firstly, a new model is proposed to simultaneously detect different types of outliers by introducing a classification variable for each type of outlier; the problem of outlier detection is converted into the computation of the corresponding posterior probabilities, and an algorithm for computing these posterior probabilities based on a standard Gibbs sampler is designed. Secondly, we analyze in depth the causes of masking and swamping when detecting patches of additive outliers; an unmasking Bayesian method for detecting additive outlier patches is proposed based on an adaptive Gibbs sampler. Thirdly, the correctness of the theories and methods proposed above is illustrated with simulated data and then by analyzing real GNSS observations, such as cycle-slip detection in carrier phase data. Examples illustrate that the Bayesian methods for outlier detection in GNSS time series proposed in this paper are capable of detecting not only isolated outliers but also additive outlier patches. Furthermore, they can be successfully used to process cycle slips in phase data, which solves the problem of small cycle slips.
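The classification-variable construction can be sketched with a small Gibbs sampler: each observation gets an indicator z_i for "additive outlier", and the sampler alternates between updating the indicators and the level. The toy series, known variances, inflation factor, and Bernoulli prior below are assumptions, not the paper's GNSS model.

```python
import numpy as np

# Gibbs sketch for additive-outlier detection with indicators z_i:
# y_i ~ N(mu, s2) if z_i = 0, and N(mu, K * s2) if z_i = 1 (toy settings).
rng = np.random.default_rng(0)
y = rng.normal(10.0, 0.5, 50)
y[[12, 37]] += 5.0                          # planted additive outliers
s2, K, p = 0.25, 100.0, 0.05                # known variances, Bernoulli prior

mu = y.mean()
z_sum = np.zeros_like(y)
for it in range(2000):
    # update indicators given mu: posterior odds of the two components
    l0 = (1 - p) * np.exp(-0.5 * (y - mu) ** 2 / s2) / np.sqrt(s2)
    l1 = p * np.exp(-0.5 * (y - mu) ** 2 / (K * s2)) / np.sqrt(K * s2)
    z = rng.random(y.size) < l1 / (l0 + l1)
    # update mu given indicators (flat prior): precision-weighted mean
    prec = np.where(z, 1 / (K * s2), 1 / s2)
    mu = rng.normal((prec * y).sum() / prec.sum(), 1 / np.sqrt(prec.sum()))
    if it >= 500:                           # keep post-burn-in samples
        z_sum += z
print("posterior P(outlier) at planted points:",
      np.round(z_sum / 1500, 2)[[12, 37]])
```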
A simulation study on Bayesian Ridge regression models for several collinearity levels
NASA Astrophysics Data System (ADS)
Efendi, Achmad; Effrihan
2017-12-01
When analyzing data with a multiple regression model, one or several predictor variables are usually omitted from the model if there are collinearities. Sometimes, however, for medical or economic reasons, all predictors are important and should be included in the model. Ridge regression is commonly used to cope with collinearity. In this model, weights for predictor variables are used when estimating parameters. The estimation process can follow the likelihood approach, but nowadays a Bayesian version offers an alternative. The Bayesian method has historically been less popular than the likelihood method owing to difficulties such as its computational demands; nevertheless, with recent improvements in computational methodology, this caveat is no longer a serious obstacle. This paper discusses a simulation study for evaluating the characteristics of Bayesian ridge regression parameter estimates. There are several simulation settings based on a variety of collinearity levels and sample sizes. The results show that the Bayesian method performs better for relatively small sample sizes, while for the other settings it performs similarly to the likelihood method.
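For the conjugate special case (known noise and prior scales), the Bayesian ridge posterior is available in closed form, which the following sketch evaluates on a synthetically collinear design; the penalty, noise level, and design are illustrative choices, not the simulation settings of the study.

```python
import numpy as np

# Conjugate Bayesian ridge sketch: with prior beta ~ N(0, tau^-1 I) and noise
# N(0, sigma^2), the posterior is N((X'X + lam I)^-1 X'y, sigma^2 (X'X + lam I)^-1)
# where lam = sigma^2 * tau (all settings illustrative).
rng = np.random.default_rng(0)
n, lam, sigma = 30, 5.0, 1.0
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)       # near-collinear predictor
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=sigma, size=n)

A = X.T @ X + lam * np.eye(2)
beta_post_mean = np.linalg.solve(A, X.T @ y)
beta_post_cov = sigma ** 2 * np.linalg.inv(A)
print("posterior mean:", beta_post_mean)
print("posterior sd:  ", np.sqrt(np.diag(beta_post_cov)))
```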
Embedding the results of focussed Bayesian fusion into a global context
NASA Astrophysics Data System (ADS)
Sander, Jennifer; Heizmann, Michael
2014-05-01
Bayesian statistics offers a well-founded and powerful fusion methodology also for the fusion of heterogeneous information sources. However, except in special cases, the needed posterior distribution is not analytically derivable. As a consequence, Bayesian fusion may cause unacceptably high computational and storage costs in practice. Local Bayesian fusion approaches aim at reducing the complexity of the Bayesian fusion methodology significantly. This is done by concentrating the actual Bayesian fusion on the potentially most task-relevant parts of the domain of the Properties of Interest. Our research on these approaches is motivated by an analogy to criminal investigations, where investigators likewise pursue clues only locally. This publication follows previous publications on a special local Bayesian fusion technique called focussed Bayesian fusion, in which the actual calculation of the posterior distribution is completely restricted to a suitably chosen local context. As a result, the global posterior distribution is not completely determined, and strategies for using the results of a focussed Bayesian analysis appropriately are needed. In this publication, we primarily contrast different ways of embedding the results of focussed Bayesian fusion explicitly into a global context. To obtain a unique global posterior distribution, we analyze the application of the Maximum Entropy Principle, which has been shown to be successfully applicable in metrology and in different other areas. To address the special need for making further decisions subsequent to the actual fusion task, we further analyze criteria for decision making under partial information.
Whole-Genome Survey of the Putative ATP-Binding Cassette Transporter Family Genes in Vitis vinifera
Çakır, Birsen; Kılıçkaya, Ozan
2013-01-01
The ATP-binding cassette (ABC) protein superfamily constitutes one of the largest protein families known in plants. In this report, we performed a complete inventory of ABC protein genes in Vitis vinifera, the whole genome of which has been sequenced. By comparison with ABC protein members of Arabidopsis thaliana, we identified 135 putative ABC proteins with 1 or 2 NBDs in V. vinifera. Of these, 120 encode intrinsic membrane proteins, and 15 encode proteins missing TMDs. V. vinifera ABC proteins can be divided into 13 subfamilies with 79 “full-size,” 41 “half-size,” and 15 “soluble” putative ABC proteins. The main feature of the Vitis ABC superfamily is the presence of 2 large subfamilies, ABCG (pleiotropic drug resistance and white-brown complex homolog) and ABCC (multidrug resistance-associated protein). We identified orthologs of V. vinifera putative ABC transporters in different species. This work represents the first complete inventory of ABC transporters in V. vinifera. The identification of Vitis ABC transporters and their comparative analysis with the Arabidopsis counterparts revealed a strong conservation between the 2 species. This inventory could help elucidate the biological and physiological functions of these transporters in V. vinifera. PMID:24244377
Young, David W
2015-11-01
Historically, hospital departments have computed the costs of individual tests or procedures using the ratio of cost to charges (RCC) method, which can produce inaccurate results. To determine a more accurate cost of a test or procedure, the activity-based costing (ABC) method must be used. Accurate cost calculations will ensure reliable information about the profitability of a hospital's DRGs.
A New Computational Framework for Atmospheric and Surface Remote Sensing
NASA Technical Reports Server (NTRS)
Timucin, Dogan A.
2004-01-01
A Bayesian data-analysis framework is described for atmospheric and surface retrievals from remotely-sensed hyper-spectral data. Some computational techniques are highlighted for improved accuracy in the forward physics model.
Untch, Michael; Würstlein, Rachel; Marschner, Norbert; Lüftner, Diana; Augustin, Doris; Briest, Susanne; Ettl, Johannes; Haidinger, Renate; Müller, Lothar; Müller, Volkmar; Ruckhäberle, Eugen; Harbeck, Nadia; Thomssen, Christoph
2018-01-01
The fourth international advanced breast cancer consensus conference (ABC4) on the diagnosis and treatment of advanced breast cancer (ABC) headed by Professor Fatima Cardoso was once again held in Lisbon on November 2 – 4, 2017. To simplify matters, the abbreviation ABC will be used hereinafter in the text. In clinical practice, the abbreviation corresponds to metastatic breast cancer or locally far-advanced disease. This year the focus was on new developments in the treatment of ABC. Topics discussed included the importance of CDK4/6 inhibition in hormone receptor (HR)-positive ABC, the use of dual antibody blockade to treat HER2-positive ABC, PARP inhibition in triple-negative ABC and the potential therapeutic outcomes. Another major area discussed at the conference was BRCA-associated breast cancer, the treatment of cerebral metastasis, and individualized treatment decisions based on molecular testing (so-called precision medicine). As in previous years, close cooperation with representatives from patient organizations from around the world is an important aspect of the ABC conference. This cooperation was reinforced and expanded at the ABC4 conference. A global alliance was founded at the conclusion of the consensus conference, which aims to promote and coordinate the measures considered necessary by patient advocates worldwide. Because the panel of experts was composed of specialists from all over the world, it was inevitable that the ABC consensus also reflected country-specific features. As in previous years, a team of German breast cancer specialists who closely followed the consensus voting of the ABC panelists in Lisbon and intensively discussed the votes has therefore commented on the consensus in the context of the current German guidelines on the diagnosis and treatment of breast cancer 1 , 2 used in clinical practice in Germany. The ABC consensus is based on the votes of the ABC panelists in Lisbon. PMID:29880982
Xiao, Lin-Fan; Zhang, Wei; Jing, Tian-Xing; Zhang, Meng-Yi; Miao, Ze-Qing; Wei, Dan-Dan; Yuan, Guo-Rui; Wang, Jin-Jun
2018-03-01
The ATP-binding cassette (ABC) is the largest transporter gene family, and its genes play key roles in xenobiotic resistance, metabolism, and development across all phyla. However, the specific functions of ABC gene families in insects are unclear. We report a genome-wide identification, phylogenetic, and transcriptional analysis of the ABC genes in the oriental fruit fly, Bactrocera dorsalis (Hendel). We identified a total of 47 ABC genes (BdABCs) from the transcriptomic and genomic databases of B. dorsalis and classified these genes into eight subfamilies (A-H), including 7 ABCAs, 7 ABCBs, 9 ABCCs, 2 ABCDs, 1 ABCE, 3 ABCFs, 15 ABCGs, and 3 ABCHs. Comparative phylogenetic analysis of the ABCs suggests an orthologous relationship between B. dorsalis and other insect species in which these genes have been related to pesticide resistance and essential biological processes. Comparison of transcriptome and relative expression patterns of BdABCs indicated diverse functions within different B. dorsalis tissues. The expression of 4, 10, and 14 of 18 BdABCs was significantly upregulated after exposure to LD50s of malathion, avermectin, and beta-cypermethrin, respectively. The maximum expression level of most BdABCs (including BdABCFs, BdABCGs, and BdABCHs) occurred at 48 h post-exposure, whereas BdABCEs peaked at 24 h after treatment. Furthermore, RNA interference-mediated suppression of BdABCB7 resulted in increased toxicity of malathion against B. dorsalis. These data suggest that ABC transporter genes might play key roles in xenobiotic metabolism and biosynthesis in B. dorsalis. Copyright © 2017 Elsevier Inc. All rights reserved.
Sims, Lynn M; Igarashi, Robert Y
2012-08-15
Ribosomal function is dependent on multiple proteins. The ABCE1 ATPase, a unique ABC superfamily member that bears two Fe₄S₄ clusters, is crucial for ribosomal biogenesis and recycling. Here, the ATPase activity of the Pyrococcus abyssi ABCE1 (PabABCE1) was studied using both apo- (without reconstituted Fe-S clusters) and holo- (with full complement of Fe-S clusters reconstituted post-purification) forms, and is shown to be jointly regulated by the status of Fe-S clusters and Mg²⁺. Typically ATPases require Mg²⁺, as is true for PabABCE1, but Mg²⁺ also acts as a negative allosteric effector that modulates the ATP affinity of PabABCE1. Physiological [Mg²⁺] inhibits the PabABCE1 ATPase (Kᵢ of ∼1 μM) for both apo- and holo-PabABCE1. Comparative kinetic analysis of Mg²⁺ inhibition shows differences in the degree of allosteric regulation between apo- and holo-PabABCE1, where the apparent ATP Kₘ of apo-PabABCE1 increases >30-fold from ∼30 μM to over 1 mM with Mg²⁺. This effect would significantly convert the ATPase activity of PabABCE1 from being independent of cellular energy charge (φ) to being dependent on φ with cellular [Mg²⁺]. These findings uncover intricate overlapping effects of both [Mg²⁺] and the status of Fe-S clusters that regulate ABCE1's ATPase activity, with implications for ribosomal function. Copyright © 2012 Elsevier Inc. All rights reserved.
Quantum state estimation when qubits are lost: a no-data-left-behind approach
Williams, Brian P.; Lougovski, Pavel
2017-04-06
We present an approach to Bayesian mean estimation of quantum states using hyperspherical parametrization and an experiment-specific likelihood which allows utilization of all available data, even when qubits are lost. With this method, we report the first closed-form Bayesian mean and maximum likelihood estimates for the ideal single qubit. Due to computational constraints, we utilize numerical sampling to determine the Bayesian mean estimate for a photonic two-qubit experiment in which our novel analysis reduces burdens associated with experimental asymmetries and inefficiencies. This method can be applied to quantum states of any dimension and experimental complexity.
Uncertainty aggregation and reduction in structure-material performance prediction
NASA Astrophysics Data System (ADS)
Hu, Zhen; Mahadevan, Sankaran; Ao, Dan
2018-02-01
An uncertainty aggregation and reduction framework is presented for structure-material performance prediction. Different types of uncertainty sources, structural analysis model, and material performance prediction model are connected through a Bayesian network for systematic uncertainty aggregation analysis. To reduce the uncertainty in the computational structure-material performance prediction model, Bayesian updating using experimental observation data is investigated based on the Bayesian network. It is observed that the Bayesian updating results will have large error if the model cannot accurately represent the actual physics, and that this error will be propagated to the predicted performance distribution. To address this issue, this paper proposes a novel uncertainty reduction method by integrating Bayesian calibration with model validation adaptively. The observation domain of the quantity of interest is first discretized into multiple segments. An adaptive algorithm is then developed to perform model validation and Bayesian updating over these observation segments sequentially. Only information from observation segments where the model prediction is highly reliable is used for Bayesian updating; this is found to increase the effectiveness and efficiency of uncertainty reduction. A composite rotorcraft hub component fatigue life prediction model, which combines a finite element structural analysis model and a material damage model, is used to demonstrate the proposed method.
The image recognition based on neural network and Bayesian decision
NASA Astrophysics Data System (ADS)
Wang, Chugege
2018-04-01
Artificial neural networks, an important part of artificial intelligence, date back to the 1940s. At present, they are a hot topic in neuroscience, computer science, brain science, mathematics, and psychology. Bayes' theorem, attributed to Thomas Bayes, was first published in 1763. After further development in the twentieth century, Bayesian methods have become widespread throughout statistics. In recent years, advances in high-dimensional integral computation have strengthened the theoretical foundations of Bayesian statistics, allowing it to solve problems that classical statistics cannot and to be applied in interdisciplinary fields. This paper introduces the concepts and principles of artificial neural networks, summarizes the basic content and principles of Bayesian statistics, and combines neural network techniques with Bayesian decision theory in image recognition applications, such as enhanced face detection based on a neural network with Bayesian decision making, and image classification based on Bayesian decisions. The combination of artificial intelligence and statistical algorithms remains a highly active research topic.
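As a minimal illustration of the Bayesian decision step that such hybrid systems append to a network's outputs, the sketch below picks the class with the largest posterior, computed from assumed class priors and network-derived likelihoods; all numbers are hypothetical.

```python
import numpy as np

# Bayesian decision rule on classifier scores: choose the class that
# maximizes posterior ∝ likelihood * prior.
priors = np.array([0.7, 0.3])          # assumed P(non-face), P(face)
likelihoods = np.array([0.10, 0.60])   # network outputs treated as p(x | class)
posteriors = likelihoods * priors
posteriors /= posteriors.sum()
print("decision:", ["non-face", "face"][np.argmax(posteriors)], posteriors)
```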
A Silent ABC Transporter Isolated from Streptomyces rochei F20 Induces Multidrug Resistance
Fernández-Moreno, Miguel A.; Carbó, Lázaro; Cuesta, Trinidad; Vallín, Carlos; Malpartida, Francisco
1998-01-01
In the search for heterologous activators for actinorhodin production in Streptomyces lividans, 3.4 kb of DNA from Streptomyces rochei F20 (a streptothricin producer) were characterized. Subcloning experiments showed that the minimal DNA fragment required for activation was 0.4 kb in size. The activation is mediated by increasing the levels of transcription of the actII-ORF4 gene. Sequencing of the minimal activating fragment did not reveal any clues about its mechanism; nevertheless, it was shown to overlap the 3′ end of two convergent genes, one of whose translated products (ORF2) strongly resembles that of other genes belonging to the ABC transporter superfamily. Computer-assisted analysis of the 3.4-kb DNA sequence showed the 3′ terminus of an open reading frame (ORF), i.e., ORFA, and three complete ORFs (ORF1, ORF2, and ORFB). Searches in the databases with their respective gene products revealed similarities for ORF1 and ORF2 with ATP-binding proteins and transmembrane proteins, respectively, which are found in members of the ABC transporter superfamily. No similarities for ORFA and ORFB were found in the databases. Insertional inactivation of ORF1 and ORF2, their transcription analysis, and their cloning in heterologous hosts suggested that these genes were not expressed under our experimental conditions; however, cloning of ORF1 and ORF2 together (but not separately) under the control of an expressing promoter induced resistance to several chemically different drugs: oleandomycin, erythromycin, spiramycin, doxorubicin, and tetracycline. Thus, this genetic system, named msr, is a new bacterial multidrug ABC transporter. PMID:9696745
Walters, Kevin
2012-08-07
In this paper we use approximate Bayesian computation to estimate the parameters in an immortal model of colonic stem cell division. We base the inferences on the observed DNA methylation patterns of cells sampled from the human colon. Utilising DNA methylation patterns as a form of molecular clock is an emerging area of research and has been used in several studies investigating colonic stem cell turnover. There is much debate concerning the two competing models of stem cell turnover: the symmetric (immortal) and asymmetric models. Early simulation studies concluded that the observed methylation data were not consistent with the immortal model. A later modified version of the immortal model that included preferential strand segregation was subsequently shown to be consistent with the same methylation data. Most of this earlier work assumes site-independent methylation models that do not take account of the known processivity of methyltransferases, whilst other work does not take into account the methylation errors that occur in differentiated cells. This paper addresses both of these issues for the immortal model and demonstrates that approximate Bayesian computation provides accurate estimates of the parameters in this neighbour-dependent model of methylation error rates. The results indicate that if colonic stem cells divide asymmetrically then colon stem cell niches are maintained by more than 8 stem cells. Results also indicate the possibility of preferential strand segregation and provide clear evidence against a site-independent model for methylation errors. In addition, algebraic expressions for some of the summary statistics used in the approximate Bayesian computation (that allow for the additional variation arising from cell division in differentiated cells) are derived and their utility discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
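The core ABC rejection loop that underlies this kind of inference is short enough to sketch. The toy simulator below is site-independent (unlike the neighbour-dependent model of the paper) and uses a single summary statistic; the names, prior range, and tolerance are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate(error_rate, n_sites=1000, n_divisions=20):
    """Toy stand-in for the stem-cell methylation simulator: each initially
    unmethylated site acquires methylation independently with probability
    `error_rate` at each division."""
    p_methylated = 1.0 - (1.0 - error_rate) ** n_divisions
    return rng.binomial(n_sites, p_methylated) / n_sites  # summary: methylated fraction

s_obs = simulate(0.01)  # pretend this is the observed summary statistic

# ABC rejection: sample the error rate from its prior and keep draws whose
# simulated summary lands within eps of the observed one.
eps, n_draws = 0.01, 100_000
prior_draws = rng.uniform(0.0, 0.05, n_draws)
accepted = [r for r in prior_draws if abs(simulate(r) - s_obs) < eps]
print(f"accepted {len(accepted)} draws; posterior mean ~ {np.mean(accepted):.4f}")
```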
Hyde, B B; Liesa, M; Elorza, A A; Qiu, W; Haigh, S E; Richey, L; Mikkola, H K; Schlaeger, T M; Shirihai, O S
2012-01-01
The mitochondrial transporter ATP binding cassette mitochondrial erythroid (ABC-me/ABCB10) is highly induced during erythroid differentiation by GATA-1 and its overexpression increases hemoglobin production rates in vitro. However, the role of ABC-me in erythropoiesis in vivo is unknown. Here we report for the first time that erythrocyte development in mice requires ABC-me. ABC-me−/− mice die at day 12.5 of gestation, showing nearly complete eradication of primitive erythropoiesis and lack of hemoglobinized cells at day 10.5. ABC-me−/− erythroid cells fail to differentiate because they exhibit a marked increase in apoptosis, both in vivo and ex vivo. Erythroid precursors are particularly sensitive to oxidative stress and ABC-me in the heart and its yeast ortholog multidrug resistance-like 1 have been shown to protect against oxidative stress. Thus, we hypothesized that increased apoptosis in ABC-me−/− erythroid precursors was caused by oxidative stress. Within this context, ABC-me deletion causes an increase in mitochondrial superoxide production and protein carbonylation in erythroid precursors. Furthermore, treatment of ABC-me−/− erythroid progenitors with the mitochondrial antioxidant MnTBAP (superoxide dismutase 2 mimetic) supports survival, ex vivo differentiation and increased hemoglobin production. Altogether, our findings demonstrate that ABC-me is essential for erythropoiesis in vivo. PMID:22240895
On the Huygens absorbing boundary conditions for electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berenger, Jean-Pierre
A new absorbing boundary condition (ABC) is presented for the solution of Maxwell equations in unbounded spaces. Called the Huygens ABC, this condition is a generalization of two previously published ABCs, namely the multiple absorbing surfaces (MAS) and the re-radiating boundary condition (rRBC). The properties of the Huygens ABC are derived theoretically in continuous spaces and in the finite-difference (FDTD) discretized space. A solution is proposed to render the Huygens ABC effective for the absorption of evanescent waves. Numerical experiments with the FDTD method show that the effectiveness of the Huygens ABC is close to that of the PML ABC in some realistic problems of numerical electromagnetics. It is also shown in the paper that a combination of the Huygens ABC with the PML ABC is very well suited to the solution of some particular problems.
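The Huygens ABC itself is too involved for a short sketch, but the role any ABC plays is easy to see in one dimension. The following toy 1-D FDTD loop uses the classical first-order Mur ABC, not the Huygens ABC, as a baseline illustration; at Courant number S = 1 it absorbs outgoing 1-D waves exactly. Grid sizes and the source are arbitrary illustrative choices.

```python
import numpy as np

# Minimal 1-D FDTD (normalized units, Courant number S = 1) with a
# first-order Mur absorbing boundary at both grid ends.
nx, nt = 200, 400
ez = np.zeros(nx)
hy = np.zeros(nx - 1)
coef = 0.0  # (S - 1) / (S + 1); zero at S = 1, so outgoing waves vanish exactly

for t in range(nt):
    ez_left_old, ez_right_old = ez[1], ez[-2]   # saved for the Mur update
    hy += np.diff(ez)                           # update H from the curl of E
    ez[1:-1] += np.diff(hy)                     # update E from the curl of H
    ez[nx // 4] += np.exp(-((t - 30) / 10.0) ** 2)  # soft Gaussian source
    # Mur first-order ABC at the two ends of the grid.
    ez[0] = ez_left_old + coef * (ez[1] - ez[0])
    ez[-1] = ez_right_old + coef * (ez[-2] - ez[-1])

print("max |Ez| after the pulse exits:", np.abs(ez).max())  # should be ~0
```

Without the boundary update, the pulse would reflect off the grid edges and contaminate the solution, which is precisely the artifact that ABCs (Mur, PML, or Huygens) are designed to suppress.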
Fenton, Norman; Neil, Martin; Berger, Daniel
2016-01-01
Although the last forty years has seen considerable growth in the use of statistics in legal proceedings, it is primarily classical statistical methods rather than Bayesian methods that have been used. Yet the Bayesian approach avoids many of the problems of classical statistics and is also well suited to a broader range of problems. This paper reviews the potential and actual use of Bayes in the law and explains the main reasons for its lack of impact on legal practice. These include misconceptions by the legal community about Bayes’ theorem, over-reliance on the use of the likelihood ratio and the lack of adoption of modern computational methods. We argue that Bayesian Networks (BNs), which automatically produce the necessary Bayesian calculations, provide an opportunity to address most concerns about using Bayes in the law. PMID:27398389
Kamou, Nathalie N; Dubey, Mukesh; Tzelepis, Georgios; Menexes, Georgios; Papadakis, Emmanouil N; Karlsson, Magnus; Lagopodi, Anastasia L; Jensen, Dan Funck
2016-05-01
This study was carried out to assess the compatibility of the biocontrol fungus Clonostachys rosea IK726 with the phenazine-producing Pseudomonas chlororaphis ToZa7 or with the prodigiosin-producing Serratia rubidaea S55 against Fusarium oxysporum f. sp. radicis-lycopersici. The pathogen was inhibited by both strains in vitro, whereas C. rosea displayed high tolerance to S. rubidaea but not to P. chlororaphis. We hypothesized that this could be attributed to the ATP-binding cassette (ABC) proteins. The results of the reverse transcription quantitative PCR showed an induction of seven genes (abcB1, abcB20, abcB26, abcC12, abcC12, abcG8 and abcG25) from subfamilies B, C and G. In planta experiments showed a significant reduction in foot and root rot on tomato plants inoculated with C. rosea and P. chlororaphis. This study demonstrates the potential for combining different biocontrol agents and suggests an involvement of ABC transporters in secondary metabolite tolerance in C. rosea.
Massatti, Rob; Knowles, L Lacey
2016-08-01
Deterministic processes may uniquely affect codistributed species' phylogeographic patterns such that discordant genetic variation among taxa is predicted. Yet, explicitly testing expectations of genomic discordance in a statistical framework remains challenging. Here, we construct spatially and temporally dynamic models to investigate the hypothesized effect of microhabitat preferences on the permeability of glaciated regions to gene flow in two closely related montane species. Utilizing environmental niche models from the Last Glacial Maximum and the present to inform demographic models of changes in habitat suitability over time, we evaluate the relative probabilities of two alternative models using approximate Bayesian computation (ABC) in which glaciated regions are either (i) permeable or (ii) a barrier to gene flow. Results based on the fit of the empirical data to data sets simulated using a spatially explicit coalescent under alternative models indicate that genomic data are consistent with predictions about the hypothesized role of microhabitat in generating discordant patterns of genetic variation among the taxa. Specifically, a model in which glaciated areas acted as a barrier was much more probable based on patterns of genomic variation in Carex nova, a wet-adapted species. However, in the dry-adapted Carex chalciolepis, the permeable model was more probable, although the difference in the support of the models was small. This work highlights how statistical inferences can be used to distinguish deterministic processes that are expected to result in discordant genomic patterns among species, including species-specific responses to climate change. © 2016 John Wiley & Sons Ltd.
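ABC model choice of the kind used here reduces, in its simplest rejection form, to counting acceptances per model under a uniform model prior. The sketch below caricatures the two demographic models as two summary-statistic distributions; in the actual study each draw would run a spatially explicit coalescent simulation, and the numbers here are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-ins for the two demographic models: a "barrier" model yields
# higher between-population divergence (the summary statistic) than a
# "permeable" model.
def simulate(model):
    if model == "barrier":
        return rng.normal(0.25, 0.05)   # e.g. an Fst-like divergence summary
    return rng.normal(0.10, 0.05)

s_obs = 0.22                            # hypothetical observed summary
eps, n = 0.02, 200_000
counts = {"barrier": 0, "permeable": 0}
for _ in range(n):
    m = "barrier" if rng.random() < 0.5 else "permeable"  # uniform model prior
    if abs(simulate(m) - s_obs) < eps:
        counts[m] += 1

total = sum(counts.values())
for m, c in counts.items():
    print(m, c / total)                 # approximate posterior model probabilities
```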
Hernández, Damir; Casane, Didier; Chevalier-Monteagudo, Pedro; Bernatchez, Louis; García-Machado, Erik
2016-01-01
Consistent with the limited dispersal capacity of most troglobitic animals, almost all Lucifuga cavefish species have very narrow geographic distributions in Cuba. However, one species, L. dentata, has a wide but disjointed distribution over 300 km in the west of the island. In order to estimate the relative roles of vicariance and dispersal in the unexpected L. dentata distribution, we obtained partial sequences of the mitochondrial DNA (mtDNA) cytochrome b (cytb) gene and control region (CR), and then applied Approximate Bayesian Computation (ABC) based on the identification of five genetically and geographically congruent groups of populations. The process that best explains the distribution of genetic diversity in this species is sequential range expansion from east Matanzas to the western Pinar del Río provinces, followed by isolation of groups of populations. We found relatively high haplotype diversity and low nucleotide diversity in all but the Havana group, which has high values for both diversity parameters, suggesting that this group has been demographically stable over time. For two groups of populations (Cayuco and Bolondrón), the mismatch distribution analyses suggest past demographic expansion. In the case of the Cayuco region, the star-like relationships of haplotypes in the network suggest a recent founding event, congruent with other evidence indicating that this is the most recently colonized region. Overall, the results suggest that a combination of habitat availability, temporal interconnections, and possibly the biological properties of this species may have enabled its dispersal and range expansion compared to other species of the genus, which are more geographically restricted.
Zhao, Yujuan; Yin, Genshen; Pan, Yuezhi; Gong, Xun
2018-01-01
Understanding the processes of divergence and speciation is a major task for biodiversity research and may offer clearer insight into the mechanisms generating biological diversity. Here, we employ an integrative approach to explore genetic and ecological differentiation of Leucomeris decora and Nouelia insignis, which are distributed allopatrically on the two sides of the 'Tanaka Line' biogeographic boundary in Southwest China. We addressed these questions using ten low-copy nuclear genes and nine plastid DNA regions sequenced among individuals sampled from 28 populations across their geographic ranges in China. Phylogenetic and coalescent-based population genetic analyses, an approximate Bayesian computation (ABC) framework, and ecological niche models (ENMs) were conducted. We identified a closer maternal-lineage phylogenetic relationship between L. decora and N. insignis than between L. decora and the congeneric Leucomeris spectabilis. A deep divergence between the two species was observed, occurring at the boundary between the late Pliocene and early Pleistocene. However, significant chloroplast DNA gene flow was also detected between marginal populations of L. decora and N. insignis. Niche models and statistical analyses showed significant ecological differentiation, and two of the ten nuclear loci may be under divergent selection. These integrative results imply that the climatic shift from the Pliocene to the Pleistocene may have been the prominent factor in the divergence of L. decora and N. insignis, and that population expansion after divergence may have given rise to chloroplast DNA introgression. The divergence was maintained by differential selection in the face of gene flow.
Liu, Bingbing; Abbott, Richard J; Lu, Zhiqiang; Tian, Bin; Liu, Jianquan
2014-06-01
Despite the well-known effects that Quaternary climate oscillations had on shaping intraspecific diversity, their role in driving homoploid hybrid speciation is less clear. Here, we examine their importance in the putative homoploid hybrid origin and evolution of Ostryopsis intermedia, a diploid species occurring in the Qinghai-Tibet Plateau (QTP), a biodiversity hotspot. We investigated interspecific relationships between this species and its only other congeners, O. davidiana and O. nobilis, based on four sets of nuclear and chloroplast population genetic data and tested alternative speciation hypotheses. All nuclear data distinguished the three species clearly and supported a close relationship between O. intermedia and the disjunctly distributed O. davidiana. Chloroplast DNA sequence variation identified two tentative lineages, which distinguished O. intermedia from O. davidiana; however, both were present in O. nobilis. Admixture analyses of genetic polymorphisms at 20 SSR loci and sequence variation at 11 nuclear loci and approximate Bayesian computation (ABC) tests supported the hypothesis that O. intermedia originated by homoploid hybrid speciation from O. davidiana and O. nobilis. We further estimated that O. davidiana and O. nobilis diverged 6-11 Ma, while O. intermedia originated 0.5-1.2 Ma when O. davidiana is believed to have migrated southward, contacted and hybridized with O. nobilis possibly during the largest Quaternary glaciation that occurred in this region. Our findings highlight the importance of Quaternary climate change in the QTP in causing hybrid speciation in this important biodiversity hotspot. © 2014 John Wiley & Sons Ltd.
Mercière, Maxime; Boulord, Romain; Carasco-Lacombe, Catherine; Klopp, Christophe; Lee, Yang-Ping; Tan, Joon-Sheong; Syed Alwee, Sharifah S R; Zaremski, Alba; De Franqueville, Hubert; Breton, Frédéric; Camus-Kulandaivelu, Létizia
Wood rot fungi form one of the main classes of phytopathogenic fungus. The group includes many species, but has remained poorly studied. Many species belonging to the Ganoderma genus are well known for causing decay in a wide range of tree species around the world. Ganoderma boninense, causal agent of oil palm basal stem rot, is responsible for considerable yield losses in Southeast Asian oil palm plantations. In a large-scale sampling operation, 357 sporophores were collected from oil palm plantations spread over peninsular Malaysia and Sumatra and genotyped using 11 SSR markers. The genotyping of these samples made it possible to investigate the population structure and demographic history of G. boninense across the oldest known area of interaction between oil palm and G. boninense. Results show that G. boninense possesses a high degree of genetic diversity and no detectable genetic structure at the scale of Sumatra and peninsular Malaysia. The fact that few duplicate genotypes were found in several studies including this one supports the hypothesis of spore dispersal in the spread of G. boninense. Meanwhile, spatial autocorrelation analysis shows that G. boninense is able to disperse across both short and long distances. These results bring new insight into mechanisms by which G. boninense spreads in oil palm plantations. Finally, the use of approximate Bayesian computation (ABC) modelling indicates that G. boninense has undergone a demographic expansion in the past, probably before the oil palm was introduced into Southeast Asia. Copyright © 2017 British Mycological Society. Published by Elsevier Ltd. All rights reserved.
Demographic history of a recent invasion of house mice on the isolated Island of Gough.
Gray, Melissa M; Wegmann, Daniel; Haasl, Ryan J; White, Michael A; Gabriel, Sofia I; Searle, Jeremy B; Cuthbert, Richard J; Ryan, Peter G; Payseur, Bret A
2014-04-01
Island populations provide natural laboratories for studying key contributors to evolutionary change, including natural selection, population size and the colonization of new environments. The demographic histories of island populations can be reconstructed from patterns of genetic diversity. House mice (Mus musculus) inhabit islands throughout the globe, making them an attractive system for studying island colonization from a genetic perspective. Gough Island, in the central South Atlantic Ocean, is one of the remotest islands in the world. House mice were introduced to Gough Island by sealers during the 19th century and display unusual phenotypes, including exceptionally large body size and carnivorous feeding behaviour. We describe genetic variation in Gough Island mice using mitochondrial sequences, nuclear sequences and microsatellites. Phylogenetic analysis of mitochondrial sequences suggested that Gough Island mice belong to Mus musculus domesticus, with the maternal lineage possibly originating in England or France. Cluster analyses of microsatellites revealed genetic membership for Gough Island mice in multiple coastal populations in Western Europe, suggesting admixed ancestry. Gough Island mice showed substantial reductions in mitochondrial and nuclear sequence variation and weak reductions in microsatellite diversity compared with Western European populations, consistent with a population bottleneck. Approximate Bayesian computation (ABC) estimated that mice recently colonized Gough Island (~100 years ago) and experienced a 98% reduction in population size followed by a rapid expansion. Our results indicate that the unusual phenotypes of Gough Island mice evolved rapidly, positioning these mice as useful models for understanding rapid phenotypic evolution. © 2014 John Wiley & Sons Ltd.
Predicting uncertainty in future marine ice sheet volume using Bayesian statistical methods
NASA Astrophysics Data System (ADS)
Davis, A. D.
2015-12-01
The marine ice sheet instability can trigger rapid retreat of marine ice streams. Recent observations suggest that marine ice systems in West Antarctica have begun retreating. However, unknown ice dynamics, computationally intensive mathematical models, and uncertain parameters in these models make predicting retreat rate and ice volume difficult. In this work, we fuse current observational data with ice stream/shelf models to develop probabilistic predictions of future grounded ice sheet volume. Given observational data (e.g., thickness, surface elevation, and velocity) and a forward model that relates uncertain parameters (e.g., basal friction and basal topography) to these observations, we use a Bayesian framework to define a posterior distribution over the parameters. A stochastic predictive model then propagates uncertainties in these parameters to uncertainty in a particular quantity of interest (QoI), here the volume of grounded ice at a specified future time. While the Bayesian approach can in principle characterize the posterior predictive distribution of the QoI, the computational cost of both the forward and predictive models makes this effort prohibitively expensive. To tackle this challenge, we introduce a new Markov chain Monte Carlo method that constructs convergent approximations of the QoI target density in an online fashion, yielding accurate characterizations of future ice sheet volume at significantly reduced computational cost. Our second goal is to attribute uncertainty in these Bayesian predictions to uncertainties in particular parameters. Doing so can help target data collection, for the purpose of constraining the parameters that contribute most strongly to uncertainty in the future volume of grounded ice. For instance, smaller uncertainties in parameters to which the QoI is highly sensitive may account for more variability in the prediction than larger uncertainties in parameters to which the QoI is less sensitive. We use global sensitivity analysis to help answer this question, and make the computation of sensitivity indices computationally tractable using a combination of polynomial chaos and Monte Carlo techniques.
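The basic Bayesian workflow that the abstract builds on, sampling the parameter posterior and pushing samples through a predictive model to get a QoI distribution, can be sketched with a plain Metropolis sampler. The paper's contribution is a far more efficient online MCMC; everything below, including the toy forward and predictive models, is illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy inverse problem: observe d = G(theta) + noise, infer theta, then
# push posterior samples through a predictive model to get the QoI.
theta_true = 1.3
d_obs = theta_true**2 + rng.normal(0, 0.1, size=5)   # forward model G(theta) = theta^2

def log_post(theta):
    # Gaussian likelihood (sd 0.1) plus a standard normal prior on theta.
    return -0.5 * np.sum((d_obs - theta**2) ** 2) / 0.1**2 - 0.5 * theta**2

samples, theta = [], 1.0
for _ in range(20_000):
    prop = theta + 0.1 * rng.normal()                # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)

# Illustrative predictive model mapping the parameter to an "ice volume" QoI.
qoi = np.exp(-0.5 * np.array(samples[5000:]))
print("QoI mean ± sd:", qoi.mean(), qoi.std())
```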
NASA Astrophysics Data System (ADS)
Hwang, L.; Kellogg, L. H.
2017-12-01
Curation of software promotes discoverability and accessibility and works hand in hand with scholarly citation to ascribe value to, and provide recognition for, software development. To meet this challenge, the Computational Infrastructure for Geodynamics (CIG) maintains a community repository built on custom and open tools to promote discovery, access, identification, credit, and provenance of research software for the geodynamics community. CIG (geodynamics.org) originated from recognition of the tremendous effort required to develop sound software and the need to reduce duplication of effort and to sustain community codes. CIG curates software across 6 domains and has developed and follows software best practices that include establishing test cases, documentation, and a citable publication for each software package. CIG software landing web pages provide access to current and past releases; many are also accessible through the CIG community repository on github. CIG has now developed abc (attribution builder for citation) to enable software users to give credit to software developers. abc uses zenodo as an archive and as the mechanism to obtain a unique identifier (DOI) for scientific software. To assemble the metadata, we searched each software package's documentation and research publications and then asked the primary developers to verify the result. In this process, we have learned that each development community approaches software attribution differently. The metadata gathered are based on guidelines established by groups such as FORCE11 and OntoSoft. The rollout of abc is gradual, as developers are forward-looking and rarely willing to go back and archive prior releases in zenodo. Going forward, all actively developed packages will use the zenodo and github integration to automate the archival process when a new release is issued. How to handle legacy software, multi-authored libraries, and assigning roles to software remain open issues.
Bayesian analysis of rare events
DOE Office of Scientific and Technical Information (OSTI.GOV)
Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
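The rejection-sampling view that BUS reinterprets can be stated in a few lines: draw from the prior and accept with probability L(θ)/c, where c bounds the likelihood; accepted draws are posterior samples. The sketch below uses a toy normal-data problem with all numbers illustrative; the actual BUS machinery replaces this loop with an equivalent structural reliability problem solved by FORM, IS, or SuS.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

data = np.array([1.8, 2.1, 1.6])            # illustrative observations, sd 0.5

def likelihood(theta):
    return np.prod(norm.pdf(data, loc=theta, scale=0.5))

# c must bound L; for iid normal data, L is maximized at the sample mean.
c = likelihood(data.mean())

posterior = []
while len(posterior) < 5000:
    theta = rng.normal(0.0, 2.0)            # prior N(0, 2^2)
    if rng.random() * c < likelihood(theta):  # accept with probability L/c
        posterior.append(theta)

print("posterior mean:", np.mean(posterior))
```

The acceptance event {u · c < L(θ)} is exactly the kind of "failure domain" that reliability methods are built to handle, which is what lets BUS reuse them.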
The visual system’s internal model of the world
Lee, Tai Sing
2015-01-01
The Bayesian paradigm has provided a useful conceptual theory for understanding perceptual computation in the brain. While the detailed neural mechanisms of Bayesian inference are not fully understood, recent computational and neurophysiological work has illuminated the underlying computational principles and representational architecture. The fundamental insights are that the visual system is organized as a modular hierarchy to encode an internal model of the world, and that perception is realized by statistical inference based on such an internal model. In this paper, I will discuss and analyze the varieties of representational schemes of these internal models and how they might be used to perform learning and inference. I will argue for a unified theoretical framework for relating the internal models to the observed neural phenomena and mechanisms in the visual cortex. PMID:26566294
Bayesian Research at the NASA Ames Research Center,Computational Sciences Division
NASA Technical Reports Server (NTRS)
Morris, Robin D.
2003-01-01
NASA Ames Research Center is one of NASA's oldest centers, having started out as part of the National Advisory Committee for Aeronautics (NACA). The site, about 40 miles south of San Francisco, still houses many wind tunnels and other aviation-related departments. In recent years, with the growing realization that space exploration is heavily dependent on computing and data analysis, its focus has turned more towards Information Technology, and the Computational Sciences Division has expanded rapidly as a result. In this article, I will give a brief overview of some of the past and present projects with a Bayesian content. Much more than is described here goes on within the Division. The web pages at http://ic.arc.nasa.gov give more information on these and the other Division projects.
NASA Astrophysics Data System (ADS)
Elshall, A. S.; Ye, M.; Niu, G. Y.; Barron-Gafford, G.
2016-12-01
Bayesian multimodel inference is increasingly being used in hydrology. Estimating Bayesian model evidence (BME) is of central importance in many Bayesian multimodel analyses such as Bayesian model averaging and model selection. BME is the overall probability of the model in reproducing the data, accounting for the trade-off between goodness-of-fit and model complexity. Yet estimating BME is challenging, especially for high-dimensional problems with complex sampling spaces. Estimating BME using Monte Carlo numerical methods is preferred, as these methods yield higher accuracy than semi-analytical solutions (e.g., Laplace approximations, BIC, KIC, etc.). However, numerical methods are prone to numerical demons arising from underflow and round-off errors. Although a few studies have alluded to this issue, to our knowledge this is the first study that illustrates these numerical demons. We show that finite-precision arithmetic can impose a threshold on likelihood values and on the Metropolis acceptance ratio, which results in trimming parameter regions (when the likelihood function is smaller than the smallest floating-point number a computer can represent) and in corrupting the empirical measures of the random states of the MCMC sampler (when using the log-likelihood function). We consider two of the most powerful numerical estimators of BME, the path sampling method of thermodynamic integration (TI) and the importance sampling method of steppingstone sampling (SS). We also consider the two most widely used numerical estimators, the prior sampling arithmetic mean (AM) and the posterior sampling harmonic mean (HM). We investigate the vulnerability of these four estimators to the numerical demons. Interestingly, the most biased estimator, namely the HM, turned out to be the least vulnerable. While it is generally assumed that AM is a bias-free estimator that will approximate the true BME given sufficient computational effort, we show that arithmetic underflow can hamper AM, resulting in severe underestimation of BME. TI turned out to be the most vulnerable, resulting in BME overestimation. Finally, we show how SS can be made largely invariant to rounding errors, yielding the most accurate and computationally efficient results. These results are useful for Monte Carlo simulations that estimate Bayesian model evidence.
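The underflow problem and its standard remedy are easy to demonstrate: work in log space and use the log-sum-exp trick. The sketch below applies it to the arithmetic-mean (AM) and harmonic-mean (HM) estimators of log BME; the log-likelihood values are synthetic stand-ins for prior and posterior samples.

```python
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(9)

# Log-likelihoods this small underflow to zero in plain arithmetic.
log_L = rng.normal(-1e4, 50, size=10_000)   # pretend prior-sample log-likelihoods
print(np.exp(log_L).mean())                 # -> 0.0: the naive AM underflows

# Stable prior-sampling arithmetic-mean estimator of log BME:
#   log( mean(exp(log_L)) ) = logsumexp(log_L) - log(N)
log_bme_am = logsumexp(log_L) - np.log(len(log_L))
print("log BME (AM):", log_bme_am)

# Stable posterior-sampling harmonic-mean estimator (known to be biased):
#   log( N / sum(exp(-log_L)) ) = log(N) - logsumexp(-log_L)
log_L_post = rng.normal(-1e4 + 100, 20, size=10_000)  # pretend posterior draws
log_bme_hm = np.log(len(log_L_post)) - logsumexp(-log_L_post)
print("log BME (HM):", log_bme_hm)
```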
As-built design specification for proportion estimate software subsystem
NASA Technical Reports Server (NTRS)
Obrien, S. (Principal Investigator)
1980-01-01
The Proportion Estimate Processor evaluates four estimation techniques in order to get an improved estimate of the proportion of a scene that is planted in a selected crop. The four techniques to be evaluated were provided by the techniques development section and are: (1) random sampling; (2) proportional allocation, relative count estimate; (3) proportional allocation, Bayesian estimate; and (4) sequential Bayesian allocation. The user is given two options for computation of the estimated mean square error. These are referred to as the cluster calculation option and the segment calculation option. The software for the Proportion Estimate Processor is operational on the IBM 3031 computer.
NASA Astrophysics Data System (ADS)
Walker, David M.; Allingham, David; Lee, Heung Wing Joseph; Small, Michael
2010-02-01
Small world network models have been effective in capturing the variable behaviour of reported case data of the SARS coronavirus outbreak in Hong Kong during 2003. Simulations of these models have previously been realized using informed “guesses” of the proposed model parameters and tested for consistency with the reported data by surrogate analysis. In this paper we attempt to provide statistically rigorous parameter distributions using Approximate Bayesian Computation sampling methods. We find that such sampling schemes are a useful framework for fitting parameters of stochastic small world network models where simulation of the system is straightforward but expressing a likelihood is cumbersome.
A Massively Parallel Bayesian Approach to Planetary Protection Trajectory Analysis and Design
NASA Technical Reports Server (NTRS)
Wallace, Mark S.
2015-01-01
The NASA Planetary Protection Office has levied a requirement that the upper stage of future planetary launches have a less than 10⁻⁴ chance of impacting Mars within 50 years after launch. A brute-force approach requires a decade of computer time to demonstrate compliance. By using a Bayesian approach and taking advantage of the demonstrated reliability of the upper stage, the required number of fifty-year propagations can be massively reduced. By spreading the remaining embarrassingly parallel Monte Carlo simulations across multiple computers, compliance can be demonstrated in a reasonable time frame. The method used is described here.
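One way such a Bayesian argument can run, shown purely as an illustration (the flight record, prior, and conditional impact probability below are hypothetical, not the paper's numbers): treat the stage failure probability as a Beta-distributed unknown updated by the demonstrated success record, and bound the impact probability by the product of a credible upper bound on failure and a Monte Carlo estimate of the conditional impact probability.

```python
from scipy.stats import beta

# With a uniform Beta(1, 1) prior and n successful flights with zero
# relevant failures, the posterior on the failure probability p is
# Beta(1, n + 1). All figures are hypothetical.
n_successes = 300
post = beta(1, n_successes + 1)
p95 = post.ppf(0.95)                 # 95% credible upper bound on p
print("95% upper bound on failure probability:", p95)

# Only failed stages can impact Mars; if a failure leads to impact with
# conditional probability q (estimated by Monte Carlo), then
# P(impact) <= p95 * q can be checked against the 1e-4 requirement.
q = 0.05                             # hypothetical conditional impact probability
print("impact probability bound:", p95 * q)
```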
Valence-Dependent Belief Updating: Computational Validation
Kuzmanovic, Bojana; Rigoux, Lionel
2017-01-01
People tend to update beliefs about their future outcomes in a valence-dependent way: they are likely to incorporate good news and to neglect bad news. However, belief formation is a complex process which depends not only on motivational factors such as the desire for favorable conclusions, but also on multiple cognitive variables such as prior beliefs, knowledge about personal vulnerabilities and resources, and the size of the probabilities and estimation errors. Thus, we applied computational modeling in order to test for valence-induced biases in updating while formally controlling for relevant cognitive factors. We compared biased and unbiased Bayesian models of belief updating, and specified alternative models based on reinforcement learning. The experiment consisted of 80 trials with 80 different adverse future life events. In each trial, participants estimated the base rate of one of these events and estimated their own risk of experiencing the event before and after being confronted with the actual base rate. Belief updates corresponded to the difference between the two self-risk estimates. Valence-dependent updating was assessed by comparing trials with good news (better-than-expected base rates) with trials with bad news (worse-than-expected base rates). After receiving bad relative to good news, participants' updates were smaller and deviated more strongly from rational Bayesian predictions, indicating a valence-induced bias. Model comparison revealed that the biased (i.e., optimistic) Bayesian model of belief updating better accounted for data than the unbiased (i.e., rational) Bayesian model, confirming that the valence of the new information influenced the amount of updating. Moreover, alternative computational modeling based on reinforcement learning demonstrated higher learning rates for good than for bad news, as well as a moderating role of personal knowledge. Finally, in this specific experimental context, the approach based on reinforcement learning was superior to the Bayesian approach. The computational validation of valence-dependent belief updating represents a novel support for a genuine optimism bias in human belief formation. Moreover, the precise control of relevant cognitive variables justifies the conclusion that the motivation to adopt the most favorable self-referential conclusions biases human judgments. PMID:28706499
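The reinforcement-learning component can be caricatured in a few lines: a risk estimate is moved toward the estimation error with a learning rate that depends on the valence of the news. The sketch below is schematic, with made-up learning rates, not the authors' fitted model.

```python
# Valence-dependent update of a self-risk belief: separate learning rates
# for good news (base rate better than estimated) and bad news.
alpha_good, alpha_bad = 0.6, 0.3   # illustrative asymmetric learning rates

def update(self_risk, estimated_base, actual_base):
    error = actual_base - estimated_base
    # For adverse events, a lower-than-estimated base rate is good news.
    alpha = alpha_good if error < 0 else alpha_bad
    return self_risk + alpha * error

print(update(self_risk=0.30, estimated_base=0.25, actual_base=0.15))  # good news
print(update(self_risk=0.30, estimated_base=0.25, actual_base=0.35))  # bad news
```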
50 CFR 622.60 - Adjustment of management measures.
Code of Federal Regulations, 2013 CFR
2013-10-01
... markings and identification, allowable biological catch (ABC) and ABC control rules, rebuilding plans, sale... harvested shrimp (maintaining shrimp in whole condition, use as bait), target effort and fishing mortality... identification, vessel markings and identification, ABC and ABC control rules, rebuilding plans, sale and...
50 CFR 622.60 - Adjustment of management measures.
Code of Federal Regulations, 2014 CFR
2014-10-01
... markings and identification, allowable biological catch (ABC) and ABC control rules, rebuilding plans, sale... harvested shrimp (maintaining shrimp in whole condition, use as bait), target effort and fishing mortality... identification, vessel markings and identification, ABC and ABC control rules, rebuilding plans, sale and...
Heuristics as Bayesian inference under extreme priors.
Parpart, Paula; Jones, Matt; Love, Bradley C
2018-05-01
Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
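The "less-is-more" effect itself is easy to reproduce in simulation: in an environment whose true cue weights match the tallying prior, unit weights beat coefficients estimated from a small sample. The sketch below uses hypothetical sample sizes and noise levels, and the environment is deliberately constructed to match the tallying prior.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)

n_train, n_test, k = 10, 1000, 5
w_true = np.ones(k)                     # all cues equally predictive
X = rng.normal(size=(n_train + n_test, k))
y = X @ w_true + rng.normal(scale=2.0, size=n_train + n_test)

Xtr, ytr, Xte, yte = X[:n_train], y[:n_train], X[n_train:], y[n_train:]
reg = LinearRegression().fit(Xtr, ytr)  # fitted weights from a tiny sample

mse_reg = np.mean((yte - reg.predict(Xte)) ** 2)
mse_tally = np.mean((yte - Xte.sum(axis=1)) ** 2)   # unit weights, no fitting
print(f"regression MSE: {mse_reg:.2f}, tallying MSE: {mse_tally:.2f}")
```

In the paper's terms, tallying here behaves like Bayesian regression under an infinitely strong prior that happens to be correct, which is why ignoring the training data costs nothing.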
Statistical Surrogate Modeling of Atmospheric Dispersion Events Using Bayesian Adaptive Splines
NASA Astrophysics Data System (ADS)
Francom, D.; Sansó, B.; Bulaevskaya, V.; Lucas, D. D.
2016-12-01
Uncertainty in the inputs of complex computer models, including atmospheric dispersion and transport codes, is often assessed via statistical surrogate models. Surrogate models are computationally efficient statistical approximations of expensive computer models that enable uncertainty analysis. We introduce Bayesian adaptive spline methods for producing surrogate models that capture the major spatiotemporal patterns of the parent model, while satisfying all the necessities of flexibility, accuracy and computational feasibility. We present novel methodological and computational approaches motivated by a controlled atmospheric tracer release experiment conducted at the Diablo Canyon nuclear power plant in California. Traditional methods for building statistical surrogate models often do not scale well to experiments with large amounts of data. Our approach is well suited to experiments involving large numbers of model inputs, large numbers of simulations, and functional output for each simulation. Our approach allows us to perform global sensitivity analysis with ease. We also present an approach to calibration of simulators using field data.
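A stripped-down version of the surrogate idea can be built from a fixed spline basis plus a Bayesian linear model, as in the sketch below; the adaptive knot placement that gives Bayesian adaptive splines their flexibility is omitted, and the one-input "simulator" is a stand-in for an expensive dispersion code. All names and settings are illustrative.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(6)

def expensive_simulator(x):
    """Stand-in for a dispersion code: scalar input -> scalar output."""
    return np.sin(3 * x) + 0.5 * x

# A modest design of simulator runs.
x_train = rng.uniform(0, 2, 60)[:, None]
y_train = expensive_simulator(x_train).ravel()

# Fixed-knot spline basis + Bayesian linear model over the basis weights.
surrogate = make_pipeline(SplineTransformer(n_knots=10, degree=3), BayesianRidge())
surrogate.fit(x_train, y_train)

# The cheap surrogate now supports dense sensitivity/uncertainty studies.
x_new = np.linspace(0, 2, 5)[:, None]
print(surrogate.predict(x_new))
print(expensive_simulator(x_new).ravel())
```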
Bayesian just-so stories in psychology and neuroscience.
Bowers, Jeffrey S; Davis, Colin J
2012-05-01
According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make 3 main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak. This weakness relates to the many arbitrary ways that priors, likelihoods, and utility functions can be altered in order to account for the data that are obtained, making the models unfalsifiable. It further relates to the fact that Bayesian theories are rarely better at predicting data compared with alternative (and simpler) non-Bayesian theories. Second, we show that the empirical evidence for Bayesian theories in neuroscience is weaker still. There are impressive mathematical analyses showing how populations of neurons could compute in a Bayesian manner but little or no evidence that they do. Third, we challenge the general scientific approach that characterizes Bayesian theorizing in cognitive science. A common premise is that theories in psychology should largely be constrained by a rational analysis of what the mind ought to do. We question this claim and argue that many of the important constraints come from biological, evolutionary, and processing (algorithmic) considerations that have no adaptive relevance to the problem per se. In our view, these factors have contributed to the development of many Bayesian "just so" stories in psychology and neuroscience; that is, mathematical analyses of cognition that can be used to explain almost any behavior as optimal. 2012 APA, all rights reserved.
Development of a practical costing method for hospitals.
Cao, Pengyu; Toyabe, Shin-Ichi; Akazawa, Kouhei
2006-03-01
To realize effective cost control, a practical and accurate cost accounting system is indispensable in hospitals. Among traditional cost accounting systems, volume-based costing (VBC) is the most popular method. In this method, the indirect costs are allocated to each cost object (services or units of a hospital) using a single indicator called a cost driver (e.g., labor hours, revenues or the number of patients). However, this method often yields rough and inaccurate results. The activity-based costing (ABC) method, introduced in the mid-1990s, can provide more accurate results. With the ABC method, all events or transactions that cause costs are recognized as "activities", and a specific cost driver is prepared for each activity. Finally, the costs of activities are allocated to cost objects by the corresponding cost driver. However, it is much more complex and costly than other traditional cost accounting methods because the data collection for cost drivers is not always easy. In this study, we developed a simplified ABC (S-ABC) costing method to reduce the workload of ABC costing by reducing the number of cost drivers used in the ABC method. Using the S-ABC method, we estimated the cost of laboratory tests and obtained results similar in accuracy to those of the ABC method (largest difference: 2.64%), while reducing the seven cost drivers used in the ABC method to four. Moreover, we performed an evaluation using other sample data from the physiological laboratory department to certify the effectiveness of this new method. In conclusion, the S-ABC method provides two advantages in comparison to the VBC and ABC methods: (1) it can obtain accurate results, and (2) it is simpler to perform. Once we reduce the number of cost drivers by applying the proposed S-ABC method to the data for the ABC method, we can easily perform the cost accounting using few cost drivers after the second round of costing.
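As a toy illustration of the driver-based allocation step that both ABC and S-ABC share, the following sketch allocates pooled activity costs to two hypothetical services in proportion to their driver volumes; all activity names and figures are invented:

```python
# Hypothetical activity-based costing sketch: indirect costs are pooled by
# activity and assigned to services via each activity's cost driver.
activity_costs = {"specimen_handling": 30000.0,
                  "instrument_runs": 50000.0,
                  "reporting": 20000.0}

# Driver volumes consumed by each service (all figures invented).
driver_use = {
    "blood_panel": {"specimen_handling": 800, "instrument_runs": 500, "reporting": 700},
    "urinalysis":  {"specimen_handling": 200, "instrument_runs": 100, "reporting": 300},
}

# Total driver volume per activity, then proportional allocation per service.
totals = {a: sum(s[a] for s in driver_use.values()) for a in activity_costs}
for service, use in driver_use.items():
    cost = sum(activity_costs[a] * use[a] / totals[a] for a in activity_costs)
    print(service, round(cost, 2))
```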
Thermo-responsive magnetic liposomes for hyperthermia-triggered local drug delivery.
Dai, Min; Wu, Cong; Fang, Hong-Ming; Li, Li; Yan, Jia-Bao; Zeng, Dan-Lin; Zou, Tao
2017-06-01
We prepared and characterised thermo-responsive magnetic liposomes, which were designed to combine features of magnetic targeting and thermo-responsive controlled release for hyperthermia-triggered local drug delivery. The particle size and zeta-potential of the thermo-responsive magnetic ammonium bicarbonate (MagABC) liposomes were about 210 nm and -14 mV, respectively. The MagABC liposomes showed encapsulation efficiencies of about 15% and 82% for magnetic nanoparticles (mean crystallite size 12 nm) and doxorubicin (DOX), respectively. The morphology of the MagABC liposomes was visualised using transmission electron microscopy (TEM). The MagABC liposomes showed the desired thermo-responsive release. When physically targeted to tumour cells in culture by a permanent magnetic field, the MagABC liposomes yielded a substantial increase in intracellular accumulation of DOX as compared to non-magnetic ammonium bicarbonate (ABC) liposomes. This resulted in a parallel increase in cytotoxicity for DOX-loaded MagABC liposomes over DOX-loaded ABC liposomes in tumour cells.
The Centre of High-Performance Scientific Computing, Geoverbund, ABC/J - Geosciences enabled by HPSC
NASA Astrophysics Data System (ADS)
Kollet, Stefan; Görgen, Klaus; Vereecken, Harry; Gasper, Fabian; Hendricks-Franssen, Harrie-Jan; Keune, Jessica; Kulkarni, Ketan; Kurtz, Wolfgang; Sharples, Wendy; Shrestha, Prabhakar; Simmer, Clemens; Sulis, Mauro; Vanderborght, Jan
2016-04-01
The Centre of High-Performance Scientific Computing (HPSC TerrSys) was founded in 2011 to establish a centre of competence in high-performance scientific computing in terrestrial systems and the geosciences, enabling fundamental and applied geoscientific research in the Geoverbund ABC/J (the geoscientific research alliance of the Universities of Aachen, Cologne, Bonn and the Research Centre Jülich, Germany). The specific goals of HPSC TerrSys are to achieve relevance at the national and international level in (i) the development and application of HPSC technologies in the geoscientific community; (ii) student education; (iii) HPSC services and support, also for the wider geoscientific community; and (iv) the industry and public sectors via, e.g., useful applications and data products. A key feature of HPSC TerrSys is the Simulation Laboratory Terrestrial Systems, which is located at the Jülich Supercomputing Centre (JSC) and provides extensive capabilities with respect to porting, profiling, tuning and performance monitoring of geoscientific software in JSC's supercomputing environment. We will present a summary of success stories of HPSC applications, including integrated terrestrial model development, parallel profiling and its application from watersheds to the continent; massively parallel data assimilation using physics-based models and ensemble methods; quasi-operational terrestrial water and energy monitoring; and convection-permitting climate simulations over Europe. The success stories stress the need for formalized education of students in the application of HPSC technologies in the future.
Methods for modeling cytoskeletal and DNA filaments
NASA Astrophysics Data System (ADS)
Andrews, Steven S.
2014-02-01
This review summarizes the models that researchers use to represent the conformations and dynamics of cytoskeletal and DNA filaments. It focuses on models that address individual filaments in continuous space. Conformation models include the freely jointed, Gaussian, angle-biased chain (ABC), and wormlike chain (WLC) models, of which the first three bend at discrete joints and the last bends continuously. Predictions from the WLC model generally agree well with experiment. Dynamics models include the Rouse, Zimm, stiff rod, dynamic WLC, and reptation models, of which the first four apply to isolated filaments and the last to entangled filaments. Experiments show that the dynamic WLC and reptation models are most accurate. They also show that biological filaments typically experience strong hydrodynamic coupling and/or constrained motion. Computer simulation methods that address filament dynamics typically compute filament segment velocities from local forces using the Langevin equation and then integrate these velocities with explicit or implicit methods; the former are more versatile and the latter are more efficient. Much remains to be discovered in biological filament modeling. In particular, filament dynamics in living cells are not well understood, and current computational methods are too slow and not sufficiently versatile. Although primarily a review, this paper also presents new statistical calculations for the ABC and WLC models. Additionally, it corrects several discrepancies in the literature about bending and torsional persistence length definitions, and their relations to flexural and torsional rigidities.
The A + B ⇌ C of Chemical Thermodynamics.
ERIC Educational Resources Information Center
Gerhartl, F. J.
1994-01-01
Basic chemical thermodynamics usually treats non-p,T reactions in a stepmotherly fashion. This paper covers the main aspects of the theoretical principles of reactions (p,T; V,T; p,H; and V,U) and offers results from the ABC computer program, which was designed to show the applicability of equilibrium theory to all types of reaction modes. (PVD)
ERIC Educational Resources Information Center
Perruchet, Pierre; Poulin-Charronnat, Benedicte
2012-01-01
Endress and Mehler (2009) reported that when adult subjects are exposed to an unsegmented artificial language composed from trisyllabic words such as ABX, YBC, and AZC, they are unable to distinguish between these words and what they coined as the "phantom-word" ABC in a subsequent test. This suggests that statistical learning generates knowledge…
The utility of Bayesian predictive probabilities for interim monitoring of clinical trials
Connor, Jason T.; Ayers, Gregory D; Alvarez, JoAnn
2014-01-01
Background: Bayesian predictive probabilities can be used for interim monitoring of clinical trials to estimate the probability of observing a statistically significant treatment effect if the trial were to continue to its predefined maximum sample size. Purpose: We explore settings in which Bayesian predictive probabilities are advantageous for interim monitoring compared to Bayesian posterior probabilities, p-values, conditional power, or group sequential methods. Results: For interim analyses that address prediction hypotheses, such as futility monitoring and efficacy monitoring with lagged outcomes, only predictive probabilities properly account for the amount of data remaining to be observed in a clinical trial and have the flexibility to incorporate additional information via auxiliary variables. Limitations: Computational burdens limit the feasibility of predictive probabilities in many clinical trial settings. The specification of prior distributions brings additional challenges for regulatory approval. Conclusions: The use of Bayesian predictive probabilities enables the choice of logical interim stopping rules that closely align with the clinical decision making process. PMID:24872363
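A minimal sketch of the core computation, assuming a single-arm binomial trial with a Beta(1, 1) prior and invented interim numbers: the predictive probability of success is the chance, averaged over the current posterior and the remaining enrollment, that the final analysis clears its evidence threshold.

```python
import numpy as np
from scipy.stats import beta as beta_dist

rng = np.random.default_rng(1)

# Hypothetical interim data: 12 responses in 30 patients; 70 remain.
a0, b0 = 1.0, 1.0            # Beta(1, 1) prior on the response rate
x, n_obs, n_rem = 12, 30, 70
p0, evid = 0.30, 0.975       # success: final P(rate > p0) > evid

wins, n_sims = 0, 20_000
for _ in range(n_sims):
    theta = rng.beta(a0 + x, b0 + n_obs - x)      # draw from current posterior
    x_fut = rng.binomial(n_rem, theta)            # simulate remaining outcomes
    a = a0 + x + x_fut
    b = b0 + (n_obs - x) + (n_rem - x_fut)
    if 1.0 - beta_dist.cdf(p0, a, b) > evid:      # final posterior evidence
        wins += 1

print("predictive probability of success:", wins / n_sims)
```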
Sironi, Emanuele; Taroni, Franco; Baldinotti, Claudio; Nardi, Cosimo; Norelli, Gian-Aristide; Gallidabino, Matteo; Pinchi, Vilma
2017-11-14
The present study aimed to investigate the performance of a Bayesian method in the evaluation of dental age-related evidence collected by means of a geometrical approximation procedure of the pulp chamber volume. Measurement of this volume was based on three-dimensional cone beam computed tomography images. The Bayesian method was applied by means of a probabilistic graphical model, namely a Bayesian network. Performance of that method was investigated in terms of accuracy and bias of the decisional outcomes. The influence of an informed elicitation of the prior belief of chronological age was also studied by means of a sensitivity analysis. Outcomes in terms of accuracy met standard requirements for forensic adult age estimation. Findings also indicated that the Bayesian method does not show a particular tendency towards under- or overestimation of the age variable. Outcomes of the sensitivity analysis showed that estimation results are improved with a rational elicitation of the prior probabilities of age.
Bayesian inference of a historical bottleneck in a heavily exploited marine mammal.
Hoffman, J I; Grant, S M; Forcada, J; Phillips, C D
2011-10-01
Emerging Bayesian analytical approaches offer increasingly sophisticated means of reconstructing historical population dynamics from genetic data, but have been little applied to scenarios involving demographic bottlenecks. Consequently, we analysed a large mitochondrial and microsatellite dataset from the Antarctic fur seal Arctocephalus gazella, a species subjected to one of the most extreme examples of uncontrolled exploitation in history when it was reduced to the brink of extinction by the sealing industry during the late eighteenth and nineteenth centuries. Classical bottleneck tests, which exploit the fact that rare alleles are rapidly lost during demographic reduction, yielded ambiguous results. In contrast, a strong signal of recent demographic decline was detected using both Bayesian skyline plots and Approximate Bayesian Computation, the latter also allowing derivation of posterior parameter estimates that were remarkably consistent with historical observations. This was achieved using only contemporary samples, further emphasizing the potential of Bayesian approaches to address important problems in conservation and evolutionary biology. © 2011 Blackwell Publishing Ltd.
A Neisseria meningitidis fbpABC Mutant Is Incapable of Using Nonheme Iron for Growth
Khun, Heng H.; Kirby, Shane D.; Lee, B. Craig
1998-01-01
The neisserial fbpABC locus has been proposed to act as an iron-specific ABC transporter system. To confirm this assigned function, we constructed an fbpABC mutant in Neisseria meningitidis by insertional inactivation of fbpABC with a selectable antibiotic marker. The mutant was unable to use iron supplied from human transferrin, human lactoferrin, or iron chelates. However, the use of iron from heme and human hemoglobin was unimpaired. These results support the obligatory participation of fbpABC in neisserial periplasmic iron transport and do not indicate a role for this genetic locus in the heme iron pathway. PMID:9573125
Inference of epidemiological parameters from household stratified data
Walker, James N.; Ross, Joshua V.
2017-01-01
We consider a continuous-time Markov chain model of SIR disease dynamics with two levels of mixing. For this so-called stochastic households model, we provide two methods for inferring the model parameters—governing within-household transmission, recovery, and between-household transmission—from data of the day upon which each individual became infectious and the household in which each infection occurred, as might be available from First Few Hundred studies. Each method is a form of Bayesian Markov Chain Monte Carlo that allows us to calculate a joint posterior distribution for all parameters and hence the household reproduction number and the early growth rate of the epidemic. The first method performs exact Bayesian inference using a standard data-augmentation approach; the second performs approximate Bayesian inference based on a likelihood approximation derived from branching processes. These methods are compared for computational efficiency and posteriors from each are compared. The branching process is shown to be a good approximation and remains computationally efficient as the amount of data is increased. PMID:29045456
BELM: Bayesian extreme learning machine.
Soria-Olivas, Emilio; Gómez-Sanchis, Juan; Martín, José D; Vila-Francés, Joan; Martínez, Marcelino; Magdalena, José R; Serrano, Antonio J
2011-03-01
The theory of the extreme learning machine (ELM) has become very popular in the last few years. ELM is a new approach for learning the parameters of the hidden layers of a multilayer neural network (such as the multilayer perceptron or the radial basis function neural network). Its main advantage is the lower computational cost, which is especially relevant when dealing with many patterns defined in a high-dimensional space. This brief proposes a Bayesian approach to ELM, which presents some advantages over other approaches: it allows the introduction of a priori knowledge; obtains confidence intervals (CIs) without the need for computationally intensive methods, e.g., the bootstrap; and presents high generalization capabilities. Bayesian ELM is benchmarked against classical ELM on several artificial and real datasets that are widely used for the evaluation of machine learning algorithms. The results show that the proposed approach produces competitive accuracy with some additional advantages, namely, automatic production of CIs, reduced probability of model overfitting, and use of a priori knowledge.
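A minimal numpy sketch of the idea, assuming fixed, hand-picked prior and noise precisions rather than the evidence-based tuning a full treatment would use: a random, untrained hidden layer supplies features, and closed-form Bayesian linear regression on those features yields predictions with CIs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data.
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=100)

# ELM-style random hidden layer (weights are drawn once and never trained).
n_hidden = 50
W = rng.normal(0, 1, size=(1, n_hidden))
b = rng.normal(0, 1, size=n_hidden)
hidden = lambda X: np.tanh(X @ W + b)

# Bayesian linear regression on the hidden features (alpha, beta assumed known).
alpha, beta = 1.0, 100.0                 # prior precision, noise precision
H = hidden(X)
S = np.linalg.inv(alpha * np.eye(n_hidden) + beta * H.T @ H)
m = beta * S @ H.T @ y                   # posterior mean of output weights

Xt = np.linspace(-3, 3, 5).reshape(-1, 1)
Ht = hidden(Xt)
mean = Ht @ m
var = 1.0 / beta + np.einsum("ij,jk,ik->i", Ht, S, Ht)   # predictive variance
for x0, mu, ci in zip(Xt[:, 0], mean, 1.96 * np.sqrt(var)):
    print(f"x={x0:+.2f}  pred={mu:+.3f} ± {ci:.3f}")
```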
A Comparison of the β-Substitution Method and a Bayesian Method for Analyzing Left-Censored Data.
Huynh, Tran; Quick, Harrison; Ramachandran, Gurumurthy; Banerjee, Sudipto; Stenzel, Mark; Sandler, Dale P; Engel, Lawrence S; Kwok, Richard K; Blair, Aaron; Stewart, Patricia A
2016-01-01
Classical statistical methods for analyzing exposure data with values below the detection limits are well described in the occupational hygiene literature, but an evaluation of a Bayesian approach for handling such data is currently lacking. Here, we first describe a Bayesian framework for analyzing censored data. We then present the results of a simulation study conducted to compare the β-substitution method with a Bayesian method for exposure datasets drawn from lognormal distributions and mixed lognormal distributions with varying sample sizes, geometric standard deviations (GSDs), and censoring for single and multiple limits of detection. For each set of factors, estimates for the arithmetic mean (AM), geometric mean, GSD, and the 95th percentile (X0.95) of the exposure distribution were obtained. We evaluated the performance of each method using relative bias, the root mean squared error (rMSE), and coverage (the proportion of the computed 95% uncertainty intervals containing the true value). The Bayesian method using non-informative priors and the β-substitution method were generally comparable in bias and rMSE when estimating the AM and GM. For the GSD and the 95th percentile, the Bayesian method with non-informative priors was more biased and had a higher rMSE than the β-substitution method, but use of more informative priors generally improved the Bayesian method's performance, making both the bias and the rMSE more comparable to the β-substitution method. An advantage of the Bayesian method is that it provided estimates of uncertainty for these parameters of interest and good coverage, whereas the β-substitution method only provided estimates of uncertainty for the AM, and coverage was not as consistent. Selection of one or the other method depends on the needs of the practitioner, the availability of prior information, and the distribution characteristics of the measurement data. We suggest the use of Bayesian methods if the practitioner has the computational resources and prior information, as the method would generally provide accurate estimates and also provides the distributions of all of the parameters, which could be useful for making decisions in some applications. © The Author 2015. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
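A sketch of the Bayesian half of such a comparison, assuming a single detection limit, flat priors, and a simple grid posterior (a real analysis would use MCMC and, where available, informative priors); all numbers are invented:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical left-censored lognormal exposure sample.
mu_t, sd_t, lod = 0.0, 0.8, 0.5
x = rng.lognormal(mu_t, sd_t, 50)
cens = x < lod                      # values below the LOD are only flagged

# Grid posterior over (mu, sd) with flat priors: censored points contribute
# the log-CDF at log(LOD) instead of a log-density term.
mus = np.linspace(-1.5, 1.5, 121)
sds = np.linspace(0.2, 2.0, 121)
logx = np.log(x[~cens])
lp = np.array([[norm.logpdf(logx, mu, sd).sum()
                + cens.sum() * norm.logcdf(np.log(lod), mu, sd)
                for sd in sds] for mu in mus])
post = np.exp(lp - lp.max())
post /= post.sum()

# Posterior mean of the arithmetic mean AM = exp(mu + sd^2 / 2).
M, S = np.meshgrid(mus, sds, indexing="ij")
print("posterior mean AM:", np.sum(post * np.exp(M + S ** 2 / 2)))
```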
Object Detection Based on Template Matching through Use of Best-So-Far ABC
2014-01-01
Best-so-far ABC is a modified version of the artificial bee colony (ABC) algorithm used for optimization tasks. This algorithm is one of the swarm intelligence (SI) algorithms proposed in recent literature; reported results demonstrate that the best-so-far ABC can produce higher quality solutions with faster convergence than either the ordinary ABC or the current state-of-the-art ABC-based algorithm. In this work, we aim to apply the best-so-far ABC-based approach to object detection based on template matching, using the difference between the RGB level histograms corresponding to the target object and the template object as the objective function. Results confirm that the proposed method was successful in both detecting objects and optimizing the time used to reach the solution. PMID:24812556
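The objective function itself is easy to sketch; below is a hedged numpy version using normalized per-channel histograms and an L1 difference (the exact distance used in the paper may differ), with a synthetic scene standing in for real images:

```python
import numpy as np

def rgb_hist(img, bins=16):
    """Concatenated per-channel histograms, normalized (img: H x W x 3, uint8)."""
    h = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0] for c in range(3)]
    h = np.concatenate(h).astype(float)
    return h / h.sum()

def objective(scene, template, x, y):
    """Histogram difference between the template and the candidate window."""
    th, tw = template.shape[:2]
    window = scene[y:y + th, x:x + tw]
    return np.abs(rgb_hist(window) - rgb_hist(template)).sum()

# A swarm optimizer such as best-so-far ABC would minimize `objective` over
# candidate (x, y) positions; here we only evaluate one candidate.
rng = np.random.default_rng(0)
scene = rng.integers(0, 256, size=(120, 160, 3), dtype=np.uint8)
template = scene[40:60, 50:80].copy()
print(objective(scene, template, 50, 40))  # ~0 at the true location
```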
Evolution of the cerebellum as a neuronal machine for Bayesian state estimation
NASA Astrophysics Data System (ADS)
Paulin, M. G.
2005-09-01
The cerebellum evolved in association with the electric sense and vestibular sense of the earliest vertebrates. Accurate information provided by these sensory systems would have been essential for precise control of orienting behavior in predation. A simple model shows that individual spikes in electrosensory primary afferent neurons can be interpreted as measurements of prey location. Using this result, I construct a computational neural model in which the spatial distribution of spikes in a secondary electrosensory map forms a Monte Carlo approximation to the Bayesian posterior distribution of prey locations given the sense data. The neural circuit that emerges naturally to perform this task resembles the cerebellar-like hindbrain electrosensory filtering circuitry of sharks and other electrosensory vertebrates. The optimal filtering mechanism can be extended to handle dynamical targets observed from a dynamical platform; that is, to construct an optimal dynamical state estimator using spiking neurons. This may provide a generic model of cerebellar computation. Vertebrate motion-sensing neurons have specific fractional-order dynamical characteristics that allow Bayesian state estimators to be implemented elegantly and efficiently, using simple operations with asynchronous pulses, i.e. spikes. The computational neural models described in this paper represent a novel kind of particle filter, using spikes as particles. The models are specific and make testable predictions about computational mechanisms in cerebellar circuitry, while providing a plausible explanation of cerebellar contributions to aspects of motor control, perception and cognition.
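The "spikes as particles" proposal maps onto the standard bootstrap particle filter; here is a minimal conventional (non-spiking) version for a 1-D drifting target, with all noise levels invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Bootstrap particle filter for a 1-D random-walk target observed in noise.
T, n_particles = 50, 500
q, r = 0.1, 0.5                        # process and observation noise std devs
true = np.cumsum(rng.normal(0, q, T))  # latent target trajectory
obs = true + rng.normal(0, r, T)

particles = rng.normal(0, 1, n_particles)
estimates = []
for z in obs:
    particles += rng.normal(0, q, n_particles)        # predict (random walk)
    w = np.exp(-0.5 * ((z - particles) / r) ** 2)     # Gaussian likelihood
    w /= w.sum()
    estimates.append(np.sum(w * particles))           # posterior-mean estimate
    idx = rng.choice(n_particles, n_particles, p=w)   # resample
    particles = particles[idx]

print("RMSE:", np.sqrt(np.mean((np.array(estimates) - true) ** 2)))
```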
50 CFR 648.55 - Framework adjustments to management measures.
Code of Federal Regulations, 2014 CFR
2014-10-01
... establish OFL, ABC, ACL, ACT, DAS allocations, rotational area management programs, percentage allocations... measures will be adjusted. (c) OFL, ABC, ACL, ACT, and AMs. The Council shall specify OFL, ABC, ACL, ACT... derive specifications for ABC, ACL, and ACT, as specified in paragraphs (c)(2) through (c)(5) of this...
50 CFR 648.55 - Framework adjustments to management measures.
Code of Federal Regulations, 2013 CFR
2013-10-01
... establish OFL, ABC, ACL, ACT, DAS allocations, rotational area management programs, percentage allocations... measures will be adjusted. (c) OFL, ABC, ACL, ACT, and AMs. The Council shall specify OFL, ABC, ACL, ACT... derive specifications for ABC, ACL, and ACT, as specified in paragraphs (c)(2) through (c)(5) of this...
50 CFR 648.55 - Framework adjustments to management measures.
Code of Federal Regulations, 2012 CFR
2012-10-01
... establish OFL, ABC, ACL, ACT, DAS allocations, rotational area management programs, percentage allocations... measures will be adjusted. (c) OFL, ABC, ACL, ACT, and AMs. The Council shall specify OFL, ABC, ACL, ACT... derive specifications for ABC, ACL, and ACT, as specified in paragraphs (c)(2) through (c)(5) of this...
50 CFR 648.55 - Framework adjustments to management measures.
Code of Federal Regulations, 2011 CFR
2011-10-01
... establish OFL, ABC, ACL, ACT, DAS allocations, rotational area management programs, percentage allocations... measures will be adjusted. (c) OFL, ABC, ACL, ACT, and AMs. The Council shall specify OFL, ABC, ACL, ACT... derive specifications for ABC, ACL, and ACT, as specified in paragraphs (a)(2) through (5) of this...
Molitor, John
2012-03-01
Bayesian methods have seen an increase in popularity in a wide variety of scientific fields, including epidemiology. One of the main reasons for their widespread application is the power of the Markov chain Monte Carlo (MCMC) techniques generally used to fit these models. As a result, researchers often implicitly associate Bayesian models with MCMC estimation procedures. However, Bayesian models do not always require Markov-chain-based methods for parameter estimation. This is important, as MCMC estimation methods, while generally quite powerful, are complex and computationally expensive and suffer from convergence problems related to the manner in which they generate correlated samples used to estimate probability distributions for parameters of interest. In this issue of the Journal, Cole et al. (Am J Epidemiol. 2012;175(5):368-375) present an interesting paper that discusses non-Markov-chain-based approaches to fitting Bayesian models. These methods, though limited, can overcome some of the problems associated with MCMC techniques and promise to provide simpler approaches to fitting Bayesian models. Applied researchers will find these estimation approaches intuitively appealing and will gain a deeper understanding of Bayesian models through their use. However, readers should be aware that other non-Markov-chain-based methods are currently in active development and have been widely published in other fields.
A fast combination method in DSmT and its application to recommender system
Liu, Yihai
2018-01-01
In many applications involving epistemic uncertainties usually modeled by belief functions, it is often necessary to approximate general (non-Bayesian) basic belief assignments (BBAs) by subjective probabilities (called Bayesian BBAs). This necessity occurs if one needs to embed the fusion result in a system based on the probabilistic framework and Bayesian inference (e.g. tracking systems), or if one needs to make a decision in decision-making problems. In this paper, we present a new fast combination method, called modified rigid coarsening (MRC), to obtain the final Bayesian BBAs based on hierarchical decomposition (coarsening) of the frame of discernment. In this method, focal elements with probabilities are coarsened efficiently, reducing the computational complexity of combination by using a disagreement vector and a simple dichotomous approach. To demonstrate the practicality of our approach, the new method is applied to combine users’ soft preferences in recommender systems (RSs). Additionally, in order to make a comprehensive performance comparison, the proportional conflict redistribution rule #6 (PCR6) is regarded as a baseline in a range of experiments. According to the results of the experiments, MRC is more effective in recommendation accuracy than the original Rigid Coarsening (RC) method and comparable in computational time. PMID:29351297
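MRC itself is specific to this paper, but the underlying task of turning a non-Bayesian BBA into a Bayesian one can be illustrated with the classical pignistic transformation, shown here as a simpler stand-in:

```python
# Pignistic transformation BetP(x) = sum over focal sets A containing x of
# m(A) / |A|, a standard (non-MRC) way to probabilize a BBA.
bba = {("a",): 0.3, ("b",): 0.1, ("a", "b"): 0.4, ("a", "b", "c"): 0.2}

betp = {}
for focal, mass in bba.items():
    for x in focal:
        betp[x] = betp.get(x, 0.0) + mass / len(focal)

print(betp)  # {'a': 0.5667, 'b': 0.3667, 'c': 0.0667} (up to rounding)
```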
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jiangjiang; Li, Weixuan; Zeng, Lingzao
Surrogate models are commonly used in Bayesian approaches such as Markov Chain Monte Carlo (MCMC) to avoid repetitive CPU-demanding model evaluations. However, the approximation error of a surrogate may lead to biased estimations of the posterior distribution. This bias can be corrected by constructing a very accurate surrogate or implementing MCMC in a two-stage manner. Since the two-stage MCMC requires extra original model evaluations, the computational cost is still high. If measurement information is incorporated, a locally accurate approximation of the original model can be adaptively constructed with low computational cost. Based on this idea, we propose a Gaussian process (GP) surrogate-based Bayesian experimental design and parameter estimation approach for groundwater contaminant source identification problems. A major advantage of the GP surrogate is that it provides a convenient estimation of the approximation error, which can be incorporated in the Bayesian formula to avoid over-confident estimation of the posterior distribution. The proposed approach is tested with a numerical case study. Without sacrificing the estimation accuracy, the new approach achieves a speed-up of about 200 times compared to our previous work using two-stage MCMC.
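A minimal sketch of the key trick of adding the surrogate's own predictive variance to the likelihood's noise variance, using a one-parameter toy model and a hand-rolled GP with unit prior variance; everything here is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(a, b, ls=1.0):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

f = lambda theta: np.sin(theta)            # stand-in for an expensive model

# GP surrogate built from a few design points (unit prior variance assumed).
Xd = np.linspace(-3, 3, 7)
K_inv = np.linalg.inv(rbf(Xd, Xd) + 1e-8 * np.eye(Xd.size))
yd = f(Xd)

def gp_predict(x):
    k = rbf(x, Xd)
    mean = k @ K_inv @ yd
    var = np.maximum(1.0 - np.einsum("ij,jk,ik->i", k, K_inv, k), 0.0)
    return mean, var

# Posterior over theta given one noisy observation, folding the surrogate's
# approximation error (GP variance) into the likelihood variance.
y_obs, sigma2 = f(1.2) + 0.05, 0.05 ** 2
grid = np.linspace(-3, 3, 201)
mu, var = gp_predict(grid)
log_post = -0.5 * ((y_obs - mu) ** 2 / (sigma2 + var) + np.log(sigma2 + var))
post = np.exp(log_post - log_post.max())
post /= post.sum()                          # flat prior on the grid assumed
print("posterior mode:", grid[post.argmax()])
```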
Progress in computational toxicology.
Ekins, Sean
2014-01-01
Computational methods have been widely applied to toxicology across pharmaceutical, consumer product and environmental fields over the past decade. Progress in computational toxicology is now reviewed. A literature review was performed on computational models for hepatotoxicity (e.g. for drug-induced liver injury (DILI)), cardiotoxicity, renal toxicity and genotoxicity. In addition various publications have been highlighted that use machine learning methods. Several computational toxicology model datasets from past publications were used to compare Bayesian and Support Vector Machine (SVM) learning methods. The increasing amounts of data for defined toxicology endpoints have enabled machine learning models that have been increasingly used for predictions. It is shown that across many different models Bayesian and SVM perform similarly based on cross validation data. Considerable progress has been made in computational toxicology in a decade in both model development and availability of larger scale or 'big data' models. The future efforts in toxicology data generation will likely provide us with hundreds of thousands of compounds that are readily accessible for machine learning models. These models will cover relevant chemistry space for pharmaceutical, consumer product and environmental applications. Copyright © 2013 Elsevier Inc. All rights reserved.
Requirements for Information/Education Programs on Hypothermia.
1979-10-23
[Tabular residue: the abstract is a market-by-market listing of NBC, CBS, and ABC television affiliates (e.g., WLBZ Bangor, ME; WAGM Presque Isle, ME; WCBS New York, NY; WECT Wilmington, NC; WICU Erie, PA) with audience figures; the original table layout is not recoverable.]
Attention in a Bayesian Framework
Whiteley, Louise; Sahani, Maneesh
2012-01-01
The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models of perception, and use this observation to frame a new computational account of the need for, and action of, attention – unifying diverse attentional phenomena in a way that goes beyond previous inferential, probabilistic and Bayesian models. Attentional effects are most evident in cluttered environments, and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental settings, where cues shape expectations about a small number of upcoming stimuli and thus convey “prior” information about clearly defined objects. While operationally consistent with the experiments it seeks to describe, this view of attention as prior seems to miss many essential elements of both its selective and integrative roles, and thus cannot be easily extended to complex environments. We suggest that the resource bottleneck stems from the computational intractability of exact perceptual inference in complex settings, and that attention reflects an evolved mechanism for approximate inference which can be shaped to refine the local accuracy of perception. We show that this approach extends the simple picture of attention as prior, so as to provide a unified and computationally driven account of both selective and integrative attentional phenomena. PMID:22712010
Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang
2014-01-01
Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible. PMID:25745272
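The brute-force Monte Carlo reference method mentioned above is a one-liner in principle: BME is the average likelihood over draws from the prior. A toy normal-mean example, with all numbers invented:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# BME = E_prior[ p(data | theta) ], estimated as the mean likelihood over
# prior draws (the brute-force Monte Carlo integration used as reference).
data = rng.normal(0.5, 1.0, size=20)               # hypothetical observations
prior_draws = rng.normal(0.0, 2.0, size=20_000)    # N(0, 2^2) prior on the mean

lik = norm.pdf(data[None, :], loc=prior_draws[:, None], scale=1.0).prod(axis=1)
print("Monte Carlo BME estimate:", lik.mean())
```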
NASA Astrophysics Data System (ADS)
Ledall, Jérémy; Fruchon, Séverine; Garzoni, Matteo; Pavan, Giovanni M.; Caminade, Anne-Marie; Turrin, Cédric-Olivier; Blanzat, Muriel; Poupot, Rémy
2015-10-01
Dendrimers are nano-materials with perfectly defined structure and size, and multivalency properties that confer substantial advantages for biomedical applications. Previous work has shown that phosphorus-based polyphosphorhydrazone (PPH) dendrimers capped with azabisphosphonate (ABP) end groups have immuno-modulatory and anti-inflammatory properties leading to efficient therapeutic control of inflammatory diseases in animal models. These properties are mainly prompted through activation of monocytes. Here, we disclose new insights into the molecular mechanisms underlying the anti-inflammatory activation of human monocytes by ABP-capped PPH dendrimers. Following an interdisciplinary approach, we have characterized the physicochemical and biological behavior of the lead ABP dendrimer with model and cell membranes, and compared this experimental set of data to predictive computational modelling studies. The behavior of the ABP dendrimer was compared to the one of an isosteric analog dendrimer capped with twelve azabiscarboxylate (ABC) end groups instead of twelve ABP end groups. The ABC dendrimer displayed no biological activity on human monocytes, therefore it was considered as a negative control. In detail, we show that the ABP dendrimer can bind both non-specifically and specifically to the membrane of human monocytes. The specific binding leads to the internalization of the ABP dendrimer by human monocytes. On the contrary, the ABC dendrimer only interacts non-specifically with human monocytes and is not internalized. These data indicate that the bioactive ABP dendrimer is recognized by specific receptor(s) at the surface of human monocytes. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr03884g
Building Self-Esteem of Children and Adolescents through Adventure-Based Counseling.
ERIC Educational Resources Information Center
Nassar-McMillan, Sylvia C.; Cashwell, Craig S.
1997-01-01
Explores ways in which communities and school counselors can foster self-esteem in children and adolescents through adventure-based counseling (ABC). Discusses the importance of self-esteem, the philosophy and tenets of ABC, the effectiveness of ABC, and ways to integrate ABC concepts into groups. Focuses on prevention and intervention. (RJM)
Finite-difference time-domain simulation of GPR data
NASA Astrophysics Data System (ADS)
Chen, How-Wei; Huang, Tai-Min
1998-10-01
Simulation of digital ground penetrating radar (GPR) wave propagation in two-dimensional (2-D) media is developed, tested, implemented, and applied using a time-domain staggered-grid finite-difference (FD) numerical method. Three types of numerical algorithms for constructing synthetic common-shot, constant-offset radar profiles based on an actual transmitter-to-receiver configuration and based on the exploding reflector concept are demonstrated to mimic different types of radar survey geometries. Frequency-dependent attenuation is also incorporated to account for amplitude decay and time shift in the recorded responses. The algorithms are based on an explicit FD solution to Maxwell's curl equations. In addition, the first-order TE mode responses of wave propagation phenomena are considered due to the operating frequency of current GPR instruments. The staggered-grid technique is used to sample the fields and approximate the spatial derivatives with fourth-order FDs. The temporal derivatives are approximated by an explicit second-order difference time-marching scheme. By combining a paraxial approximation of the one-way wave equation (A2) and a damping mechanism (sponge filter), we propose a new composite absorbing boundary conditions (ABC) algorithm that effectively absorbs both incoming and outgoing waves. To overcome the angle- and frequency-dependent characteristics of the absorbing behaviors, each ABC has two types of absorption mechanism. The first ABC uses a modified Clayton and Engquist A2 condition; a fixed and a floating A2 ABC that operate at one grid point are proposed. The second ABC uses a damping mechanism. By superimposing artificial damping and by alternating the physical attenuation properties and impedance contrast of the media within the absorbing region, waves impinging on the boundary can be effectively attenuated, preventing them from reflecting back into the grid. The frequency-dependent characteristic of the damping mechanism can be used to adjust the width of the absorbing zone around the computational domain. By applying any combination of absorbing mechanisms, non-physical reflections from the computational domain boundary can be effectively minimized. The algorithm enables us to use very thin absorbing boundaries. The model can be parameterized through velocity, relative electrical permittivity (dielectric constants), electrical conductivity, magnetic permeability, loss tangent, Q values, and attenuation. According to this scheme, widely varying electrical properties of near-surface earth materials can be modeled. The capability of simulating common-source, constant-offset and zero-offset gathers is also demonstrated through various synthetic examples. The synthetic cases for typical GPR applications include buried objects such as pipes of different materials, AVO analysis for groundwater exploration, archaeological site investigation, and stratigraphy studies. The algorithms are also applied to iterative modeling of GPR data acquired over a gymnasium construction site on the NCCU campus.
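As a much-reduced illustration of the sponge-filter half of such a composite ABC, here is a 1-D second-order FDTD loop (not the paper's fourth-order 2-D scheme) in normalized units, where fields near the boundary are multiplied each step by a smooth damping profile so outgoing pulses decay instead of reflecting:

```python
import numpy as np

nx, nt, c = 400, 800, 0.5            # grid size, time steps, Courant number
ez = np.zeros(nx)
hy = np.zeros(nx)

# Damping profile: 1 in the interior, smoothly below 1 inside 40-cell sponges.
sponge, alpha = 40, 0.05
d = np.arange(sponge)
profile = np.exp(-alpha * ((sponge - d) / sponge) ** 2)  # smallest at the edge
damp = np.ones(nx)
damp[:sponge] = profile
damp[-sponge:] = profile[::-1]

for t in range(nt):
    hy[:-1] += c * (ez[1:] - ez[:-1])                 # update H from curl of E
    ez[1:] += c * (hy[1:] - hy[:-1])                  # update E from curl of H
    ez[nx // 2] += np.exp(-((t - 40) / 12.0) ** 2)    # Gaussian-pulse source
    ez *= damp                                        # sponge: attenuate fields
    hy *= damp                                        # near both boundaries

print("max residual field after the pulse exits:", np.abs(ez).max())
```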
ABCE1 is essential for S phase progression in human cells
Toompuu, Marina; Kärblane, Kairi; Pata, Pille; Truve, Erkki; Sarmiento, Cecilia
2016-01-01
ABCE1 is a highly conserved protein universally present in eukaryotes and archaea, which is crucial for the viability of different organisms. First identified as RNase L inhibitor, ABCE1 is currently recognized as an essential translation factor involved in several stages of eukaryotic translation and ribosome biogenesis. The nature of vital functions of ABCE1, however, remains unexplained. Here, we study the role of ABCE1 in human cell proliferation and its possible connection to translation. We show that ABCE1 depletion by siRNA results in a decreased rate of cell growth due to accumulation of cells in S phase, which is accompanied by inefficient DNA synthesis and reduced histone mRNA and protein levels. We infer that in addition to the role in general translation, ABCE1 is involved in histone biosynthesis and DNA replication and therefore is essential for normal S phase progression. In addition, we analyze whether ABCE1 is implicated in transcript-specific translation via its association with the eIF3 complex subunits known to control the synthesis of cell proliferation-related proteins. The expression levels of a few such targets regulated by eIF3A, however, were not consistently affected by ABCE1 depletion. PMID:26985706
A novel approach for dimension reduction of microarray.
Aziz, Rabia; Verma, C K; Srivastava, Namita
2017-12-01
This paper proposes a new hybrid search technique for feature (gene) selection (FS) using Independent Component Analysis (ICA) and the Artificial Bee Colony (ABC) algorithm, called ICA+ABC, to select informative genes based on a Naïve Bayes (NB) classifier. An important trait of this technique is the optimization of the ICA feature vector using ABC. ICA+ABC is a hybrid search algorithm that combines the benefits of an extraction approach, to reduce the size of the data, and a wrapper approach, to optimize the reduced feature vectors. The technique is evaluated on six standard gene expression classification datasets. Extensive experiments were conducted to compare the performance of ICA+ABC with the results obtained from the recently published Minimum Redundancy Maximum Relevance (mRMR)+ABC algorithm for the NB classifier. To further assess how ICA+ABC performs as a feature selector with the NB classifier, we also compared the combination of ICA with popular filter techniques and with other similar bio-inspired algorithms such as the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO). The results show that ICA+ABC has a significant ability to generate small subsets of genes from the ICA feature vector that significantly improve the classification accuracy of the NB classifier compared to other previously suggested methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
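The extraction half of the pipeline is easy to sketch with scikit-learn; the ABC wrapper search over component subsets is omitted here, so this is only a hedged skeleton with synthetic data standing in for expression matrices:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import FastICA
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for a gene expression matrix: few samples, many features.
X, y = make_classification(n_samples=100, n_features=500, n_informative=20,
                           random_state=0)

# ICA compresses the feature space; NB classifies. A wrapper such as ABC
# would then search over subsets of the extracted components.
pipe = make_pipeline(FastICA(n_components=10, random_state=0), GaussianNB())
print("CV accuracy:", cross_val_score(pipe, X, y, cv=5).mean())
```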
Caenorhabditis elegans ABC(RNAi) transporters interact genetically with rde-2 and mut-7.
Sundaram, Prema; Han, Wang; Cohen, Nancy; Echalier, Benjamin; Albin, John; Timmons, Lisa
2008-02-01
RNA interference (RNAi) mechanisms are conserved and consist of an interrelated network of activities that not only respond to exogenous dsRNA, but also perform endogenous functions required in the fine tuning of gene expression and in maintaining genome integrity. Not surprisingly, RNAi functions have widespread influences on cellular function and organismal development. Previously, we observed a reduced capacity to mount an RNAi response in nine Caenorhabditis elegans mutants that are defective in ABC transporter genes (ABC(RNAi) mutants). Here, we report an exhaustive study of mutants, collectively defective in 49 different ABC transporter genes, that allowed for the categorization of one additional transporter into the ABC(RNAi) gene class. Genetic complementation tests reveal functions for ABC(RNAi) transporters in the mut-7/rde-2 branch of the RNAi pathway. These second-site noncomplementation interactions suggest that ABC(RNAi) proteins and MUT-7/RDE-2 function together in parallel pathways and/or as multiprotein complexes. Like mut-7 and rde-2, some ABC(RNAi) mutants display transposon silencing defects. Finally, our analyses reveal a genetic interaction network of ABC(RNAi) gene function with respect to this part of the RNAi pathway. From our results, we speculate that the coordinated activities of ABC(RNAi) transporters, through their effects on endogenous RNAi-related mechanisms, ultimately affect chromosome function and integrity.
Ding, Xiwei; Chaiteerakij, Roongruedee; Moser, Catherine D; Shaleh, Hassan; Boakye, Jeffrey; Chen, Gang; Ndzengue, Albert; Li, Ying; Zhou, Yanling; Huang, Shengbing; Sinicrope, Frank A; Zou, Xiaoping; Thomas, Melanie B; Smith, Charles D; Roberts, Lewis R
2016-04-12
Sphingosine kinase 2 (Sphk2) has an oncogenic role in cancer. A recently developed first-in-class Sphk2 specific inhibitor ABC294640 displays antitumor activity in many cancer models. However, the role of Sphk2 and the antitumor activity of its inhibitor ABC294640 are not known in cholangiocarcinoma. We investigated the potential of targeting Sphk2 for the treatment of cholangiocarcinoma. We found that Sphk2 is overexpressed in five established human cholangiocarcinoma cell lines (WITT, HuCCT1, EGI-1, OZ and HuH28) and a new patient-derived cholangiocarcinoma cell line (LIV27) compared to H69 normal cholangiocytes. Inhibition of Sphk2 by ABC294640 inhibited proliferation and induced caspase-dependent apoptosis. Furthermore, we found that ABC294640 inhibited STAT3 phosphorylation, one of the key signaling pathways regulating cholangiocarcinoma cell proliferation and survival. ABC294640 also induced autophagy. Inhibition of autophagy by bafilomycin A1 or chloroquine potentiated ABC294640-induced cytotoxicity and apoptosis. In addition, ABC294640 in combination with sorafenib synergistically inhibited cell proliferation of cholangiocarcinoma cells. Strong decreases in STAT3 phosphorylation were observed in WITT and HuCCT1 cells exposed to the ABC294640 and sorafenib combination. These findings provide novel evidence that Sphk2 may be a rational therapeutic target in cholangiocarcinoma. Combinations of ABC294640 with sorafenib and/or autophagy inhibitors may provide novel strategies for the treatment of cholangiocarcinoma.
A novel artificial bee colony algorithm based on modified search equation and orthogonal learning.
Gao, Wei-feng; Liu, San-yang; Huang, Ling-ling
2013-06-01
The artificial bee colony (ABC) algorithm is a relatively new optimization technique which has been shown to be competitive with other population-based algorithms. However, ABC has an insufficiency regarding its solution search equation, which is good at exploration but poor at exploitation. To address this issue, we first propose an improved ABC method called CABC, where a modified search equation is applied to generate a candidate solution to improve the search ability of ABC. Furthermore, we use the orthogonal experimental design (OED) to form an orthogonal learning (OL) strategy for variant ABCs to discover more useful information from the search experiences. Owing to OED's ability to sample a small number of well-representative combinations for testing, the OL strategy can construct a more promising and efficient candidate solution. In this paper, the OL strategy is applied to three versions of ABC, i.e., the standard ABC, global-best-guided ABC (GABC), and CABC, which yields OABC, OGABC, and OCABC, respectively. The experimental results on a set of 22 benchmark functions demonstrate the effectiveness and efficiency of the modified search equation and the OL strategy. The comparisons with some other ABCs and several state-of-the-art algorithms show that the proposed algorithms significantly improve the performance of ABC. Moreover, OCABC offers the highest solution quality, fastest global convergence, and strongest robustness among all the contenders on almost all the test functions.
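For concreteness, here is a compact ABC implementation whose search equation includes a GABC-style global-best term, v_ij = x_ij + phi*(x_ij - x_kj) + psi*(gbest_j - x_ij); the OED/orthogonal-learning machinery of OCABC is not reproduced, and all control parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):                       # benchmark objective to minimize
    return float(np.sum(x ** 2))

def abc_minimize(f, dim=10, n_food=20, limit=50, iters=200, c=1.5):
    lo, hi = -5.0, 5.0
    X = rng.uniform(lo, hi, (n_food, dim))
    fit = np.array([f(x) for x in X])
    trials = np.zeros(n_food, dtype=int)
    best = X[fit.argmin()].copy()

    def try_update(i):
        nonlocal best
        k = rng.choice([j for j in range(n_food) if j != i])
        j = rng.integers(dim)
        v = X[i].copy()
        # Modified (GABC-style) search equation with a global-best term.
        v[j] = np.clip(X[i, j]
                       + rng.uniform(-1, 1) * (X[i, j] - X[k, j])
                       + rng.uniform(0, c) * (best[j] - X[i, j]), lo, hi)
        fv = f(v)
        if fv < fit[i]:                          # greedy selection
            X[i], fit[i], trials[i] = v, fv, 0
            if fv < f(best):
                best = v.copy()
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                  # employed bee phase
            try_update(i)
        p = 1.0 / (1.0 + fit)                    # onlooker probabilities
        p /= p.sum()
        for i in rng.choice(n_food, size=n_food, p=p):
            try_update(i)
        worn = int(trials.argmax())              # scout phase
        if trials[worn] > limit:
            X[worn] = rng.uniform(lo, hi, dim)
            fit[worn], trials[worn] = f(X[worn]), 0
    return best, f(best)

best_x, best_f = abc_minimize(sphere)
print("best objective value:", best_f)
```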
Li, Ye; Chen, Zhenya; Zhou, Zhao; Yuan, Qipeng
2016-12-01
Chondroitinases (ChSases) are a family of polysaccharide lyases that can depolymerize high molecular weight chondroitin sulfate (CS) and dermatan sulfate (DS). In this study, glyceraldehyde-3-phosphate dehydrogenase (GAPDH), which is stably expressed in different cells such as normal cells and cancer cells and whose expression is relatively insensitive to experimental conditions, was expressed as a fusion protein with ChSase ABC I. Results showed that the expression level and enzyme activity of GAPDH-ChSase ABC I were about 2.2 and 3.0 times higher than those of ChSase ABC I. By optimization of fermentation conditions, a productivity of ChSase ABC I of 880 ± 61 IU/g wet cell weight was achieved, higher than previously reported values. The optimal temperature and pH of GAPDH-ChSase ABC I were 40 °C and 7.5, respectively. GAPDH-ChSase ABC I had a kcat/Km of 131 ± 4.1 L μmol⁻¹ s⁻¹, a decreased catalytic efficiency as compared to ChSase ABC I. The relative activity of GAPDH-ChSase ABC I remained 89% after incubation at 30 °C for 180 min, and the thermostability of ChSase ABC I was enhanced by GAPDH when incubated at 30, 35, 40 and 45 °C. Copyright © 2016 Elsevier Inc. All rights reserved.
Application of activity-based costing (ABC) for a Peruvian NGO healthcare provider.
Waters, H; Abdallah, H; Santillán, D
2001-01-01
This article describes the application of activity-based costing (ABC) to calculate the unit costs of the services for a health care provider in Peru. While traditional costing allocates overhead and indirect costs in proportion to production volume or to direct costs, ABC assigns costs through activities within an organization. ABC uses personnel interviews to determine principal activities and the distribution of individuals' time among these activities. Indirect costs are linked to services through time allocation and other tracing methods, and the result is a more accurate estimate of unit costs. The study concludes that applying ABC in a developing country setting is feasible, yielding results that are directly applicable to pricing and management. ABC determines costs for individual clinics, departments and services according to the activities that originate these costs, showing where an organization spends its money. With this information, it is possible to identify services that are generating extra revenue and those operating at a loss, and to calculate cross subsidies across services. ABC also highlights areas in the health care process where efficiency improvements are possible. Conclusions about the ultimate impact of the methodology are not drawn here, since the study was not repeated and changes in utilization patterns and the addition of new clinics affected applicability of the results. A potential constraint to implementing ABC is the availability and organization of cost information. Applying ABC efficiently requires information to be readily available, by cost category and department, since the greatest benefits of ABC come from frequent, systematic application of the methodology in order to monitor efficiency and provide feedback for management. The article concludes with a discussion of the potential applications of ABC in the health sector in developing countries.
Alvarez, Angeles; Rios-Navarro, Cesar; Blanch-Ruiz, Maria Amparo; Collado-Diaz, Victor; Andujar, Isabel; Martinez-Cuesta, Maria Angeles; Orden, Samuel; Esplugues, Juan V
2017-05-01
The controversy connecting Abacavir (ABC) with cardiovascular disease has been fuelled by the lack of a credible mechanism of action. ABC shares structural similarities with endogenous purines, signalling molecules capable of triggering prothrombotic/proinflammatory programmes. Platelets are leading actors in the process of thrombosis. Our study addresses the effects of ABC on interactions between platelets and other vascular cells, while exploring the adhesion molecules implicated and the potential interference with the purinergic signalling pathway. The effects of ABC on platelet aggregation and platelet-endothelium interactions were evaluated, respectively, with an aggregometer and a flow chamber system that reproduced conditions in vivo. The role of adhesion molecules and purinergic receptors in endothelial and platelet populations was assessed by selective pre-incubation with specific antagonists and antibodies. ABC and carbovir triphosphate (CBT) levels were evaluated by HPLC. The results showed that ABC promoted the adherence of platelets to endothelial cells, a crucial step for the formation of thrombi. This was not a consequence of a direct effect of ABC on platelets, but resulted from activation of the endothelium via purinergic ATP-P2X7 receptors, which subsequently triggered an interplay between P-selectin and ICAM-1 on endothelial cells with constitutively expressed GPIIb/IIIa and GPIbα on platelets. ABC did not induce platelet activation (P-selectin expression or Ca2+ mobilization) or aggregation, even at high concentrations. CBT levels in endothelial cells were lower than those required to induce platelet-endothelium interactions. Thus, ABC interference with endothelial purinergic signalling leads to platelet recruitment. This highlights the endothelium as the main cell target of ABC in this interaction, which is in line with previous experimental evidence that ABC induces manifestations of vascular inflammation. Copyright © 2017 Elsevier B.V. All rights reserved.
Link, William; Sauer, John R.
2016-01-01
The analysis of ecological data has changed in two important ways over the last 15 years. The development and easy availability of Bayesian computational methods has allowed and encouraged the fitting of complex hierarchical models. At the same time, there has been increasing emphasis on acknowledging and accounting for model uncertainty. Unfortunately, the ability to fit complex models has outstripped the development of tools for model selection and model evaluation: familiar model selection tools such as Akaike's information criterion and the deviance information criterion are widely known to be inadequate for hierarchical models. In addition, little attention has been paid to the evaluation of model adequacy in the context of hierarchical modeling, i.e., to the evaluation of fit for a single model. In this paper, we describe Bayesian cross-validation, which provides tools for model selection and evaluation. We describe the Bayesian predictive information criterion and a Bayesian approximation to the BPIC known as the Watanabe-Akaike information criterion. We illustrate the use of these tools for model selection, and the use of Bayesian cross-validation as a tool for model evaluation, using three large data sets from the North American Breeding Bird Survey.
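WAIC is straightforward to compute from posterior draws; a minimal sketch on the deviance scale, using a normal toy model and an approximate posterior for the mean (all numbers illustrative):

```python
import numpy as np
from scipy.stats import norm

def waic(log_lik):
    """WAIC (deviance scale) from an (n_samples x n_observations) matrix of
    pointwise log-likelihoods evaluated at posterior draws."""
    lppd = np.sum(np.log(np.mean(np.exp(log_lik), axis=0)))
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2 * (lppd - p_waic)

rng = np.random.default_rng(0)
y = rng.normal(1.0, 1.0, size=25)
# Approximate posterior draws for the mean of a normal model (known sd = 1).
mu_draws = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), size=4000)
ll = norm.logpdf(y[None, :], loc=mu_draws[:, None], scale=1.0)
print("WAIC:", waic(ll))
```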
Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno
2016-01-01
Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision. PMID:27303323
Johnson, Eric D; Tubau, Elisabet
2017-06-01
Presenting natural frequencies facilitates Bayesian inferences relative to using percentages. Nevertheless, many people, including highly educated and skilled reasoners, still fail to provide Bayesian responses to these computationally simple problems. We show that the complexity of relational reasoning (e.g., the structural mapping between the presented and requested relations) can help explain the remaining difficulties. With a non-Bayesian inference that required identical arithmetic but afforded a more direct structural mapping, performance was universally high. Furthermore, reducing the relational demands of the task through questions that directed reasoners to use the presented statistics, as compared with questions that prompted the representation of a second, similar sample, also significantly improved reasoning. Distinct error patterns were also observed between these presented- and similar-sample scenarios, which suggested differences in relational-reasoning strategies. On the other hand, while higher numeracy was associated with better Bayesian reasoning, higher-numerate reasoners were not immune to the relational complexity of the task. Together, these findings validate the relational-reasoning view of Bayesian problem solving and highlight the importance of considering not only the presented task structure, but also the complexity of the structural alignment between the presented and requested relations.
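To make the percentage-versus-natural-frequency contrast concrete, the following toy computation (with illustrative numbers not taken from the paper) shows that the two formats are arithmetically equivalent; the facilitation is purely representational.

```python
# Illustrative numbers (not from the paper): base rate 1%, hit rate 80%,
# false-alarm rate 9.6% -- the classic mammography-style problem.
base, hit, fa = 0.01, 0.80, 0.096

# Percentage format: apply Bayes' rule directly.
posterior = base * hit / (base * hit + (1 - base) * fa)

# Natural-frequency format: the same arithmetic as counting cases
# in an imagined sample of 1000 people.
n = 1000
sick_pos = n * base * hit          # 8 sick people test positive
healthy_pos = n * (1 - base) * fa  # ~95 healthy people test positive
posterior_nf = sick_pos / (sick_pos + healthy_pos)

print(round(posterior, 3), round(posterior_nf, 3))  # identical: ~0.078
```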
Classifying emotion in Twitter using Bayesian network
NASA Astrophysics Data System (ADS)
Surya Asriadie, Muhammad; Syahrul Mubarok, Mohamad; Adiwijaya
2018-03-01
Language is used to express not only facts but also emotions. Emotions are noticeable in behavior and in the social media statuses written by a person. Analysis of emotions in text has been carried out across a variety of media, including Twitter. This paper studies the classification of emotions on Twitter using Bayesian networks, chosen for their ability to model uncertainty and relationships between features. The result is two Bayesian-network-based models: the Full Bayesian Network (FBN) and the Bayesian Network with Mood Indicator (BNM). FBN is a massive Bayesian network in which each word is treated as a node. The study shows that the method used to train FBN is not very effective at producing the best model, and FBN performs worse than Naive Bayes: the F1-score for FBN is 53.71%, versus 54.07% for Naive Bayes. BNM is proposed as an alternative method that builds on Multinomial Naive Bayes and has much lower computational complexity than FBN. Although it does not outperform FBN, the resulting model successfully improves the performance of Multinomial Naive Bayes: the F1-score for the Multinomial Naive Bayes model is 51.49%, versus 52.14% for BNM.
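Since BNM builds on Multinomial Naive Bayes, a minimal Multinomial Naive Bayes baseline is sketched below; this is a generic illustration on a toy corpus, not the authors' implementation or their mood-indicator extension.

```python
import numpy as np

def train_mnb(X, y, n_classes, alpha=1.0):
    """Multinomial Naive Bayes with Laplace smoothing.
    X: (docs x vocab) word-count matrix, y: class labels."""
    priors, log_probs = [], []
    for c in range(n_classes):
        Xc = X[y == c]
        priors.append(np.log(len(Xc) / len(X)))
        counts = Xc.sum(axis=0) + alpha          # smoothed word counts per class
        log_probs.append(np.log(counts / counts.sum()))
    return np.array(priors), np.array(log_probs)

def predict_mnb(X, priors, log_probs):
    # log posterior (up to a constant) = log prior + sum of word log-probs
    return np.argmax(priors + X @ log_probs.T, axis=1)

# Toy corpus: 4 documents over a 3-word vocabulary, 2 emotion classes
X = np.array([[2, 0, 1], [3, 1, 0], [0, 2, 2], [1, 3, 1]])
y = np.array([0, 0, 1, 1])
priors, log_probs = train_mnb(X, y, n_classes=2)
print(predict_mnb(X, priors, log_probs))  # recovers [0, 0, 1, 1]
```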
A bayesian approach to classification criteria for spectacled eiders
Taylor, B.L.; Wade, P.R.; Stehn, R.A.; Cochrane, J.F.
1996-01-01
To facilitate decisions to classify species according to risk of extinction, we used Bayesian methods to analyze trend data for the Spectacled Eider, an arctic sea duck. Trend data from three independent surveys of the Yukon-Kuskokwim Delta were analyzed individually and in combination to yield posterior distributions for population growth rates. We used classification criteria developed by the recovery team for Spectacled Eiders that seek to equalize the errors of under- or overprotecting the species. We conducted both a Bayesian decision analysis and a frequentist (classical statistical inference) decision analysis. Bayesian decision analyses are computationally easier, yield essentially the same results, and produce results that are easier to explain to nonscientists. With the exception of the aerial survey analysis of the 10 most recent years, both Bayesian and frequentist methods indicated that an endangered classification is warranted. The discrepancy between surveys warrants further research. Although the trend data are abundance indices, we used a preliminary estimate of absolute abundance to demonstrate how to calculate extinction distributions using the joint probability distributions for population growth rate and variance in growth rate generated by the Bayesian analysis. Recent apparent increases in abundance highlight the need for models that apply to declining and then recovering species.
Zonta, Zivko J; Flotats, Xavier; Magrí, Albert
2014-08-01
The procedure commonly used for the assessment of the parameters included in activated sludge models (ASMs) relies on the estimation of their optimal values within a confidence region (i.e. frequentist inference). Once optimal values are estimated, parameter uncertainty is computed through the covariance matrix. However, alternative approaches based on the consideration of the model parameters as probability distributions (i.e. Bayesian inference) may be of interest. The aim of this work is to apply (and compare) both Bayesian and frequentist inference methods when assessing uncertainty for an ASM-type model that simultaneously considers intracellular storage and biomass growth. Practical identifiability was addressed exclusively considering respirometric profiles based on the oxygen uptake rate, with the aid of probabilistic global sensitivity analysis. Parameter uncertainty was then estimated according to both the Bayesian and frequentist inferential procedures. Results were compared in order to highlight the strengths and weaknesses of both approaches. Since it was demonstrated that Bayesian inference can be reduced to a frequentist approach under particular hypotheses, the former can be considered the more general methodology. Hence, the use of Bayesian inference is encouraged for tackling inferential issues in ASM environments.
SU-E-T-401: Feasibility Study of Using ABC to Gate Lung SBRT Treatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, D; Xie, X; Shepard, D
2014-06-01
Purpose: Current SBRT treatment techniques include free breathing (FB) SBRT and gated FB SBRT. Gated FB SBRT yields a smaller target and less lung toxicity but a longer treatment time. The recent development of direct connectivity between the ABC system and the linac allows for automated beam gating. In this study, we examined the feasibility of using the ABC system to gate lung SBRT treatment. Methods: A CIRS lung phantom with a 3 cm sphere insert and a moving chest plate was used in this study. Sinusoidal motion was used for the FB pattern. An ABC signal was imported to simulate breath holds. 4D-CT was taken in FB mode, and the average-intensity-projection (AIP) was used to create the FB and 50% gated FB SBRT planning CTs. A manually gated 3D CT scan was acquired for ABC gated SBRT planning. An SBRT plan was created for each treatment option. A surface-mapping system was used for 50% gating and the ABC system was used for ABC gating. A manually gated CBCT scan was also performed to verify setup. Results: Among the three options, the ABC gated plan has the smallest PTV of 35.94 cc, which is 35% smaller than that of the FB plan. Consequently, the V20 of the left lung was reduced by 15% and 23% compared with the 50% gated FB and FB plans, respectively. The FB plan took 4.7 minutes to deliver, while the 50% gated FB plan took 18.5 minutes. The ABC gated plan delivery took only 10.6 minutes. A stationary target with a 3 cm diameter was also obtained from the manually gated CBCT scan. Conclusion: A strategy for ABC gated lung SBRT was developed. ABC gating can significantly reduce lung toxicity while maintaining target coverage. Compared with 50% gated FB SBRT, ABC gated treatment also provides less lung toxicity as well as improved delivery efficiency. This research is funded by Elekta.
Epis, Sara; Porretta, Daniele; Mastrantonio, Valentina; Comandatore, Francesco; Sassera, Davide; Rossi, Paolo; Cafarchia, Claudia; Otranto, Domenico; Favia, Guido; Genchi, Claudio; Bandi, Claudio; Urbanelli, Sandra
2014-07-29
Proteins from the ABC family (ATP-binding cassette) represent the largest known group of efflux pumps, responsible for transporting specific molecules across lipid membranes in both prokaryotic and eukaryotic organisms. In arthropods they have been shown to play a role in insecticide defense/resistance. The presence of ABC transporters and their possible association with insecticide transport have not yet been investigated in the mosquito Anopheles stephensi, the major vector of human malaria in the Middle East and South Asian regions. Here we investigated the presence and role of ABCs in the transport of the insecticide permethrin in a susceptible strain of this mosquito species. To identify ABC transporter genes we obtained a transcriptome from untreated larvae of An. stephensi and then compared it with the annotated transcriptome of Anopheles gambiae. To analyse the association between ABC transporters and permethrin, we conducted bioassays with permethrin alone and in combination with an ABC inhibitor, and then we investigated expression profiles of the identified genes in larvae exposed to permethrin. Bioassays showed an increased mortality of mosquitoes when permethrin was used in combination with the ABC-transporter inhibitor. Genes for ABC transporters were detected in the transcriptome, and five were selected (AnstABCB2, AnstABCB3, AnstABCB4, AnstABCmember6 and AnstABCG4). An increased expression in one of them (AnstABCG4) was observed in larvae exposed to the LD50 dose of permethrin. Contrary to what was found in other insect species, no up-regulation was observed in the AnstABCB genes. Our results show for the first time the involvement of ABC transporters in larval defense against permethrin in An. stephensi and, more generally, confirm the role of ABC transporters in insecticide defense. The differences observed with previous studies highlight the need for further research as, despite the growing number of studies on ABC transporters in insects, the heterogeneity of the results available at present does not allow us to infer general trends in ABC transporter-insecticide interactions.
Bayesian flood forecasting methods: A review
NASA Astrophysics Data System (ADS)
Han, Shasha; Coulibaly, Paulin
2017-08-01
Over the past few decades, floods have been among the most common and widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly reduced. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. The results show that the Bayesian flood forecasting approach is an effective and advanced way to perform flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvement. Future research in the context of Bayesian flood forecasting should focus on the assimilation of various sources of newly available information and the improvement of predictive performance assessment methods.
MDTS: automatic complex materials design using Monte Carlo tree search.
M Dieb, Thaer; Ju, Shenghong; Yoshizoe, Kazuki; Hou, Zhufeng; Shiomi, Junichiro; Tsuda, Koji
2017-01-01
Complex materials design is often represented as a black-box combinatorial optimization problem. In this paper, we present a novel Python library called MDTS (Materials Design using Tree Search). Our algorithm employs a Monte Carlo tree search approach, which has shown exceptional performance in computer Go. Unlike evolutionary algorithms that require user intervention to set parameters appropriately, MDTS has no tuning parameters and works autonomously on various problems. In comparison to a Bayesian optimization package, our algorithm showed competitive search efficiency and superior scalability. We succeeded in designing large Silicon-Germanium (Si-Ge) alloy structures that Bayesian optimization could not deal with due to excessive computational cost. MDTS is available at https://github.com/tsudalab/MDTS.
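To illustrate the kind of search MDTS performs, here is a minimal Monte Carlo tree search over a toy binary-assignment problem (assigning one of two atom types to each lattice site); the scoring function and problem setup are invented for illustration and are not the MDTS code.

```python
import math, random

# Toy stand-in for the materials setting: assign type 0 (Si) or 1 (Ge)
# to N lattice sites to maximize a black-box property score.
N = 8
def score(struct):                       # pretend "simulator"
    return -abs(sum(struct) - N // 2)    # best when half the sites are type 1

class Node:
    def __init__(self, prefix):
        self.prefix, self.children, self.visits, self.value = prefix, {}, 0, 0.0

def rollout(prefix):
    tail = [random.randint(0, 1) for _ in range(N - len(prefix))]
    return score(prefix + tail)

def mcts(iterations=2000, c=1.4):
    root = Node([])
    for _ in range(iterations):
        node, path = root, [root]
        while len(node.prefix) < N and len(node.children) == 2:   # selection (UCB1)
            node = max(node.children.values(),
                       key=lambda ch: ch.value / ch.visits
                       + c * math.sqrt(math.log(node.visits) / ch.visits))
            path.append(node)
        if len(node.prefix) < N:                                  # expansion
            a = random.choice([a for a in (0, 1) if a not in node.children])
            node.children[a] = Node(node.prefix + [a])
            node = node.children[a]
            path.append(node)
        r = rollout(node.prefix)                                  # simulation
        for n in path:                                            # backpropagation
            n.visits += 1
            n.value += r
    node, best = root, []                                         # greedy readout
    while node.children:
        a, node = max(node.children.items(), key=lambda kv: kv[1].visits)
        best.append(a)
    return best

print(mcts())
```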
Bayesian Modeling for Identification and Estimation of the Learning Effects of Pointing Tasks
NASA Astrophysics Data System (ADS)
Kyo, Koki
Recently, in the field of human-computer interaction, a model containing a systematic factor and a human factor, called the SH-model, has been proposed to evaluate the performance of computer input devices. In this paper, in order to extend the range of application of the SH-model, we propose some new models based on the Box-Cox transformation and apply a Bayesian modeling method for the identification and estimation of the learning effects of pointing tasks. We consider the parameters describing the learning effect as random variables and introduce smoothness priors for them. Illustrative results show that the newly proposed models work well.
Bayesian X-ray computed tomography using a three-level hierarchical prior model
NASA Astrophysics Data System (ADS)
Wang, Li; Mohammad-Djafari, Ali; Gac, Nicolas
2017-06-01
In recent decades, X-ray Computed Tomography (CT) image reconstruction has developed considerably in both the medical and industrial domains. In this paper, we propose using the Bayesian inference approach with a new hierarchical prior model. In the proposed model, a generalised Student-t distribution is used to enforce sparsity of the Haar transform of images. Comparisons with state-of-the-art methods are presented. It is shown that by using the proposed model, the sparsity of the sparse representation of images is enforced, so that the edges of images are preserved. Simulation results are also provided to demonstrate the effectiveness of the new hierarchical model for reconstruction with fewer projections.
NASA Astrophysics Data System (ADS)
Fenicia, Fabrizio; Reichert, Peter; Kavetski, Dmitri; Albert, Carlo
2016-04-01
The calibration of hydrological models based on signatures (e.g. Flow Duration Curves - FDCs) is often advocated as an alternative to model calibration based on the full time series of system responses (e.g. hydrographs). Signature-based calibration is motivated by various arguments. From a conceptual perspective, calibration on signatures is a way to filter out errors that are difficult to represent when calibrating on the full time series. Such errors may for example occur when observed and simulated hydrographs are shifted, either on the "time" axis (i.e. left or right) or on the "streamflow" axis (i.e. above or below). These shifts may be due to errors in the precipitation input (time or amount) and, if not properly accounted for in the likelihood function, may cause biased parameter estimates (e.g. estimated model parameters that do not reproduce the recession characteristics of a hydrograph). From a practical perspective, signature-based calibration is seen as a possible solution for making predictions in ungauged basins. Where streamflow data are not available, it may in fact be possible to reliably estimate streamflow signatures. Previous research has for example shown how FDCs can be reliably estimated at ungauged locations based on climatic and physiographic influence factors. Typically, the goal of signature-based calibration is not the prediction of the signatures themselves, but the prediction of the system responses. Ideally, the prediction of system responses should be accompanied by a reliable quantification of the associated uncertainties. Previous approaches for signature-based calibration, however, do not allow reliable estimates of streamflow predictive distributions. Here, we illustrate how the Bayesian approach can be employed to obtain reliable streamflow predictive distributions based on signatures. A case study is presented, where a hydrological model is calibrated on FDCs and additional signatures. We propose an approach where the likelihood function for the signatures is derived from the likelihood for streamflow (rather than using an "ad hoc" likelihood for the signatures as done in previous approaches). This likelihood is not easily tractable analytically, and we therefore cannot apply "simple" MCMC methods. This numerical problem is solved using Approximate Bayesian Computation (ABC). Our results indicate that the proposed approach is suitable for producing reliable streamflow predictive distributions based on calibration to signature data. Moreover, our results provide indications of which signatures are more appropriate for representing the information content of the hydrograph.
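The ABC step used here follows the standard rejection scheme: simulate under candidate parameters, summarize with signatures, and accept parameters whose simulated signatures fall within a tolerance of the observed ones. The sketch below illustrates this generic scheme with an invented toy simulator and signature vector, not the authors' hydrological model or likelihood derivation.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(theta, n=100):
    """Hypothetical stand-in simulator: a 'streamflow-like' series
    governed by a single parameter theta."""
    return rng.gamma(shape=theta, scale=1.0, size=n)

def signature(series):
    """Toy signature vector (mean plus low/high flow quantiles),
    loosely analogous to summarizing a flow duration curve."""
    return np.array([series.mean(), *np.quantile(series, [0.1, 0.9])])

# 'Observed' data generated with theta = 2.0
s_obs = signature(simulate(2.0))

# ABC rejection: draw from the prior, keep draws whose simulated
# signatures fall within tolerance eps of the observed signatures.
accepted, eps = [], 0.3
for _ in range(20000):
    theta = rng.uniform(0.5, 5.0)          # prior
    if np.linalg.norm(signature(simulate(theta)) - s_obs) < eps:
        accepted.append(theta)

print(len(accepted), np.mean(accepted))    # posterior sample size and mean
```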
Critical Problems in Very Large Scale Computer Systems
1989-09-30
Massachusetts Institute of Technology, Cambridge, Massachusetts 02139. Anant Agarwal (617) 253-1448; William J. Dally (617) 253-6043; Srinivas Devadas ... rapidly switched between the ports. Labelling the terminal voltages a, b, c, d, this attempts to enforce the constraint a - b = c - d. This is a reciprocal ... Srinivas Devadas and his students have been focusing on the optimization of combinational and sequential circuits specified at the register...
Transforming System Engineering through Model-Centric Engineering
2015-11-18
...best practices and provide computational technologies for real-time training within digital engineering environments. Multidisciplinary System ... MBSE well due to continued training and practice. While MBSE is a part of the MCE, it does not encompass the full idea and enabling technologies of ... practices against other industry contractors, and it was believed that ABC was trailing the others in the use of MDAO capabilities. They decided that...
Nonparametric Bayesian models through probit stick-breaking processes
Rodríguez, Abel; Dunson, David B.
2013-01-01
We describe a novel class of Bayesian nonparametric priors based on stick-breaking constructions where the weights of the process are constructed as probit transformations of normal random variables. We show that these priors are extremely flexible, allowing us to generate a great variety of models while preserving computational simplicity. Particular emphasis is placed on the construction of rich temporal and spatial processes, which are applied to two problems in finance and ecology. PMID:24358072
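The weight construction at the heart of these priors is easy to state: with latent normals α_k, the stick-breaking fractions are v_k = Φ(α_k) and the weights are w_k = v_k ∏_{j<k} (1 - v_j). A minimal sketch of a truncated version follows; the truncation at K components is an illustrative simplification, not the paper's full construction.

```python
import numpy as np
from scipy.stats import norm

def probit_stick_breaking(alpha):
    """Turn latent normals alpha_1..alpha_K into mixture weights:
    v_k = Phi(alpha_k), w_k = v_k * prod_{j<k} (1 - v_j).
    The last fraction is set to 1 so the truncated weights sum to 1."""
    v = norm.cdf(alpha)
    v[-1] = 1.0                      # close the stick at K components
    remaining = np.concatenate([[1.0], np.cumprod(1 - v[:-1])])
    return v * remaining

rng = np.random.default_rng(1)
w = probit_stick_breaking(rng.normal(size=10))
print(w.round(3), w.sum())           # weights, summing to 1.0
```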
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, Stephen A.; Sigeti, David E.
These are a set of slides about Bayesian hypothesis testing, where many hypotheses are tested. The conclusions are the following: The value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor. The value of τ2 that minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed that is the least favorable to H0.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Wei, E-mail: wlu@umm.edu; Neuner, Geoffrey A.; George, Rohini
2014-01-01
Purpose: To investigate whether coaching patients' breathing would improve the match between ITV_MIP (internal target volume generated by contouring in the maximum intensity projection scan) and ITV_10 (generated by combining the gross tumor volumes contoured in 10 phases of a 4-dimensional CT [4DCT] scan). Methods and Materials: Eight patients with a thoracic tumor and 5 patients with an abdominal tumor were included in an institutional review board-approved prospective study. Patients underwent 3 4DCT scans with: (1) free breathing (FB); (2) coaching using audio-visual (AV) biofeedback via the Real-Time Position Management system; and (3) coaching via a spirometer system (Active Breathing Coordinator or ABC). One physician contoured all scans to generate the ITV_10 and ITV_MIP. The match between ITV_MIP and ITV_10 was quantitatively assessed with volume ratio, centroid distance, root mean squared distance, and overlap/Dice coefficient. We investigated whether coaching (AV or ABC) or uniform expansions (1, 2, 3, or 5 mm) of ITV_MIP improved the match. Results: Although both AV and ABC coaching techniques improved frequency reproducibility and ABC improved displacement regularity, neither improved the match between ITV_MIP and ITV_10 over FB. On average, ITV_MIP underestimated ITV_10 by 19%, 19%, and 21%, with centroid distances of 1.9, 2.3, and 1.7 mm and Dice coefficients of 0.87, 0.86, and 0.88 for FB, AV, and ABC, respectively. Separate analyses indicated a better match for lung cancers or tumors not adjacent to high-intensity tissues. Uniform expansions of ITV_MIP did not correct for the mismatch between ITV_MIP and ITV_10. Conclusions: In this pilot study, audio-visual biofeedback did not improve the match between ITV_MIP and ITV_10. In general, ITV_MIP should be limited to lung cancers, and modification of ITV_MIP in each phase of the 4DCT data set is recommended.
Young, J; Mucsi, I; Rollet-Kurhajec, K C; Klein, M B
2016-05-01
Fibroblast growth factor 23 (FGF23) has been associated with cardiovascular mortality. We estimate associations between the level of plasma FGF23 and exposure to abacavir (ABC) and to other components of antiretroviral therapy in patients co-infected with HIV and hepatitis C. Both intact and c-terminal FGF23 were measured in plasma using commercial assays for a sub-cohort of 295 patients selected at random from the 1150 patients enrolled in the Canadian Co-infection Cohort. The multiplicative effects of antiretroviral drug exposures and covariates on median FGF23 were then estimated using a hierarchical Bayesian model. The median level of intact FGF23 was independent of either past or recent exposure to abacavir, with multiplicative ratios of 1.00 and 1.07, 95% credible intervals 0.90-1.12 and 0.94-1.23, respectively. Median intact FGF23 tended to increase with past use of both nonnucleoside reverse-transcriptase inhibitors and protease inhibitors, but tended to decrease with recent use of either tenofovir, efavirenz or lopinavir. There were no obvious associations between the median level of c-terminal FGF23 and individual drugs or drug classes. Age, female gender, smoking and the aspartate aminotransferase to platelet ratio index were all associated with a higher median c-terminal FGF23 but not with a higher median intact FGF23. The level of FGF23 in plasma was independent of exposure to ABC. Lower levels of intact FGF23 with recent use of tenofovir, efavirenz or lopinavir may reflect their adverse effects on bone and vitamin D metabolism relative to other drugs in their respective drug classes. © 2015 British HIV Association.
Bayesian randomized clinical trials: From fixed to adaptive design.
Yin, Guosheng; Lam, Chi Kin; Shi, Haolun
2017-08-01
Randomized controlled studies are the gold standard for phase III clinical trials. Using α-spending functions to control the overall type I error rate, group sequential methods are well established and have been dominating phase III studies. Bayesian randomized design, on the other hand, can be viewed as a complement to, rather than a competitor of, the frequentist methods. For the fixed Bayesian design, the hypothesis testing can be cast in the posterior probability or Bayes factor framework, which has a direct link to the frequentist type I error rate. Bayesian group sequential design relies upon Bayesian decision-theoretic approaches based on backward induction, which is often computationally intensive. Compared with the frequentist approaches, Bayesian methods have several advantages. The posterior predictive probability serves as a useful and convenient tool for trial monitoring, and can be updated at any time as the data accrue during the trial. The Bayesian decision-theoretic framework possesses a direct link to decision making in the practical setting, and can be modeled more realistically to reflect the actual cost-benefit analysis during the drug development process. Other merits include the possibility of hierarchical modeling and the use of informative priors, which would lead to a more comprehensive utilization of information from both historical and longitudinal data. From fixed to adaptive design, we focus on Bayesian randomized controlled clinical trials and make extensive comparisons with frequentist counterparts through numerical studies. Copyright © 2017 Elsevier Inc. All rights reserved.
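As an illustration of the posterior predictive probability used for trial monitoring, the sketch below computes, for a hypothetical single-arm binomial trial with a Beta prior, the predictive probability that the final analysis will declare success given interim data; all design numbers are invented for illustration.

```python
from scipy.stats import beta, betabinom

# Hypothetical single-arm trial (all numbers invented): 40 patients total,
# success declared at the final analysis if Pr(response rate > 0.3) > 0.95.
a0, b0 = 1, 1            # Beta(1, 1) prior on the response rate
n, x = 20, 12            # interim data: 12 responses in 20 patients
n_rem = 20               # patients still to be enrolled

a, b = a0 + x, b0 + (n - x)          # posterior after the interim analysis
ppp = 0.0
for x_future in range(n_rem + 1):
    a_fin, b_fin = a + x_future, b + (n_rem - x_future)
    if beta.sf(0.3, a_fin, b_fin) > 0.95:          # final success rule met?
        # weight by the Beta-Binomial posterior predictive probability
        ppp += betabinom.pmf(x_future, n_rem, a, b)

print(round(ppp, 3))     # predictive probability of eventual trial success
```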
Using Bayesian Networks for Candidate Generation in Consistency-based Diagnosis
NASA Technical Reports Server (NTRS)
Narasimhan, Sriram; Mengshoel, Ole
2008-01-01
Consistency-based diagnosis relies heavily on the assumption that discrepancies between model predictions and sensor observations can be detected accurately. When sources of uncertainty such as sensor noise and model abstraction exist, robust schemes have to be designed to make a binary decision on whether predictions are consistent with observations. This risks the occurrence of false alarms and missed alarms when an erroneous decision is made. Moreover, when multiple sensors (with differing sensing properties) are available, the degree of match between predictions and observations can be used to guide the search for fault candidates. In this paper we propose a novel approach to handle this problem using Bayesian networks. In the consistency-based diagnosis formulation, automatically generated Bayesian networks are used to encode a probabilistic measure of fit between predictions and observations. A Bayesian network inference algorithm is used to compute the most probable fault candidates.
Bayesian least squares deconvolution
NASA Astrophysics Data System (ADS)
Asensio Ramos, A.; Petit, P.
2015-11-01
Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
Robust Learning of High-dimensional Biological Networks with Bayesian Networks
NASA Astrophysics Data System (ADS)
Nägele, Andreas; Dejori, Mathäus; Stetter, Martin
Structure learning of Bayesian networks applied to gene expression data has become a potentially useful method to estimate interactions between genes. However, the NP-hardness of Bayesian network structure learning renders the reconstruction of the full genetic network with thousands of genes infeasible. Consequently, the maximal network size is usually restricted dramatically to a small set of genes (corresponding to variables in the Bayesian network). Although this feature reduction step makes structure learning computationally tractable, on the downside, the learned structure might be adversely affected by the introduction of missing genes. Additionally, gene expression data are usually very sparse with respect to the number of samples, i.e., the number of genes is much greater than the number of different observations. Given these problems, learning robust network features from microarray data is a challenging task. This chapter presents several approaches to tackling the robustness issue in order to obtain a more reliable estimation of learned network features.
Bayesian Group Bridge for Bi-level Variable Selection.
Mallick, Himel; Yi, Nengjun
2017-06-01
A Bayesian bi-level variable selection method (BAGB: Bayesian Analysis of Group Bridge) is developed for regularized regression and classification. This new development is motivated by grouped data, where generic variables can be divided into multiple groups, with variables in the same group being mechanistically related or statistically correlated. As an alternative to frequentist group variable selection methods, BAGB incorporates structural information among predictors through a group-wise shrinkage prior. Posterior computation proceeds via an efficient MCMC algorithm. In addition to the usual ease-of-interpretation of hierarchical linear models, the Bayesian formulation produces valid standard errors, a feature that is notably absent in the frequentist framework. Empirical evidence of the attractiveness of the method is illustrated by extensive Monte Carlo simulations and real data analysis. Finally, several extensions of this new approach are presented, providing a unified framework for bi-level variable selection in general models with flexible penalties.
Comparing hierarchical models via the marginalized deviance information criterion.
Quintero, Adrian; Lesaffre, Emmanuel
2018-07-20
Hierarchical models are extensively used in pharmacokinetics and longitudinal studies. When the estimation is performed from a Bayesian approach, model comparison is often based on the deviance information criterion (DIC). In hierarchical models with latent variables, there are several versions of this statistic: the conditional DIC (cDIC) that incorporates the latent variables in the focus of the analysis and the marginalized DIC (mDIC) that integrates them out. Regardless of the asymptotic and coherency difficulties of cDIC, this alternative is usually used in Markov chain Monte Carlo (MCMC) methods for hierarchical models because of practical convenience. The mDIC criterion is more appropriate in most cases but requires integration of the likelihood, which is computationally demanding and not implemented in Bayesian software. Therefore, we consider a method to compute mDIC by generating replicate samples of the latent variables that need to be integrated out. This alternative can be easily conducted from the MCMC output of Bayesian packages and is widely applicable to hierarchical models in general. Additionally, we propose some approximations in order to reduce the computational complexity for large-sample situations. The method is illustrated with simulated data sets and 2 medical studies, evidencing that cDIC may be misleading whilst mDIC appears pertinent. Copyright © 2018 John Wiley & Sons, Ltd.
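A minimal sketch of the replicate-sampling idea follows, for a toy random-intercept model with a single group: for each MCMC draw of the hyperparameters, the latent intercept is integrated out by Monte Carlo over replicate draws, giving the marginalized deviance that mDIC needs. This is a generic illustration under invented settings, not the authors' implementation.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

def log_marg_lik(y, theta, n_rep=500, rng=None):
    """Monte Carlo approximation of log p(y | theta) for one group in a
    random-intercept model y_j ~ N(mu + b, sigma^2), b ~ N(0, tau^2),
    theta = (mu, sigma, tau). The latent b is integrated out by simulation."""
    rng = rng or np.random.default_rng()
    mu, sigma, tau = theta
    b = rng.normal(0.0, tau, size=n_rep)                    # replicate latents
    # log p(y | b_r, theta) for each replicate, averaged on the log scale
    ll = norm.logpdf(y[None, :], loc=mu + b[:, None], scale=sigma).sum(axis=1)
    return logsumexp(ll) - np.log(n_rep)

def mdic(y, theta_draws):
    """Marginalized DIC from MCMC draws of theta (latents integrated out)."""
    devs = np.array([-2 * log_marg_lik(y, th) for th in theta_draws])
    d_bar = devs.mean()
    p_d = d_bar - (-2 * log_marg_lik(y, theta_draws.mean(axis=0)))
    return d_bar + p_d

# Fake data and fake posterior draws of (mu, sigma, tau), for illustration only
rng = np.random.default_rng(0)
y = rng.normal(1.0, 1.0, size=8)
theta_draws = np.column_stack([rng.normal(1, 0.1, 200),
                               np.abs(rng.normal(1, 0.1, 200)),
                               np.abs(rng.normal(0.5, 0.1, 200))])
print(mdic(y, theta_draws))
```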
Hidden Markov induced Dynamic Bayesian Network for recovering time evolving gene regulatory networks
NASA Astrophysics Data System (ADS)
Zhu, Shijia; Wang, Yadong
2015-12-01
Dynamic Bayesian Networks (DBN) have been widely used to recover gene regulatory relationships from time-series data in computational systems biology. Their standard assumption is ‘stationarity’, and therefore several research efforts have recently been proposed to relax this restriction. However, those methods suffer from three challenges: long running time, low accuracy and reliance on parameter settings. To address these problems, we propose a novel non-stationary DBN model by extending each hidden node of a Hidden Markov Model into a DBN (called HMDBN), which properly handles the underlying time-evolving networks. Correspondingly, an improved structural EM algorithm is proposed to learn the HMDBN. It dramatically reduces the search space, thereby substantially improving computational efficiency. Additionally, we derived a novel generalized Bayesian Information Criterion under the non-stationary assumption (called BWBIC), which can significantly improve reconstruction accuracy and largely reduce over-fitting. Moreover, the re-estimation formulas for all parameters of our model are derived, enabling us to avoid reliance on parameter settings. Compared to state-of-the-art methods, the experimental evaluation of our proposed method on both synthetic and real biological data demonstrates consistently high prediction accuracy and significantly improved computational efficiency, even with no prior knowledge or parameter settings.
Optimal Sequential Rules for Computer-Based Instruction.
ERIC Educational Resources Information Center
Vos, Hans J.
1998-01-01
Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…
ABC versus CAB for cardiopulmonary resuscitation: a prospective, randomized simulator-based trial.
Marsch, Stephan; Tschan, Franziska; Semmer, Norbert K; Zobrist, Roger; Hunziker, Patrick R; Hunziker, Sabina
2013-09-06
After years of advocating ABC (Airway-Breathing-Circulation), current guidelines of cardiopulmonary resuscitation (CPR) recommend CAB (Circulation-Airway-Breathing). This trial compared ABC with CAB as the initial approach to CPR from the arrival of rescuers until the completion of the first resuscitation cycle. 108 teams, consisting of two physicians each, were randomized to receive a graphical display of either the ABC algorithm or the CAB algorithm. Subsequently, teams had to treat a simulated cardiac arrest. Data analysis was performed using video recordings obtained during simulations. The primary endpoint was the time to completion of the first resuscitation cycle of 30 compressions and two ventilations. The time to execution of the first resuscitation measure was 32 ± 12 seconds in ABC teams and 25 ± 10 seconds in CAB teams (P = 0.002). 18/53 ABC teams (34%) and none of the 55 CAB teams (P = 0.006) applied more than the recommended two initial rescue breaths, which caused a longer duration of the first cycle of 30 compressions and two ventilations in ABC teams (31 ± 13 vs. 23 ± 6 sec; P = 0.001). Overall, the time to completion of the first resuscitation cycle was longer in ABC teams (63 ± 17 vs. 48 ± 10 sec; P < 0.0001). This randomized controlled trial found CAB superior to ABC, with an earlier start of CPR and a shorter time to completion of the first 30:2 resuscitation cycle. These findings endorse the change from ABC to CAB in international resuscitation guidelines.
Diagnosing and discriminating between primary and secondary aneurysmal bone cysts
Sasaki, Hiromi; Nagano, Satoshi; Shimada, Hirofumi; Yokouchi, Masahiro; Setoguchi, Takao; Ishidou, Yasuhiro; Kunigou, Osamu; Maehara, Kosuke; Komiya, Setsuro
2017-01-01
Aneurysmal bone cysts (ABCs) are benign bony lesions frequently accompanied by multiple cystic lesions and aggressive bone destruction. They are relatively rare lesions, representing only 1% of bone tumors. The pathogenesis of ABCs has yet to be elucidated. In the present study, a series of 22 cases of primary and secondary ABC from patients treated in the Department of Orthopedic Surgery, Kagoshima University Hospital (Kagoshima, Japan) from 2001–2015 were retrospectively analyzed. The average age at the time of diagnosis of primary ABC was 17.9 years. Intralesional curettage and artificial bone grafting were performed in the majority of the patients with primary ABC. The local recurrence rate following curettage for primary ABC was 18%, and the cause of local recurrence was considered to be insufficient curettage. Although no adjuvant therapy was administered during the surgeries, such therapy may assist in preventing local recurrence in certain cases. The cases of secondary ABC were preceded by benign bone tumors, including fibrous dysplasia, giant cell tumors, chondroblastoma and non-ossifying fibroma. The features of the secondary ABC typically reflected those of the preceding bone tumor. In the majority of cases, distinguishing primary ABC from secondary ABC was possible based on characteristic features, including the age of the patient at diagnosis and the tumor location. In cases that exhibit ambiguous features, including a soft tissue mass or a thick septal enhancement on preoperative magnetic resonance images, a biopsy must be obtained in order to exclude other types of aggressive bone tumors, including giant cell tumor, osteosarcoma and telangiectatic osteosarcoma. PMID:28454393
Anticipated Benefits of Care (ABC): psychometrics and predictive value in psychiatric disorders.
Warden, D; Trivedi, M H; Carmody, T J; Gollan, J K; Kashner, T M; Lind, L; Crismon, M L; Rush, A J
2010-06-01
Attitudes and expectations about treatment have been associated with symptomatic outcomes, adherence and utilization in patients with psychiatric disorders. No measure of patients' anticipated benefits of treatment on domains of everyday functioning has previously been available. The Anticipated Benefits of Care (ABC) is a new, 10-item questionnaire used to measure patient expectations about the impact of treatment on domains of everyday functioning. The ABC was collected at baseline in adult out-patients with major depressive disorder (MDD) (n=528), bipolar disorder (n=395) and schizophrenia (n=447) in the Texas Medication Algorithm Project (TMAP). Psychometric properties of the ABC were assessed, and the association of ABC scores with treatment response at 3 months was evaluated. Evaluation of the ABC's internal consistency yielded Cronbach's alpha of 0.90-0.92 for patients across disorders. Factor analysis showed that the ABC was unidimensional for all patients and for patients with each disorder. For patients with MDD, lower anticipated benefits of treatment was associated with less symptom improvement and lower odds of treatment response [odds ratio (OR) 0.72, 95% confidence interval (CI) 0.57-0.87, p=0.0011]. There was no association between ABC and symptom improvement or treatment response for patients with bipolar disorder or schizophrenia, possibly because these patients had modest benefits with treatment. The ABC is the first self-report that measures patient expectations about the benefits of treatment on everyday functioning, filling an important gap in available assessments of attitudes and expectations about treatment. The ABC is simple, easy to use, and has acceptable psychometric properties for use in research or clinical settings.
Alshamlan, Hala; Badr, Ghada; Alohali, Yousef
2015-01-01
An artificial bee colony (ABC) is a relatively recent swarm intelligence optimization approach. In this paper, we propose the first attempt at applying the ABC algorithm to the analysis of a microarray gene expression profile. In addition, we propose an innovative feature selection algorithm, minimum redundancy maximum relevance (mRMR), and combine it with an ABC algorithm, mRMR-ABC, to select informative genes from a microarray profile. The new approach is based on a support vector machine (SVM) algorithm to measure the classification accuracy of selected genes. We evaluate the performance of the proposed mRMR-ABC algorithm by conducting extensive experiments on six binary and multiclass gene expression microarray datasets. Furthermore, we compare our proposed mRMR-ABC algorithm with previously known techniques. We reimplemented two of these techniques for the sake of a fair comparison using the same parameters. These two techniques are mRMR when combined with a genetic algorithm (mRMR-GA) and mRMR when combined with a particle swarm optimization algorithm (mRMR-PSO). The experimental results prove that the proposed mRMR-ABC algorithm achieves accurate classification performance using a small number of predictive genes when tested on both datasets and compared to previously suggested methods. This shows that mRMR-ABC is a promising approach for solving gene selection and cancer classification problems. PMID:25961028
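For reference, the mRMR criterion itself is simple to sketch: greedily add the feature that maximizes relevance I(x; y) minus its mean redundancy with already selected features. The code below is a generic illustration for discretized features, not the paper's implementation (which couples mRMR with the bee colony search and an SVM).

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def mrmr(X, y, k):
    """Greedy minimum-redundancy maximum-relevance selection for
    discretized features: at each step pick the feature maximizing
    I(x; y) - mean_j I(x; x_j) over already selected features j."""
    n_features = X.shape[1]
    relevance = np.array([mutual_info_score(X[:, i], y)
                          for i in range(n_features)])
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for i in range(n_features):
            if i in selected:
                continue
            redundancy = np.mean([mutual_info_score(X[:, i], X[:, j])
                                  for j in selected])
            if relevance[i] - redundancy > best_score:
                best, best_score = i, relevance[i] - redundancy
        selected.append(best)
    return selected

# Toy data: 50 samples, 6 features discretized into bins, binary labels
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 50)
X = rng.integers(0, 3, (50, 6))
X[:, 2] = y + rng.integers(0, 2, 50)     # make feature 2 informative
print(mrmr(X, y, k=3))
```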
75 FR 11991 - ABC & D Recycling, Inc.-Lease and Operation Exemption-a Line of Railroad in Ware, MA
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-12
... DEPARTMENT OF TRANSPORTATION Surface Transportation Board [STB Finance Docket No. 35356] ABC & D Recycling, Inc.--Lease and Operation Exemption--a Line of Railroad in Ware, MA ABC & D Recycling, Inc. (ABC & D), a noncarrier, has filed a verified notice of exemption under 49 CFR 1150.31 to lease from O...
NASA Astrophysics Data System (ADS)
Fer, I.; Kelly, R.; Andrews, T.; Dietze, M.; Richardson, A. D.
2016-12-01
Our ability to forecast ecosystems is limited by how well we parameterize ecosystem models. Direct measurements for all model parameters are not always possible, and inverse estimation of these parameters through Bayesian methods is computationally costly. A solution to the computational challenges of Bayesian calibration is to approximate the posterior probability surface using a Gaussian process that emulates the complex process-based model. Here we report the integration of this method within an ecoinformatics toolbox, the Predictive Ecosystem Analyzer (PEcAn), and its application with two ecosystem models: SIPNET and ED2.1. SIPNET is a simple model, allowing application of MCMC methods both to the model itself and to its emulator. We used both approaches to assimilate flux (CO2 and latent heat), soil respiration, and soil carbon data from Bartlett Experimental Forest. This comparison showed that the emulator is reliable in terms of convergence to the posterior distribution. A 10000-iteration MCMC analysis with SIPNET itself required more than two orders of magnitude greater computation time than an MCMC run of the same length with its emulator. This difference would be greater for a more computationally demanding model. Validation of the emulator-calibrated SIPNET against both the assimilated data and out-of-sample data showed improved fit and reduced uncertainty around model predictions. We next applied the validated emulator method to ED2, whose complexity precludes standard Bayesian data assimilation. We used the ED2 emulator to assimilate demographic data from a network of inventory plots. For validation of the calibrated ED2, we compared the model to results from Empirical Succession Mapping (ESM), a novel synthesis of successional patterns in Forest Inventory and Analysis data. Our results revealed that while the pre-assimilation ED2 formulation cannot capture the emergent demographic patterns from the ESM analysis, constraining the model parameters controlling demographic processes considerably increased their agreement.
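The emulator idea is straightforward to sketch: evaluate the expensive model at a small design of parameter values, fit a Gaussian process to the resulting log-posterior values, and run MCMC on the cheap emulated surface. The following is a generic one-parameter illustration with an invented stand-in "simulator", not PEcAn code.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical stand-in for an expensive simulator's log-posterior surface
def expensive_log_post(theta):
    return -0.5 * ((theta - 2.0) / 0.5) ** 2

# Step 1: run the expensive model at a small design of parameter values
design = np.linspace(0.0, 4.0, 12)[:, None]
evals = np.array([expensive_log_post(t[0]) for t in design])

# Step 2: fit a GP emulator to the (design, log-posterior) pairs
gp = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True)
gp.fit(design, evals)

# Step 3: run cheap Metropolis MCMC on the emulated surface
rng = np.random.default_rng(0)
theta, samples = 1.0, []
lp = gp.predict(np.array([[theta]]))[0]
for _ in range(5000):
    prop = theta + rng.normal(0, 0.3)
    lp_prop = gp.predict(np.array([[prop]]))[0]
    if np.log(rng.uniform()) < lp_prop - lp:    # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta)

print(np.mean(samples[1000:]))   # posterior mean, close to 2.0
```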
A Bayesian Method for Evaluating and Discovering Disease Loci Associations
Jiang, Xia; Barmada, M. Michael; Cooper, Gregory F.; Becich, Michael J.
2011-01-01
Background A genome-wide association study (GWAS) typically involves examining representative SNPs in individuals from some population. A GWAS data set can concern a million SNPs and may soon concern billions. Researchers investigate the association of each SNP individually with a disease, and it is becoming increasingly commonplace to also analyze multi-SNP associations. Techniques for handling so many hypotheses include the Bonferroni correction and recently developed Bayesian methods. These methods can encounter problems. Most importantly, they are not applicable to a complex multi-locus hypothesis which has several competing hypotheses rather than only a null hypothesis. A method that computes the posterior probability of complex hypotheses is a pressing need. Methodology/Findings We introduce the Bayesian network posterior probability (BNPP) method which addresses the difficulties. The method represents the relationship between a disease and SNPs using a directed acyclic graph (DAG) model, and computes the likelihood of such models using a Bayesian network scoring criterion. The posterior probability of a hypothesis is computed based on the likelihoods of all competing hypotheses. The BNPP can not only be used to evaluate a hypothesis that has previously been discovered or suspected, but also to discover new disease loci associations. The results of experiments using simulated and real data sets are presented. Our results concerning simulated data sets indicate that the BNPP exhibits both better evaluation and discovery performance than does a p-value based method. For the real data sets, previous findings in the literature are confirmed and additional findings are found. Conclusions/Significance We conclude that the BNPP resolves a pressing problem by providing a way to compute the posterior probability of complex multi-locus hypotheses. A researcher can use the BNPP to determine the expected utility of investigating a hypothesis further. Furthermore, we conclude that the BNPP is a promising method for discovering disease loci associations. PMID:21853025
ABCE1 Is a Highly Conserved RNA Silencing Suppressor
Kärblane, Kairi; Gerassimenko, Jelena; Nigul, Lenne; Piirsoo, Alla; Smialowska, Agata; Vinkel, Kadri; Kylsten, Per; Ekwall, Karl; Swoboda, Peter; Truve, Erkki; Sarmiento, Cecilia
2015-01-01
ATP-binding cassette sub-family E member 1 (ABCE1) is a highly conserved protein among eukaryotes and archaea. Recent studies have identified ABCE1 as a ribosome-recycling factor important for translation termination in mammalian cells, yeast and also archaea. Here we report another conserved function of ABCE1. We have previously described AtRLI2, the homolog of ABCE1 in the plant Arabidopsis thaliana, as an endogenous suppressor of RNA silencing. In this study we show that this function is conserved: human ABCE1 is able to suppress RNA silencing in Nicotiana benthamiana plants, in mammalian HEK293 cells and in the worm Caenorhabditis elegans. Using co-immunoprecipitation and mass spectrometry, we found a number of potential ABCE1-interacting proteins that might support its function as an endogenous suppressor of RNA interference. The interactor candidates are associated with epigenetic regulation, transcription, RNA processing and mRNA surveillance. In addition, one of the identified proteins is translin, which together with its binding partner TRAX supports RNA interference. PMID:25659154
Ghosh, Sujit K
2010-01-01
Bayesian methods are rapidly becoming popular tools for making statistical inference in various fields of science including biology, engineering, finance, and genetics. One of the key aspects of Bayesian inferential method is its logical foundation that provides a coherent framework to utilize not only empirical but also scientific information available to a researcher. Prior knowledge arising from scientific background, expert judgment, or previously collected data is used to build a prior distribution which is then combined with current data via the likelihood function to characterize the current state of knowledge using the so-called posterior distribution. Bayesian methods allow the use of models of complex physical phenomena that were previously too difficult to estimate (e.g., using asymptotic approximations). Bayesian methods offer a means of more fully understanding issues that are central to many practical problems by allowing researchers to build integrated models based on hierarchical conditional distributions that can be estimated even with limited amounts of data. Furthermore, advances in numerical integration methods, particularly those based on Monte Carlo methods, have made it possible to compute the optimal Bayes estimators. However, there is a reasonably wide gap between the background of the empirically trained scientists and the full weight of Bayesian statistical inference. Hence, one of the goals of this chapter is to bridge the gap by offering elementary to advanced concepts that emphasize linkages between standard approaches and full probability modeling via Bayesian methods.
Development of dynamic Bayesian models for web application test management
NASA Astrophysics Data System (ADS)
Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.
2018-03-01
The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool that can be used to model complex stochastic dynamic processes. According to the results of the research, the mathematical models and methods of dynamic Bayesian networks provide high coverage of the stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connections between individual test assets across multiple time slices. This approach makes it possible to present testing as a discrete process with set structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine, in one management area, individual units and testing components with different functionalities that directly influence each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
Case studies in Bayesian microbial risk assessments.
Kennedy, Marc C; Clough, Helen E; Turner, Joanne
2009-12-21
The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. Resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in 2 stages, first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). We estimated that the expected total number of children aged 1.5-4.5 who become ill due to VTEC O157 in milk is 8.6 per year, with 95% uncertainty interval (0,11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with 95% interval (0,11). In the second case study the effective number of inputs was reduced from 30 to 7 in the screening stage, and just 2 inputs were found to explain 82.8% of the output variance. A combined total of 500 runs of the computer code were used. These case studies illustrate the use of Bayesian statistics to perform detailed uncertainty and sensitivity analyses, integrating multiple information sources in a way that is both rigorous and efficient.
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Sandri, Laura; Anne Thompson, Mary
2015-06-01
PyBetVH is a completely new, free, open-source and cross-platform software implementation of the Bayesian Event Tree for Volcanic Hazard (BET_VH), a tool for estimating the probability of any magmatic hazardous phenomenon occurring in a selected time frame, accounting for all the uncertainties. New capabilities of this implementation include the ability to calculate hazard curves which describe the distribution of the exceedance probability as a function of intensity (e.g., tephra load) on a grid of points covering the target area. The computed hazard curves are (i) absolute (accounting for the probability of eruption in a given time frame, and for all the possible vent locations and eruptive sizes) and (ii) Bayesian (computed at different percentiles, in order to quantify the epistemic uncertainty). Such curves allow representation of the full information contained in the probabilistic volcanic hazard assessment (PVHA) and are well suited to become a main input to quantitative risk analyses. PyBetVH allows for interactive visualization of both the computed hazard curves, and the corresponding Bayesian hazard/probability maps. PyBetVH is designed to minimize the efforts of end users, making PVHA results accessible to people who may be less experienced in probabilistic methodologies, e.g. decision makers. The broad compatibility of Python language has also allowed PyBetVH to be installed on the VHub cyber-infrastructure, where it can be run online or downloaded at no cost. PyBetVH can be used to assess any type of magmatic hazard from any volcano. Here we illustrate how to perform a PVHA through PyBetVH using the example of analyzing tephra fallout from the Okataina Volcanic Centre (OVC), New Zealand, and highlight the range of outputs that the tool can generate.
Bayesian depth estimation from monocular natural images.
Su, Che-Chun; Cormack, Lawrence K; Bovik, Alan C
2017-05-01
Estimating an accurate and naturalistic dense depth map from a single monocular photographic image is a difficult problem. Nevertheless, human observers have little difficulty understanding the depth structure implied by photographs. Two-dimensional (2D) images of the real-world environment contain significant statistical information regarding the three-dimensional (3D) structure of the world that the vision system likely exploits to compute perceived depth, monocularly as well as binocularly. Toward understanding how this might be accomplished, we propose a Bayesian model of monocular depth computation that recovers detailed 3D scene structures by extracting reliable, robust, depth-sensitive statistical features from single natural images. These features are derived using well-accepted univariate natural scene statistics (NSS) models and recent bivariate/correlation NSS models that describe the relationships between 2D photographic images and their associated depth maps. This is accomplished by building a dictionary of canonical local depth patterns from which NSS features are extracted as prior information. The dictionary is used to create a multivariate Gaussian mixture (MGM) likelihood model that associates local image features with depth patterns. A simple Bayesian predictor is then used to form spatial depth estimates. The depth results produced by the model, despite its simplicity, correlate well with ground-truth depths measured by a current-generation terrestrial light detection and ranging (LIDAR) scanner. Such a strong form of statistical depth information could be used by the visual system when creating overall estimated depth maps incorporating stereopsis, accommodation, and other conditions. Indeed, even in isolation, the Bayesian predictor delivers depth estimates that are competitive with state-of-the-art "computer vision" methods that utilize highly engineered image features and sophisticated machine learning algorithms.
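The final prediction step, forming E[depth | feature] under a joint Gaussian mixture, can be illustrated compactly. The sketch below uses a single synthetic scalar feature in place of the paper's multivariate NSS features and depth-pattern dictionary; it is an assumption-laden toy, not the authors' pipeline.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)

# Hypothetical training pairs (image feature, depth); the real model uses
# multivariate NSS features and local depth patterns from a learned dictionary.
feat = rng.normal(size=5000)
depth = 2.0 * feat + rng.normal(scale=0.5, size=5000)
X = np.column_stack([feat, depth])

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(X)

def predict_depth(f):
    """E[depth | feature] under the fitted joint mixture (the Bayesian predictor)."""
    mu, cov, w = gmm.means_, gmm.covariances_, gmm.weights_
    # Responsibility of each component for the observed feature (marginal on dim 0).
    var_f = cov[:, 0, 0]
    lik = np.exp(-0.5 * (f - mu[:, 0]) ** 2 / var_f) / np.sqrt(2 * np.pi * var_f)
    resp = w * lik
    resp /= resp.sum()
    # Per-component conditional mean of depth given the feature.
    cond_mean = mu[:, 1] + cov[:, 0, 1] / var_f * (f - mu[:, 0])
    return float(resp @ cond_mean)

print(predict_depth(1.0))  # close to 2.0 for this synthetic example
```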
NASA Astrophysics Data System (ADS)
Perkins, S. J.; Marais, P. C.; Zwart, J. T. L.; Natarajan, I.; Tasse, C.; Smirnov, O.
2015-09-01
We present Montblanc, a GPU implementation of the Radio interferometer measurement equation (RIME) in support of the Bayesian inference for radio observations (BIRO) technique. BIRO uses Bayesian inference to select sky models that best match the visibilities observed by a radio interferometer. To accomplish this, BIRO evaluates the RIME multiple times, varying sky model parameters to produce multiple model visibilities. χ² values computed from the model and observed visibilities are used as likelihood values to drive the Bayesian sampling process and select the best sky model. As most of the elements of the RIME and χ² calculation are independent of one another, they are highly amenable to parallel computation. Additionally, Montblanc caters for iterative RIME evaluation to produce multiple χ² values. Modified model parameters are transferred to the GPU between each iteration. We implemented Montblanc as a Python package based upon NVIDIA's CUDA architecture. As such, it is easy to extend and implement different pipelines. At present, Montblanc supports point and Gaussian morphologies, but is designed for easy addition of new source profiles. Montblanc's RIME implementation is performant: on an NVIDIA K40, it is approximately 250 times faster than MEQTREES on a dual hexacore Intel E5-2620v2 CPU. Compared to the OSKAR simulator's GPU-implemented RIME components it is 7.7 and 12 times faster on the same K40 for single- and double-precision floating point, respectively. However, OSKAR's RIME implementation is more general than Montblanc's BIRO-tailored RIME. Theoretical analysis of Montblanc's dominant CUDA kernel suggests that it is memory bound. In practice, profiling shows that it is balanced between compute and memory, as much of the data required by the problem is retained in L1 and L2 caches.
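The χ²-as-likelihood step that BIRO iterates can be sketched in a few lines. The visibilities and noise level below are synthetic placeholders; a real RIME evaluation would supply v_mod.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical complex visibilities: observed, and one RIME model evaluation.
v_obs = rng.normal(size=10_000) + 1j * rng.normal(size=10_000)
v_mod = v_obs + 0.1 * (rng.normal(size=10_000) + 1j * rng.normal(size=10_000))
sigma = 0.1  # per-visibility noise, assumed known

# chi^2 over real and imaginary parts; every term is independent, which is the
# property that makes the computation embarrassingly parallel on a GPU.
chi2 = (np.abs(v_obs - v_mod) ** 2 / sigma**2).sum()

# Gaussian log-likelihood up to a constant, used to drive the Bayesian sampler.
log_like = -0.5 * chi2
print(chi2, log_like)
```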
Lane, Thomas S; Rempe, Caroline S; Davitt, Jack; Staton, Margaret E; Peng, Yanhui; Soltis, Douglas Edward; Melkonian, Michael; Deyholos, Michael; Leebens-Mack, James H; Chase, Mark; Rothfels, Carl J; Stevenson, Dennis; Graham, Sean W; Yu, Jun; Liu, Tao; Pires, J Chris; Edger, Patrick P; Zhang, Yong; Xie, Yinlong; Zhu, Ying; Carpenter, Eric; Wong, Gane Ka-Shu; Stewart, C Neal
2016-05-31
The ATP-binding cassette (ABC) transporter gene superfamily is ubiquitous among extant organisms and prominently represented in plants. ABC transporters act to transport compounds across cellular membranes and are involved in a diverse range of biological processes. Thus, the applicability to biotechnology is vast, including cancer resistance in humans, drug resistance among vertebrates, and herbicide and other xenobiotic resistance in plants. In addition, plants appear to harbor the highest diversity of ABC transporter genes compared with any other group of organisms. This study applied transcriptome analysis to survey the kingdom-wide ABC transporter diversity in plants and suggest biotechnology applications of this diversity. We utilized sequence similarity-based informatics techniques to infer the identity of ABC transporter gene candidates from 1295 phylogenetically diverse plant transcriptomes. A total of 97,149 putative (approximately 25% were full-length) ABC transporter gene members were identified; each RNA-Seq library (plant sample) had 88 ± 30 gene members. As expected, simpler organisms, such as algae, had fewer unique members than vascular land plants. Differences were also noted in the richness of certain ABC transporter subfamilies. Land plants had more unique ABCB, ABCC, and ABCG transporter gene members on average (p < 0.005), and green algae, red algae, and bryophytes had significantly more ABCF transporter gene members (p < 0.005). Ferns had significantly fewer ABCA transporter gene members than all other plant groups (p < 0.005). We present a transcriptomic overview of ABC transporter gene members across all major plant groups. An increase in the number of gene family members present in the ABCB, ABCC, and ABCD transporter subfamilies may indicate an expansion of the ABC transporter superfamily among green land plants, which include all crop species. The striking difference between the number of ABCA subfamily transporter gene members between ferns and other plant taxa is surprising and merits further investigation. The potential exploitation of ABC transporters in plant biotechnology is discussed, with an emphasis on crops.
ABC transporters affect the elimination and toxicity of CdTe quantum dots in liver and kidney cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Mingli; Yin, Huancai; Bai, Pengli
This paper aimed to investigate the role of adenosine triphosphate-binding cassette (ABC) transporters on the efflux and the toxicity of nanoparticles in liver and kidney cells. In this study, we synthesized CdTe quantum dots (QDs) that were monodispersed and emitted green fluorescence (maximum peak at 530 nm). Such QDs tended to accumulate in human hepatocellular carcinoma cells (HepG2), human kidney cells 2 (HK-2), and Madin-Darby canine kidney (MDCK) cells, and cause significant toxicity in all the three cell lines. Using specific inhibitors and inducers of P-glycoprotein (Pgp) and multidrug resistance associated proteins (Mrps), the cellular accumulation and subsequent toxicity of QDs in HepG2 and HK-2 cells were significantly affected, while only slight changes appeared in MDCK cells, corresponding well with the functional expressions of ABC transporters in cells. Moreover, treatment of QDs caused concentration- and time-dependent induction of ABC transporters in HepG2 and HK-2 cells, but such a phenomenon was barely found in MDCK cells. Furthermore, the effects of CdTe QDs on ABC transporters were found to be greater than those of CdCl₂ at equivalent concentrations of cadmium, indicating that the effects of QDs should be a combination of free Cd²⁺ and specific properties of QDs. Overall, these results indicated a strong dependence between the functional expressions of ABC transporters and the efflux of QDs, which could be an important reason for the modulation of QDs toxicity by ABC transporters. Highlights: • ABC transporters contributed actively to the cellular efflux of CdTe quantum dots. • ABC transporters affected the cellular toxicity of CdTe quantum dots. • Treatment of CdTe quantum dots induced the gene expression of ABC transporters. • Free Cd²⁺ should be partially involved in the effects of QDs on ABC transporters. • Cellular efflux of quantum dots could be an important modulator for its toxicity.
Falls and confidence related quality of life outcome measures in an older British cohort
Parry, S; Steen, N; Galloway, S; Kenny, R; Bond, J
2001-01-01
Falls are common in older subjects and result in loss of confidence and independence. The Falls Efficacy Scale (FES) and the Activities-specific Balance Confidence scale (ABC) were developed in North America to quantify these entities, but contain idiom unfamiliar to an older British population. Neither has been validated in the UK. The FES and the ABC were modified for use within British culture and the internal consistency and test-retest reliability of the modified scales (FES-UK and ABC-UK) assessed. A total of 193 consecutive, ambulant, new, and return patients (n=119; 62%) and their friends and relatives ("visitors", n=74; 38%) were tested on both scales, while the last 60 subjects were retested within one week. Internal reliability was excellent for both scales (Cronbach's alpha 0.97 (FES-UK), and 0.98 (ABC-UK)). Test-retest reliability was good for both scales, though superior for the ABC-UK (intraclass correlation coefficient 0.58 (FES-UK), 0.89 (ABC-UK)). There was evidence to suggest that the ABC-UK was better than the FES-UK at distinguishing between older patients and younger patients (|tABC| = 4.4; |tFES| = 2.3); and between fallers and non-fallers (|tABC| = 8.7; |tFES| = 5.0) where the t statistics are based on the comparison of two independent samples. The ABC-UK and FES-UK are both reliable and valid measures for the assessment of falls and balance related confidence in older adults. However, better test-retest reliability and more robust differentiation of subgroups in whom falls related quality of life would be expected to be different make the ABC-UK the current instrument of choice in assessing this entity in older British subjects. Keywords: quality of life; falls; elderly; health status measurement PMID:11161077
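For readers unfamiliar with the two statistics reported, the sketch below computes Cronbach's alpha and an independent-samples t on simulated placeholder data (193 respondents by 16 items, matching the ABC scale's item count); all numbers are invented and bear no relation to the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Hypothetical item scores: a person-level factor induces the inter-item
# correlation, and the last 60 respondents ("fallers") score lower on average.
person = rng.normal(size=(193, 1)) * 8
faller = np.r_[np.zeros(133), np.ones(60)].astype(bool)
scores = 70 + person + rng.normal(scale=6, size=(193, 16)) - 15 * faller[:, None]

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total).
k = scores.shape[1]
alpha = k / (k - 1) * (1 - scores.var(axis=0, ddof=1).sum()
                       / scores.sum(axis=1).var(ddof=1))
print(f"Cronbach's alpha = {alpha:.2f}")

# Independent-samples t comparing the two hypothetical groups.
t, p = stats.ttest_ind(scores[~faller].mean(axis=1), scores[faller].mean(axis=1))
print(f"|t| = {abs(t):.1f}, p = {p:.3g}")
```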
Update on Bayesian Blocks: Segmented Models for Sequential Data
NASA Technical Reports Server (NTRS)
Scargle, Jeff
2017-01-01
The Bayesian Block algorithm, in wide use in astronomy and other areas, has been improved in several ways. The model for block shape has been generalized to include shapes other than a constant signal rate - e.g., linear, exponential, or other parametric models. In addition the computational efficiency has been improved, so that instead of O(N²) the basic algorithm is O(N) in most cases. Other improvements in the theory and application of segmented representations will be described.
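A minimal version of the classic O(N²) dynamic program that the improved algorithm builds on (the Scargle et al. 2013 formulation, with the event-data fitness N log(N/T) and a constant per-block prior) might look like this; the O(N) refinements described above are not reproduced here.

```python
import numpy as np

def bayesian_blocks(t, prior=4.0):
    """Optimal piecewise-constant segmentation of (distinct) event times t.

    Classic O(N^2) dynamic program with the event-data fitness N*log(N/T);
    `prior` penalises each extra block. A sketch, not the O(N) variant.
    """
    t = np.sort(np.asarray(t, float))
    n = t.size
    # Cell edges: midpoints between events, padded by the data range.
    edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for r in range(n):
        # Fitness of a final block spanning cells k..r, for every k <= r.
        widths = edges[r + 1] - edges[: r + 1]
        counts = np.arange(r + 1, 0, -1, dtype=float)  # events in block k..r
        fit = counts * np.log(counts / widths) - prior
        total = fit + np.concatenate([[0.0], best[:r]])
        last[r] = np.argmax(total)
        best[r] = total[last[r]]
    # Backtrack the optimal change points.
    cps, r = [], n - 1
    while r >= 0:
        cps.append(last[r])
        r = last[r] - 1
    return edges[np.array(cps[::-1])]

events = np.concatenate([np.random.uniform(0, 5, 200), np.random.uniform(5, 10, 50)])
print(bayesian_blocks(events))  # should find a change point near t = 5
```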
Artificial Intelligence (AI) Center of Excellence at the University of Pennsylvania
1995-07-01
Only fragments of the center's technical report index survive, including: "Robust Location Estimation for MLR and Non-MLR Distributions" (Dissertation Proposal), Gerda L. Kamberova, MS-CIS-92-28; "Bayesian Approach To Computer Vision Problems", Gerda L. Kamberova, MS-CIS-92-29, GRASP LAB 310 ("The object of our study is the Bayesian approach in..."); "Robust Location Estimation for MLR and Non-MLR Distributions" (Dissertation), Gerda L. Kamberova, MS-CIS-92-93, GRASP LAB 340 ("We study the problem of estimating an unknown...").
Krypotos, Angelos-Miltiadis; Klugkist, Irene; Engelhard, Iris M.
2017-01-01
ABSTRACT Threat conditioning procedures have allowed the experimental investigation of the pathogenesis of Post-Traumatic Stress Disorder. The findings of these procedures have also provided stable foundations for the development of relevant intervention programs (e.g. exposure therapy). Statistical inference of threat conditioning procedures is commonly based on p-values and Null Hypothesis Significance Testing (NHST). Nowadays, however, there is a growing concern about this statistical approach, as many scientists point to the various limitations of p-values and NHST. As an alternative, the use of Bayes factors and Bayesian hypothesis testing has been suggested. In this article, we apply this statistical approach to threat conditioning data. In order to enable the easy computation of Bayes factors for threat conditioning data we present a new R package named condir, which can be used either via the R console or via a Shiny application. This article provides both a non-technical introduction to Bayesian analysis for researchers using the threat conditioning paradigm, and the necessary tools for computing Bayes factors easily. PMID:29038683
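condir itself is an R package; as a language-consistent illustration of the kind of Bayes factor it reports, here is a hedged Python sketch of the two-sample JZS (Jeffreys-Zellner-Siow) Bayes factor of Rouder et al. (2009), computed by numerical integration.

```python
import numpy as np
from scipy import integrate

def jzs_bf10(t, n1, n2, r=np.sqrt(2) / 2):
    """Two-sample JZS Bayes factor BF10 (Rouder et al., 2009).

    The alternative places a Cauchy(0, r) prior on effect size; BF10 is the
    ratio of the marginal likelihoods of the observed t statistic.
    """
    nu = n1 + n2 - 2
    n_eff = n1 * n2 / (n1 + n2)

    # Marginal likelihood under H0 (up to a common constant).
    m0 = (1 + t**2 / nu) ** (-(nu + 1) / 2)

    # Under H1, integrate over g with an inverse-chi-square(1) prior.
    def integrand(g):
        c = 1 + n_eff * g * r**2
        return (c ** -0.5
                * (1 + t**2 / (c * nu)) ** (-(nu + 1) / 2)
                * (2 * np.pi) ** -0.5 * g ** -1.5 * np.exp(-1 / (2 * g)))

    m1, _ = integrate.quad(integrand, 0, np.inf)
    return m1 / m0

# Example: t = 2.5 from two groups of 30 participants each.
print(f"BF10 = {jzs_bf10(2.5, 30, 30):.2f}")
```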
High-throughput Bayesian Network Learning using Heterogeneous Multicore Computers
Linderman, Michael D.; Athalye, Vivek; Meng, Teresa H.; Asadi, Narges Bani; Bruggner, Robert; Nolan, Garry P.
2017-01-01
Aberrant intracellular signaling plays an important role in many diseases. The causal structure of signal transduction networks can be modeled as Bayesian Networks (BNs), and computationally learned from experimental data. However, learning the structure of Bayesian Networks (BNs) is an NP-hard problem that, even with fast heuristics, is too time consuming for large, clinically important networks (20–50 nodes). In this paper, we present a novel graphics processing unit (GPU)-accelerated implementation of a Monte Carlo Markov Chain-based algorithm for learning BNs that is up to 7.5-fold faster than current general-purpose processor (GPP)-based implementations. The GPU-based implementation is just one of several implementations within the larger application, each optimized for a different input or machine configuration. We describe the methodology we use to build an extensible application, assembled from these variants, that can target a broad range of heterogeneous systems, e.g., GPUs, multicore GPPs. Specifically we show how we use the Merge programming model to efficiently integrate, test and intelligently select among the different potential implementations. PMID:28819655
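A serial toy version of score-based MCMC structure learning (Metropolis moves that toggle single edges, scored by a linear-Gaussian BIC) conveys what the GPU kernels accelerate; this sketch is not the paper's algorithm, and the three-node data set is invented.

```python
import numpy as np

rng = np.random.default_rng(5)

def bic_score(data, adj):
    """BIC of a linear-Gaussian network: one regression per node on its parents."""
    n, d = data.shape
    score = 0.0
    for j in range(d):
        parents = np.flatnonzero(adj[:, j])
        X = np.column_stack([np.ones(n), data[:, parents]])
        beta, *_ = np.linalg.lstsq(X, data[:, j], rcond=None)
        resid = data[:, j] - X @ beta
        sigma2 = max(resid @ resid / n, 1e-12)
        score += -0.5 * n * np.log(sigma2) - 0.5 * np.log(n) * (len(parents) + 1)
    return score

def is_dag(adj):
    """Cycle check by repeatedly peeling off nodes with no outgoing edges."""
    alive = list(range(len(adj)))
    while alive:
        sinks = [v for v in alive if not adj[v, alive].any()]
        if not sinks:
            return False
        alive = [v for v in alive if v not in sinks]
    return True

def mcmc_structure(data, steps=2000):
    d = data.shape[1]
    adj = np.zeros((d, d), dtype=bool)
    score = bic_score(data, adj)
    for _ in range(steps):
        i, j = rng.integers(d, size=2)
        if i == j:
            continue
        prop = adj.copy()
        prop[i, j] = ~prop[i, j]                 # toggle one edge
        if not is_dag(prop):
            continue
        new = bic_score(data, prop)
        if np.log(rng.random()) < new - score:   # Metropolis accept
            adj, score = prop, new
    return adj, score

# Toy data with a known chain x0 -> x1 -> x2.
x0 = rng.normal(size=500)
x1 = x0 + 0.5 * rng.normal(size=500)
x2 = x1 + 0.5 * rng.normal(size=500)
adj, score = mcmc_structure(np.column_stack([x0, x1, x2]))
print(adj.astype(int), round(score, 1))
```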
NASA Astrophysics Data System (ADS)
Sahai, Swupnil
This thesis includes three parts. The overarching theme is how to analyze structured hierarchical data, with applications to astronomy and sociology. The first part discusses how expectation propagation can be used to parallelize the computation when fitting big hierarchical Bayesian models. This methodology is then used to fit a novel, nonlinear mixture model to ultraviolet radiation from various regions of the observable universe. The second part discusses how the Stan probabilistic programming language can be used to numerically integrate terms in a hierarchical Bayesian model. This technique is demonstrated on supernovae data to significantly speed up convergence to the posterior distribution compared to a previous study that used a Gibbs-type sampler. The third part builds a formal latent kernel representation for aggregate relational data as a way to more robustly estimate the mixing characteristics of agents in a network. In particular, the framework is applied to sociology surveys to estimate, as a function of ego age, the age and sex composition of the personal networks of individuals in the United States.
Perdikaris, Paris; Karniadakis, George Em
2016-05-01
We present a computational framework for model inversion based on multi-fidelity information fusion and Bayesian optimization. The proposed methodology targets the accurate construction of response surfaces in parameter space, and the efficient pursuit to identify global optima while keeping the number of expensive function evaluations at a minimum. We train families of correlated surrogates on available data using Gaussian processes and auto-regressive stochastic schemes, and exploit the resulting predictive posterior distributions within a Bayesian optimization setting. This enables a smart adaptive sampling procedure that uses the predictive posterior variance to balance the exploration versus exploitation trade-off, and is a key enabler for practical computations under limited budgets. The effectiveness of the proposed framework is tested on three parameter estimation problems. The first two involve the calibration of outflow boundary conditions of blood flow simulations in arterial bifurcations using multi-fidelity realizations of one- and three-dimensional models, whereas the last one aims to identify the forcing term that generated a particular solution to an elliptic partial differential equation. © 2016 The Author(s).
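A single-fidelity skeleton of the surrogate-based loop reads as follows; the paper's multi-fidelity auto-regressive schemes are replaced here by one GP with a fixed RBF kernel, and the "expensive" objective is a trivial stand-in.

```python
import numpy as np

rng = np.random.default_rng(6)

def rbf(a, b, ell=0.3):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def gp_posterior(x_tr, y_tr, x_te, noise=1e-6):
    """Predictive mean and variance of a unit-variance RBF Gaussian process."""
    K = rbf(x_tr, x_tr) + noise * np.eye(len(x_tr))
    Ks = rbf(x_tr, x_te)
    mean = Ks.T @ np.linalg.solve(K, y_tr)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mean, np.maximum(var, 0)

f = lambda x: np.sin(6 * x) + 0.5 * x          # toy stand-in for the expensive model
x_tr = rng.uniform(0, 1, 3)                    # a few initial evaluations
y_tr = f(x_tr)
grid = np.linspace(0, 1, 200)

for _ in range(10):
    mean, var = gp_posterior(x_tr, y_tr, grid)
    # Lower confidence bound: the posterior variance balances exploration
    # against exploitation, as described in the abstract above.
    acq = mean - 2.0 * np.sqrt(var)
    x_new = grid[np.argmin(acq)]
    x_tr = np.append(x_tr, x_new)
    y_tr = np.append(y_tr, f(x_new))

print("approximate minimiser:", x_tr[np.argmin(y_tr)])
```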
Lebedeva, Irina V.; Pande, Praveen; Patton, Wayne F.
2011-01-01
An underlying mechanism for multi drug resistance (MDR) is up-regulation of the transmembrane ATP-binding cassette (ABC) transporter proteins. ABC transporters also determine the general fate and effect of pharmaceutical agents in the body. The three major types of ABC transporters are MDR1 (P-gp, P-glycoprotein, ABCB1), MRP1/2 (ABCC1/2) and BCRP/MXR (ABCG2) proteins. Flow cytometry (FCM) allows determination of the functional expression levels of ABC transporters in live cells, but most dyes used as indicators (rhodamine 123, DiOC2(3), calcein-AM) have limited applicability as they do not detect all three major types of ABC transporters. Dyes with broad coverage (such as doxorubicin, daunorubicin and mitoxantrone) lack sensitivity due to overall dimness and thus may yield a significant percentage of false negative results. We describe two novel fluorescent probes that are substrates for all three common types of ABC transporters and can serve as indicators of MDR in flow cytometry assays using live cells. The probes exhibit fast internalization, favorable uptake/efflux kinetics and high sensitivity of MDR detection, as established by multidrug resistance activity factor (MAF) values and Kolmogorov-Smirnov statistical analysis. Used in combination with general or specific inhibitors of ABC transporters, both dyes readily identify functional efflux and are capable of detecting small levels of efflux as well as defining the type of multidrug resistance. The assay can be applied to the screening of putative modulators of ABC transporters, facilitating rapid, reproducible, specific and relatively simple functional detection of ABC transporter activity, and ready implementation on widely available instruments. PMID:21799851
Li, Bai; Lin, Mu; Liu, Qiao; Li, Ya; Zhou, Changjun
2015-10-01
Protein folding is a fundamental topic in molecular biology. Conventional experimental techniques for protein structure identification or protein folding recognition require strict laboratory requirements and heavy operating burdens, which have largely limited their applications. Alternatively, computer-aided techniques have been developed to optimize protein structures or to predict the protein folding process. In this paper, we utilize a 3D off-lattice model to describe the original protein folding scheme as a simplified energy-optimal numerical problem, where all types of amino acid residues are binarized into hydrophobic and hydrophilic ones. We apply a balance-evolution artificial bee colony (BE-ABC) algorithm as the minimization solver, which is featured by the adaptive adjustment of search intensity to cater for the varying needs during the entire optimization process. In this work, we establish a benchmark case set with 13 real protein sequences from the Protein Data Bank database and evaluate the convergence performance of BE-ABC algorithm through strict comparisons with several state-of-the-art ABC variants in short-term numerical experiments. Besides that, our obtained best-so-far protein structures are compared to the ones in comprehensive previous literature. This study also provides preliminary insights into how artificial intelligence techniques can be applied to reveal the dynamics of protein folding. Graphical Abstract Protein folding optimization using 3D off-lattice model and advanced optimization techniques.
Iwayama, Koji; Zhu, Liping; Hirata, Yoshito; Aono, Masashi; Hara, Masahiko; Aihara, Kazuyuki
2016-04-12
An amoeboid unicellular organism, a plasmodium of the true slime mold Physarum polycephalum, exhibits complex spatiotemporal oscillatory dynamics and sophisticated information processing capabilities while deforming its amorphous body. We previously devised an 'amoeba-based computer' (ABC), which implemented optical feedback control to lead this amoeboid organism to search for a solution to the traveling salesman problem (TSP). In the ABC, the shortest TSP route (the optimal solution) is represented by the shape of the organism in which the body area (nutrient absorption) is maximized while the risk of being exposed to aversive light stimuli is minimized. The shortness of the TSP route found by ABC, therefore, serves as a quantitative measure of the optimality of the decision made by the organism. However, it remains unclear how the decision-making ability of the organism originates from the oscillatory dynamics of the organism. We investigated the number of coexisting traveling waves in the spatiotemporal patterns of the oscillatory dynamics of the organism. We show that a shorter TSP route can be found when the organism exhibits a lower number of traveling waves. The results imply that the oscillatory dynamics are highly coordinated throughout the global body. Based on the results, we discuss how the decision-making ability of the organism can be enhanced not by uncorrelated random fluctuations, but by its highly coordinated oscillatory dynamics.
NASA Astrophysics Data System (ADS)
Epis, Sara; Porretta, Daniele; Mastrantonio, Valentina; Urbanelli, Sandra; Sassera, Davide; De Marco, Leone; Mereghetti, Valeria; Montagna, Matteo; Ricci, Irene; Favia, Guido; Bandi, Claudio
2014-12-01
In insects, ABC transporters have been shown to contribute to defence/resistance to insecticides by reducing toxic concentrations in cells/tissues. Despite the extensive studies about this detoxifying mechanism, the temporal patterns of ABC transporter activation have been poorly investigated. Using the malaria vector Anopheles stephensi as a study system, we investigated the expression profile of ABC genes belonging to different subfamilies in permethrin-treated larvae at different time points (30 min to 48 h). Our results showed that the expression of ABCB and ABCG subfamily genes was upregulated at 1 h after treatment, with the highest expression observed at 6 h. Therefore, future investigations on the temporal dynamics of ABC gene expression will allow a better implementation of insecticide treatment regimens, including the use of specific inhibitors of ABC efflux pumps.
Zheng, Desen; Hao, Guixia; Cursino, Luciana; Zhang, Hongsheng; Burr, Thomas J
2012-09-01
The characterization of Tn5 transposon insertional mutants of Agrobacterium vitis strain F2/5 revealed a gene encoding a predicted LysR-type transcriptional regulator, lhnR (for 'LysR-type regulator associated with HR and necrosis'), and an immediate upstream operon consisting of three open reading frames (lhnABC) required for swarming motility, surfactant production and the induction of a hypersensitive response (HR) on tobacco and necrosis on grape. The operon lhnABC is unique to A. vitis among the sequenced members in Rhizobiaceae. Mutagenesis of lhnR and lhnABC by gene disruption and complementation of ΔlhnR and ΔlhnABC confirmed their roles in the expression of these phenotypes. Mutation of lhnR resulted in complete loss of HR, swarming motility, surfactant production and reduced necrosis, whereas mutation of lhnABC resulted in loss of swarming motility, delayed and reduced HR development and reduced surfactant production and necrosis. The data from promoter-green fluorescent protein (gfp) fusions showed that lhnR suppresses the expression of lhnABC and negatively autoregulates its own expression. It was also shown that lhnABC negatively affects its own expression and positively affects the transcription of lhnR. lhnR and lhnABC constitute a regulatory circuit that coordinates the transcription level of lhnR, resulting in the expression of swarming, surfactant, HR and necrosis phenotypes. © 2012 THE AUTHORS. MOLECULAR PLANT PATHOLOGY © 2012 BSPP AND BLACKWELL PUBLISHING LTD.
A Comparison of FPGA and GPGPU Designs for Bayesian Occupancy Filters.
Medina, Luis; Diez-Ochoa, Miguel; Correal, Raul; Cuenca-Asensi, Sergio; Serrano, Alejandro; Godoy, Jorge; Martínez-Álvarez, Antonio; Villagra, Jorge
2017-11-11
Grid-based perception techniques in the automotive sector, based on fusing information from different sensors to obtain robust perceptions of the environment, are proliferating in the industry. However, one of the main drawbacks of these techniques is the prohibitively high computing performance traditionally required of embedded automotive systems. In this work, the capabilities of new computing architectures that embed these algorithms are assessed in a real car. The paper compares two ad hoc optimized designs of the Bayesian Occupancy Filter; one for General Purpose Graphics Processing Unit (GPGPU) and the other for Field-Programmable Gate Array (FPGA). The resulting implementations are compared in terms of development effort, accuracy and performance, using datasets from a realistic simulator and from a real automated vehicle.
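The per-cell Bayesian update at the heart of an occupancy filter is tiny, which is why the design effort concentrates on parallelisation; below is a hedged 1-D log-odds sketch with an invented inverse sensor model (the full Bayesian Occupancy Filter also tracks cell velocities, omitted here).

```python
import numpy as np

def logodds(p):
    return np.log(p / (1 - p))

# A 1-D strip of grid cells with prior occupancy 0.5 (log-odds 0).
cells = np.zeros(10)

# Hypothetical inverse sensor model: a range return at cell 6 says the cells
# in front of it are likely free, and the hit cell is likely occupied.
p_free, p_occ = 0.3, 0.8
for i in range(6):
    cells[i] += logodds(p_free)   # evidence for "free"
cells[6] += logodds(p_occ)        # evidence for "occupied"

posterior = 1 / (1 + np.exp(-cells))   # back to probabilities
print(posterior.round(2))

# Each cell updates independently of the others, which is what makes the
# filter so amenable to massively parallel FPGA/GPGPU implementations.
```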
GPU Computing in Bayesian Inference of Realized Stochastic Volatility Model
NASA Astrophysics Data System (ADS)
Takaishi, Tetsuya
2015-01-01
The realized stochastic volatility (RSV) model that utilizes the realized volatility as additional information has been proposed to infer the volatility of financial time series. We consider the Bayesian inference of the RSV model by the Hybrid Monte Carlo (HMC) algorithm. The HMC algorithm can be parallelized and thus performed on the GPU for speedup. The GPU code is developed with CUDA Fortran. We compare the computational time in performing the HMC algorithm on GPU (GTX 760) and CPU (Intel i7-4770, 3.4 GHz) and find that the GPU can be up to 17 times faster than the CPU. We also code the program with OpenACC and find that appropriate coding can achieve a similar speedup to that of CUDA Fortran.
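One HMC transition (leapfrog integration plus a Metropolis accept/reject on the Hamiltonian) can be sketched generically; the standard-normal target below merely stands in for the RSV model's log-posterior and gradient, and it is the many independent gradient terms of a real model that parallelise well on a GPU.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy target: a standard normal in d dimensions (stand-in for the RSV posterior).
logp = lambda q: -0.5 * q @ q
grad = lambda q: -q

def hmc_step(q, eps=0.1, n_leapfrog=20):
    p = rng.normal(size=q.size)                # fresh momentum draw
    q_new, p_new = q.copy(), p.copy()
    p_new += 0.5 * eps * grad(q_new)           # initial half step
    for _ in range(n_leapfrog - 1):
        q_new += eps * p_new
        p_new += eps * grad(q_new)
    q_new += eps * p_new
    p_new += 0.5 * eps * grad(q_new)           # final half step
    # Metropolis accept/reject on the joint Hamiltonian.
    h_old = -logp(q) + 0.5 * p @ p
    h_new = -logp(q_new) + 0.5 * p_new @ p_new
    return q_new if np.log(rng.random()) < h_old - h_new else q

q = np.zeros(50)
samples = []
for _ in range(1000):
    q = hmc_step(q)
    samples.append(q[0])
print(np.mean(samples), np.std(samples))  # should be near 0 and 1
```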
Pettengill, James B; Moeller, David A
2012-09-01
The origins of hybrid zones between parapatric taxa have been of particular interest for understanding the evolution of reproductive isolation and the geographic context of species divergence. One challenge has been to distinguish between allopatric divergence (followed by secondary contact) versus primary intergradation (parapatric speciation) as alternative divergence histories. Here, we use complementary phylogeographic and population genetic analyses to investigate the recent divergence of two subspecies of Clarkia xantiana and the formation of a hybrid zone within the narrow region of sympatry. We tested alternative phylogeographic models of divergence using approximate Bayesian computation (ABC) and found strong support for a secondary contact model and little support for a model allowing for gene flow throughout the divergence process (i.e. primary intergradation). Two independent methods for inferring the ancestral geography of each subspecies, one based on probabilistic character state reconstructions and the other on palaeo-distribution modelling, also support a model of divergence in allopatry and range expansion leading to secondary contact. The membership of individuals to genetic clusters suggests geographic substructure within each taxon where allopatric and sympatric samples are primarily found in separate clusters. We also observed coincidence and concordance of genetic clines across three types of molecular markers, which suggests that there is a strong barrier to gene flow. Taken together, our results provide evidence for allopatric divergence followed by range expansion leading to secondary contact. The location of refugial populations and the directionality of range expansion are consistent with expectations based on climate change since the last glacial maximum. Our approach also illustrates the utility of combining phylogeographic hypothesis testing with species distribution modelling and fine-scale population genetic analyses for inferring the geography of the divergence process. © 2012 Blackwell Publishing Ltd.
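The ABC model-choice logic used for such tests can be reduced to a toy: simulate summary statistics under each divergence model, keep only simulations close to the observed statistics, and read posterior model probabilities off the accepted counts. The two "simulators" and all numbers below are invented stand-ins for coalescent machinery.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy stand-ins for coalescent simulators: each draws from its own prior
# internally and returns two summary statistics (say, Fst and diversity).
def sim_secondary_contact():
    return np.array([rng.normal(0.20, 0.05), rng.normal(0.010, 0.003)])

def sim_primary_intergradation():
    return np.array([rng.normal(0.08, 0.05), rng.normal(0.012, 0.003)])

observed = np.array([0.19, 0.011])

accepted = {"secondary contact": 0, "primary intergradation": 0}
for _ in range(100_000):
    model = "secondary contact" if rng.random() < 0.5 else "primary intergradation"
    s = (sim_secondary_contact() if model == "secondary contact"
         else sim_primary_intergradation())
    # Accept if the scaled distance to the observed statistics is small.
    if np.linalg.norm((s - observed) / observed) < 0.5:
        accepted[model] += 1

total = sum(accepted.values())
for m, k in accepted.items():
    print(f"P({m} | data) = {k / total:.2f}")
```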
The redshift distribution of cosmological samples: a forward modeling approach
NASA Astrophysics Data System (ADS)
Herbel, Jörg; Kacprzak, Tomasz; Amara, Adam; Refregier, Alexandre; Bruderer, Claudio; Nicola, Andrina
2017-08-01
Determining the redshift distribution n(z) of galaxy samples is essential for several cosmological probes including weak lensing. For imaging surveys, this is usually done using photometric redshifts estimated on an object-by-object basis. We present a new approach for directly measuring the global n(z) of cosmological galaxy samples, including uncertainties, using forward modeling. Our method relies on image simulations produced using UFig (Ultra Fast Image Generator) and on ABC (Approximate Bayesian Computation) within the MCCL (Monte-Carlo Control Loops) framework. The galaxy population is modeled using parametric forms for the luminosity functions, spectral energy distributions, sizes and radial profiles of both blue and red galaxies. We apply exactly the same analysis to the real data and to the simulated images, which also include instrumental and observational effects. By adjusting the parameters of the simulations, we derive a set of acceptable models that are statistically consistent with the data. We then apply the same cuts to the simulations that were used to construct the target galaxy sample in the real data. The redshifts of the galaxies in the resulting simulated samples yield a set of n(z) distributions for the acceptable models. We demonstrate the method by determining n(z) for a cosmic-shear-like galaxy sample from the 4-band Subaru Suprime-Cam data in the COSMOS field. We also complement this imaging data with a spectroscopic calibration sample from the VVDS survey. We compare our resulting posterior n(z) distributions to the one derived from photometric redshifts estimated using 36 photometric bands in COSMOS and find good agreement. This offers good prospects for applying our approach to current and future large imaging surveys.
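Schematically, the MCCL/ABC loop amounts to: draw population parameters from the prior, simulate a catalogue, apply the survey cuts, accept parameter sets whose summary statistics are close to the data, and collect n(z) from the accepted simulations. Every ingredient in the toy below is invented; none of it is UFig.

```python
import numpy as np

rng = np.random.default_rng(9)

def simulate_catalog(z_mean, z_sigma, n=5000):
    """Toy image-simulation stand-in: returns redshifts and 'magnitudes'."""
    z = np.abs(rng.normal(z_mean, z_sigma, n))
    mag = 22 + 2 * z + rng.normal(0, 0.5, n)
    return z, mag

def summary(mag):
    return np.array([mag.mean(), mag.std()])

obs_summary = summary(simulate_catalog(0.8, 0.3)[1])   # pretend this is the data

accepted_nz = []
for _ in range(5000):
    zm, zs = rng.uniform(0.3, 1.3), rng.uniform(0.1, 0.6)   # prior draw
    z, mag = simulate_catalog(zm, zs)
    sel = mag < 24                                          # same cuts as the data
    if np.linalg.norm(summary(mag) - obs_summary) < 0.1:    # ABC acceptance
        hist, _ = np.histogram(z[sel], bins=np.linspace(0, 3, 31), density=True)
        accepted_nz.append(hist)

nz = np.array(accepted_nz)
print(f"{len(nz)} acceptable models; per-bin mean and spread give n(z) and its uncertainty")
```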
Wang, Xiu-Ling; Sun, Jian-Yun; Xue, Yan; Zhang, Peng; Zhou, Hui; Qu, Liang-Hu
2012-01-01
Background The Siberian salamander (Ranodon sibiricus), distributed in geographically isolated areas of Central Asia, is an ideal alpine species for studies of conservation and phylogeography. However, there are few data regarding the genetic diversity in R. sibiricus populations. Methodology/Principal Findings We used two genetic markers (mtDNA and microsatellites) to survey all six populations of R. sibiricus in China. Both of the markers revealed extreme genetic uniformity among these populations. There were only three haplotypes in the mtDNA, and the overall nucleotide diversity in the mtDNA was 0.00064, ranging from 0.00000 to 0.00091 for the six populations. Although we recovered 70 sequences containing microsatellite repeats, there were only two loci that displayed polymorphism. We used the approximate Bayesian computation (ABC) method to study the demographic history of the populations. This analysis suggested that the extant populations diverged from the ancestral population approximately 120 years ago and that the historical population size was much larger than the present population size; i.e., R. sibiricus has experienced dramatic population declines. Conclusion/Significance Our findings suggest that the genetic diversity in the R. sibiricus populations is the lowest among all investigated amphibians. We conclude that the isolation of R. sibiricus populations occurred recently and was a result of recent human activity and/or climatic changes. The Pleistocene glaciation oscillations may have facilitated intraspecies genetic homogeneity rather than enhanced divergence. A low genomic evolutionary rate and elevated inbreeding frequency may have also contributed to the low genetic variation observed in this species. Our findings indicate the urgency of implementing a protection plan for this endangered species. PMID:22428037
Manni, Mosè; Guglielmino, Carmela R.; Scolari, Francesca; Vega-Rúa, Anubis; Failloux, Anna-Bella; Somboon, Pradya; Lisa, Antonella; Savini, Grazia; Bonizzoni, Mariangela; Gomulski, Ludvik M.; Malacrida, Anna R.
2017-01-01
Background Invasive species represent a global concern for their rapid spread and the possibility of infectious disease transmission. This is the case of the global invader Aedes albopictus, the Asian tiger mosquito. This species is a vector of medically important arboviruses, notably chikungunya (CHIKV), dengue (DENV) and Zika (ZIKV). The reconstruction of the complex colonization pattern of this mosquito has great potential for mitigating its spread and, consequently, disease risks. Methodology/Principal findings Classical population genetics analyses and Approximate Bayesian Computation (ABC) approaches were combined to disentangle the demographic history of Aedes albopictus populations from representative countries in the Southeast Asian native range and in the recent and more recently colonized areas. In Southeast Asia, the low differentiation and the high co-ancestry values identified among China, Thailand and Japan indicate that, in the native range, these populations maintain high genetic connectivity, revealing their ancestral common origin. China appears to be the oldest population. Outside Southeast Asia, the invasion process in La Réunion, America and the Mediterranean Basin is primarily supported by a chaotic propagule distribution, which cooperates in maintaining a relatively high genetic diversity within the adventive populations. Conclusions/Significance From our data, it appears that independent and also trans-continental introductions of Ae. albopictus may have facilitated the rapid establishment of adventive populations through admixture of unrelated genomes. As a consequence, a great amount of intra-population variability has been detected, and it is likely that this variability may extend to the genetic mechanisms controlling vector competence. Thus, in the context of the invasion process of this mosquito, it is possible that both population ancestry and admixture contribute to create the conditions for the efficient transmission of arboviruses and for outbreak establishment. PMID:28135274
A Generalized QMRA Beta-Poisson Dose-Response Model.
Xie, Gang; Roiko, Anne; Stratton, Helen; Lemckert, Charles; Dunn, Peter K; Mengersen, Kerrie
2016-10-01
Quantitative microbial risk assessment (QMRA) is widely accepted for characterizing the microbial risks associated with food, water, and wastewater. Single-hit dose-response models are the most commonly used dose-response models in QMRA. Denoting PI(d) as the probability of infection at a given mean dose d, a three-parameter generalized QMRA beta-Poisson dose-response model, PI(d|α,β,r*), is proposed in which the minimum number of organisms required for causing infection, Kmin, is not fixed, but a random variable following a geometric distribution with parameter 0 < r* ≤ 1.
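Read as a generative recipe, the model can be checked by Monte Carlo: the ingested dose is Poisson with mean d, each organism survives with probability drawn from Beta(α, β), and infection occurs once at least Kmin organisms survive, with Kmin geometric with parameter r*. The sketch below also prints the familiar closed-form approximation to the single-hit (r* = 1) beta-Poisson as a reference; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(10)

def p_infection(d, alpha, beta, r_star, n=200_000):
    """Monte Carlo estimate of PI(d | alpha, beta, r*) for the generalized model."""
    dose = rng.poisson(d, n)                  # actual number of organisms ingested
    p = rng.beta(alpha, beta, n)              # host-pathogen survival probability
    survivors = rng.binomial(dose, p)
    k_min = rng.geometric(r_star, n)          # minimum organisms needed for infection
    return (survivors >= k_min).mean()

alpha, beta, d = 0.3, 40.0, 100.0
print("generalized, r* = 0.5:", p_infection(d, alpha, beta, 0.5))
print("single-hit,  r* = 1.0:", p_infection(d, alpha, beta, 1.0))
# Widely used closed-form approximation to the single-hit beta-Poisson:
print("approximation:        ", 1 - (1 + d / beta) ** -alpha)
```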
Semerikov, Vladimir L; Semerikova, Svetlana A; Polezhaeva, Maria A; Kosintsev, Pavel A; Lascoux, Martin
2013-10-01
While many species were confined to southern latitudes during the last glaciations, there has lately been mounting evidence that some of the most cold-tolerant species were actually able to survive close to the ice sheets. The contribution of these higher latitude outposts to the main recolonization thrust remains, however, untested. In the present study, we use the first range-wide survey of genetic diversity at cytoplasmic markers in Siberian larch (Larix sibirica; four mitochondrial (mt) DNA loci and five chloroplast (cp) DNA SSR loci) to (i) assess the relative contributions of southern and central areas to the current L. sibirica distribution range; and (ii) date the last major population expansion in both L. sibirica and adjacent Larix species. The geographic distribution of cpDNA variation was uninformative, but that of mitotypes clearly indicates that the southernmost populations, located in Mongolia and the Tien-Shan and Sayan Mountain ranges, had a very limited contribution to the current populations of the central and northern parts of the range. It also suggests that the contribution of the high latitude cryptic refugia was geographically limited and that most of the current West Siberian Plain larch populations likely originated in the foothills of the Sayan Mountains. Interestingly, the main population expansion detected through Approximate Bayesian Computation (ABC) in all four larch species investigated here pre-dates the LGM, with a mode in a range of 220,000-1,340,000 years BP. Hence, L. sibirica, like other major conifer species of the boreal forest, was strongly affected by climatic events pre-dating the Last Glacial Maximum. © 2013 John Wiley & Sons Ltd.
The Military Applications of Cloud Computing Technologies
2013-05-23
tactical networks will potentially cause some unique issues when implementing the JIE. Tactical networks are temporary in nature, and are utilized... connected ABCS clients will receive software updates and security patches as they are published over the network, rather than catching up after an extended... approach from the previous JNN network model, in that it introduces a limited, wireless capability to a unit's LAN that will enable limited, on-the...
Reid-Hresko, John
2014-01-01
ABC-based HIV-prevention programmes have been widely employed in northern Tanzanian wildlife conservation settings in an attempt to (re)shape the sexual behaviours of conservation actors. Utilising findings from 66 semi-structured interviews conducted in 2009-2010, this paper examines ABC prevention as a form of Foucauldian governmentality--circulating technologies of power that mobilise disciplinary technologies and attempt to transform such efforts into technologies of the self--and explores how individuals understand and respond to attempts to govern their behaviour. ABC regimes attempt to rework subjectivity, positioning HIV-related behaviours within a risk-based neoliberal rationality. However, efforts to use ABC as a technology to govern populations and individual bodies are largely incommensurate with existing Tanzanian sociocultural formations, including economic and gendered inequalities, and local understandings of sexuality. The language research participants used to talk about ABC and the justifications they offered for non-compliance illuminate this discrepancy. Data reveal that the recipients of ABC campaigns are active producers of understandings that work for them in their lives, but may not produce the behavioural shifts envisioned by programme goals. These findings corroborate previous research, which questions the continued plausibility of ABC as a stand-alone HIV- prevention framework.
Applying Activity Based Costing (ABC) Method to Calculate Cost Price in Hospital and Remedy Services
Rajabi, A; Dabiri, A
2012-01-01
Background Activity Based Costing (ABC) is one of the new methods that began appearing as a costing methodology in the 1990s. It calculates cost price by determining the usage of resources. In this study, the ABC method was used for calculating the cost price of remedial services in hospitals. Methods: To apply the ABC method, Shahid Faghihi Hospital was selected. First, hospital units were divided into three main departments: administrative, diagnostic, and hospitalization. Second, activity centers were defined by the activity analysis method. Third, costs of administrative activity centers were allocated to the diagnostic and operational departments based on cost drivers. Finally, the cost price of medical services was calculated according to each cost object's usage of the services of the activity centers. Results: The cost price from the ABC method differs significantly from the tariff method. In addition, the high amount of indirect costs in the hospital indicates that the capacities of resources are not used properly. Conclusion: The cost price of remedial services is not properly calculated with the tariff method when compared with the ABC method. ABC calculates cost price by applying suitable allocation mechanisms, whereas the tariff method is based on fixed prices. In addition, ABC provides useful information about the amount and composition of the cost price of services. PMID:23113171
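The three allocation steps can be captured in a few lines of arithmetic; every figure below is invented and serves only to show how costs flow from overhead through activity centers to a service's cost price.

```python
# Minimal activity-based costing sketch with invented figures.

# Step 1: administrative overhead allocated to activity centers by a cost
# driver (here: share of staff hours consumed).
overhead = 120_000.0
driver_share = {"laboratory": 0.25, "radiology": 0.15, "ward": 0.60}
allocated = {c: overhead * s for c, s in driver_share.items()}

# Step 2: each activity center's total cost (direct + allocated) divided by
# its activity volume gives a unit activity cost.
direct = {"laboratory": 300_000.0, "radiology": 200_000.0, "ward": 900_000.0}
volume = {"laboratory": 50_000, "radiology": 10_000, "ward": 30_000}  # tests, scans, bed-days
unit_cost = {c: (direct[c] + allocated[c]) / volume[c] for c in direct}

# Step 3: the cost price of one service sums the activities it consumes.
service_usage = {"laboratory": 6, "radiology": 1, "ward": 3}   # hypothetical case
cost_price = sum(unit_cost[c] * q for c, q in service_usage.items())
print({c: round(u, 2) for c, u in unit_cost.items()}, round(cost_price, 2))
```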
Diagnosis and Reconfiguration using Bayesian Networks: An Electrical Power System Case Study
NASA Technical Reports Server (NTRS)
Knox, W. Bradley; Mengshoel, Ole
2009-01-01
Automated diagnosis and reconfiguration are important computational techniques that aim to minimize human intervention in autonomous systems. In this paper, we develop novel techniques and models in the context of diagnosis and reconfiguration reasoning using causal Bayesian networks (BNs). We take as starting point a successful diagnostic approach, using a static BN developed for a real-world electrical power system. We discuss in this paper the extension of this diagnostic approach along two dimensions, namely: (i) from a static BN to a dynamic BN; and (ii) from a diagnostic task to a reconfiguration task. More specifically, we discuss the auto-generation of a dynamic Bayesian network from a static Bayesian network. In addition, we discuss subtle, but important, differences between Bayesian networks when used for diagnosis versus reconfiguration. We discuss a novel reconfiguration agent, which models a system causally, including effects of actions through time, using a dynamic Bayesian network. Though the techniques we discuss are general, we demonstrate them in the context of electrical power systems (EPSs) for aircraft and spacecraft. EPSs are vital subsystems on-board aircraft and spacecraft, and many incidents and accidents of these vehicles have been attributed to EPS failures. We discuss a case study that provides initial but promising results for our approach in the setting of electrical power systems.
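At its smallest, diagnostic reasoning in a causal BN is Bayes' rule over a health variable given a sensor reading; the two-node sketch below uses invented CPT numbers and hints at how the dynamic, reconfiguration-oriented extension would grow from it.

```python
# Diagnosis in a miniature causal BN: Health -> SensorReading.
# All probabilities below are invented for illustration.

p_healthy = 0.95                          # prior over the EPS component
p_alarm_given = {"healthy": 0.02,         # P(alarm | health): the CPT
                 "faulty": 0.90}

def posterior_faulty(alarm: bool) -> float:
    """P(faulty | sensor) via Bayes' rule (exact inference by enumeration)."""
    like_f = p_alarm_given["faulty"] if alarm else 1 - p_alarm_given["faulty"]
    like_h = p_alarm_given["healthy"] if alarm else 1 - p_alarm_given["healthy"]
    joint_f = (1 - p_healthy) * like_f
    joint_h = p_healthy * like_h
    return joint_f / (joint_f + joint_h)

print(f"P(faulty | alarm)    = {posterior_faulty(True):.3f}")
print(f"P(faulty | no alarm) = {posterior_faulty(False):.4f}")

# A dynamic BN for reconfiguration would unroll this over time slices and add
# action nodes whose effects propagate along the causal arrows.
```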
NASA Astrophysics Data System (ADS)
Zaveri, Mazad Shaheriar
The semiconductor/computer industry has been following Moore's law for several decades and has reaped the benefits in speed and density of the resultant scaling. Transistor density has reached almost one billion per chip, and transistor delays are in picoseconds. However, scaling has slowed down, and the semiconductor industry is now facing several challenges. Hybrid CMOS/nano technologies, such as CMOL, are considered as an interim solution to some of the challenges. Another potential architectural solution includes specialized architectures for applications/models in the intelligent computing domain, one aspect of which includes abstract computational models inspired by the neuro/cognitive sciences. Consequently, in this dissertation, we focus on the hardware implementations of Bayesian Memory (BM), which is a (Bayesian) Biologically Inspired Computational Model (BICM). This model is a simplified version of George and Hawkins' model of the visual cortex, which includes an inference framework based on Judea Pearl's belief propagation. We then present a "hardware design space exploration" methodology for implementing and analyzing the (digital and mixed-signal) hardware for the BM. This particular methodology involves: analyzing the computational/operational cost and the related micro-architecture, exploring candidate hardware components, proposing various custom hardware architectures using both traditional CMOS and hybrid nanotechnology - CMOL, and investigating the baseline performance/price of these architectures. The results suggest that CMOL is a promising candidate for implementing a BM. Such implementations can utilize the very high density storage/computation benefits of these new nano-scale technologies much more efficiently; for example, the throughput per 858 mm² (TPM) obtained for CMOL based architectures is 32 to 40 times better than the TPM for a CMOS based multiprocessor/multi-FPGA system, and almost 2000 times better than the TPM for a PC implementation. We later use this methodology to investigate the hardware implementations of a cortex-scale spiking neural system, which is an approximate neural equivalent of a BICM-based cortex-scale system. The results of this investigation also suggest that CMOL is a promising candidate to implement such large-scale neuromorphic systems. In general, the assessment of such hypothetical baseline hardware architectures provides the prospects for building large-scale (mammalian cortex-scale) implementations of neuromorphic/Bayesian/intelligent systems using state-of-the-art and beyond state-of-the-art silicon structures.
MetaABC--an integrated metagenomics platform for data adjustment, binning and clustering.
Su, Chien-Hao; Hsu, Ming-Tsung; Wang, Tse-Yi; Chiang, Sufeng; Cheng, Jen-Hao; Weng, Francis C; Kao, Cheng-Yan; Wang, Daryi; Tsai, Huai-Kuang
2011-08-15
MetaABC is a metagenomic platform that integrates several binning tools coupled with methods for removing artifacts, analyzing unassigned reads and controlling sampling biases. It allows users to arrive at a better interpretation via a series of distinct combinations of analysis tools. After execution, MetaABC provides outputs in various visual formats such as tables, pie and bar charts as well as clustering result diagrams. MetaABC source code and documentation are available at http://bits2.iis.sinica.edu.tw/MetaABC/ CONTACT: dywang@gate.sinica.edu.tw; hktsai@iis.sinica.edu.tw Supplementary data are available at Bioinformatics online.
ERIC Educational Resources Information Center
Landsbergen, Jan, Ed.; Odijk, Jan, Ed.; van Deemter, Kees, Ed.; van Zanten, Gert Veldhuijzen, Ed.
Papers from the meeting on computational linguistics include: "Conversational Games, Belief Revision and Bayesian Networks" (Stephen G. Pulman); "Valence Alternation without Lexical Rules" (Gosse Bouma); "Filtering Left Dislocation Chains in Parsing Categorical Grammar" (Crit Cremers, Maarten Hijzelendoorn);…
Specifying and Refining a Measurement Model for a Computer-Based Interactive Assessment
ERIC Educational Resources Information Center
Levy, Roy; Mislevy, Robert J.
2004-01-01
The challenges of modeling students' performance in computer-based interactive assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance. This article describes a Bayesian approach to modeling and estimating cognitive models…
Bayesian Methods and Confidence Intervals for Automatic Target Recognition of SAR Canonical Shapes
2014-03-27
and DirectX [22]. The CUDA platform was developed by the NVIDIA Corporation to allow programmers access to the computational capabilities of the... were used for the intense repetitive computations. Developing CUDA software requires writing code for specialized compilers provided by NVIDIA and...
Computer Aided Evaluation of Higher Education Tutors' Performance
ERIC Educational Resources Information Center
Xenos, Michalis; Papadopoulos, Thanos
2007-01-01
This article presents a method for computer-aided tutor evaluation: Bayesian Networks are used for organizing the collected data about tutors and for enabling accurate estimations and predictions about future tutor behavior. The model provides indications about each tutor's strengths and weaknesses, which enables the evaluator to exploit strengths…
Characterisation of single domain ATP-binding cassette protein homologues of Theileria parva.
Kibe, M K; Macklin, M; Gobright, E; Bishop, R; Urakawa, T; ole-MoiYoi, O K
2001-09-01
Two distinct genes encoding single domain, ATP-binding cassette transport protein homologues of Theileria parva were cloned and sequenced. Neither of the genes is tandemly duplicated. One gene, TpABC1, encodes a predicted protein of 593 amino acids with an N-terminal hydrophobic domain containing six potential membrane-spanning segments. A single discontinuous ATP-binding element was located in the C-terminal region of TpABC1. The second gene, TpABC2, also contains a single C-terminal ATP-binding motif. Copies of TpABC2 were present at four loci in the T. parva genome on three different chromosomes. TpABC1 exhibited allelic polymorphism between stocks of the parasite. Comparison of cDNA and genomic sequences revealed that TpABC1 contained seven short introns, between 29 and 84 bp in length. The full-length TpABC1 protein was expressed in insect cells using the baculovirus system. Application of antibodies raised against the recombinant antigen to western blots of T. parva piroplasm lysates detected an 85 kDa protein in this life-cycle stage.
Doliwa, Christelle; Escotte-Binet, Sandie; Aubert, Dominique; Sauvage, Virginie; Velard, Frédéric; Schmid, Aline; Villena, Isabelle
2013-01-01
Several treatment failures have been reported for the treatment of toxoplasmic encephalitis, chorioretinitis, and congenital toxoplasmosis. Recently we found three Toxoplasma gondii strains naturally resistant to sulfadiazine and we developed in vitro two sulfadiazine resistant strains, RH-RSDZ and ME-49-RSDZ, by gradual pressure. In Plasmodium, common mechanisms of drug resistance involve, among others, mutations and/or amplification within genes encoding the therapeutic targets dhps and dhfr and/or the ABC transporter genes family. To identify genotypic and/or phenotypic markers of resistance in T. gondii, we sequenced and analyzed the expression levels of therapeutic targets dhps and dhfr, three ABC genes, two Pgp, TgABC.B1 and TgABC.B2, and one MRP, TgABC.C1, on sensitive strains compared to sulfadiazine resistant strains. Neither polymorphism nor overexpression was identified. Contrary to Plasmodium, in which mutations and/or overexpression within gene targets and ABC transporters are involved in antimalarial resistance, T. gondii sulfadiazine resistance is not related to these toxoplasmic genes studied. PMID:23707894
Optimization of Straight Cylindrical Turning Using Artificial Bee Colony (ABC) Algorithm
NASA Astrophysics Data System (ADS)
Prasanth, Rajanampalli Seshasai Srinivasa; Hans Raj, Kandikonda
2017-04-01
The artificial bee colony (ABC) algorithm, which mimics the intelligent foraging behavior of honey bees, is increasingly gaining acceptance in the field of process optimization, as it is capable of handling nonlinearity, complexity and uncertainty. Straight cylindrical turning is a complex and nonlinear machining process which involves the selection of appropriate cutting parameters that affect the quality of the workpiece. This paper presents the estimation of optimal cutting parameters of the straight cylindrical turning process using the ABC algorithm. The ABC algorithm is first tested on four benchmark problems of numerical optimization and its performance is compared with the genetic algorithm (GA) and the ant colony optimization (ACO) algorithm. Results indicate that the rate of convergence of the ABC algorithm is better than that of GA and ACO. Then, the ABC algorithm is used to predict optimal cutting parameters such as cutting speed, feed rate, depth of cut and tool nose radius to achieve a good surface finish. Results indicate that the ABC algorithm estimated a comparable surface finish when compared with the real coded genetic algorithm and the differential evolution algorithm.
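For concreteness, a compact ABC optimizer with the three canonical phases (employed, onlooker, scout) is sketched below on a stand-in objective; a surface-roughness model of speed, feed, depth of cut and nose radius would take the place of `sphere`.

```python
import numpy as np

rng = np.random.default_rng(11)

def sphere(x):                       # stand-in objective; replace with a
    return float(np.sum(x ** 2))     # surface-roughness model of the cut

dim, n_food, limit, iters = 4, 20, 30, 200
lo, hi = -5.0, 5.0
food = rng.uniform(lo, hi, (n_food, dim))     # one food source per employed bee
cost = np.array([sphere(x) for x in food])
trials = np.zeros(n_food, dtype=int)

def try_neighbour(i):
    """Perturb source i toward/away from a random partner; keep if better."""
    partner = (i + 1 + rng.integers(n_food - 1)) % n_food   # any source but i
    phi = rng.uniform(-1, 1, dim)
    cand = np.clip(food[i] + phi * (food[i] - food[partner]), lo, hi)
    c = sphere(cand)
    if c < cost[i]:
        food[i], cost[i], trials[i] = cand, c, 0
    else:
        trials[i] += 1

for _ in range(iters):
    for i in range(n_food):                    # employed-bee phase
        try_neighbour(i)
    fitness = 1.0 / (1.0 + cost)               # onlooker selection probabilities
    probs = fitness / fitness.sum()
    for _ in range(n_food):                    # onlooker-bee phase
        try_neighbour(rng.choice(n_food, p=probs))
    for i in np.flatnonzero(trials > limit):   # scout phase: abandon stale sources
        food[i] = rng.uniform(lo, hi, dim)
        cost[i], trials[i] = sphere(food[i]), 0

print("best:", cost.min(), food[cost.argmin()].round(3))
```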
Thomssen, Christoph; Marschner, Norbert; Untch, Michael; Decker, Thomas; Hegewisch-Becker, Susanna; Jackisch, Christian; Janni, Wolfgang; Lück, Hans-Joachim; von Minckwitz, Gunter; Scharl, Anton; Schneeweiss, Andreas; Tesch, Hans; Welt, Anja; Harbeck, Nadia
2012-02-01
A group of German breast cancer experts (medical oncologists and gynaecologists) reviewed and commented on the results of the first international 'Advanced Breast Cancer First Consensus Conference' (ABC1) for the diagnosis and treatment of advanced breast cancer. The ABC1 Conference is an initiative of the European School of Oncology (ESO) Metastatic Breast Cancer Task Force in cooperation with the EBCC (European Breast Cancer Conference), ESMO (European Society of Medical Oncology) and the American JNCI (Journal of the National Cancer Institute). The main focus of the ABC1 Conference was metastatic breast cancer (stage IV). The ABC1 consensus is based on the votes of 33 breast cancer experts from different countries, and the German expert group has adopted it as a guideline for therapeutic practice. The objective of the ABC1 consensus, as well as of the German comments, is to provide an internationally standardized, evidence-based foundation for qualified decision-making in the treatment of metastatic breast cancer.
NASA Astrophysics Data System (ADS)
Hadia, Sarman K.; Thakker, R. A.; Bhatt, Kirit R.
2016-05-01
The study proposes an application of evolutionary algorithms, specifically the artificial bee colony (ABC), a variant ABC, and particle swarm optimisation (PSO), to extract the parameters of a metal oxide semiconductor field effect transistor (MOSFET) model. These algorithms are applied to the MOSFET parameter extraction problem using a Pennsylvania surface potential model. MOSFET parameter extraction involves reducing the error between measured and modelled data. This study shows that the ABC algorithm optimises the parameter values based on the intelligent activities of honey bee swarms; some modifications have also been applied to the basic ABC algorithm. Particle swarm optimisation is a population-based stochastic optimisation method inspired by bird flocking behaviour. The performance of these algorithms is compared with respect to the quality of the solutions. The simulation results show that the PSO algorithm outperforms both the variant ABC and the basic ABC algorithm for the parameter extraction of the MOSFET model; the implementation of the ABC algorithm, however, is simpler than that of the PSO algorithm.
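To make the extraction loop concrete, here is a hedged Python sketch of PSO fitting a toy square-law MOSFET drain-current model to synthetic "measured" data by minimising the sum of squared errors. The square-law model, swarm settings, and data are stand-ins invented for this example; the study itself uses the far more detailed Pennsylvania surface potential model.

```python
# PSO sketch for parameter extraction on a toy MOSFET model.
# The model, data, and settings are assumptions, not the paper's setup.
import random

def drain_current(vgs, k, vth):
    """Toy square-law saturation model: Id = k*(Vgs - Vth)^2 above threshold."""
    return k * (vgs - vth) ** 2 if vgs > vth else 0.0

# Synthetic "measured" data generated from true parameters k=5e-4, Vth=0.7 V.
vgs_points = [0.8 + 0.1 * i for i in range(15)]
measured = [drain_current(v, 5e-4, 0.7) for v in vgs_points]

def sse(params):
    """Objective: sum of squared errors between measured and modelled currents."""
    k, vth = params
    return sum((m - drain_current(v, k, vth)) ** 2
               for v, m in zip(vgs_points, measured))

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal bests
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]         # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Velocity update: inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

params, err = pso(sse, bounds=[(1e-4, 1e-2), (0.0, 2.0)])
print(params, err)   # should recover roughly k=5e-4 and Vth=0.7
```

Swapping in the ABC routine sketched earlier for the same `sse` objective is the essence of the comparison the study reports: both methods only need the error function, not gradients of the device model.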
Ahn, Woo-Young; Haines, Nathaniel; Zhang, Lei
2017-01-01
Reinforcement learning and decision-making (RLDM) provide a quantitative framework and computational theories with which we can disentangle psychiatric conditions into the basic dimensions of neurocognitive functioning. RLDM offer a novel approach to assessing and potentially diagnosing psychiatric patients, and there is growing enthusiasm for both RLDM and computational psychiatry among clinical researchers. Such a framework can also provide insights into the brain substrates of particular RLDM processes, as exemplified by model-based analysis of data from functional magnetic resonance imaging (fMRI) or electroencephalography (EEG). However, researchers often find the approach too technical and have difficulty adopting it for their research. Thus, a critical need remains to develop a user-friendly tool for the wide dissemination of computational psychiatric methods. We introduce an R package called hBayesDM (hierarchical Bayesian modeling of Decision-Making tasks), which offers computational modeling of an array of RLDM tasks and social exchange games. The hBayesDM package offers state-of-the-art hierarchical Bayesian modeling, in which both individual and group parameters (i.e., posterior distributions) are estimated simultaneously in a mutually constraining fashion. At the same time, the package is extremely user-friendly: users can perform computational modeling, output visualization, and Bayesian model comparisons, each with a single line of code. Users can also extract the trial-by-trial latent variables (e.g., prediction errors) required for model-based fMRI/EEG. With the hBayesDM package, we anticipate that anyone with minimal knowledge of programming can take advantage of cutting-edge computational-modeling approaches to investigate the underlying processes of, and interactions between, multiple decision-making (e.g., goal-directed, habitual, and Pavlovian) systems. In this way, we expect that the hBayesDM package will contribute to the dissemination of advanced modeling approaches and enable a wide range of researchers to easily perform computational psychiatric research within different populations. PMID:29601060
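hBayesDM is an R package built on Stan, so a snippet here can only gesture at the idea rather than reproduce its API. The following is a conceptual Python sketch, with made-up numbers, of the partial pooling that hierarchical Bayesian estimation performs: each individual estimate is shrunk toward the group mean in a precision-weighted way, which is the "mutually constraining" relationship between individual and group parameters described above.

```python
# Conceptual sketch of hierarchical shrinkage (normal-normal model with
# known variances). Not the hBayesDM API; all values below are invented.
import statistics

# Per-subject raw parameter estimates (e.g., learning rates) and assumed
# within-subject sampling s.d. and between-subject group s.d.
subject_means = [0.12, 0.45, 0.30, 0.80, 0.22]
sigma_subj = 0.15   # within-subject (sampling) s.d., assumed known
tau_group = 0.20    # between-subject s.d. of the group prior, assumed known

# Group-level mean estimated from the data (empirical-Bayes shortcut for
# what a full hierarchical model infers jointly with the subject parameters).
mu_group = statistics.mean(subject_means)

# Posterior mean per subject: precision-weighted blend of the raw estimate
# and the group mean. Extreme subjects are pulled toward the group.
w = (1 / sigma_subj**2) / (1 / sigma_subj**2 + 1 / tau_group**2)
posterior_means = [w * m + (1 - w) * mu_group for m in subject_means]

for raw, post in zip(subject_means, posterior_means):
    print(f"raw={raw:.2f} -> shrunk={post:.2f}")
```

In the full package, the group mean and variance are themselves estimated with posterior uncertainty via MCMC rather than plugged in, but the shrinkage behaviour shown here is the core of why hierarchical estimates stabilise noisy individual-level parameters.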