Estimating a Noncompensatory IRT Model Using Metropolis within Gibbs Sampling
ERIC Educational Resources Information Center
Babcock, Ben
2011-01-01
Relatively little research has been conducted with the noncompensatory class of multidimensional item response theory (MIRT) models. A Monte Carlo simulation study was conducted exploring the estimation of a two-parameter noncompensatory item response theory (IRT) model. The estimation method used was a Metropolis-Hastings within Gibbs algorithm…
Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo
NASA Astrophysics Data System (ADS)
Schön, Thomas B.; Svensson, Andreas; Murray, Lawrence; Lindsten, Fredrik
2018-05-01
Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, predictions and decisions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state-space models. There is no closed-form solution available for this problem, implying that we are forced to use approximations. In this tutorial we will provide a self-contained introduction to one of the state-of-the-art methods, the particle Metropolis-Hastings algorithm, which has proven to offer a practical approximation. This is a Monte Carlo-based method, where the particle filter is used to guide a Markov chain Monte Carlo method through the parameter space. One of the key merits of the particle Metropolis-Hastings algorithm is that it is guaranteed to converge to the "true solution" under mild assumptions, despite being based on a particle filter with only a finite number of particles. We will also provide a motivating numerical example illustrating the method using a modeling language tailored for sequential Monte Carlo methods. The intention of modeling languages of this kind is to open up the power of sophisticated Monte Carlo methods, including particle Metropolis-Hastings, to a large group of users without requiring them to know all the underlying mathematical details.
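The particle Metropolis-Hastings construction described above can be sketched for a toy linear-Gaussian state-space model: a bootstrap particle filter supplies an unbiased likelihood estimate, which then drives an ordinary random-walk Metropolis chain. The model, function names, and tuning constants here are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def particle_loglik(theta, ys, n_particles=100, rng=None):
    """Bootstrap particle filter estimate of log p(y_1:T | theta) for the
    toy model x_t = theta * x_{t-1} + v_t, y_t = x_t + e_t, v, e ~ N(0, 1)."""
    rng = rng or random.Random(0)
    parts = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    ll = 0.0
    for y in ys:
        # propagate particles through the state equation
        parts = [theta * x + rng.gauss(0.0, 1.0) for x in parts]
        # weight by the observation density N(y; x, 1)
        ws = [math.exp(-0.5 * (y - x) ** 2) for x in parts]
        s = sum(ws)
        if s == 0.0:  # all weights underflowed
            return float("-inf")
        ll += math.log(s / (n_particles * math.sqrt(2.0 * math.pi)))
        parts = rng.choices(parts, weights=ws, k=n_particles)  # resample
    return ll

def particle_mh(ys, n_iters=400, step=0.1, seed=3):
    """Particle Metropolis-Hastings: random-walk MH on theta, with the exact
    likelihood replaced by the particle filter's unbiased estimate."""
    rng = random.Random(seed)
    theta = 0.0
    ll = particle_loglik(theta, ys, rng=rng)
    chain = []
    for _ in range(n_iters):
        prop = theta + rng.gauss(0.0, step)
        ll_prop = particle_loglik(prop, ys, rng=rng)
        # flat prior on theta; accept with the usual Metropolis ratio
        if ll_prop > float("-inf") and math.log(1.0 - rng.random()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        chain.append(theta)
    return chain
```

Despite the noisy likelihood estimate, the chain targets the exact posterior in theory (the pseudo-marginal property alluded to in the abstract); the stored estimate for the current state is reused rather than recomputed.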
A quantum–quantum Metropolis algorithm
Yung, Man-Hong; Aspuru-Guzik, Alán
2012-01-01
The classical Metropolis sampling method is a cornerstone of many statistical modeling applications that range from physics, chemistry, and biology to economics. This method is particularly suitable for sampling the thermal distributions of classical systems. The challenge of extending this method to the simulation of arbitrary quantum systems is that, in general, eigenstates of quantum Hamiltonians cannot be obtained efficiently with a classical computer. However, this challenge can be overcome by quantum computers. Here, we present a quantum algorithm which fully generalizes the classical Metropolis algorithm to the quantum domain. The meaning of quantum generalization is twofold: The proposed algorithm is not only applicable to both classical and quantum systems, but also offers a quantum speedup relative to the classical counterpart. Furthermore, unlike the classical method of quantum Monte Carlo, this quantum algorithm does not suffer from the negative-sign problem associated with fermionic systems. Applications of this algorithm include the study of low-temperature properties of quantum systems, such as the Hubbard model, and preparing the thermal states of sizable molecules to simulate, for example, chemical reactions at an arbitrary temperature. PMID:22215584
NASA Astrophysics Data System (ADS)
Eric, L.; Vrugt, J. A.
2010-12-01
Spatially distributed hydrologic models potentially contain hundreds of parameters that need to be derived by calibration against a historical record of input-output data. The quality of this calibration strongly determines the predictive capability of the model and thus its usefulness for science-based decision making and forecasting. Unfortunately, high-dimensional optimization problems are typically difficult to solve. Here we present our recent developments to the Differential Evolution Adaptive Metropolis (DREAM) algorithm (Vrugt et al., 2009) to enable efficient solution of high-dimensional parameter estimation problems. The algorithm samples from an archive of past states (Ter Braak and Vrugt, 2008), and uses multiple-try Metropolis sampling (Liu et al., 2000) to decrease the required burn-in time for each individual chain and increase the efficiency of posterior sampling. This approach is hereafter referred to as MT-DREAM. We present results for two synthetic mathematical case studies and two real-world examples involving from 10 to 240 parameters. Results for those cases show that our multiple-try sampler, MT-DREAM, can consistently find better solutions than other Bayesian MCMC methods. Moreover, MT-DREAM is well suited to implementation on a parallel machine and is therefore a powerful method for posterior inference.
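The multiple-try Metropolis step that MT-DREAM borrows from Liu et al. (2000) can be illustrated in one dimension. With a symmetric proposal the trial weights reduce to the target density itself; the function names and settings below are illustrative assumptions.

```python
import math
import random

def mtm_sample(log_target, x0, n_steps=20_000, k=5, step=2.0, seed=7):
    """Multiple-try Metropolis (Liu, Liang & Wong, 2000) with a symmetric
    uniform proposal, so w(y) = pi(y)."""
    rng = random.Random(seed)
    x, chain = x0, []
    for _ in range(n_steps):
        # draw k trials from the current state and weight by the target
        ys = [x + rng.uniform(-step, step) for _ in range(k)]
        wy = [math.exp(log_target(y)) for y in ys]
        # select one trial proportionally to its weight
        y = rng.choices(ys, weights=wy)[0]
        # reference points drawn from the selected trial, plus the old state
        xs = [y + rng.uniform(-step, step) for _ in range(k - 1)] + [x]
        wx = [math.exp(log_target(z)) for z in xs]
        # generalized Metropolis ratio of summed weights
        if rng.random() < min(1.0, sum(wy) / sum(wx)):
            x = y
        chain.append(x)
    return chain
```

Drawing several trials per step raises the acceptance rate and shortens burn-in, which is exactly the benefit the abstract claims for each individual DREAM chain.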
Application of Biased Metropolis Algorithms: From protons to proteins
Bazavov, Alexei; Berg, Bernd A.; Zhou, Huan-Xiang
2015-01-01
We show that sampling with a biased Metropolis scheme is essentially equivalent to using the heatbath algorithm. However, the biased Metropolis method can also be applied when an efficient heatbath algorithm does not exist. This is first illustrated with an example from high energy physics (lattice gauge theory simulations). We then illustrate the Rugged Metropolis method, which is based on a similar biased updating scheme, but aims at very different applications. The goal of such applications is to locate the most likely configurations in a rugged free energy landscape, which is most relevant for simulations of biomolecules. PMID:26612967
2017-09-01
This dissertation explores the efficacy of statistical post-processing methods downstream of dynamical model components, using a hierarchical multivariate Bayesian approach. Keywords: Bayesian hierarchical modeling, Markov chain Monte Carlo methods, Metropolis algorithm, machine learning, atmospheric prediction.
Biased Metropolis Sampling for Rugged Free Energy Landscapes
NASA Astrophysics Data System (ADS)
Berg, Bernd A.
2003-11-01
Metropolis simulations of all-atom models of peptides (i.e., small proteins) are considered. Inspired by the funnel picture of Bryngelson and Wolynes, a transformation of the updating probabilities of the dihedral angles is defined, which uses probability densities from a higher temperature to improve the algorithmic performance at a lower temperature. The method is suitable for canonical as well as for generalized ensemble simulations. A simple approximation to the full transformation is tested at room temperature for Met-Enkephalin in vacuum. Integrated autocorrelation times are found to be reduced by factors close to two, and a similar improvement due to generalized ensemble methods enters multiplicatively.
Mori, Yoshiharu; Okumura, Hisashi
2015-12-05
Simulated tempering (ST) is a useful method to enhance the sampling of molecular simulations. When ST is used, the Metropolis algorithm, which satisfies the detailed balance condition, is usually applied to calculate the transition probability. Recently, an alternative method that satisfies the global balance condition instead of the detailed balance condition has been proposed by Suwa and Todo. In this study, an ST method with the Suwa-Todo algorithm is proposed. Molecular dynamics simulations with ST are performed with three algorithms (the Metropolis, heat bath, and Suwa-Todo algorithms) to calculate the transition probability. Among the three algorithms, the Suwa-Todo algorithm yields the highest acceptance ratio and the shortest autocorrelation time. These results suggest that sampling by an ST simulation with the Suwa-Todo algorithm is the most efficient. In addition, because the acceptance ratio of the Suwa-Todo algorithm is higher than that of the Metropolis algorithm, the number of temperature states can be reduced by 25% for the Suwa-Todo algorithm when compared with the Metropolis algorithm.
Stochastic Approximation Methods for Latent Regression Item Response Models
ERIC Educational Resources Information Center
von Davier, Matthias; Sinharay, Sandip
2010-01-01
This article presents an application of a stochastic approximation expectation maximization (EM) algorithm using a Metropolis-Hastings (MH) sampler to estimate the parameters of an item response latent regression model. Latent regression item response models are extensions of item response theory (IRT) to a latent variable model with covariates…
ERIC Educational Resources Information Center
Shamsuddeen, Abdulrahman; Amina, Hassan
2016-01-01
This study investigated the correlation between instructional methods and students' end-of-term achievement in Biology in selected secondary schools in Sokoto Metropolis, Sokoto State, Nigeria. The study addressed three specific objectives: to examine the relationship between cooperative learning methods, guided discovery, simulation method and…
Gradient-free MCMC methods for dynamic causal modelling
Sengupta, Biswa; Friston, Karl J.; Penny, Will D.
2015-03-14
Here, we compare the performance of four gradient-free MCMC samplers (random walk Metropolis sampling, slice-sampling, adaptive MCMC sampling and population-based MCMC sampling with tempering) in terms of the number of independent samples they can produce per unit computational time. For the Bayesian inversion of a single-node neural mass model, both adaptive and population-based samplers are more efficient compared with the random walk Metropolis sampler or slice-sampling; yet adaptive MCMC sampling is more promising in terms of compute time. Slice-sampling yields the highest number of independent samples from the target density, albeit at almost a 1000% increase in computational time in comparison to the most efficient algorithm (i.e., the adaptive MCMC sampler).
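Of the four samplers compared, slice-sampling is the easiest to sketch in one dimension. The stepping-out and shrinkage procedure below follows Neal's (2003) scheme; the function names and tuning width are illustrative assumptions.

```python
import math
import random

def slice_sample(log_pi, x0, n_steps=10_000, w=1.0, seed=17):
    """1D slice sampler with stepping-out and shrinkage (Neal, 2003)."""
    rng = random.Random(seed)
    x, chain = x0, []
    for _ in range(n_steps):
        # auxiliary height: uniform under the density at the current point
        log_y = log_pi(x) + math.log(1.0 - rng.random())
        # stepping out: grow an interval until both ends are below the slice
        left = x - w * rng.random()
        right = left + w
        while log_pi(left) > log_y:
            left -= w
        while log_pi(right) > log_y:
            right += w
        # shrinkage: sample uniformly, shrinking the interval on rejection
        while True:
            x1 = rng.uniform(left, right)
            if log_pi(x1) > log_y:
                x = x1
                break
            if x1 < x:
                left = x1
            else:
                right = x1
        chain.append(x)
    return chain
```

Every iteration produces an accepted point, which is why slice-sampling can yield so many independent samples per iteration, at the cost of multiple density evaluations per step.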
Comparing Three Estimation Methods for the Three-Parameter Logistic IRT Model
ERIC Educational Resources Information Center
Lamsal, Sunil
2015-01-01
Different estimation procedures have been developed for the unidimensional three-parameter item response theory (IRT) model. These techniques include the marginal maximum likelihood estimation, the fully Bayesian estimation using Markov chain Monte Carlo simulation techniques, and the Metropolis-Hastings Robbin-Monro estimation. With each…
Putting the Learning in Service Learning: From Soup Kitchen Models to the Black Metropolis Model
ERIC Educational Resources Information Center
Manley, Theodoric, Jr.; Buffa, Avery S.; Dube, Caleb; Reed, Lauren
2006-01-01
Results of the Black Metropolis Model (BMM) of service learning are analyzed and illustrated in this article to explain how to "put the learning in service learning." There are many soup kitchens or nontransforming models of service learning where students are asked to serve needy populations but internalize and learn little about the…
ERIC Educational Resources Information Center
von Davier, Matthias; Sinharay, Sandip
2009-01-01
This paper presents an application of a stochastic approximation EM-algorithm using a Metropolis-Hastings sampler to estimate the parameters of an item response latent regression model. Latent regression models are extensions of item response theory (IRT) to a 2-level latent variable model in which covariates serve as predictors of the…
Quantifying parameter uncertainty in stochastic models using the Box-Cox transformation
NASA Astrophysics Data System (ADS)
Thyer, Mark; Kuczera, George; Wang, Q. J.
2002-08-01
The Box-Cox transformation is widely used to transform hydrological data to make it approximately Gaussian. Bayesian evaluation of parameter uncertainty in stochastic models using the Box-Cox transformation is hindered by the fact that there is no analytical solution for the posterior distribution. However, the Markov chain Monte Carlo method known as the Metropolis algorithm can be used to simulate the posterior distribution. This method properly accounts for the nonnegativity constraint implicit in the Box-Cox transformation. Nonetheless, a case study using the AR(1) model uncovered a practical problem with the implementation of the Metropolis algorithm. The use of a multivariate Gaussian jump distribution resulted in unacceptable convergence behaviour. This was rectified by developing suitable parameter transformations for the mean and variance of the AR(1) process to remove the strong nonlinear dependencies with the Box-Cox transformation parameter. Applying this methodology to the Sydney annual rainfall data and the Burdekin River annual runoff data illustrates the efficacy of these parameter transformations and demonstrates the value of quantifying parameter uncertainty.
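The kind of reparameterization the authors describe, sampling an unconstrained transform of a constrained parameter so that a Gaussian jump distribution mixes well, can be illustrated on a toy normal model rather than the paper's AR(1)/Box-Cox setting. All names, the flat prior on (mu, log sigma), and the step size below are assumptions.

```python
import math
import random

def metropolis_transformed(data, n_steps=30_000, step=0.1, seed=5):
    """Random-walk Metropolis on (mu, log sigma) rather than (mu, sigma):
    the log transform removes the positivity constraint on sigma, so a
    plain Gaussian jump distribution behaves well."""
    rng = random.Random(seed)
    n = len(data)

    def log_post(mu, log_sigma):
        # Gaussian log-likelihood under a flat prior on (mu, log sigma)
        sigma = math.exp(log_sigma)
        sse = sum((x - mu) ** 2 for x in data)
        return -n * log_sigma - sse / (2.0 * sigma * sigma)

    mu, ls = 0.0, 0.0
    lp = log_post(mu, ls)
    chain = []
    for _ in range(n_steps):
        mu_p = mu + rng.gauss(0.0, step)
        ls_p = ls + rng.gauss(0.0, step)
        lp_p = log_post(mu_p, ls_p)
        if math.log(1.0 - rng.random()) < lp_p - lp:
            mu, ls, lp = mu_p, ls_p, lp_p
        chain.append((mu, math.exp(ls)))  # report sigma on its natural scale
    return chain
```

Sampling sigma directly with the same Gaussian jumps would waste proposals on the forbidden region sigma <= 0; the transform sidesteps that, in the spirit of the transformations developed in the paper.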
Norris, Peter M; da Silva, Arlindo M
2016-07-01
A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC.
ERIC Educational Resources Information Center
Beddard, Godfrey S.
2011-01-01
Thermodynamic quantities such as the average energy, heat capacity, and entropy are calculated using a Monte Carlo method based on the Metropolis algorithm. This method is illustrated with reference to the harmonic oscillator but is particularly useful when the partition function cannot be evaluated; an example using a one-dimensional spin system…
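A minimal version of such a calculation is Metropolis sampling of the Boltzmann distribution for a classical 1D harmonic oscillator; the details below are illustrative, not the article's exact example.

```python
import math
import random

def metropolis_average_energy(kT=1.0, k=1.0, n_steps=200_000, step=1.0, seed=1):
    """Estimate <U> for U(x) = k x^2 / 2 by Metropolis sampling of the
    Boltzmann distribution exp(-U/kT)."""
    rng = random.Random(seed)
    x = 0.0
    u = 0.5 * k * x * x
    total = 0.0
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        u_new = 0.5 * k * x_new * x_new
        # Metropolis criterion: accept downhill moves always,
        # uphill moves with probability exp(-dU/kT)
        if u_new <= u or rng.random() < math.exp(-(u_new - u) / kT):
            x, u = x_new, u_new
        total += u
    return total / n_steps
```

By equipartition the exact classical answer is kT/2, which gives a direct check on the sampler, and is the kind of comparison the article uses before turning to systems whose partition function cannot be evaluated.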
Modelling maximum river flow by using Bayesian Markov Chain Monte Carlo
NASA Astrophysics Data System (ADS)
Cheong, R. Y.; Gabda, D.
2017-09-01
Analysis of flood trends is vital since flooding threatens human life in terms of finance, environment and security. The data of annual maximum river flows in Sabah were fitted to the generalized extreme value (GEV) distribution. The maximum likelihood estimator (MLE) arises naturally when working with the GEV distribution. However, previous research showed that the MLE provides unstable results, especially for small sample sizes. In this study, we used different Bayesian Markov chain Monte Carlo (MCMC) methods based on the Metropolis-Hastings algorithm to estimate the GEV parameters. The Bayesian MCMC method is a statistical inference approach that studies parameter estimation through the posterior distribution based on Bayes' theorem. The Metropolis-Hastings algorithm is used to overcome the high-dimensional state space faced in the Monte Carlo method. This approach also accounts for more uncertainty in parameter estimation, which then gives a better prediction of maximum river flow in Sabah.
Link, W.A.; Barker, R.J.
2008-01-01
Judicious choice of candidate generating distributions improves efficiency of the Metropolis-Hastings algorithm. In Bayesian applications, it is sometimes possible to identify an approximation to the target posterior distribution; this approximate posterior distribution is a good choice for candidate generation. These observations are applied to analysis of the Cormack-Jolly-Seber model and its extensions.
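Using an approximation to the posterior as the candidate-generating distribution amounts to an independence Metropolis-Hastings sampler. The sketch below uses a toy binomial posterior with a normal approximation at the MLE as the candidate; the example and all names are assumptions, not the Cormack-Jolly-Seber analysis of the abstract.

```python
import math
import random

def independence_mh(log_target, log_q, sample_q, n_steps=20_000, seed=11):
    """Independence MH: candidates come from a fixed approximation q of the
    target posterior; the acceptance ratio corrects for the mismatch."""
    rng = random.Random(seed)
    x = sample_q(rng)
    while not math.isfinite(log_target(x)):  # ensure a valid starting point
        x = sample_q(rng)
    chain = []
    for _ in range(n_steps):
        y = sample_q(rng)
        # ratio pi(y) q(x) / (pi(x) q(y)) on the log scale
        log_r = (log_target(y) - log_target(x)) + (log_q(x) - log_q(y))
        if math.log(1.0 - rng.random()) < log_r:
            x = y
        chain.append(x)
    return chain

# Toy example: posterior of a binomial success probability, k = 30 of n = 100,
# under a flat prior; candidate q is the normal approximation at the MLE.
K, N, MU, SD = 30, 100, 0.30, 0.0458

def log_target(p):
    if p <= 0.0 or p >= 1.0:
        return float("-inf")
    return K * math.log(p) + (N - K) * math.log(1.0 - p)

def log_q(p):
    return -0.5 * ((p - MU) / SD) ** 2

def sample_q(rng):
    return rng.gauss(MU, SD)
```

The closer q is to the true posterior, the closer the acceptance rate is to one, which is the efficiency gain the abstract describes.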
ERIC Educational Resources Information Center
Yang, Ji Seung; Cai, Li
2014-01-01
The main purpose of this study is to improve estimation efficiency in obtaining maximum marginal likelihood estimates of contextual effects in the framework of nonlinear multilevel latent variable model by adopting the Metropolis-Hastings Robbins-Monro algorithm (MH-RM). Results indicate that the MH-RM algorithm can produce estimates and standard…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Dan; Ricciuto, Daniel; Walker, Anthony
Calibration of terrestrial ecosystem models is important but challenging. Bayesian inference implemented by Markov chain Monte Carlo (MCMC) sampling provides a comprehensive framework to estimate model parameters and associated uncertainties using their posterior distributions. The effectiveness and efficiency of the method strongly depend on the MCMC algorithm used. In this study, a Differential Evolution Adaptive Metropolis (DREAM) algorithm was used to estimate posterior distributions of 21 parameters for the data assimilation linked ecosystem carbon (DALEC) model using 14 years of daily net ecosystem exchange data collected at the Harvard Forest Environmental Measurement Site eddy-flux tower. DREAM is a multi-chain method that uses a differential evolution technique for chain movement, allowing it to be applied efficiently to high-dimensional problems, and can reliably estimate heavy-tailed and multimodal distributions that are difficult for single-chain schemes using a Gaussian proposal distribution. The results were evaluated against the popular Adaptive Metropolis (AM) scheme. DREAM indicated that two parameters controlling autumn phenology have multiple modes in their posterior distributions while AM only identified one mode. The calibration with DREAM resulted in a better model fit and predictive performance compared to the AM. DREAM provides means for a good exploration of the posterior distributions of model parameters. Lastly, it reduces the risk of false convergence to a local optimum and potentially improves the predictive performance of the calibrated model.
NASA Astrophysics Data System (ADS)
Feldt, Jonas; Miranda, Sebastião; Pratas, Frederico; Roma, Nuno; Tomás, Pedro; Mata, Ricardo A.
2017-12-01
In this work, we present an optimized perturbative quantum mechanics/molecular mechanics (QM/MM) method for use in Metropolis Monte Carlo simulations. The model adopted is particularly tailored for the simulation of molecular systems in solution but can be readily extended to other applications, such as catalysis in enzymatic environments. The electrostatic coupling between the QM and MM systems is simplified by applying perturbation theory to estimate the energy changes caused by a movement in the MM system. This approximation, together with the effective use of GPU acceleration, leads to a negligible added computational cost for the sampling of the environment. Benchmark calculations are carried out to evaluate the impact of the approximations applied and the overall computational performance.
ERIC Educational Resources Information Center
Monroe, Scott; Cai, Li
2013-01-01
In Ramsay curve item response theory (RC-IRT, Woods & Thissen, 2006) modeling, the shape of the latent trait distribution is estimated simultaneously with the item parameters. In its original implementation, RC-IRT is estimated via Bock and Aitkin's (1981) EM algorithm, which yields maximum marginal likelihood estimates. This method, however,…
ERIC Educational Resources Information Center
Monroe, Scott; Cai, Li
2014-01-01
In Ramsay curve item response theory (RC-IRT) modeling, the shape of the latent trait distribution is estimated simultaneously with the item parameters. In its original implementation, RC-IRT is estimated via Bock and Aitkin's EM algorithm, which yields maximum marginal likelihood estimates. This method, however, does not produce the…
Examining Work and Family Conflict among Female Bankers in Accra Metropolis, Ghana
ERIC Educational Resources Information Center
Kissi-Abrokwah, Bernard; Andoh-Robertson, Theophilus; Tutu-Danquah, Cecilia; Agbesi, Catherine Selorm
2015-01-01
This study investigated the effects of, and solutions to, work and family conflict among female bankers in Accra Metropolis. Using a triangulatory mixed-methods design, a structured questionnaire was randomly administered to 300 female bankers, and 15 female bankers, selected using a convenience sampling technique, were also interviewed. The…
Topics in Bayesian Hierarchical Modeling and its Monte Carlo Computations
NASA Astrophysics Data System (ADS)
Tak, Hyung Suk
The first chapter addresses a Beta-Binomial-Logit model that is a Beta-Binomial conjugate hierarchical model with covariate information incorporated via a logistic regression. Various researchers in the literature have unknowingly used improper posterior distributions or have given incorrect statements about posterior propriety because checking posterior propriety can be challenging due to the complicated functional form of a Beta-Binomial-Logit model. We derive data-dependent necessary and sufficient conditions for posterior propriety within a class of hyper-prior distributions that encompass those used in previous studies. Frequency coverage properties of several hyper-prior distributions are also investigated to see when and whether Bayesian interval estimates of random effects meet their nominal confidence levels. The second chapter deals with a time delay estimation problem in astrophysics. When the gravitational field of an intervening galaxy between a quasar and the Earth is strong enough to split light into two or more images, the time delay is defined as the difference between their travel times. The time delay can be used to constrain cosmological parameters and can be inferred from the time series of brightness data of each image. To estimate the time delay, we construct a Gaussian hierarchical model based on a state-space representation for irregularly observed time series generated by a latent continuous-time Ornstein-Uhlenbeck process. Our Bayesian approach jointly infers model parameters via a Gibbs sampler. We also introduce a profile likelihood of the time delay as an approximation of its marginal posterior distribution. The last chapter specifies a repelling-attracting Metropolis algorithm, a new Markov chain Monte Carlo method to explore multi-modal distributions in a simple and fast manner. 
This algorithm is essentially a Metropolis-Hastings algorithm with a proposal that consists of a downhill move in density that aims to make local modes repelling, followed by an uphill move in density that aims to make local modes attracting. The downhill move is achieved via a reciprocal Metropolis ratio so that the algorithm prefers downward movement. The uphill move does the opposite using the standard Metropolis ratio which prefers upward movement. This down-up movement in density increases the probability of a proposed move to a different mode.
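The down-up proposal can be sketched as follows. This is a schematic only: it omits the auxiliary variable the full repelling-attracting Metropolis algorithm uses to make the final acceptance exact, so it conveys the mechanism rather than a faithful implementation, and all names and settings are assumptions.

```python
import math
import random

def down_up_proposal(log_pi, x, step, rng):
    """Forced downhill move (reciprocal Metropolis ratio, so lower density is
    preferred) followed by a forced uphill move (standard ratio, so higher
    density is preferred)."""
    while True:  # downhill: accept z with prob min(1, pi(x)/pi(z))
        z = x + rng.gauss(0.0, step)
        if rng.random() < math.exp(min(0.0, log_pi(x) - log_pi(z))):
            break
    while True:  # uphill: accept y with prob min(1, pi(y)/pi(z))
        y = z + rng.gauss(0.0, step)
        if rng.random() < math.exp(min(0.0, log_pi(y) - log_pi(z))):
            return y

def ram_chain(log_pi, x0, n_steps=4_000, step=3.0, seed=13):
    rng = random.Random(seed)
    x, chain = x0, []
    for _ in range(n_steps):
        y = down_up_proposal(log_pi, x, step, rng)
        # schematic final correction; the exact algorithm's acceptance
        # ratio also involves an auxiliary downhill draw from y
        if rng.random() < math.exp(min(0.0, log_pi(y) - log_pi(x))):
            x = y
        chain.append(x)
    return chain
```

On a well-separated bimodal target, the downhill leg carries the chain out of the current mode's basin and the uphill leg climbs into whichever mode is now nearest, which is how the down-up movement raises the probability of inter-mode jumps.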
Monte Carlo sampling in diffusive dynamical systems
NASA Astrophysics Data System (ADS)
Tapias, Diego; Sanders, David P.; Altmann, Eduardo G.
2018-05-01
We introduce a Monte Carlo algorithm to efficiently compute transport properties of chaotic dynamical systems. Our method exploits the importance sampling technique that favors trajectories in the tail of the distribution of displacements, where deviations from a diffusive process are most prominent. We search for initial conditions using a proposal that correlates states in the Markov chain constructed via a Metropolis-Hastings algorithm. We show that our method outperforms the direct sampling method and also Metropolis-Hastings methods with alternative proposals. We test our general method through numerical simulations in 1D (box-map) and 2D (Lorentz gas) systems.
A Bootstrap Metropolis-Hastings Algorithm for Bayesian Analysis of Big Data.
Liang, Faming; Kim, Jinsu; Song, Qifan
2016-01-01
Markov chain Monte Carlo (MCMC) methods have proven to be a very powerful tool for analyzing data of complex structures. However, their computer-intensive nature, typically requiring a large number of iterations and a complete scan of the full dataset per iteration, precludes their use for big data analysis. In this paper, we propose the so-called bootstrap Metropolis-Hastings (BMH) algorithm, which provides a general framework for taming powerful MCMC methods for big data analysis: the full-data log-likelihood is replaced by a Monte Carlo average of log-likelihoods calculated in parallel from multiple bootstrap samples. The BMH algorithm possesses an embarrassingly parallel structure and avoids repeated scans of the full dataset in iterations, and is thus feasible for big data problems. Compared to the popular divide-and-combine method, BMH can be generally more efficient as it can asymptotically integrate the whole data information into a single simulation run. The BMH algorithm is very flexible. Like the Metropolis-Hastings algorithm, it can serve as a basic building block for developing advanced MCMC algorithms that are feasible for big data problems. This is illustrated in the paper by the tempering BMH algorithm, which can be viewed as a combination of parallel tempering and the BMH algorithm. BMH can also be used for model selection and optimization by combining with reversible jump MCMC and simulated annealing, respectively.
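The core substitution, replacing the full-data log-likelihood with a Monte Carlo average over bootstrap samples, can be sketched on a toy normal-mean model. In the paper's setting each bootstrap log-likelihood would be computed in parallel on a separate worker; here everything runs serially, and all names and settings are assumptions.

```python
import math
import random

def bmh_log_lik(theta, data, n_boot=20, rng=None):
    """Average of log-likelihoods over full-size bootstrap resamples; its
    expectation equals the full-data log-likelihood, and each term could
    be scored in parallel on a separate worker."""
    rng = rng or random.Random(0)
    total = 0.0
    for _ in range(n_boot):
        boot = rng.choices(data, k=len(data))  # resample with replacement
        total += sum(-0.5 * (x - theta) ** 2 for x in boot)  # N(theta, 1) model
    return total / n_boot

def bmh_chain(data, n_steps=2_000, step=0.1, seed=19):
    """Random-walk Metropolis-Hastings driven by the bootstrap average
    log-likelihood instead of the full-data log-likelihood."""
    rng = random.Random(seed)
    theta = 0.0
    ll = bmh_log_lik(theta, data, rng=rng)
    chain = []
    for _ in range(n_steps):
        prop = theta + rng.gauss(0.0, step)
        ll_prop = bmh_log_lik(prop, data, rng=rng)
        if math.log(1.0 - rng.random()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        chain.append(theta)
    return chain
```

The resulting chain targets an approximation to the posterior (the bootstrap noise slightly inflates its spread), which is the trade-off BMH accepts in exchange for never scanning the full dataset on a single machine.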
NASA Astrophysics Data System (ADS)
Lu, Dan; Ricciuto, Daniel; Walker, Anthony; Safta, Cosmin; Munger, William
2017-09-01
Calibration of terrestrial ecosystem models is important but challenging. Bayesian inference implemented by Markov chain Monte Carlo (MCMC) sampling provides a comprehensive framework to estimate model parameters and associated uncertainties using their posterior distributions. The effectiveness and efficiency of the method strongly depend on the MCMC algorithm used. In this work, a differential evolution adaptive Metropolis (DREAM) algorithm is used to estimate posterior distributions of 21 parameters for the data assimilation linked ecosystem carbon (DALEC) model using 14 years of daily net ecosystem exchange data collected at the Harvard Forest Environmental Measurement Site eddy-flux tower. Calibration with DREAM results in a better model fit and predictive performance compared to the popular adaptive Metropolis (AM) scheme. Moreover, DREAM indicates that two parameters controlling autumn phenology have multiple modes in their posterior distributions, while AM identifies only one mode. The application suggests that DREAM is well suited to calibrating complex terrestrial ecosystem models, where the number of uncertain parameters is usually large and the existence of local optima is always a concern. In addition, residual analysis is used to justify the assumptions of the error model used in the Bayesian calibration. The result indicates that a heteroscedastic, correlated, Gaussian error model is appropriate for the problem, and the likelihood function constructed from it can alleviate the underestimation of parameter uncertainty that is usually caused by using uncorrelated error models.
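The distinguishing ingredient of DREAM-type samplers is the differential-evolution proposal, which jumps a chain along the difference of two other randomly chosen chains. Below is a minimal sketch on a toy one-dimensional Gaussian posterior; the target, chain count, and jump scale are illustrative, not the DALEC setup:

```python
import math
import random

random.seed(0)

def log_post(x):
    return -0.5 * x * x  # toy target: standard normal posterior

n_chains = 10
chains = [random.uniform(-5.0, 5.0) for _ in range(n_chains)]
gamma = 2.38 / math.sqrt(2.0)  # common DE-MC jump scale for one dimension
samples = []

for _ in range(2000):
    for i in range(n_chains):
        # Propose by jumping along the difference of two other randomly chosen
        # chains, plus a small jitter; the ensemble thereby adapts the step size
        # to the shape of the posterior.
        a, b = random.sample([j for j in range(n_chains) if j != i], 2)
        prop = chains[i] + gamma * (chains[a] - chains[b]) + random.gauss(0.0, 1e-3)
        if math.log(random.random()) < log_post(prop) - log_post(chains[i]):
            chains[i] = prop
        samples.append(chains[i])

post = samples[len(samples) // 2:]
mean = sum(post) / len(post)
var = sum((s - mean) ** 2 for s in post) / len(post)
```

Because the jump is built from the current spread of the ensemble, multiple well-separated modes can keep exchanging chains, which is one reason DREAM can detect multimodality that a single adaptive random-walk chain misses.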
ERIC Educational Resources Information Center
Ntumi, Simon
2016-01-01
The study examined the challenges that pre-school teachers encounter in the implementation of the early childhood curriculum, exploring the teaching methods employed by pre-school teachers in the Cape Coast Metropolis. The study employed a descriptive survey as the research design. A convenience sample of 62 pre-school teachers was selected from a…
ERIC Educational Resources Information Center
Dauda, Bala; Jambo, Hyelni Emmanuel; Umar, Muhammad Amin
2016-01-01
This study examined students' perception of factors influencing teaching and learning of mathematics in senior secondary schools in Maiduguri Metropolis of Borno State, Nigeria. The objectives of the study were to determine the extent to which students perceived: qualification, method of teaching, instructional materials and attitude of both…
NASA Astrophysics Data System (ADS)
Bérubé, Charles L.; Chouteau, Michel; Shamsipour, Pejman; Enkin, Randolph J.; Olivo, Gema R.
2017-08-01
Spectral induced polarization (SIP) measurements are now widely used to infer mineralogical or hydrogeological properties from the low-frequency electrical properties of the subsurface in both mineral exploration and environmental sciences. We present an open-source program that performs fast multi-model inversion of laboratory complex resistivity measurements using Markov chain Monte Carlo simulation. Using this stochastic method, SIP parameters and their uncertainties may be obtained from the Cole-Cole and Dias models, or from the Debye and Warburg decomposition approaches. The program is tested on synthetic and laboratory data to show that the posterior distribution of a multiple Cole-Cole model is multimodal in particular cases. The Warburg and Debye decomposition approaches yield unique solutions in all cases. It is shown that an adaptive Metropolis algorithm performs faster and is less dependent on the initial parameter values than the Metropolis-Hastings step method when inverting SIP data through the decomposition schemes. There is no advantage in using an adaptive step method for well-defined Cole-Cole inversion. Finally, the influence of measurement noise on the recovered relaxation time distribution is explored. We provide the geophysics community with an open-source platform that can serve as a base for further developments in stochastic SIP data inversion and that may be used to perform parameter analysis with various SIP models.
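The adaptive Metropolis idea compared here against a fixed-step Metropolis-Hastings sampler can be sketched in one dimension: the proposal scale is tuned on the fly from the empirical variance of the chain so far. This is a toy illustration with an assumed Normal(4, 2) posterior, not the SIP inversion itself:

```python
import math
import random

random.seed(3)

def log_post(x):
    return -0.5 * ((x - 4.0) / 2.0) ** 2  # toy posterior: Normal(4, 2)

x = 0.0
step = 1.0           # initial proposal standard deviation
ssum = ssq = 0.0     # running sums for the empirical chain variance
chain = []

for t in range(1, 6001):
    prop = x + random.gauss(0.0, step)
    if math.log(random.random()) < log_post(prop) - log_post(x):
        x = prop
    chain.append(x)
    ssum += x
    ssq += x * x
    if t > 200:
        # After an initial non-adaptive period, rescale the proposal by
        # 2.38 times the chain's empirical standard deviation (the 1-D
        # analogue of the Haario-style adaptive Metropolis rule).
        var = ssq / t - (ssum / t) ** 2
        step = 2.38 * math.sqrt(var + 1e-6)

est_mean = sum(chain[2000:]) / len(chain[2000:])
```

The adaptation makes the sampler largely indifferent to a badly chosen initial step size, which matches the observation above that the adaptive variant depends less on starting values.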
77 FR 29697 - Honeywell Metropolis Works; Grant of Exemption for Honeywell Metropolis Works License
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-18
... NUCLEAR REGULATORY COMMISSION [Docket No. 40-3392; NRC-2012-0111] Honeywell Metropolis Works; Grant of Exemption for Honeywell Metropolis Works License AGENCY: Nuclear Regulatory Commission. ACTION...
Random Numbers and Monte Carlo Methods
NASA Astrophysics Data System (ADS)
Scherer, Philipp O. J.
Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. Monte Carlo methods, which sample the integration volume at randomly chosen points, are very useful for the calculation of thermodynamic averages. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with a given probability distribution, which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by preferentially sampling the important configurations. Finally, the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
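The closing application, the Metropolis method for the traveling salesman problem, can be sketched as simulated annealing with 2-opt moves. The instance below places cities on a circle so the optimal tour (visiting them in angular order) is known; all tuning constants are illustrative:

```python
import math
import random

random.seed(7)

# Twelve cities on the unit circle: the optimal tour visits them in angular order,
# with length n * 2 * sin(pi / n), roughly 6.21 here.
n = 12
cities = [(math.cos(2 * math.pi * i / n), math.sin(2 * math.pi * i / n)) for i in range(n)]

def tour_length(tour):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % n]]) for i in range(n))

tour = list(range(n))
random.shuffle(tour)
cur = tour_length(tour)
T = 1.0
for _ in range(20000):
    i, j = sorted(random.sample(range(n), 2))
    cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt move: reverse a segment
    delta = tour_length(cand) - cur
    # Metropolis rule: always accept improvements, sometimes accept uphill moves.
    if delta < 0 or random.random() < math.exp(-delta / T):
        tour, cur = cand, cur + delta
    T = max(1e-3, T * 0.9995)  # slow cooling toward pure downhill acceptance
```

The occasional uphill acceptances at high temperature let the chain escape poor local orderings before the cooling schedule freezes it near a short tour.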
Volatility modeling for IDR exchange rate through APARCH model with student-t distribution
NASA Astrophysics Data System (ADS)
Nugroho, Didit Budi; Susanto, Bambang
2017-08-01
The aim of this study is to empirically investigate the performance of the APARCH(1,1) volatility model with the Student-t error distribution on five foreign currency selling rates to the Indonesian rupiah (IDR): the Swiss franc (CHF), the euro (EUR), the British pound (GBP), the Japanese yen (JPY), and the US dollar (USD). Six years of daily closing rates over the period January 2010 to December 2016, a total of 1722 observations, have been analysed. Bayesian inference using the efficient independence chain Metropolis-Hastings and adaptive random walk Metropolis methods in a Markov chain Monte Carlo (MCMC) scheme has been applied to estimate the parameters of the model. According to the DIC criterion, this study found that the APARCH(1,1) model under the Student-t distribution fits better than the model under the normal distribution for every observed rate return series. The 95% highest posterior density intervals support the APARCH models for modeling the IDR/JPY and IDR/USD volatilities. In particular, the IDR/JPY and IDR/USD data have significant negative and positive leverage effects, respectively, in the rate returns. Meanwhile, the optimal power coefficient of volatility was found to be statistically different from 2 for all rate return series except the IDR/EUR series.
NASA Astrophysics Data System (ADS)
Wentworth, Mami Tonoe
Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models, and measurements, and to propagate these uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined by the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model.
We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through direct numerical evaluation of Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing the chains, densities, and correlations obtained using DRAM, DREAM, and the direct evaluation of Bayes' formula. We also perform similar analysis for credible and prediction intervals for responses. Once the parameters are estimated, we employ the energy statistics test [63, 64] to compare the densities obtained by the different methods for the HIV model; the energy statistics test the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs. We illustrate these techniques for a dynamic HIV model but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models.
To accommodate the nonlinear input to output relations, which are typical for such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of those techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affect the model response, and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters whereas subspace selection identifies a linear combination of parameters that impacts the model responses significantly. We employ active subspace methods discussed in [22] for the HIV model and present a verification that the active subspace successfully reduces the input dimensions.
Optimized nested Markov chain Monte Carlo sampling: theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coe, Joshua D; Shaw, M Sam; Sewell, Thomas D
2009-01-01
Metropolis Monte Carlo sampling of a reference potential is used to build a Markov chain in the isothermal-isobaric ensemble. At the endpoints of the chain, the energy is reevaluated at a different level of approximation (the 'full' energy) and a composite move encompassing all of the intervening steps is accepted on the basis of a modified Metropolis criterion. By manipulating the thermodynamic variables characterizing the reference system we maximize the average acceptance probability of composite moves, lengthening significantly the random walk made between consecutive evaluations of the full energy at a fixed acceptance probability. This provides maximally decorrelated samples of the full potential, thereby lowering the total number required to build ensemble averages of a given variance. The efficiency of the method is illustrated using model potentials appropriate to molecular fluids at high pressure. Implications for ab initio or density functional theory (DFT) treatment are discussed.
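A hedged sketch of the composite-move idea: an inner Metropolis chain runs on a cheap reference potential, and its endpoint is accepted as a single composite move under a criterion that corrects for the difference between the full and reference energy changes. The one-dimensional double-well "full" energy and harmonic reference below are toy stand-ins, not the molecular-fluid potentials of the paper:

```python
import math
import random

random.seed(2)

def e_full(x):   # "full" energy: a double well, standing in for an expensive model
    return x ** 4 - 2.0 * x ** 2

def e_ref(x):    # cheap reference potential: a harmonic approximation
    return 0.5 * x ** 2

beta = 1.0
x = 0.0
samples = []
for _ in range(2000):
    # Inner chain: several Metropolis steps on the cheap reference potential only.
    y = x
    for _ in range(10):
        p = y + random.gauss(0.0, 0.5)
        if math.log(random.random()) < -beta * (e_ref(p) - e_ref(y)):
            y = p
    # Composite move: accept the whole block with a modified Metropolis criterion
    # that corrects for the difference between full and reference energy changes,
    # so the outer chain still samples the full Boltzmann distribution.
    log_acc = -beta * ((e_full(y) - e_full(x)) - (e_ref(y) - e_ref(x)))
    if math.log(random.random()) < log_acc:
        x = y
    samples.append(x)

mean_abs = sum(abs(s) for s in samples) / len(samples)
```

Only one "full" energy evaluation is needed per composite move, however long the inner reference walk is, which is where the savings come from when the full energy is an ab initio or DFT calculation.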
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-19
... NUCLEAR REGULATORY COMMISSION [Docket No. 40-3392-MLA; ASLBP No. 11-910-01-MLA-BD01] Atomic Safety and Licensing Board; Honeywell International, Inc.; Metropolis Works Uranium Conversion Facility... assurance for its Metropolis Works uranium conversion facility in Metropolis, Illinois. \\1\\ LBP-11-19, 74...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-07
...., Metropolis Works; License Amendment Request and Request for a Hearing AGENCY: Nuclear Regulatory Commission... surface impoundment decommissioning plan at its Metropolis Works Facility site located in Metropolis... information. With respect to copyrighted works, except for limited excerpts that serve the purpose of the...
Modeling and Bayesian parameter estimation for shape memory alloy bending actuators
NASA Astrophysics Data System (ADS)
Crews, John H.; Smith, Ralph C.
2012-04-01
In this paper, we employ a homogenized energy model (HEM) for shape memory alloy (SMA) bending actuators. Additionally, we utilize a Bayesian method for quantifying parameter uncertainty. The system consists of an SMA wire attached to a flexible beam. As the actuator is heated, the beam bends, providing endoscopic motion. The model parameters are fit to experimental data using an ordinary least-squares approach. The uncertainty in the fitted model parameters is then quantified using Markov chain Monte Carlo (MCMC) methods. The MCMC algorithm provides bounds on the parameters, which will ultimately be used in robust control algorithms. One purpose of the paper is to test the feasibility of the Random Walk Metropolis algorithm, the MCMC method used here.
General Metropolis-Hastings jump diffusions for automatic target recognition in infrared scenes
NASA Astrophysics Data System (ADS)
Lanterman, Aaron D.; Miller, Michael I.; Snyder, Donald L.
1997-04-01
To locate and recognize ground-based targets in forward-looking IR (FLIR) images, 3D faceted models with associated pose parameters are formulated to accommodate the variability found in FLIR imagery. Taking a Bayesian approach, scenes are simulated from the emissive characteristics of the CAD models and compared with the collected data by a likelihood function based on sensor statistics. This likelihood is combined with a prior distribution defined over the set of possible scenes to form a posterior distribution. To accommodate scenes with variable numbers of targets, the posterior distribution is defined over parameter vectors of varying dimension. An inference algorithm based on Metropolis-Hastings jump-diffusion processes empirically samples from the posterior distribution, generating configurations of templates and transformations that match the collected sensor data with high probability. The jumps accommodate the addition and deletion of targets and the estimation of target identities; diffusions refine the hypotheses by drifting along the gradient of the posterior distribution with respect to the orientation and position parameters. Previous results on jump strategies analogous to the Metropolis acceptance/rejection algorithm, with proposals drawn from the prior and accepted based on the likelihood, are extended to encompass general Metropolis-Hastings proposal densities. In particular, the algorithm proposes moves by drawing from the posterior distribution over computationally tractable subsets of the parameter space. The algorithm is illustrated by an implementation on a Silicon Graphics Onyx/Reality Engine.
Markov Chain Monte Carlo in the Analysis of Single-Molecule Experimental Data
NASA Astrophysics Data System (ADS)
Kou, S. C.; Xie, X. Sunney; Liu, Jun S.
2003-11-01
This article provides a Bayesian analysis of the single-molecule fluorescence lifetime experiment designed to probe the conformational dynamics of a single DNA hairpin molecule. The DNA hairpin's conformational change is initially modeled as a two-state Markov chain, which is not observable and has to be indirectly inferred. The Brownian diffusion of the single molecule, in addition to the hidden Markov structure, further complicates the matter. We show that the analytical form of the likelihood function can be obtained in the simplest case and that a Metropolis-Hastings algorithm can be designed to sample from the posterior distribution of the parameters of interest and to compute the desired estimates. To cope with the molecular diffusion process and the potentially oscillating energy barrier between the two states of the DNA hairpin, we introduce a data augmentation technique to handle both the Brownian diffusion and the hidden Ornstein-Uhlenbeck process associated with the fluctuating energy barrier, and design a more sophisticated Metropolis-type algorithm. Our method not only increases the estimation resolution severalfold but also proves to be successful for model discrimination.
MontePython 3: Parameter inference code for cosmology
NASA Astrophysics Data System (ADS)
Brinckmann, Thejs; Lesgourgues, Julien; Audren, Benjamin; Benabed, Karim; Prunet, Simon
2018-05-01
MontePython 3 provides numerous ways to explore parameter space using Monte Carlo Markov Chain (MCMC) sampling, including Metropolis-Hastings, Nested Sampling, Cosmo Hammer, and a Fisher sampling method. This improved version of the Monte Python (ascl:1307.002) parameter inference code for cosmology offers new ingredients that improve the performance of Metropolis-Hastings sampling, speeding up convergence and offering significant time improvement in difficult runs. Additional likelihoods and plotting options are available, as are post-processing algorithms such as Importance Sampling and Adding Derived Parameter.
SAChES: Scalable Adaptive Chain-Ensemble Sampling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swiler, Laura Painton; Ray, Jaideep; Ebeida, Mohamed Salah
We present the development of a parallel Markov Chain Monte Carlo (MCMC) method called SAChES, Scalable Adaptive Chain-Ensemble Sampling. This capability is targeted at Bayesian calibration of computationally expensive simulation models. SAChES involves a hybrid of two methods: Differential Evolution Monte Carlo followed by Adaptive Metropolis. Both methods involve parallel chains. Differential evolution allows one to explore high-dimensional parameter spaces using loosely coupled (i.e., largely asynchronous) chains. Loose coupling allows the use of large chain ensembles, with far more chains than the number of parameters to explore. This reduces the per-chain sampling burden and enables high-dimensional inversions and the use of computationally expensive forward models. The large number of chains can also ameliorate the impact of silent errors, which may affect only a few chains. The chain ensemble can also be sampled to provide an initial condition when an aberrant chain is re-spawned. Adaptive Metropolis takes the best points from the differential evolution and efficiently hones in on the posterior density. The multitude of chains in SAChES is leveraged to (1) enable efficient exploration of the parameter space and (2) ensure robustness to silent errors, which may be unavoidable in the extreme-scale computational platforms of the future. This report outlines SAChES, describes four papers that are the result of the project, and discusses some additional results.
Hu, Y C; Chen, J; Li, M; Wang, R; Li, W D; Yang, Y H; Yang, C; Yun, C F; Yang, L C; Yang, X G
2017-02-06
Objective: To evaluate the prevalence of anemia and the nutritional status of vitamins A and D by analyzing hemoglobin, serum retinol, and serum 25-hydroxyvitamin D levels in Chinese urban pregnant women during 2010-2012. Methods: Data were obtained from the China Nutrition and Health Survey in 2010-2012. Using multi-stage stratified sampling and population-proportional stratified random sampling, 2,250 pregnant women from 34 metropolises and 41 middle-sized and small cities were included in this study. Information was collected using a questionnaire survey. The blood hemoglobin concentration was determined using the cyanmethemoglobin method, and anemia was determined using the World Health Organization guidelines combined with the elevation correction standard. The serum retinol level was determined using high-performance liquid chromatography, and vitamin A deficiency (VAD) was judged by the related standard recommended by the World Health Organization. The vitamin D level was determined using enzyme-linked immunosorbent assay, and vitamin D deficiency was judged by the recommendation standards from the Institute of Medicine of The National Academies. The hemoglobin, serum retinol, and serum 25-hydroxyvitamin D levels were compared, along with differences in the prevalence of anemia, VAD, and the vitamin D deficiency rate (including deficiency and serious deficiency). Results: A total of 1,738 measurements of hemoglobin, 594 of serum retinol, and 1,027 of serum 25-hydroxyvitamin D were available for analysis in this study. The overall blood hemoglobin level (P50 (P25-P75)) was 122.70 (114.00-131.10) g/L; 123.70 (115.21-132.00) g/L for metropolises and 122.01 (113.30-130.40) g/L for middle-sized and small cities. The blood hemoglobin level of metropolis residents was significantly higher than that of middle-sized and small city residents (P = 0.027). The overall prevalence of anemia was 17.0% (295/1,738).
The overall serum retinol level (P50 (P25-P75)) was 1.61 (1.20-2.06) μmol/L; 1.50 (1.04-2.06) μmol/L for metropolises and 1.63 (1.31-2.05) μmol/L for middle-sized and small cities. The serum retinol levels of metropolis residents and of middle-sized and small city residents differed significantly (P = 0.033). The overall prevalence of VAD was 7.4% (47/639); 11.5% (33/286) for metropolises and 4.0% (14/353) for middle-sized and small cities. A significant difference was observed in the prevalence of VAD between metropolis and middle-sized and small city residents (P < 0.001). The overall serum 25-hydroxyvitamin D level (P50 (P25-P75)) was 15.41 (11.79-20.23) ng/ml; 14.71 (11.15-19.07) ng/ml for metropolises and 16.02 (12.65-21.36) ng/ml for middle-sized and small cities. A significant difference was observed in the vitamin D level between metropolis and middle-sized and small city residents (P < 0.001). The overall prevalence of vitamin D deficiency was 74.3% (763/1,027). A significant difference was observed in the prevalence of serious vitamin D deficiency between metropolis residents (30.64% (144/470)) and middle-sized and small city residents (26% (267/1,027)) (P = 0.002). No other significant differences were found in blood hemoglobin level or in the prevalence of anemia, VAD, and vitamin D deficiency. Conclusion: The prevalence of anemia in Chinese urban pregnant women improved from 2002 to 2012. The prevalence of vitamin D deficiency in pregnant women was generally more serious, while a certain percentage of women had VAD. The prevalence of VAD and serious vitamin D deficiency among pregnant women from metropolises was significantly higher than that of pregnant women from medium and small-sized cities.
Water Oxidation Catalysis for NiOOH by a Metropolis Monte Carlo Algorithm.
Hareli, Chen; Caspary Toroker, Maytal
2018-05-08
Understanding catalytic mechanisms is important for discovering better catalysts, particularly for water splitting reactions that are of great interest to the renewable energy field. One of the best performing catalysts for water oxidation is nickel oxyhydroxide (NiOOH). However, only one mechanism has been adopted so far for modeling catalysis of the active plane: β-NiOOH(01̅5). In order to understand how a second reaction mechanism affects catalysis, we perform Density Functional Theory + U (DFT+U) calculations of a second mechanism for water oxidation reaction of NiOOH. Then, we use a Metropolis Monte Carlo algorithm to calculate how many catalytic cycles are completed when two reaction mechanisms are competing. We find that within the Metropolis algorithm, the second mechanism has a higher overpotential and is therefore not active even for large applied biases.
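The cycle-counting idea can be illustrated with a toy Metropolis acceptance rule applied to two competing mechanisms. The barrier values below are hypothetical, chosen only to show that the mechanism with the larger effective barrier completes far fewer cycles and is effectively inactive:

```python
import math
import random

random.seed(5)

kT = 0.0257  # eV, room temperature
# Hypothetical rate-limiting barriers for two competing mechanisms
# (illustrative values only, not the DFT+U results of the paper).
barrier = {"mechanism_1": 0.05, "mechanism_2": 0.12}
cycles = {"mechanism_1": 0, "mechanism_2": 0}

for _ in range(20000):
    mech = random.choice(sorted(barrier))
    # Metropolis rule: a catalytic cycle completes with probability
    # min(1, exp(-dE/kT)), so a higher barrier suppresses completions
    # exponentially.
    if random.random() < math.exp(-barrier[mech] / kT):
        cycles[mech] += 1
```

Even a modest barrier difference (0.07 eV here) translates into an order-of-magnitude gap in completed cycles, which mirrors the qualitative conclusion that the higher-overpotential mechanism contributes little.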
NASA Astrophysics Data System (ADS)
Wang, Hongrui; Wang, Cheng; Wang, Ying; Gao, Xiong; Yu, Chen
2017-06-01
This paper presents a Bayesian approach using a Metropolis-Hastings Markov chain Monte Carlo algorithm and applies the method to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of the associated uncertainties. While the Bayesian method performs similarly in estimating the mean daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a narrower credible interval than the MLE confidence interval and thus a more precise estimate, by using the related information from regional gage stations. The Bayesian MCMC method may therefore be more favorable for uncertainty analysis and risk management.
Rapid recipe formulation for plasma etching of new materials
NASA Astrophysics Data System (ADS)
Chopra, Meghali; Zhang, Zizhuo; Ekerdt, John; Bonnecaze, Roger T.
2016-03-01
A fast and inexpensive scheme for etch rate prediction using flexible continuum models and Bayesian statistics is demonstrated. Bulk etch rates of MgO are predicted using a steady-state model with volume-averaged plasma parameters and classical Langmuir surface kinetics. Plasma particle and surface kinetics are modeled within a global plasma framework using single-component Metropolis-Hastings methods and limited data. The accuracy of these predictions is evaluated with synthetic and experimental etch rate data for magnesium oxide in an ICP-RIE system. This approach is compared with, and shown to be superior to, factorial models generated from JMP, a software package frequently employed for recipe creation and optimization.
Non-proportional odds multivariate logistic regression of ordinal family data.
Zaloumis, Sophie G; Scurrah, Katrina J; Harrap, Stephen B; Ellis, Justine A; Gurrin, Lyle C
2015-03-01
Methods to examine whether genetic and/or environmental sources can account for the residual variation in ordinal family data usually assume proportional odds. However, standard software to fit the non-proportional odds model to ordinal family data is limited because the correlation structure of family data is more complex than for other types of clustered data. To perform these analyses we propose the non-proportional odds multivariate logistic regression model and take a simulation-based approach to model fitting using Markov chain Monte Carlo methods, such as partially collapsed Gibbs sampling and the Metropolis algorithm. We applied the proposed methodology to male pattern baldness data from the Victorian Family Heart Study. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
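The combination of Gibbs and Metropolis updates used here can be sketched on a toy bivariate normal target: one coordinate is updated from its closed-form full conditional (a Gibbs step), the other by a random-walk Metropolis step, as if its conditional were known only up to a constant. The target and proposal scale are illustrative, not the ordinal family-data model:

```python
import math
import random

random.seed(6)

rho = 0.8  # toy target: bivariate normal, unit variances, correlation 0.8

def log_post(x, y):
    return -(x * x - 2.0 * rho * x * y + y * y) / (2.0 * (1.0 - rho * rho))

x = y = 0.0
draws = []
for _ in range(5000):
    # Gibbs step for x: its full conditional is a known normal, so sample it directly.
    x = random.gauss(rho * y, math.sqrt(1.0 - rho * rho))
    # Metropolis step for y: treat its conditional as known only up to a constant.
    y_prop = y + random.gauss(0.0, 1.0)
    if math.log(random.random()) < log_post(x, y_prop) - log_post(x, y):
        y = y_prop
    draws.append((x, y))

kept = draws[1000:]
mx = sum(p[0] for p in kept) / len(kept)
my = sum(p[1] for p in kept) / len(kept)
cov = sum((p[0] - mx) * (p[1] - my) for p in kept) / len(kept)
sx = math.sqrt(sum((p[0] - mx) ** 2 for p in kept) / len(kept))
sy = math.sqrt(sum((p[1] - my) ** 2 for p in kept) / len(kept))
corr = cov / (sx * sy)
```

Mixing Gibbs and Metropolis updates coordinate by coordinate is what lets such samplers handle the awkward conditionals that arise from the correlation structure of family data.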
Jeong, Jeho; Chen, Qing; Febo, Robert; Yang, Jie; Pham, Hai; Xiong, Jian-Ping; Zanzonico, Pat B.; Deasy, Joseph O.; Humm, John L.; Mageras, Gig S.
2016-01-01
Although spatially precise systems are now available for small-animal irradiations, there are currently limited software tools available for treatment planning for such irradiations. We report on the adaptation, commissioning, and evaluation of a 3-dimensional treatment planning system for use with a small-animal irradiation system. The 225-kV X-ray beam of the X-RAD 225Cx microirradiator (Precision X-Ray) was commissioned using both ion-chamber and radiochromic film for 10 different collimators ranging in field size from 1 mm in diameter to 40 × 40 mm². A clinical 3-dimensional treatment planning system (Metropolis) developed at our institution was adapted to small-animal irradiation by making it compatible with the dimensions of mice and rats, modeling the microirradiator beam orientations and collimators, and incorporating the measured beam data for dose calculation. Dose calculations in Metropolis were verified by comparison with measurements in phantoms. Treatment plans for irradiation of a tumor-bearing mouse were generated with both Metropolis and the vendor-supplied software. The calculated beam-on times and the plan evaluation tools were compared. The dose rate at the central axis ranges from 74 to 365 cGy/min depending on the collimator size. Doses calculated with Metropolis agreed with phantom measurements within 3% for all collimators. The beam-on times calculated by Metropolis and the vendor-supplied software agreed within 1% at the isocenter. The modified 3-dimensional treatment planning system provides better visualization of the relationship between the X-ray beams and the small-animal anatomy, as well as more complete dosimetric information on target tissues and organs at risk. It thereby enhances the potential of image-guided microirradiator systems for evaluation of dose–response relationships and for preclinical experimentation generally. PMID:25948321
ERIC Educational Resources Information Center
Abayomi, B. O.; Oyeniyi, Pat Ola; Ainazx, O. O.
2017-01-01
The paper appraised the organization and administration of intramural sports programmes in secondary schools in Ibadan metropolis. The descriptive research design of survey type was employed for the study. The population was all secondary school students and teachers in Ibadan Metropolis. The sample consisted of 500 respondents, 40 public…
Lung Cancer Pathological Image Analysis Using a Hidden Potts Model
Li, Qianyun; Yi, Faliu; Wang, Tao; Xiao, Guanghua; Liang, Faming
2017-01-01
Nowadays, many biological data are acquired via images. In this article, we study pathological images scanned from 205 patients with lung cancer, with the goal of finding the relationship between survival time and the spatial distribution of different types of cells, including lymphocyte, stroma, and tumor cells. Toward this goal, we model the spatial distribution of the different cell types using a modified Potts model, whose parameters represent interactions between different types of cells, and estimate the parameters of the Potts model using the double Metropolis-Hastings algorithm. The double Metropolis-Hastings algorithm allows us to simulate samples approximately from a distribution with an intractable normalizing constant. Our numerical results indicate that the spatial interaction between lymphocytes and tumor cells is significantly associated with the patient's survival time, and it can be used together with the cell count information to predict the survival of the patients. PMID:28615918
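The double Metropolis-Hastings step can be sketched on a toy model. Below, an exchange-style update targets the coupling of a 1-D Ising chain (a two-state relative of the Potts model used above); the flat prior, Gaussian proposal, and inner-sweep count are illustrative assumptions, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def neighbor_sum(x):
    # sufficient statistic S(x): sum of neighboring spin products on a ring
    return float(np.sum(x * np.roll(x, 1)))

def gibbs_sweep(x, theta, rng):
    # one heat-bath sweep of the 1-D Ising model with coupling theta
    x = x.copy()
    n = len(x)
    for i in range(n):
        local = x[(i - 1) % n] + x[(i + 1) % n]
        p_up = 1.0 / (1.0 + np.exp(-2.0 * theta * local))
        x[i] = 1 if rng.random() < p_up else -1
    return x

def double_mh(x_obs, n_iter=1000, step=0.3, inner_sweeps=5):
    # double Metropolis-Hastings: the intractable normalizer cancels because
    # an auxiliary configuration y is simulated at the proposed parameter
    theta, s_obs, draws = 0.0, neighbor_sum(x_obs), []
    for _ in range(n_iter):
        theta_prop = theta + step * rng.normal()
        y = x_obs                              # inner chain starts at the data
        for _ in range(inner_sweeps):
            y = gibbs_sweep(y, theta_prop, rng)
        log_alpha = (theta_prop - theta) * (s_obs - neighbor_sum(y))
        if np.log(rng.random()) < log_alpha:   # flat prior, symmetric proposal
            theta = theta_prop
        draws.append(theta)
    return np.array(draws)

# synthetic data generated at theta = 0.5, then the coupling is re-estimated
x0 = np.ones(60, dtype=int)
for _ in range(200):
    x0 = gibbs_sweep(x0, 0.5, rng)
samples = double_mh(x0)
```

The short inner run started at the observed data replaces an exact draw from the model at the proposed parameter, which is what distinguishes the double Metropolis-Hastings algorithm from the exact exchange algorithm.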
ERIC Educational Resources Information Center
Bua, Felix Terhile
2013-01-01
The study investigated the influence of school environment on the management of secondary school education in Makurdi Metropolis of Benue State. Two research questions and two hypotheses guided the study. The survey design was adopted for the study. Four hundred (400) teachers from 20 grant aided secondary schools in Markurdi Metropolis of Benue…
Rugged Metropolis sampling with simultaneous updating of two dynamical variables
NASA Astrophysics Data System (ADS)
Berg, Bernd A.; Zhou, Huan-Xiang
2005-07-01
The rugged Metropolis (RM) algorithm is a biased updating scheme which aims at directly hitting the most likely configurations in a rugged free-energy landscape. Details of the one-variable (RM1) implementation of this algorithm are presented. This is followed by an extension to simultaneous updating of two dynamical variables (RM2). In a test with the brain peptide Met-Enkephalin in vacuum RM2 improves conventional Metropolis simulations by a factor of about 4. Correlations between three or more dihedral angles appear to prevent larger improvements at low temperatures. We also investigate a multihit Metropolis scheme, which spends more CPU time on variables with large autocorrelation times.
Extended Mixed-Effects Item Response Models with the MH-RM Algorithm
ERIC Educational Resources Information Center
Chalmers, R. Philip
2015-01-01
A mixed-effects item response theory (IRT) model is presented as a logical extension of the generalized linear mixed-effects modeling approach to formulating explanatory IRT models. Fixed and random coefficients in the extended model are estimated using a Metropolis-Hastings Robbins-Monro (MH-RM) stochastic imputation algorithm to accommodate for…
Bayesian Analysis of Nonlinear Structural Equation Models with Nonignorable Missing Data
ERIC Educational Resources Information Center
Lee, Sik-Yum
2006-01-01
A Bayesian approach is developed for analyzing nonlinear structural equation models with nonignorable missing data. The nonignorable missingness mechanism is specified by a logistic regression model. A hybrid algorithm that combines the Gibbs sampler and the Metropolis-Hastings algorithm is used to produce the joint Bayesian estimates of…
Ozoh, Obianuju B.; Okubadejo, Njideka U.; Akanbi, Maxwell O.; Dania, Michelle G.
2013-01-01
Background: The burden of obstructive sleep apnea among commercial drivers in Nigeria is not known. Aim: To assess the prevalence of high risk of obstructive sleep apnea (OSA) and excessive daytime sleepiness (EDS) among intra-city commercial drivers. Setting and Design: A descriptive cross-sectional study in three major motor parks in Lagos metropolis. Materials and Methods: Demographic, anthropometric and historical data were obtained. The risk of OSA and EDS was assessed using the STOP BANG questionnaire and the Epworth Sleepiness Scale, respectively. Statistical Analysis: The relationship between the OSA risk, EDS risk and past road traffic accident (RTA) was explored using Pearson's chi-square test. Independent determinants of OSA risk, EDS risk and past RTA, respectively, were assessed by multiple logistic regression models. Result: Five hundred male commercial drivers (mean age (years) ±SD = 42.36 ± 11.17 and mean BMI (kg/m²) ±SD = 25.68 ± 3.79) were recruited. OSA risk was high in 244 (48.8%) drivers and 72 (14.4%) had EDS. There was a positive relationship between OSA risk and the risk of EDS (Pearson's χ² = 28.2, P < 0.001). Sixty-one (12.2%) drivers had a past history of RTA but there was no significant relationship between a past RTA and either OSA risk (χ² = 2.05, P = 0.15) or EDS risk (χ² = 2.7, P = 0.1), respectively. Abdominal adiposity, regular alcohol use and EDS were independent determinants of OSA risk while the use of cannabis and OSA risk were independent determinants of EDS. No independent risk factor for past RTA was identified. Conclusion: A significant proportion of commercial drivers in Lagos metropolis are at high risk of OSA and EDS. PMID:24249946
Meng, Xia; Fu, Qingyan; Ma, Zongwei; Chen, Li; Zou, Bin; Zhang, Yan; Xue, Wenbo; Wang, Jinnan; Wang, Dongfang; Kan, Haidong; Liu, Yang
2016-01-01
Development of an exposure assessment model is a key component of epidemiological studies concerning air pollution, but evidence from China is limited. Therefore, a linear mixed effects (LME) model was established in this study in a Chinese metropolis by incorporating aerosol optical depth (AOD), meteorological information and the land use regression (LUR) model to predict ground PM10 levels at high spatiotemporal resolution. The cross-validation (CV) R² and the RMSE of the LME model were 0.87 and 19.2 μg/m³, respectively. The relative prediction errors (RPE) of the daily and annual mean predicted PM10 concentrations were 19.1% and 7.5%, respectively. This study was the first attempt in China to estimate both short-term and long-term variation of PM10 levels with high spatial resolution in a Chinese metropolis with the LME model. The results suggested that the LME model could provide exposure assessment for short-term and long-term epidemiological studies in China. Copyright © 2015 Elsevier Ltd. All rights reserved.
A Bootstrap Metropolis–Hastings Algorithm for Bayesian Analysis of Big Data
Kim, Jinsu; Song, Qifan
2016-01-01
Markov chain Monte Carlo (MCMC) methods have proven to be a very powerful tool for analyzing data of complex structures. However, their computer-intensive nature, which typically requires a large number of iterations and a complete scan of the full dataset for each iteration, precludes their use for big data analysis. In this paper, we propose the so-called bootstrap Metropolis-Hastings (BMH) algorithm, which provides a general framework for taming powerful MCMC methods for big data analysis; that is, the full-data log-likelihood is replaced by a Monte Carlo average of the log-likelihoods calculated in parallel from multiple bootstrap samples. The BMH algorithm possesses an embarrassingly parallel structure and avoids repeated scans of the full dataset, and is thus feasible for big data problems. Compared to the popular divide-and-combine method, BMH can be generally more efficient as it can asymptotically integrate the whole data information into a single simulation run. The BMH algorithm is very flexible. Like the Metropolis-Hastings algorithm, it can serve as a basic building block for developing advanced MCMC algorithms that are feasible for big data problems. This is illustrated in the paper by the tempering BMH algorithm, which can be viewed as a combination of parallel tempering and the BMH algorithm. BMH can also be used for model selection and optimization by combining it with reversible jump MCMC and simulated annealing, respectively. PMID:29033469
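The central idea, replacing the full-data log-likelihood with an average over bootstrap samples inside the Metropolis-Hastings ratio, can be sketched for a Gaussian mean. The n/m rescaling and flat prior are simplifying assumptions for this illustration; the paper's exact weighting and correction terms may differ.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=1.0, size=5000)   # full dataset (toy)

# k bootstrap samples of size m; in BMH these would live on parallel workers
k, m = 10, 500
boots = [rng.choice(data, size=m, replace=True) for _ in range(k)]

def avg_boot_loglik(mu):
    # Monte Carlo average of per-sample log-likelihoods, rescaled to full size
    scale = len(data) / m
    return np.mean([scale * np.sum(-0.5 * (b - mu) ** 2) for b in boots])

def bmh(n_iter=3000, step=0.05):
    # Metropolis-Hastings with the averaged surrogate log-likelihood
    mu, ll, out = 0.0, avg_boot_loglik(0.0), []
    for _ in range(n_iter):
        prop = mu + step * rng.normal()
        ll_prop = avg_boot_loglik(prop)
        if np.log(rng.random()) < ll_prop - ll:    # flat prior on mu
            mu, ll = prop, ll_prop
        out.append(mu)
    return np.array(out)

samples = bmh()
```

Only the k per-sample log-likelihoods change when the parameter moves, so the expensive full-data scan never appears inside the iteration loop.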
Maximum Likelihood Estimation of Nonlinear Structural Equation Models.
ERIC Educational Resources Information Center
Lee, Sik-Yum; Zhu, Hong-Tu
2002-01-01
Developed an EM type algorithm for maximum likelihood estimation of a general nonlinear structural equation model in which the E-step is completed by a Metropolis-Hastings algorithm. Illustrated the methodology with results from a simulation study and two real examples using data from previous studies. (SLD)
DOT National Transportation Integrated Search
2016-10-01
This report summarizes the presentations, key themes, and recommendations identified at a Regional Models of Cooperation peer exchange on October 24, 2016 in Salt Lake City, Utah. The Utah Transit Authority hosted peers from the Los Angeles Metropoli...
Mattfeldt, Torsten
2011-04-01
Computer-intensive methods may be defined as data analytical procedures involving a huge number of highly repetitive computations. We mention resampling methods with replacement (bootstrap methods), resampling methods without replacement (randomization tests) and simulation methods. The resampling methods are based on simple and robust principles and are largely free from distributional assumptions. Bootstrap methods may be used to compute confidence intervals for a scalar model parameter and for summary statistics from replicated planar point patterns, and for significance tests. For some simple models of planar point processes, point patterns can be simulated by elementary Monte Carlo methods. The simulation of models with more complex interaction properties usually requires more advanced computing methods. In this context, we mention simulation of Gibbs processes with Markov chain Monte Carlo methods using the Metropolis-Hastings algorithm. An alternative to simulations on the basis of a parametric model consists of stochastic reconstruction methods. The basic ideas behind the methods are briefly reviewed and illustrated by simple worked examples in order to encourage novices in the field to use computer-intensive methods. © 2010 The Authors Journal of Microscopy © 2010 Royal Microscopical Society.
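As a minimal illustration of the bootstrap confidence intervals mentioned above, the sketch below computes a percentile interval for a mean; the exponential data, resample count, and 95% level are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
sample = rng.exponential(scale=3.0, size=200)      # observed data (toy)

def percentile_ci(data, stat=np.mean, n_boot=5000, alpha=0.05):
    # resample with replacement, recompute the statistic, take quantiles
    stats = np.array([stat(rng.choice(data, size=len(data), replace=True))
                      for _ in range(n_boot)])
    return np.quantile(stats, [alpha / 2.0, 1.0 - alpha / 2.0])

lo, hi = percentile_ci(sample)                     # 95% interval for the mean
```

The same resampling loop works for any scalar summary statistic of a point pattern by swapping in a different `stat` function, which is the robustness and distribution-freeness the review emphasizes.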
NASA Astrophysics Data System (ADS)
Bonnema, Matthew G.; Sikder, Safat; Hossain, Faisal; Durand, Michael; Gleason, Colin J.; Bjerklie, David M.
2016-04-01
The objective of this study is to compare the effectiveness of three algorithms that estimate discharge from remotely sensed observables (river width, water surface height, and water surface slope) in anticipation of the forthcoming NASA/CNES Surface Water and Ocean Topography (SWOT) mission. SWOT promises to provide these measurements simultaneously, and the river discharge algorithms included here are designed to work with these data. Two algorithms, the Metropolis Manning (MetroMan) method and the Mean Flow and Geomorphology (MFG) method, were built around Manning's equation, and one approach, the at-many-stations hydraulic geometry (AMHG) method, uses hydraulic geometry to estimate discharge. A well-calibrated and ground-truthed hydrodynamic model of the Ganges river system (HEC-RAS) was used as the reference for three rivers from the Ganges River Delta: the main stem of the Ganges, the Arial-Khan, and the Mohananda Rivers. The high seasonal variability of these rivers due to the monsoon presented a unique opportunity to thoroughly assess the discharge algorithms for rivers with a typical monsoon flow regime. It was found that the MFG method provides the most accurate discharge estimations in most cases, with an average relative root-mean-squared error (RRMSE) across all three reaches of 35.5%. It is followed closely by the Metropolis Manning algorithm, with an average RRMSE of 51.5%. However, the MFG method's reliance on knowledge of prior river discharge limits its application on ungauged rivers. In terms of input data requirement at ungauged regions with no prior records, the Metropolis Manning algorithm provides a more practical alternative over a region that is lacking in historical observations, as the algorithm requires less ancillary data. The AMHG algorithm, while requiring the least prior river data, provided the least accurate discharge measurements, with average wet and dry season RRMSEs of 79.8% and 119.1%, respectively, across all rivers studied.
This poor performance is directly traced to poor estimation of AMHG via a remotely sensed proxy, and results improve commensurate with MFG and MetroMan when prior AMHG information is given to the method. Therefore, we cannot recommend use of AMHG without inclusion of this prior information, at least for the studied rivers. The dry season discharge (within-bank flow) was captured well by all methods, while the wet season (floodplain flow) appeared more challenging. The picture that emerges from this study is that a multialgorithm approach may be appropriate during flood inundation periods in Ganges Delta.
NASA Astrophysics Data System (ADS)
Ustinov, E. A.
2017-01-01
The paper aims at a comparison of techniques based on the kinetic Monte Carlo (kMC) and the conventional Metropolis Monte Carlo (MC) methods as applied to the hard-sphere (HS) fluid and solid. In the case of the kMC, an alternative representation of the chemical potential is explored [E. A. Ustinov and D. D. Do, J. Colloid Interface Sci. 366, 216 (2012)], which does not require any external procedure like the Widom test particle insertion method. A direct evaluation of the chemical potential of the fluid and solid without thermodynamic integration is achieved by molecular simulation in an elongated box with an external potential imposed on the system in order to reduce the particle density in the vicinity of the box ends. The existence of rarefied zones allows one to determine the chemical potential of the crystalline phase and substantially increases its accuracy for the disordered dense phase in the central zone of the simulation box. This method is applicable to both the Metropolis MC and the kMC, but in the latter case, the chemical potential is determined with higher accuracy under the same conditions and number of MC steps. Thermodynamic functions of the disordered fluid and crystalline face-centered cubic (FCC) phase for the hard-sphere system have been evaluated with the kinetic MC and the standard MC coupled with the Widom procedure over a wide range of density. The melting transition parameters have been determined by the point of intersection of the pressure-chemical potential curves for the disordered HS fluid and FCC crystal using the Gibbs-Duhem equation as a constraint. A detailed thermodynamic analysis of the hard-sphere fluid has provided a rigorous verification of the approach, which can be extended to more complex systems.
NASA Astrophysics Data System (ADS)
De Lannoy, G. J.; Reichle, R. H.; Vrugt, J. A.
2012-12-01
Simulated L-band (1.4 GHz) brightness temperatures are very sensitive to the values of the parameters in the radiative transfer model (RTM). We assess the optimum RTM parameter values and their (posterior) uncertainty in the Goddard Earth Observing System (GEOS-5) land surface model using observations of multi-angular brightness temperature over North America from the Soil Moisture Ocean Salinity (SMOS) mission. Two different parameter estimation methods are being compared: (i) a particle swarm optimization (PSO) approach, and (ii) an MCMC simulation procedure using the differential evolution adaptive Metropolis (DREAM) algorithm. Our results demonstrate that both methods provide similar "optimal" parameter values. Yet, DREAM exhibits better convergence properties, resulting in a reduced spread of the posterior ensemble. The posterior parameter distributions derived with both methods are used for predictive uncertainty estimation of brightness temperature. This presentation will highlight our model-data synthesis framework and summarize our initial findings.
A GPU-based large-scale Monte Carlo simulation method for systems with long-range interactions
NASA Astrophysics Data System (ADS)
Liang, Yihao; Xing, Xiangjun; Li, Yaohang
2017-06-01
In this work we present an efficient implementation of canonical Monte Carlo simulation for Coulomb many-body systems on graphics processing units (GPUs). Our method takes advantage of the GPU Single Instruction, Multiple Data (SIMD) architecture and adopts the sequential updating scheme of the Metropolis algorithm. It makes no approximation in the computation of energy and reaches a remarkable 440-fold speedup compared with the serial implementation on a CPU. We further use this method to simulate primitive-model electrolytes and measure very precisely all ion-ion pair correlation functions at high concentrations. From these data, we extract the renormalized Debye length, renormalized valences of the constituent ions, and renormalized dielectric constants. These results demonstrate unequivocally physics beyond the classical Poisson-Boltzmann theory.
Temme, K; Osborne, T J; Vollbrecht, K G; Poulin, D; Verstraete, F
2011-03-03
The original motivation to build a quantum computer came from Feynman, who imagined a machine capable of simulating generic quantum mechanical systems--a task that is believed to be intractable for classical computers. Such a machine could have far-reaching applications in the simulation of many-body quantum physics in condensed-matter, chemical and high-energy systems. Part of Feynman's challenge was met by Lloyd, who showed how to approximately decompose the time evolution operator of interacting quantum particles into a short sequence of elementary gates, suitable for operation on a quantum computer. However, this left open the problem of how to simulate the equilibrium and static properties of quantum systems. This requires the preparation of ground and Gibbs states on a quantum computer. For classical systems, this problem is solved by the ubiquitous Metropolis algorithm, a method that has basically acquired a monopoly on the simulation of interacting particles. Here we demonstrate how to implement a quantum version of the Metropolis algorithm. This algorithm permits sampling directly from the eigenstates of the Hamiltonian, and thus evades the sign problem present in classical simulations. A small-scale implementation of this algorithm should be achievable with today's technology.
Exploring first-order phase transitions with population annealing
NASA Astrophysics Data System (ADS)
Barash, Lev Yu.; Weigel, Martin; Shchur, Lev N.; Janke, Wolfhard
2017-03-01
Population annealing is a hybrid of sequential and Markov chain Monte Carlo methods geared towards the efficient parallel simulation of systems with complex free-energy landscapes. Systems with first-order phase transitions are among the problems in computational physics that are difficult to tackle with standard methods such as local-update simulations in the canonical ensemble, for example with the Metropolis algorithm. It is hence interesting to see whether such transitions can be more easily studied using population annealing. We report here our preliminary observations from population annealing runs for the two-dimensional Potts model with q > 4, where it undergoes a first-order transition.
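A minimal sketch of population annealing on a 1-D double-well energy (an assumed toy landscape, not the Potts model studied here): replicas are reweighted and resampled at each temperature step, then equilibrated with a few Metropolis sweeps. The schedule and population size are arbitrary, and the initial population is drawn from a broad Gaussian rather than the exact high-temperature distribution, a simplification for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def energy(x):
    # 1-D double-well energy with minima at x = -1 and x = +1
    return (x ** 2 - 1.0) ** 2

def population_annealing(n_pop=500, n_steps=50, beta_max=5.0, sweeps=5):
    betas = np.linspace(0.1, beta_max, n_steps)
    x = rng.normal(0.0, 2.0, size=n_pop)           # broad initial population
    for beta_prev, beta in zip(betas[:-1], betas[1:]):
        # reweight and resample replicas for the new inverse temperature
        w = np.exp(-(beta - beta_prev) * energy(x))
        x = x[rng.choice(n_pop, size=n_pop, p=w / w.sum())]
        # equilibrate with a few vectorized Metropolis sweeps at beta
        for _ in range(sweeps):
            prop = x + 0.3 * rng.normal(size=n_pop)
            accept = np.log(rng.random(n_pop)) < -beta * (energy(prop) - energy(x))
            x = np.where(accept, prop, x)
    return x

pop = population_annealing()
```

The resampling step is what lets the population cross barriers that would trap a single local-update Metropolis chain, which is the motivation for trying the method at first-order transitions.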
Community of Inquiry Method and Language Skills Acquisition: Empirical Evidence
ERIC Educational Resources Information Center
Preece, Abdul Shakhour Duncan
2015-01-01
The study investigates the effectiveness of community of inquiry method in preparing students to develop listening and speaking skills in a sample of junior secondary school students in Borno state, Nigeria. A sample of 100 students in standard classes was drawn in one secondary school in Maiduguri metropolis through stratified random sampling…
Exact posterior computation in non-conjugate Gaussian location-scale parameters models
NASA Astrophysics Data System (ADS)
Andrade, J. A. A.; Rathie, P. N.
2017-12-01
In Bayesian analysis, the class of conjugate models allows one to obtain exact posterior distributions; however, this class is quite restrictive in the sense that it involves only a few distributions. In fact, most practical applications involve non-conjugate models, so approximate methods, such as MCMC algorithms, are required. Although these methods can deal with quite complex structures, some practical problems can make their application very time-consuming: for example, when we use heavy-tailed distributions, convergence may be difficult and the Metropolis-Hastings algorithm can become very slow, in addition to the extra work inevitably required to choose efficient candidate-generating distributions. In this work, we draw attention to special functions as tools for Bayesian computation, and we propose an alternative method for obtaining the posterior distribution in Gaussian non-conjugate models in exact form. We use complex integration methods based on the H-function to obtain the posterior distribution and some of its posterior quantities in an explicitly computable form. Two examples are provided to illustrate the theory.
Hierarchical Bayesian modeling of ionospheric TEC disturbances as non-stationary processes
NASA Astrophysics Data System (ADS)
Seid, Abdu Mohammed; Berhane, Tesfahun; Roininen, Lassi; Nigussie, Melessew
2018-03-01
We model the regular and irregular variation of ionospheric total electron content as stationary and non-stationary processes, respectively. We apply the method developed to a SCINDA GPS data set observed at Bahir Dar, Ethiopia (11.6°N, 37.4°E). We use hierarchical Bayesian inversion with Gaussian Markov random process priors, and we model the prior parameters in the hyperprior. We use Matérn priors via stochastic partial differential equations, and scaled Inv-χ² hyperpriors for the hyperparameters. For drawing posterior estimates, we use Markov chain Monte Carlo methods: Gibbs sampling and Metropolis-within-Gibbs for parameter and hyperparameter estimation, respectively. This allows us to quantify model parameter estimation uncertainties as well. We demonstrate the applicability of the proposed method using a synthetic test case. Finally, we apply the method to the real GPS data set, which we decompose into regular and irregular variation components. The results show that the approach can be used as an accurate ionospheric disturbance characterization technique that quantifies the total electron content variability with corresponding error uncertainties.
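A Metropolis-within-Gibbs scheme of the kind used here, direct Gibbs draws for one block and random-walk Metropolis for another, can be sketched on a toy Gaussian model; the flat priors, step size, and synthetic data are illustrative assumptions, not the paper's ionospheric model.

```python
import numpy as np

rng = np.random.default_rng(3)
y = rng.normal(1.5, 2.0, size=300)          # synthetic data: mean 1.5, sd 2.0
n, ybar = len(y), y.mean()

def loglik(ls, mu):
    # Gaussian log-likelihood with mean mu and standard deviation exp(ls)
    return -n * ls - 0.5 * np.sum((y - mu) ** 2) / np.exp(2 * ls)

def metropolis_within_gibbs(n_iter=4000, step=0.2):
    mu, log_sig = 0.0, 0.0
    out = np.empty((n_iter, 2))
    for t in range(n_iter):
        # Gibbs step: with a flat prior, mu | sigma, y is exactly Gaussian
        sig = np.exp(log_sig)
        mu = rng.normal(ybar, sig / np.sqrt(n))
        # Metropolis step: random walk on log sigma (flat prior on log sigma)
        prop = log_sig + step * rng.normal()
        if np.log(rng.random()) < loglik(prop, mu) - loglik(log_sig, mu):
            log_sig = prop
        out[t] = mu, np.exp(log_sig)
    return out

chain = metropolis_within_gibbs()
```

Blocks with tractable full conditionals are drawn exactly, and only the awkward block (here the scale, standing in for the hyperparameters) needs a Metropolis step.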
PyDREAM: high-dimensional parameter inference for biological models in python.
Shockley, Erin M; Vrugt, Jasper A; Lopez, Carlos F; Valencia, Alfonso
2018-02-15
Biological models contain many parameters whose values are difficult to measure directly via experimentation and therefore require calibration against experimental data. Markov chain Monte Carlo (MCMC) methods are suitable to estimate multivariate posterior model parameter distributions, but these methods may exhibit slow or premature convergence in high-dimensional search spaces. Here, we present PyDREAM, a Python implementation of the (Multiple-Try) Differential Evolution Adaptive Metropolis [DREAM(ZS)] algorithm developed by Vrugt and ter Braak (2008) and Laloy and Vrugt (2012). PyDREAM achieves excellent performance for complex, parameter-rich models and takes full advantage of distributed computing resources, facilitating parameter inference and uncertainty estimation of CPU-intensive biological models. PyDREAM is freely available under the GNU GPLv3 license from the Lopez lab GitHub repository at http://github.com/LoLab-VU/PyDREAM. c.lopez@vanderbilt.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, Brian Phillip
The purpose of this document is to describe the statistical modeling effort for gas concentrations in WIPP storage containers. The concentration (in ppm) of CO₂ in the headspace volume of standard waste box (SWB) 68685 is shown. A Bayesian approach and an adaptive Metropolis-Hastings algorithm were used.
Hybrid optimization and Bayesian inference techniques for a non-smooth radiation detection problem
Stefanescu, Razvan; Schmidt, Kathleen; Hite, Jason; ...
2016-12-12
In this paper, we propose several algorithms to recover the location and intensity of a radiation source located in a simulated 250 × 180 m block of an urban center based on synthetic measurements. Radioactive decay and detection are Poisson random processes, so we employ likelihood functions based on this distribution. Owing to the domain geometry and the proposed response model, the negative logarithm of the likelihood is only piecewise continuously differentiable, and it has multiple local minima. To address these difficulties, we investigate three hybrid algorithms composed of mixed optimization techniques. For global optimization, we consider simulated annealing, particle swarm, and genetic algorithms, which rely solely on objective function evaluations; that is, they do not evaluate the gradient of the objective function. By employing early stopping criteria for the global optimization methods, a pseudo-optimum point is obtained. This is subsequently utilized as the initial value by the deterministic implicit filtering method, which is able to find local extrema in non-smooth functions, to finish the search in a narrow domain. These new hybrid techniques, combining global optimization and implicit filtering, address the difficulties associated with the non-smooth response, and their performance is shown to significantly decrease the computational time relative to the global optimization methods alone. To quantify uncertainties associated with the source location and intensity, we employ the delayed rejection adaptive Metropolis and DiffeRential Evolution Adaptive Metropolis algorithms. Finally, marginal densities of the source properties are obtained, and the means of the chains compare accurately with the estimates produced by the hybrid algorithms.
Cure fraction model with random effects for regional variation in cancer survival.
Seppä, Karri; Hakulinen, Timo; Kim, Hyon-Jung; Läärä, Esa
2010-11-30
Assessing regional differences in the survival of cancer patients is important but difficult when separate regions are small or sparsely populated. In this paper, we apply a mixture cure fraction model with random effects to cause-specific survival data of female breast cancer patients collected by the population-based Finnish Cancer Registry. Two sets of random effects were used to capture the regional variation in the cure fraction and in the survival of the non-cured patients, respectively. This hierarchical model was implemented in a Bayesian framework using a Metropolis-within-Gibbs algorithm. To avoid poor mixing of the Markov chain, when the variance of either set of random effects was close to zero, posterior simulations were based on a parameter-expanded model with tailor-made proposal distributions in Metropolis steps. The random effects allowed the fitting of the cure fraction model to the sparse regional data and the estimation of the regional variation in 10-year cause-specific breast cancer survival with a parsimonious number of parameters. Before 1986, the capital of Finland clearly stood out from the rest, but since then all the 21 hospital districts have achieved approximately the same level of survival. Copyright © 2010 John Wiley & Sons, Ltd.
Metropolis-Hastings Robbins-Monro Algorithm for Confirmatory Item Factor Analysis
ERIC Educational Resources Information Center
Cai, Li
2010-01-01
Item factor analysis (IFA), already well established in educational measurement, is increasingly applied to psychological measurement in research settings. However, high-dimensional confirmatory IFA remains a numerical challenge. The current research extends the Metropolis-Hastings Robbins-Monro (MH-RM) algorithm, initially proposed for…
Applicability of land use models for the Houston area test site
NASA Technical Reports Server (NTRS)
Petersburg, R. K.; Bradford, L. H.
1973-01-01
Descriptions are presented of the land use models considered for their applicability to the Houston Area Test Site. These models are representative both of the prevailing theories of land use dynamics and of basic approaches to simulation. The models considered are: A Model of Metropolis, the land use simulation model, the EMPIRIC land use forecasting model, a probabilistic model for residential growth, and the regional environmental management allocation process. Sources of environmental/resource information are listed.
Comparison of sampling techniques for Bayesian parameter estimation
NASA Astrophysics Data System (ADS)
Allison, Rupert; Dunkley, Joanna
2014-02-01
The posterior probability distribution for a set of model parameters encodes all that the data have to tell us in the context of a given model; it is the fundamental quantity for Bayesian parameter estimation. In order to infer the posterior probability distribution we have to decide how to explore parameter space. Here we compare three prescriptions for how parameter space is navigated, discussing their relative merits. We consider Metropolis-Hastings sampling, nested sampling and affine-invariant ensemble Markov chain Monte Carlo (MCMC) sampling. We focus on their performance on toy-model Gaussian likelihoods and on a real-world cosmological data set. We outline the sampling algorithms themselves and elaborate on performance diagnostics such as convergence time, scope for parallelization, dimensional scaling, requisite tunings and suitability for non-Gaussian distributions. We find that nested sampling delivers high-fidelity estimates for posterior statistics at low computational cost, and should be adopted in favour of Metropolis-Hastings in many cases. Affine-invariant MCMC is competitive when computing clusters can be utilized for massive parallelization. Affine-invariant MCMC and existing extensions to nested sampling naturally probe multimodal and curving distributions.
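For reference, the Metropolis-Hastings prescription compared above reduces to a few lines for a toy correlated-Gaussian likelihood; the target covariance and step size are arbitrary choices for this sketch.

```python
import numpy as np

rng = np.random.default_rng(4)
COV_INV = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))

def log_post(x):
    # log-density of a zero-mean correlated Gaussian, up to a constant
    return -0.5 * x @ COV_INV @ x

def metropolis_hastings(n_iter=20000, step=0.6):
    x, lp = np.zeros(2), 0.0
    chain, accepted = np.empty((n_iter, 2)), 0
    for t in range(n_iter):
        prop = x + step * rng.normal(size=2)       # symmetric random walk
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:    # Metropolis acceptance
            x, lp = prop, lp_prop
            accepted += 1
        chain[t] = x
    return chain, accepted / n_iter

chain, acc = metropolis_hastings()
```

The requisite tuning discussed in the abstract shows up here as the `step` parameter, which trades acceptance rate against the distance moved per proposal.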
Irreversible Local Markov Chains with Rapid Convergence towards Equilibrium.
Kapfer, Sebastian C; Krauth, Werner
2017-12-15
We study the continuous one-dimensional hard-sphere model and present irreversible local Markov chains that mix on faster time scales than the reversible heat bath or Metropolis algorithms. The mixing time scales appear to fall into two distinct universality classes, both faster than for reversible local Markov chains. The event-chain algorithm, the infinitesimal limit of one of these Markov chains, belongs to the class presenting the fastest decay. For the lattice-gas limit of the hard-sphere model, reversible local Markov chains correspond to the symmetric simple exclusion process (SEP) with periodic boundary conditions. The two universality classes for irreversible Markov chains are realized by the totally asymmetric SEP (TASEP), and by a faster variant (lifted TASEP) that we propose here. We discuss how our irreversible hard-sphere Markov chains generalize to arbitrary repulsive pair interactions and carry over to higher dimensions through the concept of lifted Markov chains and the recently introduced factorized Metropolis acceptance rule.
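The factorized Metropolis rule mentioned at the end replaces the single acceptance test on the total energy change by an independent test per interaction factor, accepting a move only if every factor accepts. The sketch below applies it to a 1-D configuration with a generic soft repulsive pair potential; the potential, inverse temperature, and step size are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

def pair_energy(r):
    # generic soft repulsive pair potential (an assumption for illustration)
    return r ** -12

def factorized_accept(x, i, x_new_i, beta):
    # consensus rule: every pair factor must accept its own energy change
    for j in range(len(x)):
        if j == i:
            continue
        dE = pair_energy(abs(x_new_i - x[j])) - pair_energy(abs(x[i] - x[j]))
        if rng.random() >= min(1.0, np.exp(-beta * dE)):
            return False
    return True

# one sweep over a 1-D configuration of ten particles
x = np.linspace(0.0, 9.0, 10)
beta, step = 1.0, 0.1
for i in range(len(x)):
    prop = x[i] + step * (rng.random() - 0.5)
    if factorized_accept(x, i, prop, beta):
        x[i] = prop
```

The product of factor-wise acceptance probabilities still satisfies detailed balance, and because each factor can veto a move on its own, the rule underlies the event-chain and lifted algorithms discussed above.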
ERIC Educational Resources Information Center
Akpoghol, T. V.; Ezeudu, F. O.; Adzape, J. N.; Otor, E. E.
2016-01-01
The study investigated the effects of Lecture Method Supplemented with Music (LMM) and Computer Animation (LMC) on senior secondary school students' academic achievement in electrochemistry in Makurdi metropolis. Six research questions and six hypotheses guided the study. The design of the study was quasi-experimental, specifically the pre-test,…
ERIC Educational Resources Information Center
Akpoghol, T. V.; Ezeudu, F. O.; Adzape, J. N.; Otor, E. E.
2016-01-01
The study investigated the effects of Lecture Method Supplemented with Music (LMM) and Computer Animation (LMC) on senior secondary school students' retention in electrochemistry in Makurdi metropolis. Three research questions and three hypotheses guided the study. The design of the study was quasi-experimental, specifically the pre-test,…
Direct simulation Monte Carlo modeling of relaxation processes in polyatomic gases
NASA Astrophysics Data System (ADS)
Pfeiffer, M.; Nizenkov, P.; Mirza, A.; Fasoulas, S.
2016-02-01
Relaxation processes of polyatomic molecules are modeled and implemented in an in-house Direct Simulation Monte Carlo code in order to enable the simulation of atmospheric entry maneuvers at Mars and Saturn's Titan. The description of rotational and vibrational relaxation processes is derived from basic quantum mechanics using a rigid rotator and a simple harmonic oscillator, respectively. Strategies regarding the vibrational relaxation process are investigated, where good agreement for the relaxation time according to the Landau-Teller expression is found for both methods, the established prohibiting-double-relaxation method and the newly proposed multi-mode relaxation. Differences and application areas of these two methods are discussed. Subsequently, two numerical methods used for sampling energy values from multi-dimensional distribution functions are compared. The proposed random-walk Metropolis algorithm enables the efficient treatment of multiple vibrational modes within a time step with reasonable computational effort. The implemented model is verified and validated by means of simple reservoir simulations and comparison to experimental measurements of a hypersonic carbon-dioxide flow around a flat-faced cylinder.
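The random-walk Metropolis idea used here for sampling vibrational energies can be illustrated with a one-mode toy: a walk over the quantum number v of a simple harmonic oscillator targeting the Boltzmann distribution. This is a hedged sketch of the generic technique only; the paper's multi-mode, multi-dimensional treatment is more involved, and the temperature values below are arbitrary.

```python
import math
import random

def sample_vibrational_levels(theta_vib=3000.0, temp=2000.0, n_samples=20000, seed=2):
    """Random-walk Metropolis over the vibrational quantum number v of a
    simple harmonic oscillator, targeting p(v) ∝ exp(-v * theta_vib / T).
    The proposal v -> v ± 1 (equal probability) is symmetric; candidates
    with v < 0 are rejected outright, which keeps detailed balance."""
    rng = random.Random(seed)
    v, out = 0, []
    for _ in range(n_samples):
        vp = v + (1 if rng.random() < 0.5 else -1)
        # Metropolis acceptance: min(1, exp(-(E' - E)/kT)) in units of theta
        if vp >= 0 and math.log(rng.random() + 1e-300) < -(vp - v) * theta_vib / temp:
            v = vp
        out.append(v)
    return out
```

For this target the stationary distribution is geometric, so the sample mean of v should approach 1/(exp(theta_vib/T) - 1), which gives a quick correctness check.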
Approximate Bayesian Computation by Subset Simulation using hierarchical state-space models
NASA Astrophysics Data System (ADS)
Vakilzadeh, Majid K.; Huang, Yong; Beck, James L.; Abrahamsson, Thomas
2017-02-01
A new multi-level Markov Chain Monte Carlo algorithm for Approximate Bayesian Computation, ABC-SubSim, has recently appeared that exploits the Subset Simulation method for efficient rare-event simulation. ABC-SubSim adaptively creates a nested decreasing sequence of data-approximating regions in the output space that correspond to increasingly closer approximations of the observed output vector in this output space. At each level, multiple samples of the model parameter vector are generated by a component-wise Metropolis algorithm so that the predicted output corresponding to each parameter value falls in the current data-approximating region. Theoretically, if continued to the limit, the sequence of data-approximating regions would converge on to the observed output vector and the approximate posterior distributions, which are conditional on the data-approximation region, would become exact, but this is not practically feasible. In this paper we study the performance of the ABC-SubSim algorithm for Bayesian updating of the parameters of dynamical systems using a general hierarchical state-space model. We note that the ABC methodology gives an approximate posterior distribution that actually corresponds to an exact posterior where a uniformly distributed combined measurement and modeling error is added. We also note that ABC algorithms have a problem with learning the uncertain error variances in a stochastic state-space model and so we treat them as nuisance parameters and analytically integrate them out of the posterior distribution. In addition, the statistical efficiency of the original ABC-SubSim algorithm is improved by developing a novel strategy to regulate the proposal variance for the component-wise Metropolis algorithm at each level. 
We demonstrate that Self-regulated ABC-SubSim is well suited for Bayesian system identification by first applying it successfully to model updating of a two degree-of-freedom linear structure for three cases: globally, locally and un-identifiable model classes, and then to model updating of a two degree-of-freedom nonlinear structure with Duffing nonlinearities in its interstory force-deflection relationship.
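The nested data-approximating regions and the inside-the-region Metropolis refresh can be conveyed with a deliberately tiny example. Everything below is an illustrative toy (a scalar Gaussian model, a fixed tolerance schedule, naive multinomial restarts), not ABC-SubSim itself or its self-regulated proposal variance.

```python
import math
import random

def abc_levels(y_obs=1.0, eps_seq=(2.0, 1.0, 0.5, 0.25), n=500, seed=3):
    """Toy multi-level ABC in the spirit of ABC-SubSim: the regions
    |y_sim - y_obs| <= eps shrink level by level, and at each level samples
    are refreshed by a random-walk Metropolis move on the prior that is
    accepted only if the re-simulated output stays inside the current region.
    Toy model (an assumption of this sketch): y = theta + N(0, 0.5),
    prior theta ~ N(0, 2)."""
    rng = random.Random(seed)
    simulate = lambda th: th + rng.gauss(0.0, 0.5)
    log_prior = lambda th: -th * th / (2.0 * 2.0 ** 2)
    pairs = [(t, simulate(t)) for t in (rng.gauss(0.0, 2.0) for _ in range(n))]
    for eps in eps_seq:
        inside = [p for p in pairs if abs(p[1] - y_obs) <= eps]
        # restart every chain from a randomly chosen surviving sample
        seeds = [inside[rng.randrange(len(inside))] for _ in range(n)]
        pairs = []
        for t, y in seeds:
            tp = t + rng.gauss(0.0, 0.3)        # component-wise proposal
            yp = simulate(tp)
            if abs(yp - y_obs) <= eps and \
               math.log(rng.random() + 1e-300) < log_prior(tp) - log_prior(t):
                t, y = tp, yp
            pairs.append((t, y))
    return [t for t, _ in pairs]
```

As eps shrinks, the samples concentrate on parameter values whose simulated output matches the observation, approximating the posterior under a uniform measurement-error band, which is exactly the interpretation the abstract gives.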
Statistical hadronization and microcanonical ensemble
Becattini, F.; Ferroni, L.
2004-01-01
We present a Monte Carlo calculation of the microcanonical ensemble of the ideal hadron-resonance gas including all known states up to a mass of 1.8 GeV, taking into account quantum statistics. The computing method is a development of a previous one based on a Metropolis Monte Carlo algorithm, with the grand-canonical limit of the multi-species multiplicity distribution as the proposal matrix. The microcanonical average multiplicities of the various hadron species are found to converge to the canonical ones for moderately low values of the total energy. This algorithm opens the way for event generators based on the statistical hadronization model.
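Using a fixed distribution as the proposal matrix, as done here with the grand-canonical limit, is the independence Metropolis-Hastings scheme: the Hastings ratio corrects for the mismatch between proposal and target. The sketch below demonstrates the mechanics on stand-in Poisson distributions; the actual hadron-gas weights and the microcanonical constraint are not reproduced.

```python
import math
import random

def poisson_draw(lam, rng):
    """Knuth's method, adequate for small lambda."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def poisson_logpmf(k, lam):
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def independence_mh(n=20000, lam_target=4.0, lam_prop=5.0, seed=4):
    """Independence Metropolis-Hastings: candidates come from a fixed
    proposal (Poisson(5), standing in for the grand-canonical multiplicity
    distribution), and the Hastings ratio pi(x')q(x) / (pi(x)q(x'))
    corrects for its mismatch with the target Poisson(4)."""
    rng = random.Random(seed)
    x, out = poisson_draw(lam_prop, rng), []
    for _ in range(n):
        xp = poisson_draw(lam_prop, rng)
        loga = (poisson_logpmf(xp, lam_target) - poisson_logpmf(x, lam_target)
                + poisson_logpmf(x, lam_prop) - poisson_logpmf(xp, lam_prop))
        if math.log(rng.random() + 1e-300) < loga:
            x = xp
        out.append(x)
    return out
```

Because the proposal is close to the target, acceptance is high and the chain mean converges quickly to the target mean of 4, which is the point of choosing a physically motivated proposal distribution.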
Chen, Yunjie; Roux, Benoît
2015-08-11
Molecular dynamics (MD) trajectories based on a classical equation of motion provide a straightforward, albeit somewhat inefficient approach, to explore and sample the configurational space of a complex molecular system. While a broad range of techniques can be used to accelerate and enhance the sampling efficiency of classical simulations, only algorithms that are consistent with the Boltzmann equilibrium distribution yield a proper statistical mechanical computational framework. Here, a multiscale hybrid algorithm relying simultaneously on all-atom fine-grained (FG) and coarse-grained (CG) representations of a system is designed to improve sampling efficiency by combining the strength of nonequilibrium molecular dynamics (neMD) and Metropolis Monte Carlo (MC). This CG-guided hybrid neMD-MC algorithm comprises six steps: (1) a FG configuration of an atomic system is dynamically propagated for some period of time using equilibrium MD; (2) the resulting FG configuration is mapped onto a simplified CG model; (3) the CG model is propagated for a brief time interval to yield a new CG configuration; (4) the resulting CG configuration is used as a target to guide the evolution of the FG system; (5) the FG configuration (from step 1) is driven via a nonequilibrium MD (neMD) simulation toward the CG target; (6) the resulting FG configuration at the end of the neMD trajectory is then accepted or rejected according to a Metropolis criterion before returning to step 1. A symmetric two-ends momentum reversal prescription is used for the neMD trajectories of the FG system to guarantee that the CG-guided hybrid neMD-MC algorithm obeys microscopic detailed balance and rigorously yields the equilibrium Boltzmann distribution. The enhanced sampling achieved with the method is illustrated with a model system with hindered diffusion and explicit-solvent peptide simulations. 
Illustrative tests indicate that the method can yield a speedup of about 80 times for the model system and up to 21 times for polyalanine and (AAQAA)3 in water.
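The six-step cycle can be sketched as a loop with toy stand-ins for the physics: the FG system is a single coordinate in a double-well potential, "CG propagation" is a large random drift, and the neMD drive toward the CG target is collapsed into a direct jump accepted by the Metropolis criterion on the FG energy. All dynamics and names here are illustrative assumptions, not the paper's method (in particular, the work-based neMD acceptance and momentum reversal are omitted).

```python
import math
import random

def cg_guided_hybrid_mc(n_cycles=2000, beta=1.0, seed=5):
    """Schematic sketch of the CG-guided hybrid neMD-MC cycle on a toy
    double-well U(x) = (x^2 - 1)^2. The large CG-guided jumps give the
    barrier-crossing moves that short local 'MD' alone would rarely make."""
    rng = random.Random(seed)
    U = lambda x: (x * x - 1.0) ** 2
    x, samples = 1.0, []
    for _ in range(n_cycles):
        # (1) short equilibrium "MD": a few local Metropolis moves
        for _ in range(5):
            xp = x + rng.gauss(0.0, 0.2)
            if math.log(rng.random() + 1e-300) < -beta * (U(xp) - U(x)):
                x = xp
        # (2) map FG -> CG (identity here), (3) propagate CG briefly
        cg_target = x + rng.gauss(0.0, 0.8)       # large CG step
        # (4)-(5) drive FG toward the CG target, (6) accept/reject
        if math.log(rng.random() + 1e-300) < -beta * (U(cg_target) - U(x)):
            x = cg_target
        samples.append(x)
    return samples
```

Even this toy version shows the intended effect: the chain visits both wells, whereas small local moves alone cross the barrier far less often.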
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Shuai; Xiong, Lihua; Li, Hong-Yi
2015-05-26
Hydrological simulations to delineate the impacts of climate variability and human activities are subjected to uncertainties related to both the parameters and the structure of the hydrological models. To analyze the impact of these uncertainties on model performance and to yield more reliable simulation results, a global calibration and multimodel combination method that integrates the Shuffled Complex Evolution Metropolis (SCEM) and Bayesian Model Averaging (BMA) of four monthly water balance models was proposed. The method was applied to the Weihe River Basin (WRB), the largest tributary of the Yellow River, to determine the contribution of climate variability and human activities to runoff changes. The change point, which was used to determine the baseline period (1956-1990) and the human-impacted period (1991-2009), was derived using both the cumulative curve and Pettitt's test. Results show that the combination method from SCEM provides more skillful deterministic predictions than the best calibrated individual model, resulting in the smallest uncertainty interval of runoff changes attributed to climate variability and human activities. This combination methodology provides a practical and flexible tool for attribution of runoff changes to climate variability and human activities by hydrological models.
NASA Astrophysics Data System (ADS)
Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn
2013-04-01
With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches can adequately treat the impact of high flows in hydrological modeling. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by our new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates, measured by the Nash-Sutcliffe efficiency, and the best uncertainty estimates for low, medium and entire flows compared to the standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via the Bayesian method.
NASA Astrophysics Data System (ADS)
Li, L.; Xu, C.-Y.; Engeland, K.
2012-04-01
With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method, which incorporates different sources of information into a single analysis through Bayes' theorem, is one of the most widely used methods for uncertainty assessment of hydrological models. However, none of these applications can adequately treat the uncertainty in extreme flows of hydrological model simulations. This study proposes a Bayesian modularization approach to uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian approach: the AR(1) plus Normal, time-period-independent model (Model 1), the AR(1) plus Normal, time-period-dependent model (Model 2) and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of entire flows and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian methods. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-27
... NUCLEAR REGULATORY COMMISSION [Docket No. 40-3392; NRC-2011-0143] License Amendment Request for Closure of Calcium Fluoride Ponds at Honeywell Metropolis Works, Honeywell International, Inc. AGENCY... Federal Regulations (10 CFR) to approve the closure of the calcium fluoride ponds in-place, by...
BOOK REVIEW: HANDBOOK OF URBAN HEALTH: POPULATIONS, METHODS AND PRACTICE
In Clifford D. Simak's 1952 science fiction classic, City, the metropolis is dead by the end of the 20th century. Cheap atomic power and ubiquitous private helicopters have made concentrated human existence a quaint memory. Simak could not have been more wrong, of course. About h...
Fission Dynamics with Microscopic Level Densities
Ward, D.; Carlsson, B. G.; Dossing, Th.; ...
2017-01-01
We present a consistent framework for treating the energy and angular-momentum dependence of the shape evolution in nuclear fission. It combines microscopically calculated level densities with the Metropolis-walk method, has no new parameters, and can elucidate the energy-dependent influence of pairing and shell effects on the dynamics of warm nuclei.
Zaikin, Alexey; Míguez, Joaquín
2017-01-01
We compare three state-of-the-art Bayesian inference methods for the estimation of the unknown parameters in a stochastic model of a genetic network. In particular, we introduce a stochastic version of the paradigmatic synthetic multicellular clock model proposed by Ullner et al., 2007. By introducing dynamical noise in the model and assuming that the partial observations of the system are contaminated by additive noise, we enable a principled mechanism to represent experimental uncertainties in the synthesis of the multicellular system and pave the way for the design of probabilistic methods for the estimation of any unknowns in the model. Within this setup, we tackle the Bayesian estimation of a subset of the model parameters. Specifically, we compare three Monte Carlo based numerical methods for the approximation of the posterior probability density function of the unknown parameters given a set of partial and noisy observations of the system. The schemes we assess are the particle Metropolis-Hastings (PMH) algorithm, the nonlinear population Monte Carlo (NPMC) method and the approximate Bayesian computation sequential Monte Carlo (ABC-SMC) scheme. We present an extensive numerical simulation study, which shows that while the three techniques can effectively solve the problem there are significant differences both in estimation accuracy and computational efficiency. PMID:28797087
NASA Astrophysics Data System (ADS)
Zhu, Gaofeng; Li, Xin; Ma, Jinzhu; Wang, Yunquan; Liu, Shaomin; Huang, Chunlin; Zhang, Kun; Hu, Xiaoli
2018-04-01
Sequential Monte Carlo (SMC) samplers have become increasingly popular for estimating the posterior parameter distribution with the non-linear dependency structures and multiple modes often present in hydrological models. However, the explorative capabilities and efficiency of the sampler depend strongly on the efficiency of the move step of the SMC sampler. In this paper we present a new SMC sampler, the Particle Evolution Metropolis Sequential Monte Carlo (PEM-SMC) algorithm, which is well suited to handle unknown static parameters of hydrologic models. The PEM-SMC sampler is inspired by the work of Liang and Wong (2001) and operates by incorporating the strengths of the genetic algorithm, the differential evolution algorithm and the Metropolis-Hastings algorithm into the framework of SMC. We also prove that the sampler admits the target distribution as a stationary distribution. Two case studies, including a multi-dimensional bimodal normal distribution and a conceptual rainfall-runoff hydrologic model, first considering only parameter uncertainty and then considering parameter and input uncertainty simultaneously, show that the PEM-SMC sampler is generally superior to other popular SMC algorithms in handling high-dimensional problems. The study also indicates that it may be important to account for model structural uncertainty by using multiple different hydrological models in the SMC framework in future studies.
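The idea of strengthening the SMC move step with population-based proposals can be conveyed with a small tempered-SMC toy whose move is a differential-evolution Metropolis update built from pairs of other particles. Everything below (target, tempering schedule, move step, tuning constants) is a simplified sketch of the general idea, not the published PEM-SMC algorithm.

```python
import math
import random

def de_smc_toy(n=300, levels=8, seed=6):
    """Tempered SMC from a wide N(0, 5^2) toward the bimodal target
    0.5*N(-3,1) + 0.5*N(3,1) through pi_k ∝ p0^(1-b) * pi^b, b = k/levels.
    The move step proposes x' = x + gamma*(x_a - x_b) + jitter using two
    other particles, accepted by a Metropolis test on the tempered target."""
    rng = random.Random(seed)
    def logpi(x):
        return math.log(0.5 * math.exp(-(x + 3) ** 2 / 2)
                        + 0.5 * math.exp(-(x - 3) ** 2 / 2) + 1e-300)
    def logp0(x):
        return -x * x / 50.0
    xs = [rng.gauss(0.0, 5.0) for _ in range(n)]
    db = 1.0 / levels
    for k in range(1, levels + 1):
        b = k * db
        logt = lambda x, b=b: (1.0 - b) * logp0(x) + b * logpi(x)
        # reweight by the incremental ratio pi_k / pi_{k-1}, then resample
        w = [math.exp(db * (logpi(x) - logp0(x))) for x in xs]
        idx = rng.choices(range(n), weights=w, k=n)
        xs = [xs[j] for j in idx]
        # DE-Metropolis move: particle differences set the proposal scale
        for i in range(n):
            a, c = rng.randrange(n), rng.randrange(n)
            prop = xs[i] + 0.8 * (xs[a] - xs[c]) + rng.gauss(0.0, 0.1)
            if math.log(rng.random() + 1e-300) < logt(prop) - logt(xs[i]):
                xs[i] = prop
    return xs
```

Because particle differences span both modes, the DE proposal naturally generates mode-jumping moves, which is what a plain small-step random walk inside SMC tends to lack.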
Fast Proton Titration Scheme for Multiscale Modeling of Protein Solutions.
Teixeira, Andre Azevedo Reis; Lund, Mikael; da Silva, Fernando Luís Barroso
2010-10-12
Proton exchange between titratable amino acid residues and the surrounding solution gives rise to exciting electric processes in proteins. We present a proton titration scheme for studying acid-base equilibria in Metropolis Monte Carlo simulations where salt is treated at the Debye-Hückel level. The method, rooted in the Kirkwood model of impenetrable spheres, is applied on the three milk proteins α-lactalbumin, β-lactoglobulin, and lactoferrin, for which we investigate the net-charge, molecular dipole moment, and charge capacitance. Over a wide range of pH and salt conditions, excellent agreement is found with more elaborate simulations where salt is explicitly included. The implicit salt scheme is orders of magnitude faster than the explicit analog and allows for transparent interpretation of physical mechanisms. It is shown how the method can be expanded to multiscale modeling of aqueous salt solutions of many biomolecules with nonstatic charge distributions. Important examples are protein-protein aggregation, protein-polyelectrolyte complexation, and protein-membrane association.
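The core titration move — flipping the protonation state of a randomly chosen site with a Metropolis test — can be shown in a minimal constant-pH sketch. Site-site (Debye-Hückel) electrostatics are deliberately set to zero here, so each site should reproduce the ideal Henderson-Hasselbalch curve; this is a sanity-check toy, not the paper's full scheme, and the pKa values are arbitrary.

```python
import math
import random

def titrate(pka_list=(4.0, 7.0), ph=4.0, n_steps=40000, seed=7):
    """Constant-pH Metropolis titration with zero interaction energy.
    Flipping site s changes the free energy by ln(10)*(pKa_s - pH) for
    deprotonation (and the opposite sign for protonation); a real model
    would add the electrostatic energy change to dg."""
    rng = random.Random(seed)
    prot = [True] * len(pka_list)            # start fully protonated
    counts = [0] * len(pka_list)
    ln10 = math.log(10.0)
    for _ in range(n_steps):
        s = rng.randrange(len(pka_list))
        dg = ln10 * ((pka_list[s] - ph) if prot[s] else (ph - pka_list[s]))
        if math.log(rng.random() + 1e-300) < -dg:
            prot[s] = not prot[s]
        for j, p in enumerate(prot):         # accumulate protonated fraction
            counts[j] += 1 if p else 0
    return [c / n_steps for c in counts]
```

At pH = pKa the protonated fraction converges to 1/2, and a site with pKa three units above the pH stays essentially fully protonated, matching 1/(1 + 10^(pH - pKa)).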
Salehi Shahrabi, Narges; Pourezzat, Aliasghar; Mobaraki, Hossein; Mafimoradi, Shiva
2013-01-01
Background: The centralization of human activities is associated with different pollutants which enter the environment easily and make the urban environment more vulnerable. Given the importance of the air pollution issue for the Tehran metropolis, many plans and regulations have been developed; however, most of them have failed to reduce the pollution. The purpose of this study was to pathologically analyze air-pollution control plans and to offer effective solutions for the Tehran metropolis. Methods: A qualitative content analysis, together with semi-structured interviews with 14 practicing professionals, was used to 1) identify key sources of Tehran's air pollution, 2) recognize challenges to the effective performance of pertinent plans and 3) offer effective solutions. Results: The challenges related to air-pollution control plans can be divided into two major categories: lack of integrated and organized stewardship, and PEST challenges. Conclusion: To control the air pollution of Tehran effectively, various control alternatives were identified: systematization of the plan preparation process, standardization and utilization of new technologies and experts, infrastructural development, realization of social justice, developing coordination mechanisms, improving citizens' participatory capacity and focusing on effective management of fuel and energy. Controlling air pollution in Tehran needs the serious attention of policymakers, who should create enforcements through a systematic cycle of preparing comprehensive plans, implementing the enforcements and evaluating the environmental impact of the plans with the involvement of all stakeholders. PMID:26171340
Behavioural Problems of Juvenile Street Hawkers in Uyo Metropolis, Nigeria
ERIC Educational Resources Information Center
Udoh, Nsisong A.; Joseph, Eme U.
2012-01-01
The study sought the opinions of Faculty of Education Students of University of Uyo on the behavioural problems of juvenile street hawkers in Uyo metropolis. Five research hypotheses were formulated to guide the study. This cross-sectional survey employed multi-stage random sampling technique in selecting 200 regular undergraduate students in the…
Women in Educational Leadership within the Tamale Metropolis
ERIC Educational Resources Information Center
Segkulu, L.; Gyimah, K.
2016-01-01
Within the Tamale Metropolis, it is observed that only a few women occupy top level management positions within the Ghana Education Service (GES). A descriptive survey was therefore conducted in 2013/2014 academic year to assess the factors affecting the gender disparity in educational leadership within the Service. Specifically, the study sought…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-23
... NUCLEAR REGULATORY COMMISSION [Docket No. 40-3392; License No. SUB-526; EA-12-157; NRC-2012-0244] Confirmatory Order; In the Matter of Honeywell International Inc.; Metropolis, Illinois I. Honeywell International Inc. (Honeywell or Licensee) is the holder of Materials License No. SUB-526, issued by the U.S...
High-Dimensional Exploratory Item Factor Analysis by a Metropolis-Hastings Robbins-Monro Algorithm
ERIC Educational Resources Information Center
Cai, Li
2010-01-01
A Metropolis-Hastings Robbins-Monro (MH-RM) algorithm for high-dimensional maximum marginal likelihood exploratory item factor analysis is proposed. The sequence of estimates from the MH-RM algorithm converges with probability one to the maximum likelihood solution. Details on the computer implementation of this algorithm are provided. The…
Environmental Awareness and School Sanitation in Calabar Metropolis of Cross Rivers State, Nigeria
ERIC Educational Resources Information Center
Anijaobi-Idem, F. N.; Ukata, B. N.; Bisong, N. N
2015-01-01
This descriptive survey study explored the influence of environmental awareness on secondary school sanitation in Calabar Metropolis. One hypothesis was formulated to direct the investigation. Three hundred subjects, made up of 30 principals and 270 teachers, constituted the sample drawn from the population of principals and teachers in secondary…
ERIC Educational Resources Information Center
Musa, Alice K. J.; Nwachukwu, Kelechukwu I.; Ali, Domiya Geoffrey
2016-01-01
The study determined the relationship between students' expectancy beliefs and the English language performance of students in Maiduguri Metropolis, Borno State, Nigeria. A correlation design was adopted for the study. Four hypotheses determined the relationships between the components of expectancy beliefs: ability, task difficulty, and past…
Accelerated Dimension-Independent Adaptive Metropolis
Chen, Yuxin; Keyes, David E.; Law, Kody J.; ...
2016-10-27
This work describes improvements by algorithmic and architectural means to black-box Bayesian inference over high-dimensional parameter spaces. The well-known adaptive Metropolis (AM) algorithm [33] is extended herein to scale asymptotically uniformly with respect to the underlying parameter dimension for Gaussian targets, by respecting the variance of the target. The resulting algorithm, referred to as the dimension-independent adaptive Metropolis (DIAM) algorithm, also shows improved performance with respect to adaptive Metropolis on non-Gaussian targets. This algorithm is further improved, and the possibility of probing high-dimensional (with dimension d ≥ 1000) targets is enabled, via GPU-accelerated numerical libraries and periodically synchronized concurrent chains (justified a posteriori). Asymptotically in dimension, this GPU implementation exhibits a factor of four improvement versus a competitive CPU-based Intel MKL parallel version alone. Strong scaling to concurrent chains is exhibited, through a combination of longer time per sample batch (weak scaling) and yet fewer necessary samples to convergence. The algorithm performance is illustrated on several Gaussian and non-Gaussian target examples, in which the dimension may be in excess of one thousand.
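The underlying AM idea — let the proposal scale track the running sample covariance — is easy to show in one dimension. This sketch is only the basic Haario-style adaptive Metropolis with the classical 2.38 scaling; the dimension-independent variant described above adds further structure that is not reproduced here.

```python
import math
import random

def adaptive_metropolis(logp, x0=0.0, n=6000, adapt_start=200, seed=8):
    """1-D adaptive Metropolis sketch: the Gaussian proposal standard
    deviation tracks 2.38 * (running sample std), the classical optimal
    scaling for d = 1; adaptation is frozen for the first adapt_start steps."""
    rng = random.Random(seed)
    x, out = x0, []
    mean = m2 = 0.0
    scale = 1.0
    for i in range(n):
        if i >= adapt_start:
            scale = max(1e-3, 2.38 * math.sqrt(m2 / max(i - 1, 1)))
        xp = x + rng.gauss(0.0, scale)
        if math.log(rng.random() + 1e-300) < logp(xp) - logp(x):
            x = xp
        out.append(x)
        delta = x - mean                     # Welford running-variance update
        mean += delta / (i + 1)
        m2 += delta * (x - mean)
    return out

# usage: target N(10, 2^2) from a poor start; the proposal scale adapts
# toward ~2.38 * 2 without manual tuning
chain = adaptive_metropolis(lambda x: -(x - 10.0) ** 2 / 8.0)
```

The appeal is that the user never tunes the step size: the chain discovers the target's scale on the fly, which is the property the DIAM extension pushes to high dimension.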
Rapid Non-Gaussian Uncertainty Quantification of Seismic Velocity Models and Images
NASA Astrophysics Data System (ADS)
Ely, G.; Malcolm, A. E.; Poliannikov, O. V.
2017-12-01
Conventional seismic imaging typically provides a single estimate of the subsurface without any error bounds. Noise in the observed raw traces, as well as the uncertainty of the velocity model, directly impacts the uncertainty of the final seismic image and its resulting interpretation. We present a Bayesian inference framework to quantify uncertainty in both the velocity model and seismic images, given noise statistics of the observed data. To estimate velocity model uncertainty, we combine the field expansion method, a fast frequency-domain wave equation solver, with the adaptive Metropolis-Hastings algorithm. The speed of the field expansion method and its reduced parameterization allow us to perform the tens or hundreds of thousands of forward solves needed for non-parametric posterior estimation. We then migrate the observed data with the distribution of velocity models to generate uncertainty estimates of the resulting subsurface image. This procedure allows us both to create qualitative descriptions of seismic image uncertainty and to put error bounds on quantities of interest, such as the dip angle of a subduction slab or the thickness of a stratigraphic layer.
Geometrically Constructed Markov Chain Monte Carlo Study of Quantum Spin-phonon Complex Systems
NASA Astrophysics Data System (ADS)
Suwa, Hidemaro
2013-03-01
We have developed novel Monte Carlo methods for precisely calculating quantum spin-boson models and investigated the critical phenomena of the spin-Peierls systems. Three significant methods are presented. The first is a new optimization algorithm of the Markov chain transition kernel based on the geometric weight allocation. This algorithm, for the first time, satisfies the total balance generally without imposing the detailed balance and always minimizes the average rejection rate, being better than the Metropolis algorithm. The second is the extension of the worm (directed-loop) algorithm to non-conserved particles, which cannot be treated efficiently by the conventional methods. The third is the combination with the level spectroscopy. Proposing a new gap estimator, we are successful in eliminating the systematic error of the conventional moment method. Then we have elucidated the phase diagram and the universality class of the one-dimensional XXZ spin-Peierls system. The criticality is totally consistent with the J1 -J2 model, an effective model in the antiadiabatic limit. Through this research, we have succeeded in investigating the critical phenomena of the effectively frustrated quantum spin system by the quantum Monte Carlo method without the negative sign. JSPS Postdoctoral Fellow for Research Abroad
ERIC Educational Resources Information Center
Adam, Abdul-Kahar
2015-01-01
This project employed an empirical method, designing and administering a questionnaire to tap the perceptions and knowledge of the target elements of this study. The research frame covered Ghana Education Service office workers within the Accra Metropolis, including higher education institutions. A qualitative data…
Link, William A; Barker, Richard J
2005-03-01
We present a hierarchical extension of the Cormack-Jolly-Seber (CJS) model for open population capture-recapture data. In addition to recaptures of marked animals, we model first captures of animals and losses on capture. The parameter set includes capture probabilities, survival rates, and birth rates. The survival rates and birth rates are treated as a random sample from a bivariate distribution, thus the model explicitly incorporates correlation in these demographic rates. A key feature of the model is that the likelihood function, which includes a CJS model factor, is expressed entirely in terms of identifiable parameters; losses on capture can be factored out of the model. Since the computational complexity of classical likelihood methods is prohibitive, we use Markov chain Monte Carlo in a Bayesian analysis. We describe an efficient candidate-generation scheme for Metropolis-Hastings sampling of CJS models and extensions. The procedure is illustrated using mark-recapture data for the moth Gonodontis bidentata.
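The abstract's "candidate-generation scheme for Metropolis-Hastings sampling" is not spelled out here, but the basic pattern of sampling a capture-recapture probability on the logit scale can be sketched with a toy binomial likelihood (the 30/100 counts and the flat logit-scale prior are invented for illustration; the real CJS likelihood is far richer):

```python
import math
import random

def sample_survival(recaptured, released, n=10000, step=0.8, seed=5):
    # Metropolis-Hastings on the logit scale for a survival probability phi.
    # Proposing on the unconstrained logit scale keeps candidates inside
    # (0, 1) automatically -- a common candidate-generation trick for
    # probability parameters.
    random.seed(seed)
    def log_post(logit_phi):
        phi = 1.0 / (1.0 + math.exp(-logit_phi))
        return (recaptured * math.log(phi)
                + (released - recaptured) * math.log(1.0 - phi))
    x, lp = 0.0, log_post(0.0)
    phis = []
    for _ in range(n):
        prop = x + random.gauss(0.0, step)
        lp_prop = log_post(prop)
        if math.log(random.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        phis.append(1.0 / (1.0 + math.exp(-x)))
    return phis
```

With 30 recaptures out of 100 releases the sampled survival rates concentrate around 0.3, matching the Beta(30, 70) posterior implied by this toy setup.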
Nanothermodynamics Applied to Thermal Processes in Heterogeneous Materials
2012-08-03
models agree favorably with a wide range of measurements of local thermal and dynamic properties. Progress in understanding basic thermodynamic... Monte Carlo (MC) simulations of the Ising model. The solid black lines in Fig. 4 show results using the uncorrected (Metropolis) algorithm on the... parameter g=0.5 (green, dash-dot), g=1 (black, solid), and g=2 (blue, dash-dot-dot). Note the failure of the standard Ising model (g=0) to match
Race, Schools and Opportunity Hoarding: Evidence from a Post-War American Metropolis
ERIC Educational Resources Information Center
Rury, John L.; Rife, Aaron Tyler
2018-01-01
Opportunity hoarding is a sociological concept first introduced by Charles Tilly. This article explores its utility for historians by examining efforts to exclude different groups of people in a major American metropolis during the 1960s and seventies. This was a period of significant social change, as the racial composition of big city schools…
Employee Motivation on the Organisational Growth of Printing Industry in the Kumasi Metropolis
ERIC Educational Resources Information Center
Enninful, Ebenezer Kofi; Boakye-Amponsah, Abraham; Osei-Poku, Patrick
2015-01-01
The printing industry is supposed to be a major contributor to Ghana's development through employment creation and the enhancement of information to the general public. The main purpose of the study was to assess employee motivation on the printing industry within Kumasi Metropolis. The study employed both the quantitative and qualitative surveys…
ERIC Educational Resources Information Center
Roth, Lane
Fritz Lang's "Metropolis" (1927) is a seminal film because of its concern, now generic, with the profound impact technological progress has on mankind's social and spiritual progress. As in many later science fiction films, the ascendancy of artifact over nature is depicted not as liberating human beings, but as subjecting and corrupting…
Anisotropic dielectric properties of two-dimensional matrix in pseudo-spin ferroelectric system
NASA Astrophysics Data System (ADS)
Kim, Se-Hun
2016-10-01
The anisotropic dielectric properties of a two-dimensional (2D) ferroelectric system were studied using statistical calculations of a pseudo-spin Ising Hamiltonian model. In Monte Carlo sampling, measurements of an observable must be delayed until thermal equilibrium is reached, and successive measurements must be spaced far enough apart that each new spin configuration is statistically independent; the equilibration behavior depends on the temperature and the size of the system. The autocorrelation time constants of the normalized relaxation function were determined taking temperature and 2D lattice size into account. We discuss the dielectric constants of a two-dimensional ferroelectric system obtained using the Metropolis method in view of the Slater-Takagi defect energies.
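The equilibration-and-spacing discipline described above can be illustrated with a minimal single-spin-flip Metropolis simulation of the plain 2D Ising model (the lattice size, temperature, burn-in, and sampling interval below are illustrative, not the paper's values):

```python
import math
import random

def ising_metropolis(L=8, T=1.5, sweeps=600, burn=200, spacing=10, seed=0):
    # Metropolis sampling of a 2D Ising model on an L x L periodic lattice.
    # Magnetization is recorded only every `spacing` sweeps after `burn`
    # equilibration sweeps, so that successive measurements are roughly
    # independent -- the point the abstract makes about autocorrelation.
    random.seed(seed)
    s = [[1] * L for _ in range(L)]
    mags = []
    for t in range(sweeps):
        for _ in range(L * L):               # one sweep = L*L attempted flips
            i, j = random.randrange(L), random.randrange(L)
            nb = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
                  + s[i][(j + 1) % L] + s[i][(j - 1) % L])
            dE = 2 * s[i][j] * nb            # energy change of flipping spin
            if dE <= 0 or random.random() < math.exp(-dE / T):
                s[i][j] *= -1                # Metropolis acceptance
        if t >= burn and (t - burn) % spacing == 0:
            mags.append(abs(sum(map(sum, s))) / (L * L))
    return mags
```

Below the critical temperature (T < Tc ≈ 2.27) the recorded magnetizations stay close to 1, confirming the sampler has equilibrated in the ordered phase.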
NASA Astrophysics Data System (ADS)
Smith, T.; Marshall, L.
2007-12-01
In many mountainous regions, the single most important parameter in forecasting the controls on regional water resources is snowpack (Williams et al., 1999). In an effort to bridge the gap between theoretical understanding and functional modeling of snow-driven watersheds, a flexible hydrologic modeling framework is being developed. The aim is to create a suite of models that move from parsimonious structures, concentrated on aggregated watershed response, to those focused on representing finer scale processes and distributed response. This framework will operate as a tool to investigate the link between hydrologic model predictive performance, uncertainty, model complexity, and observable hydrologic processes. Bayesian methods, and particularly Markov chain Monte Carlo (MCMC) techniques, are extremely useful in uncertainty assessment and parameter estimation of hydrologic models. However, these methods have some difficulties in implementation. In a traditional Bayesian setting, it can be difficult to reconcile multiple data types, particularly those offering different spatial and temporal coverage, depending on the model type. These difficulties are exacerbated by the sensitivity of MCMC algorithms to model initialization and by complex parameter interdependencies. As a way of circumventing some of the computational complications, adaptive MCMC algorithms have been developed to take advantage of the information gained from each successive iteration. Two adaptive algorithms are compared in this study: the Adaptive Metropolis (AM) algorithm, developed by Haario et al. (2001), and the Delayed Rejection Adaptive Metropolis (DRAM) algorithm, developed by Haario et al. (2006). While neither algorithm is truly Markovian, it has been proven that each satisfies the desired ergodicity and stationarity properties of Markov chains.
Both algorithms were implemented as the uncertainty and parameter estimation framework for a conceptual rainfall-runoff model based on the Probability Distributed Model (PDM), developed by Moore (1985). We implement the modeling framework in Stringer Creek watershed in the Tenderfoot Creek Experimental Forest (TCEF), Montana. The snowmelt-driven watershed offers the additional challenge of modeling snow accumulation and melt, and current efforts are aimed at developing a temperature- and radiation-index snowmelt model. Auxiliary data available from within TCEF's watersheds are used to support understanding of the value of information as it relates to predictive performance. Because the model is based on lumped parameters, auxiliary data are hard to incorporate directly. However, these additional data offer benefits through the ability to inform prior distributions of the lumped model parameters. By incorporating data offering different information into the uncertainty assessment process, a cross-validation technique is employed to better ensure that modeled results reflect real process complexity.
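The AM idea the study builds on can be sketched in one dimension: after an initial non-adaptive phase, the Gaussian proposal variance is tuned to the empirical variance of the chain so far, scaled by the 2.38² factor from Haario et al. (2001). This is a minimal sketch under those assumptions; DRAM's delayed-rejection stage and the full covariance bookkeeping of the multivariate case are omitted:

```python
import math
import random

def adaptive_metropolis(log_post, x0=0.0, n=20000, t0=500, eps=1e-6, seed=2):
    # 1-D Adaptive Metropolis: after t0 steps, proposal variance becomes
    # (2.38^2) * var(chain so far) + eps, maintained with a Welford-style
    # running update of the chain's mean and variance.
    random.seed(seed)
    x, lp = x0, log_post(x0)
    mean, m2, count = x, 0.0, 1
    chain = []
    for t in range(n):
        var = ((2.38 ** 2) * (m2 / max(count - 1, 1)) + eps
               if t >= t0 else 1.0)
        prop = x + random.gauss(0.0, math.sqrt(var))
        lp_prop = log_post(prop)
        if math.log(random.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain.append(x)
        count += 1                 # running-moment (Welford) update
        d = x - mean
        mean += d / count
        m2 += d * (x - mean)
    return chain
```

Because the adapted proposal depends on the whole history, the chain is no longer Markovian, which is exactly why the ergodicity results cited in the abstract are needed.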
ERIC Educational Resources Information Center
Oyefara, John Lekan
2005-01-01
This article examines the sexual behaviour and the HIV/AIDS knowledge and vulnerability of female street hawkers in Lagos metropolis, Nigeria. A total of 126 female street hawkers under 18 were sampled in a cross-sectional survey and six Focus Group Discussions (FGDs) were conducted to generate data from respondents. Data on sexual behaviour…
Analysis of Errors Committed by Physics Students in Secondary Schools in Ilorin Metropolis, Nigeria
ERIC Educational Resources Information Center
Omosewo, Esther Ore; Akanbi, Abdulrasaq Oladimeji
2013-01-01
The study attempted to find out the types of errors committed, and the influence of gender on the types of errors committed, by senior secondary school physics students in Ilorin metropolis. Six (6) schools were purposively chosen for the study. One hundred and fifty-five students' scripts were randomly sampled for the study. Joint Mock physics essay questions…
ERIC Educational Resources Information Center
Ogidi, Reuben C.; Udechukwu, Jonathan O.
2017-01-01
The study sought to investigate the perception of stakeholders on teachers' assessment effectiveness in secondary schools in Port Harcourt Metropolis in Rivers State. Three research questions and one hypothesis were formulated to guide the study. The study adopted a survey research design. The sample of the study consisted of 20 principals, 30 vice…
ERIC Educational Resources Information Center
Musa, Alice K. J.; Meshak, Bibi; Sagir, Jummai Ibrahim
2016-01-01
The purpose of the study was to determine adolescents' perceptions of the psychological security of their school environments and its relationship with their emotional development and academic performance in secondary schools in Gombe Metropolis. A sample of 239 (107 males and 133 females) secondary school students selected via stratified…
A global food demand model for the assessment of complex human-earth systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
EDMONDS, JAMES A.; LINK, ROBERT; WALDHOFF, STEPHANIE T.
Demand for agricultural products is an important problem in climate change economics. Food consumption will shape, and be shaped by, climate change and emissions mitigation policies through interactions with bioenergy and afforestation, two critical issues in meeting international climate goals such as the two-degree target. We develop a model of food demand for staple and nonstaple commodities that evolves with changing incomes and prices. The model addresses a long-standing issue in estimating food demands: the evolution of demand relationships across large changes in income and prices. We discuss the model and some of its properties and limitations. We estimate parameter values using pooled cross-sectional time-series observations and the Metropolis Monte Carlo method, and cross-validate the model by estimating parameters using a subset of the observations and testing its ability to project onto the unused observations. Finally, we apply bias correction techniques borrowed from the climate-modeling community and report results.
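The estimate-on-a-subset, project-onto-the-rest strategy can be sketched with a hypothetical log-linear demand curve fitted by Metropolis Monte Carlo (the functional form y = a + b·ln(x) and the even/odd train-test split are invented for illustration and are not the paper's staple/nonstaple demand system):

```python
import math
import random

def fit_and_validate(xs, ys, n=8000, seed=6):
    # Fit demand parameters (a, b) by random-walk Metropolis on the training
    # subset, keep the best-likelihood point, then score held-out RMSE.
    random.seed(seed)
    train = list(range(0, len(xs), 2))   # even indices: estimation
    test = list(range(1, len(xs), 2))    # odd indices: validation

    def log_lik(a, b, idx):
        return -0.5 * sum((ys[i] - (a + b * math.log(xs[i]))) ** 2
                          for i in idx)

    a, b = 0.0, 0.0
    lp = log_lik(a, b, train)
    best_a, best_b, best_lp = a, b, lp
    for _ in range(n):
        a2, b2 = a + random.gauss(0, 0.1), b + random.gauss(0, 0.1)
        lp2 = log_lik(a2, b2, train)
        if math.log(random.random()) < lp2 - lp:   # Metropolis accept
            a, b, lp = a2, b2, lp2
            if lp > best_lp:
                best_a, best_b, best_lp = a, b, lp
    rmse = math.sqrt(sum((ys[i] - (best_a + best_b * math.log(xs[i]))) ** 2
                         for i in test) / len(test))
    return best_a, best_b, rmse
```

A small held-out RMSE is the cross-validation signal the abstract describes: the parameters estimated on part of the data project acceptably onto the unused observations.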
[Study on vitamin A nutritional status of Chinese urban elderly residents in 2010-2012].
Chen, J; Hu, Y C; Yang, C; Yun, C F; Wang, R; Mao, D Q; Li, W D; Yang, Y H; Yang, X G; Yang, L C
2017-02-06
Objective: To assess the vitamin A nutritional status of the Chinese urban elderly population by analyzing serum retinol levels in 2010-2012. Methods: Data were collected from the Chinese National Nutrition and Health Survey in 2010-2012. Using the multi-stage stratified cluster sampling method, serum samples from elderly residents aged ≥60 years were obtained from 34 metropolises and 41 middle-sized and small cities. Demographic data were collected using a questionnaire survey. The serum retinol concentration was determined by high-performance liquid chromatography. Vitamin A deficiency (VAD) was determined using the World Health Organization guidelines. A total of 3 200 elderly residents were included in the study. The serum retinol levels and prevalence of VAD and marginal VAD were also compared. Results: The serum retinol concentration (P50 (P25-P75)) of Chinese urban elderly residents was 1.83 (1.37-2.39) μmol/L. Compared with middle-sized and small cities (1.91 (1.47-2.48) μmol/L), the retinol level of senior citizens in metropolises (1.70 (1.25-2.25) μmol/L) was significantly lower (P < 0.001). The serum retinol level of elderly males (1.89 (1.37-2.47) μmol/L) was significantly higher than that of females (1.80 (1.36-2.28) μmol/L) (P = 0.001). The serum retinol concentration was 1.87 (1.42-2.43), 1.78 (1.32-2.33), and 1.71 (1.24-2.24) μmol/L for 60-69, 70-79, and ≥80 year-olds, respectively. The retinol level in elderly people ≥70 years old was significantly lower than that of 60-69 year-olds (P < 0.001). The overall prevalence of VAD among Chinese urban elderly residents was 4.22% (135/3 200): 6.00% (81/1 350) for metropolis residents and 2.92% (54/1 850) for middle-sized and small city residents. The overall marginal VAD rate of Chinese urban elderly residents was 8.19% (262/3 200): 10.51% (142/1 350) for metropolis residents and 6.49% (120/1 850) for middle-sized and small city residents.
The prevalence of VAD and marginal VAD for males was 3.87% (61/1 577) and 8.24% (130/1 577), respectively (P < 0.05). The prevalence of VAD by age group was 3.65% (72/1 975), 4.96% (50/1 008), and 5.99% (13/217), respectively (P = 0.097). The prevalence of marginal VAD by age group was 6.99% (138/1 975), 9.82% (99/1 008), and 11.52% (25/217), respectively (P = 0.05). Conclusion: Chinese urban elderly residents showed various levels of VAD, and marginal VAD was quite common. As VAD was more common in metropolis residents and older residents, specific strategies should target these populations.
Lin, Cheng-Horng
2016-12-23
There are more than 7 million people living near the Tatun volcano group in northern Taiwan. For the safety of the Taipei metropolis, in particular, it has been debated for decades whether or not these volcanoes are active. Here I show evidence of a deep magma reservoir beneath the Taipei metropolis from both S-wave shadows and P-wave delays. The reservoir is probably composed of either a thin magma layer overlay or many molten sills within thick partially molten rocks. Assuming that 40% of the reservoir is partially molten, its total volume could be approximately 350 km3. The exact location and geometry of the magma reservoir will be obtained after dense seismic arrays are deployed in 2017-2020.
Wheelchair accessibility to public buildings in the Kumasi metropolis, Ghana
Ashigbi, Evans Y.K.
2017-01-01
Background: Accessibility implies making public places accessible to every individual, irrespective of his or her disability or special need, ensuring the integration of the wheelchair user into society and thereby granting them the capability of participating in activities of daily living and ensuring equality in daily life. Objective: This study was carried out to assess the accessibility to wheelchairs of the physical infrastructure (public buildings) in the Kumasi metropolis after the passage of the Ghanaian Disability Law (Act 716, 2006). Methods: Eighty-four public buildings housing education facilities, health facilities, ministries, departments and agencies, sports and recreation, religious groups and banks were assessed. The routes, entrances, height of steps, grade of ramps, sinks, entrances to washrooms, toilets, urinals, automated teller machines and tellers' counters were measured and computed. Results: Out of the 84 buildings assessed, only 34 (40.5%) of the buildings, 52.3% of the entrances and 87.4% of the routes of the buildings were accessible to wheelchair users. A total of 25% (13 out of 52) of the public buildings with more than one floor were fitted with elevators to connect the different floor levels. Conclusion: The results of this study show that public buildings in the Kumasi metropolis are not wheelchair accessible. An important observation made during this study was that there is an intention to improve accessibility when buildings are being constructed or renovated, but there are no laid-down guidelines as to how to make buildings accessible for wheelchair users. PMID:29062761
Obembe, Taiwo A.; Osungbade, Kayode O.; Ademokun, Oluwakemi M.
2016-01-01
Background: The awareness, knowledge, and involvement of teachers in the implementation of the School Health Programme (SHP) in secondary schools are essential in ensuring the effectiveness and overall success of the School Health Policy. This study assessed the awareness and knowledge of teachers on SHP in Ibadan metropolis. Methods: A descriptive cross-sectional study was carried out using a two-stage sampling technique to select 426 secondary school teachers across all five Urban Local Government Areas (LGAs) in Ibadan metropolis by balloting. Pretested semi-structured questionnaires were used to collect data from the 426 teachers. Quantitative data were analyzed using descriptive statistics, Chi-square tests, and logistic regression at a 5% level of significance. Results: About one-third of the respondents had heard of the National School Health Policy (NSHP); however, few had seen the document. About half of the respondents were aware of the SHP in their schools. Many of the respondents had a good knowledge of SHP. Age and level of education of participants significantly influenced knowledge of SHP. Age above 50 years and a postgraduate qualification were significant predictors of good knowledge of SHP. Conclusions: Awareness of the NSHP was low despite the good knowledge of SHP. This could be due to the tertiary education that most of the respondents had. Concerted efforts of stakeholders are required to intensify health education awareness campaigns to improve teachers' knowledge of the NSHP. PMID:27630385
A sampling algorithm for segregation analysis
Tier, Bruce; Henshall, John
2001-01-01
Methods for detecting Quantitative Trait Loci (QTL) without markers have generally used iterative peeling algorithms for determining genotype probabilities. These algorithms have considerable shortcomings in complex pedigrees. A Markov chain Monte Carlo (MCMC) method which samples the pedigree of the whole population jointly is described. Simultaneous sampling of the pedigree was achieved by sampling descent graphs using the Metropolis-Hastings algorithm. A descent graph describes the inheritance state of each allele and provides pedigrees guaranteed to be consistent with Mendelian sampling. Sampling descent graphs overcomes most, if not all, of the limitations incurred by iterative peeling algorithms. The algorithm was able to find the QTL in most of the simulated populations. However, when the QTL was not modeled or found, its effect was ascribed to the polygenic component. No QTL were detected when they were not simulated. PMID:11742631
ERIC Educational Resources Information Center
Shelina, S. L.; Mitina, O. V.
2015-01-01
The article presents the results of an analysis of the moral value judgments of adults (parents, teachers, educators) that directly concern the socialization process of the young generation in the modern metropolis. This paper follows the model study by Jean Piaget that investigated the moral value judgments of children. A comparative analysis of…
ERIC Educational Resources Information Center
Martin-Fernandez, Manuel; Revuelta, Javier
2017-01-01
This study compares the performance of two recently introduced estimation algorithms, the Metropolis-Hastings Robbins-Monro (MHRM) and the Hamiltonian MCMC (HMC), with two algorithms well established in the psychometric literature, marginal maximum likelihood via the EM algorithm (MML-EM) and Markov chain Monte Carlo (MCMC), in the estimation of multidimensional…
Hybrid lattice gas simulations of flow through porous media
NASA Astrophysics Data System (ADS)
Becklehimer, Jeffrey Lynn
1997-10-01
This study introduces a suite of models designed to investigate transport phenomena in simulated porous media such as rigid or quenched sediment and clay-like deformable environments. This is achieved by using a variety of techniques borrowed from the field of statistical physics, including percolation, lattice gas, and cellular automata. A percolation-based model is used to study a porous medium by using rods and chains of various shapes and sizes to model the porous media formed by sediments. This is further extended to model clay-like deformable media by interacting heavy sediment particles. An interacting lattice gas computer simulation model based on the Metropolis algorithm is used to study the transport properties of fluid particles and the permeability of a porous sediment. Finally, a hybrid lattice gas model is introduced by combining the Metropolis Monte Carlo method with a direct simulation involving collision rules as in cellular automata. This model is then used to study shock propagation in a fluid-filled porous medium, and the study is extended to shock propagation in a fluid-filled elastic porous medium. Several interesting new results were obtained. For rigid chain percolation, the percolation threshold depends on the chain length as pc ~ Lc^(-1/2) and the jamming coverage decreases with the chain length as Lc^(-1/3). For the random SAW-like chains the percolation threshold decays with the chain length as Lc^(-0.01) and the jamming coverage as Lc^(-1/3). The fluid flow model shows that permeability depends nonmonotonically on the concentration of the fluid. For some fluids at a fixed porosity, the permeability increases with increasing bias up to a certain value Bc, above which it decreases. It was also found that a shock propagates in a drift-like fashion in a rigid porous medium when the porosity is high; low porosity damps out the shock front very quickly.
For a shock propagating in a clay-like porous medium, an unusual super-fast power-law behavior is observed for the RMS displacements of the fluid and clay particles.
NASA Astrophysics Data System (ADS)
García, M. F.; Restrepo-Parra, E.; Riaño-Rojas, J. C.
2015-05-01
This work develops a model that mimics the growth of diatomic, polycrystalline thin films by artificially splitting the growth into deposition and relaxation processes comprising two stages: (1) a grain-based stochastic method (with grain orientations chosen randomly) is considered, and the deposition is simulated by means of the kinetic Monte Carlo method in a non-standard version known as constant time stepping; the adsorption of adatoms is accepted or rejected depending on the neighborhood conditions, and the desorption process is not included in the simulation; and (2) the Monte Carlo method combined with the Metropolis algorithm is used to simulate the diffusion. The model was developed by accounting for parameters that determine the morphology of the film, such as the growth temperature, the interacting atomic species, the binding energy and the material crystal structure. The modeled samples exhibited an FCC structure with grains oriented in the <111>, <200> and <220> family planes. The grain size and film roughness were analyzed. By construction, the grain size decreased, and the roughness increased, as the growth temperature increased. Although deposition and relaxation occur simultaneously during the growth of real materials, this method may nevertheless be valid for building realistic polycrystalline samples.
A comparative study of noise pollution levels in some selected areas in Ilorin Metropolis, Nigeria.
Oyedepo, Olayinka S; Saadu, Abdullahi A
2009-11-01
Noise pollution is a major problem for the quality of life in urban areas. This study was conducted to compare noise pollution levels at busy roads/road junctions, passenger loading parks, and commercial, industrial and residential areas in Ilorin metropolis. A total of 47 locations were selected within the metropolis. Statistical analysis shows a significant difference (P < 0.05) in noise pollution levels between industrial areas and low-density residential areas, industrial areas and high-density areas, industrial areas and passenger loading parks, industrial areas and commercial areas, busy roads/road junctions and low-density areas, passenger loading parks and commercial areas, and commercial areas and low-density areas. There is no significant difference (P > 0.05) in noise pollution levels between industrial areas and busy roads/road junctions, busy roads/road junctions and high-density areas, busy roads/road junctions and passenger loading parks, busy roads/road junctions and commercial areas, passenger loading parks and high-density areas, passenger loading parks and commercial areas, and commercial areas and high-density areas. The results show that industrial areas have the highest noise pollution levels (110.2 dB(A)), followed by busy roads/road junctions (91.5 dB(A)), passenger loading parks (87.8 dB(A)) and commercial areas (84.4 dB(A)). The noise pollution levels in Ilorin metropolis exceeded the level recommended by the WHO at 34 of 47 measuring points. It can be concluded that the city is environmentally noise polluted and that road traffic and industrial machinery are the major sources. Enforcing noise emission standards and technical control measures, planning, and promoting citizens' awareness of the high noise risk may help to relieve the noise problem in the metropolis.
ERIC Educational Resources Information Center
Iji, C. O.; Ogbole, P. O.; Uka, N. K.
2014-01-01
Among all approaches aimed at reducing poor mathematics achievement among the students, adoption of appropriate methods of teaching appears to be more rewarding. In this study, improvised instructional materials were used to ascertain students' geometry achievement at the upper basic education one. Two research questions were asked with associated…
Strategic guidelines of a megalopolis’s development: new industrialization and ecological tension
NASA Astrophysics Data System (ADS)
Lavrikova, Yulia; Akberdina, Victoria; Mezentseva, Elena
2017-06-01
The article is devoted to integrating environmental concerns into the development strategy of a metropolis. The authors substantiate the relationship between new industrialization and a reduced burden on the environment, taking a large Russian city, Ekaterinburg, as an example, and present projections and strategic directions for the city's ecological development through 2035. The forecast was based on methods of aggregating economic sectors, functional relationships, and forecasting algorithms. The article describes three development scenarios for Ekaterinburg and the results of calculations by the authors' method. The authors show the relationship between the industrial development of the metropolis and the anthropogenic load, which is assessed using indicators such as emissions of harmful substances into the atmosphere, discharges of sewage, and green space per inhabitant. They note that residents' ecological security is directly related to improving the controllability of the municipal economy, increasing environmental oversight, reducing the environmental burden on humans and the environment, and zoning the city for differential application of environmental quality indicators.
Weare, Jonathan; Dinner, Aaron R.; Roux, Benoît
2016-01-01
A multiple time-step integrator based on a dual Hamiltonian and a hybrid method combining molecular dynamics (MD) and Monte Carlo (MC) is proposed to sample systems in the canonical ensemble. The Dual Hamiltonian Multiple Time-Step (DHMTS) algorithm is based on two similar Hamiltonians: a computationally expensive one that serves as a reference and a computationally inexpensive one to which the workload is shifted. The central assumption is that the difference between the two Hamiltonians is slowly varying. Earlier work has shown that such dual Hamiltonian multiple time-step schemes effectively precondition nonlinear differential equations for dynamics by reformulating them into a recursive root finding problem that can be solved by propagating a correction term through an internal loop, analogous to RESPA. Of special interest in the present context, a hybrid MD-MC version of the DHMTS algorithm is introduced to enforce detailed balance via a Metropolis acceptance criterion and ensure consistency with the Boltzmann distribution. The Metropolis criterion suppresses the discretization errors normally associated with the propagation according to the computationally inexpensive Hamiltonian, treating the discretization error as an external work. Illustrative tests are carried out to demonstrate the effectiveness of the method. PMID:26918826
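The role of the Metropolis criterion here, suppressing the integrator's discretization error, can be shown with a stripped-down hybrid MD-MC sketch: a leapfrog trajectory proposes the move and an energy-based accept/reject step restores exactness. This collapses the dual-Hamiltonian machinery to a single cheap integrator and uses a toy harmonic Hamiltonian H = p²/2 + x²/2, so it is an illustration of the principle, not the DHMTS algorithm itself:

```python
import math
import random

def hybrid_mc(n=5000, dt=0.3, steps=10, seed=3):
    # Hybrid MD-MC sampling of exp(-U(x)) with U(x) = x^2/2.
    # Momentum is resampled each cycle; leapfrog dynamics generate the
    # proposal; a Metropolis test on the total energy change treats the
    # discretization error as external work and rejects it.
    random.seed(seed)
    grad = lambda x: x                           # dU/dx
    H = lambda x, p: 0.5 * p * p + 0.5 * x * x   # total energy
    x, samples = 0.0, []
    for _ in range(n):
        p = random.gauss(0.0, 1.0)               # fresh canonical momentum
        x_new, p_new = x, p
        p_new -= 0.5 * dt * grad(x_new)          # leapfrog: half kick
        for _ in range(steps - 1):
            x_new += dt * p_new                  # drift
            p_new -= dt * grad(x_new)            # full kick
        x_new += dt * p_new
        p_new -= 0.5 * dt * grad(x_new)          # final half kick
        if math.log(random.random()) < H(x, p) - H(x_new, p_new):
            x = x_new                            # Metropolis acceptance
        samples.append(x)
    return samples
```

Even with a coarse step size the accepted samples reproduce the Boltzmann distribution (here a standard normal), because the Metropolis test filters out the integrator's energy drift.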
Assessment of parametric uncertainty for groundwater reactive transport modeling
Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun
2014-01-01
The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, predictive performance of formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the Differential Evolution Adaptive Metropolis (DREAM(zs)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(zs)-based global sensitivity analysis yield almost identical ranking of parameter importance.
The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
Who Cares? Pre and Post Abortion Experiences among Young Females in Cape Coast Metropolis, Ghana.
Esia-Donkoh, Kobina; Darteh, Eugene K M; Blemano, Harriet; Asare, Hagar
2015-06-01
Issues of abortion are critical in Ghana largely due to their consequences for sexual and reproductive health. The negative perception society attaches to abortion makes it difficult for young females to access services and share their experiences. This paper examines the pre- and post-abortion experiences of young females, a subject scarcely researched in the country. Twenty-one clients of the Planned Parenthood Association of Ghana (PPAG) clinic at Cape Coast were interviewed. Guided by the biopsychosocial model, the study revealed that fear of societal stigma, shame, and rejection by partners, as well as self-imposed stigma, constituted some of the pre- and post-abortion experiences of the respondents. Other experiences reported were bleeding, severe abdominal pain and psychological pain. The Ghana Health Service (GHS) and other service providers should partner with the PPAG clinic to integrate psychosocial treatment into its abortion services while intensifying behaviour change communication and community-based stigma-reduction education in the Metropolis.
ERIC Educational Resources Information Center
Mccrea, Rod; Stimson, Robert; Western, John
2005-01-01
Using survey data collected from households living in the Brisbane-South East Queensland region, a rapidly growing metropolis in Australia, path analysis is used to test links between urban residents' assessment of various urban attributes and their level of satisfaction in three urban domains--housing, neighbourhood or local area, and the wider…
[Investigation on emission properties of biogenic VOCs of landscape plants in Shenzhen].
Huang, Ai-Kui; Li, Nan; Guenther, Alex; Greenberg, Jim; Baker, Brad; Graessli, Michael; Bai, Jian-Hui
2011-12-01
Isoprene and monoterpene emissions were characterized for 158 species of plants growing in Shenzhen, China, using a flow and enclosure sampling method with GC-MS analysis performed in the USA. This survey was designed to include all of the dominant plants within the Shenzhen region as well as unique plants such as cycads. These are the first such measurements in a subtropical Asian metropolis. Substantial isoprene emissions were observed from thirty-one species, including Caryota mitis, Adenanthera pavonina var. microsperma, Mangifera indica and Excoecaria agallocha. Monoterpene emissions were observed from fifty-two species, including Passiflora edulis and Bambusa glaucescens cv. silverstripe, as well as some primitive and rare Cycadaceae and Cyatheaceae plants. Some red plants were measured for the first time, and most of them were found to release terpenes. These results will be used to develop biogenic emission model estimates for Shenzhen and the surrounding region that can serve as inputs for regional air quality models.
Dynamic Conformations of Nucleosome Arrays in Solution from Small-Angle X-ray Scattering
NASA Astrophysics Data System (ADS)
Howell, Steven C.
Chromatin conformation and dynamics remain unresolved despite the critical role chromatin plays in fundamental genetic functions such as transcription, replication, and repair. At the molecular level, chromatin can be viewed as a linear array of nucleosomes, each consisting of 147 base pairs (bp) of double-stranded DNA (dsDNA) wrapped around a protein core and connected by 10 to 90 bp of linker dsDNA. Using small-angle X-ray scattering (SAXS), we investigated how the conformations of model nucleosome arrays in solution are modulated by ionic conditions as well as by linker histone proteins. To facilitate ensemble modeling of these SAXS measurements, we developed a simulation method that treats coarse-grained DNA as a Markov chain, then explores possible DNA conformations using Metropolis Monte Carlo (MC) sampling. This algorithm extends the functionality of SASSIE, a program used to model intrinsically disordered biological molecules, adding to its previous methods for simulating proteins, carbohydrates, and single-stranded DNA. Our SAXS measurements of various nucleosome arrays, together with the MC-generated models, provide valuable solution structure information identifying specific differences from the structure of crystallized arrays.
Quality assessment of MEG-to-MRI coregistrations
NASA Astrophysics Data System (ADS)
Sonntag, Hermann; Haueisen, Jens; Maess, Burkhard
2018-04-01
For high precision in source reconstruction of magnetoencephalography (MEG) or electroencephalography data, high accuracy of the coregistration of sources and sensors is mandatory. Usually, the source space is derived from magnetic resonance imaging (MRI). In most cases, however, no quality assessment is reported for sensor-to-MRI coregistrations; if any, typically the root mean square (RMS) of point residuals is provided. It has been shown, however, that the RMS of residuals does not correlate with coregistration errors. We suggest using the target registration error (TRE) as a criterion for the quality of sensor-to-MRI coregistrations. TRE measures the effect of uncertainty in coregistrations at all points of interest. In total, 5544 data sets with sensor-to-head and 128 head-to-MRI coregistrations, from a single MEG laboratory, were analyzed. An adaptive Metropolis algorithm was used to estimate the optimal coregistration and to sample the coregistration parameters (rotation and translation). We found an average TRE between 1.3 and 2.3 mm at the head surface. Further, we observed a mean absolute difference in coregistration parameters between the Metropolis and iterative closest point algorithms of (1.9 ± 15)° and (1.1 ± 9) m. A paired-sample t-test indicated a significant improvement in goal function minimization by using the Metropolis algorithm. The sampled parameters allowed computation of TRE on the entire grid of the MRI volume. Hence, we recommend the Metropolis algorithm for head-to-MRI coregistrations.
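The adaptive Metropolis idea can be sketched in one dimension. The scale-adaptation rule below (Robbins-Monro tuning toward a target acceptance rate) is one common adaptive variant, not necessarily the one used in the paper, and the target density is illustrative:

```python
import math
import random

def adaptive_metropolis(logpost, x0, n_iter=6000, target_acc=0.44, seed=1):
    """1-D Metropolis sampler whose proposal scale is tuned on the fly
    toward a target acceptance rate (Robbins-Monro adaptation)."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    log_scale = 0.0
    samples = []
    for i in range(1, n_iter + 1):
        prop = x + math.exp(log_scale) * rng.gauss(0.0, 1.0)
        lp_prop = logpost(prop)
        accepted = rng.random() < math.exp(min(0.0, lp_prop - lp))
        if accepted:
            x, lp = prop, lp_prop
        # raise the proposal scale when accepting too often, shrink otherwise;
        # the 1/sqrt(i) factor makes the adaptation diminish over time
        log_scale += ((1.0 if accepted else 0.0) - target_acc) / math.sqrt(i)
        samples.append(x)
    return samples

# sample a standard normal target, starting far from the mode
samples = adaptive_metropolis(lambda x: -0.5 * x * x, 3.0)
```

Diminishing adaptation is what keeps such samplers ergodic: the proposal eventually stops changing, so the chain's stationary distribution is preserved.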
Simulating the Rayleigh-Taylor instability with the Ising model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ball, Justin R.; Elliott, James B.
2011-08-26
The Ising model, implemented with the Metropolis algorithm and Kawasaki dynamics, makes a system with its own physics, distinct from the real world. These physics are sophisticated enough to model behavior similar to the Rayleigh-Taylor instability, and by better understanding them we can learn how to modify the system to better reflect reality. For example, we could add a v_x and a v_y to each spin and modify the exchange rules to incorporate them, possibly using two-body scattering laws to construct a more realistic system.
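A minimal sketch of a Kawasaki-dynamics Metropolis sweep, assuming a periodic L x L lattice with coupling J = 1; unlike single-spin-flip updates, exchanging antiparallel neighbours conserves the total magnetisation exactly, which is what makes it suitable for modelling transport of conserved quantities (this illustrates the update rule only, not the authors' code):

```python
import math
import random

def kawasaki_sweep(spins, beta, rng):
    """One Metropolis sweep with Kawasaki (spin-exchange) dynamics on an
    L x L periodic Ising lattice with J = 1."""
    L = len(spins)

    def nb_sum(i, j):
        # sum of the four nearest neighbours with periodic boundaries
        return (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])

    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        di, dj = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        k, l = (i + di) % L, (j + dj) % L
        if spins[i][j] == spins[k][l]:
            continue  # swapping parallel spins changes nothing
        # local fields of each spin, excluding the partner spin itself
        a = nb_sum(i, j) - spins[k][l]
        b = nb_sum(k, l) - spins[i][j]
        dE = 2 * spins[i][j] * (a - b)  # energy change of the exchange
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j], spins[k][l] = spins[k][l], spins[i][j]
```

Because every move is an exchange, the number of up and down spins never changes, mimicking conserved particle density in a two-fluid instability.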
Ising model simulation in directed lattices and networks
NASA Astrophysics Data System (ADS)
Lima, F. W. S.; Stauffer, D.
2006-01-01
On directed lattices, with half as many neighbours as in the usual undirected lattices, the Ising model does not seem to show a spontaneous magnetisation, at least for lower dimensions. Instead, the decay time for flipping of the magnetisation follows an Arrhenius law on the square and simple cubic lattice. On directed Barabási-Albert networks with two and seven neighbours selected by each added site, Metropolis and Glauber algorithms give similar results, while for Wolff cluster flipping the magnetisation decays exponentially with time.
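The two update rules being compared differ only in the probability of accepting a spin flip for a given energy change ΔE; a minimal sketch (β is the inverse temperature; both rules satisfy detailed balance, so on undirected lattices they sample the same equilibrium):

```python
import math

def metropolis_flip_prob(dE, beta):
    """Metropolis rule: accept a flip with probability min(1, exp(-beta*dE))."""
    return min(1.0, math.exp(-beta * dE))

def glauber_flip_prob(dE, beta):
    """Glauber (heat-bath) rule: flip with probability 1/(1 + exp(beta*dE))."""
    return 1.0 / (1.0 + math.exp(beta * dE))
```

At ΔE = 0 Metropolis always flips while Glauber flips half the time, but the ratio p(ΔE)/p(-ΔE) equals exp(-βΔE) for both, which is the detailed-balance condition.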
Computer simulation of the mechanical properties of metamaterials
NASA Astrophysics Data System (ADS)
Gerasimov, R. A.; Eremeyev, V. A.; Petrova, T. O.; Egorov, V. I.; Maksimova, O. G.; Maksimov, A. V.
2016-08-01
For a hybrid discrete-continual model describing a system consisting of a substrate and a polymer coating, we provide computer simulation of its mechanical properties at various levels of deformation. For the substrate, we apply an elastic model obeying Hooke's law, while for the polymeric coating we use a discrete model. Here we use the Stockmayer potential, which is a Lennard-Jones potential with an additional term describing the dipole interactions between neighbouring segments of polymer chains, the Keesom energy. The equilibrium state at a given temperature is determined using the Monte Carlo method with the Metropolis algorithm. We obtain the dependence of the energy, force, bending moment and Young's modulus on the level of deformation and on temperature. We show that as the level of deformation increases, the influence of the surface coating on the considered material parameters becomes less pronounced. We compare the obtained results with experimental data on deformations of crystalline polymers (gutta-percha, etc.).
Application of statistical mechanical methods to the modeling of social networks
NASA Astrophysics Data System (ADS)
Strathman, Anthony Robert
With the recent availability of large-scale social data sets, social networks have become open to quantitative analysis via the methods of statistical physics. We examine the statistical properties of a real large-scale social network, generated from cellular phone call-trace logs. We find this network, like many other social networks, to be assortative (r = 0.31) and clustered (i.e., strongly transitive, C = 0.21). We measure fluctuation scaling to identify the presence of internal structure in the network and find that structural inhomogeneity effectively disappears at the scale of a few hundred nodes, though there is no sharp cutoff. We introduce an agent-based model of social behavior, designed to model the formation and dissolution of social ties. The model is a modified Metropolis algorithm containing agents operating under the basic sociological constraints of reciprocity, communication need and transitivity, and it introduces the concept of a social temperature. We go on to show that this simple model reproduces the global statistical network features (incl. assortativity, connected fraction, mean degree, clustering, and mean shortest path length) of the real network data and undergoes two phase transitions as a function of this social temperature: one from a "gas" to a "liquid" state, and a second from a liquid to a glassy state.
Discrete Spin Vector Approach for Monte Carlo-based Magnetic Nanoparticle Simulations
NASA Astrophysics Data System (ADS)
Senkov, Alexander; Peralta, Juan; Sahay, Rahul
The study of magnetic nanoparticles has gained significant popularity due to the potential uses in many fields such as modern medicine, electronics, and engineering. To study the magnetic behavior of these particles in depth, it is important to be able to model and simulate their magnetic properties efficiently. Here we utilize the Metropolis-Hastings algorithm with a discrete spin vector model (in contrast to the standard continuous model) to model the magnetic hysteresis of a set of protected pure iron nanoparticles. We compare our simulations with the experimental hysteresis curves and discuss the efficiency of our algorithm.
Surface Segregation in Ternary Alloys
NASA Technical Reports Server (NTRS)
Good, Brian; Bozzolo, Guillermo H.; Abel, Phillip B.
2000-01-01
Surface segregation profiles of binary (Cu-Ni, Au-Ni, Cu-Au) and ternary (Cu-Au-Ni) alloys are determined via Monte Carlo-Metropolis computer simulations using the BFS method for alloys for the calculation of the energetics. The behavior of Cu or Au in Ni is contrasted with their behavior when both are present. The interaction between Cu and Au and its effect on the segregation profiles for Cu-Au-Ni alloys is discussed.
NASA Astrophysics Data System (ADS)
Keating, Elizabeth H.; Doherty, John; Vrugt, Jasper A.; Kang, Qinjun
2010-10-01
Highly parameterized and CPU-intensive groundwater models are increasingly being used to understand and predict flow and transport through aquifers. Despite their frequent use, these models pose significant challenges for parameter estimation and predictive uncertainty analysis algorithms, particularly global methods which usually require very large numbers of forward runs. Here we present a general methodology for parameter estimation and uncertainty analysis that can be utilized in these situations. Our proposed method includes extraction of a surrogate model that mimics key characteristics of a full process model, followed by testing and implementation of a pragmatic uncertainty analysis technique, called null-space Monte Carlo (NSMC), that merges the strengths of gradient-based search and parameter dimensionality reduction. As part of the surrogate model analysis, the results of NSMC are compared with a formal Bayesian approach using the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. Such a comparison has never been accomplished before, especially in the context of high parameter dimensionality. Despite the highly nonlinear nature of the inverse problem, the existence of multiple local minima, and the relatively large parameter dimensionality, both methods performed well and results compare favorably with each other. Experiences gained from the surrogate model analysis are then transferred to calibrate the full highly parameterized and CPU-intensive groundwater model and to explore predictive uncertainty of predictions made by that model. The methodology presented here is generally applicable to any highly parameterized and CPU-intensive environmental model, where efficient methods such as NSMC provide the only practical means for conducting predictive uncertainty analysis.
Orish, Verner N; Onyeabor, Onyekachi S; Boampong, Johnson N; Aforakwah, Richmond; Nwaefuna, Ekene; Iriemenam, Nnaemeka C
2012-09-01
The problem of malaria in adolescence has been overshadowed by the immense burden of malaria in children, especially those under 5 years of age. The substantial body of work on malaria in pregnancy in endemic regions has not properly considered adolescents. The present study therefore aimed at evaluating the prevalence of Plasmodium falciparum infection and anaemia in adolescent pregnant girls in the Sekondi-Takoradi metropolis, Ghana. The study was carried out at four hospitals in the Sekondi-Takoradi metropolis of the western region of Ghana from January 2010 to October 2010. Structured questionnaires were administered to consenting pregnant women during their antenatal care visits. Information on education, age, gravidity, occupation and socio-demographic characteristics was recorded. Venous blood samples were screened for malaria using a RAPID response antibody kit and Giemsa staining, while haemoglobin estimations were done by the cyanmethemoglobin method. The results revealed that adolescent pregnant girls were more likely to have malaria infection than adult pregnant women (34.6% versus 21.3%, adjusted OR 1.65, 95% CI 1.03-2.65, P=0.039). In addition, adolescent pregnant girls had higher odds of anaemia than adult pregnant women (43.9% versus 33.2%; adjusted OR 1.63, 95% CI 1.01-2.62, P=0.046). Taken together, these data suggest that adolescent pregnant girls were more likely to have malaria and anaemia than their adult pregnant counterparts. Results from this study show that proactive adolescent-friendly policies and control programmes for malaria and anaemia are needed in this region in order to protect this vulnerable group of pregnant women. Copyright © 2012 Elsevier B.V. All rights reserved.
The Metropolis Monte Carlo method with CUDA enabled Graphic Processing Units
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, Clifford; School of Physics, Astronomy, and Computational Sciences, George Mason University, 4400 University Dr., Fairfax, VA 22030; Ji, Weixiao
2014-02-01
We present a CPU–GPU system for runtime acceleration of large molecular simulations using GPU computation and memory swaps. The memory architecture of the GPU can be used both as container for simulation data stored on the graphics card and as floating-point code target, providing an effective means for the manipulation of atomistic or molecular data on the GPU. To fully take advantage of this mechanism, efficient GPU realizations of algorithms used to perform atomistic and molecular simulations are essential. Our system implements a versatile molecular engine, including inter-molecule interactions and orientational variables, for performing the Metropolis Monte Carlo (MMC) algorithm, which is one type of Markov chain Monte Carlo. By combining memory objects with floating-point code fragments we have implemented an MMC parallel engine that entirely avoids the communication time of molecular data at runtime. Our runtime acceleration system is a forerunner of a new class of CPU–GPU algorithms exploiting memory concepts combined with threading for avoiding bus bandwidth and communication. The testbed molecular system used here is a condensed phase system of oligopyrrole chains. A benchmark shows a size scaling speedup of 60 for systems with 210,000 pyrrole monomers. Our implementation can easily be combined with MPI to connect in parallel several CPU–GPU duets.
Highlights:
• We parallelize the Metropolis Monte Carlo (MMC) algorithm on one CPU–GPU duet.
• The Adaptive Tempering Monte Carlo employs MMC and profits from this CPU–GPU implementation.
• Our benchmark shows a size scaling-up speedup of 62 for systems with 225,000 particles.
• The testbed involves a polymeric system of oligopyrroles in the condensed phase.
• The CPU–GPU parallelization includes dipole–dipole and Mie–Jones classic potentials.
Hazardous waste management and weight-based indicators--the case of Haifa Metropolis.
Elimelech, E; Ayalon, O; Flicstein, B
2011-01-30
The quantity control of hazardous waste in Israel relies primarily on the Environmental Services Company (ESC) reports. With limited management tools, the Ministry of Environmental Protection (MoEP) has no applicable methodology to confirm or monitor the actual amounts of hazardous waste produced by various industrial sectors. The main goal of this research was to develop a method for estimating the amounts of hazardous waste produced by various sectors. In order to achieve this goal, sector-specific indicators were tested on three hazardous waste producing sectors in the Haifa Metropolis: petroleum refineries, dry cleaners, and public hospitals. The findings reveal poor practice of hazardous waste management in the dry cleaning sector and in the public hospitals sector. Large discrepancies were found in the dry cleaning sector, between the quantities of hazardous waste reported and the corresponding indicator estimates. Furthermore, a lack of documentation on hospitals' pharmaceutical and chemical waste production volume was observed. Only in the case of petroleum refineries, the reported amount was consistent with the estimate. Copyright © 2010 Elsevier B.V. All rights reserved.
Constant-pH Hybrid Nonequilibrium Molecular Dynamics–Monte Carlo Simulation Method
2016-01-01
A computational method is developed to carry out explicit solvent simulations of complex molecular systems under conditions of constant pH. In constant-pH simulations, preidentified ionizable sites are allowed to spontaneously protonate and deprotonate as a function of time in response to the environment and the imposed pH. The method, based on a hybrid scheme originally proposed by H. A. Stern (J. Chem. Phys. 2007, 126, 164112), consists of carrying out short nonequilibrium molecular dynamics (neMD) switching trajectories to generate physically plausible configurations with changed protonation states that are subsequently accepted or rejected according to a Metropolis Monte Carlo (MC) criterion. To ensure microscopic detailed balance arising from such nonequilibrium switches, the atomic momenta are altered according to the symmetric two-ends momentum reversal prescription. To achieve higher efficiency, the original neMD–MC scheme is separated into two steps, reducing the need for generating a large number of unproductive and costly nonequilibrium trajectories. In the first step, the protonation state of a site is randomly attributed via a Metropolis MC process on the basis of an intrinsic pKa; an attempted nonequilibrium switch is generated only if this change in protonation state is accepted. This hybrid two-step inherent pKa neMD–MC simulation method is tested with single amino acids in solution (Asp, Glu, and His) and then applied to turkey ovomucoid third domain and hen egg-white lysozyme. Because of the simple linear increase in the computational cost relative to the number of titratable sites, the present method is naturally able to treat extremely large systems. PMID:26300709
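The cheap first step of the two-step scheme can be sketched as a Metropolis test driven by the intrinsic pKa alone, with free energies in units of kT. The expensive neMD switch that would follow an acceptance is omitted here, and the function name and parameter values are illustrative:

```python
import math
import random

def attempt_titration(protonated, pKa, pH, rng):
    """Step 1 of the two-step scheme: Metropolis-accept a protonation-state
    change using only the intrinsic pKa. In the full method, a costly neMD
    switching trajectory is launched only when this cheap test accepts."""
    # Free-energy change (in kT) of the proposed move at the imposed pH:
    # deprotonation is favourable when pH > pKa, and vice versa.
    if protonated:               # propose deprotonation
        dG = math.log(10.0) * (pKa - pH)
    else:                        # propose protonation
        dG = math.log(10.0) * (pH - pKa)
    if dG <= 0 or rng.random() < math.exp(-dG):
        return not protonated    # accepted: flip the state
    return protonated            # rejected: keep the state
```

Run as a chain, this test reproduces the Henderson-Hasselbalch occupancy for an isolated site: at pH = pKa the site is protonated half the time, and at pH far above the pKa it is almost always deprotonated.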
Profile-Based LC-MS Data Alignment—A Bayesian Approach
Tsai, Tsung-Heng; Tadesse, Mahlet G.; Wang, Yue; Ressom, Habtom W.
2014-01-01
A Bayesian alignment model (BAM) is proposed for alignment of liquid chromatography-mass spectrometry (LC-MS) data. BAM belongs to the category of profile-based approaches, which are composed of two major components: a prototype function and a set of mapping functions. Appropriate estimation of these functions is crucial for good alignment results. BAM uses Markov chain Monte Carlo (MCMC) methods to draw inference on the model parameters and improves on existing MCMC-based alignment methods through 1) the implementation of an efficient MCMC sampler and 2) an adaptive selection of knots. A block Metropolis-Hastings algorithm that mitigates the problem of the MCMC sampler getting stuck at local modes of the posterior distribution is used for the update of the mapping function coefficients. In addition, a stochastic search variable selection (SSVS) methodology is used to determine the number and positions of knots. We applied BAM to a simulated data set, an LC-MS proteomic data set, and two LC-MS metabolomic data sets, and compared its performance with the Bayesian hierarchical curve registration (BHCR) model, the dynamic time-warping (DTW) model, and the continuous profile model (CPM). The advantage of applying appropriate profile-based retention time correction prior to performing a feature-based approach is also demonstrated through the metabolomic data sets. PMID:23929872
LMC: Logarithmantic Monte Carlo
NASA Astrophysics Data System (ADS)
Mantz, Adam B.
2017-06-01
LMC is a Markov Chain Monte Carlo engine in Python that implements adaptive Metropolis-Hastings and slice sampling, as well as the affine-invariant method of Goodman & Weare, in a flexible framework. It can be used for simple problems, but the main use case is problems where expensive likelihood evaluations are provided by less flexible third-party software, which benefit from parallelization across many nodes at the sampling level. The parallel/adaptive methods use communication through MPI, or alternatively by writing/reading files, and mostly follow the approaches pioneered by CosmoMC (ascl:1106.025).
Metis: A Pure Metropolis Markov Chain Monte Carlo Bayesian Inference Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bates, Cameron Russell; Mckigney, Edward Allen
The use of Bayesian inference in data analysis has become the standard for large scientific experiments [1, 2]. The Monte Carlo Codes Group (XCP-3) at Los Alamos has developed a simple set of algorithms, currently implemented in C++ and Python, to easily perform flat-prior Markov chain Monte Carlo Bayesian inference with pure Metropolis sampling. These implementations are designed to be user friendly and extensible for customization based on specific application requirements. This document describes the algorithmic choices made and presents two use cases.
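A flat-prior pure-Metropolis sampler of the kind described can be sketched in a few lines; this illustrates the sampling scheme only and is not Metis's actual API (the bounds, data, and likelihood below are invented for the example):

```python
import math
import random

def flat_prior_metropolis(loglik, lo, hi, n_iter=4000, step=0.5, seed=7):
    """Pure Metropolis with a flat prior on [lo, hi]: proposals outside the
    bounds have zero prior density and are rejected outright, so only the
    log-likelihood enters the acceptance ratio."""
    rng = random.Random(seed)
    x = 0.5 * (lo + hi)
    ll = loglik(x)
    chain = []
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, step)   # symmetric Gaussian proposal
        if lo <= prop <= hi:
            llp = loglik(prop)
            if rng.random() < math.exp(min(0.0, llp - ll)):
                x, ll = prop, llp
        chain.append(x)
    return chain

# toy inference: posterior of a Gaussian location parameter (unit variance)
data = [1.8, 2.1, 2.4, 1.9, 2.2]
loglik = lambda mu: -0.5 * sum((d - mu) ** 2 for d in data)
chain = flat_prior_metropolis(loglik, -10.0, 10.0)
```

With a flat prior the posterior is proportional to the likelihood, so after burn-in the chain concentrates around the sample mean of the data.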
Bayesian Model Selection in Geophysics: The evidence
NASA Astrophysics Data System (ADS)
Vrugt, J. A.
2016-12-01
Bayesian inference has found widespread application and use in science and engineering to reconcile Earth system models with data, including prediction in space (interpolation), prediction in time (forecasting), assimilation of observations and deterministic/stochastic model output, and inference of the model parameters. By Bayes' theorem, the posterior probability, P(H|D), of a hypothesis, H, given the data D, is equivalent to the product of its prior probability, P(H), and likelihood, L(H|D), divided by a normalization constant, P(D). In geophysics, the hypothesis, H, often constitutes a description (parameterization) of the subsurface for some entity of interest (e.g. porosity, moisture content). The normalization constant, P(D), is not required for inference of the subsurface structure, yet is of great value for model selection. Unfortunately, it is not particularly easy to estimate P(D) in practice. Here, I will introduce the various building blocks of a general purpose method which provides robust and unbiased estimates of the evidence, P(D). This method uses multi-dimensional numerical integration of the posterior (parameter) distribution. I will then illustrate this new estimator by application to three competing subsurface models (hypotheses) using GPR travel time data from the South Oyster Bacterial Transport Site, in Virginia, USA. The three subsurface models differ in their treatment of the porosity distribution and use (a) horizontal layering with fixed layer thicknesses, (b) vertical layering with fixed layer thicknesses and (c) a multi-Gaussian field. The results of the new estimator are compared against the brute force Monte Carlo method, and the Laplace-Metropolis method.
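The Laplace-Metropolis estimator mentioned at the end approximates the posterior by a Gaussian centred at the best sampled point, with the variance read off the MCMC chain; a one-dimensional sketch (the chain and target used below are illustrative, not from the abstract):

```python
import math
import random
import statistics

def laplace_metropolis_evidence_1d(samples, log_joint):
    """Laplace-Metropolis estimate of ln P(D) from MCMC output (1-D sketch):
    ln P(D) ~ log_joint(mode) + 0.5 * ln(2*pi*var), where the mode is taken
    as the best sampled point and var as the chain's sample variance."""
    mode = max(samples, key=log_joint)      # highest-density sampled point
    var = statistics.variance(samples)      # Gaussian approximation width
    return log_joint(mode) + 0.5 * math.log(2.0 * math.pi * var)

# analytic check: unnormalised log-posterior -x^2/2 integrates to sqrt(2*pi),
# so the true log-evidence is 0.5*ln(2*pi)
rng = random.Random(2)
draws = [rng.gauss(0.0, 1.0) for _ in range(4000)]
log_evidence = laplace_metropolis_evidence_1d(draws, lambda x: -0.5 * x * x)
```

The estimator is exact for Gaussian posteriors, which is why the abstract's multi-dimensional numerical-integration estimator is of interest for the non-Gaussian posteriors typical of subsurface models.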
Isaiah, Ibeh Nnana; Nche, Bikwe Thomas; Nwagu, Ibeh Georgina; Nnanna, Ibeh Isaiah
2011-01-01
Background: The current rise of male infertility associated with bacteriospermia and urogenital infection has been on the increase amongst adult married males in Benin metropolis and is a major cause of concern for male fertility and reproduction in Nigeria. Aim: To microbiologically isolate and study the infectious agents that lead to male infertility, and to study the percentage occurrence of bacteriospermia- and urogenitally-caused infertility in adult married males in Benin metropolis. Material and Method: Using standard microbiological methods of isolating and identifying the organisms, specimens were collected and processed, including the susceptibility profile of isolates and sperm quality. In this study a total of 140 sperm samples were collected from patients referred from the consultant outpatient department of the University of Benin Teaching Hospital and evaluated bacteriologically using standard bacterial culture methods. Results: Among the total cases, 92 (65.7%) showed at least one pathogen: Staphylococcus aureus (28.3%), Staphylococcus saprophyticus (13.0%), Pseudomonas aeruginosa (6.5%), Escherichia coli (19.6%), Proteus mirabilis (10.8%), Klebsiella spp. (10.8%) and Proteus vulgaris (10.8%). Conclusion: There was an outstanding significant relationship between bacteriospermia and the rates of total motility and morphologically abnormal sperm. The percentage of morphologically normal sperm was lower in this study. Staphylococcus aureus, Staphylococcus saprophyticus and Escherichia coli were the most common pathogens having negative effects on sperm motility and morphology in this study. PMID:22363079
Lu, Fred Sun; Hou, Suqin; Baltrusaitis, Kristin; Shah, Manan; Leskovec, Jure; Sosic, Rok; Hawkins, Jared; Brownstein, John; Conidi, Giuseppe; Gunn, Julia; Gray, Josh; Zink, Anna
2018-01-01
Background Influenza outbreaks pose major challenges to public health around the world, leading to thousands of deaths a year in the United States alone. Accurate systems that track influenza activity at the city level are necessary to provide actionable information that can be used for clinical, hospital, and community outbreak preparation. Objective Although Internet-based real-time data sources such as Google searches and tweets have been successfully used to produce influenza activity estimates ahead of traditional health care–based systems at national and state levels, influenza tracking and forecasting at finer spatial resolutions, such as the city level, remain an open question. Our study aimed to present a precise, near real-time methodology capable of producing influenza estimates ahead of those collected and published by the Boston Public Health Commission (BPHC) for the Boston metropolitan area. This approach has great potential to be extended to other cities with access to similar data sources. Methods We first tested the ability of Google searches, Twitter posts, electronic health records, and a crowd-sourced influenza reporting system to detect influenza activity in the Boston metropolis separately. We then adapted a multivariate dynamic regression method named ARGO (autoregression with general online information), designed for tracking influenza at the national level, and showed that it effectively uses the above data sources to monitor and forecast influenza at the city level 1 week ahead of the current date. Finally, we presented an ensemble-based approach capable of combining information from models based on multiple data sources to more robustly nowcast as well as forecast influenza activity in the Boston metropolitan area. The performances of our models were evaluated in an out-of-sample fashion over 4 influenza seasons within 2012-2016, as well as a holdout validation period from 2016 to 2017. 
Results Our ensemble-based methods incorporating information from diverse models based on multiple data sources, including ARGO, produced the most robust and accurate results. The observed Pearson correlations between our out-of-sample flu activity estimates and those historically reported by the BPHC were 0.98 in nowcasting influenza and 0.94 in forecasting influenza 1 week ahead of the current date. Conclusions We show that information from Internet-based data sources, when combined using an informed, robust methodology, can be effectively used as early indicators of influenza activity at fine geographic resolutions. PMID:29317382
NASA Astrophysics Data System (ADS)
Riza, Yose; Cheris, Rika; Repi
2017-12-01
Pekanbaru City is developing very rapidly; as a consequence, buildings, areas, and cultural objects that ought to be preserved are constantly being disrupted or replaced by commercially oriented, economically driven development. This contradiction inherent in the construction of the metropolis is the beginning of the problem for urban areas. Kampong Bandar Senapelan, the earliest settlement of Pekanbaru, is located on the banks of the Siak River and has a typology of Malay and vernacular Malay architecture. The existence of these villages is a source of concern, as the city's development toward a metropolis has degraded the historical value of urban development in this region. This study assesses the importance of preserving Kampung Bandar Senapelan as the city's oldest area, given its strong influence on the development of the metropolis. Preserving historical and cultural heritage through conservation and preservation measures is one of the urban design elements that all city stakeholders should consider in order to safeguard the civilization of a generation. The benchmarks considered are history, conservation, and urban development toward the metropolis. Awareness of conserving the city through conservation and preservation in this area can give new character and value to buildings and their environment, creating an atmosphere distinct from rapid modern-style development. In addition, this preservation will help sustain a harmonious life with a high tolerance between the multiple ethnicities that have co-existed here since the past.
Simulated Annealing in the Variable Landscape
NASA Astrophysics Data System (ADS)
Hasegawa, Manabu; Kim, Chang Ju
An experimental analysis is conducted to test whether the appropriate introduction of a smoothness-temperature schedule enhances the optimizing ability of the MASSS method, the combination of the Metropolis algorithm (MA) and the search-space smoothing (SSS) method. The test is performed on two types of random traveling salesman problems. The results show that the optimization performance of the MA is substantially improved by a single smoothing alone, and slightly more by a single smoothing with cooling and by a de-smoothing process with heating. The performance is compared to that of the parallel tempering method; depending on the problem, a clear advantage of the smoothing idea is observed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berco, Dan, E-mail: danny.barkan@gmail.com; Tseng, Tseung-Yuen, E-mail: tseng@cc.nctu.edu.tw
This study presents an evaluation method for resistive random access memory retention reliability based on the Metropolis Monte Carlo algorithm and Gibbs free energy. The method, which does not rely on a time evolution, provides an extremely efficient way to compare the relative retention properties of metal-insulator-metal structures. It requires a small number of iterations and may be used for statistical analysis. The presented approach is used to compare the relative robustness of a single-layer ZrO2 device with a double-layer ZnO/ZrO2 one, and obtains results which are in good agreement with experimental data.
Isaiah, Ibeh Nnana; Nche, Bikwe Thomas; Nwagu, Ibeh Georgina; Nnanna, Ibeh Isaiah
2011-12-01
The rise of male infertility associated with bacteriospermia and urogenital infection has been increasing among married adult males in Benin metropolis and is a major concern for male fertility and reproduction in Nigeria. The aims were to isolate and study the infectious agents implicated in male infertility and to determine the percentage occurrence of infertility caused by bacteriospermia and urogenital infection among married adult males in Benin metropolis, using standard microbiological methods of isolation and identification; specimens were collected and processed, including the susceptibility profile of isolates and sperm quality. A total of 140 sperm samples were collected from patients referred from the consultant outpatient department of the University of Benin Teaching Hospital and evaluated bacteriologically using standard culture methods. Among the total cases, 92 (65.7%) showed at least one pathogen: Staphylococcus aureus (28.3%), Staphylococcus saprophyticus (13.0%), Pseudomonas aeruginosa (6.5%), Escherichia coli (19.6%), Proteus mirabilis (10.8%), Klebsiella spp. (10.8%) and Proteus vulgaris (10.8%). There was a significant relationship between bacteriospermia and the rates of total motility and morphologically abnormal sperm, and the percentage of morphologically normal sperm was low in this study. Staphylococcus aureus, Staphylococcus saprophyticus and Escherichia coli were the most common pathogens with negative effects on sperm motility and morphology.
Sampling algorithms for validation of supervised learning models for Ising-like systems
NASA Astrophysics Data System (ADS)
Portman, Nataliya; Tamblyn, Isaac
2017-12-01
In this paper, we build and explore supervised learning models of ferromagnetic system behavior, using Monte Carlo sampling of the spin configuration space generated by the 2D Ising model. Given the enormous size of the space of all possible Ising model realizations, the question arises as to how to choose a reasonable number of samples that will form physically meaningful and non-intersecting training and testing datasets. Here, we propose a sampling technique called "ID-MH" that uses the Metropolis-Hastings algorithm to create a Markov process across energy levels within the predefined configuration subspace. We show that application of this method retains phase transitions in both training and testing datasets and serves the purpose of validation of a machine learning algorithm. For larger lattice dimensions, ID-MH is not feasible as it requires knowledge of the complete configuration space. As such, we develop a new "block-ID" sampling strategy: it decomposes the given structure into square blocks with lattice dimension N ≤ 5 and uses ID-MH sampling of candidate blocks. Further comparison of the performance of commonly used machine learning methods such as random forests, decision trees, k-nearest neighbors and artificial neural networks shows that the PCA-based Decision Tree regressor is the most accurate predictor of magnetizations of the Ising model. For energies, however, the accuracy of prediction is not satisfactory, highlighting the need to consider more algorithmically complex methods (e.g., deep learning).
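The single-spin-flip Metropolis dynamics that generates such Ising configurations can be sketched as follows. This is a generic 2D Ising sampler, not the ID-MH or block-ID procedure; the lattice size and temperature are illustrative:

```python
import math
import random

def ising_sweep(spins, beta, rng):
    """One Metropolis sweep of an N x N periodic 2D Ising lattice (J = 1)."""
    n = len(spins)
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
              + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
        d_e = 2.0 * spins[i][j] * nb          # energy cost of flipping spin (i, j)
        if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
            spins[i][j] *= -1

rng = random.Random(1)
n = 16
spins = [[1] * n for _ in range(n)]           # start from the ordered state
for _ in range(200):
    ising_sweep(spins, beta=0.6, rng=rng)     # beta > beta_c ~ 0.44: ordered phase
m = abs(sum(sum(row) for row in spins)) / n ** 2   # magnetization per spin
```

Below the critical temperature the chain stays magnetized; sweeping beta across ~0.44 reproduces the phase transition the datasets must retain.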
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bledsoe, Keith C.
2015-04-01
The DiffeRential Evolution Adaptive Metropolis (DREAM) method is a powerful optimization/uncertainty quantification tool used to solve inverse transport problems in Los Alamos National Laboratory’s INVERSE code system. The DREAM method has been shown to be adept at accurate uncertainty quantification, but it can be very computationally demanding. Previously, the DREAM method in INVERSE performed a user-defined number of particle transport calculations. This placed a burden on the user to guess the number of calculations that would be required to accurately solve any given problem. This report discusses a new approach that has been implemented in INVERSE: the Gelman-Rubin convergence metric. This metric automatically detects when an appropriate number of transport calculations have been completed and the uncertainty in the inverse problem has been accurately calculated. In a test problem with a spherical geometry, this method was found to decrease the number of transport calculations (and thus time required) to solve a problem by an average of over 90%. In a cylindrical test geometry, a 75% decrease was obtained.
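The Gelman-Rubin diagnostic compares between-chain and within-chain variance; values near 1 signal convergence, so sampling can stop automatically. A minimal sketch of the statistic (generic, not the INVERSE implementation):

```python
import random

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat for equal-length scalar chains."""
    m, n = len(chains), len(chains[0])
    means = [sum(c) / n for c in chains]
    grand = sum(means) / m
    b = n / (m - 1) * sum((mu - grand) ** 2 for mu in means)    # between-chain
    w = sum(sum((x - mu) ** 2 for x in c) / (n - 1)
            for c, mu in zip(chains, means)) / m                # within-chain
    var_hat = (n - 1) / n * w + b / n
    return (var_hat / w) ** 0.5

rng = random.Random(0)
# Four well-mixed chains targeting the same distribution -> R-hat near 1
mixed = [[rng.gauss(0, 1) for _ in range(1000)] for _ in range(4)]
# Chains stuck in different modes -> R-hat well above 1
stuck = [[rng.gauss(mu, 1) for _ in range(1000)] for mu in (0, 0, 5, 5)]
```

A stopping rule then simply iterates until R-hat falls below a threshold such as 1.1.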
A computer program for uncertainty analysis integrating regression and Bayesian methods
Lu, Dan; Ye, Ming; Hill, Mary C.; Poeter, Eileen P.; Curtis, Gary
2014-01-01
This work develops a new functionality in UCODE_2014 to evaluate Bayesian credible intervals using the Markov Chain Monte Carlo (MCMC) method. The MCMC capability in UCODE_2014 is based on the FORTRAN version of the differential evolution adaptive Metropolis (DREAM) algorithm of Vrugt et al. (2009), which estimates the posterior probability density function of model parameters in high-dimensional and multimodal sampling problems. The UCODE MCMC capability provides eleven prior probability distributions and three ways to initialize the sampling process. It evaluates parametric and predictive uncertainties and it has parallel computing capability based on multiple chains to accelerate the sampling process. This paper tests and demonstrates the MCMC capability using a 10-dimensional multimodal mathematical function, a 100-dimensional Gaussian function, and a groundwater reactive transport model. The use of the MCMC capability is made straightforward and flexible by adopting the JUPITER API protocol. With the new MCMC capability, UCODE_2014 can be used to calculate three types of uncertainty intervals, which all can account for prior information: (1) linear confidence intervals which require linearity and Gaussian error assumptions and typically 10s–100s of highly parallelizable model runs after optimization, (2) nonlinear confidence intervals which require a smooth objective function surface and Gaussian observation error assumptions and typically 100s–1,000s of partially parallelizable model runs after optimization, and (3) MCMC Bayesian credible intervals which require few assumptions and commonly 10,000s–100,000s or more partially parallelizable model runs. Ready access allows users to select methods best suited to their work, and to compare methods in many circumstances.
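The differential-evolution proposal at the heart of DREAM, jumping a chain along the difference of two other randomly chosen chains, can be sketched as a minimal DE-MC sampler. This is an illustrative sketch of the general scheme, not the UCODE_2014 FORTRAN implementation; the one-dimensional Gaussian target and all settings are hypothetical:

```python
import math
import random

def de_mc(logpost, n_chains, n_iter, init, rng):
    """Minimal DE-MC sampler, the scheme DREAM elaborates on: each chain proposes
    x' = x + gamma * (x_a - x_b) + small noise and accepts by the Metropolis rule."""
    d = len(init[0])
    gamma = 2.38 / math.sqrt(2 * d)            # standard DE-MC jump factor
    chains = [list(c) for c in init]
    logps = [logpost(c) for c in chains]
    draws = []
    for _ in range(n_iter):
        for i in range(n_chains):
            a, b = rng.sample([k for k in range(n_chains) if k != i], 2)
            cand = [chains[i][j] + gamma * (chains[a][j] - chains[b][j])
                    + rng.gauss(0, 1e-4) for j in range(d)]
            lp = logpost(cand)
            if math.log(rng.random()) < lp - logps[i]:
                chains[i], logps[i] = cand, lp
            draws.append(chains[i][0])
    return draws

rng = random.Random(3)
init = [[rng.uniform(-5, 5)] for _ in range(5)]
draws = de_mc(lambda x: -0.5 * x[0] ** 2, 5, 2000, init, rng)  # target N(0, 1)
tail = draws[5000:]                                            # discard burn-in
mean = sum(tail) / len(tail)
var = sum((x - mean) ** 2 for x in tail) / len(tail)
```

Because every chain's update reads only the current states of the others, the chain loop parallelizes naturally, which is the basis of the parallel capability described above.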
Mazumdar, Subhendu; Ghose, Dipankar; Saha, Goutam Kumar
2017-12-14
Although Black Kites (Milvus migrans govinda) serve as a major scavenging raptor in most urban areas, scientific studies on this important ecosystem service provider are almost non-existent in the Indian context. The present study was carried out in a metropolis in eastern India to identify the factors influencing the relative abundance and roosting site selection of Black Kites. Separate generalized linear models (GLMs) were fitted with encounter rate and roosting Black Kite abundance as response variables. The study conclusively indicated that encounter rates of Black Kites were significantly influenced by the presence of garbage dumps in the vicinity. Numbers of Black Kites were also higher in roosting sites situated closer to garbage dumps and open spaces. In addition, expected counts of Black Kites significantly increased in roosting sites situated away from buildings and water bodies. However, built-up area and tree cover around the roosting sites had no influence on the abundance of Black Kites therein. With rapid urbanization and changing offal disposal patterns, our findings will be useful for ensuring the continued availability of food and roosting sites for Black Kites in urban areas.
NASA Astrophysics Data System (ADS)
Vrugt, Jasper A.; Beven, Keith J.
2018-04-01
This essay illustrates some recent developments to the DiffeRential Evolution Adaptive Metropolis (DREAM) MATLAB toolbox of Vrugt (2016) to delineate and sample the behavioural solution space of set-theoretic likelihood functions used within the GLUE (Limits of Acceptability) framework (Beven and Binley, 1992, 2014; Beven and Freer, 2001; Beven, 2006). This work builds on the DREAM(ABC) algorithm of Sadegh and Vrugt (2014) and enhances significantly the accuracy and CPU-efficiency of Bayesian inference with GLUE. In particular it is shown how lack of adequate sampling in the model space might lead to unjustified model rejection.
NASA Astrophysics Data System (ADS)
Liu, Boda; Liang, Yan
2017-04-01
Markov chain Monte Carlo (MCMC) simulation is a powerful statistical method for solving inverse problems that arise from a wide range of applications. In the Earth sciences, applications of MCMC simulation are primarily in the field of geophysics. The purpose of this study is to introduce MCMC methods to geochemical inverse problems related to trace element fractionation during mantle melting. MCMC methods have several advantages over least squares methods in deciphering melting processes from trace element abundances in basalts and mantle rocks. Here we use an MCMC method to invert for the extent of melting, the fraction of melt present during melting, and the extent of chemical disequilibrium between the melt and residual solid from REE abundances in clinopyroxene in abyssal peridotites from the Mid-Atlantic Ridge, Central Indian Ridge, Southwest Indian Ridge, Lena Trough, and American-Antarctic Ridge. We consider two melting models: one with an exact analytical solution and the other without. We solve the latter numerically in a chain of melting models according to the Metropolis-Hastings algorithm. The probability distribution of the inverted melting parameters depends on the assumptions of the physical model, knowledge of the mantle source composition, and constraints from the REE data. Results from MCMC inversion are consistent with, and provide more reliable uncertainty estimates than, results based on nonlinear least squares inversion. We show that chemical disequilibrium is likely to play an important role in fractionating LREE in residual peridotites during partial melting beneath mid-ocean ridge spreading centers. MCMC simulation is well suited for more complicated but physically more realistic melting problems that do not have analytical solutions.
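A random-walk Metropolis inversion of a toy forward model illustrates the general idea. The batch-melting equation and every number below are hypothetical stand-ins, not the authors' melting models or REE data:

```python
import math
import random

def mh_invert(forward, y_obs, sigma, x0, step, n, rng, lo=0.0, hi=1.0):
    """Random-walk Metropolis for a scalar inverse problem: uniform prior on
    (lo, hi), Gaussian observation error with standard deviation sigma."""
    def logpost(x):
        if not lo < x < hi:
            return -math.inf
        return -0.5 * ((forward(x) - y_obs) / sigma) ** 2
    x, lp = x0, logpost(x0)
    out = []
    for _ in range(n):
        y = x + rng.gauss(0, step)
        lpy = logpost(y)
        if math.log(rng.random()) < lpy - lp:
            x, lp = y, lpy
        out.append(x)
    return out

# Hypothetical batch-melting forward model: C_l = C_0 / (F + D * (1 - F))
C0, D = 1.0, 0.05
forward = lambda F: C0 / (F + D * (1 - F))
rng = random.Random(7)
y_obs = forward(0.10)                     # synthetic observation, true F = 0.10
draws = mh_invert(forward, y_obs, sigma=0.5, x0=0.5, step=0.05, n=20000, rng=rng)
F_hat = sum(draws[5000:]) / len(draws[5000:])
```

Unlike a least squares fit, the retained draws give the full posterior over the melting degree F, from which credible intervals follow directly.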
Adaptive Metropolis Sampling with Product Distributions
NASA Technical Reports Server (NTRS)
Wolpert, David H.; Lee, Chiu Fan
2005-01-01
The Metropolis-Hastings (MH) algorithm is a way to sample a provided target distribution pi(x). It works by repeatedly sampling a separate proposal distribution T(x,x') to generate a random walk {x(t)}. We consider a modification of the MH algorithm in which T is dynamically updated during the walk. The update at time t uses the {x(t') : t' < t} to estimate the product distribution that has the least Kullback-Leibler distance to pi. That estimate is the information-theoretically optimal mean-field approximation to pi. We demonstrate through computer experiments that our algorithm produces samples that are superior to those of the conventional MH algorithm.
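The scheme can be sketched as an independence Metropolis-Hastings sampler whose proposal, a product of per-dimension Gaussians, is periodically refit to the walk's history. This is a simplified illustration (moment matching with continual adaptation and no diminishing-adaptation safeguards), not the authors' KL-optimal estimator; the 2-D Gaussian target is hypothetical:

```python
import math
import random

def adaptive_product_mh(logp, x0, n, rng, adapt_every=100):
    """Independence MH with a product-of-Gaussians proposal refit to the history."""
    d = len(x0)
    mu, sd = [0.0] * d, [2.0] * d                 # initial broad proposal
    def q_logpdf(z):
        return sum(-0.5 * ((zi - m) / s) ** 2 - math.log(s)
                   for zi, m, s in zip(z, mu, sd))
    x, lp = list(x0), logp(x0)
    hist = []
    for t in range(1, n + 1):
        y = [rng.gauss(m, s) for m, s in zip(mu, sd)]
        lpy = logp(y)
        # independence-sampler MH ratio: p(y) q(x) / (p(x) q(y))
        if math.log(rng.random()) < (lpy + q_logpdf(x)) - (lp + q_logpdf(y)):
            x, lp = list(y), lpy
        hist.append(list(x))
        if t % adapt_every == 0:                  # refit the product proposal
            for j in range(d):
                col = [h[j] for h in hist]
                m = sum(col) / len(col)
                v = sum((c - m) ** 2 for c in col) / len(col)
                mu[j], sd[j] = m, max(math.sqrt(v), 0.05)
    return hist

rng = random.Random(5)
target = lambda z: -0.5 * ((z[0] - 1.0) ** 2 + (z[1] + 2.0) ** 2)
hist = adaptive_product_mh(target, [0.0, 0.0], 5000, rng)
m0 = sum(h[0] for h in hist[1000:]) / len(hist[1000:])
m1 = sum(h[1] for h in hist[1000:]) / len(hist[1000:])
```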
Wusu, Onipede
2013-06-01
The influence of adolescents' exposure to the sexual health content of mass media on their sexual health behaviour in Nigeria is still not clear. Data were gathered through a survey conducted among adolescents aged 12-19 years in Lagos metropolis between November 2009 and February 2010. A multistage sampling strategy was adopted in selecting respondents, and logistic regression was used in the analysis. The results indicate that the respondents were most frequently exposed to TV (male = 92.2; female = 94.9) and radio (male = 88.2; female = 91.7). The odds ratios indicate that the sexual health content of mass media significantly predicted condom use, multiple sexual relationships, sexual intercourse and self-reported occurrence of abortion in the study sample. The findings imply that positive media sexual health content is likely to promote sexual health among adolescents, but negative content can put adolescents' sexual health in danger. In addition, safe sex can be advanced among adolescents if the media provide accurate information on sexuality, emphasising the dangers of risky sexual practices. Finally, this study posits that accurate portrayal of sexuality in the media would contribute immensely to improving public health in the metropolis.
Gumbo, B
2000-01-01
The Harare metropolis in Zimbabwe, extending upstream from Manyame Dam in the Upper Manyame River Basin, consists of the City of Harare and its satellite towns: Chitungwiza, Norton, Epworth and Ruwa. The existing urban drainage system is typically a single-use-mixing system: water is used and discharged to "waste", excreta are flushed to sewers and eventually, after "treatment", the effluent is discharged to a drinking water supply source. Polluted urban storm water is evacuated as fast as possible. This system not only ignores the substantial value in "waste" materials, but it also exports problems to downstream communities and to vulnerable fresh-water sources. The question is how the Harare metropolis urban drainage system, which is complex and has evolved over time, can be rearranged to achieve sustainability (i.e. water conservation, pollution prevention at source, protection of the vulnerable drinking water sources and recovery of valuable materials). This paper reviews current concepts regarding the future development of the urban drainage system in line with the new vision of "Sustainable Cities of the Future". The Harare Metropolis in Zimbabwe is taken as a case, and philosophical options for re-engineering the drainage system are discussed.
Assessment of radiation protection practices among radiographers in Lagos, Nigeria.
Eze, Cletus Uche; Abonyi, Livinus Chibuzo; Njoku, Jerome; Irurhe, Nicholas Kayode; Olowu, Oluwabola
2013-11-01
Use of ionising radiation in diagnostic radiography could lead to hazards such as somatic and genetic damage. Compliance with safe work and radiation protection practices could mitigate such risks. The aim of the study was to assess the knowledge and radiation protection practices among radiographers in Lagos, Nigeria. The study was a prospective cross-sectional survey. A convenience sampling technique was used to select four x-ray diagnostic centres in four tertiary hospitals in Lagos metropolis. Data were analysed with Epi-Info software, version 3.5.1. The average score on the assessment of knowledge was 73%. Most modern radiation protection instruments were lacking in all the centres studied. Application of shielding devices such as gonad shields for protection was neglected, mostly in government hospitals. Most x-ray machines were quite old, and evidence of quality assurance tests performed on such machines was lacking. Radiographers within Lagos metropolis showed an excellent knowledge of radiation protection within the study period. Adherence to radiation protection practices among radiographers in Lagos metropolis during the period studied was, however, poor. Radiographers in Lagos, Nigeria should embrace current trends in radiation protection and make more concerted efforts to apply their knowledge in protecting themselves and patients from the harmful effects of ionising radiation.
Transition records of stationary Markov chains.
Naudts, Jan; Van der Straeten, Erik
2006-10-01
In any Markov chain with finite state space the distribution of transition records always belongs to the exponential family. This observation is used to prove a fluctuation theorem, and to show that the dynamical entropy of a stationary Markov chain is linear in the number of steps. Three applications are discussed. A known result about entropy production is reproduced. A thermodynamic relation is derived for equilibrium systems with Metropolis dynamics. Finally, a link is made with recent results concerning a one-dimensional polymer model.
From Internet of Things to Smart Data for Smart Urban Monitoring
NASA Astrophysics Data System (ADS)
Gastaud, E.
2017-09-01
Cities are facing some of the major challenges of our time: global warming, pollution, waste management, energy efficiency. The territory of the Metropolis of Lyon, France, which brings together 59 municipalities, for a total of 1.3 million inhabitants, has launched a smart city policy aimed, among other things, at finding solutions for these issues. The data platform set up in 2013 is one of the cornerstones of this policy. In this context, the Metropolis of Lyon is deploying solutions that will enable, through the collection of new data, to implement monitoring and action tools in several fields. As part of a European innovation project called "bIoTope", focused on the development of new services based on the Internet of Things, a multidisciplinary team is implementing a system to mitigate the effects of global warming in the city. Thanks to various connected objects allowing a true monitoring of the trees, and by using different data sources, an automatic and intelligent irrigation system is developed. In the field of waste management, several hundred containers in which the inhabitants throw away their used glass for recycling will soon be equipped with fill rate sensors. The main objective is to have this network of sensors interact easily with the container collection trucks. Expected results are an optimization of the collection, thus less fuel consumed, less noise and less traffic congestion. The Metropolis of Lyon also participates in the "Smarter Together" project, focused on the development of intelligent duplicable solutions for cities, in the field of energy. A digital tool for analysing consumption and energy production at the level of a neighbourhood is currently being developed. This requires interfaces with multiple partners, the development of a data model reflecting the reality on the ground, from the sensors to the buildings, and the implementation of a visualization tool.
Bayesian model selection validates a biokinetic model for zirconium processing in humans
2012-01-01
Background In radiation protection, biokinetic models for zirconium processing are of crucial importance in dose estimation and further risk analysis for humans exposed to this radioactive substance. They provide limiting values of detrimental effects and build the basis for applications in internal dosimetry, the prediction for radioactive zirconium retention in various organs as well as retrospective dosimetry. Multi-compartmental models are the tool of choice for simulating the processing of zirconium. Although easily interpretable, determining the exact compartment structure and interaction mechanisms is generally daunting. In the context of observing the dynamics of multiple compartments, Bayesian methods provide efficient tools for model inference and selection. Results We are the first to apply a Markov chain Monte Carlo approach to compute Bayes factors for the evaluation of two competing models for zirconium processing in the human body after ingestion. Based on in vivo measurements of human plasma and urine levels we were able to show that a recently published model is superior to the standard model of the International Commission on Radiological Protection. The Bayes factors were estimated by means of the numerically stable thermodynamic integration in combination with a recently developed copula-based Metropolis-Hastings sampler. Conclusions In contrast to the standard model the novel model predicts lower accretion of zirconium in bones. This results in lower levels of noxious doses for exposed individuals. Moreover, the Bayesian approach allows for retrospective dose assessment, including credible intervals for the initially ingested zirconium, in a significantly more reliable fashion than previously possible. All methods presented here are readily applicable to many modeling tasks in systems biology. PMID:22863152
Rényi information flow in the Ising model with single-spin dynamics.
Deng, Zehui; Wu, Jinshan; Guo, Wenan
2014-12-01
The n-index Rényi mutual information and transfer entropies for the two-dimensional kinetic Ising model with arbitrary single-spin dynamics in the thermodynamic limit are derived as functions of ensemble averages of observables and spin-flip probabilities. Cluster Monte Carlo algorithms with different dynamics from the single-spin dynamics are thus applicable to estimate the transfer entropies. By means of Monte Carlo simulations with the Wolff algorithm, we calculate the information flows in the Ising model with the Metropolis dynamics and the Glauber dynamics, respectively. We find that not only the global Rényi transfer entropy, but also the pairwise Rényi transfer entropy, peaks in the disorder phase.
Note: A pure-sampling quantum Monte Carlo algorithm with independent Metropolis.
Vrbik, Jan; Ospadov, Egor; Rothstein, Stuart M
2016-07-14
Recently, Ospadov and Rothstein published a pure-sampling quantum Monte Carlo algorithm (PSQMC) that features an auxiliary Path Z that connects the midpoints of the current and proposed Paths X and Y, respectively. When sufficiently long, Path Z provides statistical independence of Paths X and Y. Under those conditions, the Metropolis decision used in PSQMC is done without any approximation, i.e., not requiring microscopic reversibility and without having to introduce any G(x → x'; τ) factors into its decision function. This is a unique feature that contrasts with all competing reptation algorithms in the literature. An example illustrates that dependence of Paths X and Y has adverse consequences for pure sampling.
An Improved Nested Sampling Algorithm for Model Selection and Assessment
NASA Astrophysics Data System (ADS)
Zeng, X.; Ye, M.; Wu, J.; WANG, D.
2017-12-01
The multimodel strategy is a general approach for treating model structure uncertainty in recent research. The unknown groundwater system is represented by several plausible conceptual models, and each alternative conceptual model is assigned a weight representing its plausibility. In the Bayesian framework, the posterior model weight is computed as the product of the model's prior weight and its marginal likelihood (also termed model evidence). As a result, estimating marginal likelihoods is crucial for reliable model selection and assessment in multimodel analysis. The nested sampling estimator (NSE) is a newly proposed algorithm for marginal likelihood estimation. The implementation of NSE comprises searching the parameter space gradually from low-likelihood to high-likelihood areas, an evolution accomplished iteratively via a local sampling procedure. Thus, the efficiency of NSE is dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm and its variants are often used for local sampling in NSE. However, M-H is not an efficient sampling algorithm for high-dimensional or complex likelihood functions. To improve the performance of NSE, a more efficient and elaborate sampling algorithm, DREAMzs, can be integrated into the local sampling. In addition, to overcome the computational burden of the large number of repeated model executions in marginal likelihood estimation, an adaptive sparse grid stochastic collocation method is used to build surrogates for the original groundwater model.
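A minimal nested-sampling evidence estimator makes the role of the local sampling step concrete. Here replacement live points are drawn by brute-force rejection from the prior, which works in this toy 1-D problem but fails in high dimensions; that rejection loop is precisely the step a stronger sampler such as DREAMzs would replace. All settings are hypothetical:

```python
import math
import random

def nested_sampling(like, prior_draw, n_live, n_iter, rng):
    """Minimal nested-sampling estimate of the evidence Z = integral of L dpi."""
    live = [prior_draw(rng) for _ in range(n_live)]
    z, x_prev = 0.0, 1.0
    for i in range(1, n_iter + 1):
        worst = min(live, key=like)
        x_i = math.exp(-i / n_live)            # expected remaining prior volume
        z += like(worst) * (x_prev - x_i)      # weight of the discarded shell
        x_prev = x_i
        live.remove(worst)
        while True:                            # rejection-sample above the floor
            cand = prior_draw(rng)
            if like(cand) > like(worst):
                live.append(cand)
                break
    z += x_prev * sum(like(p) for p in live) / n_live   # leftover live points
    return z

rng = random.Random(11)
# Toy 1-D problem: uniform prior on [0, 1], Gaussian likelihood bump.
# True evidence is about 0.1 * sqrt(2 * pi), i.e. roughly 0.25.
like = lambda x: math.exp(-0.5 * ((x - 0.5) / 0.1) ** 2)
z_hat = nested_sampling(like, lambda r: r.random(), n_live=100, n_iter=600, rng=rng)
```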
Pesewu, George A; Bentum, Daniel; Olu-Taiwo, Michael A; Glover, Kathreen K; Yirenya-Tawiah, Dzidzo R
2017-01-01
Many developing countries, including Ghana, are water stressed. As such, farmers, particularly those in urban areas, have adopted the use of wastewater for irrigation. This study evaluated the bacteriological quality of the wastewater used for irrigation in the vegetable farms at Korle-Bu Teaching Hospital (KBTH), Accra Metropolis, Ghana. In all, 40 wastewater samples were collected and analysed bacteriologically using the total aerobic plate count method. The isolated bacteria were identified biochemically using Bergey's manual for determinative bacteriology. Mean total bacterial colony counts in the range of 2.75-4.44 × 10⁵ CFU/100 mL were recorded, which far exceeds the value of 1 × 10³/100 mL recommended by the World Health Organization (WHO) for unrestricted irrigation of crops likely to be eaten raw. Enterobacter cloacae (51.4%), Klebsiella sp. (24.1%), Pseudomonas aeruginosa (11.3%), Salmonella typhi (10.6%), Escherichia coli (2.2%) and Proteus sp. (0.4%) were the predominant bacteria isolated. Growers should use treated wastewater for farming, while processors and consumers should minimize contamination risks of produce from the vegetable farms/garden to the plate. © The Author(s) 2016.
NASA Astrophysics Data System (ADS)
Arnst, M.; Abello Álvarez, B.; Ponthot, J.-P.; Boman, R.
2017-11-01
This paper is concerned with the characterization and the propagation of errors associated with data limitations in polynomial-chaos-based stochastic methods for uncertainty quantification. Such an issue can arise in uncertainty quantification when only a limited amount of data is available. When the available information does not suffice to accurately determine the probability distributions that must be assigned to the uncertain variables, the Bayesian method for assigning these probability distributions becomes attractive because it allows the stochastic model to account explicitly for insufficiency of the available information. In previous work, such applications of the Bayesian method had already been implemented by using the Metropolis-Hastings and Gibbs Markov Chain Monte Carlo (MCMC) methods. In this paper, we present an alternative implementation, which uses an alternative MCMC method built around an Itô stochastic differential equation (SDE) that is ergodic for the Bayesian posterior. We draw together from the mathematics literature a number of formal properties of this Itô SDE that lend support to its use in the implementation of the Bayesian method, and we describe its discretization, including the choice of the free parameters, by using the implicit Euler method. We demonstrate the proposed methodology on a problem of uncertainty quantification in a complex nonlinear engineering application relevant to metal forming.
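The idea of sampling a posterior by discretizing an ergodic Itô SDE can be illustrated with the simplest case: an explicit Euler-Maruyama scheme for the overdamped Langevin equation, whose stationary law is the target. Note the paper itself uses an implicit Euler discretization, and this unadjusted explicit scheme carries a small step-size bias; the 1-D Gaussian posterior is a hypothetical stand-in:

```python
import math
import random

def langevin_chain(grad_logp, x0, step, n, rng):
    """Explicit Euler-Maruyama discretization of the overdamped Langevin SDE
    dX = grad log p(X) dt + sqrt(2) dW, whose stationary distribution is p."""
    x, out = x0, []
    for _ in range(n):
        x = x + step * grad_logp(x) + math.sqrt(2.0 * step) * rng.gauss(0, 1)
        out.append(x)
    return out

rng = random.Random(2)
# Hypothetical posterior N(1, 0.5^2): grad log p(x) = -(x - 1) / 0.25
draws = langevin_chain(lambda x: -(x - 1.0) / 0.25, 0.0, step=0.01, n=50000, rng=rng)
tail = draws[5000:]
mean = sum(tail) / len(tail)
var = sum((x - mean) ** 2 for x in tail) / len(tail)
```

The implicit variant replaces the drift term with its value at the new state, trading a nonlinear solve per step for better stability at larger step sizes.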
Building test data from real outbreaks for evaluating detection algorithms.
Texier, Gaetan; Jackson, Michael L; Siwe, Leonel; Meynard, Jean-Baptiste; Deparis, Xavier; Chaudet, Herve
2017-01-01
Benchmarking surveillance systems requires realistic simulations of disease outbreaks. However, obtaining these data in sufficient quantity, with a realistic shape and covering a sufficient range of agents, size and duration, is known to be very difficult. The dataset of outbreak signals generated should reflect the likely distribution of authentic situations faced by the surveillance system, including very unlikely outbreak signals. We propose and evaluate a new approach based on the use of historical outbreak data to simulate tailored outbreak signals. The method relies on a homothetic transformation of the historical distribution followed by resampling processes (binomial, Inverse Transform Sampling Method (ITSM), Metropolis-Hastings Random Walk, Metropolis-Hastings Independent, Gibbs Sampler, Hybrid Gibbs Sampler). We carried out an analysis to identify the most important input parameters for simulation quality and to evaluate performance for each of the resampling algorithms. Our analysis confirms the influence of the type of algorithm used and simulation parameters (i.e. days, number of cases, outbreak shape, overall scale factor) on the results. We show that, regardless of the outbreaks, algorithms and metrics chosen for the evaluation, simulation quality decreased with the increase in the number of days simulated and increased with the number of cases simulated. Simulating outbreaks with fewer cases than days of duration (i.e. overall scale factor less than 1) resulted in an important loss of information during the simulation. We found that Gibbs sampling with a shrinkage procedure provides a good balance between accuracy and data dependency. If dependency is of little importance, binomial and ITSM methods are accurate. Given the constraint of keeping the simulation within a range of plausible epidemiological curves faced by the surveillance system, our study confirms that our approach can be used to generate a large spectrum of outbreak signals.
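The Inverse Transform Sampling Method (ITSM) step can be sketched as resampling case counts from a historical outbreak curve; the curve and the number of cases below are hypothetical:

```python
import bisect
import random

def itsm_resample(day_counts, n_cases, rng):
    """Inverse Transform Sampling Method: treat a historical outbreak curve as a
    discrete distribution over days and redraw n_cases cases from it."""
    total = sum(day_counts)
    cdf, acc = [], 0
    for c in day_counts:
        acc += c
        cdf.append(acc / total)
    sim = [0] * len(day_counts)
    for _ in range(n_cases):
        day = bisect.bisect_left(cdf, rng.random())   # invert the empirical CDF
        sim[day] += 1
    return sim

rng = random.Random(9)
historical = [1, 3, 8, 15, 9, 4, 2, 1]        # hypothetical 8-day outbreak curve
simulated = itsm_resample(historical, n_cases=430, rng=rng)
```

Scaling `n_cases` relative to the number of days is the overall scale factor discussed above; values below 1 starve the simulated curve of information.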
NASA Astrophysics Data System (ADS)
Wu, Yi-Hua; Chan, Chang-Chuan; Rao, Carol Y.; Lee, Chung-Te; Hsu, Hsiao-Hsien; Chiu, Yueh-Hsiu; Chao, H. Jasmine
This study was conducted to investigate the temporal and spatial distributions, compositions, and determinants of ambient aeroallergens in Taipei, Taiwan, a subtropical metropolis. We monitored ambient culturable fungi in Shin-Jhuang City, an urban area, and Shi-Men Township, a rural area, in Taipei metropolis from 2003 to 2004. We collected ambient fungi in the last week of every month during the study period, using duplicate Burkard portable samplers and Malt Extract Agar. The median concentration of total fungi was 1339 colony-forming units m⁻³ of air over the study period. The most prevalent fungi were non-sporulating fungi, Cladosporium, Penicillium, Curvularia and Aspergillus at both sites. Airborne fungal concentrations and diversity of fungal species were generally higher in urban than in rural areas. Most fungal taxa had significant seasonal variations, with higher levels in summer. Multivariate analyses showed that the levels of ambient fungi were associated positively with temperature, but negatively with ozone and several other air pollutants. Relative humidity also had a significant non-linear relationship with ambient fungal levels. We concluded that the concentrations and the compositions of ambient fungi are diverse in urban and rural areas in the subtropical region. High ambient fungal levels were related to an urban environment and environmental conditions of high temperature and low ozone levels.
Gato, Worlanyo E; Acquah, Samuel; Apenteng, Bettye A; Opoku, Samuel T; Boakye, Blessed K
2017-09-01
Despite the significant increase in the incidence of diabetes in Ghana, research in this area has been lagging. The purpose of the study was to assess the risk factors associated with diabetes in the Cape Coast metropolis of Ghana, and to describe nutritional practices and efforts toward lifestyle change. A convenience sample of 482 adults from the Cape Coast metropolis was surveyed using a self-reported questionnaire. The survey collected information on the demographic and socioeconomic characteristics, health status and routine nutritional practices of respondents. The aims of the study were addressed using multivariable regression analyses. A total of 8% of respondents reported that they had been diagnosed with diabetes. Older age and body weight were found to be independently associated with diabetes. Individuals living with diabetes were no more likely than those without diabetes to have taken active steps to reduce their weight. The percentage of self-reported diabetes in this population was consistent with what has been reported in previous studies in Ghana. The findings from this study highlight the need for more patient education on physical activity and weight management. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalkwarf, D.R.
1980-05-01
Airborne uranium products were collected at the perimeter of the uranium-conversion plant operated by the Allied Chemical Corporation at Metropolis, Illinois, and the dissolution rates of these products were classified in terms of the ICRP Task Group Lung Model. Assignments were based on measurements of the dissolution half-times exhibited by uranium components of the dust samples as they dissolved in simulated lung fluid at 37 °C. Based on three trials, the dissolution behavior of dust with aerodynamic equivalent diameter (AED) less than 5.5 μm and collected nearest the closest residence to the plant was classified 0.40 D, 0.60 Y. Based on two trials, the dissolution behavior of dust with AED greater than 5.5 μm and collected at this location was classified 0.37 D, 0.63 Y. Based on one trial, the dissolution behavior of dust with AED less than 5.5 μm and collected at a location on the opposite side of the plant was classified 0.68 D, 0.32 Y. There was some evidence for adsorption of dissolved uranium onto other dust components during dissolution, and preliminary dissolution trials are recommended for future samples in order to optimize the fluid replacement schedule.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Huiying; Ray, Jaideep; Hou, Zhangshuan
In this study we developed an efficient Bayesian inversion framework for interpreting marine seismic amplitude versus angle (AVA) and controlled source electromagnetic (CSEM) data for marine reservoir characterization. The framework uses a multi-chain Markov-chain Monte Carlo (MCMC) sampler, which is a hybrid of the DiffeRential Evolution Adaptive Metropolis (DREAM) and Adaptive Metropolis (AM) samplers. The inversion framework is tested by estimating reservoir-fluid saturations and porosity based on marine seismic and CSEM data. The multi-chain MCMC is scalable in terms of the number of chains, and is useful for computationally demanding Bayesian model calibration in scientific and engineering problems. As a demonstration, the approach is used to efficiently and accurately estimate the porosity and saturations in a representative layered synthetic reservoir. The results indicate that the seismic AVA and CSEM joint inversion provides better estimation of reservoir saturations than the seismic AVA-only inversion, especially for the parameters in deep layers. The performance of the inversion approach for various levels of noise in observational data was evaluated: reasonable estimates can be obtained with noise levels up to 25%. Sampling efficiency due to the use of multiple chains was also checked and was found to have almost linear scalability.
NASA Astrophysics Data System (ADS)
Ren, Huiying; Ray, Jaideep; Hou, Zhangshuan; Huang, Maoyi; Bao, Jie; Swiler, Laura
2017-12-01
In this study we developed an efficient Bayesian inversion framework for interpreting marine seismic Amplitude Versus Angle and Controlled-Source Electromagnetic data for marine reservoir characterization. The framework uses a multi-chain Markov-chain Monte Carlo sampler, which is a hybrid of DiffeRential Evolution Adaptive Metropolis and Adaptive Metropolis samplers. The inversion framework is tested by estimating reservoir-fluid saturations and porosity based on marine seismic and Controlled-Source Electromagnetic data. The multi-chain Markov-chain Monte Carlo is scalable in terms of the number of chains, and is useful for computationally demanding Bayesian model calibration in scientific and engineering problems. As a demonstration, the approach is used to efficiently and accurately estimate the porosity and saturations in a representative layered synthetic reservoir. The results indicate that the seismic Amplitude Versus Angle and Controlled-Source Electromagnetic joint inversion provides better estimation of reservoir saturations than the seismic Amplitude Versus Angle only inversion, especially for the parameters in deep layers. The performance of the inversion approach for various levels of noise in observational data was evaluated - reasonable estimates can be obtained with noise levels up to 25%. Sampling efficiency due to the use of multiple chains was also checked and was found to have almost linear scalability.
Water budget analysis and management for Bangkok Metropolis, Thailand.
Singkran, Nuanchan
2017-09-01
The water budget of the Bangkok Metropolis system was analyzed using a material flow analysis model. Total imported flows into the system were 80,080 million m³ per year (Mm³ y⁻¹), including inflows from the Chao Phraya and Mae Klong rivers and rainwater. Total exported flows out of the system were 78,528 Mm³ y⁻¹, including outflow into the lower Chao Phraya River and tap water (TW) distributed to suburbs. Total rates of stock exchange (1,552 Mm³ y⁻¹) were found in the processes of water recycling, TW distribution, domestic use, swine farming, aquaculture, and paddy fields. Only 21% of the total amount of wastewater (1,255 Mm³ y⁻¹) was collected, with an insufficient treatment capacity of about 415 Mm³ y⁻¹. Domestic and business (industrial and commercial sectors) areas were major point sources, whereas paddy fields were a major non-point source of wastewater. To manage Bangkok's water budget, critical measures have to be considered. Wastewater treatment capacity and the efficiency of wastewater collection should be improved. On-site wastewater treatment plants for residential areas should be installed. Urban planning and land use zoning are suggested to control land use activities. Green technology should be supported to reduce wastewater from farming.
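As an illustrative aside (not part of the original abstract), the mass-balance bookkeeping behind these figures can be checked in a few lines; all numbers are taken directly from the abstract, and the variable names are ours:

```python
# Water-budget balance for the Bangkok Metropolis system (figures from the abstract).
imports_mm3 = 80_080   # total imported flows, Mm^3/y: Chao Phraya, Mae Klong, rainwater
exports_mm3 = 78_528   # total exported flows, Mm^3/y: lower Chao Phraya outflow, TW to suburbs
stock_exchange = imports_mm3 - exports_mm3
print(stock_exchange)  # -> 1552, matching the reported total rate of stock exchange
```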
Geo-products of urban areas: Silesian Metropolis, Southern Poland
NASA Astrophysics Data System (ADS)
Chybiorz, Ryszard; Abramowicz, Anna
2017-04-01
Silesian Metropolis is located in the Silesian Voivodeship, the most important industrial region in Poland. It consists of 14 cities with powiat rights, which form the largest urban center in Poland and one of the largest in Central and Eastern Europe. Almost 2 million people live in its territory. The large concentration of population is associated with industrialization, especially the development of the mining industry (Upper Silesian Coal Basin) and the processing industry (steelworks, textile industry) at the end of the 19th century. One hundred years later, as modern sectors of the economy were being created, restructuring of metallurgy and mining began. The mechanisms and conditions created for the development of post-industrial areas were consistent with the principles of sustainable development and had many new features, including cultural and touristic ones. The Industrial Monuments Route was opened to inhabitants and visitors in October 2006, and joined the European Route of Industrial Heritage (ERIH) in 2010. Its most interesting mining attractions are located in the Silesian Metropolis, and the most frequently visited objects on the route are the Guido Historical Coal Mine in Zabrze and the Historical Silver Mine in Tarnowskie Góry. A project being realized in Zabrze will open to tourists a system of underground corridors that were used for coal transportation in the 19th century. Visitors will be able to actively explore the work of miners, moving by underground boats, railway and suspension railway. Surface mines are also open to geotourists. The Ecological and Geological Education Center GEOsfera was created in a former Triassic quarry in Jaworzno.
Although the area of the Silesian Metropolis has suffered severe environmental devastation, such objects based on features of inanimate nature have been created (and continue to be created), and they have touristic value for the region and the country. Some of them already serve geotouristic purposes.
Finding a Hadamard matrix by simulated annealing of spin vectors
NASA Astrophysics Data System (ADS)
Bayu Suksmono, Andriyan
2017-05-01
Reformulation of a combinatorial problem as the optimization of a statistical-mechanics system enables finding a better solution using heuristics derived from a physical process, such as simulated annealing (SA). In this paper, we present a Hadamard matrix (H-matrix) search method based on SA on an Ising model. By equivalence, an H-matrix can be converted into a seminormalized Hadamard (SH) matrix, whose first column is the unit vector and whose remaining columns are vectors with equal numbers of -1 and +1 entries, called SH-vectors. We define SH spin vectors as representations of the SH vectors, which play a role similar to spins in an Ising model. The topology of the lattice is generalized into a graph whose edges represent orthogonality relationships among the SH spin vectors. Starting from a randomly generated quasi H-matrix Q, a matrix similar to the SH-matrix but without imposed orthogonality, we perform the SA. Transitions of Q are conducted by random exchange of {+, -} spin pairs within the SH spin vectors, following the Metropolis update rule. Upon transition toward zero energy, the Q-matrix evolves along a Markov chain toward an orthogonal matrix, at which point an H-matrix has been found. We demonstrate the capability of the proposed method to find some low-order H-matrices, including ones that cannot trivially be constructed by the Sylvester method.
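As an illustrative aside, the Metropolis update rule invoked above can be sketched in a few lines of Python; the toy energy function and proposal below are ours for illustration, not the paper's spin-pair exchange on SH vectors:

```python
import math
import random

def metropolis_step(state, energy, propose, T, rng=random):
    """One Metropolis update: accept a proposed move with probability min(1, exp(-dE/T))."""
    candidate = propose(state)
    dE = energy(candidate) - energy(state)
    if dE <= 0 or rng.random() < math.exp(-dE / T):
        return candidate  # downhill (or lucky uphill) move: accept
    return state          # otherwise reject and keep the current state

# Toy usage: walking an integer downhill toward the minimum of E(x) = x^2.
# Every proposal lowers the energy, so all 10 moves are accepted deterministically.
x = 10
for _ in range(10):
    x = metropolis_step(x, lambda s: s * s, lambda s: s - 1, T=0.01)
# x ends at 0
```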
Monte Carlo Analysis of Reservoir Models Using Seismic Data and Geostatistical Models
NASA Astrophysics Data System (ADS)
Zunino, A.; Mosegaard, K.; Lange, K.; Melnikova, Y.; Hansen, T. M.
2013-12-01
We present a study on the analysis of petroleum reservoir models consistent with seismic data and geostatistical constraints, performed on a synthetic reservoir model. Our aim is to invert directly for the structure and rock bulk properties of the target reservoir zone. To infer rock facies, porosity and oil saturation, seismology alone is not sufficient; a rock physics model, which links the unknown properties to the elastic parameters, must also be taken into account. We then combine a rock physics model with a simple convolutional approach for seismic waves to invert the "measured" seismograms. To solve this inverse problem, we employ a Markov chain Monte Carlo (MCMC) method, because it offers the possibility to handle non-linearity and complex, multi-step forward models, and provides realistic estimates of uncertainties. However, for large data sets the MCMC method may be impractical because of its very high computational demand. One strategy to face this challenge is to feed the algorithm with realistic models, hence relying on proper prior information. To this end, we utilize an algorithm drawn from geostatistics to generate geologically plausible models that represent samples of the prior distribution. The geostatistical algorithm learns multiple-point statistics from prototype models (in the form of training images), then generates thousands of different models which are accepted or rejected by a Metropolis sampler. To further reduce the computation time we parallelize the software and run it on multi-core machines. The solution of the inverse problem is then represented by a collection of reservoir models in terms of facies, porosity and oil saturation, which constitute samples of the posterior distribution. We are finally able to produce probability maps of the properties of interest by performing statistical analysis on the collection of solutions.
Kyiv Small Rivers in Metropolis Water Objects System
NASA Astrophysics Data System (ADS)
Krelshteyn, P.; Dubnytska, M.
2017-12-01
The article addresses the question of what small underground rivers with artificial watercourses really are: water bodies or city engineering infrastructure objects? The place of such rivers in a metropolis water objects system is identified. The ecological state and the degree of urbanization of small rivers, as well as the dynamics of change in these indicators, are analysed using the example of Kyiv with the help of a water objects cadastre. It was found that registration of small rivers in Kyiv is not conducted, and summary information on such water objects is absent and is not taken into account when making managerial decisions at the urban level. To solve this problem, we propose creating a water bodies accounting system (a water cadastre).
A hierarchical Bayesian method for vibration-based time domain force reconstruction problems
NASA Astrophysics Data System (ADS)
Li, Qiaofeng; Lu, Qiuhai
2018-05-01
Traditional force reconstruction techniques require prior knowledge of the force nature to determine the regularization term. When such information is unavailable, an inappropriate term is easily chosen and the reconstruction result becomes unsatisfactory. In this paper, we propose a novel method to automatically determine the appropriate exponent q in ℓq regularization and reconstruct the force history. The method incorporates all to-be-determined variables, such as the force history, precision parameters and q, into a hierarchical Bayesian formulation. The posterior distributions of the variables are evaluated by a Metropolis-within-Gibbs sampler. Point estimates of the variables and their uncertainties are given. Simulations of a cantilever beam and a space truss under various loading conditions validate the proposed method in providing adaptive determination of q and better reconstruction performance than existing Bayesian methods.
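As an illustrative aside, the Metropolis-within-Gibbs scheme mentioned above (exact draws for variables with closed-form conditionals, random-walk Metropolis for the rest) can be sketched on a toy bivariate normal target; the target, step size and all names are ours, not the paper's force-reconstruction model:

```python
import math
import random

def metropolis_within_gibbs(n_iter, rho=0.5, step=1.0, seed=0):
    """Toy Metropolis-within-Gibbs on a bivariate normal with correlation rho.

    x has a closed-form Gaussian full conditional and is drawn exactly
    (Gibbs step); y is updated by a random-walk Metropolis step on its
    conditional log-density, mirroring the hybrid scheme described above.
    """
    rng = random.Random(seed)
    x = y = 0.0
    s2 = 1.0 - rho * rho  # conditional variance of each coordinate given the other
    samples = []
    for _ in range(n_iter):
        # Gibbs step: x | y ~ N(rho * y, 1 - rho^2), sampled exactly
        x = rng.gauss(rho * y, math.sqrt(s2))

        # Metropolis step for y | x (log-density known only up to a constant)
        def logp(v):
            return -0.5 * (v - rho * x) ** 2 / s2

        prop = y + rng.gauss(0.0, step)
        if rng.random() < math.exp(min(0.0, logp(prop) - logp(y))):
            y = prop
        samples.append((x, y))
    return samples
```

Run long enough, the chain's empirical mean approaches zero and the empirical covariance of (x, y) approaches rho.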
Bayesian evidence computation for model selection in non-linear geoacoustic inference problems.
Dettmer, Jan; Dosso, Stan E; Osler, John C
2010-12-01
This paper applies a general Bayesian inference approach, based on Bayesian evidence computation, to geoacoustic inversion of interface-wave dispersion data. Quantitative model selection is carried out by computing the evidence (normalizing constants) for several model parameterizations using annealed importance sampling. The resulting posterior probability density estimate is compared to estimates obtained from Metropolis-Hastings sampling to ensure consistent results. The approach is applied to invert interface-wave dispersion data collected on the Scotian Shelf, off the east coast of Canada for the sediment shear-wave velocity profile. Results are consistent with previous work on these data but extend the analysis to a rigorous approach including model selection and uncertainty analysis. The results are also consistent with core samples and seismic reflection measurements carried out in the area.
Minimal model for the secondary structures and conformational conversions in proteins
NASA Astrophysics Data System (ADS)
Imamura, Hideo
Better understanding of the protein folding process can provide physical insights into the function of proteins and makes it possible to benefit from the genetic information accumulated so far. The protein folding process normally takes place in less than seconds, but even seconds are beyond the reach of current computational power for simulations of a system in all-atom detail. Hence, to model and explore the protein folding process it is crucial to construct a proper model that can adequately describe the physical process and mechanism for the relevant time scale. We discuss a reduced off-lattice model that can express α-helix and β-hairpin conformations defined solely by a given sequence, in order to investigate the folding mechanism of conformations such as a β-hairpin and also to investigate conformational conversions in proteins. The first two chapters introduce and review essential concepts in protein folding, the modelling of physical interactions in proteins, and various simple models, and also review computational methods, in particular the Metropolis Monte Carlo method, its dynamic interpretation, and thermodynamic Monte Carlo algorithms. Chapter 3 describes the minimalist model that represents both α-helix and β-sheet conformations using simple potentials. The native conformation can be specified by the sequence without particular conformational biases toward a reference state. In Chapter 4, the model is used to investigate the folding mechanism of β-hairpins exhaustively using dynamic Monte Carlo and a thermodynamic Monte Carlo method, an efficient combination of multicanonical Monte Carlo and the weighted histogram analysis method. We show that the major folding pathways and folding rate depend on the location of a hydrophobic pair. The conformational conversions between α-helix and β-sheet conformations are examined in Chapters 5 and 6.
First, the conformational conversion due to mutation in a non-hydrophobic system is examined, and then the conformational conversion due to mutation with a hydrophobic pair at a different position is examined at various temperatures.
Caranica, C; Al-Omari, A; Deng, Z; Griffith, J; Nilsen, R; Mao, L; Arnold, J; Schüttler, H-B
2018-01-01
A major challenge in systems biology is to infer the parameters of regulatory networks that operate in a noisy environment, such as in a single cell. In a stochastic regime it is hard to distinguish noise from the real signal and to infer the noise contribution to the dynamical behavior. When the genetic network displays oscillatory dynamics, it is even harder to infer the parameters that produce the oscillations. To address this issue we introduce a new estimation method built on a combination of stochastic simulations, mass action kinetics and ensemble network simulations in which we match the average periodogram and phase of the model to that of the data. The method is relatively fast (compared to Metropolis-Hastings Monte Carlo Methods), easy to parallelize, applicable to large oscillatory networks and large (~2000 cells) single cell expression data sets, and it quantifies the noise impact on the observed dynamics. Standard errors of estimated rate coefficients are typically two orders of magnitude smaller than the mean from single cell experiments with on the order of ~1000 cells. We also provide a method to assess the goodness of fit of the stochastic network using the Hilbert phase of single cells. An analysis of phase departures from the null model with no communication between cells is consistent with a hypothesis of Stochastic Resonance describing single cell oscillators. Stochastic Resonance provides a physical mechanism whereby intracellular noise plays a positive role in establishing oscillatory behavior, but may require model parameters, such as rate coefficients, that differ substantially from those extracted at the macroscopic level from measurements on populations of millions of communicating, synchronized cells.
Semiclassical propagation of Wigner functions.
Dittrich, T; Gómez, E A; Pachón, L A
2010-06-07
We present a comprehensive study of semiclassical phase-space propagation in the Wigner representation, emphasizing numerical applications, in particular as an initial-value representation. Two semiclassical approximation schemes are discussed. The propagator of the Wigner function based on van Vleck's approximation replaces the Liouville propagator by a quantum spot with an oscillatory pattern reflecting the interference between pairs of classical trajectories. Employing phase-space path integration instead, caustics in the quantum spot are resolved in terms of Airy functions. We apply both to two benchmark models of nonlinear molecular potentials, the Morse oscillator and the quartic double well, to test them in standard tasks such as computing autocorrelation functions and propagating coherent states. The performance of semiclassical Wigner propagation is very good even in the presence of marked quantum effects, e.g., in coherent tunneling and in propagating Schrödinger cat states, and of classical chaos in four-dimensional phase space. We suggest options for an effective numerical implementation of our method and for integrating it in Monte-Carlo-Metropolis algorithms suitable for high-dimensional systems.
Activated aging dynamics and effective trap model description in the random energy model
NASA Astrophysics Data System (ADS)
Baity-Jesi, M.; Biroli, G.; Cammarota, C.
2018-01-01
We study the out-of-equilibrium aging dynamics of the random energy model (REM) ruled by single spin-flip Metropolis dynamics. We focus on the dynamical evolution taking place on time scales diverging with the system size. Our aim is to show to what extent the activated dynamics displayed by the REM can be described in terms of an effective trap model. We identify two time regimes: the first corresponds to the process of escaping from a basin in the energy landscape and the subsequent exploration of high-energy configurations, whereas the second corresponds to the evolution from one deep basin to another. By combining numerical simulations with analytical arguments we show why the trap model description does not hold in the former regime but becomes exact in the second.
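As an illustrative aside, single spin-flip Metropolis dynamics on a toy REM can be sketched as follows; the system size, temperature and seed are our own illustrative choices, far smaller than anything used in the study above:

```python
import math
import random

def rem_metropolis(N=8, T=1.0, steps=2000, seed=1):
    """Single spin-flip Metropolis dynamics on a toy Random Energy Model.

    Each of the 2^N spin configurations gets an i.i.d. Gaussian energy of
    standard deviation sqrt(N), as in the REM; a move flips one randomly
    chosen spin and is accepted by the Metropolis rule at temperature T.
    Returns the trace of visited energies.
    """
    rng = random.Random(seed)
    energies = [rng.gauss(0.0, math.sqrt(N)) for _ in range(2 ** N)]
    config = rng.randrange(2 ** N)          # random initial configuration
    trace = [energies[config]]
    for _ in range(steps):
        cand = config ^ (1 << rng.randrange(N))   # flip one spin
        dE = energies[cand] - energies[config]
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            config = cand
        trace.append(energies[config])
    return trace
```

At low temperature the trace shows exactly the basin-trapping behavior discussed above: long stretches at deep energies punctuated by activated escapes.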
Nuclear shape evolution based on microscopic level densities
Ward, D. E.; Carlsson, B. G.; Døssing, T.; ...
2017-02-27
Here, by combining microscopically calculated level densities with the Metropolis walk method, we develop a consistent framework for treating the energy and angular-momentum dependence of the nuclear shape evolution in the fission process. For each nucleus under consideration, the level density is calculated microscopically for each of more than five million shapes with a recently developed combinatorial method. The method employs the same single-particle levels as those used for the extraction of the pairing and shell contributions to the macroscopic-microscopic deformation-energy surface. Containing no new parameters, the treatment is suitable for elucidating the energy dependence of the dynamics of warm nuclei on pairing and shell effects. It is illustrated for the fission fragment mass distribution for several uranium and plutonium isotopes of particular interest.
Markov Chain Monte Carlo Inference of Parametric Dictionaries for Sparse Bayesian Approximations
Chaspari, Theodora; Tsiartas, Andreas; Tsilifis, Panagiotis; Narayanan, Shrikanth
2016-01-01
Parametric dictionaries can increase the ability of sparse representations to meaningfully capture and interpret the underlying signal information, such as encountered in biomedical problems. Given a mapping function from the atom parameter space to the actual atoms, we propose a sparse Bayesian framework for learning the atom parameters, because of its ability to provide full posterior estimates, take uncertainty into account and generalize on unseen data. Inference is performed with Markov chain Monte Carlo, which uses block sampling to generate the variables of the Bayesian problem. Since the parameterization of dictionary atoms results in posteriors that cannot be analytically computed, we use a Metropolis-Hastings-within-Gibbs framework, according to which variables with closed-form posteriors are generated with the Gibbs sampler, while the remaining ones are generated with Metropolis-Hastings from appropriate candidate-generating densities. We further show that the corresponding Markov chain is uniformly ergodic, ensuring its convergence to a stationary distribution independently of the initial state. Results on synthetic data and real biomedical signals indicate that our approach offers advantages in terms of signal reconstruction compared to previously proposed Steepest Descent and Equiangular Tight Frame methods. This paper demonstrates the ability of Bayesian learning to generate parametric dictionaries that can reliably represent the exemplar data and provides the foundation towards inferring the entire variable set of the sparse approximation problem for signal denoising, adaptation and other applications. PMID:28649173
King, Samuel B.; Lapidus, Mariana
2015-01-01
Objective: The authors' goal was to assess changes in the role of librarians in informatics education from 2004 to 2013. This is a follow-up to “Metropolis Redux: The Unique Importance of Library Skills in Informatics,” a 2004 survey of informatics programs. Methods: An electronic survey was conducted in January 2013 and sent to librarians via the MEDLIB-L email discussion list, the library section of the American Association of Colleges of Pharmacy, the Medical Informatics Section of the Medical Library Association, the Information Technology Interest Group of the Association of College and Research Libraries/New England Region, and various library directors across the country. Results: Librarians from fifty-five institutions responded to the survey. Of these respondents, thirty-four included librarians in nonlibrary aspects of informatics training. Fifteen institutions have librarians participating in leadership positions in their informatics programs. Compared to the earlier survey, the role of librarians has evolved. Conclusions: Librarians possess skills that enable them to participate in informatics programs beyond a narrow library focus. Librarians currently perform significant leadership roles in informatics education. There are opportunities for librarian interdisciplinary collaboration in informatics programs. Implications: Informatics is much more than the study of technology. The information skills that librarians bring to the table enrich and broaden the study of informatics in addition to adding value to the library profession itself. PMID:25552939
Li, Xianfeng; Murthy, Sanjeeva; Latour, Robert A.
2011-01-01
A new empirical sampling method termed “temperature intervals with global exchange of replicas and reduced radii” (TIGER3) is presented and demonstrated to efficiently equilibrate entangled long-chain molecular systems such as amorphous polymers. The TIGER3 algorithm is a replica exchange method in which simulations are run in parallel over a range of temperature levels at and above a designated baseline temperature. The replicas sampled at temperature levels above the baseline are run through a series of cycles with each cycle containing four stages – heating, sampling, quenching, and temperature level reassignment. The method allows chain segments to pass through one another at elevated temperature levels during the sampling stage by reducing the van der Waals radii of the atoms, thus eliminating chain entanglement problems. Atomic radii are then returned to their regular values and re-equilibrated at elevated temperature prior to quenching to the baseline temperature. Following quenching, replicas are compared using a Metropolis Monte Carlo exchange process for the construction of an approximate Boltzmann-weighted ensemble of states and then reassigned to the elevated temperature levels for additional sampling. Further system equilibration is performed by periodic implementation of the previously developed TIGER2 algorithm between cycles of TIGER3, which applies thermal cycling without radii reduction. When coupled with a coarse-grained modeling approach, the combined TIGER2/TIGER3 algorithm yields fast equilibration of bulk-phase models of amorphous polymer, even for polymers with complex, highly branched structures. The developed method was tested by modeling the polyethylene melt. The calculated properties of chain conformation and chain segment packing agreed well with published data. 
The method was also applied to generate equilibrated structural models of three increasingly complex amorphous polymer systems: poly(methyl methacrylate), poly(butyl methacrylate), and DTB-succinate copolymer. Calculated glass transition temperature (Tg) and structural parameter profile (S(q)) for each resulting polymer model were found to be in close agreement with experimental Tg values and structural measurements obtained by x-ray diffraction, thus validating that the developed methods provide realistic models of amorphous polymer structure. PMID:21769156
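As an illustrative aside, the replica-comparison step in schemes of this family reduces to the standard Metropolis exchange criterion between two temperature levels; the sketch below shows only that generic rule (TIGER2/3 additionally quenches replicas to the baseline before comparison, which is not reproduced here):

```python
import math
import random

def accept_swap(E_low, T_low, E_high, T_high, rng=random):
    """Metropolis criterion for exchanging replicas between two temperature levels.

    The swap of configurations with energies E_low (at temperature T_low)
    and E_high (at T_high > T_low) is accepted with probability
    min(1, exp((1/T_low - 1/T_high) * (E_low - E_high))).
    """
    delta = (1.0 / T_low - 1.0 / T_high) * (E_low - E_high)
    return delta >= 0 or rng.random() < math.exp(delta)
```

A swap that sends the lower-energy configuration to the lower temperature is always accepted; the reverse direction is accepted only probabilistically, which preserves the Boltzmann ensemble at each level.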
NASA Astrophysics Data System (ADS)
Touhidul Mustafa, Syed Md.; Nossent, Jiri; Ghysels, Gert; Huysmans, Marijke
2017-04-01
Transient numerical groundwater flow models have been used to understand and forecast groundwater flow systems under anthropogenic and climatic effects, but the reliability of the predictions is strongly influenced by different sources of uncertainty. Hence, researchers in hydrological sciences are developing and applying methods for uncertainty quantification. Nevertheless, spatially distributed flow models pose significant challenges for parameter and spatially distributed input estimation and uncertainty quantification. In this study, we present a general and flexible approach for input and parameter estimation and uncertainty analysis of groundwater models. The proposed approach combines a fully distributed groundwater flow model (MODFLOW) with the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. To avoid over-parameterization, the uncertainty of the spatially distributed model input has been represented by multipliers. The posterior distributions of these multipliers and the regular model parameters were estimated using DREAM. The proposed methodology has been applied in an overexploited aquifer in Bangladesh where groundwater pumping and recharge data are highly uncertain. The results confirm that input uncertainty does have a considerable effect on the model predictions and parameter distributions. Additionally, our approach also provides a new way to optimize the spatially distributed recharge and pumping data along with the parameter values under uncertain input conditions. It can be concluded from our approach that considering model input uncertainty along with parameter uncertainty is important for obtaining realistic model predictions and a correct estimation of the uncertainty bounds.
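As an illustrative aside, the core of DREAM-style samplers like the one used above is a differential-evolution proposal, in which a chain's candidate state is built from the difference of two other chains; the sketch below shows only that proposal step (the subspace sampling, crossover and outlier handling of full DREAM are omitted, and all names are ours):

```python
import random

def de_proposal(chains, i, gamma=None, eps=1e-6, rng=random):
    """Differential-evolution proposal of the kind at the core of DREAM.

    The candidate for chain i is its current state plus a scaled difference
    of two other randomly chosen chains, plus small Gaussian jitter; gamma
    defaults to the standard 2.38 / sqrt(2 d) for d-dimensional parameters.
    """
    d = len(chains[i])
    if gamma is None:
        gamma = 2.38 / (2.0 * d) ** 0.5
    a, b = rng.sample([j for j in range(len(chains)) if j != i], 2)
    return [chains[i][k] + gamma * (chains[a][k] - chains[b][k]) + rng.gauss(0.0, eps)
            for k in range(d)]
```

Because the jump scale adapts to the spread of the chain population, the proposal automatically matches the shape of the posterior, which is what makes the multi-chain scheme efficient for high-dimensional calibration problems.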
Ren, Huiying; Ray, Jaideep; Hou, Zhangshuan; ...
2017-10-17
In this paper we developed an efficient Bayesian inversion framework for interpreting marine seismic Amplitude Versus Angle and Controlled-Source Electromagnetic data for marine reservoir characterization. The framework uses a multi-chain Markov-chain Monte Carlo sampler, which is a hybrid of DiffeRential Evolution Adaptive Metropolis and Adaptive Metropolis samplers. The inversion framework is tested by estimating reservoir-fluid saturations and porosity based on marine seismic and Controlled-Source Electromagnetic data. The multi-chain Markov-chain Monte Carlo is scalable in terms of the number of chains, and is useful for computationally demanding Bayesian model calibration in scientific and engineering problems. As a demonstration, the approach is used to efficiently and accurately estimate the porosity and saturations in a representative layered synthetic reservoir. The results indicate that the seismic Amplitude Versus Angle and Controlled-Source Electromagnetic joint inversion provides better estimation of reservoir saturations than the seismic Amplitude Versus Angle only inversion, especially for the parameters in deep layers. The performance of the inversion approach for various levels of noise in observational data was evaluated: reasonable estimates can be obtained with noise levels up to 25%. Sampling efficiency due to the use of multiple chains was also checked and was found to have almost linear scalability.
Peltola, Tomi; Marttinen, Pekka; Vehtari, Aki
2012-01-01
High-dimensional datasets with large amounts of redundant information are nowadays available for hypothesis-free exploration of scientific questions. A particular case is genome-wide association analysis, where variations in the genome are searched for effects on disease or other traits. Bayesian variable selection has been demonstrated as a possible analysis approach, which can account for the multifactorial nature of the genetic effects in a linear regression model. Yet, the computation presents a challenge and application to large-scale data is not routine. Here, we study aspects of the computation using the Metropolis-Hastings algorithm for the variable selection: finite adaptation of the proposal distributions, multistep moves for changing the inclusion state of multiple variables in a single proposal and multistep move size adaptation. We also experiment with a delayed rejection step for the multistep moves. Results on simulated and real data show increase in the sampling efficiency. We also demonstrate that with application specific proposals, the approach can overcome a specific mixing problem in real data with 3822 individuals and 1,051,811 single nucleotide polymorphisms and uncover a variant pair with synergistic effect on the studied trait. Moreover, we illustrate multimodality in the real dataset related to a restrictive prior distribution on the genetic effect sizes and advocate a more flexible alternative. PMID:23166669
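The single-bit inclusion move at the heart of Metropolis-Hastings variable selection can be sketched as follows. The data, prior variances, and simplified conjugate marginal likelihood below are illustrative assumptions; the paper's multistep moves, adaptation, and delayed rejection are omitted:

```python
import numpy as np

def log_marginal(y, X, gamma, tau2=1.0, sigma2=1.0):
    """Log marginal likelihood (up to a constant) of the included columns,
    with a Gaussian prior N(0, tau2) on the included coefficients."""
    Xg = X[:, gamma.astype(bool)]
    n = len(y)
    C = sigma2 * np.eye(n) + tau2 * Xg @ Xg.T
    sign, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y))

def mh_variable_selection(y, X, n_iter=2000, seed=1):
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    gamma = np.zeros(p, int)              # start with no variables included
    lp = log_marginal(y, X, gamma)
    incl = np.zeros(p)
    for _ in range(n_iter):
        j = rng.integers(p)               # propose flipping one inclusion bit
        prop = gamma.copy()
        prop[j] ^= 1
        lp_prop = log_marginal(y, X, prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            gamma, lp = prop, lp_prop
        incl += gamma
    return incl / n_iter                  # posterior inclusion frequencies

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5))
y = X @ np.array([2.0, 0.0, 0.0, -1.5, 0.0]) + 0.5 * rng.normal(size=80)
freq = mh_variable_selection(y, X)
```

The multistep moves studied in the paper generalize this by flipping several inclusion bits per proposal.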
Wang, Hongrui; Wang, Cheng; Wang, Ying; ...
2017-04-05
This paper presents a Bayesian approach using a Metropolis-Hastings Markov chain Monte Carlo algorithm and applies this method to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of daily flow rate, it outperforms the conventional MLE method on uncertainty quantification, providing a relatively narrower reliable interval than the MLE confidence interval and thus a more precise estimate by using the related information from regional gage stations. As a result, the Bayesian MCMC method may be more favorable for uncertainty analysis and risk management.
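Extracting a credible interval from a Metropolis-Hastings chain can be sketched on a toy flow model. The lognormal likelihood, known scale, flat prior, and synthetic data below are illustrative assumptions, not the paper's watershed model:

```python
import numpy as np

rng = np.random.default_rng(42)
flows = rng.lognormal(mean=2.0, sigma=0.5, size=200)  # synthetic daily flows

def log_post(mu):
    # Lognormal likelihood with known sigma = 0.5 and a flat prior on mu
    return -0.5 * np.sum((np.log(flows) - mu) ** 2) / 0.5**2

samples, mu, lp = [], 0.0, log_post(0.0)
for _ in range(4000):
    prop = mu + 0.1 * rng.normal()          # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis-Hastings acceptance
        mu, lp = prop, lp_prop
    samples.append(mu)

post = np.array(samples[1000:])             # discard burn-in
lo, hi = np.percentile(post, [2.5, 97.5])   # 95% credible interval
```

Unlike an MLE confidence interval, the interval here is read directly off the posterior samples.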
The New Sectionalism: I. Metropolis Without Growth
ERIC Educational Resources Information Center
Alonso, William
1978-01-01
This article suggests that there are three principal sources of metropolitan population decline: the declining birth rate, the reversal of rural-to-urban migration, and inter-metropolitan migration. (Author/AM)
Improving and Evaluating Nested Sampling Algorithm for Marginal Likelihood Estimation
NASA Astrophysics Data System (ADS)
Ye, M.; Zeng, X.; Wu, J.; Wang, D.; Liu, J.
2016-12-01
With the growing impacts of climate change and human activities on the water cycle, an increasing number of studies focus on quantifying modeling uncertainty. Bayesian model averaging (BMA) provides a popular framework for quantifying conceptual model and parameter uncertainty. The ensemble prediction is generated by combining each plausible model's prediction, and each model carries a weight determined by its prior weight and marginal likelihood. Thus, estimating a model's marginal likelihood is crucial for reliable and accurate BMA prediction. The nested sampling estimator (NSE) is a recently proposed method for marginal likelihood estimation. NSE proceeds by searching the parameter space gradually from low-likelihood regions to high-likelihood regions, and this evolution is carried out iteratively via a local sampling procedure. Thus, the efficiency of NSE is dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm is often used for local sampling. However, M-H is not an efficient sampling algorithm for high-dimensional or complicated parameter spaces. To improve the efficiency of NSE, it is natural to incorporate a robust and efficient sampling algorithm, DREAMzs, into the local sampling of NSE. The comparison results demonstrate that the improved NSE increases the efficiency of marginal likelihood estimation significantly. However, both the improved and original NSEs suffer from considerable instability. In addition, the heavy computational cost of the large number of model executions is reduced by using adaptive sparse grid surrogates.
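The basic NSE recursion is compact enough to sketch. The toy below uses a uniform prior on [0, 1] and a narrow Gaussian likelihood whose true marginal likelihood is close to 1 (so log Z is close to 0); the replacement step uses naive rejection from the prior where a real implementation would use an M-H or DREAMzs local sampler:

```python
import numpy as np

def nested_sampling(log_like, n_live=100, n_iter=600, seed=0):
    """Minimal nested sampling estimate of the log marginal likelihood."""
    rng = np.random.default_rng(seed)
    live = rng.uniform(size=n_live)            # draws from a U(0, 1) prior
    ll = np.array([log_like(x) for x in live])
    log_z = -np.inf
    for i in range(n_iter):
        worst = int(np.argmin(ll))
        # prior-mass shell width: X_i - X_{i+1}, with X_i = exp(-i / n_live)
        log_w = -i / n_live + np.log(-np.expm1(-1.0 / n_live))
        log_z = np.logaddexp(log_z, ll[worst] + log_w)
        # replace the worst live point by a prior draw above the threshold
        # (rejection sampling; real NSE uses a local M-H/DREAMzs step here)
        thresh = ll[worst]
        while True:
            x = rng.uniform()
            if log_like(x) > thresh:
                live[worst], ll[worst] = x, log_like(x)
                break
    return log_z

# Gaussian likelihood centered at 0.5 with sd 0.05; its integral over [0, 1]
# is essentially 1, so the true log Z is approximately 0
log_like = lambda x: -0.5 * ((x - 0.5) / 0.05) ** 2 - np.log(0.05 * np.sqrt(2 * np.pi))
log_z = nested_sampling(log_like)
```

The quality of the `while` loop's replacement draw is exactly the "strength of the local sampling procedure" the abstract refers to.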
Modelling malaria incidence by an autoregressive distributed lag model with spatial component.
Laguna, Francisco; Grillet, María Eugenia; León, José R; Ludeña, Carenne
2017-08-01
The influence of climatic variables on the dynamics of human malaria has been widely highlighted. It is also known that this mosquito-borne infection varies in space and time. However, when the data are spatially incomplete, the most popular spatio-temporal methods of analysis cannot be applied directly. In this paper, we develop a two-step methodology to model the spatio-temporal dependence of malaria incidence on local rainfall, temperature, and humidity, as well as regional sea surface temperatures (SST), on the northern coast of Venezuela. First, we fit an autoregressive distributed lag model (ARDL) to the weekly data, and then we adjust a linear separable spatial vector autoregressive model (VAR) to the residuals of the ARDL. Finally, the model parameters are tuned using a Markov chain Monte Carlo (MCMC) procedure derived from the Metropolis-Hastings algorithm. Our results show that the best model to account for the variations of malaria incidence from 2001 to 2008 in 10 endemic municipalities in north-eastern Venezuela is a logit model that includes the accumulated local precipitation in combination with the local maximum temperature of the preceding month as positive regressors. Additionally, we show that although malaria dynamics are highly heterogeneous in space, a detailed analysis of the estimated spatial parameters in our model yields important insights regarding the joint behavior of disease incidence across the different counties in our study.
Computational Studies of Strongly Correlated Quantum Matter
NASA Astrophysics Data System (ADS)
Shi, Hao
The study of strongly correlated quantum many-body systems is an outstanding challenge. Highly accurate results are needed for the understanding of practical and fundamental problems in condensed-matter physics, high-energy physics, materials science, quantum chemistry, and so on. Familiar mean-field or perturbative methods tend to be ineffective. Numerical simulations provide a promising approach for studying such systems. The fundamental difficulty of numerical simulation is that the dimension of the Hilbert space needed to describe interacting systems increases exponentially with the system size. Quantum Monte Carlo (QMC) methods are among the best approaches to tackle the problem of the enormous Hilbert space. They have been highly successful for boson systems and unfrustrated spin models. For systems with fermions, the exchange symmetry in general causes the infamous sign problem, making the statistical noise in the computed results grow exponentially with the system size. This hinders our understanding of interesting physics such as high-temperature superconductivity and the metal-insulator phase transition. In this thesis, we present a variety of new developments in auxiliary-field quantum Monte Carlo (AFQMC) methods, including incorporating symmetry in both the trial wave function and the projector, developing the constraint-release method, using the force bias to drastically improve efficiency in the Metropolis framework, identifying and solving the infinite-variance problem, and sampling Hartree-Fock-Bogoliubov wave functions. With these developments, some of the most challenging many-electron problems are now under control.
We obtain an exact numerical solution of two-dimensional strongly interacting Fermi atomic gas, determine the ground state properties of the 2D Fermi gas with Rashba spin-orbit coupling, provide benchmark results for the ground state of the two-dimensional Hubbard model, and establish that the Hubbard model has a stripe order in the underdoped region.
Mckee, D L
1985-01-01
The tendency toward hypertrophy of large metropolitan areas in the Third World has been a subject of concern to economists and other social scientists for some time. Inability to absorb vast waves of migrants into the organized labor force or to provide adequate infrastructure and services are serious problems in many growing cities of Asia, Africa, and Latin America. A different phenomenon created by perpetual urban expansion has been relatively neglected: the problems caused when preexisting urban areas are absorbed into the metropolis. The tendency of squatter settlements to constrict normal urban growth and expansion and to impede rational provision of services has been recognized, but the absorption of small cities does not necessarily produce identical problems. Small cities absorbed into a metropolis lose their identity in the successive waves of suburban proliferation. Los Angeles in the US may be considered the prototype of the phenomenon in which multiple preexisting urban zones are absorbed into the same metropolis without formation of any visible center of gravity. In some cases, small cities may be completely engulfed by the encroaching metropolis, if transit routes or availability of land makes them interesting to developers. The livelihood of residents may be threatened if they are no longer able to cultivate gardens or raise small animals. Local services may deteriorate. The youngest and most able residents are likely to abandon such places for the greater opportunities of the city, leaving the aged and less qualified to fend for themselves. Jobs may disappear and traditional commercial relations may be destroyed without being replaced. The future wellbeing of residents depends on their ability to maneuver in the new metropolitan environment, but many will be unable to adjust for lack of training, the weight of immovable property, or diverse personal considerations. 
Planning could help to reduce the problems that occasional survival of some small entities may pose for rational expansion of transportation and services at the metropolitan level, but many Third World cities lack such planning capacity altogether.
Obiri-Yeboah, Dorcas; Asante Awuku, Yaw; Adu, Joseph; Pappoe, Faustina; Obboh, Evans; Nsiah, Paul; Amoako-Sakyi, Daniel; Simpore, Jacques
2018-01-01
Hepatitis E virus (HEV) is an emerging infection in Africa with poor maternal and foetal outcomes. There are scant data on the sero-prevalence of HEV infection among pregnant women in Ghana. This study highlights the prevalence of, and risk factors associated with, HEV infection among pregnant women in the Cape Coast Metropolis, Central Region of Ghana. A multicenter (3 selected sites) analytical cross-sectional study involving 398 pregnant women in the Cape Coast Metropolis was conducted. HEV (anti-HEV IgG and anti-HEV IgM) ELISA was performed. Sero-positive women had liver chemistries done, and data were collected on maternal and neonatal outcomes. Data analyses were performed using Stata version 13 software (STATA Corp, Texas, USA). Mean age was 28.01 (± 5.93) years. HEV sero-prevalence was 12.2% (n = 48) for IgG and 0.2% (n = 1) for IgM, with an overall sero-prevalence of 12.3%. The odds of being HEV sero-positive were 3.1 (95% CI: 1.1-8.1, p = 0.02) for women aged 26-35 years and 10.7 (95% CI: 3.4-33.5, p = 0.0001) for women aged ≥36 years. Living in an urban settlement was associated with the lowest odds of HEV infection (OR 0.4; 95% CI: 0.2-0.8; p = 0.01). Factors with no statistical evidence of association included main source of drinking water and history of blood transfusion. The sero-prevalence of HEV IgG increased progressively across trimesters and was highest among women in their third trimester (55.3%). None of the 49 HEV sero-positive women had an elevated ALT level. Ten of the 41 neonates born to sero-positive women developed jaundice in the neonatal period. The mean birth weight was 3.1 kg (SD 0.4). HEV sero-prevalence among pregnant women in the Cape Coast Metropolis is high enough to deserve more attention than it has received so far. It is therefore important to conduct further research on the potential impact on maternal and neonatal mortality and morbidity in Ghana.
BCL::MP-Fold: membrane protein structure prediction guided by EPR restraints
Fischer, Axel W.; Alexander, Nathan S.; Woetzel, Nils; Karakaş, Mert; Weiner, Brian E.; Meiler, Jens
2016-01-01
For many membrane proteins, the determination of their topology remains a challenge for methods like X-ray crystallography and nuclear magnetic resonance (NMR) spectroscopy. Electron paramagnetic resonance (EPR) spectroscopy has evolved as an alternative technique to study the structure and dynamics of membrane proteins. The present study demonstrates the feasibility of membrane protein topology determination using limited EPR distance and accessibility measurements. The BCL::MP-Fold algorithm assembles secondary structure elements (SSEs) in the membrane using a Monte Carlo Metropolis (MCM) approach. Sampled models are evaluated using a knowledge-based energy function and agreement with the EPR data. Twenty-nine membrane proteins of up to 696 residues are used to test the algorithm. The protein-size-normalized root-mean-square deviation (RMSD100) value of the most accurate model is better than 8 Å for twenty-seven, better than 6 Å for twenty-two, and better than 4 Å for fifteen of the twenty-nine proteins, demonstrating the algorithm's ability to sample the native topology. The average enrichment could be improved from 1.3 to 2.5, showing the improved discrimination power gained by using EPR data. PMID:25820805
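The MCM assembly idea, accepting score-worsening moves with Boltzmann probability so the search can escape local minima, can be sketched generically. The "restraint" score below is a hypothetical stand-in (three points with target pairwise distances), not BCL::MP-Fold's energy function, and the temperature and step size are arbitrary tuning assumptions:

```python
import numpy as np

def mcm_minimize(score, x0, n_steps=5000, step=0.1, T=0.1, seed=0):
    """Monte Carlo Metropolis minimization: random perturbations are
    accepted if they lower the score, or otherwise with probability
    exp(-(score increase) / T)."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, float)
    s = score(x)
    best_x, best_s = x.copy(), s
    for _ in range(n_steps):
        prop = x + step * rng.normal(size=x.shape)
        sp = score(prop)
        if sp < s or rng.uniform() < np.exp((s - sp) / T):
            x, s = prop, sp
            if s < best_s:
                best_x, best_s = x.copy(), s
    return best_x, best_s

# Toy "restraint" score: squared error to hypothetical target distances
targets = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 2.0}
def score(pts):
    p = pts.reshape(3, 2)
    return sum((np.linalg.norm(p[i] - p[j]) - d) ** 2
               for (i, j), d in targets.items())

best, s_best = mcm_minimize(score, np.zeros(6))
```

In the real algorithm, the perturbations are rigid-body moves of SSEs rather than Gaussian jitter of coordinates.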
Guo, Qingbin; Kang, Ji; Wu, Yan; Cui, Steve W; Hu, Xinzhong; Yada, Rickey Y
2015-12-10
The structure-conformation relationships of a heteropolysaccharide, (GlcpA)Xylan, were investigated using computer modeling in terms of molecular weight, Xylp/GlcpA ratio, and the distribution of GlcpA along the xylan chain. The adiabatic contour maps of xylobiose, XylpXylp(GlcpA), and (GlcpA)XylpXylp(GlcpA) indicated that insertion of the side group (GlcpA) influenced the accessible conformational space of the xylobiose molecule. The RIS Metropolis Monte Carlo method indicated that insertion of the GlcpA side chain lowered the calculated chain extension at a low GlcpA:Xylp ratio (1:3); the chain, however, became extended when the GlcpA:Xylp ratio rose above 2:3. It was also shown that the spatial extension of the polymer chains depended on the distribution of the side chain: the random distribution gave the most flexible structure compared to the block and alternating distributions. The present studies provide a unique insight into the dependence of the stiffness and flexibility of various (GlcpA)Xylan molecules on both side-chain ratio and distribution.
Building test data from real outbreaks for evaluating detection algorithms
Texier, Gaetan; Jackson, Michael L.; Siwe, Leonel; Meynard, Jean-Baptiste; Deparis, Xavier; Chaudet, Herve
2017-01-01
Benchmarking surveillance systems requires realistic simulations of disease outbreaks. However, obtaining these data in sufficient quantity, with a realistic shape, and covering a sufficient range of agents, sizes, and durations is known to be very difficult. The dataset of outbreak signals generated should reflect the likely distribution of authentic situations faced by the surveillance system, including very unlikely outbreak signals. We propose and evaluate a new approach based on the use of historical outbreak data to simulate tailored outbreak signals. The method relies on a homothetic transformation of the historical distribution followed by resampling processes (Binomial, Inverse Transform Sampling Method (ITSM), Metropolis-Hastings Random Walk, Metropolis-Hastings Independent, Gibbs Sampler, Hybrid Gibbs Sampler). We carried out an analysis to identify the most important input parameters for simulation quality and to evaluate performance for each of the resampling algorithms. Our analysis confirms the influence of the type of algorithm used and of the simulation parameters (i.e., days, number of cases, outbreak shape, overall scale factor) on the results. We show that, regardless of the outbreaks, algorithms, and metrics chosen for the evaluation, simulation quality decreased with the number of days simulated and increased with the number of cases simulated. Simulating outbreaks with fewer cases than days of duration (i.e., an overall scale factor less than 1) resulted in an important loss of information during the simulation. We found that Gibbs sampling with a shrinkage procedure provides a good balance between accuracy and data dependency. If dependency is of little importance, the binomial and ITSM methods are accurate. Given the constraint of keeping the simulation within the range of plausible epidemiological curves faced by the surveillance system, our study confirms that our approach can be used to generate a large spectrum of outbreak signals. PMID:28863159
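Of the resampling schemes listed, inverse transform sampling (ITSM) is the simplest to sketch: build the empirical CDF of a historical epidemic curve and invert it with uniform draws. The curve and counts below are toy illustrative values, not data from the study:

```python
import numpy as np

def inverse_transform_sample(counts, n, rng):
    """Resample n case-days from the empirical distribution of a
    historical outbreak curve via inverse transform sampling."""
    cdf = np.cumsum(counts) / np.sum(counts)  # empirical CDF over days
    u = rng.uniform(size=n)
    return np.searchsorted(cdf, u)            # day index for each sampled case

rng = np.random.default_rng(0)
historical = np.array([1, 4, 9, 15, 10, 5, 2])     # toy epidemic curve
sampled_days = inverse_transform_sample(historical, 5000, rng)
simulated = np.bincount(sampled_days, minlength=len(historical))
```

Rescaling `n` relative to the outbreak duration corresponds to the "overall scale factor" whose influence the paper evaluates.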
2013-12-18
The collection of red dots seen here shows one of several very distant galaxy clusters discovered by combining ground-based optical data from the NOAO Kitt Peak National Observatory with infrared data from NASA's Spitzer Space Telescope.
Bayesian structured additive regression modeling of epidemic data: application to cholera
2012-01-01
Background A significant interest in spatial epidemiology lies in identifying associated risk factors that enhance the risk of infection. Most studies, however, make no or only limited use of the spatial structure of the data, or of possible nonlinear effects of the risk factors. Methods We develop a Bayesian structured additive regression model for cholera epidemic data. Model estimation and inference are based on a fully Bayesian approach via Markov chain Monte Carlo (MCMC) simulations. The model is applied to cholera epidemic data for the Kumasi Metropolis, Ghana. Proximity to refuse dumps, density of refuse dumps, and proximity to potential cholera reservoirs were modeled as continuous functions; presence of slum settlers and population density were modeled as fixed effects; and spatial references to the communities were modeled as structured and unstructured spatial effects. Results We observe that the risk of cholera is associated with slum settlements and high population density. The risk of cholera is equal and lower for communities with fewer refuse dumps, but variable and higher for communities with more refuse dumps. The risk is also lower for communities distant from refuse dumps and potential cholera reservoirs. The results also indicate distinct spatial variation in the risk of cholera infection. Conclusion The study highlights the usefulness of Bayesian semi-parametric regression models in analyzing public health data. These findings could serve as novel information to help health planners and policy makers make effective decisions to control or prevent cholera epidemics. PMID:22866662
NASA Astrophysics Data System (ADS)
Banerjee, D.; Jiang, F.-J.; Olesen, T. Z.; Orland, P.; Wiese, U.-J.
2018-05-01
We consider the (2+1)-dimensional SU(2) quantum link model on the honeycomb lattice and show that it is equivalent to a quantum dimer model on the kagome lattice. The model has crystalline confined phases with spontaneously broken translation invariance associated with pinwheel order, which is investigated with either a Metropolis or an efficient cluster algorithm. External half-integer non-Abelian charges [which transform nontrivially under the Z(2) center of the SU(2) gauge group] are confined to each other by fractionalized strings with a delocalized Z(2) flux. The strands of the fractionalized flux strings are domain walls that separate distinct pinwheel phases. A second-order phase transition in the three-dimensional Ising universality class separates two confining phases: one with correlated pinwheel orientations, and the other with uncorrelated pinwheel orientations.
NASA Astrophysics Data System (ADS)
Elshall, A. S.; Ye, M.; Niu, G. Y.; Barron-Gafford, G.
2016-12-01
Bayesian multimodel inference is increasingly being used in hydrology. Estimating Bayesian model evidence (BME) is of central importance in many Bayesian multimodel analyses such as Bayesian model averaging and model selection. BME is the overall probability of the model in reproducing the data, accounting for the trade-off between goodness-of-fit and model complexity. Yet estimating BME is challenging, especially for high-dimensional problems with complex sampling spaces. Estimating BME using Monte Carlo numerical methods is preferred, as these methods yield higher accuracy than semi-analytical solutions (e.g., Laplace approximations, BIC, KIC, etc.). However, numerical methods are prone to the numerical demons arising from underflow and round-off errors. Although a few studies have alluded to this issue, to our knowledge this is the first study to illustrate these numerical demons. We show that finite-precision arithmetic can impose a threshold on likelihood values and on the Metropolis acceptance ratio, which results in trimming parameter regions (when the likelihood function is smaller than the smallest floating-point number a computer can represent) and in corruption of the empirical measures of the random states of the MCMC sampler (when using the log-likelihood function). We consider two of the most powerful numerical estimators of BME: the path sampling method of thermodynamic integration (TI) and the importance sampling method of steppingstone sampling (SS). We also consider the two most widely used numerical estimators, the prior-sampling arithmetic mean (AM) and the posterior-sampling harmonic mean (HM). We investigate the vulnerability of these four estimators to the numerical demons. Interestingly, the most biased estimator, namely the HM, turned out to be the least vulnerable.
While it is generally assumed that AM is a bias-free estimator that will approximate the true BME given sufficient computational effort, we show that arithmetic underflow can hamper AM, resulting in severe underestimation of BME. TI turned out to be the most vulnerable, resulting in BME overestimation. Finally, we show that SS can be largely invariant to rounding errors, yielding the most accurate and computationally efficient results. These results are useful for Monte Carlo simulations that estimate Bayesian model evidence.
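The underflow problem described here is routinely handled with the log-sum-exp trick: shift all log-likelihoods by their maximum before exponentiating. A minimal illustration (the likelihood values are contrived so that `exp` underflows in double precision):

```python
import numpy as np

def log_mean_exp(log_vals):
    """Numerically stable log of the mean of exp(log_vals)."""
    m = np.max(log_vals)
    return m + np.log(np.mean(np.exp(log_vals - m)))

# Log-likelihoods whose exponentials underflow to 0.0 in double precision
log_likes = np.array([-1000.0, -1001.0, -1002.0])

with np.errstate(divide="ignore"):
    naive = np.log(np.mean(np.exp(log_likes)))  # exp underflows, giving -inf
stable = log_mean_exp(log_likes)                # finite and accurate
```

The same shift applies to the arithmetic-mean and harmonic-mean estimators, which are averages of (reciprocal) likelihoods computed from such log values.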
78 FR 962 - Sunshine Act Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-07
.... (Metropolis Works Uranium Conversion Facility), Docket No. 40-3392, Petition for Review of LBP-12-6 (Mar. 22...--Tentative Thursday, February 7, 2013 1:00 p.m. Briefing on Steam Generator Tube Degradation (Public Meeting...
2008-01-17
Delhi is the second largest metropolis in India, with a population of 16 million, and is located in northern India along the banks of the Yamuna River. This image was acquired by NASA's Terra satellite on September 22, 2003.
NASA Astrophysics Data System (ADS)
Chen, C.; Lee, J.; Chan, Y.; Lu, C.
2010-12-01
The Taipei Metropolis, home to around 10 million people, is subject to seismic hazard originating not only from distant faults and sources scattered throughout the Taiwan region, but also from an active fault lying directly underneath. Northern Taiwan, including the Taipei region, is currently affected by post-orogenic (Penglai arc-continent collision) processes related to backarc extension of the Ryukyu subduction system. The Shanchiao Fault, an active normal fault outcropping along the western boundary of the Taipei Basin and dipping to the east, is investigated here for its subsurface structure and activity. Borehole records in the central portion of the fault were analyzed to document the stacking of post-Last Glacial Maximum (LGM) growth sediments, and a tulip flower structure is illuminated, with an average vertical slip rate of about 3 mm/yr. Similar fault-zone architecture and post-LGM tectonic subsidence rates are also found in the northern portion of the fault. A correlation between geomorphology and structural geology in the Shanchiao Fault zone demonstrates that an array of subtle geomorphic scarps corresponds to the branch fault, while the surface trace of the main fault seems to be completely erased by erosion and sedimentation. Such constraints and knowledge are crucial for earthquake hazard evaluation and mitigation in the Taipei Metropolis, and for understanding the kinematics of transtensional tectonics in northern Taiwan. A schematic 3D diagram of the fault zone in the central portion of the Shanchiao Fault displays the regional subsurface geology and its relation to topographic features.
Wheelchair accessibility to public buildings in the Kumasi metropolis, Ghana.
Yarfi, Cosmos; Ashigbi, Evans Y K; Nakua, Emmanuel K
2017-01-01
Accessibility implies making public places accessible to every individual, irrespective of disability or special need, ensuring the integration of wheelchair users into society and thereby granting them the capability to participate in activities of daily living and ensuring equality in daily life. This study was carried out to assess the wheelchair accessibility of physical infrastructure (public buildings) in the Kumasi Metropolis after the passage of the Ghanaian Disability Law (Act 716, 2006). Eighty-four public buildings housing education facilities, health facilities, ministries, departments and agencies, sports and recreation, religious groups, and banks were assessed. The routes, entrances, heights of steps, grades of ramps, sinks, entrances to washrooms, toilets, urinals, automated teller machines, and tellers' counters were measured and computed. Of the 84 buildings assessed, only 34 (40.5%) of the buildings, 52.3% of the entrances, and 87.4% of the routes were accessible to wheelchair users. A total of 25% (13 out of 52) of the public buildings with more than one floor were fitted with elevators connecting the different floor levels. The results of this study show that public buildings in the Kumasi Metropolis are not wheelchair accessible. An important observation made during this study was that there is an intention to improve accessibility when buildings are constructed or renovated, but there are no laid-down guidelines on how to make buildings accessible for wheelchair users.
Ocular Health and Safety Assessment among Mechanics of the Cape Coast Metropolis, Ghana.
Abu, Emmanuel Kwasi; Boadi-Kusi, Samuel Bert; Opuni, Prince Quarcoo; Kyei, Samuel; Owusu-Ansah, Andrew; Darko-Takyi, Charles
2016-01-01
To conduct an ocular health and safety assessment among mechanics in the Cape Coast Metropolis, Ghana. This descriptive cross-sectional study included 500 mechanics selected using multistage sampling. All participants completed a structured questionnaire on demographic data, occupational history, and ocular health history. Study participants underwent determination of visual acuity (VA) using a LogMAR chart, external eye examination with a handheld slit-lamp biomicroscope, dilated fundus examination, applanation tonometry, and refraction. Of the 500 mechanics, 433 were examined (response rate, 87%), comprising 408 (94.2%) male and 25 (5.8%) female subjects. The prevalence of visual impairment (i.e., presenting VA < 6/18) among the respondents was 2.1%. Eye injuries were reported in 171 (39.5%) mechanics, probably due to the large number of workers, 314 (72.5%), who did not use eye protective devices. Mechanics in the auto welding category were at the highest risk of sustaining an eye injury (odds ratio [OR], 13.4; P < 0.001). Anterior segment ocular disorders were mostly pterygia, while posterior segment eye disorders included glaucoma suspects and retinochoroidal lesions. The development of pterygia was associated with the number of years a mechanic had stayed on the job. Eye care seeking behavior among the participants was poor. Eye injuries were prevalent among the mechanics, as the use of eye protection was low. Eye safety should be made an integral part of the public health agenda in the Cape Coast Metropolis.
NASA Astrophysics Data System (ADS)
Linde, N.; Vrugt, J. A.
2009-04-01
Geophysical models are increasingly used in hydrological simulations and inversions, where they are typically treated as an artificial data source with known uncorrelated "data errors". The model appraisal problem in classical deterministic linear and non-linear inversion approaches based on linearization is often addressed by calculating model resolution and model covariance matrices. These measures offer only a limited potential to assign a more appropriate "data covariance matrix" for future hydrological applications, simply because the regularization operators used to construct a stable inverse solution bear a strong imprint on such estimates and because the non-linearity of the geophysical inverse problem is not explored. We present a parallelized Markov Chain Monte Carlo (MCMC) scheme to efficiently derive the posterior spatially distributed radar slowness and water content between boreholes given first-arrival traveltimes. This method is called DiffeRential Evolution Adaptive Metropolis (DREAM_ZS) with snooker updater and sampling from past states. Our inverse scheme does not impose any smoothness on the final solution, and uses uniform prior ranges of the parameters. The posterior distribution of radar slowness is converted into spatially distributed soil moisture values using a petrophysical relationship. To benchmark the performance of DREAM_ZS, we first apply our inverse method to a synthetic two-dimensional infiltration experiment using 9421 traveltimes contaminated with Gaussian errors and 80 different model parameters, corresponding to a model discretization of 0.3 m × 0.3 m. After this, the method is applied to field data acquired in the vadose zone during snowmelt. This work demonstrates that fully non-linear stochastic inversion can be applied with few limiting assumptions to a range of common two-dimensional tomographic geophysical problems. 
The main advantage of DREAM_ZS is that it provides a full view of the posterior distribution of spatially distributed soil moisture, which is key to appropriately treat geophysical parameter uncertainty and infer hydrologic models.
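The core DREAM-family proposal, jumping along the difference of two other chains, can be sketched as plain differential evolution MCMC on a toy Gaussian posterior. The target, chain count, and tuning below are illustrative assumptions; the snooker update, sampling from past states, and the tomographic forward model are omitted:

```python
import numpy as np

def de_mc(log_post, n_chains=10, d=2, n_gen=2000, seed=0):
    """Differential Evolution MCMC sketch: each chain proposes a jump
    along the difference of two other randomly chosen chains."""
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(n_chains, d))          # initial chain states
    lp = np.array([log_post(x) for x in X])
    gamma = 2.38 / np.sqrt(2 * d)               # standard DE jump scale
    keep = []
    for g in range(n_gen):
        for i in range(n_chains):
            others = [j for j in range(n_chains) if j != i]
            a, b = rng.choice(others, 2, replace=False)
            prop = X[i] + gamma * (X[a] - X[b]) + 1e-4 * rng.normal(size=d)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp[i]:
                X[i], lp[i] = prop, lp_prop
        if g >= n_gen // 2:                     # keep the second half
            keep.append(X.copy())
    return np.concatenate(keep)

# Toy 2-D Gaussian posterior (stand-in for the slowness-field parameters)
log_post = lambda th: -0.5 * np.sum((th - np.array([3.0, -1.0])) ** 2 / 0.5**2)
samples = de_mc(log_post)
```

Because proposals are built from the current population, the step size adapts automatically to the scale and correlation of the posterior, which is what makes the scheme attractive for non-linear tomographic inversions.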
1995 Bicycle and Pedestrian Safety Report
DOT National Transportation Integrated Search
1995-03-01
This report provides a review of the current data on bicycle and pedestrian : safety across the United States, finding that safety and education : programs could significantly improve bicycle and pedestrian safety in the : Dallas-Fort Worth Metropoli...
Lee, Jui-Huna; Wu, Chang-Fu; Hoek, Gerard; de Hoogh, Kees; Beelen, Rob; Brunekreef, Bert; Chan, Chang-Chuan
2015-05-01
Traffic intensity, length of road, and proximity to roads are the most common traffic indicators in the land use regression (LUR) models for particulate matter in ESCAPE study areas in Europe. This study explored which local variables can improve the performance of LUR models in an Asian metropolis with a high density of roads and strong industrial, commercial, and construction activity. Following the ESCAPE procedure, we derived LUR models of PM₂.₅, PM₂.₅ absorbance, PM₁₀, and PMcoarse (PM₂.₅₋₁₀) in Taipei. The overall annual average concentrations of PM₂.₅, PM₁₀, and PMcoarse were 26.0 ± 5.6, 48.6 ± 5.9, and 23.3 ± 3.1 μg/m³, respectively, and the absorption coefficient of PM₂.₅ was 2.0 ± 0.4 × 10⁻⁵ m⁻¹. Our LUR models yielded R² values of 95%, 96%, 87%, and 65% for PM₂.₅, PM₂.₅ absorbance, PM₁₀, and PMcoarse, respectively. PM₂.₅ levels were increased by local traffic variables and industrial, construction, and residential land-use variables, and decreased by rivers, while PM₂.₅ absorbance levels were increased by local traffic variables and industrial and commercial land-use variables in the models. PMcoarse levels were increased by elevated highways. Road area explained more variance than road length, increasing adjusted R² by 27% and 6% for the PM₂.₅ and PM₁₀ models, respectively. In the PM₂.₅ absorbance model, road area and transportation facilities explained 29% more variance than road length. In the PMcoarse model, industrial and new local variables in place of road length improved adjusted R² from 39% to 60%. We concluded that road area can better explain the spatial distribution of PM₂.₅ and PM₂.₅ absorbance concentrations than road length. By incorporating road area and other new local variables, the performance of each PM LUR model was improved. The results suggest that road area is a better indicator of traffic intensity than road length in a city with a high-density road network and heavy traffic.
Copyright © 2015 Elsevier B.V. All rights reserved.
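The model comparisons above rank predictor sets by the increment in adjusted R². As a reference for that statistic, a minimal implementation might look like the following (the toy numbers used below are illustrative only, not from the study):

```python
def adjusted_r2(y, yhat, n_predictors):
    """Adjusted R-squared, as used to compare LUR models whose predictor
    sets differ in size: 1 - (1 - R^2) * (n - 1) / (n - k - 1)."""
    n = len(y)
    ybar = sum(y) / n
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 - (1.0 - r2) * (n - 1) / (n - n_predictors - 1)
```

Because the statistic penalises each extra predictor, swapping one variable for another (e.g. road area for road length) can raise it even when the raw R² barely moves.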
Efficient Parameter Searches for Colloidal Materials Design with Digital Alchemy
NASA Astrophysics Data System (ADS)
Dodd, Paul M.; Geng, Yina; van Anders, Greg; Glotzer, Sharon C.
Optimal colloidal materials design is challenging, even for high-throughput or genomic approaches, because the design space provided by modern colloid synthesis techniques can easily have dozens of dimensions. In this talk we present the methodology of an inverse approach we term "digital alchemy" to perform rapid searches of design-parameter spaces with up to 188 dimensions that yield thermodynamically optimal colloid parameters for target crystal structures with up to 20 particles in a unit cell. The method relies only on fundamental principles of statistical mechanics and Metropolis Monte Carlo techniques, and yields particle attribute tolerances via analogues of familiar stress-strain relationships.
Ising antiferromagnet on the Archimedean lattices
NASA Astrophysics Data System (ADS)
Yu, Unjong
2015-06-01
Geometric frustration effects were studied systematically with the Ising antiferromagnet on the 11 Archimedean lattices using the Monte Carlo methods. The Wang-Landau algorithm for static properties (specific heat and residual entropy) and the Metropolis algorithm for a freezing order parameter were adopted. The exact residual entropy was also found. Based on the degree of frustration and dynamic properties, ground states of them were determined. The Shastry-Sutherland lattice and the trellis lattice are weakly frustrated and have two- and one-dimensional long-range-ordered ground states, respectively. The bounce, maple-leaf, and star lattices have the spin ice phase. The spin liquid phase appears in the triangular and kagome lattices.
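As a concrete reference for the single-spin-flip Metropolis dynamics used in studies like this one, here is a minimal sampler for the simpler square-lattice Ising ferromagnet (the paper's frustrated antiferromagnets on Archimedean lattices differ in coupling sign and geometry; this sketch only illustrates the update rule):

```python
import math
import random

def metropolis_ising(L=8, T=2.5, sweeps=400, seed=1):
    """Metropolis sampling of a 2D square-lattice Ising ferromagnet (J = 1)
    with periodic boundaries; returns mean |magnetisation| per spin after
    discarding the first half of the sweeps as burn-in."""
    rng = random.Random(seed)
    spins = [[1] * L for _ in range(L)]  # ordered start avoids stuck domain walls

    def neighbour_sum(i, j):
        return (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])

    mags = []
    for sweep in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            dE = 2.0 * spins[i][j] * neighbour_sum(i, j)  # energy cost of flipping (i, j)
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spins[i][j] = -spins[i][j]
        if sweep >= sweeps // 2:
            mags.append(abs(sum(map(sum, spins))) / (L * L))
    return sum(mags) / len(mags)
```

Below the 2D critical temperature (about 2.27 in these units) the sampler stays magnetised; well above it the magnetisation melts away.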
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schröder, Markus, E-mail: Markus.Schroeder@pci.uni-heidelberg.de; Meyer, Hans-Dieter, E-mail: Hans-Dieter.Meyer@pci.uni-heidelberg.de
2014-07-21
We report energies and tunneling splittings of vibrationally excited states of malonaldehyde, obtained from full-dimensional quantum mechanical calculations. To this end we employed the multi-configuration time-dependent Hartree (MCTDH) method. The results were obtained using a recently published potential energy surface [Y. Wang, B. J. Braams, J. M. Bowman, S. Carter, and D. P. Tew, J. Chem. Phys. 128, 224314 (2008)], which was brought into a suitable form by a modified version of the n-mode representation, used with two different arrangements of coordinates. The relevant terms of the expansion were identified with a Metropolis algorithm and a diffusion Monte Carlo technique, respectively.
NASA Astrophysics Data System (ADS)
Liu, Y. R.; Li, Y. P.; Huang, G. H.; Zhang, J. L.; Fan, Y. R.
2017-10-01
In this study, a Bayesian-based multilevel factorial analysis (BMFA) method is developed to assess parameter uncertainties and their effects on hydrological model responses. In BMFA, the Differential Evolution Adaptive Metropolis (DREAM) algorithm is employed to approximate the posterior distributions of model parameters with Bayesian inference; the factorial analysis (FA) technique is used to measure the specific variations of hydrological responses in terms of posterior distributions, in order to investigate the individual and interactive effects of parameters on model outputs. BMFA is then applied to a case study of the Jinghe River watershed in the Loess Plateau of China to demonstrate its validity and applicability. The uncertainties of four sensitive parameters, including the Soil Conservation Service runoff curve number for moisture condition II (CN2), soil hydraulic conductivity (SOL_K), plant available water capacity (SOL_AWC), and soil depth (SOL_Z), are investigated. Results reveal that (i) CN2 has a positive effect on peak flow, implying that concentrated rainfall during the rainy season can cause infiltration-excess surface flow, which is a considerable contributor to peak flow in this watershed; (ii) SOL_K has a positive effect on average flow, implying that the widely distributed cambisols can lead to medium percolation capacity; (iii) the interaction between SOL_AWC and SOL_Z has a noticeable effect on peak flow, and their effects are dependent upon each other, which discloses that soil depth can significantly influence the processes of plant uptake of soil water in this watershed. Based on the above findings, the significant parameters and the relationships among uncertain parameters can be specified, such that the hydrological model's capability for simulating and predicting water resources of the Jinghe River watershed can be improved.
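The differential-evolution proposal at the heart of DREAM can be sketched in a few lines. The following is a heavily simplified DE-MC-style update only (DREAM itself adds subspace sampling, crossover adaptation, and outlier-chain handling, none of which are reproduced here):

```python
import math
import random

def de_mc_step(chains, i, log_post, gamma=None, eps=1e-6, rng=random):
    """One differential-evolution Metropolis update for chain i: propose a
    jump along the difference of two other randomly chosen chains, then
    accept or reject with the usual Metropolis rule."""
    d = len(chains[i])
    if gamma is None:
        gamma = 2.38 / math.sqrt(2 * d)  # standard DE-MC scaling factor
    r1, r2 = rng.sample([j for j in range(len(chains)) if j != i], 2)
    prop = [chains[i][k] + gamma * (chains[r1][k] - chains[r2][k])
            + rng.gauss(0.0, eps)  # small jitter keeps proposals non-degenerate
            for k in range(d)]
    if math.log(rng.random()) < log_post(prop) - log_post(chains[i]):
        chains[i] = prop
    return chains[i]
```

Because the jump direction is taken from the current population of chains, the proposal scale adapts automatically to the shape of the posterior.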
Galactic City at the Edge of the Universe
2011-01-12
Astronomers have discovered a massive cluster of young galaxies forming in the distant universe. The growing galactic metropolis is known as COSMOS-AzTEC3. This image was taken with Japan's Subaru Telescope atop Mauna Kea in Hawaii.
Rainforest metropolis casts 1,000-km defaunation shadow.
Tregidgo, Daniel J; Barlow, Jos; Pompeu, Paulo S; de Almeida Rocha, Mayana; Parry, Luke
2017-08-08
Tropical rainforest regions are urbanizing rapidly, yet the role of emerging metropolises in driving wildlife overharvesting in forests and inland waters is unknown. We present evidence of a large defaunation shadow around a rainforest metropolis. Using interviews with 392 rural fishers, we show that fishing has severely depleted a large-bodied keystone fish species, tambaqui (Colossoma macropomum), with an impact extending over 1,000 km from the rainforest city of Manaus (population 2.1 million). There was strong evidence of defaunation within this area, including a 50% reduction in body size and catch rate (catch per unit effort). Our findings link these declines to city-based boats that provide rural fishers with reliable access to fish buyers and ice, and likely impact rural fisher livelihoods and flooded forest biodiversity. This empirical evidence that urban markets can defaunate deep into rainforest wilderness has implications for other urbanizing socioecological systems.
NASA Astrophysics Data System (ADS)
Hogue, T. S.; He, M.; Franz, K. J.; Margulis, S. A.; Vrugt, J. A.
2010-12-01
The current study presents an integrated uncertainty analysis and data assimilation approach to improve streamflow predictions while simultaneously providing meaningful estimates of the associated uncertainty. Study models include the National Weather Service (NWS) operational snow model (SNOW17) and rainfall-runoff model (SAC-SMA). The proposed approach uses the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) to simultaneously estimate uncertainties in model parameters, forcing, and observations. An ensemble Kalman filter (EnKF) is configured with the DREAM-identified uncertainty structure and applied to assimilating snow water equivalent data into the SNOW17 model for improved snowmelt simulations. Snowmelt estimates then serve as input to the SAC-SMA model to provide streamflow predictions at the basin outlet. The robustness and usefulness of the approach are evaluated for a snow-dominated watershed in the northern Sierra Mountains. This presentation describes the implementation of DREAM and EnKF into the coupled SNOW17 and SAC-SMA models and summarizes study results and findings.
NASA Astrophysics Data System (ADS)
Spezi, Emiliano
2010-08-01
Sixty years after the paper 'The Monte Carlo method' by N Metropolis and S Ulam in The Journal of the American Statistical Association (Metropolis and Ulam 1949), use of the most accurate algorithm for computer modelling of radiotherapy linear accelerators, radiation detectors and three-dimensional patient dose was discussed in Wales (UK). The Second European Workshop on Monte Carlo Treatment Planning (MCTP2009) was held at the National Museum of Wales in Cardiff. The event, organized by Velindre NHS Trust, Cardiff University and Cancer Research Wales, lasted two and a half days, during which leading experts and contributing authors presented and discussed the latest advances in the field of Monte Carlo treatment planning (MCTP). MCTP2009 was highly successful, judging from the number of participants, which exceeded 140. Of the attendees, 24% came from the UK, 46% from the rest of Europe, 12% from North America and 18% from the rest of the world. Fifty-three oral presentations and 24 posters were delivered in a total of 12 scientific sessions. MCTP2009 follows the success of previous similar initiatives (Verhaegen and Seuntjens 2005, Reynaert 2007, Verhaegen and Seuntjens 2008), and confirms the high level of interest in Monte Carlo technology for radiotherapy treatment planning. The 13 articles selected for this special section (following Physics in Medicine and Biology's usual rigorous peer-review procedure) give a good picture of the high quality of the work presented at MCTP2009. The book of abstracts can be downloaded from http://www.mctp2009.org. I wish to thank the IOP Medical Physics and Computational Physics Groups for their financial support, Elekta Ltd and Dosisoft for sponsoring MCTP2009, and leading manufacturers such as BrainLab, Nucletron and Varian for showcasing their latest MC-based radiotherapy solutions during a dedicated technical session.
I am also very grateful to the eight invited speakers who kindly accepted to give keynote presentations, which contributed significantly to raising the quality of the event and capturing the interest of the medical physics community. I also wish to thank all those who contributed to the success of MCTP2009: the members of the local Organizing Committee and the Workshop Management Team who managed the event very efficiently, the members of the European Working Group in Monte Carlo Treatment Planning (EWG-MCTP) who acted as Guest Associate Editors for the MCTP2009 abstracts reviewing process, and all the authors who generated new, high quality work. Finally, I hope that you find the contents of this special section enjoyable and informative.

Emiliano Spezi
Chairman of the MCTP2009 Organizing Committee and Guest Editor

References
Metropolis N and Ulam S 1949 The Monte Carlo method J. Amer. Stat. Assoc. 44 335-41
Reynaert N 2007 First European Workshop on Monte Carlo Treatment Planning J. Phys.: Conf. Ser. 74 011001
Verhaegen F and Seuntjens J 2005 International Workshop on Current Topics in Monte Carlo Treatment Planning Phys. Med. Biol. 50
Verhaegen F and Seuntjens J 2008 International Workshop on Monte Carlo Techniques in Radiotherapy Delivery and Verification J. Phys.: Conf. Ser. 102 011001
Quality of life of deaf and hard of hearing students in Ibadan metropolis, Nigeria.
Jaiyeola, Mofadeke T; Adeyemo, Adebolajo A
2018-01-01
Quality of Life encompasses an individual's well-being and health, social participation and satisfaction with functional daily living. Disabilities such as deafness can impact on the quality of life with spatial variance to the environment. Deafness causes communicative problems with significant consequences in cognitive, social, and emotional well-being of affected individuals. However, information relating to the quality of life of deaf and hard of hearing individuals, especially students in developing countries like Nigeria, which could be used to design special health-related interventions is sparse. This study examined the quality of life of deaf and hard of hearing students in Ibadan metropolis, Nigeria. One hundred and ten deaf and hard of hearing students participated in this cross-sectional study. Participants were drawn from all four secondary schools for the Deaf in Ibadan metropolis. The 26 item Brief version of the WHO Quality of Life questionnaire was used for data collection. The data was analyzed using descriptive and inferential statistics at statistical significance of p<0.05. Majority (57.8%) of the deaf and hard of hearing students had poor quality of life. Attending the special school for the Deaf, upper socio-economic status and age (≥17years) are significantly associated with better quality of life. However, gender and age at onset of hearing loss had no significant influence on the quality of life. The Deaf community available in the special school appeared to protect against stigma and discrimination, while also promoting social interactions between deaf and hard of hearing individuals.
Comparison of different uncertainty techniques in urban stormwater quantity and quality modelling.
Dotto, Cintia B S; Mannina, Giorgio; Kleidorfer, Manfred; Vezzaro, Luca; Henrichs, Malte; McCarthy, David T; Freni, Gabriele; Rauch, Wolfgang; Deletic, Ana
2012-05-15
Urban drainage models are important tools used by both practitioners and scientists in the field of stormwater management. These models are often conceptual and usually require calibration using local datasets. The quantification of the uncertainty associated with the models is a must, although it is rarely practiced. The International Working Group on Data and Models, which works under the IWA/IAHR Joint Committee on Urban Drainage, has been working on the development of a framework for defining and assessing uncertainties in the field of urban drainage modelling. A part of that work is the assessment and comparison of different techniques generally used in the uncertainty assessment of the parameters of water models. This paper compares a number of these techniques: the Generalized Likelihood Uncertainty Estimation (GLUE), the Shuffled Complex Evolution Metropolis algorithm (SCEM-UA), an approach based on a multi-objective auto-calibration (a multialgorithm, genetically adaptive multi-objective method, AMALGAM) and a Bayesian approach based on a simplified Markov Chain Monte Carlo method (implemented in the software MICA). To allow a meaningful comparison among the different uncertainty techniques, common criteria have been set for the likelihood formulation, defining the number of simulations, and the measure of uncertainty bounds. Moreover, all the uncertainty techniques were implemented for the same case study, in which the same stormwater quantity and quality model was used alongside the same dataset. The comparison results for a well-posed rainfall/runoff model showed that the four methods provide similar probability distributions of model parameters, and model prediction intervals. For the ill-posed water quality model the differences between the results were much wider, and the paper provides the specific advantages and disadvantages of each method. In relation to computational efficiency (i.e. the number of iterations required to generate the probability distribution of parameters), it was found that SCEM-UA and AMALGAM produce results more quickly than GLUE in terms of the required number of simulations. However, GLUE requires the least modelling skill and is easy to implement. All non-Bayesian methods have problems with the way they accept behavioural parameter sets: GLUE, SCEM-UA and AMALGAM have subjective acceptance thresholds, while MICA typically has problems with its assumption of normally distributed residuals. It is concluded that modellers should select the method most suitable for the system they are modelling (e.g. the complexity of the model's structure, including the number of parameters), their skill/knowledge level, the available information, and the purpose of their study. Copyright © 2012 Elsevier Ltd. All rights reserved.
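The subjective acceptance threshold noted above for GLUE is easy to see in code. Here is a minimal sketch of the GLUE loop (the `simulate` interface and the Nash-Sutcliffe informal likelihood are illustrative choices, not the stormwater model used in the paper):

```python
import random

def glue(simulate, observed, priors, n_samples=5000, threshold=0.3, seed=0):
    """Minimal GLUE: Monte Carlo sample parameter sets from uniform priors,
    score each run with a Nash-Sutcliffe efficiency as the informal
    likelihood, and keep the 'behavioural' sets above a subjective threshold."""
    rng = random.Random(seed)
    obs_mean = sum(observed) / len(observed)
    denom = sum((o - obs_mean) ** 2 for o in observed)
    behavioural = []
    for _ in range(n_samples):
        theta = [rng.uniform(lo, hi) for lo, hi in priors]
        sim = simulate(theta)
        nse = 1.0 - sum((s - o) ** 2 for s, o in zip(sim, observed)) / denom
        if nse > threshold:
            behavioural.append((theta, nse))
    return behavioural
```

Everything above the threshold is treated as equally plausible up to its likelihood weight, which is exactly why the choice of threshold is subjective.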
Exchange bias for core/shell magnetic nanoparticles
NASA Astrophysics Data System (ADS)
Lemos, C. G. O.; Figueiredo, W.; Santos, M.
2015-09-01
We study the properties of a finite magnetic system to model a magnetic nanoparticle, which is formed by a reduced number of magnetic dipole moments due to the spins of the atoms. The nanoparticle is of the core/shell type, where the shell is formed by spins interacting through an antiferromagnetic exchange coupling, while for the spins belonging to the core the coupling is ferromagnetic. The interaction between the spins at the core/shell interface can be either ferro- or antiferromagnetic. To describe the states of the spins we used the XY model, in which the spins are treated as continuous variables, free to point in any direction of the xy plane. We also consider magnetocrystalline anisotropy, exchange anisotropy, and the Zeeman effect. Our model is studied on a lattice with square symmetry, using the Monte Carlo method along with the Metropolis prescription. The results show that in the absence of an external magnetic field and exchange anisotropy, the system goes continuously from an ordered to a disordered state at a well-defined temperature. In the presence of external magnetic fields the system displays the exchange bias phenomenon, that is, the displacement of the hysteresis loops, due to the introduction of the exchange anisotropy term. However, this displacement depends on the core and shell sizes, as well as on the magnitude of the coupling between the shell and core moments.
Modelling Evolutionary Algorithms with Stochastic Differential Equations.
Heredia, Jorge Pérez
2017-11-20
There has been renewed interest in modelling the behaviour of evolutionary algorithms (EAs) by more traditional mathematical objects, such as ordinary differential equations or Markov chains. The advantage is that the analysis becomes greatly facilitated due to the existence of well established methods. However, this typically comes at the cost of disregarding information about the process. Here, we introduce the use of stochastic differential equations (SDEs) for the study of EAs. SDEs can produce simple analytical results for the dynamics of stochastic processes, unlike Markov chains which can produce rigorous but unwieldy expressions about the dynamics. On the other hand, unlike ordinary differential equations (ODEs), they do not discard information about the stochasticity of the process. We show that these are especially suitable for the analysis of fixed budget scenarios and present analogues of the additive and multiplicative drift theorems from runtime analysis. In addition, we derive a new more general multiplicative drift theorem that also covers non-elitist EAs. This theorem simultaneously allows for positive and negative results, providing information on the algorithm's progress even when the problem cannot be optimised efficiently. Finally, we provide results for some well-known heuristics namely Random Walk (RW), Random Local Search (RLS), the (1+1) EA, the Metropolis Algorithm (MA), and the Strong Selection Weak Mutation (SSWM) algorithm.
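Several of the heuristics named above are tiny programs. As a concrete baseline, here is the elitist (1+1) EA on the OneMax benchmark, whose progress is exactly the kind of drift that the SDE models approximate (the parameter choices below are illustrative defaults, not taken from the thesis):

```python
import random

def one_plus_one_ea(n=30, max_iters=100000, seed=0):
    """(1+1) EA on OneMax: flip each bit independently with probability 1/n
    and keep the offspring if it is at least as fit. Returns the hitting
    time of the optimum (all-ones string), or None if the budget runs out."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = sum(x)
    for t in range(1, max_iters + 1):
        y = [b ^ 1 if rng.random() < 1.0 / n else b for b in x]
        fy = sum(y)
        if fy >= fx:  # elitist acceptance
            x, fx = y, fy
        if fx == n:
            return t
    return None
```

The expected hitting time here is Theta(n log n), which is the classic multiplicative-drift result the text generalises.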
Markov Chain Monte Carlo Bayesian Learning for Neural Networks
NASA Technical Reports Server (NTRS)
Goodrich, Michael S.
2011-01-01
Conventional training methods for neural networks involve starting at a random location in the solution space of the network weights, navigating an error hypersurface to reach a minimum, and sometimes using stochastic techniques (e.g., genetic algorithms) to avoid entrapment in a local minimum. It is further typically necessary to preprocess the data (e.g., normalization) to keep the training algorithm on course. Conversely, Bayesian learning is an epistemological approach concerned with formally updating the plausibility of competing candidate hypotheses, thereby obtaining a posterior distribution for the network weights conditioned on the available data and a prior distribution. In this paper, we developed a powerful methodology for estimating the full residual uncertainty in network weights, and therefore in network predictions, by using a modified Jeffreys prior combined with a Metropolis Markov chain Monte Carlo method.
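The Metropolis step that drives this kind of posterior sampling is compact. Here is a minimal random-walk Metropolis sampler for a one-parameter stand-in posterior (a Gaussian likelihood with a flat prior), purely to illustrate the accept/reject mechanics, not the paper's network-weight posterior or Jeffreys prior:

```python
import math
import random

def metropolis_posterior(data, n_samples=5000, step=0.5, seed=0):
    """Random-walk Metropolis sampling of p(mu | data) for a N(mu, 1)
    likelihood with a flat (improper) prior on mu."""
    rng = random.Random(seed)

    def log_post(mu):
        return -0.5 * sum((x - mu) ** 2 for x in data)

    mu = 0.0
    lp = log_post(mu)
    chain = []
    for _ in range(n_samples):
        prop = mu + rng.gauss(0.0, step)          # symmetric proposal
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
            mu, lp = prop, lp_prop
        chain.append(mu)
    return chain
```

The histogram of the chain (after burn-in) approximates the posterior; for network weights the same loop runs in a much higher-dimensional space.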
NASA Astrophysics Data System (ADS)
Graham, Eleanor; Cuore Collaboration
2017-09-01
The CUORE experiment is a large-scale bolometric detector seeking to observe the never-before-seen process of neutrinoless double beta decay. Predictions for CUORE's sensitivity to neutrinoless double beta decay allow for an understanding of the half-life ranges that the detector can probe and for an evaluation of the relative importance of different detector parameters. Currently, CUORE uses a Bayesian analysis based on BAT, which uses Metropolis-Hastings Markov chain Monte Carlo, for its sensitivity studies. My work evaluates the viability and potential improvements of switching the Bayesian analysis to Hamiltonian Monte Carlo, realized through the program Stan and its Morpho interface. I demonstrate that the BAT study can be successfully recreated in Stan, and perform a detailed comparison between the results and computation times of the two methods.
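The leapfrog-plus-accept step that distinguishes Hamiltonian Monte Carlo from plain Metropolis-Hastings can be sketched in a few lines. This is a minimal 1D illustration on a standard normal target (Stan automates the gradients, step-size tuning, and trajectory-length adaptation that are hard-coded here):

```python
import math
import random

def hmc_standard_normal(n_samples=2000, eps=0.2, n_leapfrog=10, seed=0):
    """Minimal Hamiltonian Monte Carlo for a 1D standard normal target:
    resample momentum, integrate Hamilton's equations with leapfrog,
    then Metropolis-accept on the total energy change."""
    rng = random.Random(seed)

    def grad_u(q):  # U(q) = q^2 / 2 for a standard normal, so dU/dq = q
        return q

    q = 0.0
    chain = []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)  # fresh momentum each trajectory
        q_new, p_new = q, p
        p_new -= 0.5 * eps * grad_u(q_new)        # leapfrog: half momentum step
        for _ in range(n_leapfrog - 1):
            q_new += eps * p_new                  # full position step
            p_new -= eps * grad_u(q_new)          # full momentum step
        q_new += eps * p_new
        p_new -= 0.5 * eps * grad_u(q_new)        # final half momentum step
        h_old = 0.5 * q * q + 0.5 * p * p
        h_new = 0.5 * q_new * q_new + 0.5 * p_new * p_new
        if math.log(rng.random()) < h_old - h_new:
            q = q_new
        chain.append(q)
    return chain
```

Because the leapfrog integrator nearly conserves energy, acceptance stays high even for long, decorrelating trajectories, which is the practical advantage over a random walk.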
Sterilisation: characteristics of vasectomy acceptors in Delhi.
Sarkar, N N
1993-01-01
The place of vasectomy within the sterilisation programme in Delhi over the period 1983-88 is reviewed and data on vasectomy acceptance and characteristics of acceptors are analysed. Findings suggest a need to improve the strategy for the promotion of vasectomy within the metropolis.
NASA Technical Reports Server (NTRS)
Sequera, Pedro; McDonald, Kyle C.; Gonzalez, Jorge; Arend, Mark; Krakauer, Nir; Bornstein, Robert; Luvall, Jeffrey
2012-01-01
The need for comprehensive studies of the relationships between past and projected changes of regional climate and human activity in complex urban environments has been well established. The HyspIRI preparatory airborne activities in California, associated science and applications research, and eventually HyspIRI itself provide an unprecedented opportunity for development and implementation of an integrated data and modeling analysis system focused on coastal urban environments. We will utilize HyspIRI preparatory data collections in developing new remote sensing-based tools for investigating the integrated urban environment, emphasizing weather, climate, and energy demands in complex coastal cities.
A brief history of the introduction of generalized ensembles to Markov chain Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Berg, Bernd A.
2017-03-01
The most efficient weights for Markov chain Monte Carlo calculations of physical observables are not necessarily those of the canonical ensemble. Generalized ensembles, which do not exist in nature but can be simulated on computers, often lead to much faster convergence. In particular, they have been used for simulations of first-order phase transitions and of complex systems in which conflicting constraints lead to a rugged free energy landscape. Starting off with the Metropolis algorithm and Hastings' extension, I present a minireview which focuses on the explosive use of generalized ensembles in the early 1990s. Illustrations are given, ranging from spin models to peptides.
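Hastings' extension adds a proposal-ratio correction to the Metropolis rule, which matters whenever the proposal is asymmetric. A minimal illustration (not from the review) targets the Exponential(1) density with a multiplicative log-normal random walk, for which the Hastings ratio q(x|x')/q(x'|x) reduces to x'/x:

```python
import math
import random

def mh_exponential(n_samples=20000, sigma=0.7, seed=0):
    """Metropolis-Hastings with an asymmetric log-normal random-walk
    proposal, targeting the Exponential(1) density on (0, inf)."""
    rng = random.Random(seed)
    x = 1.0
    chain = []
    for _ in range(n_samples):
        xp = x * math.exp(rng.gauss(0.0, sigma))  # multiplicative proposal
        # acceptance = target ratio * Hastings correction:
        #   e^{-(xp - x)} * (xp / x)
        accept = math.exp(-(xp - x)) * (xp / x)
        if rng.random() < accept:
            x = xp
        chain.append(x)
    return chain
```

Dropping the xp/x factor would bias the chain toward small x; with it, the stationary distribution is exactly the target.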
Drivers and Pattern of Social Vulnerability to Flood in Metropolitan Lagos, Nigeria
NASA Astrophysics Data System (ADS)
Fasona, M.
2016-12-01
Lagos is Africa's second largest city and a city-state in southwest Nigeria. Population and economic activities in the city are concentrated in the greater Lagos metropolitan area, a group of barrier islands of less than a thousand square kilometers. Several physical factors and critical human-environmental conditions contribute to high flood vulnerability across the city. Flood impacts are unevenly distributed, and the poor tend to suffer more due to higher risk of exposure and poor adaptive capacity. In this study we present the pattern of social vulnerability to flooding across the Lagos metropolis and argue that this pattern substantially reflects the pattern and severity of flooding impact on people across the metropolis. Twenty-nine social indicators and experiences, including poverty profile, housing conditions, education, population and demography, social network, and communication, among others, were considered. The data were collated through a field survey and subjected to principal component analysis. The results were processed into raster surfaces using GIS for social vulnerability characterization at the neighborhood level. The results suggest that the social-status indicators, the neighborhood-standing and social-network indicators, the indicators of emergency response and security, and the neighborhood conditions, in that order, are the most important determinants of social vulnerability. Six of the 16 LGAs in metropolitan Lagos have high social vulnerability. Neighborhoods that combine poor social-status indicators with poor neighborhood standing and social networks are found to have high social vulnerability, whereas other poor neighborhoods with strong social networks performed better. We conclude that improved human living conditions and social networks and communication in poor urban neighborhoods are important to reducing social vulnerability to flooding in the metropolis.
Ochoa, Silvia; Yoo, Ahrim; Repke, Jens-Uwe; Wozny, Günter; Yang, Dae Ryook
2007-01-01
Despite many environmental advantages of using alcohol as a fuel, there are still serious questions about its economic feasibility when compared with oil-based fuels. The bioethanol industry needs to be more competitive, and therefore, all stages of its production process must be simple, inexpensive, efficient, and "easy" to control. In recent years, there have been significant improvements in process design, such as in the purification technologies for ethanol dehydration (molecular sieves, pressure swing adsorption, pervaporation, etc.) and in genetic modifications of microbial strains. However, a lot of research effort is still required in optimization and control, where the first step is the development of suitable models of the process, which can be used as a simulated plant, as a soft sensor, or as part of the control algorithm. Thus, toward developing good, reliable, and simple but highly predictive models that can be used in the future for optimization and process control applications, in this paper an unstructured and a cybernetic model are proposed and compared for the simultaneous saccharification-fermentation (SSF) process for the production of ethanol from starch by a recombinant Saccharomyces cerevisiae strain. The cybernetic model proposed is a new one that considers the degradation of starch not only into glucose but also into dextrins (reducing sugars) and takes into account the intracellular reactions occurring inside the cells, giving a more detailed description of the process. Furthermore, a procedure based on the Metropolis Monte Carlo optimization method, coupled with a sensitivity analysis, is proposed for identifying the model's parameters, employing experimental data reported in the literature.
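Metropolis Monte Carlo used for parameter identification is an optimization loop rather than a sampler: worse-fitting parameter sets are still accepted with a temperature-controlled probability, which helps escape local minima. A generic sketch of such a loop (the cost function, bounds, and cooling schedule below are illustrative assumptions, not the paper's actual objective):

```python
import math
import random

def metropolis_optimize(cost, x0, step=0.1, temp=1.0, cooling=0.999,
                        n_iters=20000, seed=0):
    """Metropolis Monte Carlo minimisation of a model-fit cost function.
    Proposes Gaussian perturbations of the parameter vector and accepts
    uphill moves with probability exp(-dE / T), slowly lowering T."""
    rng = random.Random(seed)
    x = list(x0)
    fx = cost(x)
    best, fbest = list(x), fx
    t = temp
    for _ in range(n_iters):
        xp = [xi + rng.gauss(0.0, step) for xi in x]
        fxp = cost(xp)
        if fxp < fx or rng.random() < math.exp(-(fxp - fx) / t):
            x, fx = xp, fxp
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling  # anneal the acceptance temperature
    return best, fbest
```

A subsequent sensitivity analysis, as in the paper, would then rank which of the identified parameters actually move the fit.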
Does standard Monte Carlo give justice to instantons?
NASA Astrophysics Data System (ADS)
Fucito, F.; Solomon, S.
1984-01-01
The results of the standard local Monte Carlo are changed by offering instantons as candidates in the Metropolis procedure. We also define an O(3) topological charge with no contribution from planar dislocations. The RG behavior is still not recovered.
NASA Astrophysics Data System (ADS)
Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara
2015-09-01
Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large-scale climate signals have been proven to be effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps: rainfall forecasting, rainfall-runoff simulation, and future runoff prediction. In the first step, data-driven models are developed and used to forecast rainfall using large-scale climate signals as rainfall predictors. Due to the high effect of different sources of uncertainty on the output of hydrologic models, in the second step the uncertainty associated with input data, model parameters, and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real-time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined, through developing a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven using the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of the uncertainty analysis of rainfall-runoff modeling reveal that simultaneous estimation of model parameter and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the watershed is improved by up to 50% in comparison with simulations by the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate time for incorporating required mitigation measures when dealing with potentially extreme runoff events and flood hazard. Results of this study can be used to identify the main factors affecting flood hazard analysis.
Macrocell path loss prediction using artificial intelligence techniques
NASA Astrophysics Data System (ADS)
Usman, Abraham U.; Okereke, Okpo U.; Omizegba, Elijah E.
2014-04-01
The prediction of propagation loss is a practical non-linear function approximation problem which linear regression or auto-regression models are limited in their ability to handle. However, computational intelligence techniques such as artificial neural networks (ANNs) and adaptive neuro-fuzzy inference systems (ANFISs) have been shown to handle non-linear function approximation and prediction problems well. In this study, a multilayer perceptron neural network (MLP-NN), a radial basis function neural network (RBF-NN) and an ANFIS network were trained using actual signal strength measurements taken in certain suburban areas of Bauchi metropolis, Nigeria. The trained networks were then used to predict propagation losses in the stated areas under differing conditions. The predictions were compared with those of the popular Hata model. The ANFIS model gave a better fit in all cases, with higher R² values, and on average was more robust than the MLP and RBF models as it generalises better to unseen data.
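The Hata model used as the benchmark is a closed-form empirical formula; a sketch of its urban variant with the small/medium-city mobile-antenna correction (the example frequency and antenna heights below are illustrative):

```python
import math

def hata_urban(f_mhz, h_base, h_mobile, d_km):
    """Okumura-Hata median path loss (dB) for urban areas,
    small/medium-city mobile-antenna correction.
    Valid roughly for 150-1500 MHz, d = 1-20 km."""
    a_hm = (1.1 * math.log10(f_mhz) - 0.7) * h_mobile \
         - (1.56 * math.log10(f_mhz) - 0.8)
    return (69.55 + 26.16 * math.log10(f_mhz)
            - 13.82 * math.log10(h_base) - a_hm
            + (44.9 - 6.55 * math.log10(h_base)) * math.log10(d_km))

# 900 MHz, 30 m base station, 1.5 m mobile, 5 km link.
loss = hata_urban(900.0, 30.0, 1.5, 5.0)  # about 151 dB
```

Neural and neuro-fuzzy models are fitted to measurements instead of using such a fixed formula, which is why they can track local terrain effects the Hata curve cannot.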
Model selection and Bayesian inference for high-resolution seabed reflection inversion.
Dettmer, Jan; Dosso, Stan E; Holland, Charles W
2009-02-01
This paper applies Bayesian inference, including model selection and posterior parameter inference, to inversion of seabed reflection data to resolve sediment structure at a spatial scale below the pulse length of the acoustic source. A practical approach to model selection is used, employing the Bayesian information criterion to decide on the number of sediment layers needed to sufficiently fit the data while satisfying parsimony to avoid overparametrization. Posterior parameter inference is carried out using an efficient Metropolis-Hastings algorithm for high-dimensional models, and results are presented as marginal-probability depth distributions for sound velocity, density, and attenuation. The approach is applied to plane-wave reflection-coefficient inversion of single-bounce data collected on the Malta Plateau, Mediterranean Sea, which indicate complex fine structure close to the water-sediment interface. This fine structure is resolved in the geoacoustic inversion results in terms of four layers within the upper meter of sediments. The inversion results are in good agreement with parameter estimates from a gravity core taken at the experiment site.
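The Bayesian information criterion used for layer selection trades goodness of fit against parameter count; a minimal sketch (the layer counts and log-likelihoods below are hypothetical, not from the paper):

```python
import math

def bic(log_likelihood, n_params, n_data):
    """Bayesian information criterion: lower is better.
    Penalizes extra layers (parameters) to enforce parsimony."""
    return n_params * math.log(n_data) - 2.0 * log_likelihood

# Hypothetical fits: more layers fit better but cost parameters.
fits = {1: (-420.0, 4), 2: (-385.0, 8), 3: (-380.0, 12), 4: (-379.0, 16)}
n_data = 500
scores = {k: bic(ll, p, n_data) for k, (ll, p) in fits.items()}
best = min(scores, key=scores.get)  # smallest BIC wins
```

Here the jump from one to two layers buys a large likelihood gain, while further layers do not pay for their extra parameters, so the two-layer model is selected.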
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedmann, S J
Carbon capture and sequestration (CCS) has emerged as a key technology for dramatic short-term reduction in greenhouse gas emissions, in particular from large stationary sources. A key challenge in this arena is the monitoring and verification (M&V) of CO₂ plumes in the deep subsurface. Towards that end, we have developed a tool that can simultaneously invert multiple subsurface data sets to constrain the location, geometry, and saturation of subsurface CO₂ plumes. We have focused on a suite of unconventional geophysical approaches that measure changes in electrical properties (electrical resistance tomography, electromagnetic induction tomography) and bulk crustal deformation (tiltmeters). We also used constraints from the geology as rendered in a shared earth model (ShEM) and from the injection itself (e.g., total injected CO₂). We describe a stochastic inversion method for mapping subsurface regions where CO₂ saturation is changing. The technique combines prior information with measurements of injected CO₂ volume, reservoir deformation and electrical resistivity. Bayesian inference and a Metropolis simulation algorithm form the basis for this approach. The method can (a) jointly reconstruct disparate data types such as surface or subsurface tilt, electrical resistivity, and injected CO₂ volume measurements, (b) provide quantitative measures of the result uncertainty, (c) identify competing models when the available data are insufficient to definitively identify a single optimal model and (d) rank the alternative models based on how well they fit available data. We present results from general simulations of a hypothetical case derived from a real site. We also apply the technique to a field in Wyoming, where measurements collected during CO₂ injection for enhanced oil recovery serve to illustrate the method's performance.
The stochastic inversions provide estimates of the most probable location, shape and volume of the plume and the most likely CO₂ saturation. The results suggest that the method can reconstruct data with poor signal-to-noise ratio and use hard constraints available from many sites and applications. External interest in the approach and method is high, and commercial and DOE entities have already requested technical work using the newly developed methodology for CO₂ monitoring.
NASA Astrophysics Data System (ADS)
Scharnagl, B.; Vrugt, J. A.; Vereecken, H.; Herbst, M.
2010-02-01
A major drawback of current soil organic carbon (SOC) models is that their conceptually defined pools do not necessarily correspond to measurable SOC fractions in practice. This not only impairs our ability to rigorously evaluate SOC models but also makes it difficult to derive accurate initial states of the individual carbon pools. In this study, we tested the feasibility of inverse modelling for estimating the carbon pools in the Rothamsted carbon model (ROTHC) from mineralization rates observed during incubation experiments. This inverse approach may provide an alternative to existing SOC fractionation methods. To illustrate the approach, we used a time series of mineralization rates generated synthetically with the ROTHC model. We adopted a Bayesian approach, using the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to infer probability density functions of the various carbon pools at the start of incubation. The Kullback-Leibler divergence was used to quantify the information content of the mineralization rate data. Our results indicate that measured mineralization rates generally provided sufficient information to reliably estimate all carbon pools in the ROTHC model. The incubation time necessary to appropriately constrain all pools was about 900 days. The use of prior information on microbial biomass carbon significantly reduced the uncertainty of the initial carbon pools, decreasing the required incubation time to about 600 days. Simultaneous estimation of initial carbon pools and decomposition rate constants significantly increased the uncertainty of the carbon pools, an effect most pronounced for the intermediate and slow pools. Altogether, our results demonstrate that it is particularly difficult to derive reasonable estimates of the humified organic matter pool and the inert organic matter pool from inverse modelling of mineralization rates observed during incubation experiments.
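The Kullback-Leibler divergence used to quantify information content can be sketched for discrete distributions as follows (the prior and posterior values below are invented for illustration, not the study's estimates):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats for discrete
    distributions: the information gained when q is updated to p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical belief over which pool dominates, before and after
# assimilating mineralization-rate data.
prior = [0.25, 0.25, 0.25, 0.25]       # broad, uninformative prior
posterior = [0.70, 0.20, 0.07, 0.03]   # sharpened by the data
gain = kl_divergence(posterior, prior)
```

A larger divergence between posterior and prior means the incubation data carried more information about the pools; zero means the data taught us nothing.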
NASA Astrophysics Data System (ADS)
Mäkelä, Jarmo; Susiluoto, Jouni; Markkanen, Tiina; Aurela, Mika; Järvinen, Heikki; Mammarella, Ivan; Hagemann, Stefan; Aalto, Tuula
2016-12-01
We examined parameter optimisation in the JSBACH (Kaminski et al., 2013; Knorr and Kattge, 2005; Reick et al., 2013) ecosystem model, applied to two boreal forest sites (Hyytiälä and Sodankylä) in Finland. We identified and tested key parameters in soil hydrology and forest water and carbon-exchange-related formulations, and optimised them using the adaptive Metropolis (AM) algorithm for Hyytiälä with a 5-year calibration period (2000-2004) followed by a 4-year validation period (2005-2008). Sodankylä acted as an independent validation site, where no optimisations were made. The tuning provided estimates of the full distributions of possible parameter values, along with information about correlation, sensitivity and identifiability. Some parameters were correlated with each other due to a phenomenological connection between carbon uptake and water stress, or due to other connections arising from the set-up of the model formulations; the latter holds especially for vegetation phenology parameters. The least identifiable parameters include the phenology parameters, the parameters connecting relative humidity and soil dryness, and the field capacity of the skin reservoir. These soil parameters were masked by the large contribution from vegetation transpiration. In addition to leaf area index and the maximum carboxylation rate, the most effective parameters adjusting the gross primary production (GPP) and evapotranspiration (ET) fluxes in seasonal tuning were related to soil wilting point, drainage and the moisture stress imposed on vegetation. For the daily and half-hourly tunings, the most important parameters were the ratio of leaf internal CO2 concentration to external CO2 and the parameter connecting relative humidity and soil dryness. Effectively, the seasonal tuning transferred water from soil moisture into ET, and the daily and half-hourly tunings reversed this process.
The seasonal tuning improved the month-to-month development of GPP and ET and produced the most stable estimates of water use efficiency. Compared with the seasonal tuning, the daily tuning was worse on the seasonal scale. However, the daily parametrisation best reproduced the observed average diurnal cycle, except for GPP in the Sodankylä validation period, where the half-hourly tuned parameters were better. In general, the daily tuning provided the largest reduction in model-data mismatch. The model's response to drought was unaffected by our parametrisations, and further studies are needed into enhancing the dry response in JSBACH.
Hadron spectrum of quenched QCD on a 32³ × 64 lattice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Seyong; Sinclair, D.K.
1992-10-01
Preliminary results from a hadron spectrum calculation of quenched quantum chromodynamics on a 32³ × 64 lattice at β = 6.5 are reported. The hadron spectrum calculation is done with staggered quarks of masses m_q a = 0.001, 0.005 and 0.0025. We use two different sources in order to be able to extract the Δ mass in addition to the usual local light hadron masses. The numerical simulation is executed on the Intel Touchstone Delta computer. The peak speed of the Delta for a 16 × 32 mesh configuration is 41 Gflops for 32-bit precision. The sustained speed for our updating code is 9.5 Gflops. A multihit Metropolis algorithm combined with an over-relaxation method is used in the updating, and the conjugate gradient method is employed for Dirac matrix inversion. Configurations are stored every 1000 sweeps.
Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin
2016-01-01
Background: In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of the population in recent years, available information on infant anemia is inconclusive in large cities of China, especially regarding comparisons between native residents and the floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. Methods: As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6–12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. Results: The prevalence of anemia was 12.60%, with a range of 3.47%–40.00% across different subgroup characteristics. The CHAID decision tree model demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by the logistic regression model, namely maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population, CHAID decision tree analysis also identified a fourth risk factor, maternal educational level, with higher overall classification accuracy and a larger area under the receiver operating characteristic curve. Conclusions: The infant anemic status in a metropolis is complex and should be carefully considered by basic health care practitioners. CHAID decision tree analysis demonstrated better performance in hierarchical analysis of a population with great heterogeneity. Risk factors identified by this study might be meaningful for the early detection and prompt treatment of infant anemia in large cities. PMID:27174328
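CHAID chooses splits by chi-squared tests of independence between a candidate predictor and the outcome; a minimal sketch of that selection step (the contingency counts below are hypothetical, not the study's data):

```python
def chi2_stat(table):
    """Pearson chi-squared statistic for a 2x2 contingency table
    [[a, b], [c, d]] of predictor group x anemia status counts."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    n = sum(row)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical anemic / non-anemic counts by candidate predictor (yes / no).
splits = {
    "maternal_anemia": [[40, 160], [97, 794]],
    "floating_population": [[70, 430], [67, 524]],
}
best = max(splits, key=lambda k: chi2_stat(splits[k]))
```

CHAID would split first on the predictor with the most significant chi-squared statistic, then repeat the test within each child node, producing the stepwise pathways described above.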
Kulis, Stephen; Hodge, David R.; Ayers, Stephanie L.; Brown, Eddie F.; Marsiglia, Flavio F.
2012-01-01
Background and objective This article explores the aspects of spirituality and religious involvement that may be the protective factors against substance use among urban American Indian (AI) youth. Methods Data come from AI youth (N = 123) in five urban middle schools in a southwestern metropolis. Results Ordinary least squares regression analyses indicated that following Christian beliefs and belonging to the Native American Church were associated with lower levels of substance use. Conclusions and Scientific Significance Following AI traditional spiritual beliefs was associated with antidrug attitudes, norms, and expectancies. Having a sense of belonging to traditions from both AI cultures and Christianity may foster integration of the two worlds in which urban AI youth live. PMID:22554065
Ibe, K M; Nwankwor, G I; Onyekuru, S O
2001-03-01
Pollution vulnerability of the Owerri regional water supply aquifer was evaluated as a basis for developing an appropriate protection strategy for the groundwater resource. The assessment was accomplished using the Legrand, GOD, Siga and DRASTIC models. The techniques of these models generally involve parameter rating and point count systems, which are based on the evaluation of various parameters in relation to their capacity for enhancing or attenuating contaminants in the groundwater system. Field and laboratory evaluations of the parameters indicate that the Owerri area generally occupies a nearly flat topography with relatively high groundwater recharge. The area is underlain by predominantly sandy facies in the northern area, which grade into gravelly sequences towards the southwest. The southeastern area is distinguished by thick clayey facies that thin westwards towards the Owerri metropolis. Effective hydraulic conductivity (Kz) in the downward direction ranges from 1.44 × 10⁻³ to 5.6 × 10⁻⁹ m s⁻¹, with the upper limit reflecting coarse sands and gravelly units. The amount of clay and clay-size particles in the sandy and gravelly units is negligible, suggesting that the sorptive capacity of the units is low. Depth to the water table decreases southwards, while hydraulic head gradients vary between 0.09 and 0.22. Groundwater occurs under unconfined conditions in most places except in the southeastern zone, where it is semi-confined due to the presence of a clayey unit. The groundwater vulnerability map developed on the basis of the models and several other thematic maps shows that the Owerri metropolis and the area southwest of Owerri have high vulnerability, indicating a high risk of groundwater pollution. The existing waste disposal sites in these sub-areas should be abandoned and rehabilitated to forestall further pollution of the groundwater system.
Areas to the north and southeast of Owerri have moderate and low vulnerabilities, respectively, indicating the relatively lower sensitivity of the groundwater system in these sub-areas to contamination. This lower sensitivity could further be complemented with properly engineered sanitary landfills, should sites be chosen there, as an additional protective strategy for the groundwater system.
NASA Technical Reports Server (NTRS)
Shepherd, J. Marshall; Starr, David O'C. (Technical Monitor)
2002-01-01
A recent paper by Shepherd and Pierce (in press at the Journal of Applied Meteorology) used rainfall data from the Precipitation Radar (PR) on NASA's Tropical Rainfall Measuring Mission (TRMM) satellite to identify warm season rainfall anomalies downwind of major urban areas. PR data were employed to identify warm season rainfall (1998-2000) patterns around Atlanta, Montgomery, Nashville, San Antonio, Waco, and Dallas. Results reveal an average increase of approx. 28% in monthly rainfall rates within 30-60 kilometers downwind of the metropolis, with a modest increase of 5.6% over the metropolis. Portions of the downwind area exhibit increases as high as 51%. The percentage changes are relative to an upwind control area. It was also found that maximum rainfall rates in the downwind impact area exceeded the mean value in the upwind control area by 48%-116%. The maximum value was generally found at an average distance of 39 km from the edge of the urban center, or 64 km from the center of the city. Results are consistent with METROMEX studies of St. Louis almost two decades ago and with more recent studies near Atlanta. A convective-mesoscale model with extensive land-surface processes is currently being employed to (a) determine whether an urban heat island (UHI) thermal perturbation can induce a dynamic response that affects rainfall processes and (b) quantify the impact of three factors on the evolution of rainfall: (1) urban surface roughness, (2) the magnitude of the UHI temperature anomaly, and (3) the physical size of the UHI temperature anomaly. The sensitivity experiments are achieved by inserting a slab of land with urban properties (e.g. roughness length, albedo, thermal character) within a rural surface environment and varying the appropriate lower boundary condition parameters. The study will discuss the feasibility of utilizing satellite-based rainfall estimates for examining rainfall modification by urban areas on global scales and over longer time periods.
The talk also introduces very preliminary results from the modeling component of the study. Such research has implications for weather forecasting, urban planning, water resource management, and understanding human impact on the environment and climate.
ERIC Educational Resources Information Center
Fogg, Piper
2007-01-01
When the nearest metropolis is hundreds of miles away, cultural enrichment is not always easy to come by. Arts programs have evolved to reflect the needs of such regions, providing a rich diet for culture-starved residents. Some colleges have created choirs or theater groups that welcome local participation, while others have developed elaborate…
ERIC Educational Resources Information Center
Pearman, Francis A., III; Swain, Walker A.
2017-01-01
Racial and socioeconomic stratification have long governed patterns of residential sorting in the American metropolis. However, recent expansions of school choice policies that allow parents to select schools outside their neighborhood raise questions as to whether this weakening of the neighborhood-school connection might influence the…
NASA Astrophysics Data System (ADS)
Hasegawa, Manabu; Hiramatsu, Kotaro
2013-10-01
The effectiveness of the Metropolis algorithm (MA) (constant-temperature simulated annealing) in optimization by the method of search-space smoothing (SSS) (potential smoothing) is studied on two types of random traveling salesman problems. The optimization mechanism of this hybrid approach (MASSS) is investigated by analyzing the exploration dynamics observed in the rugged landscape of the cost function (energy surface). The results show that the MA can be successfully utilized as a local search algorithm in the SSS approach. It is also clarified that the optimization characteristics of these two constituent methods are improved in a mutually beneficial manner in the MASSS run. Specifically, the relaxation dynamics generated by employing the MA work effectively even in a smoothed landscape and more advantage is taken of the guiding function proposed in the idea of SSS; this mechanism operates in an adaptive manner in the de-smoothing process and therefore the MASSS method maintains its optimization function over a wider temperature range than the MA.
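A constant-temperature Metropolis search of the kind studied here can be sketched on a small travelling salesman instance (a generic 2-opt Metropolis sampler on an assumed toy instance; the temperature and move set are illustrative, and no search-space smoothing step is included):

```python
import math
import random

def metropolis_tsp(cities, temp=0.5, n_steps=20000, seed=1):
    """Constant-temperature Metropolis search for the TSP:
    propose a 2-opt segment reversal, accept with min(1, e^(-dE/T))."""
    rng = random.Random(seed)
    n = len(cities)
    def length(tour):
        return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % n]])
                   for i in range(n))
    tour = list(range(n))
    cost = length(tour)
    best, best_cost = tour[:], cost
    for _ in range(n_steps):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        c = length(cand)
        # Metropolis rule at fixed temperature: always take downhill moves,
        # take uphill moves with Boltzmann probability.
        if c < cost or rng.random() < math.exp(-(c - cost) / temp):
            tour, cost = cand, c
            if cost < best_cost:
                best, best_cost = tour[:], cost
    return best, best_cost

# Hypothetical instance: 8 points on a circle; the optimum visits them in order.
pts = [(math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8))
       for k in range(8)]
tour, cost = metropolis_tsp(pts)
```

In the MASSS hybrid, this sampler would explore a smoothed version of the cost landscape that is gradually de-smoothed back to the original.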
Molecular dynamics simulations of field emission from a planar nanodiode
NASA Astrophysics Data System (ADS)
Torfason, Kristinn; Valfells, Agust; Manolescu, Andrei
2015-03-01
High resolution molecular dynamics simulations with full Coulomb interactions of electrons are used to investigate field emission in planar nanodiodes. The effects of space-charge and emitter radius are examined and compared to previous results concerning the transition from Fowler-Nordheim to Child-Langmuir current [Y. Y. Lau, Y. Liu, and R. K. Parker, Phys. Plasmas 1, 2082 (1994) and Y. Feng and J. P. Verboncoeur, Phys. Plasmas 13, 073105 (2006)]. The Fowler-Nordheim law is used to determine the current density injected into the system, and the Metropolis-Hastings algorithm to find a favourable point of emission on the emitter surface. A simple fluid-like model is also developed and its results are in qualitative agreement with the simulations.
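The Fowler-Nordheim law that sets the injected current density has the elementary form J = (a F²/φ) exp(−b φ^{3/2}/F); a sketch using the standard elementary constants (the field strength and work function in the example are illustrative, not values from the paper):

```python
import math

# Elementary Fowler-Nordheim constants (no image-charge correction):
A_FN = 1.541434e-6   # A eV V^-2
B_FN = 6.830890e9    # eV^-3/2 V m^-1

def fowler_nordheim(field_v_per_m, work_function_ev):
    """Field-emission current density J (A/m^2) for a surface field
    F (V/m) and work function phi (eV)."""
    return (A_FN * field_v_per_m**2 / work_function_ev
            * math.exp(-B_FN * work_function_ev**1.5 / field_v_per_m))

j = fowler_nordheim(5e9, 4.5)  # metal-like work function, strong field
```

The exponential dependence on 1/F is what makes emission extremely sensitive to local field enhancement at the emitter surface.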
Design of the smart scenic spot service platform
NASA Astrophysics Data System (ADS)
Yin, Min; Wang, Shi-tai
2015-12-01
With the deepening of smart city construction, the "smart+" model is developing rapidly. Guilin, an international tourism metropolis undergoing rapid construction, needs the support of smart tourism technology. This paper studied the smart scenic spot service objects and their requirements, and then constructed a smart scenic spot service platform applying 3S technology (Geographic Information System (GIS), Remote Sensing (RS) and Global Navigation Satellite System (GNSS)), the Internet of Things and cloud computing. Taking the Guilin Seven-star Park scenic area as the object, this paper designed the Seven-star smart scenic spot service platform framework. The application of this platform will improve tourists' visiting experience, make tourism management more scientific and standardized, and increase tourism enterprises' operating earnings.
Examining the emerging entrepreneurial mindset in adolescence: A study in Nigeria.
Salami, Samuel O
2017-05-10
This study investigated the relationships of family environment, network, parental socio-economic status, self-efficacy and proactive personality with the entrepreneurial intention of secondary school adolescents, and the mediating role of self-efficacy. The participants were 250 secondary school SS2 adolescents randomly selected from six secondary schools in Ibadan Metropolis, Ibadan, Oyo State, Nigeria. Structural equation modelling was used to analyse the data obtained from the participants. The results showed that all the contextual and individual factors had significant relationships with entrepreneurial intention, and that self-efficacy partially mediated these relationships. It was suggested that counselling psychologists should consider the contextual and individual variables while assisting students in building their entrepreneurial intention. © 2017 International Union of Psychological Science.
Spatial Heterogeneity in the Effects of Immigration and Diversity on Neighborhood Homicide Rates
Graif, Corina; Sampson, Robert J.
2010-01-01
This paper examines the connection of immigration and diversity to homicide by advancing a recently developed approach to modeling spatial dynamics—geographically weighted regression. In contrast to traditional global averaging, we argue on substantive grounds that neighborhood characteristics vary in their effects across neighborhood space, a process of “spatial heterogeneity.” Much like treatment-effect heterogeneity and distinct from spatial spillover, our analysis finds considerable evidence that neighborhood characteristics in Chicago vary significantly in predicting homicide, in some cases showing countervailing effects depending on spatial location. In general, however, immigrant concentration is either unrelated or inversely related to homicide, whereas language diversity is consistently linked to lower homicide. The results shed new light on the immigration-homicide nexus and suggest the pitfalls of global averaging models that hide the reality of a highly diversified and spatially stratified metropolis. PMID:20671811
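Geographically weighted regression fits a separate weighted least-squares model at each location, with weights from a spatial kernel, so coefficients can vary across the metropolis; a minimal one-predictor sketch (the Gaussian kernel, bandwidth and data below are invented for illustration):

```python
import math

def gwr_coefficient(points, x, y, site, bandwidth):
    """Local slope at `site` from geographically weighted least squares
    (one predictor plus intercept, Gaussian distance kernel)."""
    w = [math.exp(-0.5 * (math.dist(p, site) / bandwidth) ** 2)
         for p in points]
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
    sxy = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
    return sxy / sxx

# Hypothetical data: the x-y relationship is +1 on the west side of a
# transect and -1 on the east side, which a global model would average away.
pts = [(float(i), 0.0) for i in range(10)]
xs = [float(i % 3) for i in range(10)]
ys = [xi if p[0] < 5 else -xi for p, xi in zip(pts, xs)]
west = gwr_coefficient(pts, xs, ys, site=(0.0, 0.0), bandwidth=2.0)
east = gwr_coefficient(pts, xs, ys, site=(9.0, 0.0), bandwidth=2.0)
```

The two local slopes have opposite signs, which is exactly the "countervailing effects depending on spatial location" that global averaging would hide.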
Modeling of urban growth using cellular automata (CA) optimized by Particle Swarm Optimization (PSO)
NASA Astrophysics Data System (ADS)
Khalilnia, M. H.; Ghaemirad, T.; Abbaspour, R. A.
2013-09-01
In this paper, two satellite images of Tehran, the capital city of Iran, taken by TM and ETM+ in 1988 and 2010, are used as the base information layers to study the changes in urban patterns of this metropolis. The patterns of urban growth for the city of Tehran are extracted over a twelve-year period using cellular automata with logistic regression functions as transition functions. Furthermore, the weighting coefficients of the parameters affecting urban growth, i.e. distance from urban centers, distance from rural centers, distance from agricultural centers, and neighborhood effects, were selected using PSO. In order to evaluate the results of the prediction, the percent correct match index is calculated. According to the results, by combining optimization techniques with the cellular automata model, urban growth patterns can be predicted with accuracy up to 75%.
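The PSO step that tunes the weighting coefficients can be sketched generically (a standard particle swarm minimizer on a toy quadratic objective; the inertia and acceleration constants are conventional defaults, not the paper's settings):

```python
import random

def pso(objective, dim, n_particles=20, n_iter=200, seed=2):
    """Minimal particle swarm optimizer: each velocity blends inertia,
    a pull toward the particle's own best, and a pull toward the swarm best."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration weights
    x = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]
    pbest_f = [objective(xi) for xi in x]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                x[i][d] += v[i][d]
            f = objective(x[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = x[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = x[i][:], f
    return gbest, gbest_f

# Toy calibration: recover four weighting coefficients that should all be 1.
best, best_f = pso(lambda p: sum((pi - 1.0) ** 2 for pi in p), dim=4)
```

In the paper's setting, the objective would instead score how well a candidate set of distance and neighborhood weights reproduces the observed 1988-2010 growth map.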
Searching for Genotype-Phenotype Structure: Using Hierarchical Log-Linear Models in Crohn Disease
Chapman, Juliet M.; Onnie, Clive M.; Prescott, Natalie J.; Fisher, Sheila A.; Mansfield, John C.; Mathew, Christopher G.; Lewis, Cathryn M.; Verzilli, Claudio J.; Whittaker, John C.
2009-01-01
There has been considerable recent success in the detection of gene-disease associations. We consider here the development of tools that facilitate more detailed characterization of the effect of a genetic variant on disease. We replace the simplistic classification of individuals according to a single binary disease indicator with classification according to a number of subphenotypes. This more accurately reflects the underlying biological complexity of the disease process, but it poses additional analytical difficulties. Notably, the subphenotypes that make up a particular disease are typically highly associated, and it becomes difficult to distinguish which genes might be causing which subphenotypes. Such problems arise in many complex diseases. Here, we concentrate on an application to Crohn disease (CD). We consider this problem as one of model selection based upon log-linear models, fitted in a Bayesian framework via a reversible-jump Metropolis-Hastings approach. We evaluate the performance of our suggested approach with a simple simulation study and then apply the method to a real data example in CD, revealing a sparse disease structure. Most notably, the associated NOD2.908G→R mutation appears to be directly related to more severe disease behaviors, whereas the other two associated NOD2 variants, 1007L→FS and 702R→W, are more generally related to disease in the small bowel (ileum and jejunum). The ATG16L1.300T→A variant appears to be directly associated with only disease of the small bowel. PMID:19185283
Groundwater fluoride and dental fluorosis in southwestern Nigeria.
Gbadebo, A M
2012-10-01
This study was carried out to assess the fluoride levels of groundwater from open wells consumed by the residents of three communities located in two distinct geological terrains of southwestern Nigeria. Fluoride concentration was determined using a spectrophotometric technique, while analysis of other parameters like temperature, pH and total dissolved solids followed standard methods. Results of the analysis indicated that groundwater samples from Abeokuta Metropolis (i.e., basement complex terrain) had fluoride contents in the range of 0.65 ± 0.21 to 1.20 ± 0.14 mg/l. These values were found to be lower than the fluoride contents in the groundwater samples from Ewekoro peri-urban and Lagos metropolis, where the values ranged between 1.10 ± 0.14-1.45 ± 0.07 and 0.15 ± 0.07-2.20 ± 1.41 mg/l, respectively. The fluoride contents in almost all locations were generally higher than the WHO recommended 0.6 mg/l. A Duncan multiple range test indicated similarity in the level of significance of fluoride contents between different locations of the same geological terrain at p ≤ 0.05. It was also observed that the fluoride distribution of groundwater samples from the different geological terrains was more dependent on factors like pH and TDS than on temperature. The analyzed socio-demographic characteristics of the residents indicated that adults (between the ages of 20 and >40 years) showed more dental decay than adolescents (<20 years). This signals an incidence of dental fluorosis driven by the high fluoride content in the drinking water of the populace. Further investigation of all sources of drinking water and other causes of tooth decay in the area is suggested.
Adegboyega, O; Abioye, K
2017-08-01
The payment for health-care services is a major problem for many poor patients in developing nations. The aim of the study was to examine the cost of services and commodities and how these affect the patients who utilize the primary health-care centers in Zaria, northwestern Nigeria. A descriptive cross-sectional survey of six primary health-care facilities in Zaria metropolis, namely the Baban dodo, Tudun Wada and Magajiya PHCs from Zaria local government area (LGA) and the Samaru, Kwata and Dogarawa PHCs from Sabon Gari LGA, was carried out. The mean age of the respondents was 28.87 ± 8.63 years; most of them were married (53.3%), Hausa (63.3%), and Muslim (85.7%), and most were unemployed housewives with daily stipends from their husbands of less than 1 dollar/day. The major method of payment for health-care services was out of pocket (98.3%). More than one-third of the clients were not aware of the National Health Insurance Scheme (NHIS) (39%). There was a significant inverse relationship between the monthly income of the clients and the experience of financial stress, and a positive association between patients' monthly income and awareness of the NHIS (P < 0.05). The respondents were paying user fees for essential health-care services at the primary health-care centers, and this was not convenient for them. There is a need for the LGA health departments to intensify supervision of the activities at the PHCs. Standardization of the prices of services and commodities and the implementation of the National Health Act may alleviate the burdens of the poor community members who access PHCs in Nigeria.
Hussain, Muhammad Hammad; Saqib, Muhammad; Raza, Fahad; Muhammad, Ghulam; Asi, Muhammad Nadeem; Mansoor, Muhammad Khalid; Saleem, Muhammad; Jabbar, Abdul
2014-05-28
Equine piroplasmosis (EP), caused by the intraerythrocytic parasites Theileria equi and Babesia caballi, is an emerging equine disease of worldwide distribution. In Pakistan, the prevalence and incidence of EP are unknown. In order to obtain first insights into the prevalence of the disease, a total of 430 equids, including 33 mules, 65 horses and 332 donkeys, aged from ≤ 5 to ≥ 10 years and of either sex, from five metropolises of Punjab, Pakistan, were serologically tested for the presence of antibodies directed against B. caballi and T. equi, using a competitive enzyme-linked immunosorbent assay (cELISA). Out of 430 equid serum samples tested, 226 (52.6%, 95% CI 47.7-57.4) were found cELISA positive for EP (T. equi and/or B. caballi infections). The overall seroprevalence of EP was 41.2% (95% CI 36.5-46.0) for T. equi and 21.6% (95% CI 17.8-25.8) for B. caballi. A small proportion of equids (10.2%, 95% CI 7.5-13.5) was seropositive for both T. equi and B. caballi. Seroprevalence of T. equi was significantly higher (P<0.01) in equines from the metropolis of Lahore (66.7%, 95% CI 54.3-77.6) and in horses (56.9%, 95% CI 44.0-69.2). Multivariable logistic regression analysis indicated that the factors associated with EP seroprevalence were being kept in the Lahore metropolis (OR=4.24, 95% CI 2.28-7.90), being a horse (OR=2.82, 95% CI 1.53-5.20), and being male (OR=1.81, 95% CI 1.15-2.86). Copyright © 2014 Elsevier B.V. All rights reserved.
Magnetic monopole dynamics in spin ice.
Jaubert, L D C; Holdsworth, P C W
2011-04-27
One of the most remarkable examples of emergent quasi-particles is that of the 'fractionalization' of magnetic dipoles in the low energy configurations of materials known as 'spin ice' into free and unconfined magnetic monopoles interacting via Coulomb's 1/r law (Castelnovo et al 2008 Nature 451 42-5). Recent experiments have shown that a Coulomb gas of magnetic charges really does exist at low temperature in these materials and this discovery provides a new perspective on otherwise largely inaccessible phenomenology. In this paper, after a review of the different spin ice models, we present detailed results describing the diffusive dynamics of monopole particles starting both from the dipolar spin ice model and directly from a Coulomb gas within the grand canonical ensemble. The diffusive quasi-particle dynamics of real spin ice materials within the 'quantum tunnelling' regime is modelled with Metropolis dynamics, with the particles constrained to move along an underlying network of oriented paths, which are classical analogues of the Dirac strings connecting pairs of Dirac monopoles.
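The diffusive Metropolis dynamics described above can be illustrated with a deliberately minimal sketch (not the authors' simulation): a single monopole-antimonopole pair on a lattice, with an assumed -1/r Coulomb attraction in illustrative units, evolved by single-step Metropolis moves.

```python
import math
import random

random.seed(1)

def energy(r):
    """Coulomb-like attraction of a monopole-antimonopole pair at lattice separation r (illustrative units)."""
    return -1.0 / r

def metropolis_step(r, T):
    """Propose changing the pair separation by +/-1 lattice spacing; accept with the Metropolis rule."""
    r_new = max(1, r + random.choice((-1, 1)))
    dE = energy(r_new) - energy(r)
    if dE <= 0 or random.random() < math.exp(-dE / T):
        return r_new
    return r

# Diffusive dynamics: at high temperature the pair unbinds and the separation
# performs a nearly free random walk; at low temperature it stays bound.
r = 1
for _ in range(10000):
    r = metropolis_step(r, T=5.0)
print(r >= 1)  # True
```

Real spin ice adds the constraint that monopoles hop along the oriented Dirac-string network; here the "lattice" is a bare separation coordinate, purely to show the acceptance rule.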
NASA Astrophysics Data System (ADS)
Jokar Arsanjani, Jamal; Helbich, Marco; Kainz, Wolfgang; Darvishi Boloorani, Ali
2013-04-01
This research analyses the suburban expansion in the metropolitan area of Tehran, Iran. A hybrid model consisting of a logistic regression model, Markov chain (MC), and cellular automata (CA) was designed to improve the performance of the standard logistic regression model. Environmental and socio-economic variables dealing with urban sprawl were operationalised to create a probability surface of spatiotemporal states of built-up land use for the years 2006, 2016, and 2026. For validation, the model was evaluated by means of relative operating characteristic values for different sets of variables. The approach was calibrated for 2006 by cross-comparing actual and simulated land use maps. The outcome was an 89% match between the simulated and actual maps of 2006, which was satisfactory to validate the calibration process. Thereafter, the calibrated hybrid approach was implemented for forthcoming years. Finally, future land use maps for 2016 and 2026 were predicted by means of this hybrid approach. The simulated maps illustrate a new wave of suburban development in the vicinity of Tehran at the western border of the metropolis during the next decades.
A time series model: First-order integer-valued autoregressive (INAR(1))
NASA Astrophysics Data System (ADS)
Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.
2017-07-01
Nonnegative integer-valued time series arise in many applications. The first-order Integer-valued AutoRegressive model, INAR(1), is constructed with the binomial thinning operator to model nonnegative integer-valued time series. INAR(1) depends on the value of the process one period before. The parameter of the model can be estimated by Conditional Least Squares (CLS). The specification of INAR(1) follows that of AR(1). Forecasting in INAR(1) uses the median or a Bayesian forecasting methodology. The median forecasting methodology takes the least integer s for which the cumulative distribution function (CDF) up to s is at least 0.5. The Bayesian forecasting methodology forecasts h steps ahead by generating the model parameter and the innovation-term parameter using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), then finding the least integer s for which the CDF up to s is at least u, where u is drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 to April 2016.
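A minimal sketch of the binomial-thinning construction and the median forecast described above, using only the standard library; the parameter values α and λ are illustrative assumptions, and the one-step-ahead CDF is approximated by Monte Carlo rather than computed exactly.

```python
import math
import random

random.seed(0)

ALPHA, LAM = 0.5, 2.0   # illustrative thinning probability and Poisson innovation mean

def thin(x, alpha):
    """Binomial thinning operator: alpha o x = number of x Bernoulli(alpha) survivors."""
    return sum(random.random() < alpha for _ in range(x))

def poisson(lam):
    """Poisson draw via Knuth's multiplication algorithm."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def simulate_inar1(n, x0=3):
    """X_t = alpha o X_{t-1} + eps_t, with eps_t ~ Poisson(lam)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(thin(xs[-1], ALPHA) + poisson(LAM))
    return xs

def median_forecast(x_last, n_draws=5000):
    """Least integer s with one-step-ahead CDF >= 0.5, estimated as the empirical
    median of Monte Carlo draws from the conditional distribution."""
    draws = sorted(thin(x_last, ALPHA) + poisson(LAM) for _ in range(n_draws))
    return draws[len(draws) // 2]

xs = simulate_inar1(200)
print(median_forecast(xs[-1]))
```

The CLS estimate of α mentioned in the abstract would come from regressing X_t on X_{t-1}; it is omitted here to keep the sketch short.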
NASA Astrophysics Data System (ADS)
Deng, Hua; Dutta, Prashanta; Liu, Jin
2016-11-01
Clathrin-mediated endocytosis (CME) is one of the most important endocytic pathways for the internalization of bioparticles at the lipid membrane of cells, and it plays a crucial role in the fundamental understanding of viral infections and in intracellular/transcellular targeted drug delivery. During CME, the highly dynamic clathrin-coated pit (CCP), formed by the growth of ordered clathrin lattices, is the key scaffolding component that drives the deformation of the plasma membrane. Experimental studies have shown that the CCP alone can provide sufficient membrane curvature to facilitate membrane invagination. However, there is currently no computational model that couples cargo-receptor binding with the membrane invagination process, nor are there simulations of the dynamic growth of the CCP. We develop a stochastic computational model of clathrin-mediated endocytosis based on Metropolis Monte Carlo simulations. In our model, the energetic costs of bending the membrane and the CCP are linked with antigen-antibody interactions. The assembly of clathrin lattices is a dynamic process that correlates with antigen-antibody bond formation. This model enables the study of membrane deformation and the effects of the CCP during the internalization of functionalized bioparticles through CME. This work is supported by NSF Grants CBET-1250107 and CBET-1604211.
NASA Astrophysics Data System (ADS)
Bauer, Thilo; Jäger, Christof M.; Jordan, Meredith J. T.; Clark, Timothy
2015-07-01
We have developed a multi-agent quantum Monte Carlo model to describe the spatial dynamics of multiple majority charge carriers during conduction of electric current in the channel of organic field-effect transistors. The charge carriers are treated by a neglect of diatomic differential overlap Hamiltonian using a lattice of hydrogen-like basis functions. The local ionization energy and local electron affinity defined previously map the bulk structure of the transistor channel to external potentials for the simulations of electron- and hole-conduction, respectively. The model is designed without a specific charge-transport mechanism like hopping- or band-transport in mind and does not arbitrarily localize charge. An electrode model allows dynamic injection and depletion of charge carriers according to source-drain voltage. The field-effect is modeled by using the source-gate voltage in a Metropolis-like acceptance criterion. Although the current cannot be calculated because the simulations have no time axis, using the number of Monte Carlo moves as pseudo-time gives results that resemble experimental I/V curves.
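The idea of a Metropolis-like acceptance criterion with Monte Carlo moves as pseudo-time can be sketched in a toy form; the energy landscape, the way the gate bias enters it, and the one-dimensional channel below are invented for illustration and are not the authors' Hamiltonian.

```python
import math
import random

random.seed(3)

KT = 1.0  # thermal energy scale (illustrative units)

def site_energy(x, v_gate):
    """Toy channel landscape: a barrier in the middle of the channel, lowered by
    the source-gate bias (assumed functional form, not the paper's model)."""
    return max(0.0, 1.0 - v_gate) * math.sin(math.pi * x / 10) ** 2

def metropolis_like_accept(dE):
    """Accept a proposed carrier hop with the Metropolis criterion."""
    return dE <= 0 or random.random() < math.exp(-dE / KT)

def transit_moves(v_gate, max_moves=100000):
    """Count Monte Carlo moves (pseudo-time) for one carrier to hop from the
    source (x=0) to the drain (x=10)."""
    x, moves = 0, 0
    while x < 10 and moves < max_moves:
        x_new = max(0, x + random.choice((-1, 1)))
        if metropolis_like_accept(site_energy(x_new, v_gate) - site_energy(x, v_gate)):
            x = x_new
        moves += 1
    return moves

# A higher gate bias flattens the barrier, so transit typically takes fewer
# pseudo-time moves -- the qualitative field-effect behaviour.
print(transit_moves(0.9), transit_moves(0.1))
```

As in the paper, no physical time axis exists here; only the move count orders events, which is why currents can at best be read off as events per pseudo-time.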
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bauer, Thilo; Jäger, Christof M.; Jordan, Meredith J. T.
2015-07-28
We have developed a multi-agent quantum Monte Carlo model to describe the spatial dynamics of multiple majority charge carriers during conduction of electric current in the channel of organic field-effect transistors. The charge carriers are treated by a neglect of diatomic differential overlap Hamiltonian using a lattice of hydrogen-like basis functions. The local ionization energy and local electron affinity defined previously map the bulk structure of the transistor channel to external potentials for the simulations of electron- and hole-conduction, respectively. The model is designed without a specific charge-transport mechanism like hopping- or band-transport in mind and does not arbitrarily localize charge. An electrode model allows dynamic injection and depletion of charge carriers according to source-drain voltage. The field-effect is modeled by using the source-gate voltage in a Metropolis-like acceptance criterion. Although the current cannot be calculated because the simulations have no time axis, using the number of Monte Carlo moves as pseudo-time gives results that resemble experimental I/V curves.
Andasari, Vivi; Roper, Ryan T.; Swat, Maciej H.; Chaplain, Mark A. J.
2012-01-01
In this paper we present a multiscale, individual-based simulation environment that integrates CompuCell3D for lattice-based modelling on the cellular level and Bionetsolver for intracellular modelling. CompuCell3D or CC3D provides an implementation of the lattice-based Cellular Potts Model or CPM (also known as the Glazier-Graner-Hogeweg or GGH model) and a Monte Carlo method based on the Metropolis algorithm for system evolution. The integration of CC3D for cellular systems with Bionetsolver for subcellular systems enables us to develop a multiscale mathematical model and to study the evolution of cell behaviour due to the dynamics inside the cells, capturing aspects of cell behaviour and interaction that are not possible using continuum approaches. We then apply this multiscale modelling technique to a model of cancer growth and invasion, based on a previously published model of Ramis-Conde et al. (2008) where individual cell behaviour is driven by a molecular network describing the dynamics of E-cadherin and β-catenin. In this model, which we refer to as the centre-based model, an alternative individual-based modelling technique was used, namely, a lattice-free approach. In many respects, the GGH or CPM methodology and the approach of the centre-based model have the same overall goal, that is, to mimic the behaviours and interactions of biological cells. Although the mathematical foundations and computational implementations of the two approaches are very different, the results of the presented simulations are compatible with each other, suggesting that by using individual-based approaches we can formulate a natural way of describing complex multi-cell, multiscale models. The ability to easily reproduce results of one modelling approach using an alternative approach is also essential from a model cross-validation standpoint and helps to identify any modelling artefacts specific to a given computational approach. PMID:22461894
Air pollution and hospitalizations in the largest Brazilian metropolis
Gouveia, Nelson; Corrallo, Flavia Prado; de Leon, Antônio Carlos Ponce; Junger, Washington; de Freitas, Clarice Umbelino
2017-01-01
ABSTRACT OBJECTIVE To evaluate the impact of air pollution on hospitalizations for respiratory and cardiovascular diseases in the largest Brazilian metropolis. METHODS This study was carried out in the Metropolitan Region of São Paulo, Brazil. Environmental data were obtained from the network of monitoring stations of nine municipalities. Air pollution exposure was measured by daily means of PM10 (particles with a nominal mean aerodynamic diameter ≤ 10 μm) per municipality, while daily counts of hospitalizations for respiratory and cardiovascular diseases within the Brazilian Unified Health System were the outcome. For each municipality a time series analysis was carried out in which a semiparametric Poisson regression model was the framework to explain the daily fluctuations in counts of hospitalizations over time. The results were combined in a meta-analysis to estimate the overall risk of PM10 in hospitalizations for respiratory and cardiovascular diseases in the Metropolitan Region of São Paulo. RESULTS Regarding hospitalizations for respiratory diseases, the effect estimates were statistically significant (p < 0.05) for all municipalities, except Santo André and Taboão da Serra. The RR (Relative Risk) of this outcome for an increase of 10 µg/m3 in the levels of PM10 ranged from 1.011 (95%CI 1.009–1.013) for São Paulo to 1.032 (95%CI 1.024–1.040) in São Bernardo do Campo. The RR of hospitalization for respiratory diseases in children for an increase of 10 µg/m3 of PM10 ranged from 1.009 (95%CI 1.001–1.017) in Santo André to 1.077 (95%CI 1.056–1.098) in Mauá. Only São Paulo and São Bernardo do Campo presented positive and statistically significant results for hospitalizations for cardiovascular diseases. CONCLUSIONS This is the first study to estimate the risk of illness from air pollution in the set of municipalities of the Metropolitan Region of São Paulo, Brazil.
Global estimates of the effect of exposure to pollution in the region indicated associations only with respiratory diseases. Only São Paulo and São Bernardo do Campo showed an association between the levels of PM10 and hospitalizations for cardiovascular diseases. PMID:29211200
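The relative risks quoted above are per 10 µg/m³ increase of PM10; in a (log-link) Poisson regression this is RR = exp(10·β). A short sketch, with a hypothetical coefficient and standard error chosen only so that the output resembles the São Paulo figure; these are not the study's fitted values.

```python
import math

# Hypothetical PM10 coefficient (per 1 ug/m3) and standard error from a Poisson fit,
# chosen for illustration so the result resembles the Sao Paulo estimate.
beta, se = 0.0011, 0.0001
delta = 10.0  # increase of 10 ug/m3, as reported in the abstract

# RR for a delta-unit increase, with a Wald 95% confidence interval on the log scale
rr = math.exp(delta * beta)
lo = math.exp(delta * (beta - 1.96 * se))
hi = math.exp(delta * (beta + 1.96 * se))
print(f"RR = {rr:.3f} (95% CI {lo:.3f}-{hi:.3f})")  # RR = 1.011 (95% CI 1.009-1.013)
```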
Manifestations of Dyslexia and Dyscalculia
ERIC Educational Resources Information Center
Osisanya, Ayo; Lazarus, Kelechi; Adewunmi, Abiodun
2013-01-01
This study examined the prevalence of dyslexia and dyscalculia among persons with academic deficits in English Language and Mathematics in public primary schools in Ibadan metropolis. A correlational survey study, sampling 477 pupils who were between the ages of eight and 12 years, and in 4th and 5th grades with the use of four research…
Engaging Suburban Students in Dialogues on Diversity in a Segregated Metropolitan Area
ERIC Educational Resources Information Center
Checkoway, Barry; Lipa, Todd; Vivyan, Erika; Zurvalec, Sue
2017-01-01
What are some strategies for engaging suburban students in dialogues on diversity in new American metropolis? This question is important, especially at a time when some suburbs are changing from "segregated" to "segregated and diverse," and scholarship is needed to guide their discussion. This article analyzes efforts by a…
Urbanization and Spatial Organization: Hospital and Orphanage Location in Chicago, 1848-1916
ERIC Educational Resources Information Center
Britton, Marcus; Ocasio, William
2007-01-01
What factors affect where organizations locate facilities in local communities? This paper examines how urban development influenced the neighborhood location of two very different types of facilities, general hospitals and orphanages, over the 70-year period during which Chicago emerged as an urban metropolis. Our results suggest that the human…
Mapping Language Ideologies in Multi-Ethnic Urban Europe: The Case of Parisian French
ERIC Educational Resources Information Center
Stewart, Christopher Michael
2012-01-01
Although the modern multicultural European metropolis has brought previously disparate groups into close contact, little research has focused on the effect of these shifting demographic patterns on language attitudes and ideologies. This is probably due to the sensitive nature of issues relating to immigration which may evoke contexts of…
ERIC Educational Resources Information Center
Fisher-Maltese, Carley; Fisher, Dana R.; Ray, Rashawn
2018-01-01
This article explores how school gardens provide learning opportunities for school-aged children while concurrently helping cities achieve sustainability. The authors analyse this process in Washington, DC, a particularly innovative metropolis in the United States. This national capital city boasts two of the most progressive examples of…
Battle in Los Angeles: Conflict Escalates as Charter Schools Thrive
ERIC Educational Resources Information Center
Whitmire, Richard
2016-01-01
Throughout the 1990s and well into the new millennium, the massive Los Angeles Unified School District barely noticed the many charter schools that were springing up around the metropolis. But Los Angeles parents certainly took notice, and started enrolling their children. In 2008, five charter-management organizations announced plans to…
Challenges of Attending E-Learning Studies in Nigeria
ERIC Educational Resources Information Center
Bugi, Stephan Z.
2012-01-01
This study set out to find out what challenges the E-learner faces in the Nigerian environment. A survey research design was used to obtain the opinions of 200 randomly selected E-learners in Kaduna metropolis. Their responses revealed that the most prominent challenges they face are inadequate power supply, Internet connectivity problems, efficacy…
From Mountain to Metropolis: Appalachian Migrants in American Cities.
ERIC Educational Resources Information Center
Borman, Kathryn M., Ed.; Obermiller, Phillip J., Ed.
This book consists of 14 essays that focus on the condition of urban Appalachians (former migrants to cities from Appalachia and their descendants). Chapters address issues of health, environment, education, and cultural identity in an urban Appalachian context, and are meant to be a resource for educators and health and human service…
No Safe Place: Environmental Hazards & Injustice along Mexico's Northern Border
ERIC Educational Resources Information Center
Grineski, Sara E.; Collins, Timothy W.; Aguilar, Maria de Lourdes Romo; Aldouri, Raed
2010-01-01
This article examines spatial relationships between environmental hazards (i.e., pork feed lots, brick kilns, final assembly plants and a rail line) and markers of social marginality in Ciudad Juarez, Mexico. Juarez represents an opportunity for researchers to test for patterns of injustice in a recently urbanizing metropolis of the Global South.…
Audit of sharp weapon deaths in metropolis of Karachi--an autopsy based study.
Mirza, Farhat Hussain; Hasan, Qudsia; Memon, Akhtar Amin; Adil, Syeda Ezz-e-Rukhshan
2010-01-01
Sharp weapons are among the most violent and abhorrent means of death. This study assesses the frequency of sharp-weapon deaths in Karachi. This was a cross-sectional study of deaths by sharp weapons autopsied in Karachi during March 2008 to February 2009. The study found that the frequency of sharp-weapon deaths in Karachi is similar to that reported in studies from other regions of Pakistan, yet it is very high given that the population of Karachi far exceeds that of any other metropolis in Pakistan. Of the 2090 medico-legal deaths in Karachi during the study period, 91 were due to sharp weapons, including 73 (80.2%) males and 18 (19.8%) females. All of the deaths were homicides; none were suicides. Deaths were most frequent in the 20-39 years age group (59.3%). Sharp-weapon deaths continue to account for a considerable number of deaths in Karachi. Such violence reflects the intolerance and frustration among the citizens.
Russoff, D
1986-04-01
The Tokyo metropolis houses 11,892,016 people, 1/10 of the Japanese population. In recent years, Tokyo's population growth has slowed as the birthrate has fallen from a 1947 postwar high of 31.5/1000 to 11.4/1000 in 1983. 5.9 million males and 5.8 million females, composing 4 1/2 million households, live in Tokyo's 2160 square kilometers. Within the metropolis' 23 wards, density per square kilometer was 14,023 persons in 1983, with Toshima ward containing 21,844 people per square kilometer. Wards around the city's center held 71% of the population in 1983, but had only 27% of its land mass; outlying cities, towns, and villages held 28% of the population on 54% of the land. 50% of Tokyo's population is aged 25-59; those 65 or over will rise from 1980's 9% to 15.6% of the population in 2000. In 40 years, Japan will have more elderly people than any other advanced country. In 1983, Tokyo had over 150,000 housing starts, high by Japanese and international standards. Nearly 1/4 of Tokyo households each contain a married couple with 2 children, but single person households predominate, reflecting Tokyo's student, working bachelor, and elderly populations. Young, single Japanese workers spend 1/3 of their income on leisure, entertainment, cultural activities and education; couples marry late and, with 2 incomes, can purchase many nonessentials. Nearly 3.25 million students attend Tokyo's fiercely competitive schools and colleges; Japan is almost 100% literate. Of the 6 million people working in Tokyo, half work in the service and retail sector, 25% in manufacturing, 12% in transportation and communication, 9% in finance and insurance, and 8% work in construction. Tokyo workers earn nearly 20% more than the average Japanese worker. Japan now faces job shortages and will see many unemployment problems by 1990. 
To help absorb new workers, government planners recommend increasing vacation time, training workers as specialists rather than generalists, and encouraging job sharing and part-time work.
Childhood epilepsy: knowledge and attitude of primary school teachers in Port Harcourt, Nigeria.
Alikor, E A D; Essien, A A
2005-01-01
This study was conducted to determine primary school teachers' knowledge of epilepsy in Port Harcourt metropolis, their knowledge of the management of an epileptic attack, and their attitude towards epilepsy in children. This is a questionnaire-based, cross-sectional study of 118 school teachers from five randomly selected primary schools in Port Harcourt metropolis, Nigeria. Ten percent (12) of the 118 teachers were graded "Good", 45% (54) "Fair", and 43% (52) "Poor" in overall knowledge score. Sixty-six teachers (56%) accepted applying crude oil on the body as useful in stopping epileptic attacks in children. There was no significant association between overall knowledge score and sex, years of experience as a teacher, or experience with a child with epilepsy. Only 10% of the teachers studied were classified as having overall good knowledge of epilepsy. Sixty-nine teachers (58.5%) were graded as having good knowledge of the causes of epilepsy. Only 38 (32%) disagreed that the saliva drooled during an epileptic attack is contagious; one hundred (84.8%) and 65 (55.1%) agreed that some childhood illnesses can cause epilepsy and that it runs in families, respectively. Overall, 54 teachers (45.8%) had a cumulative score indicating a negative attitude towards epilepsy. Eighty-three teachers (73.3%) would want all children with epilepsy put in a special school, whilst 57 (48%) agreed that children with epilepsy should be withdrawn from schools. The longer the teacher's professional experience, the greater the likelihood of a positive attitude towards epilepsy, but the association did not reach a statistically significant level (p = 0.076). Attitude was not statistically associated with sex or educational qualification. The overall knowledge of epilepsy and of the first-aid management of an epileptic attack among primary school teachers in Port Harcourt metropolis is poor. The attitude of these teachers towards epilepsy is negative.
Education of the primary school teacher and general public on epilepsy is recommended.
Energy model for rumor propagation on social networks
NASA Astrophysics Data System (ADS)
Han, Shuo; Zhuang, Fuzhen; He, Qing; Shi, Zhongzhi; Ao, Xiang
2014-01-01
With the development of social networks, the impact of rumor propagation on human lives has become increasingly significant. Owing to the change in propagation mode, traditional rumor propagation models designed for the word-of-mouth process may not be suitable for describing rumor spreading on social networks. To overcome this shortcoming, we carefully analyze the mechanisms of rumor propagation and the topological properties of large-scale social networks, then propose a novel model based on physical theory. In this model, a heat-energy calculation formula and the Metropolis rule are introduced to formalize the problem, and the amount of heat energy is used to measure a rumor's impact on a network. Finally, we conduct tracking experiments to show the evolution of rumor propagation, comparison experiments to contrast the proposed model with traditional models, and simulation experiments to study the dynamics of rumor spreading. The experiments show that (1) rumor propagation simulated by our model goes through three stages: rapid growth, fluctuating persistence, and slow decline; (2) individuals can spread a rumor repeatedly, which leads to the rumor's resurgence; and (3) rumor propagation is greatly influenced by a rumor's attraction, the initial rumormonger, and the sending probability.
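The abstract does not give the heat-energy formula, so the energy bookkeeping in the following sketch is an invented stand-in; only the Metropolis-style acceptance and the idea of measuring a rumor's impact by accumulated heat energy are taken from the model description.

```python
import math
import random

random.seed(42)

ATTRACTION = 0.8   # rumor attractiveness (illustrative)

def accept_spread(energy_gain, temperature=1.0):
    """Metropolis-style rule: always spread if the heat-energy gain is nonnegative,
    otherwise spread with probability exp(gain / T)."""
    if energy_gain >= 0:
        return True
    return random.random() < math.exp(energy_gain / temperature)

# Toy network: each node holds a heat energy; the rumor attempts to spread along edges.
energy = {0: 1.0, 1: 0.2, 2: 0.0}      # node 0 is the initial rumormonger
edges = [(0, 1), (1, 2), (0, 2)]

for u, v in edges:
    gain = ATTRACTION * energy[u] - energy[v]   # assumed form of the energy change
    if accept_spread(gain):
        energy[v] += ATTRACTION * energy[u]     # receiving node heats up

# Total heat energy as the measure of the rumor's impact on the network
print(sum(energy.values()))
```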
Bayesian Inference for Source Reconstruction: A Real-World Application
Yee, Eugene; Hoffman, Ian; Ungar, Kurt
2014-01-01
This paper applies a Bayesian probabilistic inferential methodology for the reconstruction of the location and emission rate from an actual contaminant source (emission from the Chalk River Laboratories medical isotope production facility) using a small number of activity concentration measurements of a noble gas (Xenon-133) obtained from three stations that form part of the International Monitoring System radionuclide network. The sampling of the resulting posterior distribution of the source parameters is undertaken using a very efficient Markov chain Monte Carlo technique that utilizes a multiple-try differential evolution adaptive Metropolis algorithm with an archive of past states. It is shown that the principal difficulty in the reconstruction lay in the correct specification of the model errors (both scale and structure) for use in the Bayesian inferential methodology. In this context, two different measurement models for incorporation of the model error of the predicted concentrations are considered. The performance of both of these measurement models with respect to their accuracy and precision in the recovery of the source parameters is compared and contrasted. PMID:27379292
A bayesian hierarchical model for classification with selection of functional predictors.
Zhu, Hongxiao; Vannucci, Marina; Cox, Dennis D
2010-06-01
In functional data classification, functional observations are often contaminated by various systematic effects, such as random batch effects caused by device artifacts, or fixed effects caused by sample-related factors. These effects may lead to classification bias and thus should not be neglected. Another issue of concern is the selection of functions when predictors consist of multiple functions, some of which may be redundant. The above issues arise in a real data application where we use fluorescence spectroscopy to detect cervical precancer. In this article, we propose a Bayesian hierarchical model that takes into account random batch effects and selects effective functions among multiple functional predictors. Fixed effects or predictors in nonfunctional form are also included in the model. The dimension of the functional data is reduced through orthonormal basis expansion or functional principal components. For posterior sampling, we use a hybrid Metropolis-Hastings/Gibbs sampler, which can suffer from slow mixing. An evolutionary Monte Carlo algorithm is applied to improve the mixing. Simulation and real data application show that the proposed model provides accurate selection of functional predictors as well as good classification.
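A hybrid Metropolis-Hastings/Gibbs sampler of the kind mentioned can be sketched on a toy model (normal data with flat priors, not the paper's functional-data model): the mean mu is drawn from its exact full conditional (the Gibbs step), while the standard deviation sigma is updated by a random-walk Metropolis step.

```python
import math
import random

random.seed(7)

data = [1.2, 0.8, 1.5, 0.9, 1.1, 1.3, 0.7, 1.0]
n = len(data)
ybar = sum(data) / n

def log_post_sigma(sigma, mu):
    """Log posterior of sigma given mu (flat priors), up to an additive constant."""
    if sigma <= 0:
        return -math.inf
    ss = sum((y - mu) ** 2 for y in data)
    return -n * math.log(sigma) - ss / (2 * sigma ** 2)

mu, sigma = 0.0, 1.0
mus = []
for it in range(2000):
    # Gibbs step: with a flat prior, mu | sigma, y ~ Normal(ybar, sigma^2 / n)
    mu = random.gauss(ybar, sigma / math.sqrt(n))
    # Metropolis step: symmetric random-walk proposal on sigma
    prop = sigma + random.gauss(0, 0.2)
    if math.log(random.random()) < log_post_sigma(prop, mu) - log_post_sigma(sigma, mu):
        sigma = prop
    mus.append(mu)

burned = mus[500:]
print(sum(burned) / len(burned))  # posterior mean of mu, close to the sample mean ~1.06
```

Slow mixing in such hybrids usually comes from the Metropolis component; the evolutionary Monte Carlo remedy the authors apply is beyond this sketch.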
Shape matters: The case for Ellipsoids and Ellipsoidal Water
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tillack, Andreas F.; Robinson, Bruce H.
We describe the shape potentials used for the van der Waals interactions between soft ellipsoids used to coarse-grain molecular moieties in our Metropolis Monte-Carlo simulation software. The morphologies resulting from different expressions for these van der Waals interaction potentials are discussed for the case of a prolate spheroid system with a strong dipole at the ellipsoid center. We also show that the calculation of ellipsoids is, at worst, only about fivefold more expensive computationally when compared to a simple Lennard-Jones sphere. Finally, as an application of the ellipsoidal shape, we parametrize water from the original SPC water model and observe – just through the difference in shape alone – a significant improvement of the O-O radial distribution function when compared to experimental data.
Semi-blind sparse image reconstruction with application to MRFM.
Park, Se Un; Dobigeon, Nicolas; Hero, Alfred O
2012-09-01
We propose a solution to the image deconvolution problem where the convolution kernel or point spread function (PSF) is assumed to be only partially known. Small perturbations generated from the model are exploited to produce a few principal components explaining the PSF uncertainty in a high-dimensional space. Unlike recent developments on blind deconvolution of natural images, we assume the image is sparse in the pixel basis, a natural sparsity arising in magnetic resonance force microscopy (MRFM). Our approach adopts a Bayesian Metropolis-within-Gibbs sampling framework. The performance of our Bayesian semi-blind algorithm for sparse images is superior to previously proposed semi-blind algorithms such as the alternating minimization algorithm and blind algorithms developed for natural images. We illustrate our myopic algorithm on real MRFM tobacco virus data.
Mobilizing Practice: Engaging Space, Technology and Design from a Thai Metropolis
ERIC Educational Resources Information Center
Williams, Amanda Marisa
2009-01-01
The project of ubiquitous computing aims to embed computation into everyday spaces. As a practice that is heavily concerned with space and place, its stance towards mobility is sometimes conflicted--treating mobility by turns as a disruption or as an opportunity--and almost always conceiving of it as free and empowered. Conducted in industrial and…
ERIC Educational Resources Information Center
Gesinde, Abiodun Matthew; Sanu, Oluwafunto Jolade
2015-01-01
This study sought to examine the impact which age, gender and psychological adjustment have on behaviour towards seeking professional counselling intervention. Multistage sampling technique was employed to select a total of three hundred workers across Lagos metropolis. The ex post facto research design was adopted for the study. Inventory of…
Factors That Inform Students' Choice of Study and Career
ERIC Educational Resources Information Center
Theresa, Lawer Dede
2015-01-01
The research was conducted to find out the factors that informed second cycle students' choices of programmes of study and career in the Kumasi Metropolis of Ghana. The descriptive survey was used for the study, and both a questionnaire and an interview guide were used in gathering the data. The questionnaire was administered to the students while the…
Urban Inequality: Evidence from Four Cities. A Volume in the Multi-City Study of Urban Inequality.
ERIC Educational Resources Information Center
O'Connor, Alice, Ed.; Tilly, Chris, Ed.; Bobo, Lawrence D., Ed.
This collection of papers focuses on urban inequalities in Atlanta, Boston, Detroit, and Los Angeles. There are 11 chapters in 3 parts. The book begins with an introduction, "Understanding Inequality in the Late Twentieth-Century Metropolis: New Perspectives on the Enduring Racial Divide" (Alice O'Connor) and chapter 1,…
Exploring In-Service Teachers' Self-Efficacy in the Kindergarten Classrooms in Ghana
ERIC Educational Resources Information Center
Boateng, Philip; Sekyere, Frank Owusu
2018-01-01
The study explored in-service teachers' efficacy beliefs in pupil engagement. The sample size was 299 kindergarten teachers selected from both public and private kindergarten schools in the Kumasi metropolis of Ghana. The study adopted the pupil engagement subscale of the Ohio State Teacher Efficacy Scale (OSTES) developed by Tschannen-Moran…
ERIC Educational Resources Information Center
Dodoo, Joana Eva; Kuupole, Domwini Dabire
2017-01-01
The majority of studies and reports on university education in Africa have focused mainly on issues related to access, quality, teaching and learning environment, and so on. Although these issues are undoubtedly critical, even more germane to the discourse is the desired utility of university education to society. The authors present the…
Perceiving the Metropolis: Seeing the City through a Prism of Race
ERIC Educational Resources Information Center
Krysan, Maria; Bader, Michael
2007-01-01
Investigating the role of preferences in causing persistent patterns of racial residential segregation in the United States has a long history. In this paper, we bring a new perspective--and new data from the 2004 Detroit Area Study--to the question of how best to characterize black and white preferences toward living in neighborhoods with people…
Effects of Goal-Setting Skills on Students' Academic Performance in English Language in Enugu, Nigeria
ERIC Educational Resources Information Center
Abe, Iyabo Idowu; Ilogu, Guy Chibuzoh; Madueke, Ify Louisa
2014-01-01
The study investigated the effect of goal-setting skills on Senior Secondary II students' academic performance in English language in Enugu Metropolis, Enugu State, Nigeria. A quasi-experimental pre-test, post-test control group design was adopted for the study. The initial sample was 147 participants (male and female) Senior Secondary…
ERIC Educational Resources Information Center
Yaki, Akawo Angwal; Babagana, Mohammed
2016-01-01
The paper examined the effects of a Technological Instructional Package (TIP) on secondary school students' performance in biology. The study adopted a pre-test, post-test experimental control group design. The sample size of the study was 80 students from Minna metropolis, Niger state, Nigeria; the samples were randomly assigned into treatment…
Beyond the Metropolis: The Forgotten History of Small-Town Teachers' Unions
ERIC Educational Resources Information Center
Scribner, Campbell F.
2015-01-01
This article examines the legal and political significance of teacher unionization in rural and suburban school districts between 1960 and 1975. While most historians focus on the growth of unions in urban areas, strikes in outlying districts played a determinative role in the development of public sector labor law, particularly in the arbitration…
ERIC Educational Resources Information Center
Oluwatomiwo, Oladunmoye Enoch
2015-01-01
This study examined the development and validation of a socio provision scale for first-year undergraduates' adjustment across institutions in Ibadan metropolis. The study adopted a descriptive survey design. A sample of 300 participants was randomly selected across institutions in Ibadan. Data were collected using the socio provision scale (α = 0.76),…
Ecology, Literature and Environmental Education
ERIC Educational Resources Information Center
Tsekos, Christos A.; Tsekos, Evangelos A.; Christoforidou, Elena I.
2012-01-01
The first part of this article refers to the initial attempt to relate Nature to Literature, dating back to the age of Hellenistic Alexandria in Egypt. Alexandria was a metropolis of its time with a quite lively character of urban life. Influenced by that character, Theocritus was the first to lay the foundations of what is defined as pastoral poetry. In the…
ERIC Educational Resources Information Center
Kuo, Fan-Sheng; Perng, Yeng-Horng
2016-01-01
Creating an attractive cityscape has become one of the most promising ways to improve urban functionality and increase urban competitiveness. However, resistance from local inhabitants often works against urban development. Taipei City, a metropolis in Taiwan, is now composed of complex urban systems chaotically enclosed by existing…
Firoz Uncle: A "Reluctant" Educationist in a Mumbai Ghetto
ERIC Educational Resources Information Center
Murali, Sreejith
2017-01-01
This article focuses on the educational efforts of Syed Firoz Ashraf in the East Jogeshwari area of Mumbai and places his work in the context of the increasing communalisation of social life and education in a poor working-class suburb of Mumbai city. The Muslim community has been ghettoised into specific areas of the metropolis, especially since the…
Determinants of Differing Teacher Attitudes towards Inclusive Education Practice
ERIC Educational Resources Information Center
Gyimah, Emmanuel K.; Ackah, Francis R., Jr.; Yarquah, John A.
2010-01-01
An examination of literature reveals that teacher attitude is fundamental to the practice of inclusive education. In order to verify the extent to which the assertion is applicable in Ghana, 132 teachers were selected from 16 regular schools in the Cape Coast Metropolis using purposive and simple random sampling techniques to respond to a four…
Hao, Xiaohu; Zhang, Guijun; Zhou, Xiaogen
2018-04-01
Computing conformations, which is essential to associating structural and functional information with gene sequences, is challenging due to the high dimensionality and rugged energy surface of the protein conformational space. Consequently, the dimension of the protein conformational space should be reduced to a proper level, and an effective exploration algorithm should be proposed. In this paper, a plug-in method for guiding exploration in conformational feature space with Lipschitz underestimation (LUE) for ab-initio protein structure prediction is proposed. The conformational space is first converted into an ultrafast shape recognition (USR) feature space. Based on the USR feature space, the conformational space can be further converted into an underestimation space according to Lipschitz estimation theory for guiding exploration. As a consequence of the use of the underestimation model, tight lower-bound estimate information can be used for exploration guidance, invalid sampling areas can be eliminated in advance, and the number of energy function evaluations can be reduced. The proposed method provides a novel technique to solve the exploration problem of protein conformational space. LUE is applied to the differential evolution (DE) algorithm and to the Metropolis Monte Carlo (MMC) algorithm available in Rosetta; when LUE is applied to DE and MMC, candidate conformations are screened by the underestimation method prior to energy calculation and selection. Further, LUE is compared with DE and MMC by testing on 15 small-to-medium structurally diverse proteins. Test results show that near-native protein structures with higher accuracy can be obtained more rapidly and efficiently with the use of LUE. Copyright © 2018 Elsevier Ltd. All rights reserved.
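The screening idea (discard candidates whose cheap lower-bound energy already exceeds a cutoff before making any expensive energy call) can be sketched as follows; `lower_bound` and `cutoff` are hypothetical stand-ins for the paper's Lipschitz underestimation model, and the quadratic `energy` is a toy surrogate, not a protein force field:

```python
import math
import random

def screened_mmc_step(x, energy, lower_bound, cutoff, temp, step_size, stats):
    """One Metropolis Monte Carlo step with cheap pre-screening.

    Candidates whose lower bound already exceeds `cutoff` are discarded
    before the expensive `energy` call, mimicking how an underestimation
    model eliminates invalid sampling areas in advance.
    """
    cand = x + random.uniform(-step_size, step_size)
    if lower_bound(cand) > cutoff:
        stats["screened"] += 1  # rejected without an energy evaluation
        return x
    stats["evaluated"] += 1
    d_e = energy(cand) - energy(x)
    if d_e <= 0 or random.random() < math.exp(-d_e / temp):
        return cand
    return x

random.seed(2)
energy = lambda x: x * x                 # toy energy surface
lower_bound = lambda x: x * x - 1.0      # a valid (never exceeds energy) cheap bound
stats = {"screened": 0, "evaluated": 0}
x = 0.0
for _ in range(500):
    x = screened_mmc_step(x, energy, lower_bound, cutoff=4.0,
                          temp=1.0, step_size=1.0, stats=stats)
```

Every step either screens or evaluates, so the savings are exactly the `screened` count of avoided energy calls.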
Simulation of phase equilibria
NASA Astrophysics Data System (ADS)
Martin, Marcus Gary
The focus of this thesis is on the use of configurational bias Monte Carlo in the Gibbs ensemble. Unlike Metropolis Monte Carlo, which is reviewed in chapter I, configurational bias Monte Carlo uses an underlying Markov chain transition matrix which is asymmetric in such a way that it is more likely to attempt to move to a molecular conformation which has a lower energy than to one with a higher energy. Chapter II explains how this enables efficient simulation of molecules with complex architectures (long chains and branched molecules) for coexisting fluid phases (liquid, vapor, or supercritical), and also presents several of our recent extensions to this method. In chapter III we discuss the development of the Transferable Potentials for Phase Equilibria United Atom (TraPPE-UA) force field which accurately describes the fluid phase coexistence for linear and branched alkanes. Finally, in the fourth chapter the methods and the force field are applied to systems ranging from supercritical extraction to gas chromatography to illustrate the power and versatility of our approach.
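The asymmetric-proposal idea described above amounts to a Metropolis-Hastings acceptance rule with a proposal-density correction; in the sketch below (illustrative only, not the thesis's implementation) the ratio `q_reverse / q_forward` stands in for, e.g., the Rosenbluth-weight ratio of a configurational-bias move:

```python
import math

def mh_acceptance(delta_e, beta, q_forward=1.0, q_reverse=1.0):
    """Metropolis-Hastings acceptance probability.

    With a symmetric proposal (q_forward == q_reverse) this reduces to
    the plain Metropolis rule min(1, exp(-beta * dE)). A biased proposal
    that favors low-energy conformations needs the q_reverse / q_forward
    factor to preserve detailed balance.
    """
    return min(1.0, math.exp(-beta * delta_e) * q_reverse / q_forward)

# Symmetric proposal: downhill moves are always accepted.
print(mh_acceptance(-1.0, beta=1.0))  # 1.0
# Biased proposal: the correction factor lowers the acceptance.
print(mh_acceptance(0.0, beta=1.0, q_forward=2.0, q_reverse=1.0))  # 0.5
```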
Stochastic, real-space, imaginary-time evaluation of third-order Feynman-Goldstone diagrams
NASA Astrophysics Data System (ADS)
Willow, Soohaeng Yoo; Hirata, So
2014-01-01
A new, alternative set of interpretation rules of Feynman-Goldstone diagrams for many-body perturbation theory is proposed, which translates diagrams into algebraic expressions suitable for direct Monte Carlo integrations. A vertex of a diagram is associated with a Coulomb interaction (rather than a two-electron integral) and an edge with the trace of a Green's function in real space and imaginary time. With these, 12 diagrams of third-order many-body perturbation (MP3) theory are converted into 20-dimensional integrals, which are then evaluated by a Monte Carlo method. It uses redundant walkers for convergence acceleration and a weight function for importance sampling in conjunction with the Metropolis algorithm. The resulting Monte Carlo MP3 method has low-rank polynomial size dependence of the operation cost, a negligible memory cost, and a naturally parallel computational kernel, while reproducing the correct correlation energies of small molecules within a few mEh after 10^6 Monte Carlo steps.
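The Metropolis-based importance sampling the abstract relies on can be illustrated with a one-dimensional toy integral (a sketch with an assumed normalized Gaussian weight function, not the paper's 20-dimensional MP3 integrand): sample x from the weight w with a Metropolis chain and average f(x)/w(x), whose expectation is the integral of f.

```python
import math
import random

def metropolis_integrate(f, w, steps=20000, step_size=1.0, burn=1000):
    """Estimate the integral of f by Metropolis-sampling the normalized
    weight w and averaging f(x)/w(x), since E_w[f/w] = integral of f."""
    x, total, n = 0.0, 0.0, 0
    for i in range(steps + burn):
        cand = x + random.uniform(-step_size, step_size)
        # Metropolis acceptance on the weight function.
        if random.random() < min(1.0, w(cand) / w(x)):
            x = cand
        if i >= burn:
            total += f(x) / w(x)
            n += 1
    return total / n

random.seed(1)
# Toy example: the integral of exp(-x^2) over the real line is sqrt(pi).
f = lambda x: math.exp(-x * x)
w = lambda x: math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)  # normalized Gaussian
estimate = metropolis_integrate(f, w)
```

The weight must be normalized for the average to equal the integral directly; otherwise the normalization constant has to be estimated separately.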
NASA Astrophysics Data System (ADS)
Alzate-Cardona, J. D.; Barco-Rios, H.; Restrepo-Parra, E.
2018-02-01
The magnetocaloric behavior of La2/3Ca1/3Mn1-xFexO3 for x = 0.00, 0.02, 0.03, 0.05, 0.07, 0.08 and 0.10 under the influence of an external magnetic field was simulated and analyzed. Simulations were carried out using the Monte Carlo method and the classical Heisenberg model under the Metropolis algorithm. These mixed-valence manganites are characterized by having three types of magnetic ions: Mn4+ (S = 3/2), which are bonded with Ca2+, and Mn_eg^3+ and Mn_eg'^3+ (S = 2), related to La3+. The Fe ions were randomly included, replacing Mn ions. With this model, the magnetic entropy change, ΔS, in an isothermal process was determined. -ΔSm showed maximum peaks around the paramagnetic-ferromagnetic transition temperature, which depends on Fe doping. Relative cooling power was computed for different Fe concentrations varying the applied magnetic field. Our model and results show that Fe doping decreases the magnetocaloric effect in La2/3Ca1/3Mn1-xFexO3, making it a poor candidate for magnetic refrigeration. The strong dependence of the magnetocaloric behavior on Fe doping and the external magnetic field in La2/3Ca1/3Mn1-xFexO3 may nevertheless inform future technological applications of these materials.
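A single-site Metropolis update for a classical Heisenberg model can be sketched as follows; this is a minimal 1D-chain illustration with one spin species, not the paper's manganite Hamiltonian (which has three magnetic ion species, random Fe substitution and an applied field):

```python
import math
import random

def random_unit_vector():
    """Uniform point on the unit sphere (normalize a Gaussian 3-vector)."""
    v = [random.gauss(0.0, 1.0) for _ in range(3)]
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def metropolis_sweep(spins, L, J, T):
    """One Metropolis sweep of a classical Heisenberg chain of length L
    with periodic boundaries; H = -J * sum_i S_i . S_{i+1}."""
    for i in range(L):
        old, new = spins[i], random_unit_vector()
        nb = [spins[(i - 1) % L], spins[(i + 1) % L]]
        h = [nb[0][k] + nb[1][k] for k in range(3)]      # local field
        d_e = -J * sum((new[k] - old[k]) * h[k] for k in range(3))
        if d_e <= 0 or random.random() < math.exp(-d_e / T):
            spins[i] = new

random.seed(3)
L = 20
spins = [random_unit_vector() for _ in range(L)]
for _ in range(300):
    metropolis_sweep(spins, L, J=1.0, T=0.1)
# Nearest-neighbor alignment develops at low T (ferromagnetic J > 0).
nn_corr = sum(
    sum(spins[i][k] * spins[(i + 1) % L][k] for k in range(3)) for i in range(L)
) / L
```

Quantities such as the entropy change ΔS would then be extracted from magnetization curves accumulated over many such sweeps at different temperatures and fields.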
NASA Astrophysics Data System (ADS)
Cao, Chao
2009-03-01
Nano-scale physical phenomena and processes, especially those in electronics, have drawn great attention in the past decade. Experiments have shown that electronic and transport properties of functionalized carbon nanotubes are sensitive to adsorption of gas molecules such as H2, NO2, and NH3. Similar measurements have also been performed to study adsorption of proteins on other semiconductor nano-wires. These experiments suggest that nano-scale systems can be useful for making future chemical and biological sensors. Aiming to understand the physical mechanisms underlying and governing property changes at the nano-scale, we start off by investigating, via a first-principles method, the electronic structure of Pd-CNT before and after hydrogen adsorption, and continue with coherent electronic transport using non-equilibrium Green’s function techniques combined with density functional theory. Once our results are fully analyzed they can be used to interpret and understand experimental data, with a few difficult issues to be addressed. Finally, we discuss a newly developed multi-scale computing architecture, OPAL, that coordinates simultaneous execution of multiple codes. Inspired by the capabilities of this computing framework, we present a scenario of future modeling and simulation of multi-scale, multi-physical processes.
ERIC Educational Resources Information Center
Obomanu, B. J.; Adaramola, M. O.
2011-01-01
We report a research into factors related to underachievement in science, technology and mathematics (STM) education in schools in Rivers State, Nigeria. The study investigated 240 Nigerian secondary school students, 100 parents, 140 STM teachers and 20 government officials from Port Harcourt Metropolis. Five (5) research questions and one…
The Vitality of a City: Challenge to Higher Education; Challenge to Education: A New Approach.
ERIC Educational Resources Information Center
Johnson, Byron
The US higher education system adopted the European pattern of separating the university from the city. This pattern has changed somewhat in the last few decades, when new universities or branches of older ones have appeared in the metropolis. But frequently these institutions are unconcerned with finding ways to contribute toward improving urban…
Teachers' Level of Awareness of 21st Century Occupational Roles in Rivers State Secondary Schools
ERIC Educational Resources Information Center
Uche, Chineze M.; Kaegon, Leesi E. S. P.; Okata, Fanny Chiemezie
2016-01-01
This study investigated the teachers' level of awareness of 21st century occupational roles in Rivers state secondary schools. Three research questions and three hypotheses guided the study. The population of the study comprised 247 public secondary schools and 57 private secondary schools in Port Harcourt metropolis of Rivers state, which gave a…
ERIC Educational Resources Information Center
Yusuf, Hanna Onyi
2014-01-01
This study assessed the implementation of the reading component of the Junior Secondary School English Language Curriculum for Basic Education in Nigeria. Ten (10) randomly selected public and private secondary schools from Kaduna metropolis in Kaduna State of Nigeria were used for the study. Among the factors assessed in relation to the…
Code of Federal Regulations, 2014 CFR
2014-07-01
... representatives include commissioned, warrant, and petty officers of the U.S. Coast Guard. (d) Informational...-90.5 (West Virginia). 9. 1 day—Third or fourth of July Harrah's Casino/Metropolis Fireworks... Mississippi River mile marker 518.0 to 519.0 (Iowa). 27. 1 day—4th of July weekend Harrah's Casino and Hotel...
ERIC Educational Resources Information Center
Awere, E.; Edu-Buandoh, K. B. M.; Dadzie, D. K.; Aboagye, J. A.
2016-01-01
Building Technology graduates from Ghanaian Polytechnics seek employment in the construction industry, yet little information is known as to whether their tertiary education is really related to and meeting the actual needs of their prospective employers in the construction industry. The tracer study was conducted to ascertain the performance of…
1985-10-30
61A-31-005 (30 Oct 1985) --- This almost vertical view, photographed from Earth-orbit by an STS-61A crew member, centers on the metropolis of Milwaukee, Wisconsin, and some of the adjacent Lake Michigan shoreline, southward toward the Illinois border. The 70mm frame was photographed on the first day of the Spacelab D-1 mission with a handheld Hasselblad camera.
ERIC Educational Resources Information Center
Boakye-Amponsah, Abraham; Enninful, Ebenezer Kofi; Anin, Emmanuel Kwabena; Vanderpuye, Patience
2015-01-01
Background: Ghana, being a member of the United Nations, committed to the Universal Primary Education initiative in 2000 and has since implemented a series of educational reforms to meet the target for Millennium Development Goal (MDG) 2. Despite the numerous government interventions to achieve MDG 2, many children in Ghana have been denied…
ERIC Educational Resources Information Center
Tunckan, Ergun
2007-01-01
The Open Education Faculty Student Centers have been offering many services to students in Turkey since 1982. Building bridges between students and faculties, the student centers have seen technological improvements since 1998; thereafter the quality of services has increased and the services given to students at the student centers have been…
Visual defects and commercial motorcycle accidents in south eastern Nigeria.
Achigbu, E O; Fiebai, B
2013-01-01
Commercial motorcyclists are a regular part of our highways, especially with the decrease in the number and quality of good roads. This study is aimed at determining the role of vision, if any, in the increasing number of road traffic accidents (RTAs) among commercial motorcyclists in Enugu metropolis, Nigeria. A cross-sectional survey with a multi-stage random sampling design was used to select the 615 commercial motorcyclists in Enugu metropolis enrolled in the study. Of the 615 motorcyclists, seven (1.14% +/- 0.70%) had visual impairment (< 6/18-3/60). Visual field defect was noted in 2.3% +/- 0.98%, while 2.6% +/- 0.98% had colour vision defect. The prevalence of road traffic accident (RTA) was 57.7%. Visual impairment was not significantly associated with RTA (P = 0.333), while visual field defect (P = 0.000) and colour vision defect (P = 0.003) were positively associated with RTA. Inexperienced riders had significantly more RTAs than their counterparts (P = 0.000). CONCLUSION: Visual field defect and colour vision defect were significantly associated with RTA, but this finding is against the backdrop of poor training and inexperience, which also significantly affected RTA among the predominantly young riders involved in RTAs.
Spatial patterns monitoring of road traffic injuries in Karachi metropolis.
Lateef, Muhammad U
2011-06-01
This article aims to assess the pattern of road traffic injuries (RTIs) and fatalities in Karachi metropolis. Assessing the pattern of RTIs in Karachi at this juncture is important for many reasons. The rapid motorisation in the recent years due to the availability of credit has significantly increased the traffic volume of the city. Since then, the roads of Karachi have continuously developed at a rapid pace. This development has come with a high human loss, because the construction of multilevel flyovers, signal-free corridors and the resulting high-speed traffic ultimately increase the severity of injuries. The reasons for this high proportion are inadequate infrastructure, poor enforcement of safety regulations, high crash severity index and greater population of vulnerable road user groups (riders and pedestrians). This research is the first of its kind in the country to have a geocoded database of fatalities and injuries in a geographical information system for the entire city of Karachi. In fact, road crashes are both predictable and preventable. Developing countries should learn from the experience of highly motorised nations to avoid the high burden of RTIs by adopting road safety and prevention measures.
Zhang, Wenquan; Logan, John R.
2018-01-01
The rapid growth of Asian and Hispanic populations in urban areas is superseding traditional classifications of neighborhoods (for example as white, transitional, or minority). The “global neighborhood” that includes all groups (white, black, Hispanic and Asian) is one important new category. We examine the emerging spatial pattern of racial/ethnic composition in the Chicago metropolis, documenting an expansion of all-minority neighborhoods in the city and just beyond its borders, a shrinking set of all-white neighborhoods in the outer suburbs, and more diverse neighborhoods including whites mainly in between. The most novel element of this pattern is how large the zone of diversity has become and how far it extends into suburbia, upending the old dichotomy of “chocolate city” and “vanilla suburbs.” In addition to comparing the distance of different kinds of neighborhoods from the urban core, we also analyze their adjacency to neighborhoods of the same type or other types. There is a strong tendency toward spatial clustering of each neighborhood type and also for transitions on the boundaries of clusters either to expand or to contract their territory. PMID:29430517
NASA Astrophysics Data System (ADS)
Wall, Michael
2014-03-01
Experimental progress in generating and manipulating synthetic quantum systems, such as ultracold atoms and molecules in optical lattices, has revolutionized our understanding of quantum many-body phenomena and posed new challenges for modern numerical techniques. Ultracold molecules, in particular, feature long-range dipole-dipole interactions and a complex and selectively accessible internal structure of rotational and hyperfine states, leading to many-body models with long range interactions and many internal degrees of freedom. Additionally, the many-body physics of ultracold molecules is often probed far from equilibrium, and so algorithms which simulate quantum many-body dynamics are essential. Numerical methods which are to have significant impact in the design and understanding of such synthetic quantum materials must be able to adapt to a variety of different interactions, physical degrees of freedom, and out-of-equilibrium dynamical protocols. Matrix product state (MPS)-based methods, such as the density-matrix renormalization group (DMRG), have become the de facto standard for strongly interacting low-dimensional systems. Moreover, the flexibility of MPS-based methods makes them ideally suited both to generic, open source implementation as well as to studies of the quantum many-body dynamics of ultracold molecules. After introducing MPSs and variational algorithms using MPSs generally, I will discuss my own research using MPSs for many-body dynamics of long-range interacting systems. In addition, I will describe two open source implementations of MPS-based algorithms in which I was involved, as well as educational materials designed to help undergraduates and graduates perform research in computational quantum many-body physics using a variety of numerical methods including exact diagonalization and static and dynamic variational MPS methods. 
Finally, I will discuss ongoing research on ultracold molecules in optical lattices, such as the exploration of many-body physics with polyatomic molecules, and the next generation of open source matrix product state codes. This work was performed in the research group of Prof. Lincoln D. Carr.
NASA Astrophysics Data System (ADS)
Tuve, T.; Mostaccio, A.; Langer, H. K.; di Grazia, G.
2005-12-01
A recent research project carried out together with the Italian Civil Protection concerns the study of amplitude decay laws in various areas of the Italian territory, including Mt Etna. A particular feature of seismic activity is the presence of moderate-magnitude earthquakes frequently causing considerable damage in the epicentre areas. These earthquakes are supposed to occur at rather shallow depth, no more than 5 km. Given the geological context, however, these shallow earthquakes would originate in rather weak sedimentary material. In this study we check the reliability of standard earthquake location, in particular with respect to the calculated focal depth, using standard location methods as well as more advanced approaches such as the NONLINLOC software proposed by Lomax et al. (2000), using it with its various options (i.e., Grid Search, Metropolis-Gibbs and Oct-Tree) and a 3D velocity model (Cocina et al., 2005). All three options of NONLINLOC gave comparable results with respect to hypocenter locations and quality. Compared to standard locations we note a significant improvement in location quality and, in particular, a considerable difference in focal depths (on the order of 1.5 - 2 km). However, we cannot find a clear bias towards greater or lower depth. Further analyses concern the assessment of the stability of locations. For this purpose we carry out various Monte Carlo experiments, randomly perturbing travel-time readings. Further investigations are devoted to possible biases which may arise from the use of an unsuitable velocity model.
Carlacci, Louis; Millard, Charles B; Olson, Mark A
2004-10-01
The X-ray crystal structure of the reaction product of acetylcholinesterase (AChE) with the inhibitor diisopropylphosphorofluoridate (DFP) showed significant structural displacement in a loop segment of residues 287-290. To understand this conformational selection, a Monte Carlo (MC) simulation study of the energy landscape for the loop segment was performed. A computational strategy was applied using a combined simulated annealing and room-temperature Metropolis sampling approach, with solvent polarization modeled by a generalized Born (GB) approximation. Results from thermal annealing reveal a landscape topology with a broader basin opening and a greater distribution of energies for the displaced loop conformation, while the ensemble average of conformations at 298 K favored a shift in populations toward the native state by a free-energy difference in good agreement with the estimated experimental value. Residue motions along a reaction profile of loop conformational reorganization are proposed, in which Arg-289 is critical in determining electrostatic effects of solvent interaction versus Coulombic charging.
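The combined simulated-annealing/Metropolis strategy can be sketched on a one-dimensional toy landscape (an illustrative stand-in with an assumed double-well `landscape` function, not the GB loop-modeling protocol): sample with the Metropolis rule while geometrically lowering the temperature, keeping the best state seen.

```python
import math
import random

def simulated_annealing(energy, x0, t_start, t_end, steps, step_size=0.5):
    """Metropolis sampling while cooling geometrically from t_start to
    t_end, tracking the lowest-energy state encountered."""
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / (steps - 1))  # cooling schedule
        cand = x + random.uniform(-step_size, step_size)
        e_cand = energy(cand)
        # Metropolis criterion at the current temperature.
        if e_cand <= e or random.random() < math.exp(-(e_cand - e) / t):
            x, e = cand, e_cand
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

random.seed(4)
# Toy tilted double-well: minima near x = +2 and x = -2.
landscape = lambda x: (x * x - 4.0) ** 2 + 0.5 * x
x_best, e_best = simulated_annealing(landscape, x0=2.0,
                                     t_start=5.0, t_end=0.01, steps=5000)
```

A room-temperature Metropolis run at fixed T would use the same acceptance rule without the cooling schedule, yielding Boltzmann-weighted ensemble averages instead of a minimum.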
Androids: application of EAP as artificial muscles to entertainment industry
NASA Technical Reports Server (NTRS)
Hanson, D.; Pioggia, G.; Bar-Cohen, Yoseph; de Rossi, D.
2001-01-01
The classic movie Metropolis (1927), which is nowadays considered a cinema milestone, showed the possibility of building robots called androids, where science and fiction run together to realize a dream: the human-like robot. In that movie, Dr. Rotwang transforms a simple, cold calculating robot into the body of a beautiful woman. Robots have often been depicted as metal creatures with cold steel bodies, but there is no reason why metals should be the only kind of material for the construction of robots. The authors examined the issues related to applying electroactive polymer materials (EAP) to the entertainment industry. EAP offers attractive characteristics with the potential to produce more realistic models of living creatures at significantly lower cost. This paper seeks to elucidate how EAP might infiltrate and ultimately revolutionize entertainment, showing some applicative examples.
ERIC Educational Resources Information Center
Akom, A. A.
2008-01-01
In this article, I reflect on Signithia Fordham and John Ogbu's classic research on the "burden of "acting White"" to develop a long overdue dialogue between Africana studies and critical white studies. It highlights the dialectical nature of Fordham and Ogbu's philosophy of race and critical race theory by locating the origins of the "burden of…
ERIC Educational Resources Information Center
Amosa, Abdul Ganiyu Alasela; Ogunlade, Oyeronke Olufunmilola; Atobatele, Adunni Suliat
2015-01-01
The use of field trip in teaching and learning helps to bring about effective and efficient learning in Basic Technology. Field trip is a group excursion away from the normal education environment for firsthand experience of an historic site or place of special interest. This study therefore was geared towards finding out the effect of field trip…
ERIC Educational Resources Information Center
Bonney, Ebenezer Appah; Amoah, Daniel F.; Micah, Sophia A.; Ahiamenyo, Comfort; Lemaire, Margaret B.
2015-01-01
The study investigated the relationship between the quality of teachers and students' academic performance in Sekondi Takoradi Metropolitan Assembly (STMA) Junior High Schools. A descriptive survey design was used, and the target population was Junior High School teachers and pupils in the metropolis. Five educational circuits in the metropolis…
ERIC Educational Resources Information Center
LAMANNA, RICHARD A.; SAMORA, JULIAN
Mexican Americans who have migrated to the industrial complex of East Chicago are analyzed to determine the validity of a hypothesis that this group was provided opportunities, not available to their counterparts in the Southwest, for assimilation into the community. A concise report on the history of the Mexican-American colony in East Chicago, its…
ERIC Educational Resources Information Center
Torto, Gertrude Afiba
2017-01-01
The English language curriculum for primary schools in Ghana spells out the various aspects, topics and sub topics that teachers must teach the child within a specified time. The syllabus again specifies the various topics and sub topics that should be taught in an integrated manner so as to enhance meaningful learning. For this meaningful…
Area Handbook Series: Paraguay: A Country Study
1988-12-01
fattening period for steers. Artificial insemination was increasingly common. To a certain extent, cattle raising reflected the disparities in agriculture...metropolis and satellite. Joseph had no constituency in Spanish America. Without a king, the entire colonial system lost its legitimacy, and the colonists...generally more dependable than local service because it used a microwave and satellite transmission system. Telex services also were available through
ERIC Educational Resources Information Center
Cobbold, Cosmas; Boateng, Philip
2016-01-01
The objective of the study was to investigate kindergarten teachers' efficacy beliefs in classroom management. The sample size was 299 teachers drawn from both public and private kindergarten schools in the Kumasi Metropolis of Ghana. The efficacy beliefs of the teachers with respect to their classroom management practices were measured on a…
ERIC Educational Resources Information Center
Achor, Emmanuel E.; Amadu, Samuel O.
2015-01-01
This study examined the extent to which school outdoor activities could enhance senior secondary (SS) two students' achievement in ecology. A non-randomized pre-test, post-test control group quasi-experimental design was adopted. A sample of 160 SS II students from 4 co-educational schools in Jalingo metropolis, Taraba State, Nigeria was used. A 40…
ERIC Educational Resources Information Center
Durowoju, Esther O.; Onuka, Adams O. U.
2015-01-01
The paper investigated the effect of teacher self-efficacy enhancement and school location on students' achievement in Economics in Senior Secondary School in Ibadan Metropolis of Oyo State, Nigeria. Three hypotheses were tested at 0.05 level of significance. Multi-stage sampling technique was adopted in the study. Four Local Government Areas (two…
Kaur, Harparkash; Allan, Elizabeth Louise; Mamadu, Ibrahim; Hall, Zoe; Ibe, Ogochukwu; El Sherbiny, Mohamed; van Wyk, Albert; Yeung, Shunmay; Swamidoss, Isabel; Green, Michael D.; Dwivedi, Prabha; Culzoni, Maria Julia; Clarke, Siân; Schellenberg, David; Fernández, Facundo M.; Onwujekwe, Obinna
2015-01-01
Background Artemisinin-based combination therapies are recommended by the World Health Organisation (WHO) as first-line treatment for Plasmodium falciparum malaria, yet medication must be of good quality for efficacious treatment. A recent meta-analysis reported that 35% (796/2,296) of antimalarial drug samples from 21 Sub-Saharan African countries, purchased from outlets predominantly using convenience sampling, failed chemical content analysis. We used three sampling strategies to purchase artemisinin-containing antimalarials (ACAs) in Enugu metropolis, Nigeria, and compared the resulting quality estimates. Methods ACAs were purchased using three sampling approaches - convenience, mystery clients and overt - within a defined area and sampling frame in Enugu metropolis. The active pharmaceutical ingredients (APIs) were assessed using high-performance liquid chromatography and confirmed by mass spectrometry at three independent laboratories. Results were expressed as a percentage of the APIs stated on the packaging and used to categorise each sample as acceptable quality, substandard, degraded, or falsified. Results Content analysis of 3,024 samples purchased from 421 outlets using convenience (n=200), mystery (n=1,919) and overt (n=905) approaches showed that, overall, 90.8% of ACAs were of acceptable quality, 6.8% substandard, 1.3% degraded and 1.2% falsified. Convenience sampling yielded a significantly higher prevalence of poor-quality ACAs, a finding not replicated by the mystery and overt sampling strategies, both of which yielded results that were comparable with each other. Artesunate (n=135; 4 falsified) and dihydroartemisinin (n=14) monotherapy tablets, not recommended by WHO, were also identified. Conclusion Randomised sampling identified fewer falsified ACAs than previously reported by convenience approaches. Our findings emphasise the need for specific consideration to be given to the sampling frame and sampling approach if representative information on drug quality is to be obtained.
PMID:26018221
NASA Astrophysics Data System (ADS)
Suh, Donghyuk; Radak, Brian K.; Chipot, Christophe; Roux, Benoît
2018-01-01
Molecular dynamics (MD) trajectories based on classical equations of motion can be used to sample the configurational space of complex molecular systems. However, brute-force MD often converges slowly due to the ruggedness of the underlying potential energy surface. Several schemes have been proposed to address this problem by effectively smoothing the potential energy surface. However, in order to recover the proper Boltzmann equilibrium probability distribution, these approaches must then rely on statistical reweighting techniques or generate the simulations within a Hamiltonian tempering replica-exchange scheme. The present work puts forth a novel hybrid sampling propagator combining Metropolis-Hastings Monte Carlo (MC) with proposed moves generated by non-equilibrium MD (neMD). This hybrid neMD-MC propagator comprises three elements: (i) an atomic system is dynamically propagated for some period of time using standard equilibrium MD on the correct potential energy surface; (ii) the system is then propagated for a brief period of time during what is referred to as a "boosting phase," via a time-dependent Hamiltonian that is evolved toward the perturbed potential energy surface and then back to the correct potential energy surface; (iii) the resulting configuration at the end of the neMD trajectory is then accepted or rejected according to a Metropolis criterion before returning to step (i). A symmetric two-end momentum reversal prescription is used at the end of the neMD trajectories to guarantee that the hybrid neMD-MC sampling propagator obeys microscopic detailed balance and rigorously yields the equilibrium Boltzmann distribution. The hybrid neMD-MC sampling propagator is designed and implemented to enhance sampling by relying on the accelerated MD and solute tempering schemes, and is further combined with the adaptive biasing force sampling algorithm.
Illustrative tests with specific biomolecular systems indicate that the method can yield a significant speedup.
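The Metropolis criterion invoked in step (iii) can be illustrated in isolation. The sketch below is a deliberately minimal 1D analogue, not the authors' neMD-MC propagator: a symmetric random-walk proposal on a double-well potential U(x) = (x² − 1)², accepted with probability min(1, exp(−ΔU/kT)). The potential and parameters are illustrative assumptions.

```python
import math
import random

def metropolis_chain(n_steps, kT=0.4, step=0.5, x0=-1.0, seed=1):
    """Sample exp(-U/kT) on the double well U(x) = (x^2 - 1)^2 with a
    symmetric random-walk proposal and the Metropolis acceptance test."""
    rng = random.Random(seed)
    U = lambda x: (x * x - 1.0) ** 2
    x, samples, accepted = x0, [], 0
    for _ in range(n_steps):
        y = x + rng.uniform(-step, step)   # symmetric proposal
        # accept with probability min(1, exp(-(U(y) - U(x)) / kT))
        if rng.random() < math.exp(min(0.0, -(U(y) - U(x)) / kT)):
            x, accepted = y, accepted + 1
        samples.append(x)
    return samples, accepted / n_steps

samples, acc = metropolis_chain(20000)
```

Because the proposal is symmetric, the Hastings correction cancels and only the energy difference enters the test; the chain started in the left well eventually visits both minima.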
Chen, Yunjie; Roux, Benoît
2014-09-21
Hybrid schemes combining the strength of molecular dynamics (MD) and Metropolis Monte Carlo (MC) offer a promising avenue to improve the sampling efficiency of computer simulations of complex systems. A number of recently proposed hybrid methods consider new configurations generated by driving the system via a non-equilibrium MD (neMD) trajectory, which are subsequently treated as putative candidates for Metropolis MC acceptance or rejection. To obey microscopic detailed balance, it is necessary to alter the momentum of the system at the beginning and/or the end of the neMD trajectory. This strict rule then guarantees that the random walk in configurational space generated by such a hybrid neMD-MC algorithm will yield the proper equilibrium Boltzmann distribution. While a number of different constructs are possible, the most commonly used prescription has been to simply reverse the momenta of all the particles at the end of the neMD trajectory ("one-end momentum reversal"). Surprisingly, it is shown here that the choice of momentum reversal prescription can have a considerable effect on the rate of convergence of the hybrid neMD-MC algorithm, with the simple one-end momentum reversal encountering particularly acute problems. In these neMD-MC simulations, different regions of configurational space end up being essentially isolated from one another due to a very small transition rate between regions. In the worst-case scenario, it is almost as if the configurational space does not constitute a single communicating class that can be sampled efficiently by the algorithm, and extremely long neMD-MC simulations are needed to obtain proper equilibrium probability distributions. To address this issue, a novel momentum reversal prescription, symmetrized with respect to both the beginning and the end of the neMD trajectory ("symmetric two-ends momentum reversal"), is introduced.
Illustrative simulations demonstrate that the hybrid neMD-MC algorithm robustly yields a correct equilibrium probability distribution with this prescription.
Achievements and future path of Tehran municipality in urban health domain: An Iranian experience
Damari, Behzad; Riazi-Isfahani, Sahand
2016-01-01
Background: According to national laws and world experience, the provision, maintenance, and improvement of citizens’ health are considered essential functions of municipalities as "social institutes". In order to equitably promote health conditions at the urban level, particularly in marginal areas, targeted efforts have been implemented in the municipality of the Tehran metropolis since 2004. This study identifies these targeted measures, analyzes the health interventions within a conceptual framework, and proposes a future path. Methods: This is a qualitative study with a content analysis approach. Data were collected through document review and structured interviews with national health policy-making and planning experts and executive managers of the 22 regional municipalities of the Tehran metropolis. The data were analyzed on the basis of a conceptual framework prepared for urban health in 4 domains: municipal interventions, goal achievements, drivers and obstacles of success, and the way forward. Results: From the viewpoint of the interviewees, the new health actions of the Tehran municipality are based largely on public participation, and the municipality was able to prioritize health in the programs and policies of the Tehran city council. The Tehran municipality has carried out three types of interventions to improve health, which, in order of extent, are: facilitative, promotional, and mandatory interventions. Development and institutionalization of public participation is the greatest achievement of these health-oriented actions; expansion of environmental and physical health-oriented facilities and promotion of a healthy lifestyle rank next.
Conclusion: Since management turnover seriously challenges the institutionalization of actions and innovations, especially in developing countries, it is suggested that mayors of metropolitan cities like Tehran document and review municipal health measures as soon as possible and, while eliminating overlaps between their interventions and those of other sectors, design and approve a charter for a "health promoting municipality". The most important role of municipalities in this charter would be coordinating the health improvement of citizens. This charter, once approved as a national policy, could be used for other cities too. PMID:27390693
NASA Astrophysics Data System (ADS)
Fedosov, Dmitry
2011-03-01
Computational biophysics is a large and rapidly growing area of computational physics. In this talk, we will focus on a number of biophysical problems related to blood cells and blood flow in health and disease. Blood flow plays a fundamental role in a wide range of physiological processes and pathologies in the organism. To understand and, if necessary, manipulate the course of these processes it is essential to investigate blood flow under realistic conditions including deformability of blood cells, their interactions, and behavior in the complex microvascular network. Using a multiscale cell model we are able to accurately capture red blood cell mechanics, rheology, and dynamics in agreement with a number of single cell experiments. Further, this validated model yields accurate predictions of the blood rheological properties, cell migration, cell-free layer, and hemodynamic resistance in microvessels. In addition, we investigate blood related changes in malaria, which include a considerable stiffening of red blood cells and their cytoadherence to endothelium. For these biophysical problems computational modeling is able to provide new physical insights and capabilities for quantitative predictions of blood flow in health and disease.
Dynamic population flow based risk analysis of infectious disease propagation in a metropolis.
Zhang, Nan; Huang, Hong; Duarte, Marlyn; Zhang, Junfeng Jim
2016-09-01
Knowledge of the characteristics of infectious disease propagation in metropolises plays a critical role in guiding public health intervention strategies to reduce death tolls, disease incidence, and possible economic losses. Based on the SIR model, we established a comprehensive spatiotemporal risk assessment model to compute infectious disease propagation within an urban setting, using Beijing, China as a case study. The model was developed for a dynamic population distribution using actual data on the location and density of residences and offices and on means of public transportation (e.g., subways, buses and taxis). We evaluated four influencing factors: biological, behavioral, and environmental parameters, and infectious sources. The model output resulted in a set of maps showing how the four influencing factors affected the trend and characteristics of airborne infectious disease propagation in Beijing. We compared scenarios for the long-term dynamic propagation of infectious disease without governmental interventions versus scenarios with government intervention and hospital-coordinated emergency responses. Lastly, we analyzed the sensitivity of infection spread to the average number of people at different locations. Based on our results, we provide recommendations to governmental agencies and the public for minimizing disease propagation. Copyright © 2016 Elsevier Ltd. All rights reserved.
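The SIR backbone underlying such a model can be sketched with a simple forward-Euler integration. The rates below are illustrative placeholders, not the calibrated Beijing parameters.

```python
def simulate_sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, dt=0.1, days=200):
    """Forward-Euler integration of the classic SIR equations:
    dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I.
    Fractions of the population; S + I + R is conserved by construction."""
    s, i, r = s0, i0, 0.0
    history = [(s, i, r)]
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt   # new infections this step
        new_rec = gamma * i * dt      # new recoveries this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        history.append((s, i, r))
    return history

hist = simulate_sir()
peak_i = max(i for _, i, _ in hist)
final_s = hist[-1][0]
```

With these parameters the basic reproduction number is beta/gamma = 3, so the sketch produces a full epidemic wave; spatial structure and interventions would layer on top of this core.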
NASA Astrophysics Data System (ADS)
Pankratov, Oleg; Kuvshinov, Alexey
2016-01-01
Despite impressive progress in the development and application of electromagnetic (EM) deterministic inverse schemes to map the 3-D distribution of electrical conductivity within the Earth, there is one question which remains poorly addressed—uncertainty quantification of the recovered conductivity models. Apparently, only an inversion based on a statistical approach provides a systematic framework to quantify such uncertainties. The Metropolis-Hastings (M-H) algorithm is the most popular technique for sampling the posterior probability distribution that describes the solution of the statistical inverse problem. However, all statistical inverse schemes require an enormous number of forward simulations and thus appear to be extremely demanding computationally, if not prohibitive, if a 3-D setup is invoked. This urges the development of fast and scalable 3-D modelling codes which can run large-scale 3-D models of practical interest in fractions of a second on high-performance multi-core platforms. But, even with these codes, the challenge for M-H methods is to construct proposal functions that simultaneously provide a good approximation of the target density function while being inexpensive to sample. In this paper we address both of these issues. First we introduce a variant of the M-H method which uses information about the local gradient and Hessian of the penalty function. This, in particular, allows us to exploit adjoint-based machinery that has been instrumental for the fast solution of deterministic inverse problems. We explain why this modification of M-H significantly accelerates sampling of the posterior probability distribution. In addition we show how Hessian handling (inverse, square root) can be made practicable by a low-rank approximation using the Lanczos algorithm. Ultimately we discuss uncertainty analysis based on stochastic inversion results. In addition, we demonstrate how this analysis can be performed within a deterministic approach.
In the second part, we summarize modern trends in the development of efficient 3-D EM forward modelling schemes with special emphasis on recent advances in the integral equation approach.
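The gradient-informed M-H variant described above can be illustrated, in a drastically simplified 1D form, by the Metropolis-adjusted Langevin algorithm (MALA): the proposal drifts along the gradient of the log-density, and the Metropolis-Hastings ratio must include the asymmetric proposal densities. This is a generic sketch on a standard normal target, not the authors' adjoint/Hessian machinery.

```python
import math
import random

def mala(n_steps, eps=0.2, seed=7):
    """MALA on a standard normal target: log pi(x) = -x^2/2, so
    grad log pi(x) = -x.  Proposal: y = x + eps*grad + sqrt(2*eps)*noise."""
    rng = random.Random(seed)
    log_pi = lambda x: -0.5 * x * x
    grad = lambda x: -x

    def log_q(x2, x1):
        # log density (up to a constant) of proposing x2 from x1
        mean = x1 + eps * grad(x1)
        return -((x2 - mean) ** 2) / (4.0 * eps)

    x, samples = 0.0, []
    for _ in range(n_steps):
        y = x + eps * grad(x) + math.sqrt(2.0 * eps) * rng.gauss(0.0, 1.0)
        # full M-H ratio: target ratio times proposal-density ratio
        log_alpha = log_pi(y) - log_pi(x) + log_q(x, y) - log_q(y, x)
        if rng.random() < math.exp(min(0.0, log_alpha)):
            x = y
        samples.append(x)
    return samples

samples = mala(20000)
mean = sum(samples) / len(samples)
var = sum(s * s for s in samples) / len(samples)
```

The `log_q` correction is what distinguishes Metropolis-Hastings from plain Metropolis here: the drifted proposal is not symmetric, so dropping it would bias the chain.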
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rotondo, M.; Rueda, Jorge A.; Xue, S.-S.
The Feynman-Metropolis-Teller treatment of compressed atoms is extended to the relativistic regimes. Each atomic configuration is confined by a Wigner-Seitz cell and is characterized by a positive electron Fermi energy. The nonrelativistic treatment assumes a pointlike nucleus, and infinite values of the electron Fermi energy can be attained. In the relativistic treatment there exists a limiting configuration, reached when the Wigner-Seitz cell radius equals the radius of the nucleus, with a maximum value of the electron Fermi energy (E_e^F)_max, here expressed analytically in the ultrarelativistic approximation. The corrections given by the relativistic Thomas-Fermi-Dirac exchange term are also evaluated and shown to be generally small and negligible in the relativistic high-density regime. The dependence of the relativistic electron Fermi energies on compression for selected nuclei is compared and contrasted with the nonrelativistic ones and with the ones obtained in the uniform approximation. The relativistic Feynman-Metropolis-Teller approach presented here overcomes some difficulties in the Salpeter approximation generally adopted for compressed matter in physics and astrophysics. The treatment is then extrapolated to compressed nuclear matter cores of stellar dimensions with A ≈ (m_Planck/m_n)^3 ≈ 10^57 or M_core ≈ M_⊙. A new family of equilibrium configurations exists for selected values of the electron Fermi energy varying in the range 0 < E_e^F ≤ (E_e^F)_max.
Exploring the Connection Between Sampling Problems in Bayesian Inference and Statistical Mechanics
NASA Technical Reports Server (NTRS)
Pohorille, Andrew
2006-01-01
The Bayesian and statistical mechanical communities often share the same objective in their work - estimating and integrating probability distribution functions (pdfs) describing stochastic systems, models or processes. Frequently, these pdfs are complex functions of random variables exhibiting multiple, well-separated local minima. Conventional strategies for sampling such pdfs are inefficient, sometimes leading to an apparent non-ergodic behavior. Several recently developed techniques for handling this problem have been successfully applied in statistical mechanics. In the multicanonical and Wang-Landau Monte Carlo (MC) methods, the correct pdfs are recovered from uniform sampling of the parameter space by iteratively establishing proper weighting factors connecting these distributions. Trivial generalizations allow for sampling from any chosen pdf. The closely related transition matrix method relies on estimating transition probabilities between different states. All these methods have proved to generate estimates of pdfs with high statistical accuracy. In another MC technique, parallel tempering, several random walks, each corresponding to a different value of a parameter (e.g. "temperature"), are generated and occasionally exchanged using the Metropolis criterion. This method can be considered a statistically correct version of simulated annealing. An alternative approach is to represent the set of independent variables as a Hamiltonian system. Considerable progress has been made in understanding how to ensure that the system obeys the equipartition theorem or, equivalently, that coupling between the variables is correctly described. Then a host of techniques developed for dynamical systems can be used. Among them, probably the most powerful is the Adaptive Biasing Force method, in which thermodynamic integration and biased sampling are combined to yield very efficient estimates of pdfs.
The third class of methods deals with transitions between states described by rate constants. These problems are isomorphic with chemical kinetics problems. Recently, several efficient techniques for this purpose have been developed based on the approach originally proposed by Gillespie. Although the utility of the techniques mentioned above for Bayesian problems has not been determined, further research along these lines is warranted.
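The parallel-tempering exchange mentioned above can be sketched with two chains on a toy double-well energy; the hot replica crosses the barrier freely, and Metropolis-accepted swaps feed its configurations to the cold replica. Temperatures, energy, and swap schedule are illustrative assumptions.

```python
import math
import random

def parallel_tempering(n_steps, temps=(0.2, 2.0), step=0.5, seed=3):
    """Two Metropolis random walks on the double well E(x) = (x^2 - 1)^2,
    with replica swaps accepted by min(1, exp((1/T1 - 1/T2)*(E1 - E2)))."""
    rng = random.Random(seed)
    E = lambda x: (x * x - 1.0) ** 2
    xs = [-1.0, -1.0]          # cold and hot replica positions
    cold = []                  # samples collected from the cold chain
    for it in range(n_steps):
        for k, T in enumerate(temps):           # in-chain Metropolis moves
            y = xs[k] + rng.uniform(-step, step)
            if rng.random() < math.exp(min(0.0, -(E(y) - E(xs[k])) / T)):
                xs[k] = y
        if it % 10 == 0:                        # periodic swap attempt
            d = (1.0 / temps[0] - 1.0 / temps[1]) * (E(xs[0]) - E(xs[1]))
            if rng.random() < math.exp(min(0.0, d)):
                xs[0], xs[1] = xs[1], xs[0]
        cold.append(xs[0])
    return cold

cold = parallel_tempering(20000)
```

At T = 0.2 the cold chain alone would rarely cross the barrier at x = 0; the swaps are what restore ergodic sampling of both wells, which is the point of the method.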
Alderman, Phillip D.; Stanfill, Bryan
2016-10-06
Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. This study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
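A random walk Metropolis sampler of the kind used here can be sketched for a single parameter. The data, the Gaussian likelihood with known sigma, and the flat prior below are all hypothetical stand-ins, not the study's phenology models.

```python
import math
import random

def rw_metropolis(data, sigma=5.0, n_steps=20000, step=1.0, seed=11):
    """Random walk Metropolis for the posterior of a single mean parameter
    mu, with a known-sigma Gaussian likelihood and a flat prior."""
    rng = random.Random(seed)

    def log_post(mu):
        # flat prior: posterior is proportional to the likelihood
        return -sum((d - mu) ** 2 for d in data) / (2.0 * sigma ** 2)

    mu, chain = sum(data) / len(data), []
    for _ in range(n_steps):
        prop = mu + rng.gauss(0.0, step)         # symmetric random walk
        if rng.random() < math.exp(min(0.0, log_post(prop) - log_post(mu))):
            mu = prop
        chain.append(mu)
    return chain

# hypothetical "days to heading" observations for one genotype
days_to_heading = [61, 64, 58, 66, 63, 60, 65, 62]
chain = rw_metropolis(days_to_heading)
post_mean = sum(chain) / len(chain)
```

The full chain, not just its mean, is the output of interest: quantiles of `chain` give the posterior credible intervals whose neglect the abstract criticizes.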
Stochastic evaluation of second-order many-body perturbation energies.
Willow, Soohaeng Yoo; Kim, Kwang S; Hirata, So
2012-11-28
With the aid of the Laplace transform, the canonical expression of the second-order many-body perturbation correction to an electronic energy is converted into the sum of two 13-dimensional integrals, the 12-dimensional parts of which are evaluated by Monte Carlo integration. Weight functions are identified that are analytically normalizable, are finite and non-negative everywhere, and share the same singularities as the integrands. They thus generate appropriate distributions of four-electron walkers via the Metropolis algorithm, yielding correlation energies of small molecules within a few mE_h of the correct values after 10^8 Monte Carlo steps. This algorithm does away with the integral transformation as the hotspot of the usual algorithms, has a far superior size dependence of cost, does not suffer from the sign problem of some quantum Monte Carlo methods, and is potentially easily parallelizable and extensible to other, more complex electron-correlation theories.
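The weight-function idea can be sketched in 1D: to integrate f(x) = x^(-1/2) e^(-x) on (0, 1], sample the normalized weight w(x) = x^(-1/2)/2, which shares f's singularity at x = 0, via the Metropolis algorithm, and average f/w over the walkers. This is a toy analogue of the 13-dimensional case, not the authors' MP2 implementation.

```python
import math
import random

def mc_integrate(n_steps=50000, step=0.3, seed=5):
    """Estimate I = int_0^1 x^(-1/2) e^(-x) dx = sqrt(pi)*erf(1) by
    Metropolis sampling of the weight w(x) = 0.5*x^(-1/2) (normalized on
    (0, 1]) and averaging f(x)/w(x) = 2*e^(-x) over the walker positions."""
    rng = random.Random(seed)
    w = lambda x: 0.5 / math.sqrt(x)
    x, total = 0.5, 0.0
    for _ in range(n_steps):
        y = x + rng.uniform(-step, step)
        # symmetric proposal; proposals outside (0, 1] are rejected
        if 0.0 < y <= 1.0 and rng.random() < w(y) / w(x):
            x = y
        total += 2.0 * math.exp(-x)   # accumulate f(x)/w(x)
    return total / n_steps

estimate = mc_integrate()
exact = math.sqrt(math.pi) * math.erf(1.0)
```

Because w absorbs the integrable singularity, the averaged ratio f/w stays bounded, which is exactly what keeps the variance finite in the scheme described above.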
Markov Chain Monte Carlo from Lagrangian Dynamics.
Lan, Shiwei; Stathopoulos, Vasileios; Shahbaba, Babak; Girolami, Mark
2015-04-01
Hamiltonian Monte Carlo (HMC) improves the computational efficiency of the Metropolis-Hastings algorithm by reducing its random walk behavior. Riemannian HMC (RHMC) further improves the performance of HMC by exploiting the geometric properties of the parameter space. However, the geometric integrator used for RHMC involves implicit equations that require fixed-point iterations. In some cases, the computational overhead for solving implicit equations undermines RHMC's benefits. In an attempt to circumvent this problem, we propose an explicit integrator that replaces the momentum variable in RHMC by velocity. We show that the resulting transformation is equivalent to transforming Riemannian Hamiltonian dynamics to Lagrangian dynamics. Experimental results suggest that our method improves RHMC's overall computational efficiency in the cases considered. All computer programs and data sets are available online (http://www.ics.uci.edu/~babaks/Site/Codes.html) in order to allow replication of the results reported in this paper.
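For readers unfamiliar with the baseline, plain (Euclidean) HMC with an explicit leapfrog integrator looks as follows on a standard normal target; this is the starting point that RHMC and the Lagrangian variant generalize, with illustrative step size and trajectory length.

```python
import math
import random

def hmc(n_samples=2000, eps=0.2, n_leap=10, seed=42):
    """HMC on a standard normal target: U(x) = x^2/2, grad U(x) = x,
    Hamiltonian H(x, p) = U(x) + p^2/2, integrated by leapfrog."""
    rng = random.Random(seed)
    u = lambda x: 0.5 * x * x
    grad_u = lambda x: x
    x, samples = 0.0, []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)                 # resample momentum
        x_new, p_new = x, p
        p_new -= 0.5 * eps * grad_u(x_new)      # initial half kick
        for _ in range(n_leap):
            x_new += eps * p_new                # drift
            p_new -= eps * grad_u(x_new)        # kick
        p_new += 0.5 * eps * grad_u(x_new)      # correct final kick to half
        h_old = u(x) + 0.5 * p * p
        h_new = u(x_new) + 0.5 * p_new * p_new
        # Metropolis test on the Hamiltonian error of the trajectory
        if rng.random() < math.exp(min(0.0, h_old - h_new)):
            x = x_new
        samples.append(x)
    return samples

samples = hmc()
mean = sum(samples) / len(samples)
var = sum(s * s for s in samples) / len(samples)
```

Leapfrog is explicit here because the kinetic energy is fixed; a position-dependent metric, as in RHMC, is what forces the implicit equations the paper seeks to avoid.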
NASA Astrophysics Data System (ADS)
Kim, Seung Joong
The protein folding problem has been one of the most challenging subjects in biological physics due to its complexity. Energy landscape theory based on statistical mechanics provides a thermodynamic interpretation of the protein folding process. We have been working to answer fundamental questions about protein-protein and protein-water interactions, which are very important for describing the energy landscape surface of proteins correctly. First, we present a new method for computing protein-protein interaction potentials of solvated proteins directly from SAXS data. An ensemble of proteins was modeled by Metropolis Monte Carlo and Molecular Dynamics simulations, and the global X-ray scattering of the whole model ensemble was computed at each snapshot of the simulation. The interaction potential model was optimized and iterated by a Levenberg-Marquardt algorithm. Secondly, we report that terahertz spectroscopy directly probes hydration dynamics around proteins and determines the size of the dynamical hydration shell. We also present the sequence and pH dependence of the hydration shell and the effect of hydrophobicity. In addition, kinetic terahertz absorption (KITA) spectroscopy is introduced to study the refolding kinetics of ubiquitin and its mutants. KITA results are compared to small-angle X-ray scattering, tryptophan fluorescence, and circular dichroism results. We propose that KITA monitors the rearrangement of hydrogen bonding during secondary structure formation. Finally, we present the development of the automated single molecule operating system (ASMOS) for a high-throughput single molecule detector, which levitates a single protein molecule in a 10-μm-diameter droplet by laser guidance. I have also performed supporting calculations and simulations with my own program codes.
Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin
2016-05-20
In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of the population in recent years, available information on infant anemia in large cities of China is inconclusive, especially regarding comparisons between native residents and the floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6-12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. The prevalence of anemia was 12.60%, with a range of 3.47%-40.00% across subgroups. The CHAID decision tree model demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by the logistic regression model (maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population), CHAID decision tree analysis also identified a fourth risk factor, maternal educational level, with higher overall classification accuracy and a larger area under the receiver operating characteristic curve. The infant anemic status in metropolises is complex and should be carefully considered by basic health care practitioners. CHAID decision tree analysis has demonstrated better performance in hierarchical analysis of populations with great heterogeneity. Risk factors identified by this study might be meaningful in the early detection and prompt treatment of infant anemia in large cities.
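CHAID grows its tree by merging and splitting predictor categories according to Pearson chi-squared tests of independence. The core statistic for a 2x2 table can be sketched as below; the counts are hypothetical, not the study's data.

```python
def chi_squared_2x2(table):
    """Pearson chi-squared statistic for a 2x2 contingency table
    [[a, b], [c, d]] - the splitting criterion underlying CHAID."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row = [a + b, c + d]          # row totals
    col = [a + c, b + d]          # column totals
    chi2 = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            exp = row[i] * col[j] / n   # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    return chi2

# hypothetical counts: rows = exposed / unexposed to a risk factor,
# columns = anemic / not anemic
stat = chi_squared_2x2([[30, 70], [60, 340]])
```

CHAID repeats this test for every candidate predictor at each node, splitting on the most significant one; logistic regression instead fits all predictors jointly, which is why the two methods can disagree on marginal risk factors.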
NASA Astrophysics Data System (ADS)
Zielke, O.; McDougall, D.; Mai, P. M.; Babuska, I.
2014-12-01
One fundamental aspect of seismic hazard mitigation is gaining a better understanding of the rupture process. Because direct observation of the relevant parameters and properties is not possible, other means such as kinematic source inversions are used instead. By constraining the spatial and temporal evolution of fault slip during an earthquake, those inversion approaches may enable valuable insights into the physics of the rupture process. However, due to the underdetermined nature of this inversion problem (i.e., inverting a kinematic source model for an extended fault based on seismic data), the provided solutions are generally non-unique. Here we present a statistical (Bayesian) inversion approach based on an open-source library for uncertainty quantification (UQ) called QUESO that was developed at ICES (UT Austin). The approach has advantages over deterministic inversion approaches: it provides not only a single (non-unique) solution but also uncertainty bounds with it. Those uncertainty bounds help to judge, qualitatively and quantitatively, how well constrained an inversion solution is and how much rupture complexity the data reliably resolve. The presented inversion scheme uses only tele-seismically recorded body waves, but future developments may lead us towards joint inversion schemes. After giving an insight into the inversion scheme itself (based on Delayed Rejection Adaptive Metropolis, DRAM), we explore the method's resolution potential. For that, we synthetically generate tele-seismic data, add, for example, different levels of noise and/or change the fault plane parameterization, and then apply our inversion scheme in an attempt to extract the (known) kinematic rupture model. We conclude by inverting, as an example, real tele-seismic data of a recent large earthquake and comparing the results with deterministically derived kinematic source models provided by other research groups.
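A full DRAM sampler is beyond a short sketch, but its adaptive ingredient can be illustrated in isolation: tune the random-walk proposal scale toward a target acceptance rate during burn-in, then freeze it for the sampling phase. This is a generic adaptive-Metropolis toy on a standard normal target, not QUESO's implementation.

```python
import math
import random

def adaptive_metropolis(n_burn=5000, n_keep=10000, target_acc=0.4, seed=13):
    """Random-walk Metropolis on a standard normal target whose proposal
    step size is adapted toward a target acceptance rate, then frozen
    (adapting forever would break the Markov property)."""
    rng = random.Random(seed)
    log_pi = lambda x: -0.5 * x * x
    x, step, samples = 0.0, 1.0, []
    for it in range(n_burn + n_keep):
        y = x + rng.gauss(0.0, step)
        accepted = rng.random() < math.exp(min(0.0, log_pi(y) - log_pi(x)))
        if accepted:
            x = y
        if it < n_burn:
            # stochastic-approximation update of the log step size
            step *= math.exp(0.01 * ((1.0 if accepted else 0.0) - target_acc))
        else:
            samples.append(x)
    return step, samples

step, samples = adaptive_metropolis()
sample_var = sum(s * s for s in samples) / len(samples)
```

DRAM adds two further ingredients on top of this: a delayed-rejection stage that retries a rejected proposal with a smaller step, and adaptation of the full proposal covariance rather than a scalar scale.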
Assessment of SWE data assimilation for ensemble streamflow predictions
NASA Astrophysics Data System (ADS)
Franz, Kristie J.; Hogue, Terri S.; Barik, Muhammad; He, Minxue
2014-11-01
An assessment of data assimilation (DA) for Ensemble Streamflow Prediction (ESP) using seasonal water supply hindcasting in the North Fork of the American River Basin (NFARB) and the National Weather Service (NWS) hydrologic forecast models is undertaken. Two parameter sets, one from the California Nevada River Forecast Center (RFC) and one from the Differential Evolution Adaptive Metropolis (DREAM) algorithm, are tested. For each parameter set, hindcasts are generated using initial conditions derived with and without the inclusion of a DA scheme that integrates snow water equivalent (SWE) observations. The DREAM-DA scenario uses an Integrated Uncertainty and Ensemble-based data Assimilation (ICEA) framework that also considers model and parameter uncertainty. Hindcasts are evaluated using deterministic and probabilistic forecast verification metrics. In general, the impact of DA on the skill of the seasonal water supply predictions is mixed. For deterministic (ensemble mean) predictions, the Percent Bias (PBias) is improved with integration of the DA. DREAM-DA and the RFC-DA have the lowest biases, and the RFC-DA has the lowest Root Mean Squared Error (RMSE). However, the RFC and DREAM-DA have similar RMSE scores. For the probabilistic predictions, the RFC and DREAM have the highest Continuous Ranked Probability Skill Scores (CRPSS), and the RFC has the best discrimination for low flows. Reliability results are similar between the non-DA and DA tests, and the DREAM and DREAM-DA have better reliability than the RFC and RFC-DA for forecast dates of February 1 and later. Although the DA method tested produced improved streamflow simulations in previous studies, the hindcast analysis suggests that it may not result in obvious improvements in streamflow forecasts. We advocate that integration of hindcasting and probabilistic metrics provides more rigorous insight into model performance for forecasting applications, such as in this study.
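The verification scores named above have simple empirical forms; the sketch below shows PBias, RMSE, and a sample-based CRPS for a single ensemble-observation pair. The data values are illustrative placeholders.

```python
import math

def pbias(sim, obs):
    """Percent bias: 100 * sum(sim - obs) / sum(obs)."""
    return 100.0 * sum(s - o for s, o in zip(sim, obs)) / sum(obs)

def rmse(sim, obs):
    """Root mean squared error of paired simulations and observations."""
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs))

def crps(ensemble, y):
    """Sample-based continuous ranked probability score for one
    observation y:  E|X - y| - 0.5 * E|X - X'| over ensemble members."""
    m = len(ensemble)
    t1 = sum(abs(x - y) for x in ensemble) / m
    t2 = sum(abs(a - b) for a in ensemble for b in ensemble) / (m * m)
    return t1 - 0.5 * t2

sim = [10.0, 12.0, 9.0, 11.0]   # illustrative seasonal volumes
obs = [11.0, 11.0, 10.0, 10.0]
```

The skill score form (CRPSS) then compares the CRPS of a forecast system against a reference such as climatology: CRPSS = 1 - CRPS_forecast / CRPS_reference.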
Multi-component fluid flow through porous media by interacting lattice gas computer simulation
NASA Astrophysics Data System (ADS)
Cueva-Parra, Luis Alberto
In this work we study structural and transport properties such as power-law behavior of the trajectory of each constituent and their center of mass, density profile, mass flux, permeability, velocity profile, phase separation, segregation, and mixing of miscible and immiscible multicomponent fluid flow through rigid and non-consolidated porous media. The considered parameters are the mass ratio of the components, temperature, external pressure, and porosity. Due to its solid theoretical foundation and computational simplicity, the selected approach is the Interacting Lattice Gas with the Monte Carlo method (Metropolis algorithm) and direct sampling, combined with particular collision rules. The percolation mechanism is used for modeling the initial random porous media. The introduced collision rules make it possible to model non-consolidated porous media, because part of the kinetic energy of the fluid particles is transferred to barrier particles, which are the components of the porous medium. Having gained kinetic energy, the barrier particles can move. A number of interesting results are observed. Some findings include: (i) phase separation in immiscible fluid flow through a medium with no barrier particles (porosity p = 1); (ii) for the flow of miscible fluids through a rigid porous medium with porosity close to the percolation threshold (p_C), the flux density (a measure of permeability) shows a power-law increase ∝ (p - p_C)^μ with μ = 2.0, and the density profile is found to decay with height ∝ exp(-m_{A/B}h), consistent with the barometric height law; (iii) sedimentation and driving of barrier particles in fluid flow through a non-consolidated porous medium. This study involves developing computer simulation models with efficient serial and parallel codes, extensive data analysis via graphical utilities, and computer visualization techniques.
ERIC Educational Resources Information Center
African-American Inst., New York, NY. School Services Div.
Four modules dealing with African culture are combined in this document. The first module discusses various life-styles of African women, including warrior, queen, ruler, and matriarch. A lesson plan uses a question-and-answer format to encourage discussion of the effects of tradition, society, and nation upon African women. Questions asked…
ERIC Educational Resources Information Center
Wahab, E. O.; Ajiboye, O. E.; Atere, A. A.
2011-01-01
This research is motivated as a result of rapid changes in the lifestyle of our youths and the increasing deterioration of their welfare in terms of the increase in the number of out of school youth in the country, high incidence of child participation in economic activities and incidence of street children in Nigeria. Although, many researches…
ERIC Educational Resources Information Center
Arhin, Ato Kwamina
2015-01-01
The study was a quasi-experimental research project conducted to investigate the effect of performance assessment-driven instructions on the attitude and achievement in mathematics of senior high school students in Ghana at Ghana National College in Cape Coast. Two Form 1 science classes were used for the study and were assigned as experimental…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hou Fengji; Hogg, David W.; Goodman, Jonathan
Markov chain Monte Carlo (MCMC) proves to be powerful for Bayesian inference and in particular for exoplanet radial velocity fitting because MCMC provides more statistical information and makes better use of data than common approaches like chi-square fitting. However, the nonlinear density functions encountered in these problems can make MCMC time-consuming. In this paper, we apply an ensemble sampler respecting affine invariance to orbital parameter extraction from radial velocity data. This new sampler has only one free parameter, and does not require much tuning for good performance, which is important for automatization. The autocorrelation time of this sampler is approximately the same for all parameters and far smaller than Metropolis-Hastings, which means it requires many fewer function calls to produce the same number of independent samples. The affine-invariant sampler speeds up MCMC by hundreds of times compared with Metropolis-Hastings in the same computing situation. This novel sampler would be ideal for projects involving large data sets such as statistical investigations of planet distribution. The biggest obstacle to ensemble samplers is the existence of multiple local optima; we present a clustering technique to deal with local optima by clustering based on the likelihood of the walkers in the ensemble. We demonstrate the effectiveness of the sampler on real radial velocity data.
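The affine-invariant ensemble sampler described above is based on the Goodman-Weare "stretch move": each walker is updated by stretching toward or away from a randomly chosen complementary walker. The following is a minimal one-dimensional sketch under stated assumptions; the function name and interface are illustrative, and practical work would normally use a library such as emcee rather than this hand-rolled version.

```python
import math
import random

def stretch_move_step(walkers, log_prob, a=2.0, rng=random):
    """One serial sweep of the Goodman-Weare stretch move over an ensemble.

    walkers: list of 1-D positions (floats); log_prob: target log-density;
    a: the sampler's single free scale parameter.
    """
    n = len(walkers)
    for i in range(n):
        # pick a complementary walker j != i
        j = rng.randrange(n - 1)
        if j >= i:
            j += 1
        # draw z from g(z) proportional to 1/sqrt(z) on [1/a, a]
        z = ((a - 1.0) * rng.random() + 1.0) ** 2 / a
        proposal = walkers[j] + z * (walkers[i] - walkers[j])
        # general acceptance factor is z**(d-1) * p(y)/p(x); here d = 1
        log_ratio = log_prob(proposal) - log_prob(walkers[i])
        if math.log(rng.random()) < log_ratio:
            walkers[i] = proposal
    return walkers
```

Because the move is an affine transformation of existing walkers, the sampler performs identically on stretched or rotated versions of the same target, which is why only the scale `a` needs tuning.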
NASA Astrophysics Data System (ADS)
Shepherd, J. Marshall; Pierce, Harold; Negri, Andrew J.
2002-07-01
Data from the Tropical Rainfall Measuring Mission (TRMM) satellite's precipitation radar (PR) were employed to identify warm-season rainfall (1998-2000) patterns around Atlanta, Georgia; Montgomery, Alabama; Nashville, Tennessee; and San Antonio, Waco, and Dallas, Texas. Results reveal an average increase of about 28% in monthly rainfall rates within 30-60 km downwind of the metropolis, with a modest increase of 5.6% over the metropolis. Portions of the downwind area exhibit increases as high as 51%. The percentage changes are relative to an upwind control area. It was also found that maximum rainfall rates in the downwind impact area exceeded the mean value in the upwind control area by 48%-116%. The maximum value was generally found at an average distance of 39 km from the edge of the urban center or 64 km from the center of the city. Results are consistent with the Metropolitan Meteorological Experiment (METROMEX) studies of St. Louis, Missouri, almost two decades ago and with more recent studies near Atlanta. The study establishes the possibility of utilizing satellite-based rainfall estimates for examining rainfall modification by urban areas on global scales and over longer time periods. Such research has implications for weather forecasting, urban planning, water resource management, and understanding human impact on the environment and climate.
Determinants of parents' decisions on childhood immunisations at Kumasi Metropolis in Ghana.
Hagan, Doris; Phethlu, Deliwe R
2016-07-29
To describe factors that influence parents' decisions on childhood immunisations at Kumasi Metropolis in Ghana. Quantitative cross-sectional survey. A sample of 303 parents was obtained from a monthly accessible population of 1420 individuals from the five district hospitals through convenience sampling of respondents at immunisation sessions in Kumasi. Data obtained from the survey were analysed with SPSS version 21 software. Most parents were aware of child immunisations, but they had limited knowledge of vaccines and immunisation schedules. Antenatal nurses constituted the most accessible source of vaccine information. The study established a high percentage of complete immunisation, influenced by parents' fear of their children contracting vaccine-preventable diseases. Remarkably, some parents indicated that they immunised their children because they wanted to know the weight of their children. Forgetfulness and lack of personnel or vaccine at the centres were the reasons given by the few parents who could not complete immunisation schedules for their children, whereas the socio-demographic variables considered did not influence parents' decisions on immunisation. Knowledge of immunisation did not in itself determine immunisation decisions; rather, parents' fear of vaccine-preventable diseases, awareness of the benefits of immunisation and sources of vaccine information were the main factors that influenced immunisation decisions at Kumasi in Ghana.
Wemakor, Anthony; Iddrisu, Habib
2018-06-25
Maternal depression may affect child feeding practice, which is an important determinant of child nutritional status. The objective of this study was to explore the association between maternal depression and WHO complementary feeding indicators [minimum dietary diversity (MDD), minimum meal frequency (MMF) and minimum acceptable diet (MAD)] or stunting status of children (6-23 months) in Tamale Metropolis, Ghana. A community-based cross-sectional study was carried out involving 200 mother-child pairs randomly sampled from three communities in Tamale Metropolis, Ghana. The prevalence of MDD, MMF, and MAD were 56.5, 65.0, and 44.0% respectively and 41.0% of the children sampled were stunted. A third of the mothers (33.5%) screened positive for depression. Maternal depression did not significantly influence MDD (p = 0.245), MMF (p = 0.442), and MAD (p = 0.885) or children's risk of stunting (p = 0.872). In conclusion, maternal depression and child stunting are prevalent in Northern Ghana but there is a lack of evidence of an association between maternal depression and child feeding practices or nutritional status in this study population. Further research is needed to assess the effect of maternal depression on feeding practices and growth of young children.
NASA Astrophysics Data System (ADS)
Adeniran, J. A.; Yusuf, R. O.; Olajire, A. A.
2017-10-01
This study aims to determine the seasonal variations and composition of suspended particulate matter in different size fractions (PM1.0, PM2.5, PM10) and the total suspended particles (TSP) emitted at major intra-urban traffic intersections (TIs) of Ilorin metropolis. The concentration levels of PM (PM1.0, PM2.5, PM10) obtained at the TIs during the rush hours (45.1, 77.9, and 513 μg/m3) are higher than the levels obtained for the non-rush-hour periods (42.3, 62.7, and 390 μg/m3). The average on-road respiratory deposition dose (RDD) rates of PM1.0, PM2.5 and PM10 during the dry period at the TIs were found to be about 24%, 9% and 25% higher than those obtained during the wet period. Based on the EF values calculated, Pb and Zn were anthropogenically derived while Fe, Mn, Cr, Cu and Mg were of crustal origin. Principal component analysis (PCA) was applied to the PM data set in order to determine the contribution of different sources. It was found that the main principal factors extracted from the particulate emission data were related to exhaust and non-exhaust emissions such as tyre wear, and oil and fuel combustion sources.
Exchange bias training relaxation in spin glass/ferromagnet bilayers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chi, Xiaodan; Du, An; Rui, Wenbin
2016-04-25
A canonical spin glass (SG) FeAu layer is fabricated to couple to a soft ferromagnet (FM) FeNi layer. Below the SG freezing temperature, exchange bias (EB) and training are observed. Training in SG/FM bilayers is insensitive to cooling field and may suppress the EB or change the sign of the EB field from negative to positive at specific temperatures, deviating from the simple power law or the single exponential function derived from antiferromagnet-based systems. In view of the SG nature, we employ a double decay model to distinguish the contributions from the SG bulk and the SG/FM interface to training. Dynamical properties during training under different cooling fields and at different temperatures are discussed, and the nonzero shifting coefficient in the time index, as a signature of slowing-down decay for SG-based systems, is interpreted by means of a modified Monte Carlo Metropolis algorithm.
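The standard Metropolis algorithm underlying spin simulations like the one above can be sketched as a single-spin-flip sweep of a 2-D Ising lattice. This is a generic textbook illustration under the usual nearest-neighbour Hamiltonian, not the authors' modified scheme for SG/FM bilayers.

```python
import math
import random

def metropolis_sweep(spins, beta, J=1.0, rng=random):
    """One Metropolis sweep of an n x n Ising lattice with periodic boundaries.

    spins: square list-of-lists of +1/-1 values; beta: inverse temperature.
    """
    n = len(spins)
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        # energy change for flipping spin (i, j): dE = 2 J s_ij * (sum of neighbours)
        nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
              + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
        dE = 2.0 * J * spins[i][j] * nb
        # Metropolis criterion: accept downhill moves always, uphill with exp(-beta*dE)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] = -spins[i][j]
    return spins
```

Below the critical temperature (beta above about 0.44 for this model) the lattice retains a large net magnetization, which is the kind of ordered state whose slow relaxation the training experiments probe.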
Ma, Jianzhong; Amos, Christopher I; Warwick Daw, E
2007-09-01
Although extended pedigrees are often sampled through probands with extreme levels of a quantitative trait, Markov chain Monte Carlo (MCMC) methods for segregation and linkage analysis have not been able to perform ascertainment corrections. Further, the extent to which ascertainment of pedigrees leads to biases in the estimation of segregation and linkage parameters has not been previously studied for MCMC procedures. In this paper, we studied these issues with a Bayesian MCMC approach for joint segregation and linkage analysis, as implemented in the package Loki. We first simulated pedigrees ascertained through individuals with extreme values of a quantitative trait in the spirit of the sequential sampling theory of Cannings and Thompson [Cannings and Thompson [1977] Clin. Genet. 12:208-212]. Using our simulated data, we detected no bias in estimates of the trait locus location. However, when the ascertainment threshold was higher than or close to the true value of the highest genotypic mean, bias was found in the estimation of this parameter, in addition to the allele frequencies. When there were multiple trait loci, this bias destroyed the additivity of the effects of the trait loci, and caused biases in the estimation of all genotypic means when a purely additive model was used for analyzing the data. To account for pedigree ascertainment with sequential sampling, we developed a Bayesian ascertainment approach and implemented Metropolis-Hastings updates in the MCMC samplers used in Loki. Ascertainment correction greatly reduced biases in parameter estimates. Our method is designed for multiple, but a fixed number of, trait loci. Copyright (c) 2007 Wiley-Liss, Inc.
NASA Astrophysics Data System (ADS)
Wang, Kejing; Zhang, Yuan; An, Youzhi; Jing, Zhuoxin; Wang, Chao
2013-09-01
With the fast urbanization process, how does the vegetation environment change in one of the most economically developed metropolises, Shanghai in East China? To answer this question, there is a pressing demand to explore the non-stationary relationship between socio-economic conditions and vegetation across Shanghai. In this study, environmental data on vegetation cover, the Normalized Difference Vegetation Index (NDVI) derived from MODIS imagery in 2003, were integrated with socio-economic data to reflect the city's vegetative conditions at the census block group level. To explore regional variations in the relationship between vegetation and socio-economic conditions, Ordinary Least Squares (OLS) and Geographically Weighted Regression (GWR) models were applied to characterize mean NDVI against three independent socio-economic variables: an urban land use ratio, Gross Domestic Product (GDP) and population density. The study results show that considerable spatial variation exists in the relationship for each model. The GWR model has superior effects and higher precision than the OLS model at the census block group scale, so it is more suitable to account for local effects and geographical variations. This study also indicates that unreasonable excessive urbanization, together with non-sustainable economic development, has had a negative influence on vegetation vigor for some neighborhoods in Shanghai.
The Manhattan Frame Model-Manhattan World Inference in the Space of Surface Normals.
Straub, Julian; Freifeld, Oren; Rosman, Guy; Leonard, John J; Fisher, John W
2018-01-01
Objects and structures within man-made environments typically exhibit a high degree of organization in the form of orthogonal and parallel planes. Traditional approaches utilize these regularities via the restrictive, and rather local, Manhattan World (MW) assumption which posits that every plane is perpendicular to one of the axes of a single coordinate system. The aforementioned regularities are especially evident in the surface normal distribution of a scene where they manifest as orthogonally-coupled clusters. This motivates the introduction of the Manhattan-Frame (MF) model which captures the notion of an MW in the surface normals space, the unit sphere, and two probabilistic MF models over this space. First, for a single MF we propose novel real-time MAP inference algorithms, evaluate their performance and their use in drift-free rotation estimation. Second, to capture the complexity of real-world scenes at a global scale, we extend the MF model to a probabilistic mixture of Manhattan Frames (MMF). For MMF inference we propose a simple MAP inference algorithm and an adaptive Markov-Chain Monte-Carlo sampling algorithm with Metropolis-Hastings split/merge moves that let us infer the unknown number of mixture components. We demonstrate the versatility of the MMF model and inference algorithm across several scales of man-made environments.
Wen, Jiayi; Zhou, Shenggao; Xu, Zhenli; Li, Bo
2013-01-01
Competitive adsorption of counterions of multiple species to charged surfaces is studied by a size-effect included mean-field theory and Monte Carlo (MC) simulations. The mean-field electrostatic free-energy functional of ionic concentrations, constrained by Poisson’s equation, is numerically minimized by an augmented Lagrangian multiplier method. Unrestricted primitive models and canonical ensemble MC simulations with the Metropolis criterion are used to predict the ionic distributions around a charged surface. It is found that, for a low surface charge density, the adsorption of ions with a higher valence is preferable, agreeing with existing studies. For a highly charged surface, both of the mean-field theory and MC simulations demonstrate that the counterions bind tightly around the charged surface, resulting in a stratification of counterions of different species. The competition between mixed entropy and electrostatic energetics leads to a compromise that the ionic species with a higher valence-to-volume ratio has a larger probability to form the first layer of stratification. In particular, the MC simulations confirm the crucial role of ionic valence-to-volume ratios in the competitive adsorption to charged surfaces that had been previously predicted by the mean-field theory. The charge inversion for ionic systems with salt is predicted by the MC simulations but not by the mean-field theory. This work provides a better understanding of competitive adsorption of counterions to charged surfaces and calls for further studies on the ionic size effect with application to large-scale biomolecular modeling. PMID:22680474
ECAIM: Air Quality Studies and its Impact in Central Mexico.
NASA Astrophysics Data System (ADS)
Ruiz-Suárez, L. G.; Torres, R.; Garcia-Reynoso, J. A.; Zavala-Hidalgo, J.; Grutter, M.; Delgado-Campos, J.; Molina, L. T.
2014-12-01
The Mexico City Metropolitan Area (MCMA) has been the object of several well-known intensive campaigns, from MARI (1991) and IMADA (1997) to MCMA 2003 and MILAGRO (2006). The spatial scope of these studies has grown from urban to regional to continental, with the focus on the MCMA as an emissions source. During MILAGRO, the influence on the MCMA of wildfires and agricultural biomass burning around the megacity was considered. However, around Mexico City a ring of metropolises and mid-size cities makes up a region known as the Central Mexico Regional Crown (CRCM, for its acronym in Spanish) or Central Mexico City Belt. It contains 32 million inhabitants and produces 40% of the national gross product. The region undergoes uncontrolled urban sprawl. Evidence is building up on complex air pollution transport processes between the air basins within the CRCM. However, only the MCMA has reliable long-term records of criteria pollutant monitoring. Only a few intensive campaigns have been done in the air basins surrounding the MCMA. The ECAIM project has several goals: a) to use ground and satellite observations to assess emissions inventories; b) to use ground and satellite observations to assess the performance of air quality models for the whole region; c) to produce critical-level exceedance maps; d) to produce a preliminary diagnostic of air quality for the CRCM; e) to produce a preliminary estimate of the cost of air pollution within the CRCM. In this work we show the methodological approach to using the best available information from local AQM networks, field campaigns, satellite observations and modeling to achieve those goals. We show some preliminary results.
Comparison of the landslide susceptibility models in Taipei Water Source Domain, Taiwan
NASA Astrophysics Data System (ADS)
WU, C. Y.; Yeh, Y. C.; Chou, T. H.
2017-12-01
Taipei Water Source Domain, located southeast of Taipei Metropolis, is the main source of water for this region. Recently, downstream turbidity has often soared significantly during typhoon periods because of upstream landslides. Landslide susceptibilities should be analysed to assess the zones of influence of different rainfall events, and to ensure that this domain can continue to supply sufficient, high-quality water. Generally, landslide susceptibility models can be established based on either a long-term landslide inventory or a specified landslide event. Sometimes there is no long-term landslide inventory for an area, so event-based landslide susceptibility models are widely established. However, inventory-based and event-based landslide susceptibility models may result in dissimilar susceptibility maps for the same area. The purposes of this study were therefore to compare the landslide susceptibility maps derived from inventory-based and event-based models, and to determine how to select a representative event to include in the susceptibility model. The landslide inventory from Typhoon Tim in July 1994 and Typhoon Soudelor in August 2015 was collected and used to establish the inventory-based landslide susceptibility model. The landslides caused by Typhoon Nari and rainfall data were used to establish the event-based model. The results indicated that the high-susceptibility slope units were located in the middle and upstream Nan-Shih Stream basin.
NASA Astrophysics Data System (ADS)
Wang, H.; Guan, H.; Deng, R.; Simmons, C. T.
2013-12-01
Canopy conductance response to environmental conditions is a critical component in land surface hydrological modeling. This response is often formulated as a combination of response functions of each influencing factor (solar radiation, air temperature, vapor pressure deficit, and soil water availability). These functions are climate and vegetation specific. Thus, it is important to determine the most appropriate combination of response functions and their parameter values for a specific environment. We will present a method for this purpose based on field measurements and an optimization scheme. The study was performed on Drooping Sheoak (Allocasuarina verticillata) in Adelaide, South Australia. Sap flow and stem water potential were measured over a year together with microclimate variables. Canopy conductance was calculated from the inverted Penman-Monteith (PM) equation, which was then used to examine the performance of 36 combinations of various response functions. Parameters in the models were optimized using the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm based on a training dataset. The testing results show that the best combination gave a correlation coefficient of 0.97 and a root mean square error of 0.0006 m/s in comparison to the PM-calculated values. The maximum stomatal conductance given by this combination is 0.0075 m/s, equivalent to a minimum stomatal resistance of 133 s/m. This is close to the value (150 s/m) used in the Noah land surface model for evergreen needle-leaf trees. Surprisingly, for all combinations the optimized parameter of the temperature response function runs counter to its physical meaning. This is likely related to the inter-dependence between air temperature and vapor pressure deficit. Based on these results, we suggest that the effects of vapor pressure deficit and air temperature should be represented together, so as to be consistent with the physics.
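The multiplicative combination of response functions described above is commonly written in Jarvis form, g_c = g_max * f(R) * f(T) * f(D). The sketch below uses illustrative response-function shapes and constants, not the study's 36 calibrated combinations; only the multiplicative structure is taken from the abstract.

```python
import math

def canopy_conductance(g_max, solar, temp_c, vpd_kpa):
    """Jarvis-type multiplicative canopy conductance model (illustrative).

    g_max: maximum stomatal conductance (m/s); solar: shortwave radiation (W/m2);
    temp_c: air temperature (deg C); vpd_kpa: vapor pressure deficit (kPa).
    The function forms and constants below are placeholders, not calibrated values.
    """
    f_radiation = solar / (solar + 200.0)                    # saturating light response
    f_temp = max(0.0, 1.0 - ((temp_c - 25.0) / 25.0) ** 2)   # optimum near 25 C
    f_vpd = math.exp(-0.5 * vpd_kpa)                         # exponential VPD limitation
    return g_max * f_radiation * f_temp * f_vpd
```

Because each factor is bounded by 1, the modeled conductance never exceeds g_max (0.0075 m/s in the study's best combination), and any one limiting factor can shut the canopy down regardless of the others.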
Lee, Michael S; Olson, Mark A
2011-06-28
Temperature-based replica exchange (T-ReX) enhances sampling of molecular dynamics simulations by autonomously heating and cooling simulation clients via a Metropolis exchange criterion. A pathological case for T-ReX can occur when a change in state (e.g., folding to unfolding of a protein) has a large energetic difference over a short temperature interval leading to insufficient exchanges amongst replica clients near the transition temperature. One solution is to allow the temperature set to dynamically adapt in the temperature space, thereby enriching the population of clients near the transition temperature. In this work, we evaluated two approaches for adapting the temperature set: a method that equalizes exchange rates over all neighbor temperature pairs and a method that attempts to induce clients to visit all temperatures (dubbed "current maximization") by positioning many clients at or near the transition temperature. As a test case, we simulated the 57-residue SH3 domain of alpha-spectrin. Exchange rate equalization yielded the same unfolding-folding transition temperature as fixed-temperature ReX with much smoother convergence of this value. Surprisingly, the current maximization method yielded a significantly lower transition temperature, in close agreement with experimental observation, likely due to more extensive sampling of the transition state.
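The Metropolis exchange criterion that drives the heating and cooling of clients reduces to a one-line acceptance test on the energies and inverse temperatures of a neighbouring pair. A minimal sketch (function name and interface are assumptions, not the authors' code):

```python
import math
import random

def attempt_swap(beta_i, beta_j, energy_i, energy_j, rng=random):
    """Metropolis exchange criterion for two neighbouring replicas.

    Accept the configuration swap with probability
    min(1, exp((beta_i - beta_j) * (energy_i - energy_j))).
    """
    log_p = (beta_i - beta_j) * (energy_i - energy_j)
    return log_p >= 0 or rng.random() < math.exp(log_p)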
Learn-as-you-go acceleration of cosmological parameter estimates
NASA Astrophysics Data System (ADS)
Aslanyan, Grigor; Easther, Richard; Price, Layne C.
2015-09-01
Cosmological analyses can be accelerated by approximating slow calculations using a training set, which is either precomputed or generated dynamically. However, this approach is only safe if the approximations are well understood and controlled. This paper surveys issues associated with the use of machine-learning based emulation strategies for accelerating cosmological parameter estimation. We describe a learn-as-you-go algorithm that is implemented in the Cosmo++ code and (1) trains the emulator while simultaneously estimating posterior probabilities; (2) identifies unreliable estimates, computing the exact numerical likelihoods if necessary; and (3) progressively learns and updates the error model as the calculation progresses. We explicitly describe and model the emulation error and show how this can be propagated into the posterior probabilities. We apply these techniques to the Planck likelihood and the calculation of ΛCDM posterior probabilities. The computation is significantly accelerated without a pre-defined training set and uncertainties in the posterior probabilities are subdominant to statistical fluctuations. We have obtained a speedup factor of 6.5 for Metropolis-Hastings and 3.5 for nested sampling. Finally, we discuss the general requirements for a credible error model and show how to update them on-the-fly.
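The learn-as-you-go control flow, use the emulator when its estimated error is small and otherwise compute the exact likelihood and grow the training set, can be sketched schematically. The nearest-neighbour emulator below is a deliberately crude stand-in for the error-modelled emulator in Cosmo++; the names and the scalar-parameter setting are assumptions for illustration.

```python
def emulated_log_like(theta, exact_log_like, training, tol=0.5):
    """Learn-as-you-go control flow (schematic, not the Cosmo++ implementation).

    training: dict mapping previously evaluated parameter values to exact
    log-likelihoods. If the nearest training point is farther than tol, the
    emulated estimate is deemed unreliable: fall back to the exact (slow)
    calculation and add the result to the training set.
    """
    if training:
        nearest = min(training, key=lambda t: abs(t - theta))
        if abs(nearest - theta) <= tol:
            return training[nearest]  # cheap emulated estimate
    value = exact_log_like(theta)     # expensive exact evaluation
    training[theta] = value           # learn as you go
    return value
```

The real algorithm additionally propagates an explicit emulation-error model into the posterior, so that accepted approximations provably contribute less uncertainty than the statistical fluctuations of the chain.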
Effective gravitational coupling in modified teleparallel theories
NASA Astrophysics Data System (ADS)
Abedi, Habib; Capozziello, Salvatore; D'Agostino, Rocco; Luongo, Orlando
2018-04-01
In the present study, we consider an extended form of teleparallel Lagrangian f (T ,ϕ ,X ) , as function of a scalar field ϕ , its kinetic term X and the torsion scalar T . We use linear perturbations to obtain the equation of matter density perturbations on sub-Hubble scales. The gravitational coupling is modified in scalar modes with respect to the one of general relativity, albeit vector modes decay and do not show any significant effects. We thus extend these results by involving multiple scalar field models. Further, we study conformal transformations in teleparallel gravity and we obtain the coupling as the scalar field is nonminimally coupled to both torsion and boundary terms. Finally, we propose the specific model f (T ,ϕ ,X )=T +∂μϕ ∂μϕ +ξ T ϕ2 . To test its viability, we employ observational Hubble data, constraining the coupling constant, ξ , through a Monte Carlo technique based on the Metropolis-Hastings algorithm. Hence, fixing ξ to its best-fit value got from our numerical analysis, we calculate the growth rate of matter perturbations and we compare our outcomes with the latest measurements and the predictions of the Λ CDM model.
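A random-walk Metropolis-Hastings chain for a single coupling constant such as ξ can be sketched as follows. The likelihood, starting point, and tuning values are placeholders, not the paper's pipeline; in the actual analysis the log-likelihood would be a chi-square built from the observational H(z) data.

```python
import math
import random

def metropolis_hastings(log_like, xi0, n_steps, step=0.1, rng=random):
    """Random-walk Metropolis-Hastings chain for one parameter (e.g. xi)."""
    chain = [xi0]
    current_ll = log_like(xi0)
    for _ in range(n_steps):
        proposal = chain[-1] + rng.gauss(0.0, step)
        proposal_ll = log_like(proposal)
        # accept with probability min(1, L(proposal) / L(current))
        if math.log(rng.random()) < proposal_ll - current_ll:
            chain.append(proposal)
            current_ll = proposal_ll
        else:
            chain.append(chain[-1])
    return chain
```

After discarding burn-in, the chain's histogram approximates the posterior of ξ, and the best-fit value used downstream is read off from its mode or mean.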
NASA Astrophysics Data System (ADS)
Grayver, Alexander V.; Kuvshinov, Alexey V.
2016-05-01
This paper presents a methodology to sample the equivalence domain (ED) in nonlinear partial differential equation (PDE)-constrained inverse problems. For this purpose, we first applied a state-of-the-art stochastic optimization algorithm called Covariance Matrix Adaptation Evolution Strategy (CMAES) to identify low-misfit regions of the model space. These regions were then randomly sampled to create an ensemble of equivalent models and quantify uncertainty. CMAES is aimed at exploring model space globally and is robust on very ill-conditioned problems. We show that the number of iterations required to converge grows at a moderate rate with respect to the number of unknowns and the algorithm is embarrassingly parallel. We formulated the problem by using the generalized Gaussian distribution. This enabled us to seamlessly use arbitrary norms for residual and regularization terms. We show that various regularization norms facilitate studying different classes of equivalent solutions. We further show how performance of the standard Metropolis-Hastings Markov chain Monte Carlo algorithm can be substantially improved by using information CMAES provides. This methodology was tested by using individual and joint inversions of magnetotelluric, controlled-source electromagnetic (EM) and global EM induction data.
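CMAES itself maintains and adapts a full covariance matrix for its search distribution; as a hedged illustration of the underlying evolution-strategy idea only, here is a stripped-down (mu, lambda) ES with a fixed step-size decay in place of covariance adaptation. All names and constants are illustrative, and this is not CMAES proper.

```python
import random

def simple_es(objective, x0, sigma=1.0, lam=20, mu=5, n_gen=100, rng=random):
    """Minimal (mu, lambda) evolution strategy (illustrative, not full CMAES).

    Samples lam candidates around the current mean, keeps the mu best,
    recombines them into a new mean, and shrinks the step size each generation.
    """
    mean = list(x0)
    d = len(mean)
    for _ in range(n_gen):
        # sample lambda candidates from an isotropic Gaussian around the mean
        pop = [[m + sigma * rng.gauss(0.0, 1.0) for m in mean] for _ in range(lam)]
        pop.sort(key=objective)
        # recombine the mu best into the new mean
        mean = [sum(ind[k] for ind in pop[:mu]) / mu for k in range(d)]
        sigma *= 0.95  # crude step-size decay in place of covariance adaptation
    return mean
```

In the paper's workflow the low-misfit points visited by such a search (with CMAES's full covariance machinery) both delimit the equivalence domain and supply proposal-covariance information that accelerates the subsequent Metropolis-Hastings sampling.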
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vrugt, Jasper A; Robinson, Bruce A; Ter Braak, Cajo J F
In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. Particularly, there is strong disagreement whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understand and predict the flow of water through catchments.
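The differential-evolution proposal at the heart of DREAM-type samplers jumps a chain using the difference of two other chains in the population. A schematic sketch, simplified to a single difference pair with no crossover, with names and defaults that are illustrative rather than taken from any particular implementation:

```python
import math
import random

def dream_proposal(chains, i, gamma=None, eps=1e-6, rng=random):
    """Differential-evolution proposal used by DREAM-type samplers (sketch).

    New point for chain i: x_i + gamma * (x_r1 - x_r2) + small noise,
    with r1, r2 two distinct chains other than i, and gamma defaulting to
    the commonly quoted 2.38 / sqrt(2 d) for a d-dimensional problem.
    """
    d = len(chains[i])
    if gamma is None:
        gamma = 2.38 / math.sqrt(2 * d)
    others = [k for k in range(len(chains)) if k != i]
    r1, r2 = rng.sample(others, 2)
    return [chains[i][k] + gamma * (chains[r1][k] - chains[r2][k])
            + rng.gauss(0.0, eps) for k in range(d)]
```

Because the jump is built from the spread of the population itself, the proposal automatically scales and orients to the posterior as the chains converge, which is what lets DREAM handle correlated parameters without hand-tuned proposal covariances.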
Learn-as-you-go acceleration of cosmological parameter estimates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aslanyan, Grigor; Easther, Richard; Price, Layne C., E-mail: g.aslanyan@auckland.ac.nz, E-mail: r.easther@auckland.ac.nz, E-mail: lpri691@aucklanduni.ac.nz
2015-09-01
Cosmological analyses can be accelerated by approximating slow calculations using a training set, which is either precomputed or generated dynamically. However, this approach is only safe if the approximations are well understood and controlled. This paper surveys issues associated with the use of machine-learning based emulation strategies for accelerating cosmological parameter estimation. We describe a learn-as-you-go algorithm that is implemented in the Cosmo++ code and (1) trains the emulator while simultaneously estimating posterior probabilities; (2) identifies unreliable estimates, computing the exact numerical likelihoods if necessary; and (3) progressively learns and updates the error model as the calculation progresses. We explicitly describe and model the emulation error and show how this can be propagated into the posterior probabilities. We apply these techniques to the Planck likelihood and the calculation of ΛCDM posterior probabilities. The computation is significantly accelerated without a pre-defined training set and uncertainties in the posterior probabilities are subdominant to statistical fluctuations. We have obtained a speedup factor of 6.5 for Metropolis-Hastings and 3.5 for nested sampling. Finally, we discuss the general requirements for a credible error model and show how to update them on-the-fly.
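The learn-as-you-go pattern, emulate when the estimate is trusted, fall back to the exact calculation (and grow the training set) when it is not, can be sketched with a toy nearest-neighbour emulator and a trust radius. All names, the 1-D parameter space and the stand-in likelihood are illustrative, not the Cosmo++ implementation:

```python
import numpy as np

def expensive_loglike(theta):
    # Stand-in for a slow likelihood (e.g. a full Boltzmann-code call).
    return -0.5 * theta ** 2

class LearnAsYouGo:
    """Nearest-neighbour emulator with a trust radius; falls back to
    the exact calculation, and adds the result to the training set,
    whenever no trained point is close enough."""
    def __init__(self, exact, radius=0.05):
        self.exact = exact
        self.radius = radius
        self.xs, self.ys = [], []
        self.n_exact = 0          # counts expensive evaluations

    def __call__(self, theta):
        if self.xs:
            d = np.abs(np.array(self.xs) - theta)
            i = d.argmin()
            if d[i] < self.radius:
                return self.ys[i]          # trusted emulated value
        y = self.exact(theta)              # unreliable: compute exactly
        self.xs.append(theta)
        self.ys.append(y)
        self.n_exact += 1
        return y

emu = LearnAsYouGo(expensive_loglike, radius=0.05)
rng = np.random.default_rng(1)
for theta in rng.uniform(-1, 1, 2000):
    emu(theta)
```

Early in the run almost every call is exact; as the training set covers the explored region, the expensive-call fraction drops, which is the source of the speedup the abstract reports.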
NASA Astrophysics Data System (ADS)
Dai, Z.; Wolfsberg, A. V.; Zhu, L.; Reimus, P. W.
2017-12-01
Colloids have the potential to enhance mobility of strongly sorbing radionuclide contaminants in fractured rocks at underground nuclear test sites. This study presents an experimental and numerical investigation of colloid-facilitated plutonium reactive transport in fractured porous media for identifying plutonium sorption/filtration processes. The transport parameters for dispersion, diffusion, sorption, and filtration are estimated with inverse modeling for minimizing the least squares objective function of multicomponent concentration data from multiple transport experiments with the Shuffled Complex Evolution Metropolis (SCEM). Capitalizing on an unplanned experimental artifact that led to colloid formation and migration, we adopt a stepwise strategy to first interpret the data from each experiment separately and then to incorporate multiple experiments simultaneously to identify a suite of plutonium-colloid transport processes. Nonequilibrium or kinetic attachment and detachment of plutonium-colloid in fractures was clearly demonstrated and captured in the inverted modeling parameters along with estimates of the source plutonium fraction that formed plutonium-colloids. The results from this study provide valuable insights for understanding the transport mechanisms and environmental impacts of plutonium in fractured formations and groundwater aquifers.
NASA Astrophysics Data System (ADS)
Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi
2015-01-01
We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo simulation, we explore the global latency for optimal to suboptimal resource assignments at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from a peaked to a spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease in performance is found above Tc independently of the workload. The globally optimized computational resource allocation and network routing define a baseline for a future comparison of the transition behavior with existing routing strategies [3,4] for different network topologies.
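Varying the temperature of a Metropolis Monte Carlo, as above, trades optimality for exploration: at low temperature the sampler settles near the optimal assignment, at high temperature it accepts increasingly suboptimal ones. A minimal sketch on a toy one-dimensional "latency" landscape (illustrative only, not the authors' model):

```python
import numpy as np

def metropolis_at_temperature(energy, state0, step, T, n_steps, rng):
    """Metropolis Monte Carlo at fixed temperature T: moves that raise
    the energy (latency) are accepted with probability exp(-dE/T), so
    higher T explores increasingly suboptimal states."""
    s = float(state0)
    E = energy(s)
    for _ in range(n_steps):
        s2 = s + step * rng.standard_normal()
        E2 = energy(s2)
        if E2 <= E or rng.random() < np.exp(-(E2 - E) / T):
            s, E = s2, E2
    return s, E

# Toy landscape: a single quadratic minimum at 0.
energy = lambda s: s ** 2
rng = np.random.default_rng(0)
s_cold, E_cold = metropolis_at_temperature(energy, 5.0, 0.5, 0.01, 3000, rng)
# At high T the chain wanders far from the minimum, sampling the
# suboptimal assignments whose statistics the study characterizes.
s_hot, E_hot = metropolis_at_temperature(energy, 5.0, 0.5, 50.0, 3000, rng)
```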
A Bayesian approach to earthquake source studies
NASA Astrophysics Data System (ADS)
Minson, Sarah
Bayesian sampling has several advantages over conventional optimization approaches to solving inverse problems. It produces the distribution of all possible models sampled proportionally to how much each model is consistent with the data and the specified prior information, and thus images the entire solution space, revealing the uncertainties and trade-offs in the model. Bayesian sampling is applicable to both linear and non-linear modeling, and the values of the model parameters being sampled can be constrained based on the physics of the process being studied and do not have to be regularized. However, these methods are computationally challenging for high-dimensional problems. Until now the computational expense of Bayesian sampling has been too great for it to be practicable for most geophysical problems. I present a new parallel sampling algorithm called CATMIP for Cascading Adaptive Tempered Metropolis In Parallel. This technique, based on Transitional Markov chain Monte Carlo, makes it possible to sample distributions in many hundreds of dimensions, if the forward model is fast, or to sample computationally expensive forward models in smaller numbers of dimensions. The design of the algorithm is independent of the model being sampled, so CATMIP can be applied to many areas of research. I use CATMIP to produce a finite fault source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. Surface displacements from the earthquake were recorded by six interferograms and twelve local high-rate GPS stations. Because of the wealth of near-fault data, the source process is well-constrained. I find that the near-field high-rate GPS data have significant resolving power above and beyond the slip distribution determined from static displacements. The location and magnitude of the maximum displacement are resolved. The rupture almost certainly propagated at sub-shear velocities. 
The full posterior distribution can be used not only to calculate source parameters but also to determine their uncertainties. So while kinematic source modeling and the estimation of source parameters is not new, with CATMIP I am able to use Bayesian sampling to determine which parts of the source process are well-constrained and which are not.
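The tempering idea behind CATMIP and Transitional MCMC can be sketched with a one-parameter toy problem: a ladder of intermediate distributions p(x) ∝ prior(x) · likelihood(x)^β is sampled with plain Metropolis, each stage seeding the next as β rises from 0 to 1. The real algorithm also resamples and runs many chains in parallel; everything below is an illustrative simplification:

```python
import numpy as np

def rw_mh(log_p, x0, step, n, rng):
    """Plain 1-D random-walk Metropolis."""
    x, lp = float(x0), log_p(x0)
    xs = []
    for _ in range(n):
        prop = x + step * rng.standard_normal()
        lp_prop = log_p(prop)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        xs.append(x)
    return np.array(xs)

log_prior = lambda x: -0.5 * (x / 10.0) ** 2        # broad prior
log_like = lambda x: -0.5 * ((x - 3.0) / 0.5) ** 2  # narrow likelihood

rng = np.random.default_rng(0)
x = 0.0
for beta in [0.0, 0.1, 0.5, 1.0]:   # tempering ladder
    target = lambda t, b=beta: log_prior(t) + b * log_like(t)
    chain = rw_mh(target, x, 1.0, 2000, rng)
    x = chain[-1]                    # seed the next, colder stage
```

The early, nearly-prior stages are easy to sample, and each stage hands the next a starting point already consistent with the partially-tempered posterior, which is what makes the approach tractable in hundreds of dimensions.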
NASA Astrophysics Data System (ADS)
Santabarbara, Ignacio; Haas, Edwin; Kraus, David; Herrera, Saul; Klatt, Steffen; Kiese, Ralf
2014-05-01
When using biogeochemical models to estimate greenhouse gas emissions at site to regional/national levels, the assessment and quantification of the uncertainties of simulation results are of significant importance. The uncertainties in simulation results of process-based ecosystem models may result from uncertainties in the parameters that describe the model's processes, from model structural inadequacy, as well as from uncertainties in the observations. Data for development and testing of the uncertainty analysis were crop yield observations and measurements of soil fluxes of nitrous oxide (N2O) and carbon dioxide (CO2) from 8 arable sites across Europe. Using the process-based biogeochemical model LandscapeDNDC to simulate crop yields and N2O and CO2 emissions, our aim is to assess the simulation uncertainty by setting up a Bayesian framework based on the Metropolis-Hastings algorithm. Using the Gelman convergence criterion and parallel computing techniques, we enable multiple Markov chains to run independently in parallel, creating a random walk to estimate the joint model parameter distribution. From this distribution we limit the parameter space, obtain probabilities of parameter values and find the complex dependencies among them. With this distribution of the parameters that determine soil-atmosphere C and N exchange, we are able to obtain the parameter-induced uncertainty of simulation results and compare it with the measurement data.
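The Gelman (Gelman-Rubin) convergence diagnostic mentioned above compares within-chain and between-chain variance across the parallel chains; values of the potential scale reduction factor near 1 indicate convergence. A minimal sketch for equal-length chains of one parameter:

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat for m chains of length n.
    Values close to 1 suggest the parallel chains have mixed."""
    chains = np.asarray(chains, float)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    B = n * chain_means.var(ddof=1)          # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()    # within-chain variance
    var_hat = (n - 1) / n * W + B / n        # pooled variance estimate
    return np.sqrt(var_hat / W)

rng = np.random.default_rng(0)
converged = rng.standard_normal((4, 1000))            # same target
diverged = converged + np.arange(4)[:, None] * 5.0    # shifted chains
```

In practice the diagnostic is computed per parameter, and sampling continues until every R-hat falls below a threshold such as 1.1.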
NASA Astrophysics Data System (ADS)
Yamauchi, Masataka; Okumura, Hisashi
2017-11-01
We developed a two-dimensional replica-permutation molecular dynamics method in the isothermal-isobaric ensemble. The replica-permutation method, originally developed in the canonical ensemble, is a better alternative to the replica-exchange method. It employs the Suwa-Todo algorithm, instead of the Metropolis algorithm, to perform permutations of temperatures and pressures among more than two replicas so that the rejection ratio can be minimized. We showed that the isothermal-isobaric replica-permutation method achieves better sampling efficiency than the isothermal-isobaric replica-exchange method and the infinite swapping method. We applied this method to a β-hairpin mini protein, chignolin. In this simulation, we observed not only the folded state but also the misfolded state. We calculated the temperature and pressure dependence of the fractions of the folded, misfolded, and unfolded states. Differences in partial molar enthalpy, internal energy, entropy, partial molar volume, and heat capacity were also determined and agreed well with experimental data. We observed a new phenomenon: misfolded chignolin becomes more stable under high-pressure conditions. We also revealed the mechanism of this stability as follows: the TYR2 and TRP9 side chains cover the hydrogen bonds that form the β-hairpin structure, protecting them from the water molecules that approach the protein as the pressure increases.
NASA Astrophysics Data System (ADS)
Chen, Bingzhang; Smith, Sherwood Lan
2018-02-01
Diversity plays critical roles in ecosystem functioning, but it remains challenging to model phytoplankton diversity in order to better understand those roles and reproduce consistently observed diversity patterns in the ocean. In contrast to the typical approach of resolving distinct species or functional groups, we present a ContInuous TRAiT-basEd phytoplankton model (CITRATE) that focuses on macroscopic system properties such as total biomass, mean trait values, and trait variance. This phytoplankton component is embedded within a nitrogen-phytoplankton-zooplankton-detritus-iron model that is itself coupled with a simplified one-dimensional ocean model. Size is used as the master trait for phytoplankton. CITRATE also incorporates trait diffusion for sustaining diversity, as well as simple representations of physiological acclimation, i.e., flexible chlorophyll-to-carbon and nitrogen-to-carbon ratios. We have implemented CITRATE at two contrasting stations in the North Pacific where several years of observational data are available. The model is driven by physical forcing including vertical eddy diffusivity imported from three-dimensional general ocean circulation models (GCMs). One common set of model parameters for the two stations is optimized using the Delayed-Rejection Adaptive Metropolis-Hastings Monte Carlo (DRAM) algorithm. The model faithfully reproduces most of the observed patterns and gives robust predictions of phytoplankton mean size and size diversity. CITRATE is suitable for applications in GCMs and constitutes a prototype upon which more sophisticated continuous trait-based models can be developed.
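The "adaptive Metropolis" half of DRAM re-estimates the Gaussian proposal covariance from the chain's own history (the delayed-rejection half, omitted here, additionally retries rejected proposals at a smaller scale). A minimal sketch of the adaptive part, with an illustrative 2-D Gaussian target in place of a real parameter-estimation problem:

```python
import numpy as np

def adaptive_metropolis(log_p, x0, n_steps, rng, eps=1e-6):
    """Adaptive Metropolis (sketch): after a short warm-up, the
    proposal covariance is re-estimated from the chain history,
    scaled by the standard factor 2.4^2 / d."""
    d = len(x0)
    sd = 2.4 ** 2 / d
    x = np.array(x0, float)
    lp = log_p(x)
    samples = [x.copy()]
    for t in range(1, n_steps):
        if t > 10 * d:
            hist = np.array(samples)
            cov = sd * (np.cov(hist.T) + eps * np.eye(d))
        else:
            cov = 0.1 * np.eye(d)          # fixed warm-up proposal
        prop = rng.multivariate_normal(x, cov)
        lp_prop = log_p(prop)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x.copy())
    return np.array(samples)

rng = np.random.default_rng(0)
chain = adaptive_metropolis(lambda x: -0.5 * x @ x, np.zeros(2), 3000, rng)
```

Adapting the proposal to the empirical covariance is what lets DRAM-type samplers handle correlated parameters, such as jointly optimized growth and grazing coefficients, without hand tuning.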
Artificial neural network model for ozone concentration estimation and Monte Carlo analysis
NASA Astrophysics Data System (ADS)
Gao, Meng; Yin, Liting; Ning, Jicai
2018-07-01
Air pollution in the urban atmosphere directly affects public health; therefore, it is essential to predict air pollutant concentrations. Air quality is a complex function of emissions, meteorology and topography, and artificial neural networks (ANNs) provide a sound framework for relating these variables. In this study, we investigated the feasibility of using an ANN model with meteorological parameters as input variables to predict the ozone concentration in the urban area of Jinan, a metropolis in Northern China. We first found that the architecture of the network of neurons had little effect on the predictive capability of the ANN model. A parsimonious ANN model with 6 routinely monitored meteorological parameters and one temporal covariate (the category of day, i.e. working day, legal holiday or regular weekend) as input variables was identified, where the 7 input variables were selected following a forward selection procedure. Compared with the benchmark ANN model with 9 meteorological and photochemical parameters as input variables, the predictive capability of the parsimonious ANN model was acceptable. Its predictive capability was also verified in terms of the warning success ratio during pollution episodes. Finally, uncertainty and sensitivity analyses were also performed based on Monte Carlo simulations (MCS). It was concluded that the ANN could properly predict the ambient ozone level. Maximum temperature, atmospheric pressure, sunshine duration and maximum wind speed were identified as the predominant input variables significantly influencing the prediction of ambient ozone concentrations.
Rainfall Modification by Urban Areas: New Perspectives from TRMM
NASA Technical Reports Server (NTRS)
Shepherd, J. Marshall; Pierce, Harold F.; Negri, Andrew
2002-01-01
Data from the Tropical Rainfall Measuring Mission's (TRMM) Precipitation Radar (PR) were employed to identify warm season rainfall (1998-2000) patterns around Atlanta, Montgomery, Nashville, San Antonio, Waco, and Dallas. Results reveal an average increase of ~28% in monthly rainfall rates within 30-60 kilometers downwind of the metropolis, with a modest increase of 5.6% over the metropolis. Portions of the downwind area exhibit increases as high as 51%. The percentage changes are relative to an upwind control area. It was also found that maximum rainfall rates in the downwind impact area exceeded the mean value in the upwind control area by 48%-116%. The maximum value was generally found at an average distance of 39 km from the edge of the urban center, or 64 km from the center of the city. Results are consistent with METROMEX studies of St. Louis almost two decades ago and with more recent studies near Atlanta. Future work is extending the investigation to Phoenix, Arizona, an arid U.S. city, and several international cities like Mexico City, Johannesburg, and Brasilia. The study establishes the possibility of utilizing satellite-based rainfall estimates for examining rainfall modification by urban areas on global scales and over longer time periods. Such research has implications for weather forecasting, urban planning, water resource management, and understanding human impact on the environment and climate.
The prevalence and incidence of COPD among urban older persons of Bangkok Metropolis.
Maranetra, Khun Nanta; Chuaychoo, Benjamas; Dejsomritrutai, Wanchai; Chierakul, Nitipatana; Nana, Arth; Lertakyamanee, Jariya; Naruman, Chana; Suthamsmai, Tasneeya; Sangkaew, Sutee; Sreelum, Wichean; Aksornin, Montchai; Dechapol, Jaroon; Sathet, Wichean
2002-11-01
COPD substantially affects national healthcare resources and healthcare costs, especially among older persons. Identifying the accurate prevalence and incidence reflects the scale of the problem posed by COPD. This epidemiological study used criteria for diagnosing COPD based on a ratio of FEV1.0/FVC of less than 70 per cent and a reversibility of less than a 15 per cent increase in post-bronchodilator FEV1.0, in the absence of parenchymal lesions and cardiomegaly on CXR (PA and lateral views). It revealed that the prevalence (1998) of COPD among the 3094 older persons aged 60 years and over in the communities of Bangkok Metropolis within 10 km of Siriraj Hospital was 7.11 per cent (95% CI: 6.21-8.01), whereas the incidence (1999) of COPD was 3.63 per cent (95% CI: 2.83-4.43). Both the prevalence and the incidence increased with increasing age. The disease occurred predominantly among male smokers. The distribution of mild : moderate : severe COPD in the prevalence study was 5.6:2.2:1. The current findings also suggest that tobacco smoking is the most important cause of COPD and that indoor pollution, especially cooking smoke, is not significant. In particular, the unexpectedly high incidence compared with the prevalence in this population probably represents a warning message to national policy makers to implement prompt and effective health promotion and disease prevention to prevent further social and economic loss.
Orish, Verner N; Onyeabor, Onyekachi S; Boampong, Johnson N; Afoakwah, Richmond; Nwaefuna, Ekene; Acquah, Samuel; Orish, Esther O; Sanyaolu, Adekunle O; Iriemenam, Nnaemeka C
2014-08-01
This study investigated the influence of the level of education on HIV infection among pregnant women attending antenatal care in Sekondi-Takoradi, Ghana. A cross-sectional study was conducted at four hospitals in the Sekondi-Takoradi metropolis. The study group comprised 885 consenting pregnant women attending antenatal care clinics. Questionnaires were administered and venous blood samples were screened for HIV and other parameters. Multivariable logistic regression analyses were performed to determine the association between the level of education attained by the pregnant women and their HIV status. The data showed that 9.83% (87/885) of the pregnant women were HIV seropositive while 90.17% (798/885) were HIV seronegative. There was a significant difference in mean age (years) between the HIV seropositive women (27.45 ± 5.5) and their HIV seronegative (26.02 ± 5.6) counterparts (p = .026), but the inference disappeared after adjustment (p = .22). Multivariable logistic regression analysis revealed that pregnant women with secondary/tertiary education were less likely to have HIV infection compared with those with none/primary education (adjusted OR, 0.53; 95% CI, 0.30-0.91; p = .022). Our data showed an association between a higher level of education and the HIV status of the pregnant women. It is imperative to encourage formal education among pregnant women in this region.
Urban transformation of a metropolis and its environmental impacts: a case study in Shanghai.
Tian, Zhan; Cao, Guiying; Shi, Jun; McCallum, Ian; Cui, Linli; Fan, Dongli; Li, Xinhu
2012-06-01
The aim of this paper is to understand the sustainability of urban spatial transformation in the process of rapid urbanization. Shanghai, in its transformation to a metropolis, has experienced vast socioeconomic and ecological change, which calls for future research on the demographic and economic dimensions of climate change. We look at two major questions: (1) exploring economic and demographic growth and land-use and land-cover changes in the context of rapid economic and city growth, and (2) analyzing how demographic and economic growth have been associated with local air temperature and vegetation. We examine urban growth and land-use and land-cover changes in the context of rapid economic development and urbanization, and assess the impact of urban expansion on local air temperature and vegetation. The analysis is based on time series data of land use, the normalized difference vegetation index (NDVI), and meteorological, demographic and economic data. The results indicate that urban growth has been driven by mass immigration; as a consequence of economic growth and urban expansion, a large amount of farmland has been converted to paved roads and residential buildings. Furthermore, the difference between air temperatures in urban and exurban areas has increased rapidly. The decrease in high mean annual NDVI has mainly occurred around the dense urban areas.
King, Samuel B; Lapidus, Mariana
2015-01-01
The authors' goal was to assess changes in the role of librarians in informatics education from 2004 to 2013. This is a follow-up to "Metropolis Redux: The Unique Importance of Library Skills in Informatics," a 2004 survey of informatics programs. An electronic survey was conducted in January 2013 and sent to librarians via the MEDLIB-L email discussion list, the library section of the American Association of Colleges of Pharmacy, the Medical Informatics Section of the Medical Library Association, the Information Technology Interest Group of the Association of College and Research Libraries/New England Region, and various library directors across the country. Librarians from fifty-five institutions responded to the survey. Of these respondents, thirty-four included librarians in nonlibrary aspects of informatics training. Fifteen institutions have librarians participating in leadership positions in their informatics programs. Compared to the earlier survey, the role of librarians has evolved. Librarians possess skills that enable them to participate in informatics programs beyond a narrow library focus. Librarians currently perform significant leadership roles in informatics education. There are opportunities for librarian interdisciplinary collaboration in informatics programs. Informatics is much more than the study of technology. The information skills that librarians bring to the table enrich and broaden the study of informatics in addition to adding value to the library profession itself.
Bayesian inference in an item response theory model with a generalized student t link function
NASA Astrophysics Data System (ADS)
Azevedo, Caio L. N.; Migon, Helio S.
2012-10-01
In this paper we introduce a new item response theory (IRT) model with a generalized Student t link function with unknown degrees of freedom (df), named the generalized t-link (GtL) IRT model. In this model we consider only the difficulty parameter in the item response function. GtL is an alternative to the two-parameter logit and probit models, since the degrees of freedom play a role similar to that of the discrimination parameter. However, the behavior of the GtL curves differs from that of the two-parameter models and the usual Student t link, since in GtL the curves obtained from different df's can cross the probit curves at more than one latent trait level. The GtL model has properties similar to those of generalized linear mixed models, such as the existence of sufficient statistics and easy parameter interpretation. Also, many techniques of parameter estimation, model fit assessment and residual analysis developed for those models can be used for the GtL model. We develop fully Bayesian estimation and model fit assessment tools through a Metropolis-Hastings step within a Gibbs sampling algorithm. We also consider the sensitivity of the results to the prior choice for the degrees of freedom. The simulation study indicates that the algorithm recovers all parameters properly. In addition, some Bayesian model fit assessment tools are considered. Finally, a real data set is analyzed using our approach and other usual models. The results indicate that our model fits the data better than the two-parameter models.
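A Metropolis-Hastings step within Gibbs sampling, as used above (and in the Babcock noncompensatory-IRT study), updates one block of parameters at a time with a small MH move while the others are held fixed. A generic coordinate-wise sketch on a toy correlated target (not the GtL model itself; the target and all names are illustrative):

```python
import numpy as np

def mh_within_gibbs(log_p, x0, step, n_sweeps, rng):
    """Metropolis-Hastings within Gibbs: each coordinate of the
    parameter vector is updated in turn with a 1-D random-walk
    MH step, conditioning on the current values of the others."""
    x = np.array(x0, float)
    out = []
    for _ in range(n_sweeps):
        for j in range(x.size):
            prop = x.copy()
            prop[j] += step * rng.standard_normal()
            if np.log(rng.random()) < log_p(prop) - log_p(x):
                x = prop
        out.append(x.copy())
    return np.array(out)

# Toy target: a correlated 2-D Gaussian standing in for a joint
# posterior over, say, a difficulty parameter and the df.
prec = np.linalg.inv(np.array([[1.0, 0.6], [0.6, 1.0]]))
log_p = lambda x: -0.5 * x @ prec @ x
rng = np.random.default_rng(0)
draws = mh_within_gibbs(log_p, np.zeros(2), 1.0, 3000, rng)
```

The scheme is attractive in IRT because each conditional update is cheap even when the full joint posterior has no closed form.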
Seroprevalence of hepatitis B virus serological markers among pregnant Nigerian women
Aba, Henrietta Oneh; Aminu, Maryam
2016-01-01
Background: Chronic hepatitis B infection is a global problem; however, Asia and sub-Saharan Africa are most affected by it. The hepatitis B status of pregnant women is essential for the effective management of the disease and prevention of mother-to-child transmission. Materials and Methods: The study was conducted at the antenatal care units of four hospitals within Kaduna Metropolis, Nigeria, between August and December 2011. After obtaining ethical clearance, blood samples were collected from 800 consenting pregnant women; the plasma was screened for hepatitis B surface antigen (HBsAg) using the first response HBsAg card, and the reactive sera were confirmed with enzyme-linked immunosorbent assay. Other serological markers of hepatitis B virus (HBV) were detected using the one-step HBV multi-5 test kit. Results: Of the 800 pregnant women screened, 31 (3.9%) tested positive for HBsAg. Only one of the 31 HBsAg-positive women had developed the hepatitis B surface antibody, 16 (51.6%) had the envelope antibody, 18 (58.1%) had the hepatitis B core antibody (anti-HBc), and two (6.5%) had the hepatitis B envelope antigen (HBeAg). The highest prevalence of HBsAg was recorded among women in the age group 21–25 years old (P = 0.968). Similarly, married women (P = 0.772), women in their second trimester of pregnancy (P = 0.938), women with tertiary education (P = 0.972), women from the South-East geopolitical zone (P = 0.250) and those whose husbands were in polygamous relationships (P = 0.944) had the highest seroprevalence of HBsAg. Conclusion: HBV was detected with a prevalence of 3.9% among pregnant women in Kaduna Metropolis, Nigeria. About 96.8% (29) of the reactive women had HBeAg-negative chronic hepatitis while 6.5% (2) had HBeAg-positive chronic hepatitis B infection. About 58.1% of the women had anti-HBc and hence did not have immunity, and probably had chronic infection with a reduced risk of vertical transmission.
Pregnant women should be screened for HBsAg at the first antenatal clinic visit for appropriate clinical management and effective prevention of vertical transmission. PMID:26857933
NASA Astrophysics Data System (ADS)
Wang, Hailong; Guan, Huade; Deng, Zijuan; Simmons, Craig T.
2014-07-01
Canopy conductance (gc) is a critical component in hydrological modeling for transpiration estimation. It is often formulated as a set of functions of environmental variables, and these functions are climate and vegetation specific. Thus, it is important to determine the appropriate functions in gc models, and the corresponding parameter values, for a specific environment. In this study, sap flow, stem water potential, and microclimatic variables were measured for three Drooping Sheoak (Allocasuarina verticillata) trees in 2011, 2012, and 2014. Canopy conductance was calculated from the inverted Penman-Monteith (PM) equation, which was then used to examine 36 gc models comprising different response functions. Parameters were optimized using the DiffeRential Evolution Adaptive Metropolis (DREAM) model based on a training data set from 2012. Use of proper predawn stem water potential, vapor pressure deficit, and temperature functions improves model performance significantly, while no pronounced difference is observed between models that differ in their solar radiation functions. The best model gives a correlation coefficient of 0.97 and a root-mean-square error of 0.0006 m/s in comparison to the PM-calculated gc. The optimized temperature function shows different characteristics from its counterparts in other similar studies. This is likely due to the strong interdependence between air temperature and vapor pressure deficit in the study area, or to Sheoak tree physiology. Supported by the measurements and optimization results, we suggest that the effects of air temperature and vapor pressure deficit on canopy conductance should be represented together.
Chen, Bei-Bei; Gong, Hui-Li; Li, Xiao-Juan; Lei, Kun-Chao; Lin, Zhu; Wang, Yan-Bing
2013-08-01
Excessive extraction of groundwater is the main cause of land subsidence in Beijing, while increasing loads from urban construction aggravate local land subsidence to a certain degree. In an international metropolis such as Beijing, the problem of land subsidence caused by urban construction is becoming increasingly prominent, so revealing the relationship between regional load increases and the land subsidence response has become one of the key problems in the land subsidence research field. In order to analyze the relationship between construction load changes and land subsidence quantitatively, the present study selected TM remote sensing images covering the Beijing plain and used the Erdas Modeler tool to derive the index-based built-up index (IBI), further acquiring spatial and temporal change information for the research area. Based on the results monitored by PS-InSAR (permanent scatterer interferometry) and the IBI method, and combined with GIS spatial analysis at the pixel level on different scales, this paper analyzes the correlation between load change and land subsidence in typical areas. The conclusions show that there is a positive correlation between load density and subsidence homogeneity, especially in areas with high sedimentation rates. Owing to characteristics such as the complexity and hysteresis of soil and geological structure, land subsidence caused by a load increase is not obvious over a short period. But with the increase of local land load from high-density buildings, and with the additional settlement of individual buildings superposed on each other, regional land subsidence remains a question that cannot be ignored and needs long-term systematic research and discussion.
A Bayesian hierarchical model with novel prior specifications for estimating HIV testing rates
An, Qian; Kang, Jian; Song, Ruiguang; Hall, H. Irene
2016-01-01
Human immunodeficiency virus (HIV) infection is a severe infectious disease actively spreading globally, and acquired immunodeficiency syndrome (AIDS) is an advanced stage of HIV infection. The HIV testing rate, that is, the probability that an AIDS-free HIV infected person seeks a test for HIV during a particular time interval, given no previous positive test has been obtained prior to the start of the time, is an important parameter for public health. In this paper, we propose a Bayesian hierarchical model with two levels of hierarchy to estimate the HIV testing rate using annual AIDS and AIDS-free HIV diagnoses data. At level one, we model the latent number of HIV infections for each year using a Poisson distribution with the intensity parameter representing the HIV incidence rate. At level two, the annual numbers of AIDS and AIDS-free HIV diagnosed cases and all undiagnosed cases stratified by the HIV infections at different years are modeled using a multinomial distribution with parameters including the HIV testing rate. We propose a new class of priors for the HIV incidence rate and HIV testing rate taking into account the temporal dependence of these parameters to improve the estimation accuracy. We develop an efficient posterior computation algorithm based on the adaptive rejection Metropolis sampling technique. We demonstrate our model using simulation studies and the analysis of the national HIV surveillance data in the USA. PMID:26567891
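The two-level structure described above can be illustrated with a small forward simulation: a Poisson draw for the latent infections, then a multinomial split by diagnosis delay implied by a constant testing rate. All numbers are hypothetical, chosen only to show the shape of the generative model, not estimates from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
lam = 500.0     # hypothetical HIV incidence intensity for one year
p_test = 0.25   # hypothetical per-interval testing rate

# Level 1: latent number of infections acquired in the year.
n_infected = rng.poisson(lam)

# Level 2: infections split by diagnosis delay (0, 1, 2 intervals,
# or still undiagnosed), assuming a geometric delay implied by a
# constant testing rate.
probs = [p_test * (1 - p_test) ** k for k in range(3)]
probs.append(1.0 - sum(probs))   # remaining mass: still undiagnosed
counts = rng.multinomial(n_infected, probs)
```

Inference then runs this logic in reverse: given the observed diagnosis counts, the sampler explores the latent infections and the testing rate jointly.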
Chrzastowski, Michael J.
1983-01-01
Lake Washington, in the midst of the greater Seattle metropolitan area of the Puget Sound region (fig. 1), is an exceptional commercial, recreational, and esthetic resource for the region. In the past 130 years, Lake Washington has been changed from a "wild" lake in a wilderness setting to a regulated lake surrounded by a growing metropolis--a transformation that provides an unusual opportunity to study changes to a lake's shoreline and hydrologic characteristics resulting from urbanization.
[Marketing as a tool in the medical institution management].
Petrova, N G; Balokhina, S A
2009-01-01
The contemporary socioeconomic conditions dictate the necessity to change the tactics and strategy of functioning of medical institutions of different forms of property. Marketing, alongside management, is to become a leading concept in the administration of medical institutions. It should be a framework for the systematic collection, registration and analysis of data relevant to the medical services market. The article considers the implementation of the marketing concept in the practical everyday activities of a commercial medical organization providing cosmetology services to the population of a metropolis.
Monte Carlo sampling of Wigner functions and surface hopping quantum dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kube, Susanna; Lasser, Caroline; Weber, Marcus
2009-04-01
The article addresses the achievable accuracy for a Monte Carlo sampling of Wigner functions in combination with a surface hopping algorithm for non-adiabatic quantum dynamics. The approximation of Wigner functions is realized by an adaptation of the Metropolis algorithm for real-valued functions with disconnected support. The integration, which is necessary for computing values of the Wigner function, uses importance sampling with a Gaussian weight function. The numerical experiments agree with theoretical considerations and show an error of 2-3%.
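Importance sampling with a Gaussian weight function, as used for the integration step above, can be sketched as follows (a minimal generic example, not the authors' Wigner-function code; the integrand and sample size are hypothetical):

```python
import math
import random

def importance_estimate(f, n=100_000, seed=1):
    """Estimate the integral of f over the real line by drawing from a
    standard-normal weight q and averaging the ratios f(x) / q(x)."""
    rng = random.Random(seed)
    norm_const = 1.0 / math.sqrt(2.0 * math.pi)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)
        q = norm_const * math.exp(-0.5 * x * x)  # Gaussian weight density
        total += f(x) / q
    return total / n

# Check against a known integral: the integral of exp(-x^2) is sqrt(pi).
estimate = importance_estimate(lambda x: math.exp(-x * x))
```

The estimator is unbiased as long as the weight density is positive wherever the integrand is nonzero; its variance depends on how well the Gaussian matches the integrand's shape.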
Vahmani, P.; Sun, F.; Hall, A.; ...
2016-12-15
The climate warming effects of accelerated urbanization along with projected global climate change raise an urgent need for sustainable mitigation and adaptation strategies to cool urban climates. Our modeling results show that historical urbanization in the Los Angeles and San Diego metropolitan areas has increased daytime urban air temperature by 1.3 °C, in part due to a weakening of the onshore sea breeze circulation. We find that metropolis-wide adoption of cool roofs can meaningfully offset this daytime warming, reducing temperatures by 0.9 °C relative to a case without cool roofs. Residential cool roofs were responsible for 67% of the cooling. Nocturnal temperature increases of 3.1 °C from urbanization were larger than daytime warming, while nocturnal temperature reductions from cool roofs of 0.5 °C were weaker than corresponding daytime reductions. We further show that cool roof deployment could partially counter the local impacts of global climate change in the Los Angeles metropolitan area. Assuming a scenario in which there are dramatic decreases in greenhouse gas emissions in the 21st century (RCP2.6), mid- and end-of-century temperature increases from global change relative to current climate are similarly reduced by cool roofs from 1.4 °C to 0.6 °C. Assuming a scenario with continued emissions increases throughout the century (RCP8.5), mid-century warming is significantly reduced by cool roofs from 2.0 °C to 1.0 °C. The end-century warming, however, is significantly offset only in small localized areas containing mostly industrial/commercial buildings where cool roofs with the highest albedo are adopted. We conclude that metropolis-wide adoption of cool roofs can play an important role in mitigating the urban heat island effect, and offsetting near-term local warming from global climate change. Global-scale reductions in greenhouse gas emissions are the only way of avoiding long-term warming, however. We further suggest that both climate mitigation and adaptation can be pursued simultaneously using 'cool photovoltaics'.
NASA Astrophysics Data System (ADS)
Yang, Chunxia; Tang, Minxuan; Cao, Yongjian; Chen, Yanhua; Deng, Qiangqiang
2015-10-01
Based on the annual GDP (Gross Domestic Product) of 27 Chinese provinces and autonomous regions, the asymmetric economic information flows between different regions are calculated by the symbolic transfer entropy method, and the corresponding economic information flow networks are built over two periods: one before the reform and opening-up policy, the other after it. Analysis of these networks yields the following results. First, before the policy, the balanced development strategy weakened or cut off the ties between adjacent areas, resulting in slow regional economic development that did not conform to the law of scientific development. Second, with the introduction of market mechanisms and the promotion of the reform and opening-up policy, economic activity gradually shifted from the coast to the inland of China over Period II. Last but not least, there was a dramatic alternation of the influential centers: Jilin, Beijing and Jiangsu became new influential centers. In particular, Beijing became an influential center of the Beijing-Tianjin-Hebei metropolitan circle after the policy.
Wang, Hailong; Sun, Yuqiu; Su, Qinghua; Xia, Xuewen
2018-01-01
The backtracking search optimization algorithm (BSA) is a population-based evolutionary algorithm for numerical optimization problems. BSA has a powerful global exploration capacity while its local exploitation capability is relatively poor. This affects the convergence speed of the algorithm. In this paper, we propose a modified BSA inspired by simulated annealing (BSAISA) to overcome the deficiency of BSA. In the BSAISA, the amplitude control factor (F) is modified based on the Metropolis criterion in simulated annealing. The redesigned F could be adaptively decreased as the number of iterations increases and it does not introduce extra parameters. A self-adaptive ε-constrained method is used to handle the strict constraints. We compared the performance of the proposed BSAISA with BSA and other well-known algorithms when solving thirteen constrained benchmarks and five engineering design problems. The simulation results demonstrated that BSAISA is more effective than BSA and more competitive with other well-known algorithms in terms of convergence speed. PMID:29666635
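The Metropolis criterion borrowed from simulated annealing can be sketched in isolation (a generic illustration, not the paper's BSAISA or its redesigned amplitude control factor; the quadratic objective, cooling schedule, and step size are all hypothetical choices):

```python
import math
import random

def metropolis_accept(delta, temperature, rng):
    """Metropolis criterion: always accept improvements; accept a worse
    move with probability exp(-delta / T), which shrinks as T cools."""
    if delta <= 0:
        return True
    return rng.random() < math.exp(-delta / temperature)

def anneal(f, x0=4.0, t0=1.0, cooling=0.995, steps=3000, seed=2):
    """Minimal simulated annealing loop over a 1-D objective f."""
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    for _ in range(steps):
        cand = x + rng.gauss(0.0, 0.5)
        fc = f(cand)
        if metropolis_accept(fc - fx, t, rng):
            x, fx = cand, fc
        t *= cooling  # geometric cooling schedule
    return x, fx

# Illustrative objective: f(x) = x**2, minimized at x = 0.
x_final, f_final = anneal(lambda x: x * x)
```

Early on, high temperature lets the search escape poor regions; as the temperature decays, the acceptance rule becomes nearly greedy, which is the behavior the abstract's adaptively decreasing F is designed to emulate.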
Decision-Making for Induced Abortion in the Accra Metropolis, Ghana.
Gbagbo, Fred Yao; Amo-Adjei, Joshua; Laar, Amos
2015-06-01
Decision-making for induced abortion can be influenced by various circumstances, including those surrounding the onset of a pregnancy. This paper examined the various dimensions of induced abortion decision-making among women who had an elective induced abortion in a cosmopolitan urban setting in Ghana. A cross-sectional mixed-method study was conducted between January and December 2011 with 401 women who had undergone an abortion procedure in the preceding 12 months. Whereas the quantitative data were analysed with descriptive statistics, thematic analysis was applied to the qualitative data. The study found that women of various profiles have different reasons for undergoing abortion. Women considered the circumstances surrounding the onset of pregnancy, the person responsible for the pregnancy, the gestational age at the decision to terminate, and social, economic and medical considerations. Pressures from partners, career progression and the reproductive intentions of women reinforced these reasons. First-time pregnancies were mostly aborted regardless of gestational age and partners' consent. Policies and programmes targeted at safe abortion care are needed to guide informed decisions on induced abortions.
Forouhar, Amir; Hasankhani, Mahnoosh
2018-04-01
Urban decay is the process by which a historical city center, or an old part of a city, falls into decrepitude and faces serious problems. Urban management therefore implements renewal mega-projects with the goals of physical and functional revitalization, retrieval of socioeconomic capacities, and improvement of residents' quality of life. Ignoring the complexities of these large-scale interventions in old and historical urban fabrics may lead to undesirable consequences, including a further decline in quality of life. Thus, the present paper aims to assess the impact of renewal mega-projects on residents' subjective quality of life in the historical religious district of the holy city of Mashhad (Samen District). A combination of quantitative and qualitative methods of impact assessment, including questionnaires, semi-structured personal interviews, and direct observation, is used. The results show that the Samen Renewal Project has significantly reduced residents' subjective quality of life, due to its undesirable impacts on the physical, socio-cultural, and economic environments.
BFS Simulation and Experimental Analysis of the Effect of Ti Additions on the Structure of NiAl
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Noebe, Ronald D.; Ferrante, John; Garg, Anita; Honecy, Frank S.; Amador, Carlos
1999-01-01
The Bozzolo-Ferrante-Smith (BFS) method for alloy energetics is applied to the study of ternary additions to NiAl. A description of the method and its application to alloy design is given. Two different approaches are used in the analysis of the effect of Ti additions to NiAl. First, a thorough analytical study is performed, where the energy of formation, lattice parameter and bulk modulus are calculated for a large number of possible atomic distributions of Ni, Al and Ti. Substitutional site preference schemes and formation of precipitates are thus predicted and analyzed. The second approach used consists of the determination of temperature effects on the final results, as obtained by performing a number of large scale numerical simulations using the Monte Carlo-Metropolis procedure and BFS for the calculation of the energy at every step in the simulation. The results indicate a sharp preference of Ti for Al sites in Ni-rich NiAl alloys and the formation of ternary Heusler precipitates beyond the predicted solubility limit of 5 at. % Ti. Experimental analysis of three Ni-Al-Ti alloys confirms the theoretical predictions.
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Noebe, Ronald D.; Ferrante, John; Garg, Anita; Amador, Carlos
1997-01-01
The Bozzolo-Ferrante-Smith (BFS) semiempirical method for alloy energetics is applied to the study of ternary additions to NiAl alloys. A detailed description of the method and its application to alloy design is given. Two different approaches are used in the analysis of the effect of Ti additions to NiAl. First, a thorough analytical study is performed, where the energy of formation, lattice parameter and bulk modulus are calculated for hundreds of possible atomic distributions of Ni, Al and Ti. Substitutional site preference schemes and formation of precipitates are thus predicted and analyzed. The second approach used consists of the determination of temperature effects on the final results, as obtained by performing a number of large scale numerical simulations using the Monte Carlo - Metropolis procedure and BFS for the calculation of the energy at every step in the simulation. The results indicate a sharp preference of Ti for Al sites in Ni-rich NiAl alloys and the formation of ternary Heusler precipitates beyond the predicted solubility limit of 5 at. % Ti. Experimental analysis of three NiAl+Ti alloys confirms the theoretical predictions.
NASA Astrophysics Data System (ADS)
Koskela, J. J.; Croke, B. W. F.; Koivusalo, H.; Jakeman, A. J.; Kokkonen, T.
2012-11-01
Bayesian inference is used to study the effect of precipitation and model structural uncertainty on estimates of model parameters and confidence limits of predictive variables in a conceptual rainfall-runoff model in the snow-fed Rudbäck catchment (142 ha) in southern Finland. The IHACRES model is coupled with a simple degree day model to account for snow accumulation and melt. The posterior probability distribution of the model parameters is sampled by using the Differential Evolution Adaptive Metropolis (DREAM(ZS)) algorithm and the generalized likelihood function. Precipitation uncertainty is taken into account by introducing additional latent variables that were used as multipliers for individual storm events. Results suggest that occasional snow water equivalent (SWE) observations together with daily streamflow observations do not contain enough information to simultaneously identify model parameters, precipitation uncertainty and model structural uncertainty in the Rudbäck catchment. The addition of an autoregressive component to account for model structure error and latent variables having uniform priors to account for input uncertainty lead to dubious posterior distributions of model parameters. Thus our hypothesis that informative priors for latent variables could be replaced by additional SWE data could not be confirmed. The model was found to work adequately in 1-day-ahead simulation mode, but the results were poor in the simulation batch mode. This was caused by the interaction of parameters that were used to describe different sources of uncertainty. The findings may have lessons for other cases where parameterizations are similarly high in relation to available prior information.
Slepoy, A; Peters, M D; Thompson, A P
2007-11-30
Molecular dynamics and other molecular simulation methods rely on a potential energy function, based only on the relative coordinates of the atomic nuclei. Such a function, called a force field, approximately represents the electronic structure interactions of a condensed matter system. Developing such approximate functions and fitting their parameters remains an arduous, time-consuming process, relying on expert physical intuition. To address this problem, a functional programming methodology was developed that may enable automated discovery of entirely new force-field functional forms, while simultaneously fitting parameter values. The method uses a combination of genetic programming, Metropolis Monte Carlo importance sampling and parallel tempering, to efficiently search a large space of candidate functional forms and parameters. The methodology was tested using a nontrivial problem with a well-defined globally optimal solution: a small set of atomic configurations was generated and the energy of each configuration was calculated using the Lennard-Jones pair potential. Starting with a population of random functions, our fully automated, massively parallel implementation of the method reproducibly discovered the original Lennard-Jones pair potential by searching for several hours on 100 processors, sampling only a minuscule portion of the total search space. This result indicates that, with further improvement, the method may be suitable for unsupervised development of more accurate force fields with completely new functional forms. Copyright (c) 2007 Wiley Periodicals, Inc.
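The Lennard-Jones pair potential used as the ground truth in this test can be written directly (the standard formula in reduced units; the defaults epsilon = sigma = 1 are the conventional reduced-unit choice, not values taken from the paper):

```python
def lennard_jones(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones pair potential V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

# The potential well has its minimum at r = 2**(1/6) * sigma, depth -epsilon.
v_min = lennard_jones(2 ** (1 / 6))
```

The search described in the abstract is asked to rediscover exactly this functional form from configuration energies generated with it.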
Multi-scale dynamics and relaxation of a tethered membrane in a solvent by Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Pandey, Ras; Anderson, Kelly; Farmer, Barry
2006-03-01
A tethered membrane modeled by a flexible sheet dissipates entropy as it wrinkles and crumples. Nodes of a coarse-grained membrane are connected via multiple pathways for dynamical modes to propagate. We consider a sheet with nodes connected by fluctuating bonds on a cubic lattice. The empty lattice sites constitute an effective solvent medium via node-solvent interaction. Each node executes its stochastic motion with the Metropolis algorithm subject to bond fluctuations, excluded-volume constraints, and interaction energy. Dynamics and conformation of the sheet are examined at a low and a high temperature with attractive and repulsive node-node interactions for contrast in an attractive solvent medium. Variations of the mean square displacement of the center node of the sheet, and that of its center of mass, with the time steps are examined in detail; these show different power-law motion from short to long time regimes. Relaxation of the gyration radius and the scaling of its asymptotic value with the molecular weight are examined.
Commodore-Mensah, Yvonne; Sampah, Maame; Berko, Charles; Cudjoe, Joycelyn; Abu-Bonsrah, Nancy; Obisesan, Olawunmi; Agyemang, Charles; Adeyemo, Adebowale; Himmelfarb, Cheryl Dennison
2016-12-01
Cardiovascular disease (CVD) remains the leading cause of death in the United States (US). African-descent populations bear a disproportionate burden of CVD risk factors. With the increase in the number of West African immigrants (WAIs) to the US over the past decades, it is imperative to specifically study this new and substantial subset of the African-descent population and how acculturation impacts their CVD risk. The Afro-Cardiac Study is a community-based cross-sectional study of adult WAIs in the Baltimore-Washington metropolis. Guided by the PRECEDE-PROCEED model, we used a modification of the World Health Organization Steps survey to collect data on demographics, socioeconomic status, migration-related factors and behaviors. We obtained physical, biochemical, and acculturation measurements as well as a socio-demographic and health history. Our study provides critical data on the CVD risk of WAIs. The framework used is valuable for future epidemiological studies addressing CVD risk and acculturation among immigrants.
Radhakrishnan, Aditya; Vitalis, Andreas; Mao, Albert H.; Steffen, Adam T.; Pappu, Rohit V.
2012-01-01
Poly-L-proline (PLP) polymers are useful mimics of biologically relevant proline-rich sequences. Biophysical and computational studies of PLP polymers in aqueous solutions are challenging because of the diversity of length scales and the slow time scales for conformational conversions. We describe an atomistic simulation approach that combines an improved ABSINTH implicit solvation model, with conformational sampling based on standard and novel Metropolis Monte Carlo moves. Refinements to forcefield parameters were guided by published experimental data for proline-rich systems. We assessed the validity of our simulation results through quantitative comparisons to experimental data that were not used in refining the forcefield parameters. Our analysis shows that PLP polymers form heterogeneous ensembles of conformations characterized by semi-rigid, rod-like segments interrupted by kinks, which result from a combination of internal cis peptide bonds, flexible backbone ψ-angles, and the coupling between ring puckering and backbone degrees of freedom. PMID:22329658
Lee, Janie M.; McMahon, Pamela M.; Lowry, Kathryn P.; Omer, Zehra B.; Eisenberg, Jonathan D.; Pandharipande, Pari V.; Gazelle, G. Scott
2012-01-01
Purpose: To evaluate the effect of incorporating radiation risk into microsimulation (first-order Monte Carlo) models for breast and lung cancer screening to illustrate effects of including radiation risk on patient outcome projections. Materials and Methods: All data used in this study were derived from publicly available or deidentified human subject data. Institutional review board approval was not required. The challenges of incorporating radiation risk into simulation models are illustrated with two cancer screening models (Breast Cancer Model and Lung Cancer Policy Model) adapted to include radiation exposure effects from mammography and chest computed tomography (CT), respectively. The primary outcome projected by the breast model was life expectancy (LE) for BRCA1 mutation carriers. Digital mammographic screening beginning at ages 25, 30, 35, and 40 years was evaluated in the context of screenings with false-positive results and radiation exposure effects. The primary outcome of the lung model was lung cancer–specific mortality reduction due to annual screening, comparing two diagnostic CT protocols for lung nodule evaluation. The Metropolis-Hastings algorithm was used to estimate the mean values of the results with 95% uncertainty intervals (UIs). Results: Without radiation exposure effects, the breast model indicated that annual digital mammography starting at age 25 years maximized LE (72.03 years; 95% UI: 72.01 years, 72.05 years) and had the highest number of screenings with false-positive results (2.0 per woman). When radiation effects were included, annual digital mammography beginning at age 30 years maximized LE (71.90 years; 95% UI: 71.87 years, 71.94 years) with a lower number of screenings with false-positive results (1.4 per woman). 
For annual chest CT screening of 50-year-old females with no follow-up for nodules smaller than 4 mm in diameter, the lung model predicted lung cancer–specific mortality reduction of 21.50% (95% UI: 20.90%, 22.10%) without radiation risk and 17.75% (95% UI: 16.97%, 18.41%) with radiation risk. Conclusion: Because including radiation exposure risk can influence long-term projections from simulation models, it is important to include these risks when conducting modeling-based assessments of diagnostic imaging. © RSNA, 2012 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.11110352/-/DC1 PMID:22357897
The Full Monte Carlo: A Live Performance with Stars
NASA Astrophysics Data System (ADS)
Meng, Xiao-Li
2014-06-01
Markov chain Monte Carlo (MCMC) is being applied increasingly often in modern Astrostatistics. It is indeed incredibly powerful, but also very dangerous. It is popular because of its apparent generality (from simple to highly complex problems) and simplicity (the availability of out-of-the-box recipes). It is dangerous because it always produces something, but there is no surefire way to verify or even diagnose that the "something" is remotely close to what the MCMC theory predicts or one hopes. Using very simple models (e.g., conditionally Gaussian), this talk starts with a tutorial of the two most popular MCMC algorithms, namely the Gibbs Sampler and the Metropolis-Hastings Algorithm, and illustrates their good, bad, and ugly implementations via live demonstration. The talk ends with a story of how a recent advance, the Ancillary-Sufficient Interweaving Strategy (ASIS) (Yu and Meng, 2011, http://www.stat.harvard.edu/Faculty_Content/meng/jcgs.2011-article.pdf), reduces the danger. It was discovered almost by accident during a Ph.D. student's (Yaming Yu) struggle with fitting a Cox process model for detecting changes in source intensity of photon counts observed by the Chandra X-ray telescope from a (candidate) neutron/quark star.
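For a conditionally Gaussian example like those used in the tutorial, a Gibbs sampler takes only a few lines (a generic bivariate-normal illustration, not code from the talk; the correlation value and iteration count are arbitrary):

```python
import math
import random

def gibbs_bivariate_normal(rho, n_iter=20000, seed=3):
    """Gibbs sampler for a standard bivariate normal with correlation rho:
    each full conditional is N(rho * other, 1 - rho**2), so alternating
    the two exact conditional draws leaves the joint invariant."""
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)
    x = y = 0.0
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)  # draw x | y
        y = rng.gauss(rho * x, sd)  # draw y | x
        samples.append((x, y))
    return samples

draws = gibbs_bivariate_normal(rho=0.8)
```

Unlike Metropolis-Hastings, every Gibbs update is accepted, but strong correlation between the coordinates (rho near 1) makes the chain mix slowly, which is one of the "ugly" behaviors such a demonstration can expose.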
Garcia, Jonathan; Muñoz-Laboy, Miguel; Parker, Richard; Wilson, Patrick A
2014-04-01
Sex markets (the spatially and culturally bounded arenas) that shape bisexual behavior among Latino men have been utilized as a deterministic concept without a sufficient focus on the ability of individuals to make autonomous decisions within such arenas. We nuance the theory of sex markets using the concept of sexual opportunity structures to investigate the ways in which behaviorally bisexual Latino men in the urban metropolis of New York City navigate sexual geographies, cultural meaning systems, sexual scripts, and social institutions to configure their bisexual behaviors. Drawing on 60 in-depth interviews with bisexual Latino men in New York City, we first describe and analyze venues that constitute sexual geographies that facilitate and impede sexual interaction. These also allow for a degree of autonomy in decision-making, as men travel throughout the urban sexual landscape and sometimes even manage to reject norms, such as those imposed by Christian religion. We explore some of the cultural meaning systems and social institutions that regulate sex markets and influence individual decision-making. Secrecy and discretion-regulated by the family, masculinity, migration, and religion-only partially shaped sexual behavior and relationships. These factors create a flux in "equilibrium" in bisexual sex markets in which sociocultural-economic structures constantly interplay with human agency. This article contributes to the literature in identifying dynamic spaces for sexual health interventions that draw on individual agency and empowerment.
NASA Technical Reports Server (NTRS)
Shepherd, J. Marshall; Pierce, Harold; Starr, David OC. (Technical Monitor)
2001-01-01
This study represents one of the first published attempts to identify rainfall modification by urban areas using satellite-based rainfall measurements. Data from the first space-based rain radar, the Tropical Rainfall Measuring Mission's (TRMM) Precipitation Radar, are employed. Analysis of the data enables identification of rainfall patterns around Atlanta, Montgomery, Nashville, San Antonio, Waco, and Dallas during the warm season. Results reveal an average increase of ~28% in monthly rainfall rates within 30-60 kilometers downwind of the metropolis, with a modest increase of 5.6% over the metropolis. Portions of the downwind area exhibit increases as high as 51%. The percentage changes are relative to an upwind CONTROL area. It was also found that maximum rainfall rates in the downwind impact area can exceed the mean value in the upwind CONTROL area by 48%-116%. The maximum value was generally found at an average distance of 39 km from the edge of the urban center, or 64 km from the center of the city. These results are consistent with METROMEX studies of St. Louis almost two decades ago and more recent studies near Atlanta. Future work will investigate hypothesized factors causing rainfall modification by urban areas. Additional work is also needed to provide more robust validation of space-based rain estimates near major urban areas. Such research has implications for urban planning, water resource management, and understanding human impact on the environment.
NASA Astrophysics Data System (ADS)
Karikari, A. Y.; Ampofo, J. A.
2013-06-01
Drinking water quality from two major treatment plants in Ghana, the Kpong and Weija plants, and their distribution networks in the Accra-Tema Metropolis was monitored monthly for a year at fifteen different locations. The study determined the relationship between chlorine residual, other physico-chemical qualities of the treated water, and bacterial regrowth. Results indicated that the treated water at the Kpong and Weija treatment plants conformed to WHO guidelines for potable water. However, the water quality deteriorated bacteriologically from the plants to the delivery points, with high numbers of indicator and opportunistic pathogens. This could be due to inadequate disinfection residual, biofilms, or accidental point-source contamination through broken pipes, installation and repair works. The mean turbidity ranged from 1.6 to 2.4 NTU; pH varied from 6.8 to 7.4; conductivity fluctuated from 71.1 to 293 μS/cm. Chlorine residual ranged from 0.13 to 1.35 mg/l. High residual chlorine was observed at the treatment plants, and it decreased with distance from the plants. Results showed that additional chlorination does not take place at the booster stations. Chlorine showed an inverse relationship with microbial counts. Total coliform bacteria ranged from 0 to 248 cfu/100 ml, and faecal coliform values varied from 0 to 128 cfu/100 ml. Other microorganisms observed in the treated water included Aeromonas spp., Clostridium spp. and Pseudomonas spp. Boiling water in the household before consumption will reduce water-related health risks.
Effects of snowmelt on watershed transit time distributions
NASA Astrophysics Data System (ADS)
Fang, Z.; Carroll, R. W. H.; Harman, C. J.; Wilusz, D. C.; Schumer, R.
2017-12-01
Snowmelt is the principal control of the timing and magnitude of water flow through alpine watersheds, but the streamflow generated may be displaced groundwater. To quantify this effect, we use a rank StorAge Selection (rSAS) model to estimate time-dependent travel time distributions (TTDs) for the East River Catchment (ERC, 84 km²) - a headwater basin of the Colorado River newly designated as the Lawrence Berkeley National Laboratory's Watershed Function Science Focus Area (SFA). Through the SFA, observational networks related to precipitation and stream fluxes have been established with a focus on environmental tracers and stable isotopes. The United States Geological Survey Precipitation Runoff Modeling System (PRMS) was used to estimate spatially and temporally variable boundary fluxes of effective precipitation (snowmelt and rain), evapotranspiration, and subsurface storage. The DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm was used to calibrate the rSAS model to observed stream isotopic concentration data and quantify uncertainty. The sensitivity of the simulated TTDs to systematic changes in the boundary fluxes was explored. Different PRMS and rSAS model parameter setups were tested to explore how they affect the relationship between input precipitation, especially snowmelt, and the estimated TTDs. Wavelet Coherence Analysis (WCA) was applied to investigate the seasonality of TTD simulations. Our ultimate goal is insight into how the Colorado River headwater catchments store and route water, and how sensitive flow paths and transit times are to climatic changes.
NASA Astrophysics Data System (ADS)
Jones, Alan G.; Afonso, Juan Carlos; Fullea, Javier
2015-04-01
The deep mantle African Superswell is thought to cause up to 500 m of the uplift of the Southern African Plateau. We investigate this phenomenon through stochastic thermo-chemical inversion modelling of the geoid, surface heat flow, Rayleigh and Love dispersion curves, and MT data, in a manner that is fully petrologically consistent. We invert for a three-layer crustal velocity, density, and thermal structure, but assume the resistivity layering (based on prior inversion of the MT data alone). Inversions are performed using an improved Delayed Rejection and Adaptive Metropolis (DRAM) type Markov chain Monte Carlo (MCMC) algorithm. We demonstrate that a single-layer lithosphere can fit most of the data, but not the MT responses. We further demonstrate that modelling the seismic data alone, without the constraint of requiring reasonable oxide chemistry or of fitting the geoid, permits a wide range of acceptable elevations with a very poorly defined lithosphere-asthenosphere boundary (LAB). We parameterise the lithosphere into three layers, and bound the permitted oxide chemistry of each layer consistent with known chemical layering. We find acceptable models, from 5 million tested in each case, that fit all responses and yield a posteriori elevation distributions centred on 900-950 m, suggesting dynamic support from the lower mantle of some 400 m.
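The adaptive Metropolis component of a DRAM-type MCMC sampler can be sketched in a few lines. The sketch below targets a toy correlated 2-D Gaussian rather than the geophysical posterior described above, omits the delayed-rejection stage, and all settings (target covariance, adaptation interval, chain length) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Adaptive Metropolis sketch: the proposal covariance is periodically
# re-estimated from the chain history (Haario-style adaptation).
# Target: a correlated 2-D Gaussian, so the result can be checked.
Sigma = np.array([[1.0, 0.8], [0.8, 1.0]])
Sigma_inv = np.linalg.inv(Sigma)

def log_post(x):
    return -0.5 * x @ Sigma_inv @ x

d = 2
sd = 2.4**2 / d                     # standard adaptive-Metropolis scaling
prop_cov = np.eye(d)
x = np.zeros(d)
chain = [x.copy()]
for step in range(20000):
    prop = rng.multivariate_normal(x, prop_cov)
    if np.log(rng.random()) < log_post(prop) - log_post(x):
        x = prop
    chain.append(x.copy())
    if step >= 1000 and step % 500 == 0:     # adapt from the history so far
        prop_cov = sd * np.cov(np.array(chain).T) + 1e-6 * np.eye(d)

draws = np.array(chain[5000:])               # discard burn-in
print(np.cov(draws.T))
```

The estimated sample covariance should approach the target covariance, including the 0.8 off-diagonal correlation.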
Meirovitch, Hagai
2010-01-01
The commonly used simulation techniques, Metropolis Monte Carlo (MC) and molecular dynamics (MD), are of a dynamical type which enables one to sample system configurations i correctly with the Boltzmann probability, P(i)(B), while the value of P(i)(B) is not provided directly; therefore, it is difficult to obtain the absolute entropy, S ≈ -ln P(i)(B), and the Helmholtz free energy, F. With a different simulation approach developed in polymer physics, a chain is grown step by step with transition probabilities (TPs), and thus their product is the value of the construction probability; therefore, the entropy is known. Because all exact simulation methods are equivalent, i.e., they lead to the same averages and fluctuations of physical properties, one can treat an MC or MD sample as if its members had been generated step by step. Thus, each configuration i of the sample can be reconstructed (from nothing) by calculating the TPs with which it could have been constructed. This idea also applies to bulk systems such as fluids or magnets. This approach led earlier to the "local states" (LS) and the "hypothetical scanning" (HS) methods, which are approximate in nature. A recent development is the hypothetical scanning Monte Carlo (HSMC) (or molecular dynamics, HSMD) method, which is based on stochastic TPs where all interactions are taken into account. In this respect, HSMC(D) can be viewed as exact, and the only approximation involved is due to insufficient MC(MD) sampling for calculating the TPs. The validity of HSMC has been established by applying it first to liquid argon, TIP3P water, self-avoiding walks (SAW), and polyglycine models, where the results for F were found to agree with those obtained by other methods. Subsequently, HSMD was applied to mobile loops of the enzymes porcine pancreatic alpha-amylase and acetylcholinesterase in explicit water, where the difference in F between the bound and free states of the loop was calculated.
Currently, HSMD is being extended for calculating the absolute and relative free energies of ligand-enzyme binding. We describe the whole approach and discuss future directions. 2009 John Wiley & Sons, Ltd.
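The step-by-step construction idea behind the HS/HSMC approach can be illustrated on a deliberately trivial system (independent spins in a field, not one of the models studied above), where the product of transition probabilities, and hence the entropy, is known exactly:

```python
import numpy as np

# Toy illustration of entropy from construction probabilities (not HSMC
# itself): grow configurations of N independent spins in a field site by
# site. Each spin is set "up" with its Boltzmann probability, so the product
# of per-site transition probabilities equals the construction probability
# P(i), and S = -<ln P(i)> is available directly.
rng = np.random.default_rng(0)
beta_h = 0.5                                   # field strength in units of kT
p_up = np.exp(beta_h) / (2.0 * np.cosh(beta_h))

N, samples = 50, 20000
ln_P = np.zeros(samples)
for s in range(samples):
    spins = rng.random(N) < p_up               # grow the configuration
    # accumulate ln of the transition probability actually used at each site
    ln_P[s] = np.sum(np.where(spins, np.log(p_up), np.log(1.0 - p_up)))

S_est = -ln_P.mean() / N                       # entropy per spin (k_B = 1)
S_exact = np.log(2 * np.cosh(beta_h)) - beta_h * np.tanh(beta_h)
print(S_est, S_exact)
```

For this non-interacting system the estimate matches the closed-form entropy; the point of HS/HSMC is to approximate the analogous TPs when interactions make them unavailable in closed form.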
Lattice QCD calculation using VPP500
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Seyong; Ohta, Shigemi
1995-02-01
A new vector parallel supercomputer, Fujitsu VPP500, was installed at RIKEN earlier this year. It consists of 30 vector computers, each with 1.6 GFLOPS peak speed and 256 MB memory, connected by a crossbar switch with 400 MB/s peak data transfer rate each way between any pair of nodes. The authors developed a Fortran lattice QCD simulation code for it. It runs at about 1.1 GFLOPS sustained per node for Metropolis pure-gauge update, and about 0.8 GFLOPS sustained per node for conjugate gradient inversion of staggered fermion matrix.
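The conjugate gradient inversion benchmarked above can be sketched for a generic Hermitian positive-definite system; the staggered fermion matrix itself is not reproduced here, and the test matrix is an illustrative stand-in:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for a Hermitian positive-definite A by conjugate
    gradient, the same scheme used to invert fermion matrices in lattice QCD,
    shown here on a small generic matrix."""
    x = np.zeros_like(b)
    r = b - A @ x                    # initial residual
    p = r.copy()                     # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)    # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# small symmetric positive-definite test system
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20 * np.eye(20)        # shifted to be well conditioned
b = rng.standard_normal(20)
x = conjugate_gradient(A, b)
print(np.linalg.norm(A @ x - b))
```

In production lattice codes the matrix is never formed explicitly; only the matrix-vector product `A @ p` is implemented, which is what made the method suitable for vector machines like the VPP500.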
Out to eat: the emergence and evolution of the restaurant in nineteenth-century New York City.
Lobel, Cindy R
2010-01-01
Unheard of in the eighteenth century, restaurants became an integral part of New York City's public culture in the antebellum period. This article examines the emergence and development of New York's restaurant sector in the nineteenth century, focusing on three aspects in particular: the close ties between urbanization and the rise of New York's restaurants, the role restaurants played in enforcing the city's class structure and gender mores, and the role of restaurants in shaping the public culture of the growing metropolis.
A Bayesian hierarchical model with novel prior specifications for estimating HIV testing rates.
An, Qian; Kang, Jian; Song, Ruiguang; Hall, H Irene
2016-04-30
Human immunodeficiency virus (HIV) infection is a severe infectious disease actively spreading globally, and acquired immunodeficiency syndrome (AIDS) is an advanced stage of HIV infection. The HIV testing rate, that is, the probability that an AIDS-free HIV-infected person seeks a test for HIV during a particular time interval, given that no previous positive test has been obtained prior to the start of the interval, is an important parameter for public health. In this paper, we propose a Bayesian hierarchical model with two levels of hierarchy to estimate the HIV testing rate using annual AIDS and AIDS-free HIV diagnoses data. At level one, we model the latent number of HIV infections for each year using a Poisson distribution with the intensity parameter representing the HIV incidence rate. At level two, the annual numbers of AIDS and AIDS-free HIV diagnosed cases and all undiagnosed cases stratified by the HIV infections at different years are modeled using a multinomial distribution with parameters including the HIV testing rate. We propose a new class of priors for the HIV incidence rate and HIV testing rate taking into account the temporal dependence of these parameters to improve the estimation accuracy. We develop an efficient posterior computation algorithm based on the adaptive rejection Metropolis sampling technique. We demonstrate our model using simulation studies and the analysis of the national HIV surveillance data in the USA. Copyright © 2015 John Wiley & Sons, Ltd.
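The posterior computation above relies on adaptive rejection Metropolis sampling; as a simpler stand-in, the sketch below uses a plain random-walk Metropolis update on the log scale for a Poisson rate with a Gamma prior, where the conjugate posterior provides an exact check. The data and prior values are hypothetical and not from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical count data with a Gamma(a, b) prior on the Poisson rate.
# The conjugate posterior Gamma(a + sum(y), b + n) gives an exact answer
# against which the Metropolis chain can be checked.
y = np.array([3, 5, 4, 6, 2, 5, 7, 4])
a, b = 2.0, 1.0

def log_post(lam):
    # Gamma(a + sum(y), b + n) log-density up to a constant
    return (a + y.sum() - 1) * np.log(lam) - (b + len(y)) * lam

lam, chain = 1.0, []
for _ in range(50000):
    prop = lam * np.exp(rng.normal(0.0, 0.3))   # log-scale random walk
    # include the log-normal proposal's Jacobian factor prop/lam
    log_acc = log_post(prop) - log_post(lam) + np.log(prop) - np.log(lam)
    if np.log(rng.random()) < log_acc:
        lam = prop
    chain.append(lam)

post_mean = np.mean(chain[5000:])               # discard burn-in
exact_mean = (a + y.sum()) / (b + len(y))
print(post_mean, exact_mean)
```

The multiplicative proposal keeps the rate positive without reparameterizing; the extra `log(prop) - log(lam)` term is the Hastings correction for its asymmetry.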
Multiscale implementation of infinite-swap replica exchange molecular dynamics.
Yu, Tang-Qing; Lu, Jianfeng; Abrams, Cameron F; Vanden-Eijnden, Eric
2016-10-18
Replica exchange molecular dynamics (REMD) is a popular method to accelerate conformational sampling of complex molecular systems. The idea is to run several replicas of the system in parallel at different temperatures that are swapped periodically. These swaps are typically attempted every few MD steps and accepted or rejected according to a Metropolis-Hastings criterion. This guarantees that the joint distribution of the composite system of replicas is the normalized sum of the symmetrized product of the canonical distributions of these replicas at the different temperatures. Here we propose a different implementation of REMD in which (i) the swaps obey a continuous-time Markov jump process implemented via Gillespie's stochastic simulation algorithm (SSA), which also samples exactly the aforementioned joint distribution and has the advantage of being rejection free, and (ii) this REMD-SSA is combined with the heterogeneous multiscale method to accelerate the rate of the swaps and reach the so-called infinite-swap limit that is known to optimize sampling efficiency. The method is easy to implement and can be trivially parallelized. Here we illustrate its accuracy and efficiency on the examples of alanine dipeptide in vacuum and C-terminal β-hairpin of protein G in explicit solvent. In this latter example, our results indicate that the landscape of the protein is a triple funnel with two folded structures and one misfolded structure that are stabilized by H-bonds.
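The conventional Metropolis-Hastings swap criterion that the abstract takes as its baseline can be sketched for two replicas sampling a double-well potential, a deliberately simple stand-in for the molecular systems above (the SSA-driven infinite-swap scheme itself is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)

def U(x):                       # double-well potential, minima at x = ±1
    return (x**2 - 1.0)**2

betas = np.array([8.0, 1.0])    # cold and hot inverse temperatures
x = np.array([1.0, 1.0])        # one coordinate per replica
cold_samples = []
for step in range(20000):
    # local Metropolis move in each replica at its own temperature
    for i in range(2):
        prop = x[i] + rng.normal(0.0, 0.4)
        if np.log(rng.random()) < -betas[i] * (U(prop) - U(x[i])):
            x[i] = prop
    # periodic swap attempt with the Metropolis-Hastings criterion
    if step % 10 == 0:
        log_acc = (betas[0] - betas[1]) * (U(x[0]) - U(x[1]))
        if np.log(rng.random()) < log_acc:
            x[0], x[1] = x[1], x[0]
    cold_samples.append(x[0])

cold = np.array(cold_samples)
# with swaps enabled, the cold replica should visit both wells
print((cold > 0.5).any(), (cold < -0.5).any())
```

At the cold temperature alone the barrier crossing is exponentially suppressed; the hot replica crosses freely and the swaps transport those crossings to the cold chain, which is the mechanism the infinite-swap limit pushes to its optimum.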
Efficient Implementation of MrBayes on Multi-GPU
Zhou, Jianfu; Liu, Xiaoguang; Wang, Gang
2013-01-01
MrBayes, using Metropolis-coupled Markov chain Monte Carlo (MCMCMC or (MC)3), is a popular program for Bayesian inference. As a leading method of using DNA data to infer phylogeny, the (MC)3 Bayesian algorithm and its improved and parallel versions are now not fast enough for biologists to analyze massive real-world DNA data. Recently, the graphics processing unit (GPU) has shown its power as a coprocessor (or rather, an accelerator) in many fields. This article describes an efficient implementation, a(MC)3 (aMCMCMC), of MrBayes (MC)3 on compute unified device architecture. By dynamically adjusting the task granularity to adapt to input data size and hardware configuration, it makes full use of GPU cores with different data sets. An adaptive method is also developed to split and combine DNA sequences to make full use of a large number of GPU cards. Furthermore, a new “node-by-node” task scheduling strategy is developed to improve concurrency, and several optimizing methods are used to reduce extra overhead. Experimental results show that a(MC)3 achieves up to 63× speedup over serial MrBayes on a single machine with one GPU card, up to 170× speedup with four GPU cards, and up to 478× speedup with a 32-node GPU cluster. a(MC)3 is dramatically faster than all the previous (MC)3 algorithms and scales well to large GPU clusters. PMID:23493260
NASA Astrophysics Data System (ADS)
Han, Feng; Zheng, Yi
2018-06-01
Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead the modeling-based management decisions, if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
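The differential-evolution proposal at the core of DREAM-type samplers can be sketched with a basic DE-MC chain (without DREAM(ZS)'s subspace sampling or past-state archive), targeting a toy 2-D Gaussian with known mean rather than a watershed model posterior:

```python
import numpy as np

rng = np.random.default_rng(0)

# Basic differential-evolution MCMC: each chain proposes a jump along the
# difference of two other randomly chosen chains. Target mean is known, so
# the pooled draws can be checked.
mu = np.array([1.0, -2.0])

def log_post(theta):
    return -0.5 * np.sum((theta - mu)**2)

n_chains, d, n_iter = 8, 2, 4000
gamma = 2.38 / np.sqrt(2 * d)        # standard DE-MC jump scaling
X = rng.standard_normal((n_chains, d))
history = []
for _ in range(n_iter):
    for i in range(n_chains):
        a, b = rng.choice([j for j in range(n_chains) if j != i], 2,
                          replace=False)
        prop = X[i] + gamma * (X[a] - X[b]) + rng.normal(0, 1e-4, d)
        if np.log(rng.random()) < log_post(prop) - log_post(X[i]):
            X[i] = prop
    history.append(X.copy())

draws = np.concatenate(history[1000:])   # discard burn-in, pool all chains
print(draws.mean(axis=0))
```

The appeal of this move for calibration problems like the one above is that the proposal scale and orientation adapt automatically to the posterior through the spread of the chain population.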
NASA Astrophysics Data System (ADS)
Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.
2011-12-01
The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting, which relies on a coupled snow model (i.e. SNOW17) and rainfall-runoff model (i.e. SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameter sets (i.e. RFC, DREAM (Differential Evolution Adaptive Metropolis)) as well as a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe the influence of variability in initial conditions on the forecast. The study basin is the North Fork American River Basin (NFARB), located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (i.e. Nash-Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (i.e. reliability diagram, discrimination diagram, containing ratio, and quantile plots) statistics. Our presentation includes comparison of the performance of different optimized parameters and the DA framework, as well as assessment of the impact associated with the initial conditions used for streamflow forecasts for the NFARB.
Estimating rare events in biochemical systems using conditional sampling.
Sundar, V S
2017-01-28
The paper focuses on the development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining such a probability using brute-force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
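The subset-simulation idea, expressing a rare-event probability as a product of more frequent conditional probabilities estimated by modified Metropolis chains, can be sketched on a scalar toy problem (estimating P(X ≥ 4) for a standard normal, true value ≈ 3.2e-5) rather than a biochemical network:

```python
import numpy as np

rng = np.random.default_rng(1)

def g(x):                 # limit-state function: "failure" when g(x) >= 4
    return x

N, p0, target = 2000, 0.1, 4.0
x = rng.standard_normal(N)
prob = 1.0
for _ in range(10):
    y = g(x)
    order = np.argsort(y)[::-1]          # sort descending by g
    ns = int(p0 * N)
    b = y[order[ns - 1]]                 # intermediate threshold
    if b >= target:                      # final level reached
        prob *= np.mean(y >= target)
        break
    prob *= p0                           # each level contributes factor p0
    seeds = x[order[:ns]]
    # modified Metropolis: chains conditioned on g(x) >= b
    chains = []
    for s in seeds:
        cur = s
        for _ in range(N // ns):
            cand = cur + rng.normal(0.0, 1.0)
            # accept w.r.t. the N(0,1) target AND reject moves that
            # leave the current level set
            if (rng.random() < np.exp(0.5 * (cur**2 - cand**2))
                    and g(cand) >= b):
                cur = cand
            chains.append(cur)
    x = np.array(chains)
print(prob)
```

Crude Monte Carlo would need tens of millions of samples to see this event at all; here each level requires only a few thousand conditional samples.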
Matsumoto, Yoko; Nakai, Akihito; Nishijima, Yasuhiro; Kishita, Eisaku; Hakuno, Haruhiko; Sakoi, Masami; Kusuda, Satoshi; Unno, Nobuya; Tamura, Masanori; Fujii, Tomoyuki
2016-10-01
National medical projects are carried out according to medical care plans directed by the Medical Care Act of Japan. In order to improve Japanese perinatal medical care, it is necessary to determine the factors that might influence perinatal outcome. Statistical data on births and perinatal deaths were obtained for all municipalities in Japan from 2008 to 2012 from the Portal Site of Official Statistics of Japan (e-Stat). The perinatal mortality of all 349 Japanese secondary medical care zones was calculated. The numbers of neonatal intensive care units (NICUs), maternal-fetal intensive care units (MFICUs), pediatricians, and obstetricians in 2011 were also obtained from e-Stat. Nine secondary medical care zones in two prefectures, Fukushima (7) and Miyagi (2), were excluded to eliminate the influence of the 2011 Great East Japan Earthquake. The 340 secondary medical care zones were divided into three groups according to population size and density: metropolis, provincial city, and depopulation. The numbers of secondary medical care zones in these groups were 52, 168, and 120, respectively. The secondary medical care zones in the depopulation group had fewer pediatricians and significantly fewer NICUs and MFICUs than the metropolis group, but there was no significant difference in perinatal mortality. The only independent risk factor for high perinatal mortality, determined by multivariable analysis, was the absence of an NICU (P = 0.011). To consider directions in perinatal medical care, planned arrangement of and appropriate access to NICUs is indispensable. © 2016 Japan Society of Obstetrics and Gynecology.
NASA Astrophysics Data System (ADS)
Alaigba, D. B.; Soumah, M.; Banjo, M. O.
2017-05-01
The problem of urban mobility is complicated by traffic delay resulting from poor planning, high population density, and the poor condition of roads within urban spaces. This study assessed traffic congestion resulting from the differential contributions made by various land uses along the Apapa-Oworonshoki expressway in Lagos metropolis. The data for this study were from both primary and secondary sources; GPS point data were collected at selected points for traffic volume counts, together with observation of the nature of vehicular traffic congestion and land use types along the corridor. Existing data on traffic counts along the corridor, a connectivity map, and a land use map were acquired from the relevant authorities. Traffic congestion within the area was estimated using the volume-capacity ratio (V/C). A heterogeneity index was developed and used to quantify the percentage contribution to traffic volume from various land-use categories. Analytic Hierarchy Process (AHP) and knowledge-based weighting were used to rank the importance of different heterogeneity indices. Results showed a significant relationship between the degree of heterogeneity of the land use pattern and road traffic congestion. The volume-capacity ratio computed revealed that the route corridor exceeds its designed capacity in the southward direction between the hours of 8 am and 12 pm on working days. The major nodes analyzed along the corridor were all above the expected Passenger Car Unit (PCU): "Oshodi" 15 %, "Airport junction" 10 %, "Cele bus stop" 21 %, "Mile 2" 14 %, "Berger" 15 %, and "Tincan bus stop" 33 %, indicating heavy traffic congestion.
Assessment of Heavy Metal Pollution in Topsoil around Beijing Metropolis
Sun, Ranhao; Chen, Liding
2016-01-01
The topsoil around Beijing metropolis, China, is experiencing impacts of rapid urbanization, intensive farming, and extensive industrial emissions. We analyzed the concentrations of Cu, Ni, Pb, Zn, Cd, and Cr from 87 topsoil samples in the pre-rainy season and 115 samples in the post-rainy season. These samples were attributed to nine land use types: forest, grass, shrub, orchard, wheat, cotton, spring maize, summer maize, and mixed farmland. The pollution index (PI) of heavy metals was calculated from the measured and background concentrations. The ecological risk index (RI) was assessed based on the PI values and toxic-response parameters. The results showed that the mean PI values of Pb, Cr, and Cd were > 1 while those of Cu, Ni, and Zn were < 1. All the samples had low ecological risk for Cu, Ni, Pb, Zn, and Cr, while only 15.35% of samples had low ecological risk for Cd. Atmospheric transport rather than land use factors best explained the seasonal variations in heavy metal concentrations, and the impact of atmospheric transport on heavy metal concentrations varied according to the heavy metal types. The concentrations of Cu, Cd, and Cr decreased from the pre- to post-rainy season, while those of Ni, Pb, and Zn increased during this period. Future research should be focused on the underlying atmospheric processes that lead to these spatial and seasonal variations in heavy metals. Policymaking on environmental management should pay close attention to potential ecological risks of Cd as well as identifying the transport pathways of different heavy metals. PMID:27159454
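The pollution-index and risk-index arithmetic described above can be sketched directly; the measured and background concentrations and the Hakanson-style toxic-response factors below are hypothetical illustrations, not the study's data:

```python
# Hypothetical measured and background concentrations (mg/kg) and
# toxic-response factors; none of these numbers come from the study itself.
measured   = {"Cu": 25.0, "Ni": 22.0, "Pb": 40.0,
              "Zn": 70.0, "Cd": 0.30, "Cr": 75.0}
background = {"Cu": 30.0, "Ni": 28.0, "Pb": 25.0,
              "Zn": 80.0, "Cd": 0.10, "Cr": 60.0}
toxic_resp = {"Cu": 5, "Ni": 5, "Pb": 5, "Zn": 1, "Cd": 30, "Cr": 2}

# pollution index: measured concentration over background concentration
PI = {m: measured[m] / background[m] for m in measured}

# ecological risk index: toxic-response-weighted sum of the PI values
RI = sum(toxic_resp[m] * PI[m] for m in PI)

for m, v in PI.items():
    print(f"{m}: PI = {v:.2f}")
print(f"RI = {RI:.1f}")
```

Note how Cd dominates the risk index: even a modest PI is amplified by its large toxic-response factor, which mirrors the paper's finding that Cd deserves particular attention.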
Amjadian, Keyvan; Sacchi, Elisa; Rastegari Mehr, Meisam
2016-11-01
Urban soil contamination is a growing concern for its potential health impact on the increasing number of people living in these areas. In this study, the concentration, distribution, and contamination levels of pollutants, and the role of land use, were investigated in Erbil metropolis, the capital of Iraqi Kurdistan. A total of 74 soil samples were collected, treated, and analyzed for their physicochemical properties, and for 7 heavy metals (As, Cd, Cr, Cu, Fe, Pb, and Zn) and 16 PAH contents. High concentrations, especially of Cd, Cu, Pb, and Zn, were found. The geoaccumulation index (Igeo), along with correlation coefficients and principal component analysis (PCA), showed that Cd, Cu, Pb, and Zn have similar behaviors and spatial distribution patterns. Heavy traffic density mainly contributed to the high concentrations of these metals. The total concentration of ∑PAHs ranged from 24.26 to 6129.14 ng/g with a mean of 2296.1 ng/g. The PAH pattern was dominated by 4- and 5-ring PAHs, while diagnostic ratios and PCA indicated that the main sources of PAHs were pyrogenic. The toxic equivalent (TEQ) values ranged from 3.26 to 362.84 ng/g, with higher values in central parts of the city. A statistically significant difference in As, Cd, Cu, Pb, Zn, and ∑PAH concentrations between different land uses was observed. The highest As concentrations were found in agricultural areas, while roadside, commercial, and industrial areas had the highest Cd, Cu, Pb, Zn, and ∑PAH contents.
NASA Technical Reports Server (NTRS)
Shepherd, J. Marshall; Starr, David OC. (Technical Monitor)
2001-01-01
A novel approach is introduced to correlating urbanization and rainfall modification. This study represents one of the first published attempts (possibly the first) to identify and quantify rainfall modification by urban areas using satellite-based rainfall measurements. Previous investigations successfully used rain gauge networks and ground-based radar to investigate this phenomenon but still encountered difficulties due to limited, specialized measurements and separation of topographic and other influences. Three years of mean monthly rainfall rates derived from the first space-based rainfall radar, the Tropical Rainfall Measuring Mission's (TRMM) Precipitation Radar, are employed. Analysis of data at half-degree latitude resolution enables identification of rainfall patterns around the major metropolitan areas of Atlanta, Montgomery, Nashville, San Antonio, Waco, and Dallas during the warm season. Preliminary results reveal an average increase of 5.6% in monthly rainfall rates (relative to a mean upwind CONTROL area) over the metropolis but an average increase of approximately 28% in monthly rainfall rates within 30-60 kilometers downwind of the metropolis. Some portions of the downwind area exhibit increases as high as 51%. It was also found that maximum rainfall rates found in the downwind impact area exceeded the mean value in the upwind CONTROL area by 48%-116% and were generally found at an average distance of 39 km from the edge of the urban center, or 64 km from the center of the city. These results are quite consistent with studies of St. Louis (e.g., METROMEX) and Chicago almost two decades ago and with more recent studies in the Atlanta and Mexico City areas.
Pt-Zn Clusters on Stoichiometric MgO(100) and TiO2(110): Dramatically Different Sintering Behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dadras, Mostafa J.; Shen, Lu; Alexandrova, Anastassia N.
2015-03-02
Zn was suggested to be a promising additive to Pt in the catalysis of dehydrogenation reactions. In this work, mixed Pt-Zn clusters deposited on two simple oxides, MgO(100) and TiO2(110), were investigated. The stability of these systems against cluster sintering, one of the major mechanisms of catalyst deactivation, is simulated using a Metropolis Monte Carlo scheme under the assumption of the Ostwald ripening mechanism. Particle migration, association to and dissociation from clusters, and evaporation and redeposition of monomers were all included in the simulations. Simulations are done at several high temperatures relevant to reactions of catalytic dehydrogenation. The effect of temperature is included via both the Metropolis algorithm and the Boltzmann-weighted populations of the global and thermally accessible local minima on the density functional theory potential energy surfaces of clusters of all sizes and compositions up to tetramers. On both surfaces, clusters are shown to sinter quite rapidly. However, the resultant compositions of the clusters most resistant to sintering are quite different on the two supports. On TiO2(110), Pt and Zn appear to phase separate, preferentially forming clusters rich in just one or the other metal. On MgO(100), Pt and Zn remain well-mixed and form a range of bimetallic clusters of various compositions that appear relatively stable. However, Zn is more easily lost from MgO through evaporation. These phenomena were rationalized by several means of chemical bonding analysis.
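The Boltzmann weighting of global and local minima mentioned above reduces to a short calculation; the isomer energies and temperature below are hypothetical, for illustration only:

```python
import numpy as np

# Boltzmann-weighted populations of a cluster's global minimum and two
# thermally accessible local minima. Energies are hypothetical, given
# relative to the global minimum.
kB = 8.617e-5                             # Boltzmann constant, eV/K
energies = np.array([0.00, 0.12, 0.25])   # isomer energies, eV
T = 700.0                                 # a dehydrogenation-relevant T, K

w = np.exp(-energies / (kB * T))
populations = w / w.sum()                 # normalized Boltzmann weights
print(populations)
```

At this temperature kT ≈ 0.06 eV, so isomers a tenth of an eV above the global minimum still carry non-negligible weight, which is why the thermally accessible local minima matter in the sintering model.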
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ojedokun, Oluyinka, E-mail: yinkaoje2004@yahoo.com
Highlights: > Independently, altruism and locus of control contributed significantly toward attitude towards littering. > Altruism and locus of control jointly contributed significantly to attitude towards littering. > The results further show a significant joint influence of altruism and locus of control on REB. > The independent contributions reveal that altruism and locus of control contribute significantly to REB. > Attitude towards littering mediates the relationship between locus of control and REB. - Abstract: The study tested whether attitude towards littering mediates the relationship between personality attributes (altruism and locus of control) and responsible environmental behavior (REB) among some residents of Ibadan metropolis, Nigeria. Using a multistage sampling technique, measures of each construct were administered to 1360 participants. Results reveal significant independent and joint influence of personality attributes on attitude towards littering and responsible environmental behavior, respectively. Attitude towards littering also mediates the relationship between personality characteristics and REB. These findings imply that individuals who possess certain desirable personality characteristics and who have an unfavorable attitude towards littering have a greater tendency to engage in pro-environmental behavior. Therefore, stakeholders who have waste management as their priority should incorporate this information when guidelines for public education and litter prevention programs are being developed. It is suggested that psychologists should be involved in the design of litter prevention strategies. This will ensure the inclusion of behavioral issues in such strategies. An integrated approach to litter prevention that combines empowerment, cognitive, social, and technical solutions is recommended as the most effective tool for tackling the litter problem among residents of Ibadan metropolis.
PREFACE: Symmetry and Structural Properties of Condensed Matter
NASA Astrophysics Data System (ADS)
Lulek, Tadeusz; Wal, Andrzej; Lulek, Barbara
2008-03-01
This volume comprises the proceedings of the Ninth Summer School on Theoretical Physics under the leading title `Symmetry and Structural Properties of Condensed Matter' (SSPCM 2007). The school was organised by Rzeszów University of Technology, Poland, together with AGH University of Science and Technology, Cracow, Poland, on 5-12 September 2007 in Myczkowce. The meeting aimed to continue the series of biannual SSPCM schools (held since 1990), and focused on the promotion of some advanced mathematical methods within the physics of condensed matter, with an emphasis on quantum information aspects. The main topics of the SSPCM07 school were the following: quantum information and computing; finite-dimensional Hilbert spaces; and generating functions and exactly soluble models. The Proceedings are divided into three parts accordingly. These topics can be seen as a natural continuation of the previous SSPCM05 school, aimed at studying interrelations between solid state physics and quantum informatics, as well as an extension of earlier SSPCM meetings, devoted to mathematical tools of condensed matter theory. The school gathered together more than 60 participants from 11 countries and 7 scientific centres in Poland. Some of them were there for the first time, and some had attended nearly all previous meetings. We had advanced researchers as well as their young collaborators and students. Acknowledgements The Organizing Committee wishes to express our gratitude to all participants for their many activities at the school and for creating so friendly and inspiring an atmosphere that one can talk about the term: `SSPCM society'. Special thanks are due to all lecturers, for preparing and presenting their talks, and for several valuable discussions. 
We also thank all those who prepared manuscripts, thus giving us an opportunity to share their ideas, all referees, who significantly improved the quality of this volume, all members of our International Advisory Committee, and the chairmen for their polite but efficient leading of sessions. It is our sad duty to inform the whole SSPCM society that one of us, Professor Jan Mozrzymas from The Institute of Theoretical Physics, Wrocław University, died on 8 January 2006. He was a lecturer and a very active participant of our first four SSPCM meetings in Zajaczkowo near Poznań. Some of us remember his lectures on crystallography, solitons and motions of electric charges, nicely intertwined with differential geometry, theta functions, knots or Riemann surfaces. We shall keep his attitude towards physics and mathematics in our minds. As the Organizing Committee, we would like to express our special gratitude to The Nicholas C Metropolis Mathematics Foundation (USA) for substantial financial support of our last two SSPCM schools. We direct this gratitude to Professor James D Louck, the President of this Foundation, and also one of the most regular lecturers of our series of SSPCM schools. He was so kind as to introduce us, in his impressive address at the ceremonial dinner, to the history and aims of the Foundation, which we would like to sketch briefly here. The Founder, Professor Nicholas C Metropolis, was a famous researcher who started his work in Chicago in 1942 within the Fermi group that was building the first nuclear reactor; in 1943 he joined the group of Bethe in Los Alamos, taking part in the construction of the atomic bomb. He was in scientific contact with many well-known mathematicians and physicists of those times: Feynman, von Neumann, Ulam, Szilard, Teller, and Wigner. Before he died in 1999, he created The Mathematics Foundation with the purpose of supporting science, with a special emphasis on mathematical orientation. 
We wish to thank the Nicholas C Metropolis Foundation not only as a benefactor (in fact, this support has allowed us to initiate the last two SSPCM schools), but also for giving us the honour of being seen as proper members of such an important tradition. We would also like to acknowledge the support of the European Physical Society. On behalf of the organizers of SSPCM 2007 Tadeusz Lulek, Andrzej Wal and Barbara Lulek Editors
Chou, Sheng-Kai; Jiau, Ming-Kai; Huang, Shih-Chia
2016-08-01
The growing ubiquity of vehicles has led to increased concerns about environmental issues. These concerns can be mitigated by implementing an effective carpool service. In an intelligent carpool system, an automated service process assists carpool participants in determining routes and matches. This is a discrete optimization problem that involves a system-wide condition as well as participants' expectations. In this paper, we solve the carpool service problem (CSP) to provide satisfactory ride matches. To this end, we developed a particle swarm carpool algorithm based on stochastic set-based particle swarm optimization (PSO). Our method introduces stochastic coding to augment traditional particles, and uses three notions to represent a particle: 1) particle position; 2) particle view; and 3) particle velocity. In this way, the set-based PSO (S-PSO) can be realized by local exploration. In the simulation and experiments, two kinds of discrete PSO, the S-PSO and the binary PSO (BPSO), and a genetic algorithm (GA) are compared and examined using benchmarks that simulate a real-world metropolis. We observed that the S-PSO consistently outperformed the BPSO and the GA. Moreover, our method yielded the best result in a statistical test and successfully obtained numerical results for meeting the optimization objectives of the CSP.
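As a rough, generic illustration of the underlying optimizer, a plain continuous PSO minimizing a toy objective can be sketched as below. This is not the paper's set-based S-PSO (whose stochastic coding, particle view, and discrete moves are more involved); the objective, bounds, and inertia/acceleration coefficients are illustrative assumptions:

```python
import random

def pso(f, dim, n_particles=20, iters=200, seed=5):
    """Plain continuous PSO (a generic sketch, NOT the set-based S-PSO)."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration coefficients
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # per-particle best positions
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Velocity update: inertia + pull toward personal and global bests
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

# Minimize a 3-D sphere function as a smoke test
best, best_f = pso(lambda p: sum(x * x for x in p), dim=3)
```

The discrete S-PSO of the paper replaces the arithmetic position/velocity updates with set operations, but the personal-best/global-best bookkeeping follows the same pattern.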
Active heat pulse sensing of 3-D-flow fields in streambeds
NASA Astrophysics Data System (ADS)
Banks, Eddie W.; Shanafield, Margaret A.; Noorduijn, Saskia; McCallum, James; Lewandowski, Jörg; Batelaan, Okke
2018-03-01
Profiles of temperature time series are commonly used to determine hyporheic flow patterns and hydraulic dynamics in the streambed sediments. Although hyporheic flows are 3-D, past research has focused on determining the magnitude of the vertical flow component and how this varies spatially. This study used a portable 56-sensor, 3-D temperature array with three heat pulse sources to measure the flow direction and magnitude up to 200 mm below the water-sediment interface. Short, 1 min heat pulses were injected at one of the three heat sources and the temperature response was monitored over a period of 30 min. Breakthrough curves from each of the sensors were analysed using a heat transport equation. Parameter estimation and uncertainty analysis were undertaken using the Differential Evolution Adaptive Metropolis (DREAM) algorithm, an adaptation of the Markov chain Monte Carlo method, to estimate the flux and its orientation. Measurements were conducted in the field and in a sand tank under an extensive range of controlled hydraulic conditions to validate the method. The use of short-duration heat pulses provided a rapid, accurate assessment technique for determining dynamic and multi-directional flow patterns in the hyporheic zone and is a basis for improved understanding of biogeochemical processes at the water-streambed interface.
Amankwaa, Isaac; Agyemang-Dankwah, Anabella; Boateng, Daniel
2015-01-01
Introduction. Success in the licensure examination is the only legal prerequisite to practice as a nurse in Ghana. However, a large percentage of nursing students fail this examination at their first sitting. This study sought to unravel whether prior education, sociodemographic characteristics, and nursing Cumulative Grade Point Average (CGPA) could predict performance in the licensure examinations. Methods. The study was a descriptive cross-sectional survey conducted from November 2014 to April 2015 in the Kumasi metropolis, Ghana on 176 past nursing students. Data was collected using questionnaires and analyzed using SPSS version 22. A logistic regression model was fitted to look at the influence of the explanatory variables on the odds of passing the licensure examinations. All statistical significances were tested at a p value of <0.05. Results. The majority, 56.3%, were female and 86.4% were between the ages of 25 and 31 years. Most of the students (88.6%) entered the nursing training colleges with a WASSCE qualification and 38% read general science. Overall, 73.9% passed the licensure examinations and the mean CGPA of the students was 2.89 (SD = 0.37). Sociodemographic characteristics and previous education had no influence on performance in the licensure examinations. CGPA had a strong positive relationship with performance in the licensure examinations (AOR = 15.27; 95% CI = 6.28, 27.11). Conclusion. Students' CGPA could be a good predictor of their performance in the licensure examinations. On the other hand, students' sociodemographic and previous educational characteristics might not be important factors to consider in admitting students into the nursing training programme. PMID:26635975
NASA Astrophysics Data System (ADS)
Jakkareddy, Pradeep S.; Balaji, C.
2016-09-01
This paper employs the Bayesian-based Metropolis-Hastings Markov Chain Monte Carlo (MH-MCMC) algorithm to solve the inverse heat transfer problem of determining the spatially varying heat transfer coefficient from a flat plate with flush-mounted discrete heat sources, given measured temperatures at the bottom of the plate. The Nusselt number is assumed to be of the form Nu = a Re^b (x/l)^c. To input reasonable values of 'a' and 'b' into the inverse problem, limited two-dimensional conjugate convection simulations were first done with Comsol. Guided by these, different values of 'a' and 'b' were input to a computationally less complex problem of conjugate conduction in the flat plate (15 mm thickness), and temperature distributions at the bottom of the plate, which is a more convenient location for measuring the temperatures without disturbing the flow, were obtained. Since the goal of this work is to demonstrate the efficacy of the Bayesian approach to accurately retrieve 'a' and 'b', numerically generated temperatures with known values of 'a' and 'b' are treated as 'surrogate' experimental data. The inverse problem is then solved by repeatedly using the forward solutions together with the MH-MCMC approach. To speed up the estimation, the forward model is replaced by an artificial neural network. The mean, maximum-a-posteriori and standard deviation of the estimated parameters 'a' and 'b' are reported. The robustness of the proposed method is examined by synthetically adding noise to the temperatures.
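The parameter-retrieval loop described above can be illustrated with a minimal random-walk Metropolis-Hastings sketch. The power-law forward model, noise level, step size, and "true" values below are stand-ins for the paper's conjugate-conduction/neural-network forward solver, chosen only so the example is self-contained:

```python
import math
import random

# Hypothetical power-law forward model standing in for the paper's
# conjugate-conduction / neural-network solver (illustrative assumption only).
def forward(a, b, x):
    return a * x ** b

def log_likelihood(params, data, sigma=0.05):
    a, b = params
    return -sum((t - forward(a, b, x)) ** 2 for x, t in data) / (2.0 * sigma ** 2)

def metropolis_hastings(data, start, n_steps=5000, step=0.02, seed=1):
    """Random-walk Metropolis-Hastings over the parameters (a, b)."""
    rng = random.Random(seed)
    current = list(start)
    ll = log_likelihood(current, data)
    samples = []
    for _ in range(n_steps):
        proposal = [p + rng.gauss(0.0, step) for p in current]
        ll_prop = log_likelihood(proposal, data)
        # Metropolis rule: accept uphill moves always, downhill with prob exp(Δll)
        if math.log(rng.random()) < ll_prop - ll:
            current, ll = proposal, ll_prop
        samples.append(tuple(current))
    return samples

# 'Surrogate' experimental data generated with known a = 0.6, b = 0.3,
# mirroring the paper's use of numerically generated temperatures
data = [(0.1 * i, 0.6 * (0.1 * i) ** 0.3) for i in range(1, 11)]
samples = metropolis_hastings(data, start=[0.5, 0.5])
burn = samples[len(samples) // 2:]     # discard the first half as burn-in
a_mean = sum(s[0] for s in burn) / len(burn)
b_mean = sum(s[1] for s in burn) / len(burn)
```

The post-burn-in sample mean, mode, and spread play the roles of the mean, maximum-a-posteriori, and standard deviation reported in the paper.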
Developing a cosmic ray muon sampling capability for muon tomography and monitoring applications
NASA Astrophysics Data System (ADS)
Chatzidakis, S.; Chrysikopoulou, S.; Tsoukalas, L. H.
2015-12-01
In this study, a cosmic ray muon sampling capability using a phenomenological model that captures the main characteristics of the experimentally measured spectrum coupled with a set of statistical algorithms is developed. The "muon generator" produces muons with zenith angles in the range 0-90° and energies in the range 1-100 GeV and is suitable for Monte Carlo simulations with emphasis on muon tomographic and monitoring applications. The muon energy distribution is described by the Smith and Duller (1959) [35] phenomenological model. Statistical algorithms are then employed for generating random samples. The inverse transform provides a means to generate samples from the muon angular distribution, whereas the Acceptance-Rejection and Metropolis-Hastings algorithms are employed to provide the energy component. The predictions for muon energies 1-60 GeV and zenith angles 0-90° are validated with a series of actual spectrum measurements and with estimates from the software library CRY. The results confirm the validity of the phenomenological model and the applicability of the statistical algorithms to generate polyenergetic-polydirectional muons. The response of the algorithms and the impact of critical parameters on computation time and computed results were investigated. Final output from the proposed "muon generator" is a look-up table that contains the sampled muon angles and energies and can be easily integrated into Monte Carlo particle simulation codes such as Geant4 and MCNP.
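The acceptance-rejection step used for the energy component can be illustrated with a simple stand-in density. The power-law shape, exponent, and uniform envelope below are illustrative assumptions, not the Smith and Duller spectrum; a uniform envelope is also deliberately naive (its acceptance rate is poor for steep spectra), which is why envelope choice matters in practice:

```python
import random

# Illustrative target density (NOT the Smith-Duller spectrum): an unnormalized
# power law p(E) ∝ E^(-2.7) on the 1-100 GeV window, sampled by acceptance-rejection.
E_MIN, E_MAX, GAMMA = 1.0, 100.0, 2.7

def pdf(e):
    return e ** (-GAMMA)

def sample_energy(rng):
    """Acceptance-rejection with a uniform envelope over [E_MIN, E_MAX]."""
    m = pdf(E_MIN)  # the target is monotone decreasing, so its max is at E_MIN
    while True:
        e = rng.uniform(E_MIN, E_MAX)   # candidate drawn from the envelope
        if rng.random() * m <= pdf(e):  # keep it with probability pdf(e)/m
            return e

rng = random.Random(42)
energies = [sample_energy(rng) for _ in range(500)]
median = sorted(energies)[len(energies) // 2]
```

Pairing each sampled energy with an inverse-transform draw of the zenith angle, as the abstract describes, would populate the look-up table consumed by codes such as Geant4 or MCNP.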
NASA Astrophysics Data System (ADS)
Santos-Filho, J. B.; Plascak, J. A.
2017-09-01
The XY vectorial generalization of the Blume-Emery-Griffiths (XY-VBEG) model, which is suitable for studying 3He-4He mixtures, is treated in a thin-film structure and its thermodynamic properties are analyzed as a function of the film thickness. We employ extensive and up-to-date Monte Carlo simulations consisting of hybrid algorithms combining lattice-gas moves, Metropolis, Wolff, and super-relaxation procedures to overcome the critical slowing down and correlations among different spin configurations of the system. We also make use of single-histogram techniques to obtain the behavior of the thermodynamic quantities close to the corresponding transition temperatures. Thin films of the XY-VBEG model present a quite rich phase diagram with Berezinskii-Kosterlitz-Thouless (BKT) transitions, BKT endpoints, and isolated critical points. As one varies the impurity concentrations along the layers, and in the limit of infinite film thickness, the BKT transition endpoint and the isolated critical point coalesce into a single, unique tricritical point. In addition, when mimicking the behavior of thin films of 3He-4He mixtures, one obtains that the concentration of 3He atoms decreases from the outer layers to the inner layers of the film, meaning that the superfluid particles tend to locate in the bulk of the system.
Small Au clusters on a defective MgO(1 0 0) surface
NASA Astrophysics Data System (ADS)
Barcaro, Giovanni; Fortunelli, Alessandro
2008-05-01
The lowest-energy structures of small Au clusters were searched for with the basin-hopping (BH) approach: if exp(-ΔE/kT) > rndm, where rndm is a random number (Metropolis criterion), the new configuration is accepted, otherwise the old configuration is kept, and the process is iterated. For each size we performed 3-5 BH runs, each one composed of 20-25 Monte Carlo steps, using a value of 0.5 eV as kT in the Metropolis criterion. Previous experience [13-15] shows that this is sufficient to single out the global minimum for adsorbed clusters of this size, and that the BH approach is more efficient as a global optimization algorithm than other techniques such as simulated annealing [18]. The MgO support was described via an (Mg12O12) cluster embedded in an array of ±2.0 a.u. point charges and repulsive pseudopotentials on the positive charges in direct contact with the cluster (see Ref. [15] for more details on the method). The atoms of the oxide cluster and the point charges were located at the lattice positions of the MgO rock-salt bulk structure using the experimental lattice constant of 4.208 Å. 
Several energetic quantities were analyzed: (i) the adhesion energy (E_adh), evaluated by subtracting the energy of the oxide surface and of the metal cluster, both frozen in their interacting configuration, from the total energy of the system, and taking the absolute value; (ii) the binding energy of the metal cluster (E_bnd), evaluated by subtracting the energy of the isolated metal atoms from the total energy of the metal cluster in its interacting configuration, and taking the absolute value; (iii) the metal cluster distortion energy (E_dist), which corresponds to the difference between the energy of the metal cluster in the configuration interacting with the surface and the energy of the cluster in its lowest-energy gas-phase configuration (a positive quantity); (iv) the oxide distortion energy (ΔE_ox), evaluated by subtracting the energy of the relaxed isolated defected oxide from the energy of the isolated defected oxide in the interacting configuration; and (v) the total binding energy (E_tot), which is the sum of the binding energy of the metal cluster and the adhesion energy, minus the oxide distortion energy (E_tot = E_bnd + E_adh - ΔE_ox). Note that the total binding energy of gas-phase clusters in their global minima can be obtained by summing E_bnd + E_dist.
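The Metropolis acceptance rule quoted above can be sketched in a toy basin-hopping loop. The one-dimensional double-well "energy" below stands in for the DFT energy surface, and the local-relaxation step of full BH is omitted; everything here is an illustrative assumption except the exp(-ΔE/kT) > rndm criterion itself:

```python
import math
import random

def metropolis_accept(e_old, e_new, kT, rng=random):
    """Accept a move if exp(-(E_new - E_old)/kT) > rndm (Metropolis criterion)."""
    if e_new <= e_old:
        return True                      # downhill moves are always accepted
    return math.exp(-(e_new - e_old) / kT) > rng.random()

# Toy 1-D double-well potential standing in for the DFT energy surface;
# the global minimum sits near x ≈ -1 (illustrative only).
def energy(x):
    return (x ** 2 - 1) ** 2 + 0.3 * x

rng = random.Random(0)
x, e = 1.0, energy(1.0)                  # start in the higher (right) well
best_x, best_e = x, e
for _ in range(2000):
    x_new = x + rng.gauss(0.0, 0.5)      # random structural perturbation
    e_new = energy(x_new)                # (full BH would locally relax here)
    if metropolis_accept(e, e_new, kT=0.5, rng=rng):
        x, e = x_new, e_new
        if e < best_e:
            best_x, best_e = x, e        # track the lowest minimum visited
```

The kT = 0.5 here mirrors the 0.5 eV used in the abstract only in spirit: large enough to hop barriers between basins, small enough to spend most steps near minima.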
NASA Astrophysics Data System (ADS)
Vatansever, Erol
2017-05-01
By means of the Monte Carlo simulation method with the Metropolis algorithm, we elucidate the thermal and magnetic phase transition behaviors of a ferrimagnetic core/shell nanocubic system driven by a time-dependent magnetic field. The particle core is composed of ferromagnetic spins, and it is surrounded by an antiferromagnetic shell. At the interface of the core/shell particle, we use antiferromagnetic spin-spin coupling. We simulate the nanoparticle using classical Heisenberg spins. After a detailed analysis, our Monte Carlo simulation results suggest that the present system exhibits unusual and interesting magnetic behaviors. For example, in the relatively lower temperature regions, an increase in the amplitude of the external field destroys the antiferromagnetism in the shell part of the nanoparticle, leading to a ground state with ferromagnetic character. Moreover, particular attention has been dedicated to the hysteresis behaviors of the system. For the first time, we show that frequency dispersions can be categorized into three groups for a fixed temperature for finite core/shell systems, as in the case of conventional bulk systems under the influence of an oscillating magnetic field.
High-resolution mapping of anthropogenic heat in China from 1992 to 2010.
Yang, Wangming; Chen, Bing; Cui, Xuefeng
2014-04-14
Anthropogenic heat generated by human activity contributes to urban and regional climate warming. Due to the resolution and accuracy of existing anthropogenic heat data, it is difficult to analyze and simulate the corresponding effects. This study exploited a new method to estimate high spatial and temporal resolutions of anthropogenic heat based on long-term data of energy consumption and the US Air Force Defense Meteorological Satellite Program-Operational Linescan System (DMSP-OLS) data from 1992 to 2010 across China. Our results showed that, throughout the entire study period, there are apparent increasing trends in anthropogenic heat in three major metropolises, i.e., the Beijing-Tianjin region, the Yangzi River delta and the Pearl River delta. The annual mean anthropogenic heat fluxes for Beijing, Shanghai and Guangzhou in 2010 were 17, 19 and 7.8 W m⁻², respectively. Comparisons with previous studies indicate that DMSP-OLS data could provide a better spatial proxy for estimating anthropogenic heat than population density, and our analysis shows better performance at large scales for estimation of anthropogenic heat.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crowder, Jeff; Jet Propulsion Laboratory, California Institute of Technology, Pasadena, California 91109; Cornish, Neil J.
Low frequency gravitational wave detectors, such as the Laser Interferometer Space Antenna (LISA), will have to contend with large foregrounds produced by millions of compact galactic binaries in our galaxy. While these galactic signals are interesting in their own right, the unresolved component can obscure other sources. The science yield for the LISA mission can be improved if the brighter and more isolated foreground sources can be identified and regressed from the data. Since the signals overlap with one another, we are faced with a 'cocktail party' problem of picking out individual conversations in a crowded room. Here we present and implement an end-to-end solution to the galactic foreground problem that is able to resolve tens of thousands of sources from across the LISA band. Our algorithm employs a variant of the Markov chain Monte Carlo (MCMC) method, which we call the blocked annealed Metropolis-Hastings (BAM) algorithm. Following a description of the algorithm and its implementation, we give several examples ranging from searches for a single source to searches for hundreds of overlapping sources. Our examples include data sets from the first round of mock LISA data challenges.
Tricriticality in the q-neighbor Ising model on a partially duplex clique.
Chmiel, Anna; Sienkiewicz, Julian; Sznajd-Weron, Katarzyna
2017-12-01
We analyze a modified kinetic Ising model, a so-called q-neighbor Ising model, with Metropolis dynamics [Phys. Rev. E 92, 052105 (2015)] on a duplex clique and a partially duplex clique. In the q-neighbor Ising model each spin interacts only with q spins randomly chosen from its whole neighborhood. In the case of a duplex clique the change of a spin is allowed only if both levels simultaneously induce this change. Due to the mean-field-like nature of the model we are able to derive the analytic form of transition probabilities and solve the corresponding master equation. The existence of the second level dramatically changes the character of the phase transition. In the case of the monoplex clique, the q-neighbor Ising model exhibits a continuous phase transition for q=3, a discontinuous phase transition for q≥4, and for q=1 and q=2 no phase transition is observed. On the other hand, in the case of the duplex clique continuous phase transitions are observed for all values of q, even for q=1 and q=2. Subsequently we introduce a partially duplex clique, parametrized by r∈[0,1], which allows us to tune the network from monoplex (r=0) to duplex (r=1). Such a generalized topology, in which a fraction r of all nodes appear on both levels, allows us to obtain the critical value r=r^{*}(q) at which tricriticality (a switch from continuous to discontinuous phase transition) appears.
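A minimal sketch of the q-neighbor update on a single (monoplex) complete graph may help fix ideas. The duplex rule of the paper, where both levels must induce the change, is not implemented here, and the system size, q, and temperature are illustrative assumptions:

```python
import math
import random

def qn_ising_step(spins, q, beta, rng):
    """One Metropolis update of the q-neighbor Ising model on a complete graph:
    the chosen spin interacts with q neighbours drawn at random from all others."""
    n = len(spins)
    i = rng.randrange(n)
    neighbours = rng.sample([k for k in range(n) if k != i], q)
    field = sum(spins[k] for k in neighbours)
    dE = 2.0 * spins[i] * field          # energy cost of flipping spin i
    if dE <= 0 or rng.random() < math.exp(-beta * dE):
        spins[i] *= -1

rng = random.Random(7)
n, q, beta = 200, 4, 1.0                 # low temperature: ordered phase for q=4
spins = [1] * n
for _ in range(20000):                   # 100 sweeps of the system
    qn_ising_step(spins, q, beta, rng)
m = abs(sum(spins)) / n                  # magnetization stays close to 1
```

At this temperature a flip against four aligned sampled neighbours costs dE = 8, so the ordered state is essentially frozen; raising the temperature toward the transition is where the q-dependent (dis)continuity studied in the paper appears.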
Physical–chemical determinants of coil conformations in globular proteins
Perskie, Lauren L; Rose, George D
2010-01-01
We present a method with the potential to generate a library of coil segments from first principles. Proteins are built from α-helices and/or β-strands interconnected by these coil segments. Here, we investigate the conformational determinants of short coil segments, with particular emphasis on chain turns. Toward this goal, we extracted a comprehensive set of two-, three-, and four-residue turns from X-ray–elucidated proteins and classified them by conformation. A remarkably small number of unique conformers account for most of this experimentally determined set, whereas the remaining members span a large number of rare conformers, many occurring only once in the entire protein database. Factors determining conformation were identified via Metropolis Monte Carlo simulations devised to test the effectiveness of various energy terms. Simulated structures were validated by comparison to experimental counterparts. After filtering rare conformers, we found that 98% of the remaining experimentally determined turn population could be reproduced by applying a hydrogen bond energy term to an exhaustively generated ensemble of clash-free conformers in which no backbone polar group lacks a hydrogen-bond partner. Further, at least 90% of longer coil segments, ranging from 5 to 20 residues, were found to be structural composites of these shorter primitives. These results are pertinent to protein structure prediction, where approaches can be divided into either empirical or ab initio methods. Empirical methods use database-derived information; ab initio methods rely on physical–chemical principles exclusively. Replacing the database-derived coil library with one generated from first principles would transform any empirically based method into its corresponding ab initio homologue. PMID:20512968
Critical success factors for physical activity promotion through community partnerships.
Lucidarme, Steffie; Marlier, Mathieu; Cardon, Greet; De Bourdeaudhuij, Ilse; Willem, Annick
2014-02-01
To define key factors of effective evidence-based policy implementation for physical activity promotion by use of a partnership approach. Using Parent and Harvey's model for sport and physical activity community-based partnerships, we defined determinants of implementation based on 13 face-to-face interviews with network organisations and 39 telephone interviews with partner organisations. Furthermore, two quantitative data-sets (n = 991 and n = 965) were used to measure implementation. In total, nine variables were found to influence implementation. Personal contact was the most powerful variable since its presence contributed to success while its absence led to a negative outcome. Four contributed directly to success: political motive, absence of a metropolis, high commitment and more qualified staff. Four others resulted in a less successful implementation: absence of positive merger effects, exposure motive and governance, and dispersed leadership. Community networks are a promising instrument for the implementation of evidence-based policies. However, determinants of both formation and management of partnerships influence the implementation success. During partnership formation, special attention should be given to partnership motives while social skills are of utmost importance for the management.
Three-dimensional Probabilistic Earthquake Location Applied to 2002-2003 Mt. Etna Eruption
NASA Astrophysics Data System (ADS)
Mostaccio, A.; Tuve', T.; Zuccarello, L.; Patane', D.; Saccorotti, G.; D'Agostino, M.
2005-12-01
Recorded seismicity of the Mt. Etna volcano during the 2002-2003 eruption has been relocated using a probabilistic, non-linear earthquake location approach. We used the software package NonLinLoc (Lomax et al., 2000), adopting the 3D velocity model obtained by Cocina et al., 2005. We processed our data with different algorithms: (1) a grid search; (2) a Metropolis-Gibbs sampler; and (3) an Oct-tree search. The Oct-tree algorithm gives efficient, fast and accurate mapping of the PDF (Probability Density Function) of the earthquake location problem. More than 300 seismic events were analyzed in order to compare the non-linear location results with the ones obtained using traditional linearized earthquake location algorithms such as Hypoellipse, and a 3D linearized inversion (Thurber, 1983). Moreover, we compare 38 focal mechanisms, chosen following strict selection criteria, with the ones obtained from the 3D and 1D results. Although the presented approach is more of a traditional relocation application, probabilistic earthquake location could also be used in routine surveys.
Population aging and its impacts: strategies of the health-care system in Taipei.
Lin, Ming-Hsien; Chou, Ming-Yueh; Liang, Chih-Kuang; Peng, Li-Ning; Chen, Liang-Kung
2010-11-01
Taiwan is one of the fastest aging countries in the world. As such, the government has developed various strategies to promote an age-friendly health-care system. Health services are supported by National Health Insurance (NHI), which insures over 97% of citizens and over 99% of health-care institutes. The current health-care system has difficulties in caring for older patients with multiple comorbidities, complex care needs, functional impairments, and post-acute care needs. Taipei, an international metropolis with a well-preserved tradition of filial piety in Chinese societies, has developed various strategies to overcome the aforementioned barriers to an age-friendly health-care system. These include an emphasis on general medical care and a holistic approach in all specialties, development of a geriatrics specialty training program, development of post-acute services, and strengthening of linkages between health and social care services. Despite achievements thus far, challenges still include creating a more extensive integration between medical specialties, promotion of an interdisciplinary care model across specialties and health-care settings, and integration of health and social care services. The experiences of Taipei in developing an age-friendly health-care service system may be a culturally appropriate model for other Chinese and Asian communities. Copyright © 2010 Elsevier B.V. All rights reserved.
Spectral analysis of finite-time correlation matrices near equilibrium phase transitions
NASA Astrophysics Data System (ADS)
Vinayak; Prosen, T.; Buča, B.; Seligman, T. H.
2014-10-01
We study spectral densities for systems on lattices which, at a phase transition, display power-law spatial correlations. Constructing the spatial correlation matrix, we prove that its eigenvalue density shows a power law that can be derived from the spatial correlations. In practice, time series are short in the sense that they are either not stationary or not available over long time intervals, and we usually do not have time series available for all variables. We perform numerical simulations on a two-dimensional Ising model with the usual Metropolis algorithm as time evolution. Using all spins on a grid with periodic boundary conditions, we find a power law that is, for large grids, compatible with the analytic result. We still find a power law even if we choose a fairly small subset of grid points at random, though the exponents of the power laws are smaller under such circumstances. For very short time series leading to singular correlation matrices, we use a recently developed technique to lift the degeneracy at zero in the spectrum, and find a significant signature of critical behavior even in this case, as compared to high-temperature results, which tend to those of random matrix models.
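The "usual Metropolis algorithm" used as time evolution here is the standard single-spin-flip update of the two-dimensional Ising model. A minimal sketch (grid size, temperature, and sweep count are illustrative assumptions) is:

```python
import math
import random

def metropolis_sweep(spins, L, beta, rng):
    """One Metropolis sweep of the 2-D Ising model with periodic boundaries."""
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        # Sum of the four nearest neighbours (periodic boundary conditions)
        nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2.0 * spins[i][j] * nn      # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] *= -1

rng = random.Random(3)
L = 16
spins = [[1] * L for _ in range(L)]      # start fully ordered
beta = 0.6                               # below T_c (beta_c ≈ 0.4407): ordered phase
for _ in range(200):
    metropolis_sweep(spins, L, beta, rng)
m = abs(sum(sum(row) for row in spins)) / (L * L)
```

Recording each spin's time series across many such sweeps, and correlating them across lattice sites, yields the finite-time correlation matrices whose spectra the abstract analyzes.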
Conformation and Dynamics of a Flexible Sheet in Solvent Media by Monte Carlo Simulations
NASA Astrophysics Data System (ADS)
Pandey, Ras; Anderson, Kelly; Heinz, Hendrik; Farmer, Barry
2005-03-01
Flexibility of the clay sheet is limited even in the exfoliated state in some solvent media. A coarse-grained model is used to investigate the dynamics and conformation of a flexible sheet, modeling such a clay platelet in an effective solvent medium on a cubic lattice of size L^3 with lattice constant a. The undeformed sheet is described by a square lattice of size Ls^2, where each node of the sheet is represented by a unit cube of the cubic lattice and 2a is the minimum distance between nearest-neighbor nodes, to incorporate the excluded volume constraints. Additionally, each node interacts with neighboring nodes and solvent (empty) sites within a range ri. Each node executes its stochastic motion via the Metropolis algorithm, subject to bond length fluctuation and excluded volume constraints. Mean square displacements of the center node and of the center of mass are investigated as a function of time step for a set of these parameters. The radius of gyration (Rg) is also examined concurrently to understand its relaxation. Multi-scale segmental dynamics of the sheet is studied by identifying the power-law dependence in various time regimes. Relaxation of Rg and its dependence on temperature will also be discussed.
Modeling of protein-anion exchange resin interaction for the human growth hormone charge variants.
Lapelosa, Mauro; Patapoff, Thomas W; Zarraga, Isidro E
2015-12-01
Modeling ion exchange chromatography (IEC) behavior has generated significant interest because of the wide use of IEC as an analytical technique as well as a preparative protein purification process; indeed there is a need for better understanding of what drives the unique behavior of protein charge variants. We hypothesize that a complex protein molecule, which contains both hydrophobic and charged moieties, would interact strongly with an in silico designed resin through charged electrostatic patches on the surface of the protein. In the present work, variants of recombinant human growth hormone that mimic naturally-occurring deamidation products were produced and characterized in silico. The study included these four variants: rhGH, N149D, N152D, and N149D/N152D. Poisson-Boltzmann calculations were used to determine surface electrostatic potential. Metropolis Monte Carlo simulations were carried out with the resulting variants to simulate IEC systems, examining the free energy of the interaction of the protein with an in silico anion exchange column represented by polylysine polypeptide. The results show that the charge variants have different average binding energies and the free energy of interaction can be used to predict the retention time for the different variants. Copyright © 2015 Elsevier B.V. All rights reserved.
EXOFIT: orbital parameters of extrasolar planets from radial velocities
NASA Astrophysics Data System (ADS)
Balan, Sreekumar T.; Lahav, Ofer
2009-04-01
Retrieval of orbital parameters of extrasolar planets poses considerable statistical challenges. Due to sparse sampling, measurement errors, parameter degeneracies and modelling limitations, there are no unique values of basic parameters, such as period and eccentricity. Here, we estimate the orbital parameters from radial velocity data in a Bayesian framework by utilizing Markov Chain Monte Carlo (MCMC) simulations with the Metropolis-Hastings algorithm. We follow a methodology recently proposed by Gregory and Ford. Our implementation of MCMC is based on the object-oriented approach outlined by Graves. We make our resulting code, EXOFIT, publicly available with this paper. It can search for either one or two planets, as illustrated on mock data. As an example, we re-analysed the orbital solution of companions to HD 187085 and HD 159868 from the published radial velocity data. We confirm the degeneracy reported for the orbital parameters of the companion to HD 187085, and show that a low-eccentricity orbit is more probable for this planet. For HD 159868, we obtained a slightly different orbital solution and a relatively high `noise' factor, indicating the presence of an unaccounted-for signal in the radial velocity data. EXOFIT is designed in such a way that it can be extended to a variety of probability models, including different Bayesian priors.
NASA Astrophysics Data System (ADS)
Karam, H. A.; Pereira Filho, A. J.
This work proposes a numerical representation of the urban surface of tropical and subtropical cities for use in numerical models of the atmosphere. A typical tropical metropolis is São Paulo City, SP, Brazil, which presents neighborhood areas characterized by incomplete urbanization and where public services fall short of the needs of the population. The occupation of the suburban area of São Paulo differs from that typical of European cities because: (1) it occurs in risk areas, i.e., over inclined terrain or potentially flooded areas on the borders of rivers; (2) the buildings are made of cheap raw materials mixed with traditional materials; (3) the distribution of short- and long-wave radiation is conditioned by the inclination of the terrain, the geometry of the buildings, materials and population density; (4) many common living areas are excluded; (5) intense or free thermal convection is found over the urban surface during daytime, with impacts on atmospheric boundary layer dynamics; and (6) high levels of airborne pollutants are found. The proposed numerical scheme is designed to complement the current tools used to forecast the impact of convective precipitation in the risk areas of São Paulo City.
Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach
NASA Technical Reports Server (NTRS)
Warner, James E.; Hochhalter, Jacob D.
2016-01-01
This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
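The delayed-rejection stage of DRAM can be illustrated in isolation (the adaptive covariance part is omitted). The toy target `log_pi`, the proposal scales, and the chain settings below are assumptions for the sketch, not the damage-characterization model itself:

```python
import math
import random

def log_pi(x):
    # Toy target: standard normal, standing in for the damage posterior.
    return -0.5 * x * x

def log_q(center, point, scale):
    # Log-density of the Gaussian proposal (constants dropped except the
    # scale term, which is all the acceptance ratio below needs).
    return -0.5 * ((point - center) / scale) ** 2 - math.log(scale)

def alpha1(x, y):
    # First-stage Metropolis acceptance probability.
    return min(1.0, math.exp(min(0.0, log_pi(y) - log_pi(x))))

def dram_step(x, s1=2.0, s2=0.5):
    y1 = x + random.gauss(0.0, s1)  # bold first-stage proposal
    if random.random() < alpha1(x, y1):
        return y1
    # Delayed rejection: a second, more cautious proposal, accepted with
    # the two-stage probability that preserves the target distribution.
    y2 = x + random.gauss(0.0, s2)
    num = (log_pi(y2) + log_q(y2, y1, s1)
           + math.log(max(1e-300, 1.0 - alpha1(y2, y1))))
    den = (log_pi(x) + log_q(x, y1, s1)
           + math.log(max(1e-300, 1.0 - alpha1(x, y1))))
    if random.random() < math.exp(min(0.0, num - den)):
        return y2
    return x

random.seed(2)
x, chain = 0.0, []
for _ in range(30000):
    x = dram_step(x)
    chain.append(x)
mean = sum(chain) / len(chain)
var = sum((v - mean) ** 2 for v in chain) / len(chain)
```

The second-stage acceptance ratio carries the first-stage rejection probabilities so that detailed balance still holds; full DRAM additionally adapts the proposal covariance from the chain history.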
NASA Astrophysics Data System (ADS)
Rosenberg, D. E.; Alafifi, A.
2016-12-01
Water resources systems analysis often focuses on finding optimal solutions. Yet an optimal solution is optimal only for the modelled issues, and managers often seek near-optimal alternatives that address un-modelled objectives, preferences, limits, uncertainties, and other issues. Early on, Modelling to Generate Alternatives (MGA) formalized the near-optimal region as the original problem constraints plus a new constraint allowing performance within a specified tolerance of the optimal objective function value, and identified a few maximally different alternatives from that region. Subsequent work applied Markov Chain Monte Carlo (MCMC) sampling to generate a larger number of alternatives that span the near-optimal region of linear problems, or selected portions of it for non-linear problems. We extend the MCMC Hit-And-Run method to generate alternatives that span the full extent of the near-optimal region for non-linear, non-convex problems. Start at a feasible hit point within the near-optimal region, run a random distance in a random direction to a new hit point, and repeat until the desired number of alternatives is generated. If linear equality constraints exist, we construct an orthogonal basis and use a null-space transformation to confine hits and runs to a lower-dimensional space. Linear inequality constraints define the convex bounds on the line that runs through the current hit point in the specified direction. We then use slice sampling to identify a new hit point along the line within bounds defined by the non-linear inequality constraints.
This technique is computationally efficient compared to prior near-optimal alternative generation techniques such as MGA, MCMC Metropolis-Hastings, evolutionary, or firefly algorithms because the search at each iteration is confined to the hit line, the algorithm can move in one step to any point in the near-optimal region, and each iteration generates a new, feasible alternative. We use the method to generate alternatives that span the near-optimal regions of simple and more complicated water management problems and that may be preferred to the optimal solutions. We also discuss extensions to handle non-linear equality constraints.
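The hit-and-run steps described above can be sketched for a toy convex near-optimal region. The quadratic objective, tolerance, and bisection-based chord search below are illustrative assumptions (a non-convex region would require the slice-sampling step the abstract describes rather than simple bisection):

```python
import math
import random

def feasible(x, tol=1.0):
    # Toy near-optimal region of min f(x) = x1^2 + x2^2 (optimum f* = 0):
    # all points whose objective value is within `tol` of the optimum.
    return x[0] ** 2 + x[1] ** 2 <= tol

def chord_end(x, d, sign, hi=8.0, iters=50):
    # Bisect along +/-d for the boundary of the feasible chord through x
    # (valid here because the toy region is convex).
    lo = 0.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        p = (x[0] + sign * mid * d[0], x[1] + sign * mid * d[1])
        if feasible(p):
            lo = mid
        else:
            hi = mid
    return lo

def hit_and_run(n=5000, seed=3):
    random.seed(seed)
    x = (0.0, 0.0)  # feasible starting hit point
    samples = []
    for _ in range(n):
        ang = random.uniform(0.0, 2.0 * math.pi)
        d = (math.cos(ang), math.sin(ang))      # random direction
        # Run a random distance along the chord to a new hit point.
        t = random.uniform(-chord_end(x, d, -1.0), chord_end(x, d, +1.0))
        x = (x[0] + t * d[0], x[1] + t * d[1])
        samples.append(x)
    return samples

samples = hit_and_run()
```

Sampling uniformly along each chord makes the stationary distribution uniform over the region, so the alternatives span its full extent rather than clustering near the start point.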
Markov Chain Monte Carlo Used in Parameter Inference of Magnetic Resonance Spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hock, Kiel; Earle, Keith
2016-02-06
In this paper, we use Boltzmann statistics and the maximum likelihood distribution derived from Bayes' Theorem to infer parameter values for a Pake Doublet Spectrum, a lineshape of historical significance and contemporary relevance for determining distances between interacting magnetic dipoles. A Metropolis-Hastings Markov Chain Monte Carlo algorithm is implemented and designed to find the optimum parameter set and to estimate parameter uncertainties. Finally, the posterior distribution allows us to define a metric on parameter space that induces a geometry with negative curvature, which affects the parameter uncertainty estimates, particularly for spectra with a low signal-to-noise ratio.
Cell-veto Monte Carlo algorithm for long-range systems.
Kapfer, Sebastian C; Krauth, Werner
2016-09-01
We present a rigorous, efficient event-chain Monte Carlo algorithm for long-range interacting particle systems. Using a cell-veto scheme within the factorized Metropolis algorithm, we compute each single-particle move with a fixed number of operations. For slowly decaying potentials such as Coulomb interactions, screening line charges allow us to take periodic boundary conditions into account. We discuss the performance of the cell-veto Monte Carlo algorithm for general inverse-power-law potentials, and illustrate how it provides a new outlook on one of the prominent bottlenecks in large-scale atomistic Monte Carlo simulations.
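The factorized Metropolis filter that the cell-veto scheme builds on can be sketched as a consensus rule over interaction factors. The energy changes and inverse temperature below are illustrative assumptions:

```python
import math
import random

def factor_accept(delta_es, beta=1.0):
    # Factorized Metropolis filter: each interaction factor f accepts the
    # move independently with probability min(1, exp(-beta * dE_f)); the
    # move goes through only by consensus, so any single factor can veto
    # it.  The overall acceptance probability is the product of factors.
    for de in delta_es:
        if random.random() >= min(1.0, math.exp(-beta * de)):
            return False  # vetoed by this factor
    return True

# Monte Carlo check of the consensus probability for three factors.
random.seed(4)
trials = 200000
accepted = sum(factor_accept([0.5, -0.3, 1.0]) for _ in range(trials))
p_hat = accepted / trials  # expected: exp(-0.5) * 1 * exp(-1.0)
```

Because each factor decides on its own, a scheme like cell-veto can bound a distant cell's possible veto rate in advance and avoid touching most factors at all, which is what makes a fixed cost per move possible.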
NASA Astrophysics Data System (ADS)
Peyronel, Fernanda; Quinn, Bonnie; Marangoni, Alejandro G.; Pink, David A.
2015-01-01
We have characterized the surfaces of grain boundaries in edible oils with high solid fat content by combining ultra-small angle x-ray scattering (USAXS) with theoretical modelling and computer simulation. Our results will lead to an understanding of the solid structures formed during the manufacture of fats such as confectionery fats, and pave the way for the engineering of innovative fat products. Edible fats are complex semi-solid materials in which a solid structure entraps liquid oil. It was not until USAXS was combined with modelling that the nano- to meso-structures of systems with less than 20% solids were understood. The interpretation of those results utilized models of crystalline nanoplatelets, represented by rigid close-packed flat aggregates made of spheres, which were allowed to aggregate using the Metropolis Monte Carlo technique. Here, we report on systems containing between 50% and 90% solids. We modelled the solid phase as being formed from seeds onto which solids condensed, thereby giving rise to oil-filled nanospaces. The models predicted that the system (a) exhibits structures with fractal dimension of approximately 2, (b) shows a broad peak somewhat masking that slope, and (c) for smaller values of q, indicates that the structures with fractal dimension approximately 2 are uniformly distributed in space. The interpretation of the experimental data was completely driven by these results. The computer simulation predictions were used in conjunction with the USAXS observations to conclude that the systems studied scattered from oil cavities with sizes between ˜800 and ˜16 000 Å that possessed rough 2-dimensional walls.
NASA Astrophysics Data System (ADS)
Gosselin, Jeremy M.; Dosso, Stan E.; Cassidy, John F.; Quijano, Jorge E.; Molnar, Sheri; Dettmer, Jan
2017-10-01
This paper develops and applies a Bernstein-polynomial parametrization to efficiently represent general, gradient-based profiles in nonlinear geophysical inversion, with application to ambient-noise Rayleigh-wave dispersion data. Bernstein polynomials provide a stable parametrization in that small perturbations to the model parameters (basis-function coefficients) result in only small perturbations to the geophysical parameter profile. A fully nonlinear Bayesian inversion methodology is applied to estimate shear wave velocity (VS) profiles and uncertainties from surface wave dispersion data extracted from ambient seismic noise. The Bayesian information criterion is used to determine the appropriate polynomial order consistent with the resolving power of the data. Data error correlations are accounted for in the inversion using a parametric autoregressive model. The inversion solution is defined in terms of marginal posterior probability profiles for VS as a function of depth, estimated using Metropolis-Hastings sampling with parallel tempering. This methodology is applied to synthetic dispersion data as well as data processed from passive array recordings collected on the Fraser River Delta in British Columbia, Canada. Results from this work are in good agreement with previous studies, as well as with co-located invasive measurements. The approach considered here is better suited than `layered' modelling approaches in applications where smooth gradients in geophysical parameters are expected, such as soil/sediment profiles. Further, the Bernstein polynomial representation is more general than smooth models based on a fixed choice of gradient type (e.g. power-law gradient) because the form of the gradient is determined objectively by the data, rather than by a subjective parametrization choice.
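The Bernstein-polynomial parametrization described above can be sketched in a few lines. The coefficients in the usage example are illustrative, not values from the inversion:

```python
from math import comb

def bernstein_profile(coeffs, t):
    # Profile parametrized by an order-n Bernstein basis:
    #   f(t) = sum_k c_k * C(n,k) * t^k * (1-t)^(n-k),  t in [0, 1].
    # f(t) is a convex combination of the coefficients, so a small
    # perturbation of any c_k produces only a small change in f, which
    # is the stability property the abstract highlights.
    n = len(coeffs) - 1
    return sum(c * comb(n, k) * t ** k * (1.0 - t) ** (n - k)
               for k, c in enumerate(coeffs))
```

In an inversion, `t` would be a normalized depth and the coefficients `c_k` the sampled model parameters; the profile interpolates `c_0` at `t = 0` and `c_n` at `t = 1`, and the polynomial order would be chosen by a criterion such as the Bayesian information criterion mentioned above.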
The impact of climate change on ozone-related mortality in Sydney.
Physick, William; Cope, Martin; Lee, Sunhee
2014-01-13
Coupled global, regional and chemical transport models are now being used with relative-risk functions to determine the impact of climate change on human health. Studies have been carried out at global and regional scales, and in our paper we examine the impact of climate change on ozone-related mortality at the local scale across an urban metropolis (Sydney, Australia). Using three coupled models, with a grid spacing of 3 km for the chemical transport model (CTM), and a mortality relative-risk function of 1.0006 per 1 ppb increase in daily maximum 1-hour ozone concentration, we evaluated the change in ozone concentrations and mortality between the decades 1996-2005 and 2051-2060. The global model was run with the A2 emissions scenario. As there is currently uncertainty regarding a threshold concentration below which ozone does not impact on mortality, we calculated mortality estimates for three daily maximum 1-hour ozone concentration thresholds: 0, 25 and 40 ppb. The mortality increase for 2051-2060 ranges from 2.3% for a 0 ppb threshold to 27.3% for a 40 ppb threshold, although the increases in absolute numbers of deaths differ little. Our modeling approach is able to identify the variation in ozone-related mortality changes at a suburban scale, estimating that climate change could lead to an additional 55 to 65 deaths across Sydney in the decade 2051-2060. Interestingly, the largest increases do not correspond spatially to the largest ozone increases or the densest population centres. The distribution pattern of the changes does not seem to vary with threshold value, while the magnitude varies only slightly.
Kaur, Harparkash; Allan, Elizabeth Louise; Mamadu, Ibrahim; Hall, Zoe; Ibe, Ogochukwu; El Sherbiny, Mohamed; van Wyk, Albert; Yeung, Shunmay; Swamidoss, Isabel; Green, Michael D; Dwivedi, Prabha; Culzoni, Maria Julia; Clarke, Siân; Schellenberg, David; Fernández, Facundo M; Onwujekwe, Obinna
2015-01-01
Artemisinin-based combination therapies are recommended by the World Health Organisation (WHO) as first-line treatment for Plasmodium falciparum malaria, yet medication must be of good quality for efficacious treatment. A recent meta-analysis reported that 35% (796/2,296) of antimalarial drug samples from 21 Sub-Saharan African countries, purchased from outlets predominantly using convenience sampling, failed chemical content analysis. We used three sampling strategies to purchase artemisinin-containing antimalarials (ACAs) in Enugu metropolis, Nigeria, and compared the resulting quality estimates. ACAs were purchased using three sampling approaches (convenience, mystery client and overt) within a defined area and sampling frame in Enugu metropolis. The active pharmaceutical ingredients (APIs) were assessed using high-performance liquid chromatography and confirmed by mass spectrometry at three independent laboratories. Results were expressed as the percentage of the API stated on the packaging and used to categorise each sample as acceptable quality, substandard, degraded, or falsified. Content analysis of 3,024 samples purchased from 421 outlets using the convenience (n=200), mystery client (n=1,919) and overt (n=905) approaches showed that, overall, 90.8% of ACAs were of acceptable quality, 6.8% substandard, 1.3% degraded and 1.2% falsified. Convenience sampling yielded a significantly higher prevalence of poor-quality ACAs, a finding not replicated by the mystery client and overt strategies, which yielded results comparable with each other. Artesunate (n=135; 4 falsified) and dihydroartemisinin (n=14) monotherapy tablets, not recommended by WHO, were also identified. Randomised sampling identified fewer falsified ACAs than previously reported by convenience approaches. Our findings emphasise the need for specific consideration to be given to the sampling frame and sampling approach if representative information on drug quality is to be obtained.
Saba, Courage Kosi Setsoafia; Atayure, Seidu Isaac; Adzitey, Frederick
2015-03-01
Fish is an important source of protein all over the world, including in Ghana. The fishery sector plays a major role in meeting the domestic need of animal protein and also contributes greatly in foreign exchange earnings. The domestic supply of fish does not meet the demand, so Ghana imports fish and fish products from other countries. Media reports in Ghana have alleged the use of formaldehyde to preserve fish for increased shelf life and to maintain freshness. This research, therefore, sought to establish the levels of formaldehyde in imported and local fresh fish in the Tamale Metropolis by using a ChemSee formaldehyde and formalin detection test kit. Positive and negative controls were performed by using various concentrations of formalin (1, 10, 30, 50, 100, and 300 ppm) and sterile distilled water, respectively. Three times over a 6-month period, different fish species were obtained from five wholesale cold stores (where fish are sold in cartons) and some local sales points (where locally caught fish are sold). A total of 32 samples were taken during three different sampling sessions: 23 imported fish (mackerel, herring, horse mackerel, salmon, and redfish) and 9 local tilapia. The fish were cut, and 50 g was weighed and blended with an equal volume (50 ml) of sterile distilled water. Samples were transferred to test tubes and centrifuged. A test strip was dipped into the supernatant and observed for a color change. A change in color from white to pink or purple indicated the presence of formaldehyde in fish. The study showed that no formaldehyde was present in the imported and local fish obtained. The appropriate regulatory agencies should carry out this study regularly to ensure that fish consumed in Ghana is safe for consumption.
NASA Astrophysics Data System (ADS)
Ray, Raghab; Jana, Tapan Kumar
2017-12-01
Mangroves are known as natural carbon sinks, taking CO2 out of the atmosphere and storing it in their biomass for many years. This study aimed to investigate the capacity of the world's largest mangrove forest, the Sundarbans (Indian part), to sequester anthropogenic CO2 emitted from the proximate coal-based thermal power plant in Kolaghat (∼100 km from the mangrove site). The study also included Kolkata, one of the largest metropolises of India (∼150 km from the mangrove site), for comparing micrometeorological parameters, biosphere-atmosphere CO2 exchange fluxes and atmospheric pollutants between three distinct environments: mangrove, power plant and metropolis. Hourly sampling of atmospheric CO2 at all three sites (late December 2011 and early January 2012) revealed that CO2 concentrations and emission fluxes were maximum around the power plant (360-621 ppmv and 5.6-56.7 mg m-2s-1, respectively), followed by the metropolis (383-459 ppmv and 3.8-20.4 mg m-2s-1, respectively) and the mangroves (277-408 ppmv and -8.9 to 11.4 mg m-2s-1, respectively). Monthly coal consumption rates (41-57, in 104 ton month-1) were converted to CO2, suggesting that 2.83 Tg C was added to the atmosphere in 2011 for the generation of 7469732 MW of energy from the power plant. The Indian Sundarbans (4264 km2) sequestered a total of 2.79 Tg C, which was 0.64% of the annual fossil fuel emission from India in the same time period. Based on these data from 2010 to 2011, it is calculated that about 4328 km2 of mangrove forest coverage would be needed to sequester all the CO2 emitted from the Kolaghat power plant.
Arlinghaus, Robert; Mehner, Thomas
2004-03-01
Increased efforts to analyze the human dimensions of anglers are necessary to improve freshwater fisheries management. This paper is a comparative analysis of urban and rural anglers living in a metropolis, based on n = 1061 anglers responding to a mail survey in the German capital of Berlin. More than two-thirds of the anglers (71%) had spent most (>50%) of their effort outside the city borders of Berlin and thus were categorized as rural anglers. Compared to the rural anglers, urban anglers (≥50% of total effort spent inside the city) were younger and less educated. Urban anglers were more avid and committed, less mobile, and more frequently fished from boats and during weekdays. Rural anglers were more experienced, fished for longer times per trip, fished more often at weekends and on holidays, were more often members of angling clubs, and more frequently caught higher-valued fish species. The achievement and fish-quantity aspects of the angling experience were more important for urban than for rural anglers. Concerning management options, urban anglers more frequently suggested constraining other stakeholders and reducing regulations, whereas rural anglers more often proposed improving physical access to angling sites. Future urban fishing programs should offer ease of access, connection to public transportation, moderate prices, and diverse piscivorous fish stocks. In contrast to rural fisheries, the provision of high ecological and aesthetic quality of the angling waters can be regarded as of minor importance in urban fisheries. Rural fisheries managers need to consider the needs of stakeholders living in Berlin to minimize impacts on the less degraded rural water bodies and potential user conflicts with resident anglers. Ecosystem-based management approaches should guide rural fisheries policy.
NASA Astrophysics Data System (ADS)
Adak, Anandamay; Chatterjee, Abhijit; Ghosh, Sanjay; Raha, Sibaji; Roy, Arindam
2016-07-01
A study was conducted on the chemical characterization of fine mode aerosol (PM2.5) over a rural atmosphere near the coast of the Bay of Bengal in eastern India. Samples were collected and analyzed from March 2013 to February 2014. The concentration of PM2.5 was found to span a wide range, from as low as 3 µg m-3 to as high as 180 µg m-3, with an average of 62 µg m-3. Maximum accumulation of fine mode aerosol was observed during winter, whereas the minimum was observed during the monsoon. Water-soluble ionic species of fine mode aerosol were characterized over this rural atmosphere. In spite of the site being situated near the coast of the Bay of Bengal, we observed significantly higher concentrations of anthropogenic species such as ammonium and sulphate; the concentrations of these two species were much higher than those of the sea-salt aerosols, and together they contributed around 30% of the total fine mode aerosol. Even dust aerosol species such as calcium showed elevated concentrations. The chloride-to-sodium ratio was found to be much lower than that in standard sea water, indicating strong interaction between sea-salt and anthropogenic aerosols. The use of fertilizers in various crop fields, together with human and animal wastes, significantly increased ammonium in fine mode aerosols. Dust aerosol species accumulated in the atmosphere could be due to the transport of finer dust species from the nearby metropolis or to local generation. Non-sea-salt sulphate and nitrate showed significant contributions to fine mode aerosols, having both local and transported sources. Source apportionment shows prominent contributions of anthropogenic aerosols from local activities as well as transport from the nearby Kolkata metropolis.
Ntodie, Michael; Abu, Sampson L; Kyei, Samuel; Abokyi, Samuel; Abu, Emmanuel K
2017-06-01
To determine the near vision spectacle coverage and barriers to obtaining near vision correction among adults aged 35 years and older in the Cape Coast Metropolis of Ghana. A population-based cross-sectional study design was adopted, and 500 out of 576 participants aged 35 years and older were examined from 12 randomly selected clusters in Cape Coast, Ghana. All participants underwent a comprehensive eye examination which included distance and near visual acuity measurements and external and internal ocular health assessments. Distance and near refractions were performed using the subjective refraction technique. Information on participants' demographics, near vision correction status, near visual needs and barriers to acquiring near vision correction was obtained through a questionnaire administered as part of the study. The mean age of participants was 52.3±10.3 years, of whom 280 (56%) were females and 220 (44%) were males. The near vision spectacle coverage was 25%, the "met need" for near vision correction in the presbyopic population was 33%, and the unmet need in the entire study population was 64%. After controlling for other variables, age (5th and 6th decades) and educational level were associated with "met need" for near vision correction (OR=2.7 (1.55-4.68), p=0.00 and OR=2.36 (1.18-4.72), p=0.02, respectively). Among those who needed but did not have near vision correction, 64 (26%) did not feel the need for correction, 55 (22%) stated that they were unaware of available interventions, and 53 (21%) found the cost of near vision correction prohibitive. Near vision spectacle coverage was low in this population, which suggests the need for strategies on health education and promotion to address the lack of awareness of spectacle need and the cost of services.
Chua, Huey Eng; Bhowmick, Sourav S; Tucker-Kellogg, Lisa
2017-10-01
Given a signaling network, the target combination prediction problem aims to predict efficacious and safe target combinations for combination therapy. State-of-the-art in silico methods use Monte Carlo simulated annealing (mcsa) to modify a candidate solution stochastically, and use the Metropolis criterion to accept or reject the proposed modifications. However, such stochastic modifications ignore the impact of the choice of targets and their activities on the combination's therapeutic effect and off-target effects, which directly affect the solution quality. In this paper, we present mascot, a method that addresses this limitation by leveraging two additional heuristic criteria, minimizing off-target effects and achieving synergy, to guide candidate modification. Specifically, off-target effects measure the unintended response of a signaling network to the target combination and are often associated with toxicity. Synergy occurs when a pair of targets exerts effects that are greater than the sum of their individual effects, and is generally a beneficial strategy for maximizing effect while minimizing toxicity. mascot leverages a machine learning-based target prioritization method, which prioritizes potential targets in a given disease-associated network to select more effective targets (better therapeutic effect and/or lower off-target effects), and Loewe additivity theory from pharmacology, which assesses the non-additive effects in a combination drug treatment to select synergistic target activities. Our experimental study on two disease-related signaling networks demonstrates the superiority of mascot in comparison to existing approaches. Copyright © 2017 Elsevier Inc. All rights reserved.
GRID-BASED EXPLORATION OF COSMOLOGICAL PARAMETER SPACE WITH SNAKE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikkelsen, K.; Næss, S. K.; Eriksen, H. K., E-mail: kristin.mikkelsen@astro.uio.no
2013-11-10
We present a fully parallelized grid-based parameter estimation algorithm for investigating multidimensional likelihoods, called Snake, and apply it to cosmological parameter estimation. The basic idea is to map out the likelihood grid-cell by grid-cell according to decreasing likelihood, and stop when a certain threshold has been reached. This approach improves vastly on the 'curse of dimensionality' problem plaguing standard grid-based parameter estimation simply by disregarding grid cells with negligible likelihood. The main advantages of this method compared to standard Metropolis-Hastings Markov Chain Monte Carlo methods include (1) trivial extraction of arbitrary conditional distributions; (2) direct access to Bayesian evidences; (3) better sampling of the tails of the distribution; and (4) nearly perfect parallelization scaling. The main disadvantage is, as in the case of brute-force grid-based evaluation, a dependency on the number of parameters, N_par. One of the main goals of the present paper is to determine how large N_par can be while still maintaining reasonable computational efficiency; we find that N_par = 12 is well within the capabilities of the method. The performance of the code is tested by comparing cosmological parameters estimated using Snake and the WMAP-7 data with those obtained using CosmoMC, the current standard code in the field. We find fully consistent results, with similar computational expenses, but shorter wall time due to the perfect parallelization scheme.
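The map-out-by-decreasing-likelihood idea can be sketched with a priority queue that expands a cell's neighbours only if the cell itself clears the threshold. The Gaussian log-likelihood, grid spacing, and threshold below are illustrative assumptions, not the cosmological likelihood itself:

```python
import heapq

def explore_grid(loglike, start, threshold, step=0.1):
    # Map out the likelihood cell-by-cell in order of decreasing
    # likelihood: pop the best unexplored cell, keep it if it clears
    # `threshold` (log-likelihood units), and push its grid neighbours.
    # Cells below the threshold are disregarded and never expanded.
    visited = {start}
    heap = [(-loglike([i * step for i in start]), start)]  # max-heap via negation
    cells = {}
    while heap:
        neg_ll, cell = heapq.heappop(heap)
        ll = -neg_ll
        if ll < threshold:
            continue  # negligible likelihood
        cells[cell] = ll
        for dim in range(len(cell)):
            for delta in (-1, 1):
                nb = list(cell)
                nb[dim] += delta
                nb = tuple(nb)
                if nb not in visited:
                    visited.add(nb)
                    heapq.heappush(heap, (-loglike([i * step for i in nb]), nb))
    return cells

# Toy 2-D Gaussian log-likelihood, mapped down to 2 log-units below the peak.
cells = explore_grid(lambda p: -0.5 * (p[0] ** 2 + p[1] ** 2), (0, 0), -2.0)
```

Because only the cells inside the threshold (plus a one-cell rim) are ever evaluated, the cost scales with the volume of the high-likelihood region rather than the full grid, which is the sense in which the approach sidesteps the curse of dimensionality.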
NASA Astrophysics Data System (ADS)
Shen, W.; Schulte-Pelkum, V.; Ritzwoller, M. H.
2011-12-01
The joint inversion of surface wave dispersion and receiver functions was proven feasible on a station-by-station basis more than a decade ago. Joint application to a large number of stations across a broad region such as the western US is more challenging, however, because of the different resolutions of the two methods. Improvements in resolution in surface wave studies derived from ambient noise and array-based methods applied to earthquake data now allow surface wave dispersion and receiver functions to be inverted simultaneously across much of the Earthscope/USArray Transportable Array (TA), and we have developed a Monte Carlo procedure for this purpose. As a proof of concept we applied this procedure to a region containing 186 TA stations in the intermountain west, including a variety of tectonic settings such as the Colorado Plateau, the Basin and Range, the Rocky Mountains, and the Great Plains. This work has now been expanded to encompass all TA stations in the western US. Our approach includes three main components. (1) We enlarge the Earthscope Automated Receiver Survey (EARS) receiver function database by adding more events within a quality control procedure. A back-azimuth-independent receiver function and its associated uncertainties are constructed using a harmonic stripping algorithm. (2) Rayleigh wave dispersion curves are generated from eikonal tomography applied to ambient noise cross-correlation data and Helmholtz tomography applied to teleseismic surface wave data, yielding dispersion maps from 8 s to 80 s period. (3) We apply a Metropolis Monte Carlo algorithm to invert for the average velocity structure beneath each station. Simple kriging is applied to interpolate the discrete results into a continuous 3-D model. This method has now been applied to over 1,000 TA stations in the western US.
We show that the receiver functions and surface wave dispersion data can be reconciled beneath more than 80% of the stations using a smooth parameterization of both crustal and uppermost mantle structure. After the inversion, a 3-D model for the crust and uppermost mantle to a depth of 150 km is constructed for this region. Compared with using surface wave data alone, uncertainty in crustal thickness is much lower and as a result, the lower crustal velocity is better constrained given a smaller depth-velocity trade-off. The new 3-D model including Moho depth with attendant uncertainties provides the basis for further analysis on radial anisotropy and geodynamics in the western US, and also forms a starting point for other seismological studies such as body wave tomography and receiver function CCP analysis.
Bayesian inversions of a dynamic vegetation model at four European grassland sites
NASA Astrophysics Data System (ADS)
Minet, J.; Laloy, E.; Tychon, B.; Francois, L.
2015-05-01
Eddy covariance data from four European grassland sites are used to probabilistically invert the CARAIB (CARbon Assimilation In the Biosphere) dynamic vegetation model (DVM) with 10 unknown parameters, using the DREAM(ZS) (DiffeRential Evolution Adaptive Metropolis) Markov chain Monte Carlo (MCMC) sampler. We focus on comparing model inversions, considering both homoscedastic and heteroscedastic eddy covariance residual errors, with variances either fixed a priori or jointly inferred together with the model parameters. Agreement between measured and simulated data during calibration is comparable with previous studies, with root mean square errors (RMSEs) of simulated daily gross primary productivity (GPP), ecosystem respiration (RECO) and evapotranspiration (ET) ranging from 1.73 to 2.19 g C m-2 day-1, 1.04 to 1.56 g C m-2 day-1 and 0.50 to 1.28 mm day-1, respectively. For the calibration period, using a homoscedastic eddy covariance residual error model resulted in better agreement between measured and modelled data than using a heteroscedastic residual error model. However, a model validation experiment showed that CARAIB models calibrated considering heteroscedastic residual errors perform better. Posterior parameter distributions derived using a heteroscedastic model of the residuals thus appear to be more robust. This is the case even though the classical linear heteroscedastic error model assumed herein did not fully remove the heteroscedasticity of the GPP residuals. Despite the fact that the calibrated model is generally capable of fitting the data within measurement errors, systematic biases in the model simulations are observed. These are likely due to model inadequacies such as shortcomings in the photosynthesis modelling. Besides the residual error treatment, differences in model parameter posterior distributions among the four grassland sites are also investigated.
It is shown that the marginal distributions of the specific leaf area and characteristic mortality time parameters can be explained by site-specific ecophysiological characteristics.
NASA Astrophysics Data System (ADS)
Oware, E. K.
2017-12-01
Geophysical quantification of hydrogeological parameters typically involves limited noisy measurements coupled with inadequate understanding of the target phenomenon. Hence, a deterministic solution is unrealistic in light of the largely uncertain inputs. Stochastic imaging (SI), in contrast, provides multiple equiprobable realizations that enable probabilistic assessment of aquifer properties in a realistic manner. Generation of geologically realistic prior models is central to SI frameworks. Higher-order statistics for representing prior geological features in SI are, however, usually borrowed from training images (TIs), which may produce undesirable outcomes if the TIs are unrepresentative of the target structures. The Markov random field (MRF)-based SI strategy provides a data-driven alternative to TI-based SI algorithms. In the MRF-based method, the simulation of spatial features is guided by Gibbs energy (GE) minimization. Local configurations with smaller GEs have a higher likelihood of occurrence and vice versa. The parameters of the Gibbs distribution for computing the GE are estimated from the hydrogeophysical data, thereby enabling the generation of site-specific structures in the absence of reliable TIs. In Metropolis-like SI methods, the variance of the transition probability controls the jump size. The procedure is a standard Markov chain Monte Carlo (McMC) method when a constant variance is assumed, and becomes simulated annealing (SA) when the variance (cooling temperature) is allowed to decrease gradually with time. We observe that in certain problems, the large variance typically employed at the beginning to hasten burn-in may be ill-suited for sampling at the equilibrium state. The power of SA stems from its flexibility to adaptively scale the variance at different stages of the sampling. Degeneration of results was reported in a previous implementation of the MRF-based SI strategy based on a constant variance.
Here, we present an updated version of the algorithm based on SA that appears to resolve the degeneration problem with seemingly improved results. We illustrate the performance of the SA version with a joint inversion of time-lapse concentration and electrical resistivity measurements in a hypothetical trinary hydrofacies aquifer characterization problem.
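The distinction drawn above is that a Metropolis sampler with constant proposal variance is plain MCMC, while letting the variance (temperature) decay over time yields simulated annealing. A minimal generic sketch of the annealed version follows; the toy quadratic energy, Gaussian proposal, and geometric cooling schedule are illustrative assumptions, not the authors' MRF setup:

```python
import math
import random

def simulated_annealing(energy, propose, x0, t0=1.0, alpha=0.995,
                        n_iter=5000, seed=0):
    """Metropolis sampler whose temperature decays geometrically.

    With a constant temperature this is standard Metropolis sampling;
    letting the temperature shrink toward zero turns it into
    simulated annealing."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(n_iter):
        x_new = propose(x, rng)
        e_new = energy(x_new)
        # Metropolis acceptance rule at the current temperature:
        # always accept downhill moves, accept uphill moves with
        # probability exp(-dE / t).
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
        t *= alpha  # gradual cooling
    return best_x, best_e

# Toy energy with a minimum at x = 3 (hypothetical example).
x, e = simulated_annealing(
    energy=lambda v: (v - 3.0) ** 2,
    propose=lambda v, rng: v + rng.gauss(0.0, 0.5),
    x0=10.0,
)
```

Early in the run the high temperature allows large uphill jumps (fast burn-in); late in the run the low temperature concentrates sampling near the energy minimum, which is exactly the trade-off the abstract describes.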
NASA Astrophysics Data System (ADS)
Dunkley, J.; Spergel, D. N.; Komatsu, E.; Hinshaw, G.; Larson, D.; Nolta, M. R.; Odegard, N.; Page, L.; Bennett, C. L.; Gold, B.; Hill, R. S.; Jarosik, N.; Weiland, J. L.; Halpern, M.; Kogut, A.; Limon, M.; Meyer, S. S.; Tucker, G. S.; Wollack, E.; Wright, E. L.
2009-08-01
We describe a sampling method to estimate the polarized cosmic microwave background (CMB) signal from observed maps of the sky. We use a Metropolis-within-Gibbs algorithm to estimate the polarized CMB map, containing Q and U Stokes parameters at each pixel, and its covariance matrix. These can be used as inputs for cosmological analyses. The polarized sky signal is parameterized as the sum of three components: CMB, synchrotron emission, and thermal dust emission. The polarized Galactic components are modeled with spatially varying power-law spectral indices for the synchrotron, and a fixed power law for the dust, and their component maps are estimated as by-products. We apply the method to simulated low-resolution maps with pixels of side 7.2 deg, using diagonal and full noise realizations drawn from the WMAP noise matrices. The CMB maps are recovered with goodness of fit consistent with errors. Computing the likelihood of the E-mode power in the maps as a function of optical depth to reionization, τ, for fixed temperature anisotropy power, we recover τ = 0.091 ± 0.019 for a simulation with input τ = 0.1, and mean τ = 0.098 averaged over 10 simulations. A "null" simulation with no polarized CMB signal has maximum likelihood consistent with τ = 0. The method is applied to the five-year WMAP data, using the K, Ka, Q, and V channels. We find τ = 0.090 ± 0.019, compared to τ = 0.086 ± 0.016 from the template-cleaned maps used in the primary WMAP analysis. The synchrotron spectral index, β, averaged over high signal-to-noise pixels with standard deviation σ(β) < 0.25, but excluding ~6% of the sky masked in the Galactic plane, is -3.03 ± 0.04. This estimate does not vary significantly with Galactic latitude, although it includes an informative prior. WMAP is the result of a partnership between Princeton University and NASA's Goddard Space Flight Center. Scientific guidance is provided by the WMAP Science Team.
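A Metropolis-within-Gibbs sampler, as used above, updates one block of parameters at a time with a Metropolis step while holding the rest fixed. A minimal generic sketch follows; the two-dimensional Gaussian target and random-walk proposal are illustrative assumptions, not the CMB component model:

```python
import math
import random

def metropolis_within_gibbs(log_post, x0, step=0.5, n_iter=2000, seed=1):
    """Sweep over coordinates, applying a one-dimensional Metropolis
    update to each while conditioning on the current values of the
    others -- a minimal Metropolis-within-Gibbs sampler."""
    rng = random.Random(seed)
    x = list(x0)
    lp = log_post(x)
    samples = []
    for _ in range(n_iter):
        for i in range(len(x)):
            prop = x[:]
            prop[i] += rng.gauss(0.0, step)  # symmetric proposal
            lp_prop = log_post(prop)
            # Metropolis acceptance for this coordinate only.
            if lp_prop >= lp or rng.random() < math.exp(lp_prop - lp):
                x, lp = prop, lp_prop
        samples.append(x[:])
    return samples

# Toy target: independent Gaussians with means (1, -2) (hypothetical).
log_post = lambda v: -0.5 * ((v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2)
samples = metropolis_within_gibbs(log_post, [0.0, 0.0])

# Posterior means after discarding burn-in.
post = samples[500:]
mean0 = sum(s[0] for s in post) / len(post)
mean1 = sum(s[1] for s in post) / len(post)
```

The coordinate-wise structure is what makes the scheme practical for problems like the one above, where conditional updates (per-pixel amplitudes, per-region spectral indices) are far cheaper than joint proposals over the full map.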
Mathieu, Jordane A; Hatté, Christine; Balesdent, Jérôme; Parent, Éric
2015-11-01
The response of soil carbon dynamics to climate and land-use change will affect both the future climate and the quality of ecosystems. Deep soil carbon (>20 cm) is the primary component of the soil carbon pool, but the dynamics of deep soil carbon remain poorly understood. Therefore, radiocarbon activity (Δ14C), which is a function of the age of carbon, may help to understand the rates of soil carbon biodegradation and stabilization. We analyzed the published 14C contents in 122 profiles of mineral soil that were well distributed in most of the large world biomes, except for the boreal zone. With a multivariate extension of a linear mixed-effects model whose inference was based on the parallel combination of two algorithms, the expectation-maximization (EM) and the Metropolis-Hastings algorithms, we expressed soil Δ14C profiles as a four-parameter function of depth. The four-parameter model produced insightful predictions of soil Δ14C as dependent on depth, soil type, climate, vegetation, land-use and date of sampling (R2=0.68). Further analysis with the model showed that the age of topsoil carbon was primarily affected by climate and cultivation. By contrast, the age of deep soil carbon was affected more by soil taxa than by climate and thus illustrated the strong dependence of soil carbon dynamics on other pedologic traits such as clay content and mineralogy. © 2015 John Wiley & Sons Ltd.
Li, Chongwei; Zhang, Yajuan; Kharel, Gehendra; Zou, Chris B
2018-06-01
Nutrient discharge into peri-urban streams and reservoirs constitutes a significant pressure on environmental management, but quantitative assessment of non-point source pollution under climate variability in fast changing peri-urban watersheds is challenging. Soil and Water Assessment Tool (SWAT) was used to simulate water budget and nutrient loads for landscape patterns representing a 30-year progression of urbanization in a peri-urban watershed near Tianjin metropolis, China. A suite of landscape pattern indices was related to nitrogen (N) and phosphorus (P) loads under dry and wet climate using CANOCO redundancy analysis. The calibrated SWAT model was adequate to simulate runoff and nutrient loads for this peri-urban watershed, with Nash-Sutcliffe coefficient (NSE) and coefficient of determination (R²) > 0.70 and percentage bias (PBIAS) between -7 and +18 for calibration and validation periods. With the progression of urbanization, forest remained the main "sink" landscape while cultivated and urban lands remained the main "source" landscapes with the role of orchard and grassland being uncertain and changing with time. Compared to 1984, the landscape use pattern in 2013 increased nutrient discharge by 10%. Nutrient loads modelled under wet climate were 3-4 times higher than that under dry climate for the same landscape pattern. Results indicate that climate change could impose a far greater impact on runoff and nutrient discharge in a peri-urban watershed than landscape pattern change.
[Urbanization in tropical countries].
Amat-Roze, J M
1983-01-01
Rural populations are still the most numerous in tropical countries, but an unprecedented process of urbanization is under way. The dynamics of the phenomenon, however, differ greatly between countries and between towns. In practice, the largest and most overcrowded metropolises attract the greatest share of migrants from rural areas; the pull factors are multifarious and universal. This trend is not easily controlled and seems irreversible. Migrant farmers generally find work in the informal economic sector, and zones of spontaneous, precarious settlement are often their first living environment. Some of these unhealthy dwelling areas are subject to development plans, a few of which are extremely well designed.
Monte Carlo simulation of a noisy quantum channel with memory.
Akhalwaya, Ismail; Moodley, Mervlyn; Petruccione, Francesco
2015-10-01
The classical capacity of quantum channels is well understood for channels with uncorrelated noise. For the case of correlated noise, however, there are still open questions. We calculate the classical capacity of a forgetful channel constructed by Markov switching between two depolarizing channels. Techniques have previously been applied to approximate the output entropy of this channel and thus its capacity. In this paper, we use a Metropolis-Hastings Monte Carlo approach to numerically calculate the entropy. The algorithm is implemented in parallel and its performance is studied and optimized. The effects of memory on the capacity are explored and previous results are confirmed to higher precision.
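The general idea of estimating an entropy with a Metropolis-Hastings chain can be sketched in a few lines: draw states x ~ p with the chain and average -log p(x), whose expectation is the entropy H(p). The Bernoulli product target and single-bit-flip proposal below are illustrative assumptions, not the correlated channel studied in the paper:

```python
import math
import random

def mh_entropy(log_p, propose, x0, n_iter=20000, burn=2000, seed=2):
    """Estimate H = -E_p[log p(X)] by averaging -log p along a
    Metropolis-Hastings chain targeting p (symmetric proposal)."""
    rng = random.Random(seed)
    x, lp = x0, log_p(x0)
    total, count = 0.0, 0
    for it in range(n_iter):
        x_new = propose(x, rng)
        lp_new = log_p(x_new)
        # Metropolis acceptance: min(1, p_new / p_old).
        if lp_new >= lp or rng.random() < math.exp(lp_new - lp):
            x, lp = x_new, lp_new
        if it >= burn:  # discard burn-in before averaging
            total += -lp
            count += 1
    return total / count

# Toy target: product of 8 independent Bernoulli(0.25) bits (hypothetical).
p1 = 0.25
log_p = lambda bits: sum(math.log(p1) if b else math.log(1 - p1)
                         for b in bits)

def flip_one_bit(bits, rng):
    out = list(bits)
    out[rng.randrange(len(out))] ^= 1
    return out

h_est = mh_entropy(log_p, flip_one_bit, [0] * 8)
# Exact entropy (in nats) for comparison.
h_true = -8 * (p1 * math.log(p1) + (1 - p1) * math.log(1 - p1))
```

For the factorized toy target the exact entropy is available, which is what makes the sketch testable; the point of the MH approach in the paper is that the same estimator works when p is only known up to expensive pointwise evaluation.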
Kulis, Stephen; Hodge, David R; Ayers, Stephanie L; Brown, Eddie F; Marsiglia, Flavio F
2012-09-01
This article explores the aspects of spirituality and religious involvement that may be the protective factors against substance use among urban American Indian (AI) youth. Data come from AI youth (N = 123) in five urban middle schools in a southwestern metropolis. Ordinary least squares regression analyses indicated that following Christian beliefs and belonging to the Native American Church were associated with lower levels of substance use. Following AI traditional spiritual beliefs was associated with antidrug attitudes, norms, and expectancies. Having a sense of belonging to traditions from both AI cultures and Christianity may foster integration of the two worlds in which urban AI youth live.
The experience of pregnancy and childbirth for unmarried mothers in London, 1760-1866.
Williams, Samantha
2011-01-01
This article explores the experience of pregnancy and childbirth for unmarried mothers in the metropolis in the eighteenth and nineteenth centuries. It draws upon, in particular, the infanticide cases heard at the Old Bailey between 1760 and 1866. Many of the women in these records found themselves alone and afraid as they coped with the pregnancy and birth of their first child. A great deal is revealed about the birthing body: the ambiguity surrounding the identification of and signs of pregnancy, labour and delivery, the place of birth and the degree of privacy, and the nature of, and dangers associated with, solitary childbirth.