Rugged Metropolis sampling with simultaneous updating of two dynamical variables
NASA Astrophysics Data System (ADS)
Berg, Bernd A.; Zhou, Huan-Xiang
2005-07-01
The rugged Metropolis (RM) algorithm is a biased updating scheme which aims at directly hitting the most likely configurations in a rugged free-energy landscape. Details of the one-variable (RM1) implementation of this algorithm are presented. This is followed by an extension to simultaneous updating of two dynamical variables (RM2). In a test with the brain peptide Met-Enkephalin in vacuum, RM2 improves conventional Metropolis simulations by a factor of about 4. Correlations between three or more dihedral angles appear to prevent larger improvements at low temperatures. We also investigate a multihit Metropolis scheme, which spends more CPU time on variables with large autocorrelation times.
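The multihit idea can be sketched in a few lines of Python: each sweep spends several consecutive Metropolis trials on the same variable before moving on. This is a toy sketch on a hypothetical one-dimensional double-well energy, not the Met-Enkephalin system of the paper; all names and parameter values are assumptions.

```python
import math
import random

def multihit_metropolis(energy, x0, beta, n_sweeps, n_hits, step, rng):
    """One-variable Metropolis sampler; each sweep applies n_hits
    consecutive trial moves to the same variable (multihit updating)."""
    x, e = x0, energy(x0)
    accepted = trials = 0
    samples = []
    for _ in range(n_sweeps):
        for _ in range(n_hits):
            trials += 1
            xp = x + rng.uniform(-step, step)
            ep = energy(xp)
            # standard Metropolis acceptance; exponent is <= 0 here
            if ep <= e or rng.random() < math.exp(-beta * (ep - e)):
                x, e = xp, ep
                accepted += 1
        samples.append(x)
    return samples, accepted / trials

rng = random.Random(0)
double_well = lambda x: (x * x - 1.0) ** 2   # toy "rugged" landscape
samples, acc = multihit_metropolis(double_well, 0.0, beta=2.0,
                                   n_sweeps=2000, n_hits=3, step=0.5, rng=rng)
```

Variables with long autocorrelation times would simply receive a larger `n_hits` in this scheme, which is the trade-off the abstract describes.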
Hadron spectrum of quenched QCD on a 32³ × 64 lattice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Seyong; Sinclair, D.K.
1992-10-01
Preliminary results from a hadron spectrum calculation of quenched quantum chromodynamics (QCD) on a 32³ × 64 lattice at β = 6.5 are reported. The hadron spectrum calculation is done with staggered quarks of masses m_q a = 0.001, 0.005 and 0.0025. We use two different sources in order to be able to extract the Δ mass in addition to the usual local light hadron masses. The numerical simulation is executed on the Intel Touchstone Delta computer. The peak speed of the Delta for a 16 × 32 mesh configuration is 41 Gflops for 32-bit precision. The sustained speed for our updating code is 9.5 Gflops. A multihit Metropolis algorithm combined with an over-relaxation method is used in the updating, and the conjugate gradient method is employed for Dirac matrix inversion. Configurations are stored every 1000 sweeps.
Mori, Yoshiharu; Okumura, Hisashi
2015-12-05
Simulated tempering (ST) is a useful method to enhance sampling in molecular simulations. When ST is used, the Metropolis algorithm, which satisfies the detailed balance condition, is usually applied to calculate the transition probability. Recently, an alternative method that satisfies the global balance condition instead of the detailed balance condition was proposed by Suwa and Todo. In this study, an ST method with the Suwa-Todo algorithm is proposed. Molecular dynamics simulations with ST are performed using three algorithms (the Metropolis, heat bath, and Suwa-Todo algorithms) to calculate the transition probability. Among the three, the Suwa-Todo algorithm yields the highest acceptance ratio and the shortest autocorrelation time, suggesting that sampling by an ST simulation with the Suwa-Todo algorithm is the most efficient. In addition, because the acceptance ratio of the Suwa-Todo algorithm is higher than that of the Metropolis algorithm, the number of temperature states can be reduced by 25% for the Suwa-Todo algorithm when compared with the Metropolis algorithm. © 2015 Wiley Periodicals, Inc.
Application of Biased Metropolis Algorithms: From protons to proteins
Bazavov, Alexei; Berg, Bernd A.; Zhou, Huan-Xiang
2015-01-01
We show that sampling with a biased Metropolis scheme is essentially equivalent to using the heatbath algorithm. However, the biased Metropolis method can also be applied when an efficient heatbath algorithm does not exist. This is first illustrated with an example from high energy physics (lattice gauge theory simulations). We then illustrate the Rugged Metropolis method, which is based on a similar biased updating scheme, but aims at very different applications. The goal of such applications is to locate the most likely configurations in a rugged free energy landscape, which is most relevant for simulations of biomolecules. PMID:26612967
High-Dimensional Exploratory Item Factor Analysis by a Metropolis-Hastings Robbins-Monro Algorithm
ERIC Educational Resources Information Center
Cai, Li
2010-01-01
A Metropolis-Hastings Robbins-Monro (MH-RM) algorithm for high-dimensional maximum marginal likelihood exploratory item factor analysis is proposed. The sequence of estimates from the MH-RM algorithm converges with probability one to the maximum likelihood solution. Details on the computer implementation of this algorithm are provided. The…
Accelerated Dimension-Independent Adaptive Metropolis
Chen, Yuxin; Keyes, David E.; Law, Kody J.; ...
2016-10-27
This work describes improvements from algorithmic and architectural means to black-box Bayesian inference over high-dimensional parameter spaces. The well-known adaptive Metropolis (AM) algorithm [33] is extended herein to scale asymptotically uniformly with respect to the underlying parameter dimension for Gaussian targets, by respecting the variance of the target. The resulting algorithm, referred to as the dimension-independent adaptive Metropolis (DIAM) algorithm, also shows improved performance with respect to adaptive Metropolis on non-Gaussian targets. This algorithm is further improved, and the possibility of probing high-dimensional (dimension d ≥ 1000) targets is enabled, via GPU-accelerated numerical libraries and periodically synchronized concurrent chains (justified a posteriori). Asymptotically in dimension, this GPU implementation exhibits a factor of four improvement versus a competitive CPU-based Intel MKL parallel version alone. Strong scaling to concurrent chains is exhibited, through a combination of longer time per sample batch (weak scaling) and yet fewer necessary samples to convergence. The algorithm performance is illustrated on several Gaussian and non-Gaussian target examples, in which the dimension may be in excess of one thousand.
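For orientation, a plain Haario-style adaptive Metropolis sampler, the baseline that DIAM extends, can be sketched as follows. The 2D correlated Gaussian target and all tuning constants are illustrative assumptions; none of the GPU or dimension-independence machinery of the paper is included.

```python
import numpy as np

def adaptive_metropolis(logpi, x0, n_iter, seed=0, eps=1e-6, t0=200):
    """Haario-style AM: after t0 burn-in steps, the Gaussian proposal
    covariance is the scaled empirical covariance of the chain history."""
    rng = np.random.default_rng(seed)
    d = len(x0)
    sd = 2.4 ** 2 / d                       # classic AM scaling factor
    chain = np.empty((n_iter, d))
    x = np.asarray(x0, float)
    lp = logpi(x)
    cov = np.eye(d)
    for t in range(n_iter):
        if t > t0:
            # adapt proposal covariance to the history seen so far
            cov = sd * (np.cov(chain[:t].T) + eps * np.eye(d))
        prop = rng.multivariate_normal(x, cov)
        lpp = logpi(prop)
        if np.log(rng.random()) < lpp - lp:
            x, lp = prop, lpp
        chain[t] = x
    return chain

# assumed example: strongly correlated zero-mean Gaussian target
target_cov = np.array([[1.0, 0.9], [0.9, 1.0]])
prec = np.linalg.inv(target_cov)
logpi = lambda z: -0.5 * z @ prec @ z
chain = adaptive_metropolis(logpi, np.zeros(2), 5000)
```

After adaptation the proposal aligns with the target's correlation structure, which is exactly the effect the AM family exploits.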
Metropolis-Hastings Robbins-Monro Algorithm for Confirmatory Item Factor Analysis
ERIC Educational Resources Information Center
Cai, Li
2010-01-01
Item factor analysis (IFA), already well established in educational measurement, is increasingly applied to psychological measurement in research settings. However, high-dimensional confirmatory IFA remains a numerical challenge. The current research extends the Metropolis-Hastings Robbins-Monro (MH-RM) algorithm, initially proposed for…
ERIC Educational Resources Information Center
Yang, Ji Seung; Cai, Li
2014-01-01
The main purpose of this study is to improve estimation efficiency in obtaining maximum marginal likelihood estimates of contextual effects in the framework of a nonlinear multilevel latent variable model by adopting the Metropolis-Hastings Robbins-Monro (MH-RM) algorithm. Results indicate that the MH-RM algorithm can produce estimates and standard…
ERIC Educational Resources Information Center
Beddard, Godfrey S.
2011-01-01
Thermodynamic quantities such as the average energy, heat capacity, and entropy are calculated using a Monte Carlo method based on the Metropolis algorithm. This method is illustrated with reference to the harmonic oscillator but is particularly useful when the partition function cannot be evaluated; an example using a one-dimensional spin system…
Link, W.A.; Barker, R.J.
2008-01-01
Judicious choice of candidate generating distributions improves efficiency of the Metropolis-Hastings algorithm. In Bayesian applications, it is sometimes possible to identify an approximation to the target posterior distribution; this approximate posterior distribution is a good choice for candidate generation. These observations are applied to analysis of the Cormack-Jolly-Seber model and its extensions.
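Using an approximate posterior as the candidate-generating distribution is the independence Metropolis-Hastings sampler. A minimal sketch with a hypothetical Beta posterior and a moment-matched normal candidate (not the Cormack-Jolly-Seber analysis of the paper; all values are assumptions):

```python
import math
import random

def independence_mh(log_target, sampler, log_proposal, n, rng):
    """Independence MH: candidates come from a fixed approximation q to
    the posterior, accepted with ratio pi(x') q(x) / (pi(x) q(x'))."""
    x = sampler(rng)
    lw = log_target(x) - log_proposal(x)      # log importance weight
    out = []
    for _ in range(n):
        xp = sampler(rng)
        lwp = log_target(xp) - log_proposal(xp)
        if math.log(rng.random() + 1e-300) < lwp - lw:
            x, lw = xp, lwp
        out.append(x)
    return out

# assumed toy posterior: Beta(3, 2), candidate = normal matched to its moments
a, b = 3.0, 2.0
mu = a / (a + b)                               # 0.6
var = a * b / ((a + b) ** 2 * (a + b + 1))     # 0.04
log_target = lambda p: ((a - 1) * math.log(p) + (b - 1) * math.log(1 - p)
                        if 0 < p < 1 else -math.inf)
log_proposal = lambda p: -0.5 * (p - mu) ** 2 / var
sampler = lambda r: r.gauss(mu, math.sqrt(var))
rng = random.Random(1)
draws = independence_mh(log_target, sampler, log_proposal, 4000, rng)
```

When the candidate distribution approximates the posterior well, acceptance rates are high and successive draws are nearly independent, which is the efficiency gain the abstract refers to.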
A quantum–quantum Metropolis algorithm
Yung, Man-Hong; Aspuru-Guzik, Alán
2012-01-01
The classical Metropolis sampling method is a cornerstone of many statistical modeling applications that range from physics, chemistry, and biology to economics. This method is particularly suitable for sampling the thermal distributions of classical systems. The challenge of extending this method to the simulation of arbitrary quantum systems is that, in general, eigenstates of quantum Hamiltonians cannot be obtained efficiently with a classical computer. However, this challenge can be overcome by quantum computers. Here, we present a quantum algorithm which fully generalizes the classical Metropolis algorithm to the quantum domain. The meaning of quantum generalization is twofold: The proposed algorithm is not only applicable to both classical and quantum systems, but also offers a quantum speedup relative to the classical counterpart. Furthermore, unlike the classical method of quantum Monte Carlo, this quantum algorithm does not suffer from the negative-sign problem associated with fermionic systems. Applications of this algorithm include the study of low-temperature properties of quantum systems, such as the Hubbard model, and preparing the thermal states of sizable molecules to simulate, for example, chemical reactions at an arbitrary temperature. PMID:22215584
Water Oxidation Catalysis for NiOOH by a Metropolis Monte Carlo Algorithm.
Hareli, Chen; Caspary Toroker, Maytal
2018-05-08
Understanding catalytic mechanisms is important for discovering better catalysts, particularly for water splitting reactions that are of great interest to the renewable energy field. One of the best performing catalysts for water oxidation is nickel oxyhydroxide (NiOOH). However, only one mechanism has been adopted so far for modeling catalysis of the active plane: β-NiOOH(01̅5). In order to understand how a second reaction mechanism affects catalysis, we perform Density Functional Theory + U (DFT+U) calculations of a second mechanism for water oxidation reaction of NiOOH. Then, we use a Metropolis Monte Carlo algorithm to calculate how many catalytic cycles are completed when two reaction mechanisms are competing. We find that within the Metropolis algorithm, the second mechanism has a higher overpotential and is therefore not active even for large applied biases.
Adaptive Metropolis Sampling with Product Distributions
NASA Technical Reports Server (NTRS)
Wolpert, David H.; Lee, Chiu Fan
2005-01-01
The Metropolis-Hastings (MH) algorithm is a way to sample a provided target distribution π(x). It works by repeatedly sampling a separate proposal distribution T(x, x′) to generate a random walk {x(t)}. We consider a modification of the MH algorithm in which T is dynamically updated during the walk. The update at time t uses the samples {x(t′) : t′ < t} to estimate the product distribution that has the least Kullback-Leibler distance to π. That estimate is the information-theoretically optimal mean-field approximation to π. We demonstrate through computer experiments that our algorithm produces samples that are superior to those of the conventional MH algorithm.
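Within a Gaussian product family, the minimum-KL product distribution simply matches the per-coordinate moments of π, so a stripped-down version of the idea can be sketched as an independence sampler whose per-coordinate Gaussian proposal is periodically refit to the chain history. The target and all constants below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def product_mh(logpi, x0, n_iter, adapt_every=500, seed=0):
    """MH whose proposal is a product of independent per-coordinate
    Gaussians refit to the chain history (a mean-field approximation)."""
    rng = np.random.default_rng(seed)
    d = len(x0)
    mean, std = np.zeros(d), np.ones(d)
    def logq(z):
        # log density of the current product-of-Gaussians proposal
        return float(-0.5 * np.sum(((z - mean) / std) ** 2) - np.sum(np.log(std)))
    x = np.asarray(x0, float)
    lw = logpi(x) - logq(x)
    chain = np.empty((n_iter, d))
    for t in range(n_iter):
        if t and t % adapt_every == 0:       # refit the product proposal
            hist = chain[:t]
            mean, std = hist.mean(0), hist.std(0) + 1e-3
            lw = logpi(x) - logq(x)          # weight under the new proposal
        xp = mean + std * rng.standard_normal(d)
        lwp = logpi(xp) - logq(xp)
        if np.log(rng.random() + 1e-300) < lwp - lw:
            x, lw = xp, lwp
        chain[t] = x
    return chain

# assumed toy target: independent Gaussian centered at (1, -2)
logpi = lambda z: float(-0.5 * np.sum((z - np.array([1.0, -2.0])) ** 2))
chain = product_mh(logpi, np.zeros(2), 6000)
```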
General Metropolis-Hastings jump diffusions for automatic target recognition in infrared scenes
NASA Astrophysics Data System (ADS)
Lanterman, Aaron D.; Miller, Michael I.; Snyder, Donald L.
1997-04-01
To locate and recognize ground-based targets in forward-looking IR (FLIR) images, 3D faceted models with associated pose parameters are formulated to accommodate the variability found in FLIR imagery. Taking a Bayesian approach, scenes are simulated from the emissive characteristics of the CAD models and compared with the collected data by a likelihood function based on sensor statistics. This likelihood is combined with a prior distribution defined over the set of possible scenes to form a posterior distribution. To accommodate scenes with variable numbers of targets, the posterior distribution is defined over parameter vectors of varying dimension. An inference algorithm based on Metropolis-Hastings jump-diffusion processes empirically samples from the posterior distribution, generating configurations of templates and transformations that match the collected sensor data with high probability. The jumps accommodate the addition and deletion of targets and the estimation of target identities; diffusions refine the hypotheses by drifting along the gradient of the posterior distribution with respect to the orientation and position parameters. Previous results on jump strategies analogous to the Metropolis acceptance/rejection algorithm, with proposals drawn from the prior and accepted based on the likelihood, are extended to encompass general Metropolis-Hastings proposal densities. In particular, the algorithm proposes moves by drawing from the posterior distribution over computationally tractable subsets of the parameter space. The algorithm is illustrated by an implementation on a Silicon Graphics Onyx/Reality Engine.
Quality assessment of MEG-to-MRI coregistrations
NASA Astrophysics Data System (ADS)
Sonntag, Hermann; Haueisen, Jens; Maess, Burkhard
2018-04-01
For high precision in source reconstruction of magnetoencephalography (MEG) or electroencephalography data, high accuracy of the coregistration of sources and sensors is mandatory. Usually, the source space is derived from magnetic resonance imaging (MRI). In most cases, however, no quality assessment is reported for sensor-to-MRI coregistrations; if any, typically root mean squares (RMS) of point residuals are provided. It has been shown, however, that the RMS of residuals does not correlate with coregistration errors. We suggest using the target registration error (TRE) as the criterion for the quality of sensor-to-MRI coregistrations. TRE measures the effect of uncertainty in coregistrations at all points of interest. In total, 5544 data sets with sensor-to-head and 128 head-to-MRI coregistrations, from a single MEG laboratory, were analyzed. An adaptive Metropolis algorithm was used to estimate the optimal coregistration and to sample the coregistration parameters (rotation and translation). We found an average TRE between 1.3 and 2.3 mm at the head surface. Further, we observed a mean absolute difference in coregistration parameters between the Metropolis and iterative closest point algorithms of (1.9 ± 15)° and (1.1 ± 9) mm. A paired sample t-test indicated a significant improvement in goal function minimization by using the Metropolis algorithm. The sampled parameters allowed computation of the TRE on the entire grid of the MRI volume. Hence, we recommend the Metropolis algorithm for head-to-MRI coregistrations.
Metis: A Pure Metropolis Markov Chain Monte Carlo Bayesian Inference Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bates, Cameron Russell; Mckigney, Edward Allen
The use of Bayesian inference in data analysis has become the standard for large scientific experiments [1, 2]. The Monte Carlo Codes Group (XCP-3) at Los Alamos has developed a simple set of algorithms, currently implemented in C++ and Python, to easily perform flat-prior Markov chain Monte Carlo Bayesian inference with pure Metropolis sampling. These implementations are designed to be user friendly and extensible for customization based on specific application requirements. This document describes the algorithmic choices made and presents two use cases.
A Bootstrap Metropolis-Hastings Algorithm for Bayesian Analysis of Big Data.
Liang, Faming; Kim, Jinsu; Song, Qifan
2016-01-01
Markov chain Monte Carlo (MCMC) methods have proven to be a very powerful tool for analyzing data of complex structures. However, their computer-intensive nature, which typically requires a large number of iterations and a complete scan of the full dataset for each iteration, precludes their use for big data analysis. In this paper, we propose the so-called bootstrap Metropolis-Hastings (BMH) algorithm, which provides a general framework for taming powerful MCMC methods for big data analysis; that is, the full-data log-likelihood is replaced by a Monte Carlo average of log-likelihoods calculated in parallel from multiple bootstrap samples. The BMH algorithm possesses an embarrassingly parallel structure and avoids repeated scans of the full dataset across iterations, and is thus feasible for big data problems. Compared to the popular divide-and-combine method, BMH can be generally more efficient, as it can asymptotically integrate the whole data information into a single simulation run. The BMH algorithm is very flexible. Like the Metropolis-Hastings algorithm, it can serve as a basic building block for developing advanced MCMC algorithms that are feasible for big data problems. This is illustrated in the paper by the tempering BMH algorithm, which can be viewed as a combination of parallel tempering and the BMH algorithm. BMH can also be used for model selection and optimization by combining it with reversible jump MCMC and simulated annealing, respectively.
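The core substitution, a rescaled Monte Carlo average of subsample log-likelihoods in place of the full-data log-likelihood, can be sketched for a normal-mean toy model as follows. This is a simplified illustration, not the published BMH algorithm with its exact weighting; the subsample sizes and all other values are assumptions.

```python
import math
import random

def bmh_loglike(theta, n, subsets):
    """Surrogate log-likelihood: average over k bootstrap subsamples,
    each rescaled by n/m so it matches the full-data scale."""
    total = 0.0
    for s in subsets:
        ll = sum(-0.5 * (x - theta) ** 2 for x in s) * (n / len(s))
        total += ll
    return total / len(subsets)

def bootstrap_mh(data, n_iter, k=5, m=50, step=0.1, seed=0):
    rng = random.Random(seed)
    n = len(data)
    # bootstrap subsamples drawn once and reused every iteration,
    # so the full dataset is never rescanned inside the chain
    subsets = [[rng.choice(data) for _ in range(m)] for _ in range(k)]
    theta = 0.0
    ll = bmh_loglike(theta, n, subsets)
    out = []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0, step)
        llp = bmh_loglike(prop, n, subsets)
        if math.log(rng.random() + 1e-300) < llp - ll:
            theta, ll = prop, llp
        out.append(theta)
    return out

rng = random.Random(42)
data = [rng.gauss(2.0, 1.0) for _ in range(1000)]   # assumed true mean 2.0
draws = bootstrap_mh(data, 3000)
```

In a real deployment the k subsample log-likelihoods would be evaluated in parallel, which is the embarrassingly parallel structure the abstract highlights.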
Temme, K; Osborne, T J; Vollbrecht, K G; Poulin, D; Verstraete, F
2011-03-03
The original motivation to build a quantum computer came from Feynman, who imagined a machine capable of simulating generic quantum mechanical systems--a task that is believed to be intractable for classical computers. Such a machine could have far-reaching applications in the simulation of many-body quantum physics in condensed-matter, chemical and high-energy systems. Part of Feynman's challenge was met by Lloyd, who showed how to approximately decompose the time evolution operator of interacting quantum particles into a short sequence of elementary gates, suitable for operation on a quantum computer. However, this left open the problem of how to simulate the equilibrium and static properties of quantum systems. This requires the preparation of ground and Gibbs states on a quantum computer. For classical systems, this problem is solved by the ubiquitous Metropolis algorithm, a method that has basically acquired a monopoly on the simulation of interacting particles. Here we demonstrate how to implement a quantum version of the Metropolis algorithm. This algorithm permits sampling directly from the eigenstates of the Hamiltonian, and thus evades the sign problem present in classical simulations. A small-scale implementation of this algorithm should be achievable with today's technology.
Note: A pure-sampling quantum Monte Carlo algorithm with independent Metropolis.
Vrbik, Jan; Ospadov, Egor; Rothstein, Stuart M
2016-07-14
Recently, Ospadov and Rothstein published a pure-sampling quantum Monte Carlo algorithm (PSQMC) that features an auxiliary Path Z that connects the midpoints of the current and proposed Paths X and Y, respectively. When sufficiently long, Path Z provides statistical independence of Paths X and Y. Under those conditions, the Metropolis decision used in PSQMC is done without any approximation, i.e., not requiring microscopic reversibility and without having to introduce any G(x → x'; τ) factors into its decision function. This is a unique feature that contrasts with all competing reptation algorithms in the literature. An example illustrates that dependence of Paths X and Y has adverse consequences for pure sampling.
Monte Carlo sampling in diffusive dynamical systems
NASA Astrophysics Data System (ADS)
Tapias, Diego; Sanders, David P.; Altmann, Eduardo G.
2018-05-01
We introduce a Monte Carlo algorithm to efficiently compute transport properties of chaotic dynamical systems. Our method exploits the importance sampling technique that favors trajectories in the tail of the distribution of displacements, where deviations from a diffusive process are most prominent. We search for initial conditions using a proposal that correlates states in the Markov chain constructed via a Metropolis-Hastings algorithm. We show that our method outperforms the direct sampling method and also Metropolis-Hastings methods with alternative proposals. We test our general method through numerical simulations in 1D (box-map) and 2D (Lorentz gas) systems.
Meisenkothen, Frederick; Steel, Eric B; Prosa, Ty J; Henry, Karen T; Prakash Kolli, R
2015-12-01
In atom probe tomography (APT), some elements tend to field evaporate preferentially in multi-hit detection events. Boron (B) is one such element. It is thought that a large fraction of the B signal may be lost during data acquisition and is not reported in the mass spectrum or in the 3-D APT reconstruction. Understanding the relationship between the field evaporation behavior of B and the limitations for detecting multi-hit events can provide insight into the signal loss mechanism for B and may suggest ways to improve B detection accuracy. The present work reports data for nominally pure B and for B-implanted silicon (Si) (NIST-SRM2137) at dose levels two orders of magnitude lower than previously studied by Da Costa et al. in 2012. Boron concentration profiles collected from SRM2137 specimens qualitatively confirmed a signal loss mechanism is at work in laser pulsed atom probe measurements of B in Si. Ion correlation analysis was used to graphically demonstrate that the detector dead-time results in few same isotope, same charge-state (SISCS) ion pairs being properly recorded in the multi-hit data, explaining why B is consistently under-represented in quantitative analyses. Given the important role of detector dead-time as a signal loss mechanism, the results from three different methods of estimating the detector dead-time are presented. The findings of this study apply to all quantitative analyses that involve multi-hit data, but the dead-time will have the greatest effect on the elements that have a significant quantity of ions detected in multi-hit events. Published by Elsevier B.V.
ERIC Educational Resources Information Center
Martin-Fernandez, Manuel; Revuelta, Javier
2017-01-01
This study compares the performance of two recently introduced estimation algorithms, the Metropolis-Hastings Robbins-Monro (MHRM) and the Hamiltonian MCMC (HMC), with two algorithms well established in the psychometric literature, marginal likelihood via the EM algorithm (MML-EM) and Markov chain Monte Carlo (MCMC), in the estimation of multidimensional…
Random Numbers and Monte Carlo Methods
NASA Astrophysics Data System (ADS)
Scherer, Philipp O. J.
Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages, Monte Carlo methods, which sample the integration volume at randomly chosen points, are very useful. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with a given probability distribution, which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by preferentially sampling the important configurations. Finally, the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
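The traveling-salesman application mentioned above amounts to Metropolis sampling over tours with segment-reversal (2-opt) proposals. A minimal sketch, with random cities and an arbitrary fixed inverse temperature (both assumptions):

```python
import math
import random

def tour_length(cities, order):
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def metropolis_tsp(cities, beta, n_steps, seed=0):
    """Metropolis sampling over tours: propose reversing a random
    segment and accept with probability min(1, exp(-beta * dL))."""
    rng = random.Random(seed)
    n = len(cities)
    order = list(range(n))
    length = tour_length(cities, order)
    best = length
    for _ in range(n_steps):
        i, j = sorted(rng.sample(range(n), 2))
        cand = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
        cand_len = tour_length(cities, cand)
        # shorter tours always accepted; longer ones with Boltzmann weight
        if cand_len <= length or rng.random() < math.exp(-beta * (cand_len - length)):
            order, length = cand, cand_len
            best = min(best, length)
    return order, best      # final tour and best length observed

rng = random.Random(7)
cities = [(rng.random(), rng.random()) for _ in range(20)]
start = tour_length(cities, list(range(20)))
order, best = metropolis_tsp(cities, beta=30.0, n_steps=20000)
```

Slowly increasing `beta` during the run turns this sampler into simulated annealing, the usual optimization variant.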
MontePython 3: Parameter inference code for cosmology
NASA Astrophysics Data System (ADS)
Brinckmann, Thejs; Lesgourgues, Julien; Audren, Benjamin; Benabed, Karim; Prunet, Simon
2018-05-01
MontePython 3 provides numerous ways to explore parameter space using Markov chain Monte Carlo (MCMC) sampling, including Metropolis-Hastings, nested sampling, Cosmo Hammer, and a Fisher sampling method. This improved version of the Monte Python (ascl:1307.002) parameter inference code for cosmology offers new ingredients that improve the performance of Metropolis-Hastings sampling, speeding up convergence and offering significant time savings in difficult runs. Additional likelihoods and plotting options are available, as are post-processing algorithms such as importance sampling and the addition of derived parameters.
Maximum Likelihood Estimation of Nonlinear Structural Equation Models.
ERIC Educational Resources Information Center
Lee, Sik-Yum; Zhu, Hong-Tu
2002-01-01
Developed an EM type algorithm for maximum likelihood estimation of a general nonlinear structural equation model in which the E-step is completed by a Metropolis-Hastings algorithm. Illustrated the methodology with results from a simulation study and two real examples using data from previous studies. (SLD)
Bayesian Analysis of Nonlinear Structural Equation Models with Nonignorable Missing Data
ERIC Educational Resources Information Center
Lee, Sik-Yum
2006-01-01
A Bayesian approach is developed for analyzing nonlinear structural equation models with nonignorable missing data. The nonignorable missingness mechanism is specified by a logistic regression model. A hybrid algorithm that combines the Gibbs sampler and the Metropolis-Hastings algorithm is used to produce the joint Bayesian estimates of…
NASA Astrophysics Data System (ADS)
Eric, L.; Vrugt, J. A.
2010-12-01
Spatially distributed hydrologic models potentially contain hundreds of parameters that need to be derived by calibration against a historical record of input-output data. The quality of this calibration strongly determines the predictive capability of the model and thus its usefulness for science-based decision making and forecasting. Unfortunately, high-dimensional optimization problems are typically difficult to solve. Here we present our recent developments to the Differential Evolution Adaptive Metropolis (DREAM) algorithm (Vrugt et al., 2009) to warrant efficient solution of high-dimensional parameter estimation problems. The algorithm samples from an archive of past states (Ter Braak and Vrugt, 2008) and uses multiple-try Metropolis sampling (Liu et al., 2000) to decrease the required burn-in time for each individual chain and increase the efficiency of posterior sampling. This approach is hereafter referred to as MT-DREAM. We present results for 2 synthetic mathematical case studies and 2 real-world examples involving from 10 to 240 parameters. Results for those cases show that our multiple-try sampler, MT-DREAM, can consistently find better solutions than other Bayesian MCMC methods. Moreover, MT-DREAM is admirably suited to be implemented and run on a parallel machine and is therefore a powerful method for posterior inference.
Extended Mixed-Effects Item Response Models with the MH-RM Algorithm
ERIC Educational Resources Information Center
Chalmers, R. Philip
2015-01-01
A mixed-effects item response theory (IRT) model is presented as a logical extension of the generalized linear mixed-effects modeling approach to formulating explanatory IRT models. Fixed and random coefficients in the extended model are estimated using a Metropolis-Hastings Robbins-Monro (MH-RM) stochastic imputation algorithm to accommodate for…
Topics in Bayesian Hierarchical Modeling and its Monte Carlo Computations
NASA Astrophysics Data System (ADS)
Tak, Hyung Suk
The first chapter addresses a Beta-Binomial-Logit model that is a Beta-Binomial conjugate hierarchical model with covariate information incorporated via a logistic regression. Various researchers in the literature have unknowingly used improper posterior distributions or have given incorrect statements about posterior propriety because checking posterior propriety can be challenging due to the complicated functional form of a Beta-Binomial-Logit model. We derive data-dependent necessary and sufficient conditions for posterior propriety within a class of hyper-prior distributions that encompass those used in previous studies. Frequency coverage properties of several hyper-prior distributions are also investigated to see when and whether Bayesian interval estimates of random effects meet their nominal confidence levels. The second chapter deals with a time delay estimation problem in astrophysics. When the gravitational field of an intervening galaxy between a quasar and the Earth is strong enough to split light into two or more images, the time delay is defined as the difference between their travel times. The time delay can be used to constrain cosmological parameters and can be inferred from the time series of brightness data of each image. To estimate the time delay, we construct a Gaussian hierarchical model based on a state-space representation for irregularly observed time series generated by a latent continuous-time Ornstein-Uhlenbeck process. Our Bayesian approach jointly infers model parameters via a Gibbs sampler. We also introduce a profile likelihood of the time delay as an approximation of its marginal posterior distribution. The last chapter specifies a repelling-attracting Metropolis algorithm, a new Markov chain Monte Carlo method to explore multi-modal distributions in a simple and fast manner. 
This algorithm is essentially a Metropolis-Hastings algorithm with a proposal that consists of a downhill move in density that aims to make local modes repelling, followed by an uphill move in density that aims to make local modes attracting. The downhill move is achieved via a reciprocal Metropolis ratio so that the algorithm prefers downward movement. The uphill move does the opposite using the standard Metropolis ratio which prefers upward movement. This down-up movement in density increases the probability of a proposed move to a different mode.
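The down-up move can be sketched as follows. Note that this illustration omits the auxiliary variable and the final acceptance step that the full repelling-attracting algorithm uses to leave the target exactly invariant, so it only demonstrates the mode-hopping dynamics on an assumed bimodal target; all values are assumptions.

```python
import math
import random

def logpi(x):
    # assumed bimodal target: equal mixture of N(-4, 1) and N(4, 1),
    # evaluated stably via the max trick
    a = -0.5 * (x + 4.0) ** 2
    b = -0.5 * (x - 4.0) ** 2
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def accept(logratio, rng):
    return logratio >= 0 or rng.random() < math.exp(logratio)

def down_up_move(x, step, rng):
    # downhill stage: reciprocal Metropolis ratio favors DECREASING density
    while True:
        y = x + rng.gauss(0, step)
        if accept(logpi(x) - logpi(y), rng):
            break
    # uphill stage: standard Metropolis ratio favors INCREASING density
    while True:
        z = y + rng.gauss(0, step)
        if accept(logpi(z) - logpi(y), rng):
            break
    return z

rng = random.Random(3)
x, path = -4.0, []
for _ in range(2000):
    x = down_up_move(x, 1.5, rng)
    path.append(x)
```

Because the downhill stage is never blocked by low density, the trajectory repeatedly descends into the valley between the modes and climbs out on either side, which is the mechanism that raises the probability of proposing a move to a different mode.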
NASA Astrophysics Data System (ADS)
Wang, Hongrui; Wang, Cheng; Wang, Ying; Gao, Xiong; Yu, Chen
2017-06-01
This paper presents a Bayesian approach using the Metropolis-Hastings Markov chain Monte Carlo algorithm and applies it to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of the associated uncertainties. While the Bayesian method performs similarly in estimating the mean daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a narrower credible interval than the MLE confidence interval and thus a more precise estimate, by exploiting information from the regional gage stations. The Bayesian MCMC method may therefore be preferable for uncertainty analysis and risk management.
Modelling maximum river flow by using Bayesian Markov Chain Monte Carlo
NASA Astrophysics Data System (ADS)
Cheong, R. Y.; Gabda, D.
2017-09-01
Analysis of flood trends is vital since flooding threatens human life and property in financial, environmental, and security terms. The annual maximum river flows in Sabah were fitted to the generalized extreme value (GEV) distribution. The maximum likelihood estimator (MLE) arises naturally when working with the GEV distribution. However, previous research has shown that MLE provides unstable results, especially for small sample sizes. In this study, we instead used Bayesian Markov chain Monte Carlo (MCMC) based on the Metropolis-Hastings algorithm to estimate the GEV parameters. Bayesian MCMC is a statistical inference method that estimates parameters from the posterior distribution obtained via Bayes' theorem. The Metropolis-Hastings algorithm is used to overcome the high-dimensional state space faced by plain Monte Carlo methods. This approach also accounts for more of the uncertainty in parameter estimation, which then yields a better prediction of maximum river flow in Sabah.
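As a concrete illustration of this setup, the sketch below fits GEV parameters to synthetic annual maxima with a random-walk Metropolis-Hastings sampler under flat priors. All numbers (true parameters, proposal scales, chain length) are invented for the example and are not from the study.

```python
import math
import random

def gev_loglik(params, data):
    """GEV(mu, sigma, xi) log-likelihood; -inf outside the support."""
    mu, log_sigma, xi = params
    if abs(xi) < 1e-8:          # avoid the xi -> 0 limit in this sketch
        return float("-inf")
    sigma = math.exp(log_sigma)
    ll = 0.0
    for x in data:
        t = 1.0 + xi * (x - mu) / sigma
        if t <= 0.0:            # outside the GEV support
            return float("-inf")
        ll += -log_sigma - (1.0 + 1.0 / xi) * math.log(t) - t ** (-1.0 / xi)
    return ll

def metropolis(logpost, init, steps, scale, rng):
    """Random-walk Metropolis-Hastings with independent Gaussian steps."""
    cur, cur_lp = list(init), logpost(init)
    chain = [list(cur)]
    for _ in range(steps):
        prop = [c + rng.gauss(0.0, s) for c, s in zip(cur, scale)]
        lp = logpost(prop)
        if math.log(rng.random()) < lp - cur_lp:
            cur, cur_lp = prop, lp
        chain.append(list(cur))
    return chain

rng = random.Random(0)
mu0, sigma0, xi0 = 10.0, 2.0, 0.1   # invented "true" GEV parameters
data = [mu0 + sigma0 / xi0 * ((-math.log(rng.random())) ** -xi0 - 1.0)
        for _ in range(200)]         # inverse-CDF draws of annual maxima
chain = metropolis(lambda p: gev_loglik(p, data),
                   init=(8.0, 0.0, 0.2), steps=3000,
                   scale=(0.2, 0.1, 0.05), rng=rng)
mu_hat = sum(c[0] for c in chain[1500:]) / len(chain[1500:])
```

Discarding the first half of the chain as burn-in, the posterior mean of the location parameter lands near the invented true value.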
The Metropolis Monte Carlo method with CUDA enabled Graphic Processing Units
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, Clifford; School of Physics, Astronomy, and Computational Sciences, George Mason University, 4400 University Dr., Fairfax, VA 22030; Ji, Weixiao
2014-02-01
We present a CPU-GPU system for runtime acceleration of large molecular simulations using GPU computation and memory swaps. The memory architecture of the GPU can be used both as a container for simulation data stored on the graphics card and as a floating-point code target, providing an effective means for the manipulation of atomistic or molecular data on the GPU. To take full advantage of this mechanism, efficient GPU realizations of the algorithms used to perform atomistic and molecular simulations are essential. Our system implements a versatile molecular engine, including inter-molecule interactions and orientational variables, for performing the Metropolis Monte Carlo (MMC) algorithm, which is one type of Markov chain Monte Carlo. By combining memory objects with floating-point code fragments we have implemented an MMC parallel engine that entirely avoids the communication time of molecular data at runtime. Our runtime acceleration system is a forerunner of a new class of CPU-GPU algorithms exploiting memory concepts combined with threading to avoid bus bandwidth and communication bottlenecks. The testbed molecular system used here is a condensed-phase system of oligopyrrole chains. A benchmark shows a size-scaling speedup of 60 for systems with 210,000 pyrrole monomers. Our implementation can easily be combined with MPI to connect several CPU-GPU duets in parallel. Highlights: • We parallelize the Metropolis Monte Carlo (MMC) algorithm on one CPU-GPU duet. • The Adaptive Tempering Monte Carlo method employs MMC and profits from this CPU-GPU implementation. • Our benchmark shows a size-scaling speedup of 62 for systems with 225,000 particles. • The testbed involves a polymeric system of oligopyrroles in the condensed phase. • The CPU-GPU parallelization includes dipole-dipole and Mie-Jones classical potentials.
Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo
NASA Astrophysics Data System (ADS)
Schön, Thomas B.; Svensson, Andreas; Murray, Lawrence; Lindsten, Fredrik
2018-05-01
Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, predictions and decisions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state-space models. There is no closed-form solution available for this problem, implying that we are forced to use approximations. In this tutorial we will provide a self-contained introduction to one of the state-of-the-art methods, the particle Metropolis-Hastings algorithm, which has proven to offer a practical approximation. This is a Monte Carlo based method, where the particle filter is used to guide a Markov chain Monte Carlo method through the parameter space. One of the key merits of the particle Metropolis-Hastings algorithm is that it is guaranteed to converge to the "true solution" under mild assumptions, despite being based on a particle filter with only a finite number of particles. We will also provide a motivating numerical example illustrating the method using a modeling language tailored for sequential Monte Carlo methods. The intention of modeling languages of this kind is to open up the power of sophisticated Monte Carlo methods, including particle Metropolis-Hastings, to a large group of users without requiring them to know all the underlying mathematical details.
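The mechanics can be sketched with a tiny linear-Gaussian state-space model: a bootstrap particle filter supplies a likelihood estimate, which drives an outer Metropolis-Hastings chain over the model parameter. The model, particle count, and chain settings are invented for the illustration and are far smaller than anything used in practice.

```python
import math
import random

def particle_loglik(a, ys, n_part, rng):
    """Bootstrap particle filter estimate of log p(y | a) for the model
    x_t = a*x_{t-1} + v_t, y_t = x_t + e_t, with v_t, e_t ~ N(0, 1)."""
    parts = [rng.gauss(0.0, 1.0) for _ in range(n_part)]
    ll = 0.0
    for y in ys:
        parts = [a * x + rng.gauss(0.0, 1.0) for x in parts]     # propagate
        ws = [math.exp(-0.5 * (y - x) ** 2) for x in parts]      # weight
        ll += math.log(sum(ws) / n_part) - 0.5 * math.log(2.0 * math.pi)
        parts = rng.choices(parts, weights=ws, k=n_part)         # resample
    return ll

rng = random.Random(2)
a_true, x = 0.7, 0.0
ys = []
for _ in range(100):                 # simulate invented observations
    x = a_true * x + rng.gauss(0.0, 1.0)
    ys.append(x + rng.gauss(0.0, 1.0))

cur, cur_ll, samples = 0.0, particle_loglik(0.0, ys, 100, rng), []
for _ in range(300):                 # particle Metropolis-Hastings over a
    prop = cur + rng.gauss(0.0, 0.1)
    if abs(prop) < 1.0:              # flat prior on the stable region
        ll = particle_loglik(prop, ys, 100, rng)
        if math.log(rng.random()) < ll - cur_ll:
            cur, cur_ll = prop, ll
    samples.append(cur)
a_hat = sum(samples[100:]) / len(samples[100:])
```

The noisy likelihood estimate is simply plugged into the usual acceptance ratio; this is the pseudo-marginal construction that makes the chain valid despite the finite number of particles.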
Lung Cancer Pathological Image Analysis Using a Hidden Potts Model
Li, Qianyun; Yi, Faliu; Wang, Tao; Xiao, Guanghua; Liang, Faming
2017-01-01
Nowadays, many biological data are acquired via images. In this article, we study pathological images scanned from 205 patients with lung cancer, with the goal of understanding the relationship between survival time and the spatial distribution of different types of cells, including lymphocyte, stroma, and tumor cells. Toward this goal, we model the spatial distribution of the different cell types using a modified Potts model, whose parameters represent the interactions between cell types, and we estimate those parameters using the double Metropolis-Hastings algorithm. The double Metropolis-Hastings algorithm allows us to simulate samples approximately from a distribution with an intractable normalizing constant. Our numerical results indicate that the spatial interaction between lymphocyte and tumor cells is significantly associated with patient survival time, and that it can be used together with cell count information to predict the survival of the patients. PMID:28615918
ERIC Educational Resources Information Center
Monroe, Scott; Cai, Li
2013-01-01
In Ramsay curve item response theory (RC-IRT, Woods & Thissen, 2006) modeling, the shape of the latent trait distribution is estimated simultaneously with the item parameters. In its original implementation, RC-IRT is estimated via Bock and Aitkin's (1981) EM algorithm, which yields maximum marginal likelihood estimates. This method, however,…
ERIC Educational Resources Information Center
Monroe, Scott; Cai, Li
2014-01-01
In Ramsay curve item response theory (RC-IRT) modeling, the shape of the latent trait distribution is estimated simultaneously with the item parameters. In its original implementation, RC-IRT is estimated via Bock and Aitkin's EM algorithm, which yields maximum marginal likelihood estimates. This method, however, does not produce the…
NASA Astrophysics Data System (ADS)
Zhu, Gaofeng; Li, Xin; Ma, Jinzhu; Wang, Yunquan; Liu, Shaomin; Huang, Chunlin; Zhang, Kun; Hu, Xiaoli
2018-04-01
Sequential Monte Carlo (SMC) samplers have become increasingly popular for estimating posterior parameter distributions with the nonlinear dependency structures and multiple modes often present in hydrological models. However, the explorative capability and efficiency of the sampler depend strongly on the efficiency of the move step of the SMC sampler. In this paper we present a new SMC sampler, termed the Particle Evolution Metropolis Sequential Monte Carlo (PEM-SMC) algorithm, which is well suited to handling the unknown static parameters of hydrologic models. The PEM-SMC sampler is inspired by the work of Liang and Wong (2001) and operates by incorporating the strengths of the genetic algorithm, the differential evolution algorithm and the Metropolis-Hastings algorithm into the SMC framework. We also prove that the sampler admits the target distribution as a stationary distribution. Two case studies, a multi-dimensional bimodal normal distribution and a conceptual rainfall-runoff hydrologic model, first considering only parameter uncertainty and then considering parameter and input uncertainty simultaneously, show that the PEM-SMC sampler is generally superior to other popular SMC algorithms in handling high-dimensional problems. The study also indicates that it may be important to account for model structural uncertainty by using multiple different hydrological models within the SMC framework in future studies.
A Bootstrap Metropolis–Hastings Algorithm for Bayesian Analysis of Big Data
Kim, Jinsu; Song, Qifan
2016-01-01
Markov chain Monte Carlo (MCMC) methods have proven to be a very powerful tool for analyzing data of complex structure. However, their computer-intensive nature, typically requiring a large number of iterations and a complete scan of the full dataset for each iteration, precludes their use for big data analysis. In this paper, we propose the so-called bootstrap Metropolis-Hastings (BMH) algorithm, which provides a general framework for taming powerful MCMC methods for big data analysis: the full-data log-likelihood is replaced by a Monte Carlo average of log-likelihoods calculated in parallel from multiple bootstrap samples. The BMH algorithm possesses an embarrassingly parallel structure and avoids repeated scans of the full dataset, and is thus feasible for big data problems. Compared to the popular divide-and-combine method, BMH can be generally more efficient as it can asymptotically integrate the whole data information into a single simulation run. The BMH algorithm is very flexible. Like the Metropolis-Hastings algorithm, it can serve as a basic building block for developing advanced MCMC algorithms that are feasible for big data problems. This is illustrated in the paper by the tempering BMH algorithm, which can be viewed as a combination of parallel tempering and the BMH algorithm. BMH can also be used for model selection and optimization by combining it with reversible jump MCMC and simulated annealing, respectively. PMID:29033469
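A toy version of this replacement, assuming a normal mean model and invented sizes, can be sketched as follows; the exact subsample size and weighting in the paper may differ.

```python
import math
import random

def bmh_logpost(theta, boots, n_full):
    """Stand-in for the full-data log-likelihood: the average over
    bootstrap subsamples of the subsample log-likelihood, rescaled to
    the full-data size (one reading of the BMH idea)."""
    vals = []
    for b in boots:
        ll = sum(-0.5 * (x - theta) ** 2 for x in b)  # N(theta, 1), up to a constant
        vals.append(ll * n_full / len(b))
    return sum(vals) / len(vals)

rng = random.Random(3)
data = [rng.gauss(5.0, 1.0) for _ in range(10000)]     # the "big" dataset
boots = [rng.choices(data, k=500) for _ in range(10)]  # 10 subsamples of 500

cur, draws = 0.0, []
cur_lp = bmh_logpost(cur, boots, len(data))
for _ in range(1000):
    prop = cur + rng.gauss(0.0, 0.05)
    lp = bmh_logpost(prop, boots, len(data))
    if math.log(rng.random()) < lp - cur_lp:
        cur, cur_lp = prop, lp
    draws.append(cur)
mean_hat = sum(draws[500:]) / len(draws[500:])
```

Each `bmh_logpost` call touches 10 × 500 points instead of all 10,000, and the ten subsample terms could be evaluated in parallel, which is the source of the claimed scalability.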
Biased Metropolis Sampling for Rugged Free Energy Landscapes
NASA Astrophysics Data System (ADS)
Berg, Bernd A.
2003-11-01
Metropolis simulations of all-atom models of peptides (i.e. small proteins) are considered. Inspired by the funnel picture of Bryngelson and Wolynes, a transformation of the updating probabilities of the dihedral angles is defined, which uses probability densities from a higher temperature to improve the algorithmic performance at a lower temperature. The method is suitable for canonical as well as for generalized ensemble simulations. A simple approximation to the full transformation is tested at room temperature for Met-Enkephalin in vacuum. Integrated autocorrelation times are found to be reduced by factors close to two, and a similar improvement due to generalized ensemble methods enters multiplicatively.
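One way to picture such a transformation, under invented stand-ins (a single angle, a synthetic "higher-temperature" sample, a sharper low-temperature target), is an independence Metropolis update that proposes from a histogram of the higher-temperature marginal and corrects with the usual ratio:

```python
import bisect
import math
import random

def make_hist_sampler(samples, nbins=20, lo=-math.pi, hi=math.pi):
    """Histogram estimate of an angle's marginal at a higher
    temperature, usable both as a sampler and as a density."""
    w = (hi - lo) / nbins
    counts = [1] * nbins                     # add-one smoothing keeps q > 0
    for x in samples:
        counts[min(int((x - lo) / w), nbins - 1)] += 1
    total = sum(counts)
    cdf, acc = [], 0
    for c in counts:
        acc += c
        cdf.append(acc / total)
    def density(x):
        return counts[min(int((x - lo) / w), nbins - 1)] / (total * w)
    def draw(rng):
        k = bisect.bisect_left(cdf, rng.random())
        return lo + (k + rng.random()) * w   # uniform within the chosen bin
    return draw, density

def biased_metropolis(log_p, draw, density, x0, steps, rng):
    """Independence Metropolis with the higher-temperature marginal as
    proposal (a simplified stand-in for the biased update)."""
    x, out = x0, []
    for _ in range(steps):
        y = draw(rng)
        ratio = log_p(y) - log_p(x) + math.log(density(x)) - math.log(density(y))
        if math.log(rng.random()) < min(0.0, ratio):
            x = y
        out.append(x)
    return out

rng = random.Random(7)
# Synthetic "higher-temperature" angle sample, clipped into (-pi, pi).
high_T = [max(-math.pi, min(math.pi, rng.gauss(0.0, 1.5))) for _ in range(5000)]
draw, density = make_hist_sampler(high_T)
log_p = lambda x: 4.0 * math.cos(x)          # sharper low-temperature target
chain = biased_metropolis(log_p, draw, density, 0.0, 4000, rng)
mean_abs = sum(abs(x) for x in chain) / len(chain)
```

Because the proposal already concentrates where the higher-temperature density lives, the low-temperature chain spends little time proposing into forbidden regions.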
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, Brian Phillip
The purpose of this document is to describe the statistical modeling effort for gas concentrations in WIPP storage containers. The concentration (in ppm) of CO2 in the headspace volume of standard waste box (SWB) 68685 is shown. A Bayesian approach and an adaptive Metropolis-Hastings algorithm were used.
NASA Astrophysics Data System (ADS)
Preston, M. F.; Myers, L. S.; Annand, J. R. M.; Fissum, K. G.; Hansen, K.; Isaksson, L.; Jebali, R.; Lundin, M.
2014-04-01
Rate-dependent effects in the electronics used to instrument the tagger focal plane at the MAX IV Laboratory were recently investigated using the novel approach of Monte Carlo simulation to allow for normalization of high-rate experimental data acquired with single-hit time-to-digital converters (TDCs). The instrumentation of the tagger focal plane has now been expanded to include multi-hit TDCs. The agreement between results obtained from data taken using single-hit and multi-hit TDCs demonstrates a thorough understanding of the behavior of the detector system.
Quantifying parameter uncertainty in stochastic models using the Box-Cox transformation
NASA Astrophysics Data System (ADS)
Thyer, Mark; Kuczera, George; Wang, Q. J.
2002-08-01
The Box-Cox transformation is widely used to transform hydrological data to make it approximately Gaussian. Bayesian evaluation of parameter uncertainty in stochastic models using the Box-Cox transformation is hindered by the fact that there is no analytical solution for the posterior distribution. However, the Markov chain Monte Carlo method known as the Metropolis algorithm can be used to simulate the posterior distribution. This method properly accounts for the nonnegativity constraint implicit in the Box-Cox transformation. Nonetheless, a case study using the AR(1) model uncovered a practical problem with the implementation of the Metropolis algorithm. The use of a multivariate Gaussian jump distribution resulted in unacceptable convergence behaviour. This was rectified by developing suitable parameter transformations for the mean and variance of the AR(1) process to remove the strong nonlinear dependencies with the Box-Cox transformation parameter. Applying this methodology to the Sydney annual rainfall data and the Burdekin River annual runoff data illustrates the efficacy of these parameter transformations and demonstrates the value of quantifying parameter uncertainty.
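For reference, the transformation itself and the nonnegativity constraint mentioned above look like this (a generic sketch, not the authors' code):

```python
import math

def boxcox(x, lam):
    """Box-Cox transform; defined only for x > 0."""
    if x <= 0:
        raise ValueError("Box-Cox requires positive data")
    return math.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def inv_boxcox(y, lam):
    """Inverse transform; for lam != 0 the image constraint
    1 + lam*y > 0 is the nonnegativity constraint referred to above."""
    if lam == 0:
        return math.exp(y)
    base = 1.0 + lam * y
    if base <= 0:
        raise ValueError("y is outside the image of the transform")
    return base ** (1.0 / lam)
```

In a Metropolis sampler over the transformation and AR(1) parameters, any proposal violating the image constraint simply receives zero posterior density.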
Irreversible Local Markov Chains with Rapid Convergence towards Equilibrium.
Kapfer, Sebastian C; Krauth, Werner
2017-12-15
We study the continuous one-dimensional hard-sphere model and present irreversible local Markov chains that mix on faster time scales than the reversible heat bath or Metropolis algorithms. The mixing time scales appear to fall into two distinct universality classes, both faster than for reversible local Markov chains. The event-chain algorithm, the infinitesimal limit of one of these Markov chains, belongs to the class presenting the fastest decay. For the lattice-gas limit of the hard-sphere model, reversible local Markov chains correspond to the symmetric simple exclusion process (SEP) with periodic boundary conditions. The two universality classes for irreversible Markov chains are realized by the totally asymmetric SEP (TASEP), and by a faster variant (lifted TASEP) that we propose here. We discuss how our irreversible hard-sphere Markov chains generalize to arbitrary repulsive pair interactions and carry over to higher dimensions through the concept of lifted Markov chains and the recently introduced factorized Metropolis acceptance rule.
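In lattice-gas language, the difference between the rejection-based TASEP move and the lifted variant can be sketched as follows; the hand-off rule shown is a simplified reading of the lifting idea, not the paper's exact definition.

```python
def tasep_move(occ, i):
    """Plain TASEP move: the particle at site i tries to hop right on a
    periodic lattice; exclusion rejects the move if the target is full."""
    j = (i + 1) % len(occ)
    if occ[j]:
        return i                  # blocked: the attempt is wasted
    occ[i], occ[j] = 0, 1
    return j

def lifted_tasep_move(occ, active):
    """Lifted variant: a lifting variable marks one active particle;
    when it is blocked, activity passes to the blocking particle
    instead of the move being rejected."""
    j = (active + 1) % len(occ)
    if occ[j]:
        return j                  # hand the lifting variable to the blocker
    occ[active], occ[j] = 0, 1
    return j

occ = [1, 1, 0, 1, 0, 0, 0, 1, 0, 0]   # periodic lattice, 4 particles
active = 0
for _ in range(50):
    active = lifted_tasep_move(occ, active)
```

Particle number is conserved and the active label always sits on a particle; no move is ever wasted, which is the intuition behind the faster relaxation of the lifted chain.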
NASA Astrophysics Data System (ADS)
Bonnema, Matthew G.; Sikder, Safat; Hossain, Faisal; Durand, Michael; Gleason, Colin J.; Bjerklie, David M.
2016-04-01
The objective of this study is to compare the effectiveness of three algorithms that estimate discharge from remotely sensed observables (river width, water surface height, and water surface slope) in anticipation of the forthcoming NASA/CNES Surface Water and Ocean Topography (SWOT) mission. SWOT promises to provide these measurements simultaneously, and the river discharge algorithms included here are designed to work with these data. Two of the algorithms, the Metropolis Manning (MetroMan) method and the Mean Flow and Geomorphology (MFG) method, are built around Manning's equation, while the third, the at-many-stations hydraulic geometry (AMHG) method, uses hydraulic geometry to estimate discharge. A well-calibrated and ground-truthed hydrodynamic model (HEC-RAS) of the Ganges river system was used as reference for three rivers from the Ganges River Delta: the main stem of the Ganges, the Arial Khan, and the Mohananda Rivers. The high seasonal variability of these rivers due to the monsoon presented a unique opportunity to assess the discharge algorithms thoroughly on typical monsoon-regime rivers. It was found that the MFG method provides the most accurate discharge estimates in most cases, with an average relative root-mean-squared error (RRMSE) across all three reaches of 35.5%. It is followed closely by the Metropolis Manning algorithm, with an average RRMSE of 51.5%. However, the MFG method's reliance on knowledge of prior river discharge limits its application to ungauged rivers. In terms of input data requirements in ungauged regions with no prior records, the Metropolis Manning algorithm provides a more practical alternative for regions lacking historical observations, as it requires less ancillary data. The AMHG algorithm, while requiring the least prior river data, provided the least accurate discharge estimates, with average wet- and dry-season RRMSEs of 79.8% and 119.1%, respectively, across all rivers studied.
This poor performance is traced directly to poor estimation of AMHG via a remotely sensed proxy, and results improve to levels commensurate with MFG and MetroMan when prior AMHG information is given to the method. Therefore, we cannot recommend use of AMHG without inclusion of this prior information, at least for the studied rivers. The dry-season discharge (within-bank flow) was captured well by all methods, while the wet season (floodplain flow) proved more challenging. The picture that emerges from this study is that a multialgorithm approach may be appropriate during flood inundation periods in the Ganges Delta.
Cell-veto Monte Carlo algorithm for long-range systems.
Kapfer, Sebastian C; Krauth, Werner
2016-09-01
We present a rigorous efficient event-chain Monte Carlo algorithm for long-range interacting particle systems. Using a cell-veto scheme within the factorized Metropolis algorithm, we compute each single-particle move with a fixed number of operations. For slowly decaying potentials such as Coulomb interactions, screening line charges allow us to take into account periodic boundary conditions. We discuss the performance of the cell-veto Monte Carlo algorithm for general inverse-power-law potentials, and illustrate how it provides a new outlook on one of the prominent bottlenecks in large-scale atomistic Monte Carlo simulations.
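The factorized Metropolis acceptance rule mentioned at the end can be sketched in isolation: each interaction factor accepts independently, and any single factor may veto the whole move (the example energies and inverse temperature are invented).

```python
import math
import random

def factorized_accept(dE_factors, beta, rng):
    """Factorized Metropolis rule: accept only if every factor accepts
    independently with probability min(1, exp(-beta * dE_f))."""
    for dE in dE_factors:
        if dE > 0.0 and rng.random() >= math.exp(-beta * dE):
            return False          # one factor vetoes the whole move
    return True

rng = random.Random(6)
trials = 20000
acc = sum(factorized_accept([-0.3, 0.5], 1.0, rng)
          for _ in range(trials)) / trials   # expected rate: exp(-0.5)
```

Because each factor is tested on its own, a cell-veto scheme can bound the veto probability of distant cells in advance and sample which cell (if any) vetoes in constant time, without computing every pairwise energy.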
Weare, Jonathan; Dinner, Aaron R.; Roux, Benoît
2016-01-01
A multiple time-step integrator based on a dual Hamiltonian and a hybrid method combining molecular dynamics (MD) and Monte Carlo (MC) is proposed to sample systems in the canonical ensemble. The Dual Hamiltonian Multiple Time-Step (DHMTS) algorithm is based on two similar Hamiltonians: a computationally expensive one that serves as a reference and a computationally inexpensive one to which the workload is shifted. The central assumption is that the difference between the two Hamiltonians is slowly varying. Earlier work has shown that such dual Hamiltonian multiple time-step schemes effectively precondition nonlinear differential equations for dynamics by reformulating them into a recursive root finding problem that can be solved by propagating a correction term through an internal loop, analogous to RESPA. Of special interest in the present context, a hybrid MD-MC version of the DHMTS algorithm is introduced to enforce detailed balance via a Metropolis acceptance criterion and ensure consistency with the Boltzmann distribution. The Metropolis criterion suppresses the discretization errors normally associated with the propagation according to the computationally inexpensive Hamiltonian, treating the discretization error as an external work. Illustrative tests are carried out to demonstrate the effectiveness of the method. PMID:26918826
Stochastic Approximation Methods for Latent Regression Item Response Models
ERIC Educational Resources Information Center
von Davier, Matthias; Sinharay, Sandip
2010-01-01
This article presents an application of a stochastic approximation expectation maximization (EM) algorithm using a Metropolis-Hastings (MH) sampler to estimate the parameters of an item response latent regression model. Latent regression item response models are extensions of item response theory (IRT) to a latent variable model with covariates…
Gradient-free MCMC methods for dynamic causal modelling
Sengupta, Biswa; Friston, Karl J.; Penny, Will D.
2015-03-14
Here, we compare the performance of four gradient-free MCMC samplers (random walk Metropolis sampling, slice-sampling, adaptive MCMC sampling and population-based MCMC sampling with tempering) in terms of the number of independent samples they can produce per unit computational time. For the Bayesian inversion of a single-node neural mass model, both adaptive and population-based samplers are more efficient compared with random walk Metropolis sampler or slice-sampling; yet adaptive MCMC sampling is more promising in terms of compute time. Slice-sampling yields the highest number of independent samples from the target density, albeit at almost 1000% increase in computational time, in comparison to the most efficient algorithm (i.e., the adaptive MCMC sampler).
Markov Chain Monte Carlo in the Analysis of Single-Molecule Experimental Data
NASA Astrophysics Data System (ADS)
Kou, S. C.; Xie, X. Sunney; Liu, Jun S.
2003-11-01
This article provides a Bayesian analysis of the single-molecule fluorescence lifetime experiment designed to probe the conformational dynamics of a single DNA hairpin molecule. The DNA hairpin's conformational change is initially modeled as a two-state Markov chain, which is not observable and has to be indirectly inferred. The Brownian diffusion of the single molecule, in addition to the hidden Markov structure, further complicates the matter. We show that the analytical form of the likelihood function can be obtained in the simplest case and that a Metropolis-Hastings algorithm can be designed to sample from the posterior distribution of the parameters of interest and to compute the desired estimates. To cope with the molecular diffusion process and the potentially oscillating energy barrier between the two states of the DNA hairpin, we introduce a data augmentation technique to handle both the Brownian diffusion and the hidden Ornstein-Uhlenbeck process associated with the fluctuating energy barrier, and design a more sophisticated Metropolis-type algorithm. Our method not only increases the estimating resolution severalfold but also proves to be successful for model discrimination.
Monte Carlo sampling of Wigner functions and surface hopping quantum dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kube, Susanna; Lasser, Caroline; Weber, Marcus
2009-04-01
The article addresses the achievable accuracy for a Monte Carlo sampling of Wigner functions in combination with a surface hopping algorithm for non-adiabatic quantum dynamics. The approximation of Wigner functions is realized by an adaptation of the Metropolis algorithm to real-valued functions with disconnected support. The integration, which is necessary for computing values of the Wigner function, uses importance sampling with a Gaussian weight function. The numerical experiments agree with theoretical considerations and show an error of 2-3%.
Gradient-free MCMC methods for dynamic causal modelling.
Sengupta, Biswa; Friston, Karl J; Penny, Will D
2015-05-15
In this technical note we compare the performance of four gradient-free MCMC samplers (random walk Metropolis sampling, slice-sampling, adaptive MCMC sampling and population-based MCMC sampling with tempering) in terms of the number of independent samples they can produce per unit computational time. For the Bayesian inversion of a single-node neural mass model, both adaptive and population-based samplers are more efficient compared with random walk Metropolis sampler or slice-sampling; yet adaptive MCMC sampling is more promising in terms of compute time. Slice-sampling yields the highest number of independent samples from the target density - albeit at almost 1000% increase in computational time, in comparison to the most efficient algorithm (i.e., the adaptive MCMC sampler).
Hybrid optimization and Bayesian inference techniques for a non-smooth radiation detection problem
Stefanescu, Razvan; Schmidt, Kathleen; Hite, Jason; ...
2016-12-12
In this paper, we propose several algorithms to recover the location and intensity of a radiation source located in a simulated 250 × 180 m block of an urban center based on synthetic measurements. Radioactive decay and detection are Poisson random processes, so we employ likelihood functions based on this distribution. Owing to the domain geometry and the proposed response model, the negative logarithm of the likelihood is only piecewise continuously differentiable, and it has multiple local minima. To address these difficulties, we investigate three hybrid algorithms composed of mixed optimization techniques. For global optimization, we consider simulated annealing, particle swarm, and genetic algorithms, which rely solely on objective function evaluations; that is, they do not evaluate the gradient of the objective function. By employing early stopping criteria for the global optimization methods, a pseudo-optimum point is obtained. This is subsequently utilized as the initial value by the deterministic implicit filtering method, which is able to find local extrema of non-smooth functions, to finish the search in a narrow domain. These new hybrid techniques, combining global optimization and implicit filtering, address difficulties associated with the non-smooth response, and their performance is shown to significantly decrease the computational time relative to the global optimization methods alone. To quantify uncertainties associated with the source location and intensity, we employ the delayed rejection adaptive Metropolis and DiffeRential Evolution Adaptive Metropolis algorithms. Finally, marginal densities of the source properties are obtained, and the means of the chains compare accurately with the estimates produced by the hybrid algorithms.
ERIC Educational Resources Information Center
von Davier, Matthias; Sinharay, Sandip
2009-01-01
This paper presents an application of a stochastic approximation EM-algorithm using a Metropolis-Hastings sampler to estimate the parameters of an item response latent regression model. Latent regression models are extensions of item response theory (IRT) to a 2-level latent variable model in which covariates serve as predictors of the…
Semi-blind sparse image reconstruction with application to MRFM.
Park, Se Un; Dobigeon, Nicolas; Hero, Alfred O
2012-09-01
We propose a solution to the image deconvolution problem where the convolution kernel or point spread function (PSF) is assumed to be only partially known. Small perturbations generated from the model are exploited to produce a few principal components explaining the PSF uncertainty in a high-dimensional space. Unlike recent developments on blind deconvolution of natural images, we assume the image is sparse in the pixel basis, a natural sparsity arising in magnetic resonance force microscopy (MRFM). Our approach adopts a Bayesian Metropolis-within-Gibbs sampling framework. The performance of our Bayesian semi-blind algorithm for sparse images is superior to previously proposed semi-blind algorithms such as the alternating minimization algorithm and blind algorithms developed for natural images. We illustrate our myopic algorithm on real MRFM tobacco virus data.
2017-09-01
This dissertation explores the efficacy of statistical post-processing methods downstream of these dynamical model components, using a hierarchical multivariate Bayesian approach. Keywords: Bayesian hierarchical modeling, Markov chain Monte Carlo methods, Metropolis algorithm, machine learning, atmospheric prediction.
A sampling algorithm for segregation analysis
Tier, Bruce; Henshall, John
2001-01-01
Methods for detecting quantitative trait loci (QTL) without markers have generally used iterative peeling algorithms for determining genotype probabilities. These algorithms have considerable shortcomings in complex pedigrees. A Markov chain Monte Carlo (MCMC) method which samples the pedigree of the whole population jointly is described. Simultaneous sampling of the pedigree was achieved by sampling descent graphs using the Metropolis-Hastings algorithm. A descent graph describes the inheritance state of each allele and provides pedigrees guaranteed to be consistent with Mendelian sampling. Sampling descent graphs overcomes most, if not all, of the limitations incurred by iterative peeling algorithms. The algorithm was able to find the QTL in most of the simulated populations. However, when the QTL was not modeled or found, its effect was ascribed to the polygenic component. No QTL were detected when they were not simulated. PMID:11742631
List-Based Simulated Annealing Algorithm for Traveling Salesman Problem.
Zhan, Shi-hua; Lin, Juan; Zhang, Ze-jun; Zhong, Yi-wen
2016-01-01
Simulated annealing (SA) is a popular intelligent optimization algorithm which has been successfully applied in many fields. Parameter setting is a key factor for its performance, but it is also tedious work. To simplify parameter setting, we present a list-based simulated annealing (LBSA) algorithm to solve the traveling salesman problem (TSP). The LBSA algorithm uses a novel list-based cooling schedule to control the decrease of temperature. Specifically, a list of temperatures is created first, and then the maximum temperature in the list is used by the Metropolis acceptance criterion to decide whether to accept a candidate solution. The temperature list is adapted iteratively according to the topology of the solution space of the problem. The effectiveness and the parameter sensitivity of the list-based cooling schedule are illustrated through benchmark TSP instances. The LBSA algorithm, whose performance is robust over a wide range of parameter values, shows competitive performance compared with some other state-of-the-art algorithms.
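The cooling-schedule idea translates into a short sketch. The instance (six cities on a circle), list length, and refresh rule are invented for the illustration; the paper's exact bookkeeping differs.

```python
import heapq
import math
import random

def lbsa_tsp(dist, temps, iters, rng):
    """List-based SA for TSP: the current maximum of the temperature
    list drives the Metropolis test, and each accepted uphill move
    replaces that maximum with the (lower) temperature that would
    have just accepted the move."""
    n = len(dist)
    tour = list(range(n))
    rng.shuffle(tour)

    def length(t):
        return sum(dist[t[k]][t[(k + 1) % n]] for k in range(n))

    cur = length(tour)
    heap = [-t for t in temps]          # max-heap via negation
    heapq.heapify(heap)
    for _ in range(iters):
        t_max = -heap[0]
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt reversal
        new = length(cand)
        d = new - cur
        if d <= 0:
            tour, cur = cand, new       # downhill moves are always taken
        else:
            r = rng.random()
            if r < math.exp(-d / t_max):
                # Temperature at which this uphill move is borderline.
                heapq.heapreplace(heap, -(d / -math.log(max(r, 1e-300))))
                tour, cur = cand, new
    return tour, cur

rng = random.Random(8)
n = 6
pts = [(math.cos(2 * math.pi * k / n), math.sin(2 * math.pi * k / n))
       for k in range(n)]               # six cities on the unit circle
dist = [[math.dist(p, q) for q in pts] for p in pts]
tour, cur = lbsa_tsp(dist, temps=[1.0] * 30, iters=2000, rng=rng)
```

Because inserted temperatures are always below the removed maximum, the list cools itself as the search proceeds, replacing a hand-tuned cooling rate.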
77 FR 29697 - Honeywell Metropolis Works; Grant of Exemption for Honeywell Metropolis Works License
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-18
... NUCLEAR REGULATORY COMMISSION [Docket No. 40-3392; NRC-2012-0111] Honeywell Metropolis Works; Grant of Exemption for Honeywell Metropolis Works License AGENCY: Nuclear Regulatory Commission. ACTION...
NASA Astrophysics Data System (ADS)
Vrugt, Jasper A.; Beven, Keith J.
2018-04-01
This essay illustrates some recent developments to the DiffeRential Evolution Adaptive Metropolis (DREAM) MATLAB toolbox of Vrugt (2016) to delineate and sample the behavioural solution space of set-theoretic likelihood functions used within the GLUE (Limits of Acceptability) framework (Beven and Binley, 1992, 2014; Beven and Freer, 2001; Beven, 2006). This work builds on the DREAM(ABC) algorithm of Sadegh and Vrugt (2014) and enhances significantly the accuracy and CPU-efficiency of Bayesian inference with GLUE. In particular it is shown how lack of adequate sampling in the model space might lead to unjustified model rejection.
Discrete Spin Vector Approach for Monte Carlo-based Magnetic Nanoparticle Simulations
NASA Astrophysics Data System (ADS)
Senkov, Alexander; Peralta, Juan; Sahay, Rahul
The study of magnetic nanoparticles has gained significant popularity due to their potential uses in many fields such as modern medicine, electronics, and engineering. To study the magnetic behavior of these particles in depth, it is important to be able to model and simulate their magnetic properties efficiently. Here we utilize the Metropolis-Hastings algorithm with a discrete spin vector model (in contrast to the standard continuous model) to model the magnetic hysteresis of a set of protected pure iron nanoparticles. We compare our simulations with the experimental hysteresis curves and discuss the efficiency of our algorithm.
Approximate Bayesian Computation by Subset Simulation using hierarchical state-space models
NASA Astrophysics Data System (ADS)
Vakilzadeh, Majid K.; Huang, Yong; Beck, James L.; Abrahamsson, Thomas
2017-02-01
A new multi-level Markov Chain Monte Carlo algorithm for Approximate Bayesian Computation, ABC-SubSim, has recently appeared that exploits the Subset Simulation method for efficient rare-event simulation. ABC-SubSim adaptively creates a nested decreasing sequence of data-approximating regions in the output space that correspond to increasingly closer approximations of the observed output vector in this output space. At each level, multiple samples of the model parameter vector are generated by a component-wise Metropolis algorithm so that the predicted output corresponding to each parameter value falls in the current data-approximating region. Theoretically, if continued to the limit, the sequence of data-approximating regions would converge on to the observed output vector and the approximate posterior distributions, which are conditional on the data-approximation region, would become exact, but this is not practically feasible. In this paper we study the performance of the ABC-SubSim algorithm for Bayesian updating of the parameters of dynamical systems using a general hierarchical state-space model. We note that the ABC methodology gives an approximate posterior distribution that actually corresponds to an exact posterior where a uniformly distributed combined measurement and modeling error is added. We also note that ABC algorithms have a problem with learning the uncertain error variances in a stochastic state-space model and so we treat them as nuisance parameters and analytically integrate them out of the posterior distribution. In addition, the statistical efficiency of the original ABC-SubSim algorithm is improved by developing a novel strategy to regulate the proposal variance for the component-wise Metropolis algorithm at each level. 
We demonstrate that Self-regulated ABC-SubSim is well suited for Bayesian system identification by first applying it successfully to model updating of a two degree-of-freedom linear structure for three cases: globally, locally and un-identifiable model classes, and then to model updating of a two degree-of-freedom nonlinear structure with Duffing nonlinearities in its interstory force-deflection relationship.
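A generic component-wise Metropolis pass of the kind used within each subset level can be sketched as follows. The ABC constraint that the predicted output fall in the current data-approximating region is represented here by an indicator function, and all names are illustrative assumptions, not the paper's implementation:

```python
import math
import random

def componentwise_metropolis(theta, log_prior, in_region, steps, rng):
    """One component-wise Metropolis pass: perturb each coordinate in
    turn with a Gaussian step and accept only if the prior ratio passes
    and the model prediction stays inside the current data-approximating
    region (the ABC constraint, represented by the in_region indicator)."""
    theta = list(theta)
    for j in range(len(theta)):
        cand = list(theta)
        cand[j] += rng.gauss(0.0, steps[j])
        accept = math.log(rng.random()) < log_prior(cand) - log_prior(theta)
        if accept and in_region(cand):
            theta = cand
    return theta
```

The self-regulation proposed in the paper would adjust the per-component `steps` at each level to keep the acceptance rate in a productive range; that rule is not reproduced here.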
Building test data from real outbreaks for evaluating detection algorithms.
Texier, Gaetan; Jackson, Michael L; Siwe, Leonel; Meynard, Jean-Baptiste; Deparis, Xavier; Chaudet, Herve
2017-01-01
Benchmarking surveillance systems requires realistic simulations of disease outbreaks. However, obtaining these data in sufficient quantity, with a realistic shape and covering a sufficient range of agents, size and duration, is known to be very difficult. The dataset of outbreak signals generated should reflect the likely distribution of authentic situations faced by the surveillance system, including very unlikely outbreak signals. We propose and evaluate a new approach based on the use of historical outbreak data to simulate tailored outbreak signals. The method relies on a homothetic transformation of the historical distribution followed by resampling processes (Binomial, Inverse Transform Sampling Method-ITSM, Metropolis-Hasting Random Walk, Metropolis-Hasting Independent, Gibbs Sampler, Hybrid Gibbs Sampler). We carried out an analysis to identify the most important input parameters for simulation quality and to evaluate performance for each of the resampling algorithms. Our analysis confirms the influence of the type of algorithm used and simulation parameters (i.e. days, number of cases, outbreak shape, overall scale factor) on the results. We show that, regardless of the outbreaks, algorithms and metrics chosen for the evaluation, simulation quality decreased with the increase in the number of days simulated and increased with the number of cases simulated. Simulating outbreaks with fewer cases than days of duration (i.e. overall scale factor less than 1) resulted in an important loss of information during the simulation. We found that Gibbs sampling with a shrinkage procedure provides a good balance between accuracy and data dependency. If dependency is of little importance, binomial and ITSM methods are accurate. Given the constraint of keeping the simulation within a range of plausible epidemiological curves faced by the surveillance system, our study confirms that our approach can be used to generate a large spectrum of outbreak signals.
Modeling and Bayesian parameter estimation for shape memory alloy bending actuators
NASA Astrophysics Data System (ADS)
Crews, John H.; Smith, Ralph C.
2012-04-01
In this paper, we employ a homogenized energy model (HEM) for shape memory alloy (SMA) bending actuators. Additionally, we utilize a Bayesian method for quantifying parameter uncertainty. The system consists of an SMA wire attached to a flexible beam. As the actuator is heated, the beam bends, providing endoscopic motion. The model parameters are fit to experimental data using an ordinary least-squares approach. The uncertainty in the fit model parameters is then quantified using Markov Chain Monte Carlo (MCMC) methods. The MCMC algorithm provides bounds on the parameters, which will ultimately be used in robust control algorithms. One purpose of the paper is to test the feasibility of the Random Walk Metropolis algorithm, the MCMC method used here.
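The Random Walk Metropolis algorithm whose feasibility the paper tests can be sketched in one dimension as follows; the standard-normal target below is an illustrative stand-in for the SMA parameter posterior, and the step size is an assumption:

```python
import math
import random

def random_walk_metropolis(log_post, theta0, step, n_samples, seed=0):
    """Random Walk Metropolis: Gaussian proposal centred on the current
    state, accepted with probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    theta, lp = theta0, log_post(theta0)
    chain = []
    for _ in range(n_samples):
        cand = theta + rng.gauss(0.0, step)
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:
            theta, lp = cand, lp_cand
        chain.append(theta)
    return chain

# Illustrative run on a standard-normal "posterior" (a stand-in for the
# SMA parameter posterior in the paper)
chain = random_walk_metropolis(lambda x: -0.5 * x * x, 0.0, 1.0, 5000)
```

Credible bounds on a parameter then follow directly from quantiles of the chain, which is how the sampled posterior would feed into a robust control design.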
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-19
... NUCLEAR REGULATORY COMMISSION [Docket No. 40-3392-MLA; ASLBP No. 11-910-01-MLA-BD01] Atomic Safety and Licensing Board; Honeywell International, Inc.; Metropolis Works Uranium Conversion Facility... assurance for its Metropolis Works uranium conversion facility in Metropolis, Illinois. \\1\\ LBP-11-19, 74...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-07
...., Metropolis Works; License Amendment Request and Request for a Hearing AGENCY: Nuclear Regulatory Commission... surface impoundment decommissioning plan at its Metropolis Works Facility site located in Metropolis... information. With respect to copyrighted works, except for limited excerpts that serve the purpose of the...
Statistical hadronization and microcanonical ensemble
Becattini, F.; Ferroni, L.
2004-01-01
We present a Monte Carlo calculation of the microcanonical ensemble of the ideal hadron-resonance gas including all known states up to a mass of 1.8 GeV, taking into account quantum statistics. The computing method is a development of a previous one based on a Metropolis Monte Carlo algorithm, with the grand-canonical limit of the multi-species multiplicity distribution as the proposal matrix. The microcanonical average multiplicities of the various hadron species are found to converge to the canonical ones for moderately low values of the total energy. This algorithm opens the way for event generators based on the statistical hadronization model.
Ising antiferromagnet on the Archimedean lattices.
Yu, Unjong
2015-06-01
Geometric frustration effects were studied systematically with the Ising antiferromagnet on the 11 Archimedean lattices using Monte Carlo methods. The Wang-Landau algorithm for static properties (specific heat and residual entropy) and the Metropolis algorithm for a freezing order parameter were adopted. The exact residual entropy was also found. Based on the degree of frustration and dynamic properties, their ground states were determined. The Shastry-Sutherland lattice and the trellis lattice are weakly frustrated and have two- and one-dimensional long-range-ordered ground states, respectively. The bounce, maple-leaf, and star lattices have the spin ice phase. The spin liquid phase appears in the triangular and kagome lattices.
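The single-spin-flip Metropolis update used for the freezing order parameter can be sketched as follows; a periodic square lattice is used here as an illustrative stand-in for the Archimedean lattices, and all names are assumptions:

```python
import math
import random

def metropolis_sweep(spins, L, J, T, rng):
    """One Metropolis sweep of single-spin-flip updates on an L x L
    periodic square lattice. J > 0 is antiferromagnetic with energy
    E = J * sum_<ij> s_i s_j over nearest-neighbour bonds."""
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = -2.0 * J * spins[i][j] * nn   # energy change of flipping s_ij
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            spins[i][j] *= -1
```

On a frustrated lattice the same kernel applies; only the neighbour sum changes, which is why the degree of frustration, rather than the update rule, controls whether the dynamics freeze.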
Nanothermodynamics Applied to Thermal Processes in Heterogeneous Materials
2012-08-03
Models agree favorably with a wide range of measurements of local thermal and dynamic properties. Monte Carlo (MC) simulations of the Ising model are reported; the solid black lines in Fig. 4 show results using the uncorrected (Metropolis) algorithm, compared with the parameter values g = 0.5 (green, dash-dot), g = 1 (black, solid), and g = 2 (blue, dash-dot-dot). Note the failure of the standard Ising model (g = 0) to match.
Wang, Hailong; Sun, Yuqiu; Su, Qinghua; Xia, Xuewen
2018-01-01
The backtracking search optimization algorithm (BSA) is a population-based evolutionary algorithm for numerical optimization problems. BSA has a powerful global exploration capacity while its local exploitation capability is relatively poor. This affects the convergence speed of the algorithm. In this paper, we propose a modified BSA inspired by simulated annealing (BSAISA) to overcome the deficiency of BSA. In the BSAISA, the amplitude control factor (F) is modified based on the Metropolis criterion in simulated annealing. The redesigned F could be adaptively decreased as the number of iterations increases and it does not introduce extra parameters. A self-adaptive ε-constrained method is used to handle the strict constraints. We compared the performance of the proposed BSAISA with BSA and other well-known algorithms when solving thirteen constrained benchmarks and five engineering design problems. The simulation results demonstrated that BSAISA is more effective than BSA and more competitive with other well-known algorithms in terms of convergence speed. PMID:29666635
NASA Astrophysics Data System (ADS)
Lu, Dan; Ricciuto, Daniel; Walker, Anthony; Safta, Cosmin; Munger, William
2017-09-01
Calibration of terrestrial ecosystem models is important but challenging. Bayesian inference implemented by Markov chain Monte Carlo (MCMC) sampling provides a comprehensive framework to estimate model parameters and associated uncertainties using their posterior distributions. The effectiveness and efficiency of the method strongly depend on the MCMC algorithm used. In this work, a differential evolution adaptive Metropolis (DREAM) algorithm is used to estimate posterior distributions of 21 parameters for the data assimilation linked ecosystem carbon (DALEC) model using 14 years of daily net ecosystem exchange data collected at the Harvard Forest Environmental Measurement Site eddy-flux tower. The calibration of DREAM results in a better model fit and predictive performance compared to the popular adaptive Metropolis (AM) scheme. Moreover, DREAM indicates that two parameters controlling autumn phenology have multiple modes in their posterior distributions while AM only identifies one mode. The application suggests that DREAM is very suitable to calibrate complex terrestrial ecosystem models, where the uncertain parameter size is usually large and existence of local optima is always a concern. In addition, this effort justifies the assumptions of the error model used in Bayesian calibration according to the residual analysis. The result indicates that a heteroscedastic, correlated, Gaussian error model is appropriate for the problem, and the consequent constructed likelihood function can alleviate the underestimation of parameter uncertainty that is usually caused by using uncorrelated error models.
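The core chain move of DREAM, a differential-evolution jump accepted by a Metropolis step, can be sketched as follows. This is a simplified version without DREAM's crossover, multi-pair jumps, or outlier handling; all names are illustrative:

```python
import math
import random

def de_proposal(chains, i, gamma, eps, rng):
    """Differential-evolution proposal for chain i: jump along the
    difference of two other randomly chosen chains, plus small noise."""
    others = [k for k in range(len(chains)) if k != i]
    r1, r2 = rng.sample(others, 2)
    return [chains[i][j] + gamma * (chains[r1][j] - chains[r2][j])
            + rng.gauss(0.0, eps) for j in range(len(chains[i]))]

def dream_step(chains, log_post, rng, gamma=None, eps=1e-6):
    """One Metropolis-accepted differential-evolution move per chain."""
    d = len(chains[0])
    g = gamma if gamma is not None else 2.38 / math.sqrt(2 * d)
    for i in range(len(chains)):
        cand = de_proposal(chains, i, g, eps, rng)
        if math.log(rng.random()) < log_post(cand) - log_post(chains[i]):
            chains[i] = cand
```

Because proposals are built from differences between chains, the jump scale and orientation adapt automatically to the posterior, which is what lets the scheme find both modes of a multimodal parameter where a fixed Gaussian proposal finds one.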
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Dan; Ricciuto, Daniel; Walker, Anthony
Calibration of terrestrial ecosystem models is important but challenging. Bayesian inference implemented by Markov chain Monte Carlo (MCMC) sampling provides a comprehensive framework to estimate model parameters and associated uncertainties using their posterior distributions. The effectiveness and efficiency of the method strongly depend on the MCMC algorithm used. In this study, a Differential Evolution Adaptive Metropolis (DREAM) algorithm was used to estimate posterior distributions of 21 parameters for the data assimilation linked ecosystem carbon (DALEC) model using 14 years of daily net ecosystem exchange data collected at the Harvard Forest Environmental Measurement Site eddy-flux tower. DREAM is a multi-chain method that uses a differential evolution technique for chain movement, allowing it to be efficiently applied to high-dimensional problems, and can reliably estimate heavy-tailed and multimodal distributions that are difficult for single-chain schemes using a Gaussian proposal distribution. The results were evaluated against the popular Adaptive Metropolis (AM) scheme. DREAM indicated that two parameters controlling autumn phenology have multiple modes in their posterior distributions while AM only identified one mode. The calibration of DREAM resulted in a better model fit and predictive performance compared to the AM. DREAM provides means for a good exploration of the posterior distributions of model parameters. Lastly, it reduces the risk of false convergence to a local optimum and potentially improves the predictive performance of the calibrated model.
Peltola, Tomi; Marttinen, Pekka; Vehtari, Aki
2012-01-01
High-dimensional datasets with large amounts of redundant information are nowadays available for hypothesis-free exploration of scientific questions. A particular case is genome-wide association analysis, where variations in the genome are searched for effects on disease or other traits. Bayesian variable selection has been demonstrated as a possible analysis approach, which can account for the multifactorial nature of the genetic effects in a linear regression model. Yet, the computation presents a challenge and application to large-scale data is not routine. Here, we study aspects of the computation using the Metropolis-Hastings algorithm for the variable selection: finite adaptation of the proposal distributions, multistep moves for changing the inclusion state of multiple variables in a single proposal and multistep move size adaptation. We also experiment with a delayed rejection step for the multistep moves. Results on simulated and real data show an increase in sampling efficiency. We also demonstrate that with application-specific proposals, the approach can overcome a specific mixing problem in real data with 3822 individuals and 1,051,811 single nucleotide polymorphisms and uncover a variant pair with synergistic effect on the studied trait. Moreover, we illustrate multimodality in the real dataset related to a restrictive prior distribution on the genetic effect sizes and advocate a more flexible alternative. PMID:23166669
Rényi information flow in the Ising model with single-spin dynamics.
Deng, Zehui; Wu, Jinshan; Guo, Wenan
2014-12-01
The n-index Rényi mutual information and transfer entropies for the two-dimensional kinetic Ising model with arbitrary single-spin dynamics in the thermodynamic limit are derived as functions of ensemble averages of observables and spin-flip probabilities. Cluster Monte Carlo algorithms with different dynamics from the single-spin dynamics are thus applicable to estimate the transfer entropies. By means of Monte Carlo simulations with the Wolff algorithm, we calculate the information flows in the Ising model with the Metropolis dynamics and the Glauber dynamics, respectively. We find that not only the global Rényi transfer entropy, but also the pairwise Rényi transfer entropy, peaks in the disorder phase.
De Sanctis, Veronica; La Terra, Sabrina; Bianchi, Alessandro; Shore, David; Burderi, Luciano; Di Mauro, Ernesto; Negri, Rodolfo
2002-04-26
We have analyzed in detail the structure of RAP1-UAS(RPG) complexes in Saccharomyces cerevisiae cells using multi-hit KMnO(4), UV and micrococcal nuclease high-resolution footprinting. Three copies of the Rap1 protein are bound to the promoter simultaneously in exponentially growing cells, as shown by KMnO(4) multi-hit footprinting analysis, causing extended and diagnostic changes in the DNA structure of the region containing the UAS(RPG). Amino acid starvation does not cause loss of Rap1p from the complex; however, in vivo UV-footprinting reveals the occurrence of structural modifications of the complex. Moreover, low-resolution micrococcal nuclease digestion shows that the chromatin of the entire region is devoid of positioned nucleosomes but is susceptible to changes in accessibility to the nuclease upon amino acid starvation. The implications of these results for the mechanism of Rap1p action are discussed. (c) 2002 Elsevier Science Ltd.
Advance in multi-hit detection and quantization in atom probe tomography.
Da Costa, G; Wang, H; Duguay, S; Bostel, A; Blavette, D; Deconihout, B
2012-12-01
The preferential retention of high evaporation field chemical species at the sample surface in atom-probe tomography (e.g., boron in silicon or in metallic alloys) leads to correlated field evaporation and pronounced pile-up effects on the detector. The latter severely affects the reliability of concentration measurements of current 3D atom probes leading to an under-estimation of the concentrations of the high-field species. The multi-hit capabilities of the position-sensitive time-resolved detector is shown to play a key role. An innovative method based on Fourier space signal processing of signals supplied by an advance delay-line position-sensitive detector is shown to drastically improve the time resolving power of the detector and consequently its capability to detect multiple events. Results show that up to 30 ions on the same evaporation pulse can be detected and properly positioned. The major impact of this new method on the quantization of chemical composition in materials, particularly in highly-doped Si(B) samples is highlighted.
Coincidence ion imaging with a fast frame camera
NASA Astrophysics Data System (ADS)
Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander H.; Fan, Lin; Li, Wen
2014-12-01
A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide semiconductor) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from a fast frame camera through real-time centroiding while the arrival times are obtained from the timing signal of a PMT processed by a high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum of a PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real-time at 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragment pair (methyl and iodine cations) produced from strong field dissociative double ionization of methyl iodide.
Ising model simulation in directed lattices and networks
NASA Astrophysics Data System (ADS)
Lima, F. W. S.; Stauffer, D.
2006-01-01
On directed lattices, with half as many neighbours as in the usual undirected lattices, the Ising model does not seem to show a spontaneous magnetisation, at least for lower dimensions. Instead, the decay time for flipping of the magnetisation follows an Arrhenius law on the square and simple cubic lattice. On directed Barabási-Albert networks with two and seven neighbours selected by each added site, Metropolis and Glauber algorithms give similar results, while for Wolff cluster flipping the magnetisation decays exponentially with time.
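The Metropolis and Glauber dynamics compared above differ only in the single-spin flip probability as a function of the energy change ΔE; both satisfy detailed balance for the Ising model. A minimal sketch:

```python
import math

def metropolis_flip_prob(dE, T):
    """Metropolis dynamics: flip with probability min(1, exp(-dE/T))."""
    return min(1.0, math.exp(-dE / T))

def glauber_flip_prob(dE, T):
    """Glauber dynamics: flip with probability 1 / (1 + exp(dE/T))."""
    return 1.0 / (1.0 + math.exp(dE / T))
```

For any ΔE > 0 the Glauber probability is strictly below the Metropolis one, and at ΔE = 0 Glauber flips with probability 1/2 while Metropolis always flips; these differences alter the relaxation dynamics but not the equilibrium distribution, consistent with the two algorithms giving similar results on the directed networks.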
Chen, Yunjie; Roux, Benoît
2014-09-21
Hybrid schemes combining the strength of molecular dynamics (MD) and Metropolis Monte Carlo (MC) offer a promising avenue to improve the sampling efficiency of computer simulations of complex systems. A number of recently proposed hybrid methods consider new configurations generated by driving the system via a non-equilibrium MD (neMD) trajectory, which are subsequently treated as putative candidates for Metropolis MC acceptance or rejection. To obey microscopic detailed balance, it is necessary to alter the momentum of the system at the beginning and/or the end of the neMD trajectory. This strict rule then guarantees that the random walk in configurational space generated by such hybrid neMD-MC algorithm will yield the proper equilibrium Boltzmann distribution. While a number of different constructs are possible, the most commonly used prescription has been to simply reverse the momenta of all the particles at the end of the neMD trajectory ("one-end momentum reversal"). Surprisingly, it is shown here that the choice of momentum reversal prescription can have a considerable effect on the rate of convergence of the hybrid neMD-MC algorithm, with the simple one-end momentum reversal encountering particularly acute problems. In these neMD-MC simulations, different regions of configurational space end up being essentially isolated from one another due to a very small transition rate between regions. In the worst-case scenario, it is almost as if the configurational space does not constitute a single communicating class that can be sampled efficiently by the algorithm, and extremely long neMD-MC simulations are needed to obtain proper equilibrium probability distributions. To address this issue, a novel momentum reversal prescription, symmetrized with respect to both the beginning and the end of the neMD trajectory ("symmetric two-ends momentum reversal"), is introduced. 
Illustrative simulations demonstrate that the hybrid neMD-MC algorithm robustly yields a correct equilibrium probability distribution with this prescription.
Radiocytogenetic effects on bone marrow cells of opossum in vivo
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prasad, N.; Bushong, S.C.; MacIntyre, R.S.
1973-03-01
Bone marrow cells of the opossum, Didelphis virginiana, were examined 24 hr following a whole-body 60Co radiation dose of 100, and the chromosomal aberrations resulted in a radiation sensitivity of 0.000605 aberrations/cell/rad² for single-hit and multihit type damage. (auth)
Comparison of sampling techniques for Bayesian parameter estimation
NASA Astrophysics Data System (ADS)
Allison, Rupert; Dunkley, Joanna
2014-02-01
The posterior probability distribution for a set of model parameters encodes all that the data have to tell us in the context of a given model; it is the fundamental quantity for Bayesian parameter estimation. In order to infer the posterior probability distribution we have to decide how to explore parameter space. Here we compare three prescriptions for how parameter space is navigated, discussing their relative merits. We consider Metropolis-Hastings sampling, nested sampling and affine-invariant ensemble Markov chain Monte Carlo (MCMC) sampling. We focus on their performance on toy-model Gaussian likelihoods and on a real-world cosmological data set. We outline the sampling algorithms themselves and elaborate on performance diagnostics such as convergence time, scope for parallelization, dimensional scaling, requisite tunings and suitability for non-Gaussian distributions. We find that nested sampling delivers high-fidelity estimates for posterior statistics at low computational cost, and should be adopted in favour of Metropolis-Hastings in many cases. Affine-invariant MCMC is competitive when computing clusters can be utilized for massive parallelization. Affine-invariant MCMC and existing extensions to nested sampling naturally probe multimodal and curving distributions.
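For reference, the random-walk Metropolis-Hastings sampler compared above reduces to a few lines. The sketch below runs it on a toy one-dimensional Gaussian likelihood (illustrative only; the cosmological analysis in the abstract uses full likelihood pipelines):

```python
import math, random

def metropolis_hastings(log_post, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings: propose x' ~ N(x, step^2) and accept
    with probability min(1, post(x') / post(x))."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)       # symmetric proposal
        lp_prop = log_post(prop)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop             # accept
        chain.append(x)                        # on reject, the old x repeats
    return chain

# Toy Gaussian likelihood with mean 3 and standard deviation 2
chain = metropolis_hastings(lambda x: -0.5 * ((x - 3.0) / 2.0) ** 2,
                            0.0, 50000, step=2.0)
```

The `step` scale is the "requisite tuning" the abstract refers to: too small and the chain diffuses slowly, too large and nearly every proposal is rejected.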
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Dan; Ricciuto, Daniel M.; Walker, Anthony P.
2017-09-27
Calibration of terrestrial ecosystem models is important but challenging. Bayesian inference implemented by Markov chain Monte Carlo (MCMC) sampling provides a comprehensive framework to estimate model parameters and associated uncertainties using their posterior distributions. The effectiveness and efficiency of the method strongly depend on the MCMC algorithm used. In this work, a differential evolution adaptive Metropolis (DREAM) algorithm is used to estimate posterior distributions of 21 parameters for the data assimilation linked ecosystem carbon (DALEC) model using 14 years of daily net ecosystem exchange data collected at the Harvard Forest Environmental Measurement Site eddy-flux tower. The calibration of DREAM results in a better model fit and predictive performance compared to the popular adaptive Metropolis (AM) scheme. Moreover, DREAM indicates that two parameters controlling autumn phenology have multiple modes in their posterior distributions while AM only identifies one mode. The application suggests that DREAM is very suitable to calibrate complex terrestrial ecosystem models, where the uncertain parameter size is usually large and existence of local optima is always a concern. In addition, this effort justifies the assumptions of the error model used in Bayesian calibration according to the residual analysis. Here, the result indicates that a heteroscedastic, correlated, Gaussian error model is appropriate for the problem, and the consequent constructed likelihood function can alleviate the underestimation of parameter uncertainty that is usually caused by using uncorrelated error models.
Stochastic evaluation of second-order many-body perturbation energies.
Willow, Soohaeng Yoo; Kim, Kwang S; Hirata, So
2012-11-28
With the aid of the Laplace transform, the canonical expression of the second-order many-body perturbation correction to an electronic energy is converted into the sum of two 13-dimensional integrals, the 12-dimensional parts of which are evaluated by Monte Carlo integration. Weight functions are identified that are analytically normalizable, are finite and non-negative everywhere, and share the same singularities as the integrands. They thus generate appropriate distributions of four-electron walkers via the Metropolis algorithm, yielding correlation energies of small molecules within a few mE_h of the correct values after 10^8 Monte Carlo steps. This algorithm does away with the integral transformation as the hotspot of the usual algorithms, has a far superior size dependence of cost, does not suffer from the sign problem of some quantum Monte Carlo methods, and is potentially easy to parallelize and extend to other more complex electron-correlation theories.
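The core trick, sampling walkers from a normalized weight function with the Metropolis algorithm and averaging the ratio of integrand to weight, can be sketched in one dimension. The weight and integrand below are illustrative stand-ins, not the 13-dimensional MP2 integrands of the paper:

```python
import math, random

def metropolis_integrate(f, w, log_w, x0, n, step=1.0, seed=0):
    """Estimate I = integral of f(x) dx by sampling x ~ w(x) with a Metropolis
    walker and averaging f(x)/w(x); w must be a normalized density that is
    positive wherever f is nonzero (as the abstract requires of its weights)."""
    rng = random.Random(seed)
    x, lw = x0, log_w(x0)
    total = 0.0
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        lw_prop = log_w(prop)
        if rng.random() < math.exp(min(0.0, lw_prop - lw)):
            x, lw = prop, lw_prop
        total += f(x) / w(x)
    return total / n

# Toy check: the integral of exp(-x^2) over the real line is sqrt(pi).
f = lambda x: math.exp(-x * x)
w = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)  # N(0,1) weight
log_w = lambda x: -0.5 * x * x
estimate = metropolis_integrate(f, w, log_w, 0.0, 100000)
```

A weight that shares the integrand's singularities keeps the ratio f/w bounded, which is what keeps the variance of this estimator under control.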
Simulating the Rayleigh-Taylor instability with the Ising model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ball, Justin R.; Elliott, James B.
2011-08-26
The Ising model, implemented with the Metropolis algorithm and Kawasaki dynamics, makes a system with its own physics, distinct from the real world. These physics are sophisticated enough to model behavior similar to the Rayleigh-Taylor instability, and by better understanding these physics we can learn how to modify the system to better reflect reality. For example, we could add a v_x and a v_y to each spin and modify the exchange rules to incorporate them, possibly using two-body scattering laws to construct a more realistic system.
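A minimal sketch of Kawasaki dynamics with Metropolis acceptance is shown below: spins are exchanged rather than flipped, so total magnetization, the conserved "mass" that makes Rayleigh-Taylor-like behavior possible, never changes. This is an illustrative toy, not the authors' simulation code:

```python
import math, random

def kawasaki_sweep(spins, L, beta, rng):
    """One Metropolis sweep with Kawasaki (spin-exchange) dynamics on an
    L x L periodic lattice; the total magnetization is conserved."""
    def neighbors(i, j):
        return [((i + 1) % L, j), ((i - 1) % L, j),
                (i, (j + 1) % L), (i, (j - 1) % L)]
    def local_e(i, j):
        # Ferromagnetic bond energy of site (i, j) with its four neighbors
        return -spins[i][j] * sum(spins[a][b] for a, b in neighbors(i, j))
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        a, b = rng.choice(neighbors(i, j))
        if spins[i][j] == spins[a][b]:
            continue  # exchanging equal spins changes nothing
        e_old = local_e(i, j) + local_e(a, b)
        spins[i][j], spins[a][b] = spins[a][b], spins[i][j]
        d_e = local_e(i, j) + local_e(a, b) - e_old
        if d_e > 0 and rng.random() >= math.exp(-beta * d_e):
            # Metropolis reject: swap the pair back
            spins[i][j], spins[a][b] = spins[a][b], spins[i][j]

# Demo: a checkerboard start (zero net magnetization) evolved for 5 sweeps
L = 8
rng = random.Random(42)
spins = [[1 if (i + j) % 2 == 0 else -1 for j in range(L)] for i in range(L)]
for _ in range(5):
    kawasaki_sweep(spins, L, beta=0.6, rng=rng)
```

The shared bond between the exchanged pair appears in both local energies before and after the swap, but its contribution is unchanged by the exchange, so the double counting cancels in the energy difference.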
Formulation of the Multi-Hit Model With a Non-Poisson Distribution of Hits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vassiliev, Oleg N., E-mail: Oleg.Vassiliev@albertahealthservices.ca
2012-07-15
Purpose: We proposed a formulation of the multi-hit single-target model in which the Poisson distribution of hits was replaced by a combination of two distributions: one for the number of particles entering the target and one for the number of hits a particle entering the target produces. Such an approach reflects the fact that radiation damage is a result of two different random processes: particle emission by a radiation source and interaction of particles with matter inside the target. Methods and Materials: The Poisson distribution is well justified for the first of the two processes. The second distribution depends on how a hit is defined. To test our approach, we assumed that the second distribution was also a Poisson distribution. The two distributions combined resulted in a non-Poisson distribution. We tested the proposed model by comparing it with previously reported data for DNA single- and double-strand breaks induced by protons and electrons, for survival of a range of cell lines, and variation of the initial slopes of survival curves with radiation quality for heavy-ion beams. Results: Analysis of cell survival equations for this new model showed that they had realistic properties overall, such as the initial and high-dose slopes of survival curves, the shoulder, and relative biological effectiveness (RBE). In most cases tested, a better fit of survival curves was achieved with the new model than with the linear-quadratic model. The results also suggested that the proposed approach may extend the multi-hit model beyond its traditional role in analysis of survival curves to predicting effects of radiation quality and analysis of DNA strand breaks. Conclusions: Our model, although conceptually simple, performed well in all tests. The model was able to consistently fit data for both cell survival and DNA single- and double-strand breaks. It correctly predicted the dependence of radiation effects on parameters of radiation quality.
Robertson, Kevin A.; Hsieh, Wei Yuan; Forster, Thorsten; Blanc, Mathieu; Lu, Hongjin; Crick, Peter J.; Yutuc, Eylan; Watterson, Steven; Martin, Kimberly; Griffiths, Samantha J.; Enright, Anton J.; Yamamoto, Mami; Pradeepa, Madapura M.; Lennox, Kimberly A.; Behlke, Mark A.; Talbot, Simon; Haas, Jürgen; Dölken, Lars; Griffiths, William J.; Wang, Yuqin; Angulo, Ana; Ghazal, Peter
2016-01-01
In invertebrates, small interfering RNAs are at the vanguard of cell-autonomous antiviral immunity. In contrast, antiviral mechanisms initiated by interferon (IFN) signaling predominate in mammals. Whilst mammalian IFN-induced miRNA are known to inhibit specific viruses, it is not known whether host-directed microRNAs, downstream of IFN-signaling, have a role in mediating broad antiviral resistance. By performing an integrative, systematic, global analysis of RNA turnover utilizing 4-thiouridine labeling of newly transcribed RNA and pri/pre-miRNA in IFN-activated macrophages, we identify a new post-transcriptional viral defense mechanism mediated by miR-342-5p. On the basis of ChIP and site-directed promoter mutagenesis experiments, we find the synthesis of miR-342-5p is coupled to the antiviral IFN response via the IFN-induced transcription factor, IRF1. Strikingly, we find miR-342-5p targets mevalonate-sterol biosynthesis using a multihit mechanism suppressing the pathway at different functional levels: transcriptionally via SREBF2, post-transcriptionally via miR-33, and enzymatically via IDI1 and SC4MOL. Mass spectrometry-based lipidomics and enzymatic assays demonstrate the targeting mechanisms reduce intermediate sterol pathway metabolites and total cholesterol in macrophages. These results reveal a previously unrecognized mechanism by which IFN regulates the sterol pathway. The sterol pathway is known to be an integral part of the macrophage IFN antiviral response, and we show that miR-342-5p exerts broad antiviral effects against multiple, unrelated pathogenic viruses such as Cytomegalovirus and Influenza A (H1N1). Metabolic rescue experiments confirm the specificity of these effects and demonstrate that unrelated viruses have differential mevalonate and sterol pathway requirements for their replication. This study, therefore, advances the general concept of broad antiviral defense through multihit targeting of a single host pathway. PMID:26938778
ERIC Educational Resources Information Center
Abayomi, B. O.; Oyeniyi, Pat Ola; Ainazx, O. O.
2017-01-01
The paper appraised the organization and administration of intramural sports programmes in secondary schools in Ibadan metropolis. The descriptive research design of survey type was employed for the study. The population was all secondary school students and teachers in Ibadan Metropolis. The sample consisted of 500 respondents, 40 public…
ERIC Educational Resources Information Center
Bua, Felix Terhile
2013-01-01
The study investigated the influence of school environment on the management of secondary school education in Makurdi Metropolis of Benue State. Two research questions and two hypotheses guided the study. The survey design was adopted for the study. Four hundred (400) teachers from 20 grant aided secondary schools in Markurdi Metropolis of Benue…
Coincidence ion imaging with a fast frame camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei
2014-12-15
A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide semiconductor) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from a fast frame camera through real-time centroiding while the arrival times are obtained from the timing signal of a PMT processed by a high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum of a PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real-time at a 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragment pair (methyl and iodine cations) produced from strong field dissociative double ionization of methyl iodide.
NASA Astrophysics Data System (ADS)
Smith, T.; Marshall, L.
2007-12-01
In many mountainous regions, the single most important parameter in forecasting the controls on regional water resources is snowpack (Williams et al., 1999). In an effort to bridge the gap between theoretical understanding and functional modeling of snow-driven watersheds, a flexible hydrologic modeling framework is being developed. The aim is to create a suite of models that move from parsimonious structures, concentrated on aggregated watershed response, to those focused on representing finer scale processes and distributed response. This framework will operate as a tool to investigate the link between hydrologic model predictive performance, uncertainty, model complexity, and observable hydrologic processes. Bayesian methods, and particularly Markov chain Monte Carlo (MCMC) techniques, are extremely useful in uncertainty assessment and parameter estimation of hydrologic models. However, these methods have some difficulties in implementation. In a traditional Bayesian setting, it can be difficult to reconcile multiple data types, particularly those offering different spatial and temporal coverage, depending on the model type. These difficulties are also exacerbated by sensitivity of MCMC algorithms to model initialization and complex parameter interdependencies. As a way of circumventing some of the computational complications, adaptive MCMC algorithms have been developed to take advantage of the information gained from each successive iteration. Two adaptive algorithms are compared in this study: the Adaptive Metropolis (AM) algorithm, developed by Haario et al (2001), and the Delayed Rejection Adaptive Metropolis (DRAM) algorithm, developed by Haario et al (2006). While neither algorithm is truly Markovian, it has been proven that each satisfies the desired ergodicity and stationarity properties of Markov chains.
Both algorithms were implemented as the uncertainty and parameter estimation framework for a conceptual rainfall-runoff model based on the Probability Distributed Model (PDM), developed by Moore (1985). We implement the modeling framework in Stringer Creek watershed in the Tenderfoot Creek Experimental Forest (TCEF), Montana. The snowmelt-driven watershed offers the additional challenge of modeling snow accumulation and melt, and current efforts are aimed at developing a temperature- and radiation-index snowmelt model. Auxiliary data available from within TCEF's watersheds are used to support the understanding of information value as it relates to predictive performance. Because the model is based on lumped parameters, auxiliary data are hard to incorporate directly. However, these additional data offer benefits through the ability to inform prior distributions of the lumped model parameters. By incorporating data offering different information into the uncertainty assessment process, a cross-validation technique is engaged to better ensure that modeled results reflect real process complexity.
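The Adaptive Metropolis idea of Haario et al. (2001), tuning the proposal covariance from the running chain covariance after an initial non-adaptive period, can be sketched in one dimension. This is an illustrative toy, not the study's PDM framework, and DRAM's additional delayed-rejection stages are omitted:

```python
import math, random

def adaptive_metropolis(log_post, x0, n_steps, t0=100, eps=1e-6, seed=0):
    """1-D Adaptive Metropolis sketch: after t0 non-adaptive steps, the
    proposal std is set to sqrt(2.4^2 * running_var + eps), with the running
    variance maintained by Welford's online update."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = [x]
    mean, m2, count = x, 0.0, 1
    for _ in range(n_steps):
        var = m2 / (count - 1) if count > 1 else 1.0
        scale = math.sqrt(2.4 ** 2 * var + eps) if count > t0 else 1.0
        prop = x + rng.gauss(0.0, scale)
        lp_prop = log_post(prop)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop
        chain.append(x)
        count += 1                 # Welford update with the new chain state
        d = x - mean
        mean += d / count
        m2 += d * (x - mean)
    return chain

# Wide Gaussian target N(0, 5^2): adaptation should widen the proposal.
chain = adaptive_metropolis(lambda x: -0.5 * (x / 5.0) ** 2, 0.0, 50000)
```

The small `eps` floor keeps the proposal non-degenerate even if the early chain barely moves; using the full history for adaptation is why the chain is not strictly Markovian, yet still ergodic, as the abstract notes.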
Strategic guidelines of a megalopolis’s development: new industrialization and ecological tension
NASA Astrophysics Data System (ADS)
Lavrikova, Yulia; Akberdina, Victoria; Mezentseva, Elena
2017-06-01
The article is devoted to the integration of environmental concerns into the development strategy of the metropolis. The authors substantiate the relationship between new industrialization and a reduced burden on the environment. Taking a large Russian city, Ekaterinburg, as an example, they present projections and strategic directions for the ecological development of the city for the period up to 2035. The forecast is based on methods of aggregating economic sectors, functional relationships and forecasting algorithms. The article describes three scenarios for the development of Ekaterinburg and the results of calculations by the authors' method. The authors show the relationship between the industrial development of the metropolis and the anthropogenic load, which is assessed using indicators such as emissions of harmful substances into the atmosphere, discharges of sewage, and green space per inhabitant. The authors note that the ecological security of residents is directly related to improved controllability of the municipal economy, increased environmental control, reduction of the environmental burden on humans and the environment, and zoning of the city for differential application of environmental quality indicators.
Cure fraction model with random effects for regional variation in cancer survival.
Seppä, Karri; Hakulinen, Timo; Kim, Hyon-Jung; Läärä, Esa
2010-11-30
Assessing regional differences in the survival of cancer patients is important but difficult when separate regions are small or sparsely populated. In this paper, we apply a mixture cure fraction model with random effects to cause-specific survival data of female breast cancer patients collected by the population-based Finnish Cancer Registry. Two sets of random effects were used to capture the regional variation in the cure fraction and in the survival of the non-cured patients, respectively. This hierarchical model was implemented in a Bayesian framework using a Metropolis-within-Gibbs algorithm. To avoid poor mixing of the Markov chain, when the variance of either set of random effects was close to zero, posterior simulations were based on a parameter-expanded model with tailor-made proposal distributions in Metropolis steps. The random effects allowed the fitting of the cure fraction model to the sparse regional data and the estimation of the regional variation in 10-year cause-specific breast cancer survival with a parsimonious number of parameters. Before 1986, the capital of Finland clearly stood out from the rest, but since then all the 21 hospital districts have achieved approximately the same level of survival. Copyright © 2010 John Wiley & Sons, Ltd.
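The Metropolis-within-Gibbs scheme used here, one random-walk Metropolis update per parameter per sweep, can be sketched generically. The code below is a toy on an independent bivariate Gaussian target, not the cure fraction model or its tailor-made proposals:

```python
import math, random

def metropolis_within_gibbs(log_post, init, n_steps, steps=(2.5, 2.5), seed=0):
    """Each iteration sweeps over the parameters, updating one coordinate at
    a time with a random-walk Metropolis step while the others are held
    fixed; `steps` holds one proposal std per coordinate."""
    rng = random.Random(seed)
    x = list(init)
    lp = log_post(x)
    chain = []
    for _ in range(n_steps):
        for i, s in enumerate(steps):
            prop = x[:]                       # copy, perturb one coordinate
            prop[i] += rng.gauss(0.0, s)
            lp_prop = log_post(prop)
            if rng.random() < math.exp(min(0.0, lp_prop - lp)):
                x, lp = prop, lp_prop
        chain.append(x[:])
    return chain

# Toy target: independent standard bivariate normal, deliberately started
# far from the mode to show the sweep-wise updates converging.
log_post = lambda v: -0.5 * (v[0] ** 2 + v[1] ** 2)
chain = metropolis_within_gibbs(log_post, [5.0, -5.0], 20000)
```

Coordinate-wise updates like this are what mix poorly when a variance parameter collapses toward zero, which is why the paper resorts to a parameter-expanded model with tailored proposals.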
Chen, Yunjie; Roux, Benoît
2015-08-11
Molecular dynamics (MD) trajectories based on a classical equation of motion provide a straightforward, albeit somewhat inefficient, approach to explore and sample the configurational space of a complex molecular system. While a broad range of techniques can be used to accelerate and enhance the sampling efficiency of classical simulations, only algorithms that are consistent with the Boltzmann equilibrium distribution yield a proper statistical mechanical computational framework. Here, a multiscale hybrid algorithm relying simultaneously on all-atom fine-grained (FG) and coarse-grained (CG) representations of a system is designed to improve sampling efficiency by combining the strength of nonequilibrium molecular dynamics (neMD) and Metropolis Monte Carlo (MC). This CG-guided hybrid neMD-MC algorithm comprises six steps: (1) a FG configuration of an atomic system is dynamically propagated for some period of time using equilibrium MD; (2) the resulting FG configuration is mapped onto a simplified CG model; (3) the CG model is propagated for a brief time interval to yield a new CG configuration; (4) the resulting CG configuration is used as a target to guide the evolution of the FG system; (5) the FG configuration (from step 1) is driven via a nonequilibrium MD (neMD) simulation toward the CG target; (6) the resulting FG configuration at the end of the neMD trajectory is then accepted or rejected according to a Metropolis criterion before returning to step 1. A symmetric two-ends momentum reversal prescription is used for the neMD trajectories of the FG system to guarantee that the CG-guided hybrid neMD-MC algorithm obeys microscopic detailed balance and rigorously yields the equilibrium Boltzmann distribution. The enhanced sampling achieved with the method is illustrated with a model system with hindered diffusion and explicit-solvent peptide simulations.
Illustrative tests indicate that the method can yield a speedup of about 80 times for the model system and up to 21 times for polyalanine and (AAQAA)3 in water.
Markov Chain Monte Carlo Used in Parameter Inference of Magnetic Resonance Spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hock, Kiel; Earle, Keith
2016-02-06
In this paper, we use Boltzmann statistics and the maximum likelihood distribution derived from Bayes' Theorem to infer parameter values for a Pake Doublet Spectrum, a lineshape of historical significance and contemporary relevance for determining distances between interacting magnetic dipoles. A Metropolis-Hastings Markov Chain Monte Carlo algorithm is implemented and designed to find the optimum parameter set and to estimate parameter uncertainties. In conclusion, the posterior distribution allows us to define a metric on parameter space that induces a geometry with negative curvature that affects the parameter uncertainty estimates, particularly for spectra with low signal to noise.
Irreversibility and entanglement spectrum statistics in quantum circuits
NASA Astrophysics Data System (ADS)
Shaffer, Daniel; Chamon, Claudio; Hamma, Alioscia; Mucciolo, Eduardo R.
2014-12-01
We show that in a quantum system evolving unitarily under a stochastic quantum circuit the notions of irreversibility, universality of computation, and entanglement are closely related. As the state evolves from an initial product state, it gets asymptotically maximally entangled. We define irreversibility as the failure of searching for a disentangling circuit using a Metropolis-like algorithm. We show that irreversibility corresponds to Wigner-Dyson statistics in the level spacing of the entanglement eigenvalues, and that this is obtained from a quantum circuit made from a set of universal gates for quantum computation. If, on the other hand, the system is evolved with a non-universal set of gates, the statistics of the entanglement level spacing deviates from Wigner-Dyson and the disentangling algorithm succeeds. These results open a new way to characterize irreversibility in quantum systems.
Markov Chain Monte Carlo Bayesian Learning for Neural Networks
NASA Technical Reports Server (NTRS)
Goodrich, Michael S.
2011-01-01
Conventional training methods for neural networks involve starting at a random location in the solution space of the network weights, navigating an error hypersurface to reach a minimum, and sometimes using stochastic techniques (e.g., genetic algorithms) to avoid entrapment in a local minimum. It is further typically necessary to preprocess the data (e.g., normalization) to keep the training algorithm on course. Conversely, Bayesian-based learning is an epistemological approach concerned with formally updating the plausibility of competing candidate hypotheses, thereby obtaining a posterior distribution for the network weights conditioned on the available data and a prior distribution. In this paper, we developed a powerful methodology for estimating the full residual uncertainty in network weights and therefore network predictions by using a modified Jeffreys prior combined with a Metropolis Markov Chain Monte Carlo method.
Norris, Peter M; da Silva, Arlindo M
2016-07-01
A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC.
Wang, Hongrui; Wang, Cheng; Wang, Ying; ...
2017-04-05
This paper presents a Bayesian approach using the Metropolis-Hastings Markov chain Monte Carlo algorithm and applies it to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of the associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a relatively narrower reliable interval than the MLE confidence interval and thus a more precise estimate by using the related information from regional gage stations. As a result, the Bayesian MCMC method may be more favorable for uncertainty analysis and risk management.
Monte Carlo simulation of a noisy quantum channel with memory.
Akhalwaya, Ismail; Moodley, Mervlyn; Petruccione, Francesco
2015-10-01
The classical capacity of quantum channels is well understood for channels with uncorrelated noise. For the case of correlated noise, however, there are still open questions. We calculate the classical capacity of a forgetful channel constructed by Markov switching between two depolarizing channels. Techniques have previously been applied to approximate the output entropy of this channel and thus its capacity. In this paper, we use a Metropolis-Hastings Monte Carlo approach to numerically calculate the entropy. The algorithm is implemented in parallel and its performance is studied and optimized. The effects of memory on the capacity are explored and previous results are confirmed to higher precision.
Simulated Annealing in the Variable Landscape
NASA Astrophysics Data System (ADS)
Hasegawa, Manabu; Kim, Chang Ju
An experimental analysis is conducted to test whether the appropriate introduction of the smoothness-temperature schedule enhances the optimizing ability of the MASSS method, the combination of the Metropolis algorithm (MA) and the search-space smoothing (SSS) method. The test is performed on two types of random traveling salesman problems. The results show that the optimization performance of the MA is substantially improved by a single smoothing alone and slightly more by a single smoothing with cooling and by a de-smoothing process with heating. The performance is compared to that of the parallel tempering method and a clear advantage of the idea of smoothing is observed depending on the problem.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berco, Dan, E-mail: danny.barkan@gmail.com; Tseng, Tseung-Yuen, E-mail: tseng@cc.nctu.edu.tw
This study presents an evaluation method for resistive random access memory retention reliability based on the Metropolis Monte Carlo algorithm and Gibbs free energy. The method, which does not rely on a time evolution, provides an extremely efficient way to compare the relative retention properties of metal-insulator-metal structures. It requires a small number of iterations and may be used for statistical analysis. The presented approach is used to compare the relative robustness of a single layer ZrO{sub 2} device with a double layer ZnO/ZrO{sub 2} one, and obtain results which are in good agreement with experimental data.
Geometrically Constructed Markov Chain Monte Carlo Study of Quantum Spin-phonon Complex Systems
NASA Astrophysics Data System (ADS)
Suwa, Hidemaro
2013-03-01
We have developed novel Monte Carlo methods for precisely calculating quantum spin-boson models and investigated the critical phenomena of the spin-Peierls systems. Three significant methods are presented. The first is a new optimization algorithm of the Markov chain transition kernel based on the geometric weight allocation. This algorithm, for the first time, satisfies the total balance generally without imposing the detailed balance and always minimizes the average rejection rate, being better than the Metropolis algorithm. The second is the extension of the worm (directed-loop) algorithm to non-conserved particles, which cannot be treated efficiently by the conventional methods. The third is the combination with the level spectroscopy. Proposing a new gap estimator, we are successful in eliminating the systematic error of the conventional moment method. Then we have elucidated the phase diagram and the universality class of the one-dimensional XXZ spin-Peierls system. The criticality is totally consistent with the J1 -J2 model, an effective model in the antiadiabatic limit. Through this research, we have succeeded in investigating the critical phenomena of the effectively frustrated quantum spin system by the quantum Monte Carlo method without the negative sign. JSPS Postdoctoral Fellow for Research Abroad
Coincidence electron/ion imaging with a fast frame camera
NASA Astrophysics Data System (ADS)
Li, Wen; Lee, Suk Kyoung; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander; Fan, Lin
2015-05-01
A new time- and position-sensitive particle detection system based on a fast frame CMOS camera is developed for coincidence electron/ion imaging. The system is composed of three major components: a conventional microchannel plate (MCP)/phosphor screen electron/ion imager, a fast frame CMOS camera and a high-speed digitizer. The system collects the positional information of ions/electrons from the fast frame camera through real-time centroiding, while the arrival times are obtained from the timing signal of the MCPs processed by the high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of electron/ion spots on each camera frame with the peak heights on the corresponding time-of-flight (TOF) spectrum. Efficient computer algorithms are developed to process camera frames and digitizer traces in real time at a 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched pair of co-fragments (methyl and iodine cations) produced from strong-field dissociative double ionization of methyl iodide. We further show that a time resolution of 30 ps can be achieved when measuring the electron TOF spectrum, which enables the new system to achieve good energy resolution along the TOF axis.
Modelling Evolutionary Algorithms with Stochastic Differential Equations.
Heredia, Jorge Pérez
2017-11-20
There has been renewed interest in modelling the behaviour of evolutionary algorithms (EAs) by more traditional mathematical objects, such as ordinary differential equations or Markov chains. The advantage is that the analysis becomes greatly facilitated due to the existence of well established methods. However, this typically comes at the cost of disregarding information about the process. Here, we introduce the use of stochastic differential equations (SDEs) for the study of EAs. SDEs can produce simple analytical results for the dynamics of stochastic processes, unlike Markov chains which can produce rigorous but unwieldy expressions about the dynamics. On the other hand, unlike ordinary differential equations (ODEs), they do not discard information about the stochasticity of the process. We show that these are especially suitable for the analysis of fixed budget scenarios and present analogues of the additive and multiplicative drift theorems from runtime analysis. In addition, we derive a new more general multiplicative drift theorem that also covers non-elitist EAs. This theorem simultaneously allows for positive and negative results, providing information on the algorithm's progress even when the problem cannot be optimised efficiently. Finally, we provide results for some well-known heuristics namely Random Walk (RW), Random Local Search (RLS), the (1+1) EA, the Metropolis Algorithm (MA), and the Strong Selection Weak Mutation (SSWM) algorithm.
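As a concrete point of reference for the drift theorems mentioned above, here is a minimal sketch (an assumed textbook setup, not the paper's code) of the (1+1) EA on OneMax, whose O(n log n) expected runtime is the classic application of multiplicative drift:

```python
import math
import random

def one_plus_one_ea(n, seed=0):
    """(1+1) EA on OneMax: flip each bit independently with probability
    1/n and accept the offspring if its fitness is not worse. Return the
    number of generations until the all-ones string is reached."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fit = sum(x)
    steps = 0
    while fit < n:
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        fy = sum(y)
        if fy >= fit:
            x, fit = y, fy
        steps += 1
    return steps

# multiplicative drift bounds E[steps] by roughly e*n*(ln n + 1)
n = 50
runtime = sum(one_plus_one_ea(n, seed=s) for s in range(5)) / 5
```

The observed average runtime for n = 50 sits well below the e·n·(ln n + 1) ≈ 650 bound, as the drift argument predicts.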
Three-dimensional Probabilistic Earthquake Location Applied to 2002-2003 Mt. Etna Eruption
NASA Astrophysics Data System (ADS)
Mostaccio, A.; Tuve', T.; Zuccarello, L.; Patane', D.; Saccorotti, G.; D'Agostino, M.
2005-12-01
Recorded seismicity for the Mt. Etna volcano during the 2002-2003 eruption has been relocated using a probabilistic, non-linear earthquake location approach. We used the software package NonLinLoc (Lomax et al., 2000), adopting the 3D velocity model obtained by Cocina et al., 2005. We processed our data with three different algorithms: (1) a grid search; (2) Metropolis-Gibbs sampling; and (3) an Oct-tree search. The Oct-tree algorithm gives efficient, fast and accurate mapping of the PDF (Probability Density Function) of the earthquake location problem. More than 300 seismic events were analyzed in order to compare the non-linear location results with those obtained using a traditional, linearized earthquake location algorithm such as Hypoellipse, and a 3D linearized inversion (Thurber, 1983). Moreover, we compare 38 focal mechanisms, chosen following strict selection criteria, with those obtained from the 3D and 1D results. Although the presented approach is more of a traditional relocation application, probabilistic earthquake location could also be used in routine surveys.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crowder, Jeff; Jet Propulsion Laboratory, California Institute of Technology, Pasadena, California 91109; Cornish, Neil J.
Low frequency gravitational wave detectors, such as the Laser Interferometer Space Antenna (LISA), will have to contend with large foregrounds produced by millions of compact galactic binaries in our galaxy. While these galactic signals are interesting in their own right, the unresolved component can obscure other sources. The science yield for the LISA mission can be improved if the brighter and more isolated foreground sources can be identified and regressed from the data. Since the signals overlap with one another, we are faced with a 'cocktail party' problem of picking out individual conversations in a crowded room. Here we present and implement an end-to-end solution to the galactic foreground problem that is able to resolve tens of thousands of sources from across the LISA band. Our algorithm employs a variant of the Markov chain Monte Carlo (MCMC) method, which we call the blocked annealed Metropolis-Hastings (BAM) algorithm. Following a description of the algorithm and its implementation, we give several examples ranging from searches for a single source to searches for hundreds of overlapping sources. Our examples include data sets from the first round of mock LISA data challenges.
A brief history of the introduction of generalized ensembles to Markov chain Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Berg, Bernd A.
2017-03-01
The most efficient weights for Markov chain Monte Carlo calculations of physical observables are not necessarily those of the canonical ensemble. Generalized ensembles, which do not exist in nature but can be simulated on computers, lead often to a much faster convergence. In particular, they have been used for simulations of first order phase transitions and for simulations of complex systems in which conflicting constraints lead to a rugged free energy landscape. Starting off with the Metropolis algorithm and Hastings' extension, I present a minireview which focuses on the explosive use of generalized ensembles in the early 1990s. Illustrations are given, which range from spin models to peptides.
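For context, the canonical Metropolis update that generalized-ensemble methods build on can be sketched for the 2D Ising model. This is a generic textbook illustration, not code from the review; a multicanonical simulation would replace the Boltzmann weight below with iteratively determined weights:

```python
import math
import random

def metropolis_ising(L, beta, sweeps, seed=0):
    """Single-spin-flip Metropolis for the 2D Ising model with the
    canonical Boltzmann weight exp(-beta*dE). Generalized ensembles swap
    this weight for one chosen to speed convergence."""
    rng = random.Random(seed)
    s = [[rng.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nb = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
              + s[i][(j + 1) % L] + s[i][(j - 1) % L])
        dE = 2 * s[i][j] * nb
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            s[i][j] = -s[i][j]
    return s

L, beta = 16, 0.6  # beta above the critical value beta_c ~ 0.4407
spins = metropolis_ising(L, beta, sweeps=200)
# energy per spin, counting each bond once
e = -sum(spins[i][j] * (spins[(i + 1) % L][j] + spins[i][(j + 1) % L])
         for i in range(L) for j in range(L)) / (L * L)
```

In the ordered phase the per-spin energy relaxes toward -2, which is exactly the regime where canonical local updates slow down and generalized ensembles pay off.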
Exploring first-order phase transitions with population annealing
NASA Astrophysics Data System (ADS)
Barash, Lev Yu.; Weigel, Martin; Shchur, Lev N.; Janke, Wolfhard
2017-03-01
Population annealing is a hybrid of sequential and Markov chain Monte Carlo methods geared towards the efficient parallel simulation of systems with complex free-energy landscapes. Systems with first-order phase transitions are among the problems in computational physics that are difficult to tackle with standard methods such as local-update simulations in the canonical ensemble, for example with the Metropolis algorithm. It is hence interesting to see whether such transitions can be more easily studied using population annealing. We report here our preliminary observations from population annealing runs for the two-dimensional Potts model with q > 4, where it undergoes a first-order transition.
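A population annealing step of the kind described above can be sketched on a toy model. The following uses a 1D Ising chain instead of the 2D Potts model studied in the paper, purely to keep the example short; the resample-then-equilibrate structure is the same:

```python
import math
import random

def population_annealing(R, n, betas, sweeps=5, seed=0):
    """Toy population annealing on a 1D Ising chain: at each temperature
    step, resample replicas with weights exp(-dbeta*E), then apply
    Metropolis sweeps at the new inverse temperature beta."""
    rng = random.Random(seed)

    def energy(s):
        return -sum(s[i] * s[(i + 1) % n] for i in range(n))

    pop = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(R)]
    beta_prev = 0.0
    for beta in betas:
        # sequential-Monte-Carlo style reweighting and resampling
        w = [math.exp(-(beta - beta_prev) * energy(s)) for s in pop]
        pop = [list(pop[k]) for k in rng.choices(range(R), weights=w, k=R)]
        for s in pop:  # local Metropolis equilibration at the new beta
            for _ in range(sweeps * n):
                i = rng.randrange(n)
                dE = 2 * s[i] * (s[i - 1] + s[(i + 1) % n])
                if dE <= 0 or rng.random() < math.exp(-beta * dE):
                    s[i] = -s[i]
        beta_prev = beta
    return pop

pop = population_annealing(R=200, n=20, betas=[0.2, 0.4, 0.6, 0.8, 1.0])
e_mean = sum(-sum(s[i] * s[(i + 1) % 20] for i in range(20))
             for s in pop) / (200 * 20)
```

For the 1D chain the mean energy per spin should land near the exact value -tanh(beta) ≈ -0.76 at beta = 1; at a first-order transition the same resampling step is what lets the population redistribute weight between coexisting phases.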
NASA Astrophysics Data System (ADS)
Bérubé, Charles L.; Chouteau, Michel; Shamsipour, Pejman; Enkin, Randolph J.; Olivo, Gema R.
2017-08-01
Spectral induced polarization (SIP) measurements are now widely used to infer mineralogical or hydrogeological properties from the low-frequency electrical properties of the subsurface in both mineral exploration and environmental sciences. We present an open-source program that performs fast multi-model inversion of laboratory complex resistivity measurements using Markov-chain Monte Carlo simulation. Using this stochastic method, SIP parameters and their uncertainties may be obtained from the Cole-Cole and Dias models, or from the Debye and Warburg decomposition approaches. The program is tested on synthetic and laboratory data to show that the posterior distribution of a multiple Cole-Cole model is multimodal in particular cases. The Warburg and Debye decomposition approaches yield unique solutions in all cases. It is shown that an adaptive Metropolis algorithm performs faster and is less dependent on the initial parameter values than the Metropolis-Hastings step method when inverting SIP data through the decomposition schemes. There are no advantages in using an adaptive step method for well-defined Cole-Cole inversion. Finally, the influence of measurement noise on the recovered relaxation time distribution is explored. We provide the geophysics community with an open-source platform that can serve as a base for further developments in stochastic SIP data inversion and that may be used to perform parameter analysis with various SIP models.
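The adaptive Metropolis idea referenced above can be illustrated with a simple step-size adaptation scheme. This is a 1-D sketch with assumed tuning constants; full adaptive Metropolis (Haario-style) adapts the entire proposal covariance from the sample history rather than a scalar step:

```python
import math
import random

def adaptive_metropolis(log_post, x0, n_steps, target=0.3, seed=0):
    """Random-walk Metropolis whose proposal width is tuned from the
    running acceptance rate with a diminishing (1/sqrt(t)) adaptation,
    so adaptation fades and the chain's ergodicity is preserved."""
    rng = random.Random(seed)
    x, lp, step = x0, log_post(x0), 1.0
    samples, accepted = [], 0
    for t in range(1, n_steps + 1):
        xp = x + rng.gauss(0.0, step)
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:
            x, lp, accepted = xp, lpp, accepted + 1
        # grow the step when accepting too often, shrink when too rarely
        step *= math.exp((accepted / t - target) / math.sqrt(t))
        samples.append(x)
    return samples, step

# a deliberately narrow target, N(0, 0.1^2): the initial step of 1.0 is
# far too wide and must adapt downward for the chain to mix
samples, step = adaptive_metropolis(lambda x: -0.5 * (x / 0.1) ** 2,
                                    x0=2.0, n_steps=20000)
```

The self-tuning is what makes such samplers less dependent on initial parameter values, the advantage the abstract reports for the decomposition schemes.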
Yu, Guozhi; Hozé, Nathanaël; Rolff, Jens
2016-01-01
Antimicrobial peptides (AMPs) and antibiotics reduce the net growth rate of bacterial populations they target. It is relevant to understand if effects of multiple antimicrobials are synergistic or antagonistic, in particular for AMP responses, because naturally occurring responses involve multiple AMPs. There are several competing proposals describing how multiple types of antimicrobials add up when applied in combination, such as Loewe additivity or Bliss independence. These additivity terms are defined ad hoc from abstract principles explaining the supposed interaction between the antimicrobials. Here, we link these ad hoc combination terms to a mathematical model that represents the dynamics of antimicrobial molecules hitting targets on bacterial cells. In this multi-hit model, bacteria are killed when a certain number of targets are hit by antimicrobials. Using this bottom-up approach reveals that Bliss independence should be the model of choice if no interaction between antimicrobial molecules is expected. Loewe additivity, on the other hand, describes scenarios in which antimicrobials affect the same components of the cell, i.e. are not acting independently. While our approach idealizes the dynamics of antimicrobials, it provides a conceptual underpinning of the additivity terms. The choice of the additivity term is essential to determine synergy or antagonism of antimicrobials. This article is part of the themed issue ‘Evolutionary ecology of arthropod antimicrobial peptides’. PMID:27160596
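The two additivity terms discussed above have simple closed forms, sketched below with illustrative function names; effects are fractional kills in [0, 1] and the iso-effect doses are assumed inputs:

```python
def bliss_expected(effect_a, effect_b):
    """Bliss independence: independent action means survival fractions
    multiply, S_ab = S_a * S_b, so E_ab = E_a + E_b - E_a * E_b."""
    return effect_a + effect_b - effect_a * effect_b

def loewe_combination_index(dose_a, dose_b, iso_a, iso_b):
    """Loewe additivity: doses add on a common scale. iso_a and iso_b are
    the single-drug doses producing the reference effect (illustrative
    names). CI = 1 is additive; CI < 1 synergy; CI > 1 antagonism."""
    return dose_a / iso_a + dose_b / iso_b

combined = bliss_expected(0.5, 0.5)               # 0.75 under independence
ci = loewe_combination_index(1.0, 1.0, 2.0, 2.0)  # 1.0: exactly additive
```

An observed combined effect above `bliss_expected` (or a CI below 1) would then be read as synergy, which is why the choice of reference model matters.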
Estimating a Noncompensatory IRT Model Using Metropolis within Gibbs Sampling
ERIC Educational Resources Information Center
Babcock, Ben
2011-01-01
Relatively little research has been conducted with the noncompensatory class of multidimensional item response theory (MIRT) models. A Monte Carlo simulation study was conducted exploring the estimation of a two-parameter noncompensatory item response theory (IRT) model. The estimation method used was a Metropolis-Hastings within Gibbs algorithm…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-27
... NUCLEAR REGULATORY COMMISSION [Docket No. 40-3392; NRC-2011-0143] License Amendment Request for Closure of Calcium Fluoride Ponds at Honeywell Metropolis Works, Honeywell International, Inc. AGENCY... Federal Regulations (10 CFR) to approve the closure of the calcium fluoride ponds in-place, by...
DNA motif alignment by evolving a population of Markov chains.
Bi, Chengpeng
2009-01-30
Deciphering cis-regulatory elements, or de novo motif-finding in genomes, still remains elusive although much algorithmic effort has been expended. Markov chain Monte Carlo (MCMC) methods such as Gibbs motif samplers have been widely employed to solve the de novo motif-finding problem through sequence local alignment. Nonetheless, the MCMC-based motif samplers still suffer from local maxima, like EM. Therefore, as a prerequisite for finding good local alignments, these motif algorithms are often independently run a multitude of times, but without information exchange between different chains. Hence a new algorithm design enabling such information exchange would be worthwhile. This paper presents a novel motif-finding algorithm that evolves a population of Markov chains with information exchange (PMC), each of which is initialized as a random alignment, run by the Metropolis-Hastings sampler (MHS), and progressively updated through a series of stochastically sampled local alignments. Explicitly, the PMC motif algorithm performs stochastic sampling as specified by a population-based proposal distribution rather than individual ones, and adaptively evolves the population as a whole towards a global maximum. The alignment information exchange is accomplished by taking advantage of the pooled motif site distributions. A distinct method for running multiple independent Markov chains (IMC) without information exchange, dubbed the IMC motif algorithm, is also devised for comparison with its PMC counterpart. Experimental studies demonstrate that performance can be improved if pooled information is used to run a population of motif samplers. The new PMC algorithm improved convergence and outperformed other popular algorithms tested using simulated and biological motif sequences.
Hu, Y C; Chen, J; Li, M; Wang, R; Li, W D; Yang, Y H; Yang, C; Yun, C F; Yang, L C; Yang, X G
2017-02-06
Objective: To evaluate the prevalence of anemia and the nutritional status of vitamins A and D by analyzing hemoglobin, serum retinol, and serum 25-hydroxyvitamin D levels in Chinese urban pregnant women during 2010-2012. Methods: Data were obtained from the China Nutrition and Health Survey in 2010-2012. Using multi-stage stratified sampling and population proportional stratified random sampling, 2 250 pregnant women from 34 metropolises and 41 middle-sized and small cities were included in this study. Information was collected using a questionnaire survey. The blood hemoglobin concentration was determined using the cyanmethemoglobin method, and anemia was determined using the World Health Organization guidelines combined with the elevation correction standard. The serum retinol level was determined using high-performance liquid chromatography, and vitamin A deficiency (VAD) was judged by the related standard recommended by the World Health Organization. The vitamin D level was determined using enzyme-linked immunosorbent assay and vitamin D deficiency was judged by the recommendation standards from the Institute of Medicine of The National Academies. The hemoglobin, serum retinol, and serum 25-hydroxyvitamin D levels were compared, along with differences in the prevalence of anemia, VAD, and the vitamin D deficiency rate (including deficiency and serious deficiency). Results: A total of 1 738 cases of hemoglobin level, 594 cases of serum retinol level, and 1 027 cases of serum 25-hydroxyvitamin D were available for analysis in this study. The overall blood hemoglobin level (P50 (P25-P75)) was 122.70 (114.00-131.10) g/L; 123.70 (115.21-132.00) g/L for metropolises and 122.01 (113.30-130.40) g/L for middle-sized and small cities. The blood hemoglobin level of metropolis residents was significantly higher than that of middle-sized and small city residents (P = 0.027). The overall prevalence of anemia was 17.0% (295/1 738). 
The overall serum retinol level (P50 (P25-P75)) was 1.61 (1.20-2.06) μmol/L; 1.50 (1.04-2.06) μmol/L for metropolises and 1.63 (1.31-2.05) μmol/L for middle-sized and small cities. The serum retinol level of metropolis residents was significantly higher than that of middle-sized and small city residents (P = 0.033). The overall prevalence of VAD was 7.4% (47/639); 11.5% (33/286) for metropolises and 4.0% (14/353) for middle-sized and small cities. A significant difference was observed in the prevalence of VAD between metropolis and middle-sized and small city residents (P < 0.001). The overall serum 25-hydroxyvitamin D level (P50 (P25-P75)) was 15.41 (11.79-20.23) ng/ml; 14.71 (11.15-19.07) ng/ml for metropolises and 16.02 (12.65-21.36) ng/ml for middle-sized and small cities. A significant difference was observed in the vitamin D level between metropolis and middle-sized and small city residents (P < 0.001). The overall prevalence of vitamin D deficiency was 74.3% (763/1 027); a significant difference was observed in the prevalence of serious vitamin D deficiency between metropolis (30.64% (144/470)) and middle-sized and small city residents (26% (267/1 027)) (P = 0.002). There were no significant differences between blood hemoglobin level and the prevalence of anemia, VAD, and vitamin D deficiency. Conclusion: The prevalence of anemia in Chinese urban pregnant women improved from 2002 to 2012. The prevalence of vitamin D deficiency in pregnant women was generally more serious, while a certain percentage of women had VAD. The prevalence of VAD and serious vitamin D deficiency among pregnant women from metropolises was significantly higher than that of pregnant women from medium and small-sized cities.
An Improved Nested Sampling Algorithm for Model Selection and Assessment
NASA Astrophysics Data System (ADS)
Zeng, X.; Ye, M.; Wu, J.; WANG, D.
2017-12-01
The multimodel strategy is a general approach for treating model structure uncertainty in recent research. The unknown groundwater system is represented by several plausible conceptual models, each attached to a weight representing its plausibility. In the Bayesian framework, the posterior model weight is computed as the product of the model prior weight and the marginal likelihood (also termed the model evidence). As a result, estimating marginal likelihoods is crucial for reliable model selection and assessment in multimodel analysis. The nested sampling estimator (NSE) is a newly proposed algorithm for marginal likelihood estimation. NSE searches the parameter space gradually from low-likelihood to high-likelihood regions, and this evolution proceeds iteratively via a local sampling procedure; the efficiency of NSE is therefore dominated by the strength of that local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm and its variants are often used for local sampling in NSE. However, M-H is not an efficient sampling algorithm for high-dimensional or complex likelihood functions. To improve the performance of NSE, a more efficient and elaborate sampling algorithm, DREAMzs, can be integrated into the local sampling step. In addition, to overcome the computational burden of the many repeated model executions required for marginal likelihood estimation, an adaptive sparse grid stochastic collocation method is used to build surrogates for the original groundwater model.
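The nested sampling recursion described above can be sketched on a toy problem. Here the local sampling step is plain rejection from the prior rather than M-H or DREAMzs, which is only workable in this trivial 1-D setting; all names are illustrative:

```python
import math
import random

def nested_sampling(log_L, sample_prior, n_live=100, n_iter=600, seed=0):
    """Toy nested sampling estimate of the marginal likelihood Z.
    At each step the worst live point contributes L_min * dX, where the
    prior mass X shrinks as exp(-i/n_live), and is then replaced by a
    new point drawn from the prior above the likelihood floor."""
    rng = random.Random(seed)
    live = [sample_prior(rng) for _ in range(n_live)]
    logs = [log_L(x) for x in live]
    Z, X_prev = 0.0, 1.0
    for i in range(1, n_iter + 1):
        k = min(range(n_live), key=lambda j: logs[j])
        X = math.exp(-i / n_live)          # expected prior-mass shrinkage
        Z += math.exp(logs[k]) * (X_prev - X)
        X_prev = X
        while True:                        # local step: prior rejection
            x = sample_prior(rng)
            if log_L(x) > logs[k]:
                live[k], logs[k] = x, log_L(x)
                break
    return Z

# uniform prior on [-5, 5] (density 1/10), standard normal likelihood:
# Z = (1/10) * integral of N(x|0,1) over [-5, 5], i.e. about 0.1
Z = nested_sampling(lambda x: -0.5 * x * x - 0.5 * math.log(2 * math.pi),
                    lambda rng: rng.uniform(-5.0, 5.0))
```

The rejection step is exactly what degenerates in high dimensions: the acceptance region shrinks as exp(-i/n_live), which is why stronger local samplers such as DREAMzs are attractive inside NSE.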
Developing a cosmic ray muon sampling capability for muon tomography and monitoring applications
NASA Astrophysics Data System (ADS)
Chatzidakis, S.; Chrysikopoulou, S.; Tsoukalas, L. H.
2015-12-01
In this study, a cosmic ray muon sampling capability using a phenomenological model that captures the main characteristics of the experimentally measured spectrum coupled with a set of statistical algorithms is developed. The "muon generator" produces muons with zenith angles in the range 0-90° and energies in the range 1-100 GeV and is suitable for Monte Carlo simulations with emphasis on muon tomographic and monitoring applications. The muon energy distribution is described by the Smith and Duller (1959) [35] phenomenological model. Statistical algorithms are then employed for generating random samples. The inverse transform provides a means to generate samples from the muon angular distribution, whereas the Acceptance-Rejection and Metropolis-Hastings algorithms are employed to provide the energy component. The predictions for muon energies 1-60 GeV and zenith angles 0-90° are validated with a series of actual spectrum measurements and with estimates from the software library CRY. The results confirm the validity of the phenomenological model and the applicability of the statistical algorithms to generate polyenergetic-polydirectional muons. The response of the algorithms and the impact of critical parameters on computation time and computed results were investigated. Final output from the proposed "muon generator" is a look-up table that contains the sampled muon angles and energies and can be easily integrated into Monte Carlo particle simulation codes such as Geant4 and MCNP.
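The statistical algorithms named above can be sketched with simplified stand-in distributions: a power-law energy spectrum and a cos²θ zenith distribution rather than the Smith and Duller form. The spectral index and ranges below are illustrative assumptions, not the paper's parameters:

```python
import math
import random

def sample_energy(rng, gamma=2.7, e_min=1.0, e_max=100.0):
    """Inverse transform sampling for a toy power law dN/dE ~ E^-gamma:
    invert the closed-form CDF at a uniform random number u."""
    u = rng.random()
    a, b = e_min ** (1.0 - gamma), e_max ** (1.0 - gamma)
    return (a + u * (b - a)) ** (1.0 / (1.0 - gamma))

def sample_zenith(rng):
    """Acceptance-rejection sampling for a cos^2(theta) zenith
    distribution on [0, pi/2] under a uniform envelope."""
    while True:
        theta = rng.uniform(0.0, math.pi / 2.0)
        if rng.random() < math.cos(theta) ** 2:
            return theta

rng = random.Random(1)
energies = [sample_energy(rng) for _ in range(5000)]
angles = [sample_zenith(rng) for _ in range(5000)]
```

A look-up table of such (angle, energy) pairs is the kind of output that plugs directly into transport codes like Geant4 or MCNP, as the abstract describes.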
The Manhattan Frame Model-Manhattan World Inference in the Space of Surface Normals.
Straub, Julian; Freifeld, Oren; Rosman, Guy; Leonard, John J; Fisher, John W
2018-01-01
Objects and structures within man-made environments typically exhibit a high degree of organization in the form of orthogonal and parallel planes. Traditional approaches utilize these regularities via the restrictive, and rather local, Manhattan World (MW) assumption which posits that every plane is perpendicular to one of the axes of a single coordinate system. The aforementioned regularities are especially evident in the surface normal distribution of a scene where they manifest as orthogonally-coupled clusters. This motivates the introduction of the Manhattan-Frame (MF) model which captures the notion of an MW in the surface normals space, the unit sphere, and two probabilistic MF models over this space. First, for a single MF we propose novel real-time MAP inference algorithms, evaluate their performance and their use in drift-free rotation estimation. Second, to capture the complexity of real-world scenes at a global scale, we extend the MF model to a probabilistic mixture of Manhattan Frames (MMF). For MMF inference we propose a simple MAP inference algorithm and an adaptive Markov-Chain Monte-Carlo sampling algorithm with Metropolis-Hastings split/merge moves that let us infer the unknown number of mixture components. We demonstrate the versatility of the MMF model and inference algorithm across several scales of man-made environments.
NASA Astrophysics Data System (ADS)
Hasegawa, Manabu; Hiramatsu, Kotaro
2013-10-01
The effectiveness of the Metropolis algorithm (MA) (constant-temperature simulated annealing) in optimization by the method of search-space smoothing (SSS) (potential smoothing) is studied on two types of random traveling salesman problems. The optimization mechanism of this hybrid approach (MASSS) is investigated by analyzing the exploration dynamics observed in the rugged landscape of the cost function (energy surface). The results show that the MA can be successfully utilized as a local search algorithm in the SSS approach. It is also clarified that the optimization characteristics of these two constituent methods are improved in a mutually beneficial manner in the MASSS run. Specifically, the relaxation dynamics generated by employing the MA work effectively even in a smoothed landscape and more advantage is taken of the guiding function proposed in the idea of SSS; this mechanism operates in an adaptive manner in the de-smoothing process and therefore the MASSS method maintains its optimization function over a wider temperature range than the MA.
Examining Work and Family Conflict among Female Bankers in Accra Metropolis, Ghana
ERIC Educational Resources Information Center
Kissi-Abrokwah, Bernard; Andoh-Robertson, Theophilus; Tutu-Danquah, Cecilia; Agbesi, Catherine Selorm
2015-01-01
This study investigated the effects of, and solutions to, work and family conflict among female bankers in the Accra Metropolis. Using a triangulatory mixed-methods design, a structured questionnaire was randomly administered to 300 female bankers, and 15 female bankers were selected for interviews using a convenience sampling technique. The…
Behavioural Problems of Juvenile Street Hawkers in Uyo Metropolis, Nigeria
ERIC Educational Resources Information Center
Udoh, Nsisong A.; Joseph, Eme U.
2012-01-01
The study sought the opinions of Faculty of Education students of the University of Uyo on the behavioural problems of juvenile street hawkers in Uyo metropolis. Five research hypotheses were formulated to guide the study. This cross-sectional survey employed a multi-stage random sampling technique in selecting 200 regular undergraduate students in the…
Women in Educational Leadership within the Tamale Metropolis
ERIC Educational Resources Information Center
Segkulu, L.; Gyimah, K.
2016-01-01
Within the Tamale Metropolis, it is observed that only a few women occupy top level management positions within the Ghana Education Service (GES). A descriptive survey was therefore conducted in 2013/2014 academic year to assess the factors affecting the gender disparity in educational leadership within the Service. Specifically, the study sought…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-23
... NUCLEAR REGULATORY COMMISSION [Docket No. 40-3392; License No. SUB-526; EA-12-157; NRC-2012-0244] Confirmatory Order; In the Matter of Honeywell International Inc.; Metropolis, Illinois I. Honeywell International Inc. (Honeywell or Licensee) is the holder of Materials License No. SUB-526, issued by the U.S...
Environmental Awareness and School Sanitation in Calabar Metropolis of Cross Rivers State, Nigeria
ERIC Educational Resources Information Center
Anijaobi-Idem, F. N.; Ukata, B. N.; Bisong, N. N
2015-01-01
This descriptive survey study explored the influence of environmental awareness on secondary school sanitation in Calabar Metropolis. One hypothesis was formulated to direct the investigation. Three hundred subjects, made up of 30 principals and 270 teachers, constituted the sample drawn from the population of principals and teachers in secondary…
ERIC Educational Resources Information Center
Musa, Alice K. J.; Nwachukwu, Kelechukwu I.; Ali, Domiya Geoffrey
2016-01-01
The study determined the relationship between students' expectancy beliefs and the English language performance of students in Maiduguri Metropolis, Borno State, Nigeria. A correlational design was adopted for the study. Four hypotheses determined the relationships between the components of expectancy beliefs: ability, task difficulty, and past…
NASA Astrophysics Data System (ADS)
Grayver, Alexander V.; Kuvshinov, Alexey V.
2016-05-01
This paper presents a methodology to sample the equivalence domain (ED) in nonlinear partial differential equation (PDE)-constrained inverse problems. For this purpose, we first applied the state-of-the-art stochastic optimization algorithm Covariance Matrix Adaptation Evolution Strategy (CMAES) to identify low-misfit regions of the model space. These regions were then randomly sampled to create an ensemble of equivalent models and quantify uncertainty. CMAES explores model space globally and is robust on very ill-conditioned problems. We show that the number of iterations required to converge grows at a moderate rate with respect to the number of unknowns and that the algorithm is embarrassingly parallel. We formulated the problem using the generalized Gaussian distribution, which enabled us to seamlessly use arbitrary norms for the residual and regularization terms. We show that various regularization norms facilitate studying different classes of equivalent solutions. We further show how the performance of the standard Metropolis-Hastings Markov chain Monte Carlo algorithm can be substantially improved by using the information CMAES provides. This methodology was tested using individual and joint inversions of magnetotelluric, controlled-source electromagnetic (EM) and global EM induction data.
Race, Schools and Opportunity Hoarding: Evidence from a Post-War American Metropolis
ERIC Educational Resources Information Center
Rury, John L.; Rife, Aaron Tyler
2018-01-01
Opportunity hoarding is a sociological concept first introduced by Charles Tilly. This article explores its utility for historians by examining efforts to exclude different groups of people in a major American metropolis during the 1960s and seventies. This was a period of significant social change, as the racial composition of big city schools…
Employee Motivation on the Organisational Growth of Printing Industry in the Kumasi Metropolis
ERIC Educational Resources Information Center
Enninful, Ebenezer Kofi; Boakye-Amponsah, Abraham; Osei-Poku, Patrick
2015-01-01
The printing industry is supposed to be a major contributor to Ghana's development through employment creation and the enhancement of information to the general public. The main purpose of the study was to assess employee motivation on the printing industry within Kumasi Metropolis. The study employed both the quantitative and qualitative surveys…
ERIC Educational Resources Information Center
Shamsuddeen, Abdulrahman; Amina, Hassan
2016-01-01
This study investigated the correlation between instructional methods and students' end-of-term achievement in Biology in selected secondary schools in Sokoto Metropolis, Sokoto State, Nigeria. The study addressed three specific objectives: to examine the relationships between cooperative learning methods, guided discovery, the simulation method and…
ERIC Educational Resources Information Center
Roth, Lane
Fritz Lang's "Metropolis" (1927) is a seminal film because of its concern, now generic, with the profound impact technological progress has on mankind's social and spiritual progress. As in many later science fiction films, the ascendancy of artifact over nature is depicted not as liberating human beings, but as subjecting and corrupting…
Molecular dynamics simulations of field emission from a planar nanodiode
NASA Astrophysics Data System (ADS)
Torfason, Kristinn; Valfells, Agust; Manolescu, Andrei
2015-03-01
High resolution molecular dynamics simulations with full Coulomb interactions of electrons are used to investigate field emission in planar nanodiodes. The effects of space-charge and emitter radius are examined and compared to previous results concerning transition from Fowler-Nordheim to Child-Langmuir current [Y. Y. Lau, Y. Liu, and R. K. Parker, Phys. Plasmas 1, 2082 (1994) and Y. Feng and J. P. Verboncoeur, Phys. Plasmas 13, 073105 (2006)]. The Fowler-Nordheim law is used to determine the current density injected into the system, and the Metropolis-Hastings algorithm to find a favourable point of emission on the emitter surface. A simple fluid-like model is also developed, and its results are in qualitative agreement with the simulations.
A GPU-based large-scale Monte Carlo simulation method for systems with long-range interactions
NASA Astrophysics Data System (ADS)
Liang, Yihao; Xing, Xiangjun; Li, Yaohang
2017-06-01
In this work we present an efficient implementation of canonical Monte Carlo simulation for Coulomb many-body systems on graphics processing units (GPUs). Our method takes advantage of the GPU Single Instruction, Multiple Data (SIMD) architecture and adopts the sequential updating scheme of the Metropolis algorithm. It makes no approximation in the computation of energy, and reaches a remarkable 440-fold speedup compared with the serial implementation on CPU. We further use this method to simulate primitive-model electrolytes, and measure very precisely all ion-ion pair correlation functions at high concentrations. From these data, we extract the renormalized Debye length, renormalized valences of constituent ions, and renormalized dielectric constants. These results demonstrate unequivocally physics beyond the classical Poisson-Boltzmann theory.
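A minimal CPU sketch of the sequential-updating Metropolis scheme for a pairwise-interacting charge system follows; the 1D trapped-charge model, its parameters, and the function names are invented for illustration and make no attempt at the paper's GPU parallelism or its 440-fold speedup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the simulated system: N like charges on a line with a
# harmonic trap and 1/r repulsion. Energy changes are computed exactly,
# with no approximation, as in the paper.
N, beta, step = 8, 1.0, 0.3
x = np.sort(rng.normal(scale=2.0, size=N))

def local_energy(xi, i, pos):
    # trap energy of particle i at trial position xi plus its exact
    # interaction with every other particle
    r = np.abs(xi - np.delete(pos, i))
    return 0.5 * xi ** 2 + np.sum(1.0 / np.maximum(r, 1e-9))

def sweep(pos):
    # sequential updating: particles are visited in a fixed order, the
    # scheme that maps naturally onto SIMD-style execution
    accepted = 0
    for i in range(N):
        trial = pos[i] + rng.uniform(-step, step)
        dE = local_energy(trial, i, pos) - local_energy(pos[i], i, pos)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            pos[i] = trial
            accepted += 1
    return accepted

total_accepted = sum(sweep(x) for _ in range(200))
```

On a GPU the interaction sum inside `local_energy` is the part that parallelizes across threads, while the particle loop remains sequential.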
Onset transition to cold nuclear matter from lattice QCD with heavy quarks.
Fromm, M; Langelage, J; Lottini, S; Neuman, M; Philipsen, O
2013-03-22
Lattice QCD at finite density suffers from a severe sign problem, which has so far prohibited simulations of the cold and dense regime. Here we study the onset of nuclear matter employing a three-dimensional effective theory derived by combined strong-coupling and hopping expansions, which is valid for heavy but dynamical quarks and has only a mild sign problem. Its numerical evaluations with a standard Metropolis algorithm and with a complex Langevin algorithm agree, where the latter is free of the sign problem. Our continuum-extrapolated data approach a first-order phase transition at μ(B) ≈ m(B) as the temperature approaches zero. An excellent description of the data is achieved by an analytic solution in the strong coupling limit.
Non-proportional odds multivariate logistic regression of ordinal family data.
Zaloumis, Sophie G; Scurrah, Katrina J; Harrap, Stephen B; Ellis, Justine A; Gurrin, Lyle C
2015-03-01
Methods to examine whether genetic and/or environmental sources can account for the residual variation in ordinal family data usually assume proportional odds. However, standard software to fit the non-proportional odds model to ordinal family data is limited because the correlation structure of family data is more complex than for other types of clustered data. To perform these analyses we propose the non-proportional odds multivariate logistic regression model and take a simulation-based approach to model fitting using Markov chain Monte Carlo methods, such as partially collapsed Gibbs sampling and the Metropolis algorithm. We applied the proposed methodology to male pattern baldness data from the Victorian Family Heart Study. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Immunoglobulin A Nephropathy: Advances in Understanding of Pathogenesis and Treatment.
Lafayette, Richard A; Kelepouris, Ellie
2018-05-31
Immunoglobulin A (IgA) nephropathy is the most common form of primary glomerulonephritis and has clinical associations with a wide range of inflammatory and infectious diseases. There is a substantial variation in clinical course and outcomes, with many patients not diagnosed until they present with sequelae, which may include gross hematuria, hypertension, renal insufficiency, and/or significant proteinuria. Treatment options are currently limited and directed mainly toward control of these sequelae and have limited ability to reduce the incidence of end-stage renal disease or treat the primary IgA defect. Growing knowledge about the pathogenesis of IgA nephropathy and research into its genetic basis are helping to elucidate the course of this widely variable disease. IgA accumulation in the kidneys is thought to be the result of a number of different pathways in a "multi-hit" process that includes an initial traumatic trigger (often infection related) and subsequent memory responses that are amplified in those with a genetic predisposition to the disease and lead to an inflammatory response in susceptible individuals. Genome-wide association studies are providing new insights into the genetic variance of this autoimmune disease and are yielding information that may address both its causes and consequences. Key Messages: New treatment approaches are urgently required for the management of patients with IgA nephropathy. Novel interventions based around its inflammatory nature and "multi-hit" pathogenesis are being investigated to potentially limit disease progression. © 2018 S. Karger AG, Basel.
ERIC Educational Resources Information Center
Oyefara, John Lekan
2005-01-01
This article examines the sexual behaviour and the HIV/AIDS knowledge and vulnerability of female street hawkers in Lagos metropolis, Nigeria. A total of 126 female street hawkers under 18 were sampled in a cross-sectional survey and six Focus Group Discussions (FGDs) were conducted to generate data from respondents. Data on sexual behaviour…
Analysis of Errors Committed by Physics Students in Secondary Schools in Ilorin Metropolis, Nigeria
ERIC Educational Resources Information Center
Omosewo, Esther Ore; Akanbi, Abdulrasaq Oladimeji
2013-01-01
The study attempted to find out the types of errors committed, and the influence of gender on the types of errors committed, by senior secondary school physics students in Ilorin metropolis. Six (6) schools were purposively chosen for the study. One hundred and fifty-five students' scripts were randomly sampled for the study. Joint Mock physics essay questions…
ERIC Educational Resources Information Center
Ntumi, Simon
2016-01-01
The study examined the challenges that pre-school teachers encounter in the implementation of the early childhood curriculum, exploring the teaching methods employed by pre-school teachers in the Cape Coast Metropolis. The study employed a descriptive survey as the research design. A convenience sample of 62 pre-school teachers was selected from a…
ERIC Educational Resources Information Center
Dauda, Bala; Jambo, Hyelni Emmanuel; Umar, Muhammad Amin
2016-01-01
This study examined students' perception of factors influencing teaching and learning of mathematics in senior secondary schools in Maiduguri Metropolis of Borno State, Nigeria. The objectives of the study were to determine the extent to which students perceived: qualification, method of teaching, instructional materials and attitude of both…
Putting the Learning in Service Learning: From Soup Kitchen Models to the Black Metropolis Model
ERIC Educational Resources Information Center
Manley, Theodoric, Jr.; Buffa, Avery S.; Dube, Caleb; Reed, Lauren
2006-01-01
Results of the Black Metropolis Model (BMM) of service learning are analyzed and illustrated in this article to explain how to "put the learning in service learning." There are many soup kitchens or nontransforming models of service learning where students are asked to serve needy populations but internalize and learn little about the…
ERIC Educational Resources Information Center
Ogidi, Reuben C.; Udechukwu, Jonathan O.
2017-01-01
The study sought to investigate the perception of stakeholders on teachers' assessment effectiveness in secondary schools in Port Harcourt Metropolis in Rivers State. Three research questions and one hypothesis were formulated to guide the study. The study adopted a survey research design. The sample of the study consisted of 20 principals, 30 vice…
ERIC Educational Resources Information Center
Musa, Alice K. J.; Meshak, Bibi; Sagir, Jummai Ibrahim
2016-01-01
The purpose of the study was to determine adolescents' perceptions of the psychological security of their school environments and the relationship of these perceptions with their emotional development and academic performance in secondary schools in Gombe Metropolis. A sample of 239 (107 males and 133 females) secondary school students selected via stratified…
Lin, Cheng-Horng
2016-12-23
There are more than 7 million people living near the Tatun volcano group in northern Taiwan. For the safety of the Taipei metropolis, in particular, it has been debated for decades whether or not these volcanoes are active. Here I show evidence of a deep magma reservoir beneath the Taipei metropolis from both S-wave shadows and P-wave delays. The reservoir is probably composed of either a thin magma layer overlay or many molten sills within thick partially molten rocks. Assuming that 40% of the reservoir is partially molten, its total volume could be approximately 350 km³. The exact location and geometry of the magma reservoir will be obtained after dense seismic arrays are deployed in 2017-2020.
Estimating rare events in biochemical systems using conditional sampling.
Sundar, V S
2017-01-28
The paper focuses on development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining this probability using brute force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most of the problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
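The product-of-conditional-probabilities idea behind subset simulation can be sketched on a toy rare event with a known answer; the threshold, seed fraction, and the simple component-wise Metropolis move below are illustrative assumptions, not the paper's biochemical setting.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy rare event with a known answer: P(Z > b) for standard normal Z
# (exactly 1 - Phi(3) ~ 1.35e-3 for b = 3). g() stands in for the
# reaction-count functional of a biochemical model.
def g(z):
    return z

def subset_simulation(b, n=2000, p0=0.1):
    z = rng.standard_normal(n)           # level 0: plain Monte Carlo
    prob = 1.0
    for _ in range(20):                  # cap on the number of levels
        y = g(z)
        order = np.argsort(y)[::-1]
        n_seed = int(p0 * n)
        b_i = y[order[n_seed]]           # intermediate threshold
        if b_i >= b:                     # final level reached
            return prob * np.mean(y > b)
        prob *= p0                       # one conditional-probability factor
        seeds = z[order[:n_seed]]
        samples = []
        # modified Metropolis: walk each seed within the region {g > b_i}
        for s in seeds:
            cur = s
            for _ in range(n // n_seed):
                cand = cur + rng.standard_normal()
                # accept by the standard-normal density ratio ...
                if rng.random() < min(1.0, np.exp(0.5 * (cur**2 - cand**2))):
                    # ... but keep only moves that stay above the level
                    if g(cand) > b_i:
                        cur = cand
                samples.append(cur)
        z = np.array(samples[:n])
    return prob

p_hat = subset_simulation(3.0)
```

Each level multiplies in a probability of about p0 = 0.1, so a 10⁻³ event is reached with only a few thousand samples per level rather than millions of brute-force runs.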
Benchmarking GPU and CPU codes for Heisenberg spin glass over-relaxation
NASA Astrophysics Data System (ADS)
Bernaschi, M.; Parisi, G.; Parisi, L.
2011-06-01
We present a set of possible implementations for Graphics Processing Units (GPU) of the Over-relaxation technique applied to the 3D Heisenberg spin glass model. The results show that a carefully tuned code can achieve more than 100 GFlops/s of sustained performance and update a single spin in about 0.6 nanoseconds. A multi-hit technique that exploits the GPU shared memory further reduces this time. Such results are compared with those obtained by means of a highly-tuned vector-parallel code on latest generation multi-core CPUs.
Jeong, Jeho; Chen, Qing; Febo, Robert; Yang, Jie; Pham, Hai; Xiong, Jian-Ping; Zanzonico, Pat B.; Deasy, Joseph O.; Humm, John L.; Mageras, Gig S.
2016-01-01
Although spatially precise systems are now available for small-animal irradiations, there are currently limited software tools available for treatment planning for such irradiations. We report on the adaptation, commissioning, and evaluation of a 3-dimensional treatment planning system for use with a small-animal irradiation system. The 225-kV X-ray beam of the X-RAD 225Cx microirradiator (Precision X-Ray) was commissioned using both ion-chamber and radiochromic film for 10 different collimators ranging in field size from 1 mm in diameter to 40 × 40 mm2. A clinical 3-dimensional treatment planning system (Metropolis) developed at our institution was adapted to small-animal irradiation by making it compatible with the dimensions of mice and rats, modeling the microirradiator beam orientations and collimators, and incorporating the measured beam data for dose calculation. Dose calculations in Metropolis were verified by comparison with measurements in phantoms. Treatment plans for irradiation of a tumor-bearing mouse were generated with both the Metropolis and the vendor-supplied software. The calculated beam-on times and the plan evaluation tools were compared. The dose rate at the central axis ranges from 74 to 365 cGy/min depending on the collimator size. Doses calculated with Metropolis agreed with phantom measurements within 3% for all collimators. The beam-on times calculated by Metropolis and the vendor-supplied software agreed within 1% at the isocenter. The modified 3-dimensional treatment planning system provides better visualization of the relationship between the X-ray beams and the small-animal anatomy as well as more complete dosimetric information on target tissues and organs at risk. It thereby enhances the potential of image-guided microirradiator systems for evaluation of dose–response relationships and for preclinical experimentation generally. PMID:25948321
A comparative study of noise pollution levels in some selected areas in Ilorin Metropolis, Nigeria.
Oyedepo, Olayinka S; Saadu, Abdullahi A
2009-11-01
Noise pollution is a major problem for the quality of life in urban areas. This study was conducted to compare the noise pollution levels at busy roads/road junctions, passenger loading parks, and commercial, industrial and residential areas in Ilorin metropolis. A total of 47 locations were selected within the metropolis. Statistical analysis shows significant differences (P < 0.05) in noise pollution levels between industrial areas and low-density residential areas, industrial areas and high-density areas, industrial areas and passenger loading parks, industrial areas and commercial areas, busy roads/road junctions and low-density areas, passenger loading parks and commercial areas, and commercial areas and low-density areas. There is no significant difference (P > 0.05) in noise pollution levels between industrial areas and busy roads/road junctions, busy roads/road junctions and high-density areas, busy roads/road junctions and passenger loading parks, busy roads/road junctions and commercial areas, passenger loading parks and high-density areas, passenger loading parks and commercial areas, and commercial areas and high-density areas. The results show that industrial areas have the highest noise pollution levels (110.2 dB(A)), followed by busy roads/road junctions (91.5 dB(A)), passenger loading parks (87.8 dB(A)) and commercial areas (84.4 dB(A)). The noise pollution levels in Ilorin metropolis exceeded the level recommended by the WHO at 34 of 47 measuring points. It can be concluded that the city is environmentally noise polluted, and that road traffic and industrial machinery are the major sources. Enforcing noise emission standards, technical control measures, planning, and promoting citizens' awareness of the high noise risk may help to relieve the noise problem in the metropolis.
NASA Astrophysics Data System (ADS)
Wentworth, Mami Tonoe
Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification means quantifying and reducing uncertainties; the objective is to quantify uncertainties in parameters, models and measurements, and to propagate them through the model, so that one can make a predictive estimate with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of an HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is a part of nuclear reactor models.
We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through direct numerical evaluation of Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing the chains, densities and correlations obtained using DRAM, DREAM and the direct evaluation of Bayes' formula. We also perform a similar analysis for credible and prediction intervals for responses. Once the parameters are estimated, we employ the energy statistics test [63, 64] to compare the densities obtained by different methods for the HIV model; the energy statistics test the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs. We illustrate these techniques for a dynamic HIV model but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models.
To accommodate the nonlinear input to output relations, which are typical for such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of those techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affect the model response, and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters whereas subspace selection identifies a linear combination of parameters that impacts the model responses significantly. We employ active subspace methods discussed in [22] for the HIV model and present a verification that the active subspace successfully reduces the input dimensions.
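A minimal adaptive Metropolis sketch in the spirit of the verification strategy described above: the sampler is checked against a direct grid evaluation of Bayes' formula. The one-parameter model and all tuning constants are assumptions for illustration; this is not DRAM or DREAM themselves.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data for a one-parameter model y = theta + noise (sigma = 0.5),
# chosen so Bayes' formula can also be evaluated directly on a grid.
data = 2.0 + 0.5 * rng.standard_normal(50)

def log_post(theta):            # flat prior, Gaussian likelihood
    return -0.5 * np.sum((data - theta) ** 2) / 0.25

def adaptive_metropolis(n, theta0=0.0, s0=1.0):
    # Haario-style adaptation: after a burn-in, the proposal scale tracks
    # the chain's empirical spread (a minimal AM sketch, not full DRAM)
    chain = [theta0]
    lp = log_post(theta0)
    s = s0
    for i in range(1, n):
        if i > 100:
            s = 2.4 * np.std(chain) + 1e-6
        cand = chain[-1] + s * rng.standard_normal()
        lpc = log_post(cand)
        if np.log(rng.random()) < lpc - lp:
            chain.append(cand)
            lp = lpc
        else:
            chain.append(chain[-1])
    return np.array(chain)

chain = adaptive_metropolis(4000)

# verification: direct numerical evaluation of Bayes' formula on a grid
grid = np.linspace(0.0, 4.0, 2001)
lp_grid = np.array([log_post(t) for t in grid])
weights = np.exp(lp_grid - lp_grid.max())
weights /= weights.sum()
post_mean = float(np.sum(grid * weights))
```

Agreement between the chain mean and the grid posterior mean is exactly the kind of check the dissertation performs, here in its simplest one-dimensional form.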
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schröder, Markus, E-mail: Markus.Schroeder@pci.uni-heidelberg.de; Meyer, Hans-Dieter, E-mail: Hans-Dieter.Meyer@pci.uni-heidelberg.de
2014-07-21
We report energies and tunneling splittings of vibrationally excited states of malonaldehyde which have been obtained using full-dimensional quantum mechanical calculations. To this end we employed the multiconfiguration time-dependent Hartree (MCTDH) method. The results have been obtained using a recently published potential energy surface [Y. Wang, B. J. Braams, J. M. Bowman, S. Carter, and D. P. Tew, J. Chem. Phys. 128, 224314 (2008)] which has been brought into a suitable form by a modified version of the n-mode representation, which was used with two different arrangements of coordinates. The relevant terms of the expansion have been identified with a Metropolis algorithm and a diffusion Monte Carlo technique, respectively.
Exact posterior computation in non-conjugate Gaussian location-scale parameters models
NASA Astrophysics Data System (ADS)
Andrade, J. A. A.; Rathie, P. N.
2017-12-01
In Bayesian analysis the class of conjugate models allows exact posterior distributions to be obtained; however, this class is quite restrictive in the sense that it involves only a few distributions. In fact, most practical applications involve non-conjugate models, so approximate methods, such as MCMC algorithms, are required. Although these methods can deal with quite complex structures, some practical problems can make their application quite time-demanding: for example, when heavy-tailed distributions are used, convergence may be difficult and the Metropolis-Hastings algorithm can become very slow, in addition to the extra work inevitably required to choose efficient candidate generator distributions. In this work, we draw attention to special functions as tools for Bayesian computation and propose an alternative method for obtaining the posterior distribution in Gaussian non-conjugate models in exact form. We use complex integration methods based on the H-function to obtain the posterior distribution and some of its posterior quantities in an explicitly computable form. Two examples are provided to illustrate the theory.
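For contrast with the exact route via special functions, the conventional MCMC treatment of a non-conjugate heavy-tailed model can be sketched as follows; the Student-t likelihood, prior scale, data, and step size are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# A minimal non-conjugate location model: Student-t likelihood (nu = 3)
# with a Gaussian prior on the location mu. No conjugate closed form
# exists, so the conventional route is Metropolis-Hastings sampling.
data = np.array([1.2, 0.8, 1.5, 2.0, 0.9, 1.1])
nu = 3.0                                    # heavy-tailed likelihood

def log_post(mu):
    loglik = -0.5 * (nu + 1.0) * np.sum(np.log1p((data - mu) ** 2 / nu))
    logprior = -0.5 * mu ** 2 / 10.0        # N(0, 10) prior
    return loglik + logprior

def metropolis(n, mu0=0.0, step=0.8):
    mu, lp = mu0, log_post(mu0)
    out = np.empty(n)
    for i in range(n):
        cand = mu + step * rng.standard_normal()
        lpc = log_post(cand)
        if np.log(rng.random()) < lpc - lp:
            mu, lp = cand, lpc
        out[i] = mu
    return out

chain = metropolis(5000)
```

The tuning of `step` is precisely the "extra work on candidate generators" the abstract mentions, which the exact H-function approach sidesteps entirely.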
NASA Astrophysics Data System (ADS)
Curotto, E.
2015-12-01
Structural optimizations, classical NVT ensemble, and variational Monte Carlo simulations of ion Stockmayer clusters parameterized to approximate the Li+(CH3NO2)n (n = 1-20) systems are performed. The Metropolis algorithm enhanced by the parallel tempering strategy is used to measure internal energies and heat capacities, and a parallel version of the genetic algorithm is employed to obtain the most important minima. The first solvation sheath is octahedral and this feature remains the dominant theme in the structure of clusters with n ≥ 6. The first "magic number" is identified using the adiabatic solvent dissociation energy, and it marks the completion of the second solvation layer for the lithium ion-nitromethane clusters. It corresponds to the n = 18 system, a solvated ion with the first sheath having octahedral symmetry, weakly bound to an eight-membered and a four-membered ring crowning a vertex of the octahedron. Variational Monte Carlo estimates of the adiabatic solvent dissociation energy reveal that quantum effects further enhance the stability of the n = 18 system relative to its neighbors.
Biomarkers for IgA nephropathy on the basis of multi-hit pathogenesis.
Suzuki, Hitoshi
2018-05-08
IgA nephropathy (IgAN) is the most prevalent glomerular disease worldwide and is associated with a poor prognosis. Development of curative treatment strategies and approaches for early diagnosis is necessary. Renal biopsy is the gold standard for the diagnosis and assessment of disease activity. However, reliable biomarkers are needed for the noninvasive diagnosis of this disease and to more fully delineate the risk of progression. With regard to the pathogenesis of IgAN, the multi-hit hypothesis, including production of galactose-deficient IgA1 (Gd-IgA1; Hit 1), IgG or IgA autoantibodies that recognize Gd-IgA1 (Hit 2), and their subsequent immune complexes formation (Hit 3) and glomerular deposition (Hit 4), has been widely supported by many studies. Although the prognostic values of several biomarkers have been discussed, we recently developed a highly sensitive and specific diagnostic method by measuring serum levels of Gd-IgA1 and Gd-IgA1-containing immune complexes. In addition, urinary Gd-IgA1 may represent a disease-specific biomarker for IgAN. We also confirmed that there is a significant correlation between serum levels of these effector molecules and disease activity, suggesting that each can be considered a practical surrogate marker of therapeutic response. Thus, these disease-oriented specific serum and urine biomarkers may be useful for screening of potential IgAN with isolated hematuria, earlier diagnosis, disease activity, and eventually, response to treatment. In this review, we discuss these concepts, with a focus on potential clinical applications of these biomarkers.
Pt-Zn Clusters on Stoichiometric MgO(100) and TiO2(110): Dramatically Different Sintering Behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dadras, Mostafa J.; Shen, Lu; Alexandrova, Anastassia N.
2015-03-02
Zn was suggested to be a promising additive to Pt in the catalysis of dehydrogenation reactions. In this work, mixed Pt-Zn clusters deposited on two simple oxides, MgO(100) and TiO2(110), were investigated. The stability of these systems against cluster sintering, one of the major mechanisms of catalyst deactivation, is simulated using a Metropolis Monte Carlo scheme under the assumption of the Ostwald ripening mechanism. Particle migration, association to and dissociation from clusters, and evaporation and redeposition of monomers were all included in the simulations. Simulations are done at several high temperatures relevant to reactions of catalytic dehydrogenation. The effect of temperature is included via both the Metropolis algorithm and the Boltzmann-weighted populations of the global and thermally accessible local minima on the density functional theory potential energy surfaces of clusters of all sizes and compositions up to tetramers. On both surfaces, clusters are shown to sinter quite rapidly. However, the resultant compositions of the clusters most resistant to sintering are quite different on the two supports. On TiO2(110), Pt and Zn appear to phase separate, preferentially forming clusters rich in just one or the other metal. On MgO(100), Pt and Zn remain well-mixed and form a range of bimetallic clusters of various compositions that appear relatively stable. However, Zn is more easily lost from MgO through evaporation. These phenomena were rationalized by several means of chemical bonding analysis.
Improving and Evaluating Nested Sampling Algorithm for Marginal Likelihood Estimation
NASA Astrophysics Data System (ADS)
Ye, M.; Zeng, X.; Wu, J.; Wang, D.; Liu, J.
2016-12-01
With the growing impacts of climate change and human activities on the water cycle, an increasing number of studies focus on quantifying modeling uncertainty. Bayesian model averaging (BMA) provides a popular framework for quantifying conceptual model and parameter uncertainty. The ensemble prediction is generated by combining each plausible model's prediction, and each model carries a weight determined by its prior weight and marginal likelihood. Thus, estimating a model's marginal likelihood is crucial for reliable and accurate BMA prediction. The nested sampling estimator (NSE) is a newly proposed method for marginal likelihood estimation. NSE proceeds by searching the parameter space gradually from low-likelihood to high-likelihood areas, and this evolution is carried out iteratively via a local sampling procedure. Thus, the efficiency of NSE is dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm is often used for local sampling. However, M-H is not an efficient sampling algorithm for high-dimensional or complicated parameter spaces. To improve the efficiency of NSE, it is therefore natural to incorporate the robust and efficient sampling algorithm DREAMzs into the local sampling of NSE. The comparison results demonstrated that the improved NSE improves the efficiency of marginal likelihood estimation significantly. However, both the improved and original NSEs suffer from heavy instability. In addition, the heavy computational cost of a huge number of model executions is overcome by using adaptive sparse grid surrogates.
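The basic nested sampling recursion, with a plain Metropolis walk as the local sampler, can be sketched on a one-dimensional toy problem whose marginal likelihood is known almost exactly; the live-point count, step scaling, and stopping rule are illustrative assumptions, not the paper's NSE or DREAMzs variant.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy evidence problem: prior Uniform(-5, 5), likelihood N(theta; 0, 1),
# so the marginal likelihood is (Phi(5) - Phi(-5)) / 10, i.e. ~0.1.
def loglike(t):
    return -0.5 * t ** 2 - 0.5 * np.log(2.0 * np.pi)

def nested_sampling(n_live=100, n_iter=1000):
    live = rng.uniform(-5.0, 5.0, n_live)
    ll = np.array([loglike(t) for t in live])
    logz = -np.inf
    logw = np.log(1.0 - np.exp(-1.0 / n_live))    # shell-width prefactor
    for i in range(n_iter):
        worst = int(np.argmin(ll))
        # accumulate Z += (prior shell width) * (worst likelihood)
        logz = np.logaddexp(logz, logw - i / n_live + ll[worst])
        threshold = ll[worst]
        # local sampling: Metropolis walk constrained to loglike > threshold,
        # started from a randomly chosen live point; this is the step NSE's
        # efficiency hinges on
        cur = live[rng.integers(n_live)]
        scale = 2.0 * max(np.std(live), 1e-12)
        for _ in range(20):
            cand = cur + scale * rng.standard_normal()
            if -5.0 < cand < 5.0 and loglike(cand) > threshold:
                cur = cand
        live[worst], ll[worst] = cur, loglike(cur)
    return logz

logz = nested_sampling()
```

As the likelihood-constrained region shrinks, simple random-walk proposals are rejected more often, which is exactly the weakness that motivates replacing M-H with a stronger local sampler.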
Sampling algorithms for validation of supervised learning models for Ising-like systems
NASA Astrophysics Data System (ADS)
Portman, Nataliya; Tamblyn, Isaac
2017-12-01
In this paper, we build and explore supervised learning models of ferromagnetic system behavior, using Monte Carlo sampling of the spin configuration space generated by the 2D Ising model. Given the enormous size of the space of all possible Ising model realizations, the question arises as to how to choose a reasonable number of samples that will form physically meaningful and non-intersecting training and testing datasets. Here, we propose a sampling technique called "ID-MH" that uses the Metropolis-Hastings algorithm to create a Markov process across energy levels within a predefined configuration subspace. We show that application of this method retains phase transitions in both training and testing datasets and serves the purpose of validating a machine learning algorithm. For larger lattice dimensions, ID-MH is not feasible, as it requires knowledge of the complete configuration space. We therefore develop a new "block-ID" sampling strategy: it decomposes the given structure into square blocks with lattice dimension N ≤ 5 and uses ID-MH sampling of candidate blocks. Further comparison of the performance of commonly used machine learning methods such as random forests, decision trees, k-nearest neighbors and artificial neural networks shows that the PCA-based decision tree regressor is the most accurate predictor of magnetizations of the Ising model. For energies, however, the accuracy of prediction is not satisfactory, highlighting the need to consider more algorithmically complex methods (e.g., deep learning).
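The underlying Metropolis-Hastings update for the 2D Ising model, on which samplers like ID-MH build, can be sketched as follows (a plain single-spin-flip sampler, not the authors' energy-level construction):

```python
import math
import random

def metropolis_sweep(spins, n, beta, rng):
    """One Metropolis sweep of an n x n Ising lattice with periodic
    boundaries: flip a spin with probability min(1, exp(-beta * dE))."""
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
              + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
        d_e = 2 * spins[i][j] * nb          # energy change for flipping (i, j)
        if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
            spins[i][j] = -spins[i][j]

rng = random.Random(1)
n = 16
spins = [[1] * n for _ in range(n)]
beta_low_t = 1.0   # well below the critical point (beta_c ~ 0.4407)
for _ in range(100):
    metropolis_sweep(spins, n, beta_low_t, rng)
m = abs(sum(sum(row) for row in spins)) / (n * n)
# At this low temperature the lattice stays strongly magnetized.
```

Configurations drawn this way (at a range of temperatures) are exactly the kind of labeled samples used to train and test the regressors discussed above.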
Magnetocaloric effect in Sr2CrIrO6 double perovskite: Monte Carlo simulation
NASA Astrophysics Data System (ADS)
El Rhazouani, O.; Slassi, A.; Ziat, Y.; Benyoussef, A.
2017-05-01
Monte Carlo simulation (MCS) combined with the Metropolis algorithm has been performed to study the magnetocaloric effect (MCE) in the promising double perovskite (DP) Sr2CrIrO6, which has not yet been synthesized. This paper presents the global magneto-thermodynamic behavior of the Sr2CrIrO6 compound in terms of the MCE and discusses this behavior in comparison to other DPs. The thermal dependence of the magnetization has been investigated for different values of the reduced external magnetic field. The thermal magnetic entropy and its change have been obtained. The adiabatic temperature change and the relative cooling power have been established. The results suggest that Sr2CrIrO6 could have potential applications in magnetic refrigeration over a wide temperature range above room temperature and at large magnetic fields.
NASA Astrophysics Data System (ADS)
Zaim, N.; Zaim, A.; Kerouad, M.
2017-02-01
In this work, the magnetic behavior of a cylindrical nanowire, consisting of a ferromagnetic core of spin-1 atoms surrounded by a ferromagnetic shell of spin-1 atoms, is studied in the presence of a random crystal field interaction. Monte Carlo simulation based on the Metropolis algorithm has been used to investigate the effects of the concentration of the random crystal field p, the crystal field D and the shell exchange interaction Js on the phase diagrams and the hysteresis behavior of the system. Some characteristic behaviors have been found, such as first- and second-order phase transitions joined by a tricritical point for appropriate values of the system parameters; triple and isolated critical points can also be found. Depending on the Hamiltonian parameters, single, double and para hysteresis regions are explicitly determined.
Korzeniewski, Steven J; Romero, Roberto; Cortez, Josepf; Pappas, Athina; Schwartz, Alyse G; Kim, Chong Jai; Kim, Jung-Sun; Kim, Yeon Mee; Yoon, Bo Hyun; Chaiworapongsa, Tinnakorn; Hassan, Sonia S
2014-11-01
We sought to determine whether cumulative evidence of perinatal inflammation was associated with increased risk in a "multi-hit" model of neonatal white matter injury (WMI). This retrospective cohort study included very preterm (gestational ages at delivery <32 weeks) live-born singleton neonates delivered at Hutzel Women's Hospital, Detroit, MI, from 2006 to 2011. Four pathologists blinded to clinical diagnoses and outcomes performed histological examinations according to standardized protocols. Neurosonography was obtained per routine clinical care. The primary indicator of WMI was ventriculomegaly (VE). Neonatal inflammation-initiating illnesses included bacteremia, surgical necrotizing enterocolitis, other infections, and those requiring mechanical ventilation. A total of 425 live-born singleton neonates delivered before the 32nd week of gestation were included. Newborns delivered of pregnancies affected by chronic chorioamnionitis who had histologic evidence of an acute fetal inflammatory response were at increased risk of VE, unlike those without funisitis, relative to referent newborns without either condition, adjusting for gestational age [odds ratio (OR) 4.7; 95% confidence interval (CI) 1.4-15.8 vs. OR 1.3; 95% CI 0.7-2.6]. Similarly, newborns with funisitis who developed neonatal inflammation-initiating illness were at increased risk of VE, unlike those who did not develop such illness, compared to the referent group without either condition [OR 3.6 (95% CI 1.5-8.3) vs. OR 1.7 (95% CI 0.5-5.5)]. The greater the number of these three types of inflammation documented, the higher the risk of VE (P<0.0001). Chronic placental inflammation, acute fetal inflammation, and neonatal inflammation-initiating illness seem to interact in contributing risk information and/or directly damaging the developing brain of newborns delivered very preterm.
BCL::MP-Fold: membrane protein structure prediction guided by EPR restraints
Fischer, Axel W.; Alexander, Nathan S.; Woetzel, Nils; Karakaş, Mert; Weiner, Brian E.; Meiler, Jens
2016-01-01
For many membrane proteins, the determination of their topology remains a challenge for methods like X-ray crystallography and nuclear magnetic resonance (NMR) spectroscopy. Electron paramagnetic resonance (EPR) spectroscopy has evolved as an alternative technique to study the structure and dynamics of membrane proteins. The present study demonstrates the feasibility of membrane protein topology determination using limited EPR distance and accessibility measurements. The BCL::MP-Fold algorithm assembles secondary structure elements (SSEs) in the membrane using a Monte Carlo Metropolis (MCM) approach. Sampled models are evaluated using knowledge-based potential functions and their agreement with the EPR data. Twenty-nine membrane proteins of up to 696 residues are used to test the algorithm. The protein-size-normalized root-mean-square deviation (RMSD100) value of the most accurate model is better than 8 Å for twenty-seven, better than 6 Å for twenty-two, and better than 4 Å for fifteen out of twenty-nine proteins, demonstrating the algorithm's ability to sample the native topology. The average enrichment could be improved from 1.3 to 2.5, showing the improved discrimination power gained by using EPR data. PMID:25820805
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curotto, E., E-mail: curotto@arcadia.edu
2015-12-07
Structural optimizations, classical NVT ensemble, and variational Monte Carlo simulations of ion Stockmayer clusters parameterized to approximate the Li{sup +}(CH{sub 3}NO{sub 2}){sub n} (n = 1–20) systems are performed. The Metropolis algorithm enhanced by the parallel tempering strategy is used to measure internal energies and heat capacities, and a parallel version of the genetic algorithm is employed to obtain the most important minima. The first solvation sheath is octahedral and this feature remains the dominant theme in the structure of clusters with n ≥ 6. The first "magic number" is identified using the adiabatic solvent dissociation energy, and it marks the completion of the second solvation layer for the lithium ion-nitromethane clusters. It corresponds to the n = 18 system, a solvated ion with the first sheath having octahedral symmetry, weakly bound to an eight-membered and a four-membered ring crowning a vertex of the octahedron. Variational Monte Carlo estimates of the adiabatic solvent dissociation energy reveal that quantum effects further enhance the stability of the n = 18 system relative to its neighbors.
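The parallel tempering enhancement of the Metropolis algorithm can be sketched on a toy double-well landscape (standing in for the cluster potential energy surface): replicas at several temperatures run ordinary Metropolis updates and periodically attempt configuration swaps between adjacent temperatures.

```python
import math
import random

def energy(x):
    # Double-well potential with minima at x = +/-1 (toy rugged landscape).
    return (x * x - 1.0) ** 2

def pt_step(xs, betas, rng, step=0.5):
    """One parallel-tempering cycle: a Metropolis update per replica,
    then a swap attempt between a random adjacent temperature pair,
    accepted with probability min(1, exp((b_i - b_j)(E_i - E_j)))."""
    for k, beta in enumerate(betas):
        trial = xs[k] + rng.uniform(-step, step)
        d_e = energy(trial) - energy(xs[k])
        if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
            xs[k] = trial
    k = rng.randrange(len(betas) - 1)
    d = (betas[k] - betas[k + 1]) * (energy(xs[k]) - energy(xs[k + 1]))
    if d >= 0 or rng.random() < math.exp(d):
        xs[k], xs[k + 1] = xs[k + 1], xs[k]

rng = random.Random(7)
betas = [8.0, 4.0, 2.0, 1.0, 0.5]   # cold to hot
xs = [1.0] * len(betas)
cold_visits_left = 0
for _ in range(20000):
    pt_step(xs, betas, rng)
    if xs[0] < 0:
        cold_visits_left += 1
```

The cold replica, which on its own would rarely cross the barrier, visits both wells thanks to configurations handed down from the hot replicas; this is what lets the method locate competing minima efficiently.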
The Full Monte Carlo: A Live Performance with Stars
NASA Astrophysics Data System (ADS)
Meng, Xiao-Li
2014-06-01
Markov chain Monte Carlo (MCMC) is being applied increasingly often in modern Astrostatistics. It is indeed incredibly powerful, but also very dangerous. It is popular because of its apparent generality (from simple to highly complex problems) and simplicity (the availability of out-of-the-box recipes). It is dangerous because it always produces something, but there is no surefire way to verify or even diagnose that the "something" is remotely close to what the MCMC theory predicts or one hopes. Using very simple models (e.g., conditionally Gaussian), this talk starts with a tutorial of the two most popular MCMC algorithms, namely the Gibbs sampler and the Metropolis-Hastings algorithm, and illustrates their good, bad, and ugly implementations via live demonstration. The talk ends with a story of how a recent advance, the Ancillary-Sufficient Interweaving Strategy (ASIS) (Yu and Meng, 2011, http://www.stat.harvard.edu/Faculty_Content/meng/jcgs.2011-article.pdf), reduces the danger. It was discovered almost by accident during a Ph.D. student's (Yaming Yu) struggle with fitting a Cox process model for detecting changes in source intensity of photon counts observed by the Chandra X-ray telescope from a (candidate) neutron/quark star.
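A toy version of the conditionally Gaussian setting mentioned above: a Gibbs sampler that alternates the two exact conditionals of a zero-mean bivariate normal (a sketch only; the talk's live demonstrations are not reproduced here):

```python
import random

def gibbs_bivariate_normal(rho, n_samples, rng):
    """Gibbs sampler for a zero-mean bivariate normal with correlation rho.
    Each full conditional is Gaussian: x | y ~ N(rho * y, 1 - rho^2)."""
    x, y = 0.0, 0.0
    sd = (1.0 - rho * rho) ** 0.5
    samples = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)   # update x given the current y
        y = rng.gauss(rho * x, sd)   # update y given the new x
        samples.append((x, y))
    return samples

rng = random.Random(3)
samples = gibbs_bivariate_normal(rho=0.8, n_samples=20000, rng=rng)
xs = [s[0] for s in samples[1000:]]   # drop a short burn-in
ys = [s[1] for s in samples[1000:]]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
vx = sum((a - mx) ** 2 for a in xs) / n
vy = sum((b - my) ** 2 for b in ys) / n
corr = cov / (vx * vy) ** 0.5
```

The sample correlation recovers rho; as rho approaches 1 the chain mixes ever more slowly, which is exactly the kind of "bad implementation" regime that strategies like ASIS are designed to escape.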
An implementation of differential evolution algorithm for inversion of geoelectrical data
NASA Astrophysics Data System (ADS)
Balkaya, Çağlayan
2013-11-01
Differential evolution (DE), a population-based evolutionary algorithm (EA), has been implemented to invert self-potential (SP) and vertical electrical sounding (VES) data sets. The algorithm uses three operators, mutation, crossover and selection, similar to a genetic algorithm (GA). Mutation is the most important operator for the success of DE. Three commonly used mutation strategies, DE/best/1 (strategy 1), DE/rand/1 (strategy 2) and DE/rand-to-best/1 (strategy 3), were applied together with a binomial-type crossover. The evolution cycle of DE was realized without boundary constraints. For the test studies performed with SP data, in addition to both noise-free and noisy synthetic data sets, two field data sets observed over the sulfide ore body in the Malachite mine (Colorado) and over the ore bodies in the Neem-Ka Thana copper belt (India) were considered. VES test studies were carried out using synthetically produced resistivity data representing a three-layered earth model and a field data set example from Gökçeada (Turkey), which displays a seawater infiltration problem. The mutation strategies mentioned above were also extensively tested on both the synthetic and field data sets under consideration. Of these, strategy 1 was found to be the most effective strategy for parameter estimation, providing lower computational cost together with good accuracy. The solutions obtained by DE for the synthetic SP cases were quite consistent with those of particle swarm optimization (PSO), a population-based optimization algorithm more widely used in geophysics than DE. Estimated parameters of the SP and VES data were also compared with those obtained from the Metropolis-Hastings (M-H) sampling algorithm based on simulated annealing (SA) without cooling to clarify uncertainties in the solutions. Comparison to the M-H algorithm shows that DE performs fast approximate posterior sampling for low-dimensional inverse geophysical problems.
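The DE/best/1 strategy with binomial crossover (and, as in the study, no boundary constraints during evolution) can be sketched as follows; the sphere function stands in for a geophysical misfit, so this is an illustrative toy, not the paper's SP/VES inversion:

```python
import random

def de_best_1(objective, bounds, n_pop=30, n_gen=200, f=0.8, cr=0.9, rng=None):
    """Differential evolution with DE/best/1 mutation, binomial crossover,
    and greedy selection; the evolution cycle is unconstrained, only the
    initial population uses the bounds."""
    rng = rng or random.Random(0)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_pop)]
    costs = [objective(p) for p in pop]
    for _ in range(n_gen):
        best = pop[min(range(n_pop), key=lambda i: costs[i])]
        for i in range(n_pop):
            r1, r2 = rng.sample([j for j in range(n_pop) if j != i], 2)
            j_rand = rng.randrange(dim)   # guarantees at least one mutated gene
            trial = list(pop[i])
            for j in range(dim):
                if rng.random() < cr or j == j_rand:
                    # DE/best/1: perturb the current best member by a
                    # scaled difference of two random members.
                    trial[j] = best[j] + f * (pop[r1][j] - pop[r2][j])
            c = objective(trial)
            if c <= costs[i]:             # greedy selection
                pop[i], costs[i] = trial, c
    k = min(range(n_pop), key=lambda i: costs[i])
    return pop[k], costs[k]

# Toy misfit: sphere function, minimum at the origin.
sol, cost = de_best_1(lambda p: sum(v * v for v in p),
                      bounds=[(-10.0, 10.0)] * 3, rng=random.Random(5))
```

Because the mutation is centered on the best member, DE/best/1 converges quickly on unimodal misfits, which matches the low computational cost the study reports for strategy 1.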
Entanglement complexity in quantum many-body dynamics, thermalization, and localization
NASA Astrophysics Data System (ADS)
Yang, Zhi-Cheng; Hamma, Alioscia; Giampaolo, Salvatore M.; Mucciolo, Eduardo R.; Chamon, Claudio
2017-07-01
Entanglement is usually quantified by von Neumann entropy, but its properties are much more complex than what can be expressed with a single number. We show that the three distinct dynamical phases known as thermalization, Anderson localization, and many-body localization are marked by different patterns of the spectrum of the reduced density matrix for a state evolved after a quantum quench. While the entanglement spectrum displays Poisson statistics for the case of Anderson localization, it displays universal Wigner-Dyson statistics for both many-body localization and thermalization, although the universal distribution is asymptotically reached within very different time scales in these two cases. We further show that the complexity of entanglement, revealed by the possibility of disentangling the state through a Metropolis-like algorithm, is signaled by whether the entanglement spectrum level spacing is Poisson or Wigner-Dyson distributed.
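The Poisson versus Wigner-Dyson distinction is commonly diagnosed via the mean ratio of adjacent level spacings. A sketch with synthetic spectra: uncorrelated uniform levels give Poisson statistics, while independent spacings drawn from the Wigner surmise serve as a simple level-repelling stand-in for a true Wigner-Dyson spectrum (this toy does not reproduce the paper's entanglement spectra):

```python
import math
import random

def mean_gap_ratio(levels):
    """Mean ratio r = min(s_n, s_(n+1)) / max(s_n, s_(n+1)) of adjacent
    level spacings. For Poisson statistics <r> = 2 ln 2 - 1 ~ 0.386;
    level repulsion pushes <r> higher."""
    s = [b - a for a, b in zip(levels, levels[1:])]
    ratios = [min(a, b) / max(a, b) for a, b in zip(s, s[1:])]
    return sum(ratios) / len(ratios)

rng = random.Random(11)
n = 20000

# Poisson spectrum: independent uniform levels, no level repulsion.
poisson_levels = sorted(rng.random() * n for _ in range(n))
r_poisson = mean_gap_ratio(poisson_levels)

# Level-repelling surrogate: spacings from the Wigner surmise
# p(s) = (pi/2) s exp(-pi s^2 / 4), sampled by inverse CDF.
levels, x = [], 0.0
for _ in range(n):
    x += math.sqrt(-4.0 * math.log(1.0 - rng.random()) / math.pi)
    levels.append(x)
r_wigner = mean_gap_ratio(levels)
```

The gap-ratio statistic needs no spectral unfolding, which is why it is a popular numerical probe of exactly the Poisson/Wigner-Dyson dichotomy the abstract uses.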
Exchange bias training relaxation in spin glass/ferromagnet bilayers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chi, Xiaodan; Du, An; Rui, Wenbin
2016-04-25
A canonical spin glass (SG) FeAu layer is fabricated to couple to a soft ferromagnet (FM) FeNi layer. Below the SG freezing temperature, exchange bias (EB) and training are observed. Training in SG/FM bilayers is insensitive to the cooling field and may suppress the EB or change the sign of the EB field from negative to positive at specific temperatures, deviating from the simple power law or the single exponential function derived from antiferromagnet-based systems. In view of the SG nature, we employ a double decay model to distinguish the contributions to training from the SG bulk and the SG/FM interface. Dynamical properties during training under different cooling fields and at different temperatures are discussed, and the nonzero shifting coefficient in the time index, a signature of slowing-down decay for SG-based systems, is interpreted by means of a modified Monte Carlo Metropolis algorithm.
Markov Chain Monte Carlo from Lagrangian Dynamics.
Lan, Shiwei; Stathopoulos, Vasileios; Shahbaba, Babak; Girolami, Mark
2015-04-01
Hamiltonian Monte Carlo (HMC) improves the computational efficiency of the Metropolis-Hastings algorithm by reducing its random walk behavior. Riemannian HMC (RHMC) further improves the performance of HMC by exploiting the geometric properties of the parameter space. However, the geometric integrator used for RHMC involves implicit equations that require fixed-point iterations. In some cases, the computational overhead for solving implicit equations undermines RHMC's benefits. In an attempt to circumvent this problem, we propose an explicit integrator that replaces the momentum variable in RHMC by velocity. We show that the resulting transformation is equivalent to transforming Riemannian Hamiltonian dynamics to Lagrangian dynamics. Experimental results suggest that our method improves RHMC's overall computational efficiency in the cases considered. All computer programs and data sets are available online (http://www.ics.uci.edu/~babaks/Site/Codes.html) in order to allow replication of the results reported in this paper.
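The baseline HMC being improved upon can be sketched in one dimension: a leapfrog integrator simulates Hamiltonian dynamics, and a Metropolis accept/reject step corrects the integration error (this is plain Euclidean HMC, not the Riemannian or Lagrangian variants the paper develops):

```python
import math
import random

def leapfrog(q, p, grad_u, eps, n_steps):
    """Leapfrog integrator for H(q, p) = U(q) + p^2 / 2."""
    p -= 0.5 * eps * grad_u(q)
    for _ in range(n_steps - 1):
        q += eps * p
        p -= eps * grad_u(q)
    q += eps * p
    p -= 0.5 * eps * grad_u(q)
    return q, p

def hmc(u, grad_u, n_samples, eps=0.2, n_steps=10, rng=None):
    """Minimal 1-D Hamiltonian Monte Carlo targeting exp(-U(q)), with a
    Metropolis correction for the discretized dynamics."""
    rng = rng or random.Random(0)
    q, samples = 0.0, []
    for _ in range(n_samples):
        p0 = rng.gauss(0.0, 1.0)              # fresh momentum each trajectory
        q_new, p_new = leapfrog(q, p0, grad_u, eps, n_steps)
        h_old = u(q) + 0.5 * p0 * p0
        h_new = u(q_new) + 0.5 * p_new * p_new
        if rng.random() < math.exp(min(0.0, h_old - h_new)):
            q = q_new
        samples.append(q)
    return samples

# Target: standard normal, U(q) = q^2 / 2, grad U = q.
samples = hmc(lambda q: 0.5 * q * q, lambda q: q,
              n_samples=5000, rng=random.Random(9))
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Because whole trajectories are proposed instead of local random-walk steps, successive samples decorrelate far faster than under plain Metropolis-Hastings, which is the efficiency gain RHMC then builds on.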
NASA Astrophysics Data System (ADS)
Banerjee, D.; Jiang, F.-J.; Olesen, T. Z.; Orland, P.; Wiese, U.-J.
2018-05-01
We consider the (2+1)-dimensional SU(2) quantum link model on the honeycomb lattice and show that it is equivalent to a quantum dimer model on the kagome lattice. The model has crystalline confined phases with spontaneously broken translation invariance associated with pinwheel order, which is investigated with either a Metropolis or an efficient cluster algorithm. External half-integer non-Abelian charges [which transform nontrivially under the Z(2) center of the SU(2) gauge group] are confined to each other by fractionalized strings with a delocalized Z(2) flux. The strands of the fractionalized flux strings are domain walls that separate distinct pinwheel phases. A second-order phase transition in the three-dimensional Ising universality class separates two confining phases: one with correlated pinwheel orientations, and the other with uncorrelated pinwheel orientations.
NASA Astrophysics Data System (ADS)
Riza, Yose; Cheris, Rika; Repi
2017-12-01
The development of Pekanbaru City is very rapid; as a consequence, buildings, areas and cultural objects that need to be preserved are constantly being disrupted and replaced by commercially oriented, economically driven development. This contradiction inherent in metropolitan construction is the root of the problem for urban areas. Kampong Bandar Senapelan, located on the banks of the Siak River, is the early settlement from which the town of Pekanbaru grew. The settlement has a typology of Malay and vernacular Malay architecture. The existence of these villages is a matter of concern, as the city's development toward a metropolis has degraded the historical value of urban development in this region. This study was conducted to assess the importance of preserving Kampong Bandar Senapelan as the oldest area of the city and one with great influence on its development into a metropolis. Preservation of historical and cultural heritage through conservation and preservation measures is one of the urban design elements that all city stakeholders should consider in order to safeguard the civilization of a generation. The benchmarks of this assessment are history, conservation and urban development toward the metropolis. Awareness of conserving the city through conservation and preservation in this area can bring new characters and values to the buildings and their environment and will create an atmosphere different from that of rapid, modern-style development. In addition, this preservation will be evident in a harmonious life with a high tolerance among the multiple ethnicities that have co-existed here since the past.
NASA Astrophysics Data System (ADS)
Mäkelä, Jarmo; Susiluoto, Jouni; Markkanen, Tiina; Aurela, Mika; Järvinen, Heikki; Mammarella, Ivan; Hagemann, Stefan; Aalto, Tuula
2016-12-01
We examined parameter optimisation in the JSBACH (Kaminski et al., 2013; Knorr and Kattge, 2005; Reick et al., 2013) ecosystem model, applied to two boreal forest sites (Hyytiälä and Sodankylä) in Finland. We identified and tested key parameters in soil hydrology and forest water and carbon-exchange-related formulations, and optimised them using the adaptive Metropolis (AM) algorithm for Hyytiälä with a 5-year calibration period (2000-2004) followed by a 4-year validation period (2005-2008). Sodankylä acted as an independent validation site, where optimisations were not made. The tuning provided estimates of the full distribution of possible parameters, along with information about correlation, sensitivity and identifiability. Some parameters were correlated with each other due to a phenomenological connection between carbon uptake and water stress, or due to other connections arising from the set-up of the model formulations. The latter holds especially for vegetation phenology parameters. The least identifiable parameters include phenology parameters, parameters connecting relative humidity and soil dryness, and the field capacity of the skin reservoir. These soil parameters were masked by the large contribution from vegetation transpiration. In addition to leaf area index and the maximum carboxylation rate, the most effective parameters adjusting the gross primary production (GPP) and evapotranspiration (ET) fluxes in seasonal tuning were related to soil wilting point, drainage and moisture stress imposed on vegetation. For daily and half-hourly tunings the most important parameters were the ratio of leaf internal CO2 concentration to external CO2 and the parameter connecting relative humidity and soil dryness. Effectively, the seasonal tuning transferred water from soil moisture into ET, and the daily and half-hourly tunings reversed this process.
The seasonal tuning improved the month-to-month development of GPP and ET, and produced the most stable estimates of water use efficiency. When compared to the seasonal tuning, the daily tuning is worse on the seasonal scale. However, the daily parametrisation reproduced the observations for the average diurnal cycle best, except for the GPP in the Sodankylä validation period, where half-hourly tuned parameters were better. In general, the daily tuning provided the largest reduction in model-data mismatch. The model's response to drought was unaffected by our parametrisations, and further studies are needed on enhancing the dry response in JSBACH.
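The adaptive Metropolis idea used for the JSBACH calibration can be sketched in one dimension: the Gaussian proposal variance is tuned on the fly from the running variance of the chain. This is a generic 1-D sketch on a toy target, not the multi-parameter ecosystem-model setup above:

```python
import math
import random

def adaptive_metropolis(log_post, n_samples, rng, scale=2.4 ** 2, eps=1e-6):
    """One-dimensional adaptive Metropolis sketch: the proposal variance
    tracks the running variance of the chain, scaled by 2.4^2 (the
    classic adaptive-scaling choice for dimension 1), floored by eps."""
    x = 0.0
    chain = []
    mean, m2 = 0.0, 0.0          # Welford running mean / squared deviations
    for n in range(1, n_samples + 1):
        var = m2 / (n - 2) if n > 3 else 1.0
        prop_sd = math.sqrt(scale * var + eps)
        cand = rng.gauss(x, prop_sd)
        if math.log(rng.random()) < log_post(cand) - log_post(x):
            x = cand
        chain.append(x)
        delta = x - mean         # update the running statistics
        mean += delta / n
        m2 += delta * (x - mean)
    return chain

# Toy posterior: normal with mean 3 and standard deviation 0.5.
rng = random.Random(4)
chain = adaptive_metropolis(
    lambda t: -0.5 * ((t - 3.0) / 0.5) ** 2, 20000, rng)
post = chain[5000:]              # discard burn-in
m = sum(post) / len(post)
```

The adaptation removes the need to hand-tune the proposal width for each parameter, which is what makes AM practical for calibrations with many heterogeneous parameters such as the one described above.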
Chou, Sheng-Kai; Jiau, Ming-Kai; Huang, Shih-Chia
2016-08-01
The growing ubiquity of vehicles has led to increased concerns about environmental issues. These concerns can be mitigated by implementing an effective carpool service. In an intelligent carpool system, an automated service process assists carpool participants in determining routes and matches. This is a discrete optimization problem that involves a system-wide condition as well as participants' expectations. In this paper, we solve the carpool service problem (CSP) to provide satisfactory ride matches. To this end, we developed a particle swarm carpool algorithm based on stochastic set-based particle swarm optimization (PSO). Our method introduces stochastic coding to augment traditional particles, and uses three terms to represent a particle: 1) particle position; 2) particle view; and 3) particle velocity. In this way, the set-based PSO (S-PSO) can be realized by local exploration. In the simulation and experiments, two kinds of discrete PSOs, S-PSO and binary PSO (BPSO), and a genetic algorithm (GA) are compared and examined using test benchmarks that simulate a real-world metropolis. We observed that the S-PSO consistently outperformed the BPSO and the GA. Moreover, our method yielded the best result in a statistical test and successfully obtained numerical results meeting the optimization objectives of the CSP.
Chodera, John D; Shirts, Michael R
2011-11-21
The widespread popularity of replica exchange and expanded ensemble algorithms for simulating complex molecular systems in chemistry and biophysics has generated much interest in discovering new ways to enhance the phase space mixing of these protocols in order to improve sampling of uncorrelated configurations. Here, we demonstrate how both of these classes of algorithms can be considered as special cases of Gibbs sampling within a Markov chain Monte Carlo framework. Gibbs sampling is a well-studied scheme in the field of statistical inference in which different random variables are alternately updated from conditional distributions. While the update of the conformational degrees of freedom by Metropolis Monte Carlo or molecular dynamics unavoidably generates correlated samples, we show how judicious updating of the thermodynamic state indices--corresponding to thermodynamic parameters such as temperature or alchemical coupling variables--can substantially increase mixing while still sampling from the desired distributions. We show how state update methods in common use can lead to suboptimal mixing, and present some simple, inexpensive alternatives that can increase mixing of the overall Markov chain, reducing simulation times necessary to obtain estimates of the desired precision. These improved schemes are demonstrated for several common applications, including an alchemical expanded ensemble simulation, parallel tempering, and multidimensional replica exchange umbrella sampling.
Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach
NASA Technical Reports Server (NTRS)
Warner, James E.; Hochhalter, Jacob D.
2016-01-01
This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
Wusu, Onipede
2013-06-01
The influence of adolescents' exposure to the sexual health content of mass media on their sexual health behaviour in Nigeria is still not clear. Data were gathered through a survey conducted among adolescents aged 12-19 years in Lagos metropolis between November 2009 and February 2010. A multistage sampling strategy was adopted in selecting respondents. A logistic regression technique was utilised in the analysis. The results indicate that the respondents were most frequently exposed to TV (male = 92.2; female = 94.9) and radio (male = 88.2; female = 91.7) media. The odds ratios indicate that the sexual health content of mass media significantly predicted condom use, multiple sexual relationships, sexual intercourse and self-reported occurrence of abortion in the study sample. The findings imply that positive media sexual health content is likely to promote sexual health among adolescents, but negative content can put adolescents' sexual health in danger. In addition, safe sex can be advanced among adolescents if the media provide accurate information on sexuality, emphasising the dangers of risky sexual practices. Finally, this study posits that accurate portrayal of sexuality in the media would contribute immensely to improving public health in the metropolis.
Gumbo, B
2000-01-01
The Harare metropolis in Zimbabwe, extending upstream from Manyame Dam in the Upper Manyame River Basin, consists of the City of Harare and its satellite towns: Chitungwiza, Norton, Epworth and Ruwa. The existing urban drainage system is typically a single-use mixing system: water is used and discharged to "waste", excreta are flushed to sewers and eventually, after "treatment", the effluent is discharged to a drinking water supply source. Polluted urban storm water is evacuated as fast as possible. This system not only ignores the substantial value in "waste" materials, but it also exports problems to downstream communities and to vulnerable fresh-water sources. The question is how the Harare metropolis urban drainage system, which is complex and has evolved over time, can be rearranged to achieve sustainability (i.e. water conservation, pollution prevention at source, protection of the vulnerable drinking water sources and recovery of valuable materials). This paper reviews current concepts regarding the future development of the urban drainage system in line with the new vision of "Sustainable Cities of the Future". The Harare metropolis in Zimbabwe is taken as a case, and philosophical options for re-engineering the drainage system are discussed.
Assessment of radiation protection practices among radiographers in Lagos, Nigeria.
Eze, Cletus Uche; Abonyi, Livinus Chibuzo; Njoku, Jerome; Irurhe, Nicholas Kayode; Olowu, Oluwabola
2013-11-01
The use of ionising radiation in diagnostic radiography can lead to hazards such as somatic and genetic damage. Compliance with safe work and radiation protection practices can mitigate such risks. The aim of the study was to assess the knowledge and radiation protection practices among radiographers in Lagos, Nigeria. The study was a prospective cross-sectional survey. A convenience sampling technique was used to select four x-ray diagnostic centres in four tertiary hospitals in Lagos metropolis. Data were analysed with Epi Info software, version 3.5.1. The average score on the assessment of knowledge was 73%. Most modern radiation protection instruments were lacking in all the centres studied. Application of shielding devices such as gonad shields for protection was neglected mostly in government hospitals. Most x-ray machines were quite old, and evidence of quality assurance tests performed on such machines was lacking. Radiographers within Lagos metropolis showed an excellent knowledge of radiation protection within the study period. Adherence to radiation protection practices among radiographers in Lagos metropolis during the period studied was, however, poor. Radiographers in Lagos, Nigeria should embrace current trends in radiation protection and make more concerted efforts to apply their knowledge in protecting themselves and patients from the harmful effects of ionising radiation.
Multi-scale dynamics and relaxation of a tethered membrane in a solvent by Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Pandey, Ras; Anderson, Kelly; Farmer, Barry
2006-03-01
A tethered membrane modeled by a flexible sheet dissipates entropy as it wrinkles and crumples. Nodes of a coarse-grained membrane are connected via multiple pathways along which dynamical modes propagate. We consider a sheet with nodes connected by fluctuating bonds on a cubic lattice. The empty lattice sites constitute an effective solvent medium via a node-solvent interaction. Each node executes its stochastic motion via the Metropolis algorithm, subject to bond fluctuations, excluded volume constraints, and the interaction energy. The dynamics and conformation of the sheet are examined at a low and a high temperature, with attractive and repulsive node-node interactions for contrast, in an attractive solvent medium. Variations of the mean square displacement of the center node of the sheet and that of its center of mass with time steps are examined in detail; these show different power-law motion from short- to long-time regimes. Relaxation of the gyration radius and scaling of its asymptotic value with molecular weight are also examined.
Model selection and Bayesian inference for high-resolution seabed reflection inversion.
Dettmer, Jan; Dosso, Stan E; Holland, Charles W
2009-02-01
This paper applies Bayesian inference, including model selection and posterior parameter inference, to inversion of seabed reflection data to resolve sediment structure at a spatial scale below the pulse length of the acoustic source. A practical approach to model selection is used, employing the Bayesian information criterion to decide on the number of sediment layers needed to sufficiently fit the data while satisfying parsimony to avoid overparametrization. Posterior parameter inference is carried out using an efficient Metropolis-Hastings algorithm for high-dimensional models, and results are presented as marginal-probability depth distributions for sound velocity, density, and attenuation. The approach is applied to plane-wave reflection-coefficient inversion of single-bounce data collected on the Malta Plateau, Mediterranean Sea, which indicate complex fine structure close to the water-sediment interface. This fine structure is resolved in the geoacoustic inversion results in terms of four layers within the upper meter of sediments. The inversion results are in good agreement with parameter estimates from a gravity core taken at the experiment site.
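The Bayesian information criterion step can be illustrated on a toy regression problem where the true model is linear; this is a generic sketch of BIC-based parsimony, not the paper's seabed layering models:

```python
import math
import random

def fit_polynomial(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations with naive
    Gaussian elimination (adequate for tiny systems); returns the
    residual sum of squares (RSS)."""
    k = degree + 1
    a = [[sum(x ** (i + j) for x in xs) for j in range(k)] for i in range(k)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = a[r][col] / a[col][col]
            for c in range(col, k):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * k
    for r in range(k - 1, -1, -1):
        coef[r] = (b[r] - sum(a[r][c] * coef[c] for c in range(r + 1, k))) / a[r][r]
    return sum((y - sum(c * x ** i for i, c in enumerate(coef))) ** 2
               for x, y in zip(xs, ys))

def bic(rss, n, k):
    # Gaussian-noise BIC (up to constants): n ln(RSS/n) + k ln n; lower is better.
    return n * math.log(rss / n) + k * math.log(n)

rng = random.Random(6)
xs = [i / 100.0 for i in range(200)]
ys = [1.0 + 2.0 * x + rng.gauss(0.0, 0.1) for x in xs]   # truly linear data
scores = {d: bic(fit_polynomial(xs, ys, d), len(xs), d + 1) for d in range(4)}
best_degree = min(scores, key=scores.get)
# The ln(n) penalty discourages over-parametrization; BIC typically
# selects the true degree, 1, here.
```

Exactly this trade-off, fit improvement against a complexity penalty, is what fixes the number of sediment layers before the Metropolis-Hastings sampling of the retained model.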
PyDREAM: high-dimensional parameter inference for biological models in python.
Shockley, Erin M; Vrugt, Jasper A; Lopez, Carlos F; Valencia, Alfonso
2018-02-15
Biological models contain many parameters whose values are difficult to measure directly via experimentation and therefore require calibration against experimental data. Markov chain Monte Carlo (MCMC) methods are suitable for estimating multivariate posterior model parameter distributions, but these methods may exhibit slow or premature convergence in high-dimensional search spaces. Here, we present PyDREAM, a Python implementation of the (Multiple-Try) Differential Evolution Adaptive Metropolis [DREAM(ZS)] algorithm developed by Vrugt and ter Braak (2008) and Laloy and Vrugt (2012). PyDREAM achieves excellent performance for complex, parameter-rich models and takes full advantage of distributed computing resources, facilitating parameter inference and uncertainty estimation of CPU-intensive biological models. PyDREAM is freely available under the GNU GPLv3 license from the Lopez lab GitHub repository at http://github.com/LoLab-VU/PyDREAM. Contact: c.lopez@vanderbilt.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
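The core move behind DREAM-family samplers is the differential-evolution proposal of ter Braak (2006): a chain jumps along the difference vector of two other randomly chosen chains. A minimal sketch of plain DE-MC on a toy 2-D Gaussian posterior follows (an illustration of the idea, not PyDREAM's actual API or the full DREAM(ZS) machinery):

```python
import math
import random

def demc_step(chains, i, log_post, gamma=None, eps=1e-6):
    """One differential-evolution Metropolis update of chain i.

    Proposal: x* = x_i + gamma * (x_a - x_b) + small noise,
    where a, b index two other randomly chosen chains (ter Braak 2006).
    """
    d = len(chains[i])
    if gamma is None:
        gamma = 2.38 / math.sqrt(2 * d)   # standard DE-MC jump scale
    a, b = random.sample([j for j in range(len(chains)) if j != i], 2)
    prop = [chains[i][k] + gamma * (chains[a][k] - chains[b][k])
            + random.gauss(0, eps) for k in range(d)]
    # symmetric proposal, so the usual Metropolis acceptance rule applies
    if math.log(random.random()) < log_post(prop) - log_post(chains[i]):
        chains[i] = prop
    return chains[i]

# toy target: standard normal posterior in 2-D
log_post = lambda x: -0.5 * sum(v * v for v in x)
random.seed(0)
chains = [[random.gauss(0, 3), random.gauss(0, 3)] for _ in range(6)]
for sweep in range(2000):
    for i in range(len(chains)):
        demc_step(chains, i, log_post)
```

The population of chains adapts the proposal scale and orientation to the posterior automatically, which is what gives DREAM-type samplers their advantage in correlated, high-dimensional spaces.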
Semiclassical propagation of Wigner functions.
Dittrich, T; Gómez, E A; Pachón, L A
2010-06-07
We present a comprehensive study of semiclassical phase-space propagation in the Wigner representation, emphasizing numerical applications, in particular as an initial-value representation. Two semiclassical approximation schemes are discussed. The propagator of the Wigner function based on van Vleck's approximation replaces the Liouville propagator by a quantum spot with an oscillatory pattern reflecting the interference between pairs of classical trajectories. Employing phase-space path integration instead, caustics in the quantum spot are resolved in terms of Airy functions. We apply both to two benchmark models of nonlinear molecular potentials, the Morse oscillator and the quartic double well, to test them in standard tasks such as computing autocorrelation functions and propagating coherent states. The performance of semiclassical Wigner propagation is very good even in the presence of marked quantum effects, e.g., in coherent tunneling and in propagating Schrödinger cat states, and of classical chaos in four-dimensional phase space. We suggest options for an effective numerical implementation of our method and for integrating it in Monte-Carlo-Metropolis algorithms suitable for high-dimensional systems.
NASA Astrophysics Data System (ADS)
De Lannoy, G. J.; Reichle, R. H.; Vrugt, J. A.
2012-12-01
Simulated L-band (1.4 GHz) brightness temperatures are very sensitive to the values of the parameters in the radiative transfer model (RTM). We assess the optimum RTM parameter values and their (posterior) uncertainty in the Goddard Earth Observing System (GEOS-5) land surface model using observations of multi-angular brightness temperature over North America from the Soil Moisture and Ocean Salinity (SMOS) mission. Two different parameter estimation methods are compared: (i) a particle swarm optimization (PSO) approach, and (ii) an MCMC simulation procedure using the differential evolution adaptive Metropolis (DREAM) algorithm. Our results demonstrate that both methods provide similar "optimal" parameter values. Yet, DREAM exhibits better convergence properties, resulting in a reduced spread of the posterior ensemble. The posterior parameter distributions derived with both methods are used for predictive uncertainty estimation of brightness temperature. This presentation will highlight our model-data synthesis framework and summarize our initial findings.
Stochastic, real-space, imaginary-time evaluation of third-order Feynman-Goldstone diagrams
NASA Astrophysics Data System (ADS)
Willow, Soohaeng Yoo; Hirata, So
2014-01-01
A new, alternative set of interpretation rules of Feynman-Goldstone diagrams for many-body perturbation theory is proposed, which translates diagrams into algebraic expressions suitable for direct Monte Carlo integrations. A vertex of a diagram is associated with a Coulomb interaction (rather than a two-electron integral) and an edge with the trace of a Green's function in real space and imaginary time. With these, 12 diagrams of third-order many-body perturbation (MP3) theory are converted into 20-dimensional integrals, which are then evaluated by a Monte Carlo method. It uses redundant walkers for convergence acceleration and a weight function for importance sampling in conjunction with the Metropolis algorithm. The resulting Monte Carlo MP3 method has low-rank polynomial size dependence of the operation cost, a negligible memory cost, and a naturally parallel computational kernel, while reproducing the correct correlation energies of small molecules within a few mEh after 10^6 Monte Carlo steps.
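The combination of a weight function for importance sampling with the Metropolis algorithm can be illustrated on a one-dimensional toy integral (a generic sketch of the sampling idea, not the authors' 20-dimensional diagram integrals; the weight and integrand are chosen for easy verification):

```python
import math
import random

def metropolis_sample(logw, x0, steps, step_size=1.0):
    """Generate points distributed as w(x) via the Metropolis algorithm."""
    x, samples = x0, []
    for _ in range(steps):
        xp = x + random.uniform(-step_size, step_size)
        if math.log(random.random()) < logw(xp) - logw(x):
            x = xp
        samples.append(x)
    return samples

# Estimate I = ∫ x^2 e^{-x^2} dx (exact value: sqrt(pi)/2 ≈ 0.886).
# Weight: the normalized Gaussian w(x) = e^{-x^2}/sqrt(pi), so that the
# integrand-over-weight ratio f/w = sqrt(pi) * x^2 has low variance.
logw = lambda x: -x * x
random.seed(1)
xs = metropolis_sample(logw, 0.0, 200_000)
estimate = math.sqrt(math.pi) * sum(x * x for x in xs) / len(xs)
```

Because Metropolis sampling only needs the weight up to a constant, the same structure carries over to high-dimensional integrals where the normalization is unknown but cancels in the ratio.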
The phase diagrams of a spin 1/2 core and a spin 1 shell nanoparticle with a disordered interface
NASA Astrophysics Data System (ADS)
Zaim, N.; Zaim, A.; Kerouad, M.
2016-12-01
The critical and compensation behaviors of a spherical ferrimagnetic nanoparticle, consisting of a ferromagnetic core of spin-1/2 A atoms, a ferromagnetic shell of spin-1 B atoms, and a disordered interface in between, characterized by a random arrangement of A and B atoms of ApB1-p type and a negative A-B coupling, are studied. The ground-state phase diagrams of the system have been determined in the (jAB, D/jA) and (jB, D/jA) planes. Monte Carlo simulation based on the Metropolis algorithm has been used to study the effects of the concentration parameter p, the crystal field, the B-B coupling jB, and the antiferromagnetic interface coupling jAB on the phase diagrams and the magnetic properties of the system. It has been found that one, two, or even three compensation points can appear for appropriate values of the system parameters.
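The single-spin-flip Metropolis scheme underlying such simulations can be sketched for a plain 2-D Ising ferromagnet (a minimal stand-in for the core-shell model: one spin species, one coupling, no interface disorder):

```python
import math
import random

def metropolis_sweep(spins, L, J, T):
    """One Metropolis sweep over an L x L Ising lattice (periodic boundaries)."""
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2.0 * J * spins[i][j] * nn          # energy cost of flipping (i, j)
        if dE <= 0 or random.random() < math.exp(-dE / T):
            spins[i][j] *= -1

random.seed(0)
L, J, T = 16, 1.0, 1.0           # T well below the 2-D critical temperature ~2.27 J
spins = [[1] * L for _ in range(L)]
for _ in range(200):
    metropolis_sweep(spins, L, J, T)
m = abs(sum(sum(row) for row in spins)) / (L * L)   # magnetization stays near 1
```

Extending this skeleton to the paper's model amounts to replacing the single coupling J with site-dependent couplings (jA, jB, jAB), adding the spin-1 crystal-field term, and randomizing A/B occupancy at the interface.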
Magnetic response of a disordered binary ferromagnetic alloy to an oscillating magnetic field
NASA Astrophysics Data System (ADS)
Vatansever, Erol; Polat, Hamza
2015-08-01
By means of Monte Carlo simulation with a local spin-update Metropolis algorithm, we have elucidated the non-equilibrium phase-transition properties and stationary-state behavior of a disordered binary ferromagnetic alloy of the type ApB1-p on a square lattice. After a detailed analysis, we have found that the system shows many interesting and unusual thermal and magnetic behaviors; for instance, the locations of the dynamic phase-transition points change significantly depending on the amplitude and period of the external magnetic field as well as on the active concentration of A-type components. Much effort has also been dedicated to clarifying the hysteresis features, such as the coercivity, the dynamic loop area, and the dynamic correlations between the time-dependent magnetization and the external time-dependent applied field, as functions of the period and amplitude of the field and the active concentration of A-type components, and notable physical findings are reported to better understand the dynamic process underlying the present system.
Computer simulation of the mechanical properties of metamaterials
NASA Astrophysics Data System (ADS)
Gerasimov, R. A.; Eremeyev, V. A.; Petrova, T. O.; Egorov, V. I.; Maksimova, O. G.; Maksimov, A. V.
2016-08-01
For a hybrid discrete-continual model describing a system which consists of a substrate and a polymer coating, we provide computer simulation of its mechanical properties at various levels of deformation. For the substrate we apply an elastic model with Hooke's law, while for the polymeric coating we use a discrete model. Here we use the Stockmayer potential, a Lennard-Jones potential with an additional term describing the dipole interactions between neighbouring segments of polymer chains (the Keesom energy). Using the Monte Carlo method with the Metropolis algorithm, the equilibrium state is determined for a given temperature. We obtain the dependence of the energy, force, bending moment, and Young's modulus on the level of deformation for different values of temperature. We show that as the deformation level increases, the influence of the surface coating on the considered material parameters becomes less pronounced. We provide a comparison of the obtained results with experimental data on deformations of crystalline polymers (gutta-percha, etc.).
NASA Astrophysics Data System (ADS)
Cubillos, Patricio; Harrington, Joseph; Blecic, Jasmina; Stemm, Madison M.; Lust, Nate B.; Foster, Andrew S.; Rojo, Patricio M.; Loredo, Thomas J.
2014-11-01
Multi-wavelength secondary-eclipse and transit depths probe the thermo-chemical properties of exoplanets. In recent years, several research groups have developed retrieval codes to analyze the existing data and study the prospects of future facilities. However, the scientific community has limited access to these packages. Here we premiere the open-source Bayesian Atmospheric Radiative Transfer (BART) code. We discuss the key aspects of the radiative-transfer algorithm and the statistical package. The radiation code includes line databases for all HITRAN molecules and for high-temperature H2O, TiO, and VO, and provides a preprocessor for adding further line databases without recompiling the radiation code. Collision-induced absorption lines are available for H2-H2 and H2-He. The parameterized thermal and molecular abundance profiles can be modified arbitrarily without recompilation. The generated spectra are integrated over arbitrary bandpasses for comparison to data. BART's statistical package, Multi-core Markov-chain Monte Carlo (MC3), is a general-purpose MCMC module. MC3 implements the Differential-Evolution Markov-chain Monte Carlo algorithm (ter Braak 2006, 2009). MC3 converges 20-400 times faster than the usual Metropolis-Hastings MCMC algorithm, and in addition uses the Message Passing Interface (MPI) to parallelize the MCMC chains. We apply the BART retrieval code to the HD 209458b data set to estimate the planet's temperature profile and molecular abundances. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
Hao, Xiaohu; Zhang, Guijun; Zhou, Xiaogen
2018-04-01
Computing conformations, which is essential for associating structural and functional information with gene sequences, is challenging due to the high dimensionality and rugged energy surface of the protein conformational space. Consequently, the dimension of the protein conformational space should be reduced to a proper level, and an effective exploration algorithm should be proposed. In this paper, a plug-in method for guiding exploration in conformational feature space with Lipschitz underestimation (LUE) for ab-initio protein structure prediction is proposed. The conformational space is first converted into the ultrafast shape recognition (USR) feature space. Based on the USR feature space, the conformational space can be further converted into an underestimation space according to Lipschitz estimation theory for guiding exploration. As a consequence of the use of the underestimation model, the tight lower-bound estimate can be used to guide exploration, invalid sampling areas can be eliminated in advance, and the number of energy-function evaluations can be reduced. The proposed method provides a novel technique for solving the exploration problem of protein conformational space. LUE is applied to the differential evolution (DE) algorithm and to the Metropolis Monte Carlo (MMC) algorithm available in Rosetta; in both cases, candidate conformations are screened by the underestimation method prior to energy calculation and selection. Further, LUE is compared with DE and MMC by testing on 15 small-to-medium structurally diverse proteins. Test results show that near-native protein structures with higher accuracy can be obtained more rapidly and efficiently with the use of LUE. Copyright © 2018 Elsevier Ltd. All rights reserved.
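The underestimation screen can be sketched in one feature dimension (a toy illustration under an assumed Lipschitz constant, not the authors' USR-space implementation; the energy function and constant below are hypothetical):

```python
def lipschitz_lower_bound(x, samples, k):
    """Tight lower bound on E(x) from already-evaluated points,
    assuming the energy is Lipschitz with constant k.

    samples: list of (x_j, E_j) pairs scored with the true energy.
    """
    return max(e - k * abs(x - xj) for xj, e in samples)

def screen(candidates, samples, k, best_energy):
    """Keep only candidates whose underestimate beats the current best,
    so the expensive energy function is never called on hopeless ones."""
    return [x for x in candidates
            if lipschitz_lower_bound(x, samples, k) < best_energy]

# toy energy E(x) = x^2 with Lipschitz constant k = 4 on [-2, 2]
samples = [(-2.0, 4.0), (0.0, 0.0), (2.0, 4.0)]
survivors = screen([-1.9, -0.5, 0.1, 1.8], samples, k=4.0, best_energy=0.5)
```

Candidates near the already-scored high-energy points get lower bounds above the current best and are discarded without an energy evaluation; only the two near the minimum survive.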
NASA Astrophysics Data System (ADS)
Suh, Donghyuk; Radak, Brian K.; Chipot, Christophe; Roux, Benoît
2018-01-01
Molecular dynamics (MD) trajectories based on classical equations of motion can be used to sample the configurational space of complex molecular systems. However, brute-force MD often converges slowly due to the ruggedness of the underlying potential energy surface. Several schemes have been proposed to address this problem by effectively smoothing the potential energy surface. However, in order to recover the proper Boltzmann equilibrium probability distribution, these approaches must then rely on statistical reweighting techniques or generate the simulations within a Hamiltonian tempering replica-exchange scheme. The present work puts forth a novel hybrid sampling propagator combining Metropolis-Hastings Monte Carlo (MC) with proposed moves generated by non-equilibrium MD (neMD). This hybrid neMD-MC propagator comprises three elementary elements: (i) an atomic system is dynamically propagated for some period of time using standard equilibrium MD on the correct potential energy surface; (ii) the system is then propagated for a brief period of time during what is referred to as a "boosting phase," via a time-dependent Hamiltonian that is evolved toward the perturbed potential energy surface and then back to the correct potential energy surface; (iii) the resulting configuration at the end of the neMD trajectory is then accepted or rejected according to a Metropolis criterion before returning to step 1. A symmetric two-end momentum reversal prescription is used at the end of the neMD trajectories to guarantee that the hybrid neMD-MC sampling propagator obeys microscopic detailed balance and rigorously yields the equilibrium Boltzmann distribution. The hybrid neMD-MC sampling propagator is designed and implemented to enhance the sampling by relying on the accelerated MD and solute tempering schemes. It is also combined with the adaptive biased force sampling algorithm.
Illustrative tests with specific biomolecular systems indicate that the method can yield a significant speedup.
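The three-step neMD-MC cycle described above can be sketched on a toy one-dimensional system. This is heavily simplified: the equilibrium-MD and boosting phases are replaced by stand-in stochastic moves, and the instantaneous-switching limit is assumed so the nonequilibrium work reduces to the potential-energy change (the momentum-reversal prescription has no analogue here):

```python
import math
import random

U = lambda x: 0.5 * x * x            # toy potential; exact <x^2> = 1/beta

def equilibrium_md(x, n_steps, beta, dt=0.5):
    """Stand-in for step (i): short equilibrium sampling on the true
    surface (a Metropolis walk here, in place of real MD)."""
    for _ in range(n_steps):
        xp = x + random.uniform(-dt, dt)
        if math.log(random.random()) < -beta * (U(xp) - U(x)):
            x = xp
    return x

def boost(x):
    """Stand-in for step (ii): a large jump whose nonequilibrium work is
    tracked; in the instantaneous-switching limit the work is simply the
    potential-energy change."""
    xp = x + random.uniform(-4.0, 4.0)
    return xp, U(xp) - U(x)

def hybrid_nemd_mc(x, beta, cycles):
    for _ in range(cycles):
        x = equilibrium_md(x, 20, beta)            # (i) equilibrium propagation
        xp, work = boost(x)                        # (ii) boosting phase
        if math.log(random.random()) < -beta * work:
            x = xp                                 # (iii) Metropolis accept/reject
    return x

random.seed(2)
samples = [hybrid_nemd_mc(0.0, beta=1.0, cycles=10) for _ in range(2000)]
var = sum(s * s for s in samples) / len(samples)   # should approach 1/beta = 1
```

The Metropolis test in step (iii) is what restores the exact Boltzmann distribution despite the large, nonequilibrium proposal moves.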
The role of chemicals and radiation in the etiology of cancer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huberman, E.; Barr, S.H.
In this volume, investigators consider the mechanisms of oncogenesis, cell transformation, and carcinogen metabolism and present new findings on chemical and radiation carcinogenesis and chemically induced mutagenesis and chromosomal changes. As background to the studies of chemical and radiation carcinogenesis, the book surveys knowledge of cell transformation and carcinogen metabolism. Among the topics reviewed are the transforming genes involved in human malignancy, the genetics and epigenetics of neoplasia, and the single-hit and multi-hit concepts of hepatocarcinogenesis. Also examined are organ, species, and interindividual differences in carcinogen metabolism; chemical and biochemical dosimetry of genotoxic chemical exposure; and the role of pharmacokinetics and DNA dosimetry in relating in vitro to in vivo actions of N-nitroso compounds.
Lin, A; Lee, T M; Rern, J C
1994-07-01
Tricholin, a ribosome-inactivating protein isolated from the culture broth of Trichoderma viride, has been shown to exert fungicidal effects on Rhizoctonia solani through a multi-hit kinetic interaction. Tricholin causes a parallel cessation of growth, uptake of amino acids, and protein biosynthesis. The in vivo mode of action of tricholin on protein synthesis and cell growth appears to be attributed to the diminishing of the polysome formation in R. solani through damage to large ribosomal subunits. These results concur with previous data and prove that tricholin is an effective inhibitor of protein synthesis. The efficacy of tricholin as an antibiotic agent was estimated to have a duration of approximately 42 hours.
NASA Astrophysics Data System (ADS)
Wu, Yi-Hua; Chan, Chang-Chuan; Rao, Carol Y.; Lee, Chung-Te; Hsu, Hsiao-Hsien; Chiu, Yueh-Hsiu; Chao, H. Jasmine
This study was conducted to investigate the temporal and spatial distributions, compositions, and determinants of ambient aeroallergens in Taipei, Taiwan, a subtropical metropolis. We monitored ambient culturable fungi in Shin-Jhuang City, an urban area, and Shi-Men Township, a rural area, in Taipei metropolis from 2003 to 2004. We collected ambient fungi in the last week of every month during the study period, using duplicate Burkard portable samplers and Malt Extract Agar. The median concentration of total fungi was 1339 colony-forming units m^-3 of air over the study period. The most prevalent fungi were non-sporulating fungi, Cladosporium, Penicillium, Curvularia and Aspergillus at both sites. Airborne fungal concentrations and diversity of fungal species were generally higher in urban than in rural areas. Most fungal taxa had significant seasonal variations, with higher levels in summer. Multivariate analyses showed that the levels of ambient fungi were associated positively with temperature, but negatively with ozone and several other air pollutants. Relative humidity also had a significant non-linear relationship with ambient fungal levels. We concluded that the concentrations and the compositions of ambient fungi are diverse in urban and rural areas in the subtropical region. High ambient fungal levels were related to an urban environment and environmental conditions of high temperature and low ozone levels.
Gato, Worlanyo E; Acquah, Samuel; Apenteng, Bettye A; Opoku, Samuel T; Boakye, Blessed K
2017-09-01
Despite the significant increase in the incidence of diabetes in Ghana, research in this area has been lagging. The purpose of the study was to assess the risk factors associated with diabetes in the Cape Coast metropolis of Ghana, and to describe nutritional practices and efforts toward lifestyle change. A convenience sample of 482 adults from the Cape Coast metropolis was surveyed using a self-reported questionnaire. The survey collected information on the demographic and socioeconomic characteristics, health status and routine nutritional practices of respondents. The aims of the study were addressed using multivariable regression analyses. A total of 8% of respondents reported that they had been diagnosed with diabetes. Older age and body weight were found to be independently associated with diabetes. Individuals living with diabetes were no more likely than those without diabetes to have taken active steps at reducing their weight. The percentage of self-reported diabetes in this population was consistent with what has been reported in previous studies in Ghana. The findings from this study highlight the need for more patient education on physical activity and weight management. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Geo-products of urban areas: Silesian Metropolis, Southern Poland
NASA Astrophysics Data System (ADS)
Chybiorz, Ryszard; Abramowicz, Anna
2017-04-01
Silesian Metropolis is located in the Silesian Voivodeship, the most important industrial region in Poland. It consists of 14 cities with powiat rights, which together form the largest urban center in Poland and one of the largest in Central and Eastern Europe. Almost 2 million people live in its territory. The large concentration of population is associated with industrialization, especially with the development of the mining industry (Upper Silesian Coal Basin) and the processing industry (steelworks, textile industry) at the end of the 19th century. One hundred years later, during the creation of modern sectors of the economy, restructuring of metallurgy and mining began. The mechanisms and conditions created for the development of post-industrial areas were consistent with the principles of sustainable development and had many new features, including cultural and touristic ones. The Industrial Monuments Route was opened to inhabitants and visitors in October 2006, and joined the European Route of Industrial Heritage (ERIH) in 2010. Its most interesting mining attractions are located in the Silesian Metropolis, and the most frequently visited objects on the route are the Guido Historical Coal Mine in Zabrze and the Historical Silver Mine in Tarnowskie Góry. A project under way in Zabrze will give tourists access to a system of underground corridors that were used for coal transportation in the 19th century. Visitors will be able to actively explore the work of miners, moving through the mine by underground boat, railway, and suspension railway. Surface mines are also open to geotourists. The Ecological and Geological Education Center GEOsfera was created in a former Triassic quarry in Jaworzno.
Although the area of the Silesian Metropolis is characterized by severe environmental devastation, such objects have been created (and are still being created) on the basis of inanimate nature, and they have touristic value for the region and the country. Some of them already serve geotouristic purposes.
NASA Astrophysics Data System (ADS)
Ekinci, Yunus Levent; Balkaya, Çağlayan; Göktürkler, Gökhan; Turan, Seçil
2016-06-01
An efficient approach to estimating model parameters from residual gravity data based on differential evolution (DE), a stochastic vector-based metaheuristic algorithm, is presented. We show the applicability and effectiveness of this algorithm on both synthetic and field anomalies. To our knowledge, this is the first attempt to apply DE to parameter estimation for residual gravity anomalies due to isolated causative sources embedded in the subsurface. The model parameters dealt with here are the amplitude coefficient (A), the depth and exact origin of the causative source (z0 and x0, respectively), and the shape factors (q and η). Error-energy maps generated for some parameter pairs successfully reveal the nature of the parameter estimation problem under consideration. Noise-free and noisy synthetic single gravity anomalies are evaluated with success via DE/best/1/bin, a widely used strategy in DE. Additionally, some complicated gravity anomalies caused by multiple source bodies are considered, and the results demonstrate the efficiency of the algorithm. Using the strategy applied in the synthetic examples, field anomalies observed in various mineral explorations, such as a chromite deposit (Camaguey district, Cuba), a manganese deposit (Nagpur, India) and a base-metal sulphide deposit (Quebec, Canada), are then considered to estimate the model parameters of the ore bodies. The applications show that the obtained results, such as the depths and shapes of the ore bodies, are quite consistent with those published in the literature. Uncertainty in the solutions obtained from the DE algorithm is also investigated with a Metropolis-Hastings (M-H) sampling algorithm based on simulated annealing without a cooling schedule.
Based on the resulting histogram reconstructions for both synthetic and field data examples, the algorithm provides reliable parameter estimates that lie within the sampling limits of the M-H sampler. Although DE is not a common inversion technique in geophysics, it deserves more interest for parameter estimation from potential-field data, considering its good accuracy, low computational cost (in the present problem), and the fact that a well-constructed initial guess is not required to reach the global minimum.
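The DE/best/1/bin strategy named above can be sketched as follows (a generic implementation on a stand-in misfit function, not the authors' code; the sphere function plays the role of the error energy):

```python
import random

def de_best_1_bin(pop, fitness, F=0.8, CR=0.9):
    """One generation of DE/best/1/bin: mutate around the best member,
    binomial crossover, greedy selection."""
    d = len(pop[0])
    best = min(pop, key=fitness)
    new_pop = []
    for i, target in enumerate(pop):
        r1, r2 = random.sample([j for j in range(len(pop)) if j != i], 2)
        mutant = [best[k] + F * (pop[r1][k] - pop[r2][k]) for k in range(d)]
        jrand = random.randrange(d)          # guarantee at least one mutant gene
        trial = [mutant[k] if (random.random() < CR or k == jrand) else target[k]
                 for k in range(d)]
        new_pop.append(trial if fitness(trial) <= fitness(target) else target)
    return new_pop

# stand-in "error energy": sphere function, minimum 0 at the origin
fitness = lambda x: sum(v * v for v in x)
random.seed(3)
pop = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(150)][:20]
for _ in range(150):
    pop = de_best_1_bin(pop, fitness)
best_fit = min(fitness(x) for x in pop)
```

In the gravity-inversion setting, the four coordinates would be the model parameters (A, z0, x0, q) and the fitness the misfit between observed and computed anomalies.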
A reversible-jump Markov chain Monte Carlo algorithm for 1D inversion of magnetotelluric data
NASA Astrophysics Data System (ADS)
Mandolesi, Eric; Ogaya, Xenia; Campanyà, Joan; Piana Agostinetti, Nicola
2018-04-01
This paper presents a new computer code developed to solve the 1D magnetotelluric (MT) inverse problem using a Bayesian trans-dimensional Markov chain Monte Carlo algorithm. MT data are sensitive to the depth distribution of rock electric conductivity (or its reciprocal, resistivity). The solution provided is a probability distribution - the so-called posterior probability distribution (PPD) - for the conductivity at depth, together with the PPD of the interface depths. The PPD is sampled via a reversible-jump Markov chain Monte Carlo (rjMcMC) algorithm, using a modified Metropolis-Hastings (MH) rule to accept or discard candidate models along the chains. As the optimal parameterization for the inversion process is generally unknown, a trans-dimensional approach is used to allow the dataset itself to indicate the most probable number of parameters needed to sample the PPD. The algorithm is tested against two simulated datasets and a set of MT data acquired in the Clare Basin (County Clare, Ireland). For the simulated datasets, the correct number of conductive layers at depth and the associated electrical conductivity values are retrieved, together with reasonable estimates of the uncertainties on the investigated parameters. Results from the inversion of field measurements are compared with results obtained using a deterministic method and with well-log data from a nearby borehole. The PPD is in good agreement with the well-log data, showing as a main structure a highly conductive layer associated with the Clare Shale formation. In this study, we demonstrate that our new code goes beyond algorithms developed using a linear inversion scheme, as it can be used: (1) to bypass subjective choices in the 1D parameterization, i.e. the number of horizontal layers, and (2) to estimate realistic uncertainties on the retrieved parameters.
The algorithm is implemented using a simple MPI approach, where independent chains run on isolated CPUs, to take full advantage of parallel computer architectures. In the case of a large number of data, a master/slave approach can be used, where the master CPU samples the parameter space and the slave CPUs compute forward solutions.
Spatial-Temporal Data Collection with Compressive Sensing in Mobile Sensor Networks
Li, Jiayin; Guo, Wenzhong; Chen, Zhonghui; Xiong, Neal
2017-01-01
Compressive sensing (CS) provides an energy-efficient paradigm for data gathering in wireless sensor networks (WSNs). However, the existing work on spatial-temporal data gathering using compressive sensing considers only multi-hop relaying or multiple-random-walk approaches. In this paper, we exploit the mobility pattern for spatial-temporal data collection and propose a novel mobile data gathering scheme by employing the Metropolis-Hastings algorithm with delayed acceptance, an improved random walk algorithm for a mobile collector to collect data from a sensing field. The proposed scheme exploits Kronecker compressive sensing (KCS) for spatial-temporal correlation of sensory data by allowing the mobile collector to gather temporal compressive measurements from a small subset of randomly selected nodes along a random routing path. More importantly, from the theoretical perspective we prove that the equivalent sensing matrix constructed from the proposed scheme for spatial-temporal compressible signals can satisfy the property of KCS models. The simulation results demonstrate that the proposed scheme can not only significantly reduce communication cost but also improve recovery accuracy for mobile data gathering compared to the other existing schemes. In particular, we also show that the proposed scheme is robust in unreliable wireless environments under various packet losses. All this indicates that the proposed scheme can be an efficient alternative for data gathering applications in WSNs. PMID:29117152
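The delayed-acceptance idea behind the collector's random walk (Christen and Fox, 2005) is that a cheap surrogate screens each proposal before the expensive target is evaluated; the second-stage test corrects for the surrogate so the chain still targets the true distribution. A toy 1-D sketch, not the paper's network-specific scheme:

```python
import math
import random

def delayed_acceptance_mh(log_target, log_cheap, x0, steps, width=1.0):
    """Metropolis-Hastings with delayed acceptance: the expensive target
    is evaluated only for proposals that survive the cheap first stage."""
    x, out = x0, []
    for _ in range(steps):
        y = x + random.uniform(-width, width)     # symmetric proposal
        # stage 1: screen with the cheap surrogate
        if math.log(random.random()) < log_cheap(y) - log_cheap(x):
            # stage 2: correct the surrogate ratio with the true target
            a = (log_target(y) - log_target(x)) - (log_cheap(y) - log_cheap(x))
            if math.log(random.random()) < a:
                x = y
        out.append(x)
    return out

log_target = lambda x: -0.5 * x * x          # "expensive" model, N(0, 1)
log_cheap = lambda x: -0.5 * x * x / 1.44    # crude surrogate, N(0, 1.2^2)
random.seed(4)
xs = delayed_acceptance_mh(log_target, log_cheap, 0.0, 100_000)
var = sum(x * x for x in xs) / len(xs)       # matches the true target variance
```

Proposals rejected at stage 1 never trigger a target evaluation, which is the source of the savings when the target (here standing in for an expensive network or forward-model computation) dominates the cost.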
Efficient Implementation of MrBayes on Multi-GPU
Zhou, Jianfu; Liu, Xiaoguang; Wang, Gang
2013-01-01
MrBayes, using Metropolis-coupled Markov chain Monte Carlo (MCMCMC or (MC)3), is a popular program for Bayesian inference. As a leading method of using DNA data to infer phylogeny, the (MC)3 Bayesian algorithm and its improved and parallel versions are now not fast enough for biologists to analyze massive real-world DNA data. Recently, the graphics processing unit (GPU) has shown its power as a coprocessor (or rather, an accelerator) in many fields. This article describes an efficient implementation, a(MC)3 (aMCMCMC), of MrBayes (MC)3 on the compute unified device architecture. By dynamically adjusting the task granularity to adapt to input data size and hardware configuration, it makes full use of GPU cores with different data sets. An adaptive method is also developed to split and combine DNA sequences to make full use of a large number of GPU cards. Furthermore, a new “node-by-node” task scheduling strategy is developed to improve concurrency, and several optimizing methods are used to reduce extra overhead. Experimental results show that a(MC)3 achieves up to 63× speedup over serial MrBayes on a single machine with one GPU card, up to 170× speedup with four GPU cards, and up to 478× speedup with a 32-node GPU cluster. a(MC)3 is dramatically faster than all the previous (MC)3 algorithms and scales well to large GPU clusters. PMID:23493260
Spatial-Temporal Data Collection with Compressive Sensing in Mobile Sensor Networks.
Zheng, Haifeng; Li, Jiayin; Feng, Xinxin; Guo, Wenzhong; Chen, Zhonghui; Xiong, Neal
2017-11-08
Compressive sensing (CS) provides an energy-efficient paradigm for data gathering in wireless sensor networks (WSNs). However, existing work on spatial-temporal data gathering using compressive sensing considers only multi-hop-relaying-based or multiple-random-walk-based approaches. In this paper, we exploit the mobility pattern for spatial-temporal data collection and propose a novel mobile data gathering scheme by employing the Metropolis-Hastings algorithm with delayed acceptance, an improved random walk algorithm for a mobile collector to collect data from a sensing field. The proposed scheme exploits Kronecker compressive sensing (KCS) for the spatial-temporal correlation of sensory data by allowing the mobile collector to gather temporal compressive measurements from a small subset of randomly selected nodes along a random routing path. More importantly, from the theoretical perspective we prove that the equivalent sensing matrix constructed from the proposed scheme for spatial-temporal compressible signals can satisfy the property of KCS models. The simulation results demonstrate that the proposed scheme can not only significantly reduce communication cost but also improve recovery accuracy for mobile data gathering compared to other existing schemes. In particular, we also show that the proposed scheme is robust in unreliable wireless environments under various packet losses. All this indicates that the proposed scheme can be an efficient alternative for data-gathering applications in WSNs.
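The delayed-acceptance Metropolis-Hastings idea referenced above can be sketched in one dimension. This is a generic illustration, not the authors' implementation: `cheap_logpdf` stands in for an inexpensive surrogate of the exact target density `full_logpdf`, and the two-stage acceptance rule is arranged so the exact target remains invariant.

```python
import math
import random

def da_metropolis(full_logpdf, cheap_logpdf, x0, n_steps, step=0.5, seed=0):
    """Delayed-acceptance random-walk Metropolis.

    A proposal is first screened with a cheap surrogate density; only
    proposals that survive this stage pay for the expensive exact density.
    """
    rng = random.Random(seed)
    x, chain = x0, [x0]
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)          # symmetric random-walk proposal
        # Stage 1: screen with the cheap surrogate.
        a1 = min(1.0, math.exp(cheap_logpdf(y) - cheap_logpdf(x)))
        if rng.random() < a1:
            # Stage 2: correct with the exact density so the exact
            # target is the stationary distribution.
            a2 = min(1.0, math.exp(full_logpdf(y) - full_logpdf(x)
                                   + cheap_logpdf(x) - cheap_logpdf(y)))
            if rng.random() < a2:
                x = y
        chain.append(x)
    return chain

# Example: sample a standard normal, using a slightly-too-wide surrogate.
full = lambda v: -0.5 * v * v
cheap = lambda v: -0.5 * (v / 1.2) ** 2
samples = da_metropolis(full, cheap, 0.0, 20000)
```

The savings come from rejecting most bad proposals at stage 1 without ever evaluating `full_logpdf`, which is the point of the construction when the exact density is expensive.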
NASA Astrophysics Data System (ADS)
Feldt, Jonas; Miranda, Sebastião; Pratas, Frederico; Roma, Nuno; Tomás, Pedro; Mata, Ricardo A.
2017-12-01
In this work, we present an optimized perturbative quantum mechanics/molecular mechanics (QM/MM) method for use in Metropolis Monte Carlo simulations. The model adopted is particularly tailored for the simulation of molecular systems in solution but can be readily extended to other applications, such as catalysis in enzymatic environments. The electrostatic coupling between the QM and MM systems is simplified by applying perturbation theory to estimate the energy changes caused by a movement in the MM system. This approximation, together with the effective use of GPU acceleration, leads to a negligible added computational cost for the sampling of the environment. Benchmark calculations are carried out to evaluate the impact of the approximations applied and the overall computational performance.
Kyiv Small Rivers in Metropolis Water Objects System
NASA Astrophysics Data System (ADS)
Krelshteyn, P.; Dubnytska, M.
2017-12-01
The article addresses the question of what small underground rivers with artificial watercourses really are: water bodies or city engineering-infrastructure objects? The place of such rivers in the metropolis water-objects system is identified. The ecological state and the degree of urbanization of small rivers, as well as the dynamics of change in these indicators, are analysed for the example of Kyiv with the help of a water-objects cadastre. It was found that registration of small rivers in the city of Kyiv is not conducted, and summary information on such water objects is absent and is not taken into account when making managerial decisions at the urban level. To solve this problem, we propose creating a water-bodies accounting system (water cadastre).
Salehi Shahrabi, Narges; Pourezzat, Aliasghar; Mobaraki, Hossein; Mafimoradi, Shiva
2013-01-01
Background: The centralization of human activities is associated with different pollutants, which enter the environment easily and make the urban environment more vulnerable. Regarding the importance of the air-pollution issue for the Tehran metropolis, many plans and regulations have been developed; however, most of them have failed to reduce the pollution. The purpose of this study was to pathologically analyze air-pollution control plans and offer effective solutions for the Tehran metropolis. Methods: A qualitative content analysis, together with semi-structured interviews with 14 practicing professionals, was used to 1) identify key sources of Tehran's air pollution, 2) recognize challenges to the effective performance of pertinent plans, and 3) offer effective solutions. Results: Challenges related to air-pollution control plans can be divided into two major categories: lack of integrated and organized stewardship, and PEST challenges. Conclusion: For controlling the air pollution of Tehran effectively, various controlling alternatives were identified: systematization of the plan-preparation process, standardization and utilization of new technologies and experts, infrastructural development, realization of social justice, development of coordination mechanisms, improvement of citizens' participatory capacity, and a focus on effective management of fuel and energy. Controlling air pollution in Tehran requires the serious attention of policymakers, who should create enforcements through a systematic cycle of preparing comprehensive plans, implement those enforcements, and evaluate the environmental impact of the plans by involving all stakeholders. PMID:26171340
Sensitivity to γ rays of avian sarcoma and murine leukemia viruses [⁶⁰Co, UV]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toyoshima, K.; Niwa, O.; Yutsudo, M.
1980-09-01
The direct inactivation of avian and murine oncoviruses by γ rays was examined using ⁶⁰Co as a γ-ray source. The inactivation of murine leukemia virus (M-MuLV) followed single-hit kinetics, while the subgroup D Schmidt-Ruppin strain of avian sarcoma virus (SR-RSV D) showed multihit inactivation kinetics with an extrapolation number of 5. The two viruses showed similar UV-inactivation kinetics. The genomic RNA of the SR-RSV D strain was degraded by γ irradiation faster than its infectivity, but viral clones isolated from foci formed after γ irradiation had a complete genome. These results suggest that SR-RSV D has a strong repair function, possibly connected with reverse transcriptase activity.
Monte Carlo Analysis of Reservoir Models Using Seismic Data and Geostatistical Models
NASA Astrophysics Data System (ADS)
Zunino, A.; Mosegaard, K.; Lange, K.; Melnikova, Y.; Hansen, T. M.
2013-12-01
We present a study of the analysis of petroleum reservoir models consistent with seismic data and geostatistical constraints, performed on a synthetic reservoir model. Our aim is to invert directly for the structure and rock bulk properties of the target reservoir zone. To infer rock facies, porosity and oil saturation, seismology alone is not sufficient; a rock physics model, which links the unknown properties to the elastic parameters, must be taken into account. We then combine a rock physics model with a simple convolutional approach for seismic waves to invert the "measured" seismograms. To solve this inverse problem, we employ a Markov chain Monte Carlo (MCMC) method, because it offers the possibility of handling non-linearity and complex, multi-step forward models, and provides realistic estimates of uncertainties. However, for large data sets the MCMC method may be impractical because of its very high computational demand. One strategy to face this challenge is to feed the algorithm with realistic models, hence relying on proper prior information. To this end, we utilize an algorithm drawn from geostatistics to generate geologically plausible models which represent samples of the prior distribution. The geostatistical algorithm learns multiple-point statistics from prototype models (in the form of training images), then generates thousands of different models which are accepted or rejected by a Metropolis sampler. To further reduce the computation time we parallelize the software and run it on multi-core machines. The solution of the inverse problem is then represented by a collection of reservoir models in terms of facies, porosity and oil saturation, which constitute samples of the posterior distribution. We are finally able to produce probability maps of the properties of interest by performing statistical analysis on the collection of solutions.
Liang, Jie; Qian, Hong
2010-01-01
Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge of understanding macromolecular dynamics led the way for computations to become part of the tool set used to study molecular biology. Twenty-five years ago, the demands of genome science inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we lay out a new mathematical theory for the dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state, continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We show several examples of how CMEs are used to model cellular biochemical systems. We also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady-state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand "complex behavior" and complexity theory, and from which important biological insight can be gained. PMID:24999297
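The Gillespie algorithm mentioned above can be sketched for a minimal birth-death system. This is a generic illustration under assumed rate constants, not one of the paper's cellular models: waiting times are drawn from an exponential with rate equal to the total propensity, and the next reaction is chosen proportionally to its propensity.

```python
import random

def gillespie(x0, rates, stoich, t_end, seed=0):
    """Gillespie stochastic simulation algorithm (SSA).

    `rates(x)` returns the propensity of each reaction in state x;
    `stoich[j]` is the state change produced by reaction j.
    """
    rng = random.Random(seed)
    t, x = 0.0, x0
    trajectory = [(t, x)]
    while t < t_end:
        a = rates(x)
        a0 = sum(a)
        if a0 == 0.0:                   # absorbing state: nothing can fire
            break
        t += rng.expovariate(a0)        # exponential waiting time
        # Pick reaction j with probability a[j] / a0.
        r, acc, j = rng.random() * a0, 0.0, 0
        for j, aj in enumerate(a):
            acc += aj
            if r < acc:
                break
        x += stoich[j]
        trajectory.append((t, x))
    return trajectory

# Birth-death process: 0 -> X at rate k, X -> 0 at rate g*x.
k, g = 10.0, 1.0                        # stationary mean is k/g = 10
traj = gillespie(0, lambda x: [k, g * x], [+1, -1], t_end=200.0)
```

Unlike a Metropolis chain, every step here is an actual reaction event in continuous time; there is no accept/reject and no detailed-balance requirement, which is the contrast the abstract draws.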
The New Sectionalism: I. Metropolis Without Growth
ERIC Educational Resources Information Center
Alonso, William
1978-01-01
This article suggests that there are three principal sources of metropolitan population decline: the declining birth rate, the reversal of rural-to-urban migration, and inter-metropolitan migration. (Author/AM)
Mckee, D L
1985-01-01
The tendency toward hypertrophy of large metropolitan areas in the Third World has been a subject of concern to economists and other social scientists for some time. Inability to absorb vast waves of migrants into the organized labor force or to provide adequate infrastructure and services are serious problems in many growing cities of Asia, Africa, and Latin America. A different phenomenon created by perpetual urban expansion has been relatively neglected: the problems caused when preexisting urban areas are absorbed into the metropolis. The tendency of squatter settlements to constrict normal urban growth and expansion and to impede rational provision of services has been recognized, but the absorption of small cities does not necessarily produce identical problems. Small cities absorbed into a metropolis lose their identity in the successive waves of suburban proliferation. Los Angeles in the US may be considered the prototype of the phenomenon in which multiple preexisting urban zones are absorbed into the same metropolis without formation of any visible center of gravity. In some cases, small cities may be completely engulfed by the encroaching metropolis, if transit routes or availability of land makes them interesting to developers. The livelihood of residents may be threatened if they are no longer able to cultivate gardens or raise small animals. Local services may deteriorate. The youngest and most able residents are likely to abandon such places for the greater opportunities of the city, leaving the aged and less qualified to fend for themselves. Jobs may disappear and traditional commercial relations may be destroyed without being replaced. The future wellbeing of residents depends on their ability to maneuver in the new metropolitan environment, but many will be unable to adjust for lack of training, the weight of immovable property, or diverse personal considerations. 
Planning could help to reduce the problems that occasional survival of some small entities may pose for rational expansion of transportation and services at the metropolitan level, but many Third World cities lack such planning capacity altogether.
Obiri-Yeboah, Dorcas; Asante Awuku, Yaw; Adu, Joseph; Pappoe, Faustina; Obboh, Evans; Nsiah, Paul; Amoako-Sakyi, Daniel; Simpore, Jacques
2018-01-01
Hepatitis E virus (HEV) is an emerging infection in Africa with poor maternal and foetal outcomes. There are scant data on the sero-prevalence of HEV infection among pregnant women in Ghana. This study highlights the prevalence of, and risk factors associated with, HEV infection among pregnant women in the Cape Coast Metropolis, Central Region of Ghana. A multicenter (3 selected sites) analytical cross-sectional study involving 398 pregnant women in the Cape Coast metropolis was conducted. HEV (anti-HEV IgG and anti-HEV IgM) ELISA was performed. Sero-positive women had liver chemistries done, and data were collected on maternal and neonatal outcomes. Data analyses were performed using Stata version 13 software (STATA Corp, Texas, USA). The mean age was 28.01 (± 5.93) years. HEV sero-prevalence was 12.2% (n = 48) for IgG and 0.2% (n = 1) for IgM, giving an overall sero-prevalence of 12.3%. The odds of being HEV sero-positive for women aged 26-35 years were 3.1 (95% CI: 1.1-8.1), p = 0.02, and for those aged ≥36 years 10.7 (95% CI: 3.4-33.5), p = 0.0001. Living in an urban settlement was associated with the lowest odds of HEV infection (OR 0.4 (95% CI: 0.2-0.8), p = 0.01). Factors with no statistical evidence of association included the main source of drinking water and history of blood transfusion. The sero-prevalence of HEV IgG increased progressively across trimesters and was highest among women in their third trimester (55.3%). None of the 49 HEV sero-positive women had an elevated ALT level. Ten of the 41 neonates born to sero-positive women developed jaundice in the neonatal period. The mean birth weight was 3.1 kg (SD 0.4). HEV sero-prevalence among pregnant women in the Cape Coast Metropolis is high enough to deserve more attention than it has received so far. It is therefore important to conduct further research on the potential impact on maternal and neonatal mortality and morbidity in Ghana.
Rapid Non-Gaussian Uncertainty Quantification of Seismic Velocity Models and Images
NASA Astrophysics Data System (ADS)
Ely, G.; Malcolm, A. E.; Poliannikov, O. V.
2017-12-01
Conventional seismic imaging typically provides a single estimate of the subsurface without any error bounds. Noise in the observed raw traces, as well as the uncertainty of the velocity model, directly impacts the uncertainty of the final seismic image and its resulting interpretation. We present a Bayesian inference framework to quantify uncertainty in both the velocity model and seismic images, given the noise statistics of the observed data. To estimate velocity-model uncertainty, we combine the field expansion method, a fast frequency-domain wave-equation solver, with the adaptive Metropolis-Hastings algorithm. The speed of the field expansion method and its reduced parameterization allow us to perform the tens or hundreds of thousands of forward solves needed for non-parametric posterior estimation. We then migrate the observed data with the distribution of velocity models to generate uncertainty estimates of the resulting subsurface image. This procedure allows us both to create qualitative descriptions of seismic image uncertainty and to put error bounds on quantities of interest such as the dip angle of a subduction slab or the thickness of a stratigraphic layer.
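An adaptive Metropolis-Hastings sampler of the kind referenced above can be sketched in a simplified scalar form. This is not the authors' field-expansion pipeline: it adapts only a step size (rather than a full proposal covariance), nudging the log step with a decaying gain so the acceptance rate approaches a target and the chain remains asymptotically valid.

```python
import math
import random

def adaptive_metropolis(logpdf, x0, n_steps, target=0.44, seed=0):
    """Random-walk Metropolis with an adapted step size.

    The log step size is nudged after every proposal so the empirical
    acceptance rate approaches `target` (0.44 is the common 1-D choice);
    the Robbins-Monro gain decays as 1/sqrt(i), so adaptation vanishes.
    """
    rng = random.Random(seed)
    x, log_step = x0, 0.0
    chain = [x0]
    for i in range(1, n_steps + 1):
        y = x + rng.gauss(0.0, math.exp(log_step))
        accept_prob = min(1.0, math.exp(logpdf(y) - logpdf(x)))
        if rng.random() < accept_prob:
            x = y
        # Diminishing adaptation: push acceptance toward the target.
        log_step += (accept_prob - target) / math.sqrt(i)
        chain.append(x)
    return chain

# Example target: a normal with mean 3 and standard deviation 2.
chain = adaptive_metropolis(lambda v: -0.5 * ((v - 3.0) / 2.0) ** 2, 0.0, 30000)
```

In the full problem, `logpdf` would be the (expensive) misfit of a forward wave-equation solve against observed traces, which is why a fast solver is needed to make the many thousands of evaluations tractable.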
NASA Astrophysics Data System (ADS)
Vatansever, Erol
2017-05-01
By means of the Monte Carlo simulation method with the Metropolis algorithm, we elucidate the thermal and magnetic phase-transition behaviors of a ferrimagnetic core/shell nanocubic system driven by a time-dependent magnetic field. The particle core is composed of ferromagnetic spins, and it is surrounded by an antiferromagnetic shell. At the interface of the core/shell particle, we use antiferromagnetic spin-spin coupling. We simulate the nanoparticle using classical Heisenberg spins. After a detailed analysis, our Monte Carlo simulation results suggest that the present system exhibits unusual and interesting magnetic behaviors. For example, in the relatively low-temperature regions, an increment in the amplitude of the external field destroys the antiferromagnetism in the shell part of the nanoparticle, leading to a ground state with ferromagnetic character. Moreover, particular attention has been dedicated to the hysteresis behaviors of the system. For the first time, we show that frequency dispersions can be categorized into three groups at a fixed temperature for finite core/shell systems, as in the case of conventional bulk systems under the influence of an oscillating magnetic field.
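The single-spin Metropolis update used in such simulations can be illustrated with the much simpler 1-D Ising chain (the study itself uses classical Heisenberg spins in a core/shell geometry; this is only a minimal sketch of the accept/reject step): flip one spin at a time, accepting with probability min(1, exp(-βΔE)).

```python
import math
import random

def ising_metropolis(n, beta, sweeps, seed=0):
    """Single-spin-flip Metropolis for a 1-D Ising chain (periodic, J = 1).

    Returns the energy per spin measured after each sweep of n updates.
    """
    rng = random.Random(seed)
    s = [rng.choice((-1, 1)) for _ in range(n)]
    energies = []
    for _ in range(sweeps):
        for _ in range(n):
            i = rng.randrange(n)
            # Energy change of flipping spin i (periodic neighbors).
            dE = 2.0 * s[i] * (s[i - 1] + s[(i + 1) % n])
            if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
                s[i] = -s[i]
        e = -sum(s[i] * s[(i + 1) % n] for i in range(n)) / n
        energies.append(e)
    return energies

# At beta = 1 the exact energy per spin is -tanh(1) ~ -0.762 for large n.
energies = ising_metropolis(n=200, beta=1.0, sweeps=2000)
```

The 1-D chain is exactly solvable, so the sampled energy can be checked against -tanh(βJ); for Heisenberg spins the update is the same accept/reject rule applied to a proposed reorientation of a three-component unit vector.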
Direct simulation Monte Carlo modeling of relaxation processes in polyatomic gases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pfeiffer, M., E-mail: mpfeiffer@irs.uni-stuttgart.de; Nizenkov, P., E-mail: nizenkov@irs.uni-stuttgart.de; Mirza, A., E-mail: mirza@irs.uni-stuttgart.de
2016-02-15
Relaxation processes of polyatomic molecules are modeled and implemented in an in-house Direct Simulation Monte Carlo code in order to enable the simulation of atmospheric entry maneuvers at Mars and Saturn's Titan. The description of rotational and vibrational relaxation processes is derived from basic quantum mechanics using a rigid rotator and a simple harmonic oscillator, respectively. Strategies regarding the vibrational relaxation process are investigated, where good agreement for the relaxation time according to the Landau-Teller expression is found for both methods, the established prohibiting-double-relaxation method and the newly proposed multi-mode relaxation. Differences and application areas of these two methods are discussed. Subsequently, two numerical methods used for sampling of energy values from multi-dimensional distribution functions are compared. The proposed random-walk Metropolis algorithm enables the efficient treatment of multiple vibrational modes within a time step with reasonable computational effort. The implemented model is verified and validated by means of simple reservoir simulations and the comparison to experimental measurements of a hypersonic carbon-dioxide flow around a flat-faced cylinder.
Direct simulation Monte Carlo modeling of relaxation processes in polyatomic gases
NASA Astrophysics Data System (ADS)
Pfeiffer, M.; Nizenkov, P.; Mirza, A.; Fasoulas, S.
2016-02-01
Relaxation processes of polyatomic molecules are modeled and implemented in an in-house Direct Simulation Monte Carlo code in order to enable the simulation of atmospheric entry maneuvers at Mars and Saturn's Titan. The description of rotational and vibrational relaxation processes is derived from basic quantum mechanics using a rigid rotator and a simple harmonic oscillator, respectively. Strategies regarding the vibrational relaxation process are investigated, where good agreement for the relaxation time according to the Landau-Teller expression is found for both methods, the established prohibiting-double-relaxation method and the newly proposed multi-mode relaxation. Differences and application areas of these two methods are discussed. Subsequently, two numerical methods used for sampling of energy values from multi-dimensional distribution functions are compared. The proposed random-walk Metropolis algorithm enables the efficient treatment of multiple vibrational modes within a time step with reasonable computational effort. The implemented model is verified and validated by means of simple reservoir simulations and the comparison to experimental measurements of a hypersonic carbon-dioxide flow around a flat-faced cylinder.
Bayesian Inference for Source Reconstruction: A Real-World Application
Yee, Eugene; Hoffman, Ian; Ungar, Kurt
2014-01-01
This paper applies a Bayesian probabilistic inferential methodology for the reconstruction of the location and emission rate from an actual contaminant source (emission from the Chalk River Laboratories medical isotope production facility) using a small number of activity concentration measurements of a noble gas (Xenon-133) obtained from three stations that form part of the International Monitoring System radionuclide network. The sampling of the resulting posterior distribution of the source parameters is undertaken using a very efficient Markov chain Monte Carlo technique that utilizes a multiple-try differential evolution adaptive Metropolis algorithm with an archive of past states. It is shown that the principal difficulty in the reconstruction lay in the correct specification of the model errors (both scale and structure) for use in the Bayesian inferential methodology. In this context, two different measurement models for incorporation of the model error of the predicted concentrations are considered. The performance of both of these measurement models with respect to their accuracy and precision in the recovery of the source parameters is compared and contrasted. PMID:27379292
A bayesian hierarchical model for classification with selection of functional predictors.
Zhu, Hongxiao; Vannucci, Marina; Cox, Dennis D
2010-06-01
In functional data classification, functional observations are often contaminated by various systematic effects, such as random batch effects caused by device artifacts, or fixed effects caused by sample-related factors. These effects may lead to classification bias and thus should not be neglected. Another issue of concern is the selection of functions when predictors consist of multiple functions, some of which may be redundant. The above issues arise in a real data application where we use fluorescence spectroscopy to detect cervical precancer. In this article, we propose a Bayesian hierarchical model that takes into account random batch effects and selects effective functions among multiple functional predictors. Fixed effects or predictors in nonfunctional form are also included in the model. The dimension of the functional data is reduced through orthonormal basis expansion or functional principal components. For posterior sampling, we use a hybrid Metropolis-Hastings/Gibbs sampler, which suffers from slow mixing. An evolutionary Monte Carlo algorithm is applied to improve the mixing. Simulations and a real data application show that the proposed model provides accurate selection of functional predictors as well as good classification.
2013-12-18
The collection of red dots seen here shows one of several very distant galaxy clusters discovered by combining ground-based optical data from the NOAO Kitt Peak National Observatory with infrared data from NASA's Spitzer Space Telescope.
78 FR 962 - Sunshine Act Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-07
.... (Metropolis Works Uranium Conversion Facility), Docket No. 40-3392, Petition for Review of LBP-12-6 (Mar. 22...--Tentative Thursday, February 7, 2013 1:00 p.m. Briefing on Steam Generator Tube Degradation (Public Meeting...
2008-01-17
Delhi is the second-largest metropolis in India, with a population of 16 million, and is located in northern India along the banks of the Yamuna River. This image was acquired by NASA's Terra satellite on September 22, 2003.
NASA Astrophysics Data System (ADS)
Chen, C.; Lee, J.; Chan, Y.; Lu, C.
2010-12-01
The Taipei Metropolis, home to around 10 million people, is subject to seismic hazard originating not only from distant faults or sources scattered throughout the Taiwan region, but also from an active fault lying directly underneath. Northern Taiwan, including the Taipei region, is currently affected by post-orogenic (Penglai arc-continent collision) processes related to backarc extension of the Ryukyu subduction system. The Shanchiao Fault, an active normal fault outcropping along the western boundary of the Taipei Basin and dipping to the east, is investigated here for its subsurface structure and activity. Borehole records in the central portion of the fault were analyzed to document the stacking of post-Last Glacial Maximum growth sediments, and a tulip flower structure is illuminated with an averaged vertical slip rate of about 3 mm/yr. A similar fault-zone architecture and post-LGM tectonic subsidence rate are also found in the northern portion of the fault. A correlation between geomorphology and structural geology in the Shanchiao Fault zone demonstrates that an array of subtle geomorphic scarps corresponds to the branch fault, while the surface trace of the main fault seems to be completely erased by erosion and sedimentation. Such constraints and knowledge are crucial for earthquake hazard evaluation and mitigation in the Taipei Metropolis, and for understanding the kinematics of transtensional tectonics in northern Taiwan. A schematic 3D diagram of the fault zone in the central portion of the Shanchiao Fault displays the regional subsurface geology and its relation to topographic features.
Wheelchair accessibility to public buildings in the Kumasi metropolis, Ghana.
Yarfi, Cosmos; Ashigbi, Evans Y K; Nakua, Emmanuel K
2017-01-01
Accessibility implies making public places accessible to every individual, irrespective of his or her disability or special need, ensuring the integration of the wheelchair user into society and thereby granting them the capability of participating in activities of daily living and ensuring equality in daily life. This study was carried out to assess the accessibility to wheelchairs of the physical infrastructure (public buildings) in the Kumasi metropolis after the passage of the Ghanaian Disability Law (Act 716, 2006). Eighty-four public buildings housing education facilities, health facilities, ministries, departments and agencies, sports and recreation, religious groups and banks were assessed. The routes, entrances, heights of steps, grades of ramps, sinks, entrances to washrooms, toilets, urinals, automated teller machines and tellers' counters were measured and computed. Of the 84 buildings assessed, only 34 (40.5%) of the buildings, 52.3% of the entrances and 87.4% of the routes were accessible to wheelchair users. A total of 25% (13 out of 52) of the public buildings with more than one floor were fitted with elevators to connect the different floor levels. The results of this study show that public buildings in the Kumasi metropolis are not wheelchair accessible. An important observation made during this study was that there is an intention to improve accessibility when buildings are being constructed or renovated, but there are no laid-down guidelines as to how to make the buildings accessible for wheelchair users.
Ocular Health and Safety Assessment among Mechanics of the Cape Coast Metropolis, Ghana.
Abu, Emmanuel Kwasi; Boadi-Kusi, Samuel Bert; Opuni, Prince Quarcoo; Kyei, Samuel; Owusu-Ansah, Andrew; Darko-Takyi, Charles
2016-01-01
To conduct an ocular health and safety assessment among mechanics in the Cape Coast Metropolis, Ghana. This descriptive cross-sectional study included 500 mechanics selected using multistage sampling. All participants filled out a structured questionnaire on demographic data, occupational history and ocular health history. Study participants underwent determination of visual acuity (VA) using a LogMAR chart, external eye examination with a handheld slit lamp biomicroscope, dilated fundus examination, applanation tonometry and refraction. Out of 500 mechanics, 433 were examined (response rate, 87%), comprising 408 (94.2%) male and 25 (5.8%) female subjects. The prevalence of visual impairment (i.e., presenting VA < 6/18) among the respondents was 2.1%. Eye injuries were reported by 171 (39.5%) mechanics, probably because a large number of workers, 314 (72.5%), did not use eye protective devices. Mechanics in the auto-welding category were at the highest risk of sustaining an eye injury (odds ratio [OR], 13.4; P < 0.001). Anterior segment ocular disorders were mostly pterygia, while posterior segment eye disorders included glaucoma suspects and retinochoroidal lesions. The development of pterygia was associated with the number of years a mechanic stayed on the job. Eye-care-seeking behavior among the participants was poor. Eye injuries were prevalent among the mechanics, as the use of eye protection was low. Eye safety should be made an integral part of the public health agenda in the Cape Coast Metropolis.
Wheelchair accessibility to public buildings in the Kumasi metropolis, Ghana
Ashigbi, Evans Y.K.
2017-01-01
Background Accessibility implies making public places accessible to every individual, irrespective of his or her disability or special need, ensuring the integration of the wheelchair user into society and thereby enabling them to participate in activities of daily living and ensuring equality in daily life. Objective This study was carried out to assess the accessibility of the physical infrastructures (public buildings) in the Kumasi metropolis to wheelchairs after the passage of the Ghanaian Disability Law (Act 716, 2006). Methods Eighty-four public buildings housing education facilities, health facilities, ministries, departments and agencies, sports and recreation, religious groups and banks were assessed. The routes, entrances, height of steps, grade of ramps, sinks, entrance to washrooms, toilets, urinals, automated teller machines and tellers’ counters were measured and computed. Results Out of a total of 84 buildings assessed, only 34 (40.5%) of the buildings, 52.3% of the entrances and 87.4% of the routes of the buildings were accessible to wheelchair users. A total of 25% (13 out of 52) of the public buildings with more than one floor were fitted with elevators to connect the different levels of floors. Conclusion The results of this study show that public buildings in the Kumasi metropolis are not wheelchair accessible. An important observation made during this study was that there is an intention to improve accessibility when buildings are being constructed or renovated, but there are no laid-down guidelines as to how to make the buildings accessible for wheelchair users. PMID:29062761
Obembe, Taiwo A.; Osungbade, Kayode O.; Ademokun, Oluwakemi M.
2016-01-01
Background: The awareness, knowledge, and involvement of teachers in the implementation of the School Health Programme (SHP) in secondary schools are essential in ensuring the effectiveness and overall success of the School Health Policy. This study assessed the awareness and knowledge of teachers on the SHP in Ibadan metropolis. Methods: A descriptive cross-sectional study was carried out using a two-stage sampling technique to select 426 secondary school teachers across all five Urban Local Government Areas (LGAs) in Ibadan metropolis by balloting. Pretested semi-structured questionnaires were used to collect data from the 426 teachers. Quantitative data were analyzed using descriptive statistics, Chi-square tests, and logistic regression at the 5% level of significance. Results: About one-third of the respondents had heard of the National School Health Policy (NSHP); however, few had seen the document. About half of the respondents were aware of the SHP in their schools. Many of the respondents had good knowledge of the SHP. Age and level of education of participants significantly influenced knowledge of the SHP. Age above 50 years and a postgraduate qualification were significant predictors of good knowledge of the SHP. Conclusions: Awareness of the NSHP was low despite the good knowledge of the SHP. This could be due to the tertiary education that most of the respondents had. Concerted efforts of stakeholders are required to intensify health education awareness campaigns to improve teachers’ knowledge of the NSHP. PMID:27630385
Lü, Qiang; Xia, Xiao-Yan; Chen, Rong; Miao, Da-Jun; Chen, Sha-Sha; Quan, Li-Jun; Li, Hai-Ou
2012-01-01
Protein structure prediction (PSP), which is usually modeled as a computational optimization problem, remains one of the biggest challenges in computational biology. PSP faces two difficult obstacles: the inaccurate energy function problem and the searching problem. Even if the lowest energy is luckily found by the searching procedure, the correct protein structure is not guaranteed to be obtained. A general parallel metaheuristic approach is presented to tackle the above two problems. Multiple energy functions are employed to simultaneously guide the parallel searching threads. Searching trajectories are in fact controlled by the parameters of the heuristic algorithms. The parallel approach allows the parameters to be perturbed while the searching threads are running in parallel, with each thread searching for the lowest energy value determined by an individual energy function. By hybridizing the intelligences of parallel ant colonies and Monte Carlo Metropolis search, this paper demonstrates an implementation of our parallel approach for PSP. Sixteen classical instances were tested to show that the parallel approach is competitive for solving the PSP problem. This parallel approach combines various sources of both searching intelligence and energy functions, and thus predicts protein conformations with good quality jointly determined by all the parallel searching threads and energy functions. It provides a framework to combine the different searching intelligences embedded in heuristic algorithms. It also constructs a container to hybridize different not-so-accurate objective functions, which are usually derived from domain expertise.
Lü, Qiang; Xia, Xiao-Yan; Chen, Rong; Miao, Da-Jun; Chen, Sha-Sha; Quan, Li-Jun; Li, Hai-Ou
2012-01-01
Background Protein structure prediction (PSP), which is usually modeled as a computational optimization problem, remains one of the biggest challenges in computational biology. PSP faces two difficult obstacles: the inaccurate energy function problem and the searching problem. Even if the lowest energy is luckily found by the searching procedure, the correct protein structure is not guaranteed to be obtained. Results A general parallel metaheuristic approach is presented to tackle the above two problems. Multiple energy functions are employed to simultaneously guide the parallel searching threads. Searching trajectories are in fact controlled by the parameters of the heuristic algorithms. The parallel approach allows the parameters to be perturbed while the searching threads are running in parallel, with each thread searching for the lowest energy value determined by an individual energy function. By hybridizing the intelligences of parallel ant colonies and Monte Carlo Metropolis search, this paper demonstrates an implementation of our parallel approach for PSP. Sixteen classical instances were tested to show that the parallel approach is competitive for solving the PSP problem. Conclusions This parallel approach combines various sources of both searching intelligence and energy functions, and thus predicts protein conformations with good quality jointly determined by all the parallel searching threads and energy functions. It provides a framework to combine the different searching intelligences embedded in heuristic algorithms. It also constructs a container to hybridize different not-so-accurate objective functions, which are usually derived from domain expertise. PMID:23028708
Mathieu, Jordane A; Hatté, Christine; Balesdent, Jérôme; Parent, Éric
2015-11-01
The response of soil carbon dynamics to climate and land-use change will affect both the future climate and the quality of ecosystems. Deep soil carbon (>20 cm) is the primary component of the soil carbon pool, but its dynamics remain poorly understood. Radiocarbon activity (Δ14C), which is a function of the age of carbon, may therefore help in understanding the rates of soil carbon biodegradation and stabilization. We analyzed the published 14C contents in 122 profiles of mineral soil that were well distributed in most of the large world biomes, except for the boreal zone. With a multivariate extension of a linear mixed-effects model whose inference was based on the parallel combination of two algorithms, expectation-maximization (EM) and Metropolis-Hastings, we expressed soil Δ14C profiles as a four-parameter function of depth. The four-parameter model produced insightful predictions of soil Δ14C as dependent on depth, soil type, climate, vegetation, land-use and date of sampling (R² = 0.68). Further analysis with the model showed that the age of topsoil carbon was primarily affected by climate and cultivation. By contrast, the age of deep soil carbon was affected more by soil taxa than by climate, illustrating the strong dependence of soil carbon dynamics on other pedologic traits such as clay content and mineralogy. © 2015 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Yamauchi, Masataka; Okumura, Hisashi
2017-11-01
We developed a two-dimensional replica-permutation molecular dynamics method in the isothermal-isobaric ensemble. The replica-permutation method, originally developed in the canonical ensemble, is a better alternative to the replica-exchange method. It employs the Suwa-Todo algorithm, instead of the Metropolis algorithm, to perform permutations of temperatures and pressures among more than two replicas so that the rejection ratio can be minimized. We showed that the isothermal-isobaric replica-permutation method achieves better sampling efficiency than the isothermal-isobaric replica-exchange method and the infinite swapping method. We applied this method to a β-hairpin mini protein, chignolin. In this simulation, we observed not only the folded state but also the misfolded state. We calculated the temperature and pressure dependence of the fractions of the folded, misfolded, and unfolded states. Differences in partial molar enthalpy, internal energy, entropy, partial molar volume, and heat capacity were also determined and agreed well with experimental data. We observed a new phenomenon in which misfolded chignolin becomes more stable under high-pressure conditions. We also revealed the mechanism of this stability as follows: the TYR2 and TRP9 side chains cover the hydrogen bonds that form the β-hairpin structure, protecting them from the water molecules that approach the protein as the pressure increases.
NASA Astrophysics Data System (ADS)
Hameed, Amer; Appleby-Thomas, Gareth; Wood, David; Jaansalu, Kevin
2015-06-01
Recent studies have shown evidence that the ballistic resistance of fragmented (comminuted) ceramics is independent of the original strength of the material. In particular, experimental investigations into the ballistic behaviour of such fragmented ceramics have indicated that this response is correlated to the shattered ceramic morphology. This suggests that careful control of ceramic microstructure - and therefore failure paths - might provide a route to optimise post-impact ballistic performance, thereby enhancing multi-hit capability. In this study, building on previous in-house work, ballistic tests were conducted using pre-formed 'fragmented-ceramic' analogues based around three morphologically differing (but chemically identical) alumina feedstock materials compacted into target 'pucks'. In an evolution of previous work, variation of target thickness provided additional insight into an apparent morphology-based contribution to ballistic response.
Hybrid lattice gas simulations of flow through porous media
NASA Astrophysics Data System (ADS)
Becklehimer, Jeffrey Lynn
1997-10-01
This study introduces a suite of models designed to investigate transport phenomena in simulated porous media such as rigid or quenched sediment and clay-like deformable environments. This is achieved using a variety of techniques borrowed from the field of statistical physics, including percolation, lattice gases, and cellular automata. A percolation-based model is used to study a porous medium by using rods and chains of various shapes and sizes to model the porous media formed by sediments. This is further extended to model clay-like deformable media with interacting heavy sediment particles. An interacting lattice gas computer simulation model based on the Metropolis algorithm is used to study the transport properties of fluid particles and the permeability of a porous sediment. Finally, a hybrid lattice gas model is introduced by combining the Metropolis Monte Carlo method with a direct simulation that applies collision rules as in cellular automata. This model is then used to study shock propagation in a fluid-filled porous medium, and the study is extended to shock propagation in a fluid-filled elastic porous medium. Several interesting new results were obtained. These show that for rigid chain percolation the percolation threshold depends on the chain length as pc ~ Lc^(-1/2) and the jamming coverage decreases with the chain length as Lc^(-1/3). For the random SAW-like chains, the percolation threshold decays with the chain length as Lc^(-0.01) and the jamming coverage as Lc^(-1/3). The fluid flow model shows that permeability depends nonmonotonically on the concentration of the fluid: for some fluids at a fixed porosity, the permeability increases with increasing bias until a certain value Bc, above which it decreases. Also, it was found that a shock propagates in a drift-like fashion in a rigid porous medium when the porosity is high; low porosity damps out the shock front very quickly.
For a shock propagating in a clay-like porous medium an unusually super-fast power-law behavior is observed for the RMS displacements of the fluid and clay particles.
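The interacting lattice gas with Metropolis updating described above can be sketched, in a much-simplified two-dimensional form, as follows. The lattice size, particle number, coupling, and temperature here are illustrative assumptions, not parameters from the study:

```python
import math
import random

def metropolis_lattice_gas(L=16, n_particles=64, beta=1.0, J=1.0,
                           sweeps=50, seed=0):
    """Toy interacting lattice gas on an L x L periodic lattice:
    particles attempt hops to empty neighbour sites, accepted with
    the Metropolis criterion under an attractive coupling -J between
    occupied nearest neighbours."""
    rng = random.Random(seed)
    occ = [[0] * L for _ in range(L)]
    for i, j in rng.sample([(i, j) for i in range(L) for j in range(L)],
                           n_particles):
        occ[i][j] = 1
    nbrs = ((1, 0), (-1, 0), (0, 1), (0, -1))

    def local_energy(i, j):
        # interaction energy of a particle at (i, j) with its neighbours
        return -J * sum(occ[(i + di) % L][(j + dj) % L] for di, dj in nbrs)

    accepted = 0
    for _ in range(sweeps * n_particles):
        i, j = rng.randrange(L), rng.randrange(L)
        if not occ[i][j]:
            continue                      # picked an empty site
        di, dj = rng.choice(nbrs)
        ni, nj = (i + di) % L, (j + dj) % L
        if occ[ni][nj]:
            continue                      # hop blocked by another particle
        e_old = local_energy(i, j)
        occ[i][j] = 0                     # tentatively move the particle
        d_e = local_energy(ni, nj) - e_old
        # Metropolis rule: always accept downhill, accept uphill
        # with probability exp(-beta * dE)
        if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
            occ[ni][nj] = 1
            accepted += 1
        else:
            occ[i][j] = 1                 # reject: restore old position
    return occ, accepted

occ, accepted = metropolis_lattice_gas()
```

Particle number is conserved by construction, so quantities such as permeability or density correlations could then be measured over many such sweeps.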
1995 Bicycle and Pedestrian Safety Report
DOT National Transportation Integrated Search
1995-03-01
This report provides a review of the current data on bicycle and pedestrian safety across the United States, finding that safety and education programs could significantly improve bicycle and pedestrian safety in the Dallas-Fort Worth Metropoli...
[Study on vitamin A nutritional status of Chinese urban elderly residents in 2010-2012].
Chen, J; Hu, Y C; Yang, C; Yun, C F; Wang, R; Mao, D Q; Li, W D; Yang, Y H; Yang, X G; Yang, L C
2017-02-06
Objective: To assess the vitamin A nutritional status of the Chinese urban elderly population by analyzing serum retinol levels in 2010-2012. Methods: Data were collected from the Chinese National Nutrition and Health Survey in 2010-2012. Using the multi-stage stratified cluster sampling method, serum samples from elderly residents aged ≥60 years were obtained from 34 metropolises and 41 middle-sized and small cities. Demographic data were collected using a questionnaire survey. The serum retinol concentration was determined by high-performance liquid chromatography. Vitamin A deficiency (VAD) was determined using the World Health Organization guidelines. A total of 3 200 elderly residents were included in the study. The serum retinol levels and the prevalence of VAD and marginal VAD were also compared. Results: The serum retinol concentration (P50 (P25-P75)) of Chinese urban elderly residents was 1.83 (1.37-2.39) μmol/L. Compared with middle-sized and small cities (1.91 (1.47-2.48) μmol/L), the retinol level of senior citizens in metropolises (1.70 (1.25-2.25) μmol/L) was significantly lower (P < 0.001). The serum retinol level of elderly males (1.89 (1.37-2.47) μmol/L) was significantly higher than that of females (1.80 (1.36-2.28) μmol/L) (P = 0.001). The serum retinol concentration was 1.87 (1.42-2.43), 1.78 (1.32-2.33), and 1.71 (1.24-2.24) μmol/L for 60-69, 70-79, and ≥80 year-olds, respectively. The retinol level in elderly people ≥70 years old was significantly lower than that of 60-69 year-olds (P < 0.001). The overall prevalence of VAD among Chinese urban elderly residents was 4.22% (135/3 200); 6.00% (81/1 350) for metropolis residents and 2.92% (54/1 850) for middle-sized and small city residents. The overall marginal VAD rate of Chinese urban elderly residents was 8.19% (262/3 200); 10.51% (142/1 350) for metropolis residents and 6.49% (120/1 850) for medium-sized and small city residents.
The prevalence of VAD and marginal VAD for males was 3.87% (61/1 577) and 8.24% (130/1 577), respectively (P < 0.05). The prevalence of VAD by age group was 3.65% (72/1 975), 4.96% (50/1 008), and 5.99% (13/217), respectively (P = 0.097). The prevalence of marginal VAD by age group was 6.99% (138/1 975), 9.82% (99/1 008), and 11.52% (25/217), respectively (P = 0.05). Conclusion: Chinese urban elderly residents showed various levels of VAD, while marginal VAD was quite common. As VAD was more common in metropolis residents and older residents, specific strategies should target these populations.
Galactic City at the Edge of the Universe
2011-01-12
Astronomers have discovered a massive cluster of young galaxies forming in the distant universe. The growing galactic metropolis is known as COSMOS-AzTEC3. This image was taken by Japan's Subaru telescope atop Mauna Kea in Hawaii.
Rainforest metropolis casts 1,000-km defaunation shadow.
Tregidgo, Daniel J; Barlow, Jos; Pompeu, Paulo S; de Almeida Rocha, Mayana; Parry, Luke
2017-08-08
Tropical rainforest regions are urbanizing rapidly, yet the role of emerging metropolises in driving wildlife overharvesting in forests and inland waters is unknown. We present evidence of a large defaunation shadow around a rainforest metropolis. Using interviews with 392 rural fishers, we show that fishing has severely depleted a large-bodied keystone fish species, tambaqui (Colossoma macropomum), with an impact extending over 1,000 km from the rainforest city of Manaus (population 2.1 million). There was strong evidence of defaunation within this area, including a 50% reduction in body size and catch rate (catch per unit effort). Our findings link these declines to city-based boats that provide rural fishers with reliable access to fish buyers and ice and likely impact rural fisher livelihoods and flooded forest biodiversity. This empirical evidence that urban markets can defaunate deep into rainforest wilderness has implications for other urbanizing socioecological systems.
Optimized nested Markov chain Monte Carlo sampling: theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coe, Joshua D; Shaw, M Sam; Sewell, Thomas D
2009-01-01
Metropolis Monte Carlo sampling of a reference potential is used to build a Markov chain in the isothermal-isobaric ensemble. At the endpoints of the chain, the energy is reevaluated at a different level of approximation (the 'full' energy) and a composite move encompassing all of the intervening steps is accepted on the basis of a modified Metropolis criterion. By manipulating the thermodynamic variables characterizing the reference system we maximize the average acceptance probability of composite moves, lengthening significantly the random walk made between consecutive evaluations of the full energy at a fixed acceptance probability. This provides maximally decorrelated samples of the full potential, thereby lowering the total number required to build ensemble averages of a given variance. The efficiency of the method is illustrated using model potentials appropriate to molecular fluids at high pressure. Implications for ab initio or density functional theory (DFT) treatment are discussed.
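The nested scheme described above — a cheap reference potential driving an inner Metropolis chain, with the full energy evaluated only at the endpoints of each composite move — can be sketched in one dimension. The potentials, temperature, and step sizes below are illustrative assumptions, not the paper's molecular-fluid models:

```python
import math
import random

def nested_metropolis(e_full, e_ref, x0, beta=1.0, n_inner=20,
                      n_composite=200, step=0.5, seed=1):
    """Toy nested Markov chain Monte Carlo: an inner Metropolis chain
    samples the reference potential; the composite move is accepted
    with a modified criterion that corrects toward the full potential,
    evaluated only at the chain endpoints."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_composite):
        y = x
        # inner chain: ordinary Metropolis on the cheap reference energy
        for _ in range(n_inner):
            y_new = y + rng.uniform(-step, step)
            d_ref = e_ref(y_new) - e_ref(y)
            if d_ref <= 0 or rng.random() < math.exp(-beta * d_ref):
                y = y_new
        # composite move: accept with the full/reference mismatch
        # dE = [E_full(y) - E_full(x)] - [E_ref(y) - E_ref(x)]
        d = (e_full(y) - e_full(x)) - (e_ref(y) - e_ref(x))
        if d <= 0 or rng.random() < math.exp(-beta * d):
            x = y
        samples.append(x)
    return samples

# hypothetical potentials for illustration only
e_ref = lambda x: 0.5 * x * x                   # cheap harmonic reference
e_full = lambda x: 0.5 * x * x + 0.1 * x ** 4   # "full" energy

chain = nested_metropolis(e_full, e_ref, x0=0.0)
```

Because the full energy appears only in the endpoint correction, it is evaluated once per composite move rather than once per step, which is the source of the speedup when the full energy is expensive (e.g. DFT).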
Quality of life of deaf and hard of hearing students in Ibadan metropolis, Nigeria
2018-01-01
Quality of Life encompasses an individual’s well-being and health, social participation and satisfaction with functional daily living. Disabilities such as deafness can impact the quality of life, with spatial variance depending on the environment. Deafness causes communication problems with significant consequences for the cognitive, social, and emotional well-being of affected individuals. However, information relating to the quality of life of deaf and hard of hearing individuals, especially students in developing countries like Nigeria, which could be used to design special health-related interventions, is sparse. This study examined the quality of life of deaf and hard of hearing students in Ibadan metropolis, Nigeria. One hundred and ten deaf and hard of hearing students participated in this cross-sectional study. Participants were drawn from all four secondary schools for the Deaf in Ibadan metropolis. The 26-item brief version of the WHO Quality of Life questionnaire was used for data collection. The data were analyzed using descriptive and inferential statistics at a statistical significance of p < 0.05. The majority (57.8%) of the deaf and hard of hearing students had poor quality of life. Attending the special school for the Deaf, upper socio-economic status and age (≥17 years) were significantly associated with better quality of life. However, gender and age at onset of hearing loss had no significant influence on quality of life. The Deaf community available in the special school appeared to protect against stigma and discrimination, while also promoting social interactions between deaf and hard of hearing individuals. PMID:29293560
Quality of life of deaf and hard of hearing students in Ibadan metropolis, Nigeria.
Jaiyeola, Mofadeke T; Adeyemo, Adebolajo A
2018-01-01
Quality of Life encompasses an individual's well-being and health, social participation and satisfaction with functional daily living. Disabilities such as deafness can impact the quality of life, with spatial variance depending on the environment. Deafness causes communication problems with significant consequences for the cognitive, social, and emotional well-being of affected individuals. However, information relating to the quality of life of deaf and hard of hearing individuals, especially students in developing countries like Nigeria, which could be used to design special health-related interventions, is sparse. This study examined the quality of life of deaf and hard of hearing students in Ibadan metropolis, Nigeria. One hundred and ten deaf and hard of hearing students participated in this cross-sectional study. Participants were drawn from all four secondary schools for the Deaf in Ibadan metropolis. The 26-item brief version of the WHO Quality of Life questionnaire was used for data collection. The data were analyzed using descriptive and inferential statistics at a statistical significance of p < 0.05. The majority (57.8%) of the deaf and hard of hearing students had poor quality of life. Attending the special school for the Deaf, upper socio-economic status and age (≥17 years) were significantly associated with better quality of life. However, gender and age at onset of hearing loss had no significant influence on quality of life. The Deaf community available in the special school appeared to protect against stigma and discrimination, while also promoting social interactions between deaf and hard of hearing individuals.
Martin, Stephen B.; Schauer, Elizabeth S.; Blum, David H.; Kremer, Paul A.; Bahnfleth, William P.; Freihaut, James D.
2017-01-01
We developed, characterized, and tested a new dual-collimation aqueous UV reactor to improve the accuracy and consistency of aqueous k-value determinations. This new system is unique because it collimates UV energy from a single lamp in two opposite directions. The design provides two distinct advantages over traditional single-collimation systems: 1) real-time UV dose (fluence) determination; and 2) simple actinometric determination of a reactor factor that relates measured irradiance levels to the actual irradiance levels experienced by the microbial suspension. This reactor factor replaces three of the four typical correction factors required for single-collimation reactors. Using this dual-collimation reactor, Bacillus subtilis spores demonstrated inactivation following the classic multi-hit model with k = 0.1471 cm²/mJ (with 95% confidence bounds of 0.1426 to 0.1516). PMID:27498232
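The "classic multi-hit model" cited above admits several parameterizations; one common single-target, multi-hit form can be sketched as follows, using the reported k-value. The hit number n and the dose are illustrative assumptions, not values from the study:

```python
import math

def multi_hit_survival(dose, k, n=1):
    """Single-target multi-hit survival: hits arrive as a Poisson
    process with mean k*dose, and a spore survives while it has taken
    fewer than n hits:  S(D) = exp(-kD) * sum_{i<n} (kD)^i / i!.
    For n = 1 this reduces to simple exponential (single-hit) decay."""
    kd = k * dose
    return math.exp(-kd) * sum(kd ** i / math.factorial(i) for i in range(n))

# k-value reported in the abstract (cm^2/mJ); n and the dose are illustrative
k = 0.1471
single_hit = multi_hit_survival(10.0, k, n=1)   # equals exp(-1.471)
multi_hit = multi_hit_survival(10.0, k, n=3)    # shoulder: higher survival
```

The n > 1 curves show the characteristic shoulder at low doses that motivates multi-hit fits to spore inactivation data.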
Development of human epithelial cell systems for radiation risk assessment
NASA Astrophysics Data System (ADS)
Yang, C. H.; Craise, L. M.
1994-10-01
The most important health effect of space radiation for astronauts is cancer induction. For radiation risk assessment, an understanding of the carcinogenic effect of heavy ions in human cells is essential. In our laboratory, we have successfully developed a human mammary epithelial cell system for studying neoplastic transformation in vitro. Growth variants were obtained from a heavy-ion-irradiated immortal mammary cell line. These cloned growth variants can grow in regular tissue culture media and maintain anchorage-dependent growth and the density inhibition property. Upon further irradiation with high-LET radiation, transformed foci were found. Experimental results from these studies suggest that multiple exposures to radiation are required to induce neoplastic transformation of human epithelial cells. This multihit requirement may be due to the high genomic stability of human cells. These growth variants can be useful model systems for space flight experiments to determine the carcinogenic effect of space radiation in human epithelial cells.
Development of human epithelial cell systems for radiation risk assessment
NASA Technical Reports Server (NTRS)
Yang, C. H.; Craise, L. M.
1994-01-01
The most important health effect of space radiation for astronauts is cancer induction. For radiation risk assessment, an understanding of the carcinogenic effect of heavy ions in human cells is essential. In our laboratory, we have successfully developed a human mammary epithelial cell system for studying neoplastic transformation in vitro. Growth variants were obtained from a heavy-ion-irradiated immortal mammary cell line. These cloned growth variants can grow in regular tissue culture media and maintain anchorage-dependent growth and the density inhibition property. Upon further irradiation with high-Linear Energy Transfer (LET) radiation, transformed foci were found. Experimental results from these studies suggest that multiple exposures to radiation are required to induce neoplastic transformation of human epithelial cells. This multihit requirement may be due to the high genomic stability of human cells. These growth variants can be useful model systems for space flight experiments to determine the carcinogenic effect of space radiation in human epithelial cells.
Bayesian inference in an item response theory model with a generalized student t link function
NASA Astrophysics Data System (ADS)
Azevedo, Caio L. N.; Migon, Helio S.
2012-10-01
In this paper we introduce a new item response theory (IRT) model with a generalized Student t link function with unknown degrees of freedom (df), named the generalized t-link (GtL) IRT model. In this model we consider only the difficulty parameter in the item response function. GtL is an alternative to the two-parameter logit and probit models, since the degrees of freedom play a role similar to that of the discrimination parameter. However, the behavior of the GtL curves differs from that of the two-parameter models and the usual Student t link, since in GtL the curves obtained with different df can cross the probit curves at more than one latent trait level. The GtL model has properties similar to those of generalized linear mixed models, such as the existence of sufficient statistics and easy parameter interpretation. Also, many techniques of parameter estimation, model fit assessment and residual analysis developed for those models can be used for the GtL model. We develop fully Bayesian estimation and model fit assessment tools through a Metropolis-Hastings step within a Gibbs sampling algorithm. We also conduct a prior sensitivity analysis concerning the degrees of freedom. The simulation study indicates that the algorithm recovers all parameters properly. In addition, some Bayesian model fit assessment tools are considered. Finally, a real data set is analyzed using our approach and other usual models. The results indicate that our model fits the data better than the two-parameter models.
NASA Astrophysics Data System (ADS)
Keating, Elizabeth H.; Doherty, John; Vrugt, Jasper A.; Kang, Qinjun
2010-10-01
Highly parameterized and CPU-intensive groundwater models are increasingly being used to understand and predict flow and transport through aquifers. Despite their frequent use, these models pose significant challenges for parameter estimation and predictive uncertainty analysis algorithms, particularly global methods, which usually require very large numbers of forward runs. Here we present a general methodology for parameter estimation and uncertainty analysis that can be utilized in these situations. Our proposed method includes extraction of a surrogate model that mimics key characteristics of a full process model, followed by testing and implementation of a pragmatic uncertainty analysis technique, called null-space Monte Carlo (NSMC), that merges the strengths of gradient-based search and parameter dimensionality reduction. As part of the surrogate model analysis, the results of NSMC are compared with a formal Bayesian approach using the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. Such a comparison has never been accomplished before, especially in the context of high parameter dimensionality. Despite the highly nonlinear nature of the inverse problem, the existence of multiple local minima, and the relatively large parameter dimensionality, both methods performed well and results compare favorably with each other. Experiences gained from the surrogate model analysis are then transferred to calibrate the full highly parameterized and CPU-intensive groundwater model and to explore predictive uncertainty of predictions made by that model. The methodology presented here is generally applicable to any highly parameterized and CPU-intensive environmental model, where efficient methods such as NSMC provide the only practical means for conducting predictive uncertainty analysis.
Dynamic Conformations of Nucleosome Arrays in Solution from Small-Angle X-ray Scattering
NASA Astrophysics Data System (ADS)
Howell, Steven C.
Chromatin conformation and dynamics remain unsolved despite the critical role of chromatin in fundamental genetic functions such as transcription, replication, and repair. At the molecular level, chromatin can be viewed as a linear array of nucleosomes, each consisting of 147 base pairs (bp) of double-stranded DNA (dsDNA) wrapped around a protein core and connected by 10 to 90 bp of linker dsDNA. Using small-angle X-ray scattering (SAXS), we investigated how the conformations of model nucleosome arrays in solution are modulated by ionic conditions as well as by the effect of linker histone proteins. To facilitate ensemble modeling of these SAXS measurements, we developed a simulation method that treats coarse-grained DNA as a Markov chain, then explores possible DNA conformations using Metropolis Monte Carlo (MC) sampling. This algorithm extends the functionality of SASSIE, a program used to model intrinsically disordered biological molecules, adding to the previous methods for simulating proteins, carbohydrates, and single-stranded DNA. Our SAXS measurements of various nucleosome arrays, together with the MC-generated models, provide valuable solution structure information identifying specific differences from the structure of crystallized arrays.
Learn-as-you-go acceleration of cosmological parameter estimates
NASA Astrophysics Data System (ADS)
Aslanyan, Grigor; Easther, Richard; Price, Layne C.
2015-09-01
Cosmological analyses can be accelerated by approximating slow calculations using a training set, which is either precomputed or generated dynamically. However, this approach is only safe if the approximations are well understood and controlled. This paper surveys issues associated with the use of machine-learning based emulation strategies for accelerating cosmological parameter estimation. We describe a learn-as-you-go algorithm that is implemented in the Cosmo++ code and (1) trains the emulator while simultaneously estimating posterior probabilities; (2) identifies unreliable estimates, computing the exact numerical likelihoods if necessary; and (3) progressively learns and updates the error model as the calculation progresses. We explicitly describe and model the emulation error and show how this can be propagated into the posterior probabilities. We apply these techniques to the Planck likelihood and the calculation of ΛCDM posterior probabilities. The computation is significantly accelerated without a pre-defined training set and uncertainties in the posterior probabilities are subdominant to statistical fluctuations. We have obtained a speedup factor of 6.5 for Metropolis-Hastings and 3.5 for nested sampling. Finally, we discuss the general requirements for a credible error model and show how to update them on-the-fly.
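The core idea above — emulate the slow likelihood when the emulator is trustworthy, otherwise compute exactly and add the result to the training set — can be sketched with a deliberately crude surrogate. The nearest-neighbour emulator and the trust radius below are illustrative stand-ins for the paper's error-controlled emulation, not the Cosmo++ implementation:

```python
class LearnAsYouGo:
    """Toy 'learn-as-you-go' evaluator: a nearest-neighbour surrogate
    for an expensive log-likelihood that falls back to (and learns
    from) the exact calculation whenever no training point is within
    a trust radius."""

    def __init__(self, exact_loglike, trust_radius=0.2):
        self.exact = exact_loglike
        self.radius = trust_radius
        self.train = []      # (theta, loglike) pairs learned so far
        self.n_exact = 0     # count of slow, exact evaluations

    def __call__(self, theta):
        if self.train:
            dist, value = min((abs(t - theta), v) for t, v in self.train)
            if dist <= self.radius:
                return value         # cheap emulated estimate
        value = self.exact(theta)    # unreliable estimate: go exact
        self.train.append((theta, value))
        self.n_exact += 1
        return value

# hypothetical "expensive" log-likelihood for illustration
loglike = LearnAsYouGo(lambda th: -0.5 * th * th)
a = loglike(0.0)    # no training data yet: exact call
b = loglike(0.05)   # close to a known point: emulated, no exact call
c = loglike(1.0)    # far from the training set: exact call again
```

A production version would also propagate an emulation-error estimate into the posterior, as the paper describes, rather than trusting the surrogate inside the radius unconditionally.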
Effective gravitational coupling in modified teleparallel theories
NASA Astrophysics Data System (ADS)
Abedi, Habib; Capozziello, Salvatore; D'Agostino, Rocco; Luongo, Orlando
2018-04-01
In the present study, we consider an extended form of teleparallel Lagrangian f(T, ϕ, X), a function of a scalar field ϕ, its kinetic term X and the torsion scalar T. We use linear perturbations to obtain the equation of matter density perturbations on sub-Hubble scales. The gravitational coupling is modified in scalar modes with respect to that of general relativity, whereas vector modes decay and do not show any significant effects. We then extend these results to multiple scalar field models. Further, we study conformal transformations in teleparallel gravity and obtain the coupling when the scalar field is nonminimally coupled to both torsion and boundary terms. Finally, we propose the specific model f(T, ϕ, X) = T + ∂_μϕ∂^μϕ + ξTϕ². To test its viability, we employ the observational Hubble data, constraining the coupling constant ξ through a Monte Carlo technique based on the Metropolis-Hastings algorithm. Fixing ξ to its best-fit value obtained from our numerical analysis, we calculate the growth rate of matter perturbations and compare our outcomes with the latest measurements and the predictions of the ΛCDM model.
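The Metropolis-Hastings step used to constrain ξ can be illustrated generically. The sampler below is a standard random-walk sketch; the toy one-parameter Gaussian posterior stands in for the actual Hubble-data likelihood, which is not reproduced here.

```python
import numpy as np

def metropolis_hastings(logpost, theta0, n_steps=5000, step=0.1, seed=0):
    """Minimal random-walk Metropolis-Hastings sampler (generic sketch)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, float)
    lp = logpost(theta)
    chain = []
    for _ in range(n_steps):
        prop = theta + step * rng.normal(size=theta.shape)  # symmetric proposal
        lp_prop = logpost(prop)
        # accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)

# Toy use: constrain a single parameter (think of it as xi) whose
# posterior is taken to be Gaussian with mean 1.0 and width 0.2.
chain = metropolis_hastings(lambda t: -0.5 * ((t[0] - 1.0) / 0.2) ** 2,
                            theta0=[0.0], n_steps=4000, step=0.3)
```

The best-fit value and credible interval then come from the chain histogram after discarding burn-in.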
Learn-as-you-go acceleration of cosmological parameter estimates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aslanyan, Grigor; Easther, Richard; Price, Layne C., E-mail: g.aslanyan@auckland.ac.nz, E-mail: r.easther@auckland.ac.nz, E-mail: lpri691@aucklanduni.ac.nz
2015-09-01
Cosmological analyses can be accelerated by approximating slow calculations using a training set, which is either precomputed or generated dynamically. However, this approach is only safe if the approximations are well understood and controlled. This paper surveys issues associated with the use of machine-learning based emulation strategies for accelerating cosmological parameter estimation. We describe a learn-as-you-go algorithm that is implemented in the Cosmo++ code and (1) trains the emulator while simultaneously estimating posterior probabilities; (2) identifies unreliable estimates, computing the exact numerical likelihoods if necessary; and (3) progressively learns and updates the error model as the calculation progresses. We explicitly describe and model the emulation error and show how this can be propagated into the posterior probabilities. We apply these techniques to the Planck likelihood and the calculation of ΛCDM posterior probabilities. The computation is significantly accelerated without a pre-defined training set and uncertainties in the posterior probabilities are subdominant to statistical fluctuations. We have obtained a speedup factor of 6.5 for Metropolis-Hastings and 3.5 for nested sampling. Finally, we discuss the general requirements for a credible error model and show how to update them on-the-fly.
Mattfeldt, Torsten
2011-04-01
Computer-intensive methods may be defined as data analytical procedures involving a huge number of highly repetitive computations. We mention resampling methods with replacement (bootstrap methods), resampling methods without replacement (randomization tests) and simulation methods. The resampling methods are based on simple and robust principles and are largely free from distributional assumptions. Bootstrap methods may be used to compute confidence intervals for a scalar model parameter and for summary statistics from replicated planar point patterns, and for significance tests. For some simple models of planar point processes, point patterns can be simulated by elementary Monte Carlo methods. The simulation of models with more complex interaction properties usually requires more advanced computing methods. In this context, we mention simulation of Gibbs processes with Markov chain Monte Carlo methods using the Metropolis-Hastings algorithm. An alternative to simulations on the basis of a parametric model consists of stochastic reconstruction methods. The basic ideas behind the methods are briefly reviewed and illustrated by simple worked examples in order to encourage novices in the field to use computer-intensive methods. © 2010 The Authors Journal of Microscopy © 2010 Royal Microscopical Society.
Sterilisation: characteristics of vasectomy acceptors in Delhi.
Sarkar, N N
1993-01-01
The place of vasectomy within the sterilisation programme in Delhi over the period 1983-88 is reviewed and data on vasectomy acceptance and characteristics of acceptors are analysed. Findings suggest a need to improve the strategy for the promotion of vasectomy within the metropolis.
NASA Astrophysics Data System (ADS)
Ustinov, E. A.
2017-01-01
The paper aims at a comparison of techniques based on the kinetic Monte Carlo (kMC) and the conventional Metropolis Monte Carlo (MC) methods as applied to the hard-sphere (HS) fluid and solid. In the case of the kMC, an alternative representation of the chemical potential is explored [E. A. Ustinov and D. D. Do, J. Colloid Interface Sci. 366, 216 (2012)], which does not require any external procedure like the Widom test particle insertion method. A direct evaluation of the chemical potential of the fluid and solid without thermodynamic integration is achieved by molecular simulation in an elongated box with an external potential imposed on the system in order to reduce the particle density in the vicinity of the box ends. The existence of rarefied zones allows one to determine the chemical potential of the crystalline phase and substantially increases its accuracy for the disordered dense phase in the central zone of the simulation box. This method is applicable to both the Metropolis MC and the kMC, but in the latter case, the chemical potential is determined with higher accuracy at the same conditions and the number of MC steps. Thermodynamic functions of the disordered fluid and crystalline face-centered cubic (FCC) phase for the hard-sphere system have been evaluated with the kinetic MC and the standard MC coupled with the Widom procedure over a wide range of density. The melting transition parameters have been determined by the point of intersection of the pressure-chemical potential curves for the disordered HS fluid and FCC crystal using the Gibbs-Duhem equation as a constraint. A detailed thermodynamic analysis of the hard-sphere fluid has provided a rigorous verification of the approach, which can be extended to more complex systems.
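The Widom test-particle insertion procedure, which the kMC representation of the chemical potential is designed to avoid, reduces for hard spheres to counting overlap-free trial insertions: μ_ex = -kT ln⟨exp(-ΔU/kT)⟩ becomes -kT ln(fraction of non-overlapping trials). A minimal sketch (toy configuration, kT = 1, hypothetical helper name) is:

```python
import numpy as np

def widom_hs(positions, box, sigma=1.0, n_trials=5000, seed=0):
    """Widom test-particle insertion for a hard-sphere configuration:
    for hard spheres exp(-dU/kT) is 1 if the trial particle overlaps
    nothing and 0 otherwise, so mu_ex/kT = -ln(accepted fraction)."""
    rng = np.random.default_rng(seed)
    ok = 0
    for _ in range(n_trials):
        trial = rng.uniform(0.0, box, size=3)     # random insertion point
        d = positions - trial
        d -= box * np.round(d / box)              # minimum-image convention
        if np.all(np.sum(d * d, axis=1) >= sigma ** 2):
            ok += 1                               # no overlap: counted
    return -np.log(ok / n_trials) if ok else np.inf
```

As the abstract notes, this estimator degrades at high density (insertions almost never succeed), which is exactly where the kMC route retains accuracy.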
Ozoh, Obianuju B.; Okubadejo, Njideka U.; Akanbi, Maxwell O.; Dania, Michelle G.
2013-01-01
Background: The burden of obstructive sleep apnea among commercial drivers in Nigeria is not known. Aim: To assess the prevalence of high risk of obstructive sleep apnea (OSA) and excessive daytime sleepiness (EDS) among intra-city commercial drivers. Setting and Design: A descriptive cross-sectional study in three major motor parks in Lagos metropolis. Materials and Methods: Demographic, anthropometric and historical data were obtained. The risk of OSA and EDS was assessed using the STOP BANG questionnaire and the Epworth Sleepiness Scale, respectively. Statistical Analysis: The relationship between the OSA risk, EDS risk and past road traffic accident (RTA) was explored using the Pearson's chi square. Independent determinants of OSA risk, EDS risk and past RTA, respectively, were assessed by multiple logistic regression models. Results: Five hundred male commercial drivers (mean age (years) ±SD = 42.36 ± 11.17 and mean BMI (kg/m2) ±SD = 25.68 ± 3.79) were recruited. OSA risk was high in 244 (48.8%) drivers and 72 (14.4%) had EDS. There was a positive relationship between OSA risk and the risk of EDS (Pearson's X2 = 28.2, P < 0.001). Sixty-one (12.2%) drivers had a past history of RTA but there was no significant relationship between a past RTA and either OSA risk (X2 = 2.05, P = 0.15) or EDS risk (X2 = 2.7, P = 0.1), respectively. Abdominal adiposity, regular alcohol use and EDS were independent determinants of OSA risk while the use of cannabis and OSA risk were independent determinants of EDS. No independent risk factor for past RTA was identified. Conclusion: A significant proportion of commercial drivers in Lagos metropolis are at high risk of OSA and EDS. PMID:24249946
Orish, Verner N; Onyeabor, Onyekachi S; Boampong, Johnson N; Aforakwah, Richmond; Nwaefuna, Ekene; Iriemenam, Nnaemeka C
2012-09-01
The problem of malaria in adolescence has been overshadowed by the immense burden of malaria in children, especially those under 5 years of age. A substantial amount of work on malaria in pregnancy in endemic regions has not properly considered adolescents. The present study therefore aimed at evaluating the prevalence of Plasmodium falciparum infection and anaemia in adolescent pregnant girls in the Sekondi-Takoradi metropolis, Ghana. The study was carried out at four hospitals in the Sekondi-Takoradi metropolis of the western region of Ghana from January 2010 to October 2010. Structured questionnaires were administered to the consenting pregnant women during their antenatal care visits. Information on education, age, gravidae, occupation and socio-demographic characteristics was recorded. Venous blood samples were screened for malaria using a RAPID response antibody kit and Giemsa staining, while haemoglobin estimations were done by the cyanmethemoglobin method. The results revealed that adolescent pregnant girls were more likely to have malaria infection than the adult pregnant women (34.6% versus 21.3%, adjusted OR 1.65, 95% CI, 1.03-2.65, P=0.039). In addition, adolescent pregnant girls had higher odds of anaemia than their adult counterparts (43.9% versus 33.2%; adjusted OR 1.63, 95% CI, 1.01-2.62, P=0.046). Taken together, these data suggest that adolescent pregnant girls were more likely to have malaria and anaemia than their adult pregnant counterparts. Results from this study show that proactive adolescent-friendly policies and control programmes for malaria and anaemia are needed in this region in order to protect this vulnerable group of pregnant women. Copyright © 2012 Elsevier B.V. All rights reserved.
Drivers and Pattern of Social Vulnerability to Flood in Metropolitan Lagos, Nigeria
NASA Astrophysics Data System (ADS)
Fasona, M.
2016-12-01
Lagos is Africa's second largest city and a city-state in southwest Nigeria. Population and economic activities in the city are concentrated in the greater Lagos metropolitan area, a group of barrier islands covering less than a thousand square kilometers. Several physical factors and critical human-environmental conditions contribute to high flood vulnerability across the city. Flood impacts are unevenly distributed, and the poor tend to suffer more due to higher risk of exposure and poor adaptive capacity. In this study we present the pattern of social vulnerability to flooding across the Lagos metropolis and argue that this pattern substantially reflects the pattern and severity of flooding impact on people across the metropolis. Twenty-nine social indicators and experiences, including poverty profile, housing conditions, education, population and demography, social network, and communication, among others, were considered. The data were collated through field survey and subjected to principal component analysis. The results were processed into raster surfaces using GIS for social vulnerability characterization at neighborhood levels. The results suggest that social status indicators, neighborhood standing and social network indicators, indicators of emergency response and security, and neighborhood conditions, in that order, are the most important determinants of social vulnerability. Six of the 16 LGAs in metropolitan Lagos have high social vulnerability. Neighborhoods that combine poor social status indicators with poor neighborhood standing and social networks are found to have high social vulnerability, whereas other poor neighborhoods with strong social networks performed better. We conclude that improved living conditions and stronger social networks and communication in poor urban neighborhoods are important to reducing social vulnerability to flooding in the metropolis.
Does standard Monte Carlo give justice to instantons?
NASA Astrophysics Data System (ADS)
Fucito, F.; Solomon, S.
1984-01-01
The results of the standard local Monte Carlo are changed by offering instantons as candidates in the Metropolis procedure. We also define an O(3) topological charge with no contribution from planar dislocations. The renormalization-group (RG) behavior is still not recovered.
Who Cares? Pre and Post Abortion Experiences among Young Females in Cape Coast Metropolis, Ghana.
Esia-Donkoh, Kobina; Darteh, Eugene K M; Blemano, Harriet; Asare, Hagar
2015-06-01
Issues of abortion are critical in Ghana, largely due to their consequences for sexual and reproductive health. The negative perception society attaches to abortion makes it difficult for young females to access services and share their experiences. This paper examines the pre- and post-abortion experiences of young females, a subject scarcely researched in the country. Twenty-one clients of the Planned Parenthood Association of Ghana (PPAG) clinic at Cape Coast were interviewed. Guided by the biopsychosocial model, the study revealed that fear of societal stigma, shame, and rejection by partners, as well as self-imposed stigma, constituted some of the pre- and post-abortion experiences of the respondents. Other experiences reported were bleeding, severe abdominal pain and psychological pain. The Ghana Health Service (GHS) and other service providers should partner with the PPAG clinic to integrate psychosocial treatment into its abortion services while intensifying behaviour change communication and community-based stigma-reduction education in the Metropolis.
Comparing Three Estimation Methods for the Three-Parameter Logistic IRT Model
ERIC Educational Resources Information Center
Lamsal, Sunil
2015-01-01
Different estimation procedures have been developed for the unidimensional three-parameter item response theory (IRT) model. These techniques include the marginal maximum likelihood estimation, the fully Bayesian estimation using Markov chain Monte Carlo simulation techniques, and the Metropolis-Hastings Robbin-Monro estimation. With each…
DOT National Transportation Integrated Search
2016-10-01
This report summarizes the presentations, key themes, and recommendations identified at a Regional Models of Cooperation peer exchange on October 24, 2016 in Salt Lake City, Utah. The Utah Transit Authority hosted peers from the Los Angeles Metropoli...
Small Au clusters on a defective MgO(1 0 0) surface
NASA Astrophysics Data System (ADS)
Barcaro, Giovanni; Fortunelli, Alessandro
2008-05-01
The lowest energy structures of small Au clusters on a defective MgO(1 0 0) surface are searched for via the basin-hopping (BH) approach, in which a trial configuration is accepted if exp[-ΔE/kT] > rndm, where rndm is a random number (Metropolis criterion); otherwise the old configuration is kept, and the process is iterated. For each size we performed 3-5 BH runs, each one composed of 20-25 Monte Carlo steps, using a value of 0.5 eV as kT in the Metropolis criterion. Previous experience [13-15] shows that this is sufficient to single out the global minimum for adsorbed clusters of this size, and that the BH approach is more efficient as a global optimization algorithm than other techniques such as simulated annealing [18]. The MgO support was described via an (Mg12O12) cluster embedded in an array of ±2.0 a.u. point charges, with repulsive pseudopotentials on the positive charges in direct contact with the cluster (see Ref. [15] for more details on the method). The atoms of the oxide cluster and the point charges were located at the lattice positions of the MgO rock-salt bulk structure using the experimental lattice constant of 4.208 Å.
Several energetic quantities are analyzed: (i) the adhesion energy of the metal cluster (E_adh), evaluated by subtracting the energy of the oxide surface and of the metal cluster, both frozen in their interacting configuration, from the value of the total energy of the system, and by taking the absolute value; (ii) the binding energy of the metal cluster (E_bnd), evaluated by subtracting the energy of the isolated metal atoms from the total energy of the metal cluster in its interacting configuration, and by taking the absolute value; (iii) the metal cluster distortion energy (E_dist), which corresponds to the difference between the energy of the metal cluster in the configuration interacting with the surface minus the energy of the cluster in its lowest-energy gas-phase configuration (a positive quantity); (iv) the oxide distortion energy (ΔE_ox), evaluated by subtracting the energy of the relaxed isolated defected oxide from the energy of the isolated defected oxide in the interacting configuration; and (v) the total binding energy (E_tot), which is the sum of the binding energy of the metal cluster, the adhesion energy and the oxide distortion energy (E_tot = E_bnd + E_adh − ΔE_ox). Note that the total binding energy of gas-phase clusters in their global minima can be obtained by summing E_bnd + E_dist.
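The basin-hopping loop with the Metropolis criterion described above can be sketched on a toy one-dimensional energy surface; the local minimizer and energies below are illustrative stand-ins, not the paper's DFT calculations.

```python
import math
import random

def basin_hop(energy, minimize, x0, n_steps=25, kT=0.5, hop=1.0, seed=1):
    """Sketch of a basin-hopping search: perturb the configuration,
    locally minimize it, then apply the Metropolis criterion
    exp[-(E_new - E_old)/kT] > rndm to accept or reject the hop."""
    random.seed(seed)
    x = minimize(x0)                 # relax the starting configuration
    e = energy(x)
    best_x, best_e = x, e
    for _ in range(n_steps):
        x_new = minimize(x + random.uniform(-hop, hop))  # hop + local relax
        e_new = energy(x_new)
        if math.exp(-(e_new - e) / kT) > random.random():  # Metropolis rule
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e
```

With kT = 0.5 (the abstract's value, in eV), uphill hops are accepted with Boltzmann probability, which lets the walk escape local funnels while tracking the lowest minimum seen.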
BOOK REVIEW: HANDBOOK OF URBAN HEALTH: POPULATIONS, METHODS AND PRACTICE
In Clifford D. Simak's 1952 science fiction classic, City, the metropolis is dead by the end of the 20th century. Cheap atomic power and ubiquitous private helicopters have made concentrated human existence a quaint memory. Simak could not have been more wrong, of course. About h...
ERIC Educational Resources Information Center
Fogg, Piper
2007-01-01
When the nearest metropolis is hundreds of miles away, cultural enrichment is not always easy to come by. Arts programs have evolved to reflect the needs of such regions, providing a rich diet for culture-starved residents. Some colleges have created choirs or theater groups that welcome local participation, while others have developed elaborate…
Fission Dynamics with Microscopic Level Densities
Ward, D.; Carlsson, B. G.; Dossing, Th.; ...
2017-01-01
We present a consistent framework for treating the energy and angular-momentum dependence of the shape evolution in nuclear fission. It combines microscopically calculated level densities with the Metropolis-walk method, has no new parameters, and can elucidate the energy-dependent influence of pairing and shell effects on the dynamics of warm nuclei.
ERIC Educational Resources Information Center
Pearman, Francis A., III; Swain, Walker A.
2017-01-01
Racial and socioeconomic stratification have long governed patterns of residential sorting in the American metropolis. However, recent expansions of school choice policies that allow parents to select schools outside their neighborhood raise questions as to whether this weakening of the neighborhood-school connection might influence the…
Uncertainty in dual permeability model parameters for structured soils.
Arora, B; Mohanty, B P; McGuire, J T
2012-01-01
Successful application of dual permeability models (DPM) to predict contaminant transport is contingent upon measured or inversely estimated soil hydraulic and solute transport parameters. The difficulty of uniquely identifying parameters for the additional macropore and matrix-macropore interface regions, and the question of what experimental data DPM requires, have not been resolved to date. Therefore, this study quantifies uncertainty in dual permeability model parameters of experimental soil columns with different macropore distributions (single macropore, and low- and high-density multiple macropores). Uncertainty evaluation is conducted using adaptive Markov chain Monte Carlo (AMCMC) and conventional Metropolis-Hastings (MH) algorithms while assuming 10 out of 17 parameters to be uncertain or random. Results indicate that AMCMC resolves parameter correlations and exhibits fast convergence for all DPM parameters while MH displays large posterior correlations for various parameters. This study demonstrates that the choice of parameter sampling algorithms is paramount in obtaining unique DPM parameters when information on covariance structure is lacking, or else additional information on parameter correlations must be supplied to resolve the problem of equifinality of DPM parameters. This study also highlights the placement and significance of the matrix-macropore interface in flow experiments on soil columns with different macropore densities. Histograms for certain soil hydraulic parameters display tri-modal characteristics, implying that macropores are drained first, followed by the interface region and then by pores of the matrix domain in drainage experiments. Results indicate that hydraulic properties and behavior of the matrix-macropore interface are not only a function of the saturated hydraulic conductivity of the macropore-matrix interface (Ksa) and macropore tortuosity (lf) but also of other parameters of the matrix and macropore domains.
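The difference between plain MH and an adaptive scheme comes down to learning the proposal covariance from the chain history, which is what lets AMCMC resolve parameter correlations. A sketch in the spirit of adaptive Metropolis (illustrative only, not the authors' AMCMC implementation) is:

```python
import numpy as np

def adaptive_metropolis(logpost, theta0, n_steps=5000, adapt_start=500, seed=0):
    """Random-walk Metropolis whose proposal covariance is periodically
    re-estimated from the chain history (adaptive Metropolis sketch)."""
    rng = np.random.default_rng(seed)
    d = len(theta0)
    theta = np.asarray(theta0, float)
    lp = logpost(theta)
    cov = np.eye(d) * 0.1            # initial isotropic proposal
    chain = [theta.copy()]
    for i in range(n_steps):
        if i > adapt_start and i % 100 == 0:
            # adapt: scaled empirical covariance of the chain so far,
            # with a small jitter to keep the matrix positive definite
            cov = np.cov(np.array(chain).T) * (2.38 ** 2 / d) + 1e-8 * np.eye(d)
        prop = rng.multivariate_normal(theta, cov)
        lp_prop = logpost(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)
```

On a strongly correlated posterior the adapted proposal aligns with the correlation structure, which is why the adaptive chain mixes where fixed-step MH leaves large posterior correlations.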
Uncertainty in dual permeability model parameters for structured soils
NASA Astrophysics Data System (ADS)
Arora, B.; Mohanty, B. P.; McGuire, J. T.
2012-01-01
Successful application of dual permeability models (DPM) to predict contaminant transport is contingent upon measured or inversely estimated soil hydraulic and solute transport parameters. The difficulty of uniquely identifying parameters for the additional macropore and matrix-macropore interface regions, and the question of what experimental data DPM requires, have not been resolved to date. Therefore, this study quantifies uncertainty in dual permeability model parameters of experimental soil columns with different macropore distributions (single macropore, and low- and high-density multiple macropores). Uncertainty evaluation is conducted using adaptive Markov chain Monte Carlo (AMCMC) and conventional Metropolis-Hastings (MH) algorithms while assuming 10 out of 17 parameters to be uncertain or random. Results indicate that AMCMC resolves parameter correlations and exhibits fast convergence for all DPM parameters while MH displays large posterior correlations for various parameters. This study demonstrates that the choice of parameter sampling algorithms is paramount in obtaining unique DPM parameters when information on covariance structure is lacking, or else additional information on parameter correlations must be supplied to resolve the problem of equifinality of DPM parameters. This study also highlights the placement and significance of the matrix-macropore interface in flow experiments on soil columns with different macropore densities. Histograms for certain soil hydraulic parameters display tri-modal characteristics, implying that macropores are drained first, followed by the interface region and then by pores of the matrix domain in drainage experiments. Results indicate that hydraulic properties and behavior of the matrix-macropore interface are not only a function of the saturated hydraulic conductivity of the macropore-matrix interface (Ksa) and macropore tortuosity (lf) but also of other parameters of the matrix and macropore domains.
MCMC-ODPR: primer design optimization using Markov Chain Monte Carlo sampling.
Kitchen, James L; Moore, Jonathan D; Palmer, Sarah A; Allaby, Robin G
2012-11-05
Next-generation sequencing technologies often require numerous primer designs with good target coverage, which can be financially costly. We aimed to develop a system that implements primer reuse to design degenerate primers around SNPs, thus finding the fewest necessary primers at the lowest cost whilst maintaining acceptable coverage. We have implemented Metropolis-Hastings Markov Chain Monte Carlo for optimizing primer reuse. We call it the Markov Chain Monte Carlo Optimized Degenerate Primer Reuse (MCMC-ODPR) algorithm. After repeating the program 1020 times to assess the variance, an average of 17.14% fewer primers were found to be necessary using MCMC-ODPR for an equivalent coverage without implementing primer reuse. The algorithm was able to reuse primers up to five times. We compared MCMC-ODPR with single sequence primer design programs Primer3 and Primer-BLAST and achieved a lower primer cost per amplicon base covered of 0.21, 0.19 and 0.18 primer nucleotides on three separate gene sequences, respectively. With multiple sequences, MCMC-ODPR achieved a lower cost per base covered of 0.19 than programs BatchPrimer3 and PAMPS, which achieved 0.25 and 0.64 primer nucleotides, respectively. MCMC-ODPR is a useful tool for designing primers at various melting temperatures at good target coverage. By combining degeneracy with optimal primer reuse the user may increase coverage of sequences amplified by the designed primers at significantly lower costs. Our analyses showed that overall MCMC-ODPR outperformed the other primer-design programs in our study in terms of cost per covered base.
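The idea of Metropolis sampling over primer subsets can be illustrated on a toy covering problem; the names, costs and coverage penalty below are hypothetical and far simpler than MCMC-ODPR's actual objective.

```python
import math
import random

def mcmc_cover(targets, primers, covers, cost, n_steps=4000, temp=1.0, seed=0):
    """Toy analogue of Metropolis-optimized primer selection: sample
    primer subsets, preferring low total cost with all targets covered.
    `covers[p]` is the set of targets primer p amplifies."""
    random.seed(seed)

    def objective(sel):
        covered = set().union(*[covers[p] for p in sel]) if sel else set()
        penalty = 100 * len(targets - covered)  # punish uncovered targets
        return sum(cost[p] for p in sel) + penalty

    sel = set(primers)                          # start with all primers
    e = objective(sel)
    best, best_e = set(sel), e
    for _ in range(n_steps):
        p = random.choice(primers)
        cand = sel ^ {p}                        # toggle one primer in/out
        e_new = objective(cand)
        # Metropolis rule on the cost landscape
        if e_new <= e or random.random() < math.exp((e - e_new) / temp):
            sel, e = cand, e_new
            if e < best_e:
                best, best_e = set(sel), e
    return best, best_e
```

The real algorithm additionally handles degeneracy, melting temperatures and primer reuse counts, but the accept/reject skeleton is the same.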
MCMC-ODPR: Primer design optimization using Markov Chain Monte Carlo sampling
2012-01-01
Background Next-generation sequencing technologies often require numerous primer designs with good target coverage, which can be financially costly. We aimed to develop a system that implements primer reuse to design degenerate primers around SNPs, thus finding the fewest necessary primers at the lowest cost whilst maintaining acceptable coverage. We have implemented Metropolis-Hastings Markov Chain Monte Carlo for optimizing primer reuse. We call it the Markov Chain Monte Carlo Optimized Degenerate Primer Reuse (MCMC-ODPR) algorithm. Results After repeating the program 1020 times to assess the variance, an average of 17.14% fewer primers were found to be necessary using MCMC-ODPR for an equivalent coverage without implementing primer reuse. The algorithm was able to reuse primers up to five times. We compared MCMC-ODPR with single sequence primer design programs Primer3 and Primer-BLAST and achieved a lower primer cost per amplicon base covered of 0.21, 0.19 and 0.18 primer nucleotides on three separate gene sequences, respectively. With multiple sequences, MCMC-ODPR achieved a lower cost per base covered of 0.19 than programs BatchPrimer3 and PAMPS, which achieved 0.25 and 0.64 primer nucleotides, respectively. Conclusions MCMC-ODPR is a useful tool for designing primers at various melting temperatures at good target coverage. By combining degeneracy with optimal primer reuse the user may increase coverage of sequences amplified by the designed primers at significantly lower costs. Our analyses showed that overall MCMC-ODPR outperformed the other primer-design programs in our study in terms of cost per covered base. PMID:23126469
Monte Carlo explicitly correlated second-order many-body perturbation theory
NASA Astrophysics Data System (ADS)
Johnson, Cole M.; Doran, Alexander E.; Zhang, Jinmei; Valeev, Edward F.; Hirata, So
2016-10-01
A stochastic algorithm is proposed and implemented that computes a basis-set-incompleteness (F12) correction to an ab initio second-order many-body perturbation energy as a short sum of 6- to 15-dimensional integrals of Gaussian-type orbitals, an explicit function of the electron-electron distance (geminal), and its associated excitation amplitudes held fixed at the values suggested by Ten-no. The integrals are directly evaluated (without a resolution-of-the-identity approximation or an auxiliary basis set) by the Metropolis Monte Carlo method. Applications of this method to 17 molecular correlation energies and 12 gas-phase reaction energies reveal that both the nonvariational and variational formulas for the correction give reliable correlation energies (98% or higher) and reaction energies (within 2 kJ mol⁻¹ with a smaller statistical uncertainty) near the complete-basis-set limits by using just the aug-cc-pVDZ basis set. The nonvariational formula is found to be 2-10 times less expensive to evaluate than the variational one, though the latter yields energies that are bounded from below and is, therefore, slightly but systematically more accurate for energy differences. Being capable of using virtually any geminal form, the method confirms the best overall performance of the Slater-type geminal among 6 forms satisfying the same cusp conditions. Not having to precompute lower-dimensional integrals analytically, to store them on disk, or to transform them in a nonscalable dense-matrix-multiplication algorithm, the method scales favorably with both system size and computer size; the cost increases only as O(n⁴) with the number of orbitals (n), and its parallel efficiency reaches 99.9% of the ideal case on going from 16 to 4096 computer processors.
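The core device, estimating an integral ratio by Metropolis sampling of a weight function, can be shown in miniature; the Gaussian weight and toy integrand below stand in for the far more elaborate F12 integrands and geminals of the paper.

```python
import numpy as np

def mc_ratio_estimate(f, log_w, dim, n_samples=20000, step=0.5, seed=0):
    """Estimate the ratio  ∫ f(r) w(r) dr / ∫ w(r) dr  by Metropolis
    sampling configurations r from the weight w and averaging f."""
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    lw = log_w(x)
    vals = []
    for _ in range(n_samples):
        prop = x + step * rng.normal(size=dim)   # symmetric random walk
        lw_prop = log_w(prop)
        if np.log(rng.uniform()) < lw_prop - lw:  # Metropolis accept/reject
            x, lw = prop, lw_prop
        vals.append(f(x))                         # accumulate the integrand
    return float(np.mean(vals))

# Toy check: with weight w = exp(-|r|^2), each coordinate is Gaussian
# with variance 1/2, so the mean of f(r) = |r|^2 in 6 dimensions is 3.
est = mc_ratio_estimate(lambda r: float(r @ r), lambda r: -float(r @ r), dim=6)
```

The payoff, as the abstract notes, is that no lower-dimensional integrals need to be precomputed, stored, or transformed; only the integrand itself is evaluated at sampled points.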
NASA Astrophysics Data System (ADS)
Spezi, Emiliano
2010-08-01
Sixty years after the paper 'The Monte Carlo method' by N Metropolis and S Ulam in The Journal of the American Statistical Association (Metropolis and Ulam 1949), use of the most accurate algorithm for computer modelling of radiotherapy linear accelerators, radiation detectors and three dimensional patient dose was discussed in Wales (UK). The Second European Workshop on Monte Carlo Treatment Planning (MCTP2009) was held at the National Museum of Wales in Cardiff. The event, organized by Velindre NHS Trust, Cardiff University and Cancer Research Wales, lasted two and a half days, during which leading experts and contributing authors presented and discussed the latest advances in the field of Monte Carlo treatment planning (MCTP). MCTP2009 was highly successful, judging from the number of participants which was in excess of 140. Of the attendees, 24% came from the UK, 46% from the rest of Europe, 12% from North America and 18% from the rest of the World. Fifty-three oral presentations and 24 posters were delivered in a total of 12 scientific sessions. MCTP2009 follows the success of previous similar initiatives (Verhaegen and Seuntjens 2005, Reynaert 2007, Verhaegen and Seuntjens 2008), and confirms the high level of interest in Monte Carlo technology for radiotherapy treatment planning. The 13 articles selected for this special section (following Physics in Medicine and Biology's usual rigorous peer-review procedure) give a good picture of the high quality of the work presented at MCTP2009. The book of abstracts can be downloaded from http://www.mctp2009.org. I wish to thank the IOP Medical Physics and Computational Physics Groups for their financial support, Elekta Ltd and Dosisoft for sponsoring MCTP2009, and leading manufacturers such as BrainLab, Nucletron and Varian for showcasing their latest MC-based radiotherapy solutions during a dedicated technical session. 
I am also very grateful to the eight invited speakers who kindly agreed to give keynote presentations, which contributed significantly to raising the quality of the event and capturing the interest of the medical physics community. I also wish to thank all those who contributed to the success of MCTP2009: the members of the local Organizing Committee and the Workshop Management Team who managed the event very efficiently, the members of the European Working Group in Monte Carlo Treatment Planning (EWG-MCTP) who acted as Guest Associate Editors for the MCTP2009 abstracts reviewing process, and all the authors who generated new, high quality work. Finally, I hope that you find the contents of this special section enjoyable and informative.
Emiliano Spezi, Chairman of MCTP2009 Organizing Committee and Guest Editor
References
Metropolis N and Ulam S 1949 The Monte Carlo method J. Amer. Stat. Assoc. 44 335-41
Reynaert N 2007 First European Workshop on Monte Carlo Treatment Planning J. Phys.: Conf. Ser. 74 011001
Verhaegen F and Seuntjens J 2005 International Workshop on Current Topics in Monte Carlo Treatment Planning Phys. Med. Biol. 50
Verhaegen F and Seuntjens J 2008 International Workshop on Monte Carlo Techniques in Radiotherapy Delivery and Verification J. Phys.: Conf. Ser. 102 011001
The design and performance of the ZEUS Central Tracking Detector z-by-timing system
NASA Astrophysics Data System (ADS)
Bailey, D. S.; Foster, B.; Heath, G. P.; Morgado, C. J. S.; Harnew, N.; Khatri, T.; Lancaster, M.; McArthur, I. C.; McFall, J. D.; Nash, J.; Shield, P. D.; Topp-Jorgensen, S.; Wilson, F. F.; Carter, R. C.; Jeffs, M. D.; Milborrow, R.; Morrissey, M. C.; Phillips, D. A.; Quinton, S. P. H.; Westlake, G.; White, D. J.; Lane, J. B.; Nixon, G.; Postranecky, M.
1997-02-01
The ZEUS Central Tracking Detector utilizes a time difference measurement to provide a fast determination of the z coordinate of each hit. The z-by-timing measurement is achieved by using a Time-to-Amplitude Converter which has an intrinsic timing resolution of 36 ps, has pipelined readout, and has a multihit capability of 48 ns. In order to maintain the required sub-nanosecond timing accuracy, the technique incorporates an automated self-calibration system. The readout of the z-by-timing data utilizes a fully customized timing control system which runs synchronously with the HERA beam-crossing clock, and a data acquisition system implemented on a network of Transputers. Three-dimensional space-points provided by the z-by-timing system are used as input to all three levels of the ZEUS trigger and for offline track reconstruction. The average z resolution is determined to be 4.4 cm for multi-track events from positron-proton collisions in the ZEUS detector.
NASA Astrophysics Data System (ADS)
Li, Ling; Ortiz, Christine
2014-05-01
Hierarchical composite materials design in biological exoskeletons achieves penetration resistance through a variety of energy-dissipating mechanisms while simultaneously balancing the need for damage localization to avoid compromising the mechanical integrity of the entire structure and to maintain multi-hit capability. Here, we show that the shell of the bivalve Placuna placenta (~99 wt% calcite), which possesses the unique optical property of ~80% total transmission of visible light, simultaneously achieves penetration resistance and deformation localization via increasing energy dissipation density (0.290 ± 0.072 nJ μm-3) by approximately an order of magnitude relative to single-crystal geological calcite (0.034 ± 0.013 nJ μm-3). P. placenta, which is composed of a layered assembly of elongated diamond-shaped calcite crystals, undergoes pervasive nanoscale deformation twinning (width ~50 nm) surrounding the penetration zone, which catalyses a series of additional inelastic energy dissipating mechanisms such as interfacial and intracrystalline nanocracking, viscoplastic stretching of interfacial organic material, and nanograin formation and reorientation.
Virus Neutralisation: New Insights from Kinetic Neutralisation Curves
Magnus, Carsten
2013-01-01
Antibodies binding to the surface of virions can lead to virus neutralisation. Different theories have been proposed to determine the number of antibodies that must bind to a virion for neutralisation. Early models are based on chemical binding kinetics. Applying these models leads to very low estimates of the number of antibodies needed for neutralisation. In contrast, according to the more conceptual approach of stoichiometries in virology, a much higher number of antibodies is required for virus neutralisation. Here, we combine chemical binding kinetics with (virological) stoichiometries to better explain virus neutralisation by antibody binding. This framework is in agreement with published data on the neutralisation of the human immunodeficiency virus. Knowing antibody reaction constants, our model allows us to estimate stoichiometrical parameters from kinetic neutralisation curves. In addition, we can identify important parameters that will make further analysis of kinetic neutralisation curves more valuable in the context of estimating stoichiometries. Our model gives a more subtle explanation of kinetic neutralisation curves in terms of single-hit and multi-hit kinetics. PMID:23468602
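The contrast between single-hit and multi-hit kinetics described in this abstract can be sketched with a simple binomial threshold model; a minimal sketch, assuming independent antibody binding to a fixed number of sites (the spike count, binding probability and threshold below are illustrative choices, not the paper's fitted values):

```python
from math import comb

def neutralisation_prob(n_sites, p_bound, threshold):
    """Multi-hit model: a virion is neutralised once at least `threshold` of its
    n_sites binding sites carry an antibody, each site bound independently with
    probability p_bound. threshold = 1 recovers single-hit kinetics."""
    return sum(comb(n_sites, k) * p_bound ** k * (1 - p_bound) ** (n_sites - k)
               for k in range(threshold, n_sites + 1))

single_hit = neutralisation_prob(14, 0.3, 1)   # one bound antibody suffices
multi_hit = neutralisation_prob(14, 0.3, 7)    # half the sites must be bound
```

At the same binding probability, the multi-hit threshold sharply reduces the predicted fraction of neutralised virions, which is the qualitative behaviour that kinetic neutralisation curves can distinguish.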
Manifestations of Dyslexia and Dyscalculia
ERIC Educational Resources Information Center
Osisanya, Ayo; Lazarus, Kelechi; Adewunmi, Abiodun
2013-01-01
This study examined the prevalence of dyslexia and dyscalculia among persons with academic deficits in English Language and Mathematics in public primary schools in Ibadan metropolis. A correlational survey study, sampling 477 pupils who were between the ages of eight and 12 years, and in 4th and 5th grades with the use of four research…
Engaging Suburban Students in Dialogues on Diversity in a Segregated Metropolitan Area
ERIC Educational Resources Information Center
Checkoway, Barry; Lipa, Todd; Vivyan, Erika; Zurvalec, Sue
2017-01-01
What are some strategies for engaging suburban students in dialogues on diversity in the new American metropolis? This question is important, especially at a time when some suburbs are changing from "segregated" to "segregated and diverse," and scholarship is needed to guide their discussion. This article analyzes efforts by a…
Urbanization and Spatial Organization: Hospital and Orphanage Location in Chicago, 1848-1916
ERIC Educational Resources Information Center
Britton, Marcus; Ocasio, William
2007-01-01
What factors affect where organizations locate facilities in local communities? This paper examines how urban development influenced the neighborhood location of two very different types of facilities, general hospitals and orphanages, over the 70-year period during which Chicago emerged as an urban metropolis. Our results suggest that the human…
Mapping Language Ideologies in Multi-Ethnic Urban Europe: The Case of Parisian French
ERIC Educational Resources Information Center
Stewart, Christopher Michael
2012-01-01
Although the modern multicultural European metropolis has brought previously disparate groups into close contact, little research has focused on the effect of these shifting demographic patterns on language attitudes and ideologies. This is probably due to the sensitive nature of issues relating to immigration which may evoke contexts of…
Community of Inquiry Method and Language Skills Acquisition: Empirical Evidence
ERIC Educational Resources Information Center
Preece, Abdul Shakhour Duncan
2015-01-01
The study investigates the effectiveness of community of inquiry method in preparing students to develop listening and speaking skills in a sample of junior secondary school students in Borno state, Nigeria. A sample of 100 students in standard classes was drawn in one secondary school in Maiduguri metropolis through stratified random sampling…
ERIC Educational Resources Information Center
Fisher-Maltese, Carley; Fisher, Dana R.; Ray, Rashawn
2018-01-01
This article explores how school gardens provide learning opportunities for school-aged children while concurrently helping cities achieve sustainability. The authors analyse this process in Washington, DC, a particularly innovative metropolis in the United States. This national capital city boasts two of the most progressive examples of…
Battle in Los Angeles: Conflict Escalates as Charter Schools Thrive
ERIC Educational Resources Information Center
Whitmire, Richard
2016-01-01
Throughout the 1990s and well into the new millennium, the massive Los Angeles Unified School District barely noticed the many charter schools that were springing up around the metropolis. But Los Angeles parents certainly took notice, and started enrolling their children. In 2008, five charter-management organizations announced plans to…
Challenges of Attending E-Learning Studies in Nigeria
ERIC Educational Resources Information Center
Bugi, Stephan Z.
2012-01-01
This study set out to identify the challenges the e-learner faces in the Nigerian environment. A survey research design was used to obtain the opinions of 200 randomly selected e-learners in Kaduna metropolis. Their responses revealed that the most prominent challenges they face are: inadequate power supply, Internet connectivity problems, efficacy…
From Mountain to Metropolis: Appalachian Migrants in American Cities.
ERIC Educational Resources Information Center
Borman, Kathryn M., Ed.; Obermiller, Phillip J., Ed.
This book consists of 14 essays that focus on the condition of urban Appalachians (former migrants to cities from Appalachia and their descendants). Chapters address issues of health, environment, education, and cultural identity in an urban Appalachian context, and are meant to be a resource for educators and health and human service…
No Safe Place: Environmental Hazards & Injustice along Mexico's Northern Border
ERIC Educational Resources Information Center
Grineski, Sara E.; Collins, Timothy W.; Aguilar, Maria de Lourdes Romo; Aldouri, Raed
2010-01-01
This article examines spatial relationships between environmental hazards (i.e., pork feed lots, brick kilns, final assembly plants and a rail line) and markers of social marginality in Ciudad Juarez, Mexico. Juarez represents an opportunity for researchers to test for patterns of injustice in a recently urbanizing metropolis of the Global South.…
ERIC Educational Resources Information Center
Adam, Abdul-Kahar
2015-01-01
This project was carried out by employing an empirical method, using questionnaire design and administration to tap the perceptions and knowledge of the target elements of this study. The research frame comprised Ghana Education Service office workers within the Accra Metropolis, including higher education institutions. A qualitative data…
Meng, Xia; Fu, Qingyan; Ma, Zongwei; Chen, Li; Zou, Bin; Zhang, Yan; Xue, Wenbo; Wang, Jinnan; Wang, Dongfang; Kan, Haidong; Liu, Yang
2016-01-01
Development of an exposure assessment model is a key component of epidemiological studies concerning air pollution, but the evidence from China is limited. Therefore, a linear mixed effects (LME) model was established in this study in a Chinese metropolis by incorporating aerosol optical depth (AOD), meteorological information and the land use regression (LUR) model to predict ground PM10 levels at high spatiotemporal resolution. The cross validation (CV) R(2) and the RMSE of the LME model were 0.87 and 19.2 μg/m(3), respectively. The relative prediction errors (RPE) of daily and annual mean predicted PM10 concentrations were 19.1% and 7.5%, respectively. This study was the first attempt in China to estimate both short-term and long-term variation of PM10 levels with high spatial resolution in a Chinese metropolis with the LME model. The results suggested that the LME model could provide exposure assessment for short-term and long-term epidemiological studies in China. Copyright © 2015 Elsevier Ltd. All rights reserved.
Hazardous waste management and weight-based indicators--the case of Haifa Metropolis.
Elimelech, E; Ayalon, O; Flicstein, B
2011-01-30
The quantity control of hazardous waste in Israel relies primarily on the Environmental Services Company (ESC) reports. With limited management tools, the Ministry of Environmental Protection (MoEP) has no applicable methodology to confirm or monitor the actual amounts of hazardous waste produced by various industrial sectors. The main goal of this research was to develop a method for estimating the amounts of hazardous waste produced by various sectors. In order to achieve this goal, sector-specific indicators were tested on three hazardous waste producing sectors in the Haifa Metropolis: petroleum refineries, dry cleaners, and public hospitals. The findings reveal poor practice of hazardous waste management in the dry cleaning sector and in the public hospitals sector. Large discrepancies were found in the dry cleaning sector between the quantities of hazardous waste reported and the corresponding indicator estimates. Furthermore, a lack of documentation on hospitals' pharmaceutical and chemical waste production volume was observed. Only in the case of the petroleum refineries was the reported amount consistent with the estimate. Copyright © 2010 Elsevier B.V. All rights reserved.
Audit of sharp weapon deaths in metropolis of Karachi--an autopsy based study.
Mirza, Farhat Hussain; Hasan, Qudsia; Memon, Akhtar Amin; Adil, Syeda Ezz-e-Rukhshan
2010-01-01
Sharp weapons are one of the most violent and abhorrent means of death. This study assesses the frequency of sharp weapon deaths in Karachi. This was a cross-sectional study of the deaths by sharp weapons autopsied in Karachi during Mar 2008-Feb 2009. This study reports that the frequency of sharp weapon deaths in Karachi is similar to that found in studies conducted in other regions of Pakistan, yet it is very high given that the population of Karachi far exceeds that of any other metropolis in Pakistan. Our study reported that out of 2090 medico-legal deaths in Karachi during the study period, 91 deaths were due to sharp weapons, including 73 (80.2%) males and 18 (19.8%) females. All of the deaths were homicides; none were suicides. Deaths were most frequent in the age group ranging from 20-39 years (59.3%). Sharp weapon deaths continue to account for a considerable number of deaths in Karachi. Such violence reflects the intolerance and frustration of the citizens.
Spectral analysis of finite-time correlation matrices near equilibrium phase transitions
NASA Astrophysics Data System (ADS)
Vinayak; Prosen, T.; Buča, B.; Seligman, T. H.
2014-10-01
We study spectral densities for systems on lattices which, at a phase transition, display power-law spatial correlations. Constructing the spatial correlation matrix, we prove that its eigenvalue density shows a power law that can be derived from the spatial correlations. In practice, time series are short in the sense that they are either not stationary over long time intervals or not available over long time intervals. Moreover, we usually do not have time series available for all variables. We shall make numerical simulations on a two-dimensional Ising model with the usual Metropolis algorithm as time evolution. Using all spins on a grid with periodic boundary conditions we find a power law that is, for large grids, compatible with the analytic result. We still find a power law even if we choose a fairly small subset of grid points at random. The exponents of the power laws will be smaller under such circumstances. For very short time series leading to singular correlation matrices we use a recently developed technique to lift the degeneracy at zero in the spectrum and find a significant signature of critical behavior even in this case, as compared to high-temperature results, which tend to those of random matrix models.
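The pipeline sketched in this abstract — Metropolis time evolution of a 2D Ising model on a periodic grid, followed by the eigenvalue spectrum of the finite-time spatial correlation matrix — can be illustrated as follows; the grid size, coupling and series length are small illustrative choices, not the study's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_sweep(spins, beta, rng):
    """One Metropolis sweep of a 2D Ising model with periodic boundaries (J = 1)."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # Energy change for flipping spin (i, j): dE = 2 * s_ij * sum(neighbours)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nb
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] = -spins[i, j]

L, T_steps = 16, 200
spins = rng.choice([-1, 1], size=(L, L))
beta = 0.4407  # near the critical coupling beta_c = ln(1 + sqrt(2)) / 2

# Record a (short) time series for every spin on the grid
series = np.empty((L * L, T_steps))
for t in range(T_steps):
    metropolis_sweep(spins, beta, rng)
    series[:, t] = spins.ravel()

# Finite-time spatial correlation matrix and its eigenvalue spectrum
X = series - series.mean(axis=1, keepdims=True)
C = X @ X.T / T_steps
eigvals = np.linalg.eigvalsh(C)
```

With T_steps much smaller than the number of spins, C is singular, which is exactly the short-time-series regime the abstract discusses.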
Conformation and Dynamics of a Flexible Sheet in Solvent Media by Monte Carlo Simulations
NASA Astrophysics Data System (ADS)
Pandey, Ras; Anderson, Kelly; Heinz, Hendrik; Farmer, Barry
2005-03-01
Flexibility of the clay sheet is limited even in the exfoliated state in some solvent media. A coarse-grained model is used to investigate the dynamics and conformation of a flexible sheet, modelling such a clay platelet in an effective solvent medium on a cubic lattice of size L^3 with lattice constant a. The undeformed sheet is described by a square lattice of size Ls^2, where each node of the sheet is represented by a unit cube of the cubic lattice and 2a is the minimum distance between nearest-neighbor nodes to incorporate the excluded volume constraints. Additionally, each node interacts with neighboring nodes and solvent (empty) sites within a range ri. Each node executes its stochastic motion with the Metropolis algorithm subject to bond length fluctuation and excluded volume constraints. Mean square displacements of the center node and of the center of mass are investigated as a function of time step for a set of these parameters. The radius of gyration (Rg) is also examined concurrently to understand its relaxation. Multi-scale segmental dynamics of the sheet is studied by identifying the power-law dependence in various time regimes. Relaxation of Rg and its dependence on temperature will also be discussed.
Active heat pulse sensing of 3-D-flow fields in streambeds
NASA Astrophysics Data System (ADS)
Banks, Eddie W.; Shanafield, Margaret A.; Noorduijn, Saskia; McCallum, James; Lewandowski, Jörg; Batelaan, Okke
2018-03-01
Profiles of temperature time series are commonly used to determine hyporheic flow patterns and hydraulic dynamics in the streambed sediments. Although hyporheic flows are 3-D, past research has focused on determining the magnitude of the vertical flow component and how this varies spatially. This study used a portable 56-sensor, 3-D temperature array with three heat pulse sources to measure the flow direction and magnitude up to 200 mm below the water-sediment interface. Short, 1 min heat pulses were injected at one of the three heat sources and the temperature response was monitored over a period of 30 min. Breakthrough curves from each of the sensors were analysed using a heat transport equation. Parameter estimation and uncertainty analysis were undertaken using the differential evolution adaptive Metropolis (DREAM) algorithm, an adaptation of the Markov chain Monte Carlo method, to estimate the flux and its orientation. Measurements were conducted in the field and in a sand tank under an extensive range of controlled hydraulic conditions to validate the method. The use of short-duration heat pulses provided a rapid, accurate assessment technique for determining dynamic and multi-directional flow patterns in the hyporheic zone and is a basis for improved understanding of biogeochemical processes at the water-streambed interface.
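The inversion step can be sketched with a plain random-walk Metropolis sampler standing in for DREAM's differential-evolution proposals (the accept/reject rule is the same); the 1-D advection-dispersion pulse response and all parameter values below are illustrative assumptions, not the study's calibrated heat transport model:

```python
import numpy as np

rng = np.random.default_rng(1)

def pulse_temperature(t, x, v, D, A=0.02):
    """Temperature response at distance x to an instantaneous heat pulse at x = 0
    for a 1-D advection-dispersion model (v: thermal front velocity, m/s;
    D: thermal dispersion, m^2/s). An illustrative stand-in for the full model."""
    return A / np.sqrt(4 * np.pi * D * t) * np.exp(-((x - v * t) ** 2) / (4 * D * t))

# Synthetic breakthrough curve at a sensor 50 mm from the heat source
x = 0.05
t = np.linspace(60.0, 1800.0, 60)               # seconds after the pulse
true_v, true_D, sigma = 1e-4, 5e-6, 0.01
data = pulse_temperature(t, x, true_v, true_D) + rng.normal(0, sigma, t.size)

def log_post(theta):
    v, D = theta
    if v <= 0 or D <= 0:                        # flat prior on positive values
        return -np.inf
    resid = data - pulse_temperature(t, x, v, D)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Random-walk Metropolis chain over (v, D)
theta = np.array([2e-4, 1e-5])
lp = log_post(theta)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(0, [2e-5, 1e-6])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
v_est = float(np.mean([s[0] for s in samples[2500:]]))
```

The posterior spread of the retained samples plays the role of the uncertainty analysis; DREAM's multiple interacting chains mainly improve mixing on such problems.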
Zaikin, Alexey; Míguez, Joaquín
2017-01-01
We compare three state-of-the-art Bayesian inference methods for the estimation of the unknown parameters in a stochastic model of a genetic network. In particular, we introduce a stochastic version of the paradigmatic synthetic multicellular clock model proposed by Ullner et al. (2007). By introducing dynamical noise in the model and assuming that the partial observations of the system are contaminated by additive noise, we enable a principled mechanism to represent experimental uncertainties in the synthesis of the multicellular system and pave the way for the design of probabilistic methods for the estimation of any unknowns in the model. Within this setup, we tackle the Bayesian estimation of a subset of the model parameters. Specifically, we compare three Monte Carlo based numerical methods for the approximation of the posterior probability density function of the unknown parameters given a set of partial and noisy observations of the system. The schemes we assess are the particle Metropolis-Hastings (PMH) algorithm, the nonlinear population Monte Carlo (NPMC) method and the approximate Bayesian computation sequential Monte Carlo (ABC-SMC) scheme. We present an extensive numerical simulation study, which shows that while the three techniques can effectively solve the problem, there are significant differences both in estimation accuracy and computational efficiency. PMID:28797087
Quantum Corral Wave-function Engineering
NASA Astrophysics Data System (ADS)
Correa, Alfredo; Reboredo, Fernando; Balseiro, Carlos
2005-03-01
We present a theoretical method for the design and optimization of quantum corrals[1] with specific electronic properties. Taking advantage of the fact that spins are subject to an RKKY interaction that is directly controlled by the scattering of the quantum corral, we design corral structures that reproduce spin Hamiltonians with coupling constants determined a priori[2]. We solve exactly the two-dimensional scattering problem for each corral configuration within the s-wave approximation[3] and subsequently the geometry of the quantum corral is optimized by means of simulated annealing[4] and genetic algorithms[5]. We demonstrate the possibility of automatic design of structures with complicated target electronic properties[6]. This work was performed under the auspices of the US Department of Energy by the University of California at the LLNL under contract no W-7405-Eng-48. [1] M. F. Crommie, C. P. Lutz and D. M. Eigler, Nature 403, 512 (2000) [2] D. P. DiVincenzo et al., Nature 408, 339 (2000) [3] G. A. Fiete and E. J. Heller, Rev. Mod. Phys. 75, 933 (2003) [4] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller and E. Teller, J. Chem. Phys. 21, 1087 (1953) [5] E. Aarts and J. K. Lenstra, eds., Local Search in Combinatorial Optimization (Princeton University Press, 1997) [6] A. A. Correa, F. Reboredo and C. Balseiro, Phys. Rev. B (in press).
Application of statistical mechanical methods to the modeling of social networks
NASA Astrophysics Data System (ADS)
Strathman, Anthony Robert
With the recent availability of large-scale social data sets, social networks have become open to quantitative analysis via the methods of statistical physics. We examine the statistical properties of a real large-scale social network, generated from cellular phone call-trace logs. We find this network, like many other social networks, to be assortative (r = 0.31) and clustered (i.e., strongly transitive, C = 0.21). We measure fluctuation scaling to identify the presence of internal structure in the network and find that structural inhomogeneity effectively disappears at the scale of a few hundred nodes, though there is no sharp cutoff. We introduce an agent-based model of social behavior, designed to model the formation and dissolution of social ties. The model is a modified Metropolis algorithm containing agents operating under the basic sociological constraints of reciprocity, communication need and transitivity. The model introduces the concept of a social temperature. We go on to show that this simple model reproduces the global statistical network features (including assortativity, connected fraction, mean degree, clustering, and mean shortest path length) of the real network data and undergoes two phase transitions, one being from a "gas" to a "liquid" state and the second from a liquid to a glassy state as a function of this social temperature.
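A minimal version of such a tie-formation model can be sketched as a Metropolis chain over an adjacency matrix, with a toy Hamiltonian that charges for maintaining ties and rewards closed triads (transitivity); the energy function, parameter values and network size below are illustrative assumptions, not the thesis's actual model:

```python
import numpy as np

rng = np.random.default_rng(2)

def energy(adj, tri_reward=1.0, edge_cost=0.5):
    """Toy social Hamiltonian: ties cost effort, closed triads (transitivity) pay off.
    trace(A^3)/6 counts the triangles of an undirected simple graph."""
    triangles = np.trace(adj @ adj @ adj) / 6.0
    return edge_cost * adj.sum() / 2.0 - tri_reward * triangles

n, T_social, steps = 30, 0.5, 4000
adj = np.zeros((n, n), dtype=int)
E = energy(adj)
for _ in range(steps):
    i, j = rng.integers(0, n, size=2)
    if i == j:
        continue
    trial = adj.copy()
    trial[i, j] = trial[j, i] = 1 - trial[i, j]   # toggle the (undirected) tie
    E_trial = energy(trial)
    # Metropolis acceptance at the "social temperature" T_social
    if E_trial <= E or rng.random() < np.exp(-(E_trial - E) / T_social):
        adj, E = trial, E_trial

mean_degree = adj.sum() / n
```

Sweeping T_social in such a model is what produces the gas/liquid/glass-like regimes mentioned above; this sketch omits reciprocity and communication-need terms for brevity.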
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gencaga, Deniz; Knuth, Kevin H.; Carbon, Duane F.
Understanding the origins of life has been one of the greatest dreams throughout history. It is now known that star-forming regions contain complex organic molecules, known as Polycyclic Aromatic Hydrocarbons (PAHs), each of which has particular infrared spectral characteristics. By understanding which PAH species are found in specific star-forming regions, we can better understand the biochemistry that takes place in interstellar clouds. Identifying and classifying PAHs is not an easy task: we can only observe a single superposition of PAH spectra at any given astrophysical site, with the PAH species perhaps numbering in the hundreds or even thousands. This is a challenging source separation problem since we have only one observation composed of numerous mixed sources. However, it is made easier with the help of a library of hundreds of PAH spectra. In order to separate PAH molecules from their mixture, we need to identify the specific species and their unique concentrations that would provide the given mixture. We develop a Bayesian approach for this problem where sources are separated from their mixture by the Metropolis-Hastings algorithm. Separated PAH concentrations are provided with their error bars, illustrating the uncertainties involved in the estimation process. The approach is demonstrated on synthetic spectral mixtures using spectral resolutions from the Infrared Space Observatory (ISO). Performance of the method is tested for different noise levels.
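The separation step can be sketched as Metropolis-Hastings sampling of nonnegative concentrations given a spectral library; the three Gaussian "template spectra" and noise level below are toy stand-ins for the PAH library and ISO resolutions used in the study:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy spectral library: each column is one template spectrum
wavenumber = np.linspace(0.0, 1.0, 120)
def band(center, width):
    return np.exp(-0.5 * ((wavenumber - center) / width) ** 2)

library = np.column_stack([band(0.2, 0.03), band(0.5, 0.05), band(0.75, 0.04)])
true_conc = np.array([1.0, 0.4, 0.7])
sigma = 0.01
mixture = library @ true_conc + rng.normal(0, sigma, wavenumber.size)

def log_post(c):
    if np.any(c < 0):                  # concentrations must be nonnegative
        return -np.inf
    resid = mixture - library @ c
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Metropolis-Hastings over the concentration vector
c = np.full(3, 0.5)
lp = log_post(c)
samples = []
for _ in range(10000):
    prop = c + rng.normal(0, 0.02, size=3)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        c, lp = prop, lp_prop
    samples.append(c.copy())
post = np.array(samples[5000:])
est, err = post.mean(axis=0), post.std(axis=0)   # estimates with error bars
```

The posterior standard deviations `err` are the "error bars" the abstract refers to; with hundreds of overlapping templates the posterior becomes far more degenerate than in this three-source toy.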
EXOFIT: orbital parameters of extrasolar planets from radial velocities
NASA Astrophysics Data System (ADS)
Balan, Sreekumar T.; Lahav, Ofer
2009-04-01
Retrieval of orbital parameters of extrasolar planets poses considerable statistical challenges. Due to sparse sampling, measurement errors, parameter degeneracy and modelling limitations, there are no unique values of basic parameters, such as period and eccentricity. Here, we estimate the orbital parameters from radial velocity data in a Bayesian framework by utilizing Markov Chain Monte Carlo (MCMC) simulations with the Metropolis-Hastings algorithm. We follow a methodology recently proposed by Gregory and Ford. Our implementation of MCMC is based on the object-oriented approach outlined by Graves. We make our resulting code, EXOFIT, publicly available with this paper. It can search for either one or two planets, as illustrated on mock data. As an example, we re-analysed the orbital solution of companions to HD 187085 and HD 159868 from the published radial velocity data. We confirm the degeneracy reported for the orbital parameters of the companion to HD 187085, and show that a low-eccentricity orbit is more probable for this planet. For HD 159868, we obtained a slightly different orbital solution and a relatively high `noise' factor indicating the presence of an unaccounted-for signal in the radial velocity data. EXOFIT is designed in such a way that it can be extended to a variety of probability models, including different Bayesian priors.
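The core of such an analysis can be sketched as a Metropolis-Hastings chain over a simplified circular-orbit radial-velocity model (the full Keplerian model fitted by EXOFIT adds eccentricity and periastron parameters); the data and parameter values below are synthetic illustrations, not EXOFIT's implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

def rv_model(t, P, K, phi, V0):
    """Radial velocity of a star hosting one planet on a circular orbit:
    period P (days), semi-amplitude K (m/s), phase phi, systemic velocity V0."""
    return V0 + K * np.sin(2 * np.pi * t / P + phi)

t = np.sort(rng.uniform(0, 300, 40))            # sparse observation epochs (days)
true = dict(P=61.0, K=35.0, phi=1.0, V0=5.0)    # illustrative parameters
sigma = 3.0                                     # measurement error (m/s)
rv = rv_model(t, **true) + rng.normal(0, sigma, t.size)

def log_post(theta):
    P, K, phi, V0 = theta
    if not (1 < P < 1000 and 0 < K < 200):      # flat priors on physical ranges
        return -np.inf
    return -0.5 * np.sum((rv - rv_model(t, P, K, phi, V0)) ** 2) / sigma ** 2

theta = np.array([60.0, 30.0, 0.5, 0.0])
lp = log_post(theta)
chain = []
step = np.array([0.3, 1.0, 0.1, 0.5])
for _ in range(20000):
    prop = theta + rng.normal(0, step)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:     # Metropolis-Hastings acceptance
        theta, lp = prop, lp_prop
    chain.append(theta)
P_est = float(np.median([c[0] for c in chain[10000:]]))
```

The multimodality the abstract mentions shows up when the chain is started far from the true period; here the chain is started near the mode purely to keep the sketch short.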
NASA Astrophysics Data System (ADS)
García, M. F.; Restrepo-Parra, E.; Riaño-Rojas, J. C.
2015-05-01
This work develops a model that mimics the growth of diatomic, polycrystalline thin films by artificially splitting the growth into deposition and relaxation processes including two stages: (1) a grain-based stochastic method (grain orientations randomly chosen) is considered and, by means of the Kinetic Monte Carlo method employing a non-standard version known as Constant Time Stepping, the deposition is simulated. The adsorption of adatoms is accepted or rejected depending on the neighborhood conditions; furthermore, the desorption process is not included in the simulation. (2) The Monte Carlo method combined with the Metropolis algorithm is used to simulate the diffusion. The model was developed by accounting for parameters that determine the morphology of the film, such as the growth temperature, the interacting atomic species, the binding energy and the material crystal structure. The modeled samples exhibited an FCC structure with grains oriented along the < 111 >, < 200 > and < 220 > families of planes. The grain size and film roughness were analyzed. By construction, the grain size decreased, and the roughness increased, as the growth temperature increased. Although, during the growth process of real materials, the deposition and relaxation occur simultaneously, this method may nevertheless be valid for building realistic polycrystalline samples.
NASA Astrophysics Data System (ADS)
Pandey, Ras; Kuang, Zhifeng; Farmer, Barry; Kim, Sang; Naik, Rajesh
2012-02-01
Recently, Kim et al. [1] have found that peptides P1: HSSYWYAFNNKT and P2: EPLQLKM bind selectively to graphene surfaces and edges, respectively, which are critical in modulating both the mechanical and the electronic transport properties of graphene. Such distinctions in binding sites (edge versus surface) observed in electron micrographs were verified by computer simulation with an all-atom model that captures the pi-pi bonding. We propose a hierarchical approach that feeds input from the all-atom Molecular Dynamics (MD) study (with atomistic detail) into a coarse-grained Monte Carlo simulation to extend this study to larger scales. The binding energy of a free amino acid with the graphene sheet from the all-atom simulation is used in the interaction parameter for the coarse-grained approach. The peptide chain executes its stochastic motion with the Metropolis algorithm. We investigate a number of local and global physical quantities and find that peptide P1 is likely to bind more strongly to the graphene sheet than P2 and that it is anchored by three residues ^4Y^5W^6Y. [1] S.N. Kim et al., J. Am. Chem. Soc. 133, 14480 (2011).
Profile-Based LC-MS Data Alignment—A Bayesian Approach
Tsai, Tsung-Heng; Tadesse, Mahlet G.; Wang, Yue; Ressom, Habtom W.
2014-01-01
A Bayesian alignment model (BAM) is proposed for alignment of liquid chromatography-mass spectrometry (LC-MS) data. BAM belongs to the category of profile-based approaches, which are composed of two major components: a prototype function and a set of mapping functions. Appropriate estimation of these functions is crucial for good alignment results. BAM uses Markov chain Monte Carlo (MCMC) methods to draw inference on the model parameters and improves on existing MCMC-based alignment methods through 1) the implementation of an efficient MCMC sampler and 2) an adaptive selection of knots. A block Metropolis-Hastings algorithm that mitigates the problem of the MCMC sampler getting stuck at local modes of the posterior distribution is used for the update of the mapping function coefficients. In addition, a stochastic search variable selection (SSVS) methodology is used to determine the number and positions of knots. We applied BAM to a simulated data set, an LC-MS proteomic data set, and two LC-MS metabolomic data sets, and compared its performance with the Bayesian hierarchical curve registration (BHCR) model, the dynamic time-warping (DTW) model, and the continuous profile model (CPM). The advantage of applying appropriate profile-based retention time correction prior to performing a feature-based approach is also demonstrated through the metabolomic data sets. PMID:23929872
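The block update can be sketched as follows: instead of updating mapping-function coefficients one at a time, the whole coefficient vector is proposed jointly, which helps the chain escape local modes of the posterior. The linear retention-time map and two-peak profile below are toy stand-ins for BAM's spline-based mapping functions and real LC-MS profiles:

```python
import numpy as np

rng = np.random.default_rng(4)

def profile(t):
    """Reference chromatographic profile: two Gaussian peaks (illustrative)."""
    return (np.exp(-0.5 * ((t - 3.0) / 0.3) ** 2)
            + 0.6 * np.exp(-0.5 * ((t - 7.0) / 0.4) ** 2))

t = np.linspace(0, 10, 200)
true_c = np.array([0.5, 1.05])                  # true linear retention-time map
obs = profile(true_c[0] + true_c[1] * t) + rng.normal(0, 0.02, t.size)

def log_post(c):
    return -0.5 * np.sum((obs - profile(c[0] + c[1] * t)) ** 2) / 0.02 ** 2

# Block Metropolis-Hastings: propose both mapping coefficients jointly,
# rather than one coordinate at a time
c = np.array([0.0, 1.0])
lp = log_post(c)
samples = []
for _ in range(8000):
    prop = c + rng.normal(0, [0.02, 0.005])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        c, lp = prop, lp_prop
    samples.append(c.copy())
c_est = np.mean(samples[4000:], axis=0)
```

Joint proposals matter here because the intercept and slope of the mapping are strongly correlated under the posterior; coordinate-wise moves along either axis alone are mostly rejected.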
Russoff, D
1986-04-01
The Tokyo metropolis houses 11,892,016 people, 1/10 of the Japanese population. In recent years, Tokyo's population growth has slowed as the birthrate has fallen from a 1947 postwar high of 31.5/1000 to 11.4/1000 in 1983. 5.9 million males and 5.8 million females, composing 4 1/2 million households, live in Tokyo's 2160 square kilometers. Within the metropolis' 23 wards, density per square kilometer was 14,023 persons in 1983, with Toshima ward containing 21,844 people per square kilometer. Wards around the city's center held 71% of the population in 1983, but had only 27% of its land mass; outlying cities, towns, and villages held 28% of the population on 54% of the land. 50% of Tokyo's population is aged 25-59; those 65 or over will rise from 1980's 9% to 15.6% of the population in 2000. In 40 years, Japan will have more elderly people than any other advanced country. In 1983, Tokyo had over 150,000 housing starts, high by Japanese and international standards. Nearly 1/4 of Tokyo households each contain a married couple with 2 children, but single person households predominate, reflecting Tokyo's student, working bachelor, and elderly populations. Young, single Japanese workers spend 1/3 of their income on leisure, entertainment, cultural activities and education; couples marry late and, with 2 incomes, can purchase many nonessentials. Nearly 3.25 million students attend Tokyo's fiercely competitive schools and colleges; Japan is almost 100% literate. Of the 6 million people working in Tokyo, half work in the service and retail sector, 25% in manufacturing, 12% in transportation and communication, 9% in finance and insurance, and 8% work in construction. Tokyo workers earn nearly 20% more than the average Japanese worker. Japan now faces job shortages and will see many unemployment problems by 1990. 
To help absorb new workers, government planners recommend increasing vacation time, training workers as specialists rather than generalists, and encouraging job sharing and part-time work.
From Internet of Things to Smart Data for Smart Urban Monitoring
NASA Astrophysics Data System (ADS)
Gastaud, E.
2017-09-01
Cities are facing some of the major challenges of our time: global warming, pollution, waste management, energy efficiency. The territory of the Metropolis of Lyon, France, which brings together 59 municipalities, for a total of 1.3 million inhabitants, has launched a smart city policy aimed, among other things, at finding solutions for these issues. The data platform set up in 2013 is one of the cornerstones of this policy. In this context, the Metropolis of Lyon is deploying solutions that will enable, through the collection of new data, the implementation of monitoring and action tools in several fields. As part of a European innovation project called "bIoTope", focused on the development of new services based on the Internet of Things, a multidisciplinary team is implementing a system to mitigate the effects of global warming in the city. Thanks to various connected objects allowing true monitoring of the trees, and by using different data sources, an automatic and intelligent irrigation system is developed. In the field of waste management, several hundred containers in which the inhabitants throw away their used glass for recycling will soon be equipped with fill rate sensors. The main objective is to have this network of sensors interact easily with the container collection trucks. Expected results are an optimization of the collection, thus less fuel consumed, less noise, and less traffic congestion. The Metropolis of Lyon also participates in the "Smarter Together" project, focused on the development of intelligent duplicable solutions for cities, in the field of energy. A digital tool for analysing consumption and energy production at the level of a neighbourhood is currently being developed. This requires both interfaces with multiple partners, the development of a data model reflecting the reality of the terrain, from the sensors to the buildings, and the implementation of a visualization tool.
Childhood epilepsy: knowledge and attitude of primary school teachers in Port Harcourt, Nigeria.
Alikor, E A D; Essien, A A
2005-01-01
This study was conducted to determine primary school teachers' knowledge of epilepsy in Port Harcourt metropolis, their knowledge of the management of an epileptic attack, and their attitude towards epilepsy in children. This is a questionnaire-based, cross-sectional study of 118 school teachers from five randomly selected primary schools in Port Harcourt metropolis, Nigeria. Ten percent (12) of the 118 teachers were graded "Good", 45% (54) "Fair" and 43% (52) "Poor" in overall knowledge score. Sixty-six teachers (56%) accepted applying crude oil to the body as useful in stopping epileptic attacks in children. There was no significant association between overall knowledge score and sex, years of experience as a teacher, or experience with a child with epilepsy. Only 10% of the teachers studied were classified as having overall good knowledge of epilepsy. Sixty-nine teachers (58.5%) were graded as having good knowledge of the cause of epilepsy. Only 38 (32%) disagreed that the saliva drooled during an epileptic attack is contagious; one hundred (84.8%) and 65 (55.1%) agreed that some childhood illnesses can cause epilepsy and that it runs in families, respectively. Overall, 54 teachers (45.8%) had a cumulative score indicating a negative attitude towards epilepsy. Eighty-three teachers (73.3%) would want all children with epilepsy put in a special school, whilst 57 (48%) agreed that children with epilepsy should be withdrawn from schools. The longer the teacher's professional experience, the greater the likelihood of a positive attitude towards epilepsy, but the association did not reach a statistically significant level (p = 0.076). Attitude was not statistically associated with sex or educational qualification. The overall knowledge of primary school teachers in Port Harcourt metropolis of epilepsy and the first-aid management of an epileptic attack is poor, and the attitude of these teachers towards epilepsy is negative.
Education of the primary school teacher and general public on epilepsy is recommended.
Advances in Radiation Mutagenesis through Studies on Drosophila
DOE R&D Accomplishments Database
Muller, H. J.
1958-06-01
The approximately linear relation between radiation dose and induced lethals, known for Drosophila spermatozoa, is now extended to spermatids. Data are included regarding oogonia. The dependence of minute structural changes in sperm on about the 1.5 power of the dose, marking them as multi-hit events and long known for spermatozoa, is now extended to spermatids and late oocytes for relatively short exposures. Longer intervals between exposures are found to allow union of broken chromosomes; the frequencies are therefore lower for more dispersed exposures. Lethals induced in late oocytes follow the same frequency pattern and therefore are likewise multi-hit events. The following is the order of decreasing radiation mutability of the different stages found by ourselves and others: spermatids, spermatozoa in females, spermatozoa 0 to 1 day before ejaculation, earlier spermatozoa, late oocytes, gonia of either sex. Lethal frequencies for these stages range over approximately an order of magnitude, gross structural changes far more widely. Of potential usefulness is our extension of protection against mutagenesis by anoxia, known for spermatozoa in adult males, to spermatozoa in pupal males and in females; the protection is especially marked, but the increase caused by substituting oxygen for air is less marked, perhaps because of enzymatic differences. In contrast, the induction of gross structural changes in oocytes, but not in spermatids, is markedly reduced by oxygen post-treatment; it is increased by dehydration. The efficacy of induction of structural changes by treatment of spermatozoa, whether with radiation or a chemical mutagen, is correlated with the conditions of sperm utilization and egg production. Improving our perspective on radiation effects, some 800,000 offspring have been scored for spontaneous visible mutations at 13 specific loci. The average point-mutation rate was 0.5 to 1.0 per locus among 10^5 germ cells.
Most mutations occurred in peri-fertilization stages. All loci studied mutated from one to nine times. Loci mutating more often spontaneously also gave more radiation mutations in other studies. Spectra of individual loci prove similar for spontaneous and induced mutation. Studies on back-mutation also showed the similarity of spontaneous and radiation mutations. The doubling dose for back-mutations of forked induced in spermatozoa was several hundred roentgens. Recent analyses of the human mutational load lead to mutation-rate estimates like those earlier based on extrapolations from Drosophila, thus supporting the significance for man of the present studies. (auth)
Mobilizing Practice: Engaging Space, Technology and Design from a Thai Metropolis
ERIC Educational Resources Information Center
Williams, Amanda Marisa
2009-01-01
The project of ubiquitous computing aims to embed computation into everyday spaces. As a practice that is heavily concerned with space and place, its stance towards mobility is sometimes conflicted--treating mobility by turns as a disruption or as an opportunity--and almost always conceiving of it as free and empowered. Conducted in industrial and…
ERIC Educational Resources Information Center
Gesinde, Abiodun Matthew; Sanu, Oluwafunto Jolade
2015-01-01
This study sought to examine the impact which age, gender and psychological adjustment have on behaviour towards seeking professional counselling intervention. Multistage sampling technique was employed to select a total of three hundred workers across Lagos metropolis. The ex post facto research design was adopted for the study. Inventory of…
Factors That Inform Students' Choice of Study and Career
ERIC Educational Resources Information Center
Theresa, Lawer Dede
2015-01-01
The research was conducted to find out factors that informed second cycle students' choices of programmes of study and career in the Kumasi Metropolis of Ghana. The descriptive survey was used for the study, and both questionnaire and interview guide were used in gathering the data. The questionnaire was administered on the students while the…
Urban Inequality: Evidence from Four Cities. A Volume in the Multi-City Study of Urban Inequality.
ERIC Educational Resources Information Center
O'Connor, Alice, Ed.; Tilly, Chris, Ed.; Bobo, Lawrence D., Ed.
This collection of papers focuses on urban inequalities in Atlanta, Boston, Detroit, and Los Angeles. There are 11 chapters in 3 parts. The book begins with an introduction, "Understanding Inequality in the Late Twentieth-Century Metropolis: New Perspectives on the Enduring Racial Divide" (Alice O'Connor) and chapter 1,…
Exploring In-Service Teachers' Self-Efficacy in the Kindergarten Classrooms in Ghana
ERIC Educational Resources Information Center
Boateng, Philip; Sekyere, Frank Owusu
2018-01-01
The study explored in-service teachers' efficacy beliefs in pupil engagement. The sample size was 299 kindergarten teachers selected from both public and private kindergarten schools in the Kumasi metropolis of Ghana. The study adopted and used pupil engagement subscale of the Ohio State Teacher Efficacy Scale (OSTES) developed by Tschannen-Moran…
ERIC Educational Resources Information Center
Dodoo, Joana Eva; Kuupole, Domwini Dabire
2017-01-01
The majority of studies and reports on university education in Africa have focused mainly on issues related to access, quality, teaching and learning environment, and so on. Although these issues are undoubtedly critical, even more germane to the discourse is the desired utility of university education to society. The authors present the…
Perceiving the Metropolis: Seeing the City through a Prism of Race
ERIC Educational Resources Information Center
Krysan, Maria; Bader, Michael
2007-01-01
Investigating the role of preferences in causing persistent patterns of racial residential segregation in the United States has a long history. In this paper, we bring a new perspective--and new data from the 2004 Detroit Area Study--to the question of how best to characterize black and white preferences toward living in neighborhoods with people…
Effects of Goal-Setting Skills on Students'academic Performance in English Language in Enugu Nigeria
ERIC Educational Resources Information Center
Abe, Iyabo Idowu; Ilogu, Guy Chibuzoh; Madueke, Ify Louisa
2014-01-01
The study investigated the effectiveness of goal-setting skills among Senior Secondary II students' academic performance in English language in Enugu Metropolis, Enugu state, Nigeria. Quasi-experimental pre-test, post-test control group design was adopted for the study. The initial sample was 147 participants (male and female) Senior Secondary…
ERIC Educational Resources Information Center
Yaki, Akawo Angwal; Babagana, Mohammed
2016-01-01
The paper examined the effects of a Technological Instructional Package (TIP) on secondary school students' performance in biology. The study adopted a pre-test, post-test experimental control group design. The sample size of the study was 80 students from Minna metropolis, Niger state, Nigeria; the samples were randomly assigned into treatment…
Beyond the Metropolis: The Forgotten History of Small-Town Teachers' Unions
ERIC Educational Resources Information Center
Scribner, Campbell F.
2015-01-01
This article examines the legal and political significance of teacher unionization in rural and suburban school districts between 1960 and 1975. While most historians focus on the growth of unions in urban areas, strikes in outlying districts played a determinative role in the development of public sector labor law, particularly in the arbitration…
ERIC Educational Resources Information Center
Shelina, S. L.; Mitina, O. V.
2015-01-01
The article presents the results of an analysis of the moral value judgments of adults (parents, teachers, educators) that directly concern the socialization process of the young generation in the modern metropolis. This paper follows the model study by Jean Piaget that investigated the moral value judgments of children. A comparative analysis of…
ERIC Educational Resources Information Center
Oluwatomiwo, Oladunmoye Enoch
2015-01-01
This study examined the development and validation of socio provision scale on first year undergraduates adjustment among institution in Ibadan metropolis. The study adopted a descriptive survey design. A sample of 300 participants was randomly selected across institutions in Ibadan. Data were collected using socio provision scale (a =0.76),…
Ecology, Literature and Environmental Education
ERIC Educational Resources Information Center
Tsekos, Christos A.; Tsekos, Evangelos A.; Christoforidou, Elena I.
2012-01-01
The first part of this article refers to the initial attempt to relate Nature to Literature since the age of Hellenistic Alexandria in Egypt. Alexandria was a metropolis of its time with a quite lively character of urban life. Influenced by that character Theocritus was the first to lay the foundations of what is defined as pastoral poetry. In the…
ERIC Educational Resources Information Center
Kuo, Fan-Sheng; Perng, Yeng-Horng
2016-01-01
Creating an attractive cityscape has become one of the most promising actions to improve urban functionality and increase urban competitiveness. However, the resistances from the local inhabitants are always against the urban development. Taipei City, a metropolis in Taiwan, is now composed of complex urban systems chaotically enclosed by existing…
Firoz Uncle: A "Reluctant" Educationist in a Mumbai Ghetto
ERIC Educational Resources Information Center
Murali, Sreejith
2017-01-01
This article focuses on the educational efforts of Syed Firoz Ashraf in the East Jogeshwari area of Mumbai and places his work in the context of the increasing communalisation of social life and education in a poor working class suburb in Mumbai city. Muslim community has been ghettoised in the metropolis to specific areas especially since the…
Determinants of Differing Teacher Attitudes towards Inclusive Education Practice
ERIC Educational Resources Information Center
Gyimah, Emmanuel K.; Ackah, Francis R., Jr.; Yarquah, John A.
2010-01-01
An examination of literature reveals that teacher attitude is fundamental to the practice of inclusive education. In order to verify the extent to which the assertion is applicable in Ghana, 132 teachers were selected from 16 regular schools in the Cape Coast Metropolis using purposive and simple random sampling techniques to respond to a four…
Evaluation of VIIRS AOD over North China Plain: biases from aerosol models
NASA Astrophysics Data System (ADS)
Zhu, J.; Xia, X.; Wang, J.; Chen, H.; Zhang, J.; Oo, M. M.; Holz, R.
2014-12-01
With the launch of the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument onboard the Suomi National Polar-orbiting Partnership (S-NPP) in late 2011, the aerosol products of VIIRS are receiving much attention. To date, most evaluations of VIIRS aerosol products have concerned aerosol optical depth (AOD). To further assess the VIIRS AOD in China, one of the most heavily polluted regions in the world, we compared VIIRS AOD against CE-318 radiometer observations at three sites over the North China Plain (NCP): metropolis (Beijing, AERONET), suburbs (XiangHe, AERONET) and a regional background site (Xinglong, CARSNET). The results showed that the VIIRS AOD at 550 nm has a positive mean bias error (MBE) of 0.14-0.15 and a root mean square error (RMSE) of 0.20. Among the three sites, Beijing is the main source of bias, with an MBE of 0.17-0.18 and an RMSE of 0.23-0.24; this bias is larger than some recent global statistics published in the literature. Further analysis shows that this large bias in VIIRS AOD over the NCP may be partly caused by the aerosol model selection in the VIIRS aerosol inversion. According to the retrieval of sky radiance from CE-318 at the three sites, aerosols in the NCP have a high mean real part of the refractive index (1.52-1.53), large volume mean radius (0.17-0.18) and low concentration (0.04-0.09) for the fine mode, and small mean radius (2.86-2.92) and high concentration (0.06-0.16) for the coarse mode. These observation-based single scattering properties and sizes of fine and coarse aerosols differ from the aerosol properties used in the VIIRS operational algorithm. The dominant aerosol models used in the VIIRS algorithm for these three sites are less-polluted urban aerosol in Beijing and low-absorption smoke at the other two sites, none of which agree with the high imaginary part of the refractive index from the CE-318 retrieval. Therefore, the aerosol models in the VIIRS algorithm likely need to be refined for the NCP region.
Li, Xianfeng; Murthy, Sanjeeva; Latour, Robert A.
2011-01-01
A new empirical sampling method termed “temperature intervals with global exchange of replicas and reduced radii” (TIGER3) is presented and demonstrated to efficiently equilibrate entangled long-chain molecular systems such as amorphous polymers. The TIGER3 algorithm is a replica exchange method in which simulations are run in parallel over a range of temperature levels at and above a designated baseline temperature. The replicas sampled at temperature levels above the baseline are run through a series of cycles with each cycle containing four stages – heating, sampling, quenching, and temperature level reassignment. The method allows chain segments to pass through one another at elevated temperature levels during the sampling stage by reducing the van der Waals radii of the atoms, thus eliminating chain entanglement problems. Atomic radii are then returned to their regular values and re-equilibrated at elevated temperature prior to quenching to the baseline temperature. Following quenching, replicas are compared using a Metropolis Monte Carlo exchange process for the construction of an approximate Boltzmann-weighted ensemble of states and then reassigned to the elevated temperature levels for additional sampling. Further system equilibration is performed by periodic implementation of the previously developed TIGER2 algorithm between cycles of TIGER3, which applies thermal cycling without radii reduction. When coupled with a coarse-grained modeling approach, the combined TIGER2/TIGER3 algorithm yields fast equilibration of bulk-phase models of amorphous polymer, even for polymers with complex, highly branched structures. The developed method was tested by modeling the polyethylene melt. The calculated properties of chain conformation and chain segment packing agreed well with published data. 
The method was also applied to generate equilibrated structural models of three increasingly complex amorphous polymer systems: poly(methyl methacrylate), poly(butyl methacrylate), and DTB-succinate copolymer. Calculated glass transition temperature (Tg) and structural parameter profile (S(q)) for each resulting polymer model were found to be in close agreement with experimental Tg values and structural measurements obtained by x-ray diffraction, thus validating that the developed methods provide realistic models of amorphous polymer structure. PMID:21769156
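The TIGER2/TIGER3 cycle above ends with a Metropolis Monte Carlo exchange that decides which quenched replicas enter the approximate Boltzmann-weighted baseline ensemble. A minimal sketch of that accept/reject core follows; the function name, the kcal/mol energy units, and the toy energies are assumptions for illustration, not the authors' implementation.

```python
import math
import random

K_B = 0.0019872041  # Boltzmann constant in kcal/(mol K); units are an assumption

def metropolis_accept(e_candidate, e_current, temperature, rng=random):
    """Metropolis criterion: accept a candidate state with
    probability min(1, exp(-dE / kT))."""
    delta = e_candidate - e_current
    if delta <= 0.0:
        return True  # downhill moves are always accepted
    return rng.random() < math.exp(-delta / (K_B * temperature))

# Toy usage: decide whether a quenched replica replaces the current
# baseline-temperature state in the ensemble.
random.seed(0)
print(metropolis_accept(-105.0, -100.0, 300.0))  # lower energy, always accepted
```

Uphill moves are accepted with a temperature-dependent probability, which is what lets the scheme build a Boltzmann-weighted ensemble rather than a simple energy minimization.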
NASA Astrophysics Data System (ADS)
Wall, Michael
2014-03-01
Experimental progress in generating and manipulating synthetic quantum systems, such as ultracold atoms and molecules in optical lattices, has revolutionized our understanding of quantum many-body phenomena and posed new challenges for modern numerical techniques. Ultracold molecules, in particular, feature long-range dipole-dipole interactions and a complex and selectively accessible internal structure of rotational and hyperfine states, leading to many-body models with long range interactions and many internal degrees of freedom. Additionally, the many-body physics of ultracold molecules is often probed far from equilibrium, and so algorithms which simulate quantum many-body dynamics are essential. Numerical methods which are to have significant impact in the design and understanding of such synthetic quantum materials must be able to adapt to a variety of different interactions, physical degrees of freedom, and out-of-equilibrium dynamical protocols. Matrix product state (MPS)-based methods, such as the density-matrix renormalization group (DMRG), have become the de facto standard for strongly interacting low-dimensional systems. Moreover, the flexibility of MPS-based methods makes them ideally suited both to generic, open source implementation as well as to studies of the quantum many-body dynamics of ultracold molecules. After introducing MPSs and variational algorithms using MPSs generally, I will discuss my own research using MPSs for many-body dynamics of long-range interacting systems. In addition, I will describe two open source implementations of MPS-based algorithms in which I was involved, as well as educational materials designed to help undergraduates and graduates perform research in computational quantum many-body physics using a variety of numerical methods including exact diagonalization and static and dynamic variational MPS methods. 
Finally, I will mention present research on ultracold molecules in optical lattices, such as the exploration of many-body physics with polyatomic molecules, and the next generation of open source matrix product state codes. This work was performed in the research group of Prof. Lincoln D. Carr.
ERIC Educational Resources Information Center
Akpoghol, T. V.; Ezeudu, F. O.; Adzape, J. N.; Otor, E. E.
2016-01-01
The study investigated the effects of Lecture Method Supplemented with Music (LMM) and Computer Animation (LMC) on senior secondary school students' academic achievement in electrochemistry in Makurdi metropolis. Six research questions and six hypotheses guided the study. The design of the study was quasi experimental, specifically the pre-test,…
ERIC Educational Resources Information Center
Akpoghol, T. V.; Ezeudu, F. O.; Adzape, J. N.; Otor, E. E.
2016-01-01
The study investigated the effects of Lecture Method Supplemented with Music (LMM) and Computer Animation (LMC) on senior secondary school students' retention in electrochemistry in Makurdi metropolis. Three research questions and three hypotheses guided the study. The design of the study was quasi experimental, specifically the pre-test,…
ERIC Educational Resources Information Center
Obomanu, B. J.; Adaramola, M. O.
2011-01-01
We report a research into factors related to underachievement in science, technology and mathematics (STM) education in schools in Rivers State, Nigeria. The study investigated 240 Nigerian secondary school students, 100 parents, 140 STM teachers and 20 government officials from Port Harcourt Metropolis. Five (5) research questions and one…
The Vitality of a City: Challenge to Higher Education; Challenge to Education: A New Approach.
ERIC Educational Resources Information Center
Johnson, Byron
The US higher education system adopted the European pattern of separating the university from the city. This pattern has changed somewhat in the last few decades, when new universities or branches of older ones have appeared in the metropolis. But frequently these institutions are unconcerned with finding ways to contribute toward improving urban…
Teachers' Level of Awareness of 21st Century Occupational Roles in Rivers State Secondary Schools
ERIC Educational Resources Information Center
Uche, Chineze M.; Kaegon, Leesi E. S. P.; Okata, Fanny Chiemezie
2016-01-01
This study investigated the teachers' level of awareness of 21st century occupational roles in Rivers state secondary schools. Three research questions and three hypotheses guided the study. The population of study comprised of 247 public secondary schools and 57 private secondary schools in Port Harcourt metropolis of Rivers state which gave a…
ERIC Educational Resources Information Center
Mccrea, Rod; Stimson, Robert; Western, John
2005-01-01
Using survey data collected from households living in the Brisbane-South East Queensland region, a rapidly growing metropolis in Australia, path analysis is used to test links between urban residents' assessment of various urban attributes and their level of satisfaction in three urban domains--housing, neighbourhood or local area, and the wider…
ERIC Educational Resources Information Center
Iji, C. O.; Ogbole, P. O.; Uka, N. K.
2014-01-01
Among all approaches aimed at reducing poor mathematics achievement among the students, adoption of appropriate methods of teaching appears to be more rewarding. In this study, improvised instructional materials were used to ascertain students' geometry achievement at the upper basic education one. Two research questions were asked with associated…
ERIC Educational Resources Information Center
Yusuf, Hanna Onyi
2014-01-01
This study assessed the implementation of the reading component of the Junior Secondary School English Language Curriculum for Basic Education in Nigeria. Ten (10) randomly selected public and private secondary schools from Kaduna metropolis in Kaduna State of Nigeria were used for the study. Among the factors assessed in relation to the…
Code of Federal Regulations, 2014 CFR
2014-07-01
... representatives include commissioned, warrant, and petty officers of the U.S. Coast Guard. (d) Informational...-90.5 (West Virginia). 9. 1 day—Third or fourth of July Harrah's Casino/Metropolis Fireworks... Mississippi River mile marker 518.0 to 519.0 (Iowa). 27. 1 day—4th of July weekend Harrah's Casino and Hotel...
ERIC Educational Resources Information Center
Awere, E.; Edu-Buandoh, K. B. M.; Dadzie, D. K.; Aboagye, J. A.
2016-01-01
Building Technology graduates from Ghanaian Polytechnics seek employment in the construction industry, yet little information is known as to whether their tertiary education is really related to and meeting the actual needs of their prospective employers in the construction industry. The tracer study was conducted to ascertain the performance of…
1985-10-30
61A-31-005 (30 Oct 1985) --- This almost vertical view, photographed from Earth-orbit by an STS-61A crew member, centers on the metropolis of Milwaukee, Wisconsin, and some of the adjacent Lake Michigan shoreline, southward toward the Illinois border. The 70mm frame was photographed on the first day of the Spacelab D-1 mission with a handheld Hasselblad camera.
ERIC Educational Resources Information Center
Boakye-Amponsah, Abraham; Enninful, Ebenezer Kofi; Anin, Emmanuel Kwabena; Vanderpuye, Patience
2015-01-01
Background: Ghana being a member of the United Nations, committed to the Universal Primary Education initiative in 2000 and has since implemented series of educational reforms to meet the target for the Millennium Development Goal (MDG) 2. Despite the numerous government interventions to achieve the MDG 2, many children in Ghana have been denied…
ERIC Educational Resources Information Center
Tunckan, Ergun
2007-01-01
The Open Education Faculty Students Centers have been offering many services to students in Turkey since 1982. Building up bridges between students and faculties, student centers have had technological improvements since 1998 and thereafter quality of services have been increased and services given to students at the student center have been…
Visual defects and commercial motorcycle accidents in south eastern Nigeria.
Achigbu, E O; Fiebai, B
2013-01-01
Commercial motorcyclists are a regular part of our highways, especially with the decrease in the number and quality of good roads. This study is aimed at determining the role of vision, if any, in the increasing number of road traffic accidents (RTAs) among commercial motorcyclists in Enugu metropolis, Nigeria. A cross-sectional survey with a multi-stage random sampling design was used to select the 615 commercial motorcyclists in Enugu metropolis enrolled in the study. Out of the 615 motorcyclists, seven (1.14% +/- 0.70%) had visual impairment (< 6/18-3/60). Visual field defect was noted in 2.3% +/- 0.98%, while 2.6% +/- 0.98% had a colour vision defect. The prevalence of road traffic accidents was 57.7%. Visual impairment was not significantly associated with RTA (P = 0.333), while visual field defect (P = 0.000) and colour vision defect (P = 0.003) were positively associated with RTA. Inexperienced riders had significantly more RTAs than their counterparts (P = 0.000). CONCLUSION: Visual field defect and colour vision defect were significantly associated with RTA, but this finding is against the backdrop of poor training and inexperience, which also significantly affected RTA among the predominantly young riders involved in RTAs.
Spatial patterns monitoring of road traffic injuries in Karachi metropolis.
Lateef, Muhammad U
2011-06-01
This article aims to assess the pattern of road traffic injuries (RTIs) and fatalities in Karachi metropolis. Assessing the pattern of RTIs in Karachi at this juncture is important for many reasons. The rapid motorisation in the recent years due to the availability of credit has significantly increased the traffic volume of the city. Since then, the roads of Karachi have continuously developed at a rapid pace. This development has come with a high human loss, because the construction of multilevel flyovers, signal-free corridors and the resulting high-speed traffic ultimately increase the severity of injuries. The reasons for this high proportion are inadequate infrastructure, poor enforcement of safety regulations, high crash severity index and greater population of vulnerable road user groups (riders and pedestrians). This research is the first of its kind in the country to have a geocoded database of fatalities and injuries in a geographical information system for the entire city of Karachi. In fact, road crashes are both predictable and preventable. Developing countries should learn from the experience of highly motorised nations to avoid the high burden of RTIs by adopting road safety and prevention measures.
Mazumdar, Subhendu; Ghose, Dipankar; Saha, Goutam Kumar
2017-12-14
Although Black Kites (Milvus migrans govinda) serve as a major scavenging raptor in most urban areas, scientific studies on this important ecosystem service provider are almost non-existent in the Indian context. The present study was carried out in a metropolis in eastern India to find out the factors influencing the relative abundance and roosting site selection of Black Kites. Separate generalized linear models (GLMs) were fitted, considering encounter rate and roosting Black Kite abundance as response variables. The study conclusively indicated that encounter rates of Black Kites were significantly influenced by the presence of garbage dumps in the vicinity. Numbers of Black Kites were also higher in roosting sites situated closer to garbage dumps and open spaces. In addition, expected counts of Black Kites significantly increased in roosting sites situated away from buildings and water bodies. However, built-up area and tree cover around the roosting sites had no influence on the abundance of Black Kites therein. With rapid urbanization and changing offal disposal patterns, our findings would be useful to ensure the continued availability of food and roosting sites for Black Kites in urban areas.
Zhang, Wenquan; Logan, John R.
2018-01-01
The rapid growth of Asian and Hispanic populations in urban areas is superseding traditional classifications of neighborhoods (for example as white, transitional, or minority). The "global neighborhood" that includes all groups (white, black, Hispanic and Asian) is one important new category. We examine the emerging spatial pattern of racial/ethnic composition in the Chicago metropolis, documenting an expansion of all-minority neighborhoods in the city and just beyond its borders, a shrinking set of all-white neighborhoods in the outer suburbs, and more diverse neighborhoods including whites mainly in between. The most novel element of this pattern is how large the zone of diversity has become and how far it extends into suburbia, upending the old dichotomy of "chocolate city" and "vanilla suburbs." In addition to comparing the distance of different kinds of neighborhoods from the urban core, we also analyze their adjacency to neighborhoods of the same type or other types. There is a strong tendency toward spatial clustering of each neighborhood type and also for transitions on the boundaries of clusters either to expand or to contract their territory. PMID:29430517
Li, Ling; Ortiz, Christine
2014-05-01
Hierarchical composite materials design in biological exoskeletons achieves penetration resistance through a variety of energy-dissipating mechanisms while simultaneously balancing the need for damage localization to avoid compromising the mechanical integrity of the entire structure and to maintain multi-hit capability. Here, we show that the shell of the bivalve Placuna placenta (~99 wt% calcite), which possesses the unique optical property of ~80% total transmission of visible light, simultaneously achieves penetration resistance and deformation localization via increasing energy dissipation density (0.290 ± 0.072 nJ μm(-3)) by approximately an order of magnitude relative to single-crystal geological calcite (0.034 ± 0.013 nJ μm(-3)). P. placenta, which is composed of a layered assembly of elongated diamond-shaped calcite crystals, undergoes pervasive nanoscale deformation twinning (width ~50 nm) surrounding the penetration zone, which catalyses a series of additional inelastic energy dissipating mechanisms such as interfacial and intracrystalline nanocracking, viscoplastic stretching of interfacial organic material, and nanograin formation and reorientation.
Effects of snowmelt on watershed transit time distributions
NASA Astrophysics Data System (ADS)
Fang, Z.; Carroll, R. W. H.; Harman, C. J.; Wilusz, D. C.; Schumer, R.
2017-12-01
Snowmelt is the principal control of the timing and magnitude of water flow through alpine watersheds, but the streamflow generated may be displaced groundwater. To quantify this effect, we use a rank StorAge Selection (rSAS) model to estimate time-dependent travel time distributions (TTDs) for the East River Catchment (ERC, 84 km2), a headwater basin of the Colorado River newly designated as the Lawrence Berkeley National Laboratory's Watershed Function Science Focus Area (SFA). Through the SFA, observational networks related to precipitation and stream fluxes have been established with a focus on environmental tracers and stable isotopes. The United States Geological Survey Precipitation Runoff Modeling System (PRMS) was used to estimate spatially and temporally variable boundary fluxes of effective precipitation (snowmelt and rain), evapotranspiration, and subsurface storage. The DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm was used to calibrate the rSAS model to observed stream isotopic concentration data and quantify uncertainty. The sensitivity of the simulated TTDs to systematic changes in the boundary fluxes was explored. Different PRMS and rSAS model parameter setups were tested to explore how they affect the relationship between input precipitation, especially snowmelt, and the estimated TTDs. Wavelet Coherence Analysis (WCA) was applied to investigate the seasonality of TTD simulations. Our ultimate goal is insight into how the Colorado River headwater catchments store and route water, and how sensitive flow paths and transit times are to climatic changes.
Design and Construction of a Dual Anti-Helmholtz Magnet System for a Side-by-Side MOT
NASA Astrophysics Data System (ADS)
Narducci, Frank; Prasher, Rebecca; Adler, Charles
2012-06-01
The design of a cold-atom interferometric gradient magnetometer [1] requires two side-by-side identical atom clouds separated by approximately 1 cm for noise reduction purposes. The first step in building this system is a side-by-side MOT to capture the atoms; however, the design of a coil system to provide two zero-field crossings with high field gradients separated by a small distance with low power consumption can be challenging. These three requirements are not easy to satisfy simultaneously, but there is a large ``state space'' in which we can evolve different designs. In this poster we analyze the requirements for such a system and discuss our design consisting of coils with wires wrapped on a truncated cone; this type of design has been made possible by recent advances in 3D printers, and we will go over the issues involved in printing the coil supports, building the coils, and comparing our measurements of the magnetic field to theory. We also discuss the possibility of optimizing coil design using state-space searches like the Metropolis algorithm, and how these designs can be realized using 3D printing technology. [1] Davis, J. P. and Narducci, F. A. (2008), ``A proposal for a gradient magnetometer atom interferometer,'' Journal of Modern Optics, 55:19, 3173-3185.
Multiscale implementation of infinite-swap replica exchange molecular dynamics.
Yu, Tang-Qing; Lu, Jianfeng; Abrams, Cameron F; Vanden-Eijnden, Eric
2016-10-18
Replica exchange molecular dynamics (REMD) is a popular method to accelerate conformational sampling of complex molecular systems. The idea is to run several replicas of the system in parallel at different temperatures that are swapped periodically. These swaps are typically attempted every few MD steps and accepted or rejected according to a Metropolis-Hastings criterion. This guarantees that the joint distribution of the composite system of replicas is the normalized sum of the symmetrized product of the canonical distributions of these replicas at the different temperatures. Here we propose a different implementation of REMD in which (i) the swaps obey a continuous-time Markov jump process implemented via Gillespie's stochastic simulation algorithm (SSA), which also samples exactly the aforementioned joint distribution and has the advantage of being rejection free, and (ii) this REMD-SSA is combined with the heterogeneous multiscale method to accelerate the rate of the swaps and reach the so-called infinite-swap limit that is known to optimize sampling efficiency. The method is easy to implement and can be trivially parallelized. Here we illustrate its accuracy and efficiency on the examples of alanine dipeptide in vacuum and C-terminal β-hairpin of protein G in explicit solvent. In this latter example, our results indicate that the landscape of the protein is a triple funnel with two folded structures and one misfolded structure that are stabilized by H-bonds.
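The Metropolis-Hastings swap criterion described above reduces, for a single pair of replicas, to a one-line acceptance test. A minimal Python sketch (the function and variable names are ours and purely illustrative, not the authors' code; the swap is always accepted when it moves the lower-energy configuration to the colder, higher-beta replica):

```python
import math
import random

def swap_accept(E_i, E_j, beta_i, beta_j, rng=random):
    """Metropolis-Hastings criterion for swapping the configurations of
    two replicas with potential energies E_i and E_j held at inverse
    temperatures beta_i and beta_j.

    Accept with probability min(1, exp[(beta_i - beta_j) * (E_i - E_j)]).
    """
    delta = (beta_i - beta_j) * (E_i - E_j)
    # Short-circuit: delta >= 0 means the swap is always accepted,
    # so math.exp never overflows.
    return delta >= 0.0 or rng.random() < math.exp(delta)
```

In a full REMD loop this test would be applied to adjacent temperature pairs every few MD steps; the rejection-free SSA variant proposed in the abstract replaces exactly this accept/reject step with a continuous-time jump process.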
Alderman, Phillip D.; Stanfill, Bryan
2016-10-06
Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. Here, this study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
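The random walk Metropolis sampler this study relies on can be sketched in a few lines. Below is a toy one-parameter posterior with hypothetical data and a flat prior, not the wheat-phenology models of the paper; all names and values are illustrative:

```python
import math
import random

def rw_metropolis(log_post, theta0=0.0, step=0.5, n=20000, seed=1):
    """Random walk Metropolis: Gaussian proposals around the current
    state, accepted with probability min(1, exp(delta log-posterior))."""
    rng = random.Random(seed)
    theta, lp = theta0, log_post(theta0)
    chain = []
    for _ in range(n):
        prop = theta + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        # min(0, ...) clamps the log acceptance ratio so exp() is safe
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            theta, lp = prop, lp_prop
        chain.append(theta)
    return chain

# Toy stand-in for a phenology-parameter posterior: flat prior,
# Gaussian likelihood with known sigma = 1 (the data are made up).
data = [4.8, 5.1, 5.3, 4.9, 5.2]
chain = rw_metropolis(lambda t: -sum((d - t) ** 2 for d in data) / 2.0)
post_mean = sum(chain[5000:]) / len(chain[5000:])
```

With a flat prior the posterior mean should match the sample mean of the data; discarding the first quarter of the chain as burn-in is a common (if crude) convention.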
Cosmographic analysis with Chebyshev polynomials
NASA Astrophysics Data System (ADS)
Capozziello, Salvatore; D'Agostino, Rocco; Luongo, Orlando
2018-05-01
The limits of standard cosmography are here revised, addressing the problem of error propagation during statistical analyses. To do so, we propose the use of Chebyshev polynomials to parametrize cosmic distances. In particular, we demonstrate that building up rational Chebyshev polynomials significantly reduces error propagation with respect to standard Taylor series. This technique provides unbiased estimations of the cosmographic parameters and performs significantly better than previous numerical approximations. To figure this out, we compare rational Chebyshev polynomials with Padé series. In addition, we theoretically evaluate the convergence radius of the (1,1) Chebyshev rational polynomial and compare it with the convergence radii of Taylor and Padé approximations. We thus focus on regions in which convergence of Chebyshev rational functions is better than standard approaches. With this recipe, as high-redshift data are employed, rational Chebyshev polynomials remain highly stable and enable one to derive highly accurate analytical approximations of Hubble's rate in terms of the cosmographic series. Finally, we check our theoretical predictions by setting bounds on cosmographic parameters through Monte Carlo integration techniques based on the Metropolis-Hastings algorithm. We apply our technique to high-redshift cosmic data, using the Joint Light-curve Analysis supernovae sample and the most recent versions of Hubble parameter and baryon acoustic oscillation measurements. We find that cosmography with Taylor series fails to be predictive with the aforementioned data sets, while it turns out to be much more stable using the Chebyshev approach.
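The boundedness of Chebyshev polynomials on [-1, 1], which underlies the reduced error propagation claimed above, is easy to verify numerically from the three-term recurrence (a generic sketch, not the authors' code):

```python
import math

def cheb_T(n, x):
    """Evaluate the Chebyshev polynomial T_n(x) by the three-term
    recurrence T_{k+1}(x) = 2x*T_k(x) - T_{k-1}(x), with T_0 = 1, T_1 = x.
    On [-1, 1] the identity T_n(cos t) = cos(n*t) gives |T_n(x)| <= 1,
    unlike the monomials x**n of a Taylor expansion."""
    t_prev, t = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t = t, 2.0 * x * t - t_prev
    return t
```

A (1,1) rational Chebyshev approximant as used in the paper is then just a ratio of two first-degree combinations of these basis functions.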
NASA Astrophysics Data System (ADS)
Santabarbara, Ignacio; Haas, Edwin; Kraus, David; Herrera, Saul; Klatt, Steffen; Kiese, Ralf
2014-05-01
When using biogeochemical models to estimate greenhouse gas emissions at site to regional/national levels, the assessment and quantification of the uncertainties of simulation results are of significant importance. The uncertainties in simulation results of process-based ecosystem models may result from uncertainties of the process parameters that describe the processes of the model, from model structure inadequacy, as well as from uncertainties in the observations. Data for development and testing of the uncertainty analysis were crop yield observations and measurements of soil fluxes of nitrous oxide (N2O) and carbon dioxide (CO2) from 8 arable sites across Europe. Using the process-based biogeochemical model LandscapeDNDC for simulating crop yields and N2O and CO2 emissions, our aim is to assess the simulation uncertainty by setting up a Bayesian framework based on the Metropolis-Hastings algorithm. Using the Gelman convergence criterion and parallel computing techniques, we run multiple Markov chains independently in parallel, each performing a random walk to estimate the joint model parameter distribution. From this distribution we limit the parameter space, obtain probabilities of parameter values, and find the complex dependencies among them. With this parameter distribution, which determines soil-atmosphere C and N exchange, we are able to obtain the parameter-induced uncertainty of simulation results and compare it with the measurement data.
NASA Astrophysics Data System (ADS)
Santos-Filho, J. B.; Plascak, J. A.
2017-09-01
The XY vectorial generalization of the Blume-Emery-Griffiths (XY-VBEG) model, which is suitable for application to the study of 3He-4He mixtures, is treated on thin-film structures, and its thermodynamic properties are analyzed as a function of the film thickness. We employ extensive and up-to-date Monte Carlo simulations consisting of hybrid algorithms combining lattice-gas moves, Metropolis, Wolff, and super-relaxation procedures to overcome the critical slowing down and correlations among different spin configurations of the system. We also make use of single-histogram techniques to obtain the behavior of the thermodynamic quantities close to the corresponding transition temperatures. Thin films of the XY-VBEG model present a quite rich phase diagram with Berezinskii-Kosterlitz-Thouless (BKT) transitions, BKT endpoints, and isolated critical points. As one varies the impurity concentrations along the layers, and in the limit of infinite film thickness, there is a coalescence of the BKT transition endpoint and the isolated critical point into a single, unique tricritical point. In addition, when mimicking the behavior of thin films of 3He-4He mixtures, one finds that the concentration of 3He atoms decreases from the outer layers to the inner layers of the film, meaning that the superfluid particles tend to locate in the bulk of the system.
NASA Astrophysics Data System (ADS)
Jones, Alan G.; Afonso, Juan Carlos; Fullea, Javier
2015-04-01
The deep mantle African Superswell is thought to cause up to 500 m of the uplift of the Southern African Plateau. We investigate this phenomenon through stochastic thermo-chemical inversion modelling of the geoid, surface heat flow, Rayleigh and Love dispersion curves, and MT data, in a manner that is fully petrologically consistent. We invert for a three-layer crustal velocity, density, and thermal structure, but assume the resistivity layering (based on prior inversion of the MT data alone). Inversions are performed using an improved Delayed Rejection and Adaptive Metropolis (DRAM) type Markov chain Monte Carlo (MCMC) algorithm. We demonstrate that a single-layer lithosphere can fit most of the data, but not the MT responses. We further demonstrate that modelling the seismic data alone, without the constraint of requiring reasonable oxide chemistry or of fitting the geoid, permits a wide range of acceptable elevations and a very poorly defined lithosphere-asthenosphere boundary (LAB). We parameterise the lithosphere into three layers, and bound the permitted oxide chemistry of each layer consistent with known chemical layering. We find acceptable models, from 5 million tested in each case, that fit all responses and yield a posteriori elevation distributions centred on 900-950 m, suggesting dynamic support from the lower mantle of some 400 m.
A Bayesian hierarchical model with novel prior specifications for estimating HIV testing rates
An, Qian; Kang, Jian; Song, Ruiguang; Hall, H. Irene
2016-01-01
Human immunodeficiency virus (HIV) infection is a severe infectious disease actively spreading globally, and acquired immunodeficiency syndrome (AIDS) is an advanced stage of HIV infection. The HIV testing rate, that is, the probability that an AIDS-free HIV infected person seeks a test for HIV during a particular time interval, given no previous positive test has been obtained prior to the start of the time, is an important parameter for public health. In this paper, we propose a Bayesian hierarchical model with two levels of hierarchy to estimate the HIV testing rate using annual AIDS and AIDS-free HIV diagnoses data. At level one, we model the latent number of HIV infections for each year using a Poisson distribution with the intensity parameter representing the HIV incidence rate. At level two, the annual numbers of AIDS and AIDS-free HIV diagnosed cases and all undiagnosed cases stratified by the HIV infections at different years are modeled using a multinomial distribution with parameters including the HIV testing rate. We propose a new class of priors for the HIV incidence rate and HIV testing rate taking into account the temporal dependence of these parameters to improve the estimation accuracy. We develop an efficient posterior computation algorithm based on the adaptive rejection metropolis sampling technique. We demonstrate our model using simulation studies and the analysis of the national HIV surveillance data in the USA. PMID:26567891
ERIC Educational Resources Information Center
Akom, A. A.
2008-01-01
In this article, I reflect on Signithia Fordham and John Ogbu's classic research on the "burden of "acting White"" to develop a long overdue dialogue between Africana studies and critical white studies. It highlights the dialectical nature of Fordham and Ogbu's philosophy of race and critical race theory by locating the origins of the "burden of…
ERIC Educational Resources Information Center
Amosa, Abdul Ganiyu Alasela; Ogunlade, Oyeronke Olufunmilola; Atobatele, Adunni Suliat
2015-01-01
The use of field trip in teaching and learning helps to bring about effective and efficient learning in Basic Technology. Field trip is a group excursion away from the normal education environment for firsthand experience of an historic site or place of special interest. This study therefore was geared towards finding out the effect of field trip…
ERIC Educational Resources Information Center
Bonney, Ebenezer Appah; Amoah, Daniel F.; Micah, Sophia A.; Ahiamenyo, Comfort; Lemaire, Margaret B.
2015-01-01
The study investigated the relationship between the quality of teachers and students' academic performance in Sekondi Takoradi Metropolitan Assembly (STMA) Junior High Schools. A descriptive survey design was used, and the target population was Junior High School teachers and pupils in the metropolis. Five educational circuits in the metropolis…
ERIC Educational Resources Information Center
LAMANNA, RICHARD A.; SAMORA, JULIAN
MEXICAN AMERICANS WHO HAVE MIGRATED TO THE INDUSTRIAL COMPLEX OF EAST CHICAGO ARE ANALYZED TO DETERMINE THE VALIDITY OF A HYPOTHESIS THAT THIS GROUP WAS PROVIDED OPPORTUNITIES NOT AVAILABLE TO THEIR COUNTERPARTS IN THE SOUTHWEST FOR ASSIMILATION INTO THE COMMUNITY. A CONCISE REPORT ON THE HISTORY OF THE MEXICAN-AMERICAN COLONY IN EAST CHICAGO, ITS…
ERIC Educational Resources Information Center
Torto, Gertrude Afiba
2017-01-01
The English language curriculum for primary schools in Ghana spells out the various aspects, topics and sub topics that teachers must teach the child within a specified time. The syllabus again specifies the various topics and sub topics that should be taught in an integrated manner so as to enhance meaningful learning. For this meaningful…
Area Handbook Series: Paraguay: A Country Study
1988-12-01
fattening period for steers. Artificial insemination was increasingly common. To a certain extent, cattle raising reflected the disparities in agriculture...metropolis and satellite. Joseph had no constituency in Spanish America. Without a king, the entire colonial system lost its legitimacy, and the colonists...generally more dependable than local service because it used a microwave and satellite transmission system. Telex services also were available through
ERIC Educational Resources Information Center
Cobbold, Cosmas; Boateng, Philip
2016-01-01
The objective of the study was to investigate kindergarten teachers' efficacy beliefs in classroom management. The sample size was 299 teachers drawn from both public and private kindergarten schools in the Kumasi Metropolis of Ghana. The efficacy beliefs of the teachers with respect to their classroom management practices were measured on a…
ERIC Educational Resources Information Center
Achor, Emmanuel E.; Amadu, Samuel O.
2015-01-01
This study examined the extent to which school outdoor activities could enhance senior secondary (SS) two students' achievement in ecology. A non-randomized pretest-posttest control group quasi-experimental design was adopted. A sample of 160 SS II students from 4 co-educational schools in Jalingo metropolis, Taraba State, Nigeria was used. A 40…
ERIC Educational Resources Information Center
Durowoju, Esther O.; Onuka, Adams O. U.
2015-01-01
The paper investigated the effect of teacher self-efficacy enhancement and school location on students' achievement in Economics in Senior Secondary School in Ibadan Metropolis of Oyo State, Nigeria. Three hypotheses were tested at 0.05 level of significance. Multi-stage sampling technique was adopted in the study. Four Local Government Areas (two…
Sleep Pattern and Sleep Hygiene Practices among Nigerian Schooling Adolescents
Peter, Igoche David; Adamu, Halima; Asani, Mustafa O.; Aliyu, Ibrahim; Sabo, Umar A.; Umar, Umar I.
2017-01-01
Background: Sleep problems, especially in the adolescent stage of development, may be associated with excessive daytime sleepiness, impaired neurocognitive function, and a host of other problems leading to suboptimal performance. Objectives: To determine the pattern of sleep problems in school-going adolescents based on the Bedtime problems; Excessive daytime sleepiness; Awakenings during the night and problems falling back asleep; Regularity and duration of sleep; Sleep-disordered breathing (BEARS) sleep screening algorithm. Materials and Methods: This is a cross-sectional descriptive study involving 353 secondary school-going adolescents in Kano metropolis. Subjects were selected for the study using a multistage sampling technique. The study lasted from March 2015 to July 2015. Sleep problems were screened for using the BEARS sleep screening algorithm. Tables were used to present the qualitative data. The various BEARS sleep patterns were assessed, and comparison between stages of adolescence was done using the Chi-square test (and Fisher's exact test where necessary). A significant association was considered at P < 0.05. Results: Of the 353 adolescents studied, 61.8% were males while 38.2% were females, with a male-to-female ratio of 1.6:1. Early, middle, and late adolescents constituted 13.9%, 39.9%, and 46.2%, respectively. BEARS sleep screening revealed awakenings during the night (34.6%) as the most common sleep-related problem reported, followed by excessive daytime sleepiness (21.0%). Sleep duration by age group was 7.19 ± 1.26, 7.13 ± 1.13, and 7.16 ± 1.28 hours, respectively, with P > 0.05. Although 62.9% of all the adolescents watched TV/played video games until 1 h before going to bed, and this was highest in late adolescence, it was not statistically significantly associated with any of the sleep problems. Conclusion: Both the quality and quantity of sleep of Nigerian adolescents in Kano are suboptimal. Adolescent and sleep medicine should receive more attention in our environment.
PMID:28852230
Stochastic many-body perturbation theory for anharmonic molecular vibrations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hermes, Matthew R.; Hirata, So, E-mail: sohirata@illinois.edu; CREST, Japan Science and Technology Agency, 4-1-8 Honcho, Kawaguchi, Saitama 332-0012
2014-08-28
A new quantum Monte Carlo (QMC) method for anharmonic vibrational zero-point energies and transition frequencies is developed, which combines the diagrammatic vibrational many-body perturbation theory based on the Dyson equation with Monte Carlo integration. The infinite sums of the diagrammatic and thus size-consistent first- and second-order anharmonic corrections to the energy and self-energy are expressed as sums of a few m- or 2m-dimensional integrals of wave functions and a potential energy surface (PES) (m is the number of vibrational degrees of freedom). Each of these integrals is computed as the integrand (including the value of the PES) divided by the value of a judiciously chosen weight function evaluated on demand at geometries distributed randomly but according to the weight function via the Metropolis algorithm. In this way, the method completely avoids cumbersome evaluation and storage of high-order force constants necessary in the original formulation of the vibrational perturbation theory; it furthermore allows even higher-order force constants essentially up to an infinite order to be taken into account in a scalable, memory-efficient algorithm. The diagrammatic contributions to the frequency-dependent self-energies that are stochastically evaluated at discrete frequencies can be reliably interpolated, allowing the self-consistent solutions to the Dyson equation to be obtained. This method, therefore, can compute directly and stochastically the transition frequencies of fundamentals and overtones as well as their relative intensities as pole strengths, without fixed-node errors that plague some QMC. It is shown that, for an identical PES, the new method reproduces the correct deterministic values of the energies and frequencies within a few cm{sup −1} and pole strengths within a few thousandths. With the values of a PES evaluated on the fly at random geometries, the new method captures a noticeably greater proportion of anharmonic effects.
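The core trick, averaging the integrand divided by a weight function over points drawn from that weight via the Metropolis algorithm, can be illustrated on a one-dimensional integral (a toy Gaussian weight standing in for the judiciously chosen weight function, not a vibrational PES; all names are ours):

```python
import math
import random

def metropolis_weighted_integral(g, log_w, w, n=200000, step=1.0, seed=0):
    """Estimate the integral of g(x) dx as the sample mean of g(x)/w(x),
    with x drawn from the normalized weight density w by a Metropolis
    random walk. Only log_w (known up to a constant) drives the walk;
    the normalized w enters the ratio."""
    rng = random.Random(seed)
    x, lw = 0.0, log_w(0.0)
    total = 0.0
    for _ in range(n):
        prop = x + rng.uniform(-step, step)
        lw_prop = log_w(prop)
        if rng.random() < math.exp(min(0.0, lw_prop - lw)):
            x, lw = prop, lw_prop
        total += g(x) / w(x)
    return total / n

# Weight: standard normal density; integrand: exp(-x^2),
# whose exact integral over the real line is sqrt(pi).
w = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
est = metropolis_weighted_integral(lambda x: math.exp(-x * x),
                                   lambda x: -0.5 * x * x, w)
```

The closer the weight tracks the integrand, the lower the variance of the ratio; the paper's weight functions are chosen on exactly that basis.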
Surface Segregation in Ternary Alloys
NASA Technical Reports Server (NTRS)
Good, Brian; Bozzolo, Guillermo H.; Abel, Phillip B.
2000-01-01
Surface segregation profiles of binary (Cu-Ni, Au-Ni, Cu-Au) and ternary (Cu-Au-Ni) alloys are determined via Monte Carlo-Metropolis computer simulations using the BFS method for alloys for the calculation of the energetics. The behavior of Cu or Au in Ni is contrasted with their behavior when both are present. The interaction between Cu and Au and its effect on the segregation profiles for Cu-Au-Ni alloys is discussed.
ERIC Educational Resources Information Center
African-American Inst., New York, NY. School Services Div.
Four modules dealing with African culture are combined in this document. The first module discusses various life-styles of African women, including warrior, queen, ruler, and matriarch. A lesson plan uses a question-and-answer format to encourage discussion of the effects of tradition, society, and nation upon African women. Questions asked…
ERIC Educational Resources Information Center
Wahab, E. O.; Ajiboye, O. E.; Atere, A. A.
2011-01-01
This research is motivated as a result of rapid changes in the lifestyle of our youths and the increasing deterioration of their welfare in terms of the increase in the number of out of school youth in the country, high incidence of child participation in economic activities and incidence of street children in Nigeria. Although, many researches…
ERIC Educational Resources Information Center
Arhin, Ato Kwamina
2015-01-01
The study was a quasi-experimental research project conducted to investigate the effect of performance assessment-driven instructions on the attitude and achievement in mathematics of senior high school students in Ghana at Ghana National College in Cape Coast. Two Form 1 science classes were used for the study and were assigned as experimental…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hou Fengji; Hogg, David W.; Goodman, Jonathan
Markov chain Monte Carlo (MCMC) proves to be powerful for Bayesian inference and in particular for exoplanet radial velocity fitting because MCMC provides more statistical information and makes better use of data than common approaches like chi-square fitting. However, the nonlinear density functions encountered in these problems can make MCMC time-consuming. In this paper, we apply an ensemble sampler respecting affine invariance to orbital parameter extraction from radial velocity data. This new sampler has only one free parameter, and does not require much tuning for good performance, which is important for automatization. The autocorrelation time of this sampler is approximately the same for all parameters and far smaller than Metropolis-Hastings, which means it requires many fewer function calls to produce the same number of independent samples. The affine-invariant sampler speeds up MCMC by hundreds of times compared with Metropolis-Hastings in the same computing situation. This novel sampler would be ideal for projects involving large data sets such as statistical investigations of planet distribution. The biggest obstacle to ensemble samplers is the existence of multiple local optima; we present a clustering technique to deal with local optima by clustering based on the likelihood of the walkers in the ensemble. We demonstrate the effectiveness of the sampler on real radial velocity data.
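The affine-invariant "stretch move" with its single free parameter a can be sketched for a one-dimensional target as follows (a generic illustration in the style of Goodman and Weare, not the authors' exoplanet code; walker counts and targets are illustrative):

```python
import math
import random

def stretch_move_sampler(log_p, n_walkers=20, n_steps=1500, burn=500,
                         a=2.0, seed=0):
    """Goodman-Weare affine-invariant ensemble sampler for a 1-D target.
    Each walker is updated by stretching toward a randomly chosen
    partner; the stretch scale a is the sampler's only free parameter."""
    rng = random.Random(seed)
    walkers = [rng.uniform(-1.0, 1.0) for _ in range(n_walkers)]
    logs = [log_p(x) for x in walkers]
    out = []
    for step in range(n_steps):
        for k in range(n_walkers):
            j = rng.randrange(n_walkers - 1)
            if j >= k:
                j += 1                       # pick a different walker
            # Draw z from g(z) ~ 1/sqrt(z) on [1/a, a]
            z = ((a - 1.0) * rng.random() + 1.0) ** 2 / a
            prop = walkers[j] + z * (walkers[k] - walkers[j])
            lp = log_p(prop)
            # General acceptance factor is z**(d-1) * p(prop)/p(cur);
            # in d = 1 dimension the z factor is 1.
            if rng.random() < math.exp(min(0.0, lp - logs[k])):
                walkers[k], logs[k] = prop, lp
        if step >= burn:
            out.extend(walkers)
    return out

chain = stretch_move_sampler(lambda x: -0.5 * x * x)   # standard normal
m = sum(chain) / len(chain)
v = sum((s - m) ** 2 for s in chain) / len(chain)
```

Because the move is constructed from differences between walkers, it is invariant under affine rescalings of the parameter space, which is why badly scaled, correlated posteriors need no per-parameter tuning.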
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalkwarf, D.R.
1980-05-01
Airborne uranium products were collected at the perimeter of the uranium-conversion plant operated by the Allied Chemical Corporation at Metropolis, Illinois, and the dissolution rates of these products were classified in terms of the ICRP Task Group Lung Model. Assignments were based on measurements of the dissolution half-times exhibited by uranium components of the dust samples as they dissolved in simulated lung fluid at 37 °C. Based on three trials, the dissolution behavior of dust with aerodynamic equivalent diameter (AED) less than 5.5 μm and collected nearest the closest residence to the plant was classified 0.40 D, 0.60 Y. Based on two trials, the dissolution behavior of dust with AED greater than 5.5 μm and collected at this location was classified 0.37 D, 0.63 Y. Based on one trial, the dissolution behavior of dust with AED less than 5.5 μm and collected at a location on the opposite side of the plant was classified 0.68 D, 0.32 Y. There was some evidence for adsorption of dissolved uranium onto other dust components during dissolution, and preliminary dissolution trials are recommended for future samples in order to optimize the fluid replacement schedule.
NASA Astrophysics Data System (ADS)
Shepherd, J. Marshall; Pierce, Harold; Negri, Andrew J.
2002-07-01
Data from the Tropical Rainfall Measuring Mission (TRMM) satellite's precipitation radar (PR) were employed to identify warm-season rainfall (1998-2000) patterns around Atlanta, Georgia; Montgomery, Alabama; Nashville, Tennessee; and San Antonio, Waco, and Dallas, Texas. Results reveal an average increase of about 28% in monthly rainfall rates within 30-60 km downwind of the metropolis, with a modest increase of 5.6% over the metropolis. Portions of the downwind area exhibit increases as high as 51%. The percentage changes are relative to an upwind control area. It was also found that maximum rainfall rates in the downwind impact area exceeded the mean value in the upwind control area by 48%-116%. The maximum value was generally found at an average distance of 39 km from the edge of the urban center or 64 km from the center of the city. Results are consistent with the Metropolitan Meteorological Experiment (METROMEX) studies of St. Louis, Missouri, almost two decades ago and with more recent studies near Atlanta. The study establishes the possibility of utilizing satellite-based rainfall estimates for examining rainfall modification by urban areas on global scales and over longer time periods. Such research has implications for weather forecasting, urban planning, water resource management, and understanding human impact on the environment and climate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Huiying; Ray, Jaideep; Hou, Zhangshuan
In this study we developed an efficient Bayesian inversion framework for interpreting marine seismic amplitude versus angle (AVA) and controlled source electromagnetic (CSEM) data for marine reservoir characterization. The framework uses a multi-chain Markov-chain Monte Carlo (MCMC) sampler, which is a hybrid of DiffeRential Evolution Adaptive Metropolis (DREAM) and Adaptive Metropolis (AM) samplers. The inversion framework is tested by estimating reservoir-fluid saturations and porosity based on marine seismic and CSEM data. The multi-chain MCMC is scalable in terms of the number of chains, and is useful for computationally demanding Bayesian model calibration in scientific and engineering problems. As a demonstration, the approach is used to efficiently and accurately estimate the porosity and saturations in a representative layered synthetic reservoir. The results indicate that the seismic AVA and CSEM joint inversion provides better estimation of reservoir saturations than the seismic AVA-only inversion, especially for the parameters in deep layers. The performance of the inversion approach for various levels of noise in observational data was evaluated – reasonable estimates can be obtained with noise levels up to 25%. Sampling efficiency due to the use of multiple chains was also checked and was found to have almost linear scalability.
Determinants of parents' decisions on childhood immunisations at Kumasi Metropolis in Ghana.
Hagan, Doris; Phethlu, Deliwe R
2016-07-29
To describe factors that influence parents' decisions on childhood immunisations at Kumasi Metropolis in Ghana. Quantitative cross-sectional survey. A sample of 303 parents was obtained from a monthly accessible population of 1420 individuals from the five district hospitals through convenience sampling of respondents at immunisation sessions in Kumasi. Data obtained from the survey were analysed with SPSS version 21 software. Most parents were aware of child immunisations, but they had limited knowledge of vaccines and immunisation schedules. Antenatal nurses constituted the most accessible source of vaccine information. The study established a high percentage of complete immunisation, influenced by parents' fear of their children contracting vaccine-preventable diseases. Remarkably, some parents indicated that they immunised their children because they wanted to know the weight of their children. Forgetfulness and lack of personnel or vaccine at the centres were the reasons given by the few parents who could not complete immunisation schedules for their children, whereas the socio-demographic variables considered did not influence parents' decision on immunisation. Knowledge of immunisation alone did not influence immunisation decisions, but parents' fear of vaccine-preventable diseases, awareness of the benefits of immunisations and sources of vaccine information were the main factors that influenced immunisation decisions at Kumasi in Ghana.
NASA Astrophysics Data System (ADS)
Ren, Huiying; Ray, Jaideep; Hou, Zhangshuan; Huang, Maoyi; Bao, Jie; Swiler, Laura
2017-12-01
In this study we developed an efficient Bayesian inversion framework for interpreting marine seismic amplitude versus angle (AVA) and controlled-source electromagnetic (CSEM) data for marine reservoir characterization. The framework uses a multi-chain Markov-chain Monte Carlo (MCMC) sampler, which is a hybrid of the DiffeRential Evolution Adaptive Metropolis (DREAM) and Adaptive Metropolis (AM) samplers. The inversion framework is tested by estimating reservoir-fluid saturations and porosity from marine seismic and CSEM data. The multi-chain MCMC sampler is scalable in terms of the number of chains, and is useful for computationally demanding Bayesian model calibration in scientific and engineering problems. As a demonstration, the approach is used to efficiently and accurately estimate the porosity and saturations in a representative layered synthetic reservoir. The results indicate that joint inversion of the seismic AVA and CSEM data provides better estimates of reservoir saturations than the seismic AVA-only inversion, especially for the parameters in deep layers. The performance of the inversion approach was evaluated for various levels of noise in the observational data: reasonable estimates can be obtained with noise levels up to 25%. Sampling efficiency due to the use of multiple chains was also checked and was found to scale almost linearly.
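The Adaptive Metropolis half of the hybrid sampler tunes its random-walk proposal from the chain's own history. A minimal one-dimensional sketch of that idea (the full AM algorithm of Haario et al. adapts the entire proposal covariance; the target density and all tuning values below are illustrative, not the authors' code):

```python
import math
import random

def adaptive_metropolis(logpost, x0, n_steps, adapt_start=200, sd0=0.5):
    """Random-walk Metropolis whose proposal scale adapts to the running
    sample standard deviation of the chain (1-D sketch of Adaptive
    Metropolis; the full algorithm adapts a covariance matrix)."""
    x, chain = x0, []
    mean, m2 = 0.0, 0.0
    for t in range(1, n_steps + 1):
        # after a short burn-in, scale proposals by 2.38 * running std. dev.
        sd = sd0 if t <= adapt_start else 2.38 * math.sqrt(m2 / t) + 1e-6
        z = x + random.gauss(0.0, sd)
        if math.log(random.random()) < logpost(z) - logpost(x):
            x = z
        chain.append(x)
        # Welford update of the running mean and variance
        d = x - mean
        mean += d / t
        m2 += d * (x - mean)
    return chain
```

Run against a Gaussian log-posterior, the proposal scale settles near the optimal random-walk scaling without hand tuning.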
Volatility modeling for IDR exchange rate through APARCH model with student-t distribution
NASA Astrophysics Data System (ADS)
Nugroho, Didit Budi; Susanto, Bambang
2017-08-01
The aim of this study is to empirically investigate the performance of the APARCH(1,1) volatility model with the Student-t error distribution on five foreign-currency selling rates against the Indonesian rupiah (IDR): the Swiss franc (CHF), the euro (EUR), the British pound (GBP), the Japanese yen (JPY), and the US dollar (USD). Six years of daily closing rates over the period January 2010 to December 2016, for a total of 1722 observations, were analysed. Bayesian inference using the efficient independence-chain Metropolis-Hastings and adaptive random-walk Metropolis methods in a Markov chain Monte Carlo (MCMC) scheme was applied to estimate the model parameters. According to the DIC criterion, this study found that the APARCH(1,1) model under the Student-t distribution fits every observed rate-return series better than the model under the normal distribution. The 95% highest posterior density intervals supported the APARCH models for the IDR/JPY and IDR/USD volatilities. In particular, the IDR/JPY and IDR/USD rate returns show significant negative and positive leverage effects, respectively. Meanwhile, the optimal power coefficient of volatility was found to be statistically different from 2 for all rate-return series except the IDR/EUR series.
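The conditional-volatility recursion behind the APARCH(1,1) model can be sketched directly; the parameter values used below are illustrative, not the paper's posterior estimates:

```python
def aparch_volatility(returns, omega, alpha, gamma, beta, delta, sigma0):
    """APARCH(1,1) volatility path:
        sigma_t^delta = omega + alpha * (|r_{t-1}| - gamma * r_{t-1})**delta
                              + beta * sigma_{t-1}^delta
    gamma > 0 gives a negative leverage effect (negative returns raise
    volatility more); delta = 2 and gamma = 0 recover plain GARCH(1,1)."""
    sigmas = [sigma0]
    for r in returns[:-1]:
        s_pow = (omega
                 + alpha * (abs(r) - gamma * r) ** delta
                 + beta * sigmas[-1] ** delta)
        sigmas.append(s_pow ** (1.0 / delta))
    return sigmas
```

With gamma > 0, a negative shock of a given size produces a larger next-step volatility than a positive shock of the same size, which is the leverage asymmetry the study tests for.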
Pesewu, George A; Bentum, Daniel; Olu-Taiwo, Michael A; Glover, Kathreen K; Yirenya-Tawiah, Dzidzo R
2017-01-01
Many developing countries, including Ghana, are water stressed. As such, farmers, particularly those in urban areas, have adopted the use of wastewater for irrigation. This study evaluated the bacteriological quality of the wastewater used for irrigation in the vegetable farms at Korle-Bu Teaching Hospital (KBTH), Accra Metropolis, Ghana. In all, 40 wastewater samples were collected and analysed bacteriologically using the total aerobic plate count method. The isolated bacteria were identified biochemically using Bergey's Manual of Determinative Bacteriology. Mean total bacterial colony counts in the range of 2.75-4.44 × 10⁵ CFU/100 mL were obtained, which far exceed the value of 1 × 10³ CFU/100 mL recommended by the World Health Organization (WHO) for unrestricted irrigation of crops likely to be eaten raw. Enterobacter cloacae (51.4%), Klebsiella sp. (24.1%), Pseudomonas aeruginosa (11.3%), Salmonella typhi (10.6%), Escherichia coli (2.2%) and Proteus sp. (0.4%) were the predominant bacteria isolated. Growers should use treated wastewater for farming, while processors and consumers should minimize contamination risks of produce from the vegetable farms/gardens to the plate. © The Author(s) 2016.
Wemakor, Anthony; Iddrisu, Habib
2018-06-25
Maternal depression may affect child feeding practices, an important determinant of child nutritional status. The objective of this study was to explore the association between maternal depression and the WHO complementary feeding indicators [minimum dietary diversity (MDD), minimum meal frequency (MMF) and minimum acceptable diet (MAD)] or stunting status of children (6-23 months) in Tamale Metropolis, Ghana. A community-based cross-sectional study was carried out involving 200 mother-child pairs randomly sampled from three communities in Tamale Metropolis, Ghana. The prevalence of MDD, MMF, and MAD was 56.5, 65.0, and 44.0% respectively, and 41.0% of the children sampled were stunted. A third of the mothers (33.5%) screened positive for depression. Maternal depression did not significantly influence MDD (p = 0.245), MMF (p = 0.442), and MAD (p = 0.885) or children's risk of stunting (p = 0.872). In conclusion, maternal depression and child stunting are prevalent in Northern Ghana, but this study found no evidence of an association between maternal depression and child feeding practices or nutritional status. Further research is needed to assess the effect of maternal depression on feeding practices and growth of young children.
NASA Astrophysics Data System (ADS)
Adeniran, J. A.; Yusuf, R. O.; Olajire, A. A.
2017-10-01
This study aims to determine the seasonal variations and composition of suspended particulate matter in different size fractions (PM1.0, PM2.5, PM10) and the total suspended particles (TSP) emitted at major intra-urban traffic intersections (TIs) of Ilorin metropolis. The concentration levels of PM (PM1.0, PM2.5, PM10) obtained at the TIs during rush hours (45.1, 77.9, and 513 μg/m³) are higher than those obtained during non-rush-hour periods (42.3, 62.7, and 390 μg/m³). The average on-road respiratory deposition dose (RDD) rates of PM1.0, PM2.5 and PM10 at the TIs during the dry period were found to be about 24%, 9% and 25% higher than those obtained during the wet period. Based on the calculated enrichment factor (EF) values, Pb and Zn were anthropogenically derived while Fe, Mn, Cr, Cu and Mg were of crustal origin. Principal component analysis (PCA) was applied to the PM data set in order to determine the contributions of different sources. The main principal factors extracted from the particulate emission data were related to exhaust and non-exhaust emissions such as tyre wear, oil and fuel combustion sources.
Water budget analysis and management for Bangkok Metropolis, Thailand.
Singkran, Nuanchan
2017-09-01
The water budget of the Bangkok Metropolis system was analyzed using a material flow analysis model. Total imported flows into the system were 80,080 million m³ per year (Mm³ y⁻¹), including inflows from the Chao Phraya and Mae Klong rivers and rainwater. Total exported flows out of the system were 78,528 Mm³ y⁻¹, including outflow into the lower Chao Phraya River and tap water (TW) distributed to suburbs. Total rates of stock exchange (1,552 Mm³ y⁻¹) were found in the processes of water recycling, TW distribution, domestic use, swine farming, aquaculture, and paddy fields. Only 21% of the total amount of wastewater (1,255 Mm³ y⁻¹) was collected, with insufficient treatment capacity of about 415 Mm³ y⁻¹. Domestic and business (industrial and commercial sectors) areas were major point sources, whereas paddy fields were a major non-point source of wastewater. To manage Bangkok's water budget, critical measures have to be considered. Wastewater treatment capacity and efficiency of wastewater collection should be improved. On-site wastewater treatment plants for residential areas should be installed. Urban planning and land use zoning are suggested to control land use activities. Green technology should be supported to reduce wastewater from farming.
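The reported budget closes exactly: imported minus exported flows equals the total rate of stock exchange, which is easily checked with the figures from the abstract (all in Mm³ per year):

```python
# Closure check on the reported Bangkok water budget (figures from the
# abstract above, in Mm^3 per year).
imported = 80_080   # Chao Phraya and Mae Klong inflows plus rainwater
exported = 78_528   # outflow to the lower Chao Phraya plus tap water to suburbs
stock_exchange = imported - exported

print(stock_exchange)  # 1552, matching the reported total rate of stock exchange
```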
NASA Astrophysics Data System (ADS)
Chen, Bingzhang; Smith, Sherwood Lan
2018-02-01
Diversity plays critical roles in ecosystem functioning, but it remains challenging to model phytoplankton diversity in order to better understand those roles and reproduce consistently observed diversity patterns in the ocean. In contrast to the typical approach of resolving distinct species or functional groups, we present a ContInuous TRAiT-basEd phytoplankton model (CITRATE) that focuses on macroscopic system properties such as total biomass, mean trait values, and trait variance. This phytoplankton component is embedded within a nitrogen-phytoplankton-zooplankton-detritus-iron model that itself is coupled with a simplified one-dimensional ocean model. Size is used as the master trait for phytoplankton. CITRATE also incorporates trait diffusion for sustaining diversity and simple representations of physiological acclimation, i.e., flexible chlorophyll-to-carbon and nitrogen-to-carbon ratios. We have implemented CITRATE at two contrasting stations in the North Pacific where several years of observational data are available. The model is driven by physical forcing, including vertical eddy diffusivity, imported from three-dimensional general ocean circulation models (GCMs). One common set of model parameters for the two stations is optimized using the Delayed Rejection Adaptive Metropolis (DRAM) algorithm. The model faithfully reproduces most of the observed patterns and gives robust predictions of phytoplankton mean size and size diversity. CITRATE is suitable for applications in GCMs and constitutes a prototype upon which more sophisticated continuous trait-based models can be developed.
NASA Astrophysics Data System (ADS)
Touhidul Mustafa, Syed Md.; Nossent, Jiri; Ghysels, Gert; Huysmans, Marijke
2017-04-01
Transient numerical groundwater flow models have been used to understand and forecast groundwater flow systems under anthropogenic and climatic effects, but the reliability of the predictions is strongly influenced by different sources of uncertainty. Hence, researchers in hydrological sciences are developing and applying methods for uncertainty quantification. Nevertheless, spatially distributed flow models pose significant challenges for parameter and spatially distributed input estimation and uncertainty quantification. In this study, we present a general and flexible approach for input and parameter estimation and uncertainty analysis of groundwater models. The proposed approach combines a fully distributed groundwater flow model (MODFLOW) with the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. To avoid over-parameterization, the uncertainty of the spatially distributed model input has been represented by multipliers. The posterior distributions of these multipliers and the regular model parameters were estimated using DREAM. The proposed methodology has been applied in an overexploited aquifer in Bangladesh where groundwater pumping and recharge data are highly uncertain. The results confirm that input uncertainty does have a considerable effect on the model predictions and parameter distributions. Additionally, our approach also provides a new way to optimize the spatially distributed recharge and pumping data along with the parameter values under uncertain input conditions. It can be concluded from our approach that considering model input uncertainty along with parameter uncertainty is important for obtaining realistic model predictions and a correct estimation of the uncertainty bounds.
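At the core of the DREAM algorithm used here is a differential-evolution proposal: each chain jumps along the scaled difference of two other chains' current states. A minimal sketch on a toy posterior (this is the generic DE-MC/DREAM proposal mechanism, not the authors' MODFLOW-coupled code; all names and values are illustrative):

```python
import math
import random

def de_proposal(chains, i, gamma=None, eps=1e-6):
    """Differential-evolution proposal for chain i:
    z = x_i + gamma * (x_a - x_b) + small noise,
    with a, b two other randomly chosen chains. gamma = 2.38/sqrt(2d)
    is the nominal DREAM jump rate for a d-dimensional problem."""
    d = len(chains[i])
    if gamma is None:
        gamma = 2.38 / math.sqrt(2 * d)
    a, b = random.sample([k for k in range(len(chains)) if k != i], 2)
    return [chains[i][j] + gamma * (chains[a][j] - chains[b][j])
            + random.gauss(0.0, eps) for j in range(d)]

def metropolis_accept(logpost, current, proposal):
    """Standard Metropolis accept/reject on log-posterior values."""
    return math.log(random.random()) < logpost(proposal) - logpost(current)

def sweep(chains, logpost):
    """One full multi-chain sweep: propose and accept/reject per chain."""
    for i in range(len(chains)):
        z = de_proposal(chains, i)
        if metropolis_accept(logpost, chains[i], z):
            chains[i] = z
    return chains
```

Because the proposal is built from the population of chains itself, its scale and orientation adapt automatically to the shape of the posterior, which is what makes the scheme effective for correlated, high-dimensional parameter estimation problems.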
A Bayesian hierarchical model with novel prior specifications for estimating HIV testing rates.
An, Qian; Kang, Jian; Song, Ruiguang; Hall, H Irene
2016-04-30
Human immunodeficiency virus (HIV) infection is a severe infectious disease actively spreading globally, and acquired immunodeficiency syndrome (AIDS) is an advanced stage of HIV infection. The HIV testing rate, that is, the probability that an AIDS-free HIV-infected person seeks a test for HIV during a particular time interval, given that no previous positive test has been obtained prior to the start of the interval, is an important parameter for public health. In this paper, we propose a Bayesian hierarchical model with two levels of hierarchy to estimate the HIV testing rate using annual AIDS and AIDS-free HIV diagnoses data. At level one, we model the latent number of HIV infections for each year using a Poisson distribution with the intensity parameter representing the HIV incidence rate. At level two, the annual numbers of AIDS and AIDS-free HIV diagnosed cases and all undiagnosed cases, stratified by the year of HIV infection, are modeled using a multinomial distribution with parameters including the HIV testing rate. We propose a new class of priors for the HIV incidence rate and HIV testing rate taking into account the temporal dependence of these parameters to improve the estimation accuracy. We develop an efficient posterior computation algorithm based on the adaptive rejection Metropolis sampling (ARMS) technique. We demonstrate our model using simulation studies and the analysis of the national HIV surveillance data in the USA. Copyright © 2015 John Wiley & Sons, Ltd.
Modelling malaria incidence by an autoregressive distributed lag model with spatial component.
Laguna, Francisco; Grillet, María Eugenia; León, José R; Ludeña, Carenne
2017-08-01
The influence of climatic variables on the dynamics of human malaria has been widely highlighted. It is also known that this mosquito-borne infection varies in space and time. However, when the data are spatially incomplete, the most popular spatio-temporal methods of analysis cannot be applied directly. In this paper, we develop a two-step methodology to model the spatio-temporal dependence of malaria incidence on local rainfall, temperature, and humidity as well as on regional sea surface temperatures (SST) on the northern coast of Venezuela. First, we fit an autoregressive distributed lag (ARDL) model to the weekly data, and then we fit a linear separable spatial vector autoregressive (VAR) model to the residuals of the ARDL. Finally, the model parameters are tuned using a Markov chain Monte Carlo (MCMC) procedure derived from the Metropolis-Hastings algorithm. Our results show that the best model to account for the variations of malaria incidence from 2001 to 2008 in 10 endemic municipalities in north-eastern Venezuela is a logit model that includes the accumulated local precipitation in combination with the local maximum temperature of the preceding month as positive regressors. Additionally, we show that although malaria dynamics are highly heterogeneous in space, a detailed analysis of the estimated spatial parameters in our model yields important insights regarding the joint behavior of the disease incidence across the different counties in our study. Copyright © 2017 Elsevier Ltd. All rights reserved.
GRID-BASED EXPLORATION OF COSMOLOGICAL PARAMETER SPACE WITH SNAKE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikkelsen, K.; Næss, S. K.; Eriksen, H. K., E-mail: kristin.mikkelsen@astro.uio.no
2013-11-10
We present a fully parallelized grid-based parameter estimation algorithm for investigating multidimensional likelihoods called Snake, and apply it to cosmological parameter estimation. The basic idea is to map out the likelihood grid-cell by grid-cell according to decreasing likelihood, and stop when a certain threshold has been reached. This approach improves vastly on the 'curse of dimensionality' problem plaguing standard grid-based parameter estimation simply by disregarding grid cells with negligible likelihood. The main advantages of this method compared to standard Metropolis-Hastings Markov Chain Monte Carlo methods include (1) trivial extraction of arbitrary conditional distributions; (2) direct access to Bayesian evidences; (3) better sampling of the tails of the distribution; and (4) nearly perfect parallelization scaling. The main disadvantage is, as in the case of brute-force grid-based evaluation, a dependency on the number of parameters, N_par. One of the main goals of the present paper is to determine how large N_par can be while still maintaining reasonable computational efficiency; we find that N_par = 12 is well within the capabilities of the method. The performance of the code is tested by comparing cosmological parameters estimated using Snake and the WMAP-7 data with those obtained using CosmoMC, the current standard code in the field. We find fully consistent results, with similar computational expenses, but shorter wall time due to the perfect parallelization scheme.
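The cell-by-cell expansion can be sketched with a priority queue: always expand the highest-likelihood unvisited neighbour first, and stop pushing cells once they fall below a fixed log-likelihood threshold relative to the running peak. This is an illustrative serial reimplementation of the idea, not the authors' Snake code:

```python
import heapq

def snake_explore(loglike, start, step, threshold):
    """Map out a likelihood grid cell-by-cell in order of decreasing
    log-likelihood. `start` is an integer grid cell (tuple), `step` the
    physical grid spacing per dimension, `threshold` the log-likelihood
    drop below the peak at which cells stop being expanded."""
    d = len(start)
    best = loglike([s * dx for s, dx in zip(start, step)])
    heap = [(-best, start)]          # max-heap via negated log-likelihoods
    visited = {start: best}
    while heap:
        negll, cell = heapq.heappop(heap)
        if -negll < best - threshold:
            break                    # everything left on the heap is below the cut
        for j in range(d):
            for delta in (-1, 1):    # the 2d axis-aligned neighbours
                nb = tuple(c + (delta if k == j else 0)
                           for k, c in enumerate(cell))
                if nb in visited:
                    continue
                ll = loglike([n * dx for n, dx in zip(nb, step)])
                visited[nb] = ll
                best = max(best, ll)
                if ll >= best - threshold:
                    heapq.heappush(heap, (-ll, nb))
    return visited
```

Cells with negligible likelihood are evaluated once at the frontier and never expanded, which is how the scheme sidesteps the cost of filling the full hypercube.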
NASA Astrophysics Data System (ADS)
Li, L.; Xu, C.-Y.; Engeland, K.
2012-04-01
With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method is one of the most widely used methods for uncertainty assessment of hydrological models; it incorporates different sources of information into a single analysis through Bayes' theorem. However, none of these applications treats well the uncertainty in the extreme flows of hydrological model simulations. This study proposes a Bayesian modularization approach for uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian method: the AR(1) plus Normal, time-period-independent model (Model 1); the AR(1) plus Normal, time-period-dependent model (Model 2); and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of entire flows and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via the Bayesian method. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
NASA Astrophysics Data System (ADS)
Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn
2013-04-01
With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches can well treat the impact of high flows in hydrological modeling. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by our new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates, as measured by the Nash-Sutcliffe efficiency, and provides the best uncertainty estimates for low, medium and entire flows compared to standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via the Bayesian method.
The behavior of Metropolis-coupled Markov chains when sampling rugged phylogenetic distributions.
Brown, Jeremy M; Thomson, Robert C
2018-02-15
Bayesian phylogenetic inference involves sampling from posterior distributions of trees, which sometimes exhibit local optima, or peaks, separated by regions of low posterior density. Markov chain Monte Carlo (MCMC) algorithms are the most widely used numerical method for generating samples from these posterior distributions, but they are susceptible to entrapment on individual optima in rugged distributions when they are unable to easily cross through or jump across regions of low posterior density. Ruggedness of posterior distributions can result from a variety of factors, including unmodeled variation in evolutionary processes and unrecognized variation in the true topology across sites or genes. Ruggedness can also become exaggerated when constraints are placed on topologies that require the presence or absence of particular bipartitions (often referred to as positive or negative constraints, respectively). These types of constraints are frequently employed when conducting tests of topological hypotheses (Bergsten et al. 2013; Brown and Thomson 2017). Negative constraints can lead to particularly rugged distributions when the data strongly support a forbidden clade, because monophyly of the clade can be disrupted by inserting outgroup taxa in many different ways. However, topological moves between the alternative disruptions are very difficult, because they require swaps between the inserted outgroup taxa while the data constrain taxa from the forbidden clade to remain close together on the tree. While this precise form of ruggedness is particular to negative constraints, trees with high posterior density can be separated by similarly complicated topological rearrangements, even in the absence of constraints.
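The entrapment problem described above is exactly what Metropolis coupling is designed to relieve: heated chains flatten the peaks and valleys, and state swaps let the cold chain jump between modes it could not cross on its own. A minimal sketch on a one-dimensional bimodal target (continuous states rather than tree topologies, and all temperatures and tuning values are illustrative):

```python
import math
import random

def mc3(logpost, n_steps, temps=(1.0, 0.5, 0.25), x0=0.0, prop_sd=1.0):
    """Metropolis-coupled MCMC: chain k targets logpost(x) * temps[k]
    (heated chains see a flattened surface); after each sweep, a swap
    between two adjacent temperatures is proposed. Only the cold chain
    (temps[0] = 1.0) yields valid posterior samples."""
    states = [x0] * len(temps)
    samples = []
    for _ in range(n_steps):
        # within-chain random-walk Metropolis update at each temperature
        for k, beta in enumerate(temps):
            z = states[k] + random.gauss(0.0, prop_sd)
            if math.log(random.random()) < beta * (logpost(z) - logpost(states[k])):
                states[k] = z
        # propose swapping the states of two adjacent temperatures
        k = random.randrange(len(temps) - 1)
        b1, b2 = temps[k], temps[k + 1]
        log_ratio = (b1 - b2) * (logpost(states[k + 1]) - logpost(states[k]))
        if math.log(random.random()) < log_ratio:
            states[k], states[k + 1] = states[k + 1], states[k]
        samples.append(states[0])
    return samples
```

The swap acceptance ratio follows from detailed balance on the product of the tempered targets, so the cold chain's stationary distribution is the untempered posterior.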
NASA Astrophysics Data System (ADS)
Tuve, T.; Mostaccio, A.; Langer, H. K.; di Grazia, G.
2005-12-01
A recent research project carried out together with the Italian Civil Protection concerns the study of amplitude decay laws in various areas of the Italian territory, including Mt Etna. A particular feature of the seismic activity there is the presence of moderate-magnitude earthquakes that frequently cause considerable damage in the epicentral areas. These earthquakes are supposed to occur at rather shallow depth, no more than 5 km. Given the geological context, however, these shallow earthquakes would originate in rather weak sedimentary material. In this study we check the reliability of standard earthquake location, in particular with respect to the calculated focal depth, using standard location methods as well as more advanced approaches such as the NONLINLOC software proposed by Lomax et al. (2000), used with its various options (i.e., Grid Search, Metropolis-Gibbs and Oct-Tree) and a 3D velocity model (Cocina et al., 2005). All three options of NONLINLOC gave comparable results with respect to hypocentre locations and quality. Compared to standard locations we note a significant improvement in location quality and, in particular, a considerable difference in focal depths (on the order of 1.5-2 km). However, we cannot find a clear bias towards greater or lower depth. Further analyses concern the assessment of the stability of the locations. For this purpose we carry out various Monte Carlo experiments, randomly perturbing the travel-time readings. Further investigations are devoted to possible biases which may arise from the use of an unsuitable velocity model.
NASA Astrophysics Data System (ADS)
Rosat, S.; Lambert, S. B.; Gattano, C.; Calvo, M.
2017-01-01
Geophysical parameters of the deep Earth's interior can be evaluated through the resonance effects associated with the core and inner-core wobbles on the forced nutations of the Earth's figure axis, as observed by very long baseline interferometry (VLBI), or on the diurnal tidal waves, retrieved from the time-varying surface gravity recorded by superconducting gravimeters (SGs). In this paper, we invert for the rotational mode parameters from both techniques to retrieve geophysical parameters of the deep Earth. We analyse surface gravity data from 15 SG stations and VLBI delays accumulated over the last 35 yr. We show existing correlations between several basic Earth parameters and therefore decide to invert for the rotational mode parameters. We employ a Bayesian inversion based on the Metropolis-Hastings algorithm with a Markov-chain Monte Carlo method. We obtain estimates of the free core nutation resonant period and quality factor that are consistent between the two techniques. We also attempt an inversion for the free inner-core nutation (FICN) resonant period from the gravity data. The most probable solution gives a period close to the annual prograde term (or S1 tide). However, the 95 per cent confidence interval extends the possible values between roughly 28 and 725 d for gravity, and from 362 to 414 d for nutation data, depending on the prior bounds. The precisions of the estimated long-period nutation and respective small diurnal tidal constituents are hence not sufficient for a correct determination of the FICN complex frequency.
Germinal center reentries of BCL2-overexpressing B cells drive follicular lymphoma progression
Sungalee, Stéphanie; Mamessier, Emilie; Morgado, Ester; Grégoire, Emilie; Brohawn, Philip Z.; Morehouse, Christopher A.; Jouve, Nathalie; Monvoisin, Céline; Menard, Cédric; Debroas, Guilhaume; Faroudi, Mustapha; Mechin, Violaine; Navarro, Jean-Marc; Drevet, Charlotte; Eberle, Franziska C.; Chasson, Lionel; Baudimont, Fannie; Mancini, Stéphane J.; Tellier, Julie; Picquenot, Jean-Michel; Kelly, Rachel; Vineis, Paolo; Ruminy, Philippe; Chetaille, Bruno; Jaffe, Elaine S.; Schiff, Claudine; Hardwigsen, Jean; Tice, David A.; Higgs, Brandon W.; Tarte, Karin; Nadel, Bertrand; Roulland, Sandrine
2014-01-01
It has recently been demonstrated that memory B cells can reenter and reengage germinal center (GC) reactions, opening the possibility that multi-hit lymphomagenesis gradually occurs throughout life during successive immunological challenges. Here, we investigated this scenario in follicular lymphoma (FL), an indolent GC-derived malignancy. We developed a mouse model that recapitulates the FL hallmark t(14;18) translocation, which results in constitutive activation of antiapoptotic protein B cell lymphoma 2 (BCL2) in a subset of B cells, and applied a combination of molecular and immunofluorescence approaches to track normal and t(14;18)+ memory B cells in human and BCL2-overexpressing B cells in murine lymphoid tissues. BCL2-overexpressing B cells required multiple GC transits before acquiring FL-associated developmental arrest and presenting as GC B cells with constitutive activation-induced cytidine deaminase (AID) mutator activity. Moreover, multiple reentries into the GC were necessary for the progression to advanced precursor stages of FL. Together, our results demonstrate that protracted subversion of immune dynamics contributes to early dissemination and progression of t(14;18)+ precursors and shapes the systemic presentation of FL patients. PMID:25384217
Chronic inflammation and impaired development of the preterm brain.
Bennet, Laura; Dhillon, Simerdeep; Lear, Chris A; van den Heuij, Lotte; King, Victoria; Dean, Justin M; Wassink, Guido; Davidson, Joanne O; Gunn, Alistair Jan
2018-02-01
The preterm newborn is at significant risk of neural injury and impaired neurodevelopment. Infants with mild or no evidence of injury may also be at risk of altered brain development, with evidence of impaired cell maturation. The underlying causes are multifactorial and include exposure of both the fetus and newborn to hypoxia-ischemia, inflammation (chorioamnionitis) and infection, adverse maternal lifestyle choices (smoking, drug and alcohol use, diet) and obesity, as well as the significant demand that adaptation to post-natal life places on immature organs. Further, many fetuses and infants may experience combinations of these events, and repeated (multi-hit) events that may induce tolerance to injury or sensitize to greater injury. Currently there are no treatments to prevent preterm injury or impaired neurodevelopment. However, inflammation is a common pathway for many of these insults, and clinical and experimental evidence demonstrates that acute and chronic inflammation is associated with impaired brain development. This review examines our current knowledge about the relationship between inflammation and preterm brain development, and the potential for stem cell therapy to provide neuroprotection and neurorepair through reducing inflammation and releasing trophic factors, which promote cell maturation and repair. Copyright © 2017 Elsevier B.V. All rights reserved.
Translational Control in Cancer Etiology
Ruggero, Davide
2013-01-01
The link between perturbations in translational control and cancer etiology is becoming a primary focus in cancer research. It has now been established that genetic alterations in several components of the translational apparatus underlie spontaneous cancers as well as an entire class of inherited syndromes known as “ribosomopathies” associated with increased cancer susceptibility. These discoveries have illuminated the importance of deregulations in translational control to very specific cellular processes that contribute to cancer etiology. In addition, a growing body of evidence supports the view that deregulation of translational control is a common mechanism by which diverse oncogenic pathways promote cellular transformation and tumor development. Indeed, activation of these key oncogenic pathways induces rapid and dramatic translational reprogramming both by increasing overall protein synthesis and by modulating specific mRNA networks. These translational changes promote cellular transformation, impacting almost every phase of tumor development. This paradigm represents a new frontier in the multihit model of cancer formation and offers significant promise for innovative cancer therapies. Current research, in conjunction with cutting edge technologies, will further enable us to explore novel mechanisms of translational control, functionally identify translationally controlled mRNA groups, and unravel their impact on cellular transformation and tumorigenesis. PMID:22767671
Chromosomal changes in cultured human epithelial cells transformed by low- and high-LET radiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Tracy Chui-hsu; Craise, L.M; Prioleau, J.C.
1990-11-01
For a better assessment of radiation risk in space, an understanding of the responses of human cells, especially the epithelial cells, to low- and high-LET radiation is essential. In our laboratory, we have successfully developed techniques to study the neoplastic transformation of two human epithelial cell systems by ionizing radiation. These cell systems are human mammary epithelial cells (H184B5) and human epidermal keratinocytes (HEK). Both cell lines are immortal, anchorage dependent for growth, and nontumorigenic in athymic nude mice. Neoplastic transformation was achieved by irradiating cells successively. Our results showed that radiogenic cell transformation is a multistep process and that a single exposure of ionizing radiation can cause only one step of transformation. It requires, therefore, multihits to make human epithelial cells fully tumorigenic. Using a simple karyotyping method, we did chromosome analysis with cells cloned at various stages of transformation. We found no consistent large terminal deletion of chromosomes in radiation-induced transformants. Some changes in the total number of chromosomes, however, were observed in the transformed cells. These transformants provide a unique opportunity for further genetic studies at the molecular level. 15 refs., 9 figs., 2 tabs.
Chromosomal changes in cultured human epithelial cells transformed by low- and high-let radiation
NASA Astrophysics Data System (ADS)
Chui-Hsu Yang, Tracy; Craise, Laurie M.; Prioleau, John C.; Stampfer, Martha R.; Rhim, Johng S.
1992-07-01
For a better assessment of radiation risk in space, an understanding of the responses of human cells, especially the epithelial cells, to low- and high-LET radiation is essential. In our laboratory, we have successfully developed techniques to study the neoplastic transformation of two human epithelial cell systems by ionizing radiation. These cell systems are human mammary epithelial cells (H184B5) and human epidermal keratinocytes (HEK). Both cell lines are immortal, anchorage dependent for growth, and nontumorigenic in athymic nude mice. Neoplastic transformation was achieved by irradiating cells successively. Our results showed that radiogenic cell transformation is a multistep process and that a single exposure of ionizing radiation can cause only one step of transformation. It requires, therefore, multihits to make human epithelial cells fully tumorigenic. Using a simple karyotyping method, we did chromosome analysis with cells cloned at various stages of transformation. We found no consistent large terminal deletion of chromosomes in radiation-induced transformants. Some changes of total number of chromosomes, however, were observed in the transformed cells. These transformants provide a unique opportunity for further genetic studies at a molecular level.
Applicability of land use models for the Houston area test site
NASA Technical Reports Server (NTRS)
Petersburg, R. K.; Bradford, L. H.
1973-01-01
Descriptions of land use models are presented that were considered for their applicability to the Houston Area Test Site. These models are representative both of the prevailing theories of land use dynamics and of basic approaches to simulation. The models considered are: a Model of Metropolis, the Land Use Simulation Model, the EMPIRIC land use forecasting model, a probabilistic model for residential growth, and the Regional Environmental Management Allocation Process. Sources of environmental/resource information are listed.
Rainfall Modification by Urban Areas: New Perspectives from TRMM
NASA Technical Reports Server (NTRS)
Shepherd, J. Marshall; Pierce, Harold F.; Negri, Andrew
2002-01-01
Data from the Tropical Rainfall Measuring Mission's (TRMM) Precipitation Radar (PR) were employed to identify warm season rainfall (1998-2000) patterns around Atlanta, Montgomery, Nashville, San Antonio, Waco, and Dallas. Results reveal an average increase of ~28% in monthly rainfall rates within 30-60 kilometers downwind of the metropolis with a modest increase of 5.6% over the metropolis. Portions of the downwind area exhibit increases as high as 51%. The percentage changes are relative to an upwind control area. It was also found that maximum rainfall rates in the downwind impact area exceeded the mean value in the upwind control area by 48%-116%. The maximum value was generally found at an average distance of 39 km from the edge of the urban center or 64 km from the center of the city. Results are consistent with METROMEX studies of St. Louis almost two decades ago and with more recent studies near Atlanta. Future work is extending the investigation to Phoenix, Arizona, an arid U.S. city, and several international cities like Mexico City, Johannesburg, and Brasilia. The study establishes the possibility of utilizing satellite-based rainfall estimates for examining rainfall modification by urban areas on global scales and over longer time periods. Such research has implications for weather forecasting, urban planning, water resource management, and understanding human impact on the environment and climate.
SAChES: Scalable Adaptive Chain-Ensemble Sampling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swiler, Laura Painton; Ray, Jaideep; Ebeida, Mohamed Salah
We present the development of a parallel Markov Chain Monte Carlo (MCMC) method called SAChES, Scalable Adaptive Chain-Ensemble Sampling. This capability is targeted at Bayesian calibration of computationally expensive simulation models. SAChES involves a hybrid of two methods: Differential Evolution Monte Carlo followed by Adaptive Metropolis. Both methods involve parallel chains. Differential evolution allows one to explore high-dimensional parameter spaces using loosely coupled (i.e., largely asynchronous) chains. Loose coupling allows the use of large chain ensembles, with far more chains than the number of parameters to explore. This reduces the per-chain sampling burden and enables high-dimensional inversions and the use of computationally expensive forward models. The large number of chains can also ameliorate the impact of silent errors, which may affect only a few chains. The chain ensemble can also be sampled to provide an initial condition when an aberrant chain is re-spawned. Adaptive Metropolis takes the best points from the differential evolution and efficiently homes in on the posterior density. The multitude of chains in SAChES is leveraged to (1) enable efficient exploration of the parameter space; and (2) ensure robustness to silent errors, which may be unavoidable in extreme-scale computational platforms of the future. This report outlines SAChES, describes four papers that are the result of the project, and discusses some additional results.
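The Adaptive Metropolis phase described above can be illustrated in isolation. The following is a minimal single-chain sketch, not SAChES itself (which is parallel and ensemble-based): the Haario-style scaling factor, the fixed pre-adaptation proposal, and the correlated-Gaussian target are all illustrative assumptions.

```python
import numpy as np

def adaptive_metropolis(log_post, x0, n_steps=5000, adapt_start=500, seed=0):
    """Single-chain Adaptive Metropolis: after a burn-in of fixed-scale moves,
    the proposal covariance is tuned from the chain's own history."""
    rng = np.random.default_rng(seed)
    d = len(x0)
    sd = 2.4**2 / d                       # Haario-style scaling factor
    eps = 1e-8 * np.eye(d)                # jitter keeps the covariance positive definite
    chain = [np.asarray(x0, dtype=float)]
    lp = log_post(chain[-1])
    for t in range(n_steps):
        if t < adapt_start:
            cov = 0.1**2 / d * np.eye(d)  # fixed proposal before adaptation kicks in
        else:
            cov = sd * np.cov(np.array(chain).T) + eps
        prop = rng.multivariate_normal(chain[-1], cov)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
            chain.append(prop)
            lp = lp_prop
        else:
            chain.append(chain[-1].copy())
    return np.array(chain)

# Usage: sample a correlated 2-D Gaussian posterior (illustrative target).
target_cov = np.array([[1.0, 0.8], [0.8, 1.0]])
prec = np.linalg.inv(target_cov)
log_post = lambda x: -0.5 * x @ prec @ x
samples = adaptive_metropolis(log_post, np.zeros(2))
```

In SAChES the best points from the differential-evolution phase would seed such a sampler, and many chains would run concurrently.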
The prevalence and incidence of COPD among urban older persons of Bangkok Metropolis.
Maranetra, Khun Nanta; Chuaychoo, Benjamas; Dejsomritrutai, Wanchai; Chierakul, Nitipatana; Nana, Arth; Lertakyamanee, Jariya; Naruman, Chana; Suthamsmai, Tasneeya; Sangkaew, Sutee; Sreelum, Wichean; Aksornin, Montchai; Dechapol, Jaroon; Sathet, Wichean
2002-11-01
COPD substantially affects national healthcare resources and healthcare costs, especially among older persons. Accurate prevalence and incidence figures reflect the scale of the problem posed by COPD. This epidemiological study, using diagnostic criteria for COPD based on an FEV1.0/FVC ratio of less than 70 per cent and reversibility of less than a 15 per cent increase in post-bronchodilator FEV1.0, in the absence of parenchymal lesions and cardiomegaly on CXR (PA and lateral views), revealed that the prevalence (1998) of COPD among the 3094 older persons aged 60 years and over in the communities of Bangkok Metropolis within 10 km of Siriraj Hospital was 7.11 per cent (95% CI: 6.21-8.01), whereas the incidence (1999) of COPD was 3.63 per cent (95% CI: 2.83-4.43). Both the prevalence and the incidence increased with increasing age. The disease occurred predominantly among male smokers. The distribution of mild : moderate : severe COPD in the prevalence study was 5.6:2.2:1. The current findings also suggest that tobacco smoking is the single most important cause of COPD and that indoor pollution, especially cooking smoke, is not significant. In particular, the unexpectedly high incidence compared with prevalence in this population probably represents a warning message to national policy makers for prompt and effective health promotion and disease prevention to prevent further social and economic loss.
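The quoted 95% confidence intervals are consistent with a simple normal-approximation (Wald) interval for a proportion. A minimal sketch, assuming roughly 220 cases underlie the 7.11% prevalence figure (the case count is not stated in the abstract):

```python
import math

def wald_ci(count, n, z=1.96):
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    p = count / n
    se = math.sqrt(p * (1 - p) / n)   # standard error of the sample proportion
    return p - z * se, p + z * se

n = 3094
cases = round(0.0711 * n)             # ~220 cases implied by 7.11% prevalence
lo, hi = wald_ci(cases, n)
print(f"{100 * lo:.2f}% - {100 * hi:.2f}%")   # close to the reported 6.21-8.01%
```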
Ren, Huiying; Ray, Jaideep; Hou, Zhangshuan; ...
2017-10-17
In this paper we developed an efficient Bayesian inversion framework for interpreting marine seismic Amplitude Versus Angle and Controlled-Source Electromagnetic data for marine reservoir characterization. The framework uses a multi-chain Markov-chain Monte Carlo sampler, which is a hybrid of DiffeRential Evolution Adaptive Metropolis and Adaptive Metropolis samplers. The inversion framework is tested by estimating reservoir-fluid saturations and porosity based on marine seismic and Controlled-Source Electromagnetic data. The multi-chain Markov-chain Monte Carlo is scalable in terms of the number of chains, and is useful for computationally demanding Bayesian model calibration in scientific and engineering problems. As a demonstration, the approach is used to efficiently and accurately estimate the porosity and saturations in a representative layered synthetic reservoir. The results indicate that the seismic Amplitude Versus Angle and Controlled-Source Electromagnetic joint inversion provides better estimation of reservoir saturations than the seismic Amplitude Versus Angle only inversion, especially for the parameters in deep layers. The performance of the inversion approach for various levels of noise in observational data was evaluated: reasonable estimates can be obtained with noise levels up to 25%. Sampling efficiency due to the use of multiple chains was also checked and was found to have almost linear scalability.
Orish, Verner N; Onyeabor, Onyekachi S; Boampong, Johnson N; Afoakwah, Richmond; Nwaefuna, Ekene; Acquah, Samuel; Orish, Esther O; Sanyaolu, Adekunle O; Iriemenam, Nnaemeka C
2014-08-01
This study investigated the influence of the level of education on HIV infection among pregnant women attending antenatal care in Sekondi-Takoradi, Ghana. A cross-sectional study was conducted at four hospitals in the Sekondi-Takoradi metropolis. The study group comprised 885 consenting pregnant women attending antenatal care clinics. Questionnaires were administered and venous blood samples were screened for HIV and other parameters. Multivariable logistic regression analyses were performed to determine the association between the level of education attained by the pregnant women and their HIV statuses. The data showed that 9.83% (87/885) of the pregnant women were HIV seropositive while 90.17% (798/885) were HIV seronegative. There were significant differences in mean age (years) between the HIV seropositive women (27.45 ± 5.5) and their HIV seronegative (26.02 ± 5.6) counterparts (p = .026) but the inference disappeared after adjustment (p = .22). Multivariable logistic regression analysis revealed that pregnant women with secondary/tertiary education were less likely to have HIV infection compared with those with none/primary education (adjusted OR, 0.53; 95% CI, 0.30-0.91; p = .022). Our data showed an association between higher level of education and the HIV status of the pregnant women. It is imperative to encourage formal education among pregnant women in this region.
Urban transformation of a metropolis and its environmental impacts: a case study in Shanghai.
Tian, Zhan; Cao, Guiying; Shi, Jun; McCallum, Ian; Cui, Linli; Fan, Dongli; Li, Xinhu
2012-06-01
The aim of this paper is to understand the sustainability of urban spatial transformation in the process of rapid urbanization. Shanghai, in its transformation to a metropolis, has experienced vast socioeconomic and ecological change, which calls for future research on the demographic and economic dimensions of climate change. We address two major questions: (1) how economic and demographic growth and land use and land-cover have changed in the context of rapid economic and city growth, and (2) how demographic and economic growth have been associated with local air temperature and vegetation. We examine urban growth and land use and land-cover changes in the context of rapid economic development and urbanization, and we assess the impact of urban expansion on local air temperature and vegetation. The analysis is based on time series data of land use, normalized difference vegetation index (NDVI), and meteorological, demographic and economic data. The results indicate that urban growth has been driven by mass immigration; as a consequence of economic growth and urban expansion, a large amount of farmland has been converted to paved roads and residential buildings. Furthermore, the difference between air temperature in urban and exurban areas has increased rapidly. The decrease in high mean annual NDVI has mainly occurred around the dense urban areas.
Markov Chain Monte Carlo Inference of Parametric Dictionaries for Sparse Bayesian Approximations
Chaspari, Theodora; Tsiartas, Andreas; Tsilifis, Panagiotis; Narayanan, Shrikanth
2016-01-01
Parametric dictionaries can increase the ability of sparse representations to meaningfully capture and interpret the underlying signal information, such as encountered in biomedical problems. Given a mapping function from the atom parameter space to the actual atoms, we propose a sparse Bayesian framework for learning the atom parameters, because of its ability to provide full posterior estimates, take uncertainty into account and generalize on unseen data. Inference is performed with Markov Chain Monte Carlo, which uses block sampling to generate the variables of the Bayesian problem. Since the parameterization of dictionary atoms results in posteriors that cannot be analytically computed, we use a Metropolis-Hastings-within-Gibbs framework, according to which variables with closed-form posteriors are generated with the Gibbs sampler, while the remaining ones are generated with Metropolis-Hastings from appropriate candidate-generating densities. We further show that the corresponding Markov Chain is uniformly ergodic, ensuring its convergence to a stationary distribution independently of the initial state. Results on synthetic data and real biomedical signals indicate that our approach offers advantages in terms of signal reconstruction compared to previously proposed Steepest Descent and Equiangular Tight Frame methods. This paper demonstrates the ability of Bayesian learning to generate parametric dictionaries that can reliably represent the exemplar data and provides the foundation towards inferring the entire variable set of the sparse approximation problem for signal denoising, adaptation and other applications. PMID:28649173
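The Metropolis-Hastings-within-Gibbs pattern can be sketched on a toy model (not the paper's dictionary-learning posterior): for data from a normal distribution, the mean `mu` has a closed-form conditional and is drawn by a Gibbs step, while the variance `s2` is updated by a random-walk Metropolis-Hastings step on the log scale. All priors and step sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(3.0, 2.0, size=200)     # synthetic data: mean 3, sd 2
n, ybar = len(y), y.mean()

def log_post_sigma2(mu, s2):
    # Log posterior for sigma^2 with a Jeffreys-style 1/sigma^2 prior.
    return -0.5 * n * np.log(s2) - np.sum((y - mu) ** 2) / (2 * s2) - np.log(s2)

mu, s2 = 0.0, 1.0
mus, s2s = [], []
for _ in range(4000):
    # Gibbs step: mu | sigma^2 has a closed-form normal posterior (flat prior).
    mu = rng.normal(ybar, np.sqrt(s2 / n))
    # MH step: sigma^2 has no closed form here; random walk on log(sigma^2).
    prop = s2 * np.exp(0.3 * rng.normal())
    log_ratio = (log_post_sigma2(mu, prop) - log_post_sigma2(mu, s2)
                 + np.log(prop) - np.log(s2))   # Jacobian of the log transform
    if np.log(rng.uniform()) < log_ratio:
        s2 = prop
    mus.append(mu)
    s2s.append(s2)
```

After burn-in, the draws of `mu` concentrate near 3 and those of `s2` near 4, as expected for this synthetic data set.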
King, Samuel B.; Lapidus, Mariana
2015-01-01
Objective: The authors' goal was to assess changes in the role of librarians in informatics education from 2004 to 2013. This is a follow-up to “Metropolis Redux: The Unique Importance of Library Skills in Informatics,” a 2004 survey of informatics programs. Methods: An electronic survey was conducted in January 2013 and sent to librarians via the MEDLIB-L email discussion list, the library section of the American Association of Colleges of Pharmacy, the Medical Informatics Section of the Medical Library Association, the Information Technology Interest Group of the Association of College and Research Libraries/New England Region, and various library directors across the country. Results: Librarians from fifty-five institutions responded to the survey. Of these respondents, thirty-four included librarians in nonlibrary aspects of informatics training. Fifteen institutions have librarians participating in leadership positions in their informatics programs. Compared to the earlier survey, the role of librarians has evolved. Conclusions: Librarians possess skills that enable them to participate in informatics programs beyond a narrow library focus. Librarians currently perform significant leadership roles in informatics education. There are opportunities for librarian interdisciplinary collaboration in informatics programs. Implications: Informatics is much more than the study of technology. The information skills that librarians bring to the table enrich and broaden the study of informatics in addition to adding value to the library profession itself. PMID:25552939
King, Samuel B; Lapidus, Mariana
2015-01-01
The authors' goal was to assess changes in the role of librarians in informatics education from 2004 to 2013. This is a follow-up to "Metropolis Redux: The Unique Importance of Library Skills in Informatics," a 2004 survey of informatics programs. An electronic survey was conducted in January 2013 and sent to librarians via the MEDLIB-L email discussion list, the library section of the American Association of Colleges of Pharmacy, the Medical Informatics Section of the Medical Library Association, the Information Technology Interest Group of the Association of College and Research Libraries/New England Region, and various library directors across the country. Librarians from fifty-five institutions responded to the survey. Of these respondents, thirty-four included librarians in nonlibrary aspects of informatics training. Fifteen institutions have librarians participating in leadership positions in their informatics programs. Compared to the earlier survey, the role of librarians has evolved. Librarians possess skills that enable them to participate in informatics programs beyond a narrow library focus. Librarians currently perform significant leadership roles in informatics education. There are opportunities for librarian interdisciplinary collaboration in informatics programs. Informatics is much more than the study of technology. The information skills that librarians bring to the table enrich and broaden the study of informatics in addition to adding value to the library profession itself.
Isaiah, Ibeh Nnana; Nche, Bikwe Thomas; Nwagu, Ibeh Georgina; Nnanna, Ibeh Isaiah
2011-12-01
The current rise of male infertility associated with bacterospermia and urogenital infection has been on the increase amongst adult married males in Benin metropolis and a major cause of concern to male fertility and reproduction in Nigeria. To microbiologically isolate and study the infectious agent that has led to male infertility and also to study the percentage occurrence of bacteropsermia and urogenital caused infertility in adult married males in Benin metropolis using standard microbiological methods of isolating and identifying the organism, specimen was collected and processed which includes the susceptibility profile of isolates and sperm quality. In this study a total of 140 sperm samples was collected from patient who were referred from the consultant outpatient department of the University of Benin Teaching Hospital and then evaluated bacteriologically using standard bacterial cultural methods Among the total cases, 92 (65.7%) showed at least one pathogen. Staphylococcus aureus (28.3%), Staphylococcus Saprophyticus (13.0%), Pseudomonas aerouginosa (6.5%), Escherichia Coli (19.6%) Proteus mirabilis (10.8%) Klebsiella spp (10.8%) and Proteus vulgaris (10.8%). There was an outstanding significant relationship between bacteriospermia and the rate of total motility and morphologically abnormal sperms, The percentage of morphologically normal sperm was lower in this study. Staphylococcus aureus Staphylococcus saprohyticus and Escherichia coli were the most common pathogen having negative effects on sperm motility and morphology in this study.
Testing calibration routines for LISFLOOD, a distributed hydrological model
NASA Astrophysics Data System (ADS)
Pannemans, B.
2009-04-01
Traditionally, hydrological models are considered difficult to calibrate: their high non-linearity results in rugged and rough response surfaces where calibration algorithms easily get stuck in local minima. For the calibration of distributed hydrological models two extra factors play an important role: on the one hand they are often computationally costly, thus restricting the feasible number of model runs; on the other hand their distributed nature smooths the response surface, thus facilitating the search for a global minimum. Lisflood is a distributed hydrological model currently used for the European Flood Alert System - EFAS (Van der Knijff et al., 2008). Its upcoming recalibration over more than 200 catchments, each with an average runtime of 2-3 minutes, proved a perfect occasion to put several existing calibration algorithms to the test. The tested routines are Downhill Simplex (DHS, Nelder and Mead, 1965), SCEUA (Duan et al., 1993), SCEM (Vrugt et al., 2003) and AMALGAM (Vrugt et al., 2008), and they were evaluated on their capability to converge efficiently onto the global minimum and on the spread in the solutions found over repeated runs. The routines were let loose on a simple hyperbolic function, on a Lisflood catchment using model output as observation, and on two Lisflood catchments using real observations (one on the river Inn in the Alps, the other along the downstream stretch of the Elbe). On the mathematical problem and on the catchment with synthetic observations DHS proved to be the fastest and the most efficient in finding a solution. SCEUA and AMALGAM are slower, but while SCEUA keeps converging on the exact solution, AMALGAM slows down after about 600 runs. For the Lisflood models with real-time observations AMALGAM (a hybrid algorithm that combines several other algorithms; we used CMA, PSO and GA) came out of the tests as the fastest, giving comparable results in consecutive runs.
However, some more work is needed to tweak the stopping criteria. SCEUA is a bit slower, but has very transparent stopping rules. Both have closed in on the minima after about 600 runs. DHS only equals SCEUA in convergence speed; the stopping criteria we applied so far are too strict, causing it to stop too early. SCEM converges 5-6 times slower. This is a high price for the parameter uncertainty analysis that is simultaneously done. The ease with which all algorithms find the same optimum suggests that we are dealing with a smooth and relatively simple response surface. This leaves room for other deterministic calibration algorithms that are smarter than DHS in sliding downhill. PEST seems promising, but so far we haven't managed to get it running with LISFLOOD. • Duan, Q., Gupta, V. & Sorooshian, S., 1993, Shuffled complex evolution approach for effective and efficient global minimization, J. Optim. Theory Appl., 76, 501-521. • Nelder, J. & Mead, R., 1965, A simplex method for function minimization, Comput. J., 7, 308-313. • Van der Knijff, J. M., Younis, J. & De Roo, A. P. J., 2008, LISFLOOD: a GIS-based distributed model for river basin scale water balance and flood simulation, International Journal of Geographical Information Science. • Vrugt, J., Gupta, H., Bouten, W. & Sorooshian, S., 2003, A Shuffled Complex Evolution Metropolis algorithm for optimization and uncertainty assessment of hydrologic model parameters, Water Resour. Res., 39. • Vrugt, J., Robinson, B. & Hyman, J., 2008, Self-Adaptive Multimethod Search for Global Optimization in Real-Parameter Spaces, IEEE Trans. Evol. Comput.
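The Downhill Simplex (Nelder-Mead) routine that performed best on the synthetic tests is available off the shelf. A minimal sketch using SciPy, with the Rosenbrock function standing in for a calibration objective (the study's actual hyperbolic test function is not specified, so this objective is an illustrative assumption):

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock function: a classic rugged-valley test objective, used here
# as a stand-in for a model-calibration cost function.
def objective(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

res = minimize(objective, x0=[-1.2, 1.0], method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 5000})
print(res.x)   # converges to the global minimum near (1, 1)
```

For a real distributed model, `objective` would wrap a full model run, which is why the number of evaluations each algorithm needs matters so much.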
Advances in radiation mutagenesis through studies on Drosophila
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muller, H. J.
The approximately linear relation between radiation dose and induced lethals, known for Drosophila spermatozoa, is now extended to spermatids. Data are included regarding oogonia. The linearity principle has been confirmed for minute structural changes in spermatozoa. The dependence of gross structural changes, as multi-hit events, on about the 1.5 power of the dose, long known for spermatozoa, is now extended to spermatids and late oocytes, for relatively short exposures. However, these stages unlike spermatozoa are found to allow union of broken chromosomes. Therefore, the frequencies are lower for more dispersed exposures of these stages, and the precise dose relation varies with the timing. Part of the dominant and even recessive lethals induced in late oocytes follow the same frequency pattern and therefore are multi-hit events. Yet there is a much lower chance after oocytic than spermatozoan irradiation that two broken ends derived from different hits will unite, hence most such unions are nonreciprocal. The following is the order of decreasing radiation mutability of different stages found by ourselves and others: spermatids, spermatozoa in females, spermatozoa 0 to 1 day before ejaculation, earlier spermatozoa, late oocytes, gonia of either sex. Lethal frequencies for these stages range over approximately an order of magnitude, gross structural changes far more widely. Of potential usefulness is our extension of the principle of marked reduction of radiation mutagenesis by anoxia, known for spermatozoa in adult males, to those in pupal males and in females to spermatids and to oocytes. In spermatids this reduction is especially marked but the increase caused by substituting oxygen for air is less marked, perhaps because of enzymatic differences. In contrast, the induction of gross structural changes in oocytes, but not in spermatids, is markedly reduced by oxygen post-treatment; it is increased by dehydration. 
The efficacy of induction of structural changes by treatment of spermatozoa, whether with radiation or chemical mutagens, is correlated with the conditions of sperm utilization and egg production. Improving our perspective on radiation effects, some 800,000 offspring have been scored for spontaneous visible mutations of 13 specific loci. The average point-mutation rate was 0.5 to 1.0 per locus among 10^5 germ cells. Most mutations occurred in peri-fertilization stages. All loci studied mutated from one to nine times. Loci mutating oftener spontaneously also gave more radiation mutation, in other studies. Spectra of individual loci prove similar for spontaneous and induced mutation. Studies on back-mutations also showed similarity of spontaneous and radiation mutations. The doubling dose for back-mutations of forked induced in spermatozoa was several hundred roentgens, similar to that for direct point-mutations induced in gonia at diverse loci. Recent analyses of human mutational load lead to mutation-rate estimates like those earlier based on extrapolations from Drosophila, thus supporting the significance for man of the present studies. (auth)
Chrzastowski, Michael J.
1983-01-01
Lake Washington, in the midst of the greater Seattle metropolitan area of the Puget Sound region (fig. 1), is an exceptional commercial, recreational, and esthetic resource for the region. In the past 130 years, Lake Washington has been changed from a "wild" lake in a wilderness setting to a regulated lake surrounded by a growing metropolis--a transformation that provides an unusual opportunity to study changes to a lake's shoreline and hydrologic characteristics resulting from urbanization.
Transition records of stationary Markov chains.
Naudts, Jan; Van der Straeten, Erik
2006-10-01
In any Markov chain with finite state space the distribution of transition records always belongs to the exponential family. This observation is used to prove a fluctuation theorem, and to show that the dynamical entropy of a stationary Markov chain is linear in the number of steps. Three applications are discussed. A known result about entropy production is reproduced. A thermodynamic relation is derived for equilibrium systems with Metropolis dynamics. Finally, a link is made with recent results concerning a one-dimensional polymer model.
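A minimal sketch of what the transition records of a stationary chain are: the counts of observed i -> j steps, whose long-run frequencies factor as pi_i * P_ij (stationary probability times transition probability). The two-state chain below is an illustrative assumption, not a model from the paper.

```python
import numpy as np

# Two-state Markov chain; its transition records are the counts n_ij.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
# Stationary distribution solves pi P = pi; here pi = (2/3, 1/3).
pi = np.array([2 / 3, 1 / 3])

rng = np.random.default_rng(0)
N = 100_000
state = 0
counts = np.zeros((2, 2))        # transition records n_ij
for _ in range(N):
    nxt = rng.choice(2, p=P[state])
    counts[state, nxt] += 1
    state = nxt

freq = counts / N                # empirical transition frequencies
expected = pi[:, None] * P       # long-run frequency of each i -> j transition
```

For long chains `freq` approaches `expected`, which is the empirical face of the exponential-family structure of the record distribution.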
[Marketing as a tool in the medical institution management].
Petrova, N G; Balokhina, S A
2009-01-01
The contemporary socioeconomic conditions dictate the necessity to change the tactics and strategy of functioning of medical institutions of different forms of ownership. Marketing, alongside management, is to become a leading concept in the administration of medical institutions. It should be a framework for the systematic collection, registration and analysis of data relevant to the medical services market. The article discusses the implementation of the marketing concept in the everyday practical activities of a commercial medical organization providing cosmetology services to the population of a metropolis.
NASA Astrophysics Data System (ADS)
Sleeman, J.; Halem, M.; Finin, T.; Cane, M. A.
2016-12-01
Approximately every five years dating back to 1989, thousands of climate scientists, research centers and government labs volunteer to prepare comprehensive Assessment Reports for the Intergovernmental Panel on Climate Change. These are highly curated reports distributed to policy makers in some 200 nations. There have been five IPCC Assessment Reports to date, the latest leading to the Paris Agreement of Dec. 2015, signed thus far by 172 nations, to limit global greenhouse gas emissions so as to produce no more than a 2 °C warming of the atmosphere. These reports are a living, evolving big-data collection tracing 30 years of climate science research, observations, and model scenario intercomparisons. They contain more than 200,000 citations over a 30-year period that trace the evolution of the physical basis of climate science; the observed and predicted impacts, risks and adaptation responses to increased greenhouse gases; and the mitigation approaches, pathways and policies for climate change. Document-topic and topic-term probability distributions are built from the vocabularies of the respective assessment report chapters and citations. Using Microsoft Bing, we retrieve 150,000 citations referenced across chapters and convert those citations to text. Using a word n-gram model based on a heterogeneous set of climate change terminology, lemmatization, noise filtering and stopword elimination, we calculate word frequencies for chapters and citations. Temporal document sets are built based on the assessment period. In addition to topic modeling, we employ cross-domain correlation measures. Using the Jensen-Shannon divergence and Pearson correlation we build correlation matrices for chapter and citation topics. The shared vocabulary acts as the bridge between domains, resulting in chapter-citation point pairs in space. Pairs are established based on a document-topic probability distribution. 
Each chapter and citation is associated with a vector of topics, and based on the n most probable topics we establish which chapter-citation pairs are most similar. We will perform posterior inference with a Hastings-Metropolis simulated-annealing MCMC algorithm to infer, from the evolution of topics from AR1 to AR4, assertions of topics for AR5 and potentially AR6.
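The chapter-citation similarity step above rests on comparing topic distributions. A minimal sketch of the Jensen-Shannon divergence used for the correlation matrices, on hypothetical three-topic distributions (the topic vectors here are illustrative, not the paper's):

```python
import math

def jensen_shannon(p, q):
    """Jensen-Shannon divergence with base-2 logs, so values lie in [0, 1]."""
    m = [(pi + qi) / 2.0 for pi, qi in zip(p, q)]
    def kl(a, b):
        # Kullback-Leibler divergence, skipping zero-probability terms
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical document-topic distributions for a chapter and a citation.
chapter = [0.6, 0.3, 0.1]
citation = [0.5, 0.3, 0.2]
similar = jensen_shannon(chapter, citation)        # small: similar topic mixtures
disjoint = jensen_shannon([1.0, 0.0], [0.0, 1.0])  # 1.0: no shared topics
```

Unlike the raw Kullback-Leibler divergence, this measure is symmetric and bounded, which makes it convenient for building pairwise correlation matrices.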
NASA Astrophysics Data System (ADS)
Bagnardi, M.; Hooper, A. J.
2017-12-01
Inversions of geodetic observational data, such as Interferometric Synthetic Aperture Radar (InSAR) and Global Navigation Satellite System (GNSS) measurements, are often performed to obtain information about the source of surface displacements. Inverse problem theory has been applied to study magmatic processes, the earthquake cycle, and other phenomena that cause deformation of the Earth's interior and of its surface. Together with increasing improvements in data resolution, both spatial and temporal, new satellite missions (e.g., the European Commission's Sentinel-1 satellites) are providing the unprecedented opportunity to access space-geodetic data within hours of their acquisition. To truly take advantage of these opportunities we must be able to interpret geodetic data in a rapid and robust manner. Here we present the open-source Geodetic Bayesian Inversion Software (GBIS; available for download at http://comet.nerc.ac.uk/gbis). GBIS is written in Matlab and offers a series of user-friendly and interactive pre- and post-processing tools. For example, an interactive function has been developed to estimate the characteristics of noise in InSAR data by calculating the experimental semi-variogram. The inversion software uses a Markov-chain Monte Carlo algorithm, incorporating the Metropolis-Hastings algorithm with adaptive step size, to efficiently sample the posterior probability distribution of the different source parameters. The probabilistic Bayesian approach allows the user to retrieve estimates of the optimal (best-fitting) deformation source parameters together with the associated uncertainties produced by errors in the data (and, by scaling, errors in the model).
The current version of GBIS (V1.0) includes fast analytical forward models for magmatic sources of different geometry (e.g., point source, finite spherical source, prolate spheroid source, penny-shaped sill-like source, and dipping dike with uniform opening) and for dipping faults with uniform slip, embedded in an isotropic elastic half-space. However, the software architecture allows the user to easily add any other analytical or numerical forward models to calculate displacements at the surface. GBIS is delivered with a detailed user manual and three synthetic datasets for testing and practical training.
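The sampler at the heart of GBIS is described as Metropolis-Hastings with adaptive step size. One common way to realize such adaptation, sketched here on a toy 1-D Gaussian posterior (GBIS's actual adaptation rule and multi-parameter implementation may differ), is to rescale the proposal width toward a target acceptance rate:

```python
import math, random

def adaptive_metropolis(log_post, x0, n_steps=20000, target_acc=0.3, seed=1):
    """Random-walk Metropolis whose proposal width is rescaled toward a
    target acceptance rate.  (Strictly, adaptation should diminish over
    time for an exact sampler; this sketch adapts throughout.)"""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    step, samples, accepted = 1.0, [], 0
    for i in range(1, n_steps + 1):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:   # Metropolis acceptance
            x, lp = prop, lp_prop
            accepted += 1
        if i % 100 == 0:
            # widen the step if accepting too often, shrink it otherwise
            step *= math.exp(accepted / i - target_acc)
        samples.append(x)
    return samples, accepted / n_steps

# Toy 1-D "source parameter" with a standard-normal posterior.
samples, acc = adaptive_metropolis(lambda x: -0.5 * x * x, x0=5.0)
mean = sum(samples) / len(samples)
```

Tuning the step automatically spares the user from hand-picking a proposal scale for every source parameter.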
NASA Astrophysics Data System (ADS)
Oware, E. K.
2017-12-01
Geophysical quantification of hydrogeological parameters typically involves limited noisy measurements coupled with inadequate understanding of the target phenomenon. Hence, a deterministic solution is unrealistic in light of the largely uncertain inputs. Stochastic imaging (SI), in contrast, provides multiple equiprobable realizations that enable probabilistic assessment of aquifer properties in a realistic manner. Generation of geologically realistic prior models is central to SI frameworks. Higher-order statistics for representing prior geological features in SI are, however, usually borrowed from training images (TIs), which may produce undesirable outcomes if the TIs are unrepresentative of the target structures. The Markov random field (MRF)-based SI strategy provides a data-driven alternative to TI-based SI algorithms. In the MRF-based method, the simulation of spatial features is guided by Gibbs energy (GE) minimization. Local configurations with smaller GEs have a higher likelihood of occurrence and vice versa. The parameters of the Gibbs distribution for computing the GE are estimated from the hydrogeophysical data, thereby enabling the generation of site-specific structures in the absence of reliable TIs. In Metropolis-like SI methods, the variance of the transition probability controls the jump size. The procedure is a standard Markov chain Monte Carlo (MCMC) method when a constant variance is assumed, and becomes simulated annealing (SA) when the variance (cooling temperature) is allowed to decrease gradually with time. We observe that in certain problems, the large variance typically employed at the beginning to hasten burn-in may not be ideal for sampling at the equilibrium state. The power of SA stems from its flexibility to adaptively scale the variance at different stages of the sampling. Degeneration of results was reported in a previous implementation of the MRF-based SI strategy based on a constant variance.
Here, we present an updated version of the algorithm based on SA that appears to resolve the degeneration problem with seemingly improved results. We illustrate the performance of the SA version with a joint inversion of time-lapse concentration and electrical resistivity measurements in a hypothetical trinary hydrofacies aquifer characterization problem.
NASA Astrophysics Data System (ADS)
Ladwig, Robert; Kirillin, Georgiy; Hinkelmann, Reinhard; Hupfer, Michael
2017-04-01
Urban surface water systems, and especially lakes, are heavily stressed systems, modified to comply with water management goals and expectations. In this study we focus on Lake Tegel in Berlin, Germany, as a representative of heavily modified urban lakes. In the 20th century, Lake Tegel received increased loadings of nutrients and leached heavy metals from an upstream sewage farm, resulting in severe eutrophication problems. The construction of two upstream treatment plants lowered nutrient concentrations and led to a re-oligotrophication of the lake. Additionally, artificial aerators, to keep the hypolimnion oxic, and a lake pipeline, to bypass water for maintaining a minimum discharge, went into operation. Lake Tegel is still heavily used for drinking water extraction by bank filtration. These interacting management measures make the system vulnerable to changing climate conditions and pollutant loads. Past modelling studies have shown the complex hydrodynamics of the lake. Here, we follow a simplified approach, using a computationally less demanding vertical 1-D model to simulate the hydrodynamics and the ecological interactions of the system by coupling the General Lake Model to the Aquatic Ecodynamics Model Library 2. For calibration of the multidimensional parameter space we applied the Covariance Matrix Adaptation Evolution Strategy algorithm. The model is able to replicate the vertical temperature profiles measured in Lake Tegel sufficiently well, and simulates concentration ranges of phosphate, dissolved oxygen and nitrate similar to those observed. The calibrated model is used to run an uncertainty analysis, sampling the simulated data in the sense of the Metropolis-Hastings algorithm.
Finally, we evaluate different scenarios: (1) changing air temperatures, precipitation and wind speed due to effects of climate change, (2) decreased discharges into the lake due to bypassing treated effluents into a nearby stream instead of Lake Tegel, and (3) increased nutrient elimination at the upstream treatment plants. We focus on quantifying the impact of these scenarios on lake stability as well as on the abundance and distribution of nutrients.
Hasegawa, M
2011-03-01
The aim of the present study is to elucidate how simulated annealing (SA) works in its finite-time implementation, starting from a verification of its conventional optimization scenario based on equilibrium statistical mechanics. Two experiments and one supplementary experiment, whose designs are inspired by concepts and methods developed for studies of liquids and glasses, are performed on two types of random traveling salesman problems. In the first experiment, a newly parameterized temperature schedule is introduced to simulate a quasistatic process along the scenario, and a parametric study is conducted to investigate the optimization characteristics of this adaptive cooling. In the second experiment, the search trajectory of the Metropolis algorithm (constant-temperature SA) is analyzed in the landscape paradigm in the hope of drawing a precise physical analogy by comparison with the corresponding dynamics of glass-forming molecular systems. These two experiments indicate that the effectiveness of finite-time SA comes not from equilibrium sampling at low temperature but from downward interbasin dynamics occurring before equilibrium. These dynamics work most effectively at an intermediate temperature that varies with the total search time, and this effective temperature is identified using the Deborah number. To test directly the role of these relaxation dynamics in the process of cooling, a supplementary experiment is performed using another parameterized temperature schedule with a piecewise variable cooling rate, and the effect of this biased cooling is examined systematically. The results show that the optimization performance is not only dependent on but also sensitive to cooling in the vicinity of the above effective temperature, and that this feature is interpreted as a consequence of the presence or absence of the workable interbasin dynamics.
It is confirmed for the present instances that the effectiveness of finite-time SA derives from the glassy relaxation dynamics occurring in the "landscape-influenced" temperature regime and that its naive optimization scenario should be rectified by considering the analogy with vitrification phenomena. A comprehensive guideline for the design of finite-time SA and SA-related algorithms is discussed on the basis of this rectified analogy.
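The finite-time SA procedure discussed above can be sketched with a geometric cooling schedule on a simple rugged 1-D landscape (the energy function, schedule and parameters here are illustrative stand-ins for the paper's TSP instances and parameterized schedules):

```python
import math, random

def simulated_annealing(energy, x0, t_start=5.0, t_end=1e-3, n_steps=20000, seed=7):
    """Finite-time SA: Metropolis moves under a geometric cooling schedule."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    cool = (t_end / t_start) ** (1.0 / n_steps)  # per-step cooling factor
    t = t_start
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, 1.0)
        e_prop = energy(prop)
        # Metropolis acceptance at the current temperature
        if e_prop <= e or rng.random() < math.exp((e - e_prop) / t):
            x, e = prop, e_prop
            if e < best_e:
                best_x, best_e = x, e
        t *= cool
    return best_x, best_e

# Rugged 1-D landscape; the global minimum lies near x ~ -0.31.
rugged = lambda x: x * x + 4.0 * math.sin(5.0 * x)
best_x, best_e = simulated_annealing(rugged, x0=8.0)
```

The chain typically locates the deep basins while the temperature passes through an intermediate range, consistent with the interbasin-dynamics picture described above.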
NASA Astrophysics Data System (ADS)
Pankratov, Oleg; Kuvshinov, Alexey
2016-01-01
Despite impressive progress in the development and application of electromagnetic (EM) deterministic inverse schemes to map the 3-D distribution of electrical conductivity within the Earth, one question remains poorly addressed: uncertainty quantification of the recovered conductivity models. Apparently, only an inversion based on a statistical approach provides a systematic framework to quantify such uncertainties. The Metropolis-Hastings (M-H) algorithm is the most popular technique for sampling the posterior probability distribution that describes the solution of the statistical inverse problem. However, all statistical inverse schemes require an enormous number of forward simulations and thus appear to be extremely demanding computationally, if not prohibitive, when a 3-D setup is invoked. This urges the development of fast and scalable 3-D modelling codes which can run large-scale 3-D models of practical interest in fractions of a second on high-performance multi-core platforms. But even with such codes, the challenge for M-H methods is to construct proposal functions that provide a good approximation of the target density function while remaining inexpensive to sample. In this paper we address both of these issues. First we introduce a variant of the M-H method which uses information about the local gradient and Hessian of the penalty function. This, in particular, allows us to exploit adjoint-based machinery that has been instrumental for the fast solution of deterministic inverse problems. We explain why this modification of M-H significantly accelerates sampling of the posterior probability distribution. In addition we show how Hessian handling (inverse, square root) can be made practicable by a low-rank approximation using the Lanczos algorithm. Ultimately we discuss uncertainty analysis based on stochastic inversion results. In addition, we demonstrate how this analysis can be performed within a deterministic approach.
In the second part, we summarize modern trends in the development of efficient 3-D EM forward modelling schemes with special emphasis on recent advances in the integral equation approach.
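A gradient-informed variant of M-H in the spirit described above is the Metropolis-adjusted Langevin step, where the proposal drifts along the local gradient of the log-posterior. The sketch below uses a toy 1-D Gaussian target; the paper's scheme also exploits Hessian information and adjoint machinery, which are omitted here:

```python
import math, random

def mala(log_post, grad_log_post, x0, step=0.2, n_steps=20000, seed=2):
    """Metropolis-adjusted Langevin: proposals drift along the gradient of
    the log-posterior, and the M-H ratio corrects for the asymmetry."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        mu_x = x + 0.5 * step * grad_log_post(x)      # drifted proposal mean
        prop = mu_x + math.sqrt(step) * rng.gauss(0.0, 1.0)
        mu_p = prop + 0.5 * step * grad_log_post(prop)
        # asymmetric Gaussian proposal densities (up to a common constant)
        log_q_fwd = -((prop - mu_x) ** 2) / (2.0 * step)
        log_q_rev = -((x - mu_p) ** 2) / (2.0 * step)
        if math.log(rng.random()) < (log_post(prop) - log_post(x)
                                     + log_q_rev - log_q_fwd):
            x = prop
        samples.append(x)
    return samples

# Toy 1-D "conductivity" posterior N(2, 1) with analytic gradient.
samples = mala(lambda x: -0.5 * (x - 2.0) ** 2, lambda x: -(x - 2.0), x0=0.0)
mean = sum(samples) / len(samples)
```

Because the drift pushes proposals toward high-probability regions, fewer forward evaluations are wasted on rejected moves than with a blind random walk.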
A Bayesian approach to earthquake source studies
NASA Astrophysics Data System (ADS)
Minson, Sarah
Bayesian sampling has several advantages over conventional optimization approaches to solving inverse problems. It produces the distribution of all possible models sampled proportionally to how much each model is consistent with the data and the specified prior information, and thus images the entire solution space, revealing the uncertainties and trade-offs in the model. Bayesian sampling is applicable to both linear and non-linear modeling, and the values of the model parameters being sampled can be constrained based on the physics of the process being studied and do not have to be regularized. However, these methods are computationally challenging for high-dimensional problems. Until now the computational expense of Bayesian sampling has been too great for it to be practicable for most geophysical problems. I present a new parallel sampling algorithm called CATMIP for Cascading Adaptive Tempered Metropolis In Parallel. This technique, based on Transitional Markov chain Monte Carlo, makes it possible to sample distributions in many hundreds of dimensions, if the forward model is fast, or to sample computationally expensive forward models in smaller numbers of dimensions. The design of the algorithm is independent of the model being sampled, so CATMIP can be applied to many areas of research. I use CATMIP to produce a finite fault source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. Surface displacements from the earthquake were recorded by six interferograms and twelve local high-rate GPS stations. Because of the wealth of near-fault data, the source process is well-constrained. I find that the near-field high-rate GPS data have significant resolving power above and beyond the slip distribution determined from static displacements. The location and magnitude of the maximum displacement are resolved. The rupture almost certainly propagated at sub-shear velocities. 
The full posterior distribution can be used not only to calculate source parameters but also to determine their uncertainties. So while kinematic source modeling and the estimation of source parameters is not new, with CATMIP I am able to use Bayesian sampling to determine which parts of the source process are well-constrained and which are not.
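The tempering idea behind CATMIP can be illustrated in miniature: raise the likelihood exponent from 0 to 1 in stages and move a population of samples by Metropolis steps at each stage. The sketch below omits CATMIP's resampling, proposal adaptation and parallelism, and uses a toy 1-D prior and likelihood:

```python
import math, random

def tempered_metropolis(log_like, log_prior, n_particles=200, n_levels=10,
                        n_moves=50, seed=3):
    """Toy likelihood tempering: the exponent beta rises from 0 to 1 in
    stages, and each particle evolves by Metropolis moves at every stage."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 3.0) for _ in range(n_particles)]  # prior draws
    for level in range(1, n_levels + 1):
        beta = level / n_levels
        log_target = lambda x: log_prior(x) + beta * log_like(x)
        for i, x in enumerate(particles):
            lp = log_target(x)
            for _ in range(n_moves):
                prop = x + rng.gauss(0.0, 0.5)
                lp_prop = log_target(prop)
                if math.log(rng.random()) < lp_prop - lp:
                    x, lp = prop, lp_prop
            particles[i] = x
    return particles

# Prior N(0, 3^2); sharply peaked likelihood centered at 2.
log_prior = lambda x: -0.5 * (x / 3.0) ** 2
log_like = lambda x: -0.5 * ((x - 2.0) / 0.2) ** 2
post = tempered_metropolis(log_like, log_prior)
```

Starting from the broad prior and sharpening gradually lets the population find narrow posterior modes that a cold-started sampler could easily miss.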
Garcia, Jonathan; Muñoz-Laboy, Miguel; Parker, Richard; Wilson, Patrick A
2014-04-01
Sex markets (the spatially and culturally bounded arenas) that shape bisexual behavior among Latino men have been utilized as a deterministic concept without a sufficient focus on the ability of individuals to make autonomous decisions within such arenas. We nuance the theory of sex markets using the concept of sexual opportunity structures to investigate the ways in which behaviorally bisexual Latino men in the urban metropolis of New York City navigate sexual geographies, cultural meaning systems, sexual scripts, and social institutions to configure their bisexual behaviors. Drawing on 60 in-depth interviews with bisexual Latino men in New York City, we first describe and analyze venues that constitute sexual geographies that facilitate and impede sexual interaction. These also allow for a degree of autonomy in decision-making, as men travel throughout the urban sexual landscape and sometimes even manage to reject norms, such as those imposed by Christian religion. We explore some of the cultural meaning systems and social institutions that regulate sex markets and influence individual decision-making. Secrecy and discretion-regulated by the family, masculinity, migration, and religion-only partially shaped sexual behavior and relationships. These factors create a flux in "equilibrium" in bisexual sex markets in which sociocultural-economic structures constantly interplay with human agency. This article contributes to the literature in identifying dynamic spaces for sexual health interventions that draw on individual agency and empowerment.
NASA Technical Reports Server (NTRS)
Shepherd, J. Marshall; Pierce, Harold; Starr, David OC. (Technical Monitor)
2001-01-01
This study represents one of the first published attempts to identify rainfall modification by urban areas using satellite-based rainfall measurements. Data from the first space-based rain radar, the Tropical Rainfall Measuring Mission's (TRMM) Precipitation Radar, are employed. Analysis of the data enables identification of rainfall patterns around Atlanta, Montgomery, Nashville, San Antonio, Waco, and Dallas during the warm season. Results reveal an average increase of ~28% in monthly rainfall rates within 30-60 kilometers downwind of the metropolis, with a modest increase of 5.6% over the metropolis. Portions of the downwind area exhibit increases as high as 51%. The percentage changes are relative to an upwind CONTROL area. It was also found that maximum rainfall rates in the downwind impact area can exceed the mean value in the upwind CONTROL area by 48%-116%. The maximum value was generally found at an average distance of 39 km from the edge of the urban center, or 64 km from the center of the city. These results are consistent with METROMEX studies of St. Louis almost two decades ago and more recent studies near Atlanta. Future work will investigate hypothesized factors causing rainfall modification by urban areas. Additional work is also needed to provide more robust validation of space-based rain estimates near major urban areas. Such research has implications for urban planning, water resource management, and understanding human impact on the environment.
Isaiah, Ibeh Nnana; Nche, Bikwe Thomas; Nwagu, Ibeh Georgina; Nnanna, Ibeh Isaiah
2011-01-01
Background: The rise of male infertility associated with bacteriospermia and urogenital infection has been increasing among adult married males in Benin metropolis, and is a major cause of concern for male fertility and reproduction in Nigeria. Aim: To isolate and study the infectious agents that lead to male infertility, and to determine the percentage occurrence of infertility caused by bacteriospermia and urogenital infection in adult married males in Benin metropolis. Materials and Methods: Using standard microbiological methods of isolation and identification, specimens were collected and processed, including the susceptibility profile of isolates and sperm quality. A total of 140 sperm samples were collected from patients referred from the consultant outpatient department of the University of Benin Teaching Hospital and evaluated bacteriologically using standard culture methods. Results: Among the total cases, 92 (65.7%) showed at least one pathogen: Staphylococcus aureus (28.3%), Staphylococcus saprophyticus (13.0%), Pseudomonas aeruginosa (6.5%), Escherichia coli (19.6%), Proteus mirabilis (10.8%), Klebsiella spp. (10.8%) and Proteus vulgaris (10.8%). Conclusion: There was a significant relationship between bacteriospermia and the rates of total motility and morphologically abnormal sperm; the percentage of morphologically normal sperm was lower in this study. Staphylococcus aureus, Staphylococcus saprophyticus and Escherichia coli were the most common pathogens having negative effects on sperm motility and morphology in this study. PMID:22363079
NASA Astrophysics Data System (ADS)
Karikari, A. Y.; Ampofo, J. A.
2013-06-01
Drinking water quality from two major treatment plants in Ghana, the Kpong and Weija plants, and their distribution networks in the Accra-Tema Metropolis was monitored monthly for a year at fifteen different locations. The study determined the relationship between chlorine residual, other physico-chemical qualities of the treated water, and bacterial regrowth. Results indicated that the treated water at the Kpong and Weija treatment plants conformed to WHO guidelines for potable water. However, the water quality deteriorated bacteriologically from the plants to the delivery points, with high numbers of indicator and opportunistic pathogens. This could be due to inadequate disinfection residual, biofilms or accidental point-source contamination from broken pipes, installation and repair works. The mean turbidity ranged from 1.6 to 2.4 NTU; pH varied from 6.8 to 7.4; conductivity fluctuated from 71.1 to 293 μS/cm. Chlorine residual ranged from 0.13 to 1.35 mg/l. High residual chlorine was observed at the treatment plants, and it decreased with distance from the plants. Results showed that additional chlorination does not take place at the booster stations. Chlorine showed an inverse relationship with microbial counts. Total coliform bacteria ranged from 0 to 248 cfu/100 ml, and faecal coliform values varied from 0 to 128 cfu/100 ml. Other microorganisms observed in the treated water included Aeromonas spp., Clostridium spp. and Pseudomonas spp. Boiling water in the household before consumption will reduce water-related health risks.
NASA Astrophysics Data System (ADS)
Post, Hanna; Vrugt, Jasper A.; Fox, Andrew; Vereecken, Harry; Hendricks Franssen, Harrie-Jan
2017-03-01
The Community Land Model (CLM) contains many parameters whose values are uncertain and thus require careful estimation for model application at individual sites. Here we used Bayesian inference with the DiffeRential Evolution Adaptive Metropolis (DREAM(zs)) algorithm to estimate eight CLM v.4.5 ecosystem parameters using 1-year records of half-hourly net ecosystem CO2 exchange (NEE) observations at four central European sites with different plant functional types (PFTs). The posterior CLM parameter distributions of each site were estimated per individual season and on a yearly basis. These estimates were then evaluated using NEE data from an independent evaluation period and data from "nearby" FLUXNET sites at 600 km distance from the original sites. Latent variables (multipliers) were used to treat explicitly the uncertainty in the initial carbon-nitrogen pools. The posterior parameter estimates were superior to their default values in their ability to track and explain the measured NEE data of each site. The seasonal parameter values reduced the bias in the simulated NEE values by more than 50% (averaged over all sites). The most consistent performance of CLM during the evaluation period was found for the posterior parameter values of the forest PFTs, and, contrary to the C3-grass and C3-crop sites, the latent variables of the initial pools further enhanced the quality of fit. The carbon sink function of the forest PFTs significantly increased with the posterior parameter estimates. We thus conclude that land surface model predictions of carbon stocks and fluxes require careful consideration of uncertain ecological parameters and initial states.
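The proposal idea underlying DREAM-family samplers such as DREAM(zs) descends from differential-evolution Markov chain (DE-MC) sampling, in which each chain proposes a jump along the difference of two other chains. A minimal 1-D sketch on an assumed Gaussian posterior follows; DREAM(zs) adds subspace sampling, a past-state archive and convergence diagnostics, all omitted here:

```python
import math, random

def de_mc(log_post, n_chains=8, n_gens=3000, seed=11):
    """Minimal differential-evolution MC (ter Braak's DE-MC): each chain
    proposes a jump along the difference of two other random chains."""
    rng = random.Random(seed)
    gamma = 2.38 / math.sqrt(2.0)      # standard jump scale for one dimension
    chains = [rng.uniform(-5.0, 5.0) for _ in range(n_chains)]
    lps = [log_post(x) for x in chains]
    samples = []
    for _ in range(n_gens):
        for i in range(n_chains):
            a, b = rng.sample([j for j in range(n_chains) if j != i], 2)
            # difference proposal plus tiny noise to keep the chain ergodic
            prop = chains[i] + gamma * (chains[a] - chains[b]) + rng.gauss(0.0, 1e-3)
            lp_prop = log_post(prop)
            if math.log(rng.random()) < lp_prop - lps[i]:
                chains[i], lps[i] = prop, lp_prop
        samples.extend(chains)
    return samples

# Toy "ecosystem parameter" posterior N(1.5, 0.5^2).
samples = de_mc(lambda x: -0.5 * ((x - 1.5) / 0.5) ** 2)
```

Because the jump size is read off the current spread of the chains, the proposal scale self-tunes to the posterior, which is what makes such samplers attractive for many-parameter land surface models.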
NASA Astrophysics Data System (ADS)
Han, Feng; Zheng, Yi
2018-06-01
Input uncertainty is a major source of error in watershed water quality (WWQ) modeling, and it remains challenging to address it in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived for a lag-1 autocorrelated, heteroscedastic, Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and the Differential Evolution Adaptive Metropolis (DREAM(ZS)) algorithm were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach for the joint inference of parameters and inputs.
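The error model named above combines lag-1 autocorrelation with heteroscedasticity. A simplified Gaussian sketch of such a likelihood (the paper's SEP density, and the specific phi and sigma coefficients below, are replaced by illustrative choices) decorrelates the residuals and lets the error standard deviation grow with the simulated value:

```python
import math

def ar1_hetero_loglik(obs, sim, phi=0.5, sigma0=0.1, sigma1=0.05):
    """Log-likelihood for residuals that are lag-1 autocorrelated and
    heteroscedastic (sd grows with the simulated value).  A Gaussian
    simplification of an SEP-type error model; phi, sigma0 and sigma1
    are illustrative values, not taken from the paper."""
    resid = [o - s for o, s in zip(obs, sim)]
    # decorrelate: innovations eta_t = r_t - phi * r_{t-1}
    eta = [resid[0]] + [resid[t] - phi * resid[t - 1] for t in range(1, len(resid))]
    ll = 0.0
    for e, s in zip(eta, sim):
        sd = sigma0 + sigma1 * abs(s)   # heteroscedastic standard deviation
        ll += -0.5 * math.log(2.0 * math.pi * sd * sd) - 0.5 * (e / sd) ** 2
    return ll

obs = [1.0, 1.2, 0.9, 1.1]
good = ar1_hetero_loglik(obs, [1.0, 1.2, 0.9, 1.1])  # zero residuals
bad = ar1_hetero_loglik(obs, [2.0, 2.2, 1.9, 2.1])   # biased simulation
```

Plugged into an MCMC sampler, a likelihood of this form is what lets input multipliers and model parameters be inferred jointly rather than sequentially.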
A computer program for uncertainty analysis integrating regression and Bayesian methods
Lu, Dan; Ye, Ming; Hill, Mary C.; Poeter, Eileen P.; Curtis, Gary
2014-01-01
This work develops a new functionality in UCODE_2014 to evaluate Bayesian credible intervals using the Markov Chain Monte Carlo (MCMC) method. The MCMC capability in UCODE_2014 is based on the FORTRAN version of the differential evolution adaptive Metropolis (DREAM) algorithm of Vrugt et al. (2009), which estimates the posterior probability density function of model parameters in high-dimensional and multimodal sampling problems. The UCODE MCMC capability provides eleven prior probability distributions and three ways to initialize the sampling process. It evaluates parametric and predictive uncertainties and it has parallel computing capability based on multiple chains to accelerate the sampling process. This paper tests and demonstrates the MCMC capability using a 10-dimensional multimodal mathematical function, a 100-dimensional Gaussian function, and a groundwater reactive transport model. The use of the MCMC capability is made straightforward and flexible by adopting the JUPITER API protocol. With the new MCMC capability, UCODE_2014 can be used to calculate three types of uncertainty intervals, which all can account for prior information: (1) linear confidence intervals which require linearity and Gaussian error assumptions and typically 10s–100s of highly parallelizable model runs after optimization, (2) nonlinear confidence intervals which require a smooth objective function surface and Gaussian observation error assumptions and typically 100s–1,000s of partially parallelizable model runs after optimization, and (3) MCMC Bayesian credible intervals which require few assumptions and commonly 10,000s–100,000s or more partially parallelizable model runs. Ready access allows users to select methods best suited to their work, and to compare methods in many circumstances.
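The MCMC-based intervals in item (3) are computed from posterior samples. A minimal sketch of an equal-tailed credible interval from a chain (the nearest-rank index convention below is a simple illustrative choice, not necessarily UCODE_2014's):

```python
def credible_interval(samples, level=0.95):
    """Equal-tailed Bayesian credible interval from MCMC samples,
    using a simple nearest-rank quantile convention."""
    s = sorted(samples)
    tail = (1.0 - level) / 2.0
    lo = s[round(tail * (len(s) - 1))]        # lower tail quantile
    hi = s[round((1.0 - tail) * (len(s) - 1))]  # upper tail quantile
    return lo, hi

# Toy chain: a deterministic grid standing in for posterior draws on [0, 1].
chain = [i / 1000.0 for i in range(1001)]
lo, hi = credible_interval(chain, level=0.90)   # approximately (0.05, 0.95)
```

Unlike the linearized confidence intervals, nothing here assumes Gaussian errors; the interval is read directly off the sampled posterior.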
NASA Astrophysics Data System (ADS)
Koskela, J. J.; Croke, B. W. F.; Koivusalo, H.; Jakeman, A. J.; Kokkonen, T.
2012-11-01
Bayesian inference is used to study the effect of precipitation and model structural uncertainty on estimates of model parameters and confidence limits of predictive variables in a conceptual rainfall-runoff model of the snow-fed Rudbäck catchment (142 ha) in southern Finland. The IHACRES model is coupled with a simple degree-day model to account for snow accumulation and melt. The posterior probability distribution of the model parameters is sampled using the Differential Evolution Adaptive Metropolis (DREAM(ZS)) algorithm and the generalized likelihood function. Precipitation uncertainty is taken into account by introducing additional latent variables that are used as multipliers for individual storm events. Results suggest that occasional snow water equivalent (SWE) observations together with daily streamflow observations do not contain enough information to simultaneously identify model parameters, precipitation uncertainty and model structural uncertainty in the Rudbäck catchment. The addition of an autoregressive component to account for model structural error, and of latent variables with uniform priors to account for input uncertainty, leads to dubious posterior distributions of model parameters. Thus our hypothesis that informative priors for latent variables could be replaced by additional SWE data could not be confirmed. The model was found to work adequately in 1-day-ahead simulation mode, but the results were poor in batch simulation mode. This was caused by the interaction of the parameters used to describe different sources of uncertainty. The findings may hold lessons for other cases where the number of parameters is similarly high relative to the available prior information.
NASA Astrophysics Data System (ADS)
Jakkareddy, Pradeep S.; Balaji, C.
2017-02-01
This paper reports the results of an experimental study to estimate the heat flux and convective heat transfer coefficient using liquid crystal thermography and Bayesian inference in a heat-generating sphere enclosed in a cubical Teflon block. The geometry considered for the experiments comprises a heater inserted in a hollow hemispherical aluminium ball, resulting in a volumetric heat generation source placed at the center of the Teflon block. Calibrated thermochromic liquid crystal (TLC) sheets are used to capture the temperature distribution at the front face of the Teflon block. The forward model is the three-dimensional conduction equation, which is solved within the Teflon block in COMSOL to obtain steady-state temperatures. Match-up experiments are carried out for various velocities by minimizing the residual between TLC and simulated temperatures for every assumed loss coefficient, to obtain a correlation of average Nusselt number against Reynolds number. This correlation is used to prescribe the boundary condition for the solution of the forward model. A surrogate model, an artificial neural network built upon the data from the COMSOL simulations, is used to drive a Markov chain Monte Carlo Metropolis-Hastings algorithm to generate the samples. Bayesian inference is adopted to solve the inverse problem of determining the heat flux and heat transfer coefficient from the measured temperature field. Point estimates of the posterior, such as the mean, the maximum a posteriori estimate and the standard deviation of the retrieved heat flux and convective heat transfer coefficient, are reported. Additionally, the effect of the number of samples on the performance of the estimation process is investigated.
NASA Astrophysics Data System (ADS)
Alzate-Cardona, J. D.; Barco-Rios, H.; Restrepo-Parra, E.
2018-02-01
The magnetocaloric behavior of La2/3Ca1/3Mn1-xFexO3 for x = 0.00, 0.02, 0.03, 0.05, 0.07, 0.08 and 0.10 under the influence of an external magnetic field was simulated and analyzed. Simulations were carried out using the Monte Carlo method and the classical Heisenberg model under the Metropolis algorithm. These mixed-valence manganites are characterized by three types of magnetic ions: Mn4+ (S = 3/2), which are bonded with Ca2+, and Mn3+ and Mn′3+ (S = 2), related to La3+. The Fe ions were randomly included, replacing Mn ions. With this model, the magnetic entropy change, ΔS, in an isothermal process was determined. -ΔSm showed maximum peaks around the paramagnetic-ferromagnetic transition temperature, which depends on Fe doping. The relative cooling power was computed for different Fe concentrations, varying the applied magnetic field. Our model and results show that Fe doping decreases the magnetocaloric effect in La2/3Ca1/3Mn1-xFexO3, making it a poor candidate for magnetic refrigeration. The strong dependence of the magnetocaloric behavior on Fe doping and the external magnetic field in La2/3Ca1/3Mn1-xFexO3 can nonetheless make these materials attractive for future technological applications.
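A single-spin Metropolis update for a classical Heisenberg model can be sketched as below. For brevity this toy uses a periodic 1-D chain with a single exchange constant J and a field along z, whereas the study simulates a 3-D manganite lattice with several couplings; all numerical values are illustrative.

```python
import math
import random

def random_unit_vector(rng):
    # Uniformly distributed direction on the unit sphere
    z = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    s = math.sqrt(1.0 - z * z)
    return (s * math.cos(phi), s * math.sin(phi), z)

def local_energy(spins, i, J, H):
    # Classical Heisenberg energy of spin i: exchange with its two neighbours
    # on a periodic 1-D chain plus a Zeeman term for a field H along z.
    n = len(spins)
    si = spins[i]
    e = -H * si[2]
    for j in ((i - 1) % n, (i + 1) % n):
        sj = spins[j]
        e -= J * (si[0] * sj[0] + si[1] * sj[1] + si[2] * sj[2])
    return e

def metropolis_sweep(spins, J, H, T, rng):
    # One attempted update per spin: propose a fresh random orientation and
    # accept with probability min(1, exp(-dE/T)), in units with k_B = 1.
    for i in range(len(spins)):
        old = spins[i]
        e_old = local_energy(spins, i, J, H)
        spins[i] = random_unit_vector(rng)
        d_e = local_energy(spins, i, J, H) - e_old
        if rng.random() >= math.exp(min(0.0, -d_e / T)):
            spins[i] = old          # reject: restore the previous orientation

rng = random.Random(0)
spins = [random_unit_vector(rng) for _ in range(100)]
for _ in range(300):
    metropolis_sweep(spins, J=1.0, H=1.0, T=0.5, rng=rng)
m_z = sum(s[2] for s in spins) / len(spins)   # field-aligned magnetization
```

In a magnetocaloric study, sweeps like this would be repeated over a grid of temperatures and fields to build M(T, H), from which the entropy change ΔS is obtained via a Maxwell relation.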
NASA Astrophysics Data System (ADS)
Liu, Boda; Liang, Yan
2017-04-01
Markov chain Monte Carlo (MCMC) simulation is a powerful statistical method for solving inverse problems that arise in a wide range of applications. In the Earth sciences, applications of MCMC simulation are primarily in the field of geophysics. The purpose of this study is to introduce MCMC methods to geochemical inverse problems related to trace element fractionation during mantle melting. MCMC methods have several advantages over least squares methods in deciphering melting processes from trace element abundances in basalts and mantle rocks. Here we use an MCMC method to invert for the extent of melting, the fraction of melt present during melting, and the extent of chemical disequilibrium between the melt and residual solid from REE abundances in clinopyroxene in abyssal peridotites from the Mid-Atlantic Ridge, Central Indian Ridge, Southwest Indian Ridge, Lena Trough, and American-Antarctic Ridge. We consider two melting models: one with an exact analytical solution and one without. We solve the latter numerically in a chain of melting models according to the Metropolis-Hastings algorithm. The probability distribution of the inverted melting parameters depends on the assumptions of the physical model, knowledge of the mantle source composition, and constraints from the REE data. Results from MCMC inversion are consistent with, and provide more reliable uncertainty estimates than, results based on nonlinear least squares inversion. We show that chemical disequilibrium is likely to play an important role in fractionating LREE in residual peridotites during partial melting beneath mid-ocean ridge spreading centers. MCMC simulation is well suited for more complicated but physically more realistic melting problems that do not have analytical solutions.
Ochoa, Silvia; Yoo, Ahrim; Repke, Jens-Uwe; Wozny, Günter; Yang, Dae Ryook
2007-01-01
Despite many environmental advantages of using alcohol as a fuel, there are still serious questions about its economical feasibility when compared with oil-based fuels. The bioethanol industry needs to be more competitive, and therefore, all stages of its production process must be simple, inexpensive, efficient, and "easy" to control. In recent years, there have been significant improvements in process design, such as in the purification technologies for ethanol dehydration (molecular sieves, pressure swing adsorption, pervaporation, etc.) and in genetic modifications of microbial strains. However, a lot of research effort is still required in optimization and control, where the first step is the development of suitable models of the process, which can be used as a simulated plant, as a soft sensor or as part of the control algorithm. Thus, toward developing good, reliable, and simple but highly predictive models that can be used in the future for optimization and process control applications, in this paper an unstructured and a cybernetic model are proposed and compared for the simultaneous saccharification-fermentation process (SSF) for the production of ethanol from starch by a recombinant Saccharomyces cerevisiae strain. The cybernetic model proposed is a new one that considers the degradation of starch not only into glucose but also into dextrins (reducing sugars) and takes into account the intracellular reactions occurring inside the cells, giving a more detailed description of the process. Furthermore, an identification procedure based on the Metropolis Monte Carlo optimization method coupled with a sensitivity analysis is proposed for the identification of the model's parameters, employing experimental data reported in the literature.
NASA Astrophysics Data System (ADS)
Jakkareddy, Pradeep S.; Balaji, C.
2016-09-01
This paper employs the Bayesian-based Metropolis-Hastings Markov Chain Monte Carlo algorithm to solve the inverse heat transfer problem of determining the spatially varying heat transfer coefficient on a flat plate with flush-mounted discrete heat sources, from measured temperatures at the bottom of the plate. The Nusselt number is assumed to be of the form Nu = a Re^b (x/l)^c. To input reasonable values of 'a' and 'b' into the inverse problem, limited two-dimensional conjugate convection simulations were first done with COMSOL. Based on this guidance, different values of 'a' and 'b' were input to a computationally less complex problem of conjugate conduction in the flat plate (15 mm thickness), and temperature distributions at the bottom of the plate, which is a more convenient location for measuring temperatures without disturbing the flow, were obtained. Since the goal of this work is to demonstrate the efficacy of the Bayesian approach to accurately retrieve 'a' and 'b', numerically generated temperatures with known values of 'a' and 'b' are treated as 'surrogate' experimental data. The inverse problem is then solved by repeatedly using the forward solutions together with the MH-MCMC approach. To speed up the estimation, the forward model is replaced by an artificial neural network. The mean, maximum a posteriori and standard deviation of the estimated parameters 'a' and 'b' are reported. The robustness of the proposed method is examined by synthetically adding noise to the temperatures.
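The assumed correlation form can be written out directly. The coefficient values below are placeholders for illustration, not the estimates retrieved in the paper, and the air conductivity and length scale used to convert Nu to h are likewise assumed.

```python
def nusselt(a, b, c, re, x_over_l):
    # Assumed correlation form from the text: Nu = a * Re**b * (x/l)**c
    return a * (re ** b) * (x_over_l ** c)

# Placeholder coefficients (NOT the retrieved values) at Re = 1e4, x/l = 0.5
nu = nusselt(a=0.3, b=0.6, c=-0.1, re=1.0e4, x_over_l=0.5)

# Local heat transfer coefficient h = Nu * k / L, assuming air conductivity
# k = 0.026 W/m K and the 15 mm plate thickness as the length scale L
h = nu * 0.026 / 0.015
```

In the inverse problem, 'a', 'b' (and 'c') are the unknowns; each MH-MCMC proposal evaluates this correlation inside the conduction forward model (or its ANN surrogate) and compares the predicted bottom-wall temperatures with the data.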
Marsh, Sharon; Hu, Junbo; Feng, Wenke
2016-01-01
Nonalcoholic fatty liver disease (NAFLD) is the most common chronic liver disease in the world, and it comprises a spectrum of hepatic abnormalities from simple hepatic steatosis to steatohepatitis, fibrosis, cirrhosis, and liver cancer. While the pathogenesis of NAFLD remains incompletely understood, a multihit model has been proposed that accommodates causal factors from a variety of sources, including intestinal and adipose proinflammatory stimuli acting on the liver simultaneously. Prior cellular and molecular studies of patient and animal models have characterized several common pathogenic mechanisms of NAFLD, including proinflammation cytokines, lipotoxicity, oxidative stress, and endoplasmic reticulum stress. In recent years, gut microbiota has gained much attention, and dysbiosis is recognized as a crucial factor in NAFLD. Moreover, several genetic variants have been identified through genome-wide association studies, particularly rs738409 (Ile748Met) in PNPLA3 and rs58542926 (Glu167Lys) in TM6SF2, which are critical risk alleles of the disease. Although a high-fat diet and inactive lifestyles are typical risk factors for NAFLD, the interplay between diet, gut microbiota, and genetic background is believed to be more important in the development and progression of NAFLD. This review summarizes the common pathogenic mechanisms, the gut microbiota relevant mechanisms, and the major genetic variants leading to NAFLD and its progression. PMID:27247565
Promoting Improved Ballistic Resistance of Transparent Armor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wereszczak, Andrew A; Patel, P; Templeton, D W
2011-01-01
Transparent armor is a material or system of materials designed to be optically transparent, yet protect from fragmentation or ballistic impacts. Although engineered to defeat specific threats, or a range of threats, there are general requirements common to all of these designs. The primary requirement for a transparent armor system is not only to defeat the designated threat but also to provide multi-hit capability with minimized distortion of surrounding areas. Ground platforms have several parameters that must be optimized, such as weight, space efficiency, and cost versus performance. Glass exhibits a tensile failure stress that is very much dependent on the amount of material being stressed, the side being tensile-stressed (i.e., air- versus tin-side for a float glass), and where it is being tensile-stressed (i.e., in the middle or near an edge). An axiom arising from those effects is that a greater amount of allowable deflection (i.e., higher failure stress) of a ballistically impacted transparent armor will result in improved ballistic resistance. Therefore, the interpretation and management of those tensile-failure-stress dependencies can ultimately improve the ballistic resistance, and its predictability, of transparent armor. Each of those three dependencies (size, side, and location) in a soda-lime silicate glass is described.
LMC: Logarithmantic Monte Carlo
NASA Astrophysics Data System (ADS)
Mantz, Adam B.
2017-06-01
LMC is a Markov Chain Monte Carlo engine in Python that implements adaptive Metropolis-Hastings and slice sampling, as well as the affine-invariant method of Goodman & Weare, in a flexible framework. It can be used for simple problems, but the main use case is problems where expensive likelihood evaluations are provided by less flexible third-party software, which benefit from parallelization across many nodes at the sampling level. The parallel/adaptive methods use communication through MPI, or alternatively by writing/reading files, and mostly follow the approaches pioneered by CosmoMC (ascl:1106.025).
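The affine-invariant stretch move of Goodman & Weare, which LMC implements alongside adaptive Metropolis-Hastings and slice sampling, can be sketched in a serial toy form. This is a generic sketch of the published algorithm on a toy Gaussian target, not LMC's actual API.

```python
import math
import random

def stretch_move(walkers, log_prob, a=2.0, rng=random):
    # One pass of the Goodman & Weare (2010) stretch move: each walker is
    # moved along the line through itself and a randomly chosen complementary
    # walker, with the scale z drawn from g(z) proportional to 1/sqrt(z)
    # on [1/a, a].
    n = len(walkers)
    for k in range(n):
        j = rng.randrange(n - 1)
        if j >= k:
            j += 1                                  # complementary walker, j != k
        z = ((a - 1.0) * rng.random() + 1.0) ** 2 / a
        x_k, x_j = walkers[k], walkers[j]
        y = [xj + z * (xk - xj) for xk, xj in zip(x_k, x_j)]
        d = len(x_k)
        log_r = (d - 1) * math.log(z) + log_prob(y) - log_prob(x_k)
        if math.log(rng.random()) < log_r:
            walkers[k] = y                          # accept the stretch
    return walkers

def log_gauss(x):
    # Toy target: standard 2-D Gaussian
    return -0.5 * (x[0] ** 2 + x[1] ** 2)

rng = random.Random(42)
walkers = [[rng.gauss(0.0, 3.0), rng.gauss(0.0, 3.0)] for _ in range(20)]
for _ in range(500):
    stretch_move(walkers, log_gauss, rng=rng)
```

The appeal for expensive third-party likelihoods, as in LMC's main use case, is that each walker's `log_prob` evaluation can be farmed out to a separate node per update.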
Lattice QCD calculation using VPP500
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Seyong; Ohta, Shigemi
1995-02-01
A new vector parallel supercomputer, Fujitsu VPP500, was installed at RIKEN earlier this year. It consists of 30 vector computers, each with 1.6 GFLOPS peak speed and 256 MB memory, connected by a crossbar switch with 400 MB/s peak data transfer rate each way between any pair of nodes. The authors developed a Fortran lattice QCD simulation code for it. It runs at about 1.1 GFLOPS sustained per node for Metropolis pure-gauge update, and about 0.8 GFLOPS sustained per node for conjugate gradient inversion of staggered fermion matrix.
Out to eat: the emergence and evolution of the restaurant in nineteenth-century New York City.
Lobel, Cindy R
2010-01-01
Unheard of in the eighteenth century, restaurants became an integral part of New York City's public culture in the antebellum period. This article examines the emergence and development of New York's restaurant sector in the nineteenth century, focusing on three aspects in particular: the close ties between urbanization and the rise of New York's restaurants, the role restaurants played in enforcing the city's class structure and gender mores, and the role of restaurants in shaping the public culture of the growing metropolis.
Matsumoto, Yoko; Nakai, Akihito; Nishijima, Yasuhiro; Kishita, Eisaku; Hakuno, Haruhiko; Sakoi, Masami; Kusuda, Satoshi; Unno, Nobuya; Tamura, Masanori; Fujii, Tomoyuki
2016-10-01
National medical projects are carried out according to medical care plans directed by the Medical Care Act of Japan. In order to improve Japanese perinatal medical care, it is necessary to determine the factors that might influence perinatal outcome. Statistical data on births and perinatal deaths were obtained for all municipalities in Japan from 2008 to 2012 from the Portal Site of Official Statistics of Japan (e-Stat). The perinatal mortality of all 349 Japanese secondary medical care zones was calculated. The numbers of neonatal intensive care units (NICUs), maternal-fetal intensive care units (MFICUs), pediatricians and obstetricians in 2011 were also obtained from e-Stat. Nine secondary medical care zones in two prefectures, Fukushima (7) and Miyagi (2), were excluded to eliminate the influence of the 2011 Great East Japan Earthquake. The remaining 340 secondary medical care zones were divided into three groups according to population size and density: metropolis, provincial city, and depopulation. The numbers of secondary medical care zones in these groups were 52, 168, and 120, respectively. The secondary medical care zones in the depopulation group had fewer pediatricians and significantly fewer NICUs and MFICUs than the metropolis group, but there was no significant difference in perinatal mortality. The only independent risk factor for high perinatal mortality, determined by multivariable analysis, was the absence of an NICU (P = 0.011). To set directions in perinatal medical care, the planned arrangement of, and appropriate access to, NICUs is indispensable. © 2016 Japan Society of Obstetrics and Gynecology.
NASA Astrophysics Data System (ADS)
Alaigba, D. B.; Soumah, M.; Banjo, M. O.
2017-05-01
The problem of urban mobility is complicated by traffic delay resulting from poor planning, high population density and the poor condition of roads within urban spaces. This study assessed traffic congestion resulting from the differential contributions made by various land uses along the Apapa-Oworoshoki expressway in Lagos metropolis. The data for this study were from both primary and secondary sources: GPS point data were collected at selected points for traffic volume counts, together with observations of the nature of vehicular traffic congestion and of land use types along the corridor. Existing data on traffic counts along the corridor, a connectivity map and a land use map sourced from the relevant authorities were acquired. Traffic congestion within the area was estimated using the volume-capacity ratio (V/C). A Heterogeneity Index was developed and used to quantify the percentage contribution to traffic volume from the various land-use categories. Analytical Hierarchical Processing (AHP) and knowledge-based weighting were used to rank the importance of the different heterogeneity indices. Results showed a significant relationship between the degree of heterogeneity of the land use pattern and road traffic congestion. The computed volume-capacity ratio revealed that the route corridor exceeds its designed capacity in the southward direction between the hours of 8 am and 12 pm on working days. Six major nodes were analyzed along the corridor, and all were above the expected Passenger Car Unit (PCU): "Oshodi" 15%, "Airport junction" 10%, "Cele bus stop" 21%, "Mile 2" 14%, "Berger" 15% and "Tincan bus stop" 33%, indicating heavy traffic congestion.
Assessment of Heavy Metal Pollution in Topsoil around Beijing Metropolis
Sun, Ranhao; Chen, Liding
2016-01-01
The topsoil around Beijing metropolis, China, is experiencing impacts of rapid urbanization, intensive farming, and extensive industrial emissions. We analyzed the concentrations of Cu, Ni, Pb, Zn, Cd, and Cr from 87 topsoil samples in the pre-rainy season and 115 samples in the post-rainy season. These samples were attributed to nine land use types: forest, grass, shrub, orchard, wheat, cotton, spring maize, summer maize, and mixed farmland. The pollution index (PI) of heavy metals was calculated from the measured and background concentrations. The ecological risk index (RI) was assessed based on the PI values and toxic-response parameters. The results showed that the mean PI values of Pb, Cr, and Cd were > 1 while those of Cu, Ni, and Zn were < 1. All the samples had low ecological risk for Cu, Ni, Pb, Zn, and Cr while only 15.35% of samples had low ecological risk for Cd. Atmospheric transport rather than land use factors best explained the seasonal variations in heavy metal concentrations and the impact of atmospheric transport on heavy metal concentrations varied according to the heavy metal types. The concentrations of Cu, Cd, and Cr decreased from the pre- to post-rainy season, while those of Ni, Pb, and Zn increased during this period. Future research should be focused on the underlying atmospheric processes that lead to these spatial and seasonal variations in heavy metals. The policymaking on environmental management should pay close attention to potential ecological risks of Cd as well as identifying the transport pathways of different heavy metals. PMID:27159454
Amjadian, Keyvan; Sacchi, Elisa; Rastegari Mehr, Meisam
2016-11-01
Urban soil contamination is a growing concern because of its potential health impact on the increasing number of people living in these areas. In this study, the concentrations, the distribution, the contamination levels, and the role of land use were investigated in Erbil metropolis, the capital of Iraqi Kurdistan. A total of 74 soil samples were collected, treated, and analyzed for their physicochemical properties and for 7 heavy metals (As, Cd, Cr, Cu, Fe, Pb, and Zn) and 16 PAHs. High concentrations, especially of Cd, Cu, Pb, and Zn, were found. The Geoaccumulation Index (Igeo), along with correlation coefficients and principal component analysis (PCA), showed that Cd, Cu, Pb, and Zn have similar behaviors and spatial distribution patterns. Heavy traffic density mainly contributed to the high concentrations of these metals. The total concentration of ∑PAHs ranged from 24.26 to 6129.14 ng/g with a mean of 2296.1 ng/g. The PAH pattern was dominated by 4- and 5-ring PAHs, while diagnostic ratios and PCA indicated that the main sources of PAHs were pyrogenic. The toxic equivalent (TEQ) values ranged from 3.26 to 362.84 ng/g, with higher values in the central parts of the city. A statistically significant difference in As, Cd, Cu, Pb, Zn, and ∑PAH concentrations between different land uses was observed. The highest As concentrations were found in agricultural areas, while roadside, commercial, and industrial areas had the highest Cd, Cu, Pb, Zn, and ∑PAH contents.
NASA Technical Reports Server (NTRS)
Shepherd, J. Marshell; Starr, David OC. (Technical Monitor)
2001-01-01
A novel approach is introduced to correlating urbanization and rainfall modification. This study represents one of the first published attempts (possibly the first) to identify and quantify rainfall modification by urban areas using satellite-based rainfall measurements. Previous investigations successfully used rain gauge networks and ground-based radar to investigate this phenomenon but encountered difficulties due to limited, specialized measurements and the separation of topographic and other influences. Three years of mean monthly rainfall rates derived from the first space-based rainfall radar, the Tropical Rainfall Measuring Mission's (TRMM) Precipitation Radar, are employed. Analysis of data at half-degree latitude resolution enables identification of rainfall patterns around the major metropolitan areas of Atlanta, Montgomery, Nashville, San Antonio, Waco, and Dallas during the warm season. Preliminary results reveal an average increase of 5.6% in monthly rainfall rates (relative to a mean upwind CONTROL area) over the metropolis, but an average increase of approximately 28% in monthly rainfall rates within 30-60 kilometers downwind of the metropolis. Some portions of the downwind area exhibit increases as high as 51%. It was also found that the maximum rainfall rates in the downwind impact area exceeded the mean value in the upwind CONTROL area by 48%-116% and were generally found at an average distance of 39 km from the edge of the urban center, or 64 km from the center of the city. These results are quite consistent with studies of St. Louis (e.g., METROMEX) and Chicago conducted almost two decades ago and with more recent studies in the Atlanta and Mexico City areas.
Constant-pH Hybrid Nonequilibrium Molecular Dynamics–Monte Carlo Simulation Method
2016-01-01
A computational method is developed to carry out explicit solvent simulations of complex molecular systems under conditions of constant pH. In constant-pH simulations, preidentified ionizable sites are allowed to spontaneously protonate and deprotonate as a function of time in response to the environment and the imposed pH. The method, based on a hybrid scheme originally proposed by H. A. Stern (J. Chem. Phys. 2007, 126, 164112), consists of carrying out short nonequilibrium molecular dynamics (neMD) switching trajectories to generate physically plausible configurations with changed protonation states that are subsequently accepted or rejected according to a Metropolis Monte Carlo (MC) criterion. To ensure microscopic detailed balance arising from such nonequilibrium switches, the atomic momenta are altered according to the symmetric two-ends momentum reversal prescription. To achieve higher efficiency, the original neMD–MC scheme is separated into two steps, reducing the need for generating a large number of unproductive and costly nonequilibrium trajectories. In the first step, the protonation state of a site is randomly attributed via a Metropolis MC process on the basis of an intrinsic pKa; an attempted nonequilibrium switch is generated only if this change in protonation state is accepted. This hybrid two-step inherent pKa neMD–MC simulation method is tested with single amino acids in solution (Asp, Glu, and His) and then applied to turkey ovomucoid third domain and hen egg-white lysozyme. Because of the simple linear increase in the computational cost relative to the number of titratable sites, the present method is naturally able to treat extremely large systems. PMID:26300709
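The inexpensive first step of the two-step scheme, a Metropolis MC move on the protonation state driven only by an intrinsic pKa, can be sketched as follows. The pKa and pH values are illustrative; in the full method an accepted flip would then trigger the costly neMD switching trajectory rather than being counted directly.

```python
import math
import random

LN10 = math.log(10.0)

def attempt_protonation_flip(protonated, pka, ph, rng):
    # Metropolis MC on a single titratable site.  The reduced free-energy
    # change for protonated -> deprotonated is dG/kT = ln(10) * (pKa - pH);
    # the reverse move has the opposite sign.
    dg = LN10 * (pka - ph) if protonated else LN10 * (ph - pka)
    if rng.random() < math.exp(min(0.0, -dg)):
        return not protonated        # flip accepted
    return protonated                # flip rejected

# A long MC run reproduces the Henderson-Hasselbalch protonated fraction,
# 1 / (1 + 10**(pH - pKa)), which is about 0.24 for these values.
rng = random.Random(0)
pka, ph = 4.0, 4.5                   # Asp-like site, illustrative values
state, hits = True, 0
n_steps = 200000
for _ in range(n_steps):
    state = attempt_protonation_flip(state, pka, ph, rng)
    hits += state
frac = hits / n_steps
```

Because this screening step is essentially free, unproductive nonequilibrium trajectories are avoided whenever the intrinsic-pKa move is rejected.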
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ojedokun, Oluyinka, E-mail: yinkaoje2004@yahoo.com
Highlights: > Independently, altruism and locus of control contributed significantly to attitude towards littering. > Altruism and locus of control jointly contributed significantly to attitude towards littering. > The results further show a significant joint influence of altruism and locus of control on REB. > The independent contributions reveal that altruism and locus of control contribute significantly to REB. > Attitude towards littering mediates the relationship between locus of control and REB. - Abstract: The study tested whether attitude towards littering mediates the relationship between personality attributes (altruism and locus of control) and responsible environmental behavior (REB) among some residents of Ibadan metropolis, Nigeria. Using a multistage sampling technique, measures of each construct were administered to 1360 participants. Results reveal significant independent and joint influences of the personality attributes on attitude towards littering and responsible environmental behavior, respectively. Attitude towards littering also mediates the relationship between personality characteristics and REB. These findings imply that individuals who possess certain desirable personality characteristics and who have an unfavorable attitude towards littering are more likely to engage in pro-environmental behavior. Therefore, stakeholders who have waste management as their priority should incorporate this information when guidelines for public education and litter prevention programs are being developed. It is suggested that psychologists should be involved in the design of litter prevention strategies. This will ensure the inclusion of behavioral issues in such strategies. An integrated approach to litter prevention that combines empowerment, cognitive, social, and technical solutions is recommended as the most effective tool for tackling the litter problem among residents of Ibadan metropolis.
NASA Astrophysics Data System (ADS)
Liu, Y. R.; Li, Y. P.; Huang, G. H.; Zhang, J. L.; Fan, Y. R.
2017-10-01
In this study, a Bayesian-based multilevel factorial analysis (BMFA) method is developed to assess parameter uncertainties and their effects on hydrological model responses. In BMFA, the Differential Evolution Adaptive Metropolis (DREAM) algorithm is employed to approximate the posterior distributions of model parameters with Bayesian inference; the factorial analysis (FA) technique is used for measuring the specific variations of hydrological responses in terms of posterior distributions to investigate the individual and interactive effects of parameters on model outputs. BMFA is then applied to a case study of the Jinghe River watershed in the Loess Plateau of China to demonstrate its validity and applicability. The uncertainties of four sensitive parameters, including the soil conservation service runoff curve number for moisture condition II (CN2), soil hydraulic conductivity (SOL_K), plant available water capacity (SOL_AWC), and soil depth (SOL_Z), are investigated. Results reveal that (i) CN2 has a positive effect on peak flow, implying that concentrated rainfall during the rainy season can cause infiltration-excess surface flow, which is a considerable contributor to peak flow in this watershed; (ii) SOL_K has a positive effect on average flow, implying that the widely distributed cambisols can lead to medium percolation capacity; (iii) the interaction between SOL_AWC and SOL_Z has a noticeable effect on peak flow, and their effects are dependent upon each other, which discloses that soil depth can significantly influence the processes of plant uptake of soil water in this watershed. Based on the above findings, the significant parameters and the relationships among uncertain parameters can be specified, such that the hydrological model's capability for simulating/predicting the water resources of the Jinghe River watershed can be improved.
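The factorial-analysis step, computing main and interaction effects of parameters on a model response, can be sketched with a two-level design and a toy response surface. The response function and its coefficients are invented for illustration; they are not outputs of the hydrological model or of BMFA.

```python
from itertools import product

def response(cn2, sol_k):
    # Toy stand-in for the hydrological model output (e.g. peak flow), with
    # coded factor levels -1/+1; the coefficients are purely illustrative.
    return 0.8 * cn2 + 0.3 * sol_k + 0.1 * cn2 * sol_k

levels = (-1.0, 1.0)
runs = {(a, b): response(a, b) for a, b in product(levels, levels)}

def main_effect(factor):
    # Average response at the factor's high level minus at its low level
    hi = [y for x, y in runs.items() if x[factor] > 0]
    lo = [y for x, y in runs.items() if x[factor] < 0]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

def interaction_effect():
    # Half the change in the CN2 effect when SOL_K goes from low to high
    return ((runs[(1.0, 1.0)] - runs[(-1.0, 1.0)])
            - (runs[(1.0, -1.0)] - runs[(-1.0, -1.0)])) / 2.0

cn2_effect = main_effect(0)        # -> 1.6
sol_k_effect = main_effect(1)      # -> 0.6
cn2_x_solk = interaction_effect()  # -> 0.2
```

In BMFA the factor levels would be quantiles of the DREAM posterior samples rather than fixed coded values, so the effects carry the parameter uncertainty through to the response.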
Assessment of parametric uncertainty for groundwater reactive transport modeling,
Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun
2014-01-01
The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and the Bayesian methods with a Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions from Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, the predictive performance of the formal generalized likelihood function is superior to that of least squares regression and of the Bayesian methods with a Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the Differential Evolution Adaptive Metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and the Morris- and DREAM(ZS)-based global sensitivity analyses yield almost identical rankings of parameter importance.
The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
NASA Astrophysics Data System (ADS)
Scharnagl, B.; Vrugt, J. A.; Vereecken, H.; Herbst, M.
2010-02-01
A major drawback of current soil organic carbon (SOC) models is that their conceptually defined pools do not necessarily correspond to measurable SOC fractions in real practice. This not only impairs our ability to rigorously evaluate SOC models but also makes it difficult to derive accurate initial states of the individual carbon pools. In this study, we tested the feasibility of inverse modelling for estimating pools in the Rothamsted carbon model (ROTHC) using mineralization rates observed during incubation experiments. This inverse approach may provide an alternative to existing SOC fractionation methods. To illustrate our approach, we used a time series of synthetically generated mineralization rates using the ROTHC model. We adopted a Bayesian approach using the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to infer probability density functions of the various carbon pools at the start of incubation. The Kullback-Leibler divergence was used to quantify the information content of the mineralization rate data. Our results indicate that measured mineralization rates generally provided sufficient information to reliably estimate all carbon pools in the ROTHC model. The incubation time necessary to appropriately constrain all pools was about 900 days. The use of prior information on microbial biomass carbon significantly reduced the uncertainty of the initial carbon pools, decreasing the required incubation time to about 600 days. Simultaneous estimation of initial carbon pools and decomposition rate constants significantly increased the uncertainty of the carbon pools. This effect was most pronounced for the intermediate and slow pools. Altogether, our results demonstrate that it is particularly difficult to derive reasonable estimates of the humified organic matter pool and the inert organic matter pool from inverse modelling of mineralization rates observed during incubation experiments.
NASA Astrophysics Data System (ADS)
Kim, Seung Joong
The protein folding problem has been one of the most challenging subjects in biological physics due to its complexity. Energy landscape theory based on statistical mechanics provides a thermodynamic interpretation of the protein folding process. We have been working to answer fundamental questions about protein-protein and protein-water interactions, which are very important for correctly describing the energy landscape surface of proteins. First, we present a new method for computing protein-protein interaction potentials of solvated proteins directly from SAXS data. An ensemble of proteins was modeled by Metropolis Monte Carlo and Molecular Dynamics simulations, and the global X-ray scattering of the whole model ensemble was computed at each snapshot of the simulation. The interaction potential model was optimized and iterated by a Levenberg-Marquardt algorithm. Secondly, we report that terahertz spectroscopy directly probes hydration dynamics around proteins and determines the size of the dynamical hydration shell. We also present the sequence and pH dependence of the hydration shell and the effect of hydrophobicity. In addition, kinetic terahertz absorption (KITA) spectroscopy is introduced to study the refolding kinetics of ubiquitin and its mutants. KITA results are compared to small-angle X-ray scattering, tryptophan fluorescence, and circular dichroism results. We propose that KITA monitors the rearrangement of hydrogen bonding during secondary structure formation. Finally, we present the development of the automated single molecule operating system (ASMOS) for a high-throughput single molecule detector, which levitates a single protein molecule in a 10 μm diameter droplet by laser guidance. I have also performed supporting calculations and simulations with my own program codes.
Assessment of SWE data assimilation for ensemble streamflow predictions
NASA Astrophysics Data System (ADS)
Franz, Kristie J.; Hogue, Terri S.; Barik, Muhammad; He, Minxue
2014-11-01
An assessment of data assimilation (DA) for Ensemble Streamflow Prediction (ESP) is undertaken using seasonal water supply hindcasting in the North Fork of the American River Basin (NFARB) and the National Weather Service (NWS) hydrologic forecast models. Two parameter sets, one from the California Nevada River Forecast Center (RFC) and one from the Differential Evolution Adaptive Metropolis (DREAM) algorithm, are tested. For each parameter set, hindcasts are generated using initial conditions derived with and without the inclusion of a DA scheme that integrates snow water equivalent (SWE) observations. The DREAM-DA scenario uses an Integrated Uncertainty and Ensemble-based data Assimilation (ICEA) framework that also considers model and parameter uncertainty. Hindcasts are evaluated using deterministic and probabilistic forecast verification metrics. In general, the impact of DA on the skill of the seasonal water supply predictions is mixed. For deterministic (ensemble mean) predictions, the Percent Bias (PBias) is improved with integration of the DA. DREAM-DA and RFC-DA have the lowest biases, and RFC-DA has the lowest Root Mean Squared Error (RMSE); however, the RFC and DREAM-DA have similar RMSE scores. For the probabilistic predictions, the RFC and DREAM have the highest Continuous Ranked Probability Skill Scores (CRPSS), and the RFC has the best discrimination for low flows. Reliability results are similar between the non-DA and DA tests, and DREAM and DREAM-DA have better reliability than the RFC and RFC-DA for forecast dates of February 1 and later. Although the DA method tested produced improved streamflow simulations in previous studies, the hindcast analysis suggests that it may not result in obvious improvements in streamflow forecasts. We advocate that integrating hindcasting and probabilistic metrics provides more rigorous insight into model performance for forecasting applications, such as in this study.
NASA Astrophysics Data System (ADS)
Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.
2012-12-01
Snow water equivalent (SWE) estimation is a key factor in producing reliable streamflow simulations and forecasts in snow dominated areas. However, measuring or predicting SWE has significant uncertainty. Sequential data assimilation, which updates states using both observed and modeled data based on error estimation, has been shown to reduce streamflow simulation errors but has had limited testing for forecasting applications. In the current study, a snow data assimilation framework integrated with the National Weather Service River Forecast System (NWSRFS) is evaluated for use in ensemble streamflow prediction (ESP). Seasonal water supply ESP hindcasts are generated for the North Fork of the American River Basin (NFARB) in northern California. Parameter sets from the California Nevada River Forecast Center (CNRFC), the Differential Evolution Adaptive Metropolis (DREAM) algorithm and the Multistep Automated Calibration Scheme (MACS) are tested both with and without sequential data assimilation. The traditional ESP method considers uncertainty in future climate conditions using historical temperature and precipitation time series to generate future streamflow scenarios conditioned on the current basin state. We include data uncertainty analysis in the forecasting framework through the DREAM-based parameter set, which is part of a recently developed Integrated Uncertainty and Ensemble-based data Assimilation framework (ICEA). Extensive verification of all tested approaches is undertaken using traditional forecast verification measures, including root mean square error (RMSE), Nash-Sutcliffe efficiency coefficient (NSE), volumetric bias, joint distribution, rank probability score (RPS), and discrimination and reliability plots. In comparison to the RFC parameters, the DREAM and MACS sets show significant improvement in volumetric bias in flow.
Use of assimilation improves hindcasts of higher flows but does not significantly improve performance in the mid- and low-flow categories.
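The deterministic and probabilistic scores named in these two studies can be sketched as follows. The streamflow numbers are made up for illustration; a skill score such as CRPSS would then be formed as 1 − CRPS/CRPS_reference on top of the CRPS helper shown.

```python
import numpy as np

def rmse(sim, obs):
    """Root Mean Squared Error of a deterministic (e.g. ensemble-mean) series."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def pbias(sim, obs):
    """Percent Bias: positive values indicate overprediction of total volume."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(100.0 * (sim - obs).sum() / obs.sum())

def crps(ensemble, obs):
    """Continuous Ranked Probability Score for one ensemble forecast,
    via the empirical formula E|X - y| - 0.5 * E|X - X'|; lower is better."""
    x = np.asarray(ensemble, dtype=float)
    return float(np.mean(np.abs(x - obs))
                 - 0.5 * np.mean(np.abs(x[:, None] - x[None, :])))

obs = [120.0, 95.0, 150.0]                 # seasonal volumes (illustrative)
ens_mean = [110.0, 100.0, 140.0]
print(rmse(ens_mean, obs), pbias(ens_mean, obs))
print(crps([100.0, 110.0, 120.0, 130.0], 118.0))   # one-date ensemble example
```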
Multi-component fluid flow through porous media by interacting lattice gas computer simulation
NASA Astrophysics Data System (ADS)
Cueva-Parra, Luis Alberto
In this work we study structural and transport properties such as the power-law behavior of the trajectory of each constituent and of their center of mass, density profile, mass flux, permeability, velocity profile, phase separation, segregation, and mixing of miscible and immiscible multicomponent fluid flow through rigid and non-consolidated porous media. The parameters considered are the mass ratio of the components, temperature, external pressure, and porosity. Due to its solid theoretical foundation and computational simplicity, the selected approach is the Interacting Lattice Gas with the Monte Carlo method (Metropolis algorithm) and direct sampling, combined with particular collision rules. The percolation mechanism is used for modeling the initial random porous media. The introduced collision rules make it possible to model non-consolidated porous media, because part of the kinetic energy of the fluid particles is transferred to barrier particles, which are the components of the porous medium. Having gained kinetic energy, the barrier particles can move. A number of interesting results are observed. Findings include: (i) phase separation in immiscible fluid flow through a medium with no barrier particles (porosity p = 1); (ii) for the flow of miscible fluids through a rigid porous medium with porosity close to the percolation threshold (p_C), the flux density (a measure of permeability) shows a power-law increase ∝ (p − p_C)^μ with μ = 2.0, and the density profile is found to decay with height ∝ exp(−m_A g h / k_B T), consistent with the barometric height law; (iii) sedimentation and driving of barrier particles in fluid flow through a non-consolidated porous medium. This study involves developing computer simulation models with efficient serial and parallel codes, extensive data analysis via graphical utilities, and computer visualization techniques.
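A minimal 2D sketch of the interacting-lattice-gas idea: Metropolis-accepted particle hops on a percolation-style porous lattice with a driving field along +x. Lattice size, porosity, fluid density, and field strength are all illustrative assumptions; the study's model additionally includes movable barrier particles, collision rules, and multiple fluid components.

```python
import numpy as np

rng = np.random.default_rng(1)
L = 32
p = 0.7                                    # porosity: fraction of open sites
open_site = rng.random((L, L)) < p         # percolation-style porous medium
fluid = open_site & (rng.random((L, L)) < 0.2)   # sparse fluid particles
n0 = int(fluid.sum())                      # initial particle count

BETA, G = 1.0, 0.1                         # inverse temperature, drive strength

def metropolis_sweep():
    """One MC sweep: attempt to hop each particle to a random neighbor.
    Hops along the drive (+x) are favored through dE = -G * dx."""
    accepted = 0
    xs, ys = np.nonzero(fluid)
    for x, y in zip(xs, ys):
        if not fluid[x, y]:                # already moved earlier in this sweep
            continue
        dx, dy = [(1, 0), (-1, 0), (0, 1), (0, -1)][rng.integers(4)]
        nx, ny = (x + dx) % L, (y + dy) % L
        if not open_site[nx, ny] or fluid[nx, ny]:
            continue                       # blocked by a barrier or a particle
        dE = -G * dx                       # pressure/gravity proxy along +x
        if rng.random() < min(1.0, np.exp(-BETA * dE)):
            fluid[x, y], fluid[nx, ny] = False, True
            accepted += 1
    return accepted

moves = sum(metropolis_sweep() for _ in range(100))
print(moves > 0 and int(fluid.sum()) == n0)   # particles are conserved
```

Measuring the net +x displacement per sweep at varying porosity is how a flux-density versus (p − p_C) curve like the one in the abstract would be built up.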
NASA Astrophysics Data System (ADS)
Dal Molin, J. P.; Caliri, A.
2018-01-01
Here we focus on the conformational search for the native structure when it is ruled by the hydrophobic effect and the steric specificities of amino acids. Our main tool of investigation is a 3D lattice model provided with a ten-letter alphabet, the stereochemical model. This minimalist model was conceived for Monte Carlo (MC) simulations with the kinetic behavior of protein-like chains in solution in mind. We have three central goals here. The first is to characterize the folding time (τ) by two distinct sampling methods, so we present two sets of 10^3 MC simulations for a fast protein-like sequence. The resulting sets of characteristic folding times, τ and τ_q, were obtained by the application of the standard Metropolis algorithm (MA) as well as an enhanced algorithm (MqA). The finding for τ_q shows two things: (i) the chain-solvent hydrophobic interactions {h_k} plus a set of inter-residue steric constraints {c_i,j} are able to emulate the conformational search for the native structure; for each of the 10^3 MC simulations performed, the target is always found within a finite time window; (ii) the ratio τ_q/τ ≅ 1/10 suggests that the effect of local thermal fluctuations, encompassed by the Tsallis weight, gives the chain an innate efficiency to escape from energetic and steric traps. To support this first result we performed additional MC simulations with variations of our design rule; both algorithms, the MA and the MqA, were applied to a restricted set of targets, and a physical insight is provided. Our second finding was obtained from a set of 600 independent MC simulations, performed only with the MqA applied to an extended set of 200 representative targets, our native structures. The results show how structural patterns modulate τ_q, which covers four orders of magnitude; this finding is our second goal.
The third and last result was obtained with a special kind of simulation performed to explore a possible connection between the hydrophobic component of protein stability and the native structural topology. We simulated those same 200 targets again, with the MqA only. This time, however, we evaluated the relative frequency {ϕ_q} with which each target visits its corresponding native structure over an appropriate simulation time. Due to the presence of the hydrophobic effect in our approach, we obtained a strong correlation between stability and folding rate (R = 0.85): the faster a sequence finds its target, the larger the hydrophobic component of its stability. This strong correlation fulfills our last goal. The finding suggests that the hydrophobic effect may not be a general stabilizing factor for proteins.
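The contrast between the standard Metropolis acceptance (MA) and a Tsallis-weighted generalized acceptance of the kind underlying the MqA can be sketched as below. The q value, temperature, and energy step are illustrative assumptions, not the model's.

```python
import numpy as np

def metropolis_accept(dE, beta, rng):
    """Standard Metropolis rule: accept with min(1, exp(-beta * dE))."""
    return dE <= 0 or rng.random() < np.exp(-beta * dE)

def tsallis_accept(dE, beta, q, rng):
    """Generalized acceptance min(1, [1 + (q-1)*beta*dE]^(1/(1-q))).
    For q > 1 this has a heavier tail than the exponential, so uphill
    moves out of energetic/steric traps are accepted more often."""
    if dE <= 0:
        return True
    base = 1.0 + (q - 1.0) * beta * dE
    w = base ** (1.0 / (1.0 - q)) if base > 0 else 0.0
    return rng.random() < w

rng = np.random.default_rng(2)
beta, q, dE, n = 1.0, 1.1, 5.0, 100000
std = sum(metropolis_accept(dE, beta, rng) for _ in range(n)) / n
gen = sum(tsallis_accept(dE, beta, q, rng) for _ in range(n)) / n
print(std, gen)   # the Tsallis rule accepts this uphill move more often
```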
DOE Office of Scientific and Technical Information (OSTI.GOV)
Albert, R.E.; Burns, F.J.; Altshuler, B.
1978-02-01
The research proposed here is designed to obtain a better understanding of the temporal kinetics of tumor induction when one or more carcinogens are present simultaneously or sequentially for prolonged periods of time. Studies done to date under this contract have shown that carcinogenesis in mouse skin by polycyclic aromatic hydrocarbon carcinogens is consistent with the induction of dependent and autonomous cell transformations by the carcinogen, followed by the conversion of autonomous tumor cells into malignancies at a rate determined by the level of carcinogen exposure. Dependent cell transformations remain latent in the skin unless expressed by a promoting agent. Dependent neoplasia appears to follow one-hit kinetics, while malignancy is a multihit endpoint. Dose-related and time-related aspects of tumor induction are separable in the initiation-promotion system of mouse skin, which, along with rat skin and hamster lung, is being used as a model for testing hypotheses. Results to date provide the basis for a new interpretation of the linear non-threshold extrapolation model. The broad aim of the study is to provide a basis or rationale for estimating risks associated with prolonged exposures to carcinogens found in the environment and to predict how different tissues and species respond to the same carcinogens.
Emerging literature in the Microbiota-Brain Axis and Perinatal Mood and Anxiety Disorders.
Rackers, Hannah S; Thomas, Stephanie; Williamson, Kelsey; Posey, Rachael; Kimmel, Mary C
2018-05-17
Perinatal Mood and Anxiety Disorders (PMAD) are common and can cause significant morbidity and mortality for mother and child. A healthy perinatal period requires significant adaptations; however, systems can become imbalanced resulting in depressive and anxiety symptoms. The interface between the microbiome, the immune system, and the stress system may be a model for understanding mechanisms underlying PMAD. Emerging literature from general populations regarding immune, hormone, and HPA axis changes in relation to the microbiome combined with literature on immune, gonadotropin, and stress systems in the perinatal period provides a background. We systematically investigated literature in the developing field of the microbiome in relation to PMAD. Our inclusion criteria were 1) reporting measure of maternal mood, stress, or anxious or depressed behavior; 2) in the perinatal period, defined as pregnancy through one year postpartum; and 3) reporting measure of maternal microbiome including manipulations of the microbiome through prebiotics, probiotics, or interventions with microbial byproducts. The review identified research studying associations between stress and maternal microbiome; dietary impacts on microbial composition, mood, and stress; and the relationship between the microbiome and the immune system through immunoregulatory mechanisms. Important themes identified include: the importance of studying the maternal microbiome and measures of stress, anxiety, and depression and that multi-hit models will be needed as research strives to determine the effects of multiple mechanisms working in concert. Copyright © 2018 Elsevier Ltd. All rights reserved.
Depression and substance use comorbidity: What we have learned from animal studies.
Ng, Enoch; Browne, Caleb J; Samsom, James N; Wong, Albert H C
2017-07-01
Depression and substance use disorders are often comorbid, but the reasons for this are unclear. In human studies, it is difficult to determine how one disorder may affect predisposition to the other and what the underlying mechanisms might be. Instead, animal studies allow experimental induction of behaviors relevant to depression and drug-taking, and permit direct interrogation of changes to neural circuits and molecular pathways. While this field is still new, here we review animal studies that investigate whether depression-like states increase vulnerability to drug-taking behaviors. Since chronic psychosocial stress can precipitate or predispose to depression in humans, we review studies that use psychosocial stressors to produce depression-like phenotypes in animals. Specifically, we describe how postweaning isolation stress, repeated social defeat stress, and chronic mild (or unpredictable) stress affect behaviors relevant to substance abuse, especially operant self-administration. Potential brain changes mediating these effects are also discussed where available, with an emphasis on mesocorticolimbic dopamine circuits. Postweaning isolation stress and repeated social defeat generally increase acquisition or maintenance of drug self-administration, and alter dopamine sensitivity in various brain regions. However, the effects of chronic mild stress on drug-taking have been much less studied. Future studies should consider standardizing stress-induction protocols, including female subjects, and using multi-hit models (e.g. genetic vulnerabilities and environmental stress).
Pierantonelli, Irene; Rychlicki, Chiara; Agostinelli, Laura; Giordano, Debora Maria; Gaggini, Melania; Fraumene, Cristina; Saponaro, Chiara; Manghina, Valeria; Sartini, Loris; Mingarelli, Eleonora; Pinto, Claudio; Buzzigoli, Emma; Trozzi, Luciano; Giordano, Antonio; Marzioni, Marco; Minicis, Samuele De; Uzzau, Sergio; Cinti, Saverio; Gastaldelli, Amalia; Svegliati-Baroni, Gianluca
2017-09-22
Non-Alcoholic Fatty Liver Disease (NAFLD) represents the most common form of chronic liver injury and can progress to cirrhosis and hepatocellular carcinoma. A "multi-hit" theory, involving high fat diet and signals from the gut-liver axis, has been hypothesized. The role of the NLRP3 inflammasome, which senses danger signals, is controversial. Nlrp3-/- and wild-type mice were fed a Western-lifestyle diet with fructose in drinking water (HFHC) or a chow diet. Nlrp3-/--HFHC mice showed higher hepatic expression of PPARγ2 (which regulates lipid uptake and storage) and triglyceride content, a higher histological score of liver injury, and greater adipose tissue inflammation. In Nlrp3-/--HFHC mice, dysregulation of the gut immune response with impaired antimicrobial peptide expression, increased intestinal permeability and the occurrence of a dysbiotic microbiota led to bacterial translocation, associated with higher hepatic expression of TLR4 (an LPS receptor) and TLR9 (a receptor for double-stranded bacterial DNA). After antibiotic treatment, gram-negative species and bacterial translocation were reduced, and the adverse effects in both liver and adipose tissue were reversed. In conclusion, the combination of a Western-lifestyle diet with innate immune dysfunction leads to NAFLD progression, mediated at least in part by dysbiosis and bacterial translocation, thus identifying new specific targets for NAFLD therapy.
HIT'nDRIVE: patient-specific multidriver gene prioritization for precision oncology
Hodzic, Ermin; Sauerwald, Thomas; Dao, Phuong; Wang, Kendric; Yeung, Jake; Anderson, Shawn; Vandin, Fabio; Haffari, Gholamreza; Collins, Colin C.; Sahinalp, S. Cenk
2017-01-01
Prioritizing molecular alterations that act as drivers of cancer remains a crucial bottleneck in therapeutic development. Here we introduce HIT'nDRIVE, a computational method that integrates genomic and transcriptomic data to identify a set of patient-specific, sequence-altered genes, with sufficient collective influence over dysregulated transcripts. HIT'nDRIVE aims to solve the “random walk facility location” (RWFL) problem in a gene (or protein) interaction network, which differs from the standard facility location problem by its use of an alternative distance measure: “multihitting time,” the expected length of the shortest random walk from any one of the set of sequence-altered genes to an expression-altered target gene. When applied to 2200 tumors from four major cancer types, HIT'nDRIVE revealed many potentially clinically actionable driver genes. We also demonstrated that it is possible to perform accurate phenotype prediction for tumor samples by only using HIT'nDRIVE-seeded driver gene modules from gene interaction networks. In addition, we identified a number of breast cancer subtype-specific driver modules that are associated with patients’ survival outcome. Furthermore, HIT'nDRIVE, when applied to a large panel of pan-cancer cell lines, accurately predicted drug efficacy using the driver genes and their seeded gene modules. Overall, HIT'nDRIVE may help clinicians contextualize massive multiomics data in therapeutic decision making, enabling widespread implementation of precision oncology. PMID:28768687
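The "multihitting time" above generalizes the classical expected hitting time of a random walk by taking the minimum over a set of seed genes. The single-source building block can be sketched on a toy network (the adjacency matrix is made up for illustration) by solving a linear system; this is not HIT'nDRIVE's full optimization, only the distance measure it builds on.

```python
import numpy as np

# Toy undirected gene-interaction network, as an adjacency matrix.
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 1, 1, 0],
], dtype=float)
P = A / A.sum(axis=1, keepdims=True)     # random-walk transition matrix

def expected_hitting_time(P, target):
    """h[i] = expected number of steps for a walk from node i to first
    reach `target`: solve (I - Q) h = 1, where Q is P with the target
    row and column removed (standard first-step analysis)."""
    n = P.shape[0]
    keep = [i for i in range(n) if i != target]
    Q = P[np.ix_(keep, keep)]
    h = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
    full = np.zeros(n)
    full[keep] = h
    return full

h = expected_hitting_time(P, target=3)
print(h)   # h[3] = 0; other entries are mean first-passage steps to node 3
```

The multi-source variant would take, for each expression-altered target gene, the expected length of the shortest walk from any of the sequence-altered seed genes.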
Kaur, Harparkash; Allan, Elizabeth Louise; Mamadu, Ibrahim; Hall, Zoe; Ibe, Ogochukwu; El Sherbiny, Mohamed; van Wyk, Albert; Yeung, Shunmay; Swamidoss, Isabel; Green, Michael D; Dwivedi, Prabha; Culzoni, Maria Julia; Clarke, Siân; Schellenberg, David; Fernández, Facundo M; Onwujekwe, Obinna
2015-01-01
Artemisinin-based combination therapies are recommended by the World Health Organisation (WHO) as first-line treatment for Plasmodium falciparum malaria, yet medication must be of good quality for efficacious treatment. A recent meta-analysis reported that 35% (796/2,296) of antimalarial drug samples from 21 Sub-Saharan African countries, purchased from outlets predominantly using convenience sampling, failed chemical content analysis. We used three sampling strategies to purchase artemisinin-containing antimalarials (ACAs) in Enugu metropolis, Nigeria, and compared the resulting quality estimates. ACAs were purchased using three sampling approaches (convenience, mystery clients and overt) within a defined area and sampling frame in Enugu metropolis. The active pharmaceutical ingredients (APIs) were assessed using high-performance liquid chromatography and confirmed by mass spectrometry at three independent laboratories. Results were expressed as a percentage of the APIs stated on the packaging and used to categorise each sample as acceptable quality, substandard, degraded, or falsified. Content analysis of 3,024 samples purchased from 421 outlets using convenience (n=200), mystery (n=1,919) and overt (n=905) approaches showed that overall 90.8% of ACAs were of acceptable quality, 6.8% substandard, 1.3% degraded and 1.2% falsified. Convenience sampling yielded a significantly higher prevalence of poor-quality ACAs than the mystery and overt sampling strategies, which yielded results comparable to each other. Artesunate (n=135; 4 falsified) and dihydroartemisinin (n=14) monotherapy tablets, not recommended by WHO, were also identified. Randomised sampling identified fewer falsified ACAs than previously reported by convenience approaches. Our findings emphasise the need for specific consideration to be given to the sampling frame and sampling approach if representative information on drug quality is to be obtained.
Groundwater fluoride and dental fluorosis in southwestern Nigeria.
Gbadebo, A M
2012-10-01
This study was carried out to assess the fluoride levels of groundwater from open wells consumed by the residents of three communities located in two distinct geological terrains of southwestern Nigeria. Fluoride concentration was determined using a spectrophotometric technique, while analysis of other parameters such as temperature, pH and total dissolved solids (TDS) followed standard methods. Results of the analysis indicated that groundwater samples from Abeokuta Metropolis (i.e., basement complex terrain) had fluoride contents in the range of 0.65 ± 0.21 to 1.20 ± 0.14 mg/l. These values were found to be lower than the fluoride contents in the groundwater samples from Ewekoro peri-urban and Lagos metropolis, where the values ranged between 1.10 ± 0.14-1.45 ± 0.07 and 0.15 ± 0.07-2.20 ± 1.41 mg/l, respectively. The fluoride contents in almost all locations were generally higher than the WHO recommended 0.6 mg/l. Duncan multiple range test analysis indicated similarity in the level of significance of fluoride contents between different locations of the same geological terrain at p ≤ 0.05. It was also observed that the fluoride distribution of groundwater samples from the different geological terrains was more dependent on factors such as pH and TDS than on temperature. Analysis of the social demographic characteristics of the residents indicated that adults (aged 20 to >40 years) showed more dental decay than adolescents (<20 years). This suggests an incidence of dental fluorosis driven by the high fluoride content in the drinking water of the populace. Further investigation of all sources of drinking water and other causes of tooth decay in the area is suggested.
Saba, Courage Kosi Setsoafia; Atayure, Seidu Isaac; Adzitey, Frederick
2015-03-01
Fish is an important source of protein all over the world, including in Ghana. The fishery sector plays a major role in meeting the domestic need of animal protein and also contributes greatly in foreign exchange earnings. The domestic supply of fish does not meet the demand, so Ghana imports fish and fish products from other countries. Media reports in Ghana have alleged the use of formaldehyde to preserve fish for increased shelf life and to maintain freshness. This research, therefore, sought to establish the levels of formaldehyde in imported and local fresh fish in the Tamale Metropolis by using a ChemSee formaldehyde and formalin detection test kit. Positive and negative controls were performed by using various concentrations of formalin (1, 10, 30, 50, 100, and 300 ppm) and sterile distilled water, respectively. Three times over a 6-month period, different fish species were obtained from five wholesale cold stores (where fish are sold in cartons) and some local sales points (where locally caught fish are sold). A total of 32 samples were taken during three different sampling sessions: 23 imported fish (mackerel, herring, horse mackerel, salmon, and redfish) and 9 local tilapia. The fish were cut, and 50 g was weighed and blended with an equal volume (50 ml) of sterile distilled water. Samples were transferred to test tubes and centrifuged. A test strip was dipped into the supernatant and observed for a color change. A change in color from white to pink or purple indicated the presence of formaldehyde in fish. The study showed that no formaldehyde was present in the imported and local fish obtained. The appropriate regulatory agencies should carry out this study regularly to ensure that fish consumed in Ghana is safe for consumption.
Hussain, Muhammad Hammad; Saqib, Muhammad; Raza, Fahad; Muhammad, Ghulam; Asi, Muhammad Nadeem; Mansoor, Muhammad Khalid; Saleem, Muhammad; Jabbar, Abdul
2014-05-28
Equine piroplasmosis (EP), caused by the intraerythrocytic parasites Theileria equi and Babesia caballi, is an emerging equine disease of world-wide distribution. In Pakistan, the prevalence and incidence of EP are unknown. In order to obtain first insights into the prevalence of the disease, a total of 430 equids, including 33 mules, 65 horses and 332 donkeys, aged ≤5 to ≥10 years and of either sex, from five metropolises of Punjab, Pakistan, were serologically tested for the presence of antibodies directed against B. caballi and T. equi, using a competitive enzyme-linked immunosorbent assay (cELISA). Out of 430 equid serum samples tested, 226 (52.6%, 95% CI 47.7-57.4) were found cELISA positive for EP (T. equi and/or B. caballi infections). The overall seroprevalence of EP was 41.2% (95% CI 36.5-46.0) for T. equi and 21.6% (95% CI 17.8-25.8) for B. caballi. A small proportion of equids (10.2%, 95% CI 7.5-13.5) was seropositive for both T. equi and B. caballi. Seroprevalence of T. equi was significantly higher (P<0.01) in equines from the metropolis of Lahore (66.7%, 95% CI 54.3-77.6) and in horses (56.9%, 95% CI 44.0-69.2). Multivariable logistic regression analysis indicated that the factors associated with prevalence of EP were being kept in metropolis Lahore (OR=4.24, 95% CI 2.28-7.90), being a horse (OR=2.82, 95% CI 1.53-5.20) and being male (OR=1.81, 95% CI 1.15-2.86). Copyright © 2014 Elsevier B.V. All rights reserved.
Adegboyega, O; Abioye, K
2017-08-01
The payment for health-care services is a major problem for many poor patients in developing nations. The aim of the study was to examine the cost of services and commodities and how these affect the patients who utilize the primary health-care centers in Zaria, northwestern Nigeria. A descriptive cross-sectional survey of six primary health-care facilities in Zaria metropolis, namely the Baban dodo, Tudun Wada and Magajiya PHCs from Zaria local government area (LGA) and the Samaru, Kwata and Dogarawa PHCs from Sabon Gari LGA, was carried out. The mean age of the respondents was 28.87 ± 8.63 years; most of them were married (53.3%), Hausa (63.3%), and Muslim (85.7%), and most were unemployed housewives with daily stipends from their husbands of less than 1 dollar/day. The major method of payment for health-care services was out of pocket (98.3%). More than one-third of the clients (39%) were not aware of the National Health Insurance Scheme (NHIS). There was a significant inverse relationship between the monthly income of the clients and the experience of financial stress, and a positive association between patients' monthly income and awareness of the NHIS (P < 0.05). The respondents were paying user fees for essential health-care services at the primary health-care centers, and this was not convenient for them. There is a need for the LGA health departments to intensify the supervision of activities at the PHCs. Standardization of the prices of services and commodities and the implementation of the National Health Act may alleviate the burdens of the poor community members who access PHCs in Nigeria.
NASA Astrophysics Data System (ADS)
Ray, Raghab; Jana, Tapan Kumar
2017-12-01
Mangroves are known as natural carbon sinks, taking CO2 out of the atmosphere and storing it in their biomass for many years. This study aimed to investigate the capacity of the world's largest mangrove, the Sundarbans (Indian part), to sequester anthropogenic CO2 emitted from the proximate coal-based thermal power plant in Kolaghat (∼100 km from the mangrove site). The study also includes Kolkata, one of the largest metropolises of India (∼150 km from the mangrove site), for comparing micrometeorological parameters, biosphere-atmosphere CO2 exchange fluxes and atmospheric pollutants between three distinct environments: mangrove, power plant and metropolis. Hourly sampling of atmospheric CO2 at all three sites (late December 2011 and early January 2012) revealed that CO2 concentrations and emission fluxes were maximum around the power plant (360-621 ppmv and 5.6-56.7 mg m-2 s-1, respectively), followed by the metropolis (383-459 ppmv and 3.8-20.4 mg m-2 s-1, respectively) and the mangroves (277-408 ppmv and −8.9 to 11.4 mg m-2 s-1, respectively). Monthly coal consumption rates (41-57 × 10^4 ton month-1) were converted to CO2, suggesting that 2.83 Tg C was added to the atmosphere in 2011 for the generation of 7469732 MW of energy from the power plant. The Indian Sundarbans (4264 km2) sequestered a total of 2.79 Tg C, which was 0.64% of the annual fossil fuel emission from India in the same time period. Based on these data from 2010 to 2011, it is calculated that about 4328 km2 of mangrove forest coverage is needed to sequester all the CO2 emitted from the Kolaghat power plant.
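The reported area requirement follows from simple proportional scaling of the abstract's own numbers:

```python
# Scaling the abstract's figures: mangrove area needed to offset the
# Kolaghat plant's 2011 emission, assuming sequestration scales with area.
seq_area_km2 = 4264      # Indian Sundarbans mangrove area (km2)
seq_c_tg = 2.79          # C sequestered by that area in 2011 (Tg C)
plant_c_tg = 2.83        # C emitted by the power plant in 2011 (Tg C)

needed_km2 = seq_area_km2 * plant_c_tg / seq_c_tg
print(round(needed_km2))  # close to the ~4328 km2 reported; the small gap
                          # presumably reflects unrounded inputs in the study
```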
Arlinghaus, Robert; Mehner, Thomas
2004-03-01
Increased efforts to analyze the human dimensions of anglers are necessary to improve freshwater fisheries management. This paper is a comparative analysis of urban and rural anglers living in a metropolis, based on n = 1061 anglers responding to a mail survey in the German capital of Berlin. More than two thirds of the anglers (71%) had spent most (>50%) of their effort outside the city borders of Berlin and thus were categorized as rural anglers. Compared to the rural anglers, urban anglers (≥50% of total effort spent inside the city) were younger and less educated. Urban anglers were more avid and committed, less mobile, and more frequently fished from boats and during weekdays. Rural anglers were more experienced, fished for longer times per trip, fished more often at weekends and on holidays, were more often members of angling clubs, and more frequently caught higher valued fish species. The achievement and fish quantity aspects of the angling experience were more important for urban than for rural anglers. Concerning management options, urban anglers more frequently suggested constraining other stakeholders and reducing regulations, whereas rural anglers more often proposed improving physical access to angling sites. Future urban fishing programs should offer ease of access, connection to public transportation, moderate prices, and diverse piscivorous fish stocks. In contrast to rural fisheries, the provision of high ecological and aesthetical quality of the angling waters can be regarded as of minor importance in urban fisheries. Rural fisheries managers need to consider the needs of stakeholders living in Berlin to minimize impacts on the less degraded rural water bodies and potential user conflicts with resident anglers. Ecosystem-based management approaches should guide rural fisheries policy.
NASA Astrophysics Data System (ADS)
Adak, Anandamay; Chatterjee, Abhijit; Ghosh, Sanjay; Raha, Sibaji; Roy, Arindam
2016-07-01
A study was conducted on the chemical characterization of fine mode aerosol (PM2.5) over a rural atmosphere near the coast of the Bay of Bengal in eastern India. Samples were collected and analyzed during March 2013 - February 2014. The concentration of PM2.5 was found to span a wide range, from as low as 3 µg m-3 to as high as 180 µg m-3, with an average of 62 µg m-3. Maximum accumulation of fine mode aerosol was observed during winter, whereas the minimum was observed during the monsoon. Water soluble ionic species of fine mode aerosol were characterized over this rural atmosphere. Despite the site being situated near the coast of the Bay of Bengal, we observed significantly higher concentrations of anthropogenic species such as ammonium and sulphate; the concentrations of these two species were much higher than those of sea-salt aerosols, and together they contributed around 30% of the total fine mode aerosol. Dust aerosol species such as calcium also showed elevated concentrations. The chloride to sodium ratio was found to be much lower than that in standard sea-water, indicating strong interaction between sea-salt and anthropogenic aerosols. The use of fertilizers in various crop fields and human and animal wastes significantly increased ammonium in fine mode aerosols. Dust aerosol species accumulated in the atmosphere, either transported as finer dust from the nearby metropolis or generated locally. Non-sea-salt sulphate and nitrate made significant contributions to fine mode aerosols, having both local and transported sources. Source apportionment shows prominent anthropogenic aerosol emissions both from local activities and transported from the nearby Kolkata metropolis.
Ntodie, Michael; Abu, Sampson L; Kyei, Samuel; Abokyi, Samuel; Abu, Emmanuel K
2017-06-01
To determine the near vision spectacle coverage and barriers to obtaining near vision correction among adults aged 35 years and older in the Cape Coast Metropolis of Ghana. A population-based cross-sectional study design was adopted and 500 out of 576 participants aged 35 years and older were examined from 12 randomly selected clusters in Cape Coast, Ghana. All participants underwent a comprehensive eye examination which included: distance and near visual acuity measurements and external and internal ocular health assessments. Distance and near refractions were performed using the subjective refraction technique. Information on participants' demographics, near vision correction status, near visual needs and barriers to acquiring near vision correction was obtained through a questionnaire administered as part of the study. The mean age of participants was 52.3±10.3 years, of whom 280 (56%) were females and 220 (44%) were males. The near vision spectacle coverage was 25%, the "met need" for near vision correction in the presbyopic population was 33%, and the unmet need in the entire study population was 64%. After controlling for other variables, age (5th and 6th decades) and educational level were associated with "met need" for near vision correction (OR=2.7 (1.55-4.68), p=0.00, and OR=2.36 (1.18-4.72), p=0.02, respectively). Among those who needed but did not have near vision correction, 64 (26%) did not feel the need for correction, 55 (22%) stated that they were unaware of available interventions, and 53 (21%) found the cost of near vision correction prohibitive. There was a low near vision spectacle coverage in this population, which suggests the need for strategies on health education and promotion to address the lack of awareness of spectacle need and the cost of services.
NASA Astrophysics Data System (ADS)
Rosenberg, D. E.; Alafifi, A.
2016-12-01
Water resources systems analysis often focuses on finding optimal solutions. Yet an optimal solution is optimal only for the modelled issues, and managers often seek near-optimal alternatives that address un-modelled objectives, preferences, limits, uncertainties, and other issues. Early on, Modelling to Generate Alternatives (MGA) formalized near-optimal as the region comprising the original problem constraints plus a new constraint that allowed performance within a specified tolerance of the optimal objective function value. MGA identified a few maximally-different alternatives from the near-optimal region. Subsequent work applied Markov Chain Monte Carlo (MCMC) sampling to generate a larger number of alternatives that span the near-optimal region of linear problems, or selected portions of it for non-linear problems. We extend the MCMC Hit-And-Run method to generate alternatives that span the full extent of the near-optimal region for non-linear, non-convex problems. First, start at a feasible hit point within the near-optimal region, then run a random distance in a random direction to a new hit point. Next, repeat until the desired number of alternatives has been generated. The key step at each iteration is to run a random distance along the line in the specified direction to a new hit point. If linear equality constraints exist, we construct an orthogonal basis and use a null space transformation to confine hits and runs to a lower-dimensional space. Linear inequality constraints define the convex bounds on the line that runs through the current hit point in the specified direction. We then use slice sampling to identify a new hit point along the line within bounds defined by the non-linear inequality constraints.
This technique is computationally efficient compared to prior near-optimal alternative generation techniques such as MGA, MCMC Metropolis-Hastings, evolutionary, or firefly algorithms because the search at each iteration is confined to the hit line, the algorithm can move in one step to any point in the near-optimal region, and each iteration generates a new, feasible alternative. We use the method to generate alternatives that span the near-optimal regions of simple and more complicated water management problems and that may be preferred to optimal solutions. We also discuss extensions to handle non-linear equality constraints.
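The hit-and-run step described above can be sketched for the purely linear-inequality case; this is an illustrative assumption-laden sketch (function name, tolerance handling, and a bounded polytope are assumed), not the authors' implementation. The near-optimal tolerance constraint on the objective simply appears as one more row of `A` and `b`.

```python
import numpy as np

def hit_and_run(A, b, x0, n_samples, rng=None):
    """Uniform hit-and-run sampling over the bounded polytope {x : A @ x <= b}.

    At each iteration: pick a random direction, bound the feasible
    segment of the line through the current point using the linear
    inequalities, then run a random distance along that segment to
    the next hit point.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_samples):
        d = rng.normal(size=x.size)
        d /= np.linalg.norm(d)               # random unit direction
        ad = A @ d                           # rate of approach to each face
        slack = b - A @ x                    # current distance to each face
        pos, neg = ad > 1e-12, ad < -1e-12
        t_hi = np.min(slack[pos] / ad[pos])  # furthest forward run
        t_lo = np.max(slack[neg] / ad[neg])  # furthest backward run
        x = x + rng.uniform(t_lo, t_hi) * d  # run to the new hit point
        samples.append(x.copy())
    return np.array(samples)
```

For example, sampling the unit box {0 <= x <= 1} uses four inequality rows; every generated point stays feasible, which is the property that makes each iteration yield a usable alternative.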
Application of Simulated Annealing and Related Algorithms to TWTA Design
NASA Technical Reports Server (NTRS)
Radke, Eric M.
2004-01-01
Simulated Annealing (SA) is a stochastic optimization algorithm used to search for global minima in complex design surfaces where exhaustive searches are not computationally feasible. The algorithm is derived by simulating the annealing process, whereby a solid is heated to a liquid state and then cooled slowly to reach thermodynamic equilibrium at each temperature. The idea is that atoms in the solid continually bond and re-bond at various quantum energy levels, and with sufficient cooling time they will rearrange at the minimum energy state to form a perfect crystal. The distribution of energy levels is given by the Boltzmann distribution: as temperature drops, the probability of the presence of high-energy bonds decreases. In searching for an optimal design, local minima and discontinuities are often present in a design surface. SA presents a distinct advantage over other optimization algorithms in its ability to escape from these local minima. Just as high-energy atomic configurations are visited in the actual annealing process in order to eventually reach the minimum energy state, in SA highly non-optimal configurations are visited in order to find otherwise inaccessible global minima. The SA algorithm produces a Markov chain of points in the design space at each temperature, with a monotonically decreasing temperature. The chain starts from a random point, at which the objective function is evaluated. A stochastic perturbation is then made to the parameters of the point to arrive at a proposed new point in the design space, at which the objective function is evaluated as well. If the change in objective function values ΔE is negative, the proposed new point is accepted. If ΔE is positive, the proposed new point is accepted according to the Metropolis criterion: P(ΔE) = exp(-ΔE/T), where T is the temperature for the current Markov chain. 
The process then repeats for the remainder of the Markov chain, after which the temperature is decremented and the process repeats. Eventually (and hopefully), a near-globally optimal solution is attained as T approaches zero. Several exciting variants of SA have recently emerged, including Discrete-State Simulated Annealing (DSSA) and Simulated Tempering (ST). The DSSA algorithm takes the thermodynamic analogy one step further by categorizing objective function evaluations into discrete states. In doing so, many of the case-specific problems associated with fine-tuning the SA algorithm can be avoided; for example, theoretical approximations for the initial and final temperature can be derived independently of the case. In this manner, DSSA provides a scheme that is more robust with respect to widely differing design surfaces. ST differs from SA in that the temperature T becomes an additional random variable in the optimization. The system is also kept in equilibrium as the temperature changes, as opposed to the system being driven out of equilibrium as temperature changes in SA. ST is designed to overcome obstacles in design surfaces where numerous local minima are separated by high barriers. These algorithms are incorporated into the optimal design of the traveling-wave tube amplifier (TWTA). The area under scrutiny is the collector, in which it would be ideal to use negative potential to decelerate the spent electron beam to zero kinetic energy just as it reaches the collector surface. In reality this is not plausible due to a number of physical limitations, including repulsion and differing levels of kinetic energy among individual electrons. Instead, the collector is designed with multiple stages depressed below ground potential. The design of this multiple-stage collector is the optimization problem of interest. 
One remaining problem in SA and DSSA is the difficulty in determining when equilibrium has been reached so that the current Markov chain can be terminated. It has been suggested in recent literature that simulating the thermodynamic properties specific heat, entropy, and internal energy from the Boltzmann distribution can provide good indicators of having reached equilibrium at a certain temperature. These properties are tested for their efficacy and implemented in SA and DSSA code with respect to TWTA collector optimization.
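The acceptance rule and cooling loop described above can be sketched as follows; the geometric cooling schedule, chain length, and neighbor function are illustrative assumptions (the abstract leaves them case-specific), and the TWTA collector model is not reproduced.

```python
import math
import random

def simulated_annealing(f, x0, neighbor, t0=1.0, t_min=1e-3, alpha=0.9,
                        chain_len=100, rng=None):
    """Minimal SA sketch: a Markov chain of proposals at each temperature,
    uphill moves accepted with the Metropolis probability exp(-dE/T),
    temperature decremented geometrically between chains."""
    rng = rng or random.Random()
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(chain_len):           # Markov chain at fixed T
            y = neighbor(x, rng)             # stochastic perturbation
            fy = f(y)
            dE = fy - fx
            # downhill always accepted; uphill by the Metropolis criterion
            if dE <= 0 or rng.random() < math.exp(-dE / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= alpha                           # cool before the next chain
    return best, fbest
```

As a toy usage, minimizing f(x) = (x - 2)^2 from a distant start with Gaussian perturbations converges near x = 2 even though early uphill moves are frequently accepted.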
[Urbanization in tropical countries].
Amat-Roze, J M
1983-01-01
Rural populations are still the most numerous in tropical countries, but an unprecedented process of urbanization is under way. The dynamics of the phenomenon, however, differ greatly between countries and between towns. The largest, overcrowded metropolises attract the greatest share of the migrants from rural areas; the attractive factors are multifarious and universal. This trend, which is not easily controlled, seems to be irreversible. The migrant farmers generally find a job within the informal economic sector. Zones of spontaneous and precarious settlement are often their first living environment. Some of these unhealthy dwelling areas are subject to development plans, a few of them extremely well designed.
Efficient Parameter Searches for Colloidal Materials Design with Digital Alchemy
NASA Astrophysics Data System (ADS)
Dodd, Paul M.; Geng, Yina; van Anders, Greg; Glotzer, Sharon C.
Optimal colloidal materials design is challenging, even for high-throughput or genomic approaches, because the design space provided by modern colloid synthesis techniques can easily have dozens of dimensions. In this talk we present the methodology of an inverse approach we term ''digital alchemy'' to perform rapid searches of design-parameter spaces with up to 188 dimensions that yield thermodynamically optimal colloid parameters for target crystal structures with up to 20 particles in a unit cell. The method relies only on fundamental principles of statistical mechanics and Metropolis Monte Carlo techniques, and yields particle attribute tolerances via analogues of familiar stress-strain relationships.
Rapid recipe formulation for plasma etching of new materials
NASA Astrophysics Data System (ADS)
Chopra, Meghali; Zhang, Zizhuo; Ekerdt, John; Bonnecaze, Roger T.
2016-03-01
A fast and inexpensive scheme for etch rate prediction using flexible continuum models and Bayesian statistics is demonstrated. Bulk etch rates of MgO are predicted using a steady-state model with volume-averaged plasma parameters and classical Langmuir surface kinetics. Plasma particle and surface kinetics are modeled within a global plasma framework using single-component Metropolis-Hastings methods and limited data. The accuracy of these predictions is evaluated with synthetic and experimental etch rate data for magnesium oxide in an ICP-RIE system. This approach is compared with, and shown to be superior to, factorial models generated from JMP, a software package frequently employed for recipe creation and optimization.
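Single-component Metropolis-Hastings updates one parameter at a time with its own random-walk proposal. The sketch below is generic: the plasma surface-kinetics model and etch-rate likelihood are not reproduced, and the function names, proposal widths, and example posterior are assumptions for illustration only.

```python
import math
import random

def single_component_mh(log_post, theta0, widths, n_iter, rng=None):
    """One-parameter-at-a-time random-walk Metropolis-Hastings.

    Each iteration sweeps over the components of theta, proposing a
    Gaussian perturbation to one component at a time and accepting
    with probability min(1, exp(lp' - lp)) (symmetric proposal)."""
    rng = rng or random.Random()
    theta = list(theta0)
    lp = log_post(theta)
    chain = []
    for _ in range(n_iter):
        for j in range(len(theta)):          # sweep over components
            prop = list(theta)
            prop[j] += rng.gauss(0.0, widths[j])
            lp_prop = log_post(prop)
            if math.log(rng.random()) < lp_prop - lp:
                theta, lp = prop, lp_prop    # accept this component move
        chain.append(list(theta))
    return chain
```

With limited data, the posterior is explored directly: e.g. for a unit-variance Gaussian likelihood over a handful of observations, the chain's post-burn-in average recovers the sample mean.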
Kulis, Stephen; Hodge, David R; Ayers, Stephanie L; Brown, Eddie F; Marsiglia, Flavio F
2012-09-01
This article explores the aspects of spirituality and religious involvement that may be the protective factors against substance use among urban American Indian (AI) youth. Data come from AI youth (N = 123) in five urban middle schools in a southwestern metropolis. Ordinary least squares regression analyses indicated that following Christian beliefs and belonging to the Native American Church were associated with lower levels of substance use. Following AI traditional spiritual beliefs was associated with antidrug attitudes, norms, and expectancies. Having a sense of belonging to traditions from both AI cultures and Christianity may foster integration of the two worlds in which urban AI youth live.
The experience of pregnancy and childbirth for unmarried mothers in London, 1760-1866.
Williams, Samantha
2011-01-01
This article explores the experience of pregnancy and childbirth for unmarried mothers in the metropolis in the eighteenth and nineteenth centuries. It draws upon, in particular, the infanticide cases heard at the Old Bailey between 1760 and 1866. Many of the women in these records found themselves alone and afraid as they coped with the pregnancy and birth of their first child. A great deal is revealed about the birthing body: the ambiguity surrounding the identification of and signs of pregnancy, labour and delivery, the place of birth and the degree of privacy, and the nature of, and dangers associated with, solitary childbirth.
High-Rate Capable Floating Strip Micromegas
NASA Astrophysics Data System (ADS)
Bortfeldt, Jonathan; Bender, Michael; Biebel, Otmar; Danger, Helge; Flierl, Bernhard; Hertenberger, Ralf; Lösel, Philipp; Moll, Samuel; Parodi, Katia; Rinaldi, Ilaria; Ruschke, Alexander; Zibell, André
2016-04-01
We report on the optimization of discharge insensitive floating strip Micromegas (MICRO-MEsh GASeous) detectors, fit for use in high-energy muon spectrometers. The suitability of these detectors for particle tracking is shown in high-background environments and at very high particle fluxes up to 60 MHz/cm2. Measurement and simulation of the microscopic discharge behavior have demonstrated the excellent discharge tolerance. A floating strip Micromegas with an active area of 48 cm × 50 cm with 1920 copper anode strips exhibits in 120 GeV pion beams a spatial resolution of 50 μm at detection efficiencies above 95%. Pulse height, spatial resolution and detection efficiency are homogeneous over the detector. Reconstruction of particle track inclination in a single detector plane is discussed, optimum angular resolutions below 5° are observed. Systematic deviations of this μTPC-method are fully understood. The reconstruction capabilities for minimum ionizing muons are investigated in a 6.4 cm × 6.4 cm floating strip Micromegas under intense background irradiation of the whole active area with 20 MeV protons at a rate of 550 kHz. The spatial resolution for muons is not distorted by space charge effects. A 6.4 cm × 6.4 cm floating strip Micromegas doublet with low material budget is investigated in highly ionizing proton and carbon ion beams at particle rates between 2 MHz and 2 GHz. Stable operation up to the highest rates is observed, spatial resolution, detection efficiencies, the multi-hit and high-rate capability are discussed.
The KLOE-2 high energy taggers
NASA Astrophysics Data System (ADS)
Curciarello, F.
2017-06-01
The precision measurement of the π0 → γγ width provides insight into low-energy QCD dynamics. A way to achieve the precision (1%) needed to test theory predictions is to study π0 production through γγ fusion in the e+e- → e+e-γ*γ* → e+e-π0 reaction. The KLOE-2 experiment, currently running at the DAΦNE facility in Frascati, aims to perform this measurement. For this reason, new detectors that tag final-state leptons have been installed along the DAΦNE beam line in order to reduce the background coming from phi-meson decays. The High Energy Tagger (HET) detector measures the deviation of leptons from their main orbit by determining their position and timing. The HET detectors are placed in roman pots just at the exit of the DAΦNE dipole magnets, 11 m away from the IP, on both the positron and electron sides. The HET sensitive area is made up of a set of 28 plastic scintillators. A dedicated DAQ electronic board, based on a Xilinx Virtex-5 FPGA, has been developed for this detector. It provides a MultiHit TDC with a time resolution of 550(1) ps and the possibility to clearly identify the correct bunch crossing (ΔTbunch ~ 2.7 ns). The most relevant features of the KLOE-2 tagging system operation, such as time performance and stability, and the techniques used to determine the time overlap between the KLOE and HET asynchronous DAQs will be presented.
Effect of Concussion on Performance of National Football League Players.
Reams, Nicole; Hayward, Rodney A; Kutcher, Jeffrey S; Burke, James F
2017-09-01
Lingering neurologic injury after concussion may expose athletes to increased risk if return to play is premature. The authors explored whether on-field performance after concussion is a marker of lingering neurologic injury. Retrospective cohort study on 1882 skill-position players who played in the National Football League (NFL) during 2007-2010. Players with concussion based on the weekly injury report were compared with players with other head and neck injuries (controls) on measures of on-field performance using Football Outsiders' calculation of defense-adjusted yards above replacement (DYAR), a measure of a player's contribution controlling for game context. Changes in performance, relative to a player's baseline level of performance, were estimated before and after injury using fixed-effects models. The study included 140 concussed players and 57 controls. Players with concussion performed no better or worse than their baseline on return to play. However, a decline in DYAR relative to their prior performance was noted 2 wk and 1 wk before appearing on the injury report. Concussed players performed slightly better than controls in situations where they returned to play the same week as appearing on the injury report. On return, concussed NFL players performed at their baseline level of performance, suggesting that players have recovered from concussion. Decline in performance noted 2 wk and 1 wk before appearing on the injury report may suggest that concussion diagnosis was delayed or that concussion can be a multihit phenomenon. Athletic performance may be a novel tool for assessing concussion injury and recovery.
TRPC6 mutational analysis in a large cohort of patients with focal segmental glomerulosclerosis.
Santín, Sheila; Ars, Elisabet; Rossetti, Sandro; Salido, Eduardo; Silva, Irene; García-Maset, Rafael; Giménez, Isabel; Ruíz, Patricia; Mendizábal, Santiago; Luciano Nieto, José; Peña, Antonia; Camacho, Juan Antonio; Fraga, Gloria; Cobo, M Angeles; Bernis, Carmen; Ortiz, Alberto; de Pablos, Augusto Luque; Sánchez-Moreno, Ana; Pintos, Guillem; Mirapeix, Eduard; Fernández-Llama, Patricia; Ballarín, José; Torra, Roser; Zamora, Isabel; López-Hellin, Joan; Madrid, Alvaro; Ventura, Clara; Vilalta, Ramón; Espinosa, Laura; García, Carmen; Melgosa, Marta; Navarro, Mercedes; Giménez, Antonio; Cots, Jorge Vila; Alexandra, Simona; Caramelo, Carlos; Egido, Jesús; San José, M Dolores Morales; de la Cerda, Francisco; Sala, Pere; Raspall, Frederic; Vila, Angel; Daza, Antonio María; Vázquez, Mercedes; Ecija, José Luis; Espinosa, Mario; Justa, Ma Luisa; Poveda, Rafael; Aparicio, Cristina; Rosell, Jordi; Muley, Rafael; Montenegro, Jesús; González, Domingo; Hidalgo, Emilia; de Frutos, David Barajas; Trillo, Esther; Gracia, Salvador; de los Ríos, Francisco Javier Gainza
2009-10-01
Mutations in the TRPC6 gene have been reported in six families with adult-onset (17-57 years) autosomal dominant focal segmental glomerulosclerosis (FSGS). Electrophysiology studies confirmed augmented calcium influx in only three of these six TRPC6 mutations. To date, the role of TRPC6 in childhood and adulthood non-familial forms is unknown. TRPC6 mutation analysis was performed by direct sequencing in 130 Spanish patients from 115 unrelated families with FSGS. An in silico scoring matrix was developed to evaluate the pathogenicity of amino acid substitutions, using the biophysical and biochemical differences between wild-type and mutant amino acids and the evolutionary conservation of the amino acid residue in orthologues, homologues and defined domains, with the addition of contextual information. Three new missense substitutions were identified in two clinically non-familial cases and in one familial case. The analysis by means of this scoring system allowed us to classify these variants as likely pathogenic mutations. One of them was detected in a female patient with unusual clinical features: mesangial proliferative FSGS in childhood (7 years) and partial response to immunosuppressive therapy (CsA + MMF). Asymptomatic carriers of this likely mutation were found within her family. We describe for the first time TRPC6 mutations in children and adults with non-familial FSGS. It seems that TRPC6 is a gene with very variable penetrance that may contribute to glomerular diseases in a multi-hit setting.
Anisotropic dielectric properties of two-dimensional matrix in pseudo-spin ferroelectric system
NASA Astrophysics Data System (ADS)
Kim, Se-Hun
2016-10-01
The anisotropic dielectric properties of a two-dimensional (2D) ferroelectric system were studied using statistical calculations with a pseudo-spin Ising Hamiltonian model. In Monte Carlo sampling, measurements of an observable must be separated by enough updates that successive spin configurations are independent, and the time needed to reach the thermal equilibrium state depends on the temperature and size of the system. The autocorrelation time constants of the normalized relaxation function were determined taking temperature and 2D lattice size into account. We discuss the dielectric constants of a two-dimensional ferroelectric system by using the Metropolis method in view of the Slater-Takagi defect energies.
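The Metropolis sampling loop described above can be sketched with a standard nearest-neighbour Ising model (J = 1, periodic boundaries) standing in for the Slater-Takagi pseudo-spin Hamiltonian; this substitution, and the function name, are illustrative assumptions, not the paper's model. In practice, early sweeps are discarded for equilibration and measurements are spaced by the autocorrelation time.

```python
import math
import random

def metropolis_ising_2d(L, T, sweeps, rng=None):
    """Metropolis sweeps of a 2D Ising model on an L x L periodic lattice.

    One sweep attempts L*L single-spin flips; the energy change of a
    flip is dE = 2*J*s*(sum of 4 neighbours) with J = 1, accepted with
    probability min(1, exp(-dE/T)). Returns magnetization per spin
    after each sweep, from which autocorrelation times can be estimated."""
    rng = rng or random.Random()
    spin = [[1] * L for _ in range(L)]       # ordered start
    mags = []
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            nb = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
                  + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
            dE = 2.0 * spin[i][j] * nb
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spin[i][j] = -spin[i][j]
        mags.append(sum(map(sum, spin)) / (L * L))
    return mags
```

Well below the critical temperature the magnetization time series stays near saturation, and its autocorrelation sets how far apart independent measurements must be taken, mirroring the delay discussed in the abstract.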
NASA Technical Reports Server (NTRS)
Sequera, Pedro; McDonald, Kyle C.; Gonzalez, Jorge; Arend, Mark; Krakauer, Nir; Bornstein, Robert; Luvll, Jeffrey
2012-01-01
The need for comprehensive studies of the relationships between past and projected changes of regional climate and human activity in complex urban environments has been well established. The HyspIRI preparatory airborne activities in California, associated science and applications research, and eventually HyspIRI itself provide an unprecedented opportunity for development and implementation of an integrated data and modeling analysis system focused on coastal urban environments. We will utilize HyspIRI preparatory data collections in developing new remote sensing-based tools for investigating the integrated urban environment, emphasizing weather, climate, and energy demands in complex coastal cities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rotondo, M.; Rueda, Jorge A.; Xue, S.-S.
The Feynman-Metropolis-Teller treatment of compressed atoms is extended to the relativistic regimes. Each atomic configuration is confined by a Wigner-Seitz cell and is characterized by a positive electron Fermi energy. The nonrelativistic treatment assumes a pointlike nucleus, and infinite values of the electron Fermi energy can be attained. In the relativistic treatment there exists a limiting configuration, reached when the Wigner-Seitz cell radius equals the radius of the nucleus, with a maximum value of the electron Fermi energy (E{sub e}{sup F}){sub max}, here expressed analytically in the ultrarelativistic approximation. The corrections given by the relativistic Thomas-Fermi-Dirac exchange term are also evaluated and shown to be generally small and negligible in the relativistic high-density regime. The dependence of the relativistic electron Fermi energies on compression for selected nuclei is compared and contrasted to the nonrelativistic ones and to the ones obtained in the uniform approximation. The relativistic Feynman-Metropolis-Teller approach here presented overcomes some difficulties in the Salpeter approximation generally adopted for compressed matter in physics and astrophysics. The treatment is then extrapolated to compressed nuclear matter cores of stellar dimensions with A{approx_equal}(m{sub Planck}/m{sub n}){sup 3}{approx}10{sup 57} or M{sub core}{approx}M{sub {circle_dot}}. A new family of equilibrium configurations exists for selected values of the electron Fermi energy varying in the range 0
Kaur, Harparkash; Allan, Elizabeth Louise; Mamadu, Ibrahim; Hall, Zoe; Ibe, Ogochukwu; El Sherbiny, Mohamed; van Wyk, Albert; Yeung, Shunmay; Swamidoss, Isabel; Green, Michael D.; Dwivedi, Prabha; Culzoni, Maria Julia; Clarke, Siân; Schellenberg, David; Fernández, Facundo M.; Onwujekwe, Obinna
2015-01-01
Background Artemisinin-based combination therapies are recommended by the World Health Organisation (WHO) as first-line treatment for Plasmodium falciparum malaria, yet medication must be of good quality for efficacious treatment. A recent meta-analysis reported that 35% (796/2,296) of antimalarial drug samples from 21 Sub-Saharan African countries, purchased from outlets predominantly using convenience sampling, failed chemical content analysis. We used three sampling strategies to purchase artemisinin-containing antimalarials (ACAs) in Enugu metropolis, Nigeria, and compared the resulting quality estimates. Methods ACAs were purchased using three sampling approaches - convenience, mystery clients and overt - within a defined area and sampling frame in Enugu metropolis. The active pharmaceutical ingredients (APIs) were assessed using high-performance liquid chromatography and confirmed by mass spectrometry at three independent laboratories. Results were expressed as the percentage of APIs stated on the packaging and used to categorise each sample as of acceptable quality, substandard, degraded, or falsified. Results Content analysis of 3024 samples purchased from 421 outlets using convenience (n=200), mystery (n=1,919) and overt (n=905) approaches showed that overall 90.8% of ACAs were of acceptable quality, 6.8% substandard, 1.3% degraded and 1.2% falsified. Convenience sampling yielded a significantly higher prevalence of poor quality ACAs, a difference not evident with the mystery and overt sampling strategies, which yielded comparable results. Artesunate (n=135; 4 falsified) and dihydroartemisinin (n=14) monotherapy tablets, not recommended by WHO, were also identified. Conclusion Randomised sampling identified fewer falsified ACAs than previously reported by convenience approaches. Our findings emphasise the need for specific consideration to be given to sampling frame and sampling approach if representative information on drug quality is to be obtained.
PMID:26018221
Ankrah Odame, Emmanuel; Akweongo, Patricia; Yankah, Ben; Asenso-Boadi, Francis; Agyepong, Irene
2014-05-01
Sustainability of public social welfare programmes has long been of concern in development circles. An important aspect of sustainability is the ability to sustain the recurrent financial costs of programmes. A free maternal care programme (FMCP) was launched under the Ghana National Health Insurance Scheme (NHIS) in 2008 with a start-up grant from the British Government. This article examines claims expenditure under the programme, the implications for its financial sustainability, and the lessons for donor and public financing of social welfare programmes. Records of reimbursement claims for services and medicines by women benefitting from the policy in participating facilities in one sub-metropolis in Ghana were analysed to gain an understanding of the expenditure on this programme at facility level. National-level financial inflow and outflow (expenditure) data of the NHIS related to implementation of this policy for 2008 and 2009 were reviewed to put the facility-based data in the national perspective. A total of US$936 450.94 was spent in 2009 by the scheme on FMCP in the sub-metropolis. The NHIS expenditure on the programme for the entire country in 2009 was US$49.25 million, exceeding the British grant of US$10.00 million given for that year. Subsequently, the programme has been entirely financed by the National Health Insurance Fund. The rapidly increasing recurrent demands on this fund from the maternal delivery exemption programme, without commensurate growth in the amounts generated annually, are an increasing threat to the sustainability of the fund. Provision of donor start-up funding for programmes with high recurrent expenditures, under the expectation that government will take over and sustain the programme, must be accompanied by clear long-term analysis and planning as to how government will sustain the programme.
Zhang, Hao; Zhou, Li-Guo; Chen, Ming-Nan; Ma, Wei-Chun
2011-01-01
Through the integrated approach of remote sensing and geographic information system (GIS) techniques, four Landsat TM/ETM+ images acquired between 1979 and 2008 were used to quantitatively characterize the patterns of land use and land cover change (LULC) and urban sprawl in the fast-growing Shanghai Metropolis, China. Results showed that the urban/built-up area grew on average by 4,242.06 ha yr−1. Bare land grew by 1,594.66 ha yr−1 on average. In contrast, cropland decreased by 3,286.26 ha yr−1 on average, followed by forest and shrub, water, and tidal land, which decreased by 1,331.33 ha yr−1, 903.43 ha yr−1, and 315.72 ha yr−1 on average, respectively. As a result, between 1979 and 2008 approximately 83.83% of the newly urban/built-up land was converted from cropland (67.35%), forest and shrub (9.12%), water (4.80%), and tidal land (2.19%). Another significant change was the continuous increase in regular residents, which played a very important role in contributing to local population growth and the increase in urban/built-up land. This can be explained by the city’s huge demand for investment and qualified labor since the latest industrial transformation. Moreover, with the decrease in cropland, the proportion of the population engaged in farming decreased by 13.84%. Therefore, significant socio-economic transformation occurred, and this would lead to new demand for land resources. However, due to very scarce land resources and the overload of population in Shanghai, the drive to achieve economic goals at the loss of cropland, water, and the other lands is not sustainable. Future urban planning policy aiming at ensuring a win-win balance between sustainable land use and economic growth is urgently needed. PMID:22319382
Epidemiological link of a major cholera outbreak in Greater Accra region of Ghana, 2014.
Ohene-Adjei, Kennedy; Kenu, Ernest; Bandoh, Delia Akosua; Addo, Prince Nii Ossah; Noora, Charles Lwanga; Nortey, Priscillia; Afari, Edwin Andrew
2017-10-11
Cholera remains an important public health challenge globally. Several pandemics have occurred in different parts of the world and have been epidemiologically linked by different researchers to illustrate how the cases spread and how they were related to index cases. Even though the risk factors associated with the 2014 cholera outbreak were investigated extensively, the link between index cases and the source of infection was not investigated to help break the transmission process. This study sought to show how the index cases from various districts of the Greater Accra Region may have been linked. We carried out a descriptive cross-sectional study to investigate the epidemiological link of the 2014 cholera outbreak in the Greater Accra region of Ghana. An extensive review of all district records on cholera cases in the Greater Accra region was carried out. Index cases were identified with the help of line lists. Univariate analyses were expressed as frequency distributions, percentages, mean ± standard deviation, and rates (attack rates, case-fatality rates, etc.) as appropriate. Maps were drawn using ArcGIS and Epi Info software to describe the pattern of transmission. A total of 20,199 cholera cases were recorded. Sixty percent of the cases were between 20 and 40 years of age and about 58% (11,694) of the total cases were males. Almost 50% of the cases occurred in the Accra Metro district. Two-thirds of the index cases ate food prepared outside their home and had visited the Accra Metropolis. The 2014 cholera outbreak can be described as a propagated-source outbreak linked to the Accra Metropolis. The link between index cases and the source of infection, if investigated earlier, could have helped break the transmission process. Such investigations also inform decision-making about the appropriate interventions to be instituted to prevent subsequent outbreaks.
Offu, Ogochukwu; Anetoh, Maureen; Okonta, Matthew; Ekwunife, Obinna
2015-01-01
The Nigerian health sector battles with the control of infectious diseases and emerging non-communicable diseases. The number of healthcare personnel involved in public health programs needs to be boosted to contain the health challenges of the country. Therefore, it is important to assess whether community pharmacists in Nigeria could be engaged in the promotion and delivery of various public health interventions. This study aimed to assess the level of knowledge, attitude, and practice of public health by community pharmacists. This cross-sectional survey was carried out in the Enugu metropolis. Questionnaire items were developed from expert literature. Percentages of satisfactory knowledge and practice were obtained by determining the percentage of community pharmacists who were able to list more than 2 activities or who stated the correct answer. The attitude score represents the average score on the 5-point Likert scale for each item. Chi-square and Fisher's exact tests were used to test for statistically significant differences in knowledge, attitude, and practice of public health between different groups of community pharmacists. Forty pharmacists participated in the survey. About one third of the participants had satisfactory knowledge of public health. With the exception of one item in the attitude assessment, average item scores ranged from 'agreed' to 'strongly agreed'. Study participants scored below satisfactory on the practice of public health. Knowledge, attitude, and practice of public health were not influenced by years of practice, qualification, or prior public health experience. Reported barriers to the practice of public health include inadequate funds, lack of time, lack of space, poor client cooperation, inadequate staff, government regulation, insufficient knowledge, and inadequate remuneration. The level of knowledge and practice of public health by community pharmacists was not satisfactory, although they had a positive attitude towards the practice of public health. 
The findings highlight the importance of educational interventions targeted towards practicing community pharmacists to improve their knowledge level on public health issues. Providing incentives for public health services rendered could increase community pharmacists' engagement in public health activities.
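The chi-square comparison used in the study can be illustrated with a minimal sketch; the 2x2 counts below are hypothetical, not the study's data (only the overall sample size of 40 matches):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    (list of two rows), without continuity correction."""
    row = [sum(r) for r in table]
    col = [table[0][j] + table[1][j] for j in range(2)]
    n = sum(row)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n  # expected count under independence
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical counts: satisfactory knowledge (yes/no) by prior public
# health experience (rows), totalling 40 respondents.
table = [[8, 5],    # prior experience: satisfactory / unsatisfactory
         [6, 21]]   # no prior experience: satisfactory / unsatisfactory
print(round(chi_square_2x2(table), 2))  # 5.96, above the 3.84 critical value at alpha = 0.05
```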
Achievements and future path of Tehran municipality in urban health domain: An Iranian experience
Damari, Behzad; Riazi-Isfahani, Sahand
2016-01-01
Background: According to national laws and world experience, the provision, maintenance, and improvement of citizens' health are considered essential functions of municipalities as "social institutions". In order to equitably promote health conditions at the urban level, particularly in marginal areas, targeted efforts have been implemented in the municipality of the Tehran metropolis since 2004. This study was intended to identify these targeted measures, analyze the health interventions within a conceptual framework, and propose a future path. Methods: This is a qualitative study with a content analysis approach. Document review and structured interviews with national health policy-making and planning experts and executive managers of the 22 regional municipalities of the Tehran metropolis were used to collect data. The data were analyzed on the basis of a conceptual framework prepared for urban health in 4 domains: municipal interventions, goal achievements, drivers and obstacles of success, and the way forward. Results: From the viewpoint of the interviewees, the new health actions of the Tehran municipality are based more on public participation, and the municipality was able to prioritize health issues in the programs and policies of the Tehran city council. The Tehran municipality has accomplished three types of interventions to improve health, which, in descending order of scale, are: facilitative, promotional, and mandatory interventions. The development and institutionalization of public participation is the greatest achievement of these health-oriented actions; the expansion of environmental and physical health-oriented facilities and the promotion of a healthy lifestyle rank next. 
Conclusion: Since management changes seriously challenge the institutionalization of actions and innovations, especially in developing countries, it is suggested that mayors of metropolitan cities like Tehran document and review municipal health measures as soon as possible and, while eliminating overlaps between their interventions and those of other sectors, design and approve a charter of the "health promoting municipality". The most important role of municipalities under this charter would be coordinating the health improvement of citizens. This charter, once approved as a national policy, could be used for other cities too. PMID:27390693
Integration of climate change in flood prediction: application to the Somme river (France)
NASA Astrophysics Data System (ADS)
Pinault, J.-L.; Amraoui, N.; Noyer, M.-L.
2003-04-01
The exceptional floods that have occurred over the last two years in western and central Europe were highly improbable events. The concomitance of such rare events suggests that they may be attributable to climate change. Statistical analysis of long rainfall series confirms that both the cumulative annual rainfall and its temporal variability have increased over the last decade. This paper is devoted to the analysis of the impact of climate change on flood prediction, applied to the Somme river. The exceptional rainfall that occurred from October 2000 to April 2001, about double the mean value, caused catastrophic flooding between the upper Somme and Abbeville. The flow peaked at the beginning of May 2001, damaging numerous dwellings and transport routes, and the economic activity of the region remained flood-bound for more than 2 months. The flood caught the population unaware and caused deep trauma in France, since it was the first time such a sudden event was recognized as resulting from groundwater discharge. The mechanisms of flood generation were studied closely in order to predict the behavior of the Somme catchment and other urbanized basins when winter or spring rainfall is exceptional, which occurs more and more frequently in the northern part of Europe. The contribution of groundwater to surface water flow was calculated by inverse modeling from piezometers that are representative of aquifers in valleys. These piezometers are located on the slopes and near the edges of plateaus in order to characterize the drainage of the water table to the surface water network. For flood prediction, a stochastic process is used, consisting of the generation of both rainfall and PET (potential evapotranspiration) time series. The precipitation generator uses Markov chain Monte Carlo and simulated annealing based on the Hastings-Metropolis algorithm. 
Coupling the rainfall and PET generators with the transfer model enables a new evaluation of the probability of occurrence of floods, taking into account both the memory effect of the Somme basin and the temporal structure of rainfall events.
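The Metropolis-Hastings kernel at the heart of such a precipitation generator can be sketched minimally. The gamma wet-day target and its parameters are assumptions for illustration, and the paper's simulated-annealing and PET components are omitted:

```python
import math
import random

random.seed(42)

def gamma_logpdf(x, shape=2.0, scale=4.0):
    """Unnormalised log-density of a gamma law, a common model for
    wet-day rainfall amounts (parameters assumed, not from the paper)."""
    if x <= 0.0:
        return float("-inf")
    return (shape - 1.0) * math.log(x) - x / scale

def metropolis_hastings(n_days, step=2.0, x0=5.0):
    """Random-walk Metropolis-Hastings chain over daily rainfall amounts (mm)."""
    chain, x = [], x0
    for _ in range(n_days):
        prop = x + random.uniform(-step, step)        # symmetric proposal
        log_alpha = gamma_logpdf(prop) - gamma_logpdf(x)
        # Metropolis acceptance rule: always accept uphill moves,
        # accept downhill moves with probability exp(log_alpha).
        if log_alpha >= 0.0 or random.random() < math.exp(log_alpha):
            x = prop
        chain.append(x)
    return chain

series = metropolis_hastings(5000)
print(len(series))   # 5000 simulated daily amounts, all positive
```

In the paper's full scheme this kernel would be embedded in a generator reproducing the observed temporal structure of rainfall events; here it only shows the sampling mechanics.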
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Y; UT Southwestern Medical Center, Dallas, TX; Tian, Z
2015-06-15
Purpose: Intensity-modulated proton therapy (IMPT) is increasingly used in proton therapy. For IMPT optimization, Monte Carlo (MC) is desired for spot dose calculations because of its high accuracy, especially in cases with a high level of heterogeneity. It is also preferred in biological optimization problems because of its ability to compute quantities related to biological effects. However, MC simulation is typically too slow to be used for this purpose. Although GPU-based MC engines have become available, the achieved efficiency is still not ideal. The purpose of this work is to develop a new optimization scheme to include GPU-based MC in IMPT. Methods: A conventional approach to using MC in IMPT simply calls the MC dose engine repeatedly for each spot's dose calculation. This is not optimal, however, because computations are wasted on spots that turn out to have very small weights once the optimization problem is solved. GPU memory-writing conflicts occurring at small beam sizes also reduce computational efficiency. To solve these problems, we developed a new framework that iteratively alternates between MC dose calculations and plan optimization. At each dose calculation step, particles were sampled from all spots together using the Metropolis algorithm, such that the number of particles per spot is proportional to the latest optimized spot intensity. Simultaneously transporting particles from multiple spots also mitigated the memory-writing conflict problem. Results: We validated the proposed MC-based optimization scheme on one prostate case. The total computation time of our method was ∼5–6 min on one NVIDIA GPU card, including both spot dose calculation and plan optimization, whereas a conventional method naively using the same GPU-based MC engine was ∼3 times slower. Conclusion: A fast GPU-based MC dose calculation method along with a novel optimization workflow was developed. 
The high efficiency makes it attractive for clinical use.
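The core idea of allocating MC particles in proportion to the latest optimized spot intensities can be sketched as follows. For simplicity, plain weighted sampling stands in for the Metropolis sampling described in the abstract, and the spot weights are hypothetical:

```python
import random
from collections import Counter

random.seed(0)

def allocate_particles(spot_weights, total_particles):
    """Draw spot indices with probability proportional to the latest
    optimized spot intensities, so high-weight spots receive more MC
    histories and near-zero-weight spots receive few or none."""
    draws = random.choices(range(len(spot_weights)),
                           weights=spot_weights,
                           k=total_particles)
    tally = Counter(draws)
    return [tally[i] for i in range(len(spot_weights))]

# Hypothetical spot intensities after one plan-optimization step.
weights = [5.0, 0.1, 3.0, 0.0, 1.9]
counts = allocate_particles(weights, 10_000)
print(counts)   # zero-weight spots get no particles at all
```

In the iterative framework described above, these counts would be refreshed after each optimization step, so computation is never wasted on spots whose weights have shrunk.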
NASA Astrophysics Data System (ADS)
Scharnagl, Benedikt; Vrugt, Jasper A.; Vereecken, Harry; Herbst, Michael
2010-05-01
Turnover of soil organic matter is usually described with multi-compartment models. However, a major drawback of these models is that the conceptually defined compartments (or pools) do not necessarily correspond to measurable soil organic carbon (SOC) fractions in practice. This not only impairs our ability to rigorously evaluate SOC models but also makes it difficult to derive accurate initial states. In this study, we tested the usefulness and applicability of inverse modeling for deriving the carbon pool sizes of the Rothamsted carbon model (ROTHC) using a synthetic time series of mineralization rates from laboratory incubation. To appropriately account for data and model uncertainty, we adopted a Bayesian approach using the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. This Markov chain Monte Carlo scheme derives the posterior probability density distribution of the initial pool sizes at the start of incubation from the observed mineralization rates. We used the Kullback-Leibler divergence to quantify the information contained in the data and to illustrate the effect of increasing incubation times on the reliability of the pool size estimates. Our results show that measured mineralization rates generally provide sufficient information to reliably estimate the sizes of all active pools in the ROTHC model. However, at about 900 days of incubation, these experiments are excessively long. The use of prior information on microbial biomass provided a way to significantly reduce the uncertainty and the required incubation duration, to about 600 days. Explicit consideration of model parameter uncertainty in the estimation process further impaired the identifiability of the initial pools, especially the more slowly decomposing ones. Our illustrative case studies show how Bayesian inverse modeling can provide important insights into the information content of incubation experiments. 
Moreover, the outcome of this virtual experiment helps to explain the results of related real-world studies on SOC dynamics.
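The inverse-modelling idea can be illustrated on a toy single-pool version of the problem: infer the initial pool size from a mineralization-rate series. A plain random-walk Metropolis sampler stands in for DREAM, and the decay constant, noise level, and true pool size are assumed for illustration:

```python
import math
import random

random.seed(1)

K = 0.005        # assumed first-order decay constant (1/day)
SIGMA = 0.05     # assumed observation noise (mg C / day)
TIMES = [10.0 * i for i in range(1, 31)]   # synthetic 300-day incubation

def rate(c0, t):
    """Mineralization rate of a single first-order pool of initial size c0."""
    return K * c0 * math.exp(-K * t)

TRUE_C0 = 20.0
OBS = [rate(TRUE_C0, t) for t in TIMES]    # noise-free synthetic observations

def log_post(c0):
    """Gaussian log-likelihood (flat prior on c0 > 0)."""
    if c0 <= 0.0:
        return float("-inf")
    sse = sum((rate(c0, t) - o) ** 2 for t, o in zip(TIMES, OBS))
    return -sse / (2.0 * SIGMA ** 2)

def metropolis(n_iter, c0=10.0, step=1.0):
    """Random-walk Metropolis chain over the initial pool size."""
    chain, lp = [], log_post(c0)
    for _ in range(n_iter):
        prop = c0 + random.gauss(0.0, step)
        lp_prop = log_post(prop)
        if lp_prop >= lp or random.random() < math.exp(lp_prop - lp):
            c0, lp = prop, lp_prop
        chain.append(c0)
    return chain

chain = metropolis(5000)
posterior_mean = sum(chain[1000:]) / len(chain[1000:])   # discard burn-in
print(round(posterior_mean, 1))   # close to the true pool size of 20
```

DREAM runs multiple interacting chains with adaptive proposals, which matters for the higher-dimensional multi-pool posteriors in the study; the single-chain sketch above only conveys the Bayesian inversion idea.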
Understanding Yield Anomalies in ICF Implosions via Fully Kinetic Simulations
NASA Astrophysics Data System (ADS)
Taitano, William
2017-10-01
In the quest towards ICF ignition, plasma kinetic effects are among the prime candidates for explaining some significant discrepancies between experimental observations and radiation-hydrodynamics simulations. To assess their importance, high-fidelity fully kinetic simulations of ICF capsule implosions are needed. Owing to the extremely multi-scale nature of the problem, kinetic codes have to overcome nontrivial numerical and algorithmic challenges, and very few options are currently available. Here, we present resolutions of some long-standing yield-discrepancy conundrums using a novel, LANL-developed, 1D-2V Vlasov-Fokker-Planck code, iFP. iFP possesses unprecedented fidelity and features fully implicit time-stepping; exact mass, momentum, and energy conservation; and optimal grid adaptation in phase space, all of which are critically important for ensuring the long-time numerical accuracy of implosion simulations. Specifically, we concentrate on several anomalous yield-degradation instances observed in Omega campaigns, with the so-called "Rygg effect", an anomalous yield scaling with the fuel composition, being a prime example. Understanding the physical mechanisms responsible for such degradations in non-ignition-grade Omega experiments is of great interest, as such experiments are often used for platform and diagnostic development, which are then used in ignition-grade experiments on NIF. In the case of Rygg's experiments, effects of kinetic stratification of the fuel ions on the yield had previously been proposed as the explanation of the anomaly, studied with the kinetic code FPION, and found unimportant. We have revisited this issue with iFP and obtained excellent yield-over-clean agreement with the original Rygg results and several subsequent experiments. This validates iFP and confirms that kinetic fuel stratification is indeed at the root of the observed yield degradation. 
This work was sponsored by the Metropolis Postdoctoral Fellowship, LDRD office, Thermonuclear Burn Initiative of ASC, and the LANL Institutional Computing. This work was performed under the NNSA of the USDOE at LANL under contract DE-AC52-06NA25396.
Andasari, Vivi; Roper, Ryan T.; Swat, Maciej H.; Chaplain, Mark A. J.
2012-01-01
In this paper we present a multiscale, individual-based simulation environment that integrates CompuCell3D for lattice-based modelling on the cellular level and Bionetsolver for intracellular modelling. CompuCell3D or CC3D provides an implementation of the lattice-based Cellular Potts Model or CPM (also known as the Glazier-Graner-Hogeweg or GGH model) and a Monte Carlo method based on the Metropolis algorithm for system evolution. The integration of CC3D for cellular systems with Bionetsolver for subcellular systems enables us to develop a multiscale mathematical model and to study the evolution of cell behaviour driven by the dynamics inside the cells, capturing aspects of cell behaviour and interaction that are not possible using continuum approaches. We then apply this multiscale modelling technique to a model of cancer growth and invasion, based on a previously published model of Ramis-Conde et al. (2008) in which individual cell behaviour is driven by a molecular network describing the dynamics of E-cadherin and β-catenin. In that model, which we refer to as the centre-based model, an alternative individual-based modelling technique was used, namely a lattice-free approach. In many respects, the GGH or CPM methodology and the approach of the centre-based model have the same overall goal, that is, to mimic the behaviours and interactions of biological cells. Although the mathematical foundations and computational implementations of the two approaches are very different, the results of the presented simulations are compatible with each other, suggesting that by using individual-based approaches we can formulate a natural way of describing complex multi-cell, multiscale models. The ability to easily reproduce the results of one modelling approach using an alternative approach is also essential from a model cross-validation standpoint and helps to identify any modelling artefacts specific to a given computational approach. PMID:22461894
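The Metropolis rule at the heart of the CPM/GGH update step can be sketched as follows; the temperature value is an assumption, and in practice the energy change ΔH would come from evaluating the CPM Hamiltonian before and after a proposed index copy:

```python
import math
import random

random.seed(7)

TEMPERATURE = 10.0   # assumed CPM "temperature" controlling membrane fluctuation

def accept_copy(delta_h, temperature=TEMPERATURE):
    """Metropolis rule used by the GGH/CPM: an index-copy attempt that
    lowers the effective energy H is always accepted; an unfavourable
    attempt is accepted with Boltzmann probability exp(-dH/T)."""
    if delta_h <= 0.0:
        return True
    return random.random() < math.exp(-delta_h / temperature)

# A favourable copy is always taken; a costly one only occasionally.
print(accept_copy(-3.0))   # True
accepted = sum(accept_copy(25.0) for _ in range(10_000))
print(accepted)            # roughly exp(-2.5) * 10,000, i.e. about 800
```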
Minimal model for the secondary structures and conformational conversions in proteins
NASA Astrophysics Data System (ADS)
Imamura, Hideo
A better understanding of the protein folding process can provide physical insights into the function of proteins and makes it possible to benefit from the genetic information accumulated so far. Protein folding normally takes place in less than a second, but even seconds are beyond the reach of current computational power for simulations at all-atom detail. Hence, to model and explore the protein folding process it is crucial to construct a proper model that can adequately describe the physical process and mechanism at the relevant time scale. We discuss a reduced off-lattice model that can express α-helix and β-hairpin conformations defined solely by a given sequence, in order to investigate the folding mechanism of conformations such as the β-hairpin and to investigate conformational conversions in proteins. The first two chapters introduce and review essential concepts in protein folding: the modelling of physical interactions in proteins and various simple models; they also review computational methods, in particular the Metropolis Monte Carlo method, its dynamic interpretation, and thermodynamic Monte Carlo algorithms. Chapter 3 describes the minimalist model, which represents both α-helix and β-sheet conformations using simple potentials. The native conformation can be specified by the sequence without particular conformational biases towards a reference state. In Chapter 4, the model is used to investigate the folding mechanism of β-hairpins exhaustively, using dynamic Monte Carlo and a thermodynamic Monte Carlo method, an efficient combination of multicanonical Monte Carlo and the weighted histogram analysis method. We show that the major folding pathways and the folding rate depend on the location of a hydrophobic pair. The conformational conversions between α-helix and β-sheet conformations are examined in Chapters 5 and 6. 
First, the conformational conversion due to mutation in a non-hydrophobic system is examined, and then the conversion due to mutation with a hydrophobic pair at a different position is examined at various temperatures.
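The dynamic Metropolis Monte Carlo interpretation mentioned above can be illustrated on a toy dihedral-angle chain; the cosine potential and its helical target angle are stand-ins for the model's actual helix/hairpin potentials, and all parameter values are assumed:

```python
import math
import random

random.seed(3)

N_ANGLES = 10
BETA = 2.0   # assumed inverse temperature for the toy model

def energy(angles):
    """Toy backbone energy: each dihedral prefers a helix-like value near
    -60 degrees. This is a stand-in, not the thesis model's potential."""
    target = math.radians(-60.0)
    return sum(1.0 - math.cos(a - target) for a in angles)

def mc_sweep(angles, step=0.3):
    """One dynamic Metropolis sweep: perturb each dihedral in turn and
    accept or reject the move by the Metropolis criterion."""
    for i in range(len(angles)):
        old = angles[i]
        e_old = energy(angles)
        angles[i] = old + random.uniform(-step, step)
        d_e = energy(angles) - e_old
        if d_e > 0.0 and random.random() >= math.exp(-BETA * d_e):
            angles[i] = old   # reject: restore the previous dihedral
    return angles

angles = [random.uniform(-math.pi, math.pi) for _ in range(N_ANGLES)]
e0 = energy(angles)
for _ in range(2000):
    mc_sweep(angles)
print(round(e0, 2), "->", round(energy(angles), 2))   # energy drops as the chain relaxes
```

The thesis combines such dynamic runs with multicanonical sampling and WHAM for thermodynamics; this sketch shows only the elementary move and acceptance step.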
Design of the smart scenic spot service platform
NASA Astrophysics Data System (ADS)
Yin, Min; Wang, Shi-tai
2015-12-01
With the deepening of smart city construction, the "smart+" model is developing rapidly. Guilin, an international tourism metropolis undergoing rapid construction, needs the support of smart tourism technology. This paper studied the service objects of the smart scenic spot and their requirements, and then constructed a smart service platform for the scenic spot by applying 3S technology (Geographic Information System (GIS), Remote Sensing (RS), and Global Navigation Satellite System (GNSS)) together with the Internet of Things and cloud computing. Taking the Guilin Seven-star Park scenic area as its object, this paper designed the framework of the Seven-star smart scenic spot service platform. The application of this platform will improve tourists' visiting experience, make tourism management more scientific and standardized, and increase tourism enterprises' operating earnings.
Collins, Timothy W; Grineski, Sara E; Chakraborty, Jayajit; McDonald, Yolanda J
2011-01-01
This paper contributes to the environmental justice literature by analyzing contextually relevant and racial/ethnic group-specific variables in relation to air-toxics cancer risks in a US-Mexico border metropolis at the census block-group level. Results indicate that Hispanics' ethnic status interacts with class, gender, and age status to amplify disproportionate risk. In contrast, results indicate that non-Hispanic whiteness attenuates the cancer-risk disparities associated with class, gender, and age status. The findings suggest that a system of white-Anglo privilege shapes the way in which race/ethnicity articulates with other dimensions of inequality to create unequal cancer risks from air toxics. Copyright © 2010 Elsevier Ltd. All rights reserved.