Sample records for efficient epidemiological simulation

  1. History matching of a complex epidemiological model of human immunodeficiency virus transmission by using variance emulation.

    PubMed

    Andrianakis, I; Vernon, I; McCreesh, N; McKinley, T J; Oakley, J E; Nsubuga, R N; Goldstein, M; White, R G

    2017-08-01

    Complex stochastic models are commonplace in epidemiology, but their utility depends on their calibration to empirical data. History matching is a (pre)calibration method that has been applied successfully to complex deterministic models. In this work, we adapt history matching to stochastic models, by emulating the variance in the model outputs, and therefore accounting for its dependence on the model's input values. The method proposed is applied to a real complex epidemiological model of human immunodeficiency virus in Uganda with 22 inputs and 18 outputs, and is found to increase the efficiency of history matching, requiring 70% of the time and 43% fewer simulator evaluations compared with a previous variant of the method. The insight gained into the structure of the human immunodeficiency virus model, and the constraints placed on it, are then discussed.

  2. Particle Collection Efficiency of a Lens-Liquid Filtration System

    NASA Astrophysics Data System (ADS)

    Wong, Ross Y. M.; Ng, Moses L. F.; Chao, Christopher Y. H.; Li, Z. G.

    2011-09-01

    Clinical and epidemiological studies have shown that indoor air quality has a substantial impact on the health of building occupants [1]. Possible sources of indoor air contamination include hazardous gases as well as particulate matter (PM) [2]. Experimental studies show that the size distribution of PM in indoor air ranges from tens of nanometers to a few hundred micrometers [3]. Vacuum cleaners are a major tool for collecting PM from floors and carpets, which are the main sources of indoor PM. However, the particle collection efficiency of the typical cyclonic filters in these vacuums drops significantly for particles below 10 μm in diameter. In this work, we propose a lens-liquid filtration system (see Figure 1), in which the flow channel is formed by a liquid free surface and a planar plate with fin/lens structures. Computational fluid dynamics simulations are performed using FLUENT to optimize the structure of the proposed system toward high particle collection efficiency and a satisfactory pressure drop. Numerical simulations show that the system can collect particles of 250 nm diameter with a collection efficiency of 50%.

  3. Phylogenetic analysis accounting for age-dependent death and sampling with applications to epidemics.

    PubMed

    Lambert, Amaury; Alexander, Helen K; Stadler, Tanja

    2014-07-07

    The reconstruction of phylogenetic trees based on viral genetic sequence data sequentially sampled from an epidemic provides estimates of the past transmission dynamics, by fitting epidemiological models to these trees. To our knowledge, none of the epidemiological models currently used in phylogenetics can account for recovery rates and sampling rates dependent on the time elapsed since transmission, i.e. age of infection. Here we introduce an epidemiological model where infectives leave the epidemic, by either recovery or sampling, after some random time which may follow an arbitrary distribution. We derive an expression for the likelihood of the phylogenetic tree of sampled infectives under our general epidemiological model. The analytic concept developed in this paper will facilitate inference of past epidemiological dynamics and provide an analytical framework for performing very efficient simulations of phylogenetic trees under our model. The main idea of our analytic study is that the non-Markovian epidemiological model giving rise to phylogenetic trees growing vertically as time goes by can be represented by a Markovian "coalescent point process" growing horizontally by the sequential addition of pairs of coalescence and sampling times. As examples, we discuss two special cases of our general model, described in terms of influenza and HIV epidemics. Though phrased in epidemiological terms, our framework can also be used for instance to fit macroevolutionary models to phylogenies of extant and extinct species, accounting for general species lifetime distributions. Copyright © 2014 Elsevier Ltd. All rights reserved.
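    The model's key generalization is that infectives leave the epidemic (by recovery or sampling) after a random time drawn from an arbitrary distribution, not necessarily exponential. A minimal sketch of that data-generating process, assuming an illustrative gamma lifetime and Poisson transmission (the rates and the 50-unit horizon are not the paper's):

```python
import random

def simulate_epidemic(beta=0.3, sampling_prob=0.4, n_initial=5,
                      lifetime=lambda rng: rng.gammavariate(2.0, 3.0),
                      t_max=50.0, seed=1):
    """Branching-process epidemic: each infective transmits at rate beta
    and leaves (recovery or sampling) after a draw from an arbitrary
    lifetime distribution -- here gamma, but any callable works."""
    rng = random.Random(seed)
    sampled, recovered = [], []
    stack = [0.0] * n_initial  # infection times of unprocessed infectives
    while stack:
        t_inf = stack.pop()
        t_leave = t_inf + lifetime(rng)
        # transmissions occur as a Poisson process over the infectious period
        t = t_inf
        while True:
            t += rng.expovariate(beta)
            if t >= min(t_leave, t_max):
                break
            if len(stack) + len(sampled) + len(recovered) < 5000:  # size cap
                stack.append(t)
        if t_leave <= t_max:
            (sampled if rng.random() < sampling_prob else recovered).append(t_leave)
    return sampled, recovered
```

    The sampled removal times are exactly the tip times a coalescent point process would add horizontally, pair by pair.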

  4. Comparison of Random Forest and Parametric Imputation Models for Imputing Missing Data Using MICE: A CALIBER Study

    PubMed Central

    Shah, Anoop D.; Bartlett, Jonathan W.; Carpenter, James; Nicholas, Owen; Hemingway, Harry

    2014-01-01

    Multivariate imputation by chained equations (MICE) is commonly used for imputing missing data in epidemiologic research. The “true” imputation model may contain nonlinearities which are not included in default imputation models. Random forest imputation is a machine learning technique which can accommodate nonlinearities and interactions and does not require a particular regression model to be specified. We compared parametric MICE with a random forest-based MICE algorithm in 2 simulation studies. The first study used 1,000 random samples of 2,000 persons drawn from the 10,128 stable angina patients in the CALIBER database (Cardiovascular Disease Research using Linked Bespoke Studies and Electronic Records; 2001–2010) with complete data on all covariates. Variables were artificially made “missing at random,” and the bias and efficiency of parameter estimates obtained using different imputation methods were compared. Both MICE methods produced unbiased estimates of (log) hazard ratios, but random forest was more efficient and produced narrower confidence intervals. The second study used simulated data in which the partially observed variable depended on the fully observed variables in a nonlinear way. Parameter estimates were less biased using random forest MICE, and confidence interval coverage was better. This suggests that random forest imputation may be useful for imputing complex epidemiologic data sets in which some patients have missing data. PMID:24589914
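    The parametric arm of the comparison can be sketched with a minimal chained-equations loop (NumPy only; proper MICE draws from a posterior predictive, and the random-forest variant replaces the OLS step with a forest — this sketch shows only the cycling idea):

```python
import numpy as np

def mice_impute(X, n_iter=10):
    """Minimal chained-equations (MICE-style) imputation: cycle over
    columns, regressing each partially observed column on all the others
    by OLS and refilling its missing entries with the fitted values."""
    X = np.asarray(X, dtype=float).copy()
    miss = np.isnan(X)
    # initialize missing entries at their column means
    X[miss] = np.take(np.nanmean(X, axis=0), np.where(miss)[1])
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            if not miss[:, j].any():
                continue
            others = np.delete(np.arange(X.shape[1]), j)
            A = np.column_stack([np.ones(len(X)), X[:, others]])
            obs = ~miss[:, j]
            beta, *_ = np.linalg.lstsq(A[obs], X[obs, j], rcond=None)
            X[miss[:, j], j] = A[miss[:, j]] @ beta
    return X
```

    When the true relationship is nonlinear, this OLS step is exactly what misses it, which is the gap the random-forest estimator fills.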

  5. Comparison of random forest and parametric imputation models for imputing missing data using MICE: a CALIBER study.

    PubMed

    Shah, Anoop D; Bartlett, Jonathan W; Carpenter, James; Nicholas, Owen; Hemingway, Harry

    2014-03-15

    Multivariate imputation by chained equations (MICE) is commonly used for imputing missing data in epidemiologic research. The "true" imputation model may contain nonlinearities which are not included in default imputation models. Random forest imputation is a machine learning technique which can accommodate nonlinearities and interactions and does not require a particular regression model to be specified. We compared parametric MICE with a random forest-based MICE algorithm in 2 simulation studies. The first study used 1,000 random samples of 2,000 persons drawn from the 10,128 stable angina patients in the CALIBER database (Cardiovascular Disease Research using Linked Bespoke Studies and Electronic Records; 2001-2010) with complete data on all covariates. Variables were artificially made "missing at random," and the bias and efficiency of parameter estimates obtained using different imputation methods were compared. Both MICE methods produced unbiased estimates of (log) hazard ratios, but random forest was more efficient and produced narrower confidence intervals. The second study used simulated data in which the partially observed variable depended on the fully observed variables in a nonlinear way. Parameter estimates were less biased using random forest MICE, and confidence interval coverage was better. This suggests that random forest imputation may be useful for imputing complex epidemiologic data sets in which some patients have missing data.

  6. Efficient Control of Epidemics Spreading on Networks: Balance between Treatment and Recovery

    PubMed Central

    Oleś, Katarzyna; Gudowska-Nowak, Ewa; Kleczkowski, Adam

    2013-01-01

    We analyse two models describing disease transmission and control on regular and small-world networks. We use simulations to find a control strategy that minimizes the total cost of an outbreak, thus balancing the cost of the disease against that of preventive treatment. The models are similar in their epidemiological part but differ in how removed/recovered individuals are treated. The differences between the models affect the choice of strategy only for very cheap treatment and a slowly spreading disease. However, for the combinations of parameters that matter most from the epidemiological perspective (high infectiousness and expensive treatment), the models give similar results. Moreover, even where the choice of strategy differs, the total cost spent on controlling the epidemic is very similar for both models. PMID:23750205
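    The cost trade-off being optimized can be sketched as a toy SIR outbreak on a rewired ring lattice with local (radius-based) treatment; the network construction, cost structure, and parameter values are illustrative assumptions, not the paper's models:

```python
import random

def outbreak_cost(n=200, k=2, p_rewire=0.05, beta=0.5, treat_radius=1,
                  cost_infection=1.0, cost_treatment=0.2, seed=0):
    """Discrete-time SIR on a small-world network: when a node is detected
    as infectious, every node within `treat_radius` hops is treated
    (moved to R).  Returns total cost = infections + treatments, weighted."""
    rng = random.Random(seed)
    nbrs = {i: set() for i in range(n)}
    for i in range(n):                     # ring lattice, then rewire
        for d in range(1, k + 1):
            j = (i + d) % n
            if rng.random() < p_rewire:
                j = rng.randrange(n)
            if j != i:
                nbrs[i].add(j); nbrs[j].add(i)
    state = {i: "S" for i in range(n)}
    state[0] = "I"
    infections, treatments = 1, 0
    active = [0]
    while active:
        new = []
        for u in active:
            for v in nbrs[u]:
                if state[v] == "S" and rng.random() < beta:
                    state[v] = "I"; infections += 1; new.append(v)
            ring = {u}                     # treat everyone near the case
            for _ in range(treat_radius):
                ring |= {w for x in ring for w in nbrs[x]}
            for w in ring:
                if state[w] == "S":
                    treatments += 1
                state[w] = "R"
        active = [v for v in new if state[v] == "I"]
    return infections * cost_infection + treatments * cost_treatment
```

    Scanning `treat_radius` (the control strategy) for the minimum of this return value is the balance the abstract describes.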

  7. Efficient control of epidemics spreading on networks: balance between treatment and recovery.

    PubMed

    Oleś, Katarzyna; Gudowska-Nowak, Ewa; Kleczkowski, Adam

    2013-01-01

    We analyse two models describing disease transmission and control on regular and small-world networks. We use simulations to find a control strategy that minimizes the total cost of an outbreak, thus balancing the cost of the disease against that of preventive treatment. The models are similar in their epidemiological part but differ in how removed/recovered individuals are treated. The differences between the models affect the choice of strategy only for very cheap treatment and a slowly spreading disease. However, for the combinations of parameters that matter most from the epidemiological perspective (high infectiousness and expensive treatment), the models give similar results. Moreover, even where the choice of strategy differs, the total cost spent on controlling the epidemic is very similar for both models.

  8. A Parallel Sliding Region Algorithm to Make Agent-Based Modeling Possible for a Large-Scale Simulation: Modeling Hepatitis C Epidemics in Canada.

    PubMed

    Wong, William W L; Feng, Zeny Z; Thein, Hla-Hla

    2016-11-01

    Agent-based models (ABMs) are computer simulation models that define interactions among agents and simulate the emergent behaviors that arise from the ensemble of local decisions. ABMs have been increasingly used to examine trends in infectious disease epidemiology. However, their main limitation is the high computational cost of large-scale simulations. To improve computational efficiency for large-scale ABM simulations, we built a parallelizable sliding region algorithm (SRA) for ABMs and compared it to a nonparallelizable ABM. We developed a complex agent network and performed two simulations to model hepatitis C epidemics based on real demographic data from Saskatchewan, Canada. The first simulation used the SRA, which processed each postal-code subregion in turn; the second processed the entire population simultaneously. The parallelizable SRA achieved computational time savings with comparable results in a province-wide simulation. Using the same method, the SRA can be generalized to a country-wide simulation. This parallel algorithm thus makes ABMs feasible for large-scale simulation with limited computational resources.
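    The region-at-a-time idea can be caricatured in a few lines: split the population into subregions, simulate each on its own so only one region's agents are in memory, and carry a small coupling term across boundaries. The region sizes, growth model, and 10% leakage are illustrative assumptions, not the paper's algorithm:

```python
import random

def simulate_regions(pops, beta=0.05, steps=10, seed=0):
    """Sliding-region sketch: subregions (e.g. postal codes) are simulated
    sequentially; `carry` stands in for cross-boundary infection pressure.
    Because regions are independent given `carry`, they could equally be
    farmed out to parallel workers."""
    rng = random.Random(seed)
    carry = 0.0
    totals = []
    for pop in pops:                      # slide over subregions in turn
        infected = 1
        for _ in range(steps):
            pressure = beta * (infected + carry) / pop
            new = sum(1 for _ in range(pop - infected)
                      if rng.random() < pressure)
            infected = min(pop, infected + new)
        carry = 0.1 * infected            # leak a fraction across the boundary
        totals.append(infected)
    return totals
```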

  9. Estimation of a partially linear additive model for data from an outcome-dependent sampling design with a continuous outcome

    PubMed Central

    Tan, Ziwen; Qin, Guoyou; Zhou, Haibo

    2016-01-01

    Outcome-dependent sampling (ODS) designs have been well recognized as a cost-effective way to enhance study efficiency in both statistical literature and biomedical and epidemiologic studies. A partially linear additive model (PLAM) is widely applied in real problems because it allows for a flexible specification of the dependence of the response on some covariates in a linear fashion and other covariates in a nonlinear non-parametric fashion. Motivated by an epidemiological study investigating the effect of prenatal polychlorinated biphenyls exposure on children's intelligence quotient (IQ) at age 7 years, we propose a PLAM in this article to investigate a more flexible non-parametric inference on the relationships among the response and covariates under the ODS scheme. We propose the estimation method and establish the asymptotic properties of the proposed estimator. Simulation studies are conducted to show the improved efficiency of the proposed ODS estimator for PLAM compared with that from a traditional simple random sampling design with the same sample size. The data from the above-mentioned study are analyzed to illustrate the proposed method. PMID:27006375
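    The sampling scheme itself is simple to sketch: a simple random sample supplemented with extra draws from the two tails of the continuous outcome, as in two-component ODS designs. Sample sizes and the 10%/90% cut points below are illustrative assumptions:

```python
import numpy as np

def ods_sample(y, n_srs=100, n_tail=25, q=0.1, seed=0):
    """Outcome-dependent sample for a continuous outcome y: an SRS of
    size n_srs plus n_tail supplemental draws from each of the lower and
    upper outcome tails (cut at quantiles q and 1-q).  Returns indices."""
    rng = np.random.default_rng(seed)
    n = len(y)
    srs = rng.choice(n, size=n_srs, replace=False)
    lo, hi = np.quantile(y, [q, 1 - q])
    lower = np.where(y <= lo)[0]
    upper = np.where(y >= hi)[0]
    tails = np.concatenate([rng.choice(lower, n_tail, replace=False),
                            rng.choice(upper, n_tail, replace=False)])
    return np.unique(np.concatenate([srs, tails]))
```

    Analysis must then account for the biased sampling (the point of the paper's estimator); fitting an ordinary regression to these indices as if they were an SRS would be biased.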

  10. An epidemiological model of internet worms with hierarchical dispersal and spatial clustering of hosts.

    PubMed

    Hiebeler, David E; Audibert, Andrew; Strubell, Emma; Michaud, Isaac J

    2017-04-07

    Beginning in 2001, many instances of malicious software known as Internet worms have used biological strategies such as hierarchical dispersal to seek out and spread to new susceptible hosts more efficiently. We measured the distribution of potentially susceptible hosts in the space of Internet addresses to determine their clustering. We used the results to construct a full-size simulated Internet with 2^32 hosts, with the mean and variance of susceptible hosts chosen to match our measurements at multiple spatial scales. Epidemiological simulations of outbreaks among the roughly 2.8×10^6 susceptible hosts on this full-sized network show that local preference scanning increases the chances for an infected host to locate and infect other susceptible hosts by a factor of as much as several hundred. However, once this strategy is deployed, the overall success of a worm is relatively insensitive to the details of its dispersal strategy over a wide range of parameters. In addition, although localized interactions may allow malicious software to spread more rapidly or to more hosts on average, they can also lead to increased variability in infection levels among replicate simulations. Using such dispersal strategies may therefore be a high-risk, high-reward strategy for the authors of such software. Copyright © 2017 Elsevier Ltd. All rights reserved.
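    Local preference scanning is easy to demonstrate on a toy address space: with some probability an infected host scans an address near its own, otherwise a uniformly random one. When susceptible hosts are clustered, local scans find targets far more often. The 2^16 address space, clustering, and rates below are illustrative, not the paper's measured values:

```python
import random

def spread(susceptible, n_addresses=2**16, p_local=0.8, locality=256,
           scans_per_step=5, steps=60, seed=0):
    """Local-preference scanning worm on a toy address space: each
    infected host makes scans_per_step scans per step, local (within
    +/- locality of itself) with probability p_local, else uniform."""
    rng = random.Random(seed)
    infected = {min(susceptible)}
    for _ in range(steps):
        newly = set()
        for h in infected:
            for _ in range(scans_per_step):
                if rng.random() < p_local:
                    target = h + rng.randint(-locality, locality)
                else:
                    target = rng.randrange(n_addresses)
                if target in susceptible and target not in infected:
                    newly.add(target)
        infected |= newly
        if len(infected) == len(susceptible):
            break
    return len(infected)

# clustered susceptibles: two dense blocks in a sparse space
hosts = set(range(1000, 1200)) | set(range(40000, 40200))
```

    Comparing `p_local=0.8` against `p_local=0.0` on the same clustered `hosts` shows the multiplicative advantage the abstract reports.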

  11. Bayesian History Matching of Complex Infectious Disease Models Using Emulation: A Tutorial and a Case Study on HIV in Uganda

    PubMed Central

    Andrianakis, Ioannis; Vernon, Ian R.; McCreesh, Nicky; McKinley, Trevelyan J.; Oakley, Jeremy E.; Nsubuga, Rebecca N.; Goldstein, Michael; White, Richard G.

    2015-01-01

    Advances in scientific computing have allowed the development of complex models that are being routinely applied to problems in disease epidemiology, public health and decision making. The utility of these models depends in part on how well they can reproduce empirical data. However, fitting such models to real world data is greatly hindered both by large numbers of input and output parameters, and by long run times, such that many modelling studies lack a formal calibration methodology. We present a novel method that has the potential to improve the calibration of complex infectious disease models (hereafter called simulators). We present this in the form of a tutorial and a case study where we history match a dynamic, event-driven, individual-based stochastic HIV simulator, using extensive demographic, behavioural and epidemiological data available from Uganda. The tutorial describes history matching and emulation. History matching is an iterative procedure that reduces the simulator's input space by identifying and discarding areas that are unlikely to provide a good match to the empirical data. History matching relies on the computational efficiency of a Bayesian representation of the simulator, known as an emulator. Emulators mimic the simulator's behaviour, but are often several orders of magnitude faster to evaluate. In the case study, we use a 22-input simulator, fitting its 18 outputs simultaneously. After 9 iterations of history matching, a non-implausible region of the simulator input space was identified that was many orders of magnitude smaller than the original input space. Simulator evaluations made within this region were found to have a 65% probability of fitting all 18 outputs. History matching and emulation are useful additions to the toolbox of infectious disease modellers. Further research is required to explicitly address the stochastic nature of the simulator as well as to account for correlations between outputs. PMID:25569850
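    The core of one history-matching wave is the implausibility measure: inputs whose emulated output is too many standard deviations from the observation are discarded. A minimal sketch, assuming the emulator is supplied as a cheap mean/variance pair (a toy quadratic stands in for a Gaussian process below; the cutoff of 3 is the usual convention):

```python
import numpy as np

def implausibility(x, emulator_mean, emulator_var, z, obs_var):
    """I(x) = |z - E[f(x)]| / sqrt(Var_em(x) + Var_obs)."""
    return np.abs(z - emulator_mean(x)) / np.sqrt(emulator_var(x) + obs_var)

def history_match_wave(candidates, emulator_mean, emulator_var, z,
                       obs_var, cutoff=3.0):
    """One wave of history matching: keep the non-implausible inputs.
    Later waves refit the emulator inside the kept region and repeat."""
    I = implausibility(candidates, emulator_mean, emulator_var, z, obs_var)
    return candidates[I < cutoff]
```

    With a multi-output simulator, one takes the maximum implausibility over outputs (or a multivariate version) before applying the cutoff.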

  12. Computational Evolutionary Methodology for Knowledge Discovery and Forecasting in Epidemiology and Medicine

    NASA Astrophysics Data System (ADS)

    Rao, Dhananjai M.; Chernyakhovsky, Alexander; Rao, Victoria

    2008-05-01

    Humanity is facing an increasing number of highly virulent and communicable diseases such as avian influenza. Researchers believe that avian influenza has the potential to evolve into one of the deadliest pandemics. Combating these diseases requires in-depth knowledge of their epidemiology. An effective methodology for discovering epidemiological knowledge is to utilize a descriptive, evolutionary, ecological model and use bio-simulations to study and analyze it. These types of bio-simulations fall under the category of computational evolutionary methods because the individual entities participating in the simulation are permitted to evolve in a natural manner by reacting to changes in the simulated ecosystem. This work describes the application of the aforementioned methodology to discover epidemiological knowledge about avian influenza using a novel eco-modeling and bio-simulation environment called SEARUMS. The mathematical principles underlying SEARUMS, its design, and the procedure for using SEARUMS are discussed. The bio-simulations and multi-faceted case studies conducted using SEARUMS elucidate its ability to pinpoint timelines, epicenters, and socioeconomic impacts of avian influenza. This knowledge is invaluable for the proactive deployment of countermeasures in order to minimize negative socioeconomic impacts, combat the disease, and avert a pandemic.

  13. Computational Evolutionary Methodology for Knowledge Discovery and Forecasting in Epidemiology and Medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Dhananjai M.; Chernyakhovsky, Alexander; Rao, Victoria

    2008-05-08

    Humanity is facing an increasing number of highly virulent and communicable diseases such as avian influenza. Researchers believe that avian influenza has the potential to evolve into one of the deadliest pandemics. Combating these diseases requires in-depth knowledge of their epidemiology. An effective methodology for discovering epidemiological knowledge is to utilize a descriptive, evolutionary, ecological model and use bio-simulations to study and analyze it. These types of bio-simulations fall under the category of computational evolutionary methods because the individual entities participating in the simulation are permitted to evolve in a natural manner by reacting to changes in the simulated ecosystem. This work describes the application of the aforementioned methodology to discover epidemiological knowledge about avian influenza using a novel eco-modeling and bio-simulation environment called SEARUMS. The mathematical principles underlying SEARUMS, its design, and the procedure for using SEARUMS are discussed. The bio-simulations and multi-faceted case studies conducted using SEARUMS elucidate its ability to pinpoint timelines, epicenters, and socioeconomic impacts of avian influenza. This knowledge is invaluable for the proactive deployment of countermeasures in order to minimize negative socioeconomic impacts, combat the disease, and avert a pandemic.

  14. Problems encountered with the use of simulation in an attempt to enhance interpretation of a secondary data source in epidemiologic mental health research

    PubMed Central

    2010-01-01

    Background The longitudinal epidemiology of major depressive episodes (MDE) is poorly characterized in most countries. Some potentially relevant data sources may be underutilized because they are not conducive to estimating the most salient epidemiologic parameters. An available data source in Canada provides estimates that are potentially valuable, but that are difficult to apply in clinical or public health practice. For example, weeks depressed in the past year is assessed in this data source whereas episode duration would be of more interest. The goal of this project was to derive, using simulation, more readily interpretable parameter values from the available data. Findings The data source was a Canadian longitudinal study called the National Population Health Survey (NPHS). A simulation model representing the course of depressive episodes was used to reshape estimates deriving from binary and ordinal logistic models (fit to the NPHS data) into equations more capable of informing clinical and public health decisions. Discrete event simulation was used for this purpose. Whereas the intention was to clarify a complex epidemiology, the models themselves needed to become excessively complex in order to provide an accurate description of the data. Conclusions Simulation methods are useful in circumstances where a representation of a real-world system has practical value. In this particular scenario, the usefulness of simulation was limited both by problems with the data source and by inherent complexity of the underlying epidemiology. PMID:20796271
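    The awkward mapping the authors needed to invert — from episode durations, which are clinically meaningful, to "weeks depressed in the past year", which is what the NPHS measures — can be sketched with a small discrete-event simulation. The Poisson episode arrivals and exponential durations below are illustrative assumptions, not the NPHS models:

```python
import random

def weeks_depressed_past_year(episode_rate=0.5, mean_duration_weeks=12,
                              years=10, seed=0):
    """Simulate a subject's depressive episodes (arrival rate in episodes
    per year, duration in weeks), then report how many of the last 52
    weeks were spent in an episode -- the quantity the survey observes."""
    rng = random.Random(seed)
    horizon = years * 52.0
    t, episodes = 0.0, []
    while t < horizon:
        t += rng.expovariate(episode_rate / 52.0)   # waiting time in weeks
        dur = rng.expovariate(1.0 / mean_duration_weeks)
        episodes.append((t, t + dur))
    lo, hi = horizon - 52.0, horizon
    total = sum(max(0.0, min(b, hi) - max(a, lo)) for a, b in episodes)
    return min(52.0, total)
```

    Calibrating `episode_rate` and `mean_duration_weeks` so the simulated output matches the survey distribution is the inversion the paper attempted; the difficulties it reports arose when realistic heterogeneity had to be layered onto this simple core.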

  15. Likelihood-based inference for discretely observed birth-death-shift processes, with applications to evolution of mobile genetic elements.

    PubMed

    Xu, Jason; Guttorp, Peter; Kato-Maeda, Midori; Minin, Vladimir N

    2015-12-01

    Continuous-time birth-death-shift (BDS) processes are frequently used in stochastic modeling, with many applications in ecology and epidemiology. In particular, such processes can model evolutionary dynamics of transposable elements-important genetic markers in molecular epidemiology. Estimation of the effects of individual covariates on the birth, death, and shift rates of the process can be accomplished by analyzing patient data, but inferring these rates in a discretely and unevenly observed setting presents computational challenges. We propose a multi-type branching process approximation to BDS processes and develop a corresponding expectation maximization algorithm, where we use spectral techniques to reduce calculation of expected sufficient statistics to low-dimensional integration. These techniques yield an efficient and robust optimization routine for inferring the rates of the BDS process, and apply broadly to multi-type branching processes whose rates can depend on many covariates. After rigorously testing our methodology in simulation studies, we apply our method to study intrapatient time evolution of IS6110 transposable element, a genetic marker frequently used during estimation of epidemiological clusters of Mycobacterium tuberculosis infections. © 2015, The International Biometric Society.
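    The forward model is straightforward to simulate even though inference from discrete observations is hard. A Gillespie-style sketch of a birth-death-shift process for transposon copies, with illustrative rates (not estimates from the paper):

```python
import random

def simulate_bds(n0=1, birth=0.2, death=0.1, shift=0.05, t_max=10.0, seed=0):
    """Gillespie simulation of a birth-death-shift process: each element
    copy duplicates (birth), is lost (death), or moves to a new genomic
    location (shift -- count unchanged, but an observable event)."""
    rng = random.Random(seed)
    n, t, shifts = n0, 0.0, 0
    while n > 0 and t < t_max and n < 10_000:   # cap guards runaway growth
        total = n * (birth + death + shift)
        t += rng.expovariate(total)
        if t >= t_max:
            break
        u = rng.random() * (birth + death + shift)
        if u < birth:
            n += 1
        elif u < birth + death:
            n -= 1
        else:
            shifts += 1
    return n, shifts
```

    The inference problem the paper solves is recovering the three rates from snapshots of `n` (and element locations) at a few irregular times, without seeing the event sequence this simulation has access to.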

  16. Cancer Epidemiology Data Repository (CEDR)

    Cancer.gov

    In an effort to broaden access and facilitate efficient data sharing, the Epidemiology and Genomics Research Program (EGRP) has created the Cancer Epidemiology Data Repository (CEDR), a centralized, controlled-access database where investigators can deposit individual-level, de-identified observational cancer datasets.

  17. Phylodynamic Inference with Kernel ABC and Its Application to HIV Epidemiology.

    PubMed

    Poon, Art F Y

    2015-09-01

    The shapes of phylogenetic trees relating virus populations are determined by the adaptation of viruses within each host, and by the transmission of viruses among hosts. Phylodynamic inference attempts to reverse this flow of information, estimating parameters of these processes from the shape of a virus phylogeny reconstructed from a sample of genetic sequences from the epidemic. A key challenge to phylodynamic inference is quantifying the similarity between two trees in an efficient and comprehensive way. In this study, I demonstrate that a new distance measure, based on a subset tree kernel function from computational linguistics, confers a significant improvement over previous measures of tree shape for classifying trees generated under different epidemiological scenarios. Next, I incorporate this kernel-based distance measure into an approximate Bayesian computation (ABC) framework for phylodynamic inference. ABC bypasses the need for an analytical solution of model likelihood, as it only requires the ability to simulate data from the model. I validate this "kernel-ABC" method for phylodynamic inference by estimating parameters from data simulated under a simple epidemiological model. Results indicate that kernel-ABC attained greater accuracy for parameters associated with virus transmission than leading software on the same data sets. Finally, I apply the kernel-ABC framework to study a recent outbreak of a recombinant HIV subtype in China. Kernel-ABC provides a versatile framework for phylodynamic inference because it can fit a broader range of models than methods that rely on the computation of exact likelihoods. © The Author 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
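    The ABC backbone the kernel slots into is just rejection sampling: draw parameters from the prior, simulate data, and keep draws whose summary is close to the observed one. A minimal sketch with a toy chain-binomial outbreak and final size as the summary (the paper's distance is instead a subset-tree kernel on phylogeny shapes; everything below is an illustrative stand-in):

```python
import random

def simulate_outbreak(beta, n=100, seed=None):
    """Final size (excluding the index case) of a toy chain-binomial
    outbreak with per-generation transmission governed by beta."""
    rng = random.Random(seed)
    infected, susceptible = 1, n - 1
    while infected and susceptible:
        p = 1 - (1 - beta / n) ** infected
        new = sum(1 for _ in range(susceptible) if rng.random() < p)
        susceptible -= new
        infected = new
    return (n - 1) - susceptible

def abc_rejection(observed, prior=lambda r: r.uniform(0.5, 5.0),
                  n_draws=500, tol=10, seed=0):
    """Rejection ABC: keep prior draws whose simulated summary lands
    within tol of the observed summary.  Kernel ABC replaces this hard
    cutoff with a kernel-weighted distance between full tree shapes."""
    rng = random.Random(seed)
    kept = []
    for i in range(n_draws):
        beta = prior(rng)
        if abs(simulate_outbreak(beta, seed=seed + i + 1) - observed) <= tol:
            kept.append(beta)
    return kept
```

    The accepted `beta` values approximate the posterior; no likelihood is ever evaluated, which is what lets ABC handle models where only simulation is tractable.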

  18. New Insights into Handling Missing Values in Environmental Epidemiological Studies

    PubMed Central

    Roda, Célina; Nicolis, Ioannis; Momas, Isabelle; Guihenneuc, Chantal

    2014-01-01

    Missing data are unavoidable in environmental epidemiologic surveys. The aim of this study was to compare methods for handling large amounts of missing values: omission of missing values, single and multiple imputations (through linear regression or partial least squares regression), and a fully Bayesian approach. These methods were applied to the PARIS birth cohort, where indoor domestic pollutant measurements were performed in a random sample of babies' dwellings. A simulation study was conducted to assess the performance of the different approaches with a high proportion of missing values (from 50% to 95%). Different simulation scenarios were carried out, controlling the true value of the association (odds ratio of 1.0, 1.2, and 1.4) and varying the health outcome prevalence. When a large amount of data is missing, omitting the missing data reduced statistical power and inflated standard errors, which affected the significance of the association. Single imputation underestimated the variability and considerably increased the risk of type I error. All approaches were conservative except the fully Bayesian joint model. For a common health outcome, the fully Bayesian approach is the most efficient (low root mean square error, reasonable type I error, and high statistical power); for a less prevalent event, its type I error is increased and its statistical power reduced. The estimated posterior distribution of the OR is useful to refine the conclusion. No single approach to handling missing values is best in all circumstances, but when the usual approaches (e.g. single imputation) are not sufficient, jointly modelling the missingness process and the health association is more efficient when large amounts of data are missing. PMID:25226278
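    The finding that single imputation "underestimated the variability" can be reproduced in a few lines: filling every hole with one predicted value shrinks the spread of the completed data, while multiple imputation adds between-imputation variance via Rubin's rules. A toy mean-imputation demonstration (not the paper's regression or Bayesian models):

```python
import random
import statistics

def mean_se(sample):
    return statistics.stdev(sample) / len(sample) ** 0.5

def compare_imputation(n=200, miss=0.5, m=20, seed=0):
    """Return (single-imputation SE, multiple-imputation SE) for the mean
    of a standard-normal variable with a fraction `miss` missing."""
    rng = random.Random(seed)
    data = [rng.gauss(0, 1) for _ in range(n)]
    observed = [x for x in data if rng.random() > miss]
    mu = statistics.fmean(observed)
    # single imputation: every missing value replaced by the observed mean
    single = observed + [mu] * (n - len(observed))
    # multiple imputation: m completed datasets with noise, Rubin's rules
    ses, means = [], []
    for _ in range(m):
        draws = observed + [rng.gauss(mu, statistics.stdev(observed))
                            for _ in range(n - len(observed))]
        means.append(statistics.fmean(draws))
        ses.append(mean_se(draws))
    within = statistics.fmean(se ** 2 for se in ses)
    between = statistics.variance(means)
    return mean_se(single), (within + (1 + 1 / m) * between) ** 0.5
```

    The single-imputation standard error is systematically too small, which is exactly the mechanism behind the inflated type I error the study reports.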

  19. Simulation methods to estimate design power: an overview for applied research.

    PubMed

    Arnold, Benjamin F; Hogan, Daniel R; Colford, John M; Hubbard, Alan E

    2011-06-20

    Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research.
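    The simple end of the approach described here — Monte Carlo power for an individually randomized two-arm trial, analysed with a z-test — fits in a short function; swapping in a richer data-generating model (clusters, covariates) is how the method extends to complex designs. This is a generic sketch in Python rather than the article's R/Stata code:

```python
import random
import statistics

def simulated_power(n_per_arm=50, effect=0.6, sd=1.0, n_sims=400, seed=0):
    """Monte Carlo power: simulate the trial n_sims times under the
    assumed effect, test each replicate (two-sided z-test at alpha=0.05),
    and report the fraction of replicates that reject."""
    rng = random.Random(seed)
    se = sd * (2.0 / n_per_arm) ** 0.5
    z_crit = 1.96
    rejections = 0
    for _ in range(n_sims):
        a = [rng.gauss(0.0, sd) for _ in range(n_per_arm)]
        b = [rng.gauss(effect, sd) for _ in range(n_per_arm)]
        z = (statistics.fmean(b) - statistics.fmean(a)) / se
        rejections += abs(z) > z_crit
    return rejections / n_sims
```

    For this configuration the conventional power equation gives about 0.85, and the simulation estimate should agree up to Monte Carlo error — the cross-check the article uses to build trust in the method before moving to designs with no closed-form equation.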

  20. Simulation methods to estimate design power: an overview for applied research

    PubMed Central

    2011-01-01

    Background Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. Methods We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. Results We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Conclusions Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research. PMID:21689447

  21. Transmission of infectious diseases en route to habitat hotspots.

    PubMed

    Benavides, Julio; Walsh, Peter D; Meyers, Lauren Ancel; Raymond, Michel; Caillaud, Damien

    2012-01-01

    The spread of infectious diseases in wildlife populations is influenced by patterns of between-host contacts. Habitat "hotspots"--places attracting a large number of individuals or social groups--can significantly alter contact patterns and, hence, disease propagation. Research on the importance of habitat hotspots in wildlife epidemiology has primarily focused on how inter-individual contacts occurring at the hotspot itself increase disease transmission. However, in territorial animals, epidemiologically important contacts may primarily occur as animals cross through territories of conspecifics en route to habitat hotspots. So far, the phenomenon has received little attention. Here, we investigate the importance of these contacts in the case where infectious individuals keep visiting the hotspots and in the case where they can no longer travel to the hotspot. We developed an epidemiological simulation model to investigate both cases in a scenario where transmission at the hotspot itself does not occur. We find that (i) hotspots still exacerbate epidemics, (ii) when infectious individuals do not travel to the hotspot, the most vulnerable individuals are those residing at intermediate distances from the hotspot rather than nearby, and (iii) the epidemiological vulnerability of a population is highest when the number of hotspots is intermediate. By altering animal movements in their vicinity, habitat hotspots can thus strongly increase the spread of infectious diseases, even when disease transmission does not occur at the hotspot itself. Interestingly, when animals only visit the nearest hotspot, creating additional artificial hotspots, rather than reducing their number, may be an efficient disease control measure.
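    The en-route mechanism can be sketched on a one-dimensional gradient of territories: on each trip to the hotspot, an infectious resident crosses every territory between its home and the hotspot and may infect each resident it passes, with no transmission at the hotspot itself. The geometry and rates below are illustrative assumptions, not the paper's model:

```python
import random

def hotspot_epidemic(n=50, hotspot=0, p_transmit=0.3, trips=20, seed=0):
    """Territories sit at positions 0..n-1 with a hotspot at position 0.
    Each trip, every infectious resident travels home -> hotspot and may
    infect each resident whose territory it crosses (hotspot excluded).
    Returns the final number infected."""
    rng = random.Random(seed)
    infected = [False] * n
    infected[n // 2] = True          # index case midway along the gradient
    for _ in range(trips):
        newly = []
        for home in range(n):
            if not infected[home]:
                continue
            step = -1 if home > hotspot else 1
            for pos in range(home + step, hotspot, step):
                if not infected[pos] and rng.random() < p_transmit:
                    newly.append(pos)
        for pos in newly:
            infected[pos] = True
    return sum(infected)
```

    Note the directionality: infection only reaches territories between the index case and the hotspot, which is why distance from the hotspot, not contact at it, structures the risk.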

  2. Next Generation Analytic Tools for Large Scale Genetic Epidemiology Studies of Complex Diseases

    PubMed Central

    Mechanic, Leah E.; Chen, Huann-Sheng; Amos, Christopher I.; Chatterjee, Nilanjan; Cox, Nancy J.; Divi, Rao L.; Fan, Ruzong; Harris, Emily L.; Jacobs, Kevin; Kraft, Peter; Leal, Suzanne M.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Province, Michael A.; Ramos, Erin M.; Ritchie, Marylyn D.; Roeder, Kathryn; Schaid, Daniel J.; Stephens, Matthew; Thomas, Duncan C.; Weinberg, Clarice R.; Witte, John S.; Zhang, Shunpu; Zöllner, Sebastian; Feuer, Eric J.; Gillanders, Elizabeth M.

    2012-01-01

    Over the past several years, genome-wide association studies (GWAS) have succeeded in identifying hundreds of genetic markers associated with common diseases. However, most of these markers confer relatively small increments of risk and explain only a small proportion of familial clustering. To identify obstacles to future progress in genetic epidemiology research and provide recommendations to NIH for overcoming these barriers, the National Cancer Institute sponsored a workshop entitled “Next Generation Analytic Tools for Large-Scale Genetic Epidemiology Studies of Complex Diseases” on September 15–16, 2010. The goal of the workshop was to facilitate discussions on (1) statistical strategies and methods to efficiently identify genetic and environmental factors contributing to the risk of complex disease; and (2) how to develop, apply, and evaluate these strategies for the design, analysis, and interpretation of large-scale complex disease association studies in order to guide NIH in setting the future agenda in this area of research. The workshop was organized as a series of short presentations covering scientific (gene-gene and gene-environment interaction, complex phenotypes, and rare variants and next generation sequencing) and methodological (simulation modeling and computational resources and data management) topic areas. Specific needs to advance the field were identified during each session and are summarized. PMID:22147673

  3. Foodborne outbreak simulation to teach field epidemiology: the Moroccan Field Epidemiology Training Program.

    PubMed

    Jroundi, Imane; Belarbi, Abdellatif

    2016-11-01

    In 2010, Morocco launched a new field epidemiology training program to enhance the skills of health professionals in charge of epidemiological surveillance and outbreak investigation, including of foodborne diseases, which represent a substantial burden of disease. The aim was to apply an active learning method to teach outbreak investigation within a controlled environment to field epidemiology trainees at the Moroccan National School of Public Health. A scenario describing digestive symptoms evoking a restaurant-associated foodborne outbreak affecting the school staff was designed for the residents to investigate, to assess their organizational capacity and their application of all stages of an epidemiological investigation. Nine residents applied study design, database management, and statistical analysis to investigate the simulated foodborne outbreak: estimating attack rates, classifying cases and controls, identifying the contaminated foods and pathogens, and issuing recommendations for the control and prevention of further transmission. Overall resident satisfaction with the learning method was 67%. Simulating an outbreak investigation within an academic setting is an active learning method that can be used in the field epidemiology curriculum to introduce residents to the principles and practices of outbreak investigation before they face a real situation.

  4. Molecular Infectious Disease Epidemiology: Survival Analysis and Algorithms Linking Phylogenies to Transmission Trees

    PubMed Central

    Kenah, Eben; Britton, Tom; Halloran, M. Elizabeth; Longini, Ira M.

    2016-01-01

    Recent work has attempted to use whole-genome sequence data from pathogens to reconstruct the transmission trees linking infectors and infectees in outbreaks. However, transmission trees from one outbreak do not generalize to future outbreaks. Reconstruction of transmission trees is most useful to public health if it leads to generalizable scientific insights about disease transmission. In a survival analysis framework, estimation of transmission parameters is based on sums or averages over the possible transmission trees. A phylogeny can increase the precision of these estimates by providing partial information about who infected whom. The leaves of the phylogeny represent sampled pathogens, which have known hosts. The interior nodes represent common ancestors of sampled pathogens, which have unknown hosts. Starting from assumptions about disease biology and epidemiologic study design, we prove that there is a one-to-one correspondence between the possible assignments of interior node hosts and the transmission trees simultaneously consistent with the phylogeny and the epidemiologic data on person, place, and time. We develop algorithms to enumerate these transmission trees and show these can be used to calculate likelihoods that incorporate both epidemiologic data and a phylogeny. A simulation study confirms that this leads to more efficient estimates of hazard ratios for infectiousness and baseline hazards of infectious contact, and we use these methods to analyze data from a foot-and-mouth disease virus outbreak in the United Kingdom in 2001. These results demonstrate the importance of data on individuals who escape infection, which is often overlooked. The combination of survival analysis and algorithms linking phylogenies to transmission trees is a rigorous but flexible statistical foundation for molecular infectious disease epidemiology. PMID:27070316

  5. Invasive meningococcal disease epidemiology and control measures: a framework for evaluation.

    PubMed

    Caro, J Jaime; Möller, Jörgen; Getsios, Denis; Coudeville, L; El-Hadi, Wissam; Chevat, Catherine; Nguyen, Van Hung; Caro, Ingrid

    2007-06-29

    Meningococcal disease can have devastating consequences. As new vaccines emerge, it is necessary to assess their impact on public health. In the absence of long-term real world data, modeling the effects of different vaccination strategies is required. Discrete event simulation provides a flexible platform with which to conduct such evaluations. A discrete event simulation of the epidemiology of invasive meningococcal disease was developed to quantify the potential impact of implementing routine vaccination of adolescents in the United States with a quadrivalent conjugate vaccine protecting against serogroups A, C, Y, and W-135. The impact of vaccination is assessed including both the direct effects on individuals vaccinated and the indirect effects resulting from herd immunity. The simulation integrates a variety of epidemiologic and demographic data, with core information on the incidence of invasive meningococcal disease and outbreak frequency derived from data available through the Centers for Disease Control and Prevention. Simulation of the potential indirect benefits of vaccination resulting from herd immunity draws on data from the United Kingdom, where routine vaccination with a conjugate vaccine has been in place for a number of years. Cases of disease are modeled along with their health consequences, as are the occurrence of disease outbreaks. When run without a strategy of routine immunization, the simulation accurately predicts the age-specific incidence of invasive meningococcal disease and the site-specific frequency of outbreaks in the United States. 2,807 cases are predicted annually, resulting in over 14,000 potential life years lost due to invasive disease. In base case analyses of routine vaccination, life years lost due to infection are reduced by over 45% (to 7,600) when routinely vaccinating adolescents 12 years of age at 70% coverage. 
Sensitivity analyses indicate that herd immunity plays an important role when this population is targeted for vaccination. While 1,100 cases are avoided annually when herd immunity effects are included, in the absence of any herd immunity, the number of cases avoided with routine vaccination falls to 380 annually. The duration of vaccine protection also strongly influences results. In the absence of appropriate real world data on outcomes associated with large-scale vaccination programs, decisions on optimal immunization strategies can be aided by discrete event simulations such as the one described here. Given the importance of herd immunity on outcomes associated with routine vaccination, published estimates of the economic efficiency of routine vaccination with a quadrivalent conjugate vaccine in the United States may have considerably underestimated the benefits associated with a policy of routine immunization of adolescents.
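
    The stochastic core that a disease simulation of this kind builds on can be sketched as a Gillespie-style event simulation, where infection and recovery events occur one at a time at exponentially distributed intervals. This is a generic SIR sketch, not the published meningococcal model, and the parameters are hypothetical.

    ```python
    import random

    def stochastic_sir(n_pop, beta, gamma, seed=42):
        """Event-driven stochastic SIR: at each step, draw the waiting
        time to the next event from the total event rate, then pick
        infection or recovery in proportion to their rates. Returns the
        cumulative number of cases when the outbreak dies out."""
        rng = random.Random(seed)
        s, i = n_pop - 1, 1
        cases = 1
        while i > 0:
            inf_rate = beta * s * i / n_pop   # force of infection
            rec_rate = gamma * i
            rng.expovariate(inf_rate + rec_rate)  # advance the event clock
            if rng.random() < inf_rate / (inf_rate + rec_rate):
                s -= 1; i += 1; cases += 1    # infection event
            else:
                i -= 1                         # recovery event
        return cases
    ```

    A full discrete event simulation layers individual attributes (age, vaccination status, waning protection) and scheduled interventions on top of this kind of event loop.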

  6. Bayesian reconstruction of transmission within outbreaks using genomic variants.

    PubMed

    De Maio, Nicola; Worby, Colin J; Wilson, Daniel J; Stoesser, Nicole

    2018-04-01

    Pathogen genome sequencing can reveal details of transmission histories and is a powerful tool in the fight against infectious disease. In particular, within-host pathogen genomic variants identified through heterozygous nucleotide base calls are a potential source of information to identify linked cases and infer direction and time of transmission. However, using such data effectively to model disease transmission presents a number of challenges, including differentiating genuine variants from those observed due to sequencing error, as well as the specification of a realistic model for within-host pathogen population dynamics. Here we propose a new Bayesian approach to transmission inference, BadTrIP (BAyesian epiDemiological TRansmission Inference from Polymorphisms), that explicitly models evolution of pathogen populations in an outbreak, transmission (including transmission bottlenecks), and sequencing error. BadTrIP enables the inference of host-to-host transmission from pathogen sequencing data and epidemiological data. By assuming that genomic variants are unlinked, our method does not require the computationally intensive and unreliable reconstruction of individual haplotypes. Using simulations we show that BadTrIP is robust in most scenarios and can accurately infer transmission events by efficiently combining information from genetic and epidemiological sources; thanks to its realistic model of pathogen evolution and the inclusion of epidemiological data, BadTrIP is also more accurate than existing approaches. BadTrIP is distributed as an open source package (https://bitbucket.org/nicofmay/badtrip) for the phylogenetic software BEAST2. We apply our method to reconstruct transmission history at the early stages of the 2014 Ebola outbreak, showcasing the power of within-host genomic variants to reconstruct transmission events.

  7. Towards an integrated food safety surveillance system: a simulation study to explore the potential of combining genomic and epidemiological metadata.

    PubMed

    Hill, A A; Crotta, M; Wall, B; Good, L; O'Brien, S J; Guitian, J

    2017-03-01

    Foodborne infection is a result of exposure to complex, dynamic food systems. The efficiency of foodborne infection is driven by ongoing shifts in genetic machinery. Next-generation sequencing technologies can provide high-fidelity data about the genetics of a pathogen. However, food safety surveillance systems do not currently provide similar high-fidelity epidemiological metadata to associate with genetic data. As a consequence, it is rarely possible to transform genetic data into actionable knowledge that can be used to genuinely inform risk assessment or prevent outbreaks. Big data approaches are touted as a revolution in decision support, and pose a potentially attractive method for closing the gap between the fidelity of genetic and epidemiological metadata for food safety surveillance. We therefore developed a simple food chain model to investigate the potential benefits of combining 'big' data sources, including both genetic and high-fidelity epidemiological metadata. Our results suggest that, as for any surveillance system, the collected data must be relevant and characterize the important dynamics of a system if we are to properly understand risk: this suggests the need to carefully consider data curation, rather than the more ambitious claims of big data proponents that unstructured and unrelated data sources can be combined to generate consistent insight. Of interest is that the biggest influencers of foodborne infection risk were contamination load and processing temperature, not genotype. This suggests that understanding food chain dynamics would probably more effectively generate insight into foodborne risk than prescribing the hazard in ever more detail in terms of genotype.

  8. Transmission of Infectious Diseases En Route to Habitat Hotspots

    PubMed Central

    Benavides, Julio; Walsh, Peter D.; Meyers, Lauren Ancel; Raymond, Michel; Caillaud, Damien

    2012-01-01

    Background The spread of infectious diseases in wildlife populations is influenced by patterns of between-host contacts. Habitat “hotspots” - places attracting large numbers of individuals or social groups - can significantly alter contact patterns and, hence, disease propagation. Research on the importance of habitat hotspots in wildlife epidemiology has primarily focused on how inter-individual contacts occurring at the hotspot itself increase disease transmission. However, in territorial animals, epidemiologically important contacts may primarily occur as animals cross through territories of conspecifics en route to habitat hotspots. So far, the phenomenon has received little attention. Here, we investigate the importance of these contacts in the case where infectious individuals keep visiting the hotspots and in the case where these individuals are no longer able to travel to the hotspot. Methodology and Principal Findings We developed a simulation epidemiological model to investigate both cases in a scenario in which transmission at the hotspot does not occur. We find that (i) hotspots still exacerbate epidemics, (ii) when infectious individuals do not travel to the hotspot, the most vulnerable individuals are those residing at intermediate distances from the hotspot rather than nearby, and (iii) the epidemiological vulnerability of a population is highest when the number of hotspots is intermediate. Conclusions and Significance By altering animal movements in their vicinity, habitat hotspots can thus strongly increase the spread of infectious diseases, even when disease transmission does not occur at the hotspot itself. Interestingly, when animals only visit the nearest hotspot, creating additional artificial hotspots, rather than reducing their number, may be an efficient disease control measure. PMID:22363606

  9. Towards an integrated food safety surveillance system: a simulation study to explore the potential of combining genomic and epidemiological metadata

    PubMed Central

    Crotta, M.; Wall, B.; Good, L.; O'Brien, S. J.; Guitian, J.

    2017-01-01

    Foodborne infection is a result of exposure to complex, dynamic food systems. The efficiency of foodborne infection is driven by ongoing shifts in genetic machinery. Next-generation sequencing technologies can provide high-fidelity data about the genetics of a pathogen. However, food safety surveillance systems do not currently provide similar high-fidelity epidemiological metadata to associate with genetic data. As a consequence, it is rarely possible to transform genetic data into actionable knowledge that can be used to genuinely inform risk assessment or prevent outbreaks. Big data approaches are touted as a revolution in decision support, and pose a potentially attractive method for closing the gap between the fidelity of genetic and epidemiological metadata for food safety surveillance. We therefore developed a simple food chain model to investigate the potential benefits of combining ‘big’ data sources, including both genetic and high-fidelity epidemiological metadata. Our results suggest that, as for any surveillance system, the collected data must be relevant and characterize the important dynamics of a system if we are to properly understand risk: this suggests the need to carefully consider data curation, rather than the more ambitious claims of big data proponents that unstructured and unrelated data sources can be combined to generate consistent insight. Of interest is that the biggest influencers of foodborne infection risk were contamination load and processing temperature, not genotype. This suggests that understanding food chain dynamics would probably more effectively generate insight into foodborne risk than prescribing the hazard in ever more detail in terms of genotype. PMID:28405360

  10. Role of Edges in Complex Network Epidemiology

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Jiang, Zhi-Hong; Wang, Hui; Xie, Fei; Chen, Chao

    2012-09-01

    In complex network epidemiology, diseases spread along contacting edges between individuals, and different edges may play different roles in epidemic outbreaks. Quantifying the efficiency of edges is an important step towards arresting epidemics. In this paper, we study the efficiency of edges in general susceptible-infected-recovered models, and introduce the transmission capability to measure the efficiency of edges. Results show that deleting edges with the highest transmission capability will greatly decrease epidemics on scale-free networks. Based on the message passing approach, we obtain an exact mathematical solution on configuration model networks with edge deletion in the large size limit.
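
    The edge-deletion experiment can be illustrated with a minimal discrete-time SIR model on a contact network. This sketch does not reproduce the paper's transmission capability measure (a proxy such as edge betweenness could stand in for it); it only shows the mechanics of spreading over edges and the effect of removing a bridging edge.

    ```python
    import random

    def sir_on_network(adj, beta, source=0, seed=0):
        """Discrete-time SIR on a contact network: each infectious node
        transmits along each edge to a susceptible neighbour with
        probability beta, then recovers. Returns the final outbreak size."""
        rng = random.Random(seed)
        infected, recovered = {source}, set()
        while infected:
            new = set()
            for u in infected:
                for v in adj.get(u, ()):
                    if v not in infected and v not in recovered and rng.random() < beta:
                        new.add(v)
            recovered |= infected
            infected = new
        return len(recovered)

    def delete_edge(adj, u, v):
        """Remove an undirected edge, e.g. one ranked highest by some
        edge-importance score (here the choice of score is left open)."""
        adj[u].discard(v)
        adj[v].discard(u)
    ```

    On a path graph with certain transmission (beta = 1), deleting the edge between nodes 2 and 3 cuts the outbreak from the whole network down to the source's component, which is the intuition behind targeting high-capability edges.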

  11. Lung cancer risk due to residential radon exposures: estimation and prevention.

    PubMed

    Truta, L A; Hofmann, W; Cosma, C

    2014-07-01

    Epidemiological studies proved that cumulative exposure to radon is the second leading cause of lung cancer, the world's most common cancer. The objectives of the present study are (i) to analyse lung cancer risk for chronic, low radon exposures based on the transformation frequency-tissue response (TF-TR) model formulated in terms of alpha particle hits in cell nuclei; (ii) to assess the percentage of attributable lung cancers in six areas of Transylvania where the radon concentration was measured and (iii) to point out the most efficient remediation measures tested on a pilot house in Stei, Romania. Simulations performed with the TF-TR model exhibit a linear dose-effect relationship for chronic, residential radon exposures. The fraction of lung cancer cases attributed to radon ranged from 9 to 28% for the investigated areas. Model predictions may represent a useful tool to complement epidemiological studies on lung cancer risk and to establish reasonable radiation protection regulations for human safety. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. Simulating Nationwide Pandemics: Applying the Multi-scale Epidemiologic Simulation and Analysis System to Human Infectious Diseases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dombroski, M; Melius, C; Edmunds, T

    2008-09-24

    This study uses the Multi-scale Epidemiologic Simulation and Analysis (MESA) system developed for foreign animal diseases to assess consequences of nationwide human infectious disease outbreaks. A literature review identified the state of the art in both small-scale regional models and large-scale nationwide models and characterized key aspects of a nationwide epidemiological model. The MESA system offers computational advantages over existing epidemiological models and enables a broader array of stochastic analyses of model runs to be conducted because of those computational advantages. However, it has only been demonstrated on foreign animal diseases. This paper applied the MESA modeling methodology to human epidemiology. The methodology divided 2000 US Census data at the census tract level into school-bound children, work-bound workers, elderly, and stay-at-home individuals. The model simulated mixing among these groups by incorporating schools, workplaces, households, and long-distance travel via airports. A baseline scenario with fixed input parameters was run for a nationwide influenza outbreak using relatively simple social distancing countermeasures. Analysis from the baseline scenario showed one of three possible results: (1) the outbreak burned itself out before it had a chance to spread regionally, (2) the outbreak spread regionally and lasted a relatively long time, although constrained geography enabled it to eventually be contained without affecting a disproportionately large number of people, or (3) the outbreak spread through air travel and lasted a long time with unconstrained geography, becoming a nationwide pandemic. These results are consistent with empirical influenza outbreak data. The results showed that simply scaling up a regional small-scale model is unlikely to account for all the complex variables and their interactions involved in a nationwide outbreak. There are several limitations of the methodology that should be explored in future work, including validating the model against reliable historical disease data; improving contact rates, spread methods, and disease parameters through discussions with epidemiological experts; and incorporating realistic behavioral assumptions.

  13. A generic model to simulate air-borne diseases as a function of crop architecture.

    PubMed

    Casadebaig, Pierre; Quesnel, Gauthier; Langlais, Michel; Faivre, Robert

    2012-01-01

    In a context of pesticide use reduction, alternatives to chemical-based crop protection strategies are needed to control diseases. Crop and plant architectures can be viewed as levers to control disease outbreaks by affecting microclimate within the canopy or pathogen transmission between plants. Modeling and simulation is a key approach to help analyze the behaviour of such systems where direct observations are difficult and tedious. Modeling permits the joining of concepts from ecophysiology and epidemiology to define structures and functions generic enough to describe a wide range of epidemiological dynamics. Additionally, the design should minimize computing time by both limiting model complexity and providing an efficient software implementation. In this paper, our aim was to present a model that suited these constraints so it could first be used as a research and teaching tool to promote discussions about epidemic management in cropping systems. The system was modelled as a combination of individual hosts (population of plants or organs) and infectious agents (pathogens) whose contacts are restricted through a network of connections. The system dynamics were described at an individual scale. Additional attention was given to the identification of generic properties of host-pathogen systems to widen the model's applicability domain. Two specific pathosystems with contrasted crop architectures were considered: ascochyta blight on pea (homogeneously layered canopy) and potato late blight (lattice of individualized plants). The model's behavior was assessed by simulation and sensitivity analysis, and the results were discussed with respect to the model's ability to discriminate between the defined types of epidemics. Crop traits related to disease avoidance, resulting in low exposure, slow dispersal, or de-synchronization of plant and pathogen cycles, were shown to strongly impact disease severity at the crop scale.

  14. Inference of Epidemiological Dynamics Based on Simulated Phylogenies Using Birth-Death and Coalescent Models

    PubMed Central

    Boskova, Veronika; Bonhoeffer, Sebastian; Stadler, Tanja

    2014-01-01

    Quantifying epidemiological dynamics is crucial for understanding and forecasting the spread of an epidemic. The coalescent and the birth-death model are used interchangeably to infer epidemiological parameters from the genealogical relationships of the pathogen population under study, which in turn are inferred from the pathogen genetic sequencing data. To compare the performance of these widely applied models, we performed a simulation study. We simulated phylogenetic trees under the constant rate birth-death model and the coalescent model with a deterministic exponentially growing infected population. For each tree, we re-estimated the epidemiological parameters using both a birth-death and a coalescent based method, implemented as an MCMC procedure in BEAST v2.0. In our analyses that estimate the growth rate of an epidemic based on simulated birth-death trees, the point estimates such as the maximum a posteriori/maximum likelihood estimates are not very different. However, the estimates of uncertainty are very different. The birth-death model had a higher coverage than the coalescent model, i.e. contained the true value in the highest posterior density (HPD) interval more often (2–13% vs. 31–75% error). The coverage of the coalescent decreases with decreasing basic reproductive ratio and increasing sampling probability of infecteds. We hypothesize that the biases in the coalescent are due to the assumption of deterministic rather than stochastic population size changes. Both methods performed reasonably well when analyzing trees simulated under the coalescent. The methods can also identify other key epidemiological parameters as long as one of the parameters is fixed to its true value. 
In summary, when using genetic data to estimate epidemic dynamics, our results suggest that the birth-death method will be less sensitive to population fluctuations of early outbreaks than the coalescent method that assumes a deterministic exponentially growing infected population. PMID:25375100
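
    The constant rate birth-death model used to simulate trees in this study has a simple forward-simulation analogue. The toy sketch below (parameter names are illustrative, and no tree topology is recorded) simulates lineage counts and checks the textbook expectation that the mean population size grows as exp((lambda - mu) * t), where lambda - mu is the epidemic growth rate being estimated.

    ```python
    import random

    def birth_death_size(lam, mu, t_max, seed):
        """Forward-simulate a constant-rate birth-death process starting
        from one lineage; return the number of extant lineages at t_max."""
        rng = random.Random(seed)
        t, n = 0.0, 1
        while n > 0:
            t += rng.expovariate((lam + mu) * n)  # time to next event
            if t >= t_max:
                break
            if rng.random() < lam / (lam + mu):
                n += 1   # birth (transmission)
            else:
                n -= 1   # death (recovery/removal)
        return n

    def mean_size(lam, mu, t_max, n_reps=2000):
        """Monte Carlo estimate of E[N(t)] = exp((lam - mu) * t)."""
        return sum(birth_death_size(lam, mu, t_max, s)
                   for s in range(n_reps)) / n_reps
    ```

    The large run-to-run variance visible in such simulations is exactly the stochasticity that the deterministic coalescent population model discussed above does not capture.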

  15. The Apollo Structured Vocabulary: an OWL2 ontology of phenomena in infectious disease epidemiology and population biology for use in epidemic simulation.

    PubMed

    Hogan, William R; Wagner, Michael M; Brochhausen, Mathias; Levander, John; Brown, Shawn T; Millett, Nicholas; DePasse, Jay; Hanna, Josh

    2016-08-18

    We developed the Apollo Structured Vocabulary (Apollo-SV), an OWL2 ontology of phenomena in infectious disease epidemiology and population biology, as part of a project whose goal is to increase the use of epidemic simulators in public health practice. Apollo-SV defines a terminology for use in simulator configuration. Apollo-SV is the product of an ontological analysis of the domain of infectious disease epidemiology, with particular attention to the inputs and outputs of nine simulators. Apollo-SV contains 802 classes for representing the inputs and outputs of simulators, of which approximately half are new and half are imported from existing ontologies. The most important Apollo-SV class for users of simulators is infectious disease scenario, which is a representation of an ecosystem at simulator time zero that has at least one infection process (a class) affecting at least one population (also a class). Other important classes represent ecosystem elements (e.g., households), ecosystem processes (e.g., infection acquisition and infectious disease), censuses of ecosystem elements (e.g., censuses of populations), and infectious disease control measures. In the larger project, which created an end-user application that can send the same infectious disease scenario to multiple simulators, Apollo-SV serves as the controlled terminology and strongly influences the design of the message syntax used to represent an infectious disease scenario. As we added simulators for different pathogens (e.g., malaria and dengue), the core classes of Apollo-SV have remained stable, suggesting that our conceptualization of the information required by simulators is sound. Despite adhering to the OBO Foundry principle of orthogonality, we could not reuse Infectious Disease Ontology classes as the basis for infectious disease scenarios. We thus defined new classes in Apollo-SV for host, pathogen, infection, infectious disease, colonization, and infection acquisition. 
Unlike IDO, our ontological analysis extended to existing mathematical models of key biological phenomena studied by infectious disease epidemiology and population biology. Our ontological analysis as expressed in Apollo-SV was instrumental in developing a simulator-independent representation of infectious disease scenarios that can be run on multiple epidemic simulators. Our experience suggests the importance of extending ontological analysis of a domain to include existing mathematical models of the phenomena studied by the domain. Apollo-SV is freely available at: http://purl.obolibrary.org/obo/apollo_sv.owl.

  16. Epidemiological links between tuberculosis cases identified twice as efficiently by whole genome sequencing than conventional molecular typing: A population-based study.

    PubMed

    Jajou, Rana; de Neeling, Albert; van Hunen, Rianne; de Vries, Gerard; Schimmel, Henrieke; Mulder, Arnout; Anthony, Richard; van der Hoek, Wim; van Soolingen, Dick

    2018-01-01

    Patients with Mycobacterium tuberculosis isolates sharing identical DNA fingerprint patterns can be epidemiologically linked. However, municipal health services in the Netherlands are able to confirm an epidemiological link in only around 23% of the patients with isolates clustered by the conventional variable number of tandem repeat (VNTR) genotyping. This research aims to investigate whether whole genome sequencing (WGS) is a more reliable predictor of epidemiological links between tuberculosis patients than VNTR genotyping. VNTR genotyping and WGS were performed in parallel on all Mycobacterium tuberculosis complex isolates received at the Netherlands National Institute for Public Health and the Environment in 2016. Isolates were clustered by VNTR when they shared identical 24-loci VNTR patterns; isolates were assigned to a WGS cluster when the pair-wise genetic distance was ≤ 12 single nucleotide polymorphisms (SNPs). Cluster investigation was performed by municipal health services on all isolates clustered by VNTR in 2016. The proportion of epidemiological links identified among patients clustered by either method was calculated. In total, 535 isolates were genotyped, of which 25% (134/535) were clustered by VNTR and 14% (76/535) by WGS; the concordance between both typing methods was 86%. The proportion of epidemiological links among WGS clustered cases (57%) was twice as high as among VNTR clustered cases (31%). When WGS was applied, the number of clustered isolates was halved, while all epidemiologically linked cases remained clustered. WGS is therefore a more reliable tool to predict epidemiological links between tuberculosis cases than VNTR genotyping and will allow more efficient transmission tracing, as epidemiological investigations based on false clustering can be avoided.
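
    The ≤ 12-SNP clustering rule described above amounts to single-linkage clustering on pairwise genetic distances. The sketch below illustrates the idea with a union-find structure on toy aligned sequences; real pipelines compute SNP distances from variant calls against a reference, which is omitted here.

    ```python
    def hamming(a, b):
        """Pairwise SNP distance between two equal-length aligned sequences."""
        return sum(x != y for x, y in zip(a, b))

    def snp_clusters(seqs, threshold=12):
        """Single-linkage clustering: isolates whose pairwise distance is
        <= threshold end up in the same cluster (union-find)."""
        parent = list(range(len(seqs)))

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]  # path compression
                i = parent[i]
            return i

        def union(i, j):
            parent[find(i)] = find(j)

        for i in range(len(seqs)):
            for j in range(i + 1, len(seqs)):
                if hamming(seqs[i], seqs[j]) <= threshold:
                    union(i, j)
        clusters = {}
        for i in range(len(seqs)):
            clusters.setdefault(find(i), []).append(i)
        return list(clusters.values())
    ```

    Tightening the threshold splits clusters apart, which mirrors the abstract's finding that WGS clustering retains epidemiologically linked cases while discarding spurious ones.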

  17. Reconstruction of organ dose for external radiotherapy patients in retrospective epidemiologic studies

    NASA Astrophysics Data System (ADS)

    Lee, Choonik; Jung, Jae Won; Pelletier, Christopher; Pyakuryal, Anil; Lamart, Stephanie; Kim, Jong Oh; Lee, Choonsik

    2015-03-01

    Organ dose estimation for retrospective epidemiological studies of late effects in radiotherapy patients involves two challenges: radiological images to represent patient anatomy are not usually available for patient cohorts who were treated years ago, and efficient dose reconstruction methods for large-scale patient cohorts are not well established. In the current study, we developed methods to reconstruct organ doses for radiotherapy patients by using a series of computational human phantoms coupled with a commercial treatment planning system (TPS) and a radiotherapy-dedicated Monte Carlo transport code, and performed illustrative dose calculations. First, we developed methods to convert the anatomy and organ contours of the pediatric and adult hybrid computational phantom series to Digital Imaging and Communications in Medicine (DICOM)-image and DICOM-structure files, respectively. The resulting DICOM files were imported to a commercial TPS for simulating radiotherapy and dose calculation for in-field organs. The conversion process was validated by comparing electron densities relative to water and organ volumes between the hybrid phantoms and the DICOM files imported in TPS, which showed agreement within 0.1 and 2%, respectively. Second, we developed a procedure to transfer DICOM-RT files generated from the TPS directly to a Monte Carlo transport code, x-ray Voxel Monte Carlo (XVMC), for more accurate dose calculations. Third, to illustrate the performance of the established methods, we simulated a whole brain treatment for the 10-year-old male phantom and a prostate treatment for the adult male phantom. Radiation doses to selected organs were calculated using the TPS and XVMC, and compared to each other. Organ average doses from the two methods matched within 7%, whereas maximum and minimum point doses differed up to 45%. 
The dosimetry methods and procedures established in this study will be useful for the reconstruction of organ dose to support retrospective epidemiological studies of late effects in radiotherapy patients.

  18. SU-F-J-174: A Series of Computational Human Phantoms in DICOM-RT Format for Normal Tissue Dose Reconstruction in Epidemiological Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pyakuryal, A; Moroz, B; Lee, C

    2016-06-15

    Purpose: Epidemiological studies of second cancer risk in radiotherapy patients often require individualized dose estimates of normal tissues. Prior to 3D conformal radiation therapy planning, patient anatomy information was mostly limited to 2D radiological images or not even available. Generic patient CT images are often used in commercial radiotherapy treatment planning systems (TPS) to reconstruct normal tissue doses. The objective of the current work was to develop a series of reference-size computational human phantoms in DICOM-RT format for direct use in dose reconstruction in TPS. Methods: Contours of 93 organs and tissues were extracted from a series of pediatric and adult hybrid computational human phantoms (newborn, 1-, 5-, 10-, 15-year-old, and adult males and females) using Rhinoceros software. A MATLAB script was created to convert the contours into the DICOM-RT structure format. The simulated CT images with a resolution of 1×1×3 mm³ were also generated from the binary phantom format and coupled with the DICOM-structure files. Accurate organ volumes were preserved through precise delineation of the contours in the converted format. Due to the complex geometry of the organs, a higher resolution (1×1×1 mm³) was found to be more efficient for converting the newborn and 1-year-old phantoms. Results: Contour sets were efficiently converted into DICOM-RT structures in a relatively short time (about 30 minutes per phantom). A good agreement was observed in the volumes between the original phantoms and the converted contours for large organs (NRMSD<1.0%) and small organs (NRMSD<7.7%). Conclusion: A comprehensive series of computational human phantoms in DICOM-RT format was created to support epidemiological studies of second cancer risks in radiotherapy patients. We confirmed the DICOM-RT phantoms were successfully imported into the TPS programs of major vendors.

  19. Information diffusion in structured online social networks

    NASA Astrophysics Data System (ADS)

    Li, Pei; Zhang, Yini; Qiao, Fengcai; Wang, Hui

    2015-05-01

    Owing to the word-of-mouth effect, online social networks are considered efficient channels for viral marketing, which makes it important to understand their diffusion dynamics. However, most research on diffusion dynamics in epidemiology and in traditional social networks cannot be applied directly to online social networks. In this paper, we propose models to characterize information diffusion in structured online social networks with a push-based forwarding mechanism. We introduce the term user influence to characterize the average number of times a message is browsed when a user of a given type generates it, and study the diffusion threshold, above which the user influence of generating a message approaches infinity. Simulation results agree closely with the theoretical analysis. These results help in understanding diffusion dynamics in online social networks and are also valuable to advertisers in viral marketing who want to estimate user influence before posting an advertisement.
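
    The geometric-series intuition behind such a diffusion threshold can be sketched in a few lines. This is a toy model, not the paper's actual derivation: it assumes every follower views a pushed message and forwards it with a fixed probability, so expected views diverge once followers × forward probability reaches 1.

```python
import math

def user_influence(followers, p_forward):
    """Toy push-based diffusion sketch: a posted message is pushed to all
    followers; each viewer forwards it with probability p_forward. Expected
    views per forwarding generation form a geometric series with ratio
    r = followers * p_forward, which diverges at the threshold r = 1.
    'followers' and 'p_forward' are assumed illustrative parameters,
    not quantities from the paper's model."""
    r = followers * p_forward
    if r >= 1.0:
        return math.inf  # above the diffusion threshold
    return followers / (1.0 - r)
```

    For example, 10 followers with a 5% forward rate give an expected 20 views per message, while a 20% forward rate puts this toy network above threshold.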

  20. Effect of Temperature on Growth and Sporulation of US-22, US-23, and US-24 Clonal Lineages of Phytophthora infestans and Implications for Late Blight Epidemiology.

    PubMed

    Seidl Johnson, Anna C; Frost, Kenneth E; Rouse, Douglas I; Gevens, Amanda J

    2015-04-01

    Epidemics of late blight, caused by Phytophthora infestans (Mont.) de Bary, have been studied by plant pathologists and regarded with great concern by potato and tomato growers since the Irish potato famine in the 1840s. P. infestans populations have continued to evolve, with unique clonal lineages arising which differ in pathogen fitness and pathogenicity, potentially impacting epidemiology. In 2012 and 2013, the US-23 clonal lineage predominated in late blight epidemics in most U.S. potato and tomato production regions, including Wisconsin. This lineage was unknown prior to 2009. For isolates of three recently identified clonal lineages of P. infestans (US-22, US-23, and US-24), sporulation rates were experimentally determined on potato and tomato foliage and the effect of temperature on lesion growth rate on tomato was investigated. The US-22 and US-23 isolates had greater lesion growth rates on tomato than US-24 isolates. Sporulation rates for all isolates were greater on potato than tomato, and the US-23 isolates had greater sporulation rates on both tomato and potato than the US-22 and US-24 isolates. Experimentally determined correlates of fitness were input to the LATEBLIGHT model and epidemics were simulated using archived Wisconsin weather data from four growing seasons (2009 to 2012) to investigate the effect of isolates of these new lineages on late blight epidemiology. The fast lesion growth rates of US-22 and US-23 isolates resulted in severe epidemics in all years tested, particularly in 2011. The greater sporulation rates of P. infestans on potato resulted in simulated epidemics that progressed faster than epidemics simulated for tomato; the high sporulation rates of US-23 isolates resulted in simulated epidemics more severe than those of isolates of the US-22, US-24, and EC-1 clonal lineages on potato and tomato. 
Additionally, US-23 isolates consistently caused severe simulated epidemics when lesion growth rate and sporulation were input into the model singly or together. Sporangial size of the US-23 isolates was significantly smaller than that of US-22 and US-24 isolates, which may result in more efficient release of sporangia from the tomato or potato canopy. Our experimentally determined correlates of fitness and the simulated epidemics resulting from their incorporation into the LATEBLIGHT model suggest that US-23 isolates of P. infestans may have the greatest fitness among currently prevalent lineages and may be the most likely lineage to persist in the P. infestans population. The US-23 clonal lineage has been documented as the most prevalent lineage in recent years, indicating its overall fitness. In our work, US-23 had the highest epidemic potential among current genotypes. Given that epidemic potential is a component of fitness, this may, in part, explain the current predominance of the US-23 lineage.

  1. High-performance biocomputing for simulating the spread of contagion over large contact networks

    PubMed Central

    2012-01-01

    Background Many important biological problems can be modeled as contagion diffusion processes over interaction networks. This article shows how the EpiSimdemics interaction-based simulation system can be applied to the general contagion diffusion problem. Two specific problems, computational epidemiology and human immune system modeling, are given as examples. We then show how the graphics processing unit (GPU) within each compute node of a cluster can effectively be used to speed-up the execution of these types of problems. Results We show that a single GPU can accelerate the EpiSimdemics computation kernel by a factor of 6 and the entire application by a factor of 3.3, compared to the execution time on a single core. When 8 CPU cores and 2 GPU devices are utilized, the speed-up of the computational kernel increases to 9.5. When combined with effective techniques for inter-node communication, excellent scalability can be achieved without significant loss of accuracy in the results. Conclusions We show that interaction-based simulation systems can be used to model disparate and highly relevant problems in biology. We also show that offloading some of the work to GPUs in distributed interaction-based simulations can be an effective way to achieve increased intra-node efficiency. PMID:22537298

  2. Associations of Perfluoroalkyl Substances (PFAS) with Lower Birth Weight: An Evaluation of Potential Confounding by Glomerular Filtration Rate Using a Physiologically Based Pharmacokinetic Model (PBPK).

    PubMed

    Verner, Marc-André; Loccisano, Anne E; Morken, Nils-Halvdan; Yoon, Miyoung; Wu, Huali; McDougall, Robin; Maisonet, Mildred; Marcus, Michele; Kishi, Reiko; Miyashita, Chihiro; Chen, Mei-Huei; Hsieh, Wu-Shiun; Andersen, Melvin E; Clewell, Harvey J; Longnecker, Matthew P

    2015-12-01

    Prenatal exposure to perfluoroalkyl substances (PFAS) has been associated with lower birth weight in epidemiologic studies. This association could be attributable to glomerular filtration rate (GFR), which is related to PFAS concentration and birth weight. We used a physiologically based pharmacokinetic (PBPK) model of pregnancy to assess how much of the PFAS-birth weight association observed in epidemiologic studies might be attributable to GFR. We modified a PBPK model to reflect the association of GFR with birth weight (estimated from three studies of GFR and birth weight) and used it to simulate PFAS concentrations in maternal and cord plasma. The model was run 250,000 times, with variation in parameters, to simulate a population. Simulated data were analyzed to evaluate the association between PFAS levels and birth weight due to GFR. We compared simulated estimates with those from a meta-analysis of epidemiologic data. The reduction in birth weight for each 1-ng/mL increase in simulated cord plasma for perfluorooctane sulfonate (PFOS) was 2.72 g (95% CI: -3.40, -2.04), and for perfluorooctanoic acid (PFOA) was 7.13 g (95% CI: -8.46, -5.80); results based on maternal plasma at term were similar. Results were sensitive to variations in PFAS level distributions and the strength of the GFR-birth weight association. In comparison, our meta-analysis of epidemiologic studies suggested that each 1-ng/mL increase in prenatal PFOS and PFOA levels was associated with 5.00 g (95% CI: -8.92, -1.09) and 14.72 g (95% CI: -21.66, -7.78) reductions in birth weight, respectively. Results of our simulations suggest that a substantial proportion of the association between prenatal PFAS and birth weight may be attributable to confounding by GFR and that confounding by GFR may be more important in studies with sample collection later in pregnancy.
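
    The confounding mechanism the authors probe can be illustrated with a far simpler Monte Carlo sketch than their PBPK model. In the toy simulation below (all parameter values are illustrative assumptions, not the published model), GFR both lowers PFAS concentration and raises birth weight; regressing birth weight on PFAS then yields a negative slope even though PFAS has no causal effect in the simulation.

```python
import random

def simulate_confounding(n=50000, seed=1):
    """Toy confounding sketch (illustrative values, not the published PBPK
    model): higher GFR both lowers maternal PFAS concentration (more renal
    clearance) and raises birth weight."""
    rng = random.Random(seed)
    pfas, bw = [], []
    for _ in range(n):
        gfr = rng.gauss(0.0, 1.0)                           # standardized GFR
        pfas.append(10.0 - 2.0 * gfr + rng.gauss(0.0, 1.0))      # ng/mL
        bw.append(3400.0 + 100.0 * gfr + rng.gauss(0.0, 300.0))  # grams
    return pfas, bw

def ols_slope(x, y):
    """Simple least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

pfas, bw = simulate_confounding()
slope = ols_slope(pfas, bw)  # negative purely through the shared GFR pathway
```

    The slope is negative by construction of the shared cause alone, which is the qualitative point of the authors' much richer analysis.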

  3. An animated depiction of major depression epidemiology.

    PubMed

    Patten, Scott B

    2007-06-08

    Epidemiologic estimates are now available for a variety of parameters related to major depression epidemiology (incidence, prevalence, etc.). These estimates are potentially useful for policy and planning purposes, but it is first necessary that they be synthesized into a coherent picture of the epidemiology of the condition. Several attempts to do so have been made using mathematical modeling procedures. However, this information is not easy to communicate to users of epidemiological data (clinicians, administrators, policy makers). In this study, up-to-date data on major depression epidemiology were integrated using a discrete event simulation model. The mathematical model was animated in Virtual Reality Modeling Language (VRML) to create a visual, rather than mathematical, depiction of the epidemiology. Consistent with existing literature, the model highlights potential advantages of population health strategies that emphasize access to effective long-term treatment. The paper contains a web-link to the animation. Visual animation of epidemiological results may be an effective knowledge translation tool. In clinical practice, such animations could potentially assist with patient education and enhanced long-term compliance.
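
    The kind of model being animated can be conveyed with a minimal two-state sketch. The version below uses annual discrete-time cycles rather than a true discrete event simulation, and its onset and remission probabilities are illustrative assumptions, not Patten's published estimates.

```python
import random

def simulate_cohort(n=10000, years=40, p_onset=0.02, p_remit=0.4, seed=7):
    """Two-state illness sketch with annual cycles: 'well' persons develop
    an episode of major depression with probability p_onset per year;
    'depressed' persons remit with probability p_remit. Returns point
    prevalence by year. Parameters are illustrative assumptions."""
    rng = random.Random(seed)
    depressed = [False] * n
    prevalence = []
    for _ in range(years):
        for i in range(n):
            if depressed[i]:
                if rng.random() < p_remit:
                    depressed[i] = False
            elif rng.random() < p_onset:
                depressed[i] = True
        prevalence.append(sum(depressed) / n)
    return prevalence

prev = simulate_cohort()
```

    Prevalence settles near p_onset / (p_onset + p_remit), which makes visible why improving access to effective long-term treatment (raising the remission side) lowers population prevalence.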

  4. [Systems epidemiology].

    PubMed

    Huang, T; Li, L M

    2018-05-10

    The era of medical big data, translational medicine and precision medicine brings new opportunities for studying the etiology of chronic complex diseases. How to implement evidence-based medicine, translational medicine and precision medicine is the challenge we are facing. Systems epidemiology, a new field of epidemiology, combines medical big data with systems biology, building statistical models of disease risk and simulating and predicting future risk using data at the molecular, cellular, population, social and ecological levels. Because of the diversity and complexity of big data sources, the study designs and analytic methods of systems epidemiology face new challenges and opportunities. This paper summarizes the theoretical basis, concepts, objectives, significance, research designs and analytic methods of systems epidemiology and its application in the field of public health.

  5. Toward Endemic Deployment of Educational Simulation Games: A Review of Progress and Future Recommendations

    ERIC Educational Resources Information Center

    Moizer, Jonathan; Lean, Jonathan

    2010-01-01

    This article presents a conceptual analysis of simulation game adoption and use across university faculty. The metaphor of epidemiology is used to characterize the diffusion of simulation games for teaching and learning. A simple stock-flow diagram is presented to illustrate this dynamic. Future scenarios for simulation game adoption are…

  6. [Congenital Chagas' disease: epidemiology, laboratory diagnosis, prognosis and treatment].

    PubMed

    Reiche, E M; Inouye, M M; Bonametti, A M; Jankevicius, J V

    1996-01-01

    The authors review studies on the epidemiology, clinical aspects, and laboratory diagnostic methods of congenital Chagas' disease, emphasizing their limitations in specificity and sensitivity, and suggest alternative methods to improve the accuracy and quality of laboratory diagnosis, which is essential for effective treatment.

  7. Inferring epidemiological parameters from phylogenetic information for the HIV-1 epidemic among MSM

    NASA Astrophysics Data System (ADS)

    Quax, Rick; van de Vijver, David A. M. C.; Frentz, Dineke; Sloot, Peter M. A.

    2013-09-01

    The HIV-1 epidemic in Europe is primarily sustained by a dynamic topology of sexual interactions among MSM who have individual immune systems and behavior. This epidemiological process shapes the phylogeny of the virus population. Both fields of epidemic modeling and phylogenetics have a long history; however, it remains difficult to use phylogenetic data to infer epidemiological parameters such as the structure of the sexual network and the per-act infectiousness. This is because phylogenetic data is necessarily incomplete and ambiguous. Here we show that the cluster-size distribution indeed contains information about epidemiological parameters, using detailed numerical experiments. We simulate the HIV epidemic among MSM many times using the Monte Carlo method with all parameter values and their ranges taken from the literature. For each simulation and the corresponding set of parameter values we calculate the likelihood of reproducing an observed cluster-size distribution. The result is an estimated likelihood distribution of all parameters from the phylogenetic data, in particular the structure of the sexual network, the per-act infectiousness, and the risk behavior reduction upon diagnosis. These likelihood distributions encode the knowledge provided by the observed cluster-size distribution, which we quantify using information theory. Our work suggests that the growing body of genetic data of patients can be exploited to understand the underlying epidemiological process.
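
    The core idea, scoring candidate parameter values by a Monte Carlo likelihood of an observed cluster-size distribution, can be sketched with a toy branching process. The geometric offspring model and the observed sizes below are illustrative assumptions, not the authors' HIV transmission model.

```python
import math
import random
from collections import Counter

def cluster_size(r, rng, cap=50):
    """Final size of a subcritical branching process (geometric offspring
    with mean r < 1) as a stand-in for a transmission cluster; sizes are
    capped for tractability."""
    size = frontier = 1
    while frontier and size < cap:
        offspring = 0
        for _ in range(frontier):
            p = 1.0 / (1.0 + r)          # geometric with mean r
            while rng.random() > p:
                offspring += 1
        size += offspring
        frontier = offspring
    return min(size, cap)

def log_likelihood(observed, r, n_sim=5000, seed=0):
    """Monte Carlo log-likelihood of observed cluster sizes under parameter
    r, using the empirical distribution of simulated sizes (add-one
    smoothed to avoid log(0))."""
    rng = random.Random(seed)
    counts = Counter(cluster_size(r, rng) for _ in range(n_sim))
    total = sum(counts.values()) + 1
    return sum(math.log((counts.get(s, 0) + 1) / total) for s in observed)

observed = [1, 1, 2, 1, 3, 1, 2, 5, 1, 1]        # hypothetical cluster sizes
best_r = max([0.2, 0.5, 0.8], key=lambda r: log_likelihood(observed, r))
```

    Repeating this over a grid (or prior) of parameter values yields the kind of likelihood distribution the authors extract from phylogenetic cluster data.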

  8. eCOMPAGT – efficient Combination and Management of Phenotypes and Genotypes for Genetic Epidemiology

    PubMed Central

    Schönherr, Sebastian; Weißensteiner, Hansi; Coassin, Stefan; Specht, Günther; Kronenberg, Florian; Brandstätter, Anita

    2009-01-01

    Background High-throughput genotyping and phenotyping projects of large epidemiological study populations require sophisticated laboratory information management systems. Most epidemiological studies include subject-related personal information, which needs to be handled with care by following data privacy protection guidelines. In addition, genotyping core facilities handling cooperative projects require a straightforward solution to monitor the status and financial resources of the different projects. Description We developed a database system for an efficient combination and management of phenotypes and genotypes (eCOMPAGT) deriving from genetic epidemiological studies. eCOMPAGT securely stores and manages genotype and phenotype data and enables different user modes with different rights. Special attention was paid to the import of data derived from TaqMan and SNPlex genotyping assays. However, the database solution is adjustable to other genotyping systems by programming additional interfaces. Further important features are the scalability of the database and an export interface to statistical software. Conclusion eCOMPAGT can store, administer and connect phenotype data with all kinds of genotype data and is available as a downloadable version at . PMID:19432954

  9. The Schisto Track: A System for Gathering and Monitoring Epidemiological Surveys by Connecting Geographical Information Systems in Real Time

    PubMed Central

    2014-01-01

    Background Using the Android platform as a notification instrument for diseases and disorders forms a new alternative for computerization of epidemiological studies. Objective The objective of our study was to construct a tool for gathering epidemiological data on schistosomiasis using the Android platform. Methods The developed application (app), named the Schisto Track, is a tool for data capture and analysis that was designed to meet the needs of a traditional epidemiological survey. An initial version of the app was finished and tested in both real situations and simulations for epidemiological surveys. Results The app proved to be a tool capable of automation of activities, with data organization and standardization, easy data recovery (to enable interfacing with other systems), and totally modular architecture. Conclusions The proposed Schisto Track is in line with worldwide trends toward use of smartphones with the Android platform for modeling epidemiological scenarios. PMID:25099881

  10. [Mathematical models and epidemiological analysis].

    PubMed

    Gerasimov, A N

    2010-01-01

    The limited use of mathematical simulation in epidemiology is due not only to the difficulty of monitoring the epidemic process and identifying its parameters but also to the application of oversimplified models. It is shown that realistic reproduction of actual morbidity dynamics requires taking into account heterogeneity and finiteness of the population and seasonal character of pathogen transmission mechanism.
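
    A minimal stochastic SIR sketch shows how finiteness of the population and a seasonal transmission mechanism enter such a model. All parameter values below are illustrative assumptions, not estimates from the paper.

```python
import math
import random

def seasonal_sir(n=1000, i0=5, beta0=0.3, amplitude=0.4, gamma=0.1,
                 days=365, seed=42):
    """Stochastic SIR in a finite population of size n with seasonally
    varying transmission, beta(t) = beta0*(1 + amplitude*cos(2*pi*t/365)),
    simulated in daily time steps. Returns the (S, I, R) trajectory."""
    rng = random.Random(seed)
    s, i, r = n - i0, i0, 0
    history = [(s, i, r)]
    for t in range(days):
        beta = beta0 * (1.0 + amplitude * math.cos(2.0 * math.pi * t / 365.0))
        p_inf = 1.0 - math.exp(-beta * i / n)   # per-susceptible daily risk
        new_inf = sum(rng.random() < p_inf for _ in range(s))
        new_rec = sum(rng.random() < gamma for _ in range(i))
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        history.append((s, i, r))
    return history

hist = seasonal_sir()
```

    Because each individual is simulated, the model exhibits the demographic stochasticity and susceptible depletion that deterministic, homogeneous models miss.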

  11. An Authentic RFLP Lab for High School or College Biology Students.

    ERIC Educational Resources Information Center

    Guilfoile, Patrick G.; Plum, Stephen

    1998-01-01

    Explains how students can perform an alternative authentic DNA fingerprinting analysis. Presents restriction fragment length polymorphism (RFLP) analysis and can serve as a simulated molecular epidemiology laboratory or as a simulated forensic laboratory exercise. (DDR)

  12. Computer simulation of white pine blister rust epidemics

    Treesearch

    Geral I. McDonald; Raymond J. Hoff; William R. Wykoff

    1981-01-01

    A simulation of white pine blister rust is described in both word and mathematical models. The objective of this first generation simulation was to organize and analyze the available epidemiological knowledge to produce a foundation for integrated management of this destructive rust of 5-needle pines. Verification procedures and additional research needs are also...

  13. Computerized Adaptive Testing Provides Reliable and Efficient Depression Measurement Using the CES-D Scale

    PubMed Central

    2017-01-01

    Background The Center for Epidemiologic Studies Depression Scale (CES-D) is a measure of depressive symptomatology which is widely used internationally. Though previous attempts were made to shorten the CES-D scale, few have attempted to develop a Computerized Adaptive Test (CAT) version for the CES-D. Objective The aim of this study was to provide evidence on the efficiency and accuracy of the CES-D when administered as a CAT, using a US sample. Methods We obtained a sample of 2060 responses to the CES-D from US participants using the myPersonality application. The average age of participants was 26 years (range 19-77). We randomly split the sample into two groups to evaluate and validate the psychometric models. We used evaluation group data (n=1018) to assess dimensionality with both confirmatory factor analysis (CFA) and Mokken analysis. We conducted further psychometric assessments using item response theory (IRT), including assessments of item and scale fit to Samejima's graded response model (GRM), local dependency and differential item functioning. We subsequently conducted two CAT simulations to evaluate the CES-D CAT using the validation group (n=1042). Results Initial CFA results indicated a poor fit to the model and Mokken analysis revealed 3 items which did not conform to the same dimension as the rest of the items. We removed the 3 items and fit the remaining 17 items to the GRM. We found no evidence of differential item functioning (DIF) between age and gender groups. CES-D trait estimates from the simulated CAT algorithm were highly correlated with trait scores derived from the original full-length scale. The second CAT simulation conducted using real participant data demonstrated higher precision at the higher levels of the depression spectrum. Conclusions Depression assessments using the CES-D CAT can be more accurate and efficient than those made using the fixed-length assessment. PMID:28931496
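
    The item-selection step at the heart of a CAT can be sketched with a two-parameter logistic (2PL) information function. This is a simplified stand-in: the study itself fit Samejima's graded response model, and the item bank below is hypothetical.

```python
import math

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta, where a is the
    discrimination and b the difficulty. Information peaks where the item
    difficulty matches the respondent's ability."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

def next_item(theta, items, administered):
    """One CAT step: administer the not-yet-used item with maximum
    information at the current ability estimate."""
    candidates = [i for i in range(len(items)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, *items[i]))

# hypothetical item bank: (discrimination a, difficulty b)
items = [(1.0, -2.0), (1.0, 0.0), (1.0, 2.0)]
```

    Iterating selection, response, and ability re-estimation is what lets a CAT reach fixed-length accuracy with fewer items.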

  14. A novel chi-square statistic for detecting group differences between pathways in systems epidemiology.

    PubMed

    Yuan, Zhongshang; Ji, Jiadong; Zhang, Tao; Liu, Yi; Zhang, Xiaoshuai; Chen, Wei; Xue, Fuzhong

    2016-12-20

    Traditional epidemiology often pays more attention to the identification of a single factor rather than to the pathway that is related to a disease, and therefore, it is difficult to explore the disease mechanism. Systems epidemiology aims to integrate putative lifestyle exposures and biomarkers extracted from multiple omics platforms to offer new insights into the pathway mechanisms that underlie disease at the human population level. One key but inadequately addressed question is how to develop powerful statistics to identify whether one candidate pathway is associated with a disease. Bearing in mind that a pathway difference can result from not only changes in the nodes but also changes in the edges, we propose a novel statistic for detecting group differences between pathways which, in principle, captures both node changes and edge changes while simultaneously accounting for the pathway structure. The proposed test has been proven to follow the chi-square distribution, and various simulations have shown that it performs better than existing methods. Integrating genome-wide DNA methylation data, we analyzed one real data set from the Bogalusa cohort study and significantly identified a potential pathway, Smoking → SOCS3 → PIK3R1, which was strongly associated with abdominal obesity. The proposed test was powerful and efficient at identifying pathway differences between two groups, and it can be extended to other disciplines that involve statistical comparisons between pathways. The source code in R is available on our website. Copyright © 2016 John Wiley & Sons, Ltd.
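
    The node-versus-edge decomposition can be illustrated with a toy score that sums squared group differences in node means and in pairwise correlations. This sketch is not the authors' chi-square statistic, and the two small groups below are hypothetical data.

```python
import math

def column(data, j):
    return [row[j] for row in data]

def mean(v):
    return sum(v) / len(v)

def corr(x, y):
    """Pearson correlation of two equal-length samples."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def pathway_difference(g1, g2):
    """Toy pathway-difference score: squared group differences in node
    means ('node changes') plus squared differences in pairwise
    correlations ('edge changes'). Illustrative only."""
    p = len(g1[0])
    node = sum((mean(column(g1, j)) - mean(column(g2, j))) ** 2
               for j in range(p))
    edge = sum((corr(column(g1, j), column(g1, k)) -
                corr(column(g2, j), column(g2, k))) ** 2
               for j in range(p) for k in range(j + 1, p))
    return node + edge

# two hypothetical groups measured on a 2-node pathway
g1 = [(1.0, 2.0), (2.0, 3.0), (3.0, 5.0)]
g2 = [(2.0, 2.0), (3.0, 3.0), (4.0, 5.0)]  # node 0 shifted, edges unchanged
```

    A pure mean shift registers only in the node term, while a rewired dependence between nodes registers only in the edge term, which is why a test sensitive to both is needed.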

  15. GapMap: Enabling Comprehensive Autism Resource Epidemiology

    PubMed Central

    Albert, Nikhila; Schwartz, Jessey; Du, Michael

    2017-01-01

    Background For individuals with autism spectrum disorder (ASD), finding resources can be a lengthy and difficult process. The difficulty in obtaining global, fine-grained autism epidemiological data hinders researchers from quickly and efficiently studying large-scale correlations among ASD, environmental factors, and geographical and cultural factors. Objective The objective of this study was to define resource load and resource availability for families affected by autism and subsequently create a platform to enable a more accurate representation of prevalence rates and resource epidemiology. Methods We created a mobile application, GapMap, to collect locational, diagnostic, and resource use information from individuals with autism to compute accurate prevalence rates and better understand autism resource epidemiology. GapMap is hosted on AWS S3, running on a React and Redux front-end framework. The backend framework comprises an AWS API Gateway and Lambda Function setup, with secure and scalable end points for retrieving prevalence and resource data, and for submitting participant data. Measures of autism resource scarcity, including resource load, resource availability, and resource gaps, were defined and preliminarily computed using simulated or scraped data. Results The average distance from an individual in the United States to the nearest diagnostic center is approximately 182 km (113 miles), with a standard deviation of 235 km (146 miles). The average distance from an individual with ASD to the nearest diagnostic center, however, is only 32 km (20 miles), suggesting that individuals who live closer to diagnostic services are more likely to be diagnosed. 
Conclusions This study confirmed that individuals closer to diagnostic services are more likely to be diagnosed and proposes GapMap, a means to measure and enable the alleviation of increasingly overburdened diagnostic centers and resource-poor areas where parents are unable to diagnose their children as quickly and easily as needed. GapMap will collect information that will provide more accurate data for computing resource loads and availability, uncovering the impact of resource epidemiology on age and likelihood of diagnosis, and gathering localized autism prevalence rates. PMID:28473303

  16. GapMap: Enabling Comprehensive Autism Resource Epidemiology.

    PubMed

    Albert, Nikhila; Daniels, Jena; Schwartz, Jessey; Du, Michael; Wall, Dennis P

    2017-05-04

    For individuals with autism spectrum disorder (ASD), finding resources can be a lengthy and difficult process. The difficulty in obtaining global, fine-grained autism epidemiological data hinders researchers from quickly and efficiently studying large-scale correlations among ASD, environmental factors, and geographical and cultural factors. The objective of this study was to define resource load and resource availability for families affected by autism and subsequently create a platform to enable a more accurate representation of prevalence rates and resource epidemiology. We created a mobile application, GapMap, to collect locational, diagnostic, and resource use information from individuals with autism to compute accurate prevalence rates and better understand autism resource epidemiology. GapMap is hosted on AWS S3, running on a React and Redux front-end framework. The backend framework comprises an AWS API Gateway and Lambda Function setup, with secure and scalable end points for retrieving prevalence and resource data, and for submitting participant data. Measures of autism resource scarcity, including resource load, resource availability, and resource gaps, were defined and preliminarily computed using simulated or scraped data. The average distance from an individual in the United States to the nearest diagnostic center is approximately 182 km (113 miles), with a standard deviation of 235 km (146 miles). The average distance from an individual with ASD to the nearest diagnostic center, however, is only 32 km (20 miles), suggesting that individuals who live closer to diagnostic services are more likely to be diagnosed. This study confirmed that individuals closer to diagnostic services are more likely to be diagnosed and proposes GapMap, a means to measure and enable the alleviation of increasingly overburdened diagnostic centers and resource-poor areas where parents are unable to diagnose their children as quickly and easily as needed. 
GapMap will collect information that will provide more accurate data for computing resource loads and availability, uncovering the impact of resource epidemiology on age and likelihood of diagnosis, and gathering localized autism prevalence rates. ©Nikhila Albert, Jena Daniels, Jessey Schwartz, Michael Du, Dennis P Wall. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 04.05.2017.

  17. A general framework for the regression analysis of pooled biomarker assessments.

    PubMed

    Liu, Yan; McMahan, Christopher; Gallagher, Colin

    2017-07-10

    As a cost-efficient data collection mechanism, the process of assaying pooled biospecimens is becoming increasingly common in epidemiological research; for example, pooling has been proposed for the purpose of evaluating the diagnostic efficacy of biological markers (biomarkers). To this end, several authors have proposed techniques that allow for the analysis of continuous pooled biomarker assessments. Regretfully, most of these techniques proceed under restrictive assumptions, are unable to account for the effects of measurement error, and fail to control for confounding variables. These limitations are understandably attributable to the complex structure that is inherent to measurements taken on pooled specimens. Consequently, in order to provide practitioners with the tools necessary to accurately and efficiently analyze pooled biomarker assessments, herein, a general Monte Carlo maximum likelihood-based procedure is presented. The proposed approach allows for the regression analysis of pooled data under practically all parametric models and can be used to directly account for the effects of measurement error. Through simulation, it is shown that the proposed approach can accurately and efficiently estimate all unknown parameters and is more computationally efficient than existing techniques. This new methodology is further illustrated using monocyte chemotactic protein-1 data collected by the Collaborative Perinatal Project in an effort to assess the relationship between this chemokine and the risk of miscarriage. Copyright © 2017 John Wiley & Sons, Ltd.
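
    The measurement structure that makes pooled data tricky can be sketched in a few lines: physical pooling averages the true specimen concentrations before a single assay, with its own error, is run on each pool, so individual-level values are never observed. This is a toy sketch of the data-generating step, not the authors' Monte Carlo maximum likelihood procedure.

```python
import random

def pool_assays(specimens, pool_size, sigma_e, seed=0):
    """Pooled-assay sketch: group consecutive specimens into pools of size
    pool_size, average the true concentrations within each pool, and add
    one measurement error (SD sigma_e) per pool. Returns one noisy
    measurement per pool."""
    rng = random.Random(seed)
    pools = [specimens[i:i + pool_size]
             for i in range(0, len(specimens), pool_size)]
    return [sum(p) / len(p) + rng.gauss(0.0, sigma_e) for p in pools]
```

    With g pools of size k, only g assays are run instead of g*k, which is the cost saving; the statistical price is that covariate effects and measurement error must be disentangled from pool averages.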

  18. Efficient Semiparametric Inference Under Two-Phase Sampling, With Applications to Genetic Association Studies.

    PubMed

    Tao, Ran; Zeng, Donglin; Lin, Dan-Yu

    2017-01-01

    In modern epidemiological and clinical studies, the covariates of interest may involve genome sequencing, biomarker assay, or medical imaging and thus are prohibitively expensive to measure on a large number of subjects. A cost-effective solution is the two-phase design, under which the outcome and inexpensive covariates are observed for all subjects during the first phase and that information is used to select subjects for measurements of expensive covariates during the second phase. For example, subjects with extreme values of quantitative traits were selected for whole-exome sequencing in the National Heart, Lung, and Blood Institute (NHLBI) Exome Sequencing Project (ESP). Herein, we consider general two-phase designs, where the outcome can be continuous or discrete, and inexpensive covariates can be continuous and correlated with expensive covariates. We propose a semiparametric approach to regression analysis by approximating the conditional density functions of expensive covariates given inexpensive covariates with B-spline sieves. We devise a computationally efficient and numerically stable EM-algorithm to maximize the sieve likelihood. In addition, we establish the consistency, asymptotic normality, and asymptotic efficiency of the estimators. Furthermore, we demonstrate the superiority of the proposed methods over existing ones through extensive simulation studies. Finally, we present applications to the aforementioned NHLBI ESP.
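
    The phase-two selection rule used in extreme-trait designs such as the NHLBI ESP can be sketched in a few lines; the subject IDs and trait values below are hypothetical.

```python
def select_phase_two(subjects, trait, k):
    """Extreme-trait phase-two selection sketch: rank subjects by an
    inexpensive quantitative trait (observed on everyone in phase one) and
    pick the k lowest and k highest for expensive measurement, such as
    whole-exome sequencing."""
    ranked = sorted(subjects, key=trait)
    return ranked[:k] + ranked[-k:]

# hypothetical cohort of (id, trait value) pairs
cohort = [("s%d" % i, v)
          for i, v in enumerate([5.1, 9.8, 0.4, 7.7, 3.3, 8.9])]
chosen = select_phase_two(cohort, trait=lambda s: s[1], k=2)
```

    Because selection depends on the phase-one outcome, a naive analysis of the phase-two subsample is biased, which is what the sieve-likelihood EM machinery in the paper is designed to correct.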

  19. Simultaneous inference of phylogenetic and transmission trees in infectious disease outbreaks

    PubMed Central

    2017-01-01

    Whole-genome sequencing of pathogens from host samples becomes more and more routine during infectious disease outbreaks. These data provide information on possible transmission events which can be used for further epidemiologic analyses, such as identification of risk factors for infectivity and transmission. However, the relationship between transmission events and sequence data is obscured by uncertainty arising from four largely unobserved processes: transmission, case observation, within-host pathogen dynamics and mutation. To properly resolve transmission events, these processes need to be taken into account. Recent years have seen much progress in theory and method development, but existing applications make simplifying assumptions that often break up the dependency between the four processes, or are tailored to specific datasets with matching model assumptions and code. To obtain a method with wider applicability, we have developed a novel approach to reconstruct transmission trees with sequence data. Our approach combines elementary models for transmission, case observation, within-host pathogen dynamics, and mutation, under the assumption that the outbreak is over and all cases have been observed. We use Bayesian inference with MCMC for which we have designed novel proposal steps to efficiently traverse the posterior distribution, taking account of all unobserved processes at once. This allows for efficient sampling of transmission trees from the posterior distribution, and robust estimation of consensus transmission trees. We implemented the proposed method in a new R package phybreak. The method performs well in tests of both new and published simulated data. We apply the model to five datasets on densely sampled infectious disease outbreaks, covering a wide range of epidemiological settings. 
Using only sampling times and sequences as data, our analyses confirmed the original results or improved on them: the more realistic infection times place more confidence in the inferred transmission trees. PMID:28545083

  20. Simultaneous inference of phylogenetic and transmission trees in infectious disease outbreaks.

    PubMed

    Klinkenberg, Don; Backer, Jantien A; Didelot, Xavier; Colijn, Caroline; Wallinga, Jacco

    2017-05-01

    Whole-genome sequencing of pathogens from host samples becomes more and more routine during infectious disease outbreaks. These data provide information on possible transmission events which can be used for further epidemiologic analyses, such as identification of risk factors for infectivity and transmission. However, the relationship between transmission events and sequence data is obscured by uncertainty arising from four largely unobserved processes: transmission, case observation, within-host pathogen dynamics and mutation. To properly resolve transmission events, these processes need to be taken into account. Recent years have seen much progress in theory and method development, but existing applications make simplifying assumptions that often break up the dependency between the four processes, or are tailored to specific datasets with matching model assumptions and code. To obtain a method with wider applicability, we have developed a novel approach to reconstruct transmission trees with sequence data. Our approach combines elementary models for transmission, case observation, within-host pathogen dynamics, and mutation, under the assumption that the outbreak is over and all cases have been observed. We use Bayesian inference with MCMC for which we have designed novel proposal steps to efficiently traverse the posterior distribution, taking account of all unobserved processes at once. This allows for efficient sampling of transmission trees from the posterior distribution, and robust estimation of consensus transmission trees. We implemented the proposed method in a new R package phybreak. The method performs well in tests of both new and published simulated data. We apply the model to five datasets on densely sampled infectious disease outbreaks, covering a wide range of epidemiological settings. 
Using only sampling times and sequences as data, our analyses confirmed the original results or improved on them: the more realistic infection times place more confidence in the inferred transmission trees.

  1. Statistical inference to advance network models in epidemiology.

    PubMed

    Welch, David; Bansal, Shweta; Hunter, David R

    2011-03-01

    Contact networks are playing an increasingly important role in the study of epidemiology. Most of the existing work in this area has focused on considering the effect of underlying network structure on epidemic dynamics by using tools from probability theory and computer simulation. This work has provided much insight into the role that heterogeneity in host contact patterns plays in infectious disease dynamics. Despite the important understanding afforded by the probability and simulation paradigm, this approach does not directly address important questions about the structure of contact networks such as which network model is best for a particular mode of disease transmission, how parameter values of a given model should be estimated, or how precisely the data allow us to estimate these parameter values. We argue that these questions are best answered within a statistical framework and discuss the role of statistical inference in estimating contact networks from epidemiological data. Copyright © 2011 Elsevier B.V. All rights reserved.
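
    As a minimal instance of the statistical framework the authors advocate, consider the simplest network model: under an Erdős-Rényi model every pair of n hosts is connected independently with probability p, and the maximum-likelihood estimate of p from one observed contact network is just the edge density. The sketch below is illustrative; realistic contact-network models (e.g. exponential random graph models) generalize this same likelihood principle.

```python
import random
from itertools import combinations

random.seed(3)

# Under an Erdos-Renyi model, each of the n*(n-1)/2 possible contacts exists
# independently with probability p; the maximum-likelihood estimate of p from
# a single observed network is the edge density.  Values are illustrative.
n, p_true = 200, 0.05
edges = {(i, j) for i, j in combinations(range(n), 2) if random.random() < p_true}

possible = n * (n - 1) // 2
p_hat = len(edges) / possible  # MLE of the contact probability
print(p_hat)
```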

  2. Effects of simulated rain acidified with sulfuric acid on host-parasite interactions

    Treesearch

    D. S. Shriner

    1976-01-01

    Wind-blown rain, rain splash, and films of free moisture play important roles in the epidemiology of many plant diseases. The effects of simulated rain acidified with sulfuric acid were studied on several host-parasite systems. Plants were exposed, in greenhouse or field, to simulated rain of pH 3.2 ± 0.1 or pH 6.0 ± 0.2. Simulated "rain" of pH 3.2 resulted...

  3. Quantitative bias analysis of a reported association between perfluoroalkyl substances (PFAS) and endometriosis: The influence of oral contraceptive use.

    PubMed

    Ngueta, Gerard; Longnecker, Matthew P; Yoon, Miyoung; Ruark, Christopher D; Clewell, Harvey J; Andersen, Melvin E; Verner, Marc-André

    2017-07-01

    An association between serum levels of perfluoroalkyl substances (PFAS) and endometriosis has recently been reported in an epidemiologic study. Oral contraceptive use to treat dysmenorrhea (pelvic pain associated with endometriosis) could potentially influence this association by reducing menstrual fluid loss, a route of excretion for PFAS. In this study, we aimed to evaluate the influence of differential oral contraceptive use on the association between PFAS and endometriosis. We used a published life-stage physiologically based pharmacokinetic (PBPK) model to simulate plasma levels of perfluorooctanoic acid (PFOA) and perfluorooctane sulfonate (PFOS) from birth to age at study participation (range 18-44 years). In the simulated population, PFAS level distributions matched those for controls in the epidemiologic study. Prevalence and geometric mean duration (standard deviation [SD]) of oral contraceptive use in the simulated women were based on data from the National Health and Nutrition Examination Survey; among the women with endometriosis the values were, respectively, 29% and 6.8 (3.1) years; among those without endometriosis these values were 18% and 5.3 (2.8) years. In simulations, menstrual fluid loss (ml/cycle) in women taking oral contraceptives was assumed to be 56% of loss in non-users. We evaluated the association between simulated plasma PFAS concentration and endometriosis in the simulated population using logistic regression. Based on the simulations, the association between PFAS levels and endometriosis attributable to differential contraceptive use had an odds ratio (95% CI) of 1.05 (1.02, 1.07) per natural log unit increase in PFOA and 1.03 (1.02, 1.05) for PFOS. In comparison, the epidemiologic study reported odds ratios of 1.62 (0.99, 2.66) for PFOA and 1.25 (0.87, 1.80) for PFOS. Our results suggest that the influence of oral contraceptive use on the association between PFAS levels and endometriosis is relatively small.
Copyright © 2017 Elsevier Ltd. All rights reserved.
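
    The bias mechanism can be caricatured without the full PBPK model: if oral contraceptive (OC) use both raises serum PFAS (by reducing menstrual excretion) and is more frequent among endometriosis cases (29%) than controls (18%), a positive PFAS-endometriosis odds ratio appears even with no causal effect. The simulation below is a deliberately crude sketch with hypothetical distributions and an assumed retention factor, not the authors' life-stage model.

```python
import random
import statistics

random.seed(4)

# Mechanism sketch only: OC use lowers menstrual excretion of PFAS, raising
# serum levels by an assumed factor, and is more common among endometriosis
# cases (29%) than controls (18%).  Even with no causal PFAS effect, this
# alone induces a positive odds ratio for "high PFAS" vs endometriosis.
def simulate(n, p_oc):
    levels = []
    for _ in range(n):
        oc = random.random() < p_oc
        base = random.lognormvariate(1.0, 0.4)       # hypothetical serum PFAS
        levels.append(base * (1.25 if oc else 1.0))  # OC users retain more
    return levels

cases = simulate(50_000, 0.29)
controls = simulate(50_000, 0.18)

# Odds ratio for PFAS above the overall median, cases vs controls.
cut = statistics.median(cases + controls)
a = sum(x > cut for x in cases)
b = len(cases) - a
c = sum(x > cut for x in controls)
d = len(controls) - c
odds_ratio = (a * d) / (b * c)
print(round(odds_ratio, 2))  # modestly above 1 from confounding alone
```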

  4. Comparing control strategies against foot-and-mouth disease: will vaccination be cost-effective in Denmark?

    PubMed

    Boklund, A; Halasa, T; Christiansen, L E; Enøe, C

    2013-09-01

    Recent outbreaks of foot-and-mouth disease (FMD) in Europe have highlighted the need for assessment of control strategies to optimise control of the spread of FMD. Our objectives were to assess the epidemiological and financial impact of simulated FMD outbreaks in Denmark and the effect of using ring depopulation or emergency vaccination to control these outbreaks. Two stochastic simulation models (InterSpreadPlus (ISP) and the modified Davis Animal Disease Simulation model (DTU-DADS)) were used to simulate the spread of FMD in Denmark using different control strategies. Each epidemic was initiated in one herd (index herd), and a total of 5000 index herds were used. Four types of control measures were investigated: (1) a basic scenario including depopulation of detected herds, 3 km protection and 10 km surveillance zones, movement tracing and a three-day national standstill, (2) the basic scenario plus depopulation in ring zones around detected herds (Depop), (3) the basic scenario plus suppressive vaccination within ring zones around detected herds, and (4) the basic scenario plus protective vaccination within ring zones around detected herds. Disease spread was simulated through direct animal movements, medium-risk contacts (veterinarians, artificial inseminators or milk controllers), low-risk contacts (animal feed and rendering trucks, technicians or visitors), market contacts, abattoir trucks, milk tanks, or local spread. The two simulation models showed different results in terms of the estimated numbers. However, the tendencies in terms of recommendations of strategies were similar for both models.
Comparison of the different control strategies showed that, from an epidemiological point of view, protective vaccination would be preferable if the epidemic started in a cattle herd in an area with a high density of cattle, whereas if the epidemic started in an area with a low density of cattle or in other species, protective vaccination or depopulation would have almost the same preventive effect. Implementing additional control measures either 14 days after detection of the first infected herd or when 10 herds have been diagnosed would be more efficient than implementing additional control measures when more herds have been diagnosed. Protective vaccination scenarios would never be cost-effective, whereas depopulation or suppressive vaccination scenarios would most often be recommended. Looking at the median estimates of the cost-benefit analysis, depopulation in zones would most often be recommended, although, in extreme epidemics, suppressive vaccination scenarios could be less expensive. The vast majority of the costs and losses associated with a Danish epidemic could be attributed to export losses. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Potential for adult-based epidemiological studies to characterize overall cancer risks associated with a lifetime of CT scans.

    PubMed

    Shuryak, Igor; Lubin, Jay H; Brenner, David J

    2014-06-01

    Recent epidemiological studies have suggested that radiation exposure from pediatric CT scanning is associated with small excess cancer risks. However, the majority of CT scans are performed on adults, and most radiation-induced cancers appear during middle or old age, in the same age range as background cancers. Consequently, a logical next step is to investigate the effects of CT scanning in adulthood on lifetime cancer risks by conducting adult-based, appropriately designed epidemiological studies. Here we estimate the sample size required for such studies to detect CT-associated risks. This was achieved by incorporating different age-, sex-, time- and cancer type-dependent models of radiation carcinogenesis into an in silico simulation of a population-based cohort study. This approach simulated individual histories of chest and abdominal CT exposures, deaths and cancer diagnoses. The resultant sample sizes suggest that epidemiological studies of realistically sized cohorts can detect excess lifetime cancer risks from adult CT exposures. For example, retrospective analysis of CT exposure and cancer incidence data from a population-based cohort of 0.4 to 1.3 million (depending on the carcinogenic model) CT-exposed UK adults, aged 25-65 in 1980 and followed until 2015, provides 80% power for detecting cancer risks from chest and abdominal CT scans.
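
    A closed-form normal-approximation calculation gives a feel for the person-time such studies require: compare a background Poisson cancer rate against a slightly elevated rate at two-sided 5% significance and 80% power. The rates below are illustrative placeholders, not the paper's fitted carcinogenesis models, which is why the in silico cohort simulation described above is needed for realistic answers.

```python
import math

# Normal-approximation sample size for detecting an elevated Poisson cancer
# rate against background (two-sided 5% significance, 80% power).
z_alpha = 1.96  # two-sided 5%
z_beta = 0.84   # 80% power

def person_years_per_group(rate0, rate1):
    # T such that (z_alpha + z_beta) * sqrt((rate0 + rate1) / T) = rate1 - rate0
    return (z_alpha + z_beta) ** 2 * (rate0 + rate1) / (rate1 - rate0) ** 2

background = 0.002            # hypothetical cancers per person-year
elevated = background * 1.05  # a 5% excess attributed to CT exposure
n = person_years_per_group(background, elevated)
print(round(n))               # millions of person-years for a 5% excess
```

    Larger assumed excesses shrink the requirement quadratically, which is why detectable effect size dominates cohort-size planning.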

  6. Reevaluation of epidemiological data demonstrates that it is consistent with cross-immunity among human papillomavirus types.

    PubMed

    Durham, David P; Poolman, Eric M; Ibuka, Yoko; Townsend, Jeffrey P; Galvani, Alison P

    2012-10-01

    The degree of cross-immunity between human papillomavirus (HPV) types is fundamental both to the epidemiological dynamics of HPV and to the impact of HPV vaccination. Epidemiological data on HPV infections has been repeatedly interpreted as inconsistent with cross-immunity. We reevaluate the epidemiological data using a model to determine the odds ratios of multiple to single infections expected in the presence or absence of cross-immunity. We simulate a virtual longitudinal survey to determine the effect cross-immunity has on the prevalence of multiple infections. We calibrate our model to epidemiological data and estimate the extent of type replacement following vaccination against specific HPV types. We find that cross-immunity can produce odds ratios of infection comparable with epidemiological observations. We show that the sample sizes underlying existing surveys have been insufficient to identify even intense cross-immunity. We also find that the removal of HPV type 16, type 18, and types 6 and 11 would increase the prevalence of nontargeted types by 50%, 29%, and 183%, respectively. Cross-immunity between HPV types is consistent with epidemiological data, contrary to previous interpretations. Cross-immunity may cause significant type replacement following vaccination, and therefore should be considered in future vaccine studies and epidemiological models.
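
    The odds-ratio diagnostic discussed above can be computed deterministically in a toy two-type model: represent cross-immunity (simplistically) as a factor k ≤ 1 multiplying the joint probability of coinfection, and evaluate the odds ratio of multiple versus single infection, P11·P00 / (P10·P01). The prevalences below are illustrative, not fitted values.

```python
# For two HPV types with marginal prevalences p1 and p2, model cross-immunity
# as a factor k <= 1 on the coinfection probability, and compute the
# observable odds ratio of multiple vs single infection.
def coinfection_or(p1, p2, k):
    p11 = k * p1 * p2         # coinfected with both types
    p10 = p1 - p11            # type 1 only
    p01 = p2 - p11            # type 2 only
    p00 = 1 - p11 - p10 - p01 # neither type
    return (p11 * p00) / (p10 * p01)

no_ci = coinfection_or(0.10, 0.08, 1.0)    # independent acquisition
intense = coinfection_or(0.10, 0.08, 0.5)  # strong cross-immunity
print(no_ci, intense)  # independence gives exactly OR = 1
```

    Detecting the difference between these two odds ratios in data requires large samples, which is the paper's point about the insufficiency of existing surveys.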

  7. Networks and the Epidemiology of Infectious Disease

    PubMed Central

    Danon, Leon; Ford, Ashley P.; House, Thomas; Jewell, Chris P.; Keeling, Matt J.; Roberts, Gareth O.; Ross, Joshua V.; Vernon, Matthew C.

    2011-01-01

    The science of networks has revolutionised research into the dynamics of interacting elements. It could be argued that epidemiology in particular has embraced the potential of network theory more than any other discipline. Here we review the growing body of research concerning the spread of infectious diseases on networks, focusing on the interplay between network theory and epidemiology. The review is split into four main sections, which examine: the types of network relevant to epidemiology; the multitude of ways these networks can be characterised; the statistical methods that can be applied to infer the epidemiological parameters on a realised network; and finally simulation and analytical methods to determine epidemic dynamics on a given network. Given the breadth of areas covered and the ever-expanding number of publications, a comprehensive review of all work is impossible. Instead, we provide a personalised overview into the areas of network epidemiology that have seen the greatest progress in recent years or have the greatest potential to provide novel insights. As such, considerable importance is placed on analytical approaches and statistical methods which are both rapidly expanding fields. Throughout this review we restrict our attention to epidemiological issues. PMID:21437001

  8. Cox process representation and inference for stochastic reaction-diffusion processes

    NASA Astrophysics Data System (ADS)

    Schnoerr, David; Grima, Ramon; Sanguinetti, Guido

    2016-05-01

    Complex behaviour in many systems arises from the stochastic interactions of spatially distributed particles or agents. Stochastic reaction-diffusion processes are widely used to model such behaviour in disciplines ranging from biology to the social sciences, yet they are notoriously difficult to simulate and calibrate to observational data. Here we use ideas from statistical physics and machine learning to provide a solution to the inverse problem of learning a stochastic reaction-diffusion process from data. Our solution relies on a non-trivial connection between stochastic reaction-diffusion processes and spatio-temporal Cox processes, a well-studied class of models from computational statistics. This connection leads to an efficient and flexible algorithm for parameter inference and model selection. Our approach shows excellent accuracy on numeric and real data examples from systems biology and epidemiology. Our work provides both insights into spatio-temporal stochastic systems, and a practical solution to a long-standing problem in computational modelling.
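
    A Cox process is an inhomogeneous Poisson process whose intensity is itself random; given one realisation of the intensity, events can be simulated by Lewis-Shedler thinning. The sketch below is a generic illustration of that class of models, not the authors' inference algorithm, and the sinusoidal intensity is purely illustrative.

```python
import math
import random

random.seed(7)

# Lewis-Shedler thinning: propose points from a homogeneous Poisson process
# at the maximum intensity, keep each with probability intensity(t) / lam_max.
def intensity(t):
    return 10.0 * (1.0 + math.sin(t))  # nonnegative on [0, 2*pi]

lam_max = 20.0
horizon = 2 * math.pi
events = []
t = 0.0
while True:
    t += random.expovariate(lam_max)   # candidate inter-arrival time
    if t > horizon:
        break
    if random.random() < intensity(t) / lam_max:
        events.append(t)               # accepted by thinning

print(len(events))  # expected count = integral of the intensity (about 63)
```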

  9. Maximum likelihood estimation for semiparametric transformation models with interval-censored data

    PubMed Central

    Mao, Lu; Lin, D. Y.

    2016-01-01

    Interval censoring arises frequently in clinical, epidemiological, financial and sociological studies, where the event or failure of interest is known only to occur within an interval induced by periodic monitoring. We formulate the effects of potentially time-dependent covariates on the interval-censored failure time through a broad class of semiparametric transformation models that encompasses proportional hazards and proportional odds models. We consider nonparametric maximum likelihood estimation for this class of models with an arbitrary number of monitoring times for each subject. We devise an EM-type algorithm that converges stably, even in the presence of time-dependent covariates, and show that the estimators for the regression parameters are consistent, asymptotically normal, and asymptotically efficient with an easily estimated covariance matrix. Finally, we demonstrate the performance of our procedures through simulation studies and application to an HIV/AIDS study conducted in Thailand. PMID:27279656
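
    How interval censoring arises under periodic monitoring is easy to sketch: the true event time T is never observed, only the pair of consecutive visits (L, R] that brackets it. The visit schedule and event-time rate below are illustrative; this generates exactly the kind of data the estimation procedure above consumes.

```python
import random

random.seed(8)

# Map a latent event time onto the observed bracketing interval (L, R].
def censor(event_time, visits):
    """Return (L, R] bracketing event_time; R is None if the event falls
    after the last visit (right censoring)."""
    left = 0.0
    for v in visits:
        if event_time <= v:
            return (left, v)
        left = v
    return (left, None)

records = []
for _ in range(1000):
    t = random.expovariate(0.3)  # latent event time, never observed directly
    visits = [y + random.uniform(-0.2, 0.2) for y in (1, 2, 3, 4, 5)]
    records.append((t, censor(t, visits)))

# Every observed interval genuinely brackets its latent event time.
ok = all(l < t <= (r if r is not None else float("inf"))
         for t, (l, r) in records)
print(ok)
```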

  10. A simulation study to quantify the impacts of exposure measurement error on air pollution health risk estimates in copollutant time-series models.

    EPA Science Inventory

    BackgroundExposure measurement error in copollutant epidemiologic models has the potential to introduce bias in relative risk (RR) estimates. A simulation study was conducted using empirical data to quantify the impact of correlated measurement errors in time-series analyses of a...
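
    The core of the single-pollutant measurement-error problem fits in one formula: under classical error W = X + U (U independent of X), the estimated slope is attenuated by the reliability ratio λ = var(X) / (var(X) + var(U)); in copollutant models, correlated errors can additionally transfer apparent risk between pollutants. A deterministic sketch with illustrative variances:

```python
# Classical measurement error attenuates a regression slope by the
# reliability ratio lambda = var(X) / (var(X) + var(U)).
def reliability(var_x, var_u):
    return var_x / (var_x + var_u)

beta_true = 0.08                        # hypothetical log-RR per unit exposure
lam = reliability(var_x=4.0, var_u=1.0)
beta_observed = lam * beta_true         # biased toward the null
print(lam, beta_observed)
```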

  11. Analysis of Spatiotemporal Characteristics of Pandemic SARS Spread in Mainland China.

    PubMed

    Cao, Chunxiang; Chen, Wei; Zheng, Sheng; Zhao, Jian; Wang, Jinfeng; Cao, Wuchun

    2016-01-01

    Severe acute respiratory syndrome (SARS) is one of the most severe emerging infectious diseases of the 21st century so far. SARS caused a pandemic that spread throughout mainland China for 7 months, infecting 5318 persons in 194 administrative regions. Using detailed mainland China epidemiological data, we study spatiotemporal aspects of this person-to-person contagious disease and simulate its spatiotemporal transmission dynamics via the Bayesian Maximum Entropy (BME) method. The BME reveals that SARS outbreaks show autocorrelation within certain spatial and temporal distances. We use BME to fit a theoretical covariance model that has a sine hole spatial component and exponential temporal component and obtain the weights of geographical and temporal autocorrelation factors. Using the covariance model, SARS dynamics were estimated and simulated under the most probable conditions. Our study suggests that SARS transmission varies in its epidemiological characteristics and SARS outbreak distributions exhibit palpable clusters on both spatial and temporal scales. In addition, the BME modelling demonstrates that SARS transmission features are affected by spatial heterogeneity, so we analyze potential causes. This may benefit epidemiological control of pandemic infectious diseases.

  12. Analysis of Spatiotemporal Characteristics of Pandemic SARS Spread in Mainland China

    PubMed Central

    Cao, Chunxiang; Zheng, Sheng; Zhao, Jian; Wang, Jinfeng; Cao, Wuchun

    2016-01-01

    Severe acute respiratory syndrome (SARS) is one of the most severe emerging infectious diseases of the 21st century so far. SARS caused a pandemic that spread throughout mainland China for 7 months, infecting 5318 persons in 194 administrative regions. Using detailed mainland China epidemiological data, we study spatiotemporal aspects of this person-to-person contagious disease and simulate its spatiotemporal transmission dynamics via the Bayesian Maximum Entropy (BME) method. The BME reveals that SARS outbreaks show autocorrelation within certain spatial and temporal distances. We use BME to fit a theoretical covariance model that has a sine hole spatial component and exponential temporal component and obtain the weights of geographical and temporal autocorrelation factors. Using the covariance model, SARS dynamics were estimated and simulated under the most probable conditions. Our study suggests that SARS transmission varies in its epidemiological characteristics and SARS outbreak distributions exhibit palpable clusters on both spatial and temporal scales. In addition, the BME modelling demonstrates that SARS transmission features are affected by spatial heterogeneity, so we analyze potential causes. This may benefit epidemiological control of pandemic infectious diseases. PMID:27597972

  13. Consequences of kriging and land use regression for PM2.5 predictions in epidemiologic analyses: Insights into spatial variability using high-resolution satellite data

    PubMed Central

    Alexeeff, Stacey E.; Schwartz, Joel; Kloog, Itai; Chudnovsky, Alexandra; Koutrakis, Petros; Coull, Brent A.

    2016-01-01

    Many epidemiological studies use predicted air pollution exposures as surrogates for true air pollution levels. These predicted exposures contain exposure measurement error, yet simulation studies have typically found negligible bias in resulting health effect estimates. However, previous studies typically assumed a statistical spatial model for air pollution exposure, which may be oversimplified. We address this shortcoming by assuming a realistic, complex exposure surface derived from fine-scale (1 km × 1 km) remote-sensing satellite data. Using simulation, we evaluate the accuracy of epidemiological health effect estimates in linear and logistic regression when using spatial air pollution predictions from kriging and land use regression models. We examined chronic (long-term) and acute (short-term) exposure to air pollution. Results varied substantially across different scenarios. Exposure models with low out-of-sample R² yielded severe biases in the health effect estimates of some models, ranging from 60% upward bias to 70% downward bias. One land use regression exposure model with greater than 0.9 out-of-sample R² yielded upward biases up to 13% for acute health effect estimates. Almost all models drastically underestimated the standard errors. Land use regression models performed better in chronic effects simulations. These results can help researchers when interpreting health effect estimates in these types of studies. PMID:24896768

  14. Usefulness of Mendelian Randomization in Observational Epidemiology

    PubMed Central

    Bochud, Murielle; Rousson, Valentin

    2010-01-01

    Mendelian randomization refers to the random allocation of alleles at the time of gamete formation. In observational epidemiology, this refers to the use of genetic variants to estimate a causal effect between a modifiable risk factor and an outcome of interest. In this review, we recall the principles of a “Mendelian randomization” approach in observational epidemiology, which is based on the technique of instrumental variables; we provide simulations and an example based on real data to demonstrate its implications; we present the results of a systematic search on original articles having used this approach; and we discuss some limitations of this approach in view of what has been found so far. PMID:20616999

  15. Systems Epidemiology: What’s in a Name?

    PubMed Central

    Dammann, O.; Gray, P.; Gressens, P.; Wolkenhauer, O.; Leviton, A.

    2014-01-01

    Systems biology is an interdisciplinary effort to integrate molecular, cellular, tissue, organ, and organism levels of function into computational models that facilitate the identification of general principles. Systems medicine adds a disease focus. Systems epidemiology adds yet another level consisting of antecedents that might contribute to the disease process in populations. In etiologic and prevention research, systems-type thinking about multiple levels of causation will allow epidemiologists to identify contributors to disease at multiple levels as well as their interactions. In public health, systems epidemiology will contribute to the improvement of syndromic surveillance methods. We encourage the creation of computational simulation models that integrate information about disease etiology, pathogenetic data, and the expertise of investigators from different disciplines. PMID:25598870

  16. Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies

    PubMed Central

    Vatcheva, Kristina P.; Lee, MinJae; McCormick, Joseph B.; Rahbar, Mohammad H.

    2016-01-01

    The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013 illustrated the need for greater attention to identifying and minimizing the effect of multicollinearity in analysis of data from epidemiologic studies. We used simulated datasets and real life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in the regression analysis and encourage researchers to consider diagnostics for multicollinearity as one of the steps in regression analysis. PMID:27274911

  17. Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies.

    PubMed

    Vatcheva, Kristina P; Lee, MinJae; McCormick, Joseph B; Rahbar, Mohammad H

    2016-04-01

    The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013 illustrated the need for greater attention to identifying and minimizing the effect of multicollinearity in analysis of data from epidemiologic studies. We used simulated datasets and real life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in the regression analysis and encourage researchers to consider diagnostics for multicollinearity as one of the steps in regression analysis.

  18. Prediction and analysis of near-road concentrations using a reduced-form emission/dispersion model

    PubMed Central

    2010-01-01

    Background Near-road exposures of traffic-related air pollutants have been receiving increased attention due to evidence linking emissions from high-traffic roadways to adverse health outcomes. To date, most epidemiological and risk analyses have utilized simple but crude exposure indicators, most typically proximity measures, such as the distance between freeways and residences, to represent air quality impacts from traffic. This paper derives and analyzes a simplified microscale simulation model designed to predict short- (hourly) to long-term (annual average) pollutant concentrations near roads. Sensitivity analyses and case studies are used to highlight issues in predicting near-road exposures. Methods Process-based simulation models using a computationally efficient reduced-form response surface structure and a minimum number of inputs integrate the major determinants of air pollution exposures: traffic volume and vehicle emissions, meteorology, and receptor location. We identify the most influential variables and then derive a set of multiplicative submodels that match predictions from "parent" models MOBILE6.2 and CALINE4. The assembled model is applied to two case studies in the Detroit, Michigan area. The first predicts carbon monoxide (CO) concentrations at a monitoring site near a freeway. The second predicts CO and PM2.5 concentrations in a dense receptor grid over a 1 km2 area around the intersection of two major roads. We analyze the spatial and temporal patterns of pollutant concentration predictions. Results Predicted CO concentrations showed reasonable agreement with annual average and 24-hour measurements, e.g., 59% of the 24-hr predictions were within a factor of two of observations in the warmer months when CO emissions are more consistent. 
The highest concentrations of both CO and PM2.5 were predicted to occur near intersections and downwind of major roads during periods of unfavorable meteorology (e.g., low wind speeds) and high emissions (e.g., weekday rush hour). The spatial and temporal variation among predicted concentrations was significant, and resulted in unusual distributional and correlation characteristics, including strong negative correlation for receptors on opposite sides of a road and the highest short-term concentrations on the "upwind" side of the road. Conclusions The case study findings can likely be generalized to many other locations, and they have important implications for epidemiological and other studies. The reduced-form model is intended for exposure assessment, risk assessment, epidemiological, geographical information systems, and other applications. PMID:20579353
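
    The reduced-form multiplicative structure described above can be caricatured as a baseline emission term scaled by separate submodels for traffic volume, meteorology, and distance decay. The functional forms and coefficients below are hypothetical stand-ins, not the fitted MOBILE6.2/CALINE4 response surfaces, but they reproduce the qualitative behaviour the case studies report: highest concentrations near the road, at low wind speeds, and at high traffic.

```python
import math

# Hypothetical reduced-form model: concentration = baseline * f(traffic)
# * f(meteorology) * f(distance).  All forms and constants are illustrative.
def concentration(traffic_per_hr, wind_speed_ms, distance_m):
    baseline = 0.5                            # ug/m3 per 1000 veh/hr
    f_traffic = traffic_per_hr / 1000.0
    f_met = 1.0 / max(wind_speed_ms, 0.5)     # worse at low wind speeds
    f_dist = math.exp(-distance_m / 150.0)    # assumed ~150 m decay length
    return baseline * f_traffic * f_met * f_dist

near = concentration(4000, 1.0, 20)    # receptor 20 m from a busy road
far = concentration(4000, 1.0, 300)    # receptor 300 m away
calm = concentration(4000, 0.5, 20)    # same near receptor, low wind
print(near, far, calm)
```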

  19. Solving search problems by strongly simulating quantum circuits

    PubMed Central

    Johnson, T. H.; Biamonte, J. D.; Clark, S. R.; Jaksch, D.

    2013-01-01

    Simulating quantum circuits using classical computers lets us analyse the inner workings of quantum algorithms. The most complete type of simulation, strong simulation, is believed to be generally inefficient. Nevertheless, several efficient strong simulation techniques are known for restricted families of quantum circuits and we develop an additional technique in this article. Further, we show that strong simulation algorithms perform another fundamental task: solving search problems. Efficient strong simulation techniques allow solutions to a class of search problems to be counted and found efficiently. This enhances the utility of strong simulation methods, known or yet to be discovered, and extends the class of search problems known to be efficiently simulable. Relating strong simulation to search problems also bounds the computational power of efficiently strongly simulable circuits; if they could solve all problems in P this would imply that all problems in NP and #P could be solved in polynomial time. PMID:23390585
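
    "Strong" simulation means computing actual amplitudes of the quantum state, not merely sampling measurement outcomes. A brute-force state-vector simulator does this in time exponential in qubit count; the two-qubit circuit below (Hadamard, then CNOT, preparing a Bell state) is small enough to simulate completely. This is a generic illustration, not one of the restricted-family techniques the paper develops.

```python
import math

def apply_h(state, q):
    """Apply a Hadamard gate to qubit q of a state vector."""
    s = 1 / math.sqrt(2)
    out = [0.0] * len(state)
    for i, amp in enumerate(state):
        j = i ^ (1 << q)                  # index with qubit q flipped
        out[i] += s * amp if (i >> q) & 1 == 0 else -s * amp
        out[j] += s * amp
    return out

def apply_cnot(state, ctrl, tgt):
    """Flip qubit tgt wherever qubit ctrl is 1."""
    out = list(state)
    for i in range(len(state)):
        if (i >> ctrl) & 1:
            out[i ^ (1 << tgt)] = state[i]
            out[i] = state[i ^ (1 << tgt)]
    return out

state = [1.0, 0.0, 0.0, 0.0]       # |00>
state = apply_h(state, 0)
state = apply_cnot(state, 0, 1)    # now (|00> + |11>) / sqrt(2)
print(state)
```

    Every amplitude is available exactly, which is precisely the extra power (counting, searching) that the paper exploits.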

  20. A unifying theory for genetic epidemiological analysis of binary disease data

    PubMed Central

    2014-01-01

Background: Genetic selection for host resistance offers a desirable complement to chemical treatment to control infectious disease in livestock. Quantitative genetics disease data frequently originate from field studies and are often binary. However, current methods to analyse binary disease data fail to take infection dynamics into account. Moreover, genetic analyses tend to focus on host susceptibility, ignoring potential variation in infectiousness, i.e. the ability of a host to transmit the infection. This stands in contrast to epidemiological studies, which reveal that variation in infectiousness plays an important role in the progression and severity of epidemics. In this study, we aim at filling this gap by deriving an expression for the probability of becoming infected that incorporates infection dynamics and is an explicit function of both host susceptibility and infectiousness. We then validate this expression according to epidemiological theory and by simulating epidemiological scenarios, and explore implications of integrating this expression into genetic analyses. Results: Our simulations show that the derived expression is valid for a range of stochastic genetic-epidemiological scenarios. In the particular case of variation in susceptibility only, the expression can be incorporated into conventional quantitative genetic analyses using a complementary log-log link function (rather than probit or logit). Similarly, if there is moderate variation in both susceptibility and infectiousness, it is possible to use a logarithmic link function, combined with an indirect genetic effects model. However, in the presence of highly infectious individuals, i.e. super-spreaders, the use of any model that is linear in susceptibility and infectiousness causes biased estimates. Thus, in order to identify super-spreaders, novel analytical methods using our derived expression are required. Conclusions: We have derived a genetic-epidemiological function for quantitative genetic analyses of binary infectious disease data, which, unlike current approaches, takes infection dynamics into account and allows for variation in host susceptibility and infectiousness. PMID:24552188
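
The suitability of the complementary log-log link can be made concrete: under mass-action dynamics, the probability of escaping infection decays exponentially in susceptibility times exposure, so the cloglog of the infection probability is exactly linear in log susceptibility. A minimal numerical sketch (the function names and the values of `g` and `E` are illustrative, not from the paper):

```python
import math

def cloglog(p):
    # complementary log-log link: log(-log(1 - p))
    return math.log(-math.log(1.0 - p))

def p_infected(susceptibility, exposure):
    # Under mass-action SIR-type dynamics the escape probability is
    # exp(-susceptibility * exposure), so
    # P(infected) = 1 - exp(-susceptibility * exposure).
    return 1.0 - math.exp(-susceptibility * exposure)

# cloglog of the infection probability is linear in log susceptibility:
# cloglog(p) = log(susceptibility) + log(exposure)
g, E = 0.8, 2.5
lhs = cloglog(p_infected(g, E))
rhs = math.log(g) + math.log(E)
print(abs(lhs - rhs) < 1e-12)  # → True
```

A probit or logit link has no such exact representation, which is why the abstract singles out cloglog for the susceptibility-only case.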

  1. A unifying theory for genetic epidemiological analysis of binary disease data.

    PubMed

    Lipschutz-Powell, Debby; Woolliams, John A; Doeschl-Wilson, Andrea B

    2014-02-19

    Genetic selection for host resistance offers a desirable complement to chemical treatment to control infectious disease in livestock. Quantitative genetics disease data frequently originate from field studies and are often binary. However, current methods to analyse binary disease data fail to take infection dynamics into account. Moreover, genetic analyses tend to focus on host susceptibility, ignoring potential variation in infectiousness, i.e. the ability of a host to transmit the infection. This stands in contrast to epidemiological studies, which reveal that variation in infectiousness plays an important role in the progression and severity of epidemics. In this study, we aim at filling this gap by deriving an expression for the probability of becoming infected that incorporates infection dynamics and is an explicit function of both host susceptibility and infectiousness. We then validate this expression according to epidemiological theory and by simulating epidemiological scenarios, and explore implications of integrating this expression into genetic analyses. Our simulations show that the derived expression is valid for a range of stochastic genetic-epidemiological scenarios. In the particular case of variation in susceptibility only, the expression can be incorporated into conventional quantitative genetic analyses using a complementary log-log link function (rather than probit or logit). Similarly, if there is moderate variation in both susceptibility and infectiousness, it is possible to use a logarithmic link function, combined with an indirect genetic effects model. However, in the presence of highly infectious individuals, i.e. super-spreaders, the use of any model that is linear in susceptibility and infectiousness causes biased estimates. Thus, in order to identify super-spreaders, novel analytical methods using our derived expression are required. 
We have derived a genetic-epidemiological function for quantitative genetic analyses of binary infectious disease data, which, unlike current approaches, takes infection dynamics into account and allows for variation in host susceptibility and infectiousness.

  2. Structural and Practical Identifiability Issues of Immuno-Epidemiological Vector-Host Models with Application to Rift Valley Fever.

    PubMed

    Tuncer, Necibe; Gulbudak, Hayriye; Cannataro, Vincent L; Martcheva, Maia

    2016-09-01

    In this article, we discuss the structural and practical identifiability of a nested immuno-epidemiological model of arbovirus diseases, where host-vector transmission rate, host recovery, and disease-induced death rates are governed by the within-host immune system. We incorporate the newest ideas and the most up-to-date features of numerical methods to fit multi-scale models to multi-scale data. For an immunological model, we use Rift Valley Fever Virus (RVFV) time-series data obtained from livestock under laboratory experiments, and for an epidemiological model we incorporate a human compartment to the nested model and use the number of human RVFV cases reported by the CDC during the 2006-2007 Kenya outbreak. We show that the immunological model is not structurally identifiable for the measurements of time-series viremia concentrations in the host. Thus, we study the non-dimensionalized and scaled versions of the immunological model and prove that both are structurally globally identifiable. After fixing estimated parameter values for the immunological model derived from the scaled model, we develop a numerical method to fit observable RVFV epidemiological data to the nested model for the remaining parameter values of the multi-scale system. For the given (CDC) data set, Monte Carlo simulations indicate that only three parameters of the epidemiological model are practically identifiable when the immune model parameters are fixed. Alternatively, we fit the multi-scale data to the multi-scale model simultaneously. Monte Carlo simulations for the simultaneous fitting suggest that the parameters of the immunological model and the parameters of the immuno-epidemiological model are practically identifiable. We suggest that analytic approaches for studying the structural identifiability of nested models are a necessity, so that identifiable parameter combinations can be derived to reparameterize the nested model to obtain an identifiable one. 
This is a crucial step in developing multi-scale models which explain multi-scale data.

  3. Dynamics of eco-epidemiological model with harvesting

    NASA Astrophysics Data System (ADS)

    Purnomo, Anna Silvia; Darti, Isnani; Suryanto, Agus

    2017-12-01

In this paper, we study an eco-epidemiological model derived from an SI epidemic model with a bilinear incidence rate and a modified Leslie-Gower predator-prey model with harvesting of susceptible prey. Existence conditions and stability of all equilibrium points are discussed for the proposed model. Furthermore, we show that the model exhibits a Hopf bifurcation around the interior equilibrium point, driven by the rate of infection. Our numerical simulations with different parameter values confirm the analytical results.

  4. Associations of Perfluoroalkyl Substances (PFAS) with Lower Birth Weight: An Evaluation of Potential Confounding by Glomerular Filtration Rate Using a Physiologically Based Pharmacokinetic Model (PBPK)

    PubMed Central

    Loccisano, Anne E.; Morken, Nils-Halvdan; Yoon, Miyoung; Wu, Huali; McDougall, Robin; Maisonet, Mildred; Marcus, Michele; Kishi, Reiko; Miyashita, Chihiro; Chen, Mei-Huei; Hsieh, Wu-Shiun; Andersen, Melvin E.; Clewell, Harvey J.; Longnecker, Matthew P.

    2015-01-01

Background: Prenatal exposure to perfluoroalkyl substances (PFAS) has been associated with lower birth weight in epidemiologic studies. This association could be attributable to glomerular filtration rate (GFR), which is related to both PFAS concentration and birth weight. Objectives: We used a physiologically based pharmacokinetic (PBPK) model of pregnancy to assess how much of the PFAS–birth weight association observed in epidemiologic studies might be attributable to GFR. Methods: We modified a PBPK model to reflect the association of GFR with birth weight (estimated from three studies of GFR and birth weight) and used it to simulate PFAS concentrations in maternal and cord plasma. The model was run 250,000 times, with variation in parameters, to simulate a population. Simulated data were analyzed to evaluate the association between PFAS levels and birth weight due to GFR. We compared simulated estimates with those from a meta-analysis of epidemiologic data. Results: The reduction in birth weight for each 1-ng/mL increase in simulated cord plasma perfluorooctane sulfonate (PFOS) was 2.72 g (95% CI: –3.40, –2.04), and for perfluorooctanoic acid (PFOA) it was 7.13 g (95% CI: –8.46, –5.80); results based on maternal plasma at term were similar. Results were sensitive to variations in PFAS level distributions and the strength of the GFR–birth weight association. In comparison, our meta-analysis of epidemiologic studies suggested that each 1-ng/mL increase in prenatal PFOS and PFOA levels was associated with 5.00 g (95% CI: –8.92, –1.09) and 14.72 g (95% CI: –21.66, –7.78) reductions in birth weight, respectively. Conclusion: Results of our simulations suggest that a substantial proportion of the association between prenatal PFAS and birth weight may be attributable to confounding by GFR, and that confounding by GFR may be more important in studies with sample collection later in pregnancy. 
Citation Verner MA, Loccisano AE, Morken NH, Yoon M, Wu H, McDougall R, Maisonet M, Marcus M, Kishi R, Miyashita C, Chen MH, Hsieh WS, Andersen ME, Clewell HJ III, Longnecker MP. 2015. Associations of perfluoroalkyl substances (PFAS) with lower birth weight: an evaluation of potential confounding by glomerular filtration rate using a physiologically based pharmacokinetic model (PBPK). Environ Health Perspect 123:1317–1324; http://dx.doi.org/10.1289/ehp.1408837 PMID:26008903

  5. An evaluation of the accuracy of small-area demographic estimates of population at risk and its effect on prevalence statistics

    PubMed Central

    2013-01-01

Demographic estimates of population at risk often underpin epidemiologic research and public health surveillance efforts. Despite their central importance to epidemiology and public-health practice, little previous attention has been paid to evaluating the magnitude of errors associated with such estimates or the sensitivity of epidemiologic statistics to these errors. Although it is well known that the accuracy of demographic estimates declines as the size of the population to be estimated decreases, demographers continue to face pressure to produce estimates for increasingly fine-grained population characteristics at ever-smaller geographic scales. Unfortunately, little guidance on the magnitude of errors that can be expected in such estimates is currently available in the literature for consideration in small-area epidemiology. This paper attempts to fill this gap by producing a Vintage 2010 set of single-year-of-age estimates for census tracts, then evaluating their accuracy and precision in light of the results of the 2010 Census. These estimates are produced and evaluated for 499 census tracts in New Mexico, for single years of age from 0 to 21 and for each sex individually. The error distributions associated with these estimates are characterized using non-parametric statistics, including the median and the 2.5th and 97.5th percentiles. The impact of these errors is considered through simulations in which observed and estimated 2010 population counts are used as alternative denominators and simulated event counts are used to compute a realistic range of prevalence values. The implications of the results of this study for small-area epidemiologic research in cancer and environmental health are considered. PMID:24359344
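
The sensitivity of prevalence statistics to denominator error can be illustrated directly: with a fixed event count, the relative bias in prevalence is the reciprocal of the relative error in the estimated population at risk. A small sketch (the 10% undercount and the event count are hypothetical values, not results from the study):

```python
def prevalence(events, population):
    # prevalence as a simple proportion: events / population at risk
    return events / population

true_pop, est_pop = 1000, 900   # hypothetical 10% undercount in the estimate
events = 50

p_true = prevalence(events, true_pop)
p_est = prevalence(events, est_pop)

# The bias in prevalence is the reciprocal of the relative denominator
# error: p_est / p_true == true_pop / est_pop, here an 11% overstatement.
print(round(p_est / p_true, 4))  # → 1.1111
```

This is why, as the paper argues, the percentile spread of small-area denominator errors translates directly into a range of plausible prevalence values.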

  6. A clinical research analytics toolkit for cohort study.

    PubMed

    Yu, Yiqin; Zhu, Yu; Sun, Xingzhi; Tao, Ying; Zhang, Shuo; Xu, Linhao; Pan, Yue

    2012-01-01

    This paper presents a clinical informatics toolkit that can assist physicians to conduct cohort studies effectively and efficiently. The toolkit has three key features: 1) support of procedures defined in epidemiology, 2) recommendation of statistical methods in data analysis, and 3) automatic generation of research reports. On one hand, our system can help physicians control research quality by leveraging the integrated knowledge of epidemiology and medical statistics; on the other hand, it can improve productivity by reducing the complexities for physicians during their cohort studies.

  7. A Data Base Management System for Clinical and Epidemiologic Studies In Systemic Lupus Erythematosus: Design and Maintenance

    PubMed Central

    Kosmides, Victoria S.; Hochberg, Marc C.

    1984-01-01

    This report describes the development, design specifications, features and implementation of a data base management system (DBMS) for clinical and epidemiologic studies in SLE. The DBMS is multidimensional with arrays formulated across patients, studies and variables. The major impact of this DBMS has been to increase the efficiency of managing and analyzing vast amounts of clinical and laboratory data and, as a result, to allow for continued growth in research productivity in areas related to SLE.

  8. An operational epidemiological model for calibrating agent-based simulations of pandemic influenza outbreaks.

    PubMed

    Prieto, D; Das, T K

    2016-03-01

The uncertainty of pandemic influenza viruses continues to pose major preparedness challenges for public health policymakers. Decisions to mitigate influenza outbreaks often involve a tradeoff between the social costs of interventions (e.g., school closure) and the cost of uncontrolled spread of the virus. To achieve a balance, policymakers must assess the impact of mitigation strategies once an outbreak begins and the virus characteristics are known. Agent-based (AB) simulation is a useful tool for building highly granular disease spread models that incorporate the epidemiological features of the virus as well as the demographic and social behavioral attributes of tens of millions of affected people. Such disease spread models provide an excellent basis on which various mitigation strategies can be tested before they are adopted and implemented by policymakers. However, to serve as a testbed for mitigation strategies, AB simulation models must be operational, and a critical requirement for operational AB models is that they be amenable to quick and simple calibration. The calibration process works as follows: the AB model accepts information available from the field and uses it to update its parameters such that some of its outputs in turn replicate the field data. In this paper, we present an epidemiological-model-based calibration methodology that has low computational complexity and is easy to interpret. Our model accepts a field estimate of the basic reproduction number and then uses it to update (calibrate) the infection probabilities such that their effect, combined with the effects of the given virus epidemiology, demographics, and social behavior, results in an infection pattern yielding a similar value of the basic reproduction number. We evaluate the accuracy of the calibration methodology by applying it to an AB simulation model mimicking a regional outbreak in the US. The calibrated model is shown to yield infection patterns closely replicating the input estimates of the basic reproduction number. The calibration method is also shown to replicate the initial infection incidence trend of an H1N1 outbreak like that of 2009.
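
The calibration loop described above can be sketched as a one-dimensional search: because the reproduction number grows monotonically with the per-contact infection probability, a field estimate of R0 pins down that probability. The sketch below replaces the AB simulation with a closed-form homogeneous-mixing expectation; all names and parameter values are illustrative assumptions, not the authors' model:

```python
def expected_r0(p_inf, contacts_per_day=10, infectious_days=4):
    # Homogeneous-mixing stand-in for the AB model: in a fully
    # susceptible population the index case makes
    # contacts_per_day * infectious_days contacts, each transmitting
    # with probability p_inf.
    return p_inf * contacts_per_day * infectious_days

def calibrate(r0_target, lo=0.0, hi=1.0, tol=1e-8):
    # Bisection: expected_r0 is monotone increasing in p_inf, so we
    # search for the infection probability that reproduces the field
    # estimate of the basic reproduction number.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if expected_r0(mid) < r0_target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

p = calibrate(1.8)                  # field estimate R0 = 1.8
print(round(expected_r0(p), 4))     # → 1.8
```

In the actual methodology the expectation would be replaced by (or fitted against) the AB model's simulated infection pattern, but the monotone search structure is the same.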

  9. Support to triage and public risk perception considering long-term response to a Cs-137 radiological dispersive device scenario.

    PubMed

    Andrade, Cristiane Ps; Souza, Cláudio J; Camerini, Eduardo Sn; Alves, Isabela S; Vital, Hélio C; Healy, Matthew Jf; Ramos De Andrade, Edson

    2018-06-01

A radiological dispersive device (RDD) spreads radioactive material, complicates the treatment of physical injuries, raises cancer risk, and induces disproportionate fear. Simulating such an event enables more effective and efficient utilization of the triage and treatment resources of staff, facilities, and space. Fast simulation can give detail on events in progress or on future events. The resources for triage and treatment of contaminated trauma victims can differ from those needed for pure-exposure individuals, while discouraging the "worried well" from presenting during the crisis phase through media announcements would relieve pressure on hospital facilities. The proposed methodology integrates capabilities from different platforms into three phases: (a) scenario simulation, (b) data generation, and (c) risk assessment for triage focused on follow-up epidemiological assessment. Simulations typically indicate that most of the affected population does not require immediate medical assistance. Medical triage for the few severely injured and radiological triage to reduce radioactive contamination will always be the priority. For this study, however, higher priority should be given to individuals from radiological "warm" and "hot" zones, as required by risk criteria. The proposed methodology could thus help to (a) filter and reduce the number of individuals to be attended, (b) optimize the prioritization of medical care, (c) reduce or prepare for future costs, (d) locate the operational triage site so as to avoid possible contamination of the main facility, and (e) provide the scientific data needed to develop an adequate approach to risk and its proper communication.

  10. Resource recovery and epidemiology of anaerobic wastewater treatment process in a controlled ecological life support system

    NASA Technical Reports Server (NTRS)

    Li, Ku-Yen; Hunt, Madelyn D.

    1995-01-01

    The results of work accomplished under two different areas: (1) Resource Recovery of an Anaerobic Wastewater Treatment process, and (2) Epidemiological Study of an Anaerobic Wastewater Treatment Process are documented. The first part of the work was to set up and test three anaerobic digesters and then run these three digesters with a NASA-simulated wastewater. The second part of the work was to use a multi-drug resistant strain of Salmonella choleraesuis as the indicator bacteria for the epidemiological study. Details of these two parts can be found in two master's theses and are described in Sections 3 and 4 of this report. Several important results condensed from these two parts are summarized in Section 2.

  11. An epidemiologic simulation model of the spread and control of highly pathogenic avian influenza (H5N1) among commercial and backyard poultry flocks in South Carolina, United States.

    PubMed

    Patyk, Kelly A; Helm, Julie; Martin, Michael K; Forde-Folle, Kimberly N; Olea-Popelka, Francisco J; Hokanson, John E; Fingerlin, Tasha; Reeves, Aaron

    2013-07-01

    Epidemiologic simulation modeling of highly pathogenic avian influenza (HPAI) outbreaks provides a useful conceptual framework with which to estimate the consequences of HPAI outbreaks and to evaluate disease control strategies. The purposes of this study were to establish detailed and informed input parameters for an epidemiologic simulation model of the H5N1 strain of HPAI among commercial and backyard poultry in the state of South Carolina in the United States using a highly realistic representation of this poultry population; to estimate the consequences of an outbreak of HPAI in this population with a model constructed from these parameters; and to briefly evaluate the sensitivity of model outcomes to several parameters. Parameters describing disease state durations; disease transmission via direct contact, indirect contact, and local-area spread; and disease detection, surveillance, and control were established through consultation with subject matter experts, a review of the current literature, and the use of several computational tools. The stochastic model constructed from these parameters produced simulated outbreaks ranging from 2 to 111 days in duration (median 25 days), during which 1 to 514 flocks were infected (median 28 flocks). Model results were particularly sensitive to the rate of indirect contact that occurs among flocks. The baseline model established in this study can be used in the future to evaluate various control strategies, as a tool for emergency preparedness and response planning, and to assess the costs associated with disease control and the economic consequences of a disease outbreak. Published by Elsevier B.V.

  12. Discrete Event Modeling and Massively Parallel Execution of Epidemic Outbreak Phenomena

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S; Seal, Sudip K

    2011-01-01

In complex phenomena such as epidemiological outbreaks, the intensity of inherent feedback effects and the significant role of transients in the dynamics make simulation the only effective method for proactive, reactive, or post-facto analysis. The spatial scale, runtime speed, and behavioral detail needed in detailed simulations of epidemic outbreaks make it necessary to use large-scale parallel processing. Here, an optimistic parallel execution of a new discrete event formulation of a reaction-diffusion simulation model of epidemic propagation is presented to dramatically increase the fidelity and speed with which epidemiological simulations can be performed. Rollback support needed during optimistic parallel execution is achieved by combining reverse computation with a small amount of incremental state saving. A parallel speedup of over 5,500 and other runtime performance metrics are observed with weak-scaling execution on a small (8,192-core) Blue Gene/P system, while scalability with a weak-scaling speedup of over 10,000 is demonstrated on 65,536 cores of a large Cray XT5 system. Scenarios representing large population sizes, exceeding several hundred million individuals in the largest cases, are successfully exercised to verify model scalability.

  13. A Review of Exposure Assessment Methods in Epidemiological Studies on Incinerators

    PubMed Central

    Ranzi, Andrea; De Leo, Giulio A.; Lauriola, Paolo

    2013-01-01

Incineration is a common technology for waste disposal, and there is public concern about the health impact of incinerators. Poor exposure assessment has been claimed as one of the main causes of inconsistency in the epidemiological literature. We reviewed 41 studies on incinerators published between 1984 and January 2013 and classified them on the basis of their exposure assessment approach. Moreover, we performed a simulation study to explore how different exposure metrics may influence the exposure levels used in epidemiological studies. Nineteen studies used linear distance as a measure of exposure to incinerators, 11 used atmospheric dispersion models, and the remaining 11 used a qualitative variable such as presence/absence of the source. All reviewed studies used residence as a proxy for population exposure, although residence location was evaluated with different precision (e.g., municipality, census block, or exact address). Only one study reconstructed temporal variability in exposure. Our simulation study showed a notable degree of exposure misclassification caused by the use of distance compared to dispersion modelling. We suggest that future studies (i) make full use of pollution dispersion models; (ii) localize populations at a fine scale; and (iii) explicitly account for potential environmental and socioeconomic confounding. PMID:23840228

  14. CMOST: an open-source framework for the microsimulation of colorectal cancer screening strategies.

    PubMed

    Prakash, Meher K; Lang, Brian; Heinrich, Henriette; Valli, Piero V; Bauerfeind, Peter; Sonnenberg, Amnon; Beerenwinkel, Niko; Misselwitz, Benjamin

    2017-06-05

Colorectal cancer (CRC) is a leading cause of cancer-related mortality. CRC incidence and mortality can be reduced by several screening strategies, including colonoscopy, but randomized CRC prevention trials face significant obstacles such as the need for large study populations with long follow-up. Therefore, CRC screening strategies will likely be designed and optimized based on computer simulations. Several computational microsimulation tools have been reported for estimating the efficiency and cost-effectiveness of CRC prevention, but none of these tools is publicly available. There is a need for an open-source framework to answer practical questions, including testing new screening interventions and adapting findings to local conditions. We developed and implemented a new microsimulation model, the Colon Modeling Open Source Tool (CMOST), for modeling the natural history of CRC, simulating the effects of CRC screening interventions, and calculating the resulting costs. CMOST facilitates automated parameter calibration against epidemiological adenoma prevalence and CRC incidence data. Predictions of CMOST were highly similar to the results of a large endoscopic CRC prevention trial and to the predictions of existing microsimulation models. We applied CMOST to calculate the optimal timing of a screening colonoscopy. CRC incidence and mortality are reduced most efficiently by a colonoscopy between the ages of 56 and 59, while discounted life years gained (LYG) is maximal for screening at 49-50 years. With a dwell time of 13 years, the most cost-effective screening age is 59 years, at $17,211 discounted USD per LYG. While cost-efficiency varied according to dwell time, dwell time did not influence the optimal time point of screening interventions within the tested range. This open-source tool will enable health-economics analyses for various countries, health-care scenarios, and CRC prevention strategies. CMOST is freely available under the GNU General Public License at https://gitlab.com/misselwb/CMOST.

  15. Multiple Imputation for Incomplete Data in Epidemiologic Studies

    PubMed Central

    Harel, Ofer; Mitchell, Emily M; Perkins, Neil J; Cole, Stephen R; Tchetgen Tchetgen, Eric J; Sun, BaoLuo; Schisterman, Enrique F

    2018-01-01

Epidemiologic studies are frequently susceptible to missing information. Omitting observations with missing variables remains a common strategy in epidemiologic studies, yet this simple approach can severely bias parameter estimates of interest if the values are not missing completely at random. Even when missingness is completely random, complete-case analysis can reduce the efficiency of estimated parameters, because large amounts of available data are simply tossed out with the incomplete observations. Alternative methods for mitigating the influence of missing information, such as multiple imputation, are becoming an increasingly popular strategy to retain all available information, reduce potential bias, and improve efficiency in parameter estimation. In this paper, we describe the theoretical underpinnings of multiple imputation, and we illustrate application of this method as part of a collaborative challenge to assess the performance of various techniques for dealing with missing data (Am J Epidemiol. 2018;187(3):568–575). We detail the steps necessary to perform multiple imputation on a subset of data from the Collaborative Perinatal Project (1959–1974), where the goal is to estimate the odds of spontaneous abortion associated with smoking during pregnancy. PMID:29165547
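
After the m imputed datasets are analyzed separately, the m results are pooled with Rubin's rules: the point estimates are averaged, and the total variance combines within- and between-imputation components. A self-contained sketch (the five estimates and variances are made-up numbers, not results from the Collaborative Perinatal Project analysis):

```python
def rubin_pool(estimates, variances):
    # Rubin's rules for combining m completed-data analyses:
    #   pooled estimate  Q_bar = mean of the m estimates
    #   within variance  W     = mean of the m variances
    #   between variance B     = sample variance of the m estimates
    #   total variance   T     = W + (1 + 1/m) * B
    m = len(estimates)
    q_bar = sum(estimates) / m
    w = sum(variances) / m
    b = sum((q - q_bar) ** 2 for q in estimates) / (m - 1)
    t = w + (1 + 1 / m) * b
    return q_bar, t

# Hypothetical log-odds estimates and variances from m = 5 imputations
est, tot_var = rubin_pool([0.42, 0.47, 0.40, 0.45, 0.44], [0.010] * 5)
print(round(est, 3))  # → 0.436
```

The between-imputation term B is what distinguishes multiple imputation from single imputation: it propagates the uncertainty due to the missing values into the pooled standard error.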

  16. Finding the probability of infection in an SIR network is NP-Hard

    PubMed Central

    Shapiro, Michael; Delgado-Eckert, Edgar

    2012-01-01

It is the purpose of this article to review results that have long been known to communications network engineers and have direct application to epidemiology on networks. A common approach in epidemiology is to study the transmission of a disease in a population where each individual is initially susceptible (S), may become infective (I) and then removed or recovered (R), and plays no further epidemiological role. Much of the recent work gives explicit consideration to the network of social interactions or disease-transmitting contacts and the attendant probability of transmission for each interacting pair. The state of such a network is an assignment of the values {S, I, R} to its members. Given such a network, an initial state, and a particular susceptible individual, we would like to compute their probability of becoming infected in the course of an epidemic. It turns out that this and related problems are NP-hard; in particular, they belong to a class of problems for which no efficient solution algorithms are known. Moreover, finding an efficient algorithm for any problem in this class would entail a major breakthrough in theoretical computer science. PMID:22824138
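
The source of the hardness is easy to see in a small example: computing the exact infection probability means summing over every possible outcome of every transmitting contact, which grows as 2^|E| in the number of edges. A brute-force sketch on a four-node toy network (the graph and transmission probability are illustrative assumptions, not from the article):

```python
import itertools

# Independent-cascade SIR: each directed contact transmits with
# probability p. The exact answer requires summing over all 2^|E|
# edge outcomes -- exponential in network size, consistent with the
# NP-hardness result for general networks.
edges = [(0, 1), (0, 2), (1, 3), (2, 3)]   # toy contact network
p = 0.5                                     # per-contact transmission prob.
source, target = 0, 3                       # initial infective, individual of interest

def reachable(live_edges, src):
    # nodes reachable from src through the contacts that "fired"
    seen, stack = {src}, [src]
    while stack:
        u = stack.pop()
        for a, b in live_edges:
            if a == u and b not in seen:
                seen.add(b)
                stack.append(b)
    return seen

prob = 0.0
for outcome in itertools.product([0, 1], repeat=len(edges)):
    live = [e for e, on in zip(edges, outcome) if on]
    weight = 1.0
    for on in outcome:
        weight *= p if on else (1 - p)
    if target in reachable(live, source):
        prob += weight
print(prob)  # → 0.4375, the exact infection probability of node 3
```

For four edges this loop has 16 terms; for a realistic contact network it is astronomically large, which is why Monte Carlo simulation, rather than exact computation, is the practical tool.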

  17. [Current concept of the role of the hygienic-epidemiologic service under special conditions and of the protection of human environment].

    PubMed

    Zdravkovic, A; Dordevic, S; Andelković, N

    1975-01-01

The contemporary role and tasks of the hygienic-epidemiological service are the outcome of significant social, demographic, economic, and health changes resulting from general social development. Natural phenomena such as elemental catastrophes, pollution of the human environment, and the permanent threat of war breaking out in different places all set special tasks for the hygienic-epidemiological service and determine its role. The contemporary role of the service is to provide a scientific approach to the efficient solving of topical hygienic-epidemiological problems that originate in newly created ecological conditions, in the disturbed balance of nature, and in changes in the social structure of the population. The tasks of the hygienic-epidemiological service are classified according to the role and purpose of each institution and the territory where the institution is situated, depending on whether the situation is regular or special. In regular situations, the tasks concern protection of the living environment and prevention of mass diseases, contagious and otherwise. In special situations, such as floods, earthquakes, outbreaks of large epidemics, or the occurrence of quarantine diseases, the aim is to prevent, or at least lessen, the consequences of these catastrophes.

  18. External Source of Infection and Nutritional Efficiency Control Chaos in a Predator-Prey Model with Disease in the Predator

    NASA Astrophysics Data System (ADS)

    Pada Das, Krishna; Roy, Prodip; Ghosh, Subhabrata; Maiti, Somnath

This paper deals with an eco-epidemiological model in which disease circulates through the predator species. Disease can circulate in the predator species by contact as well as through external sources. Here, we discuss the role of an external source of infection, along with nutritional value, in the system dynamics. To establish our findings, we work out the local and global stability analysis of the equilibrium points, together with a Hopf bifurcation analysis associated with the interior equilibrium point. The ecological basic reproduction number and the disease basic reproduction number (basic reproductive ratio) are obtained, and we analyze the community structure of the system with their help. Further, we pay attention to the chaotic dynamics produced by disease circulating in the predator species by contact. Our numerical simulations reveal that the eco-epidemiological system without an external source of infection exhibits chaotic dynamics for increasing force of infection due to contact, whereas in the presence of an external source of infection it exhibits stable solutions. It is also observed that nutritional value can prevent chaotic dynamics. We conclude that chaotic dynamics can be controlled by the external source of infection as well as by nutritional value. We apply basic tools of nonlinear dynamics, such as the Poincaré section and the maximum Lyapunov exponent, to investigate the chaotic behavior of the system.

  19. Quasi-Steady Simulations for the Efficient Generation of Static Aerodynamic Coefficients at Subsonic Velocity

    DTIC Science & Technology

    2016-09-01

    ARL-TR-7790 ● SEP 2016, US Army Research Laboratory. Report by Sidra I Silton; dates covered: December 2014–April 2015. Title: Quasi-Steady Simulations for the Efficient Generation of Static Aerodynamic Coefficients at Subsonic Velocity.

  20. On efficiency of fire simulation realization: parallelization with greater number of computational meshes

    NASA Astrophysics Data System (ADS)

    Valasek, Lukas; Glasa, Jan

    2017-12-01

    Current fire simulation systems are capable of utilizing the advantages of available high-performance computing (HPC) platforms and of modeling fires efficiently in parallel. In this paper, the efficiency of a corridor fire simulation on an HPC cluster is discussed. The parallel MPI version of the Fire Dynamics Simulator is used to test the efficiency of selected strategies for allocating the cluster's computational resources when a greater number of computational cores is used. Simulation results indicate that if the number of cores used is not a multiple of the number of cores per cluster node, there are allocation strategies that provide more efficient calculations.
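
    The allocation effect can be illustrated with a toy model of block placement: one MPI rank per computational mesh, ranks filled node by node. The helper below is hypothetical (real placement is controlled by the scheduler, e.g. Slurm), but it shows why a request that is not a multiple of the per-node core count leaves one node under-filled:

```python
def per_node_ranks(n_ranks, cores_per_node):
    """Block-allocate n_ranks MPI ranks onto nodes with cores_per_node
    cores each; return the number of ranks placed on each node."""
    loads = []
    while n_ranks > 0:
        take = min(n_ranks, cores_per_node)
        loads.append(take)
        n_ranks -= take
    return loads

# 40 ranks on 16-core nodes -> [16, 16, 8]: the third node is under-filled,
# and in a tightly coupled run the most loaded nodes gate every
# synchronization step.
```

    Whether the under-filled or the packed layout wins depends on memory-bandwidth contention on the full nodes, which is exactly the kind of trade-off the allocation strategies in the paper probe.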

  1. Controlling for seasonal patterns and time varying confounders in time-series epidemiological models: a simulation study.

    PubMed

    Perrakis, Konstantinos; Gryparis, Alexandros; Schwartz, Joel; Le Tertre, Alain; Katsouyanni, Klea; Forastiere, Francesco; Stafoggia, Massimo; Samoli, Evangelia

    2014-12-10

    An important topic when estimating the effect of air pollutants on human health is choosing the best method to control for seasonal patterns and time-varying confounders, such as temperature and humidity. Semi-parametric Poisson time-series models include smooth functions of calendar time and weather effects to control for potential confounders. Case-crossover (CC) approaches are considered efficient alternatives that control seasonal confounding by design and allow inclusion of smooth functions of weather confounders through their equivalent Poisson representations. We evaluate both methodological designs with respect to seasonal control and compare spline-based approaches, using natural splines and penalized splines, and two time-stratified CC approaches. For the spline-based methods, we consider fixed degrees of freedom, minimization of the partial autocorrelation function, and generalized cross-validation as smoothing criteria. Issues of model misspecification with respect to weather confounding are investigated under simulation scenarios, which allow quantifying omitted, misspecified, and irrelevant-variable bias. The simulations are based on fully parametric mechanisms designed to replicate two datasets with different mortality and atmospheric patterns. Overall, minimum partial autocorrelation function approaches provide more stable results for high mortality counts and strong seasonal trends, whereas natural splines with fixed degrees of freedom perform better for low mortality counts and weak seasonal trends, followed by the time-season-stratified CC model, which performs equally well in terms of bias but yields higher standard errors. Copyright © 2014 John Wiley & Sons, Ltd.
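
    The confounding mechanism studied here can be reproduced in a few lines: simulate counts whose log-rate contains both a pollutant effect and a seasonal signal correlated with the pollutant, then fit Poisson regressions with and without seasonal control. The sketch below uses a from-scratch Fisher-scoring fit and simple harmonic terms in place of splines; all names and parameter values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
t = np.arange(n)
season = 0.4 * np.sin(2 * np.pi * t / 365) + 0.3 * np.cos(2 * np.pi * t / 365)
pollution = rng.normal(size=n) + 0.5 * np.sin(2 * np.pi * t / 365)  # seasonal
y = rng.poisson(np.exp(2.0 + 0.05 * pollution + season))  # true effect 0.05

def poisson_irls(X, y, n_iter=30):
    """Fisher scoring for a log-link Poisson GLM (from scratch; in
    practice one would use a GLM routine with spline terms)."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean())            # sensible starting intercept
    for _ in range(n_iter):
        mu = np.exp(X @ beta)             # Poisson: weight = variance = mu
        beta = beta + np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (y - mu))
    return beta

X_full = np.column_stack([np.ones(n), pollution,
                          np.sin(2 * np.pi * t / 365),
                          np.cos(2 * np.pi * t / 365)])
beta_hat = poisson_irls(X_full, y)           # seasonality controlled
beta_naive = poisson_irls(X_full[:, :2], y)  # seasonality omitted
```

    Omitting the harmonics inflates the pollution coefficient (classic omitted-variable bias), which is the kind of misspecification the simulation scenarios in this study quantify.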

  2. Accounting for selection bias in association studies with complex survey data.

    PubMed

    Wirth, Kathleen E; Tchetgen Tchetgen, Eric J

    2014-05-01

    Obtaining representative information from hidden and hard-to-reach populations is fundamental to describe the epidemiology of many sexually transmitted diseases, including HIV. Unfortunately, simple random sampling is impractical in these settings, as no registry of names exists from which to sample the population at random. However, complex sampling designs can be used, as members of these populations tend to congregate at known locations, which can be enumerated and sampled at random. For example, female sex workers may be found at brothels and street corners, whereas injection drug users often come together at shooting galleries. Despite the logistical appeal, complex sampling schemes lead to unequal probabilities of selection, and failure to account for this differential selection can result in biased estimates of population averages and relative risks. However, standard techniques to account for selection can lead to substantial losses in efficiency. Consequently, researchers implement a variety of strategies in an effort to balance validity and efficiency. Some researchers fully or partially account for the survey design, whereas others do nothing and treat the sample as a realization of the population of interest. We use directed acyclic graphs to show how certain survey sampling designs, combined with subject-matter considerations unique to individual exposure-outcome associations, can induce selection bias. Finally, we present a novel yet simple maximum likelihood approach for analyzing complex survey data; this approach optimizes statistical efficiency at no cost to validity. We use simulated data to illustrate this method and compare it with other analytic techniques.
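
    The selection problem described here, and the standard inverse-probability fix, can be demonstrated with a toy venue-based sample. The population, "visibility" mechanism and constants below are all invented for illustration; the authors' maximum likelihood approach is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000
# hypothetical hidden population: outcome prevalence rises with "visibility",
# the chance of being found at an enumerable venue
visibility = rng.uniform(0.1, 1.0, size=N)
outcome = (rng.uniform(size=N) < 0.1 + 0.2 * visibility).astype(float)

# venue-based sampling: selection probability proportional to visibility
p_select = 0.05 * visibility / visibility.max()
sampled = rng.uniform(size=N) < p_select

naive = outcome[sampled].mean()              # ignores the design: biased up
weights = 1.0 / p_select[sampled]            # Horvitz-Thompson weights
weighted = np.sum(weights * outcome[sampled]) / np.sum(weights)
truth = outcome.mean()
```

    The unweighted mean over-represents high-visibility (high-prevalence) members, while the weighted estimator recovers the population prevalence, at the cost of higher variance — the validity/efficiency trade-off the abstract discusses.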

  3. Improved Horvitz-Thompson Estimation of Model Parameters from Two-phase Stratified Samples: Applications in Epidemiology

    PubMed Central

    Breslow, Norman E.; Lumley, Thomas; Ballantyne, Christie M; Chambless, Lloyd E.; Kulich, Michal

    2009-01-01

    The case-cohort study involves two-phase sampling: simple random sampling from an infinite super-population at phase one and stratified random sampling from a finite cohort at phase two. Standard analyses of case-cohort data involve solution of inverse probability weighted (IPW) estimating equations, with weights determined by the known phase two sampling fractions. The variance of parameter estimates in (semi)parametric models, including the Cox model, is the sum of two terms: (i) the model based variance of the usual estimates that would be calculated if full data were available for the entire cohort; and (ii) the design based variance from IPW estimation of the unknown cohort total of the efficient influence function (IF) contributions. This second variance component may be reduced by adjusting the sampling weights, either by calibration to known cohort totals of auxiliary variables correlated with the IF contributions or by their estimation using these same auxiliary variables. Both adjustment methods are implemented in the R survey package. We derive the limit laws of coefficients estimated using adjusted weights. The asymptotic results suggest practical methods for construction of auxiliary variables that are evaluated by simulation of case-cohort samples from the National Wilms Tumor Study and by log-linear modeling of case-cohort data from the Atherosclerosis Risk in Communities Study. Although not semiparametric efficient, estimators based on adjusted weights may come close to achieving full efficiency within the class of augmented IPW estimators. PMID:20174455

  4. Accounting for response misclassification and covariate measurement error improves power and reduces bias in epidemiologic studies.

    PubMed

    Cheng, Dunlei; Branscum, Adam J; Stamey, James D

    2010-07-01

    To quantify the impact of ignoring misclassification of a response variable and measurement error in a covariate on statistical power, and to develop software for sample size and power analysis that accounts for these flaws in epidemiologic data. A Monte Carlo simulation-based procedure is developed to illustrate the differences in design requirements and inferences between analytic methods that properly account for misclassification and measurement error to those that do not in regression models for cross-sectional and cohort data. We found that failure to account for these flaws in epidemiologic data can lead to a substantial reduction in statistical power, over 25% in some cases. The proposed method substantially reduced bias by up to a ten-fold margin compared to naive estimates obtained by ignoring misclassification and mismeasurement. We recommend as routine practice that researchers account for errors in measurement of both response and covariate data when determining sample size, performing power calculations, or analyzing data from epidemiological studies. 2010 Elsevier Inc. All rights reserved.
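
    The covariate-error side of the problem can be shown in a few lines: classical measurement error attenuates a regression slope by the reliability ratio, and knowing the error variance lets one correct it. A sketch under simple assumptions (linear model, known error variance; the paper's treatment of response misclassification is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000
sigma_x, sigma_u = 1.0, 0.8                  # true-exposure and error SDs
x = rng.normal(0, sigma_x, n)                # true covariate
w = x + rng.normal(0, sigma_u, n)            # mismeasured covariate
y = 1.5 * x + rng.normal(0, 1, n)            # outcome; true slope = 1.5

slope = lambda a, b: np.cov(a, b)[0, 1] / np.var(a)
naive = slope(w, y)                          # attenuated toward zero
reliability = sigma_x**2 / (sigma_x**2 + sigma_u**2)
corrected = naive / reliability              # regression-calibration-style fix
```

    With these values the reliability ratio is 1/1.64 ≈ 0.61, so the naive slope lands near 0.91 instead of 1.5 — the bias that, as the abstract notes, also erodes statistical power if ignored at the design stage.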

  5. Malaria and global change: Insights, uncertainties and possible surprises

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, P.H.; Steel, A.

    Malaria may change with global change. Indeed, global change may affect malaria risk and malaria epidemiology. Malaria risk may change in response to a greenhouse warming; malaria epidemiology, in response to the social, economic, and political developments which a greenhouse warming may trigger. To date, malaria receptivity and epidemiology futures have been explored within the context of equilibrium studies. Equilibrium studies of climate change postulate an equilibrium present climate (the starting point) and a doubled-carbon-dioxide climate (the end point), simulate conditions in both instances, and compare the two. What happens while climate changes, i.e., between the starting point and the end point, is ignored. The present paper focuses on malaria receptivity and addresses what equilibrium studies miss, namely transient malaria dynamics.

  6. Integrating Phylodynamics and Epidemiology to Estimate Transmission Diversity in Viral Epidemics

    PubMed Central

    Magiorkinis, Gkikas; Sypsa, Vana; Magiorkinis, Emmanouil; Paraskevis, Dimitrios; Katsoulidou, Antigoni; Belshaw, Robert; Fraser, Christophe; Pybus, Oliver George; Hatzakis, Angelos

    2013-01-01

    The epidemiology of chronic viral infections, such as those caused by Hepatitis C Virus (HCV) and Human Immunodeficiency Virus (HIV), is affected by the risk group structure of the infected population. Risk groups are defined by each of their members having acquired infection through a specific behavior. However, risk group definitions say little about the transmission potential of each infected individual. Variation in the number of secondary infections is extremely difficult to estimate for HCV and HIV but crucial in the design of efficient control interventions. Here we describe a novel method that combines epidemiological and population genetic approaches to estimate the variation in transmissibility of rapidly-evolving viral epidemics. We evaluate this method using a nationwide HCV epidemic and for the first time co-estimate viral generation times and superspreading events from a combination of molecular and epidemiological data. We anticipate that this integrated approach will form the basis of powerful tools for describing the transmission dynamics of chronic viral diseases, and for evaluating control strategies directed against them. PMID:23382662

  7. Emerging trends in geospatial artificial intelligence (geoAI): potential applications for environmental epidemiology.

    PubMed

    VoPham, Trang; Hart, Jaime E; Laden, Francine; Chiang, Yao-Yi

    2018-04-17

    Geospatial artificial intelligence (geoAI) is an emerging scientific discipline that combines innovations in spatial science, artificial intelligence methods in machine learning (e.g., deep learning), data mining, and high-performance computing to extract knowledge from spatial big data. In environmental epidemiology, exposure modeling is a commonly used approach to conduct exposure assessment to determine the distribution of exposures in study populations. geoAI technologies provide important advantages for exposure modeling in environmental epidemiology, including the ability to incorporate large amounts of big spatial and temporal data in a variety of formats; computational efficiency; flexibility in algorithms and workflows to accommodate relevant characteristics of spatial (environmental) processes including spatial nonstationarity; and scalability to model other environmental exposures across different geographic areas. The objectives of this commentary are to provide an overview of key concepts surrounding the evolving and interdisciplinary field of geoAI including spatial data science, machine learning, deep learning, and data mining; recent geoAI applications in research; and potential future directions for geoAI in environmental epidemiology.

  8. A Systematic Bayesian Integration of Epidemiological and Genetic Data

    PubMed Central

    Lau, Max S. Y.; Marion, Glenn; Streftaris, George; Gibson, Gavin

    2015-01-01

    Genetic sequence data on pathogens have great potential to inform inference of their transmission dynamics ultimately leading to better disease control. Where genetic change and disease transmission occur on comparable timescales additional information can be inferred via the joint analysis of such genetic sequence data and epidemiological observations based on clinical symptoms and diagnostic tests. Although recently introduced approaches represent substantial progress, for computational reasons they approximate genuine joint inference of disease dynamics and genetic change in the pathogen population, capturing partially the joint epidemiological-evolutionary dynamics. Improved methods are needed to fully integrate such genetic data with epidemiological observations, for achieving a more robust inference of the transmission tree and other key epidemiological parameters such as latent periods. Here, building on current literature, a novel Bayesian framework is proposed that infers simultaneously and explicitly the transmission tree and unobserved transmitted pathogen sequences. Our framework facilitates the use of realistic likelihood functions and enables systematic and genuine joint inference of the epidemiological-evolutionary process from partially observed outbreaks. Using simulated data it is shown that this approach is able to infer accurately joint epidemiological-evolutionary dynamics, even when pathogen sequences and epidemiological data are incomplete, and when sequences are available for only a fraction of exposures. These results also characterise and quantify the value of incomplete and partial sequence data, which has important implications for sampling design, and demonstrate the abilities of the introduced method to identify multiple clusters within an outbreak. The framework is used to analyse an outbreak of foot-and-mouth disease in the UK, enhancing current understanding of its transmission dynamics and evolutionary process. PMID:26599399

  9. Two-phase designs for joint quantitative-trait-dependent and genotype-dependent sampling in post-GWAS regional sequencing.

    PubMed

    Espin-Garcia, Osvaldo; Craiu, Radu V; Bull, Shelley B

    2018-02-01

    We evaluate two-phase designs to follow up findings from a genome-wide association study (GWAS) when the cost of regional sequencing in the entire cohort is prohibitive. We develop novel expectation-maximization-based inference under a semiparametric maximum likelihood formulation tailored for post-GWAS inference. A GWAS-SNP (where SNP is single nucleotide polymorphism) serves as a surrogate covariate in inferring association between a sequence variant and a normally distributed quantitative trait (QT). We assess test validity and quantify efficiency and power of joint QT-SNP-dependent sampling and analysis under alternative sample allocations by simulations. Joint allocation balanced on SNP genotype and extreme-QT strata yields significant power improvements compared to marginal QT- or SNP-based allocations. We illustrate the proposed method and evaluate the sensitivity of sample allocation to sampling variation using data from a sequencing study of systolic blood pressure. © 2017 The Authors. Genetic Epidemiology Published by Wiley Periodicals, Inc.

  10. A critical review of the field application of a mathematical model of malaria eradication

    PubMed Central

    Nájera, J. A.

    1974-01-01

    A malaria control field research trial in northern Nigeria was planned with the aid of a computer simulation based on Macdonald's mathematical model of malaria epidemiology. Antimalaria attack was based on a combination of mass drug administration (chloroquine and pyrimethamine) and DDT house spraying. The observed results were at great variance with the predictions of the model. The causes of these discrepancies included inadequate estimation of the model's basic variables, and overestimation, in planning the simulation, of the effects of the attack measures and of the degree of perfection attainable by their application. The discrepancies were to a great extent also due to deficiencies in the model. An analysis is made of those considered to be the most important. It is concluded that research efforts should be encouraged to increase our knowledge of the basic epidemiological factors, their variation and correlations, and to formulate more realistic and useful theoretical models. PMID:4156197
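
    Macdonald's model centers on the basic reproduction rate z0 = m a² b pⁿ / (r(−ln p)). A sketch with illustrative parameter values (not calibrated to the Nigerian trial) shows why the model predicts attacks on daily mosquito survival p, such as DDT spraying, to be so effective:

```python
import math

def macdonald_z0(m, a, b, p, n, r):
    """Macdonald's basic reproduction rate for malaria:
    z0 = m a^2 b p^n / (r * (-ln p)).
    m: vector-to-human ratio, a: human-biting rate per mosquito per day,
    b: mosquito-to-human transmission efficiency, p: daily mosquito survival,
    n: extrinsic incubation period (days), r: human recovery rate per day."""
    return m * a**2 * b * p**n / (r * (-math.log(p)))

# illustrative values; survival enters both as p^n and through -ln p,
# so a modest drop in p collapses z0
base = macdonald_z0(m=10, a=0.3, b=0.1, p=0.9, n=10, r=0.01)     # ~29.8
sprayed = macdonald_z0(m=10, a=0.3, b=0.1, p=0.8, n=10, r=0.01)  # ~4.3
```

    Cutting p from 0.9 to 0.8 reduces z0 roughly seven-fold here, which is why planners expected the attack measures to succeed; the trial's discrepancies arose from mis-estimated inputs and the model's structural deficiencies, not from this arithmetic.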

  11. Teaching concepts of clinical measurement variation to medical students.

    PubMed

    Hodder, R A; Longfield, J N; Cruess, D F; Horton, J A

    1982-09-01

    An exercise in clinical epidemiology was developed for medical students to demonstrate the process and limitations of scientific measurement using models that simulate common clinical experiences. All scales of measurement (nominal, ordinal and interval) were used to illustrate concepts of intra- and interobserver variation, systematic error, recording error, and procedural error. In a laboratory, students a) determined blood pressures on six videotaped subjects, b) graded sugar content of unknown solutions from 0 to 4+ using Clinitest tablets, c) measured papules that simulated PPD reactions, d) measured heart and kidney size on X-rays and, e) described a model skin lesion (melanoma). Traditionally, measurement variation is taught in biostatistics or epidemiology courses using previously collected data. Use of these models enables students to produce their own data using measurements commonly employed by the clinician. The exercise provided material for a meaningful discussion of the implications of measurement error in clinical decision-making.

  12. Comparison of four molecular methods to type Salmonella Enteritidis strains.

    PubMed

    Campioni, Fábio; Pitondo-Silva, André; Bergamini, Alzira M M; Falcão, Juliana P

    2015-05-01

    This study compared the pulsed-field gel electrophoresis (PFGE), enterobacterial repetitive intergenic consensus-PCR (ERIC-PCR), multilocus variable-number tandem-repeat analysis (MLVA), and multilocus sequence typing (MLST) methods for typing 188 Salmonella Enteritidis strains from different sources isolated over a 24-year period in Brazil. PFGE and ERIC-PCR were more efficient than MLVA for subtyping the strains. However, MLVA provided additional epidemiological information for those strains. In addition, MLST showed the Brazilian strains as belonging to the main clonal complex of S. Enteritidis, CC11, and provided the first report of two new STs in the S. enterica database but could not properly subtype the strains. Our results showed that the use of PFGE or ERIC-PCR together with MLVA is suitable to efficiently subtype S. Enteritidis strains and provide important epidemiological information. © 2015 APMIS. Published by John Wiley & Sons Ltd.

  13. Why simulation can be efficient: on the preconditions of efficient learning in complex technology based practices.

    PubMed

    Hofmann, Bjørn

    2009-07-23

    It is important to demonstrate learning outcomes of simulation in technology based practices, such as in advanced health care. Although many studies show skills improvement and self-reported change to practice, there are few studies demonstrating patient outcome and societal efficiency. The objective of the study is to investigate if and why simulation can be effective and efficient in a hi-tech health care setting. This is important in order to decide whether and how to design simulation scenarios and outcome studies. Core theoretical insights in Science and Technology Studies (STS) are applied to analyze the field of simulation in hi-tech health care education. In particular, a process-oriented framework where technology is characterized by its devices, methods and its organizational setting is applied. The analysis shows how advanced simulation can address core characteristics of technology beyond the knowledge of technology's functions. Simulation's ability to address skilful device handling as well as purposive aspects of technology provides a potential for effective and efficient learning. However, as technology is also constituted by organizational aspects, such as technology status, disease status, and resource constraints, the success of simulation depends on whether these aspects can be integrated in the simulation setting as well. This represents a challenge for future development of simulation and for demonstrating its effectiveness and efficiency. Assessing the outcome of simulation in education in hi-tech health care settings is worthwhile if core characteristics of medical technology are addressed. This challenges the traditional technical versus non-technical divide in simulation, as organizational aspects appear to be part of technology's core characteristics.

  14. Quantitative basis for component factors of gas flow proportional counting efficiencies

    NASA Astrophysics Data System (ADS)

    Nichols, Michael C.

    This dissertation investigates the counting efficiency calibration of a gas flow proportional counter with beta-particle emitters in order to (1) determine by measurements and simulation the values of the component factors of beta-particle counting efficiency for a proportional counter, (2) compare the simulation results and measured counting efficiencies, and (3) determine the uncertainty of the simulation and measurements. Monte Carlo simulation results by the MCNP5 code were compared with measured counting efficiencies as a function of sample thickness for 14C, 89Sr, 90Sr, and 90Y. The Monte Carlo model simulated strontium carbonate with areal thicknesses from 0.1 to 35 mg cm-2. The samples were precipitated as strontium carbonate with areal thicknesses from 3 to 33 mg cm-2 , mounted on membrane filters, and counted on a low background gas flow proportional counter. The estimated fractional standard deviation was 2--4% (except 6% for 14C) for efficiency measurements of the radionuclides. The Monte Carlo simulations have uncertainties estimated to be 5 to 6 percent for carbon-14 and 2.4 percent for strontium-89, strontium-90, and yttrium-90. The curves of simulated counting efficiency vs. sample areal thickness agreed within 3% of the curves of best fit drawn through the 25--49 measured points for each of the four radionuclides. Contributions from this research include development of uncertainty budgets for the analytical processes; evaluation of alternative methods for determining chemical yield critical to the measurement process; correcting a bias found in the MCNP normalization of beta spectra histogram; clarifying the interpretation of the commonly used ICRU beta-particle spectra for use by MCNP; and evaluation of instrument parameters as applied to the simulation model to obtain estimates of the counting efficiency from simulated pulse height tallies.

  15. The diffusive finite state projection algorithm for efficient simulation of the stochastic reaction-diffusion master equation.

    PubMed

    Drawert, Brian; Lawson, Michael J; Petzold, Linda; Khammash, Mustafa

    2010-02-21

    We have developed a computational framework for accurate and efficient simulation of stochastic spatially inhomogeneous biochemical systems. The new computational method employs a fractional step hybrid strategy. A novel formulation of the finite state projection (FSP) method, called the diffusive FSP method, is introduced for the efficient and accurate simulation of diffusive transport. Reactions are handled by the stochastic simulation algorithm.
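
    The stochastic simulation algorithm referenced here is Gillespie's direct method: draw an exponential waiting time from the total propensity, then fire a reaction. A minimal sketch for a single decay reaction (the FSP and diffusion machinery of the paper is not reproduced):

```python
import random

def gillespie_decay(n0, k, t_end, rng):
    """Gillespie direct-method SSA for the single reaction A -> 0
    with propensity k * A."""
    t, n = 0.0, n0
    while n > 0:
        a = k * n                        # total propensity
        t += rng.expovariate(a)          # exponential time to next event
        if t > t_end:
            break                        # no further reaction before t_end
        n -= 1                           # fire the (only) reaction
    return n

rng = random.Random(42)
samples = [gillespie_decay(100, 0.05, 10.0, rng) for _ in range(2000)]
mean_n = sum(samples) / len(samples)     # theory: 100 * exp(-0.5) ≈ 60.65
```

    With several reaction channels one additionally picks which reaction fires in proportion to its propensity; the hybrid method in the paper replaces this event-by-event treatment of diffusion with the diffusive FSP step.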

  16. Efficient Conformational Sampling in Explicit Solvent Using a Hybrid Replica Exchange Molecular Dynamics Method

    DTIC Science & Technology

    2011-12-01

    …the hybrid approach retains the sampling efficiency of REMD while reproducing the energy landscape of explicit-solvent simulations. Cited works include: Mongan, J.; McCammon, J. A., 'Accelerated molecular dynamics: a promising and efficient simulation method for biomolecules', J. Chem. Phys. 2004, 120(24); Abraham, M. J.; Gready, J. E., 'Ensuring mixing efficiency of replica-exchange molecular dynamics simulations', J. Chem. Theory Comput.

  17. Use of instrumental variables in the analysis of generalized linear models in the presence of unmeasured confounding with applications to epidemiological research.

    PubMed

    Johnston, K M; Gustafson, P; Levy, A R; Grootendorst, P

    2008-04-30

    A major, often unstated, concern of researchers carrying out epidemiological studies of medical therapy is the potential impact on validity if estimates of treatment are biased due to unmeasured confounders. One technique for obtaining consistent estimates of treatment effects in the presence of unmeasured confounders is instrumental variables analysis (IVA). This technique has been well developed in the econometrics literature and is being increasingly used in epidemiological studies. However, the approach to IVA that is most commonly used in such studies is based on linear models, while many epidemiological applications make use of non-linear models, specifically generalized linear models (GLMs) such as logistic or Poisson regression. Here we present a simple method for applying IVA within the class of GLMs using the generalized method of moments approach. We explore some of the theoretical properties of the method and illustrate its use within both a simulation example and an epidemiological study where unmeasured confounding is suspected to be present. We estimate the effects of beta-blocker therapy on one-year all-cause mortality after an incident hospitalization for heart failure, in the absence of data describing disease severity, which is believed to be a confounder. 2008 John Wiley & Sons, Ltd
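
    The instrumental-variable idea is easiest to see in the linear special case (the paper's generalized-method-of-moments extension to GLMs is not shown): a valid instrument affects treatment but has no direct path to the outcome, so the ratio of reduced-form to first-stage slopes recovers the causal effect even with an unmeasured confounder. All variables below are simulated:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
u = rng.normal(size=n)                       # unmeasured confounder
z = rng.normal(size=n)                       # instrument: affects x only
x = 0.8 * z + u + rng.normal(size=n)         # treatment
y = 1.0 * x + 2.0 * u + rng.normal(size=n)   # outcome; true effect = 1.0

slope = lambda a, b: np.cov(a, b)[0, 1] / np.var(a)
ols = slope(x, y)                            # biased by the confounder
iv = slope(z, y) / slope(z, x)               # Wald / 2SLS estimator
```

    OLS lands well above 1.0 because x and u are correlated, while the IV ratio is consistent; the GMM formulation in the paper generalizes this moment condition E[z(y − g(x'β))] = 0 to logistic and Poisson links.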

  18. Kernel-density estimation and approximate Bayesian computation for flexible epidemiological model fitting in Python.

    PubMed

    Irvine, Michael A; Hollingsworth, T Déirdre

    2018-05-26

    Fitting complex models to epidemiological data is a challenging problem: methodologies can be inaccessible to all but specialists, there may be challenges in adequately describing uncertainty in model fitting, the complex models may take a long time to run, and it can be difficult to fully capture the heterogeneity in the data. We develop an adaptive approximate Bayesian computation scheme to fit a variety of epidemiologically relevant data with minimal hyper-parameter tuning by using an adaptive tolerance scheme. We implement a novel kernel density estimation scheme to capture both dispersed and multi-dimensional data, and directly compare this technique to standard Bayesian approaches. We then apply the procedure to a complex individual-based simulation of lymphatic filariasis, a human parasitic disease. The procedure and examples are released alongside this article as an open access library, with examples to aid researchers to rapidly fit models to data. This demonstrates that an adaptive ABC scheme with a general summary and distance metric is capable of performing model fitting for a variety of epidemiological data. It also does not require significant theoretical background to use and can be made accessible to the diverse epidemiological research community. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
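
    A stripped-down version of an adaptive-tolerance ABC scheme: propose parameters, score them by the distance between simulated and observed summaries, keep the closest fraction, and perturb. The data model below is a toy binomial stand-in for the individual-based simulator, and the kernel-density machinery of the paper is omitted; every constant is illustrative:

```python
import random

random.seed(7)

def simulate_outbreak(p, n_contacts=50):
    """Toy data model: infections among n_contacts, each with prob p."""
    return sum(random.random() < p for _ in range(n_contacts))

observed = 18                     # pretend field observation (p_true = 0.36)

# adaptive-tolerance rejection ABC: each round keeps the best 20% of
# particles, which implicitly tightens the tolerance round by round
particles = [random.uniform(0, 1) for _ in range(2000)]
for _ in range(4):
    scored = sorted(
        (abs(simulate_outbreak(p) - observed), p) for p in particles
    )
    keep = scored[: len(scored) // 5]
    # resample kept particles with a small Gaussian perturbation kernel
    particles = [min(1.0, max(0.0, p + random.gauss(0, 0.02)))
                 for _, p in keep for _ in range(5)]

posterior_mean = sum(particles) / len(particles)   # near 18/50 = 0.36
```

    Scheduling the tolerance from the retained particles themselves is what removes the main hyper-parameter of plain rejection ABC, which is the accessibility point the authors emphasize.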

  19. The High-Density Lipoprotein Puzzle: Why Classic Epidemiology, Genetic Epidemiology, and Clinical Trials Conflict?

    PubMed

    Rosenson, Robert S

    2016-05-01

    Classical epidemiology has established the incremental contribution of the high-density lipoprotein (HDL) cholesterol measure in the assessment of atherosclerotic cardiovascular disease risk; yet, genetic epidemiology does not support a causal relationship between HDL cholesterol and the future risk of myocardial infarction. Therapeutic interventions directed toward cholesterol loading of the HDL particle have been based on epidemiological studies that have established HDL cholesterol as a biomarker of atherosclerotic cardiovascular risk. However, therapeutic interventions such as niacin, cholesteryl ester transfer protein inhibitors increase HDL cholesterol in patients treated with statins, but have repeatedly failed to reduce cardiovascular events. Statin therapy interferes with ATP-binding cassette transporter-mediated macrophage cholesterol efflux via miR33 and thus may diminish certain HDL functional properties. Unraveling the HDL puzzle will require continued technical advances in the characterization and quantification of multiple HDL subclasses and their functional properties. Key mechanistic criteria for clinical outcomes trials with HDL-based therapies include formation of HDL subclasses that improve the efficiency of macrophage cholesterol efflux and compositional changes in the proteome and lipidome of the HDL particle that are associated with improved antioxidant and anti-inflammatory properties. These measures require validation in genetic studies and clinical trials of HDL-based therapies on the background of statins. © 2016 American Heart Association, Inc.

  20. IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    William M. Bond; Salih Ersayin

    2007-03-30

    This project involved industrial scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern Minnesota, and future proposals are pending with non-taconite mineral processing applications.

  1. Simulating the elimination of sleeping sickness with an agent-based model.

    PubMed

    Grébaut, Pascal; Girardin, Killian; Fédérico, Valentine; Bousquet, François

    2016-01-01

Although Human African Trypanosomiasis is largely considered to be in the process of extinction today, the persistence of human and animal reservoirs, as well as the vector, necessitates a laborious elimination process. In this context, modeling could be an effective tool to evaluate the ability of different public health interventions to control the disease. Using the Cormas® system, we developed HATSim, an agent-based model capable of simulating the possible endemic evolutions of sleeping sickness and the ability of National Control Programs to eliminate the disease. This model takes into account the analysis of epidemiological, entomological, and ecological data from field studies conducted during the last decade, making it possible to predict the evolution of the disease within this area over a 5-year span. In this article, we first present HATSim according to the Overview, Design concepts, and Details (ODD) protocol that is classically used to describe agent-based models; then, in a second part, we present predictive results concerning the evolution of Human African Trypanosomiasis in the village of Lambi (Cameroon), to illustrate the usefulness of such a tool. Our results are consistent with what was observed in the field by the Cameroonian National Control Program (CNCP). Our simulations also revealed that regular screening can be sufficient, although vector control applied to all areas with human activities could be significantly more efficient. Our results indicate that the current model can already help decision-makers in planning the elimination of the disease in foci. © P. Grébaut et al., published by EDP Sciences, 2016.

  2. Impact of Exposure Uncertainty on the Association between Perfluorooctanoate and Preeclampsia in the C8 Health Project Population.

    PubMed

    Avanasi, Raghavendhran; Shin, Hyeong-Moo; Vieira, Verónica M; Savitz, David A; Bartell, Scott M

    2016-01-01

Uncertainty in exposure estimates from models can result in exposure measurement error and can potentially affect the validity of epidemiological studies. We recently used a suite of environmental models and an integrated exposure and pharmacokinetic model to estimate individual perfluorooctanoate (PFOA) serum concentrations and assess the association with preeclampsia from 1990 through 2006 for the C8 Health Project participants. The aims of the current study are to evaluate the impact of uncertainty in estimated PFOA drinking-water concentrations on estimated serum concentrations and their reported epidemiological association with preeclampsia. For each individual public water district, we used Monte Carlo simulations to vary the year-by-year PFOA drinking-water concentration by randomly sampling from lognormal distributions for random error in the yearly public water district PFOA concentrations, systematic error specific to each water district, and global systematic error in the release assessment (using the estimated concentrations from the original fate and transport model as medians and a range of 2-, 5-, and 10-fold uncertainty). Uncertainty in PFOA water concentrations could cause major changes in estimated serum PFOA concentrations among participants. However, there is relatively little impact on the resulting epidemiological association in our simulations. The contribution of exposure uncertainty to the total uncertainty (including regression parameter variance) ranged from 5% to 31%, and bias was negligible. We found that correlated exposure uncertainty can substantially change estimated PFOA serum concentrations, but results in only minor impacts on the epidemiological association between PFOA and preeclampsia. Avanasi R, Shin HM, Vieira VM, Savitz DA, Bartell SM. 2016. Impact of exposure uncertainty on the association between perfluorooctanoate and preeclampsia in the C8 Health Project population.
Environ Health Perspect 124:126-132; http://dx.doi.org/10.1289/ehp.1409044.
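The lognormal perturbation scheme described above can be sketched numerically. The function below is an illustrative reconstruction, not the study's code: it draws multiplicative error factors whose median is 1 and whose assumed k-fold uncertainty brackets roughly 95% of draws, then combines yearly, district-level, and global errors for a hypothetical water district.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_concentration(median, fold, n=10000, rng=rng):
    """Draw lognormal samples whose median equals the model estimate.

    `fold` is the assumed uncertainty factor: sigma is chosen so that
    about 95% of draws fall within [median/fold, median*fold].
    """
    sigma = np.log(fold) / 1.96
    return median * rng.lognormal(mean=0.0, sigma=sigma, size=n)

# Combine the three multiplicative error terms (yearly random error,
# district-level systematic error, global release-assessment error)
# for a hypothetical district with a modeled 0.5 ug/L median.
yearly = sample_concentration(1.0, fold=2)
district = sample_concentration(1.0, fold=2)
global_err = sample_concentration(1.0, fold=2)
perturbed = 0.5 * yearly * district * global_err
```

Because the errors are multiplicative and median-centered, the perturbed concentrations stay positive and centered on the fate-and-transport estimate while their spread widens with each error layer.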

  3. Epidemiologic research using probabilistic outcome definitions.

    PubMed

    Cai, Bing; Hennessy, Sean; Lo Re, Vincent; Small, Dylan S

    2015-01-01

Epidemiologic studies using electronic healthcare data often define the presence or absence of binary clinical outcomes by using algorithms with imperfect specificity, sensitivity, and positive predictive value. This results in misclassification and bias in study results. We describe and evaluate a new method called probabilistic outcome definition (POD) that uses logistic regression to estimate the probability of a clinical outcome using multiple potential algorithms and then uses multiple imputation to make valid inferences about the risk ratio or other epidemiologic parameters of interest. We conducted a simulation to evaluate the performance of the POD method with two variables that can predict the true outcome and compared the POD method with the conventional method. The simulation results showed that when the true risk ratio is equal to 1.0 (null), the conventional method based on a binary outcome provides unbiased estimates. However, when the risk ratio is not equal to 1.0, the traditional method, either using one predictive variable or both predictive variables to define the outcome, is biased when the positive predictive value is <100%, and the bias is very severe when the sensitivity or positive predictive value is poor (less than 0.75 in our simulation). In contrast, the POD method provides unbiased estimates of the risk ratio both when this measure of effect is equal to 1.0 and not equal to 1.0. Even when the sensitivity and positive predictive value are low, the POD method continues to provide unbiased estimates of the risk ratio. The POD method provides an improved way to define outcomes in database research. This method has a major advantage over the conventional method in that it provides unbiased estimates of risk ratios and is easy to use. Copyright © 2014 John Wiley & Sons, Ltd.
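The imputation step of the POD idea can be sketched in a few lines. This is a minimal illustration under the assumption that well-calibrated outcome probabilities are already available (here fabricated); it returns only the pooled point estimate, whereas Rubin's rules also pool the variances.

```python
import numpy as np

rng = np.random.default_rng(1)

def pod_risk_ratio(p_outcome, exposed, n_imputations=20, rng=rng):
    """Multiple-imputation risk ratio from per-subject outcome probabilities.

    p_outcome: P(true outcome = 1) per subject, e.g. fitted by logistic
    regression on several imperfect algorithms (hypothetical inputs here).
    """
    log_rrs = []
    for _ in range(n_imputations):
        y = rng.random(p_outcome.shape) < p_outcome   # impute outcomes
        risk1 = y[exposed].mean()
        risk0 = y[~exposed].mean()
        log_rrs.append(np.log(risk1 / risk0))
    return float(np.exp(np.mean(log_rrs)))            # pooled point estimate

# Simulated cohort: exposure doubles the true risk (0.2 vs 0.1); the
# predicted probabilities are taken as perfectly calibrated here.
n = 20000
exposed = np.arange(n) < n // 2
p_true = np.where(exposed, 0.2, 0.1)
rr_hat = pod_risk_ratio(p_true, exposed)
```

With calibrated probabilities, the pooled estimate recovers the true risk ratio of 2.0 up to sampling noise, which mirrors the unbiasedness the simulation study reports.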

  4. Interdisciplinary pharmacometrics linking oseltamivir pharmacology, influenza epidemiology and health economics to inform antiviral use in pandemics.

    PubMed

    Kamal, Mohamed A; Smith, Patrick F; Chaiyakunapruk, Nathorn; Wu, David B C; Pratoomsoot, Chayanin; Lee, Kenneth K C; Chong, Huey Yi; Nelson, Richard E; Nieforth, Keith; Dall, Georgina; Toovey, Stephen; Kong, David C M; Kamauu, Aaron; Kirkpatrick, Carl M; Rayner, Craig R

    2017-07-01

A modular interdisciplinary platform was developed to investigate the economic impact of oseltamivir treatment by dosage regimen under simulated influenza pandemic scenarios. The pharmacology module consisted of a pharmacokinetic distribution of oseltamivir carboxylate daily area under the concentration-time curve at steady state (simulated for 75 mg and 150 mg twice daily regimens for 5 days) and a pharmacodynamic distribution of viral shedding duration obtained from phase II influenza inoculation data. The epidemiological module comprised a susceptible, exposed, infected, recovered (SEIR) model to which drug effect on the basic reproductive number (R0), a measure of transmissibility, was linked by reduction of viral shedding duration. The number of infected patients per population of 100 000 susceptible individuals was simulated for a series of pandemic scenarios, varying oseltamivir dose, R0 (1.9 vs. 2.7), and drug uptake (25%, 50%, and 80%). The number of infected patients for each scenario was entered into the health economics module, a decision analytic model populated with branch probabilities, disease utility, costs of hospitalized patients developing complications, and case-fatality rates. Change in quality-adjusted life years was determined relative to base case. Oseltamivir 75 mg relative to no treatment reduced the median number of infected patients, increased change in quality-adjusted life years by deaths averted, and was cost-saving under all scenarios; 150 mg relative to 75 mg was not cost effective in low transmissibility scenarios but was cost saving in high transmissibility scenarios. This methodological study demonstrates proof of concept that the disciplines of pharmacology, disease epidemiology and health economics can be linked in a single quantitative framework. © 2017 The British Pharmacological Society.
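The epidemiological module's core mechanism — shortening viral shedding lowers R0, which lowers the attack rate — can be reproduced with a bare-bones SEIR integrator. The parameter values below are illustrative assumptions, not those of the cited study.

```python
def seir_attack_rate(r0, latent_days=1.5, infectious_days=4.0,
                     n=100_000, days=300, dt=0.1):
    """Euler-integrated SEIR model; returns cumulative infections.

    A shorter infectious (shedding) period enters through r0 = beta *
    infectious_days, so antiviral effect is modeled as a lower r0.
    """
    beta = r0 / infectious_days          # transmission rate implied by R0
    a = 1.0 / latent_days                # E -> I rate
    g = 1.0 / infectious_days            # I -> R rate
    s, e, i, r = n - 10.0, 0.0, 10.0, 0.0
    for _ in range(int(days / dt)):
        new_e = beta * s * i / n * dt    # S -> E flow this step
        new_i = a * e * dt               # E -> I flow
        new_r = g * i * dt               # I -> R flow
        s -= new_e
        e += new_e - new_i
        i += new_i - new_r
        r += new_r
    return n - s                         # total ever infected

# Higher uptake of treatment shortens shedding, lowering R0 toward 1.9.
untreated = seir_attack_rate(2.7)
treated = seir_attack_rate(1.9)
```

The final sizes track the classic final-size relation (about 93% of the population infected at R0 = 2.7 versus about 78% at 1.9), which is the quantity the health-economics module consumes.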

  5. Subtype-independent near full-length HIV-1 genome sequencing and assembly to be used in large molecular epidemiological studies and clinical management.

    PubMed

    Grossmann, Sebastian; Nowak, Piotr; Neogi, Ujjwal

    2015-01-01

    HIV-1 near full-length genome (HIV-NFLG) sequencing from plasma is an attractive multidimensional tool to apply in large-scale population-based molecular epidemiological studies. It also enables genotypic resistance testing (GRT) for all drug target sites allowing effective intervention strategies for control and prevention in high-risk population groups. Thus, the main objective of this study was to develop a simplified subtype-independent, cost- and labour-efficient HIV-NFLG protocol that can be used in clinical management as well as in molecular epidemiological studies. Plasma samples (n=30) were obtained from HIV-1B (n=10), HIV-1C (n=10), CRF01_AE (n=5) and CRF01_AG (n=5) infected individuals with minimum viral load >1120 copies/ml. The amplification was performed with two large amplicons of 5.5 kb and 3.7 kb, sequenced with 17 primers to obtain HIV-NFLG. GRT was validated against ViroSeq™ HIV-1 Genotyping System. After excluding four plasma samples with low-quality RNA, a total of 26 samples were attempted. Among them, NFLG was obtained from 24 (92%) samples with the lowest viral load being 3000 copies/ml. High (>99%) concordance was observed between HIV-NFLG and ViroSeq™ when determining the drug resistance mutations (DRMs). The N384I connection mutation was additionally detected by NFLG in two samples. Our high efficiency subtype-independent HIV-NFLG is a simple and promising approach to be used in large-scale molecular epidemiological studies. It will facilitate the understanding of the HIV-1 pandemic population dynamics and outline effective intervention strategies. Furthermore, it can potentially be applicable in clinical management of drug resistance by evaluating DRMs against all available antiretrovirals in a single assay.

  6. Computer-based coding of free-text job descriptions to efficiently identify occupations in epidemiological studies

    PubMed Central

    Russ, Daniel E.; Ho, Kwan-Yuet; Colt, Joanne S.; Armenti, Karla R.; Baris, Dalsu; Chow, Wong-Ho; Davis, Faith; Johnson, Alison; Purdue, Mark P.; Karagas, Margaret R.; Schwartz, Kendra; Schwenn, Molly; Silverman, Debra T.; Johnson, Calvin A.; Friesen, Melissa C.

    2016-01-01

    Background Mapping job titles to standardized occupation classification (SOC) codes is an important step in identifying occupational risk factors in epidemiologic studies. Because manual coding is time-consuming and has moderate reliability, we developed an algorithm called SOCcer (Standardized Occupation Coding for Computer-assisted Epidemiologic Research) to assign SOC-2010 codes based on free-text job description components. Methods Job title and task-based classifiers were developed by comparing job descriptions to multiple sources linking job and task descriptions to SOC codes. An industry-based classifier was developed based on the SOC prevalence within an industry. These classifiers were used in a logistic model trained using 14,983 jobs with expert-assigned SOC codes to obtain empirical weights for an algorithm that scored each SOC/job description. We assigned the highest scoring SOC code to each job. SOCcer was validated in two occupational data sources by comparing SOC codes obtained from SOCcer to expert assigned SOC codes and lead exposure estimates obtained by linking SOC codes to a job-exposure matrix. Results For 11,991 case-control study jobs, SOCcer-assigned codes agreed with 44.5% and 76.3% of manually assigned codes at the 6- and 2-digit level, respectively. Agreement increased with the score, providing a mechanism to identify assignments needing review. Good agreement was observed between lead estimates based on SOCcer and manual SOC assignments (kappa: 0.6–0.8). Poorer performance was observed for inspection job descriptions, which included abbreviations and worksite-specific terminology. Conclusions Although some manual coding will remain necessary, using SOCcer may improve the efficiency of incorporating occupation into large-scale epidemiologic studies. PMID:27102331

  7. The development of a simulation model of the treatment of coronary heart disease.

    PubMed

    Cooper, Keith; Davies, Ruth; Roderick, Paul; Chase, Debbie; Raftery, James

    2002-11-01

A discrete event simulation models the progress of patients who have had a coronary event, through their treatment pathways and subsequent coronary events. The main risk factors in the model are age, sex, history of previous events and the extent of the coronary vessel disease. The model parameters are based on data collected from epidemiological studies of incidence and prognosis, efficacy studies, national surveys and treatment audits. The simulation results were validated against different sources of data. The initial results show that increasing revascularisation has considerable implications for resource use but has little impact on patient mortality.

  8. Recent progress and future direction of cancer epidemiological research in Japan.

    PubMed

    Sobue, Tomotaka

    2015-06-01

In 2006, the Cancer Control Act was approved, and a Basic Plan to Promote the Cancer Control Program at the national level was developed in 2007. Cancer research is recognized as a fundamental component providing evidence for the cancer control program. Cancer epidemiology plays a central role in connecting research and policy, since it directly deals with data from humans. Research in cancer epidemiology in Japan has made substantial progress in the fields of descriptive studies, cohort studies, intervention studies and activities for summarizing evidence. In the future, promoting high-quality large-scale intervention studies, individual-level linkage studies, simulation models and studies of the elderly population will be of great importance, but at the same time research should be promoted in a well-balanced fashion, not placing too much emphasis on one particular research field. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Comparison of an Agent-based Model of Disease Propagation with the Generalised SIR Epidemic Model

    DTIC Science & Technology

    2009-08-01

has become a practical method for conducting Epidemiological Modelling. In the agent-based approach the whole township can be modelled as a system of...SIR system was initially developed based on a very simplified model of social interaction. For instance an assumption of uniform population mixing was...simulating the progress of a disease within a host and of transmission between hosts is based upon Transportation Analysis and Simulation System

  10. 50 % Response rates: half-empty, or half-full?

    PubMed

    Lacey, James V; Savage, Kristen E

    2016-06-01

    When the Black Women's Health Study, a prospective cohort of over 59,000 women who have been followed since 1995, invited all of its participants to provide a DNA sample for future research, only 51 % of those participants agreed to do so. Responders were significantly older and more health conscious than non-responders. The Black Women's Health Study is a unique resource, but this low level of response and its resulting self-selection bias are now the norm in contemporary epidemiologic, and especially cohort, studies. Epidemiology desperately needs new approaches that work better and cost less. The literature on predictors of response focuses too narrowly on participant characteristics and does not identify any clear steps studies can take to increase participation. To improve research quality, cost-efficiency, and long-term sustainability of studies, epidemiology can and should approach, analyze, and leverage response-rate data more creatively and extensively than most studies have done to date.

  11. An Efficient Design Strategy for Logistic Regression Using Outcome- and Covariate-Dependent Pooling of Biospecimens Prior to Assay

    PubMed Central

    Lyles, Robert H.; Mitchell, Emily M.; Weinberg, Clarice R.; Umbach, David M.; Schisterman, Enrique F.

    2016-01-01

    Summary Potential reductions in laboratory assay costs afforded by pooling equal aliquots of biospecimens have long been recognized in disease surveillance and epidemiological research and, more recently, have motivated design and analytic developments in regression settings. For example, Weinberg and Umbach (1999, Biometrics 55, 718–726) provided methods for fitting set-based logistic regression models to case-control data when a continuous exposure variable (e.g., a biomarker) is assayed on pooled specimens. We focus on improving estimation efficiency by utilizing available subject-specific information at the pool allocation stage. We find that a strategy that we call “(y,c)-pooling,” which forms pooling sets of individuals within strata defined jointly by the outcome and other covariates, provides more precise estimation of the risk parameters associated with those covariates than does pooling within strata defined only by the outcome. We review the approach to set-based analysis through offsets developed by Weinberg and Umbach in a recent correction to their original paper. We propose a method for variance estimation under this design and use simulations and a real-data example to illustrate the precision benefits of (y,c)-pooling relative to y-pooling. We also note and illustrate that set-based models permit estimation of covariate interactions with exposure. PMID:26964741
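The allocation step of "(y,c)-pooling" can be sketched directly: pools are formed only within strata defined jointly by the outcome and a covariate, so each pooled assay is homogeneous in both. The function name, the covariate choice, and the rule of assaying leftovers individually are hypothetical choices, not the authors' protocol.

```python
from collections import defaultdict

def allocate_pools(records, pool_size):
    """(y, c)-pooling: form equal-size pooling sets within strata
    defined jointly by outcome y and covariate c. Members left over
    in a stratum (fewer than pool_size) would be assayed individually.
    Returns lists of record indices, one list per pool.
    """
    strata = defaultdict(list)
    for idx, (y, c) in enumerate(records):
        strata[(y, c)].append(idx)
    pools = []
    for members in strata.values():
        for i in range(0, len(members) - pool_size + 1, pool_size):
            pools.append(members[i:i + pool_size])
    return pools

# Hypothetical case-control data: (outcome, binary-covariate) pairs,
# five subjects in each of the four (y, c) strata.
records = [(y, c) for y in (0, 1) for c in (0, 1) for _ in range(5)]
pools = allocate_pools(records, pool_size=2)
```

Contrast with y-pooling, which would stratify on the outcome alone and therefore mix covariate levels within a pool, diluting the precision gain for the covariate's risk parameters.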

  12. KINETICS OF THM AND HAA PRODUCTION IN A SIMULATED DISTRIBUTION SYSTEM

    EPA Science Inventory

    Limited data exist on how the growth of halogenated disinfection by-products (DBPs) is affected by time spent in a distribution system. such information is needed to estimate human exposures to these chemicals for both regulatory analyses and epidemiological studies. Current me...

  13. Report of the Defense Science Board Task Force On Information Warfare - Defense (IW-D)

    DTIC Science & Technology

    1996-11-01

pathogens. Partnerships NCID provides epidemiological, microbiologic, and consultative services to federal agencies, state and local health departments...FOR DETECTING LOCAL OR LARGE-SCALE ATTACKS, AND FOR ADAPTATION TO SUPPORT GRACEFUL DEGRADATION * TEST BEDS AND SIMULATION-BASED MECHANISMS FOR

  14. PAVA: Physiological and Anatomical Visual Analytics for Mapping of Tissue-Specific Concentration and Time-Course Data

    EPA Science Inventory

    We describe the development and implementation of a Physiological and Anatomical Visual Analytics tool (PAVA), a web browser-based application, used to visualize experimental/simulated chemical time-course data (dosimetry), epidemiological data and Physiologically-Annotated Data ...

  15. WiFi networks and malware epidemiology

    PubMed Central

    Hu, Hao; Myers, Steven; Colizza, Vittoria; Vespignani, Alessandro

    2009-01-01

    In densely populated urban areas WiFi routers form a tightly interconnected proximity network that can be exploited as a substrate for the spreading of malware able to launch massive fraudulent attacks. In this article, we consider several scenarios for the deployment of malware that spreads over the wireless channel of major urban areas in the US. We develop an epidemiological model that takes into consideration prevalent security flaws on these routers. The spread of such a contagion is simulated on real-world data for georeferenced wireless routers. We uncover a major weakness of WiFi networks in that most of the simulated scenarios show tens of thousands of routers infected in as little as 2 weeks, with the majority of the infections occurring in the first 24–48 h. We indicate possible containment and prevention measures and provide computational estimates for the rate of encrypted routers that would stop the spreading of the epidemics by placing the system below the percolation threshold. PMID:19171909
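The percolation argument at the end — enough encrypted routers push the network below threshold, so the epidemic cannot spread — can be illustrated on a stand-in topology. The Erdős–Rényi graph and union-find below are illustrative assumptions; the study used a georeferenced real-world router proximity network.

```python
import numpy as np

rng = np.random.default_rng(3)

def giant_component_fraction(n, mean_degree, immune_frac, rng=rng):
    """Largest-connected-component share after removing a fraction of
    'encrypted' (immune) routers from a random-graph proxy network."""
    keep = rng.random(n) >= immune_frac   # surviving (attackable) routers
    parent = np.arange(n)                 # union-find forest

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    m = int(mean_degree * n / 2)          # edge count for target degree
    for u, v in zip(rng.integers(0, n, m), rng.integers(0, n, m)):
        if keep[u] and keep[v]:           # links through immune nodes break
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
    roots = [find(i) for i in range(n) if keep[i]]
    if not roots:
        return 0.0
    _, counts = np.unique(roots, return_counts=True)
    return counts.max() / n

open_net = giant_component_fraction(2000, 4.0, 0.0)    # supercritical
hardened = giant_component_fraction(2000, 4.0, 0.9)    # subcritical
```

With no encryption the vulnerable cluster spans nearly the whole network; encrypting 90% of routers drops the effective degree below 1 and the cluster fragments, which is the containment mechanism the abstract estimates.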

  16. WiFi networks and malware epidemiology.

    PubMed

    Hu, Hao; Myers, Steven; Colizza, Vittoria; Vespignani, Alessandro

    2009-02-03

    In densely populated urban areas WiFi routers form a tightly interconnected proximity network that can be exploited as a substrate for the spreading of malware able to launch massive fraudulent attacks. In this article, we consider several scenarios for the deployment of malware that spreads over the wireless channel of major urban areas in the US. We develop an epidemiological model that takes into consideration prevalent security flaws on these routers. The spread of such a contagion is simulated on real-world data for georeferenced wireless routers. We uncover a major weakness of WiFi networks in that most of the simulated scenarios show tens of thousands of routers infected in as little as 2 weeks, with the majority of the infections occurring in the first 24-48 h. We indicate possible containment and prevention measures and provide computational estimates for the rate of encrypted routers that would stop the spreading of the epidemics by placing the system below the percolation threshold.

  17. Change rates and prevalence of a dichotomous variable: simulations and applications.

    PubMed

    Brinks, Ralph; Landwehr, Sandra

    2015-01-01

    A common modelling approach in public health and epidemiology divides the population under study into compartments containing persons that share the same status. Here we consider a three-state model with the compartments: A, B and Dead. States A and B may be the states of any dichotomous variable, for example, Healthy and Ill, respectively. The transitions between the states are described by change rates, which depend on calendar time and on age. So far, a rigorous mathematical calculation of the prevalence of property B has been difficult, which has limited the use of the model in epidemiology and public health. We develop a partial differential equation (PDE) that simplifies the use of the three-state model. To demonstrate the validity of the PDE, it is applied to two simulation studies, one about a hypothetical chronic disease and one about dementia in Germany. In two further applications, the PDE may provide insights into smoking behaviour of males in Germany and the knowledge about the ovulatory cycle in Egyptian women.
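The three-state dynamics can be checked against a direct cohort integration. The sketch below is a simplified assumption-laden stand-in for the authors' PDE approach: rates are held constant (the general model lets them depend on age and calendar time), names are hypothetical, and prevalence of B is tracked among the living.

```python
import numpy as np

def prevalence_of_B(incidence, remission, mort_A, mort_B,
                    ages=np.arange(0, 100, 0.1)):
    """Cohort-integrate the three-state model (A, B, Dead) over age.

    Rates are per person-year and constant here for simplicity.
    Returns the prevalence of B among the living at each age.
    """
    a_frac, b_frac = 1.0, 0.0            # everyone starts in state A
    prev = []
    dt = ages[1] - ages[0]
    for _ in ages:
        prev.append(b_frac / (a_frac + b_frac))
        flow_ab = incidence * a_frac * dt   # A -> B transitions
        flow_ba = remission * b_frac * dt   # B -> A transitions
        a_frac += flow_ba - flow_ab - mort_A * a_frac * dt
        b_frac += flow_ab - flow_ba - mort_B * b_frac * dt
    return np.array(prev)

# Chronic-disease-like setting: no remission, excess mortality in B.
p = prevalence_of_B(incidence=0.01, remission=0.0,
                    mort_A=0.01, mort_B=0.03)
```

With these toy rates, prevalence rises monotonically toward the equilibrium incidence/(mort_B − mort_A) = 0.5, illustrating how prevalence emerges from the change rates rather than being specified directly.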

  18. Analysis of simulated angiographic procedures. Part 2: extracting efficiency data from audio and video recordings.

    PubMed

    Duncan, James R; Kline, Benjamin; Glaiberman, Craig B

    2007-04-01

    To create and test methods of extracting efficiency data from recordings of simulated renal stent procedures. Task analysis was performed and used to design a standardized testing protocol. Five experienced angiographers then performed 16 renal stent simulations using the Simbionix AngioMentor angiographic simulator. Audio and video recordings of these simulations were captured from multiple vantage points. The recordings were synchronized and compiled. A series of efficiency metrics (procedure time, contrast volume, and tool use) were then extracted from the recordings. The intraobserver and interobserver variability of these individual metrics was also assessed. The metrics were converted to costs and aggregated to determine the fixed and variable costs of a procedure segment or the entire procedure. Task analysis and pilot testing led to a standardized testing protocol suitable for performance assessment. Task analysis also identified seven checkpoints that divided the renal stent simulations into six segments. Efficiency metrics for these different segments were extracted from the recordings and showed excellent intra- and interobserver correlations. Analysis of the individual and aggregated efficiency metrics demonstrated large differences between segments as well as between different angiographers. These differences persisted when efficiency was expressed as either total or variable costs. Task analysis facilitated both protocol development and data analysis. Efficiency metrics were readily extracted from recordings of simulated procedures. Aggregating the metrics and dividing the procedure into segments revealed potential insights that could be easily overlooked because the simulator currently does not attempt to aggregate the metrics and only provides data derived from the entire procedure. The data indicate that analysis of simulated angiographic procedures will be a powerful method of assessing performance in interventional radiology.

  19. Chaos Versus Noisy Periodicity: Alternative Hypotheses for Childhood Epidemics

    NASA Astrophysics Data System (ADS)

    Olsen, L. F.; Schaffer, W. M.

    1990-08-01

    Whereas case rates for some childhood diseases (chickenpox) often vary according to an almost regular annual cycle, the incidence of more efficiently transmitted infections such as measles is more variable. Three hypotheses have been proposed to account for such fluctuations. (i) Irregular dynamics result from random shocks to systems with stable equilibria. (ii) The intrinsic dynamics correspond to biennial cycles that are subject to stochastic forcing. (iii) Aperiodic fluctuations are intrinsic to the epidemiology. Comparison of real world data and epidemiological models suggests that measles epidemics are inherently chaotic. Conversely, the extent to which chickenpox outbreaks approximate a yearly cycle depends inversely on the population size.

  20. [Stress and work. Result of an epidemiological investigation among the voluntary postal and telecommunications personnel of the Calabria region].

    PubMed

    Legato, G; Migliara, M; Battista, G; Arcudi, N; Galtieri, G

    2007-01-01

Stress, through the neuroendocrine alterations correlated with it, is often indicated as a cause of pathology, both occupational and non-occupational. However, because causal moments can be found in many different existential situations, it is difficult to attribute the cause of many pathologies solely to work-related stress. To overcome this diagnostic impasse, the authors apply, in their epidemiological work, a methodological procedure that considers, above all, the efficient and decisive role of the occupational factor.

  1. Efficiently Scheduling Multi-core Guest Virtual Machines on Multi-core Hosts in Network Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoginath, Srikanth B; Perumalla, Kalyan S

    2011-01-01

Virtual machine (VM)-based simulation is a method used by network simulators to incorporate realistic application behaviors by executing actual VMs as high-fidelity surrogates for simulated end-hosts. A critical requirement in such a method is the simulation time-ordered scheduling and execution of the VMs. Prior approaches such as time dilation are less efficient due to the high degree of multiplexing possible when multiple multi-core VMs are simulated on multi-core host systems. We present a new simulation time-ordered scheduler to efficiently schedule multi-core VMs on multi-core real hosts, with a virtual clock realized on each virtual core. The distinguishing features of our approach are: (1) customizable granularity of the VM scheduling time unit on the simulation time axis, (2) ability to take arbitrary leaps in virtual time by VMs to maximize the utilization of host (real) cores when guest virtual cores idle, and (3) empirically determinable optimality in the tradeoff between total execution (real) time and time-ordering accuracy levels. Experiments show that it is possible to get nearly perfect time-ordered execution, with a slight cost in total run time, relative to optimized non-simulation VM schedulers. Interestingly, with our time-ordered scheduler, it is also possible to reduce the time-ordering error from over 50% with a non-simulation scheduler to less than 1% realized by our scheduler, with almost the same run time efficiency as that of the highly efficient non-simulation VM schedulers.
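The least-virtual-time-first invariant at the heart of such a scheduler can be sketched with a priority queue. This is only an illustrative serialization of the ordering rule, not the authors' scheduler: the fixed quantum, the function names, and the single-threaded execution are all simplifying assumptions.

```python
import heapq

def time_ordered_run(vcores, quantum, horizon):
    """Dispatch virtual cores in virtual-time order (least clock first).

    Each heap entry is (virtual_clock, vcore_id). A real scheduler
    would map runnable vcores onto physical cores and let idle vcores
    leap forward in virtual time; this sketch serializes execution to
    expose the ordering invariant only.
    """
    heap = [(0.0, vid) for vid in range(vcores)]
    heapq.heapify(heap)
    trace = []
    while heap[0][0] < horizon:
        clock, vid = heapq.heappop(heap)   # least virtual time runs next
        trace.append((clock, vid))
        heapq.heappush(heap, (clock + quantum, vid))
    return trace

trace = time_ordered_run(vcores=3, quantum=1.0, horizon=3.0)
```

The returned trace has non-decreasing virtual clocks, which is exactly the time-ordering property whose violation the paper measures as scheduling error.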

  2. Indigenous American ancestry is associated with arsenic methylation efficiency in an admixed population of northwest Mexico

    PubMed Central

    Gomez-Rubio, Paulina; Klimentidis, Yann C.; Cantu-Soto, Ernesto; Meza-Montenegro, Maria M.; Billheimer, Dean; Lu, Zhenqiang; Chen, Zhao; Klimecki, Walter T.

    2013-01-01

    Many studies provide evidence relating lower human arsenic (As) methylation efficiency, represented by high % urinary monomethylarsonic acid (MMA(V)), with several arsenic-induced diseases, possibly due to the fact that MMA(V) serves as a proxy for MMA(III), the most toxic arsenic metabolite. Some epidemiological studies have suggested that indigenous Americans (AME) methylate As more efficiently, however data supporting this have been equivocal. The aim of this study was to characterize the association between AME ancestry and arsenic methylation efficiency using a panel of ancestry informative genetic markers to determine individual ancestry proportions in an admixed population (composed of two or more isolated ancestral populations) of 746 individuals environmentally exposed to arsenic in northwest Mexico. Total urinary As (TAs) mean and range were 170.4 and 2.3–1053.5 μg/L, while %AME mean and range were 72.4 and 23–100. Adjusted (gender, age, AS3MT 7388/M287T haplotypes, body mass index (BMI), and TAs) multiple regression model showed that higher AME ancestry is associated with lower %uMMA excretion in this population (p <0.01). The data also showed a significant interaction between BMI and gender indicating negative association between BMI and %uMMA, stronger in women than men (p <0.01). Moreover age and the AS3MT variants 7388 (intronic) and M287T (non-synonymous) were also significantly associated with As methylation efficiency (p = 0.01). This study highlights the importance of BMI and indigenous American ancestry in some of the observed variability in As methylation efficiency, underscoring the need to be considered in epidemiology studies, particularly those carried out in admixed populations. PMID:22047162

  3. Optimisation of GaN LEDs and the reduction of efficiency droop using active machine learning

    DOE PAGES

    Rouet-Leduc, Bertrand; Barros, Kipton Marcos; Lookman, Turab; ...

    2016-04-26

    A fundamental challenge in the design of LEDs is to maximise electro-luminescence efficiency at high current densities. We simulate GaN-based LED structures that delay the onset of efficiency droop by spreading carrier concentrations evenly across the active region. Statistical analysis and machine learning effectively guide the selection of the next LED structure to be examined based upon its expected efficiency as well as model uncertainty. This active learning strategy rapidly constructs a model that predicts Poisson-Schrödinger simulations of devices, and that simultaneously produces structures with higher simulated efficiencies.
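Selecting the next structure "based upon its expected efficiency as well as model uncertainty" is the shape of an upper-confidence-bound acquisition rule. The sketch below is one standard choice, not necessarily the paper's exact criterion; the toy ensemble surrogate stands in for the statistical model of Poisson-Schrödinger simulation results, and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

def ucb_select(candidates, predict, kappa=1.0):
    """Pick the next structure to simulate by an upper-confidence-bound
    rule: predicted efficiency plus kappa * model uncertainty."""
    mean, std = predict(candidates)
    return int(np.argmax(mean + kappa * std))

def predict(x):
    """Toy surrogate: an ensemble of noisy linear models; the ensemble
    mean is the prediction and its spread is the uncertainty."""
    preds = np.stack([x * w for w in rng.normal(1.0, 0.3, size=16)])
    return preds.mean(axis=0), preds.std(axis=0)

# Candidate design parameter (e.g. a normalized barrier composition).
x = np.linspace(0.0, 1.0, 11)
best = ucb_select(x, predict)
```

Each selected candidate would then be simulated, added to the training set, and the surrogate refit — the active-learning loop that lets the model converge with far fewer expensive device simulations.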

  4. An intelligent processing environment for real-time simulation

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Wells, Buren Earl, Jr.

    1988-01-01

    The development of a highly efficient and thus truly intelligent processing environment for real-time general-purpose simulation of continuous systems is described. Such an environment can be created by mapping the simulation process directly onto the University of Alabama's OPERA architecture. To facilitate this effort, the field of continuous simulation is explored, highlighting areas in which efficiency can be improved. Areas in which parallel processing can be applied are also identified, and several general OPERA-type hardware configurations that support improved simulation are investigated. Three direct-execution parallel processing environments are introduced, each of which greatly improves efficiency by exploiting distinct areas of the simulation process. These suggested environments are candidate architectures around which a highly intelligent real-time simulation configuration can be developed.

  5. Comparing trends in cancer rates across overlapping regions.

    PubMed

    Li, Yi; Tiwari, Ram C

    2008-12-01

    Monitoring and comparing trends in cancer rates across geographic regions or over different time periods are major tasks of the National Cancer Institute's (NCI) Surveillance, Epidemiology, and End Results (SEER) Program as it profiles healthcare quality and informs healthcare resource allocation within a spatial-temporal framework. A fundamental difficulty, however, arises when such comparisons must be made for regions or time intervals that overlap, for example, comparing the change in trends of mortality rates in a local area (e.g., the mortality rate of breast cancer in California) with that at a more global level (i.e., the national mortality rate of breast cancer). In view of the sparsity of available methodologies, this article develops a simple corrected Z-test that accounts for such overlap. The performance of the proposed test relative to the two-sample "pooled" t-test, which assumes independence across comparison groups, is assessed via the Pitman asymptotic relative efficiency as well as Monte Carlo simulations and applications to the SEER cancer data. The proposed test will be important for the SEER*Stat software, maintained by the NCI, for the analysis of the SEER data.
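
    The abstract does not give the form of the correction, but the core issue can be illustrated: when a local area is contained in the national total, the two trend estimates share data and are positively correlated, so a naive Z-test uses the wrong variance. A minimal Monte Carlo sketch, with an assumed overlap weight `w` and known unit noise, is:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(10.0)   # years
w = 0.3               # assumed share of the national data coming from the local area

def slope(y):
    return np.polyfit(t, y, 1)[0]   # OLS trend of rates on time

def one_trial():
    # Null hypothesis: the same true trend everywhere.
    local = 5.0 + 0.2 * t + rng.normal(0.0, 1.0, t.size)
    rest  = 5.0 + 0.2 * t + rng.normal(0.0, 1.0, t.size)
    national = w * local + (1 - w) * rest       # national rate contains the local data
    v = 1.0 / np.sum((t - t.mean()) ** 2)       # variance of an OLS slope (sigma = 1)
    v_loc, v_nat = v, v * (w**2 + (1 - w)**2)
    cov = w * v_loc                             # covariance induced by the overlap
    diff = slope(local) - slope(national)
    z_naive = diff / np.sqrt(v_loc + v_nat)
    z_corr  = diff / np.sqrt(v_loc + v_nat - 2.0 * cov)
    return z_naive, z_corr

zs = np.array([one_trial() for _ in range(20000)])
reject = (np.abs(zs) > 1.96).mean(axis=0)   # type-I error of the two tests
```

    In this construction ignoring the overlap makes the test conservative (the shared noise partly cancels in the difference), while the covariance-corrected statistic rejects at close to the nominal 5% level.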

  6. Relating phylogenetic trees to transmission trees of infectious disease outbreaks.

    PubMed

    Ypma, Rolf J F; van Ballegooijen, W Marijn; Wallinga, Jacco

    2013-11-01

    Transmission events are the fundamental building blocks of the dynamics of any infectious disease. Much about the epidemiology of a disease can be learned when these individual transmission events are known or can be estimated. Such estimations are difficult and generally feasible only when detailed epidemiological data are available. The genealogy estimated from genetic sequences of sampled pathogens is another rich source of information on transmission history. Optimal inference of transmission events calls for the combination of genetic data and epidemiological data into one joint analysis. A key difficulty is that the transmission tree, which describes the transmission events between infected hosts, differs from the phylogenetic tree, which describes the ancestral relationships between pathogens sampled from these hosts. The trees differ both in timing of the internal nodes and in topology. These differences become more pronounced when a higher fraction of infected hosts is sampled. We show how the phylogenetic tree of sampled pathogens is related to the transmission tree of an outbreak of an infectious disease, by the within-host dynamics of pathogens. We provide a statistical framework to infer key epidemiological and mutational parameters by simultaneously estimating the phylogenetic tree and the transmission tree. We test the approach using simulations and illustrate its use on an outbreak of foot-and-mouth disease. The approach unifies existing methods in the emerging field of phylodynamics with transmission tree reconstruction methods that are used in infectious disease epidemiology.

  7. Variable number of tandem repeats and pulsed-field gel electrophoresis cluster analysis of enterohemorrhagic Escherichia coli serovar O157 strains.

    PubMed

    Yokoyama, Eiji; Uchimura, Masako

    2007-11-01

    Ninety-five enterohemorrhagic Escherichia coli serovar O157 strains, including 30 strains isolated from 13 intrafamily outbreaks and 14 strains isolated from 3 mass outbreaks, were studied by pulsed-field gel electrophoresis (PFGE) and variable number of tandem repeats (VNTR) typing, and the resulting data were subjected to cluster analysis. Cluster analysis of the VNTR typing data revealed that 57 (60.0%) of 95 strains, including all epidemiologically linked strains, formed clusters with at least 95% similarity. Cluster analysis of the PFGE patterns revealed that 67 (70.5%) of 95 strains, including all but 1 of the epidemiologically linked strains, formed clusters with 90% similarity. The number of epidemiologically unlinked strains forming clusters was significantly smaller by VNTR cluster analysis than by PFGE cluster analysis. The congruence value between PFGE and VNTR cluster analysis was low and did not show an obvious correlation. With two-step cluster analysis, the number of clustered epidemiologically unlinked strains by PFGE cluster analysis that were divided by subsequent VNTR cluster analysis was significantly higher than the number by VNTR cluster analysis that were divided by subsequent PFGE cluster analysis. These results indicate that VNTR cluster analysis is more efficient than PFGE cluster analysis as an epidemiological tool to trace the transmission of enterohemorrhagic E. coli O157.

  8. Global epidemiology of HIV infection in men who have sex with men

    PubMed Central

    Beyrer, Chris; Baral, Stefan D; van Griensven, Frits; Goodreau, Steven M; Chariyalertsak, Suwat; Wirtz, Andrea L; Brookmeyer, Ron

    2013-01-01

    Epidemics of HIV in men who have sex with men (MSM) continue to expand in most countries. We sought to understand the epidemiological drivers of the global epidemic in MSM and why it continues unabated. We did a comprehensive review of available data for HIV prevalence, incidence, risk factors, and the molecular epidemiology of HIV in MSM from 2007 to 2011, and modelled the dynamics of HIV transmission with an agent-based simulation. Our findings show that the high probability of transmission per act through receptive anal intercourse has a central role in explaining the disproportionate disease burden in MSM. HIV can be transmitted through large MSM networks at great speed. Molecular epidemiological data show substantial clustering of HIV infections in MSM networks, and higher rates of dual-variant and multiple-variant HIV infection in MSM than in heterosexual people in the same populations. Prevention strategies that lower biological transmission and acquisition risks, such as approaches based on antiretrovirals, offer promise for controlling the expanding epidemic in MSM, but their potential effectiveness is limited by structural factors that contribute to low health-seeking behaviours in populations of MSM in many parts of the world. PMID:22819660

  9. Experimental evidence of a pathogen invasion threshold

    PubMed Central

    Krkošek, Martin

    2018-01-01

    Host density thresholds to pathogen invasion separate regions of parameter space corresponding to endemic and disease-free states. The host density threshold is a central concept in theoretical epidemiology and a common target of human and wildlife disease control programmes, but there is mixed evidence supporting the existence of thresholds, especially in wildlife populations or for pathogens with complex transmission modes (e.g. environmental transmission). Here, we demonstrate the existence of a host density threshold for an environmentally transmitted pathogen by combining an epidemiological model with a microcosm experiment. Experimental epidemics consisted of replicate populations of naive crustacean zooplankton (Daphnia dentifera) hosts across a range of host densities (20–640 hosts l−1) that were exposed to an environmentally transmitted fungal pathogen (Metschnikowia bicuspidata). Epidemiological model simulations, parametrized independently of the experiment, qualitatively predicted experimental pathogen invasion thresholds. Variability in parameter estimates did not strongly influence outcomes, though systematic changes to key parameters have the potential to shift pathogen invasion thresholds. In summary, we provide one of the first clear experimental demonstrations of pathogen invasion thresholds in a replicated experimental system, and provide evidence that such thresholds may be predictable using independently constructed epidemiological models. PMID:29410876
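
    The notion of a host density threshold can be made concrete with a generic susceptible-infected model with environmental transmission (illustrative parameters, not the paper's parametrization), in which hosts are infected by free-living spores shed by infected hosts. Setting the basic reproduction number to one gives a critical host density below which an introduction fizzles:

```python
# Hypothetical SI model with environmental transmission: hosts S are
# infected by free-living spores Z shed by infected hosts I.
beta, sigma, d, m = 0.002, 5.0, 0.3, 1.0   # infection, shedding, host loss, spore decay
N_star = (d * m) / (beta * sigma)          # R0 = beta*sigma*N / (d*m) = 1 at N = N_star

def peak_infected(N, T=200.0, dt=0.01):
    """Euler-integrated epidemic after a pulse introduction of spores."""
    S, I, Z = N, 0.0, 1.0
    peak = 0.0
    for _ in range(int(T / dt)):
        dS = -beta * S * Z
        dI = beta * S * Z - d * I
        dZ = sigma * I - m * Z
        S += dt * dS; I += dt * dI; Z += dt * dZ
        peak = max(peak, I)
    return peak

low = peak_infected(0.5 * N_star)   # below threshold: introduction fizzles
high = peak_infected(4.0 * N_star)  # above threshold: pathogen invades
```

    The qualitative behaviour on either side of `N_star` is the analogue of the endemic versus disease-free regimes probed by the Daphnia microcosm experiment.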

  10. Incorporating Measurement Error from Modeled Air Pollution Exposures into Epidemiological Analyses.

    PubMed

    Samoli, Evangelia; Butland, Barbara K

    2017-12-01

    Outdoor air pollution exposures used in epidemiological studies are commonly predicted from spatiotemporal models incorporating limited measurements, temporal factors, geographic information system variables, and/or satellite data. Measurement error in these exposure estimates leads to imprecise estimation of health effects and their standard errors. We reviewed methods for measurement error correction that have been applied in epidemiological studies that use model-derived air pollution data. We identified seven cohort studies and one panel study that have employed measurement error correction methods. These methods included regression calibration, risk set regression calibration, regression calibration with instrumental variables, the simulation extrapolation approach (SIMEX), and methods based on the non-parametric or parametric bootstrap. Corrections resulted in small increases in the absolute magnitude of the health effect estimate and its standard error under most scenarios. Limited application of measurement error correction methods in air pollution studies may be attributed to the absence of exposure validation data and the methodological complexity of the proposed methods. Future epidemiological studies should consider in their design phase the requirements for the measurement error correction method to be applied later, while methodological advances are needed in the multi-pollutant setting.
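
    Of the listed corrections, regression calibration is the simplest to illustrate. In this hypothetical sketch the true exposure is known, so the classical attenuation and its correction can be computed directly; in practice the calibration slope would be estimated from a validation subsample.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 50_000
x = rng.normal(0.0, 1.0, n)            # true exposure
w = x + rng.normal(0.0, 1.0, n)        # error-prone modelled exposure
y = 0.5 * x + rng.normal(0.0, 1.0, n)  # health outcome, true effect 0.5

naive = np.polyfit(w, y, 1)[0]         # attenuated: ~0.5 * var(x)/var(w) = 0.25
lam = np.cov(w, x)[0, 1] / np.var(w)   # calibration slope (validation data in practice)
x_hat = lam * w                        # regression-calibrated exposure
corrected = np.polyfit(x_hat, y, 1)[0] # recovers ~0.5
```

    The naive slope is biased toward the null by the ratio of true-exposure variance to measured-exposure variance; regressing the outcome on the calibrated exposure removes that attenuation, at the cost of a larger standard error.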

  11. Performance of a two-leaf light use efficiency model for mapping gross primary productivity against remotely sensed sun-induced chlorophyll fluorescence data.

    PubMed

    Zan, Mei; Zhou, Yanlian; Ju, Weimin; Zhang, Yongguang; Zhang, Leiming; Liu, Yibo

    2018-02-01

    Estimating terrestrial gross primary production (GPP) is an important task in studying the carbon cycle. In this study, the ability of a two-leaf light use efficiency model to simulate regional GPP in China was validated using satellite sun-induced chlorophyll fluorescence (SIF) data from the Global Ozone Monitoring Experiment-2. The two-leaf light use efficiency model was used to estimate daily GPP in China's terrestrial ecosystems at 500-m resolution for the period from 2007 to 2014. GPP simulated with the model was resampled to a spatial resolution of 0.5° and then compared with SIF. During the study period, SIF and simulated GPP exhibited similar spatial and temporal patterns in China. The correlation coefficient between SIF and simulated monthly GPP was significant (p<0.05, n=96) in 88.9% of vegetated areas in China (average value 0.78) and varied among vegetation types. The interannual variations in monthly SIF and simulated GPP were similar in spring and autumn in most vegetated regions, but dissimilar in winter and summer. The spatial variability of SIF and simulated GPP was similar in spring, summer, and autumn. The proportion of the spatial variation in simulated annual GPP explained by SIF ranged from 0.76 (2011) to 0.80 (2013) during the study period.
Overall, the two-leaf light use efficiency model was capable of capturing spatial and temporal variations in gross primary production in China. However, the model needs further improvement to better simulate gross primary production in summer. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Establishment and operation of a biorepository for molecular epidemiologic studies in Costa Rica.

    PubMed

    Cortés, Bernal; Schiffman, Mark; Herrero, Rolando; Hildesheim, Allan; Jiménez, Silvia; Shea, Katheryn; González, Paula; Porras, Carolina; Fallas, Greivin; Rodríguez, Ana Cecilia

    2010-04-01

    The Proyecto Epidemiológico Guanacaste (PEG) has conducted several large studies related to human papillomavirus (HPV) and cervical cancer in Guanacaste, Costa Rica in a long-standing collaboration with the U.S. National Cancer Institute. To improve molecular epidemiology efforts and save costs, we have gradually transferred technology to Costa Rica, culminating in state-of-the-art laboratories and a biorepository to support a phase III clinical trial investigating the efficacy of HPV 16/18 vaccine. Here, we describe the rationale and lessons learned in transferring molecular epidemiologic and biorepository technology to a developing country. At the outset of the PEG in the early 1990s, we shipped all specimens to repositories and laboratories in the United States, which created multiple problems. Since then, by intensive personal interactions between experts from the United States and Costa Rica, we have successfully transferred liquid-based cytology, HPV DNA testing and serology, chlamydia and gonorrhea testing, PCR-safe tissue processing, and viable cryopreservation. To accommodate the vaccine trial, a state-of-the-art repository opened in mid-2004. Approximately 15,000 to 50,000 samples are housed in the repository on any given day, and >500,000 specimens have been shipped, many using a custom-made dry shipper that permits exporting >20,000 specimens at a time. Quality control of shipments received by the NCI biorepository has revealed an error rate of <0.2%. Recently, the PEG repository has incorporated other activities; for example, large-scale aliquotting and long-term, cost-efficient storage of frozen specimens returned from the United States. Using Internet-based specimen tracking software has proven to be efficient even across borders. For long-standing collaborations, it makes sense to transfer the molecular epidemiology expertise toward the source of specimens. 
The successes of the PEG molecular epidemiology laboratories and biorepository prove that the physical and informatics infrastructures of a modern biorepository can be transferred to a resource-limited and weather-challenged region. Technology transfer is an important and feasible goal of international collaborations.

  13. Bayesian Propensity Score Analysis: Simulation and Case Study

    ERIC Educational Resources Information Center

    Kaplan, David; Chen, Cassie J. S.

    2011-01-01

    Propensity score analysis (PSA) has been used in a variety of settings, such as education, epidemiology, and sociology. Most typically, propensity score analysis has been implemented within the conventional frequentist perspective of statistics. This perspective, as is well known, does not account for uncertainty in either the parameters of the…

  14. Effects of Exposure Measurement on Particle Matter Epidemiology: A Simulation Using Data from a Panel Study in Baltimore MD

    EPA Science Inventory

    Ascertaining the true risk associated with exposure to particulate matter (PM) is difficult, given the fact that pollutant components are frequently correlated with each other and with other gaseous pollutants; relationships between ambient concentrations and personal exposures a...

  15. Simulating malaria transmission in the current and future climate of West Africa

    NASA Astrophysics Data System (ADS)

    Yamana, T. K.; Bomblies, A.; Eltahir, E. A. B.

    2015-12-01

    Malaria transmission in West Africa is closely tied to climate, as rain-fed water pools provide breeding habitat for the anopheles mosquito vector, and temperature affects the mosquito's ability to spread disease. We present results of a highly detailed, spatially explicit mechanistic modelling study exploring the relationships between the environment and malaria in the current and future climate of West Africa. A mechanistic model of human immunity was incorporated into an existing agent-based model of malaria transmission, allowing us to move beyond entomological measures such as mosquito density and vectorial capacity to analyzing the prevalence of the malaria parasite within human populations. The result is a novel modelling tool that mechanistically simulates all of the key processes linking environment to malaria transmission. Simulations were conducted across climate zones in West Africa, linking temperature and rainfall to entomological and epidemiological variables with a focus on nonlinearities due to threshold effects and interannual variability. Comparisons to observations from the region confirmed that the model provides a reasonable representation of the entomological and epidemiological conditions in this region. We used the predictions of future climate from the most credible CMIP5 climate models to predict the change in frequency and severity of malaria epidemics in West Africa as a result of climate change.

  16. Simulation of emotional contagion using modified SIR model: A cellular automaton approach

    NASA Astrophysics Data System (ADS)

    Fu, Libi; Song, Weiguo; Lv, Wei; Lo, Siuming

    2014-07-01

    Emotion plays an important role in the decision-making of individuals in some emergency situations. The contagion of emotion may induce either normal or abnormal consolidated crowd behavior. This paper aims to simulate the dynamics of emotional contagion among crowds by modifying the epidemiological SIR model to a cellular automaton approach. This new cellular automaton model, entitled the “CA-SIRS model”, captures the dynamic process “susceptible-infected-recovered-susceptible”, which is based on SIRS contagion in epidemiological theory. Moreover, in this new model, the process is integrated with individual movement. The simulation results of this model show that multiple waves and dynamical stability around a mean value will appear during emotion spreading. It was found that the proportion of initial infected individuals had little influence on the final stable proportion of infected population in a given system, and that infection frequency increased with an increase in the average crowd density. Our results further suggest that individual movement accelerates the spread speed of emotion and increases the stable proportion of infected population. Furthermore, decreasing the duration of an infection and the probability of reinfection can markedly reduce the number of infected individuals. It is hoped that this study will be helpful in crowd management and evacuation organization.
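
    A minimal grid-based sketch of the SIRS contagion component (without the model's individual movement, and with made-up probabilities) can be written as:

```python
import numpy as np

rng = np.random.default_rng(2)
S, I, R = 0, 1, 2
n = 50
grid = np.full((n, n), S)
grid[rng.random((n, n)) < 0.02] = I       # a few initially "infected" individuals

p_infect, p_recover, p_resusc = 0.3, 0.1, 0.05   # illustrative probabilities
history = []
for step in range(300):
    infected = grid == I
    # count infected von Neumann neighbours, with toroidal boundaries
    nbrs = sum(np.roll(infected, s, axis=a) for s in (-1, 1) for a in (0, 1))
    catch   = (grid == S) & (rng.random((n, n)) < 1 - (1 - p_infect) ** nbrs)
    recover = infected & (rng.random((n, n)) < p_recover)
    resusc  = (grid == R) & (rng.random((n, n)) < p_resusc)
    grid[catch], grid[recover], grid[resusc] = I, R, S
    history.append((grid == I).mean())
```

    Consistent with the abstract, the infected fraction settles into fluctuation around a stable mean rather than dying out, and lowering `p_resusc` (the reinfection probability) markedly reduces it.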

  17. Simulating the Risk of Liver Fluke Infection using a Mechanistic Hydro-epidemiological Model

    NASA Astrophysics Data System (ADS)

    Beltrame, Ludovica; Dunne, Toby; Rose, Hannah; Walker, Josephine; Morgan, Eric; Vickerman, Peter; Wagener, Thorsten

    2016-04-01

    Liver Fluke (Fasciola hepatica) is a common parasite found in livestock and responsible for considerable economic losses throughout the world. Risk of infection is strongly influenced by climatic and hydrological conditions, which characterise the host environment for parasite development and transmission. Despite on-going control efforts, increases in fluke outbreaks have been reported in recent years in the UK, and have been often attributed to climate change. Currently used fluke risk models are based on empirical relationships derived between historical climate and incidence data. However, hydro-climate conditions are becoming increasingly non-stationary due to climate change and direct anthropogenic impacts such as land use change, making empirical models unsuitable for simulating future risk. In this study we introduce a mechanistic hydro-epidemiological model for Liver Fluke, which explicitly simulates habitat suitability for disease development in space and time, representing the parasite life cycle in connection with key environmental conditions. The model is used to assess patterns of Liver Fluke risk for two catchments in the UK under current and potential future climate conditions. Comparisons are made with a widely used empirical model employing different datasets, including data from regional veterinary laboratories. Results suggest that mechanistic models can achieve adequate predictive ability and support adaptive fluke control strategies under climate change scenarios.

  18. The Consequences of Early Gestational Ozone Exposure On Uterine Arterial Flow And Placental Efficiency In Long-Evans Rats

    EPA Science Inventory

    Exposure to air pollutants during gestation has been epidemiologically linked to adverse pregnancy outcomes and impaired fetal growth. Despite this, limited experimental evidence exists on the toxicological impacts of ozone on pregnancy and fetal development. Pregnant Long-Evans...

  20. A novel approach to simulate gene-environment interactions in complex diseases.

    PubMed

    Amato, Roberto; Pinelli, Michele; D'Andrea, Daniel; Miele, Gennaro; Nicodemi, Mario; Raiconi, Giancarlo; Cocozza, Sergio

    2010-01-05

    Complex diseases are multifactorial traits caused by both genetic and environmental factors. They represent the major part of human diseases and include those with the largest prevalence and mortality (cancer, heart disease, obesity, etc.). Despite the large amount of information that has been collected about both genetic and environmental risk factors, there are few examples of studies on their interactions in the epidemiological literature. One reason may be incomplete knowledge of the power of the statistical methods designed to search for risk factors and their interactions in these data sets. An improvement in this direction would lead to a better understanding and description of gene-environment interactions. To this aim, a possible strategy is to challenge the different statistical methods against data sets where the underlying phenomenon is completely known and fully controllable, for example, simulated ones. We present a mathematical approach that models gene-environment interactions. With this method it is possible to generate simulated populations having gene-environment interactions of any form, involving any number of genetic and environmental factors, and allowing non-linear interactions such as epistasis. In particular, we implemented a simple version of this model in the Gene-Environment iNteraction Simulator (GENS), a tool designed to simulate case-control data sets where a one-gene-one-environment interaction influences the disease risk. The main aim has been to allow the input of population characteristics using standard epidemiological measures and to implement constraints that make the simulator's behaviour biologically meaningful. With the multi-logistic model implemented in GENS it is possible to simulate case-control samples of complex diseases where gene-environment interactions influence the disease risk. The user has full control of the main characteristics of the simulated population, and a Monte Carlo process allows random variability.
A knowledge-based approach reduces the complexity of the mathematical model by using reasonable biological constraints and makes the simulation more understandable in biological terms. Simulated data sets can be used for the assessment of novel statistical methods or for the evaluation of the statistical power when designing a study.
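
    The multi-logistic construction can be sketched as follows; the coefficients, allele frequency, and exposure prevalence here are invented for illustration and are not GENS defaults:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000
g = rng.binomial(2, 0.3, N)                 # genotype: 0/1/2 risk alleles (HWE, MAF 0.3)
e = rng.binomial(1, 0.4, N)                 # binary environmental exposure

# Disease risk from a logistic model with a gene-environment interaction term
b0, b_g, b_e, b_ge = -3.0, 0.2, 0.3, 0.5    # illustrative log-odds, not from the paper
logit = b0 + b_g * g + b_e * e + b_ge * g * e
disease = rng.random(N) < 1 / (1 + np.exp(-logit))

# Draw a balanced case-control sample from the simulated population
idx = np.concatenate([np.flatnonzero(disease)[:2000], np.flatnonzero(~disease)[:2000]])
sample_g, sample_e, sample_y = g[idx], e[idx], disease[idx]

def odds_ratio(y, x):
    """Crude 2x2 odds ratio of outcome y for exposure x > 0."""
    a = np.sum(y & (x > 0)); b = np.sum(y & (x == 0))
    c = np.sum(~y & (x > 0)); d = np.sum(~y & (x == 0))
    return (a * d) / (b * c)

or_g0 = odds_ratio(sample_y[sample_g == 0], sample_e[sample_g == 0])
or_g2 = odds_ratio(sample_y[sample_g == 2], sample_e[sample_g == 2])
```

    Because the interaction coefficient is positive, the exposure odds ratio grows with the number of risk alleles; this genotype-dependent exposure effect is exactly the signature that a statistical method under evaluation should recover from the simulated case-control data.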

  1. Mathematical modelling of vector-borne diseases and insecticide resistance evolution.

    PubMed

    Gabriel Kuniyoshi, Maria Laura; Pio Dos Santos, Fernando Luiz

    2017-01-01

    Vector-borne diseases are important public health issues and, consequently, in silico models that simulate them can be useful. The susceptible-infected-recovered (SIR) model simulates the population dynamics of an epidemic and can be easily adapted to vector-borne diseases, whereas the Hardy-Weinberg model simulates allele frequencies and can be used to study insecticide resistance evolution. The aim of the present study is to develop a coupled system that unifies both models, therefore enabling the analysis of the effects of vector population genetics on the population dynamics of an epidemic. Our model consists of a system of ordinary differential equations. We considered the populations of susceptible, infected and recovered humans, as well as susceptible and infected vectors. Concerning these vectors, we considered a pair of alleles, with complete dominance interaction, that determined the rate of mortality induced by insecticides. Thus, we were able to separate the vectors according to genotype. We performed three numerical simulations of the model. In simulation one, both alleles conferred the same mortality rate values, so there was no resistant strain. In simulations two and three, the recessive and dominant alleles, respectively, conferred a lower mortality. Our numerical results show that the genetic composition of the vector population affects the dynamics of human diseases. We found that the absolute number of vectors and the proportion of infected vectors are smaller when there is no resistant strain, whilst the ratio of infected people is larger in the presence of insecticide-resistant vectors. The dynamics observed for infected humans in all simulations have a very similar shape to real epidemiological data. The population genetics of vectors can affect epidemiological dynamics, and the presence of insecticide-resistant strains can increase the number of infected people.
Based on the present results, the model is a basis for development of other models and for investigating population dynamics.
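
    A compact sketch of such a coupled system is given below, with Euler integration, illustrative parameters, and phenotype-proportional births (ignoring genotype mixing); none of the numbers are taken from the paper:

```python
import numpy as np

def attack_rate(q, T=400.0, dt=0.05):
    """Final fraction of ever-infected humans in an SIR population coupled to
    a vector population with two insecticide-mortality phenotypes."""
    beta_hv, beta_vh, gamma, birth = 0.5, 0.5, 0.1, 0.5
    mu = np.array([0.2, 0.45])           # mortality: [resistant, susceptible] phenotype
    p_res = q * q + 2 * q * (1 - q)      # resistant phenotype (dominant resistant allele)
    S, I, R = 0.99, 0.01, 0.0
    V = np.array([p_res, 1 - p_res]) * 0.6   # uninfected vectors by phenotype
    W = np.zeros(2)                           # infected vectors by phenotype
    for _ in range(int(T / dt)):
        N = V + W
        force = beta_hv * W.sum()            # vector-to-human force of infection
        dS = -force * S
        dI = force * S - gamma * I
        dV = birth * N * (1 - N.sum()) - beta_vh * V * I - mu * V
        dW = beta_vh * V * I - mu * W
        S += dt * dS; R += dt * gamma * I; I += dt * dI
        V += dt * dV; W += dt * dW
    return R

r_resistant = attack_rate(0.4)   # resistant allele present
r_none = attack_rate(0.0)        # no resistance
```

    In line with the reported result, the final fraction of infected humans is larger when the resistant allele is present, because resistant vectors survive insecticide pressure long enough to sustain transmission.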

  2. A Computationally Efficient Hypothesis Testing Method for Epistasis Analysis using Multifactor Dimensionality Reduction

    PubMed Central

    Pattin, Kristine A.; White, Bill C.; Barney, Nate; Gui, Jiang; Nelson, Heather H.; Kelsey, Karl R.; Andrew, Angeline S.; Karagas, Margaret R.; Moore, Jason H.

    2008-01-01

    Multifactor dimensionality reduction (MDR) was developed as a nonparametric and model-free data mining method for detecting, characterizing, and interpreting epistasis in the absence of significant main effects in genetic and epidemiologic studies of complex traits such as disease susceptibility. The goal of MDR is to change the representation of the data using a constructive induction algorithm to make nonadditive interactions easier to detect using any classification method such as naïve Bayes or logistic regression. Traditionally, MDR-constructed variables have been evaluated with a naïve Bayes classifier combined with 10-fold cross-validation to obtain an estimate of the predictive accuracy or generalizability of epistasis models, and permutation testing has been used to statistically evaluate the significance of models obtained through MDR. The advantage of permutation testing is that it controls for false positives due to multiple testing. The disadvantage is that permutation testing is computationally expensive. This is an important issue that arises in the context of detecting epistasis on a genome-wide scale. The goal of the present study was to develop and evaluate several alternatives to large-scale permutation testing for assessing the statistical significance of MDR models. Using data simulated from 70 different epistasis models, we compared the power and type I error rate of MDR using a 1000-fold permutation test with hypothesis testing using an extreme value distribution (EVD). We find that this new hypothesis testing method provides a reasonable alternative to the computationally expensive 1000-fold permutation test and is 50 times faster. We then demonstrate this new method by applying it to a genetic epidemiology study of bladder cancer susceptibility that was previously analyzed using MDR and assessed using a 1000-fold permutation test. PMID:18671250
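
    The idea of replacing a large permutation test with an extreme-value fit can be sketched generically. In this toy version the MDR statistic is replaced by a simpler stand-in (best single-marker accuracy, which is itself a maximum and hence extreme-value-like under the null), and a two-parameter Gumbel distribution is fitted to a small permutation sample by the method of moments; the paper's EVD and fitting procedure may differ.

```python
import numpy as np

rng = np.random.default_rng(4)

def best_model_stat(y, X):
    """Best classification accuracy over single markers -- a simplified
    stand-in for the testing accuracy of the best MDR model."""
    best = 0.0
    for j in range(X.shape[1]):
        correct = 0
        for g in (0, 1, 2):
            cell = X[:, j] == g
            if cell.any():
                # majority case/control call within each genotype cell
                correct += max(int(y[cell].sum()), int((~y[cell]).sum()))
        best = max(best, correct / len(y))
    return best

n, m = 300, 50
y = rng.random(n) < 0.5                    # null data: labels independent of genotypes
X = rng.integers(0, 3, size=(n, m))
observed = best_model_stat(y, X)

# A small permutation sample plus a Gumbel (extreme value) fit by the method
# of moments, in place of a 1000-fold permutation test
perm = np.array([best_model_stat(rng.permutation(y), X) for _ in range(30)])
scale = perm.std(ddof=1) * np.sqrt(6.0) / np.pi
loc = perm.mean() - 0.5772156649 * scale   # Euler-Mascheroni constant
p_evd = 1.0 - np.exp(-np.exp(-(observed - loc) / scale))
```

    The fitted tail gives a p-value from 30 permutations instead of 1000, which is the source of the speed-up reported in the abstract.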

  3. Efficient scatter model for simulation of ultrasound images from computed tomography data

    NASA Astrophysics Data System (ADS)

    D'Amato, J. P.; Lo Vercio, L.; Rubi, P.; Fernandez Vera, E.; Barbuzza, R.; Del Fresno, M.; Larrabide, I.

    2015-12-01

    Background and motivation: Real-time ultrasound simulation refers to the process of computationally creating fully synthetic ultrasound images instantly. Due to the high value of specialized low-cost training for healthcare professionals, there is growing interest in this technology and in the development of high-fidelity systems that simulate the acquisition of echographic images. The objective is to create an efficient and reproducible simulator that can run on either notebooks or desktops using low-cost devices. Materials and methods: We present an interactive ultrasound simulator based on CT data. The simulator is based on ray-casting and provides real-time interaction capabilities. The simulation of scattering that is coherent with the transducer position in real time is also introduced. Such noise is produced using a simplified model of multiplicative noise and convolution with point spread functions (PSFs) tailored for this purpose. Results: The computational efficiency of scatter-map generation was improved, allowing more efficient simulation of coherent scattering in the synthetic echographic images while providing highly realistic results. We describe quality and performance metrics used to validate these results, achieving a performance of up to 55 fps. Conclusion: The proposed technique for real-time scattering modeling provides realistic yet computationally efficient scatter distributions. The error between the original image and the simulated scattering image was compared for the proposed method and the state of the art, showing negligible differences in its distribution.
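
    The scatter model described (multiplicative noise followed by convolution with a point spread function) can be sketched as below; the echogenicity map, Rayleigh speckle, and Gaussian-cosine PSF are all illustrative stand-ins rather than the paper's calibrated choices:

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-in "echogenicity" map, as would be derived from CT intensities
tissue = np.full((128, 128), 0.5)
tissue[40:90, 40:90] = 2.0                 # a strongly reflecting structure

# Multiplicative scatter: Rayleigh-distributed speckle modulates the map
speckle = rng.rayleigh(scale=1.0, size=tissue.shape)
scattered = tissue * speckle

# Small PSF: Gaussian envelope with an axial (y) oscillation
y, x = np.mgrid[-3:4, -3:4]
psf = np.exp(-(x**2 / 3.0 + y**2 / 6.0)) * np.cos(np.pi * y / 2.0)

# FFT-based convolution (kernel zero-padded to the image size)
pad = np.zeros_like(scattered)
pad[:psf.shape[0], :psf.shape[1]] = psf
rf = np.real(np.fft.ifft2(np.fft.fft2(scattered) * np.fft.fft2(pad)))
envelope = np.abs(rf)                      # crude envelope detection
```

    The reflecting structure remains visibly brighter in the speckled envelope image, while the PSF shapes the speckle texture; making the PSF depend on the transducer position is what keeps the noise coherent with probe motion, as the paper describes.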

  4. Choice of time-scale in Cox's model analysis of epidemiologic cohort data: a simulation study.

    PubMed

    Thiébaut, Anne C M; Bénichou, Jacques

    2004-12-30

    Cox's regression model is widely used for assessing associations between potential risk factors and disease occurrence in epidemiologic cohort studies. Although age is often a strong determinant of disease risk, authors have frequently used time-on-study instead of age as the time-scale, as in clinical trials. Unless the baseline hazard is an exponential function of age, this approach can yield different estimates of relative hazards from those obtained using age as the time-scale, even when age is adjusted for. We performed a simulation study in order to investigate the existence and magnitude of bias for different degrees of association between age and the covariate of interest. Age to disease onset was generated from exponential, Weibull or piecewise Weibull distributions, and both fixed and time-dependent dichotomous covariates were considered. We observed no bias upon using age as the time-scale. Upon using time-on-study, we verified the absence of bias for exponentially distributed age to disease onset. For non-exponential distributions, we found that bias could occur even when the covariate of interest was independent of age. It could be severe in case of substantial association with age, especially with time-dependent covariates. These findings were illustrated on data from a cohort of 84,329 French women followed prospectively for breast cancer occurrence. In view of our results, we strongly recommend not using time-on-study as the time-scale for analysing epidemiologic cohort data. Copyright © 2004 John Wiley & Sons, Ltd.

  5. InterSpread Plus: a spatial and stochastic simulation model of disease in animal populations.

    PubMed

    Stevenson, M A; Sanson, R L; Stern, M W; O'Leary, B D; Sujau, M; Moles-Benfell, N; Morris, R S

    2013-04-01

    We describe the spatially explicit, stochastic simulation model of disease spread, InterSpread Plus, in terms of its epidemiological framework, operation, and mode of use. The input data required by the model, the method for simulating contact and infection spread, and methods for simulating disease control measures are described. Data and parameters that are essential for disease simulation modelling using InterSpread Plus are distinguished from those that are non-essential, and it is suggested that a rational approach to simulating disease epidemics using this tool is to start with core data and parameters, adding additional layers of complexity if and when the specific requirements of the simulation exercise require it. We recommend that simulation models of disease are best developed as part of epidemic contingency planning so decision makers are familiar with model outputs and assumptions and are well-positioned to evaluate their strengths and weaknesses to make informed decisions in times of crisis. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship: Use of Administrative and Surveillance Databases.

    PubMed

    Drees, Marci; Gerber, Jeffrey S; Morgan, Daniel J; Lee, Grace M

    2016-11-01

    Administrative and surveillance data are used frequently in healthcare epidemiology and antimicrobial stewardship (HE&AS) research because of their wide availability and efficiency. However, data quality issues exist, requiring careful consideration and potential validation of data. This methods paper presents key considerations for using administrative and surveillance data in HE&AS, including types of data available and potential use, data limitations, and the importance of validation. After discussing these issues, we review examples of HE&AS research using administrative data with a focus on scenarios when their use may be advantageous. A checklist is provided to help aid study development in HE&AS using administrative data. Infect Control Hosp Epidemiol 2016;1-10.

  7. [Effectiveness of the Federal Inspectorate for the Protection of Consumer Rights and Human Welfare in the Novosibirsk Region in 2007 to 2009].

    PubMed

    Mikheev, V N; Ivanova, L K; Iagudin, B I; Turbinskiĭ, V V

    2010-01-01

    A system for monitoring and analyzing the effectiveness and efficiency of the Board of the Federal Inspectorate for the Protection of Consumer Rights and Human Welfare in the Novosibirsk Region was introduced into its activities to assess the provision of sanitary and epidemiological well-being to the Novosibirsk Region's population in 2007-2009. The introduction of monitoring was found to increase the effectiveness of surveillance and of budgetary spending, by predicting the effectiveness of activities and guiding the choice of priority lines of work, and by increasing the quality of the budgetary services rendered in providing sanitary and epidemiological well-being to the population.

  8. Improving membrane protein expression by optimizing integration efficiency

    PubMed Central

    2017-01-01

    The heterologous overexpression of integral membrane proteins in Escherichia coli often yields insufficient quantities of purifiable protein for applications of interest. The current study leverages a recently demonstrated link between co-translational membrane integration efficiency and protein expression levels to predict protein sequence modifications that improve expression. Membrane integration efficiencies, obtained using a coarse-grained simulation approach, robustly predicted effects on expression of the integral membrane protein TatC for a set of 140 sequence modifications, including loop-swap chimeras and single-residue mutations distributed throughout the protein sequence. Mutations that improve simulated integration efficiency were 4-fold enriched with respect to improved experimentally observed expression levels. Furthermore, the effects of double mutations on both simulated integration efficiency and experimentally observed expression levels were cumulative and largely independent, suggesting that multiple mutations can be introduced to yield higher levels of purifiable protein. This work provides a foundation for a general method for the rational overexpression of integral membrane proteins based on computationally simulated membrane integration efficiencies. PMID:28918393

  9. Efficient simulation of intensity profile of light through subpixel-matched lenticular lens array for two- and four-view auto-stereoscopic liquid-crystal display.

    PubMed

    Chang, Yia-Chung; Tang, Li-Chuan; Yin, Chun-Yi

    2013-01-01

    Both an analytical formula and an efficient numerical method for simulation of the accumulated intensity profile of light that is refracted through a lenticular lens array placed on top of a liquid-crystal display (LCD) are presented. The influence of light refracted through adjacent lenses is examined in the two-view and four-view systems. Our simulation results are in good agreement with those obtained with the commercial software ASAP, but our method is much more efficient. The proposed method allows one to adjust the design parameters and simulate the performance of a subpixel-matched auto-stereoscopic LCD more efficiently and easily.

  10. Information content of contact-pattern representations and predictability of epidemic outbreaks

    PubMed Central

    Holme, Petter

    2015-01-01

    To understand the contact patterns of a population—who is in contact with whom, and when the contacts happen—is crucial for modeling outbreaks of infectious disease. Traditional theoretical epidemiology assumes that any individual can meet any other with equal probability. A more modern approach, network epidemiology, assumes people are connected into a static network over which the disease spreads. Newer yet, temporal network epidemiology includes the timing of contacts in the representation. In this paper, we investigate the effect of these successive inclusions of more information. Using empirical proximity data, we study both outbreak sizes from unknown sources, and from known states of ongoing outbreaks. In the first case, there are large differences going from a fully mixed simulation to a network, and from a network to a temporal network. In the second case, differences are smaller. We interpret these observations in terms of the temporal network structure of the data sets. For example, a fast overturn of nodes and links seems to make the temporal information more important. PMID:26403504
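
Why timing matters can be seen in a minimal susceptible-infectious process over a time-stamped contact list (a toy sketch, not the paper's simulation code): transmission chains must respect the order in which contacts occur.

```python
import numpy as np

def si_outbreak(contacts, n_nodes, source=0, p=0.5, seed=1):
    """Run a susceptible-infectious process over a time-ordered list of
    (t, u, v) contacts; each contact between an infectious and a
    susceptible node transmits with probability p. Returns the final
    number of infected nodes."""
    rng = np.random.default_rng(seed)
    infected = np.zeros(n_nodes, bool)
    infected[source] = True
    for t, u, v in sorted(contacts):
        if infected[u] != infected[v] and rng.random() < p:
            infected[u] = infected[v] = True
    return int(infected.sum())

# A chain 0->1->2->3 only forms if the contacts happen in that order.
contacts = [(0, 0, 1), (1, 1, 2), (2, 2, 3)]
assert si_outbreak(contacts, 4, p=1.0) == 4
# Reversed timestamps break the chain: the (0, 1) contact happens last.
assert si_outbreak([(2, 0, 1), (1, 1, 2), (0, 2, 3)], 4, p=1.0) == 2
```

A static-network version of the same data would ignore the timestamps and allow both orderings to produce the full outbreak, which is one source of the differences the paper measures.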

  11. Spatially explicit modelling of cholera epidemics

    NASA Astrophysics Data System (ADS)

    Finger, F.; Bertuzzo, E.; Mari, L.; Knox, A. C.; Gatto, M.; Rinaldo, A.

    2013-12-01

    Epidemiological models can provide crucial understanding about the dynamics of infectious diseases. Possible applications range from real-time forecasting and allocation of health care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. We apply a spatially explicit model to the cholera epidemic that struck Haiti in October 2010 and is still ongoing. The dynamics of susceptibles as well as symptomatic and asymptomatic infectives are modelled at the scale of local human communities. Dissemination of Vibrio cholerae through hydrological transport and human mobility along the road network is explicitly taken into account, as well as the effect of rainfall as a driver of increasing disease incidence. The model is calibrated using a dataset of reported cholera cases. We further model the long term impact of several types of interventions on the disease dynamics by varying parameters appropriately. Key epidemiological mechanisms and parameters which affect the efficiency of treatments such as antibiotics are identified. Our results lead to conclusions about the influence of different intervention strategies on the overall epidemiological dynamics.

  12. A review of the global prevalence, molecular epidemiology and economics of cystic echinococcosis in production animals.

    PubMed

    Cardona, Guillermo A; Carmena, David

    2013-02-18

    Cystic echinococcosis (CE) is an important and widespread zoonotic infection caused by the larval stages of taeniid cestodes of the genus Echinococcus. The disease represents a serious animal health concern in many rural areas of the world, causing important economic losses derived from decreased productivity and viscera condemnation in livestock species. In this review we aim to provide a comprehensive overview of recent research progress in the epidemiology of CE in production animals from a global perspective. Particular attention has been paid to the discussion of the extent and significance of recent molecular epidemiologic data. The financial burden associated with CE on the livestock industry has also been addressed. Data presented are expected to improve our current understanding of the parasite's geographical distribution, transmission, host range, immunogenicity, pathogenesis, and genotype frequencies. This information should also be valuable for the design and implementation of more efficient control strategies against CE. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Barrett's oesophagus: epidemiology, cancer risk and implications for management.

    PubMed

    de Jonge, Pieter Jan F; van Blankenstein, Mark; Grady, William M; Kuipers, Ernst J

    2014-01-01

    Although endoscopic surveillance of patients with Barrett's oesophagus has been widely implemented, its effectiveness is debateable. The recently reported low annual oesophageal adenocarcinoma risk in population studies, the failure to identify most Barrett's patients at risk of disease progression, the poor adherence to surveillance and biopsy protocols, and the significant risk of misclassification of dysplasia all tend to undermine the effectiveness of current management, in particular, endoscopic surveillance programmes, to prevent or improve the outcomes of patients with oesophageal adenocarcinoma. The ongoing increase in incidence of Barrett's oesophagus and consequent growth of the surveillance population, together with the associated discomfort and costs of endoscopic surveillance, demand improved techniques for accurately determining individual risk of oesophageal adenocarcinoma. More accurate techniques are needed to run efficient surveillance programmes in the coming decades. In this review, we will discuss the current knowledge on the epidemiology of Barrett's oesophagus, and the challenging epidemiological dilemmas that need to be addressed when assessing the current screening and surveillance strategies.

  14. Role of data warehousing in healthcare epidemiology.

    PubMed

    Wyllie, D; Davies, J

    2015-04-01

    Electronic storage of healthcare data, including individual-level risk factors for both infectious and other diseases, is increasing. These data can be integrated at hospital, regional and national levels. Data sources that contain risk factor and outcome information for a wide range of conditions offer the potential for efficient epidemiological analysis of multiple diseases. Opportunities may also arise for monitoring healthcare processes. Integrating diverse data sources presents epidemiological, practical, and ethical challenges. For example, diagnostic criteria, outcome definitions, and ascertainment methods may differ across the data sources. Data volumes may be very large, requiring sophisticated computing technology. Given the large populations involved, perhaps the most challenging aspect is how informed consent can be obtained for the development of integrated databases, particularly when it is not easy to demonstrate their potential. In this article, we discuss some of the ups and downs of recent projects as well as the potential of data warehousing for antimicrobial resistance monitoring. Copyright © 2015. Published by Elsevier Ltd.

  15. Implementing Parquet equations using HPX

    NASA Astrophysics Data System (ADS)

    Kellar, Samuel; Wagle, Bibek; Yang, Shuxiang; Tam, Ka-Ming; Kaiser, Hartmut; Moreno, Juana; Jarrell, Mark

    A new C++ runtime system (HPX) enables simulations of complex systems to run more efficiently on parallel and heterogeneous systems. This increased efficiency allows for solutions to larger simulations of the parquet approximation for a system with impurities. The relevance of the parquet equations depends upon the ability to solve systems which require long runs and large amounts of memory. These limitations, in addition to numerical complications arising from the stability of the solutions, necessitate running on large distributed systems. As computational resources trend towards the exascale and the limitations they impose vanish, the efficiency of large-scale simulations becomes a focus. HPX facilitates efficient simulations through intelligent overlapping of computation and communication. Simulations such as the parquet equations, which require the transfer of large amounts of data, should benefit from HPX implementations. Supported by the NSF EPSCoR Cooperative Agreement No. EPS-1003897 with additional support from the Louisiana Board of Regents.

  16. A Fixed-Wing Aircraft Simulation Tool for Improving the efficiency of DoD Acquisition

    DTIC Science & Technology

    2015-10-05

    Report period: Oct 2008-Sep 2015. Authors: Scott A. Morton and David R... The report describes Kestrel, a multi-disciplinary fixed-wing virtual aircraft simulation tool incorporating aerodynamics, structural dynamics, kinematics, and kinetics, developed alongside CREATE™-AV Helios [12-14], a high-fidelity rotary-wing vehicle simulation tool, and CREATE™-AV DaVinci [15-16], a conceptual through...

  17. Development of a resource modelling tool to support decision makers in pandemic influenza preparedness: The AsiaFluCap Simulator.

    PubMed

    Stein, Mart Lambertus; Rudge, James W; Coker, Richard; van der Weijden, Charlie; Krumkamp, Ralf; Hanvoravongchai, Piya; Chavez, Irwin; Putthasri, Weerasak; Phommasack, Bounlay; Adisasmito, Wiku; Touch, Sok; Sat, Le Minh; Hsu, Yu-Chen; Kretzschmar, Mirjam; Timen, Aura

    2012-10-12

    Health care planning for pandemic influenza is a challenging task which requires predictive models by which the impact of different response strategies can be evaluated. However, current preparedness plans and simulation exercises, as well as freely available simulation models previously made for policy makers, do not explicitly address the availability of health care resources or determine the impact of shortages on public health. Nevertheless, whether health systems can feasibly implement the response measures or interventions described in plans and trained in exercises depends on the available resource capacity. As part of the AsiaFluCap project, we developed a comprehensive and flexible resource modelling tool to support public health officials in understanding and preparing for surges in resource demand during future pandemics. The AsiaFluCap Simulator is a combination of a resource model containing 28 health care resources and an epidemiological model. The tool was built in MS Excel© and contains a user-friendly interface which allows users to select mild or severe pandemic scenarios, change resource parameters and run simulations for one or multiple regions. Besides epidemiological estimations, the simulator provides indications on resource gaps or surpluses, and the impact of shortages on public health for each selected region. It allows for a comparative analysis of the effects of resource availability and consequences of different strategies of resource use, which can provide guidance on resource prioritising and/or mobilisation. Simulation results are displayed in various tables and graphs, and can also be easily exported to GIS software to create maps for geographical analysis of the distribution of resources.
The AsiaFluCap Simulator is freely available software (http://www.cdprg.org) which can be used by policy makers, policy advisors, donors and other stakeholders involved in preparedness for providing evidence based and illustrative information on health care resource capacities during future pandemics. The tool can inform both preparedness plans and simulation exercises and can help increase the general understanding of dynamics in resource capacities during a pandemic. The combination of a mathematical model with multiple resources and the linkage to GIS for creating maps makes the tool unique compared to other available software.
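
The resource-model idea (epidemic curve in, resource gap out) can be sketched as follows. This is a deterministic toy with a simple SIR model and hypothetical bed counts and hospitalisation parameters, not the AsiaFluCap model itself:

```python
import numpy as np

def resource_gap(beds, pop=1_000_000, r0=1.6, t_inf=5.0,
                 hosp_frac=0.01, stay_days=7, days=365):
    """A deterministic SIR epidemic drives daily hospital-bed demand;
    returns the peak shortfall (demand minus capacity, 0.0 if the
    capacity suffices). All parameters are illustrative."""
    beta, gamma = r0 / (pop * t_inf), 1.0 / t_inf
    s, i, r = pop - 1.0, 1.0, 0.0
    occupied = np.zeros(days)
    for t in range(1, days):
        new_inf = beta * s * i
        s, i, r = s - new_inf, i + new_inf - gamma * i, r + gamma * i
        # each new hospitalisation occupies a bed for `stay_days` days
        adm = hosp_frac * new_inf
        occupied[t:min(t + stay_days, days)] += adm
    return max(0.0, occupied.max() - beds)

gap = resource_gap(beds=500)
```

Running the same model per region, with region-specific populations and bed counts, gives the kind of gap/surplus table the simulator reports.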

  18. A simulation study to determine the attenuation and bias in health risk estimates due to exposure measurement error in bi-pollutant models

    EPA Science Inventory

    To understand the combined health effects of exposure to ambient air pollutant mixtures, it is becoming more common to include multiple pollutants in epidemiologic models. However, the complex spatial and temporal pattern of ambient pollutant concentrations and related exposures ...

  19. Epidemiology of Phytophthora ramorum infecting rhododendrons under simulated nursery conditions

    Treesearch

    S.A. Tjosvold; D.L. Chambers; S. Koike; E. Fichtner

    2006-01-01

    The current understanding of diseases caused by Phytophthora ramorum and their dynamics in nursery crops is almost entirely derived from casual field observations. The objectives of the study are to help understand basic biological factors such as, inoculum viability, dispersal, and infectivity that influence disease occurrence and severity in a...

  20. Molecular typing of Salmonella enterica serovar typhi.

    PubMed Central

    Navarro, F; Llovet, T; Echeita, M A; Coll, P; Aladueña, A; Usera, M A; Prats, G

    1996-01-01

    The efficiencies of different tests for epidemiological markers--phage typing, ribotyping, IS200 typing, and pulsed-field gel electrophoresis (PFGE)--were evaluated for strains from sporadic cases of typhoid fever and a well-defined outbreak. Ribotyping and PFGE proved to be the most discriminating. Both detected two different patterns among outbreak-associated strains. PMID:8897193
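
The abstract does not state how discriminating power was quantified; a standard choice for comparing typing methods is Simpson's index of diversity in the Hunter-Gaston form, sketched here:

```python
from collections import Counter

def simpson_di(types):
    """Hunter-Gaston discriminatory index: the probability that two
    strains sampled without replacement receive different type
    designations. 1.0 = every strain gets its own type."""
    n = len(types)
    counts = Counter(types).values()
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# A method splitting 10 strains into 10 patterns discriminates
# perfectly; one lumping them into 2 patterns does not.
assert simpson_di(list("ABCDEFGHIJ")) == 1.0
assert simpson_di(["A"] * 5 + ["B"] * 5) < simpson_di(list("AABBCCDDEE"))
```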

  1. Estimating School Efficiency: A Comparison of Methods Using Simulated Data.

    ERIC Educational Resources Information Center

    Bifulco, Robert; Bretschneider, Stuart

    2001-01-01

    Uses simulated data to assess the adequacy of two econometric and linear-programming techniques (data-envelopment analysis and corrected ordinary least squares) for measuring performance-based school reform. In complex data sets (simulated to contain measurement error and endogeneity), these methods are inadequate efficiency measures. (Contains 40…

  2. A simulation-based efficiency comparison of AC and DC power distribution networks in commercial buildings

    DOE PAGES

    Gerber, Daniel L.; Vossos, Vagelis; Feng, Wei; ...

    2017-06-12

    Direct current (DC) power distribution has recently gained traction in buildings research due to the proliferation of on-site electricity generation and battery storage, and an increasing prevalence of internal DC loads. The research discussed in this paper uses Modelica-based simulation to compare the efficiency of DC building power distribution with an equivalent alternating current (AC) distribution. The buildings are all modeled with solar generation, battery storage, and loads that are representative of the most efficient building technology. A variety of parametric simulations determine how and when DC distribution proves advantageous. These simulations also validate previous studies that use simpler approaches and arithmetic efficiency models. This work shows that using DC distribution can be considerably more efficient: a medium sized office building using DC distribution has an expected baseline of 12% savings, but may also save up to 18%. In these results, the baseline simulation parameters are for a zero net energy (ZNE) building that can island as a microgrid. DC is most advantageous in buildings with large solar capacity, large battery capacity, and high voltage distribution.
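
The "arithmetic efficiency models" mentioned above simply chain stage efficiencies multiplicatively. A toy comparison with made-up stage efficiencies (not the paper's Modelica models or measured values):

```python
def delivered(stage_efficiencies, energy_in=1.0):
    """Arithmetic efficiency model: conversion stages multiply."""
    out = energy_in
    for eta in stage_efficiencies:
        out *= eta
    return out

# Illustrative stage efficiencies for PV energy reaching a DC load:
# the AC route passes through an inverter and the load's rectifier,
# while the DC route needs only a single DC/DC conversion.
ac_path = delivered([0.96, 0.95])   # PV -> inverter -> load rectifier
dc_path = delivered([0.98])         # PV -> DC/DC converter
savings = (dc_path - ac_path) / ac_path
```

Even this crude model shows why DC gains grow with on-site generation: every avoided AC/DC conversion removes a multiplicative loss.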

  4. The epidemiological and economic effects from systematic depopulation of Norwegian marine salmon farms infected with pancreas disease virus.

    PubMed

    Pettersen, J M; Brynildsrud, O B; Huseby, R B; Rich, K M; Aunsmo, A; Bang, B Jensen; Aldrin, M

    2016-09-15

    Pancreas disease (PD) is a viral disease associated with significant economic losses in Scottish, Irish, and Norwegian marine salmon aquaculture. In this paper, we investigate how disease-triggered harvest strategies (systematic depopulation of infected marine salmon farms) towards PD can affect disease dynamics and salmon producer profits in an endemic area in the southwestern part of Norway. Four different types of disease-triggered harvest strategies were evaluated over a four-year period (2011-2014), each scenario with different disease-screening procedures, timing for initiating the harvest interventions on infected cohorts, and levels of farmer compliance to the strategy. Our approach applies a spatio-temporal stochastic model for simulating the spread of PD in the separate scenarios. Results from these simulations were then used in cost-benefit analyses to estimate the net benefits of different harvest strategies over time. We find that the most aggressive strategy, in which infected farms are harvested without delay, was most efficient in terms of reducing infection pressure in the area and providing economic benefits for the studied group of salmon producers. On the other hand, lower farm compliance leads to higher infection pressure and less economic benefits. Model results further highlight trade-offs in strategies between those that primarily benefit individual producers and those that have collective benefits, suggesting a need for institutional mechanisms that address these potential tensions. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. A model-based tool to predict the propagation of infectious disease via airports.

    PubMed

    Hwang, Grace M; Mahoney, Paula J; James, John H; Lin, Gene C; Berro, Andre D; Keybl, Meredith A; Goedecke, D Michael; Mathieu, Jennifer J; Wilson, Todd

    2012-01-01

    Epidemics of novel or re-emerging infectious diseases have quickly spread globally via air travel, as highlighted by pandemic H1N1 influenza in 2009 (pH1N1). Federal, state, and local public health responders must be able to plan for and respond to these events at aviation points of entry. The emergence of a novel influenza virus and its spread to the United States were simulated for February 2009 from 55 international metropolitan areas using three basic reproduction numbers (R(0)): 1.53, 1.70, and 1.90. Empirical data from the pH1N1 virus were used to validate our SEIR model. Time to entry to the U.S. during the early stages of a prototypical novel communicable disease was predicted based on the aviation network patterns and the epidemiology of the disease. For example, approximately 96% of origins (R(0) of 1.53) propagated a disease into the U.S. in under 75 days; 90% of these origins propagated a disease in under 50 days. An R(0) of 1.53 reproduced the pH1N1 observations. The ability to anticipate the rate and location of disease introduction into the U.S. provides greater opportunity to plan responses based on the scenario as it is unfolding. This simulation tool can aid public health officials to assess risk and leverage resources efficiently. Copyright © 2012 Elsevier Ltd. All rights reserved.
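
A crude sketch of the idea: a deterministic SEIR epidemic in one origin city, with the expected number of infectious travellers on outbound flights as the importation proxy. All parameters are hypothetical, not the paper's calibrated model:

```python
def days_to_first_import(r0, pop=5e6, latent=2.0, infectious=3.0,
                         daily_seats=500, horizon=200):
    """Deterministic discrete-time SEIR in an origin city; returns the
    first day on which the expected number of infectious travellers on
    direct flights reaches one (a crude proxy for introduction)."""
    beta = r0 / infectious
    s, e, i = pop - 1.0, 0.0, 1.0
    for day in range(horizon):
        if daily_seats * i / pop >= 1.0:
            return day
        inc = beta * s * i / pop
        s, e = s - inc, e + inc - e / latent
        i = i + e / latent - i / infectious
    return horizon

# Higher R0 -> earlier expected introduction, as in the simulations.
assert days_to_first_import(1.9) < days_to_first_import(1.53) <= 200
```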

  6. Computer-based coding of free-text job descriptions to efficiently identify occupations in epidemiological studies.

    PubMed

    Russ, Daniel E; Ho, Kwan-Yuet; Colt, Joanne S; Armenti, Karla R; Baris, Dalsu; Chow, Wong-Ho; Davis, Faith; Johnson, Alison; Purdue, Mark P; Karagas, Margaret R; Schwartz, Kendra; Schwenn, Molly; Silverman, Debra T; Johnson, Calvin A; Friesen, Melissa C

    2016-06-01

    Mapping job titles to standardised occupation classification (SOC) codes is an important step in identifying occupational risk factors in epidemiological studies. Because manual coding is time-consuming and has moderate reliability, we developed an algorithm called SOCcer (Standardized Occupation Coding for Computer-assisted Epidemiologic Research) to assign SOC-2010 codes based on free-text job description components. Job title and task-based classifiers were developed by comparing job descriptions to multiple sources linking job and task descriptions to SOC codes. An industry-based classifier was developed based on the SOC prevalence within an industry. These classifiers were used in a logistic model trained using 14 983 jobs with expert-assigned SOC codes to obtain empirical weights for an algorithm that scored each SOC/job description. We assigned the highest scoring SOC code to each job. SOCcer was validated in 2 occupational data sources by comparing SOC codes obtained from SOCcer to expert assigned SOC codes and lead exposure estimates obtained by linking SOC codes to a job-exposure matrix. For 11 991 case-control study jobs, SOCcer-assigned codes agreed with 44.5% and 76.3% of manually assigned codes at the 6-digit and 2-digit level, respectively. Agreement increased with the score, providing a mechanism to identify assignments needing review. Good agreement was observed between lead estimates based on SOCcer and manual SOC assignments (κ 0.6-0.8). Poorer performance was observed for inspection job descriptions, which included abbreviations and worksite-specific terminology. Although some manual coding will remain necessary, using SOCcer may improve the efficiency of incorporating occupation into large-scale epidemiological studies. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
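
The scoring scheme can be illustrated with a toy logistic combiner. The weights and candidate scores below are invented for illustration; they are not SOCcer's fitted coefficients:

```python
import math

# Hypothetical weights: each candidate SOC code receives scores from
# the job-title, task, and industry classifiers, and a logistic model
# combines them into a single score.
WEIGHTS = {"bias": -4.0, "title": 5.0, "task": 3.0, "industry": 1.5}

def soc_score(features):
    z = WEIGHTS["bias"] + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def assign(candidates):
    """candidates: {soc_code: {"title": s, "task": s, "industry": s}};
    returns the highest-scoring code and its score."""
    best = max(candidates, key=lambda c: soc_score(candidates[c]))
    return best, soc_score(candidates[best])

code, score = assign({
    "47-2031": {"title": 0.9, "task": 0.7, "industry": 0.8},  # carpenters
    "47-2061": {"title": 0.2, "task": 0.5, "industry": 0.8},  # construction laborers
})
```

Because the combined score is a probability-like quantity, low-scoring assignments can be flagged for manual review, which is the mechanism the abstract describes.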

  7. Realistic and efficient 2D crack simulation

    NASA Astrophysics Data System (ADS)

    Yadegar, Jacob; Liu, Xiaoqing; Singh, Abhishek

    2010-04-01

    Although numerical algorithms for 2D crack simulation have been studied in Modeling and Simulation (M&S) and computer graphics for decades, realism and computational efficiency are still major challenges. In this paper, we introduce a high-fidelity, scalable, adaptive and efficient runtime 2D crack/fracture simulation system by applying the mathematically elegant Peano-Cesaro triangular meshing/remeshing technique to model the generation of shards/fragments. The recursive fractal sweep associated with the Peano-Cesaro triangulation provides efficient local multi-resolution refinement to any level-of-detail. The generated binary decomposition tree also provides an efficient neighbor retrieval mechanism used for mesh element splitting and merging with minimal memory requirements, essential for realistic 2D fragment formation. Upon load impact/contact/penetration, a number of factors including impact angle, impact energy, and material properties are all taken into account to produce the criteria of crack initialization, propagation, and termination, leading to realistic fractal-like rubble/fragment formation. The aforementioned parameters are used as variables of probabilistic models of crack/shard formation, making the proposed solution highly adaptive by allowing machine learning mechanisms to learn the optimal values for the variables/parameters based on prior benchmark data generated by off-line physics-based simulation solutions that produce accurate fractures/shards, though at a far from real-time pace. Crack/fracture simulation has been conducted on various load impacts with different initial locations at various impulse scales. The simulation results demonstrate that the proposed system has the capability to realistically and efficiently simulate 2D crack phenomena (such as window shattering and shards generation) with diverse potentials in military and civil M&S applications such as training and mission planning.

  8. Hepatitis A in Poland in 2014

    PubMed

    Polański, Piotr

    The aim of this article is to assess the epidemiological situation of hepatitis A in Poland in 2014 with regard to recent years. The assessment was based on an analysis of data from the bulletins "Infectious diseases and poisonings in Poland in 2014" and "Vaccinations in Poland in 2014", as well as information from individual case questionnaires and reports of epidemiological investigations of hepatitis A outbreaks submitted by the sanitary-epidemiological stations to the Department of Epidemiology at NIPH-NIH. In 2014, 76 cases of hepatitis A were registered in Poland. Incidence per 100 000 inhabitants was 0.20, varying across voivodeships from 0.07 (Dolnośląskie) to 0.30 (Małopolskie). Incidence did not differ between males and females (0.20/100 000 in both). Despite the increase in the number of cases compared with the previous year, no significant change in the epidemiological situation of hepatitis A was observed in 2014, and Poland is still regarded as a country of low hepatitis A endemicity. The routine surveillance system carries no information on the occupation of vaccinated persons, although the vaccinations themselves are recommended in the Polish vaccination schedule. Particular attention should be directed towards vaccinating persons involved in the primary production of berries, a product of which Poland is a major exporter in the EU. In the light of the increasing number of international hepatitis A outbreaks (which can be of prolonged duration and carry a high possibility of secondary cases, especially in countries of low endemicity), maintaining a high level of routine surveillance in Poland gains importance; it could also contribute to the efficiency of epidemiological investigations in multistate outbreaks.

  9. Computer-intensive simulation of solid-state NMR experiments using SIMPSON.

    PubMed

    Tošner, Zdeněk; Andersen, Rasmus; Stevensson, Baltzar; Edén, Mattias; Nielsen, Niels Chr; Vosegaard, Thomas

    2014-09-01

    Conducting large-scale solid-state NMR simulations requires fast computer software, potentially in combination with efficient computational resources, to complete within a reasonable time frame. Such simulations may involve large spin systems, multiple-parameter fitting of experimental spectra, or multiple-pulse experiment design using parameter scans, non-linear optimization, or optimal control procedures. To efficiently accommodate such simulations, we here present an improved version of the widely distributed open-source SIMPSON NMR simulation software package adapted to contemporary high-performance hardware setups. The software is optimized for fast performance on standard stand-alone computers, multi-core processors, and large clusters of identical nodes. We describe the novel features for fast computation, including internal matrix manipulations, propagator setups, and acquisition strategies. For efficient calculation of powder averages, we implemented the interpolation method of Alderman, Solum, and Grant, as well as the recently introduced fast Wigner transform interpolation technique. The potential of the optimal control toolbox is greatly enhanced by higher-precision gradients in combination with the efficient optimization algorithm known as limited-memory Broyden-Fletcher-Goldfarb-Shanno. In addition, advanced parallelization can be used in all types of calculations, providing significant time reductions. SIMPSON thus reflects current knowledge in the field of numerical simulations of solid-state NMR experiments. The efficiency and novel features are demonstrated on representative simulations. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. A method for detecting and characterizing outbreaks of infectious disease from clinical reports.

    PubMed

    Cooper, Gregory F; Villamarin, Ricardo; Rich Tsui, Fu-Chiang; Millett, Nicholas; Espino, Jeremy U; Wagner, Michael M

    2015-02-01

    Outbreaks of infectious disease can pose a significant threat to human health. Thus, detecting and characterizing outbreaks quickly and accurately remains an important problem. This paper describes a Bayesian framework that links clinical diagnosis of individuals in a population to epidemiological modeling of disease outbreaks in the population. Computer-based diagnosis of individuals who seek healthcare is used to guide the search for epidemiological models of population disease that explain the pattern of diagnoses well. We applied this framework to develop a system that detects influenza outbreaks from emergency department (ED) reports. The system diagnoses influenza in individuals probabilistically from evidence in ED reports that are extracted using natural language processing. These diagnoses guide the search for epidemiological models of influenza that explain the pattern of diagnoses well. Those epidemiological models with a high posterior probability determine the most likely outbreaks of specific diseases; the models are also used to characterize properties of an outbreak, such as its expected peak day and estimated size. We evaluated the method using both simulated data and data from a real influenza outbreak. The results provide support that the approach can detect and characterize outbreaks early and well enough to be valuable. We describe several extensions to the approach that appear promising. Copyright © 2014 Elsevier Inc. All rights reserved.
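
    The core Bayesian step described above, scoring candidate epidemiological models by how well they explain a stream of diagnoses, can be sketched compactly. This is a toy illustration, not the authors' system: the Poisson likelihood, the two candidate rate curves, and all counts below are hypothetical.

```python
import math

def poisson_logpmf(k, lam):
    # log P(K = k) for a Poisson(lam) count
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def log_likelihood(counts, rates):
    # independent Poisson daily counts, one expected rate per day
    return sum(poisson_logpmf(k, r) for k, r in zip(counts, rates))

def posterior(counts, models, prior):
    # Bayes rule over a discrete set of candidate outbreak models
    logs = {name: math.log(prior[name]) + log_likelihood(counts, rates)
            for name, rates in models.items()}
    mx = max(logs.values())                      # stabilize the exponentials
    weights = {name: math.exp(v - mx) for name, v in logs.items()}
    z = sum(weights.values())
    return {name: w / z for name, w in weights.items()}

observed = [2, 3, 5, 9, 14, 22]            # daily influenza-like diagnoses
models = {
    "baseline": [4, 4, 4, 4, 4, 4],        # endemic: flat expected rate
    "outbreak": [2, 3, 5, 8, 13, 21],      # outbreak: growing expected rate
}
post = posterior(observed, models, {"baseline": 0.95, "outbreak": 0.05})
```

    Even with a strong prior toward the endemic model, the rising count pattern pushes nearly all posterior mass onto the outbreak model; in the full system each "model" would be a parameterized epidemic curve whose high-posterior members also yield peak-day and size estimates.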

  11. A Method for Detecting and Characterizing Outbreaks of Infectious Disease from Clinical Reports

    PubMed Central

    Cooper, Gregory F.; Villamarin, Ricardo; Tsui, Fu-Chiang (Rich); Millett, Nicholas; Espino, Jeremy U.; Wagner, Michael M.

    2014-01-01

    Outbreaks of infectious disease can pose a significant threat to human health. Thus, detecting and characterizing outbreaks quickly and accurately remains an important problem. This paper describes a Bayesian framework that links clinical diagnosis of individuals in a population to epidemiological modeling of disease outbreaks in the population. Computer-based diagnosis of individuals who seek healthcare is used to guide the search for epidemiological models of population disease that explain the pattern of diagnoses well. We applied this framework to develop a system that detects influenza outbreaks from emergency department (ED) reports. The system diagnoses influenza in individuals probabilistically from evidence in ED reports that are extracted using natural language processing. These diagnoses guide the search for epidemiological models of influenza that explain the pattern of diagnoses well. Those epidemiological models with a high posterior probability determine the most likely outbreaks of specific diseases; the models are also used to characterize properties of an outbreak, such as its expected peak day and estimated size. We evaluated the method using both simulated data and data from a real influenza outbreak. The results provide support that the approach can detect and characterize outbreaks early and well enough to be valuable. We describe several extensions to the approach that appear promising. PMID:25181466

  12. Exact and efficient simulation of concordant computation

    NASA Astrophysics Data System (ADS)

    Cable, Hugo; Browne, Daniel E.

    2015-11-01

    Concordant computation is a circuit-based model of quantum computation for mixed states that assumes all correlations within the register are discord-free (i.e. the correlations are essentially classical) at every step of the computation. The question of whether concordant computation always admits efficient simulation by a classical computer was first considered by Eastin in arXiv:quant-ph/1006.4402v1, where an answer in the affirmative was given for circuits consisting only of one- and two-qubit gates. Building on this work, we develop the theory of classical simulation of concordant computation. We present a new framework for understanding such computations, argue that a larger class of concordant computations admit efficient simulation, and provide alternative proofs for the main results of arXiv:quant-ph/1006.4402v1 with an emphasis on the exactness of simulation, which is crucial for this model. We include detailed analysis of the arithmetic complexity for solving equations in the simulation, as well as extensions to larger gates and qudits. We explore the limitations of our approach, and discuss the challenges faced in developing efficient classical simulation algorithms for all concordant computations.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortiz-Ramírez, Pablo, E-mail: rapeitor@ug.uchile.cl; Ruiz, Andrés

    The Monte Carlo simulation of gamma spectroscopy systems is common practice nowadays; the most popular software packages for this purpose are the MCNP and Geant4 codes. The intrinsic spatial efficiency method is a general and absolute method for determining the absolute efficiency of a spectroscopy system for any extended source, but it had only been demonstrated experimentally for cylindrical sources. Given the difficulty of preparing sources of arbitrary shape, the simplest way to do this is by simulating the spectroscopy system and the source. In this work we present the validation of the intrinsic spatial efficiency method for sources with different geometries and for photons with an energy of 661.65 keV. The simulation does not consider matrix effects (the self-attenuation effect), so these results are only preliminary. The MC simulation is carried out using the FLUKA code, and the absolute efficiency of the detector is determined using two methods: the statistical count of the full energy peak (FEP) area (the traditional method) and the intrinsic spatial efficiency method. The obtained results show total agreement between the absolute efficiencies determined by the traditional method and the intrinsic spatial efficiency method; the relative bias is less than 1% in all cases.

  14. A Tutorial on RxODE: Simulating Differential Equation Pharmacometric Models in R.

    PubMed

    Wang, W; Hallow, K M; James, D A

    2016-01-01

    This tutorial presents the application of an R package, RxODE, that facilitates quick, efficient simulations of ordinary differential equation models completely within R. Its application is illustrated through simulation of design decision effects on an adaptive dosing regimen. The package provides an efficient, versatile way to specify dosing scenarios and to perform simulation with variability with minimal custom coding. Models can be directly translated to R Shiny applications to facilitate interactive, real-time evaluation/iteration on simulation scenarios.
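
    As a language-neutral illustration of the kind of ODE model such a package integrates, the sketch below advances a one-compartment pharmacokinetic model with first-order absorption by forward Euler. It is a generic textbook model in plain Python, not RxODE syntax, and the dose and rate constants are made up.

```python
def simulate_1cmt(dose, ka, ke, vd, t_end, dt=0.01):
    """Forward-Euler integration of a one-compartment model with
    first-order absorption: dGut/dt = -ka*Gut,
    dCentral/dt = ka*Gut - ke*Central; concentration = Central/vd."""
    gut, central = dose, 0.0
    times, conc = [0.0], [0.0]
    steps = int(t_end / dt)
    for n in range(1, steps + 1):
        absorbed = ka * gut * dt          # drug leaving the gut depot
        eliminated = ke * central * dt    # first-order elimination
        gut -= absorbed
        central += absorbed - eliminated
        times.append(n * dt)
        conc.append(central / vd)
    return times, conc

# Hypothetical single oral dose: 100 mg, ka = 1/h, ke = 0.1/h, Vd = 10 L
times, conc = simulate_1cmt(dose=100.0, ka=1.0, ke=0.1, vd=10.0, t_end=24.0)
cmax = max(conc)
tmax = times[conc.index(cmax)]
```

    A dedicated solver (as in RxODE) replaces the hand-rolled Euler loop with a stiff-capable integrator and adds event tables for repeated and adaptive dosing.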

  15. Interface-Resolving Simulation of Collision Efficiency of Cloud Droplets

    NASA Astrophysics Data System (ADS)

    Wang, Lian-Ping; Peng, Cheng; Rosa, Bogdan; Onishi, Ryo

    2017-11-01

    Small-scale air turbulence could enhance the geometric collision rate of cloud droplets, while large-scale air turbulence could augment their diffusional growth. Air turbulence could also enhance the collision efficiency of cloud droplets. Accurate simulation of collision efficiency, however, requires capturing the multi-scale droplet-turbulence and droplet-droplet interactions, which has only been partially achieved in the recent past using the hybrid direct numerical simulation (HDNS) approach, in which Stokes disturbance flow is assumed. The HDNS approach has two major drawbacks: (1) the short-range droplet-droplet interaction is not treated rigorously; (2) the finite-Reynolds-number correction to the collision efficiency is not included. In this talk, using two independent numerical methods, we will develop an interface-resolved simulation approach in which the disturbance flows are directly resolved numerically, combined with a rigorous lubrication correction model for near-field droplet-droplet interaction. This multi-scale approach is first used to study the effect of finite flow Reynolds numbers on the droplet collision efficiency in still air. Our simulation results show a significant finite-Re effect on collision efficiency when the droplets are of similar sizes. Preliminary results on integrating this approach in a turbulent flow laden with droplets will also be presented. This work is partially supported by the National Science Foundation.

  16. A Simulated Annealing Algorithm for the Optimization of Multistage Depressed Collector Efficiency

    NASA Technical Reports Server (NTRS)

    Vaden, Karl R.; Wilson, Jeffrey D.; Bulson, Brian A.

    2002-01-01

    The microwave traveling wave tube amplifier (TWTA) is widely used as a high-power transmitting source for space and airborne communications. One critical factor in designing a TWTA is the overall efficiency. However, overall efficiency is highly dependent upon collector efficiency; so collector design is critical to the performance of a TWTA. Therefore, NASA Glenn Research Center has developed an optimization algorithm based on Simulated Annealing to quickly design highly efficient multi-stage depressed collectors (MDC).

  17. Computationally efficient multibody simulations

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Jayant; Kumar, Manoj

    1994-01-01

    Computationally efficient approaches to the solution of the dynamics of multibody systems are presented in this work. The computational efficiency is derived from both the algorithmic and implementational standpoint. Order(n) approaches provide a new formulation of the equations of motion eliminating the assembly and numerical inversion of a system mass matrix as required by conventional algorithms. Computational efficiency is also gained in the implementation phase by the symbolic processing and parallel implementation of these equations. Comparison of this algorithm with existing multibody simulation programs illustrates the increased computational efficiency.

  18. Estimating investment worthiness of an ergonomic intervention for preventing low back pain from a firm's perspective.

    PubMed

    Hughes, Richard E; Nelson, Nancy A

    2009-05-01

    A mathematical model was developed for estimating the net present value (NPV) of the cash flow resulting from an investment in an intervention to prevent occupational low back pain (LBP). It combines biomechanics, epidemiology, and finance to give an integrated tool for a firm to use to estimate the investment worthiness of an intervention based on a biomechanical analysis of working postures and hand loads. The model can be used by an ergonomist to estimate the investment worthiness of a proposed intervention. The analysis would begin with a biomechanical evaluation of the current job design and post-intervention job. Economic factors such as hourly labor cost, overhead, workers' compensation costs of LBP claims, and discount rate are combined with the biomechanical analysis to estimate the investment worthiness of the proposed intervention. While this model is limited to low back pain, the simulation framework could be applied to other musculoskeletal disorders. The model uses Monte Carlo simulation to compute the statistical distribution of NPV, and it uses a discrete event simulation paradigm based on four states: (1) working and no history of lost time due to LBP, (2) working and history of lost time due to LBP, (3) lost time due to LBP, and (4) leave job. Probabilities of transitions are based on an extensive review of the epidemiologic literature on low back pain. An example is presented.
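
    The Monte Carlo NPV idea can be sketched in a few lines: simulate many realizations of yearly LBP episodes with and without the intervention, discount the avoided claim costs, and subtract the up-front investment. All parameter values below are hypothetical placeholders, not figures from the paper, and the two-probability model is a drastic simplification of the paper's four-state discrete event simulation.

```python
import random

# Illustrative parameters (hypothetical, not taken from the paper)
P_LBP_BASE = 0.10       # annual chance of a lost-time LBP episode, current job
P_LBP_POST = 0.04       # same chance after the ergonomic intervention
COST_CLAIM = 20000.0    # workers' compensation cost per lost-time episode
INTERVENTION = 5000.0   # up-front cost of the intervention
DISCOUNT = 0.05         # annual discount rate
HORIZON = 10            # years simulated

def npv_one_run(rng):
    """One Monte Carlo realization of the intervention's net present value:
    discounted value of claims avoided minus the up-front investment."""
    npv = -INTERVENTION
    for year in range(1, HORIZON + 1):
        base_claim = rng.random() < P_LBP_BASE   # would have occurred anyway
        post_claim = rng.random() < P_LBP_POST   # still occurs after redesign
        npv += (base_claim - post_claim) * COST_CLAIM / (1 + DISCOUNT) ** year
    return npv

rng = random.Random(42)
samples = [npv_one_run(rng) for _ in range(20000)]
mean_npv = sum(samples) / len(samples)
```

    The full distribution of `samples`, not just its mean, is the decision-relevant output: it shows the probability that the investment fails to pay back.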

  19. Can the observed association between serum perfluoroalkyl substances and delayed menarche be explained on the basis of puberty-related changes in physiology and pharmacokinetics?

    PubMed

    Wu, Huali; Yoon, Miyoung; Verner, Marc-André; Xue, Jianping; Luo, Man; Andersen, Melvin E; Longnecker, Matthew P; Clewell, Harvey J

    2015-09-01

    An association between serum levels of two perfluoroalkyl substances (PFAS) and delayed age at menarche was reported in a cross-sectional study of adolescents. Because perfluorooctanoic acid (PFOA) and perfluorooctane sulfonate (PFOS) have half-lives of years, growth dilution and the development of a new route of excretion (menstruation) could account for some or all of the reported association. To assess how much of the epidemiologic association between PFAS and delayed menarche can be explained by the correlation of growth and maturation with PFAS body burden, we developed a Monte Carlo (MC) physiologically-based pharmacokinetic (PBPK) model of PFAS to simulate plasma PFAS levels in a hypothetical female population aged 2 to 20 years. Realistic distributions of physiological parameters, as well as the timing of growth spurts and menarche, were incorporated in the model. The association between PFAS level and delayed menarche in the simulated data was compared with the reported association. The prevalence of menarche, distributions of age-dependent physiological parameters, and quartiles of serum PFAS concentrations in the simulated subjects were comparable to those reported in the epidemiologic study. The delay of menarche in days per natural-log increase in PFAS concentration in the simulated data was about one third as large as the observed value. The reported relationship between PFAS and age at menarche appears to be at least partly explained by pharmacokinetics rather than a toxic effect of these substances. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. You can run, you can hide: The epidemiology and statistical mechanics of zombies

    NASA Astrophysics Data System (ADS)

    Alemi, Alexander A.; Bierbaum, Matthew; Myers, Christopher R.; Sethna, James P.

    2015-11-01

    We use a popular fictional disease, zombies, in order to introduce techniques used in modern epidemiology modeling, and ideas and techniques used in the numerical study of critical phenomena. We consider variants of zombie models, from fully connected continuous time dynamics to a full scale exact stochastic dynamic simulation of a zombie outbreak on the continental United States. Along the way, we offer a closed form analytical expression for the fully connected differential equation, and demonstrate that the single person per site two dimensional square lattice version of zombies lies in the percolation universality class. We end with a quantitative study of the full scale US outbreak, including the average susceptibility of different geographical regions.
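
    The fully connected, continuous-time dynamics mentioned above are a small coupled ODE system, often written as an SZR model (susceptible, zombie, removed). The sketch below integrates one common SZR variant by forward Euler; the rate constants and initial fractions are illustrative assumptions, not values from the paper.

```python
def szr(s0, z0, beta, kappa, dt=0.001, t_end=30.0):
    """Forward-Euler integration of the fully connected SZR model:
    dS/dt = -beta*S*Z          (susceptibles bitten)
    dZ/dt = (beta - kappa)*S*Z (new zombies minus destroyed ones)
    dR/dt = kappa*S*Z          (zombies removed by survivors)."""
    s, z, r = s0, z0, 0.0
    steps = int(t_end / dt)
    for _ in range(steps):
        sz = s * z
        s -= beta * sz * dt
        z += (beta - kappa) * sz * dt
        r += kappa * sz * dt
    return s, z, r

# beta > kappa: the "doomed" regime in which the outbreak takes over
s, z, r = szr(s0=0.99, z0=0.01, beta=1.0, kappa=0.6)
```

    Because every term moves mass between the three compartments, S + Z + R is conserved, a useful sanity check on the integrator; the lattice and continental-scale simulations in the paper replace the well-mixed product S*Z with local, stochastic contacts.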

  1. LightForce Photon-Pressure Collision Avoidance: Updated Efficiency Analysis Utilizing a Highly Parallel Simulation Approach

    NASA Technical Reports Server (NTRS)

    Stupl, Jan; Faber, Nicolas; Foster, Cyrus; Yang, Fan Yang; Nelson, Bron; Aziz, Jonathan; Nuttall, Andrew; Henze, Chris; Levit, Creon

    2014-01-01

    This paper provides an updated efficiency analysis of the LightForce space debris collision avoidance scheme. LightForce aims to prevent collisions on warning by utilizing photon pressure from ground based, commercial off the shelf lasers. Past research has shown that a few ground-based systems consisting of 10 kilowatt class lasers directed by 1.5 meter telescopes with adaptive optics could lower the expected number of collisions in Low Earth Orbit (LEO) by an order of magnitude. Our simulation approach utilizes the entire Two Line Element (TLE) catalogue in LEO for a given day as initial input. Least-squares fitting of a TLE time series is used for an improved orbit estimate. We then calculate the probability of collision for all LEO objects in the catalogue for a time step of the simulation. The conjunctions that exceed a threshold probability of collision are then engaged by a simulated network of laser ground stations. After those engagements, the perturbed orbits are used to re-assess the probability of collision and evaluate the efficiency of the system. This paper describes new simulations with three updated aspects: 1) By utilizing a highly parallel simulation approach employing hundreds of processors, we have extended our analysis to a much broader dataset. The simulation time is extended to one year. 2) We analyze not only the efficiency of LightForce on conjunctions that naturally occur, but also take into account conjunctions caused by orbit perturbations due to LightForce engagements. 3) We use a new simulation approach that is regularly updating the LightForce engagement strategy, as it would be during actual operations. In this paper we present our simulation approach to parallelize the efficiency analysis, its computational performance and the resulting expected efficiency of the LightForce collision avoidance system. 
Results indicate that, utilizing a network of four LightForce stations with 20 kilowatt lasers, 85% of all conjunctions with a probability of collision Pc > 10^-6 can be mitigated.

  2. Advanced EMT and Phasor-Domain Hybrid Simulation with Simulation Mode Switching Capability for Transmission and Distribution Systems

    DOE PAGES

    Huang, Qiuhua; Vittal, Vijay

    2018-05-09

    Conventional electromagnetic transient (EMT) and phasor-domain hybrid simulation approaches presently exist for transmission-system-level studies. Their simulation efficiency is generally constrained by the EMT simulation. With an increasing number of distributed energy resources and non-conventional loads being installed in distribution systems, it is imperative to extend the hybrid simulation application to include distribution systems and integrated transmission and distribution systems. Meanwhile, it is equally important to improve the simulation efficiency as the modeling scope and complexity of the detailed system in the EMT simulation increase. To meet both requirements, this paper introduces an advanced EMT and phasor-domain hybrid simulation approach. This approach has two main features: 1) a comprehensive phasor-domain modeling framework which supports positive-sequence, three-sequence, three-phase and mixed three-sequence/three-phase representations and 2) a robust and flexible simulation mode switching scheme. The developed scheme enables switching from hybrid simulation mode back to pure phasor-domain dynamic simulation mode to achieve significantly improved simulation efficiency. The proposed method has been tested on integrated transmission and distribution systems. The results show that with the developed simulation switching feature, the total computational time is significantly reduced compared to running the hybrid simulation for the whole simulation period, while maintaining good simulation accuracy.

  3. Advanced EMT and Phasor-Domain Hybrid Simulation with Simulation Mode Switching Capability for Transmission and Distribution Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Qiuhua; Vittal, Vijay

    Conventional electromagnetic transient (EMT) and phasor-domain hybrid simulation approaches presently exist for transmission-system-level studies. Their simulation efficiency is generally constrained by the EMT simulation. With an increasing number of distributed energy resources and non-conventional loads being installed in distribution systems, it is imperative to extend the hybrid simulation application to include distribution systems and integrated transmission and distribution systems. Meanwhile, it is equally important to improve the simulation efficiency as the modeling scope and complexity of the detailed system in the EMT simulation increase. To meet both requirements, this paper introduces an advanced EMT and phasor-domain hybrid simulation approach. This approach has two main features: 1) a comprehensive phasor-domain modeling framework which supports positive-sequence, three-sequence, three-phase and mixed three-sequence/three-phase representations and 2) a robust and flexible simulation mode switching scheme. The developed scheme enables switching from hybrid simulation mode back to pure phasor-domain dynamic simulation mode to achieve significantly improved simulation efficiency. The proposed method has been tested on integrated transmission and distribution systems. The results show that with the developed simulation switching feature, the total computational time is significantly reduced compared to running the hybrid simulation for the whole simulation period, while maintaining good simulation accuracy.

  4. Positive Wigner functions render classical simulation of quantum computation efficient.

    PubMed

    Mari, A; Eisert, J

    2012-12-07

    We show that quantum circuits where the initial state and all the following quantum operations can be represented by positive Wigner functions can be classically efficiently simulated. This is true both for continuous-variable as well as discrete variable systems in odd prime dimensions, two cases which will be treated on entirely the same footing. Noting the fact that Clifford and Gaussian operations preserve the positivity of the Wigner function, our result generalizes the Gottesman-Knill theorem. Our algorithm provides a way of sampling from the output distribution of a computation or a simulation, including the efficient sampling from an approximate output distribution in the case of sampling imperfections for initial states, gates, or measurements. In this sense, this work highlights the role of the positive Wigner function as separating classically efficiently simulable systems from those that are potentially universal for quantum computing and simulation, and it emphasizes the role of negativity of the Wigner function as a computational resource.

  5. Modelling dwarf mistletoe at three scales: life history, ballistics and contagion

    Treesearch

    Donald C. E. Robinson; Brian W. Geils

    2006-01-01

    The epidemiology of dwarf mistletoe (Arceuthobium) is simulated for the reproduction, dispersal, and spatial patterns of these plant pathogens on conifer trees. A conceptual model for mistletoe spread and intensification is coded as sets of related subprograms that link to either of two individual-tree growth models (FVS and TASS) used by managers to develop...

  6. Evaluating a New Online Course in the Epidemiology of Infectious Diseases by Studying Student Learning Styles

    ERIC Educational Resources Information Center

    Rogers, James W.; Cox, James R.

    2008-01-01

    At RMIT University, students may now elect to study infectious diseases through a course called Outbreak--The Detection and Control of Infectious Disease. Outbreak was designed to simulate in an online class the effective teamwork required to bring resolution to outbreak crises and enable frameworks for future prevention. The appropriateness of…

  7. Efficiency of Health Care Production in Low-Resource Settings: A Monte-Carlo Simulation to Compare the Performance of Data Envelopment Analysis, Stochastic Distance Functions, and an Ensemble Model

    PubMed Central

    Giorgio, Laura Di; Flaxman, Abraham D.; Moses, Mark W.; Fullman, Nancy; Hanlon, Michael; Conner, Ruben O.; Wollum, Alexandra; Murray, Christopher J. L.

    2016-01-01

    Low-resource countries can greatly benefit from even small increases in efficiency of health service provision, supporting a strong case to measure and pursue efficiency improvement in low- and middle-income countries (LMICs). However, the knowledge base concerning efficiency measurement remains scarce for these contexts. This study shows that current estimation approaches may not be well suited to measure technical efficiency in LMICs and offers an alternative approach for efficiency measurement in these settings. We developed a simulation environment which reproduces the characteristics of health service production in LMICs, and evaluated the performance of Data Envelopment Analysis (DEA) and Stochastic Distance Function (SDF) for assessing efficiency. We found that an ensemble approach (ENS) combining efficiency estimates from a restricted version of DEA (rDEA) and restricted SDF (rSDF) is the preferable method across a range of scenarios. This is the first study to analyze efficiency measurement in a simulation setting for LMICs. Our findings aim to heighten the validity and reliability of efficiency analyses in LMICs, and thus inform policy dialogues about improving the efficiency of health service production in these settings. PMID:26812685

  8. Domestic Asbestos Exposure: A Review of Epidemiologic and Exposure Data

    PubMed Central

    Goswami, Emily; Craven, Valerie; Dahlstrom, David L.; Alexander, Dominik; Mowat, Fionna

    2013-01-01

    Inhalation of asbestos resulting from living with and handling the clothing of workers directly exposed to asbestos has been established as a possible contributor to disease. This review evaluates epidemiologic studies of asbestos-related disease or conditions (mesothelioma, lung cancer, and pleural and interstitial abnormalities) among domestically exposed individuals and exposure studies that provide either direct exposure measurements or surrogate measures of asbestos exposure. A meta-analysis of studies providing relative risk estimates (n = 12) of mesothelioma was performed, resulting in a summary relative risk estimate (SRRE) of 5.02 (95% confidence interval [CI]: 2.48–10.13). This SRRE pertains to persons domestically exposed via workers involved in occupations with a traditionally high risk of disease from exposure to asbestos (i.e., asbestos product manufacturing workers, insulators, shipyard workers, and asbestos miners). The epidemiologic studies also show an elevated risk of interstitial, but more likely pleural, abnormalities (n = 6), though only half accounted for confounding exposures. The studies are limited with regard to lung cancer (n = 2). Several exposure-related studies describe results from airborne samples collected within the home (n = 3), during laundering of contaminated clothing (n = 1) or in controlled exposure simulations (n = 5) of domestic exposures, the latter of which were generally associated with low-level chrysotile-exposed workers. Lung burden studies (n = 6) were also evaluated as a surrogate of exposure. In general, available results for domestic exposures are lower than the workers’ exposures. Recent simulations of low-level chrysotile-exposed workers indicate asbestos levels commensurate with background concentrations in those exposed domestically. PMID:24185840

  9. Cost-utility analysis of antiviral use under pandemic influenza using a novel approach - linking pharmacology, epidemiology and health economics.

    PubMed

    Wu, D B C; Chaiyakunapruk, N; Pratoomsoot, C; Lee, K K C; Chong, H Y; Nelson, R E; Smith, P F; Kirkpatrick, C M; Kamal, M A; Nieforth, K; Dall, G; Toovey, S; Kong, D C M; Kamauu, A; Rayner, C R

    2018-03-01

    Simulation models are used widely in pharmacology, epidemiology and health economics (HE). However, there have been no attempts to incorporate models from these disciplines into a single integrated model. Accordingly, we explored this linkage to evaluate the epidemiological and economic impact of oseltamivir dose optimisation in supporting pandemic influenza planning in the USA. An HE decision analytic model was linked to a pharmacokinetic/pharmacodynamic (PK/PD)-dynamic transmission model simulating the impact of pandemic influenza with low virulence and low transmissibility, and with high virulence and high transmissibility. The cost-utility analysis was from the payer and societal perspectives, comparing oseltamivir 75 and 150 mg twice daily (BID) to no treatment over a 1-year time horizon. Model parameters were derived from published studies. Outcomes were measured as cost per quality-adjusted life year (QALY) gained. Sensitivity analyses were performed to examine the integrated model's robustness. Under both pandemic scenarios, compared to no treatment, the use of oseltamivir 75 or 150 mg BID led to a significant reduction of influenza episodes and influenza-related deaths, translating to substantial savings of QALYs. Overall drug costs were offset by the reduction of both direct and indirect costs, making these two interventions cost-saving from both perspectives. The results were sensitive to the proportion of inpatient presentation at the emergency visit and patients' quality of life. Integrating PK/PD-EPI/HE models is achievable. While further refinement of this novel linkage model to more closely mimic reality is needed, the current study has generated useful insights to support influenza pandemic planning.

  10. Cost-effectiveness of 12- and 15-year-old girls' human papillomavirus 16/18 population-based vaccination programmes in Lithuania.

    PubMed

    Vanagas, Giedrius; Padaiga, Zilvinas; Kurtinaitis, Juozas; Logminiene, Zeneta

    2010-08-01

    There is a large difference in the prevalence of cervical cancer between European countries; among European Union countries, cervical cancer is most prevalent in Lithuania. Vaccines are currently available for different types of human papillomavirus (HPV), but we lack evidence on how cost-effective vaccination would be in low-resource Eastern European countries like Lithuania. The objectives were to create a simulation model for the Lithuanian population and to estimate the epidemiological benefits and cost-effectiveness of an HPV16/18 vaccination programme in Lithuania. For the cost-effectiveness analysis, we used mathematical simulation of the Lithuanian population and epidemiological data modelling. We performed comparative analysis of annual vaccination programmes of 12-year-old or 15-year-old girls at different vaccine penetration levels. The study population was the Lithuanian female population across all age groups. A vaccination programme in Lithuania would gain an average of 35.6 life years per death avoided. Vaccinated girls would experience up to a 76.9% overall reduction in the incidence of cervical cancer, an 80.8% reduction in morbidity and a 77.9% reduction in mortality over their lifetime. Cost per life year gained at different vaccine penetration levels would range from 2167.41 Euros to 2999.74 Euros. HPV vaccination in Lithuania would have a very positive impact on the epidemiological situation and would be cost-effective at all ranges of vaccine penetration. In the long term, vaccination in Lithuania could potentially be even more cost-effective owing to the avoidance of early disease onset and lower accumulation of period costs.
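
    The headline metric in such analyses, cost per life year gained, is an incremental ratio of extra cost over extra effect between two strategies. The sketch below shows only that arithmetic; the cohort totals are made-up illustrative numbers, not the study's data, which come from a full population simulation.

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit
    of effect (here, discounted life years) of one strategy over another."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical cohort totals, for illustration only (not the study's data):
# vaccination costs 1.5M euros more overall and gains 500 life years.
cost_per_ly = icer(cost_new=4_500_000.0, cost_old=3_000_000.0,
                   effect_new=60_500.0, effect_old=60_000.0)
```

    The resulting ratio is then compared against a willingness-to-pay threshold to decide whether the programme is cost-effective.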

  11. Massively parallel multicanonical simulations

    NASA Astrophysics Data System (ADS)

    Gross, Jonathan; Zierenberg, Johannes; Weigel, Martin; Janke, Wolfhard

    2018-03-01

    Generalized-ensemble Monte Carlo simulations such as the multicanonical method and similar techniques are among the most efficient approaches for simulations of systems undergoing discontinuous phase transitions or with rugged free-energy landscapes. As Markov chain methods, they are inherently serial. It was demonstrated recently, however, that a combination of independent simulations that communicate weight updates at variable intervals allows for the efficient utilization of parallel computational resources for multicanonical simulations. Implementing this approach for the many-thread architecture provided by current generations of graphics processing units (GPUs), we show how it can be efficiently employed with of the order of 10⁴ parallel walkers and beyond, thus constituting a versatile tool for Monte Carlo simulations in the era of massively parallel computing. We provide the fully documented source code for the approach applied to the paradigmatic example of the two-dimensional Ising model as starting point and reference for practitioners in the field.

  12. New Automotive Air Conditioning System Simulation Tool Developed in MATLAB/Simulink

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiss, T.; Chaney, L.; Meyer, J.

    Further improvements in vehicle fuel efficiency require accurate evaluation of the vehicle's transient total power requirement. When operated, the air conditioning (A/C) system is the largest auxiliary load on a vehicle; therefore, accurate evaluation of the load it places on the vehicle's engine and/or energy storage system is especially important. Vehicle simulation software, such as 'Autonomie,' has been used by OEMs to evaluate vehicles' energy performance. A transient A/C simulation tool incorporated into vehicle simulation models would also provide a tool for developing more efficient A/C systems through a thorough consideration of the transient A/C system performance. The dynamic system simulation software Matlab/Simulink was used to develop new and more efficient vehicle energy system controls. The various modeling methods used for the new simulation tool are described in detail. Comparison with measured data is provided to demonstrate the validity of the model.

  13. Theoretical analysis and Vsim simulation of a low-voltage high-efficiency 250 GHz gyrotron

    NASA Astrophysics Data System (ADS)

    An, Chenxiang; Zhang, Dian; Zhang, Jun; Zhong, Huihuang

    2018-02-01

    Low-voltage, high-frequency gyrotrons with hundreds of watts of power are useful in radar, magnetic resonance spectroscopy and plasma diagnostic applications. In this paper, a 10 kV, 478 W, 250 GHz gyrotron with an efficiency of nearly 40% and a pitch ratio of 1.5 was designed through linear and nonlinear numerical analyses and Vsim particle-in-cell (PIC) simulation. Vsim is a highly efficient parallel PIC code, but it has seldom been used to carry out electron beam-wave interaction simulations of gyro-devices. The parameter setup required for the Vsim simulations of the gyrotron is presented. The results of the Vsim simulations agree well with those of the nonlinear numerical calculations. The commercial software Vsim 7.2 completed the 3D gyrotron simulation in 80 h using a 20-core, 2.2 GHz personal computer with 256 GBytes of memory.

  14. The method of constant stimuli is inefficient

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Fitzhugh, Andrew

    1990-01-01

    Simpson (1988) has argued that the method of constant stimuli is as efficient as adaptive methods of threshold estimation and has supported this claim with simulations. It is shown that Simpson's simulations are not a reasonable model of the experimental process and that more plausible simulations confirm that adaptive methods are much more efficient than the method of constant stimuli.
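
    The efficiency argument is easy to reproduce in simulation: an adaptive staircase concentrates trials near the threshold, where responses are informative, while the method of constant stimuli spreads them over a fixed range. A minimal sketch with an assumed logistic observer (threshold, slope, step size and levels are all illustrative):

```python
import math
import random

random.seed(1)
THETA, SIGMA = 0.0, 1.0   # assumed true threshold and psychometric slope

def p_yes(x):
    # logistic psychometric function for a simulated observer
    return 1.0 / (1.0 + math.exp(-(x - THETA) / SIGMA))

def staircase(n_trials, start=4.0, step=0.5):
    # 1-up/1-down adaptive rule: converges toward the 50% point
    x, placements = start, []
    for _ in range(n_trials):
        placements.append(x)
        x += -step if random.random() < p_yes(x) else step
    return placements

def constant_stimuli(n_trials, levels=(-4, -3, -2, -1, 0, 1, 2, 3, 4)):
    # fixed, pre-chosen stimulus levels visited in rotation
    return [levels[i % len(levels)] for i in range(n_trials)]

def informative_fraction(placements, width=1.0):
    # share of trials placed within one slope-width of the true threshold
    return sum(abs(x - THETA) <= width for x in placements) / len(placements)

frac_adaptive = informative_fraction(staircase(400))
frac_constant = informative_fraction(constant_stimuli(400))
```

    The staircase places the large majority of its 400 trials near the threshold, while the constant-stimuli schedule wastes most trials at levels where the response is nearly certain.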

  15. Parallel Simulation of Subsonic Fluid Dynamics on a Cluster of Workstations.

    DTIC Science & Technology

    1994-11-01

    Parallel simulations model, for example, the flow of air inside wind musical instruments. Typical simulations achieve 80% parallel efficiency (speedup/processors) using 20 HP-Apollo workstations.

  16. Efficiency of including first-generation information in second-generation ranking and selection: results of computer simulation.

    Treesearch

    T.Z. Ye; K.J.S. Jayawickrama; G.R. Johnson

    2006-01-01

    Using computer simulation, we evaluated the impact of using first-generation information to increase selection efficiency in a second-generation breeding program. Selection efficiency was compared in terms of increase in rank correlation between estimated and true breeding values (i.e., ranking accuracy), reduction in coefficient of variation of correlation...

  17. The Effectiveness and Efficiency of Two Types of Simulation as Functions of Level of Elementary Education Training. Final Report.

    ERIC Educational Resources Information Center

    Girod, Gerald R.

    An experiment was performed to determine the efficiency of simulation teaching techniques in training elementary education teachers to identify and correct classroom management problems. The two presentation modes compared were film and audiotape. Twelve hypotheses were tested via analysis of variance to determine the relative efficiency of these…

  18. Sci-Thur PM – Brachytherapy 01: Fast brachytherapy dose calculations: Characterization of egs-brachy features to enhance simulation efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamberland, Marc; Taylor, Randle E.P.; Rogers, Da

    2016-08-15

    Purpose: egs-brachy is a fast, new EGSnrc user-code for brachytherapy applications. This study characterizes egs-brachy features that enhance simulation efficiency. Methods: Calculations are performed to characterize efficiency gains from various features. Simulations include radionuclide and miniature x-ray tube sources in water phantoms and idealized prostate, breast, and eye plaque treatments. Features characterized include voxel indexing of sources to reduce boundary checks during radiation transport, scoring collision kerma via tracklength estimator, recycling photons emitted from sources, and using phase space data to initiate simulations. Bremsstrahlung cross section enhancement (BCSE), uniform bremsstrahlung splitting (UBS), and Russian Roulette (RR) are considered for electronic brachytherapy. Results: Efficiency is enhanced by a factor of up to 300 using tracklength versus interaction scoring of collision kerma and by up to 2.7 and 2.6 using phase space sources and particle recycling respectively compared to simulations in which particles are initiated within sources. On a single 2.5 GHz Intel Xeon E5-2680 processor core, simulations approximating prostate and breast permanent implant ((2 mm)³ voxels) and eye plaque ((1 mm)³) treatments take as little as 9 s (prostate, eye) and up to 31 s (breast) to achieve 2% statistical uncertainty on doses within the PTV. For electronic brachytherapy, BCSE, UBS, and RR enhance efficiency by a factor >2000 compared to a factor of >10⁴ using a phase space source. Conclusion: egs-brachy features provide substantial efficiency gains, resulting in calculation times sufficiently fast for full Monte Carlo simulations for routine brachytherapy treatment planning.
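
    The tracklength-versus-interaction scoring gain has a simple origin: in a small voxel, collisions are rare events while track segments through the voxel are common, so the tracklength estimator reaches the same mean with far lower variance. A one-dimensional Python sketch (the geometry, pure-absorber medium and attenuation coefficient are illustrative assumptions, not egs-brachy's transport):

```python
import math
import random

random.seed(7)

SIGMA = 1.0            # total attenuation coefficient (pure absorber, assumed)
A, B = 1.0, 1.1        # thin scoring cell [A, B] along the beam axis
T = B - A              # cell thickness

def one_history():
    # photon starts at x = 0 along +x; free path sampled from Exp(SIGMA)
    path = random.expovariate(SIGMA)
    # tracklength estimator: chord length inside the cell, per unit thickness
    track = max(0.0, min(path, B) - A) / T
    # collision (interaction) estimator: scores only if the collision is in the cell
    coll = 1.0 / (SIGMA * T) if A <= path < B else 0.0
    return track, coll

N = 200_000
tr, co = zip(*(one_history() for _ in range(N)))
mean_tr, mean_co = sum(tr) / N, sum(co) / N
var_tr = sum((x - mean_tr) ** 2 for x in tr) / N
var_co = sum((x - mean_co) ** 2 for x in co) / N
# analytic mean fluence in the cell for this geometry
exact = (math.exp(-SIGMA * A) - math.exp(-SIGMA * B)) / (SIGMA * T)
```

    Both estimators are unbiased for the cell fluence, but the collision estimator's variance is orders of magnitude larger for a thin cell, which is the effect exploited by tracklength kerma scoring.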

  19. An epidemiological modeling and data integration framework.

    PubMed

    Pfeifer, B; Wurz, M; Hanser, F; Seger, M; Netzer, M; Osl, M; Modre-Osprian, R; Schreier, G; Baumgartner, C

    2010-01-01

    In this work, a cellular automaton software package for simulating different infectious diseases, storing the simulation results in a data warehouse system and analyzing the obtained results to generate prediction models as well as contingency plans, is proposed. The Brisbane H3N2 flu virus, which has been spreading during the winter season 2009, was used for simulation in the federal state of Tyrol, Austria. The simulation-modeling framework consists of an underlying cellular automaton. The cellular automaton model is parameterized by known disease parameters and geographical as well as demographical conditions are included for simulating the spreading. The data generated by simulation are stored in the back room of the data warehouse using the Talend Open Studio software package, and subsequent statistical and data mining tasks are performed using the tool, termed Knowledge Discovery in Database Designer (KD3). The obtained simulation results were used for generating prediction models for all nine federal states of Austria. The proposed framework provides a powerful and easy to handle interface for parameterizing and simulating different infectious diseases in order to generate prediction models and improve contingency plans for future events.
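
    A disease-spreading cellular automaton of the kind the framework parameterizes can be sketched in a few lines of Python with NumPy. The grid size, von Neumann neighbourhood, and per-step transmission and recovery probabilities below are illustrative assumptions, not the Tyrol calibration:

```python
import numpy as np

rng = np.random.default_rng(0)
S, I, R = 0, 1, 2                 # susceptible / infectious / recovered states
N = 60
grid = np.full((N, N), S)
grid[N // 2, N // 2] = I          # single index case in the centre
P_INF, P_REC = 0.3, 0.1           # assumed per-step transmission / recovery probabilities

def step(g):
    inf = (g == I).astype(int)
    # infectious von Neumann neighbours of every cell (toroidal boundary)
    nbrs = (np.roll(inf, 1, 0) + np.roll(inf, -1, 0) +
            np.roll(inf, 1, 1) + np.roll(inf, -1, 1))
    escape = (1.0 - P_INF) ** nbrs            # escape every neighbour's challenge
    nxt = g.copy()
    nxt[(g == S) & (rng.random(g.shape) > escape)] = I   # new infections
    nxt[(g == I) & (rng.random(g.shape) < P_REC)] = R    # recoveries
    return nxt

for _ in range(50):
    grid = step(grid)
attack_rate = float(np.mean(grid != S))       # fraction ever infected
```

    Demographic and geographic heterogeneity, as in the framework described above, would enter through cell-specific probabilities rather than the uniform constants used here.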

  20. Text messaging during simulated driving.

    PubMed

    Drews, Frank A; Yazdani, Hina; Godfrey, Celeste N; Cooper, Joel M; Strayer, David L

    2009-10-01

    This research aims to identify the impact of text messaging on simulated driving performance. In the past decade, a number of on-road, epidemiological, and simulator-based studies reported the negative impact of talking on a cell phone on driving behavior. However, the impact of text messaging on simulated driving performance is still not fully understood. Forty participants engaged in both a single task (driving) and a dual task (driving and text messaging) in a high-fidelity driving simulator. Analysis of driving performance revealed that participants in the dual-task condition responded more slowly to the onset of braking lights and showed impairments in forward and lateral control compared with a driving-only condition. Moreover, text-messaging drivers were involved in more crashes than drivers not engaged in text messaging. Text messaging while driving has a negative impact on simulated driving performance. This negative impact appears to exceed the impact of conversing on a cell phone while driving. The results increase our understanding of driver distraction and have potential implications for public safety and device development.

  1. Methods for Computationally Efficient Structured CFD Simulations of Complex Turbomachinery Flows

    NASA Technical Reports Server (NTRS)

    Herrick, Gregory P.; Chen, Jen-Ping

    2012-01-01

    This research presents more efficient computational methods by which to perform multi-block structured Computational Fluid Dynamics (CFD) simulations of turbomachinery, thus facilitating higher-fidelity solutions of complicated geometries and their associated flows. This computational framework offers flexibility in allocating resources to balance process count and wall-clock computation time, while facilitating research interests of simulating axial compressor stall inception with more complete gridding of the flow passages and rotor tip clearance regions than is typically practiced with structured codes. The paradigm presented herein facilitates CFD simulation of previously impractical geometries and flows. These methods are validated and demonstrate improved computational efficiency when applied to complicated geometries and flows.

  2. Simplicity and efficiency of integrate-and-fire neuron models.

    PubMed

    Plesser, Hans E; Diesmann, Markus

    2009-02-01

    Lovelace and Cios (2008) recently proposed a very simple spiking neuron (VSSN) model for simulations of large neuronal networks as an efficient replacement for the integrate-and-fire neuron model. We argue that the VSSN model falls behind key advances in neuronal network modeling over the past 20 years, in particular, techniques that permit simulators to compute the state of the neuron without repeated summation over the history of input spikes and to integrate the subthreshold dynamics exactly. State-of-the-art solvers for networks of integrate-and-fire model neurons are substantially more efficient than the VSSN simulator and allow routine simulations of networks of some 10⁵ neurons and 10⁹ connections on moderate computer clusters.
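
    The key advance cited here, computing the neuron state without re-summing the input-spike history, rests on the exponential propagator of the leaky integrator: between events the membrane decays exactly, so each input spike requires only an O(1) update. A minimal sketch with an assumed time constant, threshold and weights:

```python
import math

TAU = 10.0       # membrane time constant in ms (assumed)
V_TH = 1.0       # spike threshold (assumed)
V_RESET = 0.0

class ExactLIF:
    """Leaky integrate-and-fire neuron advanced event-by-event.

    Between input spikes the membrane decays exactly as v * exp(-dt/tau),
    so the state is updated in O(1) per event instead of re-summing the
    whole input-spike history.
    """
    def __init__(self):
        self.v, self.t = 0.0, 0.0

    def receive(self, t_spike, weight):
        dt = t_spike - self.t
        self.v = self.v * math.exp(-dt / TAU) + weight  # exact propagator
        self.t = t_spike
        if self.v >= V_TH:
            self.v = V_RESET
            return True          # output spike
        return False

neuron = ExactLIF()
out = [neuron.receive(t, 0.4) for t in (1.0, 2.0, 3.0, 4.0)]
```

    Three identical inputs are needed before the decayed sum crosses threshold; the update cost is independent of how many spikes the neuron has already received.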

  3. Vectorized algorithms for spiking neural network simulation.

    PubMed

    Brette, Romain; Goodman, Dan F M

    2011-06-01

    High-level languages (Matlab, Python) are popular in neuroscience because they are flexible and accelerate development. However, for simulating spiking neural networks, the cost of interpretation is a bottleneck. We describe a set of algorithms to simulate large spiking neural networks efficiently with high-level languages using vector-based operations. These algorithms constitute the core of Brian, a spiking neural network simulator written in the Python language. Vectorized simulation makes it possible to combine the flexibility of high-level languages with the computational efficiency usually associated with compiled languages.
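
    The vectorization idea at the core of such simulators is that one time step touches all neurons through whole-array operations, and spike propagation becomes a single matrix slice-and-sum. A minimal NumPy sketch (network size, connectivity, weights and the constant drive are illustrative assumptions, not Brian's actual implementation):

```python
import numpy as np

rng = np.random.default_rng(42)
N, TAU, DT, V_TH = 1000, 20.0, 1.0, 1.0
DECAY = np.exp(-DT / TAU)                 # exact per-step leak factor
DRIVE = 0.06                              # assumed constant external input

# sparse random weights: ~2% connectivity, uniform weight 0.1
W = 0.1 * (rng.random((N, N)) < 0.02)

v = rng.random(N)                         # initial membrane potentials
total_spikes = 0
for _ in range(100):
    v = v * DECAY + DRIVE                 # update every neuron in one array op
    spiked = v >= V_TH                    # boolean spike vector
    total_spikes += int(spiked.sum())
    v[spiked] = 0.0                       # reset spiking neurons
    v = v + W[:, spiked].sum(axis=1)      # deliver all spikes with one slice + sum
```

    Every line inside the loop is a vector operation, so the interpretation overhead of the high-level language is paid once per time step rather than once per neuron.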

  4. Discrete-Event Simulation Models of Plasmodium falciparum Malaria

    PubMed Central

    McKenzie, F. Ellis; Wong, Roger C.; Bossert, William H.

    2008-01-01

    We develop discrete-event simulation models using a single “timeline” variable to represent the Plasmodium falciparum lifecycle in individual hosts and vectors within interacting host and vector populations. Where they are comparable, our conclusions regarding the relative importance of vector mortality and the durations of host immunity and parasite development are congruent with those of classic differential-equation models of malaria epidemiology. However, our results also imply that in regions with intense perennial transmission, the influence of mosquito mortality on malaria prevalence in humans may be rivaled by that of the duration of host infectivity. PMID:18668185

  5. Dynamics of climate-based malaria transmission model with age-structured human population

    NASA Astrophysics Data System (ADS)

    Addawe, Joel; Pajimola, Aprimelle Kris

    2016-10-01

    In this paper, we propose to study the dynamics of malaria transmission with a periodic birth rate of the vector and an age structure for the human population. The human population is divided into two compartments: pre-school (0-5 years) and the rest of the human population. We show the existence of a disease-free equilibrium point. Using published epidemiological parameters, we use numerical simulations to show the potential effect of climate change on the dynamics of age-structured malaria transmission. Numerical simulations suggest that there exists an asymptotically attractive solution that is positive and periodic.
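
    A periodically forced, age-stratified transmission model of this flavour can be sketched with forward-Euler steps; the trajectory settles onto a positive periodic solution like the attractor described above. All parameter values here are illustrative assumptions, not the published calibration:

```python
import math

# illustrative parameters (assumed), not the paper's calibrated values
LAM0, EPS = 0.02, 0.5          # mean force of infection and seasonal amplitude
GAMMA_C, GAMMA_A = 0.05, 0.10  # daily recovery rates: pre-school vs older hosts

def lam(t):
    # climate-driven periodic transmission (period = 365 days)
    return LAM0 * (1.0 + EPS * math.cos(2.0 * math.pi * t / 365.0))

ic, ia, traj = 0.01, 0.01, []
for day in range(5 * 365):             # forward Euler, one-day steps
    f = lam(day)
    ic += f * (1 - ic) - GAMMA_C * ic  # infected fraction, 0-5 years
    ia += f * (1 - ia) - GAMMA_A * ia  # infected fraction, rest of population
    traj.append(ic)
```

    After the initial transient decays, the state at the same calendar day repeats from year to year, which is the numerical signature of an asymptotically attractive positive periodic solution.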

  6. The application of epidemiology in national veterinary services: Challenges and threats in Brazil.

    PubMed

    Gonçalves, Vitor Salvador Picão; de Moraes, Geraldo Marcos

    2017-02-01

    The application of epidemiology in national veterinary services must take place at the interface between science and politics. Animal health policy development and implementation require attention to macro-epidemiology, the study of economic, social and policy inputs that affect the distribution and impact of animal or human disease at the national level. The world has changed fast over the last three decades including the delivery of veterinary services, their remit and the challenges addressed by public and animal health policies. Rethinking the role of public services and how to make public programs more efficient has been at the heart of the political discussion. The WTO through its SPS Agreement has changed the way in which national veterinary services operate and how trade decisions are made. Most low and middle income countries are still struggling to keep up with the new international scene. Some of these countries, such as Brazil, have very important livestock industries and are key to the global food systems. Over the last two decades, Brazil became a leading player in exports of livestock products, including poultry, and this created a strong pressure on the national veterinary services to respond to trade demands, leading to focus animal health policies on the export-driven sector. During the same period, Brazil has gone a long way in the direction of integrating epidemiology with veterinary services. Epidemiology groups grew at main universities and have been working with government to provide support to animal health policy. The scope and quality of the applied epidemiological work improved and focused on complex data analysis and development of technologies and tools to solve specific disease problems. Many public veterinary officers were trained in modern epidemiological methods. However, there are important institutional bottlenecks that limit the impact of epidemiology in evidence-based decision making. 
More complex challenges require high levels of expertise in veterinary epidemiology, as well as institutional models that provide an appropriate environment for building and sustaining capacity in national veterinary services. Integrating epidemiology with animal health policy is a great opportunity if epidemiologists can understand the real issues, including the socio-economic dimensions of disease management, and focus on innovation and production of knowledge. It may be a trap if epidemiologists are restricted to answering specific decision-making questions and policy makers perceive their role exclusively as data analysts or providers of technological solutions. Fostering solutions for complex issues is key to successful integration with policy making.

  7. Hybrid ODE/SSA methods and the cell cycle model

    NASA Astrophysics Data System (ADS)

    Wang, S.; Chen, M.; Cao, Y.

    2017-07-01

    Stochastic effects in cellular systems have been an important topic in systems biology. Stochastic modeling and simulation methods are important tools to study such effects. Given the low efficiency of stochastic simulation algorithms, the hybrid method, which combines an ordinary differential equation (ODE) system with a stochastic chemically reacting system, shows its unique advantages in the modeling and simulation of biochemical systems. The efficiency of the hybrid method is usually limited by reactions in the stochastic subsystem, which are modeled and simulated using Gillespie's framework and frequently interrupt the integration of the ODE subsystem. In this paper we develop an efficient implementation approach for the hybrid method coupled with traditional ODE solvers. We also compare the efficiency of hybrid methods with three widely used ODE solvers: RADAU5, DASSL, and DLSODAR. Numerical experiments with three biochemical models are presented. A detailed discussion is presented for the performances of the three ODE solvers.
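
    The stochastic subsystem in such hybrid schemes is typically advanced with Gillespie's direct method: sample an exponential waiting time from the total propensity, then pick a reaction in proportion to its propensity. A minimal Python sketch for a birth-death process (the rate constants are illustrative):

```python
import random

random.seed(3)

def gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=200.0):
    """Gillespie's direct method for the network
    0 -> X (rate k_birth) and X -> 0 (rate k_death * X)."""
    t, x = 0.0, x0
    while True:
        a_birth, a_death = k_birth, k_death * x
        a_total = a_birth + a_death
        t += random.expovariate(a_total)    # exponential waiting time
        if t >= t_end:
            return x
        if random.random() * a_total < a_birth:
            x += 1                          # birth event
        else:
            x -= 1                          # death event

# stationary distribution is Poisson(k_birth / k_death), mean 100
mean = sum(gillespie_birth_death() for _ in range(50)) / 50
```

    In a hybrid scheme, the fast species driving this propensity would instead be integrated as ODEs, and the exponential clock above is what repeatedly interrupts the ODE solver.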

  8. Improving the sampling efficiency of Monte Carlo molecular simulations: an evolutionary approach

    NASA Astrophysics Data System (ADS)

    Leblanc, Benoit; Braunschweig, Bertrand; Toulhoat, Hervé; Lutton, Evelyne

    We present a new approach in order to improve the convergence of Monte Carlo (MC) simulations of molecular systems belonging to complex energetic landscapes: the problem is redefined in terms of the dynamic allocation of MC move frequencies depending on their past efficiency, measured with respect to a relevant sampling criterion. We introduce various empirical criteria with the aim of accounting for the proper convergence in phase space sampling. The dynamic allocation is performed over parallel simulations by means of a new evolutionary algorithm involving 'immortal' individuals. The method is benchmarked with respect to conventional procedures on a model for melt linear polyethylene. We record significant improvement in sampling efficiencies, and thus in computational load, while the optimal sets of move frequencies are liable to allow interesting physical insights into the particular systems simulated. This last aspect should provide a new tool for designing more efficient new MC moves.

  9. FERN - a Java framework for stochastic simulation and evaluation of reaction networks.

    PubMed

    Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf

    2008-08-29

    Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either a) do not provide the most efficient simulation algorithms and are difficult to extend, b) cannot be easily integrated into other applications, or c) do not allow the user to monitor and intervene in the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches more flexible tools are necessary. In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real-time from within the Cytoscape or CellDesigner environment. FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms both for exact and approximate stochastic simulation and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. 
Second, it can be used in a straightforward way both as a stand-alone program and within new systems biology applications. Finally, complex scenarios requiring intervention during the simulation progress can be modelled easily with FERN.
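
    The observer idea described above can be illustrated with a small Python sketch in the same spirit (the class and function names here are hypothetical, not FERN's Java API): an observer object is notified after every reaction event and can track, or in principle control, the run without the simulator knowing anything about it:

```python
import random

random.seed(5)

class PeakTracker:
    """Observer in the FERN style (sketch): notified after every event."""
    def __init__(self, species):
        self.species, self.peak = species, 0
    def step(self, t, state):
        self.peak = max(self.peak, state[self.species])

def ssa(state, propensities, updates, t_end, observers=()):
    # minimal direct-method loop that reports progress to observers
    t = 0.0
    while t < t_end:
        a = [f(state) for f in propensities]
        a_total = sum(a)
        if a_total == 0.0:
            break                           # no reaction can fire any more
        t += random.expovariate(a_total)
        r, acc = random.random() * a_total, 0.0
        for i, a_i in enumerate(a):
            acc += a_i
            if r < acc:
                for species, change in updates[i].items():
                    state[species] += change
                break
        for obs in observers:
            obs.step(t, state)              # observer hook
    return state

# irreversible isomerisation A -> B at rate 1.0 per molecule of A
tracker = PeakTracker("B")
final = ssa({"A": 100, "B": 0},
            propensities=[lambda s: 1.0 * s["A"]],
            updates=[{"A": -1, "B": +1}],
            t_end=100.0,
            observers=(tracker,))
```

    Because the observer is decoupled from the algorithm, the same tracker works unchanged with any simulator that calls the `step` hook, which is the design choice that lets FERN expose progress on every level.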

  10. A regularized variable selection procedure in additive hazards model with stratified case-cohort design.

    PubMed

    Ni, Ai; Cai, Jianwen

    2018-07-01

    Case-cohort designs are commonly used in large epidemiological studies to reduce the cost associated with covariate measurement. In many such studies the number of covariates is very large. An efficient variable selection method is needed for case-cohort studies where the covariates are only observed in a subset of the sample. Current literature on this topic has been focused on the proportional hazards model. However, in many studies the additive hazards model is preferred over the proportional hazards model either because the proportional hazards assumption is violated or the additive hazards model provides more relevant information to the research question. Motivated by one such study, the Atherosclerosis Risk in Communities (ARIC) study, we investigate the properties of a regularized variable selection procedure in stratified case-cohort design under an additive hazards model with a diverging number of parameters. We establish the consistency and asymptotic normality of the penalized estimator and prove its oracle property. Simulation studies are conducted to assess the finite sample performance of the proposed method with a modified cross-validation tuning parameter selection method. We apply the variable selection procedure to the ARIC study to demonstrate its practical use.

  11. Maximum Likelihood Estimations and EM Algorithms with Length-biased Data

    PubMed Central

    Qin, Jing; Ning, Jing; Liu, Hao; Shen, Yu

    2012-01-01

    Length-biased sampling has been well recognized in economics, industrial reliability, etiology applications, and epidemiological, genetic and cancer screening studies. Length-biased right-censored data have a unique data structure different from traditional survival data. The nonparametric and semiparametric estimations and inference methods for traditional survival data are not directly applicable for length-biased right-censored data. We propose new expectation-maximization algorithms for estimations based on full likelihoods involving infinite dimensional parameters under three settings for length-biased data: estimating the nonparametric distribution function, estimating the nonparametric hazard function under an increasing failure rate constraint, and jointly estimating the baseline hazard function and the covariate coefficients under the Cox proportional hazards model. Extensive empirical simulation studies show that the maximum likelihood estimators perform well with moderate sample sizes and lead to more efficient estimators compared to the estimating equation approaches. The proposed estimates are also more robust to various right-censoring mechanisms. We prove the strong consistency properties of the estimators, and establish the asymptotic normality of the semi-parametric maximum likelihood estimators under the Cox model using modern empirical processes theory. We apply the proposed methods to a prevalent cohort medical study. Supplemental materials are available online. PMID:22323840

  12. A Statistical Approach for Testing Cross-Phenotype Effects of Rare Variants

    PubMed Central

    Broadaway, K. Alaine; Cutler, David J.; Duncan, Richard; Moore, Jacob L.; Ware, Erin B.; Jhun, Min A.; Bielak, Lawrence F.; Zhao, Wei; Smith, Jennifer A.; Peyser, Patricia A.; Kardia, Sharon L.R.; Ghosh, Debashis; Epstein, Michael P.

    2016-01-01

    Increasing empirical evidence suggests that many genetic variants influence multiple distinct phenotypes. When cross-phenotype effects exist, multivariate association methods that consider pleiotropy are often more powerful than univariate methods that model each phenotype separately. Although several statistical approaches exist for testing cross-phenotype effects for common variants, there is a lack of similar tests for gene-based analysis of rare variants. In order to fill this important gap, we introduce a statistical method for cross-phenotype analysis of rare variants using a nonparametric distance-covariance approach that compares similarity in multivariate phenotypes to similarity in rare-variant genotypes across a gene. The approach can accommodate both binary and continuous phenotypes and further can adjust for covariates. Our approach yields a closed-form test whose significance can be evaluated analytically, thereby improving computational efficiency and permitting application on a genome-wide scale. We use simulated data to demonstrate that our method, which we refer to as the Gene Association with Multiple Traits (GAMuT) test, provides increased power over competing approaches. We also illustrate our approach using exome-chip data from the Genetic Epidemiology Network of Arteriopathy. PMID:26942286

  13. Analysis of Sequence Data Under Multivariate Trait-Dependent Sampling.

    PubMed

    Tao, Ran; Zeng, Donglin; Franceschini, Nora; North, Kari E; Boerwinkle, Eric; Lin, Dan-Yu

    2015-06-01

    High-throughput DNA sequencing allows for the genotyping of common and rare variants for genetic association studies. At the present time and for the foreseeable future, it is not economically feasible to sequence all individuals in a large cohort. A cost-effective strategy is to sequence those individuals with extreme values of a quantitative trait. We consider the design under which the sampling depends on multiple quantitative traits. Under such trait-dependent sampling, standard linear regression analysis can result in bias of parameter estimation, inflation of type I error, and loss of power. We construct a likelihood function that properly reflects the sampling mechanism and utilizes all available data. We implement a computationally efficient EM algorithm and establish the theoretical properties of the resulting maximum likelihood estimators. Our methods can be used to perform separate inference on each trait or simultaneous inference on multiple traits. We pay special attention to gene-level association tests for rare variants. We demonstrate the superiority of the proposed methods over standard linear regression through extensive simulation studies. We provide applications to the Cohorts for Heart and Aging Research in Genomic Epidemiology Targeted Sequencing Study and the National Heart, Lung, and Blood Institute Exome Sequencing Project.

  14. Formalizing the Role of Agent-Based Modeling in Causal Inference and Epidemiology

    PubMed Central

    Marshall, Brandon D. L.; Galea, Sandro

    2015-01-01

    Calls for the adoption of complex systems approaches, including agent-based modeling, in the field of epidemiology have largely centered on the potential for such methods to examine complex disease etiologies, which are characterized by feedback behavior, interference, threshold dynamics, and multiple interacting causal effects. However, considerable theoretical and practical issues impede the capacity of agent-based methods to examine and evaluate causal effects and thus illuminate new areas for intervention. We build on this work by describing how agent-based models can be used to simulate counterfactual outcomes in the presence of complexity. We show that these models are of particular utility when the hypothesized causal mechanisms exhibit a high degree of interdependence between multiple causal effects and when interference (i.e., one person's exposure affects the outcome of others) is present and of intrinsic scientific interest. Although not without challenges, agent-based modeling (and complex systems methods broadly) represent a promising novel approach to identify and evaluate complex causal effects, and they are thus well suited to complement other modern epidemiologic methods of etiologic inquiry. PMID:25480821
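
    The counterfactual comparison under interference described above can be sketched with a toy model: run the same contagion process with and without an intervention and difference the outcomes; because one agent's vaccination also shields its neighbours, the effect exceeds the directly protected count alone. The network, parameters and vaccination scheme below are illustrative assumptions:

```python
import random

def simulate(vaccinated, seed=1, n=100, k=2, p=0.7, steps=60):
    """Toy agent-based contagion on a ring network (SI dynamics).

    Returns the number of agents ever infected; vaccinated agents can
    neither acquire nor transmit infection. All parameters are illustrative.
    """
    rng = random.Random(seed)
    nbrs = {i: [(i + d) % n for d in range(-k, k + 1) if d != 0]
            for i in range(n)}
    infected = {0}                       # index case (assumed unvaccinated)
    for _ in range(steps):
        new = set()
        for i in infected:
            for j in nbrs[i]:
                if j not in infected and j not in vaccinated and rng.random() < p:
                    new.add(j)
        infected |= new
    return len(infected)

counterfactual = simulate(vaccinated=set())              # no intervention
factual = simulate(vaccinated=set(range(5, 100, 10)))    # vaccinate 10 agents
causal_effect = counterfactual - factual                 # direct + indirect protection
```

    Running both arms with identical inputs is exactly the counterfactual contrast that is unidentifiable from observational data when interference is present, which is why simulation is attractive here.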

  15. Comparison of AGE and Spectral Methods for the Simulation of Far-Wakes

    NASA Technical Reports Server (NTRS)

    Bisset, D. K.; Rogers, M. M.; Kega, Dennis (Technical Monitor)

    1999-01-01

    Turbulent flow simulation methods based on finite differences are attractive for their simplicity, flexibility and efficiency, but not always for accuracy or stability. This report demonstrates that a good compromise is possible with the Advected Grid Explicit (AGE) method. AGE has proven to be both efficient and accurate for simulating turbulent free-shear flows, including planar mixing layers and planar jets. Its efficiency results from its localized fully explicit finite difference formulation (Bisset 1998a,b) that is very straightforward to compute, outweighing the need for a fairly small timestep. Also, most of the successful simulations were slightly under-resolved, and therefore they were, in effect, large-eddy simulations (LES) without a sub-grid-scale (SGS) model, rather than direct numerical simulations (DNS). The principle is that the role of the smallest scales of turbulent motion (when the Reynolds number is not too low) is to dissipate turbulent energy, and therefore they do not have to be simulated when the numerical method is inherently dissipative at its resolution limits. Such simulations are termed 'auto-LES' (LES with automatic SGS modeling) in this report.

  16. [Use of satellites for public health purposes in tropical areas].

    PubMed

    Meynard, J B; Orlandi, E; Rogier, C; Sbai Idrissi, K; Deparis, X; Peyreffite, C; Lightburn, E; Malosse, D; Migliani, R; Spiegel, A; Boutin, J P

    2003-01-01

    The epidemiological hallmark of the new millennium has been the emergence or recrudescence of transmissible diseases with high epidemic potential. Disease tracking is becoming an increasingly global task requiring the implementation of ever more sophisticated control strategies and facilities for sustainable development. A promising initiative involves the use of satellite technology to monitor and forecast the spread of disease. The Health Early Warning System (HEWS) was designed based on the successful application of satellite data in food programs as well as in other areas (e.g. weather, farming and fishing). The HEWS integrates data from communications, remote-sensing and positioning satellites. The purpose of this review is to present the main studies containing satellite data on public health in tropical areas. Satellite data have enabled the development of more reactive epidemiological tracking networks better suited to increasing population mobility, the correlation of environmental factors (vegetation index, rainfall and ocean surface color) with human, animal and insect factors in epidemiological studies, and the assessment of the role of such factors in the development or reappearance of disease. Satellite technology holds great promise for more efficient management of public health problems in tropical areas.

  17. Epidemiologic characteristics of scrub typhus on Jeju Island.

    PubMed

    Lee, Sung Uk

    2017-01-01

    Scrub typhus is the most common febrile disease in Korea during the autumn. Jeju Island is the largest island in South Korea and has a distinctive oceanic climate. This study aimed to identify the epidemiologic characteristics of scrub typhus on Jeju Island. From January 2011 to December 2016, 446 patients were diagnosed with scrub typhus on Jeju Island. The patients' personal data and the environmental factors that might be related to scrub typhus were investigated and retrospectively analyzed. The median age of the patients was 58 years (range, 8 to 91), and 43% of them worked in the agricultural, forestry or livestock industry. Regardless of their job, 87% of the patients had a history of either working outdoors or of other outdoor activities before developing scrub typhus. The south and southeast regions of Jeju Island, especially Namwon-eup, showed the highest incidence of scrub typhus. Workers in mandarin orange orchards seemed to be the group at highest risk of scrub typhus infection. Scrub typhus on Jeju Island showed unique characteristics. To prevent scrub typhus efficiently, individual regional approaches should be developed each year based on the epidemiologic characteristics of the disease.

  18. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    NASA Astrophysics Data System (ADS)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS with simulation using LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared with three other simulation methods. The results show that for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
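    The stratification step that LHS contributes can be sketched as follows. This is a minimal illustration, not the authors' LULHS code, and the `latin_hypercube` helper is hypothetical:

```python
# Minimal Latin Hypercube Sampling sketch: each of the n equal-width
# strata of [0, 1) contributes exactly one sample per dimension, and an
# independent permutation per dimension breaks cross-dimension correlation.
import numpy as np

def latin_hypercube(n, dims, seed=None):
    rng = np.random.default_rng(seed)
    # one uniform draw inside each of the n strata, per dimension
    samples = (np.arange(n)[:, None] + rng.random((n, dims))) / n
    # independently shuffle the stratum order in every dimension
    for d in range(dims):
        samples[:, d] = samples[rng.permutation(n), d]
    return samples

pts = latin_hypercube(10, 2, seed=42)
```

    Mapping these uniform samples through an inverse CDF and multiplying by an LU (or Cholesky) factor of the target covariance matrix is the usual route from stratified uniforms to spatially correlated realizations, which is the combination the LULHS method exploits.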

  19. An efficient surrogate-based simulation-optimization method for calibrating a regional MODFLOW model

    NASA Astrophysics Data System (ADS)

    Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.

    2017-01-01

    The simulation-optimization method entails a large number of model simulations, which is computationally intensive or even prohibitive if each model simulation is extremely time-consuming. Statistical models have been examined as surrogates of the high-fidelity physical model during the simulation-optimization process to tackle this problem. Among them, Multivariate Adaptive Regression Splines (MARS), a non-parametric adaptive regression method, is particularly good at handling high-dimensional and discontinuous data. Furthermore, the stability and accuracy of a MARS model can be improved by bootstrap aggregating, namely, bagging. In this paper, the Bagging MARS (BMARS) method is integrated into a surrogate-based simulation-optimization framework to calibrate a three-dimensional MODFLOW model, which is developed to simulate the groundwater flow in an arid hardrock-alluvium region in northwestern Oman. The physical MODFLOW model is surrogated by the statistical model developed using the BMARS algorithm. The surrogate model, which is fitted and validated using a training dataset generated by the physical model, can approximate solutions rapidly. An efficient Sobol' method is employed to calculate global sensitivities of head outputs to input parameters, which are used to analyze their spatiotemporal importance for the model outputs. Only sensitive parameters are included in the calibration process to further improve computational efficiency. The normalized root mean square error (NRMSE) between measured and simulated heads at observation wells is used as the objective function to be minimized during optimization. The reasonable history match between the simulated and observed heads demonstrated the feasibility of this highly efficient calibration framework.
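    The calibration objective described above can be sketched as follows. The exact normalization convention used by the authors is not stated, so normalizing the RMSE by the observed range is an assumption here, and the head values are hypothetical:

```python
# Hedged sketch of an NRMSE calibration objective: root mean square error
# between observed and simulated heads, normalized by the observed range.
import numpy as np

def nrmse(observed, simulated):
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    rmse = np.sqrt(np.mean((observed - simulated) ** 2))
    return rmse / (observed.max() - observed.min())

obs = np.array([10.0, 12.0, 15.0, 13.0])   # hypothetical heads (m)
sim = np.array([10.5, 11.5, 15.5, 12.5])   # hypothetical model output
score = nrmse(obs, sim)
```

    An optimizer then proposes parameter sets, the (cheap) surrogate predicts the heads, and the parameter set minimizing this score is kept.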

  20. HRSSA - Efficient hybrid stochastic simulation for spatially homogeneous biochemical reaction networks

    NASA Astrophysics Data System (ADS)

    Marchetti, Luca; Priami, Corrado; Thanh, Vo Hong

    2016-07-01

    This paper introduces HRSSA (Hybrid Rejection-based Stochastic Simulation Algorithm), a new efficient hybrid stochastic simulation algorithm for spatially homogeneous biochemical reaction networks. HRSSA is built on top of RSSA, an exact stochastic simulation algorithm that relies on propensity bounds to select next reaction firings and to reduce the average number of reaction propensity updates needed during the simulation. HRSSA exploits the computational advantage of propensity bounds to manage time-varying transition propensities and to apply dynamic partitioning of reactions, which constitute the two most significant bottlenecks of hybrid simulation. A comprehensive set of simulation benchmarks is provided for evaluating the performance and accuracy of HRSSA against other state-of-the-art algorithms.
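    For context, the exact stochastic simulation baseline that RSSA and HRSSA refine can be sketched for a single irreversible reaction. This is the classic Gillespie direct method, not the HRSSA algorithm itself, and the reaction and rate are illustrative:

```python
# Minimal exact SSA (Gillespie direct method) for the irreversible
# isomerization A -> B with mass-action rate c*A. RSSA/HRSSA refine this
# baseline by bounding propensities so they need not be recomputed at
# every firing.
import random

def ssa_isomerization(n_a, rate, t_end, seed=0):
    rng = random.Random(seed)
    t, n_b = 0.0, 0
    while n_a > 0:
        a0 = rate * n_a              # total propensity
        t += rng.expovariate(a0)     # exponential waiting time to next event
        if t > t_end:
            break
        n_a -= 1                     # fire A -> B
        n_b += 1
    return n_a, n_b

na, nb = ssa_isomerization(1000, rate=1.0, t_end=5.0)
```

    Hybrid methods like HRSSA treat fast, high-population reactions deterministically and reserve this event-by-event simulation for slow reactions, which is where the dynamic partitioning mentioned above comes in.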

  1. Distribution and phenology of ixodid ticks in southern Zambia.

    PubMed

    Speybroeck, N; Madder, M; Van Den Bossche, P; Mtambo, J; Berkvens, N; Chaka, G; Mulumba, M; Brandt, J; Tirry, L; Berkvens, D

    2002-12-01

    Distribution data for epidemiologically important ticks (Acari: Ixodidae) in the Southern Province of Zambia, one of the main cattle areas of the country, are presented. Boophilus microplus (Canestrini) was not recorded in southern Zambia, whereas Boophilus decoloratus (Koch) is present throughout the area. New distribution patterns for less economically important ixodid ticks are also discussed. Southern Zambia is a transition zone because it is the most northern area in Africa where mixed Rhipicephalus appendiculatus Neumann and Rhipicephalus zambeziensis Walker, Norval & Corwin populations were reported. Although a second generation of adult R. appendiculatus/R. zambeziensis was encountered, simulations indicated that this phenomenon is very rare in southern Zambia, mainly because of the colder temperatures during the early dry season and lower rainfall. These simulations were supported by a development trial under experimental conditions. Tick body size measurements showed that southern Zambian ticks are larger than eastern Zambian R. appendiculatus. It is hypothesized that body size is related to diapausing intensity in this species. The epidemiological consequences are that a different approach to control Theileria parva (Theiler) (Piroplasmida: Theileriidae) and other tick-borne diseases is needed in southern Zambia, compared to the one adopted in eastern Zambia.

  2. Development of a resource modelling tool to support decision makers in pandemic influenza preparedness: The AsiaFluCap Simulator

    PubMed Central

    2012-01-01

    Background Health care planning for pandemic influenza is a challenging task which requires predictive models by which the impact of different response strategies can be evaluated. However, current preparedness plans and simulation exercises, as well as freely available simulation models previously made for policy makers, do not explicitly address the availability of health care resources or determine the impact of shortages on public health. Nevertheless, the feasibility of implementing the response measures or interventions described in plans and trained in exercises depends on the health system's available resource capacity. As part of the AsiaFluCap project, we developed a comprehensive and flexible resource modelling tool to support public health officials in understanding and preparing for surges in resource demand during future pandemics. Results The AsiaFluCap Simulator is a combination of a resource model containing 28 health care resources and an epidemiological model. The tool was built in MS Excel© and contains a user-friendly interface which allows users to select mild or severe pandemic scenarios, change resource parameters and run simulations for one or multiple regions. Besides epidemiological estimations, the simulator provides indications on resource gaps or surpluses, and the impact of shortages on public health for each selected region. It allows for a comparative analysis of the effects of resource availability and consequences of different strategies of resource use, which can provide guidance on resource prioritising and/or mobilisation. Simulation results are displayed in various tables and graphs, and can also be easily exported to GIS software to create maps for geographical analysis of the distribution of resources. 
Conclusions The AsiaFluCap Simulator is freely available software (http://www.cdprg.org) which can be used by policy makers, policy advisors, donors and other stakeholders involved in preparedness for providing evidence based and illustrative information on health care resource capacities during future pandemics. The tool can inform both preparedness plans and simulation exercises and can help increase the general understanding of dynamics in resource capacities during a pandemic. The combination of a mathematical model with multiple resources and the linkage to GIS for creating maps makes the tool unique compared to other available software. PMID:23061807
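    The coupling of an epidemiological model to resource demand can be sketched in miniature as follows. This toy SIR model with a hypothetical hospital-bed capacity only illustrates the simulator's core idea (epidemic dynamics driving a resource gap); it is not the AsiaFluCap implementation, and every parameter value is made up:

```python
# Toy sketch: a daily-step SIR epidemic drives time-varying demand for a
# resource (hypothetical hospital beds); demand above a fixed capacity
# appears as a gap, the quantity a resource-planning tool reports.
def sir_resource_gap(n=1_000_000, i0=100, beta=0.4, gamma=0.2,
                     hosp_frac=0.05, capacity=2_000, days=200):
    s, i, r = n - i0, i0, 0
    worst_gap = 0.0
    for _ in range(days):                 # forward Euler, one-day steps
        new_inf = beta * s * i / n
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        demand = hosp_frac * i            # beds needed today
        worst_gap = max(worst_gap, demand - capacity)
    return s, i, r, worst_gap

s, i, r, gap = sir_resource_gap()
```

    Running the same loop per region, with region-specific parameters and 28 resources instead of one, gives the kind of gap/surplus table the simulator exports for mapping.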

  3. Simple, quick and cost-efficient: A universal RT-PCR and sequencing strategy for genomic characterisation of foot-and-mouth disease viruses.

    PubMed

    Dill, V; Beer, M; Hoffmann, B

    2017-08-01

    Foot-and-mouth disease (FMD) is a major contributor to poverty and food insecurity in Africa and Asia, and it is one of the biggest threats to agriculture in highly developed countries. As FMD is extremely contagious, strategies for its prevention, early detection, and the immediate characterisation of outbreak strains are of great importance. The generation of whole-genome sequences enables phylogenetic characterisation and the epidemiological tracing of virus transmission pathways, and supports disease control strategies. This study describes the development and validation of a rapid, universal and cost-efficient RT-PCR system to generate genome sequences of FMDV, spanning from the IRES to the end of the open reading frame. The method was evaluated using twelve different virus strains covering all seven serotypes of FMDV. Additionally, samples from experimentally infected animals were tested to mimic diagnostic field samples. All primer pairs showed robust amplification with high sensitivity for all serotypes. In summary, the described assay is suitable for generating FMDV sequences from all serotypes to allow immediate phylogenetic analysis, detailed genotyping and molecular epidemiology. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. GNC Architecture Design for ARES Simulation. Revision 3.0. Revision 3.0

    NASA Technical Reports Server (NTRS)

    Gay, Robert

    2006-01-01

    The purpose of this document is to describe the GNC architecture and associated interfaces for all ARES simulations. Establishing a common architecture facilitates development across the ARES simulations and provides an efficient mechanism for creating an end-to-end simulation capability. In general, the GNC architecture is the framework in which all GNC development takes place, including sensor and effector models. All GNC software applications have a standard location within the architecture, making integration easier and thus more efficient.

  5. An efficiency improvement in warehouse operation using simulation analysis

    NASA Astrophysics Data System (ADS)

    Samattapapong, N.

    2017-11-01

    In general, industry requires an efficient system for warehouse operation. There are many important factors that must be considered when designing an efficient warehouse system. The most important is an effective warehouse operation system that can help transfer raw material, reduce costs and support transportation. Given all these factors, we were interested in studying the work systems and distribution of the warehouse. We started by collecting the data important for storage, such as information on products, size and location, data collection and production, and used all this information to build a simulation model in Flexsim® simulation software. The simulation analysis found that the conveyor belt was a bottleneck in the warehouse operation. Therefore, several scenarios to relieve that bottleneck were generated and tested through the simulation analysis process. The results showed that the average queuing time was reduced from 89.8% to 48.7% and the product transport capability increased from 10.2% to 50.9%. Thus, this appears to be an effective method for increasing efficiency in the warehouse operation.

  6. Improving Computational Efficiency of Prediction in Model-Based Prognostics Using the Unscented Transform

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew John; Goebel, Kai Frank

    2010-01-01

    Model-based prognostics captures system knowledge in the form of physics-based models of components, and how they fail, in order to obtain accurate predictions of end of life (EOL). EOL is predicted based on the estimated current state distribution of a component and expected profiles of future usage. In general, this requires simulations of the component using the underlying models. In this paper, we develop a simulation-based prediction methodology that achieves computational efficiency by performing only the minimal number of simulations needed in order to accurately approximate the mean and variance of the complete EOL distribution. This is performed through the use of the unscented transform, which predicts the means and covariances of a distribution passed through a nonlinear transformation. In this case, the EOL simulation acts as that nonlinear transformation. In this paper, we review the unscented transform, and describe how this concept is applied to efficient EOL prediction. As a case study, we develop a physics-based model of a solenoid valve, and perform simulation experiments to demonstrate improved computational efficiency without sacrificing prediction accuracy.
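    The sigma-point mechanics behind this approach can be sketched for a scalar state. This simplified version (a single tuning parameter kappa rather than the full alpha/beta/kappa parameterization) is illustrative, not the authors' implementation:

```python
# Minimal unscented transform sketch for a scalar state: 2n+1 sigma
# points are propagated through the nonlinearity and reweighted to
# approximate the transformed mean and variance, replacing many Monte
# Carlo simulations with just three deterministic ones.
import numpy as np

def unscented_mean_var(f, mean, var, kappa=2.0):
    n = 1                                    # scalar case for clarity
    lam = kappa
    spread = np.sqrt((n + lam) * var)
    sigma = np.array([mean, mean + spread, mean - spread])
    w = np.array([lam / (n + lam), 0.5 / (n + lam), 0.5 / (n + lam)])
    y = f(sigma)                             # one "simulation" per sigma point
    y_mean = np.dot(w, y)
    y_var = np.dot(w, (y - y_mean) ** 2)
    return y_mean, y_var

# example nonlinearity standing in for an EOL simulation
m, v = unscented_mean_var(lambda x: x ** 2, mean=0.0, var=1.0, kappa=2.0)
```

    For x ~ N(0, 1) and f(x) = x^2 the true moments are E[f] = 1 and Var[f] = 2; with n + kappa = 3 (the usual Gaussian heuristic) the three sigma points recover both, which is the efficiency argument: a handful of well-chosen simulations instead of thousands of sampled ones.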

  7. Efficient classical simulation of the Deutsch-Jozsa and Simon's algorithms

    NASA Astrophysics Data System (ADS)

    Johansson, Niklas; Larsson, Jan-Åke

    2017-09-01

    A long-standing aim of quantum information research is to understand what gives quantum computers their advantage. This requires separating problems that need genuinely quantum resources from those for which classical resources are enough. Two examples of quantum speed-up are the Deutsch-Jozsa and Simon's problem, both efficiently solvable on a quantum Turing machine, and both believed to lack efficient classical solutions. Here we present a framework that can simulate both quantum algorithms efficiently, solving the Deutsch-Jozsa problem with probability 1 using only one oracle query, and Simon's problem using linearly many oracle queries, just as expected of an ideal quantum computer. The presented simulation framework is in turn efficiently simulatable in a classical probabilistic Turing machine. This shows that the Deutsch-Jozsa and Simon's problem do not require any genuinely quantum resources, and that the quantum algorithms show no speed-up when compared with their corresponding classical simulation. Finally, this gives insight into what properties are needed in the two algorithms and calls for further study of oracle separation between quantum and classical computation.

  8. Dynamic modeling and verification of an energy-efficient greenhouse with an aquaponic system using TRNSYS

    NASA Astrophysics Data System (ADS)

    Amin, Majdi Talal

    Currently, there is no integrated dynamic simulation program for an energy-efficient greenhouse coupled with an aquaponic system. This research is intended to promote the thermal management of greenhouses in order to provide sustainable food production with the lowest possible energy use and material waste. A brief introduction to greenhouses, passive houses, energy efficiency, renewable energy systems, and their applications is included for ready reference. An experimental working scaled-down energy-efficient greenhouse was built to verify and calibrate the results of a dynamic simulation model made using TRNSYS software. TRNSYS requires the aid of Google SketchUp to develop the 3D building geometry. The simulation model was built following the passive house standard as closely as possible. The new simulation model was then utilized to design an actual greenhouse with aquaponics. It was demonstrated that the passive house standard can be applied to improve upon conventional greenhouse performance, and that it is adaptable to different climates. The energy-efficient greenhouse provides the required thermal environment for fish and plant growth, while eliminating the need for conventional cooling and heating systems.

  9. Effect of Multiple Delays in an Eco-Epidemiological Model with Strong Allee Effect

    NASA Astrophysics Data System (ADS)

    Ghosh, Kakali; Biswas, Santanu; Samanta, Sudip; Tiwari, Pankaj Kumar; Alshomrani, Ali Saleh; Chattopadhyay, Joydev

    In the present article, we attempt to investigate the effect of two time delays, a logistic delay and a gestation delay, on an eco-epidemiological model. In the proposed model, a strong Allee effect is considered in the growth term of the prey population. We incorporate the two time lags and inspect elementary mathematical characteristics of the proposed model such as boundedness, uniform persistence, stability and Hopf bifurcation for all possible combinations of both delays at the interior equilibrium point of the system. We observe that an increase in the gestation delay leads to chaotic solutions through the limit cycle. We also observe that the Allee effect plays a major role in controlling the chaos. We perform several numerical simulations to illustrate the proposed mathematical model and our analytical findings.

  10. Stability analysis of pest-predator interaction model with infectious disease in prey

    NASA Astrophysics Data System (ADS)

    Suryanto, Agus; Darti, Isnani; Anam, Syaiful

    2018-03-01

    We consider an eco-epidemiological model based on a modified Leslie-Gower predator-prey model. This eco-epidemiological model is proposed to describe the interaction between a pest, as the prey, and its predator. We assume that the pest can be infected by a disease or pathogen and that the predator only eats susceptible prey. The dynamical properties of the model, such as the existence and stability of biologically feasible equilibria, are studied. The model has six types of equilibria, but only three of them are conditionally stable. We find that the predator in this system cannot go extinct. However, the susceptible or the infective prey may disappear from the environment. To support our analytical results, we perform numerical simulations for different scenarios.

  11. The simulation of CZTS solar cell for performance improvement

    NASA Astrophysics Data System (ADS)

    Kumar, Atul; Thakur, Ajay D.

    2018-05-01

    A Copper-Zinc-Tin-Sulphide (CZTS) based solar cell with the structure Mo/CZTS/CdS/ZnO is simulated using SCAPS. The quantum efficiency and I-V curve of the simulated CZTS solar cell are mapped to the highest efficiency reported in the literature for CZTS solar cells. A modification of the back contact, and thus of the Schottky barrier, a spike-type band alignment at the junction between CZTS and the n-type layer, and higher electron mobility (owing to alkali doping in CZTS) are implemented in the simulation of the CZTS solar cell. An improvement in solar cell efficiency compared with the standard cell configuration of Mo/CZTS/CdS/ZnO is found. CZTS is plagued by low Voc and low FF, which can be increased by optimization as suggested in this paper.
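    The figures of merit mentioned above relate to efficiency as sketched below, with hypothetical values (not the cell simulated in the paper), which illustrates why raising the low Voc and FF directly raises CZTS efficiency:

```python
# Hedged sketch: power conversion efficiency from the standard solar-cell
# figures of merit. Values are illustrative, not from the paper.
def cell_efficiency(voc_v, jsc_ma_cm2, ff, p_in_mw_cm2=100.0):
    """Efficiency in percent from Voc (V), Jsc (mA/cm^2) and fill factor,
    under a standard 100 mW/cm^2 (AM1.5G) input assumption."""
    p_out = voc_v * jsc_ma_cm2 * ff       # mW/cm^2
    return 100.0 * p_out / p_in_mw_cm2

eta = cell_efficiency(voc_v=0.70, jsc_ma_cm2=20.0, ff=0.65)
```

    Since efficiency is the product Voc x Jsc x FF over input power, a gain in any one factor (e.g. FF via a reduced back-contact barrier) scales the efficiency proportionally.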

  12. Toward high-efficiency and detailed Monte Carlo simulation study of the granular flow spallation target

    NASA Astrophysics Data System (ADS)

    Cai, Han-Jie; Zhang, Zhi-Lei; Fu, Fen; Li, Jian-Yang; Zhang, Xun-Chao; Zhang, Ya-Ling; Yan, Xue-Song; Lin, Ping; Xv, Jian-Ya; Yang, Lei

    2018-02-01

    The dense granular flow spallation target is a new target concept chosen for the Accelerator-Driven Subcritical (ADS) project in China. For the R&D of this target concept, a dedicated Monte Carlo (MC) program named GMT was developed to perform simulation studies of the beam-target interaction. Owing to the complexity of the target geometry, the computational cost of the MC simulation of particle tracks is very high. Thus, improving computational efficiency is essential for detailed MC simulation studies of the dense granular target. Here we present the special design of the GMT program and its high-efficiency performance. In addition, the speedup potential of the GPU-accelerated spallation models is discussed.

  13. Bias Due to Correlation Between Times-at-Risk for Infection in Epidemiologic Studies Measuring Biological Interactions Between Sexually Transmitted Infections: A Case Study Using Human Papillomavirus Type Interactions

    PubMed Central

    Malagón, Talía; Lemieux-Mellouki, Philippe; Laprise, Jean-François; Brisson, Marc

    2016-01-01

    The clustering of human papillomavirus (HPV) infections in some individuals is often interpreted as the result of common risk factors rather than biological interactions between different types of HPV. The intraindividual correlation between times-at-risk for all HPV infections is not generally considered in the analysis of epidemiologic studies. We used a deterministic transmission model to simulate cross-sectional and prospective epidemiologic studies measuring associations between 2 HPV types. When we assumed no interactions, the model predicted that studies would estimate odds ratios and incidence rate ratios greater than 1 between HPV types even after complete adjustment for sexual behavior. We demonstrated that this residual association is due to correlation between the times-at-risk for different HPV types, where individuals become concurrently at risk for all of their partners’ HPV types when they enter a partnership and are not at risk when they are single. This correlation can be controlled in prospective studies by restricting analyses to susceptible individuals with an infected sexual partner. The bias in the measured associations was largest in low-sexual-activity populations, cross-sectional studies, and studies which evaluated infection with a first HPV type as the exposure. These results suggest that current epidemiologic evidence does not preclude the existence of competitive biological interactions between HPV types. PMID:27927619
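    The bias mechanism described above can be reproduced in a toy simulation: two infections that are independent given the shared time-at-risk still show a crude odds ratio above 1. All parameter values below are illustrative, not taken from the paper:

```python
# Toy sketch of confounding by shared time-at-risk: infection with types
# A and B are independent *given* exposure time, yet the crude odds
# ratio between them exceeds 1 because exposure time varies by person.
import numpy as np

rng = np.random.default_rng(7)
n = 200_000
time_at_risk = rng.exponential(scale=1.0, size=n)   # varies by person

# independent infections given the same exposure time
p = 1.0 - np.exp(-0.5 * time_at_risk)
inf_a = rng.random(n) < p
inf_b = rng.random(n) < p

# crude 2x2 odds ratio (cast to Python ints to avoid integer overflow)
a = int(np.sum(inf_a & inf_b))
b = int(np.sum(inf_a & ~inf_b))
c = int(np.sum(~inf_a & inf_b))
d = int(np.sum(~inf_a & ~inf_b))
odds_ratio = (a * d) / (b * c)
```

    Even with no biological interaction and no other risk factors, the odds ratio here comes out well above 1, mirroring the paper's point that adjusting for sexual behavior alone does not remove the association induced by correlated times-at-risk.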

  14. Effect of repeated simulated clinical use and sterilization on the cutting efficiency and flexibility of Hyflex CM nickel-titanium rotary files.

    PubMed

    Seago, Scott T; Bergeron, Brian E; Kirkpatrick, Timothy C; Roberts, Mark D; Roberts, Howard W; Himel, Van T; Sabey, Kent A

    2015-05-01

    Recent nickel-titanium manufacturing processes have resulted in an alloy that remains in a twinned martensitic phase at operating temperature. This alloy has been shown to have increased flexibility with added tolerance to cyclic and torsional fatigue. The aim of this study was to assess the effect of repeated simulated clinical use and sterilization on cutting efficiency and flexibility of Hyflex CM rotary files. Cutting efficiency was determined by measuring the load required to maintain a constant feed rate while instrumenting simulated canals. Flexibility was determined by using a 3-point bending test. Files were autoclaved after each use according to the manufacturer's recommendations. Files were tested through 10 simulated clinical uses. For cutting efficiency, mean data were analyzed by using multiple factor analysis of variance and the Dunnett post hoc test (P < .05). For flexibility, mean data were analyzed by using Levene's Test of Equality of Error and a general linear model (P < .05). No statistically significant decrease in cutting efficiency was noted in groups 2, 5, 6, and 7. A statistically significant decrease in cutting efficiency was noted in groups 3, 4, 8, 9, and 10. No statistically significant decrease in flexibility was noted in groups 2, 3, and 7. A statistically significant decrease in flexibility was noted in groups 4, 5, 6, 8, 9, 10, and 11. Repeated simulated clinical use and sterilization showed no effect on cutting efficiency through 1 use and no effect on flexibility through 2 uses. Published by Elsevier Inc.

  15. What can we learn about lyssavirus genomes using 454 sequencing?

    PubMed

    Höper, Dirk; Finke, Stefan; Freuling, Conrad M; Hoffmann, Bernd; Beer, Martin

    2012-01-01

    The main task of individual project number four, "Whole genome sequencing, virus-host adaptation, and molecular epidemiological analyses of lyssaviruses", within the network "Lyssaviruses -- a potential re-emerging public health threat", is to provide high-quality complete genome sequences of lyssaviruses. These sequences are analysed in depth with regard to the diversity of the viral populations, concerning both quasi-species and so-called defective interfering RNAs. Moreover, the sequence data will facilitate further epidemiological analyses, provide insight into the evolution of lyssaviruses and form the basis for the design of novel nucleic acid based diagnostics. The first results presented here indicate not only that high-quality full-length lyssavirus genome sequences can be generated, but also that efficient analysis of the viral population becomes feasible.

  16. Reproductive efficiency and shade avoidance plasticity under simulated competition.

    PubMed

    Fazlioglu, Fatih; Al-Namazi, Ali; Bonser, Stephen P

    2016-07-01

    Plant strategy and life-history theories make different predictions about reproductive efficiency under competition. While strategy theory suggests that under intense competition iteroparous perennial plants delay reproduction and semelparous annuals reproduce quickly, life-history theory predicts that both annual and perennial plants increase resource allocation to reproduction under intense competition. We tested (1) how simulated competition influences the reproductive efficiency and competitive ability (CA) of different plant life histories and growth forms; (2) whether life history or growth form is associated with CA; and (3) whether shade avoidance plasticity is connected to reproductive efficiency under simulated competition. We examined the plastic responses of 11 herbaceous species representing different life histories and growth forms to simulated competition (spectral shade). We found that both annual and perennial plants invested more in reproduction under simulated competition, in accordance with life-history theory predictions. There was no significant difference between the competitive abilities of different life histories, but across growth forms, erect species expressed greater CA (in terms of leaf number) than other growth forms. We also found that shade avoidance plasticity can increase reproductive efficiency by capitalizing on early-life resource acquisition and the conversion of these resources into reproduction. We therefore suggest that the interpretation of shade avoidance plasticity needs reassessment to recognize its role in reproduction, not only in competition.

  17. Integrating the landscape epidemiology and genetics of RNA viruses: rabies in domestic dogs as a model.

    PubMed

    Brunker, K; Hampson, K; Horton, D L; Biek, R

    2012-12-01

    Landscape epidemiology and landscape genetics combine advances in molecular techniques, spatial analyses and epidemiological models to generate a more real-world understanding of infectious disease dynamics and provide powerful new tools for the study of RNA viruses. Using dog rabies as a model, we have identified how key questions regarding viral spread and persistence can be addressed using a combination of these techniques. In contrast to wildlife rabies, investigations into the landscape epidemiology of domestic dog rabies require more detailed assessment of the role of humans in disease spread, including the incorporation of anthropogenic landscape features, human movements and socio-cultural factors into spatial models. In particular, identifying and quantifying the influence of anthropogenic features on pathogen spread and measuring the permeability of dispersal barriers are important considerations for planning control strategies, and may differ according to cultural, social and geographical variation across countries or continents. Challenges for dog rabies research include the development of metapopulation models and transmission networks using genetic information to uncover potential source/sink dynamics and identify the main routes of viral dissemination. Information generated from a landscape genetics approach will facilitate spatially strategic control programmes that accommodate heterogeneities in the landscape and therefore utilise resources in the most cost-effective way. This can include the efficient placement of vaccine barriers, surveillance points and adaptive management for large-scale control programmes.

  18. [Relations between official and private veterinary services in epidemiology and the control of contagious diseases].

    PubMed

    Moura, J A; Bedoya, M; Agudelo, M P

    2004-04-01

    Growing budget restrictions in many countries have meant that official Veterinary Services cannot assume responsibility for any new activities. The natural reaction is to turn to private veterinary services to provide the support needed to strengthen the control and surveillance of priority diseases and thereby support the development of the livestock sector and the establishment of safe international trade. In this context, official Veterinary Services must work together with private veterinarians, delegating various technical animal health activities, so that they may focus their efforts on those tasks that cannot be delegated: standardisation, control, auditing, general system co-ordination, epidemiological surveillance, etc., as well as organising veterinary policy in order to make best use of budget resources. For these relations to be efficient, a dynamic, two-way epidemiological information mechanism must be created, whereby private veterinarians periodically keep governments informed, on the basis of an agreed methodology. Moreover, the official Veterinary Services must systematically transmit information on List A and B diseases of the OIE (World Organisation for Animal Health), and perform detailed analyses of epidemiologically significant events. The article proposes the establishment of relations between public and private veterinary services as a way to provide the livestock sector with the health and hygiene conditions that are necessary for effective disease control, which in turn provides greater security for international trade and increased consumer protection.

  19. EpiK: A Knowledge Base for Epidemiological Modeling and Analytics of Infectious Diseases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, S. M. Shamimul; Fox, Edward A.; Bisset, Keith

    Computational epidemiology seeks to develop computational methods to study the distribution and determinants of health-related states or events (including disease), and the application of this study to the control of diseases and other health problems. Recent advances in computing and data sciences have led to the development of innovative modeling environments to support this important goal. The datasets used to drive the dynamic models, as well as the data produced by these models, present unique challenges owing to their size, heterogeneity and diversity. These datasets form the basis of effective and easy-to-use decision support and analytical environments. As a result, it is important to develop scalable data management systems to store, manage and integrate these datasets. In this paper, we develop EpiK—a knowledge base that facilitates the development of decision support and analytical environments to support epidemic science. An important goal is to develop a framework that links the input as well as output datasets to facilitate effective spatio-temporal and social reasoning that is critical in planning and intervention analysis before and during an epidemic. The data management framework links modeling workflow data and its metadata using a controlled vocabulary. The metadata captures information about storage, the mapping between the linked model and the physical layout, and relationships to support services. EpiK is designed to support agent-based modeling and analytics frameworks—aggregate models can be seen as special cases and are thus supported. We use semantic web technologies to create a representation of the datasets that encapsulates both the location and the schema heterogeneity. The choice of RDF as a representation language is motivated by the diversity and growth of the datasets that need to be integrated. A query bank is developed—the queries capture a broad range of questions that can be posed and answered during a typical case study pertaining to disease outbreaks. The queries are constructed using the SPARQL Protocol and RDF Query Language (SPARQL) over EpiK. EpiK can hide schema and location heterogeneity while efficiently supporting queries that span the computational epidemiology modeling pipeline: from model construction to simulation output. As a result, we show that the performance of benchmark queries varies significantly with respect to the choice of hardware underlying the database and the resource description framework (RDF) engine.

  20. EpiK: A Knowledge Base for Epidemiological Modeling and Analytics of Infectious Diseases

    DOE PAGES

    Hasan, S. M. Shamimul; Fox, Edward A.; Bisset, Keith; ...

    2017-11-06

    Computational epidemiology seeks to develop computational methods to study the distribution and determinants of health-related states or events (including disease), and the application of this study to the control of diseases and other health problems. Recent advances in computing and data sciences have led to the development of innovative modeling environments to support this important goal. The datasets used to drive the dynamic models, as well as the data produced by these models, present unique challenges owing to their size, heterogeneity and diversity. These datasets form the basis of effective and easy-to-use decision support and analytical environments. As a result, it is important to develop scalable data management systems to store, manage and integrate these datasets. In this paper, we develop EpiK—a knowledge base that facilitates the development of decision support and analytical environments to support epidemic science. An important goal is to develop a framework that links the input as well as output datasets to facilitate effective spatio-temporal and social reasoning that is critical in planning and intervention analysis before and during an epidemic. The data management framework links modeling workflow data and its metadata using a controlled vocabulary. The metadata captures information about storage, the mapping between the linked model and the physical layout, and relationships to support services. EpiK is designed to support agent-based modeling and analytics frameworks—aggregate models can be seen as special cases and are thus supported. We use semantic web technologies to create a representation of the datasets that encapsulates both the location and the schema heterogeneity. The choice of RDF as a representation language is motivated by the diversity and growth of the datasets that need to be integrated. A query bank is developed—the queries capture a broad range of questions that can be posed and answered during a typical case study pertaining to disease outbreaks. The queries are constructed using the SPARQL Protocol and RDF Query Language (SPARQL) over EpiK. EpiK can hide schema and location heterogeneity while efficiently supporting queries that span the computational epidemiology modeling pipeline: from model construction to simulation output. As a result, we show that the performance of benchmark queries varies significantly with respect to the choice of hardware underlying the database and the resource description framework (RDF) engine.

  1. A simulation study to quantify the impacts of exposure measurement error on air pollution health risk estimates in copollutant time-series models.

    PubMed

    Dionisio, Kathie L; Chang, Howard H; Baxter, Lisa K

    2016-11-25

    Exposure measurement error in copollutant epidemiologic models has the potential to introduce bias in relative risk (RR) estimates. A simulation study was conducted using empirical data to quantify the impact of correlated measurement errors in time-series analyses of air pollution and health. ZIP-code level estimates of exposure for six pollutants (CO, NOx, EC, PM2.5, SO4, O3) from 1999 to 2002 in the Atlanta metropolitan area were used to calculate spatial, population (i.e. ambient versus personal), and total exposure measurement error. Empirically determined covariance of pollutant concentration pairs and the associated measurement errors were used to simulate true exposure (exposure without error) from observed exposure. Daily emergency department visits for respiratory diseases were simulated using a Poisson time-series model with a main pollutant RR = 1.05 per interquartile range, and a null association for the copollutant (RR = 1). Monte Carlo experiments were used to evaluate the impacts of correlated exposure errors of different copollutant pairs. Substantial attenuation of RRs due to exposure error was evident in nearly all copollutant pairs studied, ranging from 10 to 40% attenuation for spatial error, 3-85% for population error, and 31-85% for total error. When CO, NOx or EC is the main pollutant, we demonstrated the possibility of false positives, specifically identifying significant, positive associations for copollutants based on the estimated type I error rate. The impact of exposure error must be considered when interpreting results of copollutant epidemiologic models, due to the possibility of attenuation of main pollutant RRs and the increased probability of false positives when measurement error is present.
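    The attenuation mechanism described above can be sketched with a minimal, hypothetical simulation: classical additive measurement error in an exposure shrinks a fitted slope toward the null by the reliability ratio var(x)/(var(x)+var(error)). This is a linearized stand-in for the study's Poisson time-series setting, not its actual code; all parameter values below are assumptions for illustration.

```python
import random
import statistics

random.seed(42)

n = 5000
beta_true = 0.05   # "true" log-RR per unit of exposure (hypothetical)
error_sd = 1.0     # classical additive measurement error (hypothetical)

# True exposure x, error-contaminated observed exposure z, linearized outcome y
x = [random.gauss(0.0, 1.0) for _ in range(n)]
z = [xi + random.gauss(0.0, error_sd) for xi in x]
y = [beta_true * xi + random.gauss(0.0, 0.1) for xi in x]

def slope(u, v):
    """Ordinary least-squares slope of v regressed on u."""
    mu, mv = statistics.fmean(u), statistics.fmean(v)
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    var = sum((a - mu) ** 2 for a in u)
    return cov / var

beta_naive = slope(z, y)                     # attenuated estimate from noisy z
reliability = 1.0 / (1.0 + error_sd ** 2)    # expected attenuation, var(x) = 1
print(beta_naive, beta_true * reliability)
```

    With these assumed values the naive slope recovers roughly half of the true coefficient, mirroring the attenuation ranges reported in the abstract.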

  2. No evidence for an epidemiological transition in sleep patterns among children: a 12-country study.

    PubMed

    Manyanga, Taru; Barnes, Joel D; Tremblay, Mark S; Katzmarzyk, Peter T; Broyles, Stephanie T; Barreira, Tiago V; Fogelholm, Mikael; Hu, Gang; Maher, Carol; Maia, Jose; Olds, Timothy; Sarmiento, Olga L; Standage, Martyn; Tudor-Locke, Catrine; Chaput, Jean-Philippe

    2018-02-01

    To examine the relationships between socioeconomic status (SES; household income and parental education) and objectively measured sleep patterns (sleep duration, sleep efficiency, and bedtime) among children from around the world and explore how the relationships differ across country levels of human development. Multinational, cross-sectional study from sites in Australia, Brazil, Canada, China, Colombia, Finland, India, Kenya, Portugal, South Africa, the United Kingdom, and the United States. The International Study of Childhood Obesity, Lifestyle and the Environment. A total of 6040 children aged 9-11 years. Sleep duration, sleep efficiency, and bedtime were monitored over 7 consecutive days using waist-worn accelerometers. Multilevel models were used to examine the relationships between sleep patterns and SES. In country-specific analyses, there were no significant linear trends for sleep duration and sleep efficiency based on income and education levels. There were significant linear trends in 4 countries for bedtime (Australia, United States, United Kingdom, and India), generally showing that children in the lowest income group had later bedtimes. Later bedtimes were associated with lowest level of parental education in only 2 countries (United Kingdom and India). Patterns of associations between sleep characteristics and SES were not different between boys and girls. Sleep patterns of children (especially sleep duration and efficiency) appear unrelated to SES in each of the 12 countries, with no differences across country levels of human development. The lack of evidence for an epidemiological transition in sleep patterns suggests that efforts to improve sleep hygiene of children should not be limited to any specific SES level. Copyright © 2017 National Sleep Foundation. Published by Elsevier Inc. All rights reserved.

  3. A Novel Interfacing Technique for Distributed Hybrid Simulations Combining EMT and Transient Stability Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shu, Dewu; Xie, Xiaorong; Jiang, Qirong

    With the steady increase of power electronic devices and nonlinear dynamic loads in large-scale AC/DC systems, the traditional hybrid simulation method, which incorporates these components into a single EMT subsystem, causes great difficulty for network partitioning and significantly deteriorates simulation efficiency. To resolve these issues, a novel distributed hybrid simulation method is proposed in this paper. The key to realizing this method is a distinct interfacing technique, which includes: i) a new approach based on the two-level Schur complement to update the interfaces by taking full consideration of the couplings between different EMT subsystems; and ii) a combined interaction protocol to further improve the efficiency while guaranteeing the simulation accuracy. The advantages of the proposed method in terms of both efficiency and accuracy have been verified by using it for the simulation study of an AC/DC hybrid system including a two-terminal VSC-HVDC and nonlinear dynamic loads.

  4. Three dimensional particle-in-cell simulations of electron beams created via reflection of intense laser light from a water target

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ngirmang, Gregory K., E-mail: ngirmang.1@osu.edu; Orban, Chris; Feister, Scott

    We present 3D Particle-in-Cell (PIC) modeling of an ultra-intense laser experiment by the Extreme Light group at the Air Force Research Laboratory using the Large Scale Plasma (LSP) PIC code. This is the first time PIC simulations have been performed in 3D for this experiment which involves an ultra-intense, short-pulse (30 fs) laser interacting with a water jet target at normal incidence. The laser-energy-to-ejected-electron-energy conversion efficiency observed in 2D(3v) simulations were comparable to the conversion efficiencies seen in the 3D simulations, but the angular distribution of ejected electrons in the 2D(3v) simulations displayed interesting differences with the 3D simulations' angular distribution; the observed differences between the 2D(3v) and 3D simulations were more noticeable for the simulations with higher intensity laser pulses. An analytic plane-wave model is discussed which provides some explanation for the angular distribution and energies of ejected electrons in the 2D(3v) simulations. We also performed a 3D simulation with circularly polarized light and found a significantly higher conversion efficiency and peak electron energy, which is promising for future experiments.

  5. Efficient field-theoretic simulation of polymer solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Villet, Michael C.; Fredrickson, Glenn H., E-mail: ghf@mrl.ucsb.edu; Department of Materials, University of California, Santa Barbara, California 93106

    2014-12-14

    We present several developments that facilitate the efficient field-theoretic simulation of polymers by complex Langevin sampling. A regularization scheme using finite Gaussian excluded volume interactions is used to derive a polymer solution model that appears free of ultraviolet divergences and hence is well-suited for lattice-discretized field theoretic simulation. We show that such models can exhibit ultraviolet sensitivity, a numerical pathology that dramatically increases sampling error in the continuum lattice limit, and further show that this pathology can be eliminated by appropriate model reformulation by variable transformation. We present an exponential time differencing algorithm for integrating complex Langevin equations for field-theoretic simulation, and show that the algorithm exhibits excellent accuracy and stability properties for our regularized polymer model. These developments collectively enable substantially more efficient field-theoretic simulation of polymers, and illustrate the importance of simultaneously addressing analytical and numerical pathologies when implementing such computations.
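    The exponential time differencing (ETD) idea mentioned above can be illustrated on a toy scalar problem: the stiff linear part is propagated exactly through its exponential, so the scheme remains stable at step sizes where explicit Euler diverges. This is a generic first-order ETD sketch, not the authors' complex Langevin integrator; c, h, and F are hypothetical.

```python
import math

c = -50.0     # stiff linear part (assumed)
h = 0.1       # timestep much larger than 1/|c|
u0 = 1.0
steps = 10

def F(u):
    """Nonlinear part; set to zero for this linear stability test."""
    return 0.0

# ETD1: u_{n+1} = e^{c h} u_n + F(u_n) * (e^{c h} - 1) / c
u_etd = u0
u_euler = u0
for _ in range(steps):
    u_etd = math.exp(c * h) * u_etd + F(u_etd) * (math.exp(c * h) - 1.0) / c
    u_euler = u_euler + h * (c * u_euler + F(u_euler))  # explicit Euler

exact = u0 * math.exp(c * h * steps)
print(u_etd, u_euler, exact)
```

    For this linear test ETD1 reproduces the exact decay, while explicit Euler amplifies the solution by |1 + c h| = 4 per step and blows up.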

  6. Optimization of output power and transmission efficiency of magnetically coupled resonance wireless power transfer system

    NASA Astrophysics Data System (ADS)

    Yan, Rongge; Guo, Xiaoting; Cao, Shaoqing; Zhang, Changgeng

    2018-05-01

    Magnetically coupled resonance (MCR) wireless power transfer (WPT) is a promising technology for electric energy transmission, but if its system parameters are designed poorly, output power and transmission efficiency will be low. Optimized parameter design of MCR WPT therefore has important research value. In an MCR WPT system with a designated coil structure, the main parameters affecting output power and transmission efficiency are the distance between the coils, the resonance frequency and the resistance of the load. Based on the established mathematical model and the differential evolution algorithm, the change of output power and transmission efficiency with these parameters can be simulated. The simulation results show that the output power and transmission efficiency of both the two-coil and the four-coil MCR WPT system with designated coil structure are improved, confirming the validity of the optimization method.
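    As a hedged illustration of why the load resistance matters, the standard series-resonant two-coil circuit model (not the authors' differential-evolution code) gives a closed-form link efficiency at resonance; the coil parameters below are assumed purely for demonstration.

```python
import math

# Hypothetical two-coil series-resonant parameters (reactances cancel at resonance)
f = 200e3            # resonance frequency [Hz] (assumed)
M = 5e-6             # mutual inductance [H] (assumed)
R1, R2 = 0.5, 0.5    # coil loss resistances [ohm] (assumed)
w = 2 * math.pi * f
wm2 = (w * M) ** 2   # coupling term (omega * M)^2

def efficiency(RL):
    """Link efficiency of the two-coil model at resonance: the reflected
    impedance (wM)^2 / (R2 + RL) divides input power between coil losses
    and the load."""
    return wm2 * RL / ((R2 + RL) * (R1 * (R2 + RL) + wm2))

# Sweep the load and locate the efficiency peak numerically
loads = [0.1 * k for k in range(1, 1000)]
RL_best = max(loads, key=efficiency)

# Closed-form optimal load for this circuit model
RL_opt = R2 * math.sqrt(1 + wm2 / (R1 * R2))
print(RL_best, RL_opt, efficiency(RL_best))
```

    The numerical sweep recovers the analytic optimum, which is the kind of consistency check a parameter-optimization study can use.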

  7. Introduction of OXA-48-producing Enterobacteriaceae to Israeli hospitals by medical tourism.

    PubMed

    Adler, Amos; Shklyar, Maya; Schwaber, Mitchell J; Navon-Venezia, Shiri; Dhaher, Yacoub; Edgar, Rotem; Solter, Ester; Benenson, Shmuel; Masarwa, Samira; Carmeli, Yehuda

    2011-12-01

    The carbapenemase OXA-48 has been reported from different Mediterranean countries. It is mostly encoded on a single plasmid in various Enterobacteriaceae species. We characterized the epidemiological and molecular features of OXA-48-producing Enterobacteriaceae (OPE) in Israel. Epidemiological investigation was conducted by the National Center for Infection Control. Genotyping was performed using multilocus sequence typing. The bla(OXA-48)-carrying plasmids were investigated using S1 endonuclease and restriction fragment length polymorphism (RFLP). Conjugation efficiency of the bla(OXA-48)-carrying plasmids was studied in a filter mating experiment. Since 2007, four OPE-infected patients were identified, all non-Israeli (two Palestinian, one Jordanian and one Georgian). Three had prior hospitalization; two in Jordan and one in Georgia. The bla(OXA-48) gene was detected in three Escherichia coli strains belonging to different clonal complexes, one Klebsiella oxytoca and one Klebsiella pneumoniae sequence type 101, as previously reported from Tunisia and Spain. In all isolates, the bla(OXA-48) gene was located inside Tn1999.2 and was carried on a 60 kb plasmid with an identical RFLP pattern. The plasmid was able to conjugate from Klebsiella spp. to E. coli, and had a conjugation efficiency up to ~10000 times higher than that of pKpQIL. OPE, introduced mainly by medical tourism, are an emerging threat to patients from affected Mediterranean countries. The bla(OXA-48)-carrying plasmid demonstrated remarkable conjugation efficiency, which is probably important in the success of its dissemination.

  8. Local error estimates for adaptive simulation of the Reaction–Diffusion Master Equation via operator splitting

    PubMed Central

    Hellander, Andreas; Lawson, Michael J; Drawert, Brian; Petzold, Linda

    2015-01-01

    The efficiency of exact simulation methods for the reaction-diffusion master equation (RDME) is severely limited by the large number of diffusion events if the mesh is fine or if diffusion constants are large. Furthermore, inherent properties of exact kinetic-Monte Carlo simulation methods limit the efficiency of parallel implementations. Several approximate and hybrid methods have appeared that enable more efficient simulation of the RDME. A common feature to most of them is that they rely on splitting the system into its reaction and diffusion parts and updating them sequentially over a discrete timestep. This use of operator splitting enables more efficient simulation but it comes at the price of a temporal discretization error that depends on the size of the timestep. So far, existing methods have not attempted to estimate or control this error in a systematic manner. This makes the solvers hard to use for practitioners since they must guess an appropriate timestep. It also makes the solvers potentially less efficient than if the timesteps are adapted to control the error. Here, we derive estimates of the local error and propose a strategy to adaptively select the timestep when the RDME is simulated via a first order operator splitting. While the strategy is general and applicable to a wide range of approximate and hybrid methods, we exemplify it here by extending a previously published approximate method, the Diffusive Finite-State Projection (DFSP) method, to incorporate temporal adaptivity. PMID:26865735

  9. Local error estimates for adaptive simulation of the Reaction-Diffusion Master Equation via operator splitting.

    PubMed

    Hellander, Andreas; Lawson, Michael J; Drawert, Brian; Petzold, Linda

    2014-06-01

    The efficiency of exact simulation methods for the reaction-diffusion master equation (RDME) is severely limited by the large number of diffusion events if the mesh is fine or if diffusion constants are large. Furthermore, inherent properties of exact kinetic-Monte Carlo simulation methods limit the efficiency of parallel implementations. Several approximate and hybrid methods have appeared that enable more efficient simulation of the RDME. A common feature to most of them is that they rely on splitting the system into its reaction and diffusion parts and updating them sequentially over a discrete timestep. This use of operator splitting enables more efficient simulation but it comes at the price of a temporal discretization error that depends on the size of the timestep. So far, existing methods have not attempted to estimate or control this error in a systematic manner. This makes the solvers hard to use for practitioners since they must guess an appropriate timestep. It also makes the solvers potentially less efficient than if the timesteps are adapted to control the error. Here, we derive estimates of the local error and propose a strategy to adaptively select the timestep when the RDME is simulated via a first order operator splitting. While the strategy is general and applicable to a wide range of approximate and hybrid methods, we exemplify it here by extending a previously published approximate method, the Diffusive Finite-State Projection (DFSP) method, to incorporate temporal adaptivity.
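    A minimal sketch of the adaptive splitting idea, using a scalar stand-in for the RDME: split the right-hand side into a linear "diffusion-like" part and a nonlinear "reaction" part, advance each exactly within a first-order (Lie) splitting step, estimate the local splitting error by step-doubling, and shrink or grow the timestep to keep that estimate below a tolerance. This is a generic illustration, not the DFSP implementation; all rates and tolerances are assumed.

```python
import math

# Split RHS: du/dt = D(u) + R(u), with
#   D(u) = -d*u          (linear decay, solved exactly)
#   R(u) = k*u*(1 - u)   (logistic reaction, solved exactly)
d, k = 1.0, 4.0

def step_decay(u, h):
    return u * math.exp(-d * h)

def step_reaction(u, h):
    e = math.exp(k * h)
    return u * e / (1.0 + u * (e - 1.0))

def lie_step(u, h):
    """One first-order (Lie) splitting step: reaction after decay."""
    return step_reaction(step_decay(u, h), h)

def solve_adaptive(u, t_end, tol=1e-5, h=0.1):
    """Adapt h by step-doubling: compare one h-step with two h/2-steps."""
    t, nsteps = 0.0, 0
    while t_end - t > 1e-12:
        h = min(h, t_end - t)
        coarse = lie_step(u, h)
        fine = lie_step(lie_step(u, h / 2), h / 2)
        err = abs(coarse - fine)           # local splitting-error estimate
        if err <= tol:                     # accept the more accurate value
            u, t, nsteps = fine, t + h, nsteps + 1
        # grow/shrink h assuming the local error scales like h^2
        h *= min(2.0, max(0.2, 0.9 * math.sqrt(tol / max(err, 1e-16))))
    return u, nsteps

u_end, nsteps = solve_adaptive(0.1, 2.0)
print(u_end, nsteps)
```

    The combined dynamics u' = u(3 - 4u) has a known logistic solution, so the adaptive result can be checked against it directly.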

  10. Reducing HIV infection in people who inject drugs is impossible without targeting recently-infected subjects

    PubMed Central

    Vasylyeva, Tetyana I.; Friedman, Samuel R.; Lourenco, Jose; Gupta, Sunetra; Hatzakis, Angelos; Pybus, Oliver G.; Katzourakis, Aris; Smyrnov, Pavlo; Karamitros, Timokratis; Paraskevis, Dimitrios; Magiorkinis, Gkikas

    2016-01-01

    Objective: Although our understanding of viral transmission among people who inject drugs (PWID) has improved, we still know little about when and how many times each injector transmits HIV throughout the duration of infection. We describe HIV dynamics in PWID to evaluate which preventive strategies can be efficient. Design: Owing to the near-absence of interventions, HIV-1 spread explosively in Russia and Ukraine in the 1990s. By studying this epidemic between 1995 and 2005, we characterized the naturally occurring transmission dynamics of HIV among PWID. Method: We combined publicly available HIV pol and env sequences with prevalence estimates from Russia and Ukraine under an evolutionary epidemiology framework to characterize HIV transmissibility between PWID. We then constructed compartmental models to simulate HIV spread among PWID. Results: In the absence of interventions, each injector transmits on average to 10 others. Half of the transmissions take place within 1 month after primary infection, suggesting that the epidemic will expand even after blocking all the post-first-month transmissions. Primary prevention can realistically target the first month of infection, and we show that it is very efficient in controlling the spread of HIV-1 in PWID. Treating the acutely infected on top of primary prevention is notably effective. Conclusion: As a large proportion of transmissions among PWID occur within 1 month after infection, reducing and delaying transmissions through scale-up of harm reduction programmes should always form the backbone of HIV control strategies in PWID. Growing PWID populations in the developing world, where primary prevention is scarce, constitute a public health time bomb. PMID:27824626
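    The role of the acute stage can be mimicked with a minimal acute/chronic compartmental sketch (not the authors' model; the parameters are assumptions chosen so each injector transmits about 10 times in total, half during a 1-month acute stage):

```python
# Susceptible / Acute / Chronic compartments, time in months (all rates assumed)
beta_a = 5.0          # transmissions per month while acute (~5 in the 1st month)
beta_c = 5.0 / 120    # chronic rate (~5 more spread over ~10 years)
gamma = 1.0           # 1 / acute duration (1 month)
mu = 1.0 / 120        # exit from the chronic stage

S, A, C = 0.99, 0.01, 0.0
from_acute = from_chronic = 0.0
dt = 0.01
for _ in range(int(24 / dt)):          # forward-Euler over 24 months
    N = S + A + C
    inf_a = beta_a * S * A / N * dt    # new infections caused by acute cases
    inf_c = beta_c * S * C / N * dt    # new infections caused by chronic cases
    S -= inf_a + inf_c
    A += inf_a + inf_c - gamma * A * dt
    C += gamma * A * dt - mu * C * dt
    from_acute += inf_a
    from_chronic += inf_c

share_acute = from_acute / (from_acute + from_chronic)
print(share_acute)
```

    Because susceptibles are depleted while the chronic pool is still building up, the acute stage ends up responsible for most transmissions in the simulated outbreak, consistent with the abstract's argument for targeting recently infected subjects.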

  11. Better cancer biomarker discovery through better study design.

    PubMed

    Rundle, Andrew; Ahsan, Habibul; Vineis, Paolo

    2012-12-01

    High-throughput laboratory technologies coupled with sophisticated bioinformatics algorithms have tremendous potential for discovering novel biomarkers, or profiles of biomarkers, that could serve as predictors of disease risk, response to treatment or prognosis. We discuss methodological issues in wedding high-throughput approaches for biomarker discovery with the case-control study designs typically used in biomarker discovery studies, especially focusing on nested case-control designs. We review principles for nested case-control study design in relation to biomarker discovery studies and describe how the efficiency of biomarker discovery can be affected by study design choices. We develop a simulated prostate cancer cohort data set and a series of biomarker discovery case-control studies nested within the cohort to illustrate how study design choices can influence the biomarker discovery process. Common elements of nested case-control design, incidence density sampling and matching of controls to cases, are not typically factored correctly into biomarker discovery analyses, inducing bias in the discovery process. We illustrate how incidence density sampling and matching of controls to cases reduce the apparent specificity of truly valid biomarkers 'discovered' in a nested case-control study. We also propose and demonstrate a new case-control matching protocol, which we call 'antimatching', that improves the efficiency of biomarker discovery studies. For valid, but as yet undiscovered, biomarkers, disjunctions between correctly designed epidemiologic studies and the practice of biomarker discovery reduce the likelihood that true biomarkers will be discovered and increase the false-positive discovery rate. © 2012 The Authors. European Journal of Clinical Investigation © 2012 Stichting European Society for Clinical Investigation Journal Foundation.
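    Incidence density sampling, one of the design elements discussed above, can be sketched on a purely simulated cohort: for each case, a control is drawn from the risk set at the case's event time, so a sampled control may itself become a case later. All rates below are illustrative assumptions, not values from the paper.

```python
import random

random.seed(7)

# Simulated cohort: (subject_id, exit_time, is_case); administrative censoring at t = 10
cohort = []
for i in range(2000):
    t_event = random.expovariate(0.02)      # latent disease time (assumed rate)
    if t_event < 10.0:
        cohort.append((i, t_event, True))   # observed case
    else:
        cohort.append((i, 10.0, False))     # censored at end of follow-up

cases = [(i, t) for i, t, is_case in cohort if is_case]

def risk_set(t):
    """Subjects still event-free and under observation just before time t."""
    return [i for i, t_exit, _ in cohort if t_exit >= t]

# Incidence-density sampling: one control per case, drawn from the case's risk set
matched = []
for case_id, t in cases:
    eligible = [i for i in risk_set(t) if i != case_id]
    matched.append((case_id, random.choice(eligible)))

print(len(cases), len(matched))
```

    Analyses that ignore this time-matched structure (e.g. treating the sampled controls as a simple random sample of non-cases) are exactly where the bias discussed above creeps in.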

  12. A systematic review to identify areas of enhancements of pandemic simulation models for operational use at provincial and local levels

    PubMed Central

    2012-01-01

    Background In recent years, computer simulation models have supported development of pandemic influenza preparedness policies. However, U.S. policymakers have raised several concerns about the practical use of these models. In this review paper, we examine the extent to which the current literature already addresses these concerns and identify means of enhancing the current models for higher operational use. Methods We surveyed PubMed and other sources for published research literature on simulation models for influenza pandemic preparedness. We identified 23 models published between 1990 and 2010 that consider single-region (e.g., country, province, city) outbreaks and multi-pronged mitigation strategies. We developed a plan for examination of the literature based on the concerns raised by the policymakers. Results While examining the concerns about the adequacy and validity of data, we found that though the epidemiological data supporting the models appears to be adequate, it should be validated through as many updates as possible during an outbreak. Interfaces for access, retrieval, and translation of demographic data into model parameters must also be improved. Regarding the concern about credibility and validity of modeling assumptions, we found that the models often simplify reality to reduce computational burden. Such simplifications may be permissible if they do not interfere with the performance assessment of the mitigation strategies. We also agreed with the concern that social behavior is inadequately represented in pandemic influenza models. Our review showed that the models consider only a few social-behavioral aspects, including contact rates, withdrawal from work or school due to symptom appearance or to care for sick relatives, and compliance with social distancing, vaccination, and antiviral prophylaxis.
The concern about the degree of accessibility of the models is palpable: we found three models that are currently accessible to the public, while others are seeking public accessibility. Policymakers would prefer models that are scalable to any population size and can be downloaded and run on personal computers, but scaling models to larger populations often requires computational resources beyond personal computers and laptops. As a limitation, we note that some existing models could not be included in our review due to their limited available documentation discussing the choice of relevant parameter values. Conclusions To adequately address the concerns of the policymakers, we need continuing model enhancements in critical areas including: updating of epidemiological data during a pandemic, smooth handling of large demographic databases, incorporation of a broader spectrum of social-behavioral aspects, updating information for contact patterns, adaptation of recent methodologies for collecting human mobility data, and improvement of computational efficiency and accessibility. PMID:22463370

  13. egs_brachy: a versatile and fast Monte Carlo code for brachytherapy

    NASA Astrophysics Data System (ADS)

    Chamberland, Marc J. P.; Taylor, Randle E. P.; Rogers, D. W. O.; Thomson, Rowan M.

    2016-12-01

    egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)³ voxels) and eye plaque (with (1 mm)³ voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.

  14. egs_brachy: a versatile and fast Monte Carlo code for brachytherapy.

    PubMed

    Chamberland, Marc J P; Taylor, Randle E P; Rogers, D W O; Thomson, Rowan M

    2016-12-07

    egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)³ voxels) and eye plaque (with (1 mm)³ voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.

  15. Multiresolution molecular mechanics: Implementation and efficiency

    NASA Astrophysics Data System (ADS)

    Biyikli, Emre; To, Albert C.

    2017-01-01

    Atomistic/continuum coupling methods combine accurate atomistic methods and efficient continuum methods to simulate the behavior of highly ordered crystalline systems. Coupled methods utilize the advantages of both approaches to simulate systems at a lower computational cost, while retaining the accuracy associated with atomistic methods. Many concurrent atomistic/continuum coupling methods have been proposed in the past; however, their true computational efficiency has not been demonstrated. The present work presents an efficient implementation of a concurrent coupling method called the Multiresolution Molecular Mechanics (MMM) for serial, parallel, and adaptive analysis. First, we present the features of the software implemented along with the associated technologies. The scalability of the software implementation is demonstrated, and the competing effects of multiscale modeling and parallelization are discussed. Then, the algorithms contributing to the efficiency of the software are presented. These include algorithms for eliminating latent ghost atoms from calculations and measurement-based dynamic balancing of parallel workload. The efficiency improvements made by these algorithms are demonstrated by benchmark tests. The efficiency of the software is found to be on par with LAMMPS, a state-of-the-art Molecular Dynamics (MD) simulation code, when performing full atomistic simulations. Speed-up of the MMM method is shown to be directly proportional to the reduction of the number of the atoms visited in force computation. Finally, an adaptive MMM analysis on a nanoindentation problem, containing over a million atoms, is performed, yielding an improvement of 6.3-8.5 times in efficiency, over the full atomistic MD method. For the first time, the efficiency of a concurrent atomistic/continuum coupling method is comprehensively investigated and demonstrated.

  16. Modeling framework for representing long-term effectiveness of best management practices in addressing hydrology and water quality problems: Framework development and demonstration using a Bayesian method

    NASA Astrophysics Data System (ADS)

    Liu, Yaoze; Engel, Bernard A.; Flanagan, Dennis C.; Gitau, Margaret W.; McMillan, Sara K.; Chaubey, Indrajeet; Singh, Shweta

    2018-05-01

    Best management practices (BMPs) are popular approaches used to improve hydrology and water quality. Uncertainties in BMP effectiveness over time may result in overestimating long-term efficiency in watershed planning strategies. To represent varying long-term BMP effectiveness in hydrologic/water quality models, a high-level, forward-looking modeling framework was developed. The components of the framework consist of establishment-period efficiency, starting efficiency, efficiency for each storm event, efficiency between maintenance operations, and efficiency over the life cycle. Combined, they represent long-term efficiency for a specific type of practice and a specific environmental concern (runoff/pollutant). An approach for possible implementation of the framework was discussed. The long-term impacts of grass buffer strips (an agricultural BMP) and bioretention systems (an urban BMP) in reducing total phosphorus were simulated to demonstrate the framework. Data gaps were captured in estimating the long-term performance of the BMPs. A Bayesian method was used to match the simulated distribution of long-term BMP efficiencies to the observed distribution, under the assumption that the observed data represented long-term BMP efficiencies. The simulated distribution matched the observed distribution well, with only small total predictive uncertainties. With additional data, the same method can be used to further improve the simulation results. The modeling framework and results of this study, which can be adopted in hydrologic/water quality models to better represent long-term BMP effectiveness, can help improve decision support systems for creating long-term stormwater management strategies for watershed management projects.
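
    As a rough illustration of the Bayesian matching step described above, the sketch below fits a grid-approximated posterior over a single decay-rate parameter governing declining BMP efficiency. The model form (exponential decay of total-phosphorus removal efficiency), the observed values, and the noise level are all invented for illustration and are not taken from the study.

```python
import math

# Hypothetical observed total-phosphorus removal efficiencies in years 0-4
obs = [0.60, 0.55, 0.50, 0.47, 0.43]
e0, sigma = 0.60, 0.03           # assumed starting efficiency and noise s.d.

def loglik(k):
    """Gaussian log-likelihood of decay rate k under eff(t) = e0 * exp(-k t)."""
    return sum(-0.5 * ((y - e0 * math.exp(-k * t)) / sigma) ** 2
               for t, y in enumerate(obs))

# Uniform prior over a grid of decay rates; normalized weights give the posterior
grid = [i / 1000.0 for i in range(1, 301)]          # k in (0, 0.3]
weights = [math.exp(loglik(k)) for k in grid]
z = sum(weights)
post = [w / z for w in weights]
k_mean = sum(k * p for k, p in zip(grid, post))     # posterior mean decay rate
```

    The posterior concentrates near the decay rate that best matches the observed efficiency trajectory; the study applies the same logic to whole distributions of BMP efficiencies rather than a single parameter.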

  17. HRSSA – Efficient hybrid stochastic simulation for spatially homogeneous biochemical reaction networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marchetti, Luca, E-mail: marchetti@cosbi.eu; Priami, Corrado, E-mail: priami@cosbi.eu; University of Trento, Department of Mathematics

    This paper introduces HRSSA (Hybrid Rejection-based Stochastic Simulation Algorithm), a new efficient hybrid stochastic simulation algorithm for spatially homogeneous biochemical reaction networks. HRSSA is built on top of RSSA, an exact stochastic simulation algorithm which relies on propensity bounds to select next reaction firings and to reduce the average number of reaction propensity updates needed during the simulation. HRSSA exploits the computational advantage of propensity bounds to manage time-varying transition propensities and to apply dynamic partitioning of reactions, which constitute the two most significant bottlenecks of hybrid simulation. A comprehensive set of simulation benchmarks is provided for evaluating performance and accuracy of HRSSA against other state-of-the-art algorithms.
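
    The propensity-bound idea that HRSSA inherits from RSSA can be illustrated with a thinning-style rejection SSA for a simple birth-death process: propensities are bounded over a fluctuation interval around the current state, candidate firings are drawn from the upper bounds, and exact propensities are evaluated only to accept or reject a candidate. This is a minimal sketch of the selection mechanism, not the HRSSA hybrid algorithm itself (which additionally partitions reactions and handles time-varying propensities).

```python
import random

def rssa_birth_death(c_prod=10.0, c_deg=0.1, x0=0, t_end=100.0,
                     delta=0.1, seed=0):
    """Rejection-based SSA (thinning) for the reactions 0 -> X (rate c_prod)
    and X -> 0 (rate c_deg * x).  Propensities are bounded over a
    fluctuation interval [x_lo, x_hi]; the exact degradation propensity is
    evaluated only when a candidate firing must be accepted or rejected."""
    rng = random.Random(seed)
    x, t = x0, 0.0
    x_lo = x_hi = -1                          # force an initial bound refresh
    while t < t_end:
        if not (x_lo <= x <= x_hi):
            # refresh the fluctuation interval and propensity upper bounds
            x_lo, x_hi = int(x * (1 - delta)), int(x * (1 + delta)) + 1
            a1_hi = c_prod                    # production: state-independent
            a2_hi = c_deg * x_hi              # degradation upper bound
            a0_hi = a1_hi + a2_hi
        t += rng.expovariate(a0_hi)           # candidate firing time
        if rng.random() < a1_hi / a0_hi:
            x += 1                            # production: always accepted
        elif rng.random() < (c_deg * x) / a2_hi:
            x -= 1                            # degradation: accepted w.p. a2/a2_hi
    return x

# The stationary mean of this birth-death process is c_prod / c_deg = 100
xs = [rssa_birth_death(seed=s) for s in range(10)]
mean_x = sum(xs) / len(xs)
```

    Because candidates are drawn from bounds that stay valid while the state remains inside the fluctuation interval, the exact propensities need only be recomputed when the interval is exited, which is where the efficiency gain comes from.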

  18. Sensitivity Analysis and Optimization of Enclosure Radiation with Applications to Crystal Growth

    NASA Technical Reports Server (NTRS)

    Tiller, Michael M.

    1995-01-01

    In engineering, simulation software is often used as a convenient means for carrying out experiments to evaluate physical systems. The benefit of using simulations as 'numerical' experiments is that the experimental conditions can be easily modified and repeated at much lower cost than the comparable physical experiment. The goal of these experiments is to 'improve' the process or result of the experiment. In most cases, the computational experiments employ the same trial and error approach as their physical counterparts. When using this approach for complex systems, the cause and effect relationship of the system may never be fully understood and efficient strategies for improvement never utilized. However, it is possible when running simulations to accurately and efficiently determine the sensitivity of the system results with respect to simulation parameters (e.g., initial conditions, boundary conditions, and material properties) by manipulating the underlying computations. This results in a better understanding of the system dynamics and gives us efficient means to improve processing conditions. We begin by discussing the steps involved in performing simulations. Then we consider how sensitivity information about simulation results can be obtained and ways this information may be used to improve the process or result of the experiment. Next, we discuss optimization and the efficient algorithms which use sensitivity information. We draw on all this information to propose a generalized approach for integrating simulation and optimization, with an emphasis on software programming issues. After discussing our approach to simulation and optimization we consider an application involving crystal growth. This application is interesting because it includes radiative heat transfer. We discuss the computation of radiative view factors and the impact this mode of heat transfer has on our approach. Finally, we demonstrate the results of our optimization.

  19. Feasibility of Assessing Public Health Impacts of Air Pollution Reduction Programs on a Local Scale: New Haven Case Study

    PubMed Central

    Lobdell, Danelle T.; Isakov, Vlad; Baxter, Lisa; Touma, Jawad S.; Smuts, Mary Beth; Özkaynak, Halûk

    2011-01-01

    Background New approaches to link health surveillance data with environmental and population exposure information are needed to examine the health benefits of risk management decisions. Objective We examined the feasibility of conducting a local assessment of the public health impacts of cumulative air pollution reduction activities from federal, state, local, and voluntary actions in the City of New Haven, Connecticut (USA). Methods Using a hybrid modeling approach that combines regional and local-scale air quality data, we estimated ambient concentrations for multiple air pollutants [e.g., PM2.5 (particulate matter ≤ 2.5 μm in aerodynamic diameter), NOx (nitrogen oxides)] for baseline year 2001 and projected emissions for 2010, 2020, and 2030. We assessed the feasibility of detecting health improvements in relation to reductions in air pollution for 26 different pollutant–health outcome linkages using both sample size and exploratory epidemiological simulations to further inform decision-making needs. Results Model projections suggested decreases (~ 10–60%) in pollutant concentrations, mainly attributable to decreases in pollutants from local sources between 2001 and 2010. Models indicated considerable spatial variability in the concentrations of most pollutants. Sample size analyses supported the feasibility of identifying linkages between reductions in NOx and improvements in all-cause mortality, prevalence of asthma in children and adults, and cardiovascular and respiratory hospitalizations. Conclusion Substantial reductions in air pollution (e.g., ~ 60% for NOx) are needed to detect health impacts of environmental actions using traditional epidemiological study designs in small communities like New Haven. In contrast, exploratory epidemiological simulations suggest that it may be possible to demonstrate the health impacts of PM reductions by predicting intraurban pollution gradients within New Haven using coupled models. PMID:21335318

  20. [Epidemiological dynamics of Dengue on Easter Island].

    PubMed

    Canals, Mauricio; González, Christian; Canals, Andrea; Figueroa, Daniela

    2012-08-01

    Dengue is considered an emerging disease with increasing prevalence, especially in South America. In 2002, an epidemic of classic dengue (DENV-1) occurred unexpectedly on Easter Island, where the disease had never been detected before; it reappeared in 2006-2007, 2008, 2009 and 2011. The aim of this study was to estimate the most relevant parameters of the epidemiological dynamics of dengue transmission on Easter Island and to model the dynamics since 2002, comparing the predictions with the actual situation observed. Of the total cases, 52.27% were female and 47.73% male. The average age at infection was 31.38 ± 18.37 years, similar in men and women. We estimated the basic reproductive number R0 = 3.005 (95% CI [1.92, 4.61]). The inter-epidemic period was estimated at T = 5.20 to 6.8 years. The case simulation showed recurrent epidemics of decreasing magnitude (damped oscillations), a known phenomenon in models of dengue and malaria. There was good qualitative fit to the epidemiological dynamics from 2002 onwards, and the model accurately predicted the rise in cases between 2006 and 2011. The predicted number of cases during the 2002 epidemic was greater than the number of confirmed cases, and the predicted epidemic was faster than the notified one. The inter-epidemic period in the simulation was 6.72 years between 2002 and 2008 and 4.68 years between 2008 and 2013. From a theoretical perspective, the first epidemic should have affected 94% of the population (approximately 3500 cases), but only 639 were reported, suggesting underreporting and many subclinical cases. Future epidemics of decreasing size are expected, although the main danger is epidemics of hemorrhagic dengue fever resulting from the introduction of different dengue virus serotypes.
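
    The abstract's figure of roughly 94% of the population affected in the first epidemic follows directly from the estimated R0 through the standard SIR final-size relation, which can be checked in a few lines. This is the textbook relation for a fully susceptible, well-mixed population, not the authors' full simulation model.

```python
import math

def final_size(r0, tol=1e-12):
    """Solve the SIR final-size relation z = 1 - exp(-R0 * z) by fixed-point
    iteration; z is the fraction of an initially susceptible population
    infected over the course of the whole epidemic."""
    z = 0.5
    while True:
        z_new = 1.0 - math.exp(-r0 * z)
        if abs(z_new - z) < tol:
            return z_new
        z = z_new

z_easter = final_size(3.005)    # R0 estimated for the 2002 epidemic
# z_easter is about 0.94, consistent with the ~94% figure in the abstract
```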

  1. THE REAL McCOIL: A method for the concurrent estimation of the complexity of infection and SNP allele frequency for malaria parasites

    PubMed Central

    Chang, Hsiao-Han; Worby, Colin J.; Yeka, Adoke; Nankabirwa, Joaniter; Kamya, Moses R.; Staedke, Sarah G.; Hubbart, Christina; Amato, Roberto; Kwiatkowski, Dominic P.

    2017-01-01

    As many malaria-endemic countries move towards elimination of Plasmodium falciparum, the most virulent human malaria parasite, effective tools for monitoring malaria epidemiology are urgent priorities. P. falciparum population genetic approaches offer promising tools for understanding transmission and spread of the disease, but a high prevalence of multi-clone or polygenomic infections can render estimation of even the most basic parameters, such as allele frequencies, challenging. A previous method, COIL, was developed to estimate complexity of infection (COI) from single nucleotide polymorphism (SNP) data, but relies on monogenomic infections to estimate allele frequencies or requires external allele frequency data which may not be available. Estimates limited to monogenomic infections may not be representative, however, and when the average COI is high, they can be difficult or impossible to obtain. Therefore, we developed THE REAL McCOIL, Turning HEterozygous SNP data into Robust Estimates of ALlele frequency, via Markov chain Monte Carlo, and Complexity Of Infection using Likelihood, to incorporate polygenomic samples and simultaneously estimate allele frequency and COI. This approach was tested via simulations and then applied to SNP data from cross-sectional surveys performed in three Ugandan sites with varying malaria transmission. We show that THE REAL McCOIL consistently outperforms COIL on simulated data, particularly when most infections are polygenomic. Using field data we show that, unlike with COIL, we can distinguish epidemiologically relevant differences in COI between and within these sites. For example, we estimated a surprisingly high average COI in a peri-urban subregion with lower transmission intensity, suggesting that many of these cases were imported from surrounding regions with higher transmission intensity. THE REAL McCOIL therefore provides a robust tool for understanding the molecular epidemiology of malaria across transmission settings.
PMID:28125584
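
    The estimation problem THE REAL McCOIL addresses can be illustrated with a stripped-down version: if the population allele frequency p were known, the probability that a biallelic SNP appears heterozygous in an infection of COI c is 1 − p^c − (1 − p)^c, and COI can be estimated by maximum likelihood over the heterozygous-call counts. The sketch below uses this simplification with invented counts; the actual method jointly estimates allele frequencies and COI by MCMC across many samples.

```python
import math

def loglik_coi(c, n_het, n_snps, p=0.5):
    """Binomial log-likelihood of COI = c given n_het heterozygous calls out
    of n_snps biallelic SNPs with population allele frequency p.  A site is
    heterozygous iff the c strains do not all carry the same allele."""
    p_het = 1.0 - p ** c - (1.0 - p) ** c
    if p_het <= 0.0:                 # COI = 1 cannot yield heterozygous calls
        return 0.0 if n_het == 0 else float("-inf")
    return (n_het * math.log(p_het)
            + (n_snps - n_het) * math.log(1.0 - p_het))

# With p = 0.5 the expected heterozygous fraction is 0.5 (COI 2),
# 0.75 (COI 3), 0.875 (COI 4).  Feed counts matching COI = 3 exactly:
n_snps, n_het = 96, 72               # 72 / 96 = 0.75
coi_mle = max(range(1, 11), key=lambda c: loglik_coi(c, n_het, n_snps))
```

    With these counts the grid search recovers COI = 3; in real polygenomic data p is itself unknown, which is exactly why the paper estimates both quantities simultaneously.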

  2. Simulations for designing and interpreting intervention trials in infectious diseases.

    PubMed

    Halloran, M Elizabeth; Auranen, Kari; Baird, Sarah; Basta, Nicole E; Bellan, Steven E; Brookmeyer, Ron; Cooper, Ben S; DeGruttola, Victor; Hughes, James P; Lessler, Justin; Lofgren, Eric T; Longini, Ira M; Onnela, Jukka-Pekka; Özler, Berk; Seage, George R; Smith, Thomas A; Vespignani, Alessandro; Vynnycky, Emilia; Lipsitch, Marc

    2017-12-29

    Interventions in infectious diseases can have both direct effects on individuals who receive the intervention and indirect effects in the population. In addition, intervention combinations can have complex interactions at the population level, which are often difficult to assess adequately with standard study designs and analytical methods. Herein, we urge the adoption of a new paradigm for the design and interpretation of intervention trials in infectious diseases, particularly with regard to emerging infectious diseases, one that more accurately reflects the dynamics of the transmission process. In an increasingly complex world, simulations can explicitly represent transmission dynamics, which are critical for proper trial design and interpretation. Certain ethical aspects of a trial can also be quantified using simulations. Further, after a trial has been conducted, simulations can be used to explore the possible explanations for the observed effects. Much is to be gained through a multidisciplinary approach that builds collaborations among experts in infectious disease dynamics, epidemiology, statistical science, economics, simulation methods, and the conduct of clinical trials.

  3. DisEpi: Compact Visualization as a Tool for Applied Epidemiological Research.

    PubMed

    Benis, Arriel; Hoshen, Moshe

    2017-01-01

    Outcomes research and evidence-based medical practice are being positively impacted by the proliferation of healthcare databases. Modern epidemiologic studies require complex data comprehension. A new tool, DisEpi, facilitates visual exploration of epidemiological data supporting Public Health Knowledge Discovery. It provides domain experts a compact visualization of information at the population level. In this study, DisEpi is applied to Attention-Deficit/Hyperactivity Disorder (ADHD) patients within Clalit Health Services, analyzing the socio-demographic and ADHD filled-prescription data between 2006 and 2016 of 1,605,800 children aged 6 to 17 years. DisEpi's goal is to facilitate the identification of (1) links between attributes and/or events, (2) changes in these relationships over time, and (3) clusters of population attributes with similar trends. DisEpi combines hierarchical clustering graphics and a heatmap where color shades reflect disease time-trends. In the ADHD context, DisEpi allowed the domain expert to visually analyze a snapshot summary of data mining results. Accordingly, the domain expert was able to efficiently identify that: (1) relatively younger children, and particularly the youngest children in a class, are treated more often; (2) medication incidence increased between 2006 and 2011 but then stabilized; and (3) progression rates of medication incidence are different for each of the 3 main discovered clusters (i.e., profiles) of treated children. DisEpi delivered results similar to those previously published which used classical statistical approaches. DisEpi requires minimal preparation and fewer iterations, generating results in a user-friendly format for the domain expert. DisEpi will be wrapped as a package containing the end-to-end discovery process. Optionally, it may provide automated annotation using calendar events (such as policy changes or media interest), which can improve discovery efficiency, interpretation, and policy implementation.

  4. Development and application of a crossbreeding simulation model for goat production systems in tropical regions.

    PubMed

    Tsukahara, Y; Oishi, K; Hirooka, H

    2011-12-01

    A deterministic simulation model was developed to estimate biological production efficiency and to evaluate goat crossbreeding systems under tropical conditions. The model involves 5 production systems: pure indigenous, first filial generations (F1), backcross (BC), composite breeds of F1 (CMP(F1)), and BC (CMP(BC)). The model first simulates growth, reproduction, lactation, and energy intakes of a doe and a kid on a 1-d time step at the individual level and thereafter the outputs are integrated into the herd dynamics program. The ability of the model to simulate individual performances was tested under a base situation. The simulation results represented daily BW changes, ME requirements, and milk yield and the estimates were within the range of published data. Two conventional goat production scenarios (an intensive milk production scenario and an integrated goat and oil palm production scenario) in Malaysia were examined. The simulation results of the intensive milk production scenario showed the greater production efficiency of the CMP(BC) and CMP(F1) systems and decreased production efficiency of the F1 and BC systems. The results of the integrated goat and oil palm production scenario showed that the production efficiency and stocking rate were greater for the indigenous goats than for the crossbreeding systems.

  5. Cardiorespiratory endurance evaluation using heart rate analysis during ski simulator exercise and the Harvard step test in elementary school students.

    PubMed

    Lee, Hyo Taek; Roh, Hyo Lyun; Kim, Yoon Sang

    2016-01-01

    [Purpose] Educational institutions should provide children in their growth phase with efficiently managed exercise programs offering varied benefits. We analyzed the heart rates of children during ski simulator exercise and the Harvard step test to evaluate cardiorespiratory endurance by calculating their post-exercise recovery rate. [Subjects and Methods] The subjects (n = 77) were categorized into a normal weight and an overweight/obese group by body mass index. They performed each exercise for 3 minutes. Cardiorespiratory endurance was calculated using the Physical Efficiency Index formula. [Results] The ski simulator and Harvard step test showed a significant difference in heart rates between the 2 body mass index-based groups at each minute. The normal weight group and the ski-simulator exercise yielded higher Physical Efficiency Index levels. [Conclusion] This study showed that a simulator exercise can produce a cumulative load even when performed at low intensity, and that it can be effectively utilized as exercise equipment, since it resulted in higher Physical Efficiency Index levels than the Harvard step test. If schools can foster sustained participation by stimulating students' interest, the ski simulator exercise can be used in programs designed to improve and strengthen students' physical fitness.
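
    The Physical Efficiency Index referred to above is conventionally computed from the test duration and recovery pulse counts. The sketch below assumes the long-form Harvard step test formula; the abstract does not specify which variant was used, so treat the formula and the example numbers as illustrative.

```python
def physical_efficiency_index(duration_s, pulse1, pulse2, pulse3):
    """Long-form Harvard step test PEI: 100 times the exercise duration in
    seconds, divided by twice the sum of three 30-s recovery pulse counts
    (conventionally taken at 1-1.5, 2-2.5 and 3-3.5 min after exercise)."""
    return 100.0 * duration_s / (2.0 * (pulse1 + pulse2 + pulse3))

# A 3-minute test (as in the study) with recovery counts of 60, 50 and 40:
pei = physical_efficiency_index(180, 60, 50, 40)    # 18000 / 300 = 60.0
```

    Faster heart-rate recovery (smaller pulse counts) yields a higher index, which is why the index serves as a proxy for cardiorespiratory endurance.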

  6. Graph theoretical analysis of EEG functional network during multi-workload flight simulation experiment in virtual reality environment.

    PubMed

    Shengqian Zhang; Yuan Zhang; Yu Sun; Thakor, Nitish; Bezerianos, Anastasios

    2017-07-01

    The research field of mental workload has attracted many researchers, as mental workload plays a crucial role in real-life performance and safety. While previous studies have examined the neural correlates of mental workload in 2D scenarios (i.e., presenting stimuli on a computer screen (CS)) using univariate methods (e.g., EEG channel power), the findings of multivariate, graph-theoretical approaches and the effects of a 3D environment (i.e., presenting stimuli in virtual reality (VR)) remain unclear. In this study, twenty subjects underwent flight simulation in both CS and VR environments, with three stages each. After preprocessing, a connectivity matrix based on the Phase Lag Index (PLI) was constructed from the electroencephalogram (EEG) signals. Graph theory analysis was then applied to compute global efficiency, local efficiency, and nodal efficiency in both the alpha and theta bands. For global efficiency and local efficiency, VR values were generally lower than CS values in both bands. For nodal efficiency, the regions showing at least marginally significant decreases differed between CS and VR. These findings suggest that 3D simulation induces a higher mental workload than 2D simulation and that each involves different brain regions.
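
    Global efficiency, the first of the graph metrics above, is the average inverse shortest-path length over all node pairs (nodal efficiency is the same average restricted to one node, and local efficiency applies the global measure to each node's neighborhood subgraph). A minimal stdlib sketch, assuming the PLI connectivity matrix has already been thresholded into an unweighted edge list, which is one common but not the only choice:

```python
from collections import deque

def global_efficiency(n, edges):
    """Average of 1 / d(i, j) over all ordered node pairs of an unweighted,
    undirected graph, where d is the shortest-path length (disconnected
    pairs contribute 0)."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    total = 0.0
    for s in range(n):                       # BFS from every node
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(1.0 / d for node, d in dist.items() if node != s)
    return total / (n * (n - 1))

# Path graph 0-1-2: pair distances 1, 1 and 2, so E = (1 + 1 + 0.5) / 3
e_path = global_efficiency(3, [(0, 1), (1, 2)])
```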

  7. Multiscale Methods for Accurate, Efficient, and Scale-Aware Models of the Earth System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldhaber, Steve; Holland, Marika

    The major goal of this project was to contribute improvements to the infrastructure of an Earth System Model in order to support research in the Multiscale Methods for Accurate, Efficient, and Scale-Aware Models of the Earth System project. In support of this, the NCAR team accomplished two main tasks: improving input/output performance of the model and improving atmospheric model simulation quality. Improvement of the performance and scalability of data input and diagnostic output within the model required a new infrastructure which can efficiently handle the unstructured grids common in multiscale simulations. This allows for a more computationally efficient model, enabling more years of Earth System simulation. The quality of the model simulations was improved by reducing grid-point noise in the spectral element version of the Community Atmosphere Model (CAM-SE). This was achieved by running the physics of the model using grid-cell data on a finite-volume grid.

  8. Efficiency in nonequilibrium molecular dynamics Monte Carlo simulations

    DOE PAGES

    Radak, Brian K.; Roux, Benoît

    2016-10-07

    Hybrid algorithms combining nonequilibrium molecular dynamics and Monte Carlo (neMD/MC) offer a powerful avenue for improving the sampling efficiency of computer simulations of complex systems. These neMD/MC algorithms are also increasingly finding use in applications where conventional approaches are impractical, such as constant-pH simulations with explicit solvent. However, selecting an optimal nonequilibrium protocol for maximum efficiency often represents a non-trivial challenge. This work evaluates the efficiency of a broad class of neMD/MC algorithms and protocols within the theoretical framework of linear response theory. The approximations are validated against constant-pH MD simulations and shown to provide accurate predictions of neMD/MC performance. An assessment of a large set of protocols confirms (both theoretically and empirically) that a linear work protocol gives the best neMD/MC performance. Lastly, a well-defined criterion for optimizing the time parameters of the protocol is proposed and demonstrated with an adaptive algorithm that improves the performance on-the-fly with minimal cost.

  9. Molecular dynamics simulations using temperature-enhanced essential dynamics replica exchange.

    PubMed

    Kubitzki, Marcus B; de Groot, Bert L

    2007-06-15

    Today's standard molecular dynamics simulations of moderately sized biomolecular systems at full atomic resolution are typically limited to the nanosecond timescale and therefore suffer from limited conformational sampling. Efficient ensemble-preserving algorithms like replica exchange (REX) may alleviate this problem somewhat but are still computationally prohibitive due to the large number of degrees of freedom involved. Aiming at increased sampling efficiency, we present a novel simulation method combining the ideas of essential dynamics and REX. Unlike standard REX, in each replica only a selection of essential collective modes of a subsystem of interest (essential subspace) is coupled to a higher temperature, with the remainder of the system staying at a reference temperature, T(0). This selective excitation along with the replica framework permits efficient approximate ensemble-preserving conformational sampling and allows much larger temperature differences between replicas, thereby considerably enhancing sampling efficiency. Ensemble properties and sampling performance of the method are discussed using dialanine and guanylin test systems, with multi-microsecond molecular dynamics simulations of these test systems serving as references.
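
    The replica-exchange framework that this method extends can be illustrated with the standard two-replica swap criterion, min(1, exp((β_cold − β_hot)(E_cold − E_hot))), on a toy double-well potential. The sketch below is plain parallel tempering of a single coordinate, not the essential-dynamics variant described in the abstract (which couples only selected collective modes of a subsystem to the higher temperature); all parameters are arbitrary.

```python
import math
import random

def u(x):
    """Double-well potential with minima at x = -1 and x = +1, barrier 1."""
    return (x * x - 1.0) ** 2

def metropolis_step(x, beta, rng):
    """One Metropolis update: accept with probability min(1, exp(-beta*dU))."""
    y = x + rng.uniform(-0.5, 0.5)
    return y if rng.random() < math.exp(-beta * max(0.0, u(y) - u(x))) else x

def parallel_tempering(steps=40_000, t_cold=1.0, t_hot=4.0, seed=7):
    rng = random.Random(seed)
    b_c, b_h = 1.0 / t_cold, 1.0 / t_hot
    x_c = x_h = 1.0                   # both replicas start in the right well
    cold = []
    for step in range(steps):
        x_c = metropolis_step(x_c, b_c, rng)
        x_h = metropolis_step(x_h, b_h, rng)
        if step % 10 == 0:
            # replica-exchange criterion: min(1, exp((b_c - b_h)(E_c - E_h)))
            if rng.random() < math.exp(min(0.0, (b_c - b_h) * (u(x_c) - u(x_h)))):
                x_c, x_h = x_h, x_c
        cold.append(x_c)
    return cold

xs = parallel_tempering()
```

    The hot replica crosses the barrier easily and accepted swaps let the cold replica sample both wells; restricting the heated degrees of freedom to an essential subspace, as in the paper, aims for the same speedup at lower cost and with larger usable temperature gaps.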

  10. Numerical flow simulation and efficiency prediction for axial turbines by advanced turbulence models

    NASA Astrophysics Data System (ADS)

    Jošt, D.; Škerlavaj, A.; Lipej, A.

    2012-11-01

    Numerical prediction of the efficiency of a 6-blade Kaplan turbine is presented. First, the results of steady-state analysis performed with different turbulence models for different operating regimes are compared to measurements. For small and optimal runner blade angles the efficiency was predicted quite accurately, but for the maximal blade angle the discrepancy between calculated and measured values was quite large. With transient analysis, especially when the Scale Adaptive Simulation Shear Stress Transport (SAS SST) model with zonal Large Eddy Simulation (ZLES) in the draft tube was used, the efficiency prediction was significantly improved. The improvement was seen at all operating points, but was largest for maximal discharge. The reason was better flow simulation in the draft tube. Details of the turbulent structure in the draft tube obtained with SST, SAS SST, and SAS SST with ZLES are illustrated in order to explain the differences in flow energy losses obtained with the different turbulence models.

  11. Demonstrating an Order-of-Magnitude Sampling Enhancement in Molecular Dynamics Simulations of Complex Protein Systems.

    PubMed

    Pan, Albert C; Weinreich, Thomas M; Piana, Stefano; Shaw, David E

    2016-03-08

    Molecular dynamics (MD) simulations can describe protein motions in atomic detail, but transitions between protein conformational states sometimes take place on time scales that are infeasible or very expensive to reach by direct simulation. Enhanced sampling methods, the aim of which is to increase the sampling efficiency of MD simulations, have thus been extensively employed. The effectiveness of such methods when applied to complex biological systems like proteins, however, has been difficult to establish because even enhanced sampling simulations of such systems do not typically reach time scales at which convergence is extensive enough to reliably quantify sampling efficiency. Here, we obtain sufficiently converged simulations of three proteins to evaluate the performance of simulated tempering, a member of a widely used class of enhanced sampling methods that use elevated temperature to accelerate sampling. Simulated tempering simulations with individual lengths of up to 100 μs were compared to (previously published) conventional MD simulations with individual lengths of up to 1 ms. With two proteins, BPTI and ubiquitin, we evaluated the efficiency of sampling of conformational states near the native state, and for the third, the villin headpiece, we examined the rate of folding and unfolding. Our comparisons demonstrate that simulated tempering can consistently achieve a substantial sampling speedup of an order of magnitude or more relative to conventional MD.

  12. SIMULATIONS OF BOOSTER INJECTION EFFICIENCY FOR THE APS-UPGRADE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calvey, J.; Borland, M.; Harkay, K.

    2017-06-25

    The APS-Upgrade will require the injector chain to provide high single bunch charge for swap-out injection. One possible limiting factor to achieving this is an observed reduction of injection efficiency into the booster synchrotron at high charge. We have simulated booster injection using the particle tracking code elegant, including a model for the booster impedance and beam loading in the RF cavities. The simulations point to two possible causes for reduced efficiency: energy oscillations leading to losses at high dispersion locations, and a vertical beam size blowup caused by ions in the Particle Accumulator Ring. We also show that the efficiency is much higher in an alternate booster lattice with smaller vertical beta function and zero dispersion in the straight sections.

  13. Inferring epidemiological parameters from phylogenies using regression-ABC: A comparative study

    PubMed Central

    Gascuel, Olivier

    2017-01-01

    Inferring epidemiological parameters such as the R0 from time-scaled phylogenies is a timely challenge. Most current approaches rely on likelihood functions, which raise specific issues that range from computing these functions to finding their maxima numerically. Here, we present a new regression-based Approximate Bayesian Computation (ABC) approach, which we base on a large variety of summary statistics intended to capture the information contained in the phylogeny and its corresponding lineage-through-time plot. The regression step involves the Least Absolute Shrinkage and Selection Operator (LASSO) method, which is a robust machine learning technique. It allows us to readily deal with the large number of summary statistics, while avoiding resorting to Markov Chain Monte Carlo (MCMC) techniques. To compare our approach to existing ones, we simulated target trees under a variety of epidemiological models and settings, and inferred parameters of interest using the same priors. We found that, for large phylogenies, the accuracy of our regression-ABC is comparable to that of likelihood-based approaches involving birth-death processes implemented in BEAST2. Our approach even outperformed these when inferring the host population size with a Susceptible-Infected-Removed epidemiological model. It also clearly outperformed a recent kernel-ABC approach when assuming a Susceptible-Infected epidemiological model with two host types. Lastly, by re-analyzing data from the early stages of the recent Ebola epidemic in Sierra Leone, we showed that regression-ABC provides more realistic estimates for the duration parameters (latency and infectiousness) than the likelihood-based method. Overall, ABC based on a large variety of summary statistics and a regression method able to perform variable selection and avoid overfitting is a promising approach to analyze large phylogenies. PMID:28263987
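
    The regression step of regression-ABC can be illustrated in miniature: draw parameters from the prior, simulate summary statistics, keep the draws nearest the observed statistic, and linearly adjust them toward the observation (a Beaumont-style adjustment). The toy simulator, prior, and tolerance below are invented; the paper uses LASSO over many phylogeny summary statistics rather than a single statistic with ordinary least squares.

```python
import random

def regression_abc(s_obs, n_sims=5000, k=200, seed=3):
    """Rejection ABC with linear regression adjustment, on a toy simulator
    whose summary statistic is s = theta + Normal(0, 0.5).  The k draws
    nearest to s_obs are kept, and each accepted theta is shifted by
    slope * (s_obs - s), with slope the OLS slope of theta on s."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_sims):
        theta = rng.uniform(0.0, 10.0)       # draw from a uniform prior
        s = theta + rng.gauss(0.0, 0.5)      # simulate the summary statistic
        draws.append((abs(s - s_obs), theta, s))
    accepted = sorted(draws)[:k]             # rejection step: nearest k
    s_mean = sum(s for _, _, s in accepted) / k
    t_mean = sum(t for _, t, _ in accepted) / k
    slope = (sum((s - s_mean) * (t - t_mean) for _, t, s in accepted)
             / sum((s - s_mean) ** 2 for _, _, s in accepted))
    adjusted = [t + slope * (s_obs - s) for _, t, s in accepted]
    return sum(adjusted) / k                 # adjusted posterior mean

theta_hat = regression_abc(s_obs=4.0)
```

    The regression adjustment corrects each accepted draw for how far its simulated statistic landed from the observation, which is what lets ABC tolerate a loose acceptance threshold; LASSO plays the same role while also selecting among many candidate summary statistics.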

  14. Dynamic vs. static social networks in models of parasite transmission: predicting Cryptosporidium spread in wild lemurs.

    PubMed

    Springer, Andrea; Kappeler, Peter M; Nunn, Charles L

    2017-05-01

    Social networks provide an established tool to implement heterogeneous contact structures in epidemiological models. Dynamic temporal changes in contact structure and ranging behaviour of wildlife may impact disease dynamics. A consensus has yet to emerge, however, concerning the conditions in which network dynamics impact model outcomes, as compared to static approximations that average contact rates over longer time periods. Furthermore, as many pathogens can be transmitted both environmentally and via close contact, it is important to investigate the relative influence of both transmission routes in real-world populations. Here, we use empirically derived networks from a population of wild primates, Verreaux's sifakas (Propithecus verreauxi), and simulated networks to investigate pathogen spread in dynamic vs. static social networks. First, we constructed a susceptible-exposed-infected-recovered model of Cryptosporidium spread in wild Verreaux's sifakas. We incorporated social and environmental transmission routes and parameterized the model for two different climatic seasons. Second, we used simulated networks and greater variation in epidemiological parameters to investigate the conditions in which dynamic networks produce larger outbreak sizes than static networks. We found that average outbreak size of Cryptosporidium infections in sifakas was larger when the disease was introduced in the dry season than in the wet season, driven by an increase in home range overlap towards the end of the dry season. Regardless of season, dynamic networks always produced larger average outbreak sizes than static networks. Larger outbreaks in dynamic models based on simulated networks occurred especially when the probability of transmission and recovery were low. 
Variation in tie strength in the dynamic networks also had a major impact on outbreak size, while network modularity had a weaker influence than epidemiological parameters that determine transmission and recovery. Our study adds to emerging evidence that dynamic networks can change predictions of disease dynamics, especially if the disease shows low transmissibility and a long infectious period, and when environmental conditions lead to enhanced between-group contact after an infectious agent has been introduced. © 2016 The Authors. Journal of Animal Ecology published by John Wiley & Sons Ltd on behalf of British Ecological Society.
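
    The dynamic-vs-static comparison described above can be sketched with a minimal simulation: the "static" run reuses one contact snapshot every day, while the "dynamic" run draws a fresh snapshot (same mean degree, different partners) each day. The sketch is illustrative only: it uses an SIR rather than SEIR process, purely random contacts instead of the empirical sifaka networks, omits environmental transmission, and all parameter values are invented.

```python
import random

random.seed(7)
N, T, K = 100, 60, 3        # hosts, days, contacts per host per day
BETA, GAMMA = 0.2, 0.1      # per-contact transmission, daily recovery prob.

def snapshot():
    """One day's contact network: each host meets ~K random others."""
    edges = set()
    for i in range(N):
        for j in random.sample(range(N), K):
            if i != j:
                edges.add((min(i, j), max(i, j)))
    return edges

def outbreak(edges_for_day):
    """SIR on a (possibly time-varying) contact network; returns the
    final outbreak size (everyone who was ever infected)."""
    status = ["S"] * N
    status[0] = "I"
    for t in range(T):
        newly = set()
        for i, j in edges_for_day(t):
            for src, dst in ((i, j), (j, i)):
                if status[src] == "I" and status[dst] == "S" \
                        and random.random() < BETA:
                    newly.add(dst)
        for i in range(N):
            if status[i] == "I" and random.random() < GAMMA:
                status[i] = "R"
        for i in newly:
            status[i] = "I"
    return sum(s != "S" for s in status)

static_sizes, dynamic_sizes = [], []
for _ in range(20):
    g = snapshot()                                 # frozen network
    static_sizes.append(outbreak(lambda t: g))
    dynamic_sizes.append(outbreak(lambda t: snapshot()))
```

    Comparing the two averages over many replicates is the experiment the paper performs at scale; which approximation yields larger outbreaks depends on transmissibility, infectious period, and network turnover.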

  15. Use of static picture prompts versus video modeling during simulation instruction.

    PubMed

    Alberto, Paul A; Cihak, David F; Gama, Robert I

    2005-01-01

    The purpose of this study was to compare the effectiveness and efficiency of static picture prompts and video modeling as classroom simulation strategies in combination with in vivo community instruction. Students with moderate intellectual disabilities were instructed in the tasks of withdrawing money from an ATM and purchasing items using a debit card. Both simulation strategies were effective and efficient at teaching the skills. The two simulation strategies were not functionally different in terms of number of trials to acquisition, number of errors, and number of instructional sessions to criterion.

  16. Overview of HIV molecular epidemiology among people who inject drugs in Europe and Asia.

    PubMed

    Nikolopoulos, Georgios K; Kostaki, Evangelia-Georgia; Paraskevis, Dimitrios

    2016-12-01

    HIV strains continuously evolve, tend to recombine, and new circulating variants are being discovered. Novel strains complicate efforts to develop a vaccine against HIV and may exhibit higher transmission efficiency and virulence, and elevated resistance to antiretroviral agents. The United Nations Joint Programme on HIV/AIDS (UNAIDS) set an ambitious goal to end HIV as a public health threat by 2030 through comprehensive strategies that include epidemiological input as the first step of the process. In this context, molecular epidemiology becomes invaluable as it captures trends in HIV evolution rates that shape epidemiological pictures across several geographical areas. This review briefly summarizes the molecular epidemiology of HIV among people who inject drugs (PWID) in Europe and Asia. Following high transmission rates of subtype G and CRF14_BG among PWID in Portugal and Spain, two European countries, Greece and Romania, experienced recent HIV outbreaks in PWID that consisted of multiple transmission clusters including subtypes B, A, F1, and recombinants CRF14_BG and CRF35_AD. The latter was first identified in Afghanistan. Russia, Ukraine, and other Former Soviet Union (FSU) states are still facing the devastating effects of epidemics in PWID produced by A FSU (also known as IDU-A), B FSU (known as IDU-B), and CRF03_AB. In Asia, CRF01_AE and subtype B (Western B and Thai B) travelled from PWID in Thailand to neighboring countries. Recombination hotspots in South China, Northern Myanmar, and Malaysia have been generating several intersubtype and inter-CRF recombinants (e.g. CRF07_BC, CRF08_BC, CRF33_01B etc.), increasing the complexity of HIV molecular patterns. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Overview of HIV molecular epidemiology among People who Inject Drugs in Europe and Asia

    PubMed Central

    Nikolopoulos, Georgios K.; Kostaki, Evangelia-Georgia; Paraskevis, Dimitrios

    2016-01-01

    HIV strains continuously evolve, tend to recombine, and new circulating variants are being discovered. Novel strains complicate efforts to develop a vaccine against HIV and may exhibit higher transmission efficiency and virulence, and elevated resistance to antiretroviral agents. The United Nations Joint Programme on HIV/AIDS (UNAIDS) set an ambitious goal to end HIV as a public health threat by 2030 through comprehensive strategies that include epidemiological input as the first step of the process. In this context, molecular epidemiology becomes invaluable as it captures trends in HIV evolution rates that shape epidemiological pictures across several geographical areas. This review briefly summarizes the molecular epidemiology of HIV among people who inject drugs (PWID) in Europe and Asia. Following high transmission rates of subtype G and CRF14_BG among PWID in Portugal and Spain, two European countries, Greece and Romania, experienced recent HIV outbreaks in PWID that consisted of multiple transmission clusters including subtypes B, A, F1 and recombinants CRF14_BG and CRF35_AD. The latter was first identified in Afghanistan. Russia, Ukraine and other Former Soviet Union (FSU) states are still facing the devastating effects of epidemics in PWID produced by A FSU (also known as IDU-A), B FSU (known as IDU-B), and CRF03_AB. In Asia, CRF01_AE and subtype B (Western B and Thai B) travelled from PWID in Thailand to neighboring countries. Recombination hotspots in South China, Northern Myanmar, and Malaysia have been generating several intersubtype and inter-CRF recombinants (e.g. CRF07_BC, CRF08_BC, CRF33_01B etc.), increasing the complexity of HIV molecular patterns. PMID:27287560

  18. Adsorption of mercury by activated carbon prepared from dried sewage sludge in simulated flue gas.

    PubMed

    Park, Jeongmin; Lee, Sang-Sup

    2018-04-25

    Conversion of sewage sludge to activated carbon is attractive as an alternative to ocean dumping for the disposal of sewage sludge. Injection of activated carbon upstream of particulate matter control devices has been suggested as a method to remove elemental mercury from flue gas. Activated carbons were prepared using various activation temperatures and times and tested for their mercury adsorption efficiency using lab-scale systems. To understand the effect of the physical properties of the activated carbons, their mercury adsorption efficiency was investigated as a function of Brunauer-Emmett-Teller (BET) surface area. Two simulated flue gas conditions, (1) without hydrogen chloride (HCl) and (2) with 20 ppm HCl, were used to investigate the effect of flue gas composition on the mercury adsorption capacity of activated carbon. Despite the very low BET surface areas of the prepared sewage sludge activated carbons, their mercury adsorption efficiencies under both simulated flue gas conditions were comparable to those of pinewood and coal activated carbons. After injecting HCl into the simulated flue gas, all sewage sludge activated carbons demonstrated high adsorption efficiencies, i.e., more than 87%, regardless of their BET surface area. IMPLICATIONS We tested activated carbons prepared from dried sewage sludge to investigate the effect of their physical properties on their mercury adsorption efficiency. Using two simulated flue gas conditions, we conducted mercury speciation for the outlet gas. We found that the sewage sludge activated carbon had mercury adsorption efficiency comparable to pinewood and coal activated carbons, and that the presence of HCl minimized the effect of the physical properties of the activated carbon on its mercury adsorption efficiency.

  19. High-Resolution and -Efficiency Gamma-Ray Detection for the FRIB Decay Station

    NASA Astrophysics Data System (ADS)

    Grover, Hannah; Leach, Kyle; Natzke, Connor; FRIB Decay Station Collaboration

    2017-09-01

    As we push our knowledge of nuclear structure to the frontier of the unknown with FRIB, a new high-efficiency, -resolution, and -sensitivity photon-detection device is critical. The FRIB Decay Station Collaboration is working to create a new detector array that meets the needs of the exploratory nature of FRIB by minimizing cost and maximizing efficiency. GEANT4 simulations are being utilized to combine detectors in various configurations to test their feasibility. I will discuss these simulations and how they compare to existing simulations of past-generation decay-spectroscopy equipment. This work has been funded by the DOE Office of Science, Office of Nuclear Physics.

  20. Efficiency of super-Eddington magnetically-arrested accretion

    NASA Astrophysics Data System (ADS)

    McKinney, Jonathan C.; Dai, Lixin; Avara, Mark J.

    2015-11-01

    The radiative efficiency of super-Eddington accreting black holes (BHs) is explored for magnetically-arrested discs, where magnetic flux builds up to saturation near the BH. Our three-dimensional general relativistic radiation magnetohydrodynamic (GRRMHD) simulation of a spinning BH (spin a/M = 0.8) accreting at ~50 times Eddington shows a total efficiency of ~50 per cent when time-averaged and a total efficiency ≳ 100 per cent in moments. Magnetic compression by the magnetic flux near the rotating BH leads to a thin disc, whose radiation escapes via advection by a magnetized wind and via transport through a low-density channel created by a Blandford-Znajek (BZ) jet. The BZ efficiency is sub-optimal due to inertial loading of field lines by optically thick radiation, giving a BZ efficiency of ~40 per cent on the horizon that drops to ~5 per cent by r ~ 400rg (gravitational radii) via absorption by the wind. Importantly, radiation escapes at r ~ 400rg with efficiency η ≈ 15 per cent (luminosity L ~ 50LEdd), similar to η ≈ 12 per cent for a Novikov-Thorne thin disc and well beyond the η ≲ 1 per cent seen in prior GRRMHD simulations or slim disc theory. Our simulations show how BH spin, magnetic field, and jet mass-loading affect these radiative and jet efficiencies.

  1. Efficient Simulation of Explicitly Solvated Proteins in the Well-Tempered Ensemble.

    PubMed

    Deighan, Michael; Bonomi, Massimiliano; Pfaendtner, Jim

    2012-07-10

    Herein, we report significant reduction in the cost of combined parallel tempering and metadynamics simulations (PTMetaD). The efficiency boost is achieved using the recently proposed well-tempered ensemble (WTE) algorithm. We studied the convergence of PTMetaD-WTE conformational sampling and free energy reconstruction of an explicitly solvated 20-residue tryptophan-cage protein (trp-cage). A set of PTMetaD-WTE simulations was compared to a corresponding standard PTMetaD simulation. The properties of PTMetaD-WTE and the convergence of the calculations were compared. The roles of the number of replicas, total simulation time, and adjustable WTE parameter γ were studied.

  2. Higher Order Time Integration Schemes for the Unsteady Navier-Stokes Equations on Unstructured Meshes

    NASA Technical Reports Server (NTRS)

    Jothiprasad, Giridhar; Mavriplis, Dimitri J.; Caughey, David A.

    2002-01-01

    The rapid increase in available computational power over the last decade has enabled higher resolution flow simulations and more widespread use of unstructured grid methods for complex geometries. While much of this effort has been focused on steady-state calculations in the aerodynamics community, the need to accurately predict off-design conditions, which may involve substantial amounts of flow separation, points to the need to efficiently simulate unsteady flow fields. Accurate unsteady flow simulations can easily require several orders of magnitude more computational effort than a corresponding steady-state simulation. For this reason, techniques for improving the efficiency of unsteady flow simulations are required in order to make such calculations feasible in the foreseeable future. The purpose of this work is to investigate possible reductions in computer time due to the choice of an efficient time-integration scheme from a series of schemes differing in the order of time-accuracy, and by the use of more efficient techniques to solve the nonlinear equations which arise while using implicit time-integration schemes. This investigation is carried out in the context of a two-dimensional unstructured mesh laminar Navier-Stokes solver.

  3. Free-electron laser simulations on the MPP

    NASA Technical Reports Server (NTRS)

    Vonlaven, Scott A.; Liebrock, Lorie M.

    1987-01-01

    Free electron lasers (FELs) are of interest because they provide high power, high efficiency, and broad tunability. FEL simulations can make efficient use of computers of the Massively Parallel Processor (MPP) class because most of the processing consists of applying a simple equation to a set of identical particles. A test version of the KMS Fusion FEL simulation, which resides mainly in the MPP's host computer and only partially in the MPP, has run successfully.

  4. Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases

    PubMed Central

    Amos, Christopher I.; Bafna, Vineet; Hauser, Elizabeth R.; Hernandez, Ryan D.; Li, Chun; Liberles, David A.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Papanicolaou, George J.; Peng, Bo; Ritchie, Marylyn D.; Rosenfeld, Gabriel; Witte, John S.

    2014-01-01

    Genetic simulation programs are used to model data under specified assumptions to facilitate the understanding and study of complex genetic systems. Standardized data sets generated using genetic simulation are essential for the development and application of novel analytical tools in genetic epidemiology studies. With continuing advances in high-throughput genomic technologies and generation and analysis of larger, more complex data sets, there is a need for updating current approaches in genetic simulation modeling. To provide a forum to address current and emerging challenges in this area, the National Cancer Institute (NCI) sponsored a workshop, entitled “Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases” at the National Institutes of Health (NIH) in Bethesda, Maryland on March 11-12, 2014. The goals of the workshop were to: (i) identify opportunities, challenges and resource needs for the development and application of genetic simulation models; (ii) improve the integration of tools for modeling and analysis of simulated data; and (iii) foster collaborations to facilitate development and applications of genetic simulation. During the course of the meeting the group identified challenges and opportunities for the science of simulation, software and methods development, and collaboration. This paper summarizes key discussions at the meeting, and highlights important challenges and opportunities to advance the field of genetic simulation. PMID:25371374

  5. III-V/Si Tandem Cells Utilizing Interdigitated Back Contact Si Cells and Varying Terminal Configurations: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schnabel, Manuel; Klein, Talysa R.; Jain, Nikhil

    Solar cells made from bulk crystalline silicon (c-Si) dominate the market, but laboratory efficiencies have stagnated because the current record efficiency of 26.3% is already very close to the theoretical limit of 29.4% for a single-junction c-Si cell. In order to substantially boost the efficiency of Si solar cells we have been developing stacked III-V/Si tandem cells, recently attaining efficiencies above 32% in four-terminal configuration. In this contribution, we use state-of-the-art III-V cells coupled with equivalent circuit simulations to compare four-terminal (4T) to three- and two-terminal (3T, 2T) operation. Equivalent circuit simulations are used to show that tandem cells can be operated just as efficiently using three terminals as with four terminals. However, care must be taken not to overestimate 3T efficiency, as the two circuits used to extract current interact, and a method is described to accurately determine this efficiency. Experimentally, a 4T GaInP/Si tandem cell utilizing an interdigitated back contact cell is shown, exhibiting a 4T efficiency of 31.5% and a 2T efficiency of 28.1%. In 3T configuration, it is used to verify the finding from simulation that 3T efficiency is overestimated when interactions between the two circuits are neglected. Considering these, a 3T efficiency approaching the 4T efficiency is found, showing that 3T operation is efficient, and an outlook on fully integrated high-efficiency 3T and 2T tandem cells is given.

  6. The impact of antigenic drift of influenza A virus on human herd immunity: Sero-epidemiological study of H1N1 in healthy Thai population in 2009.

    PubMed

    Kanai, Yuta; Boonsathorn, Naphatsawan; Chittaganpitch, Malinee; Bai, Guirong; Li, Yonggang; Kase, Tetsuo; Takahashi, Kazuo; Okuno, Yoshinobu; Jampangern, Wipawee; Ikuta, Kazuyoshi; Sawanpanyalert, Pathom

    2010-07-26

    To examine the effect of the antigenic drift of H1N1 influenza viruses on herd immunity, neutralization antibodies from 744 sera from Thai healthy volunteers in 2008-2009, who had not been vaccinated for at least the last 5 years, were investigated by microneutralization (MN) and hemagglutination inhibition (HI) assays. Significantly higher MN titers were observed for the H1N1 Thai isolate in 2006 than in 2008. The results indicate that the antigenically drifted virus effectively escaped herd immunity. Since the low neutralization activity of herd immunity against drifted viruses is an important factor for viruses to spread efficiently, continuous sero-epidemiological study is required for public health. Copyright 2010 Elsevier Ltd. All rights reserved.

  7. Pesticide trapping efficiency of a modified backwater wetland using a simulated runoff event

    USDA-ARS?s Scientific Manuscript database

    This study examined the trapping efficiency of a modified backwater wetland amended with a mixture of three pesticides, atrazine, metolachlor, and fipronil, using a simulated runoff event. The 700 m long, 25 m wide wetland, located along the Coldwater River in Tunica County, Mississippi, was modifie...

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biyikli, Emre; To, Albert C., E-mail: albertto@pitt.edu

    Atomistic/continuum coupling methods combine accurate atomistic methods and efficient continuum methods to simulate the behavior of highly ordered crystalline systems. Coupled methods utilize the advantages of both approaches to simulate systems at a lower computational cost, while retaining the accuracy associated with atomistic methods. Many concurrent atomistic/continuum coupling methods have been proposed in the past; however, their true computational efficiency has not been demonstrated. The present work presents an efficient implementation of a concurrent coupling method called the Multiresolution Molecular Mechanics (MMM) for serial, parallel, and adaptive analysis. First, we present the features of the software implemented along with the associated technologies. The scalability of the software implementation is demonstrated, and the competing effects of multiscale modeling and parallelization are discussed. Then, the algorithms contributing to the efficiency of the software are presented. These include algorithms for eliminating latent ghost atoms from calculations and measurement-based dynamic balancing of parallel workload. The efficiency improvements made by these algorithms are demonstrated by benchmark tests. The efficiency of the software is found to be on par with LAMMPS, a state-of-the-art Molecular Dynamics (MD) simulation code, when performing full atomistic simulations. Speed-up of the MMM method is shown to be directly proportional to the reduction of the number of the atoms visited in force computation. Finally, an adaptive MMM analysis on a nanoindentation problem, containing over a million atoms, is performed, yielding an improvement of 6.3-8.5 times in efficiency over the full atomistic MD method. For the first time, the efficiency of a concurrent atomistic/continuum coupling method is comprehensively investigated and demonstrated.

  9. Controlling disease outbreaks in wildlife using limited culling: modelling classical swine fever incursions in wild pigs in Australia.

    PubMed

    Cowled, Brendan D; Garner, M Graeme; Negus, Katherine; Ward, Michael P

    2012-01-16

    Disease modelling is one approach for providing new insights into wildlife disease epidemiology. This paper describes a spatio-temporal, stochastic, susceptible-exposed-infected-recovered process model that simulates the potential spread of classical swine fever through a documented, large and free living wild pig population following a simulated incursion. The study area (300 000 km2) was in northern Australia. Published data on wild pig ecology from Australia, and international Classical Swine Fever data, were used to parameterise the model. Sensitivity analyses revealed that herd density (best estimate 1-3 pigs km-2), daily herd movement distances (best estimate approximately 1 km), probability of infection transmission between herds (best estimate 0.75) and disease related herd mortality (best estimate 42%) were highly influential on epidemic size but that extraordinary movements of pigs and the yearly home range size of a pig herd were not. CSF generally established (98% of simulations) following a single point introduction. CSF spread at approximately 9 km2 per day with low incidence rates (< 2 herds per day) in an epidemic wave along contiguous habitat for several years, before dying out (when the epidemic arrived at the end of a contiguous sub-population or at a low density wild pig area). The low incidence rate indicates that surveillance for wildlife disease epidemics caused by short-lived infections will be most efficient when surveillance is based on detection and investigation of clinical events, although this may not always be practical. Epidemics could be contained and eradicated with culling (aerial shooting) or vaccination when these were adequately implemented. It was apparent that the spatial structure, ecology and behaviour of wild populations must be accounted for during disease management in wildlife.
An important finding was that it may only be necessary to cull or vaccinate relatively small proportions of a population to successfully contain and eradicate some wildlife disease epidemics.

  10. Controlling disease outbreaks in wildlife using limited culling: modelling classical swine fever incursions in wild pigs in Australia

    PubMed Central

    2012-01-01

    Disease modelling is one approach for providing new insights into wildlife disease epidemiology. This paper describes a spatio-temporal, stochastic, susceptible-exposed-infected-recovered process model that simulates the potential spread of classical swine fever through a documented, large and free living wild pig population following a simulated incursion. The study area (300 000 km2) was in northern Australia. Published data on wild pig ecology from Australia, and international Classical Swine Fever data, were used to parameterise the model. Sensitivity analyses revealed that herd density (best estimate 1-3 pigs km-2), daily herd movement distances (best estimate approximately 1 km), probability of infection transmission between herds (best estimate 0.75) and disease related herd mortality (best estimate 42%) were highly influential on epidemic size but that extraordinary movements of pigs and the yearly home range size of a pig herd were not. CSF generally established (98% of simulations) following a single point introduction. CSF spread at approximately 9 km2 per day with low incidence rates (< 2 herds per day) in an epidemic wave along contiguous habitat for several years, before dying out (when the epidemic arrived at the end of a contiguous sub-population or at a low density wild pig area). The low incidence rate indicates that surveillance for wildlife disease epidemics caused by short-lived infections will be most efficient when surveillance is based on detection and investigation of clinical events, although this may not always be practical. Epidemics could be contained and eradicated with culling (aerial shooting) or vaccination when these were adequately implemented. It was apparent that the spatial structure, ecology and behaviour of wild populations must be accounted for during disease management in wildlife.
An important finding was that it may only be necessary to cull or vaccinate relatively small proportions of a population to successfully contain and eradicate some wildlife disease epidemics. PMID:22243996

  11. Epidemiology of urban water distribution systems

    NASA Astrophysics Data System (ADS)

    Bardet, Jean-Pierre; Little, Richard

    2014-08-01

    Urban water distribution systems worldwide contain numerous old and fragile pipes that inevitably break, flood streets and damage property, and disrupt economic and social activities. Such breaks often present dramatically in temporal clusters as occurred in Los Angeles during 2009. These clustered pipe breaks share many characteristics with human mortality observed during extreme climatological events such as heat waves or air pollution. Drawing from research and empirical studies in human epidemiology, a framework is introduced to analyze the time variations of disruptive pipe breaks that can help water agencies better understand clustered pipe failures and institute measures to minimize the disruptions caused by them. It is posited that at any time, a cohort of the pipes comprising the water distribution system will be in a weakened state due to fatigue and corrosion. This frail cohort becomes vulnerable during normal operations and ultimately breaks due to rapid increase in crack lengths induced by abnormal stressors. The epidemiological harvesting model developed in this paper simulates an observed time series of monthly pipe breaks and has both explanatory and predictive power. It also demonstrates that models from nonengineering disciplines such as medicine can provide improved insights into the performance of infrastructure systems.
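
    The harvesting mechanism posited above (a frail cohort of pipes accumulates through fatigue and corrosion, abnormal stressors "harvest" it, and the depleted cohort then rebuilds, producing break clusters followed by lulls) can be sketched with a toy monthly simulation. Every number below is an illustrative assumption, not a calibrated value from the paper.

```python
import random

random.seed(11)
MONTHS = 120
RECRUIT = 8                      # pipes entering the frail cohort per month
BASE_P, STRESS_P = 0.02, 0.30    # monthly break prob. of a frail pipe
STRESS_FREQ = 0.1                # chance a month brings an abnormal stressor

frail, breaks = 50, []
for m in range(MONTHS):
    # Stressor months sharply raise the break probability of frail pipes.
    p = STRESS_P if random.random() < STRESS_FREQ else BASE_P
    b = sum(random.random() < p for _ in range(frail))
    frail += RECRUIT - b         # breaks "harvest" the frail cohort
    breaks.append(b)
```

    Plotting `breaks` shows the signature the paper describes: quiet months punctuated by clusters whenever a stressor strikes a large frail cohort, followed by a lull while the cohort rebuilds.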

  12. Multi-d CFD Modeling of a Free-piston Stirling Convertor at NASA Glenn

    NASA Technical Reports Server (NTRS)

    Wilson, Scott D.; Dyson, Rodger W.; Tew, Roy C.; Ibrahim, Mounir B.

    2004-01-01

    A high efficiency Stirling Radioisotope Generator (SRG) is being developed for possible use in long duration space science missions. NASA's advanced technology goals for next generation Stirling convertors include increasing the Carnot efficiency and percent of Carnot efficiency. To help achieve these goals, a multidimensional Computational Fluid Dynamics (CFD) code is being developed to numerically model unsteady fluid flow and heat transfer phenomena of the oscillating working gas inside Stirling convertors. Simulations of the Stirling convertors for the SRG will help characterize the thermodynamic losses resulting from fluid flow and heat transfer between the working gas and solid walls. The current CFD simulation represents an approximate 2-dimensional convertor geometry. The simulation solves the Navier-Stokes equations for an ideal helium gas oscillating at low speeds. The current simulation results are discussed.

  13. Efficient parallelization of analytic bond-order potentials for large-scale atomistic simulations

    NASA Astrophysics Data System (ADS)

    Teijeiro, C.; Hammerschmidt, T.; Drautz, R.; Sutmann, G.

    2016-07-01

    Analytic bond-order potentials (BOPs) provide a way to compute atomistic properties with controllable accuracy. For large-scale computations of heterogeneous compounds at the atomistic level, both the computational efficiency and memory demand of BOP implementations have to be optimized. Since the evaluation of BOPs is a local operation within a finite environment, the parallelization concepts known from short-range interacting particle simulations can be applied to improve the performance of these simulations. In this work, several efficient parallelization methods for BOPs that use three-dimensional domain decomposition schemes are described. The schemes are implemented into the bond-order potential code BOPfox, and their performance is measured in a series of benchmarks. Systems of up to several millions of atoms are simulated on a high performance computing system, and parallel scaling is demonstrated for up to thousands of processors.
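
    The bookkeeping behind a three-dimensional domain decomposition (which rank owns each atom, and which atoms sit within the cutoff-wide "ghost" layer that neighbouring ranks must receive before force evaluation) can be sketched as follows. This is a schematic illustration under assumed box and process-grid sizes, not BOPfox's actual implementation.

```python
BOX = 10.0            # cubic simulation box edge length
PX = PY = PZ = 2      # process grid: 2 x 2 x 2 = 8 ranks
CUT = 1.0             # interaction cutoff = width of the ghost layer

def owner(pos):
    """Rank that owns an atom, from its subdomain indices."""
    ix = min(int(pos[0] / (BOX / PX)), PX - 1)
    iy = min(int(pos[1] / (BOX / PY)), PY - 1)
    iz = min(int(pos[2] / (BOX / PZ)), PZ - 1)
    return (ix * PY + iy) * PZ + iz

def is_ghost_candidate(pos):
    """True if the atom lies within CUT of a subdomain face, so a
    neighbouring rank needs a ghost copy of it before force evaluation."""
    for c, n in zip(pos, (PX, PY, PZ)):
        w = BOX / n                     # subdomain width along this axis
        if (c % w) < CUT or (c % w) > w - CUT:
            return True
    return False
```

    Because BOP evaluation is local within a finite environment, each rank only needs its own atoms plus this thin ghost shell, which is what makes the short-range parallelization strategy scale.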

  14. Energy efficient strategy for throughput improvement in wireless sensor networks.

    PubMed

    Jabbar, Sohail; Minhas, Abid Ali; Imran, Muhammad; Khalid, Shehzad; Saleem, Kashif

    2015-01-23

    Network lifetime and throughput are among the prime concerns when designing routing protocols for wireless sensor networks (WSNs). However, most of the existing schemes are geared towards either prolonging network lifetime or improving throughput. This paper presents an energy efficient routing scheme for throughput improvement in WSNs. The proposed scheme exploits a multilayer cluster design for energy efficient forwarding node selection, cluster head rotation and both inter- and intra-cluster routing. To improve throughput, we rotate the role of cluster head among various nodes based on two threshold levels, which reduces the number of dropped packets. We conducted simulations in the NS2 simulator to validate the performance of the proposed scheme. Simulation results demonstrate the performance efficiency of the proposed scheme in terms of various metrics compared to similar approaches published in the literature.

  15. Energy Efficient Strategy for Throughput Improvement in Wireless Sensor Networks

    PubMed Central

    Jabbar, Sohail; Minhas, Abid Ali; Imran, Muhammad; Khalid, Shehzad; Saleem, Kashif

    2015-01-01

    Network lifetime and throughput are among the prime concerns when designing routing protocols for wireless sensor networks (WSNs). However, most of the existing schemes are geared towards either prolonging network lifetime or improving throughput. This paper presents an energy efficient routing scheme for throughput improvement in WSNs. The proposed scheme exploits a multilayer cluster design for energy efficient forwarding node selection, cluster head rotation and both inter- and intra-cluster routing. To improve throughput, we rotate the role of cluster head among various nodes based on two threshold levels, which reduces the number of dropped packets. We conducted simulations in the NS2 simulator to validate the performance of the proposed scheme. Simulation results demonstrate the performance efficiency of the proposed scheme in terms of various metrics compared to similar approaches published in the literature. PMID:25625902

  16. Simulation of Fluid Flow and Collection Efficiency for an SEA Multi-element Probe

    NASA Technical Reports Server (NTRS)

    Rigby, David L.; Struk, Peter M.; Bidwell, Colin

    2014-01-01

    Numerical simulations of fluid flow and collection efficiency for a Science Engineering Associates (SEA) multi-element probe are presented. The flow field was simulated using the Glenn-HT Navier-Stokes solver. Three-dimensional unsteady results were produced and then time averaged for the collection efficiency results. Three grid densities were investigated to enable an assessment of grid dependence. Collection efficiencies were generated for three spherical particle sizes, 100, 20, and 5 microns in diameter, using the codes LEWICE3D and LEWICE2D. The free-stream Mach number was 0.27, representing a velocity of approximately 86 m/s. It was observed that a reduction in velocity of about 15-20 percent occurred as the flow entered the shroud of the probe. Collection efficiency results indicate a reduction in collection efficiency as particle size is reduced. The reduction with particle size is expected; however, the results tended to be lower than previous results generated for isolated two-dimensional elements. The deviation from the two-dimensional results is more pronounced for the smaller particles and is likely due to the effect of the protective shroud.

  17. Spatial Patterns in the Efficiency of the Biological Pump: What Controls Export Ratios at the Global Scale?

    NASA Astrophysics Data System (ADS)

    Moore, J. K.

    2016-02-01

    The efficiency of the biological pump is influenced by complex interactions between chemical, biological, and physical processes. The efficiency of export out of surface waters and down through the water column to the deep ocean has been linked to a number of factors, including biotic community composition, production of mineral ballast components, physical aggregation and disaggregation processes, and ocean oxygen concentrations. I will examine spatial patterns in the export ratio and the efficiency of the biological pump at the global scale using the Community Earth System Model (CESM). There are strong spatial variations in the export efficiency as simulated by the CESM, which are closely correlated with new nutrient inputs to the euphotic zone and their impacts on phytoplankton community structure. I will compare CESM simulations that include dynamic, variable export ratios driven by the phytoplankton community structure with simulations that impose a near-constant export ratio, to examine the effects of export efficiency on nutrient and surface chlorophyll distributions. The model-predicted export ratios will also be compared with recent satellite-based estimates.

  18. SUNREL Energy Simulation Software | Buildings | NREL

    Science.gov Websites

    SUNREL® is an hourly building energy simulation program that aids in the design of small energy-efficient buildings where the loads are

  19. Relative validity of micronutrient and fiber intake assessed with two new interactive meal- and Web-based food frequency questionnaires.

    PubMed

    Christensen, Sara E; Möller, Elisabeth; Bonn, Stephanie E; Ploner, Alexander; Bälter, Olle; Lissner, Lauren; Bälter, Katarina

    2014-02-21

    The meal- and Web-based food frequency questionnaires, Meal-Q and MiniMeal-Q, were developed for cost-efficient assessment of dietary intake in epidemiological studies. The objective of this study was to evaluate the relative validity of micronutrient and fiber intake assessed with Meal-Q and MiniMeal-Q. The reproducibility of Meal-Q was also evaluated. A total of 163 volunteer men and women aged between 20 and 63 years were recruited from Stockholm County, Sweden. Assessment of micronutrient and fiber intake with the 174-item Meal-Q was compared to a Web-based 7-day weighed food record (WFR). Two administrations of Meal-Q were compared to assess reproducibility. The 126-item MiniMeal-Q, developed after the validation study, was evaluated in a simulated validation by using truncated Meal-Q data. The study population consisted of approximately 80% women (129/163), had a mean age of 33 years (SD 12), and was highly educated on average (130/163, 80% with >12 years of education). Cross-classification of quartiles with the WFR placed 69% to 90% in the same/adjacent quartile for Meal-Q and 67% to 89% for MiniMeal-Q. Bland-Altman plots with the WFR and the questionnaires showed large variances and a trend of increasing underestimation with increasing intakes. Deattenuated and energy-adjusted Spearman rank correlations between the questionnaires and the WFR were in the range ρ=.25-.69, excluding sodium, which was not statistically significant. Cross-classification of quartiles of the two Meal-Q administrations placed 86% to 97% in the same/adjacent quartile. Intraclass correlation coefficients for energy-adjusted intakes were in the range of .50-.76. With the exception of sodium, this validation study demonstrates that Meal-Q and MiniMeal-Q are useful methods for ranking micronutrient and fiber intake in epidemiological studies with Web-based data collection.
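    As a concrete illustration of the quartile cross-classification metric reported in validation studies of this kind, the Python sketch below computes the share of subjects placed in the same or an adjacent quartile by two methods. The intake values are invented for illustration; they are not the study's data.

```python
def quartile(values):
    """Assign each value a quartile index 0..3 by rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    q = [0] * len(values)
    for rank, i in enumerate(order):
        q[i] = rank * 4 // len(values)
    return q

def same_or_adjacent(x, y):
    """Fraction of subjects in the same or an adjacent quartile by both methods."""
    qx, qy = quartile(x), quartile(y)
    hits = sum(abs(a - b) <= 1 for a, b in zip(qx, qy))
    return hits / len(x)

ffq = [10, 12, 30, 25, 8, 40, 22, 15]   # hypothetical questionnaire intakes
wfr = [11, 14, 28, 27, 9, 35, 20, 30]   # hypothetical reference (WFR) intakes
print(round(same_or_adjacent(ffq, wfr), 2))  # 7 of 8 pairs agree -> 0.88
```

    Ranking agreement of this sort is what matters for epidemiological use, where subjects are compared by relative rather than absolute intake.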

  20. Epidemiological HIV infection surveillance among subjects with risk behaviours in the city of Messina (Sicily) from 1992 to 2015.

    PubMed

    Visalli, G; Avventuroso, E; Laganà, P; Spataro, P; Di Pietro, A; Bertuccio, M P; Picerno, I

    2017-09-01

    Epidemiological studies are a key element in determining the evolution and spread of HIV infection among the world population. Knowledge of the epidemiological dynamics improves strategies for prevention and monitoring. We examined 2,272 subjects who voluntarily underwent HIV testing from January 1992 to December 2015. For each subject, an anonymous form was completed to obtain information on personal data, sexual habits and exposure to risk factors. The number of subjects undergoing the screening test has increased over the years and the average age of the tested subjects has decreased over time. The main motivation for undergoing HIV testing is unprotected sex. Although heterosexual subjects taking the test were more numerous than homosexuals in this study, an increase in the latter over time should be highlighted. Although the number of tests performed has increased over the years, the persistence of unprotected sex shows an inadequate perception of risk. Therefore, it is necessary to implement programmes to increase the general awareness of HIV infection. It is also essential to undertake constant monitoring of behaviour, risk perception and the application of the screening test via surveillance systems in order to implement effective and efficient prevention.

  1. Epidemiologic characteristics of scrub typhus on Jeju Island

    PubMed Central

    2017-01-01

    OBJECTIVES Scrub typhus is the most common febrile disease in Korea during the autumn. Jeju Island is the largest island in South Korea and has a distinctive oceanic climate. This study aimed to identify epidemiologic characteristics of scrub typhus on Jeju Island. METHODS From January 2011 to December 2016, 446 patients were diagnosed with scrub typhus on Jeju Island. The patients’ personal data and the environmental factors that might be related to scrub typhus were investigated and retrospectively analyzed. RESULTS The median age of the patients was 58 years (range, 8 to 91) and 43% of them worked in the agricultural, forestry or livestock industry. Regardless of their job, 87% of the patients had a history of either working outdoors or of other activities before developing scrub typhus. The south and southeast regions of Jeju Island, especially Namwon-eup, showed the highest incidence of scrub typhus. Workers in mandarin orange orchards seemed to be the highest risk group for scrub typhus infection. CONCLUSIONS Scrub typhus on Jeju Island showed unique characteristics. To efficiently prevent scrub typhus, individual regional approaches should be developed each year based on the epidemiologic characteristics of the disease. PMID:28823118

  2. An information system for epidemiology based on a computer-based medical record.

    PubMed

    Verdier, C; Flory, A

    1994-12-01

    A new way to build an information system addressed to problems in epidemiology is presented. Based on our analysis of current and future requirements, a system is proposed which allows for the collection, organization and distribution of data within a computer network. In this application, two broad communities of users, physicians and epidemiologists, can be identified, each with their own perspectives and goals. The different requirements of each community led us to a client-service centered architecture which provides the functionality required by the two groups. The resulting physician workstation provides help for recording and querying medical information about patients and from a pharmacological database. All information is classified and coded in order to be retrieved for pharmaco-economic studies. The service center receives information from physician workstations and permits organizations that are in charge of statistical studies to work with "real" data recorded during patient encounters. This leads to a new approach in epidemiology: studies can be carried out with more efficient data acquisition. For modelling the information system, we use an object-oriented approach. We have observed that the object-oriented representation, particularly its concepts of generalization, aggregation and encapsulation, is well suited to our problem.

  3. Public Health and Epidemiology Informatics.

    PubMed

    Flahault, A; Bar-Hen, A; Paragios, N

    2016-11-10

    The aim of this manuscript is to provide a brief overview of the scientific challenges that should be addressed in order to unlock the full potential of using data from a general point of view, as well as to present some ideas that could help answer specific needs for data understanding in the field of health sciences and epidemiology. A survey of the uses and challenges of big data analyses for medicine and public health was conducted. The first part of the paper focuses on big data techniques, algorithms, and statistical approaches to identify patterns in data. The second part describes some cutting-edge applications of analyses and predictive modeling in public health. In recent years, we have witnessed a revolution regarding the nature, collection, and availability of data in general. This was especially striking in the health sector and particularly in the field of epidemiology. Data derive from a large variety of sources, e.g. clinical settings, billing claims, care scheduling, drug usage, web-based search queries, and Tweets. The exploitation of the information relevant to these data (data mining, artificial intelligence) has become one of the most promising as well as challenging tasks from societal and scientific viewpoints, in order to leverage the available information and make public health more efficient.

  4. Simulated learning environment experience in nursing students for paediatric practice.

    PubMed

    Mendoza-Maldonado, Yessy; Barría-Pailaquilén, René Mauricio

    The training of health professionals requires the acquisition of clinical skills in a safe and efficient manner, which is facilitated by a simulated learning environment (SLE). An SLE is also an efficient alternative when there are limitations on clinical practice in certain areas. This paper describes the work undertaken in a Chilean university to implement paediatric practice using an SLE. Over eight days, the care of a hospitalized infant was studied by applying the nursing process. A paediatrician, a resident physician, a nursing technician and a simulated user participated, in addition to a simulation mannequin and equipment. The simulation of care was integral, covered interaction with the child and family, and was delivered by a teacher to groups of six students. The different phases of the simulation methodology were developed from a pedagogical point of view. The possibility of implementing paediatric clinical practice in an efficient and safe way was confirmed. The experience in the SLE was highly valued by the students, allowing them to develop, through simulation, the different skills and abilities required for paediatric nursing. Copyright © 2018 Elsevier España, S.L.U. All rights reserved.

  5. Quantifying Transmission Heterogeneity Using Both Pathogen Phylogenies and Incidence Time Series

    PubMed Central

    Li, Lucy M.; Grassly, Nicholas C.; Fraser, Christophe

    2017-01-01

    Heterogeneity in individual-level transmissibility can be quantified by the dispersion parameter k of the offspring distribution. Quantifying heterogeneity is important as it affects other parameter estimates, it modulates the degree of unpredictability of an epidemic, and it needs to be accounted for in models of infection control. Aggregated data such as incidence time series are often not sufficiently informative to estimate k. Incorporating phylogenetic analysis can help to estimate k concurrently with other epidemiological parameters. We have developed an inference framework that uses particle Markov chain Monte Carlo to estimate k and other epidemiological parameters using both incidence time series and the pathogen phylogeny. Using the framework to fit a modified compartmental transmission model that includes the parameter k to simulated data, we found that more accurate and less biased estimates of the reproductive number were obtained by combining epidemiological and phylogenetic analyses. However, k was most accurately estimated using the pathogen phylogeny alone. Accurately estimating k was necessary for unbiased estimates of the reproductive number, but it did not affect the accuracy of reporting probability and epidemic start date estimates. We further demonstrated that inference was possible in the presence of phylogenetic uncertainty by sampling from the posterior distribution of phylogenies. Finally, we used the inference framework to estimate transmission parameters from epidemiological and genetic data collected during a poliovirus outbreak. Despite the large degree of phylogenetic uncertainty, we demonstrated that incorporating phylogenetic data in parameter inference improved the accuracy and precision of the estimates. PMID:28981709
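    For readers unfamiliar with the dispersion parameter k, the sketch below illustrates the conventional negative binomial offspring distribution (mean R0, variance R0(1 + R0/k)) as a gamma-Poisson mixture. The parameter values are arbitrary and this is not the authors' transmission model; small k concentrates transmission in a few superspreaders.

```python
import math
import random

def offspring(R0, k, rng):
    """One individual's number of secondary cases: a Poisson draw with a
    gamma-distributed individual rate (a negative binomial overall)."""
    rate = rng.gammavariate(k, R0 / k)   # heterogeneous transmissibility
    # Poisson sampling by inversion; adequate for the small rates used here
    L, p, n = math.exp(-rate), 1.0, 0
    while True:
        p *= rng.random()
        if p <= L:
            return n
        n += 1

rng = random.Random(42)
draws = [offspring(R0=2.0, k=0.2, rng=rng) for _ in range(20000)]
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
# With k = 0.2 the variance-to-mean ratio is far above 1 (superspreading);
# its theoretical value is 1 + R0/k = 11.
print(round(mean, 2), round(var / mean, 2))
```

    The same mean with k large (near-Poisson offspring) would give a variance-to-mean ratio near 1, which is why aggregated incidence alone struggles to pin k down.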

  6. Panel discussion review: session 1--exposure assessment and related errors in air pollution epidemiologic studies.

    PubMed

    Sarnat, Jeremy A; Wilson, William E; Strand, Matthew; Brook, Jeff; Wyzga, Ron; Lumley, Thomas

    2007-12-01

    Examining the validity of exposure metrics used in air pollution epidemiologic models has been a key focus of recent exposure assessment studies. The objective of this work has been, largely, to determine what a given exposure metric represents and to quantify and reduce any potential errors resulting from using these metrics in lieu of true exposure measurements. The current manuscript summarizes the presentations of the co-authors from a recent EPA workshop, held in December 2006, dealing with the role and contributions of exposure assessment in addressing these issues. Results are presented from US and Canadian exposure and pollutant measurement studies as well as theoretical simulations to investigate what both particulate and gaseous pollutant concentrations represent and the potential errors resulting from their use in air pollution epidemiologic studies. Quantifying the association between ambient pollutant concentrations and corresponding personal exposures has led to the concept of defining attenuation factors, or alpha. Specifically, characterizing pollutant-specific estimates for alpha was shown to be useful in developing regression calibration methods involving PM epidemiologic risk estimates. For some gaseous pollutants such as NO2 and SO2, the associations between ambient concentrations and personal exposures were shown to be complex and still poorly understood. Results from recent panel studies suggest that ambient NO2 measurements may, in some locations, be serving as surrogates to traffic pollutants, including traffic-related PM2.5, hopanes, steranes, and oxidized nitrogen compounds (rather than NO2).
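    To make the attenuation-factor concept concrete, the Python sketch below estimates alpha as the slope of a least-squares regression of personal exposures on ambient concentrations, one common operational definition. The data are invented for illustration and are not from the studies discussed.

```python
def attenuation_factor(ambient, personal):
    """Ordinary least-squares slope of personal exposure vs ambient level."""
    n = len(ambient)
    ma = sum(ambient) / n
    mp = sum(personal) / n
    cov = sum((a - ma) * (p - mp) for a, p in zip(ambient, personal))
    var = sum((a - ma) ** 2 for a in ambient)
    return cov / var

ambient = [10, 15, 20, 25, 30]   # hypothetical ambient PM2.5 (ug/m3)
personal = [8, 11, 13, 16, 18]   # hypothetical personal exposures (ug/m3)
print(round(attenuation_factor(ambient, personal), 2))  # -> 0.5
```

    An alpha below 1 means ambient concentrations overstate the exposure contrast across subjects, which is why pollutant-specific alphas feed into regression calibration of epidemiologic risk estimates.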

  7. Educational Resources | NREL

    Science.gov Websites

  8. Efficient EM Simulation of GCPW Structures Applied to a 200-GHz mHEMT Power Amplifier MMIC

    NASA Astrophysics Data System (ADS)

    Campos-Roca, Yolanda; Amado-Rey, Belén; Wagner, Sandrine; Leuther, Arnulf; Bangert, Axel; Gómez-Alcalá, Rafael; Tessmann, Axel

    2017-05-01

    The behaviour of grounded coplanar waveguide (GCPW) structures in the upper millimeter-wave range is analyzed by using full-wave electromagnetic (EM) simulations. A methodological approach to developing reliable and time-efficient simulations is proposed by investigating the impact of different simplifications in the EM modelling and simulation conditions. After experimental validation with measurements on test structures, this approach has been used to model the most critical passive structures involved in the layout of a state-of-the-art 200-GHz power amplifier based on metamorphic high electron mobility transistors (mHEMTs). This millimeter-wave monolithic integrated circuit (MMIC) has demonstrated a measured output power of 8.7 dBm for an input power of 0 dBm at 200 GHz. The measured output power density and power-added efficiency (PAE) are 46.3 mW/mm and 4.5%, respectively. The peak measured small-signal gain is 12.7 dB (obtained at 196 GHz). Good agreement has been obtained between measurements and simulation results.

  9. Increasing the sampling efficiency of protein conformational transition using velocity-scaling optimized hybrid explicit/implicit solvent REMD simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Yuqi; Wang, Jinan; Shao, Qiang, E-mail: qshao@mail.shcnc.ac.cn, E-mail: Jiye.Shi@ucb.com, E-mail: wlzhu@mail.shcnc.ac.cn

    2015-03-28

    The application of temperature replica exchange molecular dynamics (REMD) simulation to protein motion is limited by its huge requirement of computational resources, particularly when an explicit solvent model is implemented. In a previous study, we developed a velocity-scaling optimized hybrid explicit/implicit solvent REMD method with the hope of reducing the number of temperatures (replicas) while maintaining high sampling efficiency. In this study, we utilized this method to characterize and energetically identify the conformational transition pathway of a protein model, the N-terminal domain of calmodulin. In comparison to the standard explicit solvent REMD simulation, the hybrid REMD is much less computationally expensive but, meanwhile, gives an accurate evaluation of the structural and thermodynamic properties of the conformational transition, in good agreement with the standard REMD simulation. Therefore, the hybrid REMD can greatly increase computational efficiency and thus expand the application of REMD simulation to larger protein systems.
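    Both the standard and the hybrid variants rely on the usual Metropolis exchange criterion between neighboring temperatures; a minimal sketch follows (the energies and temperatures are illustrative only, not values from the study):

```python
import math
import random

# Metropolis acceptance for swapping configurations between replicas i and j:
# accept with probability min(1, exp[(beta_i - beta_j) * (E_i - E_j)]).
def swap_accepted(E_i, E_j, T_i, T_j, rng, kB=0.0019872):  # kB in kcal/(mol K)
    beta_i, beta_j = 1.0 / (kB * T_i), 1.0 / (kB * T_j)
    delta = (beta_i - beta_j) * (E_i - E_j)
    return delta >= 0 or rng.random() < math.exp(delta)

# The colder replica (300 K) holds the higher energy, so the swap hands the
# lower-energy configuration to the low temperature and is always accepted.
print(swap_accepted(E_i=-120.0, E_j=-150.0, T_i=300.0, T_j=350.0,
                    rng=random.Random(0)))  # -> True
```

    Reducing the replica count, as the velocity-scaling hybrid aims to do, widens the temperature gaps, so keeping this acceptance probability usable is the central trade-off.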

  10. An Implicit Algorithm for the Numerical Simulation of Shape-Memory Alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, R; Stolken, J; Jannetti, C

    Shape-memory alloys (SMAs) have the potential to be used in a variety of interesting applications due to their unique properties of pseudoelasticity and the shape-memory effect. However, in order to design SMA devices efficiently, a physics-based constitutive model is required to accurately simulate the behavior of shape-memory alloys. The scope of this work is to extend the numerical capabilities of the SMA constitutive model developed by Jannetti et al. (2003) to handle large-scale polycrystalline simulations. The constitutive model is implemented within the finite-element software ABAQUS/Standard using a user-defined material subroutine, or UMAT. To improve the efficiency of the numerical simulations, so that polycrystalline specimens of shape-memory alloys can be modeled, a fully implicit algorithm has been implemented to integrate the constitutive equations. Using an implicit integration scheme increases the efficiency of the UMAT over the previously implemented explicit integration method by a factor of more than 100 for single-crystal simulations.

  11. Full Quantum Dynamics Simulation of a Realistic Molecular System Using the Adaptive Time-Dependent Density Matrix Renormalization Group Method.

    PubMed

    Yao, Yao; Sun, Ke-Wei; Luo, Zhen; Ma, Haibo

    2018-01-18

    The accurate theoretical interpretation of ultrafast time-resolved spectroscopy experiments relies on full quantum dynamics simulations of the investigated system, which are nevertheless computationally prohibitive for realistic molecular systems with a large number of electronic and/or vibrational degrees of freedom. In this work, we propose a unitary transformation approach for realistic vibronic Hamiltonians, which can then be treated with the adaptive time-dependent density matrix renormalization group (t-DMRG) method to efficiently evolve the nonadiabatic dynamics of a large molecular system. We demonstrate the accuracy and efficiency of this approach with the example of simulating the exciton dissociation process within an oligothiophene/fullerene heterojunction, indicating that t-DMRG can be a promising method for full quantum dynamics simulation in large chemical systems. Moreover, it is also shown that the proper vibronic features in the ultrafast electronic process can be obtained by simulating the two-dimensional (2D) electronic spectrum by virtue of the high computational efficiency of the t-DMRG method.

  12. Highly Efficient Computation of the Basal kon using Direct Simulation of Protein-Protein Association with Flexible Molecular Models.

    PubMed

    Saglam, Ali S; Chong, Lillian T

    2016-01-14

    An essential baseline for determining the extent to which electrostatic interactions enhance the kinetics of protein-protein association is the "basal" kon, which is the rate constant for association in the absence of electrostatic interactions. However, since such association events are beyond the millisecond time scale, it has not been practical to compute the basal kon by directly simulating the association with flexible models. Here, we computed the basal kon for barnase and barstar, two of the most rapidly associating proteins, using highly efficient, flexible molecular simulations. These simulations involved (a) pseudoatomic protein models that reproduce the molecular shapes, electrostatic, and diffusion properties of all-atom models, and (b) application of the weighted ensemble path sampling strategy, which enhanced the efficiency of generating association events by >130-fold. We also examined the extent to which the computed basal kon is affected by the inclusion of intermolecular hydrodynamic interactions in the simulations.
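    The weighted ensemble strategy mentioned above is general; a toy sketch of its core bookkeeping follows, splitting and merging weighted trajectory walkers inside a bin while conserving total weight. The function name, bin contents, and numbers are invented, and production implementations are considerably more elaborate.

```python
import random

def resample_bin(walkers, target, rng):
    """walkers: list of (state, weight). Return `target` walkers whose
    weights sum to the same total (the key invariant of weighted ensemble)."""
    total = sum(w for _, w in walkers)
    out = list(walkers)
    while len(out) > target:          # merge: keep one of a random pair,
        a = out.pop(rng.randrange(len(out)))   # chosen by relative weight,
        b = out.pop(rng.randrange(len(out)))   # and combine the weights
        keep = a if rng.random() < a[1] / (a[1] + b[1]) else b
        out.append((keep[0], a[1] + b[1]))
    while len(out) < target:          # split: halve the heaviest walker
        i = max(range(len(out)), key=lambda j: out[j][1])
        s, w = out[i]
        out[i] = (s, w / 2)
        out.append((s, w / 2))
    assert abs(sum(w for _, w in out) - total) < 1e-12
    return out

rng = random.Random(3)
walkers = [("x1", 0.5), ("x2", 0.3), ("x3", 0.2)]
resampled = resample_bin(walkers, 5, rng)
print(len(resampled), round(sum(w for _, w in resampled), 6))
```

    Because weight is conserved, rare-event fluxes computed from the walker weights remain unbiased even though sampling effort is concentrated where transitions happen.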

  13. [Epidemiology and risk factors of upper urinary tract tumors: literature review for the yearly scientific report of the French National Association of Urology].

    PubMed

    Ouzzane, A; Rouprêt, M; Leon, P; Yates, D R; Colin, P

    2014-11-01

    To describe the epidemiology and the risk and genetic factors involved in the carcinogenesis pathways of upper urinary tract tumors (UTUCs). A systematic review of the scientific literature was performed using the Medline database (National Library of Medicine, PubMed) and the websites of the HAS and the ANSM with the following keywords: epidemiology; risk factor; tobacco; aristolochic acid; urothelial carcinoma; ureter; renal pelvis. The search was focused on the characteristics, mode of action, efficiency and side effects of the various agents concerned. The estimated UTUC incidence is 1.2 cases/100,000 inhabitants per year in Europe. The incidence of renal pelvis tumors has been stable for 30 years, while the frequency of ureteric locations has increased over time. Locally advanced stage and high grade are more frequent at the time of diagnosis. The median age at diagnosis is 70 years. The male-to-female ratio is nearly 2. The main carcinogenic factors are tobacco consumption and occupational exposure. There are specific risk factors for UTUC, such as aristolochic acid (Balkan endemic nephropathy and Chinese herb nephropathy). Familial cases are distinct from sporadic cases. UTUCs belong to the HNPCC syndrome and rank third in its tumor spectrum. UTUCs are rare tumors with specific epidemiologic characteristics. UTUCs share common risk factors with other urothelial carcinomas, such as bladder tumors, but also have specific risk factors that clinicians should know. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  14. Study on observation planning of LAMOST focal plane positioning system and its simulation

    NASA Astrophysics Data System (ADS)

    Zhai, Chao; Jin, Yi; Peng, Xiaobo; Xing, Xiaozheng

    2006-06-01

    The fiber positioning system of the LAMOST focal plane is based on a subarea concept and adopts a parallel, controllable positioning scheme: each positioning unit is designed as a round area, and the units overlap each other in order to eliminate unobservable regions. This, however, makes the observation efficiency of the system an important problem. In this paper, a model of LAMOST focal plane observation planning including 4000 fiber positioning units is built according to this design. Stars are allocated using a network flow algorithm, mechanical collisions are eliminated through a retreat algorithm, and a simulation of the system's observation efficiency is then carried out. The observation efficiency of the LAMOST focal plane is analyzed systematically and comprehensively in terms of the overlapped regions, fiber positioning units, observation radius, collisions and so on. The theoretical observation efficiency of the system is described, and the simulation indicates that it is acceptable. These analyses play an indicative role in the design of the LAMOST focal plane structure.

  15. Analysis of hybrid electric/thermofluidic inputs for wet shape memory alloy actuators

    NASA Astrophysics Data System (ADS)

    Flemming, Leslie; Mascaro, Stephen

    2013-01-01

    A wet shape memory alloy (SMA) actuator is characterized by an SMA wire embedded within a compliant fluid-filled tube. Heating and cooling of the SMA wire produces a linear contraction and extension of the wire. Thermal energy can be transferred to and from the wire using combinations of resistive heating and free/forced convection. This paper analyzes the speed and efficiency of a simulated wet SMA actuator using a variety of control strategies involving different combinations of electrical and thermofluidic inputs. A computational fluid dynamics (CFD) model is used in conjunction with a temperature-strain model of the SMA wire to simulate the thermal response of the wire and compute strains, contraction/extension times and efficiency. The simulations produce cycle rates of up to 5 Hz for electrical heating and fluidic cooling, and up to 2 Hz for fluidic heating and cooling. The simulated results demonstrate efficiencies up to 0.5% for electric heating and up to 0.2% for fluidic heating. Using both electric and fluidic inputs concurrently improves the speed and efficiency of the actuator and allows for the actuator to remain contracted without continually delivering energy to the actuator, because of the thermal capacitance of the hot fluid. The characterized speeds and efficiencies are key requirements for implementing broader research efforts involving the intelligent control of electric and thermofluidic networks to optimize the speed and efficiency of wet actuator arrays.

  16. Efficiencies of joint non-local update moves in Monte Carlo simulations of coarse-grained polymers

    NASA Astrophysics Data System (ADS)

    Austin, Kieran S.; Marenz, Martin; Janke, Wolfhard

    2018-03-01

    In this study, four update methods are compared in their performance in a Monte Carlo simulation of polymers in continuum space. The efficiencies of the update methods and combinations thereof are compared with the aid of the autocorrelation time at a fixed (optimal) acceptance ratio. Results are obtained for polymer lengths N = 14, 28 and 42 and temperatures below, at and above the collapse transition. In terms of autocorrelation, the optimal acceptance ratio is approximately 0.4. Furthermore, an overview of the step sizes of the update methods that correspond to this optimal acceptance ratio is given. This shall serve as a guide for future studies that rely on efficient computer simulations.
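    Comparisons of update moves like the one above rest on the integrated autocorrelation time of an observable; a self-contained sketch follows. The windowing rule used here is one common heuristic (stop summing once the lag exceeds roughly six times the running estimate), not necessarily the authors' choice.

```python
import random

def tau_int(series, cutoff_factor=6):
    """Integrated autocorrelation time: 1 + 2 * sum of normalized
    autocorrelations, truncated by a self-consistent window."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n
    tau = 1.0
    for t in range(1, n // 2):
        c = sum((series[i] - mean) * (series[i + t] - mean)
                for i in range(n - t)) / ((n - t) * var)
        tau += 2.0 * c
        if t >= cutoff_factor * tau:   # lags beyond ~6 tau are mostly noise
            break
    return tau

# Strongly correlated test chain: AR(1) with phi = 0.9, for which the exact
# integrated autocorrelation time is (1 + phi) / (1 - phi) = 19.
rng = random.Random(1)
x, xs = 0.0, []
for _ in range(5000):
    x = 0.9 * x + rng.gauss(0.0, 1.0)
    xs.append(x)
print(tau_int(xs) > 5)
```

    Since the effective number of independent samples is N / (2 tau_int), halving tau_int by a better update move doubles the statistics obtained from the same run length.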

  17. Efficiency reduction and pseudo-convergence in replica exchange sampling of peptide folding-unfolding equilibria

    NASA Astrophysics Data System (ADS)

    Denschlag, Robert; Lingenheil, Martin; Tavan, Paul

    2008-06-01

    Replica exchange (RE) molecular dynamics (MD) simulations are frequently applied to sample the folding-unfolding equilibria of β-hairpin peptides in solution, because efficiency gains are expected from this technique. Using a three-state Markov model featuring key aspects of β-hairpin folding we show that RE simulations can be less efficient than conventional techniques. Furthermore we demonstrate that one is easily seduced to erroneously assign convergence to the RE sampling, because RE ensembles can rapidly reach long-lived stationary states. We conclude that typical REMD simulations covering a few tens of nanoseconds are by far too short for sufficient sampling of β-hairpin folding-unfolding equilibria.

  18. Self-learning Monte Carlo method

    DOE PAGES

    Liu, Junwei; Qi, Yang; Meng, Zi Yang; ...

    2017-01-04

    Monte Carlo simulation is an unbiased numerical tool for studying classical and quantum many-body systems. One of its bottlenecks is the lack of a general and efficient update algorithm for large systems close to a phase transition, for which local updates perform badly. In this Rapid Communication, we propose a general-purpose Monte Carlo method, dubbed self-learning Monte Carlo (SLMC), in which an efficient update algorithm is first learned from training data generated in trial simulations and then used to speed up the actual simulation. Lastly, we demonstrate the efficiency of SLMC in a spin model at the phase transition point, achieving a 10-20 times speedup.

  19. Efficient evaluation of wireless real-time control networks.

    PubMed

    Horvath, Peter; Yampolskiy, Mark; Koutsoukos, Xenofon

    2015-02-11

    In this paper, we present a system simulation framework for the design and performance evaluation of complex wireless cyber-physical systems. We describe the simulator architecture and the specific developments that are required to simulate cyber-physical systems relying on multi-channel, multi-hop mesh networks. We introduce realistic and efficient physical layer models and a system simulation methodology, which provides statistically significant performance evaluation results with low computational complexity. The capabilities of the proposed framework are illustrated using the example of WirelessHART, a centralized, real-time, multi-hop mesh network designed for industrial control and monitoring applications.

  20. LAMMPS integrated materials engine (LIME) for efficient automation of particle-based simulations: application to equation of state generation

    NASA Astrophysics Data System (ADS)

    Barnes, Brian C.; Leiter, Kenneth W.; Becker, Richard; Knap, Jaroslaw; Brennan, John K.

    2017-07-01

    We describe the development, accuracy, and efficiency of an automation package for molecular simulation, the large-scale atomic/molecular massively parallel simulator (LAMMPS) integrated materials engine (LIME). Heuristics and algorithms employed for equation of state (EOS) calculation using a particle-based model of a molecular crystal, hexahydro-1,3,5-trinitro-s-triazine (RDX), are described in detail. The simulation method for the particle-based model is energy-conserving dissipative particle dynamics, but the techniques used in LIME are generally applicable to molecular dynamics simulations with a variety of particle-based models. The newly created tool set is tested through use of its EOS data in plate impact and Taylor anvil impact continuum simulations of solid RDX. The coarse-grain model results from LIME provide an approach to bridge the scales from atomistic simulations to continuum simulations.

  1. Numerically robust and efficient nonlocal electron transport in 2D DRACO simulations

    NASA Astrophysics Data System (ADS)

    Cao, Duc; Chenhall, Jeff; Moses, Greg; Delettrez, Jacques; Collins, Tim

    2013-10-01

    An improved implicit algorithm based on the Schurtz, Nicolai and Busquet (SNB) algorithm for nonlocal electron transport is presented. Validation with direct drive shock timing experiments and verification against the Goncharov nonlocal model in 1D LILAC simulations demonstrate the viability of this efficient algorithm for producing 2D Lagrangian radiation hydrodynamics direct drive simulations. Additionally, simulations provide strong incentive to further modify key parameters within the SNB theory, namely the "mean free path." An example 2D polar drive simulation to study 2D effects of the nonlocal flux as well as mean free path modifications will also be presented. This research was supported by the University of Rochester Laboratory for Laser Energetics.

  2. Molecular epidemiology biomarkers-Sample collection and processing considerations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Nina T.; Pfleger, Laura; Berger, Eileen

    2005-08-07

    Biomarker studies require processing and storage of numerous biological samples with the goals of obtaining a large amount of information and minimizing future research costs. An efficient study design includes provisions for processing of the original samples, such as cryopreservation, DNA isolation, and preparation of specimens for exposure assessment. Use of standard, two-dimensional, and nanobarcodes and customized electronic databases assures efficient management of large sample collections and tracking of the results of data analyses. Standard operating procedures and quality control plans help to protect sample quality and to assure validity of the biomarker data. Specific state, federal and international regulations are in place regarding research with human samples, governing areas including custody, safety of handling, and transport of human samples. Appropriate informed consent must be obtained from the study subjects prior to sample collection and confidentiality of results maintained. Finally, examples of three biorepositories of different scale (European Cancer Study; National Cancer Institute; and School of Public Health Biorepository, University of California, Berkeley) are used to illustrate challenges faced by investigators and the ways to overcome them. New software and biorepository technologies are being developed by many companies that will help to bring biological banking to a new level required by molecular epidemiology of the 21st century.

  3. Solving wood chip transport problems with computer simulation.

    Treesearch

    Dennis P. Bradley; Sharon A. Winsauer

    1976-01-01

    Efficient chip transport operations are difficult to achieve due to frequent and often unpredictable changes in distance to market, chipping rate, time spent at the mill, and equipment costs. This paper describes a computer simulation model that allows a logger to design an efficient transport system in response to these changing factors.

  4. Predicting variation in subject thermal response during transcranial magnetic resonance guided focused ultrasound surgery: Comparison in seventeen subject datasets.

    PubMed

    Vyas, Urvi; Ghanouni, Pejman; Halpern, Casey H; Elias, Jeff; Pauly, Kim Butts

    2016-09-01

    In transcranial magnetic resonance-guided focused ultrasound (tcMRgFUS) treatments, the acoustic and spatial heterogeneity of the skull cause reflection, absorption, and scattering of the acoustic beams. These effects depend on skull-specific parameters and can lead to patient-specific thermal responses to the same transducer power. In this work, the authors develop a simulation tool to help predict these different experimental responses using 3D heterogeneous tissue models based on the subject CT images. The authors then validate and compare the predicted skull efficiencies to an experimental metric based on the subject thermal responses during tcMRgFUS treatments in a dataset of seventeen human subjects. Seventeen human head CT scans were used to create tissue acoustic models, simulating the effects of reflection, absorption, and scattering of the acoustic beam as it propagates through a heterogeneous skull. The hybrid angular spectrum technique was used to model the acoustic beam propagation of the InSightec ExAblate 4000 head transducer for each subject, yielding maps of the specific absorption rate (SAR). The simulation assumed the transducer was geometrically focused to the thalamus of each subject, and the focal SAR at the target was used as a measure of the simulated skull efficiency. Experimental skull efficiency for each subject was calculated using the thermal temperature maps from the tcMRgFUS treatments. Axial temperature images (with no artifacts) were reconstructed with a single baseline, corrected using a referenceless algorithm. The experimental skull efficiency was calculated by dividing the reconstructed temperature rise 8.8 s after sonication by the applied acoustic power. The simulated skull efficiency using individual-specific heterogeneous models predicts well (R(2) = 0.84) the experimental energy efficiency. 
This paper presents a simulation model to predict the variation in thermal responses measured in clinical tcMRgFUS treatments while being computationally feasible.

  5. Predicting variation in subject thermal response during transcranial magnetic resonance guided focused ultrasound surgery: Comparison in seventeen subject datasets

    PubMed Central

    Vyas, Urvi; Ghanouni, Pejman; Halpern, Casey H.; Elias, Jeff; Pauly, Kim Butts

    2016-01-01

    Purpose: In transcranial magnetic resonance-guided focused ultrasound (tcMRgFUS) treatments, the acoustic and spatial heterogeneity of the skull cause reflection, absorption, and scattering of the acoustic beams. These effects depend on skull-specific parameters and can lead to patient-specific thermal responses to the same transducer power. In this work, the authors develop a simulation tool to help predict these different experimental responses using 3D heterogeneous tissue models based on the subject CT images. The authors then validate and compare the predicted skull efficiencies to an experimental metric based on the subject thermal responses during tcMRgFUS treatments in a dataset of seventeen human subjects. Methods: Seventeen human head CT scans were used to create tissue acoustic models, simulating the effects of reflection, absorption, and scattering of the acoustic beam as it propagates through a heterogeneous skull. The hybrid angular spectrum technique was used to model the acoustic beam propagation of the InSightec ExAblate 4000 head transducer for each subject, yielding maps of the specific absorption rate (SAR). The simulation assumed the transducer was geometrically focused to the thalamus of each subject, and the focal SAR at the target was used as a measure of the simulated skull efficiency. Experimental skull efficiency for each subject was calculated using the thermal temperature maps from the tcMRgFUS treatments. Axial temperature images (with no artifacts) were reconstructed with a single baseline, corrected using a referenceless algorithm. The experimental skull efficiency was calculated by dividing the reconstructed temperature rise 8.8 s after sonication by the applied acoustic power. Results: The simulated skull efficiency using individual-specific heterogeneous models predicts well (R2 = 0.84) the experimental energy efficiency. 
Conclusions: This paper presents a simulation model to predict the variation in thermal responses measured in clinical tcMRgFUS treatments while being computationally feasible. PMID:27587047

  6. Predicting variation in subject thermal response during transcranial magnetic resonance guided focused ultrasound surgery: Comparison in seventeen subject datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vyas, Urvi, E-mail: urvi.vyas@gmail.com; Ghanouni,

    Purpose: In transcranial magnetic resonance-guided focused ultrasound (tcMRgFUS) treatments, the acoustic and spatial heterogeneity of the skull cause reflection, absorption, and scattering of the acoustic beams. These effects depend on skull-specific parameters and can lead to patient-specific thermal responses to the same transducer power. In this work, the authors develop a simulation tool to help predict these different experimental responses using 3D heterogeneous tissue models based on the subject CT images. The authors then validate and compare the predicted skull efficiencies to an experimental metric based on the subject thermal responses during tcMRgFUS treatments in a dataset of seventeen human subjects. Methods: Seventeen human head CT scans were used to create tissue acoustic models, simulating the effects of reflection, absorption, and scattering of the acoustic beam as it propagates through a heterogeneous skull. The hybrid angular spectrum technique was used to model the acoustic beam propagation of the InSightec ExAblate 4000 head transducer for each subject, yielding maps of the specific absorption rate (SAR). The simulation assumed the transducer was geometrically focused to the thalamus of each subject, and the focal SAR at the target was used as a measure of the simulated skull efficiency. Experimental skull efficiency for each subject was calculated using the thermal temperature maps from the tcMRgFUS treatments. Axial temperature images (with no artifacts) were reconstructed with a single baseline, corrected using a referenceless algorithm. The experimental skull efficiency was calculated by dividing the reconstructed temperature rise 8.8 s after sonication by the applied acoustic power. Results: The simulated skull efficiency using individual-specific heterogeneous models predicts well (R² = 0.84) the experimental energy efficiency. 
Conclusions: This paper presents a simulation model to predict the variation in thermal responses measured in clinical tcMRgFUS treatments while being computationally feasible.

  7. Accelerated simulation of stochastic particle removal processes in particle-resolved aerosol models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, J.H.; Michelotti, M.D.; Riemer, N.

    2016-10-01

    Stochastic particle-resolved methods have proven useful for simulating multi-dimensional systems such as composition-resolved aerosol size distributions. While particle-resolved methods have substantial benefits for highly detailed simulations, these techniques suffer from high computational cost, motivating efforts to improve their algorithmic efficiency. Here we formulate an algorithm for accelerating particle removal processes by aggregating particles of similar size into bins. We present the Binned Algorithm for particle removal processes and analyze its performance with application to the atmospherically relevant process of aerosol dry deposition. We show that the Binned Algorithm can dramatically improve the efficiency of particle removals, particularly for low removal rates, and that computational cost is reduced without introducing additional error. In simulations of aerosol particle removal by dry deposition under atmospherically relevant conditions, we demonstrate an approximately 50-fold increase in algorithm efficiency.
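The binning idea can be sketched as follows (an illustrative sketch, not the paper's implementation; the size-dependent removal-rate function and all parameters are hypothetical). Within each bin the maximum rate bounds every particle's rate, so removal candidates can be drawn against the bound and thinned by rejection while preserving each particle's exact removal probability:

```python
import math
import random

def removal_rate(diameter_um):
    """Hypothetical size-dependent dry-deposition rate [1/s]."""
    return 1e-4 / diameter_um

def binned_removal(particles, dt, n_bins, rng):
    """Remove particles with probability 1 - exp(-rate*dt), using per-bin
    rate bounds plus rejection so exact probabilities are preserved."""
    lo, hi = min(particles), max(particles)
    bins = [[] for _ in range(n_bins)]
    for d in particles:
        i = min(int((d - lo) / (hi - lo) * n_bins), n_bins - 1)
        bins[i].append(d)
    survivors = []
    for b in bins:
        if not b:
            continue
        p_max = 1.0 - math.exp(-max(removal_rate(d) for d in b) * dt)
        for d in b:
            p = 1.0 - math.exp(-removal_rate(d) * dt)
            # candidate draw at the bin's bounding probability, thinned by
            # p/p_max; a production version would instead draw the candidate
            # count from a binomial to avoid touching every particle
            if rng.random() < p_max and rng.random() < p / p_max:
                continue  # particle removed
            survivors.append(d)
    return survivors

rng = random.Random(0)
particles = [rng.uniform(0.01, 1.0) for _ in range(10_000)]  # diameters [um]
left = binned_removal(particles, dt=60.0, n_bins=20, rng=rng)
```

The efficiency gain in the real algorithm comes from sampling how many removals occur per bin rather than testing every particle, which is why the benefit is largest at low removal rates.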

  8. Ergonomics and simulation-based approach in improving facility layout

    NASA Astrophysics Data System (ADS)

    Abad, Jocelyn D.

    2018-02-01

    The use of simulation-based techniques in facility layout has been a popular choice in industry due to their convenience and efficient generation of results. Nevertheless, the solutions generated are not capable of addressing delays due to workers' health and safety, which significantly impact overall operational efficiency. It is, therefore, critical to incorporate ergonomics in facility design. In this study, workstation analysis was incorporated into Promodel simulation to improve the facility layout of a garment manufacturing facility. To test the effectiveness of the method, existing and improved facility designs were measured using comprehensive risk level, efficiency, and productivity. Results indicated that the improved facility layout yielded a decrease in comprehensive risk level and rapid upper limb assessment score, a 78% increase in efficiency, and a 194% increase in productivity compared to the existing design, proving that the approach is effective in attaining overall facility design improvement.

  9. Improving the Optical Trapping Efficiency in the 225Ra Electric Dipole Moment Experiment via Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Fromm, Steven

    2017-09-01

    In an effort to study and improve the optical trapping efficiency of the 225Ra Electric Dipole Moment experiment, a fully parallelized Monte Carlo simulation of the laser cooling and trapping apparatus was created at Argonne National Laboratory and is now maintained and upgraded at Michigan State University. The simulation allows us to study optimizations and upgrades without having to use limited quantities of 225Ra (15-day half-life) in the experiment's apparatus. It predicts a trapping efficiency that differs from the observed value in the experiment by approximately a factor of thirty. The effects of varying oven geometry, background gas interactions, laboratory magnetic fields, MOT laser beam configurations and laser frequency noise were studied and ruled out as causes of the discrepancy between measured and predicted values of the overall trapping efficiency. Presently, the simulation is being used to help optimize a planned blue slower laser upgrade to the experiment's apparatus, which will increase the overall trapping efficiency by up to two orders of magnitude. This work is supported by Michigan State University, the Director's Research Scholars Program at the National Superconducting Cyclotron Laboratory, and the U.S. DOE, Office of Science, Office of Nuclear Physics, under Contract DE-AC02-06CH11357.

  10. Relativistic MHD simulations of collision-induced magnetic dissipation in Poynting-flux-dominated jets/outflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Wei

    2015-07-21

    The question of the energy composition of the jets/outflows in high-energy astrophysical systems, e.g. GRBs, AGNs, is taken up first: Matter-flux-dominated (MFD), σ < 1, and/or Poynting-flux-dominated (PFD), σ > 1? The standard fireball IS model and dissipative photosphere model are MFD, while the ICMART (Internal-Collision-induced MAgnetic Reconnection and Turbulence) model is PFD. Motivated by the ICMART model and other relevant problems, such as the "jets in a jet" model of AGNs, the author investigates the models from the EMF energy dissipation efficiency, relativistic outflow generation, and σ evolution points of view, and simulates collisions between high-σ blobs to mimic the interactions inside the PFD jets/outflows by using a 3D SRMHD code which solves the conservative form of the ideal MHD equations. σ_b,f is calculated from the simulation results (threshold = 1). The efficiency obtained from this hybrid method is similar to the efficiency obtained from the energy evolution of the simulations (35.2%). Efficiency is nearly σ independent, which is also confirmed by the hybrid method. σ_b,i - σ_b,f provides an interesting linear relationship. Results of several parameter studies of EMF energy dissipation efficiency are shown.

  11. Efficient parallel simulation of CO2 geologic sequestration in saline aquifers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Keni; Doughty, Christine; Wu, Yu-Shu

    2007-01-01

    An efficient parallel simulator for large-scale, long-term CO2 geologic sequestration in saline aquifers has been developed. The parallel simulator is a three-dimensional, fully implicit model that solves large, sparse linear systems arising from discretization of the partial differential equations for mass and energy balance in porous and fractured media. The simulator is based on the ECO2N module of the TOUGH2 code and inherits all the process capabilities of the single-CPU TOUGH2 code, including a comprehensive description of the thermodynamics and thermophysical properties of H2O-NaCl-CO2 mixtures, modeling single- and/or two-phase isothermal or non-isothermal flow processes, two-phase mixtures, fluid phases appearing or disappearing, as well as salt precipitation or dissolution. The new parallel simulator uses MPI for parallel implementation, the METIS software package for simulation domain partitioning, and the iterative parallel linear solver package Aztec for solving linear equations by multiple processors. In addition, the parallel simulator has been implemented with an efficient communication scheme. Test examples show that a linear or super-linear speedup can be obtained on Linux clusters as well as on supercomputers. Because of the significant improvement in both simulation time and memory requirement, the new simulator provides a powerful tool for tackling larger scale and more complex problems than can be solved by single-CPU codes. A high-resolution simulation example is presented that models buoyant convection, induced by a small increase in brine density caused by dissolution of CO2.

  12. A sup-score test for the cure fraction in mixture models for long-term survivors.

    PubMed

    Hsu, Wei-Wen; Todem, David; Kim, KyungMann

    2016-12-01

    The evaluation of cure fractions in oncology research under the well-known cure rate model has attracted considerable attention in the literature, but most of the existing testing procedures have relied on restrictive assumptions. A common assumption has been to restrict the cure fraction to a constant under alternatives to homogeneity, thereby neglecting any information from covariates. This article extends the literature by developing a score-based statistic that incorporates covariate information to detect cure fractions, with the existing testing procedure serving as a special case. A complication of this extension, however, is that the implied hypotheses are not typical and standard regularity conditions to conduct the test may not even hold. Using empirical processes arguments, we construct a sup-score test statistic for cure fractions and establish its limiting null distribution as a functional of mixtures of chi-square processes. In practice, we suggest a simple resampling procedure to approximate this limiting distribution. Our simulation results show that the proposed test can greatly improve efficiency over tests that neglect the heterogeneity of the cure fraction under the alternative. The practical utility of the methodology is illustrated using ovarian cancer survival data with long-term follow-up from the Surveillance, Epidemiology, and End Results registry. © 2016, The International Biometric Society.

  13. Efficient simulation of voxelized phantom in GATE with embedded SimSET multiple photon history generator.

    PubMed

    Lin, Hsin-Hon; Chuang, Keh-Shih; Lin, Yi-Hsing; Ni, Yu-Ching; Wu, Jay; Jan, Meei-Ling

    2014-10-21

    GEANT4 Application for Tomographic Emission (GATE) is a powerful Monte Carlo simulator that combines the advantages of the general-purpose GEANT4 simulation code and the specific software tool implementations dedicated to emission tomography. However, the detailed physical modelling of GEANT4 is highly computationally demanding, especially when tracking particles through voxelized phantoms. To circumvent the relatively slow simulation of voxelized phantoms in GATE, another efficient Monte Carlo code can be used to simulate photon interactions and transport inside a voxelized phantom. The simulation system for emission tomography (SimSET), a dedicated Monte Carlo code for PET/SPECT systems, is well-known for its efficiency in simulation of voxel-based objects. An efficient Monte Carlo workflow integrating GATE and SimSET for simulating pinhole SPECT has been proposed to improve voxelized phantom simulation. Although the workflow achieves a desirable increase in speed, it sacrifices the ability to simulate decaying radioactive sources such as non-pure positron emitters or multiple emission isotopes with complex decay schemes and lacks the modelling of time-dependent processes due to the inherent limitations of the SimSET photon history generator (PHG). Moreover, a large volume of disk storage is needed to store the huge temporal photon history file produced by SimSET that must be transported to GATE. In this work, we developed a multiple photon emission history generator (MPHG) based on SimSET/PHG to support a majority of the medically important positron emitters. We incorporated the new generator codes inside GATE to improve the simulation efficiency of voxelized phantoms in GATE, while eliminating the need for the temporal photon history file. The validation of this new code based on a MicroPET R4 system was conducted for (124)I and (18)F with mouse-like and rat-like phantoms. 
Comparison of GATE/MPHG with GATE/GEANT4 indicated there is a slight difference in energy spectra for energy below 50 keV due to the lack of x-ray simulation from (124)I decay in the new code. The spatial resolution, scatter fraction and count rate performance are in good agreement between the two codes. For the case studies of (18)F-NaF ((124)I-IAZG) using MOBY phantom with 1  ×  1 × 1 mm(3) voxel sizes, the results show that GATE/MPHG can achieve acceleration factors of approximately 3.1 × (4.5 ×), 6.5 × (10.7 ×) and 9.5 × (31.0 ×) compared with GATE using the regular navigation method, the compressed voxel method and the parameterized tracking technique, respectively. In conclusion, the implementation of MPHG in GATE allows for improved efficiency of voxelized phantom simulations and is suitable for studying clinical and preclinical imaging.

  14. ANNarchy: a code generation approach to neural simulations on parallel hardware

    PubMed Central

    Vitay, Julien; Dinkelbach, Helge Ü.; Hamker, Fred H.

    2015-01-01

    Many modern neural simulators focus on the simulation of networks of spiking neurons on parallel hardware. Another important framework in computational neuroscience, rate-coded neural networks, is mostly difficult or impossible to implement using these simulators. We present here the ANNarchy (Artificial Neural Networks architect) neural simulator, which allows one to easily define and simulate rate-coded and spiking networks, as well as combinations of both. The interface in Python has been designed to be close to the PyNN interface, while the definition of neuron and synapse models can be specified using an equation-oriented mathematical description similar to the Brian neural simulator. This information is used to generate C++ code that will efficiently perform the simulation on the chosen parallel hardware (multi-core system or graphical processing unit). Several numerical methods are available to transform ordinary differential equations into efficient C++ code. We compare the parallel performance of the simulator to existing solutions. PMID:26283957

  15. Uncertainty Quantification in Simulations of Epidemics Using Polynomial Chaos

    PubMed Central

    Santonja, F.; Chen-Charpentier, B.

    2012-01-01

    Mathematical models based on ordinary differential equations are a useful tool to study the processes involved in epidemiology. Many models consider that the parameters are deterministic variables. But in practice, the transmission parameters present large variability and it is not possible to determine them exactly, so it is necessary to introduce randomness. In this paper, we present an application of the polynomial chaos approach to epidemiological mathematical models based on ordinary differential equations with random coefficients. Taking into account the variability of the transmission parameters of the model, this approach allows us to obtain an auxiliary system of differential equations, which is then integrated numerically to obtain the first- and second-order moments of the output stochastic processes. A sensitivity analysis based on the polynomial chaos approach is also performed to determine which parameters have the greatest influence on the results. As an example, we apply the approach to an obesity epidemic model. PMID:22927889
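A minimal non-intrusive version of this idea can be sketched with stochastic collocation, a quadrature-based relative of polynomial chaos: evaluate a simple epidemic model at Gauss-Hermite nodes of the random transmission rate and combine the results into moments. All parameter values are hypothetical, and the model is a generic SIR rather than the paper's obesity model:

```python
import math

def sir_final_infected(beta, gamma=0.1, days=100, dt=0.1):
    """Forward-Euler SIR; returns the infected fraction at the final time."""
    s, i = 0.99, 0.01
    for _ in range(int(days / dt)):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        s, i = s + ds * dt, i + di * dt
    return i

# 3-point Gauss-Hermite rule for a standard normal variable xi
# (probabilists' convention): exact for polynomials up to degree 5.
nodes = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

mu_beta, sigma_beta = 0.3, 0.03  # beta ~ N(mu, sigma^2), hypothetical values
vals = [sir_final_infected(mu_beta + sigma_beta * x) for x in nodes]
mean = sum(w * v for w, v in zip(weights, vals))
var = sum(w * (v - mean) ** 2 for w, v in zip(weights, vals))
```

The intrusive approach in the paper instead expands the state variables themselves in the chaos basis and derives an auxiliary ODE system for the expansion coefficients; the collocation sketch above reaches the same first two moments by sampling at quadrature nodes.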

  16. Interactions among human behavior, social networks, and societal infrastructures: A Case Study in Computational Epidemiology

    NASA Astrophysics Data System (ADS)

    Barrett, Christopher L.; Bisset, Keith; Chen, Jiangzhuo; Eubank, Stephen; Lewis, Bryan; Kumar, V. S. Anil; Marathe, Madhav V.; Mortveit, Henning S.

    Human behavior, social networks, and civil infrastructure are closely intertwined. Understanding their co-evolution is critical for designing public policies and decision support for disaster planning. For example, human behaviors and the day-to-day activities of individuals create dense social interactions that are characteristic of modern urban societies. These dense social networks provide a perfect fabric for fast, uncontrolled disease propagation. Conversely, people's behavior in response to public policies, and their perception of how the crisis is unfolding as a result of the disease outbreak, can dramatically alter the normally stable social interactions. Effective planning and response strategies must take these complicated interactions into account. In this chapter, we describe a computer-simulation-based approach to studying these issues, using public health and computational epidemiology as an illustrative example. We also formulate game-theoretic and stochastic optimization problems that capture many of the problems that we study empirically.

  17. Cholera outbreak in Homa Bay County, Kenya, 2015.

    PubMed

    Githuku, Jane Njoki; Boru, Waqo Gufu; Hall, Casey Daniel; Gura, Zeinab; Oyugi, Elvis; Kishimba, Rogath Saika; Semali, Innocent; Farhat, Ghada Nadim; Mattie Park, Meeyoung

    2017-01-01

    Cholera is among the re-emerging diseases in Kenya. Beginning in December 2014, a persistent outbreak occurred involving 29 out of the 47 counties. Homa Bay County in Western Kenya was among the first counties to report cholera cases from January to April 2015. This case study is based on an outbreak investigation conducted by FELTP residents in Homa Bay County in February 2015. It simulates an outbreak investigation including laboratory confirmation, active case finding, descriptive epidemiology and implementation of control measures. This case study is designed for the training of basic-level field epidemiology trainees or any other health care workers working in public health-related fields. It can be administered in 2-3 hours. Used as adjunct training material, the case study provides the trainees with competencies in investigating an outbreak in preparation for the actual real-life experience of such outbreaks.

  18. Superinfection Behaviors on Scale-Free Networks with Competing Strains

    NASA Astrophysics Data System (ADS)

    Wu, Qingchu; Small, Michael; Liu, Huaxiang

    2013-02-01

    This paper considers the epidemiology of two strains (I, J) of a disease spreading through a population represented by a scale-free network. The epidemiological model is SIS and the two strains have different reproductive numbers. Superinfection means that strain I can infect individuals already infected with strain J, replacing the strain J infection. Individuals infected with strain I cannot be infected with strain J. The model is set up as a system of ordinary differential equations, and the stability of the disease-free, marginal strain I, marginal strain J, and coexistence equilibria is assessed using linear stability analysis, supported by simulations. The main conclusion is that superinfection, as modeled in this paper, can allow strain I to coexist with strain J even when it has a lower basic reproductive number. Most strikingly, it can allow strain I to persist even when its reproductive number is less than 1.
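A mean-field (well-mixed) sketch of the superinfection dynamics described above can be written in a few lines; all rates here are hypothetical illustration values, and the simplified ODE version ignores the scale-free network structure that the paper actually studies:

```python
def two_strain_sis(beta_i=0.25, beta_j=0.35, gamma=0.1, delta=0.5,
                   i0=0.01, j0=0.01, days=2000, dt=0.05):
    """Forward-Euler integration of a two-strain SIS model in which strain I
    superinfects strain-J hosts at rate delta * beta_i, while strain J
    cannot displace strain I (all parameter values hypothetical)."""
    i, j = i0, j0
    for _ in range(int(days / dt)):
        s = 1.0 - i - j                                   # susceptibles
        di = beta_i * s * i + delta * beta_i * i * j - gamma * i
        dj = beta_j * s * j - delta * beta_i * i * j - gamma * j
        i, j = i + di * dt, j + dj * dt
    return i, j

i_eq, j_eq = two_strain_sis()
```

The `delta * beta_i * i * j` term that moves hosts from J to I is what can sustain strain I at equilibrium even when its basic reproductive number is the lower of the two.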

  19. Simulation to assess the efficacy of US airport entry screening of passengers for pandemic influenza

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mcmahon, Benjamin

    2009-01-01

    We present our methodology and a stochastic discrete-event simulation developed to model the screening of passengers for pandemic influenza at US port-of-entry airports. Our model uniquely combines epidemiological modelling, evolving infected states and conditions of passengers over time, and operational considerations of screening in a single simulation. The simulation begins with international aircraft arrivals to the US. Passengers are then randomly assigned to one of three states: not infected, infected with pandemic influenza, or infected with another respiratory illness. Passengers then pass through various screening layers (i.e. pre-departure screening, en route screening, primary screening and secondary screening) and ultimately exit the system. We track the status of each passenger over time, with a special emphasis on false negatives (i.e. passengers who are infected with pandemic influenza but are not identified as such), as these passengers pose a significant threat: they could unknowingly spread the pandemic influenza virus throughout our nation.
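The screening pipeline can be sketched as a simple stochastic simulation; the state probabilities and per-layer detection sensitivities below are hypothetical placeholders, not the study's calibrated values:

```python
import random

STATES = ("not_infected", "pandemic_flu", "other_respiratory")
STATE_P = (0.97, 0.01, 0.02)   # hypothetical passenger state probabilities

# hypothetical per-layer probability of being flagged, by passenger state
LAYERS = {
    "pre_departure": {"not_infected": 0.001, "pandemic_flu": 0.10, "other_respiratory": 0.08},
    "en_route":      {"not_infected": 0.001, "pandemic_flu": 0.05, "other_respiratory": 0.04},
    "primary":       {"not_infected": 0.005, "pandemic_flu": 0.30, "other_respiratory": 0.25},
    "secondary":     {"not_infected": 0.002, "pandemic_flu": 0.50, "other_respiratory": 0.40},
}

def screen_passengers(n, rng):
    """Push n arriving passengers through the four screening layers and
    count those flagged and the pandemic-flu false negatives."""
    false_neg = flagged = 0
    for _ in range(n):
        state = rng.choices(STATES, weights=STATE_P)[0]
        caught = False
        for layer in ("pre_departure", "en_route", "primary", "secondary"):
            if rng.random() < LAYERS[layer][state]:
                caught = True
                break
        if caught:
            flagged += 1
        elif state == "pandemic_flu":
            false_neg += 1   # infected but passed every layer undetected
    return flagged, false_neg

rng = random.Random(42)
flagged, false_neg = screen_passengers(100_000, rng)
```

Because the layers act in sequence, the false-negative fraction among infected passengers is the product of the per-layer miss probabilities, which is the quantity the study tracks most closely.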

  20. An infectious way to teach students about outbreaks.

    PubMed

    Cremin, Íde; Watson, Oliver; Heffernan, Alastair; Imai, Natsuko; Ahmed, Norin; Bivegete, Sandra; Kimani, Teresia; Kyriacou, Demetris; Mahadevan, Preveina; Mustafa, Rima; Pagoni, Panagiota; Sophiea, Marisa; Whittaker, Charlie; Beacroft, Leo; Riley, Steven; Fisher, Matthew C

    2018-06-01

    The study of infectious disease outbreaks is required to train today's epidemiologists. A typical way to introduce and explain key epidemiological concepts is through the analysis of a historical outbreak. There are, however, few training options that explicitly utilise real-time simulated stochastic outbreaks where the participants themselves comprise the dataset they subsequently analyse. In this paper, we present a teaching exercise in which an infectious disease outbreak is simulated over a five-day period and subsequently analysed. We iteratively developed the teaching exercise to offer additional insight into analysing an outbreak. An R package for visualisation, analysis and simulation of the outbreak data was developed to accompany the practical to reinforce learning outcomes. Computer simulations of the outbreak revealed deviations from observed dynamics, highlighting how simplifying assumptions conventionally made in mathematical models often differ from reality. Here we provide a pedagogical tool for others to use and adapt in their own settings. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
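The kind of stochastic outbreak the exercise generates can be sketched as a branching process (written here in Python rather than the authors' R package; the reproduction number, the Poisson offspring distribution, and the five-day window are illustrative assumptions):

```python
import math
import random

def poisson(lam, rng):
    """Poisson draw by Knuth's inversion method (fine for small lambda)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_outbreak(r0, days, seed_cases, rng):
    """Daily case counts from a branching process: each case infects a
    Poisson(r0) number of new cases on the following day."""
    cases_per_day = [seed_cases]
    for _ in range(days - 1):
        cases_per_day.append(sum(poisson(r0, rng)
                                 for _ in range(cases_per_day[-1])))
    return cases_per_day

rng = random.Random(7)
epidemic_curve = simulate_outbreak(r0=1.8, days=5, seed_cases=1, rng=rng)
```

Comparing many such simulated curves against the single realised classroom outbreak illustrates the paper's point that simplified model assumptions often deviate from observed dynamics.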

  1. [The truth and present uncertainty about mad cow disease].

    PubMed

    Suárez Fernández, G

    2001-01-01

    A historical review is made of the spongiform encephalopathies, which affect both animals and man. This serves as the basis for an epidemiological and predictive analysis of this type of disease, especially Bovine Spongiform Encephalopathy (BSE) as a present health problem. The scientific certainties, such as the prion theory (PrPc-PrPsc), the low natural infectivity of this group of diseases, the high dose of prions necessary to produce the experimental disease, the species barrier or specificity, the individual susceptibility due to genetic traits, and the low transmission efficiency of the oral route compared with the parenteral route, agree with the epidemiological observations of human cases of variant Creutzfeldt-Jakob disease (vCJD), whose incidence is 0.1 cases per million inhabitants per year. Predictions of the present and future course of BSE should not be alarmist, given the certainties that we know.

  2. Comparison of double-locus sequence typing (DLST) and multilocus sequence typing (MLST) for the investigation of Pseudomonas aeruginosa populations.

    PubMed

    Cholley, Pascal; Stojanov, Milos; Hocquet, Didier; Thouverez, Michelle; Bertrand, Xavier; Blanc, Dominique S

    2015-08-01

    Reliable molecular typing methods are necessary to investigate the epidemiology of bacterial pathogens. Reference methods such as multilocus sequence typing (MLST) and pulsed-field gel electrophoresis (PFGE) are costly and time consuming. Here, we compared our newly developed double-locus sequence typing (DLST) method for Pseudomonas aeruginosa to MLST and PFGE on a collection of 281 isolates. DLST was as discriminatory as MLST and was able to recognize "high-risk" epidemic clones. Both methods were highly congruent. Not surprisingly, a higher discriminatory power was observed with PFGE. In conclusion, being a simple method (single-strand sequencing of only 2 loci), DLST is valuable as a first-line typing tool for epidemiological investigations of P. aeruginosa. Coupled to a more discriminant method like PFGE or whole genome sequencing, it might represent an efficient typing strategy to investigate or prevent outbreaks. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Measuring molecular biomarkers in epidemiologic studies: laboratory techniques and biospecimen considerations.

    PubMed

    Erickson, Heidi S

    2012-09-28

    The future of personalized medicine depends on the ability to efficiently and rapidly elucidate a reliable set of disease-specific molecular biomarkers. High-throughput molecular biomarker analysis methods have been developed to identify disease risk, diagnostic, prognostic, and therapeutic targets in human clinical samples. Currently, high throughput screening allows us to analyze thousands of markers from one sample or one marker from thousands of samples and will eventually allow us to analyze thousands of markers from thousands of samples. Unfortunately, the inherent nature of current high throughput methodologies, clinical specimens, and cost of analysis is often prohibitive for extensive high throughput biomarker analysis. This review summarizes the current state of high throughput biomarker screening of clinical specimens applicable to genetic epidemiology and longitudinal population-based studies with a focus on considerations related to biospecimens, laboratory techniques, and sample pooling. Copyright © 2012 John Wiley & Sons, Ltd.

  4. Impact of committed individuals on vaccination behavior

    NASA Astrophysics Data System (ADS)

    Liu, Xiao-Tao; Wu, Zhi-Xi; Zhang, Lianzhong

    2012-11-01

    We study how the presence of committed vaccinators, a small fraction of individuals who consistently hold the vaccinating strategy and are immune to influence, impacts vaccination dynamics in well-mixed and spatially structured populations. For this purpose, we develop an epidemiological game-theoretic model of flu-like vaccination by integrating an epidemiological process into a simple agent-based model of adaptive learning, where individuals (except the committed ones) use anecdotal evidence to estimate the costs and benefits of vaccination. We show that the committed vaccinators, acting as “steadfast role models” in the population, can efficiently prevent the clustering of susceptible individuals and stimulate other imitators to take up vaccination, thereby promoting vaccine uptake. We substantiate our findings with comparative studies of our model on a full lattice and on a randomly diluted one. Our work is expected to provide valuable information for decision-making and the design of more effective disease-control strategies.
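The adaptive-learning dynamics described above can be sketched for a well-mixed population. The sketch below uses a simplified payoff scheme (vaccination cost vs. an infection risk proportional to the unvaccinated fraction) and a Fermi imitation rule; these particular choices, and all parameter values, are assumptions for illustration, not the record's exact model.

```python
import math
import random

def season(strategies, committed, cost, rng):
    """One flu season: vaccinated agents pay `cost`; unvaccinated agents
    risk infection (payoff -1) with probability equal to the unvaccinated
    fraction; then non-committed agents imitate a random other agent."""
    n = len(strategies)
    unvacc = strategies.count(False) / n
    payoff = [-cost if v else (-1.0 if rng.random() < unvacc else 0.0)
              for v in strategies]
    new = list(strategies)
    for idx in range(n):
        if idx in committed:
            continue                       # committed vaccinators never switch
        other = rng.randrange(n)
        # Fermi rule: adopt the other's strategy with a probability that
        # grows with the payoff advantage of the other agent
        p = 1.0 / (1.0 + math.exp((payoff[idx] - payoff[other]) / 0.1))
        if rng.random() < p:
            new[idx] = strategies[other]
    return new

def run(n=200, n_committed=20, cost=0.3, seasons=50, seed=0):
    """Return final vaccine coverage after repeated seasons."""
    rng = random.Random(seed)
    committed = set(range(n_committed))
    strategies = [idx in committed for idx in range(n)]
    for _ in range(seasons):
        strategies = season(strategies, committed, cost, rng)
    return sum(strategies) / n
```

Because the committed set is skipped in the imitation step, coverage can never fall below the committed fraction, which is the mechanism the record credits for stabilising uptake.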

  5. Disease prevention--should we target obesity or sedentary lifestyle?

    PubMed

    Charansonney, Olivier L; Després, Jean-Pierre

    2010-08-01

    Obesity is a major health challenge facing the modern world. Some evidence points to obesity itself as the main driver of premature mortality. We propose that this view is oversimplified. For example, high levels of physical activity and cardiorespiratory fitness are associated with lower mortality, even in those who are overweight or obese. To address this issue, we combine epidemiological and physiological evidence in a new paradigm that integrates excess calorie intake, sedentary behavior, and a maladaptive response to stress. Human physiology is optimized to allow large distances to be covered on foot every day in order to find enough food to sustain brain metabolism. Furthermore, when the body is immobilized by an injury, it triggers efficient life-saving metabolic and inflammatory responses. Both these critical adaptations are, however, confounded by a sedentary lifestyle. The implications of these issues for clinical trial design and epidemiologic data analysis are discussed in this article.

  6. A survey of the transmission of infectious diseases/infections between wild and domestic ungulates in Europe

    PubMed Central

    2011-01-01

    The domestic animal/wildlife interface is becoming a global issue of growing interest. However, despite the expansion of studies on wildlife diseases, the epidemiological role of wild animals in the transmission of infectious diseases remains unclear most of the time. Multiple diseases affecting livestock have already been identified in wildlife, especially in wild ungulates. The first objective of this paper was to establish a list of infections already reported in European wild ungulates. For each disease/infection, three additional materials present examples already published, specifying the epidemiological role of the species as assigned by the authors. Furthermore, risk factors associated with interactions between wild and domestic animals and with emerging infectious diseases are summarized. Finally, the wildlife surveillance measures implemented in different European countries are presented. New research areas are proposed in order to provide efficient tools to prevent the transmission of diseases between wild ungulates and livestock. PMID:21635726

  7. Coevolution of patients and hospitals: how changing epidemiology and technological advances create challenges and drive organizational innovation.

    PubMed

    Lega, Federico; Calciolari, Stefano

    2012-01-01

    Over the last 20 years, hospitals have revised their organizational structures in response to new environmental pressures. Today, demographic and epidemiologic trends and recent technological advances call for new strategies to cope with ultra-elderly frail patients characterized by chronic conditions, high-severity health problems, and complex social situations. The main areas of change surround new ways of managing emerging clusters of patients whose needs are not efficiently or effectively met within traditional hospital organizations. Following the practitioner and academic literature, we first identify the most relevant clusters of new kinds of patients who represent an increasingly larger share of the hospital population in developed countries. Second, we propose a framework that synthesizes the major organizational innovations adopted by successful organizations around the world. We conclude by substantiating the trends of and the reasoning behind the prospective pattern of hospital organizational development.

  8. Wound and soft tissue infections of Serratia marcescens in patients receiving wound care: A health care-associated outbreak.

    PubMed

    Us, Ebru; Kutlu, Huseyin H; Tekeli, Alper; Ocal, Duygu; Cirpan, Sevilay; Memikoglu, Kemal O

    2017-04-01

    We describe a health care-associated Serratia marcescens outbreak of wound and soft tissue infection lasting approximately 11 months at Ankara University Ibni Sina Hospital. After identification of S marcescens strains from clinical and environmental samples and antimicrobial susceptibility testing, pulsed-field gel electrophoresis (PFGE) was performed to detect molecular epidemiologic relationships among the isolates. The strains isolated from the saline bottles used for wound cleansing in the wound care unit were found by PFGE to be 100% related to the strains from the outbreak patients' samples. Reuse of emptied bottles has no longer been allowed since the outbreak occurred. In addition, more efficient and frequent infection control training for hospital staff has been conducted. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  9. Numerical convergence improvements for porflow unsaturated flow simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, Greg

    2017-08-14

    Section 3.6 of SRNL (2016) discusses various PORFLOW code improvements to increase modeling efficiency, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision. This memorandum documents interaction with Analytic & Computational Research, Inc. (http://www.acricfd.com/default.htm) to improve numerical convergence efficiency using PORFLOW version 6.42 for unsaturated flow simulations.

  10. Betavoltaic p--n+-structure simulation

    NASA Astrophysics Data System (ADS)

    Urchuk, S. U.; Murashev, V. N.; Legotin, S. A.; Krasnov, A. A.; Rabinovich, O. I.; Kuzmina, K. A.; Omel'chenko, Y. K.; Osipov, U. V.; Didenko, S. I.

    2016-08-01

    To increase the efficiency of betavoltaic batteries, the output characteristics of p-n+ (n-p+) structures were simulated. Replacing p+-n structures with p-n+ and n-p+ structures allows the space-charge region to expand to the crystal surface, thereby reducing recombination losses in the heavily doped p+ layer and improving the conversion efficiency of betavoltaic elements.

  11. Efficient Green's Function Reaction Dynamics (GFRD) simulations for diffusion-limited, reversible reactions

    NASA Astrophysics Data System (ADS)

    Bashardanesh, Zahedeh; Lötstedt, Per

    2018-03-01

    In diffusion controlled reversible bimolecular reactions in three dimensions, a dissociation step is typically followed by multiple, rapid re-association steps slowing down the simulations of such systems. In order to improve the efficiency, we first derive an exact Green's function describing the rate at which an isolated pair of particles undergoing reversible bimolecular reactions and unimolecular decay separates beyond an arbitrarily chosen distance. Then the Green's function is used in an algorithm for particle-based stochastic reaction-diffusion simulations for prediction of the dynamics of biochemical networks. The accuracy and efficiency of the algorithm are evaluated using a reversible reaction and a push-pull chemical network. The computational work is independent of the rates of the re-associations.

  12. Efficient Monte Carlo Methods for Biomolecular Simulations.

    NASA Astrophysics Data System (ADS)

    Bouzida, Djamal

    A new approach to efficient Monte Carlo simulations of biological molecules is presented. By relaxing the usual restriction to Markov processes, we are able to optimize performance while dealing directly with the inhomogeneity and anisotropy inherent in these systems. The advantage of this approach is that we can introduce a wide variety of Monte Carlo moves to deal with complicated motions of the molecule, while maintaining full optimization at every step. This enables the use of a variety of collective rotational moves that relax long-wavelength modes. We were able to show by explicit simulations that the resulting algorithms substantially increase the speed of the simulation while reproducing the correct equilibrium behavior. This approach is particularly intended for simulations of macromolecules, although we expect it to be useful in other situations. The dynamic optimization of the new Monte Carlo methods makes them very suitable for simulated annealing experiments on all systems whose state space is continuous in general, and to the protein folding problem in particular. We introduce an efficient annealing schedule using preferential bias moves. Our simulated annealing experiments yield structures whose free energies were lower than the equilibrated X-ray structure, which leads us to believe that the empirical energy function used does not fully represent the interatomic interactions. Furthermore, we believe that the largest discrepancies involve the solvent effects in particular.
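For context, a generic Metropolis Monte Carlo loop with per-coordinate move-size tuning is sketched below. This is not the record's non-Markovian optimisation scheme; it only illustrates the general idea of adapting move widths toward a target acceptance rate (the 0.5 target, retuning interval, and harmonic test energy are all assumptions). Note that on-the-fly tuning perturbs strict detailed balance, which is the kind of relaxation of the Markov restriction the record alludes to.

```python
import math
import random

def metropolis(energy, x0, steps, target_acc=0.5, seed=0):
    """Metropolis sampling of exp(-energy(x)) with adaptive step widths."""
    rng = random.Random(seed)
    x = list(x0)
    widths = [0.5] * len(x)                # one trial-move width per coordinate
    e = energy(x)
    accepted = [0] * len(x)
    tried = [0] * len(x)
    for _ in range(steps):
        idx = rng.randrange(len(x))
        trial = list(x)
        trial[idx] += rng.uniform(-widths[idx], widths[idx])
        e_trial = energy(trial)
        tried[idx] += 1
        # Metropolis criterion: always accept downhill, else with Boltzmann weight
        if e_trial <= e or rng.random() < math.exp(e - e_trial):
            x, e = trial, e_trial
            accepted[idx] += 1
        if tried[idx] == 100:              # periodically retune this width
            rate = accepted[idx] / tried[idx]
            widths[idx] *= 1.1 if rate > target_acc else 0.9
            accepted[idx] = tried[idx] = 0
    return x, e
```

On a simple harmonic energy the sampler relaxes quickly from a high-energy start toward thermal values, which is the behaviour an annealing schedule then exploits.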

  13. An earth imaging camera simulation using wide-scale construction of reflectance surfaces

    NASA Astrophysics Data System (ADS)

    Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk

    2013-10-01

    Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques as well as cluster deployment of the simulator have enabled tuning and robust testing of image processing algorithms, and production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.

  14. Clinic Workflow Simulations using Secondary EHR Data

    PubMed Central

    Hribar, Michelle R.; Biermann, David; Read-Brown, Sarah; Reznick, Leah; Lombardi, Lorinna; Parikh, Mansi; Chamberlain, Winston; Yackel, Thomas R.; Chiang, Michael F.

    2016-01-01

    Clinicians today face increased patient loads, decreased reimbursements and potential negative productivity impacts of using electronic health records (EHR), but have little guidance on how to improve clinic efficiency. Discrete event simulation models are powerful tools for evaluating clinical workflow and improving efficiency, particularly when they are built from secondary EHR timing data. The purpose of this study is to demonstrate that these simulation models can be used for resource allocation decision making as well as for evaluating novel scheduling strategies in outpatient ophthalmology clinics. Key findings from this study are that: 1) secondary use of EHR timestamp data in simulation models represents clinic workflow, 2) simulations provide insight into the best allocation of resources in a clinic, 3) simulations provide critical information for schedule creation and decision making by clinic managers, and 4) simulation models built from EHR data are potentially generalizable. PMID:28269861
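A toy version of such a discrete-event clinic model can be sketched as a multi-provider queue. In practice the service-time distributions would be estimated from EHR timestamp data, as the record describes; here the exponential times and all parameter values are illustrative assumptions.

```python
import heapq
import random

def simulate_clinic(n_patients, n_providers, mean_interarrival,
                    mean_exam, seed=0):
    """Return average patient waiting time for an FCFS multi-provider clinic."""
    rng = random.Random(seed)
    arrivals, t = [], 0.0
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    free_at = [0.0] * n_providers          # min-heap of provider free times
    heapq.heapify(free_at)
    total_wait = 0.0
    for arrive in arrivals:
        available = heapq.heappop(free_at) # earliest-free provider
        start = max(arrive, available)
        total_wait += start - arrive
        heapq.heappush(free_at, start + rng.expovariate(1.0 / mean_exam))
    return total_wait / n_patients
```

Comparing runs with different `n_providers` is exactly the kind of resource-allocation question (finding 3 above) such a model lets a clinic manager explore before changing schedules.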

  15. Communication: Adaptive boundaries in multiscale simulations

    NASA Astrophysics Data System (ADS)

    Wagoner, Jason A.; Pande, Vijay S.

    2018-04-01

    Combined-resolution simulations are an effective way to study molecular properties across a range of length and time scales. These simulations can benefit from adaptive boundaries that allow the high-resolution region to adapt (change size and/or shape) as the simulation progresses. The number of degrees of freedom required to accurately represent even a simple molecular process can vary by several orders of magnitude throughout the course of a simulation, and adaptive boundaries react to these changes to include an appropriate but not excessive amount of detail. Here, we derive the Hamiltonian and distribution function for such a molecular simulation. We also design an algorithm that can efficiently sample the boundary as a new coordinate of the system. We apply this framework to a mixed explicit/continuum simulation of a peptide in solvent. We use this example to discuss the conditions necessary for a successful implementation of adaptive boundaries that is both efficient and accurate in reproducing molecular properties.

  16. Numerical and Experimental Investigation on a Thermo-Photovoltaic Module for Higher Efficiency Energy Generation

    NASA Astrophysics Data System (ADS)

    Karami-Lakeh, Hossein; Hosseini-Abardeh, Reza; Kaatuzian, Hassan

    2017-05-01

    One major problem of solar cells is the decrease in efficiency caused by the increase in temperature when operating under constant solar irradiation. Combining a solar cell with a thermoelectric generator is one method proposed to solve this problem. In this paper, the performance of a thermo-photovoltaic system is studied experimentally as well as through numerical simulation. In the experimental part, the design, manufacture and testing of a novel thermo-photovoltaic assembly are presented. Results for the assembled system showed that each one-degree (Celsius) reduction in the temperature of the solar cell under investigation yields about a 0.2 % increase in efficiency in comparison with the efficiency at the original temperature. The solar cell in the hybrid assembly achieved efficiencies of 8 % and 9.5 % under air cooling and water cooling, respectively, while the efficiency of a single cell under the same radiation conditions was 6 %. In the numerical part, the photo-thermoelectric performance of the system was analyzed using two methods for evaluating thermoelectric performance: average properties and the finite element method. The simulation results also demonstrate an increase in solar cell efficiency for the combined system in comparison with the single-cell configuration.

  17. Numerical simulation of flow in a high head Francis turbine with prediction of efficiency, rotor stator interaction and vortex structures in the draft tube

    NASA Astrophysics Data System (ADS)

    Jošt, D.; Škerlavaj, A.; Morgut, M.; Mežnar, P.; Nobile, E.

    2015-01-01

    The paper presents numerical simulations of the flow in a model of a high head Francis turbine and a comparison of the results to measurements. Numerical simulations were performed with two CFD (Computational Fluid Dynamics) codes, Ansys CFX and OpenFOAM. Steady-state simulations were performed with the k-epsilon and SST models, while the SAS SST ZLES model was used for transient simulations. With proper grid refinement in the distributor and runner, and by taking into account the losses in the labyrinth seals, very accurate predictions of the torque on the shaft, the head and the efficiency were obtained. The calculated axial and circumferential velocity components on two planes in the draft tube matched the experimental results well.

  18. Comprehensive silicon solar cell computer modeling

    NASA Technical Reports Server (NTRS)

    Lamorte, M. F.

    1984-01-01

    The development of an efficient, comprehensive Si solar cell modeling program that has the capability of simulation accuracy of 5 percent or less is examined. A general investigation of computerized simulation is provided. Computer simulation programs are subdivided into a number of major tasks: (1) analytical method used to represent the physical system; (2) phenomena submodels that comprise the simulation of the system; (3) coding of the analysis and the phenomena submodels; (4) coding scheme that results in efficient use of the CPU so that CPU costs are low; and (5) modularized simulation program with respect to structures that may be analyzed, addition and/or modification of phenomena submodels as new experimental data become available, and the addition of other photovoltaic materials.

  19. Implementation of extended Lagrangian dynamics in GROMACS for polarizable simulations using the classical Drude oscillator model.

    PubMed

    Lemkul, Justin A; Roux, Benoît; van der Spoel, David; MacKerell, Alexander D

    2015-07-15

    Explicit treatment of electronic polarization in empirical force fields used for molecular dynamics simulations represents an important advancement in simulation methodology. A straightforward means of treating electronic polarization in these simulations is the inclusion of Drude oscillators, which are auxiliary, charge-carrying particles bonded to the cores of atoms in the system. The additional degrees of freedom make these simulations more computationally expensive relative to simulations using traditional fixed-charge (additive) force fields. Thus, efficient tools are needed for conducting these simulations. Here, we present the implementation of highly scalable algorithms in the GROMACS simulation package that allow for the simulation of polarizable systems using extended Lagrangian dynamics with a dual Nosé-Hoover thermostat as well as simulations using a full self-consistent field treatment of polarization. The performance of systems of varying size is evaluated, showing that the present code parallelizes efficiently and is the fastest implementation of the extended Lagrangian methods currently available for simulations using the Drude polarizable force field. © 2015 Wiley Periodicals, Inc.

  20. Structural, thermodynamic, and electrical properties of polar fluids and ionic solutions on a hypersphere: Results of simulations

    NASA Astrophysics Data System (ADS)

    Caillol, J. M.; Levesque, D.

    1992-01-01

    The reliability and the efficiency of a new method suitable for simulations of dielectric fluids and ionic solutions are established by numerical computations. The efficiency derives from the use of a simulation cell which is the surface of a four-dimensional sphere. The reliability originates from a charge-charge potential that is a solution of the Poisson equation in this confining volume. The computation time, for systems of a few hundred molecules, is reduced by a factor of 2 or 3 compared to that of a simulation performed in a cubic volume with periodic boundary conditions and the Ewald charge-charge potential.

  1. Real-time simulation of large-scale floods

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time hydrological conditions, the real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.

  2. Simulated Sunlight-Mediated Photodynamic Therapy for Melanoma Skin Cancer by Titanium-Dioxide-Nanoparticle-Gold-Nanocluster-Graphene Heterogeneous Nanocomposites.

    PubMed

    Cheng, Yan; Chang, Yun; Feng, Yanlin; Liu, Ning; Sun, Xiujuan; Feng, Yuqing; Li, Xi; Zhang, Haiyuan

    2017-05-01

    Simulated sunlight has promise as a light source able to alleviate the severe pain associated with patients during photodynamic therapy (PDT); however, low sunlight utilization efficiency of traditional photosensitizers dramatically limits its application. Titanium-dioxide-nanoparticle-gold-nanocluster-graphene (TAG) heterogeneous nanocomposites are designed to efficiently utilize simulated sunlight for melanoma skin cancer PDT. The narrow band gap in gold nanoclusters (Au NCs), and staggered energy bands between Au NCs, titanium dioxide nanoparticles (TiO2 NPs), and graphene can result in efficient utilization of simulated sunlight and separation of electron-hole pairs, facilitating the production of abundant hydroxyl and superoxide radicals. Under irradiation of simulated sunlight, TAG nanocomposites can trigger a series of toxicological responses in mouse B16F1 melanoma cells, such as intracellular reactive oxygen species production, glutathione depletion, heme oxygenase-1 expression, and mitochondrial dysfunctions, resulting in severe cell death. Furthermore, intravenous or intratumoral administration of biocompatible TAG nanocomposites in B16F1-tumor-xenograft-bearing mice can significantly inhibit tumor growth and cause severe pathological tumor tissue changes. All of these results demonstrate prominent simulated sunlight-mediated PDT effects. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    NASA Astrophysics Data System (ADS)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

    Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology for realising an HPS platform. The research is driven by issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework for a virtualisation-based simulation platform (VSIM) is first proposed. The article then investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism for faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) greatly reduce deployment time and increase flexibility in simulation environment construction, and (3) achieve fault-tolerant simulation.

  4. Propulsive efficiency of the underwater dolphin kick in humans.

    PubMed

    von Loebbecke, Alfred; Mittal, Rajat; Fish, Frank; Mark, Russell

    2009-05-01

    Three-dimensional fully unsteady computational fluid dynamic simulations of five Olympic-level swimmers performing the underwater dolphin kick are used to estimate the swimmer's propulsive efficiencies. These estimates are compared with those of a cetacean performing the dolphin kick. The geometries of the swimmers and the cetacean are based on laser and CT scans, respectively, and the stroke kinematics is based on underwater video footage. The simulations indicate that the propulsive efficiency for human swimmers varies over a relatively wide range from about 11% to 29%. The efficiency of the cetacean is found to be about 56%, which is significantly higher than the human swimmers. The computed efficiency is found not to correlate with either the slender body theory or with the Strouhal number.

  5. The efficiency of photovoltaic cells exposed to pulsed laser light

    NASA Technical Reports Server (NTRS)

    Lowe, R. A.; Landis, G. A.; Jenkins, P.

    1993-01-01

    Future space missions may use laser power beaming systems with a free electron laser (FEL) to transmit light to a photovoltaic array receiver. To investigate the efficiency of solar cells with pulsed laser light, several types of GaAs, Si, CuInSe2, and GaSb cells were tested with the simulated pulse format of the induction and radio frequency (RF) FEL. The induction pulse format was simulated with an 800-watt average power copper vapor laser and the RF format with a frequency-doubled mode-locked Nd:YAG laser. Averaged current vs bias voltage measurements for each cell were taken at various optical power levels and the efficiency measured at the maximum power point. Experimental results show that the conversion efficiency for the cells tested is highly dependent on cell minority carrier lifetime, the width and frequency of the pulses, load impedance, and the average incident power. Three main effects were found to decrease the efficiency of solar cells exposed to simulated FEL illumination: cell series resistance, LC 'ringing', and output inductance. Improvements in efficiency were achieved by modifying the frequency response of the cell to match the spectral energy content of the laser pulse with external passive components.

  6. Wideband piezoelectric energy harvester for low-frequency application with plucking mechanism

    NASA Astrophysics Data System (ADS)

    Hiraki, Yasuhiro; Masuda, Arata; Ikeda, Naoto; Katsumura, Hidenori; Kagata, Hiroshi; Okumura, Hidenori

    2015-04-01

    Wireless sensor networks need to harvest energy from the vibrational environment for their power supply. The conventional resonance-type vibration energy harvesters, however, are not always effective for low-frequency applications. The purpose of this paper is to propose a high-efficiency energy harvester for low-frequency applications by utilizing plucking and SSHI techniques, and to investigate the effects of applying those techniques on energy harvesting efficiency. First, we derived an approximate formulation of the energy harvesting efficiency of the plucking device by theoretical analysis. Next, it was confirmed that the predicted efficiency agreed with numerical and experimental results. A parallel SSHI circuit, a switching technique to improve the performance of the harvester, was also introduced and examined by numerical simulations and experiments. In contrast to the simulated results, in which the efficiency was improved from 13.1% to 22.6% by introducing the SSHI circuit, the efficiency obtained in the experiment was only 7.43%. This is likely due to the internal resistance of the inductors and photo-MOS relays in the switching circuit; a simulation including this factor confirmed its large negative influence. This result suggests that reducing the switching resistance is critically important to the implementation of SSHI.

  7. Aspects of numerical and representational methods related to the finite-difference simulation of advective and dispersive transport of freshwater in a thin brackish aquifer

    USGS Publications Warehouse

    Merritt, M.L.

    1993-01-01

    The simulation of the transport of injected freshwater in a thin brackish aquifer, overlain and underlain by confining layers containing more saline water, is shown to be influenced by the choice of the finite-difference approximation method, the algorithm for representing vertical advective and dispersive fluxes, and the values assigned to parametric coefficients that specify the degree of vertical dispersion and molecular diffusion that occurs. Computed potable water recovery efficiencies will differ depending upon the choice of algorithm and approximation method, as will dispersion coefficients estimated based on the calibration of simulations to match measured data. A comparison of centered and backward finite-difference approximation methods shows that substantially different transition zones between injected and native waters are depicted by the different methods, and computed recovery efficiencies vary greatly. Standard and experimental algorithms and a variety of values for molecular diffusivity, transverse dispersivity, and vertical scaling factor were compared in simulations of freshwater storage in a thin brackish aquifer. Computed recovery efficiencies vary considerably, and appreciable differences are observed in the distribution of injected freshwater in the various cases tested. The results demonstrate both a qualitatively different description of transport using the experimental algorithms and the interrelated influences of molecular diffusion and transverse dispersion on simulated recovery efficiency. When simulating natural aquifer flow in cross-section, flushing of the aquifer occurred for all tested coefficient choices using both standard and experimental algorithms. © 1993.
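The sensitivity to the approximation method noted above can be illustrated on a toy 1-D advection problem: a backward (upwind) difference smears a sharp front (numerical diffusion), while a centered difference produces non-physical oscillations. This sketch is not the groundwater model itself; the grid, velocity, and time step are illustrative.

```python
def advect(c, u, dx, dt, steps, scheme="upwind"):
    """Explicit 1-D advection (u > 0) with a fixed inflow boundary c[0]."""
    c = list(c)
    for _ in range(steps):
        new = list(c)
        for j in range(1, len(c) - 1):
            if scheme == "upwind":         # backward difference: diffusive
                grad = (c[j] - c[j - 1]) / dx
            else:                           # centered difference: oscillatory
                grad = (c[j + 1] - c[j - 1]) / (2 * dx)
            new[j] = c[j] - u * dt * grad
        new[-1] = new[-2]                   # simple outflow boundary
        c = new
    return c
```

Run on a step-function front, the upwind result stays bounded between 0 and 1 but smears the transition zone, whereas the centered result overshoots and oscillates; this mirrors the "substantially different transition zones" the record reports for the two methods.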

  8. Efficiency study of a big volume well type NaI(Tl) detector by point and voluminous sources and Monte-Carlo simulation.

    PubMed

    Hansman, Jan; Mrdja, Dusan; Slivka, Jaroslav; Krmar, Miodrag; Bikit, Istvan

    2015-05-01

    The activity of environmental samples is usually measured by high-resolution HPGe gamma spectrometers. In this work a set-up with a 9 in. × 9 in. NaI well detector of 3 in. thickness and a 3 in. × 3 in. plug detector in a 15-cm-thick lead shield is considered as an alternative (Hansman, 2014). In spite of its much poorer resolution, it requires shorter measurement times and may possibly give better detection limits. In order to determine the U-238, Th-232, and K-40 content in samples with this NaI(Tl) detector, the corresponding photopeak efficiencies must be known. These efficiencies can be found for a given source matrix and geometry by Geant4 simulation. We found discrepancies between simulated and experimental efficiencies of 5-50%, mainly attributable to light-collection effects within the detector volume, which were not taken into account by the simulations. The influence of random coincidence summing on detection efficiency for radionuclide activities in the range 130-4000 Bq was negligible. This paper also describes how the detection efficiency depends on the position of the radioactive point source. To avoid large dead time, relatively weak Mn-54, Co-60 and Na-22 point sources of a few kBq were used. Results for single gamma lines and also for coincidence-summing gamma lines are presented. Copyright © 2015 Elsevier Ltd. All rights reserved.
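    How a Monte Carlo code arrives at a photopeak efficiency reduces to counting emitted photons that both reach the detector and deposit their full energy. The sketch below is a toy stand-in for Geant4: the solid-angle fraction and full-deposition probability are assumed placeholder numbers, not values for the actual well geometry or source matrix.

```python
import random

def photopeak_efficiency(n_photons, solid_angle_frac, p_full_deposit, seed=1):
    """Toy Monte Carlo photopeak efficiency estimate.

    A photon counts toward the photopeak if it is emitted into the
    detector's solid angle AND deposits its full energy (probability
    p_full_deposit, standing in for the transport physics that a code
    like Geant4 tracks in detail).
    """
    rng = random.Random(seed)
    hits = sum(
        1
        for _ in range(n_photons)
        if rng.random() < solid_angle_frac and rng.random() < p_full_deposit
    )
    return hits / n_photons

# Assumed numbers for a well geometry: ~85% solid-angle coverage and
# ~60% full-energy deposition for a mid-energy gamma line.
eff = photopeak_efficiency(100_000, 0.85, 0.60)
print(f"estimated photopeak efficiency: {eff:.3f}")
```

Effects like light collection, which the abstract identifies as the main source of the 5-50% discrepancy, are exactly what such a simplified interaction probability cannot capture.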

  9. Evaluation of Simulated Clinical Breast Exam Motion Patterns Using Marker-Less Video Tracking

    PubMed Central

    Azari, David P.; Pugh, Carla M.; Laufer, Shlomi; Kwan, Calvin; Chen, Chia-Hsiung; Yen, Thomas Y.; Hu, Yu Hen; Radwin, Robert G.

    2016-01-01

    Objective: This study investigates using marker-less video tracking to evaluate hands-on clinical skills during simulated clinical breast examinations (CBEs). Background: There are currently no standardized and widely accepted CBE screening techniques. Methods: Experienced physicians attending a national conference conducted simulated CBEs presenting different pathologies with distinct tumorous lesions. Single-hand exam motion was recorded and analyzed using marker-less video tracking. Four kinematic measures were developed to describe temporal (time pressing and time searching) and spatial (area covered and distance explored) patterns. Results: Mean time pressing, area covered, and distance explored varied across the simulated lesions. Exams were objectively categorized as sporadic, localized, thorough, or efficient in both temporal and spatial categories based on spatiotemporal characteristics. The majority of trials were temporally or spatially thorough (78% and 91%), exhibiting proportionally greater time pressing and time searching (temporally thorough) and greater area probed with greater distance explored (spatially thorough). More efficient exams exhibited proportionally more time pressing with less time searching (temporally efficient) and greater area probed with less distance explored (spatially efficient). Just two (5.9%) of the trials exhibited both high temporal and spatial efficiency. Conclusions: Marker-less video tracking discriminated between different examination techniques and measured when an exam changed from general searching to specific probing. The majority of participants exhibited more thorough than efficient patterns. Application: Marker-less video kinematic tracking may be useful for quantifying clinical skills for training and assessment. PMID:26546381
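    The two spatial measures can be sketched directly from a tracked hand trajectory. The track below is hypothetical, and the temporal measures (time pressing, time searching) are omitted because they require pressure or depth cues that a plain (x, y) track does not carry.

```python
import math

def distance_explored(track):
    """Total path length of the tracked hand centroid."""
    return sum(math.dist(p, q) for p, q in zip(track, track[1:]))

def area_covered(track, cell=1.0):
    """Approximate area covered by binning positions into grid cells
    and counting the distinct cells visited."""
    cells = {(int(x // cell), int(y // cell)) for x, y in track}
    return len(cells) * cell * cell

# Hypothetical 2-D track of the hand centroid (arbitrary units).
track = [(0, 0), (1, 0), (1, 1), (2, 1), (2, 2)]
print("distance explored:", distance_explored(track))  # 4.0
print("area covered:", area_covered(track))            # 5.0
```

In the study's terms, a "spatially thorough" exam would score high on both measures, while a "spatially efficient" one covers a large area with a comparatively short path.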

  10. Facilitating researcher use of flight simulators

    NASA Technical Reports Server (NTRS)

    Russell, C. Ray

    1990-01-01

    Researchers conducting experiments with flight simulators encounter numerous obstacles in bringing their ideas to the simulator. Research into how these simulators could be used more efficiently is presented. The study involved: (1) analyzing the Advanced Concepts Simulator software architecture, (2) analyzing the interaction between the researchers and simulation programmers, and (3) proposing a documentation tool for the researchers.

  11. Comparison of normal tissue dose calculation methods for epidemiological studies of radiotherapy patients.

    PubMed

    Mille, Matthew M; Jung, Jae Won; Lee, Choonik; Kuzmin, Gleb A; Lee, Choonsik

    2018-06-01

    Radiation dosimetry is an essential input for epidemiological studies of radiotherapy patients aimed at quantifying the dose-response relationship of late-term morbidity and mortality. Individualised organ dose must be estimated for all tissues of interest located in-field, near-field, or out-of-field. Whereas conventional measurement approaches are limited to points in water or anthropomorphic phantoms, computational approaches using patient images or human phantoms offer greater flexibility and can provide more detailed three-dimensional dose information. In the current study, we systematically compared four different dose calculation algorithms so that dosimetrists and epidemiologists can better understand the advantages and limitations of the various approaches at their disposal. The four dose calculation algorithms considered were: (1) the Analytical Anisotropic Algorithm (AAA) and (2) the Acuros XB algorithm (Acuros XB), as implemented in the Eclipse treatment planning system (TPS); (3) a Monte Carlo radiation transport code, EGSnrc; and (4) an accelerated Monte Carlo code, the x-ray Voxel Monte Carlo (XVMC). The four algorithms were compared in terms of their accuracy and appropriateness in the context of dose reconstruction for epidemiological investigations. Accuracy in peripheral dose was evaluated first by benchmarking the calculated dose profiles against measurements in a homogeneous water phantom. Additional simulations in a heterogeneous cylinder phantom evaluated the performance of the algorithms in the presence of tissue heterogeneity. In general, we found that the algorithms contained within the commercial TPS (AAA and Acuros XB) were fast and accurate in-field or near-field, but not acceptable out-of-field. Therefore, the TPS is best suited for epidemiological studies involving large cohorts and where the organs of interest are located in-field or partially in-field. The EGSnrc and XVMC codes showed excellent agreement with measurements both in-field and out-of-field. The EGSnrc code was the most accurate dosimetry approach, but was too slow to be used for large-scale epidemiological cohorts. The XVMC code showed similar accuracy to EGSnrc but was significantly faster, making epidemiological applications feasible, especially when the organs of interest reside far from the field edge.
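    The benchmarking step, comparing a calculated dose profile against measurement point by point, can be sketched as follows. The profiles and the normalisation convention below are illustrative assumptions, not the paper's data; they only show why an algorithm can look accurate in-field while misrepresenting the low-dose tail.

```python
def percent_deviation(calc, meas):
    """Pointwise deviation of a calculated dose profile from
    measurement, expressed as a percentage of the measured maximum
    (one common convention for peripheral-dose comparisons)."""
    d_max = max(meas)
    return [100.0 * (c - m) / d_max for c, m in zip(calc, meas)]

# Hypothetical profiles (Gy): three in-field points, then the
# out-of-field tail. The "TPS-like" profile underestimates the tail;
# the "MC-like" profile tracks it.
measured = [2.00, 1.98, 1.90, 0.050, 0.020, 0.010]
tps_like = [2.01, 1.97, 1.91, 0.020, 0.005, 0.001]
mc_like  = [2.00, 1.99, 1.89, 0.048, 0.019, 0.010]

dev_tps = percent_deviation(tps_like, measured)
dev_mc = percent_deviation(mc_like, measured)
print("TPS-like max |dev| out-of-field: %.2f%%" % max(abs(d) for d in dev_tps[3:]))
print("MC-like  max |dev| out-of-field: %.2f%%" % max(abs(d) for d in dev_mc[3:]))
```

Even small absolute deviations matter here, because out-of-field organ doses drive the dose-response estimates in epidemiological cohorts.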

  12. Inefficient epidemic spreading in scale-free networks

    NASA Astrophysics Data System (ADS)

    Piccardi, Carlo; Casagrandi, Renato

    2008-02-01

    Highly heterogeneous degree distributions yield efficient spreading of simple epidemics through networks, but can be inefficient for more complex epidemiological processes. We study diseases with a nonlinear force of infection, whose prevalence can abruptly collapse to zero as the transmission parameters decrease. We find that scale-free networks can be unable to support diseases that, on the contrary, persist at high endemic levels in homogeneous networks with the same average degree.
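    The collapse mechanism can be sketched with a mean-field SIS model whose force of infection is nonlinear in prevalence (exponent alpha; the value and the mean-field form are illustrative assumptions, since the paper works with full network structure). For alpha > 1 the disease-free state is locally stable, so lowering the transmission rate makes a high endemic level vanish abruptly rather than decline smoothly.

```python
def endemic_prevalence(beta, k_avg, alpha=2.0, gamma=1.0, steps=20000, dt=0.01):
    """Mean-field SIS prevalence with a nonlinear force of infection:

        dI/dt = beta * <k> * I**alpha * (1 - I) - gamma * I

    With alpha > 1 the state I = 0 is locally stable, so the endemic
    branch disappears in a saddle-node collapse as beta decreases --
    the abrupt-collapse behaviour the paper studies on networks.
    """
    i = 0.5  # start from a high endemic level
    for _ in range(steps):
        di = beta * k_avg * i**alpha * (1 - i) - gamma * i
        i = max(0.0, i + dt * di)
    return i

print("high beta:", round(endemic_prevalence(beta=1.2, k_avg=6), 3))
print("low beta :", round(endemic_prevalence(beta=0.4, k_avg=6), 3))
```

In the well-mixed sketch, a modest decrease in beta takes the long-run prevalence from a high endemic level straight to zero; the paper's point is that network heterogeneity shifts where, and whether, that endemic branch exists.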

  13. USSR and Eastern Europe Scientific Abstracts Biomedical and Behavioral Sciences No. 65.

    DTIC Science & Technology

    1977-03-01

    Partial table of contents: Biophysics; Environmental and Ecological Problems; Epidemiology; Food Supply; Hydrobiology; Immunology; Industrial Toxicology. Abstract fragments: "... productivity was established, as well as a high efficiency of the selection for short stem in different generations. Figure 1; Tables 6; References 14 ..." and "... on twenty-five subjects working in agriculture and handling plant protection products. Measurements of the rate of nerve conduction were made on ..."

  14. Dengue surveillance in the French armed forces: a dengue sentinel surveillance system in countries without efficient local epidemiological surveillance.

    PubMed

    de Laval, Franck; Dia, Aissata; Plumet, Sébastien; Decam, Christophe; Leparc Goffart, Isabelle; Deparis, Xavier

    2013-01-01

    Surveillance of travel-acquired dengue could improve dengue risk estimation in countries without efficient local epidemiological surveillance capability. Surveillance in the French armed forces in 2010-2011 identified 330 dengue cases, mainly in the French West Indies and French Guiana: DENV-1 circulated in Guadeloupe, Martinique, French Guiana, New Caledonia and Djibouti; DENV-3 in Mayotte and Djibouti; and DENV-4 in French Guiana. © 2012 International Society of Travel Medicine.

  15. Uncertainty of Wheat Water Use: Simulated Patterns and Sensitivity to Temperature and CO2

    NASA Technical Reports Server (NTRS)

    Cammarano, Davide; Roetter, Reimund P.; Asseng, Senthold; Ewert, Frank; Wallach, Daniel; Martre, Pierre; Hatfield, Jerry L.; Jones, James W.; Rosenzweig, Cynthia E.; Ruane, Alex C.; hide

    2016-01-01

    Projected global warming and population growth will reduce future water availability for agriculture. Thus, it is essential to increase the efficiency of water use to ensure crop productivity. Quantifying crop water use (WU; i.e. actual evapotranspiration) is a critical step towards this goal. Here, sixteen wheat simulation models were used to quantify sources of model uncertainty and to estimate the relative changes and variability between models for simulated WU, water use efficiency (WUE, WU per unit of grain dry mass produced), transpiration efficiency (Teff, transpiration per unit of grain yield dry mass produced), grain yield, crop transpiration and soil evaporation at increased temperatures and elevated atmospheric carbon dioxide concentrations ([CO2]). The greatest uncertainty in simulating water use, potential evapotranspiration, crop transpiration and soil evaporation was due to differences in how crop transpiration was modelled, which accounted for 50% of the total variability among models. The simulation results for the sensitivity to temperature indicated that crop WU will decline with increasing temperature due to shortened growing seasons. The uncertainties in simulated crop WU, in particular those due to uncertainties in simulating crop transpiration, were greater at increased temperatures and with high temperatures in combination with elevated atmospheric [CO2]. Hence, the simulation of crop WU, and in particular crop transpiration at higher temperatures, needs to be improved and evaluated with field measurements before models can be used to simulate climate change impacts on future crop water demand.
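    The efficiency measure and the between-model spread can be sketched as below. The four model outputs are invented numbers, not results from the sixteen-model ensemble; the WUE definition follows the abstract (water use per unit of grain dry mass produced).

```python
from statistics import mean, stdev

def water_use_efficiency(water_use, grain_yield):
    """WUE as the study defines it: crop water use (actual
    evapotranspiration) per unit of grain dry mass produced."""
    return water_use / grain_yield

# Invented outputs for four hypothetical wheat models
# (WU in mm, grain yield in kg/ha); the real study ran sixteen models.
models = {"A": (420, 5200), "B": (390, 4800), "C": (460, 5600), "D": (300, 5000)}
wue = {name: water_use_efficiency(wu, y) for name, (wu, y) in models.items()}
cv = stdev(wue.values()) / mean(wue.values())  # between-model spread
print({k: round(v, 4) for k, v in wue.items()})
print(f"between-model CV of WUE: {cv:.1%}")
```

A coefficient of variation like this, computed per output variable, is one simple way to express the "relative variability between models" that the ensemble study quantifies.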

  16. Efficient and Robust Optimization for Building Energy Simulation

    PubMed Central

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-01-01

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component-based building system simulation tool. The HVACSIM+ software presently employs Powell's Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that Powell's method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds that considerable computational benefits result from replacing Powell's Hybrid method in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to Powell's Hybrid method presently used in HVACSIM+. PMID:27325907
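    A minimal damped Gauss-Newton iteration, the core of the Levenberg-Marquardt family the paper recommends, can be sketched on a toy 2x2 nonlinear system. The system and all values are illustrative, not HVACSIM+'s equations; the damping update (grow lambda on a rejected step, shrink it on an accepted one) is what gives the method its robustness relative to an undamped Newton-type iteration.

```python
def levenberg_marquardt(f, jac, x0, lam=1e-3, tol=1e-10, max_iter=100):
    """Minimal Levenberg-Marquardt solver for a 2x2 nonlinear system.

    Each step solves the damped normal equations
        (J^T J + lam * I) dx = -J^T f(x)
    directly (2x2), accepting the step only if it reduces the squared
    residual; otherwise the damping lam is increased.
    """
    x = list(x0)
    for _ in range(max_iter):
        fx = f(x)
        r = sum(v * v for v in fx)
        if r < tol:
            break
        J = jac(x)
        a11 = J[0][0] ** 2 + J[1][0] ** 2 + lam
        a22 = J[0][1] ** 2 + J[1][1] ** 2 + lam
        a12 = J[0][0] * J[0][1] + J[1][0] * J[1][1]
        g1 = J[0][0] * fx[0] + J[1][0] * fx[1]
        g2 = J[0][1] * fx[0] + J[1][1] * fx[1]
        det = a11 * a22 - a12 * a12
        dx1 = (-g1 * a22 + g2 * a12) / det
        dx2 = (-g2 * a11 + g1 * a12) / det
        trial = [x[0] + dx1, x[1] + dx2]
        if sum(v * v for v in f(trial)) < r:
            x, lam = trial, lam * 0.5   # accept step, relax damping
        else:
            lam *= 4.0                  # reject step, increase damping
    return x

# Toy stand-in for a component energy balance: x^2 + y^2 = 4, x * y = 1.
f = lambda x: [x[0] ** 2 + x[1] ** 2 - 4.0, x[0] * x[1] - 1.0]
jac = lambda x: [[2 * x[0], 2 * x[1]], [x[1], x[0]]]
sol = levenberg_marquardt(f, jac, [2.0, 0.1])
print([round(v, 4) for v in sol])
```

Production solvers (MINPACK's hybrd for Powell's Hybrid method, lmder for Levenberg-Marquardt) add trust-region scaling and rank-aware linear algebra on top of this skeleton.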

  17. Efficient and Robust Optimization for Building Energy Simulation.

    PubMed

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-06-15

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component-based building system simulation tool. The HVACSIM+ software presently employs Powell's Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that Powell's method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds that considerable computational benefits result from replacing Powell's Hybrid method in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to Powell's Hybrid method presently used in HVACSIM+.

  18. Measurement of the photon identification efficiencies with the ATLAS detector using LHC Run-1 data

    NASA Astrophysics Data System (ADS)

    Aaboud, M.; Aad, G.; Abbott, B.; Abdallah, J.; Abdinov, O.; Abeloos, B.; Aben, R.; AbouZeid, O. S.; Abraham, N. L.; Abramowicz, H.; hide
A.; Medinnis, M.; Meehan, S.; Mehlhase, S.; Mehta, A.; Meier, K.; Meineck, C.; Meirose, B.; Melini, D.; Garcia, B. R. Mellado; Melo, M.; Meloni, F.; Mengarelli, A.; Menke, S.; Meoni, E.; Mergelmeyer, S.; Mermod, P.; Merola, L.; Meroni, C.; Merritt, F. S.; Messina, A.; Metcalfe, J.; Mete, A. S.; Meyer, C.; Meyer, C.; Meyer, J.-P.; Meyer, J.; Theenhausen, H. Meyer Zu; Miano, F.; Middleton, R. P.; Miglioranzi, S.; Mijović, L.; Mikenberg, G.; Mikestikova, M.; Mikuž, M.; Milesi, M.; Milic, A.; Miller, D. W.; Mills, C.; Milov, A.; Milstead, D. A.; Minaenko, A. A.; Minami, Y.; Minashvili, I. A.; Mincer, A. I.; Mindur, B.; Mineev, M.; Ming, Y.; Mir, L. M.; Mistry, K. P.; Mitani, T.; Mitrevski, J.; Mitsou, V. A.; Miucci, A.; Miyagawa, P. S.; Mjörnmark, J. U.; Moa, T.; Mochizuki, K.; Mohapatra, S.; Molander, S.; Moles-Valls, R.; Monden, R.; Mondragon, M. C.; Mönig, K.; Monk, J.; Monnier, E.; Montalbano, A.; Berlingen, J. Montejo; Monticelli, F.; Monzani, S.; Moore, R. W.; Morange, N.; Moreno, D.; Llácer, M. Moreno; Morettini, P.; Mori, D.; Mori, T.; Morii, M.; Morinaga, M.; Morisbak, V.; Moritz, S.; Morley, A. K.; Mornacchi, G.; Morris, J. D.; Mortensen, S. S.; Morvaj, L.; Mosidze, M.; Moss, J.; Motohashi, K.; Mount, R.; Mountricha, E.; Mouraviev, S. V.; Moyse, E. J. W.; Muanza, S.; Mudd, R. D.; Mueller, F.; Mueller, J.; Mueller, R. S. P.; Mueller, T.; Muenstermann, D.; Mullen, P.; Mullier, G. A.; Sanchez, F. J. Munoz; Quijada, J. A. Murillo; Murray, W. J.; Musheghyan, H.; Muškinja, M.; Myagkov, A. G.; Myska, M.; Nachman, B. P.; Nackenhorst, O.; Nagai, K.; Nagai, R.; Nagano, K.; Nagasaka, Y.; Nagata, K.; Nagel, M.; Nagy, E.; Nairz, A. M.; Nakahama, Y.; Nakamura, K.; Nakamura, T.; Nakano, I.; Namasivayam, H.; Garcia, R. F. Naranjo; Narayan, R.; Villar, D. I. Narrias; Naryshkin, I.; Naumann, T.; Navarro, G.; Nayyar, R.; Neal, H. A.; Nechaeva, P. Yu.; Neep, T. J.; Nef, P. 
D.; Negri, A.; Negrini, M.; Nektarijevic, S.; Nellist, C.; Nelson, A.; Nemecek, S.; Nemethy, P.; Nepomuceno, A. A.; Nessi, M.; Neubauer, M. S.; Neumann, M.; Neves, R. M.; Nevski, P.; Newman, P. R.; Nguyen, D. H.; Manh, T. Nguyen; Nickerson, R. B.; Nicolaidou, R.; Nielsen, J.; Nikiforov, A.; Nikolaenko, V.; Nikolic-Audit, I.; Nikolopoulos, K.; Nilsen, J. K.; Nilsson, P.; Ninomiya, Y.; Nisati, A.; Nisius, R.; Nobe, T.; Nodulman, L.; Nomachi, M.; Nomidis, I.; Nooney, T.; Norberg, S.; Nordberg, M.; Norjoharuddeen, N.; Novgorodova, O.; Nowak, S.; Nozaki, M.; Nozka, L.; Ntekas, K.; Nurse, E.; Nuti, F.; O'grady, F.; O'Neil, D. C.; O'Rourke, A. A.; O'Shea, V.; Oakham, F. G.; Oberlack, H.; Obermann, T.; Ocariz, J.; Ochi, A.; Ochoa, I.; Ochoa-Ricoux, J. P.; Oda, S.; Odaka, S.; Ogren, H.; Oh, A.; Oh, S. H.; Ohm, C. C.; Ohman, H.; Oide, H.; Okawa, H.; Okumura, Y.; Okuyama, T.; Olariu, A.; Seabra, L. F. Oleiro; Pino, S. A. Olivares; Damazio, D. Oliveira; Olszewski, A.; Olszowska, J.; Onofre, A.; Onogi, K.; Onyisi, P. U. E.; Oreglia, M. J.; Oren, Y.; Orestano, D.; Orlando, N.; Orr, R. S.; Osculati, B.; Ospanov, R.; Garzon, G. Otero y.; Otono, H.; Ouchrif, M.; Ould-Saada, F.; Ouraou, A.; Oussoren, K. P.; Ouyang, Q.; Owen, M.; Owen, R. E.; Ozcan, V. E.; Ozturk, N.; Pachal, K.; Pages, A. Pacheco; Rodriguez, L. Pacheco; Aranda, C. Padilla; Pagáčová, M.; Griso, S. Pagan; Paige, F.; Pais, P.; Pajchel, K.; Palacino, G.; Palestini, S.; Palka, M.; Pallin, D.; Palma, A.; Panagiotopoulou, E. St.; Pandini, C. E.; Vazquez, J. G. Panduro; Pani, P.; Panitkin, S.; Pantea, D.; Paolozzi, L.; Papadopoulou, Th. D.; Papageorgiou, K.; Paramonov, A.; Hernandez, D. Paredes; Parker, A. J.; Parker, M. A.; Parker, K. A.; Parodi, F.; Parsons, J. A.; Parzefall, U.; Pascuzzi, V. R.; Pasqualucci, E.; Passaggio, S.; Pastore, Fr.; Pásztor, G.; Pataraia, S.; Pater, J. R.; Pauly, T.; Pearce, J.; Pearson, B.; Pedersen, L. E.; Pedersen, M.; Lopez, S. Pedraza; Pedro, R.; Peleganchuk, S. 
V.; Pelikan, D.; Penc, O.; Peng, C.; Peng, H.; Penwell, J.; Peralva, B. S.; Perego, M. M.; Perepelitsa, D. V.; Codina, E. Perez; Perini, L.; Pernegger, H.; Perrella, S.; Peschke, R.; Peshekhonov, V. D.; Peters, K.; Peters, R. F. Y.; Petersen, B. A.; Petersen, T. C.; Petit, E.; Petridis, A.; Petridou, C.; Petroff, P.; Petrolo, E.; Petrov, M.; Petrucci, F.; Pettersson, N. E.; Peyaud, A.; Pezoa, R.; Phillips, P. W.; Piacquadio, G.; Pianori, E.; Picazio, A.; Piccaro, E.; Piccinini, M.; Pickering, M. A.; Piegaia, R.; Pilcher, J. E.; Pilkington, A. D.; Pin, A. W. J.; Pinamonti, M.; Pinfold, J. L.; Pingel, A.; Pires, S.; Pirumov, H.; Pitt, M.; Plazak, L.; Pleier, M.-A.; Pleskot, V.; Plotnikova, E.; Plucinski, P.; Pluth, D.; Poettgen, R.; Poggioli, L.; Pohl, D.; Polesello, G.; Poley, A.; Policicchio, A.; Polifka, R.; Polini, A.; Pollard, C. S.; Polychronakos, V.; Pommès, K.; Pontecorvo, L.; Pope, B. G.; Popeneciu, G. A.; Popovic, D. S.; Poppleton, A.; Pospisil, S.; Potamianos, K.; Potrap, I. N.; Potter, C. J.; Potter, C. T.; Poulard, G.; Poveda, J.; Pozdnyakov, V.; Astigarraga, M. E. Pozo; Pralavorio, P.; Pranko, A.; Prell, S.; Price, D.; Price, L. E.; Primavera, M.; Prince, S.; Proissl, M.; Prokofiev, K.; Prokoshin, F.; Protopopescu, S.; Proudfoot, J.; Przybycien, M.; Puddu, D.; Purohit, M.; Puzo, P.; Qian, J.; Qin, G.; Qin, Y.; Quadt, A.; Quayle, W. B.; Queitsch-Maitland, M.; Quilty, D.; Raddum, S.; Radeka, V.; Radescu, V.; Radhakrishnan, S. K.; Radloff, P.; Rados, P.; Ragusa, F.; Rahal, G.; Raine, J. A.; Rajagopalan, S.; Rammensee, M.; Rangel-Smith, C.; Ratti, M. G.; Rauscher, F.; Rave, S.; Ravenscroft, T.; Ravinovich, I.; Raymond, M.; Read, A. L.; Readioff, N. P.; Reale, M.; Rebuzzi, D. M.; Redelbach, A.; Redlinger, G.; Reece, R.; Reeves, K.; Rehnisch, L.; Reichert, J.; Reisin, H.; Rembser, C.; Ren, H.; Rescigno, M.; Resconi, S.; Rezanova, O. L.; Reznicek, P.; Rezvani, R.; Richter, R.; Richter, S.; Richter-Was, E.; Ricken, O.; Ridel, M.; Rieck, P.; Riegel, C. 
J.; Rieger, J.; Rifki, O.; Rijssenbeek, M.; Rimoldi, A.; Rimoldi, M.; Rinaldi, L.; Ristić, B.; Ritsch, E.; Riu, I.; Rizatdinova, F.; Rizvi, E.; Rizzi, C.; Robertson, S. H.; Robichaud-Veronneau, A.; Robinson, D.; Robinson, J. E. M.; Robson, A.; Roda, C.; Rodina, Y.; Perez, A. Rodriguez; Rodriguez, D. Rodriguez; Roe, S.; Rogan, C. S.; Røhne, O.; Romaniouk, A.; Romano, M.; Saez, S. M. Romano; Adam, E. Romero; Rompotis, N.; Ronzani, M.; Roos, L.; Ros, E.; Rosati, S.; Rosbach, K.; Rose, P.; Rosenthal, O.; Rosien, N.-A.; Rossetti, V.; Rossi, E.; Rossi, L. P.; Rosten, J. H. N.; Rosten, R.; Rotaru, M.; Roth, I.; Rothberg, J.; Rousseau, D.; Royon, C. R.; Rozanov, A.; Rozen, Y.; Ruan, X.; Rubbo, F.; Rudolph, M. S.; Rühr, F.; Ruiz-Martinez, A.; Rurikova, Z.; Rusakovich, N. A.; Ruschke, A.; Russell, H. L.; Rutherfoord, J. P.; Ruthmann, N.; Ryabov, Y. F.; Rybar, M.; Rybkin, G.; Ryu, S.; Ryzhov, A.; Rzehorz, G. F.; Saavedra, A. F.; Sabato, G.; Sacerdoti, S.; Sadrozinski, H. F.-W.; Sadykov, R.; Tehrani, F. Safai; Saha, P.; Sahinsoy, M.; Saimpert, M.; Saito, T.; Sakamoto, H.; Sakurai, Y.; Salamanna, G.; Salamon, A.; Loyola, J. E. Salazar; Salek, D.; De Bruin, P. H. Sales; Salihagic, D.; Salnikov, A.; Salt, J.; Salvatore, D.; Salvatore, F.; Salvucci, A.; Salzburger, A.; Sammel, D.; Sampsonidis, D.; Sanchez, A.; Sánchez, J.; Martinez, V. Sanchez; Sandaker, H.; Sandbach, R. L.; Sander, H. G.; Sandhoff, M.; Sandoval, C.; Sandstroem, R.; Sankey, D. P. C.; Sannino, M.; Sansoni, A.; Santoni, C.; Santonico, R.; Santos, H.; Castillo, I. Santoyo; Sapp, K.; Sapronov, A.; Saraiva, J. G.; Sarrazin, B.; Sasaki, O.; Sasaki, Y.; Sato, K.; Sauvage, G.; Sauvan, E.; Savage, G.; Savard, P.; Sawyer, C.; Sawyer, L.; Saxon, J.; Sbarra, C.; Sbrizzi, A.; Scanlon, T.; Scannicchio, D. A.; Scarcella, M.; Scarfone, V.; Schaarschmidt, J.; Schacht, P.; Schachtner, B. M.; Schaefer, D.; Schaefer, R.; Schaeffer, J.; Schaepe, S.; Schaetzel, S.; Schäfer, U.; Schaffer, A. C.; Schaile, D.; Schamberger, R. 
D.; Scharf, V.; Schegelsky, V. A.; Scheirich, D.; Schernau, M.; Schiavi, C.; Schier, S.; Schillo, C.; Schioppa, M.; Schlenker, S.; Schmidt-Sommerfeld, K. R.; Schmieden, K.; Schmitt, C.; Schmitt, S.; Schmitz, S.; Schneider, B.; Schnoor, U.; Schoeffel, L.; Schoening, A.; Schoenrock, B. D.; Schopf, E.; Schott, M.; Schovancova, J.; Schramm, S.; Schreyer, M.; Schuh, N.; Schultens, M. J.; Schultz-Coulon, H.-C.; Schulz, H.; Schumacher, M.; Schumm, B. A.; Schune, Ph.; Schwartzman, A.; Schwarz, T. A.; Schwegler, Ph.; Schweiger, H.; Schwemling, Ph.; Schwienhorst, R.; Schwindling, J.; Schwindt, T.; Sciolla, G.; Scuri, F.; Scutti, F.; Searcy, J.; Seema, P.; Seidel, S. C.; Seiden, A.; Seifert, F.; Seixas, J. M.; Sekhniaidze, G.; Sekhon, K.; Sekula, S. J.; Seliverstov, D. M.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Serkin, L.; Sessa, M.; Seuster, R.; Severini, H.; Sfiligoj, T.; Sforza, F.; Sfyrla, A.; Shabalina, E.; Shaikh, N. W.; Shan, L. Y.; Shang, R.; Shank, J. T.; Shapiro, M.; Shatalov, P. B.; Shaw, K.; Shaw, S. M.; Shcherbakova, A.; Shehu, C. Y.; Sherwood, P.; Shi, L.; Shimizu, S.; Shimmin, C. O.; Shimojima, M.; Shiyakova, M.; Shmeleva, A.; Saadi, D. Shoaleh; Shochet, M. J.; Shojaii, S.; Shrestha, S.; Shulga, E.; Shupe, M. A.; Sicho, P.; Sickles, A. M.; Sidebo, P. E.; Sidiropoulou, O.; Sidorov, D.; Sidoti, A.; Siegert, F.; Sijacki, Dj.; Silva, J.; Silverstein, S. B.; Simak, V.; Simard, O.; Simic, Lj.; Simion, S.; Simioni, E.; Simmons, B.; Simon, D.; Simon, M.; Sinervo, P.; Sinev, N. B.; Sioli, M.; Siragusa, G.; Sivoklokov, S. Yu.; Sjölin, J.; Sjursen, T. B.; Skinner, M. B.; Skottowe, H. P.; Skubic, P.; Slater, M.; Slavicek, T.; Slawinska, M.; Sliwa, K.; Slovak, R.; Smakhtin, V.; Smart, B. H.; Smestad, L.; Smiesko, J.; Smirnov, S. Yu.; Smirnov, Y.; Smirnova, L. N.; Smirnova, O.; Smith, M. N. K.; Smith, R. W.; Smizanska, M.; Smolek, K.; Snesarev, A. A.; Snyder, S.; Sobie, R.; Socher, F.; Soffer, A.; Soh, D. A.; Sokhrannyi, G.; Sanchez, C. A. 
Solans; Solar, M.; Soldatov, E. Yu.; Soldevila, U.; Solodkov, A. A.; Soloshenko, A.; Solovyanov, O. V.; Solovyev, V.; Sommer, P.; Son, H.; Song, H. Y.; Sood, A.; Sopczak, A.; Sopko, V.; Sorin, V.; Sosa, D.; Sotiropoulou, C. L.; Soualah, R.; Soukharev, A. M.; South, D.; Sowden, B. C.; Spagnolo, S.; Spalla, M.; Spangenberg, M.; Spanò, F.; Sperlich, D.; Spettel, F.; Spighi, R.; Spigo, G.; Spiller, L. A.; Spousta, M.; Denis, R. D. St.; Stabile, A.; Stamen, R.; Stamm, S.; Stanecka, E.; Stanek, R. W.; Stanescu, C.; Stanescu-Bellu, M.; Stanitzki, M. M.; Stapnes, S.; Starchenko, E. A.; Stark, G. H.; Stark, J.; Staroba, P.; Starovoitov, P.; Stärz, S.; Staszewski, R.; Steinberg, P.; Stelzer, B.; Stelzer, H. J.; Stelzer-Chilton, O.; Stenzel, H.; Stewart, G. A.; Stillings, J. A.; Stockton, M. C.; Stoebe, M.; Stoicea, G.; Stolte, P.; Stonjek, S.; Stradling, A. R.; Straessner, A.; Stramaglia, M. E.; Strandberg, J.; Strandberg, S.; Strandlie, A.; Strauss, M.; Strizenec, P.; Ströhmer, R.; Strom, D. M.; Stroynowski, R.; Strubig, A.; Stucci, S. A.; Stugu, B.; Styles, N. A.; Su, D.; Su, J.; Subramaniam, R.; Suchek, S.; Sugaya, Y.; Suk, M.; Sulin, V. V.; Sultansoy, S.; Sumida, T.; Sun, S.; Sun, X.; Sundermann, J. E.; Suruliz, K.; Susinno, G.; Sutton, M. R.; Suzuki, S.; Svatos, M.; Swiatlowski, M.; Sykora, I.; Sykora, T.; Ta, D.; Taccini, C.; Tackmann, K.; Taenzer, J.; Taffard, A.; Tafirout, R.; Taiblum, N.; Takai, H.; Takashima, R.; Takeshita, T.; Takubo, Y.; Talby, M.; Talyshev, A. A.; Tan, K. G.; Tanaka, J.; Tanaka, R.; Tanaka, S.; Tannenwald, B. B.; Araya, S. Tapia; Tapprogge, S.; Tarem, S.; Tartarelli, G. F.; Tas, P.; Tasevsky, M.; Tashiro, T.; Tassi, E.; Delgado, A. Tavares; Tayalati, Y.; Taylor, A. C.; Taylor, G. N.; Taylor, P. T. E.; Taylor, W.; Teischinger, F. A.; Teixeira-Dias, P.; Temming, K. K.; Temple, D.; Kate, H. Ten; Teng, P. K.; Teoh, J. J.; Tepel, F.; Terada, S.; Terashi, K.; Terron, J.; Terzo, S.; Testa, M.; Teuscher, R. J.; Theveneaux-Pelzer, T.; Thomas, J. 
P.; Thomas-Wilsker, J.; Thompson, E. N.; Thompson, P. D.; Thompson, A. S.; Thomsen, L. A.; Thomson, E.; Thomson, M.; Tibbetts, M. J.; Torres, R. E. Ticse; Tikhomirov, V. O.; Tikhonov, Yu. A.; Timoshenko, S.; Tipton, P.; Tisserant, S.; Todome, K.; Todorov, T.; Todorova-Nova, S.; Tojo, J.; Tokár, S.; Tokushuku, K.; Tolley, E.; Tomlinson, L.; Tomoto, M.; Tompkins, L.; Toms, K.; Tong, B.; Torrence, E.; Torres, H.; Pastor, E. Torró; Toth, J.; Touchard, F.; Tovey, D. R.; Trefzger, T.; Tricoli, A.; Trigger, I. M.; Trincaz-Duvoid, S.; Tripiana, M. F.; Trischuk, W.; Trocmé, B.; Trofymov, A.; Troncon, C.; Trottier-McDonald, M.; Trovatelli, M.; Truong, L.; Trzebinski, M.; Trzupek, A.; Tseng, J. C.-L.; Tsiareshka, P. V.; Tsipolitis, G.; Tsirintanis, N.; Tsiskaridze, S.; Tsiskaridze, V.; Tskhadadze, E. G.; Tsui, K. M.; Tsukerman, I. I.; Tsulaia, V.; Tsuno, S.; Tsybychev, D.; Tudorache, A.; Tudorache, V.; Tuna, A. N.; Tupputi, S. A.; Turchikhin, S.; Turecek, D.; Turgeman, D.; Turra, R.; Turvey, A. J.; Tuts, P. M.; Tyndel, M.; Ucchielli, G.; Ueda, I.; Ueno, R.; Ughetto, M.; Ukegawa, F.; Unal, G.; Undrus, A.; Unel, G.; Ungaro, F. C.; Unno, Y.; Unverdorben, C.; Urban, J.; Urquijo, P.; Urrejola, P.; Usai, G.; Usanova, A.; Vacavant, L.; Vacek, V.; Vachon, B.; Valderanis, C.; Santurio, E. Valdes; Valencic, N.; Valentinetti, S.; Valero, A.; Valery, L.; Valkar, S.; Vallecorsa, S.; Ferrer, J. A. Valls; Van Den Wollenberg, W.; Van Der Deijl, P. C.; van der Geer, R.; van der Graaf, H.; van Eldik, N.; van Gemmeren, P.; Van Nieuwkoop, J.; van Vulpen, I.; van Woerden, M. C.; Vanadia, M.; Vandelli, W.; Vanguri, R.; Vaniachine, A.; Vankov, P.; Vardanyan, G.; Vari, R.; Varnes, E. W.; Varol, T.; Varouchas, D.; Vartapetian, A.; Varvell, K. E.; Vasquez, J. G.; Vazeille, F.; Schroeder, T. Vazquez; Veatch, J.; Veloce, L. M.; Veloso, F.; Veneziano, S.; Ventura, A.; Venturi, M.; Venturi, N.; Venturini, A.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J. C.; Vest, A.; Vetterli, M. 
C.; Viazlo, O.; Vichou, I.; Vickey, T.; Boeriu, O. E. Vickey; Viehhauser, G. H. A.; Viel, S.; Vigani, L.; Vigne, R.; Villa, M.; Perez, M. Villaplana; Vilucchi, E.; Vincter, M. G.; Vinogradov, V. B.; Vittori, C.; Vivarelli, I.; Vlachos, S.; Vlasak, M.; Vogel, M.; Vokac, P.; Volpi, G.; Volpi, M.; von der Schmitt, H.; von Toerne, E.; Vorobel, V.; Vorobev, K.; Vos, M.; Voss, R.; Vossebeld, J. H.; Vranjes, N.; Milosavljevic, M. Vranjes; Vrba, V.; Vreeswijk, M.; Vuillermet, R.; Vukotic, I.; Vykydal, Z.; Wagner, P.; Wagner, W.; Wahlberg, H.; Wahrmund, S.; Wakabayashi, J.; Walder, J.; Walker, R.; Walkowiak, W.; Wallangen, V.; Wang, C.; Wang, C.; Wang, F.; Wang, H.; Wang, H.; Wang, J.; Wang, J.; Wang, K.; Wang, R.; Wang, S. M.; Wang, T.; Wang, T.; Wang, W.; Wang, X.; Wanotayaroj, C.; Warburton, A.; Ward, C. P.; Wardrope, D. R.; Washbrook, A.; Watkins, P. M.; Watson, A. T.; Watson, M. F.; Watts, G.; Watts, S.; Waugh, B. M.; Webb, S.; Weber, M. S.; Weber, S. W.; Webster, J. S.; Weidberg, A. R.; Weinert, B.; Weingarten, J.; Weiser, C.; Weits, H.; Wells, P. S.; Wenaus, T.; Wengler, T.; Wenig, S.; Wermes, N.; Werner, M.; Werner, M. D.; Werner, P.; Wessels, M.; Wetter, J.; Whalen, K.; Whallon, N. L.; Wharton, A. M.; White, A.; White, M. J.; White, R.; Whiteson, D.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiglesworth, C.; Wiik-Fuchs, L. A. M.; Wildauer, A.; Wilk, F.; Wilkens, H. G.; Williams, H. H.; Williams, S.; Willis, C.; Willocq, S.; Wilson, J. A.; Wingerter-Seez, I.; Winklmeier, F.; Winston, O. J.; Winter, B. T.; Wittgen, M.; Wittkowski, J.; Wollstadt, S. J.; Wolter, M. W.; Wolters, H.; Wosiek, B. K.; Wotschack, J.; Woudstra, M. J.; Wozniak, K. W.; Wu, M.; Wu, M.; Wu, S. L.; Wu, X.; Wu, Y.; Wyatt, T. R.; Wynne, B. M.; Xella, S.; Xu, D.; Xu, L.; Yabsley, B.; Yacoob, S.; Yakabe, R.; Yamaguchi, D.; Yamaguchi, Y.; Yamamoto, A.; Yamamoto, S.; Yamanaka, T.; Yamauchi, K.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, H.; Yang, Y.; Yang, Z.; Yao, W.-M.; Yap, Y. 
C.; Yasu, Y.; Yatsenko, E.; Wong, K. H. Yau; Ye, J.; Ye, S.; Yeletskikh, I.; Yen, A. L.; Yildirim, E.; Yorita, K.; Yoshida, R.; Yoshihara, K.; Young, C.; Young, C. J. S.; Youssef, S.; Yu, D. R.; Yu, J.; Yu, J. M.; Yu, J.; Yuan, L.; Yuen, S. P. Y.; Yusuff, I.; Zabinski, B.; Zaidan, R.; Zaitsev, A. M.; Zakharchuk, N.; Zalieckas, J.; Zaman, A.; Zambito, S.; Zanello, L.; Zanzi, D.; Zeitnitz, C.; Zeman, M.; Zemla, A.; Zeng, J. C.; Zeng, Q.; Zengel, K.; Zenin, O.; Ženiš, T.; Zerwas, D.; Zhang, D.; Zhang, F.; Zhang, G.; Zhang, H.; Zhang, J.; Zhang, L.; Zhang, R.; Zhang, R.; Zhang, X.; Zhang, Z.; Zhao, X.; Zhao, Y.; Zhao, Z.; Zhemchugov, A.; Zhong, J.; Zhou, B.; Zhou, C.; Zhou, L.; Zhou, L.; Zhou, M.; Zhou, N.; Zhu, C. G.; Zhu, H.; Zhu, J.; Zhu, Y.; Zhuang, X.; Zhukov, K.; Zibell, A.; Zieminska, D.; Zimine, N. I.; Zimmermann, C.; Zimmermann, S.; Zinonos, Z.; Zinser, M.; Ziolkowski, M.; Živković, L.; Zobernig, G.; Zoccoli, A.; Nedden, M. zur; Zurzolo, G.; Zwalinski, L.

    2016-12-01

    The algorithms used by the ATLAS Collaboration to reconstruct and identify prompt photons are described. Measurements of the photon identification efficiencies are reported, using 4.9 fb^{-1} of pp collision data collected at the LHC at √s = 7 TeV and 20.3 fb^{-1} at √s = 8 TeV. The efficiencies are measured separately for converted and unconverted photons, in four different pseudorapidity regions, for transverse momenta between 10 GeV and 1.5 TeV. The results from the combination of three data-driven techniques are compared to the predictions from a simulation of the detector response, after correcting the electromagnetic shower momenta in the simulation for the average differences observed with respect to data. Data-to-simulation efficiency ratios used as correction factors in physics measurements are determined to account for the small residual efficiency differences. These factors are measured with uncertainties between 0.5% and 10% in 7 TeV data and between 0.5% and 5.6% in 8 TeV data, depending on the photon transverse momentum and pseudorapidity.
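
    As an illustration only: if the three data-driven efficiency measurements were treated as independent, a standard inverse-variance weighted average could combine them. This is a generic combination procedure, not necessarily the one used in the ATLAS analysis, which may account for correlated uncertainties.

```python
# Generic inverse-variance combination of independent measurements.
# Illustrative only; not the ATLAS combination procedure.

def combine(values, uncertainties):
    """Return the weighted mean and its uncertainty for independent
    measurements with Gaussian errors, weighting each by 1/sigma^2."""
    weights = [1.0 / u ** 2 for u in uncertainties]
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    return mean, total ** -0.5
```

    For example, combining 0.90 ± 0.01 with 0.88 ± 0.02 yields a mean closer to the more precise measurement, with a smaller combined uncertainty than either input.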

  19. SU-F-T-111: Investigation of the Attila Deterministic Solver as a Supplement to Monte Carlo for Calculating Out-Of-Field Radiotherapy Dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mille, M; Lee, C; Failla, G

    Purpose: To use the Attila deterministic solver as a supplement to Monte Carlo for calculating out-of-field organ dose in support of epidemiological studies looking at the risks of second cancers. Supplemental dosimetry tools are needed to speed up dose calculations for studies involving large-scale patient cohorts. Methods: Attila is a multi-group discrete ordinates code which can solve the 3D photon-electron coupled linear Boltzmann radiation transport equation on a finite-element mesh. Dose is computed by multiplying the calculated particle flux in each mesh element by a medium-specific energy deposition cross-section. The out-of-field dosimetry capability of Attila is investigated by comparing average organ dose to that calculated by Monte Carlo simulation. The test scenario consists of a 6 MV external beam treatment of a female patient with a tumor in the left breast. The patient is simulated by a whole-body adult reference female computational phantom. Monte Carlo simulations were performed using MCNP6 and XVMC. Attila can export a tetrahedral mesh for MCNP6, allowing for a direct comparison between the two codes. The Attila and Monte Carlo methods were also compared in terms of calculation speed and complexity of simulation setup. A key prerequisite for this work was the modeling of a Varian Clinac 2100 linear accelerator. Results: The solid mesh of the torso part of the adult female phantom for the Attila calculation was prepared using the CAD software SpaceClaim. Preliminary calculations suggest that Attila is a user-friendly software package which shows great promise for our intended application. Computational performance is related to the number of tetrahedral elements included in the Attila calculation. Conclusion: Attila is being explored as a supplement to the conventional Monte Carlo radiation transport approach for performing retrospective patient dosimetry. The goal is for the dosimetry to be sufficiently accurate for use in retrospective epidemiological investigations.
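
    The dose calculation described in the Methods (particle flux in each mesh element multiplied by a medium-specific energy-deposition cross-section, summed over energy groups) can be sketched as follows. Function names and the group structure are illustrative, not Attila's actual interface.

```python
# Sketch of deterministic dose reconstruction from a multi-group flux
# solution: dose in each element is the group-wise product of flux and
# the local medium's energy-deposition coefficient, summed over groups.

def element_dose(flux_per_group, kerma_per_group):
    """Dose in one mesh element, summed over energy groups."""
    if len(flux_per_group) != len(kerma_per_group):
        raise ValueError("flux and cross-section group structures must match")
    return sum(f * k for f, k in zip(flux_per_group, kerma_per_group))

def mesh_dose(fluxes, kermas):
    """Dose for every tetrahedral element of the mesh; each entry of
    `fluxes` and `kermas` is the per-group list for one element."""
    return [element_dose(f, k) for f, k in zip(fluxes, kermas)]
```

    In a real discrete-ordinates calculation the flux itself comes from the transport solve; this sketch only shows the final flux-to-dose folding step.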

  20. Neutrons in active proton therapy: Parameterization of dose and dose equivalent.

    PubMed

    Schneider, Uwe; Hälg, Roger A; Lomax, Tony

    2017-06-01

    One of the essential elements of an epidemiological study to decide whether proton therapy may be associated with increased or decreased subsequent malignancies compared to photon therapy is the ability to estimate all doses to non-target tissues, including neutron dose. This work therefore aims to predict, for patients treated with proton pencil beam scanning, the spatially localized neutron doses and dose equivalents. The proton pencil beam of Gantry 1 at the Paul Scherrer Institute (PSI) was Monte Carlo simulated using GEANT. Based on the simulated neutron dose and neutron spectra, an analytical mechanistic dose model was developed. The pencil beam algorithm used for treatment planning at PSI has been extended using the developed model in order to calculate the neutron component of the delivered dose distribution for each treated patient. The neutron dose was estimated for two example patient cases. The analytical neutron dose model reproduces the three-dimensional Monte Carlo simulated dose distribution up to 85 cm from the proton pencil beam with satisfactory precision. The root mean square error between Monte Carlo simulation and model is largest for 138 MeV protons, at 19% and 20% for dose and dose equivalent, respectively. The model was successfully integrated into the PSI treatment planning system. On average, the neutron dose increases by 10% or 65% when using 160 MeV or 177 MeV instead of 138 MeV; for the neutron dose equivalent the increase is 8% and 57%. The presented neutron dose calculations allow for estimates of dose that can be used in subsequent epidemiological studies or, should the need arise, to estimate the neutron dose at any point where a subsequent secondary tumour may occur. It was found that the neutron dose to the patient increases strongly with proton energy. Copyright © 2016. Published by Elsevier GmbH.
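
    The quoted model-versus-Monte-Carlo agreement can be quantified with a relative root-mean-square error over sampled dose points. A minimal sketch follows; the numbers in the usage example are made up, and the paper's exact error definition may differ.

```python
import math

def rmse_percent(model, reference):
    """Relative root-mean-square error (in %) of model predictions
    against reference values (e.g. Monte Carlo dose samples).
    Assumes all reference values are non-zero."""
    residuals = [(m - r) / r for m, r in zip(model, reference)]
    return 100.0 * math.sqrt(sum(e * e for e in residuals) / len(residuals))
```

    For instance, model predictions that are alternately 10% high and 10% low against the reference give a relative RMSE of 10%.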

  1. a Discrete Mathematical Model to Simulate Malware Spreading

    NASA Astrophysics Data System (ADS)

    Del Rey, A. Martin; Sánchez, G. Rodriguez

    2012-10-01

    With the advent and worldwide development of the Internet, the study and control of malware spreading has become very important. Some mathematical models to simulate malware propagation have been proposed in the scientific literature, and usually they are based on differential equations exploiting the similarities with mathematical epidemiology. The great majority of these models study the behavior of a particular type of malware called computer worms; indeed, to the best of our knowledge, no model has been proposed to simulate the spreading of a computer virus (the traditional type of malware, which differs from computer worms in several respects). The purpose of this work is therefore to introduce a new mathematical model, based not on continuous mathematics but on discrete tools, to analyze and study the epidemic behavior of computer viruses. Specifically, cellular automata are used to design such a model.
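
    To make the cellular-automaton idea concrete, here is a minimal SIR-style automaton on a grid: each cell is Susceptible, Infected, or Recovered, and an infected neighbour infects a susceptible cell with some probability. The states, neighbourhood, and update rule are illustrative, not the authors' exact specification.

```python
import random

# Minimal cellular-automaton epidemic sketch (illustrative update rule).
S, I, R = 0, 1, 2  # Susceptible, Infected, Recovered

def step(grid, p_infect, rng):
    """One synchronous update of an n x n toroidal grid: susceptible
    cells may catch the infection from Von Neumann neighbours; infected
    cells recover after one step (a deliberately simple rule)."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] == S:
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = (i + di) % n, (j + dj) % n
                    if grid[ni][nj] == I and rng.random() < p_infect:
                        new[i][j] = I
                        break
            elif grid[i][j] == I:
                new[i][j] = R
    return new
```

    With certain infection (p_infect = 1.0) and a single infected cell, one step infects exactly its four neighbours while the original cell recovers, giving the characteristic expanding wavefront of local-contact epidemic models.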

  2. Efficient generation of connectivity in neuronal networks from simulator-independent descriptions

    PubMed Central

    Djurfeldt, Mikael; Davison, Andrew P.; Eppler, Jochen M.

    2014-01-01

    Simulator-independent descriptions of connectivity in neuronal networks promise greater ease of model sharing, improved reproducibility of simulation results, and reduced programming effort for computational neuroscientists. However, until now, enabling the use of such descriptions in a given simulator in a computationally efficient way has entailed considerable work for simulator developers, which must be repeated for each new connectivity-generating library that is developed. We have developed a generic connection generator interface that provides a standard way to connect a connectivity-generating library to a simulator, such that one library can easily be replaced by another, according to the modeler's needs. We have used the connection generator interface to connect C++ and Python implementations of the previously described connection-set algebra to the NEST simulator. We also demonstrate how the simulator-independent modeling framework PyNN can transparently take advantage of this, passing a connection description through to the simulator layer for rapid processing in C++ where a simulator supports the connection generator interface, and falling back to slower iteration in Python otherwise. A set of benchmarks demonstrates the good performance of the interface. PMID:24795620
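
    The key idea of such an interface is that the simulator only needs to consume a stream of (source, target) pairs, regardless of which library produced them. The sketch below illustrates that decoupling in Python; class and function names are hypothetical and do not reflect the actual NEST/libneurosim API.

```python
# Illustrative connection-generator decoupling: any object that yields
# (source, target) pairs can drive the simulator-side builder.

class AllToAllGenerator:
    """Yields every (source, target) pair between two populations."""
    def __init__(self, sources, targets):
        self.sources = sources
        self.targets = targets

    def __iter__(self):
        for s in self.sources:
            for t in self.targets:
                yield (s, t)

def build_connections(simulator_connect, generator):
    """Simulator-side driver: pull pairs from any iterable generator
    and hand each one to the simulator's connect routine."""
    for s, t in generator:
        simulator_connect(s, t)
```

    Swapping in a different generator (e.g. one backed by a connection-set algebra expression) requires no change on the simulator side, which is exactly the replaceability the interface aims for.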

  3. Body Constraints on Motor Simulation in Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Conson, Massimiliano; Hamilton, Antonia; De Bellis, Francesco; Errico, Domenico; Improta, Ilaria; Mazzarella, Elisabetta; Trojano, Luigi; Frolli, Alessandro

    2016-01-01

    Developmental data suggested that mental simulation skills become progressively dissociated from overt motor activity across development. Thus, efficient simulation is rather independent from current sensorimotor information. Here, we tested the impact of bodily (sensorimotor) information on simulation skills of adolescents with Autism Spectrum…

  4. Molecular dynamics simulations in hybrid particle-continuum schemes: Pitfalls and caveats

    NASA Astrophysics Data System (ADS)

    Stalter, S.; Yelash, L.; Emamy, N.; Statt, A.; Hanke, M.; Lukáčová-Medvid'ová, M.; Virnau, P.

    2018-03-01

    Heterogeneous multiscale methods (HMM) combine molecular accuracy of particle-based simulations with the computational efficiency of continuum descriptions to model flow in soft matter liquids. In these schemes, molecular simulations typically pose a computational bottleneck, which we investigate in detail in this study. We find that it is preferable to simulate many small systems as opposed to a few large systems, and that a choice of a simple isokinetic thermostat is typically sufficient while thermostats such as Lowe-Andersen allow for simulations at elevated viscosity. We discuss suitable choices for time steps and finite-size effects which arise in the limit of very small simulation boxes. We also argue that if colloidal systems are considered as opposed to atomistic systems, the gap between microscopic and macroscopic simulations regarding time and length scales is significantly smaller. We propose a novel reduced-order technique for the coupling to the macroscopic solver, which allows us to approximate a non-linear stress-strain relation efficiently and thus further reduce computational effort of microscopic simulations.
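
    The reduced-order coupling idea, replacing repeated microscopic stress evaluations with a cheap surrogate fitted to a few sampled (strain rate, stress) pairs, can be sketched with a simple quadratic interpolant. The Lagrange basis used here stands in for whatever basis the actual method employs; it is not the authors' implementation.

```python
# Hedged sketch of a reduced-order surrogate: interpolate stress as a
# quadratic in strain rate through three sampled points, so the
# macroscopic solver can evaluate it cheaply instead of launching a
# molecular simulation at every step.

def quadratic_surrogate(samples):
    """Build a quadratic interpolant through three (strain_rate, stress)
    samples using Lagrange's formula; returns a callable stress(x)."""
    (x0, y0), (x1, y1), (x2, y2) = samples
    def stress(x):
        l0 = (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
        l1 = (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
        l2 = (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1))
        return y0 * l0 + y1 * l1 + y2 * l2
    return stress
```

    Each call to the returned function costs a handful of arithmetic operations, compared with a full particle simulation per macroscopic quadrature point, which is where the computational saving comes from.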

  5. Simulation of the real efficiencies of high-efficiency silicon solar cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sachenko, A. V., E-mail: sach@isp.kiev.ua; Skrebtii, A. I.; Korkishko, R. M.

    The temperature dependences of the efficiency η of high-efficiency solar cells based on silicon are calculated. It is shown that the temperature coefficient of decreasing η with increasing temperature decreases as the surface recombination rate decreases. The photoconversion efficiency of high-efficiency silicon-based solar cells operating under natural (field) conditions is simulated. Their operating temperature is determined self-consistently by simultaneously solving the photocurrent, photovoltage, and energy-balance equations. Radiative and convective cooling mechanisms are taken into account. It is shown that the operating temperature of solar cells is higher than the ambient temperature even at very high convection coefficients (~300 W/m^2 K). Accordingly, the photoconversion efficiency in this case is lower than when the temperature of the solar cells is equal to the ambient temperature. The calculated dependences for the open-circuit voltage and the photoconversion efficiency of high-quality silicon solar cells under concentrated illumination are discussed taking into account the actual temperature of the solar cells.
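
    The self-consistent operating-temperature idea can be sketched as a fixed-point iteration on a simple energy balance between absorbed sunlight, convective cooling, and radiative cooling. All parameter values below are illustrative, and the paper's full model additionally couples in the photocurrent and photovoltage equations.

```python
# Hedged sketch: solve q_abs = h*(T - T_amb) + eps*sigma*(T^4 - T_amb^4)
# for the cell temperature T by fixed-point iteration. Illustrative
# parameters only; not the authors' full coupled model.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def operating_temperature(q_abs, t_amb, h_conv, emissivity=0.9,
                          tol=1e-6, max_iter=1000):
    """Return the steady-state cell temperature (K) for absorbed power
    density q_abs (W/m^2), ambient temperature t_amb (K), and
    convection coefficient h_conv (W/m^2 K)."""
    t = t_amb
    for _ in range(max_iter):
        radiative = emissivity * SIGMA * (t ** 4 - t_amb ** 4)
        t_new = t_amb + (q_abs - radiative) / h_conv
        if abs(t_new - t) < tol:
            return t_new
        t = t_new
    return t
```

    Even with a very high convection coefficient of 300 W/m^2 K, any positive absorbed power forces the steady-state temperature above ambient, consistent with the abstract's conclusion.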

  6. Reproductive and developmental effects of disinfection by-products in drinking water.

    PubMed Central

    Reif, J S; Hatch, M C; Bracken, M; Holmes, L B; Schwetz, B A; Singer, P C

    1996-01-01

    Recent epidemiologic studies have reported associations between the consumption of chlorinated drinking water and reproductive and developmental effects. Here we review the available epidemiologic data, assess the hazard potential posed by exposure to disinfection by-products, identify critical data gaps, and offer recommendations for further research. The epidemiologic evidence supporting associations between exposure to water disinfection by-products (DBPs) and adverse pregnancy outcomes is sparse, and positive findings should be interpreted cautiously. The methods used during the early stages of research in this area have been diverse. Variability in exposure assessment and endpoints makes it difficult to synthesize or combine the available data. Exposure misclassification and unmeasured confounding may have led to bias in risk estimation. Future studies of reproductive outcome and exposure to chlorinated water should use improved methods for exposure assessment to 1) assure selection of appropriate exposure markers, 2) assess seasonal and annual fluctuations in DBPs, 3) assess variability within the distribution system, and 4) assess exposure through multiple routes such as bathing and showering, as well as consumption. Population-based studies should be conducted to evaluate male and female fertility, conception delay, growth retardation, and specific birth defects. The reproductive and developmental effects of exposure to DBPs could be efficiently explored in ongoing investigations by incorporating valid exposure markers and relevant questionnaire information. Future studies should make use of naturally occurring variability in the concentrations of DBPs and may incorporate biomarkers of exposure and effect in their design. Epidemiologic investigations should be conducted in parallel with laboratory-based and animal studies in a coordinated, multidisciplinary approach. PMID:8930546

  7. Measuring taste impairment in epidemiologic studies: the Beaver Dam Offspring Study.

    PubMed

    Cruickshanks, K J; Schubert, C R; Snyder, D J; Bartoshuk, L M; Huang, G H; Klein, B E K; Klein, R; Nieto, F J; Pankow, J S; Tweed, T S; Krantz, E M; Moy, G S

    2009-07-01

    Taste or gustatory function may play an important role in determining diet and nutritional status and therefore indirectly impact health. Yet there have been few attempts to study the spectrum of taste function and dysfunction in human populations. Epidemiologic studies are needed to understand the impact of taste function and dysfunction on public health, to identify modifiable risk factors, and to develop and test strategies to prevent clinically significant dysfunction. However, measuring taste function in epidemiologic studies is challenging and requires repeatable, efficient methods that can measure change over time. Insights gained from translating laboratory-based methods to a population-based study, the Beaver Dam Offspring Study (BOSS) will be shared. In this study, a generalized labeled magnitude scale (gLMS) method was used to measure taste intensity of filter paper disks saturated with salt, sucrose, citric acid, quinine, or 6-n-propylthiouracil, and a gLMS measure of taste preferences was administered. In addition, a portable, inexpensive camera system to capture digital images of fungiform papillae and a masked grading system to measure the density of fungiform papillae were developed. Adult children of participants in the population-based Epidemiology of Hearing Loss Study in Beaver Dam, Wisconsin, are eligible for this ongoing study. The parents were residents of Beaver Dam and 43-84 years of age in 1987-1988; offspring ranged in age from 21-84 years in 2005-2008. Methods will be described in detail and preliminary results about the distributions of taste function in the BOSS cohort will be presented.

  8. The CREST biorepository: a tool for molecular epidemiology and translational studies on malignant mesothelioma, lung cancer, and other respiratory tract diseases.

    PubMed

    Ugolini, Donatella; Neri, Monica; Canessa, Pier Aldo; Casilli, Cristina; Catrambone, Giuseppe; Ivaldi, Giovanni Paolo; Lando, Cecilia; Marroni, Paola; Paganuzzi, Michela; Parodi, Barbara; Visconti, Paola; Puntoni, Riccardo; Bonassi, Stefano

    2008-11-01

    The Cancer of RESpiratory Tract (CREST) biorepository was established to investigate biological mechanisms and to develop tools and strategies for primary and secondary prevention of respiratory tract cancer. The CREST biorepository is focused on pleural malignant mesothelioma, a rare and severe cancer linked to asbestos exposure whose incidence is particularly high in the Ligurian region. The CREST biorepository includes biological specimens from (a) patients with pleural malignant mesothelioma and lung cancer, (b) patients with nonneoplastic respiratory conditions, and (c) control subjects. Whole blood, plasma, serum, lymphocytes, pleural fluid, saliva, and biopsies are collected, and a questionnaire is administered. Collection, transportation, and storage are done according to international standards. As of January 31, 2008, the overall number of subjects recruited was 1,590 (446 lung cancer, 209 pleural malignant mesothelioma, and 935 controls). The biorepository includes a total of 10,055 aliquots (4,741 serum; 3,082 plasma; 1,599 whole blood; 633 pleural fluid; and 561 lymphocytes) and 107 biopsies. Demographic, clinical, and epidemiologic information is collected for each subject and processed in a dedicated database. The CREST biorepository is a valuable tool for molecular epidemiology and translational studies. This structure relies on a network of contacts with local health districts that allows for an active search for patients. This is a particularly efficient approach, especially when the object of the study is a rare cancer type. The CREST experience suggests that the presence of limited resources can be overcome by the biorepository specialization, the high quality of the epidemiologic information, and the variety of samples.

  9. Direct Harmonic Linear Navier-Stokes Methods for Efficient Simulation of Wave Packets

    NASA Technical Reports Server (NTRS)

    Streett, C. L.

    1998-01-01

    Wave packets produced by localized disturbances play an important role in transition in three-dimensional boundary layers, such as that on a swept wing. Starting with the receptivity process, we show the effects of wave-space energy distribution on the development of packets and other three-dimensional disturbance patterns. Nonlinearity in the receptivity process is specifically addressed, including demonstration of an effect which can enhance receptivity of traveling crossflow disturbances. An efficient spatial numerical simulation method allows most of the simulations presented to be carried out on a workstation.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horiike, S.; Okazaki, Y.

    This paper describes a performance estimation tool developed for modeling and simulation of open distributed energy management systems to support their design. The approach of discrete event simulation with detailed models is considered for efficient performance estimation. The tool includes basic models constituting a platform, e.g., Ethernet, communication protocol, operating system, etc. Application software is modeled by specifying CPU time, disk access size, communication data size, etc. Different types of system configurations for various system activities can be easily studied. Simulation examples show how the tool is utilized for the efficient design of open distributed energy management systems.

  11. Computational efficiency and Amdahl’s law for the adaptive resolution simulation technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Junghans, Christoph; Agarwal, Animesh; Delle Site, Luigi

    Here, we discuss the computational performance of the adaptive resolution technique in molecular simulation when compared with equivalent full coarse-grained and full atomistic simulations. We show that an estimate of its efficiency, within 10%–15% accuracy, is given by Amdahl’s Law adapted to the specific quantities involved in the problem. The derivation of the predictive formula is general enough that it may be applied to the general case of molecular dynamics approaches in which a reduction of degrees of freedom occurs in a multiscale fashion.

  12. Computational efficiency and Amdahl’s law for the adaptive resolution simulation technique

    DOE PAGES

    Junghans, Christoph; Agarwal, Animesh; Delle Site, Luigi

    2017-06-01

    Here, we discuss the computational performance of the adaptive resolution technique in molecular simulation when compared with equivalent full coarse-grained and full atomistic simulations. We show that an estimate of its efficiency, within 10%–15% accuracy, is given by Amdahl’s Law adapted to the specific quantities involved in the problem. The derivation of the predictive formula is general enough that it may be applied to the general case of molecular dynamics approaches in which a reduction of degrees of freedom occurs in a multiscale fashion.
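
    As a generic illustration of the kind of estimate involved (this is the classic Amdahl's Law only; the authors' adaptation to adaptive resolution quantities is not reproduced here), the speedup bound can be computed as:

```python
def amdahl_speedup(parallel_fraction: float, speedup_factor: float) -> float:
    """Classic Amdahl's Law: overall speedup when a fraction of the work
    is accelerated by a given factor and the remainder is unchanged."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / speedup_factor)

# Hypothetical numbers: if 80% of a simulation's cost lies in the region that
# coarse-graining accelerates, and coarse-graining gives a 10x speedup there,
# the overall speedup is bounded by:
print(round(amdahl_speedup(0.8, 10.0), 2))  # 3.57
```

    The point the formula makes is the same as the abstract's: overall efficiency is dominated by the fraction of work that cannot be reduced.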

  13. Adaptive multiple super fast simulated annealing for stochastic microstructure reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryu, Seun; Lin, Guang; Sun, Xin

    2013-01-01

    Fast image reconstruction from statistical information is critical in image fusion from multimodality chemical imaging instrumentation, where the goal is to create high-resolution images over large domains. Stochastic methods have been widely used for image reconstruction from two-point correlation functions. The main challenge is to increase the efficiency of reconstruction. A novel simulated annealing method is proposed for fast image reconstruction. Combining the advantages of very fast cooling schedules, dynamic adaptation, and parallelization, the new simulated annealing algorithm increases efficiency by several orders of magnitude, making large-domain image fusion feasible.
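
    For orientation, a minimal sketch of generic simulated annealing with a fast exponential cooling schedule follows (the paper's adaptive, parallel variant and its image-specific moves are not reproduced; the objective and neighbor function below are toy examples):

```python
import math
import random

def simulated_anneal(objective, state, neighbor, t0=1.0, cooling=0.95,
                     steps=500, seed=0):
    """Generic simulated annealing: accept downhill moves always, uphill
    moves with Boltzmann probability, while the temperature decays fast."""
    rng = random.Random(seed)
    energy = objective(state)
    temperature = t0
    for _ in range(steps):
        candidate = neighbor(state, rng)
        delta = objective(candidate) - energy
        if delta <= 0 or rng.random() < math.exp(-delta / temperature):
            state, energy = candidate, energy + delta
        temperature *= cooling  # very fast (exponential) cooling schedule
    return state, energy

# Toy usage: minimize (x - 3)^2 by randomly perturbing x.
best, e = simulated_anneal(lambda x: (x - 3.0) ** 2, 0.0,
                           lambda x, rng: x + rng.uniform(-0.5, 0.5))
```

    In the reconstruction setting, the state would be a candidate image and the objective a mismatch between its two-point correlation function and the target statistics.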

  14. Optimization by simulation of the nature of the buffer, the gap profile of the absorber and the thickness of the various layers in CZTSSe solar cells

    NASA Astrophysics Data System (ADS)

    Chadel, Meriem; Chadel, Asma; Moustafa Bouzaki, Mohammed; Aillerie, Michel; Benyoucef, Boumediene; Charles, Jean-Pierre

    2017-11-01

    The performance of ZnO/ZnS/CZTSSe polycrystalline thin-film solar cells (copper zinc tin sulphur selenium solar cells) was simulated for different thicknesses of the absorber and ZnS buffer layers. Simulations were performed with SCAPS (Solar Cell Capacitance Simulator) software, starting with actual parameters available from industrial data for commercial cell processing. The influences of the thickness of the various layers in the structure of the solar cell and of the gap profile of the CZTSSe absorber layer on the performance of the solar cell were studied in detail. Through consideration of recent works, we discuss possible routes to enhance the performance of CZTSSe solar cells towards a higher efficiency level. We found that for one specific thickness of the absorber layer, the efficiency of the CZTSSe solar cell can be increased when a ZnS layer replaces the usual CdS buffer layer. The efficiency of the solar cell can also be improved when the absorber layer presents a graded band gap. In this case, the maximum efficiency for the CZTSSe cell was found to be 13.73%.

  15. An efficient formulation of robot arm dynamics for control and computer simulation

    NASA Astrophysics Data System (ADS)

    Lee, C. S. G.; Nigam, R.

    This paper describes an efficient formulation of the dynamic equations of motion of industrial robots based on the Lagrange formulation of d'Alembert's principle. Applied to a PUMA robot arm, this formulation results in a set of closed-form second-order differential equations with cross-product terms. These equations are not as efficient to compute as those formulated by the Newton-Euler method, but they provide a better analytical model for control analysis and computer simulation. The computational complexity of this dynamic model, together with that of other models, is tabulated for discussion.

  16. An Initial Multi-Domain Modeling of an Actively Cooled Structure

    NASA Technical Reports Server (NTRS)

    Steinthorsson, Erlendur

    1997-01-01

    A methodology for the simulation of turbine cooling flows is being developed. The methodology seeks to combine numerical techniques that optimize both accuracy and computational efficiency. Key components of the methodology include the use of multiblock grid systems for modeling complex geometries, and multigrid convergence acceleration for enhancing computational efficiency in highly resolved fluid flow simulations. The use of the methodology has been demonstrated in several turbomachinery flow and heat transfer studies. Ongoing and future work involves implementing additional turbulence models, improving computational efficiency, and adding adaptive mesh refinement (AMR).

  17. Simulated Tempering Distributed Replica Sampling, Virtual Replica Exchange, and Other Generalized-Ensemble Methods for Conformational Sampling.

    PubMed

    Rauscher, Sarah; Neale, Chris; Pomès, Régis

    2009-10-13

    Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.
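
    The methods compared above all build on the Metropolis criterion for exchanging configurations between temperatures. As a generic sketch (the standard replica-exchange acceptance rule, not the specific STDR or VREX machinery), the swap probability for two replicas can be written as:

```python
import math

def exchange_probability(beta_i, beta_j, energy_i, energy_j):
    """Metropolis acceptance probability for swapping configurations between
    two inverse temperatures (beta = 1/kT), as used in replica exchange and,
    in modified form, in the simulated-tempering family of methods."""
    delta = (beta_i - beta_j) * (energy_i - energy_j)
    return min(1.0, math.exp(delta))

# A swap is always accepted when the colder replica (larger beta) currently
# holds the higher energy, so the swap moves the low-energy configuration
# to the low temperature (illustrative numbers):
p = exchange_probability(beta_i=1.0, beta_j=0.5, energy_i=-2.0, energy_j=-5.0)
```

    This acceptance rule preserves detailed balance across the temperature ladder, which is what lets a random walk in temperature carry configurations over energetic barriers.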

  18. Formalizing the role of agent-based modeling in causal inference and epidemiology.

    PubMed

    Marshall, Brandon D L; Galea, Sandro

    2015-01-15

    Calls for the adoption of complex systems approaches, including agent-based modeling, in the field of epidemiology have largely centered on the potential for such methods to examine complex disease etiologies, which are characterized by feedback behavior, interference, threshold dynamics, and multiple interacting causal effects. However, considerable theoretical and practical issues impede the capacity of agent-based methods to examine and evaluate causal effects and thus illuminate new areas for intervention. We build on this work by describing how agent-based models can be used to simulate counterfactual outcomes in the presence of complexity. We show that these models are of particular utility when the hypothesized causal mechanisms exhibit a high degree of interdependence between multiple causal effects and when interference (i.e., one person's exposure affects the outcome of others) is present and of intrinsic scientific interest. Although not without challenges, agent-based modeling (and complex systems methods broadly) represent a promising novel approach to identify and evaluate complex causal effects, and they are thus well suited to complement other modern epidemiologic methods of etiologic inquiry.
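
    A minimal, purely illustrative sketch of the central idea, simulating a counterfactual under interference, is given below. The toy contagion model (agent counts, transmission probability, vaccination coverage) is invented for illustration and is not taken from the paper:

```python
import random

def simulate(n_agents=200, n_steps=50, p_transmit=0.05, vaccinated=0.0, seed=1):
    """Toy agent-based contagion model with interference: each agent's
    infection risk depends on how many others are infected, so immunizing
    some agents indirectly protects the rest."""
    rng = random.Random(seed)
    # 0 = susceptible, 1 = infected, 2 = immune (vaccinated)
    state = [2 if rng.random() < vaccinated else 0 for _ in range(n_agents)]
    state[0] = 1  # seed one infection
    for _ in range(n_steps):
        n_infected = state.count(1)
        for i, s in enumerate(state):
            # Per-step infection risk from all currently infected agents.
            if s == 0 and rng.random() < 1 - (1 - p_transmit) ** n_infected:
                state[i] = 1
    return state.count(1)

# Counterfactual contrast: same model, same seed, different intervention.
factual = simulate(vaccinated=0.0)
counterfactual = simulate(vaccinated=0.5)
```

    Comparing `factual` and `counterfactual` runs is the simulation analogue of the counterfactual contrast the paper formalizes; here the difference exceeds the number of agents vaccinated, precisely because of interference.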

  19. Augmenting epidemiological models with point-of-care diagnostics data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pullum, Laura L.; Ramanathan, Arvind; Nutaro, James J.

    Although adoption of newer Point-of-Care (POC) diagnostics is increasing, there is a significant challenge in using POC diagnostics data to improve epidemiological models. In this work, we propose a method to process zip-code level POC datasets and apply these processed data to calibrate an epidemiological model. We specifically develop a calibration algorithm using simulated annealing and calibrate a parsimonious equation-based model of modified Susceptible-Infected-Recovered (SIR) dynamics. The results show that parsimonious models are remarkably effective in predicting the dynamics observed in the number of infected patients, and our calibration algorithm is sufficiently capable of predicting the peak loads observed in POC diagnostics data while staying within reasonable and empirical parameter ranges reported in the literature. Additionally, we explore the future use of the calibrated values by testing the correlation between peak load and population density from Census data. Our results show that linearity assumptions for the relationships among various factors can be misleading; therefore, further data sources and analysis are needed to identify relationships between additional parameters and existing calibrated ones. As a result, calibration approaches such as ours can determine the values of newly added parameters along with existing ones and enable policy-makers to make better multi-scale decisions.

  20. Augmenting epidemiological models with point-of-care diagnostics data

    DOE PAGES

    Pullum, Laura L.; Ramanathan, Arvind; Nutaro, James J.; ...

    2016-04-20

    Although adoption of newer Point-of-Care (POC) diagnostics is increasing, there is a significant challenge in using POC diagnostics data to improve epidemiological models. In this work, we propose a method to process zip-code level POC datasets and apply these processed data to calibrate an epidemiological model. We specifically develop a calibration algorithm using simulated annealing and calibrate a parsimonious equation-based model of modified Susceptible-Infected-Recovered (SIR) dynamics. The results show that parsimonious models are remarkably effective in predicting the dynamics observed in the number of infected patients, and our calibration algorithm is sufficiently capable of predicting the peak loads observed in POC diagnostics data while staying within reasonable and empirical parameter ranges reported in the literature. Additionally, we explore the future use of the calibrated values by testing the correlation between peak load and population density from Census data. Our results show that linearity assumptions for the relationships among various factors can be misleading; therefore, further data sources and analysis are needed to identify relationships between additional parameters and existing calibrated ones. As a result, calibration approaches such as ours can determine the values of newly added parameters along with existing ones and enable policy-makers to make better multi-scale decisions.
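
    As a generic sketch of the dynamics being calibrated (classic SIR with a forward-Euler step; the paper's modified SIR model and its annealing calibration loop are not reproduced, and the parameter values below are illustrative):

```python
def sir_step(s, i, r, beta, gamma, n):
    """One Euler step (dt = 1) of the classic SIR equations:
       dS/dt = -beta*S*I/N, dI/dt = beta*S*I/N - gamma*I, dR/dt = gamma*I."""
    new_infections = beta * s * i / n
    recoveries = gamma * i
    return s - new_infections, i + new_infections - recoveries, r + recoveries

def simulate_sir(beta, gamma, n=10000, i0=10, steps=120):
    """Return the infected time series, e.g. for comparison against observed
    case counts inside a calibration loop."""
    s, i, r = n - i0, i0, 0.0
    series = []
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma, n)
        series.append(i)
    return series

# The peak load is the quantity the calibration is asked to reproduce.
peak_load = max(simulate_sir(beta=0.3, gamma=0.1))
```

    A calibration algorithm such as simulated annealing would perturb `beta` and `gamma`, rerun `simulate_sir`, and score each candidate by its mismatch against the POC time series.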

  1. Computationally Efficient Multiconfigurational Reactive Molecular Dynamics

    PubMed Central

    Yamashita, Takefumi; Peng, Yuxing; Knight, Chris; Voth, Gregory A.

    2012-01-01

    It is a computationally demanding task to explicitly simulate the electronic degrees of freedom in a system to observe the chemical transformations of interest, while at the same time sampling the time and length scales required to converge statistical properties and thus reduce artifacts due to initial conditions, finite-size effects, and limited sampling. One solution that significantly reduces the computational expense consists of molecular models in which effective interactions between particles govern the dynamics of the system. If the interaction potentials in these models are developed to reproduce calculated properties from electronic structure calculations and/or ab initio molecular dynamics simulations, then one can calculate accurate properties at a fraction of the computational cost. Multiconfigurational algorithms, also sometimes referred to as “multistate” algorithms, model the system as a linear combination of several chemical bonding topologies to simulate chemical reactions. These algorithms typically utilize energy and force calculations already found in popular molecular dynamics software packages, thus facilitating their implementation without significant changes to the structure of the code. However, the evaluation of energies and forces for several bonding topologies per simulation step can lead to poor computational efficiency if redundancy is not efficiently removed, particularly with respect to the calculation of long-ranged Coulombic interactions. This paper presents accurate approximations (effective long-range interaction and resulting hybrid methods) and multiple-program parallelization strategies for the efficient calculation of electrostatic interactions in reactive molecular simulations. PMID:25100924

  2. Numerical analysis of combustion characteristics of hybrid rocket motor with multi-section swirl injection

    NASA Astrophysics Data System (ADS)

    Li, Chengen; Cai, Guobiao; Tian, Hui

    2016-06-01

    This paper aims to analyse the combustion characteristics of a hybrid rocket motor with multi-section swirl injection by simulating the combustion flow field. Numerical combustion flow fields and combustion performance parameters are obtained through three-dimensional numerical simulations based on a steady numerical model proposed in this paper. The hybrid rocket motor adopts 98% hydrogen peroxide and polyethylene as the propellants. Multiple injection sections are set along the axis of the solid fuel grain, and the oxidizer enters the combustion chamber by means of tangential injection via the injector ports in the injection sections. Simulation results indicate that the combustion flow field structure of the hybrid rocket motor can be improved by the multi-section swirl injection method. The transformation of the combustion flow field can greatly increase the fuel regression rate and the combustion efficiency. The average fuel regression rate of the motor with multi-section swirl injection is improved by a factor of 8.37 compared with that of the motor with conventional head-end irrotational injection. The combustion efficiency is increased to 95.73%. The simulation results also indicate that (1) additional injection sections increase the fuel regression rate and the combustion efficiency; (2) upstream offset of the injection sections reduces the combustion efficiency; and (3) the fuel regression rate and the combustion efficiency decrease as the number of injector ports in each injection section is reduced.

  3. Principles of scarce medical resource allocation in natural disaster relief: a simulation approach.

    PubMed

    Cao, Hui; Huang, Simin

    2012-01-01

    A variety of triage principles have been proposed. The authors sought to evaluate their effects on how many lives can be saved in a hypothetical disaster, in order to determine an optimal scarce resource-rationing principle in the emergency response domain, considering the trade-off between lifesaving efficiency and ethical issues. A discrete event simulation model is developed to examine the efficiency of four resource-rationing principles: first come-first served, random, most serious first, and least serious first. Seven combinations of available resources are examined in the simulations to evaluate the performance of the principles under different levels of resource scarcity. The simulation results indicate that the performance of the medical resource allocation principles is related to the level of resource scarcity. When scarcity is high, the performances of the four principles differ significantly: the least serious first principle performs best, followed by the random principle, while the most serious first principle performs worst. However, when the scarcity is relieved, there are no significant differences among the random, first come-first served, and least serious first principles, yet the most serious first principle still performs worst. Although the least serious first principle exhibits the highest efficiency, it is not ethically flawless. Considering the trade-off between lifesaving efficiency and ethical issues, random selection is a relatively fair and efficient principle for allocating scarce medical resources in natural disaster responses.
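
    A toy illustration, not the authors' discrete event model, of why rationing order matters under scarcity: if each patient needs an amount of a scarce resource proportional to severity and survives only when fully treated, treating the least serious first saves more patients from a fixed budget. All numbers below are hypothetical:

```python
def lives_saved(severities, capacity, order):
    """Greedy allocation of a scarce resource under a given queue ordering.
    Each patient needs 'severity' units and is saved only if fully treated."""
    if order == "least_serious_first":
        queue = sorted(severities)
    elif order == "most_serious_first":
        queue = sorted(severities, reverse=True)
    else:  # first come-first served: keep arrival order
        queue = list(severities)
    saved, remaining = 0, capacity
    for need in queue:
        if need <= remaining:
            remaining -= need
            saved += 1
    return saved

patients = [1, 2, 2, 3, 5, 8, 9]  # hypothetical resource units per patient
least = lives_saved(patients, 10, "least_serious_first")
most = lives_saved(patients, 10, "most_serious_first")
```

    With 10 units available, `least` treats four patients while `most` treats two, mirroring the ranking the simulations report under high scarcity; the full DES model additionally captures arrival times, deterioration, and randomness.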

  4. Utilizing fast multipole expansions for efficient and accurate quantum-classical molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Schwörer, Magnus; Lorenzen, Konstantin; Mathias, Gerald; Tavan, Paul

    2015-03-01

    Recently, a novel approach to hybrid quantum mechanics/molecular mechanics (QM/MM) molecular dynamics (MD) simulations has been suggested [Schwörer et al., J. Chem. Phys. 138, 244103 (2013)]. Here, the forces acting on the atoms are calculated by grid-based density functional theory (DFT) for a solute molecule and by a polarizable molecular mechanics (PMM) force field for a large solvent environment composed of several 10³-10⁵ molecules, as negative gradients of a DFT/PMM hybrid Hamiltonian. The electrostatic interactions are efficiently described by a hierarchical fast multipole method (FMM). Adopting recent progress of this FMM technique [Lorenzen et al., J. Chem. Theory Comput. 10, 3244 (2014)], which particularly entails a strictly linear scaling of the computational effort with the system size, and adapting this revised FMM approach to the computation of the interactions between the DFT and PMM fragments of a simulation system, here, we show how one can further enhance the efficiency and accuracy of such DFT/PMM-MD simulations. The resulting gain of total performance, as measured for alanine dipeptide (DFT) embedded in water (PMM) by the product of the gains in efficiency and accuracy, amounts to about one order of magnitude. We also demonstrate that the jointly parallelized implementation of the DFT and PMM-MD parts of the computation enables the efficient use of high-performance computing systems. The associated software is available online.

  5. Risk-based management of invading plant disease.

    PubMed

    Hyatt-Twynam, Samuel R; Parnell, Stephen; Stutt, Richard O J H; Gottwald, Tim R; Gilligan, Christopher A; Cunniffe, Nik J

    2017-05-01

    Effective control of plant disease remains a key challenge. Eradication attempts often involve removal of host plants within a certain radius of detection, targeting asymptomatic infection. Here we develop and test potentially more effective, epidemiologically motivated, control strategies, using a mathematical model previously fitted to the spread of citrus canker in Florida. We test risk-based control, which preferentially removes hosts expected to cause a high number of infections in the remaining host population. Removals then depend on past patterns of pathogen spread and host removal, which might be nontransparent to affected stakeholders. This motivates a variable radius strategy, which approximates risk-based control via removal radii that vary by location, but which are fixed in advance of any epidemic. Risk-based control outperforms variable radius control, which in turn outperforms constant radius removal. This result is robust to changes in disease spread parameters and initial patterns of susceptible host plants. However, efficiency degrades if epidemiological parameters are incorrectly characterised. Risk-based control including additional epidemiology can be used to improve disease management, but it requires good prior knowledge for optimal performance. This focuses attention on gaining maximal information from past epidemics, on understanding model transferability between locations and on adaptive management strategies that change over time.

  6. Public Health and Epidemiology Informatics

    PubMed Central

    Bar-Hen, A.; Paragios, N.

    2016-01-01

    Summary Objectives The aim of this manuscript is to provide a brief overview of the scientific challenges that should be addressed in order to unlock the full potential of using data from a general point of view, as well as to present some ideas that could help answer specific needs for data understanding in the field of health sciences and epidemiology. Methods A survey of uses and challenges of big data analyses for medicine and public health was conducted. The first part of the paper focuses on big data techniques, algorithms, and statistical approaches to identify patterns in data. The second part describes some cutting-edge applications of analyses and predictive modeling in public health. Results In recent years, we witnessed a revolution regarding the nature, collection, and availability of data in general. This was especially striking in the health sector and particularly in the field of epidemiology. Data derives from a large variety of sources, e.g. clinical settings, billing claims, care scheduling, drug usage, web based search queries, and Tweets. Conclusion The exploitation of the information relevant to these data (data mining, artificial intelligence) has become one of the most promising as well as challenging tasks from societal and scientific viewpoints, in order to leverage the available information and make public health more efficient. PMID:27830257

  7. The Influence of Socioeconomic Factors on the Epidemiology of Maxillofacial Fractures in Southern Italy.

    PubMed

    Sbordone, Carolina; Barca, Ida; Petrocelli, Marzia; Dell'Aversana Orabona, Giovanni; Vaira, Luigi Angelo; Colangeli, Walter; Cristofaro, Maria Giulia; Giudice, Mario; Giudice, Amerigo; Cassandro, Francesco Maria; Attanasi, Federica; Iaconetta, Giorgio; Califano, Luigi

    2018-05-15

    Maxillofacial fractures represent a serious public health problem. Their epidemiology is extremely variable, and its analysis is crucial to establish effective treatment and prevention of these injuries. The aim of this multicentric retrospective study was to analyze the causes, demographics, incidence, and characteristics of maxillofacial trauma in 987 patients diagnosed between 2011 and 2015 at the Complex Operative Unit of Maxillofacial Surgery of Federico II University of Naples and Magna Graecia University of Catanzaro, Italy; 657 male and 310 female patients were included in the study. The most frequently observed fracture involved the mandible (399 patients, 35.4%), followed by the zygomatic complex (337 patients, 29.9%), orbital walls (160 patients, 14.2%), and nasal bones (129 patients, 11.4%). The most frequent cause of fracture was assault (30.4%), followed by road traffic injuries (27.2%), falls (23.2%), sport accidents (15.4%), and other causes (2.6%). Significant variations in etiology were detected between the 2 hospitals, in relation to different migration flow trends and cultural and socioeconomic features. Epidemiological analysis of maxillofacial fractures is crucial to identify the trauma burden and to help develop a more efficient system to plan resource allocation, deliver care and preventive measures, and establish clinical and research priorities for effective treatment and prevention of these injuries.

  8. Self-Learning Monte Carlo Method

    NASA Astrophysics Data System (ADS)

    Liu, Junwei; Qi, Yang; Meng, Zi Yang; Fu, Liang

Monte Carlo simulation is an unbiased numerical tool for studying classical and quantum many-body systems. One of its bottlenecks is the lack of a general and efficient update algorithm for large systems close to a phase transition or with strong frustration, for which local updates perform badly. In this work, we propose a new general-purpose Monte Carlo method, dubbed self-learning Monte Carlo (SLMC), in which an efficient update algorithm is first learned from the training data generated in trial simulations and then used to speed up the actual simulation. We demonstrate the efficiency of SLMC in a spin model at the phase transition point, achieving a 10-20 times speedup. This work is supported by the DOE Office of Basic Energy Sciences, Division of Materials Sciences and Engineering under Award DE-SC0010526.

  9. Maximizing fluorescence collection efficiency in multiphoton microscopy

    PubMed Central

    Zinter, Joseph P.; Levene, Michael J.

    2011-01-01

    Understanding fluorescence propagation through a multiphoton microscope is of critical importance in designing high performance systems capable of deep tissue imaging. Optical models of a scattering tissue sample and the Olympus 20X 0.95NA microscope objective were used to simulate fluorescence propagation as a function of imaging depth for physiologically relevant scattering parameters. The spatio-angular distribution of fluorescence at the objective back aperture derived from these simulations was used to design a simple, maximally efficient post-objective fluorescence collection system. Monte Carlo simulations corroborated by data from experimental tissue phantoms demonstrate collection efficiency improvements of 50% – 90% over conventional, non-optimized fluorescence collection geometries at large imaging depths. Imaging performance was verified by imaging layer V neurons in mouse cortex to a depth of 850 μm. PMID:21934897

  10. TU-AB-303-11: Predict Parotids Deformation Applying SIS Epidemiological Model in H&N Adaptive RT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Maffei, N; Guidi, G; University of Bologna, Bologna

    2015-06-15

Purpose: The aim is to investigate the use of epidemiological models to predict morphological variations in patients undergoing radiation therapy (RT). The susceptible-infected-susceptible (SIS) deterministic model was applied to simulate warping within a focused region of interest (ROI). The hypothesis is to consider each voxel as a single subject of the whole sample and to treat displacement vector fields like an infection. Methods: Using Raystation hybrid deformation algorithms and automatic re-contouring based on a mesh grid, we post-processed 360 MVCT images of 12 H&N patients treated with Tomotherapy. The study focused on the parotid glands, identified by the literature and previous analysis as the ROI most susceptible to warping in the H&N region. Susceptible (S) and infectious (I) cases were identified in voxels with inter-fraction movement respectively under and over a set threshold. IronPython scripting allowed export of the positions and displacement data of surface voxels for every fraction. A homemade MATLAB toolbox was developed to model the SIS. Results: The SIS model was validated by simulating organ motion on a QUASAR phantom. Applying the model to patients, within a [0–1 cm] range, a single-voxel movement of 0.4 cm was selected as the displacement threshold. SIS indexes were evaluated by MATLAB simulations. A dynamic time warping algorithm was used to assess the match between the model and parotid behavior across treatment days. The best fit of the model was obtained with a contact rate of 7.89±0.94 and a recovery rate of 2.36±0.21. Conclusion: The SIS model can follow daily structure evolution, making it possible to compare warping conditions and highlighting challenges due to abnormal variations and set-up errors. Through an epidemiological approach, organ motion can be assessed and predicted not as an average over the whole ROI, but as a voxel-by-voxel deterministic trend. 
By identifying the anatomical regions subject to variation, it would be possible to focus clinical controls on a cohort of pre-selected patients eligible for adaptive RT. The research is partially co-funded by the Italian Research Grant: Dose warping methods for IGRT and Adaptive RT: dose accumulation based on organ motion and anatomical variations of the patients during radiation therapy treatments, MoH (GR-2010-2318757) and Tecnologie Avanzate S.r.l. (Italy)
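The deterministic SIS dynamics used in this abstract can be sketched with a simple Euler integration; the contact and recovery rates below are the fitted values reported in the abstract, while the initial infected fraction, step size, and horizon are illustrative assumptions.

```python
def simulate_sis(beta, gamma, i0=0.01, t_max=50.0, dt=0.01):
    """Euler integration of the deterministic SIS model
    dI/dt = beta*(1 - I)*I - gamma*I, with I the infected fraction."""
    i = i0
    for _ in range(int(t_max / dt)):
        i += (beta * (1.0 - i) * i - gamma * i) * dt
    return i

# With the fitted contact/recovery rates from the abstract, the infected
# fraction settles at the endemic equilibrium 1 - gamma/beta (about 0.70).
i_star = 1.0 - 2.36 / 7.89
i_end = simulate_sis(beta=7.89, gamma=2.36)
```

Because the contact rate exceeds the recovery rate, the "infection" (warping) persists at a stable endemic level rather than dying out, which is what makes the SIS form usable for tracking a sustained voxel-level deformation trend.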

  11. Examining the Relationship between Pre-Malignant Breast Lesions, Carcinogenesis and Tumor Evolution in the Mammary Epithelium Using an Agent-Based Model.

    PubMed

    Chapa, Joaquin; An, Gary; Kulkarni, Swati A

    2016-01-01

    Breast cancer, the product of numerous rare mutational events that occur over an extended time period, presents numerous challenges to investigators interested in studying the transformation from normal breast epithelium to malignancy using traditional laboratory methods, particularly with respect to characterizing transitional and pre-malignant states. Dynamic computational modeling can provide insight into these pathophysiological dynamics, and as such we use a previously validated agent-based computational model of the mammary epithelium (the DEABM) to investigate the probabilistic mechanisms by which normal populations of ductal cells could transform into states replicating features of both pre-malignant breast lesions and a diverse set of breast cancer subtypes. The DEABM consists of simulated cellular populations governed by algorithms based on accepted and previously published cellular mechanisms. Cells respond to hormones, undergo mitosis, apoptosis and cellular differentiation. Heritable mutations to 12 genes prominently implicated in breast cancer are acquired via a probabilistic mechanism. 3000 simulations of the 40-year period of menstrual cycling were run in wild-type (WT) and BRCA1-mutated groups. Simulations were analyzed by development of hyperplastic states, incidence of malignancy, hormone receptor and HER-2 status, frequency of mutation to particular genes, and whether mutations were early events in carcinogenesis. Cancer incidence in WT (2.6%) and BRCA1-mutated (45.9%) populations closely matched published epidemiologic rates. Hormone receptor expression profiles in both WT and BRCA groups also closely matched epidemiologic data. Hyperplastic populations carried more mutations than normal populations and mutations were similar to early mutations found in ER+ tumors (telomerase, E-cadherin, TGFB, RUNX3, p < .01). 
ER- tumors carried significantly more mutations and carried more early mutations in BRCA1, c-MYC and genes associated with epithelial-mesenchymal transition. The DEABM generates diverse tumors that express tumor markers consistent with epidemiologic data. The DEABM also generates non-invasive, hyperplastic populations, analogous to atypia or ductal carcinoma in situ (DCIS), via mutations to genes known to be present in hyperplastic lesions and as early mutations in breast cancers. The results demonstrate that agent-based models are well-suited to studying tumor evolution through stages of carcinogenesis and have the potential to be used to develop prevention and treatment strategies.
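The mutation-accumulation mechanism described above can be illustrated with a toy agent-based sketch. This is not the DEABM: only the 12-gene pool comes from the abstract, while the per-cycle mutation probability, the four-hit malignancy threshold, and the cycle count are hypothetical parameters chosen for illustration.

```python
import random

def simulate_cohort(n_subjects, cycles, p_mut, genes=12,
                    threshold=4, germline=0, seed=0):
    """Toy agent-based sketch: each simulated subject accumulates
    heritable mutations to distinct genes over repeated cycles;
    malignancy is declared once `threshold` genes are mutated.
    `germline` marks genes mutated from the start (e.g. BRCA1)."""
    rng = random.Random(seed)
    cancers = 0
    for _ in range(n_subjects):
        mutated = set(range(germline))  # germline hits present at birth
        for _ in range(cycles):
            gene = rng.randrange(genes)
            if rng.random() < p_mut:
                mutated.add(gene)
            if len(mutated) >= threshold:
                cancers += 1
                break
    return cancers / n_subjects

# ~40 years of monthly-ish cycles, hypothetical mutation rate.
wt = simulate_cohort(3000, cycles=520, p_mut=0.002)
brca = simulate_cohort(3000, cycles=520, p_mut=0.002, germline=1)
```

Even this minimal version reproduces the qualitative finding that a single germline hit sharply raises simulated incidence, since fewer somatic events are needed to cross the threshold.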

  12. Tree tensor network approach to simulating Shor's algorithm

    NASA Astrophysics Data System (ADS)

    Dumitrescu, Eugene

    2017-12-01

Constructively simulating quantum systems furthers our understanding of qualitative and quantitative features which may be analytically intractable. In this paper, we directly simulate and explore the entanglement structure present in the paradigmatic example for exponential quantum speedups: Shor's algorithm. To perform our simulation, we construct a dynamic tree tensor network which manifestly captures two salient circuit features for modular exponentiation. These are the natural two-register bipartition and the invariance of entanglement with respect to permutations of the top-register qubits. Our construction helps identify the entanglement entropy properties, which we summarize by a scaling relation. Further, the tree network is efficiently projected onto a matrix product state from which we efficiently execute the quantum Fourier transform. Future simulation of quantum information states with tensor networks exploiting circuit symmetries is discussed.

  13. A computable expression of closure to efficient causation.

    PubMed

    Mossio, Matteo; Longo, Giuseppe; Stewart, John

    2009-04-07

In this paper, we propose a mathematical expression of closure to efficient causation in terms of lambda-calculus; we argue that this opens up the perspective of developing principled computer simulations of systems closed to efficient causation in an appropriate programming language. An important implication of our formulation is that, by exhibiting an expression in lambda-calculus, which is a paradigmatic formalism for computability and programming, we show that there are no conceptual or principled problems in realizing a computer simulation or model of closure to efficient causation. We conclude with a brief discussion of the question whether closure to efficient causation captures all relevant properties of living systems. We suggest that it might not be the case, and that more complex definitions could indeed create some crucial obstacles to computability.
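The claim that self-referential "production loops" are expressible in lambda-calculus can be illustrated, very loosely, with a fixed-point combinator: a term whose output is a function produced by applying its own definition to itself. The sketch below uses the strict-evaluation (Z) form of the combinator in Python; it is an analogy for computable self-reference, not the authors' formal construction of closure to efficient causation.

```python
# Z combinator (call-by-value fixed-point combinator):
# fix(f) returns a function g that behaves as f(g), i.e. g is
# "produced by" the very process it participates in.
fix = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# A function defined only in terms of its own (not-yet-existing) self:
fact = fix(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
```

That such circular definitions reduce to ordinary computable functions is the sense in which the paper argues there is no principled obstacle to simulating causal closure.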

  14. Malaria epidemiology and economics: the effect of delayed immune acquisition on the cost-effectiveness of insecticide-treated bednets.

    PubMed Central

    Guyatt, H L; Snow, R W; Evans, D B

    1999-01-01

An understanding of the epidemiology of a disease is central in evaluating the health impact and cost-effectiveness of control interventions. The epidemiology of life-threatening malaria is receiving renewed interest, with concerns that the implementation of preventive measures such as insecticide-treated bednets (ITNs), while protecting young children, might in fact increase the risks of mortality and morbidity in older ages by delaying the acquisition of functional immunity. This paper aims to illustrate how a combined approach of epidemiology and economics can be used to (i) explore the long-term impact of changes in epidemiological profiles, and (ii) identify those variables that are critical in determining whether an intervention will be an efficient use of resources. The key parameters for determining effectiveness are the protective efficacy of ITNs (reduction in all-cause mortality), the malaria-attributable mortality and the increased malaria-specific mortality risk due to delays in the acquisition of functional immunity. In particular, the analysis demonstrates that delayed immune acquisition is not a problem per se, but that the critical issue is whether it occurs immediately following the implementation of an ITN programme or whether it builds up slowly over time. In the 'worst case' scenario where ITNs immediately increase malaria-specific mortality due to reduced immunity, the intervention might actually cost lives. In other words, it might be better to not use ITNs. On the other hand, if reduced immunity takes two years to develop, ITNs would still fall into the category of excellent value for money compared to other health interventions, averting a year of life lost (YLL) at a cost of between US$25 and US$30. These types of calculations are important in identifying the parameters which field researchers should be seeking to measure to address the important question of the net impact of delaying the acquisition of immunity through preventive control measures. 
PMID:10365407
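The cost-effectiveness arithmetic underlying this abstract reduces to a net-benefit ratio. The sketch below is illustrative only: the programme cost, deaths averted, delayed-immunity deaths, and YLLs per death are hypothetical numbers chosen so the result lands in the US$25-30 per YLL band discussed above.

```python
def cost_per_yll(programme_cost, child_deaths_averted,
                 delayed_deaths_added, yll_per_death):
    """Illustrative cost-effectiveness ratio in US$ per year of life
    lost (YLL) averted. If delayed-immunity deaths exceed the child
    deaths averted, the intervention costs lives on net."""
    net_deaths_averted = child_deaths_averted - delayed_deaths_added
    if net_deaths_averted <= 0:
        raise ValueError("intervention averts no net deaths")
    return programme_cost / (net_deaths_averted * yll_per_death)

# Hypothetical: $100,000 programme, 120 child deaths averted,
# 20 later deaths from delayed immunity, ~35 YLLs per child death.
ratio = cost_per_yll(100_000, 120, 20, 35)  # 100000 / (100 * 35) ≈ 28.6
```

The ValueError branch captures the paper's 'worst case' scenario, in which an immediate immunity penalty makes the net effect negative and the ratio undefined.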

  15. The 'number needed to sample' in primary care research. Comparison of two primary care sampling frames for chronic back pain.

    PubMed

    Smith, Blair H; Hannaford, Philip C; Elliott, Alison M; Smith, W Cairns; Chambers, W Alastair

    2005-04-01

Sampling for primary care research must strike a balance between efficiency and external validity. For most conditions, even a large population sample will yield a small number of cases, yet other sampling techniques risk problems with extrapolation of findings. To compare the efficiency and external validity of two sampling methods for both an intervention study and epidemiological research in primary care--a convenience sample and a general population sample--comparing the response and follow-up rates, the demographic and clinical characteristics of each sample, and calculating the 'number needed to sample' (NNS) for a hypothetical randomized controlled trial. In 1996, we selected two random samples of adults from 29 general practices in Grampian, for an epidemiological study of chronic pain. One sample of 4175 was identified by an electronic search that listed patients receiving regular analgesic prescriptions--the 'repeat prescription sample'. The other sample of 5036 was identified from all patients on practice lists--the 'general population sample'. Questionnaires, including demographic, pain and general health measures, were sent to all. A similar follow-up questionnaire was sent in 2000 to all those agreeing to participate in further research. We identified a potential group of subjects for a hypothetical trial in primary care based on a recently published trial (those aged 25-64, with severe chronic back pain, willing to participate in further research). The repeat prescription sample produced better response rates than the general sample overall (86% compared with 82%, P < 0.001), from both genders and from the oldest and youngest age groups. The NNS using convenience sampling was 10 for each member of the final potential trial sample, compared with 55 using general population sampling. There were important differences between the samples in age, marital and employment status, social class and educational level. 
However, among the potential trial sample, there were no demographic differences. Those from the repeat prescription sample had poorer indices than the general population sample in all pain and health measures. The repeat prescription sampling method was approximately five times more efficient than the general population method. However, demographic and clinical differences in the repeat prescription sample might hamper extrapolation of findings to the general population, particularly in an epidemiological study, and demonstrate that simple comparison with age and gender of the target population is insufficient.
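The NNS metric itself is a simple ratio. The eligible-case counts below are illustrative assumptions chosen so the sample sizes from the abstract reproduce the reported NNS values of roughly 10 and 55; they are not taken from the paper.

```python
def number_needed_to_sample(n_sampled, n_eligible):
    """'Number needed to sample': individuals who must be approached
    to yield one eligible trial participant."""
    if n_eligible == 0:
        return float("inf")
    return n_sampled / n_eligible

# Illustrative eligible-case counts (418 and 92 are assumptions):
nns_repeat = number_needed_to_sample(4175, 418)    # close to 10
nns_general = number_needed_to_sample(5036, 92)    # close to 55
efficiency_ratio = nns_general / nns_repeat        # roughly 5x
```

The ratio of the two NNS values is what justifies the abstract's claim that the repeat prescription frame was about five times more efficient.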

  16. On the powerful use of simulations in the quake-catcher network to efficiently position low-cost earthquake sensors

    USGS Publications Warehouse

    Benson, K.; Estrada, T.; Taufer, M.; Lawrence, J.; Cochran, E.

    2011-01-01

The Quake-Catcher Network (QCN) uses low-cost sensors connected to volunteer computers across the world to monitor seismic events. The location and density of these sensors' placement can impact the accuracy of the event detection. Because testing different spatial arrangements of new sensors could disrupt the currently active project, this would best be accomplished in a simulated environment. This paper presents an accurate and efficient framework for simulating the low-cost QCN sensors and identifying their most effective locations and densities. Results presented show how our simulations are reliable tools to study diverse scenarios under different geographical and infrastructural constraints. © 2011 IEEE.

  17. A hybrid algorithm for parallel molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Mangiardi, Chris M.; Meyer, R.

    2017-10-01

    This article describes algorithms for the hybrid parallelization and SIMD vectorization of molecular dynamics simulations with short-range forces. The parallelization method combines domain decomposition with a thread-based parallelization approach. The goal of the work is to enable efficient simulations of very large (tens of millions of atoms) and inhomogeneous systems on many-core processors with hundreds or thousands of cores and SIMD units with large vector sizes. In order to test the efficiency of the method, simulations of a variety of configurations with up to 74 million atoms have been performed. Results are shown that were obtained on multi-core systems with Sandy Bridge and Haswell processors as well as systems with Xeon Phi many-core processors.

  18. Efficiently passing messages in distributed spiking neural network simulation.

    PubMed

    Thibeault, Corey M; Minkovich, Kirill; O'Brien, Michael J; Harris, Frederick C; Srinivasa, Narayan

    2013-01-01

    Efficiently passing spiking messages in a neural model is an important aspect of high-performance simulation. As the scale of networks has increased so has the size of the computing systems required to simulate them. In addition, the information exchange of these resources has become more of an impediment to performance. In this paper we explore spike message passing using different mechanisms provided by the Message Passing Interface (MPI). A specific implementation, MVAPICH, designed for high-performance clusters with Infiniband hardware is employed. The focus is on providing information about these mechanisms for users of commodity high-performance spiking simulators. In addition, a novel hybrid method for spike exchange was implemented and benchmarked.

  19. High-efficiency generation of Bessel beams with transmissive metasurfaces

    NASA Astrophysics Data System (ADS)

    Wang, Zhuo; Dong, Shaohua; Luo, Weijie; Jia, Min; Liang, Zhongzhu; He, Qiong; Sun, Shulin; Zhou, Lei

    2018-05-01

Circularly polarized Bessel beams (BBs) are important in biomolecule-sensing-related applications, but the available generators are too bulky and/or exhibit low efficiencies. Here, we design and fabricate ultra-thin (~λ/6) transmissive Pancharatnam-Berry metasurfaces and perform near-field scanning measurements to show that they can generate circularly polarized BBs within a frequency window of 10.7-12.3 GHz. We experimentally demonstrate that the generated BBs exhibit a self-healing effect, illustrating their non-diffraction characteristics. Finally, we employ far-field measurements to demonstrate that the working efficiency of our devices can reach 91%, while the simulated efficiency reaches 92%. All experimental results are in perfect agreement with full-wave simulations.

  20. Methods for estimation of radiation risk in epidemiological studies accounting for classical and Berkson errors in doses.

    PubMed

    Kukush, Alexander; Shklyar, Sergiy; Masiuk, Sergii; Likhtarov, Illya; Kovgan, Lina; Carroll, Raymond J; Bouville, Andre

    2011-02-16

With a binary response Y, the dose-response model under consideration is logistic in flavor, with pr(Y=1 | D) = R/(1+R), R = λ_0 + EAR·D, where λ_0 is the baseline incidence rate and EAR is the excess absolute risk per gray. The calculated thyroid dose of a person i is expressed as D_i^mes = f_i Q_i^mes / M_i^mes. Here, Q_i^mes is the measured content of radioiodine in the thyroid gland of person i at time t^mes, M_i^mes is the estimate of the thyroid mass, and f_i is the normalizing multiplier. The Q_i and M_i are measured with multiplicative errors V_i^Q and V_i^M, so that Q_i^mes = Q_i^tr V_i^Q (a classical measurement error model) and M_i^tr = M_i^mes V_i^M (a Berkson measurement error model). Here, Q_i^tr is the true content of radioactivity in the thyroid gland, and M_i^tr is the true value of the thyroid mass. The error in f_i is much smaller than the errors in (Q_i^mes, M_i^mes) and is ignored in the analysis. By means of Parametric Full Maximum Likelihood and Regression Calibration (under the assumption that the data set of true doses has a lognormal distribution), Nonparametric Full Maximum Likelihood, Nonparametric Regression Calibration, and a properly tuned SIMEX method, we study the influence of measurement errors in thyroid dose on the estimates of λ_0 and EAR. A simulation study is presented based on a real sample from the epidemiological studies. The doses were reconstructed in the framework of the Ukrainian-American project on the investigation of post-Chernobyl thyroid cancers in Ukraine, and the underlying subpopulation was artificially enlarged in order to increase the statistical power. The true risk parameters were given the values from earlier epidemiological studies, and the binary response was then simulated according to the dose-response model.
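The two error structures in this dose model can be sketched side by side: the classical error multiplies the true activity to give the measurement, while the Berkson error multiplies the measurement to give the truth. All distributional choices below (lognormal activities, masses, and error factors, and the specific sigmas) are illustrative assumptions, not the paper's fitted values.

```python
import math
import random

def simulate_doses(n, f=1.0, sigma_q=0.3, sigma_m=0.3, seed=1):
    """Sketch of the dose model D_i = f * Q_i / M_i with a
    multiplicative classical error on activity (measured = true * V)
    and a multiplicative Berkson error on mass (true = measured * V)."""
    rng = random.Random(seed)
    true_doses, measured_doses = [], []
    for _ in range(n):
        q_true = math.exp(rng.gauss(0.0, 1.0))   # true thyroid activity
        m_meas = math.exp(rng.gauss(0.0, 0.5))   # measured thyroid mass
        v_q = math.exp(rng.gauss(0.0, sigma_q))  # classical error factor
        v_m = math.exp(rng.gauss(0.0, sigma_m))  # Berkson error factor
        q_meas = q_true * v_q                    # Q^mes = Q^tr * V^Q
        m_true = m_meas * v_m                    # M^tr = M^mes * V^M
        true_doses.append(f * q_true / m_true)
        measured_doses.append(f * q_meas / m_meas)
    return true_doses, measured_doses
```

Note the asymmetry: with both sigmas set to zero the two dose series coincide, and it is the direction of each error equation (which quantity is conditioned on) that determines whether naive risk estimates are attenuated (classical) or approximately unbiased (Berkson).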

  1. Improved Variable Selection Algorithm Using a LASSO-Type Penalty, with an Application to Assessing Hepatitis B Infection Relevant Factors in Community Residents

    PubMed Central

    Guo, Pi; Zeng, Fangfang; Hu, Xiaomin; Zhang, Dingmei; Zhu, Shuming; Deng, Yu; Hao, Yuantao

    2015-01-01

    Objectives In epidemiological studies, it is important to identify independent associations between collective exposures and a health outcome. The current stepwise selection technique ignores stochastic errors and suffers from a lack of stability. The alternative LASSO-penalized regression model can be applied to detect significant predictors from a pool of candidate variables. However, this technique is prone to false positives and tends to create excessive biases. It remains challenging to develop robust variable selection methods and enhance predictability. Material and methods Two improved algorithms denoted the two-stage hybrid and bootstrap ranking procedures, both using a LASSO-type penalty, were developed for epidemiological association analysis. The performance of the proposed procedures and other methods including conventional LASSO, Bolasso, stepwise and stability selection models were evaluated using intensive simulation. In addition, methods were compared by using an empirical analysis based on large-scale survey data of hepatitis B infection-relevant factors among Guangdong residents. Results The proposed procedures produced comparable or less biased selection results when compared to conventional variable selection models. In total, the two newly proposed procedures were stable with respect to various scenarios of simulation, demonstrating a higher power and a lower false positive rate during variable selection than the compared methods. In empirical analysis, the proposed procedures yielding a sparse set of hepatitis B infection-relevant factors gave the best predictive performance and showed that the procedures were able to select a more stringent set of factors. The individual history of hepatitis B vaccination, family and individual history of hepatitis B infection were associated with hepatitis B infection in the studied residents according to the proposed procedures. 
Conclusions The newly proposed procedures improve the identification of significant variables and enable us to derive a new insight into epidemiological association analysis. PMID:26214802
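The bootstrap ranking idea can be sketched in a few lines: refit a selector on resampled data and count how often each predictor is chosen. As a stand-in for the LASSO fit (which needs an optimizer), the hypothetical sketch below scores predictors by absolute correlation with the outcome; the ranking-by-selection-frequency machinery is the part being illustrated.

```python
import random

def bootstrap_selection_frequency(x_rows, y, n_boot=200, top_k=2, seed=0):
    """Bootstrap ranking sketch: on each resample, score every
    predictor (here by |correlation| with the outcome, standing in
    for a LASSO fit) and count how often it lands in the top_k set."""
    rng = random.Random(seed)
    n, p = len(x_rows), len(x_rows[0])
    counts = [0] * p

    def corr(a, b):
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
        va = sum((u - ma) ** 2 for u in a)
        vb = sum((v - mb) ** 2 for v in b)
        denom = (va * vb) ** 0.5
        return cov / denom if denom else 0.0

    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        xb = [x_rows[i] for i in idx]
        yb = [y[i] for i in idx]
        scores = [abs(corr([row[j] for row in xb], yb)) for j in range(p)]
        for j in sorted(range(p), key=scores.__getitem__, reverse=True)[:top_k]:
            counts[j] += 1
    return [c / n_boot for c in counts]
```

Predictors with selection frequencies near 1 across resamples form the stable, sparse set; spurious predictors are selected only occasionally, which is the stability property the abstract credits to the proposed procedures.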

  2. Assessing Disparities of Dengue Virus Transmission Risk across the US-Mexican Border Using a Climate Driven Vector-Epidemiological Model

    NASA Technical Reports Server (NTRS)

    Morin, Cory; Monaghan, Andrew; Quattrochi, Dale; Crosson, William; Hayden, Mary; Ernst, Kacey

    2015-01-01

Dengue fever is a mosquito-borne viral disease reemerging throughout much of the tropical Americas. Dengue virus transmission is explicitly influenced by climate and the environment through its primary vector, Aedes aegypti. Temperature regulates Ae. aegypti development, survival, and replication rates as well as the incubation period of the virus within the mosquito. Precipitation provides water for many of the preferred breeding habitats of the mosquito, including buckets, old tires, and other places water can collect. Although transmission regularly occurs along the border region in Mexico, dengue virus transmission in bordering Arizona has not occurred. Using NASA's TRMM (Tropical Rainfall Measuring Mission) satellite for precipitation input and Daymet for temperature and supplemental precipitation input, we modeled dengue transmission along a US-Mexico transect using a dynamic dengue transmission model that includes interacting vector ecology and epidemiological components. Model runs were performed for 5 cities in Sonora, Mexico and southern Arizona. Employing a Monte Carlo approach, we performed ensembles of several thousand model simulations in order to resolve the model uncertainty arising from using different combinations of parameter values that are not well known. For cities with reported dengue case data, the top model simulations that best reproduced dengue case numbers were retained and their parameter values were extracted for comparison. These parameter values were used to run simulations in areas where dengue virus transmission does not occur or where dengue fever case data was unavailable. Additional model runs were performed to reveal how changes in climate or parameter values could alter transmission risk along the transect. The relative influence of climate variability and model parameters on dengue virus transmission is assessed to help public health workers prepare location-specific infection prevention strategies.
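The ensemble step described above (run many parameter draws, keep the runs that best match observed cases, extract their parameters) can be sketched generically. The simulator, the single uniform parameter, and the RMSE score below are all hypothetical stand-ins for the paper's vector-epidemiological model and its inputs.

```python
import random

def retain_best_parameters(observed, simulator, n_runs=500,
                           keep_frac=0.05, seed=0):
    """Monte Carlo ensemble sketch: draw parameter sets from a broad
    prior, score each simulated case series against observations by
    RMSE, and keep the best-fitting fraction for comparison."""
    rng = random.Random(seed)
    scored = []
    for _ in range(n_runs):
        theta = rng.uniform(0.0, 1.0)          # e.g. a transmission rate
        sim = simulator(theta, rng)
        rmse = (sum((s - o) ** 2 for s, o in zip(sim, observed))
                / len(observed)) ** 0.5
        scored.append((rmse, theta))
    scored.sort()
    return [t for _, t in scored[:max(1, int(n_runs * keep_frac))]]

# Toy simulator: weekly cases proportional to the parameter, plus noise.
def toy_simulator(theta, rng):
    return [100.0 * theta + rng.gauss(0.0, 2.0) for _ in range(8)]

observed = [60.0] * 8  # consistent with theta near 0.6
kept = retain_best_parameters(observed, toy_simulator)
```

The retained parameter values cluster around those consistent with the data, and (as in the abstract) can then be reused to drive simulations in locations lacking case data.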

  3. Inference of Transmission Network Structure from HIV Phylogenetic Trees

    DOE PAGES

    Giardina, Federica; Romero-Severson, Ethan Obie; Albert, Jan; ...

    2017-01-13

Phylogenetic inference is an attractive means to reconstruct transmission histories and epidemics. However, there is not a perfect correspondence between transmission history and virus phylogeny. Both node height and topological differences may occur, depending on the interaction between within-host evolutionary dynamics and between-host transmission patterns. To investigate these interactions, we added a within-host evolutionary model in epidemiological simulations and examined if the resulting phylogeny could recover different types of contact networks. To further improve realism, we also introduced patient-specific differences in infectivity across disease stages, and on the epidemic level we considered incomplete sampling and the age of the epidemic. Second, we implemented an inference method based on approximate Bayesian computation (ABC) to discriminate among three well-studied network models and jointly estimate both network parameters and key epidemiological quantities such as the infection rate. Our ABC framework used both topological and distance-based tree statistics for comparison between simulated and observed trees. Overall, our simulations showed that a virus time-scaled phylogeny (genealogy) may be substantially different from the between-host transmission tree. This has important implications for the interpretation of what a phylogeny reveals about the underlying epidemic contact network. In particular, we found that while the within-host evolutionary process obscures the transmission tree, the diversification process and infectivity dynamics also add discriminatory power to differentiate between different types of contact networks. We also found that the possibility to differentiate contact networks depends on how far an epidemic has progressed, where distance-based tree statistics have more power early in an epidemic. Finally, we applied our ABC inference on two different outbreaks from the Swedish HIV-1 epidemic.
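The rejection form of ABC model choice can be sketched in a few lines: draw a candidate model, simulate a summary statistic, and accept the draw if it falls within a tolerance of the observed statistic. The one-dimensional Gaussian "tree statistic" and the three toy models below are hypothetical stand-ins for the paper's network models and its topological/distance-based tree statistics.

```python
import random

def abc_model_choice(observed_stat, simulators, n_draws=3000,
                     tol=0.5, seed=0):
    """Rejection-ABC sketch for model choice: models are drawn
    uniformly, a summary statistic is simulated, and accepted counts
    approximate posterior model probabilities."""
    rng = random.Random(seed)
    accepted = [0] * len(simulators)
    for _ in range(n_draws):
        m = rng.randrange(len(simulators))
        if abs(simulators[m](rng) - observed_stat) < tol:
            accepted[m] += 1
    total = sum(accepted)
    return [a / total for a in accepted] if total else accepted

# Toy "network models" implying different mean tree statistics.
models = [lambda r: r.gauss(1.0, 0.3),   # e.g. random mixing
          lambda r: r.gauss(2.0, 0.3),   # e.g. preferential attachment
          lambda r: r.gauss(3.0, 0.3)]   # e.g. strong clustering
posterior = abc_model_choice(observed_stat=2.0, simulators=models)
```

When the models imply well-separated statistic distributions, most accepted draws come from the model closest to the observation, which is the discriminatory power the abstract measures for its tree statistics.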

  4. Does consideration of larger study areas yield more accurate estimates of air pollution health effects? An illustration of the bias-variance trade-off in air pollution epidemiology.

    PubMed

    Pedersen, Marie; Siroux, Valérie; Pin, Isabelle; Charles, Marie Aline; Forhan, Anne; Hulin, Agnés; Galineau, Julien; Lepeule, Johanna; Giorgis-Allemand, Lise; Sunyer, Jordi; Annesi-Maesano, Isabella; Slama, Rémy

    2013-10-01

Spatially-resolved air pollution models can be developed in large areas. The resulting increased exposure contrasts and population size offer opportunities to better characterize the effect of atmospheric pollutants on respiratory health. However, the heterogeneity of these areas may also enhance the potential for confounding. We aimed to discuss some analytical approaches to handle this trade-off. We modeled NO2 and PM10 concentrations at the home addresses of 1082 pregnant mothers from the EDEN cohort living in and around urban areas, using the ADMS dispersion model. Simulations were performed to identify the best strategy to limit confounding by unmeasured factors varying with area type. We examined the relation between modeled concentrations and respiratory health in infants using regression models with and without adjustment or interaction terms with area type. Simulations indicated that adjustment for area limited the bias due to unmeasured confounders varying with area at the cost of a slight decrease in statistical power. In our cohort, rural and urban areas differed for air pollution levels and for many factors associated with respiratory health and exposure. Area tended to modify effect measures of air pollution on respiratory health. Increasing the size of the study area also increases the potential for residual confounding. Our simulations suggest that adjusting for type of area is a good option to limit residual confounding due to area-associated factors without restricting the area size. Other statistical approaches developed in the field of spatial epidemiology are an alternative to control for poorly-measured spatially-varying confounders. © 2013 Elsevier Ltd. All rights reserved.
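The area-confounding mechanism and the adjustment strategy can be illustrated with a toy simulation: urban areas get both higher exposure and an unrelated health deficit, so a pooled regression slope mixes the two, while within-area slopes recover the true effect. All coefficients below are hypothetical.

```python
import random

def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def simulate_bias(n=4000, true_effect=0.5, seed=0):
    """Toy area-level confounding: urban areas (area=1) have higher
    exposure and an unrelated +1 shift in the outcome. Returns the
    naive pooled slope and the area-adjusted (stratified) slope."""
    rng = random.Random(seed)
    data = {0: ([], []), 1: ([], [])}   # 0 = rural, 1 = urban
    for _ in range(n):
        area = rng.randrange(2)
        x = rng.gauss(2.0 if area else 0.0, 1.0)          # exposure
        y = true_effect * x + 1.0 * area + rng.gauss(0, 1)  # confounded
        data[area][0].append(x)
        data[area][1].append(y)
    naive = slope(data[0][0] + data[1][0], data[0][1] + data[1][1])
    adjusted = (slope(*data[0]) + slope(*data[1])) / 2
    return naive, adjusted
```

The pooled slope is inflated by the area effect, while averaging the within-area slopes removes it, mirroring the abstract's finding that adjusting for area type limits the bias at a modest cost in power.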

  6. Assembly Line Efficiency Improvement by Using WITNESS Simulation Software

    NASA Astrophysics Data System (ADS)

    Yasir, A. S. H. M.; Mohamed, N. M. Z. N.

    2018-03-01

    In today's competitive world, the efficiency and productivity of the assembly line are essential to a manufacturing company. This paper presents a study of the performance of an existing production line. The actual cycle time was observed and recorded during the working process. The current layout was designed and analysed using Witness simulation software. The productivity and effectiveness of every single operator were measured to determine operator idle time and busy time. Two new alternative layouts were proposed and analysed using Witness simulation software to improve the performance of production activities. This research provides a valuable and better understanding of production effectiveness through line-balancing adjustments. After the data were analysed, the simulation results for the current layout and the proposed plans were tabulated to compare the improvements in efficiency and productivity. The proposed design plans showed an increase in yield and productivity compared with the current arrangement. This research was carried out in company XYZ, one of the automotive premises in Pahang, Malaysia.

  7. Multiscale three-dimensional simulations of charge gain and transport in diamond

    NASA Astrophysics Data System (ADS)

    Dimitrov, D. A.; Busby, R.; Cary, J. R.; Ben-Zvi, I.; Rao, T.; Smedley, J.; Chang, X.; Keister, J. W.; Wu, Q.; Muller, E.

    2010-10-01

    A promising new concept of a diamond-amplified photocathode for generation of high-current, high-brightness, and low thermal emittance electron beams was recently proposed and is currently under active development. Detailed understanding of physical processes with multiple energy and time scales is required to design reliable and efficient diamond-amplifier cathodes. We have implemented models, within the VORPAL computational framework, to simulate secondary electron generation and charge transport in diamond in order to facilitate the investigation of the relevant effects involved. The models include inelastic scattering of electrons and holes for generation of electron-hole pairs, elastic, phonon, and charge impurity scattering. We describe the integrated modeling capabilities we developed and present results on charge gain and collection efficiency as a function of primary electron energy and applied electric field. We compare simulation results with available experimental data. The simulations show an overall qualitative agreement with the observed charge gain from transmission mode experiments and have enabled better understanding of the collection efficiency measurements.

  8. A full three dimensional Navier-Stokes numerical simulation of flow field inside a power plant Kaplan turbine using some model test turbine hill chart points

    NASA Astrophysics Data System (ADS)

    Hosseinalipour, S. M.; Raja, A.; Hajikhani, S.

    2012-06-01

    A full three-dimensional Navier-Stokes numerical simulation has been performed for the performance analysis of a Kaplan turbine installed in one of Iran's southern dams. No simplifications were enforced in the simulation. The numerical results were evaluated using integral parameters such as turbine efficiency, by comparing them with existing experimental data from the prototype hill chart. In part of this study, numerical simulations were performed to calculate the prototype turbine efficiencies at specific points obtained by scaling up the model efficiencies available in the model's experimental hill chart. The results are very promising and demonstrate the ability of numerical techniques to resolve the flow characteristics in these kinds of complex geometries. A parametric study evaluating turbine performance at three different runner angles of the prototype was also performed, and the results are reported in this paper.

  9. Modeling the Effects of Turbulence in Rotating Detonation Engines

    NASA Astrophysics Data System (ADS)

    Towery, Colin; Smith, Katherine; Hamlington, Peter; van Schoor, Marthinus; TESLa Team; Midé Team

    2014-03-01

    Propulsion systems based on detonation waves, such as rotating and pulsed detonation engines, have the potential to substantially improve the efficiency and power density of gas turbine engines. Numerous technical challenges remain to be solved in such systems, however, including obtaining more efficient injection and mixing of air and fuels, more reliable detonation initiation, and better understanding of the flow in the ejection nozzle. These challenges can be addressed using numerical simulations. Such simulations are enormously challenging, however, since accurate descriptions of highly unsteady turbulent flow fields are required in the presence of combustion, shock waves, fluid-structure interactions, and other complex physical processes. In this study, we performed high-fidelity three dimensional simulations of a rotating detonation engine and examined turbulent flow effects on the operation, performance, and efficiency of the engine. Along with experimental data, these simulations were used to test the accuracy of commonly-used Reynolds averaged and subgrid-scale turbulence models when applied to detonation engines. The authors gratefully acknowledge the support of the Defense Advanced Research Projects Agency (DARPA).

  10. Efficient Brownian Dynamics of rigid colloids in linear flow fields based on the grand mobility matrix

    NASA Astrophysics Data System (ADS)

    Palanisamy, Duraivelan; den Otter, Wouter K.

    2018-05-01

    We present an efficient general method to simulate in the Stokesian limit the coupled translational and rotational dynamics of arbitrarily shaped colloids subject to external potential forces and torques, linear flow fields, and Brownian motion. The colloid's surface is represented by a collection of spherical primary particles. The hydrodynamic interactions between these particles, here approximated at the Rotne-Prager-Yamakawa level, are evaluated only once to generate the body's (11 × 11) grand mobility matrix. The constancy of this matrix in the body frame, combined with the convenient properties of quaternions in rotational Brownian Dynamics, enables an efficient simulation of the body's motion. Simulations in quiescent fluids yield correct translational and rotational diffusion behaviour and sample Boltzmann's equilibrium distribution. Simulations of ellipsoids and spherical caps under shear, in the absence of thermal fluctuations, yield periodic orbits in excellent agreement with the theories by Jeffery and Dorrepaal. The time-varying stress tensors provide the Einstein coefficient and viscosity of dilute suspensions of these bodies.
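The quaternion-based rotational Brownian Dynamics mentioned above can be illustrated with a minimal sketch. This is not the paper's algorithm: it assumes isotropic rotational diffusion and omits the anisotropic grand-mobility coupling and hydrodynamics; all names and parameter values are illustrative.

```python
import math
import random

def quat_mul(a, b):
    # Hamilton product of quaternions stored as (w, x, y, z).
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def brownian_rotation_step(q, D_r, dt, rng):
    # Draw a random rotation with angular variance 2*D_r*dt about each
    # body axis and apply it to the orientation quaternion q.
    wx, wy, wz = (rng.gauss(0.0, math.sqrt(2.0 * D_r * dt)) for _ in range(3))
    angle = math.sqrt(wx*wx + wy*wy + wz*wz)
    if angle < 1e-12:
        return q
    s = math.sin(angle / 2.0) / angle
    dq = (math.cos(angle / 2.0), wx * s, wy * s, wz * s)
    qn = quat_mul(q, dq)
    norm = math.sqrt(sum(c * c for c in qn))   # renormalise against drift
    return tuple(c / norm for c in qn)

rng = random.Random(1)
q = (1.0, 0.0, 0.0, 0.0)                       # identity orientation
for _ in range(1000):
    q = brownian_rotation_step(q, D_r=1.0, dt=1e-3, rng=rng)
```

A key advantage of quaternions over Euler angles here is the absence of gimbal-lock singularities; only a cheap renormalisation is needed to keep the orientation valid.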

  11. Studies of the Low-energy Gamma Background

    NASA Astrophysics Data System (ADS)

    Bikit, K.; Mrđa, D.; Bikit, I.; Slivka, J.; Veskovic, M.; Knezevic, D.

    Investigations of the contributions to the low-energy part of the background gamma spectrum (below 100 keV), and knowledge of the detection efficiency in this region, are important for both fundamental and applied research. In this work, the components contributing to the low-energy region of the background gamma spectrum of a shielded detector are analyzed, including the production and spectral distribution of muon-induced continuous low-energy radiation in the vicinity of a high-purity germanium detector. In addition, the detection efficiency for the low-energy gamma region is determined using the GEANT4 simulation package. This technique offers an excellent opportunity to predict the detector response in this region. Unfortunately, the often poorly known dead-layer thickness on the surface of the extended-range detector, as well as some processes not incorporated in the simulation (e.g. charge collection from the detector active volume), may limit the reliability of the simulation technique. Thus, the 14, 17, 21, 26, 33, and 59.5 keV transitions of a calibrated 241Am point source were used to check the simulated efficiencies.

  12. Efficient prediction of terahertz quantum cascade laser dynamics from steady-state simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agnew, G.; Lim, Y. L.; Nikolić, M.

    2015-04-20

    Terahertz-frequency quantum cascade lasers (THz QCLs) based on bound-to-continuum active regions are difficult to model owing to their large number of quantum states. We present a computationally efficient reduced rate equation (RE) model that reproduces the experimentally observed variation of THz power with respect to drive current and heat-sink temperature. We also present dynamic (time-domain) simulations under a range of drive currents and predict an increase in modulation bandwidth as the current approaches the peak of the light–current curve, as observed experimentally in mid-infrared QCLs. We account for the temperature and bias dependence of the carrier lifetimes, gain, and injection efficiency, calculated from a full rate equation model. The temperature dependence of the simulated threshold current, emitted power, and cut-off current are thus all reproduced accurately with only one fitting parameter, the interface roughness, in the full REs. We propose that the model could therefore be used for rapid dynamical simulation of QCL designs.
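The idea of a reduced rate-equation model can be sketched with a generic two-variable laser model (carrier inversion and photon number) integrated by forward Euler. The equations and parameter values below are illustrative placeholders, not the THz QCL model or fitted values from the paper.

```python
def simulate_laser(pump, n_steps=500_000, dt=1e-13):
    # Generic reduced rate equations for carrier inversion n and photon
    # number s, integrated with forward Euler (illustrative parameters).
    tau_c = 1e-9     # carrier lifetime (s)
    tau_p = 1e-11    # photon lifetime (s)
    g = 1e11         # gain coefficient (1/s per unit inversion)
    n_tr = 1.0       # transparency inversion (arbitrary units)
    beta = 1e-4      # spontaneous-emission coupling
    n = s = 0.0
    for _ in range(n_steps):
        gain = g * (n - n_tr) * s
        dn = pump - n / tau_c - gain          # carriers: pump, decay, stim.
        ds = gain - s / tau_p + beta * n / tau_c   # photons: stim., loss, spont.
        n += dn * dt
        s += ds * dt
    return s

# Below threshold the photon number stays near the spontaneous level;
# above threshold it grows by orders of magnitude.
p_below = simulate_laser(pump=5e8)
p_above = simulate_laser(pump=5e9)
```

Sweeping `pump` reproduces the qualitative shape of a light-current curve; the paper's contribution is extracting the temperature- and bias-dependent coefficients of such reduced equations from a full rate-equation model.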

  13. Modelling and analysis of solar cell efficiency distributions

    NASA Astrophysics Data System (ADS)

    Wasmer, Sven; Greulich, Johannes

    2017-08-01

    We present an approach for modelling the distribution of solar cell efficiencies achieved in production lines, based on numerical simulations, metamodeling, and Monte Carlo simulations. We validate our methodology using the example of an industrially feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiency of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by requiring only two common measurements of finished cells. The presented approaches can be especially helpful when ramping up production, but can also be applied to enhance established manufacturing.
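The metamodel-plus-Monte-Carlo idea can be sketched as follows: sample process parameters from their production-line scatter and push each sample through a cheap surrogate of the device simulation. The quadratic response surface and all distributions below are invented for illustration; the paper's metamodel is fitted to numerical device simulations.

```python
import random
import statistics

def efficiency_metamodel(rec, res):
    # Hypothetical quadratic response surface: cell efficiency (%) falls
    # off as recombination (rec) and series resistance (res) deviate
    # from their optima; coefficients are invented for illustration.
    return 18.5 - 40.0 * (rec - 0.1) ** 2 - 25.0 * (res - 0.2) ** 2

# Propagate assumed parameter scatter through the metamodel to obtain
# the distribution of cell efficiencies.
rng = random.Random(0)
samples = [
    efficiency_metamodel(rng.gauss(0.12, 0.02), rng.gauss(0.22, 0.03))
    for _ in range(20_000)
]
mean = statistics.fmean(samples)
std = statistics.pstdev(samples)
```

Because the surrogate is trivially cheap, tens of thousands of samples suffice to resolve the efficiency distribution, and a variance-based sensitivity analysis can be run on the same samples.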

  14. Improvement of productivity in low volume production industry layout by using witness simulation software

    NASA Astrophysics Data System (ADS)

    Jaffrey, V.; Mohamed, N. M. Z. N.; Rose, A. N. M.

    2017-10-01

    In almost all manufacturing industries, increased productivity and better efficiency of the production line are the most important goals. Most factories, especially small-scale ones, have little awareness of manufacturing system optimization and rely on traditional management methods. Problems commonly identified in such factories are high labour idle time and low production output. This study was conducted in a Small and Medium Enterprise (SME) low-volume production company. Data were collected, and the problems affecting productivity and efficiency were identified. In this study, Witness simulation software is used to simulate the layout, with the output focusing on layout improvement in terms of productivity and efficiency. The layout is rearranged by reducing the travel time from one workstation to another. The improved layout is then modelled, and the machine and labour statistics of both the original and improved layouts are collected. Productivity and efficiency are calculated for both layouts and then compared.

  15. Improving Energy Efficiency for the Vehicle Assembly Industry: A Discrete Event Simulation Approach

    NASA Astrophysics Data System (ADS)

    Oumer, Abduaziz; Mekbib Atnaw, Samson; Kie Cheng, Jack; Singh, Lakveer

    2016-11-01

    This paper presents a Discrete Event Simulation (DES) model for investigating and improving energy efficiency in a vehicle assembly line. The car manufacturing industry is one of the most energy-intensive industries. Using the Rockwell Arena DES package, a detailed model was constructed for an actual vehicle assembly plant. The sources of energy considered in this research are electricity and fuel, the two main types of energy sources used in a typical vehicle assembly plant. The model depicts the performance measurement for process-specific energy measures of the painting, welding, and assembly processes. A sound energy efficiency model in this industry has a two-fold advantage: reducing CO2 emissions and cutting the costs associated with fuel and electricity consumption. The paper starts with an overview of the challenges in energy consumption within the facilities of an automotive assembly line and highlights the parameters for energy efficiency. The results of the simulation model indicated improvements toward energy-saving objectives and reduced costs.
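The core of such a model can be sketched without a commercial package: simulate jobs flowing through a serial line and charge each station's energy only while it is busy. This is a deterministic flow-shop sketch, not the Arena model; the station times and power ratings are invented.

```python
def simulate_line(jobs, stations):
    # Deterministic serial-line (flow-shop) simulation: each station
    # processes one job at a time, and jobs visit stations in order.
    # stations: list of (process_time_h, power_kw) per station.
    free_at = [0.0] * len(stations)   # time each station next becomes free
    busy = [0.0] * len(stations)      # accumulated busy hours per station
    for arrival in jobs:              # job release times, ascending
        t = arrival
        for i, (p_time, _) in enumerate(stations):
            start = max(t, free_at[i])    # wait for job AND station
            free_at[i] = start + p_time
            busy[i] += p_time
            t = free_at[i]
    makespan = max(free_at)
    # Energy (kWh) charged only while a station is actually processing.
    energy = sum(b * power for b, (_, power) in zip(busy, stations))
    return makespan, energy

# Four jobs released every 2 h through a paint (3 h, 20 kW) and an
# assembly (2 h, 35 kW) station; all numbers are invented.
makespan, energy = simulate_line(
    jobs=[0, 2, 4, 6],
    stations=[(3.0, 20.0), (2.0, 35.0)],
)
```

A full DES would add stochastic process times, queues with finite capacity, and idle-state power draw, but the busy-time-times-power accounting above is the basic energy bookkeeping.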

  16. Efficient Raman sideband cooling of trapped ions to their motional ground state

    NASA Astrophysics Data System (ADS)

    Che, H.; Deng, K.; Xu, Z. T.; Yuan, W. H.; Zhang, J.; Lu, Z. H.

    2017-07-01

    Efficient cooling of trapped ions is a prerequisite for various applications of the ions in precision spectroscopy, quantum information, and coherence control. Raman sideband cooling is an effective method to cool the ions to their motional ground state. We investigate both numerically and experimentally the optimization of Raman sideband cooling strategies and propose an efficient one, which can simplify the experimental setup as well as reduce the number of cooling pulses. Several cooling schemes are tested and compared through numerical simulations. The simulation result shows that the fixed-width pulses and varied-width pulses have almost the same efficiency for both the first-order and the second-order Raman sideband cooling. The optimized strategy is verified experimentally. A single 25Mg+ ion is trapped in a linear Paul trap and Raman sideband cooled, and the achieved average vibrational quantum numbers under different cooling strategies are evaluated. A good agreement between the experimental result and the simulation result is obtained.
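The kind of pulse-sequence comparison described can be illustrated with a toy population model: each red-sideband pulse transfers a fixed fraction of the population from level n to n-1. This ignores heating, off-resonant carrier excitation, and the n-dependence of the sideband Rabi frequency, so it is a sketch of the bookkeeping only; all numbers are illustrative.

```python
def sideband_cool(p, n_pulses, eta=0.5):
    # Each idealised red-sideband pulse moves a fraction eta of the
    # population from level n down to n-1 (nothing leaves n = 0).
    for _ in range(n_pulses):
        q = p[:]
        for n in range(1, len(p)):
            moved = eta * p[n]
            q[n] -= moved
            q[n - 1] += moved
        p = q
    return p

def nbar(dist):
    # Mean vibrational quantum number of a population distribution.
    return sum(n * x for n, x in enumerate(dist))

# Truncated thermal distribution over levels 0..9 (nbar about 2.4).
r = 3.0 / 4.0
p0 = [r ** n for n in range(10)]
total = sum(p0)
p0 = [x / total for x in p0]
p_cooled = sideband_cool(p0, n_pulses=20)
```

Comparing `nbar` after different pulse counts, transfer fractions, or sideband orders mimics, very crudely, the strategy comparison the paper performs with full numerical simulations.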

  17. [Spectrum simulation based on data derived from red tide].

    PubMed

    Liu, Zhen-Yu; Cui, Ting-Wei; Yue, Jie; Jiang, Tao; Cao, Wen-Xi; Ma, Yi

    2011-11-01

    This paper uses absorption data of red tide water, measured during the growth and decline of a bloom, to retrieve the imaginary part of the index of refraction based on Mie theory, and carries out simulation and analysis of the average absorption efficiency factors, average backscattering efficiency factors, and scattering phase function. The analysis shows that Mie theory can reproduce the absorption properties of Chaetoceros socialis with an average error of 11%; the average backscattering efficiency factors depend on the value of absorption, whose maximum corresponds to the wavelength range from 400 to 700 nm; the average backscattering efficiency factors reached a maximum on the 17th, with low values during the outbreak of the red tide and a minimum on the 21st; and the total scattering, which depends only weakly on absorption, is proportional to the size parameter, which represents the relative size of the cell diameter with respect to the wavelength, while the angular scattering intensity is inversely proportional to wavelength.

  18. Assessing the durability and efficiency of landscape-based strategies to deploy plant resistance to pathogens

    PubMed Central

    Rey, Jean-François; Barrett, Luke G.; Thrall, Peter H.

    2018-01-01

    Genetically-controlled plant resistance can reduce the damage caused by pathogens. However, pathogens have the ability to evolve and overcome such resistance. This often occurs quickly after resistance is deployed, resulting in significant crop losses and a continuing need to develop new resistant cultivars. To tackle this issue, several strategies have been proposed to constrain the evolution of pathogen populations and thus increase genetic resistance durability. These strategies mainly rely on varying different combinations of resistance sources across time (crop rotations) and space. The spatial scale of deployment can vary from multiple resistance sources occurring in a single cultivar (pyramiding), in different cultivars within the same field (cultivar mixtures) or in different fields (mosaics). However, experimental comparison of the efficiency (i.e. ability to reduce disease impact) and durability (i.e. ability to limit pathogen evolution and delay resistance breakdown) of landscape-scale deployment strategies presents major logistical challenges. Therefore, we developed a spatially explicit stochastic model able to assess the epidemiological and evolutionary outcomes of the four major deployment options described above, including both qualitative resistance (i.e. major genes) and quantitative resistance traits against several components of pathogen aggressiveness: infection rate, latent period duration, propagule production rate, and infectious period duration. This model, implemented in the R package landsepi, provides a new and useful tool to assess the performance of a wide range of deployment options, and helps investigate the effect of landscape, epidemiological and evolutionary parameters. This article describes the model and its parameterisation for rust diseases of cereal crops, caused by fungi of the genus Puccinia. 
To illustrate the model, we use it to assess the epidemiological and evolutionary potential of the combination of a major gene and different traits of quantitative resistance. The comparison of the four major deployment strategies described above will be the objective of future studies. PMID:29649208
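The spatially explicit stochastic model itself is available in the R package landsepi; as a minimal illustration of the underlying idea, a non-spatial stochastic SIR epidemic can be simulated with the Gillespie algorithm. All rates below are invented, and there is no host resistance or pathogen evolution, so this is a sketch of the simulation machinery only.

```python
import random

def stochastic_sir(n, beta, gamma, i0=1, seed=0):
    # Gillespie-style stochastic SIR epidemic: a minimal, non-spatial
    # stand-in for the far richer landscape model in landsepi.
    rng = random.Random(seed)
    s, i, r = n - i0, i0, 0
    t = 0.0
    while i > 0:
        rate_inf = beta * s * i / n   # new infections
        rate_rec = gamma * i          # recoveries/removals
        total = rate_inf + rate_rec
        t += rng.expovariate(total)   # exponential time to next event
        if rng.random() * total < rate_inf:
            s, i = s - 1, i + 1
        else:
            i, r = i - 1, r + 1
    return t, r  # epidemic duration and final size

duration, final_size = stochastic_sir(n=500, beta=0.6, gamma=0.2, seed=3)
```

The landscape model adds spatial structure (fields with different cultivars), latent and infectious periods, and mutation of pathogen aggressiveness traits on top of this kind of event-driven core.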

  19. Design of teleoperation system with a force-reflecting real-time simulator

    NASA Technical Reports Server (NTRS)

    Hirata, Mitsunori; Sato, Yuichi; Nagashima, Fumio; Maruyama, Tsugito

    1994-01-01

    We developed a force-reflecting teleoperation system that uses a real-time graphic simulator. This system eliminates the effects of communication time delays in remote robot manipulation. The simulator provides the operator with a predictive display and feedback of computed contact forces through a six-degree-of-freedom (6-DOF) master arm on a real-time basis. With this system, peg-in-hole tasks involving round-trip communication time delays of up to a few seconds were performed at three support levels: a real image alone, a predictive display with a real image, and a real-time graphic simulator with computed-contact-force reflection and a predictive display. The experimental results indicate that the best teleoperation efficiency was achieved by using the force-reflecting simulator with the two images: the shortest work time, the lowest sensor maximum, and a 100 percent success rate were obtained. These results demonstrate the effectiveness of simulated force reflection for teleoperation efficiency.

  20. Efficient finite element simulation of slot spirals, slot radomes and microwave structures

    NASA Technical Reports Server (NTRS)

    Gong, J.; Volakis, J. L.

    1995-01-01

    This progress report contains the following two documents: (1) 'Efficient Finite Element Simulation of Slot Antennas using Prismatic Elements' - A hybrid finite element-boundary integral (FE-BI) simulation technique is discussed to treat narrow slot antennas etched on a planar platform. Specifically, the prismatic elements are used to reduce the redundant sampling rates and ease the mesh generation process. Numerical results for an antenna slot and frequency selective surfaces are presented to demonstrate the validity and capability of the technique; and (2) 'Application and Design Guidelines of the PML Absorber for Finite Element Simulations of Microwave Packages' - The recently introduced perfectly matched layer (PML) uniaxial absorber for frequency domain finite element simulations has several advantages. In this paper we present the application of PML for microwave circuit simulations along with design guidelines to obtain a desired level of absorption. Different feeding techniques are also investigated for improved accuracy.

  1. Determination of efficiency of an aged HPGe detector for gaseous sources by self absorption correction and point source methods

    NASA Astrophysics Data System (ADS)

    Sarangapani, R.; Jose, M. T.; Srinivasan, T. K.; Venkatraman, B.

    2017-07-01

    Methods for determining the efficiency of an aged high-purity germanium (HPGe) detector for gaseous sources are presented in this paper. X-ray radiography of the detector was performed to obtain the detector dimensions for computational purposes. The dead-layer thickness of the HPGe detector was ascertained from experiments and Monte Carlo computations. Experimental work with standard point and liquid sources in several cylindrical geometries was undertaken to obtain the energy-dependent efficiency. Monte Carlo simulations were performed to compute efficiencies for point, liquid, and gaseous sources. Self-absorption correction factors were obtained using mathematical equations for volume sources and MCNP simulations. Self-absorption correction and point-source methods were used to estimate the efficiency for gaseous sources. The efficiencies determined in the present work were used to estimate the activity of a cover gas sample from a fast reactor.
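The self-absorption correction mentioned above can be sketched for the simplest geometry, a uniformly active slab viewed along its axis, where the average transmission has the closed form (1 - e^(-μt))/(μt). Scaling a point-source efficiency by this factor, as below, is an illustrative approximation, not the paper's full MCNP-based method; all numbers are invented.

```python
import math

def self_absorption_factor(mu, thickness):
    # Mean transmission of a uniformly active slab of thickness t viewed
    # along its axis: f = (1 - exp(-mu*t)) / (mu*t).
    x = mu * thickness
    if x < 1e-9:
        return 1.0          # optically thin limit
    return (1.0 - math.exp(-x)) / x

def volume_efficiency(point_eff, mu, thickness):
    # Illustrative combination: scale a measured point-source efficiency
    # by the self-absorption factor to approximate a volume source.
    return point_eff * self_absorption_factor(mu, thickness)

# mu = 0.2 /cm, 2 cm thick source, 5% point-source efficiency (invented).
eff = volume_efficiency(point_eff=0.05, mu=0.2, thickness=2.0)
```

For realistic cylindrical geometries, the factor is obtained by integrating attenuation over the source volume or by Monte Carlo transport, which is what the paper does.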

  2. A new efficient method to monitor precocious puberty nationwide in France.

    PubMed

    Rigou, Annabel; Le Moal, Joëlle; Léger, Juliane; Le Tertre, Alain; Carel, Jean-Claude

    2018-02-01

    Clinical precocious puberty (PP) is a disease reputed to be on the increase and suspected of being linked to endocrine-disrupting chemical (EDC) exposure. Population-based epidemiological data are lacking in France and scarce elsewhere. In this context, we assessed the feasibility of monitoring PP nationwide in France using an existing nationwide database, the French National Health Insurance Information System. Here, we present the method we used, with a step-by-step approach, to build and select the most suitable indicator. We built three indicators reflecting the incidence of idiopathic central precocious puberty (ICPP), the most frequent form of PP, and we compared these indicators according to their strengths and weaknesses with respect to surveillance purposes. Monitoring ICPP in France proved feasible using a drug reimbursement indicator. Our method is cost-efficient and highly relevant to public health surveillance. Our step-by-step approach proved helpful in achieving this project and could be proposed for assessing the feasibility of monitoring health outcomes of interest using existing databases. What is known: • Precocious puberty (PP) is suspected to be related to EDC exposure, and it is believed to be on the increase in France and in other countries. • Very few epidemiological data on PP are currently available worldwide at the national scale. What is new: • This is the first study describing a method to monitor the most frequent form of PP, idiopathic central PP (ICPP), nationwide in a cost-efficient way, using health insurance databases. • This cost-efficient method will allow the incidence of ICPP in France to be estimated and monitored, and spatial variations to be analyzed at a very precise scale, which will be very useful for examining the role of environmental exposures, especially to EDCs.

  3. Optimal visual simulation of the self-tracking combustion of the infrared decoy based on the particle system

    NASA Astrophysics Data System (ADS)

    Hu, Qi; Duan, Jin; Wang, LiNing; Zhai, Di

    2016-09-01

    High-efficiency simulation testing of military weapons is of great value given the high cost of live combat tests and demanding operational requirements. Among the simulation methods for explosive smoke in particular, methods based on the particle system have attracted much attention. To further improve on traditional simulations of the motion of an infrared decoy over its real combustion cycle, this paper uses the virtual simulation platform of OpenGL and Vega Prime and, according to the decoy's radiation characteristics and aerodynamic characteristics, simulates the dynamic fuzzy characteristics of the infrared decoy during the real combustion cycle using a particle system based on the double depth peeling algorithm, solving key issues such as the interface, coordinate conversion, and the saving and restoring of Vega Prime's state. The simulation experiment basically achieved the expected improvement, effectively improved the simulation fidelity, and provides theoretical support for improving the performance of the infrared decoy.

  4. Fluid mechanics aspects of magnetic drug targeting.

    PubMed

    Odenbach, Stefan

    2015-10-01

    Experiments and numerical simulations using a flow phantom for magnetic drug targeting have been undertaken. The flow phantom is a half y-branched tube configuration in which the main tube represents an artery from which a tumour-supplying artery, simulated by the side branch of the flow phantom, branches off. In the experiments, the amount of magnetic particles targeted towards the branch by a magnetic field applied via a permanent magnet is quantified by impedance measurement using sensor coils. Measuring the targeting efficiency, i.e. the relative amount of particles targeted to the side branch, for different field configurations, one obtains targeting maps which combine the targeting efficiency with the magnetic force densities at characteristic points in the flow phantom. It could be shown that targeting efficiency depends strongly on the magnetic field configuration. A corresponding numerical model has been set up, which allows the simulation of targeting efficiency for variable field configurations. This simulation shows good agreement of targeting efficiency with experimental data. Thus, the basis has been laid for future calculations of optimal field configurations in clinical applications of magnetic drug targeting. Moreover, the numerical model allows the variation of additional parameters of the drug targeting process and thus an estimation of the influence of, e.g., the fluid properties on the targeting efficiency. Corresponding calculations have shown that the non-Newtonian behaviour of the fluid significantly influences the targeting process, an aspect which has to be taken into account, especially recalling the fact that the viscosity of magnetic suspensions depends strongly on the magnetic field strength and the mechanical load.

  5. Measurement error in mobile source air pollution exposure estimates due to residential mobility during pregnancy

    PubMed Central

    Pennington, Audrey Flak; Strickland, Matthew J.; Klein, Mitchel; Zhai, Xinxin; Russell, Armistead G.; Hansen, Craig; Darrow, Lyndsey A.

    2018-01-01

    Prenatal air pollution exposure is frequently estimated using maternal residential location at the time of delivery as a proxy for residence during pregnancy. We describe residential mobility during pregnancy among 19,951 children from the Kaiser Air Pollution and Pediatric Asthma Study, quantify measurement error in spatially-resolved estimates of prenatal exposure to mobile source fine particulate matter (PM2.5) due to ignoring this mobility, and simulate the impact of this error on estimates of epidemiologic associations. Two exposure estimates were compared, one calculated using complete residential histories during pregnancy (weighted average based on time spent at each address) and the second calculated using only residence at birth. Estimates were computed using annual averages of primary PM2.5 from traffic emissions modeled using a research line-source dispersion model (RLINE) at 250 meter resolution. In this cohort, 18.6% of children were born to mothers who moved at least once during pregnancy. Mobile source PM2.5 exposure estimates calculated using complete residential histories during pregnancy and only residence at birth were highly correlated (rS>0.9). Simulations indicated that ignoring residential mobility resulted in modest bias of epidemiologic associations toward the null, but varied by maternal characteristics and prenatal exposure windows of interest (ranging from −2% to −10% bias). PMID:27966666
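The weighted-average exposure described here can be sketched directly: weight each residence's annual PM2.5 value by the fraction of the pregnancy spent at that address. The dates and concentrations below are invented for illustration.

```python
from datetime import date

def weighted_exposure(residences, conception, delivery):
    # residences: list of (move_in_date, annual_pm25), sorted by move-in
    # date; returns the exposure averaged over the pregnancy, weighting
    # each residence by the fraction of the pregnancy spent there.
    total_days = (delivery - conception).days
    exposure = 0.0
    for k, (start, pm) in enumerate(residences):
        # Residence ends when the next one begins, or at delivery.
        end = residences[k + 1][0] if k + 1 < len(residences) else delivery
        lo, hi = max(start, conception), min(end, delivery)
        days = max((hi - lo).days, 0)   # overlap with the pregnancy window
        exposure += pm * days / total_days
    return exposure

# One mid-pregnancy move; dates and concentrations are invented.
avg = weighted_exposure(
    residences=[(date(2016, 1, 1), 1.2), (date(2016, 7, 1), 0.6)],
    conception=date(2016, 3, 1),
    delivery=date(2016, 12, 1),
)
```

Comparing this value with the exposure at the birth residence alone (here 0.6) shows the kind of measurement error the study quantifies when mobility is ignored.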

  6. Measurement error in mobile source air pollution exposure estimates due to residential mobility during pregnancy.

    PubMed

    Pennington, Audrey Flak; Strickland, Matthew J; Klein, Mitchel; Zhai, Xinxin; Russell, Armistead G; Hansen, Craig; Darrow, Lyndsey A

    2017-09-01

    Prenatal air pollution exposure is frequently estimated using maternal residential location at the time of delivery as a proxy for residence during pregnancy. We describe residential mobility during pregnancy among 19,951 children from the Kaiser Air Pollution and Pediatric Asthma Study, quantify measurement error in spatially resolved estimates of prenatal exposure to mobile source fine particulate matter (PM2.5) due to ignoring this mobility, and simulate the impact of this error on estimates of epidemiologic associations. Two exposure estimates were compared, one calculated using complete residential histories during pregnancy (weighted average based on time spent at each address) and the second calculated using only residence at birth. Estimates were computed using annual averages of primary PM2.5 from traffic emissions modeled using a Research LINE-source dispersion model for near-surface releases (RLINE) at 250 m resolution. In this cohort, 18.6% of children were born to mothers who moved at least once during pregnancy. Mobile source PM2.5 exposure estimates calculated using complete residential histories during pregnancy and only residence at birth were highly correlated (rS>0.9). Simulations indicated that ignoring residential mobility resulted in modest bias of epidemiologic associations toward the null, but varied by maternal characteristics and prenatal exposure windows of interest (ranging from -2% to -10% bias).

  7. Individual and Composite Study Endpoints: Separating the Wheat from the Chaff

    PubMed Central

    Goldberg, Robert; Gore, Joel M.; Barton, Bruce; Gurwitz, Jerry

    2014-01-01

    We provide an overview of the individual and combined clinical endpoints and patient reported outcomes typically used in clinical trials and prospective epidemiological investigations. We discuss the strengths and limitations associated with the utilization of aggregated study endpoints and surrogate measures of important clinical endpoints and patient-centered outcomes. We hope that the points raised in this overview will lead to the collection of clinically rich, relevant, measurable, and cost-efficient study outcomes. PMID:24486289

  8. Methodological approach to simulation and choice of ecologically efficient and energetically economic wind turbines (WT)

    NASA Astrophysics Data System (ADS)

    Bespalov, Vadim; Udina, Natalya; Samarskaya, Natalya

    2017-10-01

    Wind energy is one of the most promising renewable energy sources. This article reviews a methodological approach to the simulation and selection of ecologically efficient and energetically economic wind turbines at the design stage, taking into account the characteristics of the natural-territorial complex and the peculiarities of the anthropogenic load at the WT location.

  9. Nipah virus transmission in a hamster model.

    PubMed

    de Wit, Emmie; Bushmaker, Trenton; Scott, Dana; Feldmann, Heinz; Munster, Vincent J

    2011-12-01

    Based on epidemiological data, it is believed that human-to-human transmission plays an important role in Nipah virus outbreaks. No experimental data are currently available on the potential routes of human-to-human transmission of Nipah virus. In a first dose-finding experiment in Syrian hamsters, it was shown that Nipah virus was predominantly shed via the respiratory tract within nasal and oropharyngeal secretions. Although Nipah viral RNA was detected in urogenital and rectal swabs, no infectious virus was recovered from these samples, suggesting no viable virus was shed via these routes. In addition, hamsters inoculated with high doses shed significantly higher amounts of viable Nipah virus particles in comparison with hamsters infected with lower inoculum doses. Using the highest inoculum dose, three potential routes of Nipah virus transmission were investigated in the hamster model: transmission via fomites, transmission via direct contact and transmission via aerosols. It was demonstrated that Nipah virus is transmitted efficiently via direct contact and inefficiently via fomites, but not via aerosols. These findings are in line with epidemiological data which suggest that direct contact with nasal and oropharyngeal secretions of Nipah virus infected individuals results in a greater risk of Nipah virus infection. The data provide new and much-needed insights into the modes and efficiency of Nipah virus transmission and have important public health implications with regard to the risk assessment and management of future Nipah virus outbreaks.

  10. Nipah Virus Transmission in a Hamster Model

    PubMed Central

    de Wit, Emmie; Bushmaker, Trenton; Scott, Dana; Feldmann, Heinz; Munster, Vincent J.

    2011-01-01

    Based on epidemiological data, it is believed that human-to-human transmission plays an important role in Nipah virus outbreaks. No experimental data are currently available on the potential routes of human-to-human transmission of Nipah virus. In a first dose-finding experiment in Syrian hamsters, it was shown that Nipah virus was predominantly shed via the respiratory tract within nasal and oropharyngeal secretions. Although Nipah viral RNA was detected in urogenital and rectal swabs, no infectious virus was recovered from these samples, suggesting no viable virus was shed via these routes. In addition, hamsters inoculated with high doses shed significantly higher amounts of viable Nipah virus particles in comparison with hamsters infected with lower inoculum doses. Using the highest inoculum dose, three potential routes of Nipah virus transmission were investigated in the hamster model: transmission via fomites, transmission via direct contact and transmission via aerosols. It was demonstrated that Nipah virus is transmitted efficiently via direct contact and inefficiently via fomites, but not via aerosols. These findings are in line with epidemiological data which suggest that direct contact with nasal and oropharyngeal secretions of Nipah virus infected individuals results in a greater risk of Nipah virus infection. The data provide new and much-needed insights into the modes and efficiency of Nipah virus transmission and have important public health implications with regard to the risk assessment and management of future Nipah virus outbreaks. PMID:22180802

  11. A resolved two-way coupled CFD/6-DOF approach for predicting embolus transport and the embolus-trapping efficiency of IVC filters.

    PubMed

    Aycock, Kenneth I; Campbell, Robert L; Manning, Keefe B; Craven, Brent A

    2017-06-01

    Inferior vena cava (IVC) filters are medical devices designed to provide a mechanical barrier to the passage of emboli from the deep veins of the legs to the heart and lungs. Despite decades of development and clinical use, IVC filters still fail to prevent the passage of all hazardous emboli. The objective of this study is to (1) develop a resolved two-way computational model of embolus transport, (2) provide verification and validation evidence for the model, and (3) demonstrate the ability of the model to predict the embolus-trapping efficiency of an IVC filter. Our model couples computational fluid dynamics simulations of blood flow to six-degree-of-freedom simulations of embolus transport and resolves the interactions between rigid, spherical emboli and the blood flow using an immersed boundary method. Following model development and numerical verification and validation of the computational approach against benchmark data from the literature, embolus transport simulations are performed in an idealized IVC geometry. Centered and tilted filter orientations are considered using a nonlinear finite element-based virtual filter placement procedure. A total of 2048 coupled CFD/6-DOF simulations are performed to predict the embolus-trapping statistics of the filter. The simulations predict that the embolus-trapping efficiency of the IVC filter increases with increasing embolus diameter and increasing embolus-to-blood density ratio. Tilted filter placement is found to decrease the embolus-trapping efficiency compared with centered filter placement. Multiple embolus-trapping locations are predicted for the IVC filter, and the trapping locations are predicted to shift upstream and toward the vessel wall with increasing embolus diameter. Simulations of the injection of successive emboli into the IVC are also performed and reveal that the embolus-trapping efficiency decreases with increasing thrombus load in the IVC filter. 
In future work, the computational tool could be used to investigate IVC filter design improvements, the effect of patient anatomy on embolus transport and IVC filter embolus-trapping efficiency, and, with further development and validation, optimal filter selection and placement on a patient-specific basis.

  12. Accelerated molecular dynamics: A promising and efficient simulation method for biomolecules

    NASA Astrophysics Data System (ADS)

    Hamelberg, Donald; Mongan, John; McCammon, J. Andrew

    2004-06-01

    Many interesting dynamic properties of biological molecules cannot be simulated directly using molecular dynamics because of nanosecond time scale limitations. These systems are trapped in potential energy minima with high free energy barriers for large numbers of computational steps. The dynamic evolution of many molecular systems occurs through a series of rare events as the system moves from one potential energy basin to another. Therefore, we have proposed a robust bias potential function that can be used in an efficient accelerated molecular dynamics approach to simulate the transition of high energy barriers without any advance knowledge of the location of either the potential energy wells or saddle points. In this method, the potential energy landscape is altered by adding a bias potential to the true potential such that the escape rates from potential wells are enhanced, which accelerates and extends the time scale in molecular dynamics simulations. Our definition of the bias potential echoes the underlying shape of the potential energy landscape on the modified surface, thus allowing for the potential energy minima to be well defined, and hence properly sampled during the simulation. We have shown that our approach, which can be extended to biomolecules, samples the conformational space more efficiently than normal molecular dynamics simulations, and converges to the correct canonical distribution.
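
    The accelerated-MD bias takes a simple closed form: below a threshold energy E, the boost dV = (E - V)^2 / (alpha + E - V) is added to the true potential, while regions with V >= E are untouched. A minimal sketch of that boost with illustrative numbers (the threshold and alpha values are invented, not from the paper):

```python
def boosted_potential(v, e, alpha):
    """Accelerated-MD modified potential: below the threshold energy e the
    bias dv = (e - v)**2 / (alpha + e - v) is added, raising basin floors
    and lowering barriers, while regions with v >= e are left untouched."""
    if v >= e:
        return v
    return v + (e - v) ** 2 / (alpha + e - v)

# The modified surface keeps the ordering of minima (it is monotonic in v),
# so potential wells remain well defined but escape rates are enhanced.
deep, shallow = boosted_potential(0.0, 10.0, 5.0), boosted_potential(5.0, 10.0, 5.0)
```

    Because the boost is monotonic in V, the minima stay ordered on the modified surface, which is what allows them to be properly sampled.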

  13. Off-axis holographic lens spectrum-splitting photovoltaic system for direct and diffuse solar energy conversion.

    PubMed

    Vorndran, Shelby D; Chrysler, Benjamin; Wheelwright, Brian; Angel, Roger; Holman, Zachary; Kostuk, Raymond

    2016-09-20

    This paper describes a high-efficiency, spectrum-splitting photovoltaic module that uses an off-axis volume holographic lens to focus and disperse incident solar illumination to a rectangular shaped high-bandgap indium gallium phosphide cell surrounded by strips of silicon cells. The holographic lens design allows efficient collection of both direct and diffuse illumination to maximize energy yield. We modeled the volume diffraction characteristics using rigorous coupled-wave analysis, and simulated system performance using nonsequential ray tracing and PV cell data from the literature. Under AM 1.5 illumination conditions the simulated module obtained a 30.6% conversion efficiency. This efficiency is a 19.7% relative improvement compared to the more efficient cell in the system (silicon). The module was also simulated under a typical meteorological year of direct and diffuse irradiance in Tucson, Arizona, and Seattle, Washington. Compared to a flat panel silicon module, the holographic spectrum splitting module obtained a relative improvement in energy yield of 17.1% in Tucson and 14.0% in Seattle. An experimental proof-of-concept volume holographic lens was also fabricated in dichromated gelatin to verify the main characteristics of the system. The lens obtained an average first-order diffraction efficiency of 85.4% across the aperture at 532 nm.

  14. SU-F-T-368: Improved HPGe Detector Precise Efficiency Calibration with Monte Carlo Simulations and Radioactive Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhai, Y. John

    2016-06-15

    Purpose: To obtain an improved precise gamma efficiency calibration curve of an HPGe (High Purity Germanium) detector with a new comprehensive approach. Methods: Both radioactive sources and Monte Carlo simulation (CYLTRAN) are used to determine the HPGe gamma efficiency for the energy range of 0–8 MeV. The HPGe is a GMX coaxial 280 cm³ N-type 70% gamma detector. Using the Momentum Achromat Recoil Spectrometer (MARS) at the K500 superconducting cyclotron of Texas A&M University, the radioactive nucleus ²⁴Al was produced and separated. This nucleus has positron decays followed by gamma transitions up to 8 MeV from ²⁴Mg excited states, which are used for the HPGe efficiency calibration. Results: With the ²⁴Al gamma energy spectrum up to 8 MeV, the efficiency for the 7.07 MeV γ ray at 4.9 cm distance from the radioactive source ²⁴Al was obtained at a value of 0.194(4)%, by carefully considering various factors such as positron annihilation, the peak summing effect, beta detector efficiency and the internal conversion effect. The Monte Carlo simulation (CYLTRAN) gave a value of 0.189%, in agreement with the experimental measurements. Applying this to different energy points, a precise efficiency calibration curve of the HPGe detector up to 7.07 MeV at 4.9 cm from the source ²⁴Al was obtained. Using the same data analysis procedure, the efficiency for the 7.07 MeV gamma ray at 15.1 cm from the source ²⁴Al was obtained at a value of 0.0387(6)%. MC simulation gave a similar value of 0.0395%. This small discrepancy led us to assign an uncertainty of 3% to the efficiency at 15.1 cm up to 7.07 MeV. The MC calculations also reproduced the intensity of the observed single- and double-escape peaks, provided that the effects of positron annihilation-in-flight were incorporated.
Conclusion: The precision-improved gamma efficiency calibration curve provides more accurate radiation detection and dose calculation for cancer radiotherapy treatment.

  15. [The socio-hygienic monitoring as an integral system for health risk assessment and risk management at the regional level].

    PubMed

    Kuzmin, S V; Gurvich, V B; Dikonskaya, O V; Malykh, O L; Yarushin, S V; Romanov, S V; Kornilkov, A S

    2013-01-01

    The information and analytical framework for the introduction of health risk assessment and risk management methodologies in the Sverdlovsk Region is the system of socio-hygienic monitoring. Risk management techniques have been developed and proposed that support the choice of the most cost-effective and efficient actions for improving the sanitary and epidemiologic situation at the level of a region, municipality, or business entity of the Russian Federation. To assess the efficiency of health risk management planning and activities, common methodological approaches and the economic methods of "cost-effectiveness" and "cost-benefit" analysis, provided in methodological recommendations introduced in the Russian Federation, are applied.
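
    The cost-effectiveness comparison underlying such analyses reduces to an incremental ratio between competing actions. A toy sketch with invented numbers (not data from the monitoring system):

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost paid per extra
    unit of health effect when switching from the old action to the new."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Invented numbers: the new action averts 20 more cases at 40000 more cost,
# i.e. 2000 per additional averted case.
ratio = icer(cost_new=120000.0, effect_new=40.0, cost_old=80000.0, effect_old=20.0)
```

    Ranking candidate actions by such ratios is one way to pick the most cost-effective option at a given administrative level.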

  16. Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease

    NASA Astrophysics Data System (ADS)

    Marsden, Alison

    2009-11-01

    Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid-structure interaction. However, inputs to these simulations, including medical image data, catheter-derived pressures and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and layer these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.

  17. Study on photon transport problem based on the platform of molecular optical simulation environment.

    PubMed

    Peng, Kuan; Gao, Xinbo; Liang, Jimin; Qu, Xiaochao; Ren, Nunu; Chen, Xueli; Ma, Bin; Tian, Jie

    2010-01-01

    As an important molecular imaging modality, optical imaging has attracted increasing attention in recent years. Since the physical experiment is usually complicated and expensive, research methods based on simulation platforms have received extensive attention. We developed a simulation platform named Molecular Optical Simulation Environment (MOSE) to simulate photon transport in both biological tissues and free space for optical imaging based on noncontact measurement. In this platform, the Monte Carlo (MC) method and the hybrid radiosity-radiance theorem are used to simulate photon transport in biological tissues and free space, respectively, so both contact and noncontact measurement modes of optical imaging can be simulated properly. In addition, a parallelization strategy for the MC method is employed to improve the computational efficiency. In this paper, we study the photon transport problems in both biological tissues and free space using MOSE. The results are compared with TracePro, the simplified spherical harmonics method (SPn), and physical measurement to verify the performance of our study method on both accuracy and efficiency.

  18. Study on Photon Transport Problem Based on the Platform of Molecular Optical Simulation Environment

    PubMed Central

    Peng, Kuan; Gao, Xinbo; Liang, Jimin; Qu, Xiaochao; Ren, Nunu; Chen, Xueli; Ma, Bin; Tian, Jie

    2010-01-01

    As an important molecular imaging modality, optical imaging has attracted increasing attention in recent years. Since the physical experiment is usually complicated and expensive, research methods based on simulation platforms have received extensive attention. We developed a simulation platform named Molecular Optical Simulation Environment (MOSE) to simulate photon transport in both biological tissues and free space for optical imaging based on noncontact measurement. In this platform, the Monte Carlo (MC) method and the hybrid radiosity-radiance theorem are used to simulate photon transport in biological tissues and free space, respectively, so both contact and noncontact measurement modes of optical imaging can be simulated properly. In addition, a parallelization strategy for the MC method is employed to improve the computational efficiency. In this paper, we study the photon transport problems in both biological tissues and free space using MOSE. The results are compared with TracePro, the simplified spherical harmonics method (SPn), and physical measurement to verify the performance of our study method on both accuracy and efficiency. PMID:20445737

  19. Investigating the impact of design characteristics on statistical efficiency within discrete choice experiments: A systematic survey.

    PubMed

    Vanniyasingam, Thuva; Daly, Caitlin; Jin, Xuejing; Zhang, Yuan; Foster, Gary; Cunningham, Charles; Thabane, Lehana

    2018-06-01

    This study reviews simulation studies of discrete choice experiments to determine (i) how survey design features affect statistical efficiency and (ii) to appraise their reporting quality. Statistical efficiency was measured using relative design (D-) efficiency, D-optimality, or D-error. For this systematic survey, we searched Journal Storage (JSTOR), ScienceDirect, PubMed, and OVID, which included a search within EMBASE. Searches were conducted up to year 2016 for simulation studies investigating the impact of DCE design features on statistical efficiency. Studies were screened and data were extracted independently and in duplicate. Results for each included study were summarized by design characteristic. Previously developed criteria for the reporting quality of simulation studies were also adapted and applied to each included study. Of 371 potentially relevant studies, 9 were found to be eligible, with several varying in study objectives. Statistical efficiency improved when increasing the number of choice tasks or alternatives; decreasing the number of attributes or attribute levels; using an unrestricted continuous "manipulator" attribute; using model-based approaches with covariates incorporating response behaviour; using sampling approaches that incorporate previous knowledge of response behaviour; incorporating heterogeneity in a model-based design; correctly specifying Bayesian priors; minimizing parameter prior variances; and using an appropriate method to create the DCE design for the research question. The simulation studies performed well in terms of reporting quality. Improvement is needed with regard to clearly specifying study objectives, number of failures, random number generators, starting seeds, and the software used. These results identify the best approaches to structuring a DCE. An investigator can manipulate design characteristics to help reduce response burden and increase statistical efficiency. 
Since the studies varied in their objectives, conclusions were made on several design characteristics; however, the validity of each conclusion is limited. Further research should be conducted to explore all conclusions in various design settings and scenarios. Additional reviews exploring other statistical efficiency outcomes and databases could also be performed to strengthen the conclusions identified in this review.
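
    The D-error criterion used above can be sketched for a linear model, where the information matrix is X'X. This is an illustrative toy, not one of the reviewed designs; both example design matrices are invented:

```python
import numpy as np

def d_error(X):
    """D-error of a linear-model design: det(X'X / n) ** (-1/k) for an
    n-row design with k parameters; lower values mean higher efficiency."""
    n, k = X.shape
    return float(np.linalg.det(X.T @ X / n) ** (-1.0 / k))

# Two invented 2-attribute designs: an orthogonal one, and a partly
# confounded one whose attribute columns are correlated.
orthogonal = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], dtype=float)
confounded = np.array([[1, 1], [1, 1], [-1, -1], [-1, 1]], dtype=float)
```

    The orthogonal design attains D-error 1 here, while the confounded one is penalized, which is the sense in which design characteristics trade off against statistical efficiency.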

  20. Searching for efficient Markov chain Monte Carlo proposal kernels

    PubMed Central

    Yang, Ziheng; Rodríguez, Carlos E.

    2013-01-01

    Markov chain Monte Carlo (MCMC) or the Metropolis–Hastings algorithm is a simulation algorithm that has made modern Bayesian statistical inference possible. Nevertheless, the efficiency of different Metropolis–Hastings proposal kernels has rarely been studied except for the Gaussian proposal. Here we propose a unique class of Bactrian kernels, which avoid proposing values that are very close to the current value, and compare their efficiency with a number of proposals for simulating different target distributions, with efficiency measured by the asymptotic variance of a parameter estimate. The uniform kernel is found to be more efficient than the Gaussian kernel, whereas the Bactrian kernel is even better. When optimal scales are used for both, the Bactrian kernel is at least 50% more efficient than the Gaussian. Implementation in a Bayesian program for molecular clock dating confirms the general applicability of our results to generic MCMC algorithms. Our results refute a previous claim that all proposals had nearly identical performance and will prompt further research into efficient MCMC proposals. PMID:24218600
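
    A Bactrian proposal is a 50/50 mixture of two Gaussians centred at +/- m*sigma, standardized so the kernel keeps overall sd sigma. A small sketch of a Metropolis sampler using it (the target, step sizes and chain length are invented for illustration, not the paper's benchmarks):

```python
import math, random

def bactrian_step(x, sigma, m):
    """Symmetric Bactrian proposal: a 50/50 mixture of two Gaussians centred
    at x +/- m*sigma with sd sigma*sqrt(1 - m*m), so the kernel has overall
    sd sigma but rarely proposes values very close to the current value x."""
    mu = m if random.random() < 0.5 else -m
    return x + sigma * (mu + math.sqrt(1.0 - m * m) * random.gauss(0.0, 1.0))

def metropolis(logpi, x0, n, sigma, m=0.95, seed=1):
    random.seed(seed)
    x, chain = x0, []
    for _ in range(n):
        y = bactrian_step(x, sigma, m)
        # symmetric kernel: plain Metropolis acceptance ratio
        if math.log(1.0 - random.random()) < logpi(y) - logpi(x):
            x = y
        chain.append(x)
    return chain

# Target: standard normal. The chain should reproduce mean 0, variance 1.
chain = metropolis(lambda z: -0.5 * z * z, 0.0, 20000, sigma=2.0)
```

    Because the mixture is symmetric about the current value, the acceptance ratio is unchanged from the Gaussian case; only the spread of proposed moves differs.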

  1. STEPS: efficient simulation of stochastic reaction-diffusion models in realistic morphologies.

    PubMed

    Hepburn, Iain; Chen, Weiliang; Wils, Stefan; De Schutter, Erik

    2012-05-10

    Models of cellular molecular systems are built from components such as biochemical reactions (including interactions between ligands and membrane-bound proteins), conformational changes and active and passive transport. A discrete, stochastic description of the kinetics is often essential to capture the behavior of the system accurately. Where spatial effects play a prominent role the complex morphology of cells may have to be represented, along with aspects such as chemical localization and diffusion. This high level of detail makes efficiency a particularly important consideration for software that is designed to simulate such systems. We describe STEPS, a stochastic reaction-diffusion simulator developed with an emphasis on simulating biochemical signaling pathways accurately and efficiently. STEPS supports all the above-mentioned features, and well-validated support for SBML allows many existing biochemical models to be imported reliably. Complex boundaries can be represented accurately in externally generated 3D tetrahedral meshes imported by STEPS. The powerful Python interface facilitates model construction and simulation control. STEPS implements the composition and rejection method, a variation of the Gillespie SSA, supporting diffusion between tetrahedral elements within an efficient search and update engine. Additional support for well-mixed conditions and for deterministic model solution is implemented. Solver accuracy is confirmed with an original and extensive validation set consisting of isolated reaction, diffusion and reaction-diffusion systems. Accuracy imposes upper and lower limits on tetrahedron sizes, which are described in detail. By comparing to Smoldyn, we show how the voxel-based approach in STEPS is often faster than particle-based methods, with increasing advantage in larger systems, and by comparing to MesoRD we show the efficiency of the STEPS implementation. 
STEPS simulates models of cellular reaction-diffusion systems with complex boundaries with high accuracy and high performance in C/C++, controlled by a powerful and user-friendly Python interface. STEPS is free for use and is available at http://steps.sourceforge.net/
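
    STEPS builds on the Gillespie SSA; as a point of reference, the plain direct method (not the composition-and-rejection variant STEPS implements, and without diffusion between tetrahedra) can be sketched for a toy decay reaction:

```python
import math, random

def gillespie_direct(x, stoich, propensity, t_end, seed=0):
    """Direct-method Gillespie SSA: sample the waiting time to the next
    reaction from Exp(a0), pick the firing reaction with probability
    a_j / a0, apply its stoichiometry, and repeat until t_end."""
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, list(x))]
    while t < t_end:
        a = [propensity(j, x) for j in range(len(stoich))]
        a0 = sum(a)
        if a0 == 0.0:
            break                      # no reaction can fire any more
        t += -math.log(1.0 - rng.random()) / a0
        r, j, acc = rng.random() * a0, 0, a[0]
        while acc < r:                 # linear search over propensities
            j += 1
            acc += a[j]
        x = [xi + si for xi, si in zip(x, stoich[j])]
        traj.append((t, list(x)))
    return traj

# Toy system: irreversible decay A -> 0 at rate k*A, from 100 molecules.
traj = gillespie_direct([100], [[-1]], lambda j, x: 1.0 * x[0], t_end=50.0)
```

    The linear search over propensities is the step that composition-and-rejection schemes replace with grouped sampling, which is what keeps the cost low for large reaction networks.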

  2. STEPS: efficient simulation of stochastic reaction–diffusion models in realistic morphologies

    PubMed Central

    2012-01-01

    Background Models of cellular molecular systems are built from components such as biochemical reactions (including interactions between ligands and membrane-bound proteins), conformational changes and active and passive transport. A discrete, stochastic description of the kinetics is often essential to capture the behavior of the system accurately. Where spatial effects play a prominent role the complex morphology of cells may have to be represented, along with aspects such as chemical localization and diffusion. This high level of detail makes efficiency a particularly important consideration for software that is designed to simulate such systems. Results We describe STEPS, a stochastic reaction–diffusion simulator developed with an emphasis on simulating biochemical signaling pathways accurately and efficiently. STEPS supports all the above-mentioned features, and well-validated support for SBML allows many existing biochemical models to be imported reliably. Complex boundaries can be represented accurately in externally generated 3D tetrahedral meshes imported by STEPS. The powerful Python interface facilitates model construction and simulation control. STEPS implements the composition and rejection method, a variation of the Gillespie SSA, supporting diffusion between tetrahedral elements within an efficient search and update engine. Additional support for well-mixed conditions and for deterministic model solution is implemented. Solver accuracy is confirmed with an original and extensive validation set consisting of isolated reaction, diffusion and reaction–diffusion systems. Accuracy imposes upper and lower limits on tetrahedron sizes, which are described in detail. By comparing to Smoldyn, we show how the voxel-based approach in STEPS is often faster than particle-based methods, with increasing advantage in larger systems, and by comparing to MesoRD we show the efficiency of the STEPS implementation. 
Conclusion STEPS simulates models of cellular reaction–diffusion systems with complex boundaries with high accuracy and high performance in C/C++, controlled by a powerful and user-friendly Python interface. STEPS is free for use and is available at http://steps.sourceforge.net/ PMID:22574658

  3. Fast Simulation of Dynamic Ultrasound Images Using the GPU.

    PubMed

    Storve, Sigurd; Torp, Hans

    2017-10-01

    Simulated ultrasound data is a valuable tool for development and validation of quantitative image analysis methods in echocardiography. Unfortunately, simulation time can become prohibitive for phantoms consisting of a large number of point scatterers. The COLE algorithm by Gao et al. is a fast convolution-based simulator that trades simulation accuracy for improved speed. We present highly efficient parallelized CPU and GPU implementations of the COLE algorithm with an emphasis on dynamic simulations involving moving point scatterers. We argue that it is crucial to minimize the amount of data transfers from the CPU to achieve good performance on the GPU. We achieve this by storing the complete trajectories of the dynamic point scatterers as spline curves in the GPU memory. This leads to good efficiency when simulating sequences consisting of a large number of frames, such as B-mode and tissue Doppler data for a full cardiac cycle. In addition, we propose a phase-based subsample delay technique that efficiently eliminates flickering artifacts seen in B-mode sequences when COLE is used without enough temporal oversampling. To assess the performance, we used a laptop computer and a desktop computer, each equipped with a multicore Intel CPU and an NVIDIA GPU. Running the simulator on a high-end TITAN X GPU, we observed two orders of magnitude speedup compared to the parallel CPU version, three orders of magnitude speedup compared to simulation times reported by Gao et al. in their paper on COLE, and a speedup of 27000 times compared to the multithreaded version of Field II, using numbers reported in a paper by Jensen. We hope that by releasing the simulator as an open-source project we will encourage its use and further development.
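
    The convolution idea behind COLE can be sketched in a few lines: place each scatterer's amplitude at the sample corresponding to its two-way travel time, then convolve with the transmit pulse. This is a 1D toy with invented pulse parameters and nearest-sample delays, not the authors' implementation:

```python
import numpy as np

def cole_line(depths, amplitudes, pulse, fs, c=1540.0):
    """Convolution-based RF line: place each scatterer's amplitude at the
    sample index of its two-way travel time 2*z/c, then convolve with the
    transmit pulse. Nearest-sample rounding here; a phase-based subsample
    delay, as proposed in the paper, refines exactly this rounding step."""
    n = int(fs * 2 * max(depths) / c) + pulse.size
    line = np.zeros(n)
    for z, amp in zip(depths, amplitudes):
        line[int(round(fs * 2 * z / c))] += amp
    return np.convolve(line, pulse)[:n]

fs = 50e6                                   # invented 50 MHz sampling rate
t = np.arange(50) / fs
pulse = np.sin(2 * np.pi * 5e6 * t) * np.hanning(t.size)  # windowed 5 MHz pulse
rf = cole_line([0.01, 0.02], [1.0, 0.5], pulse, fs)       # scatterers at 1 and 2 cm
```

    For moving scatterers, the rounding of delays is what produces the flickering artifacts mentioned above when temporal oversampling is insufficient.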

  4. Energy Losses Estimation During Pulsed-Laser Seam Welding

    NASA Astrophysics Data System (ADS)

    Sebestova, Hana; Havelkova, Martina; Chmelickova, Hana

    2014-06-01

    The finite-element tool SYSWELD (ESI Group, Paris, France) was adapted to simulate pulsed-laser seam welding. Besides the temperature field distribution, one of the possible outputs of the welding simulation is the amount of absorbed power necessary to melt the required material volume, including energy losses. By comparing the absorbed or melting energy with the applied laser energy, welding efficiencies can be calculated. This article presents the results of welding efficiency estimation based on the assimilation of both experimental and simulation output data from pulsed Nd:YAG laser bead-on-plate welding of 0.6-mm-thick AISI 304 stainless steel sheets using different beam powers.

  5. High-Performance Computing for the Electromagnetic Modeling and Simulation of Interconnects

    NASA Technical Reports Server (NTRS)

    Schutt-Aine, Jose E.

    1996-01-01

    The electromagnetic modeling of packages and interconnects plays a very important role in the design of high-speed digital circuits, and is most efficiently performed by using computer-aided design algorithms. In recent years, packaging has become a critical area in the design of high-speed communication systems and fast computers, and the importance of the software support for their development has increased accordingly. Throughout this project, our efforts have focused on the development of modeling and simulation techniques and algorithms that permit the fast computation of the electrical parameters of interconnects and the efficient simulation of their electrical performance.

  6. Discrete Spin Vector Approach for Monte Carlo-based Magnetic Nanoparticle Simulations

    NASA Astrophysics Data System (ADS)

    Senkov, Alexander; Peralta, Juan; Sahay, Rahul

    The study of magnetic nanoparticles has gained significant popularity due to the potential uses in many fields such as modern medicine, electronics, and engineering. To study the magnetic behavior of these particles in depth, it is important to be able to model and simulate their magnetic properties efficiently. Here we utilize the Metropolis-Hastings algorithm with a discrete spin vector model (in contrast to the standard continuous model) to model the magnetic hysteresis of a set of protected pure iron nanoparticles. We compare our simulations with the experimental hysteresis curves and discuss the efficiency of our algorithm.
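
    A toy version of the approach can be written down directly: Metropolis-Hastings over discrete up/down spins on a ring (a stand-in for the authors' discrete spin-vector model; the lattice, couplings and temperature are invented):

```python
import math, random

def magnetization(n, beta, h, sweeps, seed=0):
    """Metropolis-Hastings on a ring of n discrete up/down spins (J = 1):
    a proposed flip is accepted with probability min(1, exp(-beta * dE)),
    the same acceptance rule a discrete spin-vector model would use."""
    rng = random.Random(seed)
    s = [1] * n
    for _ in range(sweeps * n):
        i = rng.randrange(n)
        nb = s[(i - 1) % n] + s[(i + 1) % n]
        dE = 2.0 * s[i] * (nb + h)          # energy change of flipping s[i]
        if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
            s[i] = -s[i]
    return sum(s) / n                       # magnetization per spin

m = magnetization(200, beta=2.0, h=0.5, sweeps=200)  # aligned field: m stays near +1
```

    Sweeping the field h up and down and recording m at each step is how a hysteresis curve would be traced with such a sampler.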

  7. Efficient Measurement of Multiparticle Entanglement with Embedding Quantum Simulator.

    PubMed

    Chen, Ming-Cheng; Wu, Dian; Su, Zu-En; Cai, Xin-Dong; Wang, Xi-Lin; Yang, Tao; Li, Li; Liu, Nai-Le; Lu, Chao-Yang; Pan, Jian-Wei

    2016-02-19

    The quantum measurement of entanglement is a demanding task in the field of quantum information. Here, we report the direct and scalable measurement of multiparticle entanglement with embedding photonic quantum simulators. In this embedding framework [R. Di Candia et al. Phys. Rev. Lett. 111, 240502 (2013)], the N-qubit entanglement, which does not associate with a physical observable directly, can be efficiently measured with only two (for even N) and six (for odd N) local measurement settings. Our experiment uses multiphoton quantum simulators to mimic dynamical concurrence and three-tangle entangled systems and to track their entanglement evolutions.

  8. Structurally detailed coarse-grained model for Sec-facilitated co-translational protein translocation and membrane integration

    PubMed Central

    Miller, Thomas F.

    2017-01-01

    We present a coarse-grained simulation model that is capable of simulating the minute-timescale dynamics of protein translocation and membrane integration via the Sec translocon, while retaining sufficient chemical and structural detail to capture many of the sequence-specific interactions that drive these processes. The model includes accurate geometric representations of the ribosome and Sec translocon, obtained directly from experimental structures, and interactions parameterized from nearly 200 μs of residue-based coarse-grained molecular dynamics simulations. A protocol for mapping amino-acid sequences to coarse-grained beads enables the direct simulation of trajectories for the co-translational insertion of arbitrary polypeptide sequences into the Sec translocon. The model reproduces experimentally observed features of membrane protein integration, including the efficiency with which polypeptide domains integrate into the membrane, the variation in integration efficiency upon single amino-acid mutations, and the orientation of transmembrane domains. The central advantage of the model is that it connects sequence-level protein features to biological observables and timescales, enabling direct simulation for the mechanistic analysis of co-translational integration and for the engineering of membrane proteins with enhanced membrane integration efficiency. PMID:28328943

  9. A parallelization method for time periodic steady state in simulation of radio frequency sheath dynamics

    NASA Astrophysics Data System (ADS)

    Kwon, Deuk-Chul; Shin, Sung-Sik; Yu, Dong-Hun

    2017-10-01

    In order to reduce the computing time in simulations of radio frequency (rf) plasma sources, various numerical schemes have been developed. It is well known that the upwind, exponential, and power-law schemes can efficiently overcome the limitation on the grid size in fluid transport simulations of high-density plasma discharges, and the semi-implicit method is a well-known scheme for overcoming the limitation on the simulation time step. However, despite remarkable advances in numerical techniques and computing power over the last few decades, efficient multi-dimensional modeling of low-temperature plasma discharges has remained a considerable challenge. In particular, parallelization in time has been difficult for time-periodic steady-state problems, such as capacitively coupled plasma discharges and rf sheath dynamics, because the plasma parameters at each time step are computed from their values at the previous step. We therefore present a parallelization method for time-periodic steady-state problems based on period-slices. To evaluate the efficiency of the developed method, one-dimensional fluid simulations of rf sheath dynamics are conducted. The results show that speedup can be achieved with a multithreading approach.
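    The abstract does not give the update rule used for the period-slices. The toy sketch below (a hypothetical periodically driven relaxation ODE, explicit Euler stepping, and Jacobi-style boundary updates are all assumptions) illustrates the general idea: slices of one rf period are propagated independently from guessed boundary values, which are then iterated until a time-periodic steady state is reached.

```python
import numpy as np

def step(x, t, dt, tau=0.3, omega=2 * np.pi):
    # Explicit Euler step of a toy periodically driven relaxation equation
    return x + dt * (-x / tau + np.sin(omega * t))

def propagate(x0, t0, t1, n):
    # March from t0 to t1 with n uniform Euler steps
    dt = (t1 - t0) / n
    x = x0
    for t in np.linspace(t0, t1, n, endpoint=False):
        x = step(x, t, dt)
    return x

def period_slice_steady_state(n_slices=4, steps_per_slice=250, iters=50):
    # Guess the state at each slice boundary, then repeatedly propagate
    # every slice from the previous iteration's boundary values. Within
    # one iteration the slices are independent, so they could run on
    # separate threads; periodicity is enforced by wrapping the last
    # boundary back to the first.
    bounds = np.linspace(0.0, 1.0, n_slices + 1)  # one rf period = 1
    x = np.zeros(n_slices + 1)
    for _ in range(iters):
        ends = [propagate(x[i], bounds[i], bounds[i + 1], steps_per_slice)
                for i in range(n_slices)]
        x[1:] = ends
        x[0] = x[-1]  # time-periodic boundary condition
    return x
```

At the converged fixed point, propagating one full period sequentially from x[0] reproduces x[0], i.e. the independently computed slices stitch into a single periodic solution.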

  10. Post-hoc simulation study to adopt a computerized adaptive testing (CAT) for a Korean Medical License Examination.

    PubMed

    Seo, Dong Gi; Choi, Jeongwook

    2018-05-17

    Computerized adaptive testing (CAT) has been adopted in license examinations for its test efficiency and measurement accuracy, and much research has been published demonstrating both. This simulation study investigated scoring methods and item selection methods for implementing CAT in the Korean Medical License Examination (KMLE). The study used a post-hoc (real data) simulation design, with an item bank built from all items in the 2017 KMLE; all CAT algorithms were implemented with the 'catR' package in R. In terms of accuracy, the Rasch and two-parameter logistic (2PL) models performed better than the 3PL model, and Modal a Posteriori (MAP) or Expected a Posteriori (EAP) scoring provided more accurate estimates than MLE and WLE. Furthermore, maximum posterior-weighted information (MPWI) and minimum expected posterior variance (MEPV) outperformed the other item selection methods. In terms of efficiency, the Rasch model was recommended to reduce test length. A simulation study should be performed under varied test conditions, and specific scoring and item selection methods predetermined, before implementing a live CAT.
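    The study itself uses the 'catR' package in R. Purely as an illustration of the two ingredients being compared (Bayesian scoring and information-based item selection), here is a minimal Python sketch of EAP scoring and maximum-information item selection under a Rasch model; the quadrature grid, prior, and item bank are arbitrary choices, not the KMLE setup.

```python
import numpy as np

def rasch_p(theta, b):
    # Rasch model: probability of a correct response at ability theta
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def eap_estimate(responses, difficulties):
    # Expected a Posteriori ability estimate: posterior mean over a fixed
    # quadrature grid with a standard-normal prior
    grid = np.linspace(-4.0, 4.0, 81)
    posterior = np.exp(-0.5 * grid**2)  # prior, up to a constant
    for r, b in zip(responses, difficulties):
        p = rasch_p(grid, b)
        posterior *= p if r else 1.0 - p
    return float(np.sum(grid * posterior) / np.sum(posterior))

def next_item(theta, bank, used):
    # Maximum Fisher information: for the Rasch model I(theta) = p(1 - p),
    # so the most informative item has difficulty closest to theta
    best, best_info = None, -1.0
    for j, b in enumerate(bank):
        if j in used:
            continue
        p = rasch_p(theta, b)
        if p * (1 - p) > best_info:
            best, best_info = j, p * (1 - p)
    return best
```

A live CAT loop would alternate the two: score the responses so far, then administer the unused item with the highest information at the current estimate.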

  11. Efficiency of exchange schemes in replica exchange

    NASA Astrophysics Data System (ADS)

    Lingenheil, Martin; Denschlag, Robert; Mathias, Gerald; Tavan, Paul

    2009-08-01

    In replica exchange simulations, fast diffusion of the replicas through temperature space maximizes the efficiency of the statistical sampling. Here, we compare the diffusion speed, as measured by the round-trip rates, for four exchange algorithms. We find different efficiency profiles, with optimal average acceptance probabilities ranging from 8% to 41%. The best performance is found in benchmark simulations for the most widely used algorithm, which alternately attempts to exchange all even and all odd replica pairs. We show analytically that the excellent performance of this exchange scheme is due to the high diffusivity of the underlying random walk.
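    As a minimal sketch of the best-performing scheme described above (alternating even/odd pair exchanges) together with the standard Metropolis exchange criterion; the energies, temperatures, and units below are arbitrary illustration values:

```python
import math
import random

def swap_accept(e_cold, e_hot, t_cold, t_hot):
    # Standard Metropolis criterion for exchanging configurations between
    # two neighbouring temperatures (Boltzmann constant set to 1)
    delta = (1.0 / t_cold - 1.0 / t_hot) * (e_cold - e_hot)
    return delta >= 0.0 or random.random() < math.exp(delta)

def alternating_exchange_sweep(energies, temps, sweep_index):
    # The scheme benchmarked as fastest: alternately attempt all even
    # pairs (0-1, 2-3, ...) and all odd pairs (1-2, 3-4, ...), so every
    # attempted pair within a sweep is independent of the others
    start = sweep_index % 2
    for k in range(start, len(energies) - 1, 2):
        if swap_accept(energies[k], energies[k + 1], temps[k], temps[k + 1]):
            energies[k], energies[k + 1] = energies[k + 1], energies[k]
    return energies
```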

  12. Millimeter-Wave Wireless Power Transfer Technology for Space Applications

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Goutam; Manohara, Harish; Mojarradi, Mohammad M.; Vo, Tuan A.; Mojarradi, Hadi; Bae, Sam Y.; Marzwell, Neville

    2008-01-01

    In this paper we present a new compact, scalable, and low-cost technology for efficiently receiving power via RF waves at 94 GHz. This technology employs a highly innovative array of slot antennas integrated on a substrate composed of gold (Au), silicon (Si), and silicon dioxide (SiO2) layers. The length of the slots and the spacing between them are optimized for a highly efficient beam through a 3-D electromagnetic simulation process. Antenna simulation results show a good beam profile with very low side-lobe levels and better than 93% antenna efficiency.

  13. Signal recognition efficiencies of artificial neural-network pulse-shape discrimination in HPGe 0νββ-decay searches

    NASA Astrophysics Data System (ADS)

    Caldwell, A.; Cossavella, F.; Majorovits, B.; Palioselitis, D.; Volynets, O.

    2015-07-01

    A pulse-shape discrimination method based on artificial neural networks was applied to pulses simulated for different background, signal, and signal-like interactions inside a germanium detector. The simulated pulses were used to investigate how the efficiencies vary with the training set used. It is verified that neural networks are well suited to identifying background pulses in true-coaxial high-purity germanium detectors. The systematic uncertainty on the signal recognition efficiency, derived using signal-like evaluation samples from calibration measurements, is estimated to be 5%. This uncertainty is due to differences between signal and calibration samples.

  14. Measured and simulated performance of Compton-suppressed TIGRESS HPGe clover detectors

    NASA Astrophysics Data System (ADS)

    Schumaker, M. A.; Hackman, G.; Pearson, C. J.; Svensson, C. E.; Andreoiu, C.; Andreyev, A.; Austin, R. A. E.; Ball, G. C.; Bandyopadhyay, D.; Boston, A. J.; Chakrawarthy, R. S.; Churchman, R.; Drake, T. E.; Finlay, P.; Garrett, P. E.; Grinyer, G. F.; Hyland, B.; Jones, B.; Maharaj, R.; Morton, A. C.; Phillips, A. A.; Sarazin, F.; Scraggs, H. C.; Smith, M. B.; Valiente-Dobón, J. J.; Waddington, J. C.; Watters, L. M.

    2007-01-01

    Tests of the performance of a 32-fold segmented HPGe clover detector coupled to a 20-fold segmented Compton-suppression shield, which form a prototype element of the TRIUMF-ISAC Gamma-Ray Escape-Suppressed Spectrometer (TIGRESS), have been made. Peak-to-total ratios and relative efficiencies have been measured for a variety of γ-ray energies. These measurements were used to validate a GEANT4 simulation of the TIGRESS detectors, which was then used to create a simulation of the full 12-detector array. Predictions of the expected performance of TIGRESS are presented. These predictions indicate that TIGRESS will be capable, for single 1 MeV γ rays, of absolute detection efficiencies of 17% and 9.4%, and peak-to-total ratios of 54% and 61% for the "high-efficiency" and "optimized peak-to-total" configurations of the array, respectively.

  15. Efficient solution of ordinary differential equations modeling electrical activity in cardiac cells.

    PubMed

    Sundnes, J; Lines, G T; Tveito, A

    2001-08-01

    The contraction of the heart is preceded and caused by a cellular electro-chemical reaction, which generates an electrical field. Realistic computer simulations of this process involve solving a set of partial differential equations, as well as a large number of ordinary differential equations (ODEs) characterizing the reactive behavior of the cardiac tissue. Experiments have shown that the solution of the ODEs contributes significantly to the total work of a simulation, so there is a strong need for efficient solution methods for this part of the problem. This paper shows how an efficient implicit Runge-Kutta method may be adapted to solve a complicated cardiac cell model consisting of 31 ODEs, and how this solver may be coupled to a set of PDE solvers to provide complete simulations of the electrical activity.
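    The cell model and the paper's specific implicit Runge-Kutta scheme are not reproduced here. The sketch below uses backward Euler (the simplest implicit Runge-Kutta method) with an inner Newton iteration on a hypothetical one-variable stiff relaxation equation, just to illustrate why implicit solvers pay off at step sizes where explicit ones are unstable.

```python
import math

def implicit_euler(f, dfdy, y0, t0, t1, n):
    # Backward Euler: solve y_{k+1} = y_k + dt * f(t_{k+1}, y_{k+1})
    # for y_{k+1} with a scalar Newton iteration at every step
    dt = (t1 - t0) / n
    y, t = y0, t0
    for _ in range(n):
        t_next = t + dt
        z = y  # Newton initial guess
        for _ in range(20):
            g = z - y - dt * f(t_next, z)
            if abs(g) < 1e-12:
                break
            z -= g / (1.0 - dt * dfdy(t_next, z))
        y, t = z, t_next
    return y

# Hypothetical stiff test problem: fast relaxation toward cos(t).
# With lam = 1000, explicit Euler is unstable for dt > 2e-3, while
# backward Euler remains stable even at dt = 0.02.
lam = 1000.0
f = lambda t, y: -lam * (y - math.cos(t))
dfdy = lambda t, y: -lam
```

For this linear test problem the Newton loop converges in one step; for a 31-variable cell model the same structure applies with a Jacobian solve in place of the scalar division.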

  16. Measurement of the photon identification efficiencies with the ATLAS detector using LHC Run-1 data

    DOE PAGES

    Aaboud, M.; Aad, G.; Abbott, B.; ...

    2016-12-03

    The algorithms used by the ATLAS Collaboration to reconstruct and identify prompt photons are described. Measurements of the photon identification efficiencies are reported, using 4.9 fb⁻¹ of pp collision data collected at the LHC at √s = 7 TeV and 20.3 fb⁻¹ at √s = 8 TeV. The efficiencies are measured separately for converted and unconverted photons, in four different pseudorapidity regions, for transverse momenta between 10 GeV and 1.5 TeV. The results from the combination of three data-driven techniques are compared to the predictions from a simulation of the detector response, after correcting the electromagnetic shower momenta in the simulation for the average differences observed with respect to data. Data-to-simulation efficiency ratios used as correction factors in physics measurements are determined to account for the small residual efficiency differences. These factors are measured with uncertainties between 0.5% and 10% in 7 TeV data and between 0.5% and 5.6% in 8 TeV data, depending on the photon transverse momentum and pseudorapidity.
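    As an illustration of how such data-to-simulation correction factors are formed and how their uncertainties combine, here is a minimal sketch; the binomial uncertainty model and the example numbers are assumptions for illustration, not ATLAS values.

```python
import math

def efficiency(n_pass, n_total):
    # Efficiency with a simple binomial uncertainty estimate
    eff = n_pass / n_total
    return eff, math.sqrt(eff * (1.0 - eff) / n_total)

def scale_factor(eff_data, err_data, eff_mc, err_mc):
    # Data/simulation ratio, propagating uncorrelated relative errors
    sf = eff_data / eff_mc
    err = sf * math.sqrt((err_data / eff_data) ** 2 + (err_mc / eff_mc) ** 2)
    return sf, err
```

A physics measurement would multiply simulated yields by sf in each (pT, pseudorapidity) bin and carry err as a systematic uncertainty.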

  17. PLATSIM: A Simulation and Analysis Package for Large-Order Flexible Systems. Version 2.0

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Kenny, Sean P.; Giesy, Daniel P.

    1997-01-01

    The software package PLATSIM provides efficient time and frequency domain analysis of large-order generic space platforms. PLATSIM can perform open-loop analysis or closed-loop analysis with linear or nonlinear control system models, and it exploits the particular sparsity of the plant matrices for very efficient linear and nonlinear time domain analysis as well as frequency domain analysis. A new, original algorithm for the efficient computation of open-loop and closed-loop frequency response functions for large-order systems has been developed and implemented within the package. Furthermore, a novel jitter analysis routine, which determines jitter and stability values from time simulations very efficiently, has been developed and incorporated in PLATSIM. In the time domain, PLATSIM simulates the response of the space platform to disturbances and calculates the jitter and stability values from the response time histories. In the frequency domain, PLATSIM calculates frequency response function matrices and provides the corresponding Bode plots. The PLATSIM software package is written in the MATLAB script language, and a graphical user interface provides convenient access to its various features.
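    PLATSIM's actual frequency response algorithm is not given in the abstract. The sketch below shows one standard way the computation becomes cheap for large-order systems: when the plant is in diagonal (modal) form, the resolvent inverse is elementwise rather than a dense linear solve per frequency. The system matrices here are random placeholders, not a space-platform model.

```python
import numpy as np

def modal_frf(eigvals, B, C, freqs):
    # H(jw) = C (jw I - A)^{-1} B for a diagonal state matrix A:
    # the resolvent is inverted elementwise, so each frequency costs
    # O(n_states * n_inputs * n_outputs) instead of an O(n^3) solve
    H = np.empty((len(freqs), C.shape[0], B.shape[1]), dtype=complex)
    for k, w in enumerate(freqs):
        r = 1.0 / (1j * w - eigvals)  # diagonal resolvent
        H[k] = (C * r) @ B            # equals C @ diag(r) @ B
    return H
```

For a general (non-diagonal) sparse plant, the same loop would replace the elementwise division with a sparse factorization per frequency.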

  18. Measurement of the photon identification efficiencies with the ATLAS detector using LHC Run-1 data.

    PubMed

    Aaboud, M; Aad, G; Abbott, B; Abdallah, J; Abdinov, O; Abeloos, B; Aben, R; AbouZeid, O S; Abraham, N L; Abramowicz, H; Abreu, H; Abreu, R; Abulaiti, Y; Acharya, B S; Adamczyk, L; Adams, D L; Adelman, J; Adomeit, S; Adye, T; Affolder, A A; Agatonovic-Jovin, T; Agricola, J; Aguilar-Saavedra, J A; Ahlen, S P; Ahmadov, F; Aielli, G; Akerstedt, H; Åkesson, T P A; Akimov, A V; Alberghi, G L; Albert, J; Albrand, S; Verzini, M J Alconada; Aleksa, M; Aleksandrov, I N; Alexa, C; Alexander, G; Alexopoulos, T; Alhroob, M; Aliev, M; Alimonti, G; Alison, J; Alkire, S P; Allbrooke, B M M; Allen, B W; Allport, P P; Aloisio, A; Alonso, A; Alonso, F; Alpigiani, C; Alstaty, M; Gonzalez, B Alvarez; Piqueras, D Álvarez; Alviggi, M G; Amadio, B T; Amako, K; Coutinho, Y Amaral; Amelung, C; Amidei, D; Santos, S P Amor Dos; Amorim, A; Amoroso, S; Amundsen, G; Anastopoulos, C; Ancu, L S; Andari, N; Andeen, T; Anders, C F; Anders, G; Anders, J K; Anderson, K J; Andreazza, A; Andrei, V; Angelidakis, S; Angelozzi, I; Anger, P; Angerami, A; Anghinolfi, F; Anisenkov, A V; Anjos, N; Annovi, A; Antonelli, M; Antonov, A; Anulli, F; Aoki, M; Bella, L Aperio; Arabidze, G; Arai, Y; Araque, J P; Arce, A T H; Arduh, F A; Arguin, J-F; Argyropoulos, S; Arik, M; Armbruster, A J; Armitage, L J; Arnaez, O; Arnold, H; Arratia, M; Arslan, O; Artamonov, A; Artoni, G; Artz, S; Asai, S; Asbah, N; Ashkenazi, A; Åsman, B; Asquith, L; Assamagan, K; Astalos, R; Atkinson, M; Atlay, N B; Augsten, K; Avolio, G; Axen, B; Ayoub, M K; Azuelos, G; Baak, M A; Baas, A E; Baca, M J; Bachacou, H; Bachas, K; Backes, M; Backhaus, M; Bagiacchi, P; Bagnaia, P; Bai, Y; Baines, J T; Baker, O K; Baldin, E M; Balek, P; Balestri, T; Balli, F; Balunas, W K; Banas, E; Banerjee, Sw; Bannoura, A A E; Barak, L; Barberio, E L; Barberis, D; Barbero, M; Barillari, T; Barklow, T; Barlow, N; Barnes, S L; Barnett, B M; Barnett, R M; Barnovska, Z; Baroncelli, A; Barone, G; Barr, A J; Navarro, L Barranco; Barreiro, F; da Costa, J Barreiro 
Guimarães; Bartoldus, R; Barton, A E; Bartos, P; Basalaev, A; Bassalat, A; Bates, R L; Batista, S J; Batley, J R; Battaglia, M; Bauce, M; Bauer, F; Bawa, H S; Beacham, J B; Beattie, M D; Beau, T; Beauchemin, P H; Bechtle, P; Beck, H P; Becker, K; Becker, M; Beckingham, M; Becot, C; Beddall, A J; Beddall, A; Bednyakov, V A; Bedognetti, M; Bee, C P; Beemster, L J; Beermann, T A; Begel, M; Behr, J K; Belanger-Champagne, C; Bell, A S; Bella, G; Bellagamba, L; Bellerive, A; Bellomo, M; Belotskiy, K; Beltramello, O; Belyaev, N L; Benary, O; Benchekroun, D; Bender, M; Bendtz, K; Benekos, N; Benhammou, Y; Noccioli, E Benhar; Benitez, J; Benjamin, D P; Bensinger, J R; Bentvelsen, S; Beresford, L; Beretta, M; Berge, D; Kuutmann, E Bergeaas; Berger, N; Beringer, J; Berlendis, S; Bernard, N R; Bernius, C; Bernlochner, F U; Berry, T; Berta, P; Bertella, C; Bertoli, G; Bertolucci, F; Bertram, I A; Bertsche, C; Bertsche, D; Besjes, G J; Bylund, O Bessidskaia; Bessner, M; Besson, N; Betancourt, C; Bethke, S; Bevan, A J; Bhimji, W; Bianchi, R M; Bianchini, L; Bianco, M; Biebel, O; Biedermann, D; Bielski, R; Biesuz, N V; Biglietti, M; De Mendizabal, J Bilbao; Bilokon, H; Bindi, M; Binet, S; Bingul, A; Bini, C; Biondi, S; Bjergaard, D M; Black, C W; Black, J E; Black, K M; Blackburn, D; Blair, R E; Blanchard, J-B; Blanco, J E; Blazek, T; Bloch, I; Blocker, C; Blum, W; Blumenschein, U; Blunier, S; Bobbink, G J; Bobrovnikov, V S; Bocchetta, S S; Bocci, A; Bock, C; Boehler, M; Boerner, D; Bogaerts, J A; Bogavac, D; Bogdanchikov, A G; Bohm, C; Boisvert, V; Bokan, P; Bold, T; Boldyrev, A S; Bomben, M; Bona, M; Boonekamp, M; Borisov, A; Borissov, G; Bortfeldt, J; Bortoletto, D; Bortolotto, V; Bos, K; Boscherini, D; Bosman, M; Sola, J D Bossio; Boudreau, J; Bouffard, J; Bouhova-Thacker, E V; Boumediene, D; Bourdarios, C; Boutle, S K; Boveia, A; Boyd, J; Boyko, I R; Bracinik, J; Brandt, A; Brandt, G; Brandt, O; Bratzler, U; Brau, B; Brau, J E; Braun, H M; Madden, W D Breaden; Brendlinger, K; 
Brennan, A J; Brenner, L; Brenner, R; Bressler, S; Bristow, T M; Britton, D; Britzger, D; Brochu, F M; Brock, I; Brock, R; Brooijmans, G; Brooks, T; Brooks, W K; Brosamer, J; Brost, E; Broughton, J H; de Renstrom, P A Bruckman; Bruncko, D; Bruneliere, R; Bruni, A; Bruni, G; Bruni, L S; Brunt, B H; Bruschi, M; Bruscino, N; Bryant, P; Bryngemark, L; Buanes, T; Buat, Q; Buchholz, P; Buckley, A G; Budagov, I A; Buehrer, F; Bugge, M K; Bulekov, O; Bullock, D; Burckhart, H; Burdin, S; Burgard, C D; Burghgrave, B; Burka, K; Burke, S; Burmeister, I; Busato, E; Büscher, D; Büscher, V; Bussey, P; Butler, J M; Buttar, C M; Butterworth, J M; Butti, P; Buttinger, W; Buzatu, A; Buzykaev, A R; Urbán, S Cabrera; Caforio, D; Cairo, V M; Cakir, O; Calace, N; Calafiura, P; Calandri, A; Calderini, G; Calfayan, P; Caloba, L P; Calvet, D; Calvet, S; Calvet, T P; Toro, R Camacho; Camarda, S; Camarri, P; Cameron, D; Armadans, R Caminal; Camincher, C; Campana, S; Campanelli, M; Camplani, A; Campoverde, A; Canale, V; Canepa, A; Bret, M Cano; Cantero, J; Cantrill, R; Cao, T; Garrido, M D M Capeans; Caprini, I; Caprini, M; Capua, M; Caputo, R; Carbone, R M; Cardarelli, R; Cardillo, F; Carli, I; Carli, T; Carlino, G; Carminati, L; Caron, S; Carquin, E; Carrillo-Montoya, G D; Carter, J R; Carvalho, J; Casadei, D; Casado, M P; Casolino, M; Casper, D W; Castaneda-Miranda, E; Castelijn, R; Castelli, A; Gimenez, V Castillo; Castro, N F; Catinaccio, A; Catmore, J R; Cattai, A; Caudron, J; Cavaliere, V; Cavallaro, E; Cavalli, D; Cavalli-Sforza, M; Cavasinni, V; Ceradini, F; Alberich, L Cerda; Cerio, B C; Cerqueira, A S; Cerri, A; Cerrito, L; Cerutti, F; Cerv, M; Cervelli, A; Cetin, S A; Chafaq, A; Chakraborty, D; Chan, S K; Chan, Y L; Chang, P; Chapman, J D; Charlton, D G; Chatterjee, A; Chau, C C; Barajas, C A Chavez; Che, S; Cheatham, S; Chegwidden, A; Chekanov, S; Chekulaev, S V; Chelkov, G A; Chelstowska, M A; Chen, C; Chen, H; Chen, K; Chen, S; Chen, S; Chen, X; Chen, Y; Cheng, H C; Cheng, H J; 
Cheng, Y; Cheplakov, A; Cheremushkina, E; El Moursli, R Cherkaoui; Chernyatin, V; Cheu, E; Chevalier, L; Chiarella, V; Chiarelli, G; Chiodini, G; Chisholm, A S; Chitan, A; Chizhov, M V; Choi, K; Chomont, A R; Chouridou, S; Chow, B K B; Christodoulou, V; Chromek-Burckhart, D; Chudoba, J; Chuinard, A J; Chwastowski, J J; Chytka, L; Ciapetti, G; Ciftci, A K; Cinca, D; Cindro, V; Cioara, I A; Ciocio, A; Cirotto, F; Citron, Z H; Citterio, M; Ciubancan, M; Clark, A; Clark, B L; Clark, M R; Clark, P J; Clarke, R N; Clement, C; Coadou, Y; Cobal, M; Coccaro, A; Cochran, J; Coffey, L; Colasurdo, L; Cole, B; Colijn, A P; Collot, J; Colombo, T; Compostella, G; Muiño, P Conde; Coniavitis, E; Connell, S H; Connelly, I A; Consorti, V; Constantinescu, S; Conti, G; Conventi, F; Cooke, M; Cooper, B D; Cooper-Sarkar, A M; Cormier, K J R; Cornelissen, T; Corradi, M; Corriveau, F; Corso-Radu, A; Cortes-Gonzalez, A; Cortiana, G; Costa, G; Costa, M J; Costanzo, D; Cottin, G; Cowan, G; Cox, B E; Cranmer, K; Crawley, S J; Cree, G; Crépé-Renaudin, S; Crescioli, F; Cribbs, W A; Ortuzar, M Crispin; Cristinziani, M; Croft, V; Crosetti, G; Donszelmann, T Cuhadar; Cummings, J; Curatolo, M; Cúth, J; Cuthbert, C; Czirr, H; Czodrowski, P; D'amen, G; D'Auria, S; D'Onofrio, M; Da Cunha Sargedas De Sousa, M J; Da Via, C; Dabrowski, W; Dado, T; Dai, T; Dale, O; Dallaire, F; Dallapiccola, C; Dam, M; Dandoy, J R; Dang, N P; Daniells, A C; Dann, N S; Danninger, M; Hoffmann, M Dano; Dao, V; Darbo, G; Darmora, S; Dassoulas, J; Dattagupta, A; Davey, W; David, C; Davidek, T; Davies, M; Davison, P; Dawe, E; Dawson, I; Daya-Ishmukhametova, R K; De, K; de Asmundis, R; De Benedetti, A; De Castro, S; De Cecco, S; De Groot, N; de Jong, P; De la Torre, H; De Lorenzi, F; De Maria, A; De Pedis, D; De Salvo, A; De Sanctis, U; De Santo, A; De Vivie De Regie, J B; Dearnaley, W J; Debbe, R; Debenedetti, C; Dedovich, D V; Dehghanian, N; Deigaard, I; Del Gaudio, M; Del Peso, J; Del Prete, T; Delgove, D; Deliot, F; 
Delitzsch, C M; Deliyergiyev, M; Dell'Acqua, A; Dell'Asta, L; Dell'Orso, M; Della Pietra, M; Della Volpe, D; Delmastro, M; Delsart, P A; Deluca, C; DeMarco, D A; Demers, S; Demichev, M; Demilly, A; Denisov, S P; Denysiuk, D; Derendarz, D; Derkaoui, J E; Derue, F; Dervan, P; Desch, K; Deterre, C; Dette, K; Deviveiros, P O; Dewhurst, A; Dhaliwal, S; Di Ciaccio, A; Di Ciaccio, L; Di Clemente, W K; Di Donato, C; Di Girolamo, A; Di Girolamo, B; Di Micco, B; Di Nardo, R; Di Simone, A; Di Sipio, R; Di Valentino, D; Diaconu, C; Diamond, M; Dias, F A; Diaz, M A; Diehl, E B; Dietrich, J; Diglio, S; Dimitrievska, A; Dingfelder, J; Dita, P; Dita, S; Dittus, F; Djama, F; Djobava, T; Djuvsland, J I; do Vale, M A B; Dobos, D; Dobre, M; Doglioni, C; Dohmae, T; Dolejsi, J; Dolezal, Z; Dolgoshein, B A; Donadelli, M; Donati, S; Dondero, P; Donini, J; Dopke, J; Doria, A; Dova, M T; Doyle, A T; Drechsler, E; Dris, M; Du, Y; Duarte-Campderros, J; Duchovni, E; Duckeck, G; Ducu, O A; Duda, D; Dudarev, A; Duffield, E M; Duflot, L; Duguid, L; Dührssen, M; Dumancic, M; Dunford, M; Yildiz, H Duran; Düren, M; Durglishvili, A; Duschinger, D; Dutta, B; Dyndal, M; Eckardt, C; Ecker, K M; Edgar, R C; Edwards, N C; Eifert, T; Eigen, G; Einsweiler, K; Ekelof, T; El Kacimi, M; Ellajosyula, V; Ellert, M; Elles, S; Ellinghaus, F; Elliot, A A; Ellis, N; Elmsheuser, J; Elsing, M; Emeliyanov, D; Enari, Y; Endner, O C; Endo, M; Ennis, J S; Erdmann, J; Ereditato, A; Ernis, G; Ernst, J; Ernst, M; Errede, S; Ertel, E; Escalier, M; Esch, H; Escobar, C; Esposito, B; Etienvre, A I; Etzion, E; Evans, H; Ezhilov, A; Fabbri, F; Fabbri, L; Facini, G; Fakhrutdinov, R M; Falciano, S; Falla, R J; Faltova, J; Fang, Y; Fanti, M; Farbin, A; Farilla, A; Farina, C; Farooque, T; Farrell, S; Farrington, S M; Farthouat, P; Fassi, F; Fassnacht, P; Fassouliotis, D; Giannelli, M Faucci; Favareto, A; Fawcett, W J; Fayard, L; Fedin, O L; Fedorko, W; Feigl, S; Feligioni, L; Feng, C; Feng, E J; Feng, H; Fenyuk, A B; Feremenga, L; 
Martinez, P Fernandez; Perez, S Fernandez; Ferrando, J; Ferrari, A; Ferrari, P; Ferrari, R; de Lima, D E Ferreira; Ferrer, A; Ferrere, D; Ferretti, C; Parodi, A Ferretto; Fiedler, F; Filipčič, A; Filipuzzi, M; Filthaut, F; Fincke-Keeler, M; Finelli, K D; Fiolhais, M C N; Fiorini, L; Firan, A; Fischer, A; Fischer, C; Fischer, J; Fisher, W C; Flaschel, N; Fleck, I; Fleischmann, P; Fletcher, G T; Fletcher, R R M; Flick, T; Floderus, A; Castillo, L R Flores; Flowerdew, M J; Forcolin, G T; Formica, A; Forti, A; Foster, A G; Fournier, D; Fox, H; Fracchia, S; Francavilla, P; Franchini, M; Francis, D; Franconi, L; Franklin, M; Frate, M; Fraternali, M; Freeborn, D; Fressard-Batraneanu, S M; Friedrich, F; Froidevaux, D; Frost, J A; Fukunaga, C; Torregrosa, E Fullana; Fusayasu, T; Fuster, J; Gabaldon, C; Gabizon, O; Gabrielli, A; Gabrielli, A; Gach, G P; Gadatsch, S; Gadomski, S; Gagliardi, G; Gagnon, L G; Gagnon, P; Galea, C; Galhardo, B; Gallas, E J; Gallop, B J; Gallus, P; Galster, G; Gan, K K; Gao, J; Gao, Y; Gao, Y S; Walls, F M Garay; García, C; Navarro, J E García; Garcia-Sciveres, M; Gardner, R W; Garelli, N; Garonne, V; Bravo, A Gascon; Gatti, C; Gaudiello, A; Gaudio, G; Gaur, B; Gauthier, L; Gavrilenko, I L; Gay, C; Gaycken, G; Gazis, E N; Gecse, Z; Gee, C N P; Geich-Gimbel, Ch; Geisen, M; Geisler, M P; Gemme, C; Genest, M H; Geng, C; Gentile, S; George, S; Gerbaudo, D; Gershon, A; Ghasemi, S; Ghazlane, H; Ghneimat, M; Giacobbe, B; Giagu, S; Giannetti, P; Gibbard, B; Gibson, S M; Gignac, M; Gilchriese, M; Gillam, T P S; Gillberg, D; Gilles, G; Gingrich, D M; Giokaris, N; Giordani, M P; Giorgi, F M; Giorgi, F M; Giraud, P F; Giromini, P; Giugni, D; Giuli, F; Giuliani, C; Giulini, M; Gjelsten, B K; Gkaitatzis, S; Gkialas, I; Gkougkousis, E L; Gladilin, L K; Glasman, C; Glatzer, J; Glaysher, P C F; Glazov, A; Goblirsch-Kolb, M; Godlewski, J; Goldfarb, S; Golling, T; Golubkov, D; Gomes, A; Gonçalo, R; Da Costa, J Goncalves Pinto Firmino; Gonella, G; Gonella, L; 
Gongadze, A; de la Hoz, S González; Parra, G Gonzalez; Gonzalez-Sevilla, S; Goossens, L; Gorbounov, P A; Gordon, H A; Gorelov, I; Gorini, B; Gorini, E; Gorišek, A; Gornicki, E; Goshaw, A T; Gössling, C; Gostkin, M I; Goudet, C R; Goujdami, D; Goussiou, A G; Govender, N; Gozani, E; Graber, L; Grabowska-Bold, I; Gradin, P O J; Grafström, P; Gramling, J; Gramstad, E; Grancagnolo, S; Gratchev, V; Gravila, P M; Gray, H M; Graziani, E; Greenwood, Z D; Grefe, C; Gregersen, K; Gregor, I M; Grenier, P; Grevtsov, K; Griffiths, J; Grillo, A A; Grimm, K; Grinstein, S; Gris, Ph; Grivaz, J-F; Groh, S; Grohs, J P; Gross, E; Grosse-Knetter, J; Grossi, G C; Grout, Z J; Guan, L; Guan, W; Guenther, J; Guescini, F; Guest, D; Gueta, O; Guido, E; Guillemin, T; Guindon, S; Gul, U; Gumpert, C; Guo, J; Guo, Y; Gupta, S; Gustavino, G; Gutierrez, P; Ortiz, N G Gutierrez; Gutschow, C; Guyot, C; Gwenlan, C; Gwilliam, C B; Haas, A; Haber, C; Hadavand, H K; Haddad, N; Hadef, A; Haefner, P; Hageböck, S; Hajduk, Z; Hakobyan, H; Haleem, M; Haley, J; Halladjian, G; Hallewell, G D; Hamacher, K; Hamal, P; Hamano, K; Hamilton, A; Hamity, G N; Hamnett, P G; Han, L; Hanagaki, K; Hanawa, K; Hance, M; Haney, B; Hanke, P; Hanna, R; Hansen, J B; Hansen, J D; Hansen, M C; Hansen, P H; Hara, K; Hard, A S; Harenberg, T; Hariri, F; Harkusha, S; Harrington, R D; Harrison, P F; Hartjes, F; Hartmann, N M; Hasegawa, M; Hasegawa, Y; Hasib, A; Hassani, S; Haug, S; Hauser, R; Hauswald, L; Havranek, M; Hawkes, C M; Hawkings, R J; Hayden, D; Hays, C P; Hays, J M; Hayward, H S; Haywood, S J; Head, S J; Heck, T; Hedberg, V; Heelan, L; Heim, S; Heim, T; Heinemann, B; Heinrich, J J; Heinrich, L; Heinz, C; Hejbal, J; Helary, L; Hellman, S; Helsens, C; Henderson, J; Henderson, R C W; Heng, Y; Henkelmann, S; Correia, A M Henriques; Henrot-Versille, S; Herbert, G H; Jiménez, Y Hernández; Herten, G; Hertenberger, R; Hervas, L; Hesketh, G G; Hessey, N P; Hetherly, J W; Hickling, R; Higón-Rodriguez, E; Hill, E; Hill, J C; Hiller, K 
H; Hillier, S J; Hinchliffe, I; Hines, E; Hinman, R R; Hirose, M; Hirschbuehl, D; Hobbs, J; Hod, N; Hodgkinson, M C; Hodgson, P; Hoecker, A; Hoeferkamp, M R; Hoenig, F; Hohn, D; Holmes, T R; Homann, M; Hong, T M; Hooberman, B H; Hopkins, W H; Horii, Y; Horton, A J; Hostachy, J-Y; Hou, S; Hoummada, A; Howarth, J; Hrabovsky, M; Hristova, I; Hrivnac, J; Hryn'ova, T; Hrynevich, A; Hsu, C; Hsu, P J; Hsu, S-C; Hu, D; Hu, Q; Huang, Y; Hubacek, Z; Hubaut, F; Huegging, F; Huffman, T B; Hughes, E W; Hughes, G; Huhtinen, M; Hülsing, T A; Huo, P; Huseynov, N; Huston, J; Huth, J; Iacobucci, G; Iakovidis, G; Ibragimov, I; Iconomidou-Fayard, L; Ideal, E; Idrissi, Z; Iengo, P; Igonkina, O; Iizawa, T; Ikegami, Y; Ikeno, M; Ilchenko, Y; Iliadis, D; Ilic, N; Ince, T; Introzzi, G; Ioannou, P; Iodice, M; Iordanidou, K; Ippolito, V; Ishino, M; Ishitsuka, M; Ishmukhametov, R; Issever, C; Istin, S; Ito, F; Ponce, J M Iturbe; Iuppa, R; Iwanski, W; Iwasaki, H; Izen, J M; Izzo, V; Jabbar, S; Jackson, B; Jackson, M; Jackson, P; Jain, V; Jakobi, K B; Jakobs, K; Jakobsen, S; Jakoubek, T; Jamin, D O; Jana, D K; Jansen, E; Jansky, R; Janssen, J; Janus, M; Jarlskog, G; Javadov, N; Javůrek, T; Jeanneau, F; Jeanty, L; Jejelava, J; Jeng, G-Y; Jennens, D; Jenni, P; Jentzsch, J; Jeske, C; Jézéquel, S; Ji, H; Jia, J; Jiang, H; Jiang, Y; Jiggins, S; Belenguer, M Jimenez; Pena, J Jimenez; Jin, S; Jinaru, A; Jinnouchi, O; Johansson, P; Johns, K A; Johnson, W J; Jon-And, K; Jones, G; Jones, R W L; Jones, S; Jones, T J; Jongmanns, J; Jorge, P M; Jovicevic, J; Ju, X; Rozas, A Juste; Köhler, M K; Kaczmarska, A; Kado, M; Kagan, H; Kagan, M; Kahn, S J; Kajomovitz, E; Kalderon, C W; Kaluza, A; Kama, S; Kamenshchikov, A; Kanaya, N; Kaneti, S; Kanjir, L; Kantserov, V A; Kanzaki, J; Kaplan, B; Kaplan, L S; Kapliy, A; Kar, D; Karakostas, K; Karamaoun, A; Karastathis, N; Kareem, M J; Karentzos, E; Karnevskiy, M; Karpov, S N; Karpova, Z M; Karthik, K; Kartvelishvili, V; Karyukhin, A N; Kasahara, K; Kashif, L; Kass, R 
D; Kastanas, A; Kataoka, Y; Kato, C; Katre, A; Katzy, J; Kawagoe, K; Kawamoto, T; Kawamura, G; Kazama, S; Kazanin, V F; Keeler, R; Kehoe, R; Keller, J S; Kempster, J J; Kawade, K; Keoshkerian, H; Kepka, O; Kerševan, B P; Kersten, S; Keyes, R A; Khalil-Zada, F; Khanov, A; Kharlamov, A G; Khoo, T J; Khovanskiy, V; Khramov, E; Khubua, J; Kido, S; Kim, H Y; Kim, S H; Kim, Y K; Kimura, N; Kind, O M; King, B T; King, M; King, S B; Kirk, J; Kiryunin, A E; Kishimoto, T; Kisielewska, D; Kiss, F; Kiuchi, K; Kivernyk, O; Kladiva, E; Klein, M H; Klein, M; Klein, U; Kleinknecht, K; Klimek, P; Klimentov, A; Klingenberg, R; Klinger, J A; Klioutchnikova, T; Kluge, E-E; Kluit, P; Kluth, S; Knapik, J; Kneringer, E; Knoops, E B F G; Knue, A; Kobayashi, A; Kobayashi, D; Kobayashi, T; Kobel, M; Kocian, M; Kodys, P; Koffas, T; Koffeman, E; Koi, T; Kolanoski, H; Kolb, M; Koletsou, I; Komar, A A; Komori, Y; Kondo, T; Kondrashova, N; Köneke, K; König, A C; Kono, T; Konoplich, R; Konstantinidis, N; Kopeliansky, R; Koperny, S; Köpke, L; Kopp, A K; Korcyl, K; Kordas, K; Korn, A; Korol, A A; Korolkov, I; Korolkova, E V; Kortner, O; Kortner, S; Kosek, T; Kostyukhin, V V; Kotwal, A; Kourkoumeli-Charalampidi, A; Kourkoumelis, C; Kouskoura, V; Kowalewska, A B; Kowalewski, R; Kowalski, T Z; Kozakai, C; Kozanecki, W; Kozhin, A S; Kramarenko, V A; Kramberger, G; Krasnopevtsev, D; Krasny, M W; Krasznahorkay, A; Kraus, J K; Kravchenko, A; Kretz, M; Kretzschmar, J; Kreutzfeldt, K; Krieger, P; Krizka, K; Kroeninger, K; Kroha, H; Kroll, J; Kroseberg, J; Krstic, J; Kruchonak, U; Krüger, H; Krumnack, N; Kruse, A; Kruse, M C; Kruskal, M; Kubota, T; Kucuk, H; Kuday, S; Kuechler, J T; Kuehn, S; Kugel, A; Kuger, F; Kuhl, A; Kuhl, T; Kukhtin, V; Kukla, R; Kulchitsky, Y; Kuleshov, S; Kuna, M; Kunigo, T; Kupco, A; Kurashige, H; Kurochkin, Y A; Kus, V; Kuwertz, E S; Kuze, M; Kvita, J; Kwan, T; Kyriazopoulos, D; Rosa, A La; Navarro, J L La Rosa; Rotonda, L La; Lacasta, C; Lacava, F; Lacey, J; Lacker, H; Lacour, D; 
Lacuesta, V R; Ladygin, E; Lafaye, R; Laforge, B; Lagouri, T; Lai, S; Lammers, S; Lampl, W; Lançon, E; Landgraf, U; Landon, M P J; Lang, V S; Lange, J C; Lankford, A J; Lanni, F; Lantzsch, K; Lanza, A; Laplace, S; Lapoire, C; Laporte, J F; Lari, T; Manghi, F Lasagni; Lassnig, M; Laurelli, P; Lavrijsen, W; Law, A T; Laycock, P; Lazovich, T; Lazzaroni, M; Le, B; Dortz, O Le; Guirriec, E Le; Quilleuc, E P Le; LeBlanc, M; LeCompte, T; Ledroit-Guillon, F; Lee, C A; Lee, S C; Lee, L; Lefebvre, G; Lefebvre, M; Legger, F; Leggett, C; Lehan, A; Miotto, G Lehmann; Lei, X; Leight, W A; Leisos, A; Leister, A G; Leite, M A L; Leitner, R; Lellouch, D; Lemmer, B; Leney, K J C; Lenz, T; Lenzi, B; Leone, R; Leone, S; Leonidopoulos, C; Leontsinis, S; Lerner, G; Leroy, C; Lesage, A A J; Lester, C G; Levchenko, M; Levêque, J; Levin, D; Levinson, L J; Levy, M; Lewis, D; Leyko, A M; Leyton, M; Li, B; Li, H; Li, H L; Li, L; Li, L; Li, Q; Li, S; Li, X; Li, Y; Liang, Z; Liberti, B; Liblong, A; Lichard, P; Lie, K; Liebal, J; Liebig, W; Limosani, A; Lin, S C; Lin, T H; Lindquist, B E; Lionti, A E; Lipeles, E; Lipniacka, A; Lisovyi, M; Liss, T M; Lister, A; Litke, A M; Liu, B; Liu, D; Liu, H; Liu, H; Liu, J; Liu, J B; Liu, K; Liu, L; Liu, M; Liu, M; Liu, Y L; Liu, Y; Livan, M; Lleres, A; Merino, J Llorente; Lloyd, S L; Sterzo, F Lo; Lobodzinska, E; Loch, P; Lockman, W S; Loebinger, F K; Loevschall-Jensen, A E; Loew, K M; Loginov, A; Lohse, T; Lohwasser, K; Lokajicek, M; Long, B A; Long, J D; Long, R E; Longo, L; Looper, K A; Lopes, L; Mateos, D Lopez; Paredes, B Lopez; Paz, I Lopez; Solis, A Lopez; Lorenz, J; Martinez, N Lorenzo; Losada, M; Lösel, P J; Lou, X; Lounis, A; Love, J; Love, P A; Lu, H; Lu, N; Lubatti, H J; Luci, C; Lucotte, A; Luedtke, C; Luehring, F; Lukas, W; Luminari, L; Lundberg, O; Lund-Jensen, B; Luzi, P M; Lynn, D; Lysak, R; Lytken, E; Lyubushkin, V; Ma, H; Ma, L L; Ma, Y; Maccarrone, G; Macchiolo, A; Macdonald, C M; Maček, B; Miguens, J Machado; Madaffari, D; Madar, R; 
Maddocks, H J; Mader, W F; Madsen, A; Maeda, J; Maeland, S; Maeno, T; Maevskiy, A; Magradze, E; Mahlstedt, J; Maiani, C; Maidantchik, C; Maier, A A; Maier, T; Maio, A; Majewski, S; Makida, Y; Makovec, N; Malaescu, B; Malecki, Pa; Maleev, V P; Malek, F; Mallik, U; Malon, D; Malone, C; Maltezos, S; Malyukov, S; Mamuzic, J; Mancini, G; Mandelli, B; Mandelli, L; Mandić, I; Maneira, J; Filho, L Manhaes de Andrade; Ramos, J Manjarres; Mann, A; Manousos, A; Mansoulie, B; Mansour, J D; Mantifel, R; Mantoani, M; Manzoni, S; Mapelli, L; Marceca, G; March, L; Marchiori, G; Marcisovsky, M; Marjanovic, M; Marley, D E; Marroquim, F; Marsden, S P; Marshall, Z; Marti-Garcia, S; Martin, B; Martin, T A; Martin, V J; Latour, B Martin Dit; Martinez, M; Martin-Haugh, S; Martoiu, V S; Martyniuk, A C; Marx, M; Marzin, A; Masetti, L; Mashimo, T; Mashinistov, R; Masik, J; Maslennikov, A L; Massa, I; Massa, L; Mastrandrea, P; Mastroberardino, A; Masubuchi, T; Mättig, P; Mattmann, J; Maurer, J; Maxfield, S J; Maximov, D A; Mazini, R; Mazza, S M; Fadden, N C Mc; Goldrick, G Mc; Kee, S P Mc; McCarn, A; McCarthy, R L; McCarthy, T G; McClymont, L I; McDonald, E F; McFarlane, K W; Mcfayden, J A; Mchedlidze, G; McMahon, S J; McPherson, R A; Medinnis, M; Meehan, S; Mehlhase, S; Mehta, A; Meier, K; Meineck, C; Meirose, B; Melini, D; Garcia, B R Mellado; Melo, M; Meloni, F; Mengarelli, A; Menke, S; Meoni, E; Mergelmeyer, S; Mermod, P; Merola, L; Meroni, C; Merritt, F S; Messina, A; Metcalfe, J; Mete, A S; Meyer, C; Meyer, C; Meyer, J-P; Meyer, J; Theenhausen, H Meyer Zu; Miano, F; Middleton, R P; Miglioranzi, S; Mijović, L; Mikenberg, G; Mikestikova, M; Mikuž, M; Milesi, M; Milic, A; Miller, D W; Mills, C; Milov, A; Milstead, D A; Minaenko, A A; Minami, Y; Minashvili, I A; Mincer, A I; Mindur, B; Mineev, M; Ming, Y; Mir, L M; Mistry, K P; Mitani, T; Mitrevski, J; Mitsou, V A; Miucci, A; Miyagawa, P S; Mjörnmark, J U; Moa, T; Mochizuki, K; Mohapatra, S; Molander, S; Moles-Valls, R; Monden, R; 
Mondragon, M C; Mönig, K; Monk, J; Monnier, E; Montalbano, A; Berlingen, J Montejo; Monticelli, F; Monzani, S; Moore, R W; Morange, N; Moreno, D; Llácer, M Moreno; Morettini, P; Mori, D; Mori, T; Morii, M; Morinaga, M; Morisbak, V; Moritz, S; Morley, A K; Mornacchi, G; Morris, J D; Mortensen, S S; Morvaj, L; Mosidze, M; Moss, J; Motohashi, K; Mount, R; Mountricha, E; Mouraviev, S V; Moyse, E J W; Muanza, S; Mudd, R D; Mueller, F; Mueller, J; Mueller, R S P; Mueller, T; Muenstermann, D; Mullen, P; Mullier, G A; Sanchez, F J Munoz; Quijada, J A Murillo; Murray, W J; Musheghyan, H; Muškinja, M; Myagkov, A G; Myska, M; Nachman, B P; Nackenhorst, O; Nagai, K; Nagai, R; Nagano, K; Nagasaka, Y; Nagata, K; Nagel, M; Nagy, E; Nairz, A M; Nakahama, Y; Nakamura, K; Nakamura, T; Nakano, I; Namasivayam, H; Garcia, R F Naranjo; Narayan, R; Villar, D I Narrias; Naryshkin, I; Naumann, T; Navarro, G; Nayyar, R; Neal, H A; Nechaeva, P Yu; Neep, T J; Nef, P D; Negri, A; Negrini, M; Nektarijevic, S; Nellist, C; Nelson, A; Nemecek, S; Nemethy, P; Nepomuceno, A A; Nessi, M; Neubauer, M S; Neumann, M; Neves, R M; Nevski, P; Newman, P R; Nguyen, D H; Manh, T Nguyen; Nickerson, R B; Nicolaidou, R; Nielsen, J; Nikiforov, A; Nikolaenko, V; Nikolic-Audit, I; Nikolopoulos, K; Nilsen, J K; Nilsson, P; Ninomiya, Y; Nisati, A; Nisius, R; Nobe, T; Nodulman, L; Nomachi, M; Nomidis, I; Nooney, T; Norberg, S; Nordberg, M; Norjoharuddeen, N; Novgorodova, O; Nowak, S; Nozaki, M; Nozka, L; Ntekas, K; Nurse, E; Nuti, F; O'grady, F; O'Neil, D C; O'Rourke, A A; O'Shea, V; Oakham, F G; Oberlack, H; Obermann, T; Ocariz, J; Ochi, A; Ochoa, I; Ochoa-Ricoux, J P; Oda, S; Odaka, S; Ogren, H; Oh, A; Oh, S H; Ohm, C C; Ohman, H; Oide, H; Okawa, H; Okumura, Y; Okuyama, T; Olariu, A; Seabra, L F Oleiro; Pino, S A Olivares; Damazio, D Oliveira; Olszewski, A; Olszowska, J; Onofre, A; Onogi, K; Onyisi, P U E; Oreglia, M J; Oren, Y; Orestano, D; Orlando, N; Orr, R S; Osculati, B; Ospanov, R; Garzon, G Otero Y; Otono, H; 
Ouchrif, M; Ould-Saada, F; Ouraou, A; Oussoren, K P; Ouyang, Q; Owen, M; Owen, R E; Ozcan, V E; Ozturk, N; Pachal, K; Pages, A Pacheco; Rodriguez, L Pacheco; Aranda, C Padilla; Pagáčová, M; Griso, S Pagan; Paige, F; Pais, P; Pajchel, K; Palacino, G; Palestini, S; Palka, M; Pallin, D; Palma, A; Panagiotopoulou, E St; Pandini, C E; Vazquez, J G Panduro; Pani, P; Panitkin, S; Pantea, D; Paolozzi, L; Papadopoulou, Th D; Papageorgiou, K; Paramonov, A; Hernandez, D Paredes; Parker, A J; Parker, M A; Parker, K A; Parodi, F; Parsons, J A; Parzefall, U; Pascuzzi, V R; Pasqualucci, E; Passaggio, S; Pastore, Fr; Pásztor, G; Pataraia, S; Pater, J R; Pauly, T; Pearce, J; Pearson, B; Pedersen, L E; Pedersen, M; Lopez, S Pedraza; Pedro, R; Peleganchuk, S V; Pelikan, D; Penc, O; Peng, C; Peng, H; Penwell, J; Peralva, B S; Perego, M M; Perepelitsa, D V; Codina, E Perez; Perini, L; Pernegger, H; Perrella, S; Peschke, R; Peshekhonov, V D; Peters, K; Peters, R F Y; Petersen, B A; Petersen, T C; Petit, E; Petridis, A; Petridou, C; Petroff, P; Petrolo, E; Petrov, M; Petrucci, F; Pettersson, N E; Peyaud, A; Pezoa, R; Phillips, P W; Piacquadio, G; Pianori, E; Picazio, A; Piccaro, E; Piccinini, M; Pickering, M A; Piegaia, R; Pilcher, J E; Pilkington, A D; Pin, A W J; Pinamonti, M; Pinfold, J L; Pingel, A; Pires, S; Pirumov, H; Pitt, M; Plazak, L; Pleier, M-A; Pleskot, V; Plotnikova, E; Plucinski, P; Pluth, D; Poettgen, R; Poggioli, L; Pohl, D; Polesello, G; Poley, A; Policicchio, A; Polifka, R; Polini, A; Pollard, C S; Polychronakos, V; Pommès, K; Pontecorvo, L; Pope, B G; Popeneciu, G A; Popovic, D S; Poppleton, A; Pospisil, S; Potamianos, K; Potrap, I N; Potter, C J; Potter, C T; Poulard, G; Poveda, J; Pozdnyakov, V; Astigarraga, M E Pozo; Pralavorio, P; Pranko, A; Prell, S; Price, D; Price, L E; Primavera, M; Prince, S; Proissl, M; Prokofiev, K; Prokoshin, F; Protopopescu, S; Proudfoot, J; Przybycien, M; Puddu, D; Purohit, M; Puzo, P; Qian, J; Qin, G; Qin, Y; Quadt, A; Quayle, W B; 
Queitsch-Maitland, M; Quilty, D; Raddum, S; Radeka, V; Radescu, V; Radhakrishnan, S K; Radloff, P; Rados, P; Ragusa, F; Rahal, G; Raine, J A; Rajagopalan, S; Rammensee, M; Rangel-Smith, C; Ratti, M G; Rauscher, F; Rave, S; Ravenscroft, T; Ravinovich, I; Raymond, M; Read, A L; Readioff, N P; Reale, M; Rebuzzi, D M; Redelbach, A; Redlinger, G; Reece, R; Reeves, K; Rehnisch, L; Reichert, J; Reisin, H; Rembser, C; Ren, H; Rescigno, M; Resconi, S; Rezanova, O L; Reznicek, P; Rezvani, R; Richter, R; Richter, S; Richter-Was, E; Ricken, O; Ridel, M; Rieck, P; Riegel, C J; Rieger, J; Rifki, O; Rijssenbeek, M; Rimoldi, A; Rimoldi, M; Rinaldi, L; Ristić, B; Ritsch, E; Riu, I; Rizatdinova, F; Rizvi, E; Rizzi, C; Robertson, S H; Robichaud-Veronneau, A; Robinson, D; Robinson, J E M; Robson, A; Roda, C; Rodina, Y; Perez, A Rodriguez; Rodriguez, D Rodriguez; Roe, S; Rogan, C S; Røhne, O; Romaniouk, A; Romano, M; Saez, S M Romano; Adam, E Romero; Rompotis, N; Ronzani, M; Roos, L; Ros, E; Rosati, S; Rosbach, K; Rose, P; Rosenthal, O; Rosien, N-A; Rossetti, V; Rossi, E; Rossi, L P; Rosten, J H N; Rosten, R; Rotaru, M; Roth, I; Rothberg, J; Rousseau, D; Royon, C R; Rozanov, A; Rozen, Y; Ruan, X; Rubbo, F; Rudolph, M S; Rühr, F; Ruiz-Martinez, A; Rurikova, Z; Rusakovich, N A; Ruschke, A; Russell, H L; Rutherfoord, J P; Ruthmann, N; Ryabov, Y F; Rybar, M; Rybkin, G; Ryu, S; Ryzhov, A; Rzehorz, G F; Saavedra, A F; Sabato, G; Sacerdoti, S; Sadrozinski, H F-W; Sadykov, R; Tehrani, F Safai; Saha, P; Sahinsoy, M; Saimpert, M; Saito, T; Sakamoto, H; Sakurai, Y; Salamanna, G; Salamon, A; Loyola, J E Salazar; Salek, D; De Bruin, P H Sales; Salihagic, D; Salnikov, A; Salt, J; Salvatore, D; Salvatore, F; Salvucci, A; Salzburger, A; Sammel, D; Sampsonidis, D; Sanchez, A; Sánchez, J; Martinez, V Sanchez; Sandaker, H; Sandbach, R L; Sander, H G; Sandhoff, M; Sandoval, C; Sandstroem, R; Sankey, D P C; Sannino, M; Sansoni, A; Santoni, C; Santonico, R; Santos, H; Castillo, I Santoyo; Sapp, K; Sapronov, 
A; Saraiva, J G; Sarrazin, B; Sasaki, O; Sasaki, Y; Sato, K; Sauvage, G; Sauvan, E; Savage, G; Savard, P; Sawyer, C; Sawyer, L; Saxon, J; Sbarra, C; Sbrizzi, A; Scanlon, T; Scannicchio, D A; Scarcella, M; Scarfone, V; Schaarschmidt, J; Schacht, P; Schachtner, B M; Schaefer, D; Schaefer, R; Schaeffer, J; Schaepe, S; Schaetzel, S; Schäfer, U; Schaffer, A C; Schaile, D; Schamberger, R D; Scharf, V; Schegelsky, V A; Scheirich, D; Schernau, M; Schiavi, C; Schier, S; Schillo, C; Schioppa, M; Schlenker, S; Schmidt-Sommerfeld, K R; Schmieden, K; Schmitt, C; Schmitt, S; Schmitz, S; Schneider, B; Schnoor, U; Schoeffel, L; Schoening, A; Schoenrock, B D; Schopf, E; Schott, M; Schovancova, J; Schramm, S; Schreyer, M; Schuh, N; Schultens, M J; Schultz-Coulon, H-C; Schulz, H; Schumacher, M; Schumm, B A; Schune, Ph; Schwartzman, A; Schwarz, T A; Schwegler, Ph; Schweiger, H; Schwemling, Ph; Schwienhorst, R; Schwindling, J; Schwindt, T; Sciolla, G; Scuri, F; Scutti, F; Searcy, J; Seema, P; Seidel, S C; Seiden, A; Seifert, F; Seixas, J M; Sekhniaidze, G; Sekhon, K; Sekula, S J; Seliverstov, D M; Semprini-Cesari, N; Serfon, C; Serin, L; Serkin, L; Sessa, M; Seuster, R; Severini, H; Sfiligoj, T; Sforza, F; Sfyrla, A; Shabalina, E; Shaikh, N W; Shan, L Y; Shang, R; Shank, J T; Shapiro, M; Shatalov, P B; Shaw, K; Shaw, S M; Shcherbakova, A; Shehu, C Y; Sherwood, P; Shi, L; Shimizu, S; Shimmin, C O; Shimojima, M; Shiyakova, M; Shmeleva, A; Saadi, D Shoaleh; Shochet, M J; Shojaii, S; Shrestha, S; Shulga, E; Shupe, M A; Sicho, P; Sickles, A M; Sidebo, P E; Sidiropoulou, O; Sidorov, D; Sidoti, A; Siegert, F; Sijacki, Dj; Silva, J; Silverstein, S B; Simak, V; Simard, O; Simic, Lj; Simion, S; Simioni, E; Simmons, B; Simon, D; Simon, M; Sinervo, P; Sinev, N B; Sioli, M; Siragusa, G; Sivoklokov, S Yu; Sjölin, J; Sjursen, T B; Skinner, M B; Skottowe, H P; Skubic, P; Slater, M; Slavicek, T; Slawinska, M; Sliwa, K; Slovak, R; Smakhtin, V; Smart, B H; Smestad, L; Smiesko, J; Smirnov, S Yu; Smirnov, 
Y; Smirnova, L N; Smirnova, O; Smith, M N K; Smith, R W; Smizanska, M; Smolek, K; Snesarev, A A; Snyder, S; Sobie, R; Socher, F; Soffer, A; Soh, D A; Sokhrannyi, G; Sanchez, C A Solans; Solar, M; Soldatov, E Yu; Soldevila, U; Solodkov, A A; Soloshenko, A; Solovyanov, O V; Solovyev, V; Sommer, P; Son, H; Song, H Y; Sood, A; Sopczak, A; Sopko, V; Sorin, V; Sosa, D; Sotiropoulou, C L; Soualah, R; Soukharev, A M; South, D; Sowden, B C; Spagnolo, S; Spalla, M; Spangenberg, M; Spanò, F; Sperlich, D; Spettel, F; Spighi, R; Spigo, G; Spiller, L A; Spousta, M; Denis, R D St; Stabile, A; Stamen, R; Stamm, S; Stanecka, E; Stanek, R W; Stanescu, C; Stanescu-Bellu, M; Stanitzki, M M; Stapnes, S; Starchenko, E A; Stark, G H; Stark, J; Staroba, P; Starovoitov, P; Stärz, S; Staszewski, R; Steinberg, P; Stelzer, B; Stelzer, H J; Stelzer-Chilton, O; Stenzel, H; Stewart, G A; Stillings, J A; Stockton, M C; Stoebe, M; Stoicea, G; Stolte, P; Stonjek, S; Stradling, A R; Straessner, A; Stramaglia, M E; Strandberg, J; Strandberg, S; Strandlie, A; Strauss, M; Strizenec, P; Ströhmer, R; Strom, D M; Stroynowski, R; Strubig, A; Stucci, S A; Stugu, B; Styles, N A; Su, D; Su, J; Subramaniam, R; Suchek, S; Sugaya, Y; Suk, M; Sulin, V V; Sultansoy, S; Sumida, T; Sun, S; Sun, X; Sundermann, J E; Suruliz, K; Susinno, G; Sutton, M R; Suzuki, S; Svatos, M; Swiatlowski, M; Sykora, I; Sykora, T; Ta, D; Taccini, C; Tackmann, K; Taenzer, J; Taffard, A; Tafirout, R; Taiblum, N; Takai, H; Takashima, R; Takeshita, T; Takubo, Y; Talby, M; Talyshev, A A; Tan, K G; Tanaka, J; Tanaka, R; Tanaka, S; Tannenwald, B B; Araya, S Tapia; Tapprogge, S; Tarem, S; Tartarelli, G F; Tas, P; Tasevsky, M; Tashiro, T; Tassi, E; Delgado, A Tavares; Tayalati, Y; Taylor, A C; Taylor, G N; Taylor, P T E; Taylor, W; Teischinger, F A; Teixeira-Dias, P; Temming, K K; Temple, D; Kate, H Ten; Teng, P K; Teoh, J J; Tepel, F; Terada, S; Terashi, K; Terron, J; Terzo, S; Testa, M; Teuscher, R J; Theveneaux-Pelzer, T; Thomas, J P; 
Thomas-Wilsker, J; Thompson, E N; Thompson, P D; Thompson, A S; Thomsen, L A; Thomson, E; Thomson, M; Tibbetts, M J; Torres, R E Ticse; Tikhomirov, V O; Tikhonov, Yu A; Timoshenko, S; Tipton, P; Tisserant, S; Todome, K; Todorov, T; Todorova-Nova, S; Tojo, J; Tokár, S; Tokushuku, K; Tolley, E; Tomlinson, L; Tomoto, M; Tompkins, L; Toms, K; Tong, B; Torrence, E; Torres, H; Pastor, E Torró; Toth, J; Touchard, F; Tovey, D R; Trefzger, T; Tricoli, A; Trigger, I M; Trincaz-Duvoid, S; Tripiana, M F; Trischuk, W; Trocmé, B; Trofymov, A; Troncon, C; Trottier-McDonald, M; Trovatelli, M; Truong, L; Trzebinski, M; Trzupek, A; Tseng, J C-L; Tsiareshka, P V; Tsipolitis, G; Tsirintanis, N; Tsiskaridze, S; Tsiskaridze, V; Tskhadadze, E G; Tsui, K M; Tsukerman, I I; Tsulaia, V; Tsuno, S; Tsybychev, D; Tudorache, A; Tudorache, V; Tuna, A N; Tupputi, S A; Turchikhin, S; Turecek, D; Turgeman, D; Turra, R; Turvey, A J; Tuts, P M; Tyndel, M; Ucchielli, G; Ueda, I; Ueno, R; Ughetto, M; Ukegawa, F; Unal, G; Undrus, A; Unel, G; Ungaro, F C; Unno, Y; Unverdorben, C; Urban, J; Urquijo, P; Urrejola, P; Usai, G; Usanova, A; Vacavant, L; Vacek, V; Vachon, B; Valderanis, C; Santurio, E Valdes; Valencic, N; Valentinetti, S; Valero, A; Valery, L; Valkar, S; Vallecorsa, S; Ferrer, J A Valls; Van Den Wollenberg, W; Van Der Deijl, P C; van der Geer, R; van der Graaf, H; van Eldik, N; van Gemmeren, P; Van Nieuwkoop, J; van Vulpen, I; van Woerden, M C; Vanadia, M; Vandelli, W; Vanguri, R; Vaniachine, A; Vankov, P; Vardanyan, G; Vari, R; Varnes, E W; Varol, T; Varouchas, D; Vartapetian, A; Varvell, K E; Vasquez, J G; Vazeille, F; Schroeder, T Vazquez; Veatch, J; Veloce, L M; Veloso, F; Veneziano, S; Ventura, A; Venturi, M; Venturi, N; Venturini, A; Vercesi, V; Verducci, M; Verkerke, W; Vermeulen, J C; Vest, A; Vetterli, M C; Viazlo, O; Vichou, I; Vickey, T; Boeriu, O E Vickey; Viehhauser, G H A; Viel, S; Vigani, L; Vigne, R; Villa, M; Perez, M Villaplana; Vilucchi, E; Vincter, M G; Vinogradov, V B; 
Vittori, C; Vivarelli, I; Vlachos, S; Vlasak, M; Vogel, M; Vokac, P; Volpi, G; Volpi, M; von der Schmitt, H; von Toerne, E; Vorobel, V; Vorobev, K; Vos, M; Voss, R; Vossebeld, J H; Vranjes, N; Milosavljevic, M Vranjes; Vrba, V; Vreeswijk, M; Vuillermet, R; Vukotic, I; Vykydal, Z; Wagner, P; Wagner, W; Wahlberg, H; Wahrmund, S; Wakabayashi, J; Walder, J; Walker, R; Walkowiak, W; Wallangen, V; Wang, C; Wang, C; Wang, F; Wang, H; Wang, H; Wang, J; Wang, J; Wang, K; Wang, R; Wang, S M; Wang, T; Wang, T; Wang, W; Wang, X; Wanotayaroj, C; Warburton, A; Ward, C P; Wardrope, D R; Washbrook, A; Watkins, P M; Watson, A T; Watson, M F; Watts, G; Watts, S; Waugh, B M; Webb, S; Weber, M S; Weber, S W; Webster, J S; Weidberg, A R; Weinert, B; Weingarten, J; Weiser, C; Weits, H; Wells, P S; Wenaus, T; Wengler, T; Wenig, S; Wermes, N; Werner, M; Werner, M D; Werner, P; Wessels, M; Wetter, J; Whalen, K; Whallon, N L; Wharton, A M; White, A; White, M J; White, R; Whiteson, D; Wickens, F J; Wiedenmann, W; Wielers, M; Wienemann, P; Wiglesworth, C; Wiik-Fuchs, L A M; Wildauer, A; Wilk, F; Wilkens, H G; Williams, H H; Williams, S; Willis, C; Willocq, S; Wilson, J A; Wingerter-Seez, I; Winklmeier, F; Winston, O J; Winter, B T; Wittgen, M; Wittkowski, J; Wollstadt, S J; Wolter, M W; Wolters, H; Wosiek, B K; Wotschack, J; Woudstra, M J; Wozniak, K W; Wu, M; Wu, M; Wu, S L; Wu, X; Wu, Y; Wyatt, T R; Wynne, B M; Xella, S; Xu, D; Xu, L; Yabsley, B; Yacoob, S; Yakabe, R; Yamaguchi, D; Yamaguchi, Y; Yamamoto, A; Yamamoto, S; Yamanaka, T; Yamauchi, K; Yamazaki, Y; Yan, Z; Yang, H; Yang, H; Yang, Y; Yang, Z; Yao, W-M; Yap, Y C; Yasu, Y; Yatsenko, E; Wong, K H Yau; Ye, J; Ye, S; Yeletskikh, I; Yen, A L; Yildirim, E; Yorita, K; Yoshida, R; Yoshihara, K; Young, C; Young, C J S; Youssef, S; Yu, D R; Yu, J; Yu, J M; Yu, J; Yuan, L; Yuen, S P Y; Yusuff, I; Zabinski, B; Zaidan, R; Zaitsev, A M; Zakharchuk, N; Zalieckas, J; Zaman, A; Zambito, S; Zanello, L; Zanzi, D; Zeitnitz, C; Zeman, M; Zemla, A; 
Zeng, J C; Zeng, Q; Zengel, K; Zenin, O; Ženiš, T; Zerwas, D; Zhang, D; Zhang, F; Zhang, G; Zhang, H; Zhang, J; Zhang, L; Zhang, R; Zhang, R; Zhang, X; Zhang, Z; Zhao, X; Zhao, Y; Zhao, Z; Zhemchugov, A; Zhong, J; Zhou, B; Zhou, C; Zhou, L; Zhou, L; Zhou, M; Zhou, N; Zhu, C G; Zhu, H; Zhu, J; Zhu, Y; Zhuang, X; Zhukov, K; Zibell, A; Zieminska, D; Zimine, N I; Zimmermann, C; Zimmermann, S; Zinonos, Z; Zinser, M; Ziolkowski, M; Živković, L; Zobernig, G; Zoccoli, A; Nedden, M Zur; Zurzolo, G; Zwalinski, L

    2016-01-01

    The algorithms used by the ATLAS Collaboration to reconstruct and identify prompt photons are described. Measurements of the photon identification efficiencies are reported, using 4.9 fb⁻¹ of pp collision data collected at the LHC at √s = 7 TeV and 20.3 fb⁻¹ at √s = 8 TeV. The efficiencies are measured separately for converted and unconverted photons, in four different pseudorapidity regions, for transverse momenta between 10 GeV and 1.5 TeV. The results from the combination of three data-driven techniques are compared to the predictions from a simulation of the detector response, after correcting the electromagnetic shower momenta in the simulation for the average differences observed with respect to data. Data-to-simulation efficiency ratios used as correction factors in physics measurements are determined to account for the small residual efficiency differences. These factors are measured with uncertainties between 0.5% and 10% in the 7 TeV data and between 0.5% and 5.6% in the 8 TeV data, depending on the photon transverse momentum and pseudorapidity.
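    The data-to-simulation correction factors described above are, in essence, per-bin efficiency ratios with propagated uncertainties. A minimal sketch, using hypothetical bin values (the actual ATLAS binning and uncertainties are in the paper):

    ```python
    import math

    def scale_factor(eff_data, err_data, eff_mc, err_mc):
        """Data/MC efficiency ratio with uncorrelated error propagation."""
        sf = eff_data / eff_mc
        rel_err = math.sqrt((err_data / eff_data) ** 2 + (err_mc / eff_mc) ** 2)
        return sf, sf * rel_err

    # Hypothetical efficiencies in a single (pT, eta) bin
    sf, err = scale_factor(eff_data=0.92, err_data=0.01, eff_mc=0.94, err_mc=0.005)
    print(f"SF = {sf:.3f} +/- {err:.3f}")  # SF = 0.979 +/- 0.012
    ```

    In a physics analysis, such a factor multiplies the simulated event weight in the corresponding transverse-momentum and pseudorapidity bin.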

  19. A method for simulating a flux-locked DC SQUID

    NASA Technical Reports Server (NTRS)

    Gutt, G. M.; Kasdin, N. J.; Condron, M. R., II; Muhlfelder, B.; Lockhart, J. M.; Cromar, M. W.

    1993-01-01

    The authors describe a computationally efficient and accurate method for simulating a dc SQUID's V-Phi (voltage-flux) and I-V characteristics, which has proven valuable in evaluating and improving various SQUID readout methods. The simulation is based on fitting previously acquired data, from either a real or a modeled device, with the Fourier transform of the V-Phi curve. The method does not predict SQUID behavior; rather, it replicates a known behavior efficiently, and is portable into simulation programs such as SPICE. The authors discuss the methods used to simulate the SQUID and the flux-locking control electronics, and present specific examples of this approach. Results include an estimate of the slew rate and linearity of a simple flux-locked loop built around a characterized dc SQUID.
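    The approach can be sketched as fitting a truncated Fourier series to a sampled V-Phi curve and driving a simple integrator loop against it. The function names, the synthetic 1 − cos curve, the working point, and the integrator gain below are illustrative assumptions, not the authors' implementation:

    ```python
    import numpy as np

    PHI0 = 1.0  # flux quantum, arbitrary units

    def fourier_design(phi, n_harmonics=5):
        """Design matrix: constant term plus cos/sin harmonics of period PHI0."""
        phi = np.atleast_1d(phi)
        k = np.arange(1, n_harmonics + 1)
        arg = 2 * np.pi * np.outer(phi, k) / PHI0
        return np.hstack([np.ones((phi.size, 1)), np.cos(arg), np.sin(arg)])

    def fit_v_phi(phi, v, n_harmonics=5):
        """Least-squares fit of Fourier coefficients to a sampled V-Phi curve."""
        coeffs, *_ = np.linalg.lstsq(fourier_design(phi, n_harmonics), v, rcond=None)
        return coeffs

    def eval_v_phi(coeffs, phi, n_harmonics=5):
        """Evaluate the fitted, periodic V-Phi curve at arbitrary flux values."""
        return fourier_design(phi, n_harmonics) @ coeffs

    def flux_locked_loop(coeffs, phi_ext, gain=0.05, steps=500):
        """Integrator feedback that nulls flux excursions about a working point."""
        work = 0.25  # bias point on the steep part of the V-Phi curve
        v_lock = eval_v_phi(coeffs, work)[0]
        fb = 0.0
        for _ in range(steps):
            v = eval_v_phi(coeffs, work + phi_ext + fb)[0]
            fb -= gain * (v - v_lock)  # accumulate the error signal
        return -fb  # at lock, the feedback flux cancels the external flux

    # Synthetic stand-in for measured device data: V(phi) = 1 - cos(2*pi*phi/PHI0)
    phi = np.linspace(0.0, PHI0, 200, endpoint=False)
    c = fit_v_phi(phi, 1.0 - np.cos(2 * np.pi * phi / PHI0))
    print(eval_v_phi(c, 0.5)[0])     # V at half a flux quantum (2.0 for this curve)
    print(flux_locked_loop(c, 0.1))  # recovers the applied external flux, 0.1
    ```

    Because the fitted curve is a closed-form sum of sinusoids, it evaluates cheaply and can be transcribed into a SPICE behavioral source, which is the portability the abstract refers to.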

  20. An efficient Cellular Potts Model algorithm that forbids cell fragmentation

    NASA Astrophysics Data System (ADS)

    Durand, Marc; Guesnet, Etienne

    2016-11-01

    The Cellular Potts Model (CPM) is a lattice-based modeling technique widely used to simulate cellular patterns such as foams or biological tissues. Despite its realism and generality, the standard Monte Carlo algorithm used in the literature to evolve this model preserves cell connectivity only over a limited range of simulation temperatures. We present a new algorithm in which cell fragmentation is forbidden at all simulation temperatures. This significantly enhances the realism of the simulated patterns. It also improves computational efficiency compared with the standard CPM algorithm, even at the same simulation temperature, thanks to the time saved by not attempting unrealistic moves. Moreover, our algorithm restores the detailed balance equation, ensuring that the long-term state is independent of the chosen acceptance rate and of the chosen path in temperature space.
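    One way to forbid fragmentation in a CPM update, sketched below, is a local sufficient condition: a copy attempt at a site is rejected unless the site's same-cell neighbors form a single contiguous arc of its 8-neighborhood ring. This is a simplified stand-in for the authors' algorithm (which additionally restores detailed balance); the sparse-dictionary lattice representation and all names here are illustrative:

    ```python
    import math
    import random

    # 8-neighborhood offsets, ordered around the center site
    RING = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]

    def locally_connected(lattice, i, j, cell):
        """True if the neighbors of (i, j) belonging to `cell` form one
        contiguous arc of the ring -- a local, conservative test that removing
        (i, j) from `cell` cannot split it."""
        ring = [lattice.get((i + di, j + dj)) == cell for di, dj in RING]
        if not any(ring):
            return False  # isolated site: removing it would delete the cell
        # one contiguous arc of True values <=> at most 2 transitions around the ring
        transitions = sum(ring[k] != ring[k - 1] for k in range(len(ring)))
        return transitions <= 2

    def attempt_copy(lattice, i, j, new_cell, delta_E, temperature, rng=random):
        """One Metropolis copy attempt that rejects fragmenting moves outright."""
        old_cell = lattice[(i, j)]
        if old_cell == new_cell or not locally_connected(lattice, i, j, old_cell):
            return False
        if delta_E <= 0 or rng.random() < math.exp(-delta_E / temperature):
            lattice[(i, j)] = new_cell
            return True
        return False

    # A 5-site cell in which (0, 1) is a bridge: flipping it would split the cell
    lattice = {(0, 0): 1, (0, 1): 1, (0, 2): 1, (1, 0): 1, (1, 2): 1}
    print(attempt_copy(lattice, 0, 1, 2, delta_E=-1.0, temperature=1.0))  # False
    ```

    A purely local test like this is conservative: it can also reject moves that would not actually fragment the cell (e.g. when the cell reconnects outside the 3×3 neighborhood), and naive rejection of this kind is what breaks detailed balance; the paper's algorithm treats both points more carefully.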
