Sample records for individual-based simulation model

  1. An individual-based simulation model for mottled sculpin (Cottus bairdi) in a southern Appalachian stream

    Treesearch

    Brenda Rashleigh; Gary D. Grossman

    2005-01-01

    We describe and analyze a spatially explicit, individual-based model for the local population dynamics of mottled sculpin (Cottus bairdi). The model simulated daily growth, mortality, movement and spawning of individuals within a reach of stream. Juvenile and adult growth was based on consumption bioenergetics of benthic macroinvertebrate prey;...

  2. An individual-based simulation model for mottled sculpin (Cottus bairdi) in a southern Appalachian stream

    EPA Science Inventory

    We describe and analyze a spatially explicit, individual-based model for the local population dynamics of mottled sculpin (Cottus bairdi). The model simulated daily growth, mortality, movement and spawning of individuals within a reach of stream. Juvenile and adult growth was bas...

  3. Agent-based modeling of malaria vectors: the importance of spatial simulation.

    PubMed

    Bomblies, Arne

    2014-07-03

    The modeling of malaria vector mosquito populations yields great insight into drivers of malaria transmission at the village scale. Simulation of individual mosquitoes as "agents" in a distributed, dynamic model domain may be greatly beneficial for simulation of spatial relationships of vectors and hosts. In this study, an agent-based model is used to simulate the life cycle and movement of individual malaria vector mosquitoes in a Niger Sahel village, with individual simulated mosquitoes interacting with their physical environment as well as humans. Various processes that are known to be epidemiologically important, such as the dependence of parity on flight distance between developmental habitat and blood meal hosts, and therefore on the spatial relationships of pools and houses, are readily simulated using this modeling paradigm. Impacts of perturbations can be evaluated on the basis of vectorial capacity, because the interactions between individuals that make up the population-scale metric vectorial capacity can be easily tracked for simulated mosquitoes and human blood meal hosts, without the need to estimate vectorial capacity parameters. As expected, model results show pronounced impacts of pool source reduction from larvicide application and draining, but with varying degrees of impact depending on the spatial relationship between pools and human habitation. The results highlight the importance of spatially explicit simulation that can model individuals, such as an agent-based model. The impacts of perturbations on village-scale malaria transmission depend on the spatial locations of individual mosquitoes, as well as on the tracking of relevant life cycle events and characteristics of individual mosquitoes. This study demonstrates the advantages of using an agent-based approach for village-scale mosquito simulation to address questions in which spatial relationships are known to be important.
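
    As a rough illustration of the agent paradigm this record describes, the sketch below (all names and parameters are hypothetical, not Bomblies' actual model) tracks individual mosquitoes as agents on a grid and records per-individual flight distance, one of the quantities the abstract ties to parity and vectorial capacity:

```python
import random

class Mosquito:
    """One agent: a grid position plus accumulated flight distance."""
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.flight_distance = 0.0

    def step(self, rng):
        # Unbiased random flight to a neighbouring cell.
        dx, dy = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        self.x += dx
        self.y += dy
        self.flight_distance += 1.0

def simulate(n_mosquitoes, n_steps, seed=0):
    """Run the swarm and return the mean per-individual flight distance,
    one village-scale summary an agent-based model can track directly."""
    rng = random.Random(seed)
    swarm = [Mosquito(0, 0) for _ in range(n_mosquitoes)]
    for _ in range(n_steps):
        for m in swarm:
            m.step(rng)
    return sum(m.flight_distance for m in swarm) / len(swarm)
```

    Because each agent carries its own history, quantities that are hard to estimate at the population level fall out of simple bookkeeping over individuals.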

  4. Simulation's Ensemble is Better Than Ensemble Simulation

    NASA Astrophysics Data System (ADS)

    Yan, X.

    2017-12-01

    Yan Xiaodong, State Key Laboratory of Earth Surface Processes and Resource Ecology (ESPRE), Beijing Normal University, 19 Xinjiekouwai Street, Haidian District, Beijing 100875, China. Email: yxd@bnu.edu.cn. A dynamical system is simulated from an initial state, but initial-state data carry great uncertainty, which leads to uncertainty in the simulation. Simulation from multiple possible initial states has therefore been used widely in atmospheric science and has indeed been shown to lower this uncertainty; because the multiple simulation results are fused, this approach is called a simulation's ensemble. In ecology, individual-based model simulation (forest gap models, for example) can be regarded as a simulation's ensemble when compared with community-based simulation (most ecosystem models). In this talk, we will address the advantages of individual-based simulation and even of its ensembles.

  5. Driving-forces model on individual behavior in scenarios considering moving threat agents

    NASA Astrophysics Data System (ADS)

    Li, Shuying; Zhuang, Jun; Shen, Shifei; Wang, Jia

    2017-09-01

    The individual behavior model is a contributory factor in improving the accuracy of agent-based simulation in different scenarios. However, few studies have considered moving threat agents, which often occur in terrorist attacks carried out by attackers with close-range weapons (e.g., a sword or stick). At the same time, many existing behavior models lack validation from cases or experiments. This paper builds a new individual behavior model based on seven behavioral hypotheses. The driving-forces model is an extension of the classical social force model to scenarios that include moving threat agents. An experiment was conducted to validate the key components of the model. The model was then compared with the advanced Elliptical Specification II social force model by calculating the fitting errors between the simulated and experimental trajectories, and was applied to simulate a specific circumstance. Our results show that the driving-forces model reduced the fitting error by an average of 33.9% and the standard deviation by an average of 44.5%, which indicates the accuracy and stability of the model in the studied situation. The new driving-forces model could be used to simulate individual behavior when analyzing the risk of specific scenarios using agent-based simulation methods, such as risk analysis of close-range terrorist attacks in public places.
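
    The driving-forces model itself is not reproduced in this record, but a generic social-force-style update of the kind it extends can be sketched as follows (the coefficients, the relaxation term, and the exponential threat-repulsion term are illustrative assumptions, not the authors' fitted model):

```python
import math

def driving_force_step(pos, vel, exit_pos, threat_pos, dt=0.1,
                       v_desired=1.5, tau=0.5, threat_strength=2.0):
    """One Euler step of a simplified social-force-style model:
    relaxation toward a desired velocity aimed at the exit, plus a
    repulsive force that decays exponentially with distance to a
    moving threat agent. Returns updated (pos, vel)."""
    # Driving force: relax toward the desired velocity (toward the exit).
    ex, ey = exit_pos[0] - pos[0], exit_pos[1] - pos[1]
    d = math.hypot(ex, ey) or 1.0
    f_drive = ((v_desired * ex / d - vel[0]) / tau,
               (v_desired * ey / d - vel[1]) / tau)
    # Threat repulsion: pushes the agent directly away from the threat.
    tx, ty = pos[0] - threat_pos[0], pos[1] - threat_pos[1]
    r = math.hypot(tx, ty) or 1.0
    w = threat_strength * math.exp(-r)
    f_threat = (w * tx / r, w * ty / r)
    vel = (vel[0] + dt * (f_drive[0] + f_threat[0]),
           vel[1] + dt * (f_drive[1] + f_threat[1]))
    pos = (pos[0] + dt * vel[0], pos[1] + dt * vel[1])
    return pos, vel
```

    In a full simulation this update would be applied per agent per time step, with the threat agent's position itself moving between steps.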

  6. Mosquito population dynamics from cellular automata-based simulation

    NASA Astrophysics Data System (ADS)

    Syafarina, Inna; Sadikin, Rifki; Nuraini, Nuning

    2016-02-01

    In this paper, we present an innovative model for simulating mosquito-vector population dynamics. The simulation consists of two stages: demography and dispersal dynamics. For the demography simulation, we follow an existing model of the mosquito life cycle. We then use a cellular automata-based model to simulate dispersal of the vector. In the simulation, each individual vector can move to another grid cell following a random walk. Our model is also capable of representing an immunity factor for each grid cell. We ran simulations to evaluate the model's correctness and, based on them, conclude that the model behaves correctly. However, the model still needs more realistic parameters to match real data.
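
    A minimal version of the dispersal stage, assuming a uniform random walk on a periodic grid (the `disperse` helper and its conventions are mine, not the authors'):

```python
import random

def disperse(grid, seed=0):
    """One cellular-automaton dispersal step: every mosquito in every
    cell moves to a uniformly chosen neighbouring cell (random walk
    with periodic boundaries). `grid` maps (row, col) -> count."""
    rng = random.Random(seed)
    rows = max(r for r, _ in grid) + 1
    cols = max(c for _, c in grid) + 1
    new = {(r, c): 0 for r in range(rows) for c in range(cols)}
    for (r, c), count in grid.items():
        for _ in range(count):
            dr, dc = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
            new[((r + dr) % rows, (c + dc) % cols)] += 1
    return new
```

    Note that the step conserves the total population; mortality and emergence would be handled in the separate demography stage.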

  7. Models and Methods for Adaptive Management of Individual and Team-Based Training Using a Simulator

    NASA Astrophysics Data System (ADS)

    Lisitsyna, L. S.; Smetyuh, N. P.; Golikov, S. P.

    2017-05-01

    Analysis of research on adaptive individual and team-based training shows that, both in Russia and abroad, individual and team-based training and retraining of AASTM operators usually includes: production training; training in general computer and office equipment skills; and simulator training, including virtual simulators that use computers to reproduce real-world manufacturing situations, with the evaluation of AASTM operators' knowledge, as a rule, determined by the completeness and adequacy of their actions under the simulated conditions. Such an approach to the training and retraining of AASTM operators covers only the operators' technical training and the testing of their knowledge by assessing their actions in a simulated environment.

  8. A simulations approach for meta-analysis of genetic association studies based on additive genetic model.

    PubMed

    John, Majnu; Lencz, Todd; Malhotra, Anil K; Correll, Christoph U; Zhang, Jian-Ping

    2018-06-01

    Meta-analysis of genetic association studies is being increasingly used to assess phenotypic differences between genotype groups. When the underlying genetic model is assumed to be dominant or recessive, assessing the phenotype differences based on summary statistics, reported for individual studies in a meta-analysis, is a valid strategy. However, when the genetic model is additive, a similar strategy based on summary statistics will lead to biased results. This fact about the additive model is one of the things that we establish in this paper, using simulations. The main goal of this paper is to present an alternate strategy for the additive model based on simulating data for the individual studies. We show that the alternate strategy is far superior to the strategy based on summary statistics.
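
    The paper's alternate strategy of simulating individual-level study data under an additive model can be illustrated roughly as below; the parameterization (phenotype = beta × genotype + standard-normal noise, with Hardy-Weinberg genotype frequencies) is an assumption for the sketch, not the authors' exact setup:

```python
import random
import statistics

def simulate_study(n, allele_freq, beta, seed=0):
    """Simulate one study's individual-level data under an additive
    genetic model: genotype g in {0, 1, 2} copies of the effect allele
    (Hardy-Weinberg), phenotype = beta * g + N(0, 1) noise.
    Returns per-genotype phenotype means."""
    rng = random.Random(seed)
    by_genotype = {0: [], 1: [], 2: []}
    for _ in range(n):
        g = sum(rng.random() < allele_freq for _ in range(2))
        by_genotype[g].append(beta * g + rng.gauss(0, 1))
    return {g: statistics.mean(v) for g, v in by_genotype.items() if v}
```

    Repeating this per study and pooling the simulated individuals, rather than pooling reported summary statistics, is the kind of simulation-based strategy the abstract argues is unbiased for the additive case.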

  9. A grouping method based on grid density and relationship for crowd evacuation simulation

    NASA Astrophysics Data System (ADS)

    Li, Yan; Liu, Hong; Liu, Guang-peng; Li, Liang; Moore, Philip; Hu, Bin

    2017-05-01

    Psychological factors affect the movement of people in the competitive or panic mode of evacuation, in which the density of pedestrians is relatively large and the distance among them is small. In this paper, a crowd is divided into groups according to their social relations to simulate the actual movement of crowd evacuation more realistically and increase the attractiveness of the group based on social force model. The force of group attraction is the synthesis of two forces; one is the attraction of the individuals generated by their social relations to gather, and the other is that of the group leader to the individuals within the group to ensure that the individuals follow the leader. The synthetic force determines the trajectory of individuals. The evacuation process is demonstrated using the improved social force model. In the improved social force model, the individuals with close social relations gradually present a closer and coordinated action while following the leader. In this paper, a grouping algorithm is proposed based on grid density and relationship via computer simulation to illustrate the features of the improved social force model. The definition of the parameters involved in the algorithm is given, and the effect of relational value on the grouping is tested. Reasonable numbers of grids and weights are selected. The effectiveness of the algorithm is shown through simulation experiments. A simulation platform is also established using the proposed grouping algorithm and the improved social force model for crowd evacuation simulation.

  10. CDFISH: an individual-based, spatially-explicit, landscape genetics simulator for aquatic species in complex riverscapes

    USGS Publications Warehouse

    Erin L. Landguth; Muhlfeld, Clint C.; Luikart, Gordon

    2012-01-01

    We introduce Cost Distance FISHeries (CDFISH), a simulator of population genetics and connectivity in complex riverscapes for a wide range of environmental scenarios of aquatic organisms. The spatially-explicit program implements individual-based genetic modeling with Mendelian inheritance and k-allele mutation on a riverscape with resistance to movement. The program simulates individuals in subpopulations through time employing user-defined functions of individual migration, reproduction, mortality, and dispersal through straying on a continuous resistance surface.
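
    The Mendelian-inheritance-with-k-allele-mutation step that CDFISH implements can be sketched generically (the function name and signature are hypothetical, not CDFISH's API):

```python
import random

def inherit(mother, father, k, mu, rng):
    """One locus of Mendelian inheritance with k-allele mutation:
    the offspring draws one allele from each parent, and each drawn
    allele mutates with probability `mu` to one of k states chosen
    uniformly at random."""
    child = [rng.choice(mother), rng.choice(father)]
    for i in range(2):
        if rng.random() < mu:
            child[i] = rng.randrange(k)
    return child
```

    In a riverscape simulator this step would run per locus per offspring, with mate choice constrained by the resistance surface.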

  11. A Systematic Review of Agent-Based Modelling and Simulation Applications in the Higher Education Domain

    ERIC Educational Resources Information Center

    Gu, X.; Blackmore, K. L.

    2015-01-01

    This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…

  12. USING ECO-EVOLUTIONARY INDIVIDUAL-BASED MODELS TO INVESTIGATE SPATIALLY-DEPENDENT PROCESSES IN CONSERVATION GENETICS

    EPA Science Inventory

    Eco-evolutionary population simulation models are powerful new forecasting tools for exploring management strategies for climate change and other dynamic disturbance regimes. Additionally, eco-evo individual-based models (IBMs) are useful for investigating theoretical feedbacks ...

  13. Rule-Based Simulation of Multi-Cellular Biological Systems—A Review of Modeling Techniques

    PubMed Central

    Hwang, Minki; Garbey, Marc; Berceli, Scott A.; Tran-Son-Tay, Roger

    2011-01-01

    Emergent behaviors of multi-cellular biological systems (MCBS) result from the behavior of each individual cell and its interactions with other cells and with the environment. Modeling MCBS requires incorporating these complex interactions among the individual cells and the environment. Modeling approaches for MCBS can be grouped into two categories: continuum models and cell-based models. Continuum models usually take the form of partial differential equations, and the model equations provide insight into the relationships among the components in the system. Cell-based models simulate each individual cell's behavior and the interactions among cells, enabling observation of the emergent system behavior. This review focuses on cell-based models of MCBS, with particular attention to the technical aspects of the rule-based simulation method. How to implement the cell behaviors and the interactions with other cells and with the environment in the computational domain is discussed. The cell behaviors reviewed in this paper are division, migration, apoptosis/necrosis, and differentiation. Environmental factors such as extracellular matrix, chemicals, microvasculature, and forces are also discussed. Application examples of these cell behaviors and interactions are presented. PMID:21369345

  14. MOAB: a spatially explicit, individual-based expert system for creating animal foraging models

    USGS Publications Warehouse

    Carter, J.; Finn, John T.

    1999-01-01

    We describe the development, structure, and corroboration process of a simulation model of animal behavior (MOAB). MOAB can create spatially explicit, individual-based animal foraging models. Users can create or replicate heterogeneous landscape patterns, and place resources and individual animals of a given species on that landscape to simultaneously simulate the foraging behavior of multiple species. The heuristic rules for animal behavior are maintained in a user-modifiable expert system. MOAB can be used to explore hypotheses concerning the influence of landscape pattern on animal movement and foraging behavior. A red fox (Vulpes vulpes L.) foraging and nest predation model was created to test MOAB's capabilities. Foxes were simulated for 30-day periods using both expert system and random movement rules. Home range size, territory formation, and other measures were compared with available simulation studies. A striped skunk (Mephitis mephitis L.) model also was developed. The expert system model proved superior to the stochastic model with respect to territory formation, general movement patterns, and home range size.

  15. A physiologically-based model for simulation of color vision deficiency.

    PubMed

    Machado, Gustavo M; Oliveira, Manuel M; Fernandes, Leandro A F

    2009-01-01

    Color vision deficiency (CVD) affects approximately 200 million people worldwide, compromising the ability of these individuals to effectively perform color and visualization-related tasks. This has a significant impact on their private and professional lives. We present a physiologically-based model for simulating color vision. Our model is based on the stage theory of human color vision and is derived from data reported in electrophysiological studies. It is the first model to consistently handle normal color vision, anomalous trichromacy, and dichromacy in a unified way. We have validated the proposed model through an experimental evaluation involving groups of color vision deficient individuals and normal color vision ones. Our model can provide insights and feedback on how to improve visualization experiences for individuals with CVD. It also provides a framework for testing hypotheses about some aspects of the retinal photoreceptors in color vision deficient individuals.

  16. Stochastic Individual-Based Modeling of Bacterial Growth and Division Using Flow Cytometry.

    PubMed

    García, Míriam R; Vázquez, José A; Teixeira, Isabel G; Alonso, Antonio A

    2017-01-01

    A realistic description of the variability in bacterial growth and division is critical to produce reliable predictions of safety risks along the food chain. Individual-based modeling of bacteria provides the theoretical framework to deal with this variability, but it requires information about the individual behavior of bacteria inside populations. In this work, we overcome this problem by estimating the individual behavior of bacteria from population statistics obtained with flow cytometry. For this objective, a stochastic individual-based modeling framework is defined based on standard assumptions during division and exponential growth. The unknown single-cell parameters required for running the individual-based modeling simulations, such as the cell size growth rate, are estimated from the flow cytometry data. Instead of using the individual-based model directly, we make use of a modified Fokker-Planck equation; this single equation simulates the population statistics as a function of the unknown single-cell parameters. We test the validity of the approach by modeling the growth and division of Pediococcus acidilactici within the exponential phase. The estimates reveal the statistics of cell growth and division using only flow cytometry data from a given time. From the relationship between mother and daughter volumes, we also predict that P. acidilactici divides along two successive parallel planes.
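
    A toy version of the stochastic growth-and-division process reads as below; equal-volume division, a fixed division threshold, and Gaussian rate noise are simplifying assumptions for the sketch (the paper infers the actual single-cell parameters from flow cytometry rather than fixing them):

```python
import random

def grow_and_divide(volumes, rate, dt, v_div, rng):
    """One time step of a stochastic individual-based growth/division
    model: each cell grows exponentially with a noisy rate, and any
    cell reaching the division volume splits into two equal daughters."""
    next_gen = []
    for v in volumes:
        v *= 1.0 + (rate + rng.gauss(0, 0.1 * rate)) * dt
        if v >= v_div:
            next_gen.extend([v / 2.0, v / 2.0])
        else:
            next_gen.append(v)
    return next_gen

# Run a small population through the exponential phase.
rng = random.Random(42)
pop = [1.0] * 100
for _ in range(50):
    pop = grow_and_divide(pop, rate=0.5, dt=0.1, v_div=2.0, rng=rng)
```

    The resulting distribution of cell volumes is the kind of population statistic that a Fokker-Planck description summarizes directly, without simulating every individual.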

  17. Building occupancy simulation and data assimilation using a graph-based agent-oriented model

    NASA Astrophysics Data System (ADS)

    Rai, Sanish; Hu, Xiaolin

    2018-07-01

    Building occupancy simulation and estimation simulate the dynamics of occupants and estimate their real-time spatial distribution in a building. This requires a simulation model and an algorithm for data assimilation that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. Agent-based models suffer high computation cost when simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimates of building occupancy from sensor data.
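
    The data-assimilation step can be illustrated with a bootstrap particle filter on a single room's occupancy count; the random-walk transition and the Gaussian sensor likelihood below are assumptions for the sketch, not the paper's graph-based model:

```python
import math
import random

def particle_filter(observations, n_particles=500, seed=0):
    """Bootstrap Sequential Monte Carlo sketch: each particle carries
    a hypothesized occupancy count, is propagated with a random-walk
    transition, weighted by a Gaussian sensor likelihood (noise sd
    assumed = 2), and resampled. Returns the posterior-mean estimates."""
    rng = random.Random(seed)
    particles = [rng.randint(0, 20) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Transition: occupants arrive or leave at random.
        particles = [max(0, p + rng.choice([-1, 0, 1])) for p in particles]
        # Weight each particle by the likelihood of the sensor reading.
        weights = [math.exp(-0.5 * ((z - p) / 2.0) ** 2) for p in particles]
        # Multinomial resampling proportional to the weights.
        particles = rng.choices(particles, weights=weights, k=n_particles)
        estimates.append(sum(particles) / n_particles)
    return estimates
```

    In the paper's setting the particle state would be the occupancy of every node of the building graph rather than a single count.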

  18. Simulation of Drought-induced Tree Mortality Using a New Individual and Hydraulic Trait-based Model (S-TEDy)

    NASA Astrophysics Data System (ADS)

    Sinha, T.; Gangodagamage, C.; Ale, S.; Frazier, A. G.; Giambelluca, T. W.; Kumagai, T.; Nakai, T.; Sato, H.

    2017-12-01

    Drought-related tree mortality at a regional scale causes drastic shifts in carbon and water cycling in Southeast Asian tropical rainforests, where severe droughts are projected to occur more frequently, especially under El Niño conditions. To provide a useful tool for projecting tropical rainforest dynamics under climate change, we developed a version of the Spatially Explicit Individual-Based (SEIB) Dynamic Global Vegetation Model (DGVM) capable of simulating mechanistic tree mortality induced by climatic impacts via individual-tree-scale ecophysiology, such as hydraulic failure and carbon starvation. In this study, we present the new model, the SEIB-originated Terrestrial Ecosystem Dynamics (S-TEDy) model, and compare its computation results with observations collected at a field site in a Bornean tropical rainforest. Furthermore, after validating the model's performance, numerical experiments addressing the future of the tropical rainforest were conducted using several global climate model (GCM) simulation outputs.

  19. Movement rules for individual-based models of stream fish

    Treesearch

    Steven F. Railsback; Roland H. Lamberson; Bret C. Harvey; Walter E. Duffy

    1999-01-01

    Abstract - Spatially explicit individual-based models (IBMs) use movement rules to determine when an animal departs its current location and to determine its movement destination; these rules are therefore critical to accurate simulations. Movement rules typically define some measure of how an individual's expected fitness varies among locations, under the...

  20. HexSim - A general purpose framework for spatially-explicit, individual-based modeling

    EPA Science Inventory

    HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications. This talk will focus on a subset of those ap...

  21. Individual-based models in ecology after four decades

    PubMed Central

    Grimm, Volker

    2014-01-01

    Individual-based models simulate populations and communities by following individuals and their properties. They have been used in ecology for more than four decades, with their use and ubiquity in ecology growing rapidly in the last two decades. Individual-based models have been used for many applied or “pragmatic” issues, such as informing the protection and management of particular populations in specific locations, but their use in addressing theoretical questions has also grown rapidly, recently helping us to understand how the sets of traits of individual organisms influence the assembly of communities and food webs. Individual-based models will play an increasingly important role in questions posed by complex ecological systems. PMID:24991416

  22. An individual-based modeling approach to simulate the effects of cellular nutrient competition on Escherichia coli K-12 MG1655 colony behavior and interactions in aerobic structured food systems.

    PubMed

    Tack, Ignace L M M; Logist, Filip; Noriega Fernández, Estefanía; Van Impe, Jan F M

    2015-02-01

    Traditional kinetic models in predictive microbiology reliably predict macroscopic dynamics of planktonically growing cell cultures in homogeneous liquid food systems. However, most food products have a semi-solid structure, where microorganisms grow locally in colonies. Individual colony cells exhibit strongly different and non-normally distributed behavior due to local nutrient competition. As a result, traditional models considering average population behavior in a homogeneous system do not describe colony dynamics in full detail. To incorporate local resource competition and individual cell differences, an individual-based modeling approach has been applied to Escherichia coli K-12 MG1655 colonies, considering the microbial cell as the modeling unit. The first contribution of this individual-based model is to describe single-colony growth under nutrient-deprived conditions. More specifically, the linear and stationary phases in the evolution of the colony radius, the evolution from a disk-like to a branching morphology, and the emergence of a starvation zone in the colony center are simulated and compared to available experimental data. These phenomena occur earlier under more severe nutrient-depletion conditions, i.e., at lower nutrient diffusivity and lower initial nutrient concentration in the medium. Furthermore, intercolony interactions have been simulated. Higher inoculum densities lead to stronger intercolony interactions, such as colony merging and smaller colony sizes, due to nutrient competition. This individual-based model contributes to the elucidation of characteristic experimentally observed colony behavior from mechanistic information about cellular physiology and interactions.

  23. Relationships between migration rates and landscape resistance assessed using individual-based simulations

    Treesearch

    E. L. Landguth; S. A. Cushman; M. A. Murphy; G. Luikart

    2010-01-01

    Linking landscape effects on gene flow to processes such as dispersal and mating is essential to provide a conceptual foundation for landscape genetics. It is particularly important to determine how classical population genetic models relate to recent individual-based landscape genetic models when assessing individual movement and its influence on population genetic...

  24. Simulating natural selection in landscape genetics

    Treesearch

    E. L. Landguth; S. A. Cushman; N. Johnson

    2012-01-01

    Linking landscape effects to key evolutionary processes through individual organism movement and natural selection is essential to provide a foundation for evolutionary landscape genetics. Of particular importance is determining how spatially- explicit, individual-based models differ from classic population genetics and evolutionary ecology models based on ideal...

  25. CFD simulation of hemodynamics in sequential and individual coronary bypass grafts based on multislice CT scan datasets.

    PubMed

    Hajati, Omid; Zarrabi, Khalil; Karimi, Reza; Hajati, Azadeh

    2012-01-01

    There is still controversy over the differences in the patency rates of the sequential and individual coronary artery bypass grafting (CABG) techniques. The purpose of this paper was to non-invasively evaluate hemodynamic parameters using complete 3D computational fluid dynamics (CFD) simulations of the sequential and individual methods, based on patient-specific data extracted from computed tomography (CT) angiography. For the CFD analysis, the geometric model of the coronary arteries was reconstructed using an ECG-gated 64-detector-row CT. Modeling both sequential and individual bypass grafting, this study simulates the flow from the aorta to the occluded posterior descending artery (PDA) and the posterior left ventricle (PLV) vessel with six coronary branches, using the physiologically measured inlet flow as the boundary condition. The maximum calculated wall shear stress (WSS) in the sequential and individual models was estimated to be 35.1 N/m² and 36.5 N/m², respectively. Compared to the individual bypass method, the sequential graft showed a higher velocity at the proximal segment and a lower spatial wall shear stress gradient (SWSSG), due to the flow splitting caused by the side-to-side anastomosis. The simulation results, combined with the method's surgical benefits, including a shorter required vein length and fewer anastomoses, advocate the sequential method as the more favorable CABG technique.

  26. CDPOP: A spatially explicit cost distance population genetics program

    Treesearch

    Erin L. Landguth; S. A. Cushman

    2010-01-01

    Spatially explicit simulation of gene flow in complex landscapes is essential to explain observed population responses and provide a foundation for landscape genetics. To address this need, we wrote a spatially explicit, individual-based population genetics model (CDPOP). The model implements individual-based population modelling with Mendelian inheritance and k-allele...

  27. Model-based surgical planning and simulation of cranial base surgery.

    PubMed

    Abe, M; Tabuchi, K; Goto, M; Uchino, A

    1998-11-01

    Plastic skull models of seven individual patients were fabricated by stereolithography from three-dimensional data based on computed tomography bone images. Skull models were utilized for neurosurgical planning and simulation in the seven patients with cranial base lesions that were difficult to remove. Surgical approaches and areas of craniotomy were evaluated using the fabricated skull models. In preoperative simulations, hand-made models of the tumors, major vessels and nerves were placed in the skull models. Step-by-step simulation of surgical procedures was performed using actual surgical tools. The advantages of using skull models to plan and simulate cranial base surgery include a better understanding of anatomic relationships, preoperative evaluation of the proposed procedure, increased understanding by the patient and family, and improved educational experiences for residents and other medical staff. The disadvantages of using skull models include the time and cost of making the models. The skull models provide a more realistic tool that is easier to handle than computer-graphic images. Surgical simulation using models facilitates difficult cranial base surgery and may help reduce surgical complications.

  28. Understanding the effects of different HIV transmission models in individual-based microsimulation of HIV epidemic dynamics in people who inject drugs

    PubMed Central

    MONTEIRO, J.F.G.; ESCUDERO, D.J.; WEINREB, C.; FLANIGAN, T.; GALEA, S.; FRIEDMAN, S.R.; MARSHALL, B.D.L.

    2017-01-01

    We investigated how different models of HIV transmission, and assumptions regarding the distribution of unprotected sex and syringe-sharing events (‘risk acts’), affect quantitative understanding of the HIV transmission process in people who inject drugs (PWID). The individual-based model simulated HIV transmission in a dynamic sexual and injecting network representing New York City. We constructed four HIV transmission models: model 1, constant probabilities; model 2, random numbers of sexual and parenteral acts; model 3, individually assigned viral load; and model 4, two groups of partnerships (low and high risk). Overall, models with less heterogeneity were more sensitive to changes in the number of risk acts, producing HIV incidence up to four times higher than that empirically observed. Although all models overestimated HIV incidence, microsimulations with greater heterogeneity in the HIV transmission modelling process produced more robust results and better reproduced empirical epidemic dynamics. PMID:26753627
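
    Under a constant per-act probability (as in model 1 of this record), the cumulative infection probability over independent risk acts follows the standard escape formula P = 1 - (1 - p)^n, sketched below (a generic identity used here for illustration, not the authors' full network model):

```python
def infection_probability(per_act_prob, n_acts):
    """Cumulative infection probability over n independent risk acts,
    each with constant per-act transmission probability p:
    P = 1 - (1 - p)^n, i.e. one minus the chance of escaping all acts."""
    return 1.0 - (1.0 - per_act_prob) ** n_acts
```

    Heterogeneity in the number of acts or in per-act probability (models 2-4) breaks this simple closed form, which is why the more heterogeneous microsimulations behave differently from the constant-probability model.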

  29. Leveraging social networks for understanding the evolution of epidemics

    PubMed Central

    2011-01-01

    Background To understand how infectious agents disseminate throughout a population it is essential to capture the social model in a realistic manner. This paper presents a novel approach to modeling the propagation of the influenza virus throughout a realistic interconnection network based on actual individual interactions which we extract from online social networks. The advantage is that these networks can be extracted from existing sources which faithfully record interactions between people in their natural environment. We additionally allow modeling the characteristics of each individual as well as customizing his daily interaction patterns by making them time-dependent. Our purpose is to understand how the infection spreads depending on the structure of the contact network and the individuals who introduce the infection in the population. This would help public health authorities to respond more efficiently to epidemics. Results We implement a scalable, fully distributed simulator and validate the epidemic model by comparing the simulation results against the data in the 2004-2005 New York State Department of Health Report (NYSDOH), with similar temporal distribution results for the number of infected individuals. We analyze the impact of different types of connection models on the virus propagation. Lastly, we analyze and compare the effects of adopting several different vaccination policies, some of them based on individual characteristics -such as age- while others targeting the super-connectors in the social model. Conclusions This paper presents an approach to modeling the propagation of the influenza virus via a realistic social model based on actual individual interactions extracted from online social networks. We implemented a scalable, fully distributed simulator and we analyzed both the dissemination of the infection and the effect of different vaccination policies on the progress of the epidemics. 
The epidemic values predicted by our simulator match real data from NYSDOH. Our results show that our simulator can be a useful tool in understanding the differences in the evolution of an epidemic within populations with different characteristics and can provide guidance with regard to which, and how many, individuals should be vaccinated to slow down the virus propagation and reduce the number of infections. PMID:22784620
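    The kind of network epidemic simulation described above can be sketched as a minimal discrete-time SIR process over an arbitrary contact graph. The function name, parameter values and update rules below are illustrative assumptions, not the simulator described in the record:

    ```python
    import random

    def simulate_sir(contacts, seeds, p_transmit=0.05, days_infectious=4, rng=None):
        """Minimal discrete-time SIR epidemic on a contact network.

        contacts: dict mapping each individual to a list of neighbours (the
        social graph); seeds: the initially infected individuals.
        """
        rng = rng or random.Random(42)
        state = {p: "S" for p in contacts}
        days_left = {}
        for s in seeds:
            state[s] = "I"
            days_left[s] = days_infectious
        daily_new = []
        while days_left:
            # each infectious person may transmit to each susceptible neighbour
            newly = set()
            for p in list(days_left):
                for nb in contacts[p]:
                    if state[nb] == "S" and rng.random() < p_transmit:
                        newly.add(nb)
            # progress existing infections toward recovery
            for p in list(days_left):
                days_left[p] -= 1
                if days_left[p] == 0:
                    state[p] = "R"
                    del days_left[p]
            for nb in newly:
                state[nb] = "I"
                days_left[nb] = days_infectious
            daily_new.append(len(newly))
        return state, daily_new
    ```

    The returned daily counts give the epidemic curve that such simulators compare against surveillance data; swapping in a contact graph extracted from a social network changes only the `contacts` argument.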

  10. ML-Space: Hybrid Spatial Gillespie and Particle Simulation of Multi-Level Rule-Based Models in Cell Biology.

    PubMed

    Bittig, Arne T; Uhrmacher, Adelinde M

    2017-01-01

    Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial Stochastic Simulation Algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically and hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and makes it easy to adapt their spatial resolution.
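    The Stochastic Simulation Algorithm that such hybrid approaches build on is Gillespie's direct method. A minimal, non-spatial sketch follows; this is not ML-Space's actual interface, and the species names and rate constant are made up:

    ```python
    import random

    def gillespie_direct(x, reactions, t_end, rng=None):
        """Gillespie's direct-method SSA.

        x: dict of species counts.
        reactions: list of (rate_fn, stoichiometry) pairs, where rate_fn(x)
        gives the propensity and stoichiometry maps species -> count change.
        """
        rng = rng or random.Random(1)
        t = 0.0
        while t < t_end:
            props = [rate(x) for rate, _ in reactions]
            total = sum(props)
            if total == 0:
                break  # no reaction can fire any more
            t += rng.expovariate(total)  # time to next reaction event
            if t >= t_end:
                break
            # pick which reaction fired, proportionally to its propensity
            r = rng.random() * total
            acc = 0.0
            for p, (_, stoich) in zip(props, reactions):
                acc += p
                if r < acc:
                    for sp, d in stoich.items():
                        x[sp] += d
                    break
        return x

    # Illustrative system: irreversible conversion A -> B at rate k * [A]
    k = 0.5
    state = gillespie_direct(
        {"A": 100, "B": 0},
        [(lambda s: k * s["A"], {"A": -1, "B": 1})],
        t_end=50.0,
    )
    ```

    Spatial variants subdivide the domain into subvolumes and add diffusion jumps between them as extra "reactions", which is roughly the discrete-space layer such hybrid simulators combine with particle tracking.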

  11. mizer: an R package for multispecies, trait-based and community size spectrum ecological modelling.

    PubMed

    Scott, Finlay; Blanchard, Julia L; Andersen, Ken H

    2014-10-01

    Size spectrum ecological models are representations of a community of individuals which grow and change trophic level. A key emergent feature of these models is the size spectrum: the total abundance of all individuals, which scales negatively with size. The models we focus on are designed to capture fish community dynamics and are useful for assessing the community impacts of fishing. We present mizer, an R package for implementing dynamic size spectrum ecological models of an entire aquatic community subject to fishing. Multiple fishing gears can be defined and fishing mortality can change through time, making it possible to simulate a range of exploitation strategies and management options. mizer implements three versions of the size spectrum modelling framework: the community model, where individuals are only characterized by their size; the trait-based model, where individuals are further characterized by their asymptotic size; and the multispecies model, where additional trait differences are resolved. A range of plot, community indicator and summary methods are available to inspect the results of the simulations.

  12. Analysis of habitat-selection rules using an individual-based model

    Treesearch

    Steven F. Railsback; Bret C. Harvey

    2002-01-01

    Abstract - Despite their promise for simulating natural complexity, individual-based models (IBMs) are rarely used for ecological research or resource management. Few IBMs have been shown to reproduce realistic patterns of behavior by individual organisms. To test our IBM of stream salmonids and draw conclusions about foraging theory, we analyzed the IBM's ability to...

  13. Risk, individual differences, and environment: an Agent-Based Modeling approach to sexual risk-taking.

    PubMed

    Nagoski, Emily; Janssen, Erick; Lohrmann, David; Nichols, Eric

    2012-08-01

    Risky sexual behaviors, including the decision to have unprotected sex, result from interactions between individuals and their environment. The current study explored the use of Agent-Based Modeling (ABM), a methodological approach in which computer-generated artificial societies simulate human sexual networks, to assess the influence of heterogeneity of sexual motivation on the risk of contracting HIV. The models successfully simulated some characteristics of human sexual systems, such as the relationship between individual differences in sexual motivation (sexual excitation and inhibition) and sexual risk, but failed to reproduce the scale-free distribution of number of partners observed in the real world. ABM has the potential to inform intervention strategies that target the interaction between an individual and his or her social environment.

  14. An Agent-Based Epidemic Simulation of Social Behaviors Affecting HIV Transmission among Taiwanese Homosexuals

    PubMed Central

    2015-01-01

    Computational simulations are currently used to identify epidemic dynamics, to test potential prevention and intervention strategies, and to study the effects of social behaviors on HIV transmission. The author describes an agent-based epidemic simulation model of a network of individuals who participate in high-risk sexual practices, using number of partners, condom usage, and relationship length to distinguish between high- and low-risk populations. Two new concepts—free links and fixed links—are used to indicate tendencies among individuals who either have large numbers of short-term partners or stay in long-term monogamous relationships. An attempt was made to reproduce epidemic curves of reported HIV cases among male homosexuals in Taiwan prior to using the agent-based model to determine the effects of various policies on epidemic dynamics. Results suggest that when suitable adjustments are made based on available social survey statistics, the model accurately simulates real-world behaviors on a large scale. PMID:25815047

  15. Integrating Biodiversity into Biosphere-Atmosphere Interactions Using Individual-Based Models (IBM)

    NASA Astrophysics Data System (ADS)

    Wang, B.; Shugart, H. H., Jr.; Lerdau, M.

    2017-12-01

    A key component regulating complex, nonlinear, and dynamic biosphere-atmosphere interactions is the inherent diversity of biological systems. The model frameworks currently in wide use (i.e., Plant Functional Type (PFT) models) do not begin to capture the metabolic and taxonomic diversity found in many terrestrial systems. We propose that a transition from PFT-based to individual-based modeling approaches (hereafter referred to as IBM) is essential for integrating biodiversity into research on biosphere-atmosphere interactions. The proposal emerges from our study of the interactions of forests with atmospheric processes in the context of climate change, using an individual-based forest volatile organic compounds model, UVAFME-VOC. This individual-based model simulates VOC emissions on the basis of an explicit model of forest dynamics: it computes the growth, death, and regeneration of each individual tree of different species and their competition for light, moisture, and nutrients, and obtains system-level VOC emissions by computing and summing each individual's emissions. We found that elevated O3 significantly altered the forest dynamics by favoring species that are O3-resistant, which are also producers of isoprene. Such compositional changes, on the one hand, resulted in unsuppressed forest productivity and carbon stock because of the compensation by O3-resistant species. On the other hand, with more isoprene produced by the increased number of producers, a possible positive feedback loop between tropospheric O3 and the forest emerged. We also found that climate warming will not always stimulate isoprene emissions, because warming simultaneously reduces isoprene emissions by causing a decline in the abundance of isoprene-emitting species. These results suggest that species diversity is of great significance and that individual-based modelling strategies should be applied in studying biosphere-atmosphere interactions.
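    The summation of per-individual emissions to the system level can be sketched in a few lines; the species names and emission factors below are purely illustrative, not UVAFME-VOC parameters:

    ```python
    def stand_isoprene_emission(trees, emission_factor):
        """Sum per-tree isoprene emissions to the stand level.

        trees: list of (species, leaf_biomass_kg) tuples, one per individual.
        emission_factor: species -> emission per kg leaf biomass per hour
        (illustrative values; real models also scale by light and temperature).
        """
        return sum(emission_factor.get(sp, 0.0) * biomass for sp, biomass in trees)
    ```

    Because the total is a sum over individuals, a compositional shift toward emitting species raises stand-level emissions even when total biomass is unchanged, which is the feedback mechanism the abstract describes.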

  16. An Agent-Based Modeling Template for a Cohort of Veterans with Diabetic Retinopathy.

    PubMed

    Day, Theodore Eugene; Ravi, Nathan; Xian, Hong; Brugh, Ann

    2013-01-01

    Agent-based models are valuable for examining systems where large numbers of discrete individuals interact with each other, or with some environment. Diabetic Veterans seeking eye care at a Veterans Administration hospital represent one such cohort. The objective of this study was to develop an agent-based template to be used as a model for a patient with diabetic retinopathy (DR). This template may be replicated arbitrarily many times in order to generate a large cohort which is representative of a real-world population, upon which in-silico experimentation may be conducted. Agent-based template development was performed in the Java-based computer simulation suite AnyLogic Professional 6.6. The model was informed by medical data abstracted from 535 patient records representing a retrospective cohort of current patients of the VA St. Louis Healthcare System Eye clinic. Logistic regression was performed to determine the predictors associated with advancing stages of DR. Predicted probabilities obtained from logistic regression were used to generate the stage of DR in the simulated cohort. The simulated cohort of DR patients exhibited no significant deviation from the test population of real-world patients in proportion of stage of DR, duration of diabetes mellitus (DM), or the other abstracted predictors. After 10 simulated years, patients were significantly more likely to exhibit proliferative DR (P<0.001). Agent-based modeling is an emerging platform, capable of simulating large cohorts of individuals based on manageable data abstraction efforts. The modeling method described may be useful in simulating many different conditions where the course of disease is described in categorical stages.
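    The step from regression-predicted probabilities to a simulated cohort can be sketched as below. The coefficients, predictor and function names are hypothetical placeholders, not the study's fitted model:

    ```python
    import math
    import random

    def stage_probabilities(duration_dm, coefs):
        """Multinomial-logistic-style probabilities for DR stage given duration
        of diabetes in years (illustrative coefficients, one (b0, b1) pair per
        stage; softmax of the linear predictors)."""
        scores = [math.exp(b0 + b1 * duration_dm) for b0, b1 in coefs]
        total = sum(scores)
        return [s / total for s in scores]

    def simulate_cohort(durations, coefs, rng=None):
        """Assign each simulated patient a DR stage drawn at random from the
        predicted probabilities for that patient's duration of diabetes."""
        rng = rng or random.Random(7)
        cohort = []
        for d in durations:
            probs = stage_probabilities(d, coefs)
            cohort.append(rng.choices(range(len(coefs)), weights=probs)[0])
        return cohort
    ```

    Replicating this draw once per template instance yields a synthetic cohort whose stage proportions converge on those of the source population, which is the validation check the abstract reports.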

  17. R-warfarin clearances from plasma associated with polymorphic cytochrome P450 2C19 and simulated by individual physiologically based pharmacokinetic models for 11 cynomolgus monkeys.

    PubMed

    Utoh, Masahiro; Kusama, Takashi; Miura, Tomonori; Mitsui, Marina; Kawano, Mirai; Hirano, Takahiro; Shimizu, Makiko; Uno, Yasuhiro; Yamazaki, Hiroshi

    2018-02-01

    1. Cynomolgus monkey cytochrome P450 2C19 (formerly known as P450 2C75), homologous to human P450 2C19, has been identified as R-warfarin 7-hydroxylase. In this study, simulations of R-warfarin clearance in individual cynomolgus monkeys genotyped for P450 2C19 p.[(Phe100Asn; Ala103Val; Ile112Leu)] were performed using individual simplified physiologically based pharmacokinetic (PBPK) modeling. 2. Pharmacokinetic parameters and absorption rate constants, volumes of the systemic circulation, and hepatic intrinsic clearances for individual PBPK models were estimated for eleven cynomolgus monkeys. 3. One-way ANOVA revealed significant effects of the genotype (p < 0.01) on the observed elimination half-lives and areas under the curves of R-warfarin among the homozygous mutant, heterozygous mutant, and wild-type groups. R-Warfarin clearances in individual cynomolgus monkeys genotyped for P450 2C19 were simulated by simplified PBPK modeling. The modeled hepatic intrinsic clearances were significantly associated with the P450 2C19 genotypes. The liver microsomal elimination rates of R-warfarin for individual animals after in vivo administration showed significant reductions associated with the genotype (p < 0.01). 4. This study provides important information to help simulate clearances of R-warfarin and related medicines associated with polymorphic P450 2C19 in individual cynomolgus monkeys, thereby facilitating calculation of the fraction of hepatic clearance.
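    A simplified one-compartment counterpart of the PBPK structure described above (first-order absorption, clearance-based elimination) is sketched below; the parameter values in the test are illustrative, not the fitted estimates for these animals:

    ```python
    import math

    def plasma_conc(t, dose, ka, cl, v):
        """One-compartment model with first-order absorption (Bateman equation).

        dose: administered amount; ka: absorption rate constant (1/h);
        cl: hepatic/systemic clearance (L/h); v: volume of the systemic
        circulation (L). Returns plasma concentration at time t (h).
        """
        ke = cl / v  # elimination rate constant
        if abs(ka - ke) < 1e-12:
            raise ValueError("ka must differ from ke for this closed form")
        return dose * ka / (v * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

    def half_life(cl, v):
        """Terminal elimination half-life implied by clearance and volume."""
        return math.log(2) * v / cl
    ```

    A genotype-linked reduction in intrinsic clearance lowers `cl`, which lengthens the computed half-life and enlarges the area under the curve, matching the direction of the effects reported for the mutant groups.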

  18. Physics-based statistical model and simulation method of RF propagation in urban environments

    DOEpatents

    Pao, Hsueh-Yuan; Dvorak, Steven L.

    2010-09-14

    A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment, which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model, based on the statistical parameters of the calculated modal fields, from which predictions of communications capability may be made.

  19. A Comparison of Three Approaches to Model Human Behavior

    NASA Astrophysics Data System (ADS)

    Palmius, Joel; Persson-Slumpi, Thomas

    2010-11-01

    One way of studying social processes is through the use of simulations. The use of simulations for this purpose has been established as its own field, social simulation, and has been used for studying a variety of phenomena. A simulation of a social setting can serve as an aid for thinking about that setting, and for experimenting with different parameters and studying the outcomes caused by them. When using the simulation as an aid for thinking and experimenting, the chosen simulation approach will implicitly steer the simulationist towards thinking in a certain fashion in order to fit the model. To study the implications of model choice on the understanding of a setting where human anticipation comes into play, a simulation scenario of a coffee room was constructed using three different simulation approaches: Cellular Automata, Systems Dynamics and Agent-based modeling. The practical implementations of the models were done in three different simulation packages: Stella for Systems Dynamics, CaFun for Cellular Automata and SesAM for Agent-based modeling. The models were evaluated both using Randers' criteria for model evaluation, and through introspection, where the authors reflected upon how their understanding of the scenario was steered by the model choice. Further, the software used for implementing the simulation models was evaluated, and practical considerations for the choice of software package are listed. It is concluded that the models have very different strengths. The Agent-based modeling approach offers the most intuitive support for thinking about and modeling a social setting where the behavior of the individual is in focus. The Systems Dynamics model would be preferable in situations where populations and large groups are studied as wholes, but where individual behavior is of less concern. The Cellular Automata models would be preferable where processes need to be studied from the basis of a small set of very simple rules. 
It is further concluded that in most social simulation settings the Agent-based modeling approach would be the probable choice, because the other approaches do not offer much support for modeling the anticipatory behavior of humans acting in an organization.

  20. A parallel implementation of an off-lattice individual-based model of multicellular populations

    NASA Astrophysics Data System (ADS)

    Harvey, Daniel G.; Fletcher, Alexander G.; Osborne, James M.; Pitt-Francis, Joe

    2015-07-01

    As computational models of multicellular populations include ever more detailed descriptions of biophysical and biochemical processes, the computational cost of simulating such models limits their ability to generate novel scientific hypotheses and testable predictions. While developments in microchip technology continue to increase the power of individual processors, parallel computing offers an immediate increase in available processing power. To make full use of parallel computing technology, it is necessary to develop specialised algorithms. To this end, we present a parallel algorithm for a class of off-lattice individual-based models of multicellular populations. The algorithm divides the spatial domain between computing processes and comprises communication routines that ensure the model is correctly simulated on multiple processors. The parallel algorithm is shown to accurately reproduce the results of a deterministic simulation performed using a pre-existing serial implementation. We test the scaling of computation time, memory use and load balancing as more processes are used to simulate a cell population of fixed size. We find approximate linear scaling of both speed-up and memory consumption on up to 32 processor cores. Dynamic load balancing is shown to provide speed-up for non-regular spatial distributions of cells in the case of a growing population.
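    The division of the spatial domain between computing processes can be sketched as a strip decomposition with halo regions; this is an illustrative simplification of the general idea, not the published algorithm:

    ```python
    def partition_strips(cells, n_procs, width, halo):
        """Divide a 1-D spatial domain [0, width) into equal strips, assigning
        each cell position (x, y) to a process and flagging cells that fall
        within `halo` of a strip boundary, which must be communicated to the
        neighbouring process before each force calculation.
        """
        strip = width / n_procs
        owned = [[] for _ in range(n_procs)]
        to_send = [[] for _ in range(n_procs)]  # boundary cells per process
        for x, y in cells:
            p = min(int(x // strip), n_procs - 1)
            owned[p].append((x, y))
            left, right = p * strip, (p + 1) * strip
            if (x - left < halo and p > 0) or (right - x < halo and p < n_procs - 1):
                to_send[p].append((x, y))
        return owned, to_send
    ```

    Equal-width strips give good load balance only when cells are spread uniformly; for a growing, clustered population the strip boundaries would need to move, which is the dynamic load balancing the paper evaluates.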

  1. Modelling and simulating decision processes of linked lives: An approach based on concurrent processes and stochastic race.

    PubMed

    Warnke, Tom; Reinhardt, Oliver; Klabunde, Anna; Willekens, Frans; Uhrmacher, Adelinde M

    2017-10-01

    Individuals' decision processes play a central role in understanding modern migration phenomena and other demographic processes. Their integration into agent-based computational demography depends largely on suitable support by a modelling language. We are developing the Modelling Language for Linked Lives (ML3) to describe the diverse decision processes of linked lives succinctly in continuous time. The context of individuals is modelled by networks the individual is part of, such as family ties and other social networks. Central concepts, such as behaviour conditional on agent attributes, age-dependent behaviour, and stochastic waiting times, are tightly integrated in the language. Thereby, alternative decisions are modelled by concurrent processes that compete by stochastic race. Using a migration model, we demonstrate how this allows for compact description of complex decisions, here based on the Theory of Planned Behaviour. We describe the challenges for the simulation algorithm posed by stochastic race between multiple concurrent complex decisions.
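    The stochastic-race mechanism itself is simple to sketch: each concurrent alternative samples an exponentially distributed waiting time at its own rate, and the alternative with the smallest time fires first. A minimal sketch with illustrative rates, not ML3's actual semantics:

    ```python
    import random

    def stochastic_race(rates, rng=None):
        """Pick which of several concurrent processes fires first.

        Each alternative i waits an Exp(rates[i])-distributed time; the
        alternative with the minimum sampled waiting time wins the race.
        Returns (winner_index, waiting_time).
        """
        rng = rng or random.Random(0)
        times = [(rng.expovariate(r), i) for i, r in enumerate(rates) if r > 0]
        t, i = min(times)
        return i, t
    ```

    In an agent-based demographic setting the rates might represent, say, competing hazards of migrating versus marrying this year; condition-dependent behaviour enters by making each rate a function of the agent's attributes and age.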

  2. Exploring the persistence of stream-dwelling trout populations under alternative real-world turbidity regimes with an individual-based model

    Treesearch

    Bret C. Harvey; Steven F. Railsback

    2009-01-01

    We explored the effects of elevated turbidity on stream-resident populations of coastal cutthroat trout Oncorhynchus clarkii clarkii using a spatially explicit individual-based model. Turbidity regimes were contrasted by means of 15-year simulations in a third-order stream in northwestern California. The alternative regimes were based on multiple-year, continuous...

  3. Bivalves: From individual to population modelling

    NASA Astrophysics Data System (ADS)

    Saraiva, S.; van der Meer, J.; Kooijman, S. A. L. M.; Ruardij, P.

    2014-11-01

    An individual-based population model for bivalves was designed, built and tested in a 0D approach, to simulate the population dynamics of a mussel bed located in an intertidal area. The processes at the individual level were simulated following the dynamic energy budget theory, whereas initial egg mortality, background mortality, food competition, and predation (including cannibalism) were additional population processes. Model properties were studied through the analysis of theoretical scenarios and by simulation of different mortality parameter combinations in a realistic setup, imposing environmental measurements. Realistic criteria were applied to narrow down the possible combinations of parameter values. Field observations obtained in the long-term and multi-station monitoring program were compared with the model scenarios. The realistically selected modeling scenarios were able to reproduce reasonably well the timing of some peaks in individual abundance in the mussel bed and its size distribution, but the number of individuals was not well predicted. The results suggest that mortality in the early life stages (egg and larvae) plays an important role in population dynamics, whether through initial egg mortality, larval dispersion, settlement failure or shrimp predation. Future steps include coupling the population model with a hydrodynamic and biogeochemical model to improve the simulation of egg/larvae dispersion, settlement probability and food transport, and to simulate the feedback of the organisms' activity on the water column properties, which will improve the characterization of food quantity and quality.

  4. Is my study system good enough? A case study for identifying maternal effects.

    PubMed

    Holand, Anna Marie; Steinsland, Ingelin

    2016-06-01

    In this paper, we demonstrate how simulation studies can be used to answer questions about identifiability and the consequences of omitting effects from a model. The methodology is presented through a case study where identifiability of genetic and/or individual (environmental) maternal effects is explored. Our study system is a wild house sparrow (Passer domesticus) population with known pedigree. We fit pedigree-based (generalized) linear mixed models (animal models), with and without additive genetic and individual maternal effects, and use the deviance information criterion (DIC) for choosing between these models. Pedigree and R code for the simulations are available. For this study system, the simulation studies show that only large maternal effects can be identified. The genetic maternal effect (and similarly for the individual maternal effect) has to be at least half of the total genetic variance to be identified. The consequences of omitting a maternal effect when it is present are explored. Our results indicate that the total (genetic and individual) variance is accounted for. When an individual (environmental) maternal effect is omitted from the model, this only influences the estimated (direct) individual (environmental) variance. When a genetic maternal effect is omitted from the model, both the (direct) genetic and (direct) individual variance estimates are overestimated.

  5. CDMetaPOP: An individual-based, eco-evolutionary model for spatially explicit simulation of landscape demogenetics

    USGS Publications Warehouse

    Landguth, Erin L; Bearlin, Andrew; Day, Casey; Dunham, Jason B.

    2016-01-01

    1. Combining landscape demographic and genetics models offers powerful methods for addressing questions in eco-evolutionary applications. 2. Using two illustrative examples, we present Cost–Distance Meta-POPulation, a program to simulate changes in neutral and/or selection-driven genotypes through time as a function of individual-based movement, complex spatial population dynamics, and multiple and changing landscape drivers. 3. Cost–Distance Meta-POPulation provides a novel tool for questions in landscape genetics by incorporating population viability analysis, while linking directly to conservation applications.

  6. Incorporating variability in simulations of seasonally forced phenology using integral projection models

    DOE PAGES

    Goodsman, Devin W.; Aukema, Brian H.; McDowell, Nate G.; ...

    2017-11-26

    Phenology models are becoming increasingly important tools to accurately predict how climate change will impact the life histories of organisms. We propose a class of integral projection phenology models derived from stochastic individual-based models of insect development and demography. Our derivation, which is based on the rate summation concept, produces integral projection models that capture the effect of phenotypic rate variability on insect phenology, but which are typically more computationally frugal than equivalent individual-based phenology models. We demonstrate our approach using a temperature-dependent model of the demography of the mountain pine beetle (Dendroctonus ponderosae Hopkins), an insect that kills mature pine trees. This work illustrates how a wide range of stochastic phenology models can be reformulated as integral projection models. Due to their computational efficiency, these integral projection models are suitable for deployment in large-scale simulations, such as studies of altered pest distributions under climate change.
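    A discretized integral projection step of this general kind, with a normally distributed development increment, might look as follows; the bin counts and parameters are illustrative assumptions, not the mountain pine beetle model:

    ```python
    import math

    def _norm_cdf(x, mu, sd):
        return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

    def development_kernel(mean_inc, sd_inc, bins, bin_width):
        """Probability of advancing j development bins in one step, from a
        normally distributed increment (under rate summation, mean_inc would
        be a function of the day's temperature)."""
        return [_norm_cdf((j + 1) * bin_width, mean_inc, sd_inc)
                - _norm_cdf(j * bin_width, mean_inc, sd_inc)
                for j in range(bins)]

    def ipm_step(pop, kernel, survival):
        """One projection step: apply survival, then advance development by
        convolving with the kernel; mass passing the last bin has emerged."""
        n = len(pop)
        new = [0.0] * n
        emerged = 0.0
        for i, p in enumerate(pop):
            p *= survival
            for j, q in enumerate(kernel):
                if i + j < n:
                    new[i + j] += p * q
                else:
                    emerged += p * q
        return new, emerged
    ```

    Projecting the whole density forward in one matrix-like pass is what makes this formulation cheaper than simulating each stochastic individual, while the kernel's spread still carries the phenotypic rate variability.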

  8. Aggregate age-at-marriage patterns from individual mate-search heuristics.

    PubMed

    Todd, Peter M; Billari, Francesco C; Simão, Jorge

    2005-08-01

    The distribution of age at first marriage shows well-known strong regularities across many countries and recent historical periods. We accounted for these patterns by developing agent-based models that simulate the aggregate behavior of individuals who are searching for marriage partners. Past models assumed fully rational agents with complete knowledge of the marriage market; our simulated agents used psychologically plausible simple heuristic mate search rules that adjust aspiration levels on the basis of a sequence of encounters with potential partners. Substantial individual variation must be included in the models to account for the demographically observed age-at-marriage patterns.
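    An aspiration-adjustment heuristic of the kind described can be sketched as below; the adjustment rule and parameter values are simplified assumptions, not the paper's calibrated model:

    ```python
    import random

    def age_at_marriage(rng, start_age=16.0, encounters_per_year=10,
                        learn_rate=0.9, max_age=60.0):
        """One agent's mate search: aspiration starts high and decays toward
        the qualities actually encountered; the agent marries when a candidate
        meets the current aspiration (a simple satisficing heuristic)."""
        aspiration = 1.0
        age = start_age
        while age < max_age:
            partner = rng.random()  # candidate quality in [0, 1)
            if partner >= aspiration:
                return age
            # lower the aspiration a little toward what was just encountered
            aspiration = learn_rate * aspiration + (1 - learn_rate) * partner
            age += 1.0 / encounters_per_year
        return max_age

    rng = random.Random(3)
    ages = sorted(age_at_marriage(rng) for _ in range(2000))
    ```

    Across many such agents the individual variation in encounter sequences produces a right-skewed aggregate age-at-first-marriage distribution, which is the emergent pattern the models target.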

  9. An individual-based modelling approach to estimate landscape connectivity for bighorn sheep (Ovis canadensis).

    PubMed

    Allen, Corrie H; Parrott, Lael; Kyle, Catherine

    2016-01-01

    Background. Preserving connectivity, or the ability of a landscape to support species movement, is among the most commonly recommended strategies to reduce the negative effects of climate change and human land use development on species. Connectivity analyses have traditionally used a corridor-based approach and rely heavily on least cost path modeling and circuit theory to delineate corridors. Individual-based models are gaining popularity as a potentially more ecologically realistic method of estimating landscape connectivity. However, this remains a relatively unexplored approach. We sought to explore the utility of a simple, individual-based model as a land-use management support tool in identifying and implementing landscape connectivity. Methods. We created an individual-based model of bighorn sheep (Ovis canadensis) that simulates a bighorn sheep traversing a landscape by following simple movement rules. The model was calibrated for bighorn sheep in the Okanagan Valley, British Columbia, Canada, a region containing isolated herds that are vital to conservation of the species in its northern range. Simulations were run to determine baseline connectivity between subpopulations in the study area. We then applied the model to explore two land management scenarios on simulated connectivity: restoring natural fire regimes and identifying appropriate sites for interventions that would increase road permeability for bighorn sheep. Results. This model suggests there are no continuous areas of good habitat between current subpopulations of sheep in the study area; however, a series of stepping-stones or circuitous routes could facilitate movement between subpopulations and into currently unoccupied, yet suitable, bighorn habitat. Restoring natural fire regimes or mimicking fire with prescribed burns and tree removal could considerably increase bighorn connectivity in this area. 
Moreover, several key road crossing sites that could benefit from wildlife overpasses were identified. Discussion. By linking individual-scale movement rules to landscape-scale outcomes, our individual-based model of bighorn sheep allows for the exploration of how on-the-ground management or conservation scenarios may increase functional connectivity for the species in the study area. More generally, this study highlights the usefulness of individual-based models to identify how a species makes broad use of a landscape for movement. Application of this approach can provide effective quantitative support for decision makers seeking to incorporate wildlife conservation and connectivity into land use planning.
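    A connectivity estimate from simple movement rules can be sketched as a habitat-weighted random walk; the grid, weights and step counts below are illustrative, not the calibrated bighorn model:

    ```python
    import random

    def connectivity(grid, start, target, steps=200, walks=500, rng=None):
        """Estimate functional connectivity as the fraction of biased random
        walks from `start` that reach `target` within `steps` moves.

        grid: dict (x, y) -> habitat quality in (0, 1]; each move is drawn
        among the four neighbouring cells, weighted by their quality.
        """
        rng = rng or random.Random(11)
        arrived = 0
        for _ in range(walks):
            pos = start
            for _ in range(steps):
                nbrs = [(pos[0] + dx, pos[1] + dy)
                        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
                nbrs = [n for n in nbrs if grid.get(n, 0.0) > 0.0]
                if not nbrs:
                    break  # no passable habitat; the walker is stranded
                pos = rng.choices(nbrs, weights=[grid[n] for n in nbrs])[0]
                if pos == target:
                    arrived += 1
                    break
        return arrived / walks
    ```

    Management scenarios are then compared by editing the habitat grid, e.g. raising the quality of burned cells or of cells at a proposed wildlife overpass, and re-running the walks.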

  11. IBSEM: An Individual-Based Atlantic Salmon Population Model.

    PubMed

    Castellani, Marco; Heino, Mikko; Gilbey, John; Araki, Hitoshi; Svåsand, Terje; Glover, Kevin A

    2015-01-01

    Ecology and genetics can influence the fate of individuals and populations in multiple ways. However, to date, few studies consider them when modelling the evolutionary trajectory of populations faced with admixture with non-local populations. For the Atlantic salmon, a model incorporating these elements is urgently needed because many populations are challenged by gene flow from non-local and domesticated conspecifics. We developed an Individual-Based Salmon Eco-genetic Model (IBSEM) to simulate the demographic and population genetic change of an Atlantic salmon population through its entire life-cycle. Processes such as growth, mortality, and maturation are simulated through stochastic procedures, which take into account environmental variables as well as the genotype of the individuals. IBSEM is based upon detailed empirical data from salmon biology, and parameterized to reproduce the environmental conditions and the characteristics of a wild population inhabiting a Norwegian river. Simulations demonstrated that the model consistently and reliably reproduces the characteristics of the population. Moreover, in the absence of farmed escapees, the modelled populations reach an evolutionary equilibrium that is similar to our definition of a 'wild' genotype. We assessed the sensitivity of the model to assumptions made about the fitness differences between farm and wild salmon, and evaluated the role of straying as a buffering mechanism against the intrusion of farm genes into wild populations. These results demonstrate that IBSEM is able to capture the evolutionary forces shaping the life history of wild salmon and is therefore able to model the response of populations under environmental and genetic stressors.
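    The stochastic life-cycle step that IBSEM applies to each individual (growth, survival, maturation, all conditioned on environment and genotype) can be sketched as follows. Every rule and parameter here is illustrative, not IBSEM's actual parameterization:

```python
import random

random.seed(1)

class Salmon:
    def __init__(self, growth_genotype=1.0):
        self.length_cm = 3.0                    # fry at emergence
        self.mature = False
        self.alive = True
        self.growth_genotype = growth_genotype  # hypothetical genetic scaling factor

def yearly_step(fish, temp_factor=1.0, mortality_rate=0.3):
    """One stochastic annual cycle: growth, then survival, then maturation."""
    if not fish.alive:
        return
    # Growth: environment (temp_factor) times genotype, with random noise.
    fish.length_cm += 10.0 * temp_factor * fish.growth_genotype * random.uniform(0.8, 1.2)
    # Mortality: a single Bernoulli trial per year.
    if random.random() < mortality_rate:
        fish.alive = False
        return
    # Maturation: only possible above a size threshold, decided stochastically.
    if not fish.mature and fish.length_cm > 50.0 and random.random() < 0.5:
        fish.mature = True

population = [Salmon() for _ in range(1000)]
for year in range(6):
    for f in population:
        yearly_step(f)

survivors = [f for f in population if f.alive]
print(len(survivors), sum(f.mature for f in survivors))
```

    Because each process is an independent stochastic draw per individual, population-level quantities (survival, age at maturity) emerge from the ensemble rather than being prescribed.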

  12. An individual-based model of skipjack tuna (Katsuwonus pelamis) movement in the tropical Pacific ocean

    NASA Astrophysics Data System (ADS)

    Scutt Phillips, Joe; Sen Gupta, Alex; Senina, Inna; van Sebille, Erik; Lange, Michael; Lehodey, Patrick; Hampton, John; Nicol, Simon

    2018-05-01

    The distribution of marine species is often modeled using Eulerian approaches, in which changes to population density or abundance are calculated at fixed locations in space. Conversely, Lagrangian, or individual-based, models simulate the movement of individual particles moving in continuous space, with broader-scale patterns such as distribution being an emergent property of many, potentially adaptive, individuals. These models offer advantages in examining dynamics across spatiotemporal scales and making comparisons with observations from individual-scale data. Here, we introduce and describe such a model, the Individual-based Kinesis, Advection and Movement of Ocean ANimAls model (Ikamoana), which we use to replicate the movement processes of an existing Eulerian model for marine predators (the Spatial Ecosystem and Population Dynamics Model, SEAPODYM). Ikamoana simulates the movement of either individual or groups of animals by physical ocean currents, habitat-dependent stochastic movements (kinesis), and taxis movements representing active searching behaviours. Applying our model to Pacific skipjack tuna (Katsuwonus pelamis), we show that it accurately replicates the evolution of density distribution simulated by SEAPODYM with low time-mean error and a spatial correlation of density that exceeds 0.96 at all times. We demonstrate how the Lagrangian approach permits easy tracking of individuals' trajectories for examining connectivity between different regions, and show how the model can provide independent estimates of transfer rates between commonly used assessment regions. In particular, we find that retention rates in most assessment regions are considerably smaller (by up to a factor of 2) than those estimated by the primary assessment model for this skipjack population. Moreover, these rates are sensitive to ocean state (e.g. El Niño vs. La Niña), and so assuming fixed transfer rates between regions may lead to spurious stock estimates.
A novel feature of the Lagrangian approach is that individual schools can be tracked through time, and we demonstrate that movement between two assessment regions at broad temporal scales includes extended transits through other regions at finer-scales. Finally, we discuss the utility of this modeling framework for the management of marine reserves, designing effective monitoring programmes, and exploring hypotheses regarding the behaviour of hard-to-observe oceanic animals.
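    The three movement components named above (advection by currents, habitat-dependent kinesis, and gradient-following taxis) compose additively in a Lagrangian position update. A minimal sketch, with a made-up Gaussian habitat field and illustrative rates standing in for Ikamoana's forced fields:

```python
import math
import random

random.seed(0)

def habitat(x, y):
    # Hypothetical habitat index peaking at (5, 5); a stand-in for the
    # temperature- and forage-based habitat fields a real model would use.
    return math.exp(-((x - 5.0) ** 2 + (y - 5.0) ** 2) / 10.0)

def step(x, y, dt=1.0, taxis_gain=5.0):
    """One movement step: advection + kinesis + taxis (all rates illustrative)."""
    u, v = 0.1, 0.0                        # stand-in ocean current (advection)
    kinesis = (1.0 - habitat(x, y)) * 0.5  # more random movement in poor habitat
    eps = 1e-3                             # finite-difference habitat gradient (taxis)
    gx = (habitat(x + eps, y) - habitat(x - eps, y)) / (2 * eps)
    gy = (habitat(x, y + eps) - habitat(x, y - eps)) / (2 * eps)
    x += (u + taxis_gain * gx + random.gauss(0.0, kinesis)) * dt
    y += (v + taxis_gain * gy + random.gauss(0.0, kinesis)) * dt
    return x, y

x, y = 0.0, 0.0
for _ in range(200):
    x, y = step(x, y)
print(round(x, 2), round(y, 2))   # final particle position
```

    Tracking the full `(x, y)` history of each particle is what makes trajectory-based connectivity estimates straightforward in this framework.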

  13. Simulating carbon stocks and fluxes of an African tropical montane forest with an individual-based forest model.

    PubMed

    Fischer, Rico; Ensslin, Andreas; Rutten, Gemma; Fischer, Markus; Schellenberger Costa, David; Kleyer, Michael; Hemp, Andreas; Paulick, Sebastian; Huth, Andreas

    2015-01-01

    Tropical forests are carbon-dense and highly productive ecosystems. Consequently, they play an important role in the global carbon cycle. In the present study we used an individual-based forest model (FORMIND) to analyze the carbon balances of a tropical forest. The main processes of this model are tree growth, mortality, regeneration, and competition. Model parameters were calibrated using forest inventory data from a tropical forest at Mt. Kilimanjaro. The simulation results showed that the model successfully reproduces important characteristics of tropical forests (aboveground biomass, stem size distribution and leaf area index). The estimated aboveground biomass (385 t/ha) is comparable to biomass values in the Amazon and other tropical forests in Africa. The simulated forest reveals a gross primary production of 24 tC ha⁻¹ yr⁻¹. Modeling above- and belowground carbon stocks, we analyzed the carbon balance of the investigated tropical forest. The simulated carbon balance of this old-growth forest is zero on average. This study provides an example of how forest models can be used in combination with forest inventory data to investigate forest structure and local carbon balances.

  14. Applications of agent-based modeling to nutrient movement in Lake Michigan

    EPA Science Inventory

    As part of an ongoing project aiming to provide useful information for nearshore management (harmful algal blooms, nutrient loading), we explore the value of agent-based models in Lake Michigan. Agent-based models follow many individual “agents” moving through a simul...

  15. Large eddy simulation of forest canopy flow for wildland fire modeling

    Treesearch

    Eric Mueller; William Mell; Albert Simeoni

    2014-01-01

    Large eddy simulation (LES) based computational fluid dynamics (CFD) simulators have obtained increasing attention in the wildland fire research community, as these tools allow the inclusion of important driving physics. However, due to the complexity of the models, individual aspects must be isolated and tested rigorously to ensure meaningful results. As wind is a...

  16. Shape-based approach for the estimation of individual facial mimics in craniofacial surgery planning

    NASA Astrophysics Data System (ADS)

    Gladilin, Evgeny; Zachow, Stefan; Deuflhard, Peter; Hege, Hans-Christian

    2002-05-01

    Besides static soft tissue prediction, the estimation of basic facial expressions of emotion is another important criterion for the evaluation of craniofacial surgery planning. For a realistic simulation of facial mimics, an adequate biomechanical model of soft tissue including the mimic musculature is needed. In this work, we present an approach for the modeling of arbitrarily shaped muscles and the estimation of basic individual facial mimics, which is based on a geometrical model derived from the individual tomographic data and general finite element modeling of soft tissue biomechanics.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swanson, K; Corwin, D; Rockne, R

    Purpose: To demonstrate a method of generating patient-specific, biologically-guided radiation therapy (RT) plans and to quantify and predict response to RT in glioblastoma. We investigate the biological correlates and imaging physics driving T2-MRI based response to radiation therapy using an MRI simulator. Methods: We have integrated a patient-specific biomathematical model of glioblastoma proliferation, invasion and radiotherapy with a multiobjective evolutionary algorithm for intensity-modulated RT optimization to construct individualized, biologically-guided plans. Patient-individualized simulations of the standard-of-care and optimized plans are compared in terms of several biological metrics quantified on MRI. An extension of the proliferation-invasion (PI) model is used to investigate the role of angiogenesis and its correlates in glioma response to therapy with the Proliferation-Invasion-Hypoxia-Necrosis-Angiogenesis (PIHNA) model. The PIHNA model is used with a brain tissue phantom to predict tumor-induced vasogenic edema, tumor and tissue density that is used in a multi-compartmental MRI signal equation for generation of simulated T2-weighted MRIs. Results: Applying a novel metric of treatment response (Days Gained) to the patient-individualized simulation results predicted that the optimized RT plans would have a significant impact on delaying tumor progression, with Days Gained increases from 21% to 105%. For the T2-MRI simulations, initial validation tests compared average simulated T2 values for white matter, tumor, and peripheral edema to values cited in the literature. Simulated results closely match the characteristic T2 value for each tissue. Conclusion: Patient-individualized simulations using the combination of a biomathematical model with an optimization algorithm for RT generated biologically-guided doses that decreased normal tissue dose and increased therapeutic ratio with the potential to improve survival outcomes for treatment of glioblastoma.
Simulated T2-MRI is shown to be consistent with known physics of MRI and can be used to further investigate biological drivers of imaging-based response to RT.
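    The proliferation-invasion (PI) model underlying this work is a reaction-diffusion equation, du/dt = D ∇²u + ρ u (1 − u), for normalized tumor cell density u. A minimal 1-D explicit finite-difference sketch (parameters and grid are illustrative, not patient-calibrated):

```python
# 1-D finite-difference sketch of the proliferation-invasion (PI) model:
# du/dt = D * u_xx + rho * u * (1 - u). All parameters are illustrative.
D, rho = 0.05, 0.2          # diffusion (invasion) and proliferation rates
dx, dt, nx = 1.0, 0.1, 100  # explicit scheme; D*dt/dx^2 = 0.005 is stable
u = [0.0] * nx
u[nx // 2] = 0.5            # seed tumor cell density at the domain center

for _ in range(2000):       # integrate to t = 200
    lap = [0.0] * nx        # discrete Laplacian (boundaries held at zero)
    for i in range(1, nx - 1):
        lap[i] = (u[i - 1] - 2 * u[i] + u[i + 1]) / dx ** 2
    # Logistic growth toward carrying capacity 1, plus diffusion.
    u = [min(1.0, ui + dt * (D * li + rho * ui * (1 - ui)))
         for ui, li in zip(u, lap)]

print(round(max(u), 3))     # core density saturates near carrying capacity
```

    The characteristic behavior is a traveling wave of invasion with speed 2·sqrt(D·ρ), which is what makes the model useful for projecting untreated growth between imaging time points.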

  18. A standard protocol for describing individual-based and agent-based models

    USGS Publications Warehouse

    Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.

    2006-01-01

    Simulation models that describe autonomous individual organisms (individual-based models, IBMs) or agents (agent-based models, ABMs) have become a widely used tool, not only in ecology, but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD as a first step for establishing a more detailed common format of the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.
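    The three-block, seven-element structure named in the abstract can be laid out as a reusable template. The prompts attached to each element below are paraphrases for illustration, not the protocol's official wording:

```python
# Skeleton of an ODD model description: three blocks subdivided into the
# seven elements listed in the abstract. Prompt texts are illustrative.
odd_description = {
    "Overview": {
        "Purpose": "What question is the model designed to answer?",
        "State variables and scales": "Entities, their attributes, spatial and temporal extent.",
        "Process overview and scheduling": "Which processes run, in what order, each time step.",
    },
    "Design concepts": {
        "Design concepts": "Emergence, adaptation, sensing, interaction, stochasticity, observation.",
    },
    "Details": {
        "Initialization": "Initial state of the model world.",
        "Input": "External driving data, if any.",
        "Submodels": "Equations and parameters for each process.",
    },
}

elements = [name for block in odd_description.values() for name in block]
print(len(elements))  # the seven ODD elements
```

    Filling in each prompt for a given IBM/ABM yields a description another modeller can implement independently, which is the protocol's stated goal.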

  19. Development and verification of an agent-based model of opinion leadership.

    PubMed

    Anderson, Christine A; Titler, Marita G

    2014-09-27

    The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model performed in a manner consistent with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists.
The development and testing of agent-based models is an iterative process. The opinion leader model presented here provides a basic structure for continued model development, ongoing verification, and the establishment of validation procedures, including empirical data collection.
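    A parameter sweep of the kind described, enumerating every combination of discrete input levels, is a cross product over the parameter space. The factor names and levels below are hypothetical; they are chosen only so that the combination count matches the 288 scenarios reported in the abstract (4 × 4 × 3 × 3 × 2 = 288):

```python
import itertools

# Hypothetical parameter levels for an opinion-leadership ABM sweep.
# 4 * 4 * 3 * 3 * 2 = 288 combinations, echoing the sweep size in the abstract.
levels = {
    "motive_strength": [0.25, 0.5, 0.75, 1.0],
    "credibility": [0.25, 0.5, 0.75, 1.0],
    "uncertainty": [0.2, 0.5, 0.8],
    "network_density": [0.1, 0.3, 0.5],
    "seeking_behavior": ["active", "passive"],
}

# One dict per scenario; each would parameterize a full simulation run.
scenarios = [dict(zip(levels, combo)) for combo in itertools.product(*levels.values())]
print(len(scenarios))
```

    Running the model once per scenario and analyzing outputs across the grid is what lets verification check that input-output relationships match the conceptual model.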

  20. A framework for the use of agent based modeling to simulate inter- and intraindividual variation in human behaviors

    EPA Science Inventory

    Simulation of human behavior in exposure modeling is a complex task. Traditionally, inter-individual variation in human activity has been modeled by drawing from a pool of single day time-activity diaries such as the US EPA Consolidated Human Activity Database (CHAD). Here, an ag...

  1. Gap models and their individual-based relatives in the assessment of the consequences of global change

    NASA Astrophysics Data System (ADS)

    Shugart, Herman H.; Wang, Bin; Fischer, Rico; Ma, Jianyong; Fang, Jing; Yan, Xiaodong; Huth, Andreas; Armstrong, Amanda H.

    2018-03-01

    Individual-based models (IBMs) of complex systems emerged in the 1960s and early 1970s, across diverse disciplines from astronomy to zoology. Ecological IBMs arose, with seemingly independent origins, out of the tradition of understanding the dynamics of ecosystems from a ‘bottom-up’ accounting of the interactions of the parts. Individual trees are principal among the parts of forests. Because these models are computationally demanding, they have prospered as the power of digital computers has increased exponentially over the decades following the 1970s. This review will focus on a class of forest IBMs called gap models. Gap models simulate the changes in forests by simulating the birth, growth and death of each individual tree on a small plot of land. The summation of these plots comprises a forest (or set of sample plots on a forested landscape or region). Other, more aggregated forest IBMs have been used in global applications, including cohort-based models, ecosystem demography models, etc. Gap models have been used to provide the parameters for these bulk models. Currently, gap models have grown from local-scale to continental-scale and even global-scale applications to assess the potential consequences of climate change on natural forests. Modifications to the models have enabled simulation of disturbances including fire, insect outbreak and harvest. Our objective in this review is to provide the reader with an overview of the history, motivation and applications, including theoretical applications, of these models. In a time of concern over global changes, gap models are essential tools to understand forest responses to climate change, modified disturbance regimes and other change agents. Development of forest surveys to provide the starting points for simulations and better estimates of the behavior of the diversity of tree species in response to the environment are continuing needs for improvement for these and other IBMs.
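    The gap-model cycle of birth, growth and death on a small plot can be sketched in a few lines. Everything below (capacity, growth increment, mortality rate) is an illustrative toy, not any published gap model's parameterization:

```python
import random

random.seed(7)

# Toy gap-model plot: annual birth, growth and death of individual trees.
PLOT_CAPACITY = 200.0  # hypothetical maximum total basal area per plot

def basal_area(trees):
    return sum(d * d * 0.01 for d in trees)  # crude diameter -> area proxy

def simulate_plot(years=100):
    trees = []  # each tree is represented by its diameter in cm
    for _ in range(years):
        # Birth: a few saplings establish if the plot is not yet crowded.
        if basal_area(trees) < PLOT_CAPACITY:
            trees.extend([1.0] * random.randint(0, 3))
        # Growth: diameter increment shrinks as the plot fills (competition).
        crowding = min(1.0, basal_area(trees) / PLOT_CAPACITY)
        trees = [d + 0.5 * (1.0 - crowding) for d in trees]
        # Death: small background mortality per tree per year.
        trees = [d for d in trees if random.random() > 0.02]
    return trees

trees = simulate_plot()
print(len(trees), round(max(trees), 1))
```

    Averaging many such plots is how a gap model scales individual tree demography up to forest- or landscape-level structure.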

  2. A new computational growth model for sea urchin skeletons.

    PubMed

    Zachos, Louis G

    2009-08-07

    A new computational model has been developed to simulate growth of regular sea urchin skeletons. The model incorporates the processes of plate addition and individual plate growth into a composite model of whole-body (somatic) growth. A simple developmental model based on hypothetical morphogens underlies the assumptions used to define the simulated growth processes. The data model is based on a Delaunay triangulation of plate growth center points, using the dual Voronoi polygons to define plate topologies. A spherical frame of reference is used for growth calculations, with affine deformation of the sphere (based on a Young-Laplace membrane model) to result in an urchin-like three-dimensional form. The model verifies that the patterns of coronal plates in general meet the criteria of Voronoi polygonalization, that a morphogen/threshold inhibition model for plate addition results in the alternating plate addition pattern characteristic of sea urchins, and that application of the Bertalanffy growth model to individual plates results in simulated somatic growth that approximates that seen in living urchins. The model suggests avenues of research that could explain some of the distinctions between modern sea urchins and the much more disparate groups of forms that characterized the Paleozoic Era.
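    The core geometric idea (plates whose topology is the Voronoi polygonalization of their growth centers) can be illustrated with a nearest-center assignment on a plane, a flat stand-in for the spherical frame of reference the paper uses. The centers, grid and sizes here are hypothetical:

```python
import math
import random

random.seed(4)

# Hypothetical plate growth centers scattered on a plane.
centers = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(12)]

def nearest_center(x, y):
    """Voronoi rule: every point belongs to its closest growth center."""
    return min(range(len(centers)),
               key=lambda i: math.dist((x, y), centers[i]))

# Rasterize the Voronoi diagram on a 40x40 grid: each cell is claimed by
# exactly one plate, so cell counts approximate plate areas.
grid = [[nearest_center(x * 0.25, y * 0.25) for x in range(40)] for y in range(40)]
plate_sizes = [sum(row.count(i) for row in grid) for i in range(len(centers))]
print(sum(plate_sizes), len(plate_sizes))
```

    In the actual model the dual Delaunay triangulation of the centers gives plate adjacency, and each plate's area then grows under a Bertalanffy curve; this sketch shows only the tessellation step.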

  3. Incorporating individual health-protective decisions into disease transmission models: a mathematical framework.

    PubMed

    Durham, David P; Casman, Elizabeth A

    2012-03-07

    It is anticipated that the next generation of computational epidemic models will simulate both infectious disease transmission and dynamic human behaviour change. Individual agents within a simulation will not only infect one another, but will also have situational awareness and a decision algorithm that enables them to modify their behaviour. This paper develops such a model of behavioural response, presenting a mathematical interpretation of a well-known psychological model of individual decision making, the health belief model, suitable for incorporation within an agent-based disease-transmission model. We formalize the health belief model and demonstrate its application in modelling the prevalence of facemask use observed over the course of the 2003 Hong Kong SARS epidemic, a well-documented example of behaviour change in response to a disease outbreak.
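    A common way to make the health belief model computable, as this framework requires, is to combine its constructs (perceived susceptibility, severity, benefits, barriers) into a probability of adopting the protective behavior. The logistic form and weights below are purely illustrative, not the paper's fitted formalization:

```python
import math

def adoption_probability(susceptibility, severity, benefits, barriers, bias=-2.0):
    """Logistic combination of four health-belief-model constructs (all in [0, 1]).
    Weights and the logistic link are illustrative assumptions, not fitted values."""
    score = bias + 1.5 * susceptibility + 1.0 * severity + 1.0 * benefits - 1.5 * barriers
    return 1.0 / (1.0 + math.exp(-score))

# Agents perceiving low threat rarely adopt facemasks; during an outbreak,
# heightened susceptibility and severity push the probability up.
low_threat = adoption_probability(susceptibility=0.1, severity=0.2, benefits=0.5, barriers=0.5)
outbreak = adoption_probability(susceptibility=0.9, severity=0.9, benefits=0.8, barriers=0.3)
print(round(low_threat, 3), round(outbreak, 3))
```

    Inside an agent-based epidemic model, each agent would re-evaluate this probability as its situational awareness (e.g. local case counts) updates the construct values.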

  4. Incorporating individual health-protective decisions into disease transmission models: a mathematical framework

    PubMed Central

    Durham, David P.; Casman, Elizabeth A.

    2012-01-01

    It is anticipated that the next generation of computational epidemic models will simulate both infectious disease transmission and dynamic human behaviour change. Individual agents within a simulation will not only infect one another, but will also have situational awareness and a decision algorithm that enables them to modify their behaviour. This paper develops such a model of behavioural response, presenting a mathematical interpretation of a well-known psychological model of individual decision making, the health belief model, suitable for incorporation within an agent-based disease-transmission model. We formalize the health belief model and demonstrate its application in modelling the prevalence of facemask use observed over the course of the 2003 Hong Kong SARS epidemic, a well-documented example of behaviour change in response to a disease outbreak. PMID:21775324

  5. Meta-Analysis of a Continuous Outcome Combining Individual Patient Data and Aggregate Data: A Method Based on Simulated Individual Patient Data

    ERIC Educational Resources Information Center

    Yamaguchi, Yusuke; Sakamoto, Wataru; Goto, Masashi; Staessen, Jan A.; Wang, Jiguang; Gueyffier, Francois; Riley, Richard D.

    2014-01-01

    When some trials provide individual patient data (IPD) and the others provide only aggregate data (AD), meta-analysis methods for combining IPD and AD are required. We propose a method that reconstructs the missing IPD for AD trials by a Bayesian sampling procedure and then applies an IPD meta-analysis model to the mixture of simulated IPD and…

  6. Model-Based Economic Evaluation of Treatments for Depression: A Systematic Literature Review.

    PubMed

    Kolovos, Spyros; Bosmans, Judith E; Riper, Heleen; Chevreul, Karine; Coupé, Veerle M H; van Tulder, Maurits W

    2017-09-01

    An increasing number of model-based studies that evaluate the cost effectiveness of treatments for depression are being published. These studies have different characteristics and use different simulation methods. We aimed to systematically review model-based studies evaluating the cost effectiveness of treatments for depression and examine which modelling technique is most appropriate for simulating the natural course of depression. The literature search was conducted in the databases PubMed, EMBASE and PsycInfo between 1 January 2002 and 1 October 2016. Studies were eligible if they used a health economic model with quality-adjusted life-years or disability-adjusted life-years as an outcome measure. Data related to various methodological characteristics were extracted from the included studies. The available modelling techniques were evaluated based on 11 predefined criteria. This methodological review included 41 model-based studies, of which 21 used decision trees (DTs), 15 used cohort-based state-transition Markov models (CMMs), two used individual-based state-transition models (ISMs), and three used discrete-event simulation (DES) models. Just over half of the studies (54%) evaluated antidepressants compared with a control condition. The data sources, time horizons, cycle lengths, perspectives adopted and number of health states/events all varied widely between the included studies. DTs scored positively in four of the 11 criteria, CMMs in five, ISMs in six, and DES models in seven. There were substantial methodological differences between the studies. Since the individual history of each patient is important for the prognosis of depression, DES and ISM simulation methods may be more appropriate than the others for a pragmatic representation of the course of depression. However, direct comparisons between the available modelling techniques are necessary to yield firm conclusions.
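    Of the modelling techniques compared, a cohort-based state-transition Markov model (CMM) is the simplest to sketch: a cohort distribution is multiplied through a transition matrix each cycle. The three states and probabilities below are illustrative, not drawn from any reviewed study:

```python
# Three-state cohort Markov sketch for a depression course model.
# States and transition probabilities are illustrative only.
states = ["remission", "depression", "relapse"]
P = {
    "remission":  {"remission": 0.85, "depression": 0.10, "relapse": 0.05},
    "depression": {"remission": 0.30, "depression": 0.60, "relapse": 0.10},
    "relapse":    {"remission": 0.25, "depression": 0.25, "relapse": 0.50},
}

# Entire cohort starts in the depressed state.
cohort = {"remission": 0.0, "depression": 1.0, "relapse": 0.0}
for cycle in range(24):  # e.g. 24 monthly cycles
    cohort = {s: sum(cohort[f] * P[f][s] for f in states) for s in states}

print({s: round(cohort[s], 3) for s in states})
```

    The review's point is that a CMM like this carries no individual history: an individual-based state-transition model (ISM) or DES would instead track each patient's own sequence of episodes, which matters because prior episodes affect depression prognosis.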

  7. The Lagrangian Ensemble metamodel for simulating plankton ecosystems

    NASA Astrophysics Data System (ADS)

    Woods, J. D.

    2005-10-01

    This paper presents a detailed account of the Lagrangian Ensemble (LE) metamodel for simulating plankton ecosystems. It uses agent-based modelling to describe the life histories of many thousands of individual plankters. The demography of each plankton population is computed from those life histories. So too is bio-optical and biochemical feedback to the environment. The resulting “virtual ecosystem” is a comprehensive simulation of the plankton ecosystem. It is based on phenotypic equations for individual micro-organisms. LE modelling differs significantly from population-based modelling. The latter uses prognostic equations to compute demography and biofeedback directly. LE modelling diagnoses them from the properties of individual micro-organisms, whose behaviour is computed from prognostic equations. That indirect approach permits the ecosystem to adjust gracefully to changes in exogenous forcing. The paper starts with theory: it defines the Lagrangian Ensemble metamodel and explains how LE code performs a number of computations “behind the curtain”. They include budgeting chemicals, and deriving biofeedback and demography from individuals. The next section describes the practice of LE modelling. It starts with designing a model that complies with the LE metamodel. Then it describes the scenario for exogenous properties that provide the computation with initial and boundary conditions. These procedures differ significantly from those used in population-based modelling. The next section shows how LE modelling is used in research, teaching and planning. The practice depends largely on hindcasting to overcome the limits to predictability of weather forecasting. The scientific method explains observable ecosystem phenomena in terms of finer-grained processes that cannot be observed, but which are controlled by the basic laws of physics, chemistry and biology. What-If? 
Prediction (WIP), used for planning, extends hindcasting by adding events that describe natural or man-made hazards and remedial actions. Verification is based on the Ecological Turing Test, which takes account of uncertainties in the observed and simulated versions of a target ecological phenomenon. The rest of the paper is devoted to a case study designed to show what LE modelling offers the biological oceanographer. The case study is presented in two parts. The first documents the WB model (Woods & Barkmann, 1994) and scenario used to simulate the ecosystem in a mesocosm moored in deep water off the Azores. The second part illustrates the emergent properties of that virtual ecosystem. The behaviour and development of an individual plankton lineage are revealed by an audit trail of the agent used in the computation. The fields of environmental properties reveal the impact of biofeedback. The fields of demographic properties show how changes in individuals cumulatively affect the birth and death rates of their population. This case study documents the virtual ecosystem used by Woods, Perilli and Barkmann (2005; hereafter WPB) to investigate the stability of simulations created by the Lagrangian Ensemble metamodel. The Azores virtual ecosystem was created and analysed on the Virtual Ecology Workbench (VEW), which is described briefly in the Appendix.
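    The defining LE move, diagnosing population demography from individual agents rather than prescribing it with prognostic population equations, can be shown in miniature. The agent rules and rates below are invented for illustration, not taken from the WB model:

```python
import random

random.seed(11)

# Each agent carries its own state; population-level birth and death rates
# are diagnosed from the ensemble rather than prescribed up front.
# All rules and rates below are illustrative.
agents = [{"alive": True, "age": random.randint(0, 30)} for _ in range(500)]

births = deaths = 0
for a in agents:
    if random.random() < 0.05:                      # individual mortality rule
        a["alive"] = False
        deaths += 1
    elif a["age"] > 10 and random.random() < 0.10:  # individual reproduction rule
        births += 1

survivors = sum(a["alive"] for a in agents)
birth_rate = births / len(agents)   # demography diagnosed, not prognosed
death_rate = deaths / len(agents)
print(survivors + births, round(birth_rate, 3), round(death_rate, 3))
```

    Because the rates are emergent, the simulated population can adjust gracefully when exogenous forcing changes, instead of being locked to fixed demographic coefficients.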

  8. Contrast of degraded and restored stream habitat using an individual-based salmon model

    Treesearch

    S. F. Railsback; M. Gard; Bret Harvey; Jason White; J.K.H. Zimmerman

    2013-01-01

    Stream habitat restoration projects are popular, but can be expensive and difficult to evaluate. We describe inSALMO, an individual-based model designed to predict habitat effects on freshwater life stages (spawning through juvenile out-migration) of salmon. We applied inSALMO to Clear Creek, California, simulating the production of total and large (>5 cm FL)...

  9. Traffic and Driving Simulator Based on Architecture of Interactive Motion.

    PubMed

    Paz, Alexander; Veeramisti, Naveen; Khaddar, Romesh; de la Fuente-Mella, Hanns; Modorcea, Luiza

    2015-01-01

    This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid meso-microscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination.

  10. Traffic and Driving Simulator Based on Architecture of Interactive Motion

    PubMed Central

    Paz, Alexander; Veeramisti, Naveen; Khaddar, Romesh; de la Fuente-Mella, Hanns; Modorcea, Luiza

    2015-01-01

    This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid meso-microscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination. PMID:26491711

  11. Analysis of Predominance of Sexual Reproduction and Quadruplicity of Bases by Computer Simulation

    NASA Astrophysics Data System (ADS)

    Dasgupta, Subinay

    We have presented elsewhere a model for computer simulation of a colony of individuals reproducing sexually, by meiotic parthenogenesis and by cloning. Our algorithm takes into account food and space restriction, and attacks of some diseases. Each individual is characterized by a string of L “base” units, each of which can be of four types (quaternary model) or two types (binary model). Our previous report was for the case of L=12 (quaternary model) and L=24 (binary model) and contained the result that the fluctuation of population was the lowest for sexual reproduction with four types of base units. The present communication reports that the same conclusion also holds for L=10 (quaternary model) and L=20 (binary model), and for L=8 (quaternary model) and L=16 (binary model). This model, however, suffers from the drawback that it does not show the effect of aging. A modification of the model was attempted to remove this drawback, but the results were not encouraging.
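    The paired genome lengths compared in the study (L=10 quaternary vs. L=20 binary, etc.) are chosen so that both alphabets encode the same information, since each quaternary base carries two bits. A small sketch of the two representations (the alphabets themselves are illustrative labels):

```python
import random

random.seed(3)

# Individuals carry strings of L "base" units. Halving L for the quaternary
# alphabet keeps the information content equal: 4**10 == 2**20.
QUATERNARY = "ACGT"  # illustrative four-letter alphabet
BINARY = "01"        # illustrative two-letter alphabet

def make_genome(alphabet, length):
    """Random genome: one uniformly drawn base per position."""
    return "".join(random.choice(alphabet) for _ in range(length))

quat = make_genome(QUATERNARY, 10)
bina = make_genome(BINARY, 20)
print(len(quat), len(bina), 4 ** 10 == 2 ** 20)
```

    Holding information content fixed in this way lets the comparison isolate the effect of base quadruplicity itself on population fluctuation.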

  12. Integrating macro and micro scale approaches in the agent-based modeling of residential dynamics

    NASA Astrophysics Data System (ADS)

    Saeedi, Sara

    2018-06-01

    With the advancement of computational modeling and simulation (M&S) methods as well as data collection technologies, urban dynamics modeling has substantially improved over the last several decades. The complex urban dynamics processes are most effectively modeled not at the macro-scale, but following a bottom-up approach, by simulating the decisions of individual entities, or residents. Agent-based modeling (ABM) provides the key to a dynamic M&S framework that is able to integrate socioeconomic with environmental models, and to operate at both micro and macro geographical scales. In this study, a multi-agent system is proposed to simulate residential dynamics by considering spatiotemporal land use changes. In the proposed ABM, macro-scale land use change prediction is modeled by an Artificial Neural Network (ANN) and deployed as the agent environment, while micro-scale residential dynamics behaviors are autonomously implemented by household agents. These two levels of simulation interact and jointly drive the urbanization process in an urban area of Tehran, Iran. The model simulates the behavior of individual households in finding ideal locations to dwell. The household agents are divided into three main groups based on their income rank, and they are further classified into different categories based on a number of attributes. These attributes determine the households' preferences for finding new dwellings and change with time. The ABM environment is represented by a land-use map in which the properties of the land parcels change dynamically over the simulation time. The outputs of this model are a set of maps showing the pattern of different groups of households in the city. These patterns can be used by city planners to find optimum locations for building new residential units or adding new services to the city.
The simulation results show that combining macro- and micro-level simulation can give full play to the potential of the ABM to understand the driving mechanism of urbanization and provide decision-making support for urban management.
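
The micro-scale half of such a model often reduces to households scoring candidate parcels against their preferences. A minimal sketch of that idea, in which every attribute name, weight, and value is a hypothetical illustration rather than the study's actual specification:

```python
def utility(parcel, prefs):
    """Weighted-sum score of a candidate parcel for one household."""
    return sum(weight * parcel[attr] for attr, weight in prefs.items())

def choose_dwelling(parcels, prefs):
    """Index of the parcel this household prefers most."""
    return max(range(len(parcels)), key=lambda i: utility(parcels[i], prefs))

# Hypothetical normalized parcel attributes (0 = worst, 1 = best).
parcels = [
    {"affordability": 0.2, "accessibility": 0.9, "environment": 0.4},
    {"affordability": 0.8, "accessibility": 0.5, "environment": 0.7},
]
# Hypothetical preference weights for two income groups.
low_income = {"affordability": 0.7, "accessibility": 0.2, "environment": 0.1}
high_income = {"affordability": 0.1, "accessibility": 0.6, "environment": 0.3}
```

With these illustrative numbers the low-income household selects the more affordable parcel and the high-income household the better-connected one; in a full ABM the weights would shift over time and the parcel attributes would be refreshed each step by the macro-scale land-use layer.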

  13. IBSEM: An Individual-Based Atlantic Salmon Population Model

    PubMed Central

    Castellani, Marco; Heino, Mikko; Gilbey, John; Araki, Hitoshi; Svåsand, Terje; Glover, Kevin A.

    2015-01-01

    Ecology and genetics can influence the fate of individuals and populations in multiple ways. However, to date, few studies consider them when modelling the evolutionary trajectory of populations faced with admixture with non-local populations. For the Atlantic salmon, a model incorporating these elements is urgently needed because many populations are challenged with gene-flow from non-local and domesticated conspecifics. We developed an Individual-Based Salmon Eco-genetic Model (IBSEM) to simulate the demographic and population genetic change of an Atlantic salmon population through its entire life-cycle. Processes such as growth, mortality, and maturation are simulated through stochastic procedures, which take into account environmental variables as well as the genotype of the individuals. IBSEM is based upon detailed empirical data from salmon biology, and parameterized to reproduce the environmental conditions and the characteristics of a wild population inhabiting a Norwegian river. Simulations demonstrated that the model consistently and reliably reproduces the characteristics of the population. Moreover, in the absence of farmed escapees, the modelled populations reach an evolutionary equilibrium that is similar to our definition of a ‘wild’ genotype. We assessed the sensitivity of the model in the face of assumptions made on the fitness differences between farm and wild salmon, and evaluated the role of straying as a buffering mechanism against the intrusion of farm genes into wild populations. These results demonstrate that IBSEM is able to capture the evolutionary forces shaping the life history of wild salmon and is therefore able to model the response of populations under environmental and genetic stressors. PMID:26383256

  14. Transition to parenthood: the role of social interaction and endogenous networks.

    PubMed

    Diaz, Belinda Aparicio; Fent, Thomas; Prskawetz, Alexia; Bernardi, Laura

    2011-05-01

    Empirical studies indicate that the transition to parenthood is influenced by an individual's peer group. To study the mechanisms creating interdependencies across individuals' transition to parenthood and its timing, we apply an agent-based simulation model. We build a one-sex model and provide agents with three different characteristics: age, intended education, and parity. Agents endogenously form their network based on social closeness. Network members may then influence the agents' transition to higher parity levels. Our numerical simulations indicate that accounting for social interactions can explain the shift of first-birth probabilities in Austria during the period 1984 to 2004. Moreover, we apply our model to forecast age-specific fertility rates up to 2016.
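
The social-multiplier mechanism described here can be caricatured in a few lines. In this sketch the closeness rule, ages, and hazard function are all invented for illustration (the actual model uses age, intended education, and parity): each agent links to her socially closest peers, and her annual first-birth probability rises with the share of network members who already have a child.

```python
import random

rng = random.Random(9)

class Agent:
    def __init__(self, age):
        self.age = age
        self.parity = 0

def closest_peers(agent, others, k=3):
    """Endogenous network formation by social closeness,
    reduced here to age distance alone."""
    return sorted(others, key=lambda o: abs(o.age - agent.age))[:k]

def first_birth_prob(agent, network, base=0.05, social=0.10):
    """Baseline hazard plus a social-interaction term that grows with
    the share of network members who are already parents."""
    parents = sum(1 for o in network if o.parity > 0)
    return base + social * parents / len(network)

agents = [Agent(age=rng.randint(20, 35)) for _ in range(200)]
for a in rng.sample(agents, 50):  # seed some parents
    a.parity = 1

# One simulated year: childless agents may transition to parenthood.
for a in agents:
    net = closest_peers(a, [o for o in agents if o is not a])
    if a.parity == 0 and rng.random() < first_birth_prob(a, net):
        a.parity = 1

n_parents = sum(a.parity > 0 for a in agents)
```

Iterating such a step over cohorts and calendar years is what lets interaction effects shift the aggregate first-birth schedule.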

  15. SEARCH: Spatially Explicit Animal Response to Composition of Habitat.

    PubMed

    Pauli, Benjamin P; McCann, Nicholas P; Zollner, Patrick A; Cummings, Robert; Gilbert, Jonathan H; Gustafson, Eric J

    2013-01-01

    Complex decisions dramatically affect animal dispersal and space use. Dispersing individuals respond to a combination of fine-scale environmental stimuli and internal attributes. Individual-based modeling offers a valuable approach for the investigation of such interactions because it combines the heterogeneity of animal behaviors with spatial detail. Most individual-based models (IBMs), however, vastly oversimplify animal behavior and such behavioral minimalism diminishes the value of these models. We present program SEARCH (Spatially Explicit Animal Response to Composition of Habitat), a spatially explicit, individual-based, population model of animal dispersal through realistic landscapes. SEARCH uses values in Geographic Information System (GIS) maps to apply rules that animals follow during dispersal, thus allowing virtual animals to respond to fine-scale features of the landscape and maintain a detailed memory of areas sensed during movement. SEARCH also incorporates temporally dynamic landscapes so that the environment to which virtual animals respond can change during the course of a simulation. Animals in SEARCH are behaviorally dynamic and able to respond to stimuli based upon their individual experiences. Therefore, SEARCH is able to model behavioral traits of dispersing animals at fine scales and with many dynamic aspects. Such added complexity allows investigation of unique ecological questions. To illustrate SEARCH's capabilities, we simulated case studies using three mammals. We examined the impact of seasonally variable food resources on the weight distribution of dispersing raccoons (Procyon lotor), the effect of temporally dynamic mortality pressure in combination with various levels of behavioral responsiveness in eastern chipmunks (Tamias striatus), and the impact of behavioral plasticity and home range selection on disperser mortality and weight change in virtual American martens (Martes americana). 
These simulations highlight the relevance of SEARCH for a variety of applications and illustrate benefits it can provide for conservation planning.

  16. Incorporating variability in simulations of seasonally forced phenology using integral projection models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodsman, Devin W.; Aukema, Brian H.; McDowell, Nate G.

    Phenology models are becoming increasingly important tools to accurately predict how climate change will impact the life histories of organisms. We propose a class of integral projection phenology models derived from stochastic individual-based models of insect development and demography. Our derivation, which is based on the rate-summation concept, produces integral projection models that capture the effect of phenotypic rate variability on insect phenology, but which are typically more computationally frugal than equivalent individual-based phenology models. We demonstrate our approach using a temperature-dependent model of the demography of the mountain pine beetle (Dendroctonus ponderosae Hopkins), an insect that kills mature pine trees. This work illustrates how a wide range of stochastic phenology models can be reformulated as integral projection models. Due to their computational efficiency, these integral projection models are suitable for deployment in large-scale simulations, such as studies of altered pest distributions under climate change.
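
The rate-summation idea underlying this derivation can be sketched in a few lines: each individual accumulates a temperature-dependent development rate day by day and completes a stage once the accumulated fraction reaches one, with phenotypic variability entering as a random multiplier on the rate. The rate function and all parameter values below are illustrative assumptions, not values from the paper:

```python
import random

def dev_rate(temp_c):
    """Hypothetical linear development rate above a 5 C threshold (per day)."""
    return max(0.0, 0.02 * (temp_c - 5.0))

def days_to_emergence(daily_temps, rate_noise=0.2, rng=random):
    """Rate summation: accumulate daily development until the fraction
    completed reaches 1.0; rate_noise injects phenotypic variability."""
    frac = 0.0
    for day, t in enumerate(daily_temps, start=1):
        frac += dev_rate(t) * rng.uniform(1 - rate_noise, 1 + rate_noise)
        if frac >= 1.0:
            return day
    return None  # development not completed within the season

rng = random.Random(1)
temps = [15.0] * 120  # a constant 15 C season, purely illustrative
cohort = [days_to_emergence(temps, rng=rng) for _ in range(1000)]
```

The integral projection reformulation replaces this per-individual loop with a kernel that projects the whole distribution of development fractions forward at once, which is where the computational savings come from.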

  17. iCrowd: agent-based behavior modeling and crowd simulator

    NASA Astrophysics Data System (ADS)

    Kountouriotis, Vassilios I.; Paterakis, Manolis; Thomopoulos, Stelios C. A.

    2016-05-01

    Initially designed in the context of the TASS (Total Airport Security System) FP-7 project, the Crowd Simulation platform developed by the Integrated Systems Lab of the Institute of Informatics and Telecommunications at N.C.S.R. Demokritos has evolved into a complete domain-independent agent-based behavior simulator with an emphasis on crowd behavior and building evacuation simulation. Under continuous development, it reflects an effort to implement a modern, multithreaded, data-oriented simulation engine employing the latest state-of-the-art programming technologies and paradigms. It is based on an extensible architecture that separates core services from the individual layers of agent behavior, offering a concrete simulation kernel designed for high performance and stability. Its primary goal is to deliver an abstract platform to facilitate implementation of several Agent-Based Simulation solutions with applicability in several domains of knowledge, such as: (i) crowd behavior simulation during [in/out]door evacuation; (ii) Non-Player Character AI for game-oriented applications and gamification activities; (iii) vessel traffic modeling and simulation for Maritime Security and Surveillance applications; (iv) urban and highway traffic and transportation simulations; (v) social behavior simulation and modeling.

  18. Technical Note: Approximate Bayesian parameterization of a complex tropical forest model

    NASA Astrophysics Data System (ADS)

    Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.

    2013-08-01

    Inverse parameter estimation of process-based models is a long-standing problem in ecology and evolution. A key problem of inverse parameter estimation is to define a metric that quantifies how well model predictions fit to the data. Such a metric can be expressed by general cost or objective functions, but statistical inversion approaches are based on a particular metric, the probability of observing the data given the model, known as the likelihood. Deriving likelihoods for dynamic models requires making assumptions about the probability for observations to deviate from mean model predictions. For technical reasons, these assumptions are usually derived without explicit consideration of the processes in the simulation. Only in recent years have new methods become available that allow generating likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional MCMC, performs well in retrieving known parameter values from virtual field data generated by the forest model. We analyze the results of the parameter estimation, examine the sensitivity towards the choice and aggregation of model outputs and observed data (summary statistics), and show results from using this method to fit the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss differences of this approach to Approximate Bayesian Computing (ABC), another commonly used method to generate simulation-based likelihood approximations. 
Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can successfully be applied to process-based models of high complexity. The methodology is particularly suited to heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models in ecology and evolution.
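
The core of such a parametric likelihood approximation can be sketched with a toy stochastic model standing in for the forest simulator: at each proposed parameter value the model is run repeatedly, a normal distribution is fitted to the simulated summary statistic, and the observed summary is evaluated under that fit inside a plain Metropolis sampler. Everything below (the toy model, the flat prior, step sizes, run lengths) is an illustrative assumption, not the FORMIND setup:

```python
import math
import random

rng = random.Random(0)

def run_model(theta):
    """Toy stochastic simulator standing in for the forest model:
    one summary statistic, equal to theta plus process noise."""
    return rng.gauss(theta, 1.0)

def approx_log_lik(theta, observed, n_sim=100):
    """Parametric likelihood approximation: fit a normal distribution to
    repeated simulations and evaluate the observed summary under it."""
    sims = [run_model(theta) for _ in range(n_sim)]
    mu = sum(sims) / n_sim
    var = sum((s - mu) ** 2 for s in sims) / (n_sim - 1)
    return -0.5 * math.log(2 * math.pi * var) - (observed - mu) ** 2 / (2 * var)

def metropolis(observed, n_steps=2000, step=0.5, theta0=0.0):
    """Plain Metropolis sampler over theta (flat prior) driven by the
    simulation-based likelihood approximation."""
    theta, ll = theta0, approx_log_lik(theta0, observed)
    chain = []
    for _ in range(n_steps):
        prop = theta + rng.gauss(0.0, step)
        ll_prop = approx_log_lik(prop, observed)
        if math.log(rng.random()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        chain.append(theta)
    return chain

chain = metropolis(observed=3.0)
posterior_mean = sum(chain[500:]) / len(chain[500:])
```

After discarding burn-in, the chain concentrates around the parameter value that generated the observation; with a parameter-rich simulator the expensive part is the repeated model runs inside `approx_log_lik`, which is why the choice and aggregation of summary statistics matters.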

  19. Agent-based modeling: a new approach for theory building in social psychology.

    PubMed

    Smith, Eliot R; Conrey, Frederica R

    2007-02-01

    Most social and psychological phenomena occur not as the result of isolated decisions by individuals but rather as the result of repeated interactions between multiple individuals over time. Yet the theory-building and modeling techniques most commonly used in social psychology are less than ideal for understanding such dynamic and interactive processes. This article describes an alternative approach to theory building, agent-based modeling (ABM), which involves simulation of large numbers of autonomous agents that interact with each other and with a simulated environment and the observation of emergent patterns from their interactions. The authors believe that the ABM approach is better able than prevailing approaches in the field, variable-based modeling (VBM) techniques such as causal modeling, to capture types of complex, dynamic, interactive processes so important in the social world. The article elaborates several important contrasts between ABM and VBM and offers specific recommendations for learning more and applying the ABM approach.

  20. Simulating Carbon Stocks and Fluxes of an African Tropical Montane Forest with an Individual-Based Forest Model

    PubMed Central

    Fischer, Rico; Ensslin, Andreas; Rutten, Gemma; Fischer, Markus; Schellenberger Costa, David; Kleyer, Michael; Hemp, Andreas; Paulick, Sebastian; Huth, Andreas

    2015-01-01

    Tropical forests are carbon-dense and highly productive ecosystems. Consequently, they play an important role in the global carbon cycle. In the present study we used an individual-based forest model (FORMIND) to analyze the carbon balances of a tropical forest. The main processes of this model are tree growth, mortality, regeneration, and competition. Model parameters were calibrated using forest inventory data from a tropical forest at Mt. Kilimanjaro. The simulation results showed that the model successfully reproduces important characteristics of tropical forests (aboveground biomass, stem size distribution and leaf area index). The estimated aboveground biomass (385 t/ha) is comparable to biomass values in the Amazon and other tropical forests in Africa. The simulated forest reveals a gross primary production of 24 t C/ha/yr. Modeling above- and belowground carbon stocks, we analyzed the carbon balance of the investigated tropical forest. The simulated carbon balance of this old-growth forest is zero on average. This study provides an example of how forest models can be used in combination with forest inventory data to investigate forest structure and local carbon balances. PMID:25915854

  1. Bringing consistency to simulation of population models--Poisson simulation as a bridge between micro and macro simulation.

    PubMed

    Gustafsson, Leif; Sternad, Mikael

    2007-10-01

    Population models concern collections of discrete entities such as atoms, cells, humans, animals, etc., where the focus is on the number of entities in a population. Because of the complexity of such models, simulation is usually needed to reproduce their complete dynamic and stochastic behaviour. Two main types of simulation models are used for different purposes, namely micro-simulation models, where each individual is described with its particular attributes and behaviour, and macro-simulation models based on stochastic differential equations, where the population is described in aggregated terms by the number of individuals in different states. Consistency between micro- and macro-models is a crucial but often neglected aspect. This paper demonstrates how the Poisson Simulation technique can be used to produce a population macro-model consistent with the corresponding micro-model. This is accomplished by defining Poisson Simulation in strictly mathematical terms as a series of Poisson processes that generate sequences of Poisson distributions with dynamically varying parameters. The method can be applied to any population model. It provides the unique stochastic and dynamic macro-model consistent with a correct micro-model. The paper also presents a general macro form for stochastic and dynamic population models. In an appendix, Poisson Simulation is compared with Markov Simulation, showing a number of advantages. In particular, aggregation into state variables and aggregation of many events per time-step make Poisson Simulation orders of magnitude faster than Markov Simulation. Furthermore, much larger and more complicated models can be built and executed with Poisson Simulation than is possible with the Markov approach.
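
A minimal sketch of the Poisson Simulation technique for a birth-death population (the rates, step size, and initial size are arbitrary illustrative choices): at every time step, the number of birth and death events is drawn from Poisson distributions whose parameters are recomputed from the current aggregated state.

```python
import math
import random

rng = random.Random(7)

def poisson(lam):
    """Poisson random variate via Knuth's method (fine for modest rates)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate(n0=100, birth=0.1, death=0.1, dt=0.1, steps=1000):
    """Poisson Simulation of a birth-death macro-model: per time step,
    event counts are Poisson with dynamically varying parameter
    rate * N * dt, which keeps the aggregated macro-model consistent
    with the corresponding individual-level micro-model."""
    n, path = n0, [n0]
    for _ in range(steps):
        births = poisson(birth * n * dt)
        deaths = poisson(death * n * dt)
        n = max(0, n + births - deaths)
        path.append(n)
    return path

path = simulate()
```

Because many events are aggregated into each step, one Poisson draw replaces what a micro-simulation would handle as many individual updates, which is where the speed advantage over per-individual and per-event approaches comes from.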

  2. Effects of streamflow diversion on a fish population: combining empirical data and individual-based models in a site-specific evaluation

    Treesearch

    Bret C. Harvey; Jason L. White; Rodney J. Nakamoto; Steven F. Railsback

    2014-01-01

    Resource managers commonly face the need to evaluate the ecological consequences of specific water diversions of small streams. We addressed this need by conducting 4 years of biophysical monitoring of stream reaches above and below a diversion and applying two individual-based models of salmonid fish that simulated different levels of behavioral complexity. The...

  3. Reconstructing the 2003/2004 H3N2 influenza epidemic in Switzerland with a spatially explicit, individual-based model

    PubMed Central

    2011-01-01

    Background Simulation models of influenza spread play an important role for pandemic preparedness. However, as the world has not faced a severe pandemic for decades, except the rather mild H1N1 one in 2009, pandemic influenza models are inherently hypothetical and validation is, thus, difficult. We aim at reconstructing a recent seasonal influenza epidemic that occurred in Switzerland and deem this to be a promising validation strategy for models of influenza spread. Methods We present a spatially explicit, individual-based simulation model of influenza spread. The simulation model is based upon (i) simulated human travel data, (ii) data on human contact patterns and (iii) empirical knowledge on the epidemiology of influenza. For model validation we compare the simulation outcomes with empirical knowledge regarding (i) the shape of the epidemic curve, overall infection rate and reproduction number, (ii) age-dependent infection rates and time of infection, (iii) spatial patterns. Results The simulation model is capable of reproducing the shape of the 2003/2004 H3N2 epidemic curve of Switzerland and generates an overall infection rate (14.9 percent) and reproduction numbers (between 1.2 and 1.3), which are realistic for seasonal influenza epidemics. Age and spatial patterns observed in empirical data are also reflected by the model: the highest infection rates are in children between 5 and 14 years of age, and the disease spreads along the main transport axes from west to east. Conclusions We show that finding evidence for the validity of simulation models of influenza spread by challenging them with seasonal influenza outbreak data is possible and promising. Simulation models for pandemic spread gain more credibility if they are able to reproduce seasonal influenza outbreaks. For more robust modelling of seasonal influenza, serological data complementing sentinel information would be beneficial. PMID:21554680

  4. Group navigation and the "many-wrongs principle" in models of animal movement.

    PubMed

    Codling, E A; Pitchford, J W; Simpson, S D

    2007-07-01

    Traditional studies of animal navigation over both long and short distances have usually considered the orientation ability of the individual only, without reference to the implications of group membership. However, recent work has suggested that being in a group can significantly improve the ability of an individual to align toward and reach a target direction or point, even when all group members have limited navigational ability and there are no leaders. This effect is known as the "many-wrongs principle" since the large number of individual navigational errors across the group are suppressed by interactions and group cohesion. In this paper, we simulate the many-wrongs principle using a simple individual-based model of movement based on a biased random walk that includes group interactions. We study the ability of the group as a whole to reach a target given different levels of individual navigation error, group size, interaction radius, and environmental turbulence. In scenarios with low levels of environmental turbulence, simulation results demonstrate a navigational benefit from group membership, particularly for small group sizes. In contrast, when movement takes place in a highly turbulent environment, simulation results suggest that the best strategy is to navigate as individuals rather than as a group.
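
The statistical core of the many-wrongs effect can be sketched directly. This strips out the movement dynamics, interaction radius, and turbulence of the full biased-random-walk model and keeps only the error averaging: each group member estimates the direction to a target at angle 0 with Gaussian error, and the group follows the circular mean of those estimates.

```python
import math
import random

rng = random.Random(3)

def circular_mean(angles):
    """Direction of the vector sum of unit headings."""
    s = sum(math.sin(a) for a in angles)
    c = sum(math.cos(a) for a in angles)
    return math.atan2(s, c)

def mean_heading_error(group_size, nav_error_sd, trials=2000):
    """Average absolute error of the group's consensus heading toward a
    target at angle 0, each member's estimate carrying Gaussian error."""
    total = 0.0
    for _ in range(trials):
        estimates = [rng.gauss(0.0, nav_error_sd) for _ in range(group_size)]
        total += abs(circular_mean(estimates))
    return total / trials

solo = mean_heading_error(group_size=1, nav_error_sd=0.5)
group = mean_heading_error(group_size=10, nav_error_sd=0.5)
```

The consensus error shrinks roughly with the square root of group size, which is the navigational benefit of membership; environmental turbulence in the full model degrades this averaging, consistent with the finding that individuals do better alone in highly turbulent conditions.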

  5. Importance of fish behaviour in modelling conservation problems: food limitation as an example

    Treesearch

    Steven Railsback; Bret Harvey

    2011-01-01

    Simulation experiments using the inSTREAM individual-based brown trout Salmo trutta population model explored the role of individual adaptive behaviour in food limitation, as an example of how behaviour can affect managers’ understanding of conservation problems. The model includes many natural complexities in habitat (spatial and temporal variation in characteristics...

  6. Tailor-made heart simulation predicts the effect of cardiac resynchronization therapy in a canine model of heart failure.

    PubMed

    Panthee, Nirmal; Okada, Jun-ichi; Washio, Takumi; Mochizuki, Youhei; Suzuki, Ryohei; Koyama, Hidekazu; Ono, Minoru; Hisada, Toshiaki; Sugiura, Seiryo

    2016-07-01

    Despite extensive studies on clinical indices for the selection of patient candidates for cardiac resynchronization therapy (CRT), approximately 30% of selected patients do not respond to this therapy. Herein, we examined whether CRT simulations based on individualized realistic three-dimensional heart models can predict the therapeutic effect of CRT in a canine model of heart failure with left bundle branch block. In four canine models of failing heart with dyssynchrony, individualized three-dimensional heart models reproducing the electromechanical activity of each animal were created based on computed tomographic images. CRT simulations were performed for 25 patterns of three ventricular pacing lead positions. Lead positions producing the best and the worst therapeutic effects were selected in each model. The validity of predictions was tested in acute experiments in which hearts were paced from the sites identified by simulations. We found significant correlations between the experimentally observed improvement in ejection fraction (EF) and the predicted improvements in EF (P<0.01) or the maximum value of the derivative of left ventricular pressure (P<0.01). The optimal lead positions produced better outcomes compared with the worst positioning in all dogs studied, although there were significant variations in responses. Variations in ventricular wall thickness among the dogs may have contributed to these responses. Thus CRT simulations using the individualized three-dimensional heart models can predict acute hemodynamic improvement, and help determine the optimal positions of the pacing lead. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Aspen: A microsimulation model of the economy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Basu, N.; Pryor, R.J.; Quint, T.

    1996-10-01

    This report presents Aspen, a new agent-based microeconomic simulation model of the U.S. economy under development at Sandia National Laboratories. The model is notable because it allows a large number of individual economic agents to be modeled at a high level of detail and with a great degree of freedom. Some features of Aspen are (a) a sophisticated message-passing system that allows individual pairs of agents to communicate, (b) the use of genetic algorithms to simulate the learning of certain agents, and (c) a detailed financial sector that includes a banking system and a bond market. Results from runs of the model are also presented.

  8. Integrating Intracellular Dynamics Using CompuCell3D and Bionetsolver: Applications to Multiscale Modelling of Cancer Cell Growth and Invasion

    PubMed Central

    Andasari, Vivi; Roper, Ryan T.; Swat, Maciej H.; Chaplain, Mark A. J.

    2012-01-01

    In this paper we present a multiscale, individual-based simulation environment that integrates CompuCell3D for lattice-based modelling on the cellular level and Bionetsolver for intracellular modelling. CompuCell3D or CC3D provides an implementation of the lattice-based Cellular Potts Model or CPM (also known as the Glazier-Graner-Hogeweg or GGH model) and a Monte Carlo method based on the Metropolis algorithm for system evolution. The integration of CC3D for cellular systems with Bionetsolver for subcellular systems enables us to develop a multiscale mathematical model and to study the evolution of cell behaviour due to the dynamics inside of the cells, capturing aspects of cell behaviour and interaction that are not possible using continuum approaches. We then apply this multiscale modelling technique to a model of cancer growth and invasion, based on a previously published model of Ramis-Conde et al. (2008) where individual cell behaviour is driven by a molecular network describing the dynamics of E-cadherin and β-catenin. In this model, which we refer to as the centre-based model, an alternative individual-based modelling technique was used, namely, a lattice-free approach. In many respects, the GGH or CPM methodology and the approach of the centre-based model have the same overall goal, that is to mimic behaviours and interactions of biological cells. Although the mathematical foundations and computational implementations of the two approaches are very different, the results of the presented simulations are compatible with each other, suggesting that by using individual-based approaches we can formulate a natural way of describing complex multi-cell, multiscale models. The ability to easily reproduce results of one modelling approach using an alternative approach is also essential from a model cross-validation standpoint and also helps to identify any modelling artefacts specific to a given computational approach. PMID:22461894
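
The Metropolis rule at the heart of the GGH/CPM system-evolution step can be sketched in isolation. In practice the effective-energy change ΔE for a proposed lattice-site copy comes from the CPM Hamiltonian (adhesion, volume, and other constraint terms); the temperature value here is an arbitrary illustration:

```python
import math
import random

rng = random.Random(5)

def metropolis_accept(delta_e, temperature=1.0):
    """GGH/CPM-style acceptance for a proposed lattice-site copy:
    always accept if it lowers the effective energy, otherwise
    accept with Boltzmann probability exp(-delta_e / T)."""
    if delta_e <= 0:
        return True
    return rng.random() < math.exp(-delta_e / temperature)

# Unfavourable proposals pass at roughly exp(-2) ~= 0.135 of the time
# for delta_e = 2 and T = 1.
accepts = sum(metropolis_accept(2.0) for _ in range(10_000))
```

Repeating this accept/reject step over randomly chosen lattice sites is what drives the stochastic evolution of cell shapes and positions in the CPM.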

  9. A multi-model framework for simulating wildlife population response to land-use and climate change

    USGS Publications Warehouse

    McRae, B.H.; Schumaker, N.H.; McKane, R.B.; Busing, R.T.; Solomon, A.M.; Burdick, C.A.

    2008-01-01

    Reliable assessments of how human activities will affect wildlife populations are essential for making scientifically defensible resource management decisions. A principal challenge of predicting effects of proposed management, development, or conservation actions is the need to incorporate multiple biotic and abiotic factors, including land-use and climate change, that interact to affect wildlife habitat and populations through time. Here we demonstrate how models of land-use, climate change, and other dynamic factors can be integrated into a coherent framework for predicting wildlife population trends. Our framework starts with land-use and climate change models developed for a region of interest. Vegetation changes through time under alternative future scenarios are predicted using an individual-based plant community model. These predictions are combined with spatially explicit animal habitat models to map changes in the distribution and quality of wildlife habitat expected under the various scenarios. Animal population responses to habitat changes and other factors are then projected using a flexible, individual-based animal population model. As an example application, we simulated animal population trends under three future land-use scenarios and four climate change scenarios in the Cascade Range of western Oregon. We chose two birds with contrasting habitat preferences for our simulations: winter wrens (Troglodytes troglodytes), which are most abundant in mature conifer forests, and song sparrows (Melospiza melodia), which prefer more open, shrubby habitats. We used climate and land-use predictions from previously published studies, as well as previously published predictions of vegetation responses using FORCLIM, an individual-based forest dynamics simulator. Vegetation predictions were integrated with other factors in PATCH, a spatially explicit, individual-based animal population simulator.
By incorporating effects of landscape history and limited dispersal, our framework predicted population changes that typically exceeded those expected based on changes in mean habitat suitability alone. Although land-use had greater impacts on habitat quality than did climate change in our simulations, we found that small changes in vital rates resulting from climate change or other stressors can have large consequences for population trajectories. The ability to integrate bottom-up demographic processes like these with top-down constraints imposed by climate and land-use in a dynamic modeling environment is a key advantage of our approach. The resulting framework should allow researchers to synthesize existing empirical evidence, and to explore complex interactions that are difficult or impossible to capture through piecemeal modeling approaches. © 2008 Elsevier B.V.

  10. Individual differences in transcranial electrical stimulation current density

    PubMed Central

    Russell, Michael J; Goodman, Theodore; Pierson, Ronald; Shepherd, Shane; Wang, Qiang; Groshong, Bennett; Wiley, David F

    2013-01-01

    Transcranial electrical stimulation (TCES) is effective in treating many conditions, but it has not been possible to accurately forecast current density within the complex anatomy of a given subject's head. We sought to predict and verify TCES current densities and determine the variability of these current distributions in patient-specific models based on magnetic resonance imaging (MRI) data. Two experiments were performed. The first experiment estimated conductivity from MRIs and compared the current density results against actual measurements from the scalp surface of 3 subjects. In the second experiment, virtual electrodes were placed on the scalps of 18 subjects to model simulated current densities with 2 mA of virtually applied stimulation. This procedure was repeated for 4 electrode locations. Current densities were then calculated for 75 brain regions. Comparison of modeled and measured external current in experiment 1 yielded a correlation of r = .93. In experiment 2, modeled individual differences were greatest near the electrodes (ten-fold differences were common), but simulated current was found in all regions of the brain. Sites that were distant from the electrodes (e.g. hypothalamus) typically showed two-fold individual differences. MRI-based modeling can effectively predict current densities in individual brains. Significant variation occurs between subjects with the same applied electrode configuration. Individualized MRI-based modeling should be considered in place of the 10-20 system when accurate TCES is needed. PMID:24285948

  11. The impact of individual-level heterogeneity on estimated infectious disease burden: a simulation study.

    PubMed

    McDonald, Scott A; Devleesschauwer, Brecht; Wallinga, Jacco

    2016-12-08

Disease burden is not evenly distributed within a population; this uneven distribution can be due to individual heterogeneity in progression rates between disease stages. Composite measures of disease burden that are based on disease progression models, such as the disability-adjusted life year (DALY), are widely used to quantify the current and future burden of infectious diseases. Our goal was to investigate to what extent ignoring the presence of heterogeneity could bias DALY computation. Simulations using individual-based models for hypothetical infectious diseases with short and long natural histories were run assuming either "population-averaged" progression probabilities between disease stages, or progression probabilities that were influenced by an a priori defined individual-level frailty (i.e., heterogeneity in disease risk) distribution, and DALYs were calculated. Under the assumption of heterogeneity in transition rates and increasing frailty with age, the short natural history disease model predicted 14% fewer DALYs compared with the homogeneous population assumption. Simulations of a long natural history disease indicated that assuming homogeneity in transition rates when heterogeneity was present could overestimate total DALYs, in the present case by 4% (95% quantile interval: 1-8%). The consequences of ignoring population heterogeneity should be considered when defining transition parameters for natural history models and when interpreting the resulting disease burden estimates.
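
The bias mechanism the study describes can be illustrated with a minimal Monte Carlo sketch, in which each individual's progression probability is scaled by a frailty factor and DALYs are accrued per case. All parameter values and the disease structure below are hypothetical, not taken from the paper.

```python
import random

def simulate_dalys(n, base_prob, frailty_sd=0.0, yld=0.2, yll=10.0, seed=1):
    """Monte Carlo DALY total for a hypothetical two-outcome disease.

    Each case progresses to a severe stage with probability base_prob scaled
    by an individual frailty factor (1.0 = population average). Severe cases
    accrue years of life lost (yll); all cases accrue years lived with
    disability (yld). All values are illustrative, not from the study.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        frailty = max(0.0, rng.gauss(1.0, frailty_sd))  # clipped normal frailty
        p = min(1.0, base_prob * frailty)
        total += yld + (yll if rng.random() < p else 0.0)
    return total

homogeneous = simulate_dalys(10_000, base_prob=0.1)                   # no heterogeneity
heterogeneous = simulate_dalys(10_000, base_prob=0.1, frailty_sd=0.5)
```

Comparing the two totals shows how a frailty distribution alone shifts the aggregate burden estimate even when the average progression probability is unchanged.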

  12. Agent-based models of cellular systems.

    PubMed

    Cannata, Nicola; Corradini, Flavio; Merelli, Emanuela; Tesei, Luca

    2013-01-01

Software agents are particularly suitable for engineering models and simulations of cellular systems. In a very natural and intuitive manner, individual software components are therein delegated to reproduce "in silico" the behavior of individual components of living systems at a given level of resolution. Individuals' actions and interactions among individuals allow complex collective behavior to emerge. In this chapter we first introduce readers to software agents and multi-agent systems, reviewing the evolution of agent-based modeling of biomolecular systems in the last decade. We then describe the main tools, platforms, and methodologies available for programming societies of agents, including toolkits that do not require advanced programming skills.

  13. Building Single-Cell Models of Planktonic Metabolism Using PSAMM

    NASA Astrophysics Data System (ADS)

    Dufault-Thompson, K.; Zhang, Y.; Steffensen, J. L.

    2016-02-01

Genome-scale models (GEMs) of metabolic networks simulate the metabolic activities of individual cells by integrating omics data with biochemical and physiological measurements. GEMs have been applied to simulate various photo-, chemo-, and heterotrophic organisms and have provided significant insights into the function and evolution of planktonic cells. Despite the rapid accumulation of GEMs, challenges remain in assembling the individual cell-based models into community-level models. Among various problems, the lack of consistency in model representation and model quality checking has hindered the integration of individual GEMs and can lead to erroneous conclusions in the development of new modeling algorithms. Here, we present a Portable System for the Analysis of Metabolic Models (PSAMM). Along with the software, a novel model representation format was developed to enhance the readability of model files and permit the inclusion of heterogeneous, model-specific annotation information. A number of quality checking procedures were also implemented in PSAMM to ensure stoichiometric balance and to identify unused reactions. Using a case study of Shewanella piezotolerans WP3, we demonstrated the application of PSAMM in simulating the coupling of carbon utilization and energy production pathways under low-temperature and high-pressure stress. Applying PSAMM, we have also analyzed over 50 GEMs in the current literature and released an updated collection of the models with corrections on a number of common inconsistencies. Overall, PSAMM opens up new opportunities for integrating individual GEMs for the construction and mathematical simulation of community-level models in the scope of entire ecosystems.
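
One of the quality checks described, stoichiometric balance, amounts to verifying that every reaction conserves atoms. A minimal illustration of the idea follows; this is not PSAMM's actual API, and the function names and data layout are invented.

```python
from collections import Counter

def side_atoms(formulas, stoich):
    """Total atom counts on one side of a reaction.

    formulas: metabolite -> {element: count}; stoich: metabolite -> coefficient.
    """
    atoms = Counter()
    for met, coeff in stoich.items():
        for element, n in formulas[met].items():
            atoms[element] += coeff * n
    return atoms

def is_balanced(formulas, reactants, products):
    """A reaction is stoichiometrically balanced if both sides carry
    exactly the same atoms."""
    return side_atoms(formulas, reactants) == side_atoms(formulas, products)

# Illustrative check: 2 H2 + O2 -> 2 H2O balances; H2 + O2 -> 2 H2O does not.
formulas = {"h2": {"H": 2}, "o2": {"O": 2}, "h2o": {"H": 2, "O": 1}}
ok = is_balanced(formulas, {"h2": 2, "o2": 1}, {"h2o": 2})   # True
bad = is_balanced(formulas, {"h2": 1, "o2": 1}, {"h2o": 2})  # False
```

A real checker would additionally track charge and handle exchange reactions, which intentionally break mass balance at the model boundary.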

14. Epidemiological Simulation System, Version 2.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2004-01-30

EpiSims uses a detailed simulation of disease spread to evaluate demographically and geographically targeted biological threat reduction strategies. Abstract: EpiSims simulates the spread of disease and analyzes the consequences of intervention strategies in a large urban area at the level of individuals. The simulation combines models of three dynamical systems: urban social networks, disease transmission, and within-host progression of a disease. Validated population mobility and activity generation technology provides the social network models; disease models are based on a fusion of expert opinion and available data. EpiSims provides a previously unavailable detailed representation of the course of an outbreak in an urban area. A letter of August 16, 2002 from the Office of Homeland Security states: "Ability of EpiSims to provide comprehensive data on daily activity patterns of individuals makes it far superior to traditional SIR models — clearly had an impact on pre-attack smallpox vaccination policy." EpiSims leverages a unique Los Alamos National Laboratory resource — the population mobility and activity data developed by TRANSIMS (Transportation Analysis and SiMulation System) — to create epidemiological analyses at an unprecedented level of detail. We create models of microscopic (individual-level) physical and biological processes from which, through simulation, emerge the macroscopic (urban regional level) quantities that are the inputs to alternative models. For example, the contact patterns of individuals in different demographic groups determine the overall mixing rates of those groups. The characteristics of person-to-person transmission, together with these contact patterns, determine the reproductive numbers — how many people will be infected on average by each case. Mixing rates and reproductive numbers are the basic parameters of other epidemiological models.
Because interventions — and people’s reactions to them — are ultimately applied at the individual level, EpiSims is uniquely suited to evaluate their macroscopic consequences. For example, the debate over the logistics of targeted vaccination for smallpox, and thus the magnitude of the threat it poses, can best be resolved through an individual-based approach. EpiSims is the only available analytical tool using the individual-based approach that can scale to populations of a million or more without introducing ad-hoc assumptions about the nature of the social network. Impact: The first study commissioned for the EpiSims project was to analyze the effectiveness of targeted vaccination and isolation strategies in the aftermath of a covert release of smallpox at a crowded urban location. In particular we compared casualties and resources required for targeted strategies with those in the case of large-scale quarantine and/or mass vaccination campaigns. We produced this analysis in a sixty-day effort, while prototype software was still under development, and delivered it to the Office of Homeland Security in June 2002. More recently, EpiSims provided casualty estimates and cost/benefit analyses for various proposed responses to an attack with pneumonic plague during the TOPOFF-2 exercise. Capabilities: EpiSims is designed to simulate human-human transmissible disease, but it is part of a suite of tools that naturally allow it to estimate individual exposures to air-borne or water-borne spread. Combined with data on animal density and mobility, EpiSims could simulate diseases spread by non-human vectors. EpiSims incorporates reactions of individuals, and is particularly powerful if those reactions are correlated with demographics. It provides a standard for modeling scenarios that cuts across agencies.
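
The relationship sketched above, contact patterns plus per-contact transmission determining reproductive numbers, can be written as a one-line calculation. The group parameters below are invented for illustration; EpiSims derives such quantities from simulated individual-level contacts rather than assuming them.

```python
def mean_r0(groups):
    """Population-averaged basic reproductive number under homogeneous
    within-group mixing: R0 = contacts/day * p(transmit) * infectious days,
    summed over demographic groups weighted by population share.

    groups: list of (population_share, contacts_per_day, p_transmit,
    infectious_days) tuples.
    """
    return sum(share * c * p * d for share, c, p, d in groups)

groups = [
    (0.3, 20.0, 0.02, 5.0),  # hypothetical high-contact group (e.g. students)
    (0.7, 10.0, 0.02, 5.0),  # hypothetical lower-contact group
]
r0 = mean_r0(groups)  # ≈ 1.3
```

This is exactly the kind of aggregate parameter that compartmental SIR-type models take as input, and that an individual-based simulation produces as output.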

  15. Parallelization of a Fully-Distributed Hydrologic Model using Sub-basin Partitioning

    NASA Astrophysics Data System (ADS)

    Vivoni, E. R.; Mniszewski, S.; Fasel, P.; Springer, E.; Ivanov, V. Y.; Bras, R. L.

    2005-12-01

A primary obstacle towards advances in watershed simulations has been the limited computational capacity available to most models. The growing trend of model complexity, data availability and physical representation has not been matched by adequate developments in computational efficiency. This situation has created a serious bottleneck that limits existing distributed hydrologic models to small domains and short simulations. In this study, we present novel developments in the parallelization of a fully-distributed hydrologic model. Our work is based on the TIN-based Real-time Integrated Basin Simulator (tRIBS), which provides continuous hydrologic simulation using a multiple resolution representation of complex terrain based on a triangulated irregular network (TIN). While the use of TINs reduces computational demand, the sequential version of the model is currently limited over large basins (>10,000 km²) and long simulation periods (>1 year). To address this, a parallel MPI-based version of the tRIBS model has been implemented and tested using high performance computing resources at Los Alamos National Laboratory. Our approach utilizes domain decomposition based on sub-basin partitioning of the watershed. A stream reach graph based on the channel network structure is used to guide the sub-basin partitioning. Individual sub-basins or sub-graphs of sub-basins are assigned to separate processors to carry out internal hydrologic computations (e.g. rainfall-runoff transformation). Routed streamflow from each sub-basin forms the major hydrologic data exchange along the stream reach graph. Individual sub-basins also share subsurface hydrologic fluxes across adjacent boundaries. We demonstrate how the sub-basin partitioning provides computational feasibility and efficiency for a set of test watersheds in northeastern Oklahoma.
We compare the performance of the sequential and parallelized versions to highlight the efficiency gained as the number of processors increases. We also discuss how the coupled use of TINs and parallel processing can lead to feasible long-term simulations in regional watersheds while preserving basin properties at high-resolution.
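
Assigning sub-basins to processors is at heart a load-balancing problem. A minimal greedy sketch follows, using drainage area as a stand-in for computational cost; this is illustrative only, since tRIBS additionally uses the stream reach graph to keep hydrologically connected sub-basins together.

```python
def partition_subbasins(areas, n_procs):
    """Greedy load balancing: take sub-basins in order of decreasing drainage
    area (a proxy for computational cost) and assign each to the currently
    least-loaded processor."""
    loads = [0.0] * n_procs
    assignment = {}
    for basin, area in sorted(areas.items(), key=lambda kv: -kv[1]):
        proc = loads.index(min(loads))  # least-loaded processor so far
        assignment[basin] = proc
        loads[proc] += area
    return assignment, loads

areas = {"A": 40.0, "B": 30.0, "C": 20.0, "D": 10.0}  # invented drainage areas
assignment, loads = partition_subbasins(areas, n_procs=2)
```

With these toy areas the greedy rule happens to balance the two processors exactly; in general it guarantees only an approximate balance, which is why production partitioners also weigh communication costs along the stream network.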

16. Simulated Models Suggest That Price per Calorie Is the Dominant Price Metric That Low-Income Individuals Use for Food Decision Making

    PubMed Central

    2016-01-01

    Background: The price of food has long been considered one of the major factors that affects food choices. However, the price metric (e.g., the price of food per calorie or the price of food per gram) that individuals predominantly use when making food choices is unclear. Understanding which price metric is used is especially important for studying individuals with severe budget constraints because food price then becomes even more important in food choice. Objective: We assessed which price metric is used by low-income individuals in deciding what to eat. Methods: With the use of data from NHANES and the USDA Food and Nutrient Database for Dietary Studies, we created an agent-based model that simulated an environment representing the US population, wherein individuals were modeled as agents with a specific weight, age, and income. In our model, agents made dietary food choices while meeting their budget limits with the use of 1 of 3 different metrics for decision making: energy cost (price per calorie), unit price (price per gram), and serving price (price per serving). The food consumption patterns generated by our model were compared to 3 independent data sets. Results: The food choice behaviors observed in 2 of the data sets were found to be closest to the simulated dietary patterns generated by the price per calorie metric. The behaviors observed in the third data set were equidistant from the patterns generated by price per calorie and price per serving metrics, whereas results generated by the price per gram metric were further away. Conclusions: Our simulations suggest that dietary food choice based on price per calorie best matches actual consumption patterns and may therefore be the most salient price metric for low-income populations. PMID:27655757
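
The decision rule being compared can be caricatured in a few lines: rank foods by the chosen price metric and buy servings until the budget or the calorie target binds. This is a hypothetical sketch with invented food items, far simpler than the paper's agent-based model.

```python
def choose_foods(foods, budget, kcal_target):
    """Greedy diet under a budget using the energy-cost metric: rank foods by
    price per calorie and buy servings of the cheapest calories first until
    the budget is exhausted or the calorie target is met."""
    ranked = sorted(foods, key=lambda f: f[1] / f[2])  # price per kcal, ascending
    chosen, spent, kcal = [], 0.0, 0.0
    for name, price, energy in ranked:
        while spent + price <= budget and kcal < kcal_target:
            chosen.append(name)
            spent += price
            kcal += energy
    return chosen, spent, kcal

# Invented items: (name, price per serving, kcal per serving)
foods = [("rice", 0.5, 600.0), ("salad", 2.0, 100.0)]
chosen, spent, kcal = choose_foods(foods, budget=3.0, kcal_target=2000.0)
```

Swapping the sort key to price per gram or price per serving yields the alternative metrics the study tested, which is what makes the simulated consumption patterns diverge.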

  17. Simulated Models Suggest That Price per Calorie Is the Dominant Price Metric That Low-Income Individuals Use for Food Decision Making.

    PubMed

    Beheshti, Rahmatollah; Igusa, Takeru; Jones-Smith, Jessica

    2016-11-01

    The price of food has long been considered one of the major factors that affects food choices. However, the price metric (e.g., the price of food per calorie or the price of food per gram) that individuals predominantly use when making food choices is unclear. Understanding which price metric is used is especially important for studying individuals with severe budget constraints because food price then becomes even more important in food choice. We assessed which price metric is used by low-income individuals in deciding what to eat. With the use of data from NHANES and the USDA Food and Nutrient Database for Dietary Studies, we created an agent-based model that simulated an environment representing the US population, wherein individuals were modeled as agents with a specific weight, age, and income. In our model, agents made dietary food choices while meeting their budget limits with the use of 1 of 3 different metrics for decision making: energy cost (price per calorie), unit price (price per gram), and serving price (price per serving). The food consumption patterns generated by our model were compared to 3 independent data sets. The food choice behaviors observed in 2 of the data sets were found to be closest to the simulated dietary patterns generated by the price per calorie metric. The behaviors observed in the third data set were equidistant from the patterns generated by price per calorie and price per serving metrics, whereas results generated by the price per gram metric were further away. Our simulations suggest that dietary food choice based on price per calorie best matches actual consumption patterns and may therefore be the most salient price metric for low-income populations. © 2016 American Society for Nutrition.

  18. Computational modeling and statistical analyses on individual contact rate and exposure to disease in complex and confined transportation hubs

    NASA Astrophysics Data System (ADS)

    Wang, W. L.; Tsui, K. L.; Lo, S. M.; Liu, S. B.

    2018-01-01

Crowded transportation hubs such as metro stations are thought to be ideal places for the development and spread of epidemics. However, because of their complex spatial layouts and confined environments containing large numbers of highly mobile individuals, it is difficult to quantify human contacts in such settings, and disease spreading dynamics there were less explored in previous studies. Owing to the heterogeneity and dynamic nature of human interactions, a growing number of studies have demonstrated the importance of contact distance and contact duration in transmission probabilities. In this study, we show how detailed information on contact and exposure patterns can be obtained through statistical analyses of microscopic crowd simulation data. Specifically, a pedestrian simulation model, CityFlow, was employed to reproduce individuals' movements in a metro station based on site survey data, and the values and distributions of individual contact rate and exposure in different simulation cases were obtained and analyzed. Interestingly, a Weibull distribution fitted the histogram of individual-based exposure in each case very well. Moreover, we found that both individual contact rate and exposure had a linear relationship with the average crowd density of the environment. The results obtained in this paper can provide a reference for epidemic studies in complex and confined transportation hubs and refine existing disease spreading models.
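
Fitting a Weibull distribution to simulated exposure values, as reported above, can be done with a standard Weibull probability plot regression. The sketch below uses synthetic data in place of CityFlow output and is a generic textbook estimator, not the study's actual fitting procedure.

```python
import math
import random

def weibull_plot_fit(samples):
    """Estimate Weibull shape (k) and scale (lam) by least squares on the
    linearized CDF:  log(-log(1 - F(x))) = k*log(x) - k*log(lam).
    Uses median ranks for the empirical CDF."""
    xs = sorted(samples)
    n = len(xs)
    pts = []
    for i, x in enumerate(xs, start=1):
        f = (i - 0.3) / (n + 0.4)  # median-rank estimate of F(x)
        pts.append((math.log(x), math.log(-math.log(1.0 - f))))
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    sxy = sum((a - mx) * (b - my) for a, b in pts)
    sxx = sum((a - mx) ** 2 for a, _ in pts)
    k = sxy / sxx                # slope of the Weibull plot = shape
    lam = math.exp(mx - my / k)  # intercept encodes the scale
    return k, lam

rng = random.Random(0)
data = [rng.weibullvariate(2.0, 1.5) for _ in range(2000)]  # scale=2, shape=1.5
k_hat, lam_hat = weibull_plot_fit(data)
```

A near-linear Weibull plot is itself evidence that the distribution fits, which is presumably why the histogram fits reported in the study worked so well.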

  19. Moving-window dynamic optimization: design of stimulation profiles for walking.

    PubMed

    Dosen, Strahinja; Popović, Dejan B

    2009-05-01

The overall goal of this research is to improve control for electrical stimulation-based assistance of walking in hemiplegic individuals. We present a simulation for generating an offline input (sensors)-output (intensity of muscle stimulation) representation of walking that serves in synthesizing a rule base for control of electrical stimulation for restoration of walking. The simulation uses a new algorithm termed moving-window dynamic optimization (MWDO). The optimization criterion was to minimize the sum of the squares of tracking errors from desired trajectories, with a penalty function on the total muscle effort. The MWDO was developed in the MATLAB environment and tested using target trajectories characteristic of slow-to-normal walking recorded in a healthy individual, and a model with parameters characterizing a potential hemiplegic user. The outputs of the simulation are piecewise-constant intensities of electrical stimulation and the trajectories generated when the calculated stimulation is applied to the model. We demonstrated the importance of this simulation by showing the outputs for healthy and hemiplegic individuals using the same target trajectories. Results of the simulation show that the MWDO is an efficient tool for analyzing achievable trajectories and for determining the stimulation profiles that need to be delivered for good tracking.
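
The moving-window idea can be reduced to a toy receding-horizon loop: optimize a constant control over a short window, apply only its first step, then slide the window forward. The first-order plant and every constant below are invented for illustration; the paper's model is a multi-muscle limb optimized in MATLAB.

```python
def mwdo(target, x0, dt=0.1, window=5, a=1.0, b=1.0, w_effort=0.01):
    """Toy moving-window dynamic optimization for a first-order plant
    x' = -a*x + b*u tracking a target trajectory.

    At each step, a constant control for the next `window` steps is chosen
    by grid search to minimize squared tracking error plus an effort
    penalty; only the first control of the window is applied."""
    candidates = [i * 0.1 for i in range(-50, 51)]  # piecewise-constant u in [-5, 5]
    x = x0
    controls, traj = [], []
    for t in range(len(target)):
        best_u, best_cost = 0.0, float("inf")
        for u in candidates:
            xs, cost = x, 0.0
            for k in range(t, min(t + window, len(target))):
                xs += dt * (-a * xs + b * u)          # simulate over the window
                cost += (xs - target[k]) ** 2 + w_effort * u * u
            if cost < best_cost:
                best_u, best_cost = u, cost
        x += dt * (-a * x + b * best_u)               # apply first control only
        controls.append(best_u)
        traj.append(x)
    return controls, traj

target = [1.0] * 30          # hold a constant setpoint
controls, traj = mwdo(target, x0=0.0)
```

The effort penalty is what keeps the computed stimulation from saturating, mirroring the muscle-effort term in the paper's criterion.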

  20. Agent-Based Model Simulating Pedestrian Behavioral Response to Environmental Structural Changes

    DOT National Transportation Integrated Search

    2015-12-01

    The authors' research focused on understanding the travel behavior of individuals in complex urban environments. Specifically, the authors investigated how land use patterns and infrastructure influence how individuals across a broad range of travel ...

  1. A Novel Framework for Characterizing Exposure-Related Behaviors Using Agent-Based Models Embedded with Needs-Based Artificial Intelligence (CSSSA2016)

    EPA Science Inventory

    Descriptions of where and how individuals spend their time are important for characterizing exposures to chemicals in consumer products and in indoor environments. Herein we create an agent-based model (ABM) that is able to simulate longitudinal patterns in behaviors. By basing o...

  2. Web-Based Model Visualization Tools to Aid in Model Optimization and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Alder, J.; van Griensven, A.; Meixner, T.

    2003-12-01

Individuals applying hydrologic models need quick, easy-to-use visualization tools to assess and understand model performance. We present here the Interactive Hydrologic Modeling (IHM) visualization toolbox. The IHM utilizes high-speed Internet access, the portability of the web, and the increasing power of modern computers to provide an online toolbox for quick and easy visualization of model results. This visualization interface allows for the interpretation and analysis of Monte Carlo and batch model simulation results. Often a given project will generate several thousand or even hundreds of thousands of simulations. This large number of simulations creates a challenge for post-simulation analysis. IHM addresses this problem by loading all of the data into a database with a web interface that can dynamically generate graphs for the user according to their needs. IHM currently supports: a global sample statistics table (e.g., sum of squared errors, sum of absolute differences), a top-ten simulations table and graphs, graphs of an individual simulation using time step data, objective-based dotty plots, threshold-based parameter cumulative density function graphs (as used in the regional sensitivity analysis of Spear and Hornberger), and 2D error surface graphs of the parameter space. IHM is suitable for everything from the simplest bucket model to the largest set of Monte Carlo simulations with a multi-dimensional parameter and model output space. By using a web interface, IHM offers the user complete flexibility: they can be anywhere in the world using any operating system. IHM can be a time- and money-saving alternative to producing graphs or conducting analyses that may not be informative, or to purchasing expensive proprietary software.
IHM is a simple, free, method of interpreting and analyzing batch model results, and is suitable for novice to expert hydrologic modelers.

  3. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    DOE PAGES

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

An agent-based simulation model hierarchy emulating disease states and behaviors critical to progression of type 2 diabetes was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. This model hierarchy, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. The four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating impacts of an elderliness factor, an obesity factor, and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and the diffusion of social norms spreading over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach for translating complex system dynamics models into agent-based model alternatives that are both conceptually simpler and capable of capturing the main effects of complex local agent-agent interactions.
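
The behavioral rule described, intention as a weighted sum of individual attitude and the adoption rate in one's social network, can be sketched in a few lines. The weights and the stochastic update below are illustrative, not the paper's calibrated model.

```python
import random

def tpb_step(adopted, neighbors, attitude, w_att=0.5, w_norm=0.5, rng=None):
    """One synchronous update of a minimal Theory-of-Planned-Behavior rule.

    Each agent's intention is a weighted sum of its own attitude (in [0,1])
    and the adoption rate among its network neighbors; the agent then adopts
    the behavior with probability equal to that intention."""
    rng = rng or random.Random(0)
    new = {}
    for agent, nbrs in neighbors.items():
        norm = sum(adopted[n] for n in nbrs) / len(nbrs) if nbrs else 0.0
        intention = w_att * attitude[agent] + w_norm * norm
        new[agent] = 1 if rng.random() < intention else 0
    return new

# Tiny triangle network with invented attitudes
neighbors = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b"]}
adopted = {"a": 1, "b": 0, "c": 0}
attitude = {"a": 0.9, "b": 0.9, "c": 0.1}
step1 = tpb_step(adopted, neighbors, attitude)
```

Iterating the step lets social norms diffuse: an agent surrounded by adopters sees its intention, and hence its adoption probability, rise even if its own attitude is unchanged.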

  4. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

An agent-based simulation model hierarchy emulating disease states and behaviors critical to progression of type 2 diabetes was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. This model hierarchy, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. The four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating impacts of an elderliness factor, an obesity factor, and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and the diffusion of social norms spreading over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach for translating complex system dynamics models into agent-based model alternatives that are both conceptually simpler and capable of capturing the main effects of complex local agent-agent interactions.

  5. Evolvable social agents for bacterial systems modeling.

    PubMed

    Paton, Ray; Gregory, Richard; Vlachos, Costas; Saunders, Jon; Wu, Henry

    2004-09-01

    We present two approaches to the individual-based modeling (IbM) of bacterial ecologies and evolution using computational tools. The IbM approach is introduced, and its important complementary role to biosystems modeling is discussed. A fine-grained model of bacterial evolution is then presented that is based on networks of interactivity between computational objects representing genes and proteins. This is followed by a coarser grained agent-based model, which is designed to explore the evolvability of adaptive behavioral strategies in artificial bacteria represented by learning classifier systems. The structure and implementation of the two proposed individual-based bacterial models are discussed, and some results from simulation experiments are presented, illustrating their adaptive properties.

  6. Technical Note: Approximate Bayesian parameterization of a process-based tropical forest model

    NASA Astrophysics Data System (ADS)

    Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.

    2014-02-01

    Inverse parameter estimation of process-based models is a long-standing problem in many scientific disciplines. A key question for inverse parameter estimation is how to define the metric that quantifies how well model predictions fit to the data. This metric can be expressed by general cost or objective functions, but statistical inversion methods require a particular metric, the probability of observing the data given the model parameters, known as the likelihood. For technical and computational reasons, likelihoods for process-based stochastic models are usually based on general assumptions about variability in the observed data, and not on the stochasticity generated by the model. Only in recent years have new methods become available that allow the generation of likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional Markov chain Monte Carlo (MCMC) sampler, performs well in retrieving known parameter values from virtual inventory data generated by the forest model. We analyze the results of the parameter estimation, examine its sensitivity to the choice and aggregation of model outputs and observed data (summary statistics), and demonstrate the application of this method by fitting the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss how this approach differs from approximate Bayesian computation (ABC), another method commonly used to generate simulation-based likelihood approximations. 
Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can be successfully applied to process-based models of high complexity. The methodology is particularly suitable for heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models.
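
A parametric likelihood approximation inside a conventional MCMC sampler can be illustrated with a synthetic-likelihood sketch: simulate the stochastic model repeatedly at each proposed parameter, fit a normal to the simulated summary statistic, and use that density as the likelihood. The toy one-parameter model below stands in for FORMIND; all names and constants are invented.

```python
import math
import random

def synthetic_loglik(theta, observed, simulate, n_sim=50, rng=None):
    """Parametric (synthetic) likelihood: run the stochastic model n_sim
    times at theta, fit a normal to the simulated summary statistic, and
    evaluate the observed summary under that normal."""
    rng = rng or random.Random(0)
    sims = [simulate(theta, rng) for _ in range(n_sim)]
    mu = sum(sims) / n_sim
    var = sum((s - mu) ** 2 for s in sims) / (n_sim - 1) + 1e-12
    return -0.5 * (math.log(2 * math.pi * var) + (observed - mu) ** 2 / var)

def mcmc(observed, simulate, theta0, n_iter=500, step=0.2, seed=1):
    """Random-walk Metropolis with the synthetic log-likelihood as target
    (flat prior)."""
    rng = random.Random(seed)
    theta = theta0
    ll = synthetic_loglik(theta, observed, simulate, rng=rng)
    chain = []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0.0, step)
        ll_prop = synthetic_loglik(prop, observed, simulate, rng=rng)
        if math.log(rng.random() + 1e-300) < ll_prop - ll:
            theta, ll = prop, ll_prop
        chain.append(theta)
    return chain

# Toy stochastic "forest model": the summary statistic is a noisy
# observation of the growth-rate parameter theta.
model = lambda theta, rng: theta + rng.gauss(0.0, 0.1)
chain = mcmc(observed=1.0, simulate=model, theta0=0.0)
```

In a real application the summary would be multivariate (e.g. inventory statistics), the normal would become a multivariate normal, and each simulation call would be an expensive forest-model run, which is exactly why the number of simulations per likelihood evaluation matters.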

7. Assessment of a model of forest dynamics under contrasting climate and disturbance regimes in the Pacific Northwest [FORCLIM]

    USGS Publications Warehouse

    Busing, Richard T.; Solomon, Allen M.

    2005-01-01

    An individual-based model of forest dynamics (FORCLIM) was tested for its ability to simulate forest composition and structure in the Pacific Northwest region of North America. Simulation results across gradients of climate and disturbance were compared to forest survey data from several vegetation zones in western Oregon. Modelled patterns of tree species composition, total basal area and stand height across climate gradients matched those in the forest survey data. However, the density of small stems (<50 cm DBH) was underestimated by the model. Thus actual size-class structure and other density-based parameters of stand structure were not simulated with high accuracy. The addition of partial-stand disturbances at moderate frequencies (<0.01 yr-1) often improved agreement between simulated and actual results. Strengths and weaknesses of the FORCLIM model in simulating forest dynamics and structure in the Pacific Northwest are discussed.

  8. Performance evaluation of an agent-based occupancy simulation model

    DOE PAGES

    Luo, Xuan; Lam, Khee Poh; Chen, Yixing; ...

    2017-01-17

Occupancy is an important factor driving building performance. Static and homogeneous occupant schedules, commonly used in building performance simulation, contribute to issues such as performance gaps between simulated and measured energy use in buildings. Stochastic occupancy models have been recently developed and applied to better represent spatial and temporal diversity of occupants in buildings. However, there is very limited evaluation of the usability and accuracy of these models. This study used measured occupancy data from a real office building to evaluate the performance of an agent-based occupancy simulation model: the Occupancy Simulator. The occupancy patterns of various occupant types were first derived from the measured occupant schedule data using statistical analysis. Then the performance of the simulation model was evaluated and verified based on (1) whether the distribution of observed occupancy behavior patterns follows the theoretical ones included in the Occupancy Simulator, and (2) whether the simulator can reproduce a variety of occupancy patterns accurately. Results demonstrated the feasibility of applying the Occupancy Simulator to simulate a range of occupancy presence and movement behaviors for regular types of occupants in office buildings, and to generate stochastic occupant schedules at the room and individual occupant levels for building performance simulation. For future work, model validation is recommended, which includes collecting and using detailed interval occupancy data of all spaces in an office building to validate the simulated occupant schedules from the Occupancy Simulator.

  9. Performance evaluation of an agent-based occupancy simulation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Xuan; Lam, Khee Poh; Chen, Yixing

Occupancy is an important factor driving building performance. Static and homogeneous occupant schedules, commonly used in building performance simulation, contribute to issues such as performance gaps between simulated and measured energy use in buildings. Stochastic occupancy models have been recently developed and applied to better represent spatial and temporal diversity of occupants in buildings. However, there is very limited evaluation of the usability and accuracy of these models. This study used measured occupancy data from a real office building to evaluate the performance of an agent-based occupancy simulation model: the Occupancy Simulator. The occupancy patterns of various occupant types were first derived from the measured occupant schedule data using statistical analysis. Then the performance of the simulation model was evaluated and verified based on (1) whether the distribution of observed occupancy behavior patterns follows the theoretical ones included in the Occupancy Simulator, and (2) whether the simulator can reproduce a variety of occupancy patterns accurately. Results demonstrated the feasibility of applying the Occupancy Simulator to simulate a range of occupancy presence and movement behaviors for regular types of occupants in office buildings, and to generate stochastic occupant schedules at the room and individual occupant levels for building performance simulation. For future work, model validation is recommended, which includes collecting and using detailed interval occupancy data of all spaces in an office building to validate the simulated occupant schedules from the Occupancy Simulator.

  10. Modeling wildlife populations with HexSim

    EPA Science Inventory

    HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications including population viability analysis for on...

  11. A generic framework for individual-based modelling and physical-biological interaction

    PubMed Central

    2018-01-01

    The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences to the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian simulations in the marine environment. The purpose of the framework is to handle complex individual-level biological models of organisms, combined with realistic 3D oceanographic models of physics and biogeochemistry describing the environment of the organisms, without assumptions about spatial or temporal scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions, comparison of physical circulation models, model ensemble runs and, most recently, posterior Eulerian simulations using the IBMlib framework. We present the code design ideas behind the longevity of the code, our implementation experiences, as well as code performance benchmarking. The framework may contribute substantially to progress in representing, understanding, predicting and eventually managing marine ecosystems. PMID:29351280
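
    As an orientation to the Lagrangian simulations such a framework performs (this is not IBMlib's API; the flow field and stepping scheme below are invented), a minimal forward-Euler particle-advection sketch:

```python
# Advect passive particles through a prescribed 2-D velocity field.

def velocity(x, y):
    """Stand-in flow field: a simple horizontal shear (u grows with y)."""
    return 0.1 * y, 0.05          # (u, v) in domain units per time step

def advect(particles, steps, dt=1.0):
    """Forward-Euler advection of (x, y) particle positions."""
    out = list(particles)
    for _ in range(steps):
        nxt = []
        for x, y in out:
            u, v = velocity(x, y)
            nxt.append((x + u * dt, y + v * dt))
        out = nxt
    return out

final = advect([(0.0, 0.0), (0.0, 1.0)], steps=10)
```

    A real framework couples the biological model at each step (growth, mortality, behavior) and reads velocities from an oceanographic model rather than an analytic function.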

  12. Simple models for studying complex spatiotemporal patterns of animal behavior

    NASA Astrophysics Data System (ADS)

    Tyutyunov, Yuri V.; Titova, Lyudmila I.

    2017-06-01

    Minimal mathematical models able to explain complex patterns of animal behavior are essential parts of simulation systems describing large-scale spatiotemporal dynamics of trophic communities, particularly those with wide-ranging species, such as occur in pelagic environments. We present results obtained with three different modelling approaches: (i) an individual-based model of animal spatial behavior; (ii) a continuous taxis-diffusion-reaction system of partial differential equations; (iii) a 'hybrid' approach combining the individual-based algorithm of organism movements with explicit description of decay and diffusion of the movement stimuli. Though the models are based on extremely simple rules, they all allow description of spatial movements of animals in a predator-prey system within a closed habitat, reproducing some typical patterns of the pursuit-evasion behavior observed in natural populations. In all three models, at each spatial position the animal movements are determined by local conditions only, so the pattern of collective behavior emerges due to self-organization. The movement velocities of animals are proportional to the density gradients of specific cues emitted by individuals of the antagonistic species (pheromones, exometabolites or mechanical waves in the medium, e.g., sound). These cues play the role of taxis stimuli: prey attract predators, while predators repel prey. Depending on the nature and the properties of the movement stimulus we propose using either a simplified individual-based model, a continuous taxis pursuit-evasion system, or a more detailed 'hybrid' approach that combines simulation of the individual movements with the continuous model describing diffusion and decay of the stimuli in an explicit way. These can be used to improve movement models for many species, including large marine predators.
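
    A minimal 1-D sketch of the 'hybrid' approach described above, with invented parameters: agents deposit cues that diffuse and decay on a ring of cells; the predator climbs the prey-cue gradient while the prey descends the predator-cue gradient:

```python
SIZE = 50

def diffuse_decay(field, d=0.2, decay=0.1):
    """One explicit diffusion-plus-decay step on a 1-D ring of cells."""
    return [(field[i] + d * (field[i - 1] + field[(i + 1) % SIZE] - 2 * field[i]))
            * (1.0 - decay) for i in range(SIZE)]

def step(prey, pred, prey_cue, pred_cue):
    """Deposit cues, let them diffuse and decay, then move each agent one
    cell along the local gradient of the antagonist's cue."""
    prey_cue[prey] += 1.0      # prey scent: attracts the predator
    pred_cue[pred] += 1.0      # predator scent: repels the prey
    prey_cue = diffuse_decay(prey_cue)
    pred_cue = diffuse_decay(pred_cue)
    # predator climbs the prey-cue gradient; prey descends the predator-cue one
    pred = (pred + 1) % SIZE if prey_cue[(pred + 1) % SIZE] > prey_cue[pred - 1] else (pred - 1) % SIZE
    prey = (prey + 1) % SIZE if pred_cue[(prey + 1) % SIZE] < pred_cue[prey - 1] else (prey - 1) % SIZE
    return prey, pred, prey_cue, pred_cue

prey, pred = 10, 40
prey_cue, pred_cue = [0.0] * SIZE, [0.0] * SIZE
for _ in range(200):
    prey, pred, prey_cue, pred_cue = step(prey, pred, prey_cue, pred_cue)
```

    Movement decisions use only local field values, so any pursuit-evasion pattern that appears is emergent, as in the models the abstract describes.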

  13. Genetic demographic networks: Mathematical model and applications.

    PubMed

    Kimmel, Marek; Wojdyła, Tomasz

    2016-10-01

    Recent improvement in the quality of genetic data obtained from extinct human populations and their ancestors encourages searching for answers to basic questions regarding human population history. The most common and successful are model-based approaches, in which genetic data are compared to the data obtained from the assumed demography model. Using such an approach, it is possible to either validate or adjust the assumed demography. Model fit to data can be obtained based on reverse-time coalescent simulations or forward-time simulations. In this paper we introduce a computational method based on a mathematical equation that allows obtaining the joint distributions of pairs of individuals under a specified demography model, each of them characterized by a genetic variant at a chosen locus. The two individuals are randomly sampled from either the same or two different populations. The model assumes three types of demographic events (split, merge and migration). Populations evolve according to the time-continuous Moran model with drift and Markov-process mutation. This latter process is described by the Lyapunov-type equation introduced by O'Brien and generalized in our previous works. Application of this equation constitutes an original contribution. In the results section of the paper we present sample applications of our model to both simulated and literature-based demographies. Among others, we include a study of the Slavs-Balts-Finns genetic relationship, in which we model split and migrations between the Balts and Slavs. We also include another example that involves the migration rates between farmers and hunter-gatherers, based on modern and ancient DNA samples. This latter process was previously studied using coalescent simulations. Our results are in general agreement with the previous method, which provides validation of our approach. Although our model is not an alternative to simulation methods in the practical sense, it provides an algorithm to compute pairwise distributions of alleles, in the case of haploid non-recombining loci such as mitochondrial and Y-chromosome loci in humans. Copyright © 2016 Elsevier Inc. All rights reserved.
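
    The paper's method is equation-based; purely for orientation, here is a toy forward-time simulation of a discrete-time, two-allele haploid Moran model with mutation (a simplification of the time-continuous model named above; all parameters invented):

```python
import random

def moran_step(pop, mu, rng):
    """One Moran event: a uniformly chosen parent reproduces (the offspring
    mutates to the other allele with probability mu) and the offspring
    replaces a uniformly chosen individual, keeping population size fixed."""
    child = rng.choice(pop)
    if rng.random() < mu:
        child = 1 - child            # two-allele (0/1) mutation
    pop[rng.randrange(len(pop))] = child

rng = random.Random(42)
pop = [0] * 50 + [1] * 50            # two alleles, equal initial frequency
for _ in range(20000):
    moran_step(pop, 0.01, rng)
freq = sum(pop) / len(pop)           # final frequency of allele 1
```

    Repeating such runs many times approximates the allele distributions that the paper instead obtains directly from its Lyapunov-type equation.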

  14. An agent-based approach for modeling dynamics of contagious disease spread

    PubMed Central

    Perez, Liliana; Dragicevic, Suzana

    2009-01-01

    Background The propagation of communicable diseases through a population is an inherent spatial and temporal process of great importance for modern society. For this reason a spatially explicit epidemiologic model of infectious disease is proposed for a greater understanding of the disease's spatial diffusion through a network of human contacts. Objective The objective of this study is to develop an agent-based modelling approach that integrates geographic information systems (GIS) to simulate the spread of a communicable disease in an urban environment, as a result of individuals' interactions in a geospatial context. Methods The methodology for simulating spatiotemporal dynamics of communicable disease propagation is presented and the model is implemented using a measles outbreak in an urban environment as a case study. Individuals in a closed population are explicitly represented by agents associated with places where they interact with other agents. They are endowed with mobility, through a transportation network allowing them to move between places within the urban environment, in order to represent the spatial heterogeneity and the complexity involved in infectious disease diffusion. The model is implemented on a georeferenced land-use dataset from Metro Vancouver and makes use of census data sets from Statistics Canada for the Burnaby, BC, Canada study site. Results The results provide insights into the application of the model to calculate ratios of susceptible/infected in specific time frames and urban environments, due to its ability to depict the disease progression based on individuals' interactions. It is demonstrated that the dynamic spatial interactions within the population lead to high numbers of exposed individuals who perform stationary activities in areas after they have finished commuting. As a result, the sick individuals are concentrated in geographical locations like schools and universities. 
    Conclusion The GIS-agent based model designed for this study can be easily customized to study the disease spread dynamics of any other communicable disease by simply adjusting the modeled disease timeline and/or the infection model and modifying the transmission process. This type of simulation can help to improve comprehension of disease spread dynamics and to take better steps towards the prevention and control of an epidemic outbreak. PMID:19656403
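
    A minimal place-based sketch of this kind of agent-based contagion model (no GIS and no transport network; every constant below is invented): susceptible agents who share a place with infectious agents may become infected, and the infected recover after a fixed infectious period:

```python
import random

# S -> I -> R agents mixing at shared places each day (places are plain
# labels here, not georeferenced locations).
rng = random.Random(7)
N_AGENTS, N_PLACES, DAYS = 200, 10, 60
P_TRANSMIT, INFECTIOUS_DAYS = 0.05, 5

state = ["S"] * N_AGENTS
days_infected = [0] * N_AGENTS
state[0] = "I"                       # index case

for _ in range(DAYS):
    place_of = [rng.randrange(N_PLACES) for _ in range(N_AGENTS)]
    infected_at = [0] * N_PLACES     # infectious head-count per place at day start
    for i in range(N_AGENTS):
        if state[i] == "I":
            infected_at[place_of[i]] += 1
    for i in range(N_AGENTS):
        if state[i] == "S":
            k = infected_at[place_of[i]]
            if k and rng.random() < 1 - (1 - P_TRANSMIT) ** k:
                state[i] = "I"       # transmits from the next day onward
    for i in range(N_AGENTS):
        if state[i] == "I":
            days_infected[i] += 1
            if days_infected[i] >= INFECTIOUS_DAYS:
                state[i] = "R"

attack_rate = state.count("R") / N_AGENTS   # fraction infected and recovered
```

    Replacing the random place choice with schedules over georeferenced locations is what turns this skeleton into the kind of GIS-coupled model the paper describes.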

  15. Simulating muscular thin films using thermal contraction capabilities in finite element analysis tools.

    PubMed

    Webster, Victoria A; Nieto, Santiago G; Grosberg, Anna; Akkus, Ozan; Chiel, Hillel J; Quinn, Roger D

    2016-10-01

    In this study, new techniques for approximating the contractile properties of cells in biohybrid devices using Finite Element Analysis (FEA) have been investigated. Many current techniques for modeling biohybrid devices use individual cell forces to simulate the cellular contraction. However, such techniques result in long simulation runtimes. In this study we investigated the effect of the use of thermal contraction on simulation runtime. The thermal contraction model was significantly faster than models using individual cell forces, making it beneficial for rapidly designing or optimizing devices. Three techniques, Stoney's Approximation, a Modified Stoney's Approximation, and a Thermostat Model, were explored for calibrating thermal expansion/contraction parameters (TECPs) needed to simulate cellular contraction using thermal contraction. The TECP values were calibrated by using published data on the deflections of muscular thin films (MTFs). Using these techniques, TECP values that suitably approximate experimental deflections can be determined by using experimental data obtained from cardiomyocyte MTFs. Furthermore, a sensitivity analysis was performed in order to investigate the contribution of individual variables, such as elastic modulus and layer thickness, to the final calibrated TECP for each calibration technique. Additionally, the TECP values are applicable to other types of biohybrid devices. Two non-MTF models were simulated based on devices reported in the existing literature. Copyright © 2016 Elsevier Ltd. All rights reserved.
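
    Stoney's approximation itself is compact: it relates the stress in a thin film to the measured curvature of a much thicker substrate. A sketch of the classical formula (the material values below are illustrative, not the paper's):

```python
def stoney_film_stress(E_s, nu_s, t_s, t_f, curvature):
    """Stoney's approximation: film stress (Pa) from substrate curvature.

    E_s, nu_s: substrate Young's modulus (Pa) and Poisson's ratio
    t_s, t_f:  substrate and film thicknesses (m), assuming t_f << t_s
    curvature: measured substrate curvature (1/m)
    """
    return E_s * t_s ** 2 * curvature / (6.0 * (1.0 - nu_s) * t_f)

# Illustrative values only (soft-polymer substrate, cell-layer film):
sigma = stoney_film_stress(E_s=1.5e6, nu_s=0.49, t_s=20e-6,
                           t_f=5e-6, curvature=100.0)
```

    Calibrating a thermal-contraction FEA model amounts to choosing TECPs so that the simulated curvature or deflection reproduces the value this relation ties to the cell-generated stress.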

  16. Functional enzyme-based modeling approach for dynamic simulation of denitrification process in hyporheic zone sediments: Genetically structured microbial community model

    NASA Astrophysics Data System (ADS)

    Song, H. S.; Li, M.; Qian, W.; Song, X.; Chen, X.; Scheibe, T. D.; Fredrickson, J.; Zachara, J. M.; Liu, C.

    2016-12-01

    Modeling environmental microbial communities at the individual-organism level is currently intractable due to overwhelming structural complexity. Functional guild-based approaches alleviate this problem by lumping microorganisms into fewer groups based on their functional similarities. This reduction may become ineffective, however, when individual species perform multiple functions as environmental conditions vary. In contrast, the functional enzyme-based modeling approach we present here describes microbial community dynamics based on identified functional enzymes (rather than individual species or their groups). Previous studies along this line used biomass or functional genes as surrogate measures of enzymes due to the lack of analytical methods for quantifying enzymes in environmental samples. Leveraging our recent development of a signature peptide-based technique enabling sensitive quantification of functional enzymes in environmental samples, we developed a genetically structured microbial community model (GSMCM) to incorporate enzyme concentrations and various other omics measurements (if available) as key modeling input. We formulated the GSMCM based on the cybernetic metabolic modeling framework to rationally account for cellular regulation without relying on empirical inhibition kinetics. In the case study of modeling the denitrification process in Columbia River hyporheic zone sediments collected from the Hanford Reach, our GSMCM provided a quantitative fit to complex experimental data in denitrification, including the delayed response of enzyme activation to the change in substrate concentration. Our future goal is to extend the modeling scope to the prediction of carbon and nitrogen cycles and contaminant fate. Integration of a simpler version of the GSMCM with PFLOTRAN for multi-scale field simulations is in progress.
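
    A hedged sketch of the enzyme-based idea (not the GSMCM itself, which adds cybernetic regulation): a two-step denitrification chain in which each reduction rate scales with its measured functional enzyme level through a Monod term, with all parameters invented:

```python
def simulate(no3, no2, e1, e2, steps, dt=0.1,
             vmax1=1.0, k1=0.5, vmax2=0.8, k2=0.3):
    """Explicit-Euler integration of NO3- -> NO2- -> gaseous products.

    e1, e2: measured functional enzyme levels driving each reduction step.
    All rate constants and concentrations are illustrative, dimensionless.
    """
    for _ in range(steps):
        r1 = vmax1 * e1 * no3 / (k1 + no3)   # nitrate reduction rate
        r2 = vmax2 * e2 * no2 / (k2 + no2)   # nitrite reduction rate
        no3 = max(no3 - r1 * dt, 0.0)
        no2 = max(no2 + (r1 - r2) * dt, 0.0)
    return no3, no2

no3, no2 = simulate(no3=2.0, no2=0.0, e1=0.5, e2=0.2, steps=500)
```

    Driving the rates with enzyme concentrations, rather than lumped biomass, is what lets such a model capture the delayed response of activity to substrate changes that the abstract reports.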

  17. Simulation of emotional contagion using modified SIR model: A cellular automaton approach

    NASA Astrophysics Data System (ADS)

    Fu, Libi; Song, Weiguo; Lv, Wei; Lo, Siuming

    2014-07-01

    Emotion plays an important role in the decision-making of individuals in some emergency situations. The contagion of emotion may induce either normal or abnormal consolidated crowd behavior. This paper aims to simulate the dynamics of emotional contagion among crowds by modifying the epidemiological SIR model to a cellular automaton approach. This new cellular automaton model, entitled the "CA-SIRS model", captures the dynamic process 'susceptible-infected-recovered-susceptible', which is based on SIRS contagion in epidemiological theory. Moreover, in this new model, the process is integrated with individual movement. The simulation results of this model show that multiple waves and dynamical stability around a mean value will appear during emotion spreading. It was found that the proportion of initial infected individuals had little influence on the final stable proportion of infected population in a given system, and that infection frequency increased with an increase in the average crowd density. Our results further suggest that individual movement accelerates the spread speed of emotion and increases the stable proportion of infected population. Furthermore, decreasing the duration of an infection and the probability of reinfection can markedly reduce the number of infected individuals. It is hoped that this study will be helpful in crowd management and evacuation organization.
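
    A minimal cellular-automaton SIRS sketch in the spirit of the CA-SIRS model (no individual movement; all probabilities invented): each grid cell is susceptible, infected, or recovered, and recovered cells can become susceptible again, which is what sustains the waves the abstract describes:

```python
import random

W, H = 20, 20
P_INFECT, P_RECOVER, P_SUSCEPT = 0.3, 0.2, 0.05
rng = random.Random(1)

grid = [["S"] * W for _ in range(H)]
grid[H // 2][W // 2] = "I"           # one initially "infected" individual

def neighbors(x, y):
    """Von Neumann neighborhood on a toroidal grid."""
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        yield (x + dx) % W, (y + dy) % H

def step(grid):
    new = [row[:] for row in grid]   # synchronous update
    for y in range(H):
        for x in range(W):
            cell = grid[y][x]
            if cell == "S":
                if any(grid[ny][nx] == "I" for nx, ny in neighbors(x, y)) \
                        and rng.random() < P_INFECT:
                    new[y][x] = "I"
            elif cell == "I" and rng.random() < P_RECOVER:
                new[y][x] = "R"
            elif cell == "R" and rng.random() < P_SUSCEPT:
                new[y][x] = "S"      # loss of "immunity": the SIRS loop
    return new

for _ in range(100):
    grid = step(grid)
infected = sum(row.count("I") for row in grid)
```

    Adding movement, as the paper does, amounts to letting cell occupants swap positions between update steps.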

  18. Sensitivity analysis of Repast computational ecology models with R/Repast.

    PubMed

    Prestes García, Antonio; Rodríguez-Patón, Alfonso

    2016-12-01

    Computational ecology is an emerging interdisciplinary discipline founded mainly on modeling and simulation methods for studying ecological systems. Among the existing modeling formalisms, individual-based modeling is particularly well suited for capturing the complex temporal and spatial dynamics as well as the nonlinearities arising in ecosystems, communities, or populations due to individual variability. In addition, being a bottom-up approach, it is useful for providing new insights on the local mechanisms which are generating some observed global dynamics. Of course, no conclusions about model results can be taken seriously if they are based on a single model execution and are not analyzed carefully. Therefore, a sound methodology should always be used for underpinning the interpretation of model results. Sensitivity analysis is a methodology for quantitatively assessing the effect of input uncertainty on the simulation output, and it should be compulsorily incorporated into every work based on an in-silico experimental setup. In this article, we present R/Repast, a GNU R package for running and analyzing Repast Simphony models, accompanied by two worked examples on how to perform global sensitivity analysis and how to interpret the results.
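
    R/Repast provides global sensitivity analysis; as a simpler orientation to the underlying idea (this is not the package's API), a one-at-a-time sensitivity sketch over a toy stochastic model standing in for a simulation run:

```python
import random

def toy_model(params):
    """Stand-in for one simulation execution: the output depends strongly
    on 'growth' and only weakly on 'noise_scale' (fixed seed so repeated
    runs are comparable)."""
    rng = random.Random(123)
    x = 1.0
    for _ in range(50):
        x *= 1.0 + params["growth"] \
             + params["noise_scale"] * (rng.random() - 0.5) * 0.01
    return x

def oat_sensitivity(model, baseline, delta=0.1):
    """One-at-a-time sensitivity: relative change in the output when each
    parameter is perturbed by a fraction `delta` of its baseline value."""
    y0 = model(baseline)
    effects = {}
    for name, value in baseline.items():
        perturbed = dict(baseline)
        perturbed[name] = value * (1.0 + delta)
        effects[name] = (model(perturbed) - y0) / y0
    return effects

effects = oat_sensitivity(toy_model, {"growth": 0.02, "noise_scale": 1.0})
```

    Global methods of the kind the package supports vary all parameters jointly over their ranges instead of one at a time, which also captures interaction effects.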

  19. Remotely piloted vehicle: Application of the GRASP analysis method

    NASA Technical Reports Server (NTRS)

    Andre, W. L.; Morris, J. B.

    1981-01-01

    The application of General Reliability Analysis Simulation Program (GRASP) to the remotely piloted vehicle (RPV) system is discussed. The model simulates the field operation of the RPV system. By using individual component reliabilities, the overall reliability of the RPV system is determined. The results of the simulations are given in operational days. The model represented is only a basis from which more detailed work could progress. The RPV system in this model is based on preliminary specifications and estimated values. The use of GRASP from basic system definition, to model input, and to model verification is demonstrated.
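
    GRASP's internals are not given in the abstract; a hedged sketch of the general idea it describes, computing a series-system daily reliability from invented component reliabilities and Monte Carlo sampling the operational days to first failure:

```python
import random

# Hypothetical component daily reliabilities (not from the report).
components = {"airframe": 0.995, "datalink": 0.98, "sensor": 0.97}

# Series system: the RPV completes a day only if every component survives it.
r_system = 1.0
for r in components.values():
    r_system *= r

rng = random.Random(5)

def days_to_failure():
    """Sample operational days until first failure (geometric model)."""
    d = 0
    while rng.random() < r_system:
        d += 1
    return d

mean_days = sum(days_to_failure() for _ in range(2000)) / 2000
```

    Reporting results in operational days, as the abstract notes, falls out naturally from sampling the failure time rather than just the per-day reliability.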

  20. Numerical evaluation of the skull for human neuromodulation with transcranial focused ultrasound

    NASA Astrophysics Data System (ADS)

    Mueller, Jerel K.; Ai, Leo; Bansal, Priya; Legon, Wynn

    2017-12-01

    Objective. Transcranial focused ultrasound is an emerging field for human non-invasive neuromodulation, but its dosing in humans is difficult to know due to the skull. The objective of the present study was to establish modeling methods based on medical images to assess the effect of skull differences between individuals on the wave propagation of ultrasound. Approach. Computational models of transcranial focused ultrasound were constructed using CT and MR scans to solve for intracranial pressure. We explored the effect of including the skull base in models, different transducer placements on the head, and differences between 250 kHz and 500 kHz acoustic frequency for both female and male models. We further tested these features using linear, nonlinear, and elastic simulations. To better understand inter-subject skull thickness and composition effects, we evaluated the intracranial pressure maps of twelve individuals at two different skull sites. Main results. Nonlinear acoustic simulations resulted in intracranial pressure maps virtually identical to those from linear acoustic simulations. Elastic simulations showed a difference in maximum pressures and full-width half-maximum volumes of at most 15%. Ultrasound at an acoustic frequency of 250 kHz resulted in the creation of more prominent intracranial standing waves compared to 500 kHz. Finally, across twelve model human skulls, a significant linear relationship to characterize intracranial pressure maps was not found. Significance. Despite its appeal, an inherent problem with the use of a noninvasive transcranial ultrasound method is the difficulty of knowing intracranial effects because of the skull. Here we develop detailed computational models derived from medical images of individuals to simulate the propagation of neuromodulatory ultrasound across the skull and solve for intracranial pressure maps. These methods allow for a much better understanding of the intracranial effects of ultrasound for an individual in order to ensure proper targeting and more tightly control dosing.

  1. Space-based Ornithology-Studying Bird Migration and Environmental Change in North America

    NASA Technical Reports Server (NTRS)

    Smith, James; Deppe, Jill

    2008-01-01

    Natural fluctuations in the availability of critical stopover sites coupled with anthropogenic destruction of wetlands, land-use change, and anticipated losses due to climate change present migratory birds with a formidable challenge. We have developed an individual-based, spatially explicit bird migration model that simulates the migration routes, timing and energy budgets of individual birds under dynamic weather and land surface conditions. Our model incorporates biophysical constraints, individual bird energy status, bird behavior, and flight aerodynamics. We model the speed, direction, and timing of individual birds moving through a user-specified Lagrangian grid. The model incorporates environmental properties including wind speed and direction, topography, dynamic hydrologic properties of the landscape, and environmental suitability. The model is driven by important variables estimated from satellite observations of the land surface, by data assimilation products from weather and climate models, and by biological field data. We illustrate the use of the model to study the impact of both short- and long-term environmental variations (e.g., climatic, drought-related, and anthropogenic) on migration timing (phenology), spatial pattern, and fitness (survival and reproductive success). We present several theoretical simulations of the spring migration of Pectoral Sandpiper (Calidris melanotos) in North America with emphasis on the Central flyway from the Gulf of Mexico to Alaska.
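
    A toy sketch of the energy-budget logic such a migration model embodies (every constant below is invented; the real model uses biophysics, flight aerodynamics, and satellite-driven environmental fields): a bird flies legs between stopovers, burning energy at a rate reduced by tailwind, and refuels where habitat suitability is high:

```python
def migrate(legs, energy=10.0, refuel_rate=4.0):
    """legs: list of (distance_km, tailwind_ms, suitability in 0..1) tuples.
    Returns (arrived, final_energy). All units and rates are illustrative."""
    for dist, tailwind, suitability in legs:
        cost = dist * 0.01 * (1.0 - 0.05 * tailwind)   # cheaper with tailwind
        if cost > energy:
            return False, energy        # insufficient fuel to fly this leg
        energy -= cost
        energy += refuel_rate * suitability            # stopover refueling
    return True, energy

# Three hypothetical legs: strong tailwind, headwind, calm.
arrived, energy = migrate([(300, 5.0, 0.8), (400, -2.0, 0.6), (350, 0.0, 0.9)])
```

    Losing a high-suitability stopover (lowering one suitability value) can make a later leg unaffordable, which is the mechanism by which stopover loss affects survival in such models.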

  2. Intelligent Systems Approach for Automated Identification of Individual Control Behavior of a Human Operator

    NASA Technical Reports Server (NTRS)

    Zaychik, Kirill B.; Cardullo, Frank M.

    2012-01-01

    Results have been obtained using conventional techniques to model the generic human operator's control behavior; however, little research has been done to identify an individual based on control behavior. The hypothesis investigated is that different operators exhibit different control behavior when performing a given control task. Two enhancements to existing human operator models, which allow personalization of the modeled control behavior, are presented. One enhancement accounts for the testing control signals, which are introduced by an operator for more accurate control of the system and/or to adjust the control strategy. This uses an artificial neural network, which can be fine-tuned to model the testing control. Another enhancement takes the form of an equiripple filter which conditions the control system power spectrum. A novel automated parameter identification technique was developed to facilitate the identification process of the parameters of the selected models. This utilizes a Genetic Algorithm based optimization engine called the Bit-Climbing Algorithm. Enhancements were validated using experimental data obtained from three different sources: the Manual Control Laboratory software experiments, Unmanned Aerial Vehicle simulation, and NASA Langley Research Center Visual Motion Simulator studies. This manuscript also addresses applying human operator models to evaluate the effectiveness of motion feedback when simulating actual pilot control behavior in a flight simulator.
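
    The abstract names a Bit-Climbing Algorithm; a generic sketch of bit-climbing optimization (the encoding and toy objective below are stand-ins, not the paper's parameter-identification setup): parameters are encoded as a bit string and single-bit flips are greedily accepted whenever they improve the objective:

```python
import random

def bit_climb(n_bits, objective, rng, max_passes=50):
    """Greedy bit-climbing: repeatedly sweep the bit string, keeping any
    single-bit flip that improves the objective; stop when a full sweep
    yields no improvement."""
    bits = [rng.randint(0, 1) for _ in range(n_bits)]
    best = objective(bits)
    for _ in range(max_passes):
        improved = False
        for i in range(n_bits):
            bits[i] ^= 1                 # try flipping bit i
            score = objective(bits)
            if score > best:
                best, improved = score, True
            else:
                bits[i] ^= 1             # revert the flip
        if not improved:
            break
    return bits, best

# Toy objective: maximize the number of ones ("one-max").
bits, best = bit_climb(16, objective=sum, rng=random.Random(3))
```

    In the identification setting, the bit string would decode to model parameters and the objective would score how well the simulated control signal matches the operator's recorded one.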

  3. [Individualized fluid-solid coupled model of intracranial aneurysms based on computed tomography angiography data].

    PubMed

    Wang, Fuyu; Xu, Bainan; Sun, Zhenghui; Liu, Lei; Wu, Chen; Zhang, Xiaojun

    2012-10-01

    To establish an individualized fluid-solid coupled model of intracranial aneurysms based on computed tomography angiography (CTA) image data. The original Dicom format image data from a patient with an intracranial aneurysm were imported into Mimics software to construct the 3D model. The fluid-solid coupled model was simulated with ANSYS and CFX software, and the sensitivity of the model was analyzed. The difference between the rigid model and fluid-solid coupled model was also compared. The fluid-solid coupled model of intracranial aneurysm was established successfully, which allowed direct simulation of the blood flow of the intracranial aneurysm and the deformation of the solid wall. The pressure field, stress field, and distribution of Von Mises stress and deformation of the aneurysm could be exported from the model. A small Young's modulus led to an obvious deformation of the vascular wall, and the walls with greater thicknesses had smaller deformations. The rigid model and the fluid-solid coupled model showed more differences in the wall shear stress and blood flow velocity than in pressure. The fluid-solid coupled model more accurately represents the actual condition of the intracranial aneurysm than the rigid model. The results of numerical simulation with the model are reliable to study the origin, growth and rupture of the aneurysms.

  4. [Construction of individual-based ecological model for Scomber japonicas at its early growth stages in East China Sea].

    PubMed

    Li, Yue-Song; Chen, Xin-Jun; Yang, Hong

    2012-06-01

    By adopting an FVCOM-simulated 3-D physical field and based on the biological processes of chub mackerel (Scomber japonicas) in its early life history from the individual-based biological model, the individual-based ecological model for S. japonicas at its early growth stages in the East China Sea was constructed through coupling the physical field in March-July with the biological model by the method of Lagrange particle tracking. The constructed model could well simulate the transport process and abundance distribution of S. japonicas eggs and larvae. The Taiwan Warm Current, Kuroshio, and Tsushima Strait Warm Current directly affected the transport process and distribution of the eggs and larvae, and indirectly affected the growth and survival of the eggs and larvae through the transport to the nursery grounds with different water temperatures and foods. The spawning grounds in the southern East China Sea contributed more to the recruitment to the fishing grounds in the northeast East China Sea, and less to the Yangtze estuary and Zhoushan Island. The northwestern and southwestern parts of the spawning grounds had strong connectivity with the nursery grounds of Cheju and the Tsushima Straits, whereas the northeastern and southeastern parts of the spawning grounds had strong connectivity with the nursery grounds of Kyushu and the Pacific Ocean.

  5. Implementation and Evaluation of Multiple Adaptive Control Technologies for a Generic Transport Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Campbell, Stefan F.; Kaneshige, John T.; Nguyen, Nhan T.; Krishakumar, Kalmanje S.

    2010-01-01

    Presented here is the evaluation of multiple adaptive control technologies for a generic transport aircraft simulation. For this study, seven model reference adaptive control (MRAC) based technologies were considered. Each technology was integrated into an identical dynamic-inversion control architecture and tuned using a methodology based on metrics and specific design requirements. Simulation tests were then performed to evaluate each technology's sensitivity to time-delay, flight condition, model uncertainty, and artificially induced cross-coupling. The resulting robustness and performance characteristics were used to identify potential strengths, weaknesses, and integration challenges of the individual adaptive control technologies.

  6. A Multiscale Simulation Framework to Investigate Hydrobiogeochemical Processes in the Groundwater-Surface Water Interaction Zone

    NASA Astrophysics Data System (ADS)

    Scheibe, T. D.; Yang, X.; Song, X.; Chen, X.; Hammond, G. E.; Song, H. S.; Hou, Z.; Murray, C. J.; Tartakovsky, A. M.; Tartakovsky, G.; Yang, X.; Zachara, J. M.

    2016-12-01

    Drought-related tree mortality at a regional scale causes drastic shifts in carbon and water cycling in Southeast Asian tropical rainforests, where severe droughts are projected to occur more frequently, especially under El Niño conditions. To provide a useful tool for projecting tropical rainforest dynamics under climate change conditions, we developed a Spatially Explicit Individual-Based (SEIB) Dynamic Global Vegetation Model (DGVM) capable of simulating mechanistic tree mortality induced by climatic impacts via individual-tree-scale ecophysiology such as hydraulic failure and carbon starvation. In this study, we present the new model, the SEIB-originated Terrestrial Ecosystem Dynamics (S-TEDy) model, and compare its computation results with observations collected at a field site in a Bornean tropical rainforest. Furthermore, after validating the model's performance, numerical experiments addressing the future of the tropical rainforest were conducted using global climate model (GCM) simulation outputs.

  7. Insights into Participants' Behaviours in Educational Games, Simulations and Workshops: A Catastrophe Theory Application to Motivation.

    ERIC Educational Resources Information Center

    Cryer, Patricia

    1988-01-01

    Develops models for participants' behaviors in games, simulations, and workshops based on Catastrophe Theory and Herzberg's two-factor theory of motivation. Examples are given of how these models can be used, both for describing and understanding the behaviors of individuals, and for eliciting insights into why participants behave as they do. (11…

  8. A physical data model for fields and agents

    NASA Astrophysics Data System (ADS)

    de Jong, Kor; de Bakker, Merijn; Karssenberg, Derek

    2016-04-01

    Two approaches exist in simulation modeling: agent-based and field-based modeling. In agent-based (or individual-based) simulation modeling, the entities representing the system's state are represented by objects, which are bounded in space and time. Individual objects, like an animal, a house, or a more abstract entity like a country's economy, have properties representing their state. In an agent-based model this state is manipulated. In field-based modeling, the entities representing the system's state are represented by fields. Fields capture the state of a continuous property within a spatial extent, examples of which are elevation, atmospheric pressure, and water flow velocity. With respect to the technology used to create these models, the domains of agent-based and field-based modeling have often been separate worlds. In environmental modeling, widely used logical data models include feature data models for point, line and polygon objects, and the raster data model for fields. Simulation models are often either agent-based or field-based, even though the modeled system might contain both entities that are better represented by individuals and entities that are better represented by fields. We think that the reason for this dichotomy in kinds of models might be that the traditional object and field data models underlying those models are relatively low level. We have developed a higher level conceptual data model for representing both non-spatial and spatial objects, and spatial fields (De Bakker et al. 2016). Based on this conceptual data model we designed a logical and physical data model for representing many kinds of data, including the kinds used in earth system modeling (e.g. hydrological and ecological models). The goal of this work is to be able to create high level code and tools for the creation of models in which entities are representable by both objects and fields. 
Our conceptual data model is capable of representing the traditional feature data models and the raster data model, among many other data models. Our physical data model is capable of storing a first set of kinds of data, like omnipresent scalars, mobile spatio-temporal points and property values, and spatio-temporal rasters. With our poster we will provide an overview of the physical data model expressed in HDF5 and show examples of how it can be used to capture both object- and field-based information. References De Bakker, M, K. de Jong, D. Karssenberg. 2016. A conceptual data model and language for fields and agents. European Geosciences Union, EGU General Assembly, 2016, Vienna.
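
    A minimal sketch (not the authors' data model or API) of the paper's central idea: one property interface that model code can read regardless of whether the state is backed by per-object agent values or by a field raster:

```python
# Hypothetical classes for illustration; names and signatures are invented.

class ObjectProperty:
    """Per-agent values, e.g. the energy of each animal."""
    def __init__(self, values):
        self.values = dict(values)           # object id -> value
    def value(self, obj_id=None, x=None, y=None):
        return self.values[obj_id]

class FieldProperty:
    """A raster field, e.g. elevation, sampled by location."""
    def __init__(self, raster, cell_size):
        self.raster = raster                 # list of rows of cell values
        self.cell = cell_size                # cell edge length in map units
    def value(self, obj_id=None, x=0.0, y=0.0):
        return self.raster[int(y // self.cell)][int(x // self.cell)]

energy = ObjectProperty({"animal-1": 5.0, "animal-2": 3.5})
elevation = FieldProperty([[10.0, 12.0], [11.0, 15.0]], cell_size=100.0)

# Model code uses the same interface for both representations:
e1 = energy.value(obj_id="animal-1")
z = elevation.value(x=150.0, y=50.0)         # samples raster cell (row 0, col 1)
```

    A physical data model like the authors' HDF5 layout then only has to serialize these two kinds of backing storage, while model code stays agnostic about which one it is reading.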

  9. A stochastic agent-based model of pathogen propagation in dynamic multi-relational social networks

    PubMed Central

    Khan, Bilal; Dombrowski, Kirk; Saad, Mohamed

    2015-01-01

    We describe a general framework for modeling and stochastic simulation of epidemics in realistic dynamic social networks, which incorporates heterogeneity in the types of individuals, types of interconnecting risk-bearing relationships, and types of pathogens transmitted across them. Dynamism is supported through arrival and departure processes, continuous restructuring of risk relationships, and changes to pathogen infectiousness, as mandated by natural history; dynamism is regulated through constraints on the local agency of individual nodes and their risk behaviors, while simulation trajectories are validated using system-wide metrics. To illustrate its utility, we present a case study that applies the proposed framework towards a simulation of HIV in artificial networks of intravenous drug users (IDUs) modeled using data collected in the Social Factors for HIV Risk survey. PMID:25859056

  10. Using a multinomial tree model for detecting mixtures in perceptual detection

    PubMed Central

    Chechile, Richard A.

    2014-01-01

    In the area of memory research there have been two rival approaches for memory measurement—signal detection theory (SDT) and multinomial processing trees (MPT). Both approaches provide measures for the quality of the memory representation, and both approaches provide for corrections for response bias. In recent years there has been a strong case advanced for the MPT approach because of the finding of stochastic mixtures on both target-present and target-absent tests. In this paper a case is made that perceptual detection, like memory recognition, involves a mixture of processes that are readily represented as an MPT model. The Chechile (2004) 6P memory measurement model is modified in order to apply to the case of perceptual detection. This new MPT model is called the Perceptual Detection (PD) model. The properties of the PD model are developed, and the model is applied to existing data from a radiologist examining CT scans. The PD model brings out novel features that were absent from a standard SDT analysis. In addition, the topic of optimal parameter estimation on an individual-observer basis is explored with Monte Carlo simulations. These simulations reveal that the mean of the Bayesian posterior distribution is a more accurate estimator than the corresponding maximum likelihood estimator (MLE). Monte Carlo simulations also indicate that model estimates based on only the data from an individual observer can be improved upon (in the sense of being more accurate) by an adjustment that takes into account the parameter estimate based on the data pooled across all the observers. The adjustment of the estimate for an individual is discussed as an analogous statistical effect to the improvement over the individual MLE demonstrated by the James–Stein shrinkage estimator in the case of the multiple-group normal model. PMID:25018741
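
The pooling adjustment described in this abstract can be sketched as shrinking each observer's estimate toward the across-observer estimate. This is James–Stein-like in spirit only; the fixed `weight` below is invented for illustration rather than derived from the data, as the paper's procedure would require:

```python
def shrink_toward_pool(individual_means, weight=0.5):
    """Shrink each individual-observer estimate toward the pooled estimate.
    weight=1 keeps the raw individual estimates; weight=0 uses only the pool."""
    pooled = sum(individual_means) / len(individual_means)
    return [weight * m + (1 - weight) * pooled for m in individual_means]
```

For two observers with raw estimates 0.2 and 0.8, equal weighting moves both halfway toward the pooled value of 0.5.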

  11. Numerical simulation of the distribution of individual gas bubbles in shaped sapphire crystals

    NASA Astrophysics Data System (ADS)

    Borodin, A. V.; Borodin, V. A.

    2017-11-01

    The effective density of individual gas bubbles in a two-phase melt, consisting of a liquid and gas bubbles, is simulated using a virtual model of the thermal unit. Based on these studies, a theoretically and experimentally grounded mechanism for the formation of individual gas bubbles in shaped sapphire is proposed for the first time. It is shown that changes in the melt flow pattern in the crucible strongly affect the bubble density at the crystallization front and in the crystal. The results obtained made it possible to reduce the number of individual gas bubbles in sapphire sheets.

  12. An Instructional Systems Technology Model for Institutional Change.

    ERIC Educational Resources Information Center

    Dudgeon, Paul J.

    A program based on instructional systems technology was developed at Canadore College as a means of devising the optimal learning experience for each individual student. The systems approach is used to solve educational problems through a process of analysis, synthesis, modeling, and simulation, based on the LOGOS (Language for Optimizing…

  13. The effect of area size and predation on the time to extinction of prairie vole populations. simulation studies via SERDYCA: a Spatially-Explicit Individual-Based Model of Rodent Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostova, T; Carlsen, T

    2003-11-21

    We present a spatially-explicit individual-based computational model of rodent dynamics, customized for the prairie vole species, M. ochrogaster. The model is based on trophic relationships and represents important features such as territorial competition, mating behavior, density-dependent predation and dispersal out of the modeled spatial region. Vegetation growth and vole fecundity are dependent on climatic components. The results of simulations show that the model correctly predicts the overall temporal dynamics of the population density. Time-series analysis shows a very good match between the periods corresponding to the peak population density frequencies predicted by the model and the ones reported in the literature. The model is used to study the relation between persistence, landscape area and predation. We introduce the notions of average time to extinction (ATE) and persistence frequency to quantify persistence. While the ATE decreases as the area decreases, it is a bell-shaped function of the predation level: increasing for 'small' and decreasing for 'large' predation levels.
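
The average time to extinction (ATE) introduced above can be estimated by Monte Carlo repetition of a stochastic population model. The sketch below uses a deliberately simple non-spatial birth-death process with invented rates, so it only reproduces the monotone effects of area and heavy predation, not SERDYCA's full spatially explicit dynamics:

```python
import random

def average_time_to_extinction(area=10, predation=0.2, runs=60, horizon=300, seed=3):
    """Monte Carlo estimate of ATE for a toy birth-death population with an
    area-limited carrying capacity and an added predation mortality term."""
    rng = random.Random(seed)
    times = []
    for _ in range(runs):
        pop = 5 * area
        for t in range(1, horizon + 1):
            # density-dependent births; baseline plus predation deaths
            births = sum(1 for _ in range(pop)
                         if rng.random() < 0.5 * (1 - pop / (20 * area)))
            deaths = sum(1 for _ in range(pop) if rng.random() < 0.3 + predation)
            pop = max(0, pop + births - deaths)
            if pop == 0:
                times.append(t)
                break
        else:
            times.append(horizon)  # run censored at the simulation horizon
    return sum(times) / len(times)
```

In this toy version, heavier predation shortens the average time to extinction, matching the descending limb of the bell-shaped relation reported above.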

  14. The role of density-dependent individual growth in the persistence of freshwater salmonid populations.

    PubMed

    Vincenzi, Simone; Crivelli, Alain J; Jesensek, Dusan; De Leo, Giulio A

    2008-06-01

    Theoretical and empirical models of population dynamics have paid little attention to the implications of density-dependent individual growth on the persistence and regulation of small freshwater salmonid populations. We have therefore designed a study aimed at testing our hypothesis that density-dependent individual growth is a process that enhances population recovery and reduces extinction risk in salmonid populations in a variable environment subject to disturbance events. This hypothesis was tested in two newly introduced marble trout (Salmo marmoratus) populations living in Slovenian streams (Zakojska and Gorska) subject to severe autumn floods. We developed a discrete-time stochastic individual-based model of population dynamics for each population with demographic parameters and compensatory responses tightly calibrated on data from individually tagged marble trout. The occurrence of severe flood events causing population collapses was explicitly accounted for in the model. We used the model in a population viability analysis setting to estimate the quasi-extinction risk and demographic indexes of the two marble trout populations when individual growth was density-dependent. We ran a set of simulations in which the effect of floods on population abundance was explicitly accounted for and another set of simulations in which flood events were not included in the model. These simulation results were compared with those of scenarios in which individual growth was modelled with density-independent Von Bertalanffy growth curves. Our results show how density-dependent individual growth may confer remarkable resilience to marble trout populations in the case of major flood events. The resilience to flood events shown by the simulation results can be explained by the increase in size-dependent fecundity as a consequence of the drop in population size after a severe flood, which allows the population to quickly recover to the pre-event conditions.
Our results suggest that density-dependent individual growth plays a potentially powerful role in the persistence of freshwater salmonids living in streams subject to recurrent yet unpredictable flood events.
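
The compensatory mechanism described above can be sketched as a von Bertalanffy growth curve whose asymptotic length falls with population density. All parameter values here are invented for illustration and are not the calibrated marble trout parameters:

```python
import math

def vb_length(age, density, l_inf_max=400.0, k=0.5, half_density=50.0):
    """Von Bertalanffy length-at-age with a density-dependent asymptotic
    length: crowding depresses L_inf, so survivors of a flood-induced
    population crash grow larger and, with size-dependent fecundity,
    speed recovery toward pre-event conditions."""
    l_inf = l_inf_max / (1.0 + density / half_density)  # mm, falls with density
    return l_inf * (1.0 - math.exp(-k * age))
```

At a given age, fish in a sparse post-flood population are predicted to be longer than fish at high density, which is the compensatory response the simulations exploit.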

  15. Using a whole farm model to determine the impacts of mating management on the profitability of pasture-based dairy farms.

    PubMed

    Beukes, P C; Burke, C R; Levy, G; Tiddy, R M

    2010-08-01

    An approach to assessing likely impacts of altering reproductive performance on productivity and profitability in pasture-based dairy farms is described. The basis is the development of a whole farm model (WFM) that simulates the entire farm system and holistically links multiple physical performance factors to profitability. The WFM consists of a framework that links a mechanistic cow model, a pasture model, a crop model, management policies and climate. It simulates individual cows and paddocks, and runs on a day time-step. The WFM was upgraded to include reproductive modeling capability using reference tables and empirical equations describing published relationships between cow factors, physiology and mating management. It predicts reproductive status at any time point for individual cows within a modeled herd. The performance of six commercial pasture-based dairy farms was simulated for the period of 12 months beginning 1 June 2005 (05/06 year) to evaluate the accuracy of the model by comparison with actual outcomes. The model predicted most key performance indicators within an acceptable range of error (residual<10% of observed). The evaluated WFM was then used for the six farms to estimate the profitability of changes in farm "set-up" (farm conditions at the start of the farming year on 1 June) and mating management from 05/06 to 06/07 year. Among the six farms simulated, the 4-week calving rate emerged as an important set-up factor influencing profitability, while reproductive performance during natural bull mating was identified as an area with the greatest opportunity for improvement. The WFM provides a useful tool for exploring alternative management strategies and predicting likely outcomes of proposed changes to a pasture-based farm system. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  16. Out of the net: An agent-based model to study human movements influence on local-scale malaria transmission.

    PubMed

    Pizzitutti, Francesco; Pan, William; Feingold, Beth; Zaitchik, Ben; Álvarez, Carlos A; Mena, Carlos F

    2018-01-01

    Though malaria control initiatives have markedly reduced malaria prevalence in recent decades, global eradication is far from actuality. Recent studies show that environmental and social heterogeneities in low-transmission settings have an increased weight in shaping malaria micro-epidemiology. New integrated and more localized control strategies should be developed and tested. Here we present a set of agent-based models designed to study the influence of local scale human movements on local scale malaria transmission in a typical Amazon environment, where malaria transmission is low and strongly connected with seasonal riverine flooding. The agent-based simulations show that the overall malaria incidence is essentially not influenced by local scale human movements. In contrast, the locations of malaria high risk spatial hotspots heavily depend on human movements because simulated malaria hotspots are mainly centered on farms, where laborers work during the day. The agent-based models are then used to test the effectiveness of two different malaria control strategies, both designed to reduce local scale malaria incidence by targeting hotspots. The first control scenario consists of protecting against mosquito bites those people who, during the simulation, enter at least once into hotspots identified from the actual sites where individuals were infected. The second scenario involves treating people entering hotspots computed under the assumption that the infection site of every infected individual is the household where the individual lives. Simulations show that both scenarios perform better in controlling malaria than a randomized treatment, although targeting household hotspots shows slightly better performance.

  17. Can agent based models effectively reduce fisheries management implementation uncertainty?

    NASA Astrophysics Data System (ADS)

    Drexler, M.

    2016-02-01

    Uncertainty is an inherent feature of fisheries management. Implementation uncertainty remains challenging to quantify, often due to unintended responses of users to management interventions. This problem will continue to plague both single species and ecosystem based fisheries management advice unless the mechanisms driving these behaviors are properly understood. Equilibrium models, where each actor in the system is treated as uniform and predictable, are not well suited to forecast the unintended behaviors of individual fishers. Alternatively, agent based models (ABMs) can simulate the behaviors of each individual actor driven by differing incentives and constraints. This study evaluated the feasibility of using ABMs to capture macro scale behaviors of the US West Coast Groundfish fleet. Agent behavior was specified at the vessel level. Agents made daily fishing decisions using knowledge of their own cost structure, catch history, and the histories of catch and quota markets. By adding only a relatively small number of incentives, the model was able to reproduce highly realistic macro patterns of expected outcomes in response to management policies (catch restrictions, MPAs, ITQs) while preserving vessel heterogeneity. These simulations indicate that agent based modeling approaches hold much promise for simulating fisher behaviors and reducing implementation uncertainty. Additional processes affecting behavior, informed by surveys, are continually being added to the fisher behavior model. Further coupling of the fisher behavior model to a spatial ecosystem model will provide a fully integrated social, ecological, and economic model capable of performing management strategy evaluations to properly consider implementation uncertainty in fisheries management.

  18. A framework for the use of agent based modeling to simulate ...

    EPA Pesticide Factsheets

    Simulation of human behavior in exposure modeling is a complex task. Traditionally, inter-individual variation in human activity has been modeled by drawing from a pool of single day time-activity diaries such as the US EPA Consolidated Human Activity Database (CHAD). Here, an agent-based model (ABM) is used to simulate population distributions of longitudinal patterns of four macro activities (sleeping, eating, working, and commuting) in populations of adults over a period of one year. In this ABM, an individual is modeled as an agent whose movement through time and space is determined by a set of decision rules. The rules are based on the agent having time-varying "needs" that are satisfied by performing actions. Needs are modeled as increasing over time, and taking an action reduces the need. Need-satisfying actions include sleeping (meeting the need for rest), eating (meeting the need for food), and commuting/working (meeting the need for income). Every time an action is completed, the model determines the next action the agent will take based on the magnitude of each of the agent's needs at that point in time. Different activities advertise their ability to satisfy various needs of the agent (such as food to eat or sleeping in a bed or on a couch). The model then chooses the activity that satisfies the greatest of the agent's needs. When multiple actions could address a need, the model will choose the most effective of the actions (bed over the couch).
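
The need/advertisement decision rule described above can be sketched in a few lines. The need levels, action names, and effectiveness scores below are invented for illustration and are not the EPA model's actual values:

```python
def next_action(needs, satisfies):
    """Pick the action that addresses the agent's currently greatest need;
    among actions addressing that need, pick the most effective one."""
    top_need = max(needs, key=needs.get)
    candidates = {a: eff[top_need] for a, eff in satisfies.items() if top_need in eff}
    return max(candidates, key=candidates.get)

# current magnitude of each time-varying need (invented values)
needs = {"rest": 0.2, "food": 0.9, "income": 0.5}

# each action "advertises" which needs it satisfies and how effectively
satisfies = {
    "sleep_bed":   {"rest": 1.0},
    "sleep_couch": {"rest": 0.6},   # the bed is preferred over the couch
    "eat":         {"food": 1.0},
    "work":        {"income": 1.0},
}
```

With these values the agent eats next (food is the greatest need); if rest dominated, the bed would be chosen over the less effective couch.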

  19. Methodologies for validating ray-based forward model using finite element method in ultrasonic array data simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Nixon, Andrew; Barber, Tom; Budyn, Nicolas; Bevan, Rhodri; Croxford, Anthony; Wilcox, Paul

    2018-04-01

    In this paper, a methodology for using a finite element (FE) model to validate a ray-based model in the simulation of full matrix capture (FMC) ultrasonic array data sets is proposed. The overall aim is to separate the signal contributions from different interactions in the FE results so that each individual component in the ray-based model results can be compared more easily. This is achieved by combining the results from multiple FE models of the system of interest that include progressively more geometrical features while preserving the same mesh structure. It is shown that the proposed techniques allow the interactions from a large number of different ray-paths to be isolated in FE results and compared directly to the results from a ray-based forward model.

  20. Simulating Flying Insects Using Dynamics and Data-Driven Noise Modeling to Generate Diverse Collective Behaviors

    PubMed Central

    Ren, Jiaping; Wang, Xinjie; Manocha, Dinesh

    2016-01-01

    We present a biologically plausible dynamics model to simulate swarms of flying insects. Our formulation, which is based on biological conclusions and experimental observations, is designed to simulate large insect swarms of varying densities. We use a force-based model that captures different interactions between the insects and the environment and computes collision-free trajectories for each individual insect. Furthermore, we model the noise as a constructive force at the collective level and present a technique to generate noise-induced insect movements in a large swarm that are similar to those observed in real-world trajectories. We use a data-driven formulation that is based on pre-recorded insect trajectories. We also present a novel evaluation metric and a statistical validation approach that takes into account various characteristics of insect motions. In practice, the combination of Curl noise function with our dynamics model is used to generate realistic swarm simulations and emergent behaviors. We highlight its performance for simulating large flying swarms of midges, fruit fly, locusts and moths and demonstrate many collective behaviors, including aggregation, migration, phase transition, and escape responses. PMID:27187068
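
The force-based update at the heart of such a model can be sketched as an Euler integration step. This is a 1-D skeleton with an invented centroid-attraction force; the published model is 3-D, computes collision-free trajectories, and uses data-driven curl noise rather than the plain per-insect noise term shown here:

```python
def swarm_step(positions, velocities, dt=0.1, attract=0.05, noise=None):
    """One Euler step of a toy 1-D force-based swarm: each insect is pulled
    toward the swarm centroid, plus an optional per-insect noise force
    standing in for the constructive collective-level noise."""
    centroid = sum(positions) / len(positions)
    noise = noise or [0.0] * len(positions)
    new_v = [v + dt * (attract * (centroid - p) + n)
             for p, v, n in zip(positions, velocities, noise)]
    new_p = [p + dt * v for p, v in zip(positions, new_v)]
    return new_p, new_v
```

Two insects starting at rest on opposite sides of the centroid drift toward each other, the aggregation behavior in its simplest form.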

  1. Theory and data for simulating fine-scale human movement in an urban environment

    PubMed Central

    Perkins, T. Alex; Garcia, Andres J.; Paz-Soldán, Valerie A.; Stoddard, Steven T.; Reiner, Robert C.; Vazquez-Prokopec, Gonzalo; Bisanzio, Donal; Morrison, Amy C.; Halsey, Eric S.; Kochel, Tadeusz J.; Smith, David L.; Kitron, Uriel; Scott, Thomas W.; Tatem, Andrew J.

    2014-01-01

    Individual-based models of infectious disease transmission depend on accurate quantification of fine-scale patterns of human movement. Existing models of movement either pertain to overly coarse scales, simulate some aspects of movement but not others, or were designed specifically for populations in developed countries. Here, we propose a generalizable framework for simulating the locations that an individual visits, time allocation across those locations, and population-level variation therein. As a case study, we fit alternative models for each of five aspects of movement (number, distance from home and types of locations visited; frequency and duration of visits) to interview data from 157 residents of the city of Iquitos, Peru. Comparison of alternative models showed that location type and distance from home were significant determinants of the locations that individuals visited and how much time they spent there. We also found that for most locations, residents of two neighbourhoods displayed indistinguishable preferences for visiting locations at various distances, despite differing distributions of locations around those neighbourhoods. Finally, simulated patterns of time allocation matched the interview data in a number of ways, suggesting that our framework constitutes a sound basis for simulating fine-scale movement and for investigating factors that influence it. PMID:25142528

  2. Quantitative Agent Based Model of User Behavior in an Internet Discussion Forum

    PubMed Central

    Sobkowicz, Pawel

    2013-01-01

    The paper applies an agent-based simulation of opinion evolution, based on nonlinear emotion/information/opinion (E/I/O) individual dynamics, to an actual Internet discussion forum. The goal is to reproduce, via simulations using an agent-based model, the results of two-year-long observations and analyses of user communication behavior and of the expressed opinions and emotions. The model allowed us to derive various characteristics of the forum, including the distribution of user activity and popularity (outdegree and indegree), the distribution of the length of dialogs between participants, their political sympathies, and the emotional content and purpose of the comments. The parameters used in the model have intuitive meanings and can be translated into psychological observables. PMID:24324606

  3. A Continuous Labour Supply Model in Microsimulation: A Life-Cycle Modelling Approach with Heterogeneity and Uncertainty Extension

    PubMed Central

    Li, Jinjing; Sologon, Denisa Maria

    2014-01-01

    This paper advances a structural inter-temporal model of labour supply that is able to simulate the dynamics of labour supply in a continuous setting and addresses two main drawbacks of most existing models. The first limitation is the inability to incorporate individual heterogeneity, as every agent shares the same parameters of the utility function. The second one is the strong assumption that individuals make decisions in a world of perfect certainty. Essentially, this paper offers an extension of marginal-utility-of-wealth-constant labour supply functions known as "Frisch functions" under certainty and uncertainty with homogeneous and heterogeneous preferences. The lifetime models based on the fixed effect vector decomposition yield the most stable simulation results, under both certain and uncertain future wage assumptions. Due to its improved accuracy and stability, this lifetime labour supply model is particularly suitable for enhancing the performance of life cycle simulation models, thus providing a better reference for policymaking. PMID:25391021

  4. Intelligent systems approach for automated identification of individual control behavior of a human operator

    NASA Astrophysics Data System (ADS)

    Zaychik, Kirill B.

    Acceptable results have been obtained using conventional techniques to model the generic human operator's control behavior. However, little research has been done in an attempt to identify an individual based on his/her control behavior. The main hypothesis investigated in this dissertation is that different operators exhibit different control behavior when performing a given control task. Furthermore, inter-person differences are manifested in the amplitude and frequency content of the non-linear component of the control behavior. Two enhancements to the existing models of the human operator, which allow personalization of the modeled control behavior, are presented in this dissertation. One of the proposed enhancements accounts for the "testing" control signals, which are introduced by an operator for more accurate control of the system and/or to adjust his/her control strategy. This enhancement uses an Artificial Neural Network (ANN), which can be fine-tuned to model the "testing" control behavior of a given individual. The other model enhancement took the form of an equiripple filter (EF), which conditions the power spectrum of the control signal before it is passed through the plant dynamics block. The filter design technique uses the Parks-McClellan algorithm, which allows parameterization of the desired levels of power at certain frequencies. A novel automated parameter identification technique (APID) was developed to facilitate the identification process of the parameters of the selected models of the human operator. APID utilizes a Genetic Algorithm (GA) based optimization engine called the Bit-climbing Algorithm (BCA). The proposed model enhancements were validated using experimental data obtained from three different sources: the Manual Control Laboratory software experiments, Unmanned Aerial Vehicle simulation, and NASA Langley Research Center Visual Motion Simulator studies.
Validation analysis involves comparison of the actual and simulated control activity signals. The validation criteria used in this dissertation are based on comparing the Power Spectral Densities of the control signals against those of the Precision model of the human operator. This dissertation also addresses the issue of applying the proposed human operator model augmentation to evaluate the effectiveness of the motion feedback when simulating the actual pilot control behavior in a flight simulator. The proposed modeling methodology allows for quantitative assessments and prediction of the need for platform motion, while performing aircraft/pilot simulation studies.

  5. Dilemma of dilemmas: how collective and individual perspectives can clarify the size dilemma in voluntary linear public goods dilemmas.

    PubMed

    Shank, Daniel B; Kashima, Yoshihisa; Saber, Saam; Gale, Thomas; Kirley, Michael

    2015-01-01

    Empirical findings on public goods dilemmas indicate an unresolved dilemma: that increasing size (the number of people in the dilemma) sometimes increases, decreases, or does not influence cooperation. We clarify this dilemma by first classifying public goods dilemma properties that specify individual outcomes as individual properties (e.g., Marginal Per Capita Return) and group outcomes as group properties (e.g., public good multiplier), mathematically showing how only one set of properties can remain constant as the dilemma size increases. Underpinning decision-making regarding individual and group properties, we propose that individuals are motivated by both individual and group preferences based on a theory of collective rationality. We use Van Lange's integrated model of social value orientations to operationalize these preferences as an amalgamation of outcomes for self, outcomes for others, and equality of outcomes. Based on this model, we then predict how the public good's benefit and size, combined with controlling individual versus group properties, produce different levels of cooperation in public goods dilemmas. A two (low vs. high benefit) by three (2-person baseline vs. 5-person holding constant individual properties vs. 5-person holding constant group properties) factorial experiment (group n = 99; participant n = 390) confirms our hypotheses. The results indicate that when holding constant group properties, size decreases cooperation. Yet when holding constant individual properties, size increases cooperation when benefit is low and does not affect cooperation when benefit is high. Using agent-based simulations of individual and group preferences vis-à-vis the integrative model, we fit a weighted simulation model to the empirical data. This fitted model is sufficient to reproduce the empirical results, but only when both individual (self-interest) and group (other-interest and equality) preferences are included.
Our research contributes to understanding how people's motivations and behaviors within public goods dilemmas interact with the properties of the dilemma to lead to collective outcomes.
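
The distinction between individual and group properties can be made concrete with the Marginal Per Capita Return (MPCR). When contributions are multiplied and shared equally, MPCR = multiplier / n, so the individual property (MPCR) and the group property (the multiplier) cannot both stay constant as group size grows; the numbers below are illustrative:

```python
def mpcr(multiplier, n):
    """Marginal per capita return: each unit contributed to the public good
    is multiplied by `multiplier` and shared equally among all n members,
    so a contributor's own return per unit contributed is multiplier / n."""
    return multiplier / n
```

Holding the multiplier at 1.6, moving from a 2-person to a 5-person group drops the MPCR from 0.8 to 0.32; keeping the MPCR at 0.8 in a 5-person group instead requires raising the multiplier to 4.0.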

  6. Dilemma of Dilemmas: How Collective and Individual Perspectives Can Clarify the Size Dilemma in Voluntary Linear Public Goods Dilemmas

    PubMed Central

    Shank, Daniel B.; Kashima, Yoshihisa; Saber, Saam; Gale, Thomas; Kirley, Michael

    2015-01-01

    Empirical findings on public goods dilemmas indicate an unresolved dilemma: that increasing size (the number of people in the dilemma) sometimes increases, decreases, or does not influence cooperation. We clarify this dilemma by first classifying public goods dilemma properties that specify individual outcomes as individual properties (e.g., Marginal Per Capita Return) and group outcomes as group properties (e.g., public good multiplier), mathematically showing how only one set of properties can remain constant as the dilemma size increases. Underpinning decision-making regarding individual and group properties, we propose that individuals are motivated by both individual and group preferences based on a theory of collective rationality. We use Van Lange's integrated model of social value orientations to operationalize these preferences as an amalgamation of outcomes for self, outcomes for others, and equality of outcomes. Based on this model, we then predict how the public good's benefit and size, combined with controlling individual versus group properties, produce different levels of cooperation in public goods dilemmas. A two (low vs. high benefit) by three (2-person baseline vs. 5-person holding constant individual properties vs. 5-person holding constant group properties) factorial experiment (group n = 99; participant n = 390) confirms our hypotheses. The results indicate that when holding constant group properties, size decreases cooperation. Yet when holding constant individual properties, size increases cooperation when benefit is low and does not affect cooperation when benefit is high. Using agent-based simulations of individual and group preferences vis-à-vis the integrative model, we fit a weighted simulation model to the empirical data. This fitted model is sufficient to reproduce the empirical results, but only when both individual (self-interest) and group (other-interest and equality) preferences are included.
Our research contributes to understanding how people's motivations and behaviors within public goods dilemmas interact with the properties of the dilemma to lead to collective outcomes. PMID:25799355

  7. Surgical stent planning: simulation parameter study for models based on DICOM standards.

    PubMed

    Scherer, S; Treichel, T; Ritter, N; Triebel, G; Drossel, W G; Burgert, O

    2011-05-01

    Endovascular Aneurysm Repair (EVAR) can be facilitated by a realistic simulation model of stent-vessel-interaction. Therefore, numerical feasibility and integrability in the clinical environment were evaluated. The finite element method was used to determine necessary simulation parameters for stent-vessel-interaction in EVAR. Input variables and result data of the simulation model were examined for their standardization using DICOM supplements. The study identified four essential parameters for the stent-vessel simulation: blood pressure, intima constitution, plaque occurrence and the material properties of vessel and plaque. Output quantities such as radial force of the stent and contact pressure between stent/vessel can help the surgeon to evaluate implant fixation and sealing. The model geometry can be saved with DICOM "Surface Segmentation" objects and the upcoming "Implant Templates" supplement. Simulation results can be stored using the "Structured Report". A standards-based general simulation model for optimizing stent-graft selection may be feasible. At present, there are limitations in specifying individual vessel material parameters and in simulating the proximal fixation of stent-grafts with hooks. Simulation data with clinical relevance for documentation and presentation can be stored using existing or new DICOM extensions.

  8. Skill of Ensemble Seasonal Probability Forecasts

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.; Binter, Roman; Du, Hailiang; Niehoerster, Falk

    2010-05-01

    In operational forecasting, the computational complexity of large simulation models is, ideally, justified by enhanced performance over simpler models. We will consider probability forecasts and contrast the skill of ENSEMBLES-based seasonal probability forecasts of interest to the finance sector (specifically temperature forecasts for Nino 3.4 and the Atlantic Main Development Region (MDR)). The ENSEMBLES model simulations will be contrasted against forecasts from statistical models based on the observations (climatological distributions) and empirical dynamics based on the observations but conditioned on the current state (dynamical climatology). For some start dates, individual ENSEMBLES models yield significant skill even at a lead-time of 14 months. The nature of this skill is discussed, and possible applications are noted. Questions surrounding the interpretation of probability forecasts based on these multi-model ensemble simulations are then considered; the distributions considered are formed by kernel dressing the ensemble and blending with the climatology. The sources of apparent (RMS) skill in distributions based on multi-model simulations are discussed, and it is demonstrated that the inclusion of "zero-skill" models in the long range can improve Root-Mean-Square-Error scores, casting some doubt on the common justification for the claim that all models should be included in forming an operational probability forecast. It is argued that the rational response varies with lead time.
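
Kernel dressing and climatology blending, as mentioned above, can be sketched as a Gaussian kernel placed on each ensemble member and mixed with a Gaussian climatology. The kernel width and blend weight below are fixed invented values; the paper's fitting of those parameters is not reproduced:

```python
import math

def dressed_density(x, ensemble, kernel_width, clim_mean, clim_sd, alpha=0.8):
    """Forecast density at x: a Gaussian kernel around each ensemble member
    (weight alpha), blended with a Gaussian climatology (weight 1 - alpha)."""
    def gauss(x, mu, sd):
        return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
    ens = sum(gauss(x, m, kernel_width) for m in ensemble) / len(ensemble)
    return alpha * ens + (1 - alpha) * gauss(x, clim_mean, clim_sd)
```

Setting `alpha=0` recovers the pure climatological forecast, which is the natural zero-skill reference against which the dressed ensemble is judged.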

  9. A water market simulator considering pair-wise trades between agents

    NASA Astrophysics Data System (ADS)

    Huskova, I.; Erfani, T.; Harou, J. J.

    2012-04-01

    In many basins in England no further water abstraction licences are available. Trading water between water rights holders has been recognized as a potentially effective and economically efficient strategy to mitigate increasing scarcity. A screening tool that could assess the potential for trade through realistic simulation of individual water rights holders would help assess the solution's potential contribution to local water management. We propose an optimisation-driven water market simulator that predicts pair-wise trade in a catchment and represents its interaction with natural hydrology and engineered infrastructure. A model is used to emulate licence-holders' willingness to engage in short-term trade transactions. In their simplest form agents are represented using an economic benefit function. The working hypothesis is that trading behaviour can be partially predicted based on differences in marginal values of water over space and time and estimates of transaction costs on pair-wise trades. We discuss the further possibility of embedding rules, norms and preferences of the different water user sectors to more realistically represent the behaviours, motives and constraints of individual licence holders. The potential benefits and limitations of such a social simulation (agent-based) approach are contrasted with our simulator, where agents are driven by economic optimization. A case study based on the Dove River Basin (UK) demonstrates model inputs and outputs. The ability of the model to suggest impacts of water rights policy reforms on trading is discussed.
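    The economic core of such a pair-wise trade decision can be sketched as follows; the marginal values, transaction cost, and surplus-splitting rule are hypothetical illustrations, not parameters of the authors' simulator.

```python
def pairwise_trade(mv_seller, mv_buyer, transaction_cost, volume):
    """A short-term trade occurs only if the buyer's marginal value of water
    exceeds the seller's by enough to cover the transaction cost; the joint
    gain is the economic surplus the pair shares."""
    gain = (mv_buyer - mv_seller) * volume - transaction_cost
    if gain <= 0.0:
        return None  # no mutually beneficial trade at this volume
    price = (mv_buyer + mv_seller) / 2.0  # illustrative rule: split the surplus
    return {"volume": volume, "price_per_unit": price, "joint_gain": gain}

# Hypothetical licence holders: an irrigator valuing water at 40/Ml selling to
# an industrial user valuing it at 90/Ml, with a fixed 200 transaction cost.
deal = pairwise_trade(mv_seller=40.0, mv_buyer=90.0, transaction_cost=200.0, volume=10.0)
```

    The same comparison, applied over all candidate pairs and time steps, is what lets marginal-value differences net of transaction costs drive predicted trading patterns.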

  10. The Influence of Spatial Configuration of Residential Area and Vector Populations on Dengue Incidence Patterns in an Individual-Level Transmission Model.

    PubMed

    Kang, Jeon-Young; Aldstadt, Jared

    2017-07-15

    Dengue is a mosquito-borne infectious disease that is endemic in tropical and subtropical countries. Many individual-level simulation models have been developed to test hypotheses about dengue virus transmission. Often these efforts assume that human host and mosquito vector populations are randomly or uniformly distributed in the environment. Although the movement of mosquitoes is affected by the spatial configuration of buildings, and mosquito populations are highly clustered in key buildings, little research has focused on the influence of the local built environment in dengue transmission models. We developed an agent-based model of dengue transmission in a village setting to test the importance of using realistic environments in individual-level models of dengue transmission. The results from one-way ANOVA of the simulations indicated that the differences between scenarios in terms of infection rates as well as serotype-specific dominance are statistically significant. Specifically, the infection rates in scenarios of a realistic environment are more variable than those of a synthetic spatial configuration. With respect to dengue serotype-specific cases, we found that a single dengue serotype is more often dominant in realistic environments than in synthetic environments. An agent-based approach allows a fine-scaled analysis of simulated dengue incidence patterns. The results provide a better understanding of the influence of spatial heterogeneity on dengue transmission at a local scale.

  11. Agent-based models in translational systems biology

    PubMed Central

    An, Gary; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram

    2013-01-01

    Effective translational methodologies for knowledge representation are needed in order to make strides against the constellation of diseases that affect the world today. These diseases are defined by their mechanistic complexity, redundancy, and nonlinearity. Translational systems biology aims to harness the power of computational simulation to streamline drug/device design, simulate clinical trials, and eventually to predict the effects of drugs on individuals. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggests that this modeling framework is well suited for translational systems biology. This review describes agent-based modeling and gives examples of its translational applications in the context of acute inflammation and wound healing. PMID:20835989

  12. Integrative computational models of cardiac arrhythmias -- simulating the structurally realistic heart

    PubMed Central

    Trayanova, Natalia A; Tice, Brock M

    2009-01-01

    Simulation of cardiac electrical function, and specifically, simulation aimed at understanding the mechanisms of cardiac rhythm disorders, represents an example of a successful integrative multiscale modeling approach, uncovering emergent behavior at the successive scales in the hierarchy of structural complexity. The goal of this article is to present a review of the integrative multiscale models of realistic ventricular structure used in the quest to understand and treat ventricular arrhythmias. It concludes with the new advances in image-based modeling of the heart and the promise it holds for the development of individualized models of ventricular function in health and disease. PMID:20628585

  13. Association analysis of whole genome sequencing data accounting for longitudinal and family designs.

    PubMed

    Hu, Yijuan; Hui, Qin; Sun, Yan V

    2014-01-01

    Using the whole genome sequencing data and the simulated longitudinal phenotypes for 849 pedigree-based individuals from Genetic Analysis Workshop 18, we investigated various approaches to detecting the association of rare and common variants with blood pressure traits. We compared three strategies for longitudinal data: (a) using the baseline measurement only, (b) using the average from multiple visits, and (c) using all individual measurements. We also compared the power of using all of the pedigree-based data with that of using only the unrelated subset. The analyses were performed without knowledge of the underlying simulating model.

  14. NUNOA: a computer simulator of individuals, families, and extended families of the high-altitude Quechua

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandt, C.C.; Weinstein, D.A.; Shugart, H.H.

    1980-10-01

    The Quechua Indians of the Peruvian Andes are an example of a human population which has developed special cultural adaptations to deal with hypocaloric stress imposed by a harsh environment. A highly detailed human ecosystem model, NUNOA, which simulates the yearly energy balance of individuals, families, and extended families in a hypothetical farming and herding Quechua community of the high Andes was developed. Unlike most population models which use sets of differential equations in which individuals are aggregated into groups, this model considers the response of each individual to a stochastic environment. The model calculates the yearly energy demand for each family based on caloric requirements of its members. For each family, the model simulates the cultivation of seven different crops and the impact of precipitation, temperature, and disease on yield. Herding, slaughter, and market sales of three different animal species are also simulated. Any energy production in excess of the family's energy demand is placed into extended family storage for possible redistribution. A family failing to meet their annual energy demand may slaughter additional herd animals, temporarily migrate from the community, or borrow food from the extended family storage. The energy balance is used in determining births, deaths, marriages, and resource sharing in the Indian community. In addition, the model maintains a record of each individual's ancestry as well as seven genetic traits for use in tracing lineage and gene flow. The model user has the opportunity to investigate the effect of changes in marriage patterns, resource sharing patterns, or subsistence activities on the ability of the human population to survive in the harsh Andean environment. In addition, the user may investigate the impact of external technology on the Indian culture.

  15. Simulation of wetlands forest vegetation dynamics

    USGS Publications Warehouse

    Phipps, R.L.

    1979-01-01

    A computer program, SWAMP, was designed to simulate the effects of flood frequency and depth to water table on southern wetlands forest vegetation dynamics. By incorporating these hydrologic characteristics into the model, forest vegetation and vegetation dynamics can be simulated. The model, based on data from the White River National Wildlife Refuge near De Witt, Arkansas, "grows" individual trees on a 20 x 20-m plot taking into account effects on the tree growth of flooding, depth to water table, shade tolerance, overtopping and crowding, and probability of death and reproduction. A potential application of the model is illustrated with simulations of tree fruit production following flood-control implementation and lumbering.

  16. HexSim: A flexible simulation model for forecasting wildlife responses to multiple interacting stressors

    EPA Science Inventory

    With SERDP funding, we have improved upon a popular life history simulator (PATCH), and in doing so produced a powerful new forecasting tool (HexSim). PATCH, our starting point, was spatially explicit and individual-based, and was useful for evaluating a range of terrestrial lif...

  17. Simulating dispersal of reintroduced species within heterogeneous landscapes

    Treesearch

    Robert H. Gardner; Eric J. Gustafson

    2004-01-01

    This paper describes the development and application of a spatially explicit, individual based model of animal dispersal (J-walk) to determine the relative effects of landscape heterogeneity, prey availability, predation risk, and the energy requirements and behavior of dispersing organisms on dispersal success. Significant unknowns exist for the simulation of complex...

  18. School System Simulation: An Effective Model for Educational Leaders.

    ERIC Educational Resources Information Center

    Nelson, Jorge O.

    This study reviews the literature regarding the theoretical rationale for creating a computer-based school system simulation for educational leaders' use in problem solving and decision making. Like all social systems, educational systems are so complex that individuals are hard-pressed to consider all interrelated parts as a totality. A…

  19. Modeling the Historical Flood Events in France

    NASA Astrophysics Data System (ADS)

    Ali, Hani; Blaquière, Simon

    2017-04-01

    We will present the simulation results for different scenarios based on the flood model developed by the AXA Global P&C CAT Modeling team. The model uses a Digital Elevation Model (DEM) with 75 m resolution, a hydrographic system (DB Carthage), daily rainfall data from "Météo France", water levels for more than 1500 stations from "HYDRO Banque", the French Hydrological Database (www.hydro.eaufrance.fr), a hydrological model from IRSTEA, and an in-house hydraulic tool. In particular, the model re-simulates the most important and costly flood events that occurred during the past decade in France: we will present the re-simulated meteorological conditions since 1964 and estimate the insurance loss incurred on the current AXA portfolio of individual risks.

  20. Efficiency of reactant site sampling in network-free simulation of rule-based models for biochemical systems

    PubMed Central

    Yang, Jin; Hlavacek, William S.

    2011-01-01

    Rule-based models, which are typically formulated to represent cell signaling systems, can now be simulated via various network-free simulation methods. In a network-free method, reaction rates are calculated for rules that characterize molecular interactions, and these rule rates, which each correspond to the cumulative rate of all reactions implied by a rule, are used to perform a stochastic simulation of reaction kinetics. Network-free methods, which can be viewed as generalizations of Gillespie’s method, are so named because these methods do not require that a list of individual reactions implied by a set of rules be explicitly generated, which is a requirement of other methods for simulating rule-based models. This requirement is impractical for rule sets that imply large reaction networks (i.e., long lists of individual reactions), as reaction network generation is expensive. Here, we compare the network-free simulation methods implemented in RuleMonkey and NFsim, general-purpose software tools for simulating rule-based models encoded in the BioNetGen language. The method implemented in NFsim uses rejection sampling to correct overestimates of rule rates, which introduces null events (i.e., time steps that do not change the state of the system being simulated). The method implemented in RuleMonkey uses iterative updates to track rule rates exactly, which avoids null events. To ensure a fair comparison of the two methods, we developed implementations of the rejection and rejection-free methods specific to a particular class of kinetic models for multivalent ligand-receptor interactions. These implementations were written with the intention of making them as much alike as possible, minimizing the contribution of irrelevant coding differences to efficiency differences. Simulation results show that performance of the rejection method is equal to or better than that of the rejection-free method over wide parameter ranges. However, when parameter values are such that ligand-induced aggregation of receptors yields a large connected receptor cluster, the rejection-free method is more efficient. PMID:21832806
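    The null-event mechanism that distinguishes the rejection method can be sketched as follows; the rule, rates, and bound are invented for illustration and do not reproduce RuleMonkey or NFsim internals.

```python
import random

def rejection_step(state, rate_bound, true_rate, rng):
    """One event attempt in a rejection-based network-free method: the waiting
    time is drawn using an overestimated rule rate, and the event is accepted
    with probability true_rate/rate_bound; otherwise it is a null event that
    advances time without changing the system state."""
    dt = rng.expovariate(rate_bound)
    accepted = rng.random() < true_rate(state) / rate_bound
    return dt, accepted

rng = random.Random(0)
state = {"free_sites": 50}                    # hypothetical reactant sites
true_rate = lambda s: 0.1 * s["free_sites"]   # actual rule rate: 5.0
rate_bound = 0.1 * 100                        # bound assuming all 100 sites free

fired = nulls = 0
for _ in range(10000):
    _, ok = rejection_step(state, rate_bound, true_rate, rng)
    if ok:
        fired += 1
    else:
        nulls += 1
```

    The looser the bound, the larger the fraction of null events, which is exactly the overhead a rejection-free method avoids by tracking rule rates exactly.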

  1. Improving Agent Based Models and Validation through Data Fusion

    PubMed Central

    Laskowski, Marek; Demianyk, Bryan C.P.; Friesen, Marcia R.; McLeod, Robert D.; Mukhi, Shamir N.

    2011-01-01

    This work is contextualized in research in modeling and simulation of infection spread within a community or population, with the objective of providing a public health and policy tool for assessing the dynamics of infection spread and the qualitative impacts of public health interventions. This work uses the integration of real data sources into an Agent Based Model (ABM) to simulate respiratory infection spread within a small municipality. The novelty lies in the use of data sources that are not necessarily obvious candidates for ABM infection spread models. The ABM is a spatiotemporal model inclusive of behavioral and interaction patterns between individual agents on a real topography. The agent behaviours (movements and interactions) are fed by census/demographic data, integrated with real data from a telecommunication service provider (cellular records) and person-person contact data obtained via a custom 3G Smartphone application that logs Bluetooth connectivity between devices. Each source provides data of varying type and granularity, thereby enhancing the robustness of the model. The work demonstrates opportunities in data mining and fusion that can be used by policy and decision makers. The data become real-world inputs into individual SIR disease spread models and variants, thereby building credible and non-intrusive models to qualitatively simulate and assess public health interventions at the population level. PMID:23569606
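    A minimal sketch of the individual-level SIR dynamics such an ABM builds on is shown below; the population size, contact counts, and rates are illustrative assumptions, not the paper's data-derived inputs.

```python
import random

def step(statuses, contacts, p_transmit, p_recover, rng):
    """One day of a minimal agent-based SIR model: each infectious agent meets
    a few randomly chosen agents, possibly infecting susceptible contacts,
    then recovers with a fixed daily probability."""
    n = len(statuses)
    new = list(statuses)
    for i, s in enumerate(statuses):
        if s != "I":
            continue
        for _ in range(contacts):
            j = rng.randrange(n)
            if statuses[j] == "S" and rng.random() < p_transmit:
                new[j] = "I"
        if rng.random() < p_recover:
            new[i] = "R"
    return new

# Invented parameters: 5 seed infections in a population of 500.
rng = random.Random(1)
pop = ["I"] * 5 + ["S"] * 495
for _ in range(60):
    pop = step(pop, contacts=3, p_transmit=0.1, p_recover=0.2, rng=rng)
```

    In the paper's model, the uniform random contacts above are replaced by movement and interaction patterns derived from cellular and Bluetooth data, which is what makes the fused data sources valuable.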

  2. Improving Agent Based Models and Validation through Data Fusion.

    PubMed

    Laskowski, Marek; Demianyk, Bryan C P; Friesen, Marcia R; McLeod, Robert D; Mukhi, Shamir N

    2011-01-01

    This work is contextualized in research in modeling and simulation of infection spread within a community or population, with the objective of providing a public health and policy tool for assessing the dynamics of infection spread and the qualitative impacts of public health interventions. This work uses the integration of real data sources into an Agent Based Model (ABM) to simulate respiratory infection spread within a small municipality. The novelty lies in the use of data sources that are not necessarily obvious candidates for ABM infection spread models. The ABM is a spatiotemporal model inclusive of behavioral and interaction patterns between individual agents on a real topography. The agent behaviours (movements and interactions) are fed by census/demographic data, integrated with real data from a telecommunication service provider (cellular records) and person-person contact data obtained via a custom 3G Smartphone application that logs Bluetooth connectivity between devices. Each source provides data of varying type and granularity, thereby enhancing the robustness of the model. The work demonstrates opportunities in data mining and fusion that can be used by policy and decision makers. The data become real-world inputs into individual SIR disease spread models and variants, thereby building credible and non-intrusive models to qualitatively simulate and assess public health interventions at the population level.

  3. The Osseus platform: a prototype for advanced web-based distributed simulation

    NASA Astrophysics Data System (ADS)

    Franceschini, Derrick; Riecken, Mark

    2016-05-01

    Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, targeting the time and skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data using modern techniques efficiently over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer granularity components such as individual models can contribute to simulation with minimal effort.

  4. INDIVIDUAL-BASED MODELS: POWERFUL OR POWER STRUGGLE?

    PubMed

    Willem, L; Stijven, S; Hens, N; Vladislavleva, E; Broeckhove, J; Beutels, P

    2015-01-01

    Individual-based models (IBMs) offer endless possibilities to explore various research questions but come with high model complexity and computational burden. Large-scale IBMs have become feasible but the novel hardware architectures require adapted software. The increased model complexity also requires systematic exploration to gain thorough system understanding. We elaborate on the development of IBMs for vaccine-preventable infectious diseases and model exploration with active learning. Investment in IBM simulator code can lead to significant runtime reductions. We found large performance differences due to data locality. Sorting the population once reduced simulation time by a factor of two. Storing person attributes separately instead of using person objects also seemed more efficient. Next, we improved model performance by up to 70% by structuring potential contacts based on health status before processing disease transmission. The active learning approach we present is based on iterative surrogate modelling and model-guided experimentation. Symbolic regression is used for nonlinear response surface modelling with automatic feature selection. We illustrate our approach using an IBM for influenza vaccination. After optimizing the parameter space, we observed an inverse relationship between vaccination coverage and the clinical attack rate, reinforced by herd immunity. These insights can be used to focus and optimise research activities, and to reduce both dimensionality and decision uncertainty.
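    The layout ideas mentioned above (attribute arrays instead of person objects, and partitioning the population by health status before processing transmission) can be sketched as follows; this is a toy Python illustration with invented numbers, not the authors' simulator code, and the real performance gains come from cache behaviour in compiled code.

```python
import random

# Structure-of-arrays layout: person attributes live in parallel lists rather
# than per-person objects, and the population is partitioned by health status
# so the transmission loop only visits infectious-susceptible pairings.
rng = random.Random(42)
n = 1000
health = ["I" if rng.random() < 0.02 else "S" for _ in range(n)]
age = [rng.randrange(90) for _ in range(n)]   # a second attribute array

infectious = [i for i, h in enumerate(health) if h == "I"]
susceptible = [i for i, h in enumerate(health) if h == "S"]

# Work scales with len(infectious) * contacts, not with n * n.
p_transmit = 0.1
newly_infected = set()
for i in infectious:
    for j in rng.sample(susceptible, 5):   # 5 random contacts per infectious person
        if rng.random() < p_transmit:
            newly_infected.add(j)
for j in newly_infected:
    health[j] = "I"
```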

  5. A network-based approach for resistance transmission in bacterial populations.

    PubMed

    Gehring, Ronette; Schumm, Phillip; Youssef, Mina; Scoglio, Caterina

    2010-01-07

    Horizontal transfer of mobile genetic elements (conjugation) is an important mechanism whereby resistance is spread through bacterial populations. The aim of our work is to develop a mathematical model that quantitatively describes this process, and to use this model to optimize antimicrobial dosage regimens to minimize resistance development. The bacterial population is conceptualized as a compartmental mathematical model to describe changes in susceptible, resistant, and transconjugant bacteria over time. This model is combined with a compartmental pharmacokinetic model to explore the effect of different plasma drug concentration profiles. An agent-based simulation tool is used to account for resistance transfer occurring when two bacteria are adjacent or in close proximity. In addition, a non-linear programming optimal control problem is introduced to minimize bacterial populations as well as the drug dose. Simulation and optimization results suggest that the rapid death of susceptible individuals in the population is pivotal in minimizing the number of transconjugants in a population. This supports the use of potent antimicrobials that rapidly kill susceptible individuals and development of dosage regimens that maintain effective antimicrobial drug concentrations for as long as needed to kill off the susceptible population. Suggestions are made for experiments to test the hypotheses generated by these simulations.
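    A toy version of such a compartmental susceptible/resistant/transconjugant model, integrated with forward Euler, is sketched below; all rate constants are invented for illustration. It reproduces the qualitative conclusion that rapidly killing susceptibles limits transconjugant formation.

```python
def simulate(beta, gamma, kill, days, dt=0.01):
    """Forward-Euler sketch of a compartmental model with susceptible (S) and
    resistant (R) bacteria; transconjugants (T) arise when resistance is
    transferred to susceptibles by conjugation, and the drug removes
    susceptibles at rate `kill`."""
    S, R, T = 1e6, 1e3, 0.0
    for _ in range(int(days / dt)):
        transfer = beta * S * (R + T)   # conjugation requires close contact
        S += (-transfer - kill * S) * dt
        R += (gamma * R) * dt           # resistant growth, unaffected by drug
        T += (transfer + gamma * T) * dt
    return S, R, T

# Same invented parameters except drug potency: a weak versus a potent regimen.
weak = simulate(beta=1e-10, gamma=0.05, kill=0.1, days=10.0)
potent = simulate(beta=1e-10, gamma=0.05, kill=2.0, days=10.0)
```

    Because the transfer term is proportional to S, any regimen that drives S down faster also shrinks the pool from which transconjugants are created, which is the mechanism behind the dosing recommendation.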

  6. Qualitative modeling of normal blood coagulation and its pathological states using stochastic activity networks.

    PubMed

    Mounts, W M; Liebman, M N

    1997-07-01

    We have developed a method for representing biological pathways and simulating their behavior based on the use of stochastic activity networks (SANs). SANs, an extension of the original Petri net, have been used traditionally to model flow systems including data-communications networks and manufacturing processes. We apply the methodology to the blood coagulation cascade, a biological flow system, and present the representation method as well as results of simulation studies based on published experimental data. In addition to describing the dynamic model, we also present the results of its utilization to perform simulations of clinical states including hemophilias A and B, as well as sensitivity analysis of individual factors and their impact on thrombin production.
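    The token-firing semantics underlying Petri-net-style formalisms like SANs can be sketched as follows; the two-transition "cascade" and its stoichiometry are a toy illustration, not the paper's coagulation network.

```python
def fire(marking, transition):
    """Fire a Petri-net transition if every input place holds enough tokens;
    returns the new marking, or None when the transition is not enabled."""
    inputs, outputs = transition
    if any(marking.get(place, 0) < need for place, need in inputs.items()):
        return None
    new = dict(marking)
    for place, need in inputs.items():
        new[place] -= need
    for place, made in outputs.items():
        new[place] = new.get(place, 0) + made
    return new

# Toy two-step cascade (simplified, not the real coagulation stoichiometry):
# factor X activates to Xa, which then converts prothrombin to thrombin
# while itself acting catalytically.
marking = {"X": 2, "prothrombin": 3, "Xa": 0, "thrombin": 0}
activate_X = ({"X": 1}, {"Xa": 1})
make_thrombin = ({"Xa": 1, "prothrombin": 1}, {"Xa": 1, "thrombin": 1})

marking = fire(marking, activate_X)
marking = fire(marking, make_thrombin)
```

    Pathological states map naturally onto this representation: removing the tokens for a deficient factor disables every transition that consumes them, which is how hemophilia-like blocks in thrombin production can be simulated.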

  7. An individual reproduction model sensitive to milk yield and body condition in Holstein dairy cows.

    PubMed

    Brun-Lafleur, L; Cutullic, E; Faverdin, P; Delaby, L; Disenhaus, C

    2013-08-01

    To simulate the consequences of management in dairy herds, the use of individual-based herd models is very useful and has become common. Reproduction is a key driver of milk production and herd dynamics, whose influence has been magnified by the decrease in reproductive performance over the last decades. Moreover, feeding management influences milk yield (MY) and body reserves, which in turn influence reproductive performance. Therefore, our objective was to build an up-to-date animal reproduction model sensitive to both MY and body condition score (BCS). A dynamic and stochastic individual reproduction model was built mainly from data of a single recent long-term experiment. This model covers the whole reproductive process and is composed of a succession of discrete stochastic events, mainly calving, ovulations, conception and embryonic loss. Each reproductive step is sensitive to MY or BCS levels or changes. The model takes into account recent evolutions of reproductive performance, particularly concerning the calving-to-first-ovulation interval, cyclicity (normal cycle length, prevalence of prolonged luteal phases), oestrus expression and pregnancy (conception, early and late embryonic loss). A sensitivity analysis of the model to MY and BCS at calving was performed. The simulated performance was compared with observed data from the database used to build the model and from the literature to validate the model. Despite comprising a whole series of reproductive steps, the model made it possible to simulate realistic global reproduction outputs. It simulated well the overall reproductive performance observed in farms, in terms of both success rate (recalving rate) and reproduction delays (calving interval). This model is intended to be integrated into herd simulation models to test the impact of management strategies on herd reproductive performance, and thus on calving patterns and culling rates.

  8. Understanding interannual variability in the distribution of, and transport processes affecting, the early life stages of Todarodes pacificus using behavioral-hydrodynamic modeling approaches

    NASA Astrophysics Data System (ADS)

    Kim, Jung Jin; Stockhausen, William; Kim, Suam; Cho, Yang-Ki; Seo, Gwang-Ho; Lee, Joon-Soo

    2015-11-01

    To understand interannual variability in the distribution of the early life stages of the Todarodes pacificus summer spawning population, and to identify the key transport processes influencing this variability, we used a coupled bio-physical model that combines an individual-based model (IBM) incorporating ontogenetic vertical migration for paralarval behavior and a temperature-dependent survival process with a ROMS oceanographic model. Using the distribution of paralarvae observed in the northern East China Sea (ECS) during several field cruises as an end point, the spawning ground for the summer-spawning population was estimated to extend from southeast Jeju Island to the central ECS near 29°N by running the model backwards in time. Running the model forward, interannual variability in the distribution of paralarvae predicted by the model was consistent with that observed in several field surveys; surviving individuals in the northern ECS were substantially more abundant in late July 2006 than in 2007, in agreement with observed paralarval distributions. The total number of surviving individuals at 60 days after release based on the simulation throughout the summer spawning period (June-August) was 20,329 for 2006, compared with 13,816 for 2007. The surviving individuals were mainly distributed in the East/Japan Sea (EJS), corresponding to a pathway following the nearshore branch of the Tsushima Warm Current flowing along the Japanese coast during both years. In contrast, the abundance of surviving individuals was extremely low in 2007 compared to 2006 on the Pacific side of Japan. Interannual variability in transport and survival processes had a substantial impact not only on the abundance of surviving paralarvae, but also on the flux of paralarvae to adjacent waters. Our simulation results for between-year variation in paralarval abundance coincide with recruitment (year n + 1) variability of T. pacificus in the field. The agreement between the simulation and field data indicates our model may be useful for predicting the recruitment of T. pacificus.

  9. Smoldyn: particle-based simulation with rule-based modeling, improved molecular interaction and a library interface.

    PubMed

    Andrews, Steven S

    2017-03-01

    Smoldyn is a spatial and stochastic biochemical simulator. It treats each molecule of interest as an individual particle in continuous space, simulating molecular diffusion, molecule-membrane interactions and chemical reactions, all with good accuracy. This article presents several new features. Smoldyn now supports two types of rule-based modeling. These are a wildcard method, which is very convenient, and the BioNetGen package with extensions for spatial simulation, which is better for complicated models. Smoldyn also includes new algorithms for simulating the diffusion of surface-bound molecules and molecules with excluded volume. Both are exact in the limit of short time steps and reasonably good with longer steps. In addition, Smoldyn supports single-molecule tracking simulations. Finally, the Smoldyn source code can be accessed through a C/C++ language library interface. Smoldyn software, documentation, code, and examples are at http://www.smoldyn.org.
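    The Brownian-dynamics update at the heart of particle-based simulators like Smoldyn can be sketched as follows (a generic illustration, not Smoldyn's API): each coordinate receives a Gaussian displacement with standard deviation sqrt(2*D*dt), so the mean squared displacement grows as 6*D*t in three dimensions.

```python
import math
import random

def diffuse(positions, D, dt, rng):
    """One Brownian-dynamics step: each coordinate receives an independent
    Gaussian displacement with standard deviation sqrt(2*D*dt)."""
    s = math.sqrt(2.0 * D * dt)
    return [(x + rng.gauss(0.0, s), y + rng.gauss(0.0, s), z + rng.gauss(0.0, s))
            for (x, y, z) in positions]

rng = random.Random(7)
pts = [(0.0, 0.0, 0.0)] * 10000      # all particles start at the origin
for _ in range(10):                  # 10 steps of dt = 0.01, total time 0.1
    pts = diffuse(pts, D=1.0, dt=0.01, rng=rng)

# Mean squared displacement should approach the theoretical 6*D*t = 0.6.
msd = sum(x * x + y * y + z * z for x, y, z in pts) / len(pts)
```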

  10. An individual-based process model to simulate landscape-scale forest ecosystem dynamics

    Treesearch

    Rupert Seidl; Werner Rammer; Robert M. Scheller; Thomas Spies

    2012-01-01

    Forest ecosystem dynamics emerges from nonlinear interactions between adaptive biotic agents (i.e., individual trees) and their relationship with a spatially and temporally heterogeneous abiotic environment. Understanding and predicting the dynamics resulting from these complex interactions is crucial for the sustainable stewardship of ecosystems, particularly in the...

  11. Simulating Cancer Growth with Multiscale Agent-Based Modeling

    PubMed Central

    Wang, Zhihui; Butner, Joseph D.; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S.

    2014-01-01

    There have been many techniques developed in recent years to in silico model a variety of cancer behaviors. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology including phenotype-changing mutations, the adaptation to microenvironment, the process of angiogenesis, the influence of extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models. PMID:24793698

  12. Simulated Annealing Based Hybrid Forecast for Improving Daily Municipal Solid Waste Generation Prediction

    PubMed Central

    Song, Jingwei; He, Jiaying; Zhu, Menghua; Tan, Debao; Zhang, Yu; Ye, Song; Shen, Dingtao; Zou, Pengfei

    2014-01-01

    A simulated annealing (SA) based variable-weighted forecast model is proposed to combine and weight a local chaotic model, an artificial neural network (ANN), and a partial least squares support vector machine (PLS-SVM) to build a more accurate forecast model. The hybrid model was built and its multistep-ahead prediction ability was tested on daily MSW generation data from Seattle, Washington, the United States. The hybrid forecast model proved to produce more accurate and reliable results, and to degrade less over longer predictions, than the three individual models. The average one-week-ahead prediction error was reduced from 11.21% (chaotic model), 12.93% (ANN), and 12.94% (PLS-SVM) to 9.38%. The five-week average was reduced from 13.02% (chaotic model), 15.69% (ANN), and 15.92% (PLS-SVM) to 11.27%. PMID:25301508
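    A simulated-annealing search for combination weights, in the spirit of the approach described above, can be sketched as follows; the forecast series, truth series, error measure (MAPE), and annealing schedule are invented for illustration and are not the paper's data or tuning.

```python
import math
import random

def anneal_weights(preds, truth, steps=5000, t0=1.0, seed=0):
    """Simulated-annealing search for convex combination weights over several
    individual forecasts, minimizing mean absolute percentage error (MAPE)."""
    rng = random.Random(seed)
    k = len(preds)

    def mape(w):
        total = 0.0
        for i, y in enumerate(truth):
            combo = sum(w[m] * preds[m][i] for m in range(k))
            total += abs(combo - y) / abs(y)
        return 100.0 * total / len(truth)

    def normalize(w):
        w = [max(x, 1e-9) for x in w]   # keep weights non-negative
        s = sum(w)
        return [x / s for x in w]

    w = [1.0 / k] * k                   # start from equal weights
    err = best_err = mape(w)
    best = w
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-9   # linear cooling schedule
        cand = normalize([x + rng.gauss(0.0, 0.1) for x in w])
        e = mape(cand)
        # accept improvements always, worse moves with Boltzmann probability
        if e < err or rng.random() < math.exp(-(e - err) / t):
            w, err = cand, e
            if e < best_err:
                best, best_err = cand, e
    return best, best_err

# Hypothetical daily forecasts from three individual models and the observed series.
truth = [10.0, 12.0, 11.0, 13.0, 12.0, 14.0]
m1 = [11.0, 13.0, 12.0, 14.0, 13.0, 15.0]   # biased high
m2 = [9.0, 11.0, 10.0, 12.0, 11.0, 13.0]    # biased low
m3 = [10.5, 12.2, 11.1, 13.4, 12.1, 14.2]
weights, err = anneal_weights([m1, m2, m3], truth)
```

    Because oppositely biased models can cancel each other's errors, the weighted combination can beat every individual forecast, which is the rationale for the hybrid approach.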

  13. Development and application of a crossbreeding simulation model for goat production systems in tropical regions.

    PubMed

    Tsukahara, Y; Oishi, K; Hirooka, H

    2011-12-01

    A deterministic simulation model was developed to estimate biological production efficiency and to evaluate goat crossbreeding systems under tropical conditions. The model involves 5 production systems: pure indigenous, the first filial generation (F1), backcross (BC), and composite breeds of F1 (CMP(F1)) and of BC (CMP(BC)). The model first simulates growth, reproduction, lactation, and energy intakes of a doe and a kid on a 1-day time step at the individual level, and thereafter the outputs are integrated into the herd dynamics program. The ability of the model to simulate individual performances was tested under a base situation. The simulation results represented daily BW changes, ME requirements, and milk yield, and the estimates were within the range of published data. Two conventional goat production scenarios in Malaysia were examined: an intensive milk production scenario and an integrated goat and oil palm production scenario. The simulation results of the intensive milk production scenario showed greater production efficiency for the CMP(BC) and CMP(F1) systems and decreased production efficiency for the F1 and BC systems. The results of the integrated goat and oil palm production scenario showed that production efficiency and stocking rate were greater for the indigenous goats than for the crossbreeding systems.

  14. Radar signal categorization using a neural network

    NASA Technical Reports Server (NTRS)

    Anderson, James A.; Gately, Michael T.; Penz, P. Andrew; Collins, Dean R.

    1991-01-01

    Neural networks were used to analyze a complex simulated radar environment which contains noisy radar pulses generated by many different emitters. The neural network used is an energy minimizing network (the BSB model) which forms energy minima - attractors in the network dynamical system - based on learned input data. The system first determines how many emitters are present (the deinterleaving problem). Pulses from individual simulated emitters give rise to separate stable attractors in the network. Once individual emitters are characterized, it is possible to make tentative identifications of them based on their observed parameters. As a test of this idea, a neural network was used to form a small data base that potentially could make emitter identifications.
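
    A minimal sketch of the energy-minimizing recall idea behind the BSB (Brain-State-in-a-Box) model: Hebbian outer-product weights store the pattern signatures, and iterating a clipped linear update drives a noisy probe into the attractor (box corner) of the nearest stored pattern. The patterns, dimension, and update gain below are illustrative, not the radar data of the study.

```python
def bsb_recall(patterns, probe, alpha=0.2, steps=50):
    """BSB-style recall: x <- clip(x + alpha * W x), W Hebbian (illustrative)."""
    n = len(probe)
    # Hebbian outer-product weight matrix from the stored +/-1 patterns
    W = [[sum(p[i] * p[j] for p in patterns) / n for j in range(n)]
         for i in range(n)]
    x = probe[:]
    for _ in range(steps):
        # synchronous update with saturation at the box walls [-1, 1]
        x = [max(-1.0, min(1.0,
                           x[i] + alpha * sum(W[i][j] * x[j] for j in range(n))))
             for i in range(n)]
    return x
```

    A corrupted, attenuated copy of a stored pattern is pulled back to the corresponding corner of the hypercube, which is the sense in which each simulated emitter forms a separate stable attractor.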

  15. Individual-Based Spatially-Explicit Model of an Herbivore and Its Resource: The Effect of Habitat Reduction and Fragmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostova, T; Carlsen, T; Kercher, J

    2002-06-17

    We present an individual-based, spatially-explicit model of the dynamics of a small mammal and its resource. The life histories of each individual animal are modeled separately. The individuals can have the status of residents or wanderers and belong to behaviorally differing groups of juveniles or adults and males or females. Their territory-defending and monogamous behavior is taken into consideration. The resource, green vegetation, grows depending on seasonal climatic characteristics and is diminished by the herbivore's grazing. Other specifics, such as a varying personal energetic level due to feeding and starvation of the individuals, mating preferences, avoidance of competitors, and dispersal of juveniles as a result of site overgrazing, are included in the model. We determined model parameters from real data for the species Microtus ochrogaster (prairie vole). The simulations are done for the case of an enclosed habitat without predators or competing species. The goal of the study is to find the relation between habitat size and population persistence. The experiments with the model show that the populations go extinct due to severe overgrazing, but that the length of population persistence depends on the area of the habitat as well as on the presence of fragmentation. Additionally, the total population size of the vole population obtained during the simulations exhibits yearly fluctuations as well as multi-yearly peaks. These dynamics are similar to those observed in prairie vole field studies.

  16. Comparison of simulation modeling and satellite techniques for monitoring ecological processes

    NASA Technical Reports Server (NTRS)

    Box, Elgene O.

    1988-01-01

    In 1985 improvements were made in the world climatic data base for modeling and predictive mapping; in individual process models and the overall carbon-balance models; and in the interface software for mapping the simulation results. Statistical analysis of the data base was begun. In 1986 mapping was shifted to NASA-Goddard. The initial approach involving pattern comparisons was modified to a more statistical approach. A major accomplishment was the expansion and improvement of a global data base of measurements of biomass and primary production, to complement the simulation data. The main accomplishments during 1987 included: production of a master tape with all environmental and satellite data and model results for the 1600 sites; development of a complete mapping system used for the initial color maps comparing annual and monthly patterns of Normalized Difference Vegetation Index (NDVI), actual evapotranspiration, net primary productivity, gross primary productivity, and net ecosystem production; collection of more biosphere measurements for eventual improvement of the biological models; and development of some initial monthly models for primary productivity, based on satellite data.

  17. Creating "Intelligent" Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, Noel; Taylor, Patrick

    2014-05-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is used to add value to individual model projections and construct a consensus projection. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, individual models reproduce certain climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequally weighting multi-model ensembles. The intention is to produce improved ("intelligent") unequal-weight ensemble averages. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables, e.g. outgoing longwave radiation and surface temperature. Several climate process metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument in combination with surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing the equal-weighted ensemble average and an ensemble weighted using the process-based metric.
Additionally, this study investigates the dependence of the metric weighting scheme on the climate state using a combination of model simulations including a non-forced preindustrial control experiment, historical simulations, and several radiative forcing Representative Concentration Pathway (RCP) scenarios. Ultimately, the goal of the framework is to advise better methods for ensemble averaging models and create better climate predictions.
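
    The unequal-weighting idea can be sketched as follows; the exponential down-weighting by process-metric error is an illustrative choice, not the metric or weighting scheme used in the study.

```python
import math

def weighted_ensemble(projections, metric_errors, tau=1.0):
    """Unequal-weight ensemble average: models with smaller process-metric
    error receive exponentially larger weight (illustrative scheme)."""
    w = [math.exp(-e / tau) for e in metric_errors]
    s = sum(w)
    w = [x / s for x in w]                  # normalise to a convex combination
    ensemble = [sum(wi * p[i] for wi, p in zip(w, projections))
                for i in range(len(projections[0]))]
    return ensemble, w
```

    With two models and metric errors 0 and 2, the weighted average is pulled toward the better-performing model rather than sitting at the equal-weight mean.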

  18. Smoking cessation treatment and outcomes patterns simulation: a new framework for evaluating the potential health and economic impact of smoking cessation interventions.

    PubMed

    Getsios, Denis; Marton, Jenő P; Revankar, Nikhil; Ward, Alexandra J; Willke, Richard J; Rublee, Dale; Ishak, K Jack; Xenakis, James G

    2013-09-01

    Most existing models of smoking cessation treatments have considered a single quit attempt when modelling long-term outcomes. To develop a model to simulate smokers over their lifetimes accounting for multiple quit attempts and relapses which will allow for prediction of the long-term health and economic impact of smoking cessation strategies. A discrete event simulation (DES) that models individuals' life course of smoking behaviours, attempts to quit, and the cumulative impact on health and economic outcomes was developed. Each individual is assigned one of the available strategies used to support each quit attempt; the outcome of each attempt, time to relapses if abstinence is achieved, and time between quit attempts is tracked. Based on each individual's smoking or abstinence patterns, the risk of developing diseases associated with smoking (chronic obstructive pulmonary disease, lung cancer, myocardial infarction and stroke) is determined and the corresponding costs, changes to mortality, and quality of life assigned. Direct costs are assessed from the perspective of a comprehensive US healthcare payer ($US, 2012 values). Quit attempt strategies that can be evaluated in the current simulation include unassisted quit attempts, brief counselling, behavioural modification therapy, nicotine replacement therapy, bupropion, and varenicline, with the selection of strategies and time between quit attempts based on equations derived from survey data. Equations predicting the success of quit attempts as well as the short-term probability of relapse were derived from five varenicline clinical trials. Concordance between the five trials and predictions from the simulation on abstinence at 12 months was high, indicating that the equations predicting success and relapse in the first year following a quit attempt were reliable. 
Predictions allowing for only a single quit attempt versus unrestricted attempts demonstrate important differences, with the single quit attempt simulation predicting 19% more smoking-related diseases and 10% higher costs associated with smoking-related diseases. Differences are most prominent in predictions of the time that individuals abstain from smoking: 13.2 years on average over a lifetime allowing for multiple quit attempts, versus only 1.2 years with single quit attempts. Differences in abstinence time estimates become substantial only 5 years into the simulation. In the multiple quit attempt simulations, younger individuals survived longer, yet had lower lifetime smoking-related disease and total costs, while the opposite was true for those with high levels of nicotine dependence. By allowing for multiple quit attempts over the course of individuals' lives, the simulation can provide more reliable estimates on the health and economic impact of interventions designed to increase abstinence from smoking. Furthermore, the individual nature of the simulation allows for evaluation of outcomes in populations with different baseline profiles. DES provides a framework for comprehensive and appropriate predictions when applied to smoking cessation over smoker lifetimes.
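
    The multiple-quit-attempt mechanism can be sketched as a simple discrete-event timeline per smoker; all rates and probabilities below are illustrative placeholders, not the equations derived from the trial data.

```python
import random

def simulate_smoker(years=50, p_quit_success=0.15, mean_gap=1.5,
                    mean_abstinence=3.0, p_permanent=0.1, seed=1, multiple=True):
    """Toy discrete-event timeline of quit attempts and relapses.

    Returns total years abstinent over the simulated horizon. Parameter
    values are illustrative, not taken from the study.
    """
    rng = random.Random(seed)
    t, abstinent_years = 0.0, 0.0
    attempts = 0
    while t < years:
        t += rng.expovariate(1.0 / mean_gap)        # time to next quit attempt
        if t >= years:
            break
        attempts += 1
        if rng.random() < p_quit_success:
            if rng.random() < p_permanent:          # quits for good
                abstinent_years += years - t
                break
            dur = rng.expovariate(1.0 / mean_abstinence)  # relapse after a while
            abstinent_years += min(dur, years - t)
            t += dur
        if not multiple and attempts >= 1:          # single-attempt restriction
            break
    return abstinent_years
```

    Averaging over many simulated individuals, allowing repeated attempts yields far more lifetime abstinence than the single-attempt restriction, which is the qualitative effect the abstract reports.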

  19. Towards oscillations-based simulation of social systems: a neurodynamic approach

    NASA Astrophysics Data System (ADS)

    Plikynas, Darius; Basinskas, Gytis; Laukaitis, Algirdas

    2015-04-01

    This multidisciplinary work presents a synopsis of theories in the search for common field-like fundamental principles of self-organisation and communication existing on quantum, cellular, and even social levels. Based on these fundamental principles, we formulate a conceptually novel social neuroscience paradigm (OSIMAS), which envisages social systems emerging from the coherent neurodynamical processes taking place in the individual mind-fields. In this way, societies are understood as global processes emerging from the superposition of the conscious and subconscious mind-fields of individual members of society. For the experimental validation of the biologically inspired OSIMAS paradigm, we have designed a framework of EEG-based experiments. Initial baseline individual tests of spectral cross-correlations of EEG-recorded brainwave patterns for some mental states are provided in this paper. Preliminary experimental results do not refute the main OSIMAS postulates. This paper also provides some insights for the construction of OSIMAS-based simulation models.

  20. Evaluation of Inventory Reduction Strategies: Balad Air Base Case Study

    DTIC Science & Technology

    2012-03-01

    … produced by conducting individual simulations using a unique random seed generated by the default AnyLogic® random number generator. … develops an agent-based simulation model of the sustainment supply chain supporting Balad AB during its closure using the software AnyLogic®. … The goal of USAF Stockage Policy is to maximize customer support while minimizing inventory costs (DAF, 2011:1). USAF stocking decisions …

  1. On the implications of the classical ergodic theorems: analysis of developmental processes has to focus on intra-individual variation.

    PubMed

    Molenaar, Peter C M

    2008-01-01

    It is argued that general mathematical-statistical theorems imply that standard statistical analysis techniques of inter-individual variation are invalid to investigate developmental processes. Developmental processes have to be analyzed at the level of individual subjects, using time series data characterizing the patterns of intra-individual variation. It is shown that standard statistical techniques based on the analysis of inter-individual variation appear to be insensitive to the presence of arbitrarily large degrees of inter-individual heterogeneity in the population. An important class of nonlinear epigenetic models of neural growth is described which can explain the occurrence of such heterogeneity in brain structures and behavior. Links with models of developmental instability are discussed. A simulation study based on a chaotic growth model illustrates the invalidity of standard analysis of inter-individual variation, whereas time series analysis of intra-individual variation is able to recover the true state of affairs. (c) 2007 Wiley Periodicals, Inc.

  2. Modeling Collaborative Interaction Patterns in a Simulation-Based Task

    ERIC Educational Resources Information Center

    Andrews, Jessica J.; Kerr, Deirdre; Mislevy, Robert J.; von Davier, Alina; Hao, Jiangang; Liu, Lei

    2017-01-01

    Simulations and games offer interactive tasks that can elicit rich data, providing evidence of complex skills that are difficult to measure with more conventional items and tests. However, one notable challenge in using such technologies is making sense of the data generated in order to make claims about individuals or groups. This article…

  3. HexSim: A flexible simulation model for forecasting wildlife responses to multiple interacting stressors - ESRP Meeting

    EPA Science Inventory

    With SERDP funding, we have improved upon a popular life history simulator (PATCH), and in doing so produced a powerful new forecasting tool (HexSim). PATCH, our starting point, was spatially explicit and individual-based, and was useful for evaluating a range of terrestrial life...

  4. Modeling climate change impacts on maize growth with the focus on plant internal water transport

    NASA Astrophysics Data System (ADS)

    Heinlein, Florian; Biernath, Christian; Klein, Christian; Thieme, Christoph; Priesack, Eckart

    2015-04-01

    Based on climate change experiments in chambers and on field measurements, the scientific community expects regional and global changes of crop biomass production and yields. In central Europe, one major aspect of climate change is the shift of precipitation towards winter months and the increase of extreme events, e.g. heat stress and heavy precipitation, during the main growing season in summer. To understand water uptake, water use, and transpiration rates of plants, numerous crop models were developed. We tested the ability of two existing canopy models (CERES-Maize and SPASS), embedded in the model environment Expert-N 5.0, to simulate the water balance, water use efficiency, and crop growth. Additionally, sap flow was measured using heat-ratio measurement devices at the stem base of individual plants. The models were tested against data on soil water contents, as well as on evaporation and transpiration rates of maize plants, which were grown on lysimeters at Helmholtz Zentrum München and in the field at the research station Scheyern, Germany, in summer 2013 and 2014. We present the simulation results and discuss observed shortcomings of the models. CERES-Maize and SPASS could simulate the measured dynamics of xylem sap flow. However, these models oversimplify plant water transport and thus cannot explain the underlying mechanisms. To overcome these shortcomings, we therefore propose a new model based on two coupled 1-D Richards equations, describing explicitly the plant and soil water transport. This model, which has previously been applied successfully to simulate the water flux of 94 individual beech trees in an old-growth forest, will lead to a more mechanistic representation of the soil-plant water-flow continuum. This xylem water flux model was implemented into the crop model SPASS and adjusted to simulate the water flux of single maize plants. The modified version is presented and explained.
    Basic model input requirements are the plant above- and below-ground architectures. Shoot architectures were derived from terrestrial laser scanning. Root architectures of maize plants were generated using a simple L-system. Preliminary results will be presented together with simulation results from CERES-Maize and SPASS.
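
    The coupled-transport idea can be illustrated with an explicit finite-difference step of a heavily simplified, linearized 1-D Richards-type equation; the real model uses nonlinear hydraulic functions, and the diffusivity, grid, and sink term here are illustrative.

```python
def step_water_potential(psi, D=0.1, dz=1.0, dt=0.1, sink=None):
    """One explicit step of d(psi)/dt = D d2(psi)/dz2 - S(z), a simplified
    stand-in for one of the coupled soil/xylem transport equations.
    No-flux boundaries; the sink S models root water uptake (illustrative)."""
    n = len(psi)
    sink = sink or [0.0] * n
    new = psi[:]
    for i in range(n):
        left = psi[i - 1] if i > 0 else psi[i]       # reflective boundary
        right = psi[i + 1] if i < n - 1 else psi[i]  # reflective boundary
        new[i] = psi[i] + dt * (D * (left - 2 * psi[i] + right) / dz**2 - sink[i])
    return new
```

    With no sink and no-flux boundaries the scheme conserves total water potential, and adding a root-uptake sink drains the profile, mirroring the soil-to-xylem coupling in the full model.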

  5. Computational Modeling of Inflammation and Wound Healing

    PubMed Central

    Ziraldo, Cordelia; Mi, Qi; An, Gary; Vodovotz, Yoram

    2013-01-01

    Objective Inflammation is both central to proper wound healing and a key driver of chronic tissue injury via a positive-feedback loop incited by incidental cell damage. We seek to derive actionable insights into the role of inflammation in wound healing in order to improve outcomes for individual patients. Approach To date, dynamic computational models have been used to study the time evolution of inflammation in wound healing. Emerging clinical data on histopathological and macroscopic images of evolving wounds, as well as noninvasive measures of blood flow, suggested the need for tissue-realistic, agent-based, and hybrid mechanistic computational simulations of inflammation and wound healing. Innovation We developed a computational modeling system, Simple Platform for Agent-based Representation of Knowledge, to facilitate the construction of tissue-realistic models. Results A hybrid equation–agent-based model (ABM) of pressure ulcer formation in both spinal cord-injured and -uninjured patients was used to identify control points that reduce stress caused by tissue ischemia/reperfusion. An ABM of arterial restenosis revealed new dynamics of cell migration during neointimal hyperplasia that match histological features, but contradict the currently prevailing mechanistic hypothesis. ABMs of vocal fold inflammation were used to predict inflammatory trajectories in individuals, possibly allowing for personalized treatment. Conclusions The intertwined inflammatory and wound healing responses can be modeled computationally to make predictions in individuals, simulate therapies, and gain mechanistic insights. PMID:24527362

  6. No correlation discerned between the periods of rise and dominance of simulated species in a model of biological evolution

    NASA Astrophysics Data System (ADS)

    Kuhnle, Alan

    2009-11-01

    In [1], Liow et al. discern a general feature of the occurrence trajectories of biological species: the periods of rise and fall of a typical species are about as long as the period of dominance. In this work, an individual-based model of biological evolution developed by Rikvold and Zia [2] is investigated, but no analogous feature is observed in the simulated species populations. Instead, the periods of rise and fall of a simulated species cannot always be sensibly defined; when it does make sense to define these quantities, they are quite short and independent of the period of dominance. [1] Liow, L. H., Skaug, H. J., Ergon, T., Schweder, T.: Global occurrence trajectories of microfossils: Is the rise and persistence of species influenced by environmental volatility? Manuscript for Paleobiology, 5 Dec 2008. [2] Rikvold, P. A., Zia, R. K. P.: Punctuated equilibria and 1/f noise in a biological coevolution model with individual-based dynamics. Physical Review E 68, 031913 (2003).

  7. Seldon v.3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Nina; Ko, Teresa; Shneider, Max

    Seldon is an agent-based social simulation framework that uniquely integrates concepts from a variety of different research areas, including psychology, social science, and agent-based modeling. Development has been taking place for a number of years, previously focusing on gang and terrorist recruitment. The toolkit consists of simple agents (individuals) and abstract agents (groups of individuals representing social/institutional concepts) that interact according to exchangeable rule sets (e.g., linear attraction, linear reinforcement). Each agent has a set of customizable attributes that get modified during the interactions. Interactions create relationships between agents, and each agent has a maximum amount of relationship energy that it can expend. As relationships evolve, they form multiple levels of social networks (i.e., acquaintances, friends, cliques) that in turn drive future interactions. Agents can also interact randomly if they are not connected through a network, mimicking the chance interactions that real people have in everyday life. We are currently integrating Seldon with the cognitive framework (also developed at Sandia). Each individual agent has a lightweight cognitive model that is created automatically from textual sources. Cognitive information is exchanged during interactions and can also be injected into a running simulation. The entire framework has been parallelized to allow for larger simulations in an HPC environment. We have also added more detail to the agents themselves (a "Big Five" personality model) and their interactions (an enhanced relationship model) for a more realistic representation.

  8. Agent-based modeling of the spread of influenza-like illness in an emergency department: a simulation study.

    PubMed

    Laskowski, Marek; Demianyk, Bryan C P; Witt, Julia; Mukhi, Shamir N; Friesen, Marcia R; McLeod, Robert D

    2011-11-01

    The objective of this paper was to develop an agent-based modeling framework in order to simulate the spread of influenza virus infection on a layout based on a representative hospital emergency department in Winnipeg, Canada. In doing so, the study complements mathematical modeling techniques for disease spread, as well as modeling applications focused on the spread of antibiotic-resistant nosocomial infections in hospitals. Twenty different emergency department scenarios were simulated, with further simulation of four infection control strategies. The agent-based modeling approach represents systems modeling, in which the emergency department was modeled as a collection of agents (patients and healthcare workers) and their individual characteristics, behaviors, and interactions. The framework was coded in C++ using Qt4 libraries running under the Linux operating system. A simple ordinary least squares (OLS) regression was used to analyze the data, in which the percentage of patients that became infected in one day within the simulation was the dependent variable. The results suggest that within the given instance context, patient-oriented infection control policies (alternate treatment streams, masking symptomatic patients) tend to have a larger effect than policies that target healthcare workers. The agent-based modeling framework is a flexible tool that can be made to reflect any given environment; it is also a decision support tool for practitioners and policymakers to assess the relative impact of infection control strategies. The framework illuminates scenarios worthy of further investigation, as well as counterintuitive findings.
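
    The core contact-and-transmission loop of such an emergency-department ABM can be sketched as follows (in Python rather than the authors' C++/Qt4 framework); agent counts, the random-contact structure, transmission probabilities, and the masking effect size are all illustrative.

```python
import random

def simulate_ed(n_patients=50, n_hcw=10, steps=30, beta=0.05,
                mask_factor=0.5, mask_symptomatic=False, seed=0):
    """Toy contact-based spread of an influenza-like illness in an ED.

    Agents are indices; indices below n_patients are patients, the rest
    are healthcare workers. One index patient arrives infectious. Returns
    the fraction of initially susceptible patients infected by day's end.
    """
    rng = random.Random(seed)
    n = n_patients + n_hcw
    infected = [False] * n
    infected[0] = True                      # index case: a symptomatic patient
    for _ in range(steps):
        newly = []
        for _ in range(n):                  # random pairwise contacts
            a, b = rng.randrange(n), rng.randrange(n)
            if a == b or infected[a] == infected[b]:
                continue
            src = a if infected[a] else b
            p = beta
            # policy: masking symptomatic patients damps transmission from them
            if mask_symptomatic and src < n_patients:
                p *= mask_factor
            if rng.random() < p:
                newly.append(b if infected[a] else a)
        for i in newly:
            infected[i] = True
    return sum(infected[1:n_patients]) / (n_patients - 1)
```

    Averaging attack rates over many seeds reproduces the qualitative finding that a patient-oriented policy (masking symptomatic patients) lowers the fraction of patients infected per day.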

  9. Sentence Recognition Prediction for Hearing-impaired Listeners in Stationary and Fluctuation Noise With FADE: Empowering the Attenuation and Distortion Concept by Plomp With a Quantitative Processing Model.

    PubMed

    Kollmeier, Birger; Schädler, Marc René; Warzybok, Anna; Meyer, Bernd T; Brand, Thomas

    2016-09-07

    To characterize the individual patient's hearing impairment as obtained with the matrix sentence recognition test, a simulation Framework for Auditory Discrimination Experiments (FADE) is extended here using the Attenuation and Distortion (A+D) approach by Plomp as a blueprint for setting the individual processing parameters. FADE has been shown to predict the outcome of both speech recognition tests and psychoacoustic experiments based on simulations using an automatic speech recognition system, requiring only a few assumptions. It builds on the closed-set matrix sentence recognition test, which is advantageous for testing individual speech recognition in a way comparable across languages. Individual predictions of speech recognition thresholds in stationary and in fluctuating noise were derived using the audiogram and an estimate of the internal level uncertainty for modeling the individual Plomp curves, fitted to the data with the Attenuation (A-) and Distortion (D-) parameters of the Plomp approach. The "typical" audiogram shapes from Bisgaard et al., with or without a "typical" level uncertainty, and the individual data were used for individual predictions. As a result, the individualization of the level uncertainty was found to be more important than the exact shape of the individual audiogram for accurately modeling the outcome of the German Matrix test in stationary or fluctuating noise for listeners with hearing impairment. The prediction accuracy of the individualized approach also outperforms the (modified) Speech Intelligibility Index approach, which is based on the individual threshold data only. © The Author(s) 2016.

  10. Navigating the flow: individual and continuum models for homing in flowing environments

    PubMed Central

    Painter, Kevin J.; Hillen, Thomas

    2015-01-01

    Navigation for aquatic and airborne species often takes place in the face of complicated flows, from persistent currents to highly unpredictable storms. Hydrodynamic models are capable of simulating flow dynamics and provide the impetus for much individual-based modelling, in which particle-sized individuals are immersed into a flowing medium. These models yield insights on the impact of currents on population distributions from fish eggs to large organisms, yet their computational demands and intractability reduce their capacity to generate the broader, less parameter-specific, insights allowed by traditional continuous approaches. In this paper, we formulate an individual-based model for navigation within a flowing field and apply scaling to derive its corresponding macroscopic and continuous model. We apply it to various movement classes, from drifters that simply go with the flow to navigators that respond to environmental orienteering cues. The utility of the model is demonstrated via its application to ‘homing’ problems and, in particular, the navigation of the marine green turtle Chelonia mydas to Ascension Island. PMID:26538557
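
    The individual-based side of such a homing model can be sketched as a biased random walk advected by a flow: the navigator heads toward an orienteering target with angular noise while the current displaces it. Target location, flow field, speeds, and noise level below are illustrative, not parameters of the turtle application.

```python
import math, random

def home_in_flow(target=(100.0, 0.0), flow=(0.0, -1.0), speed=2.0,
                 noise=0.3, dt=0.1, max_steps=5000, seed=0):
    """Individual-level homing in a uniform flow (illustrative parameters).

    Returns the time at which the individual reaches the target, or None
    if it fails to home within the simulated horizon.
    """
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    for step in range(max_steps):
        # orienteering cue: head toward the target, with angular noise
        theta = math.atan2(target[1] - y, target[0] - x) + rng.gauss(0, noise)
        x += (speed * math.cos(theta) + flow[0]) * dt
        y += (speed * math.sin(theta) + flow[1]) * dt
        if math.hypot(target[0] - x, target[1] - y) < 1.0:
            return step * dt                # time to reach the target
    return None                             # failed to home
```

    Because the swimming speed exceeds the current, this navigator converges on the target despite the cross-flow; a pure drifter (speed 0) would simply be advected away, which is the contrast between the paper's movement classes.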

  11. Truncated Lévy flights and agenda-based mobility are useful for the assessment of personal human exposure.

    PubMed

    Schlink, Uwe; Ragas, Ad M J

    2011-01-01

    Receptor-oriented approaches can assess individual-specific exposure to air pollution. In such an individual-based model, we analyse the impact of human mobility on the personal exposure perceived by individuals simulated in an example urban area. The mobility models comprise random walk (reference point mobility, RPM), truncated Lévy flights (TLF), and agenda-based walk (RPMA). We describe and review the general concepts and provide an inter-comparison of these concepts. Stationary and ergodic behaviour are explained and applied, as well as performance criteria for a comparative evaluation of the investigated algorithms. We find that none of the studied algorithms produces purely random trajectories. TLF and RPMA prove to be suitable for human mobility modelling because they provide conditions for highly individual-specific trajectories and exposure. Using these models, we demonstrate the plausibility of their results for exposure to air-borne benzene and the combined exposure to benzene and nonane. Copyright © 2011 Elsevier Ltd. All rights reserved.
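
    Truncated Lévy flights can be sketched via inverse-transform sampling of a power-law step-length distribution truncated at an upper bound, with a uniform random heading per step; the exponent and truncation bounds below are illustrative.

```python
import math, random

def tlf_steps(n, alpha=1.6, l_min=1.0, l_max=100.0, seed=0):
    """Sample n step lengths from a truncated power law p(l) ~ l^(-alpha)
    on [l_min, l_max] by inverse-transform sampling (illustrative values)."""
    rng = random.Random(seed)
    a = 1.0 - alpha
    lo, hi = l_min ** a, l_max ** a
    return [(lo + rng.random() * (hi - lo)) ** (1.0 / a) for _ in range(n)]

def tlf_trajectory(n, **kw):
    """2-D walk: each step has a TLF length and a uniform random heading."""
    rng = random.Random(kw.get("seed", 0))
    x = y = 0.0
    path = [(x, y)]
    for l in tlf_steps(n, **kw):
        theta = rng.uniform(0, 2 * math.pi)
        x += l * math.cos(theta)
        y += l * math.sin(theta)
        path.append((x, y))
    return path
```

    The truncation at l_max is what distinguishes TLF from a pure Lévy flight: occasional long relocations occur, but unrealistically large jumps are excluded, which suits human mobility.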

  12. Pursuing the method of multiple working hypotheses to understand differences in process-based snow models

    NASA Astrophysics Data System (ADS)

    Clark, Martyn; Essery, Richard

    2017-04-01

    When faced with the complex and interdisciplinary challenge of building process-based land models, different modelers make different decisions at different points in the model development process. These modeling decisions are generally based on several considerations, including fidelity (e.g., what approaches faithfully simulate observed processes), complexity (e.g., which processes should be represented explicitly), practicality (e.g., what is the computational cost of the model simulations; are there sufficient resources to implement the desired modeling concepts), and data availability (e.g., is there sufficient data to force and evaluate models). Consequently the research community, comprising modelers of diverse background, experience, and modeling philosophy, has amassed a wide range of models, which differ in almost every aspect of their conceptualization and implementation. Model comparison studies have been undertaken to explore model differences, but have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the different models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than on a systematic analysis of model shortcomings. This presentation will summarize the use of "multiple-hypothesis" modeling frameworks to understand differences in process-based snow models. Multiple-hypothesis frameworks define a master modeling template, and include a wide variety of process parameterizations and spatial configurations that are used in existing models.
Such frameworks provide the capability to decompose complex models into the individual decisions that are made as part of model development, and evaluate each decision in isolation. It is hence possible to attribute differences in system-scale model predictions to individual modeling decisions, providing scope to mimic the behavior of existing models, understand why models differ, characterize model uncertainty, and identify productive pathways to model improvement. Results will be presented applying multiple hypothesis frameworks to snow model comparison projects, including PILPS, SnowMIP, and the upcoming ESM-SnowMIP project.

  13. Modeling individualized coefficient alpha to measure quality of test score data.

    PubMed

    Liu, Molei; Hu, Ming; Zhou, Xiao-Hua

    2018-05-23

    Individualized coefficient alpha is defined. It is item and subject specific and is used to measure the quality of test score data with heterogeneity among the subjects and items. A regression model is developed based on 3 sets of generalized estimating equations. The first set of generalized estimating equations models the expectation of the responses, the second set models the responses' variance, and the third set is proposed to estimate the individualized coefficient alpha, which measures the individualized internal consistency of the responses. We also use different techniques to extend our method to handle missing data. Asymptotic properties of the estimators are discussed, based on which inference on the coefficient alpha is derived. Performance of our method is evaluated through a simulation study and real data analysis. The real data application is from a health literacy study in Hunan province of China. Copyright © 2018 John Wiley & Sons, Ltd.
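
    For background, the classical (non-individualized) coefficient alpha that this paper generalizes can be computed directly from a subjects-by-items score matrix. The sketch below implements only the standard formula, not the GEE-based individualized estimator described in the abstract.

```python
import numpy as np

def cronbach_alpha(scores):
    """Classical coefficient alpha for a subjects-by-items score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_var = scores.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1.0 - item_var / total_var)
```

    Perfectly consistent items give alpha = 1; the paper's extension makes this quantity item- and subject-specific via the three sets of estimating equations.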

  14. Individual-based modelling of population growth and diffusion in discrete time.

    PubMed

    Tkachenko, Natalie; Weissmann, John D; Petersen, Wesley P; Lake, George; Zollikofer, Christoph P E; Callegari, Simone

    2017-01-01

    Individual-based models (IBMs) of human populations capture spatio-temporal dynamics using rules that govern the birth, behavior, and death of individuals. We explore a stochastic IBM of logistic growth-diffusion with constant time steps and independent, simultaneous actions of birth, death, and movement that approaches the Fisher-Kolmogorov model in the continuum limit. This model is well-suited to parallelization on high-performance computers. We explore its emergent properties with analytical approximations and numerical simulations in parameter ranges relevant to human population dynamics and ecology, and reproduce continuous-time results in the limit of small transition probabilities. Our model predicts that population density and dispersal speed are affected by fluctuations in the number of individuals. The discrete-time model displays novel properties owing to the binomial character of the fluctuations: in certain regimes of the growth model, a decrease in time step size drives the system away from the continuum limit. These effects are especially important at local population sizes of <50 individuals, which largely correspond to group sizes of hunter-gatherers. As an application scenario, we model the late Pleistocene dispersal of Homo sapiens into the Americas, and discuss the agreement of model-based estimates of first-arrival dates with archaeological dates as a function of IBM parameter settings.
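
    The growth-diffusion rule described above can be sketched as follows: a minimal illustration with binomial birth, death, and movement on a 1-D lattice. Parameter values and boundary handling (absorbing edges) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def ibm_step(n, r=0.1, K=100.0, m=0.2, dt=1.0, rng=rng):
    """One discrete time step; n is an integer array of individuals per site."""
    # Binomial births and deaths; per-capita probabilities clipped to [0, 1].
    births = rng.binomial(n, min(max(r * dt, 0.0), 1.0))
    deaths = rng.binomial(n, np.clip(r * dt * n / K, 0.0, 1.0))
    n = n + births - deaths
    # Movement: each survivor hops to a random neighbor with probability m.
    movers = rng.binomial(n, m)
    left = rng.binomial(movers, 0.5)
    right = movers - left
    n = n - movers
    n[:-1] += left[1:]    # hops to the left neighbor
    n[1:] += right[:-1]   # hops to the right neighbor (edges absorb)
    return n

sites = np.zeros(21, dtype=np.int64)
sites[10] = 10  # ten founders at the central site
for _ in range(50):
    sites = ibm_step(sites)
```

    The binomial draws are the source of the small-population fluctuation effects highlighted in the abstract: with fewer than ~50 individuals per site, the variance of these draws dominates the dynamics.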

  15. Modeling a secular trend by Monte Carlo simulation of height biased migration in a spatial network.

    PubMed

    Groth, Detlef

    2017-04-01

    Background: In a recent Monte Carlo simulation, the clustering of body height of Swiss military conscripts within a spatial network with characteristic features of the natural Swiss geography was investigated. In this study I examined the effect of migration of tall individuals into network hubs on the dynamics of body height within the whole spatial network. The aim of this study was to simulate height trends. Material and methods: Three networks were used for modeling: a regular, rectangular, fishing-net-like network; a real-world example based on the geographic map of Switzerland; and a random network. All networks contained between 144 and 148 districts and between 265 and 307 road connections. Around 100,000 agents were initially released with an average height of 170 cm and a height standard deviation of 6.5 cm. The simulation was started with the a priori assumption that height variation within a district is limited and also depends on the height of neighboring districts (community effect on height). In addition to a neighborhood influence factor, which simulates a community effect, body-height-dependent migration of conscripts between adjacent districts was used in each Monte Carlo iteration to re-calculate next-generation body heights. Network hubs were defined by the importance of a district within the spatial network, evaluated with various centrality measures. Taller individuals were favored to migrate into these hubs; backward migration of the same number of individuals was random and not biased towards body height. In the null model there were no road connections, so height information could not be exchanged between districts. 
Results: Due to the favored migration of tall individuals into network hubs, the average body height of the hubs, and later of the whole network, increased by up to 0.1 cm per iteration, depending on the network model. The general increase in height within the network depended on connectedness and on the amount of height information exchanged between neighboring districts: the more neighborhood height information was exchanged, the larger the general increase in height within the network (strong secular trend). The trend was lowest in the homogeneous fishing-net-like network and highest in the random network. Yet some network properties, such as the heteroscedasticity and autocorrelations of the migration simulation models, differed greatly from the natural features observed in Swiss military conscript networks. Autocorrelations of district heights, for instance, were much higher in the migration models. Conclusion: This study confirmed that secular height trends can be modeled by preferred migration of tall individuals into network hubs. However, basic network properties of the migration simulation models differed greatly from the natural features observed in Swiss military conscripts. Similar network-based data from other countries should be explored to further investigate height trends with the Monte Carlo migration approach.
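
    The migration mechanism can be sketched on a toy network. All district names, population sizes, migrant counts, and rates below are hypothetical, and node degree stands in for the various centrality measures used in the study.

```python
import random

random.seed(1)

graph = {  # adjacency list of a small hypothetical district network
    "A": ["B", "C"],
    "B": ["A", "C", "D"],
    "C": ["A", "B", "D"],
    "D": ["B", "C"],
}
heights = {d: [random.gauss(170.0, 6.5) for _ in range(100)] for d in graph}

def migrate_tall(graph, heights, n_migrants=5):
    """Tallest agents move to the best-connected neighbor (the 'hub');
    the same number of randomly chosen agents migrate back."""
    for district, nbrs in graph.items():
        hub = max(nbrs, key=lambda x: len(graph[x]))  # highest-degree neighbor
        heights[district].sort()
        tall = [heights[district].pop() for _ in range(n_migrants)]
        heights[hub].extend(tall)  # height-biased migration into the hub
        for _ in range(n_migrants):  # unbiased backward migration
            i = random.randrange(len(heights[hub]))
            heights[district].append(heights[hub].pop(i))

for _ in range(10):
    migrate_tall(graph, heights)

mean_height = {d: sum(h) / len(h) for d, h in heights.items()}
```

    After a few iterations the best-connected district accumulates height, reproducing the hub-driven trend described in the results.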

  16. Predictive Finite Rate Model for Oxygen-Carbon Interactions at High Temperature

    NASA Astrophysics Data System (ADS)

    Poovathingal, Savio

    An oxidation model for carbon surfaces is developed to predict ablation rates for carbon heat shields used in hypersonic vehicles. Unlike existing empirical models, the approach used here was to probe gas-surface interactions individually and then based on an understanding of the relevant fundamental processes, build a predictive model that would be accurate over a wide range of pressures and temperatures, and even microstructures. Initially, molecular dynamics was used to understand the oxidation processes on the surface. The molecular dynamics simulations were compared to molecular beam experiments and good qualitative agreement was observed. The simulations reproduced cylindrical pitting observed in the experiments where oxidation was rapid and primarily occurred around a defect. However, the studies were limited to small systems at low temperatures and could simulate time scales only of the order of nanoseconds. Molecular beam experiments at high surface temperature indicated that a majority of surface reaction products were produced through thermal mechanisms. Since the reactions were thermal, they occurred over long time scales which were computationally prohibitive for molecular dynamics to simulate. The experiments provided detailed dynamical data on the scattering of O, O2, CO, and CO2 and it was found that the data from molecular beam experiments could be used directly to build a model. The data was initially used to deduce surface reaction probabilities at 800 K. The reaction probabilities were then incorporated into the direct simulation Monte Carlo (DSMC) method. Simulations were performed where the microstructure was resolved and dissociated oxygen convected and diffused towards it. For a gas-surface temperature of 800 K, it was found that despite CO being the dominant surface reaction product, a gas-phase reaction forms significant CO2 within the microstructure region. 
It was also found that surface area did not play any role in the concentration of reaction products because the reaction probabilities were in the diffusion-dominant regime. The molecular beam data at different surface temperatures was then used to build a finite-rate model. Each reaction mechanism and all rate parameters of the new model were determined individually based on the molecular beam data. Despite the experiments being performed at near-vacuum conditions, the finite-rate model developed using the data could be used at pressures and temperatures relevant to hypersonic conditions. The new model was implemented in a computational fluid dynamics (CFD) solver and flow over a hypersonic vehicle was simulated. The new model predicted similar overall mass loss rates compared to existing models; however, the individual species production rates were completely different. The most notable difference was that the new model (based on molecular beam data) predicts CO as the oxidation reaction product with virtually no CO2 production, whereas existing models predict the exact opposite trend. CO being the dominant oxidation product is consistent with recent high-enthalpy wind tunnel experiments. The discovery that measurements taken in molecular beam facilities are able to determine individual reaction mechanisms, including dependence on surface coverage, opens up an entirely new way of constructing ablation models.

  17. From individual to population level effects of toxicants in the tubicifid Branchiura sowerbyi using threshold effect models in a Bayesian framework.

    PubMed

    Ducrot, Virginie; Billoir, Elise; Péry, Alexandre R R; Garric, Jeanne; Charles, Sandrine

    2010-05-01

    Effects of zinc were studied in the freshwater worm Branchiura sowerbyi using partial and full life-cycle tests. Only newborn and juveniles were sensitive to zinc, displaying effects on survival, growth, and age at first brood at environmentally relevant concentrations. Threshold effect models were proposed to assess toxic effects on individuals. They were fitted to life-cycle test data using Bayesian inference and adequately described life-history trait data in exposed organisms. The daily asymptotic growth rate of theoretical populations was then simulated with a matrix population model, based upon individual-level outputs. Population-level outputs were in accordance with existing literature for controls. Working in a Bayesian framework allowed incorporating parameter uncertainty in the simulation of the population-level response to zinc exposure, thus increasing the relevance of test results in the context of ecological risk assessment.
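
    The population-level step described above, simulating asymptotic growth with a matrix population model built from individual-level outputs, can be sketched as follows. The stage structure and vital rates here are illustrative, not the fitted B. sowerbyi values; the asymptotic growth rate is the dominant eigenvalue of the projection matrix.

```python
import numpy as np

# Illustrative stage-structured projection matrix (newborn, juvenile, adult).
L = np.array([
    [0.0, 0.0, 2.0],   # per-capita fecundity of adults
    [0.5, 0.0, 0.0],   # newborn -> juvenile survival
    [0.0, 0.8, 0.9],   # juvenile -> adult survival; adult survival
])

# Asymptotic growth rate = dominant eigenvalue of the projection matrix.
lam = max(abs(np.linalg.eigvals(L)))
```

    In the Bayesian setting of the paper, parameter uncertainty would be propagated by sampling vital rates from their posteriors and recomputing the eigenvalue for each draw.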

  18. Agent-Based Modeling in Systems Pharmacology.

    PubMed

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling.

  19. Modelling disease outbreaks in realistic urban social networks

    NASA Astrophysics Data System (ADS)

    Eubank, Stephen; Guclu, Hasan; Anil Kumar, V. S.; Marathe, Madhav V.; Srinivasan, Aravind; Toroczkai, Zoltán; Wang, Nan

    2004-05-01

    Most mathematical models for the spread of disease use differential equations based on uniform mixing assumptions or ad hoc models for the contact process. Here we explore the use of dynamic bipartite graphs to model the physical contact patterns that result from movements of individuals between specific locations. The graphs are generated by large-scale individual-based urban traffic simulations built on actual census, land-use and population-mobility data. We find that the contact network among people is a strongly connected small-world-like graph with a well-defined scale for the degree distribution. However, the locations graph is scale-free, which allows highly efficient outbreak detection by placing sensors in the hubs of the locations network. Within this large-scale simulation framework, we then analyse the relative merits of several proposed mitigation strategies for smallpox spread. Our results suggest that outbreaks can be contained by a strategy of targeted vaccination combined with early detection without resorting to mass vaccination of a population.
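
    The bipartite people-locations construction can be sketched in a few lines: co-location defines contact edges, and the number of distinct visitors to a location identifies candidate sensor hubs. The visit records below are invented for illustration.

```python
from collections import defaultdict
from itertools import combinations

# Invented (person, location) visit pairs standing in for the census- and
# mobility-driven movements used in the study.
visits = [
    ("p1", "home1"), ("p2", "home1"),
    ("p1", "office"), ("p3", "office"),
    ("p2", "school"), ("p3", "school"), ("p4", "school"),
]

# One side of the bipartite graph: which people occupy each location.
by_location = defaultdict(set)
for person, loc in visits:
    by_location[loc].add(person)

# Projecting onto people: co-location at any site defines a contact edge.
contacts = set()
for people in by_location.values():
    contacts.update(combinations(sorted(people), 2))

# Location degree (distinct visitors); high-degree locations are the hubs
# where sensors enable efficient outbreak detection.
hub = max(by_location, key=lambda loc: len(by_location[loc]))
```
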

  20. Biologically Informed Individual-Based Network Model for Rift Valley Fever in the US and Evaluation of Mitigation Strategies

    PubMed Central

    Scoglio, Caterina M.

    2016-01-01

    Rift Valley fever (RVF) is a zoonotic disease endemic in sub-Saharan Africa with periodic outbreaks in human and animal populations. Mosquitoes are the primary disease vectors; however, Rift Valley fever virus (RVFV) can also spread by direct contact with infected tissues. The transmission cycle is complex, involving humans, livestock, and multiple species of mosquitoes. The epidemiology of RVFV in endemic areas is strongly affected by climatic conditions and environmental variables. In this research, we adapt and use a network-based modeling framework to simulate the transmission of RVFV among hypothetical cattle operations in Kansas, US. Our model considers geo-located livestock populations at the individual level while incorporating the role of mosquito populations and the environment at a coarse resolution. Extensive simulations show the flexibility of our modeling framework when applied to specific scenarios to quantitatively evaluate the efficacy of mosquito control and livestock movement regulations in reducing the extent and intensity of RVF outbreaks in the United States. PMID:27662585

  1. Biologically Informed Individual-Based Network Model for Rift Valley Fever in the US and Evaluation of Mitigation Strategies.

    PubMed

    Scoglio, Caterina M; Bosca, Claudio; Riad, Mahbubul H; Sahneh, Faryad D; Britch, Seth C; Cohnstaedt, Lee W; Linthicum, Kenneth J

    Rift Valley fever (RVF) is a zoonotic disease endemic in sub-Saharan Africa with periodic outbreaks in human and animal populations. Mosquitoes are the primary disease vectors; however, Rift Valley fever virus (RVFV) can also spread by direct contact with infected tissues. The transmission cycle is complex, involving humans, livestock, and multiple species of mosquitoes. The epidemiology of RVFV in endemic areas is strongly affected by climatic conditions and environmental variables. In this research, we adapt and use a network-based modeling framework to simulate the transmission of RVFV among hypothetical cattle operations in Kansas, US. Our model considers geo-located livestock populations at the individual level while incorporating the role of mosquito populations and the environment at a coarse resolution. Extensive simulations show the flexibility of our modeling framework when applied to specific scenarios to quantitatively evaluate the efficacy of mosquito control and livestock movement regulations in reducing the extent and intensity of RVF outbreaks in the United States.

  2. Model-Based Sensor-Augmented Pump Therapy

    PubMed Central

    Grosman, Benyamin; Voskanyan, Gayane; Loutseiko, Mikhail; Roy, Anirban; Mehta, Aloke; Kurtz, Natalie; Parikh, Neha; Kaufman, Francine R.; Mastrototaro, John J.; Keenan, Barry

    2013-01-01

    Background In insulin pump therapy, optimization of bolus and basal insulin dose settings is a challenge. We introduce a new algorithm that provides individualized basal rates and new carbohydrate ratio and correction factor recommendations. The algorithm utilizes a mathematical model of blood glucose (BG) as a function of carbohydrate intake and delivered insulin, which includes individualized parameters derived from sensor BG and insulin delivery data downloaded from a patient’s pump. Methods A mathematical model of BG as a function of carbohydrate intake and delivered insulin was developed. The model includes fixed parameters and several individualized parameters derived from the subject’s BG measurements and pump data. Performance of the new algorithm was assessed using n = 4 diabetic canine experiments over a 32 h duration. In addition, 10 in silico adults from the University of Virginia/Padova type 1 diabetes mellitus metabolic simulator were tested. Results The percentage of time in glucose range 80–180 mg/dl was 86%, 85%, 61%, and 30% using model-based therapy and [78%, 100%] (brackets denote multiple experiments conducted under the same therapy and animal model), [75%, 67%], 47%, and 86% for the control experiments for dogs 1 to 4, respectively. The BG measurements obtained in the simulation using our individualized algorithm were in 61–231 mg/dl min–max envelope, whereas use of the simulator’s default treatment resulted in BG measurements 90–210 mg/dl min–max envelope. Conclusions The study results demonstrate the potential of this method, which could serve as a platform for improving, facilitating, and standardizing insulin pump therapy based on a single download of data. PMID:23567006

  3. Stochastic simulation of multiscale complex systems with PISKaS: A rule-based approach.

    PubMed

    Perez-Acle, Tomas; Fuenzalida, Ignacio; Martin, Alberto J M; Santibañez, Rodrigo; Avaria, Rodrigo; Bernardin, Alejandro; Bustos, Alvaro M; Garrido, Daniel; Dushoff, Jonathan; Liu, James H

    2018-03-29

    Computational simulation is a widely employed methodology to study the dynamic behavior of complex systems. Although common approaches are based either on ordinary differential equations or stochastic differential equations, these techniques make several assumptions which, when it comes to biological processes, could often lead to unrealistic models. Among others, modeling approaches based on differential equations entangle kinetics and causality, fail when complexity increases, separate knowledge from models, and assume that the average behavior of the population encompasses any individual deviation. To overcome these limitations, simulations based on the Stochastic Simulation Algorithm (SSA) appear as a suitable approach to model complex biological systems. In this work, we review three different models executed in PISKaS: a rule-based framework to produce multiscale stochastic simulations of complex systems. These models span multiple time and spatial scales ranging from gene regulation up to Game Theory. In the first example, we describe a model of the core regulatory network of gene expression in Escherichia coli highlighting the continuous model improvement capacities of PISKaS. The second example describes a hypothetical outbreak of the Ebola virus occurring in a compartmentalized environment resembling cities and highways. Finally, in the last example, we illustrate a stochastic model for the prisoner's dilemma, a common approach from social sciences describing complex interactions involving trust within human populations. As a whole, these models demonstrate the capabilities of PISKaS, providing fertile scenarios in which to explore the dynamics of complex systems. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
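
    As background on the SSA itself, a minimal Gillespie simulation of a constitutive birth-death process (a common gene-expression toy model, not a PISKaS rule set) looks like this; rates are illustrative.

```python
import random

random.seed(42)

def ssa_birth_death(k=10.0, g=1.0, t_end=50.0, x0=0):
    """Gillespie SSA: production at constant rate k, degradation at rate g*x."""
    x, t = x0, 0.0
    while True:
        a_total = k + g * x                    # total reaction propensity
        t += random.expovariate(a_total)       # exponential waiting time
        if t >= t_end:
            return x
        if random.random() < k / a_total:
            x += 1                             # production event fires
        else:
            x -= 1                             # degradation event fires

samples = [ssa_birth_death() for _ in range(200)]
mean_x = sum(samples) / len(samples)  # steady state is Poisson with mean k/g
```

    Unlike an ODE for the mean, each SSA run tracks an individual stochastic trajectory, which is what makes the approach suitable for the individual-level deviations discussed above.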

  4. Reconstruction of Orion Engineering Development Unit (EDU) Parachute Inflation Loads

    NASA Technical Reports Server (NTRS)

    Ray, Eric S.

    2013-01-01

    The process of reconstructing inflation loads of Capsule Parachute Assembly System (CPAS) has been updated as the program transitioned to testing Engineering Development Unit (EDU) hardware. The equations used to reduce the test data have been re-derived based on the same physical assumptions made by simulations. Due to instrumentation challenges, individual parachute loads are determined from complementary accelerometer and load cell measurements. Cluster inflations are now simulated by modeling each parachute individually to better represent different inflation times and non-synchronous disreefing. The reconstruction procedure is tailored to either infinite mass or finite mass events based on measurable characteristics from the test data. Inflation parameters are determined from an automated optimization routine to reduce subjectivity. Infinite mass inflation parameters have been re-defined to avoid unrealistic interactions in Monte Carlo simulations. Sample cases demonstrate how best-fit inflation parameters are used to generate simulated drag areas and loads which favorably agree with test data.
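
    The core aerodynamic relation behind such reconstructions is standard: parachute load equals dynamic pressure times drag area, and inverting it turns a measured load time history into a drag-area history to which inflation parameters can be fitted. The numbers below are illustrative, not CPAS test data.

```python
def drag_load(rho, v, cds):
    """Drag force in N: dynamic pressure (0.5*rho*v^2) times drag area CdS."""
    return 0.5 * rho * v ** 2 * cds

def drag_area(load, rho, v):
    """Invert the relation: measured load + flight condition -> CdS."""
    return load / (0.5 * rho * v ** 2)

# Illustrative point: near-sea-level density, 50 m/s, 300 m^2 drag area.
peak_load = drag_load(1.0, 50.0, 300.0)
```
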

  5. Agent-Based Modeling of Cancer Stem Cell Driven Solid Tumor Growth.

    PubMed

    Poleszczuk, Jan; Macklin, Paul; Enderling, Heiko

    2016-01-01

    Computational modeling of tumor growth has become an invaluable tool to simulate complex cell-cell interactions and emerging population-level dynamics. Agent-based models are commonly used to describe the behavior and interaction of individual cells in different environments. Behavioral rules can be informed and calibrated by in vitro assays, and emerging population-level dynamics may be validated with both in vitro and in vivo experiments. Here, we describe the design and implementation of a lattice-based agent-based model of cancer stem cell driven tumor growth.
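
    A minimal sketch of the kind of lattice-based rules involved (illustrative only; the authors' model has more detail): stem cells divide indefinitely, non-stem progeny inherit a finite division capacity, and division requires an empty neighboring lattice site.

```python
import random

random.seed(3)

GRID = 51
cells = {(GRID // 2, GRID // 2): None}  # one founding stem cell (capacity None)

def neighbors(pos):
    x, y = pos
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

def step(cells, p_symmetric=0.1, rho=10):
    """One update sweep: each cell tries to divide into an empty neighbor."""
    for pos, cap in list(cells.items()):
        if cap == 0:
            del cells[pos]  # non-stem cell with exhausted capacity dies
            continue
        free = [q for q in neighbors(pos) if q not in cells
                and 0 <= q[0] < GRID and 0 <= q[1] < GRID]
        if not free:
            continue  # contact inhibition: no empty site for a daughter
        target = random.choice(free)
        if cap is None:  # stem cell: symmetric division with prob p_symmetric
            cells[target] = None if random.random() < p_symmetric else rho
        else:            # non-stem cell: both daughters lose one division
            cells[pos] = cap - 1
            cells[target] = cap - 1

for _ in range(30):
    step(cells)
```

    In this rule set the long-run tumor dynamics are driven entirely by the stem-cell fraction, which is the population-level behavior such models are calibrated to reproduce.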

  6. Implementation of a Helicopter Flight Simulator with Individual Blade Control

    NASA Astrophysics Data System (ADS)

    Zinchiak, Andrew G.

    2011-12-01

    Nearly all modern helicopters are designed with a swashplate-based system for control of the main rotor blades. However, the swashplate-based approach does not provide the level of redundancy necessary to cope with abnormal actuator conditions. For example, if an actuator fails (becomes locked) on the main rotor, the cyclic inputs are consequently fixed and the helicopter may become stuck in a flight maneuver. This can obviously be seen as a catastrophic failure, and would likely lead to a crash. These types of failures can be overcome with the application of individual blade control (IBC). IBC is achieved using the blade pitch control method, which provides complete authority over the aerodynamic characteristics of each rotor blade at any given time by replacing the normally rigid pitch links between the swashplate and the pitch horn of the blade with hydraulic or electronic actuators. Thus, IBC can provide the redundancy necessary for subsystem failure accommodation. In this research effort, a simulation environment is developed to investigate the potential of the IBC main rotor configuration for fault-tolerant control. To examine the applications of IBC to failure scenarios and fault-tolerant controls, a conventional, swashplate-based linear model is first developed for hover and forward flight scenarios based on the UH-60 Black Hawk helicopter. The linear modeling techniques for the swashplate-based helicopter are then adapted and expanded to include IBC. Using these modified techniques, an IBC-based mathematical model of the UH-60 helicopter is developed for the purposes of simulation and analysis. The methodology can be used to model and implement a different aircraft if geometric, gravimetric, and general aerodynamic data are available. Without the kinetic restrictions of the swashplate, the IBC model effectively decouples the cyclic control inputs between different blades. 
Simulations of the IBC model prove that the primary control functions can be manually reconfigured after local actuator failures are initiated, thus preventing a catastrophic failure or crash. Furthermore, this simulator promises to be a useful tool for the design, testing, and analysis of fault-tolerant control laws.

  7. Efficient coarse simulation of a growing avascular tumor

    PubMed Central

    Kavousanakis, Michail E.; Liu, Ping; Boudouvis, Andreas G.; Lowengrub, John; Kevrekidis, Ioannis G.

    2013-01-01

    The subject of this work is the development and implementation of algorithms which accelerate the simulation of early stage tumor growth models. Among the different computational approaches used for the simulation of tumor progression, discrete stochastic models (e.g., cellular automata) have been widely used to describe processes occurring at the cell and subcell scales (e.g., cell-cell interactions and signaling processes). To describe macroscopic characteristics (e.g., morphology) of growing tumors, large numbers of interacting cells must be simulated. However, the high computational demands of stochastic models make the simulation of large-scale systems impractical. Alternatively, continuum models, which can describe behavior at the tumor scale, often rely on phenomenological assumptions in place of rigorous upscaling of microscopic models. This limits their predictive power. In this work, we circumvent the derivation of closed macroscopic equations for the growing cancer cell populations; instead, we construct, based on the so-called “equation-free” framework, a computational superstructure, which wraps around the individual-based cell-level simulator and accelerates the computations required for the study of the long-time behavior of systems involving many interacting cells. The microscopic model, e.g., a cellular automaton, which simulates the evolution of cancer cell populations, is executed for relatively short time intervals, at the end of which coarse-scale information is obtained. These coarse variables evolve on slower time scales than each individual cell in the population, enabling the application of forward projection schemes, which extrapolate their values at later times. This technique is referred to as coarse projective integration. Increasing the ratio of projection times to microscopic simulator execution times enhances the computational savings. 
Crucial accuracy issues arising for growing tumors with radial symmetry are addressed by applying the coarse projective integration scheme in a cotraveling (cogrowing) frame. As a proof of principle, we demonstrate that the application of this scheme yields highly accurate solutions, while preserving the computational savings of coarse projective integration. PMID:22587128
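
    The coarse projective integration loop itself is compact. The sketch below applies it to a toy noisy logistic-growth "microscopic" simulator (a stand-in for the cellular automaton, with made-up parameters) to show the burst-estimate-project cycle:

```python
import random

random.seed(7)

def micro_step(n, r=0.05, K=1000.0, dt=0.1):
    """Noisy 'microscopic' update of a coarse population size n."""
    return n + dt * r * n * (1.0 - n / K) + random.gauss(0.0, 0.5)

def coarse_projective_integration(n0, burst_steps=20, project_dt=10.0,
                                  n_cycles=30, dt=0.1):
    n = n0
    for _ in range(n_cycles):
        start = n
        for _ in range(burst_steps):             # (1) short microscopic burst
            n = micro_step(n, dt=dt)
        dndt = (n - start) / (burst_steps * dt)  # (2) coarse time derivative
        n += project_dt * dndt                   # (3) projective leap forward
    return n

n_final = coarse_projective_integration(50.0)  # relaxes toward K = 1000
```

    Each cycle covers the projection interval at the cost of only the short burst of microscopic simulation; increasing the ratio of `project_dt` to the burst length increases the computational savings, as the abstract notes.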

  8. The Individual Virtual Eye: a Computer Model for Advanced Intraocular Lens Calculation

    PubMed Central

    Einighammer, Jens; Oltrup, Theo; Bende, Thomas; Jean, Benedikt

    2010-01-01

    Purpose: To describe the individual virtual eye, a computer model of a human eye with respect to its optical properties. It is based on measurements of an individual person, and one of its major applications is calculating intraocular lenses (IOLs) for cataract surgery. Methods: The model is constructed from an eye's geometry, including axial length and topographic measurements of the anterior corneal surface. All optical components of a pseudophakic eye are modeled with computer scientific methods. A spline-based interpolation method efficiently includes data from corneal topographic measurements. The geometrical optical properties, such as the wavefront aberration, are simulated with real ray-tracing using Snell's law. Optical components can be calculated using computer scientific optimization procedures. The geometry of customized aspheric IOLs was calculated for 32 eyes and the resulting wavefront aberration was investigated. Results: The more complex the calculated IOL is, the lower the residual wavefront error is. Spherical IOLs are only able to correct for the defocus, while toric IOLs also eliminate astigmatism. Spherical aberration is additionally reduced by aspheric and toric aspheric IOLs. The efficient implementation of time-critical numerical ray-tracing and optimization procedures allows for short calculation times, which may lead to a practicable method integrated into a device. Conclusions: The individual virtual eye allows for simulations and calculations regarding geometrical optics for individual persons. This leads to clinical applications like IOL calculation, with the potential to overcome the limitations of those current calculation methods that are based on paraxial optics, as exemplified by the calculation of customized aspheric IOLs.
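
    The core ray-tracing operation, refraction by Snell's law at each surface, can be written in vector form (standard optics, not the authors' implementation; the corneal index 1.376 is a textbook value used here for illustration):

```python
import math

def refract(d, n, n1, n2):
    """Refract unit direction d at a surface with unit normal n (pointing
    against the incoming ray), going from index n1 into n2. Returns the
    refracted unit direction, or None on total internal reflection."""
    eta = n1 / n2
    cos_i = -(d[0] * n[0] + d[1] * n[1])      # cosine of incidence angle
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None                            # total internal reflection
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(eta * di + (eta * cos_i - cos_t) * ni
                 for di, ni in zip(d, n))

# A ray 30 degrees off the normal entering the cornea (n ~ 1.376) from air.
d_in = (math.sin(math.radians(30.0)), -math.cos(math.radians(30.0)))
d_out = refract(d_in, (0.0, 1.0), 1.0, 1.376)
```

    Tracing bundles of such rays through every surface of the measured eye is what yields the wavefront aberration the model optimizes against.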

  9. Confirmation of model-based dose selection for a Japanese phase III study of rivaroxaban in non-valvular atrial fibrillation patients.

    PubMed

    Kaneko, Masato; Tanigawa, Takahiko; Hashizume, Kensei; Kajikawa, Mariko; Tajiri, Masahiro; Mueck, Wolfgang

    2013-01-01

    This study was designed to confirm the appropriateness of the dose setting for a Japanese phase III study of rivaroxaban in patients with non-valvular atrial fibrillation (NVAF), which had been based on model simulation employing phase II study data. The previously developed mixed-effects pharmacokinetic/pharmacodynamic (PK-PD) model, which consisted of an oral one-compartment model parameterized in terms of clearance, volume and a first-order absorption rate, was rebuilt and optimized using the data for 597 subjects from the Japanese phase III study, J-ROCKET AF. A mixed-effects modeling technique in NONMEM was used to quantify both unexplained inter-individual variability and inter-occasion variability, which are random effect parameters. The final PK and PK-PD models were evaluated to identify influential covariates. The empirical Bayes estimates of AUC and C(max) from the final PK model were consistent with the simulated results from the Japanese phase II study. There was no clear relationship between individual estimated exposures and safety-related events, and the estimated exposure levels were consistent with the global phase III data. Therefore, it was concluded that the dose selected for the phase III study with Japanese NVAF patients by means of model simulation employing phase II study data had been appropriate from the PK-PD perspective.
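
    The PK model named in the abstract, an oral one-compartment model with first-order absorption, has a closed-form concentration profile (the Bateman equation). Parameter values below are illustrative, not the rivaroxaban estimates:

```python
import math

def concentration(t, dose, ka, cl, v, f=1.0):
    """Plasma concentration at time t after a single oral dose (ka != ke)."""
    ke = cl / v  # elimination rate constant from clearance and volume
    return (f * dose * ka) / (v * (ka - ke)) * (
        math.exp(-ke * t) - math.exp(-ka * t))

def auc_inf(dose, cl, f=1.0):
    """AUC to infinity depends only on dose, bioavailability, and clearance."""
    return f * dose / cl

c_4h = concentration(t=4.0, dose=15.0, ka=1.0, cl=0.6, v=50.0)
```

    In the mixed-effects setting, CL, V, and ka carry inter-individual and inter-occasion random effects, which is what yields the individual AUC and C(max) estimates discussed above.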

  10. An individual based computational model of intestinal crypt fission and its application to predicting unrestrictive growth of the intestinal epithelium.

    PubMed

    Pin, Carmen; Parker, Aimee; Gunning, A Patrick; Ohta, Yuki; Johnson, Ian T; Carding, Simon R; Sato, Toshiro

    2015-02-01

    Intestinal crypt fission is a homeostatic phenomenon, observable in healthy adult mucosa, but which also plays a pathological role as the main mode of growth of some intestinal polyps. Building on our previous individual based model for the small intestinal crypt and on in vitro cultured intestinal organoids, we here model crypt fission as a budding process based on fluid mechanics at the individual cell level and extrapolated predictions for growth of the intestinal epithelium. Budding was always observed in regions of organoids with abundant Paneth cells. Our data support a model in which buds are biomechanically initiated by single stem cells surrounded by Paneth cells which exhibit greater resistance to viscoelastic deformation, a hypothesis supported by atomic force measurements of single cells. Time intervals between consecutive budding events, as simulated by the model and observed in vitro, were 2.84 and 2.62 days, respectively. Predicted cell dynamics was unaffected within the original crypt which retained its full capability of providing cells to the epithelium throughout fission. Mitotic pressure in simulated primary crypts forced upward migration of buds, which simultaneously grew into new protruding crypts at a rate equal to 1.03 day⁻¹ in simulations and 0.99 day⁻¹ in cultured organoids. Simulated crypts reached their final size in 4.6 days, and required 6.2 days to migrate to the top of the primary crypt. The growth of the secondary crypt is independent of its migration along the original crypt. Assuming unrestricted crypt fission and multiple budding events, a maximal growth rate of the intestinal epithelium of 0.10 day⁻¹ is predicted and thus approximately 22 days are required for a 10-fold increase of polyp size. These predictions are in agreement with the time reported to develop macroscopic adenomas in mice after loss of Apc in intestinal stem cells.
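
    The closing arithmetic can be checked directly: under unrestricted exponential growth at rate r, a 10-fold increase takes t = ln(10)/r, which for r = 0.10 per day gives about 23 days, consistent with the approximately 22 days quoted in the abstract.

```python
import math

def time_to_fold_increase(fold, rate_per_day):
    """Time for an exponentially growing population to grow `fold`-fold."""
    return math.log(fold) / rate_per_day

t10 = time_to_fold_increase(10.0, 0.10)  # roughly three weeks
```
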

  11. Coupled Stochastic Time-Inverted Lagrangian Transport/Weather Forecast and Research/Vegetation Photosynthesis and Respiration Model. Part II: Simulations of Tower-Based and Airborne CO2 Measurements

    NASA Technical Reports Server (NTRS)

    Eluszkiewicz, Janusz; Nehrkorn, Thomas; Wofsy, Steven C.; Matross, Daniel; Gerbig, Christoph; Lin, John C.; Freitas, Saulo; Longo, Marcos; Andrews, Arlyn E.; Peters, Wouter

    2007-01-01

    This paper evaluates simulations of atmospheric CO2 measured in 2004 at continental surface and airborne receptors, intended to test the capability to use data with high temporal and spatial resolution for analyses of carbon sources and sinks at regional and continental scales. The simulations were performed using the Stochastic Time-Inverted Lagrangian Transport (STILT) model driven by the Weather Forecast and Research (WRF) model, and linked to surface fluxes from the satellite-driven Vegetation Photosynthesis and Respiration Model (VPRM). The simulations provide detailed representations of hourly CO2 tower data and reproduce the shapes of airborne vertical profiles with high fidelity. WRF meteorology gives superior model performance compared with standard meteorological products, and the impact of including WRF convective mass fluxes in the STILT trajectory calculations is significant in individual cases. Important biases in the simulation are associated with the nighttime CO2 build-up and subsequent morning transition to convective conditions, and with errors in the advected lateral boundary condition. Comparison of STILT simulations driven by the WRF model against those driven by the Brazilian variant of the Regional Atmospheric Modeling System (BRAMS) shows that model-to-model differences are smaller than those between an individual transport model and observations, pointing to systematic errors in the simulated transport. Future developments in the WRF model's data assimilation capabilities, basic research into the fundamental aspects of trajectory calculations, and intercomparison studies involving other transport models are possible avenues for reducing these errors. Overall, the STILT/WRF/VPRM system offers a powerful tool for continental and regional scale carbon flux estimates.

  12. A novel epidemic spreading model with decreasing infection rate based on infection times

    NASA Astrophysics Data System (ADS)

    Huang, Yunhan; Ding, Li; Feng, Yun

    2016-02-01

    A new epidemic spreading model in which individuals can be infected repeatedly is proposed in this paper. The infection rate decreases according to the number of times an individual has been infected before, a phenomenon that may be caused by immunity or heightened alertness of individuals. We introduce a new parameter, called the decay factor, to evaluate the decrease of the infection rate. Through this parameter, our model bridges the Susceptible-Infected-Susceptible (SIS) model and the Susceptible-Infected-Recovered (SIR) model. The proposed model has been studied by Monte Carlo numerical simulation. It is found that the initial infection rate has a greater impact on the peak value than the decay factor does. The effect of the decay factor on the final density and the outbreak threshold is dominant, but weakens significantly when birth and death rates are considered. Besides, simulation results show that the influence of birth and death rates on the final density is non-monotonic in some circumstances.
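
    The decay-factor mechanism described above can be sketched as a minimal Monte Carlo simulation of a fully mixed population (all parameter names and values below are illustrative, not taken from the paper):

```python
import random

def simulate(n=500, beta0=0.3, delta=0.5, recover=0.1, steps=200, seed=1):
    """Fully mixed epidemic in which the infection rate decays with the number
    of past infections: beta_k = beta0 * delta**k, where delta is the decay
    factor. delta = 1 gives SIS-like unlimited reinfection; delta = 0
    approaches SIR (effectively one infection per individual).
    Returns the time series of infected counts."""
    rng = random.Random(seed)
    infected = [False] * n
    times_infected = [0] * n      # how often each individual has been infected
    infected[0] = True
    times_infected[0] = 1
    series = []
    for _ in range(steps):
        frac_inf = sum(infected) / n
        nxt = infected[:]
        for i in range(n):
            if infected[i]:
                if rng.random() < recover:
                    nxt[i] = False        # recover, but remain reinfectable
            else:
                beta_i = beta0 * delta ** times_infected[i]
                if rng.random() < beta_i * frac_inf:
                    nxt[i] = True
                    times_infected[i] += 1
        infected = nxt
        series.append(sum(infected))
    return series

peak = max(simulate(delta=0.9))
print("peak number infected:", peak)
```

    Sweeping `delta` between 0 and 1 interpolates between SIR-like and SIS-like behavior, which is the bridging role the abstract assigns to the decay factor.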

  13. Forecasting Lightning Threat using Cloud-resolving Model Simulations

    NASA Technical Reports Server (NTRS)

    McCaul, E. W., Jr.; Goodman, S. J.; LaCasse, K. M.; Cecil, D. J.

    2009-01-01

    As numerical forecasts capable of resolving individual convective clouds become more common, it is of interest to see if quantitative forecasts of lightning flash rate density are possible, based on fields computed by the numerical model. Previous observational research has shown robust relationships between observed lightning flash rates and inferred updraft and large precipitation ice fields in the mixed phase regions of storms, and that these relationships might allow simulated fields to serve as proxies for lightning flash rate density. It is shown in this paper that two simple proxy fields do indeed provide reasonable and cost-effective bases for creating time-evolving maps of predicted lightning flash rate density, judging from a series of diverse simulation case study events in North Alabama for which Lightning Mapping Array data provide ground truth. One method is based on the product of upward velocity and the mixing ratio of precipitating ice hydrometeors, modeled as graupel only, in the mixed phase region of storms at the -15 °C level, while the second method is based on the vertically integrated amounts of ice hydrometeors in each model grid column. Each method can be calibrated by comparing domainwide statistics of the peak values of simulated flash rate proxy fields against domainwide peak total lightning flash rate density data from observations. Tests show that the first method is able to capture much of the temporal variability of the lightning threat, while the second method does a better job of depicting the areal coverage of the threat. A blended solution is designed to retain most of the temporal sensitivity of the first method, while adding the improved spatial coverage of the second.
Weather Research and Forecast Model simulations of selected North Alabama cases show that this model can distinguish the general character and intensity of most convective events, and that the proposed methods show promise as a means of generating quantitatively realistic fields of lightning threat. However, because models tend to have more difficulty in correctly predicting the instantaneous placement of storms, forecasts of the detailed location of the lightning threat based on single simulations can be in error. Although these model shortcomings presently limit the precision of lightning threat forecasts from individual runs of current generation models, the techniques proposed herein should continue to be applicable as newer and more accurate physically-based model versions, physical parameterizations, initialization techniques and ensembles of cloud-allowing forecasts become available.
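
    The two proxy methods lend themselves to a compact sketch; the numbers, level spacing, and variable names below are invented for illustration, and the real calculation operates on 3-D cloud-resolving model output:

```python
# Toy illustration of the two lightning-threat proxies described above.
# Proxy 1: w * q_graupel at the -15 degC level; Proxy 2: vertically integrated ice.

def proxy1(w_minus15, q_graupel_minus15):
    """Product of updraft speed (m/s) and graupel mixing ratio (kg/kg) at -15 degC."""
    return w_minus15 * q_graupel_minus15

def proxy2(q_ice_column, rho_column, dz):
    """Vertically integrated ice content (kg/m^2) for one grid column."""
    return sum(q * rho * dz for q, rho in zip(q_ice_column, rho_column))

# One hypothetical grid column, 10 levels, dz = 500 m (all values made up).
q_ice = [0.0, 0.0, 1e-4, 5e-4, 1.2e-3, 1.5e-3, 9e-4, 3e-4, 1e-4, 0.0]  # kg/kg
rho   = [1.1, 1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.45, 0.4, 0.35]           # kg/m^3
print(f"proxy1 = {proxy1(12.0, 2.0e-3):.4f}")           # units: (m/s)*(kg/kg)
print(f"proxy2 = {proxy2(q_ice, rho, 500.0):.4f} kg/m^2")
```

    In the paper's method each proxy field is then calibrated against observed domainwide peak flash rate densities; the sketch only shows how the raw proxies are formed.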

  14. Partitioning the Uncertainty in Estimates of Mean Basal Area Obtained from 10-year Diameter Growth Model Predictions

    Treesearch

    Ronald E. McRoberts

    2005-01-01

    Uncertainty in model-based predictions of individual tree diameter growth is attributed to three sources: measurement error for predictor variables, residual variability around model predictions, and uncertainty in model parameter estimates. Monte Carlo simulations are used to propagate the uncertainty from the three sources through a set of diameter growth models to...

  15. Computational Nanomechanics of Carbon Nanotubes and Composites

    NASA Technical Reports Server (NTRS)

    Srivastava, Deepak; Wei, Chenyu; Cho, Kyeongjae; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The nanomechanics of individual carbon and boron-nitride nanotubes and their application as reinforcing fibers in polymer composites are reviewed through an interplay of theoretical modeling, computer simulations and experimental observations. The emphasis in this work is on elucidating the multiple length scales of the problems involved, and the different simulation techniques that are needed to address specific characteristics of individual nanotubes and nanotube polymer-matrix interfaces. Classical molecular dynamics simulations are shown to be sufficient to describe generic behavior such as strength and stiffness modulus, but are inadequate to describe the elastic limit and the nature of plastic buckling at large strains. Quantum molecular dynamics simulations are shown to bring out behavior that depends explicitly on the atomic nature of these nanoscale material objects, behavior that is not accessible either via continuum mechanics based descriptions or through classical molecular dynamics based simulations. As examples, we discuss local plastic collapse of carbon nanotubes under axial compression and anisotropic plastic buckling of boron-nitride nanotubes. Dependence of the yield strain on the strain rate is addressed through temperature dependent simulations, a transition-state-theory based model of the strain as a function of strain rate and simulation temperature is presented, and in all cases extensive comparisons are made with experimental observations. Mechanical properties of nanotube-polymer composite materials are simulated with diverse nanotube-polymer interface structures (with van der Waals interaction). The atomistic mechanisms of interface toughening for optimal load transfer through recycling, high-thermal-expansion and high-diffusion-coefficient composite formation above the glass transition temperature, and the enhancement of Young's modulus on addition of nanotubes to polymer are discussed and compared with experimental observations.

  16. Reserve design to maximize species persistence

    Treesearch

    Robert G. Haight; Laurel E. Travis

    2008-01-01

    We develop a reserve design strategy to maximize the probability of species persistence predicted by a stochastic, individual-based, metapopulation model. Because the population model does not fit exact optimization procedures, our strategy involves deriving promising solutions from theory, obtaining promising solutions from a simulation optimization heuristic, and...

  17. An Empirical Agent-Based Model to Simulate the Adoption of Water Reuse Using the Social Amplification of Risk Framework.

    PubMed

    Kandiah, Venu; Binder, Andrew R; Berglund, Emily Z

    2017-10-01

    Water reuse can serve as a sustainable alternative water source for urban areas. However, the successful implementation of large-scale water reuse projects depends on community acceptance. Because of the negative perceptions that are traditionally associated with reclaimed water, water reuse is often not considered in the development of urban water management plans. This study develops a simulation model for understanding community opinion dynamics surrounding the issue of water reuse, and how individual perceptions evolve within that context, which can help in the planning and decision-making process. Based on the social amplification of risk framework, our agent-based model simulates consumer perceptions, discussion patterns, and their adoption or rejection of water reuse. The model is based on the "risk publics" model, an empirical approach that uses the concept of belief clusters to explain the adoption of new technology. Each household is represented as an agent, and the parameters that define its behavior and attributes are derived from survey data. Community-level parameters (including social groups, relationships, and communication variables, also drawn from survey data) are encoded to simulate the social processes that influence community opinion. The model demonstrates its capability to simulate opinion dynamics and consumer adoption of water reuse. In addition, based on empirical data, the model is applied to investigate water reuse behavior in different regions of the United States. Importantly, our results reveal that public opinion dynamics emerge differently based on membership in opinion clusters, frequency of discussion, and the structure of social networks. © 2017 Society for Risk Analysis.

  18. A web-based decision support tool for prognosis simulation in multiple sclerosis.

    PubMed

    Veloso, Mário

    2014-09-01

    A multiplicity of natural history studies of multiple sclerosis provides valuable knowledge of disease progression, but individualized prognosis remains elusive. A few decision support tools that assist the clinician in this task have emerged but have not received proper attention from clinicians and patients. The objective of the current work is to implement a web-based tool, conveying decision-relevant prognostic scientific evidence, which will help clinicians discuss prognosis with individual patients. Data were extracted from a set of reference studies, especially those dealing with the natural history of multiple sclerosis. The web-based decision support tool for individualized prognosis simulation was implemented with NetLogo, a programming environment suited to the development of complex adaptive systems. Its prototype has been launched online; it enables clinicians to predict both the likelihood of CIS to CDMS conversion and the long-term prognosis of disability level and SPMS conversion, as well as to assess and monitor the effects of treatment. More robust decision support tools, which convey scientific evidence and satisfy the needs of clinical practice by helping clinicians discuss prognosis expectations with individual patients, are required. The web-based simulation model introduced herein is proposed as a step toward this purpose. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Monte Carlo decision curve analysis using aggregate data.

    PubMed

    Hozo, Iztok; Tsalatsanis, Athanasios; Djulbegovic, Benjamin

    2017-02-01

    Decision curve analysis (DCA) is an increasingly used method for evaluating diagnostic tests and predictive models, but its application requires individual patient data. The Monte Carlo (MC) method can be used to simulate probabilities and outcomes of individual patients and offers an attractive option for applying DCA. We constructed a MC decision model to simulate individual probabilities of outcomes of interest. These probabilities were contrasted against the threshold probability at which a decision-maker is indifferent between key management strategies: treat all, treat none, or use the predictive model to guide treatment. We compared the results of DCA with MC simulated data against the results of DCA based on actual individual patient data for three decision models published in the literature: (i) statins for primary prevention of cardiovascular disease, (ii) hospice referral for terminally ill patients and (iii) prostate cancer surgery. The results of MC DCA and patient data DCA were identical. To the extent that patient data DCA were used to inform decisions about statin use, referral to hospice or prostate surgery, the results indicate that MC DCA could have also been used. As long as the aggregate parameters on the distribution of the probability of outcomes and treatment effects are accurately described in the published reports, MC DCA will generate results indistinguishable from individual patient data DCA. We provide a simple, easy-to-use model, which can facilitate wider use of DCA and better evaluation of diagnostic tests and predictive models that rely only on aggregate data reported in the literature. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
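
    A minimal version of the MC DCA idea can be sketched as follows, assuming (purely for illustration) that the aggregate report describes patient risks by a Beta(2, 8) distribution; the net-benefit formula NB(pt) = TP/N - FP/N * pt/(1 - pt) is the standard DCA definition:

```python
import random

def net_benefit(pt, outcomes, treat):
    """Decision-curve net benefit at threshold pt: TP/N - FP/N * pt/(1 - pt)."""
    n = len(outcomes)
    tp = sum(1 for o, t in zip(outcomes, treat) if t and o)
    fp = sum(1 for o, t in zip(outcomes, treat) if t and not o)
    return tp / n - fp / n * pt / (1 - pt)

rng = random.Random(42)
n = 20000
# Hypothetical aggregate description of patient risk: Beta(2, 8), mean 0.2.
risks = [rng.betavariate(2, 8) for _ in range(n)]
outcomes = [rng.random() < p for p in risks]   # simulate each patient's outcome

pt = 0.15   # example threshold probability
nb_all = net_benefit(pt, outcomes, [True] * n)
nb_none = net_benefit(pt, outcomes, [False] * n)
nb_model = net_benefit(pt, outcomes, [p >= pt for p in risks])
print(f"treat none: {nb_none:.4f}  treat all: {nb_all:.4f}  model-guided: {nb_model:.4f}")
```

    With perfectly calibrated simulated risks the model-guided strategy dominates at this threshold; repeating the calculation over a grid of pt values traces out the full decision curve.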

  20. Perceptual control models of pursuit manual tracking demonstrate individual specificity and parameter consistency.

    PubMed

    Parker, Maximilian G; Tyson, Sarah F; Weightman, Andrew P; Abbott, Bruce; Emsley, Richard; Mansell, Warren

    2017-11-01

    Computational models that simulate individuals' movements in pursuit-tracking tasks have been used to elucidate mechanisms of human motor control. Whilst there is evidence that individuals demonstrate idiosyncratic control-tracking strategies, it remains unclear whether models can be sensitive to these idiosyncrasies. Perceptual control theory (PCT) provides a unique model architecture with an internally set reference value parameter, and can be optimized to fit an individual's tracking behavior. The current study investigated whether PCT models could show temporal stability and individual specificity over time. Twenty adults completed three blocks of fifteen 1-min pursuit-tracking trials. Two blocks (training and post-training) were completed in one session and the third was completed after 1 week (follow-up). The target moved in a one-dimensional, pseudorandom pattern. PCT models were optimized to the training data using a least-mean-squares algorithm, and validated with data from post-training and follow-up. We found significant inter-individual variability (partial η²: .464-.697) and intra-individual consistency (Cronbach's α: .880-.976) in parameter estimates. Polynomial regression revealed that all model parameters, including the reference value parameter, contribute to simulation accuracy. Participants' tracking performances were significantly more accurately simulated by models developed from their own tracking data than by models developed from other participants' data. We conclude that PCT models can be optimized to simulate the performance of an individual and that the test-retest reliability of individual models is a necessary criterion for evaluating computational models of human performance.

  1. Identification of walking human model using agent-based modelling

    NASA Astrophysics Data System (ADS)

    Shahabpoor, Erfan; Pavic, Aleksandar; Racic, Vitomir

    2018-03-01

    The interaction of walking people with large vibrating structures, such as footbridges and floors, in the vertical direction is an important yet challenging phenomenon to describe mathematically. Several different models have been proposed in the literature to simulate the interaction of stationary people with vibrating structures. However, research on moving (walking) human models, explicitly identified for vibration serviceability assessment of civil structures, is still sparse. In this study, the results of a comprehensive set of FRF-based modal tests were used, in which over a hundred test subjects walked in different group sizes and walking patterns on a test structure. An agent-based model was used to simulate discrete traffic-structure interactions. The occupied-structure modal parameters found in the tests were used to identify the parameters of the walking individual's single-degree-of-freedom (SDOF) mass-spring-damper model using a 'reverse engineering' methodology. The analysis of the results suggested that a normal distribution with an average of μ = 2.85 Hz and standard deviation of σ = 0.34 Hz can describe the natural frequency of the human SDOF model. Similarly, a normal distribution with μ = 0.295 and σ = 0.047 can describe the damping ratio of the human model. Compared to previous studies, the agent-based modelling methodology proposed in this paper offers significant flexibility in simultaneously simulating multi-pedestrian walking traffic, external forces, and different mechanisms of human-structure and human-environment interaction.
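
    The reported distributions can be used to generate SDOF walking-human agents for crowd simulation. The sketch below samples the natural frequency and damping ratio from the normal distributions quoted above and converts them to stiffness and damping coefficients via k = m ω² and c = 2 ζ m ω; the body mass value is an illustrative assumption, not taken from the abstract:

```python
import math
import random

def sample_sdof(rng, mass_kg=75.0):
    """Sample one walking-human SDOF model using the distributions reported above:
    natural frequency ~ N(2.85 Hz, 0.34 Hz), damping ratio ~ N(0.295, 0.047).
    mass_kg is an illustrative assumption. Returns (mass, stiffness, damping)."""
    f_n = rng.gauss(2.85, 0.34)          # natural frequency, Hz
    zeta = rng.gauss(0.295, 0.047)       # damping ratio, dimensionless
    omega = 2.0 * math.pi * f_n          # angular frequency, rad/s
    k = mass_kg * omega ** 2             # stiffness, N/m
    c = 2.0 * zeta * mass_kg * omega     # damping coefficient, N s/m
    return mass_kg, k, c

rng = random.Random(0)
for m, k, c in (sample_sdof(rng) for _ in range(5)):
    print(f"m={m:.0f} kg  k={k:.0f} N/m  c={c:.0f} N s/m")
```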

  2. [Virtual reality simulation training in gynecology: review and perspectives].

    PubMed

    Ricard-Gauthier, Dominique; Popescu, Silvia; Benmohamed, Naida; Petignat, Patrick; Dubuisson, Jean

    2016-10-26

    Laparoscopic simulation has rapidly become an important tool for learning and acquiring technical skills in surgery. It is based on two complementary pedagogic tools: the box-model trainer and the virtual reality simulator. The virtual reality simulator has shown its efficiency by improving surgical skills, decreasing operating time, improving economy of movement and improving self-confidence. The main advantage of this tool is the opportunity to easily organize a regular, structured and uniform training program enabling automated, individualized feedback.

  3. Patient-individualized boundary conditions for CFD simulations using time-resolved 3D angiography.

    PubMed

    Boegel, Marco; Gehrisch, Sonja; Redel, Thomas; Rohkohl, Christopher; Hoelter, Philip; Doerfler, Arnd; Maier, Andreas; Kowarschik, Markus

    2016-06-01

    Hemodynamic simulations are of increasing interest for the assessment of aneurysmal rupture risk and treatment planning. Achieving accurate simulation results requires the use of several patient-individual boundary conditions, such as a geometric model of the vasculature but also individualized inflow conditions. We propose the automatic estimation of various parameters for boundary conditions for computational fluid dynamics (CFD) based on a single 3D rotational angiography scan, also showing contrast agent inflow. First the data are reconstructed, and a patient-specific vessel model can be generated in the usual way. For this work, we optimize the inflow waveform based on two parameters, the mean velocity and pulsatility. We use statistical analysis of the measurable velocity distribution in the vessel segment to estimate the mean velocity. An iterative optimization scheme based on CFD and virtual angiography is utilized to estimate the inflow pulsatility. Furthermore, we present methods to automatically determine the heart rate and synchronize the inflow waveform to the patient's heart beat, based on time-intensity curves extracted from the rotational angiogram. This results in a patient-individualized inflow velocity curve. The proposed methods were evaluated on two clinical datasets. Based on the vascular geometries, synthetic rotational angiography data were generated to allow a quantitative validation of our approach against ground truth data. We observed an average error of approximately [Formula: see text] for the mean velocity and [Formula: see text] for the pulsatility. The heart rate was estimated very precisely, with an average error of about [Formula: see text], which corresponds to about a 6 ms error in the duration of one cardiac cycle. Furthermore, a qualitative comparison of measured time-intensity curves from the real data and patient-specific simulated ones shows an excellent match.
The presented methods have the potential to accurately estimate patient-specific boundary conditions from a single dedicated rotational scan.

  4. Magnetization Reversal of Nanoscale Islands: How Size and Shape Affect the Arrhenius Prefactor

    NASA Astrophysics Data System (ADS)

    Krause, S.; Herzog, G.; Stapelfeldt, T.; Berbil-Bautista, L.; Bode, M.; Vedmedenko, E. Y.; Wiesendanger, R.

    2009-09-01

    The thermal switching behavior of individual in-plane magnetized Fe/W(110) nanoislands is investigated by a combined study of variable-temperature spin-polarized scanning tunneling microscopy and Monte Carlo simulations. Even for islands consisting of fewer than 100 atoms, the magnetization reversal takes place via nucleation and propagation. The Arrhenius prefactor is found to depend strongly on the individual island size and shape, and based on the experimental results a simple model is developed to describe the magnetization reversal in terms of metastable states. Complementary Monte Carlo simulations confirm the model and provide new insight into the microscopic processes involved in magnetization reversal of the smallest nanomagnets.
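
    The Arrhenius description underlying the prefactor discussion relates the mean switching rate to an energy barrier and a prefactor, ν = ν₀ exp(-E_b / k_B T); the numerical sketch below uses illustrative values only (the barrier, prefactor, and temperatures are not taken from the abstract):

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def switching_rate(prefactor_hz: float, barrier_ev: float, temp_k: float) -> float:
    """Arrhenius law for thermally activated switching: nu = nu0 * exp(-E_b / (k_B T))."""
    return prefactor_hz * math.exp(-barrier_ev / (K_B * temp_k))

# Illustrative numbers only: a 0.05 eV barrier with a 1 GHz prefactor,
# probed at two temperatures.
for t in (30.0, 40.0):
    nu = switching_rate(1e9, 0.05, t)
    print(f"T = {t:.0f} K: mean switching rate ~ {nu:.3g} Hz, lifetime ~ {1 / nu:.3g} s")
```

    The strong temperature sensitivity shown here is why size- and shape-dependent changes in the prefactor are experimentally detectable in telegraph-noise measurements of individual islands.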

  5. Conceptualizing intragroup and intergroup dynamics within a controlled crowd evacuation.

    PubMed

    Elzie, Terra; Frydenlund, Erika; Collins, Andrew J; Robinson, R Michael

    2015-01-01

    Social dynamics play a critical role in successful pedestrian evacuations. Crowd modeling research has made progress in capturing the way individual and group dynamics affect evacuations; however, few studies have simultaneously examined how individuals and groups interact with one another during egress. To address this gap, the researchers present a conceptual agent-based model (ABM) designed to study the ways in which autonomous, heterogeneous, decision-making individuals negotiate intragroup and intergroup behavior while exiting a large venue. A key feature of this proposed model is the examination of the dynamics among and between various groupings, where heterogeneity at the individual level dynamically affects group behavior and subsequently group/group interactions. ABM provides a means of representing the important social factors that affect decision making among diverse social groups. Expanding on the 2013 work of Vizzari et al., the researchers focus specifically on social factors and decision making at the individual/group and group/group levels to more realistically portray dynamic crowd systems during a pedestrian evacuation. By developing a model with individual, intragroup, and intergroup interactions, the ABM provides a more representative approximation of real-world crowd egress. The simulation will enable more informed planning by disaster managers, emergency planners, and other decision makers. This pedestrian behavioral concept is one piece of a larger simulation model. Future research will build toward an integrated model capturing decision-making interactions between pedestrians and vehicles that affect evacuation outcomes.

  6. Modeling the population-level effects of hypoxia on a coastal fish: implications of a spatially-explicit individual-based model

    NASA Astrophysics Data System (ADS)

    Rose, K.; Creekmore, S.; Thomas, P.; Craig, K.; Neilan, R.; Rahman, S.; Wang, L.; Justic, D.

    2016-02-01

    The northwestern Gulf of Mexico (USA) currently experiences a large hypoxic area ("dead zone") during the summer. The population-level effects of hypoxia on coastal fish are largely unknown. We developed a spatially-explicit, individual-based model to analyze how hypoxia effects on reproduction, growth, and mortality of individual Atlantic croaker could lead to population-level responses. The model follows the hourly growth, mortality, reproduction, and movement of individuals on a 300 × 800 spatial grid of 1-km² cells for 140 years. Chlorophyll-a concentration and water temperature were specified daily for each grid cell. Dissolved oxygen (DO) was obtained from a 3-D water quality model for four years that differed in their severity of hypoxia. A bioenergetics model was used to represent growth, mortality was assumed stage- and age-dependent, and movement behavior was based on temperature preferences and avoidance of low DO. Hypoxia effects were imposed using exposure-effects sub-models that converted time-varying exposure to DO into reductions in growth and fecundity, and increases in mortality. Using sequences of mild, intermediate, and severe hypoxia years, the model predicted a 20% decrease in population abundance. Additional simulations were performed under the assumption that river-based nutrient loadings that lead to more hypoxia also lead to higher primary production and more food for croaker. Twenty-five percent and 50% nutrient reduction scenarios were simulated by adjusting the chlorophyll-a concentrations used as a food proxy for the croaker. We then incrementally increased the DO concentrations to determine how much hypoxia would need to be reduced to offset the lower food production resulting from reduced nutrients.
We discuss the generality of our results, the hidden effects of hypoxia on fish, and our overall strategy of combining laboratory and field studies with modeling to produce robust predictions of population responses to stressors under dynamic and multi-stressor conditions.

  7. A Mixed-dimensional Model for Determining the Impact of Permafrost Polygonal Ground Degradation on Arctic Hydrology.

    NASA Astrophysics Data System (ADS)

    Coon, E.; Jan, A.; Painter, S. L.; Moulton, J. D.; Wilson, C. J.

    2017-12-01

    Many permafrost-affected regions in the Arctic manifest a polygonal patterned ground, which contains large carbon stores and is vulnerable to climate change as warming temperatures drive melting ice wedges, polygon degradation, and thawing of the underlying carbon-rich soils. Understanding the fate of this carbon is difficult. The system is controlled by complex, nonlinear physics coupling biogeochemistry, thermal-hydrology, and geomorphology, and there is a strong spatial scale separation between microtopography (at the scale of an individual polygon) and the scale of landscape change (at the scale of many thousands of polygons). Physics-based models have come a long way, and are now capable of representing the diverse set of processes, but only on individual polygons or a few polygons. Empirical models have been used to upscale across land types, including ecotypes evolving from low-centered (pristine) polygons to high-centered (degraded) polygons, and do so over large spatial extents, but are limited in their ability to discern causal process mechanisms. Here we present a novel strategy that uses physics-based models across scales, bringing together multiple capabilities to capture polygon degradation under a warming climate and its impacts on thermal-hydrology. We use fine-scale simulations on individual polygons to motivate a mixed-dimensional strategy that couples one-dimensional columns representing each individual polygon through two-dimensional surface flow. A subgrid model is used to incorporate the effects of surface microtopography on surface flow; this model is described and calibrated to fine-scale simulations. And critically, a subsidence model that tracks volume loss in bulk ice wedges is used to alter the subsurface structure and subgrid parameters, enabling the inclusion of the feedbacks associated with polygon degradation.
This combined strategy results in a model that is able to capture the key features of polygon permafrost degradation, but in a simulation across a large spatial extent of polygonal tundra.

  8. Comparing models for growth and management of forest tracts

    Treesearch

    J.J. Colbert; Michael Schuckers; Desta Fekedulegn

    2003-01-01

    The Stand Damage Model (SDM) is a PC-based model that is easily installed, calibrated and initialized for use in exploring the future growth and management of forest stands or small wood lots. We compare the basic individual tree growth model incorporated in this model with alternative models that predict the basal area growth of trees. The SDM is a gap-type simulator...

  9. Simulating cancer growth with multiscale agent-based modeling.

    PubMed

    Wang, Zhihui; Butner, Joseph D; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S

    2015-02-01

    There have been many techniques developed in recent years to model a variety of cancer behaviors in silico. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology, including phenotype-changing mutations, adaptation to the microenvironment, the process of angiogenesis, the influence of the extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues, have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Sentence Recognition Prediction for Hearing-impaired Listeners in Stationary and Fluctuation Noise With FADE

    PubMed Central

    Schädler, Marc René; Warzybok, Anna; Meyer, Bernd T.; Brand, Thomas

    2016-01-01

    To characterize the individual patient's hearing impairment as obtained with the matrix sentence recognition test, a simulation Framework for Auditory Discrimination Experiments (FADE) is extended here using the Attenuation and Distortion (A+D) approach by Plomp as a blueprint for setting the individual processing parameters. FADE has been shown to predict the outcome of both speech recognition tests and psychoacoustic experiments based on simulations using an automatic speech recognition system, requiring only a few assumptions. It builds on the closed-set matrix sentence recognition test, which is advantageous for testing individual speech recognition in a way comparable across languages. Individual predictions of speech recognition thresholds in stationary and in fluctuating noise were derived using the audiogram and an estimate of the internal level uncertainty for modeling the individual Plomp curves, fitted to the data with the Attenuation (A-) and Distortion (D-) parameters of the Plomp approach. The “typical” audiogram shapes from Bisgaard et al., with or without a “typical” level uncertainty, and the individual data were used for individual predictions. As a result, the individualization of the level uncertainty was found to be more important than the exact shape of the individual audiogram for accurately modeling the outcome of the German Matrix test in stationary or fluctuating noise for listeners with hearing impairment. The prediction accuracy of the individualized approach also outperforms the (modified) Speech Intelligibility Index approach, which is based on the individual threshold data only. PMID:27604782

  11. Dynamic Simulation of Crime Perpetration and Reporting to Examine Community Intervention Strategies

    ERIC Educational Resources Information Center

    Yonas, Michael A.; Burke, Jessica G.; Brown, Shawn T.; Borrebach, Jeffrey D.; Garland, Richard; Burke, Donald S.; Grefenstette, John J.

    2013-01-01

    Objective: To develop a conceptual computational agent-based model (ABM) to explore community-wide versus spatially focused crime reporting interventions to reduce community crime perpetrated by youth. Method: Agents within the model represent individual residents and interact on a two-dimensional grid representing an abstract nonempirically…

  12. Demonstration of a fully-coupled end-to-end model for small pelagic fish using sardine and anchovy in the California Current

    NASA Astrophysics Data System (ADS)

    Rose, Kenneth A.; Fiechter, Jerome; Curchitser, Enrique N.; Hedstrom, Kate; Bernal, Miguel; Creekmore, Sean; Haynie, Alan; Ito, Shin-ichi; Lluch-Cota, Salvador; Megrey, Bernard A.; Edwards, Chris A.; Checkley, Dave; Koslow, Tony; McClatchie, Sam; Werner, Francisco; MacCall, Alec; Agostini, Vera

    2015-11-01

    We describe and document an end-to-end model of anchovy and sardine population dynamics in the California Current as a proof of principle that such coupled models can be developed and implemented. The end-to-end model is 3-dimensional, time-varying, and multispecies, and consists of four coupled submodels: hydrodynamics, Eulerian nutrient-phytoplankton-zooplankton (NPZ), an individual-based full life cycle anchovy and sardine submodel, and an agent-based fishing fleet submodel. A predator roughly mimicking albacore was included as individuals that consumed anchovy and sardine. All submodels were coded within the ROMS open-source community model, used the same-resolution spatial grid, and were solved simultaneously to allow for possible feedbacks among the submodels. We used a super-individual approach and solved the coupled models on a distributed-memory parallel computer, both of which created challenging but resolvable bookkeeping issues. The anchovy and sardine growth, mortality, reproduction, and movement, and the fishing fleet submodel, were each calibrated using simplified grids before being inserted into the full end-to-end model. A historical simulation of 1959-2008 was performed, and the latter 45 years analyzed. Sea surface height (SSH) and sea surface temperature (SST) for the historical simulation showed strong horizontal gradients and multi-year scale temporal oscillations related to various climate indices (PDO, NPGO), and both showed responses to ENSO variability. Simulated total phytoplankton was lower during strong El Niño events and higher for the strong 1999 La Niña event. The three zooplankton groups generally corresponded to the spatial and temporal variation in simulated total phytoplankton. Simulated biomasses of anchovy and sardine were within the historical range of observed biomasses, but predicted biomasses showed much less inter-annual variation. Anomalies of annual biomasses of anchovy and sardine showed a switch in the mid-1990s from anchovy to sardine dominance. Simulated averaged weights- and lengths-at-age did not vary much across decades, and movement patterns showed anchovy located close to the coast while sardine were more dispersed and farther offshore. Albacore predation on anchovy and sardine was concentrated near the coast in two pockets near the Monterey Bay area and equatorward of Cape Mendocino. Predation mortality from fishing boats was concentrated where sardine age-1 and older individuals were located close to one of the five ports. We demonstrated that it is feasible to perform multi-decadal simulations of a fully-coupled end-to-end model, and that this can be done for a model that follows individual fish and boats on the same 3-dimensional grid as the hydrodynamics. Our focus here was on proof of principle and our results showed that we solved the major technical, bookkeeping, and computational issues. We discuss the next steps to increase computational speed and to include important biological differences between anchovy and sardine. In a companion paper (Fiechter et al., 2015), we further analyze the historical simulation in the context of the various hypotheses that have been proposed to explain the sardine and anchovy cycles.

  13. Data-driven agent-based modeling, with application to rooftop solar adoption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Haifeng; Vorobeychik, Yevgeniy; Letchford, Joshua

    Agent-based modeling is commonly used for studying complex system properties emergent from interactions among many agents. We present a novel data-driven agent-based modeling framework applied to forecasting individual and aggregate residential rooftop solar adoption in San Diego county. Our first step is to learn a model of individual agent behavior from combined data of individual adoption characteristics and property assessment. We then construct an agent-based simulation with the learned model embedded in artificial agents, and proceed to validate it using a holdout sequence of collective adoption decisions. We demonstrate that the resulting agent-based model successfully forecasts solar adoption trends and provides a meaningful quantification of uncertainty about its predictions. We utilize our model to optimize two classes of policies aimed at spurring solar adoption: one that subsidizes the cost of adoption, and another that gives away free systems to low-income households. We find that the optimal policies derived for the latter class are significantly more efficacious, whereas policies similar to the current California Solar Initiative incentive scheme appear to have a limited impact on overall adoption trends.
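    The workflow this abstract describes — learn an individual-level behavior model from data, then embed it in artificial agents and simulate aggregate adoption — can be sketched as follows. This is an illustrative toy, not the authors' code: the logistic coefficients, income range, and peer-effect term are assumptions, not fitted values.

    ```python
    import math
    import random

    def adoption_prob(income, peer_frac, coef=(-6.0, 0.00002, 3.0)):
        """Stand-in for a behavior model learned from adoption and property
        assessment data: logistic in household income and the fraction of
        peers who have already adopted (coefficients are illustrative)."""
        b0, b_income, b_peer = coef
        z = b0 + b_income * income + b_peer * peer_frac
        return 1.0 / (1.0 + math.exp(-z))

    def simulate_adoption(n_agents=500, months=60, seed=7):
        """Embed the learned model in agents and return the cumulative
        adoption fraction per month."""
        rng = random.Random(seed)
        incomes = [rng.uniform(30_000, 150_000) for _ in range(n_agents)]
        adopted = [False] * n_agents
        curve = []
        for _ in range(months):
            peer_frac = sum(adopted) / n_agents  # aggregate feeds back to agents
            for i in range(n_agents):
                if not adopted[i] and rng.random() < adoption_prob(incomes[i], peer_frac):
                    adopted[i] = True
            curve.append(sum(adopted) / n_agents)
        return curve

    curve = simulate_adoption()
    ```

    A real application would replace `adoption_prob` with the model actually learned from the adoption and assessment data, and validate `curve` against a holdout sequence as the paper does.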

  14. Data-driven agent-based modeling, with application to rooftop solar adoption

    DOE PAGES

    Zhang, Haifeng; Vorobeychik, Yevgeniy; Letchford, Joshua; ...

    2016-01-25

    Agent-based modeling is commonly used for studying complex system properties emergent from interactions among many agents. We present a novel data-driven agent-based modeling framework applied to forecasting individual and aggregate residential rooftop solar adoption in San Diego county. Our first step is to learn a model of individual agent behavior from combined data of individual adoption characteristics and property assessment. We then construct an agent-based simulation with the learned model embedded in artificial agents, and proceed to validate it using a holdout sequence of collective adoption decisions. We demonstrate that the resulting agent-based model successfully forecasts solar adoption trends and provides a meaningful quantification of uncertainty about its predictions. We utilize our model to optimize two classes of policies aimed at spurring solar adoption: one that subsidizes the cost of adoption, and another that gives away free systems to low-income households. We find that the optimal policies derived for the latter class are significantly more efficacious, whereas policies similar to the current California Solar Initiative incentive scheme appear to have a limited impact on overall adoption trends.

  15. Reducing the Complexity of an Agent-Based Local Heroin Market Model

    PubMed Central

    Heard, Daniel; Bobashev, Georgiy V.; Morris, Robert J.

    2014-01-01

    This project explores techniques for reducing the complexity of an agent-based model (ABM). The analysis involved a model developed from the ethnographic research of Dr. Lee Hoffer in the Larimer area heroin market, which involved drug users, drug sellers, homeless individuals and police. The authors used statistical techniques to create a reduced version of the original model which maintained simulation fidelity while reducing computational complexity. This involved identifying key summary quantities of individual customer behavior as well as overall market activity and replacing some agents with probability distributions and regressions. The model was then extended to allow external market interventions in the form of police busts. Extensions of this research perspective, as well as its strengths and limitations, are discussed. PMID:25025132

  16. Stochastic foundations in nonlinear density-regulation growth

    NASA Astrophysics Data System (ADS)

    Méndez, Vicenç; Assaf, Michael; Horsthemke, Werner; Campos, Daniel

    2017-08-01

    In this work we construct individual-based models that give rise to the generalized logistic model at the mean-field deterministic level and that allow us to interpret the parameters of these models in terms of individual interactions. We also study the effect of internal fluctuations on the long-time dynamics for the different models that have been widely used in the literature, such as the theta-logistic and Savageau models. In particular, we determine the conditions for population extinction and calculate the mean time to extinction. If the population does not become extinct, we obtain analytical expressions for the population abundance distribution. Our theoretical results are based on WKB theory and the probability generating function formalism and are verified by numerical simulations.
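    The kind of individual-based formulation the abstract describes can be made concrete with an exact stochastic (Gillespie) simulation of a logistic birth-death process, whose mean-field limit is the logistic equation. The numerical estimate of the mean time to extinction below is a sketch under assumed rates, not the paper's WKB calculation; the specific rate parameterization is an illustrative choice.

    ```python
    import random

    def extinction_time(n0=10, b=2.0, d=1.0, K=10.0, t_max=500.0, seed=1):
        """Gillespie simulation of a logistic birth-death process.

        Per-capita birth rate b and death rate d + (b - d) * n / K give the
        mean-field equation dn/dt = (b - d) * n * (1 - n / K).
        Returns the time at which n hits 0, capped at t_max."""
        rng = random.Random(seed)
        n, t = n0, 0.0
        while n > 0 and t < t_max:
            birth_rate = b * n
            death_rate = (d + (b - d) * n / K) * n
            total = birth_rate + death_rate
            t += rng.expovariate(total)            # exponential waiting time
            if rng.random() * total < birth_rate:  # pick the next event
                n += 1
            else:
                n -= 1
        return min(t, t_max)

    # Monte Carlo estimate of the mean time to extinction
    times = [extinction_time(seed=s) for s in range(20)]
    mean_T = sum(times) / len(times)
    ```

    The analytical WKB results in the paper replace this brute-force averaging, which becomes prohibitively slow as the carrying capacity grows and extinction becomes exponentially rare.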

  17. Simulation tools for particle-based reaction-diffusion dynamics in continuous space

    PubMed Central

    2014-01-01

    Particle-based reaction-diffusion algorithms facilitate the modeling of the diffusional motion of individual molecules and the reactions between them in cellular environments. A physically realistic model, depending on the system at hand and the questions asked, would require different levels of modeling detail such as particle diffusion, geometrical confinement, particle volume exclusion or particle-particle interaction potentials. Higher levels of detail usually correspond to an increased number of parameters and higher computational cost. Certain systems, however, require these investments to be modeled adequately. Here we present a review of the current field of particle-based reaction-diffusion software packages operating on continuous space. Four nested levels of modeling detail are identified that capture increasing amounts of detail. Their applicability to different biological questions is discussed, ranging from straightforward diffusion simulations to sophisticated and expensive models that bridge towards coarse-grained molecular dynamics. PMID:25737778
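    The lowest of the nested levels of detail — free Brownian motion plus a reaction radius, with no volume exclusion or interaction potentials — can be sketched in a few lines. This is a generic Doi-style scheme in 2-D, assumed for illustration; it is not taken from any of the reviewed packages, and the diffusion constant, time step, and reaction radius are arbitrary.

    ```python
    import math
    import random

    def step(A, B, D=1.0, dt=1e-3, r_react=0.1, rng=random):
        """One step of a minimal particle-based reaction-diffusion scheme:
        Brownian displacement of every particle, then irreversible
        A + B -> 0 for any pair closer than the reaction radius."""
        sigma = math.sqrt(2.0 * D * dt)  # SD of each coordinate displacement
        A = [(x + rng.gauss(0, sigma), y + rng.gauss(0, sigma)) for x, y in A]
        B = [(x + rng.gauss(0, sigma), y + rng.gauss(0, sigma)) for x, y in B]
        gone_a, gone_b = set(), set()
        for i, (ax, ay) in enumerate(A):
            for j, (bx, by) in enumerate(B):
                if j in gone_b:
                    continue
                if (ax - bx) ** 2 + (ay - by) ** 2 < r_react ** 2:
                    gone_a.add(i)
                    gone_b.add(j)
                    break  # each A reacts at most once per step
        A = [p for i, p in enumerate(A) if i not in gone_a]
        B = [p for j, p in enumerate(B) if j not in gone_b]
        return A, B

    rng = random.Random(0)
    A = [(rng.random(), rng.random()) for _ in range(50)]
    B = [(rng.random(), rng.random()) for _ in range(50)]
    for _ in range(100):
        A, B = step(A, B, rng=rng)
    ```

    Adding confinement, excluded volume, or pair potentials — the higher levels of detail the review distinguishes — would replace the free displacement and the naive all-pairs reaction test with progressively more expensive machinery.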

  18. Simulation-based cutaneous surgical-skill training on a chicken-skin bench model in a medical undergraduate program.

    PubMed

    Denadai, Rafael; Saad-Hossne, Rogério; Martinhão Souto, Luís Ricardo

    2013-05-01

    Because of the ethical and medico-legal issues involved in training cutaneous surgical skills on living patients, human cadavers, and living animals, alternative and effective forms of simulation-based training must be sought. We propose and describe an alternative methodology for teaching and learning the principles of cutaneous surgery in a medical undergraduate program by using a chicken-skin bench model. One instructor for every four students, teaching materials on cutaneous surgical skills, chicken trunks, wings, or thighs, a rigid platform support, needled threads, needle holders, surgical blades with scalpel handles, rat-tooth tweezers, scissors, and marking pens were necessary for training simulation. A proposal for simulation-based training on incision, suture, biopsy, and reconstruction techniques using a chicken-skin bench model, distributed over several sessions with increasing levels of difficulty, was structured. Both feedback and objective evaluations, always directed to individual students, were also outlined. Teaching the principles of cutaneous surgery using a chicken-skin bench model, which is versatile, portable, easy to assemble, and inexpensive, is an alternative and complementary option to the armamentarium of methods based on the other bench models described.

  19. Discovering the Power of Individual-Based Modelling in Teaching and Learning: The Study of a Predator-Prey System

    NASA Astrophysics Data System (ADS)

    Ginovart, Marta

    2014-08-01

    The general aim is to promote the use of individual-based models (biological agent-based models) in teaching and learning contexts in life sciences and to make their progressive incorporation into academic curricula easier, complementing other existing modelling strategies more frequently used in the classroom. Modelling activities for the study of a predator-prey system for a mathematics classroom in the first year of an undergraduate program in biosystems engineering have been designed and implemented. These activities were designed to put two modelling approaches side by side, an individual-based model and a set of ordinary differential equations. In order to organize and display this, a system with wolves and sheep in a confined domain was considered and studied. With the teaching material elaborated and a computer to perform the numerical resolutions involved and the corresponding individual-based simulations, the students answered questions and completed exercises to achieve the learning goals set. Students' responses regarding the modelling of biological systems and these two distinct methodologies applied to the study of a predator-prey system were collected via questionnaires, open-ended queries and face-to-face dialogues. Taking into account the positive responses of the students when they were doing these activities, it was clear that using a discrete individual-based model to deal with a predator-prey system jointly with a set of ordinary differential equations enriches the understanding of the modelling process, adds new insights and opens novel perspectives of what can be done with computational models versus other models. The complementary views given by the two modelling approaches were very well assessed by students.
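    The two modelling approaches the activities put side by side can be sketched together: an Euler integration of the Lotka-Volterra equations, and a discrete individual-based counterpart in which each sheep and wolf is a separate stochastic unit. The parameter values and the per-step Bernoulli scheme are illustrative assumptions, not the classroom material itself.

    ```python
    import random

    # Illustrative Lotka-Volterra rates shared by both formulations
    ALPHA, BETA, GAMMA, DELTA = 0.1, 0.002, 0.0005, 0.05

    def ode_step(s, w, dt):
        """Euler step of ds/dt = ALPHA*s - BETA*s*w,
        dw/dt = GAMMA*s*w - DELTA*w."""
        return s + dt * (ALPHA * s - BETA * s * w), w + dt * (GAMMA * s * w - DELTA * w)

    def ibm_step(s, w, dt, rng):
        """Individual-based counterpart: each sheep may reproduce or be eaten,
        each wolf may reproduce or die, with per-step probabilities matching
        the ODE rates."""
        births_s = sum(rng.random() < ALPHA * dt for _ in range(s))
        deaths_s = sum(rng.random() < BETA * w * dt for _ in range(s))
        births_w = sum(rng.random() < GAMMA * s * dt for _ in range(w))
        deaths_w = sum(rng.random() < DELTA * dt for _ in range(w))
        return max(s + births_s - deaths_s, 0), max(w + births_w - deaths_w, 0)

    rng = random.Random(42)
    s_ode, w_ode = 100.0, 20.0   # continuous populations
    s_ibm, w_ibm = 100, 20       # integer counts of individuals
    for _ in range(2000):
        s_ode, w_ode = ode_step(s_ode, w_ode, 0.01)
        s_ibm, w_ibm = ibm_step(s_ibm, w_ibm, 0.01, rng)
    ```

    Running both from the same initial state makes the pedagogical contrast concrete: the ODE trajectory is smooth and deterministic, while the individual-based run fluctuates and can even drive a population extinct, which is exactly the kind of behavior the complementary views expose.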

  20. Gap Models as Tools for Sustainable Development under Environmental Changes in Northern Eurasia

    NASA Astrophysics Data System (ADS)

    Shugart, H. H., Jr.; Wang, B.; Brazhnik, K.; Armstrong, A. H.; Foster, A.

    2017-12-01

    Agent-based models of complex systems, or, as used in this review, individual-based models (IBMs), emerged in the 1960s and early 1970s across diverse disciplines from astronomy to zoology. IBMs arose from a deeply embedded ecological tradition of understanding the dynamics of ecosystems from a "bottom-up" accounting of the interactions of the parts. In this case, individual trees are principal among the parts. Because they are computationally demanding, these models have prospered as the power of digital computers has increased exponentially over the decades following the 1970s. Forest IBMs are no longer computationally barred from continental- or global-scale simulations of the responses of forests to climate and other changes. Gap models simulate the changes in forests by simulating the birth, growth and death of each individual tree on small plots of land that in summation comprise a forest (or a set of sample plots on a forested landscape or region). Gap models have now grown to continental-scale and even global-scale applications that assess the potential consequences of climate change on natural forests. These predictions are valuable in the planning and anticipatory decision-making needed to sustainably manage a vast region such as Northern Eurasia. Modifications to the models have enabled simulation of disturbances including fire, insect outbreak and harvest. These disturbances have significant exogenous drivers, notably weather variables, but their effects are also a function of the endogenous conditions involving the structure of the forest itself. This feedback between the forest and its environment can in some cases produce hysteresis and multiple stable operating regimes for forests. Such responses, often characterized as "tipping points," could play a significant role in increasing risk under environmental change, notably global warming. Such dynamics in a management context imply regional systems that could be "unforgiving" of management mistakes.

  1. Capturing multi-stage fuzzy uncertainties in hybrid system dynamics and agent-based models for enhancing policy implementation in health systems research.

    PubMed

    Liu, Shiyong; Triantis, Konstantinos P; Zhao, Li; Wang, Youfa

    2018-01-01

    In practical research, it has been found that most people make health-related decisions based not on numerical data but on perceptions. Examples include the perceptions, and their corresponding linguistic values, of health risks such as smoking, syringe sharing, eating energy-dense food, and drinking sugar-sweetened beverages. To understand the mechanisms that affect the implementation of health-related interventions, we employ fuzzy variables to quantify linguistic variables in healthcare modeling, using an integrated system dynamics and agent-based model. In a nonlinear causal-driven simulation environment driven by feedback loops, we mathematically demonstrate how interventions at an aggregate level affect the dynamics of linguistic variables that are captured by fuzzy agents, and how interactions among fuzzy agents, at the same time, affect the formation of the different clusters (groups) that are targeted by specific interventions. In this paper, we provide an innovative framework to capture multi-stage fuzzy uncertainties manifested among interacting heterogeneous agents (individuals) and intervention decisions that affect homogeneous agents (groups of individuals) in a hybrid model that combines an agent-based simulation model (ABM) and a system dynamics model (SDM). Having built the platform to incorporate high-dimension data in a hybrid ABM/SDM model, this paper demonstrates how one can obtain the state-variable behaviors in the SDM and the corresponding values of linguistic variables in the ABM. This research not only enriches the application of fuzzy set theory by capturing the dynamics of variables associated with interacting fuzzy agents that lead to aggregate behaviors, but also informs implementation research by enabling the incorporation of linguistic variables at both individual and institutional levels, which makes unstructured linguistic data meaningful and quantifiable in a simulation environment. It can help practitioners and decision makers gain a better understanding of the dynamics and complexities of precision intervention in healthcare, and it can aid the optimal allocation of resources for targeted group(s) and the achievement of maximum utility. As this technology matures, one could design policy flight simulators with which policy and intervention designers test a variety of assumptions when evaluating alternative interventions.

  2. A Novel Framework for Characterizing Exposure-Related ...

    EPA Pesticide Factsheets

    Descriptions of where and how individuals spend their time are important for characterizing exposures to chemicals in consumer products and in indoor environments. Herein we create an agent-based model (ABM) that is able to simulate longitudinal patterns in behaviors. By basing our ABM upon a needs-based artificial intelligence (AI) system, we create agents that mimic human decisions on these exposure-relevant behaviors. In a case study of adults, we use the AI to predict the inter-individual variation in the start time and duration of four behaviors: sleeping, eating, commuting, and working. The results demonstrate that the ABM can capture both inter-individual variation and how decisions on one behavior can affect subsequent behaviors. The aims are to present NERL's research on the use of agent-based modeling in exposure assessments and to obtain feedback on the approach from leading experts in the field.
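    The core of a needs-based AI agent can be sketched very simply: each hour, unmet needs grow and the agent performs the behavior that satisfies its currently largest need. The need names, growth rates, and reset-to-zero rule below are illustrative assumptions, not the model's actual parameterization.

    ```python
    def choose_behavior(needs):
        """Pick the behavior whose need is currently most pressing
        (a minimal needs-based decision rule)."""
        return max(needs, key=needs.get)

    def simulate_day(needs, growth, hours=24):
        """Each hour, unmet needs grow (capped at 1.0) and the agent performs
        the behavior addressing its largest need, resetting that need to zero.
        Returns the hour-by-hour schedule of chosen behaviors."""
        schedule = []
        for _ in range(hours):
            needs = {k: min(v + growth[k], 1.0) for k, v in needs.items()}
            act = choose_behavior(needs)
            needs[act] = 0.0
            schedule.append(act)
        return schedule

    # Illustrative needs and per-hour growth rates
    needs = {"sleep": 0.5, "eat": 0.2, "work": 0.1}
    growth = {"sleep": 0.05, "eat": 0.15, "work": 0.10}
    schedule = simulate_day(needs, growth)
    ```

    Because later choices depend on which needs earlier behaviors satisfied, even this toy rule reproduces the key property the abstract highlights: decisions on one behavior affect subsequent behaviors.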

  3. Simulated western spruce budworm defoliation reduces torching and crowning potential: A sensitivity analysis using a physics-based fire model

    Treesearch

    Gregory M. Cohn; Russell A. Parsons; Emily K. Heyerdahl; Daniel G. Gavin; Aquila Flower

    2014-01-01

    The widespread, native defoliator western spruce budworm (Choristoneura occidentalis Freeman) reduces canopy fuels, which might affect the potential for surface fires to torch (ignite the crowns of individual trees) or crown (spread between tree crowns). However, the effects of defoliation on fire behaviour are poorly understood. We used a physics-based fire model to...

  4. A Spatial Agent-Based Model for the Simulation of Adults’ Daily Walking Within a City

    PubMed Central

    Yang, Yong; Roux, Ana V. Diez; Auchincloss, Amy H.; Rodriguez, Daniel A.; Brown, Daniel G.

    2012-01-01

    Environmental effects on walking behavior have received attention in recent years because of the potential for policy interventions to increase population levels of walking. Most epidemiologic studies describe associations of walking behavior with environmental features. These analyses ignore the dynamic processes that shape walking behaviors. A spatial agent-based model (ABM) was developed to simulate people's walking behaviors within a city. Each individual was assigned properties such as age, SES, walking ability, attitude toward walking, and a home location. Individuals perform different activities on a regular basis such as traveling for work, for shopping, and for recreation. Whether an individual walks, and how much she or he walks, is a function of the distance to different activities and of her or his walking ability and attitude toward walking. An individual's attitude toward walking evolves over time as a function of past experiences, walking of others along the walking route, limits on distances walked per day, and the attitudes toward walking of the other individuals within her/his social network. The model was calibrated and used to examine the contributions of land use and safety to socioeconomic differences in walking. With further refinement and validation, ABMs may help to better understand the determinants of walking and identify the most promising interventions to increase walking. PMID:21335269
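    An attitude-evolution rule of the kind the abstract describes can be sketched as a weighted blend of the agent's current attitude, today's experience, and the mean attitude of its social network. The specific linear form and weights are illustrative assumptions, not the paper's calibrated rule.

    ```python
    def update_attitude(attitude, walked_today, network_attitudes,
                        w_self=0.9, w_experience=0.05, w_social=0.05):
        """Evolve an agent's attitude toward walking (a value in [0, 1]).

        The weights sum to 1, so with inputs in [0, 1] the updated attitude
        stays in [0, 1]. `walked_today` stands in for past experience;
        `network_attitudes` are the attitudes of social-network peers.
        """
        experience = 1.0 if walked_today else 0.0
        social = sum(network_attitudes) / len(network_attitudes)
        return w_self * attitude + w_experience * experience + w_social * social

    # One update for an agent with two peers: one enthusiastic, one not
    new_attitude = update_attitude(0.5, True, [1.0, 0.0])
    ```

    In a full ABM, this update would run each simulated day after the walking decision, closing the feedback loop between behavior and attitude that static regression analyses ignore.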

  5. InSTREAM: the individual-based stream trout research and environmental assessment model

    Treesearch

    Steven F. Railsback; Bret C. Harvey; Stephen K. Jackson; Roland H. Lamberson

    2009-01-01

    This report documents Version 4.2 of InSTREAM, including its formulation, software, and application to research and management problems. InSTREAM is a simulation model designed to understand how stream and river salmonid populations respond to habitat alteration, including altered flow, temperature, and turbidity regimes and changes in channel morphology. The model...

  6. Simulation-based Education for Endoscopic Third Ventriculostomy: A Comparison Between Virtual and Physical Training Models.

    PubMed

    Breimer, Gerben E; Haji, Faizal A; Bodani, Vivek; Cunningham, Melissa S; Lopez-Rios, Adriana-Lucia; Okrainec, Allan; Drake, James M

    2017-02-01

    The relative educational benefits of virtual reality (VR) and physical simulation models for endoscopic third ventriculostomy (ETV) have not been evaluated "head to head." To compare and identify the relative utility of a physical and a VR ETV simulation model for use in neurosurgical training. Twenty-three neurosurgical residents and 3 fellows performed an ETV on both a physical and a VR simulation model. Trainees rated the models using 5-point Likert scales evaluating the domains of anatomy, instrument handling, procedural content, and the overall fidelity of the simulation. Paired t tests were performed for each domain's mean overall score and individual items. The VR model has relative benefits compared with the physical model with respect to realistic representation of intraventricular anatomy at the foramen of Monro (4.5, standard deviation [SD] = 0.7 vs 4.1, SD = 0.6; P = .04) and the third ventricle floor (4.4, SD = 0.6 vs 4.0, SD = 0.9; P = .03), although the overall anatomy score was similar (4.2, SD = 0.6 vs 4.0, SD = 0.6; P = .11). For overall instrument handling and procedural content, the physical simulator outperformed the VR model (3.7, SD = 0.8 vs 4.5, SD = 0.5; P < .001 and 3.9, SD = 0.8 vs 4.2, SD = 0.6; P = .02, respectively). Overall task fidelity across the 2 simulators was not perceived as significantly different. Simulation model selection should be based on educational objectives. Training focused on learning anatomy or decision-making for anatomic cues may be aided by the VR simulation model. A focus on developing manual dexterity and technical skills using endoscopic equipment in the operating room may be better learned on the physical simulation model. Copyright © 2016 by the Congress of Neurological Surgeons

  7. Bootstrap-based methods for estimating standard errors in Cox's regression analyses of clustered event times.

    PubMed

    Xiao, Yongling; Abrahamowicz, Michal

    2010-03-30

    We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data, with latent cluster-level random effects, which are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, correct type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid the serious variance under-estimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
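    The two resampling schemes can be sketched generically. To keep the example self-contained, the statistic below is a pooled mean standing in for the Cox regression coefficient the paper bootstraps; the toy data and replicate count are illustrative assumptions.

    ```python
    import random

    def cluster_bootstrap_se(clusters, stat, n_boot=1000, seed=0):
        """Cluster-bootstrap SE: resample whole clusters with replacement,
        recompute `stat` on each replicate, and take the SD of the replicates."""
        rng = random.Random(seed)
        reps = []
        for _ in range(n_boot):
            resampled = [rng.choice(clusters) for _ in clusters]
            reps.append(stat(resampled))
        m = sum(reps) / n_boot
        return (sum((r - m) ** 2 for r in reps) / (n_boot - 1)) ** 0.5

    def two_step_resample(clusters, rng):
        """Two-step variant: resample clusters, then individuals within each
        selected cluster, both with replacement."""
        picked = [rng.choice(clusters) for _ in clusters]
        return [[rng.choice(c) for _ in c] for c in picked]

    def pooled_mean(clusters):
        """Stand-in statistic; a real analysis would refit Cox's model here."""
        values = [x for c in clusters for x in c]
        return sum(values) / len(values)

    # Toy clustered observations (three clusters of correlated values)
    data = [[1.0, 1.2, 0.9], [3.0, 3.1], [2.0, 2.2, 1.9, 2.1]]
    se = cluster_bootstrap_se(data, pooled_mean)
    ```

    Swapping `pooled_mean` for a function that refits the Cox model and returns a coefficient reproduces the paper's procedure; the two-step scheme would pass `two_step_resample` output to the same statistic.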

  8. Pattern formation in individual-based systems with time-varying parameters

    NASA Astrophysics Data System (ADS)

    Ashcroft, Peter; Galla, Tobias

    2013-12-01

    We study the patterns generated in finite-time sweeps across symmetry-breaking bifurcations in individual-based models. Similar to the well-known Kibble-Zurek scenario of defect formation, large-scale patterns are generated when model parameters are varied slowly, whereas fast sweeps produce a large number of small domains. The symmetry breaking is triggered by intrinsic noise, originating from the discrete dynamics at the microlevel. Based on a linear-noise approximation, we calculate the characteristic length scale of these patterns. We demonstrate the applicability of this approach in a simple model of opinion dynamics, a model in evolutionary game theory with a time-dependent fitness structure, and a model of cell differentiation. Our theoretical estimates are confirmed in simulations. In further numerical work, we observe a similar phenomenon when the symmetry-breaking bifurcation is triggered by population growth.

  9. Modeling Of In-Vehicle Human Exposure to Ambient Fine Particulate Matter

    PubMed Central

    Liu, Xiaozhen; Frey, H. Christopher

    2012-01-01

    A method for estimating in-vehicle PM2.5 exposure as part of a scenario-based population simulation model is developed and assessed. In existing models, such as the Stochastic Exposure and Dose Simulation model for Particulate Matter (SHEDS-PM), in-vehicle exposure is estimated using linear regression based on area-wide ambient PM2.5 concentration. An alternative modeling approach is explored based on estimation of near-road PM2.5 concentration and an in-vehicle mass balance. Near-road PM2.5 concentration is estimated using a dispersion model and fixed site monitor (FSM) data. In-vehicle concentration is estimated based on air exchange rate and filter efficiency. In-vehicle concentration varies with road type, traffic flow, windspeed, stability class, and ventilation. Average in-vehicle exposure is estimated to contribute 10 to 20 percent of average daily exposure. The contribution of in-vehicle exposure to total daily exposure can be higher for some individuals. Recommendations are made for updating exposure models and implementation of the alternative approach. PMID:23101000

  10. RY-Coding and Non-Homogeneous Models Can Ameliorate the Maximum-Likelihood Inferences From Nucleotide Sequence Data with Parallel Compositional Heterogeneity.

    PubMed

    Ishikawa, Sohta A; Inagaki, Yuji; Hashimoto, Tetsuo

    2012-01-01

    In phylogenetic analyses of nucleotide sequences, 'homogeneous' substitution models, which assume the stationarity of base composition across a tree, are widely used, even though individual sequences may bear distinctive base frequencies. In the worst-case scenario, a homogeneous model-based analysis can yield an artifactual union of two distantly related sequences that achieved similar base frequencies in parallel. This potential difficulty can be countered by two approaches, 'RY-coding' and 'non-homogeneous' models. The former approach converts the four bases into purines and pyrimidines to normalize base frequencies across a tree, while the latter explicitly incorporates the heterogeneity in base frequency. The two approaches have been applied to real-world sequence data; however, their basic properties have not been fully examined in simulation studies. Here, we assessed the performances of maximum-likelihood analyses incorporating RY-coding and a non-homogeneous model (RY-coding and non-homogeneous analyses) on simulated data with parallel convergence to similar base composition. Both RY-coding and non-homogeneous analyses showed superior performance compared with homogeneous model-based analyses. Curiously, the performance of the RY-coding analysis appeared to be affected by the setting of the substitution process for sequence simulation to a greater degree than that of the non-homogeneous analysis. The performance of the non-homogeneous analysis was also validated by analyzing a real-world sequence data set with significant base heterogeneity.
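    RY-coding itself is a simple recoding, sketched below: purines (A, G) map to R and pyrimidines (C, T/U) to Y, so sequences that converged to very different GC contents can become identical in R/Y space. The pass-through handling of gaps and ambiguity codes is a common convention assumed here.

    ```python
    RY = {"A": "R", "G": "R", "C": "Y", "T": "Y", "U": "Y"}

    def ry_code(seq):
        """Recode nucleotides as purines (R) or pyrimidines (Y); gaps and
        ambiguity symbols pass through unchanged."""
        return "".join(RY.get(base, base) for base in seq.upper())

    # A GC-rich and an AT-rich sequence with the same purine/pyrimidine
    # pattern collapse to the same RY sequence, normalizing base frequencies.
    gc_rich = "GGCGCCGG"
    at_rich = "AATATTAA"
    ```

    This is exactly why RY-coding counters compositional attraction: the artifactual signal lives in the within-purine and within-pyrimidine frequency differences that the recoding discards.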

  11. [Team training and assessment in mixed reality-based simulated operating room : Current state of research in the field of simulation in spine surgery exemplified by the ATMEOS project].

    PubMed

    Stefan, P; Pfandler, M; Wucherer, P; Habert, S; Fürmetz, J; Weidert, S; Euler, E; Eck, U; Lazarovici, M; Weigl, M; Navab, N

    2018-04-01

    Surgical simulators are being increasingly used as an attractive alternative to clinical training in addition to conventional animal models and human specimens. Typically, surgical simulation technology is designed for the purpose of teaching technical surgical skills (so-called task trainers). Simulator training in surgery is therefore in general limited to the individual training of the surgeon and disregards the participation of the rest of the surgical team. The objective of the project Assessment and Training of Medical Experts based on Objective Standards (ATMEOS) is to develop an immersive simulated operating room environment that enables the training and assessment of multidisciplinary surgical teams under various conditions. Using a mixed reality approach, a synthetic patient model, real surgical instruments and radiation-free virtual X-ray imaging are combined into a simulation of spinal surgery. In previous research studies, the concept was evaluated in terms of realism, plausibility and immersiveness. In the current research, assessment measurements for technical and non-technical skills are developed and evaluated. The aim is to observe multidisciplinary surgical teams in the simulated operating room during minimally invasive spinal surgery and objectively assess the performance of the individual team members and the entire team. Moreover, the effectiveness of training methods and surgical techniques, or success-critical factors such as the management of crisis situations, can be captured and objectively assessed in the controlled environment.

  12. Trait-based Modeling of Larval Dispersal in the Gulf of Maine

    NASA Astrophysics Data System (ADS)

    Jones, B.; Richardson, D.; Follows, M. J.; Hill, C. N.; Solow, A.; Ji, R.

    2016-02-01

    Population connectivity of marine species is the inter-generational movement of individuals among geographically separated subpopulations and is a crucial determinant of population dynamics, community structure, and optimal management strategies. For many marine species, population connectivity is largely determined by the dispersal patterns that emerge from a pelagic larval phase. These dispersal patterns are a result of interactions between the physical environment, adult spawning strategy, and larval ecology. Using a generalized trait-based model that represents the adult spawning strategy as a distribution of larval releases in time and space and the larval trait space with the pelagic larval duration, vertical swimming behavior, and settlement habitat preferences, we simulate dispersal patterns in the Gulf of Maine and surrounding regions. We implement this model as an individual-based simulation that tracks Lagrangian particles on a graphics processing unit as they move through hourly archived output from the Finite-Volume Community Ocean Model. The particles are released between the Hudson Canyon and Nova Scotia and the release distributions are determined using a novel method that minimizes the number of simulations required to achieve a predetermined level of precision for the connectivity matrices. The simulated larvae have a variable pelagic larval duration and exhibit multiple forms of dynamic depth-keeping behavior. We describe how these traits influence the dispersal trajectories and connectivity patterns among regions in the northwest Atlantic. Our description includes the probability of successful recruitment, patchiness of larval distributions, and the variability of these properties in time and space under a variety of larval dispersal strategies.
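    The Lagrangian kernel of such an individual-based simulation reduces to repeatedly displacing each particle by the local current plus a small stochastic term. The sketch below uses an invented uniform flow as a stand-in for the hourly archived FVCOM fields; all speeds and coordinates are hypothetical:

```python
import random

def velocity(lon, lat):
    """Idealized steady current (a stand-in for archived FVCOM output):
    a uniform southwestward drift, in degrees per day."""
    return (-0.1, -0.05)

def track(lon, lat, pld_days, dt=1.0, rng=None):
    """Advect one Lagrangian particle for its pelagic larval duration (PLD),
    with a small random walk mimicking unresolved turbulent dispersion."""
    rng = rng or random.Random(0)
    for _ in range(int(pld_days / dt)):
        u, v = velocity(lon, lat)
        lon += (u + rng.gauss(0.0, 0.01)) * dt
        lat += (v + rng.gauss(0.0, 0.01)) * dt
    return lon, lat

# Release one larva near the Gulf of Maine and track it for a 30-day PLD.
end = track(-68.0, 43.0, pld_days=30)
```

    In the full model this step runs on a GPU for many particles at once, with the pelagic larval duration, vertical behavior, and settlement preferences drawn from the larval trait space.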

  13. Towards Linking 3D SAR and Lidar Models with a Spatially Explicit Individual Based Forest Model

    NASA Astrophysics Data System (ADS)

    Osmanoglu, B.; Ranson, J.; Sun, G.; Armstrong, A. H.; Fischer, R.; Huth, A.

    2017-12-01

    In this study, we present a parameterization of the FORMIND individual-based gap model (IBGM) for old-growth Atlantic lowland rainforest in La Selva, Costa Rica, for the purpose of informing multisensor remote sensing techniques for aboveground biomass estimation. The model was successfully parameterized and calibrated for the study site; results show that the simulated forest reproduces the structural complexity of the Costa Rican rainforest, based on comparisons with CARBONO inventory plot data. Although the simulated stem number (378) slightly underestimated the plot data (418), particularly for canopy-dominant intermediate shade-tolerant trees and shade-tolerant understory trees, overall there was only a 9.7% difference. Aboveground biomass (kg/ha) showed a 0.1% difference between the simulated forest and the inventory plot dataset. The Costa Rica FORMIND simulation was then used to parameterize spatially explicit (3D) SAR and lidar backscatter models. The simulated forest stands were used to generate a look-up table (LUT) as a tractable means to estimate aboveground forest biomass for these complex forests. Various combinations of lidar and radar variables were evaluated in the LUT inversion. To test the capability of future data for estimation of forest height and biomass, we considered 1) L- (or P-) band polarimetric data (backscattering coefficients of HH, HV and VV); 2) L-band dual-pol repeat-pass InSAR data (HH/HV backscattering coefficients and coherences, and height of scattering phase center at HH and HV, using a DEM or surface height from lidar data as reference); 3) P-band polarimetric InSAR data (canopy height from inversion of PolInSAR data, or the coherences and height of scattering phase center at HH, HV and VV); 4) various height indices from waveform lidar data; and 5) surface and canopy-top height from photon-counting lidar data. The methods for parameterizing the remote sensing models with the IBGM and developing look-up tables will be discussed. Results from various remote sensing scenarios will also be presented.

  14. Modelling the Effects of Information Campaigns Using Agent-Based Simulation

    DTIC Science & Technology

    2006-04-01

    …individual i (±1). … The incorporation of media effects into Equation (1) results in a social impact model of the… that minority opinions often survived in a social margin [17]. Nevertheless, compared to the situation where there is no media effect in the simulation… analysis presented in this paper combines word-of-mouth communication and mass media broadcasting into a single line of analysis. The effects of…

  15. Assessing the accuracy of subject-specific, muscle-model parameters determined by optimizing to match isometric strength.

    PubMed

    DeSmitt, Holly J; Domire, Zachary J

    2016-12-01

    Biomechanical models are sensitive to the choice of model parameters. Therefore, determination of accurate subject-specific model parameters is important. One approach to generate these parameters is to optimize the values such that the model output will match experimentally measured strength curves. This approach is attractive as it is inexpensive and should provide an excellent match to experimentally measured strength. However, given the problem of muscle redundancy, it is not clear that this approach generates accurate individual muscle forces. The purpose of this investigation is to evaluate this approach using simulated data to enable a direct comparison. It is hypothesized that the optimization approach will be able to recreate accurate muscle model parameters when information from measurable parameters is given. A model of isometric knee extension was developed to simulate a strength curve across a range of knee angles. In order to realistically recreate experimentally measured strength, random noise was added to the modeled strength. Parameters were solved for using a genetic search algorithm. When noise was added to the measurements, the strength curve was reasonably recreated. However, the individual muscle model parameters and force curves were far less accurate. Based upon this examination, it is clear that very different sets of model parameters can recreate similar strength curves. Therefore, experimental variation in strength measurements has a significant influence on the results. Given the difficulty in accurately recreating individual muscle parameters, it may be more appropriate to perform simulations with lumped actuators representing similar muscles.
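    The redundancy problem can be reproduced in miniature. In the sketch below (two hypothetical lumped muscles, and a crude random search standing in for the paper's genetic algorithm), the noisy strength curve is fit to roughly the noise level even though the recovered parameters need not match the true ones:

```python
import math, random

def strength(angle, f1, f2):
    """Summed torque of two lumped 'muscles' whose effective moment
    arms vary with joint angle (purely illustrative geometry)."""
    return f1 * math.sin(angle) + f2 * math.sin(angle + 0.5)

angles = [math.radians(a) for a in range(40, 100, 5)]
true = (900.0, 600.0)                  # hypothetical muscle strengths (N)
rng = random.Random(1)
measured = [strength(a, *true) + rng.gauss(0, 20) for a in angles]

def rms_error(params):
    return math.sqrt(sum((strength(a, *params) - m) ** 2
                         for a, m in zip(angles, measured)) / len(angles))

# Crude random search as a stand-in for a genetic algorithm: many very
# different (f1, f2) pairs land in the same low-error valley.
best, best_err = None, float("inf")
for _ in range(20000):
    cand = (rng.uniform(0, 2000), rng.uniform(0, 2000))
    err = rms_error(cand)
    if err < best_err:
        best, best_err = cand, err
```

    Because the two basis curves are highly correlated over this angle range, the error surface has a long flat valley: the total curve is well matched while the individual parameters remain poorly determined.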

  16. OAKSIM: An individual-tree growth and yield simulator for managed, even-aged, upland oak stands

    Treesearch

    Donald E. Hilt

    1985-01-01

    OAKSIM is an individual-tree growth and yield simulator for managed, even-aged, upland oak stands. Growth and yield projections for various thinning alternatives can be made with OAKSIM for a period of up to 50 years. Simulator components include an individual-tree diameter growth model, a mortality model, height prediction equations, bark ratio equations, a taper-...

  17. Navigating the flow: individual and continuum models for homing in flowing environments.

    PubMed

    Painter, Kevin J; Hillen, Thomas

    2015-11-06

    Navigation for aquatic and airborne species often takes place in the face of complicated flows, from persistent currents to highly unpredictable storms. Hydrodynamic models are capable of simulating flow dynamics and provide the impetus for much individual-based modelling, in which particle-sized individuals are immersed into a flowing medium. These models yield insights on the impact of currents on population distributions from fish eggs to large organisms, yet their computational demands and intractability reduce their capacity to generate the broader, less parameter-specific, insights allowed by traditional continuous approaches. In this paper, we formulate an individual-based model for navigation within a flowing field and apply scaling to derive its corresponding macroscopic and continuous model. We apply it to various movement classes, from drifters that simply go with the flow to navigators that respond to environmental orienteering cues. The utility of the model is demonstrated via its application to 'homing' problems and, in particular, the navigation of the marine green turtle Chelonia mydas to Ascension Island. © 2015 The Author(s).
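    At the individual level, this model class amounts to summing a passive drift term and an oriented swimming term. A minimal Euler-step sketch with invented speeds (not the paper's formulation):

```python
import math

def step(pos, target, flow, swim_speed, dt):
    """One Euler step: passive drift with the flow plus active
    swimming oriented toward the goal (taxis)."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return pos
    hx, hy = dx / dist, dy / dist      # unit heading toward the goal
    return (pos[0] + (flow[0] + swim_speed * hx) * dt,
            pos[1] + (flow[1] + swim_speed * hy) * dt)

# A navigator crossing a steady lateral current still homes in on the
# target because the oriented swimming component dominates the drift.
pos, target = (0.0, 0.0), (10.0, 0.0)
flow = (0.0, 0.4)                      # cross-current (arbitrary units)
for _ in range(200):
    pos = step(pos, target, flow, swim_speed=1.0, dt=0.1)
```

    The continuum limit of many such individuals, with noisy rather than perfect orientation, is what the scaling argument in the paper turns into a macroscopic drift-diffusion description.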

  18. Spatial Self-Organization of Vegetation Subject to Climatic Stress-Insights from a System Dynamics-Individual-Based Hybrid Model.

    PubMed

    Vincenot, Christian E; Carteni, Fabrizio; Mazzoleni, Stefano; Rietkerk, Max; Giannino, Francesco

    2016-01-01

    In simulation models of populations or communities, individual plants have often been obfuscated in favor of aggregated vegetation. This simplification comes with a loss of biological detail and a smoothing out of the demographic noise engendered by stochastic individual-scale processes and heterogeneities, which is significant among others when studying the viability of small populations facing challenging fluctuating environmental conditions. This consideration has motivated the development of precise plant-centered models. The accuracy gained in the representation of plant biology has then, however, often been balanced by the disappearance in models of important plant-soil interactions (esp. water dynamics) due to the inability of most individual-based frameworks to simulate complex continuous processes. In this study, we used a hybrid modeling approach, namely integrated System Dynamics (SD)-Individual-based (IB), to illustrate the importance of individual plant dynamics to explain spatial self-organization of vegetation in arid environments. We analyzed the behavior of this model under different parameter sets either related to individual plant properties (such as seed dispersal distance and reproductive age) or the environment (such as intensity and yearly distribution of precipitation events). While the results of this work confirmed the prevailing theory on vegetation patterning, they also revealed the importance therein of plant-level processes that cannot be rendered by reaction-diffusion models. Initial spatial distribution of plants, reproductive age, and average seed dispersal distance, by impacting patch size and vegetation aggregation, affected pattern formation and population survival under climatic variations. Besides, changes in precipitation regime altered the demographic structure and spatial organization of vegetation patches by affecting plants differentially depending on their age and biomass. 
Water availability influenced total biomass density non-linearly. Remarkably, lower precipitation resulted in lower mean plant age yet higher mean individual biomass. Moreover, seasonal variations in rainfall greater than a threshold (here, ±0.45 mm from the 1.3 mm baseline) decreased mean total biomass and generated limit cycles, which, in the case of large variations, were preceded by chaotic demographic and spatial behavior. In some cases, peculiar spatial patterns (e.g., rings) were also engendered. On a technical note, the shortcomings of the present model and the benefits of hybrid modeling for virtual investigations in plant science are discussed.

  19. Spatial Self-Organization of Vegetation Subject to Climatic Stress—Insights from a System Dynamics—Individual-Based Hybrid Model

    PubMed Central

    Vincenot, Christian E.; Carteni, Fabrizio; Mazzoleni, Stefano; Rietkerk, Max; Giannino, Francesco

    2016-01-01

    In simulation models of populations or communities, individual plants have often been obfuscated in favor of aggregated vegetation. This simplification comes with a loss of biological detail and a smoothing out of the demographic noise engendered by stochastic individual-scale processes and heterogeneities, which is significant among others when studying the viability of small populations facing challenging fluctuating environmental conditions. This consideration has motivated the development of precise plant-centered models. The accuracy gained in the representation of plant biology has then, however, often been balanced by the disappearance in models of important plant-soil interactions (esp. water dynamics) due to the inability of most individual-based frameworks to simulate complex continuous processes. In this study, we used a hybrid modeling approach, namely integrated System Dynamics (SD)—Individual-based (IB), to illustrate the importance of individual plant dynamics to explain spatial self-organization of vegetation in arid environments. We analyzed the behavior of this model under different parameter sets either related to individual plant properties (such as seed dispersal distance and reproductive age) or the environment (such as intensity and yearly distribution of precipitation events). While the results of this work confirmed the prevailing theory on vegetation patterning, they also revealed the importance therein of plant-level processes that cannot be rendered by reaction-diffusion models. Initial spatial distribution of plants, reproductive age, and average seed dispersal distance, by impacting patch size and vegetation aggregation, affected pattern formation and population survival under climatic variations. Besides, changes in precipitation regime altered the demographic structure and spatial organization of vegetation patches by affecting plants differentially depending on their age and biomass. 
Water availability influenced total biomass density non-linearly. Remarkably, lower precipitation resulted in lower mean plant age yet higher mean individual biomass. Moreover, seasonal variations in rainfall greater than a threshold (here, ±0.45 mm from the 1.3 mm baseline) decreased mean total biomass and generated limit cycles, which, in the case of large variations, were preceded by chaotic demographic and spatial behavior. In some cases, peculiar spatial patterns (e.g., rings) were also engendered. On a technical note, the shortcomings of the present model and the benefits of hybrid modeling for virtual investigations in plant science are discussed. PMID:27252707

  20. Using Agent Based Modeling (ABM) to Develop Cultural Interaction Simulations

    NASA Technical Reports Server (NTRS)

    Drucker, Nick; Jones, Phillip N.

    2012-01-01

    Today, most cultural training is based on or built around "cultural engagements", or discrete interactions between the individual learner and one or more cultural "others". Often, success in the engagement is the end or the objective. In reality, these interactions usually involve secondary and tertiary effects with potentially wide-ranging consequences. The concern is that learning culture within a strict engagement context might lead to "checklist" cultural thinking that will not empower learners to understand the full consequences of their actions. We propose the use of agent-based modeling (ABM) to collect and store engagement effects and, by simulating the effects of social networks, propagate them over time, distance, and consequence. The ABM development allows for rapid modification to re-create any number of population types, extending the applicability of the model to any requirement for social modeling.

  1. A simulation framework for mapping risks in clinical processes: the case of in-patient transfers.

    PubMed

    Dunn, Adam G; Ong, Mei-Sing; Westbrook, Johanna I; Magrabi, Farah; Coiera, Enrico; Wobcke, Wayne

    2011-05-01

    To model how individual violations in routine clinical processes cumulatively contribute to the risk of adverse events in hospital, using an agent-based simulation framework. An agent-based simulation was designed to model the cascade of common violations that contribute to the risk of adverse events in routine clinical processes. Clinicians and the information systems that support them were represented as a group of interacting agents using data from direct observations. The model was calibrated using data from 101 patient transfers observed in a hospital, and results were validated for one of two scenarios (a misidentification scenario and an infection control scenario). Repeated simulations using the calibrated model were undertaken to create a distribution of possible process outcomes. The likelihood of end-of-chain risk is the main outcome measure, reported for each of the two scenarios. The simulations demonstrate end-of-chain risks of 8% and 24% for the misidentification and infection control scenarios, respectively. Over 95% of the simulations in both scenarios are unique, indicating that the in-patient transfer process diverges from prescribed work practices in a variety of ways. The simulation allowed us to model the risk of adverse events in a clinical process by generating the variety of possible work trajectories subject to violations, a novel method of prospective risk analysis. The in-patient transfer process has a high proportion of unique trajectories, implying that risk mitigation may benefit from focusing on reducing complexity rather than augmenting the process with further rule-based protocols.
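    The structure of such a simulation can be sketched as a Monte Carlo cascade in which each process step may deviate from protocol. The probabilities, step count, and all-steps-violated trigger below are illustrative simplifications, not the calibrated values from the observed transfers:

```python
import random

def simulate_transfer(p_violation, n_steps, rng):
    """One simulated in-patient transfer: each protocol step may be
    violated. In this toy version the end-of-chain risk is realized
    only when every barrier step in the chain is violated."""
    return all(rng.random() < p_violation for _ in range(n_steps))

def end_of_chain_risk(p_violation, n_steps, n_runs=50000, seed=0):
    """Estimate the end-of-chain risk from repeated simulations."""
    rng = random.Random(seed)
    hits = sum(simulate_transfer(p_violation, n_steps, rng)
               for _ in range(n_runs))
    return hits / n_runs

risk = end_of_chain_risk(p_violation=0.6, n_steps=3)
```

    Repeated runs generate the distribution of process outcomes; in the calibrated model the variety of unique trajectories, not a single failure path, is what drives the reported risk figures.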

  2. MITK-based segmentation of co-registered MRI for subject-related regional anesthesia simulation

    NASA Astrophysics Data System (ADS)

    Teich, Christian; Liao, Wei; Ullrich, Sebastian; Kuhlen, Torsten; Ntouba, Alexandre; Rossaint, Rolf; Ullisch, Marcus; Deserno, Thomas M.

    2008-03-01

    With a steadily increasing indication, regional anesthesia is still trained directly on the patient. To develop a virtual reality (VR)-based simulation, a patient model is needed containing several tissues, which have to be extracted from individual magnetic resonance imaging (MRI) volume datasets. Due to the given modality and the different characteristics of the single tissues, an adequate segmentation can only be achieved by using a combination of segmentation algorithms. In this paper, we present a framework for creating an individual model from MRI scans of the patient. Our work is split into two parts. First, an easy-to-use and extensible tool for handling the segmentation task on arbitrary datasets is provided. The key idea is to let the user create a segmentation for the given subject by running different processing steps in a purposive order and store them in a segmentation script for reuse on new datasets. For data handling and visualization, we utilize the Medical Imaging Interaction Toolkit (MITK), which is based on the Visualization Toolkit (VTK) and the Insight Segmentation and Registration Toolkit (ITK). The second part is to find suitable segmentation algorithms, and corresponding parameters, for differentiating the tissues required by the regional anesthesia simulation. For this purpose, a fuzzy c-means clustering algorithm combined with mathematical morphology operators and a geometric active contour-based approach is chosen. The segmentation process itself aims at operating with minimal user interaction, and the resulting model fits the requirements of the simulation. First results are shown for both male and female MRI of the pelvis.

  3. Pixel-based meshfree modelling of skeletal muscles.

    PubMed

    Chen, Jiun-Shyan; Basava, Ramya Rao; Zhang, Yantao; Csapo, Robert; Malis, Vadim; Sinha, Usha; Hodgson, John; Sinha, Shantanu

    2016-01-01

    This paper introduces the meshfree Reproducing Kernel Particle Method (RKPM) for 3D image-based modeling of skeletal muscles. This approach allows for the construction of a simulation model based on pixel data obtained from medical images. The material properties and muscle fiber direction obtained from Diffusion Tensor Imaging (DTI) are input at each pixel point. The reproducing kernel (RK) approximation allows a representation of material heterogeneity with smooth transitions. A multiphase, multichannel level-set-based segmentation framework is adopted for individual muscle segmentation using Magnetic Resonance Images (MRI) and DTI. The application of the proposed methods to modeling the human lower leg is demonstrated.

  4. CHEMICAL EVOLUTION LIBRARY FOR GALAXY FORMATION SIMULATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saitoh, Takayuki R., E-mail: saitoh@elsi.jp

    We have developed a software library for chemical evolution simulations of galaxy formation under the simple stellar population (SSP) approximation. In this library, all of the necessary components concerning chemical evolution, such as initial mass functions, stellar lifetimes, yields from Type II and Type Ia supernovae, asymptotic giant branch stars, and neutron star mergers, are compiled from the literature. Various models are pre-implemented in this library so that users can choose their favorite combination of models. Subroutines of this library return released energy and masses of individual elements depending on a given event type. Since the redistribution manner of these quantities depends on the implementation of users' simulation codes, this library leaves it up to the simulation code. As demonstrations, we carry out both one-zone, closed-box simulations and 3D simulations of a collapsing gas and dark matter system using this library. In these simulations, we can easily compare the impact of individual models on the chemical evolution of galaxies, just by changing the control flags and parameters of the library. Since this library only deals with the part of chemical evolution under the SSP approximation, any simulation codes that use the SSP approximation—namely, particle-based and mesh codes, as well as semianalytical models—can use it. This library is named "CELib" after the term "Chemical Evolution Library" and is made available to the community.

  5. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    NASA Astrophysics Data System (ADS)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response planning. Large commercial shopping areas are typical service systems, and their emergency evacuation is an active research topic. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model has been proposed, and the methodology has been examined within the context of a case study involving evacuation within a commercial shopping mall. The event-driven model is adopted to simulate pedestrian movement patterns, and the simulation process is divided into a normal situation and emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. For the simulation of the movement routes of pedestrians, the model takes into account the purchase intentions of customers and the density of pedestrians. Based on the evacuation model of Cellular Automata with a Dynamic Floor Field and the event-driven model, we can reflect the behavioral characteristics of customers and clerks in normal and emergency-evacuation situations. The distribution of individual evacuation times as a function of initial position and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model combining Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of a shopping mall.
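    The core CA mechanism, a static floor field guiding pedestrians toward the exit, can be sketched as follows. The grid and the greedy update rule are a minimal illustration, not the paper's full dynamic-field, multi-layer model:

```python
from collections import deque

def static_floor_field(grid, exit_cell):
    """BFS distance-to-exit over free cells (0 = free, 1 = wall):
    the static floor field used by the CA update rule."""
    rows, cols = len(grid), len(grid[0])
    field = [[None] * cols for _ in range(rows)]
    field[exit_cell[0]][exit_cell[1]] = 0
    queue = deque([exit_cell])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and field[nr][nc] is None):
                field[nr][nc] = field[r][c] + 1
                queue.append((nr, nc))
    return field

def steps_to_exit(grid, start, exit_cell):
    """Greedy CA rule: each tick the pedestrian moves to the
    reachable neighbour with the lowest floor-field value."""
    field = static_floor_field(grid, exit_cell)
    pos, steps = start, 0
    while pos != exit_cell:
        r, c = pos
        neighbours = [(r + dr, c + dc)
                      for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if 0 <= r + dr < len(grid)
                      and 0 <= c + dc < len(grid[0])
                      and field[r + dr][c + dc] is not None]
        pos = min(neighbours, key=lambda p: field[p[0]][p[1]])
        steps += 1
    return steps

grid = [[1, 1, 1, 1, 1],
        [1, 0, 0, 0, 1],
        [1, 0, 1, 0, 1],
        [1, 0, 0, 0, 1],
        [1, 1, 1, 1, 1]]
```

    A dynamic floor field would add a second, pedestrian-deposited layer to this rule; the event-driven scheduling then decides when each agent's update fires.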

  6. Simulation of the effects of time and size at stocking on PCB accumulation in lake trout

    USGS Publications Warehouse

    Madenjian, Charles P.; Carpenter, Stephen R.

    1993-01-01

    Manipulations of size at stocking and timing of stocking have already been used to improve survival of stocked salmonines in the Great Lakes. It should be possible to stock salmonines into the Great Lakes in a way that reduces the rate of polychlorinated biphenyl (PCB) accumulation in these fishes. An individual-based model (IBM) was used to investigate the effects of size at stocking and timing of stocking on PCB accumulation by lake trout Salvelinus namaycush in Lake Michigan. The individual-based feature of the model allowed lake trout individuals to encounter prey fish individuals and then consume sufficiently small prey fish. The IBM accurately accounted for the variation in PCB concentrations observed within the Lake Michigan lake trout population. Results of the IBM simulations revealed that increasing the average size at stocking from 110 to 160 mm total length led to an increase in the average PCB concentration in the stocked cohort at age 5, after the fish had spent 4 years in the lake, from 2.33 to 2.65 mg/kg; the percentage of lake trout in the cohort at the end of the simulated time period with PCB concentration of 2 mg/kg or more increased from 62% to 79%. Thus, PCB contamination was reduced when the simulated size at stocking was smallest. An overall stocking strategy for lake trout into Lake Michigan should weigh this advantage regarding PCB contamination against the poor survival of lake trout that may occur if the trout are stocked at too small a size.
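    The bookkeeping at the heart of such an IBM is a per-individual mass and contaminant budget. A stripped-down sketch with invented rates (not the calibrated Lake Michigan values):

```python
import random

def simulate_fish(years, rng):
    """Track one lake trout's mass (kg) and PCB body burden (mg).
    All rates here are invented for illustration only."""
    mass, burden = 0.02, 0.0
    for _ in range(years):
        feeding = rng.uniform(0.8, 1.2)      # individual variation in intake
        consumption = 2.0 * mass * feeding   # kg of prey eaten this year
        burden += consumption * 0.25 * 0.8   # prey PCB (mg/kg) x assimilation
        mass = mass * 1.9 + 0.05             # simple annual growth
    return burden / mass                     # concentration, mg PCB / kg fish

# A cohort of 1000 individuals yields a distribution of concentrations,
# not a single value -- the point of tracking individuals.
rng = random.Random(42)
cohort = [simulate_fish(5, rng) for _ in range(1000)]
mean_conc = sum(cohort) / len(cohort)
```

    In the full model, consumption depends on encounters with individual prey fish of suitable size, which is how size at stocking ends up influencing the PCB concentration distribution at age 5.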

  7. A method for assigning species into groups based on generalized Mahalanobis distance between habitat model coefficients

    USGS Publications Warehouse

    Williams, C.J.; Heglund, P.J.

    2009-01-01

    Habitat association models are commonly developed for individual animal species using generalized linear modeling methods such as logistic regression. We considered the issue of grouping species based on their habitat use so that management decisions can be based on sets of species rather than individual species. This research was motivated by a study of western landbirds in northern Idaho forests. The method we examined was to fit models to each species separately and to use a generalized Mahalanobis distance between coefficient vectors to create a distance matrix among species. Clustering methods were used to group species from the distance matrix, and multidimensional scaling methods were used to visualize the relations among species groups. Methods were also discussed for evaluating the sensitivity of the conclusions to outliers or influential data points. We illustrate these methods with data from the landbird study conducted in northern Idaho. Simulation results are presented to compare the success of this method to alternative methods using Euclidean distance between coefficient vectors and to methods that do not use habitat association models. These simulations demonstrate that our Mahalanobis-distance-based method was nearly always better than Euclidean-distance-based methods or methods not based on habitat association models. The methods used to develop candidate species groups are easily explained to other scientists and resource managers since they mainly rely on classical multivariate statistical methods. © 2008 Springer Science+Business Media, LLC.
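    For two-parameter habitat models, the generalized Mahalanobis distance between coefficient vectors can be computed directly. The coefficients and the shared covariance below are hypothetical, purely to show the mechanics:

```python
def mahalanobis2(b1, b2, cov):
    """Generalized (squared) Mahalanobis distance between two 2-D
    habitat-model coefficient vectors under a shared 2x2 covariance."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = ((d / det, -b / det), (-c / det, a / det))
    dx = (b1[0] - b2[0], b1[1] - b2[1])
    return (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
            + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))

# Hypothetical logistic-regression coefficients (intercept, habitat slope)
coeffs = {"species_A": (0.5, 1.2),
          "species_B": (0.6, 1.1),
          "species_C": (-2.0, -0.8)}
cov = ((0.2, 0.05), (0.05, 0.3))   # pooled coefficient covariance (invented)

d_ab = mahalanobis2(coeffs["species_A"], coeffs["species_B"], cov)
d_ac = mahalanobis2(coeffs["species_A"], coeffs["species_C"], cov)
```

    Species A and B, whose models respond to habitat similarly, end up close in the distance matrix (d_ab < d_ac), so a subsequent clustering step would group them before C.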

  8. Does spatial variation in environmental conditions affect recruitment? A study using a 3-D model of Peruvian anchovy

    NASA Astrophysics Data System (ADS)

    Xu, Yi; Rose, Kenneth A.; Chai, Fei; Chavez, Francisco P.; Ayón, Patricia

    2015-11-01

    We used a 3-dimensional individual-based model (3-D IBM) of Peruvian anchovy to examine how spatial variation in environmental conditions affects larval and juvenile growth and survival, and recruitment. Temperature, velocity, and phytoplankton and zooplankton concentrations generated from a coupled hydrodynamic Nutrients-Phytoplankton-Zooplankton-Detritus (NPZD) model, mapped to a three-dimensional rectangular grid, were used to simulate anchovy populations. The IBM simulated individuals as they progressed from eggs to recruitment at 10 cm. Eggs and yolk-sac larvae were followed hourly through the processes of development, mortality, and movement (advection), and larvae and juveniles were followed daily through the processes of growth, mortality, and movement (advection plus behavior). A bioenergetics model was used to grow larvae and juveniles. The NPZD model provided prey fields, which influence both food consumption rate and behavior-mediated movement, with individuals moving to grid cells having optimal growth conditions. We compared predicted recruitment for monthly cohorts for 1990 through 2004 between the full 3-D IBM and a point (0-D) model that used spatially averaged environmental conditions. The 3-D and 0-D versions generated similar interannual patterns in monthly recruitment for 1991-2004, with the 3-D results yielding consistently higher survivorship. Both versions successfully captured the very poor recruitment during the 1997-1998 El Niño event. Higher recruitment in the 3-D simulations was due to higher survival during the larval stage resulting from individuals searching for more favorable temperatures that lead to faster growth rates. The strong effect of temperature arose because both model versions provided saturating food conditions for larval and juvenile anchovies. 
We conclude with a discussion of how explicit treatment of spatial variation affected simulated recruitment, other examples of fisheries modeling analyses that have used a similar approach to assess the influence of spatial variation, and areas for further model development.

  9. An improved null model for assessing the net effects of multiple stressors on communities.

    PubMed

    Thompson, Patrick L; MacLennan, Megan M; Vinebrooke, Rolf D

    2018-01-01

    Ecological stressors (i.e., environmental factors outside their normal range of variation) can mediate each other through their interactions, leading to unexpected combined effects on communities. Determining whether the net effect of stressors is ecologically surprising requires comparing their cumulative impact to a null model that represents the linear combination of their individual effects (i.e., an additive expectation). However, we show that standard additive and multiplicative null models that base their predictions on the effects of single stressors on community properties (e.g., species richness or biomass) do not provide this linear expectation, leading to incorrect interpretations of antagonistic and synergistic responses by communities. We present an alternative, the compositional null model, which instead bases its predictions on the effects of stressors on individual species, and then aggregates them to the community level. Simulations demonstrate the improved ability of the compositional null model to accurately provide a linear expectation of the net effect of stressors. We simulate the response of communities to paired stressors that affect species in a purely additive fashion and compare the relative abilities of the compositional null model and two standard community property null models (additive and multiplicative) to predict these linear changes in species richness and community biomass across different combinations (both positive, negative, or opposite) and intensities of stressors. The compositional model predicts the linear effects of multiple stressors under almost all scenarios, allowing for proper classification of net effects, whereas the standard null models do not. 
Our findings suggest that current estimates of the prevalence of ecological surprises on communities based on community property null models are unreliable, and should be improved by integrating the responses of individual species to the community level as does our compositional null model. © 2017 John Wiley & Sons Ltd.
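    The distinction between the two null models can be made concrete with a three-species toy community (all numbers invented): the compositional model aggregates per-species additive predictions, clamped at zero, while the community-property model applies additivity to the total directly:

```python
def compositional_null(control, stress_a, stress_b):
    """Additive expectation applied per species, then aggregated.
    Abundances are clamped at zero before summing to the community level."""
    predicted = [max(0.0, c + (a - c) + (b - c))
                 for c, a, b in zip(control, stress_a, stress_b)]
    biomass = sum(predicted)
    richness = sum(1 for p in predicted if p > 0)
    return biomass, richness

def community_property_null(control, stress_a, stress_b):
    """Standard additive null applied directly to total biomass."""
    return (sum(control) + (sum(stress_a) - sum(control))
            + (sum(stress_b) - sum(control)))

control  = [10.0, 10.0, 10.0]
stress_a = [ 2.0, 14.0, 10.0]   # stressor A: species 1 declines, 2 gains
stress_b = [ 1.0, 10.0,  4.0]   # stressor B: species 1 and 3 decline

biomass, richness = compositional_null(control, stress_a, stress_b)
```

    Here the community-property null predicts a total biomass of 11, but species 1 cannot decline below zero; aggregating the clamped species-level predictions gives 18 instead, so a community that actually reached 18 would be misclassified as synergistic under the standard null.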

  10. Multi-agent simulation of the von Thunen model formation mechanism

    NASA Astrophysics Data System (ADS)

    Tao, Haiyan; Li, Xia; Chen, Xiaoxiang; Deng, Chengbin

    2008-10-01

    This research attempts to explain the internal driving forces behind the formation of the concentric ring structure in urban geography by simulating the interaction between individual behavior and the market. Under the premises of a single city center, constant returns to scale, perfect competition, and enterprise migration theory, an R-D algorithm, in which agents search for the best behavior rules at given locations, is introduced using agent-based modeling techniques. The experiment runs the simulation on the Swarm platform, and its results reproduce the formation process of the Von Thünen ring structure. By introducing heterogeneous factors such as traffic roads, the research verifies several land-use models and discusses the self-adjusting function of the price mechanism.
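
    The ring-forming logic can be sketched with the textbook bid-rent form of the von Thünen model (illustrative crop parameters; the paper's R-D agent algorithm on Swarm is not reproduced here): each land use offers a rent that falls linearly with distance from the central market, and at each distance the highest positive bid wins, producing concentric rings:

```python
# Hypothetical land uses: (name, yield per km^2, price, cost, freight per km)
crops = [
    ("dairy", 100, 60, 20, 2.0),
    ("grain",  50, 60, 20, 0.5),
    ("ranch",  20, 40, 10, 0.1),
]

def bid_rent(crop, d):
    name, y, p, c, f = crop
    return y * (p - c) - y * f * d   # classic Thunen bid-rent at distance d

def land_use(d):
    best = max(crops, key=lambda cr: bid_rent(cr, d))
    return best[0] if bid_rent(best, d) > 0 else None  # None = beyond cultivation

rings = [land_use(d) for d in (1, 20, 100, 400)]
```

    With these parameters, high-value perishable production wins near the center, extensive uses win farther out, and land beyond the last break-even distance goes unused.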

  11. Effects of inbreeding on coastal Douglas fir growth and yield in operational plantations: a model-based approach.

    PubMed

    Wang, Tongli; Aitken, Sally N; Woods, Jack H; Polsson, Ken; Magnussen, Steen

    2004-04-01

    In advanced generation seed orchards, tradeoffs exist between genetic gain obtained by selecting the best related individuals for seed orchard populations, and potential losses due to subsequent inbreeding between these individuals. Although inbreeding depression for growth rate is strong in most forest tree species at the individual tree level, the effect of a small proportion of inbreds in seed lots on final stand yield may be less important. The effects of inbreeding on wood production of mature stands cannot be assessed empirically in the short term, thus such effects were simulated for coastal Douglas fir [Pseudotsuga menziesii var. menziesii (Mirb.) Franco] using the individual-tree growth and yield model TASS (Tree and Stand Simulator). The simulations were based on seed set, nursery culling rates, and 10-year-old field test performance for trees resulting from crosses between unrelated individuals and for inbred trees produced through mating between half-sibs, full-sibs, and parents and offspring, and through self-pollination. Results indicate that inclusion of a small proportion of related clones in seed orchards will have relatively low impacts on stand yields due to the low probability of related individuals mating, the lower probability of producing acceptable seedlings from related matings than from unrelated matings, and the greater probability of competition-induced mortality for slower growing inbred individuals than for outcrossed trees. Thus, competition reduces the losses expected due to inbreeding depression at harvest, particularly on better sites with higher planting densities and longer rotations. Slightly higher breeding values for related clones than unrelated clones would offset or exceed the effects of inbreeding resulting from related matings. Concerns regarding the maintenance of genetic diversity are more likely to limit inclusion of related clones in orchards than inbreeding depression for final stand yield.

  12. Personalized glucose-insulin model based on signal analysis.

    PubMed

    Goede, Simon L; de Galan, Bastiaan E; Leow, Melvin Khee Shing

    2017-04-21

    Plasma glucose measurements for diabetes patients are generally presented as a glucose concentration-time profile with 15-60 min time scale intervals. This limited resolution obscures detailed dynamic events of glucose appearance and metabolism. Measurement intervals of 15 min or more could contribute to imperfections in present diabetes treatment. High resolution data from mixed meal tolerance tests (MMTT) for 24 type 1 and type 2 diabetes patients were used in our present modeling. We introduce a model based on the physiological properties of transport, storage and utilization. This logistic approach follows the principles of electrical network analysis and signal processing theory. The method mimics the physiological equivalent of glucose homeostasis, comprising meal ingestion, absorption via the gastrointestinal tract (GIT), and the endocrine nexus between the liver, pancreatic alpha and beta cells. This model demystifies the metabolic 'black box' by enabling in silico simulations and fitting of individual responses to clinical data. MMTT data measured at five-minute intervals from diabetic subjects yield two independent model parameters that characterize the complete glucose system response at a personalized level. From the individual data measurements, we obtain a model which can be analyzed with a standard electrical network simulator for diagnostics and treatment optimization. The insulin dosing time scale can be accurately adjusted to match the individual requirements of characterized diabetic patients without the physical burden of treatment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Perceptually relevant parameters for virtual listening simulation of small room acoustics

    PubMed Central

    Zahorik, Pavel

    2009-01-01

    Various physical aspects of room-acoustic simulation techniques have been extensively studied and refined, yet the perceptual attributes of the simulations have received relatively little attention. Here a method of evaluating the perceptual similarity between rooms is described and tested using 15 small-room simulations based on binaural room impulse responses (BRIRs) either measured from a real room or estimated using simple geometrical acoustic modeling techniques. Room size and surface absorption properties were varied, along with aspects of the virtual simulation including the use of individualized head-related transfer function (HRTF) measurements for spatial rendering. Although differences between BRIRs were evident in a variety of physical parameters, a multidimensional scaling analysis revealed that when at-the-ear signal levels were held constant, the rooms differed along just two perceptual dimensions: one related to reverberation time (T60) and one related to interaural coherence (IACC). Modeled rooms were found to differ from measured rooms in this perceptual space, but the differences were relatively small and should be easily correctable through adjustment of T60 and IACC in the model outputs. Results further suggest that spatial rendering using individualized HRTFs offers little benefit over nonindividualized HRTF rendering for room simulation applications where source direction is fixed. PMID:19640043

  14. Uncertainty in simulating wheat yields under climate change

    NASA Astrophysics Data System (ADS)

    Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J. W.; Hatfield, J. L.; Ruane, A. C.; Boote, K. J.; Thorburn, P. J.; Rötter, R. P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P. K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, A. J.; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, R.; Heng, L.; Hooker, J.; Hunt, L. A.; Ingwersen, J.; Izaurralde, R. C.; Kersebaum, K. C.; Müller, C.; Naresh Kumar, S.; Nendel, C.; O'Leary, G.; Olesen, J. E.; Osborne, T. M.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M. A.; Shcherbak, I.; Steduto, P.; Stöckle, C.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J. W.; Williams, J. R.; Wolf, J.

    2013-09-01

    Projections of climate change impacts on crop yields are inherently uncertain. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models are difficult. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.
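
    The central uncertainty partition can be sketched as a simple variance decomposition over a crop-model-by-GCM ensemble grid (hypothetical impact numbers, not the study's data): average over one factor, then compare the spread of the resulting means:

```python
from statistics import mean, pvariance

# Hypothetical projected impacts (% yield change) for 4 crop models x 3 GCMs
impacts = {
    "cropA": {"gcm1": -8, "gcm2": -6, "gcm3": -7},
    "cropB": {"gcm1": -2, "gcm2": -1, "gcm3": -3},
    "cropC": {"gcm1": -12, "gcm2": -10, "gcm3": -11},
    "cropD": {"gcm1": -5, "gcm2": -3, "gcm3": -4},
}
gcms = list(next(iter(impacts.values())))

# Average over GCMs to isolate crop-model spread, and vice versa
model_means = [mean(row.values()) for row in impacts.values()]
gcm_means = [mean(impacts[m][g] for m in impacts) for g in gcms]

var_models = pvariance(model_means)   # uncertainty from crop model choice
var_gcms = pvariance(gcm_means)       # uncertainty from climate model choice
```

    In this toy grid the crop-model means spread far more than the GCM means, mirroring the paper's finding that crop-model structure dominates the projection uncertainty.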

  15. Effects of selective attention on continuous opinions and discrete decisions

    NASA Astrophysics Data System (ADS)

    Si, Xia-Meng; Liu, Yun; Xiong, Fei; Zhang, Yan-Chao; Ding, Fei; Cheng, Hui

    2010-09-01

    Selective attention describes individuals' preference for information according to their motivational involvement. Based on findings from social psychology, we propose an opinion interacting model to improve the modeling of individuals' interacting behaviors. Two parameters govern the probability of agents interacting with opponents: individual relevance and time-openness. It is found that large individual relevance and large time-openness advance the appearance of large clusters, whereas large individual relevance and small time-openness favor the lessening of extremism. We also apply the new model to identify factors leading to a successful product. Numerical simulations show that selective attention, especially individual relevance, should not be ignored by launching firms and information spreaders seeking the most successful promotion.
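
    A minimal sketch of such an interaction rule (illustrative functional form and parameter values; the paper's exact probability law is not reproduced here): the chance that an agent attends to a message decays with opinion distance, weighted by individual relevance, and with message age, weighted by time-openness:

```python
import math
import random

def interact(x_i, x_j, age, relevance, openness, mu=0.3):
    """Agent i moves toward opinion x_j only with a probability that
    decays with opinion distance (scaled by `relevance`) and with the
    age of the message (scaled by `openness`)."""
    p = math.exp(-relevance * abs(x_i - x_j) - age / openness)
    if random.random() < p:
        return x_i + mu * (x_j - x_i)   # partial adjustment toward x_j
    return x_i                          # message ignored

random.seed(1)
ops = [random.random() for _ in range(50)]   # initial opinions in [0, 1]
for t in range(2000):
    i, j = random.randrange(50), random.randrange(50)
    ops[i] = interact(ops[i], ops[j], age=random.randrange(5),
                      relevance=2.0, openness=5.0)
```

    Because each update is a convex combination, opinions stay in [0, 1]; sweeping `relevance` and `openness` would reproduce the cluster-size experiments described above.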

  16. Whole body counter calibration using Monte Carlo modeling with an array of phantom sizes based on national anthropometric reference data

    NASA Astrophysics Data System (ADS)

    Shypailo, R. J.; Ellis, K. J.

    2011-05-01

    During construction of the whole body counter (WBC) at the Children's Nutrition Research Center (CNRC), efficiency calibration was needed to translate acquired counts of 40K to actual grams of potassium for measurement of total body potassium (TBK) in a diverse subject population. The MCNP Monte Carlo n-particle simulation program was used to describe the WBC (54 detectors plus shielding), test individual detector counting response, and create a series of virtual anthropomorphic phantoms based on national reference anthropometric data. Each phantom included an outer layer of adipose tissue and an inner core of lean tissue. Phantoms were designed for both genders representing ages 3.5 to 18.5 years with body sizes from the 5th to the 95th percentile based on body weight. In addition, a spherical surface source surrounding the WBC was modeled in order to measure the effects of subject mass on room background interference. Individual detector measurements showed good agreement with the MCNP model. The background source model came close to agreement with empirical measurements, but showed a trend deviating from unity with increasing subject size. Results from the MCNP simulation of the CNRC WBC agreed well with empirical measurements using BOMAB phantoms. Individual detector efficiency corrections were used to improve the accuracy of the model. Nonlinear multiple regression efficiency calibration equations were derived for each gender. Room background correction is critical in improving the accuracy of the WBC calibration.

  17. Assessing the detail needed to capture rainfall-runoff dynamics with physics-based hydrologic response simulation

    USGS Publications Warehouse

    Mirus, B.B.; Ebel, B.A.; Heppner, C.S.; Loague, K.

    2011-01-01

    Concept development simulation with distributed, physics-based models provides a quantitative approach for investigating runoff generation processes across environmental conditions. Disparities within data sets employed to design and parameterize boundary value problems used in heuristic simulation inevitably introduce various levels of bias. The objective was to evaluate the impact of boundary value problem complexity on process representation for different runoff generation mechanisms. The comprehensive physics-based hydrologic response model InHM has been employed to generate base case simulations for four well-characterized catchments. The C3 and CB catchments are located within steep, forested environments dominated by subsurface stormflow; the TW and R5 catchments are located in gently sloping rangeland environments dominated by Dunne and Horton overland flows. Observational details are well captured within all four of the base case simulations, but the characterization of soil depth, permeability, rainfall intensity, and evapotranspiration differs for each. These differences are investigated through the conversion of each base case into a reduced case scenario, all sharing the same level of complexity. Evaluation of how individual boundary value problem characteristics impact simulated runoff generation processes is facilitated by quantitative analysis of integrated and distributed responses at high spatial and temporal resolution. Generally, the base case reduction causes moderate changes in discharge and runoff patterns, with the dominant process remaining unchanged. Moderate differences between the base and reduced cases highlight the importance of detailed field observations for parameterizing and evaluating physics-based models. 
Overall, similarities between the base and reduced cases indicate that the simpler boundary value problems may be useful for concept development simulation to investigate fundamental controls on the spectrum of runoff generation mechanisms. Copyright 2011 by the American Geophysical Union.

  18. A Social Diffusion Model with an Application on Election Simulation

    PubMed Central

    Wang, Fu-Min; Hung, San-Chuan; Kung, Perng-Hwa; Lin, Shou-De

    2014-01-01

    Issues about opinion diffusion have been studied for decades, yet there has so far been no empirical approach to modeling the flow and formation of crowd opinion in elections, for two reasons. First, unlike the spread of information or flu, individuals have intrinsic attitudes toward election candidates in advance. Second, opinions are generally assumed to be single values in most diffusion models; in this case, however, an opinion should represent preference toward multiple candidates. Previous models thus may not intuitively capture such a scenario. This work designs a diffusion model capable of managing the aforementioned scenario. To demonstrate the usefulness of our model, we simulate the diffusion on a network built from a publicly available bibliography dataset. We compare the proposed model with other well-known models such as independent cascade, and our model consistently outperforms them. We additionally investigate electoral issues with our model simulator. PMID:24995351
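
    The key modeling point, that an opinion is a preference vector over several candidates rather than a single value, can be sketched with an illustrative blending rule (not the authors' diffusion operator): a node mixes a neighbor's preference distribution into its own and renormalizes:

```python
def mix(pref_i, pref_j, w=0.2):
    """Blend neighbor j's preference vector over candidates into i's and
    renormalize so the opinion remains a probability distribution."""
    blended = [(1 - w) * a + w * b for a, b in zip(pref_i, pref_j)]
    total = sum(blended)
    return [v / total for v in blended]

# A committed supporter of candidate 0 meets a supporter of candidate 1
after = mix([1.0, 0.0, 0.0], [0.0, 1.0, 0.0])
```

    Repeating this along network edges diffuses multi-candidate preferences while preserving each node's intrinsic attitude as the starting vector.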

  19. Simulation of the National Aerospace System for Safety Analysis

    NASA Technical Reports Server (NTRS)

    Pritchett, Amy; Goldsman, Dave; Statler, Irv (Technical Monitor)

    2002-01-01

    Work started on this project on January 1, 1999, the first year of the grant. Following the outline of the grant proposal, a simulator architecture has been established which can incorporate the variety of types of models needed to accurately simulate national airspace dynamics. For the sake of efficiency, this architecture was based on an established single-aircraft flight simulator, the Reconfigurable Flight Simulator (RFS), already developed at Georgia Tech. Likewise, in the first year substantive changes and additions were made to the RFS to convert it into a simulation of the National Airspace System, with the flexibility to incorporate many types of models: aircraft models; controller models; airspace configuration generators; discrete event generators; embedded statistical functions; and display and data outputs. The architecture has been developed with the capability to accept any models of these types; due to its object-oriented structure, individual simulator components can be added and removed during run-time, and can be compiled separately. Simulation objects from other projects should be easy to convert to meet architecture requirements, with the intent both that this project can now incorporate established simulation components from other projects, and that other projects can easily use this simulation without significant time investment.

  20. Simulation of large-scale rule-based models

    PubMed Central

    Colvin, Joshua; Monine, Michael I.; Faeder, James R.; Hlavacek, William S.; Von Hoff, Daniel D.; Posner, Richard G.

    2009-01-01

    Motivation: Interactions of molecules, such as signaling proteins, with multiple binding sites and/or multiple sites of post-translational covalent modification can be modeled using reaction rules. Rules comprehensively, but implicitly, define the individual chemical species and reactions that molecular interactions can potentially generate. Although rules can be automatically processed to define a biochemical reaction network, the network implied by a set of rules is often too large to generate completely or to simulate using conventional procedures. To address this problem, we present DYNSTOC, a general-purpose tool for simulating rule-based models. Results: DYNSTOC implements a null-event algorithm for simulating chemical reactions in a homogeneous reaction compartment. The simulation method does not require that a reaction network be specified explicitly in advance, but rather takes advantage of the availability of the reaction rules in a rule-based specification of a network to determine if a randomly selected set of molecular components participates in a reaction during a time step. DYNSTOC reads reaction rules written in the BioNetGen language (BNGL), which is useful for modeling protein–protein interactions involved in signal transduction. The method of DYNSTOC is closely related to that of StochSim. DYNSTOC differs from StochSim by allowing for model specification in terms of BNGL, which extends the range of protein complexes that can be considered in a model. DYNSTOC enables the simulation of rule-based models that cannot be simulated by conventional methods. We demonstrate the ability of DYNSTOC to simulate models accounting for multisite phosphorylation and multivalent binding processes that are characterized by large numbers of reactions. Availability: DYNSTOC is free for non-commercial use. The C source code, supporting documentation and example input files are available at http://public.tgen.org/dynstoc/. 
Contact: dynstoc@tgen.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19213740
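
    The null-event idea can be sketched for a single rule A + B -> C (a toy sketch in the spirit of StochSim/DYNSTOC, not the tool's actual implementation): each fixed time step, randomly selected molecules either react or produce a null event, so no explicit reaction network is ever enumerated:

```python
import random

def null_event_sim(a0, b0, p_react, steps, seed=0):
    """Null-event simulation of the single rule A + B -> C.  Each fixed
    time step, two molecules are drawn at random; if they form an (A, B)
    pair they react with probability p_react, otherwise the step is a
    'null event' and only time advances."""
    random.seed(seed)
    pool = ["A"] * a0 + ["B"] * b0
    c = 0
    for _ in range(steps):
        if len(pool) < 2:
            break
        i, j = random.sample(range(len(pool)), 2)
        if {pool[i], pool[j]} == {"A", "B"} and random.random() < p_react:
            for k in sorted((i, j), reverse=True):   # remove the reactants
                pool.pop(k)
            c += 1                                   # count the product
    return pool.count("A"), pool.count("B"), c

a, b, c = null_event_sim(50, 50, 0.5, 5000)
```

    Mass is conserved by construction (every C consumed one A and one B); in the real tools the "does this pair react?" check is answered by matching the selected molecules against the rule set.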

  1. Basic Simulation Environment for Highly Customized Connected and Autonomous Vehicle Kinematic Scenarios.

    PubMed

    Chai, Linguo; Cai, Baigen; ShangGuan, Wei; Wang, Jian; Wang, Huashen

    2017-08-23

    To enhance the realism of Connected and Autonomous Vehicle (CAV) kinematic simulation scenarios and to guarantee the accuracy and reliability of the verification, a four-layer CAV kinematic simulation framework, composed of a road network layer, a vehicle operating layer, an uncertainties modelling layer and a demonstrating layer, is proposed in this paper. Properties of the intersections are defined to describe the road network. A target-position-based vehicle position updating method is designed to simulate vehicle behaviors such as lane changing and turning. Vehicle kinematic models are implemented to maintain the status of the vehicles as they move towards the target position. Priorities for individual vehicle control are authorized for different layers. Operation mechanisms of CAV uncertainties, defined here as position error and communication delay, are implemented in the simulation to enhance its realism. A simulation platform is developed based on the proposed methodology. Simulated and theoretical vehicle delays are compared to prove the validity and credibility of the platform. The scenario of rear-end collision avoidance is conducted to verify the uncertainty operating mechanisms, and a slot-based intersections (SIs) control strategy is realized and verified in the simulation platform to demonstrate the platform's support for CAV kinematic simulation and verification.
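
    A target-position-based updating rule of the kind described can be sketched as follows (illustrative speeds and waypoints; not the paper's code): each tick the vehicle moves at constant speed toward its current target and snaps onto it when within one step, so maneuvers like a lane change become a short sequence of targets:

```python
import math

def step_toward(pos, target, speed, dt):
    """Move a simulated vehicle toward its target position at constant
    speed, snapping exactly onto the target once it is within one step."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist <= speed * dt:
        return target
    return (pos[0] + dx / dist * speed * dt, pos[1] + dy / dist * speed * dt)

# Drive a vehicle through two waypoints of a hypothetical lane change
pos, route = (0.0, 0.0), [(30.0, 0.0), (60.0, 3.5)]
for target in route:
    while pos != target:
        pos = step_toward(pos, target, speed=10.0, dt=0.1)
```

    Position error and communication delay, the paper's two uncertainty sources, would enter such a loop as noise on `pos` and as stale `target` values respectively.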

  2. Basic Simulation Environment for Highly Customized Connected and Autonomous Vehicle Kinematic Scenarios

    PubMed Central

    Chai, Linguo; Cai, Baigen; ShangGuan, Wei; Wang, Jian; Wang, Huashen

    2017-01-01

    To enhance the realism of Connected and Autonomous Vehicle (CAV) kinematic simulation scenarios and to guarantee the accuracy and reliability of the verification, a four-layer CAV kinematic simulation framework, composed of a road network layer, a vehicle operating layer, an uncertainties modelling layer and a demonstrating layer, is proposed in this paper. Properties of the intersections are defined to describe the road network. A target-position-based vehicle position updating method is designed to simulate vehicle behaviors such as lane changing and turning. Vehicle kinematic models are implemented to maintain the status of the vehicles as they move towards the target position. Priorities for individual vehicle control are authorized for different layers. Operation mechanisms of CAV uncertainties, defined here as position error and communication delay, are implemented in the simulation to enhance its realism. A simulation platform is developed based on the proposed methodology. Simulated and theoretical vehicle delays are compared to prove the validity and credibility of the platform. The scenario of rear-end collision avoidance is conducted to verify the uncertainty operating mechanisms, and a slot-based intersections (SIs) control strategy is realized and verified in the simulation platform to demonstrate the platform's support for CAV kinematic simulation and verification. PMID:28832518

  3. Optimization Model for Web Based Multimodal Interactive Simulations.

    PubMed

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2015-07-15

    This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in unsatisfactory reduction in the quality of the simulation and user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize the performance of graphical rendering and simulation performance while satisfying application specific constraints. Our approach includes three distinct phases: identification, optimization, and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user specified design requirements in the optimization phase to ensure best possible computational resource allocation. The optimum solution is used for rendering (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach.
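
    With small discrete option sets, the optimization phase can be sketched as an exhaustive search stand-in for the paper's mixed integer program (hypothetical per-option frame costs and quality scores): maximize visual quality subject to a frame-time budget measured in the identification phase:

```python
from itertools import product

# Hypothetical options profiled on the client in the identification phase:
textures = {256: 2.0, 512: 5.0, 1024: 11.0}   # render cost per frame (ms)
canvases = {0.5: 3.0, 0.75: 6.0, 1.0: 10.0}   # resolution cost per frame (ms)
quality = {256: 1, 512: 2, 1024: 3}           # quality score per texture size

budget_ms = 16.0   # ~60 fps frame-time constraint

def best_settings():
    """Exhaustive-search stand-in for the mixed integer program:
    maximize (texture quality, canvas scale) within the time budget."""
    feasible = [(t, c) for t, c in product(textures, canvases)
                if textures[t] + canvases[c] <= budget_ms]
    return max(feasible, key=lambda tc: (quality[tc[0]], tc[1]))

choice = best_settings()
```

    Here the solver trades canvas resolution away to afford the largest texture that still fits the frame budget; a real MIP solver does the same search efficiently over much larger option sets.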

  4. Optimization Model for Web Based Multimodal Interactive Simulations

    PubMed Central

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2015-01-01

    This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in unsatisfactory reduction in the quality of the simulation and user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize the performance of graphical rendering and simulation performance while satisfying application specific constraints. Our approach includes three distinct phases: identification, optimization and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user specified design requirements in the optimization phase to ensure best possible computational resource allocation. The optimum solution is used for rendering (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach. PMID:26085713

  5. Combining Individual-Based Modeling and Food Microenvironment Descriptions To Predict the Growth of Listeria monocytogenes on Smear Soft Cheese

    PubMed Central

    Ferrier, Rachel; Hezard, Bernard; Lintz, Adrienne; Stahl, Valérie

    2013-01-01

    An individual-based modeling (IBM) approach was developed to describe the behavior of a few Listeria monocytogenes cells contaminating the smear soft cheese surface. The IBM approach consisted of assessing the stochastic individual behaviors of cells on cheese surfaces while knowing the characteristics of their surrounding microenvironments. We used a microelectrode for pH measurements and micro-osmolality measurements to assess the water activity of cheese microsamples. These measurements revealed a high variability of microscale pH compared to macroscale pH. A model describing the increase in pH from approximately 5.0 to more than 7.0 during ripening was developed. The spatial variability of the cheese surface, characterized by an increasing pH with radius and higher pH on crests than in hollows on the cheese rind, was also modeled. The microscale water activity ranged from approximately 0.96 to 0.98 and was stable during ripening. The spatial variability on cheese surfaces was low compared to between-cheese variability. Models describing the microscale variability of cheese characteristics were combined with the IBM approach to simulate the stochastic growth of L. monocytogenes on cheese, and these simulations were compared to bacterial counts obtained from irradiated cheeses artificially contaminated at different ripening stages. The simulated variability of L. monocytogenes counts with the IBM/microenvironmental approach was consistent with the observed variability. Contrasting situations corresponding to no growth or highly contaminated foods could be deduced from these models. Moreover, the IBM approach was more effective than the traditional population/macroenvironmental approach at describing the actual variability of bacterial behavior. PMID:23872572
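
    The stochastic individual-cell idea can be sketched as follows (hypothetical lag, growth-rate, and cardinal-pH values; not the authors' model): each contaminating cell draws its own lag time, then divides at a rate scaled down by an unfavorable microscale pH, so a rising surface pH during ripening sharply changes the outcome:

```python
import random

def grow_cells(n_cells, hours, ph, seed=0):
    """Each contaminating cell draws an individual lag time, then divides
    at a rate reduced by a gamma-style pH term (hypothetical values)."""
    random.seed(seed)
    mu_max = 0.3                                    # divisions/h at optimal pH
    gamma_ph = max(0.0, (ph - 4.4) / (7.0 - 4.4))   # 0 at pH 4.4, 1 at pH 7.0
    total = 0.0
    for _ in range(n_cells):
        lag = random.expovariate(1 / 3.0)           # mean 3 h individual lag
        t_grow = max(0.0, hours - lag)
        total += 2 ** (mu_max * gamma_ph * t_grow)  # cells from this founder
    return total

low = grow_cells(10, 24, ph=5.0)    # acidic microenvironment early in ripening
high = grow_cells(10, 24, ph=6.5)   # after the surface pH rises during ripening
```

    Re-running with many seeds would yield the between-replicate count variability that the IBM/microenvironmental approach is designed to capture.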

  6. Heterogeneous mechanics of the mouse pulmonary arterial network.

    PubMed

    Lee, Pilhwa; Carlson, Brian E; Chesler, Naomi; Olufsen, Mette S; Qureshi, M Umar; Smith, Nicolas P; Sochi, Taha; Beard, Daniel A

    2016-10-01

    Individualized modeling and simulation of blood flow mechanics find applications in both animal research and patient care. Individual animal or patient models for blood vessel mechanics are based on combining measured vascular geometry with a fluid-structure model coupling formulations that describe the dynamics of the fluid and the mechanics of the wall. For example, one-dimensional fluid flow modeling requires a constitutive law relating vessel cross-sectional deformation to pressure in the lumen. To investigate means of identifying appropriate constitutive relationships, an automated segmentation algorithm was applied to micro-computed tomography images from a mouse lung obtained at four different static pressures to identify the static pressure-radius relationship for four generations of vessels in the pulmonary arterial network. A shape-fitting function was parameterized for each vessel in the network to characterize the nonlinear and heterogeneous nature of vessel distensibility in the pulmonary arteries. These data on morphometric and mechanical properties were used to simulate pressure and flow velocity propagation in the network using one-dimensional representations of fluid and vessel wall mechanics. Moreover, wave intensity analysis was used to study effects of wall mechanics on generation and propagation of pressure wave reflections. Simulations were conducted to investigate the role of linear versus nonlinear formulations of wall elasticity and homogeneous versus heterogeneous treatments of vessel wall properties. Accounting for heterogeneity, by parameterizing the pressure/distention equation of state individually for each vessel segment, was found to have little effect on the predicted pressure profiles and wave propagation compared to a homogeneous parameterization based on average behavior. 
However, substantially different results were obtained using a linear elastic thin-shell model than were obtained using a nonlinear model that has a more physiologically realistic pressure versus radius relationship.

  7. Twelve tips for a successful interprofessional team-based high-fidelity simulation education session

    PubMed Central

    Bould, M. Dylan; Layat Burn, Carine; Reeves, Scott

    2014-01-01

    Simulation-based education allows experiential learning without risk to patients. Interprofessional education aims to provide opportunities to different professions for learning how to work effectively together. Interprofessional simulation-based education presents many challenges, including the logistics of setting up the session and providing effective feedback to participants with different backgrounds and mental models. This paper aims to provide educators with a series of practical and pedagogical tips for designing, implementing, assessing, and evaluating a successful interprofessional team-based simulation session. The paper is organized in the sequence that an educator might use in developing an interprofessional simulation-based education session. Collectively, this paper provides guidance from determining interprofessional learning objectives and curricular design to program evaluation. With a better understanding of the concepts and pedagogical methods underlying interprofessional education and simulation, educators will be able to create conditions for a unique educational experience where individuals learn with and from other specialties and professions in a controlled, safe environment. PMID:25023765

  8. Visual Predictive Check in Models with Time-Varying Input Function.

    PubMed

    Largajolli, Anna; Bertoldo, Alessandra; Campioni, Marco; Cobelli, Claudio

    2015-11-01

    Nonlinear mixed effects models are commonly used modeling techniques in pharmaceutical research, as they enable the characterization of individual profiles together with the population to which the individuals belong. To ensure their correct use, it is fundamental to provide powerful diagnostic tools able to evaluate the predictive performance of the models. The visual predictive check (VPC) is a commonly used tool that helps the user check by visual inspection whether the model is able to reproduce the variability and the main trend of the observed data. However, simulation from the model is not always trivial, for example when using models with a time-varying input function (IF). In this class of models, there is a potential mismatch between each set of simulated parameters and the associated individual IF, which can cause an incorrect profile simulation. We introduce a refinement of the VPC by taking into consideration a correlation term (the Mahalanobis or normalized Euclidean distance) that helps associate the correct IF with the individual set of simulated parameters. We investigate and compare its performance with the standard VPC in models of the glucose and insulin system applied to real and simulated data and in a simulated pharmacokinetic/pharmacodynamic (PK/PD) example. The newly proposed VPC performs better than the standard VPC, especially for models with large variability in the IF, where the probability of simulating incorrect profiles is higher.
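
    The proposed refinement, pairing each simulated parameter set with the closest individual input function by a normalized Euclidean distance, can be sketched as follows (hypothetical parameter values and IF summary features; not the authors' implementation):

```python
from statistics import stdev

# Hypothetical simulated parameter sets (clearance, volume) and candidate
# individual input functions summarized by the same two features
sim_params = [(2.1, 10.0), (0.9, 15.0), (1.5, 12.0)]
ifs = {"IF_a": (1.0, 14.5), "IF_b": (2.0, 10.5), "IF_c": (1.6, 12.2)}

cols = list(zip(*sim_params))
scales = [stdev(c) for c in cols]   # spread of each feature across subjects

def normalized_distance(u, v):
    return sum(((a - b) / s) ** 2 for a, b, s in zip(u, v, scales)) ** 0.5

def match_if(params):
    """Pair a simulated parameter set with its closest input function
    before simulating the corresponding VPC profile."""
    return min(ifs, key=lambda k: normalized_distance(params, ifs[k]))

pairs = {p: match_if(p) for p in sim_params}
```

    Dividing each feature by its between-subject spread keeps one dimension (e.g. volume, with larger raw units) from dominating the match, which is the point of using a normalized rather than plain Euclidean distance.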

  9. Strategies for efficient numerical implementation of hybrid multi-scale agent-based models to describe biological systems

    PubMed Central

    Cilfone, Nicholas A.; Kirschner, Denise E.; Linderman, Jennifer J.

    2015-01-01

    Biologically related processes operate across multiple spatiotemporal scales. For computational modeling methodologies to mimic this biological complexity, individual scale models must be linked in ways that allow for dynamic exchange of information across scales. A powerful methodology is to combine a discrete modeling approach, agent-based models (ABMs), with continuum models to form hybrid models. Hybrid multi-scale ABMs have been used to simulate emergent responses of biological systems. Here, we review two aspects of hybrid multi-scale ABMs: linking individual scale models and efficiently solving the resulting model. We discuss the computational choices associated with aspects of linking individual scale models while simultaneously maintaining model tractability. We demonstrate implementations of existing numerical methods in the context of hybrid multi-scale ABMs. Using an example model describing Mycobacterium tuberculosis infection, we show relative computational speeds of various combinations of numerical methods. Efficient linking and solution of hybrid multi-scale ABMs is key to model portability, modularity, and their use in understanding biological phenomena at a systems level. PMID:26366228
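
    The linking problem described here can be made concrete with an operator-splitting sketch: a continuum diffusion step on a grid, followed by discrete agent updates on the same grid. The function and parameters below are hypothetical illustrations, not the authors' Mycobacterium tuberculosis model:

```python
def hybrid_step(grid, agents, D, dt, dx, uptake):
    """One hybrid update: explicit finite-difference diffusion of a 1-D
    chemical field (reflecting boundaries), then each agent consumes
    from its grid cell. Operator splitting; parameters illustrative."""
    n = len(grid)
    new = grid[:]
    for i in range(n):
        left = grid[i - 1] if i > 0 else grid[i]
        right = grid[i + 1] if i < n - 1 else grid[i]
        new[i] = grid[i] + D * dt / dx ** 2 * (left - 2 * grid[i] + right)
    for pos in agents:  # agents are just cell indices in this sketch
        new[pos] = max(0.0, new[pos] - uptake * dt)
    return new
```

Stability of the explicit diffusion step requires D*dt/dx**2 <= 0.5, one of the numerical-method choices the review discusses.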

  10. Modelling hen harrier dynamics to inform human-wildlife conflict resolution: a spatially-realistic, individual-based approach.

    PubMed

    Heinonen, Johannes P M; Palmer, Stephen C F; Redpath, Steve M; Travis, Justin M J

    2014-01-01

    Individual-based models have gained popularity in ecology, and enable simultaneous incorporation of spatial explicitness and population dynamic processes to understand spatio-temporal patterns of populations. We introduce an individual-based model for understanding and predicting spatial hen harrier (Circus cyaneus) population dynamics in Great Britain. The model uses a landscape with habitat, prey and game management indices. The hen harrier population was initialised according to empirical census estimates for 1988/89 and simulated until 2030, and predictions for 1998, 2004 and 2010 were compared to empirical census estimates for respective years. The model produced a good qualitative match to overall trends between 1989 and 2010. Parameter explorations revealed relatively high elasticity in particular to demographic parameters such as juvenile male mortality. This highlights the need for robust parameter estimates from empirical research. There are clearly challenges for replication of real-world population trends, but this model provides a useful tool for increasing understanding of drivers of hen harrier dynamics and focusing research efforts in order to inform conflict management decisions.
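
    An individual-based demographic loop of the kind described can be sketched as below; the parameter values are placeholders, not the hen harrier model's estimates. Re-running with a perturbed juvenile mortality gives a crude elasticity check:

```python
import random

def simulate_population(n0, years, juv_mort, adult_mort, max_fledged, seed=42):
    """Minimal individual-based annual cycle: survival, ageing, breeding.
    Each element of `ages` is one bird; rates here are placeholders."""
    rng = random.Random(seed)
    ages = [2] * n0  # start from adults
    for _ in range(years):
        survivors = []
        for age in ages:
            mortality = juv_mort if age == 0 else adult_mort
            if rng.random() >= mortality:
                survivors.append(age + 1)
        chicks = 0
        for age in survivors:
            if age >= 2:  # breeding-age individuals
                chicks += rng.randint(0, max_fledged)
        ages = survivors + [0] * chicks
    return len(ages)
```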

  11. Modelling Hen Harrier Dynamics to Inform Human-Wildlife Conflict Resolution: A Spatially-Realistic, Individual-Based Approach

    PubMed Central

    Heinonen, Johannes P. M.; Palmer, Stephen C. F.; Redpath, Steve M.; Travis, Justin M. J.

    2014-01-01

    Individual-based models have gained popularity in ecology, and enable simultaneous incorporation of spatial explicitness and population dynamic processes to understand spatio-temporal patterns of populations. We introduce an individual-based model for understanding and predicting spatial hen harrier (Circus cyaneus) population dynamics in Great Britain. The model uses a landscape with habitat, prey and game management indices. The hen harrier population was initialised according to empirical census estimates for 1988/89 and simulated until 2030, and predictions for 1998, 2004 and 2010 were compared to empirical census estimates for respective years. The model produced a good qualitative match to overall trends between 1989 and 2010. Parameter explorations revealed relatively high elasticity in particular to demographic parameters such as juvenile male mortality. This highlights the need for robust parameter estimates from empirical research. There are clearly challenges for replication of real-world population trends, but this model provides a useful tool for increasing understanding of drivers of hen harrier dynamics and focusing research efforts in order to inform conflict management decisions. PMID:25405860

  12. Knowledge transmission model with differing initial transmission and retransmission process

    NASA Astrophysics Data System (ADS)

    Wang, Haiying; Wang, Jun; Small, Michael

    2018-10-01

    Knowledge transmission is a cyclic dynamic diffusion process. The rate of acceptance of knowledge differs depending on whether or not the recipient has previously held the knowledge. In this paper, the knowledge transmission process is divided into an initial and a retransmission procedure, each with its own transmission and self-learning parameters. Based on an epidemic spreading model, we propose a naive-evangelical-agnostic (VEA) knowledge transmission model and derive mean-field equations to describe the dynamics of knowledge transmission in homogeneous networks. Theoretical analysis identifies a criterion for the persistence of knowledge, i.e., the reproduction number R0 depends on the smaller of the effective parameters of the initial and retransmission processes. Moreover, the final size of evangelical individuals is related only to the retransmission parameters. Numerical simulations validate the theoretical analysis. Furthermore, the simulations indicate that increasing the initial transmission parameters, including the first transmission and self-learning rates of naive individuals, can accelerate the velocity of knowledge transmission efficiently but has no effect on the final size of evangelical individuals. In contrast, the retransmission parameters, including the retransmission and self-learning rates of agnostic individuals, have a significant effect on the rate of knowledge transmission: the larger these parameters, the greater the final density of evangelical individuals.
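
    The mean-field structure can be illustrated with a simple Euler integration of a three-compartment system. The equations below are SIR-style stand-ins with hypothetical rates, not the paper's exact model:

```python
def simulate_vea(beta1, beta2, mu, v0=0.99, e0=0.01, steps=10000, dt=0.01):
    """Euler integration of an illustrative three-compartment model:
    naive (v) -> evangelical (e) -> agnostic (a), with a retransmission
    term beta2 returning agnostics to the evangelical state."""
    v, e, a = v0, e0, 0.0
    for _ in range(steps):
        dv = -beta1 * v * e                      # naive individuals learn
        de = beta1 * v * e + beta2 * a * e - mu * e
        da = mu * e - beta2 * a * e              # forgetting vs. re-learning
        v, e, a = v + dv * dt, e + de * dt, a + da * dt
    return v, e, a

v, e, a = simulate_vea(0.5, 0.2, 0.1)  # fractions sum to 1 up to rounding
```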

  13. Modeling the frequency response of microwave radiometers with QUCS

    NASA Astrophysics Data System (ADS)

    Zonca, A.; Roucaries, B.; Williams, B.; Rubin, I.; D'Arcangelo, O.; Meinhold, P.; Lubin, P.; Franceschet, C.; Jahn, S.; Mennella, A.; Bersanelli, M.

    2010-12-01

    Characterization of the frequency response of coherent radiometric receivers is a key element in estimating the flux of astrophysical emissions, since the measured signal depends on the convolution of the source spectral emission with the instrument band shape. Laboratory Radio Frequency (RF) measurements of the instrument bandpass often require complex test setups and are subject to a number of systematic effects driven by thermal issues and impedance matching, particularly if cryogenic operation is involved. In this paper we present an approach to modeling radiometer bandpasses by integrating simulations and RF measurements of individual components. This method is based on QUCS (Quite Universal Circuit Simulator), an open-source circuit simulator, which gives the flexibility of choosing among the available devices, implementing new analytical software models or using measured S-parameters. An independent estimate of the instrument bandpass is therefore achieved using standard individual component measurements and validated analytical simulations. In order to automate the process of preparing input data, running simulations and exporting results we developed the Python package python-qucs and released it under the GNU General Public License. We discuss, as working cases, bandpass response modeling of the COFE and Planck Low Frequency Instrument (LFI) radiometers and compare results obtained with QUCS and with commercial circuit simulator software. The main purpose of bandpass modeling in COFE is to optimize component matching, while in LFI it provides the best estimate of the frequency response, since end-to-end measurements were strongly affected by systematic effects.
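
    As a simplified illustration of combining individual component measurements into a bandpass estimate, the per-frequency power gains of cascaded stages can be multiplied. This toy version ignores the inter-stage impedance-mismatch effects that S-parameter simulation in QUCS accounts for:

```python
def cascade_bandpass(component_gains):
    """Combine per-frequency power gains of cascaded components into an
    overall bandpass estimate (no inter-stage mismatch modeled)."""
    band = [1.0] * len(component_gains[0])
    for gains in component_gains:
        band = [b * g for b, g in zip(band, gains)]
    return band

# hypothetical 3-point responses of an amplifier and a filter
band = cascade_bandpass([[10.0, 12.0, 9.0], [0.5, 1.0, 0.4]])  # ≈ [5.0, 12.0, 3.6]
```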

  14. Applying GIS and high performance agent-based simulation for managing an Old World Screwworm fly invasion of Australia.

    PubMed

    Welch, M C; Kwan, P W; Sajeev, A S M

    2014-10-01

    Agent-based modelling has proven to be a promising approach for developing rich simulations for complex phenomena that provide decision support functions across a broad range of areas including biological, social and agricultural sciences. This paper demonstrates how high performance computing technologies, namely General-Purpose Computing on Graphics Processing Units (GPGPU), and commercial Geographic Information Systems (GIS) can be applied to develop a national scale, agent-based simulation of an incursion of Old World Screwworm fly (OWS fly) into the Australian mainland. The development of this simulation model leverages the combination of massively data-parallel processing capabilities supported by NVidia's Compute Unified Device Architecture (CUDA) and the advanced spatial visualisation capabilities of GIS. These technologies have enabled the implementation of an individual-based, stochastic lifecycle and dispersal algorithm for the OWS fly invasion. The simulation model draws upon a wide range of biological data as input to stochastically determine the reproduction and survival of the OWS fly through the different stages of its lifecycle and dispersal of gravid females. Through this model, a highly efficient computational platform has been developed for studying the effectiveness of control and mitigation strategies and their associated economic impact on livestock industries. Copyright © 2014 International Atomic Energy Agency 2014. Published by Elsevier B.V. All rights reserved.

  15. Landscape Level Carbon and Water Balances and Agricultural Production in Mountainous Terrain of the Haean Basin, South Korea

    NASA Astrophysics Data System (ADS)

    Lee, B.; Geyer, R.; Seo, B.; Lindner, S.; Walther, G.; Tenhunen, J. D.

    2009-12-01

    The process-based spatial simulation model PIXGRO was used to estimate gross primary production, ecosystem respiration, net ecosystem CO2 exchange and water use by forest and crop fields of Haean Basin, South Korea at landscape scale. Simulations are run for individual years from early spring to late fall, providing estimates for dry land crops and rice paddies with respect to carbon gain, biomass and leaf area development, allocation of photoproducts to the belowground ecosystem compartment, and harvest yields. In the case of deciduous oak forests, gas exchange is estimated, but spatial simulation of growth over the single annual cycles is not included. Spatial parameterization of the model is derived for forest LAI based on remote sensing, for forest and cropland fluxes via eddy covariance and chamber studies, for soil characteristics by generalization from spatial surveys, for climate drivers by generalizing observations at ca. 20 monitoring stations distributed throughout the basin and along the elevation gradient from 500 to 1000 m, and for incident radiation via modelling of the radiation components in complex terrain. Validation of the model is being carried out at point scale based on comparison of model output at selected locations with observations as well as with known trends in ecosystem response documented in the literature. The resulting modelling tool is useful for estimation of ecosystem services at landscape scale, first expressed as kg ha-1 crop yield, but via future cooperative studies also in terms of monetary gain to individual farms and farming cooperatives applying particular management strategies.

  16. Towards a complex systems approach in sports injury research: simulating running-related injury development with agent-based modelling.

    PubMed

    Hulme, Adam; Thompson, Jason; Nielsen, Rasmus Oestergaard; Read, Gemma J M; Salmon, Paul M

    2018-06-18

    There have been recent calls for the application of the complex systems approach in sports injury research. However, beyond theoretical description and static models of complexity, little progress has been made towards formalising this approach in a way that is practical for sports injury scientists and clinicians. Therefore, our objective was to use a computational modelling method and develop a dynamic simulation in sports injury research. Agent-based modelling (ABM) was used to model the occurrence of sports injury in a synthetic athlete population. The ABM was developed based on sports injury causal frameworks and was applied in the context of distance running-related injury (RRI). Using the acute:chronic workload ratio (ACWR), we simulated the dynamic relationship between changes in weekly running distance and RRI through the manipulation of various 'athlete management tools'. The findings confirmed that building weekly running distances over time, even within the reported ACWR 'sweet spot', will eventually result in RRI as athletes reach and surpass their individual physical workload limits. Introducing training-related error into the simulation and the modelling of a 'hard ceiling' dynamic resulted in a higher RRI incidence proportion across the population at higher absolute workloads. The presented simulation offers a practical starting point to further apply more sophisticated computational models that can account for the complex nature of sports injury aetiology. Alongside traditional forms of scientific inquiry, the use of ABM and other simulation-based techniques could be considered as a complementary and alternative methodological approach in sports injury research. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
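
    The acute:chronic workload ratio used in the simulation is commonly operationalised as the most recent week's load divided by the rolling four-week mean; a minimal sketch with illustrative distances:

```python
def acwr(weekly_loads):
    """Acute:chronic workload ratio: the latest week's load divided by
    the mean of the last four weeks (one common operationalisation)."""
    if len(weekly_loads) < 4:
        raise ValueError("need at least four weeks of data")
    acute = weekly_loads[-1]
    chronic = sum(weekly_loads[-4:]) / 4
    return acute / chronic

# steady training gives a ratio of 1.0; a sudden jump raises it
steady = acwr([10, 10, 10, 10])  # → 1.0
spike = acwr([30, 32, 34, 44])   # ≈ 1.26
```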

  17. Nonlinear Simulation of the Tooth Enamel Spectrum for EPR Dosimetry

    NASA Astrophysics Data System (ADS)

    Kirillov, V. A.; Dubovsky, S. V.

    2016-07-01

    Software was developed where initial EPR spectra of tooth enamel were deconvoluted based on nonlinear simulation, line shapes and signal amplitudes in the model initial spectrum were calculated, the regression coefficient was evaluated, and individual spectra were summed. Software validation demonstrated that doses calculated using it agreed excellently with the applied radiation doses and the doses reconstructed by the method of additive doses.

  18. Simulating Gas-Liquid-Water Partitioning and Fluid Properties of Petroleum under Pressure: Implications for Deep-Sea Blowouts.

    PubMed

    Gros, Jonas; Reddy, Christopher M; Nelson, Robert K; Socolofsky, Scott A; Arey, J Samuel

    2016-07-19

    With the expansion of offshore petroleum extraction, validated models are needed to simulate the behaviors of petroleum compounds released in deep (>100 m) waters. We present a thermodynamic model of the densities, viscosities, and gas-liquid-water partitioning of petroleum mixtures with varying pressure, temperature, and composition based on the Peng-Robinson equation of state and the modified Henry's law (Krichevsky-Kasarnovsky equation). The model is applied to Macondo reservoir fluid released during the Deepwater Horizon disaster, represented with 279-280 pseudocomponents, including 131-132 individual compounds. We define >n-C8 pseudocomponents based on comprehensive two-dimensional gas chromatography (GC × GC) measurements, which enable the modeling of aqueous partitioning for n-C8 to n-C26 fractions not quantified individually. Thermodynamic model predictions are tested against available laboratory data on petroleum liquid densities, gas/liquid volume fractions, and liquid viscosities. We find that the emitted petroleum mixture was ∼29-44% gas and ∼56-71% liquid, after cooling to local conditions near the broken Macondo riser stub (∼153 atm and 4.3 °C). High pressure conditions dramatically favor the aqueous dissolution of C1-C4 hydrocarbons and also influence the buoyancies of bubbles and droplets. Additionally, the simulated densities of emitted petroleum fluids affect previous estimates of the volumetric flow rate of dead oil from the emission source.

  19. Simulation-based power calculations for planning a two-stage individual participant data meta-analysis.

    PubMed

    Ensor, Joie; Burke, Danielle L; Snell, Kym I E; Hemming, Karla; Riley, Richard D

    2018-05-18

    Researchers and funders should consider the statistical power of planned Individual Participant Data (IPD) meta-analysis projects, as they are often time-consuming and costly. We propose simulation-based power calculations utilising a two-stage framework, and illustrate the approach for a planned IPD meta-analysis of randomised trials with continuous outcomes where the aim is to identify treatment-covariate interactions. The simulation approach has four steps: (i) specify an underlying (data generating) statistical model for trials in the IPD meta-analysis; (ii) use readily available information (e.g. from publications) and prior knowledge (e.g. number of studies promising IPD) to specify model parameter values (e.g. control group mean, intervention effect, treatment-covariate interaction); (iii) simulate an IPD meta-analysis dataset of a particular size from the model, and apply a two-stage IPD meta-analysis to obtain the summary estimate of interest (e.g. interaction effect) and its associated p-value; (iv) repeat the previous step (e.g. thousands of times), then estimate the power to detect a genuine effect by the proportion of summary estimates with a significant p-value. In a planned IPD meta-analysis of lifestyle interventions to reduce weight gain in pregnancy, 14 trials (1183 patients) promised their IPD to examine a treatment-BMI interaction (i.e. whether baseline BMI modifies intervention effect on weight gain). Using our simulation-based approach, a two-stage IPD meta-analysis has < 60% power to detect a reduction of 1 kg weight gain for a 10-unit increase in BMI. Additional IPD from ten other published trials (containing 1761 patients) would improve power to over 80%, but only if a fixed-effect meta-analysis was appropriate. Pre-specified adjustment for prognostic factors would increase power further. Incorrect dichotomisation of BMI would reduce power by over 20%, similar to immediately throwing away IPD from ten trials. 
Simulation-based power calculations could inform the planning and funding of IPD projects, and should be used routinely.
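
    Steps (i)-(iv) can be sketched for a simplified case in which each trial's interaction estimate is drawn directly with a known standard error and pooled with fixed-effect inverse-variance weights; the effect sizes and standard errors below are illustrative, not the lifestyle-intervention data:

```python
import math
import random

def power_two_stage(interaction, se_per_trial, n_sims=2000, alpha_z=1.96, seed=1):
    """Monte Carlo power for a two-stage IPD meta-analysis of a
    treatment-covariate interaction: simulate each trial's estimate
    (stage 1), pool with fixed-effect inverse-variance weights
    (stage 2), and count significant pooled results."""
    rng = random.Random(seed)
    weights = [1.0 / se ** 2 for se in se_per_trial]
    pooled_se = math.sqrt(1.0 / sum(weights))
    hits = 0
    for _ in range(n_sims):
        estimates = [rng.gauss(interaction, se) for se in se_per_trial]
        pooled = sum(w, e_w := 0) if False else \
            sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
        if abs(pooled / pooled_se) > alpha_z:
            hits += 1
    return hits / n_sims

# 14 hypothetical trials, each estimating the interaction with SE 0.5
power = power_two_stage(0.5, [0.5] * 14)
```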

  20. Methods for improving simulations of biological systems: systemic computation and fractal proteins

    PubMed Central

    Bentley, Peter J.

    2009-01-01

    Modelling and simulation are becoming essential for new fields such as synthetic biology. Perhaps the most important aspect of modelling is to follow a clear design methodology that will help to highlight unwanted deficiencies. The use of tools designed to aid the modelling process can be of benefit in many situations. In this paper, the modelling approach called systemic computation (SC) is introduced. SC is an interaction-based language, which enables individual-based expression and modelling of biological systems, and the interactions between them. SC permits a precise description of a hypothetical mechanism to be written using an intuitive graph-based or a calculus-based notation. The same description can then be directly run as a simulation, merging the hypothetical mechanism and the simulation into the same entity. However, even when using well-designed modelling tools to produce good models, the best model is not always the most accurate one. Frequently, computational constraints or lack of data make it infeasible to model an aspect of biology. Simplification may provide one way forward, but with inevitable consequences of decreased accuracy. Instead of attempting to replace an element with a simpler approximation, it is sometimes possible to substitute the element with a different but functionally similar component. In the second part of this paper, this modelling approach is described and its advantages are summarized using an exemplar: the fractal protein model. Finally, the paper ends with a discussion of good biological modelling practice by presenting lessons learned from the use of SC and the fractal protein model. PMID:19324681

  1. A 3D radiative transfer model based on lidar data and its application on hydrological and ecosystem modeling

    NASA Astrophysics Data System (ADS)

    Li, W.; Su, Y.; Harmon, T. C.; Guo, Q.

    2013-12-01

    Light Detection and Ranging (lidar) is an optical remote sensing technology that measures properties of scattered light to find range and/or other information of a distant object. Due to its ability to generate 3-dimensional data with high spatial resolution and accuracy, lidar technology is being increasingly used in ecology, geography, geology, geomorphology, seismology, remote sensing, and atmospheric physics. In this study we construct a 3-dimensional (3D) radiative transfer model (RTM) using lidar data to simulate the spatial distribution of solar radiation (direct and diffuse) on the surface of water and mountain forests. The model includes three sub-models: a light model simulating the light source, a sensor model simulating the camera, and a scene model simulating the landscape. We use ground-based and airborne lidar data to characterize the 3D structure of the study area, and generate a detailed 3D scene model. The interactions between light and object are simulated using the Monte Carlo Ray Tracing (MCRT) method. A large number of rays are generated from the light source. For each individual ray, the full traveling path is traced until it is absorbed or escapes from the scene boundary. By locating the sensor at different positions and directions, we can simulate the spatial distribution of solar energy at the ground, vegetation and water surfaces. These outputs can then be incorporated into meteorological drivers for hydrologic and energy balance models to improve our understanding of hydrologic processes and ecosystem functions.
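
    The Monte Carlo Ray Tracing idea, tracing each ray until it is absorbed or escapes the scene boundary, can be reduced to a toy sketch with lumped interaction probabilities (hypothetical stand-ins for the surface and medium properties a real scene model would supply):

```python
import random

def mcrt_absorbed_fraction(n_rays, absorb_prob, escape_prob, seed=7):
    """Toy MCRT estimator: each ray interacts repeatedly until it is
    absorbed or leaves the scene; scattering otherwise continues."""
    rng = random.Random(seed)
    absorbed = 0
    for _ in range(n_rays):
        while True:
            r = rng.random()
            if r < absorb_prob:
                absorbed += 1
                break
            if r < absorb_prob + escape_prob:
                break  # ray escapes the scene boundary
            # otherwise the ray scatters and its path continues
    return absorbed / n_rays

# equal absorption and escape rates → about half the rays are absorbed
frac = mcrt_absorbed_fraction(10000, 0.3, 0.3)
```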

  2. Multithreaded Stochastic PDES for Reactions and Diffusions in Neurons.

    PubMed

    Lin, Zhongwei; Tropper, Carl; Mcdougal, Robert A; Patoary, Mohammand Nazrul Ishlam; Lytton, William W; Yao, Yiping; Hines, Michael L

    2017-07-01

    Cells exhibit stochastic behavior when the number of molecules is small. Hence a stochastic reaction-diffusion simulator capable of working at scale can provide a more accurate view of molecular dynamics within the cell. This paper describes a parallel discrete event simulator, Neuron Time Warp-Multi Thread (NTW-MT), developed for the simulation of reaction diffusion models of neurons. To the best of our knowledge, this is the first parallel discrete event simulator oriented towards stochastic simulation of chemical reactions in a neuron. The simulator was developed as part of the NEURON project. NTW-MT is optimistic and thread-based, which attempts to capitalize on multi-core architectures used in high performance machines. It makes use of a multi-level queue for the pending event set and a single roll-back message in place of individual anti-messages to disperse contention and decrease the overhead of processing rollbacks. Global Virtual Time is computed asynchronously both within and among processes to get rid of the overhead for synchronizing threads. Memory usage is managed in order to avoid locking and unlocking when allocating and de-allocating memory and to maximize cache locality. We verified our simulator on a calcium buffer model. We examined its performance on a calcium wave model, comparing it to the performance of a process based optimistic simulator and a threaded simulator which uses a single priority queue for each thread. Our multi-threaded simulator is shown to achieve superior performance to these simulators. Finally, we demonstrated the scalability of our simulator on a larger CICR model and a more detailed CICR model.

  3. Simulations in Cyber-Security: A Review of Cognitive Modeling of Network Attackers, Defenders, and Users.

    PubMed

    Veksler, Vladislav D; Buchler, Norbou; Hoffman, Blaine E; Cassenti, Daniel N; Sample, Char; Sugrim, Shridat

    2018-01-01

    Computational models of cognitive processes may be employed in cyber-security tools, experiments, and simulations to address human agency and effective decision-making in keeping computational networks secure. Cognitive modeling can address multi-disciplinary cyber-security challenges requiring cross-cutting approaches over the human and computational sciences such as the following: (a) adversarial reasoning and behavioral game theory to predict attacker subjective utilities and decision likelihood distributions, (b) human factors of cyber tools to address human system integration challenges, estimation of defender cognitive states, and opportunities for automation, (c) dynamic simulations involving attacker, defender, and user models to enhance studies of cyber epidemiology and cyber hygiene, and (d) training effectiveness research and training scenarios to address human cyber-security performance, maturation of cyber-security skill sets, and effective decision-making. Models may be initially constructed at the group-level based on mean tendencies of each subject's subgroup, based on known statistics such as specific skill proficiencies, demographic characteristics, and cultural factors. For more precise and accurate predictions, cognitive models may be fine-tuned to each individual attacker, defender, or user profile, and updated over time (based on recorded behavior) via techniques such as model tracing and dynamic parameter fitting.

  4. Variance decomposition in stochastic simulators.

    PubMed

    Le Maître, O P; Knio, O M; Moraes, A

    2015-06-28

    This work aims at the development of a mathematical and computational approach that enables quantification of the inherent sources of stochasticity and of the corresponding sensitivities in stochastic simulations of chemical reaction networks. The approach is based on reformulating the system dynamics as being generated by independent standardized Poisson processes. This reformulation affords a straightforward identification of individual realizations for the stochastic dynamics of each reaction channel, and consequently a quantitative characterization of the inherent sources of stochasticity in the system. By relying on the Sobol-Hoeffding decomposition, the reformulation enables us to perform an orthogonal decomposition of the solution variance. Thus, by judiciously exploiting the inherent stochasticity of the system, one is able to quantify the variance-based sensitivities associated with individual reaction channels, as well as the importance of channel interactions. Implementation of the algorithms is illustrated in light of simulations of simplified systems, including the birth-death, Schlögl, and Michaelis-Menten models.
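
    The Poisson-process reformulation gives each reaction channel its own random stream, which Anderson's modified next reaction method makes concrete: fixing one channel's seed while varying the other isolates that channel's contribution to the output variance. A minimal birth-death sketch with illustrative rates and seeds:

```python
import math
import random

def birth_death_mnrm(t_end, c_birth, c_death, x0, seeds=(1, 2)):
    """Birth-death process via the modified next reaction method:
    each channel consumes its own independent unit-rate Poisson stream."""
    rngs = [random.Random(s) for s in seeds]
    next_fire = [-math.log(r.random()) for r in rngs]  # unit-Poisson firing times
    internal = [0.0, 0.0]                              # internal time per channel
    t, x = 0.0, x0
    while True:
        propensity = [c_birth, c_death * x]
        waits = [(next_fire[k] - internal[k]) / propensity[k]
                 if propensity[k] > 0 else math.inf for k in range(2)]
        k = 0 if waits[0] <= waits[1] else 1           # next channel to fire
        if t + waits[k] > t_end:
            return x
        t += waits[k]
        internal = [internal[j] + propensity[j] * waits[k] for j in range(2)]
        next_fire[k] += -math.log(rngs[k].random())    # advance channel k's stream
        x += 1 if k == 0 else -1
```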

  5. Variance decomposition in stochastic simulators

    NASA Astrophysics Data System (ADS)

    Le Maître, O. P.; Knio, O. M.; Moraes, A.

    2015-06-01

    This work aims at the development of a mathematical and computational approach that enables quantification of the inherent sources of stochasticity and of the corresponding sensitivities in stochastic simulations of chemical reaction networks. The approach is based on reformulating the system dynamics as being generated by independent standardized Poisson processes. This reformulation affords a straightforward identification of individual realizations for the stochastic dynamics of each reaction channel, and consequently a quantitative characterization of the inherent sources of stochasticity in the system. By relying on the Sobol-Hoeffding decomposition, the reformulation enables us to perform an orthogonal decomposition of the solution variance. Thus, by judiciously exploiting the inherent stochasticity of the system, one is able to quantify the variance-based sensitivities associated with individual reaction channels, as well as the importance of channel interactions. Implementation of the algorithms is illustrated in light of simulations of simplified systems, including the birth-death, Schlögl, and Michaelis-Menten models.

  6. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
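
    The modularization idea, computing module reliabilities separately and then combining them into a system figure, reduces for independent modules to the familiar series/parallel formulas; a minimal sketch (not RML itself):

```python
def series(reliabilities):
    """System works only if every module works (independent modules)."""
    r = 1.0
    for m in reliabilities:
        r *= m
    return r

def parallel(reliabilities):
    """System works if at least one redundant module works."""
    q = 1.0
    for m in reliabilities:
        q *= 1.0 - m
    return 1.0 - q

# hypothetical system: two redundant processors in series with a bus
system = series([parallel([0.95, 0.95]), 0.999])
```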

  7. Biome changes in Asia since the mid-Holocene - an analysis of different transient Earth system model simulations

    NASA Astrophysics Data System (ADS)

    Dallmeyer, Anne; Claussen, Martin; Ni, Jian; Cao, Xianyong; Wang, Yongbo; Fischer, Nils; Pfeiffer, Madlene; Jin, Liya; Khon, Vyacheslav; Wagner, Sebastian; Haberkorn, Kerstin; Herzschuh, Ulrike

    2017-02-01

    The large variety of atmospheric circulation systems affecting the eastern Asian climate is reflected by the complex Asian vegetation distribution. Particularly in the transition zones of these circulation systems, vegetation is supposed to be very sensitive to climate change. Since proxy records are scarce, hitherto a mechanistic understanding of the past spatio-temporal climate-vegetation relationship is lacking. To assess the Holocene vegetation change and to obtain an ensemble of potential mid-Holocene biome distributions for eastern Asia, we forced the diagnostic biome model BIOME4 with climate anomalies of different transient Holocene climate simulations performed in coupled atmosphere-ocean(-vegetation) models. The simulated biome changes are compared with pollen-based biome records for different key regions. In all simulations, substantial biome shifts during the last 6000 years are confined to the high northern latitudes and the monsoon-westerly wind transition zone, but the temporal evolution and amplitude of change strongly depend on the climate forcing. Large parts of the southern tundra are replaced by taiga during the mid-Holocene due to a warmer growing season, and the boreal treeline in northern Asia is shifted northward by approx. 4° in the ensemble mean, ranging from 1.5 to 6° in the individual simulations. This simulated treeline shift is in agreement with pollen-based reconstructions from northern Siberia. The desert fraction in the transition zone is reduced by 21 % during the mid-Holocene compared to pre-industrial due to enhanced precipitation. The desert-steppe margin is shifted westward by 5° (1-9° in the individual simulations). The forest biomes are expanded north-westward by 2°, ranging from 0 to 4° in the single simulations. These results corroborate pollen-based reconstructions indicating an extended forest area in north-central China during the mid-Holocene. 
According to the model, the forest-to-non-forest and steppe-to-desert changes in the climate transition zones are spatially not uniform and not linear since the mid-Holocene.
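The anomaly ("delta") forcing used above, adding simulated paleo-minus-preindustrial climate differences to a baseline climatology before driving the diagnostic biome model, can be sketched as follows; the function and variable names are illustrative, not part of BIOME4.

```python
def anomaly_force(obs_baseline, sim_paleo, sim_pi):
    """Add simulated climate anomalies (paleo minus pre-industrial)
    to an observed baseline climatology, grid cell by grid cell."""
    return {cell: obs_baseline[cell] + (sim_paleo[cell] - sim_pi[cell])
            for cell in obs_baseline}

# Toy example: mean annual precipitation (mm) at three grid cells.
obs = {"cell_a": 400.0, "cell_b": 150.0, "cell_c": 900.0}
paleo = {"cell_a": 430.0, "cell_b": 180.0, "cell_c": 890.0}
pi = {"cell_a": 410.0, "cell_b": 160.0, "cell_c": 900.0}

forced = anomaly_force(obs, paleo, pi)
```

The delta approach removes much of a climate model's mean bias, since only the simulated change, not the absolute simulated state, enters the biome model.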

  8. Coarse-graining to the meso and continuum scales with molecular-dynamics-like models

    NASA Astrophysics Data System (ADS)

    Plimpton, Steve

    Many engineering-scale problems that industry or the national labs try to address with particle-based simulations occur at length and time scales well beyond the most optimistic hopes of traditional coarse-graining methods for molecular dynamics (MD), which typically start at the atomic scale and build upward. However classical MD can be viewed as an engine for simulating particles at literally any length or time scale, depending on the models used for individual particles and their interactions. To illustrate I'll highlight several coarse-grained (CG) materials models, some of which are likely familiar to molecular-scale modelers, but others probably not. These include models for water droplet freezing on surfaces, dissipative particle dynamics (DPD) models of explosives where particles have internal state, CG models of nano or colloidal particles in solution, models for aspherical particles, Peridynamics models for fracture, and models of granular materials at the scale of industrial processing. All of these can be implemented as MD-style models for either soft or hard materials; in fact they are all part of our LAMMPS MD package, added either by our group or contributed by collaborators. Unlike most all-atom MD simulations, CG simulations at these scales often involve highly non-uniform particle densities. So I'll also discuss a load-balancing method we've implemented for these kinds of models, which can improve parallel efficiencies. From the physics point-of-view, these models may be viewed as non-traditional or ad hoc. But because they are MD-style simulations, there's an opportunity for physicists to add statistical mechanics rigor to individual models. Or, in keeping with a theme of this session, to devise methods that more accurately bridge models from one scale to the next.
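The load-balancing problem mentioned for non-uniform particle densities is commonly attacked with recursive coordinate bisection (RCB), one of the balancing strategies LAMMPS offers. A minimal sketch, assuming a power-of-two partition count and point particles:

```python
def rcb_partition(points, n_parts):
    """Recursively bisect a set of points into n_parts groups of
    (near-)equal size, splitting at the median of the axis with
    the largest spread.  n_parts must be a power of two here."""
    if n_parts == 1:
        return [points]
    dims = len(points[0])
    # pick the axis with the widest extent
    axis = max(range(dims),
               key=lambda d: max(p[d] for p in points) - min(p[d] for p in points))
    pts = sorted(points, key=lambda p: p[axis])
    mid = len(pts) // 2
    half = n_parts // 2
    return (rcb_partition(pts[:mid], half) +
            rcb_partition(pts[mid:], half))

# Clustered (non-uniform) 2D particles: a dense blob plus a sparse tail.
parts = rcb_partition([(x * 0.01, 0.0) for x in range(96)] +
                      [(float(x), 1.0) for x in range(32)], 4)
sizes = [len(g) for g in parts]
```

Each group gets the same particle count regardless of how uneven the spatial density is, which is why this family of methods helps parallel efficiency for the CG models described above.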

  9. Simulating competitive egress of noncircular pedestrians.

    PubMed

    Hidalgo, R C; Parisi, D R; Zuriguel, I

    2017-04-01

    We present a numerical framework to simulate pedestrian dynamics in highly competitive conditions by means of a force-based model implemented with spherocylindrical particles instead of the traditional, symmetric disks. This modification of the individuals' shape allows one to naturally reproduce recent experimental findings of room evacuations through narrow doors in situations where the contact pressure among the pedestrians was rather large. In particular, we obtain a power-law tail distribution of the time lapses between the passage of consecutive individuals. In addition, we show that this improvement leads to new features in which the particles' rotation acquires great significance.

  10. A Variable-Instar Climate-Driven Individual Beetle-Based Phenology Model for the Invasive Asian Longhorned Beetle (Coleoptera: Cerambycidae).

    PubMed

    Trotter, R Talbot; Keena, Melody A

    2016-12-01

    Efforts to manage and eradicate invasive species can benefit from an improved understanding of the physiology, biology, and behavior of the target species, and ongoing efforts to eradicate the Asian longhorned beetle (Anoplophora glabripennis Motschulsky) highlight the roles this information may play. Here, we present a climate-driven phenology model for A. glabripennis that provides simulated life-tables for populations of individual beetles under variable climatic conditions that takes into account the variable number of instars beetles may undergo as larvae. Phenology parameters in the model are based on a synthesis of published data and studies of A. glabripennis, and the model output was evaluated using a laboratory-reared population maintained under varying temperatures mimicking those typical of Central Park in New York City. The model was stable under variations in population size, simulation length, and the Julian dates used to initiate individual beetles within the population. Comparison of model results with previously published field-based phenology studies in native and invasive populations indicates both this new phenology model, and the previously published heating-degree-day model show good agreement in the prediction of the beginning of the flight season for adults. However, the phenology model described here avoids underpredicting the cumulative emergence of adults through the season, in addition to providing tables of life stages and estimations of voltinism for local populations. This information can play a key role in evaluating risk by predicting the potential for population growth, and may facilitate the optimization of management and eradication efforts. Published by Oxford University Press on behalf of Entomological Society of America 2016. This work is written by US Government employees and is in the public domain in the US.
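The heating-degree-day approach that the new phenology model is compared against predicts an event by accumulating daily temperature excess over a developmental threshold; this sketch uses invented threshold and target values, not the published A. glabripennis parameters.

```python
def degree_day_event(daily_mean_temps, base_temp, target_dd):
    """Return the 1-based day index on which accumulated degree-days
    first reach target_dd, or None if the target is never reached."""
    accumulated = 0.0
    for day, temp in enumerate(daily_mean_temps, start=1):
        accumulated += max(0.0, temp - base_temp)  # no development below base
        if accumulated >= target_dd:
            return day
    return None

# Toy spring warm-up: 5 C developmental threshold, 30 degree-day target.
temps = [4.0, 6.0, 8.0, 12.0, 15.0, 18.0, 20.0]
flight_day = degree_day_event(temps, base_temp=5.0, target_dd=30.0)
```

A single accumulation threshold of this kind can time the start of adult flight, but, as the abstract notes, it cannot by itself produce life tables or cumulative emergence through the season.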

  11. Simulation-based Bayesian inference for latent traits of item response models: Introduction to the ltbayes package for R.

    PubMed

    Johnson, Timothy R; Kuhn, Kristine M

    2015-12-01

    This paper introduces the ltbayes package for R. This package includes a suite of functions for investigating the posterior distribution of latent traits of item response models. These include functions for simulating realizations from the posterior distribution, profiling the posterior density or likelihood function, calculation of posterior modes or means, Fisher information functions and observed information, and profile likelihood confidence intervals. Inferences can be based on individual response patterns or sets of response patterns such as sum scores. Functions are included for several common binary and polytomous item response models, but the package can also be used with user-specified models. This paper introduces some background and motivation for the package, and includes several detailed examples of its use.
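ltbayes itself is an R package; as a language-neutral illustration of the underlying computation, here is a Python sketch of random-walk Metropolis sampling from the posterior of a latent trait under a Rasch (1PL) model with a standard-normal prior. Item difficulties, step size, and chain length are arbitrary choices, not ltbayes defaults.

```python
import math
import random

def rasch_loglik(theta, difficulties, responses):
    """Log-likelihood of a binary response pattern under the Rasch model."""
    ll = 0.0
    for b, y in zip(difficulties, responses):
        p = 1.0 / (1.0 + math.exp(-(theta - b)))
        ll += math.log(p) if y == 1 else math.log(1.0 - p)
    return ll

def log_posterior(theta, difficulties, responses):
    # standard-normal prior on the latent trait
    return rasch_loglik(theta, difficulties, responses) - 0.5 * theta * theta

def metropolis_theta(difficulties, responses, n_iter=5000, step=0.8, seed=1):
    """Random-walk Metropolis chain for the latent trait theta."""
    rng = random.Random(seed)
    theta = 0.0
    lp = log_posterior(theta, difficulties, responses)
    draws = []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0.0, step)
        lp_prop = log_posterior(prop, difficulties, responses)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            theta, lp = prop, lp_prop
        draws.append(theta)
    return draws

draws = metropolis_theta([-1.0, 0.0, 1.0], [1, 1, 0])
posterior_mean = sum(draws[1000:]) / len(draws[1000:])  # discard burn-in
```

The same machinery extends to inference from sum scores or polytomous models by swapping the likelihood function, which is essentially the flexibility the package advertises for user-specified models.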

  12. The Hydrology of Malaria: Model Development and Application to a Sahelian Village

    NASA Astrophysics Data System (ADS)

    Bomblies, A.; Duchemin, J.; Eltahir, E. A.

    2008-12-01

    We present a coupled hydrology and entomology model for the mechanistic simulation of the local-scale response of malaria transmission to hydrological and climatological determinants in semi-arid, desert fringe environments. The model is applied to the Sahel village of Banizoumbou, Niger, to predict interannual variability in malaria vector mosquito populations, which leads to variations in malaria transmission. Using a high-resolution, small-scale distributed hydrology model that incorporates remotely-sensed data for land cover and topography, we simulate the formation and persistence of the pools constituting the primary breeding habitat of Anopheles gambiae s.l. mosquitoes, the principal regional malaria vectors. An agent-based mosquito population model with aquatic-stage and adult-stage components is coupled to the distributed hydrology model. For each individual adult mosquito, the model tracks attributes relevant to population dynamics and malaria transmission, which are updated as mosquitoes interact with their environment, humans, and animals. Weekly field observations were made in 2005 and 2006. The model reproduces mosquito population variability at seasonal and interannual time scales, and highlights individual pool persistence as a dominant control. Future developments of the presented model can be used in the evaluation of impacts of climate change on malaria, as well as the a priori evaluation of environmental management-based interventions.
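The hydrology-entomology coupling can be caricatured in a few lines: individual mosquito agents emerge only while a pool persists, and every adult then faces a daily survival draw. All rates and rules here are invented for illustration and are far simpler than the Banizoumbou model.

```python
import random

class Mosquito:
    """One adult agent; real models track many more attributes
    (parity, infection status, location, blood-meal history)."""
    def __init__(self, day_emerged):
        self.day_emerged = day_emerged
        self.alive = True

def simulate(pool_wet_days, n_days=30, daily_survival=0.85,
             emergence_per_wet_day=5, seed=42):
    """Daily time-stepping: new adults emerge only while the pool
    persists; every living adult faces a daily survival draw."""
    rng = random.Random(seed)
    adults = []
    abundance = []
    for day in range(n_days):
        if day in pool_wet_days:  # habitat available -> emergence
            adults.extend(Mosquito(day) for _ in range(emergence_per_wet_day))
        for m in adults:
            if m.alive and rng.random() > daily_survival:
                m.alive = False
        abundance.append(sum(m.alive for m in adults))
    return abundance

# A pool that persists 10 days vs. one that dries after 3.
long_pool = simulate(set(range(10)))
short_pool = simulate(set(range(3)))
```

Even this caricature reproduces the abstract's headline result qualitatively: pool persistence, not rainfall per se, controls adult abundance.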

  13. Ancestral haplotype-based association mapping with generalized linear mixed models accounting for stratification.

    PubMed

    Zhang, Z; Guillaume, F; Sartelet, A; Charlier, C; Georges, M; Farnir, F; Druet, T

    2012-10-01

    In many situations, genome-wide association studies are performed in populations presenting stratification. Mixed models including a kinship matrix accounting for genetic relatedness among individuals have been shown to correct for population and/or family structure. Here we extend this methodology to generalized linear mixed models, which properly model data under various distributions. In addition, we perform association with ancestral haplotypes inferred using a hidden Markov model. The method was shown to properly account for stratification under various simulated scenarios presenting population and/or family structure. Use of ancestral haplotypes resulted in higher power than SNPs on simulated datasets. Application to real data demonstrates the usefulness of the developed model. Full analysis of a dataset with 4600 individuals and 500 000 SNPs was performed in 2 h 36 min and required 2.28 Gb of RAM. The software GLASCOW can be freely downloaded from www.giga.ulg.ac.be/jcms/prod_381171/software. francois.guillaume@jouy.inra.fr Supplementary data are available at Bioinformatics online.

  14. Intervertebral disc biomechanical analysis using the finite element modeling based on medical images.

    PubMed

    Li, Haiyun; Wang, Zheng

    2006-01-01

    In this paper, a 3D geometric model of the lumbar intervertebral discs is presented, which integrates anatomical structure from spine CT and MRI data. Based on the geometric model, a 3D finite element model of an L1-L2 segment was created. Loads simulating the pressure from above were applied to the FEM, while a boundary condition describing the relative L1-L2 displacement was imposed on the FEM to account for 3D physiological states. The simulation illustrates the stress and strain distribution and deformation of the spine. The method has two characteristics compared to previous studies: first, the finite element model of the lumbar spine is based on data directly derived from medical images such as CT and MRI. Second, the result of the analysis is more accurate than one based on generic geometric parameters. The FEM provides a promising tool for clinical diagnosis and for optimizing individual therapy in intervertebral disc herniation.

  15. The Atlas of Physiology and Pathophysiology: Web-based multimedia enabled interactive simulations.

    PubMed

    Kofranek, Jiri; Matousek, Stanislav; Rusz, Jan; Stodulka, Petr; Privitzer, Pavol; Matejak, Marek; Tribula, Martin

    2011-11-01

    This paper presents the current state of development of the Atlas of Physiology and Pathophysiology (Atlas). Our main aim is to provide a novel interactive multimedia application that can be used for biomedical education, where (a) simulations are combined with tutorials and (b) the presentation layer is simplified while the underlying complexity of the model is retained. The development of the Atlas required the cooperation of many professionals, including teachers, system analysts, artists, and programmers. During the design of the Atlas, tools were developed that allow for component-based creation of simulation models, creation of interactive multimedia, and their final coordination into a compact unit based on the given design. The Atlas is a freely available online application, which can help to explain the function of individual physiological systems and the causes and symptoms of their disorders. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  16. Effective Network Size Predicted From Simulations of Pathogen Outbreaks Through Social Networks Provides a Novel Measure of Structure-Standardized Group Size.

    PubMed

    McCabe, Collin M; Nunn, Charles L

    2018-01-01

    The transmission of infectious disease through a population is often modeled assuming that interactions occur randomly in groups, with all individuals potentially interacting with all other individuals at an equal rate. However, it is well known that pairs of individuals vary in their degree of contact. Here, we propose a measure to account for such heterogeneity: effective network size (ENS), which refers to the size of a maximally complete network (i.e., unstructured, where all individuals interact with all others equally) that corresponds to the outbreak characteristics of a given heterogeneous, structured network. We simulated susceptible-infected (SI) and susceptible-infected-recovered (SIR) models on maximally complete networks to produce idealized outbreak duration distributions for a disease on a network of a given size. We also simulated the transmission of these same diseases on random structured networks and then used the resulting outbreak duration distributions to predict the ENS for the group or population. We provide the methods to reproduce these analyses in a public R package, "enss." Outbreak durations of simulations on randomly structured networks were more variable than those on complete networks, but tended to have similar mean durations of disease spread. We then applied our novel metric to empirical primate networks taken from the literature and compared the information represented by our ENSs to that by other established social network metrics. In AICc model comparison frameworks, group size and mean distance proved to be the metrics most consistently associated with ENS for SI simulations, while group size, centralization, and modularity were most consistently associated with ENS for SIR simulations. In all cases, ENS was shown to be associated with at least two other independent metrics, supporting its use as a novel metric. 
Overall, our study provides a proof of concept for simulation-based approaches toward constructing metrics of ENS, while also revealing the conditions under which this approach is most promising.
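The enss package is in R; the core idea, matching outbreak statistics on a structured network against complete networks of varying size, can be sketched in Python as below. This toy version matches only the mean SI outbreak duration (the study works with full duration distributions and SIR as well), and all parameters are illustrative.

```python
import random

def si_duration(adjacency, p_transmit, rng):
    """Discrete-time SI outbreak from a random seed node; returns the
    number of time steps until every node is infected."""
    nodes = list(adjacency)
    infected = {rng.choice(nodes)}
    steps = 0
    while len(infected) < len(nodes):
        steps += 1
        infected |= {v for u in infected for v in adjacency[u]
                     if v not in infected and rng.random() < p_transmit}
    return steps

def complete(n):
    """Maximally complete (unstructured) network of size n."""
    return {i: [j for j in range(n) if j != i] for i in range(n)}

def mean_duration(adjacency, p, trials, seed):
    rng = random.Random(seed)
    return sum(si_duration(adjacency, p, rng) for _ in range(trials)) / trials

def effective_network_size(adjacency, p=0.3, trials=100, max_n=20, seed=0):
    """Complete-network size whose mean outbreak duration best matches
    the structured network's mean outbreak duration."""
    target = mean_duration(adjacency, p, trials, seed)
    return min(range(2, max_n + 1),
               key=lambda n: abs(mean_duration(complete(n), p, trials, seed + n)
                                 - target))

# A sparse structured network: a ring of 12 nodes.
ring = {i: [(i - 1) % 12, (i + 1) % 12] for i in range(12)}
ens = effective_network_size(ring)
```

Spread on a sparse structured network such as a ring is slower than on a complete network of the same size, so the matching step generally returns an ENS different from the raw group size; this is exactly the structure-standardization the metric is meant to provide.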

  17. Simulation-based assessment in anesthesiology: requirements for practical implementation.

    PubMed

    Boulet, John R; Murray, David J

    2010-04-01

    Simulations have taken a central role in the education and assessment of medical students, residents, and practicing physicians. The introduction of simulation-based assessments in anesthesiology, especially those used to establish various competencies, has demanded fairly rigorous studies concerning the psychometric properties of the scores. Most important, major efforts have been directed at identifying, and addressing, potential threats to the validity of simulation-based assessment scores. As a result, organizations that wish to incorporate simulation-based assessments into their evaluation practices can access information regarding effective test development practices, the selection of appropriate metrics, the minimization of measurement errors, and test score validation processes. The purpose of this article is to provide a broad overview of the use of simulation for measuring physician skills and competencies. For simulations used in anesthesiology, studies that describe advances in scenario development, the development of scoring rubrics, and the validation of assessment results are synthesized. Based on the summary of relevant research, psychometric requirements for practical implementation of simulation-based assessments in anesthesiology are forwarded. As technology expands, and simulation-based education and evaluation takes on a larger role in patient safety initiatives, the groundbreaking work conducted to date can serve as a model for those individuals and organizations that are responsible for developing, scoring, or validating simulation-based education and assessment programs in anesthesiology.

  18. Numerical modeling of transverse mode competition in strongly pumped multimode fiber lasers and amplifiers.

    PubMed

    Gong, Mali; Yuan, Yanyang; Li, Chen; Yan, Ping; Zhang, Haitao; Liao, Suying

    2007-03-19

    A model based on propagation-rate equations, taking the transverse gain distribution into account, is developed to describe transverse mode competition in strongly pumped multimode fiber lasers and amplifiers. An approximate, practical numerical algorithm based on a multilayer method is presented. Using the model and the numerical algorithm, the behavior of multi-transverse-mode competition is demonstrated, and the output power distributions of individual transverse modes are simulated numerically for both fiber lasers and amplifiers under various conditions.

  19. Generation and use of human 3D-CAD models

    NASA Astrophysics Data System (ADS)

    Grotepass, Juergen; Speyer, Hartmut; Kaiser, Ralf

    2002-05-01

    Individualized products are one of the ten mega trends of the 21st century, with human modeling as the key issue for tomorrow's design and product development. The use of human modeling software for computer-based ergonomic simulations within the production process increases quality while reducing costs by 30-50 percent and shortening production time. This presentation focuses on the use of human 3D-CAD models for both the ergonomic design of working environments and made-to-measure garment production. Today, the entire production chain can be designed, and individualized models generated and analyzed, in 3D computer environments. Anthropometric design for ergonomics is matched to human needs, thus preserving health. Ergonomic simulation includes topics such as human vision, reachability, kinematics, force and comfort analysis, and international design capabilities. In Germany, more than 17 billion marks are moved to other industries because clothes do not fit. Individual clothing tailored to the customer's preference means surplus value, pleasure, and perfect fit. Body scanning technology is the key to the generation and use of human 3D-CAD models for both the ergonomic design of working environments and made-to-measure garment production.

  20. EEG Characteristic Extraction Method of Listening Music and Objective Estimation Method Based on Latency Structure Model in Individual Characteristics

    NASA Astrophysics Data System (ADS)

    Ito, Shin-Ichi; Mitsukura, Yasue; Nakamura Miyamura, Hiroko; Saito, Takafumi; Fukumi, Minoru

    EEG signals are characterized by unique, individual characteristics, yet little research has taken these individual characteristics into account when analyzing EEG signals. Often the EEG has frequency components that describe most of its significant characteristics, and these frequency components differ in importance. We believe this difference in importance reflects the individual characteristics. In this paper, we propose a new method for extracting an EEG characteristic vector using a latency structure model in individual characteristics (LSMIC). The LSMIC is a latency structure model, based on the normal distribution, that treats personal error as the individual characteristic. A real-coded genetic algorithm (RGA) is used to estimate the personal error, which is an unknown parameter. Moreover, we propose an objective estimation method that plots the EEG characteristic vector in a visualization space. Finally, the performance of the proposed method is evaluated using a realistic simulation and applied to real EEG data. Our experimental results show the effectiveness of the proposed method.

  1. Simulation-Based Cutaneous Surgical-Skill Training on a Chicken-Skin Bench Model in a Medical Undergraduate Program

    PubMed Central

    Denadai, Rafael; Saad-Hossne, Rogério; Martinhão Souto, Luís Ricardo

    2013-01-01

    Background: Because of the ethical and medico-legal issues involved in training cutaneous surgical skills on living patients, human cadavers, and living animals, alternative and effective forms of training simulation must be sought. Aims: To propose and describe an alternative methodology for teaching and learning the principles of cutaneous surgery in a medical undergraduate program by using a chicken-skin bench model. Materials and Methods: One instructor for every four students, teaching materials on cutaneous surgical skills, chicken trunks, wings, or thighs, a rigid platform support, needled threads, needle holders, surgical blades with scalpel handles, rat-tooth tweezers, scissors, and marking pens were necessary for the training simulation. Results: A proposal for simulation-based training on incision, suture, biopsy, and reconstruction techniques using a chicken-skin bench model, distributed over several sessions with increasing levels of difficulty, was structured. Both feedback and objective evaluations, always directed to individual students, were also outlined. Conclusion: Teaching the principles of cutaneous surgery using a chicken-skin bench model that is versatile, portable, easy to assemble, and inexpensive is an alternative and complementary option to the armamentarium of methods based on other bench models described. PMID:23723471

  2. Dry matter partitioning models for the simulation of individual fruit growth in greenhouse cucumber canopies

    PubMed Central

    Wiechers, Dirk; Kahlen, Katrin; Stützel, Hartmut

    2011-01-01

    Background and Aims Growth imbalances between individual fruits are common in indeterminate plants such as cucumber (Cucumis sativus). In this species, these imbalances can be related to differences in two growth characteristics, fruit growth duration until reaching a given size and fruit abortion. Both are related to distribution, and environmental factors as well as canopy architecture play a key role in their differentiation. Furthermore, events leading to a fruit reaching its harvestable size before or simultaneously with a prior fruit can be observed. Functional–structural plant models (FSPMs) allow for interactions between environmental factors, canopy architecture and physiological processes. Here, we tested hypotheses which account for these interactions by introducing dominance and abortion thresholds for the partitioning of assimilates between growing fruits. Methods Using the L-System formalism, an FSPM was developed which combined a model for architectural development, a biochemical model of photosynthesis and a model for assimilate partitioning, the last including a fruit growth model based on a size-related potential growth rate (RP). Starting from a distribution proportional to RP, the model was extended by including abortion and dominance. Abortion was related to source strength and dominance to sink strength. Both thresholds were varied to test their influence on fruit growth characteristics. Simulations were conducted for a dense row and a sparse isometric canopy. Key Results The simple partitioning models failed to simulate individual fruit growth realistically. The introduction of abortion and dominance thresholds gave the best results. Simulations of fruit growth durations and abortion rates were in line with measurements, and events in which a fruit was harvestable earlier than an older fruit were reproduced. Conclusions Dominance and abortion events need to be considered when simulating typical fruit growth traits. 
By integrating environmental factors, the FSPM can be a valuable tool to analyse and improve existing knowledge about the dynamics of assimilate partitioning. PMID:21715366

  3. The Virtual Mouse Brain: A Computational Neuroinformatics Platform to Study Whole Mouse Brain Dynamics.

    PubMed

    Melozzi, Francesca; Woodman, Marmaduke M; Jirsa, Viktor K; Bernard, Christophe

    2017-01-01

    Connectome-based modeling of large-scale brain network dynamics enables causal in silico interrogation of the brain's structure-function relationship, necessitating the close integration of diverse neuroinformatics fields. Here we extend the open-source simulation software The Virtual Brain (TVB) to whole mouse brain network modeling based on individual diffusion magnetic resonance imaging (dMRI)-based or tracer-based detailed mouse connectomes. We provide practical examples on how to use The Virtual Mouse Brain (TVMB) to simulate brain activity, such as seizure propagation and the switching behavior of the resting state dynamics in health and disease. TVMB enables theoretically driven experimental planning and ways to test predictions in the numerous strains of mice available to study brain function in normal and pathological conditions.

  4. Assessing uncertainties in crop and pasture ensemble model simulations of productivity and N2O emissions.

    PubMed

    Ehrhardt, Fiona; Soussana, Jean-François; Bellocchi, Gianni; Grace, Peter; McAuliffe, Russel; Recous, Sylvie; Sándor, Renáta; Smith, Pete; Snow, Val; de Antoni Migliorati, Massimiliano; Basso, Bruno; Bhatia, Arti; Brilli, Lorenzo; Doltra, Jordi; Dorich, Christopher D; Doro, Luca; Fitton, Nuala; Giacomini, Sandro J; Grant, Brian; Harrison, Matthew T; Jones, Stephanie K; Kirschbaum, Miko U F; Klumpp, Katja; Laville, Patricia; Léonard, Joël; Liebig, Mark; Lieffering, Mark; Martin, Raphaël; Massad, Raia S; Meier, Elizabeth; Merbold, Lutz; Moore, Andrew D; Myrgiotis, Vasileios; Newton, Paul; Pattey, Elizabeth; Rolinski, Susanne; Sharp, Joanna; Smith, Ward N; Wu, Lianhai; Zhang, Qing

    2018-02-01

    Simulation models are extensively used to predict agricultural productivity and greenhouse gas emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multi-species agricultural contexts. We report an international model comparison and benchmarking exercise, showing the potential of multi-model ensembles to predict productivity and nitrous oxide (N2O) emissions for wheat, maize, rice and temperate grasslands. Using a multi-stage modelling protocol, from blind simulations (stage 1) to partial (stages 2-4) and full calibration (stage 5), 24 process-based biogeochemical models were assessed individually or as an ensemble against long-term experimental data from four temperate grassland and five arable crop rotation sites spanning four continents. Comparisons were performed by reference to the experimental uncertainties of observed yields and N2O emissions. Results showed that across sites and crop/grassland types, 23%-40% of the uncalibrated individual models were within two standard deviations (SD) of observed yields, while 42% (rice) to 96% (grasslands) of the models were within 1 SD of observed N2O emissions. At stage 1, ensembles formed by the three lowest prediction model errors predicted both yields and N2O emissions within experimental uncertainties for 44% and 33% of the crop and grassland growth cycles, respectively. Partial model calibration (stages 2-4) markedly reduced prediction errors of the full model ensemble E-median for crop grain yields (from 36% at stage 1 down to 4% on average) and grassland productivity (from 44% to 27%), and to a lesser and more variable extent for N2O emissions. Yield-scaled N2O emissions (N2O emissions divided by crop yields) were ranked accurately by three-model ensembles across crop species and field sites. The potential of using process-based model ensembles to predict jointly productivity and N2O emissions at field scale is discussed. © 2017 John Wiley & Sons Ltd.
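The two benchmark statistics used above, the fraction of individual models falling within k standard deviations of an observation and the relative error of the ensemble median ("E-median"), are simple to compute; the numbers below are invented.

```python
def within_k_sd(model_preds, observed, sd_obs, k):
    """Fraction of individual models whose prediction lies within
    k standard deviations of the observed value."""
    hits = sum(abs(pred - observed) <= k * sd_obs for pred in model_preds)
    return hits / len(model_preds)

def ensemble_median_error(model_preds, observed):
    """Relative error (%) of the ensemble median prediction."""
    preds = sorted(model_preds)
    n = len(preds)
    median = preds[n // 2] if n % 2 else 0.5 * (preds[n // 2 - 1] + preds[n // 2])
    return 100.0 * abs(median - observed) / observed

# Toy grain yields (t/ha) from five models vs. one observed site-year.
preds = [6.1, 7.4, 5.2, 6.8, 9.0]
share = within_k_sd(preds, observed=6.5, sd_obs=0.6, k=2)
err = ensemble_median_error(preds, observed=6.5)
```

Scoring the ensemble median rather than any single model is what makes multi-model ensembles robust to individual outliers such as the 9.0 prediction here.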

  5. SPARK: A Framework for Multi-Scale Agent-Based Biomedical Modeling.

    PubMed

    Solovyev, Alexey; Mikheev, Maxim; Zhou, Leming; Dutta-Moscato, Joyeeta; Ziraldo, Cordelia; An, Gary; Vodovotz, Yoram; Mi, Qi

    2010-01-01

    Multi-scale modeling of complex biological systems remains a central challenge in the systems biology community. A method of dynamic knowledge representation known as agent-based modeling enables the study of higher level behavior emerging from discrete events performed by individual components. With the advancement of computer technology, agent-based modeling has emerged as an innovative technique to model the complexities of systems biology. In this work, the authors describe SPARK (Simple Platform for Agent-based Representation of Knowledge), a framework for agent-based modeling specifically designed for systems-level biomedical model development. SPARK is a stand-alone application written in Java. It provides a user-friendly interface, and a simple programming language for developing Agent-Based Models (ABMs). SPARK has the following features specialized for modeling biomedical systems: 1) continuous space that can simulate real physical space; 2) flexible agent size and shape that can represent the relative proportions of various cell types; 3) multiple spaces that can concurrently simulate and visualize multiple scales in biomedical models; 4) a convenient graphical user interface. Existing ABMs of diabetic foot ulcers and acute inflammation were implemented in SPARK. Models of identical complexity were run in both NetLogo and SPARK; the SPARK-based models ran two to three times faster.

  6. Mussel dynamics model: A hydroinformatics tool for analyzing the effects of different stressors on the dynamics of freshwater mussel communities

    USGS Publications Warehouse

    Morales, Y.; Weber, L.J.; Mynett, A.E.; Newton, T.J.

    2006-01-01

    A model for simulating freshwater mussel population dynamics is presented. The model is a hydroinformatics tool that integrates principles from ecology, river hydraulics, fluid mechanics and sediment transport, and applies the individual-based modelling approach for simulating population dynamics. The general model layout, data requirements, and steps of the simulation process are discussed. As an illustration, simulation results from an application in a 10 km reach of the Upper Mississippi River are presented. The model was used to investigate the spatial distribution of mussels and the effects of food competition in native unionid mussel communities, and communities infested by Dreissena polymorpha, the zebra mussel. Simulation results were found to be realistic and coincided with data obtained from the literature. These results indicate that the model can be a useful tool for assessing the potential effects of different stressors on long-term population dynamics, and consequently, may improve the current understanding of cause and effect relationships in freshwater mussel communities. © 2006 Elsevier B.V. All rights reserved.

  7. Co-pyrolysis characteristics of microalgae Isochrysis and Chlorella: Kinetics, biocrude yield and interaction.

    PubMed

    Zhao, Bingwei; Wang, Xin; Yang, Xiaoyi

    2015-12-01

    Co-pyrolysis characteristics of Isochrysis (high lipid) and Chlorella (high protein) were investigated qualitatively and quantitatively based on DTG curves, biocrude yield, and composition from individual pyrolysis and co-pyrolysis. DTG curves from co-pyrolysis were compared in detail with those from individual pyrolysis. An interaction was detected at 475-500°C in co-pyrolysis based on biocrude yields, and kinetic analysis suggests that the co-pyrolysis reaction mechanism is three-dimensional diffusion, in contrast to random nucleation followed by growth in individual pyrolysis. There is no obvious difference in the maximum biocrude yields between individual pyrolysis and co-pyrolysis, but carboxylic acids (IC21) decreased and N-heterocyclic compounds (IC12) increased in co-pyrolysis. Simulation of biocrude yield by the Components Biofuel Model and the Kinetics Biofuel Model indicates that, in the solid phase, co-pyrolysis largely follows the same processes as individual pyrolysis. Variation of the percentage composition of co-pyrolysis and individual-pyrolysis biocrudes indicated an interaction in the gas phase. Copyright © 2015. Published by Elsevier Ltd.

  8. A physiology-based model describing heterogeneity in glucose metabolism: the core of the Eindhoven Diabetes Education Simulator (E-DES).

    PubMed

    Maas, Anne H; Rozendaal, Yvonne J W; van Pul, Carola; Hilbers, Peter A J; Cottaar, Ward J; Haak, Harm R; van Riel, Natal A W

    2015-03-01

    Current diabetes education methods are costly, time-consuming, and do not actively engage the patient. Here, we describe the development and verification of the physiological model for healthy subjects that forms the basis of the Eindhoven Diabetes Education Simulator (E-DES). E-DES will provide diabetes patients with an individualized virtual practice environment incorporating the main factors that influence glycemic control: food, exercise, and medication. The physiological model consists of 4 compartments for which the inflow and outflow of glucose and insulin are calculated using 6 nonlinear coupled differential equations and 14 parameters. These parameters are estimated on 12 sets of oral glucose tolerance test (OGTT) data (226 healthy subjects) obtained from the literature. The resulting parameter set is verified on 8 separate literature OGTT data sets (229 subjects). The model is considered verified if 95% of the glucose data points lie within an acceptance range of ±20% of the corresponding model value. All glucose data points of the verification data sets lie within the predefined acceptance range. Physiological processes represented in the model include insulin resistance and β-cell function. Adjusting the corresponding parameters allows heterogeneity in the data to be described and shows the capabilities of this model for individualization. We have verified the physiological model of the E-DES for healthy subjects. Heterogeneity of the data has successfully been modeled by adjusting the 4 parameters describing insulin resistance and β-cell function. Our model will form the basis of a simulator providing individualized education on glucose control. © 2014 Diabetes Technology Society.
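
    The verification criterion is easy to state in code. This is a sketch of the acceptance test as described in the abstract (function and variable names are hypothetical), not of the E-DES model itself:

```python
def verified(measured, modeled, tol=0.20, required=0.95):
    """Acceptance criterion from the abstract: the model counts as
    'verified' when at least 95% of measured glucose points lie
    within ±20% of the corresponding model value."""
    hits = sum(1 for y, m in zip(measured, modeled)
               if abs(y - m) <= tol * m)
    return hits / len(measured) >= required

# Toy glucose values (mmol/L): all three points fall inside the band
ok = verified([5.0, 6.1, 7.9], [5.2, 6.0, 8.0])
```
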

  9. Enhancing the Behaviorial Fidelity of Synthetic Entities with Human Behavior Models

    DTIC Science & Technology

    2004-05-05

    reflecting the soldier’s extensive training. A civilian’s behavior in the same situation will be determined more by emotions, such as fear, and goals...of intelligent behavior, from path-planning to emotional effects, data on the environment must be gathered from the simulation to serve as sensor...model of decision-making based on emotional utility. AI.Implant takes a composite behavior-based approach to individual and crowd navigation

  10. Socioeconophysics:. Opinion Dynamics for Number of Transactions and Price, a Trader Based Model

    NASA Astrophysics Data System (ADS)

    Tuncay, Çağlar

    Involving the effects of media, opinion leaders and other agents on the opinions of individuals in a market society, a trader-based model is developed and utilized to simulate price via supply and demand. Pronounced effects are considered with several weights, and some personal differences between traders are taken into account. The resulting time series and probability distribution function for price, involving a power law, come out similar to the real ones.
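
    A heavily simplified sketch of such a trader-based price model follows; all weights, the leader rule, and the price update below are invented for illustration, not taken from the paper:

```python
import random

def simulate_price(n=100, steps=200, w_media=0.1, w_leader=0.2, seed=3):
    """Toy opinion-dynamics market: each trader's opinion in [-1, 1]
    drifts toward a media signal and an opinion leader; price moves
    with the aggregate (net demand) opinion."""
    rng = random.Random(seed)
    opinions = [rng.uniform(-1, 1) for _ in range(n)]
    price, history = 100.0, []
    for _ in range(steps):
        media = rng.uniform(-1, 1)
        leader = opinions[0]                       # trader 0 acts as leader
        opinions = [max(-1.0, min(1.0,
                    (1 - w_media - w_leader) * o
                    + w_media * media + w_leader * leader
                    + rng.gauss(0, 0.05)))         # personal differences
                    for o in opinions]
        net_demand = sum(opinions) / n
        price *= 1 + 0.01 * net_demand             # excess demand raises price
        history.append(price)
    return history

prices = simulate_price()
```
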

  11. A biologically-based individual tree model for managing the longleaf pine ecosystem

    Treesearch

    Rick Smith; Greg Somers

    1998-01-01

    Duration: 1995-present Objective: Develop a longleaf pine dynamics model and simulation system to define desirable ecosystem management practices in existing and future longleaf pine stands. Methods: Naturally-regenerated longleaf pine trees are being destructively sampled to measure their recent growth and dynamics. Soils and climate data will be combined with the...

  12. Development of algorithmic decision-making models for sea crews

    NASA Astrophysics Data System (ADS)

    Lisitsyna, L.; Smetyuh, N.; Ivanovskiy, N.

    2018-05-01

    Modern virtual simulators are multifunctional, i.e. they can be used to develop and enhance the skills as well as to control professional skills and abilities of specialists of diverse profiles under various working conditions. This study is based on the generalization of a large experience in the sphere of applying ready-made multifunctional virtual simulators (MFVS) and developing new ones for the training and retraining of the crews of the Azov-Black Sea fishing vessels. The model is implemented in the multifunctional visual simulator "Trawling and purse-seining" to train the situational awareness among navigators individually and in a team. Interviews with those who employ the graduates of the advanced training courses testify to the adequacy of this model.

  13. Agent Based Modeling Applications for Geosciences

    NASA Astrophysics Data System (ADS)

    Stein, J. S.

    2004-12-01

    Agent-based modeling techniques have successfully been applied to systems in which complex behaviors or outcomes arise from varied interactions between individuals in the system. Each individual interacts with its environment, as well as with other individuals, by following a set of relatively simple rules. Traditionally this "bottom-up" modeling approach has been applied to problems in the fields of economics and sociology, but more recently it has been introduced to various disciplines in the geosciences. This technique can help explain the origin of complex processes from a relatively simple set of rules, incorporate large and detailed datasets when they exist, and simulate the effects of extreme events on system-wide behavior. Challenges associated with this modeling method include significant computational requirements to keep track of thousands to millions of agents, a lack of methods and strategies for model validation, and the absence of a formal methodology for evaluating model uncertainty. Challenges specific to the geosciences include how to define agents that control water, contaminant fluxes, climate forcing and other physical processes, and how to link these "geo-agents" into larger agent-based simulations that include social systems such as demographics, economics and regulations. Effective management of limited natural resources (such as water, hydrocarbons, or land) requires an understanding of what factors influence the demand for these resources on a regional and temporal scale. Agent-based models can be used to simulate this demand across a variety of sectors under a range of conditions and determine effective and robust management policies and monitoring strategies. The recent focus on the role of biological processes in the geosciences is another example of an area that could benefit from agent-based applications.
A typical approach to modeling the effect of biological processes in geologic media has been to represent these processes in a thermodynamic framework as a set of reactions that roll up the integrated effect that diverse biological communities exert on a geological system. This approach may work well to predict the effect of certain biological communities in specific environments for which experimental data are available. However, it does not further our knowledge of how the geobiological system actually functions on a micro scale. Agent-based techniques may provide a framework to explore the fundamental interactions required to explain the system-wide behavior. This presentation will survey several promising applications of agent-based modeling approaches to problems in the geosciences and describe specific contributions to some of the inherent challenges facing this approach.
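
    The resource-demand use case mentioned above can be caricatured in a few lines; the agents' adjustment rule and every number below are invented purely for illustration:

```python
import random

def run_demand_model(n_agents=50, steps=100, supply=40.0, seed=7):
    """Toy 'bottom-up' resource model: each agent follows a simple rule
    (demand grows slightly when satisfied, shrinks when rationed), and
    the aggregate demand/supply balance emerges from the interactions."""
    rng = random.Random(seed)
    demands = [rng.uniform(0.5, 1.5) for _ in range(n_agents)]
    shortages = 0
    for _ in range(steps):
        total = sum(demands)
        if total > supply:                  # rationing: all agents cut back
            shortages += 1
            demands = [d * 0.9 for d in demands]
        else:                               # satisfied agents ask for a bit more
            demands = [d * 1.02 for d in demands]
    return shortages, sum(demands)

shortages, final_total = run_demand_model()
```

    Even this caricature shows the characteristic agent-based outcome: aggregate demand oscillates around the supply constraint without any system-level equation imposing that balance.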

  14. Bayesian B-spline mapping for dynamic quantitative traits.

    PubMed

    Xing, Jun; Li, Jiahan; Yang, Runqing; Zhou, Xiaojing; Xu, Shizhong

    2012-04-01

    Owing to their ability and flexibility to describe individual gene expression at different time points, random regression (RR) analyses have become a popular procedure for the genetic analysis of dynamic traits whose phenotypes are collected over time. Specifically, when modelling the dynamic patterns of gene expressions in the RR framework, B-splines have been proved successful as an alternative to orthogonal polynomials. In the so-called Bayesian B-spline quantitative trait locus (QTL) mapping, B-splines are used to characterize the patterns of QTL effects and individual-specific time-dependent environmental errors over time, and the Bayesian shrinkage estimation method is employed to estimate model parameters. Extensive simulations demonstrate that (1) in terms of statistical power, Bayesian B-spline mapping outperforms the interval mapping based on the maximum likelihood; (2) for the simulated dataset with complicated growth curve simulated by B-splines, Legendre polynomial-based Bayesian mapping is not capable of identifying the designed QTLs accurately, even when higher-order Legendre polynomials are considered and (3) for the simulated dataset using Legendre polynomials, the Bayesian B-spline mapping can find the same QTLs as those identified by Legendre polynomial analysis. All simulation results support the necessity and flexibility of B-spline in Bayesian mapping of dynamic traits. The proposed method is also applied to a real dataset, where QTLs controlling the growth trajectory of stem diameters in Populus are located.
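
    The B-spline basis underlying this kind of mapping can be evaluated directly with the Cox-de Boor recursion. The sketch below is a generic textbook implementation, not the authors' code:

```python
def bspline_basis(i, k, t, knots):
    """Cox-de Boor recursion: value at t of the i-th B-spline basis
    function of order k (degree k-1) on the given knot vector."""
    if k == 1:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + k - 1] != knots[i]:
        left = ((t - knots[i]) / (knots[i + k - 1] - knots[i])
                * bspline_basis(i, k - 1, t, knots))
    if knots[i + k] != knots[i + 1]:
        right = ((knots[i + k] - t) / (knots[i + k] - knots[i + 1])
                 * bspline_basis(i + 1, k - 1, t, knots))
    return left + right

# Partition of unity: inside the valid domain the basis functions sum to 1
knots = [0, 1, 2, 3, 4, 5, 6]
total = sum(bspline_basis(i, 3, 2.5, knots) for i in range(4))
```

    In the QTL-mapping context, time-varying QTL effects are expressed as linear combinations of such basis functions, so estimating a dynamic effect reduces to estimating a small set of basis coefficients.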

  15. Near-Surface Meteorology During the Arctic Summer Cloud Ocean Study (ASCOS): Evaluation of Reanalyses and Global Climate Models.

    NASA Technical Reports Server (NTRS)

    De Boer, G.; Shupe, M.D.; Caldwell, P.M.; Bauer, Susanne E.; Persson, O.; Boyle, J.S.; Kelley, M.; Klein, S.A.; Tjernstrom, M.

    2014-01-01

    Atmospheric measurements from the Arctic Summer Cloud Ocean Study (ASCOS) are used to evaluate the performance of three atmospheric reanalyses (European Centre for Medium-Range Weather Forecasting (ECMWF) Interim reanalysis, National Center for Environmental Prediction (NCEP)-National Center for Atmospheric Research (NCAR) reanalysis, and NCEP-DOE (Department of Energy) reanalysis) and two global climate models (CAM5 (Community Atmosphere Model 5) and NASA GISS (Goddard Institute for Space Studies) ModelE2) in simulation of the high Arctic environment. Quantities analyzed include near-surface meteorological variables such as temperature, pressure, humidity and winds, surface-based estimates of cloud and precipitation properties, the surface energy budget, and lower atmospheric temperature structure. In general, the models perform well in simulating large-scale dynamical quantities such as pressure and winds. Near-surface temperature and lower atmospheric stability, along with surface energy budget terms, are not as well represented, due largely to errors in the simulation of cloud occurrence, phase and altitude. Additionally, a development version of CAM5, which features improved handling of cloud macrophysics, has been shown to improve the simulation of cloud properties and liquid water amount. The ASCOS period additionally provides an excellent example of the benefits gained by evaluating individual budget terms, rather than simply evaluating the net end product, with large compensating errors between individual surface energy budget terms that result in the best net energy budget.

  16. Biomimicry of quorum sensing using bacterial lifecycle model.

    PubMed

    Niu, Ben; Wang, Hong; Duan, Qiqi; Li, Li

    2013-01-01

    Recent microbiologic studies have shown that quorum sensing mechanisms, which serve as one of the fundamental requirements for bacterial survival, exist widely in bacterial intra- and inter-species cell-cell communication. Many simulation models, inspired by the social behavior of natural organisms, have been presented to provide new approaches for solving realistic optimization problems. Most of these simulation models follow population-based modelling approaches, where all the individuals are updated according to the same rules; it is therefore difficult to maintain the diversity of the population. In this paper, we present a computational model termed LCM-QS, which simulates the bacterial quorum-sensing (QS) mechanism using an individual-based modelling approach under the framework of the Agent-Environment-Rule (AER) scheme, i.e. the bacterial lifecycle model (LCM). The LCM-QS model can be divided into three main sub-models: a chemotaxis-with-QS sub-model, a reproduction and elimination sub-model, and a migration sub-model. The proposed model is used not only to imitate the bacterial evolution process at the single-cell level, but also to study bacterial macroscopic behaviour. Comparative experiments under four different scenarios have been conducted in an artificial 3-D environment with nutrient and noxious-substance distributions, and bacterial chemotactic processes with and without quorum sensing are compared in detail. By using quorum sensing mechanisms, artificial bacteria working together can quickly find the nutrient concentration (or global optimum) in the artificial environment. Biomimicry of quorum sensing mechanisms using the lifecycle model endows the artificial bacteria with communication abilities, which are essential for obtaining more valuable information to guide their search cooperatively towards the preferred nutrient concentrations. It can also provide inspiration for designing new swarm intelligence optimization algorithms for solving real-world problems.

  17. Biomimicry of quorum sensing using bacterial lifecycle model

    PubMed Central

    2013-01-01

    Background Recent microbiologic studies have shown that quorum sensing mechanisms, which serve as one of the fundamental requirements for bacterial survival, exist widely in bacterial intra- and inter-species cell-cell communication. Many simulation models, inspired by the social behavior of natural organisms, have been presented to provide new approaches for solving realistic optimization problems. Most of these simulation models follow population-based modelling approaches, where all the individuals are updated according to the same rules; it is therefore difficult to maintain the diversity of the population. Results In this paper, we present a computational model termed LCM-QS, which simulates the bacterial quorum-sensing (QS) mechanism using an individual-based modelling approach under the framework of the Agent-Environment-Rule (AER) scheme, i.e. the bacterial lifecycle model (LCM). The LCM-QS model can be divided into three main sub-models: a chemotaxis-with-QS sub-model, a reproduction and elimination sub-model, and a migration sub-model. The proposed model is used not only to imitate the bacterial evolution process at the single-cell level, but also to study bacterial macroscopic behaviour. Comparative experiments under four different scenarios have been conducted in an artificial 3-D environment with nutrient and noxious-substance distributions, and bacterial chemotactic processes with and without quorum sensing are compared in detail. By using quorum sensing mechanisms, artificial bacteria working together can quickly find the nutrient concentration (or global optimum) in the artificial environment. Conclusions Biomimicry of quorum sensing mechanisms using the lifecycle model endows the artificial bacteria with communication abilities, which are essential for obtaining more valuable information to guide their search cooperatively towards the preferred nutrient concentrations. It can also provide inspiration for designing new swarm intelligence optimization algorithms for solving real-world problems. PMID:23815296

  18. Trace organic chemical attenuation during managed aquifer recharge: Insights from a variably saturated 2D tank experiment

    NASA Astrophysics Data System (ADS)

    Regnery, Julia; Lee, Jonghyun; Drumheller, Zachary W.; Drewes, Jörg E.; Illangasekare, Tissa H.; Kitanidis, Peter K.; McCray, John E.; Smits, Kathleen M.

    2017-05-01

    Meaningful model-based predictions of water quality and quantity are imperative for designing the footprint of managed aquifer recharge (MAR) installations. A two-dimensional (2D) synthetic MAR system equipped with automated sensors (temperature, water pressure, conductivity, soil moisture, oxidation-reduction potential) and embedded water sampling ports was used to test and model fundamental subsurface processes during surface-spreading MAR operations under controlled flow and redox conditions at the meso-scale. The fate and transport of contaminants in the variably saturated synthetic aquifer were simulated using the finite element analysis model FEFLOW. In general, the model concurred with travel times derived from contaminant breakthrough curves at individual sensor locations throughout the 2D tank. However, discrepancies between measured and simulated trace organic chemical concentrations (i.e., carbamazepine, sulfamethoxazole, tris(2-chloroethyl) phosphate, trimethoprim) were observed. While the FEFLOW simulation captured the overall shapes of the trace organic chemical breakthrough curves well, the model struggled to match individual data points, even though compound-specific attenuation parameters were used. Interestingly, despite steady-state operation, oxidation-reduction potential measurements indicated temporal disturbances in hydraulic properties in the saturated zone of the 2D tank that affected water quality.
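
    Breakthrough-curve shapes of the kind compared here are often summarized with the one-dimensional advection-dispersion solution for continuous injection. The sketch below uses the leading term of the classical Ogata-Banks solution with illustrative parameters; it is not a FEFLOW calculation:

```python
import math

def breakthrough(x, t, v=1.0, D=0.1):
    """Relative concentration C/C0 at distance x and time t for a
    continuous injection at x = 0 (Ogata-Banks solution, first term),
    with seepage velocity v and dispersion coefficient D."""
    if t <= 0:
        return 0.0
    return 0.5 * math.erfc((x - v * t) / (2.0 * math.sqrt(D * t)))
```

    The midpoint of the curve (C/C0 = 0.5) arrives at t = x/v, which is how travel times are commonly read off measured breakthrough curves at individual sensor locations.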

  19. Effect of Heterogeneous Interest Similarity on the Spread of Information in Mobile Social Networks

    NASA Astrophysics Data System (ADS)

    Zhao, Narisa; Sui, Guoqin; Yang, Fan

    2018-06-01

    Mobile social networks (MSNs) are important platforms for spreading news. The fact that individuals usually forward information aligned with their own interests inevitably changes the dynamics of information spread. Therefore, we first present a theoretical model based on a discrete Markov chain and mean-field theory to evaluate the effect of interest similarity on information spread in MSNs. Individuals' interests are heterogeneous and vary with time; these two features result in interest-shift behavior, and both are considered in our model. Simulations demonstrate the accuracy of the model, and the basic reproduction number R0 is determined. Further extensive numerical analyses based on the model indicate that interest similarity has a critical impact on information spread at the early spreading stage. Specifically, the information always spreads more quickly and widely when the interest similarity between an individual and the information is higher. Finally, five actual data sets from Sina Weibo illustrate the validity of the model.
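
    The qualitative effect of interest similarity can be shown with a minimal mean-field spreading sketch; the forwarding and removal rates below are invented for illustration, and the paper's actual discrete Markov chain model is considerably more detailed:

```python
def spread_fraction(similarity, beta=0.5, mu=0.1, steps=50, i0=0.01):
    """Mean-field SIR-style sketch in which the forwarding rate is
    scaled by interest similarity between individual and information.
    Returns the fraction of the population the information ever reaches."""
    s, i, r = 1.0 - i0, i0, 0.0
    for _ in range(steps):
        new_i = beta * similarity * s * i   # similarity-weighted forwarding
        new_r = mu * i                      # spreaders lose interest
        s, i, r = s - new_i, i + new_i - new_r, r + new_r
    return i + r

high = spread_fraction(0.9)
low = spread_fraction(0.3)
```

    Higher similarity raises the effective reproduction number (here β·similarity/μ), so the information spreads faster and reaches a larger final fraction, matching the paper's qualitative finding.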

  20. Bee++: An Object-Oriented, Agent-Based Simulator for Honey Bee Colonies

    PubMed Central

    Betti, Matthew; LeClair, Josh; Wahl, Lindi M.; Zamir, Mair

    2017-01-01

    We present a model and associated simulation package (www.beeplusplus.ca) to capture the natural dynamics of a honey bee colony in a spatially-explicit landscape, with temporally-variable, weather-dependent parameters. The simulation tracks bees of different ages and castes, food stores within the colony, pollen and nectar sources and the spatial position of individual foragers outside the hive. We track explicitly the intake of pesticides in individual bees and their ability to metabolize these toxins, such that the impact of sub-lethal doses of pesticides can be explored. Moreover, pathogen populations (in particular, Nosema apis, Nosema ceranae and Varroa mites) have been included in the model and may be introduced at any time or location. The ability to study interactions among pesticides, climate, biodiversity and pathogens in this predictive framework should prove useful to a wide range of researchers studying honey bee populations. To this end, the simulation package is written in open source, object-oriented code (C++) and can be easily modified by the user. Here, we demonstrate the use of the model by exploring the effects of sub-lethal pesticide exposure on the flight behaviour of foragers. PMID:28287445

  1. Simulation Models for Developing an Individualized, Performance Criterion Learning Situation. Technical Monograph No. 21.

    ERIC Educational Resources Information Center

    Anderson, G. Ernest, Jr.

    The mission of the simulation team of the Model Elementary Teacher Education Project, 1968-71, was to develop simulation tools and conduct appropriate studies of the anticipated operation of that project. The team focused on the experiences of individual students and on the resources necessary for these experiences to be reasonable. This report…

  2. Electromagnetic Modeling of Human Body Using High Performance Computing

    NASA Astrophysics Data System (ADS)

    Ng, Cho-Kuen; Beall, Mark; Ge, Lixin; Kim, Sanghoek; Klaas, Ottmar; Poon, Ada

    Realistic simulation of electromagnetic wave propagation in the actual human body can expedite the investigation of powering implanted devices through wireless coupling from external sources. The parallel electromagnetics code suite ACE3P developed at SLAC National Accelerator Laboratory is based on the finite element method for high fidelity accelerator simulation, and it can be enhanced to model electromagnetic wave propagation in the human body. Starting with a CAD model of a human phantom characterized by a number of tissues, a finite element mesh representing the complex geometries of the individual tissues is built for simulation. Employing an optimal power source with a specific pattern of field distribution, the propagation and focusing of electromagnetic waves in the phantom have been demonstrated. Substantial speedup of the simulation is achieved by using multiple compute cores on supercomputers.

  3. Probabilistic information transmission in a network of coupled oscillators reveals speed-accuracy trade-off in responding to threats

    PubMed Central

    Chicoli, Amanda; Paley, Derek A.

    2016-01-01

    Individuals in a group may obtain information from other group members about the environment, including the location of a food source or the presence of a predator. Here, we model how information spreads in a group using a susceptible-infected-removed epidemic model. We apply this model to a simulated shoal of fish using the motion dynamics of a coupled oscillator model, in order to test the biological hypothesis that polarized or aligned shoaling leads to faster and more accurate escape responses. The contributions of this study are the (i) application of a probabilistic model of epidemics to the study of collective animal behavior; (ii) testing the biological hypothesis that group cohesion improves predator escape; (iii) quantification of the effect of social cues on startle propagation; and (iv) investigation of the variation in response based on network connectivity. We find that when perfectly aligned individuals in a group are startled, there is a rapid escape by individuals that directly detect the threat, as well as by individuals responding to their neighbors. However, individuals that are not startled do not head away from the threat. In startled groups that are randomly oriented, there is a rapid, accurate response by individuals that directly detect the threat, followed by less accurate responses by individuals responding to neighbor cues. Over the simulation duration, however, even unstartled individuals head away from the threat. This study illustrates a potential speed-accuracy trade-off in the startle response of animal groups, in agreement with several previous experimental studies. Additionally, the model can be applied to a variety of group decision-making processes, including those involving higher-dimensional motion. PMID:27907996
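
    The core mechanism of the paper, an SIR-style cascade over a contact network, can be sketched independently of the coupled-oscillator motion model; the network, transmission probability, and all parameter names below are illustrative assumptions:

```python
import random

def startle_spread(adj, sources, p=0.6, steps=10, seed=5):
    """SIR-style startle cascade on a contact network: startled
    ('infected') individuals alert each neighbour with probability p
    in one time step, then stop transmitting ('removed')."""
    rng = random.Random(seed)
    susceptible = set(range(len(adj))) - set(sources)
    infected, removed = set(sources), set()
    for _ in range(steps):
        newly = set()
        for i in infected:
            for j in adj[i]:
                if j in susceptible and rng.random() < p:
                    newly.add(j)
        susceptible -= newly
        removed |= infected
        infected = newly
    return removed | infected   # everyone who ever startled

# Line-of-eight shoal: each fish senses only its nearest neighbours
adj = {i: [j for j in (i - 1, i + 1) if 0 <= j < 8] for i in range(8)}
startled = startle_spread(adj, sources={0})
```

    Varying the network connectivity (e.g. an aligned versus a randomly oriented shoal) changes how far and how fast the cascade travels, which is the variation the study quantifies.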

  4. Probabilistic information transmission in a network of coupled oscillators reveals speed-accuracy trade-off in responding to threats

    NASA Astrophysics Data System (ADS)

    Chicoli, Amanda; Paley, Derek A.

    2016-11-01

    Individuals in a group may obtain information from other group members about the environment, including the location of a food source or the presence of a predator. Here, we model how information spreads in a group using a susceptible-infected-removed epidemic model. We apply this model to a simulated shoal of fish using the motion dynamics of a coupled oscillator model, in order to test the biological hypothesis that polarized or aligned shoaling leads to faster and more accurate escape responses. The contributions of this study are the (i) application of a probabilistic model of epidemics to the study of collective animal behavior; (ii) testing the biological hypothesis that group cohesion improves predator escape; (iii) quantification of the effect of social cues on startle propagation; and (iv) investigation of the variation in response based on network connectivity. We find that when perfectly aligned individuals in a group are startled, there is a rapid escape by individuals that directly detect the threat, as well as by individuals responding to their neighbors. However, individuals that are not startled do not head away from the threat. In startled groups that are randomly oriented, there is a rapid, accurate response by individuals that directly detect the threat, followed by less accurate responses by individuals responding to neighbor cues. Over the simulation duration, however, even unstartled individuals head away from the threat. This study illustrates a potential speed-accuracy trade-off in the startle response of animal groups, in agreement with several previous experimental studies. Additionally, the model can be applied to a variety of group decision-making processes, including those involving higher-dimensional motion.

  5. Probabilistic information transmission in a network of coupled oscillators reveals speed-accuracy trade-off in responding to threats.

    PubMed

    Chicoli, Amanda; Paley, Derek A

    2016-11-01

    Individuals in a group may obtain information from other group members about the environment, including the location of a food source or the presence of a predator. Here, we model how information spreads in a group using a susceptible-infected-removed epidemic model. We apply this model to a simulated shoal of fish using the motion dynamics of a coupled oscillator model, in order to test the biological hypothesis that polarized or aligned shoaling leads to faster and more accurate escape responses. The contributions of this study are the (i) application of a probabilistic model of epidemics to the study of collective animal behavior; (ii) testing the biological hypothesis that group cohesion improves predator escape; (iii) quantification of the effect of social cues on startle propagation; and (iv) investigation of the variation in response based on network connectivity. We find that when perfectly aligned individuals in a group are startled, there is a rapid escape by individuals that directly detect the threat, as well as by individuals responding to their neighbors. However, individuals that are not startled do not head away from the threat. In startled groups that are randomly oriented, there is a rapid, accurate response by individuals that directly detect the threat, followed by less accurate responses by individuals responding to neighbor cues. Over the simulation duration, however, even unstartled individuals head away from the threat. This study illustrates a potential speed-accuracy trade-off in the startle response of animal groups, in agreement with several previous experimental studies. Additionally, the model can be applied to a variety of group decision-making processes, including those involving higher-dimensional motion.

  6. Evacuation Simulation in Kalayaan Residence Hall, UP Diliman Using GAMA Simulation Software

    NASA Astrophysics Data System (ADS)

    Claridades, A. R. C.; Villanueva, J. K. S.; Macatulad, E. G.

    2016-09-01

    Agent-Based Modeling (ABM) has recently been adopted in some studies for the modelling of events as a dynamic system given a set of events and parameters. In principle, ABM employs individual agents with assigned attributes and behaviors and simulates their behavior within their environment and their interaction with other agents. This can be a useful tool in both micro- and macro-scale applications. In this study, a model initially created and applied to an academic building was implemented in a dormitory. In particular, this research integrates three-dimensional Geographic Information System (GIS) data with GAMA as the multi-agent-based evacuation simulation, implemented in Kalayaan Residence Hall. A three-dimensional GIS model was created based on the floor plans and demographic data of the dorm, including the respective pathways as networks, rooms, floors, exits and appropriate attributes. This model was then re-implemented in GAMA. Different states of the agents and their effect on evacuation time were then observed, and GAMA simulations with varying path widths were also run. It was found that, compared to their original states, panic, eating and studying hasten evacuation, whereas sleeping and being in the bathrooms are impediments. It is also concluded that evacuation time is halved when path widths are doubled; however, it is recommended that further studies model pathways as spaces instead of lines. A more scientific basis for predicting agent behavior in these states is also recommended for more realistic results.
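
    The reported width effect can be reproduced by a hand calculation in which exit flow scales linearly with path width; the flow rate and pre-movement delay below are illustrative numbers, not outputs of the GAMA runs:

```python
def evacuation_time(n_people, path_width, flow_per_metre=1.5, delay=5.0):
    """Capacity sketch: total time = fixed pre-movement delay plus a
    queueing term, people / (flow per metre of width * width).
    Doubling the width therefore halves the queueing term."""
    return delay + n_people / (flow_per_metre * path_width)

t_narrow = evacuation_time(300, 1.0)   # 1 m corridor
t_wide = evacuation_time(300, 2.0)     # same crowd, doubled width
```

    In this simple capacity view only the queueing term halves; the fixed delay does not, which is one reason simulated results can deviate from the exact halving once pre-movement behavior is included.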

  7. A theoretical individual-based model of Brown Ring Disease in Manila clams, Venerupis philippinarum

    NASA Astrophysics Data System (ADS)

    Paillard, Christine; Jean, Fred; Ford, Susan E.; Powell, Eric N.; Klinck, John M.; Hofmann, Eileen E.; Flye-Sainte-Marie, Jonathan

    2014-08-01

    An individual-based mathematical model was developed to investigate the biological and environmental interactions that influence the prevalence and intensity of Brown Ring Disease (BRD), a disease caused by the bacterial pathogen Vibrio tapetis in the Manila clam (Venerupis (= Tapes, = Ruditapes) philippinarum). V. tapetis acts as an external microparasite, adhering to the surface of the mantle edge and its secretion, the periostracal lamina, causing the symptomatic brown deposit. Brown Ring Disease is atypical in that it leaves a shell scar that provides a unique tool for diagnosis of either live or dead clams. The model was formulated using laboratory and field measurements of BRD development in Manila clams, physiological responses of the clam to the pathogen, and the physiology of V. tapetis, as well as theoretical understanding of bacterial disease progression in marine shellfish. The simulation results obtained for an individual Manila clam were expanded to cohorts and populations using a probability distribution that prescribed a range of variability for parameters in a three-dimensional framework: assimilation rate, clam hemocyte activity rate (the number of bacteria ingested per hemocyte per day), and clam calcification rate (a measure of the ability to recover by covering over the symptomatic brown ring deposit), which sensitivity studies indicated to be important processes in determining BRD prevalence and intensity. This approach allows concurrent simulation of individuals with a variety of different physiological capabilities (phenotypes) and hence, by implication, differing genotypic composition. Different combinations of the three variables provide robust estimates for the fate of individuals with particular characteristics in a population that consists of mixtures of all possible combinations. The BRD model was implemented using environmental observations from sites in Brittany, France, where Manila clams routinely exhibit BRD signs.
The simulated annual cycle of BRD prevalence and intensity agrees with observed disease cycles in cultured clam populations from this region, with maximum disease prevalence and intensity occurring from December to April. Sensitivity analyses of modeled physiological processes showed that the level of hemocyte activity is the primary intrinsic determinant of recovery of infected clams. Simulations designed to investigate environmental effects on BRD suggested that the outcome of the host-parasite interaction is dependent on food supply (high values being favorable for the host) and temperature. Results of simulations illustrate the complex interaction of temperature effects on propagation and viability of the bacterium, on the phagocytic activity of the hemocytes, and on other physiological processes of the host clam. Simulations using 1 °C and 2 °C increases in temperature generally favored disease development, indicating that climate warming might favor the spread of BRD.

  8. An open, object-based modeling approach for simulating subsurface heterogeneity

    NASA Astrophysics Data System (ADS)

    Bennett, J.; Ross, M.; Haslauer, C. P.; Cirpka, O. A.

    2017-12-01

    Characterization of subsurface heterogeneity with respect to hydraulic and geochemical properties is critical in hydrogeology as their spatial distribution controls groundwater flow and solute transport. Many approaches of characterizing subsurface heterogeneity do not account for well-established geological concepts about the deposition of the aquifer materials; those that do (i.e. process-based methods) often require forcing parameters that are difficult to derive from site observations. We have developed a new method for simulating subsurface heterogeneity that honors concepts of sequence stratigraphy, resolves fine-scale heterogeneity and anisotropy of distributed parameters, and resembles observed sedimentary deposits. The method implements a multi-scale hierarchical facies modeling framework based on architectural element analysis, with larger features composed of smaller sub-units. The Hydrogeological Virtual Reality simulator (HYVR) simulates distributed parameter models using an object-based approach. Input parameters are derived from observations of stratigraphic morphology in sequence type-sections. Simulation outputs can be used for generic simulations of groundwater flow and solute transport, and for the generation of three-dimensional training images needed in applications of multiple-point geostatistics. The HYVR algorithm is flexible and easy to customize. The algorithm was written in the open-source programming language Python, and is intended to form a code base for hydrogeological researchers, as well as a platform that can be further developed to suit investigators' individual needs. This presentation will encompass the conceptual background and computational methods of the HYVR algorithm, the derivation of input parameters from site characterization, and the results of groundwater flow and solute transport simulations in different depositional settings.
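
    The object-placement idea can be sketched in a few lines. The toy below is not the HYVR code; the grid size, lens geometry and facies codes are invented for illustration. It drops ellipsoidal "architectural elements" into a 3-D facies grid, with later objects eroding earlier ones to mimic depositional ordering:

```python
import random

random.seed(42)

# Toy object-based facies simulation: 0 = background matrix,
# codes 1..5 = successive ellipsoidal lens objects.
NX, NY, NZ = 40, 40, 20
facies = [[[0] * NZ for _ in range(NY)] for _ in range(NX)]

for code in range(1, 6):
    # Random lens centre and semi-axes (illustrative ranges).
    cx, cy, cz = (random.uniform(0, NX), random.uniform(0, NY),
                  random.uniform(0, NZ))
    ax, ay, az = (random.uniform(5, 12), random.uniform(5, 12),
                  random.uniform(2, 4))
    for x in range(NX):
        for y in range(NY):
            for z in range(NZ):
                if (((x - cx) / ax) ** 2 + ((y - cy) / ay) ** 2
                        + ((z - cz) / az) ** 2) <= 1.0:
                    facies[x][y][z] = code  # later lenses erode earlier ones

codes_used = {v for plane in facies for row in plane for v in row}
```

    A real object-based simulator layers stratigraphic ordering rules, trends, and within-object parameter fields (porosity, anisotropy) on top of this geometric core.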

  9. Exploiting Motion Capture to Enhance Avoidance Behaviour in Games

    NASA Astrophysics Data System (ADS)

    van Basten, Ben J. H.; Jansen, Sander E. M.; Karamouzas, Ioannis

    Realistic simulation of interacting virtual characters is essential in computer games, training and simulation applications. The problem is challenging because people are accustomed to real-world situations and can therefore easily detect inconsistencies and artifacts in simulations. Over the past twenty years, several models have been proposed for simulating individuals, groups and crowds of characters. However, little effort has been made to understand how humans actually solve interactions and avoid collisions in real life. In this paper, we exploit motion capture data to gain more insight into human-human interactions. We propose four measures to describe collision-avoidance behavior. Based on these measures, we extract simple rules that can be applied on top of existing agent- and force-based approaches, increasing the realism of the resulting simulations.
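
    One generic quantity of this kind can be made concrete. The function below computes the minimum predicted distance between two agents under straight-line extrapolation of their current velocities; this is a plausible illustration of an avoidance measure, not necessarily one of the paper's four:

```python
import math

def min_predicted_distance(p_a, v_a, p_b, v_b, horizon=5.0):
    """Closest approach of two agents within `horizon` seconds,
    assuming both keep their current velocity."""
    dpx, dpy = p_b[0] - p_a[0], p_b[1] - p_a[1]   # relative position
    dvx, dvy = v_b[0] - v_a[0], v_b[1] - v_a[1]   # relative velocity
    dv2 = dvx * dvx + dvy * dvy
    # Time of closest approach, clamped to [0, horizon].
    t = 0.0 if dv2 == 0 else max(0.0, min(horizon, -(dpx * dvx + dpy * dvy) / dv2))
    return math.hypot(dpx + dvx * t, dpy + dvy * t)

# Agent A walks right at 1 m/s past a standing agent B offset by (4, 3).
d = min_predicted_distance((0.0, 0.0), (1.0, 0.0), (4.0, 3.0), (0.0, 0.0))
```

    A small minimum predicted distance signals an imminent conflict that an avoidance rule should resolve.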

  10. Bioenergetics-based modeling of individual PCB congeners in nestling tree swallows from two contaminated sites on the Upper Hudson River, New York

    USGS Publications Warehouse

    Nichols, John W.; Echols, Kathy R.; Tillitt, Donald E.; Secord, Anne L.; McCarty, John P.

    2004-01-01

    A bioenergetics-based model was used to simulate the accumulation of total PCBs and 20 PCB congeners by nestling tree swallows at two contaminated sites on the Upper Hudson River, New York. PCB concentrations in birds were calculated as the sum of inherited residues and those acquired through consumption of contaminated insects. Close agreement between simulations and measured residues in 5-, 10-, and 15-day-old nestlings was obtained when PCB concentrations in the diet were set equal to those in food boli taken from adult birds. These simulations were further optimized by fitting the value of a dietary assimilation efficiency constant. Fitted constants for both sites were similar and averaged about 0.7. An evaluation of model performance for individual congeners provided no evidence of metabolic biotransformation. The results of this study are consistent with a companion effort in which principal components analysis was used to compare PCB congener patterns in insects and in tree swallow eggs, nestlings, and adults. Together, these studies establish a quantitative linkage between nestling tree swallows and the insects that they consume and provide strong support for the use of nestling swallows as a biomonitoring species for exposure assessment.
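
    The accumulation bookkeeping reduces to a simple mass balance. The sketch below reuses the assimilation efficiency of about 0.7 reported in the abstract, but the inherited burden and daily intakes are invented purely for illustration:

```python
# Hedged mass-balance sketch (not the paper's bioenergetics model):
# body burden = inherited residue + sum(daily intake * assimilation efficiency)
assim_eff = 0.7                                       # fitted value from the abstract
inherited_ug = 1.2                                    # assumed egg-derived burden, ug
daily_intake_ug = [0.8 + 0.1 * d for d in range(15)]  # assumed ramp over 15 days

burden_ug = inherited_ug + assim_eff * sum(daily_intake_ug)
```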

  11. surrosurv: An R package for the evaluation of failure time surrogate endpoints in individual patient data meta-analyses of randomized clinical trials.

    PubMed

    Rotolo, Federico; Paoletti, Xavier; Michiels, Stefan

    2018-03-01

    Surrogate endpoints are attractive for use in clinical trials instead of well-established endpoints because of practical convenience. To validate a surrogate endpoint, two important measures can be estimated in a meta-analytic context when individual patient data are available: the R²_indiv or Kendall's τ at the individual level, and the R²_trial at the trial level. We aimed to provide an R implementation of classical, well-established, and more recent statistical methods for surrogacy assessment with failure time endpoints. We also intended to incorporate utilities for model checking and visualization, and data-generating methods described in the literature to date. In the case of failure time endpoints, the classical approach is based on two steps. First, Kendall's τ is estimated as the measure of individual-level surrogacy using a copula model. Then, R²_trial is computed via a linear regression of the estimated treatment effects; at this second step, the estimation uncertainty can be accounted for via a measurement-error model or via weights. In addition to the classical approach, we recently developed an approach based on bivariate auxiliary Poisson models, with individual random effects to measure Kendall's τ and treatment-by-trial interactions to measure R²_trial. The most common data simulation models described in the literature are based on copula models, mixed proportional hazard models, and mixtures of half-normal and exponential random variables. The R package surrosurv implements the classical two-step method with Clayton, Plackett, and Hougaard copulas. It also optionally allows adjusting the second-step linear regression for measurement error. The mixed Poisson approach is implemented with different reduced models in addition to the full model.
We present the package functions for estimating the surrogacy models, for checking their convergence, for performing leave-one-trial-out cross-validation, and for plotting the results. We illustrate their use in practice on individual patient data from a meta-analysis of 4069 patients with advanced gastric cancer from 20 trials of chemotherapy. The surrosurv package provides an R implementation of classical and recent statistical methods for surrogacy assessment of failure time endpoints. Flexible simulation functions are available to generate data according to the methods described in the literature. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Agent-based Modeling with MATSim for Hazards Evacuation Planning

    NASA Astrophysics Data System (ADS)

    Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.

    2015-12-01

    Hazard evacuation planning requires robust modeling tools and techniques, such as least-cost-distance or agent-based modeling, to understand a community's potential to reach safety before a hazard event (e.g. a tsunami) arrives. Least-cost-distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate the movement of, and interaction between, individual agents as evacuees move through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.
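
    The least-cost-distance step can be illustrated with a multi-source Dijkstra search over a walkability grid. This is a toy sketch, not MATSim; the geometry, safe cell and walking speed are invented:

```python
import heapq

# 0 = walkable cell, 1 = blocked; one assumed "high ground" cell.
grid = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
safe = {(0, 4)}
speed = 1.0  # cells per minute, an assumption

def evacuation_times(grid, safe):
    """Multi-source Dijkstra from the safe zone back to every walkable cell."""
    rows, cols = len(grid), len(grid[0])
    dist = {cell: 0.0 for cell in safe}
    heap = [(0.0, cell) for cell in safe]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1.0 / speed
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return dist

times = evacuation_times(grid, safe)
```

    Cells missing from `times` (blocked or cut off) are unreachable; an agent-based layer such as MATSim then adds congestion dynamics on top of these static estimates.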

  13. Potential effects of maternal contribution on egg and larva population dynamics of striped bass: Integrated individual-based model and directed field sampling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cowan, J.H., Jr.; Rose, K.A.

    1991-01-01

    We have used a bioenergetically-driven, individual-based model (IBM) of striped bass as a framework for synthesizing available information on population biology and quantifying, in a relative sense, factors that potentially affect year-class success. The IBM has been configured to simulate environmental conditions experienced by several striped bass populations: in the Potomac River, MD; the Hudson River, NY; the Santee-Cooper River System, SC; and the San Joaquin-Sacramento River System, CA. These sites represent extremes in the geographic distribution, and thus the environmental variability, of striped bass spawning. At each location, data describing the physicochemical and biological characteristics of the spawning population and nursery area are being collected and synthesized by means of a prioritized, directed field sampling program that is organized by the individual-based recruitment model. Here, we employ the striped bass IBM configured for the Potomac River, MD, from spawning into the larval period to evaluate the potential for maternal contribution to affect larva survival and growth. Model simulations in which the size distribution and spawning day of females are altered indicate that larva survival is enhanced (3.3-fold increase) when a high fraction of females in the spawning population are large. Larval stage duration is also shorter when large rather than small females are the mothers (mean = 18.4 d vs. 22.2 d). Although inconclusive, these preliminary results for Potomac River striped bass suggest that the effects of female size, timing of spawning and maternal contribution on recruitment dynamics are potentially important, and they illustrate our approach to the study of recruitment in striped bass. We hope to use the model, field collections and management alternatives that vary from site to site in an iterative manner for some time to come. 54 refs., 4 figs., 1 tab.

  14. Simulating effects of fire on northern Rocky Mountain landscapes with the ecological process model FIRE-BGC.

    PubMed

    Keane, R E; Ryan, K C; Running, S W

    1996-03-01

    A mechanistic, biogeochemical succession model, FIRE-BGC, was used to investigate the role of fire on long-term landscape dynamics in northern Rocky Mountain coniferous forests of Glacier National Park, Montana, USA. FIRE-BGC is an individual-tree model, created by merging the gap-phase process-based model FIRESUM with the mechanistic ecosystem biogeochemical model FOREST-BGC, that has mixed spatial and temporal resolution in its simulation architecture. Ecological processes that act at a landscape level, such as fire and seed dispersal, are simulated annually from stand and topographic information. Stand-level processes, such as tree establishment, growth and mortality, organic matter accumulation and decomposition, and undergrowth plant dynamics are simulated both daily and annually. Tree growth is mechanistically modeled based on the ecosystem process approach of FOREST-BGC, where carbon is fixed daily by forest canopy photosynthesis at the stand level. Carbon allocated to the tree stem at the end of the year generates the corresponding diameter and height growth. The model also explicitly simulates fire behavior and effects on landscape characteristics. We simulated the effects of fire on ecosystem characteristics of net primary productivity, evapotranspiration, standing crop biomass, nitrogen cycling and leaf area index over 200 years for the 50,000-ha McDonald Drainage in Glacier National Park. Results show increases in net primary productivity and available nitrogen when fires are included in the simulation. Standing crop biomass and evapotranspiration decrease under a fire regime. Shade-intolerant species dominate the landscape when fires are excluded. Model tree increment predictions compared well with field data.

  15. Modelling approaches: the case of schizophrenia.

    PubMed

    Heeg, Bart M S; Damen, Joep; Buskens, Erik; Caleo, Sue; de Charro, Frank; van Hout, Ben A

    2008-01-01

    Schizophrenia is a chronic disease characterized by periods of relative stability interrupted by acute episodes (or relapses). The course of the disease may vary considerably between patients. Patient histories show considerable inter- and even intra-individual variability. We provide a critical assessment of the advantages and disadvantages of three modelling techniques that have been used in schizophrenia: decision trees, (cohort and micro-simulation) Markov models and discrete event simulation models. These modelling techniques are compared in terms of building time, data requirements, medico-scientific experience, simulation time, clinical representation, and their ability to deal with patient heterogeneity, the timing of events, prior events, patient interaction, interaction between co-variates and variability (first-order uncertainty). We note that, depending on the research question, the optimal modelling approach should be selected based on the expected differences between the comparators, the number of co-variates, the number of patient subgroups, the interactions between co-variates, and simulation time. Finally, it is argued that in case micro-simulation is required for the cost-effectiveness analysis of schizophrenia treatments, a discrete event simulation model is best suited to accurately capture all of the relevant interdependencies in this chronic, highly heterogeneous disease with limited long-term follow-up data.
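
    Of the three techniques compared, the cohort Markov model is the simplest to state. The sketch below uses three illustrative health states and made-up transition probabilities; it is not a validated schizophrenia model:

```python
# Cohort Markov sketch: rows are current states, values are per-cycle
# transition probabilities (invented for illustration; each row sums to 1).
P = {
    "stable":  {"stable": 0.85, "relapse": 0.12, "dead": 0.03},
    "relapse": {"stable": 0.60, "relapse": 0.35, "dead": 0.05},
    "dead":    {"stable": 0.0,  "relapse": 0.0,  "dead": 1.0},
}
cohort = {"stable": 1000.0, "relapse": 0.0, "dead": 0.0}

for _ in range(20):  # yearly cycles, an assumption
    nxt = {state: 0.0 for state in P}
    for state, count in cohort.items():
        for target, prob in P[state].items():
            nxt[target] += count * prob
    cohort = nxt
```

    Because the cohort trace carries no individual histories, capturing state-dependent relapse risks or prior events pushes modellers toward micro-simulation or discrete event simulation, as the abstract argues.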

  16. Evaluation of a Stochastic Inactivation Model for Heat-Activated Spores of Bacillus spp.

    PubMed Central

    Corradini, Maria G.; Normand, Mark D.; Eisenberg, Murray; Peleg, Micha

    2010-01-01

    Heat activates the dormant spores of certain Bacillus spp., which is reflected in the “activation shoulder” in their survival curves. At the same time, heat also inactivates the already active and just activated spores, as well as those still dormant. A stochastic model based on progressively changing probabilities of activation and inactivation can describe this phenomenon. The model is presented in a fully probabilistic discrete form for individual and small groups of spores and as a semicontinuous deterministic model for large spore populations. The same underlying algorithm applies to both isothermal and dynamic heat treatments. Its construction does not require the assumption of the activation and inactivation kinetics or knowledge of their biophysical and biochemical mechanisms. A simplified version of the semicontinuous model was used to simulate survival curves with the activation shoulder that are reminiscent of experimental curves reported in the literature. The model is not intended to replace current models to predict dynamic inactivation but only to offer a conceptual alternative to their interpretation. Nevertheless, by linking the survival curve's shape to probabilities of events at the individual spore level, the model explains, and can be used to simulate, the irregular activation and survival patterns of individual and small groups of spores, which might be involved in food poisoning and spoilage. PMID:20453137
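
    The qualitative behaviour can be mimicked with a minimal discrete stochastic sketch. Unlike the authors' model, the probabilities below are held constant and are purely illustrative: dormant spores activate or die, active spores die, and the count of active spores first rises and then falls, reproducing the "activation shoulder":

```python
import random

random.seed(1)

# Per-step spore probabilities: activation, death while dormant,
# death while active (illustrative constants, not fitted values).
P_ACT, P_DIE_D, P_DIE_A = 0.15, 0.02, 0.10
dormant, active = 10_000, 0
active_history = []

for step in range(60):
    newly_active = sum(1 for _ in range(dormant) if random.random() < P_ACT)
    dead_dormant = sum(1 for _ in range(dormant - newly_active)
                       if random.random() < P_DIE_D)
    dead_active = sum(1 for _ in range(active) if random.random() < P_DIE_A)
    dormant -= newly_active + dead_dormant
    active += newly_active - dead_active
    active_history.append(active)

peak = max(active_history)  # the "activation shoulder" peak
```

    Tracking each spore individually like this is exactly what lets the model speak to the irregular behaviour of small spore groups.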

  17. Socialising Health Burden Through Different Network Topologies: A Simulation Study.

    PubMed

    Peacock, Adrian; Cheung, Anthony; Kim, Peter; Poon, Simon K

    2017-01-01

    An aging population and the expectation of premium-quality health services, combined with the increasing economic burden of the healthcare system, require a paradigm shift toward patient-oriented healthcare. The guardian angel theory described by Szolovits [1] explores the notion of enlisting patients as primary providers of information and motivation to patients with similar clinical histories through social connections. In this study, an agent-based model was developed to explore how individuals are affected through their levels of intrinsic positivity. Ring, point-to-point (paired buddy), and random networks were modelled, with individuals able to send messages to each other given their levels of positivity and motivation. Of the three modelled networks, the ring network provides the most equal, collective improvement in positivity and motivation for all users. Further study into other network topologies should be undertaken in the future.
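
    The equalising behaviour of the ring topology can be sketched as a diffusion process; this is an illustrative toy, not the study's message-passing model:

```python
import random

random.seed(0)

# Ring-topology sketch: each agent repeatedly moves toward the average of
# its two ring neighbours, so positivity diffuses evenly around the ring.
N, ROUNDS, RATE = 12, 50, 0.2   # invented network size and update rate
positivity = [random.uniform(0.0, 1.0) for _ in range(N)]
mean_before = sum(positivity) / N
spread_before = max(positivity) - min(positivity)

for _ in range(ROUNDS):
    updated = positivity[:]
    for i in range(N):
        left, right = positivity[(i - 1) % N], positivity[(i + 1) % N]
        updated[i] += RATE * ((left + right) / 2.0 - positivity[i])
    positivity = updated

spread_after = max(positivity) - min(positivity)
```

    The update conserves total positivity while shrinking the spread between agents, which is one way to read the "most equal, collective improvement" the ring network showed.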

  18. An analysis of intergroup rivalry using Ising model and reinforcement learning

    NASA Astrophysics Data System (ADS)

    Zhao, Feng-Fei; Qin, Zheng; Shao, Zhuo

    2014-01-01

    Modeling of intergroup rivalry can help us better understand economic competitions, political elections and other similar activities. The result of intergroup rivalry depends on the co-evolution of individual behavior within one group and the impact from the rival group. In this paper, we model the rivalry behavior using an Ising model. Unlike other simulation studies using the Ising model, the evolution rules of each individual in our model are not static: individuals have the ability to learn from historical experience using a reinforcement learning technique, which makes the simulation closer to real human behavior. We studied the phase transition in intergroup rivalry and focused on the impact of the degree of social freedom, the personality of group members and the social experience of individuals. The results of computer simulation show that a society with a low degree of social freedom and highly educated, experienced individuals is more likely to be one-sided in intergroup rivalry.
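
    The Ising part of this setup (without the reinforcement learning layer, which is omitted here) can be sketched with a standard Metropolis update: spins stand for individual opinions, temperature plays the role of social freedom, and |magnetization| measures how one-sided the group is. All parameters are illustrative:

```python
import math
import random

random.seed(3)

def magnetization(temp, n=20, sweeps=300):
    """Metropolis simulation of an n-by-n Ising lattice with periodic
    boundaries, started from a fully aligned state."""
    spins = [[1] * n for _ in range(n)]
    for _ in range(sweeps):
        for _ in range(n * n):
            i, j = random.randrange(n), random.randrange(n)
            nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
                  + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
            d_e = 2.0 * spins[i][j] * nb          # energy cost of flipping
            if d_e <= 0 or random.random() < math.exp(-d_e / temp):
                spins[i][j] *= -1
    return abs(sum(sum(row) for row in spins)) / (n * n)

m_rigid = magnetization(1.5)  # low "social freedom": stays one-sided
m_free = magnetization(5.0)   # high "social freedom": disorders
```

    Below the critical temperature the lattice stays ordered (one-sided), above it opinions decorrelate, which mirrors the paper's social-freedom finding.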

  19. Uncertainty in Simulating Wheat Yields Under Climate Change

    NASA Technical Reports Server (NTRS)

    Asseng, S.; Ewert, F.; Rosenzweig, Cynthia; Jones, J. W.; Hatfield, J. W.; Ruane, A. C.; Boote, K. J.; Thornburn, P. J.; Rotter, R. P.; Cammarano, D.; hide

    2013-01-01

    Projections of climate change impacts on crop yields are inherently uncertain [1]. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate [2]. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models [1,3] are difficult [4]. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.

  20. Redwing: A MOOSE application for coupling MPACT and BISON

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frederick N. Gleicher; Michael Rose; Tom Downar

    Fuel performance and whole-core neutron transport programs are often used to analyze fuel behavior as it is depleted in a reactor. For fuel performance programs, internal models provide the local intra-pin power density, fast neutron flux, burnup, and fission rate density, which are needed for a fuel performance analysis. These internal models have a number of limitations, including effects on the intra-pin power distribution from nearby assembly elements, such as water channels and control rods, and restricted applicability to a specified fuel type such as low-enriched UO2. In addition, whole-core neutron transport codes need an accurate intra-pin temperature distribution in order to calculate neutron cross sections. Fuel performance simulations are able to model the intra-pin fuel displacement as the fuel expands and densifies. These displacements must be accurately modeled in order to capture the eventual mechanical contact of the fuel and the clad; the correct radial gap width is needed for an accurate calculation of the temperature distribution of the fuel rod. Redwing is a MOOSE-based application that couples MPACT and BISON for combined neutron transport and fuel performance analysis. MPACT is a 3D neutron transport and reactor core simulator based on the method of characteristics (MOC). The development of MPACT began at the University of Michigan (UM), and it is now under the joint development of ORNL and UM as part of the DOE CASL Simulation Hub. MPACT is able to model the effects of local assembly elements and is able to calculate intra-pin quantities such as the local power density on a volumetric mesh for any fuel type. BISON is a fuel performance application of the Multiphysics Object Oriented Simulation Environment (MOOSE), which is under development at Idaho National Laboratory.
BISON is able to solve the nonlinearly coupled mechanical deformation and heat transfer finite element equations that model a fuel element as it is depleted in a nuclear reactor. Redwing couples BISON and MPACT in a single application. Redwing maps and transfers the individual intra-pin quantities such as fission rate density, power density, and fast neutron flux from the MPACT volumetric mesh to the individual BISON finite element meshes. For two-way coupling, Redwing maps and transfers the individual pin temperature field and axially dependent coolant densities from the BISON mesh to the MPACT volumetric mesh. Details of the mapping are given. Redwing advances the simulation with the MPACT solution for each depletion time step and then advances the multiple BISON simulations for fuel performance calculations. Sub-cycle advancement can be applied to the individual BISON simulations, allowing multiple time steps to be applied to the fuel performance simulations. Currently, only loose coupling is performed, in which data from the previous time step are applied to the current time step.

  1. Simulating anchovy's full life cycle in the northern Aegean Sea (eastern Mediterranean): A coupled hydro-biogeochemical-IBM model

    NASA Astrophysics Data System (ADS)

    Politikos, D.; Somarakis, S.; Tsiaras, K. P.; Giannoulaki, M.; Petihakis, G.; Machias, A.; Triantafyllou, G.

    2015-11-01

    A 3-D full life cycle population model for the North Aegean Sea (NAS) anchovy stock is presented. The model is two-way coupled with a hydrodynamic-biogeochemical model (POM-ERSEM). The anchovy life span is divided into seven life stages/age classes. Embryos and early larvae are passive particles, but subsequent stages exhibit active horizontal movements based on specific rules. A bioenergetics model simulates the growth in both the larval and juvenile/adult stages, while the microzooplankton and mesozooplankton fields of the biogeochemical model provide the food for fish consumption. The super-individual approach is adopted for the representation of the anchovy population. A dynamic egg production module, with an energy allocation algorithm, is embedded in the bioenergetics equation and produces eggs based on a new conceptual model for anchovy vitellogenesis. A model simulation for the period 2003-2006 with realistic initial conditions reproduced well the magnitude of population biomass and daily egg production estimated from acoustic and daily egg production method (DEPM) surveys, carried out in the NAS during June 2003-2006. Model simulated adult and egg habitats were also in good agreement with observed spatial distributions of acoustic biomass and egg abundance in June. Sensitivity simulations were performed to investigate the effect of different formulations adopted for key processes, such as reproduction and movement. The effect of the anchovy population on plankton dynamics was also investigated, by comparing simulations adopting a two-way or a one-way coupling of the fish with the biogeochemical model.
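
    The bioenergetics core of such models is a weight balance of assimilated food minus metabolic costs. The sketch below is generic; the allometric coefficients are invented and are not the NAS anchovy parameterization:

```python
# Generic weight-based bioenergetics sketch: growth = assimilated consumption
# minus metabolic costs, both allometric in body weight.
def grow(w0_g, food, days, dt=1.0):
    w = w0_g
    for _ in range(int(days / dt)):
        consumption = 0.3 * food * w ** 0.67         # g/day, assumed
        respiration = 0.05 * w ** 0.8                # g/day, assumed
        w += dt * (0.7 * consumption - respiration)  # 0.7 = assumed assimilation
    return w

w_high = grow(1.0, 1.0, 120)  # high food availability
w_low = grow(1.0, 0.4, 120)   # low food availability
```

    In the full model the `food` term comes from the biogeochemical model's zooplankton fields, which is what makes the two-way coupling meaningful.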

  2. A marketing approach to carpool demand analysis. Technical memorandum III. Tradeoff model and policy simulation. Conservation paper. [Commuter survey in 3 major urban areas]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1976-07-01

    The memorandum discusses the theoretical basis of the trade-off model and its adaptation, particularly in the simulation procedures used in evaluating specific policies. Two published articles dealing with the development and application of the trade-off model for market research are included as appendices to this memorandum. This model was the primary instrument used in a research effort examining the role of individuals' attitudes and perceptions in deciding whether or not to carpool. The research was based upon a survey of commuters in 3 major urban areas and has resulted in a sizeable new data base on respondents' socio-economic and worktrip characteristics, travel perceptions, and travel preferences. Research findings are contained in the Summary Report, also available through NTIS.

  3. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

    The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons.
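
    As a minimal taste of such single-neuron models, here is an Euler-integrated leaky integrate-and-fire neuron, a much-simplified cousin of the Hodgkin-Huxley models mentioned above; all parameters are illustrative:

```python
# Leaky integrate-and-fire neuron, forward-Euler integration.
def lif_spike_count(i_ext, t_ms=200.0, dt=0.1):
    tau, v_rest, v_thresh, v_reset, r_m = 10.0, -65.0, -50.0, -65.0, 10.0
    v, spikes = v_rest, 0
    for _ in range(int(t_ms / dt)):
        v += dt * (-(v - v_rest) + r_m * i_ext) / tau   # membrane equation
        if v >= v_thresh:                               # threshold crossing
            spikes += 1
            v = v_reset                                 # reset after spike
    return spikes

quiet = lif_spike_count(1.0)   # subthreshold drive: no spikes
driven = lif_spike_count(2.0)  # suprathreshold drive: regular firing
```

    Hodgkin-Huxley-type models replace the single leak term with voltage-gated sodium and potassium currents, but the integration loop has the same shape.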

  4. Statistical power calculations for mixed pharmacokinetic study designs using a population approach.

    PubMed

    Kloprogge, Frank; Simpson, Julie A; Day, Nicholas P J; White, Nicholas J; Tarning, Joel

    2014-09-01

    Simultaneous modelling of dense and sparse pharmacokinetic data is possible with a population approach. To determine the number of individuals required to detect the effect of a covariate, simulation-based power calculation methodologies can be employed. The Monte Carlo Mapped Power method (a simulation-based power calculation methodology using the likelihood ratio test) was extended in the current study to perform sample size calculations for mixed pharmacokinetic studies (i.e. both sparse and dense data collection). A workflow guiding an easy and straightforward pharmacokinetic study design, considering also the cost-effectiveness of alternative study designs, was used in this analysis. Initially, data were simulated for a hypothetical drug and then for the anti-malarial drug, dihydroartemisinin. Two datasets (sampling design A: dense; sampling design B: sparse) were simulated using a pharmacokinetic model that included a binary covariate effect and subsequently re-estimated using (1) the same model and (2) a model not including the covariate effect in NONMEM 7.2. Power calculations were performed for varying numbers of patients with sampling designs A and B. Study designs with statistical power >80% were selected and further evaluated for cost-effectiveness. The simulation studies of the hypothetical drug and the anti-malarial drug dihydroartemisinin demonstrated that the simulation-based power calculation methodology, based on the Monte Carlo Mapped Power method, can be utilised to evaluate and determine the sample size of mixed (part sparsely and part densely sampled) study designs. The developed method can contribute to the design of robust and efficient pharmacokinetic studies.
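
    The general logic of simulation-based power calculation can be sketched without NONMEM. Below, a plain known-variance z-test stands in for the likelihood ratio test of the Monte Carlo Mapped Power method, and all design numbers are invented:

```python
import math
import random

random.seed(7)

def simulated_power(n_per_arm, effect, sd=1.0, n_sim=2000):
    """Fraction of simulated two-arm trials that detect `effect`
    with a two-sided known-variance z-test at the 5% level."""
    z_crit = 1.959964
    hits = 0
    for _ in range(n_sim):
        a = [random.gauss(0.0, sd) for _ in range(n_per_arm)]
        b = [random.gauss(effect, sd) for _ in range(n_per_arm)]
        se = sd * math.sqrt(2.0 / n_per_arm)
        z = (sum(b) / n_per_arm - sum(a) / n_per_arm) / se
        hits += abs(z) > z_crit
    return hits / n_sim

power_small = simulated_power(10, 0.5)   # 10 patients per arm
power_large = simulated_power(100, 0.5)  # 100 patients per arm
```

    Sweeping `n_per_arm` until the estimated power crosses 80% gives the required sample size, the same workflow the abstract applies to mixed sparse/dense pharmacokinetic designs.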

  5. High resolution, MRI-based, segmented, computerized head phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zubal, I.G.; Harrell, C.R.; Smith, E.O.

    1999-01-01

    The authors have created a high-resolution software phantom of the human brain which is applicable to voxel-based radiation transport calculations yielding nuclear medicine simulated images and/or internal dose estimates. A software head phantom was created from 124 transverse MRI images of a healthy normal individual. The transverse T2 slices, recorded in a 256x256 matrix from a GE Signa 2 scanner, have isotropic voxel dimensions of 1.5 mm and were manually segmented by the clinical staff. Each voxel of the phantom contains one of 62 index numbers designating anatomical, neurological, and taxonomical structures. The result is stored as a 256x256x128 byte array. Internal volumes compare favorably to those described in the ICRP Reference Man. The computerized array represents a high-resolution model of a typical human brain and serves as a voxel-based anthropomorphic head phantom suitable for computer-based modeling and simulation calculations. It offers improved realism over previous mathematically described software brain phantoms, and creates a reference standard for comparing results of newly emerging voxel-based computations. Such voxel-based computations lead the way to developing diagnostic and dosimetry calculations which can utilize patient-specific diagnostic images. However, such individualized approaches lack fast, automatic segmentation schemes for routine use; therefore, this high-resolution, typical head geometry provides the most realistic patient model currently available.
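
    The voxel bookkeeping itself is straightforward. The sketch below builds a same-sized byte array, labels an invented structure with one index code, and recovers its volume from the 1.5 mm voxel size stated in the abstract:

```python
# Toy voxel phantom: one byte per voxel holds a tissue index code.
NX, NY, NZ = 256, 256, 128
phantom = bytearray(NX * NY * NZ)   # index 0 = background

def idx(x, y, z):
    """Row-major offset of voxel (x, y, z) in the flat array."""
    return (z * NY + y) * NX + x

# Pretend index 7 labels some segmented structure (invented extent).
for z in range(40, 90):
    for y in range(100, 160):
        for x in range(100, 140):
            phantom[idx(x, y, z)] = 7

voxel_ml = 0.15 ** 3   # 1.5 mm cube in cm^3 (= millilitres)
volume_ml = phantom.count(7) * voxel_ml
```

    Volume estimates like this are what the abstract compares against ICRP Reference Man values.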

  6. Markov modeling and discrete event simulation in health care: a systematic comparison.

    PubMed

    Standfield, Lachlan; Comans, Tracy; Scuffham, Paul

    2014-04-01

    The aim of this study was to assess if the use of Markov modeling (MM) or discrete event simulation (DES) for cost-effectiveness analysis (CEA) may alter healthcare resource allocation decisions. A systematic literature search and review of empirical and non-empirical studies comparing MM and DES techniques used in the CEA of healthcare technologies was conducted. Twenty-two pertinent publications were identified. Two publications compared MM and DES models empirically, one presented a conceptual DES and MM, two described a DES consensus guideline, and seventeen drew comparisons between MM and DES through the authors' experience. The primary advantages described for DES over MM were the ability to model queuing for limited resources, capture individual patient histories, accommodate complexity and uncertainty, represent time flexibly, model competing risks, and accommodate multiple events simultaneously. The disadvantages of DES over MM were the potential for model overspecification, increased data requirements, specialized expensive software, and increased model development, validation, and computational time. Where individual patient history is an important driver of future events an individual patient simulation technique like DES may be preferred over MM. Where supply shortages, subsequent queuing, and diversion of patients through other pathways in the healthcare system are likely to be drivers of cost-effectiveness, DES modeling methods may provide decision makers with more accurate information on which to base resource allocation decisions. Where these are not major features of the cost-effectiveness question, MM remains an efficient, easily validated, parsimonious, and accurate method of determining the cost-effectiveness of new healthcare interventions.
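
The memorylessness that the review identifies as MM's key limitation is easy to see in code: a Markov cohort model carries only the current state distribution forward, so no individual history can influence transitions. The three-state model below is a generic illustration; the transition probabilities, costs, and utilities are invented, not taken from any study in the review.

```python
import numpy as np

# A minimal three-state Markov cohort model (well / sick / dead).
P = np.array([[0.90, 0.08, 0.02],    # from well
              [0.00, 0.85, 0.15],    # from sick
              [0.00, 0.00, 1.00]])   # dead is absorbing
cost = np.array([100.0, 1500.0, 0.0])     # cost per cycle, per state
utility = np.array([0.95, 0.60, 0.0])     # QALY weight per cycle, per state

def run_cohort(n_cycles=40):
    """Accumulate discounted-free costs and QALYs for a cohort."""
    dist = np.array([1.0, 0.0, 0.0])      # whole cohort starts well
    total_cost = total_qaly = 0.0
    for _ in range(n_cycles):
        total_cost += dist @ cost
        total_qaly += dist @ utility
        dist = dist @ P                   # memoryless state transition
    return total_cost, total_qaly
```

Because `dist @ P` depends only on the current `dist`, patient history is invisible; a DES, by contrast, tracks each simulated patient's event record, which is exactly the trade-off the abstract describes.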

  7. Memory, not just perception, plays an important role in terrestrial mammalian migration

    PubMed Central

    Mueller, Thomas

    2017-01-01

    One of the key questions regarding the underlying mechanisms of mammalian land migrations is how animals select where to go. Most studies assume perception of resources as the navigational mechanism. The possible role of memory that would allow forecasting conditions at distant locations and times based on information about environmental conditions from previous years has been little studied. We study migrating zebra in Botswana using an individual-based simulation model, where perceptually guided individuals use currently sensed resources at different perceptual ranges, while memory-guided individuals use long-term averages of past resources to forecast future conditions. We compare simulated individuals guided by perception or memory on resource landscapes of remotely sensed vegetation data to trajectories of GPS-tagged zebras. Our results show that memory provides a clear signal that best directs migrants to their destination compared to perception at even the largest perceptual ranges. Zebras modelled with memory arrived two to four times, or up to 100 km, closer to the migration destination than those using perception. We suggest that memory in addition to perception is important for directing ungulate migration. Furthermore, our findings are important for the conservation of migratory mammals, as memory informing direction suggests migration routes could be relatively inflexible. PMID:28539516

  8. Memory, not just perception, plays an important role in terrestrial mammalian migration.

    PubMed

    Bracis, Chloe; Mueller, Thomas

    2017-05-31

    One of the key questions regarding the underlying mechanisms of mammalian land migrations is how animals select where to go. Most studies assume perception of resources as the navigational mechanism. The possible role of memory that would allow forecasting conditions at distant locations and times based on information about environmental conditions from previous years has been little studied. We study migrating zebra in Botswana using an individual-based simulation model, where perceptually guided individuals use currently sensed resources at different perceptual ranges, while memory-guided individuals use long-term averages of past resources to forecast future conditions. We compare simulated individuals guided by perception or memory on resource landscapes of remotely sensed vegetation data to trajectories of GPS-tagged zebras. Our results show that memory provides a clear signal that best directs migrants to their destination compared to perception at even the largest perceptual ranges. Zebras modelled with memory arrived two to four times, or up to 100 km, closer to the migration destination than those using perception. We suggest that memory in addition to perception is important for directing ungulate migration. Furthermore, our findings are important for the conservation of migratory mammals, as memory informing direction suggests migration routes could be relatively inflexible. © 2017 The Author(s).
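
The two movement rules contrasted in the abstracts above (currently sensed resources within a perceptual range versus long-term averages of past resources) can be sketched on a toy one-dimensional landscape. Everything below is illustrative: the landscape, the ranges, and the step rule are not the authors' model.

```python
import numpy as np

# 1-D resource landscape: several "past years" plus a noisy current year.
rng = np.random.default_rng(0)
n_cells = 200
history = rng.random((10, n_cells))                  # past years of resources
current = history[-1] + 0.3 * rng.random(n_cells)    # noisy current year
memory_map = history.mean(axis=0)                    # long-term average

def step(pos, guidance, perceptual_range=5):
    """Move one cell toward the best resource the agent can 'see':
    the current field (perception) or the long-term mean (memory)."""
    field = current if guidance == "perception" else memory_map
    lo = max(0, pos - perceptual_range)
    hi = min(n_cells, pos + perceptual_range + 1)
    target = lo + int(np.argmax(field[lo:hi]))
    return pos + int(np.sign(target - pos))
```

Iterating `step` for a perception-guided agent chases this year's noise, while a memory-guided agent climbs the stable long-term gradient, which is the intuition behind the paper's finding that memory directs migrants more reliably.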

  9. Process-Oriented Diagnostics of Tropical Cyclones in Global Climate Models

    NASA Astrophysics Data System (ADS)

    Moon, Y.; Kim, D.; Camargo, S. J.; Wing, A. A.; Sobel, A. H.; Bosilovich, M. G.; Murakami, H.; Reed, K. A.; Vecchi, G. A.; Wehner, M. F.; Zarzycki, C. M.; Zhao, M.

    2017-12-01

    Simulating tropical cyclone (TC) activity with global climate models (GCMs) remains a challenging problem. While some GCMs are able to simulate TC activity that is in good agreement with the observations, many other models exhibit strong biases. Decreasing horizontal grid spacing of the GCM simulations tends to improve the characteristics of simulated TCs, but this enhancement alone does not necessarily lead to greater skill in simulating TC activity. This study uses process-based diagnostics to identify model characteristics that could explain why some GCM simulations are able to produce more realistic TC activity than others. The diagnostics examine how convection, moisture, clouds and related processes are coupled at individual grid points, which yields useful information into how convective parameterizations interact with resolved model dynamics. These diagnostics share similarities with those originally developed to examine the Madden-Julian Oscillations in climate models. This study will examine TCs in eight different GCM simulations performed at NOAA/GFDL, NCAR and NASA that have different horizontal resolutions and ocean coupling. Preliminary results suggest that stronger TCs are closely associated with greater rainfall - thus greater diabatic heating - in the inner-core regions of the storms, which is consistent with previous theoretical studies. Other storm characteristics that can be used to infer why GCM simulations with comparable horizontal grid spacings produce different TC activity will be examined.

  10. How exactly can computer simulation predict the kinematics and contact status after TKA? Examination in individualized models.

    PubMed

    Tanaka, Yoshihisa; Nakamura, Shinichiro; Kuriyama, Shinichi; Ito, Hiromu; Furu, Moritoshi; Komistek, Richard D; Matsuda, Shuichi

    2016-11-01

    It is unknown whether a computer simulation with simple models can estimate individual in vivo knee kinematics, although some complex models have predicted knee kinematics. The purposes of this study were, first, to validate the accuracy of the computer simulation with our developed model during a squatting activity in a weight-bearing deep knee bend and, second, to analyze the contact area and the contact stress of the tri-condylar implants for individual patients. We compared the anteroposterior (AP) contact positions of the medial and lateral condyles calculated by the computer simulation program with the positions measured from the fluoroscopic analysis for three implanted knees. Then the contact area and the stress, including the third condyle, were calculated individually using finite element (FE) analysis. The motion patterns were similar in the simulation program and the fluoroscopic measurements. Our developed model could closely estimate the individual in vivo knee kinematics. The mean and maximum differences of the AP contact positions were 1.0 mm and 2.5 mm, respectively. At 120° of knee flexion, the contact area at the third condyle was wider than at either of the other condyles. The mean maximum contact stress at the third condyle was lower than at the other condyles at 90° and 120° of knee flexion. Individual bone models are required to estimate in vivo knee kinematics in our simple model. The tri-condylar implant appears to be safe for deep flexion activities due to its wide contact area and low contact stress. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Simulating imaging spectrometer data of a mixed old-growth forest: A parameterization of a 3D radiative transfer model based on airborne and terrestrial laser scanning

    NASA Astrophysics Data System (ADS)

    Schneider, F. D.; Leiterer, R.; Morsdorf, F.; Gastellu-Etchegorry, J.; Lauret, N.; Pfeifer, N.; Schaepman, M. E.

    2013-12-01

    Remote sensing offers unique potential to study forest ecosystems by providing spatially and temporally distributed information that can be linked with key biophysical and biochemical variables. The estimation of biochemical constituents of leaves from remotely sensed data is of high interest revealing insight on photosynthetic processes, plant health, plant functional types, and speciation. However, the scaling of observations at the canopy level to the leaf level or vice versa is not trivial due to the structural complexity of forests. Thus, a common solution for scaling spectral information is the use of physically-based radiative transfer models. The discrete anisotropic radiative transfer model (DART), being one of the most complete coupled canopy-atmosphere 3D radiative transfer models, was parameterized based on airborne and in-situ measurements. At-sensor radiances were simulated and compared with measurements from an airborne imaging spectrometer. The study was performed on the Laegern site, a temperate mixed forest characterized by steep slopes, a heterogeneous spectral background, and deciduous and coniferous trees at different development stages (dominated by beech trees; 47°28'42.0' N, 8°21'51.8' E, 682 m asl, Switzerland). It is one of the few studies conducted on an old-growth forest. Particularly the 3D modeling of the complex canopy architecture is crucial to model the interaction of photons with the vegetation canopy and its background. Thus, we developed two forest reconstruction approaches: 1) based on a voxel grid, and 2) based on individual tree detection. Both methods are transferable to various forest ecosystems and applicable at scales between plot and landscape. Our results show that the newly developed voxel grid approach is favorable over a parameterization based on individual trees. 
In comparison to the actual imaging spectrometer data, the simulated images exhibit very similar spatial patterns, whereas absolute radiance values are partially differing depending on the respective wavelength. We conclude that our proposed method provides a representation of the 3D radiative regime within old-growth forests that is suitable for simulating most spectral and spatial features of imaging spectrometer data. It indicates the potential of simulating future Earth observation missions, such as ESA's Sentinel-2. However, the high spectral variability of leaf optical properties among species has to be addressed in future radiative transfer modeling. The results further reveal that research emphasis has to be put on the accurate parameterization of small-scale structures, such as the clumping of needles into shoots or the distribution of leaf angles.
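
The voxel-grid reconstruction favored by the study reduces, at its core, to binning laser-scanning returns into a regular 3-D grid so that per-voxel point density can drive the radiative transfer parameterization. The sketch below shows only that binning step, with an invented grid size and synthetic points; it is not the authors' DART parameterization.

```python
import numpy as np

def voxelize(points, voxel_size=1.0, extent=50.0):
    """Bin (x, y, z) laser returns into a cubic voxel grid and count
    returns per voxel as a crude plant-density proxy."""
    n = int(extent / voxel_size)
    grid = np.zeros((n, n, n), dtype=np.int32)
    idx = np.clip((points / voxel_size).astype(int), 0, n - 1)
    np.add.at(grid, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)  # unbuffered count
    return grid

pts = np.random.default_rng(1).random((10_000, 3)) * 50.0   # fake returns
grid = voxelize(pts)
```

`np.add.at` is used instead of plain fancy-indexed addition so that voxels hit by multiple points are counted correctly.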

  12. The management submodel of the Wind Erosion Prediction System

    USDA-ARS?s Scientific Manuscript database

    The Wind Erosion Prediction System (WEPS) is a process-based, daily time-step, computer model that predicts soil erosion via simulation of the physical processes controlling wind erosion. WEPS is comprised of several individual modules (submodels) that reflect different sets of physical processes, ...

  13. HOW POPULATION STRUCTURE SHAPES NEIGHBORHOOD SEGREGATION*

    PubMed Central

    Bruch, Elizabeth E.

    2014-01-01

    This study investigates how choices about social affiliation based on one attribute can exacerbate or attenuate segregation on another correlated attribute. The specific application is the role of racial and economic factors in generating patterns of racial residential segregation. I identify three population parameters—between-group inequality, within-group inequality, and relative group size—that determine how income inequality between race groups affects racial segregation. I use data from the Panel Study of Income Dynamics to estimate models of individual-level residential mobility, and incorporate these estimates into agent-based models. I then simulate segregation dynamics under alternative assumptions about: (1) the relative size of minority groups; and (2) the degree of correlation between race and income among individuals. I find that income inequality can have offsetting effects at the high and low ends of the income distribution. I demonstrate the empirical relevance of the simulation results using fixed-effects, metro-level regressions applied to 1980-2000 U.S. Census data. PMID:25009360
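
The core mechanism in the abstract above — affiliation choices based on one attribute producing segregation on a correlated attribute — can be shown in a few lines: if location choice depends only on income, any race-income correlation yields racial segregation as a by-product. All parameters below (group sizes, the 1-SD between-group income gap, four neighborhoods) are invented for illustration.

```python
import numpy as np

def dissimilarity(race, hood):
    """Index of dissimilarity between two race groups across neighborhoods."""
    d = 0.0
    for h in np.unique(hood):
        in_h = hood == h
        d += abs((race[in_h] == 1).sum() / max((race == 1).sum(), 1)
                 - (race[in_h] == 0).sum() / max((race == 0).sum(), 1))
    return d / 2

rng = np.random.default_rng(2)
n = 2000
race = rng.integers(0, 2, n)
income = rng.normal(0.0, 1.0, n) + 1.0 * race        # between-group inequality
# sort agents into 4 equal-size neighborhoods purely by income rank
hood = (np.argsort(np.argsort(income)) * 4) // n
```

With the 1-SD income gap, income-only sorting produces a dissimilarity index around 0.4, far above the near-zero value for random assignment, which is the offsetting-effects machinery the paper then explores in richer agent-based models.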

  14. Calculation of Individual Tree Water Use in a Bornean Tropical Rain Forest Using Individual-Based Dynamic Vegetation Model SEIB-DGVM

    NASA Astrophysics Data System (ADS)

    Nakai, T.; Kumagai, T.; Saito, T.; Matsumoto, K.; Kume, T.; Nakagawa, M.; Sato, H.

    2015-12-01

    Bornean tropical rain forests are among the world's moistest biomes, with abundant rainfall throughout the year, and are considered vulnerable to changes in the rainfall regime; e.g., high tree mortality was reported in such forests following the severe drought associated with the 1997-1998 ENSO event. To assess the risk that future climate change poses to the eco-hydrology of such tropical rain forests, it is important to understand water use at the level of individual trees, because vulnerability and mortality under climate change can depend on tree size. We therefore refined the Spatially Explicit Individual-Based Dynamic Global Vegetation Model (SEIB-DGVM) so that transpiration and its stomatal control are calculated for each individual tree. Using this model, we simulated the transpiration of each tree and its dependence on DBH size, and successfully reproduced measured sap-flow and eddy covariance flux data from a Bornean lowland tropical rain forest in Lambir Hills National Park, Sarawak, Malaysia.

  15. Evaluating Temporal Factors in Combined Interventions of Workforce Shift and School Closure for Mitigating the Spread of Influenza

    PubMed Central

    Zhang, Tianyou; Fu, Xiuju; Ma, Stefan; Xiao, Gaoxi; Wong, Limsoon; Kwoh, Chee Keong; Lees, Michael; Lee, Gary Kee Khoon; Hung, Terence

    2012-01-01

    Background It is believed that combined interventions may be more effective than individual interventions in mitigating an epidemic. However, there is a lack of quantitative studies on the performance of combinations of individual interventions under different temporal settings. Methodology/Principal Findings To better understand the problem, we develop an individual-based simulation model running on top of contact networks based on real-life contact data in Singapore. We model and evaluate the spread of an influenza epidemic under the intervention strategies of workforce shift and its combination with school closure, and examine the impacts of temporal factors, namely the trigger threshold and the duration of an intervention. By comparing simulation results for intervention scenarios with different temporal factors, we find that combined interventions do not always outperform individual interventions and are more effective only when the duration is longer than 6 weeks or school closure is triggered at the 5% threshold; combined interventions may be more effective if school closure starts first when the duration is less than 4 weeks or workforce shift starts first when the duration is longer than 4 weeks. Conclusions/Significance We therefore conclude that identifying the appropriate timing configuration is crucial for achieving optimal or near-optimal performance in mitigating the spread of an influenza epidemic. The results of this study are useful to policy makers in deliberating and planning individual and combined interventions. PMID:22403634
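
The trigger-threshold and duration machinery evaluated above can be illustrated with a much simpler stochastic SIR model: an intervention that halves the contact rate fires once cumulative incidence crosses a threshold and lasts a fixed number of days. This is a stylized stand-in, not the authors' contact-network model, and all rates are invented.

```python
import numpy as np

def run_epidemic(trigger=0.05, duration_days=42, beta=0.3, gamma=0.1,
                 n=10_000, seed=3):
    """Stochastic discrete-day SIR with a one-shot, fixed-duration
    intervention; returns the final attack rate."""
    rng = np.random.default_rng(seed)
    s, i, r = n - 10, 10, 0
    fired, days_left = False, 0
    while i > 0:
        if not fired and (i + r) / n >= trigger and duration_days > 0:
            fired, days_left = True, duration_days    # trigger once
        active = days_left > 0
        b = beta * 0.5 if active else beta            # halve contacts
        new_i = rng.binomial(s, 1.0 - np.exp(-b * i / n))
        new_r = rng.binomial(i, gamma)
        s, i, r = s - new_i, i + new_i - new_r, r + new_r
        if active:
            days_left -= 1
    return r / n
```

Sweeping `trigger` and `duration_days` in such a loop is the basic experiment design behind the paper's comparison of timing configurations.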

  16. An agent-based stochastic Occupancy Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yixing; Hong, Tianzhen; Luo, Xuan

    Occupancy has significant impacts on building performance. However, in current building performance simulation programs, occupancy inputs are static and lack diversity, contributing to discrepancies between the simulated and actual building performance. This work presents an Occupancy Simulator that simulates the stochastic behavior of occupant presence and movement in buildings, capturing the spatial and temporal occupancy diversity. Each occupant and each space in the building are explicitly simulated as an agent with their profiles of stochastic behaviors. The occupancy behaviors are represented with three types of models: (1) the status transition events (e.g., first arrival in office) simulated with probability distribution model, (2) the random moving events (e.g., from one office to another) simulated with a homogeneous Markov chain model, and (3) the meeting events simulated with a new stochastic model. A hierarchical data model was developed for the Occupancy Simulator, which reduces the amount of data input by using the concepts of occupant types and space types. Finally, a case study of a small office building is presented to demonstrate the use of the Simulator to generate detailed annual sub-hourly occupant schedules for individual spaces and the whole building. The Simulator is a web application freely available to the public and capable of performing a detailed stochastic simulation of occupant presence and movement in buildings. Future work includes enhancements in the meeting event model, consideration of personal absent days, verification and validation of the simulated occupancy results, and expansion for use with residential buildings.

  17. An agent-based stochastic Occupancy Simulator

    DOE PAGES

    Chen, Yixing; Hong, Tianzhen; Luo, Xuan

    2017-06-01

    Occupancy has significant impacts on building performance. However, in current building performance simulation programs, occupancy inputs are static and lack diversity, contributing to discrepancies between the simulated and actual building performance. This work presents an Occupancy Simulator that simulates the stochastic behavior of occupant presence and movement in buildings, capturing the spatial and temporal occupancy diversity. Each occupant and each space in the building are explicitly simulated as an agent with their profiles of stochastic behaviors. The occupancy behaviors are represented with three types of models: (1) the status transition events (e.g., first arrival in office) simulated with probability distribution model, (2) the random moving events (e.g., from one office to another) simulated with a homogeneous Markov chain model, and (3) the meeting events simulated with a new stochastic model. A hierarchical data model was developed for the Occupancy Simulator, which reduces the amount of data input by using the concepts of occupant types and space types. Finally, a case study of a small office building is presented to demonstrate the use of the Simulator to generate detailed annual sub-hourly occupant schedules for individual spaces and the whole building. The Simulator is a web application freely available to the public and capable of performing a detailed stochastic simulation of occupant presence and movement in buildings. Future work includes enhancements in the meeting event model, consideration of personal absent days, verification and validation of the simulated occupancy results, and expansion for use with residential buildings.
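
Of the three model types listed in the abstract, the "random moving events" component is a homogeneous Markov chain over spaces, which is easy to sketch. The space list and transition probabilities below are invented for illustration; only the modeling idea follows the abstract.

```python
import numpy as np

# Occupant location over a workday as a homogeneous Markov chain.
spaces = ["own office", "other office", "meeting room", "corridor"]
P = np.array([[0.85, 0.05, 0.05, 0.05],
              [0.30, 0.55, 0.05, 0.10],
              [0.20, 0.05, 0.70, 0.05],
              [0.40, 0.20, 0.10, 0.30]])   # each row sums to 1

def simulate_day(steps=96, start=0, seed=4):
    """One occupant's trace over 96 fifteen-minute steps."""
    rng = np.random.default_rng(seed)
    loc, trace = start, []
    for _ in range(steps):
        loc = int(rng.choice(len(spaces), p=P[loc]))  # next space
        trace.append(loc)
    return trace
```

Running one such chain per occupant, with per-type transition matrices, yields the diverse sub-hourly schedules the Simulator produces; the status-transition and meeting events would be layered on top.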

  18. A hybrid agent-based approach for modeling microbiological systems.

    PubMed

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

    Models for systems biology commonly adopt differential equation or agent-based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the multi-agent approach often use directly translated, quantitatively less precise if-then rule constructs. We propose an extendible systems model based on a hybrid agent-based approach where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10³ cells and 1.2×10⁶ molecules. The model produces cell migration patterns that are comparable to laboratory observations.
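
The hybrid representation described above — discrete cell agents following behavioral rules over a continuous molecular field — can be sketched in miniature. Here the attractant field is a fixed 1-D gradient and the cell rule is a biased random walk; the sizes and rates are invented and the real model's differential-equation field dynamics are omitted.

```python
import numpy as np

# Cells are discrete agents on a 1-D grid; the chemoattractant is a
# continuous quantity defined per grid cell.
n_cells, width, steps = 50, 100, 200
field = np.linspace(0.0, 1.0, width)        # static attractant gradient
rng = np.random.default_rng(5)
cells = rng.integers(0, width, n_cells)     # agent positions

for _ in range(steps):
    # agent rule: biased random walk up the locally sensed gradient
    left = np.clip(cells - 1, 0, width - 1)
    right = np.clip(cells + 1, 0, width - 1)
    bias = np.where(field[right] > field[left], 0.7, 0.3)
    move = np.where(rng.random(n_cells) < bias, 1, -1)
    cells = np.clip(cells + move, 0, width - 1)
```

After a few hundred steps the agent population accumulates at the high-attractant end, the qualitative chemotaxis pattern the paper compares against laboratory assays.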

  19. Combining area-based and individual-level data in the geostatistical mapping of late-stage cancer incidence.

    PubMed

    Goovaerts, Pierre

    2009-01-01

    This paper presents a geostatistical approach to incorporate individual-level data (e.g. patient residences) and area-based data (e.g. rates recorded at census tract level) into the mapping of late-stage cancer incidence, with an application to breast cancer in three Michigan counties. Spatial trends in cancer incidence are first estimated from census data using area-to-point binomial kriging. This prior model is then updated using indicator kriging and individual-level data. Simulation studies demonstrate the benefits of this two-step approach over methods (kernel density estimation and indicator kriging) that process only residence data.

  20. PSAMM: A Portable System for the Analysis of Metabolic Models

    PubMed Central

    Steffensen, Jon Lund; Dufault-Thompson, Keith; Zhang, Ying

    2016-01-01

    The genome-scale models of metabolic networks have been broadly applied in phenotype prediction, evolutionary reconstruction, community functional analysis, and metabolic engineering. Despite the development of tools that support individual steps along the modeling procedure, it is still difficult to associate mathematical simulation results with the annotation and biological interpretation of metabolic models. In order to solve this problem, here we developed a Portable System for the Analysis of Metabolic Models (PSAMM), a new open-source software package that supports the integration of heterogeneous metadata in model annotations and provides a user-friendly interface for the analysis of metabolic models. PSAMM is independent of paid software environments like MATLAB, and all its dependencies are freely available for academic users. Compared to existing tools, PSAMM significantly reduced the running time of constraint-based analysis and enabled flexible settings of simulation parameters using simple one-line commands. The integration of heterogeneous, model-specific annotation information in PSAMM is achieved with a novel format of YAML-based model representation, which has several advantages, such as providing a modular organization of model components and simulation settings, enabling model version tracking, and permitting the integration of multiple simulation problems. PSAMM also includes a number of quality checking procedures to examine stoichiometric balance and to identify blocked reactions. Applying PSAMM to 57 models collected from current literature, we demonstrated how the software can be used for managing and simulating metabolic models. We identified a number of common inconsistencies in existing models and constructed an updated model repository to document the resolution of these inconsistencies. PMID:26828591

  1. Kinetic modelling of anaerobic hydrolysis of solid wastes, including disintegration processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    García-Gen, Santiago; Sousbie, Philippe; Rangaraj, Ganesh

    2015-01-15

    Highlights: • Fractionation of solid wastes into readily and slowly biodegradable fractions. • Kinetic coefficients estimation from mono-digestion batch assays. • Validation of kinetic coefficients with a co-digestion continuous experiment. • Simulation of batch and continuous experiments with an ADM1-based model. - Abstract: A methodology to estimate disintegration and hydrolysis kinetic parameters of solid wastes and validate an ADM1-based anaerobic co-digestion model is presented. Kinetic parameters of the model were calibrated from batch reactor experiments treating individually fruit and vegetable wastes (among other residues) following a new protocol for batch tests. In addition, decoupled disintegration kinetics for readily and slowly biodegradable fractions of solid wastes was considered. Calibrated parameters from batch assays of individual substrates were used to validate the model for a semi-continuous co-digestion operation treating simultaneously 5 fruit and vegetable wastes. The semi-continuous experiment was carried out in a lab-scale CSTR reactor for 15 weeks at organic loading rate ranging between 2.0 and 4.7 g VS/L d. The model (built in Matlab/Simulink) fit to a large extent the experimental results in both batch and semi-continuous mode and served as a powerful tool to simulate the digestion or co-digestion of solid wastes.
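
The decoupled-disintegration idea highlighted above amounts to splitting the substrate into two pools, each decaying with its own first-order rate constant. The sketch below integrates that two-pool system with explicit Euler steps; the fractions and rate constants are illustrative values, not the paper's calibrated ADM1 parameters.

```python
import numpy as np

def degradation_curve(total_vs=10.0, f_ready=0.6, k_ready=0.5,
                      k_slow=0.05, days=60, dt=0.1):
    """Explicit-Euler integration of dX/dt = -k X for the readily and
    slowly biodegradable pools; returns remaining substrate over time."""
    x_r = f_ready * total_vs            # readily biodegradable pool
    x_s = (1.0 - f_ready) * total_vs    # slowly biodegradable pool
    remaining = []
    for _ in range(int(days / dt)):
        x_r += -k_ready * x_r * dt      # fast first-order decay
        x_s += -k_slow * x_s * dt       # slow first-order decay
        remaining.append(x_r + x_s)
    return np.array(remaining)
```

Fitting `f_ready`, `k_ready`, and `k_slow` to mono-digestion batch data is, in simplified form, the calibration step the methodology describes before validating against the semi-continuous co-digestion run.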

  2. Simulations in Cyber-Security: A Review of Cognitive Modeling of Network Attackers, Defenders, and Users

    PubMed Central

    Veksler, Vladislav D.; Buchler, Norbou; Hoffman, Blaine E.; Cassenti, Daniel N.; Sample, Char; Sugrim, Shridat

    2018-01-01

    Computational models of cognitive processes may be employed in cyber-security tools, experiments, and simulations to address human agency and effective decision-making in keeping computational networks secure. Cognitive modeling can address multi-disciplinary cyber-security challenges requiring cross-cutting approaches over the human and computational sciences such as the following: (a) adversarial reasoning and behavioral game theory to predict attacker subjective utilities and decision likelihood distributions, (b) human factors of cyber tools to address human system integration challenges, estimation of defender cognitive states, and opportunities for automation, (c) dynamic simulations involving attacker, defender, and user models to enhance studies of cyber epidemiology and cyber hygiene, and (d) training effectiveness research and training scenarios to address human cyber-security performance, maturation of cyber-security skill sets, and effective decision-making. Models may be initially constructed at the group-level based on mean tendencies of each subject's subgroup, based on known statistics such as specific skill proficiencies, demographic characteristics, and cultural factors. For more precise and accurate predictions, cognitive models may be fine-tuned to each individual attacker, defender, or user profile, and updated over time (based on recorded behavior) via techniques such as model tracing and dynamic parameter fitting. PMID:29867661

  3. Instructional design affects the efficacy of simulation-based training in central venous catheterization.

    PubMed

    Craft, Christopher; Feldon, David F; Brown, Eric A

    2014-05-01

    Simulation-based learning is a common educational tool in health care training and frequently involves instructional designs based on Experiential Learning Theory (ELT). However, little research explores the effectiveness and efficiency of different instructional design methodologies appropriate for simulations. The aim of this study was to compare 2 instructional design models, ELT and Guided Experiential Learning (GEL), to determine which is more effective for training the central venous catheterization procedure. Using a quasi-experimental randomized block design, nurse anesthetists completed training under 1 of the 2 instructional design models. Performance was assessed using a checklist of central venous catheterization performance, pass rates, and critical action errors. Participants in the GEL condition performed significantly better than those in the ELT condition on the overall checklist score after controlling for individual practice time (F[1, 29] = 4.021, P = .027, Cohen's d = .71), had higher pass rates (P = .006, Cohen's d = 1.15), and had lower rates of failure due to critical action errors (P = .038, Cohen's d = .81). The GEL model of instructional design is significantly more effective than ELT for simulation-based learning of the central venous catheterization procedure, yielding large differences in effect size. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. A Poisson approach to the validation of failure time surrogate endpoints in individual patient data meta-analyses.

    PubMed

    Rotolo, Federico; Paoletti, Xavier; Burzykowski, Tomasz; Buyse, Marc; Michiels, Stefan

    2017-01-01

    Surrogate endpoints are often used in clinical trials instead of well-established hard endpoints for practical convenience. The meta-analytic approach relies on two measures of surrogacy: one at the individual level and one at the trial level. In the survival data setting, a two-step model based on copulas is commonly used. We present a new approach which employs a bivariate survival model with an individual random effect shared between the two endpoints and correlated treatment-by-trial interactions. We fit this model using auxiliary mixed Poisson models. We study via simulations the operating characteristics of this mixed Poisson approach as compared to the two-step copula approach. We illustrate the application of the methods on two individual patient data meta-analyses in gastric cancer, in the advanced setting (4069 patients from 20 randomized trials) and in the adjuvant setting (3288 patients from 14 randomized trials).

  5. INDIVIDUAL BASED MODELLING APPROACH TO THERMAL ...

    EPA Pesticide Factsheets

    Diadromous fish populations in the Pacific Northwest face challenges along their migratory routes from declining habitat quality, harvest, and barriers to longitudinal connectivity. Changes in river temperature regimes are producing an additional challenge for upstream migrating adult salmon and steelhead, species that are sensitive to absolute and cumulative thermal exposure. Adult salmon populations have been shown to utilize cold water patches along migration routes when mainstem river temperatures exceed thermal optimums. We are employing an individual based model (IBM) to explore the costs and benefits of spatially-distributed cold water refugia for adult migrating salmon. Our model, developed in the HexSim platform, is built around a mechanistic behavioral decision tree that drives individual interactions with their spatially explicit simulated environment. Population-scale responses to dynamic thermal regimes, coupled with other stressors such as disease and harvest, become emergent properties of the spatial IBM. Other model outputs include arrival times, species-specific survival rates, body energetic content, and reproductive fitness levels. Here, we discuss the challenges associated with parameterizing an individual based model of salmon and steelhead in a section of the Columbia River. Many rivers and streams in the Pacific Northwest are currently listed as impaired under the Clean Water Act as a result of high summer water temperatures. Adverse effec

  6. Simulating the Distribution of Individual Livestock Farms and Their Populations in the United States: An Example Using Domestic Swine (Sus scrofa domesticus) Farms

    PubMed Central

    Garza, Sarah J.; Miller, Ryan S.

    2015-01-01

    Livestock distribution in the United States (U.S.) can only be mapped at a county-level or worse resolution. We developed a spatial microsimulation model called the Farm Location and Agricultural Production Simulator (FLAPS) that simulated the distribution and populations of individual livestock farms throughout the conterminous U.S. Using domestic pigs (Sus scrofa domesticus) as an example species, we customized iterative proportional-fitting algorithms for the hierarchical structure of the U.S. Census of Agriculture and imputed unpublished state- or county-level livestock population totals that were redacted to ensure confidentiality. We used a weighted sampling design to collect data on the presence and absence of farms and used them to develop a national-scale distribution model that predicted the distribution of individual farms at a 100 m resolution. We implemented microsimulation algorithms that simulated the populations and locations of individual farms using output from our imputed Census of Agriculture dataset and distribution model. Approximately 19% of county-level pig population totals were unpublished in the 2012 Census of Agriculture and needed to be imputed. Using aerial photography, we confirmed the presence or absence of livestock farms at 10,238 locations and found livestock farms were correlated with open areas, cropland, and roads, and also areas with cooler temperatures and gentler topography. The distribution of swine farms was highly variable, but cross-validation of our distribution model produced an area under the receiver-operating characteristics curve value of 0.78, which indicated good predictive performance. Verification analyses showed FLAPS accurately imputed and simulated Census of Agriculture data based on absolute percent difference values of < 0.01% at the state-to-national scale, 3.26% for the county-to-state scale, and 0.03% for the individual farm-to-county scale. 
Our output data have many applications for risk management of agricultural systems including epidemiological studies, food safety, biosecurity issues, emergency-response planning, and conflicts between livestock and other natural resources. PMID:26571497
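    The iterative proportional-fitting step this record describes can be sketched generically. This is not the FLAPS code; the seed matrix and marginal totals below are invented for illustration:

```python
import numpy as np

def ipf(seed, row_totals, col_totals, tol=1e-8, max_iter=1000):
    """Iterative proportional fitting: rescale a positive seed matrix so
    its row and column sums match known marginal totals."""
    m = seed.astype(float).copy()
    for _ in range(max_iter):
        m *= (row_totals / m.sum(axis=1))[:, None]   # match row sums
        m *= (col_totals / m.sum(axis=0))[None, :]   # match column sums
        if np.allclose(m.sum(axis=1), row_totals, atol=tol):
            break
    return m

# Toy example: allocate a state-level pig total across two counties
# (rows) and two farm-size classes (columns); all numbers are made up.
seed = np.array([[1.0, 2.0], [3.0, 4.0]])
fitted = ipf(seed,
             row_totals=np.array([40.0, 60.0]),
             col_totals=np.array([30.0, 70.0]))
```

The same idea extends to the hierarchical national/state/county structure of the Census of Agriculture by fitting each level's table against the published totals of the level above.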

  7. Simulating the Distribution of Individual Livestock Farms and Their Populations in the United States: An Example Using Domestic Swine (Sus scrofa domesticus) Farms.

    PubMed

    Burdett, Christopher L; Kraus, Brian R; Garza, Sarah J; Miller, Ryan S; Bjork, Kathe E

    2015-01-01

    Livestock distribution in the United States (U.S.) can only be mapped at a county-level or worse resolution. We developed a spatial microsimulation model called the Farm Location and Agricultural Production Simulator (FLAPS) that simulated the distribution and populations of individual livestock farms throughout the conterminous U.S. Using domestic pigs (Sus scrofa domesticus) as an example species, we customized iterative proportional-fitting algorithms for the hierarchical structure of the U.S. Census of Agriculture and imputed unpublished state- or county-level livestock population totals that were redacted to ensure confidentiality. We used a weighted sampling design to collect data on the presence and absence of farms and used them to develop a national-scale distribution model that predicted the distribution of individual farms at a 100 m resolution. We implemented microsimulation algorithms that simulated the populations and locations of individual farms using output from our imputed Census of Agriculture dataset and distribution model. Approximately 19% of county-level pig population totals were unpublished in the 2012 Census of Agriculture and needed to be imputed. Using aerial photography, we confirmed the presence or absence of livestock farms at 10,238 locations and found livestock farms were correlated with open areas, cropland, and roads, and also areas with cooler temperatures and gentler topography. The distribution of swine farms was highly variable, but cross-validation of our distribution model produced an area under the receiver-operating characteristics curve value of 0.78, which indicated good predictive performance. Verification analyses showed FLAPS accurately imputed and simulated Census of Agriculture data based on absolute percent difference values of < 0.01% at the state-to-national scale, 3.26% for the county-to-state scale, and 0.03% for the individual farm-to-county scale. 
Our output data have many applications for risk management of agricultural systems including epidemiological studies, food safety, biosecurity issues, emergency-response planning, and conflicts between livestock and other natural resources.

  8. Dengue fever spreading based on probabilistic cellular automata with two lattices

    NASA Astrophysics Data System (ADS)

    Pereira, F. M. M.; Schimit, P. H. T.

    2018-06-01

    Modeling and simulation of mosquito-borne diseases have gained attention due to a growing incidence in tropical countries in the past few years. Here, we study dengue spreading in a population modeled by cellular automata, where there are two lattices to model the human-mosquito interaction: one lattice for human individuals, and one lattice for mosquitoes, in order to enable different dynamics in the two populations. The disease considered is dengue fever with one, two or three different serotypes coexisting in the population. Although many regions exhibit the incidence of only one serotype, here we set out a complete framework to also study the occurrence of two and three serotypes at the same time in a population. Furthermore, the flexibility of the model allows its use for other mosquito-borne diseases, like chikungunya, yellow fever and malaria. An approximation of the cellular automata is proposed in terms of ordinary differential equations; the spreading of mosquitoes is studied and the influence of some model parameters is analyzed with numerical simulations. Finally, a method to combat dengue spreading is simulated based on a reduction of mosquito births and mosquito bites in the population.
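    A minimal two-lattice probabilistic cellular automaton in the spirit of this model can be sketched as follows. It tracks a single serotype only, and the lattice size, transition probabilities, and mosquito dispersal rule are illustrative assumptions, not the authors' parameterization:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50                      # lattice side length
S, I, R = 0, 1, 2           # human states (susceptible/infected/recovered)

humans = np.zeros((N, N), dtype=int)      # human lattice
mosq_inf = np.zeros((N, N), dtype=bool)   # mosquito lattice: infected cells
humans[N // 2, N // 2] = I                # seed one infected human

p_h2m, p_m2h, p_rec = 0.3, 0.3, 0.1       # assumed transition probabilities

def step(humans, mosq_inf):
    new_h, new_m = humans.copy(), mosq_inf.copy()
    # cross-lattice coupling: mosquitoes bite humans at the same cell
    bite_inf = mosq_inf & (humans == S) & (rng.random((N, N)) < p_m2h)
    new_h[bite_inf] = I
    # mosquitoes acquire the virus from infected humans at their cell
    new_m |= (humans == I) & (rng.random((N, N)) < p_h2m)
    # crude dispersal: shift the mosquito lattice to a random neighbour cell
    new_m = np.roll(new_m, rng.integers(-1, 2, size=2), axis=(0, 1))
    recover = (humans == I) & (rng.random((N, N)) < p_rec)
    new_h[recover] = R
    return new_h, new_m

for _ in range(100):
    humans, mosq_inf = step(humans, mosq_inf)
```

Adding further serotypes amounts to carrying one infection state per serotype on both lattices, with cross-immunity rules on the human side.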

  9. Fuzzy logic application for modeling man-in-the-loop space shuttle proximity operations. M.S. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Brown, Robert B.

    1994-01-01

    A software pilot model for Space Shuttle proximity operations is developed, utilizing fuzzy logic. The model is designed to emulate a human pilot during the terminal phase of a Space Shuttle approach to the Space Station. The model uses the same sensory information available to a human pilot and is based upon existing piloting rules and techniques determined from analysis of human pilot performance. Such a model is needed to generate numerous rendezvous simulations to various Space Station assembly stages for analysis of current NASA procedures and plume impingement loads on the Space Station. The advantages of a fuzzy logic pilot model are demonstrated by comparing its performance with NASA's man-in-the-loop simulations and with a similar model based upon traditional Boolean logic. The fuzzy model is shown to respond well from a number of initial conditions, with results typical of an average human. In addition, the ability to model different individual piloting techniques and new piloting rules is demonstrated.
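    A toy Mamdani-style fuzzy rule evaluation of the kind such a pilot model relies on is sketched below. The membership functions, rule set, and braking-output centroids are hypothetical, not NASA's:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def braking_command(range_m, closing_rate):
    """Toy two-rule Mamdani controller mapping range (m) and closing
    rate (m/s) to a braking level in [0, 1].
    Rule 1: IF range is CLOSE AND rate is FAST THEN brake HARD
    Rule 2: IF range is FAR  OR  rate is SLOW THEN brake SOFT"""
    close = tri(range_m, 0, 0, 50)          # degrees of membership
    far   = tri(range_m, 20, 100, 180)
    fast  = tri(closing_rate, 0.5, 2.0, 3.5)
    slow  = tri(closing_rate, 0, 0, 1.0)
    w_hard = min(close, fast)               # fuzzy AND -> min
    w_soft = max(far, slow)                 # fuzzy OR  -> max
    if w_hard + w_soft == 0:
        return 0.0                          # no rule fires
    # defuzzify: weighted average of rule output centroids (hard=1.0, soft=0.2)
    return (w_hard * 1.0 + w_soft * 0.2) / (w_hard + w_soft)
```

Because the rules are linguistic ("close", "fast"), different piloting techniques can be modeled by editing the rule base rather than retuning gains, which is the advantage the thesis demonstrates over a Boolean-logic pilot.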

  10. System Simulation Modeling: A Case Study Illustration of the Model Development Life Cycle

    Treesearch

    Janice K. Wiedenbeck; D. Earl Kline

    1994-01-01

    Systems simulation modeling techniques offer a method of representing the individual elements of a manufacturing system and their interactions. By developing and experimenting with simulation models, one can obtain a better understanding of the overall physical system. Forest products industries are beginning to understand the importance of simulation modeling to help...

  11. A computational approach to characterizing the impact of social influence on individuals' vaccination decision making.

    PubMed

    Xia, Shang; Liu, Jiming

    2013-01-01

    In modeling individuals' vaccination decision making, existing studies have typically used payoff-based (e.g., game-theoretical) approaches that evaluate the risks and benefits of vaccination. In reality, whether an individual takes a vaccine or not is also influenced by the decisions of others, i.e., by social influence. In this regard, we present a dual-perspective view on individuals' decision making that incorporates both the cost analysis of vaccination and the impact of social influence. In doing so, we consider a group of individuals making their vaccination decisions by both minimizing the associated costs and evaluating the decisions of others. We apply social impact theory (SIT) to characterize the impact of social influence with respect to individuals' interaction relationships. By doing so, we propose a novel modeling framework that integrates an extended SIT-based characterization of social influence with a game-theoretical analysis of cost minimization. We consider the scenario of voluntary vaccination against an influenza-like disease through a series of simulations. We investigate the steady state of individuals' decision making and thus assess the impact of social influence by evaluating the coverage of vaccination for infectious disease control. Our simulation results suggest that individuals' high conformity to social influence will increase the vaccination coverage if the cost of vaccination is low and, conversely, will decrease it if the cost is high. Interestingly, if individuals are social followers, the resulting vaccination coverage converges to a level that depends on individuals' initial level of vaccination willingness rather than on the associated costs. We conclude that social influence will have an impact on the control of an infectious disease, as it can affect the vaccination coverage. In this respect, our work can provide a means for modeling the impact of social influence as well as for estimating the effectiveness of a voluntary vaccination program.
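    The blend of cost minimization and conformity described here can be sketched on a ring network. The payoff parameters, conformity weight, and network structure below are invented for illustration, not the paper's SIT formulation:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 2                    # population size; k neighbours on each side
cost_vacc, cost_inf, risk = 0.3, 1.0, 0.5   # assumed payoff parameters
alpha = 0.6                      # assumed weight of social influence (conformity)

vacc = rng.random(n) < 0.5       # initial vaccination willingness

def update(vacc):
    new = vacc.copy()
    for i in range(n):
        neigh = [(i + d) % n for d in range(-k, k + 1) if d != 0]
        peer_frac = np.mean([vacc[j] for j in neigh])   # local social impact
        # utility of each choice: (1 - alpha) cost term + alpha conformity term
        u_yes = -(1 - alpha) * cost_vacc + alpha * peer_frac
        u_no = -(1 - alpha) * cost_inf * risk + alpha * (1 - peer_frac)
        new[i] = u_yes > u_no
    return new

for _ in range(30):              # iterate to (approximate) steady state
    vacc = update(vacc)
coverage = vacc.mean()           # resulting vaccination coverage
```

Sweeping `cost_vacc` and `alpha` in such a sketch reproduces the qualitative finding that conformity amplifies whichever choice the cost structure already favors.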

  12. Safety evaluation of driver cognitive failures and driving errors on right-turn filtering movement at signalized road intersections based on Fuzzy Cellular Automata (FCA) model.

    PubMed

    Chai, Chen; Wong, Yiik Diew; Wang, Xuesong

    2017-07-01

    This paper proposes a simulation-based approach to estimate the safety impact of driver cognitive failures and driving errors. Fuzzy Logic, which involves linguistic terms and uncertainty, is incorporated into a Cellular Automata model to simulate the decision-making process of right-turn filtering movement at signalized intersections. Simulation experiments are conducted to estimate the relationships of cognitive failures and driving errors with safety performance. Simulation results show that different types of cognitive failures have varied relationships with driving errors and safety performance. For right-turn filtering movement, cognitive failures are more likely to result in driving errors with a denser conflicting traffic stream. Moreover, different driving errors are found to have different safety impacts. The study serves to provide a novel approach to linguistically assess cognitions and replicate decision-making procedures of the individual driver. Compared to crash analysis, the proposed FCA model allows quantitative estimation of particular cognitive failures, and of the impact of cognitions on driving errors and safety performance. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. A hierarchical model of the evolution of cooperation in cultural systems.

    PubMed

    Savatsky, K; Reynolds, R G

    1989-01-01

    In this paper the following problem is addressed: under what conditions can a collection of individual organisms learn to cooperate when cooperation appears to outwardly degrade individual performance at the outset? In order to attempt a theoretical solution to this problem, data from a real-world problem in anthropology is used. A distributed simulation model of this system was developed to assess its long-term behavior, using an approach suggested by Zeigler (Zeigler, B.P., 1984, Multifaceted Modelling and Discrete Event Simulation (Academic Press, London)). The results of the simulation are used to show that although cooperation degrades the performance potential of each individual, it enhances the persistence of the individual's partial solution to the problem in certain situations.

  14. A musculoskeletal model of the upper extremity for use in the development of neuroprosthetic systems

    PubMed Central

    Blana, Dimitra; Hincapie, Juan G.; Chadwick, Edward K.; Kirsch, Robert F.

    2008-01-01

    Upper extremity neuroprostheses use functional electrical stimulation (FES) to restore arm motor function to individuals with cervical level spinal cord injury. For the design and testing of these systems, a biomechanical model of the shoulder and elbow has been developed, to be used as a substitute for the human arm. It can be used to design and evaluate specific implementations of FES systems, as well as FES controllers. The model can be customized to simulate a variety of pathological conditions. For example, by adjusting the maximum force the muscles can produce, the model can be used to simulate an individual with tetraplegia and to explore the effects of FES of different muscle sets. The model comprises six bones, five joints, nine degrees of freedom, and 29 shoulder and arm muscles. It was developed using commercial, graphics-based modeling and simulation packages that are easily accessible to other researchers and can be readily interfaced to other analysis packages. It can be used for both forward-dynamic (inputs: muscle activation and external load; outputs: motions) and inverse-dynamic (inputs: motions and external load; outputs: muscle activation) simulations. Our model was verified by comparing the model-calculated muscle activations to electromyographic signals recorded from shoulder and arm muscles of five subjects. As an example of its application to neuroprosthesis design, the model was used to demonstrate the importance of rotator cuff muscle stimulation when aiming to restore humeral elevation. It is concluded that this model is a useful tool in the development and implementation of upper extremity neuroprosthetic systems. PMID:18420213

  15. A musculoskeletal model of the upper extremity for use in the development of neuroprosthetic systems.

    PubMed

    Blana, Dimitra; Hincapie, Juan G; Chadwick, Edward K; Kirsch, Robert F

    2008-01-01

    Upper extremity neuroprostheses use functional electrical stimulation (FES) to restore arm motor function to individuals with cervical level spinal cord injury. For the design and testing of these systems, a biomechanical model of the shoulder and elbow has been developed, to be used as a substitute for the human arm. It can be used to design and evaluate specific implementations of FES systems, as well as FES controllers. The model can be customized to simulate a variety of pathological conditions. For example, by adjusting the maximum force the muscles can produce, the model can be used to simulate an individual with tetraplegia and to explore the effects of FES of different muscle sets. The model comprises six bones, five joints, nine degrees of freedom, and 29 shoulder and arm muscles. It was developed using commercial, graphics-based modeling and simulation packages that are easily accessible to other researchers and can be readily interfaced to other analysis packages. It can be used for both forward-dynamic (inputs: muscle activation and external load; outputs: motions) and inverse-dynamic (inputs: motions and external load; outputs: muscle activation) simulations. Our model was verified by comparing the model calculated muscle activations to electromyographic signals recorded from shoulder and arm muscles of five subjects. As an example of its application to neuroprosthesis design, the model was used to demonstrate the importance of rotator cuff muscle stimulation when aiming to restore humeral elevation. It is concluded that this model is a useful tool in the development and implementation of upper extremity neuroprosthetic systems.

  16. Forecasting the use of elderly care: a static micro-simulation model.

    PubMed

    Eggink, Evelien; Woittiez, Isolde; Ras, Michiel

    2016-07-01

    This paper describes a model suitable for forecasting the use of publicly funded long-term elderly care, taking into account both ageing and changes in the health status of the population. In addition, the impact of socioeconomic factors on care use is included in the forecasts. The model is also suitable for simulating the possible implications of some specific policy measures. The model is a static micro-simulation model, consisting of an explanatory model and a population model. The explanatory model statistically relates care use to individual characteristics. The population model mimics the composition of the population at future points in time. The forecasts of care use are driven by changes in the composition of the population in terms of relevant characteristics instead of dynamics at the individual level. The results show that a further 37% increase in the use of elderly care (from 7% to 9% of the Dutch 30-plus population) between 2008 and 2030 can be expected due to continued ageing of the population. However, the use of care is expected to increase less than if it were based on the increasing number of elderly only (+70%), due to decreasing disability levels and increasing levels of education. As an application of the model, we simulated the effects of restricting access to residential care to elderly people with severe physical disabilities. The result was a lower growth of residential care use (32% instead of 57%), but a somewhat faster growth in the use of home care (35% instead of 32%).
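    The explanatory-model-plus-population-model structure can be sketched as follows. The logistic coefficients and the population scenarios are illustrative stand-ins, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(2)

# Explanatory model: assumed logistic coefficients relating care use to
# individual characteristics (illustrative values only)
b0, b_age, b_dis = -8.0, 0.08, 1.5

def p_use(age, disabled):
    """Probability that an individual uses long-term care."""
    z = b0 + b_age * age + b_dis * disabled
    return 1.0 / (1.0 + np.exp(-z))

def expected_use(ages, disabled):
    """Population model: mean predicted probability = expected use rate."""
    return p_use(ages, disabled).mean()

# Base population vs a crude 'aged' future population with lower disability
ages_now = rng.normal(60, 15, 10_000).clip(30, 100)
ages_2030 = ages_now + 5                             # everyone 5 years older
dis_now = (rng.random(10_000) < 0.20).astype(float)  # 20% disabled now
dis_2030 = (rng.random(10_000) < 0.15).astype(float) # 15% disabled in 2030

rate_now = expected_use(ages_now, dis_now)
rate_2030 = expected_use(ages_2030, dis_2030)
```

Because the micro-simulation is static, the forecast changes only through the composition of the simulated population, exactly as the abstract describes; a policy measure such as restricting access is modeled by changing the eligibility condition inside `p_use`.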

  17. An Individual-Based Model of the Evolution of Pesticide Resistance in Heterogeneous Environments: Control of Meligethes aeneus Population in Oilseed Rape Crops

    PubMed Central

    Stratonovitch, Pierre; Elias, Jan; Denholm, Ian; Slater, Russell; Semenov, Mikhail A.

    2014-01-01

    Preventing a pest population from damaging an agricultural crop and, at the same time, preventing the development of pesticide resistance is a major challenge in crop protection. Understanding how farming practices and environmental factors interact with pest characteristics to influence the spread of resistance is a difficult and complex task. It is extremely challenging to investigate such interactions experimentally at realistic spatial and temporal scales. Mathematical modelling and computer simulation have, therefore, been used to analyse resistance evolution and to evaluate potential resistance management tactics. Of the many modelling approaches available, individual-based modelling of a pest population offers most flexibility to include and analyse numerous factors and their interactions. Here, a pollen beetle (Meligethes aeneus) population was modelled as an aggregate of individual insects inhabiting a spatially heterogeneous landscape. The development of the pest and host crop (oilseed rape) was driven by climatic variables. The agricultural land of the landscape was managed by farmers applying a specific rotation and crop protection strategy. The evolution of a single resistance allele to the pyrethroid lambda cyhalothrin was analysed for different combinations of crop management practices and for a recessive, intermediate and dominant resistance allele. While the spread of a recessive resistance allele was severely constrained, intermediate or dominant resistance alleles showed a similar response to the management regime imposed. Calendar treatments applied irrespective of pest density accelerated the development of resistance compared to ones applied in response to prescribed pest density thresholds. A greater proportion of spring-sown oilseed rape was also found to increase the speed of resistance as it increased the period of insecticide exposure. 
Our study demonstrates the flexibility and power of an individual-based model to simulate how farming practices affect pest population dynamics, and the consequent impact of different control strategies on the risk and speed of resistance development. PMID:25531104

  18. An individual-based model of the evolution of pesticide resistance in heterogeneous environments: control of Meligethes aeneus population in oilseed rape crops.

    PubMed

    Stratonovitch, Pierre; Elias, Jan; Denholm, Ian; Slater, Russell; Semenov, Mikhail A

    2014-01-01

    Preventing a pest population from damaging an agricultural crop and, at the same time, preventing the development of pesticide resistance is a major challenge in crop protection. Understanding how farming practices and environmental factors interact with pest characteristics to influence the spread of resistance is a difficult and complex task. It is extremely challenging to investigate such interactions experimentally at realistic spatial and temporal scales. Mathematical modelling and computer simulation have, therefore, been used to analyse resistance evolution and to evaluate potential resistance management tactics. Of the many modelling approaches available, individual-based modelling of a pest population offers most flexibility to include and analyse numerous factors and their interactions. Here, a pollen beetle (Meligethes aeneus) population was modelled as an aggregate of individual insects inhabiting a spatially heterogeneous landscape. The development of the pest and host crop (oilseed rape) was driven by climatic variables. The agricultural land of the landscape was managed by farmers applying a specific rotation and crop protection strategy. The evolution of a single resistance allele to the pyrethroid lambda cyhalothrin was analysed for different combinations of crop management practices and for a recessive, intermediate and dominant resistance allele. While the spread of a recessive resistance allele was severely constrained, intermediate or dominant resistance alleles showed a similar response to the management regime imposed. Calendar treatments applied irrespective of pest density accelerated the development of resistance compared to ones applied in response to prescribed pest density thresholds. A greater proportion of spring-sown oilseed rape was also found to increase the speed of resistance as it increased the period of insecticide exposure. 
Our study demonstrates the flexibility and power of an individual-based model to simulate how farming practices affect pest population dynamics, and the consequent impact of different control strategies on the risk and speed of resistance development.
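    The contrast between recessive and dominant resistance alleles reported here can be reproduced with a standard single-locus selection recursion. The selection coefficient and fitness scheme below are textbook assumptions for illustration, not parameters from this study:

```python
def allele_freq_trajectory(p0, s, h, generations):
    """Deterministic single-locus selection under insecticide exposure.
    Fitnesses: RR = 1, RS = 1 - s*(1-h), SS = 1 - s, where p is the
    resistant (R) allele frequency and h is the dominance of resistance
    (0 = fully recessive, 1 = fully dominant)."""
    p, traj = p0, [p0]
    for _ in range(generations):
        w_RR, w_RS, w_SS = 1.0, 1.0 - s * (1.0 - h), 1.0 - s
        w_bar = p * p * w_RR + 2 * p * (1 - p) * w_RS + (1 - p) ** 2 * w_SS
        p = (p * p * w_RR + p * (1 - p) * w_RS) / w_bar  # next-generation freq
        traj.append(p)
    return traj

# Rare resistance allele (p0 = 0.001) under strong selection (s = 0.5):
recessive = allele_freq_trajectory(1e-3, s=0.5, h=0.0, generations=50)
dominant = allele_freq_trajectory(1e-3, s=0.5, h=1.0, generations=50)
```

With `h = 0` the rare allele is hidden in heterozygotes and barely spreads in 50 generations, while with `h = 1` it approaches fixation, matching the constrained spread of the recessive allele described above.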

  19. Fast image-based mitral valve simulation from individualized geometry.

    PubMed

    Villard, Pierre-Frederic; Hammer, Peter E; Perrin, Douglas P; Del Nido, Pedro J; Howe, Robert D

    2018-04-01

    Common surgical procedures on the mitral valve of the heart include modifications to the chordae tendineae. Such interventions are used when there is extensive leaflet prolapse caused by chordae rupture or elongation. Understanding the role of individual chordae tendineae before operating could be helpful to predict whether the mitral valve will be competent at peak systole. Biomechanical modelling and simulation can achieve this goal. We present a method to semi-automatically build a computational model of a mitral valve from micro CT (computed tomography) scans: after manually picking chordae fiducial points, the leaflets are segmented and the boundary conditions as well as the loading conditions are automatically defined. Fast finite element method (FEM) simulation is carried out using Simulation Open Framework Architecture (SOFA) to reproduce leaflet closure at peak systole. We develop three metrics to evaluate simulation results: (i) point-to-surface error with the ground truth reference extracted from the CT image, (ii) coaptation surface area of the leaflets and (iii) an indication of whether the simulated closed leaflets leak. We validate our method on three explanted porcine hearts and show that our model predicts the closed valve surface with point-to-surface error of approximately 1 mm, a reasonable coaptation surface area, and absence of any leak at peak systole (maximum closed pressure). We also evaluate the sensitivity of our model to changes in various parameters (tissue elasticity, mesh accuracy, and the transformation matrix used for CT scan registration). We also measure the influence of the positions of the chordae tendineae on simulation results and show that marginal chordae have a greater influence on the final shape than intermediate chordae. The mitral valve simulation can help the surgeon understand valve behaviour and anticipate the outcome of a procedure. Copyright © 2018 John Wiley & Sons, Ltd.
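    The first evaluation metric, point-to-surface error, can be approximated by nearest-neighbour distances to a densely sampled reference surface. This is a generic sketch, not the authors' implementation:

```python
import numpy as np

def point_to_surface_error(points, surface_pts):
    """Approximate point-to-surface error as the distance from each
    simulated leaflet point to its nearest sample on the reference
    surface (e.g. the geometry segmented from the CT image)."""
    # pairwise distances: (n_points, n_surface)
    d = np.linalg.norm(points[:, None, :] - surface_pts[None, :, :], axis=2)
    nearest = d.min(axis=1)                 # per-point closest distance
    return nearest.mean(), nearest.max()    # mean and worst-case error

# sanity check: points lying exactly on the sampled surface give zero error
surf = np.random.default_rng(3).random((500, 3))
mean_err, max_err = point_to_surface_error(surf[:50], surf)
```

With a dense enough surface sampling, the mean of `nearest` corresponds to the roughly 1 mm error the authors report against the CT ground truth.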

  20. Coupling individual kernel-filling processes with source-sink interactions into GREENLAB-Maize.

    PubMed

    Ma, Yuntao; Chen, Youjia; Zhu, Jinyu; Meng, Lei; Guo, Yan; Li, Baoguo; Hoogenboom, Gerrit

    2018-02-13

    Failure to account for the variation of kernel growth in a cereal crop simulation model may cause serious deviations in the estimates of crop yield. The goal of this research was to revise the GREENLAB-Maize model to incorporate source- and sink-limited allocation approaches to simulate the dry matter accumulation of individual kernels of an ear (GREENLAB-Maize-Kernel). The model used potential individual kernel growth rates to characterize the individual potential sink demand. The remobilization of non-structural carbohydrates from reserve organs to kernels was also incorporated. Two years of field experiments were conducted to determine the model parameter values and to evaluate the model using two maize hybrids with different plant densities and pollination treatments. Detailed observations were made on the dimensions and dry weights of individual kernels and other above-ground plant organs throughout the seasons. Three basic traits characterizing an individual kernel were compared on simulated and measured individual kernels: (1) final kernel size; (2) kernel growth rate; and (3) duration of kernel filling. Simulations of individual kernel growth closely corresponded to experimental data. The model was able to reproduce the observed dry weight of plant organs well. Then, the source-sink dynamics and the remobilization of carbohydrates for kernel growth were quantified to show that remobilization processes accompanied source-sink dynamics during the kernel-filling process. We conclude that the model may be used to explore options for optimizing plant kernel yield by matching maize management to the environment, taking into account responses at the level of individual kernels. © The Author(s) 2018. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
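    The core source-sink allocation with remobilization can be sketched as follows. The demand-proportional rule, the remobilization fraction, and the numbers are illustrative assumptions, not the GREENLAB-Maize-Kernel formulation:

```python
import numpy as np

def allocate(supply, demand, reserve, remob_frac=0.3):
    """Allocate one day's assimilate to kernels in proportion to their
    potential sink demand; draw on reserve-organ carbohydrates when the
    source falls short, surplus otherwise tops up the reserves."""
    total_demand = demand.sum()
    if supply >= total_demand:                      # sink-limited day
        return demand.copy(), reserve + (supply - total_demand)
    shortfall = total_demand - supply               # source-limited day
    remob = min(shortfall, remob_frac * reserve)    # remobilize reserves
    available = supply + remob
    growth = demand * (available / total_demand)    # proportional allocation
    return growth, reserve - remob

# Three kernels with assumed potential growth rates (mg/day) and a short supply
demand = np.array([2.0, 1.5, 1.0])
growth, reserve = allocate(supply=3.0, demand=demand, reserve=5.0)
```

Iterating such a step over the grain-filling period, with `demand` declining as each kernel approaches its potential final size, yields the individual kernel-filling trajectories the model is evaluated against.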

  1. Worldwide evaluation of mean and extreme runoff from six global-scale hydrological models that account for human impacts

    NASA Astrophysics Data System (ADS)

    Zaherpour, Jamal; Gosling, Simon N.; Mount, Nick; Müller Schmied, Hannes; Veldkamp, Ted I. E.; Dankers, Rutger; Eisner, Stephanie; Gerten, Dieter; Gudmundsson, Lukas; Haddeland, Ingjerd; Hanasaki, Naota; Kim, Hyungjun; Leng, Guoyong; Liu, Junguo; Masaki, Yoshimitsu; Oki, Taikan; Pokhrel, Yadu; Satoh, Yusuke; Schewe, Jacob; Wada, Yoshihide

    2018-06-01

    Global-scale hydrological models are routinely used to assess water scarcity, flood hazards and droughts worldwide. Recent efforts to incorporate anthropogenic activities in these models have enabled more realistic comparisons with observations. Here we evaluate simulations from an ensemble of six models participating in the second phase of the Inter-Sectoral Impact Model Inter-comparison Project (ISIMIP2a). We simulate monthly runoff in 40 catchments, spatially distributed across eight global hydrobelts. The performance of each model and the ensemble mean is examined with respect to their ability to replicate observed mean and extreme runoff under human-influenced conditions. Application of a novel integrated evaluation metric to quantify the models' ability to simulate time series of monthly runoff suggests that the models generally perform better in the wetter equatorial and northern hydrobelts than in drier southern hydrobelts. When model outputs are temporally aggregated to assess mean annual and extreme runoff, the models perform better. Nevertheless, we find a general trend in the majority of models towards the overestimation of mean annual runoff and all indicators of upper and lower extreme runoff. The models struggle to capture the timing of the seasonal cycle, particularly in northern hydrobelts, while in southern hydrobelts the models struggle to reproduce the magnitude of the seasonal cycle. It is noteworthy that over all hydrological indicators, the ensemble mean fails to perform better than any individual model, a finding that challenges the commonly held perception that model ensemble estimates deliver superior performance over individual models. The study highlights the need for continued model development and improvement. It also suggests that caution should be taken when summarising the simulations from a model ensemble based upon its mean output.
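    The ensemble-mean-versus-individual-model comparison can be illustrated with a common skill score such as Nash-Sutcliffe efficiency on synthetic runoff. The data, model biases, and score choice below are illustrative, not the study's integrated metric:

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect, < 0 is worse than
    predicting the observed mean."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(4)
# synthetic 20-year monthly runoff with a seasonal cycle
obs = 10 + 5 * np.sin(np.linspace(0, 40 * np.pi, 240))
# six pseudo-models with different multiplicative biases plus noise
models = [obs * b + rng.normal(0, 1.5, obs.size)
          for b in (0.8, 0.9, 1.0, 1.1, 1.2, 1.3)]

scores = [nse(m, obs) for m in models]        # per-model skill
ens_score = nse(np.mean(models, axis=0), obs) # skill of the ensemble mean
best_individual = max(scores)
```

Comparing `ens_score` against `best_individual` and against each entry of `scores` is the style of check that, in the study, showed the ensemble mean failing to beat every individual model on any indicator.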

  2. Evaluating the impacts of screening and smoking cessation programmes on lung cancer in a high-burden region of the USA: a simulation modelling study

    PubMed Central

    Tramontano, Angela C; Sheehan, Deirdre F; McMahon, Pamela M; Dowling, Emily C; Holford, Theodore R; Ryczak, Karen; Lesko, Samuel M; Levy, David T; Kong, Chung Yin

    2016-01-01

    Objective: While the US Preventive Services Task Force has issued recommendations for lung cancer screening, its effectiveness at reducing lung cancer burden may vary at local levels due to regional variations in smoking behaviour. Our objective was to use an existing model to determine the impacts of lung cancer screening alone or in addition to increased smoking cessation in a US region with a relatively high smoking prevalence and lung cancer incidence. Setting: Computer-based simulation model. Participants: Simulated population of individuals 55 and older based on smoking prevalence and census data from Northeast Pennsylvania. Interventions: Hypothetical lung cancer control from 2014 to 2050 through (1) screening with CT, (2) intensified smoking cessation or (3) a combination strategy. Primary and secondary outcome measures: Primary outcomes were lung cancer mortality rates. Secondary outcomes included number of people eligible for screening and number of radiation-induced lung cancers. Results: Combining lung cancer screening with increased smoking cessation would yield an estimated 8.1% reduction in cumulative lung cancer mortality by 2050. Our model estimated that the number of screening-eligible individuals would progressively decrease over time, indicating declining benefit of a screening-only programme. Lung cancer screening achieved a greater mortality reduction in earlier years, but was later surpassed by smoking cessation. Conclusions: Combining smoking cessation programmes with lung cancer screening would provide the most benefit to a population, especially considering the growing proportion of patients ineligible for screening based on current recommendations. PMID:26928026

  3. Assessing the Impact of Climatic Variability and Change on Maize Production in the Midwestern USA

    NASA Astrophysics Data System (ADS)

    Andresen, J.; Jain, A. K.; Niyogi, D. S.; Alagarswamy, G.; Biehl, L.; Delamater, P.; Doering, O.; Elias, A.; Elmore, R.; Gramig, B.; Hart, C.; Kellner, O.; Liu, X.; Mohankumar, E.; Prokopy, L. S.; Song, C.; Todey, D.; Widhalm, M.

    2013-12-01

Weather and climate remain among the most important uncontrollable factors in agricultural production systems. In this study, three process-based crop simulation models were used to identify the impacts of climate on the production of maize in the Midwestern U.S.A. during the past century. The 12-state region is a key global production area, responsible for more than 80% of U.S. domestic and 25% of total global production. The study is a part of the Useful to Usable (U2U) Project, a USDA NIFA-sponsored project seeking to improve the resilience and profitability of farming operations in the region amid climate variability and change. The three models were CERES-Maize (DSSAT, Hoogenboom et al., 2012), the Hybrid-Maize model (Yang et al., 2004), and the Integrated Science Assessment Model (ISAM, Song et al., 2013). Model validation was carried out against individual plot and county observations. The models were run with 4 to 50 km spatial resolution gridded weather data for representative soils and cultivars over 1981-2012 to examine spatial and temporal yield variability within the region. We also examined the influence of different crop models and spatial scales on regional-scale yield estimation, and performed a yield gap analysis between observed and attainable yields. An additional study was carried out with the CERES-Maize model at 18 individual site locations for 1901-2012 to examine longer-term historical trends. For all simulations, all non-climate input variables were held constant in order to isolate the impacts of climate. In general, the model estimates were in good agreement with observed yields, especially in central sections of the region. Regionally, low precipitation and soil moisture stress were the chief limitations to simulated crop yields. The study suggests that at least part of the observed yield increase in the region during recent decades has occurred as the result of wetter, less stressful growing-season weather conditions.

  4. Forecasting a winner for Malaysian Cup 2013 using soccer simulation model

    NASA Astrophysics Data System (ADS)

    Yusof, Muhammad Mat; Fauzee, Mohd Soffian Omar; Latif, Rozita Abdul

    2014-07-01

This paper investigates, through soccer simulation, the probability of each team winning the Malaysia Cup 2013. Our methodology is to predict the outcomes of individual matches and then simulate the Malaysia Cup 2013 tournament 5000 times. As match outcomes are always a matter of uncertainty, a statistical model, in particular a double Poisson model, is used to predict the number of goals scored and conceded by each team. Maximum likelihood estimation is used to measure the attacking strength and defensive weakness of each team. Based on our simulation results, LionXII has the highest probability of becoming the winner, followed by Selangor, ATM, JDT and Kelantan. Meanwhile, T-Team, Negeri Sembilan and Felda United have lower probabilities of winning the Malaysia Cup 2013. In summary, we find that the probability of each team becoming the winner is small, indicating that the level of competitive balance in the Malaysia Cup 2013 is quite high.
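A double Poisson match simulation of the kind described can be sketched as follows; the scoring rates in the usage example are illustrative placeholders, not the paper's maximum-likelihood estimates:

```python
import math
import random

def sample_poisson(lam, rng):
    """Draw from Poisson(lam) via Knuth's product-of-uniforms algorithm."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def match_probs(rate_a, rate_b, n_sims=5000, seed=42):
    """Estimate (P(A wins), P(draw), P(B wins)) by simulating n_sims
    matches in which each team's goals are independent Poisson counts."""
    rng = random.Random(seed)
    wins_a = draws = wins_b = 0
    for _ in range(n_sims):
        goals_a = sample_poisson(rate_a, rng)
        goals_b = sample_poisson(rate_b, rng)
        if goals_a > goals_b:
            wins_a += 1
        elif goals_a == goals_b:
            draws += 1
        else:
            wins_b += 1
    return wins_a / n_sims, draws / n_sims, wins_b / n_sims
```

In the full method, each team's scoring rate would combine its fitted attacking strength with the opponent's defensive weakness, and the whole tournament rather than a single match would be replayed 5000 times.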

  5. [The history of development of evolutionary methods in St. Petersburg school of computer simulation in biology].

    PubMed

    Menshutkin, V V; Kazanskiĭ, A B; Levchenko, V F

    2010-01-01

The history of the rise and development of evolutionary methods in the St. Petersburg school of biological modelling is traced and analyzed. Some pioneering works in the simulation of ecological and evolutionary processes performed in the St. Petersburg school became exemplars for many followers in Russia and abroad. The individual-based approach became the crucial point in the history of the school, as an adequate instrument for constructing models of biological evolution. This approach is natural for simulating the evolution of life-history parameters and adaptive processes in populations and communities. In some cases the simulated evolutionary process was used to solve an inverse problem, i.e., to estimate uncertain life-history parameters of a population. Evolutionary computation is one further application of this approach in a great many fields. The problems and prospects of ecological and evolutionary modelling in general are discussed.

  6. Next generation dynamic global vegetation models: learning from community ecology (Invited)

    NASA Astrophysics Data System (ADS)

    Scheiter, S.; Higgins, S.; Langan, L.

    2013-12-01

    Dynamic global vegetation models are a powerful tool to project past, current and future vegetation patterns and the associated biogeochemical cycles. However, most models are limited by their representation of vegetation by using static and pre-defined plant functional types and by their simplistic representation of competition. We discuss how concepts from community assembly theory and coexistence theory can help to improve dynamic vegetation models. We present a trait- and individual-based dynamic vegetation model, the aDGVM2, that allows individual plants to adopt a unique combination of trait values. These traits define how individual plants grow, compete and reproduce under the given biotic and abiotic conditions. A genetic optimization algorithm is used to simulate trait inheritance and reproductive isolation between individuals. These model properties allow the assembly of plant communities that are adapted to biotic and abiotic conditions. We show (1) that the aDGVM2 can simulate coarse vegetation patterns in Africa, (2) that changes in the environmental conditions and disturbances strongly influence trait diversity and the assembled plant communities by influencing traits such as leaf phenology and carbon allocation patterns of individual plants and (3) that communities do not necessarily return to the initial state when environmental conditions return to the initial state. The aDGVM2 deals with functional diversity and competition fundamentally differently from current models and allows novel insights as to how vegetation may respond to climate change. We believe that the aDGVM2 approach could foster collaborations between research communities that focus on functional plant ecology, plant competition, plant physiology and Earth system science.

  7. Biodiversity matters in feedbacks between climate change and air quality: a study using an individual-based model.

    PubMed

    Wang, Bin; Shuman, Jacquelyn; Shugart, Herman H; Lerdau, Manuel T

    2018-03-30

Air quality is closely associated with climate change via the biosphere because plants release large quantities of volatile organic compounds (VOC) that mediate both gaseous pollutants and aerosol dynamics. Earlier studies, which considered only leaf physiology and simply scaled up leaf-level enhancements of emissions, suggest that climate warming enhances whole-forest VOC emissions, and that these increased VOC emissions aggravate ozone pollution and secondary organic aerosol formation. Using an individual-based forest VOC emissions model, UVAFME-VOC, that simulates system-level emissions by explicitly simulating forest community dynamics to the individual tree level, ecological competition among individuals of differing size and age, and radiative transfer and leaf function through the canopy, we find that climate warming only sometimes stimulates isoprene emissions (the single largest source of non-methane hydrocarbons) in a southeastern U.S. forest. These complex patterns result from higher temperatures stimulating emissions at the leaf level while, at the community level, causing a decline in the abundance of the isoprene-emitting taxa (Quercus spp.). This ecological effect eventually outweighs the physiological one, thus reducing overall emissions. Such reduced emissions have far-reaching implications for the climate-air-quality relationships that have been established on the paradigm of warming-enhanced VOC emissions from vegetation. This local-scale modeling study suggests that community ecology, rather than only individual physiology, should be integrated into future studies of biosphere-climate-chemistry interactions. © 2018 by the Ecological Society of America.

  8. Effect of inlet modelling on surface drainage in coupled urban flood simulation

    NASA Astrophysics Data System (ADS)

    Jang, Jiun-Huei; Chang, Tien-Hao; Chen, Wei-Bo

    2018-07-01

For a highly developed urban area with a complete drainage system, flood simulation is necessary for describing the flow dynamics from rainfall to surface runoff and to sewer flow. In this study, a coupled flood model based on diffusion wave equations was proposed to simulate one-dimensional sewer flow and two-dimensional overland flow simultaneously. The overland flow model provides details on the rainfall-runoff process to estimate the excess runoff that enters the sewer system through street inlets for sewer flow routing. Three types of inlet modelling are considered in this study: the manhole-based approach, which ignores the street inlets by draining surface water directly into manholes; the inlet-manhole approach, which drains surface water into manholes that are each connected to multiple inlets; and the inlet-node approach, which drains surface water into sewer nodes that are connected to individual inlets. The simulation results were compared against a high-intensity rainstorm event that occurred in Taipei City in 2015. In the verification of the maximum flood extent, the two approaches that considered street inlets performed considerably better than the one without. In terms of temporal flood variation, using manholes as receivers, whether in the manhole-based or the inlet-manhole approach, leads to overall inefficient draining of the surface water. The inlet-node approach is more reasonable than the inlet-manhole approach because it greatly reduces the fluctuation of the sewer water level. The inlet-node approach is also more efficient in draining surface water, reducing flood volume by 13% compared with the inlet-manhole approach and by 41% compared with the manhole-based approach. The results show that inlet modelling has a strong influence on drainage efficiency in coupled flood simulation.

  9. The origin of human complex diversity: Stochastic epistatic modules and the intrinsic compatibility between distributional robustness and phenotypic changeability.

    PubMed

    Ijichi, Shinji; Ijichi, Naomi; Ijichi, Yukina; Imamura, Chikako; Sameshima, Hisami; Kawaike, Yoichi; Morioka, Hirofumi

    2018-01-01

The continuing prevalence of a highly heritable and hypo-reproductive extreme tail of a human neurobehavioral quantitative diversity suggests the possibility that the reproductive majority retains the genetic mechanism for the extremes. From the perspective of stochastic epistasis, the effect of an epistatic modifier variant can randomly vary in both phenotypic value and effect direction among carriers depending on genetic individuality, and modifier carriers are ubiquitous in the population distribution. The neutrality of the mean genetic effect among carriers warrants the survival of the variant under selection pressures. Functionally or metabolically related modifier variants make up an epistatic network module, and dozens of modules may be involved in the phenotype. To assess the significance of stochastic epistasis, a simplified module-based model was employed. The individual repertoire of modifier variants in a module also contributes to the genetic individuality that determines the genetic contribution of each modifier in a carrier. Because the entire contribution of a module to the phenotypic outcome is consequently unpredictable in the model, the module effect represents the total contribution of the related modifiers as a stochastic unit in the simulations. As a result, the intrinsic compatibility between distributional robustness and quantitative changeability could be simulated mathematically using the model. The artificial normal distribution shape in large-sized simulations was preserved in each generation even when the lowest-fitness tail was un-reproductive. This robustness of normality across generations is analogous to the real situation of human complex diversity, including neurodevelopmental conditions. The repeated regeneration of the un-reproductive extreme tail may be inevitable for the reproductive majority's competence to survive and change, suggesting implications of the extremes for others. Further model simulations illustrating how the fitness of extreme individuals can remain low through generations may be warranted to increase the credibility of this stochastic epistasis model.

  10. A biologically inspired approach to modeling unmanned vehicle teams

    NASA Astrophysics Data System (ADS)

    Cortesi, Roger S.; Galloway, Kevin S.; Justh, Eric W.

    2008-04-01

    Cooperative motion control of teams of agile unmanned vehicles presents modeling challenges at several levels. The "microscopic equations" describing individual vehicle dynamics and their interaction with the environment may be known fairly precisely, but are generally too complicated to yield qualitative insights at the level of multi-vehicle trajectory coordination. Interacting particle models are suitable for coordinating trajectories, but require care to ensure that individual vehicles are not driven in a "costly" manner. From the point of view of the cooperative motion controller, the individual vehicle autopilots serve to "shape" the microscopic equations, and we have been exploring the interplay between autopilots and cooperative motion controllers using a multivehicle hardware-in-the-loop simulator. Specifically, we seek refinements to interacting particle models in order to better describe observed behavior, without sacrificing qualitative understanding. A recent analogous example from biology involves introducing a fixed delay into a curvature-control-based feedback law for prey capture by an echolocating bat. This delay captures both neural processing time and the flight-dynamic response of the bat as it uses sensor-driven feedback. We propose a comparable approach for unmanned vehicle modeling; however, in contrast to the bat, with unmanned vehicles we have an additional freedom to modify the autopilot. Simulation results demonstrate the effectiveness of this biologically guided modeling approach.

  11. Fisher waves and front roughening in a two-species invasion model with preemptive competition.

    PubMed

    O'Malley, L; Kozma, B; Korniss, G; Rácz, Z; Caraco, T

    2006-10-01

We study front propagation when an invading species competes with a resident; we assume nearest-neighbor preemptive competition for resources in an individual-based, two-dimensional lattice model. The asymptotic front velocity exhibits an effective power-law dependence on the difference between the two species' clonal propagation rates (key ecological parameters). The mean-field approximation behaves similarly, but the power law's exponent slightly differs from the individual-based model's result. We also study roughening of the front, using the framework of nonequilibrium interface growth. Our analysis indicates that initially flat, linear invading fronts exhibit Kardar-Parisi-Zhang (KPZ) roughening in one transverse dimension. Further, this finding implies, and is also confirmed by simulations, that the temporal correction to the asymptotic front velocity is of O(t^(-2/3)).

  12. Testing the role of reward and punishment sensitivity in avoidance behavior: a computational modeling approach

    PubMed Central

    Sheynin, Jony; Moustafa, Ahmed A.; Beck, Kevin D.; Servatius, Richard J.; Myers, Catherine E.

    2015-01-01

    Exaggerated avoidance behavior is a predominant symptom in all anxiety disorders and its degree often parallels the development and persistence of these conditions. Both human and non-human animal studies suggest that individual differences as well as various contextual cues may impact avoidance behavior. Specifically, we have recently shown that female sex and inhibited temperament, two anxiety vulnerability factors, are associated with greater duration and rate of the avoidance behavior, as demonstrated on a computer-based task closely related to common rodent avoidance paradigms. We have also demonstrated that avoidance is attenuated by the administration of explicit visual signals during “non-threat” periods (i.e., safety signals). Here, we use a reinforcement-learning network model to investigate the underlying mechanisms of these empirical findings, with a special focus on distinct reward and punishment sensitivities. Model simulations suggest that sex and inhibited temperament are associated with specific aspects of these sensitivities. Specifically, differences in relative sensitivity to reward and punishment might underlie the longer avoidance duration demonstrated by females, whereas higher sensitivity to punishment might underlie the higher avoidance rate demonstrated by inhibited individuals. Simulations also suggest that safety signals attenuate avoidance behavior by strengthening the competing approach response. Lastly, several predictions generated by the model suggest that extinction-based cognitive-behavioral therapies might benefit from the use of safety signals, especially if given to individuals with high reward sensitivity and during longer safe periods. Overall, this study is the first to suggest cognitive mechanisms underlying the greater avoidance behavior observed in healthy individuals with different anxiety vulnerabilities. PMID:25639540

  13. Simulating Future Changes in Spatio-temporal Precipitation by Identifying and Characterizing Individual Rainstorm Events

    NASA Astrophysics Data System (ADS)

    Chang, W.; Stein, M.; Wang, J.; Kotamarthi, V. R.; Moyer, E. J.

    2015-12-01

A growing body of literature suggests that human-induced climate change may cause significant changes in precipitation patterns, which could in turn influence future flood levels and frequencies and water supply and management practices. Although climate models produce full three-dimensional simulations of precipitation, analyses of model precipitation have focused either on time-averaged distributions or on individual time series with no spatial information. We describe here a new approach based on identifying and characterizing individual rainstorms in either data or model output. Our approach enables us to readily characterize important spatio-temporal aspects of rainstorms including initiation location, intensity (mean and patterns), spatial extent, duration, and trajectory. We apply this technique to high-resolution precipitation over the continental U.S. both from radar-based observations (NCEP Stage IV QPE product, 1-hourly, 4 km spatial resolution) and from model runs with dynamical downscaling (WRF regional climate model, 3-hourly, 12 km spatial resolution). In the model studies we investigate the changes in storm characteristics under a business-as-usual warming scenario to 2100 (RCP 8.5). We find that in these model runs, rainstorm intensity increases as expected with rising temperatures (approximately 7%/K, following increased atmospheric moisture content), while total precipitation increases by a lesser amount (3%/K), consistent with other studies. We identify for the first time the necessary compensating mechanism: in these model runs, individual precipitation events become smaller. Other aspects are approximately unchanged in the warmer climate. Because these spatio-temporal changes in rainfall patterns would impact regional hydrology, it is important that they be accurately incorporated into any impacts assessment. 
For this purpose we have developed a methodology for producing scenarios of future precipitation that combine observational data and model-projected changes. We statistically describe the future changes in rainstorm characteristics suggested by the WRF model and apply those changes to observational data. The resulting high spatial and temporal resolution scenarios have immediate applications for impacts assessment and adaptation studies.
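The identification step, finding individual rainstorms as contiguous regions of precipitating grid cells, can be sketched with a simple connected-component pass. This is a minimal 4-connectivity flood fill on one time slice; the actual method also links components across time steps to recover duration and trajectory.

```python
from collections import deque

def label_storms(grid, thresh=0.0):
    """Label 4-connected components of grid cells exceeding `thresh`.
    Returns a dict component_id -> list of (row, col) cells."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    storms = {}
    next_id = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > thresh and not seen[r][c]:
                cells, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:  # breadth-first flood fill
                    i, j = queue.popleft()
                    cells.append((i, j))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and grid[ni][nj] > thresh and not seen[ni][nj]):
                            seen[ni][nj] = True
                            queue.append((ni, nj))
                storms[next_id] = cells
                next_id += 1
    return storms
```

Per-storm statistics such as spatial extent (cell count) and mean intensity then follow directly from each component's cell list.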

  14. EnsembleGraph: Interactive Visual Analysis of Spatial-Temporal Behavior for Ensemble Simulation Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shu, Qingya; Guo, Hanqi; Che, Limei

We present a novel visualization framework, EnsembleGraph, for analyzing ensemble simulation data, in order to help scientists understand behavior similarities between ensemble members over space and time. A graph-based representation is used to visualize individual spatiotemporal regions with similar behaviors, which are extracted by hierarchical clustering algorithms. A user interface with multiple linked views enables users to explore, locate, and compare regions that have similar behaviors between ensemble members, and then investigate and analyze the selected regions in detail. The driving application of this paper is the study of regional emission influences on tropospheric ozone, based on ensemble simulations conducted with different anthropogenic emission absences using the MOZART-4 (Model of Ozone and Related Tracers, version 4) model. We demonstrate the effectiveness of our method by visualizing the MOZART-4 ensemble simulation data and evaluating the relative regional emission influences on tropospheric ozone concentrations. Positive feedback from domain experts and two case studies demonstrate the efficiency of our method.

  15. The Influence of Aerosol Hygroscopicity on Precipitation Intensity During a Mesoscale Convective Event

    NASA Astrophysics Data System (ADS)

    Kawecki, Stacey; Steiner, Allison L.

    2018-01-01

We examine how aerosol composition affects precipitation intensity using the Weather Research and Forecasting model with Chemistry (WRF-Chem, version 3.6). By changing the prescribed default hygroscopicity values to updated values from laboratory studies, we test model assumptions about the individual component hygroscopicity values of ammonium, sulfate, nitrate, and organic species. We compare a baseline simulation (BASE, using default hygroscopicity values) with four sensitivity simulations (SULF, increasing the sulfate hygroscopicity; ORG, decreasing organic hygroscopicity; SWITCH, using a concentration-dependent hygroscopicity value for ammonium; and ALL, including all three changes) to understand the role of aerosol composition on precipitation during a mesoscale convective system (MCS). Overall, the hygroscopicity changes influence the spatial patterns and intensity of precipitation. Focusing on the maximum precipitation in the model domain downwind of an urban area, we find that changing the individual component hygroscopicities leads to bulk hygroscopicity changes, especially in the ORG simulation. Reducing bulk hygroscopicity (e.g., ORG simulation) initially causes fewer activated drops, weakened updrafts in the midtroposphere, and increased precipitation from larger hydrometeors. Increasing bulk hygroscopicity (e.g., SULF simulation) simulates more numerous and smaller cloud drops and increases precipitation. In the ALL simulation, a stronger cold pool and downdrafts lead to precipitation suppression later in the MCS evolution. In this downwind region, the combined changes in hygroscopicity (ALL) reduce the overprediction of intense events (>70 mm d-1) and better capture the range of moderate-intensity (30-60 mm d-1) events. The results of this single MCS analysis suggest that aerosol composition can play an important role in simulating high-intensity precipitation events.
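Bulk hygroscopicity here refers to the volume-weighted combination of component kappa values, the standard mixing rule of kappa-Köhler theory; the component values in the usage note below are illustrative, not the study's.

```python
def bulk_kappa(volume_fractions, kappas):
    """Volume-weighted mixing rule: kappa_bulk = sum_i eps_i * kappa_i,
    where eps_i are the dry-volume fractions of the aerosol components."""
    if abs(sum(volume_fractions) - 1.0) > 1e-9:
        raise ValueError("volume fractions must sum to 1")
    return sum(eps * k for eps, k in zip(volume_fractions, kappas))
```

For instance, a particle that is half sulfate (kappa roughly 0.6) and half organic (kappa roughly 0.1) by volume has a bulk kappa of 0.35, so lowering the organic kappa alone (as in the ORG run) visibly lowers the bulk value.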

  16. Lagrangian Particle Tracking Simulation for Warm-Rain Processes in Quasi-One-Dimensional Domain

    NASA Astrophysics Data System (ADS)

    Kunishima, Y.; Onishi, R.

    2017-12-01

Conventional cloud simulations are based on the Euler method and compute each microphysics process in a stochastic way, assuming infinite numbers of particles within each numerical grid cell. They therefore cannot provide the Lagrangian statistics of individual particles in cloud microphysics (i.e., aerosol particles, cloud particles, and rain drops), nor can they address the statistical fluctuations due to finite numbers of particles. Here we simulate the entire precipitation process of warm rain while tracking individual particles. We use the Lagrangian Cloud Simulator (LCS), which is based on the Euler-Lagrangian framework. In that framework, flow motion and scalar transport are computed with the Euler method, and particle motion with the Lagrangian one. The LCS tracks particle motions and collision events individually, accounting for the hydrodynamic interaction between approaching particles with a superposition method; that is, it can directly represent the collisional growth of cloud particles. Taking account of the hydrodynamic interaction is essential for trustworthy collision detection. In this study, we newly developed a stochastic model based on Twomey cloud condensation nuclei (CCN) activation for the Lagrangian tracking simulation and integrated it into the LCS. Coupled with the Euler computation of the water vapour and temperature fields, the initiation and condensational growth of water droplets were computed in the Lagrangian way. We applied the integrated LCS to a kinematic simulation of warm-rain processes in a vertically elongated domain of, at largest, 0.03×0.03×3000 m^3 with horizontal periodicity. Aerosol particles with a realistic number density of 5×10^7 m^-3 were evenly distributed over the domain in the initial state. A prescribed updraft at the early stage initiated the development of a precipitating cloud. 
We have confirmed that the obtained bulk statistics agree fairly well with those from a conventional spectral-bin scheme for a vertical column domain. The centre of the discussion will be the Lagrangian statistics collected from the individual behaviour of the tracked particles.

  17. Designing a podiatry service to meet the needs of the population: a service simulation.

    PubMed

    Campbell, Jackie A

    2007-02-01

    A model of a podiatry service has been developed which takes into consideration the effect of changing access criteria, skill mix and staffing levels (among others) given fixed local staffing budgets and the foot-health characteristics of the local community. A spreadsheet-based deterministic model was chosen to allow maximum transparency of programming. This work models a podiatry service in England, but could be adapted for other settings and, with some modification, for other community-based services. This model enables individual services to see the effect on outcome parameters such as number of patients treated, number discharged and size of waiting lists of various service configurations, given their individual local data profile. The process of designing the model has also had spin-off benefits for the participants in making explicit many of the implicit rules used in managing their services.
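A deterministic capacity-and-waiting-list model of this general kind can be sketched in a few lines; the referral and staffing numbers below are hypothetical, and the actual spreadsheet model also encodes access criteria, skill mix and discharge rules that this toy omits.

```python
def simulate_service(weeks, referrals_per_week, staff, appts_per_staff_week,
                     initial_waiting=0):
    """Toy deterministic service model: each week, new referrals join the
    waiting list and up to staff * appts_per_staff_week patients are treated.
    Returns (final waiting-list size, total patients treated)."""
    waiting = initial_waiting
    treated_total = 0
    capacity = staff * appts_per_staff_week
    for _ in range(weeks):
        waiting += referrals_per_week     # new referrals arrive
        treated = min(waiting, capacity)  # capacity caps weekly throughput
        waiting -= treated
        treated_total += treated
    return waiting, treated_total
```

Running scenarios side by side shows the kind of lever such a model gives managers: with 100 referrals a week, five staff at 25 appointments each clear the demand, while two staff leave a waiting list growing by 50 patients a week.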

  18. Leveraging social influence to address overweight and obesity using agent-based models: the role of adolescent social networks.

    PubMed

    Zhang, J; Tong, L; Lamberson, P J; Durazo-Arvizu, R A; Luke, A; Shoham, D A

    2015-01-01

The prevalence of adolescent overweight and obesity (hereafter, simply "overweight") in the US has increased over the past several decades. Individually-targeted prevention and treatment strategies have been disappointing, leading some to propose leveraging social networks to improve interventions. We hypothesized that social network dynamics (social marginalization; homophily on body mass index, BMI) and the strength of peer influence would increase or decrease the proportion of network members (agents) becoming overweight over a simulated year, and that peer influence would operate differently in social networks with greater overweight. We built an agent-based model (ABM) using results from R-SIENA. ABMs allow for the exploration of potential interventions using simulated agents. Initial model specifications were drawn from Wave 1 of the National Longitudinal Study of Adolescent Health (Add Health). We focused on a single saturation school with complete network and BMI data over two waves (n = 624). The model was validated against empirical observations at Wave 2. We focused on overall overweight prevalence after a simulated year. Five experiments were conducted: (1) changing the attractiveness of high-BMI agents; (2) changing homophily on BMI; (3) changing the strength of peer influence; (4) shifting the overall BMI distribution; and (5) targeting dietary interventions to highly connected individuals. Increasing peer influence showed a dramatic decrease in the prevalence of overweight; making peer influence negative (i.e., doing the opposite of friends) increased overweight. However, the effect of peer influence varied based on the underlying distribution of BMI; when BMI was increased overall, stronger peer influence increased the proportion of overweight. Other interventions, including targeted dieting, had little impact. 
Peer influence may be a viable target in overweight interventions, but the distribution of body size in the population needs to be taken into account. In low-obesity populations, strengthening peer influence may be a useful strategy. Copyright © 2014 Elsevier Ltd. All rights reserved.
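The peer-influence mechanism central to these experiments can be illustrated with a deliberately minimal toy. This is not R-SIENA and is not fitted to Add Health data; the ring network, the linear update rule and the overweight cutoff are all assumptions.

```python
def simulate_peer_influence(bmis, neighbors, alpha, steps):
    """Each step, every agent moves a fraction alpha toward the mean BMI
    of its network neighbors; negative alpha models 'doing the opposite
    of friends'."""
    bmis = list(bmis)
    for _ in range(steps):
        bmis = [
            b + alpha * (sum(bmis[j] for j in neighbors[i]) / len(neighbors[i]) - b)
            for i, b in enumerate(bmis)
        ]
    return bmis

def overweight_fraction(bmis, cutoff=25.0):
    """Share of agents strictly above the BMI cutoff."""
    return sum(b > cutoff for b in bmis) / len(bmis)
```

On a ten-agent ring alternating between BMI 20 and 30, positive peer influence pulls everyone toward the network mean of 25, so the overweight fraction falls; shifting the whole initial distribution upward makes the same convergence raise it instead, echoing the finding that the effect of peer influence depends on the underlying BMI distribution.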

  19. Leveraging social influence to address overweight and obesity using agent-based models: the role of adolescent social networks

    PubMed Central

    Zhang, J; Tong, L; Lamberson, PJ; Durazo, R; Luke, A; Shoham, DA

    2014-01-01

The prevalence of adolescent overweight and obesity (hereafter, simply “overweight”) in the US has increased over the past several decades. Individually-targeted prevention and treatment strategies have been disappointing, leading some to propose leveraging social networks to improve interventions. We hypothesized that social network dynamics (social marginalization; homophily on body mass index, BMI) and the strength of peer influence would increase or decrease the proportion of network members (agents) becoming overweight over a simulated year, and that peer influence would operate differently in social networks with greater overweight. We built an agent-based model (ABM) using results from R-SIENA. ABMs allow for the exploration of potential interventions using simulated agents. Initial model specifications were drawn from Wave 1 of the National Longitudinal Study of Adolescent Health (Add Health). We focused on a single saturation school with complete network and BMI data over two waves (n=624). The model was validated against empirical observations at Wave 2. We focused on overall overweight prevalence after a simulated year. Five experiments were conducted: (1) changing the attractiveness of high-BMI agents; (2) changing homophily on BMI; (3) changing the strength of peer influence; (4) shifting the overall BMI distribution; and (5) targeting dietary interventions to highly connected individuals. Increasing peer influence showed a dramatic decrease in the prevalence of overweight; making peer influence negative (i.e., doing the opposite of friends) increased overweight. However, the effect of peer influence varied based on the underlying distribution of BMI; when BMI was increased overall, stronger peer influence increased the proportion of overweight. Other interventions, including targeted dieting, had little impact. 
Peer influence may be a viable target in overweight interventions, but the distribution of body size in the population needs to be taken into account. In low-obesity populations, strengthening peer influence may be a useful strategy. PMID:24951404
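The peer-influence mechanism described in this record can be sketched in a few lines. This is a hypothetical toy (the network, parameters, and update rule are invented for illustration, not the fitted R-SIENA model):

```python
import random

def simulate_peer_influence(n_agents=200, n_friends=5, influence=0.3,
                            steps=52, seed=1):
    """Toy peer-influence ABM: each week, every agent nudges its BMI
    toward the mean BMI of its friends by a fraction `influence`
    (self-ties are possible in this toy network).  Returns the final
    proportion of overweight agents (BMI >= 25)."""
    rng = random.Random(seed)
    bmi = [rng.gauss(23.0, 4.0) for _ in range(n_agents)]
    friends = [rng.sample(range(n_agents), n_friends) for _ in range(n_agents)]
    for _ in range(steps):
        bmi = [b + influence * (sum(bmi[j] for j in fr) / n_friends - b)
               for b, fr in zip(bmi, friends)]
    return sum(1 for b in bmi if b >= 25.0) / n_agents

# In a population whose mean BMI sits below the overweight cutoff,
# positive peer influence pulls agents toward the network mean and
# lowers overweight prevalence relative to no influence at all.
with_influence = simulate_peer_influence(influence=0.3)
no_influence = simulate_peer_influence(influence=0.0)
```

Shifting the initial BMI distribution upward (so the network mean exceeds the cutoff) reverses the effect, which is the qualitative interaction the abstract reports.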

  20. FARSITE: Fire Area Simulator-model development and evaluation

    Treesearch

    Mark A. Finney

    1998-01-01

A computer simulation model, FARSITE, incorporates existing fire behavior models for surface fire, crown fire, spotting, point-source fire acceleration, and fuel moisture. The model's components and assumptions are documented. Simulations were run for simple conditions that illustrate the effect of individual fire behavior models on two-dimensional fire growth.

  1. Variance decomposition in stochastic simulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le Maître, O. P., E-mail: olm@limsi.fr; Knio, O. M., E-mail: knio@duke.edu; Moraes, A., E-mail: alvaro.moraesgutierrez@kaust.edu.sa

This work aims at the development of a mathematical and computational approach that enables quantification of the inherent sources of stochasticity and of the corresponding sensitivities in stochastic simulations of chemical reaction networks. The approach is based on reformulating the system dynamics as being generated by independent standardized Poisson processes. This reformulation affords a straightforward identification of individual realizations for the stochastic dynamics of each reaction channel, and consequently a quantitative characterization of the inherent sources of stochasticity in the system. By relying on the Sobol-Hoeffding decomposition, the reformulation enables us to perform an orthogonal decomposition of the solution variance. Thus, by judiciously exploiting the inherent stochasticity of the system, one is able to quantify the variance-based sensitivities associated with individual reaction channels, as well as the importance of channel interactions. Implementation of the algorithms is illustrated in light of simulations of simplified systems, including the birth-death, Schlögl, and Michaelis-Menten models.
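A minimal sketch of the reformulation for the birth-death model: each reaction channel is driven by its own independent unit-rate Poisson stream (here via the modified next-reaction method), so a nested Monte Carlo loop can estimate the variance contribution of a single channel. Rates, seeds, and sample sizes are illustrative assumptions:

```python
import math
import random

def birth_death_nrm(T=5.0, c1=10.0, c2=1.0, seed_birth=0, seed_death=0):
    """Birth-death process X(t): birth at rate c1, death at rate c2*X.
    Each channel has its own RNG stream driving its unit-rate Poisson
    process, so the randomness of the channels can be varied separately."""
    rng = [random.Random(f"birth-{seed_birth}"),
           random.Random(f"death-{seed_death}")]
    x, t = 0, 0.0
    internal = [0.0, 0.0]                          # internal channel times
    nxt = [-math.log(r.random()) for r in rng]     # next unit-Poisson firings
    while True:
        a = [c1, c2 * x]                           # channel propensities
        dt = [(nxt[k] - internal[k]) / a[k] if a[k] > 0 else math.inf
              for k in range(2)]
        k = 0 if dt[0] <= dt[1] else 1
        if t + dt[k] > T:
            return x
        t += dt[k]
        internal = [internal[j] + a[j] * dt[k] for j in range(2)]
        x += 1 if k == 0 else -1
        nxt[k] += -math.log(rng[k].random())

def sobol_first_order(outer=40, inner=20):
    """Nested Monte Carlo estimate of the birth channel's contribution:
    variance over birth streams of the mean over death streams."""
    cond = []
    for i in range(outer):
        vals = [birth_death_nrm(seed_birth=i, seed_death=1000 + i * inner + j)
                for j in range(inner)]
        cond.append(sum(vals) / inner)
    m = sum(cond) / outer
    return m, sum((c - m) ** 2 for c in cond) / (outer - 1)
```

The same scaffold, with the roles of the two streams swapped, yields the death channel's first-order contribution; the remainder of the total variance is attributable to channel interaction.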

  2. A computational cognitive model of self-efficacy and daily adherence in mHealth.

    PubMed

    Pirolli, Peter

    2016-12-01

Mobile health (mHealth) applications provide an excellent opportunity for collecting the rich, fine-grained data necessary for understanding and predicting day-to-day health behavior change dynamics. A computational predictive model (ACT-R-DStress) is presented and fit to individual daily adherence in 28-day mHealth exercise programs. The ACT-R-DStress model refines the psychological construct of self-efficacy. To explain the dynamics of self-efficacy and predict individual performance of targeted behaviors, the self-efficacy construct is implemented as a theory-based neurocognitive simulation of the interaction of behavioral goals, memories of past experiences, and behavioral performance.

  3. Monte Carlo simulations of parapatric speciation

    NASA Astrophysics Data System (ADS)

    Schwämmle, V.; Sousa, A. O.; de Oliveira, S. M.

    2006-06-01

Parapatric speciation is studied using an individual-based model with sexual reproduction. We combine the theory of mutation accumulation for biological ageing with an environmental selection pressure that varies according to the individuals' geographical positions and phenotypic traits. Fluctuations and the genetic diversity of large populations are crucial ingredients for modeling the features of evolutionary branching and are intrinsic properties of the model. Its implementation on a spatial lattice gives interesting insights into the population dynamics of speciation on a geographical landscape and the disruptive selection that leads to the divergence of phenotypes. Our results suggest that assortative mating is not an obligatory ingredient for obtaining speciation in large populations at low gene flow.
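A deliberately simplified, hypothetical individual-based sketch of the core mechanism (spatially varying selection driving phenotypic divergence despite migration). It is asexual and far cruder than the paper's model; every parameter is invented:

```python
import math
import random

def phenotype_divergence(n=300, length=20, gens=150, sel=4.0,
                         mut=0.05, seed=3):
    """Individuals on a 1D lattice carry a phenotype in [0, 1]; the
    local optimum is 0 on the left half and 1 on the right.  Offspring
    migrate at most one site, mutate slightly, and survive with a
    probability that decays with distance from the local optimum.
    Returns the mean phenotype in each half of the lattice."""
    rng = random.Random(seed)
    pop = [(rng.randrange(length), rng.random()) for _ in range(n)]
    for _ in range(gens):
        nxt = []
        while len(nxt) < n:
            pos, phen = rng.choice(pop)
            pos = max(0, min(length - 1, pos + rng.choice((-1, 0, 1))))
            phen = min(1.0, max(0.0, phen + rng.gauss(0.0, mut)))
            optimum = 0.0 if pos < length // 2 else 1.0
            if rng.random() < math.exp(-sel * (phen - optimum) ** 2):
                nxt.append((pos, phen))       # offspring survives selection
        pop = nxt
    left = [p for x, p in pop if x < length // 2]
    right = [p for x, p in pop if x >= length // 2]
    return sum(left) / len(left), sum(right) / len(right)
```

With strong selection the two halves diverge toward their local optima even though migration keeps exchanging individuals across the boundary, echoing the low-gene-flow result of the abstract.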

  4. Newtonian chimpanzees? A molecular dynamics approach to understanding decision-making by wild chimpanzees

    NASA Astrophysics Data System (ADS)

    Westley, Matthew; Sen, Surajit; Sinha, Anindya

    2014-07-01

In this study, we computationally investigate decision-making by individuals and the ensuing social structure of a primate species, the chimpanzee, using Newton's equations of classical mechanics, as opposed to agent-based analyses in which individual chimpanzees make independent decisions. Our model uses molecular dynamics simulation techniques to solve Newton's equations and is able to approximate the movements of female and male chimpanzees, especially in relation to the available food resources, in a manner that is consistent with their observed behaviour in natural habitats. It is noteworthy that our Newtonian dynamics-based model may allow us to make certain specific observations of their behaviour, some of which may be difficult to achieve through agent-based modelling exercises or even field studies. Chimpanzees tend to live in fission-fusion social groups, with varying numbers of individuals, in which both females and males tend to display intrasexual competition for valuable food resources while the males also compete for oestrus females. Most populations of the species are also restricted to a small range of habitats, a clear indication that they are especially vulnerable to the availability and distribution of food sources. With reasonable assumptions about chimpanzee behaviour, we have been able to analyse the clustering behaviour of individuals in relation to local food sources, as well as patterns of their migration across groups. Our simulated results are qualitatively consistent with field observations of a particular semi-isolated population of chimpanzees at Bossou, Guinea, in western Africa.
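The Newtonian idea can be illustrated with a damped, spring-like attraction toward the nearest food source, integrated with a velocity-Verlet-style scheme. The force law, parameters, and integrator details below are illustrative assumptions, not the authors' model:

```python
def run_agents(positions, velocities, food, steps=200, dt=0.05,
               k=1.0, damping=0.5):
    """Each agent feels a spring-like pull toward its nearest food
    source plus a velocity-damping force, integrated with a simple
    velocity-Verlet-style step (the velocity-dependent damping makes
    the scheme only approximately symplectic, which is acceptable for
    a qualitative sketch)."""
    def accel(p, v):
        fx, fy = min(food, key=lambda f: (f[0] - p[0]) ** 2 + (f[1] - p[1]) ** 2)
        return (k * (fx - p[0]) - damping * v[0],
                k * (fy - p[1]) - damping * v[1])
    for _ in range(steps):
        new_p, new_v = [], []
        for p, v in zip(positions, velocities):
            a = accel(p, v)
            p2 = (p[0] + v[0] * dt + 0.5 * a[0] * dt * dt,
                  p[1] + v[1] * dt + 0.5 * a[1] * dt * dt)
            a2 = accel(p2, v)             # reuse old velocity in the half-step
            v2 = (v[0] + 0.5 * (a[0] + a2[0]) * dt,
                  v[1] + 0.5 * (a[1] + a2[1]) * dt)
            new_p.append(p2)
            new_v.append(v2)
        positions, velocities = new_p, new_v
    return positions

food = [(0.0, 0.0), (10.0, 3.0)]          # two hypothetical food patches
final = run_agents([(5.0, 5.0)], [(0.0, 0.0)], food)
```

With many agents and several food patches, the same force rule produces the clustering around local food sources that the abstract describes.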

  5. Multi-day activity scheduling reactions to planned activities and future events in a dynamic model of activity-travel behavior

    NASA Astrophysics Data System (ADS)

    Nijland, Linda; Arentze, Theo; Timmermans, Harry

    2014-01-01

Modeling multi-day planning has so far received little attention in activity-based transport demand modeling; however, new dynamic activity-based approaches are currently being developed. The frequency and inflexibility of planned activities and events in individuals' activity schedules indicate the importance of incorporating such pre-planned activities into the new generation of dynamic travel demand models. Elaborating and combining previous work on event-driven activity generation, the aim of this paper is to develop and illustrate an extension of a need-based model of activity generation that takes into account possible influences of pre-planned activities and events. This paper describes the theory and shows the results of simulations with the extension. The simulation was conducted for six different activities, and the parameter values used were consistent with an earlier estimation study. The results show that the model works well and that the influences of the parameters are consistent, logical, and clearly interpretable. These findings offer further evidence of the face and construct validity of the suggested modeling approach.

  6. Multi-objective optimization for generating a weighted multi-model ensemble

    NASA Astrophysics Data System (ADS)

    Lee, H.

    2017-12-01

Many studies have demonstrated that multi-model ensembles generally show better skill than any individual ensemble member. When generating weighted multi-model ensembles, the first step is measuring the performance of individual model simulations against observations. Weighting factors are conventionally assigned on the basis of a single evaluation metric: the weighting factor for each model is proportional to a performance score, or inversely proportional to an error, for the model. While this conventional approach can provide appropriate combinations of multiple models, it faces a major challenge when multiple metrics are under consideration: a simple average of performance scores or model ranks does not address the trade-off between conflicting metrics. So far, there is no established best method for generating weighted multi-model ensembles from multiple performance metrics. The current study applies multi-objective optimization, a mathematical process that provides a set of optimal trade-off solutions across a range of evaluation metrics, to combine multiple performance metrics for global climate models and their dynamically downscaled regional climate simulations over North America and to generate a weighted multi-model ensemble. NASA satellite data and the Regional Climate Model Evaluation System (RCMES) software toolkit are used to assess the climate simulations. Overall, the performance of each model differs markedly, with strong seasonal dependence. Because of the considerable variability across the climate simulations, it is important to evaluate models systematically and to make future projections by assigning optimized weighting factors to the models with relatively good performance. 
Our results indicate that the optimally weighted multi-model ensemble always shows better performance than an arithmetic ensemble mean and may provide reliable future projections.
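The trade-off problem can be made concrete with a hypothetical sketch of the non-dominated (Pareto) filtering step; the inverse-error weighting rule used here is an invented stand-in for the study's optimization:

```python
def pareto_weights(errors):
    """Given each model's error under several metrics (lower is better),
    keep only Pareto-optimal (non-dominated) models and weight them by
    inverse total error, normalized to sum to one."""
    n_metrics = len(errors[0])

    def dominated(i):
        # model i is dominated if some model j is at least as good on
        # every metric and strictly better on at least one
        return any(
            all(errors[j][m] <= errors[i][m] for m in range(n_metrics))
            and any(errors[j][m] < errors[i][m] for m in range(n_metrics))
            for j in range(len(errors)) if j != i)

    front = [i for i in range(len(errors)) if not dominated(i)]
    inv = {i: 1.0 / sum(errors[i]) for i in front}
    total = sum(inv.values())
    return {i: inv[i] / total for i in front}

# Four hypothetical models scored on two conflicting metrics
# (e.g., an RMSE-like and a bias-like error):
weights = pareto_weights([(1.0, 2.0), (2.0, 1.0), (1.5, 1.5), (3.0, 3.0)])
```

Model 3, dominated on both metrics, receives zero weight; simple score averaging would instead have given it a small positive weight.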

  7. Multi-Scale Spatio-Temporal Modeling: Lifelines of Microorganisms in Bioreactors and Tracking Molecules in Cells

    NASA Astrophysics Data System (ADS)

    Lapin, Alexei; Klann, Michael; Reuss, Matthias

Agent-based models are rigorous tools for simulating the interactions of individual entities, such as organisms or molecules within cells, and for assessing their effects on the dynamic behavior of the system as a whole. In the context of bioprocess and biosystems engineering there are several interesting and important applications. This contribution introduces the strategy with the aid of two examples, characterized by striking differences in the scale of the individual entities and in the mode of their interactions. In the first example, a structured-segregated model is applied to travel along the lifelines of single cells in the environment of a three-dimensional turbulent field of a stirred bioreactor. The modeling approach is based on an Euler-Lagrange formulation of the system. The strategy permits one to account for the heterogeneity present in real reactors in both the fluid and cellular phases. The individual response of the cells to local variations in the extracellular concentrations is captured by a dynamically structured model of the key reactions of the central metabolism. The approach permits analysis of the lifelines of individual cells in space and time.

  8. Inverse sampling regression for pooled data.

    PubMed

    Montesinos-López, Osval A; Montesinos-López, Abelardo; Eskridge, Kent; Crossa, José

    2017-06-01

Because group testing tests pools rather than individuals, the technique is helpful for estimating prevalence in a population or for classifying a large number of individuals into two groups at low cost. For this reason, group testing is a well-known means of saving costs and producing precise estimates. In this paper, we developed a mixed-effect group testing regression that is useful when the data-collecting process is performed using inverse sampling. This model allows covariate information to be included at the individual level, both to incorporate heterogeneity among individuals and to identify which covariates are associated with positive individuals. We present an approach to fitting this model using maximum likelihood and report a simulation study evaluating the quality of the estimates. Based on the simulation study, we found that the proposed regression method for inverse sampling with group testing produces parameter estimates with low bias when the pre-specified number of positive pools (r) at which the sampling process stops is at least 10 and the number of clusters in the sample is also at least 10. We present an application with real data and provide NLMIXED code that researchers can use to implement this method.
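The inverse-sampling idea can be illustrated with a simplified prevalence estimator (no covariates, clusters, or random effects, unlike the paper's regression; all parameters are hypothetical):

```python
import random

def estimate_prevalence(p_true=0.05, pool_size=10, r=10, seed=0):
    """Inverse (negative binomial) sampling: pools of `pool_size`
    individuals are tested until `r` positive pools are observed; the
    pool positivity rate is then back-transformed to an estimate of
    individual-level prevalence."""
    rng = random.Random(seed)
    positives = pools = 0
    while positives < r:
        pools += 1
        if any(rng.random() < p_true for _ in range(pool_size)):
            positives += 1                 # pool is positive if any member is
    theta = r / pools                      # estimated pool positivity
    return 1.0 - (1.0 - theta) ** (1.0 / pool_size)

# Average many replicates to see that bias is small when r >= 10:
estimates = [estimate_prevalence(seed=s) for s in range(200)]
p_hat = sum(estimates) / len(estimates)
```

The small residual upward bias of `r / pools` shrinks as r grows, consistent with the abstract's finding that r of at least 10 keeps bias low.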

  9. War-gaming application for future space systems acquisition: MATLAB implementation of war-gaming acquisition models and simulation results

    NASA Astrophysics Data System (ADS)

    Vienhage, Paul; Barcomb, Heather; Marshall, Karel; Black, William A.; Coons, Amanda; Tran, Hien T.; Nguyen, Tien M.; Guillen, Andy T.; Yoh, James; Kizer, Justin; Rogers, Blake A.

    2017-05-01

The paper describes the MATLAB (MathWorks) programs that were developed during the REU workshop to implement The Aerospace Corporation's Unified Game-based Acquisition Framework and Advanced Game-based Mathematical Framework (UGAF-AGMF) and its associated War-Gaming Engine (WGE) models. Each game can be played from the perspective of the Department of Defense Acquisition Authority (DAA) or of an individual contractor (KTR). The programs also implement Aerospace's optimum "Program and Technical Baseline (PTB) and associated acquisition" strategy, which combines low Total Ownership Cost (TOC) with innovative designs while still meeting warfighter needs. The paper also describes the Bayesian Acquisition War-Gaming approach, which uses Monte Carlo simulation, a numerical analysis technique for accounting for uncertainty in decision making, to simulate the PTB development and acquisition processes, and details the implementation procedure and the interactions between the games.

  10. Comparison of Cox's and relative survival models when estimating the effects of prognostic factors on disease-specific mortality: a simulation study under proportional excess hazards.

    PubMed

    Le Teuff, Gwenaël; Abrahamowicz, Michal; Bolard, Philippe; Quantin, Catherine

    2005-12-30

In many prognostic studies focusing on the mortality of persons affected by a particular disease, the cause of death of individual patients is not recorded. In such situations, conventional survival analytical methods, such as the Cox proportional hazards regression model, do not allow one to discriminate the effects of prognostic factors on disease-specific mortality from their effects on all-cause mortality. In the last decade, the relative survival approach has been proposed to deal with analyses involving population-based cancer registries, where missing information on the cause of death is very common. However, some questions remain open regarding the ability of relative survival methods to accurately discriminate between the two sources of mortality. In order to systematically assess the performance of the relative survival model proposed by Esteve et al., and to quantify its potential advantages over Cox model analyses, we carried out a series of simulation experiments based on the population-based colon cancer registry of the French region of Burgundy. Simulations showed a systematic bias induced by 'crude' conventional Cox model analyses when individual causes of death are unknown. In simulations where only about 10 per cent of patients died of causes other than colon cancer, the Cox model over-estimated the effects of male gender and the oldest age category by about 17 and 13 per cent, respectively, with the coverage rate of the 95 per cent CI for the latter estimate as low as 65 per cent. In contrast, the effects of higher cancer stages were under-estimated by 8-28 per cent. The relative survival model largely reduced such problems and handled well even such challenging tasks as separating the opposite effects of the same variable on cancer-related versus other-cause mortality. 
Specifically, in all the cases discussed above, the relative bias in the estimates from the Esteve et al. model was always below 10 per cent, with coverage rates above 81 per cent. Copyright 2005 John Wiley & Sons, Ltd.
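The attenuation described in this record can be reproduced with a minimal exponential-hazards simulation. Crude rate ratios stand in for fitted Cox models, and all rates are hypothetical:

```python
import math
import random

def crude_vs_cause_specific(n=20000, true_hr=2.0, h_dis=0.05,
                            h_pop=0.05, seed=11):
    """A binary factor doubles the disease hazard, while a background
    (other-cause) hazard is identical in both groups.  The crude
    all-cause log rate ratio is attenuated toward zero; the
    cause-specific one (causes of death known) recovers the truth."""
    rng = random.Random(seed)
    out = {}
    for all_cause in (True, False):
        rates = []
        for x in (0, 1):
            hd = h_dis * (true_hr if x else 1.0)
            deaths, time = 0, 0.0
            for _ in range(n):
                t_dis = rng.expovariate(hd)      # time to disease death
                t_pop = rng.expovariate(h_pop)   # time to other-cause death
                time += min(t_dis, t_pop)
                if all_cause or t_dis < t_pop:
                    deaths += 1
            rates.append(deaths / time)
        out[all_cause] = math.log(rates[1] / rates[0])
    return out[True], out[False]                 # crude, cause-specific

crude, cause_specific = crude_vs_cause_specific()
```

With these rates the crude log rate ratio converges to log 1.5 rather than the true log 2, the same direction of bias the simulations in the abstract report for the Cox model without cause-of-death information.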

  11. Simulation skill of APCC set of global climate models for Asian summer monsoon rainfall variability

    NASA Astrophysics Data System (ADS)

    Singh, U. K.; Singh, G. P.; Singh, Vikas

    2015-04-01

The performance of 11 Asia-Pacific Economic Cooperation Climate Center (APCC) global climate models (both coupled and uncoupled) in simulating seasonal summer (June-August) monsoon rainfall variability over Asia (especially over India and East Asia) has been evaluated in detail using hind-cast data (3 months in advance) generated by APCC, which provides regional climate information products and services based on multi-model ensemble dynamical seasonal prediction systems. The skill of each global climate model over Asia was tested separately in detail for a period of 21 years (1983-2003), and the simulated Asian summer monsoon rainfall (ASMR) was verified using various statistical measures for the Indian and East Asian land masses separately. The analysis found larger variation in the spatial ASMR simulated by the uncoupled models than by the coupled models (such as the Predictive Ocean Atmosphere Model for Australia, the National Centers for Environmental Prediction model, and the Japan Meteorological Agency model). The ASMR simulated by the coupled models was closer to the Climate Prediction Centre Merged Analysis of Precipitation (CMAP) than that of the uncoupled models, although both underestimated ASMR. The analysis also found a high spread in simulated ASMR among the ensemble members, suggesting that model performance is highly dependent on initial conditions. The correlation analysis between sea surface temperature (SST) and ASMR shows that the coupled models are more strongly associated with ASMR than the uncoupled models, suggesting that air-sea interaction is well represented in the coupled models. The analysis of rainfall using various statistical measures suggests that the multi-model ensemble (MME) performed better than any individual model, and that treating the Indian and East Asian land masses separately is more useful than treating Asian monsoon rainfall as a whole. 
The results of various statistical measures (the skill of the multi-model ensemble, the large spread among ensemble members of individual models, the strong teleconnection with SST shown by correlation analysis, the coefficient of variation, inter-annual variability, and Taylor diagram analysis) suggest that improving the coupled models, rather than the uncoupled models, is the path toward a better dynamical seasonal forecast system.

  12. Computational simulation of the creep-rupture process in filamentary composite materials

    NASA Technical Reports Server (NTRS)

    Slattery, Kerry T.; Hackett, Robert M.

    1991-01-01

    A computational simulation of the internal damage accumulation which causes the creep-rupture phenomenon in filamentary composite materials is developed. The creep-rupture process involves complex interactions between several damage mechanisms. A statistically-based computational simulation using a time-differencing approach is employed to model these progressive interactions. The finite element method is used to calculate the internal stresses. The fibers are modeled as a series of bar elements which are connected transversely by matrix elements. Flaws are distributed randomly throughout the elements in the model. Load is applied, and the properties of the individual elements are updated at the end of each time step as a function of the stress history. The simulation is continued until failure occurs. Several cases, with different initial flaw dispersions, are run to establish a statistical distribution of the time-to-failure. The calculations are performed on a supercomputer. The simulation results compare favorably with the results of creep-rupture experiments conducted at the Lawrence Livermore National Laboratory.

  13. Construction and calibration of a groundwater-flow model to assess groundwater availability in the uppermost principal aquifer systems of the Williston Basin, United States and Canada

    USGS Publications Warehouse

    Davis, Kyle W.; Long, Andrew J.

    2018-05-31

The U.S. Geological Survey developed a groundwater-flow model for the uppermost principal aquifer systems in the Williston Basin in parts of Montana, North Dakota, and South Dakota in the United States and parts of Manitoba and Saskatchewan in Canada as part of a detailed assessment of the groundwater availability in the area. The assessment was done because of the potential for increased demands and stresses on groundwater associated with large-scale energy development in the area. As part of this assessment, a three-dimensional groundwater-flow model was developed as a tool that can be used to simulate how the groundwater-flow system responds to changes in hydrologic stresses at a regional scale. The three-dimensional groundwater-flow model was developed using the U.S. Geological Survey’s numerical finite-difference groundwater model with the Newton-Raphson solver, MODFLOW–NWT, to represent the glacial, lower Tertiary, and Upper Cretaceous aquifer systems for steady-state (mean) hydrological conditions for 1981‒2005 and for transient (temporally varying) conditions using a combination of a steady-state period for pre-1960 and transient periods for 1961‒2005. The numerical model framework was constructed based on existing and interpreted hydrogeologic and geospatial data and consisted of eight layers. Two layers were used to represent the glacial aquifer system in the model; layer 1 represented the upper one-half and layer 2 represented the lower one-half of the glacial aquifer system. Three layers were used to represent the lower Tertiary aquifer system in the model; layer 3 represented the upper Fort Union aquifer, layer 4 represented the middle Fort Union hydrogeologic unit, and layer 5 represented the lower Fort Union aquifer. 
Three layers were used to represent the Upper Cretaceous aquifer system in the model; layer 6 represented the upper Hell Creek hydrogeologic unit, layer 7 represented the lower Hell Creek aquifer, and layer 8 represented the Fox Hills aquifer. The numerical model was constructed using a uniform grid with square cells that are about 1 mile (1,600 meters) on each side with a total of about 657,000 active cells. Model calibration was completed by linking Parameter ESTimation (PEST) software with MODFLOW–NWT. The PEST software uses statistical parameter estimation techniques to identify an optimum set of input parameters by adjusting individual model input parameters and assessing the differences, or residuals, between observed (measured or estimated) data and simulated values. Steady-state model calibration consisted of attempting to match mean simulated values to measured or estimated values of (1) hydraulic head, (2) hydraulic head differences between model layers, (3) stream infiltration, and (4) discharge to streams. Calibration of the transient model consisted of attempting to match simulated and measured temporally distributed values of hydraulic head changes, stream base flow, and groundwater discharge to artesian flowing wells. Hydraulic properties estimated through model calibration included hydraulic conductivity, vertical hydraulic conductivity, aquifer storage, and riverbed hydraulic conductivity in addition to groundwater recharge and well skin. The ability of the numerical model to accurately simulate groundwater flow in the Williston Basin was assessed primarily by its ability to match calibration targets for hydraulic head, stream base flow, and flowing well discharge. The steady-state model also was used to assess the simulated potentiometric surfaces in the upper Fort Union aquifer, the lower Fort Union aquifer, and the Fox Hills aquifer. 
Additionally, a previously estimated regional groundwater-flow budget was compared with the simulated steady-state groundwater-flow budget for the Williston Basin. The simulated potentiometric surfaces typically compared well with the estimated potentiometric surfaces based on measured hydraulic head data and indicated localized groundwater-flow gradients that were topographically controlled in outcrop areas and more generalized regional gradients where the aquifers were confined. The differences between measured and simulated hydraulic head values (residuals) for 11,109 wells were assessed, which indicated that the steady-state model generally underestimated hydraulic head in the model area. This underestimation is indicated by a positive mean residual of 11.2 feet for all model layers. Layer 7, which represents the lower Hell Creek aquifer, is the only layer for which the steady-state model overestimated hydraulic head. Simulated groundwater-level changes for the transient model matched within plus or minus 2.5 feet of the measured values for more than 60 percent of all measurements and within plus or minus 17.5 feet for 95 percent of all measurements; however, the transient model underestimated groundwater-level changes for all model layers. A comparison between simulated and estimated base flows for the steady-state and transient models indicated that both models overestimated base flow in streams and underestimated annual fluctuations in base flow. The estimated and simulated groundwater budgets indicate the model area received a substantial amount of recharge from precipitation and stream infiltration. The steady-state model indicated that reservoir seepage was a larger component of recharge in the Williston Basin than was previously estimated. 
Irrigation recharge and groundwater inflow from outside the Williston Basin accounted for a relatively small part of total groundwater recharge when compared with recharge from precipitation, stream infiltration, and reservoir seepage. Most of the estimated and simulated groundwater discharge in the Williston Basin was to streams and reservoirs. Simulated groundwater withdrawal, discharge to reservoirs, and groundwater outflow in the Williston Basin accounted for a smaller part of total groundwater discharge. The transient model was used to simulate discharge to 571 flowing artesian wells within the model area. Of the 571 flowing artesian wells simulated by the model, 271 wells did not flow at any time during the simulation because hydraulic head was always below the land-surface altitude. As hydraulic head declined throughout the simulation, 68 of the initially flowing wells ceased to flow by the end of 2005. Total mean simulated discharge for the 571 flowing artesian wells was 55.1 cubic feet per second (ft3/s), and the mean simulated flowing well discharge for individual wells was 0.118 ft3/s. Simulated discharge to individual flowing artesian wells increased from 0.039 to 0.177 ft3/s between 1961 and 1975 and decreased to 0.102 ft3/s by 2005. The mean residual for 34 flowing wells with measured discharge was 0.014 ft3/s, which indicates the transient model overestimated discharge to flowing artesian wells in the model area. Model limitations arise from aspects of the conceptual model and from simplifications inherent in the construction and calibration of a regional-scale numerical groundwater-flow model. 
Simplifying assumptions in defining hydraulic parameters in space and hydrologic stresses and time-varying observational data in time can limit the capabilities of this tool to simulate how the groundwater-flow system responds to changes in hydrologic stresses, particularly at the local scale; nevertheless, the steady-state model adequately simulated flow in the uppermost principal aquifer systems in the Williston Basin based on the comparison between the simulated and estimated groundwater-flow budget, the comparison between simulated and estimated potentiometric surfaces, and the results of the calibration process.
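The finite-difference principle underlying such a model can be shown on a toy grid. This is only the numerical idea MODFLOW builds on (a single confined layer, uniform properties), not the Williston Basin model:

```python
def solve_steady_heads(nx=20, ny=20, h_left=100.0, h_right=90.0, iters=5000):
    """Steady-state Laplace equation for hydraulic head on a uniform
    grid: fixed heads on the left/right boundaries, no-flow (mirror)
    conditions on top/bottom, relaxed by Gauss-Seidel iteration."""
    h = [[(h_left + h_right) / 2.0] * nx for _ in range(ny)]
    for row in h:
        row[0], row[-1] = h_left, h_right          # fixed-head boundaries
    for _ in range(iters):
        for j in range(ny):
            for i in range(1, nx - 1):
                up = h[j - 1][i] if j > 0 else h[j + 1][i]      # no-flow mirror
                down = h[j + 1][i] if j < ny - 1 else h[j - 1][i]
                h[j][i] = 0.25 * (h[j][i - 1] + h[j][i + 1] + up + down)
    return h

heads = solve_steady_heads()   # converges to a linear head gradient
```

A real MODFLOW layer adds spatially varying conductivity, recharge, storage, and boundary packages, but each cell's equation is still a local flow balance of this form.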

  14. Development of a tool for calculating early internal doses in the Fukushima Daiichi nuclear power plant accident based on atmospheric dispersion simulation

    NASA Astrophysics Data System (ADS)

    Kurihara, Osamu; Kim, Eunjoo; Kunishima, Naoaki; Tani, Kotaro; Ishikawa, Tetsuo; Furuyama, Kazuo; Hashimoto, Shozo; Akashi, Makoto

    2017-09-01

A tool was developed to facilitate the calculation of early internal doses to residents affected by the Fukushima nuclear disaster based on atmospheric transport and dispersion model (ATDM) simulations performed using the Worldwide version of System for Prediction of Environmental Emergency Information 2nd version (WSPEEDI-II), together with personal behavior data containing the history of the whereabouts of individuals after the accident. The tool generates hourly-averaged air concentration data for the simulation grids nearest to an individual's whereabouts, using WSPEEDI-II datasets, for the subsequent calculation of internal doses due to inhalation. This paper presents an overview of the developed tool and provides tentative comparisons between direct-measurement-based and ATDM-based results for the internal doses received by 421 persons for whom personal behavior data were available.
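The core dose arithmetic such a tool performs can be sketched as follows; the breathing rate and dose coefficient used here are placeholder values, not the ones in the actual tool:

```python
def inhalation_dose_sv(hourly_conc_bq_m3, breathing_rate_m3_h=0.93,
                       dose_coeff_sv_per_bq=7.0e-9):
    """Time-integrated inhalation dose: hourly-averaged air
    concentrations (Bq/m^3) at the person's whereabouts, times
    breathing rate (m^3/h), summed to an activity intake (Bq), times a
    nuclide-specific dose coefficient (Sv/Bq).  Default parameter
    values are hypothetical placeholders."""
    intake_bq = sum(c * breathing_rate_m3_h for c in hourly_conc_bq_m3)
    return intake_bq * dose_coeff_sv_per_bq

# e.g. three hours spent at 100 Bq/m^3:
dose = inhalation_dose_sv([100.0, 100.0, 100.0])
```

The tool's added value lies in assembling the concentration series: for each hour it looks up the ATDM grid cell nearest the individual's recorded location before applying this sum.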

  15. The pseudo-compartment method for coupling partial differential equation and compartment-based models of diffusion.

    PubMed

    Yates, Christian A; Flegg, Mark B

    2015-05-06

Spatial reaction-diffusion models have been employed to describe many emergent phenomena in biological systems. The modelling technique most commonly adopted in the literature implements systems of partial differential equations (PDEs), which assumes there are sufficient densities of particles that a continuum approximation is valid. However, owing to recent advances in computational power, the simulation, and therefore postulation, of computationally intensive individual-based models has become a popular way to investigate the effects of noise in reaction-diffusion systems in which regions of low copy numbers exist. The specific stochastic models with which we shall be concerned in this manuscript are referred to as 'compartment-based' or 'on-lattice'. These models are characterized by a discretization of the computational domain into a grid/lattice of 'compartments'. Within each compartment, particles are assumed to be well mixed and are permitted to react with other particles within their compartment or to transfer between neighbouring compartments. Stochastic models provide accuracy, but at the cost of significant computational resources. For models that have regions of both low and high concentrations, it is often desirable, for reasons of efficiency, to employ coupled multi-scale modelling paradigms. In this work, we develop two hybrid algorithms in which a PDE in one region of the domain is coupled to a compartment-based model in the other. Rather than attempting to balance average fluxes, our algorithms answer a more fundamental question: 'how are individual particles transported between the vastly different model descriptions?' First, we present an algorithm derived by carefully redefining the continuous PDE concentration as a probability distribution. While this first algorithm shows very strong convergence to analytical solutions of test problems, it can be cumbersome to simulate. 
Our second algorithm is a simplified and more efficient implementation of the first; it is derived in the continuum limit over the PDE region alone. We test our hybrid methods for functionality and accuracy in a variety of different scenarios by comparing the averaged simulations with analytical solutions of PDEs for mean concentrations. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
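A much-simplified sketch of the coupling idea, assuming pure diffusion on a short lattice. The transfer rules are illustrative inventions, not the paper's algorithms, but they preserve the key property that every interface transfer moves exactly one unit of mass, so total mass is conserved:

```python
import random

def hybrid_diffusion(steps=2000, dt=0.01, d=1.0, seed=5):
    """Explicit finite-difference diffusion on ten PDE cells coupled to
    integer particles jumping on ten lattice compartments.  The PDE
    cell at the interface acts as the pseudo-compartment: incoming
    particles deposit one unit of mass into it, and it initiates unit
    particles into the first compartment at the compartment jump rate."""
    rng = random.Random(seed)
    pde = [10.0] * 10          # continuous mass per PDE cell
    comp = [0] * 10            # particle counts per compartment
    for _ in range(steps):
        # 1) explicit diffusion on the PDE half (mirror = no-flux ends)
        new = pde[:]
        for i in range(10):
            left = pde[i - 1] if i > 0 else pde[0]
            right = pde[i + 1] if i < 9 else pde[9]
            new[i] += d * dt * (left - 2.0 * pde[i] + right)
        pde = new
        # 2) stochastic jumps on the compartment half (rate d per side)
        moves = []
        for i in range(10):
            for _ in range(comp[i]):
                if rng.random() < d * dt:
                    moves.append((i, i - 1))
                elif rng.random() < d * dt:
                    moves.append((i, i + 1))
        for src, dst in moves:
            comp[src] -= 1
            if dst < 0:
                pde[9] += 1.0          # absorbed by the pseudo-compartment
            elif dst > 9:
                comp[src] += 1         # reflecting far boundary: stay put
            else:
                comp[dst] += 1
        # 3) pseudo-compartment initiates particles, one whole mass unit each
        fired = sum(1 for _ in range(int(pde[9])) if rng.random() < d * dt)
        pde[9] -= fired
        comp[0] += fired
    return pde, comp, sum(pde) + sum(comp)

pde, comp, total = hybrid_diffusion()
```

The paper's algorithms are more careful about how the continuous concentration is reinterpreted as a probability distribution over particle initiations; this sketch only shows the mechanics of a conservative PDE-to-compartment interface.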

  16. Species delimitation using Bayes factors: simulations and application to the Sceloporus scalaris species group (Squamata: Phrynosomatidae).

    PubMed

    Grummer, Jared A; Bryson, Robert W; Reeder, Tod W

    2014-03-01

    Current molecular methods of species delimitation are limited by the types of species delimitation models and scenarios that can be tested. Bayes factors allow for more flexibility in testing non-nested species delimitation models and hypotheses of individual assignment to alternative lineages. Here, we examined the efficacy of Bayes factors in delimiting species through simulations and empirical data from the Sceloporus scalaris species group. Marginal-likelihood scores of competing species delimitation models, from which Bayes factor values were computed, were estimated with four different methods: harmonic mean estimation (HME), smoothed harmonic mean estimation (sHME), path-sampling/thermodynamic integration (PS), and stepping-stone (SS) analysis. We also performed model selection using a posterior simulation-based analog of the Akaike information criterion through Markov chain Monte Carlo analysis (AICM). Bayes factor species delimitation results from the empirical data were then compared with results from the reversible-jump MCMC (rjMCMC) coalescent-based species delimitation method Bayesian Phylogenetics and Phylogeography (BP&P). Simulation results show that HME and sHME perform poorly compared with PS and SS marginal-likelihood estimators when identifying the true species delimitation model. Furthermore, Bayes factor delimitation (BFD) of species showed improved performance when species limits are tested by reassigning individuals between species, as opposed to either lumping or splitting lineages. In the empirical data, BFD through PS and SS analyses, as well as the rjMCMC method, each provide support for the recognition of all scalaris group taxa as independent evolutionary lineages. Bayes factor species delimitation and BP&P also support the recognition of three previously undescribed lineages.
In both simulated and empirical data sets, harmonic and smoothed harmonic mean marginal-likelihood estimators provided much higher marginal-likelihood estimates than PS and SS estimators. The AICM displayed poor repeatability in both simulated and empirical data sets, and produced inconsistent model rankings across replicate runs with the empirical data. Our results suggest that species delimitation through the use of Bayes factors with marginal-likelihood estimates via PS or SS analyses provide a useful and complementary alternative to existing species delimitation methods.
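The gap the authors report between harmonic mean and stepping-stone estimators can be reproduced on a toy conjugate-normal model, where the marginal likelihood is known in closed form. The sketch below is illustrative only; the model, rung schedule, and sample sizes are our choices, not those used in the study.

```python
import math
import random

def log_lik(mu, y):
    """Log-likelihood of y under N(mu, 1)."""
    return sum(-0.5 * math.log(2 * math.pi) - 0.5 * (yi - mu) ** 2 for yi in y)

def exact_log_ml(y):
    """Closed-form log marginal likelihood for prior mu ~ N(0, 1)."""
    n, s = len(y), sum(y)
    return (-0.5 * n * math.log(2 * math.pi) - 0.5 * math.log(n + 1)
            - 0.5 * (sum(yi * yi for yi in y) - s * s / (n + 1)))

def power_posterior_draws(y, beta, n_draws, rng):
    # For likelihood**beta times the N(0,1) prior, the tempered posterior
    # is Gaussian with precision beta*n + 1.
    n, s = len(y), sum(y)
    prec = beta * n + 1
    mean, sd = beta * s / prec, prec ** -0.5
    return [rng.gauss(mean, sd) for _ in range(n_draws)]

def log_mean_exp(vals):
    m = max(vals)
    return m + math.log(sum(math.exp(v - m) for v in vals) / len(vals))

def harmonic_mean_log_ml(y, n_draws=2000, rng=None):
    """HME: harmonic mean of likelihoods over posterior draws."""
    rng = rng or random.Random(0)
    draws = power_posterior_draws(y, 1.0, n_draws, rng)
    return -log_mean_exp([-log_lik(mu, y) for mu in draws])

def stepping_stone_log_ml(y, n_rungs=32, n_draws=1000, rng=None):
    """SS: sum of log ratios between adjacent power posteriors."""
    rng = rng or random.Random(0)
    betas = [(k / n_rungs) ** 3 for k in range(n_rungs + 1)]  # dense near 0
    total = 0.0
    for b0, b1 in zip(betas, betas[1:]):
        draws = power_posterior_draws(y, b0, n_draws, rng)
        total += log_mean_exp([(b1 - b0) * log_lik(mu, y) for mu in draws])
    return total
```

On such toy data the SS estimate tracks the analytic value closely, while the HME estimator is well known to be unstable and biased upward, consistent with the pattern reported above; accuracy improves with more rungs and draws.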

  17. Application of Biologically Based Lumping To Investigate the Toxicokinetic Interactions of a Complex Gasoline Mixture.

    PubMed

    Jasper, Micah N; Martin, Sheppard A; Oshiro, Wendy M; Ford, Jermaine; Bushnell, Philip J; El-Masri, Hisham

    2016-03-15

    People are often exposed to complex mixtures of environmental chemicals such as gasoline, tobacco smoke, water contaminants, or food additives. We developed an approach that applies chemical lumping methods to complex mixtures, in this case gasoline, based on biologically relevant parameters used in physiologically based pharmacokinetic (PBPK) modeling. Inhalation exposures were performed with rats to evaluate the performance of our PBPK model and chemical lumping method. There were 109 chemicals identified and quantified in the vapor in the chamber. The time-course toxicokinetic profiles of 10 target chemicals were also determined from blood samples collected during and following the in vivo experiments. A general PBPK model was used to compare the experimental data to the simulated values of blood concentration for 10 target chemicals with various numbers of lumps, iteratively increasing from 0 to 99. Large reductions in simulation error were gained by incorporating enzymatic chemical interactions, in comparison to simulating the individual chemicals separately. The error was further reduced by lumping the 99 nontarget chemicals. The same biologically based lumping approach can be used to simplify any complex mixture with tens, hundreds, or thousands of constituents.

  18. Toward Theory-Based Instruction in Scientific Problem Solving.

    ERIC Educational Resources Information Center

    Heller, Joan I.; And Others

    Several empirical and theoretical analyses related to scientific problem-solving are reviewed, including: detailed studies of individuals at different levels of expertise, and computer models simulating some aspects of human information processing during problem solving. Analysis of these studies has revealed many facets about the nature of the…

  19. An individual-based modeling approach to simulating recreation use in wilderness settings

    Treesearch

    Randy Gimblett; Terry Daniel; Michael J. Meitner

    2000-01-01

    Landscapes protect biological diversity and provide unique opportunities for human-nature interactions. Too often, these desirable settings suffer from extremely high visitation. Given the complexity of social, environmental and economic interactions, resource managers need tools that provide insights into the cause and effect relationships between management actions...

  20. Assessing the contribution of different factors in RegCM4.3 regional climate model projections using the Factor Separation method over the Med-CORDEX domain

    NASA Astrophysics Data System (ADS)

    Zsolt Torma, Csaba; Giorgi, Filippo

    2014-05-01

    A set of regional climate model (RCM) simulations applying dynamical downscaling of global climate model (GCM) simulations over the Mediterranean domain specified by the international initiative Coordinated Regional Downscaling Experiment (CORDEX) was completed with the Regional Climate Model RegCM, version RegCM4.3. Two GCMs were selected from the Coupled Model Intercomparison Project Phase 5 (CMIP5) ensemble to provide the driving fields for the RegCM: HadGEM2-ES (HadGEM) and MPI-ESM-MR (MPI). The simulations consist of an ensemble including multiple physics configurations and different "Representative Concentration Pathways" (RCP4.5 and RCP8.5). In total 15 simulations were carried out with 7 model physics configurations with varying convection and land surface schemes. The horizontal grid spacing of the RCM simulations is 50 km and the simulated period in all cases is 1970-2100 (1970-2099 in case of HadGEM-driven simulations). This ensemble includes a combination of experiments in which different model components are changed individually and in combination, and thus lends itself optimally to the application of the Factor Separation (FS) method. This study applies the FS method to investigate the contributions of different factors, along with their synergy, to a set of RCM projections for the Mediterranean region. The FS method is applied to 6 projections for the period 1970-2100 performed with the regional model RegCM4.3 over the Med-CORDEX domain. Two different sets of factors are intercompared, namely the driving global climate model (HadGEM and MPI) boundary conditions against two model physics settings (convection scheme and irrigation). We find that both the GCM driving conditions and the model physics provide important contributions, depending on the variable analyzed (surface air temperature and precipitation), season (winter vs.
summer) and time horizon into the future, while the synergy term mostly tends to counterbalance the contributions of the individual factors. We demonstrate the usefulness of the FS method to assess different sources of uncertainty in RCM-based regional climate projections.

  1. EEG-based Affect and Workload Recognition in a Virtual Driving Environment for ASD Intervention

    PubMed Central

    Wade, Joshua W.; Key, Alexandra P.; Warren, Zachary E.; Sarkar, Nilanjan

    2017-01-01

    Objective: To build group-level classification models capable of recognizing affective states and mental workload of individuals with autism spectrum disorder (ASD) during driving skill training. Methods: Twenty adolescents with ASD participated in a six-session virtual reality driving simulator-based experiment, during which their electroencephalogram (EEG) data were recorded alongside driving events and a therapist’s rating of their affective states and mental workload. Five feature generation approaches, including statistical features, fractal dimension features, higher order crossings (HOC)-based features, power features from frequency bands, and power features from bins (Δf = 2 Hz), were applied to extract relevant features. Individual differences were removed with a two-step feature calibration method. Finally, binary classification results based on the k-nearest neighbors algorithm and univariate feature selection method were evaluated by leave-one-subject-out nested cross-validation to compare feature types and identify discriminative features. Results: The best classification results were achieved using power features from bins for engagement (0.95) and boredom (0.78), and HOC-based features for enjoyment (0.90), frustration (0.88), and workload (0.86). Conclusion: Offline EEG-based group-level classification models are feasible for recognizing binary low and high intensity of affect and workload of individuals with ASD in the context of driving. However, while promising, the applicability of the models in an online adaptive driving task requires further development. Significance: The developed models provide a basis for an EEG-based passive brain-computer interface system that has the potential to benefit individuals with ASD with an affect- and workload-based individualized driving skill training intervention. PMID:28422647
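As a sketch of one feature family named in the abstract, higher order crossings (HOC) count zero-crossings of a signal under repeated differencing. The minimal implementation below is our illustration, not the authors' code.

```python
def higher_order_crossings(signal, max_order=5):
    """Count zero-crossings of successively differenced, mean-centred series.

    Returns [D_1, ..., D_max_order]; D_1 counts sign changes of the centred
    signal itself, D_2 of its first difference, and so on.
    """
    mean = sum(signal) / len(signal)
    x = [v - mean for v in signal]
    counts = []
    for _ in range(max_order):
        counts.append(sum(1 for a, b in zip(x, x[1:]) if a * b < 0))
        x = [b - a for a, b in zip(x, x[1:])]  # next-order difference
    return counts
```

A rapidly oscillating epoch yields high counts at every order, while a slow drift yields low ones, which is what makes the HOC sequence a compact spectral summary for EEG epochs.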

  2. Altruistic aging: The evolutionary dynamics balancing longevity and evolvability.

    PubMed

    Herrera, Minette; Miller, Aaron; Nishimura, Joel

    2017-04-01

    Altruism is typically associated with traits or behaviors that benefit the population as a whole, but are costly to the individual. We propose that, when the environment is rapidly changing, senescence (age-related deterioration) can be altruistic. According to numerical simulations of an agent-based model, while long-lived individuals can outcompete their short-lived peers, populations composed of long-lived individuals are more likely to go extinct during periods of rapid environmental change. Moreover, as in many situations where other cooperative behavior arises, senescence can be stabilized in a structured population.

  3. Shorebird Migration Patterns in Response to Climate Change: A Modeling Approach

    NASA Technical Reports Server (NTRS)

    Smith, James A.

    2010-01-01

    The availability of satellite remote sensing observations at multiple spatial and temporal scales, coupled with advances in climate modeling and information technologies offer new opportunities for the application of mechanistic models to predict how continental scale bird migration patterns may change in response to environmental change. In earlier studies, we explored the phenotypic plasticity of a migratory population of Pectoral sandpipers by simulating the movement patterns of an ensemble of 10,000 individual birds in response to changes in stopover locations as an indicator of the impacts of wetland loss and inter-annual variability on the fitness of migratory shorebirds. We used an individual based, biophysical migration model, driven by remotely sensed land surface data, climate data, and biological field data. Mean stop-over durations and stop-over frequency with latitude predicted from our model for nominal cases were consistent with results reported in the literature and available field data. In this study, we take advantage of new computing capabilities enabled by recent GP-GPU computing paradigms and commodity hardware (general-purpose computing on graphics processing units). Several aspects of our individual based (agent modeling) approach lend themselves well to GP-GPU computing. We have been able to allocate compute-intensive tasks to the graphics processing units, and now simulate ensembles of 400,000 birds at varying spatial resolutions along the central North American flyway. We are incorporating additional, species specific, mechanistic processes to better reflect the processes underlying bird phenotypic plasticity responses to different climate change scenarios in the central U.S.

  4. Tundra shrubification and tree-line advance amplify arctic climate warming: results from an individual-based dynamic vegetation model

    NASA Astrophysics Data System (ADS)

    Zhang, Wenxin; Miller, Paul A.; Smith, Benjamin; Wania, Rita; Koenigk, Torben; Döscher, Ralf

    2013-09-01

    One major challenge to the improvement of regional climate scenarios for the northern high latitudes is to understand land surface feedbacks associated with vegetation shifts and ecosystem biogeochemical cycling. We employed a customized, Arctic version of the individual-based dynamic vegetation model LPJ-GUESS to simulate the dynamics of upland and wetland ecosystems under a regional climate model-downscaled future climate projection for the Arctic and Subarctic. The simulated vegetation distribution (1961-1990) agreed well with a composite map of actual arctic vegetation. In the future (2051-2080), a poleward advance of the forest-tundra boundary, an expansion of tall shrub tundra, and a dominance shift from deciduous to evergreen boreal conifer forest over northern Eurasia were simulated. Ecosystems continued to sink carbon for the next few decades, although the size of these sinks diminished by the late 21st century. Hot spots of increased CH4 emission were identified in the peatlands near Hudson Bay and western Siberia. In terms of their net impact on regional climate forcing, positive feedbacks associated with the negative effects of tree-line, shrub cover and forest phenology changes on snow-season albedo, as well as the larger sources of CH4, may potentially dominate over negative feedbacks due to increased carbon sequestration and increased latent heat flux.

  5. From large-eddy simulation to multi-UAVs sampling of shallow cumulus clouds

    NASA Astrophysics Data System (ADS)

    Lamraoui, Fayçal; Roberts, Greg; Burnet, Frédéric

    2016-04-01

    In-situ sampling of clouds that can provide simultaneous measurements at sufficient spatio-temporal resolutions to capture 3D small scale physical processes continues to present challenges. This project (SKYSCANNER) aims at bringing together cloud sampling strategies using a swarm of unmanned aerial vehicles (UAVs) based on Large-eddy simulation (LES). The multi-UAV-based field campaigns with a personalized sampling strategy for individual clouds and cloud fields will significantly improve the understanding of the unresolved cloud physical processes. An extensive set of LES experiments for case studies from the ARM-SGP site have been performed using the MesoNH model at high resolutions down to 10 m. These simulations led to a macroscopic model that quantifies the interrelationship between micro- and macrophysical properties of shallow convective clouds. Both the geometry and evolution of individual clouds are critical to multi-UAV cloud sampling and path planning. The preliminary findings of the current project reveal several linear relationships that associate many cloud geometric parameters to cloud related meteorological variables. In addition, the horizontal wind speed indicates a proportional impact on cloud number concentration as well as triggering and prolonging the occurrence of cumulus clouds. In the framework of the joint collaboration that involves a Multidisciplinary Team (including institutes specializing in aviation, robotics and atmospheric science), this model will be a reference point for multi-UAVs sampling strategies and path planning.

  6. A health economic model to determine the long-term costs and clinical outcomes of raising low HDL-cholesterol in the prevention of coronary heart disease.

    PubMed

    Roze, S; Liens, D; Palmer, A; Berger, W; Tucker, D; Renaudin, C

    2006-12-01

    The aim of this study was to describe a health economic model developed to project lifetime clinical and cost outcomes of lipid-modifying interventions in patients not reaching target lipid levels and to assess the validity of the model. The internet-based, computer simulation model is made up of two decision analytic sub-models, the first utilizing Monte Carlo simulation, and the second applying Markov modeling techniques. Monte Carlo simulation generates a baseline cohort for long-term simulation by assigning an individual lipid profile to each patient, and applying the treatment effects of interventions under investigation. The Markov model then estimates the long-term clinical (coronary heart disease events, life expectancy, and quality-adjusted life expectancy) and cost outcomes up to a lifetime horizon, based on risk equations from the Framingham study. Internal and external validation analyses were performed. The results of the model validation analyses, plotted against corresponding real-life values from Framingham, 4S, AFCAPS/TexCAPS, and a meta-analysis by Gordon et al., showed that the majority of values were close to the y = x line, which indicates a perfect fit. The R2 value was 0.9575 and the gradient of the regression line was 0.9329, both very close to the perfect fit (= 1). Validation analyses of the computer simulation model suggest the model is able to recreate the outcomes from published clinical studies and would be a valuable tool for the evaluation of new and existing therapy options for patients with persistent dyslipidemia.
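The Markov stage of such a two-part model can be sketched as a simple cohort simulation. The states and transition probabilities below are hypothetical placeholders, not the model's Framingham-based risk equations, and the Monte Carlo baseline-generation stage is omitted.

```python
def run_markov_cohort(transition, start, cycles):
    """Propagate a cohort through a discrete-time Markov model.

    transition[i][j] is the per-cycle probability of moving from state i to
    state j. Returns the state distribution after each cycle and the expected
    number of cycles spent alive (here, states 0 and 1).
    """
    dist = list(start)
    life_years = 0.0
    history = []
    for _ in range(cycles):
        dist = [sum(dist[i] * transition[i][j] for i in range(len(dist)))
                for j in range(len(dist))]
        history.append(dist)
        life_years += dist[0] + dist[1]
    return history, life_years

# Illustrative 3-state model (well / CHD / dead); probabilities are made up.
P = [[0.96, 0.03, 0.01],
     [0.00, 0.90, 0.10],
     [0.00, 0.00, 1.00]]
```

In a full health-economic model each cycle would also accumulate costs and quality-adjusted life years per state; the structure is the same, with extra per-state accumulators.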

  7. Mitigating randomness of consumer preferences under certain conditional choices

    NASA Astrophysics Data System (ADS)

    Bothos, John M. A.; Thanos, Konstantinos-Georgios; Papadopoulou, Eirini; Daveas, Stelios; Thomopoulos, Stelios C. A.

    2017-05-01

    Agent-based crowd behaviour constitutes a significant field of research that has drawn a lot of attention in recent years. Agent-based crowd simulation techniques have been used extensively to forecast the behaviour of larger or smaller crowds in terms of certain given conditions influenced by specific cognition models and behavioural rules and norms, imposed from the beginning. Our research employs conditional event algebra, statistical methodology and agent-based crowd simulation techniques in developing a behavioural econometric model about the selection of certain economic behaviour by a consumer that faces a spectrum of potential choices when moving and acting in a multiplex mall. More specifically we try to analyse the influence of demographic, economic, social and cultural factors on the economic behaviour of a certain individual and then we try to link its behaviour with the general behaviour of the crowds of consumers in multiplex malls using agent-based crowd simulation techniques. We then run our model using Generalized Least Squares and Maximum Likelihood methods to come up with the most probable forecast estimations, regarding the agent's behaviour. Our model is indicative of the formation of consumers' spectrum of choices in multiplex malls under the condition of predefined preferences and can be used as a guide for further research in this area.

  8. Transport link scanner: simulating geographic transport network expansion through individual investments

    NASA Astrophysics Data System (ADS)

    Jacobs-Crisioni, C.; Koopmans, C. C.

    2016-07-01

    This paper introduces a GIS-based model that simulates the geographic expansion of transport networks by several decision-makers with varying objectives. The model progressively adds extensions to a growing network by choosing the most attractive investments from a limited choice set. Attractiveness is defined as a function of variables in which revenue and broader societal benefits may play a role and can be based on empirically underpinned parameters that may differ according to private or public interests. The choice set is selected from an exhaustive set of links and presumably contains those investment options that best meet private operator's objectives by balancing the revenues of additional fare against construction costs. The investment options consist of geographically plausible routes with potential detours. These routes are generated using a fine-meshed regularly latticed network and shortest path finding methods. Additionally, two indicators of the geographic accuracy of the simulated networks are introduced. A historical case study is presented to demonstrate the model's first results. These results show that the modelled networks reproduce relevant results of the historically built network with reasonable accuracy.
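Generating geographically plausible routes over a fine-meshed lattice, as described above, reduces to shortest-path search with per-cell construction costs. A minimal Dijkstra sketch follows; the lattice encoding, cost convention, and function names are illustrative, not the paper's implementation.

```python
import heapq

def shortest_path(cost, start, goal):
    """Dijkstra over a 4-connected lattice; cost[r][c] is the expense of
    entering cell (r, c). Returns (total_cost, path) from start to goal."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue            # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    if goal not in dist:
        raise ValueError("goal unreachable")
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return dist[goal], path[::-1]
```

A choice set of plausible detours can then be built by perturbing the cost surface (for instance, penalising cells on already-chosen routes) and re-running the search.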

  9. Proceedings of the Numerical Modeling for Underground Nuclear Test Monitoring Symposium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, S.R.; Kamm, J.R.

    1993-11-01

    The purpose of the meeting was to discuss the state-of-the-art in numerical simulations of nuclear explosion phenomenology with applications to test ban monitoring. We focused on the uniqueness of model fits to data, the measurement and characterization of material response models, advanced modeling techniques, and applications of modeling to monitoring problems. The second goal of the symposium was to establish a dialogue between seismologists and explosion-source code calculators. The meeting was divided into five main sessions: explosion source phenomenology, material response modeling, numerical simulations, the seismic source, and phenomenology from near source to far field. We feel the symposium reached many of its goals. Individual papers submitted at the conference are indexed separately on the database.

  10. Dynamic social networks based on movement

    USGS Publications Warehouse

    Scharf, Henry; Hooten, Mevin B.; Fosdick, Bailey K.; Johnson, Devin S.; London, Joshua M.; Durban, John W.

    2016-01-01

    Network modeling techniques provide a means for quantifying social structure in populations of individuals. Data used to define social connectivity are often expensive to collect and based on case-specific, ad hoc criteria. Moreover, in applications involving animal social networks, collection of these data is often opportunistic and can be invasive. Frequently, the social network of interest for a given population is closely related to the way individuals move. Thus, telemetry data, which are minimally invasive and relatively inexpensive to collect, present an alternative source of information. We develop a framework for using telemetry data to infer social relationships among animals. To achieve this, we propose a Bayesian hierarchical model with an underlying dynamic social network controlling movement of individuals via two mechanisms: an attractive effect and an aligning effect. We demonstrate the model and its ability to accurately identify complex social behavior in simulation, and apply our model to telemetry data arising from killer whales. Using auxiliary information about the study population, we investigate model validity and find the inferred dynamic social network is consistent with killer whale ecology and expert knowledge.

  11. A model-based test for treatment effects with probabilistic classifications.

    PubMed

    Cavagnaro, Daniel R; Davis-Stober, Clintin P

    2018-05-21

    Within modern psychology, computational and statistical models play an important role in describing a wide variety of human behavior. Model selection analyses are typically used to classify individuals according to the model(s) that best describe their behavior. These classifications are inherently probabilistic, which presents challenges for performing group-level analyses, such as quantifying the effect of an experimental manipulation. We answer this challenge by presenting a method for quantifying treatment effects in terms of distributional changes in model-based (i.e., probabilistic) classifications across treatment conditions. The method uses hierarchical Bayesian mixture modeling to incorporate classification uncertainty at the individual level into the test for a treatment effect at the group level. We illustrate the method with several worked examples, including a reanalysis of the data from Kellen, Mata, and Davis-Stober (2017), and analyze its performance more generally through simulation studies. Our simulations show that the method is both more powerful and less prone to type-1 errors than Fisher's exact test when classifications are uncertain. In the special case where classifications are deterministic, we find a near-perfect power-law relationship between the Bayes factor, derived from our method, and the p value obtained from Fisher's exact test. We provide code in an online supplement that allows researchers to apply the method to their own data. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  12. Probabilities and predictions: modeling the development of scientific problem-solving skills.

    PubMed

    Stevens, Ron; Johnson, David F; Soller, Amy

    2005-01-01

    The IMMEX (Interactive Multi-Media Exercises) Web-based problem set platform enables the online delivery of complex, multimedia simulations, the rapid collection of student performance data, and has already been used in several genetic simulations. The next step is the use of these data to understand and improve student learning in a formative manner. This article describes the development of probabilistic models of undergraduate student problem solving in molecular genetics that detailed the spectrum of strategies students used when problem solving, and how the strategic approaches evolved with experience. The actions of 776 university sophomore biology majors from three molecular biology lecture courses were recorded and analyzed. Performances on each of the six simulations were first grouped by artificial neural network clustering to provide individual performance measures, and then sequences of these performances were probabilistically modeled by hidden Markov modeling to provide measures of progress. The models showed that students with different initial problem-solving abilities choose different strategies. Initial and final strategies varied across different sections of the same course and were not strongly correlated with other achievement measures. In contrast to previous studies, we observed no significant gender differences. We suggest that instructor interventions based on early student performances with these simulations may assist students to recognize effective and efficient problem-solving strategies and enhance learning.

  13. A generic model to simulate air-borne diseases as a function of crop architecture.

    PubMed

    Casadebaig, Pierre; Quesnel, Gauthier; Langlais, Michel; Faivre, Robert

    2012-01-01

    In a context of pesticide use reduction, alternatives to chemical-based crop protection strategies are needed to control diseases. Crop and plant architectures can be viewed as levers to control disease outbreaks by affecting microclimate within the canopy or pathogen transmission between plants. Modeling and simulation is a key approach to help analyze the behaviour of such systems where direct observations are difficult and tedious. Modeling permits the joining of concepts from ecophysiology and epidemiology to define structures and functions generic enough to describe a wide range of epidemiological dynamics. Additionally, this conception should minimize computing time by both limiting the complexity and setting an efficient software implementation. In this paper, our aim was to present a model that suited these constraints so it could first be used as a research and teaching tool to promote discussions about epidemic management in cropping systems. The system was modelled as a combination of individual hosts (population of plants or organs) and infectious agents (pathogens) whose contacts are restricted through a network of connections. The system dynamics were described at an individual scale. Additional attention was given to the identification of generic properties of host-pathogen systems to widen the model's applicability domain. Two specific pathosystems with contrasted crop architectures were considered: ascochyta blight on pea (homogeneously layered canopy) and potato late blight (lattice of individualized plants). The model's behavior was assessed by simulation and sensitivity analysis, and the results were discussed in light of the model's ability to discriminate between the defined types of epidemics. Crop traits related to disease avoidance resulting in a low exposure, a slow dispersal or a de-synchronization of plant and pathogen cycles were shown to strongly impact the disease severity at the crop scale.
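The core idea of individual hosts whose contacts are restricted through a network of connections can be sketched as a discrete-time susceptible/infected simulation. Everything below (lattice topology, transmission probability, function names) is an illustrative assumption, not the published model.

```python
import random

def lattice_contacts(side):
    """4-neighbour contact network over a lattice of individualized plants,
    loosely in the spirit of the potato late blight case."""
    def idx(r, c):
        return r * side + c
    contacts = {}
    for r in range(side):
        for c in range(side):
            contacts[idx(r, c)] = [idx(r + dr, c + dc)
                                   for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                                   if 0 <= r + dr < side and 0 <= c + dc < side]
    return contacts

def simulate_epidemic(contacts, p_transmit, n_steps, seed=0):
    """Discrete-time susceptible/infected spread restricted to the contact
    network; host 0 is the initial infection."""
    rng = random.Random(seed)
    infected = {0}
    history = [len(infected)]
    for _ in range(n_steps):
        newly = {nb for h in infected for nb in contacts[h]
                 if nb not in infected and rng.random() < p_transmit}
        infected |= newly
        history.append(len(infected))
    return history
```

Swapping `lattice_contacts` for a layered-canopy network would mimic the ascochyta blight architecture, which is exactly the kind of architectural contrast the model is built to explore.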

  14. A Method for Modeling Household Occupant Behavior to Simulate Residential Energy Consumption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Brandon J; Starke, Michael R; Abdelaziz, Omar

    2014-01-01

    This paper presents a statistical method for modeling the behavior of household occupants to estimate residential energy consumption. Using data gathered by the U.S. Census Bureau in the American Time Use Survey (ATUS), actions carried out by survey respondents are categorized into ten distinct activities. These activities are defined to correspond to the major energy-consuming loads commonly found within the residential sector. Next, time-varying minute-resolution Markov chain-based statistical models of different occupant types are developed. Using these behavioral models, individual occupants are simulated to show how an occupant interacts with the major residential energy-consuming loads throughout the day. From these simulations, the minimum number of occupants, and consequently the minimum number of multiple-occupant households, needing to be simulated to produce a statistically accurate representation of aggregate residential behavior can be determined. Finally, future work will involve the use of these occupant models alongside residential load models to produce a high-resolution energy consumption profile and estimate the potential for demand response from residential loads.
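A time-varying Markov-chain occupant model of the kind described can be sketched as follows. The two states, period boundaries, and transition probabilities are invented for illustration; the paper fits minute-resolution matrices over ten activities from ATUS data.

```python
import random

# Two illustrative activity states: 0 = "away/asleep", 1 = "active at home".
# One transition matrix per coarse period of day (hypothetical values).
PERIOD_MATRICES = {
    "night":   [[0.99, 0.01], [0.20, 0.80]],
    "day":     [[0.95, 0.05], [0.10, 0.90]],
    "evening": [[0.70, 0.30], [0.02, 0.98]],
}

def period_of(minute):
    """Map a minute of the day to a coarse period label."""
    hour = (minute // 60) % 24
    if hour < 7 or hour >= 23:
        return "night"
    return "day" if hour < 17 else "evening"

def simulate_occupant(n_minutes=1440, seed=3):
    """Simulate one occupant's activity state at one-minute resolution."""
    rng = random.Random(seed)
    state, states = 0, []
    for minute in range(n_minutes):
        row = PERIOD_MATRICES[period_of(minute)][state]
        state = 0 if rng.random() < row[0] else 1
        states.append(state)
    return states
```

Aggregating many such simulated occupants, each activity state would then be mapped to the load it drives, yielding the aggregate residential profile the paper describes.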

  15. Modeling individual movement decisions of brown hare (Lepus europaeus) as a key concept for realistic spatial behavior and exposure: A population model for landscape-level risk assessment.

    PubMed

    Kleinmann, Joachim U; Wang, Magnus

    2017-09-01

    Spatial behavior is of crucial importance for the risk assessment of pesticides and for the assessment of effects of agricultural practice or multiple stressors, because it determines field use, exposure, and recovery. Recently, population models have increasingly been used to understand the mechanisms driving risk and recovery or to conduct landscape-level risk assessments. To include spatial behavior appropriately in population models for use in risk assessments, a new method, "probabilistic walk," was developed, which simulates the detailed daily movement of individuals by taking into account food resources, vegetation cover, and the presence of conspecifics. At each movement step, animals decide where to move next based on probabilities being determined from this information. The model was parameterized to simulate populations of brown hares (Lepus europaeus). A detailed validation of the model demonstrated that it can realistically reproduce various natural patterns of brown hare ecology and behavior. Simulated proportions of time animals spent in fields (PT values) were also comparable to field observations. It is shown that these important parameters for the risk assessment may, however, vary in different landscapes. The results demonstrate the value of using population models to reduce uncertainties in risk assessment and to better understand which factors determine risk in a landscape context. Environ Toxicol Chem 2017;36:2299-2307. © 2017 SETAC.
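A single "probabilistic walk" decision, choosing the next cell with probability proportional to a resource score, might be sketched as below. The scoring weights and grid encoding are our assumptions, and the published model also scores the presence of conspecifics, which is omitted here.

```python
import random

def choose_step(position, grid, w_food=1.0, w_cover=1.0, rng=None):
    """One probabilistic movement step: move to a neighbouring cell (or stay)
    with probability proportional to a weighted food-plus-cover score.

    grid[r][c] is a (food, cover) tuple; weights are illustrative.
    """
    rng = rng or random.Random()
    r, c = position
    options, weights = [], []
    # candidate cells: four neighbours plus staying put
    for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1), (r, c)):
        if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]):
            food, cover = grid[nr][nc]
            options.append((nr, nc))
            weights.append(w_food * food + w_cover * cover)
    if sum(weights) == 0:
        return position        # nothing attractive anywhere: stay put
    return rng.choices(options, weights=weights)[0]
```

Iterating such steps over a mapped landscape yields daily trajectories, from which field-use quantities like the PT values mentioned above can be tallied per habitat type.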

  16. 3D morphology-based clustering and simulation of human pyramidal cell dendritic spines.

    PubMed

    Luengo-Sanchez, Sergio; Fernaud-Espinosa, Isabel; Bielza, Concha; Benavides-Piccione, Ruth; Larrañaga, Pedro; DeFelipe, Javier

    2018-06-13

    The dendritic spines of pyramidal neurons are the targets of most excitatory synapses in the cerebral cortex. They have a wide variety of morphologies, and their morphology appears to be critical from the functional point of view. To further characterize dendritic spine geometry, we used in this paper over 7,000 individually 3D reconstructed dendritic spines from human cortical pyramidal neurons to group dendritic spines using model-based clustering. This approach uncovered six separate groups of human dendritic spines. To better understand the differences between these groups, the discriminative characteristics of each group were identified as a set of rules. Model-based clustering was also useful for simulating accurate 3D virtual representations of spines that matched the morphological definitions of each cluster. This mathematical approach could provide a useful tool for theoretical predictions on the functional features of human pyramidal neurons based on the morphology of dendritic spines.

  17. Thermal Analysis System

    NASA Technical Reports Server (NTRS)

    DiStefano, III, Frank James (Inventor); Wobick, Craig A. (Inventor); Chapman, Kirt Auldwin (Inventor); McCloud, Peter L. (Inventor)

    2014-01-01

    A thermal fluid system modeler including a plurality of individual components. A solution vector is configured and ordered as a function of one or more inlet dependencies of the plurality of individual components. A fluid flow simulator simulates thermal energy being communicated with the flowing fluid and between first and second components of the plurality of individual components. The simulation extends from an initial time to a later time step and bounds heat transfer to be substantially between the flowing fluid, walls of tubes formed in each of the individual components of the plurality, and between adjacent tubes. Component parameters of the solution vector are updated with simulation results for each of the plurality of individual components of the simulation.

  18. MoCog1: A computer simulation of recognition-primed human decision making, considering emotions

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1992-01-01

    The successful results of the first stage of a research effort to develop a versatile computer model of motivated human cognitive behavior are reported. Most human decision making appears to be an experience-based, relatively straightforward, largely automatic response to situations, utilizing cues and opportunities perceived from the current environment. The development, considering emotions, of the architecture and computer program associated with such 'recognition-primed' decision-making is described. The resultant computer program (MoCog1) was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  19. MoCog1: A computer simulation of recognition-primed human decision making

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straight-forward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  20. Modeling livestock population structure: a geospatial database for Ontario swine farms.

    PubMed

    Khan, Salah Uddin; O'Sullivan, Terri L; Poljak, Zvonimir; Alsop, Janet; Greer, Amy L

    2018-01-30

    Infectious diseases in farmed animals have economic, social, and health consequences. Foreign animal diseases (FAD) of swine are of significant concern. Mathematical and simulation models are often used to simulate FAD outbreaks and best practices for control. However, simulation outcomes are sensitive to the population structure used. Within Canada, access to individual swine farm population data with which to parameterize models is a challenge because of privacy concerns. Our objective was to develop a methodology to model the farmed swine population in Ontario, Canada that could represent the existing population structure and improve the efficacy of simulation models. We developed a swine population model based on the factors such as facilities supporting farm infrastructure, land availability, zoning and local regulations, and natural geographic barriers that could affect swine farming in Ontario. Assigned farm locations were equal to the swine farm density described in the 2011 Canadian Census of Agriculture. Farms were then randomly assigned to farm types proportional to the existing swine herd types. We compared the swine population models with a known database of swine farm locations in Ontario and found that the modeled population was representative of farm locations with a high accuracy (AUC: 0.91, Standard deviation: 0.02) suggesting that our algorithm generated a reasonable approximation of farm locations in Ontario. In the absence of a readily accessible dataset providing details of the relative locations of swine farms in Ontario, development of a model livestock population that captures key characteristics of the true population structure while protecting privacy concerns is an important methodological advancement. This methodology will be useful for individuals interested in modeling the spread of pathogens between farms across a landscape and using these models to evaluate disease control strategies.
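    The population-synthesis idea can be sketched as two weighted random draws: a region chosen with probability proportional to census farm density, then a herd type chosen proportional to known type frequencies. Region names, densities, and type frequencies below are illustrative, not Ontario census values:

```python
import random

# Hedged sketch of synthesizing a farm population from aggregate census data.
# All region densities and herd-type frequencies are illustrative.
def synthesize_farms(rng, region_density, type_freq, n_farms):
    """Place n_farms farms: region ~ density weights, type ~ frequency weights."""
    regions = list(region_density)
    types = list(type_freq)
    farms = []
    for _ in range(n_farms):
        region = rng.choices(regions, weights=region_density.values())[0]
        ftype = rng.choices(types, weights=type_freq.values())[0]
        farms.append((region, ftype))
    return farms

farms = synthesize_farms(random.Random(42),
                         {"north": 10, "south": 30, "east": 60},
                         {"farrow": 0.3, "finisher": 0.7},
                         n_farms=500)
```

    Comparing such a synthetic layout against a held-out known farm database (as the authors did with AUC) is what validates the placement algorithm.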

  1. Evolution of tag-based cooperation with emotion on complex networks

    NASA Astrophysics Data System (ADS)

    Lima, F. W. S.

    2018-04-01

    We study the evolution of four strategies (ethnocentric, altruistic, egoistic, and cosmopolitan) in one community of individuals through Monte Carlo simulations. Interactions and reproduction among computational agents are simulated on undirected Barabási-Albert (UBA) networks and Erdős-Rényi (ER) random graphs. We study the Hammond-Axelrod model on both UBA networks and ER random graphs for the asexual reproduction case. We use a modified version of the traditional Hammond-Axelrod model and also allow the agents' decisions about one of the strategies to take into account the emotion among their equals. Our simulations showed that egoism and altruism win, in contrast to other results in the literature, where the ethnocentric strategy is common.

  2. Modeling human mobility responses to the large-scale spreading of infectious diseases.

    PubMed

    Meloni, Sandro; Perra, Nicola; Arenas, Alex; Gómez, Sergio; Moreno, Yamir; Vespignani, Alessandro

    2011-01-01

    Current modeling of infectious diseases allows for the study of realistic scenarios that include population heterogeneity, social structures, and mobility processes down to the individual level. The advances in the realism of epidemic description call for the explicit modeling of individual behavioral responses to the presence of disease within modeling frameworks. Here we formulate and analyze a metapopulation model that incorporates several scenarios of self-initiated behavioral changes into the mobility patterns of individuals. We find that prevalence-based travel limitations do not alter the epidemic invasion threshold. Strikingly, we observe in both synthetic and data-driven numerical simulations that when travelers decide to avoid locations with high levels of prevalence, this self-initiated behavioral change may enhance disease spreading. Our results point out that the real-time availability of information on the disease and the ensuing behavioral changes in the population may produce a negative impact on disease containment and mitigation.
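    The self-initiated behavioral change studied here can be sketched as a destination-choice rule: a traveler avoids locations whose prevalence exceeds a tolerance, falling back to the least-infected option when everywhere looks risky. This toy rule and its numbers are illustrative, not the paper's metapopulation formulation:

```python
import random

# Toy sketch of prevalence-avoiding travel. The fallback branch is the kind of
# rerouting that can concentrate traffic and, counterintuitively, aid spread.
def choose_destination(rng, neighbors, prevalence, tolerance=0.05):
    acceptable = [n for n in neighbors if prevalence[n] <= tolerance]
    if acceptable:
        return rng.choice(acceptable)
    # every destination looks risky: reroute to the least-infected one
    return min(neighbors, key=lambda n: prevalence[n])

prevalence = {"A": 0.01, "B": 0.20, "C": 0.08}
dest = choose_destination(random.Random(3), ["A", "B", "C"], prevalence)
```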

  3. Inferring a District-Based Hierarchical Structure of Social Contacts from Census Data

    PubMed Central

    Yu, Zhiwen; Liu, Jiming; Zhu, Xianjun

    2015-01-01

    Researchers have recently paid attention to social contact patterns among individuals due to their useful applications in such areas as epidemic evaluation and control, public health decisions, chronic disease research and social network research. Although some studies have estimated social contact patterns from social networks and surveys, few have considered how to infer the hierarchical structure of social contacts directly from census data. In this paper, we focus on inferring an individual’s social contact patterns from detailed census data, and generate various types of social contact patterns such as hierarchical-district-structure-based, cross-district and age-district-based patterns. We evaluate newly generated contact patterns derived from detailed 2011 Hong Kong census data by incorporating them into a model and simulation of the 2009 Hong Kong H1N1 epidemic. We then compare the newly generated social contact patterns with the mixing patterns that are often used in the literature, and draw the following conclusions. First, the generation of social contact patterns based on a hierarchical district structure allows for simulations at different district levels. Second, the newly generated social contact patterns reflect individuals' social contacts. Third, the newly generated social contact patterns improve the accuracy of the SEIR-based epidemic model. PMID:25679787

  4. The financial impact of employment decisions for individuals with HIV.

    PubMed

    Cho, Elizabeth; Chan, Kee

    2013-01-01

    Individuals living with HIV face challenging employment decisions that have personal, financial, and health impacts. The decision to stay or to leave the work force is much more complicated for an individual with HIV because the financial choices related to potential health benefits are not clearly understood. To assist in the decision-making process for an individual with HIV, we propose to develop a decision model that compares the potential costs and benefits of staying in or leaving the work force. A hypothetical cohort of HIV-infected individuals was simulated in our decision model. Characteristics of these individuals over a one-year period were extracted from the medical literature and publicly available national surveys. Men and women between the ages of 18 and 59 were included in our simulated cohort. A decision tree model was created to estimate the financial impact of an individual's decision on employment. The outcomes were presented as the cost-savings associated with the following employment statuses over a one-year period: 1) staying full-time, 2) switching from full-to part-time, 3) transitioning from full-time to unemployment, and 4) staying unemployed. CD4 T cell counts and employment statuses were stratified by earned income. Employment probabilities were calculated from national databases on employment trends in the United States. Sensitivity analyses were conducted to test the robustness of the effects of the variables on the outcomes. Overall, the decision outcome that resulted in the least financial loss for individuals with HIV was to remain at work. For an individual with CD4 T cell count > 350, the cost difference between staying employed full-time and switching from full-time to part-time status was a maximum of $2,970. For an individual with a CD4 T cell count between 200 and 350, the cost difference was as low as $126 and as great as $2,492. 
For an individual with a CD4 T cell count < 200, the minimum cost difference was $375 and the maximum cost difference was $2,253. Based on our simulation model, we recommend that an individual with a CD4 T cell count > 350 stay employed full-time, because this resulted in the least financial loss. For an individual with a CD4 T cell count < 350, the financial loss was much more variable. Our model provides an objective decision-making guide for individuals with HIV to weigh the costs and benefits of employment decisions.
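    The decision-tree comparison reduces to computing a probability-weighted expected financial outcome per employment option and picking the best. The branch probabilities and dollar values below are illustrative placeholders, not the paper's estimates:

```python
# Minimal decision-tree sketch: expected one-year financial outcome of each
# employment choice. All probabilities and dollar values are illustrative.
def expected_outcome(branches):
    """branches: list of (probability, net_dollars) pairs for one decision."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    return sum(p * v for p, v in branches)

options = {
    "stay_full_time":   [(0.9, 40000), (0.1, 25000)],  # keep job vs. illness
    "switch_part_time": [(0.8, 22000), (0.2, 15000)],
    "leave_work_force": [(1.0, 8000)],                 # benefits only
}
best = max(options, key=lambda k: expected_outcome(options[k]))
```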

  5. Comparison between artificial neural network and multilinear regression models in an evaluation of cognitive workload in a flight simulator.

    PubMed

    Hannula, Manne; Huttunen, Kerttu; Koskelo, Jukka; Laitinen, Tomi; Leino, Tuomo

    2008-01-01

    In this study, the performances of artificial neural network (ANN) analysis and multilinear regression (MLR) model-based estimation of heart rate were compared in an evaluation of individual cognitive workload. The data comprised electrocardiography (ECG) measurements and an evaluation of cognitive load that induces psychophysiological stress (PPS), collected from 14 interceptor fighter pilots during complex simulated F/A-18 Hornet air battles. In our data, the mean absolute error of the ANN estimate was 11.4 as a visual analog scale score, being 13-23% better than the mean absolute error of the MLR model in the estimation of cognitive workload.

  6. Optimization of Biomathematical Model Predictions for Cognitive Performance Impairment in Individuals: Accounting for Unknown Traits and Uncertain States in Homeostatic and Circadian Processes

    PubMed Central

    Van Dongen, Hans P. A.; Mott, Christopher G.; Huang, Jen-Kuang; Mollicone, Daniel J.; McKenzie, Frederic D.; Dinges, David F.

    2007-01-01

    Current biomathematical models of fatigue and performance do not accurately predict cognitive performance for individuals with a priori unknown degrees of trait vulnerability to sleep loss, do not predict performance reliably when initial conditions are uncertain, and do not yield statistically valid estimates of prediction accuracy. These limitations diminish their usefulness for predicting the performance of individuals in operational environments. To overcome these 3 limitations, a novel modeling approach was developed, based on the expansion of a statistical technique called Bayesian forecasting. The expanded Bayesian forecasting procedure was implemented in the two-process model of sleep regulation, which has been used to predict performance on the basis of the combination of a sleep homeostatic process and a circadian process. Employing the two-process model with the Bayesian forecasting procedure to predict performance for individual subjects in the face of unknown traits and uncertain states entailed subject-specific optimization of 3 trait parameters (homeostatic build-up rate, circadian amplitude, and basal performance level) and 2 initial state parameters (initial homeostatic state and circadian phase angle). Prior information about the distribution of the trait parameters in the population at large was extracted from psychomotor vigilance test (PVT) performance measurements in 10 subjects who had participated in a laboratory experiment with 88 h of total sleep deprivation. The PVT performance data of 3 additional subjects in this experiment were set aside beforehand for use in prospective computer simulations. The simulations involved updating the subject-specific model parameters every time the next performance measurement became available, and then predicting performance 24 h ahead. 
Comparison of the predictions to the subjects' actual data revealed that as more data became available for the individuals at hand, the performance predictions became increasingly more accurate and had progressively smaller 95% confidence intervals, as the model parameters converged efficiently to those that best characterized each individual. Even when more challenging simulations were run (mimicking a change in the initial homeostatic state; simulating the data to be sparse), the predictions were still considerably more accurate than would have been achieved by the two-process model alone. Although the work described here is still limited to periods of consolidated wakefulness with stable circadian rhythms, the results obtained thus far indicate that the Bayesian forecasting procedure can successfully overcome some of the major outstanding challenges for biomathematical prediction of cognitive performance in operational settings. Citation: Van Dongen HPA; Mott CG; Huang JK; Mollicone DJ; McKenzie FD; Dinges DF. Optimization of biomathematical model predictions for cognitive performance impairment in individuals: accounting for unknown traits and uncertain states in homeostatic and circadian processes. SLEEP 2007;30(9):1129-1143. PMID:17910385
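    The Bayesian-forecasting mechanism can be sketched on a single trait parameter: maintain a discrete posterior over candidate values and sharpen it with each new performance measurement. The grid, the toy linear performance model, and the Gaussian likelihood below are illustrative stand-ins for the two-process model:

```python
import math

# Sketch of grid-based Bayesian updating of one trait parameter.
# The predict() model and all numbers are illustrative, not the paper's.
def update_posterior(grid, prior, observed, predict, sigma=1.0):
    """One Bayesian update: posterior ∝ likelihood(observation) × prior."""
    like = [math.exp(-0.5 * ((observed - predict(g)) / sigma) ** 2)
            for g in grid]
    post = [l * p for l, p in zip(like, prior)]
    z = sum(post)
    return [p / z for p in post]

grid = [0.1, 0.2, 0.3, 0.4]            # candidate homeostatic build-up rates
prior = [0.25] * 4                     # flat population prior
predict = lambda rate: 10 * rate       # toy performance model
posterior = prior
for obs in [3.1, 2.9, 3.0]:            # repeated measurements near 3.0
    posterior = update_posterior(grid, posterior, obs, predict)
best = grid[max(range(len(grid)), key=lambda i: posterior[i])]
```

    As in the paper's simulations, each additional observation concentrates the posterior on the parameter value that best characterizes the individual, narrowing the prediction intervals.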

  7. Modelling the Constraints of Spatial Environment in Fauna Movement Simulations: Comparison of a Boundaries Accurate Function and a Cost Function

    NASA Astrophysics Data System (ADS)

    Jolivet, L.; Cohen, M.; Ruas, A.

    2015-08-01

    Landscape influences fauna movement at different levels, from habitat selection to choices of movement direction. Our goal is to provide a development framework in order to test simulation functions for animal movement. We describe our approach for such simulations and we compare two types of functions to calculate trajectories. To do so, we first modelled the role of landscape elements to differentiate between elements that facilitate movements and the ones being hindrances. Different influences are identified depending on landscape elements and on animal species. Knowledge was gathered from ecologists, literature and observation datasets. Second, we analysed the description of animal movement recorded with GPS at fine scale, corresponding to high temporal frequency and good location accuracy. Analysing this type of data provides information on the relation between landscape features and movements. We implemented an agent-based simulation approach to calculate potential trajectories constrained by the spatial environment and individual behaviour. We tested two functions that consider space differently: one function takes into account the geometry and the types of landscape elements and one cost function sums up the spatial surroundings of an individual. Results highlight the fact that the cost function exaggerates the distances travelled by an individual and simplifies movement patterns. The geometry-accurate function represents a good bottom-up approach for discovering interesting areas or obstacles for movements.

  8. State-of-the-Art Review on Physiologically Based Pharmacokinetic Modeling in Pediatric Drug Development.

    PubMed

    Yellepeddi, Venkata; Rower, Joseph; Liu, Xiaoxi; Kumar, Shaun; Rashid, Jahidur; Sherwin, Catherine M T

    2018-05-18

    Physiologically based pharmacokinetic modeling and simulation is an important tool for predicting the pharmacokinetics, pharmacodynamics, and safety of drugs in pediatrics. Physiologically based pharmacokinetic modeling is applied in pediatric drug development for first-time-in-pediatric dose selection, simulation-based trial design, correlation with target organ toxicities, risk assessment by investigating possible drug-drug interactions, real-time assessment of pharmacokinetic-safety relationships, and assessment of non-systemic biodistribution targets. This review summarizes the details of a physiologically based pharmacokinetic modeling approach in pediatric drug research, emphasizing reports on pediatric physiologically based pharmacokinetic models of individual drugs. We also compare and contrast the strategies employed by various researchers in pediatric physiologically based pharmacokinetic modeling and provide a comprehensive overview of physiologically based pharmacokinetic modeling strategies and approaches in pediatrics. We discuss the impact of physiologically based pharmacokinetic models on regulatory reviews and product labels in the field of pediatric pharmacotherapy. Additionally, we examine in detail the current limitations and future directions of physiologically based pharmacokinetic modeling in pediatrics with regard to the ability to predict plasma concentrations and pharmacokinetic parameters. Despite the skepticism and concern in the pediatric community about the reliability of physiologically based pharmacokinetic models, there is substantial evidence that pediatric physiologically based pharmacokinetic models have been used successfully to predict differences in pharmacokinetics between adults and children for several drugs. 
It is obvious that the use of physiologically based pharmacokinetic modeling to support various stages of pediatric drug development is highly attractive and will rapidly increase, provided the robustness and reliability of these techniques are well established.

  9. Applying an Individual-Based Model to Simultaneously Evaluate Net Ecosystem Production and Tree Diameter Increment

    NASA Astrophysics Data System (ADS)

    Fang, F. J.

    2017-12-01

    Reconciling observations at fundamentally different scales is central in understanding the global carbon cycle. This study investigates a model-based melding of forest inventory data, remote-sensing data and micrometeorological-station data ("flux towers" estimating forest heat, CO2 and H2O fluxes). The individual-tree-based model FORCCHN was used to evaluate the tree DBH increment and forest carbon fluxes. These are the first simultaneous simulations of forest carbon budgets from flux towers and of individual-tree growth using continuous forest inventory data, under circumstances in which both predictions can be tested. Along with the global implications of such findings, this also improves the capacity for sustainable forest management and the comprehensive understanding of forest ecosystems. In forest ecology, diameter at breast height (DBH) of a tree significantly determines an individual tree's cross-sectional sapwood area, its biomass and carbon storage. Evaluating the annual DBH increment (ΔDBH) of an individual tree is central to understanding tree growth and forest ecology. Ecosystem carbon flux is a consequence of key processes in the forest-ecosystem carbon cycle: Gross and Net Primary Production (GPP and NPP, respectively) and Net Ecosystem Production (NEP). All of these closely relate to tree DBH changes and tree death. Despite advances in evaluating forest carbon fluxes with flux towers and forest inventories for individual-tree ΔDBH, few current ecological models can simultaneously quantify and predict the tree ΔDBH and forest carbon flux.

  10. Mesoscale mechanics of twisting carbon nanotube yarns.

    PubMed

    Mirzaeifar, Reza; Qin, Zhao; Buehler, Markus J

    2015-03-12

    Fabricating continuous macroscopic carbon nanotube (CNT) yarns with mechanical properties close to individual CNTs remains a major challenge. Spinning CNT fibers and ribbons for enhancing the weak interactions between the nanotubes is a simple and efficient method for fabricating high-strength and tough continuous yarns. Here we investigate the mesoscale mechanics of twisting CNT yarns using full atomistic and coarse-grained molecular dynamics simulations, considering concurrent mechanisms at multiple length scales. To investigate the mechanical response of such a complex structure without losing insights into the molecular mechanism, we applied a multiscale strategy. The full atomistic results are used for training a coarse-grained model for studying larger systems consisting of several CNTs. The mesoscopic model parameters are updated as a function of the twist angle, based on the full atomistic results, in order to incorporate the atomistic-scale deformation mechanisms in larger-scale simulations. By bridging across two length scales, our model is capable of accurately predicting the mechanical behavior of twisted yarns while the atomistic-level deformations in individual nanotubes are integrated into the model by updating the parameters. Our results, focused on a bundle of close-packed nanotubes, provide novel mechanistic insights into the spinning of CNTs. Our simulations reveal how twisting a bundle of CNTs improves the shear interaction between the nanotubes up to a certain level due to increasing the interaction surface. Furthermore, twisting the bundle weakens the intertube interactions due to excessive deformation in the cross sections of individual CNTs in the bundle.

  11. Modeling the Population Dynamics of Antibiotic-Resistant Bacteria: An Agent-Based Approach

    NASA Astrophysics Data System (ADS)

    Murphy, James T.; Walshe, Ray; Devocelle, Marc

    The response of bacterial populations to antibiotic treatment is often a function of a diverse range of interacting factors. In order to develop strategies to minimize the spread of antibiotic resistance in pathogenic bacteria, a sound theoretical understanding of the systems of interactions taking place within a colony must be developed. The agent-based approach to modeling bacterial populations is a useful tool for relating data obtained at the molecular and cellular level with the overall population dynamics. Here we demonstrate an agent-based model, called Micro-Gen, which has been developed to simulate the growth and development of bacterial colonies in culture. The model also incorporates biochemical rules and parameters describing the kinetic interactions of bacterial cells with antibiotic molecules. Simulations were carried out to replicate the development of methicillin-resistant S. aureus (MRSA) colonies growing in the presence of antibiotics. The model was explored to see how the properties of the system emerge from the interactions of the individual bacterial agents in order to achieve a better mechanistic understanding of the population dynamics taking place. Micro-Gen provides a good theoretical framework for investigating the effects of local environmental conditions and cellular properties on the response of bacterial populations to antibiotic exposure in the context of a simulated environment.
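    The agent-based idea can be sketched minimally: each cell is an agent whose death probability scales with antibiotic concentration, reduced by its resistance, and survivors divide stochastically. This toy model and all its rates are illustrative, not Micro-Gen's kinetic rules:

```python
import random

# Toy agent-based sketch in the spirit of Micro-Gen: per-cell survival and
# division each time step. All rates and resistance values are illustrative.
class Cell:
    def __init__(self, resistance):
        self.resistance = resistance   # 0 = susceptible, 1 = fully resistant

def step(rng, cells, antibiotic, p_divide=0.3):
    """One time step: kill cells by antibiotic, then let survivors divide."""
    survivors = [c for c in cells
                 if rng.random() > antibiotic * (1 - c.resistance)]
    offspring = [Cell(c.resistance) for c in survivors
                 if rng.random() < p_divide]
    return survivors + offspring

rng = random.Random(7)
colony = [Cell(0.0) for _ in range(90)] + [Cell(0.9) for _ in range(10)]
for _ in range(20):
    colony = step(rng, colony, antibiotic=0.4)
resistant_frac = (sum(1 for c in colony if c.resistance > 0.5) /
                  max(1, len(colony)))
```

    Even this crude rule set reproduces the population-level pattern of interest: under sustained antibiotic pressure, the resistant subpopulation comes to dominate the colony.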

  12. Uncertainty based modeling of rainfall-runoff: Combined differential evolution adaptive Metropolis (DREAM) and K-means clustering

    NASA Astrophysics Data System (ADS)

    Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara

    2015-09-01

    Simulation of rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large scale climate signals have been proved to be effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps of rainfall forecasting, rainfall-runoff simulation and future runoff prediction. In the first step, data driven models are developed and used to forecast rainfall using large scale climate signals as rainfall predictors. Due to high effect of different sources of uncertainty on the output of hydrologic models, in the second step uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined, through developing a weighting method based on K-means clustering. Model parameters and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, calibrated rainfall-runoff models are driven using the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of uncertainty analysis of rainfall-runoff modeling reveal that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters. 
It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the watershed is remarkably improved up to 50% in comparison to the simulations by the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate time for incorporating required mitigation measures in dealing with potentially extreme runoff events and flood hazard. Results of this study can be used in identification of the main factors affecting flood hazard analysis.
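    The multi-model combination step can be sketched as a weighted average of the alternative models' streamflow series. The paper derives its weights via K-means clustering; as a simplified stand-in, the sketch below weights each model by the inverse of an assumed historical error:

```python
# Simplified sketch of combining alternative rainfall-runoff model outputs.
# Inverse-error weighting is a stand-in for the paper's K-means-based scheme;
# all streamflow values and error scores are illustrative.
def combine(model_runs, errors):
    """Weighted average of model output series, weights ∝ 1/error."""
    weights = [1.0 / e for e in errors]
    z = sum(weights)
    weights = [w / z for w in weights]
    n = len(model_runs[0])
    return [sum(w * run[t] for w, run in zip(weights, model_runs))
            for t in range(n)]

runs = [[10.0, 12.0, 15.0],   # model A streamflow
        [11.0, 13.0, 14.0],   # model B
        [ 9.0, 11.0, 16.0]]   # model C
combined = combine(runs, errors=[1.0, 2.0, 4.0])
```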

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Won; Stein, Michael L.; Wang, Jiali

    Climate models robustly imply that some significant change in precipitation patterns will occur. Models consistently project that the intensity of individual precipitation events increases by approximately 6-7%/K, following the increase in atmospheric water content, but that total precipitation increases by a lesser amount (2-3%/K in the global average). Some other aspect of precipitation events must then change to compensate for this difference. We develop here a new methodology for identifying individual rainstorms and studying their physical characteristics - including starting location, intensity, spatial extent, duration, and trajectory - that allows that compensating mechanism to be identified. We apply this technique to precipitation over the contiguous U.S. from both radar-based data products and high-resolution model runs simulating 100 years of business-as-usual warming. In model studies, we find that the dominant compensating mechanism is a reduction of storm size. In summer, rainstorms become more intense but smaller; in winter, rainstorm shrinkage still dominates, but storms also become less numerous and shorter duration. These results imply that flood impacts from climate change will be less severe than would be expected from changes in precipitation intensity alone. We show also that projected changes are smaller than model-observation biases, implying that the best means of incorporating them into impact assessments is via "data-driven simulations" that apply model-projected changes to observational data. We therefore develop a simulation algorithm that statistically describes model changes in precipitation characteristics and adjusts data accordingly, and show that, especially for summertime precipitation, it outperforms simulation approaches that do not include spatial information.
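    On a single precipitation snapshot, storm identification amounts to thresholding the grid and grouping wet cells into connected components. The sketch below uses 4-connected flood fill on a toy grid; tracking storms across time steps (needed for duration and trajectory) is omitted, and the threshold is an assumption:

```python
from collections import deque

# Hedged sketch: identify individual rainstorms in one gridded snapshot by
# thresholding precipitation and flood-filling 4-connected wet regions.
def find_storms(grid, threshold=1.0):
    rows, cols = len(grid), len(grid[0])
    seen, storms = set(), []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= threshold and (r, c) not in seen:
                queue, storm = deque([(r, c)]), []
                seen.add((r, c))
                while queue:                      # BFS over wet neighbors
                    y, x = queue.popleft()
                    storm.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] >= threshold
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            queue.append((ny, nx))
                storms.append(storm)
    return storms

grid = [[0, 2, 2, 0],
        [0, 2, 0, 0],
        [0, 0, 0, 3],
        [0, 0, 3, 3]]
storms = find_storms(grid)
```

    Each component's cell count gives spatial extent, and its cell values give intensity, the two quantities whose opposing trends (more intense, smaller) the study reports.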

  14. Is the whole the sum of its parts? Agent-based modelling of wastewater treatment systems.

    PubMed

    Schuler, A J; Majed, N; Bucci, V; Hellweger, F L; Tu, Y; Gu, A Z

    2011-01-01

    Agent-based models (ABMs) simulate individual units within a system, such as the bacteria in a biological wastewater treatment system. This paper outlines past, current and potential future applications of ABMs to wastewater treatment. ABMs track heterogeneities within microbial populations, and this has been demonstrated to yield different predictions of bulk behaviors than the conventional, "lumped" approaches for enhanced biological phosphorus removal (EBPR) completely mixed reactor systems. Current work included the application of the ABM approach to bacterial adaptation/evolution, using the model system of individual EBPR bacteria that are allowed to evolve a kinetic parameter (maximum glycogen storage) in a competitive environment. The ABM approach was successfully applied to a simple anaerobic-aerobic system, and it was found that differing initial states converged to the same optimal solution under the uncertain hydraulic residence times associated with completely mixed hydraulics. In another study, an ABM was developed and applied to simulate the heterogeneity in intracellular polymer storage compounds, including polyphosphate (PP), in functional microbial populations in the EBPR process. The simulation results were compared to the experimental measurements of single-cell abundance of PP in polyphosphate accumulating organisms (PAOs), performed using Raman microscopy. The model-predicted heterogeneity was generally consistent with observations, and it was used to investigate the relative contribution of external (different life histories) and internal (biological) mechanisms leading to heterogeneity. In the future, ABMs could be combined with computational fluid dynamics (CFD) models to understand incomplete mixing, more intracellular states and mechanisms can be incorporated, and additional experimental verification is needed.

  15. An approach to hydrogeological modeling of a large system of groundwater-fed lakes and wetlands in the Nebraska Sand Hills, USA

    NASA Astrophysics Data System (ADS)

    Rossman, Nathan R.; Zlotnik, Vitaly A.; Rowe, Clinton M.

    2018-05-01

    The feasibility of a hydrogeological modeling approach to simulate several thousand shallow groundwater-fed lakes and wetlands without explicitly considering their connection with groundwater is investigated at the regional scale (approximately 40,000 km2) through an application in the semi-arid Nebraska Sand Hills (NSH), USA. Hydraulic heads are compared to local land-surface elevations from a digital elevation model (DEM) within a geographic information system to assess locations of lakes and wetlands. The water bodies are inferred where hydraulic heads exceed, or are above a certain depth below, the land surface. Numbers of lakes and/or wetlands are determined via image cluster analysis applied to the same 30-m grid as the DEM after interpolating both simulated and estimated heads. The regional water-table map was used for groundwater model calibration, considering MODIS-based net groundwater recharge data. Resulting values of simulated total baseflow to interior streams are within 1% of observed values. Locations, areas, and numbers of simulated lakes and wetlands are compared with Landsat 2005 survey data and with areas of lakes from a 1979-1980 Landsat survey and the National Hydrography Dataset. This simplified process-based modeling approach avoids the need for field-based morphology or water-budget data from individual lakes or wetlands, or determination of lake-groundwater exchanges, yet it reproduces observed lake-wetland characteristics at regional groundwater management scales. A better understanding of the NSH hydrogeology is attained, and the approach shows promise for use in simulations of groundwater-fed lake and wetland characteristics in other large groundwater systems.
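
    The head-versus-DEM classification rule at the core of this approach can be sketched per grid cell. The wetland depth threshold below is an assumed illustrative value, not the study's calibrated criterion.

```python
# Sketch of the lake/wetland inference rule: compare simulated hydraulic
# head to land-surface elevation cell by cell. A cell is a lake where head
# reaches the surface, and a wetland where the water table lies within a
# shallow threshold of it (threshold value assumed here).

def classify_cells(head, dem, wetland_depth=1.0):
    out = []
    for h, z in zip(head, dem):
        if h >= z:
            out.append("lake")
        elif z - h <= wetland_depth:
            out.append("wetland")
        else:
            out.append("dry")
    return out

cells = classify_cells(head=[101.2, 99.5, 95.0],
                       dem=[100.0, 100.0, 100.0])
# → ['lake', 'wetland', 'dry']
```

    Applied across the interpolated head grid, contiguous "lake"/"wetland" cells would then be grouped by the cluster analysis the abstract mentions.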

  16. The ability of individuals to assess population density influences the evolution of emigration propensity and dispersal distance.

    PubMed

    Poethke, Hans Joachim; Gros, Andreas; Hovestadt, Thomas

    2011-08-07

    We analyze the simultaneous evolution of emigration and settlement decisions for actively dispersing species differing in their ability to assess population density. Using an individual-based model, we simulate dispersal as a multi-step (patch-to-patch) movement in a world consisting of habitat patches surrounded by a hostile matrix. Each such step is associated with the same mortality risk. Our simulations show that individuals following an informed strategy, where emigration (and settlement) probability depends on local population density, evolve a lower (natal) emigration propensity but disperse over significantly larger distances (i.e., they postpone settlement longer) than individuals performing density-independent emigration. This holds especially when variation in environmental conditions is spatially correlated. Both effects can be traced to the informed individuals' ability to better exploit existing heterogeneity in reproductive chances. Yet, even moderate distance-dependent dispersal costs prevent the evolution of multi-step (long-distance) dispersal, irrespective of the dispersal strategy. Copyright © 2011 Elsevier Ltd. All rights reserved.
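
    The two strategies being contrasted can be sketched as emigration-probability functions. The threshold form, slope, and propensity values below are illustrative assumptions, not the study's evolved parameters.

```python
# An "informed" individual emigrates with a probability that rises with
# local population density; an "uninformed" one uses a fixed propensity
# everywhere (density-independent emigration).

def informed_emigration_prob(density, capacity, threshold=0.5, slope=4.0):
    """Probability of leaving rises once density nears carrying capacity."""
    rel = density / capacity
    return 0.0 if rel <= threshold else min(1.0, slope * (rel - threshold))

def uninformed_emigration_prob(density, capacity, propensity=0.2):
    """Density-independent strategy: same propensity in every patch."""
    return propensity

# In a crowded patch the informed strategy leaves far more readily:
crowded = informed_emigration_prob(density=90, capacity=100)  # 1.0
empty = informed_emigration_prob(density=20, capacity=100)    # 0.0
```

    Because the informed strategy stays put in good (uncrowded) patches, it can afford a lower natal emigration propensity while still travelling farther once it does leave.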

  17. Settlement Dynamics and Hierarchy from Agent Decision-Making: a Method Derived from Entropy Maximization.

    PubMed

    Altaweel, Mark

    2015-01-01

    This paper presents an agent-based complex system simulation of settlement structure change using methods derived from entropy-maximization modeling. The approach is applied to model the movement of people and goods in urban settings to study how settlement size hierarchy develops. While entropy maximization is well known for assessing settlement structure change over different spatiotemporal settings, approaches have rarely attempted to develop and apply this methodology to understand how individual and household decisions may affect settlement size distributions. A new method developed in this paper allows individual decision-makers to choose where to settle based on social-environmental factors and to evaluate settlements based on geography and relative benefits, while retaining concepts derived from entropy maximization, with settlement size affected by movement ability and site-attractiveness feedbacks. To demonstrate the applicability of the theoretical and methodological approach, case study settlement patterns from the Middle Bronze Age (MBA) and Iron Age (IA) in the Iraqi North Jazirah Survey (NJS) are used. Results indicate clear differences in settlement factors and household choices in simulations that lead to settlement size hierarchies comparable to the two evaluated periods. Conflict and socio-political cohesion, in both their presence and absence, are suggested to have major roles in shaping the observed settlement hierarchy. More broadly, the model is made applicable to different empirically based settings, while being generalized to incorporate data uncertainty, making the model useful for understanding urbanism from top-down and bottom-up perspectives.
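
    The entropy-maximization flavour of the settlement choice can be sketched as a classic spatial-interaction rule: a mover picks settlement j with probability proportional to attractiveness times a distance-decay term. This Wilson-style form and the parameter values are illustrative, not calibrated to the NJS survey data.

```python
import math

# Choice probability for settlement j seen from location i:
#   P(j) ∝ W_j * exp(-beta * d_ij)
# where W_j is site attractiveness and d_ij is travel distance; beta
# controls how strongly movement ability penalizes distance.

def choice_probabilities(attractiveness, distances, beta=0.5):
    scores = [w * math.exp(-beta * d)
              for w, d in zip(attractiveness, distances)]
    total = sum(scores)
    return [s / total for s in scores]

# Three candidate settlements seen from one household:
probs = choice_probabilities(attractiveness=[10.0, 5.0, 1.0],
                             distances=[2.0, 1.0, 0.5])
# Nearby small sites can come close to outcompeting distant large ones.
```

    Feeding each household's choice back into the attractiveness of the chosen site is what lets size hierarchies emerge from these individual decisions.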

  18. Reconstructing Demography and Social Behavior During the Neolithic Expansion from Genomic Diversity Across Island Southeast Asia.

    PubMed

    Vallée, François; Luciani, Aurélien; Cox, Murray P

    2016-12-01

    Archaeology, linguistics, and increasingly genetics are clarifying how populations moved from mainland Asia, through Island Southeast Asia, and out into the Pacific during the farming revolution. Yet key features of this process remain poorly understood, particularly how social behaviors intersected with demographic drivers to create the patterns of genomic diversity observed across Island Southeast Asia today. Such questions are ripe for computer modeling. Here, we construct an agent-based model to simulate human mobility across Island Southeast Asia from the Neolithic period to the present, with a special focus on interactions between individuals with Asian, Papuan, and mixed Asian-Papuan ancestry. Incorporating key features of the region, including its complex geography (islands and sea), demographic drivers (fecundity and migration), and social behaviors (marriage preferences), the model simultaneously tracks a full suite of genomic markers (autosomes, X chromosome, mitochondrial DNA, and Y chromosome). Using Bayesian inference, model parameters were determined that produce simulations that closely resemble the admixture profiles of 2299 individuals from 84 populations across Island Southeast Asia. The results highlight that greater propensity to migrate and elevated birth rates are related drivers behind the expansion of individuals with Asian ancestry relative to individuals with Papuan ancestry, that offspring preferentially resulted from marriages between Asian women and Papuan men, and that in contrast to current thinking, individuals with Asian ancestry were likely distributed across large parts of western Island Southeast Asia before the Neolithic expansion. Copyright © 2016 Vallée et al.

  19. Reconstructing Demography and Social Behavior During the Neolithic Expansion from Genomic Diversity Across Island Southeast Asia

    PubMed Central

    Vallée, François; Luciani, Aurélien; Cox, Murray P.

    2016-01-01

    Archaeology, linguistics, and increasingly genetics are clarifying how populations moved from mainland Asia, through Island Southeast Asia, and out into the Pacific during the farming revolution. Yet key features of this process remain poorly understood, particularly how social behaviors intersected with demographic drivers to create the patterns of genomic diversity observed across Island Southeast Asia today. Such questions are ripe for computer modeling. Here, we construct an agent-based model to simulate human mobility across Island Southeast Asia from the Neolithic period to the present, with a special focus on interactions between individuals with Asian, Papuan, and mixed Asian–Papuan ancestry. Incorporating key features of the region, including its complex geography (islands and sea), demographic drivers (fecundity and migration), and social behaviors (marriage preferences), the model simultaneously tracks a full suite of genomic markers (autosomes, X chromosome, mitochondrial DNA, and Y chromosome). Using Bayesian inference, model parameters were determined that produce simulations that closely resemble the admixture profiles of 2299 individuals from 84 populations across Island Southeast Asia. The results highlight that greater propensity to migrate and elevated birth rates are related drivers behind the expansion of individuals with Asian ancestry relative to individuals with Papuan ancestry, that offspring preferentially resulted from marriages between Asian women and Papuan men, and that in contrast to current thinking, individuals with Asian ancestry were likely distributed across large parts of western Island Southeast Asia before the Neolithic expansion. PMID:27683274

  20. Closed loop models for analyzing engineering requirements for simulators

    NASA Technical Reports Server (NTRS)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed-loop analytic model, incorporating a model for the human pilot (namely, the optimal control model), was developed that allows certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, compared to an ideal continuous simulation, a discrete simulation can result in significant performance and/or workload penalties.

  1. Development of digital phantoms based on a finite element model to simulate low-attenuation areas in CT imaging for pulmonary emphysema quantification.

    PubMed

    Diciotti, Stefano; Nobis, Alessandro; Ciulli, Stefano; Landini, Nicholas; Mascalchi, Mario; Sverzellati, Nicola; Innocenti, Bernardo

    2017-09-01

    To develop an innovative finite element (FE) model of lung parenchyma which simulates pulmonary emphysema on CT imaging. The model is intended to generate a set of digital phantoms of low-attenuation area (LAA) images with different grades of emphysema severity. Four individual parameter configurations simulating different grades of emphysema severity were utilized to generate 40 FE models using ten randomizations for each setting. We compared two measures of emphysema severity (relative area (RA) and the exponent D of the cumulative distribution function of LAA cluster size) between the simulated LAA images and those computed directly on the model's output (considered as reference). The LAA images obtained from our model output can simulate CT-LAA images in subjects with different grades of emphysema severity. Both RA and D computed on simulated LAA images were underestimated compared to those calculated on the model's output, suggesting that measurements in CT imaging may not be accurate in the assessment of real emphysema extent. Our model is able to mimic the cluster size distribution of LAA on CT imaging of subjects with pulmonary emphysema. The model could be useful for generating standard test images and for designing physical phantoms of LAA images to assess the accuracy of indexes for the radiologic quantitation of emphysema.
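
    The two severity measures named above are straightforward to compute on a binary LAA mask. The toy mask and the generic 4-connected labelling below are illustrative (the exponent D would then come from a log-log fit to the cluster-size distribution, omitted here for brevity); this is not the authors' code.

```python
# RA = fraction of low-attenuation pixels; cluster sizes feed the
# cumulative size distribution whose exponent is D.

def relative_area(mask):
    flat = [v for row in mask for v in row]
    return sum(flat) / len(flat)

def cluster_sizes(mask):
    """4-connected component sizes via iterative flood fill."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                stack, size = [(r, c)], 0
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(size)
    return sizes

laa = [[1, 1, 0, 0],   # toy 4x4 binary LAA mask (1 = low attenuation)
       [0, 0, 0, 1],
       [0, 1, 0, 1],
       [0, 1, 0, 0]]
ra = relative_area(laa)             # 6/16 = 0.375
sizes = sorted(cluster_sizes(laa))  # [2, 2, 2]
```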

  2. An individual-based growth and competition model for coastal redwood forest restoration

    USGS Publications Warehouse

    van Mantgem, Phillip J.; Das, Adrian J.

    2014-01-01

    Thinning treatments to accelerate coastal redwood forest stand development are in wide application, but managers have yet to identify prescriptions that might best promote Sequoia sempervirens (Lamb. ex D. Don) Endl. (redwood) growth. The creation of successful thinning prescriptions would be aided by identifying the underlying mechanisms governing how individual tree growth responds to competitive environments in coastal redwood forests. We created a spatially explicit individual-based model of tree competition and growth parameterized using surveys of upland redwood forests at Redwood National Park, California. We modeled competition for overstory trees (stems ≥ 20 cm in diameter at breast height (dbh; 1.37 m)) as growth reductions arising from the sizes, distances, and species identity of competitor trees. Our model explained up to half of the variation in individual tree growth, suggesting that neighborhood crowding is an important determinant of growth in this forest type. We used our model to simulate the effects of novel thinning prescriptions (e.g., 40% stand basal area removal) for redwood forest restoration, concluding that these treatments could lead to substantial growth releases, particularly for S. sempervirens. The results of this study, along with continued improvements to our model, will help to determine spacing and species composition that best encourage growth.
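
    A distance-weighted crowding index of the general kind described above (Hegyi-type) can be sketched as follows. The search radius, species weight, and example trees are assumptions for illustration; the paper's fitted model may take a different functional form.

```python
import math

# Competitive pressure on a focal tree grows with neighbour size and falls
# with distance; a species weight lets conspecific and heterospecific
# neighbours differ in effect (0.7 is an assumed value).

def crowding_index(focal, neighbours, radius=15.0):
    """focal/neighbours: dicts with x, y (m), dbh (cm), species."""
    total = 0.0
    for n in neighbours:
        d = math.hypot(n["x"] - focal["x"], n["y"] - focal["y"])
        if 0.0 < d <= radius:
            weight = 1.0 if n["species"] == focal["species"] else 0.7
            total += weight * (n["dbh"] / focal["dbh"]) / d
    return total

focal = {"x": 0.0, "y": 0.0, "dbh": 40.0, "species": "SESE"}
neighbours = [
    {"x": 3.0, "y": 4.0, "dbh": 80.0, "species": "SESE"},    # d = 5
    {"x": 0.0, "y": 10.0, "dbh": 20.0, "species": "PSME"},   # d = 10
    {"x": 30.0, "y": 0.0, "dbh": 100.0, "species": "SESE"},  # beyond radius
]
ci = crowding_index(focal, neighbours)  # 0.4 + 0.7*0.05 = 0.435
```

    Simulating a thinning prescription then amounts to removing neighbours from the list and recomputing the index, with growth predicted to rise as crowding falls.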

  3. Progress report on daily flow-routing simulation for the Carson River, California and Nevada

    USGS Publications Warehouse

    Hess, G.W.

    1996-01-01

    A physically based flow-routing model using Hydrological Simulation Program-FORTRAN (HSPF) was constructed for modeling streamflow in the Carson River at daily time intervals as part of the Truckee-Carson Program of the U.S. Geological Survey (USGS). Daily streamflow data for water years 1978-92 for the mainstem river, tributaries, and irrigation ditches from the East Fork Carson River near Markleeville and West Fork Carson River at Woodfords down to the mainstem Carson River at Fort Churchill upstream from Lahontan Reservoir were obtained from several agencies and were compiled into a comprehensive data base. No previous physically based flow-routing model of the Carson River has incorporated multi-agency streamflow data into a single data base and simulated flow at a daily time interval. Where streamflow data were unavailable or incomplete, hydrologic techniques were used to estimate some flows. For modeling purposes, the Carson River was divided into six segments, which correspond to those used in the Alpine Decree that governs water rights along the river. Hydraulic characteristics were defined for 48 individual stream reaches based on cross-sectional survey data obtained from field surveys and previous studies. Simulation results from the model were compared with available observed and estimated streamflow data. Model testing demonstrated that hydraulic characteristics of the Carson River are adequately represented in the models for a range of flow regimes. Differences between simulated and observed streamflow result mostly from inadequate data characterizing inflow and outflow from the river. Because irrigation return flows are largely unknown, irrigation return flow percentages were used as a calibration parameter to minimize differences between observed and simulated streamflows. 
Observed and simulated streamflow were compared for daily periods for the full modeled length of the Carson River and for two major subreaches modeled with more detailed input data. Hydrographs and statistics presented in this report describe these differences. A sensitivity analysis of four estimated components of the hydrologic system evaluated which components were significant in the model. Estimated ungaged tributary streamflow is not a significant component of the model during low runoff, but is significant during high runoff. The sensitivity analysis indicates that changes in the estimated irrigation diversion and estimated return flow create a noticeable change in the statistics. The modeling for this study is preliminary. Results of the model are constrained by current availability and accuracy of observed hydrologic data. Several inflows and outflows of the Carson River are not described by time-series data and therefore are not represented in the model.

  4. Improved fourth-year medical student clinical decision-making performance as a resuscitation team leader after a simulation-based curriculum.

    PubMed

    Ten Eyck, Raymond P; Tews, Matthew; Ballester, John M; Hamilton, Glenn C

    2010-06-01

    To determine the impact of simulation-based instruction on student performance in the role of emergency department resuscitation team leader. A randomized, single-blinded, controlled study using an intention to treat analysis. Eighty-three fourth-year medical students enrolled in an emergency medicine clerkship were randomly allocated to two groups differing only by instructional format. Each student individually completed an initial simulation case, followed by a standardized curriculum of eight cases in either group simulation or case-based group discussion format before a second individual simulation case. A remote coinvestigator measured eight objective performance end points using digital recordings of all individual simulation cases. McNemar chi2, Pearson correlation, repeated measures multivariate analysis of variance, and follow-up analysis of variance were used for statistical evaluation. Sixty-eight students (82%) completed both initial and follow-up individual simulations. Eight students were lost from the simulation group and seven from the discussion group. The mean postintervention case performance was significantly better for the students allocated to simulation instruction compared with the group discussion students for four outcomes including a decrease in mean time to (1) order an intravenous line; (2) initiate cardiac monitoring; (3) order initial laboratory tests; and (4) initiate blood pressure monitoring. Paired comparisons of each student's initial and follow-up simulations demonstrated significant improvement in the same four areas, in mean time to order an abdominal radiograph and in obtaining an allergy history. A single simulation-based teaching session significantly improved student performance as a team leader. Additional simulation sessions provided further improvement compared with instruction provided in case-based group discussion format.

  5. Creating "Intelligent" Climate Model Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, N. C.; Taylor, P. C.

    2014-12-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is often used to add value to model projections: consensus projections have been shown to consistently outperform individual models. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, certain models reproduce climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequal weighting multi-model ensembles. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables—e.g., outgoing longwave radiation and surface temperature. Metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument and surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing weighted and unweighted model ensembles. For example, one tested metric weights the ensemble by how well models reproduce the time-series probability distribution of the cloud forcing component of reflected shortwave radiation. 
The weighted ensemble for this metric indicates lower simulated precipitation (up to 0.7 mm/day) in tropical regions than the unweighted ensemble: since CMIP5 models have been shown to overproduce precipitation, this result could indicate that the metric is effective in identifying models which simulate more realistic precipitation. Ultimately, the goal of the framework is to identify performance metrics that advise better methods for ensemble averaging of models and lead to better climate predictions.
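
    The unequal-weighting idea can be sketched in a few lines: each model gets a weight from its error on a chosen process-based metric (smaller error, larger weight), and the ensemble projection is the weighted mean. The inverse-error weighting rule and all values below are illustrative assumptions, not the study's metrics.

```python
# Three hypothetical models: a projected quantity and each model's error
# on some evaluation metric (e.g. an RMSE against observations).

def metric_weights(errors):
    """Normalized inverse-error weights, one per model."""
    inv = [1.0 / e for e in errors]
    s = sum(inv)
    return [w / s for w in inv]

def weighted_ensemble(projections, weights):
    return sum(p * w for p, w in zip(projections, weights))

projections = [2.0, 3.0, 4.5]   # e.g. projected warming (K), hypothetical
errors = [0.5, 1.0, 2.0]        # hypothetical metric errors

equal = sum(projections) / len(projections)        # unweighted mean ≈ 3.17
weights = metric_weights(errors)                   # [4/7, 2/7, 1/7]
weighted = weighted_ensemble(projections, weights)  # ≈ 2.64
```

    The framework's job is then to decide which metric's weights, if any, actually improve the projection relative to the equal-weighted consensus.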

  6. Quantitative evaluation of simulated functional brain networks in graph theoretical analysis.

    PubMed

    Lee, Won Hee; Bullmore, Ed; Frangou, Sophia

    2017-02-01

    There is increasing interest in the potential of whole-brain computational models to provide mechanistic insights into resting-state brain networks. It is therefore important to determine the degree to which computational models reproduce the topological features of empirical functional brain networks. We used empirical connectivity data derived from diffusion spectrum and resting-state functional magnetic resonance imaging data from healthy individuals. Empirical and simulated functional networks, constrained by structural connectivity, were defined based on 66 brain anatomical regions (nodes). Simulated functional data were generated using the Kuramoto model in which each anatomical region acts as a phase oscillator. Network topology was studied using graph theory in the empirical and simulated data. The difference (relative error) between graph theory measures derived from empirical and simulated data was then estimated. We found that simulated data can be used with confidence to model graph measures of global network organization at different dynamic states and highlight the sensitive dependence of the solutions obtained in simulated data on the specified connection densities. This study provides a method for the quantitative evaluation and external validation of graph theory metrics derived from simulated data that can be used to inform future study designs. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
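
    The Kuramoto step used to generate the simulated functional data can be sketched directly. The 4-node all-to-all matrix, coupling strength, and natural frequencies below are toy values, not the 66-region empirical connectome.

```python
import math, random

# Each brain region is a phase oscillator; phases are coupled through a
# structural connectivity matrix C and advanced with a simple Euler step:
#   theta_i(t+dt) = theta_i + dt * (omega_i + k * sum_j C_ij sin(theta_j - theta_i))

def kuramoto_step(theta, omega, C, k, dt):
    new = []
    for i, th_i in enumerate(theta):
        coupling = sum(C[i][j] * math.sin(th_j - th_i)
                       for j, th_j in enumerate(theta))
        new.append(th_i + dt * (omega[i] + k * coupling))
    return new

def order_parameter(theta):
    """Kuramoto synchrony R in [0, 1]."""
    re = sum(math.cos(t) for t in theta) / len(theta)
    im = sum(math.sin(t) for t in theta) / len(theta)
    return math.hypot(re, im)

random.seed(0)
n = 4
C = [[0 if i == j else 1 for j in range(n)] for i in range(n)]  # toy network
omega = [1.0, 1.1, 0.9, 1.05]
theta = [random.uniform(0, 2 * math.pi) for _ in range(n)]
for _ in range(2000):
    theta = kuramoto_step(theta, omega, C, k=0.5, dt=0.01)
r = order_parameter(theta)  # strong coupling drives R toward 1
```

    Sweeping the coupling strength (or the connection density, as in the study) moves the system between incoherent and synchronized dynamic states, which is what the graph comparison probes.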

  7. Monte Carlo dosimetry for {sup 103}Pd, {sup 125}I, and {sup 131}Cs ocular brachytherapy with various plaque models using an eye phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lesperance, Marielle; Martinov, M.; Thomson, R. M., E-mail: rthomson@physics.carleton.ca

    Purpose: To investigate dosimetry for ocular brachytherapy for a range of eye plaque models containing {sup 103}Pd, {sup 125}I, or {sup 131}Cs seeds with model-based dose calculations. Methods: Five representative plaque models are developed based on a literature review and are compared to the standardized COMS plaque, including plaques consisting of a stainless steel backing and acrylic insert, and gold alloy backings with: short collimating lips and acrylic insert, no lips and silicone polymer insert, no lips and a thin acrylic layer, and individual collimating slots for each seed within the backing and no insert. Monte Carlo simulations are performed using the EGSnrc user-code BrachyDose for single and multiple seed configurations for the plaques in water and within an eye model (including nonwater media). Simulations under TG-43 assumptions are also performed, i.e., with the same seed configurations in water, neglecting interseed and plaque effects. Maximum and average doses to ocular structures as well as isodose contours are compared for simulations of each radionuclide within the plaque models. Results: The presence of the plaque affects the dose distribution substantially along the plaque axis for both single seed and multiseed simulations of each plaque design in water. Of all the plaque models, the COMS plaque generally has the largest effect on the dose distribution in water along the plaque axis. Differences between doses for single and multiple seed configurations vary between plaque models and radionuclides. Collimation is most substantial for the plaque with individual collimating slots. For plaques in the full eye model, average dose in the tumor region differs from those for the TG-43 simulations by up to 10% for {sup 125}I and {sup 131}Cs, and up to 17% for {sup 103}Pd, and in the lens region by up to 29% for {sup 125}I, 34% for {sup 103}Pd, and 28% for {sup 131}Cs.
For the same prescription dose to the tumor apex, the lowest doses to critical ocular structures are generally delivered with plaques containing {sup 103}Pd seeds. Conclusions: The combined effects of ocular and plaque media on dose are significant and vary with plaque model and radionuclide, suggesting the importance of model-based dose calculations employing accurate ocular and plaque media and geometries for eye plaque brachytherapy.

  8. Inferring multi-scale neural mechanisms with brain network modelling

    PubMed Central

    Schirner, Michael; McIntosh, Anthony Randal; Jirsa, Viktor; Deco, Gustavo

    2018-01-01

    The neurophysiological processes underlying non-invasive brain activity measurements are incompletely understood. Here, we developed a connectome-based brain network model that integrates individual structural and functional data with neural population dynamics to support multi-scale neurophysiological inference. Simulated populations were linked by structural connectivity and, as a novelty, driven by electroencephalography (EEG) source activity. Simulations not only predicted subjects' individual resting-state functional magnetic resonance imaging (fMRI) time series and spatial network topologies over 20 minutes of activity, but more importantly, they also revealed precise neurophysiological mechanisms that underlie and link six empirical observations from different scales and modalities: (1) resting-state fMRI oscillations, (2) functional connectivity networks, (3) excitation-inhibition balance, (4, 5) inverse relationships between α-rhythms, spike-firing and fMRI on short and long time scales, and (6) fMRI power-law scaling. These findings underscore the potential of this new modelling framework for general inference and integration of neurophysiological knowledge to complement empirical studies. PMID:29308767

  9. Estimating degradation in real time and accelerated stability tests with random lot-to-lot variation: a simulation study.

    PubMed

    Magari, Robert T

    2002-03-01

    The effect of different levels of lot-to-lot variability on the prediction of stability is studied based on two statistical models for estimating degradation in real-time and accelerated stability tests. Lot-to-lot variability is considered random in both models and is attributed to two sources: variability at time zero and variability of the degradation rate. Real-time stability tests are modeled as a function of time, while accelerated stability tests are modeled as a function of time and temperature. Several data sets were simulated, and a maximum likelihood approach was used for estimation. The 95% confidence intervals for the degradation rate depend on the amount of lot-to-lot variability. When lot-to-lot degradation rate variability is relatively large (CV ≥ 8%), the estimated confidence intervals do not represent the trend for individual lots. In such cases it is recommended to analyze each lot individually. Copyright 2002 Wiley-Liss, Inc. and the American Pharmaceutical Association. J Pharm Sci 91: 893-899, 2002.
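
    The simulated-data setup can be sketched as follows. Each lot draws its own intercept (value at time zero) and degradation rate from normal distributions, then concentration declines linearly with measurement noise; all numeric values are assumed for illustration. Large rate variability (high CV) is precisely what makes a pooled confidence interval unrepresentative of individual lots.

```python
import random, statistics

def simulate_lot(times, b0=100.0, rate=-0.5, sd_b0=1.0, sd_rate=0.1,
                 sd_noise=0.5):
    """One lot: random intercept and random degradation rate, linear decay."""
    lot_b0 = random.gauss(b0, sd_b0)
    lot_rate = random.gauss(rate, sd_rate)
    series = [lot_b0 + lot_rate * t + random.gauss(0, sd_noise)
              for t in times]
    return lot_rate, series

random.seed(42)
times = list(range(0, 13))  # e.g. monthly measurements over a year
rates = [simulate_lot(times)[0] for _ in range(50)]

mean_rate = statistics.mean(rates)                     # near -0.5
cv = abs(statistics.stdev(rates) / mean_rate) * 100.0  # lot-to-lot CV in %
```

    With sd_rate = 0.1 on a mean rate of -0.5, the lot-to-lot CV is around 20%, well above the 8% threshold at which the abstract recommends lot-by-lot analysis.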

  10. Modeling Social Capital as Dynamic Networks to Promote Access to Oral Healthcare

    PubMed Central

    Northridge, Mary E.; Kunzel, Carol; Zhang, Qiuyi; Kum, Susan S.; Gilbert, Jessica L.; Jin, Zhu; Metcalf, Sara S.

    2016-01-01

    Social capital, as comprised of human connections in social networks and their associated benefits, is closely related to the health of individuals, communities, and societies at large. For disadvantaged population groups such as older adults and racial/ethnic minorities, social capital may play a particularly critical role in mitigating the negative effects and reinforcing the positive effects on health. In this project, we model social capital as both cause and effect by simulating dynamic networks. Informed in part by a community-based health promotion program, an agent-based model is contextualized in a GIS environment to explore the complexity of social disparities in oral and general health as experienced at the individual, interpersonal, and community scales. This study provides the foundation for future work investigating how health and healthcare accessibility may be influenced by social networks. PMID:27668298

  11. Modeling Social Capital as Dynamic Networks to Promote Access to Oral Healthcare.

    PubMed

    Wang, Hua; Northridge, Mary E; Kunzel, Carol; Zhang, Qiuyi; Kum, Susan S; Gilbert, Jessica L; Jin, Zhu; Metcalf, Sara S

    2016-01-01

    Social capital, as comprised of human connections in social networks and their associated benefits, is closely related to the health of individuals, communities, and societies at large. For disadvantaged population groups such as older adults and racial/ethnic minorities, social capital may play a particularly critical role in mitigating the negative effects and reinforcing the positive effects on health. In this project, we model social capital as both cause and effect by simulating dynamic networks. Informed in part by a community-based health promotion program, an agent-based model is contextualized in a GIS environment to explore the complexity of social disparities in oral and general health as experienced at the individual, interpersonal, and community scales. This study provides the foundation for future work investigating how health and healthcare accessibility may be influenced by social networks.

  12. Simulating farmer behaviour under water markets

    NASA Astrophysics Data System (ADS)

    Padula, Silvia; Erfani, Tohid; Henriques, Catarina; Maziotis, Alexandros; Garbe, Jennifer; Swinscoe, Thomas; Harou, Julien; Weatherhead, Keith; Beevers, Lindsay; Fleskens, Luuk

    2015-04-01

    Increasing water scarcity may lead water managers to consider alternative approaches to water allocation, including water markets. One concern with markets is how specific sectors will interact with a potential water market: when will they gain or lose water, will they benefit economically, and why, when and how? The behaviour of different individual abstractors or institutional actors under water markets is of interest to regulators who seek to design effective market policies that satisfy multiple stakeholder groups. In this study we consider two dozen agricultural water users in eastern England (Nar basin). Using partially synthetic but regionally representative cropping and irrigation data, we simulate the buying and selling behaviour of farmers on a weekly basis over multiple years. The impact of on-farm water storage is assessed for farmers who own a reservoir. A river-basin-scale hydro-economic multi-agent model is used that represents individual abstractors and can simulate a spot market under various licensing regimes. Weekly varying economic demand curves for water are calibrated based on historical climate and water use data. The model represents the trade-off between current use value and expected gains from trade to reach weekly decisions. Early results are discussed, and model limitations and possible extensions are presented.

  13. A hybrid computational model to explore the topological characteristics of epithelial tissues.

    PubMed

    González-Valverde, Ismael; García-Aznar, José Manuel

    2017-11-01

    Epithelial tissues show a particular topology where cells resemble a polygon-like shape, but some biological processes can alter this tissue topology. During cell proliferation, mitotic cell dilation deforms the tissue and modifies the tissue topology. Additionally, cells are reorganized in the epithelial layer and these rearrangements also alter the polygon distribution. We present here a computer-based hybrid framework focused on the simulation of epithelial layer dynamics that combines discrete and continuum numerical models. In this framework, we consider topological and mechanical aspects of the epithelial tissue. Individual cells in the tissue are simulated by an off-lattice agent-based model, which keeps the information of each cell. In addition, we model the cell-cell interaction forces and the cell cycle. Separately, we simulate the passive mechanical behaviour of the cell monolayer using a material that approximates the mechanical properties of the cell. This continuum approach is solved by the finite element method, which uses a dynamic mesh generated by the triangulation of cell polygons. Forces generated by cell-cell interaction in the agent-based model are also applied on the finite element mesh. Cell movement in the agent-based model is driven by the displacements obtained from the deformed finite element mesh of the continuum mechanical approach. We successfully compare the results of our simulations with experiments on the topology of proliferating epithelial tissues in Drosophila. Our framework is able to model the emergent behaviour of the cell monolayer that is due to local cell-cell interactions, which have a direct influence on the dynamics of the epithelial tissue. Copyright © 2017 John Wiley & Sons, Ltd.

  14. A mechanistic Individual-based Model of microbial communities.

    PubMed

    Jayathilake, Pahala Gedara; Gupta, Prashant; Li, Bowen; Madsen, Curtis; Oyebamiji, Oluwole; González-Cabaleiro, Rebeca; Rushton, Steve; Bridgens, Ben; Swailes, David; Allen, Ben; McGough, A Stephen; Zuliani, Paolo; Ofiteru, Irina Dana; Wilkinson, Darren; Chen, Jinju; Curtis, Tom

    2017-01-01

    Accurate predictive modelling of the growth of microbial communities requires the credible representation of the interactions of biological, chemical and mechanical processes. However, although biological and chemical processes are represented in a number of Individual-based Models (IbMs), the interaction of growth and mechanics is limited. Conversely, there are mechanically sophisticated IbMs with only elementary biology and chemistry. This study focuses on addressing these limitations by developing a flexible IbM that can robustly combine the biological, chemical and physical processes that dictate the emergent properties of a wide range of bacterial communities. This IbM is developed by creating a microbiological adaptation of the open source Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS). This innovation should provide the basis for "bottom up" prediction of the emergent behaviour of entire microbial systems. In the model presented here, bacterial growth, division, decay, mechanical contact among bacterial cells, and adhesion between the bacteria and extracellular polymeric substances are incorporated. In addition, fluid-bacteria interaction is implemented to simulate biofilm deformation and erosion. The model predicts that the surface morphology of biofilms becomes smoother with increased nutrient concentration, which agrees well with previous literature. In addition, the results show that increased shear rate results in smoother and more compact biofilms. The model can also predict shear-rate-dependent biofilm deformation, erosion, streamer formation and breakup.
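    The growth/division bookkeeping that such an IbM performs for every cell can be illustrated in a few lines. This is our own toy sketch, not the LAMMPS-based implementation: cells grow at a Monod-limited rate and split into equal halves at a threshold mass, with all parameter values invented for illustration.

```python
def grow_and_divide(cells, nutrient, mu_max=0.7, Ks=0.5, m_div=2.0, dt=1.0):
    """cells: list of cell masses. Each cell grows at a Monod-limited rate
    and splits into two equal halves once it reaches the division mass."""
    mu = mu_max * nutrient / (Ks + nutrient)   # Monod growth rate
    out = []
    for m in cells:
        m *= 1.0 + mu * dt                     # discrete-time growth step
        if m >= m_div:
            out.extend([m / 2.0, m / 2.0])     # binary fission
        else:
            out.append(m)
    return out

cells = [1.0]
for _ in range(5):
    cells = grow_and_divide(cells, nutrient=1.0)
# One founder cell has become four growing daughters.
```

    The full model layers mechanical contact, EPS adhesion and fluid forcing on top of exactly this kind of per-individual update.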

  15. A mechanistic Individual-based Model of microbial communities

    PubMed Central

    Gupta, Prashant; Li, Bowen; Madsen, Curtis; Oyebamiji, Oluwole; González-Cabaleiro, Rebeca; Rushton, Steve; Bridgens, Ben; Swailes, David; Allen, Ben; McGough, A. Stephen; Zuliani, Paolo; Ofiteru, Irina Dana; Wilkinson, Darren; Chen, Jinju; Curtis, Tom

    2017-01-01

    Accurate predictive modelling of the growth of microbial communities requires the credible representation of the interactions of biological, chemical and mechanical processes. However, although biological and chemical processes are represented in a number of Individual-based Models (IbMs), the interaction of growth and mechanics is limited. Conversely, there are mechanically sophisticated IbMs with only elementary biology and chemistry. This study focuses on addressing these limitations by developing a flexible IbM that can robustly combine the biological, chemical and physical processes that dictate the emergent properties of a wide range of bacterial communities. This IbM is developed by creating a microbiological adaptation of the open source Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS). This innovation should provide the basis for “bottom up” prediction of the emergent behaviour of entire microbial systems. In the model presented here, bacterial growth, division, decay, mechanical contact among bacterial cells, and adhesion between the bacteria and extracellular polymeric substances are incorporated. In addition, fluid-bacteria interaction is implemented to simulate biofilm deformation and erosion. The model predicts that the surface morphology of biofilms becomes smoother with increased nutrient concentration, which agrees well with previous literature. In addition, the results show that increased shear rate results in smoother and more compact biofilms. The model can also predict shear-rate-dependent biofilm deformation, erosion, streamer formation and breakup. PMID:28771505

  16. Simulating the Effects of Cross-Generational Cultural Transmission on Language Change

    NASA Astrophysics Data System (ADS)

    Gong, Tao; Shuai, Lan

    Language evolves in a socio-cultural environment. Apart from biological evolution and individual learning, cultural transmission also exerts an important influence on many aspects of language evolution. In this paper, based on the lexicon-syntax coevolution model, we extend the acquisition framework of our previous work to examine the roles in language change of three forms of cultural transmission spanning the offspring, parent, and grandparent generations. These transmissions are: those between the parent and offspring generations (PO), those within the offspring generation (OO), and those between the grandparent and offspring generations (GO). The simulation results of the considered model and the relevant analyses illustrate not only the necessity of PO and OO transmissions for language change, echoing our previous findings, but also the importance of GO transmission, a form of cross-generational cultural transmission, for preserving the mutual understandability of the communal language across generations of individuals.
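    The three transmission channels can be sketched as a mixture from which each learner samples its input. This is our own minimal iterated-learning toy, far simpler than the lexicon-syntax coevolution model: "language" is reduced to a single binary variant, and the weights and noise rate are invented.

```python
import random

def evolve(generations, w_po, w_oo, w_go, n=100, flip=0.05, seed=1):
    """Weights w_po + w_oo + w_go should sum to 1. Returns the share of the
    final offspring generation still using the original communal variant."""
    random.seed(seed)
    grand = [1] * n                       # 1 = the communal variant
    parent = [1] * n
    for _ in range(generations):
        offspring = []
        for _ in range(n):
            r = random.random()
            if r < w_po:                  # parent-to-offspring input (PO)
                v = random.choice(parent)
            elif r < w_po + w_oo:         # peer input within the offspring (OO)
                v = random.choice(offspring) if offspring else random.choice(parent)
            else:                         # grandparent-to-offspring input (GO)
                v = random.choice(grand)
            if random.random() < flip:    # learning noise / innovation
                v = 1 - v
            offspring.append(v)
        grand, parent = parent, offspring
    return sum(parent) / n

share = evolve(10, w_po=0.5, w_oo=0.3, w_go=0.2)
```

    Because the grandparent generation is one step more conservative than the parents, a nonzero GO weight anchors learners to older usage, which is the intuition behind GO transmission preserving cross-generational understandability.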

  17. GSFLOW - Coupled Ground-Water and Surface-Water Flow Model Based on the Integration of the Precipitation-Runoff Modeling System (PRMS) and the Modular Ground-Water Flow Model (MODFLOW-2005)

    USGS Publications Warehouse

    Markstrom, Steven L.; Niswonger, Richard G.; Regan, R. Steven; Prudic, David E.; Barlow, Paul M.

    2008-01-01

    The need to assess the effects of variability in climate, biota, geology, and human activities on water availability and flow requires the development of models that couple two or more components of the hydrologic cycle. An integrated hydrologic model called GSFLOW (Ground-water and Surface-water FLOW) was developed to simulate coupled ground-water and surface-water resources. The new model is based on the integration of the U.S. Geological Survey Precipitation-Runoff Modeling System (PRMS) and the U.S. Geological Survey Modular Ground-Water Flow Model (MODFLOW). Additional model components were developed, and existing components were modified, to facilitate integration of the models. Methods were developed to route flow among the PRMS Hydrologic Response Units (HRUs) and between the HRUs and the MODFLOW finite-difference cells. This report describes the organization, concepts, design, and mathematical formulation of all GSFLOW model components. An important aspect of the integrated model design is its ability to conserve water mass and to provide comprehensive water budgets for a location of interest. This report includes descriptions of how water budgets are calculated for the integrated model and for individual model components. GSFLOW provides a robust modeling system for simulating flow through the hydrologic cycle, while allowing for future enhancements to incorporate other simulation techniques.
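    The mass-conservation property the report emphasises amounts to a budget-closure identity that can be checked numerically. The sketch below is generic water-budget bookkeeping, not GSFLOW code, and the annual totals are hypothetical.

```python
def budget_residual(precip, et, streamflow, recharge, d_storage):
    """Closure check for one budget period: precipitation minus all sinks
    and the change in storage. All terms in consistent units (e.g. mm over
    the area of interest); a residual of zero indicates a closed budget."""
    return precip - (et + streamflow + recharge + d_storage)

# Hypothetical annual totals for one HRU (mm):
r = budget_residual(precip=900.0, et=550.0, streamflow=250.0,
                    recharge=80.0, d_storage=20.0)
# r == 0.0 here, so this toy budget closes exactly.
```

    In the integrated model, the same accounting is performed both for the coupled system and for each component, so a nonzero residual localises where mass is being lost or gained spuriously.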

  18. A numerical multi-scale model to predict macroscopic material anisotropy of multi-phase steels from crystal plasticity material definitions

    NASA Astrophysics Data System (ADS)

    Ravi, Sathish Kumar; Gawad, Jerzy; Seefeldt, Marc; Van Bael, Albert; Roose, Dirk

    2017-10-01

    A numerical multi-scale model is being developed to predict the anisotropic macroscopic material response of multi-phase steel. The embedded microstructure is given by a meso-scale Representative Volume Element (RVE), which captures the most relevant features, such as phase distribution, grain orientation and morphology, in sufficient detail to describe the multi-phase behavior of the material. A Finite Element (FE) mesh of the RVE is constructed using statistical information from the individual phases, such as the grain size distribution and the orientation distribution function (ODF). The material response of the RVE is obtained for selected loading/deformation modes through numerical FE simulations in Abaqus. For the elasto-plastic response of the individual grains, single crystal plasticity based plastic potential functions are proposed as Abaqus material definitions. The plastic potential functions are derived using the Facet method for individual phases in the microstructure at the level of single grains. The proposed method is a new modeling framework; the results presented here, in terms of macroscopic flow curves, are based on the building blocks of the approach, and the model will eventually facilitate the construction of an anisotropic yield locus of the underlying multi-phase microstructure derived from a crystal plasticity based framework.
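    The idea of building a macroscopic response from phase-level material definitions can be illustrated with the crudest possible homogenization. The sketch below is a toy Voigt-style volume average with a Ludwik hardening law per phase - far simpler than the Facet-based potentials and FE simulations described - and the dual-phase parameters are invented for illustration.

```python
def macro_flow_stress(phases, strain):
    """phases: list of (volume_fraction, sigma_y, K, n) tuples, one per
    phase, each following a Ludwik-type law sigma = sigma_y + K * strain**n.
    Returns the volume-weighted (Voigt) average flow stress in MPa."""
    assert abs(sum(f for f, *_ in phases) - 1.0) < 1e-9  # fractions sum to 1
    return sum(f * (sy + K * strain**n) for f, sy, K, n in phases)

# Hypothetical dual-phase steel: 80% soft ferrite, 20% hard martensite.
phases = [(0.8, 300.0, 600.0, 0.25),   # ferrite
          (0.2, 1000.0, 800.0, 0.10)]  # martensite
stress = macro_flow_stress(phases, strain=0.05)
```

    The actual framework replaces both the mixture rule (by an FE solution over the RVE mesh) and the hardening law (by crystal-plasticity-derived plastic potentials), but the flow of information - phase-level definitions in, macroscopic flow curve out - is the same.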

  19. Evaluating the impacts of screening and smoking cessation programmes on lung cancer in a high-burden region of the USA: a simulation modelling study.

    PubMed

    Tramontano, Angela C; Sheehan, Deirdre F; McMahon, Pamela M; Dowling, Emily C; Holford, Theodore R; Ryczak, Karen; Lesko, Samuel M; Levy, David T; Kong, Chung Yin

    2016-02-29

    While the US Preventive Services Task Force has issued recommendations for lung cancer screening, its effectiveness at reducing lung cancer burden may vary at local levels due to regional variations in smoking behaviour. Our objective was to use an existing model to determine the impacts of lung cancer screening alone or in addition to increased smoking cessation in a US region with a relatively high smoking prevalence and lung cancer incidence. Computer-based simulation model. Simulated population of individuals aged 55 and older based on smoking prevalence and census data from Northeast Pennsylvania. Hypothetical lung cancer control from 2014 to 2050 through (1) screening with CT, (2) intensified smoking cessation, or (3) a combination strategy. Primary outcomes were lung cancer mortality rates. Secondary outcomes included the number of people eligible for screening and the number of radiation-induced lung cancers. Combining lung cancer screening with increased smoking cessation would yield an estimated 8.1% reduction in cumulative lung cancer mortality by 2050. Our model estimated that the number of screening-eligible individuals would progressively decrease over time, indicating declining benefit of a screening-only programme. Lung cancer screening achieved a greater mortality reduction in earlier years, but was later surpassed by smoking cessation. Combining smoking cessation programmes with lung cancer screening would provide the most benefit to a population, especially considering the growing proportion of patients ineligible for screening based on current recommendations. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
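    The finding that the screening-eligible pool shrinks over time follows from the eligibility rules themselves, which a toy cohort makes concrete. This is our own illustration, not the authors' model: it applies only a USPSTF-style "quit within 15 years" rule (age and pack-year criteria are omitted), and the smoking histories are hypothetical.

```python
def eligible_count(years_since_quit, window=15):
    """years_since_quit: 0 for current smokers, otherwise years since
    quitting. Eligible if still smoking or quit within `window` years."""
    return sum(1 for y in years_since_quit if y <= window)

cohort = [0, 0, 3, 10, 14, 16, 20]    # hypothetical smoking histories
now = eligible_count(cohort)          # 5 of 7 eligible today
five_years_on = [y if y == 0 else y + 5 for y in cohort]  # no new quitters
later = eligible_count(five_years_on)  # only 4 remain eligible
```

    With cessation running ahead of new smoking initiation, former smokers age out of the 15-year window faster than they are replaced, so a screening-only programme reaches a shrinking population - the dynamic the model quantifies.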

  20. Mathematical modeling of malaria infection with innate and adaptive immunity in individuals and agent-based communities.

    PubMed

    Gurarie, David; Karl, Stephan; Zimmerman, Peter A; King, Charles H; St Pierre, Timothy G; Davis, Timothy M E

    2012-01-01

    Agent-based modeling of Plasmodium falciparum infection offers an attractive alternative to the conventional Ross-Macdonald methodology, as it allows simulation of heterogeneous communities subjected to realistic transmission (inoculation) patterns. We developed a new agent-based model that accounts for the essential in-host processes: parasite replication and its regulation by innate and adaptive immunity. The model also incorporates a simplified version of antigenic variation by Plasmodium falciparum. We calibrated the model using data from malaria-therapy (MT) studies, and developed a novel calibration procedure that accounts for a deterministic and a pseudo-random component in the observed parasite density patterns. Using the parasite density patterns of 122 MT patients, we generated a large number of calibrated parameters. The resulting data set served as a basis for constructing and simulating heterogeneous agent-based (AB) communities of MT-like hosts. We conducted several numerical experiments subjecting AB communities to realistic inoculation patterns reported from previous field studies, and compared the model output to the observed malaria prevalence in the field. There was overall consistency, supporting the potential of this agent-based methodology to represent transmission in realistic communities. Our approach represents a novel, convenient and versatile method to model Plasmodium falciparum infection.
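    The in-host dynamics described - parasite replication regulated by innate and adaptive immunity - can be caricatured in a few lines. These are our own toy update rules with invented parameter values, not the authors' calibrated system: growth saturates against an innate density cap, while adaptive immunity accumulates with exposure and eventually suppresses the infection.

```python
def simulate(days, p0=10.0, r=8.0, cap=1e5, k=2e-6):
    """Daily parasite density update: multiplicative growth saturated by an
    innate density cap, killing by adaptive immunity, and immune build-up
    proportional to exposure. Returns the daily density series."""
    p, adaptive, series = p0, 0.0, []
    for _ in range(days):
        p *= r / (1.0 + p / cap)      # growth, limited by the innate response
        p /= 1.0 + adaptive           # killing by adaptive immunity
        adaptive += k * p             # immunity builds with antigen exposure
        series.append(p)
    return series

densities = simulate(30)
peak = max(densities)                 # density peaks early, then wanes
```

    Even this caricature reproduces the qualitative malaria-therapy pattern - an early peak followed by immune-driven decline - which is the deterministic component the authors' calibration procedure separates from the pseudo-random one.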
