Science.gov

Sample records for agent-based simulation framework

  1. A Multi Agent-Based Framework for Simulating Household PHEV Distribution and Electric Distribution Network Impact

    SciTech Connect

    Cui, Xiaohui; Liu, Cheng; Kim, Hoe Kyoung; Kao, Shih-Chieh; Tuttle, Mark A; Bhaduri, Budhendra L

    2011-01-01

    The variation of household attributes such as income, travel distance, age, household size, and education across residential areas may generate different market penetration rates for plug-in hybrid electric vehicles (PHEVs). Residential areas with higher PHEV ownership could increase peak electric demand locally and require utilities to upgrade the electric distribution infrastructure even though the capacity of the regional power grid is under-utilized. Estimating the future PHEV ownership distribution at the residential household level can help us understand the impact of a PHEV fleet on power line congestion, transformer overload, and other unforeseen problems at the local residential distribution network level. It can also help utilities manage the timing of recharging demand to maximize load factors and utilization of existing distribution resources. This paper presents a multi agent-based simulation framework for 1) modeling the spatial distribution of PHEV ownership at the local residential household level, 2) discovering PHEV hot zones where PHEV ownership may quickly increase in the near future, and 3) estimating the impacts of increasing PHEV ownership on the local electric distribution network under different charging strategies. In this paper, we use Knox County, TN as a case study to show the simulation results of the agent-based model (ABM) framework. However, the framework can be easily applied to other local areas in the US.

  2. A Scaffolding Framework to Support Learning of Emergent Phenomena Using Multi-Agent-Based Simulation Environments

    NASA Astrophysics Data System (ADS)

    Basu, Satabdi; Sengupta, Pratim; Biswas, Gautam

    2015-04-01

    Students from middle school to college have difficulties in interpreting and understanding complex systems such as ecological phenomena. Researchers have suggested that students experience difficulties in reconciling the relationships between individuals, populations, and species, as well as the interactions between organisms and their environment in the ecosystem. Multi-agent-based computational models (MABMs) can explicitly capture agents and their interactions by representing individual actors as computational objects with assigned rules. As a result, the collective aggregate-level behavior of the population dynamically emerges from simulations that generate the aggregation of these interactions. Past studies have used a variety of scaffolds to help students learn ecological phenomena. Yet, there is no theoretical framework that supports the systematic design of scaffolds to aid students' learning in MABMs. Our paper addresses this issue by proposing a comprehensive framework for the design, analysis, and evaluation of scaffolding to support students' learning of ecology in a MABM. We present a study in which middle school students used a MABM to investigate and learn about a desert ecosystem. We identify the different types of scaffolds needed to support inquiry learning activities in this simulation environment and use our theoretical framework to demonstrate the effectiveness of our scaffolds in helping students develop a deep understanding of the complex ecological behaviors represented in the simulation.

  3. Agent-Based Spatiotemporal Simulation of Biomolecular Systems within the Open Source MASON Framework

    PubMed Central

    Pérez-Rodríguez, Gael; Pérez-Pérez, Martín; Glez-Peña, Daniel; Azevedo, Nuno F.; Lourenço, Anália

    2015-01-01

    Agent-based modelling is being used to represent biological systems with increasing frequency and success. This paper presents the implementation of a new tool for biomolecular reaction modelling in the open source Multiagent Simulator of Neighborhoods framework. The rationale behind this new tool is the necessity to describe interactions at the molecular level to be able to grasp emergent and meaningful biological behaviour. We are particularly interested in characterising and quantifying the various effects that facilitate biocatalysis. Enzymes may display high specificity for their substrates, and this information is crucial to the engineering and optimisation of bioprocesses. Simulation results demonstrate that molecule distributions, reaction rate parameters, and structural parameters can be adjusted separately in the simulation, allowing a comprehensive study of individual effects in the context of realistic cell environments. While a higher percentage of collisions that result in reaction indicates a higher affinity of the enzyme for the substrate, a faster reaction (i.e., a higher turnover number) leads to a smaller number of time steps. Slower diffusion rates and molecular crowding (physical hurdles) decrease the collision rate of reactants, hence reducing the reaction rate, as expected. Also, the random distribution of molecules affects the results significantly. PMID:25874228
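
    The collision-and-react logic described above can be sketched in a few lines. The toy model below is illustrative only (it is not MASON's API; all names and parameters are invented): enzyme and substrate agents take Brownian-like steps in a unit box, and a collision within a cutoff radius triggers a reaction with a fixed probability.

```python
import math
import random

def simulate(n_substrate=50, n_enzyme=5, p_react=0.5,
             diffusion=0.05, radius=0.03, steps=500, seed=1):
    """Toy 2D agent-based biocatalysis sketch: enzymes consume substrates
    when a random-walk step brings the two agents within `radius` and the
    reaction 'fires' with probability `p_react`."""
    rng = random.Random(seed)

    def move(pos):
        # Brownian-like step, clamped to the unit box
        x, y = pos
        return (min(1.0, max(0.0, x + rng.gauss(0, diffusion))),
                min(1.0, max(0.0, y + rng.gauss(0, diffusion))))

    substrates = [(rng.random(), rng.random()) for _ in range(n_substrate)]
    enzymes = [(rng.random(), rng.random()) for _ in range(n_enzyme)]
    products = 0
    for _ in range(steps):
        substrates = [move(s) for s in substrates]
        enzymes = [move(e) for e in enzymes]
        remaining = []
        for s in substrates:
            collided = any(math.dist(s, e) < radius for e in enzymes)
            if collided and rng.random() < p_react:
                products += 1  # successful reaction consumes the substrate
            else:
                remaining.append(s)
        substrates = remaining
    return products
```

    Lowering `diffusion` (slower movement) reduces the collision rate and hence the reaction rate, mirroring the effects reported in the abstract.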

  4. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    SciTech Connect

    Chassin, David P.; Fuller, Jason C.; Djilali, Ned

    2014-06-23

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.

  5. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    DOE PAGES

    Chassin, David P.; Fuller, Jason C.; Djilali, Ned

    2014-01-01

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.

  6. On-lattice agent-based simulation of populations of cells within the open-source Chaste framework.

    PubMed

    Figueredo, Grazziela P; Joshi, Tanvi V; Osborne, James M; Byrne, Helen M; Owen, Markus R

    2013-04-01

    Over the years, agent-based models have been developed that combine cell division and reinforced random walks of cells on a regular lattice, reaction-diffusion equations for nutrients and growth factors, and ordinary differential equations for the subcellular networks regulating the cell cycle. When linked to a vascular layer, this multiple scale model framework has been applied to tumour growth and therapy. Here, we report on the creation of an agent-based multi-scale environment amalgamating the characteristics of these models within a Virtual Physiological Human (VPH) Exemplar Project. This project enables reuse, integration, expansion and sharing of the model and relevant data. The agent-based and reaction-diffusion parts of the multi-scale model have been implemented and are available for download as part of the latest public release of Chaste (Cancer, Heart and Soft Tissue Environment; http://www.cs.ox.ac.uk/chaste/), part of the VPH Toolkit (http://toolkit.vph-noe.eu/). The environment functionalities are verified against the original models, in addition to extra validation of all aspects of the code. In this work, we present the details of the implementation of the agent-based environment, including the system description, the conceptual model, the development of the simulation model and the processes of verification and validation of the simulation results. We explore the potential use of the environment by presenting exemplar applications of the 'what if' scenarios that can easily be studied in the environment. These examples relate to tumour growth, cellular competition for resources and tumour responses to hypoxia (low oxygen levels). We conclude our work by summarizing the future steps for the expansion of the current system. PMID:24427527

  7. An agent-based framework for fuel cycle simulation with recycling

    SciTech Connect

    Gidden, M.J.; Wilson, P.P.H.; Huff, K.D.; Carlsen, R.W.

    2013-07-01

    Simulation of the nuclear fuel cycle is an established field with multiple players. Prior development work has utilized techniques such as system dynamics to provide a solution structure for the matching of supply and demand in these simulations. In general, however, simulation infrastructure development has occurred in relatively closed circles, with each effort shaped by the specific cases it is intended to model. Accordingly, individual simulators tend to have their design decisions driven by specific use cases. Presented in this work is a supply-demand matching algorithm that leverages the techniques of the well-studied field of mathematical programming. A generic approach is achieved by treating facilities as individual entities and actors in the supply-demand market that express preferences among commodities. Using such a framework allows for varying levels of interaction fidelity, ranging from low-fidelity, quick solutions to high-fidelity solutions that model individual transactions (e.g. at the fuel-assembly level). The power of the technique is that it allows such flexibility while still treating the problem in a generic manner, encapsulating simulation engine design decisions in such a way that future simulation requirements can be relatively easily added when needed. (authors)
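
    As a rough sketch of the preference-based supply-demand matching idea, the following greedy toy (a simplified stand-in for the paper's mathematical-programming formulation; the data layout and facility names are invented for illustration) fills each facility's request from the offered commodities in its stated preference order:

```python
def match_supply_demand(offers, requests):
    """Greedy commodity matching: each requesting facility ranks the
    commodities it will accept; offered supply is consumed in that
    preference order until the request is filled or supply runs out."""
    supply = dict(offers)              # commodity -> quantity offered
    trades = []
    for requester, prefs, qty in requests:
        need = qty
        for commodity in prefs:        # most-preferred commodity first
            take = min(need, supply.get(commodity, 0))
            if take > 0:
                trades.append((requester, commodity, take))
                supply[commodity] -= take
                need -= take
            if need == 0:
                break
    return trades

# Hypothetical example: two reactors bid on two fuel commodities.
offers = [("fresh_uox", 3), ("mox", 2)]
requests = [("reactor_a", ["mox", "fresh_uox"], 4),
            ("reactor_b", ["fresh_uox"], 2)]
```

    A real simulator would solve this as an optimization problem rather than greedily, so that global preferences and constraints are respected; the sketch only shows the shape of the market interaction.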

  8. Agent-Based Simulations for Project Management

    NASA Technical Reports Server (NTRS)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.

  9. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    NASA Astrophysics Data System (ADS)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causal relationships and feedback loops among different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate the features of this model.

  10. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic, complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Miniaturizing real-world phenomena within the framework of a model, in order to simulate them, is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a relatively new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling indicates growing user interest in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and are applicable to a wider range of applications than traditional simulation. A key challenge, however, is the difficulty of validating and verifying agent-based models: because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify them with conventional validation methods. An attempt to find appropriate validation techniques for ABM therefore seems necessary. In this paper, after reviewing the principles, concepts and applications of ABM, we discuss validation techniques and the challenges of ABM validation.

  11. Simulating Cancer Growth with Multiscale Agent-Based Modeling

    PubMed Central

    Wang, Zhihui; Butner, Joseph D.; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S.

    2014-01-01

    There have been many techniques developed in recent years to in silico model a variety of cancer behaviors. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology including phenotype-changing mutations, the adaptation to microenvironment, the process of angiogenesis, the influence of extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models. PMID:24793698

  12. Agent-based modeling and simulation. Part 3: desktop ABMS.

    SciTech Connect

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2007-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS 'is a third way of doing science,' in addition to traditional deductive and inductive reasoning (Axelrod 1997b). Computational advances have made possible a growing number of agent-based models across a variety of application domains. Applications range from modeling agent behavior in the stock market, supply chains, and consumer markets, to predicting the spread of epidemics, the threat of bio-warfare, and the factors responsible for the fall of ancient civilizations. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing agent models, and illustrates the development of a simple agent-based model of shopper behavior using spreadsheets.
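
    The spreadsheet shopper model the tutorial builds is not reproduced here, but the flavor of such a minimal ABM can be sketched as follows (agent rules and parameters are invented for illustration, not taken from the tutorial): each shopper holds a private reservation price, and word of mouth from agents who already bought raises everyone's willingness to buy.

```python
import random

def shopper_model(n=100, price=5.0, influence=3.0, steps=10, seed=42):
    """Minimal shopper ABM: an agent buys once the price falls below its
    private reservation price plus a word-of-mouth boost proportional to
    the fraction of agents who have already bought."""
    rng = random.Random(seed)
    reservation = [rng.uniform(1.0, 10.0) for _ in range(n)]
    bought = [False] * n
    adoption = []                       # cumulative buyers per step
    for _ in range(steps):
        adopters = sum(bought) / n      # social signal from prior steps
        for i in range(n):
            if not bought[i] and price <= reservation[i] + influence * adopters:
                bought[i] = True
        adoption.append(sum(bought))
    return adoption
```

    Even this tiny model shows the characteristic ABM pattern: a population-level adoption curve emerges from individual threshold rules plus interaction, which is exactly what a spreadsheet implementation of shopper behavior demonstrates.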

  13. An Agent-Based Modeling Framework and Application for the Generic Nuclear Fuel Cycle

    NASA Astrophysics Data System (ADS)

    Gidden, Matthew J.

    Key components of a novel methodology and implementation of an agent-based, dynamic nuclear fuel cycle simulator, Cyclus, are presented. The nuclear fuel cycle is a complex, physics-dependent supply chain. To date, existing dynamic simulators have not treated constrained fuel supply, time-dependent, isotopic-quality based demand, or fuel fungibility particularly well. Utilizing an agent-based methodology that incorporates sophisticated graph theory and operations research techniques can overcome these deficiencies. This work describes a simulation kernel and agents that interact with it, highlighting the Dynamic Resource Exchange (DRE), the supply-demand framework at the heart of the kernel. The key agent-DRE interaction mechanisms are described, which enable complex entity interaction through the use of physics and socio-economic models. The translation of an exchange instance to a variant of the Multicommodity Transportation Problem, which can be solved feasibly or optimally, follows. An extensive investigation of solution performance and fidelity is then presented. Finally, recommendations for future users of Cyclus and the DRE are provided.

  14. Agent-Based Modeling and Simulation on Emergency Evacuation

    NASA Astrophysics Data System (ADS)

    Ren, Chuanjun; Yang, Chenghui; Jin, Shiyao

    Crowd stampedes and panic-driven evacuations during emergencies often lead to fatalities as people are crushed, injured, or trampled. Such phenomena may be triggered in life-threatening situations such as fires or explosions in crowded buildings. Emergency evacuation simulation has recently attracted the interest of a rapidly increasing number of scientists. This paper presents an agent-based model and simulation, built using the Repast software, of crowd evacuation for emergency response from an area under a fire. Various types of agents with different attributes are designed, in contrast to traditional modeling. The attributes that govern the characteristics of the people are studied and tested through iterative simulations. Simulations are also conducted to demonstrate the effect of various agent parameters. Some interesting results were observed, such as the "faster is slower" effect and agents' ignorance of available exits. Finally, the simulation results suggest practical ways of minimizing the harmful consequences of such events and indicate the existence of an optimal escape strategy.
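
    The "faster is slower" effect can be illustrated with a deliberately crude doorway model (all parameters are hypothetical and unrelated to the paper's Repast implementation): per time step the exit passes a fixed number of agents, but agents who push collide and temporarily block part of the doorway, so a more panicked crowd takes longer to clear.

```python
import random

def evacuate(n=200, exit_rate=5, panic=0.0, steps_limit=1000, seed=3):
    """Toy evacuation sketch: at most `exit_rate` agents pass the exit
    per step; with probability `panic`, each 'slot' of exit capacity is
    wasted by pushing agents jamming the doorway."""
    rng = random.Random(seed)
    remaining, steps = n, 0
    while remaining > 0 and steps < steps_limit:
        # pushing agents collide and block part of the doorway
        blocked = sum(1 for _ in range(exit_rate) if rng.random() < panic)
        remaining -= min(remaining, max(1, exit_rate - blocked))
        steps += 1
    return steps
```

    Running the model with `panic=0.0` versus a high panic level shows the calm crowd evacuating in fewer steps, a qualitative analogue of the effect observed in the full simulation.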

  15. Using Agent Based Modeling (ABM) to Develop Cultural Interaction Simulations

    NASA Technical Reports Server (NTRS)

    Drucker, Nick; Jones, Phillip N.

    2012-01-01

    Today, most cultural training is based on or built around "cultural engagements", discrete interactions between the individual learner and one or more cultural "others". Often, success in the engagement is the end objective. In reality, these interactions usually involve secondary and tertiary effects with potentially wide-ranging consequences. The concern is that learning culture within a strict engagement context might lead to "checklist" cultural thinking that will not empower learners to understand the full consequences of their actions. We propose the use of agent-based modeling (ABM) to collect and store engagement effects and, by simulating social networks, propagate those effects over time, distance, and consequence. The ABM development allows for rapid modification to re-create any number of population types, extending the applicability of the model to any requirement for social modeling.

  16. Agent-based simulation of a financial market

    NASA Astrophysics Data System (ADS)

    Raberto, Marco; Cincotti, Silvano; Focardi, Sergio M.; Marchesi, Michele

    2001-10-01

    This paper introduces an agent-based artificial financial market in which heterogeneous agents trade one single asset through a realistic trading mechanism for price formation. Agents are initially endowed with a finite amount of cash and a given finite portfolio of assets. There is no money-creation process; the total available cash is conserved in time. In each period, agents make random buy and sell decisions that are constrained by available resources, subject to clustering, and dependent on the volatility of previous periods. The model proposed herein is able to reproduce the leptokurtic shape of the probability density of log price returns and the clustering of volatility. Implemented using extreme programming and object-oriented technology, the simulator is a flexible computational experimental facility that can find applications in both academic and industrial research projects.
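
    A stripped-down version of such an artificial market can be wired up as follows. This sketch is not the authors' simulator; the structure and parameters are illustrative. Agents hold finite cash and shares, every trade is a pure transfer between a random buyer and seller (so total cash is conserved, as in the paper's model), and the size of each price step depends on the previous period's volatility.

```python
import math
import random

def market(n_agents=100, steps=1000, cash=1000.0, shares=100, seed=7):
    """Toy agent-based market: each step a random buyer and seller trade
    one share at a price nudged by recent volatility; trades are pure
    transfers, so total cash and total shares are conserved."""
    rng = random.Random(seed)
    agents = [{"cash": cash, "shares": shares} for _ in range(n_agents)]
    price, returns = 100.0, []
    for _ in range(steps):
        # volatility feedback: bigger last return -> bigger price steps
        vol = (abs(returns[-1]) if returns else 0.01) + 0.005
        new_price = price * math.exp(rng.gauss(0, vol))
        buyer, seller = rng.sample(range(n_agents), 2)
        # trade only within available resources (no money creation)
        if agents[buyer]["cash"] >= new_price and agents[seller]["shares"] > 0:
            agents[buyer]["cash"] -= new_price
            agents[buyer]["shares"] += 1
            agents[seller]["cash"] += new_price
            agents[seller]["shares"] -= 1
            returns.append(math.log(new_price / price))
            price = new_price
    return agents, returns
```

    The volatility feedback loop is the ingredient that tends to produce clustered volatility and fat-tailed return distributions in models of this family.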

  17. Agent-based modeling to simulate the dengue spread

    NASA Astrophysics Data System (ADS)

    Deng, Chengbin; Tao, Haiyan; Ye, Zhiwei

    2008-10-01

    In this paper, we introduce agent-based modeling (ABM) as a novel method for simulating the unique process of dengue spread. Dengue is an acute infectious disease with a history of over 200 years. Unlike diseases that can be transmitted directly from person to person, dengue spreads through an obligatory mosquito vector. There is still no specific, effective medicine or vaccine for dengue, so the best way to prevent its spread is to take precautions beforehand. It is thus crucial to detect and study the dynamic process of dengue spread, which is closely related to the human-environment interactions that ABM effectively captures. The model attempts to simulate dengue spread in a more realistic, bottom-up way, and to overcome a limitation of ABM, namely overlooking the influence of geographic and environmental factors. By considering the influence of the environment, Aedes aegypti ecology, and other epidemiological characteristics of dengue, ABM can be regarded as a useful way to simulate the whole process and so disclose the essence of the evolution of dengue spread.

  18. Agent-based modeling of the immune system: NetLogo, a promising framework.

    PubMed

    Chiacchio, Ferdinando; Pennisi, Marzio; Russo, Giulia; Motta, Santo; Pappalardo, Francesco

    2014-01-01

    Several components that interact with each other to produce complex and, in some cases, unexpected behavior represent one of the main and most fascinating features of the mammalian immune system. Agent-based modeling and cellular automata belong to a class of discrete mathematical approaches in which entities (agents) sense local information and undertake actions over time according to predefined rules. The strength of this approach lies in the appearance of a global behavior that emerges from interactions among agents; this behavior is unpredictable, as it does not follow linear rules. Many works investigate the immune system with agent-based modeling and cellular automata, and they have shown the ability to see clearly and intuitively into the nature of immunological processes. NetLogo is a multiagent programming language and modeling environment for simulating complex phenomena. It is designed for both research and education and is used across a wide range of disciplines and education levels. In this paper, we summarize NetLogo applications to immunology and, particularly, how this framework can help in the development and formulation of hypotheses that might drive further experimental investigations of disease mechanisms. PMID:24864263

  19. Agent-Based Modeling of the Immune System: NetLogo, a Promising Framework

    PubMed Central

    Chiacchio, Ferdinando; Russo, Giulia; Pappalardo, Francesco

    2014-01-01

    Several components that interact with each other to produce complex and, in some cases, unexpected behavior represent one of the main and most fascinating features of the mammalian immune system. Agent-based modeling and cellular automata belong to a class of discrete mathematical approaches in which entities (agents) sense local information and undertake actions over time according to predefined rules. The strength of this approach lies in the appearance of a global behavior that emerges from interactions among agents; this behavior is unpredictable, as it does not follow linear rules. Many works investigate the immune system with agent-based modeling and cellular automata, and they have shown the ability to see clearly and intuitively into the nature of immunological processes. NetLogo is a multiagent programming language and modeling environment for simulating complex phenomena. It is designed for both research and education and is used across a wide range of disciplines and education levels. In this paper, we summarize NetLogo applications to immunology and, particularly, how this framework can help in the development and formulation of hypotheses that might drive further experimental investigations of disease mechanisms. PMID:24864263

  1. Agent based modeling of blood coagulation system: implementation using a GPU based high speed framework.

    PubMed

    Chen, Wenan; Ward, Kevin; Li, Qi; Kecman, Vojislav; Najarian, Kayvan; Menke, Nathan

    2011-01-01

    The coagulation and fibrinolytic systems are complex, inter-connected biological systems with major physiological roles. The complex, nonlinear, multi-point relationships between the molecular and cellular constituents of the two systems render a comprehensive and simultaneous study of the system at the microscopic and macroscopic levels a significant challenge. We have created an Agent Based Modeling and Simulation (ABMS) approach for simulating these complex interactions. As the number of agents increases, the time complexity and cost of the resulting simulations present a significant challenge. As such, in this paper, we also present a high-speed framework for the coagulation simulation utilizing the computing power of graphics processing units (GPU). For comparison, we also implemented the simulations in NetLogo, Repast, and a direct C version. As our experiments demonstrate, at the million-agent scale the GPU implementation is over 10 times faster than the C version, over 100 times faster than the Repast version, and over 300 times faster than the NetLogo simulation. PMID:22254271

  2. Patient-centered appointment scheduling using agent-based simulation.

    PubMed

    Turkcan, Ayten; Toscos, Tammy; Doebbeling, Brad N

    2014-01-01

    Enhanced access and continuity are key components of patient-centered care. Existing studies show that several interventions, such as providing same-day appointments, walk-in services, after-hours care, and group appointments, have been used to redesign healthcare systems for improved access to primary care. However, an intervention focusing on a single component of care delivery (e.g., improving access to acute care) might have a negative impact on other components of the system (e.g., reduced continuity of care for chronic patients). Therefore, primary care clinics should consider implementing multiple interventions tailored to their patient population's needs. We used rapid ethnography and observation to better understand clinic workflow and key constraints. We then developed an agent-based simulation model that includes all access modalities (appointments, walk-ins, and after-hours access), incorporates resources and key constraints, and determines the appointment scheduling method that best improves access and continuity of care. This paper demonstrates the value of simulation models for testing a variety of alternative strategies to improve access to care through scheduling. PMID:25954423

  3. Serious games experiment toward agent-based simulation

    USGS Publications Warehouse

    Wein, Anne; Labiosa, William

    2013-01-01

    We evaluate the potential for serious games to be used as a scientifically based decision-support product that supports the United States Geological Survey's (USGS) mission: to provide integrated, unbiased scientific information that can make a substantial contribution to societal well-being for a wide variety of complex environmental challenges. Serious or pedagogical games are an engaging way to educate decision-makers and stakeholders about environmental challenges that are usefully informed by natural and social scientific information and knowledge, and can be designed to promote interactive learning and exploration in the face of large uncertainties, divergent values, and complex situations. We developed two serious games that use challenging environmental-planning issues to demonstrate and investigate the potential contributions of serious games to inform regional-planning decisions. Delta Skelta is a game emulating long-term integrated environmental planning in the Sacramento-San Joaquin Delta, California, that incorporates natural hazards (flooding and earthquakes) and consequences for California water supplies amidst conflicting water interests. Age of Ecology is a game that simulates interactions between economic and ecologic processes, as well as natural hazards, while implementing agent-based modeling. The content of these games spans the USGS science mission areas related to water, ecosystems, natural hazards, land use, and climate change. We describe the games, reflect on design and informational aspects, and comment on their potential usefulness. During the process of developing these games, we identified various design trade-offs involving factual information, strategic thinking, game-winning criteria, elements of fun, number and type of players, time horizon, and uncertainty. We evaluate the two games in terms of accomplishments and limitations. Overall, we demonstrated the potential for these games to usefully represent scientific information.

  4. Agent-based modeling: Methods and techniques for simulating human systems

    PubMed Central

    Bonabeau, Eric

    2002-01-01

    Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
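    As a concrete illustration of the diffusion-simulation category mentioned above, a minimal agent-based diffusion model can be sketched in a few lines (illustrative Python, not from the paper; the agent count, innovation rate and imitation rate are invented):

```python
import random

def simulate_diffusion(n_agents=200, p_innovate=0.01, q_imitate=0.1,
                       n_steps=50, seed=42):
    """Minimal agent-based diffusion: each non-adopter either adopts
    spontaneously (innovation) or in proportion to the share of
    adopters it observes (imitation)."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    history = []
    for _ in range(n_steps):
        share = sum(adopted) / n_agents
        for i in range(n_agents):
            if not adopted[i] and rng.random() < p_innovate + q_imitate * share:
                adopted[i] = True
        history.append(sum(adopted))
    return history

# Adoption counts per step; the curve is monotone because agents never
# un-adopt, and typically S-shaped as imitation feeds back on itself.
curve = simulate_diffusion()
```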

  5. Agent-Based Knowledge Discovery for Modeling and Simulation

    SciTech Connect

    Haack, Jereme N.; Cowell, Andrew J.; Marshall, Eric J.; Fligg, Alan K.; Gregory, Michelle L.; McGrath, Liam R.

    2009-09-15

    This paper describes an approach to using agent technology to extend the automated discovery mechanism of the Knowledge Encapsulation Framework (KEF). KEF is a suite of tools to enable the linking of knowledge inputs (relevant, domain-specific evidence) to modeling and simulation projects, as well as other domains that require an effective collaborative workspace for knowledge-based tasks. This framework can be used to capture evidence (e.g., trusted material such as journal articles and government reports), discover new evidence (covering both trusted and social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a semantic wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks. The novelty in this approach lies in the combination of automatically tagged and user-vetted resources, which increases user trust in the environment, leading to ease of adoption for the collaborative environment.

  6. A Framework for Model-Based Inquiry Through Agent-Based Programming

    NASA Astrophysics Data System (ADS)

    Xiang, Lin; Passmore, Cynthia

    2015-04-01

    There has been increased recognition in the past decades that model-based inquiry (MBI) is a promising approach for cultivating deep understandings by helping students unite phenomena and underlying mechanisms. Although multiple technology tools have been used to improve the effectiveness of MBI, there are not enough detailed examinations of how agent-based programmable modeling (ABPM) tools influence students' MBI learning. The present collective case study sought to contribute by closely investigating ABPM-supported MBI processes for 8th grade students learning about natural selection and adaptation. Eight 8th grade students in groups of 2-3 spent 15 h during a span of 4 weeks collaboratively programming simulations of adaptation based on the natural selection model, using an ABPM tool named NetLogo. The entire programming processes of these learning groups, up to 50 h, were videotaped and then analyzed using mixed methods. Our analysis revealed that the programming task created a context that calls for nine types of MBI actions. These MBI actions were related to both phenomena and the underlying model. Results also showed that students' programming processes took place in consecutive programming cycles and aligned with iterative MBI cycles. A framework for ABPM-supported MBI learning is proposed based upon the findings. Implications in developing MBI instruction involving ABPM tools are discussed.

  7. A Systematic Review of Agent-Based Modelling and Simulation Applications in the Higher Education Domain

    ERIC Educational Resources Information Center

    Gu, X.; Blackmore, K. L.

    2015-01-01

    This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…

  8. Integrated Agent-Based and Production Cost Modeling Framework for Renewable Energy Studies: Preprint

    SciTech Connect

    Gallo, Giulia

    2015-10-07

    The agent-based framework for renewable energy studies (ARES) is an integrated approach that adds an agent-based model of industry actors to PLEXOS and combines the strengths of the two to overcome their individual shortcomings. It can examine existing and novel wholesale electricity markets under high penetrations of renewables. ARES is demonstrated by studying how increasing levels of wind will impact the operations and the exercise of market power of generation companies that exploit an economic withholding strategy. The analysis is carried out on a test system that represents the Electric Reliability Council of Texas energy-only market in the year 2020. The results more realistically reproduce the operations of an energy market under different and increasing penetrations of wind, and ARES can be extended to address pressing issues in current and future wholesale electricity markets.
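    The economic-withholding strategy studied here can be illustrated with a toy uniform-price market clearing (a hedged Python sketch; the offer stacks, capacities and prices are invented and unrelated to the ERCOT test system or to PLEXOS):

```python
def clear_market(offers, demand):
    """Uniform-price merit-order clearing: offers are (capacity_MW, price)
    tuples; returns the price of the marginal (last dispatched) unit."""
    remaining = demand
    for capacity, price in sorted(offers, key=lambda o: o[1]):
        remaining -= capacity
        if remaining <= 0:
            return price
    raise ValueError("insufficient offered capacity")

base_offers = [(100, 20.0), (100, 35.0), (100, 60.0)]
competitive = clear_market(base_offers, demand=150)

# Economic withholding: the mid-cost generator offers less capacity,
# forcing a more expensive unit to become marginal and set the price.
withheld_offers = [(100, 20.0), (40, 35.0), (100, 60.0)]
withheld = clear_market(withheld_offers, demand=150)
```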

  9. Agent-Based Framework for Personalized Service Provisioning in Converged IP Networks

    NASA Astrophysics Data System (ADS)

    Podobnik, Vedran; Matijasevic, Maja; Lovrek, Ignac; Skorin-Kapov, Lea; Desic, Sasa

    In a global multi-service and multi-provider market, Internet Service Providers will increasingly need to differentiate themselves through the service quality they offer and base their operation on new, consumer-centric business models. In this paper, we propose an agent-based framework for the Business-to-Consumer (B2C) electronic market, comprising Consumer Agents, Broker Agents and Content Agents, which enables Internet consumers to select a content provider in an automated manner. We also discuss how to dynamically allocate network resources to provide end-to-end Quality of Service (QoS) for a given consumer and content provider.

  10. Agent-Based Crowd Simulation Considering Emotion Contagion for Emergency Evacuation Problem

    NASA Astrophysics Data System (ADS)

    Faroqi, H.; Mesgari, M.-S.

    2015-12-01

    During emergencies, emotions greatly affect human behaviour. For more realistic multi-agent simulations of emergency evacuations, it is important to incorporate emotions and their effects on the agents. In brief, emotional contagion is a process in which a person or group influences the emotions or behaviour of another person or group through the conscious or unconscious induction of emotion states and behavioural attitudes. In this study, we simulate an emergency situation in an open square area with three exits, considering Adult and Child agents with different behaviours, as well as Security agents that guide Adults and Children to the exits and keep them calm. Six emotion levels are considered for each agent in different scenarios and situations. The agent-based model is initialized with a random scattering of the agent populations; when an alarm occurs, each agent reacts to the situation based on its own and its neighbours' current circumstances. The main goal of each agent is first to find the exit and then to help other agents find their way. The numbers of exited agents, their emotion levels, and the numbers of damaged agents are compared across scenarios with different initializations in order to evaluate the simulated model. NetLogo 5.2 is used as the multi-agent simulation framework, with R as the development language.
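    A hedged sketch of an emotion-contagion update of the kind such a model uses (illustrative Python rather than the paper's NetLogo code; the susceptibility parameter, rounding rule and neighbourhood structure are assumptions):

```python
def contagion_step(emotions, neighbors, susceptibility=0.3, levels=6):
    """One synchronous emotion-contagion update: each agent shifts its
    discrete emotion level (0..levels-1) toward the mean level of its
    neighbours, scaled by a susceptibility factor."""
    new = []
    for i, e in enumerate(emotions):
        nbrs = neighbors[i]
        if nbrs:
            mean = sum(emotions[j] for j in nbrs) / len(nbrs)
            e = e + susceptibility * (mean - e)
        new.append(min(levels - 1, max(0, round(e))))
    return new

# Three agents on a line: a panicked agent (level 5) between calm ones.
# Panic partially spreads outward while the panicked agent calms down.
emotions = [0, 5, 0]
neighbors = {0: [1], 1: [0, 2], 2: [1]}
after = contagion_step(emotions, neighbors)
```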

  11. Towards a framework for agent-based image analysis of remote-sensing data

    PubMed Central

    Hofmann, Peter; Lettmayer, Paul; Blaschke, Thomas; Belgiu, Mariana; Wegenkittl, Stefan; Graf, Roland; Lampoltshammer, Thomas Josef; Andrejchenko, Vera

    2015-01-01

    Object-based image analysis (OBIA) as a paradigm for analysing remotely sensed image data has in many cases led to spatially and thematically improved classification results in comparison to pixel-based approaches. Nevertheless, robust and transferable object-based solutions for automated image analysis capable of analysing sets of images or even large image archives without any human interaction are still rare. A major reason for this lack of robustness and transferability is the high complexity of image contents: Especially in very high resolution (VHR) remote-sensing data with varying imaging conditions or sensor characteristics, the variability of the objects’ properties in these varying images is hardly predictable. The work described in this article builds on so-called rule sets. While earlier work has demonstrated that OBIA rule sets bear a high potential of transferability, they need to be adapted manually, or classification results need to be adjusted manually in a post-processing step. In order to automate these adaptation and adjustment procedures, we investigate the coupling, extension and integration of OBIA with the agent-based paradigm, which is exhaustively investigated in software engineering. The aims of such integration are (a) autonomously adapting rule sets and (b) image objects that can adopt and adjust themselves according to different imaging conditions and sensor characteristics. This article focuses on self-adapting image objects and therefore introduces a framework for agent-based image analysis (ABIA). PMID:27721916

  12. An Agent-Based Optimization Framework for Engineered Complex Adaptive Systems with Application to Demand Response in Electricity Markets

    NASA Astrophysics Data System (ADS)

    Haghnevis, Moeed

    The main objective of this research is to develop an integrated method to study emergent behavior and consequences of evolution and adaptation in engineered complex adaptive systems (ECASs). A multi-layer conceptual framework and modeling approach including behavioral and structural aspects is provided to describe the structure of a class of engineered complex systems and predict their future adaptive patterns. The approach allows the examination of complexity in the structure and the behavior of components as a result of their connections and in relation to their environment. This research describes and uses the major differences between natural complex adaptive systems (CASs) and artificial/engineered CASs to build a framework and platform for ECASs. While this framework focuses on the critical factors of an engineered system, it also enables one to synthetically employ engineering and mathematical models to analyze and measure complexity in such systems. In this way, concepts of complex systems science are adapted to management science and system-of-systems engineering. In particular, an integrated consumer-based optimization and agent-based modeling (ABM) platform is presented that enables managers to predict and partially control patterns of behaviors in ECASs. Demonstrated on the U.S. electricity markets, ABM is integrated with normative and subjective decision behavior recommended by the U.S. Department of Energy (DOE) and Federal Energy Regulatory Commission (FERC). The approach integrates social networks, social science, complexity theory, and diffusion theory. Furthermore, it makes a unique and significant contribution by exploring and representing concrete managerial insights for ECASs and offering new optimized actions and modeling paradigms in agent-based simulation.

  13. An extensible simulation environment and movement metrics for testing walking behavior in agent-based models

    SciTech Connect

    Paul M. Torrens; Atsushi Nara; Xun Li; Haojie Zhu; William A. Griffin; Scott B. Brown

    2012-01-01

    Human movement is a significant ingredient of many social, environmental, and technical systems, yet the importance of movement is often discounted in considering systems complexity. Movement is commonly abstracted in agent-based modeling (which is perhaps the methodological vehicle for modeling complex systems), despite the influence of movement upon information exchange and adaptation in a system. In particular, agent-based models of urban pedestrians often treat movement in proxy form at the expense of faithfully treating movement behavior with realistic agency. There exists little consensus about which method is appropriate for representing movement in agent-based schemes. In this paper, we examine popularly used methods to drive movement in agent-based models, first by introducing a methodology that can flexibly handle many representations of movement at many different scales, and second by introducing a suite of tools to benchmark agent movement between models and against real-world trajectory data. We find that most popular movement schemes do a relatively poor job of representing movement, but that some schemes may well be 'good enough' for some applications. We also discuss potential avenues for improving the representation of movement in agent-based frameworks.
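    One simple trajectory benchmark of the kind such a tool suite might include is the average displacement error between simulated and observed paths (an illustrative Python sketch; the metric choice and the toy trajectories are assumptions, not the paper's actual tools):

```python
import math

def average_displacement_error(sim_traj, obs_traj):
    """Mean Euclidean distance between time-aligned points of a simulated
    and an observed trajectory: one common way to benchmark how
    faithfully a movement scheme reproduces real-world paths."""
    assert len(sim_traj) == len(obs_traj)
    dists = [math.dist(p, q) for p, q in zip(sim_traj, obs_traj)]
    return sum(dists) / len(dists)

# A pedestrian observed walking straight, versus a simulated agent that
# detours one unit sideways in the middle of the walk.
observed  = [(0, 0), (1, 0), (2, 0), (3, 0)]
simulated = [(0, 0), (1, 1), (2, 1), (3, 0)]
ade = average_displacement_error(simulated, observed)
```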

  14. Efficient Allocation of Resources for Defense of Spatially Distributed Networks Using Agent-Based Simulation.

    PubMed

    Kroshl, William M; Sarkani, Shahram; Mazzuchi, Thomas A

    2015-09-01

    This article presents ongoing research that focuses on efficient allocation of defense resources to minimize the damage inflicted on a spatially distributed physical network, such as a pipeline, water system, or power distribution system, from an attack by an active adversary. This problem is fundamentally different from preparing for natural disasters such as hurricanes, earthquakes, or even accidental systems failures: here, resources must be allocated to defend against an opponent who is aware of, and anticipating, the defender's efforts to mitigate the threat. Our approach is to utilize a combination of integer programming and agent-based modeling to allocate the defensive resources. We conceptualize the problem as a Stackelberg "leader follower" game where the defender first places his assets to defend key areas of the network, and the attacker then seeks to inflict the maximum damage possible within the constraints of resources and network structure. The criticality of arcs in the network is estimated by a deterministic network interdiction formulation, which then informs an evolutionary agent-based simulation. The evolutionary agent-based simulation is used to determine the allocation of resources for attackers and defenders that results in evolutionarily stable strategies, where actions by either side alone cannot increase its share of victories. We demonstrate these techniques on an example network, comparing the evolutionary agent-based results to a more traditional probabilistic risk analysis (PRA) approach. Our results show that the agent-based approach results in a greater percentage of defender victories than does the PRA-based approach.
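    The idea of estimating arc criticality before the evolutionary simulation runs can be sketched as follows (illustrative Python; the paper uses an integer-programming interdiction formulation, whereas this simplified sketch scores each arc by the increase in source-sink shortest-path cost when that arc is removed):

```python
import heapq

def shortest_path_cost(arcs, source, sink):
    """Dijkstra over a directed graph given as {(u, v): cost}."""
    adj = {}
    for (u, v), c in arcs.items():
        adj.setdefault(u, []).append((v, c))
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == sink:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, c in adj.get(u, []):
            if d + c < dist.get(v, float("inf")):
                dist[v] = d + c
                heapq.heappush(pq, (d + c, v))
    return float("inf")

def arc_criticality(arcs, source, sink):
    """Criticality of each arc = increase in the source-sink shortest-path
    cost when that single arc is interdicted (removed)."""
    base = shortest_path_cost(arcs, source, sink)
    crit = {}
    for arc in arcs:
        rest = {a: c for a, c in arcs.items() if a != arc}
        crit[arc] = shortest_path_cost(rest, source, sink) - base
    return crit

# Toy network: two routes from s to t; arcs on the cheap route are
# critical, arcs on the backup route are not.
arcs = {("s", "a"): 1, ("a", "t"): 1, ("s", "b"): 2, ("b", "t"): 2}
crit = arc_criticality(arcs, "s", "t")
```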

  16. Evaluation of wholesale electric power market rules and financial risk management by agent-based simulations

    NASA Astrophysics Data System (ADS)

    Yu, Nanpeng

    dissertation, basic financial risk management concepts relevant for wholesale electric power markets are carefully explained and illustrated. In addition, the financial risk management problem in wholesale electric power markets is generalized as a four-stage process. Within the proposed financial risk management framework, the critical problem of financial bilateral contract negotiation is addressed. This dissertation analyzes a financial bilateral contract negotiation process between a generating company and a load-serving entity in a wholesale electric power market with congestion managed by locational marginal pricing. Nash bargaining theory is used to model a Pareto-efficient settlement point. The model predicts negotiation results under varied conditions and identifies circumstances in which the two parties might fail to reach an agreement. Both analysis and agent-based simulation are used to gain insight regarding how relative risk aversion and biased price estimates influence negotiated outcomes. These results should provide useful guidance to market participants in their bilateral contract negotiation processes.
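    The Nash bargaining model used here selects the settlement that maximizes the product of both parties' gains over their disagreement payoffs; a minimal numeric sketch (illustrative Python; the linear utilities, disagreement payoffs and price grid are invented, not the dissertation's model):

```python
def nash_bargaining_price(u_gen, u_lse, d_gen, d_lse, prices):
    """Grid search for the Nash bargaining solution: the contract price
    maximizing the product of both parties' utility gains over their
    disagreement payoffs. Returns None if no mutually beneficial price
    exists, i.e. the negotiation fails."""
    best_price, best_product = None, 0.0
    for p in prices:
        gain_gen = u_gen(p) - d_gen
        gain_lse = u_lse(p) - d_lse
        if gain_gen > 0 and gain_lse > 0 and gain_gen * gain_lse > best_product:
            best_price, best_product = p, gain_gen * gain_lse
    return best_price

# Hypothetical linear utilities: the generating company prefers a high
# contract price, the load-serving entity a low one.
u_gen = lambda p: p - 30.0     # profit above a 30 $/MWh cost
u_lse = lambda p: 50.0 - p     # savings below a 50 $/MWh retail value
price = nash_bargaining_price(u_gen, u_lse, d_gen=0.0, d_lse=0.0,
                              prices=[p / 10 for p in range(300, 501)])
```

    With symmetric linear utilities the bargained price lands midway between the two parties' reservation values, which is the expected Nash outcome for this toy setup.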

  17. Collaborative Multi-Agent Based Simulations: Stakeholder-Focused Innovation in Water Resources Management and Decision-Support Modeling

    NASA Astrophysics Data System (ADS)

    Kock, B. E.

    2006-12-01

    The combined use of multi-agent based simulations and collaborative modeling approaches is emerging as a highly effective tool for representing complex coupled social-biophysical water resource systems. A collaboratively-designed, multi-agent based simulation can be used both as a decision-support tool and as a didactic method for improving stakeholder understanding and engagement with water resources policymaking and management. Major technical and non-technical obstacles remain to the efficient and effective development of multi-agent models of human society, to integrating these models with GIS and other numerical models, and to building a process for engaging stakeholders with model design, implementation and use. It is proposed here to tackle some of these obstacles through a collaborative multi-agent based simulation process framework, intended for practical use in resolving disputes and environmental challenges over sustainable irrigated agriculture in the Western United States. A practical implementation of this framework will be conducted in collaboration with a diverse stakeholder group representing farmers and local, state and federal water managers. Through the use of simulation gaming, interviewing and computer-based knowledge elicitation, a multi-agent model representing local and regional social dynamics will be developed to support the acceptable and sustainable implementation of management alternatives for reducing regional problems of salinization and high selenium concentrations in soils and irrigation water. The development of a socially and scientifically credible simulation platform in this setting can make a significant contribution to ensuring the non-adversarial use of high quality science, enhance the engagement of stakeholders with policymaking, and help meet the challenges of integrating dynamic models of human society with more traditional biophysical systems models.

  18. A Participatory Agent-Based Simulation for Indoor Evacuation Supported by Google Glass.

    PubMed

    Sánchez, Jesús M; Carrera, Álvaro; Iglesias, Carlos Á; Serrano, Emilio

    2016-08-24

    Indoor evacuation systems are needed for rescue and safety management. One of the challenges is to provide users with personalized evacuation routes in real time. To this end, this project aims at exploring the possibilities of Google Glass technology for participatory multiagent indoor evacuation simulations. Participatory multiagent simulation combines scenario-guided agents and humans equipped with Google Glass that coexist in a shared virtual space and jointly perform simulations. The paper proposes an architecture for participatory multiagent simulation in order to combine devices (Google Glass and/or smartphones) with an agent-based social simulator and indoor tracking services.

  1. An agent-based simulation-assisted approach to bi-lateral building systems control

    NASA Astrophysics Data System (ADS)

    Mo, Zhengchun

    Two of the primary objectives of building operations are maximizing occupant comfort and minimizing energy costs. While research effort has been focused on concept development, design decision support and systems advancement, little attention has been paid to operational decision support. Most commercial buildings are operated under a central control scheme, in which a building operator makes control decisions without in-depth information about individual preferences. Widely used set points represent generalized human requirements that do not sufficiently address individual differences. Energy costs, on the other hand, are easier to measure. As a result, operational decisions tend to favor cost savings at the expense of individual occupant comfort. Personal control systems have enabled individual occupants to customize their local environments. It is argued that individual occupants and building operators have different motivations for environmental control: they have access to different scopes of information and each holds only partial knowledge of operational solutions. Such a control environment suggests a bi-lateral control scheme that cannot be offered by existing central or distributed control schemes. There is a critical need for methods that support the bi-lateral control scheme, in which building operators and individual occupants coordinate to make balanced control decisions. Toward this end, an agent-based, simulation-assisted computational framework has been proposed and prototypically implemented in the lighting controls domain. The prototype supports bi-lateral building operations by offering concurrent evaluation of alternative control strategies. The experimental results showed that, by utilizing the proposed framework, energy use is greatly reduced without undue increase in individual visual discomfort.

  2. A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture

    NASA Technical Reports Server (NTRS)

    Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.

    2005-01-01

    Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of the system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.
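    The blackboard-and-plugin channel structure described above can be mimicked in ordinary code (an illustrative Python analogue, not the Cougaar API or the CSP model itself; all class and method names are invented): the blackboard broadcasts each published object over a dedicated channel to every subscribed plugin.

```python
from queue import Queue

class Plugin:
    """A Cougaar-style plugin: reacts to objects published on the blackboard."""
    def __init__(self, name):
        self.name = name
        self.channel = Queue()   # stands in for the CSP channel from the blackboard
        self.seen = []

    def step(self):
        # Drain the channel, "executing" each received object.
        while not self.channel.empty():
            self.seen.append(self.channel.get())

class Blackboard:
    """Broadcasts each published object to every subscribed plugin
    over that plugin's dedicated channel."""
    def __init__(self):
        self.plugins = []

    def subscribe(self, plugin):
        self.plugins.append(plugin)

    def publish(self, obj):
        for plugin in self.plugins:
            plugin.channel.put(obj)

board = Blackboard()
planner, logger = Plugin("planner"), Plugin("logger")
board.subscribe(planner)
board.subscribe(logger)
board.publish("task:move-supplies")
planner.step(); logger.step()
```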

  3. Use of agent-based simulations to design and interpret HIV clinical trials.

    PubMed

    Cuadros, Diego F; Abu-Raddad, Laith J; Awad, Susanne F; García-Ramos, Gisela

    2014-07-01

    In this study, we illustrate the utility of agent-based simulation to inform trial design and support outcome interpretation of randomized controlled trials (RCTs). We developed agent-based Monte Carlo models to simulate existing landmark HIV RCTs, such as the Partners in Prevention HSV/HIV Transmission Study. We simulated a variation of this study using valacyclovir therapy as the intervention, and we used a male circumcision RCT based on the Rakai Male Circumcision Trial. Our results indicate that only a small fraction (20%) of the simulated Partners in Prevention HSV/HIV Transmission Study realizations rejected the null hypothesis of no effect from the intervention. Our results also suggest that an RCT designed to evaluate the effectiveness of a more potent drug regimen for HSV-2 suppression (valacyclovir therapy) is more likely to identify the efficacy of the intervention. For the male circumcision RCT simulation, the greater biological effect of male circumcision meant that a major fraction (81%) of RCT realizations rejected the null hypothesis of no effect from the intervention. Our study highlights how agent-based simulations synthesize individual variation in the epidemiological context of the RCT. This methodology will be particularly useful for designing RCTs aimed at evaluating combination prevention interventions in community-based RCTs, wherein an intervention's effectiveness is challenging to predict. PMID:24792492
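    The core idea, simulating many realizations of a trial and counting how often the null hypothesis is rejected, can be sketched as follows (illustrative Python with a simple two-proportion z-test; the event rates, sample sizes and efficacy are invented, not those of the cited trials):

```python
import math, random

def trial_rejects(n_per_arm, p_control, efficacy, rng):
    """Simulate one two-arm trial and apply a two-sided two-proportion
    z-test of H0: no intervention effect (alpha = 0.05)."""
    p_treat = p_control * (1 - efficacy)
    x_c = sum(rng.random() < p_control for _ in range(n_per_arm))
    x_t = sum(rng.random() < p_treat for _ in range(n_per_arm))
    p_pool = (x_c + x_t) / (2 * n_per_arm)
    se = math.sqrt(2 * p_pool * (1 - p_pool) / n_per_arm)
    if se == 0:
        return False
    z = (x_c / n_per_arm - x_t / n_per_arm) / se
    return abs(z) > 1.96

# Estimated power = fraction of simulated realizations rejecting H0.
rng = random.Random(7)
realizations = 300
power = sum(trial_rejects(1000, 0.10, 0.5, rng)
            for _ in range(realizations)) / realizations
```

    An agent-based version would replace the per-arm Bernoulli draws with heterogeneous agents whose infection risk varies individually, which is exactly the variation the study argues such simulations capture.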

  4. ACACIA: an agent-based program for simulating behavior to reach long-term goals.

    PubMed

    Beltran, Francesc S; Quera, Vicenç; Zibetti, Elisabetta; Tijus, Charles; Miñano, Meritxell

    2009-05-01

    We present ACACIA, an agent-based program implemented in Java StarLogo 2.0 that simulates a two-dimensional microworld populated by agents, obstacles and goals. Our program simulates how agents can reach long-term goals by following sensorial-motor couplings (SMCs) that control how the agents interact with their environment and other agents through a process of local categorization. Thus, while acting in accordance with this set of SMCs, the agents reach their goals through the emergence of global behaviors. This agent-based simulation program would allow us to understand some psychological processes such as planning behavior from the point of view that the complexity of these processes is the result of agent-environment interaction.
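    A sensorial-motor coupling of the kind ACACIA uses can be sketched as a local perceive-and-move rule (illustrative Python, not the StarLogo implementation; the grid, obstacle, 8-cell neighbourhood and greedy tie-breaking are all assumptions):

```python
def smc_step(pos, goal, obstacles):
    """One sensorial-motor coupling: perceive the eight neighbouring
    cells, move to the free one that most reduces Manhattan distance
    to the goal. Goal-reaching emerges from repeating this local rule."""
    x, y = pos
    moves = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
             if (dx, dy) != (0, 0)]
    free = [(x + dx, y + dy) for dx, dy in moves
            if (x + dx, y + dy) not in obstacles]
    if not free:
        return pos
    return min(free, key=lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1]))

pos, goal = (0, 0), (3, 0)
obstacles = {(1, 0)}        # a blocked cell directly on the straight path
path = [pos]
for _ in range(6):
    pos = smc_step(pos, goal, obstacles)
    path.append(pos)
    if pos == goal:
        break
```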

  5. Agent-based simulation of building evacuation using a grid graph-based model

    NASA Astrophysics Data System (ADS)

    Tan, L.; Lin, H.; Hu, M.; Che, W.

    2014-02-01

    Shifting from macroscopic to microscopic models, the agent-based approach has been widely used to model crowd evacuation as more attention is paid to individualized behaviour. Since indoor evacuation behaviour is closely related to the spatial features of the building, effective representation of indoor space is essential for the simulation of building evacuation. The traditional cell-based representation has limitations in reflecting spatial structure and is not suitable for topology analysis. Aiming to incorporate the powerful topology-analysis functions of GIS to facilitate agent-based simulation of building evacuation, we used a grid graph-based model in this study to represent the indoor space. Such a model allows us to establish an evacuation network at a micro level. Potential escape routes from each node can thus be analysed through GIS network-analysis functions, considering both the spatial structure and route capacity. This better supports agent-based modelling of evacuees' behaviour, including route choice and local movements. As a case study, we conducted a simulation of emergency evacuation from the second floor of an office building using Agent Analyst as the simulation platform. The results demonstrate the feasibility of the proposed method, as well as the potential of GIS in visualizing and analysing simulation results.
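    Route analysis on a grid graph of the kind described can be sketched with a breadth-first search to the nearest exit (illustrative Python standing in for the GIS network-analysis functions; the floor layout and cell encoding are invented):

```python
from collections import deque

def evacuation_route(grid, start):
    """Breadth-first search on a grid graph: '.' is walkable, '#' is a
    wall, 'E' is an exit. Returns the shortest route (list of (row, col)
    cells) from start to the nearest exit, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    frontier = deque([start])
    while frontier:
        r, c = frontier.popleft()
        if grid[r][c] == "E":
            route = []
            node = (r, c)
            while node is not None:   # walk parents back to the start
                route.append(node)
                node = parent[node]
            return route[::-1]
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] != "#" and (nr, nc) not in parent:
                parent[(nr, nc)] = (r, c)
                frontier.append((nr, nc))
    return None

# A wall separates the start from the exit; the route detours around it.
floor = ["..#E",
         "..#.",
         "...."]
route = evacuation_route(floor, start=(0, 0))
```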

  6. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    DOE PAGES

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to the progression of type 2 diabetes was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. This model hierarchy, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. The four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating impacts of an elderliness factor, an obesity factor and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and diffusion of social norms that spread over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translating complex system dynamics models into agent-based alternatives that are both conceptually simpler and capable of capturing the main effects of complex local agent-agent interactions.
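    The Theory of Planned Behavior realization described, behavior as a joint function of individual attitude and a social norm diffusing over each agent's network, can be sketched as follows (illustrative Python; the weights, threshold and toy network are invented, not the paper's calibrated parameters):

```python
def tpb_step(attitudes, behaviors, network, w_attitude=0.5, w_norm=0.5,
             threshold=0.5):
    """One update of a simplified Theory of Planned Behavior model:
    an agent adopts the healthy behavior (1) when a weighted sum of its
    own attitude and the social norm (share of adopters among its
    network neighbours) crosses a threshold; adoption is absorbing."""
    new_behaviors = []
    for i, attitude in enumerate(attitudes):
        nbrs = network[i]
        norm = sum(behaviors[j] for j in nbrs) / len(nbrs) if nbrs else 0.0
        intention = w_attitude * attitude + w_norm * norm
        new_behaviors.append(1 if intention >= threshold else behaviors[i])
    return new_behaviors

# A reluctant agent (low attitude) adopts only after the norm has
# diffused through its neighbourhood over successive steps.
attitudes = [0.9, 0.6, 0.4]
behaviors = [1, 0, 0]                 # agent 0 already behaves healthily
network = {0: [1], 1: [0, 2], 2: [1]}
step1 = tpb_step(attitudes, behaviors, network)
step2 = tpb_step(attitudes, step1, network)
```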

  7. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    SciTech Connect

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to the progression of type 2 diabetes was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. This model hierarchy, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. The four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating impacts of an elderliness factor, an obesity factor and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and the diffusion of social norms spreading over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translating complex system dynamics models into agent-based alternatives that are both conceptually simpler and capable of capturing the main effects of complex local agent-agent interactions.
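The Theory of Planned Behavior rule described above, a joint function of individual attitude and a diffusing social norm, can be sketched as follows. The weights and adoption threshold are assumptions for illustration, not values from the paper.

```python
# Sketch of a TPB-style behavior rule: intention is a weighted sum of an
# agent's own attitude and the social norm among its network neighbors
# (weights and threshold are assumed, not the paper's calibrated values).
def intention(attitude, neighbor_behaviors, w_att=0.6, w_norm=0.4):
    """attitude in [0, 1]; neighbor_behaviors: 1 if neighbor adopted, else 0."""
    norm = sum(neighbor_behaviors) / len(neighbor_behaviors)
    return w_att * attitude + w_norm * norm

def adopts_healthy_behavior(attitude, neighbor_behaviors, threshold=0.5):
    return intention(attitude, neighbor_behaviors) > threshold

# an agent with a modest attitude (0.4) whose peers have mostly adopted
print(adopts_healthy_behavior(0.4, [1, 1, 1, 0]))  # → True
```

Run over a social network each time step, such a rule lets a healthy norm spread through peer influence even among agents with weak individual attitudes.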

  8. Agent-based modeling of malaria vectors: the importance of spatial simulation

    PubMed Central

    2014-01-01

    Background The modeling of malaria vector mosquito populations yields great insight into drivers of malaria transmission at the village scale. Simulation of individual mosquitoes as “agents” in a distributed, dynamic model domain may be greatly beneficial for simulation of spatial relationships of vectors and hosts. Methods In this study, an agent-based model is used to simulate the life cycle and movement of individual malaria vector mosquitoes in a Niger Sahel village, with individual simulated mosquitoes interacting with their physical environment as well as humans. Various processes that are known to be epidemiologically important, such as the dependence of parity on flight distance between developmental habitat and blood meal hosts and therefore spatial relationships of pools and houses, are readily simulated using this modeling paradigm. Impacts of perturbations can be evaluated on the basis of vectorial capacity, because the interactions between individuals that make up the population-scale metric vectorial capacity can be easily tracked for simulated mosquitoes and human blood meal hosts, without the need to estimate vectorial capacity parameters. Results As expected, model results show pronounced impacts of pool source reduction from larvicide application and draining, but with varying degrees of impact depending on the spatial relationship between pools and human habitation. Results highlight the importance of spatially-explicit simulation that can model individuals such as in an agent-based model. Conclusions The impacts of perturbations on village scale malaria transmission depend on spatial locations of individual mosquitoes, as well as the tracking of relevant life cycle events and characteristics of individual mosquitoes. This study demonstrates advantages of using an agent-based approach for village-scale mosquito simulation to address questions in which spatial relationships are known to be important. PMID:24992942

  9. An Agent-Based Model of New Venture Creation: Conceptual Design for Simulating Entrepreneurship

    NASA Technical Reports Server (NTRS)

    Provance, Mike; Collins, Andrew; Carayannis, Elias

    2012-01-01

    There is a growing debate over the means by which regions can foster the growth of entrepreneurial activity in order to stimulate the recovery and growth of their economies. On one side, agglomeration theory suggests that regions grow because of strong clusters that foster local knowledge spillover; on the other, the entrepreneurial action camp argues that innovative business models are generated by entrepreneurs with unique market perspectives who draw on knowledge from more distant domains. We present the design of a novel agent-based model of new venture creation that demonstrates the relationship between agglomeration and action. The primary focus of this model is information exchange as the medium for agent interactions. Our modeling and simulation study proposes to reveal interesting relationships in these perspectives, offer a foundation on which these disparate theories from economics and sociology can find common ground, and expand the use of agent-based modeling into entrepreneurship research.

  10. Graceful Failure and Societal Resilience Analysis Via Agent-Based Modeling and Simulation

    NASA Astrophysics Data System (ADS)

    Schopf, P. S.; Cioffi-Revilla, C.; Rogers, J. D.; Bassett, J.; Hailegiorgis, A. B.

    2014-12-01

    Agent-based social modeling is opening up new methodologies for the study of societal response to weather and climate hazards, and providing measures of resiliency that can be studied in many contexts, particularly in coupled human and natural-technological systems (CHANTS). Since CHANTS are complex adaptive systems, societal resiliency may or may not occur, depending on dynamics that lack closed-form solutions. Agent-based modeling has been shown to provide a viable theoretical and methodological approach for analyzing and understanding disasters and societal resiliency in CHANTS. Our approach advances the science of societal resilience through computational modeling and simulation methods that complement earlier statistical and mathematical approaches. We present three case studies of social dynamics modeling that demonstrate the use of these agent-based models. In Central Asia, we examine multiple ensemble simulations with varying climate statistics to see how droughts and zuds affect populations, the transmission of wealth across generations, and the overall structure of the social system. In Eastern Africa, we explore how successive episodes of drought affect the adaptive capacity of rural households. Human displacement, mainly rural-to-urban migration, and livelihood transitions, particularly from pastoralism to farming, are observed as rural households interact dynamically with the biophysical environment and continually adjust their behavior to accommodate changes in climate. In the far-north case we demonstrate one of the first successful attempts to model the complete climate-permafrost-infrastructure-societal interaction network as a complex adaptive system (CHANTS), implemented as a ``federated'' agent-based model using evolutionary computation. Analysis of population changes resulting from extreme weather across these and other cases provides evidence for the emergence of new steady states and shifting patterns of resilience.

  11. Model reduction for agent-based social simulation: coarse-graining a civil violence model.

    PubMed

    Zou, Yu; Fonoberov, Vladimir A; Fonoberova, Maria; Mezic, Igor; Kevrekidis, Ioannis G

    2012-06-01

    Agent-based modeling (ABM) constitutes a powerful computational tool for the exploration of phenomena involving emergent dynamic behavior in the social sciences. This paper demonstrates a computer-assisted approach that bridges the significant gap between the single-agent microscopic level and the macroscopic (coarse-grained population) level, where fundamental questions must be rationally answered and policies guiding the emergent dynamics devised. Our approach will be illustrated through an agent-based model of civil violence. This spatiotemporally varying ABM incorporates interactions between a heterogeneous population of citizens [active (insurgent), inactive, or jailed] and a population of police officers. Detailed simulations exhibit an equilibrium punctuated by periods of social upheavals. We show how to effectively reduce the agent-based dynamics to a stochastic model with only two coarse-grained degrees of freedom: the number of jailed citizens and the number of active ones. The coarse-grained model captures the ABM dynamics while drastically reducing the computation time (by a factor of approximately 20).
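The coarse-grained description above, tracking only the counts of active and jailed citizens, can be sketched as a two-variable stochastic model. The transition probabilities below (activation, arrest, release) are invented for illustration; the paper derives its reduced dynamics computationally from the full ABM rather than positing rates.

```python
# Toy two-degree-of-freedom reduction in the spirit of the paper: the only
# state is (A, J) = (active, jailed) out of N citizens, evolved with
# assumed per-capita transition probabilities each step.
import random

def step(A, J, N, p_act=0.02, p_arrest=0.3, p_release=0.05):
    quiet = N - A - J
    newly_active = sum(random.random() < p_act for _ in range(quiet))
    arrested = sum(random.random() < p_arrest for _ in range(A))
    released = sum(random.random() < p_release for _ in range(J))
    A = A + newly_active - arrested     # activists appear, some get jailed
    J = J + arrested - released         # jail fills and slowly empties
    return A, J

random.seed(0)
A, J = 5, 0
for _ in range(100):
    A, J = step(A, J, N=200)
print(A >= 0, J >= 0, A + J <= 200)     # state stays within the population
```

Simulating this chain costs a few arithmetic operations per step, versus updating every agent, which is the source of the roughly 20-fold speed-up the paper reports for its reduced model.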

  12. Quantitative agent-based firm dynamics simulation with parameters estimated by financial and transaction data analysis

    NASA Astrophysics Data System (ADS)

    Ikeda, Yuichi; Souma, Wataru; Aoyama, Hideaki; Iyetomi, Hiroshi; Fujiwara, Yoshi; Kaizoji, Taisei

    2007-03-01

    Firm dynamics on a transaction network is considered from the standpoint of econophysics, agent-based simulations, and game theory. In this model, interacting firms rationally invest in a production facility to maximize net present value. We estimate the parameters used in the model through empirical analysis of financial and transaction data. We propose two different methods (an analytical method and a regression method) to obtain an interaction matrix of firms. On a subset of a real transaction network, we simulate each firm's revenue, cost, and fixed assets, i.e. the accumulated investment in the production facility. The simulation reproduces the quantitative behavior of past revenues and costs within a standard error when we use the interaction matrix estimated by the regression method, in which only transaction pairs are taken into account. Furthermore, the simulation qualitatively reproduces past data on fixed assets.

  13. An agent-based epidemic simulation of social behaviors affecting HIV transmission among Taiwanese homosexuals.

    PubMed

    Huang, Chung-Yuan

    2015-01-01

    Computational simulations are currently used to identify epidemic dynamics, to test potential prevention and intervention strategies, and to study the effects of social behaviors on HIV transmission. The author describes an agent-based epidemic simulation model of a network of individuals who participate in high-risk sexual practices, using number of partners, condom usage, and relationship length to distinguish between high- and low-risk populations. Two new concepts, free links and fixed links, are used to indicate tendencies among individuals who either have large numbers of short-term partners or stay in long-term monogamous relationships. An attempt was made to reproduce the epidemic curves of reported HIV cases among male homosexuals in Taiwan before using the agent-based model to determine the effects of various policies on epidemic dynamics. Results suggest that when suitable adjustments are made based on available social survey statistics, the model accurately simulates real-world behaviors on a large scale.
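The free-link / fixed-link distinction can be sketched as a network update rule: fixed links persist across time steps (long-term monogamous pairs), while free links are re-drawn each step among the remaining agents (short-term partners). The agent counts and pairing rule below are illustrative assumptions, not the paper's calibrated network.

```python
# Illustrative sketch: fixed links survive each time step, free links are
# re-drawn among agents not in a fixed pair (all sizes/rates assumed).
import random

def step_partners(agents, fixed_pairs, n_free):
    """One time step of the partner network: keep fixed pairs, redraw free links."""
    links = set(fixed_pairs)
    singles = [a for a in agents if not any(a in pair for pair in fixed_pairs)]
    random.shuffle(singles)
    for _ in range(n_free):
        if len(singles) < 2:
            break
        a, b = singles.pop(), singles.pop()
        links.add((a, b))          # a short-term (free) partnership
    return links

random.seed(1)
agents = list(range(10))
fixed = [(0, 1), (2, 3)]           # two long-term monogamous couples
links = step_partners(agents, fixed, n_free=2)
print(len(links))                  # → 4: two fixed plus two free links
```

Transmission risk per link can then be made to depend on link type, e.g. lower condom usage within fixed links but far more partner turnover through free links.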

  14. An Agent-Based Labor Market Simulation with Endogenous Skill-Demand

    NASA Astrophysics Data System (ADS)

    Gemkow, S.

    This paper considers an agent-based labor market simulation to examine the influence of skills on wages and unemployment rates. To this end, less-skilled and highly skilled workers, as well as less productive and highly productive vacancies, are implemented. The skill distribution is exogenous, whereas the distribution of less and highly productive vacancies is endogenous. The different opportunities of the skill groups on the labor market are established through skill requirements: a highly productive vacancy can only be filled by a highly skilled unemployed worker. Different skill distributions, which can also be interpreted as skill-biased technological change, are simulated by exogenously incrementing the skill level of highly skilled persons. The simulation also provides a microeconomic foundation for the matching function often used in theoretical approaches.
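The skill-requirement rule above can be sketched as a small matching routine: a highly productive vacancy accepts only highly skilled workers, while a less productive vacancy accepts either group. The greedy order below is an assumption for illustration; the paper's matching process is richer.

```python
# Minimal sketch of the paper's matching constraint (greedy order assumed):
# 'high' vacancies require 'high' workers; 'low' vacancies accept anyone.
def match(unemployed, vacancies):
    """Workers and vacancies are 'high'/'low' labels; returns (matches, leftover)."""
    pool = list(unemployed)
    filled = []
    for vac in vacancies:
        for worker in pool:
            if vac == 'low' or worker == 'high':   # the skill requirement
                filled.append((worker, vac))
                pool.remove(worker)
                break
    return filled, pool

filled, still_unemployed = match(['low', 'low', 'high'], ['high', 'high', 'low'])
print(filled)              # [('high', 'high'), ('low', 'low')]
print(still_unemployed)    # ['low'] — a less-skilled worker stays unemployed
```

Note the asymmetry this produces: one 'high' vacancy stays unfilled for lack of skilled workers while a 'low' worker stays unemployed, the kind of mismatch the matching-function literature summarizes in aggregate.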

  15. Security Analysis of Selected AMI Failure Scenarios Using Agent Based Game Theoretic Simulation

    SciTech Connect

    Abercrombie, Robert K; Schlicher, Bob G; Sheldon, Frederick T

    2014-01-01

    Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified against the results of game theory analysis and further used to explore larger-scale, real-world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the Advanced Metering Infrastructure (AMI) functional domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) working group has currently documented 29 failure scenarios. The strategy for the game was developed by analyzing five representative electric sector failure scenarios contained in the AMI functional domain. We characterize these five selected scenarios into three specific threat categories affecting confidentiality, integrity and availability (CIA). The analysis using our ABGT simulation demonstrates how to model the AMI functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the AMI network with respect to CIA.

  16. Recent Advances in Agent-Based Tsunami Evacuation Simulations: Case Studies in Indonesia, Thailand, Japan and Peru

    NASA Astrophysics Data System (ADS)

    Mas, Erick; Koshimura, Shunichi; Imamura, Fumihiko; Suppasri, Anawat; Muhari, Abdul; Adriano, Bruno

    2015-12-01

    As confirmed by the extreme tsunami events over the last decade (the 2004 Indian Ocean, 2010 Chile and 2011 Japan tsunami events), mitigation measures and effective evacuation planning are needed to reduce disaster risks. Modeling tsunami evacuations is an alternative means to analyze evacuation plans and possible scenarios of evacuees' behaviors. In this paper, practical applications of an agent-based tsunami evacuation model are presented to demonstrate the contributions that agent-based modeling has added to tsunami evacuation simulations and tsunami mitigation efforts. A brief review of previous agent-based evacuation models in the literature is given to highlight recent progress in agent-based methods. Finally, challenges are noted for bridging gaps between geoscience and social science within the agent-based approach for modeling tsunami evacuations.

  17. Modeling the Information Age Combat Model: An Agent-Based Simulation of Network Centric Operations

    NASA Technical Reports Server (NTRS)

    Deller, Sean; Rabadi, Ghaith A.; Bell, Michael I.; Bowling, Shannon R.; Tolk, Andreas

    2010-01-01

    The Information Age Combat Model (IACM) was introduced by Cares in 2005 to contribute to the development of an understanding of the influence of connectivity on force effectiveness that can eventually lead to quantitative prediction and guidelines for design and employment. The structure of the IACM makes it clear that the Perron-Frobenius Eigenvalue is a quantifiable metric with which to measure the organization of a networked force. The results of recent experiments presented in Deller et al. (2009) indicate that the value of the Perron-Frobenius Eigenvalue is a significant measurement of the performance of an Information Age combat force. This was accomplished through the innovative use of an agent-based simulation to model the IACM and represents an initial contribution towards a new generation of combat models that are net-centric instead of using the current platform-centric approach. This paper describes the intent, challenges, design, and initial results of this agent-based simulation model.

  18. AN AGENT-BASED SIMULATION STUDY OF A COMPLEX ADAPTIVE COLLABORATION NETWORK

    SciTech Connect

    Ozmen, Ozgur; Smith, Jeffrey; Yilmaz, Levent

    2013-01-01

    One of the most significant problems in organizational scholarship is to discern how social collectives govern, organize, and coordinate the actions of individuals to achieve collective outcomes. These collectives are usually interpreted as complex adaptive systems (CAS). An understanding of CAS is more likely to arise with the help of computer-based simulations. In this tutorial, using an agent-based modeling approach, a complex adaptive social communication network model is introduced. The objective is to present the underlying dynamics of the system in the form of a computer simulation that enables analyzing the impacts of various mechanisms on network topologies and emergent behaviors. The ultimate goal is to further our understanding of the dynamics of the system and facilitate developing informed policies for decision-makers.

  19. Multi-Agent-Based Simulation of a Complex Ecosystem of Mental Health Care.

    PubMed

    Kalton, Alan; Falconer, Erin; Docherty, John; Alevras, Dimitris; Brann, David; Johnson, Kyle

    2016-02-01

    This paper discusses the creation of an Agent-Based Simulation that modeled the introduction of care coordination capabilities into a complex system of care for patients with Serious and Persistent Mental Illness. The model describes the engagement between patients and the medical, social and criminal justice services they interact with in a complex ecosystem of care. We outline the challenges involved in developing the model, including process mapping and the collection and synthesis of data to support parametric estimates, and describe the controls built into the model to support analysis of potential changes to the system. We also describe the approach taken to calibrate the model to an observable level of system performance. Preliminary results from application of the simulation are provided to demonstrate how it can provide insights into potential improvements deriving from introduction of care coordination technology. PMID:26590977

  20. Multi-Agent-Based Simulation of a Complex Ecosystem of Mental Health Care.

    PubMed

    Kalton, Alan; Falconer, Erin; Docherty, John; Alevras, Dimitris; Brann, David; Johnson, Kyle

    2016-02-01

    This paper discusses the creation of an Agent-Based Simulation that modeled the introduction of care coordination capabilities into a complex system of care for patients with Serious and Persistent Mental Illness. The model describes the engagement between patients and the medical, social and criminal justice services they interact with in a complex ecosystem of care. We outline the challenges involved in developing the model, including process mapping and the collection and synthesis of data to support parametric estimates, and describe the controls built into the model to support analysis of potential changes to the system. We also describe the approach taken to calibrate the model to an observable level of system performance. Preliminary results from application of the simulation are provided to demonstrate how it can provide insights into potential improvements deriving from introduction of care coordination technology.

  1. Promoting Conceptual Change for Complex Systems Understanding: Outcomes of an Agent-Based Participatory Simulation

    NASA Astrophysics Data System (ADS)

    Rates, Christopher A.; Mulvey, Bridget K.; Feldon, David F.

    2016-08-01

    Components of complex systems apply across multiple subject areas, and teaching these components may help students build unifying conceptual links. Students, however, often have difficulty learning these components, and limited research exists to understand what types of interventions may best help improve understanding. We investigated 32 high school students' understandings of complex systems components and whether an agent-based simulation could improve their understandings. Pretest and posttest essays were coded for changes in six components to determine whether students showed more expert thinking about the complex system of the Chesapeake Bay watershed. Results showed significant improvement for the components Emergence (r = .26, p = .03), Order (r = .37, p = .002), and Tradeoffs (r = .44, p = .001). Implications include that the experiential nature of the simulation has the potential to support conceptual change for some complex systems components, presenting a promising option for complex systems instruction.

  2. Automated multi-objective calibration of biological agent-based simulations.

    PubMed

    Read, Mark N; Alden, Kieran; Rose, Louis M; Timmis, Jon

    2016-09-01

    Computational agent-based simulation (ABS) is increasingly used to complement laboratory techniques in advancing our understanding of biological systems. Calibration, the identification of parameter values that align simulation with biological behaviours, becomes challenging as increasingly complex biological domains are simulated. Complex domains cannot be characterized by single metrics alone, rendering simulation calibration a fundamentally multi-metric optimization problem that typical calibration techniques cannot handle. Yet calibration is an essential activity in simulation-based science; the baseline calibration forms a control for subsequent experimentation and hence is fundamental in the interpretation of results. Here, we develop and showcase a method, built around multi-objective optimization, for calibrating ABSs against complex target behaviours requiring several metrics (termed objectives) to characterize. Multi-objective calibration (MOC) delivers those sets of parameter values representing optimal trade-offs in simulation performance against each metric, in the form of a Pareto front. We use MOC to calibrate a well-understood immunological simulation against both established a priori and previously unestablished target behaviours. Furthermore, we show that simulation-borne conclusions are broadly, but not entirely, robust to adopting baseline parameter values from different extremes of the Pareto front, highlighting the importance of MOC's identification of numerous calibration solutions. We devise a method for detecting overfitting in a multi-objective context, not previously possible, used to save computational effort by terminating MOC when no improved solutions will be found. MOC can significantly impact biological simulation, adding rigour to and speeding up an otherwise time-consuming calibration process and highlighting inappropriate biological capture by simulations that cannot be well calibrated. As such, it produces more accurate
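The core of multi-objective calibration, keeping the non-dominated parameter sets (the Pareto front), can be sketched in a few lines. The candidate parameter labels and metric values below are invented for illustration, with lower metric values taken to mean better fit.

```python
# Sketch of Pareto-front extraction: given candidates scored on several
# error metrics (lower is better), keep the non-dominated set.
def dominates(a, b):
    """a dominates b if a is no worse on every metric and better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(scored):
    """scored: list of (params, metrics) tuples; returns the non-dominated ones."""
    return [(p, m) for p, m in scored
            if not any(dominates(m2, m) for _, m2 in scored)]

candidates = [('a', (0.1, 0.9)),
              ('b', (0.5, 0.5)),
              ('c', (0.9, 0.1)),
              ('d', (0.6, 0.6))]           # 'd' is dominated by 'b'
front = pareto_front(candidates)
print([p for p, _ in front])               # → ['a', 'b', 'c']
```

Each surviving candidate is an optimal trade-off: improving its fit on one metric would necessarily worsen it on another, which is why MOC returns a set of calibrations rather than a single "best" one.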

  3. Automated multi-objective calibration of biological agent-based simulations.

    PubMed

    Read, Mark N; Alden, Kieran; Rose, Louis M; Timmis, Jon

    2016-09-01

    Computational agent-based simulation (ABS) is increasingly used to complement laboratory techniques in advancing our understanding of biological systems. Calibration, the identification of parameter values that align simulation with biological behaviours, becomes challenging as increasingly complex biological domains are simulated. Complex domains cannot be characterized by single metrics alone, rendering simulation calibration a fundamentally multi-metric optimization problem that typical calibration techniques cannot handle. Yet calibration is an essential activity in simulation-based science; the baseline calibration forms a control for subsequent experimentation and hence is fundamental in the interpretation of results. Here, we develop and showcase a method, built around multi-objective optimization, for calibrating ABSs against complex target behaviours requiring several metrics (termed objectives) to characterize. Multi-objective calibration (MOC) delivers those sets of parameter values representing optimal trade-offs in simulation performance against each metric, in the form of a Pareto front. We use MOC to calibrate a well-understood immunological simulation against both established a priori and previously unestablished target behaviours. Furthermore, we show that simulation-borne conclusions are broadly, but not entirely, robust to adopting baseline parameter values from different extremes of the Pareto front, highlighting the importance of MOC's identification of numerous calibration solutions. We devise a method for detecting overfitting in a multi-objective context, not previously possible, used to save computational effort by terminating MOC when no improved solutions will be found. MOC can significantly impact biological simulation, adding rigour to and speeding up an otherwise time-consuming calibration process and highlighting inappropriate biological capture by simulations that cannot be well calibrated. As such, it produces more accurate

  4. Understanding the Dynamics of Violent Political Revolutions in an Agent-Based Framework

    PubMed Central

    Moro, Alessandro

    2016-01-01

    This paper develops an agent-based computational model of violent political revolutions in which a subjugated population of citizens and an armed revolutionary organisation attempt to overthrow a central authority and its loyal forces. The model replicates several patterns of rebellion consistent with major historical revolutions, and provides an explanation for the multiplicity of outcomes that can arise from an uprising. The relevance of the heterogeneity of scenarios predicted by the model can be understood by considering the recent experience of the Arab Spring involving several rebellions that arose in an apparently similar way, but resulted in completely different political outcomes: the successful revolution in Tunisia, the failed protests in Saudi Arabia and Bahrain, and civil war in Syria and Libya. PMID:27104855

  5. Understanding the Dynamics of Violent Political Revolutions in an Agent-Based Framework.

    PubMed

    Moro, Alessandro

    2016-01-01

    This paper develops an agent-based computational model of violent political revolutions in which a subjugated population of citizens and an armed revolutionary organisation attempt to overthrow a central authority and its loyal forces. The model replicates several patterns of rebellion consistent with major historical revolutions, and provides an explanation for the multiplicity of outcomes that can arise from an uprising. The relevance of the heterogeneity of scenarios predicted by the model can be understood by considering the recent experience of the Arab Spring involving several rebellions that arose in an apparently similar way, but resulted in completely different political outcomes: the successful revolution in Tunisia, the failed protests in Saudi Arabia and Bahrain, and civil war in Syria and Libya. PMID:27104855

  6. Changing crops in response to climate: virtual Nang Rong, Thailand in an agent based simulation

    PubMed Central

    Malanson, George P.; Verdery, Ashton M.; Walsh, Stephen J.; Sawangdee, Yothin; Heumann, Benjamin W.; McDaniel, Philip M.; Frizzelle, Brian G.; Williams, Nathalie E.; Yao, Xiaozheng; Entwisle, Barbara; Rindfuss, Ronald R.

    2014-01-01

    The effects of extended climatic variability on agricultural land use were explored for the type of system found in villages of northeastern Thailand. An agent-based model developed for the Nang Rong district was used to simulate land allotted to jasmine rice, heavy rice, cassava, and sugar cane. The land use choices in the model depended on likely economic outcomes, but included elements of bounded rationality that depend on household demography. The socioeconomic dynamics are endogenous in the system, and climate changes were added as exogenous drivers. Villages changed their agricultural effort in many different ways. Most villages reduced the amount of land under cultivation, primarily through reductions in jasmine rice, but others did not. The variation in responses to climate change indicates potential sensitivity to initial conditions and path dependence for this type of system. The differences between our virtual villages and the real villages of the region indicate the effects of bounded rationality and limits on model applications. PMID:25061240

  7. Method for distributed agent-based non-expert simulation of manufacturing process behavior

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2004-11-30

    A method for distributed agent-based non-expert simulation of manufacturing process behavior on a single-processor computer comprises the steps of: object modeling a manufacturing technique having a plurality of processes; associating a distributed agent with each said process; and programming each said agent to respond to discrete events corresponding to the manufacturing technique, wherein each discrete event triggers a programmed response. The method can further comprise the step of transmitting the discrete events to each agent in a message loop. In addition, the programming step comprises the step of conditioning each agent to respond to a discrete event selected from the group consisting of a clock tick message, a resources received message, and a request for output production message.
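The claimed structure can be sketched as follows; the class, process names, and state variables are illustrative assumptions, not taken from the patent. Each process agent responds to the three discrete event types named in the claim, and events are delivered to every agent in a message loop.

```python
# Hedged sketch of the message-loop architecture described in the claim:
# one agent per manufacturing process, each handling three event types.
class ProcessAgent:
    def __init__(self, name):
        self.name = name
        self.resources = 0
        self.output = 0

    def handle(self, event):
        if event == 'clock_tick':
            pass                        # advance internal state per tick
        elif event == 'resources_received':
            self.resources += 1         # a unit of input material arrives
        elif event == 'request_output' and self.resources > 0:
            self.resources -= 1         # consume one resource unit...
            self.output += 1            # ...to produce one unit of output

agents = [ProcessAgent('cutting'), ProcessAgent('assembly')]
for event in ['resources_received', 'clock_tick', 'request_output']:
    for agent in agents:                # the message loop
        agent.handle(event)
print([a.output for a in agents])       # → [1, 1]
```

Because every event passes through a single loop, the whole plurality of process agents runs on one processor while remaining logically distributed, which is the point of the claim.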

  8. Prediction Markets and Beliefs about Climate: Results from Agent-Based Simulations

    NASA Astrophysics Data System (ADS)

    Gilligan, J. M.; John, N. J.; van der Linden, M.

    2015-12-01

    Climate scientists have long been frustrated by persistent doubts a large portion of the public expresses toward the scientific consensus about anthropogenic global warming. The political and ideological polarization of this doubt led Vandenbergh, Raimi, and Gilligan [1] to propose that prediction markets for climate change might influence the opinions of those who mistrust the scientific community but do trust the power of markets. We have developed an agent-based simulation of a climate prediction market in which traders buy and sell future contracts that will pay off at some future year with a value that depends on the global average temperature at that time. The traders form a heterogeneous population with different ideological positions, different beliefs about anthropogenic global warming, and different degrees of risk aversion. We also vary characteristics of the market, including the topology of social networks among the traders, the number of traders, and the completeness of the market. Traders adjust their beliefs about climate according to the gains and losses they and other traders in their social network experience. This model predicts that if global temperature is predominantly driven by greenhouse gas concentrations, prediction markets will cause traders' beliefs to converge toward correctly accepting anthropogenic warming as real. This convergence is largely independent of the structure of the market and the characteristics of the population of traders. However, it may take considerable time for beliefs to converge. Conversely, if temperature does not depend on greenhouse gases, the model predicts that traders' beliefs will not converge. We will discuss the policy relevance of these results and, more generally, the use of agent-based market simulations for policy analysis regarding climate change, seasonal agricultural weather forecasts, and other applications. [1] MP Vandenbergh, KT Raimi, & JM Gilligan. UCLA Law Rev. 61, 1962 (2014).
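The belief-adjustment mechanism above, traders updating according to the gains and losses experienced in their social network, can be sketched simply. The learning rate, payoffs, and network below are illustrative assumptions, not the paper's calibrated model.

```python
# Toy sketch of payoff-driven belief updating: each trader nudges its
# belief about warming toward the best-paid neighbor, if that neighbor
# out-performed it (learning rate and payoffs are assumed).
def update_beliefs(beliefs, payoffs, neighbors, rate=0.5):
    new = list(beliefs)
    for i, nbrs in enumerate(neighbors):
        best = max(nbrs, key=lambda j: payoffs[j])   # most successful neighbor
        if payoffs[best] > payoffs[i]:
            new[i] += rate * (beliefs[best] - beliefs[i])
    return new

beliefs   = [0.2, 0.9, 0.5]        # probability each trader puts on warming
payoffs   = [-1.0, 1.0, 0.0]       # contracts paid off for the 0.9 trader
neighbors = [[1, 2], [0, 2], [0, 1]]
print(update_beliefs(beliefs, payoffs, neighbors))
```

When the climate signal rewards accurate contracts round after round, repeated updates of this kind pull the population toward the belief of the consistently profitable traders, the convergence behavior the abstract describes.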

  9. Investigating the role of water in the Diffusion of Cholera using Agent-Based simulation

    NASA Astrophysics Data System (ADS)

    Augustijn, Ellen-Wien; Doldersum, Tom; Augustijn, Denie

    2014-05-01

    Traditionally, cholera was considered a waterborne disease. We now know that many other factors contribute to the spread of the disease, including human mobility and human behavior; the hydrological component in cholera diffusion is nevertheless significant. The interplay between cholera and water includes bacteria (V. cholerae) that survive in the aquatic environment, the possibility that run-off water from dumpsites carries the bacteria to surface water (rivers and lakes), and, once the bacteria reach streams, their transport downstream to infect new locations. Modelling is an important tool for building theory on the interplay between the different transmission mechanisms that together are responsible for the spread of cholera. Agent-based simulation models are well suited to incorporating behavior at the individual level and reproducing emergence; it is more difficult, however, to incorporate hydrological components in this type of model. In this research we present the hydrological component of an agent-based cholera model developed to study a cholera epidemic in Kumasi (Ghana) in 2005. The model was calibrated on the relative contribution of each community to the distributed pattern of cholera rather than the absolute number of incidences. Analysis of the results shows that water plays an important role in the diffusion of cholera: 75% of the cholera cases were infected via river water that was contaminated by runoff from the dumpsites. To initiate infections upstream, the probability of environment-to-human transmission seemed to be overestimated compared to what may be expected from the literature. Scenario analyses show a strong relation between the epidemic curve and rainfall. Removing dumpsites situated close to the river resulted in a strong decrease in the number of cholera cases. Results are sensitive to the scheduling of daily activities and the survival time of the cholera bacteria.
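The hydrological pathway described above, dumpsite runoff contaminating a river that carries bacteria downstream, can be sketched as a chain of river nodes. The carry and decay fractions and the loading values are assumptions for illustration, not the calibrated Kumasi model.

```python
# Sketch of downstream bacterial transport: each step, a node's remaining
# bacteria (after decay) are split between staying put and advecting to
# the next node downstream; dumpsite runoff adds a fixed load upstream.
def river_step(conc, runoff_load, carry=0.8, decay=0.1):
    """conc: bacteria concentration per river node, ordered upstream to downstream."""
    new = [0.0] * len(conc)
    for i, c in enumerate(conc):
        remaining = c * (1 - decay)              # die-off in the water
        stay, move = remaining * (1 - carry), remaining * carry
        new[i] += stay + runoff_load[i]          # local load from dumpsites
        if i + 1 < len(conc):
            new[i + 1] += move                   # advect downstream
    return new

conc = [0.0, 0.0, 0.0]
load = [10.0, 0.0, 0.0]            # a dumpsite drains into the upstream node
for _ in range(3):
    conc = river_step(conc, load)
print(conc[2] > 0)                 # downstream node eventually contaminated
```

Agents drawing water at a downstream node can then be exposed with a probability that grows with the local concentration, coupling the hydrological layer to the behavioral one.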

  10. Impact of road environment on drivers' behaviors in dilemma zone: Application of agent-based simulation.

    PubMed

    Kim, Sojung; Son, Young-Jun; Chiu, Yi-Chang; Jeffers, Mary Anne B; Yang, C Y David

    2016-11-01

At a signalized intersection, there exists an area where drivers become indecisive about whether to stop or proceed through when the traffic signal turns yellow. This area, called the dilemma zone, has remained a safety concern because of the high likelihood of rear-end or right-angle crashes. To reduce the risk of crashes in the dilemma zone, the Institute of Transportation Engineers (ITE) recommended a dilemma zone model. The model, however, fails to predict drivers' decisions precisely because it disregards supplemental roadway information, such as whether a red light camera is present. Hence, the goal of this study was to incorporate such roadway environmental factors into a more realistic driver decision-making model for the dilemma zone. A driving simulator was used to determine the influence of roadway conditions on the decision-making of real drivers. Following data collection, each driver's decision outcomes were implemented in an Agent-Based Simulation (ABS) to analyze behaviors under realistic road environments. The experimental results revealed that the proposed dilemma zone model was able to accurately predict drivers' decisions. Specifically, the model confirmed the finding from the driving simulator study that changes in the roadway environment reduced the number of red light violations at an intersection.

  11. Agent-based Modeling to Simulate the Diffusion of Water-Efficient Innovations and the Emergence of Urban Water Sustainability

    NASA Astrophysics Data System (ADS)

    Kanta, L.; Giacomoni, M.; Shafiee, M. E.; Berglund, E.

    2014-12-01

The sustainability of water resources is threatened by urbanization, as increasing demands deplete water availability, and changes to the landscape alter runoff and the flow regime of receiving water bodies. Utility managers typically manage urban water resources through centralized solutions, such as large reservoirs, which may be limited in their ability to balance the needs of urbanization and ecological systems. Decentralized technologies, on the other hand, may improve the health of the water resources system while still delivering urban water services. For example, low impact development technologies, such as rainwater harvesting, and water-efficient technologies, such as low-flow faucets and toilets, may be adopted by households to retain rainwater and reduce demands, offsetting the need for new centralized infrastructure. Decentralized technologies may create new complexities in infrastructure and water management, as decentralization depends on community behavior and participation beyond traditional water resources planning. Messages about water shortages and water quality from peers and water utility managers can influence the adoption of new technologies. As a result, feedbacks between consumers and water resources emerge, creating a complex system. This research develops a framework to simulate the diffusion of water-efficient innovations and the sustainability of urban water resources by coupling models of households in a community, hydrologic models of a water resources system, and a cellular automata model of land use change. Agent-based models are developed to simulate the land use and water demand decisions of individual households, and behavioral rules are encoded to simulate communication with other agents and the adoption of decentralized technologies, using a model of the diffusion of innovation. The framework is applied to an illustrative case study to simulate water resources sustainability over a long-term planning horizon.
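The diffusion-of-innovation rule described above can be sketched with a minimal Bass-style adoption model; the rates (`p_media`, `q_peer`) and agent count are illustrative assumptions, not the paper's encoded behavioral rules:

```python
import random

def diffuse(n_agents=200, steps=30, p_media=0.01, q_peer=0.05, seed=2):
    """Each step, a non-adopting household adopts a water-efficient
    technology via external influence (p_media, e.g. utility campaigns)
    or via imitation proportional to the adopted fraction (q_peer)."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    history = []
    for _ in range(steps):
        frac = sum(adopted) / n_agents
        for i in range(n_agents):
            if not adopted[i] and rng.random() < p_media + q_peer * frac:
                adopted[i] = True
        history.append(sum(adopted))
    return history

curve = diffuse()
```

The peer term is what produces the characteristic S-shaped adoption curve: slow initial uptake, acceleration as the adopted fraction grows, then saturation.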

  12. An artificial intelligence approach for modeling molecular self-assembly: agent-based simulations of rigid molecules.

    PubMed

    Fortuna, Sara; Troisi, Alessandro

    2009-07-23

Agent-based simulations are rule-based models traditionally used for the simulation of complex systems. In this paper, an algorithm based on the concept of agent-based simulations is developed to predict the lowest energy packing of a set of identical rigid molecules. The agents are identified with rigid portions of the system under investigation, and they evolve following a set of rules designed to drive the system toward the lowest energy minimum. The algorithm is compared with a conventional Metropolis Monte Carlo algorithm and applied to a large set of representative models of molecules. For all the systems studied, the agent-based method consistently finds significantly lower energy minima than the Monte Carlo algorithm because the system evolution includes elements of adaptation (new configurations induce new types of moves) and learning (past successful choices are repeated).
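The Metropolis Monte Carlo baseline against which the agent-based method is compared can be sketched as follows; the point "molecules", Lennard-Jones pair energy, and move size are simplifying assumptions for illustration, not the paper's molecular models:

```python
import math
import random

def metropolis_pack(n=8, steps=2000, beta=2.0, box=6.0, seed=3):
    """Single-particle random moves on point 'molecules' in a 2D box,
    accepted with probability min(1, exp(-beta * dE)) under a
    Lennard-Jones pair energy. Returns the lowest total energy found."""
    rng = random.Random(seed)
    pts = [[rng.uniform(0, box), rng.uniform(0, box)] for _ in range(n)]

    def energy():
        e = 0.0
        for i in range(n):
            for j in range(i + 1, n):
                # clamp r^2 to avoid numerical blowup at tiny separations
                r2 = max((pts[i][0] - pts[j][0]) ** 2
                         + (pts[i][1] - pts[j][1]) ** 2, 0.5)
                inv6 = 1.0 / r2 ** 3
                e += 4.0 * (inv6 * inv6 - inv6)  # Lennard-Jones 12-6
        return e

    e = best = energy()
    for _ in range(steps):
        i = rng.randrange(n)
        old = pts[i][:]
        pts[i][0] = min(box, max(0.0, old[0] + rng.gauss(0, 0.3)))
        pts[i][1] = min(box, max(0.0, old[1] + rng.gauss(0, 0.3)))
        e_new = energy()
        if e_new < e or rng.random() < math.exp(-beta * (e_new - e)):
            e = e_new
            best = min(best, e)
        else:
            pts[i] = old  # reject: restore the previous position
    return best

e_best = metropolis_pack()
```

The agent-based alternative in the abstract differs precisely in that its move set adapts over time, whereas this baseline keeps one fixed proposal distribution.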

  13. Juxtaposition of System Dynamics and Agent-Based Simulation for a Case Study in Immunosenescence

    PubMed Central

    Figueredo, Grazziela P.

    2015-01-01

Advances in healthcare and in quality of life have significantly increased human life expectancy, and the aging of populations brings new, previously unmet challenges to science. The human body is naturally selected to function well until the age of reproduction, keeping the species alive; as the lifespan extends, however, unforeseen problems due to bodily deterioration emerge. Several age-related diseases have no appropriate treatment; therefore, the complex phenomena of aging need further understanding. It is known that immunosenescence is highly correlated with the negative effects of aging. In this work we advocate the use of simulation as a tool to assist the understanding of immune aging phenomena. In particular, we compare system dynamics modelling and simulation (SDMS) and agent-based modelling and simulation (ABMS) for the case of age-related depletion of naive T cells in the organism. We address the following research questions: Which simulation approach is more suitable for this problem? Can these approaches be employed interchangeably? Is there any benefit to using one approach over the other? Results show that both simulation outcomes closely fit the observed data and an existing mathematical model, and the likely contribution of each naive T cell repertoire maintenance method can therefore be estimated. The differences observed between the outcomes of the two approaches are due to the probabilistic character of ABMS in contrast to SDMS; however, they do not interfere with the overall expected dynamics of the populations. In this case, therefore, the approaches can be employed interchangeably, with SDMS being simpler to implement and taking less computational resources. PMID:25807273
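The contrast between the two paradigms can be sketched for a simple death-only cell population; the decline rate and population size are illustrative, not the paper's calibrated parameters. The system-dynamics version is a deterministic difference equation, while the agent-based version draws each cell's fate stochastically, so its runs scatter around the SD curve:

```python
import random

def sd_decline(n0=1000, rate=0.05, steps=50):
    """System-dynamics view: deterministic fractional decline per step."""
    n, out = float(n0), []
    for _ in range(steps):
        n -= rate * n
        out.append(n)
    return out

def abm_decline(n0=1000, rate=0.05, steps=50, seed=4):
    """Agent-based view: each cell independently dies with the same
    per-step probability, so the expected trajectory matches sd_decline
    but individual runs fluctuate."""
    rng = random.Random(seed)
    n, out = n0, []
    for _ in range(steps):
        n -= sum(rng.random() < rate for _ in range(n))
        out.append(n)
    return out

sd, ab = sd_decline(), abm_decline()
```

This is exactly the interchangeability observed in the abstract: the two curves agree in expectation, and only the ABMS run carries sampling noise.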

  14. Parallel Agent-Based Simulations on Clusters of GPUs and Multi-Core Processors

    SciTech Connect

    Aaby, Brandon G; Perumalla, Kalyan S; Seal, Sudip K

    2010-01-01

An effective latency-hiding mechanism is presented for the parallelization of agent-based model simulations (ABMS) with millions of agents. The mechanism is designed to accommodate the hierarchical organization as well as the heterogeneity of current state-of-the-art parallel computing platforms. We use it to explore the computation vs. communication trade-off continuum available with the deep computational and memory hierarchies of extant platforms, and present a novel analytical model of the tradeoff. We describe our implementation and report preliminary performance results on two distinct parallel platforms suitable for ABMS: CUDA threads on multiple, networked graphics processing units (GPUs), and pthreads on multi-core processors. The Message Passing Interface (MPI) is used for inter-GPU as well as inter-socket communication on a cluster of multiple GPUs and multi-core processors. Results indicate the benefits of our latency-hiding scheme, delivering over 100-fold improvements in runtime for certain benchmark ABMS application scenarios with several million agents. This speedup is obtained on a system that, on a single GPU, is already two to three orders of magnitude faster than an equivalent CPU-based execution in a popular Java-based simulator. The overall execution of our current work is thus over four orders of magnitude faster when executed on multiple GPUs.

  15. An Agent-based Simulation Model for C. difficile Infection Control

    PubMed Central

    Codella, James; Safdar, Nasia; Heffernan, Rick; Alagoz, Oguzhan

    2014-01-01

    Background. Control of C. difficile infection (CDI) is an increasingly difficult problem for healthcare institutions. There are commonly recommended strategies to combat CDI transmission such as oral vancomycin for CDI treatment, increased hand hygiene with soap and water for healthcare workers, daily environmental disinfection of infected patient rooms, and contact isolation of diseased patients. However, the efficacy of these strategies, particularly for endemic CDI, has not been well studied. The objective of this research is to develop a valid agent-based simulation model (ABM) to study C. difficile transmission and control in a mid-sized hospital. Methods. We develop an ABM of a mid-sized hospital with agents such as patients, healthcare workers, and visitors. We model the natural progression of CDI in a patient using a Markov chain and the transmission of CDI through agent and environmental interactions. We derive input parameters from aggregate patient data from the 2007-2010 Wisconsin Hospital Association and published medical literature. We define a calibration process, which we use to estimate transition probabilities of the Markov model by comparing simulation results to benchmark values found in published literature. Results. Comparing CDI control strategies implemented individually, routine bleach disinfection of CDI+ patient rooms provides the largest reduction in nosocomial asymptomatic colonizations (21.8%) and nosocomial CDIs (42.8%). Additionally, vancomycin treatment provides the largest reduction in relapse CDIs (41.9%), CDI-related mortalities (68.5%), and total patient LOS (21.6%). Conclusion. We develop a generalized ABM for CDI control that can be customized and further expanded to specific institutions and/or scenarios. Additionally, we estimate transition probabilities for a Markov model of natural CDI progression in a patient through calibration. PMID:25112595
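A natural-progression Markov chain of the kind the abstract describes can be sketched as follows; the states and transition probabilities below are illustrative placeholders, not the calibrated values estimated in the paper:

```python
import random

# Illustrative daily transition probabilities (NOT the paper's calibrated
# values): susceptible -> colonized -> diseased -> resolved (absorbing).
STATES = ["susceptible", "colonized", "diseased", "resolved"]
P = {
    "susceptible": {"susceptible": 0.95, "colonized": 0.05},
    "colonized":   {"colonized": 0.80, "diseased": 0.10, "susceptible": 0.10},
    "diseased":    {"diseased": 0.70, "resolved": 0.30},
    "resolved":    {"resolved": 1.0},
}

def simulate_patient(days=60, seed=5):
    """Walk one patient through the Markov chain for a fixed stay,
    sampling the next state from the current row of P each day."""
    rng = random.Random(seed)
    state, path = "susceptible", []
    for _ in range(days):
        r, acc = rng.random(), 0.0
        for nxt, p in P[state].items():
            acc += p
            if r < acc:
                state = nxt
                break
        path.append(state)
    return path

path = simulate_patient()
```

Calibration, as described in the abstract, would then consist of adjusting the entries of `P` until population-level simulation outputs match published benchmark rates.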

  16. Agent-based evacuation simulation for spatial allocation assessment of urban shelters

    NASA Astrophysics Data System (ADS)

    Yu, Jia; Wen, Jiahong; Jiang, Yong

    2015-12-01

The construction of urban shelters is one of the most important tasks in urban planning and disaster prevention, and spatial allocation assessment is a fundamental prerequisite for the spatial location-allocation of urban shelters. This paper introduces a new method that uses agent-based technology to implement evacuation simulation and thereby conduct dynamic spatial allocation assessment of urban shelters. The method can not only accomplish traditional geospatial evaluation of urban shelters, but also simulate the evacuation of residents to shelters. The advantage of this method lies in three aspects: (1) the evacuation time of each citizen from a residential building to a shelter can be estimated more reasonably; (2) the total evacuation time of all the residents in a region can be obtained; (3) road congestion during evacuation to shelters can be detected, so that precautionary measures can be taken against potential risks. In this study, three types of agents are designed: shelter agents, government agents and resident agents. Shelter agents select specified land uses as shelter candidates for different disasters. Government agents delimit the service area of each shelter, that is, they determine which shelter a person should use, in accordance with administrative boundaries and the road distance between the person's position and the location of the shelter. Resident agents have a series of attributes, such as age, position and walking speed, and several behaviors, such as reducing speed when walking in a crowd and helping old people and children. Integrating these three types of interrelated agents, evacuation procedures can be simulated and dynamic allocation assessment of shelters achieved. A case study in Jing'an District, Shanghai, China, was conducted to demonstrate the feasibility of the method. A scenario of earthquake disaster which occurs in nighttime
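The per-resident evacuation-time estimate in aspect (1) can be sketched as a shortest-path computation over a road network; the network, walking speeds, and resident identifiers below are hypothetical:

```python
import heapq

def evacuation_times(graph, shelter, speeds):
    """Shortest road distance from every node to the shelter (Dijkstra),
    divided by each resident's walking speed. `graph` maps
    node -> {neighbor: distance_m}; `speeds` maps
    node -> {resident_id: walking_speed_m_per_s}."""
    dist = {shelter: 0.0}
    pq = [(0.0, shelter)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return {node: {rid: dist[node] / s for rid, s in residents.items()}
            for node, residents in speeds.items()}

# Hypothetical three-node network: shelter S reached via B.
roads = {"A": {"B": 300}, "B": {"A": 300, "S": 200}, "S": {"B": 200}}
times = evacuation_times(roads, "S", {"A": {"r1": 1.2}, "B": {"r2": 0.8}})
```

A full model would then lower the speeds dynamically on congested edges, which is what lets the method detect the congestion described in aspect (3).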

  17. Age-correlated stress resistance improves fitness of yeast: support from agent-based simulations

    PubMed Central

    2014-01-01

    Background Resistance to stress is often heterogeneous among individuals within a population, which helps protect against intermittent stress (bet hedging). This is also the case for heat shock resistance in the budding yeast Saccharomyces cerevisiae. Interestingly, the resistance appears to be continuously distributed (vs. binary, switch-like) and correlated with replicative age (vs. random). Older, slower-growing cells are more resistant than younger, faster-growing ones. Is there a fitness benefit to age-correlated stress resistance? Results Here this hypothesis is explored using a simple agent-based model, which simulates a population of individual cells that grow and replicate. Cells age by accumulating damage, which lowers their growth rate. They synthesize trehalose at a metabolic cost, which helps protect against heat shock. Proteins Tsl1 and Tps3 (trehalose synthase complex regulatory subunit TSL1 and TPS3) represent the trehalose synthesis complex and they are expressed using constant, age-dependent and stochastic terms. The model was constrained by calibration and comparison to data from the literature, including individual-based observations obtained using high-throughput microscopy and flow cytometry. A heterogeneity network was developed, which highlights the predominant sources and pathways of resistance heterogeneity. To determine the best trehalose synthesis strategy, model strains with different Tsl1/Tps3 expression parameters were placed in competition in an environment with intermittent heat shocks. Conclusions For high severities and low frequencies of heat shock, the winning strain used an age-dependent bet hedging strategy, which shows that there can be a benefit to age-correlated stress resistance. The study also illustrates the utility of combining individual-based observations and modeling to understand mechanisms underlying population heterogeneity, and the effect on fitness. PMID:24529069

  18. Understanding coupled natural and human systems on fire prone landscapes: integrating wildfire simulation into an agent based planning system.

    NASA Astrophysics Data System (ADS)

    Barros, Ana; Ager, Alan; Preisler, Haiganoush; Day, Michelle; Spies, Tom; Bolte, John

    2015-04-01

Agent-based models (ABM) allow users to examine the long-term effects of agent decisions in complex systems where multiple agents and processes interact. This framework has potential application to studying the dynamics of coupled natural and human systems, where multiple stimuli determine trajectories over both space and time. We used Envision, a landscape-based ABM, to analyze long-term wildfire dynamics in a heterogeneous, multi-owner landscape in Oregon, USA. Landscape dynamics are affected by land management policies, actors' decisions, and autonomous processes such as vegetation succession, wildfire and, at a broader scale, climate change. Key questions include: 1) How are landscape dynamics influenced by policies and institutions? 2) How do land management policies and actors' decisions interact to produce intended and unintended consequences with respect to wildfire on fire-prone landscapes? Applying Envision to address these questions required the development of a wildfire module that could accurately simulate wildfires on the heterogeneous landscapes within the study area, in terms of replicating the historical fire size distribution, spatial distribution and fire intensity. In this paper we describe the development and testing of a mechanistic fire simulation system within Envision and the application of the model on a 3.2 million fire-prone landscape in central Oregon, USA. The core fire spread equations use the Minimum Travel Time (MTT) algorithm developed by M. Finney. The model operates on a daily time step and uses a fire prediction system based on the relationship between the energy release component (ERC) and historical fires. Specifically, daily wildfire probabilities and sizes are generated from statistical analyses of historical fires in relation to daily ERC values. The MTT algorithm was coupled with the vegetation dynamics module in Envision to allow communication between the respective subsystems and to model fire effects and vegetation dynamics after a wildfire.
Canopy and

  19. An operational epidemiological model for calibrating agent-based simulations of pandemic influenza outbreaks.

    PubMed

    Prieto, D; Das, T K

    2016-03-01

The uncertainty of pandemic influenza viruses continues to pose major preparedness challenges for public health policymakers. Decisions to mitigate influenza outbreaks often involve a tradeoff between the social costs of interventions (e.g., school closure) and the cost of uncontrolled spread of the virus. To achieve a balance, policymakers must assess the impact of mitigation strategies once an outbreak begins and the virus characteristics are known. Agent-based (AB) simulation is a useful tool for building highly granular disease spread models that incorporate the epidemiological features of the virus as well as the demographic and social behavioral attributes of tens of millions of affected people. Such disease spread models provide an excellent basis on which various mitigation strategies can be tested before they are adopted and implemented by policymakers. However, to serve as a testbed for mitigation strategies, AB simulation models must be operational, and a critical requirement for operational AB models is that they be amenable to quick and simple calibration. The calibration process works as follows: the AB model accepts information available from the field and uses it to update its parameters such that some of its outputs in turn replicate the field data. In this paper, we present our epidemiological-model-based calibration methodology, which has low computational complexity and is easy to interpret. Our model accepts a field estimate of the basic reproduction number and uses it to update (calibrate) the infection probabilities such that their effect, combined with the effects of the given virus epidemiology, demographics, and social behavior, results in an infection pattern yielding a similar value of the basic reproduction number. We evaluate the accuracy of the calibration methodology by applying it to an AB simulation model mimicking a regional outbreak in the US.
The calibrated model is shown to yield infection patterns closely replicating
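The calibration loop described above, adjusting infection probabilities until the simulation reproduces a field estimate of the basic reproduction number, can be sketched with a crude stand-in for the AB model and a simple bisection search; the contact rates are illustrative and the paper's actual procedure is more elaborate:

```python
import random

def simulated_r0(p_infect, contacts_per_day=10, infectious_days=5,
                 trials=2000, seed=6):
    """Mean secondary infections of one index case in a fully susceptible
    population: a crude, fixed-seed stand-in for the AB model's output."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        for _ in range(contacts_per_day * infectious_days):
            total += rng.random() < p_infect
    return total / trials

def calibrate(target_r0, lo=0.0, hi=1.0, iters=25):
    """Bisect on the per-contact infection probability until the
    simulated R0 matches the field estimate."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if simulated_r0(mid) < target_r0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

p = calibrate(1.4)  # field estimate of R0 = 1.4
```

Because the fixed seed makes `simulated_r0` a monotone function of `p_infect`, bisection converges; a stochastic AB model would instead average several replications per evaluation.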

  20. Modelling Temporal Schedule of Urban Trains Using Agent-Based Simulation and NSGA2-BASED Multiobjective Optimization Approaches

    NASA Astrophysics Data System (ADS)

    Sahelgozin, M.; Alimohammadi, A.

    2015-12-01

Increasing distances between locations of residence and services lead to a large number of daily commutes in urban areas, and developing subway systems has drawn the attention of transportation managers as a response to this huge travel demand. In the development of subway infrastructure, producing a temporal schedule for trains is an important task, because an appropriately designed timetable decreases total passenger travel time, total operation costs and the energy consumption of trains. Since these variables are not positively correlated, subway scheduling is a multi-criteria optimization problem, and proposing a proper solution for subway scheduling has always been a challenging issue. Research on a phenomenon, moreover, requires a simplified representation of the real world, known as a model. In this study, we model the temporal schedule of urban trains for application to Multi-Criteria Subway Schedule Optimization (MCSSO) problems. First, a conceptual framework is presented for MCSSO. Then, an agent-based simulation environment is implemented to perform a sensitivity analysis (SA) that is used to extract the interrelations between the framework components. These interrelations are then taken into account in constructing the proposed model. To evaluate the performance of the model on MCSSO problems, Tehran subway line no. 1 is used as the case study. Results show that the model was able to generate an acceptable distribution of Pareto-optimal solutions applicable to real-world MCSSO problems, and that its accuracy in representing the operation of subway systems was significant.
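The multi-criteria character of the problem can be illustrated with the non-dominated filter that NSGA-II builds its ranking on; the objective values below (mean passenger wait in minutes, trains per hour) are invented for illustration:

```python
def pareto_front(points):
    """Return the non-dominated points, minimizing every objective.
    A point is dominated if some other point is no worse in all
    objectives and differs in at least one."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points)
        if not dominated:
            front.append(p)
    return front

# (mean wait, trains per hour) for candidate timetables: short headways
# cut waiting but need more trains, plus one deliberately inefficient
# timetable that wastes trains without reducing waits.
pts = [(1.0, 30.0), (2.0, 15.0), (3.0, 10.0), (3.5, 30.0)]
front = pareto_front(pts)
# → (3.5, 30.0) is dominated by (1.0, 30.0) and drops out of the front
```

NSGA-II repeatedly applies this filter to rank a population into fronts, then breeds new timetables biased toward the better-ranked, more spread-out solutions.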

  1. Comparing administered and market-based water allocation systems through a consistent agent-based modeling framework.

    PubMed

    Zhao, Jianshi; Cai, Ximing; Wang, Zhongjing

    2013-07-15

    Water allocation can be undertaken through administered systems (AS), market-based systems (MS), or a combination of the two. The debate on the performance of the two systems has lasted for decades but still calls for attention in both research and practice. This paper compares water users' behavior under AS and MS through a consistent agent-based modeling framework for water allocation analysis that incorporates variables particular to both MS (e.g., water trade and trading prices) and AS (water use violations and penalties/subsidies). Analogous to the economic theory of water markets under MS, the theory of rational violation justifies the exchange of entitled water under AS through the use of cross-subsidies. Under water stress conditions, a unique water allocation equilibrium can be achieved by following a simple bargaining rule that does not depend upon initial market prices under MS, or initial economic incentives under AS. The modeling analysis shows that the behavior of water users (agents) depends on transaction, or administrative, costs, as well as their autonomy. Reducing transaction costs under MS or administrative costs under AS will mitigate the effect that equity constraints (originating with primary water allocation) have on the system's total net economic benefits. Moreover, hydrologic uncertainty is shown to increase market prices under MS and penalties/subsidies under AS and, in most cases, also increases transaction, or administrative, costs. PMID:23597927
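The simple bargaining rule described above can be sketched with two users whose benefits are quadratic in water use; the functional form, the step size, and all numbers are illustrative assumptions, not the paper's model:

```python
def bargain(w1, w2, a1, a2, tc=0.0, step=0.01, iters=1000, eps=1e-9):
    """Water moves from the user with the lower marginal benefit to the
    one with the higher, until the marginal-benefit gap no longer covers
    the transaction (or administrative) cost tc. Benefits are quadratic,
    B_i(w) = a_i*w - w**2/2, so the marginal benefit is a_i - w."""
    for _ in range(iters):
        mb1, mb2 = a1 - w1, a2 - w2
        if mb1 - mb2 > tc + eps:
            w1, w2 = w1 + step, w2 - step
        elif mb2 - mb1 > tc + eps:
            w1, w2 = w1 - step, w2 + step
        else:
            break  # gap within the cost band: no further trade pays off
    return w1, w2

# Equal initial entitlements but unequal productivity, zero cost:
# marginal benefits equalize at w1 = 7, w2 = 3.
w1, w2 = bargain(5.0, 5.0, a1=12.0, a2=8.0)
```

Raising `tc` leaves a wider residual marginal-benefit gap and stops the exchange earlier, which is the mechanism behind the abstract's point that transaction or administrative costs limit the gains from reallocation.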

  3. An integrated modeling framework of socio-economic, biophysical, and hydrological processes in Midwest landscapes: Remote sensing data, agro-hydrological model, and agent-based model

    NASA Astrophysics Data System (ADS)

    Ding, Deng

Intensive human-environment interactions are taking place in Midwestern agricultural systems. An integrated modeling framework is suitable for predicting the dynamics of key variables of socio-economic, biophysical and hydrological processes, as well as for exploring potential transitions of system states in response to changes in the driving factors. The purpose of this dissertation is to address issues concerning these interacting processes and the consequent changes in land use, water balance, and water quality using an integrated modeling framework. The dissertation comprises three studies of the same agricultural watershed, the Clear Creek watershed in East-Central Iowa. In the first study, a parsimonious hydrologic model, the Threshold-Exceedance-Lagrangian Model (TELM), is further developed into RS-TELM (Remote Sensing TELM) to integrate remote sensing vegetation data for estimating evapotranspiration. The goodness of fit of RS-TELM is comparable to that of a well-calibrated SWAT (Soil and Water Assessment Tool) and even slightly superior in capturing the intra-seasonal variability of stream flow. The integration of RS LAI (Leaf Area Index) data improves the model's performance, especially over agriculture-dominated landscapes, and the input of spatially explicit rainfall datasets plays a critical role in increasing the model's goodness of fit. In the second study, an agent-based model is developed to simulate farmers' decisions on crop type and fertilizer application in response to commodity and biofuel crop prices. The comparison of simulated cropland percentages and crop rotations with satellite-based land cover data suggests that farmers may be underestimating the effect that continuous corn production has on yields (yield drag).
The simulation results given alternative market scenarios based on a survey of agricultural land owners and operators in the Clear Creek Watershed show that, farmers see cellulosic biofuel feedstock production in the form

  4. Simulating Land-Use Change using an Agent-Based Land Transaction Model

    NASA Astrophysics Data System (ADS)

    Bakker, M. M.; van Dijk, J.; Alam, S. J.

    2013-12-01

In the densely populated cultural landscapes of Europe, the vast majority of all land is owned by private parties, be it farmers (the majority), nature organizations, property developers, or citizens. As a result, the vast majority of all land-use change arises from land transactions between different owner types: successful farms expand at the expense of less successful farms, while property developers, individual citizens, and nature organizations also actively purchase land. These land transactions are driven by specific properties of the land, by governmental policies, and by the (economic) motives of both buyers and sellers. Climate/global change can affect these drivers at various scales: at the local scale, changes in hydrology can make certain land less or more desirable; at the global scale, agricultural markets affect farmers' motives to buy or sell land; while at intermediate (e.g. provincial) scales, property developers and nature conservationists may be encouraged or discouraged from purchasing land. The cumulative result of all these transactions becomes manifest in changing land-use patterns and consequent environmental responses. Within the project Climate Adaptation for Rural Areas, an agent-based land-use model was developed that explores the future response of individual land users to climate change within the context of wider global change (i.e. policy and market change). It simulates the exchange of land among farmers and between farmers and nature organizations and property developers for a specific case study area in the east of the Netherlands. Results show that the local impacts of climate change can result in a relative stagnation of the land market in waterlogged areas. Furthermore, the increase in dairying at the expense of arable cultivation - as has been observed in the area in the past - is slowing down, as arable produce shows a favourable trend in the agricultural world market. Furthermore, budgets for nature managers are

  5. Impact of Different Policies on Unhealthy Dietary Behaviors in an Urban Adult Population: An Agent-Based Simulation Model

    PubMed Central

    Giabbanelli, Philippe J.; Arah, Onyebuchi A.; Zimmerman, Frederick J.

    2014-01-01

    Objectives. Unhealthy eating is a complex-system problem. We used agent-based modeling to examine the effects of different policies on unhealthy eating behaviors. Methods. We developed an agent-based simulation model to represent a synthetic population of adults in Pasadena, CA, and how they make dietary decisions. Data from the 2007 Food Attitudes and Behaviors Survey and other empirical studies were used to calibrate the parameters of the model. Simulations were performed to contrast the potential effects of various policies on the evolution of dietary decisions. Results. Our model showed that a 20% increase in taxes on fast foods would lower the probability of fast-food consumption by 3 percentage points, whereas improving the visibility of positive social norms by 10%, either through community-based or mass-media campaigns, could improve the consumption of fruits and vegetables by 7 percentage points and lower fast-food consumption by 6 percentage points. Zoning policies had no significant impact. Conclusions. Interventions emphasizing healthy eating norms may be more effective than directly targeting food prices or regulating local food outlets. Agent-based modeling may be a useful tool for testing the population-level effects of various policies within complex systems. PMID:24832414
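A decision rule of the kind the model encodes, where the probability of choosing fast food responds to price and to the visibility of healthy-eating norms, can be sketched as follows; the coefficients are invented for illustration and are not the calibrated survey-based parameters:

```python
import random

def fastfood_share(price=5.0, norm_visibility=0.0, n=5000, seed=7):
    """Each agent independently chooses fast food with a probability that
    falls with price and with the visibility of positive social norms
    (linear response; all coefficients are illustrative)."""
    rng = random.Random(seed)
    base = 0.5
    p = max(0.0, base - 0.03 * (price - 5.0) - 0.4 * norm_visibility)
    return sum(rng.random() < p for _ in range(n)) / n

baseline = fastfood_share()
taxed = fastfood_share(price=5.0 * 1.2)        # 20% fast-food tax
norms = fastfood_share(norm_visibility=0.10)   # 10% better norm visibility
```

Comparing population shares across such policy scenarios is the basic experiment behind the paper's tax-vs-norms comparison, though the real model couples agents through evolving social networks rather than a fixed coefficient.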

  6. The effects of social interactions on fertility decline in nineteenth-century France: an agent-based simulation experiment.

    PubMed

    González-Bailón, Sandra; Murphy, Tommy E

    2013-07-01

    We built an agent-based simulation, incorporating geographic and demographic data from nineteenth-century France, to study the role of social interactions in fertility decisions. The simulation made experimentation possible in a context where other empirical strategies were precluded by a lack of data. We evaluated how different decision rules, with and without interdependent decision-making, caused variations in population growth and fertility levels. The analyses show that incorporating social influence into the model allows empirically observed behaviour to be mimicked, especially at a national level. These findings shed light on individual-level mechanisms through which the French demographic transition may have developed.
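The effect of interdependent decision-making can be sketched by comparing runs with and without a peer-influence term; all parameters are illustrative, not estimates from the nineteenth-century French data:

```python
import random

def fertility_sim(n=500, steps=40, p_adopt_alone=0.02, peer_weight=0.3,
                  social=True, seed=8):
    """Each step, a couple adopts fertility limitation with a base
    probability plus (optionally) a term proportional to the adopting
    fraction of the population. Returns the final adopting share."""
    rng = random.Random(seed)
    limiting = [False] * n
    for _ in range(steps):
        frac = sum(limiting) / n
        for i in range(n):
            p = p_adopt_alone + (peer_weight * frac if social else 0.0)
            if not limiting[i] and rng.random() < p:
                limiting[i] = True
    return sum(limiting) / n

with_social = fertility_sim(social=True)
without_social = fertility_sim(social=False)
```

The gap between the two runs is the kind of signal the authors exploit: only decision rules with social influence reproduce the speed of the observed French fertility decline.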

  7. Multi-Agent Based Simulation of Optimal Urban Land Use Allocation in the Middle Reaches of the Yangtze River, China

    NASA Astrophysics Data System (ADS)

    Zeng, Y.; Huang, W.; Jin, W.; Li, S.

    2016-06-01

The optimization of land-use allocation is an important approach to achieving regional sustainable development. This study selects the Chang-Zhu-Tan agglomeration as the study area and proposes a new land use optimization allocation model. Using a multi-agent based simulation model, optimal urban land use allocation was simulated for 2020 and 2030 under three different scenarios. This kind of quantitative information about urban land use optimization allocation and future urban expansion would be of great interest for urban planning, water and land resource management, and climate change research.

  8. Applying GIS and high performance agent-based simulation for managing an Old World Screwworm fly invasion of Australia.

    PubMed

    Welch, M C; Kwan, P W; Sajeev, A S M

    2014-10-01

Agent-based modelling has proven to be a promising approach for developing rich simulations for complex phenomena that provide decision support functions across a broad range of areas including biological, social and agricultural sciences. This paper demonstrates how high performance computing technologies, namely General-Purpose Computing on Graphics Processing Units (GPGPU), and commercial Geographic Information Systems (GIS) can be applied to develop a national scale, agent-based simulation of an incursion of Old World Screwworm fly (OWS fly) into the Australian mainland. The development of this simulation model leverages the combination of massively data-parallel processing capabilities supported by NVidia's Compute Unified Device Architecture (CUDA) and the advanced spatial visualisation capabilities of GIS. These technologies have enabled the implementation of an individual-based, stochastic lifecycle and dispersal algorithm for the OWS fly invasion. The simulation model draws upon a wide range of biological data as input to stochastically determine the reproduction and survival of the OWS fly through the different stages of its lifecycle and dispersal of gravid females. Through this model, a highly efficient computational platform has been developed for studying the effectiveness of control and mitigation strategies and their associated economic impact on livestock industries. PMID:24705073

  10. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department.

    PubMed

    Kittipittayakorn, Cholada; Ying, Kuo-Ching

    2016-01-01

    Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department. PMID:27195606
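The DES component described above can be illustrated with a minimal sketch. The model below is not the authors' orthopedic-department model; it is a hypothetical single-doctor, first-come-first-served clinic with exponential arrival and service times, showing how a discrete-event simulation turns those assumptions into an average waiting time (all parameter values and names are illustrative):

```python
import random

def simulate_clinic(n_patients=200, service_mean=8.0, arrival_mean=6.0, seed=1):
    """Minimal discrete-event sketch: one doctor, FIFO queue.
    Returns the average patient waiting time in minutes."""
    rng = random.Random(seed)
    # Generate a stream of walk-in arrival times (exponential gaps).
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / arrival_mean)
        arrivals.append(t)
    server_free_at = 0.0
    total_wait = 0.0
    for arrive in arrivals:
        start = max(arrive, server_free_at)   # wait if the doctor is busy
        total_wait += start - arrive
        server_free_at = start + rng.expovariate(1.0 / service_mean)
    return total_wait / n_patients
```

Running this with a shorter service time yields a lower average wait, which is the kind of what-if comparison the integrated DES/ABS study performs at much higher fidelity.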

  11. Linking Bayesian and Agent-Based Models to Simulate Complex Social-Ecological Systems in the Sonoran Desert

    NASA Astrophysics Data System (ADS)

    Pope, A.; Gimblett, R.

    2013-12-01

Interdependencies of ecologic, hydrologic, and social systems challenge traditional approaches to natural resource management in semi-arid regions. As a complex social-ecological system, water demands in the Sonoran Desert from agricultural and urban users often conflict with water needs for its ecologically-significant riparian corridors. To explore this system, we developed an agent-based model to simulate complex feedbacks between human decisions and environmental conditions. Cognitive mapping in conjunction with stakeholder participation produced a Bayesian model of the conditional probabilities of local human decision-making processes resulting in changes in water demand. Probabilities created in the Bayesian model were incorporated into the agent-based model, so that each agent had a unique probability to make a positive decision based on its perceived environment at each point in time and space. By using a Bayesian approach, uncertainty in the human decision-making process could be incorporated. The spatially-explicit agent-based model simulated changes in depth-to-groundwater caused by well pumping based on an agent's water demand. Depth-to-groundwater was then used as an indicator of unique vegetation guilds within the riparian corridor. Each vegetation guild provides varying levels of ecosystem services, the changes of which, along with changes in depth-to-groundwater, feed back to influence agent behavior. Using this modeling approach allowed us to examine the resilience of semi-arid riparian corridors and agent behavior under various scenarios. The insight provided by the model contributes to understanding how specific interventions may alter this complex social-ecological system in the future.

  12. Can human-like Bots control collective mood: agent-based simulations of online chats

    NASA Astrophysics Data System (ADS)

    Tadić, Bosiljka; Šuvakov, Milovan

    2013-10-01

    Using an agent-based modeling approach, in this paper, we study self-organized dynamics of interacting agents in the presence of chat Bots. Different Bots with tunable ‘human-like’ attributes, which exchange emotional messages with agents, are considered, and the collective emotional behavior of agents is quantitatively analyzed. In particular, using detrended fractal analysis we determine persistent fluctuations and temporal correlations in time series of agent activity and statistics of avalanches carrying emotional messages of agents when Bots favoring positive/negative affects are active. We determine the impact of Bots and identify parameters that can modulate that impact. Our analysis suggests that, by these measures, the emotional Bots induce collective emotion among interacting agents by suitably altering the fractal characteristics of the underlying stochastic process. Positive emotion Bots are slightly more effective than negative emotion Bots. Moreover, Bots which periodically alternate between positive and negative emotion can enhance fluctuations in the system, leading to avalanches of agent messages that are reminiscent of self-organized critical states.
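The detrended analysis mentioned above can be sketched compactly. The function below is a generic detrended fluctuation analysis, not the authors' implementation: it integrates a mean-removed series, removes a linear trend in windows of increasing size, and reads the scaling exponent off the log-log slope (white noise gives an exponent near 0.5; persistent, correlated signals give larger values):

```python
import numpy as np

def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
    """Detrended fluctuation analysis: slope of log F(n) vs log n."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())          # integrated signal
    flucts = []
    for n in scales:
        n_win = len(profile) // n
        f2 = []
        for i in range(n_win):
            seg = profile[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)       # linear detrend per window
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(0)
alpha = dfa_exponent(rng.standard_normal(4096))  # close to 0.5 for white noise
```

Applied to the agent-activity time series of the chat model, a shift in this exponent is one way the Bots' influence on the underlying stochastic process could be quantified.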

  13. Evaluating the effect of human activity patterns on air pollution exposure using an integrated field-based and agent-based modelling framework

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; Beelen, Rob M. J.; de Bakker, Merijn P.; Karssenberg, Derek

    2015-04-01

Constructing spatio-temporal numerical models to support risk assessment, such as assessing human exposure to air pollution, often requires the integration of field-based and agent-based modelling approaches. Continuous environmental variables such as air pollution are best represented using the field-based approach, which considers phenomena as continuous fields having attribute values at all locations. When calculating human exposure to such pollutants it is, however, preferable to consider the population as a set of individuals, each with a particular activity pattern. This makes it possible to account for the spatio-temporal variation in a pollutant along the space-time paths travelled by individuals, determined, for example, by home and work locations, the road network, and travel times. Modelling this activity pattern requires an agent-based or individual-based modelling approach. In general, field- and agent-based models are constructed with the help of separate software tools, although the two approaches should interact and preferably be combined into one modelling framework, which would allow domain specialists to implement models efficiently and effectively. To overcome this lack of integrated modelling frameworks, we aim at the development of concepts and software for an integrated field-based and agent-based modelling framework. Concepts merging field- and agent-based modelling were implemented by extending PCRaster (http://www.pcraster.eu), a field-based modelling library implemented in C++, with components for 1) representation of discrete, mobile agents, 2) spatial networks and algorithms, by integrating the NetworkX library (http://networkx.github.io), therefore allowing the calculation of, e.g., shortest routes or total transport costs between locations, and 3) functions for field-network interactions, allowing field-based attribute values to be assigned to networks (i.e. as edge weights), such as aggregated or averaged
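The field-network interaction described in point 3) can be illustrated without PCRaster or NetworkX. The sketch below uses a hypothetical 3x3 pollution field and plain-Python Dijkstra; the grid, field values, and function names are illustrative, not part of the framework:

```python
import heapq

# Hypothetical 3x3 pollution field (higher value = more exposure).
FIELD = {
    (0, 0): 1, (1, 0): 1, (2, 0): 1,
    (0, 1): 9, (1, 1): 9, (2, 1): 1,
    (0, 2): 1, (1, 2): 1, (2, 2): 1,
}

def neighbors(c):
    x, y = c
    for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
        if (nx, ny) in FIELD:
            yield (nx, ny)

def least_exposure_route(src, dst):
    """Dijkstra where each edge weight is the mean field value of its
    endpoints, i.e. the field attribute assigned to the network."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v in neighbors(u):
            w = (FIELD[u] + FIELD[v]) / 2.0    # field value -> edge weight
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1], dist[dst]
```

On this field the least-exposure route detours around the polluted cells, which is exactly the kind of query (a shortest route under field-derived edge weights) the framework delegates to NetworkX.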

  14. Real-Time Agent-Based Modeling Simulation with in-situ Visualization of Complex Biological Systems

    PubMed Central

    Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y. K.

    2016-01-01

    We present an efficient and scalable scheme for implementing agent-based modeling (ABM) simulation with In Situ visualization of large complex systems on heterogeneous computing platforms. The scheme is designed to make optimal use of the resources available on a heterogeneous platform consisting of a multicore CPU and a GPU, resulting in minimal to no resource idle time. Furthermore, the scheme was implemented under a client-server paradigm that enables remote users to visualize and analyze simulation data as it is being generated at each time step of the model. Performance of a simulation case study of vocal fold inflammation and wound healing with 3.8 million agents shows 35× and 7× speedup in execution time over single-core and multi-core CPU respectively. Each iteration of the model took less than 200 ms to simulate, visualize and send the results to the client. This enables users to monitor the simulation in real-time and modify its course as needed. PMID:27547508

  15. How to determine future EHR ROI. Agent-based modeling and simulation offers a new alternative to traditional techniques.

    PubMed

    Blachowicz, Dariusz; Christiansen, John H; Ranginani, Archana; Simunich, Kathy Lee

    2008-01-01

    Effectively determining the future return-on-investment of regional healthcare delivery and electronic healthcare record systems requires consideration of many alternative designs for their performance, cost and ability to meet stakeholder expectations. Successfully testing, validating and communicating the expected consequences of alternative business practices, processes, protocols and policies requires an objective analytical approach. Agent-based modeling and simulation (ABMS), a technique for determining the system-level results of complex, interacting, and often conflicting individual-level decisions, provides such an approach. ABMS of healthcare delivery can provide actionable guidance for decision makers by enabling healthcare experts to define the individual, agent-level rules of operation; allowing them to see how the agent rules play out over time in a detailed real-world context; providing them with the tools to assess the consequences of alternative plans; and giving them a clear method for communicating results to the broader stakeholder community.

  16. Agent-Based Simulations of Malaria Transmissions with Applications to a Study Site in Thailand

    NASA Technical Reports Server (NTRS)

    Kiang, Richard K.; Adimi, Farida; Zollner, Gabriela E.; Coleman, Russell E.

    2006-01-01

    The dynamics of malaria transmission are driven by environmental, biotic and socioeconomic factors. Because of the geographic dependency of these factors and the complex interactions among them, it is difficult to generalize the key factors that perpetuate or intensify malaria transmission. Methods: Discrete event simulations were used for modeling the detailed interactions among the vector life cycle, sporogonic cycle and human infection cycle, under the explicit influences of selected extrinsic and intrinsic factors. Meteorological and environmental parameters may be derived from satellite data. The output of the model includes the individual infection status and the quantities normally observed in field studies, such as mosquito biting rates, sporozoite infection rates, gametocyte prevalence and incidence. Results were compared with mosquito vector and human malaria data acquired over 4.5 years (June 1999 - January 2004) in Kong Mong Tha, a remote village in Kanchanaburi Province, western Thailand. Results: Three years of transmissions of vivax and falciparum malaria were simulated for a hypothetical hamlet with approximately 1/7 of the study site population. The model generated results for a number of scenarios, including applications of larvicide and insecticide, asymptomatic cases receiving or not receiving treatment, blocking malaria transmission in mosquito vectors, and increasing the density of farm (host) animals in the hamlet. Transmission characteristics and trends in the simulated results are comparable to actual data collected at the study site.

  17. Investigation of Simulated Trading — A multi agent based trading system for optimization purposes

    NASA Astrophysics Data System (ADS)

    Schneider, Johannes J.

    2010-07-01

Some years ago, Bachem, Hochstättler, and Malich proposed a heuristic algorithm called Simulated Trading for the optimization of vehicle routing problems. Computational agents place buy-orders and sell-orders for customers to be handled at a virtual financial market, with the prices of the orders depending on the cost of inserting the customer into the tour or removing him from it. According to a proposed rule set, the financial market creates a buy-and-sell graph for the various orders in the order book, intending to optimize the overall system. Here I present a thorough investigation of the application of this algorithm to the traveling salesman problem.

  18. The contribution of agent-based simulations to conservation management on a Natura 2000 site.

    PubMed

    Dupont, Hélène; Gourmelon, Françoise; Rouan, Mathias; Le Viol, Isabelle; Kerbiriou, Christian

    2016-03-01

The conservation of biodiversity today must include the participation and support of local stakeholders. Natura 2000 can be considered a conservation system that, in its application in most EU countries, relies on the participation of local stakeholders. Our study proposes a scientific method for participatory modelling, with the aim of contributing to the conservation management of habitats and species at a Natura 2000 site (Crozon Peninsula, Bretagne, France) that is representative of land-use changes in coastal areas. We make use of companion modelling and its associated tools (scenario planning, GIS, multi-agent modelling and simulations) to consider possible futures through the co-construction of management scenarios and an understanding of their consequences on different indicators of biodiversity status (habitats, avifauna, flora). The maintenance of human activities as they have been carried out since the creation of the Natura 2000 zone allows the biodiversity values to remain stable. Extensive agricultural activities have been shown to be essential to this maintenance, whereas management sustained by the multiplication of conservation actions brings about variable results according to the indicators. None of the scenarios has a positive incidence on the full set of indicators. However, an understanding of the modelling system and the results of the simulations allows for refining the selection of conservation actions in relation to the species to be preserved. PMID:26696603

  19. ActivitySim: large-scale agent based activity generation for infrastructure simulation

    SciTech Connect

    Gali, Emmanuel; Eidenbenz, Stephan; Mniszewski, Sue; Cuellar, Leticia; Teuscher, Christof

    2008-01-01

The United States Department of Homeland Security aims to model, simulate, and analyze critical infrastructure and its interdependencies across multiple sectors such as electric power, telecommunications, water distribution, transportation, etc. We introduce ActivitySim, an activity simulator for a population of millions of individual agents, each characterized by a set of demographic attributes based on US census data. ActivitySim generates daily schedules for each agent, consisting of a sequence of activities, such as sleeping, shopping, and working, each scheduled at a geographic location, such as a business or private residence, that is appropriate for the activity type and for the personal situation of the agent. ActivitySim has been developed as part of a larger effort to understand the interdependencies among national infrastructure networks and the demand profiles that emerge from the different activities of individuals in baseline scenarios as well as emergency scenarios, such as hurricane evacuations. We present the scalable software engineering principles underlying ActivitySim, the socio-technical modeling paradigms that drive the activity generation, and proof-of-principle results for a scenario in the Twin Cities, MN area with 2.6 M agents.

  20. Evolutionary Agent-Based Simulation of the Introduction of New Technologies in Air Traffic Management

    NASA Technical Reports Server (NTRS)

    Yliniemi, Logan; Agogino, Adrian K.; Tumer, Kagan

    2014-01-01

Accurate simulation of the effects of integrating new technologies into a complex system is critical to the modernization of our antiquated air traffic system, in which there exist many layers of interacting procedures, controls, and automation, all designed to cooperate with human operators. Additions of even simple new technologies may result in unexpected emergent behavior due to complex human/machine interactions. One approach is to create high-fidelity human models, drawing on the field of human factors, that can simulate a rich set of behaviors. However, such models are difficult to produce, especially ones that show the unexpected emergent behavior arising from many human operators interacting simultaneously within a complex system. Instead of engineering complex human models, we directly model the emergent behavior by evolving goal-directed agents representing human users. Using evolution, we can predict how the agent representing the human user reacts given his/her goals. In this paradigm, each autonomous agent in a system pursues individual goals, and the behavior of the system emerges from the interactions, foreseen or unforeseen, between the agents/actors. We show that this method reflects the integration of new technologies in a historical case, and apply the same methodology to a possible future technology.

  2. An Agent-Based Simulation for Investigating the Impact of Stereotypes on Task-Oriented Group Formation

    NASA Astrophysics Data System (ADS)

    Maghami, Mahsa; Sukthankar, Gita

    In this paper, we introduce an agent-based simulation for investigating the impact of social factors on the formation and evolution of task-oriented groups. Task-oriented groups are created explicitly to perform a task, and all members derive benefits from task completion. However, even in cases when all group members act in a way that is locally optimal for task completion, social forces that have mild effects on choice of associates can have a measurable impact on task completion performance. In this paper, we show how our simulation can be used to model the impact of stereotypes on group formation. In our simulation, stereotypes are based on observable features, learned from prior experience, and only affect an agent's link formation preferences. Even without assuming stereotypes affect the agents' willingness or ability to complete tasks, the long-term modifications that stereotypes have on the agents' social network impair the agents' ability to form groups with sufficient diversity of skills, as compared to agents who form links randomly. An interesting finding is that this effect holds even in cases where stereotype preference and skill existence are completely uncorrelated.

  3. Simulating Brain Tumor Heterogeneity with a Multiscale Agent-Based Model: Linking Molecular Signatures, Phenotypes and Expansion Rate

    PubMed Central

    Zhang, Le; Strouthos, Costas G.; Wang, Zhihui; Deisboeck, Thomas S.

    2008-01-01

We have extended our previously developed 3D multi-scale agent-based brain tumor model to simulate cancer heterogeneity and to analyze its impact across the scales of interest. While our algorithm continues to employ an epidermal growth factor receptor (EGFR) gene-protein interaction network to determine the cells’ phenotype, it now adds an implicit treatment of tumor cell adhesion related to the model’s biochemical microenvironment. We simulate a simplified tumor progression pathway that leads to the emergence of five distinct glioma cell clones with different EGFR density and cell ‘search precisions’. The in silico results show that microscopic tumor heterogeneity can impact the tumor system’s multicellular growth patterns. Our findings further confirm that a higher EGFR density results in the more aggressive clonal populations switching earlier from a proliferation-dominated to a more migratory phenotype. Moreover, analyzing the dynamic molecular profile that triggers the phenotypic switch between proliferation and migration, our in silico oncogenomics data display spatial and temporal diversity in documenting the regional impact of tumorigenesis, and thus support the added value of multi-site and repeated assessments in vitro and in vivo. Potential implications of this in silico work for experimental and computational studies are discussed. PMID:20047002

  4. An agent-based model to simulate tsetse fly distribution and control techniques: a case study in Nguruman, Kenya

    PubMed Central

    Lin, Shengpan; DeVisser, Mark H.; Messina, Joseph P.

    2015-01-01

    Background African trypanosomiasis, also known as “sleeping sickness” in humans and “nagana” in livestock is an important vector-borne disease in Sub-Saharan Africa. Control of trypanosomiasis has focused on eliminating the vector, the tsetse fly (Glossina, spp.). Effective tsetse fly control planning requires models to predict tsetse population and distribution changes over time and space. Traditional planning models have used statistical tools to predict tsetse distributions and have been hindered by limited field survey data. Methodology/Results We developed an Agent-Based Model (ABM) to provide timing and location information for tsetse fly control without presence/absence training data. The model is driven by daily remotely-sensed environment data. The model provides a flexible tool linking environmental changes with individual biology to analyze tsetse control methods such as aerial insecticide spraying, wild animal control, releasing irradiated sterile tsetse males, and land use and cover modification. Significance This is a bottom-up process-based model with freely available data as inputs that can be easily transferred to a new area. The tsetse population simulation more closely approximates real conditions than those using traditional statistical models making it a useful tool in tsetse fly control planning. PMID:26309347

  5. Multiobjective Decision Making Policies and Coordination Mechanisms in Hierarchical Organizations: Results of an Agent-Based Simulation

    PubMed Central

    2014-01-01

    This paper analyses how different coordination modes and different multiobjective decision making approaches interfere with each other in hierarchical organizations. The investigation is based on an agent-based simulation. We apply a modified NK-model in which we map multiobjective decision making as adaptive walk on multiple performance landscapes, whereby each landscape represents one objective. We find that the impact of the coordination mode on the performance and the speed of performance improvement is critically affected by the selected multiobjective decision making approach. In certain setups, the performances achieved with the more complex multiobjective decision making approaches turn out to be less sensitive to the coordination mode than the performances achieved with the less complex multiobjective decision making approaches. Furthermore, we present results on the impact of the nature of interactions among decisions on the achieved performance in multiobjective setups. Our results give guidance on how to control the performance contribution of objectives to overall performance and answer the question how effective certain multiobjective decision making approaches perform under certain circumstances (coordination mode and interdependencies among decisions). PMID:25152926
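The adaptive walk on multiple performance landscapes can be sketched as follows. This is a generic illustration, not the paper's model: two random NK-style landscapes stand in for two objectives, a weighted sum scalarizes them, and a single agent flips one random decision at a time, accepting only improvements (all parameter values and names are arbitrary):

```python
import random

N, K = 10, 2              # decisions, and interdependencies per decision
rng = random.Random(42)

def contribution(obj, i, neigh):
    """Reproducible random fitness contribution of decision i on
    landscape `obj`, given its own value and its K neighbours."""
    return random.Random(f"{obj}-{i}-{neigh}").random()

def fitness(obj, s):
    return sum(
        contribution(obj, i, tuple(s[(i + j) % N] for j in range(K + 1)))
        for i in range(N)
    ) / N

def weighted(s, w=(0.5, 0.5)):
    """Scalarized multiobjective performance (weighted-sum approach)."""
    return w[0] * fitness(0, s) + w[1] * fitness(1, s)

def adaptive_walk(steps=300):
    s = tuple(rng.randint(0, 1) for _ in range(N))
    trace = [weighted(s)]
    for _ in range(steps):
        i = rng.randrange(N)
        cand = s[:i] + (1 - s[i],) + s[i + 1:]
        if weighted(cand) > weighted(s):   # accept improving flips only
            s = cand
        trace.append(weighted(s))
    return trace

trace = adaptive_walk()
```

The paper's setup layers coordination modes and multiple departmental agents on top of walks like this; the sketch only shows the core mechanic of climbing several landscapes through one scalarized objective.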

  6. A Scaffolding Framework to Support Learning of Emergent Phenomena Using Multi-Agent-Based Simulation Environments

    ERIC Educational Resources Information Center

    Basu, Satabdi; Sengupta, Pratim; Biswas, Gautam

    2015-01-01

    Students from middle school to college have difficulties in interpreting and understanding complex systems such as ecological phenomena. Researchers have suggested that students experience difficulties in reconciling the relationships between individuals, populations, and species, as well as the interactions between organisms and their environment…

  7. An Economic Analysis of Strategies to Control Clostridium Difficile Transmission and Infection Using an Agent-Based Simulation Model

    PubMed Central

    Nelson, Richard E.; Jones, Makoto; Leecaster, Molly; Samore, Matthew H.; Ray, William; Huttner, Angela; Huttner, Benedikt; Khader, Karim; Stevens, Vanessa W.; Gerding, Dale; Schweizer, Marin L.; Rubin, Michael A.

    2016-01-01

Background A number of strategies exist to reduce Clostridium difficile (C. difficile) transmission. We conducted an economic evaluation of “bundling” these strategies together. Methods We constructed an agent-based computer simulation of nosocomial C. difficile transmission and infection in a hospital setting. This model included the following components: interactions between patients and health care workers; room contamination via C. difficile shedding; C. difficile hand carriage and removal via hand hygiene; patient acquisition of C. difficile via contact with contaminated rooms or health care workers; and patient antimicrobial use. Six interventions were introduced alone and "bundled" together: (a) aggressive C. difficile testing; (b) empiric isolation and treatment of symptomatic patients; (c) improved adherence to hand hygiene and (d) contact precautions; (e) improved use of soap and water for hand hygiene; and (f) improved environmental cleaning. Our analysis compared these interventions using values representing 3 different scenarios: (1) base-case (BASE) values that reflect typical hospital practice, (2) intervention (INT) values that represent implementation of hospital-wide efforts to reduce C. difficile transmission, and (3) optimal (OPT) values representing the highest expected results from strong adherence to the interventions. Cost parameters for each intervention were obtained from published literature. We performed our analyses assuming low, normal, and high C. difficile importation prevalence and transmissibility of C. difficile. Results INT levels of the “bundled” intervention were cost-effective at a willingness-to-pay threshold of $100,000/quality-adjusted life-year in all importation prevalence and transmissibility scenarios. OPT levels of intervention were cost-effective for normal and high importation prevalence and transmissibility scenarios. When analyzed separately, hand hygiene compliance, environmental decontamination, and empiric

  8. Emerging patterns in tumor systems: simulating the dynamics of multicellular clusters with an agent-based spatial agglomeration model.

    PubMed

    Mansury, Yuri; Kimura, Mark; Lobo, Jose; Deisboeck, Thomas S

    2002-12-01

    Brain cancer cells invade the surrounding parenchyma early on, which makes it impossible to surgically remove all tumor cells and thus significantly worsens the patient's prognosis. Specific structural elements such as multicellular clusters have been seen in experimental settings to emerge within the invasive cell system and are believed to express the system's guidance toward nutritive sites in a heterogeneous environment. Based on these observations, we developed a novel agent-based model of spatio-temporal search and agglomeration to investigate the dynamics of cell motility and aggregation, under the assumption that tumors behave as complex dynamic self-organizing biosystems. In this model, virtual cells migrate because they are attracted by higher nutrient concentrations and avoid overpopulated areas with high levels of toxic metabolites. A specific feature of our model is the capability of cells to search both globally and locally. This concept is applied to simulate cell-surface receptor-mediated information processing of tumor cells, such that a cell searching for a more growth-permissive place "learns" the information content of a brain-tissue region within a two-dimensional lattice in two stages, processing first the global and then the local input. In both stages, differences in microenvironment characteristics create distinctions in energy expenditure for a moving cell and thus influence cell migration, proliferation, agglomeration, and cell death. Numerical results of our model show a phase transition leading to the emergence of two distinct spatio-temporal patterns depending on the dominant search mechanism. If global search is dominant, the result is a small number of large clusters exhibiting rapid spatial expansion but a shorter lifetime of the tumor system. By contrast, if local search is dominant, the trade-off is many small clusters with a longer lifetime but a much slower velocity of expansion. Furthermore, in the case of such dominant local search
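The two-stage search described above can be caricatured in a few lines. This is a minimal illustrative sketch, not the authors' actual model: the toy lattice, the nutrient field, and the move rule (step to the highest-nutrient candidate site) are all assumptions introduced for demonstration.

```python
import random

def best_site(candidates, nutrient):
    # The site with the highest nutrient concentration wins.
    return max(candidates, key=lambda s: nutrient[s])

def step_cell(pos, nutrient, size, p_global, rng):
    """Move a virtual cell one step. With probability p_global it scans
    the whole lattice (global search); otherwise only its 4-neighbourhood
    (local search), echoing the two-stage search in the model above."""
    x, y = pos
    if rng.random() < p_global:
        candidates = list(nutrient)                     # global scan
    else:
        candidates = [(x + dx, y + dy)                  # local scan
                      for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if 0 <= x + dx < size and 0 <= y + dy < size]
    return best_site(candidates, nutrient)

# Toy 5x5 nutrient field increasing toward the corner (4, 4).
SIZE = 5
nutrient = {(x, y): x + y for x in range(SIZE) for y in range(SIZE)}
rng = random.Random(0)
global_move = step_cell((0, 0), nutrient, SIZE, 1.0, rng)  # jumps to the peak
local_move = step_cell((0, 0), nutrient, SIZE, 0.0, rng)   # creeps uphill
```

The dominant-search trade-off reported in the abstract corresponds to sweeping `p_global` between 0 and 1.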

  9. Estimation of the age-specific per-contact probability of Ebola virus transmission in Liberia using agent-based simulations

    NASA Astrophysics Data System (ADS)

    Siettos, Constantinos I.; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2016-06-01

    Based on multiscale agent-based computations we estimated the per-contact probability of transmission by age of the Ebola virus disease (EVD) that swept through Liberia from May 2014 to March 2015. To approximate the epidemic dynamics, we developed a detailed agent-based model with small-world interactions between individuals categorized by age. For the estimation of the structure of the evolving contact network, as well as the per-contact transmission probabilities by age group, we exploited the so-called Equation-Free framework. Model parameters were fitted to official case counts reported by the World Health Organization (WHO), as well as to recently published data on key epidemiological variables such as the mean times to death and recovery and the case fatality rate.
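The idea of a per-contact transmission probability can be illustrated with a toy calculation. This is a hypothetical sketch, not the paper's Equation-Free fitting procedure: `infection_prob`, `fit_beta`, and the attack-rate and contact numbers are invented for demonstration.

```python
def infection_prob(beta, contacts):
    # Chance of at least one successful transmission over `contacts`
    # independent exposures, each with per-contact probability `beta`.
    return 1.0 - (1.0 - beta) ** contacts

def fit_beta(attack_rate, contacts, grid=100001):
    """Brute-force grid search for the per-contact probability that best
    reproduces an observed attack rate -- a toy stand-in for fitting an
    agent-based model to reported case counts."""
    best, best_err = 0.0, float("inf")
    for i in range(grid):
        beta = i / (grid - 1)
        err = abs(infection_prob(beta, contacts) - attack_rate)
        if err < best_err:
            best, best_err = beta, err
    return best

# Hypothetical numbers: 40% of a case's contacts became infected after
# an average of 10 exposures each.
beta_hat = fit_beta(attack_rate=0.40, contacts=10)
```

Solving 1 - (1 - beta)^10 = 0.40 analytically gives beta ≈ 0.0498, which the grid search recovers.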

  10. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. Such a tool must address multiple measurement and information sources in order to capture full system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine their effect on the sizing of the integrated vehicle. The development of such a framework builds upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and the possible state measurements and observations that feed into the simulation implementation structure. These models also provide a common environment for capturing an increasingly complex operational architecture involving multiple spacecraft, ground stations, and communication networks. To address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package demonstrates the capability of the modeling framework, including its modularity, its analysis capabilities, and its unification back to the overall system requirements and definition.

  11. Thread Group Multithreading: Accelerating the Computation of an Agent-Based Power System Modeling and Simulation Tool -- GridLAB-D

    SciTech Connect

    Jin, Shuangshuang; Chassin, David P.

    2014-01-06

    GridLAB-D™ is an open-source, next-generation agent-based smart-grid simulator that provides unprecedented capability to model the performance of smart grid technologies. Over the past few years, GridLAB-D has been used to conduct important analyses of smart grid concepts, but it is still quite limited by its computational performance. In order to break through the performance bottleneck and meet the need for large-scale power grid simulations, we developed a thread group mechanism to implement highly granular multithreaded computation in GridLAB-D. For a benchmark simple house model, we achieve close-to-linear speedups with the multithreaded version compared against the single-threaded version of the same code running on general-purpose multi-core commodity hardware. The multithreaded code shows favorable scalability and resource utilization, and much shorter execution times for large-scale power grid simulations.
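The thread-group idea (partition the agent population, update each partition on its own thread) can be sketched as follows. This mirrors the mechanism only in shape: `update_house` is a placeholder, the partitioning scheme is an assumption, and in CPython the GIL prevents real speedup, whereas GridLAB-D's C/C++ core threads run truly in parallel.

```python
from concurrent.futures import ThreadPoolExecutor

def update_house(temp):
    # Placeholder per-agent computation; the real simulator solves a
    # house's thermal/electrical model at this point.
    return temp * 0.99 + 1.0

def run_sweep(states, n_groups):
    """Split the agent population into `n_groups` thread groups and
    update each group on its own worker thread."""
    size = (len(states) + n_groups - 1) // n_groups
    groups = [states[i:i + size] for i in range(0, len(states), size)]
    with ThreadPoolExecutor(max_workers=n_groups) as pool:
        updated = pool.map(lambda grp: [update_house(t) for t in grp], groups)
    return [t for grp in updated for t in grp]

temps = [20.0] * 8
after = run_sweep(temps, n_groups=4)
```

Because the per-group work is independent within a sweep, granularity can be tuned by changing `n_groups` without altering results.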

  12. Rural-urban migration including formal and informal workers in the urban sector: an agent-based numerical simulation study

    NASA Astrophysics Data System (ADS)

    Branco, Nilton; Oliveira, Tharnier; Silveira, Jaylson

    2012-02-01

    The goal of this work is to study rural-urban migration in the early stages of industrialization. We use an agent-based model and take into account the existence of informal and formal workers in the urban sector and possible migration movements, dependent on the agents' social and private utilities. Our agents are placed on the vertices of a square lattice, such that each vertex holds exactly one agent. Rural, urban-informal, and urban-formal workers are represented by the three states of a three-state Ising model. At every step, a fraction a of the agents may change sectors or migrate. The total utility of a given agent is then calculated and compared to a random utility, in order to check whether this agent becomes an actual migrant or changes sector. The dynamics is carried out until an equilibrium state is reached; equilibrium variables are then calculated and compared to available data. We find that a generalized Harris-Todaro condition is satisfied [1] in these equilibrium regimes, i.e., the ratio of expected wages between any pair of sectors reaches a constant value. [1] J. J. Silveira, A. L. Espíndola and T. J. Penna, Physica A 364, 445 (2006).
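A minimal version of the lattice dynamics might look like the following sketch. The wage values, the social-utility bonus, and the deterministic best-response update are illustrative assumptions, not the paper's specification (which compares total utility against a random utility).

```python
import random

RURAL, INFORMAL, FORMAL = 0, 1, 2
WAGES = {RURAL: 1.0, INFORMAL: 1.3, FORMAL: 2.0}  # illustrative wages

def utility(state, neighbors):
    # Private utility (wage) plus a social term rewarding agreement with
    # lattice neighbours, loosely in the spirit of the Ising formulation.
    social = sum(1 for n in neighbors if n == state)
    return WAGES[state] + 0.1 * social

def step(lattice, size, frac, rng):
    """One sweep: a fraction `frac` of agents may switch to the sector
    with the highest utility given their current neighbourhood."""
    new = dict(lattice)
    for (x, y), s in lattice.items():
        if rng.random() >= frac:
            continue
        nbrs = [lattice[((x + dx) % size, (y + dy) % size)]
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        new[(x, y)] = max((RURAL, INFORMAL, FORMAL),
                          key=lambda c: utility(c, nbrs))
    return new

SIZE = 10
rng = random.Random(1)
lattice = {(x, y): rng.choice((RURAL, INFORMAL, FORMAL))
           for x in range(SIZE) for y in range(SIZE)}
for _ in range(20):
    lattice = step(lattice, SIZE, 0.3, rng)
```

With the formal wage dominating any social bonus here, nearly all agents converge to the formal sector; the interesting Harris-Todaro regimes arise when wages and social pressure compete.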

  13. Ideal free distribution or dynamic game? An agent-based simulation study of trawling strategies with varying information

    NASA Astrophysics Data System (ADS)

    Beecham, J. A.; Engelhard, G. H.

    2007-10-01

    An ecological economic model of trawling is presented to demonstrate the effect of trawling-location choice strategy on net input (the rate of economic gain from fish caught per time spent, less costs). Fishing location choice is considered a dynamic process whereby trawlers choose from among a repertoire of plastic strategies, which they modify if their gains fall below a fixed proportion of the mean gains of the fleet as a whole. The distribution of fishing across different areas of a fishery follows an approximate ideal free distribution (IFD), with varying noise due to uncertainty. The least-productive areas are not utilised because their initial net input never reaches the mean yield of better areas subject to competitive exploitation. In cases where there is weak temporal autocorrelation between fish stocks in a specific location, a plastic strategy of local translocation between trawls, mixed with longer-range translocation, increases realised input. The trawler can change its translocation strategy in the light of information about recent trawling success compared to its long-term average but, in contrast to predictions of the Marginal Value Theorem (MVT) model, does not know for certain what it will find by moving, so it may need to sample new patches. The combination of the two types of translocation mirrors the beam-trawling strategies used by the Dutch fleet, and the resultant distribution of trawling effort is confirmed by analysis of the historical effort distribution of British otter-trawling fleets in the North Sea. Fisheries exploitation represents an area where dynamic agent-based adaptive models may be a better representation of the economic dynamics of a fleet than classically inspired optimisation models.
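Two ingredients of the abstract can be sketched directly: the strategy-switch rule (jump long-range when gains fall below a fixed proportion of the fleet mean) and the ideal free distribution it approximates. The threshold value and patch productivities below are illustrative assumptions.

```python
def choose_translocation(recent_gain, fleet_mean, threshold=0.8):
    """Switch rule from the abstract: keep trawling locally while gains
    hold up, jump long-range once they drop below a fixed proportion of
    the fleet-wide mean (the 0.8 threshold is illustrative)."""
    return "long_range" if recent_gain < threshold * fleet_mean else "local"

def ideal_free_effort(productivities, total_effort):
    # Under an ideal free distribution, effort is spread in proportion to
    # patch productivity, so gain per unit effort equalises across
    # exploited patches.
    total = sum(productivities)
    return [total_effort * p / total for p in productivities]

efforts = ideal_free_effort([2.0, 3.0, 5.0], total_effort=100.0)
gains_per_effort = [p / e for p, e in zip([2.0, 3.0, 5.0], efforts)]
```

The equalised `gains_per_effort` is exactly the IFD property the simulated fleet only approximates under noise and uncertainty.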

  14. Component-Based Framework for Subsurface Simulations

    SciTech Connect

    Palmer, Bruce J.; Fang, Yilin; Hammond, Glenn E.; Gurumoorthi, Vidhya

    2007-08-01

    Simulations of the subsurface environment represent a broad range of phenomena covering an equally broad range of scales. Developing modelling capabilities that can integrate models representing different phenomena acting at different scales presents formidable challenges from both the algorithmic and the computer science perspectives. This paper describes the development of an integrated framework that will be used to combine different models into a single simulation. Initial work has focused on creating two frameworks: one for performing smoothed particle hydrodynamics (SPH) simulations of fluid systems, the other for performing grid-based continuum simulations of reactive subsurface flow. The SPH framework is based on a parallel code developed for pore-scale simulations; the continuum grid-based framework is based on the STOMP (Subsurface Transport Over Multiple Phases) code developed at PNNL. Future work will focus on combining the frameworks to perform multiscale, multiphysics simulations of reactive subsurface flow.

  15. Integrating the simulation of domestic water demand behaviour to an urban water model using agent based modelling

    NASA Astrophysics Data System (ADS)

    Koutiva, Ifigeneia; Makropoulos, Christos

    2015-04-01

    The urban water system's sustainable evolution requires tools that can analyse and simulate the complete cycle, including both the physical and the cultural environment. One of the main challenges in this regard is the design and development of tools able to simulate society's water demand behaviour and the way policy measures affect it. The effects of these policy measures are a function of personal opinions, which subsequently lead to the formation of people's attitudes; these attitudes eventually form behaviours. This work presents the design of an ABM tool addressing the social dimension of the urban water system. The created tool, called the Urban Water Agents' Behaviour (UWAB) model, was implemented using the NetLogo agent programming language. The main aim of the UWAB model is to capture the effects of policies and environmental pressures on the water conservation behaviour of urban households. The model consists of agents representing urban households that are linked to each other, creating a social network that influences the water conservation behaviour of its members. Household agents are influenced as well by policies and by environmental pressures such as drought. The UWAB model simulates behaviour resulting in the evolution of water conservation within an urban population. The final outcome of the model is the evolution of the distribution of different conservation levels (no, low, high) across the selected urban population. In addition, UWAB is implemented in combination with an existing urban water management simulation tool, the Urban Water Optioneering Tool (UWOT), in order to create a modelling platform aiming to facilitate an adaptive approach to water resources management. For the purposes of this proposed modelling platform, UWOT is used in a twofold manner: (1) to simulate domestic water demand evolution and (2) to simulate the response of the water system to the domestic water demand evolution. The main advantage of the UWAB - UWOT model
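The social-network influence mechanism described above can be sketched in a few lines. UWAB itself is a NetLogo model; this Python sketch uses a hypothetical update rule (households drift one level toward the mean of their neighbours plus an external policy push), which is an assumption for illustration only.

```python
def update_household(level, neighbor_levels, policy_pressure):
    """One opinion update: a household moves one step toward the mean of
    its social network plus an external policy push (0 = none, 1 = strong).
    Conservation levels: 0 = no, 1 = low, 2 = high."""
    target = sum(neighbor_levels) / len(neighbor_levels) + policy_pressure
    if target > level:
        return min(2, level + 1)
    if target < level:
        return max(0, level - 1)
    return level

def sweep(levels, network, policy_pressure):
    # Synchronous update of every household from the previous state.
    return [update_household(levels[i], [levels[j] for j in network[i]],
                             policy_pressure)
            for i in range(len(levels))]

# Ring network of 6 households, initially mixed.
network = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
levels = [0, 0, 1, 2, 1, 0]
for _ in range(5):
    levels = sweep(levels, network, policy_pressure=1.0)
```

Under sustained policy pressure the whole ring converges to high conservation; with `policy_pressure=0` the same network merely smooths toward its local means, which is the kind of distributional evolution UWAB reports.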

  16. Flexible Residential Smart Grid Simulation Framework

    NASA Astrophysics Data System (ADS)

    Xiang, Wang

    Different scheduling and coordination algorithms controlling household appliances' operations can potentially reduce energy consumption and/or balance load in conjunction with the different electricity pricing methods used in smart grid programs. In order to easily implement different algorithms and evaluate their efficiency against other ideas, a flexible simulation framework is desirable in both research and business settings. However, such a platform is currently lacking or underdeveloped. In this thesis, we provide a simulation framework focused on demand-side residential energy consumption coordination in response to different pricing methods. This simulation framework, equipped with an appliance consumption library using realistic values, aims to closely represent the average usage of different types of appliances. Simulated traditional usage yields values that closely match surveyed real-life consumption records. Several sample coordination algorithms, pricing schemes, and communication scenarios are also implemented to illustrate the use of the simulation framework.

  17. A Software Framework for Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.

    2008-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center has a long history in developing simulations of experimental fixed-wing aircraft from gliders to suborbital vehicles on platforms ranging from desktop simulators to pilot-in-the-loop/aircraft-in-the-loop simulators. Regardless of the aircraft or simulator hardware, much of the software framework is common to all NASA Dryden simulators. Some of this software has withstood the test of time, but in recent years the push toward high-fidelity user-friendly simulations has resulted in some significant changes. This report presents an overview of the current NASA Dryden simulation software framework and capabilities with an emphasis on the new features that have permitted NASA to develop more capable simulations while maintaining the same staffing levels.

  18. Incorporating fault tolerance in distributed agent based systems by simulating bio-computing model of stress pathways

    NASA Astrophysics Data System (ADS)

    Bansal, Arvind K.

    2006-05-01

    The bio-computing model of Distributed Multiple Intelligent Agent Systems (BDMIAS) models agents as genes, cooperating groups of agents as operons (commonly regulated groups of genes), and complex tasks as sets of interacting pathways, such that the pathways involve multiple cooperating operons. The agents (or groups of agents) interact with each other using message passing and pattern-based bindings that may temporarily reconfigure an agent's function. In this paper, a technique is described for incorporating fault tolerance in BDMIAS. The scheme is based upon simulating BDMIAS, exploiting the modeling of biological stress pathways, the integration of fault avoidance, and distributed fault recovery of crashed agents. Stress pathways are latent pathways in biological systems that get triggered very quickly, regulate the complex biological system by temporarily regulating or inactivating undesirable pathways, and are essential to avoid catastrophic failures. Pattern-based interaction between messages and agents allows multiple agents to react concurrently in response to a single condition change represented by a message broadcast. Fault avoidance exploits intelligent processing-rate control using message-based loop feedback, together with temporary reconfiguration that alters the data flow between functional modules within an agent. Fault recovery exploits the concept of semi-passive shadow agents (one on the local machine and the other on a remote machine), dynamic polling of machines, logically time-stamped messages to avoid message losses, and distributed archiving of the volatile part of an agent's state on distributed machines. Various algorithms are described.
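The shadow-agent recovery idea can be sketched as a primary/shadow pair with checkpointing and a heartbeat poll. This single-process toy (class names and the poll logic are assumptions) only illustrates the shape of the scheme; the paper's version distributes the shadow to a remote machine and uses logically time-stamped messages.

```python
class Agent:
    def __init__(self, name):
        self.name, self.state, self.alive = name, {}, True

class ShadowPair:
    """Sketch of the semi-passive shadow-agent idea: the primary does the
    work and checkpoints its volatile state to a shadow, which takes over
    when a heartbeat poll finds the primary dead."""
    def __init__(self, name):
        self.primary = Agent(name)
        self.shadow = Agent(name + "-shadow")

    def checkpoint(self):
        # Archive the volatile part of the primary's state.
        self.shadow.state = dict(self.primary.state)

    def poll(self):
        if not self.primary.alive:      # heartbeat failed: fail over
            self.primary = self.shadow
        return self.primary

pair = ShadowPair("worker")
pair.primary.state["step"] = 42
pair.checkpoint()
pair.primary.alive = False              # simulate a crash
survivor = pair.poll()
```

After failover the shadow resumes from the last checkpoint, so at most the work since the previous `checkpoint()` call is lost.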

  19. Agent-based models in translational systems biology

    PubMed Central

    An, Gary; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram

    2013-01-01

    Effective translational methodologies for knowledge representation are needed in order to make strides against the constellation of diseases that affect the world today. These diseases are defined by their mechanistic complexity, redundancy, and nonlinearity. Translational systems biology aims to harness the power of computational simulation to streamline drug/device design, simulate clinical trials, and eventually to predict the effects of drugs on individuals. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggests that this modeling framework is well suited for translational systems biology. This review describes agent-based modeling and gives examples of its translational applications in the context of acute inflammation and wound healing. PMID:20835989

  20. Object-oriented framework for distributed simulation

    NASA Astrophysics Data System (ADS)

    Hunter, Julia; Carson, John A.; Colley, Martin; Standeven, John; Callaghan, Victor

    1999-06-01

    The benefits of object-oriented technology are widely recognized in software engineering. This paper describes the use of the object-oriented paradigm to create distributed simulations. The University of Essex Robotics and Intelligent Machines group has been carrying out research into distributed vehicle simulation since 1992. Part of this research has focused on the development of simulation systems to assist in the design of robotic vehicles. This paper describes the evolution of these systems, from an early toolkit used for teaching robotics to recent work on using simulation as a design tool in the creation of a new generation of unmanned underwater vehicles. It outlines experiences gained in using PVM, and ongoing research into the use of the emerging High Level Architecture as the basis for these frameworks. The paper concludes with the perceived benefits of adopting object-oriented methodologies as the basis for simulation frameworks.

  1. Agent-based Modeling with MATSim for Hazards Evacuation Planning

    NASA Astrophysics Data System (ADS)

    Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.

    2015-12-01

    Hazard evacuation planning requires robust modeling tools and techniques, such as least cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g. tsunami) arrival. Least cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.
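The static "least cost distance" view mentioned above can be sketched with a breadth-first search from the safe zone. This is an illustrative sketch under assumed uniform per-cell crossing times; MATSim itself performs queue-based dynamic assignment of individual agents, which this deliberately does not model.

```python
from collections import deque

def evacuation_times(walkable, safe_cells, step_time=1.0):
    """Minimum travel time from every walkable cell to the nearest safe
    cell, assuming a uniform crossing time per cell (the static view of
    the evacuation landscape)."""
    times = {cell: float("inf") for cell in walkable}
    queue = deque()
    for cell in safe_cells:
        times[cell] = 0.0
        queue.append(cell)
    while queue:                         # breadth-first flood from safety
        x, y = queue.popleft()
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nbr = (x + dx, y + dy)
            if nbr in times and times[nbr] == float("inf"):
                times[nbr] = times[(x, y)] + step_time
                queue.append(nbr)
    return times

# Toy 4x4 walkable grid with a single safe cell in one corner.
grid = {(x, y) for x in range(4) for y in range(4)}
times = evacuation_times(grid, safe_cells={(0, 0)})
```

Cells where the computed time exceeds the hazard arrival time are the candidates for the congestion-aware, agent-based analysis described in the abstract.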

  2. Framework for Network Co-Simulation

    2014-01-09

    The Framework for Network Co-Simulation (FNCS) uses a federated approach to integrate simulations which may have differing time scales. Special consideration is given to integration with a communication network simulation such that inter-simulation messages may be optionally routed through and delayed by such a simulation. In addition, FNCS uses novel time synchronization algorithms to accelerate co-simulation, including the application of speculative multithreading. FNCS accomplishes all of these improvements with minimal end-user intervention. Simulations can be integrated using FNCS while maintaining their original model input files, simply by linking with the FNCS library and making appropriate calls into the FNCS API.

  3. Onyx-Advanced Aeropropulsion Simulation Framework Created

    NASA Technical Reports Server (NTRS)

    Reed, John A.

    2001-01-01

    The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.

  4. Pattern-oriented modeling of agent-based complex systems: Lessons from ecology

    USGS Publications Warehouse

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-01-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.

  5. The Umbra Simulation and Integration Framework Applied to Emergency Response Training

    NASA Technical Reports Server (NTRS)

    Hamilton, Paul Lawrence; Britain, Robert

    2010-01-01

    The Mine Emergency Response Interactive Training Simulation (MERITS) is intended to prepare personnel to manage an emergency in an underground coal mine. The creation of an effective training environment required realistic emergent behavior in response to simulation events and trainee interventions, exploratory modification of miner behavior rules, realistic physics, and incorporation of legacy code. It also required the ability to add rich media to the simulation without conflicting with normal desktop security settings. Our Umbra Simulation and Integration Framework facilitated agent-based modeling of miners and rescuers and made it possible to work with subject matter experts to quickly adjust behavior through script editing, rather than through lengthy programming and recompilation. Integration of Umbra code with the WebKit browser engine allowed the use of JavaScript-enabled local web pages for media support. This project greatly extended the capabilities of Umbra in support of training simulations and has implications for simulations that combine human behavior, physics, and rich media.

  6. Investigating biocomplexity through the agent-based paradigm.

    PubMed

    Kaul, Himanshu; Ventikos, Yiannis

    2015-01-01

    Capturing the dynamism that pervades biological systems requires a computational approach that can accommodate both the continuous features of the system environment as well as the flexible and heterogeneous nature of component interactions. This presents a serious challenge for the more traditional mathematical approaches that assume component homogeneity to relate system observables using mathematical equations. While the homogeneity condition does not lead to loss of accuracy while simulating various continua, it fails to offer detailed solutions when applied to systems with dynamically interacting heterogeneous components. As the functionality and architecture of most biological systems is a product of multi-faceted individual interactions at the sub-system level, continuum models rarely offer much beyond qualitative similarity. Agent-based modelling is a class of algorithmic computational approaches that rely on interactions between Turing-complete finite-state machines--or agents--to simulate, from the bottom-up, macroscopic properties of a system. In recognizing the heterogeneity condition, they offer suitable ontologies to the system components being modelled, thereby succeeding where their continuum counterparts tend to struggle. Furthermore, being inherently hierarchical, they are quite amenable to coupling with other computational paradigms. The integration of any agent-based framework with continuum models is arguably the most elegant and precise way of representing biological systems. Although in its nascence, agent-based modelling has been utilized to model biological complexity across a broad range of biological scales (from cells to societies). In this article, we explore the reasons that make agent-based modelling the most precise approach to model biological systems that tend to be non-linear and complex.

  7. Investigating biocomplexity through the agent-based paradigm

    PubMed Central

    Kaul, Himanshu

    2015-01-01

    Capturing the dynamism that pervades biological systems requires a computational approach that can accommodate both the continuous features of the system environment as well as the flexible and heterogeneous nature of component interactions. This presents a serious challenge for the more traditional mathematical approaches that assume component homogeneity to relate system observables using mathematical equations. While the homogeneity condition does not lead to loss of accuracy while simulating various continua, it fails to offer detailed solutions when applied to systems with dynamically interacting heterogeneous components. As the functionality and architecture of most biological systems is a product of multi-faceted individual interactions at the sub-system level, continuum models rarely offer much beyond qualitative similarity. Agent-based modelling is a class of algorithmic computational approaches that rely on interactions between Turing-complete finite-state machines—or agents—to simulate, from the bottom-up, macroscopic properties of a system. In recognizing the heterogeneity condition, they offer suitable ontologies to the system components being modelled, thereby succeeding where their continuum counterparts tend to struggle. Furthermore, being inherently hierarchical, they are quite amenable to coupling with other computational paradigms. The integration of any agent-based framework with continuum models is arguably the most elegant and precise way of representing biological systems. Although in its nascence, agent-based modelling has been utilized to model biological complexity across a broad range of biological scales (from cells to societies). In this article, we explore the reasons that make agent-based modelling the most precise approach to model biological systems that tend to be non-linear and complex. PMID:24227161

  8. An Active Learning Exercise for Introducing Agent-Based Modeling

    ERIC Educational Resources Information Center

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  9. Simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Tentner, A.

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed for running on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations of the posted driving speed is based on human-factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scalable to take advantage of emerging massively parallel processor (MPP) systems.

  10. Architecting a Simulation Framework for Model Rehosting

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2004-01-01

    The utility of vehicle math models extends beyond human-in-the-loop simulation. It is desirable to deploy a given model across a multitude of applications that target design, analysis, and research. However, the vehicle model alone represents an incomplete simulation. One must also replicate the environment models (e.g., atmosphere, gravity, terrain) to achieve identical vehicle behavior across all applications. Environment models are increasing in complexity and represent a substantial investment to re-engineer for a new application. A software component that can be rehosted in each application is one solution to the deployment problem. The component must encapsulate both the vehicle and environment models. The component must have a well-defined interface that abstracts the bulk of the logic to operate the models. This paper examines the characteristics of a rehostable modeling component from the perspective of a human-in-the-loop simulation framework. The Langley Standard Real-Time Simulation in C++ (LaSRS++) is used as an example. LaSRS++ was recently redesigned to transform its modeling package into a rehostable component.

  11. Argonne simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Canfield, T.; Brown-VanHoozer, A.; Tentner, A.

    1996-04-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems; however, a version for a stand-alone workstation is also available. The ITS simulator includes an Expert Driver Model (EDM) of instrumented "smart" vehicles with in-vehicle navigation units. The EDM is capable of performing optimal route planning and communicating with Traffic Management Centers (TMC). A dynamic road map database is used for optimum route planning, where the data is updated periodically to reflect any changes in road or weather conditions. The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces that support human-factors studies for safety and operational research. Realistic modeling of variations of the posted driving speed is based on human-factors studies that take into consideration weather, road conditions, driver's personality and behavior, and vehicle type. The simulator has been developed on a distributed system of networked UNIX computers, but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of the developed simulator is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. Vehicle processes interact with each other and with ITS components by exchanging messages. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
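The vehicles-as-message-passing-processes idea can be sketched with in-memory mailboxes. This single-process toy (the class names, message format, and advisory rule are assumptions) only illustrates the interaction pattern; the Argonne simulator runs each vehicle as an autonomous OS-level process on parallel hardware.

```python
class Vehicle:
    """Toy stand-in for the simulator's autonomous vehicle processes:
    each vehicle owns a mailbox and reacts to advisory messages from a
    Traffic Management Center by adjusting its speed."""
    def __init__(self, vid, speed):
        self.vid, self.speed, self.mailbox = vid, speed, []

    def react(self):
        # Drain the mailbox, slowing down if an advisory requires it.
        for msg in self.mailbox:
            if msg["type"] == "advisory" and msg["max_speed"] < self.speed:
                self.speed = msg["max_speed"]
        self.mailbox.clear()

class TMC:
    def __init__(self, vehicles):
        self.vehicles = vehicles

    def broadcast_advisory(self, max_speed):
        for v in self.vehicles:
            v.mailbox.append({"type": "advisory", "max_speed": max_speed})

fleet = [Vehicle("v1", 65), Vehicle("v2", 50)]
tmc = TMC(fleet)
tmc.broadcast_advisory(max_speed=55)   # e.g. a weather slowdown
for v in fleet:
    v.react()
```

Because all coordination flows through messages, the same logic maps naturally onto the distributed, per-process implementation the abstract describes.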

  12. An Agent-Based Cockpit Task Management System

    NASA Technical Reports Server (NTRS)

    Funk, Ken

    1997-01-01

    An agent-based program to facilitate Cockpit Task Management (CTM) in commercial transport aircraft is developed and evaluated. The agent-based program called the AgendaManager (AMgr) is described and evaluated in a part-task simulator study using airline pilots.

  13. MCdevelop - a universal framework for Stochastic Simulations

    NASA Astrophysics Data System (ADS)

    Slawinska, M.; Jadach, S.

    2011-03-01

We present MCdevelop, a universal computer framework for developing and exploiting the wide class of Stochastic Simulations (SS) software. This powerful universal SS software development tool has been derived from a series of scientific projects for precision calculations in high energy physics (HEP), which feature a wide range of functionality in the SS software needed for advanced precision Quantum Field Theory calculations for the past LEP experiments and for the ongoing LHC experiments at CERN, Geneva. MCdevelop is a "spin-off" product of HEP to be exploited in other areas, while it will still serve to develop new SS software for HEP experiments. Typically, SS involve independent generation of large sets of random "events", often requiring considerable CPU power. Since SS jobs usually do not share memory, they are easy to parallelize. The efficient development, testing, and parallel running of SS software require a convenient framework to develop software source code, deploy and monitor batch jobs, and merge and analyse results from multiple parallel jobs, even before the production runs are terminated. Throughout the years of development of stochastic simulations for HEP, a sophisticated framework featuring all of the above-mentioned functionality has been implemented. MCdevelop represents its latest version, written mostly in C++ (GNU compiler gcc). It uses Autotools to build binaries (optionally managed within the KDevelop 3.5.3 Integrated Development Environment (IDE)). It uses the open-source ROOT package for histogramming, graphics, and the mechanism of persistency for the C++ objects. MCdevelop helps to run multiple parallel jobs on any computer cluster with an NQS-type batch system. Program summary: Program title: MCdevelop. Catalogue identifier: AEHW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http

  14. Agent-Based Literacy Theory

    ERIC Educational Resources Information Center

    McEneaney, John E.

    2006-01-01

    The purpose of this theoretical essay is to explore the limits of traditional conceptualizations of reader and text and to propose a more general theory based on the concept of a literacy agent. The proposed theoretical perspective subsumes concepts from traditional theory and aims to account for literacy online. The agent-based literacy theory…

  15. Simulation framework for spatio-spectral anomalous change detection

    SciTech Connect

    Theiler, James P; Harvey, Neal R; Porter, Reid B; Wohlberg, Brendt E

    2009-01-01

    The authors describe the development of a simulation framework for anomalous change detection that considers both the spatial and spectral aspects of the imagery. A purely spectral framework has previously been introduced, but the extension to spatio-spectral requires attention to a variety of new issues, and requires more careful modeling of the anomalous changes. Using this extended framework, they evaluate the utility of spatial image processing operators to enhance change detection sensitivity in (simulated) remote sensing imagery.

  16. Agent-based forward analysis

    SciTech Connect

    Kerekes, Ryan A.; Jiao, Yu; Shankar, Mallikarjun; Potok, Thomas E.; Lusk, Rick M.

    2008-01-01

We propose software agent-based "forward analysis" for efficient information retrieval in a network of sensing devices. In our approach, processing is pushed to the data at the edge of the network via intelligent software agents rather than pulling data to a central facility for processing. The agents are deployed with a specific query and perform varying levels of analysis of the data, communicating with each other and sending only relevant information back across the network. We demonstrate our concept in the context of face recognition using a wireless test bed comprised of PDA cell phones and laptops. We show that agent-based forward analysis can provide a significant increase in retrieval speed while decreasing bandwidth usage and information overload at the central facility.
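
The forward-analysis idea, carrying the query to the data and returning only relevant records, reduces to a simple pattern (the node and record layout below is a toy illustration, not the paper's PDA test bed):

```python
# Each "sensor node" holds its raw data locally; an agent carries the query
# to the node, analyzes in place, and ships back only matching records.
def deploy_agent(query, nodes):
    """Push the query to every node; pull back only the matches."""
    results = []
    for node in nodes:
        # Analysis runs at the edge: only matches cross the network.
        results.extend(rec for rec in node if query(rec))
    return results

nodes = [
    [{"id": 1, "label": "face"}, {"id": 2, "label": "car"}],
    [{"id": 3, "label": "face"}],
]
hits = deploy_agent(lambda r: r["label"] == "face", nodes)
print([h["id"] for h in hits])  # [1, 3]: two records returned instead of all three
```

Bandwidth savings scale with the selectivity of the query, which is the paper's central claim.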

  17. Exploring cooperation and competition using agent-based modeling

    PubMed Central

    Elliott, Euel; Kiel, L. Douglas

    2002-01-01

    Agent-based modeling enhances our capacity to model competitive and cooperative behaviors at both the individual and group levels of analysis. Models presented in these proceedings produce consistent results regarding the relative fragility of cooperative regimes among agents operating under diverse rules. These studies also show how competition and cooperation may generate change at both the group and societal level. Agent-based simulation of competitive and cooperative behaviors may reveal the greatest payoff to social science research of all agent-based modeling efforts because of the need to better understand the dynamics of these behaviors in an increasingly interconnected world. PMID:12011396
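
The fragility of cooperative regimes discussed above can be illustrated with the iterated prisoner's dilemma under simple agent rules (the payoff matrix and the two rules below are the standard textbook setup, not a specific model from the proceedings):

```python
# Standard prisoner's dilemma payoffs: (row player, column player).
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(rule_a, rule_b, rounds=5):
    """Iterate the game; each rule sees only the opponent's last move."""
    score_a = score_b = 0
    last_a = last_b = None
    for _ in range(rounds):
        move_a = rule_a(last_b)
        move_b = rule_b(last_a)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pa, score_b + pb
        last_a, last_b = move_a, move_b
    return score_a, score_b

tit_for_tat = lambda opp: "C" if opp is None else opp  # cooperate, then mirror
all_d = lambda opp: "D"                                # always defect

print(play(tit_for_tat, tit_for_tat))  # (15, 15): cooperation is stable...
print(play(tit_for_tat, all_d))        # (4, 9): ...but exploitable by defectors
```

Agent-based models in the proceedings layer richer rules and population dynamics on top of exactly this kind of pairwise interaction.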

  18. Study of photo-oxidative reactivity of sunscreening agents based on photo-oxidation of uric acid by kinetic Monte Carlo simulation.

    PubMed

    Moradmand Jalali, Hamed; Bashiri, Hadis; Rasa, Hossein

    2015-05-01

    In the present study, the mechanism of free radical production by light-reflective agents in sunscreens (TiO2, ZnO and ZrO2) was obtained by applying kinetic Monte Carlo simulation. The values of the rate constants for each step of the suggested mechanism have been obtained by simulation. The effect of the initial concentration of mineral oxides and uric acid on the rate of uric acid photo-oxidation by irradiation of some sun care agents has been studied. The kinetic Monte Carlo simulation results agree qualitatively with the existing experimental data for the production of free radicals by sun care agents.
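
The kinetic Monte Carlo approach can be sketched with a Gillespie-style algorithm on a toy two-step mechanism (the reaction scheme, rate constants, and counts below are illustrative assumptions, not the paper's fitted values):

```python
import random

# Illustrative two-step mechanism:
#   (1) MO --k1--> MO + R      photo-generation of a radical R at the oxide MO
#   (2) R + UA --k2--> UA_ox   radical attacks uric acid UA
def gillespie(k1, k2, n_mo, n_ua, t_end, rng):
    t, n_r, n_ox = 0.0, 0, 0
    while t < t_end:
        a1 = k1 * n_mo            # propensity of radical generation
        a2 = k2 * n_r * n_ua      # propensity of uric acid oxidation
        a0 = a1 + a2
        if a0 == 0:
            break
        t += rng.expovariate(a0)  # exponentially distributed time to next event
        if rng.random() < a1 / a0:  # pick which reaction fires
            n_r += 1
        else:
            n_r -= 1
            n_ua -= 1
            n_ox += 1
    return n_ua, n_ox

rng = random.Random(42)
n_ua_left, n_ox = gillespie(k1=0.5, k2=0.01, n_mo=50, n_ua=100, t_end=10.0, rng=rng)
print(n_ua_left + n_ox)  # mass balance: always the initial 100 uric acid molecules
```

Fitting the rate constants, as in the study, amounts to adjusting k1 and k2 until the simulated oxidation curve matches the measured one.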

  19. GIS and agent based spatial-temporal simulation modeling for assessing tourism social carrying capacity: a study on Mount Emei scenic area, China

    NASA Astrophysics Data System (ADS)

    Zhang, Renjun

    2007-06-01

Each scenic area can sustain a specific level of acceptance of tourist development and use, beyond which further development can result in socio-cultural deterioration or a decline in the quality of the experience gained by visitors. This specific level is called the carrying capacity. Social carrying capacity can be defined as the maximum level of use (in terms of numbers and activities) that can be absorbed by an area without an unacceptable decline in the quality of experience of visitors and without an unacceptable adverse impact on the society of the area. Carrying capacity is difficult to assess because it is determined not only by the number of visitors, but also by the time, the type of recreation, the characteristics of each individual, and the physical environment. The objective of this study is to build a spatial-temporal simulation model to simulate the spatial-temporal distribution of tourists. This model is a tourist spatial behaviors simulator (TSBS). Based on the TSBS, the changes in each visitor's travel pattern, such as location, cost, and other state data, are recorded in a state table. By analyzing this table, the intensity of tourist use in any area can be calculated, and the changes in the quality of the tourism experience can be quantified and analyzed. With this micro-simulation method, the social carrying capacity can be assessed more accurately, monitored proactively, and managed adaptively. In this paper, the carrying capacity of the Mount Emei scenic area is analyzed as follows: the intensity of crowding was selected as the monitoring indicator, on the premise that a longer waiting time means a more crowded site. The TSBS was used to simulate the spatial-temporal distribution of tourists, and the average waiting time over all visitors was calculated. The author then assessed the social carrying capacity of the Mount Emei scenic area and identified the key factors impacting it.
The results show that the TSBS
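
The crowding indicator described above, average waiting time computed from the TSBS state table, reduces to a simple aggregation (the table rows and site names below are hypothetical):

```python
# State table as recorded by the TSBS: one row per visitor per site visit.
state_table = [
    {"visitor": 1, "site": "Golden Summit", "wait_min": 40},
    {"visitor": 2, "site": "Golden Summit", "wait_min": 55},
    {"visitor": 3, "site": "Baoguo Temple", "wait_min": 5},
]

def mean_wait(rows, site=None):
    """Average waiting time, overall or for one site: the crowding indicator."""
    waits = [r["wait_min"] for r in rows if site is None or r["site"] == site]
    return sum(waits) / len(waits)

print(mean_wait(state_table))                   # ~33.3 minutes overall
print(mean_wait(state_table, "Golden Summit"))  # 47.5 -> flags the crowded site
```

Comparing per-site averages against an acceptable threshold is one way to operationalize the social carrying capacity assessment.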

  20. A Multiscale/Multifidelity CFD Framework for Robust Simulations

    NASA Astrophysics Data System (ADS)

    Lee, Seungjoon; Kevrekidis, Yannis; Karniadakis, George

    2015-11-01

We develop a general CFD framework based on multifidelity simulations to target multiscale problems as well as resilience in exascale simulations, where faulty processors may lead to gappy simulated fields. We combine approximation theory and domain decomposition together with machine learning techniques, e.g. co-Kriging, to estimate boundary conditions and minimize communications by performing independent parallel runs. To demonstrate this new simulation approach, we consider two benchmark problems. First, we solve the heat equation with different patches of the domain simulated by finite differences at fine resolution or very low resolution, but also with Monte Carlo, hence fusing multifidelity and heterogeneous models to obtain the final answer. Second, we simulate the flow in a driven cavity by fusing finite difference solutions with solutions obtained by dissipative particle dynamics, a coarse-grained molecular dynamics method. In addition to its robustness and resilience, the new framework generalizes previous multiscale approaches (e.g. continuum-atomistic) in a unified parallel computational framework.

  1. A Simulation and Modeling Framework for Space Situational Awareness

    SciTech Connect

    Olivier, S S

    2008-09-15

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  2. The Astrophysics Simulation Collaboratory portal: A framework foreffective distributed research

    SciTech Connect

    Bondarescu, Ruxandra; Allen, Gabrielle; Daues, Gregory; Kelly,Ian; Russell, Michael; Seidel, Edward; Shalf, John; Tobias, Malcolm

    2003-03-03

    We describe the motivation, architecture, and implementation of the Astrophysics Simulation Collaboratory (ASC) portal. The ASC project provides a web-based problem solving framework for the astrophysics community that harnesses the capabilities of emerging computational grids.

  3. Agent Based Modeling as an Educational Tool

    NASA Astrophysics Data System (ADS)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent-based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model-based curriculum in the classroom given current and anticipated core and content standards. (Figure captions: simulation using California GIS data; simulation of high school student lunch popularity using an aerial photograph over a terrain value map.)

  4. Development of an Agent-Based Model (ABM) to Simulate the Immune System and Integration of a Regression Method to Estimate the Key ABM Parameters by Fitting the Experimental Data.

    PubMed

    Tong, Xuming; Chen, Jinghang; Miao, Hongyu; Li, Tingting; Zhang, Le

    2015-01-01

Agent-based models (ABM) and differential equations (DE) are two commonly used methods for immune system simulation. However, it is difficult for an ABM to estimate key model parameters by incorporating experimental data, whereas the differential equation model is incapable of describing the complicated immune system in detail. To overcome these problems, we developed an integrated ABM regression model (IABMR). It combines the advantages of ABM and DE by employing the ABM to mimic the multi-scale immune system, with its various phenotypes and types of cells, and using the input and output of the ABM to build a Loess regression for key parameter estimation. Next, we employed a greedy algorithm to estimate the key parameters of the ABM with respect to the same experimental data set, and used the ABM to describe a 3D immune system similar to previous studies that employed the DE model. These results indicate that IABMR not only has the potential to simulate the immune system at various scales, phenotypes, and cell types, but can also accurately infer key parameters as a DE model does. Therefore, this study developed a complex-system modeling mechanism that can simulate the complicated immune system in detail, like an ABM, and validate the reliability and efficiency of the model by fitting experimental data, like a DE model.
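
The estimation workflow, treating the ABM as a black box and greedily searching for the parameter whose output best fits experimental data, can be sketched as follows (the toy logistic "ABM" and the grid-based greedy search are stand-ins, not the paper's immune model or its Loess fit):

```python
# Toy stand-in for the ABM: key parameter k -> simulated cell-count trajectory.
def abm(k, steps=5):
    n = 100.0
    out = []
    for _ in range(steps):
        n += k * n * (1 - n / 1000.0)  # logistic-style growth driven by k
        out.append(n)
    return out

experimental = abm(0.30)  # synthetic "lab data" generated with a known k

def greedy_fit(lo, hi, n_grid=61):
    """Greedy search: keep the k whose ABM output best fits the data."""
    best_k, best_err = None, float("inf")
    for i in range(n_grid):
        k = lo + (hi - lo) * i / (n_grid - 1)
        err = sum((s - e) ** 2 for s, e in zip(abm(k), experimental))
        if err < best_err:
            best_k, best_err = k, err
    return best_k

print(round(greedy_fit(0.0, 0.6), 2))  # 0.3: recovers the generating parameter
```

In IABMR a regression surrogate stands between the ABM and the search so that far fewer expensive ABM runs are needed; the loop structure is the same.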

  5. FACET: A simulation software framework for modeling complex societal processes and interactions

    SciTech Connect

    Christiansen, J. H.

    2000-06-02

    FACET, the Framework for Addressing Cooperative Extended Transactions, was developed at Argonne National Laboratory to address the need for a simulation software architecture in the style of an agent-based approach, but with sufficient robustness, expressiveness, and flexibility to be able to deal with the levels of complexity seen in real-world social situations. FACET is an object-oriented software framework for building models of complex, cooperative behaviors of agents. It can be used to implement simulation models of societal processes such as the complex interplay of participating individuals and organizations engaged in multiple concurrent transactions in pursuit of their various goals. These transactions can be patterned on, for example, clinical guidelines and procedures, business practices, government and corporate policies, etc. FACET can also address other complex behaviors such as biological life cycles or manufacturing processes. To date, for example, FACET has been applied to such areas as land management, health care delivery, avian social behavior, and interactions between natural and social processes in ancient Mesopotamia.

  6. Simulation Framework for Teaching in Modeling and Simulation Areas

    ERIC Educational Resources Information Center

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan

    2008-01-01

    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…

  7. Games and Simulations in Online Learning: Research and Development Frameworks

    ERIC Educational Resources Information Center

    Gibson, David; Aldrich, Clark; Prensky, Marc

    2007-01-01

    Games and Simulations in Online Learning: Research and Development Frameworks examines the potential of games and simulations in online learning, and how the future could look as developers learn to use the emerging capabilities of the Semantic Web. It presents a general understanding of how the Semantic Web will impact education and how games and…

  8. FDPS: Framework for Developing Particle Simulators

    NASA Astrophysics Data System (ADS)

    Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-04-01

    FDPS provides the necessary functions for efficient parallel execution of particle-based simulations as templates independent of the data structure of particles and the functional form of the interaction. It is used to develop particle-based simulation programs for large-scale distributed-memory parallel supercomputers. FDPS includes templates for domain decomposition, redistribution of particles, and gathering of particle information for interaction calculation. It uses algorithms such as Barnes-Hut tree method for long-range interactions; methods to limit the calculation to neighbor particles are used for short-range interactions. FDPS reduces the time and effort necessary to write a simple, sequential and unoptimized program of O(N^2) calculation cost, and produces compiled programs that will run efficiently on large-scale parallel supercomputers.
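
FDPS's separation of traversal from interaction, templates that accept an arbitrary pairwise kernel, can be illustrated with a reference O(N^2) loop and an optional cutoff for short-range forces (a 1-D Python toy, not the actual C++ template API of FDPS):

```python
# The FDPS idea in miniature: the framework supplies the traversal,
# the user supplies the particle data and the pairwise interaction kernel.
def calc_force_all(particles, kernel, cutoff=None):
    """O(N^2) reference loop; FDPS swaps in tree/neighbor algorithms."""
    forces = [0.0] * len(particles)
    for i, xi in enumerate(particles):
        for j, xj in enumerate(particles):
            if i == j:
                continue
            if cutoff is not None and abs(xi - xj) > cutoff:
                continue  # short-range mode: only neighbors interact
            forces[i] += kernel(xi, xj)
    return forces

# 1-D "gravity"-like kernel: unit pull toward the other particle.
kernel = lambda xi, xj: 1.0 if xj > xi else -1.0
print(calc_force_all([0.0, 1.0, 2.0], kernel))              # [2.0, 0.0, -2.0]
print(calc_force_all([0.0, 1.0, 2.0], kernel, cutoff=1.5))  # [1.0, 0.0, -1.0]
```

FDPS replaces the inner loop with a Barnes-Hut tree for long-range kernels and a neighbor search for cutoff kernels, while the user-facing interface stays kernel-agnostic.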

  9. Software Framework for Advanced Power Plant Simulations

    SciTech Connect

    John Widmann; Sorin Munteanu; Aseem Jain; Pankaj Gupta; Mark Moales; Erik Ferguson; Lewis Collins; David Sloan; Woodrow Fiveland; Yi-dong Lang; Larry Biegler; Michael Locke; Simon Lingard; Jay Yun

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.

  10. Agent-based power sharing scheme for active hybrid power sources

    NASA Astrophysics Data System (ADS)

    Jiang, Zhenhua

    The active hybridization technique provides an effective approach to combining the best properties of a heterogeneous set of power sources to achieve higher energy density, power density and fuel efficiency. Active hybrid power sources can be used to power hybrid electric vehicles with selected combinations of internal combustion engines, fuel cells, batteries, and/or supercapacitors. They can be deployed in all-electric ships to build a distributed electric power system. They can also be used in a bulk power system to construct an autonomous distributed energy system. An important aspect in designing an active hybrid power source is to find a suitable control strategy that can manage the active power sharing and take advantage of the inherent scalability and robustness benefits of the hybrid system. This paper presents an agent-based power sharing scheme for active hybrid power sources. To demonstrate the effectiveness of the proposed agent-based power sharing scheme, simulation studies are performed for a hybrid power source that can be used in a solar car as the main propulsion power module. Simulation results clearly indicate that the agent-based control framework is effective to coordinate the various energy sources and manage the power/voltage profiles.

  11. A simulation framework for the CMS Track Trigger electronics

    NASA Astrophysics Data System (ADS)

    Amstutz, C.; Magazzù, G.; Weber, M.; Palla, F.

    2015-03-01

    A simulation framework has been developed to test and characterize algorithms, architectures and hardware implementations of the vastly complex CMS Track Trigger for the high luminosity upgrade of the CMS experiment at the Large Hadron Collider in Geneva. High-level SystemC models of all system components have been developed to simulate a portion of the track trigger. The simulation of the system components together with input data from physics simulations allows evaluating figures of merit, like delays or bandwidths, under realistic conditions. The use of SystemC for high-level modelling allows co-simulation with models developed in Hardware Description Languages, e.g. VHDL or Verilog. Therefore, the simulation framework can also be used as a test bench for digital modules developed for the final system.

  12. Linking MODFLOW with an agent-based land-use model to support decision making

    USGS Publications Warehouse

    Reeves, H.W.; Zellner, M.L.

    2010-01-01

The U.S. Geological Survey numerical groundwater flow model, MODFLOW, was integrated with an agent-based land-use model to yield a simulator for environmental planning studies. Ultimately, this integrated simulator will be used as a means to organize information, illustrate potential system responses, and facilitate communication within a participatory modeling framework. Initial results show the potential system response to different zoning policy scenarios in terms of the spatial patterns of development, which is referred to as urban form, and consequent impacts on groundwater levels. These results illustrate how the integrated simulator is capable of representing the complexity of the system. From a groundwater modeling perspective, the most important aspect of the integration is that the simulator generates stresses on the groundwater system within the simulation in contrast to the traditional approach that requires the user to specify the stresses through time. Copyright © 2010 The Author(s). Journal compilation © 2010 National Ground Water Association.
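
The key point, that stresses on the groundwater system are generated inside the simulation by land-use agents rather than prescribed by the user, can be sketched as a coupling loop (all names, thresholds, and coefficients below are hypothetical illustrations, not the MODFLOW or land-use model APIs):

```python
# Hypothetical coupling loop: each year the agent model decides development
# from current groundwater levels, and the groundwater step responds to the
# pumping stress those decisions generate.
def run_coupled(n_years, agents, head0):
    head = head0
    history = []
    for year in range(n_years):
        # 1. Land-use step: agents develop where water is still accessible.
        new_wells = sum(1 for a in agents if a["threshold"] < head)
        pumping = 0.5 * new_wells        # stress emerges from the simulation
        # 2. Groundwater step: heads decline with pumping, recover with recharge.
        head = head - pumping + 0.2      # 0.2 = assumed annual recharge
        history.append(round(head, 2))
    return history

agents = [{"threshold": 8.0}, {"threshold": 9.0}, {"threshold": 12.0}]
print(run_coupled(3, agents, head0=10.0))  # declining heads feed back on development
```

The feedback is the point: as heads fall, fewer agents develop, so the pumping stress itself responds to the groundwater state, which a user-specified stress schedule cannot capture.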

  13. A Simulation Framework for Exploring Socioecological Dynamics and Sustainability of Settlement Systems Under Stress in Ancient Mesopotamia and Beyond

    NASA Astrophysics Data System (ADS)

    Christiansen, J. H.; Altaweel, M. R.

    2007-12-01

    The presentation will describe an object-oriented, agent-based simulation framework being used to help answer longstanding questions regarding the development trajectories and sustainability of ancient Mesopotamian settlement systems. This multidisciplinary, multi-model framework supports explicit, fine-scale representations of the dynamics of key natural processes such as crop growth, hydrology, and weather, operating concurrently with social processes such as kinship-driven behaviors, farming and herding practices, social stratification, and economic and political activities carried out by social agents that represent individual persons, households, and larger-scale organizations. The framework has allowed us to explore the inherently coupled dynamics of modeled settlements and landscapes that are undergoing diverse social and environmental stresses, both acute and chronic, across multi-generational time spans. The simulation framework was originally used to address single-settlement scenarios, but has recently been extended to begin to address settlement system sustainability issues at sub-regional to regional scale, by introducing a number of new dynamic mechanisms, such as the activities of nomadic communities, that manifest themselves at these larger spatial scales. The framework is flexible and scalable and has broad applicability. It has, for example, recently been adapted to address agroeconomic sustainability of settlement systems in modern rural Thailand, testing the resilience and vulnerability of settled landscapes in the face of such perturbations as large-scale political interventions, global economic shifts, and climate change.

  14. A Simulation Framework for Virtual Prototyping of Robotic Exoskeletons.

    PubMed

    Agarwal, Priyanshu; Neptune, Richard R; Deshpande, Ashish D

    2016-06-01

A number of robotic exoskeletons are being developed to provide rehabilitation interventions for those with movement disabilities. We present a systematic framework that allows for virtual prototyping (i.e., design, control, and experimentation) of robotic exoskeletons. The framework merges computational musculoskeletal analyses with simulation-based design techniques which allows for exoskeleton design and control algorithm optimization. We introduce biomechanical, morphological, and controller measures to optimize the exoskeleton performance. A major advantage of the framework is that it provides a platform for carrying out hypothesis-driven virtual experiments to quantify device performance and rehabilitation progress. To illustrate the efficacy of the framework, we present a case study wherein the design and analysis of an index finger exoskeleton is carried out using the proposed framework. PMID:27018453

  16. Framework for Architecture Trade Study Using MBSE and Performance Simulation

    NASA Technical Reports Server (NTRS)

Ryan, Jessica; Sarkani, Shahram; Mazzuchi, Thomas

    2012-01-01

Increasing complexity in modern systems, as well as cost and schedule constraints, requires a new paradigm of systems engineering to fulfill stakeholder needs. Challenges facing efficient trade studies include poor tool interoperability, lack of simulation coordination (design parameters), and requirements flowdown. A recent trend toward Model-Based Systems Engineering (MBSE) includes flexible architecture definition, program documentation, requirements traceability, and systems engineering reuse. As a new domain, MBSE still lacks governing standards and commonly accepted frameworks. This paper proposes a framework for efficient architecture definition using MBSE in conjunction with domain-specific simulation to evaluate trade studies. A general framework is provided, followed by a specific example including a method for designing a trade study, defining candidate architectures, planning simulations to fulfill requirements, and finally a weighted decision analysis to optimize system objectives.

  17. Introduction of a Framework for Dynamic Knowledge Representation of the Control Structure of Transplant Immunology: Employing the Power of Abstraction with a Solid Organ Transplant Agent-Based Model

    PubMed Central

    An, Gary

    2015-01-01

Agent-based modeling has been used to characterize the nested control loops and non-linear dynamics associated with inflammatory and immune responses, particularly as a means of visualizing putative mechanistic hypotheses. This process is termed dynamic knowledge representation and serves a critical role in facilitating the ability to test and potentially falsify hypotheses in the current data- and hypothesis-rich biomedical research environment. Importantly, dynamic computational modeling aids in identifying useful abstractions, a fundamental scientific principle that pervades the physical sciences. Recognizing the critical scientific role of abstraction provides an intellectual and methodological counterweight to the tendency in biology to emphasize comprehensive description as the primary manifestation of biological knowledge. Transplant immunology represents yet another example of the challenge of identifying sufficient understanding of the inflammatory/immune response in order to develop and refine clinically effective interventions. Advances in immunosuppressive therapies have greatly improved solid organ transplant (SOT) outcomes, most notably by reducing and treating acute rejection. The end goal of these transplant immune strategies is to facilitate effective control of the balance between regulatory T cells and the effector/cytotoxic T-cell populations in order to generate, and ideally maintain, a tolerant phenotype. Characterizing the dynamics of immune cell populations and the interactive feedback loops that lead to graft rejection or tolerance is extremely challenging, but is necessary if rational modulation to induce transplant tolerance is to be accomplished. Herein is presented the solid organ transplant agent-based model (SOTABM) as an initial example of an agent-based model (ABM) that abstractly reproduces the cellular and molecular components of the immune response to SOT. Despite its abstract nature, the SOTABM is able to qualitatively reproduce acute

  18. Next Generation Simulation Framework for Robotic and Human Space Missions

    NASA Technical Reports Server (NTRS)

    Cameron, Jonathan M.; Balaram, J.; Jain, Abhinandan; Kuo, Calvin; Lim, Christopher; Myint, Steven

    2012-01-01

    The Dartslab team at NASA's Jet Propulsion Laboratory (JPL) has a long history of developing physics-based simulations based on the Darts/Dshell simulation framework that have been used to simulate many planetary robotic missions, such as the Cassini spacecraft and the rovers that are currently driving on Mars. Recent collaboration efforts between the Dartslab team at JPL and the Mission Operations Directorate (MOD) at NASA Johnson Space Center (JSC) have led to significant enhancements to the Dartslab DSENDS (Dynamics Simulator for Entry, Descent and Surface landing) software framework. The new version of DSENDS is now being used for new planetary mission simulations at JPL. JSC is using DSENDS as the foundation for a suite of software known as COMPASS (Core Operations, Mission Planning, and Analysis Spacecraft Simulation) that is the basis for their new human space mission simulations and analysis. In this paper, we will describe the collaborative process with the JPL Dartslab and the JSC MOD team that resulted in the redesign and enhancement of the DSENDS software. We will outline the improvements in DSENDS that simplify creation of new high-fidelity robotic/spacecraft simulations. We will illustrate how DSENDS simulations are assembled and show results from several mission simulations.

  19. A GPU-based framework for simulation of medical ultrasound

    NASA Astrophysics Data System (ADS)

    Kutter, Oliver; Karamalis, Athanasios; Wein, Wolfgang; Navab, Nassir

    2009-02-01

    Simulation of ultrasound (US) images from volumetric medical image data has been shown to be an important tool in medical image analysis. However, there is a trade-off between the accuracy of the simulation and its real-time performance. In this paper, we present a framework for acceleration of ultrasound simulation on the graphics processing unit (GPU) of commodity computer hardware. Our framework can accommodate ultrasound modeling with varying degrees of complexity. To demonstrate the flexibility of our proposed method, we have implemented several models of acoustic propagation through 3D volumes. We conducted multiple experiments to evaluate the performance of our method for its application in multi-modal image registration and training. The results demonstrate the high performance of the GPU-accelerated simulation, outperforming CPU implementations by up to two orders of magnitude, and encourage the investigation of even more realistic acoustic models.

  20. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    NASA Astrophysics Data System (ADS)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was known as CSAU [Boyack] (code scaling, applicability and uncertainty), used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice, and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace the underground testing. The emphasis on modeling and simulation necessitated attention on the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar, and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are

  1. Introducing FNCS: Framework for Network Co-Simulation

    SciTech Connect

    2014-10-23

    This video provides a basic overview of the PNNL Future Power Grid Initiative-developed Framework for Network Co-Simulation (FNCS). It discusses the increasing amounts of data coming from the power grid, and the need for a tool like FNCS that brings together data, transmission and distribution simulators. Included is a description of the FNCS architecture, and the advantages this new open source tool can bring to grid research and development efforts.

  2. Applications of Agent Based Approaches in Business (A Three Essay Dissertation)

    ERIC Educational Resources Information Center

    Prawesh, Shankar

    2013-01-01

    The goal of this dissertation is to investigate the enabling role that agent based simulation plays in business and policy. The aforementioned issue has been addressed in this dissertation through three distinct, but related essays. The first essay is a literature review of different research applications of agent based simulation in various…

  3. An example-based brain MRI simulation framework

    NASA Astrophysics Data System (ADS)

    He, Qing; Roy, Snehashis; Jog, Amod; Pham, Dzung L.

    2015-03-01

    The simulation of magnetic resonance (MR) images plays an important role in the validation of image analysis algorithms such as image segmentation, due to the lack of sufficient ground truth in real MR images. Previous work on MRI simulation has focused on explicitly modeling the MR image formation process. However, because of the overwhelming complexity of MR acquisition, these simulations must involve simplifications and approximations that can result in visually unrealistic simulated images. In this work, we describe an example-based simulation framework, which uses an "atlas" consisting of an MR image and its anatomical models derived from the hard segmentation. The relationships between the MR image intensities and its anatomical models are learned using a patch-based regression that implicitly models the physics of the MR image formation. Given the anatomical models of a new brain, a new MR image can be simulated using the learned regression. This approach has been extended to also simulate intensity inhomogeneity artifacts based on the statistical model of training data. Results show that the example-based MRI simulation method is capable of simulating different image contrasts and is robust to different choices of atlas. The simulated images resemble real MR images more than simulations produced by a physics-based model.
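
    The patch-based regression idea can be illustrated with a toy sketch. Everything below (the tiny 1-D patches, the brute-force nearest-neighbor lookup, the intensity values) is an illustrative stand-in for the paper's trained regression over 3-D patches, not its actual method:

```python
# Toy sketch of example-based intensity simulation: learn pairs of
# (anatomical-model patch -> MR intensity) from an "atlas", then predict
# intensities for a new subject by nearest-neighbor patch lookup.
def train(atlas_patches, atlas_intensities):
    """Store (patch, intensity) pairs observed in the atlas."""
    return list(zip(atlas_patches, atlas_intensities))

def predict(model, patch):
    """Return the intensity of the atlas patch closest to `patch`."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return min(model, key=lambda pair: dist(pair[0], patch))[1]

# Hypothetical atlas: three tissue-label patches and their MR intensities.
model = train([[0, 0, 1], [1, 1, 1], [0, 1, 1]], [12.0, 80.0, 45.0])
print(predict(model, [1, 1, 0]))   # closest atlas patch is [1, 1, 1] -> 80.0
```

A real implementation would replace the 1-NN lookup with a learned regression and iterate over every patch of the new subject's anatomical model to assemble the simulated image.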

  4. Adding ecosystem function to agent-based land use models

    PubMed Central

    Yadav, V.; Del Grosso, S.J.; Parton, W.J.; Malanson, G.P.

    2015-01-01

    The objective of this paper is to examine issues in the inclusion of simulations of ecosystem functions in agent-based models of land use decision-making. The reasons for incorporating these simulations include local interests in land fertility and global interests in carbon sequestration. Biogeochemical models are needed in order to calculate such fluxes. The Century model is described with particular attention to the land use choices that it can encompass. When Century is applied to a land use problem the combinatorial choices lead to a potentially unmanageable number of simulation runs. Century is also parameter-intensive. Three ways of including Century output in agent-based models, ranging from separately calculated look-up tables to agents running Century within the simulation, are presented. The latter may be most efficient, but it moves the computing costs to where they are most problematic. Concern for computing costs should not be a roadblock. PMID:26191077
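
    The first coupling option above, separately calculated look-up tables, can be sketched minimally. All names and flux values here are hypothetical placeholders; the point is only that agents read a pre-computed Century output keyed on their land-use choice instead of invoking the biogeochemical model per agent:

```python
# Hypothetical sketch of the look-up-table coupling option: Century is run
# offline for each (land_use, soil) combination, and agents read the cached
# annual carbon flux instead of running the model inside the simulation.
FLUX_TABLE = {  # illustrative values, kg C / ha / yr (negative = emission)
    ("crop",   "clay"): -120.0,
    ("crop",   "sand"): -180.0,
    ("forest", "clay"):  350.0,
    ("forest", "sand"):  290.0,
}

class LandAgent:
    def __init__(self, land_use, soil):
        self.land_use = land_use
        self.soil = soil

    def annual_carbon_flux(self):
        # A table lookup replaces a full Century run for this agent's parcel.
        return FLUX_TABLE[(self.land_use, self.soil)]

agents = [LandAgent("crop", "clay"), LandAgent("forest", "sand")]
total = sum(a.annual_carbon_flux() for a in agents)
```

The trade-off the abstract describes is visible even here: the table is cheap at run time but its size grows combinatorially with the land-use and parameter choices it must cover.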

  5. Agent-based model for the h-index - exact solution

    NASA Astrophysics Data System (ADS)

    Żogała-Siudem, Barbara; Siudem, Grzegorz; Cena, Anna; Gagolewski, Marek

    2016-01-01

    Hirsch's h-index is perhaps the most popular citation-based measure of scientific excellence. In 2013, Ionescu and Chopard proposed an agent-based model describing a process for generating publications and citations in an abstract scientific community [G. Ionescu, B. Chopard, Eur. Phys. J. B 86, 426 (2013)]. Within such a framework, one may simulate a scientist's activity, and - by extension - investigate the whole community of researchers. Even though the Ionescu and Chopard model predicts the h-index quite well, the authors provided a solution based solely on simulations. In this paper, we complete their results with exact, analytic formulas. What is more, by considering a simplified version of the Ionescu-Chopard model, we obtained a compact, easy-to-compute formula for the h-index. The derived approximate and exact solutions are investigated on simulated and real-world data sets.
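
    The quantity the model predicts is simple to state: the h-index is the largest h such that an author has h papers with at least h citations each. A direct computation (this is the standard definition, not the Ionescu-Chopard formula itself):

```python
def h_index(citations):
    """Largest h such that the author has h papers with >= h citations each."""
    cited = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cited, start=1):
        if c >= rank:       # the rank-th most-cited paper still has >= rank citations
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4 (four papers with >= 4 citations)
print(h_index([0, 0]))            # -> 0
```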

  6. A Multi-Paradigm Modeling Framework to Simulate Dynamic Reciprocity in a Bioreactor

    PubMed Central

    Kaul, Himanshu; Cui, Zhanfeng; Ventikos, Yiannis

    2013-01-01

    Despite numerous technology advances, bioreactors are still mostly utilized as functional black-boxes where trial and error eventually leads to the desirable cellular outcome. Investigators have applied various computational approaches to understand the impact the internal dynamics of such devices has on overall cell growth, but such models cannot provide a comprehensive perspective regarding the system dynamics, due to limitations inherent to the underlying approaches. In this study, a novel multi-paradigm modeling platform capable of simulating the dynamic bidirectional relationship between cells and their microenvironment is presented. Designing the modeling platform entailed fully combining and coupling an agent-based modeling platform with a transport-phenomena computational modeling framework. To demonstrate capability, the platform was used to study the impact of bioreactor parameters on the overall cell population behavior and vice versa. In order to achieve this, virtual bioreactors were constructed and seeded. The virtual cells, guided by a set of rules involving the simulated mass transport inside the bioreactor, as well as cell-related probabilistic parameters, were capable of displaying an array of behaviors such as proliferation, migration, chemotaxis and apoptosis. In this way the platform was shown to capture not only the impact of bioreactor transport processes on cellular behavior but also the influence that cellular activity wields on that very same local mass transport, thereby influencing overall cell growth. The platform was validated by simulating cellular chemotaxis in a virtual direct visualization chamber and comparing the simulation with its experimental analogue. The results presented in this paper are in agreement with published models of similar flavor. The modeling platform can be used as a concept selection tool to optimize bioreactor design specifications. PMID:23555740
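
    The bidirectional coupling pattern described above can be sketched as an alternating loop: a transport-solver step updates the environment from the cells' uptake, then an agent step updates the cells from the environment. The 1-D nutrient field, the starvation rule, and all parameters below are illustrative stand-ins for the platform's actual solver and rules:

```python
# Hedged sketch of ABM <-> transport coupling: each half-step reads the
# other's state, so cells shape the field that in turn governs the cells.
def transport_step(nutrient, consumption, dt=0.1):
    """Deplete each site's nutrient by the local uptake (trivial stand-in solver)."""
    return [max(0.0, c - u * dt) for c, u in zip(nutrient, consumption)]

def agent_step(cells, nutrient, threshold=0.2):
    """Cells consume where they sit; cells below the threshold starve."""
    alive = [i for i in cells if nutrient[i] >= threshold]
    uptake = [0.0] * len(nutrient)
    for i in alive:
        uptake[i] += 1.0
    return alive, uptake

nutrient = [1.0, 1.0, 0.1]   # three sites; the third is nutrient-poor
cells = [0, 1, 2]            # one cell per site initially
for _ in range(5):           # the bidirectional coupling loop
    cells, uptake = agent_step(cells, nutrient)
    nutrient = transport_step(nutrient, uptake)
```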

  7. In-situ Data Analysis Framework for ACME Land Simulations

    NASA Astrophysics Data System (ADS)

    Wang, D.; Yao, C.; Jia, Y.; Steed, C.; Atchley, S.

    2015-12-01

    The realistic representation of key biogeophysical and biogeochemical functions is fundamental to process-based ecosystem models. Investigating the behavior of those ecosystem functions within a real-time model simulation can be very challenging due to the complexity of both the model and the software structure of an environmental model, such as the Accelerated Climate Model for Energy (ACME) Land Model (ALM). In this research, the authors describe the urgent needs and challenges of in-situ data analysis for ALM simulations, and lay out methods and strategies to meet these challenges. Specifically, an in-situ data analysis framework is designed to allow users to interactively observe biogeophysical and biogeochemical processes during an ALM simulation. There are three key components in this framework: an automatically instrumented ecosystem simulation, in-situ data communication, and a large-scale data exploratory toolkit. This effort is developed by leveraging several active projects, including a scientific unit testing platform, a common communication interface, and an extreme-scale data exploratory toolkit. The authors believe that, based on advanced computing technologies such as compiler-based software system analysis, automatic code instrumentation, and in-memory data transport, this software system provides not only much-needed capability for real-time observation and in-situ data analytics for environmental model simulation, but also the potential for in-situ model behavior adjustment via simulation steering.

  8. Agent-based enterprise integration

    SciTech Connect

    N. M. Berry; C. M. Pancerella

    1998-12-01

    The authors are developing and deploying software agents in an enterprise information architecture such that the agents manage enterprise resources and facilitate user interaction with these resources. The enterprise agents are built on top of a robust software architecture for data exchange and tool integration across heterogeneous hardware and software. The resulting distributed multi-agent system serves as a method of enhancing enterprises in the following ways: providing users with knowledge about enterprise resources and applications; accessing the dynamically changing enterprise; locating enterprise applications and services; and improving search capabilities for applications and data. Furthermore, agents can access non-agents (i.e., databases and tools) through the enterprise framework. The ultimate target of the effort is the user: the authors are attempting to increase user productivity in the enterprise. This paper describes their design and early implementation and discusses the planned future work.

  9. Modeling the Population Dynamics of Antibiotic-Resistant Bacteria:. AN Agent-Based Approach

    NASA Astrophysics Data System (ADS)

    Murphy, James T.; Walshe, Ray; Devocelle, Marc

    The response of bacterial populations to antibiotic treatment is often a function of a diverse range of interacting factors. In order to develop strategies to minimize the spread of antibiotic resistance in pathogenic bacteria, a sound theoretical understanding of the systems of interactions taking place within a colony must be developed. The agent-based approach to modeling bacterial populations is a useful tool for relating data obtained at the molecular and cellular level with the overall population dynamics. Here we demonstrate an agent-based model, called Micro-Gen, which has been developed to simulate the growth and development of bacterial colonies in culture. The model also incorporates biochemical rules and parameters describing the kinetic interactions of bacterial cells with antibiotic molecules. Simulations were carried out to replicate the development of methicillin-resistant S. aureus (MRSA) colonies growing in the presence of antibiotics. The model was explored to see how the properties of the system emerge from the interactions of the individual bacterial agents in order to achieve a better mechanistic understanding of the population dynamics taking place. Micro-Gen provides a good theoretical framework for investigating the effects of local environmental conditions and cellular properties on the response of bacterial populations to antibiotic exposure in the context of a simulated environment.
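
    As a sketch of the kind of kinetic rule such a model couples to each cell, consider enzymatic degradation of an antibiotic following Michaelis-Menten kinetics. The rate form is a common choice for beta-lactamase-mediated resistance, but the function and all parameter values below are illustrative assumptions, not Micro-Gen's actual rules or numbers:

```python
# Hedged sketch: one resistant cell degrades the antibiotic in its local
# environment by Michaelis-Menten kinetics, one time step at a time.
def degrade(antibiotic, v_max=0.5, k_m=2.0, dt=0.1):
    """Antibiotic concentration after one dt of enzymatic degradation."""
    rate = v_max * antibiotic / (k_m + antibiotic)  # MM rate law
    return max(0.0, antibiotic - rate * dt)

conc = 5.0                 # initial local concentration (arbitrary units)
for _ in range(10):        # ten time steps around one resistant cell
    conc = degrade(conc)
```

In an agent-based setting this update runs per cell per time step, so the collective effect of many resistant agents on the shared antibiotic field emerges from the individual rules, which is the mechanism the abstract describes.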

  10. An Agent-Based Dynamic Model for Analysis of Distributed Space Exploration Architectures

    NASA Astrophysics Data System (ADS)

    Sindiy, Oleg V.; DeLaurentis, Daniel A.; Stein, William B.

    2009-07-01

    A range of complex challenges, but also potentially unique rewards, underlie the development of exploration architectures that use a distributed, dynamic network of resources across the solar system. From a methodological perspective, the prime challenge is to systematically model the evolution (and quantify comparative performance) of such architectures, under uncertainty, to effectively direct further study of specialized trajectories, spacecraft technologies, concept of operations, and resource allocation. A process model for System-of-Systems Engineering is used to define time-varying performance measures for comparative architecture analysis and identification of distinguishing patterns among interoperating systems. Agent-based modeling serves as the means to create a discrete-time simulation that generates dynamics for the study of architecture evolution. A Solar System Mobility Network proof-of-concept problem is introduced representing a set of longer-term, distributed exploration architectures. Options within this set revolve around deployment of human and robotic exploration and infrastructure assets, their organization, interoperability, and evolution, i.e., a system-of-systems. Agent-based simulations quantify relative payoffs for a fully distributed architecture (which can be significant over the long term), the latency period before they are manifest, and the up-front investment (which can be substantial compared to alternatives). Verification and sensitivity results provide further insight on development paths and indicate that the framework and simulation modeling approach may be useful in architectural design of other space exploration mass, energy, and information exchange settings.

  11. Who's your neighbor? neighbor identification for agent-based modeling.

    SciTech Connect

    Macal, C. M.; Howe, T. R.; Decision and Information Sciences; Univ. of Chicago

    2006-01-01

    Agent-based modeling and simulation, based on the cellular automata paradigm, is an approach to modeling complex systems comprised of interacting autonomous agents. Open questions in agent-based simulation focus on scale-up issues encountered in simulating large numbers of agents. Specifically, how many agents can be included in a workable agent-based simulation? One of the basic tenets of agent-based modeling and simulation is that agents only interact and exchange locally available information with other agents located in their immediate proximity or neighborhood of the space in which the agents are situated. Generally, an agent's set of neighbors changes rapidly as a simulation proceeds through time and as the agents move through space. Depending on the topology defined for agent interactions, proximity may be defined by spatial distance for continuous space, adjacency for grid cells (as in cellular automata), or by connectivity in social networks. Identifying an agent's neighbors is a particularly time-consuming computational task and can dominate the computational effort in a simulation. Two challenges in agent simulation are (1) efficiently representing an agent's neighborhood and the neighbors in it and (2) efficiently identifying an agent's neighbors at any time in the simulation. These problems are addressed differently for different agent interaction topologies. While efficient approaches have been identified for agent neighborhood representation and neighbor identification for agents on a lattice with general neighborhood configurations, other techniques must be used when agents are able to move freely in space. Techniques for the analysis and representation of spatial data are applicable to the agent neighbor identification problem. This paper extends agent neighborhood simulation techniques from the lattice topology to continuous space, specifically R^2. Algorithms based on hierarchical (quad trees) or non-hierarchical data structures (grid cells) are presented.
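
    The non-hierarchical grid-cell approach can be sketched directly: bin agents into square cells, then answer a neighbor query by checking only the 3x3 block of cells around the querying agent. This is a generic spatial-hashing sketch (cell size must be at least the interaction radius for the 3x3 search to be exhaustive), not the paper's specific algorithm:

```python
from collections import defaultdict
from math import hypot

def build_grid(positions, cell):
    """Bin agent indices into square grid cells of side `cell`."""
    grid = defaultdict(list)
    for i, (x, y) in enumerate(positions):
        grid[(int(x // cell), int(y // cell))].append(i)
    return grid

def neighbors(i, positions, grid, cell, radius):
    """Agents within `radius` of agent i, checking only the 3x3 cell block."""
    x, y = positions[i]
    cx, cy = int(x // cell), int(y // cell)
    found = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for j in grid.get((cx + dx, cy + dy), []):
                if j != i and hypot(positions[j][0] - x, positions[j][1] - y) <= radius:
                    found.append(j)
    return found

pos = [(0.5, 0.5), (1.2, 0.4), (5.0, 5.0)]
grid = build_grid(pos, cell=1.0)
print(neighbors(0, pos, grid, cell=1.0, radius=1.0))  # -> [1]
```

Rebuilding (or incrementally updating) the grid each time step keeps queries near O(1) per agent, which is why this structure competes with quad trees when agent density is roughly uniform.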

  12. Symphony: A Framework for Accurate and Holistic WSN Simulation

    PubMed Central

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144
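
    A clock emulator with skew models, as mentioned above, can be reduced to a minimal sketch: each virtual node maps true time t to its local clock as offset + (1 + skew) * t. This linear model and the class below are illustrative assumptions; the framework's actual drift models may be richer:

```python
# Hedged sketch of a linear clock-skew model for a simulated WSN node.
class SkewedClock:
    def __init__(self, offset=0.0, skew=0.0):
        self.offset = offset   # initial phase error, seconds
        self.skew = skew       # fractional frequency error (e.g. 50e-6 = 50 ppm)

    def local_time(self, t):
        """Local clock reading at true time t."""
        return self.offset + (1.0 + self.skew) * t

fast = SkewedClock(offset=0.001, skew=50e-6)   # 50 ppm fast, starts 1 ms ahead
print(fast.local_time(10.0) - 10.0)            # accumulated error after 10 s
```

Even this toy model is enough to reproduce the boundary effects the abstract alludes to: two nodes with different skews disagree more the longer the simulation runs, which stresses timeout- and synchronization-dependent protocol code.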

  14. Etomica: an object-oriented framework for molecular simulation.

    PubMed

    Schultz, Andrew J; Kofke, David A

    2015-03-30

    We describe the design of an object-oriented library of software components that are suitable for constructing simulations of systems of interacting particles. The emphasis of the discussion is on the general design of the components and how they interact, and less on details of the programming interface or its implementation. Example code is provided as an aid to understanding object-oriented programming structures and to demonstrate how the framework is applied.

  16. Velo: A Knowledge Management Framework for Modeling and Simulation

    SciTech Connect

    Gorton, Ian; Sivaramakrishnan, Chandrika; Black, Gary D.; White, Signe K.; Purohit, Sumit; Lansing, Carina S.; Madison, Michael C.; Schuchardt, Karen L.; Liu, Yan

    2012-03-01

    Modern scientific enterprises are inherently knowledge-intensive. Scientific studies in domains such as geosciences, climate, and biology require the acquisition and manipulation of large amounts of experimental and field data to create inputs for large-scale computational simulations. The results of these simulations are then analyzed, leading to refinements of inputs and models and additional simulations. The results of this process must be managed and archived to provide justifications for regulatory decisions and publications that are based on the models. In this paper we introduce our Velo framework that is designed as a reusable, domain-independent knowledge management infrastructure for modeling and simulation. Velo leverages, integrates and extends open source collaborative and content management technologies to create a scalable and flexible core platform that can be tailored to specific scientific domains. We describe the architecture of Velo for managing and associating the various types of data that are used and created in modeling and simulation projects, as well as the framework for integrating domain-specific tools. To demonstrate realizations of Velo, we describe examples from two deployed sites for carbon sequestration and climate modeling. These provide concrete examples of the inherent extensibility and utility of our approach.

  17. An agent based model of genotype editing

    SciTech Connect

    Rocha, L. M.; Huang, C. F.

    2004-01-01

    This paper presents our investigation of an agent-based model of Genotype Editing. This model is based on several characteristics that are gleaned from the RNA editing system as observed in several organisms. The incorporation of editing mechanisms in an evolutionary agent-based model provides a means for evolving agents with heterogeneous post-transcriptional processes. The study of this agent-based genotype-editing model has shed some light on the evolutionary implications of RNA editing as well as established an advantageous evolutionary computation algorithm for machine learning. We expect that our proposed model may both facilitate determining the evolutionary role of RNA editing in biology, and advance the current state of research in agent-based optimization.

  18. EIC detector simulations in FairRoot framework

    NASA Astrophysics Data System (ADS)

    Kiselev, Alexander; eRHIC task force Team

    2013-10-01

    The long-term RHIC facility upgrade plan foresees the addition of a high-energy electron beam to the existing hadron accelerator complex thus converting RHIC into an Electron-Ion Collider (eRHIC). A dedicated EIC detector, designed to efficiently register and identify deep inelastic electron scattering (DIS) processes in a wide range of center-of-mass energies is one of the key elements of this upgrade. Detailed Monte-Carlo studies are needed to optimize EIC detector components and to fine tune their design. The simulation package foreseen for this purpose (EicRoot) is based on the FairRoot framework developed and maintained at the GSI. A feature of this framework is its level of flexibility, allowing one to switch easily between different geometry (ROOT, GEANT) and transport (GEANT3, GEANT4, FLUKA) models. Apart from providing a convenient simulation environment the framework includes basic tools for visualization and allows for easy sharing of event reconstruction codes between higher level experiment-specific applications. The description of the main EicRoot features and first simulation results will be the main focus of the talk.

  19. A new framework for simulating forced homogeneous buoyant turbulent flows

    NASA Astrophysics Data System (ADS)

    Carroll, Phares L.; Blanquart, Guillaume

    2015-06-01

    This work proposes a new simulation methodology to study variable density turbulent buoyant flows. The mathematical framework, referred to as homogeneous buoyant turbulence, relies on a triply periodic domain and incorporates numerical forcing methods commonly used in simulation studies of homogeneous, isotropic flows. In order to separate the effects due to buoyancy from those due to large-scale gradients, the linear scalar forcing technique is used to maintain the scalar variance at a constant value. Two sources of kinetic energy production are considered in the momentum equation, namely shear via an isotropic forcing term and buoyancy via the gravity term. The simulation framework is designed such that the four dimensionless parameters of importance in buoyant mixing, namely the Reynolds, Richardson, Atwood, and Schmidt numbers, can be independently varied and controlled. The framework is used to interrogate fully non-buoyant, fully buoyant, and partially buoyant turbulent flows. The results show that the statistics of the scalar fields (mixture fraction and density) are not influenced by the energy production mechanism (shear vs. buoyancy). On the other hand, the velocity field exhibits anisotropy, namely a larger variance in the direction of gravity which is associated with a statistical dependence of the velocity component on the local fluid density.

  20. A hybrid parallel framework for the cellular Potts model simulations

    SciTech Connect

    Jiang, Yi; He, Kejing; Dong, Shoubin

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximated, and cannot be used for large-scale, complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for large-scale simulation (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).
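
    The Monte Carlo lattice update at the heart of the CPM can be sketched serially. This is a generic, contact-energy-only Metropolis step on a small periodic lattice (a full CPM also carries volume and surface constraint terms, and the paper's contribution is parallelizing many such updates with OpenMP); all parameter values are illustrative:

```python
import math
import random

J = 1.0   # contact energy between unlike cell IDs (illustrative value)
T = 1.0   # simulation temperature (illustrative value)
NEIGH = ((1, 0), (-1, 0), (0, 1), (0, -1))

def local_energy(lattice, x, y):
    """Contact energy of site (x, y) with its four neighbors, periodic box."""
    n, s = len(lattice), lattice[x][y]
    return sum(J for dx, dy in NEIGH
               if lattice[(x + dx) % n][(y + dy) % n] != s)

def mc_step(lattice, rng):
    """One Metropolis step: try copying a random neighbor's cell ID."""
    n = len(lattice)
    x, y = rng.randrange(n), rng.randrange(n)
    dx, dy = rng.choice(NEIGH)
    candidate = lattice[(x + dx) % n][(y + dy) % n]
    if candidate == lattice[x][y]:
        return                       # copying an identical ID changes nothing
    old = lattice[x][y]
    e_before = local_energy(lattice, x, y)
    lattice[x][y] = candidate        # tentatively copy the neighbor's ID
    d_e = local_energy(lattice, x, y) - e_before
    if d_e > 0 and rng.random() >= math.exp(-d_e / T):
        lattice[x][y] = old          # Metropolis rejection

rng = random.Random(42)
lattice = [[0, 0, 1, 1] for _ in range(4)]   # two "cells" on a 4x4 box
for _ in range(200):
    mc_step(lattice, rng)
```

Because each flip touches only a site and its neighbors, independent flips on well-separated sites can proceed concurrently, which is what makes the shared-memory OpenMP parallelization of this update effective.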

  1. Framework Application for Core Edge Transport Simulation (FACETS)

    SciTech Connect

    Krasheninnikov, Sergei; Pigarov, Alexander

    2011-10-15

    The FACETS (Framework Application for Core-Edge Transport Simulations) project of the Scientific Discovery through Advanced Computing (SciDAC) Program was aimed at providing high-fidelity whole-tokamak modeling for the U.S. magnetic fusion energy program and ITER through coupling separate components for each of the core region, edge region, and wall, with realistic plasma particle and power sources and turbulent transport simulation. The project also aimed at developing advanced numerical algorithms, efficient implicit coupling methods, and software tools utilizing the leadership-class computing facilities under Advanced Scientific Computing Research (ASCR). The FACETS project was conducted by a multi-discipline, multi-institutional team; the Lead PI was J.R. Cary (Tech-X Corp.). In the FACETS project, the Applied Plasma Theory Group at the MAE Department of UCSD developed the Wall and Plasma-Surface Interaction (WALLPSI) module, performed its validation against experimental data, and integrated it into the developed framework. WALLPSI is a one-dimensional, coarse-grained, reaction/advection/diffusion code applied to each material boundary cell in the common modeling domain for a tokamak. It incorporates an advanced model for plasma particle transport and retention in the solid matter of plasma-facing components, simulation of plasma heat power load handling, calculation of erosion/deposition, and simulation of synergistic effects in strong plasma-wall coupling.

  2. Modeling Interdependencies between power and economic sectors using the N-ABLE agent-based model.

    SciTech Connect

    Ehlen, Mark Andrew; Scholand, Andrew Joseph

    2005-01-01

    The nation's electric power sector is highly interdependent with the economic sectors it serves; electric power needs are driven by economic activity while the economy itself depends on reliable and sustainable electric power. To advance higher level understandings of the vulnerabilities that result from these interdependencies and to identify the loss prevention and loss mitigation policies that best serve the nation, the National Infrastructure Simulation and Analysis Center is developing and using N-ABLE™, an agent-based microeconomic framework and simulation tool that models these interdependencies at the level of collections of individual economic firms. Current projects that capture components of these electric power and economic sector interdependencies illustrate some of the public policy issues that should be addressed for combined power sector reliability and national economic security.

  3. Health care supply networks in tightly and loosely coupled structures: exploration using agent-based modelling

    NASA Astrophysics Data System (ADS)

    Kanagarajah, A.; Parker, D.; Xu, H.

    2010-03-01

    Health care supply networks are multi-faceted complex structures. This article discusses the architecture of complex systems and presents an agent-based modelling framework to study health care supply networks and their impact on patient safety, economics, and workloads. We demonstrate the application of the safety dynamics model proposed by Cook and Rasmussen (2005, '"Going Solid": A Model of System Dynamics and Consequences for Patient Safety', Quality & Safety in Health Care, 14, 67-84) to study a health care system, using a hypothetical simulation of an emergency department as a representative unit and examining its dynamic behaviour. By means of simulation, this article demonstrates the non-linear behaviours and complexities of a health service unit, and shows how the safety dynamics model may be used to evaluate various policy and design aspects of health care supply networks.

  4. The PandaRoot framework for simulation, reconstruction and analysis

    NASA Astrophysics Data System (ADS)

    Spataro, Stefano; PANDA Collaboration

    2011-12-01

    The PANDA experiment at the future facility FAIR will study anti-proton proton and anti-proton nucleus collisions in a beam momentum range from 2 GeV/c up to 15 GeV/c. The PandaRoot framework is part of the FairRoot project, a common software framework for the future FAIR experiments, and is currently used to simulate detector performance and to evaluate different detector concepts. It is based on the packages ROOT and Virtual Monte Carlo with Geant3 and Geant4. Different reconstruction algorithms for tracking and particle identification are under development and optimization in order to achieve the performance requirements of the experiment. In the central tracker, a first track fit is performed using a conformal map transformation based on a helix assumption; the track is then used as input for a Kalman filter (package genfit), using GEANE as track follower. The track is then correlated with the PID detectors (e.g. Cerenkov detectors, EM calorimeter, or muon chambers) to evaluate a global particle identification probability, using a Bayesian approach or multivariate methods. Further packages implemented in PandaRoot are the analysis tools framework Rho, a kinematic fitter package for vertex and mass-constraint fits, and a fast simulation code based upon parametrized detector responses. PandaRoot was also tested on an Alien-based GRID infrastructure. The contribution reports on the status of PandaRoot and shows example results for the analysis of physics benchmark channels.
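
The Bayesian combination of PID information across detectors mentioned above can be sketched as a likelihood product over particle hypotheses; the detector names, likelihood values, and flat priors below are invented for illustration and are not PandaRoot's interface.

```python
# Sketch of Bayesian PID combination (illustrative values only):
# per-detector likelihoods for each particle hypothesis are multiplied
# with the priors and normalized over hypotheses.

def combine_pid(likelihoods_by_detector, priors):
    posterior = {}
    for h, prior in priors.items():
        p = prior
        for det in likelihoods_by_detector:
            p *= det[h]
        posterior[h] = p
    norm = sum(posterior.values())
    return {h: p / norm for h, p in posterior.items()}

dirc = {"pion": 0.2, "kaon": 0.7, "proton": 0.1}   # Cherenkov-like detector
emc = {"pion": 0.5, "kaon": 0.4, "proton": 0.1}    # calorimeter-like
flat_priors = {"pion": 1 / 3, "kaon": 1 / 3, "proton": 1 / 3}
pid = combine_pid([dirc, emc], flat_priors)
```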

  5. The Framework for Approximate Queries on Simulation Data

    SciTech Connect

    Abdulla, G; Baldwin, C; Critchlow, T; Kamimura, R; Lee, B; Musick, R; Snapp, R; Tang, N

    2001-09-27

    AQSim is a system intended to enable scientists to query and analyze large volumes of scientific simulation data. The system uses state-of-the-art approximate query processing techniques to build a novel framework for progressive data analysis. These techniques are used to define a multi-resolution index, where each node contains multiple models of the data. The benefits of these models are two-fold: (1) they are compact representations that reconstruct only the information relevant to the analysis, and (2) the variety of models captures different aspects of the data which may be of interest to the user but are not readily apparent in raw form. To support interactive analysis, AQSim allows the scientist to make an informed tradeoff between query response accuracy and time. In this paper, we present the framework of AQSim with a focus on its architectural design. We also show results from an initial proof-of-concept prototype developed at LLNL. The presented framework is generic enough to handle more than just simulation data.
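
The node-model idea can be illustrated with a toy multi-resolution index: each node stores a compact model (count, mean, min, max) of its data range, and an approximate query descends only while a node's spread exceeds a tolerance, trading accuracy for response time. This is a minimal sketch of the concept, not AQSim's implementation.

```python
# Toy multi-resolution index in the spirit of AQSim (illustrative).

class Node:
    def __init__(self, data, lo, hi):
        chunk = data[lo:hi]
        self.count = len(chunk)
        self.mean = sum(chunk) / self.count
        self.min, self.max = min(chunk), max(chunk)
        self.left = self.right = None
        if self.count > 2:
            mid = (lo + hi) // 2
            self.left = Node(data, lo, mid)
            self.right = Node(data, mid, hi)

def frac_above(node, thresh, tol):
    """Approximate fraction of values above thresh. When a node's
    spread is within tol (or the node is a leaf), estimate from the
    model assuming values uniform on [min, max]; else descend."""
    if node.left is None or (node.max - node.min) <= tol:
        if node.max == node.min:
            return 1.0 if node.min > thresh else 0.0
        u = (node.max - thresh) / (node.max - node.min)
        return min(1.0, max(0.0, u))
    w = node.left.count / node.count
    return (w * frac_above(node.left, thresh, tol)
            + (1 - w) * frac_above(node.right, thresh, tol))

data = [float(i % 7) for i in range(64)]
root = Node(data, 0, len(data))
coarse = frac_above(root, 3.5, tol=10.0)  # answered from the root model
fine = frac_above(root, 3.5, tol=0.0)     # descends toward the leaves
```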

  6. Seawater Pervaporation through Zeolitic Imidazolate Framework Membranes: Atomistic Simulation Study.

    PubMed

    Gupta, Krishna M; Qiao, Zhiwei; Zhang, Kang; Jiang, Jianwen

    2016-06-01

    An atomistic simulation study is reported for seawater pervaporation through five zeolitic imidazolate framework (ZIF) membranes: ZIF-8, -93, -95, -97, and -100. Salt rejection in the five ZIFs is predicted to be 100%. With the largest aperture, ZIF-100 possesses the highest water permeability, 5 × 10⁻⁴ kg·m/(m²·h·bar), substantially higher than that of commercial reverse osmosis membranes as well as zeolite and graphene oxide pervaporation membranes. In ZIF-8, -93, -95, and -97, which have similar aperture sizes, water flux is governed by framework hydrophobicity/hydrophilicity: in hydrophobic ZIF-8 and -95, water flux is higher than in hydrophilic ZIF-93 and -97. Furthermore, water molecules in ZIF-93 move slowly and remain in the membrane for a long time, but undergo to-and-fro motion in ZIF-100. The lifetime of hydrogen bonds in ZIF-93 is found to be longer than in ZIF-100. This simulation study quantitatively elucidates the dynamic and structural properties of water in ZIF membranes, identifies the key governing factors (aperture size and framework hydrophobicity/hydrophilicity), and suggests that ZIF-100 is an intriguing membrane for seawater pervaporation. PMID:27195441

  7. A framework for control simulations using the TRANSP code

    NASA Astrophysics Data System (ADS)

    Boyer, Mark D.; Andre, Rob; Gates, David; Gerhardt, Stefan; Goumiri, Imene; Menard, Jon

    2014-10-01

    The high-performance operational goals of present-day and future tokamaks will require development of advanced feedback control algorithms. Though reduced models are often used for initial designs, it is important to study the performance of control schemes with integrated models prior to experimental implementation. To this end, a flexible framework for closed loop simulations within the TRANSP code is being developed. The framework exploits many of the predictive capabilities of TRANSP and provides a means for performing control calculations based on user-supplied data (controller matrices, target waveforms, etc.). These calculations, along with the acquisition of ``real-time'' measurements and manipulation of TRANSP internal variables based on actuator requests, are implemented through a hook that allows custom run-specific code to be inserted into the standard TRANSP source code. As part of the framework, a module has been created to constrain the thermal stored energy in TRANSP using a confinement scaling expression. Progress towards feedback control of the current profile on NSTX-U will be presented to demonstrate the framework. Supported in part by an appointment to the U.S. Department of Energy Fusion Energy Postdoctoral Research Program administered by the Oak Ridge Institute for Science and Education.
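
The stored-energy control idea can be illustrated with a toy closed loop: a proportional controller adjusts heating power each step so the thermal stored energy, given by an assumed power-degradation confinement scaling, tracks a target value. The gain, scaling constants, and function names are invented; this is not TRANSP code.

```python
# Toy closed-loop "run" illustrating the framework's idea (not TRANSP).

def tau_E(P, C=0.5, alpha=-0.3):
    """Confinement time from an assumed scaling; C and alpha are
    invented example values."""
    return C * P ** alpha

def run_closed_loop(W_target, P0=1.0, gain=2.0, steps=50):
    P = P0
    history = []
    for _ in range(steps):
        W = tau_E(P) * P                          # "real-time" measurement
        P = max(0.1, P + gain * (W_target - P * tau_E(P)))  # actuator request
        history.append(W)
    return history

W_history = run_closed_loop(W_target=0.8)
```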

  8. A modeling and simulation framework for electrokinetic nanoparticle treatment

    NASA Astrophysics Data System (ADS)

    Phillips, James

    2011-12-01

    The focus of this research is to model and provide a simulation framework for the packing of differently sized spheres within a hard boundary. The novel contributions of this dissertation are the cylinders of influence (COI) method and the sectoring method implementations. The impetus for this research stems from modeling electrokinetic nanoparticle (EN) treatment, which packs concrete pores with differently sized nanoparticles. We show an improved simulation speed compared to previously published results of EN treatment simulation while obtaining similar porosity reduction results. We mainly focused on readily available commercial particle sizes of 2 nm and 20 nm, but have the capability to model other sizes. Our simulation has graphical capabilities and can provide additional data unobtainable from physical experimentation. The data collected have a median of 0.5750 and a mean of 0.5504. The standard error is 0.0054 at alpha = 0.05, for a 95% confidence interval of 0.5504 +/- 0.0054. The simulation has produced maximum packing densities of 65% and minimum packing densities of 34%. Simulation data are analyzed using linear regression via the R statistical language to obtain two equations: one that describes porosity reduction based on all cylinder and particle characteristics, and another that describes porosity reduction based on cylinder diameter for 2 and 20 nm particles in pores of 100 nm height. Simulation results are similar to most physical results obtained from MIP and WLR. Some MIP results do not fall within the simulation limits; however, this is expected, as MIP has been documented to be an inaccurate measure of the pore distribution and porosity of concrete. Despite the disagreement between WLR and MIP, there is a trend that porosity reduction is higher two inches from the rebar as compared to the rebar-concrete interface. The simulation also detects a higher porosity reduction further from the rebar. This may be due to particles
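
The packing concept can be sketched with plain random sequential addition of two sphere sizes into a cylindrical pore. This toy version omits the dissertation's COI and sectoring accelerations, and all dimensions and attempt counts are illustrative.

```python
import math, random

# Toy random-sequential-addition sketch of differently sized sphere
# packing in a cylindrical pore (overlapping placements are rejected).

def try_pack(radius, n_attempts, placed, R=50.0, H=100.0, seed=0):
    rng = random.Random(seed)
    for _ in range(n_attempts):
        # sample a center that keeps the sphere inside the cylinder
        r = (R - radius) * math.sqrt(rng.random())
        theta = 2.0 * math.pi * rng.random()
        x, y = r * math.cos(theta), r * math.sin(theta)
        z = radius + (H - 2.0 * radius) * rng.random()
        if all((x - px) ** 2 + (y - py) ** 2 + (z - pz) ** 2
               >= (radius + pr) ** 2 for px, py, pz, pr in placed):
            placed.append((x, y, z, radius))

placed = []
try_pack(10.0, 2000, placed, seed=1)  # "20 nm" particles (radius 10 nm)
try_pack(1.0, 2000, placed, seed=2)   # "2 nm" particles fill remaining gaps
pore_volume = math.pi * 50.0 ** 2 * 100.0
solid_volume = sum(4.0 / 3.0 * math.pi * r ** 3 for _, _, _, r in placed)
packing_density = solid_volume / pore_volume
```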

  9. A framework of modeling detector systems for computed tomography simulations

    NASA Astrophysics Data System (ADS)

    Youn, H.; Kim, D.; Kim, S. H.; Kam, S.; Jeon, H.; Nam, J.; Kim, H. K.

    2016-01-01

    The ultimate development in computed tomography (CT) technology may be a system that can provide images with excellent lesion conspicuity at a patient dose as low as possible. Imaging simulation tools have been cost-effectively used for these developments and will continue to be. For a more accurate and realistic imaging simulation, the signal and noise propagation through a CT detector system has been modeled in this study using cascaded linear-systems theory. The simulation results are validated by comparison with measurements from a laboratory flat-panel micro-CT system. Although the image noise obtained from the simulations at higher exposures is slightly smaller than that obtained from the measurements, the difference between them is reasonably acceptable. According to the simulation results for various exposure levels and additive electronic noise levels, x-ray quantum noise is more dominant than the additive electronic noise. The framework for modeling a CT detector system suggested in this study will be helpful for the development of an accurate and realistic projection simulation model.
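
The cascaded linear-systems propagation can be sketched for a zero-frequency (gain-only) cascade: the mean and variance pass through each stochastic gain stage by the standard moment-transfer relations, with electronic noise added at the end. All stage values below are assumed examples, not the paper's detector parameters.

```python
# Gain-stage sketch of cascaded linear-systems noise propagation.
# A stochastic gain stage with mean gain g and gain variance var_g maps
# mean q and variance var as:
#   q_out = g * q,   var_out = g**2 * var + var_g * q

def gain_stage(q, var, g, var_g):
    return g * q, g * g * var + var_g * q

q0 = 10000.0                  # incident x-ray quanta (Poisson: var = mean)
q, var = q0, q0
# quantum detection efficiency 0.6 (binomial gain: var_g = g * (1 - g))
q, var = gain_stage(q, var, 0.6, 0.6 * 0.4)
# conversion to secondary quanta, mean gain 500 with Poisson gain spread
q, var = gain_stage(q, var, 500.0, 500.0)
# additive electronic noise enters at the end of the cascade
var += 50.0 ** 2
snr = q / var ** 0.5
```

With these example numbers the electronic-noise term (2.5 × 10³) is orders of magnitude below the quantum term (≈1.5 × 10⁹), echoing the abstract's observation that quantum noise dominates.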

  10. An Agent-Based Model of Farmer Decision Making in Jordan

    NASA Astrophysics Data System (ADS)

    Selby, Philip; Medellin-Azuara, Josue; Harou, Julien; Klassert, Christian; Yoon, Jim

    2016-04-01

    We describe an agent-based hydro-economic model of groundwater-irrigated agriculture in the Jordan Highlands. The model employs a Multi-Agent Simulation (MAS) framework and is designed to evaluate direct and indirect outcomes of climate change scenarios and policy interventions on farmer decision making, including annual land use, groundwater use for irrigation, and water sales to a water tanker market. Land use and water use decisions are simulated for farms grouped by location and by behavioural and economic similarity. Decreasing groundwater levels, and the associated increase in pumping costs, are important drivers for change within Jordan's agricultural sector. We describe how this is considered by coupling the agricultural and groundwater models. The agricultural production model employs Positive Mathematical Programming (PMP), a method for calibrating agricultural production functions to observed planted areas. PMP has been used successfully with disaggregate models for policy analysis. We adapt the PMP approach to allow explicit evaluation of the impact of pumping costs, groundwater purchase fees, and a water tanker market. The work demonstrates the applicability of agent-based assessment of agricultural decision making in the Jordan Highlands and its integration with agricultural model calibration methods. The proposed approach is designed and implemented in software such that it could be used to evaluate a variety of physical and human influences on decision making in agricultural water management.
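
The PMP calibration step can be sketched as follows: a quadratic cost coefficient is chosen per crop so the first-order condition holds exactly at the observed planted area, making the base model reproduce observations; a policy shock (here an assumed extra pumping cost per hectare) then shifts the optimum. All prices, costs, and areas are invented example values.

```python
# Minimal PMP-style calibration sketch (illustrative numbers only).

def calibrate(prices, lin_costs, obs_areas):
    # FOC of p*x - c*x - 0.5*d*x**2 at the observed area:
    #   p - c - d * area = 0   =>   d = (p - c) / area
    return [(p - c) / a for p, c, a in zip(prices, lin_costs, obs_areas)]

def optimal_areas(prices, lin_costs, d, water_cost=0.0):
    # per-crop profit max of p*x - (c + water_cost)*x - 0.5*d*x**2
    return [max(0.0, (p - c - water_cost) / di)
            for p, c, di in zip(prices, lin_costs, d)]

prices = [900.0, 600.0]      # revenue per hectare (assumed)
lin_costs = [400.0, 350.0]   # linear production cost per hectare (assumed)
obs_areas = [120.0, 80.0]    # observed planted hectares (assumed)

d = calibrate(prices, lin_costs, obs_areas)
base = optimal_areas(prices, lin_costs, d)         # reproduces obs_areas
shock = optimal_areas(prices, lin_costs, d, 50.0)  # higher pumping cost
```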

  11. A Run-Time Verification Framework for Smart Grid Applications Implemented on Simulation Frameworks

    SciTech Connect

    Ciraci, Selim; Sozer, Hasan; Tekinerdogan, Bedir

    2013-05-18

    Smart grid applications are implemented and tested with simulation frameworks, as the developers usually do not have access to large sensor networks to be used as a test bed. The developers are forced to map the implementation onto these frameworks, which results in a deviation between the architecture and the code. In turn, this deviation makes it hard to verify behavioral constraints that are described at the architectural level. We have developed the ConArch toolset to support the automated verification of architecture-level behavioral constraints. A key feature of ConArch is its programmable mapping of the architecture to the implementation. Here, developers implement queries to identify the points in the target program that correspond to architectural interactions. ConArch generates run-time observers that monitor the flow of execution between these points and verifies whether this flow conforms to the behavioral constraints. We illustrate how the programmable mappings can be exploited for verifying behavioral constraints of a smart grid application that is implemented with two simulation frameworks.
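
The observer idea can be sketched as a monitor that records consecutive mapped program points and flags transitions outside the allowed architectural interactions. The point names and allowed set below are invented; ConArch's real observers are generated from queries rather than written by hand.

```python
# Hand-written sketch of a run-time observer for architectural
# interaction constraints (illustrative names only).

ALLOWED = {("sensor.read", "controller.update"),
           ("controller.update", "actuator.command")}

class Observer:
    def __init__(self, allowed):
        self.allowed = allowed
        self.prev = None
        self.violations = []

    def hit(self, point):
        """Record a mapped program point and check the transition."""
        if self.prev is not None and (self.prev, point) not in self.allowed:
            self.violations.append((self.prev, point))
        self.prev = point

ok = Observer(ALLOWED)
for point in ["sensor.read", "controller.update", "actuator.command"]:
    ok.hit(point)

bad = Observer(ALLOWED)
for point in ["sensor.read", "actuator.command"]:  # skips the controller
    bad.hit(point)
```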

  12. The Integrated Plasma Simulator: A Flexible Python Framework for Coupled Multiphysics Simulation

    SciTech Connect

    Foley, Samantha S; Elwasif, Wael R; Bernholdt, David E

    2011-11-01

    High-fidelity coupled multiphysics simulations are an increasingly important aspect of computational science. In many domains, however, there has been very limited experience with simulations of this sort, therefore research in coupled multiphysics often requires computational frameworks with significant flexibility to respond to the changing directions of the physics and mathematics. This paper presents the Integrated Plasma Simulator (IPS), a framework designed for loosely coupled simulations of fusion plasmas. The IPS provides users with a simple component architecture into which a wide range of existing plasma physics codes can be inserted as components. Simulations can take advantage of multiple levels of parallelism supported in the IPS, and can be controlled by a high-level ``driver'' component, or by other coordination mechanisms, such as an asynchronous event service. We describe the requirements and design of the framework, and how they were implemented in the Python language. We also illustrate the flexibility of the framework by providing examples of different types of simulations that utilize various features of the IPS.
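
The component-plus-driver pattern described above can be sketched in a few lines of Python: components implementing a common interface are stepped by a high-level driver that passes shared state between them. The component names, toy physics, and dict-based state are invented; this is not the actual IPS API.

```python
# Minimal sketch of the component/driver pattern (illustrative only).

class Component:
    def init(self, state):
        pass
    def step(self, t, state):
        raise NotImplementedError

class Source(Component):
    def step(self, t, state):
        state["power"] = 1.0 + 0.1 * t   # assumed ramping heat source

class Transport(Component):
    def step(self, t, state):
        # toy "transport": temperature relaxes toward the power input
        state["T"] += 0.5 * (state["power"] - state["T"])

class Driver:
    def __init__(self, components):
        self.components = components

    def run(self, steps):
        state = {"power": 0.0, "T": 0.0}
        for c in self.components:
            c.init(state)
        for t in range(steps):
            for c in self.components:    # sequential loose coupling
                c.step(t, state)
        return state

final = Driver([Source(), Transport()]).run(steps=20)
```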

  13. A wind turbine hybrid simulation framework considering aeroelastic effects

    NASA Astrophysics Data System (ADS)

    Song, Wei; Su, Weihua

    2015-04-01

    In performing an effective structural analysis for a wind turbine, the simulation of turbine aerodynamic loads is of great importance. The interaction between the wake flow and the blades may affect blade loading conditions, energy yield, and operational behavior. Direct experimental measurement of the wind flow field and wind profiles around wind turbines is very helpful to support wind turbine design. However, with the growth in the size of wind turbines for higher energy output, it is not practical to obtain all the desired data in wind-tunnel and field tests. In this paper, the dynamic response of large-span wind turbine blades is first modeled with nonlinear aeroelastic effects. A strain-based geometrically nonlinear beam formulation is used for the basic structural dynamic modeling, coupled with unsteady aerodynamic equations and rigid-body rotations of the rotor. Full wind turbines can be modeled using multi-connected beams. A hybrid simulation experimental framework is then proposed to address this issue. The aerodynamic-dominant components, such as the turbine blades and rotor, are simulated as numerical components using the nonlinear aeroelastic model, while the turbine tower, where collapse or failure may occur under high levels of wind load, is simulated separately as the physical component. With the proposed framework, the dynamic behavior of NREL's 5 MW wind turbine blades will be studied and correlated with available numerical data. The current work is the basis of the authors' further studies on flow control and hazard mitigation for wind turbine blades and towers.
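
The hybrid-simulation loop can be illustrated with two 1-DOF toy substructures: a "numerical" rotor load model and a "physical" tower model exchanging force and motion each step. All parameters below are invented toy values, unrelated to the NREL 5 MW data.

```python
import math

# Conceptual sketch of a hybrid-simulation exchange loop: the numerical
# substructure supplies the rotor thrust; the "physical" substructure
# (here a simulated 1-DOF tower) returns displacement and velocity.

def hybrid_run(steps=2000, dt=0.005):
    x, v = 0.0, 0.0                 # tower tip displacement and velocity
    m, c, k = 100.0, 20.0, 4000.0   # tower mass, damping, stiffness (toy)
    for i in range(steps):
        t = i * dt
        # numerical component: rotor thrust plus aero damping on tower motion
        force = 1000.0 * (1.0 + 0.2 * math.sin(2.0 * t)) - 500.0 * v
        # "physical" component: advance the tower one step (symplectic Euler)
        a = (force - c * v - k * x) / m
        v += a * dt
        x += v * dt
    return x

tip = hybrid_run()
```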

  14. A framework for industrial systems modeling and simulation

    SciTech Connect

    Macfarlane, J.; Nachnani, S.; Tsai, L.H.; Kaae, P.; Freund, K.; Hoza, M.; Stahlman, E.

    1995-04-01

    To successfully compete in a global market, manufacturing production systems are being forced to reduce time to market and to provide improved responsiveness to changes in market conditions. The organizations that comprise the business links in the production system must constantly make tradeoffs between time and cost in order to achieve a competitive but quick response to consumer demand. Due to the inherent uncertainty of consumer demand, these tradeoffs are, by definition, made with incomplete information and can incur significant financial and competitive risk to the organization. Partnerships between organizations are a mechanism for increasing the information in the decision-making process by combining information from the two partners. Partnerships are inherently difficult to implement due to trust issues, so a mechanism for investigating and validating the mutual benefit of partnering would be useful in designing and implementing partnerships. This paper describes the development of a software framework for industrial systems modeling and simulation. The framework provides a mechanism for investigating changes to industrial systems in a manner that minimizes the effort and computational power needed to develop focused simulations. The architecture and its component parts are described.

  15. A Virtual Engineering Framework for Simulating Advanced Power System

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Stanislav Borodai

    2008-06-18

    This report describes the work performed to provide NETL with VE-Suite-based Virtual Engineering software and enhanced equipment models to support NETL's Advanced Process Engineering Co-simulation (APECS) framework for advanced power generation systems. Enhancements to the software framework facilitated an important link between APECS and the virtual engineering capabilities provided by VE-Suite (e.g., equipment and process visualization, information assimilation). Model enhancements focused on improving predictions of the performance of entrained-flow coal gasifiers and important auxiliary equipment (e.g., air separation units) used in coal gasification systems. In addition, a Reduced Order Model generation tool and software to couple APECS/AspenPlus with the GE GateCycle simulation system were developed. CAPE-Open model interfaces were employed where needed. The improved simulation capability is demonstrated on selected test problems. As part of the project, an Advisory Panel was formed to provide guidance on the issues on which to focus the work effort. The Advisory Panel included experts from industry and academia in gasification, CO2 capture, and process simulation, as well as representatives from technology developers and the electric utility industry. To optimize the benefit to NETL, REI coordinated its efforts with NETL and NETL-funded projects at Iowa State University, Carnegie Mellon University, and ANSYS/Fluent, Inc. The improved simulation capabilities incorporated into APECS will enable researchers and engineers to better understand the interactions of different equipment components, identify weaknesses and processes needing improvement, and thereby allow more efficient, less expensive plants to be developed and brought on-line faster and in a more cost-effective manner. These enhancements to APECS represent an important step toward a fully integrated environment for plant simulation and engineering.

  16. ALF--a simulation framework for genome evolution.

    PubMed

    Dalquen, Daniel A; Anisimova, Maria; Gonnet, Gaston H; Dessimoz, Christophe

    2012-04-01

    In computational evolutionary biology, verification and benchmarking is a challenging task because the evolutionary history of the studied biological entities is usually not known. Computer programs for simulating sequence evolution in silico have been shown to be viable test beds for the verification of newly developed methods and for comparing different algorithms. However, current simulation packages tend to focus either on gene-level aspects of genome evolution, such as character substitutions and insertions and deletions (indels), or on genome-level aspects, such as genome rearrangement and speciation events. Here, we introduce the Artificial Life Framework (ALF), which aims at simulating the entire range of evolutionary forces that act on genomes: nucleotide, codon, or amino acid substitution (under simple or mixture models), indels, GC-content amelioration, gene duplication, gene loss, gene fusion, gene fission, genome rearrangement, lateral gene transfer (LGT), and speciation. The other distinctive feature of ALF is its user-friendly yet powerful web interface. We illustrate the utility of ALF with two applications: 1) we reanalyze data from a study of selection after globin gene duplication and test the statistical significance of the original conclusions, and 2) we demonstrate that LGT can dramatically decrease the accuracy of two well-established orthology inference methods. ALF is available as a stand-alone application or via a web interface at http://www.cbrg.ethz.ch/alf.
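
The gene-level portion of such a simulation can be sketched as a loop applying substitutions and indels to a sequence; the rates and event mix below are invented example values, not ALF's evolutionary models.

```python
import random

# Toy gene-level sequence evolution: substitutions, insertions, and
# deletions applied event by event (illustrative rates only).

def evolve(seq, n_events, sub_rate=0.8, ins_rate=0.1, seed=42):
    rng = random.Random(seed)
    bases = "ACGT"
    seq = list(seq)
    for _ in range(n_events):
        pos = rng.randrange(len(seq))
        r = rng.random()
        if r < sub_rate:                       # substitution to a new base
            seq[pos] = rng.choice(bases.replace(seq[pos], ""))
        elif r < sub_rate + ins_rate:          # insertion
            seq.insert(pos, rng.choice(bases))
        elif len(seq) > 1:                     # deletion
            del seq[pos]
    return "".join(seq)

child = evolve("ACGTACGTACGT", n_events=6)
```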

  17. Interactive agent based modeling of public health decision-making.

    PubMed

    Parks, Amanda L; Walker, Brett; Pettey, Warren; Benuzillo, Jose; Gesteland, Per; Grant, Juliana; Koopman, James; Drews, Frank; Samore, Matthew

    2009-01-01

    Agent-based models have yielded important insights regarding the transmission dynamics of communicable diseases. To better understand how these models can be used to study the decision making of public health officials, we developed a computer program that linked an agent-based model of pertussis with an agent-based model of public health management. The program, which we call the Public Health Interactive Model & simulation (PHIMs), encompassed the reporting of cases to public health, case investigation, and public health response. The user directly interacted with the model in the role of the public health decision-maker. In this paper we describe the design of our model and present the results of a pilot study to assess its usability and potential for future development. Affinity for specific tools was demonstrated. Participants ranked the program high in usability and considered it useful for training. Our ultimate goal is to achieve better public health decisions and outcomes through the use of public health decision support tools. PMID:20351907

  18. Framework Application for Core Edge Transport Simulation (FACETS)

    SciTech Connect

    Malony, Allen D; Shende, Sameer S; Huck, Kevin A; Morris, Alan; Spear, Wyatt

    2012-03-14

    The goal of the FACETS project (Framework Application for Core-Edge Transport Simulations) was to provide a multiphysics, parallel framework application (FACETS) that will enable whole-device modeling for the U.S. fusion program, to provide the modeling infrastructure needed for ITER, the next step fusion confinement device. Through use of modern computational methods, including component technology and object oriented design, FACETS is able to switch from one model to another for a given aspect of the physics in a flexible manner. This enables use of simplified models for rapid turnaround or high-fidelity models that can take advantage of the largest supercomputer hardware. FACETS does so in a heterogeneous parallel context, where different parts of the application execute in parallel by utilizing task farming, domain decomposition, and/or pipelining as needed and applicable. ParaTools, Inc. was tasked with supporting the performance analysis and tuning of the FACETS components and framework in order to achieve the parallel scaling goals of the project. The TAU Performance System® was used for instrumentation, measurement, archiving, and profile / tracing analysis. ParaTools, Inc. also assisted in FACETS performance engineering efforts. Through the use of the TAU Performance System, ParaTools provided instrumentation, measurement, analysis and archival support for the FACETS project. Performance optimization of key components has yielded significant performance speedups. TAU was integrated into the FACETS build for both the full coupled application and the UEDGE component. The performance database provided archival storage of the performance regression testing data generated by the project, and helped to track improvements in the software development.

  19. A Hybrid Multiscale Framework for Subsurface Flow and Transport Simulations

    SciTech Connect

    Scheibe, Timothy D.; Yang, Xiaofan; Chen, Xingyuan; Hammond, Glenn E.

    2015-06-01

    -specific and sometimes ad-hoc approaches for model coupling. We are developing a generalized approach to hierarchical model coupling designed for high-performance computational systems, based on the Swift computing workflow framework. In this presentation we will describe the generalized approach and provide two use cases: 1) simulation of a mixing-controlled biogeochemical reaction coupling pore- and continuum-scale models, and 2) simulation of biogeochemical impacts of groundwater – river water interactions coupling fine- and coarse-grid model representations. This generalized framework can be customized for use with any pair of linked models (microscale and macroscale) with minimal intrusiveness to the at-scale simulators. It combines a set of python scripts with the Swift workflow environment to execute a complex multiscale simulation utilizing an approach similar to the well-known Heterogeneous Multiscale Method. User customization is facilitated through user-provided input and output file templates and processing function scripts, and execution within a high-performance computing environment is handled by Swift, such that minimal to no user modification of at-scale codes is required.

  1. Agent Based Modeling of Air Carrier Behavior for Evaluation of Technology Equipage and Adoption

    NASA Technical Reports Server (NTRS)

    Horio, Brant M.; DeCicco, Anthony H.; Stouffer, Virginia L.; Hasan, Shahab; Rosenbaum, Rebecca L.; Smith, Jeremy C.

    2014-01-01

    As part of ongoing research, the National Aeronautics and Space Administration (NASA) and LMI developed a research framework to assist policymakers in identifying impacts on the U.S. air transportation system (ATS) of potential policies and technology related to the implementation of the Next Generation Air Transportation System (NextGen). This framework, called the Air Transportation System Evolutionary Simulation (ATS-EVOS), integrates multiple models into a single process flow to best simulate responses by U.S. commercial airlines and other ATS stakeholders to NextGen-related policies, and in turn, how those responses impact the ATS. Development of this framework required NASA and LMI to create an agent-based model of airline and passenger behavior. This Airline Evolutionary Simulation (AIRLINE-EVOS) models airline decisions about tactical airfare and schedule adjustments, and strategic decisions related to fleet assignments, market prices, and equipage. AIRLINE-EVOS models its own heterogeneous population of passenger agents that interact with airlines; this interaction allows the model to simulate the cycle of action and reaction as airlines compete with each other and engage passengers. We validated a baseline configuration of AIRLINE-EVOS against Airline Origin and Destination Survey (DB1B) data and subject matter expert opinion, and we verified the ATS-EVOS framework and agent behavior logic through scenario-based experiments. These experiments demonstrated AIRLINE-EVOS's capabilities in responding to an input shock in fuel prices, and to equipage challenges in a series of analyses based on potential incentive policies for best-equipped best-served, optimal wind routing, and traffic management initiative exemption concepts.

  2. Agent-Based Deterministic Modeling of the Bone Marrow Homeostasis.

    PubMed

    Kurhekar, Manish; Deshpande, Umesh

    2016-01-01

    Modeling of stem cells not only describes but also predicts how a stem cell's environment can control its fate. The first stem cell populations discovered were hematopoietic stem cells (HSCs). In this paper, we present a deterministic model of the bone marrow (which hosts HSCs) that is consistent with several qualitative biological observations. This model incorporates stem cell death (apoptosis) after a certain number of cell divisions and also demonstrates that a single HSC can potentially populate the entire bone marrow. It further demonstrates the production of a sufficient number of differentiated cells (RBCs, WBCs, etc.). We prove that our model of the bone marrow is biologically consistent and overcomes the biological feasibility limitations of previously reported models. The major contribution of our model is the flexibility it allows in choosing model parameters, which permits several different simulations to be carried out in silico without affecting the homeostatic properties of the model. We have also performed an agent-based simulation of the bone marrow model proposed in this paper, and we include parameter details and the results obtained from the simulation. The program for the agent-based simulation of the proposed model is made available on a publicly accessible website. PMID:27340402
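
The division-limit and apoptosis rules can be sketched deterministically: cells self-renew symmetrically until a pool cap is reached, then divide asymmetrically (producing differentiated cells), and die once a division limit is hit. All parameters below are invented for the sketch; this is not the paper's model.

```python
# Deterministic toy version of division-counted stem cells with
# apoptosis and differentiation (illustrative parameters only).

def simulate(division_limit=8, pool_cap=64, generations=12):
    stem = [0]                    # division counts; start from a single HSC
    differentiated = 0
    for _ in range(generations):
        next_stem = []
        for d in stem:
            if d >= division_limit:
                continue                      # apoptosis: cell removed
            if len(stem) < pool_cap:
                next_stem += [d + 1, d + 1]   # symmetric self-renewal
            else:
                next_stem.append(d + 1)       # asymmetric division
                differentiated += 1
        stem = next_stem
    return len(stem), differentiated

pool, produced = simulate()
# a single founder expands to the cap, produces differentiated cells,
# then the cohort reaches its division limit and is cleared
```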

  3. A rigorous framework for multiscale simulation of stochastic cellular networks

    PubMed Central

    Chevalier, Michael W.; El-Samad, Hana

    2009-01-01

    Noise and stochasticity are fundamental to biology and derive from the very nature of biochemical reactions, where thermal motion of molecules translates into randomness in the sequence and timing of reactions. This randomness leads to cell-cell variability even in clonal populations. Stochastic biochemical networks are modeled as continuous-time discrete-state Markov processes whose probability density functions evolve according to a chemical master equation (CME). The CME is not solvable except for the simplest cases, and one has to resort to kinetic Monte Carlo techniques to simulate the stochastic trajectories of the biochemical network under study. A commonly used algorithm of this kind is the stochastic simulation algorithm (SSA). Because it tracks every biochemical reaction that occurs in a given system, the SSA presents computational difficulties, especially when there is a vast disparity in the timescales of the reactions or in the number of molecules involved in these reactions. This is common in cellular networks, and many approximation algorithms have evolved to alleviate the computational burdens of the SSA. Here, we present a rigorously derived modified CME framework based on the partition of a biochemically reacting system into restricted and unrestricted reactions. Although this modified CME decomposition is as analytically difficult as the original CME, it can be naturally used to generate a hierarchy of approximations at different levels of accuracy. Most importantly, some previously derived algorithms are demonstrated to be limiting cases of our formulation. We apply our methods to biologically relevant test systems to demonstrate their accuracy and efficiency. PMID:19673546
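    For reference, the SSA mentioned above (Gillespie's direct method) fits in a short sketch; the decay reaction at the end is a toy example, not one of the paper's test systems:

```python
import math
import random

def ssa(propensities, stoich, x0, t_max, seed=0):
    """Gillespie direct method: draw an exponential waiting time from the
    total propensity a0, then pick reaction j with probability a_j / a0."""
    rng = random.Random(seed)
    x, t = list(x0), 0.0
    history = [(t, tuple(x))]
    while t < t_max:
        a = [f(x) for f in propensities]
        a0 = sum(a)
        if a0 == 0.0:
            break                                 # no reaction can fire
        t += -math.log(1.0 - rng.random()) / a0   # exponential waiting time
        r, j = rng.random() * a0, 0
        while r > a[j]:                           # roulette-wheel selection
            r -= a[j]
            j += 1
        for species, change in enumerate(stoich[j]):
            x[species] += change                  # apply the state change
        history.append((t, tuple(x)))
    return history

# Toy system: irreversible decay A -> 0 with propensity k * A
hist = ssa([lambda x: 0.5 * x[0]], [(-1,)], [50], t_max=1e6)
```

    The computational burden the abstract describes is visible even here: every single reaction event costs one pass over the propensity list, which is what multiscale approximations try to avoid.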

  4. Agent based simulations in disease modeling Comment on "Towards a unified approach in the modeling of fibrosis: A review with research perspectives" by Martine Ben Amar and Carlo Bianca

    NASA Astrophysics Data System (ADS)

    Pappalardo, Francesco; Pennisi, Marzio

    2016-07-01

    Fibrosis is a process in which excessive tissue formation in an organ follows the failure of a physiological reparative or reactive process. Mathematical and computational techniques may be used to improve understanding of the mechanisms that lead to the disease and to test potential new treatments that may directly or indirectly have positive effects against fibrosis [1]. In this scenario, Ben Amar and Bianca [2] give us a broad picture of the existing mathematical and computational tools that have been used to model fibrotic processes at the molecular, cellular, and tissue levels. Among such techniques, agent-based models (ABMs) can make a valuable contribution to the understanding and better management of fibrotic diseases.

  5. Agent-Based Collaborative Affective e-Learning Framework

    ERIC Educational Resources Information Center

    Neji, Mahmoud; Ben Ammar, Mohamed

    2007-01-01

    Based on facial expression (FE), this paper explores the possible use of the affective communication in virtual environments (VEs). The attention of affective communication is examined and some research ideas for developing affective communication in virtual environments are proposed. We place an emphasis on communication between virtual entities,…

  6. Multiscale agent-based consumer market modeling.

    SciTech Connect

    North, M. J.; Macal, C. M.; St. Aubin, J.; Thimmapuram, P.; Bragen, M.; Hahn, J.; Karr, J.; Brigham, N.; Lacy, M. E.; Hampton, D.; Decision and Information Sciences; Procter & Gamble Co.

    2010-05-01

    Consumer markets have been studied in great depth, and many techniques have been used to represent them. These have included regression-based models, logit models, and theoretical market-level models, such as the NBD-Dirichlet approach. Although many important contributions and insights have resulted from studies that relied on these models, there is still a need for a model that can more holistically represent the interdependencies of the decisions made by consumers, retailers, and manufacturers. This need is particularly critical when the model must be used repeatedly over time to support decisions in an industrial setting. Although some existing methods can, in principle, represent such complex interdependencies, their capabilities might be outstripped in industrial applications because of the details this type of modeling requires. However, a complementary method - agent-based modeling - shows promise for addressing these issues. Agent-based models use business-driven rules for individuals (e.g., individual consumer rules for buying items, individual retailer rules for stocking items, or individual firm rules for advertising items) to determine holistic, system-level outcomes (e.g., whether brand X's market share is increasing). We applied agent-based modeling to develop a multi-scale consumer market model. We then conducted calibration, verification, and validation tests of this model. The model was successfully applied by Procter & Gamble to several challenging business problems. In these situations, it directly influenced managerial decision making and produced substantial cost savings.
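    A minimal illustration of the individual-rule style described above (an invented toy, not P&G's actual model): each consumer buys the brand with the lowest effective price, where a hypothetical loyalty discount favors the brand bought last, and market share emerges from the aggregate.

```python
import random

def market_shares(prices, n_consumers=1000, loyalty=0.2, steps=20, seed=1):
    """Each consumer repeatedly buys the brand minimizing price minus a
    loyalty discount for the previously purchased brand; each brand's
    market share is the system-level outcome."""
    rng = random.Random(seed)
    last = [rng.randrange(len(prices)) for _ in range(n_consumers)]
    for _ in range(steps):
        for c in range(n_consumers):
            effective = [p - (loyalty if b == last[c] else 0.0)
                         for b, p in enumerate(prices)]
            last[c] = min(range(len(prices)), key=effective.__getitem__)
    shares = [0.0] * len(prices)
    for b in last:
        shares[b] += 1.0 / n_consumers
    return shares

shares = market_shares([1.00, 1.05])
```

    With a loyalty discount larger than the price gap, consumers lock in to their initial brand, a simple form of the path dependence that agent-based market models are built to capture.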

  7. Understanding agent-based models of financial markets: A bottom-up approach based on order parameters and phase diagrams

    NASA Astrophysics Data System (ADS)

    Lye, Ribin; Tan, James Peng Lung; Cheong, Siew Ann

    2012-11-01

    We describe a bottom-up framework, based on the identification of appropriate order parameters and determination of phase diagrams, for understanding progressively refined agent-based models and simulations of financial markets. We illustrate this framework by starting with a deterministic toy model, whereby N independent traders buy and sell M stocks through an order book that acts as a clearing house. The price of a stock increases whenever it is bought and decreases whenever it is sold. Price changes are updated by the order book before the next transaction takes place. In this deterministic model, all traders base their buy decisions on a call utility function and all their sell decisions on a put utility function. We then make the agent-based model more realistic by either having a fraction fb of traders buy a random stock on offer, or a fraction fs of traders sell a random stock in their portfolio. Based on our simulations, we find that it is possible to identify useful order parameters from the steady-state price distributions of all three models. Using these order parameters as a guide, we find three phases in the phase diagram of the deterministic model: (i) the dead market; (ii) the boom market; and (iii) the jammed market. Comparing the phase diagrams of the stochastic models against that of the deterministic model, we realize that the primary effect of stochasticity is to eliminate the dead market phase.
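    The deterministic core of such a model fits in a few lines. Here the call utility simply favors the cheapest stock and the put utility the dearest, an illustrative choice rather than the authors' utility functions:

```python
TICK = 0.01  # illustrative price increment per transaction

def run(prices, n_traders, rounds):
    """Each trader buys the stock maximizing the call utility (here, the
    cheapest) and sells the one maximizing the put utility (the dearest);
    the order book updates the price after every transaction."""
    prices = list(prices)
    for _ in range(rounds):
        for _ in range(n_traders):
            buy = min(range(len(prices)), key=lambda s: prices[s])
            prices[buy] += TICK              # a purchase moves the price up
            sell = max(range(len(prices)), key=lambda s: prices[s])
            prices[sell] -= TICK             # a sale moves it down
    return prices

final = run([1.0, 2.0], n_traders=5, rounds=2)
```

    With these utilities the two prices drift toward each other, total "market value" conserved, which is the kind of steady-state price distribution from which an order parameter can be read off.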

  8. An Agent Based Model for Social Class Emergence

    NASA Astrophysics Data System (ADS)

    Yang, Xiaoxiang; Rodriguez Segura, Daniel; Lin, Fei; Mazilu, Irina

    We present an open-system agent-based model to analyze the effects of education and society-specific wealth transactions on the emergence of social classes. Building on previous studies, we use realistic functions to model how years of education affect income level. Numerical simulations show that the fraction of an individual's total transactions that is invested rather than consumed can cause wealth gaps between different income brackets in the long run. In an attempt to incorporate network effects, we also explore how making the probability of interaction between agents depend on the spread of their income brackets affects the wealth distribution.
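    The flavor of such transactional models can be conveyed with the standard "yard-sale" exchange sketch (a generic textbook model, not the authors' exact rules): a random pair stakes a fraction of the poorer agent's wealth on a fair coin, and repeated exchanges concentrate wealth even though every trade is fair.

```python
import random

def gini(wealth):
    """Gini coefficient: 0 for perfect equality, approaching 1 for
    complete concentration."""
    w = sorted(wealth)
    n = len(w)
    cum = sum((i + 1) * x for i, x in enumerate(w))
    return 2.0 * cum / (n * sum(w)) - (n + 1.0) / n

def yard_sale(n=100, steps=20000, frac=0.1, seed=2):
    rng = random.Random(seed)
    wealth = [1.0] * n
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        stake = frac * min(wealth[i], wealth[j])  # poorer agent's stake
        if rng.random() < 0.5:                    # fair coin picks the winner
            wealth[i], wealth[j] = wealth[i] + stake, wealth[j] - stake
        else:
            wealth[i], wealth[j] = wealth[i] - stake, wealth[j] + stake
    return wealth

wealth = yard_sale()
```

    Total wealth is conserved by construction, yet inequality (measured by the Gini coefficient) grows from zero, illustrating how transaction rules alone can open wealth gaps.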

  9. Agent-based model of macrophage action on endocrine pancreas.

    PubMed

    Martínez, Ignacio V; Gómez, Enrique J; Hernando, M Elena; Villares, Ricardo; Mellado, Mario

    2012-01-01

    This paper proposes an agent-based model of the action of macrophages on the beta cells of the endocrine pancreas. The aim of this model is to simulate the processes of beta cell proliferation and apoptosis and also the process of phagocytosis of cell debris by macrophages, all of which are related to the onset of the autoimmune response in type 1 diabetes. We have used data from the scientific literature to design the model. The results show that the model obtains good approximations to real processes and could be used to shed light on some open questions concerning such processes.

  10. An agent-based multilayer architecture for bioinformatics grids.

    PubMed

    Bartocci, Ezio; Cacciagrano, Diletta; Cannata, Nicola; Corradini, Flavio; Merelli, Emanuela; Milanesi, Luciano; Romano, Paolo

    2007-06-01

    Due to the huge volume and complexity of biological data available today, a fundamental component of biomedical research is now in silico analysis. This includes modelling and simulation of biological systems and processes, as well as automated bioinformatics analysis of high-throughput data. The quest for bioinformatics resources (including databases, tools, and knowledge) becomes therefore of extreme importance. Bioinformatics itself is in rapid evolution and dedicated Grid cyberinfrastructures already offer easier access and sharing of resources. Furthermore, the concept of the Grid is progressively interleaving with those of Web Services, semantics, and software agents. Agent-based systems can play a key role in learning, planning, interaction, and coordination. Agents constitute also a natural paradigm to engineer simulations of complex systems like the molecular ones. We present here an agent-based, multilayer architecture for bioinformatics Grids. It is intended to support both the execution of complex in silico experiments and the simulation of biological systems. In the architecture a pivotal role is assigned to an "alive" semantic index of resources, which is also expected to facilitate users' awareness of the bioinformatics domain.

  11. Reducing complexity in an agent based reaction model-Benefits and limitations of simplifications in relation to run time and system level output.

    PubMed

    Rhodes, David M; Holcombe, Mike; Qwarnstrom, Eva E

    2016-09-01

    Agent-based modelling is a methodology for simulating a variety of systems across a broad spectrum of fields. However, due to the complexity of the systems, it is often impossible or impractical to model them at a one-to-one scale. In this paper we use a simple reaction rate model implemented in the FLAME framework to test the impact of common methods for reducing model complexity, such as reducing scale, increasing iteration duration, and reducing message overheads. We demonstrate that such approaches can have a significant impact on simulation runtime, albeit with an increasing risk of aberrant system behaviour and errors as the complexity of the model is reduced. PMID:27297544

  12. Coupling Agent-Based and Groundwater Modeling to Explore Demand Management Strategies for Shared Resources

    NASA Astrophysics Data System (ADS)

    Al-Amin, S.

    2015-12-01

    Municipal water demands in growing population centers in the arid southwest US are typically met through increased groundwater withdrawals. Hydro-climatic uncertainties attributed to climate change and land use conversions may also alter demands and impact the replenishment of groundwater supply. Groundwater aquifers are not necessarily confined within municipal and management boundaries, and multiple diverse agencies may manage a shared resource in a decentralized approach, based on individual concerns and resources. The interactions among water managers, consumers, and the environment influence the performance of local management strategies and regional groundwater resources. This research couples an agent-based modeling (ABM) framework and a groundwater model to analyze the effects of different management approaches on shared groundwater resources. The ABM captures the dynamic interactions between household-level consumers and policy makers to simulate water demands under climate change and population growth uncertainties. The groundwater model is used to analyze the relative effects of management approaches on reducing demands and replenishing groundwater resources. The framework is applied for municipalities located in the Verde River Basin, Arizona that withdraw groundwater from the Verde Formation-Basin Fill-Carbonate aquifer system. Insights gained through this simulation study can be used to guide groundwater policy-making under changing hydro-climatic scenarios for a long-term planning horizon.
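    The coupling pattern described above, agents adjust demand, the groundwater model updates heads, and heads feed back into agent decisions, can be sketched with stand-in dynamics. All numbers below are illustrative, not Verde Basin data:

```python
def run_coupled(years, conservation_policy):
    """Per-step coupling loop: an aggregate demand 'agent' reacts to the
    aquifer head, then a stand-in groundwater balance updates the head."""
    head = 100.0                   # illustrative aquifer head [m]
    heads = []
    for year in range(years):
        base_demand = 1.0 * 1.02 ** year        # demand grows with population
        if conservation_policy and head < 95.0:
            demand = 0.8 * base_demand          # agents curtail use by 20%
        else:
            demand = base_demand
        head += 0.5 - demand                    # recharge minus pumping
        heads.append(head)
    return heads

with_policy = run_coupled(30, True)
without_policy = run_coupled(30, False)
```

    The two runs coincide until the head first crosses the policy trigger, after which the feedback loop keeps the managed aquifer higher, the qualitative comparison such coupled frameworks are built to make.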

  13. Techniques and Issues in Agent-Based Modeling Validation

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui

    2012-01-01

    Validation of simulation models is extremely important. It ensures that the right model has been built and lends confidence to the use of that model to inform critical decisions. Agent-based models (ABMs) have been widely deployed in different fields for studying the collective behavior of large numbers of interacting agents. However, researchers have only recently started to consider the issues of validation. Compared to other simulation models, ABMs differ substantially in model development, usage, and validation. An ABM is inherently easier to build than a classical simulation but more difficult to describe formally, since its rules are closer to human cognition. Using multi-agent models to study complex systems has attracted criticism because of the challenges involved in their validation [1]. In this report, we describe the challenge of ABM validation and present a novel approach we recently developed for an ABM system.

  14. Multiscale Simulation Framework for Coupled Fluid Flow and Mechanical Deformation

    SciTech Connect

    Tchelepi, Hamdi

    2014-11-14

    A multiscale linear-solver framework for the pressure equation associated with flow in highly heterogeneous porous formations was developed. The multiscale-based approach is cast in a general algebraic form, which facilitates integration of the new scalable linear solver into existing flow simulators. The Algebraic Multiscale Solver (AMS) is employed as a preconditioner within a multi-stage strategy. The formulations investigated include the standard MultiScale Finite-Element (MSFE) and MultiScale Finite-Volume (MSFV) methods. The local-stage solvers include incomplete factorization and the so-called Correction Functions (CF) associated with the MSFV approach. Extensive testing of AMS as an iterative linear solver indicates excellent convergence rates and computational scalability. AMS compares favorably with advanced Algebraic MultiGrid (AMG) solvers for highly detailed three-dimensional heterogeneous models. Moreover, AMS is expected to be especially beneficial in solving time-dependent problems of coupled multiphase flow and transport in large-scale subsurface formations.

  15. Dynamic Gaussian wake meandering in a restricted nonlinear simulation framework

    NASA Astrophysics Data System (ADS)

    Bretheim, Joel; Porte-Agel, Fernando; Gayme, Dennice; Meneveau, Charles

    2015-11-01

    Wake meandering can significantly impact the performance of large-scale wind farms. Simplified wake expansion (e.g., Jensen/PARK) models, which are commonly used in industry, lead to accurate predictions of certain wind farm performance characteristics (e.g., time- and row-averaged total power output). However, they are unable to capture certain temporal phenomena such as wake meandering, which can have profound effects on both power output and turbine loading. We explore a dynamic wake modeling framework based on the approach proposed by Larsen et al. (Wind Energy 11, 2008) whereby turbine ``wake elements'' are treated as passive tracers and advected by an averaged streamwise flow. Our wake elements are treated as Gaussian velocity deficit profiles (Bastankhah and Porte-Agel, Renew. Energy 70, 2014). A restricted nonlinear (RNL) model is used to capture the turbulent velocity fluctuations that are critical to the wake meandering phenomenon. The RNL system, which has been used in prior wall-turbulence studies, provides a computationally affordable way to model atmospheric turbulence, making it more reasonable for use in engineering models than the more accurate but computationally intensive approaches like large-eddy simulation. This work is supported by NSF (IGERT 0801471, SEP-1230788, and IIA-1243482) and the WINDINSPIRE project.
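    As a reference point, the Gaussian velocity deficit that each wake element carries can be sketched in its far-wake form (after Bastankhah and Porte-Agel, 2014); the rotor diameter, thrust coefficient, and wake-growth rate below are illustrative values, not the study's settings:

```python
import math

def gaussian_wake_deficit(x, r, d0=80.0, ct=0.8, k_star=0.03):
    """Normalized velocity deficit dU/U_inf of a Gaussian wake at
    downstream distance x and radial offset r. Valid in the far wake,
    where the square-root argument below stays positive."""
    beta = 0.5 * (1.0 + math.sqrt(1.0 - ct)) / math.sqrt(1.0 - ct)
    eps = 0.2 * math.sqrt(beta)                # initial wake width
    sigma = (k_star * x / d0 + eps) * d0       # wake width grows downstream
    c = 1.0 - math.sqrt(1.0 - ct / (8.0 * (sigma / d0) ** 2))
    return c * math.exp(-r * r / (2.0 * sigma * sigma))

centerline = gaussian_wake_deficit(400.0, 0.0)
```

    In a meandering framework, r would be measured from an advected wake center (driven here by the RNL velocity fluctuations) rather than from the fixed rotor axis.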

  16. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933

  17. artG4: A Generic Framework for Geant4 Simulations

    SciTech Connect

    Arvanitis, Tasha; Lyon, Adam

    2014-01-01

    A small experiment must devote its limited computing expertise to writing physics code directly applicable to the experiment. A software 'framework' is essential for providing an infrastructure that makes writing the physics-relevant code easy. In this paper, we describe a highly modular and easy to use framework for writing Geant4 based simulations called 'artg4'. This framework is a layer on top of the art framework.

  18. Agent based modeling in tactical wargaming

    NASA Astrophysics Data System (ADS)

    James, Alex; Hanratty, Timothy P.; Tuttle, Daniel C.; Coles, John B.

    2016-05-01

    Army staffs at division, brigade, and battalion levels often plan for contingency operations. As such, analysts consider the impact and potential consequences of actions taken. The Army Military Decision-Making Process (MDMP) dictates identification and evaluation of possible enemy courses of action; however, non-state actors often do not exhibit the same level and consistency of planned actions that the MDMP was originally designed to anticipate. The fourth MDMP step is a particular challenge: wargaming courses of action within the context of complex social-cultural behaviors. Agent-based Modeling (ABM), with its resulting emergent behavior, is a potential means to model the terrain in terms of the human domain and improve the results and rigor of the traditional wargaming process.

  19. An agent-based approach to financial stylized facts

    NASA Astrophysics Data System (ADS)

    Shimokawa, Tetsuya; Suzuki, Kyoko; Misawa, Tadanobu

    2007-06-01

    An important challenge for financial theory in recent years is to construct more sophisticated models that are consistent with as many as possible of the financial stylized facts that traditional models cannot explain. Recently, psychological studies on decision making under uncertainty, originating in Kahneman and Tversky's research, have attracted considerable interest as key factors for explaining the financial stylized facts. These psychological results have been applied to the theory of investors' decision making and financial equilibrium modeling. Following these behavioral finance studies, this paper proposes an agent-based equilibrium model with prospect-theoretic features of investors. Our goal is to point out the possibility that the loss-averse feature of investors explains a vast number of financial stylized facts and plays a crucial role in price formation in financial markets. The price process endogenously generated by our model is consistent with not only the equity premium puzzle and the volatility puzzle, but also excess kurtosis, asymmetry of the return distribution, auto-correlation of return volatility, and cross-correlation between return volatility and trading volume. Moreover, using agent-based simulations, the paper also provides a rigorous explanation of the size effect - that small-sized stocks enjoy excess returns compared to large-sized stocks - from the viewpoint of a lack of market liquidity.
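    The loss-averse feature at the heart of such models can be illustrated with the standard Kahneman-Tversky value function; the parameter values below are the canonical 1992 estimates, not necessarily the authors' calibration:

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function: concave over gains, convex over
    losses, and steeper for losses (lam > 1 encodes loss aversion)."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

# A symmetric 50/50 gamble has negative prospect value: losses loom larger
# than equal-sized gains, so loss-averse agents demand a premium to hold risk.
gamble = 0.5 * prospect_value(10.0) + 0.5 * prospect_value(-10.0)
```

    It is exactly this asymmetry, fed through agents' trading decisions, that the paper argues generates fat tails, volatility clustering, and the equity premium in the simulated price process.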

  20. A framework for web browser-based medical simulation using WebGL.

    PubMed

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2012-01-01

    This paper presents a web browser-based software framework that provides accessibility, portability, and platform independence for medical simulation. Typical medical simulation systems are restricted to the underlying platform and device, which limits widespread use. Our framework allows realistic and efficient medical simulation using only the web browser, for anytime, anywhere access on a variety of platforms ranging from desktop PCs to tablets. The framework consists of visualization, simulation, and hardware integration modules, which are fundamental components of multimodal interactive simulation. Benchmark tests are performed to validate the rendering and computing performance of our framework with the latest web browsers, including Chrome and Firefox. The results are quite promising, opening up the possibility of developing web-based medical simulation technology.

  1. An agent-based hydroeconomic model to evaluate water policies in Jordan

    NASA Astrophysics Data System (ADS)

    Yoon, J.; Gorelick, S.

    2014-12-01

    Modern water systems can be characterized by a complex network of institutional and private actors that represent competing sectors and interests. Identifying solutions to enhance water security in such systems calls for analysis that can adequately account for this level of complexity and interaction. Our work focuses on the development of a hierarchical, multi-agent, hydroeconomic model that attempts to realistically represent complex interactions between hydrologic and multi-faceted human systems. The model is applied to Jordan, one of the most water-poor countries in the world. In recent years, the water crisis in Jordan has escalated due to an ongoing drought and influx of refugees from regional conflicts. We adopt a modular approach in which biophysical modules simulate natural and engineering phenomena, and human modules represent behavior at multiple scales of decision making. The human modules employ agent-based modeling, in which agents act as autonomous decision makers at the transboundary, state, organizational, and user levels. A systematic nomenclature and conceptual framework is used to characterize model agents and modules. Concepts from the Unified Modeling Language (UML) are adopted to promote clear conceptualization of model classes and process sequencing, establishing a foundation for full deployment of the integrated model in a scalable object-oriented programming environment. Although the framework is applied to the Jordanian water context, it is generalizable to other regional human-natural freshwater supply systems.

  2. Combining Bayesian Networks and Agent Based Modeling to develop a decision-support model in Vietnam

    NASA Astrophysics Data System (ADS)

    Nong, Bao Anh; Ertsen, Maurits; Schoups, Gerrit

    2016-04-01

    Complexity and uncertainty in natural resources management have been central themes in recent years. Within these debates, with the aim of defining an approach feasible for water management practice, we are developing an integrated conceptual modeling framework for simulating the decision-making processes of citizens, in our case in the Day river area, Vietnam. The model combines Bayesian Networks (BNs) and Agent-Based Modeling (ABM). BNs are able to combine both qualitative data from consultants, experts, and stakeholders, and quantitative data from observations of different phenomena or outcomes from other models. Further strengths of BNs are that the relationships between variables in the system are presented in a graphical interface, and that components of uncertainty are explicitly related to their probabilistic dependencies. A disadvantage is that BNs cannot easily capture the feedback of agents in the system once changes appear. Hence, ABM was adopted to represent the reactions of stakeholders to changes. The modeling framework is developed as an attempt to gain a better understanding of citizens' behavior and the factors influencing their decisions, in order to reduce uncertainty in the implementation of water management policy.
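    How the two formalisms plug together can be shown with a toy example; the network structure, probabilities, and decision rule below are invented for illustration, not drawn from the Day river study. Each agent queries a two-node Bayesian network to decide whether to comply with a water policy, while the agent layer remains free to add feedback on top:

```python
import random

# Toy conditional probability tables: P(flood) and P(comply | flood)
P_FLOOD = 0.3
P_COMPLY_GIVEN_FLOOD = {True: 0.9, False: 0.4}

def agent_decides(rng):
    """The BN supplies the beliefs; the ABM layer samples each agent's
    state and action, which is where agent-level feedback can be added."""
    flood = rng.random() < P_FLOOD
    return rng.random() < P_COMPLY_GIVEN_FLOOD[flood]

def compliance_rate(n_agents=10000, seed=3):
    rng = random.Random(seed)
    return sum(agent_decides(rng) for _ in range(n_agents)) / n_agents

rate = compliance_rate()   # expected value: 0.3*0.9 + 0.7*0.4 = 0.55
```

    The division of labor mirrors the abstract: the BN encodes uncertain dependencies explicitly, while the agent loop is where reactions to changing conditions would be simulated.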

  3. Educational Validity of Business Gaming Simulation: A Research Methodology Framework

    ERIC Educational Resources Information Center

    Stainton, Andrew J.; Johnson, Johnnie E.; Borodzicz, Edward P.

    2010-01-01

    Many past educational validity studies of business gaming simulation, and more specifically total enterprise simulation, have been inconclusive. Studies have focused on the weaknesses of business gaming simulation; which is often regarded as an educational medium that has limitations regarding learning effectiveness. However, no attempts have been…

  4. An Agent-Based Model for Studying Child Maltreatment and Child Maltreatment Prevention

    NASA Astrophysics Data System (ADS)

    Hu, Xiaolin; Puddy, Richard W.

    This paper presents an agent-based model that simulates the dynamics of child maltreatment and child maltreatment prevention. The developed model follows the principles of complex systems science and explicitly models a community and its families with multi-level factors and interconnections across the social ecology. This makes it possible to experiment with how different factors and prevention strategies can affect the rate of child maltreatment. We present the background of this work, give an overview of the agent-based model, and show some simulation results.

  5. A New Simulation Framework for Autonomy in Robotic Missions

    NASA Technical Reports Server (NTRS)

    Flueckiger, Lorenzo; Neukom, Christian

    2003-01-01

    Autonomy is a key factor in remote robotic exploration, and there is significant activity addressing the application of autonomy to remote robots. It has become increasingly important to have simulation tools available to test autonomy algorithms. While industrial robotics benefits from a variety of high-quality simulation tools, researchers developing autonomous software are still dependent primarily on block-world simulations. The Mission Simulation Facility (MSF) project addresses this shortcoming with a simulation toolkit that will enable developers of autonomous control systems to test their system's performance against a set of integrated, standardized simulations of NASA mission scenarios. MSF provides a distributed architecture that connects the autonomous system to a set of simulated components replacing the robot hardware and its environment.

  6. A Runtime Verification Framework for Control System Simulation

    SciTech Connect

    Ciraci, Selim; Fuller, Jason C.; Daily, Jeffrey A.; Makhmalbaf, Atefe; Callahan, Charles D.

    2014-08-02

    In a standard workflow for the validation of a control system, the control system is implemented as an extension to a simulator. Such simulators are complex software systems, and engineers may unknowingly violate constraints a simulator places on extensions. As such, errors may be introduced in the implementation of either the control system or the simulator, leading to invalid simulation results. This paper presents a novel runtime verification approach for verifying control system implementations within simulators. The major contribution of the approach is the two-tier specification process. In the first tier, engineers model constraints using a domain-specific language tailored to modeling a controller's response to changes in its input. The language is high-level and effectively hides the implementation details of the simulator, allowing engineers to specify design-level constraints independent of low-level simulator interfaces. In the second tier, simulator developers provide mapping rules for mapping design-level constraints to the implementation of the simulator. Using the rules, an automated tool transforms the design-level specifications into simulator-specific runtime verification specifications and generates monitoring code which is injected into the implementation of the simulator. During simulation, these monitors observe the input and output variables of the control system and report changes to the verifier. The verifier checks whether these changes follow the constraints of the control system. We describe the application of this approach to the verification of the constraints of an HVAC control system implemented with the power grid simulator GridLAB-D.
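    The monitoring idea reduces to a small pattern: a monitor observes each state change of the controller's input and output variables and records transitions that violate a design-level constraint. The thermostat-style constraint below is a made-up example, not the paper's HVAC specification:

```python
class Monitor:
    """Observe successive controller states and record every transition
    that violates a design-level constraint predicate."""
    def __init__(self, constraint):
        self.constraint = constraint   # predicate over (previous, current)
        self.previous = None
        self.violations = []

    def observe(self, state):
        if self.previous is not None and not self.constraint(self.previous, state):
            self.violations.append((self.previous, state))
        self.previous = state

# Hypothetical design-level constraint: whenever the temperature exceeds
# the setpoint, the cooling output must be on.
def cooling_constraint(prev, cur):
    temp, setpoint, cooling_on = cur
    return cooling_on or temp <= setpoint

monitor = Monitor(cooling_constraint)
for state in [(20.0, 22.0, False), (23.0, 22.0, True), (24.0, 22.0, False)]:
    monitor.observe(state)
```

    In the paper's two-tier scheme, the constraint would be written in the domain-specific language and the `observe` hooks would be generated and injected into the simulator by the mapping tool rather than written by hand.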

  7. NEVESIM: event-driven neural simulation framework with a Python interface.

    PubMed

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291
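    The event-driven strategy, processing spike deliveries from a time-ordered queue instead of stepping a global clock, can be sketched minimally. This is an integrate-and-fire toy without leak, not NEVESIM's actual API:

```python
import heapq

def simulate(weights, delays, initial_events, threshold=1.0, t_max=10.0):
    """Each queue entry is a pending spike delivery (time, target, weight);
    a neuron whose potential crosses the threshold fires immediately and
    schedules deliveries to its postsynaptic targets."""
    n = len(weights)
    potential = [0.0] * n
    queue = list(initial_events)     # entries: (delivery_time, neuron, weight)
    heapq.heapify(queue)
    fired = []
    while queue:
        t, dst, w = heapq.heappop(queue)
        if t > t_max:
            break                    # past the simulation horizon
        potential[dst] += w
        if potential[dst] >= threshold:
            potential[dst] = 0.0     # reset on spike
            fired.append((t, dst))
            for post in range(n):
                if weights[dst][post] != 0.0:
                    heapq.heappush(
                        queue, (t + delays[dst][post], post, weights[dst][post]))
    return fired

# Two neurons: neuron 0 excites neuron 1 with weight 1.0 and delay 0.5
W = [[0.0, 1.0], [0.0, 0.0]]
D = [[0.0, 0.5], [0.0, 0.0]]
spikes = simulate(W, D, [(0.0, 0, 1.0)])
```

    The decoupling the abstract describes is visible here: the queue handles network-level event ordering, while the per-neuron update rule is confined to a few lines that could be swapped for a different neuron model.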

  10. Validating agent based models through virtual worlds.

    SciTech Connect

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior.

  11. High performance computing for three-dimensional agent-based molecular models.

    PubMed

    Pérez-Rodríguez, G; Pérez-Pérez, M; Fdez-Riverola, F; Lourenço, A

    2016-07-01

    Agent-based simulations are increasingly popular in exploring and understanding cellular systems, but the natural complexity of these systems and the desire to grasp different modelling levels demand cost-effective simulation strategies and tools. In this context, the present paper introduces novel sequential and distributed approaches for the three-dimensional agent-based simulation of individual molecules in cellular events. These approaches are able to describe the dimensions and position of the molecules with high accuracy and thus, study the critical effect of spatial distribution on cellular events. Moreover, two of the approaches allow multi-thread high performance simulations, distributing the three-dimensional model in a platform independent and computationally efficient way. Evaluation addressed the reproduction of molecular scenarios and different scalability aspects of agent creation and agent interaction. The three approaches simulate common biophysical and biochemical laws faithfully. The distributed approaches show improved performance when dealing with large agent populations while the sequential approach is better suited for small to medium size agent populations. Overall, the main new contribution of the approaches is the ability to simulate three-dimensional agent-based models at the molecular level with reduced implementation effort and moderate-level computational capacity. Since these approaches have a generic design, they have the major potential of being used in any event-driven agent-based tool. PMID:27372059
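
    As a rough illustration of individual-molecule agents, the sketch below moves two molecule agents by fixed-length Brownian steps in 3D and tests for a collision-triggered reaction. All numbers and names are invented here; the paper's actual biophysical laws and distributed scheduling are far richer.

```python
import math
import random

def brownian_step(pos, step=0.1):
    """Move a molecule one fixed-length step in a uniformly random 3D direction."""
    theta = math.acos(2 * random.random() - 1)   # polar angle, uniform on the sphere
    phi = 2 * math.pi * random.random()          # azimuthal angle
    return (pos[0] + step * math.sin(theta) * math.cos(phi),
            pos[1] + step * math.sin(theta) * math.sin(phi),
            pos[2] + step * math.cos(theta))

def collides(a, b, radius=0.05):
    """Two spherical molecules react when their surfaces touch."""
    return math.dist(a, b) < 2 * radius

random.seed(3)
a, b = (0.0, 0.0, 0.0), (0.5, 0.0, 0.0)
for t in range(10000):
    a, b = brownian_step(a), brownian_step(b)
    if collides(a, b):
        print("reaction at step", t)
        break
```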

  13. Agent-based model to rural urban migration analysis

    NASA Astrophysics Data System (ADS)

    Silveira, Jaylson J.; Espíndola, Aquino L.; Penna, T. J. P.

    2006-05-01

    In this paper, we analyze the rural-urban migration phenomenon as it is usually observed in economies which are in the early stages of industrialization. The analysis is conducted by means of a statistical mechanics approach which builds a computational agent-based model. Agents are placed on a lattice and the connections among them are described via an Ising-like model. Simulations on this computational model show some emergent properties that are common in developing economies, such as a transitional dynamics characterized by continuous growth of urban population, followed by the equalization of expected wages between rural and urban sectors (the Harris-Todaro equilibrium condition), urban concentration, and an increase in per capita income.
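
    A toy version of such an Ising-like lattice can be sketched as follows, with the expected-wage gap acting as an external field that biases agents toward the urban sector (+1) while neighbours exert imitation pressure. The update rule and all parameters are illustrative, not the authors' exact model.

```python
import math
import random

def step(state, wage_gap, beta=1.0):
    """Heat-bath update of one random agent on a ring of +1/-1 sector choices."""
    n = len(state)
    i = random.randrange(n)
    neighbours = state[(i - 1) % n] + state[(i + 1) % n]
    field = neighbours + wage_gap               # social imitation + wage pull
    p_urban = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
    state[i] = 1 if random.random() < p_urban else -1

random.seed(42)
state = [-1] * 200                              # everyone starts rural
for _ in range(20000):
    step(state, wage_gap=1.5)
urban_share = state.count(1) / len(state)
print(round(urban_share, 2))                    # a persistent urban majority emerges
```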

  14. Agent Based Model of Livestock Movements

    NASA Astrophysics Data System (ADS)

    Miron, D. J.; Emelyanova, I. V.; Donald, G. E.; Garner, G. M.

    The modelling of livestock movements within Australia is of national importance for the purposes of the management and control of exotic disease spread, infrastructure development and the economic forecasting of livestock markets. In this paper an agent based model for the forecasting of livestock movements is presented. This models livestock movements from farm to farm through a saleyard. The decision of farmers to sell or buy cattle is often complex and involves many factors such as climate forecast, commodity prices, the type of farm enterprise, the number of animals available and associated off-shore effects. In this model the farm agent's intelligence is implemented using a fuzzy decision tree that utilises two of these factors. These two factors are the livestock price fetched at the last sale and the number of stock on the farm. On each iteration of the model farms choose either to buy, sell or abstain from the market thus creating an artificial supply and demand. The buyers and sellers then congregate at the saleyard where livestock are auctioned using a second price sealed bid. The price time series output by the model exhibits properties similar to those found in real livestock markets.
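
    The auction step is the easiest part to make concrete: in a second-price sealed-bid (Vickrey) auction the highest bidder wins but pays the second-highest bid. A minimal sketch follows; the farm names and bid values are invented for illustration.

```python
def second_price_auction(bids):
    """bids: {bidder: amount}. Highest bidder wins, pays second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

winner, price = second_price_auction({"farm_a": 950, "farm_b": 870, "farm_c": 910})
print(winner, price)  # → farm_a 910: farm_a wins but pays farm_c's bid
```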

  15. CrusDe: A plug-in based simulation framework for composable CRUStal DEformation simulations

    NASA Astrophysics Data System (ADS)

    Grapenthin, R.

    2008-12-01

    Within geoscience, Green's method is an established mathematical tool to analyze the dynamics of the Earth's crust in response to the application of a mass force, e.g. a surface load. Different abstractions from the Earth's interior as well as the particular effects caused by such a force are expressed by means of a Green's function, G, which is a particular solution to an inhomogeneous differential equation with boundary conditions. Surface loads, L, are defined by real data or as analytical expressions. The response of the crust to a surface load is obtained by a 2D convolution (**) of the Green's function with this load. The crustal response can be thought of as an instantaneous displacement which is followed by a gradual transition towards the final relaxed state of displacement. A relaxation function, R, describing such a transition depends on the rheological model for the ductile layer of the crust. The 1D convolution (*) of the relaxation function with a load history, H, makes it possible to include the temporal evolution of the surface load in a model. The product of the two convolution results expresses the displacement (rate) of the crust, U, at a certain time t: U_t = (R * H)_t · (G ** L) (Eq. 1). Rather than implementing a variety of specific models, approaching crustal deformation problems from the general formulation in Eq. 1 opens the opportunity to consider reuse of model building blocks within a more flexible simulation framework. Model elements (Green's function, load function, etc.), operators, pre- and postprocessing, and even input and output routines could be part of a framework that enables a user to freely compose software components to resemble Eq. 1. The simulation framework CrusDe implements Eq. 1 in the proposed way. CrusDe's architecture defines interfaces for generic communication between the simulation core and the model elements. Thus, exchangeability of the particular model element implementations is possible. In the presented plug
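
    The composition U_t = (R * H)_t · (G ** L) from the abstract above can be sketched numerically with toy stand-ins for the Green's function, load, load history and relaxation function (none of these are CrusDe's actual model elements):

```python
import math

def conv2d_same(G, L):
    """'Same'-size 2D convolution of a (2r+1)x(2r+1) kernel G with grid L."""
    r = len(G) // 2
    n, m = len(L), len(L[0])
    out = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            s = 0.0
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ii, jj = i - di, j - dj
                    if 0 <= ii < n and 0 <= jj < m:
                        s += G[di + r][dj + r] * L[ii][jj]
            out[i][j] = s
    return out

G = [[0.05, 0.1, 0.05],
     [0.1,  0.4, 0.1],
     [0.05, 0.1, 0.05]]                        # toy Green's function
L = [[0.0] * 5 for _ in range(5)]
L[2][2] = 1.0                                  # point load at the grid center
spatial = conv2d_same(G, L)                    # G ** L

R = [math.exp(-t / 2.0) for t in range(5)]     # toy relaxation function
H = [1.0, 0.0, 0.0, 0.0, 0.0]                  # load switched on at t = 0
temporal = [sum(R[k] * H[t - k] for k in range(t + 1)) for t in range(5)]  # (R * H)_t

U = temporal[2] * spatial[2][2]                # center displacement at t = 2
print(round(U, 3))                             # → 0.147
```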

  16. Systematic Assessment of Communication Games and Simulations: An Applied Framework.

    ERIC Educational Resources Information Center

    Lederman, Linda C.; Ruben, Brent D.

    1984-01-01

    Reviews the components of simulations and games (roles, interactions, rules, goals, outcomes) and the criteria (validity, reliability, utility) by which they can be assessed. Provides a model for including these criteria in the selection/design, use, and assessment of communication simulations and games. (PD)

  17. Developing and Implementing a Framework of Participatory Simulation for Mobile Learning Using Scaffolding

    ERIC Educational Resources Information Center

    Yin, Chengjiu; Song, Yanjie; Tabata, Yoshiyuki; Ogata, Hiroaki; Hwang, Gwo-Jen

    2013-01-01

    This paper proposes a conceptual framework, scaffolding participatory simulation for mobile learning (SPSML), used on mobile devices for helping students learn conceptual knowledge in the classroom. As the pedagogical design, the framework adopts an experiential learning model, which consists of five sequential but cyclic steps: the initial stage,…

  18. Agent based models for testing city evacuation strategies under a flood event as strategy to reduce flood risk

    NASA Astrophysics Data System (ADS)

    Medina, Neiler; Sanchez, Arlex; Nokolic, Igor; Vojinovic, Zoran

    2016-04-01

    This research explores the use of Agent Based Models (ABM) and their potential to test large-scale evacuation strategies in coastal cities at risk from flood events due to extreme hydro-meteorological events, with the final purpose of disaster risk reduction by decreasing humans' exposure to the hazard. The first part of the paper corresponds to the theory used to build the models, such as complex adaptive systems (CAS) and the principles and uses of ABM in this field. This section also outlines the pros and cons of using ABM to test city evacuation strategies at medium and large scale. The second part of the paper focuses on the central theory used to build the ABM, specifically the psychological and behavioral model as well as the framework used in this research; the PECS reference model is covered in this section. The last part of this section covers the main attributes or characteristics of human beings used to describe the agents. The third part of the paper shows the methodology used to build and implement the ABM using Repast Simphony as an open-source agent-based modelling and simulation platform. Preliminary results for the first implementation in a region of the island of Sint Maarten, a Dutch Caribbean island, are presented and discussed in the fourth section of the paper. The results obtained so far are promising for further development of the model and its implementation and testing in a full-scale city.

  19. Framework Application for Core-Edge Transport Simulations

    2007-06-13

    FACETS is a whole-device model for magnetic-fusion experiments (including ITER) combining physics effects from sources & sinks, wall effects, edge effects, and core effects in an advanced parallel framework which manages allocation of parallel resources, performs runtime performance analysis, and provides tools for interactive steering and visualization. FACETS will be used by fusion researchers to design experimental campaigns, predict and model fusion experimental phenomena, and design and optimize future machines.

  20. Improving Agent Based Models and Validation through Data Fusion

    PubMed Central

    Laskowski, Marek; Demianyk, Bryan C.P.; Friesen, Marcia R.; McLeod, Robert D.; Mukhi, Shamir N.

    2011-01-01

    This work is contextualized in research in modeling and simulation of infection spread within a community or population, with the objective to provide a public health and policy tool in assessing the dynamics of infection spread and the qualitative impacts of public health interventions. This work uses the integration of real data sources into an Agent Based Model (ABM) to simulate respiratory infection spread within a small municipality. Novelty is derived in that the data sources are not necessarily obvious within ABM infection spread models. The ABM is a spatial-temporal model inclusive of behavioral and interaction patterns between individual agents on a real topography. The agent behaviours (movements and interactions) are fed by census / demographic data, integrated with real data from a telecommunication service provider (cellular records) and person-person contact data obtained via a custom 3G Smartphone application that logs Bluetooth connectivity between devices. Each source provides data of varying type and granularity, thereby enhancing the robustness of the model. The work demonstrates opportunities in data mining and fusion that can be used by policy and decision makers. The data become real-world inputs into individual SIR disease spread models and variants, thereby building credible and non-intrusive models to qualitatively simulate and assess public health interventions at the population level. PMID:23569606
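
    The SIR core that such fused data ultimately feeds can be sketched as a minimal discrete-time model; the transmission and recovery rates below are illustrative, not calibrated values from the study:

```python
def sir_step(s, i, r, beta, gamma):
    """One discrete time step of the classic SIR compartment model
    (s, i, r are population fractions and always sum to 1)."""
    new_inf = beta * s * i          # fraction newly infected this step
    new_rec = gamma * i             # fraction newly recovered this step
    return s - new_inf, i + new_inf - new_rec, r + new_rec

s, i, r = 0.99, 0.01, 0.0           # start with 1% of the population infected
for _ in range(100):
    s, i, r = sir_step(s, i, r, beta=0.3, gamma=0.1)
print(round(r, 2))                  # a large fraction was eventually infected
```

In the ABM, the per-contact transmission events between individual agents play the role that the aggregate beta*s*i term plays here.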

  1. Agent-Based Modeling of Cancer Stem Cell Driven Solid Tumor Growth.

    PubMed

    Poleszczuk, Jan; Macklin, Paul; Enderling, Heiko

    2016-01-01

    Computational modeling of tumor growth has become an invaluable tool to simulate complex cell-cell interactions and emerging population-level dynamics. Agent-based models are commonly used to describe the behavior and interaction of individual cells in different environments. Behavioral rules can be informed and calibrated by in vitro assays, and emerging population-level dynamics may be validated with both in vitro and in vivo experiments. Here, we describe the design and implementation of a lattice-based agent-based model of cancer stem cell driven tumor growth. PMID:27044046
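
    A bare-bones lattice sketch of stem-cell-driven growth might look like the following. The rules are deliberately simplified for illustration (e.g. non-stem cells here divide without the limited proliferation capacity a faithful model would impose) and are not the paper's implementation:

```python
import random

def grow(steps, p_symmetric=0.1):
    """Grow a tumor on a 2D lattice from a single stem cell at the origin."""
    grid = {(0, 0): "S"}               # (x, y) -> "S" stem or "C" non-stem
    for _ in range(steps):
        x, y = random.choice(list(grid))
        free = [(x + dx, y + dy)
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if (x + dx, y + dy) not in grid]
        if not free:
            continue                   # contact inhibition: no room to divide
        if grid[(x, y)] == "S" and random.random() < p_symmetric:
            grid[random.choice(free)] = "S"   # symmetric stem division
        else:
            grid[random.choice(free)] = "C"   # non-stem daughter cell
    return grid

random.seed(1)
tumor = grow(500)
print(len(tumor), sum(1 for c in tumor.values() if c == "S"))
```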

  2. A Distributed Platform for Global-Scale Agent-Based Models of Disease Transmission

    PubMed Central

    Parker, Jon; Epstein, Joshua M.

    2013-01-01

    The Global-Scale Agent Model (GSAM) is presented. The GSAM is a high-performance distributed platform for agent-based epidemic modeling capable of simulating a disease outbreak in a population of several billion agents. It is unprecedented in its scale, its speed, and its use of Java. Solutions to multiple challenges inherent in distributing massive agent-based models are presented. Communication, synchronization, and memory usage are among the topics covered in detail. The memory usage discussion is Java specific. However, the communication and synchronization discussions apply broadly. We provide benchmarks illustrating the GSAM’s speed and scalability. PMID:24465120

  3. Using Agent Base Models to Optimize Large Scale Network for Large System Inventories

    NASA Technical Reports Server (NTRS)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

    The aim of this paper is to use Agent Base Models (ABM) to optimize large-scale network handling capabilities for large system inventories and to implement strategies for the purpose of reducing capital expenses. The models used in this paper employ either computational algorithms or procedure implementations developed in Matlab to simulate agent-based models, using clusters; these clusters provide high-performance computing capacity to run the program in parallel. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system.

  4. Hierarchical Petascale Simulation Framework for Stress Corrosion Cracking

    SciTech Connect

    Vashishta, Priya

    2014-12-01

    Reaction Dynamics in Energetic Materials: Detonation is a prototype of mechanochemistry, in which mechanically and thermally induced chemical reactions far from equilibrium exhibit vastly different behaviors. It is also one of the hardest multiscale physics problems, in which diverse length and time scales play important roles. The CACS group has performed multimillion-atom reactive MD simulations to reveal a novel two-stage reaction mechanism during the detonation of cyclotrimethylenetrinitramine (RDX) crystal. Rapid production of N2 and H2O within ~10 ps is followed by delayed production of CO molecules within ~ 1 ns. They found that further decomposition towards the final products is inhibited by the formation of large metastable C- and O-rich clusters with fractal geometry. The CACS group has also simulated the oxidation dynamics of close-packed aggregates of aluminum nanoparticles passivated by oxide shells. Their simulation results suggest an unexpectedly active role of the oxide shell as a nanoreactor.

  5. NPTool: a simulation and analysis framework for low-energy nuclear physics experiments

    NASA Astrophysics Data System (ADS)

    Matta, A.; Morfouace, P.; de Séréville, N.; Flavigny, F.; Labiche, M.; Shearman, R.

    2016-08-01

    The Nuclear Physics Tool (NPTool) is an open source data analysis and Monte Carlo simulation framework that has been developed for low-energy nuclear physics experiments with an emphasis on radioactive beam experiments. The NPTool offers a unified framework for designing, preparing and analyzing complex experiments employing multiple detectors, each of which may comprise some hundreds of channels. The framework has been successfully used for the analysis and simulation of experiments at facilities including GANIL, RIKEN, ALTO and TRIUMF, using both stable and radioactive beams. This paper details the NPTool philosophy together with an overview of the workflow. The framework has been benchmarked through the comparison of simulated and experimental data for a variety of detectors used in charged particle and gamma-ray spectroscopy.

  6. Maestro: An Orchestration Framework for Large-Scale WSN Simulations

    PubMed Central

    Riliskis, Laurynas; Osipov, Evgeny

    2014-01-01

    Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation. PMID:24647123

  8. Design and Performance Frameworks for Constructing Problem-Solving Simulations

    ERIC Educational Resources Information Center

    Stevens, Rons; Palacio-Cayetano, Joycelin

    2003-01-01

    Rapid advancements in hardware, software, and connectivity are helping to shorten the times needed to develop computer simulations for science education. These advancements, however, have not been accompanied by corresponding theories of how best to design and use these technologies for teaching, learning, and testing. Such design frameworks…

  9. Simulation toolkit with CMOS detector in the framework of hadrontherapy

    NASA Astrophysics Data System (ADS)

    Rescigno, R.; Finck, Ch.; Juliani, D.; Baudot, J.; Dauvergne, D.; Dedes, G.; Krimmer, J.; Ray, C.; Reithinger, V.; Rousseau, M.; Testa, E.; Winter, M.

    2014-03-01

    Proton imaging can be seen as a powerful technique for on-line monitoring of ion range during carbon ion therapy irradiation. The proton detection technique uses, as a three-dimensional tracking system, a set of CMOS sensor planes. A simulation toolkit based on GEANT4 and ROOT is presented, including the detector response and the reconstruction algorithm.

  10. Demeter, persephone, and the search for emergence in agent-based models.

    SciTech Connect

    North, M. J.; Howe, T. R.; Collier, N. T.; Vos, J. R.; Decision and Information Sciences; Univ. of Chicago; PantaRei Corp.; Univ. of Illinois

    2006-01-01

    In Greek mythology, the earth goddess Demeter was unable to find her daughter Persephone after Persephone was abducted by Hades, the god of the underworld. Demeter is said to have embarked on a long and frustrating, but ultimately successful, search to find her daughter. Unfortunately, long and frustrating searches are not confined to Greek mythology. In modern times, agent-based modelers often face similar troubles when searching for agents that are to be connected to one another and when seeking appropriate target agents while defining agent behaviors. The result is a 'search for emergence' in that many emergent or potentially emergent behaviors in agent-based models of complex adaptive systems either implicitly or explicitly require search functions. This paper considers a new nested querying approach to simplifying such agent-based modeling and multi-agent simulation search problems.
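
    The nested-querying idea can be sketched as composable predicates over an agent population, so that searches are declared rather than hand-coded as loops. The `Query` class and agent attributes below are invented for illustration:

```python
class Query:
    """A composable search predicate over agents."""
    def __init__(self, pred):
        self.pred = pred

    def __and__(self, other):
        # nest two queries: an agent must satisfy both predicates
        return Query(lambda a: self.pred(a) and other.pred(a))

    def within(self, agents):
        return [a for a in agents if self.pred(a)]

agents = [{"id": 0, "wealth": 10, "group": "A"},
          {"id": 1, "wealth": 50, "group": "A"},
          {"id": 2, "wealth": 70, "group": "B"}]

rich = Query(lambda a: a["wealth"] > 40)
in_a = Query(lambda a: a["group"] == "A")
targets = (rich & in_a).within(agents)
print([a["id"] for a in targets])  # → [1]
```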

  11. NASA Earth Observing System Simulator Suite (NEOS3): A Forward Simulation Framework for Observing System Simulation Experiments

    NASA Astrophysics Data System (ADS)

    Niamsuwan, N.; Tanelli, S.; Johnson, M. P.; Jacob, J. C.; Jaruwatanadilok, S.; Oveisgharan, S.; Dao, D.; Simard, M.; Turk, F. J.; Tsang, L.; Liao, T. H.; Chau, Q.

    2014-12-01

    Future Earth observation missions will produce a large volume of interrelated data sets that will help us to cross-calibrate and validate spaceborne sensor measurements. A forward simulator is a crucial tool for examining the quality of individual products as well as resolving discrepancies among related data sets. The NASA Earth Observing System Simulator Suite (NEOS3) is a highly customizable forward simulation tool for Earth remote sensing instruments. Its three-stage simulation process converts the 3D geophysical description of the scene being observed to corresponding electromagnetic emission and scattering signatures, and finally to observable parameters as reported by a (passive or active) remote sensing instrument. User-configurable options include selection of models for describing geophysical properties of atmospheric particles and their effects on the signal of interest, selection of wave scattering and propagation models, and activation of simplifying assumptions (trading between computation time and solution accuracy). The next generation of NEOS3, to be released in 2015, will feature additional state-of-the-art electromagnetic scattering models for various types of the Earth's surfaces and ground covers (e.g. layered snowpack, forest, vegetated soil, and sea ice) tailored specifically for missions like GPM and SMAP. Also to be included in 2015 are dedicated functionalities and interfaces that facilitate integrating NEOS3 into Observing System Simulation Experiment (OSSE) environments. This new generation of NEOS3 can also utilize high performance computing resources (parallel processing and cloud computing) and can be scaled to handle large or computation-intensive problems. This presentation will highlight some notable features of NEOS3. Demonstration of its applications for evaluating new mission concepts, especially in the context of OSSE frameworks, will also be presented.

  13. Consentaneous Agent-Based and Stochastic Model of the Financial Markets

    PubMed Central

    Gontis, Vygintas; Kononovicius, Aleksejus

    2014-01-01

    We develop an agent-based treatment of the financial markets, motivated by the need to build bridges between microscopic (agent-based) and macroscopic (phenomenological) modeling. The acknowledgment that the agent-based modeling framework, which may provide qualitative and quantitative understanding of the financial markets, is highly ambiguous emphasizes the exceptional value of well-defined, analytically tractable agent systems. Herding, one of the behavioral peculiarities considered in behavioral finance, is the main property of the agent interactions we deal with in this contribution. Seeking a consentaneous agent-based and macroscopic approach, we combine two origins of noise: an exogenous one, related to the information flow, and an endogenous one, arising from the complex stochastic dynamics of the agents. As a result we propose a three-state agent-based herding model of the financial markets. From this agent-based model we derive a set of stochastic differential equations that describes the underlying macroscopic dynamics of the agent population and the log price in the financial markets. The obtained solution is then subjected to the exogenous noise, which shapes the instantaneous return fluctuations. We test both Gaussian and q-Gaussian noise as sources of the short-term fluctuations. The resulting model of returns in the financial markets, with the same set of parameters, reproduces the empirical probability and spectral densities of absolute returns observed on the New York, Warsaw, and NASDAQ OMX Vilnius Stock Exchanges. Our result confirms the prevalent idea in behavioral finance that herding interactions may dominate over agent rationality and contribute to bubble formation. PMID:25029364

  14. Designing a Virtual Olympic Games Framework by Using Simulation in Web 2.0 Technologies

    ERIC Educational Resources Information Center

    Stoilescu, Dorian

    2013-01-01

    In the past, instructional simulation faced major difficulties, offering limited possibilities for practice and learning. This article proposes a link between instructional simulation and Web 2.0 technologies. More exactly, I present the design of the Virtual Olympic Games Framework (VOGF) as a significant demonstration of how interactivity in…

  15. Agent-Based Mapping of Credit Risk for Sustainable Microfinance

    PubMed Central

    Lee, Joung-Hun; Jusup, Marko; Podobnik, Boris; Iwasa, Yoh

    2015-01-01

    By drawing analogies with independent research areas, we propose an unorthodox framework for mapping microfinance credit risk, a major obstacle to the sustainability of lenders reaching out to the poor. Specifically, using elements of network theory, we constructed an agent-based model that obeys the stylized rules of the microfinance industry. We found that in a deteriorating economic environment confounded with adverse selection, a form of latent moral hazard may cause a regime shift from a high to a low loan payment probability. An after-the-fact recovery, when possible, required the economic environment to improve beyond that which led to the shift in the first place. These findings suggest a small set of measurable quantities for mapping microfinance credit risk and, consequently, for balancing the requirements to reasonably price loans and to operate on a fully self-financed basis. We illustrate how the proposed mapping works using a 10-year monthly data set from one of the best-known microfinance representatives, Grameen Bank in Bangladesh. Finally, we discuss an entirely new perspective for managing microfinance credit risk based on enticing spontaneous cooperation by building social capital. PMID:25945790
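
    The regime shift and the delayed recovery can be illustrated with a mean-field caricature: the payment probability p feeds back on itself (herd/moral-hazard term beta*p) on top of the economic environment E, producing bistability and hysteresis. The logistic map and every parameter below are invented for illustration, not taken from the paper's model:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def fixed_point(E, p0, alpha=1.0, beta=10.0, c=5.0, iters=200):
    # Mean-field payment probability: environment alpha*E plus herd term beta*p.
    p = p0
    for _ in range(iters):
        p = sigmoid(alpha * E + beta * p - c)
    return p

# Sweep the economic environment down, then back up, carrying the previous
# payment probability as the starting point: the two branches differ
# (hysteresis), so recovery needs a better environment than the collapse did.
down, up = {}, {}
p = 0.99
for E in [2.0, 1.5, 1.0, 0.5, 0.0, -0.5, -1.0, -1.5, -2.0]:
    p = fixed_point(E, p)
    down[E] = p
for E in [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]:
    p = fixed_point(E, p)
    up[E] = p
```

    On the downward sweep the high-payment branch survives until E is well below zero; on the upward sweep the system stays on the low branch past E = 0, mirroring the abstract's observation that recovery requires the environment to improve beyond the level that triggered the shift.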

  16. Agent-based mapping of credit risk for sustainable microfinance.

    PubMed

    Lee, Joung-Hun; Jusup, Marko; Podobnik, Boris; Iwasa, Yoh

    2015-01-01

    By drawing analogies with independent research areas, we propose an unorthodox framework for mapping microfinance credit risk, a major obstacle to the sustainability of lenders reaching out to the poor. Specifically, using elements of network theory, we constructed an agent-based model that obeys the stylized rules of the microfinance industry. We found that in a deteriorating economic environment confounded with adverse selection, a form of latent moral hazard may cause a regime shift from a high to a low loan payment probability. An after-the-fact recovery, when possible, required the economic environment to improve beyond that which led to the shift in the first place. These findings suggest a small set of measurable quantities for mapping microfinance credit risk and, consequently, for balancing the requirements to reasonably price loans and to operate on a fully self-financed basis. We illustrate how the proposed mapping works using a 10-year monthly data set from one of the best-known microfinance representatives, Grameen Bank in Bangladesh. Finally, we discuss an entirely new perspective for managing microfinance credit risk based on enticing spontaneous cooperation by building social capital.

  18. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analyzing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service-workflow-oriented framework for simulating service business processes using multi-agent cooperation to address these issues. The social rationality of agents is introduced into the proposed framework. By adopting rationality as one social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
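
    One way the "social rationality" factor can enter scheduling is as a weight blending an agent's self-interested cost against the group outcome when activity instances are dispatched. The bidding rule, the weight r, and all names below are an invented sketch of that idea, not the paper's mechanism:

```python
# Rationality-weighted dispatch: each agent's bid blends its own new load
# (self-interest) with the resulting global makespan (social rationality r).
def dispatch(tasks, n_agents=3, r=0.7):
    loads = [0.0] * n_agents
    assignment = []
    for dur in tasks:
        def bid(a):
            own = loads[a] + dur
            makespan = max(own, *(loads[b] for b in range(n_agents) if b != a))
            return (1 - r) * own + r * makespan
        best = min(range(n_agents), key=bid)   # lowest bid wins the instance
        loads[best] += dur
        assignment.append(best)
    return loads, assignment

loads, assignment = dispatch([4.0, 2.0, 3.0, 1.0, 5.0])
```

    With r near 1 the agents effectively balance the global makespan; with r near 0 each simply minimizes its own completion time.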

  19. E-laboratories: agent-based modeling of electricity markets.

    SciTech Connect

    North, M.; Conzelmann, G.; Koritarov, V.; Macal, C.; Thimmapuram, P.; Veselka, T.

    2002-05-03

    Electricity markets are complex adaptive systems that operate under a wide range of rules that span a variety of time scales. These rules are imposed both from above by society and below by physics. Many electricity markets are undergoing or are about to undergo a transition from centrally regulated systems to decentralized markets. Furthermore, several electricity markets have recently undergone this transition with extremely unsatisfactory results, most notably in California. These high stakes transitions require the introduction of largely untested regulatory structures. Suitable laboratories that can be used to test regulatory structures before they are applied to real systems are needed. Agent-based models can provide such electronic laboratories or ''e-laboratories.'' To better understand the requirements of an electricity market e-laboratory, a live electricity market simulation was created. This experience helped to shape the development of the Electricity Market Complex Adaptive Systems (EMCAS) model. To explore EMCAS' potential as an e-laboratory, several variations of the live simulation were created. These variations probed the possible effects of changing power plant outages and price setting rules on electricity market prices.
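
    The core of such an electronic laboratory can be illustrated with a toy uniform-price market: generator agents submit (capacity, price) bids, the market stacks them by price, and the marginal accepted bid sets the clearing price. The numbers below are illustrative only, and the rule experiment (a plant outage) is a made-up stand-in for the probes described above:

```python
# Toy uniform-price electricity market clearing.
def clear_market(bids, demand):
    accepted, supplied, price = [], 0.0, 0.0
    for cap, p in sorted(bids, key=lambda b: b[1]):   # merit order
        if supplied >= demand:
            break
        take = min(cap, demand - supplied)
        supplied += take
        price = p            # marginal accepted bid sets the price
        accepted.append((take, p))
    return price, supplied

bids = [(100, 20.0), (80, 35.0), (60, 50.0)]   # (MW, $/MWh)
price_normal, _ = clear_market(bids, demand=150)
# An outage of the cheapest plant pushes the marginal unit up the stack.
price_outage, _ = clear_market(bids[1:], demand=130)
```

    Even this caricature shows why outage and price-setting rules are worth probing in an e-laboratory before a real market transition: removing one cheap unit moves the clearing price from 35 to 50 $/MWh.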

  20. An agent-based mathematical model about carp aggregation

    NASA Astrophysics Data System (ADS)

    Liang, Yu; Wu, Chao

    2005-05-01

    This work presents an agent-based mathematical model to simulate the aggregation of carp, a harmful fish in North America. The mathematical model is derived from the following assumptions: (1) rather than arising from a consensus among all the carp involved, carp aggregation is a completely random and spontaneous physical behavior of numerous independent carp; (2) carp aggregation is a collective effect of inter-carp and carp-environment interactions; (3) the inter-carp interaction can be derived from statistical analysis of large-scale observational data. The proposed mathematical model is based mainly on an empirical inter-carp force field whose effect features repulsion, parallel orientation, attraction, an out-of-perception zone, and a blind zone. Based on this mathematical model, the aggregation behavior of carp is formulated, and preliminary simulation results for the aggregation of a small number of carp within a simple environment are provided. Further experiment-based validation of the mathematical model will be carried out in our future work.
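
    The zone structure of such an inter-carp force field can be sketched as a pairwise force that switches behavior with distance: repulsion up close, pure orientation (no radial force) at intermediate range, attraction farther out, and nothing beyond the perception radius. The zone radii and unit force magnitudes below are invented for illustration:

```python
import math

R_REP, R_ORI, R_ATT = 1.0, 3.0, 6.0  # illustrative zone boundaries

def pair_force(dx, dy):
    """Radial force on a carp from a neighbour displaced by (dx, dy)."""
    d = math.hypot(dx, dy)
    if d == 0 or d > R_ATT:          # out-of-perception zone
        return (0.0, 0.0)
    ux, uy = dx / d, dy / d          # unit vector toward the neighbour
    if d < R_REP:                    # repulsion zone: move away
        return (-ux, -uy)
    if d < R_ORI:                    # orientation zone: align, no radial force
        return (0.0, 0.0)
    return (ux, uy)                  # attraction zone: move closer

fx_close, _ = pair_force(0.5, 0.0)   # inside repulsion zone
fx_far, _ = pair_force(5.0, 0.0)     # inside attraction zone
fx_gone, _ = pair_force(10.0, 0.0)   # beyond perception
```

    A full agent step would sum `pair_force` over perceived neighbours (excluding those in the rear blind zone) and integrate the positions; the sketch isolates the zone logic only.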

  1. A scalable framework for the global offline community land model ensemble simulation

    DOE PAGES

    Wang, Dali; Domke, Jens; Mao, Jiafu; Shi, Xiaoying; Ricciuto, Daniel M.

    2016-01-01

    Current earth system models have a large range of uncertainty, owing to differences in the simulation of feedbacks and insufficient information to constrain model parameters. Parameter disturbance experiments provide a straightforward method to quantify the variation (uncertainty) in outputs caused by model inputs. Owing to the software complexity and computational intensity of earth system models, a large-scale simulation framework is needed to support the ensemble simulations required by parameter disturbance experiments. This paper presents a parallel framework for community land model ensemble simulation. After a software structure review of the community land model simulation, a single-factor parameter disturbance experiment from a reference computational experiment design is used to demonstrate the software design principles, the computational characteristics of an individual application, the parallel ensemble simulation implementation, and the weak scalability of this simulation framework on a high-end computer. Lastly, the paper discusses some preliminary diagnostic analysis results of the single-factor parameter disturbance experiments. The framework design considerations and implementation details described in this paper can benefit many other research programmes involving large-scale, legacy modelling systems.
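
    The single-factor disturbance idea reduces to running one model many times while perturbing a single parameter and measuring the output spread. The toy model and values below are a stand-in for an expensive land-model run, purely for illustration:

```python
# Single-factor parameter disturbance sketch: perturb one parameter across
# ensemble members, hold everything else fixed, and use the output spread
# as a measure of parameter-induced uncertainty.
def toy_land_model(base_rate, scale):
    # Stand-in for an expensive simulation returning one diagnostic.
    return base_rate * scale + 1.0

def ensemble(param_values, scale=2.0):
    return [toy_land_model(p, scale) for p in param_values]

members = ensemble([0.8, 0.9, 1.0, 1.1, 1.2])   # perturbed parameter values
spread = max(members) - min(members)             # ensemble spread
```

    Because the members are independent, the list comprehension is trivially replaced by a parallel map (e.g. `multiprocessing.Pool.map`), which is essentially what a large-scale ensemble framework organizes on an HPC machine.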

  2. A scalable framework for the global offline community land model ensemble simulation

    SciTech Connect

    Wang, Dali; Domke, Jens; Mao, Jiafu; Shi, Xiaoying; Ricciuto, Daniel M.

    2016-01-01

    Current earth system models have a large range of uncertainty, owing to differences in the simulation of feedbacks and insufficient information to constrain model parameters. Parameter disturbance experiments provide a straightforward method to quantify the variation (uncertainty) in outputs caused by model inputs. Owing to the software complexity and computational intensity of earth system models, a large-scale simulation framework is needed to support the ensemble simulations required by parameter disturbance experiments. This paper presents a parallel framework for community land model ensemble simulation. After a software structure review of the community land model simulation, a single-factor parameter disturbance experiment from a reference computational experiment design is used to demonstrate the software design principles, the computational characteristics of an individual application, the parallel ensemble simulation implementation, and the weak scalability of this simulation framework on a high-end computer. Lastly, the paper discusses some preliminary diagnostic analysis results of the single-factor parameter disturbance experiments. The framework design considerations and implementation details described in this paper can benefit many other research programmes involving large-scale, legacy modelling systems.

  3. Atomistic Simulation of Protein Encapsulation in Metal-Organic Frameworks.

    PubMed

    Zhang, Haiyang; Lv, Yongqin; Tan, Tianwei; van der Spoel, David

    2016-01-28

    Fabrication of metal-organic frameworks (MOFs) with large apertures has opened a brand-new research area for the selective encapsulation of biomolecules within MOF nanopores. The underlying inclusion mechanism, however, has yet to be clarified. Here we report a molecular dynamics study on the mechanism of protein encapsulation in MOFs. Evaluation of the binding of amino acid side-chain analogues reveals that the van der Waals interaction is the main driving force for binding and that guest size acts as a key factor predicting protein binding with MOFs. Analysis of the conformation and thermodynamic stability of the miniprotein Trp-cage encapsulated in a series of MOFs with varying pore apertures and surface chemistries indicates that protein encapsulation can be achieved by maintaining a polar/nonpolar balance in the MOF surface through tunable modification of organic linkers and Mg-O chelating moieties. Such modifications endow MOFs with a more biocompatible confinement. This work provides guidelines for the selective inclusion of biomolecules within MOFs and facilitates MOF functions as a new class of host materials and molecular chaperones.

   4. Turbulent Simulations of Divertor Detachment Based on the BOUT++ Framework

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Xu, Xueqiao; Xia, Tianyang; Ye, Minyou

    2015-11-01

    The China Fusion Engineering Testing Reactor is under conceptual design, acting as a bridge between ITER and DEMO. Detached divertor operation offers great promise for reducing the heat flux onto the divertor target plates to levels with acceptable erosion. Therefore, a density scan is performed via an increase of the D2 gas puffing rate in the range of 0.0 to 5.0 × 10^23 s^-1, using the B2-Eirene/SOLPS 5.0 code package to study heat flux control and impurity screening properties. As the density increases, the divertor operation gradually changes from the low-recycling regime to the high-recycling regime and finally to detachment. Significant radiation loss inside the confined plasma in the divertor region during detachment leads to strong parallel density and temperature gradients. Based on the SOLPS simulations, BOUT++ simulations will be presented to investigate the stability and turbulent transport under divertor plasma detachment, particularly the strong parallel-gradient-driven instabilities and the enhanced plasma turbulence that spreads the heat flux over larger surface areas. The correlation between outer mid-plane and divertor turbulence and the related transport will be analyzed. Prepared by LLNL under Contract DE-AC52-07NA27344. LLNL-ABS-675075.

  5. Consistent and conservative framework for incompressible multiphase flow simulations

    NASA Astrophysics Data System (ADS)

    Owkes, Mark; Desjardins, Olivier

    2015-11-01

    We present a computational methodology for convection that handles discontinuities with second order accuracy and maintains conservation to machine precision. We use this method in the context of an incompressible gas-liquid flow to transport the phase interface, momentum, and scalars. Using the same methodology for all the variables ensures discretely consistent transport, which is necessary for robust and accurate simulations of turbulent atomizing flows with high-density ratios. The method achieves conservative transport by computing consistent fluxes on a refined mesh, which ensures all conserved quantities are fluxed with the same discretization. Additionally, the method seamlessly couples semi-Lagrangian fluxes used near the interface with finite difference fluxes used away from the interface. The semi-Lagrangian fluxes are three-dimensional, un-split, and conservatively handle discontinuities. Careful construction of the fluxes ensures they are divergence-free and no gaps or overlaps form between neighbors. We have tested and used the scheme for many cases and demonstrate a simulation of an atomizing liquid jet.
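
    The machine-precision conservation claim rests on flux-form transport: every face flux is added to one cell and subtracted from its neighbour, so the total over a periodic domain telescopes exactly. The sketch below uses first-order upwind for brevity (the paper's scheme is second order, un-split, and semi-Lagrangian near the interface), with invented data:

```python
# Flux-form (conservative) 1D advection step on a periodic domain.
def upwind_step(u, c):
    # c = velocity * dt / dx (CFL number), assumed positive.
    n = len(u)
    flux = [c * u[i - 1] for i in range(n)]   # flux through the left face of cell i
    # Each cell loses its right-face flux and gains its left-face flux,
    # so the domain total is unchanged up to roundoff.
    return [u[i] - (flux[(i + 1) % n] - flux[i]) for i in range(n)]

u = [0.0, 1.0, 2.0, 1.0, 0.0, 0.0]
total0 = sum(u)
for _ in range(50):
    u = upwind_step(u, 0.5)
total1 = sum(u)
```

    The same telescoping argument is what makes fluxing the interface, momentum, and scalars with one discretization both conservative and mutually consistent.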

  6. Agent-based modeling of complex infrastructures

    SciTech Connect

    North, M. J.

    2001-06-01

    Complex Adaptive Systems (CAS) can be applied to investigate complex infrastructures and infrastructure interdependencies. The CAS model agents within the Spot Market Agent Research Tool (SMART) and Flexible Agent Simulation Toolkit (FAST) allow investigation of the electric power infrastructure, the natural gas infrastructure and their interdependencies.

  7. A configurable distributed high-performance computing framework for satellite's TDI-CCD imaging simulation

    NASA Astrophysics Data System (ADS)

    Xue, Bo; Mao, Bingjing; Chen, Xiaomei; Ni, Guoqiang

    2010-11-01

    This paper presents a configurable distributed high-performance computing (HPC) framework for TDI-CCD imaging simulation. It uses the strategy pattern to accommodate multiple algorithms, helping to decrease simulation time at low expense. Imaging simulation for a TDI-CCD mounted on a satellite comprises four processes: 1) degradation due to the atmosphere, 2) degradation due to the optical system, 3) degradation due to the electronic system of the TDI-CCD, together with re-sampling, and 4) data integration. Processes 1) to 3) use a variety of data-intensive algorithms such as FFT, convolution, and Lagrange interpolation, which require powerful CPUs. Even with an Intel Xeon X5550 processor, the regular serial method takes more than 30 hours for a simulation whose result image size is 1500 x 1462. A literature study found no mature distributed HPC framework in this field. Here we developed a distributed computing framework for TDI-CCD imaging simulation, based on WCF [1], which uses a client/server (C/S) architecture and invokes free CPU resources in the LAN: the server pushes the tasks of processes 1) to 3) to that free computing capacity, yielding HPC at low cost. In a computing experiment with 4 symmetric nodes and 1 server, the framework reduced simulation time by about 74%; adding more asymmetric nodes to the computing network decreased the time accordingly. In conclusion, this framework can provide virtually unlimited computation capacity, provided the network and task-management server can support it, and offers a new HPC solution for TDI-CCD imaging simulation and similar applications.
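
    The strategy pattern mentioned above lets each degradation stage be swapped (or disabled) without touching the pipeline; in the real framework the stages are what get pushed to idle LAN nodes. The class names and toy "degradations" below are invented to illustrate the pattern only:

```python
# Strategy pattern for a staged imaging-degradation pipeline.
class Stage:
    def apply(self, image):
        raise NotImplementedError

class Identity(Stage):
    # e.g. a stage disabled to trade accuracy for speed
    def apply(self, image):
        return image

class ScaleDegrade(Stage):
    # toy stand-in for an atmosphere/optics/electronics degradation
    def __init__(self, factor):
        self.factor = factor
    def apply(self, image):
        return [self.factor * v for v in image]

class Pipeline:
    def __init__(self, stages):
        self.stages = stages   # swappable strategies
    def run(self, image):
        for s in self.stages:
            image = s.apply(image)
        return image

out = Pipeline([Identity(), ScaleDegrade(0.5)]).run([2.0, 4.0])
# out == [1.0, 2.0]
```

    Because each `Stage` is self-contained, a dispatcher can serialize a stage plus its input tile and run it on whichever node is free, which is the distribution idea the framework builds on WCF.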

  8. Creating a Software Framework for Simulating Satellite Geolocation

    SciTech Connect

    Koch, Daniel B

    2011-01-01

    It is hard to imagine life these days without having some sort of electronic indication of one's current location. Whether the purpose is for business, personal, or emergency use, utilizing smart cell phones, in-vehicle navigation systems, or location beacons, dependence on the Global Positioning System (GPS) is pervasive. Yet the availability of the GPS should not be taken for granted. Both environmental (e.g., terrain, weather) and intentional interference (i.e., jamming) can reduce or deny satellite access. In order to investigate these and other issues, as well as to explore possible alternative satellite constellations, an application called the Satellite Simulation Toolkit (SatSim) was created. This paper presents a high-level overview of SatSim and an example of how it may be used to study geolocation.

  9. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control in standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies, and dispatching routines, and to their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940
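
    The separation the framework argues for, entities and events on one side, an explicit swappable control/dispatching policy on the other, can be sketched in a minimal discrete-event loop. The single-server model and names below are a hypothetical illustration, not the framework's notation:

```python
import heapq

# Minimal DES skeleton: the event queue and entities are fixed, while the
# dispatching decision is delegated to a pluggable control policy.
def run(jobs, policy, service_time=2.0):
    events, queue, done = [], [], []
    for t, name in jobs:
        heapq.heappush(events, (t, "arrive", name))
    busy = False
    while events:
        clock, kind, name = heapq.heappop(events)
        if kind == "arrive":
            queue.append(name)
        else:                        # "finish"
            done.append(name)
            busy = False
        if queue and not busy:
            nxt = policy(queue)      # control policy decides what runs next
            queue.remove(nxt)
            busy = True
            heapq.heappush(events, (clock + service_time, "finish", nxt))
    return done

fifo = run([(0.0, "a"), (0.5, "b"), (1.0, "c")], policy=lambda q: q[0])
lifo = run([(0.0, "a"), (0.5, "b"), (1.0, "c")], policy=lambda q: q[-1])
```

    Swapping the `policy` callable changes system behavior without touching the entities or the event machinery, which is exactly the kind of separation a hierarchical-control conceptual model is meant to make explicit.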

  10. Using a scalable modeling and simulation framework to evaluate the benefits of intelligent transportation systems.

    SciTech Connect

    Ewing, T.; Tentner, A.

    2000-03-21

    A scalable, distributed modeling and simulation framework has been developed at Argonne National Laboratory to study Intelligent Transportation Systems. The framework can run on a single-processor workstation or run distributed on a multiprocessor computer or network of workstations. The framework is modular and supports plug-in models, hardware, and live data sources. The initial set of models currently includes road network and traffic flow, probe and smart vehicles, traffic management centers, communications between vehicles and centers, in-vehicle navigation systems, and roadway traffic advisories. The modeling and simulation capability has been used to examine proposed ITS concepts. Results are presented from modeling scenarios from the Advanced Driver and Vehicle Advisory Navigation Concept (ADVANCE) experimental program to demonstrate how the framework can be used to evaluate the benefits of ITS and to plan future ITS operational tests and deployment initiatives.

  11. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed when optimizing via stochastic simulation models: the optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for the optimization of terminating discrete-event simulation models. The methodology combines a chance-constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
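
    The chance-constraint idea can be sketched directly: estimate, from independent replications, the probability that a stochastic performance measure meets a limit, and pick the smallest resource level whose estimated probability clears a threshold. The toy "simulation" and all numbers below are invented for illustration:

```python
import random

# Toy terminating simulation: turnaround time shrinks with more resources.
def replicate(resources, rng):
    return 10.0 / resources + rng.gauss(0.0, 0.5)

# Chance constraint: P(turnaround <= limit) >= p, estimated by replication.
def meets_chance_constraint(resources, limit=3.0, p=0.95, reps=2000, seed=7):
    rng = random.Random(seed)
    hits = sum(replicate(resources, rng) <= limit for _ in range(reps))
    return hits / reps >= p

# Smallest resource level satisfying the chance constraint.
level = next(r for r in range(1, 20) if meets_chance_constraint(r))
```

    Replacing the toy `replicate` with a real terminating simulation (and adding confidence intervals on the estimated probability) recovers the framework's formulation: minimize resources subject to a probabilistic performance constraint.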

  12. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 1

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2010-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures that are required to enable higher-mass robotic and human missions to Mars. The findings of the assessment are contained in this report.

  13. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis, Phase 2 Results

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2011-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL-Systems Analysis (SA) team that is conducting studies of the technologies and architectures that are required to enable human and higher mass robotic missions to Mars. The findings, observations, and recommendations from the NESC are provided in this report.

  14. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2010-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures that are required to enable higher-mass robotic and human missions to Mars. The appendices to the original report are contained in this document.

  15. An Evaluation of the High Level Architecture (HLA) as a Framework for NASA Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reid, Michael R.; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The High Level Architecture (HLA) is a current US Department of Defense and an industry (IEEE-1516) standard architecture for modeling and simulations. It provides a framework and set of functional rules and common interfaces for integrating separate and disparate simulators into a larger simulation. The goal of the HLA is to reduce software costs by facilitating the reuse of simulation components and by providing a runtime infrastructure to manage the simulations. In order to evaluate the applicability of the HLA as a technology for NASA space mission simulations, a Simulations Group at Goddard Space Flight Center (GSFC) conducted a study of the HLA and developed a simple prototype HLA-compliant space mission simulator. This paper summarizes the prototyping effort and discusses the potential usefulness of the HLA in the design and planning of future NASA space missions with a focus on risk mitigation and cost reduction.

  16. NISAC Agent Based Laboratory for Economics

    2006-10-11

    The software provides large-scale microeconomic simulation of complex economic and social systems (such as supply chain and market dynamics of businesses in the US economy) and their dependence on physical infrastructure systems. The system is based on agent simulation, where each entity of interest in the system to be modeled (for example, a bank, individual firms, or consumer households) is specified in a data-driven sense to be individually represented by an agent. The agents interact using rules of interaction appropriate to their roles, and through those interactions complex economic and social dynamics emerge. The software is implemented in three tiers: a Java-based visualization client, a C++ control mid-tier, and a C++ computational tier.

  17. NISAC Agent Based Laboratory for Economics

    SciTech Connect

    Downes, Paula; Davis, Chris; Eidson, Eric; Ehlen, Mark; Gieseler, Charles; Harris, Richard

    2006-10-11

    The software provides large-scale microeconomic simulation of complex economic and social systems (such as supply chain and market dynamics of businesses in the US economy) and their dependence on physical infrastructure systems. The system is based on agent simulation, where each entity of interest in the system to be modeled (for example, a bank, individual firms, or consumer households) is specified in a data-driven sense to be individually represented by an agent. The agents interact using rules of interaction appropriate to their roles, and through those interactions complex economic and social dynamics emerge. The software is implemented in three tiers: a Java-based visualization client, a C++ control mid-tier, and a C++ computational tier.

  18. SIMPEG: An open source framework for simulation and gradient based parameter estimation in geophysical applications

    NASA Astrophysics Data System (ADS)

    Cockett, Rowan; Kang, Seogi; Heagy, Lindsey J.; Pidlisecky, Adam; Oldenburg, Douglas W.

    2015-12-01

    Inverse modeling is a powerful tool for extracting information about the subsurface from geophysical data. Geophysical inverse problems are inherently multidisciplinary, requiring elements from the relevant physics, numerical simulation, and optimization, as well as knowledge of the geologic setting, and a comprehension of the interplay between all of these elements. The development and advancement of inversion methodologies can be enabled by a framework that supports experimentation, is flexible and extensible, and allows the knowledge generated to be captured and shared. The goal of this paper is to propose a framework that supports many different types of geophysical forward simulations and deterministic inverse problems. Additionally, we provide an open source implementation of this framework in Python called SIMPEG (Simulation and Parameter Estimation in Geophysics,
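
    A generic sketch of the kind of deterministic, gradient-based inversion such a framework organizes: minimize a data misfit plus Tikhonov regularization, phi(m) = ||G m - d||^2 + beta ||m||^2, by gradient descent on a tiny linear forward operator. This is a hypothetical illustration of the mathematics only, not the SIMPEG API:

```python
# Tikhonov-regularized least-squares inversion by gradient descent.
def invert(G, d, beta=0.1, lr=0.05, iters=2000):
    n = len(G[0])
    m = [0.0] * n                       # reference model m_ref = 0
    for _ in range(iters):
        # residual r = G m - d
        r = [sum(G[i][j] * m[j] for j in range(n)) - d[i]
             for i in range(len(d))]
        # gradient of phi: 2 G^T r + 2 beta m
        g = [2 * sum(G[i][j] * r[i] for i in range(len(d))) + 2 * beta * m[j]
             for j in range(n)]
        m = [m[j] - lr * g[j] for j in range(n)]
    return m

G = [[1.0, 1.0], [1.0, -1.0]]   # toy forward operator
d = [3.0, 1.0]                  # unregularized solution would be m = (2, 1)
m = invert(G, d)
```

    The converged model solves (G^T G + beta I) m = G^T d, so with beta = 0.1 it lands at (4/2.1, 2/2.1), slightly shrunk from (2, 1): the regularization trading data fit for model simplicity that the framework makes explicit.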

  19. An Implicit Solution Framework for Reactor Fuel Performance Simulation

    SciTech Connect

    Glen Hansen; Chris Newman; Derek Gaston; Cody Permann

    2009-08-01

    The simulation of nuclear reactor fuel performance involves complex thermomechanical processes between fuel pellets, made of fissile material, and the protective cladding that surrounds the pellets. An important design goal for a fuel is to maximize the life of the cladding, thereby allowing the fuel to remain in the reactor longer and achieve higher degrees of burnup. This presentation describes an initial approach for modeling the thermomechanical response of reactor fuel, and details the solution method employed within INL's fuel performance code, BISON. The code employs advanced methods for solving coupled partial differential equation systems that describe multidimensional fuel thermomechanics, heat generation, and oxygen transport within the fuel. This discussion explores the effectiveness of a JFNK-based solution of a problem involving three-dimensional, fully coupled, nonlinear transient heat conduction that includes pellet displacement and oxygen diffusion effects. These equations are closed using empirical data that are functions of temperature, density, and oxygen hyperstoichiometry. The method appears quite effective for the fuel pellet/cladding configurations examined, with excellent nonlinear convergence properties exhibited on the combined system. In closing, fully coupled solutions of three-dimensional thermomechanics coupled with oxygen diffusion appear quite attractive using the JFNK approach described here, at least for configurations similar to those examined in this report.
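    The JFNK idea referenced above can be sketched briefly: Newton's method in which the linear solve for each step needs only Jacobian-vector products, approximated by finite differences of the residual. The sketch below is not BISON's implementation; the residual is a hypothetical 1D reaction-diffusion toy problem, and conjugate gradient stands in for the usual GMRES because the toy Jacobian happens to be symmetric positive definite.

```python
def jfnk_solve(F, u0, tol=1e-8, eps=1e-7, max_newton=30, cg_iters=50):
    """Jacobian-free Newton-Krylov sketch: each Newton linear system
    J d = -F(u) is solved by conjugate gradient, with J*v approximated
    by a finite difference of the residual (no Jacobian ever formed)."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    u = list(u0)
    for _ in range(max_newton):
        Fu = F(u)
        if dot(Fu, Fu) ** 0.5 < tol:
            break
        def Jv(v):
            nv = dot(v, v) ** 0.5
            if nv == 0.0:
                return [0.0] * len(v)
            h = eps / nv                      # perturbation of fixed size eps
            up = [ui + h * vi for ui, vi in zip(u, v)]
            return [(a - b) / h for a, b in zip(F(up), Fu)]
        d = [0.0] * len(u)                    # CG on J d = -F(u)
        r = [-f for f in Fu]
        p = list(r)
        rs = dot(r, r)
        for _ in range(cg_iters):
            if rs ** 0.5 < 1e-10:
                break
            Ap = Jv(p)
            pAp = dot(p, Ap)
            if pAp <= 0.0:                    # guard against FD noise
                break
            alpha = rs / pAp
            d = [di + alpha * pi for di, pi in zip(d, p)]
            r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
            rs_new = dot(r, r)
            p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
            rs = rs_new
        u = [ui + di for ui, di in zip(u, d)]
    return u

# Hypothetical toy residual: discrete -u'' + u^3 = 1 with zero boundaries
def residual(u):
    n = len(u)
    out = []
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        out.append((2 * u[i] - left - right) + u[i] ** 3 - 1.0)
    return out

u = jfnk_solve(residual, [0.0] * 8)
```

    The only access to the physics is through the residual function F, which is what makes the approach attractive for multiphysics systems like coupled thermomechanics and oxygen diffusion.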

  20. Bayesian uncertainty quantification and propagation in molecular dynamics simulations: A high performance computing framework

    NASA Astrophysics Data System (ADS)

    Angelikopoulos, Panagiotis; Papadimitriou, Costas; Koumoutsakos, Petros

    2012-10-01

    We present a Bayesian probabilistic framework for quantifying and propagating the uncertainties in the parameters of force fields employed in molecular dynamics (MD) simulations. We propose a highly parallel implementation of the transitional Markov chain Monte Carlo for populating the posterior probability distribution of the MD force-field parameters. Efficient scheduling algorithms are proposed to handle the MD model runs and to distribute the computations in clusters with heterogeneous architectures. Furthermore, adaptive surrogate models are proposed in order to reduce the computational cost associated with the large number of MD model runs. The effectiveness and computational efficiency of the proposed Bayesian framework is demonstrated in MD simulations of liquid and gaseous argon.
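    As a toy illustration of the Bayesian calibration idea (not the paper's transitional MCMC or its HPC scheduling), a plain random-walk Metropolis sampler can populate the posterior of a single hypothetical force-field parameter; the parameter name, data, and values below are invented for illustration.

```python
import random
import math

def metropolis(log_post, x0, n_samples, step, seed=0):
    """Random-walk Metropolis: a simpler stand-in for the transitional
    Markov chain Monte Carlo described in the abstract (1-D case)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_samples):
        xp = x + rng.gauss(0.0, step)         # propose a nearby parameter
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:  # accept with MH probability
            x, lp = xp, lpp
        samples.append(x)
    return samples

# Hypothetical setup: infer a force-field parameter 'eps' from noisy
# synthetic observations assumed to scatter around its true value.
true_eps, sigma = 0.24, 0.05
rng = random.Random(1)
obs = [true_eps + rng.gauss(0, sigma) for _ in range(40)]

def log_post(eps):                             # Gaussian likelihood, flat prior
    return -sum((o - eps) ** 2 for o in obs) / (2 * sigma ** 2)

chain = metropolis(log_post, x0=1.0, n_samples=5000, step=0.05)
est = sum(chain[1000:]) / len(chain[1000:])    # posterior mean after burn-in
```

    In the paper's setting each likelihood evaluation is a full MD run, which is what motivates the parallel scheduling and surrogate models.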

  1. Measure of Landscape Heterogeneity by Agent-Based Methodology

    NASA Astrophysics Data System (ADS)

    Wirth, E.; Szabó, Gy.; Czinkóczky, A.

    2016-06-01

    With the rapid increase of the world's population, efficient food production is one of the key factors of human survival. Since biodiversity and heterogeneity are the basis of sustainable agriculture, the authors tried to measure the heterogeneity of a chosen landscape. The EU farming and subsidizing policies (EEA, 2014) support landscape heterogeneity and diversity; nevertheless, exact measurements and calculations, apart from statistical parameters (standard deviation, mean), do not really exist. In the present paper the authors' goal is to find an objective, dynamic method that measures landscape heterogeneity. This is achieved with so-called agent-based modelling, where randomly dispatched dynamic scouts record the observed land cover parameters and sum up the features of a new type of land. During the simulation the agents collect a Monte Carlo integral as a diversity landscape potential, which can be considered the unit of the `greening' measure. As a final product of the ABM method, a landscape potential map is obtained that can serve as a tool for objective decision making to support agricultural diversity.
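    One possible reading of the scout-agent scheme can be sketched as follows: random-walking agents sample a land-cover grid, each walk is scored by the Shannon entropy of the cover types it observed, and the average over agents is a Monte Carlo estimate of a landscape potential. This is an illustrative interpretation of the abstract, not the authors' code; the grids, scoring, and parameters are invented.

```python
import random
import math
from collections import Counter

def landscape_potential(grid, n_agents=200, steps=50, seed=1):
    """Monte Carlo heterogeneity estimate: scout agents random-walk over
    a land-cover grid, and each walk is scored by the Shannon entropy
    of the cover classes it saw; the scores are averaged."""
    rng = random.Random(seed)
    rows, cols = len(grid), len(grid[0])
    total = 0.0
    for _ in range(n_agents):
        r, c = rng.randrange(rows), rng.randrange(cols)
        seen = Counter()
        for _ in range(steps):
            seen[grid[r][c]] += 1
            dr, dc = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
            r = min(max(r + dr, 0), rows - 1)   # clamp to grid edges
            c = min(max(c + dc, 0), cols - 1)
        n = sum(seen.values())
        total += -sum(k / n * math.log(k / n) for k in seen.values())
    return total / n_agents

uniform = [[0] * 10 for _ in range(10)]                        # monoculture
mixed = [[(i + j) % 4 for j in range(10)] for i in range(10)]  # 4 cover types
h_uniform = landscape_potential(uniform)
h_mixed = landscape_potential(mixed)
```

    A monoculture grid scores zero, while the striped four-class grid scores near log 4, so the measure behaves as a `greening' indicator in the spirit the abstract describes.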

  2. FERN – a Java framework for stochastic simulation and evaluation of reaction networks

    PubMed Central

    Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf

    2008-01-01

    Background Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either a) do not provide the most efficient simulation algorithms and are difficult to extend, b) cannot be easily integrated into other applications or c) do not allow the user to monitor and intervene in the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. Results In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real-time from within the Cytoscape or CellDesigner environment. Conclusion FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms both for exact and approximate stochastic simulation and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward way both as a stand
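    The classic exact algorithm behind frameworks like this, Gillespie's direct method, fits in a short sketch (Python here rather than FERN's Java; the decay reaction at the end is a hypothetical example, not one of FERN's bundled networks).

```python
import random

def gillespie(propensity_fns, stoichiometry, state, t_end, seed=0):
    """Gillespie direct-method SSA for a well-mixed system.
    propensity_fns: one callable state -> rate per reaction.
    stoichiometry: one state-change vector per reaction."""
    rng = random.Random(seed)
    t, state = 0.0, list(state)
    trajectory = [(0.0, tuple(state))]
    while t < t_end:
        rates = [f(state) for f in propensity_fns]
        total = sum(rates)
        if total == 0:                      # no reaction can fire
            break
        t += rng.expovariate(total)         # exponential time to next event
        r = rng.uniform(0, total)           # choose which reaction fires
        for rate, change in zip(rates, stoichiometry):
            r -= rate
            if r <= 0:
                for i, d in enumerate(change):
                    state[i] += d
                break
        trajectory.append((t, tuple(state)))
    return trajectory

# Hypothetical example: decay A -> 0 with rate constant k, 100 molecules
k = 0.5
traj = gillespie([lambda s: k * s[0]], [(-1,)], [100], t_end=50.0)
```

    FERN's observer system corresponds to hooks invoked at each event of such a loop, which is what makes real-time monitoring and intervention possible.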

  3. Lipid converter, a framework for lipid manipulations in molecular dynamics simulations.

    PubMed

    Larsson, Per; Kasson, Peter M

    2014-11-01

    Construction of lipid membrane and membrane protein systems for molecular dynamics simulations can be a challenging process. In addition, there are few available tools to extend existing studies by repeating simulations using other force fields and lipid compositions. To facilitate this, we introduce Lipid Converter, a modular Python framework for exchanging force fields and lipid composition in coordinate files obtained from simulations. Force fields and lipids are specified by simple text files, making it easy to introduce support for additional force fields and lipids. The converter produces simulation input files that can be used for structural relaxation of the new membranes.

  4. Lipid-converter, a framework for lipid manipulations in molecular dynamics simulations

    PubMed Central

    Larsson, Per; Kasson, Peter M.

    2014-01-01

    Construction of lipid membrane and membrane protein systems for molecular dynamics simulations can be a challenging process. In addition, there are few available tools to extend existing studies by repeating simulations using other force fields and lipid compositions. To facilitate this, we introduce lipidconverter, a modular Python framework for exchanging force fields and lipid composition in coordinate files obtained from simulations. Force fields and lipids are specified by simple text files, making it easy to introduce support for additional force fields and lipids. The converter produces simulation input files that can be used for structural relaxation of the new membranes. PMID:25081234

  5. Evaluating Water Demand Using Agent-Based Modeling

    NASA Astrophysics Data System (ADS)

    Lowry, T. S.

    2004-12-01

    The supply and demand of water resources are functions of complex, inter-related systems including hydrology, climate, demographics, economics, and policy. To assess the safety and sustainability of water resources, planners often rely on complex numerical models that relate some or all of these systems using mathematical abstractions. The accuracy of these models relies on how well the abstractions capture the true nature of the systems interactions. Typically, these abstractions are based on analyses of observations and/or experiments that account only for the statistical mean behavior of each system. This limits the approach in two important ways: 1) It cannot capture cross-system disruptive events, such as major drought, significant policy change, or terrorist attack, and 2) it cannot resolve sub-system level responses. To overcome these limitations, we are developing an agent-based water resources model that includes the systems of hydrology, climate, demographics, economics, and policy, to examine water demand during normal and extraordinary conditions. Agent-based modeling (ABM) develops functional relationships between systems by modeling the interaction between individuals (agents), who behave according to a probabilistic set of rules. ABM is a "bottom-up" modeling approach in that it defines macro-system behavior by modeling the micro-behavior of individual agents. While each agent's behavior is often simple and predictable, the aggregate behavior of all agents in each system can be complex, unpredictable, and different than behaviors observed in mean-behavior models. Furthermore, the ABM approach creates a virtual laboratory where the effects of policy changes and/or extraordinary events can be simulated. Our model, which is based on the demographics and hydrology of the Middle Rio Grande Basin in the state of New Mexico, includes agent groups of residential, agricultural, and industrial users. Each agent within each group determines its water usage
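    The "bottom-up" idea can be sketched minimally (invented agent rules and numbers, not the Middle Rio Grande model): each agent probabilistically decides whether to conserve in response to a price signal, and aggregate demand emerges from those micro-decisions rather than from a mean-behavior equation.

```python
import random

class WaterUser:
    """Hypothetical agent: a baseline daily demand plus a probabilistic
    conservation response when scarcity pushes the price up."""
    def __init__(self, base_demand, price_sensitivity, rng):
        self.base = base_demand
        self.sens = price_sensitivity
        self.rng = rng

    def demand(self, price):
        # Each agent independently decides whether to conserve this step.
        if self.rng.random() < min(1.0, self.sens * price):
            return 0.7 * self.base          # conserving: cut use by 30%
        return self.base

def total_demand(agents, price):
    return sum(a.demand(price) for a in agents)

rng = random.Random(42)
agents = (
    [WaterUser(1.0, 0.8, rng) for _ in range(500)]    # residential
    + [WaterUser(20.0, 0.3, rng) for _ in range(50)]  # agricultural
    + [WaterUser(10.0, 0.1, rng) for _ in range(20)]  # industrial
)
normal = total_demand(agents, price=0.1)
drought = total_demand(agents, price=1.0)
```

    Even with such simple individual rules, aggregate demand under a drought price differs from normal conditions in a way no single agent's rule states explicitly, which is the "virtual laboratory" property the abstract describes.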

  6. Agent Based Intelligence in a Tetrahedral Rover

    NASA Technical Reports Server (NTRS)

    Phelps, Peter; Truszkowski, Walt

    2007-01-01

    A tetrahedron is a 4-node, 6-strut pyramid structure which is being used by the NASA Goddard Space Flight Center as the basic building block for a new approach to robotic motion. The struts are extendable; the tetrahedron "moves" through a sequence of activities: strut extension, shifting the center of gravity, and falling. Currently, strut extension is handled by human remote control. There is an effort underway to make the movement of the tetrahedron autonomous, driven by an attempt to achieve a goal. The approach being taken is to associate an intelligent agent with each node. Thus, the autonomous tetrahedron is realized as a constrained multi-agent system, where the constraints arise from the fact that between any two agents there is an extendible strut. The hypothesis of this work is that, by proper composition of such automated tetrahedra, robotic structures of various levels of complexity can be developed which will support more complex dynamic motions. This is the basis of the new approach to robotic motion which is under investigation. A Java-based simulator for the single tetrahedron, realized as a constrained multi-agent system, has been developed and evaluated. This paper reports on this project and presents a discussion of the structure and dynamics of the simulator.

  7. AGENT-BASED MODELS IN EMPIRICAL SOCIAL RESEARCH*

    PubMed Central

    Bruch, Elizabeth; Atwell, Jon

    2014-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first discuss the motivations for using agent-based models in both basic science and policy-oriented social research. Next, we provide an overview of methods and strategies for incorporating data on behavior and populations into agent-based models, and review techniques for validating and testing the sensitivity of agent-based models. We close with suggested directions for future research. PMID:25983351

  8. Agent-Based Modeling of Growth Processes

    ERIC Educational Resources Information Center

    Abraham, Ralph

    2014-01-01

    Growth processes abound in nature, and are frequently the target of modeling exercises in the sciences. In this article we illustrate an agent-based approach to modeling, in the case of a single example from the social sciences: bullying.

  9. FNCS: A Framework for Power System and Communication Networks Co-Simulation

    SciTech Connect

    Ciraci, Selim; Daily, Jeffrey A.; Fuller, Jason C.; Fisher, Andrew R.; Marinovici, Laurentiu D.; Agarwal, Khushbu

    2014-04-13

    This paper describes the Fenix framework, which uses a federated approach for integrating power grid and communication network simulators. Compared to existing approaches, Fenix allows co-simulation of both transmission- and distribution-level power grid simulators with the communication network simulator. To reduce the performance overhead of time synchronization, Fenix utilizes optimistic synchronization strategies that make speculative decisions about when the simulators are going to exchange messages. GridLAB-D (a distribution simulator), PowerFlow (a transmission simulator), and ns-3 (a telecommunication simulator) are integrated with the framework and are used to illustrate the enhanced performance provided by speculative multi-threading on a smart grid application. Our speculative multi-threading approach achieved on average a 20% improvement over the existing synchronization methods.

  10. A Framework for Simulating Turbine-Based Combined-Cycle Inlet Mode-Transition

    NASA Technical Reports Server (NTRS)

    Le, Dzu K.; Vrnak, Daniel R.; Slater, John W.; Hessel, Emil O.

    2012-01-01

    A simulation framework based on the Memory-Mapped-Files technique was created to operate multiple numerical processes in locked time-steps and send I/O data synchronously to one another to simulate system dynamics. This simulation scheme is currently used to study the complex interactions between inlet flow dynamics, variable-geometry actuation mechanisms, and flow controls in the transition from supersonic to hypersonic conditions and vice versa. A study of mode-transition control for a high-speed inlet wind-tunnel model with this MMF-based framework is presented to illustrate the scheme and demonstrate its usefulness in simulating supersonic and hypersonic inlet dynamics and controls, as well as other types of complex systems.

  11. Smell Detection Agent Based Optimization Algorithm

    NASA Astrophysics Data System (ADS)

    Vinod Chandra, S. S.

    2016-09-01

    In this paper, a novel nature-inspired optimization algorithm is presented: the trained behaviour of dogs in detecting smell trails is adapted into computational agents for problem solving. The algorithm involves creating a surface with smell trails and iterating the agents to resolve a path, and it can be applied under different computational constraints that incorporate path-based problems. Implementation of the algorithm can be treated as a shortest path problem for a variety of datasets; the simulated agents have been used to evolve the shortest path between two nodes in a graph. The algorithm is useful for solving NP-hard problems related to path discovery, as well as many practical optimization problems.

  12. A Collective Case Study of Secondary Students' Model-Based Inquiry on Natural Selection through Programming in an Agent-Based Modeling Environment

    ERIC Educational Resources Information Center

    Xiang, Lin

    2011-01-01

    This is a collective case study seeking to develop detailed descriptions of how programming an agent-based simulation influences a group of 8th grade students' model-based inquiry (MBI) by examining students' agent-based programmable modeling (ABPM) processes and the learning outcomes. The context of the present study was a biology unit on…

  13. A Metascalable Computing Framework for Large Spatiotemporal-Scale Atomistic Simulations

    SciTech Connect

    Nomura, K; Seymour, R; Wang, W; Kalia, R; Nakano, A; Vashishta, P; Shimojo, F; Yang, L H

    2009-02-17

    A metascalable (or 'design once, scale on new architectures') parallel computing framework has been developed for large spatiotemporal-scale atomistic simulations of materials based on spatiotemporal data locality principles, which is expected to scale on emerging multipetaflops architectures. The framework consists of: (1) an embedded divide-and-conquer (EDC) algorithmic framework based on spatial locality to design linear-scaling algorithms for high complexity problems; (2) a space-time-ensemble parallel (STEP) approach based on temporal locality to predict long-time dynamics, while introducing multiple parallelization axes; and (3) a tunable hierarchical cellular decomposition (HCD) parallelization framework to map these O(N) algorithms onto a multicore cluster based on a hybrid implementation combining message passing and critical section-free multithreading. The EDC-STEP-HCD framework exposes maximal concurrency and data locality, thereby achieving: (1) inter-node parallel efficiency well over 0.95 for 218 billion-atom molecular-dynamics and 1.68 trillion electronic-degrees-of-freedom quantum-mechanical simulations on 212,992 IBM BlueGene/L processors (superscalability); (2) high intra-node, multithreading parallel efficiency (nanoscalability); and (3) nearly perfect time/ensemble parallel efficiency (eon-scalability). The spatiotemporal scale covered by MD simulation on a sustained petaflops computer per day (i.e. petaflops·day of computing) is estimated as NT = 2.14 (e.g. N = 2.14 million atoms for T = 1 microsecond).

  14. Simulating the Household Plug-in Hybrid Electric Vehicle Distribution and its Electric Distribution Network Impacts

    SciTech Connect

    Cui, Xiaohui; Kim, Hoe Kyoung; Liu, Cheng; Kao, Shih-Chieh; Bhaduri, Budhendra L

    2012-01-01

    This paper presents a multi agent-based simulation framework for modeling spatial distribution of plug-in hybrid electric vehicle ownership at local residential level, discovering plug-in hybrid electric vehicle hot zones where ownership may quickly increase in the near future, and estimating the impacts of the increasing plug-in hybrid electric vehicle ownership on the local electric distribution network with different charging strategies. We use Knox County, Tennessee as a case study to highlight the simulation results of the agent-based simulation framework.

  15. Framework of passive millimeter-wave scene simulation based on material classification

    NASA Astrophysics Data System (ADS)

    Park, Hyuk; Kim, Sung-Hyun; Lee, Ho-Jin; Kim, Yong-Hoon; Ki, Jae-Sug; Yoon, In-Bok; Lee, Jung-Min; Park, Soon-Jun

    2006-05-01

    Over the past few decades, passive millimeter-wave (PMMW) sensors have emerged as useful implements in transportation and military applications such as autonomous flight-landing systems, smart weapons, and night- and all-weather vision systems. As an efficient way to predict the performance of a PMMW sensor and apply it to a system, testing in a SoftWare-In-the-Loop (SWIL) simulator is required. The PMMW scene simulation is a key component in implementing this simulator. However, no commercial off-the-shelf tool is available to construct the PMMW scene simulation, and there have been only a few studies on this technology. We have studied the PMMW scene simulation method to develop the PMMW sensor SWIL simulator. This paper describes the framework of the PMMW scene simulation and tentative results. The purpose of the PMMW scene simulation is to generate sensor outputs (or images) from a visible image and environmental conditions. We organize it into four parts: material classification mapping, PMMW environmental setting, PMMW scene forming, and millimeter-wave (MMW) sensor works. The background and the objects in the scene are classified based on properties related to MMW radiation and reflectivity. The environmental setting part calculates the following PMMW phenomenology: atmospheric propagation and emission, including sky temperature, weather conditions, and physical temperature. Then PMMW raw images are formed with surface geometry. Finally, PMMW sensor outputs are generated from the raw images by applying sensor characteristics such as aperture size and noise level. Through the simulation process, PMMW phenomenology and sensor characteristics are simulated on the output scene. We have finished the design of the framework of the simulator and are working on the implementation in detail. As a tentative result, a flight observation was simulated under specific conditions. After completing the implementation details, we plan to increase the reliability of the simulation by collecting data

  16. Understanding Group/Party Affiliation Using Social Networks and Agent-Based Modeling

    NASA Technical Reports Server (NTRS)

    Campbell, Kenyth

    2012-01-01

    The dynamics of group affiliation and group dispersion is a concept most often studied so that political candidates can better understand the most efficient way to conduct their campaigns. While political campaigning in the United States is a very hot topic that most politicians analyze and study, the concept of group/party affiliation presents its own area of study that produces very interesting results. One tool for examining party affiliation on a large scale is agent-based modeling (ABM), a paradigm in the modeling and simulation (M&S) field perfectly suited for aggregating individual behaviors to observe large swaths of a population. For this study, agent-based modeling was used to look at a community of agents and determine what factors can affect the group/party affiliation patterns that are present. In the agent-based model used for this experiment many factors were present, but two main factors were used to determine the results. The results of this study show that it is possible to use agent-based modeling to explore group/party affiliation and construct a model that can mimic real-world events. More importantly, the model in the study allows the results found in a smaller community to be translated into larger experiments to determine if the results remain present on a much larger scale.

  17. Formalizing the Role of Agent-Based Modeling in Causal Inference and Epidemiology

    PubMed Central

    Marshall, Brandon D. L.; Galea, Sandro

    2015-01-01

    Calls for the adoption of complex systems approaches, including agent-based modeling, in the field of epidemiology have largely centered on the potential for such methods to examine complex disease etiologies, which are characterized by feedback behavior, interference, threshold dynamics, and multiple interacting causal effects. However, considerable theoretical and practical issues impede the capacity of agent-based methods to examine and evaluate causal effects and thus illuminate new areas for intervention. We build on this work by describing how agent-based models can be used to simulate counterfactual outcomes in the presence of complexity. We show that these models are of particular utility when the hypothesized causal mechanisms exhibit a high degree of interdependence between multiple causal effects and when interference (i.e., one person's exposure affects the outcome of others) is present and of intrinsic scientific interest. Although not without challenges, agent-based modeling (and complex systems methods broadly) represent a promising novel approach to identify and evaluate complex causal effects, and they are thus well suited to complement other modern epidemiologic methods of etiologic inquiry. PMID:25480821

  18. Agent-Based vs. Equation-based Epidemiological Models:A Model Selection Case Study

    SciTech Connect

    Sukumar, Sreenivas R; Nutaro, James J

    2012-01-01

    This paper is motivated by the need to design model validation strategies for epidemiological disease-spread models. We consider both agent-based and equation-based models of pandemic disease spread and study the nuances and complexities one has to consider from the perspective of model validation. For this purpose, we instantiate an equation-based model and an agent-based model of the 1918 Spanish flu, and we leverage data published in the literature for our case study. We present our observations from the perspective of each implementation and discuss the application of model-selection criteria to compare the risk in choosing one modeling paradigm over another. We conclude with a discussion of our experience and document future ideas for a model validation framework.
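    The two paradigms being compared can be illustrated side by side on a generic SIR epidemic (a sketch with invented parameters, not the paper's 1918-flu calibration): the equation-based model integrates population fractions, while the agent-based model tracks individuals, and for a well-mixed population the two produce similar attack rates.

```python
import random

def sir_ode(beta, gamma, s0, i0, days, dt=0.01):
    """Equation-based SIR: forward-Euler integration of fractions."""
    s, i, r = s0, i0, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt
        new_rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

def sir_abm(beta, gamma, n, i0, days, seed=3):
    """Agent-based SIR: each day, every infected agent contacts one
    random agent (infecting susceptibles with probability beta) and
    recovers with probability gamma."""
    rng = random.Random(seed)
    state = ['I'] * i0 + ['S'] * (n - i0)
    for _ in range(days):
        infected = [k for k, s in enumerate(state) if s == 'I']
        for _ in infected:
            contact = rng.randrange(n)
            if state[contact] == 'S' and rng.random() < beta:
                state[contact] = 'I'
        for k in infected:
            if rng.random() < gamma:
                state[k] = 'R'
    return state

# Invented parameters: R0 = beta/gamma = 4
s, i, r = sir_ode(beta=0.4, gamma=0.1, s0=0.99, i0=0.01, days=200)
state = sir_abm(beta=0.4, gamma=0.1, n=1000, i0=10, days=200)
abm_attack = state.count('R') / 1000
```

    The validation question the paper raises shows up even here: the ABM run is one stochastic realization, so comparing it to the deterministic curve already requires a choice of what "agreement" means.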

  19. A Semantic Web Service and Simulation Framework to Intelligent Distributed Manufacturing

    SciTech Connect

    Son, Young Jun; Kulvatunyou, Boonserm; Cho, Hyunbo; Feng, Shaw

    2005-11-01

    To cope with today's fluctuating markets, a virtual enterprise (VE) concept can be employed to achieve the cooperation among independently operating enterprises. The success of VE depends on reliable interoperation among trading partners. This paper proposes a framework based on semantic web of manufacturing and simulation services to enable business and engineering collaborations between VE partners, particularly a design house and manufacturing suppliers.

  20. Quantitative agent based model of user behavior in an Internet discussion forum.

    PubMed

    Sobkowicz, Pawel

    2013-01-01

    The paper presents an agent-based simulation of opinion evolution, based on nonlinear emotion/information/opinion (E/I/O) individual dynamics, applied to an actual Internet discussion forum. The goal is to reproduce the results of two-year-long observations and analyses of user communication behavior and of the expressed opinions and emotions via simulations using an agent-based model. The model allowed us to derive various characteristics of the forum, including the distribution of user activity and popularity (outdegree and indegree), the distribution of lengths of dialogs between participants, their political sympathies, and the emotional content and purpose of the comments. The parameters used in the model have intuitive meanings and can be translated into psychological observables.

  1. Agent Based Study of Surprise Attacks:. Roles of Surveillance, Prompt Reaction and Intelligence

    NASA Astrophysics Data System (ADS)

    Shanahan, Linda; Sen, Surajit

    Defending a confined territory from a surprise attack is seldom possible. We use molecular dynamics and statistical physics inspired agent-based simulations to explore the evolution and outcome of such attacks. The study suggests robust emergent behavior that emphasizes the importance of accurate surveillance, automated and powerful attack response, and building layout, and it sheds light on the role of communication restrictions in defending such territories.

  2. An interactive framework for developing simulation models of hospital accident and emergency services.

    PubMed

    Codrington-Virtue, Anthony; Whittlestone, Paul; Kelly, John; Chaussalet, Thierry

    2005-01-01

    Discrete-event simulation can be a valuable tool in modelling health care systems. This paper describes an interactive framework to model and simulate a hospital accident and emergency (A&E) department. An interactive spreadsheet (Excel) facilitated the user-friendly input of data such as patient pathways, arrival times, service times and resources into the discrete-event simulation package (SIMUL8). The framework was enhanced further by configuring SIMUL8 to visually show patient flow and activity on a schematic plan of an A&E. The patient flow and activity information included patient icons flowing along A&E corridors and pathways, processes undertaken in A&E work areas, and queue activity. One major benefit of showing patient flow and activity visually was that modellers and decision makers could gain a dynamic insight into the performance of the overall system and see changes over the model run cycle. Another key benefit of the interactive framework was the ability to quickly and easily change model parameters to trial, test and compare different scenarios.
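    The queueing core of such an A&E model can be sketched as a minimal discrete-event simulation (a stand-in for the SIMUL8 model; the arrival and service figures are invented): patients arrive at random and wait for the earliest-free treatment bay.

```python
import heapq
import random

def simulate_ae(n_patients, mean_interarrival, mean_service, n_bays, seed=0):
    """Minimal discrete-event A&E sketch: Poisson arrivals queue for one
    of n_bays treatment bays (FIFO); returns the mean waiting time."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)  # exponential gaps
        arrivals.append(t)
    bays = [0.0] * n_bays            # heap of times each bay becomes free
    heapq.heapify(bays)
    total_wait = 0.0
    for arrive in arrivals:
        free_at = heapq.heappop(bays)          # earliest-free bay
        start = max(arrive, free_at)           # wait if all bays busy
        total_wait += start - arrive
        heapq.heappush(bays, start + rng.expovariate(1.0 / mean_service))
    return total_wait / n_patients

# Scenario comparison: does an extra bay cut waits? (invented figures:
# a patient every 10 min on average, 27 min mean treatment time)
busy = simulate_ae(5000, mean_interarrival=10, mean_service=27, n_bays=3)
extra = simulate_ae(5000, mean_interarrival=10, mean_service=27, n_bays=4)
```

    Trialling scenarios by re-running with changed parameters, as in the last two lines, is exactly the "trial, test and compare" use the framework is designed for.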

  3. An open software framework for advancement of x-ray optics simulation and modeling

    NASA Astrophysics Data System (ADS)

    Bruhwiler, David L.; Chubar, Oleg; Nagler, Robert; Krzywinski, Jacek; Boehnlein, Amber

    2014-09-01

    Accurate physical-optics based simulation of emission, transport and use in experiments of fully- and partially-coherent X-ray radiation is essential for both designers and users of experiments at state-of-the-art light sources: low-emittance storage rings, energy-recovery linacs and free-electron lasers. To be useful for different applications, the simulations must include accurate physical models for the processes of emission, for the structures of X-ray optical elements, interaction of the radiation with samples, and propagation of scattered X-rays to a detector. Based on the "Synchrotron Radiation Workshop" (SRW) open source computer code, we are developing a simulation framework, including a graphical user interface, web interface for client-server simulations, data format for wave-optics based representation of partially-coherent X-ray radiation, and a dictionary for universal description of optical elements. Also, we are evaluating formats for sample and experimental data representation for different types of experiments and processing. The simulation framework will facilitate start-to-end simulations by different computer codes complementary to SRW, for example GENESIS and FAST codes for simulating self-amplified spontaneous emission, SHADOW and McXtrace geometrical ray-tracing codes, as well as codes for simulation of interaction of radiation with matter and data processing in experiments exploiting coherence of radiation. The development of the new framework is building on components developed for the Python-based RadTrack software, which is designed for loose coupling of multiple electron and radiation codes to enable sophisticated workflows. We are exploring opportunities for collaboration with teams pursuing similar developments at European Synchrotron Radiation Facility and the European XFEL.

  4. Users' Perception of Medical Simulation Training: A Framework for Adopting Simulator Technology

    ERIC Educational Resources Information Center

    Green, Leili Hayati

    2014-01-01

    Users play a key role in many training strategies, yet some organizations often fail to understand the users' perception after a simulation training implementation, their attitude about acceptance or rejection of and integration of emerging simulation technology in medical training (Gaba, 2007, and Topol, 2012). Several factors are considered to…

  5. Flexible simulation framework to couple processes in complex 3D models for subsurface utilization assessment

    NASA Astrophysics Data System (ADS)

    Kempka, Thomas; Nakaten, Benjamin; De Lucia, Marco; Nakaten, Natalie; Otto, Christopher; Pohl, Maik; Tillner, Elena; Kühn, Michael

    2016-04-01

    Utilization of the geological subsurface for production and storage of hydrocarbons, chemical energy and heat, as well as for waste disposal, requires the quantification and mitigation of environmental impacts as well as the improvement of georesource utilization in terms of efficiency and sustainability. The development of tools for coupled process simulations is essential to tackle these challenges, since reliable assessments are only feasible through integrative numerical computations. Coupled processes at reservoir to regional scale determine the behaviour of reservoirs, faults and caprocks, generally demanding that complex 3D geological models be considered, alongside available monitoring and experimental data, in coupled numerical simulations. We have been developing a flexible numerical simulation framework that provides efficient workflows for integrating the required data and software packages to carry out coupled process simulations considering, e.g., multiphase fluid flow, geomechanics, geochemistry and heat. Simulation results are stored in structured data formats to allow for integrated 3D visualization and result interpretation as well as data archiving and its provision to collaborators. The main benefits of using the flexible simulation framework are the integration of geological and grid data from any third-party software package as well as data export to generic 3D visualization tools and archiving formats. The coupling of the required process simulators in time and space is feasible, and different spatial dimensions can be integrated in the coupled simulations, e.g., 0D batch with 3D dynamic simulations. User interaction is established via high-level programming languages, while computational efficiency is achieved by using low-level programming languages. We present three case studies on the assessment of geological subsurface utilization based on different process coupling approaches and numerical simulations.

  6. Pain expressiveness and altruistic behavior: an exploration using agent-based modeling.

    PubMed

    de C Williams, Amanda C; Gallagher, Elizabeth; Fidalgo, Antonio R; Bentley, Peter J

    2016-03-01

    Predictions which invoke evolutionary mechanisms are hard to test. Agent-based modeling in artificial life offers a way to simulate behaviors and interactions in specific physical or social environments over many generations. The outcomes have implications for understanding adaptive value of behaviors in context. Pain-related behavior in animals is communicated to other animals that might protect or help, or might exploit or predate. An agent-based model simulated the effects of displaying or not displaying pain (expresser/nonexpresser strategies) when injured and of helping, ignoring, or exploiting another in pain (altruistic/nonaltruistic/selfish strategies). Agents modeled in MATLAB interacted at random while foraging (gaining energy); random injury interrupted foraging for a fixed time unless help from an altruistic agent, who paid an energy cost, speeded recovery. Environmental and social conditions also varied, and each model ran for 10,000 iterations. Findings were meaningful in that, in general, contingencies evident from experimental work with a variety of mammals, over a few interactions, were replicated in the agent-based model after selection pressure over many generations. More energy-demanding expression of pain reduced its frequency in successive generations, and increasing injury frequency resulted in fewer expressers and altruists. Allowing exploitation of injured agents decreased expression of pain to near zero, but altruists remained. Decreasing costs or increasing benefits of helping hardly changed its frequency, whereas increasing interaction rate between injured agents and helpers diminished the benefits to both. Agent-based modeling allows simulation of complex behaviors and environmental pressures over evolutionary time. PMID:26655734
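The core loop of such a model is easy to sketch. The toy Python version below is only an illustration: the strategies, parameter values, and pairing rules are invented placeholders, not those of the published MATLAB model. It shows how expresser and altruist strategies interact through foraging, random injury, and costly helping:

```python
import random

random.seed(1)

# Illustrative parameters -- not those of the published model.
FORAGE_GAIN = 1.0    # energy gained per uninjured foraging step
HELP_COST = 0.5      # energy an altruist pays to help
INJURY_P = 0.05      # per-step probability of random injury
RECOVERY = 10        # steps lost to injury without help
FAST_RECOVERY = 2    # steps lost when helped by an altruist

class Agent:
    def __init__(self, expresser, altruist):
        self.expresser = expresser   # displays pain when injured
        self.altruist = altruist     # helps injured expressers, at a cost
        self.energy = 0.0
        self.injured_for = 0         # remaining recovery steps

def step(agents):
    random.shuffle(agents)           # random pairwise interactions
    for a, b in zip(agents[::2], agents[1::2]):
        for me, other in ((a, b), (b, a)):
            if me.injured_for > 0:
                me.injured_for -= 1  # recovering: no foraging
                # help only reaches agents that *display* their injury
                if me.expresser and other.altruist and other.injured_for == 0:
                    other.energy -= HELP_COST
                    me.injured_for = min(me.injured_for, FAST_RECOVERY)
            else:
                me.energy += FORAGE_GAIN
                if random.random() < INJURY_P:
                    me.injured_for = RECOVERY

agents = [Agent(expresser=i % 2 == 0, altruist=i % 3 == 0) for i in range(40)]
for _ in range(1000):
    step(agents)

mean_energy = sum(a.energy for a in agents) / len(agents)
print(f"mean energy after 1000 steps: {mean_energy:.1f}")
```

In the evolutionary version described in the abstract, a selection layer (energy-proportional reproduction of strategies across generations) would sit on top of this interaction loop.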

  7. Pain expressiveness and altruistic behavior: an exploration using agent-based modeling

    PubMed Central

    de C Williams, Amanda C.; Gallagher, Elizabeth; Fidalgo, Antonio R.; Bentley, Peter J.

    2015-01-01

    Predictions which invoke evolutionary mechanisms are hard to test. Agent-based modeling in artificial life offers a way to simulate behaviors and interactions in specific physical or social environments over many generations. The outcomes have implications for understanding adaptive value of behaviors in context. Pain-related behavior in animals is communicated to other animals that might protect or help, or might exploit or predate. An agent-based model simulated the effects of displaying or not displaying pain (expresser/nonexpresser strategies) when injured and of helping, ignoring, or exploiting another in pain (altruistic/nonaltruistic/selfish strategies). Agents modeled in MATLAB interacted at random while foraging (gaining energy); random injury interrupted foraging for a fixed time unless help from an altruistic agent, who paid an energy cost, speeded recovery. Environmental and social conditions also varied, and each model ran for 10,000 iterations. Findings were meaningful in that, in general, contingencies evident from experimental work with a variety of mammals, over a few interactions, were replicated in the agent-based model after selection pressure over many generations. More energy-demanding expression of pain reduced its frequency in successive generations, and increasing injury frequency resulted in fewer expressers and altruists. Allowing exploitation of injured agents decreased expression of pain to near zero, but altruists remained. Decreasing costs or increasing benefits of helping hardly changed its frequency, whereas increasing interaction rate between injured agents and helpers diminished the benefits to both. Agent-based modeling allows simulation of complex behaviors and environmental pressures over evolutionary time. PMID:26655734

  8. Generic framework for mining cellular automata models on protein-folding simulations.

    PubMed

    Diaz, N; Tischer, I

    2016-01-01

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development process of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed using a methodology based on design patterns that improves the experience of developing new algorithms. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future use for the new tool is outlined. PMID:27323045

  9. Deterministic Agent-Based Path Optimization by Mimicking the Spreading of Ripples.

    PubMed

    Hu, Xiao-Bing; Wang, Ming; Leeson, Mark S; Di Paolo, Ezequiel A; Liu, Hao

    2016-01-01

    Inspirations from nature have contributed fundamentally to the development of evolutionary computation. Learning from the natural ripple-spreading phenomenon, this article proposes a novel ripple-spreading algorithm (RSA) for the path optimization problem (POP). In nature, a ripple spreads at a constant speed in all directions, and the node closest to the source is the first to be reached. This very simple principle forms the foundation of the proposed RSA. In contrast to most deterministic top-down centralized path optimization methods, such as Dijkstra's algorithm, the RSA is a bottom-up decentralized agent-based simulation model. Moreover, it is distinguished from other agent-based algorithms, such as genetic algorithms and ant colony optimization, by being a deterministic method that can always guarantee the global optimal solution with very good scalability. Here, the RSA is specifically applied to four different POPs. The comparative simulation results illustrate the advantages of the RSA in terms of effectiveness and efficiency. Thanks to the agent-based and deterministic features, the RSA opens new opportunities to attack some problems, such as calculating the exact complete Pareto front in multiobjective optimization and determining the kth shortest project time in project management, which are very difficult, if not impossible, for existing methods to resolve. The ripple-spreading optimization principle and the new distinguishing features and capacities of the RSA enrich the theoretical foundations of evolutionary computation.
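The ripple-spreading principle is simple enough to sketch directly: every node reached by a ripple starts its own ripple, all ripples grow at the same constant speed, and the first arrival at the destination traces a shortest path. The toy Python implementation below is a time-stepped illustration on an assumed connected graph, not the authors' RSA code, but it makes the deterministic, agent-like character of the method visible:

```python
def ripple_shortest_path(graph, source, target, dt=0.01):
    """Ripple-spreading search on a weighted graph (assumed connected).

    graph: {node: {neighbour: edge_length}}. Every reached node grows a
    ripple at unit speed; a neighbour is activated when the ripple radius
    covers the connecting edge. The first arrival at `target` therefore
    corresponds to the shortest path, as with Dijkstra's algorithm.
    """
    radius = {source: 0.0}           # active ripples: node -> current radius
    parent = {source: None}
    t = 0.0
    while target not in parent:
        t += dt
        for node in list(radius):    # snapshot: new ripples start next step
            radius[node] += dt
            for nbr, length in graph[node].items():
                if nbr not in parent and radius[node] >= length:
                    parent[nbr] = node
                    radius[nbr] = radius[node] - length  # carry overshoot
    path, n = [], target
    while n is not None:             # backtrack from target to source
        path.append(n)
        n = parent[n]
    return path[::-1], t

g = {"A": {"B": 1, "C": 4}, "B": {"A": 1, "C": 2, "D": 5},
     "C": {"A": 4, "B": 2, "D": 1}, "D": {"B": 5, "C": 1}}
path, t = ripple_shortest_path(g, "A", "D")
print(path, round(t, 2))
```

Because every ripple spreads at the same speed, the first ripple to reach each node necessarily travelled the shortest distance, which is why the method is deterministic and globally optimal rather than heuristic.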

  10. Integrated Modeling, Mapping, and Simulation (IMMS) Framework for Exercise and Response Planning

    NASA Technical Reports Server (NTRS)

    Mapar, Jalal; Hoette, Trisha; Mahrous, Karim; Pancerella, Carmen M.; Plantenga, Todd; Yang, Christine; Yang, Lynn; Hopmeier, Michael

    2011-01-01

    Emergency management personnel at federal, state, and local levels can benefit from the increased situational awareness and operational efficiency afforded by simulation and modeling for emergency preparedness, including planning, training and exercises. To support this goal, the Department of Homeland Security's Science & Technology Directorate is funding the Integrated Modeling, Mapping, and Simulation (IMMS) program to create an integrating framework that brings together diverse models for use by the emergency response community. SUMMIT, one piece of the IMMS program, is the initial software framework that connects users such as emergency planners and exercise developers with modeling resources, bridging the gap in expertise and technical skills between these two communities. SUMMIT was recently deployed to support exercise planning for National Level Exercise 2010. Threat, casualty, infrastructure, and medical surge models were combined within SUMMIT to estimate health care resource requirements for the exercise ground truth.

  11. Architectural considerations for agent-based national scale policy models : LDRD final report.

    SciTech Connect

    Backus, George A.; Strip, David R.

    2007-09-01

    The need to anticipate the consequences of policy decisions becomes ever more important as the magnitude of the potential consequences grows. The multiplicity of connections between the components of society and the economy makes intuitive assessments extremely unreliable. Agent-based modeling has the potential to be a powerful tool in modeling policy impacts. The direct mapping between agents and elements of society and the economy simplifies the translation of real-world functions into computational assessment. Our modeling initiative is motivated by the desire to facilitate informed public debate on alternative policies for how we, as a nation, provide healthcare to our population. We explore the implications of this motivation on the design and implementation of a model. We discuss the choice of an agent-based modeling approach and contrast it to micro-simulation and systems dynamics approaches.

  12. Comparing large-scale computational approaches to epidemic modeling: agent based versus structured metapopulation models

    NASA Astrophysics Data System (ADS)

    Gonçalves, Bruno; Ajelli, Marco; Balcan, Duygu; Colizza, Vittoria; Hu, Hao; Ramasco, José; Merler, Stefano; Vespignani, Alessandro

    2010-03-01

    We provide for the first time a side-by-side comparison of the results obtained with a stochastic agent-based model and a structured metapopulation stochastic model for the evolution of a baseline pandemic event in Italy. The agent-based model is based on the explicit representation of the Italian population through highly detailed data on the socio-demographic structure. The metapopulation simulations use the GLobal Epidemic and Mobility (GLEaM) model, based on high resolution census data worldwide, and integrating airline travel flow data with short range human mobility patterns at the global scale. Both models provide epidemic patterns that are in very good agreement at the granularity levels accessible by both approaches, with differences in peak timing of the order of a few days. The age breakdown analysis shows that similar attack rates are obtained for the younger age classes.
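At the opposite end of the complexity scale from both models, the agent-based idea itself can be illustrated with a toy stochastic SIR model. All parameters and the random-mixing scheme below are invented for illustration; the models compared in the paper use detailed socio-demographic and mobility data rather than uniform mixing:

```python
import random

random.seed(42)

# Toy parameters (illustrative only): R0 = BETA * CONTACTS / GAMMA = 3
N, BETA, GAMMA, CONTACTS = 1000, 0.03, 0.1, 10

# one state per agent; seed the epidemic with 5 infectious agents
state = ["I"] * 5 + ["S"] * (N - 5)

def day(state):
    new = state[:]                   # synchronous update
    for i, s in enumerate(state):
        if s == "I":
            # each infectious agent meets CONTACTS random agents per day
            for j in random.sample(range(N), CONTACTS):
                if state[j] == "S" and random.random() < BETA:
                    new[j] = "I"
            if random.random() < GAMMA:
                new[i] = "R"
    return new

peak = 0
for _ in range(200):
    state = day(state)
    peak = max(peak, state.count("I"))

attack_rate = (state.count("I") + state.count("R")) / N
print(f"peak prevalence: {peak}, final attack rate: {attack_rate:.2f}")
```

A structured metapopulation model such as GLEaM replaces the explicit per-agent loop with stochastic compartmental dynamics inside each subpopulation, coupled by mobility flows, which is what makes the two approaches comparable only at coarser granularity levels.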

  13. Microworlds, Simulators, and Simulation: Framework for a Benchmark of Human Reliability Data Sources

    SciTech Connect

    Ronald Boring; Dana Kelly; Carol Smidts; Ali Mosleh; Brian Dyre

    2012-06-01

    In this paper, we propose a method to improve the data basis of human reliability analysis (HRA) by extending the data sources used to inform HRA methods. Currently, most HRA methods are based on limited empirical data, and efforts to enhance the empirical basis behind HRA methods have not yet yielded significant new data. Part of the reason behind this shortage of quality data is attributable to the data sources used. Data have been derived from unrelated industries, from infrequent risk-significant events, or from costly control room simulator studies. We propose a benchmark of four data sources: a simplified microworld simulator using unskilled student operators, a full-scope control room simulator using skilled student operators, a full-scope control room simulator using licensed commercial operators, and a human performance modeling and simulation system using virtual operators. The goal of this research is to compare findings across the data sources to determine to what extent data may be used and generalized from cost effective sources.

  14. Analysis of GEANT4 Physics List Properties in the 12 GeV MOLLER Simulation Framework

    NASA Astrophysics Data System (ADS)

    Haufe, Christopher; Moller Collaboration

    2013-10-01

    To determine the validity of new physics beyond the scope of the electroweak theory, nuclear physicists across the globe have been collaborating on future endeavors that will provide the precision needed to confirm these speculations. One of these is the MOLLER experiment - a low-energy particle experiment that will utilize the 12 GeV upgrade of Jefferson Lab's CEBAF accelerator. The motivation of this experiment is to measure the parity-violating asymmetry of scattered polarized electrons off unpolarized electrons in a liquid hydrogen target. This measurement would allow for a more precise determination of the electron's weak charge and weak mixing angle. While still in its planning stages, the MOLLER experiment requires a detailed simulation framework in order to determine how the project should be run in the future. The simulation framework for MOLLER, called "remoll", is written in GEANT4 code. As a result, the simulation can utilize a number of GEANT4-coded physics lists that provide the simulation with particle interaction constraints based on different particle physics models. By comparing these lists with one another using the data-analysis application ROOT, the most optimal physics list for the MOLLER simulation can be determined and implemented. This material is based upon work supported by the National Science Foundation under Grant No. 714001.

  15. A 3D MPI-Parallel GPU-accelerated framework for simulating ocean wave energy converters

    NASA Astrophysics Data System (ADS)

    Pathak, Ashish; Raessi, Mehdi

    2015-11-01

    We present an MPI-parallel GPU-accelerated computational framework for studying the interaction between ocean waves and wave energy converters (WECs). The computational framework captures the viscous effects, nonlinear fluid-structure interaction (FSI), and breaking of waves around the structure, which cannot be captured in many potential flow solvers commonly used for WEC simulations. The full Navier-Stokes equations are solved using the two-step projection method, which is accelerated by porting the pressure Poisson equation to GPUs. The FSI is captured using the numerically stable fictitious domain method. A novel three-phase interface reconstruction algorithm is used to resolve three phases in a VOF-PLIC context. A consistent mass and momentum transport approach enables simulations at high density ratios. The accuracy of the overall framework is demonstrated via an array of test cases. Numerical simulations of the interaction between ocean waves and WECs are presented. Funding from the National Science Foundation CBET-1236462 grant is gratefully acknowledged.

  16. A Framework for Simulation of Aircraft Flyover Noise Through a Non-Standard Atmosphere

    NASA Technical Reports Server (NTRS)

    Arntzen, Michael; Rizzi, Stephen A.; Visser, Hendrikus G.; Simons, Dick G.

    2012-01-01

    This paper describes a new framework for the simulation of aircraft flyover noise through a non-standard atmosphere. Central to the framework is a ray-tracing algorithm which defines multiple curved propagation paths, if the atmosphere allows, between the moving source and listener. Because each path has a different emission angle, synthesis of the sound at the source must be performed independently for each path. The time delay, spreading loss and absorption (ground and atmosphere) are integrated along each path, and applied to each synthesized aircraft noise source to simulate a flyover. A final step assigns each resulting signal to its corresponding receiver angle for the simulation of a flyover in a virtual reality environment. Spectrograms of the results from a straight path and a curved path modeling assumption are shown. When the aircraft is at close range, the straight path results are valid. Differences appear especially when the source is relatively far away at shallow elevation angles. These differences, however, are not significant in common sound metrics. While the framework used in this work performs off-line processing, it is conducive to real-time implementation.

  17. Ximpol: a new X-ray polarimetry observation-simulation and analysis framework

    NASA Astrophysics Data System (ADS)

    Baldini, Luca; Muleri, Fabio; Soffitta, Paolo; Omodei, Nicola; Pesce-Rollins, Melissa; Sgro, Carmelo; Latronico, Luca; Spada, Francesca; Manfreda, Alberto; Di Lalla, Niccolo

    2016-07-01

    We present a new simulation framework, ximpol, based on the Python programming language and the Scipy stack, specifically developed for X-ray polarimetric applications. ximpol is designed to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating observations of astronomical sources, but also for developing and testing end-to-end analysis chains. In this contribution we shall give an overview of the basic architecture of the software. Although in principle the framework is not tied to any specific mission or instrument design, we shall present a few physically interesting case studies in the context of the XIPE mission phase study.

  18. An Agent-Based Interface to Terrestrial Ecological Forecasting

    NASA Technical Reports Server (NTRS)

    Golden, Keith; Nemani, Ramakrishna; Pang, Wan-Lin; Votava, Petr; Etzioni, Oren

    2004-01-01

    This paper describes a flexible agent-based ecological forecasting system that combines multiple distributed data sources and models to provide near-real-time answers to questions about the state of the Earth system. We build on novel techniques in automated constraint-based planning and natural language interfaces to automatically generate data products based on descriptions of the desired data products.

  19. Agent-based services for B2B electronic commerce

    NASA Astrophysics Data System (ADS)

    Fong, Elizabeth; Ivezic, Nenad; Rhodes, Tom; Peng, Yun

    2000-12-01

    The potential of agent-based systems has not been realized yet, in part, because of the lack of understanding of how the agent technology supports industrial needs and emerging standards. The area of business-to-business electronic commerce (b2b e-commerce) is one of the most rapidly developing sectors of industry with huge impact on manufacturing practices. In this paper, we investigate the current state of agent technology and the feasibility of applying agent-based computing to b2b e-commerce in the circuit board manufacturing sector. We identify critical tasks and opportunities in the b2b e-commerce area where agent-based services can best be deployed. We describe an implemented agent-based prototype system to facilitate the bidding process for printed circuit board manufacturing and assembly. These activities are taking place within the Internet Commerce for Manufacturing (ICM) project, the NIST-sponsored project working with industry to create an environment where small manufacturers of mechanical and electronic components may participate competitively in virtual enterprises that manufacture printed circuit assemblies.

  20. SMART: A New Semi-distributed Hydrologic Modelling Framework for Soil Moisture and Runoff Simulations

    NASA Astrophysics Data System (ADS)

    Ajami, Hoori; Sharma, Ashish

    2016-04-01

    A new GIS-based semi-distributed hydrological modelling framework is developed based upon the delineation of contiguous and topologically connected Hydrologic Response Units (HRUs). The Soil Moisture and Runoff simulation Toolkit (SMART) performs topographic and geomorphic analysis of a catchment and delineates HRUs in each first order sub-basin. This HRU delineation approach maintains lateral flow dynamics in first order sub-basins and therefore it is suited for simulating runoff in upland catchments. Simulation elements in SMART are distributed cross sections or equivalent cross sections (ECS) in each first order sub-basin to represent hillslope hydrologic processes. Delineation of ECSs in SMART is performed by weighting the topographic and physiographic properties of the part or entire first-order sub-basin and has the advantage of reducing computational time/effort while maintaining reasonable accuracy in simulated hydrologic state and fluxes (e.g. soil moisture, evapotranspiration and runoff). The SMART workflow is written in MATLAB to automate the HRU and cross section delineations, model simulations across multiple cross sections, and post-processing of model outputs to visualize the results. The MATLAB Parallel Computing Toolbox is used for simultaneous simulations of cross sections, further reducing computational time. SMART workflow tasks are: 1) delineation of first order sub-basins of a catchment using a digital elevation model, 2) hillslope delineation, 3) landform delineation in every first order sub-basin based on topographic and geomorphic properties of a group of sub-basins or the entire catchment, 4) formulation of cross sections as well as equivalent cross sections in every first order sub-basin, and 5) deriving vegetation and soil parameters from spatially distributed land cover and soil information. The current version of SMART uses a 2-d distributed hydrological model based on the Richards' equation. However, any hydrologic model can be
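The "equivalent cross section" idea in step 4 amounts to collapsing several hillslope profiles into one weighted-average profile. A minimal sketch in pure Python, with sub-basin areas standing in for the topographic and physiographic weighting that SMART actually uses:

```python
def equivalent_cross_section(sections, weights):
    """Collapse several hillslope cross sections into one 'equivalent'
    profile by weighted averaging. This is a simplification of the ECS
    idea; SMART weights by topographic/physiographic properties.
    """
    total = sum(weights)
    n = len(sections[0])
    return [sum(w * s[i] for s, w in zip(sections, weights)) / total
            for i in range(n)]

# three hypothetical elevation profiles (m) sampled at the same stations
secs = [[100, 95, 90, 88], [120, 110, 96, 90], [110, 100, 92, 89]]
areas = [2.0, 1.0, 1.0]   # illustrative sub-basin areas used as weights
ecs = equivalent_cross_section(secs, areas)
print(ecs)
```

Simulating the single ECS instead of every individual cross section is where the framework trades a small loss of accuracy for a large reduction in computational effort.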

  1. A framework for simulating ultrasound imaging based on first order nonlinear pressure-velocity relations.

    PubMed

    Du, Yigang; Fan, Rui; Li, Yong; Chen, Siping; Jensen, Jørgen Arendt

    2016-07-01

    An ultrasound imaging framework modeled with the first order nonlinear pressure-velocity relations (NPVR) and implemented by a half-time staggered solution and pseudospectral method is presented in this paper. The framework is capable of simulating linear and nonlinear ultrasound propagation and reflections in a heterogeneous medium with different sound speeds and densities. It can be initialized with arbitrary focus, excitation and apodization for multiple individual channels in both 2D and 3D spatial fields. The simulated channel data can be generated using this framework, and an ultrasound image can be obtained by beamforming the simulated channel data. Various results simulated by different algorithms are illustrated for comparisons. The root mean square (RMS) errors for the compared pulses are calculated. The linear propagation is validated by an angular spectrum approach (ASA) with a RMS error of 3% at the focal point for a 2D field, and Field II with RMS errors of 0.8% and 1.5% at the electronic and the elevation focuses for 3D fields, respectively. The accuracy for the NPVR based nonlinear propagation is investigated by comparing with the Abersim simulation for pulsed fields and with the nonlinear ASA for monochromatic fields. The RMS errors of the nonlinear pulses calculated by the NPVR and Abersim are respectively 2.4%, 7.4%, 17.6% and 36.6%, corresponding to initial pressure amplitudes of 50 kPa, 200 kPa, 500 kPa and 1 MPa at the transducer. By increasing the sampling frequency for the strong nonlinearity, the RMS error for 1 MPa initial pressure amplitude is reduced from 36.6% to 27.3%. PMID:27107165

  2. A modelling framework to simulate foliar fungal epidemics using functional–structural plant models

    PubMed Central

    Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe

    2014-01-01

    Background and Aims Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional–structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Methods Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant–environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Key Results Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. Conclusions This study provides a framework for modelling a large number of pathosystems

  3. Toward an Agent-Based Model of Socially Optimal Water Rights Markets

    NASA Astrophysics Data System (ADS)

    Ehlen, M. A.

    2004-12-01

    There has been considerable interest lately in using public markets for buying and selling the rights to local water usage. Such water rights markets, if designed correctly, should be socially optimal, that is, should sell rights at prices that reflect the true value of water in the region, taking into account that water rights buyers and sellers represent a disparate group of private industry, public authorities, and private users, each having different water needs and different priority to local government. Good market design, however, is hard. As was experienced in California short-run electric power markets, a market design that on paper looks reasonable but in practice is poorly constructed can have devastating effects: firms can learn to manipulate prices by 'playing' both sides of the market, and sellers can under-provide so as to create exorbitant prices which buyers have no choice but to pay. Economic theory provides several frameworks for developing a good water rights market design; for example, the structure-conduct-performance paradigm (SCPP) suggests that, among other things, the number and types of buyers and sellers (structure), and transaction clearing rules and government policies (conduct) affect in very particular ways the prices and quantities (performance) in the market. In slow-moving or static markets, SCPP has been a useful predictor of market performance; in faster markets the market dynamics that endogenously develop over time are often too complex to predict with SCPP or other existing modeling techniques. New, more sophisticated combinations of modeling and simulation are needed. Toward developing a good (i.e., socially optimal) water rights market design that can take into account the dynamics inherent in the water sector, we are developing an agent-based model of water rights markets. The model serves two purposes: first, it provides an SCPP-based framework of water rights markets that takes into account the particular structure of
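The market-design questions raised here can be made concrete with even a minimal market model. The sketch below clears a uniform-price call auction for water rights among buyers and sellers with randomly drawn, purely hypothetical valuations and costs; it illustrates the kind of clearing rule an agent-based market model must encode, and is not the model under development:

```python
import random

random.seed(7)

# Hypothetical per-unit valuations and costs (e.g. $/acre-foot).
buyers = sorted((random.uniform(20, 100) for _ in range(50)), reverse=True)
sellers = sorted(random.uniform(10, 80) for _ in range(50))

def clear(buyers, sellers):
    """Uniform-price call auction: rights trade as long as the marginal
    buyer's willingness to pay covers the marginal seller's cost."""
    q = 0
    while q < min(len(buyers), len(sellers)) and buyers[q] >= sellers[q]:
        q += 1
    if q == 0:
        return 0, None
    price = (buyers[q - 1] + sellers[q - 1]) / 2  # split the marginal surplus
    return q, price

q, price = clear(buyers, sellers)
print(f"{q} rights traded at a clearing price of about {price:.1f}")
```

In an agent-based version, the bid and ask schedules would be generated by heterogeneous agents (industry, public authorities, private users) who adapt their strategies over repeated market rounds, which is exactly where price manipulation and under-provision can emerge.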

  4. DELPHES 3: a modular framework for fast simulation of a generic collider experiment

    NASA Astrophysics Data System (ADS)

    de Favereau, J.; Delaere, C.; Demin, P.; Giammanco, A.; Lemaître, V.; Mertens, A.; Selvaggi, M.

    2014-02-01

    The version 3.0 of the Delphes fast-simulation is presented. The goal of Delphes is to allow the simulation of a multipurpose detector for phenomenological studies. The simulation includes a track propagation system embedded in a magnetic field, electromagnetic and hadron calorimeters, and a muon identification system. Physics objects that can be used for data analysis are then reconstructed from the simulated detector response. These include tracks and calorimeter deposits and high level objects such as isolated electrons, jets, taus, and missing energy. The new modular approach allows for greater flexibility in the design of the simulation and reconstruction sequence. New features such as the particle-flow reconstruction approach, crucial in the first years of the LHC, and pile-up simulation and mitigation, which is needed for the simulation of the LHC detectors in the near future, have also been implemented. The Delphes framework is not meant to be used for advanced detector studies, for which more accurate tools are needed. Although some aspects of Delphes are hadron collider specific, it is flexible enough to be adapted to the needs of electron-positron collider experiments.

  5. An Object-Oriented Finite Element Framework for Multiphysics Phase Field Simulations

    SciTech Connect

    Michael R Tonks; Derek R Gaston; Paul C Millett; David Andrs; Paul Talbot

    2012-01-01

    The phase field approach is a powerful and popular method for modeling microstructure evolution. In this work, advanced numerical tools are used to create a phase field framework that facilitates rapid model development. This framework, called MARMOT, is based on Idaho National Laboratory's finite element Multiphysics Object-Oriented Simulation Environment. In MARMOT, the system of phase field partial differential equations (PDEs) is solved simultaneously with PDEs describing additional physics, such as solid mechanics and heat conduction, using the Jacobian-Free Newton Krylov method. An object-oriented architecture is created by taking advantage of commonalities in phase field models to facilitate development of new models with very little written code. In addition, MARMOT provides access to mesh and time step adaptivity, reducing the cost of performing simulations with large disparities in both spatial and temporal scales. In this work, phase separation simulations are used to show the numerical performance of MARMOT. Deformation-induced grain growth and void growth simulations are included to demonstrate the multiphysics capability.

  6. A framework of passive millimeter-wave imaging simulation for typical ground scenes

    NASA Astrophysics Data System (ADS)

    Yan, Luxin; Ge, Rui; Zhong, Sheng

    2009-10-01

    Passive millimeter-wave (PMMW) imaging offers advantages over visible and IR imaging in having better all-weather performance. However, PMMW imaging sensors remain state of the art, so it is often necessary to predict and evaluate the performance of a PMMW sensor under a variety of weather, terrain and sensor operational conditions; PMMW scene simulation is an efficient way to do so. This paper proposes a framework for PMMW simulation of ground scenes. Commercial scene modeling software, Multigen and Vega, is used to generate the multi-viewpoint and multi-scale description of natural ground scenes with visible images. The background and objects in the scene are classified based on perceptive color clusters and mapped with different materials. Further, the radiometric temperature images of the scene are calculated according to millimeter-wave phenomenology: atmospheric propagation and emission including sky temperature, weather conditions, and physical temperature. Finally, the simulated output PMMW images are generated by applying the sensor characteristics such as the aperture size, data sampling scheme and system noise. Tentative results show the simulation framework can provide a reasonable PMMW image of a scene with high fidelity.

  7. Using Uncertainty and Sensitivity Analyses in Socioecological Agent-Based Models to Improve Their Analytical Performance and Policy Relevance

    PubMed Central

    Ligmann-Zielinska, Arika; Kramer, Daniel B.; Spence Cheruvelil, Kendra; Soranno, Patricia A.

    2014-01-01

    Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as on the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, the input space is reduced to only those inputs that drive the variance of the initial ABM, resulting in a model with an output distribution similar to the initial model's. Second, we refine the value of the most influential input, producing a model that maintains the mean output of the initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system. PMID:25340764
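
    The variance decomposition underlying such a framework can be sketched with a first-order Sobol estimator. The version below uses the pick-freeze (Saltelli) scheme with plain Monte Carlo sampling in place of the paper's quasi-random sequences, applied to a toy linear model with known indices.

```python
import numpy as np

def first_order_sobol(model, n, d, rng):
    """First-order Sobol indices via the Saltelli pick-freeze scheme.
    Plain Monte Carlo stands in for the paper's quasi-random sampling."""
    A, B = rng.random((n, d)), rng.random((n, d))
    yA, yB = model(A), model(B)
    var = yA.var()
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]            # freeze input i from B, rest from A
        S[i] = np.mean(yB * (model(ABi) - yA)) / var
    return S

# Toy model y = 4*x1 + x2 with uniform inputs on [0, 1]:
# analytic first-order indices are S1 = 16/17 and S2 = 1/17.
model = lambda X: 4 * X[:, 0] + X[:, 1]
S = first_order_sobol(model, n=200_000, d=2, rng=np.random.default_rng(1))
```

    Dropping inputs with indices near zero is exactly the model-simplification step the second computational experiment performs.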

  8. gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data

    NASA Astrophysics Data System (ADS)

    Hummel, Jacob A.

    2016-11-01

    We present the first public release (v0.1) of the open-source gadget Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes gadget and gizmo using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics data sets.
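
    The kind of DataFrame-native analysis such a framework enables can be sketched as follows; the particle columns and operations are illustrative stand-ins, not gadfly's actual schema or API.

```python
import numpy as np
import pandas as pd

# A tiny stand-in for a GADGET-style particle table; the column names
# and layout below are illustrative, not gadfly's actual schema.
rng = np.random.default_rng(2)
n = 1000
particles = pd.DataFrame({
    "x": rng.random(n), "y": rng.random(n), "z": rng.random(n),
    "mass": np.full(n, 1e-4),
})

# Mass-weighted center of mass and a radial cut: the sort of analysis
# that becomes a one-liner once snapshot data live in a DataFrame.
com = particles[["x", "y", "z"]].mul(particles["mass"], axis=0).sum() / particles["mass"].sum()
r = np.sqrt(((particles[["x", "y", "z"]] - com) ** 2).sum(axis=1))
inner = particles[r < 0.25]
```

    The same selections and reductions apply unchanged to tables read from HDF5 snapshots, which is the workflow the package targets.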

  9. Zeolitic imidazolate framework-8 as a reverse osmosis membrane for water desalination: insight from molecular simulation.

    PubMed

    Hu, Zhongqiao; Chen, Yifei; Jiang, Jianwen

    2011-04-01

    A molecular simulation study is reported for water desalination through a zeolitic imidazolate framework-8 (ZIF-8) membrane. The simulation demonstrates that water desalination occurs under external pressure, while Na(+) and Cl(-) ions cannot transport across the membrane due to the sieving effect of the small apertures in ZIF-8. The flux of water permeating the membrane scales linearly with the external pressure, and exhibits an Arrhenius-type relation with temperature (activation energy of 24.4 kJ/mol). Compared with the bulk phase, water molecules in the ZIF-8 membrane are less hydrogen-bonded and the lifetime of hydrogen bonding is considerably longer, attributed to surface interactions and geometrical confinement. This simulation study suggests that ZIF-8 could potentially be used as a reverse osmosis membrane for water purification.
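
    The reported Arrhenius-type temperature dependence implies a simple flux ratio between any two temperatures; a quick check with the stated activation energy (the temperatures are chosen for illustration):

```python
import math

R = 8.314      # gas constant, J/(mol K)
EA = 24.4e3    # reported activation energy for water flux in ZIF-8, J/mol

def arrhenius_ratio(t1, t2, ea=EA):
    """Ratio J(t2)/J(t1) implied by an Arrhenius-type flux
    J ~ exp(-ea / (R * T))."""
    return math.exp(-ea / (R * t2)) / math.exp(-ea / (R * t1))

# Warming from 300 K to 350 K roughly quadruples the predicted flux.
speedup = arrhenius_ratio(300.0, 350.0)
```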

  10. A Two-Stage Multi-Agent Based Assessment Approach to Enhance Students' Learning Motivation through Negotiated Skills Assessment

    ERIC Educational Resources Information Center

    Chadli, Abdelhafid; Bendella, Fatima; Tranvouez, Erwan

    2015-01-01

    In this paper we present an Agent-based evaluation approach in a context of Multi-agent simulation learning systems. Our evaluation model is based on a two stage assessment approach: (1) a Distributed skill evaluation combining agents and fuzzy sets theory; and (2) a Negotiation based evaluation of students' performance during a training…

  11. Examining the Pathogenesis of Breast Cancer Using a Novel Agent-Based Model of Mammary Ductal Epithelium Dynamics

    PubMed Central

    Chapa, Joaquin; Bourgo, Ryan J.; Greene, Geoffrey L.; Kulkarni, Swati; An, Gary

    2013-01-01

    The study of the pathogenesis of breast cancer is challenged by the long time-course of the disease process and the multi-factorial nature of generating oncogenic insults. The characterization of the longitudinal pathogenesis of malignant transformation from baseline normal breast duct epithelial dynamics may provide vital insight into the cascading systems failure that leads to breast cancer. To this end, extensive information on the baseline behavior of normal mammary epithelium and breast cancer oncogenesis was integrated into a computational model termed the Ductal Epithelium Agent-Based Model (DEABM). The DEABM is composed of computational agents that behave according to rules established from published cellular and molecular mechanisms concerning breast duct epithelial dynamics and oncogenesis. The DEABM implements DNA damage and repair, cell division, genetic inheritance and simulates the local tissue environment with hormone excretion and receptor signaling. Unrepaired DNA damage impacts the integrity of the genome within individual cells, including a set of eight representative oncogenes and tumor suppressors previously implicated in breast cancer, with subsequent consequences on successive generations of cells. The DEABM reproduced cellular population dynamics seen during the menstrual cycle and pregnancy, and demonstrated the oncogenic effect of known genetic factors associated with breast cancer, namely TP53 and Myc, in simulations spanning ∼40 years of simulated time. Simulations comparing normal to BRCA1-mutant breast tissue demonstrated rates of invasive cancer development similar to published epidemiologic data with respect to both cumulative incidence over time and estrogen-receptor status. Investigation of the modeling of ERα-positive (ER+) tumorigenesis led to a novel hypothesis implicating the transcription factor and tumor suppressor RUNX3. These data suggest that the DEABM can serve as a potentially valuable framework to augment the

  12. A higher-order numerical framework for stochastic simulation of chemical reaction systems

    PubMed Central

    2012-01-01

    Background In this paper, we present a framework for improving the accuracy of fixed-step methods for Monte Carlo simulation of discrete stochastic chemical kinetics. Stochasticity is ubiquitous in many areas of cell biology, for example in gene regulation, biochemical cascades and cell-cell interaction. However most discrete stochastic simulation techniques are slow. We apply Richardson extrapolation to the moments of three fixed-step methods, the Euler, midpoint and θ-trapezoidal τ-leap methods, to demonstrate the power of stochastic extrapolation. The extrapolation framework can increase the order of convergence of any fixed-step discrete stochastic solver and is very easy to implement; the only condition for its use is knowledge of the appropriate terms of the global error expansion of the solver in terms of its stepsize. In practical terms, a higher-order method with a larger stepsize can achieve the same level of accuracy as a lower-order method with a smaller one, potentially reducing the computational time of the system. Results By obtaining a global error expansion for a general weak first-order method, we prove that extrapolation can increase the weak order of convergence for the moments of the Euler and the midpoint τ-leap methods, from one to two. This is supported by numerical simulations of several chemical systems of biological importance using the Euler, midpoint and θ-trapezoidal τ-leap methods. In almost all cases, extrapolation results in an improvement of accuracy. As in the case of ordinary and stochastic differential equations, extrapolation can be repeated to obtain even higher-order approximations. Conclusions Extrapolation is a general framework for increasing the order of accuracy of any fixed-step stochastic solver. This enables the simulation of complicated systems in less time, allowing for more realistic biochemical problems to be solved. PMID:23256696
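
    The extrapolation idea can be sketched on the simplest possible system, a linear decay reaction, where the exact mean is known. The Euler τ-leap mean is computed at stepsizes τ and τ/2 and combined as 2M(τ/2) − M(τ) to cancel the leading O(τ) bias; this is a minimal sketch with illustrative parameters, not the paper's full framework.

```python
import numpy as np

def euler_tau_leap_mean(x0, c, t_end, tau, n_paths, rng):
    """Mean of Euler tau-leap paths for the decay reaction X -> 0 with
    propensity c*X: each step fires Poisson(c*X*tau) reactions."""
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(int(round(t_end / tau))):
        fired = rng.poisson(np.maximum(c * x * tau, 0.0))
        x = np.maximum(x - fired, 0.0)
    return x.mean()

rng = np.random.default_rng(3)
x0, c, t_end, tau = 1000.0, 1.0, 1.0, 0.1
m_tau = euler_tau_leap_mean(x0, c, t_end, tau, 200_000, rng)
m_half = euler_tau_leap_mean(x0, c, t_end, tau / 2, 200_000, rng)
m_extrap = 2 * m_half - m_tau        # Richardson: cancels the O(tau) bias
exact = x0 * np.exp(-c * t_end)      # exact mean for linear decay
```

    The extrapolated mean is far closer to the exact value than either raw estimate, illustrating how a large stepsize plus extrapolation can match the accuracy of a much smaller stepsize.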

  13. An Agent-Based Model of Signal Transduction in Bacterial Chemotaxis

    PubMed Central

    Miller, Jameson; Parker, Miles; Bourret, Robert B.; Giddings, Morgan C.

    2010-01-01

    We report the application of agent-based modeling to examine the signal transduction network and receptor arrays for chemotaxis in Escherichia coli, which are responsible for regulating swimming behavior in response to environmental stimuli. Agent-based modeling is a stochastic and bottom-up approach, where individual components of the modeled system are explicitly represented, and bulk properties emerge from their movement and interactions. We present the Chemoscape model: a collection of agents representing both fixed membrane-embedded and mobile cytoplasmic proteins, each governed by a set of rules representing knowledge or hypotheses about their function. When the agents were placed in a simulated cellular space and then allowed to move and interact stochastically, the model exhibited many properties similar to the biological system including adaptation, high signal gain, and wide dynamic range. We found the agent based modeling approach to be both powerful and intuitive for testing hypotheses about biological properties such as self-assembly, the non-linear dynamics that occur through cooperative protein interactions, and non-uniform distributions of proteins in the cell. We applied the model to explore the role of receptor type, geometry and cooperativity in the signal gain and dynamic range of the chemotactic response to environmental stimuli. The model provided substantial qualitative evidence that the dynamic range of chemotactic response can be traced to both the heterogeneity of receptor types present, and the modulation of their cooperativity by their methylation state. PMID:20485527

  14. A generic framework to simulate realistic lung, liver and renal pathologies in CT imaging

    NASA Astrophysics Data System (ADS)

    Solomon, Justin; Samei, Ehsan

    2014-11-01

    Realistic three-dimensional (3D) mathematical models of subtle lesions are essential for many computed tomography (CT) studies focused on performance evaluation and optimization. In this paper, we develop a generic mathematical framework that describes the 3D size, shape, contrast, and contrast-profile characteristics of a lesion, as well as a method to create lesion models based on CT data of real lesions. Further, we implemented a technique to insert the lesion models into CT images in order to create hybrid CT datasets. This framework was used to create a library of realistic lesion models and corresponding hybrid CT images. The goodness of fit of the models was assessed using the coefficient of determination (R²) and the visual appearance of the hybrid images was assessed with an observer study using images of both real and simulated lesions and receiver operator characteristic (ROC) analysis. The average R² of the lesion models was 0.80, implying that the models provide a good fit to real lesion data. The area under the ROC curve was 0.55, implying that the observers could not readily distinguish between real and simulated lesions. Therefore, we conclude that the lesion-modeling framework presented in this paper can be used to create realistic lesion models and hybrid CT images. These models could be instrumental in performance evaluation and optimization of novel CT systems.
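
    The goodness-of-fit metric used to score the lesion models is the ordinary coefficient of determination; a minimal sketch on a made-up radial contrast profile:

```python
import numpy as np

def r_squared(observed, fitted):
    """Coefficient of determination R^2 = 1 - SS_res / SS_tot, the
    goodness-of-fit score reported for the lesion models."""
    observed = np.asarray(observed, dtype=float)
    fitted = np.asarray(fitted, dtype=float)
    ss_res = np.sum((observed - fitted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# A made-up radial contrast profile and a slightly imperfect model fit.
r = np.linspace(0.0, 1.0, 50)
profile = np.exp(-4.0 * r**2)                        # "measured" falloff
fit = np.exp(-4.0 * r**2) + 0.02 * np.sin(8.0 * r)   # model with small error
score = r_squared(profile, fit)
```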

  15. A Simulation Framework for Quantitative Validation of Artefact Correction in Diffusion MRI.

    PubMed

    Graham, Mark S; Drobnjak, Ivana; Zhang, Hui

    2015-01-01

    In this paper we demonstrate a simulation framework that enables the direct and quantitative comparison of post-processing methods for diffusion weighted magnetic resonance (DW-MR) images. DW-MR datasets are employed in a range of techniques that enable estimates of local microstructure and global connectivity in the brain. These techniques require full alignment of images across the dataset, but this is rarely the case. Artefacts such as eddy-current (EC) distortion and motion lead to misalignment between images, which compromise the quality of the microstructural measures obtained from them. Numerous methods and software packages exist to correct these artefacts, some of which have become de facto standards, but none have been subject to rigorous validation. The ultimate aim of these techniques is improved image alignment, yet in the literature this is assessed using either qualitative visual measures or quantitative surrogate metrics. Here we introduce a simulation framework that allows for the direct, quantitative assessment of techniques, enabling objective comparisons of existing and future methods. DW-MR datasets are generated using a process that is based on the physics of MRI acquisition, which allows for the salient features of the images and their artefacts to be reproduced. We demonstrate the application of this framework by testing one of the most commonly used methods for EC correction, registration of DWIs to b = 0, and reveal the systematic bias this introduces into corrected datasets.

  16. Evaluation of a performance appraisal framework for radiation therapists in planning and simulation

    SciTech Connect

    Becker, Jillian; Bridge, Pete; Brown, Elizabeth; Lusk, Ryan; Ferrari-Anderson, Janet

    2015-06-15

    Constantly evolving technology and techniques within radiation therapy require practitioners to maintain a continuous approach to professional development and training. Systems of performance appraisal and adoption of regular feedback mechanisms are vital to support this development yet frequently lack structure and rely on informal peer support. A Radiation Therapy Performance Appraisal Framework (RT-PAF) for radiation therapists in planning and simulation was developed to define expectations of practice and promote a supportive and objective culture of performance and skills appraisal. Evaluation of the framework was conducted via an anonymous online survey tool. Nine peer reviewers and fourteen recipients provided feedback on its effectiveness and the challenges and limitations of the approach. Findings from the evaluation were positive and suggested that both groups gained benefit from and expressed a strong interest in embedding the approach more routinely. Respondents identified common challenges related to the limited ability to implement suggested development strategies; this was strongly associated with time and rostering issues. This framework successfully defined expectations for practice and provided a fair and objective feedback process that focussed on skills development. It empowered staff to maintain their skills and reach their professional potential. Management support, particularly in regard to provision of protected time was highlighted as critical to the framework's ongoing success. The demonstrated benefits arising in terms of staff satisfaction and development highlight the importance of this commitment to the modern radiation therapy workforce.

  17. Evaluation of a performance appraisal framework for radiation therapists in planning and simulation

    PubMed Central

    Becker, Jillian; Bridge, Pete; Brown, Elizabeth; Lusk, Ryan; Ferrari-Anderson, Janet

    2015-01-01

    Introduction Constantly evolving technology and techniques within radiation therapy require practitioners to maintain a continuous approach to professional development and training. Systems of performance appraisal and adoption of regular feedback mechanisms are vital to support this development yet frequently lack structure and rely on informal peer support. Methods A Radiation Therapy Performance Appraisal Framework (RT-PAF) for radiation therapists in planning and simulation was developed to define expectations of practice and promote a supportive and objective culture of performance and skills appraisal. Evaluation of the framework was conducted via an anonymous online survey tool. Nine peer reviewers and fourteen recipients provided feedback on its effectiveness and the challenges and limitations of the approach. Results Findings from the evaluation were positive and suggested that both groups gained benefit from and expressed a strong interest in embedding the approach more routinely. Respondents identified common challenges related to the limited ability to implement suggested development strategies; this was strongly associated with time and rostering issues. Conclusions This framework successfully defined expectations for practice and provided a fair and objective feedback process that focussed on skills development. It empowered staff to maintain their skills and reach their professional potential. Management support, particularly in regard to provision of protected time was highlighted as critical to the framework's ongoing success. The demonstrated benefits arising in terms of staff satisfaction and development highlight the importance of this commitment to the modern radiation therapy workforce. PMID:26229676

  18. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations

    PubMed Central

    Hahne, Jan; Helias, Moritz; Kunkel, Susanne; Igarashi, Jun; Bolten, Matthias; Frommer, Andreas; Diesmann, Markus

    2015-01-01

    Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy in the presence of gap junctions, we present benchmarks for workstations, clusters, and supercomputers. Finally, we discuss limitations of the novel technology. PMID:26441628
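
    The waveform-relaxation idea can be sketched on a toy pair of linear neurons coupled by a gap-junction-like term: each neuron integrates over a communication interval against its partner's waveform from the previous sweep, so no within-interval communication is needed. This is a minimal Jacobi iteration with illustrative parameters, not NEST's implementation.

```python
import numpy as np

def relax_interval(v1_0, v2_0, g, dt, steps, sweeps):
    """Jacobi waveform relaxation over one communication interval.
    Each neuron integrates dV_i/dt = -V_i + g*(V_j - V_i) against the
    previous sweep's waveform of its partner, so no within-interval
    communication is required."""
    w1 = np.full(steps + 1, v1_0)   # initial guess: frozen waveforms
    w2 = np.full(steps + 1, v2_0)
    for _ in range(sweeps):
        n1, n2 = [v1_0], [v2_0]
        for k in range(steps):      # explicit Euler against old waveforms
            n1.append(n1[-1] + dt * (-n1[-1] + g * (w2[k] - n1[-1])))
            n2.append(n2[-1] + dt * (-n2[-1] + g * (w1[k] - n2[-1])))
        w1, w2 = np.array(n1), np.array(n2)
    return w1, w2

# A strongly coupled pair started 2 units apart converges toward the
# monolithic solution v1(t) = exp(-(1 + 2*g)*t) of the coupled system.
w1, w2 = relax_interval(1.0, -1.0, g=2.0, dt=0.001, steps=500, sweeps=8)
```

    After a handful of sweeps the iterated waveforms agree with the directly coupled solution, which is what lets instantaneous interactions coexist with delayed spike communication.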

  20. XML-based 3D model visualization and simulation framework for dynamic models

    NASA Astrophysics Data System (ADS)

    Kim, Taewoo; Fishwick, Paul A.

    2002-07-01

    Relatively recent advances in computer technology enable us to create three-dimensional (3D) dynamic models and simulate them within a 3D web environment. The use of such models is especially valuable when teaching simulation and the concepts behind dynamic models, since the models are made more accessible to the students. Students tend to enjoy a construction process in which they are able to employ their own cultural and aesthetic forms. The challenge is to create a language that allows for a grammar for modeling while simultaneously permitting arbitrary presentation styles. For further flexibility, we need an effective way to represent and simulate dynamic models that can be shared by modelers over the Internet. We present an Extensible Markup Language (XML)-based framework that guides a modeler in creating personalized 3D models, visualizing their dynamic behaviors, and simulating the created models. A model author uses XML files to represent the geometry and topology of a dynamic model. The Model Fusion Engine, written in Extensible Stylesheet Language Transformations (XSLT), expedites the modeling process by automating the creation of dynamic models from the user-defined XML files. Modelers can also link simulation programs with a created model to analyze its characteristics. The advantages of this system lie in teaching the modeling and simulation of dynamic models, and in visualizing dynamic model behaviors.
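
    A minimal, made-up example of an XML model description in the spirit of this approach, keeping presentation (geometry) separate from dynamics (topology); the element names are illustrative, not the paper's schema.

```python
import xml.etree.ElementTree as ET

# A minimal, made-up model file: geometry describes how the model is
# drawn, topology describes how its components are connected.
MODEL_XML = """
<model name="waterworks">
  <geometry>
    <node id="tank" shape="cylinder"/>
    <node id="valve" shape="cube"/>
  </geometry>
  <topology type="functional-block">
    <edge from="tank" to="valve"/>
  </topology>
</model>
"""

root = ET.fromstring(MODEL_XML)
nodes = [n.get("id") for n in root.find("geometry")]
edges = [(e.get("from"), e.get("to")) for e in root.find("topology")]
```

    Because presentation and dynamics are separate subtrees, a stylesheet can restyle the geometry without touching the simulated topology.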

  2. Exploring Tradeoffs in Demand-side and Supply-side Management of Urban Water Resources using Agent-based Modeling and Evolutionary Computation

    NASA Astrophysics Data System (ADS)

    Kanta, L.; Berglund, E. Z.

    2015-12-01

    Urban water supply systems may be managed through supply-side and demand-side strategies, which focus on water source expansion and demand reductions, respectively. Supply-side strategies bear infrastructure and energy costs, while demand-side strategies bear costs of implementation and inconvenience to consumers. To evaluate the performance of demand-side strategies, the participation and water use adaptations of consumers should be simulated. In this study, a Complex Adaptive Systems (CAS) framework is developed to simulate consumer agents that change their consumption to affect the withdrawal from the water supply system, which, in turn, influences operational policies and long-term resource planning. Agent-based models are encoded to represent consumers and a policy maker agent and are coupled with water resources system simulation models. The CAS framework is coupled with an evolutionary computation-based multi-objective methodology to explore tradeoffs in cost, inconvenience to consumers, and environmental impacts for both supply-side and demand-side strategies. Decisions are identified to specify storage levels in a reservoir that trigger (1) increases in the volume of water pumped through inter-basin transfers from an external reservoir and (2) drought stages, which restrict the volume of water that is allowed for residential outdoor uses. The proposed methodology is demonstrated for the Arlington, Texas, water supply system to identify non-dominated strategies for a historic drought decade. Results demonstrate that pumping costs associated with maximizing environmental reliability exceed pumping costs associated with minimizing restrictions on consumer water use.
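
    The drought-stage decision rules can be sketched as a simple mapping from reservoir storage to restriction levels; the trigger and cutback values below are illustrative, not the calibrated values identified for Arlington.

```python
def drought_stage(storage_fraction, triggers=(0.7, 0.5, 0.35)):
    """Map reservoir storage (as a fraction of capacity) to a drought
    stage; each trigger level crossed raises the stage by one."""
    return sum(storage_fraction < level for level in triggers)

def outdoor_demand(base_demand, stage, cutback=(0.0, 0.15, 0.35, 0.6)):
    """Restrict residential outdoor use by a stage-dependent fraction."""
    return base_demand * (1.0 - cutback[stage])
```

    In the full framework, the evolutionary search tunes exactly these trigger levels against cost, inconvenience, and environmental objectives.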

  3. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    NASA Astrophysics Data System (ADS)

    Hartwig, Zachary S.

    2016-04-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms.

  4. DDG4 A Simulation Framework based on the DD4hep Detector Description Toolkit

    NASA Astrophysics Data System (ADS)

    Frank, M.; Gaede, F.; Nikiforou, N.; Petric, M.; Sailer, A.

    2015-12-01

    The detector description is an essential component used to analyse and simulate data resulting from particle collisions in high energy physics experiments. Based on the DD4hep detector description toolkit, a flexible and data-driven simulation framework was designed using the Geant4 toolkit. We present this framework and describe the guiding requirements and the architectural design, which was strongly driven by ease of use. The goal was, given an existing detector description, to simulate the detector response to particle collisions in high energy physics experiments with minimal effort, without imposing restrictions that would prevent enhanced or improved behaviour. Starting from the ROOT-based geometry implementation used by DD4hep, an automatic conversion mechanism to Geant4 was developed. The physics response and the mechanism for inputting particle data from generators were highly formalized and can be instantiated on demand using known factory patterns. A palette of components to model the detector response is provided by default, but improved or more sophisticated components may easily be added using the factory pattern. Only the final configuration of the instantiated components has to be provided by end users, using either C++ or Python scripting or an XML-based description.

  5. The Application of Modeling and Simulation in Capacity Management within the ITIL Framework

    NASA Technical Reports Server (NTRS)

    Rahmani, Sonya; vonderHoff, Otto

    2010-01-01

    Tightly integrating modeling and simulation techniques into Information Technology Infrastructure Library (ITIL) practices can be one of the driving factors behind a successful and cost-effective capacity management effort for any Information Technology (IT) system. ITIL is a best-practices framework for managing IT infrastructure, development, and operations. Translating ITIL theory into operational reality can be a challenge. This paper aims to highlight how best to integrate modeling and simulation into an ITIL implementation. For cases where the project team initially has difficulty gaining consensus on investing in modeling and simulation resources, a clear definition of M&S implementation within the ITIL framework, specifically its role in supporting Capacity Management, is critical to gaining the support required to garner these resources. This implementation should also help to clearly define M&S support to the overall system mission. This paper will describe the development of an integrated modeling approach and how best to tie M&S to definitive goals for evaluating system capacity and performance requirements. Specifically, the paper will discuss best practices for implementing modeling and simulation within ITIL. These practices hinge on implementing integrated M&S methods that 1) encompass two or more predictive modeling techniques, 2) complement each other's respective strengths and weaknesses to support the validation of predicted results, and 3) are tied to the system's performance and workload monitoring efforts. The structuring of two forms of modeling, statistical and simulation, in the development of "As Is" and "To Be" efforts will be used to exemplify the integrated M&S methods. The paper will show how these methods can better support the project's overall capacity management efforts.

  6. The fractional volatility model: An agent-based interpretation

    NASA Astrophysics Data System (ADS)

    Vilela Mendes, R.

    2008-06-01

    Based on the criteria of mathematical simplicity and consistency with empirical market data, a model with volatility driven by fractional noise has been constructed which provides a fairly accurate mathematical parametrization of the data. Here, some features of the model are reviewed and extended to account for leverage effects. Using agent-based models, one tries to find which agent strategies and (or) properties of the financial institutions might be responsible for the features of the fractional volatility model.

  7. Agent-based models in robotized manufacturing cells designing

    NASA Astrophysics Data System (ADS)

    Sekala, A.; Gwiazda, A.; Foit, K.; Banas, W.; Hryniewicz, P.; Kost, G.

    2015-11-01

    The complexity of the components present in robotized manufacturing workcells means that, already at the design phase, it is necessary to develop models presenting various aspects of their structure and functioning. These models are simplified representations of real systems and allow one, among other things, to systematize knowledge about the designed manufacturing workcell. They also facilitate defining and analyzing the interrelationships between its particular components. This paper proposes an agent-based approach to designing robotized manufacturing cells.

  8. Managing simulation-based training: A framework for optimizing learning, cost, and time

    NASA Astrophysics Data System (ADS)

    Richmond, Noah Joseph

    This study provides a management framework for optimizing training programs for learning, cost, and time when using simulation-based training (SBT) and reality-based training (RBT) as resources. Simulation is shown to be an effective means for implementing activity substitution as a way to reduce risk. The risk profiles of 22 US Air Force vehicles are calculated, and the potential risk reduction is calculated under the assumption of perfect substitutability of RBT and SBT. Methods are subsequently developed to relax the assumption of perfect substitutability. The transfer effectiveness ratio (TER) concept is defined and modeled as a function of the quality of the simulator used and the requirements of the activity trained. The Navy F/A-18 is then analyzed in a case study illustrating how learning can be maximized subject to constraints on cost and time, and also subject to the decision maker's preferences for the proportional and absolute use of simulation. Solution methods for optimizing multiple activities across shared resources are next provided. Finally, a simulation strategy including an operations planning program (OPP), an implementation program (IP), an acquisition program (AP), and a pedagogical research program (PRP) is detailed. The study provides the theoretical tools to understand how to leverage SBT, a case study demonstrating these tools' efficacy, and a set of policy recommendations to enable the US military to better utilize SBT in the future.
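
The learning-maximization trade-off described above can be illustrated with a toy brute-force search. All numbers below (the TER value, costs, and budgets) are hypothetical and not taken from the study: the sketch chooses simulator hours s and flight hours r to maximize learning TER*s + r subject to cost and time budgets.

```python
# Hypothetical SBT/RBT trade-off: maximize learning subject to budgets.
TER = 0.6                    # learning per simulator hour vs. one flight hour
COST_S, COST_R = 1.0, 10.0   # cost per simulator / flight hour ($k)
BUDGET_COST, BUDGET_TIME = 200.0, 60.0

# Enumerate integer hour allocations and keep the feasible maximum.
best = max(((TER * s + r, s, r)
            for s in range(61) for r in range(61)
            if COST_S * s + COST_R * r <= BUDGET_COST
            and s + r <= BUDGET_TIME),
           key=lambda t: t[0])
print(best)  # (learning, simulator hours, flight hours)
```

With these invented numbers, the cheap simulator hours fill most of the time budget while the costly flight hours are limited by the cost budget, mirroring the substitution argument in the abstract.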

  9. Review of Molecular Simulations of Methane Storage in Metal-Organic Frameworks.

    PubMed

    Lee, Seung-Joon; Bae, Youn-Sang

    2016-05-01

    Methane storage in porous materials is one of the hot issues because it can replace dangerous high-pressure compressed natural gas (CNG) tanks in natural gas vehicles. Among the diverse adsorbents, metal-organic frameworks (MOFs) are considered to be promising due to their extremely high surface areas and low crystal densities. Molecular simulation has been considered as an important tool for finding an appropriate MOF for methane storage. We review several important roles of molecular modeling for the studies of methane adsorption in MOFs. PMID:27483748

  10. The dynamic information architecture system : an advanced simulation framework for military and civilian applications.

    SciTech Connect

    Campbell, A. P.; Hummel, J. R.

    1998-01-08

    DIAS, the Dynamic Information Architecture System, is an object-oriented simulation system that was designed to provide an integrating framework in which new or legacy software applications can operate in a context-driven frame of reference. DIAS provides a flexible and extensible mechanism to allow disparate, and mixed language, software applications to interoperate. DIAS captures the dynamic interplay between different processes or phenomena in the same frame of reference. Finally, DIAS accommodates a broad range of analysis contexts, with widely varying spatial and temporal resolutions and fidelity.

  11. Towards a unified framework for coarse-graining particle-based simulations.

    SciTech Connect

    Junghans, Christoph

    2012-06-28

    Different coarse-graining techniques for soft matter systems have been developed in recent years; however, it is often very demanding to find the method most suitable for the problem studied. For this reason we began to develop the VOTCA toolkit to allow for easy comparison of different methods. We have incorporated six different techniques into the package and implemented a powerful, parallel analysis framework plus multiple simulation back-ends. We will discuss the specifics of the package by means of various studies that have been performed with the toolkit, and highlight problems we encountered along the way.
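
One of the simpler coarse-graining techniques that toolkits of this kind implement is direct Boltzmann inversion, which turns a target distribution p(r) of a coarse-grained coordinate into a pair potential U(r) = -kT ln p(r). The histogram values and temperature below are invented for illustration only.

```python
import math

# Direct Boltzmann inversion of a toy bond-length distribution
# (all values hypothetical).
kT = 2.49  # kJ/mol at roughly 300 K
# target histogram: bond length (nm) -> probability
p = {0.30: 0.05, 0.35: 0.25, 0.40: 0.40, 0.45: 0.25, 0.50: 0.05}

U = {r: -kT * math.log(pr) for r, pr in p.items()}
U_min = min(U.values())
U = {r: u - U_min for r, u in U.items()}  # shift the minimum to zero

for r, u in sorted(U.items()):
    print(f"{r:.2f} nm  {u:.2f} kJ/mol")
```

The most probable distance maps to the potential minimum; iterative variants refine such potentials against a reference simulation.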

  12. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    The following reports are presented on this project: a first-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; a second-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.

  13. A flexible object-oriented software framework for developing complex multimedia simulations.

    SciTech Connect

    Sydelko, P. J.; Dolph, J. E.; Christiansen, J. H.

    2002-05-03

    Decision makers involved in brownfields redevelopment and long-term stewardship must consider environmental conditions, future-use potential, site ownership, area infrastructure, funding resources, cost recovery, regulations, risk and liability management, community relations, and expected return on investment in a comprehensive and integrated fashion to achieve desired results. Successful brownfields redevelopment requires the ability to assess the impacts of redevelopment options on multiple interrelated aspects of the ecosystem, both natural and societal. Computer-based tools, such as simulation models, databases, and geographical information systems (GISs) can be used to address brownfields planning and project execution. The transparent integration of these tools into a comprehensive and dynamic decision support system would greatly enhance the brownfields assessment process. Such a system needs to be able to adapt to shifting and expanding analytical requirements and contexts. The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-oriented framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application domains. The modeling domain of a specific DIAS-based simulation is determined by (1) software objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. Models and applications used to express dynamic behaviors can be either internal or external to DIAS, including existing legacy models written in various languages (FORTRAN, C, etc.). The flexible design framework of DIAS makes the objects adjustable to the context of the problem without a great deal of recoding. The DIAS Spatial Data Set facility allows parameters to vary spatially depending on the simulation context according to any of a number of 1-D, 2-D

  14. Agent Based Modeling of Collaboration and Work Practices Onboard the International Space Station

    NASA Technical Reports Server (NTRS)

    Acquisti, Alessandro; Sierhuis, Maarten; Clancey, William J.; Bradshaw, Jeffrey M.; Shaffo, Mike (Technical Monitor)

    2002-01-01

    The International Space Station is one of the most complex projects ever, with numerous interdependent constraints affecting productivity and crew safety. This requires planning years before crew expeditions, and the use of sophisticated scheduling tools. Human work practices, however, are difficult to study and represent within traditional planning tools. We present an agent-based model and simulation of the activities and work practices of astronauts onboard the ISS. The model represents 'a day in the life' of the ISS crew and is developed in Brahms, an agent-oriented, activity-based language used to model knowledge in situated action and learning in human activities.

  15. An agent-based computational model of the spread of tuberculosis

    NASA Astrophysics Data System (ADS)

    de Espíndola, Aquino L.; Bauch, Chris T.; Troca Cabella, Brenno C.; Souto Martinez, Alexandre

    2011-05-01

    In this work we propose an alternative model of the spread of tuberculosis (TB) and the emergence of drug resistance due to treatment with antibiotics. We implement the simulations using an agent-based computational approach in which the spatial structure is taken into account. The spread of tuberculosis occurs according to probabilities defined by the interactions among individuals. The model was validated by reproducing results already known from the literature, in which different treatment regimes yield the emergence of drug resistance. The different patterns of TB spread can be visualized at any point of the system's evolution. The implementation details, as well as some results of this alternative approach, are discussed.
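
The spatial, probabilistic spread described above can be sketched with a minimal lattice-based agent model. The lattice size, infection and recovery probabilities below are hypothetical and the disease states are a generic S/I/R simplification, not the paper's actual TB model.

```python
import random

# Toy spatial agent-based epidemic: agents on an L x L lattice are
# Susceptible, Infected, or Recovered; infection spreads to lattice
# neighbors with probability P_INF (all parameters hypothetical).
L = 20
P_INF, P_REC = 0.3, 0.1
random.seed(42)

grid = [['S'] * L for _ in range(L)]
grid[L // 2][L // 2] = 'I'  # index case in the center

def neighbors(i, j):
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        yield (i + di) % L, (j + dj) % L  # periodic boundary

def step(grid):
    new = [row[:] for row in grid]
    for i in range(L):
        for j in range(L):
            if grid[i][j] == 'I':
                if random.random() < P_REC:
                    new[i][j] = 'R'
                for ni, nj in neighbors(i, j):
                    if grid[ni][nj] == 'S' and random.random() < P_INF:
                        new[ni][nj] = 'I'
    return new

for _ in range(30):
    grid = step(grid)
counts = {s: sum(row.count(s) for row in grid) for s in 'SIR'}
print(counts)
```

Because state updates are local, the spatial pattern of spread can be inspected at any step, which is the visualization capability the abstract highlights.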

  16. Genetic Algorithms for Agent-Based Infrastructure Interdependency Modeling and Analysis

    SciTech Connect

    May Permann

    2007-03-01

    Today’s society relies greatly upon an array of complex national and international infrastructure networks such as transportation, electric power, telecommunication, and financial networks. This paper describes initial research combining agent-based infrastructure modeling software and genetic algorithms (GAs) to help optimize infrastructure protection and restoration decisions. This research proposes to apply GAs to the problem of infrastructure modeling and analysis in order to determine the optimum assets to restore or protect from attack or other disaster. This research is just commencing and therefore the focus of this paper is the integration of a GA optimization method with a simulation through the simulation’s agents.
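
The coupling of a GA to an infrastructure protection decision can be sketched as a toy knapsack-style search: a bit string encodes which assets to protect, and fitness is the protected value when a budget constraint is met. All asset values, costs, and GA settings below are invented for illustration and are not from the paper's software.

```python
import random

# Toy GA: choose which of N infrastructure assets to protect,
# maximizing protected value under a budget (hypothetical values).
random.seed(1)
N, BUDGET, POP, GENS = 12, 50, 30, 40
value = [random.randint(5, 30) for _ in range(N)]
cost = [random.randint(5, 20) for _ in range(N)]

def fitness(bits):
    c = sum(ci for ci, b in zip(cost, bits) if b)
    v = sum(vi for vi, b in zip(value, bits) if b)
    return v if c <= BUDGET else 0  # infeasible plans score zero

def crossover(a, b):
    cut = random.randrange(1, N)  # single-point crossover
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.05):
    return [1 - b if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:POP // 2]  # keep the better half
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(POP - len(elite))]
best = max(pop, key=fitness)
print(best, fitness(best))
```

In the research described above, the fitness evaluation would instead invoke the agent-based infrastructure simulation for each candidate protection plan.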

  17. A proposed simulation optimization model framework for emergency department problems in public hospital

    NASA Astrophysics Data System (ADS)

    Ibrahim, Ireen Munira; Liong, Choong-Yeun; Bakar, Sakhinah Abu; Ahmad, Norazura; Najmuddin, Ahmad Farid

    2015-12-01

    The Emergency Department (ED) is a very complex system with limited resources to support increases in demand. ED services are considered to be of good quality if they can meet patients' expectations. Long waiting times and lengths of stay are the main problems faced by the management. The management of an ED should give greater emphasis to its resource capacity in order to increase the quality of services and thereby patient satisfaction. This paper reviews work in progress from a study being conducted in a government hospital in Selangor, Malaysia. It proposes a simulation optimization model framework for studying ED operations and problems, as well as for finding an optimal solution to those problems. It is hoped that the integration of simulation and optimization can assist management in the decision-making process regarding resource capacity planning, in order to improve current and future ED operations.
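
The kind of performance metric such a simulation-optimization framework would feed to an optimizer can be illustrated with a toy single-server queue standing in for the ED. The arrival and service rates below are hypothetical, and a real ED model would have multiple stations and resource types.

```python
import random

# Toy ED as a single-server queue: estimate mean patient waiting time
# (hypothetical rates, exponential interarrival and service times).
random.seed(3)
ARRIVAL_RATE, SERVICE_RATE = 0.8, 1.0  # patients per unit time
t, free_at, waits = 0.0, 0.0, []
for _ in range(10000):
    t += random.expovariate(ARRIVAL_RATE)  # next patient arrives
    start = max(t, free_at)                # wait if the server is busy
    waits.append(start - t)
    free_at = start + random.expovariate(SERVICE_RATE)
mean_wait = sum(waits) / len(waits)
print(round(mean_wait, 2))
```

An optimizer wrapped around this simulation would vary capacity parameters (here, the service rate) to drive the waiting-time metric down subject to cost.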

  18. Simulating collisions of thick nuclei in the color glass condensate framework

    NASA Astrophysics Data System (ADS)

    Gelfand, Daniil; Ipp, Andreas; Müller, David

    2016-07-01

    We present our work on the simulation of the early stages of heavy-ion collisions with finite longitudinal thickness in the laboratory frame in 3 +1 dimensions. In particular we study the effects of nuclear thickness on the production of a glasma state in the McLerran-Venugopalan model within the color glass condensate framework. A finite thickness enables us to describe nuclei at lower energies, but forces us to abandon boost invariance. As a consequence, random classical color sources within the nuclei have to be included in the simulation, which is achieved by using the colored particle-in-cell method. We show that the description in the laboratory frame agrees with boost-invariant approaches as a limiting case. Furthermore we investigate collisions beyond boost invariance, in particular the pressure anisotropy in the glasma.

  19. Direct numerical simulation of rigid bodies in multiphase flow within an Eulerian framework

    NASA Astrophysics Data System (ADS)

    Rauschenberger, P.; Weigand, B.

    2015-06-01

    A new method is presented to simulate rigid body motion in the Volume-of-Fluid based multiphase code Free Surface 3D. The specific feature of the new method is that it works within an Eulerian framework without the need for a Lagrangian representation of rigid bodies. Several test cases are shown to prove the validity of the numerical scheme. The technique is able to conserve the shape of arbitrarily shaped rigid bodies and predict terminal velocities of rigid spheres. The instability of a falling ellipsoid is captured. Multiple rigid bodies, including collisions, may be considered using only one Volume-of-Fluid variable, which allows simulation of the drafting, kissing and tumbling phenomena of two rigid spheres. The method can easily be extended to rigid bodies undergoing phase change processes.

  20. CRUSDE: A plug-in based simulation framework for composable CRUstal DEformation studies using Green's functions

    NASA Astrophysics Data System (ADS)

    Grapenthin, R.

    2014-01-01

    CRUSDE is a plug-in based simulation framework written in C/C++ for Linux platforms (installation information, download and test cases: http://www.grapenthin.org/crusde). It utilizes Green's functions for simulations of the Earth's response to changes in surface loads. Such changes could involve, for example, melting glaciers, oscillating snow loads, or lava flow emplacement. The focus in the simulation could be the response of the Earth's crust in terms of stress changes, changes in strain rates, or simply uplift or subsidence and the respective horizontal displacements of the crust (over time). Rather than implementing a variety of specific models, CRUSDE approaches crustal deformation problems from a general formulation in which model elements (Green's function, load function, relaxation function, load history), operators, pre- and postprocessors, as well as input and output routines are independent, exchangeable, and reusable on the basis of a plug-in approach (shared libraries loaded at runtime). We derive the general formulation CRUSDE is based on, describe its architecture and use, and demonstrate its capabilities in a test case. With CRUSDE, users can: (1) dynamically select software components to participate in a simulation (through XML experiment definitions), (2) extend the framework independently with new software components and reuse existing ones, and (3) exchange software components and experiment definitions with other users. CRUSDE's plug-in mechanism aims for straightforward extensibility, allowing modelers to add new Earth models/response functions. Current Green's function implementations include surface displacements due to the elastic response, final relaxed response, and pure thick plate response for a flat Earth. These can be combined to express exponential decay from elastic to final relaxed response, displacement rates due to one or multiple disks, irregular loads, or a combination of these. Each load can have its own load history and
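
The Green's-function superposition at the heart of such frameworks can be sketched as follows: a disk load is discretized into point loads and a Boussinesq-type elastic half-space response, u_z(r) = F(1 - nu^2)/(pi E r), is summed over them. All material and load parameters below are invented for illustration, the near-field cutoff is a crude regularization, and this is not CRUSDE's actual implementation.

```python
import math

# Surface displacement from a disk load as a sum over point-load
# Green's functions (hypothetical elastic half-space parameters).
E, NU, G_ACC = 40e9, 0.25, 9.81          # Young's modulus, Poisson ratio, gravity
RHO, H, R_DISK = 900.0, 100.0, 5000.0    # ice density, thickness, disk radius (m)
CELL = 500.0                             # grid spacing of the load, m

def uz_point(F, r):
    # Boussinesq vertical surface response, crudely regularized at r=0
    return F * (1 - NU**2) / (math.pi * E * max(r, CELL))

def uz_at(x, y):
    # superpose contributions from every loaded grid cell of the disk
    u = 0.0
    n = int(R_DISK // CELL)
    for i in range(-n, n + 1):
        for j in range(-n, n + 1):
            lx, ly = i * CELL, j * CELL
            if lx * lx + ly * ly <= R_DISK**2:
                F = RHO * G_ACC * H * CELL**2  # weight of one cell (N)
                u += uz_point(F, math.hypot(x - lx, y - ly))
    return u

print(uz_at(0.0, 0.0) > uz_at(20000.0, 0.0))  # subsidence decays with distance
```

Swapping `uz_point` for a viscoelastic or thick-plate response is exactly the kind of exchange the plug-in architecture is designed for.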

  1. A Probabilistic Framework for the Validation and Certification of Computer Simulations

    NASA Technical Reports Server (NTRS)

    Ghanem, Roger; Knio, Omar

    2000-01-01

    The paper presents a methodology for quantifying, propagating, and managing the uncertainty in the data required to initialize computer simulations of complex phenomena. The purpose of the methodology is to permit the quantitative assessment of a certification level to be associated with the predictions from the simulations, as well as the design of a data acquisition strategy to achieve a target level of certification. The value of a methodology that can address the above issues is obvious, especially in light of the trend in the availability of computational resources, as well as the trend in sensor technology. These two trends make it possible to probe physical phenomena both with physical sensors and with complex models, at previously inconceivable levels. With these new abilities arises the need to develop the knowledge to integrate the information from sensors and computer simulations. This is achieved in the present work by tracing both activities back to a level of abstraction that highlights their commonalities, thus allowing them to be manipulated in a mathematically consistent fashion. In particular, the mathematical theory underlying computer simulations has long been associated with partial differential equations and functional analysis concepts such as Hilbert spaces and orthogonal projections. By relying on a probabilistic framework for the modeling of data, a Hilbert space framework emerges that permits the modeling of coefficients in the governing equations as random variables, or equivalently, as elements in a Hilbert space. This permits the development of an approximation theory for probabilistic problems that parallels that of deterministic approximation theory. According to this formalism, the solution of the problem is identified by its projection on a basis in the Hilbert space of random variables, as opposed to more traditional techniques where the solution is approximated by its first or second-order statistics. The present

  2. Parallel kinetic Monte Carlo simulation framework incorporating accurate models of adsorbate lateral interactions

    NASA Astrophysics Data System (ADS)

    Nielsen, Jens; d'Avezac, Mayeul; Hetherington, James; Stamatakis, Michail

    2013-12-01

    Ab initio kinetic Monte Carlo (KMC) simulations have been successfully applied for over two decades to elucidate the underlying physico-chemical phenomena on the surfaces of heterogeneous catalysts. These simulations necessitate detailed knowledge of the kinetics of elementary reactions constituting the reaction mechanism, and the energetics of the species participating in the chemistry. The information about the energetics is encoded in the formation energies of gas and surface-bound species, and the lateral interactions between adsorbates on the catalytic surface, which can be modeled at different levels of detail. The majority of previous works accounted for only pairwise-additive first nearest-neighbor interactions. More recently, cluster-expansion Hamiltonians incorporating long-range interactions and many-body terms have been used for detailed estimations of catalytic rate [C. Wu, D. J. Schmidt, C. Wolverton, and W. F. Schneider, J. Catal. 286, 88 (2012)]. In view of the increasing interest in accurate predictions of catalytic performance, there is a need for general-purpose KMC approaches incorporating detailed cluster expansion models for the adlayer energetics. We have addressed this need by building on the previously introduced graph-theoretical KMC framework, and we have developed Zacros, a FORTRAN2003 KMC package for simulating catalytic chemistries. To tackle the high computational cost in the presence of long-range interactions we introduce parallelization with OpenMP. We further benchmark our framework by simulating a KMC analogue of the NO oxidation system established by Schneider and co-workers [J. Catal. 286, 88 (2012)]. We show that taking into account only first nearest-neighbor interactions may lead to large errors in the prediction of the catalytic rate, whereas for accurate estimates thereof, one needs to include long-range terms in the cluster expansion.
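
A minimal illustration of lattice KMC with the simplest level of adlayer energetics discussed above, pairwise-additive first-nearest-neighbor interactions: adsorption competes with desorption whose rate grows with the number of occupied neighbors. All rates and the interaction strength are hypothetical, and the event list is rebuilt naively each step rather than with the efficient graph-theoretical bookkeeping a package like Zacros uses.

```python
import math, random

# Toy lattice KMC: adsorption/desorption with repulsive first-nearest-
# neighbor lateral interactions (all parameters hypothetical).
random.seed(0)
L, K_ADS, K_DES0, EPS = 10, 1.0, 0.5, 0.3  # EPS: repulsion per neighbor
occ = [[0] * L for _ in range(L)]
t = 0.0

def nn_count(i, j):
    return sum(occ[(i + di) % L][(j + dj) % L]
               for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))

for _ in range(2000):
    events = []  # (rate, i, j, new_state)
    for i in range(L):
        for j in range(L):
            if occ[i][j] == 0:
                events.append((K_ADS, i, j, 1))
            else:  # repulsive neighbors raise the desorption rate
                events.append((K_DES0 * math.exp(EPS * nn_count(i, j)), i, j, 0))
    total = sum(e[0] for e in events)
    r = random.random() * total  # pick an event proportionally to its rate
    for rate, i, j, s in events:
        r -= rate
        if r <= 0:
            occ[i][j] = s
            break
    t += -math.log(random.random()) / total  # advance the KMC clock

coverage = sum(map(sum, occ)) / L**2
print(round(coverage, 2))
```

Making EPS depend on longer-range cluster terms instead of a single neighbor count is the cluster-expansion refinement the abstract argues is needed for accurate rates.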

  3. A computational framework for qualitative simulation of nonlinear dynamical models of gene-regulatory networks

    PubMed Central

    Ironi, Liliana; Panzeri, Luigi

    2009-01-01

    Background Due to the huge amount of information at the genomic level made recently available by high-throughput experimental technologies, networks of regulatory interactions between genes and gene products, the so-called gene-regulatory networks, can be uncovered. Most networks of interest are quite intricate because of both the high dimension of interacting elements and the complexity of the kinds of interactions between them. Mathematical and computational modeling frameworks are therefore essential for predicting the network behavior in response to environmental stimuli. A specific class of Ordinary Differential Equations (ODE) has been shown to be adequate to describe the essential features of the dynamics of gene-regulatory networks. However, deriving quantitative predictions of the network dynamics through the numerical simulation of such models is mostly impracticable, as they are currently characterized by incomplete knowledge of the biochemical reactions underlying regulatory interactions, and of the numeric values of kinetic parameters. Results This paper presents a computational framework for qualitative simulation of a class of ODE models, based on the assumption that gene regulation is threshold-dependent, i.e. only effective above or below a certain threshold. The simulation algorithm we propose assumes that threshold-dependent regulation mechanisms are modeled by continuous steep sigmoid functions, unlike other simulation tools, which considerably simplify the problem by approximating threshold-regulated response functions by step functions discontinuous in the thresholds. The algorithm results from the interplay between methods to deal with incomplete knowledge and to study phenomena that occur at different time-scales. Conclusion The work herein presented establishes the computational groundwork for a sound and complete algorithm capable of capturing the dynamical properties that depend only on the network structure and are invariant for ranges of values of kinetic parameters
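
The threshold-dependent regulation assumed by the framework can be sketched with a two-gene toy model in which steep sigmoid functions approximate step-like responses. All parameters are hypothetical, and plain forward Euler integration stands in for the qualitative machinery the paper actually develops.

```python
import math

# Toy two-gene network: y activates x, x represses y, with steep
# sigmoid (near-step) regulation (hypothetical parameters).
def sigmoid(u, theta, q=50.0):
    # steep continuous approximation of a step at threshold theta
    return 1.0 / (1.0 + math.exp(-q * (u - theta)))

k1, k2, g1, g2 = 1.0, 1.0, 0.5, 0.5   # production and degradation rates
th1, th2 = 0.8, 0.8                   # regulation thresholds
x, y, dt = 0.1, 1.2, 0.01
for _ in range(5000):  # forward Euler integration
    dx = k1 * sigmoid(y, th2) - g1 * x        # y activates x
    dy = k2 * (1 - sigmoid(x, th1)) - g2 * y  # x represses y
    x, y = x + dt * dx, y + dt * dy
print(round(x, 2), round(y, 2))
```

Replacing the steep sigmoid with a discontinuous step function is the simplification the paper argues against, since it discards behavior occurring near the thresholds.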

  4. A multi-fidelity framework for physics based rotor blade simulation and optimization

    NASA Astrophysics Data System (ADS)

    Collins, Kyle Brian

    with lower fidelity models. This thesis documents the development of automated low and high fidelity physics based rotor simulation frameworks. The low fidelity framework uses a comprehensive code with simplified aerodynamics. The high fidelity model uses a parallel processor capable CFD/CSD methodology. Both low and high fidelity frameworks include an aeroacoustic simulation for prediction of noise. A synergistic process is developed that uses both the low and high fidelity frameworks together to build approximate models of important high fidelity metrics as functions of certain design variables. To test the process, a 4-bladed hingeless rotor model is used as a baseline. The design variables investigated include tip geometry and spanwise twist distribution. Approximation models are built for metrics related to rotor efficiency and vibration using the results from 60+ high fidelity (CFD/CSD) experiments and 400+ low fidelity experiments. Optimization using the approximation models found the Pareto Frontier anchor points, or the design having maximum rotor efficiency and the design having minimum vibration. Various Pareto generation methods are used to find designs on the frontier between these two anchor designs. When tested in the high fidelity framework, the Pareto anchor designs are shown to be very good designs when compared with other designs from the high fidelity database. This provides evidence that the process proposed has merit. Ultimately, this process can be utilized by industry rotor designers with their existing tools to bring high fidelity analysis into the preliminary design stage of rotors. In conclusion, the methods developed and documented in this thesis have made several novel contributions. First, an automated high fidelity CFD based forward flight simulation framework has been built for use in preliminary design optimization. The framework was built around an integrated, parallel processor capable CFD/CSD/AA process. 
Second, a novel method of

  5. Structure simulation with calculated NMR parameters - integrating COSMOS into the CCPN framework.

    PubMed

    Schneider, Olaf; Fogh, Rasmus H; Sternberg, Ulrich; Klenin, Konstantin; Kondov, Ivan

    2012-01-01

    The Collaborative Computing Project for NMR (CCPN) has built a software framework consisting of the CCPN data model (with APIs) for NMR-related data, the CcpNmr Analysis program, and additional tools like CcpNmr FormatConverter. The open architecture allows for the integration of external software to extend the abilities of the CCPN framework with additional calculation methods. Recently, we have carried out the first steps for integrating our software Computer Simulation of Molecular Structures (COSMOS) into the CCPN framework. The COSMOS-NMR force field unites quantum chemical routines for the calculation of molecular properties with a molecular mechanics force field yielding the relative molecular energies. COSMOS-NMR allows introducing NMR parameters as constraints into molecular mechanics calculations. The resulting infrastructure will be made available for the NMR community. As a first application we have tested the evaluation of calculated protein structures using COSMOS-derived 13C Cα and Cβ chemical shifts. In this paper we give an overview of the methodology and a roadmap for future developments and applications. PMID:22942007

  7. Multiscale Simulation as a Framework for the Enhanced Design of Nanodiamond-Polyethylenimine-based Gene Delivery

    PubMed Central

    Kim, Hansung; Man, Han Bin; Saha, Biswajit; Kopacz, Adrian M.; Lee, One-Sun; Schatz, George C.; Ho, Dean; Liu, Wing Kam

    2012-01-01

    Nanodiamonds (NDs) are emerging carbon platforms with promise as gene/drug delivery vectors for cancer therapy. Specifically, NDs functionalized with the polymer polyethylenimine (PEI) can transfect small interfering RNAs (siRNA) in vitro with high efficiency and low cytotoxicity. Here we present a modeling framework to accurately guide the design of ND-PEI gene platforms and elucidate binding mechanisms between ND, PEI, and siRNA. This is among the first ND simulations to comprehensively account for ND size, charge distribution, surface functionalization, and graphitization. The simulation results are compared with our experimental results both for PEI loading onto NDs and for siRNA (C-myc) loading onto ND-PEI for various mixing ratios. Remarkably, the model is able to predict loading trends and saturation limits for PEI and siRNA, while confirming the essential role of ND surface functionalization in mediating ND-PEI interactions. These results demonstrate that this robust framework can be a powerful tool in ND platform development, with the capacity to realistically treat other nanoparticle systems. PMID:23304428

  8. A Component-Based FPGA Design Framework for Neuronal Ion Channel Dynamics Simulations

    PubMed Central

    Mak, Terrence S. T.; Rachmuth, Guy; Lam, Kai-Pui; Poon, Chi-Sang

    2008-01-01

    Neuron-machine interfaces such as dynamic clamp and brain-implantable neuroprosthetic devices require real-time simulations of neuronal ion channel dynamics. Field Programmable Gate Array (FPGA) has emerged as a high-speed digital platform ideal for such application-specific computations. We propose an efficient and flexible component-based FPGA design framework for neuronal ion channel dynamics simulations, which overcomes certain limitations of the recently proposed memory-based approach. A parallel processing strategy is used to minimize computational delay, and a hardware-efficient factoring approach for calculating exponential and division functions in neuronal ion channel models is used to conserve resource consumption. Performances of the various FPGA design approaches are compared theoretically and experimentally in corresponding implementations of the AMPA and NMDA synaptic ion channel models. Our results suggest that the component-based design framework provides a more memory economic solution as well as more efficient logic utilization for large word lengths, whereas the memory-based approach may be suitable for time-critical applications where a higher throughput rate is desired. PMID:17190033

  9. PHAISTOS: a framework for Markov chain Monte Carlo simulation and inference of protein structure.

    PubMed

    Boomsma, Wouter; Frellsen, Jes; Harder, Tim; Bottaro, Sandro; Johansson, Kristoffer E; Tian, Pengfei; Stovgaard, Kasper; Andreetta, Christian; Olsson, Simon; Valentin, Jan B; Antonov, Lubomir D; Christensen, Anders S; Borg, Mikael; Jensen, Jan H; Lindorff-Larsen, Kresten; Ferkinghoff-Borg, Jesper; Hamelryck, Thomas

    2013-07-15

    We present a new software framework for Markov chain Monte Carlo sampling for simulation, prediction, and inference of protein structure. The software package contains implementations of recent advances in Monte Carlo methodology, such as efficient local updates and sampling from probabilistic models of local protein structure. These models form a probabilistic alternative to the widely used fragment and rotamer libraries. Combined with an easily extendible software architecture, this makes PHAISTOS well suited for Bayesian inference of protein structure from sequence and/or experimental data. Currently, two force-fields are available within the framework: PROFASI and OPLS-AA/L, the latter including the generalized Born surface area solvent model. A flexible command-line and configuration-file interface allows users quickly to set up simulations with the desired configuration. PHAISTOS is released under the GNU General Public License v3.0. Source code and documentation are freely available from http://phaistos.sourceforge.net. The software is implemented in C++ and has been tested on Linux and OSX platforms.
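
The core Metropolis-Hastings loop underlying this kind of MCMC structure sampling can be sketched on a toy one-dimensional problem: sampling a single dihedral angle from a Boltzmann distribution over a hypothetical three-fold torsional potential. The potential, temperature, and proposal width are invented for illustration and bear no relation to the PROFASI or OPLS-AA/L force fields.

```python
import math, random

# Metropolis-Hastings sampling of one dihedral angle phi from
# exp(-E(phi)/kT) with a toy torsional potential (hypothetical).
random.seed(7)
kT = 1.0
E = lambda phi: 2.0 * (1 + math.cos(3 * phi))  # 3-fold torsion barrier

phi, samples = 0.0, []
for step in range(20000):
    prop = phi + random.gauss(0, 0.3)  # local update around current state
    if random.random() < math.exp(-(E(prop) - E(phi)) / kT):
        phi = prop                     # accept; otherwise keep phi
    if step > 2000:                    # discard burn-in
        samples.append(phi)
mean_E = sum(E(p) for p in samples) / len(samples)
print(round(mean_E, 2))
```

A structure sampler generalizes this loop to thousands of degrees of freedom, which is why the efficient local updates and probabilistic proposal models mentioned in the abstract matter.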

  10. A parallel framework for the FE-based simulation of knee joint motion.

    PubMed

    Wawro, Martin; Fathi-Torbaghan, Madjid

    2004-08-01

    We present an object-oriented framework for the finite-element (FE)-based simulation of human knee joint motion. The FE model of the knee joint is acquired from the patient in vivo using magnetic resonance imaging. The MRI images are converted into a three-dimensional model, from which an all-hexahedral mesh for the FE analysis is generated. The simulation environment uses nonlinear finite-element analysis (FEA) and supports contact within the model, which is required for the complex rolling/sliding motion of the knee joint. The software strictly follows object-oriented software-engineering concepts in order to guarantee maximum extensibility and maintainability. The final goal of this work in progress is a computer-based biomechanical model of the knee joint that can be used in a variety of applications, ranging from prosthesis design and treatment planning (e.g., optimal reconstruction of ruptured ligaments) through surgical simulation to impact computations in crashworthiness simulations.

  11. Global Simulation of Bioenergy Crop Productivity: Analytical Framework and Case Study for Switchgrass

    SciTech Connect

    Kang, Shujiang; Kline, Keith L; Nair, S. Surendran; Nichols, Dr Jeff A; Post, Wilfred M; Brandt, Craig C; Wullschleger, Stan D; Wei, Yaxing; Singh, Nagendra

    2013-01-01

    A global energy crop productivity model that provides geospatially explicit quantitative details on biomass potential and factors affecting sustainability would be useful, but no such model currently exists. This study describes a modeling platform capable of meeting many challenges associated with global-scale agro-ecosystem modeling. We designed an analytical framework for bioenergy crops consisting of six major components: (i) standardized natural resources datasets, (ii) global field-trial data and crop management practices, (iii) simulation units and management scenarios, (iv) model calibration and validation, (v) high-performance computing (HPC) simulation, and (vi) simulation output processing and analysis. The HPC-Environmental Policy Integrated Climate (HPC-EPIC) model simulated a perennial bioenergy crop, switchgrass (Panicum virgatum L.), estimating feedstock production potentials and effects across the globe. This modeling platform can assess soil C sequestration, net greenhouse gas (GHG) emissions, nonpoint source pollution (e.g., nutrient and pesticide loss), and energy exchange with the atmosphere. It can be expanded to include additional bioenergy crops (e.g., miscanthus, energy cane, and agave) and food crops under different management scenarios. The platform and switchgrass field-trial dataset are available to support global analysis of biomass feedstock production potential and corresponding metrics of sustainability.

  12. A GIS/Simulation Framework for Assessing Change in Water Yield over Large Spatial Scales

    SciTech Connect

    Graham, R.; Hargrove, W.W.; Huff, D.D.; Nikolov, N.; Tharp, M.L.

    1999-11-13

    Recent legislation to initiate vegetation management in the Central Sierra hydrologic region of California includes a focus on corresponding changes in water yield. This served as the impetus for developing a combined geographic information system (GIS) and simulation assessment framework. Using the existing vegetation density condition, together with proposed rules for thinning to reduce fire risk, a set of simulation model inputs was generated for examining the impact of the thinning scenario on water yield. The approach allows results to be expressed as the mean and standard deviation of the change in water yield for each 1 km2 map cell that is treated. Values for groups of cells are aggregated into typical watershed units using area-weighted averaging. Wet, dry, and average precipitation years were simulated over a large region. Where snow plays an important role in hydrologic processes, the simulated change in water yield was less than 0.5% of expected annual runoff for a typical watershed. Such small changes would be undetectable in the field using conventional stream flow analysis. These results suggest that using water yield increases to help justify forest-thinning activities or offset their cost will be difficult.
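The area-weighted aggregation rule mentioned above is simple to state in code (a sketch with hypothetical per-cell values):

```python
def area_weighted_mean(values, areas):
    """Aggregate per-cell water-yield changes into a watershed value:
    sum(v_i * a_i) / sum(a_i)."""
    return sum(v * a for v, a in zip(values, areas)) / sum(areas)

# Two hypothetical map cells with different treated areas (km2).
change = area_weighted_mean([1.0, 3.0], [1.0, 3.0])
```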

  13. Spatial and Temporal Simulation of Human Evolution. Methods, Frameworks and Applications

    PubMed Central

    Benguigui, Macarena; Arenas, Miguel

    2014-01-01

    Analyses of human evolution are fundamental to understanding the current gradients of human diversity. In this regard, genetic samples collected from current populations, together with archaeological data, are the most important resources for studying human evolution. However, they are often insufficient to properly evaluate a variety of evolutionary scenarios, leading to continuous debates and discussions. A commonly applied strategy consists of using computer simulations based on evolutionary models that are as realistic as possible to evaluate alternative evolutionary scenarios through statistical correlations with the real data. Computer simulations can also be applied to estimate evolutionary parameters or to study the role of each parameter in the evolutionary process. Here we review the most commonly used methods and evolutionary frameworks for performing realistic spatially explicit computer simulations of human evolution. Although we focus on human evolution, most of the methods and software we describe can also be used to study other species. We also describe the importance of considering spatially explicit models to better mimic human evolutionary scenarios based on a variety of phenomena such as range expansions, range shifts, range contractions, sex-biased dispersal, long-distance dispersal, or admixtures of populations. We finally discuss future implementations to improve current spatially explicit simulations and their derived applications in human evolution. PMID:25132795

  14. Global Simulation of Bioenergy Crop Productivity: Analytical framework and Case Study for Switchgrass

    SciTech Connect

    Nair, S. Surendran; Nichols, Jeff A.; Post, Wilfred M; Wang, Dali; Wullschleger, Stan D; Kline, Keith L; Wei, Yaxing; Singh, Nagendra; Kang, Shujiang

    2014-01-01

    Contemporary global assessments of the deployment potential and sustainability aspects of biofuel crops lack quantitative details. This paper describes an analytical framework capable of meeting the challenges associated with global scale agro-ecosystem modeling. We designed a modeling platform for bioenergy crops, consisting of five major components: (i) standardized global natural resources and management data sets, (ii) global simulation unit and management scenarios, (iii) model calibration and validation, (iv) high-performance computing (HPC) modeling, and (v) simulation output processing and analysis. A case study with the HPC-Environmental Policy Integrated Climate model (HPC-EPIC), simulating a perennial bioenergy crop, switchgrass (Panicum virgatum L.), together with a global biomass feedstock analysis on grassland, demonstrates the application of this platform. The results illustrate the biomass feedstock variability of switchgrass and provide insights on how the modeling platform can be expanded to better assess sustainable production criteria and other biomass crops. Feedstock potentials on global grasslands and within different countries are also shown. Future efforts involve developing databases of productivity, implementing global simulations for other bioenergy crops (e.g. miscanthus, energy cane, and agave), and assessing environmental impacts under various management regimes. We anticipate that this platform will provide an exemplary tool and assessment data for international communities to conduct global analysis of biofuel biomass feedstocks and sustainability.

  15. Flexible and modular MPI simulation framework and its use in modelling a μMPI

    PubMed Central

    Straub, Marcel; Lammers, Twan; Kiessling, Fabian; Schulz, Volkmar

    2014-01-01

    The availability of thorough system simulations for detailed and accurate performance prediction and optimization of existing and future designs is very important for a new modality such as magnetic particle imaging (MPI). Our framework aims to simulate a complete MPI system by providing a description of all (drive and receive) coils, permanent magnet configurations, magnetic nanoparticle (MNP) distributions, and characteristics of the signal processing chain. The simulation is performed on a user-defined spatially and temporally discrete grid. The magnetization of the MNPs is modelled either by the Langevin theory or as ideal particles with infinite steepness and ideal saturation. The magnetic fields are approximated to first order by calculating the Biot-Savart integral. Additionally, the coupling constants between the excitation coils (e.g. drive field coils) and the receive coils can be determined. All coils can be described by an XML description language based on primitive geometric shapes. First simulations of a modelled μMPI system are shown; here, μMPI refers to a small one-dimensional system for samples of a few tens of cubic millimeters in size and a spatial resolution of about 200 μm. PMID:25892744
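The Langevin model of MNP magnetization mentioned above has a compact closed form; a sketch (units and the field-to-argument scaling are arbitrary here):

```python
import math

def langevin_magnetization(h, m_sat=1.0, xi_scale=1.0):
    """Langevin theory for superparamagnetic nanoparticles:
    M(H) = M_sat * (coth(xi) - 1/xi), with xi = xi_scale * H.
    Near H = 0 the series M ~ M_sat * xi / 3 avoids the 0/0 limit."""
    xi = xi_scale * h
    if abs(xi) < 1e-6:
        return m_sat * xi / 3.0
    return m_sat * (1.0 / math.tanh(xi) - 1.0 / xi)
```

The curve is odd in H and saturates toward M_sat at strong fields; that nonlinearity is what generates the harmonic response MPI detects.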

  16. SimPEG: An open-source framework for geophysical simulations and inverse problems

    NASA Astrophysics Data System (ADS)

    Cockett, R.; Kang, S.; Heagy, L.

    2014-12-01

    Geophysical surveys are powerful tools for obtaining information about the subsurface. Inverse modelling provides a mathematical framework for constructing a model of physical property distributions that are consistent with the data collected by these surveys. The geosciences are increasingly moving towards the integration of geological, geophysical, and hydrological information to better characterize the subsurface. This integration must span disciplines and is not only challenging scientifically, but the inconsistencies between conventions often make implementations complicated, non-reproducible, or inefficient. We have developed an open source software package for Simulation and Parameter Estimation in Geophysics (SimPEG), which provides a generalized framework for solving geophysical forward and inverse problems. SimPEG is written entirely in Python with minimal dependencies in the hopes that it can be used both as a research tool and for education. SimPEG includes finite volume discretizations on structured and unstructured meshes, interfaces to standard numerical solver packages, convex optimization algorithms, model parameterizations, and tailored visualization routines. The framework is modular and object-oriented, which promotes real time experimentation and combination of geophysical problems and inversion methodologies. In this presentation, we will highlight a few geophysical examples, including direct-current resistivity and electromagnetics, and discuss some of the challenges and successes we encountered in developing a flexible and extensible framework. Throughout development of SimPEG we have focused on simplicity, usability, documentation, and extensive testing. By embracing a fully open source development paradigm, we hope to encourage reproducible research, cooperation, and communication to help tackle some of the inherently multidisciplinary problems that face integrated geophysical methods.
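The forward-plus-regularized-inverse pattern that SimPEG generalizes can be illustrated with a tiny dependency-free sketch (plain gradient descent on a damped least-squares objective; this is not SimPEG's actual API, which is built around meshes, NumPy, and pluggable optimizers):

```python
def tikhonov_inverse(G, d, beta, iters=500, lr=0.1):
    """Minimize ||G m - d||^2 + beta * ||m||^2 by gradient descent.
    G is the forward operator (list of rows), d the observed data."""
    n_data, n_model = len(G), len(G[0])
    m = [0.0] * n_model
    for _ in range(iters):
        # Residual r = G m - d, then gradient of the damped objective.
        r = [sum(G[i][j] * m[j] for j in range(n_model)) - d[i]
             for i in range(n_data)]
        grad = [2.0 * sum(G[i][j] * r[i] for i in range(n_data)) + 2.0 * beta * m[j]
                for j in range(n_model)]
        m = [mj - lr * g for mj, g in zip(m, grad)]
    return m

# Recover a two-parameter model from an identity "survey" (beta = 0
# reduces to ordinary least squares; beta > 0 damps the model).
m_est = tikhonov_inverse([[1.0, 0.0], [0.0, 1.0]], [2.0, 4.0], beta=0.0)
```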

  17. Is the Person-Situation Debate Important for Agent-Based Modeling and Vice-Versa?

    PubMed Central

    Sznajd-Weron, Katarzyna; Szwabiński, Janusz; Weron, Rafał

    2014-01-01

    Background Agent-based models (ABM) are believed to be a very powerful tool in the social sciences, sometimes even treated as a substitute for social experiments. When building an ABM we have to define the agents and the rules governing the artificial society. Given the complexity and our limited understanding of human nature, we face the problem of assuming that either personal traits, the situation, or both have an impact on the social behavior of agents. However, as the long-standing person-situation debate in psychology shows, there is no consensus as to the underlying psychological mechanism, and the important question that arises is whether the modeling assumptions we make will have a substantial influence on the simulated behavior of the system as a whole or not. Methodology/Principal Findings Studying two variants of the same agent-based model of opinion formation, we show that the decision to choose either personal traits or the situation as the primary factor driving social interactions is of critical importance. Using Monte Carlo simulations (for Barabási-Albert networks) and analytic calculations (for a complete graph) we provide evidence that assuming a person-specific response to social influence at the microscopic level generally leads to a completely different and less realistic aggregate or macroscopic behavior than an assumption of a situation-specific response; a result that has been reported by social psychologists for a range of experimental setups, but has been downplayed or ignored in the opinion dynamics literature. Significance This sensitivity to modeling assumptions has far-reaching consequences also beyond opinion dynamics, since agent-based models are becoming a popular tool among economists and policy makers and are often used as substitutes for real social experiments. PMID:25369531
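The modeling choice at issue can be made concrete with a toy opinion model (a hypothetical illustration of the two assumptions, not the authors' actual model): under the person-specific assumption a fixed subset of agents always acts independently, while under the situation-specific assumption independence is a fresh random draw at every interaction.

```python
import random

def simulate(person_specific, n=200, steps=20000, p=0.3, seed=1):
    """Toy opinion dynamics on a complete graph. On each update an agent
    either acts independently (random opinion) with probability p, or
    conforms by copying a randomly chosen other agent."""
    rng = random.Random(seed)
    opinions = [1] * n
    independents = {i for i in range(n) if rng.random() < p}
    for _ in range(steps):
        i = rng.randrange(n)
        independent = (i in independents) if person_specific else (rng.random() < p)
        if independent:
            opinions[i] = rng.choice([-1, 1])
        else:
            opinions[i] = opinions[rng.randrange(n)]
    return sum(opinions) / n   # magnetization in [-1, 1]

m_person = simulate(person_specific=True)
m_situation = simulate(person_specific=False)
```

Running both variants from identical initial conditions lets one compare the aggregate (macroscopic) outcomes that, as the paper reports, can differ sharply between the two microscopic assumptions.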

  18. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries

    PubMed Central

    2012-01-01

    Background Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be both computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. Results We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to mature external geometry and mesh handling software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much like an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines.
Since the overall design effectively hides the complexity of managing the geometry and meshes, this means that newly developed methods may be tested in a realistic

  19. PM2.5 Population Exposure in New Delhi Using a Probabilistic Simulation Framework.

    PubMed

    Saraswat, Arvind; Kandlikar, Milind; Brauer, Michael; Srivastava, Arun

    2016-03-15

    This paper presents a Geographical Information System (GIS) based probabilistic simulation framework to estimate PM2.5 population exposure in New Delhi, India. The framework integrates PM2.5 output from spatiotemporal LUR models and trip distribution data using a Gravity model based on zonal data for population, employment and enrollment in educational institutions. Time-activity patterns were derived from a survey of randomly sampled individuals (n = 1012) and in-vehicle exposure was estimated using microenvironmental monitoring data based on field measurements. We simulated population exposure for three different scenarios to capture stay-at-home populations (Scenario 1), working population exposed to near-road concentrations during commutes (Scenario 2), and the working population exposed to on-road concentrations during commutes (Scenario 3). Simulated annual average levels of PM2.5 exposure across the entire city were very high, and particularly severe in the winter months: ∼200 μg m⁻³ in November, roughly four times higher than the lower levels in the monsoon season. Mean annual exposures ranged from 109 μg m⁻³ (IQR: 97-120 μg m⁻³) for Scenario 1, to 121 μg m⁻³ (IQR: 110-131 μg m⁻³) and 125 μg m⁻³ (IQR: 114-136 μg m⁻³) for Scenarios 2 and 3, respectively. Ignoring the effects of mobility causes the average annual PM2.5 population exposure to be underestimated by only 11%. PMID:26885573
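The exposure arithmetic behind such scenarios is a time-weighted average over microenvironments; a sketch with hypothetical numbers (not the study's data):

```python
def time_weighted_exposure(schedule):
    """Time-weighted average PM2.5 exposure: schedule is a list of
    (hours, concentration in ug/m3) pairs; returns sum(t*c)/sum(t)."""
    total = sum(t for t, _ in schedule)
    return sum(t * c for t, c in schedule) / total

# Hypothetical day: home, workplace, and an on-road commute.
day = [(14, 100.0), (8, 120.0), (2, 250.0)]
exposure = time_weighted_exposure(day)
```

Even a short commute at high on-road concentrations shifts the daily average, which is why the three scenarios above differ.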

  20. Final Report for Project "Framework Application for Core-Edge Transport Simulations (FACETS)"

    SciTech Connect

    Estep, Donald

    2014-01-17

    This is the final report for the Colorado State University component of the FACETS project. FACETS was focused on the development of a multiphysics, parallel framework application that could provide the capability to enable whole-device fusion reactor modeling and, in the process, the development of the modeling infrastructure and computational understanding needed for ITER. FACETS was intended to be highly flexible, through the use of modern computational methods including component technology and object-oriented design, to facilitate switching from one model to another for a given aspect of the physics, and to make it possible to use simplified models for rapid turnaround or high-fidelity models that take advantage of the largest supercomputer hardware. FACETS was designed in a heterogeneous parallel context, where different parts of the application can exploit parallelism based on task farming, domain decomposition, and/or pipelining as needed and applicable. As with all fusion simulations, an integral part of the FACETS project was the treatment of the coupling of different physical processes at different scales interacting closely. A primary example for the FACETS project is the coupling of existing core and edge simulations, with the transport and wall interactions described by reduced models. However, core and edge simulations themselves involve significant coupling of different processes with large scale differences. Numerical treatment of coupling is affected by a number of factors, including scale differences, the form of information transferred between processes, the implementation of solvers for different codes, and high performance computing concerns. Operator decomposition, in which the individual processes are computed separately using appropriate simulation codes and the component simulations are then linked/synchronized at regular points in space and time, is the de facto approach to high performance simulation of multiphysics systems.

  1. Development of a multi-physics simulation framework for semiconductor materials and devices

    NASA Astrophysics Data System (ADS)

    Almeida, Nuno Sucena

    framework will be used to simulate simple structures and some of its relevant parameters extracted.

  2. Network Interventions on Physical Activity in an Afterschool Program: An Agent-Based Social Network Study

    PubMed Central

    Zhang, Jun; Shoham, David A.; Tesdahl, Eric

    2015-01-01

    Objectives. We studied simulated interventions that leveraged social networks to increase physical activity in children. Methods. We studied a real-world social network of 81 children (average age = 7.96 years) who lived in low socioeconomic status neighborhoods, and attended public schools and 1 of 2 structured afterschool programs. The sample was ethnically diverse, and 44% were overweight or obese. We used social network analysis and agent-based modeling simulations to test whether implementing a network intervention would increase children’s physical activity. We tested 3 intervention strategies. Results. The intervention that targeted opinion leaders was effective in increasing the average level of physical activity across the entire network. However, the intervention that targeted the most sedentary children was the best at increasing their physical activity levels. Conclusions. Which network intervention to implement depends on whether the goal is to shift the entire distribution of physical activity or to influence those most adversely affected by low physical activity. Agent-based modeling could be an important complement to traditional project planning tools, analogous to sample size and power analyses, to help researchers design more effective interventions for increasing children’s physical activity. PMID:25689202
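The two targeting strategies compared above can be sketched as selection rules on the network (hypothetical function names and data; not the authors' code):

```python
def pick_targets(adjacency, activity, k, strategy):
    """Choose k children to target: 'leaders' picks the highest-degree
    nodes (opinion leaders), 'sedentary' the least-active children."""
    nodes = range(len(adjacency))
    if strategy == "leaders":
        key = lambda i: -len(adjacency[i])   # most network ties first
    else:
        key = lambda i: activity[i]          # lowest activity first
    return sorted(nodes, key=key)[:k]

# Tiny star network: child 0 is connected to everyone else.
adjacency = [[1, 2, 3], [0], [0], [0]]
activity = [3.0, 2.0, 1.5, 1.0]              # hypothetical activity scores
leaders = pick_targets(adjacency, activity, 1, "leaders")
sedentary = pick_targets(adjacency, activity, 1, "sedentary")
```

An agent-based simulation would then apply the intervention to the chosen set and let activity diffuse along network ties, which is how the study compares the strategies' population-level effects.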

  3. Understanding virulence mechanisms in M. tuberculosis infection via a circuit-based simulation framework.

    SciTech Connect

    May, Elebeoba Eni; Oprea, Tudor I.; Joo, Jaewook; Misra, Milind; Leitao, Andrei; Faulon, Jean-Loup Michel

    2008-08-01

    Tuberculosis (TB), caused by the bacterium Mycobacterium tuberculosis (Mtb), is a growing international health crisis. Mtb is able to persist in host tissues in a non-replicating persistent (NRP) or latent state, which presents a challenge for the treatment of TB. Latent TB re-activates in about 10% of individuals with normal immune systems, and at higher rates in those with compromised immune systems. A quantitative understanding of latency-associated virulence mechanisms may help researchers develop more effective methods to battle the spread of TB and reduce TB-associated fatalities. Leveraging BioXyce's ability to simulate whole-cell and multi-cellular systems, we are developing a circuit-based framework to investigate the impact of pathogenicity-associated pathways on the latency/reactivation phase of tuberculosis infection. We discuss efforts to simulate metabolic pathways that potentially impact the ability of Mtb to persist within host immune cells. We demonstrate how simulation studies can provide insight regarding the efficacy of potential anti-TB agents on biological networks critical to Mtb pathogenicity using a systems chemical biology approach.

  4. Social simulation theory: a framework to explain nurses' understanding of patients' experiences of ill-health.

    PubMed

    Nordby, Halvor

    2016-09-01

    A fundamental aim in caring practice is to understand patients' experiences of ill-health. These experiences have a qualitative content and cannot, unlike thoughts and beliefs with conceptual content, directly be expressed in words. Nurses therefore face a variety of interpretive challenges when they aim to understand patients' subjective perspectives on disease and illness. The article argues that theories on social simulation can shed light on how nurses manage to meet these challenges. The core assumption of social simulationism is that we do not understand other people by forming mental representations of how they think, but by putting ourselves in their situation in a more imaginative way. According to simulationism, any attempt to understand a patient's behavior is made on the basis of simulating what it is like to be that patient in the given context. The article argues that this approach to social interpretation can clarify how nurses manage to achieve aims of patient understanding, even when they have limited time to communicate and incomplete knowledge of patients' perspectives. Furthermore, simulation theory provides a normative framework for interpretation, in the sense that its theoretical assumptions constitute ideals for how nurses should seek to understand patients' experiences of illness.

  5. An upwind PPM with limiter for tokamak edge plasma simulation under the BOUT++ framework

    NASA Astrophysics Data System (ADS)

    Ma, Chenhao; Xu, Xueqiao

    2012-10-01

    To study the propagation of blobs driven by edge plasma instabilities, the Piecewise Parabolic Method (PPM) is applied to improve numerical accuracy. The upwind PPM with limiter preserves accuracy at smooth extrema: the interpolated values are restricted only at extrema, using nonlinear combinations of several different approximations of the second-order derivative. This method has the same accuracy for smooth initial data as PPM without a limiter and preserves the shape of the initial data exactly during propagation. BOUT++ is a C++ framework for 3D plasma fluid simulation in real geometry, including both open and closed field lines, and was developed in part from the original fluid edge code BOUT. Our goal is to implement the PPM with limiter as one of the numerical differencing methods in the BOUT++ library. Because the spatial scales of blobs driven by edge plasma instabilities are typically ten times smaller than the simulation region, the PPM with limiter will preserve the shape of blobs exactly at smooth extrema and provide better long-time simulation results.
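The idea of limiting interface values only where slopes change sign can be illustrated with a simpler cousin of the PPM limiter, a minmod-limited linear reconstruction (a conceptual sketch, not the BOUT++ implementation):

```python
def minmod(a, b):
    """Return the smaller-magnitude slope when both share a sign, and 0
    at an extremum (sign change), suppressing spurious oscillations."""
    if a * b <= 0:
        return 0.0
    return a if abs(a) < abs(b) else b

def limited_interface_values(u):
    """Left/right interface values for each interior cell, using a
    minmod-limited slope (a simplified stand-in for PPM's parabolae)."""
    out = []
    for i in range(1, len(u) - 1):
        slope = minmod(u[i] - u[i - 1], u[i + 1] - u[i])
        out.append((u[i] - 0.5 * slope, u[i] + 0.5 * slope))
    return out
```

On a smooth monotone profile the full slope is kept; at a local extremum the slope collapses to zero, which is exactly the situation the extremum-preserving PPM limiter handles more carefully to avoid clipping smooth peaks.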

  6. A framework for stochastic simulations and visualization of biological electron-transfer dynamics

    NASA Astrophysics Data System (ADS)

    Nakano, C. Masato; Byun, Hye Suk; Ma, Heng; Wei, Tao; El-Naggar, Mohamed Y.

    2015-08-01

    Electron transfer (ET) dictates a wide variety of energy-conversion processes in biological systems. Visualizing ET dynamics could provide key insight into understanding and possibly controlling these processes. We present a computational framework named VizBET to visualize biological ET dynamics, using an outer-membrane Mtr-Omc cytochrome complex in Shewanella oneidensis MR-1 as an example. Starting from X-ray crystal structures of the constituent cytochromes, molecular dynamics simulations are combined with homology modeling, protein docking, and binding free energy computations to sample the configuration of the complex as well as the change of the free energy associated with ET. This information, along with quantum-mechanical calculations of the electronic coupling, provides inputs to kinetic Monte Carlo (KMC) simulations of ET dynamics in a network of heme groups within the complex. Visualization of the KMC simulation results has been implemented as a plugin to the Visual Molecular Dynamics (VMD) software. VizBET has been used to reveal the nature of ET dynamics associated with novel nonequilibrium phase transitions in a candidate configuration of the Mtr-Omc complex due to electron-electron interactions.
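The kinetic Monte Carlo step at the heart of such ET simulations can be sketched generically (a single hopping electron on a toy three-heme chain with made-up rates; not VizBET's actual code):

```python
import math
import random

def kmc_walk(rates, start, steps, seed=0):
    """Kinetic Monte Carlo: from the current site, pick the next hop with
    probability proportional to its rate, and advance time by an
    exponentially distributed waiting time (mean 1/total_rate)."""
    rng = random.Random(seed)
    site, t = start, 0.0
    for _ in range(steps):
        hops = list(rates[site].items())
        total = sum(r for _, r in hops)
        t += -math.log(1.0 - rng.random()) / total   # waiting time
        x = rng.random() * total                     # rate-weighted choice
        for nxt, r in hops:
            x -= r
            if x <= 0.0:
                site = nxt
                break
    return site, t

# Toy heme chain with forward rates faster than backward rates.
rates = {0: {1: 2.0}, 1: {0: 0.5, 2: 2.0}, 2: {1: 0.5}}
final_site, elapsed = kmc_walk(rates, start=0, steps=100)
```

In a full simulation the per-pair rates would come from electronic couplings and free-energy differences (e.g. via Marcus theory), which is the information the molecular dynamics and quantum-mechanical calculations above supply.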

  7. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    SciTech Connect

    Ahmadi, Rouhollah; Khamehchi, Ehsan

    2013-12-15

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques, which call for an approximate forward model (filter) for the integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for filters as used in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  8. GeNN: a code generation framework for accelerated brain simulations.

    PubMed

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-01

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that a 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance-based Hodgkin-Huxley neurons, but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/. PMID:26740369

  10. Strengthening Theoretical Testing in Criminology Using Agent-based Modeling

    PubMed Central

    Groff, Elizabeth R.

    2014-01-01

    Objectives: The Journal of Research in Crime and Delinquency (JRCD) has published important contributions to both criminological theory and associated empirical tests. In this article, we consider some of the challenges associated with traditional approaches to social science research, and discuss a complementary approach that is gaining popularity—agent-based computational modeling—that may offer new opportunities to strengthen theories of crime and develop insights into phenomena of interest. Method: Two literature reviews are completed. The aim of the first is to identify those articles published in JRCD that have been the most influential and to classify the theoretical perspectives taken. The second is intended to identify those studies that have used an agent-based model (ABM) to examine criminological theories and to identify which theories have been explored. Results: Ecological theories of crime pattern formation have received the most attention from researchers using ABMs, but many other criminological theories are amenable to testing using such methods. Conclusion: Traditional methods of theory development and testing suffer from a number of potential issues that a more systematic use of ABMs—not without its own issues—may help to overcome. ABMs should become another method in the criminologist's toolbox to aid theory testing and falsification. PMID:25419001

  11. Modelling of robotic work cells using agent-based approach

    NASA Astrophysics Data System (ADS)

    Sękala, A.; Banaś, W.; Gwiazda, A.; Monica, Z.; Kost, G.; Hryniewicz, P.

    2016-08-01

    In modern manufacturing systems, the requirements concerning both the scope and the characteristics of technical procedures change dynamically, so the organization of a production system often cannot keep up with changes in market demand. Accordingly, there is a need for new design methods characterized, on the one hand, by high efficiency and, on the other, by an adequate quality of the generated organizational solutions. One of the tools that could be used for this purpose is the concept of agent systems, which are tools of artificial intelligence. They allow assigning to agents the proper domains of procedures and knowledge, so that the agents represent, in a self-organizing agent environment, the components of a real system. An agent-based system for modelling a robotic work cell should be designed taking into consideration the many constraints that characterize this type of production unit. It is possible to distinguish several groups of structural components that constitute such a system, which confirms the structural complexity of a work cell as a specific production system. It is therefore necessary to develop agents depicting various aspects of the work cell structure. The main groups of agents used to model a robotic work cell should at least include the following: machine tool agents, auxiliary equipment agents, robot agents, transport equipment agents, organizational agents, as well as data and knowledge base agents. In this way it is possible to create the holarchy of the agent-based system.
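
    The agent grouping described above lends itself to a simple capability-matching sketch. The classes, agent names, and operations below are hypothetical illustrations of how an organizational agent might route process steps to machine tool, robot, transport, and auxiliary equipment agents; they are not the authors' implementation:

```python
class Agent:
    """Base agent: knows its name and which operations it can perform."""
    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = set(capabilities)

    def can_do(self, op):
        return op in self.capabilities

class OrganizationalAgent:
    """Coordinator: routes each process step to the first capable agent."""
    def __init__(self, agents):
        self.agents = agents

    def assign(self, process):
        plan = []
        for op in process:
            agent = next((a for a in self.agents if a.can_do(op)), None)
            if agent is None:
                raise ValueError(f"no agent can perform {op!r}")
            plan.append((op, agent.name))
        return plan

cell = [
    Agent("mill_1", {"milling", "drilling"}),   # machine tool agent
    Agent("robot_1", {"pick", "place"}),        # robot agent
    Agent("conveyor", {"transport"}),           # transport equipment agent
    Agent("fixture", {"clamp"}),                # auxiliary equipment agent
]
org = OrganizationalAgent(cell)
plan = org.assign(["pick", "clamp", "milling", "place", "transport"])
print(plan)
```

    In a fuller system, each agent would also carry its own knowledge base and negotiate with the others rather than accept a central dispatch, which is where the holarchy mentioned above comes in.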

  12. Excellent approach to modeling urban expansion by fuzzy cellular automata: agent-based model

    NASA Astrophysics Data System (ADS)

    Khajavigodellou, Yousef; Alesheikh, Ali A.; Mohammed, Abdulrazak A. S.; Chapi, Kamran

    2014-09-01

    Recently, the interaction between humans and their environment has become one of the important challenges in the world. Land use/cover change (LUCC) is a complex process that includes actors and factors at different social and spatial levels. The complexity and dynamics of urban systems make the practice of urban modeling very difficult. With increased computational power and the greater availability of spatial data, micro-simulation approaches such as agent-based and cellular automata methods have been developed by geographers, planners, and scholars, and they have shown great potential for representing and simulating the complexity of the dynamic processes involved in urban growth and land use change. This paper presents fuzzy cellular automata, combined with geospatial information systems and remote sensing, to simulate and predict urban expansion patterns. These FCA-based dynamic spatial urban models provide an improved ability to forecast and assess future urban growth and to create planning scenarios, allowing us to explore the potential impacts of simulations that correspond to urban planning and management policies. In the fuzzy-inference-guided cellular automata approach, semantic or linguistic knowledge of land use change is expressed as fuzzy rules, based on which fuzzy inference is applied to determine the urban development potential of each pixel. The model integrates an ABM (agent-based model) and FCA (fuzzy cellular automata) to investigate a complex decision-making process and future urban dynamics. Based on this model, rapid development and green land protection under the influence of the behaviors and decision modes of regional authority agents, real estate developer agents, resident agents, and non-resident agents and their interactions have been applied to predict the future development patterns of the Erbil metropolitan region.
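
    As a rough illustration of the fuzzy-inference-guided CA idea, the toy model below converts a cell to urban when the fuzzy AND of "dense neighbourhood" and "flat terrain" memberships exceeds a threshold. The grid, membership functions, threshold, and slope map are invented for the sketch and bear no relation to the Erbil calibration:

```python
import random

def fuzzy_potential(density, slope):
    """Fuzzy AND (min operator) of two memberships:
    'dense urban neighbourhood' and 'flat terrain'."""
    return min(density, 1.0 - slope)

def step(urban, slope, threshold=0.2):
    """One CA update: a non-urban cell converts when its fuzzy
    development potential exceeds the threshold."""
    n = len(urban)
    nxt = [row[:] for row in urban]
    for r in range(n):
        for c in range(n):
            if urban[r][c]:
                continue
            window = [urban[i][j]
                      for i in range(max(r - 1, 0), min(r + 2, n))
                      for j in range(max(c - 1, 0), min(c + 2, n))]
            density = sum(window) / len(window)   # urban share of 3x3 block
            if fuzzy_potential(density, slope[r][c]) > threshold:
                nxt[r][c] = 1
    return nxt

random.seed(0)
n = 20
urban = [[0] * n for _ in range(n)]
urban[9][9] = urban[9][10] = urban[10][9] = urban[10][10] = 1  # seed settlement
slope = [[random.uniform(0.0, 0.4) for _ in range(n)] for _ in range(n)]
for _ in range(5):
    urban = step(urban, slope)
total = sum(map(sum, urban))
print(total)  # urban cells after five updates (started from 4)
```

    A real FCA model would replace the min operator with a rule base derived from expert linguistic knowledge and add agent decisions on top of the cell-level dynamics.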

  13. The structure of disaster resilience: a framework for simulations and policy recommendations

    NASA Astrophysics Data System (ADS)

    Edwards, J. H. Y.

    2014-09-01

    In this era of rapid climate change there is an urgent need for interdisciplinary collaboration and understanding in the study of what determines resistance to disasters and recovery speed. This paper is an economist's contribution to that effort. It traces the entrance of the word "resilience" from ecology into the social science literature on disasters, provides a formal economic definition of resilience that can be used in mathematical modeling, incorporates this definition into a multilevel model that suggests appropriate policy roles and targets at each level, and draws on the recent empirical literature on the economics of disaster, searching for policy handles that can stimulate higher resilience. On the whole it provides a framework for simulations and for formulating disaster resilience policies.

  14. The structure of disaster resilience: a framework for simulations and policy recommendations

    NASA Astrophysics Data System (ADS)

    Edwards, J. H. Y.

    2015-04-01

    In this era of rapid climate change there is an urgent need for interdisciplinary collaboration and understanding in the study of what determines resistance to disasters and recovery speed. This paper is an economist's contribution to that effort. It traces the entrance of the word "resilience" from ecology into the social science literature on disasters, provides a formal economic definition of resilience that can be used in mathematical modeling, incorporates this definition into a multilevel model that suggests appropriate policy roles and targets at each level, and draws on the recent empirical literature on the economics of disaster, searching for policy handles that can stimulate higher resilience. On the whole it provides a framework for simulations and for formulating disaster resilience policies.

  15. A framework to quantify uncertainty in simulations of oil transport in the ocean

    NASA Astrophysics Data System (ADS)

    Gonçalves, Rafael C.; Iskandarani, Mohamed; Srinivasan, Ashwanth; Thacker, W. Carlisle; Chassignet, Eric; Knio, Omar M.

    2016-04-01

    An uncertainty quantification framework is developed for the DeepC Oil Model based on a nonintrusive polynomial chaos method. This allows the model's output to be presented in a probabilistic framework so that the model's predictions reflect the uncertainty in the model's input data. The new capability is illustrated by simulating the far-field dispersal of oil in a Deepwater Horizon blowout scenario. The uncertain input consisted of ocean current and oil droplet size data and the main model output analyzed is the ensuing oil concentration in the Gulf of Mexico. A 1331 member ensemble was used to construct a surrogate for the model which was then mined for statistical information. The mean and standard deviations in the oil concentration were calculated for up to 30 days, and the total contribution of each input parameter to the model's uncertainty was quantified at different depths. Also, probability density functions of oil concentration were constructed by sampling the surrogate and used to elaborate probabilistic hazard maps of oil impact. The performance of the surrogate was constantly monitored in order to demarcate the space-time zones where its estimates are reliable.
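
    The nonintrusive idea, in miniature: run the expensive model at a few input values, fit a polynomial surrogate, then mine the cheap surrogate rather than the model for statistics. The one-parameter toy model and quadratic (Lagrange) surrogate below are stand-ins for the DeepC Oil Model and its 1331-member polynomial chaos ensemble:

```python
import math
import random
import statistics

def model(xi):
    """Stand-in for an expensive simulation whose output depends
    nonlinearly on one uncertain input xi in [-1, 1]."""
    return math.exp(0.5 * xi)

# Build a quadratic surrogate from just three model runs
# (Lagrange interpolation through three nodes).
nodes = [-1.0, 0.0, 1.0]
runs = [model(x) for x in nodes]

def surrogate(xi):
    total = 0.0
    for i, (xn, yn) in enumerate(zip(nodes, runs)):
        basis = 1.0
        for j, xj in enumerate(nodes):
            if j != i:
                basis *= (xi - xj) / (xn - xj)
        total += yn * basis
    return total

# Mine the cheap surrogate for statistics of the output.
random.seed(1)
samples = [surrogate(random.uniform(-1.0, 1.0)) for _ in range(100_000)]
mean = statistics.fmean(samples)
std = statistics.stdev(samples)
print(round(mean, 3), round(std, 3))
```

    The full framework replaces the Lagrange fit with an orthogonal polynomial chaos expansion in several uncertain inputs, which also yields the per-parameter sensitivity contributions mentioned above.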

  16. Implementation and performance of FDPS: a framework for developing parallel particle simulation codes

    NASA Astrophysics Data System (ADS)

    Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-08-01

    We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which are necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some methods to limit the calculation to neighbor particles are required. FDPS provides all of these functions which are necessary for efficient parallel execution of particle-based simulations as "templates," which are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N^2) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10^7) to 300 ms (N = 10^9). These are currently limited by the time for the calculation of the domain decomposition and communication
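
    The "simple, sequential and unoptimized program of O(N^2) calculation cost" that a framework like FDPS starts from looks roughly like the direct-summation gravitational kernel below; the softening length, timestep, and particle setup are arbitrary illustrative choices, not FDPS code:

```python
import random

G = 1.0        # gravitational constant in code units
DT = 1e-3      # timestep

def accelerations(pos, mass, eps=1e-2):
    """Direct O(N^2) pairwise gravity with Plummer softening: the naive
    kernel that a framework would parallelize and tree-accelerate."""
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + eps * eps
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += G * mass[j] * dx[k] * inv_r3
    return acc

def integrate(pos, vel, mass, steps):
    """Semi-implicit Euler (kick, then drift) time integration."""
    for _ in range(steps):
        acc = accelerations(pos, mass)
        for i in range(len(pos)):
            for k in range(3):
                vel[i][k] += acc[i][k] * DT
                pos[i][k] += vel[i][k] * DT
    return pos, vel

random.seed(2)
n = 32
pos = [[random.uniform(-1.0, 1.0) for _ in range(3)] for _ in range(n)]
vel = [[0.0, 0.0, 0.0] for _ in range(n)]
mass = [1.0 / n] * n
pos, vel = integrate(pos, vel, mass, steps=10)
# Newton's third law makes the pairwise forces cancel, so the total
# momentum should stay (numerically) zero.
momentum = [sum(mass[i] * vel[i][k] for i in range(n)) for k in range(3)]
print([round(p, 12) for p in momentum])
```

    FDPS's contribution is that the user supplies only the interaction function and particle data type, while the domain decomposition, particle exchange, and tree algorithm are generated from templates.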

  17. A spatial web/agent-based model to support stakeholders' negotiation regarding land development.

    PubMed

    Pooyandeh, Majeed; Marceau, Danielle J

    2013-11-15

    Decision making in land management can be greatly enhanced if the perspectives of concerned stakeholders are taken into consideration. This often implies negotiation in order to reach an agreement based on the examination of multiple alternatives. This paper describes a spatial web/agent-based modeling system that was developed to support the negotiation process of stakeholders regarding land development in southern Alberta, Canada. This system integrates a fuzzy analytic hierarchy procedure within an agent-based model in an interactive visualization environment provided through a web interface to facilitate the learning and negotiation of the stakeholders. In the pre-negotiation phase, the stakeholders compare their evaluation criteria using linguistic expressions. Due to the uncertainty and fuzzy nature of such comparisons, a fuzzy Analytic Hierarchy Process is then used to prioritize the criteria. The negotiation starts by a development plan being submitted by a user (stakeholder) through the web interface. An agent called the proposer, which represents the proposer of the plan, receives this plan and starts negotiating with all other agents. The negotiation is conducted in a step-wise manner where the agents change their attitudes by assigning a new set of weights to their criteria. If an agreement is not achieved, a new location for development is proposed by the proposer agent. This process is repeated until a location is found that satisfies all agents to a certain predefined degree. To evaluate the performance of the model, the negotiation was simulated with four agents, one of which being the proposer agent, using two hypothetical development plans. The first plan was selected randomly; the other one was chosen in an area that is of high importance to one of the agents. 
While the agents managed to achieve an agreement about the location of the land development after three rounds of negotiation in the first scenario, seven rounds were required in the second
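
    The step-wise negotiation loop can be caricatured as follows: each agent scores a proposed site against its weighted criteria, and agents concede gradually until some site clears everyone's acceptance threshold. The agents, weights, sites, and concession rule below are all hypothetical; the real system derives the weights via a fuzzy AHP and proposes new locations spatially:

```python
def utility(weights, scores):
    """Weighted additive score of a candidate site for one agent."""
    return sum(w * s for w, s in zip(weights, scores))

def negotiate(agents, sites, accept=0.7, concession=0.95, max_rounds=20):
    """Step-wise concession: after each failed round, agents lower their
    acceptance threshold until some site satisfies every agent."""
    threshold = accept
    for rnd in range(1, max_rounds + 1):
        for name, scores in sites.items():
            if all(utility(w, scores) >= threshold for w in agents.values()):
                return name, rnd
        threshold *= concession
    return None, max_rounds

# criteria order: [environmental impact, economic benefit, accessibility]
agents = {
    "conservation": [0.6, 0.1, 0.3],
    "developer":    [0.1, 0.7, 0.2],
    "municipality": [0.3, 0.4, 0.3],
}
sites = {
    "site_A": [0.9, 0.4, 0.6],   # green but less profitable
    "site_B": [0.5, 0.8, 0.7],   # profitable but higher impact
}
site, rounds = negotiate(agents, sites)
print(site, rounds)
```

    Here the developer blocks the greener site and the conservation agent blocks the profitable one until enough concessions accumulate, mirroring the multi-round agreements reported above.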

  18. Molecular dynamics simulation of framework flexibility effects on noble gas diffusion in HKUST-1 and ZIF-8

    SciTech Connect

    Parkes, Marie V.; Demir, Hakan; Teich-McGoldrick, Stephanie L.; Sholl, David S.; Greathouse, Jeffery A.; Allendorf, Mark D.

    2014-03-28

    Molecular dynamics simulations were used to investigate trends in noble gas (Ar, Kr, Xe) diffusion in the metal-organic frameworks HKUST-1 and ZIF-8. Diffusion occurs primarily through inter-cage jump events, with much greater diffusion of guest atoms in HKUST-1 compared to ZIF-8 due to the larger cage and window sizes in the former. We compare diffusion coefficients calculated for both rigid and flexible frameworks. For rigid framework simulations, in which the framework atoms were held at their crystallographic or geometry optimized coordinates, sometimes dramatic differences in guest diffusion were seen depending on the initial framework structure or the choice of framework force field parameters. When framework flexibility effects were included, argon and krypton diffusion increased significantly compared to rigid-framework simulations using general force field parameters. Additionally, for argon and krypton in ZIF-8, guest diffusion increased with loading, demonstrating that guest-guest interactions between cages enhance inter-cage diffusion. No inter-cage jump events were seen for xenon atoms in ZIF-8 regardless of force field or initial structure, and the loading dependence of xenon diffusion in HKUST-1 is different for rigid and flexible frameworks. Diffusion of krypton and xenon in HKUST-1 depends on two competing effects: the steric effect that decreases diffusion as loading increases, and the “small cage effect” that increases diffusion as loading increases. Finally, a detailed analysis of the window size in ZIF-8 reveals that the window increases beyond its normal size to permit passage of a (nominally) larger krypton atom.
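
    Diffusion coefficients in such MD studies are typically extracted from the mean-squared displacement via the Einstein relation. The 1-D hop model below mimics inter-cage jump events with an invented jump rate; it is a sketch of the analysis, not the HKUST-1/ZIF-8 simulation itself:

```python
import random

def walk(steps, jump_rate, jump_len=1.0):
    """1-D hop model of inter-cage diffusion: at each timestep the guest
    atom jumps one cage left or right with probability jump_rate."""
    x = 0.0
    for _ in range(steps):
        if random.random() < jump_rate:
            x += random.choice((-jump_len, jump_len))
    return x

def diffusion_coefficient(final_positions, steps, dt=1.0):
    """Einstein relation in 1-D: MSD(t) = 2 D t, so D = MSD / (2 t)."""
    msd = sum(x * x for x in final_positions) / len(final_positions)
    return msd / (2.0 * steps * dt)

random.seed(3)
steps = 1000
finals = [walk(steps, jump_rate=0.2) for _ in range(500)]
D = diffusion_coefficient(finals, steps)
print(round(D, 3))  # theory for this walk: D = jump_rate * jump_len**2 / 2 = 0.1
```

    In the framework-flexibility context, the effect discussed above amounts to the jump rate itself depending on the instantaneous window size, which a rigid-framework simulation holds fixed.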

  19. Progress report for FACETS (Framework Application for Core-Edge Transport Simulations): C.S. SAP

    SciTech Connect

    Epperly, T W

    2008-10-01

    The mission of the Computer Science Scientific Application Partnership (C.S. SAP) at LLNL is to develop and apply leading-edge scientific component technology to FACETS software. Contributions from LLNL's fusion energy program staff towards the underlying physics modules are described in a separate report. FACETS uses component technology to selectively combine multiple physics and solver software modules, written in different languages by different institutions, in a tightly-integrated, parallel computing framework for Tokamak reactor modeling. In the past fiscal year, the C.S. SAP has focused on two primary tasks: applying Babel to connect UEDGE into the FACETS framework through UEDGE's existing Python interface, and developing a next-generation componentization strategy for UEDGE which avoids the use of Python. The FACETS project uses Babel to solve its language interoperability challenges. Specific accomplishments for the year include: (1) Refined SIDL interfaces for UEDGE to satisfy the standard interfaces required by FACETS for all physics modules; this required consensus building between framework and UEDGE developers. (2) Wrote a prototype C++ driver for UEDGE to demonstrate how UEDGE can be called from C++ using Babel. (3) Supported the FACETS project by adding new features to Babel, such as release number tagging, porting to new machines, and adding new configuration options; Babel modifications were delivered to FACETS by testing and publishing development snapshots in the project's software repository. (4) Assisted Tech-X Corporation in testing and debugging a high-level build system for the complete FACETS tool chain: the complete list of third-party software libraries that FACETS depends on directly or indirectly (e.g., MPI, HDF5, PACT, etc.). (5) Designed and implemented a new approach to wrapping UEDGE as a FACETS component without requiring Python. To get simulation results as soon as possible, our initial connection from the FACETS

  20. Agent-based modeling: a new approach for theory building in social psychology.

    PubMed

    Smith, Eliot R; Conrey, Frederica R

    2007-02-01

    Most social and psychological phenomena occur not as the result of isolated decisions by individuals but rather as the result of repeated interactions between multiple individuals over time. Yet the theory-building and modeling techniques most commonly used in social psychology are less than ideal for understanding such dynamic and interactive processes. This article describes an alternative approach to theory building, agent-based modeling (ABM), which involves simulation of large numbers of autonomous agents that interact with each other and with a simulated environment and the observation of emergent patterns from their interactions. The authors believe that the ABM approach is better able than prevailing approaches in the field, variable-based modeling (VBM) techniques such as causal modeling, to capture the types of complex, dynamic, interactive processes that are so important in the social world. The article elaborates several important contrasts between ABM and VBM and offers specific recommendations for learning more and applying the ABM approach. PMID:18453457
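
    A stock example of the emergence the authors describe is Schelling's segregation model: agents with only mild local preferences produce strong global sorting that no individual intends. The grid size, tolerance, and random-relocation rule below are conventional textbook choices, not taken from this article:

```python
import random

def neighbours(grid, r, c):
    n = len(grid)
    return [grid[i][j]
            for i in range(max(r - 1, 0), min(r + 2, n))
            for j in range(max(c - 1, 0), min(c + 2, n))
            if (i, j) != (r, c) and grid[i][j] is not None]

def unhappy(grid, r, c, tolerance):
    nb = neighbours(grid, r, c)
    return bool(nb) and sum(x == grid[r][c] for x in nb) / len(nb) < tolerance

def like_fraction(grid):
    """Macro-level segregation measure: mean share of like-typed
    neighbours. No individual agent optimizes this quantity."""
    vals = []
    for r in range(len(grid)):
        for c in range(len(grid)):
            if grid[r][c] is None:
                continue
            nb = neighbours(grid, r, c)
            if nb:
                vals.append(sum(x == grid[r][c] for x in nb) / len(nb))
    return sum(vals) / len(vals)

rng = random.Random(8)
n = 20
grid = [[rng.choice((0, 1, None)) for _ in range(n)] for _ in range(n)]
before = like_fraction(grid)
for _ in range(40):  # relocation sweeps
    empties = [(r, c) for r in range(n) for c in range(n) if grid[r][c] is None]
    movers = [(r, c) for r in range(n) for c in range(n)
              if grid[r][c] is not None and unhappy(grid, r, c, tolerance=0.4)]
    if not movers:
        break
    rng.shuffle(movers)
    for r, c in movers:
        # move an unhappy agent to a random empty cell
        er, ec = empties.pop(rng.randrange(len(empties)))
        grid[er][ec], grid[r][c] = grid[r][c], None
        empties.append((r, c))
print(round(before, 2), round(like_fraction(grid), 2))
```

    The contrast with VBM is visible in the code: the model specifies only micro-level rules, and the macro-level pattern is measured, not assumed.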

  1. Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery.

    PubMed

    Sakamoto, Takuto

    2016-01-01

    Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level.

  2. Re-Examining of Moffitt's Theory of Delinquency through Agent Based Modeling.

    PubMed

    Leaw, Jia Ning; Ang, Rebecca P; Huan, Vivien S; Chan, Wei Teng; Cheong, Siew Ann

    2015-01-01

    Moffitt's theory of delinquency suggests that at-risk youths can be divided into two groups, the adolescence-limited group and the life-course-persistent group, predetermined at a young age, and social interactions between these two groups become important during the adolescent years. We built an agent-based model based on the microscopic interactions Moffitt described: (i) a maturity gap that dictates (ii) the cost and reward of antisocial behavior, and (iii) agents imitating the antisocial behaviors of others more successful than themselves, to find indeed the two groups emerging in our simulations. Moreover, through an intervention simulation where we moved selected agents from one social network to another, we also found that the social network plays an important role in shaping the life course outcome. PMID:26062022

  3. Re-Examining of Moffitt’s Theory of Delinquency through Agent Based Modeling

    PubMed Central

    Leaw, Jia Ning; Ang, Rebecca P.; Huan, Vivien S.; Chan, Wei Teng; Cheong, Siew Ann

    2015-01-01

    Moffitt’s theory of delinquency suggests that at-risk youths can be divided into two groups, the adolescence-limited group and the life-course-persistent group, predetermined at a young age, and social interactions between these two groups become important during the adolescent years. We built an agent-based model based on the microscopic interactions Moffitt described: (i) a maturity gap that dictates (ii) the cost and reward of antisocial behavior, and (iii) agents imitating the antisocial behaviors of others more successful than themselves, to find indeed the two groups emerging in our simulations. Moreover, through an intervention simulation where we moved selected agents from one social network to another, we also found that the social network plays an important role in shaping the life course outcome. PMID:26062022

  4. An agent-based interaction model for Chinese personal income distribution

    NASA Astrophysics Data System (ADS)

    Zou, Yijiang; Deng, Weibing; Li, Wei; Cai, Xu

    2015-10-01

    The personal income distribution in China was studied by employing the data from China Household Income Projects (CHIP) between 1990 and 2002. It was observed that the low and middle income regions could be described by the log-normal law, while the large income region could be well fitted by the power law. To characterize these empirical findings, a stochastic interactive model with mean-field approach was discussed, and the analytic result shows that the wealth distribution is of the Pareto type. Then we explored the agent-based model on networks, in which the exchange of wealth among agents depends on their connectivity. Numerical results suggest that the wealth of agents would largely rely on their connectivity, and the Pareto index of the simulated wealth distributions is comparable to those of the empirical data. The Pareto behavior of the tails of the empirical wealth distributions is consistent with that of the 'mean-field' model, as well as numerical simulations.
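
    The flavor of such network-based exchange models can be seen in a minimal kinetic sketch: agents trade only with network neighbours, and each trade re-splits the pair's combined wealth at a random fraction. The topology and exchange rule here are generic illustrations, not the paper's specific connectivity-dependent model:

```python
import random

def make_network(n, k, rng):
    """Random neighbour lists: every agent gets at least k partners
    (a toy topology, not the one studied in the paper)."""
    neigh = [set() for _ in range(n)]
    for i in range(n):
        while len(neigh[i]) < k:
            j = rng.randrange(n)
            if j != i:
                neigh[i].add(j)
                neigh[j].add(i)
    return [list(s) for s in neigh]

def exchange(wealth, neigh, steps, rng):
    """Kinetic wealth exchange restricted to network neighbours: the
    pair's combined wealth is re-split at a random fraction."""
    for _ in range(steps):
        i = rng.randrange(len(wealth))
        j = rng.choice(neigh[i])
        total = wealth[i] + wealth[j]
        eps = rng.random()
        wealth[i], wealth[j] = eps * total, (1 - eps) * total
    return wealth

def gini(w):
    """Gini coefficient: 0 for perfect equality, near 1 for extreme inequality."""
    s = sorted(w)
    n = len(s)
    return sum((2 * (i + 1) - n - 1) * x for i, x in enumerate(s)) / (n * sum(s))

rng = random.Random(4)
n = 200
wealth = [1.0] * n                      # everyone starts equal
neigh = make_network(n, 4, rng)
wealth = exchange(wealth, neigh, 20_000, rng)
print(round(sum(wealth), 6), round(gini(wealth), 2))
```

    Even with total wealth exactly conserved, repeated random exchange drives the system from perfect equality to substantial inequality; reproducing a genuine Pareto tail additionally requires heterogeneity, e.g. in savings or connectivity, as the paper explores.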

  5. Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery.

    PubMed

    Sakamoto, Takuto

    2016-01-01

    Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level. PMID:26963526

  6. Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery

    PubMed Central

    Sakamoto, Takuto

    2016-01-01

    Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level. PMID:26963526

  7. Using an Agent-Based Model to Examine the Role of Dynamic Bacterial Virulence Potential in the Pathogenesis of Surgical Site Infection

    PubMed Central

    Gopalakrishnan, Vissagan; Kim, Moses; An, Gary

    2013-01-01

    Objective: Despite clinical advances, surgical site infections (SSIs) remain a problem. The development of SSIs involves a complex interplay between the cellular and molecular mechanisms of wound healing and contaminating bacteria, and here, we utilize an agent-based model (ABM) to investigate the role of bacterial virulence potential in the pathogenesis of SSI. Approach: The Muscle Wound ABM (MWABM) incorporates muscle cells, neutrophils, macrophages, myoblasts, abstracted blood vessels, and avirulent/virulent bacteria to simulate the pathogenesis of SSIs. Simulated bacteria with virulence potential can mutate to possess resistance to reactive oxygen species and increased invasiveness. Simulated experiments (t=7 days) involved parameter sweeps of initial wound size to identify transition zones between healed and nonhealed wounds/SSIs, and to evaluate the effect of avirulent/virulent bacteria. Results: The MWABM reproduced the dynamics of normal successful healing, including a transition zone in initial wound size beyond which healing was significantly impaired. Parameter sweeps with avirulent bacteria demonstrated that smaller wound sizes were associated with healing failure. This effect was even more pronounced with the addition of virulence potential to the contaminating bacteria. Innovation: The MWABM integrates the myriad factors involved in the healing of a normal wound and the pathogenesis of SSIs. This type of model can serve as a useful framework into which more detailed mechanistic knowledge can be embedded. Conclusion: Future work will involve more comprehensive representation of host factors, and especially the ability of those host factors to activate virulence potential in the microbes involved. PMID:24761337

  8. Application of SALSSA Framework to the Validation of Smoothed Particle Hydrodynamics Simulations of Low Reynolds Number Flows

    SciTech Connect

    Schuchardt, Karen L.; Chase, Jared M.; Daily, Jeffrey A.; Elsethagen, Todd O.; Palmer, Bruce J.; Scheibe, Timothy D.

    2009-06-15

    The Support Architecture for Large-Scale Subsurface Analysis (SALSSA) provides an extensible framework, sophisticated graphical user interface (GUI), and underlying data management system that simplifies the process of running subsurface models, tracking provenance information, and analyzing the model results. The SALSSA software framework is currently being applied to validating the Smoothed Particle Hydrodynamics (SPH) model. SPH is a three-dimensional model of flow and transport in porous media at the pore scale. Fluid flow in porous media at velocities common in natural porous media occurs at low Reynolds numbers and therefore it is important to verify that the SPH model is producing accurate flow solutions in this regime. Validating SPH requires performing a series of simulations and comparing these simulation flow solutions to analytical results or numerical results using other methods. This validation study is being facilitated by the SALSSA framework, which provides capabilities to setup, execute, analyze, and administer these SPH simulations.

  9. A framework for stochastic simulation of distribution practices for hotel reservations

    SciTech Connect

    Halkos, George E.; Tsilika, Kyriaki D.

    2015-03-10

    The focus of this study is primarily on the Greek hotel industry. The objective is to design and develop a framework for stochastic simulation of reservation requests, reservation arrivals, cancellations and hotel occupancy with a planning horizon of a tourist season. In the Greek hospitality industry there were two competing policies for the reservation planning process up to 2003: reservations coming directly from customers, and reservations management relying on tour operator(s). Recently the Internet, along with other emerging technologies, has offered the potential to disrupt enduring distribution arrangements. The focus of the study is on the choice of distribution intermediaries. We present an empirical model of the hotel reservation planning process that makes use of a symbolic simulation (the Monte Carlo method), since requests for reservations, cancellations, and arrival rates are all sources of uncertainty. We consider as a case study the problem of determining the optimal booking strategy for a medium-size hotel on Skiathos Island, Greece. Probability distributions and parameter estimates result from the historical data available and from suggestions made in the relevant literature. The results of this study may assist hotel managers in defining distribution strategies for hotel rooms and evaluating the performance of the reservations management system.
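
    A stripped-down version of such a Monte Carlo reservation simulation might draw daily demand, truncate at capacity, and thin the accepted bookings by cancellations. All distributions and parameters below are invented placeholders, not estimates from the Skiathos data:

```python
import random

def simulate_season(days, capacity, mean_requests, cancel_prob, rng):
    """One season: Binomial daily demand with the given mean, bookings
    accepted up to capacity, and each booking cancelled with a fixed
    probability before arrival."""
    occupancy = []
    for _ in range(days):
        # Binomial(2 * mean_requests, 0.5) daily request count
        requests = sum(1 for _ in range(2 * mean_requests) if rng.random() < 0.5)
        accepted = min(requests, capacity)
        arrivals = sum(1 for _ in range(accepted) if rng.random() > cancel_prob)
        occupancy.append(arrivals)
    return occupancy

rng = random.Random(5)
season_means = []
for _ in range(200):  # Monte Carlo replications of a 120-day season
    occ = simulate_season(120, capacity=50, mean_requests=40,
                          cancel_prob=0.1, rng=rng)
    season_means.append(sum(occ) / len(occ))
overall = sum(season_means) / len(season_means)
print(round(overall, 1))  # mean daily occupancy across replications
```

    Comparing such replications under alternative demand mixes (direct bookings versus tour-operator allotments) is the kind of question the framework is built to answer.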

  10. A Systematic Framework for Molecular Dynamics Simulations of Protein Post-Translational Modifications

    PubMed Central

    Grandits, Melanie; Oostenbrink, Chris; Zagrovic, Bojan

    2013-01-01

    By directly affecting structure, dynamics and interaction networks of their targets, post-translational modifications (PTMs) of proteins play a key role in different cellular processes ranging from enzymatic activation to regulation of signal transduction to cell-cycle control. Despite the great importance of understanding how PTMs affect proteins at the atomistic level, a systematic framework for treating post-translationally modified amino acids by molecular dynamics (MD) simulations, a premier high-resolution computational biology tool, has never been developed. Here, we report and validate force field parameters (GROMOS 45a3 and 54a7) required to run and analyze MD simulations of more than 250 different types of enzymatic and non-enzymatic PTMs. The newly developed GROMOS 54a7 parameters in particular exhibit near chemical accuracy in matching experimentally measured hydration free energies (RMSE = 4.2 kJ/mol over the validation set). Using this tool, we quantitatively show that the majority of PTMs greatly alter the hydrophobicity and other physico-chemical properties of target amino acids, with the extent of change in many cases being comparable to the complete range spanned by native amino acids. PMID:23874192

  11. A systematic framework for molecular dynamics simulations of protein post-translational modifications.

    PubMed

    Petrov, Drazen; Margreitter, Christian; Grandits, Melanie; Oostenbrink, Chris; Zagrovic, Bojan

    2013-01-01

    By directly affecting structure, dynamics and interaction networks of their targets, post-translational modifications (PTMs) of proteins play a key role in different cellular processes ranging from enzymatic activation to regulation of signal transduction to cell-cycle control. Despite the great importance of understanding how PTMs affect proteins at the atomistic level, a systematic framework for treating post-translationally modified amino acids by molecular dynamics (MD) simulations, a premier high-resolution computational biology tool, has never been developed. Here, we report and validate force field parameters (GROMOS 45a3 and 54a7) required to run and analyze MD simulations of more than 250 different types of enzymatic and non-enzymatic PTMs. The newly developed GROMOS 54a7 parameters in particular exhibit near chemical accuracy in matching experimentally measured hydration free energies (RMSE=4.2 kJ/mol over the validation set). Using this tool, we quantitatively show that the majority of PTMs greatly alter the hydrophobicity and other physico-chemical properties of target amino acids, with the extent of change in many cases being comparable to the complete range spanned by native amino acids. PMID:23874192

  12. Population genetics and molecular evolution of DNA sequences in transposable elements. I. A simulation framework.

    PubMed

    Kijima, T E; Innan, Hideki

    2013-11-01

    A population genetic simulation framework is developed to understand the behavior and molecular evolution of DNA sequences of transposable elements. Our model incorporates random transposition and excision of transposable element (TE) copies, two modes of selection against TEs, and degeneration of transpositional activity by point mutations. We first investigated the relationships between the behavior of the copy number of TEs and these parameters. Our results show that when selection is weak, the genome can maintain a relatively large number of TEs, but most of them are less active. In contrast, with strong selection, the genome can maintain only a limited number of TEs but the proportion of active copies is large. In such a case, there could be substantial fluctuations of the copy number over generations. We also explored how DNA sequences of TEs evolve through the simulations. In general, active copies form clusters around the original sequence, while less active copies have long branches specific to themselves, exhibiting a star-shaped phylogeny. It is demonstrated that the phylogeny of TE sequences could be informative to understand the dynamics of TE evolution.
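    The transposition–excision–selection dynamics described above can be sketched as a minimal single-lineage simulation. The rates, the silencing model, and the per-copy selection rule below are illustrative assumptions, not the parameter values used in the paper:

```python
import random

def simulate_tes(generations=200, u_transpose=0.05, u_excise=0.01,
                 u_deactivate=0.02, s=0.001, seed=1):
    """Track total and active TE copy numbers in a single genome lineage."""
    rng = random.Random(seed)
    copies = [True]                      # one initially active copy
    history = []
    for _ in range(generations):
        survivors = []
        for active in copies:
            if rng.random() < u_excise:
                continue                 # copy excised from the genome
            if active and rng.random() < u_deactivate:
                active = False           # point mutation silences the copy
            survivors.append(active)
            if active and rng.random() < u_transpose:
                survivors.append(True)   # transposition inserts a new copy
        # selection against TEs: per-copy survival drops with copy number n
        n = len(survivors)
        copies = [c for c in survivors if rng.random() < (1.0 - s) ** n]
        history.append((len(copies), sum(copies)))
    return history

history = simulate_tes()
```

    Weak selection (small s) lets the copy number grow while inactive copies accumulate; strong selection keeps the count low and mostly active, matching the qualitative regimes the abstract describes.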

  13. A discrete element based simulation framework to investigate particulate spray deposition processes

    SciTech Connect

    Mukherjee, Debanjan; Zohdi, Tarek I.

    2015-06-01

    This work presents a computer simulation framework based on the discrete element method to analyze manufacturing processes that comprise a loosely flowing stream of particles in a carrier fluid being deposited on a target surface. The individual particulate dynamics under the combined action of particle collisions, fluid–particle interactions, particle–surface contact and adhesive interactions are simulated and aggregated to obtain global system behavior. A model for deposition, which incorporates the effects of surface energy, impact velocity and particle size, is developed. The fluid–particle interaction is modeled using appropriate spray nozzle gas velocity distributions and a one-way coupling between the phases. It is found that the particle response times and the release velocity distribution of particles have a combined effect on inter-particle collisions during the flow along the spray. It is also found that resolution of the particulate collisions close to the target surface plays an important role in characterizing the trends in the deposit pattern. Analysis of the deposit pattern using metrics defined from the particle distribution on the target surface is provided to characterize the deposition efficiency, deposit size, and scatter due to collisions.
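    A deposition criterion combining surface energy, impact velocity and particle size, as described above, might be sketched with an energy-balance capture velocity. The restitution coefficient and the adhesion-energy scaling below are assumptions for illustration, not the paper's exact model:

```python
import math

def sticks(impact_speed, radius, surface_energy, density, restitution=0.8):
    """True if an impacting spherical particle adheres to the surface.

    Assumed energy-balance criterion: the particle is captured when the
    kinetic energy lost on impact cannot overcome the work of adhesion,
    giving a critical capture velocity that grows with surface energy and
    shrinks with particle size."""
    mass = (4.0 / 3.0) * math.pi * radius ** 3 * density
    adhesion_energy = surface_energy * math.pi * radius ** 2  # ~ contact area
    v_crit = math.sqrt(2.0 * adhesion_energy /
                       (mass * (1.0 - restitution ** 2)))
    return impact_speed < v_crit
```

    With these (hypothetical) units, a 1 µm silica-like particle sticks at 1 m/s but rebounds at 50 m/s, reproducing the size/velocity trade-off the model captures.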

  14. A heterogeneous and parallel computing framework for high-resolution hydrodynamic simulations

    NASA Astrophysics Data System (ADS)

    Smith, Luke; Liang, Qiuhua

    2015-04-01

    Shock-capturing hydrodynamic models are now widely applied in the context of flood risk assessment and forecasting, accurately capturing the behaviour of surface water over ground and within rivers. Such models are generally explicit in their numerical basis, and can be computationally expensive; this has prohibited full use of high-resolution topographic data for complex urban environments, now easily obtainable through airborne altimetric surveys (LiDAR). As processor clock speed advances have stagnated in recent years, further computational performance gains are largely dependent on the use of parallel processing. Heterogeneous computing architectures (e.g. graphics processing units or compute accelerator cards) provide a cost-effective means of achieving high throughput in cases where the same calculation is performed with a large input dataset. In recent years this technique has been applied successfully for flood risk mapping, such as within the national surface water flood risk assessment for the United Kingdom. We present a flexible software framework for hydrodynamic simulations across multiple processors of different architectures, within multiple computer systems, enabled using OpenCL and Message Passing Interface (MPI) libraries. A finite-volume Godunov-type scheme is implemented using the HLLC approach to solving the Riemann problem, with optional extension to second-order accuracy in space and time using the MUSCL-Hancock approach. The framework is successfully applied on personal computers and a small cluster to provide considerable improvements in performance. The most significant performance gains were achieved across two servers, each containing four NVIDIA GPUs, with a mix of K20, M2075 and C2050 devices. Advantages are found with respect to decreased parametric sensitivity, and thus in reducing uncertainty, for a major fluvial flood within a large catchment during 2005 in Carlisle, England. Simulations for the three-day event could be performed
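    The finite-volume Godunov-type core of such a solver can be illustrated with an HLL approximate Riemann flux for the 1D shallow-water equations (a simpler relative of the HLLC flux the framework uses; HLLC adds a contact wave on top of this two-wave structure, and the wave-speed estimates below are the standard two-wave bounds):

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def hll_flux(hL, huL, hR, huR):
    """HLL approximate Riemann flux for the 1D shallow-water equations;
    states are (h, hu) with water depth h and discharge hu."""
    uL, uR = huL / hL, huR / hR
    cL, cR = math.sqrt(G * hL), math.sqrt(G * hR)
    sL = min(uL - cL, uR - cR)            # leftmost wave speed estimate
    sR = max(uL + cL, uR + cR)            # rightmost wave speed estimate
    fL = (huL, huL * uL + 0.5 * G * hL * hL)
    fR = (huR, huR * uR + 0.5 * G * hR * hR)
    if sL >= 0.0:
        return fL                         # supersonic flow to the right
    if sR <= 0.0:
        return fR                         # supersonic flow to the left
    qL, qR = (hL, huL), (hR, huR)
    return tuple((sR * fl - sL * fr + sL * sR * (qr - ql)) / (sR - sL)
                 for fl, fr, ql, qr in zip(fL, fR, qL, qR))

still_water = hll_flux(1.0, 0.0, 1.0, 0.0)   # symmetric state: no mass flux
dam_break = hll_flux(2.0, 0.0, 1.0, 0.0)     # deeper water on the left
```

    In a full scheme this flux is evaluated at every cell interface and the cell averages are updated conservatively; the MUSCL-Hancock extension reconstructs limited slopes before the flux evaluation.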

  15. Engineering large-scale agent-based systems with consensus

    NASA Technical Reports Server (NTRS)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge-based agents (KBAs) which engage in a collaborative problem-solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code, thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  16. Agent-Based Modeling of Noncommunicable Diseases: A Systematic Review

    PubMed Central

    Arah, Onyebuchi A.

    2015-01-01

    We reviewed the use of agent-based modeling (ABM), a systems science method, in understanding noncommunicable diseases (NCDs) and their public health risk factors. We systematically reviewed studies in PubMed, ScienceDirect, and Web of Science published from January 2003 to July 2014. We retrieved 22 relevant articles; each had an observational or interventional design. Physical activity and diet were the most-studied outcomes. Often, single agent types were modeled, and the environment was usually irrelevant to the studied outcome. Predictive validation and sensitivity analyses were most used to validate models. Although increasingly used to study NCDs, ABM remains underutilized and, where used, is suboptimally reported in public health studies. Its use in studying NCDs will benefit from clarified best practices and improved rigor to establish its usefulness and facilitate replication, interpretation, and application. PMID:25602871

  17. On agent-based modeling and computational social science

    PubMed Central

    Conte, Rosaria; Paolucci, Mario

    2014-01-01

    In the first part of the paper, the field of agent-based modeling (ABM) is discussed, focusing on the role of generative theories, which aim at explaining phenomena by growing them. After a brief analysis of the major strengths of the field, some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and shadowed the application of rich cognitive models. In the second part of the paper, the renewed interest in Computational Social Science (CSS) is discussed, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which draws on ABM while reconciling it with the quantitative one, is proposed as a fundamental requirement for a new program of the CSS. PMID:25071642

  18. Statistical Agent Based Modelization of the Phenomenon of Drug Abuse

    NASA Astrophysics Data System (ADS)

    di Clemente, Riccardo; Pietronero, Luciano

    2012-07-01

    We introduce a statistical agent-based model to describe the phenomenon of drug abuse and its dynamical evolution at the individual and global level. The agents are heterogeneous with respect to their intrinsic inclination to drugs, their budget attitude, and their social environment. The various levels of drug use were inspired by the professional description of the phenomenon, which permits a direct comparison with all available data. We show that certain elements are of great importance in initiating drug use, for example rare events in an individual's personal experience that allow the barrier to occasional drug use to be overcome. The analysis of how the system reacts to perturbations is very important for understanding its key elements, and it provides strategies for effective policy making. The present model represents the first step toward a realistic description of this phenomenon and can easily be generalized in various directions.
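    The role of heterogeneity and rare triggering events can be illustrated with a toy threshold model; all probabilities and the barrier rule below are hypothetical, chosen only to show the mechanism, not the paper's calibrated dynamics:

```python
import random

def simulate_population(n=1000, steps=100, seed=7):
    """Fraction of users after `steps` periods.  Agents start using only
    when a rare event (prob. 0.01 per step) coincides with intrinsic
    inclination plus social pressure exceeding a barrier of 0.9; users can
    also quit, the more easily the lower their inclination."""
    rng = random.Random(seed)
    inclination = [rng.random() for _ in range(n)]   # heterogeneous agents
    using = [False] * n
    for _ in range(steps):
        frac_using = sum(using) / n                  # social environment
        for i in range(n):
            if not using[i]:
                if rng.random() < 0.01 and inclination[i] + frac_using > 0.9:
                    using[i] = True                  # rare event beats barrier
            elif rng.random() < 0.05 * (1.0 - inclination[i]):
                using[i] = False                     # spontaneous quitting
    return sum(using) / n

final_fraction = simulate_population()
```

    Perturbation experiments in this style (e.g. changing the rare-event rate) show how sensitive the user fraction is to the triggering mechanism, mirroring the policy analysis the abstract mentions.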

  19. Evaluation of a 4D cone-beam CT reconstruction approach using a simulation framework.

    PubMed

    Hartl, Alexander; Yaniv, Ziv

    2009-01-01

    Current image-guided navigation systems for thoracic-abdominal interventions utilize three-dimensional (3D) images acquired at breath-hold. As a result, they can only provide guidance at a specific point in the respiratory cycle. The intervention is thus performed in a gated manner, with the physician advancing only when the patient is at the same respiratory cycle in which the 3D image was acquired. To enable a more continuous workflow we propose to use 4D image data. We describe an approach to constructing a set of 4D images from a diagnostic CT acquired at breath-hold and a set of intraoperative cone-beam CT (CBCT) projection images acquired while the patient is freely breathing. Our approach is based on an initial reconstruction of a gated 4D CBCT data set. The 3D CBCT images for each respiratory phase are then non-rigidly registered to the diagnostic CT data. Finally, the diagnostic CT is deformed based on the registration results, providing a 4D data set with sufficient quality for navigation purposes. In this work we evaluate the proposed reconstruction approach using a simulation framework. A 3D CBCT dataset of an anthropomorphic phantom is deformed using internal motion data acquired from an animal model to create a ground truth 4D CBCT image. Simulated projection images are then created from the 4D image and the known CBCT scan parameters. Finally, the original 3D CBCT and the simulated X-ray images are used as input to our reconstruction method. The resulting 4D data set is then compared to the known ground truth by normalized cross correlation (NCC). We show that the deformed diagnostic CTs are of better quality than the gated reconstructions, with a mean NCC value of 0.94 versus a mean of 0.81 for the reconstructions. PMID:19964143
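    The evaluation metric used above, normalized cross correlation, is straightforward to compute; a minimal sketch over flattened image arrays:

```python
import math

def ncc(a, b):
    """Normalized cross correlation of two equally sized images (flattened
    to sequences of intensities); returns a value in [-1, 1], with 1 for a
    perfect positive linear relationship."""
    n = len(a)
    ma = sum(a) / n
    mb = sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / (da * db)
```

    Because NCC is invariant to linear intensity shifts and scaling, it is well suited to comparing a reconstructed volume against a ground-truth volume with different grey-level calibration.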

  20. Linking agent-based models and stochastic models of financial markets.

    PubMed

    Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H Eugene

    2012-05-29

    It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that "fat" tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting.
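    The core mechanism, correlated decisions arising from shared technical strategies broadening the return distribution, can be sketched with a toy trader model; the herding probability and the linear price-impact rule are illustrative assumptions, not the paper's calibrated model:

```python
import random
import statistics

def simulate_returns(herd_prob, steps=4000, n_traders=100, seed=3):
    """Per-step return = order imbalance of n_traders; with probability
    herd_prob a trader follows the shared technical signal, otherwise it
    trades idiosyncratically (toy linear price impact assumed)."""
    rng = random.Random(seed)
    returns = []
    for _ in range(steps):
        signal = rng.choice((-1, 1))          # shared technical signal
        imbalance = 0
        for _ in range(n_traders):
            if rng.random() < herd_prob:
                imbalance += signal           # herding: follow the signal
            else:
                imbalance += rng.choice((-1, 1))
        returns.append(imbalance / n_traders)
    return returns

herding_std = statistics.pstdev(simulate_returns(0.7))
independent_std = statistics.pstdev(simulate_returns(0.0))
```

    With independent traders the imbalance averages out (return scale ~ 1/sqrt(n)); shared strategies make orders add up coherently, widening the return distribution, which is the qualitative ingredient behind the fat tails discussed above.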

  1. Combining agent-based modeling and life cycle assessment for the evaluation of mobility policies.

    PubMed

    Querini, Florent; Benetto, Enrico

    2015-02-01

    This article presents agent-based modeling (ABM) as a novel approach for consequential life cycle assessment (C-LCA) of large scale policies, more specifically mobility-related policies. The approach is validated at the Luxembourgish level (as a first case study). The agent-based model simulates the car market (sales, use, and dismantling) of the population of users in the period 2013-2020, following the implementation of different mobility policies and available electric vehicles. The resulting changes in the car fleet composition as well as the hourly uses of the vehicles are then used to derive consistent LCA results, representing the consequences of the policies. Policies will have significant environmental consequences: when using ReCiPe2008, we observe a decrease of global warming, fossil depletion, acidification, ozone depletion, and photochemical ozone formation and an increase of metal depletion, ionizing radiations, marine eutrophication, and particulate matter formation. The study clearly shows that the extrapolation of LCA results for the circulating fleet at national scale following the introduction of the policies from the LCAs of single vehicles by simple up-scaling (using hypothetical deployment scenarios) would be flawed. The inventory has to be directly conducted at full scale and to this aim, ABM is indeed a promising approach, as it allows identifying and quantifying emerging effects while modeling the Life Cycle Inventory of vehicles at microscale through the concept of agents.

  2. Combining agent-based modeling and life cycle assessment for the evaluation of mobility policies.

    PubMed

    Querini, Florent; Benetto, Enrico

    2015-02-01

    This article presents agent-based modeling (ABM) as a novel approach for consequential life cycle assessment (C-LCA) of large scale policies, more specifically mobility-related policies. The approach is validated at the Luxembourgish level (as a first case study). The agent-based model simulates the car market (sales, use, and dismantling) of the population of users in the period 2013-2020, following the implementation of different mobility policies and available electric vehicles. The resulting changes in the car fleet composition as well as the hourly uses of the vehicles are then used to derive consistent LCA results, representing the consequences of the policies. Policies will have significant environmental consequences: when using ReCiPe2008, we observe a decrease of global warming, fossil depletion, acidification, ozone depletion, and photochemical ozone formation and an increase of metal depletion, ionizing radiations, marine eutrophication, and particulate matter formation. The study clearly shows that the extrapolation of LCA results for the circulating fleet at national scale following the introduction of the policies from the LCAs of single vehicles by simple up-scaling (using hypothetical deployment scenarios) would be flawed. The inventory has to be directly conducted at full scale and to this aim, ABM is indeed a promising approach, as it allows identifying and quantifying emerging effects while modeling the Life Cycle Inventory of vehicles at microscale through the concept of agents. PMID:25587896

  3. Linking agent-based models and stochastic models of financial markets

    PubMed Central

    Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H. Eugene

    2012-01-01

    It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that “fat” tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting. PMID:22586086

  4. A real-time interactive simulation framework for watershed decision making using numerical models and virtual environment

    NASA Astrophysics Data System (ADS)

    Zhang, ShangHong; Xia, ZhongXi; Wang, TaiWei

    2013-06-01

    Decision support systems based on a virtual environment (VE) are becoming a popular platform in watershed simulation and management. Simulation speed and data visualization are of great significance to decision making, especially in urgent events. Real-time interaction during the simulation process is also very important for dealing with different conditions and for making timely decisions. In this study, a VE-based real-time interactive simulation framework (VERTISF) is developed and applied to simulation and management of the Dujiangyan Project in China. In VERTISF development, a virtual reality platform and numerical models were hosted on different computers and connected by a network to improve simulation speed. Different types of numerical models were generalized in a unified architecture based on time step, and interactive control was realized by modifying model boundary conditions at each time step. The "instruction-response" method and data interpolation were used to synchronize virtual environment visualization and numerical model calculation. Implementation of the framework was based on modular software design; various computer languages can be used to develop the appropriate module. Since only slight modification was needed to integrate existing numerical models into the framework, VERTISF was easy to extend. Results showed that VERTISF could take full advantage of hardware development, and it was a simple and effective solution for complex watershed simulation.
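    A unified time-step architecture with interactive boundary-condition control might look like the following sketch; the class names and the toy reservoir model are hypothetical, not VERTISF's actual interfaces:

```python
class TimeSteppedModel:
    """Hypothetical base class: any numerical model is wrapped behind a
    per-time-step interface so a virtual-environment front end can advance
    it and modify boundary conditions between steps."""
    def __init__(self, state, dt):
        self.state = state
        self.t = 0.0
        self.dt = dt
        self.boundary = {}

    def set_boundary(self, **conditions):
        # interactive control: the VE injects new boundary conditions mid-run
        self.boundary.update(conditions)

    def step(self):
        raise NotImplementedError

class ReservoirModel(TimeSteppedModel):
    """Toy mass balance: storage changes by (inflow - outflow) * dt."""
    def step(self):
        inflow = self.boundary.get("inflow", 0.0)
        outflow = self.boundary.get("outflow", 0.0)
        self.state += (inflow - outflow) * self.dt
        self.t += self.dt

model = ReservoirModel(state=100.0, dt=1.0)
model.set_boundary(inflow=5.0, outflow=2.0)
for _ in range(10):
    model.step()
model.set_boundary(outflow=8.0)      # operator intervenes mid-simulation
for _ in range(10):
    model.step()
```

    Because every model exposes only `step` and `set_boundary`, the front end can drive heterogeneous solvers uniformly and synchronize their visualization per time step, which is the design idea the abstract describes.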

  5. Molecular dynamics simulation of framework flexibility effects on noble gas diffusion in HKUST-1 and ZIF-8

    DOE PAGES

    Parkes, Marie V.; Demir, Hakan; Teich-McGoldrick, Stephanie L.; Sholl, David S.; Greathouse, Jeffery A.; Allendorf, Mark D.

    2014-03-28

    Molecular dynamics simulations were used to investigate trends in noble gas (Ar, Kr, Xe) diffusion in the metal-organic frameworks HKUST-1 and ZIF-8. Diffusion occurs primarily through inter-cage jump events, with much greater diffusion of guest atoms in HKUST-1 compared to ZIF-8 due to the larger cage and window sizes in the former. We compare diffusion coefficients calculated for both rigid and flexible frameworks. For rigid framework simulations, in which the framework atoms were held at their crystallographic or geometry optimized coordinates, sometimes dramatic differences in guest diffusion were seen depending on the initial framework structure or the choice of framework force field parameters. When framework flexibility effects were included, argon and krypton diffusion increased significantly compared to rigid-framework simulations using general force field parameters. Additionally, for argon and krypton in ZIF-8, guest diffusion increased with loading, demonstrating that guest-guest interactions between cages enhance inter-cage diffusion. No inter-cage jump events were seen for xenon atoms in ZIF-8 regardless of force field or initial structure, and the loading dependence of xenon diffusion in HKUST-1 is different for rigid and flexible frameworks. Diffusion of krypton and xenon in HKUST-1 depends on two competing effects: the steric effect that decreases diffusion as loading increases, and the “small cage effect” that increases diffusion as loading increases. Finally, a detailed analysis of the window size in ZIF-8 reveals that the window increases beyond its normal size to permit passage of a (nominally) larger krypton atom.
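    Diffusion coefficients in such simulations are typically obtained from the mean-squared displacement via the Einstein relation; a minimal single-trajectory sketch (real analyses fit the MSD slope over many time origins rather than using a single endpoint, and average over all guest atoms):

```python
def diffusion_coefficient(positions, dt, dim=3):
    """Einstein-relation estimate D = MSD(t) / (2 * dim * t) from one
    trajectory, using only the displacement at the final frame."""
    dsq = sum((a - b) ** 2 for a, b in zip(positions[-1], positions[0]))
    t = (len(positions) - 1) * dt
    return dsq / (2 * dim * t)

# synthetic straight-line trajectory of one guest atom (11 frames, dt = 1)
traj = [(float(i), 0.0, 0.0) for i in range(11)]
D = diffusion_coefficient(traj, dt=1.0)
```

    Comparing D computed this way between rigid- and flexible-framework trajectories is how the flexibility effects above would be quantified.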

  6. A Framework for the Abstraction of Mesoscale Modeling for Weather Simulation

    NASA Astrophysics Data System (ADS)

    Limpasuvan, V.; Ujcich, B. E.

    2009-12-01

    Widely disseminated weather forecast results (e.g. from various national centers and private companies) are useful for typical users in gauging future atmospheric disturbances. However, these canonical forecasts may not adequately meet the needs of end-users in various scientific fields, since a predetermined model, as structured by the model administrator, produces these forecasts. To perform his/her own successful forecasts, a user faces a steep learning curve involving the collection of initial condition data (e.g. radar, satellite, and reanalyses) and operation of a suitable model (and associated software/computing). In this project, we develop an intermediate (prototypical) software framework and a web-based front-end interface that allow for the abstraction of an advanced weather model upon which the end-user can perform customizable forecasts and analyses. Having such an accessible front-end interface for a weather model can benefit educational programs at the secondary school and undergraduate level, scientific research in fields such as fluid dynamics and meteorology, and the general public. In all cases, our project allows the user to generate a localized domain of choice, run the desired forecast on a remote high-performance computer cluster, and visually inspect the results. For instance, an undergraduate science curriculum could incorporate the resulting weather forecast performed under this project in laboratory exercises. Scientific researchers and graduate students would be able to readily adjust key prognostic variables in the simulation within this project’s framework. The general public within the contiguous United States could also run a simplified version of the project’s software with adjustments in forecast clarity (spatial resolution) and region size (domain). Special cases of general interest, in which a detailed forecast may be required, would be over areas of possibly strong weather activity.

  7. Simulating mesoscale coastal evolution for decadal coastal management: A new framework integrating multiple, complementary modelling approaches

    NASA Astrophysics Data System (ADS)

    van Maanen, Barend; Nicholls, Robert J.; French, Jon R.; Barkwith, Andrew; Bonaldo, Davide; Burningham, Helene; Brad Murray, A.; Payo, Andres; Sutherland, James; Thornhill, Gillian; Townend, Ian H.; van der Wegen, Mick; Walkden, Mike J. A.

    2016-03-01

    Coastal and shoreline management increasingly needs to consider morphological change occurring at decadal to centennial timescales, especially that related to climate change and sea-level rise. This requires the development of morphological models operating at a mesoscale, defined by time and length scales of the order 10^1 to 10^2 years and 10^1 to 10^2 km. So-called 'reduced complexity' models that represent critical processes at scales not much smaller than the primary scale of interest, and are regulated by capturing the critical feedbacks that govern landform behaviour, are proving effective as a means of exploring emergent coastal behaviour at a landscape scale. Such models tend to be computationally efficient and are thus easily applied within a probabilistic framework. At the same time, reductionist models, built upon a more detailed description of hydrodynamic and sediment transport processes, are capable of application at increasingly broad spatial and temporal scales. More qualitative modelling approaches are also emerging that can guide the development and deployment of quantitative models, and these can be supplemented by varied data-driven modelling approaches that can achieve new explanatory insights from observational datasets. Such disparate approaches have hitherto been pursued largely in isolation by mutually exclusive modelling communities. Brought together, they have the potential to facilitate a step change in our ability to simulate the evolution of coastal morphology at scales that are most relevant to managing erosion and flood risk. Here, we advocate and outline a new integrated modelling framework that deploys coupled mesoscale reduced complexity models, reductionist coastal area models, data-driven approaches, and qualitative conceptual models. Integration of these heterogeneous approaches gives rise to model compositions that can potentially resolve decadal- to centennial-scale behaviour of diverse coupled open coast, estuary and inner

  8. A model framework to represent plant-physiology and rhizosphere processes in soil profile simulation models

    NASA Astrophysics Data System (ADS)

    Vanderborght, J.; Javaux, M.; Couvreur, V.; Schröder, N.; Huber, K.; Abesha, B.; Schnepf, A.; Vereecken, H.

    2013-12-01

    of plant transpiration by root-zone produced plant hormones, and (iv) the impact of salt accumulation at the soil-root interface on root water uptake. We further propose a framework for how this process knowledge could be implemented in root-zone simulation models that do not resolve small-scale processes.

  9. Autogenerator-Based Modelling Framework for Development of Strategic Games Simulations: Rational Pigs Game Extended

    PubMed Central

    Magdalenić, Ivan

    2014-01-01

    When considering strategic games from the conceptual perspective that focuses on the questions of participants' decision-making rationality, the very issues of modelling and simulation are rarely discussed. The well-known Rational Pigs matrix game has been relatively intensively analyzed in terms of reassessment of the logic of two players involved in asymmetric situations as gluttons that differ significantly in their attributes. This paper presents a successful attempt at using an autogenerator to create the framework of the game, including the predefined scenarios and corresponding payoffs. The autogenerator offers flexibility concerning the specification of game parameters, which consist of variations in the number of simultaneous players and their features, game objects and their attributes, as well as some general game characteristics. In the proposed approach, the model of the autogenerator was upgraded so as to enable program specification updates. For the treatment of more complex strategic scenarios, we created the Rational Pigs Game Extended (RPGE), in which the introduction of a third glutton entails significant structural changes. In addition, due to the existence of particular attributes of the new player, “the tramp,” one equilibrium point from the original game is destabilized, which influences the decision-making of rational players. PMID:25254228

  10. A simulation framework for estimating wall stress distribution of abdominal aortic aneurysm.

    PubMed

    Qin, Jing; Zhang, Jing; Chui, Chee-Kong; Huang, Wei-Min; Yang, Tao; Pang, Wai-Man; Sudhakar, Venkatesh; Chang, Stephen

    2011-01-01

    Abdominal aortic aneurysm (AAA) rupture is believed to occur when the mechanical stress acting on the wall exceeds the strength of the wall tissue. In endovascular aneurysm repair, a stent-graft in a catheter is released at the aneurysm site to form a new blood vessel and protect the weakened AAA wall from the pulsatile pressure and, hence, possible rupture. In this paper, we propose a framework to estimate the wall stress distribution of non-stented/stented AAA based on fluid-structure interaction, which is utilized in a surgical simulation system (IRAS). The 3D geometric model of the AAA is reconstructed from computed tomography angiographic (CTA) images. Based on our experiments, a combined logarithm and polynomial strain energy equation is applied to model the elastic properties of the arterial wall. The blood flow is modeled as laminar, incompressible, and non-Newtonian by applying the Navier-Stokes equations. The obtained blood flow pressure is applied as a load on the AAA meshes with and without a stent-graft, and the wall stress distribution is calculated by the fluid-structure interaction (FSI) solver in ANSYS. Experiments demonstrate that our analytical results are consistent with clinical observations. PMID:22254456

  11. Autogenerator-based modelling framework for development of strategic games simulations: rational pigs game extended.

    PubMed

    Fabac, Robert; Radošević, Danijel; Magdalenić, Ivan

    2014-01-01

    When considering strategic games from the conceptual perspective that focuses on the questions of participants' decision-making rationality, the very issues of modelling and simulation are rarely discussed. The well-known Rational Pigs matrix game has been relatively intensively analyzed in terms of reassessment of the logic of two players involved in asymmetric situations as gluttons that differ significantly in their attributes. This paper presents a successful attempt at using an autogenerator to create the framework of the game, including the predefined scenarios and corresponding payoffs. The autogenerator offers flexibility concerning the specification of game parameters, which consist of variations in the number of simultaneous players and their features, game objects and their attributes, as well as some general game characteristics. In the proposed approach, the model of the autogenerator was upgraded so as to enable program specification updates. For the treatment of more complex strategic scenarios, we created the Rational Pigs Game Extended (RPGE), in which the introduction of a third glutton entails significant structural changes. In addition, due to the existence of particular attributes of the new player, "the tramp," one equilibrium point from the original game is destabilized, which influences the decision-making of rational players.
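    The equilibrium destabilization described above rests on checking best responses; a minimal pure-strategy Nash enumeration is sketched below, with hypothetical Rational Pigs payoffs (not the paper's values):

```python
def pure_nash(payoffs):
    """Enumerate pure-strategy Nash equilibria of an n-player matrix game:
    payoffs maps each strategy profile (a tuple) to a payoff tuple.  A
    profile is an equilibrium when no player has a profitable unilateral
    deviation."""
    profiles = list(payoffs)
    n = len(profiles[0])
    strategies = [sorted({p[i] for p in profiles}) for i in range(n)]
    equilibria = []
    for prof in profiles:
        def better_reply(i):
            for s in strategies[i]:
                dev = prof[:i] + (s,) + prof[i + 1:]
                if payoffs[dev][i] > payoffs[prof][i]:
                    return True
            return False
        if not any(better_reply(i) for i in range(n)):
            equilibria.append(prof)
    return equilibria

# hypothetical two-player payoffs (big pig, little pig): the little pig's
# dominant "wait" forces the big pig to press
rp = {
    ("press", "press"): (5, 1),
    ("press", "wait"):  (4, 4),
    ("wait", "press"):  (9, -1),
    ("wait", "wait"):   (0, 0),
}
eq = pure_nash(rp)
```

    Adding a third player ("the tramp") just extends the profile tuples; rerunning the same check reveals which of the original equilibria survive, which is the kind of analysis RPGE enables.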

  12. Development of a Lattice Boltzmann Framework for Numerical Simulation of Thrombosis

    NASA Astrophysics Data System (ADS)

    Harrison, S. E.; Bernsdorf, J.; Hose, D. R.; Lawford, P. V.

    The interacting factors relating to thrombogenesis were defined by Virchow in 1856 to be abnormalities of blood chemistry, the vessel wall and haemodynamics. Together, these factors are known as Virchow's triad. Many attempts have been made to simulate numerically certain aspects of the complex phenomena of thrombosis, but a comprehensive model, which includes the biochemical and physical aspects of Virchow's triad, and is capable of predicting thrombus development within physiological geometries has not yet been developed. Such a model would consider the role of platelets and the coagulation cascade along with the properties of the flow in the chosen vessel. A lattice Boltzmann thrombosis framework has been developed, on top of an existing flow solver, to model the formation of thrombi resulting from platelet activation and initiation of the coagulation cascade by one or more of the strands of Virchow's triad. Both processes then act in parallel, to restore homeostasis as the deposited thrombus disturbs the flow. Results are presented in a model of deep vein thrombosis (DVT), resulting from hypoxia and associated endothelial damage.
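    The flow-solver layer underlying such a framework is a standard collide-and-stream update; below is a minimal D2Q9 BGK sketch on a periodic grid (no thrombosis chemistry or deposition, just the lattice Boltzmann core the framework builds on):

```python
# D2Q9 lattice weights and discrete velocities
W = [4/9] + [1/9] * 4 + [1/36] * 4
C = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]

def equilibrium(rho, ux, uy):
    """Second-order equilibrium populations for density rho, velocity (ux, uy)."""
    usq = ux * ux + uy * uy
    feq = []
    for w, (cx, cy) in zip(W, C):
        cu = cx * ux + cy * uy
        feq.append(w * rho * (1.0 + 3.0 * cu + 4.5 * cu * cu - 1.5 * usq))
    return feq

def lbm_step(f, nx, ny, tau=0.8):
    """One BGK collide-and-stream update on a periodic nx-by-ny grid;
    f[y][x] holds the 9 populations of one lattice node."""
    post = [[None] * nx for _ in range(ny)]
    for y in range(ny):
        for x in range(nx):
            pops = f[y][x]
            rho = sum(pops)
            ux = sum(p * c[0] for p, c in zip(pops, C)) / rho
            uy = sum(p * c[1] for p, c in zip(pops, C)) / rho
            feq = equilibrium(rho, ux, uy)
            post[y][x] = [p - (p - fe) / tau for p, fe in zip(pops, feq)]
    out = [[[0.0] * 9 for _ in range(nx)] for _ in range(ny)]
    for y in range(ny):
        for x in range(nx):
            for k, (cx, cy) in enumerate(C):
                out[(y + cy) % ny][(x + cx) % nx][k] = post[y][x][k]
    return out

# start from a gently perturbed density field at rest and advance five steps
nx, ny = 8, 8
f = [[equilibrium(1.0 + 0.1 * ((x + y) % 2), 0.0, 0.0)
      for x in range(nx)] for y in range(ny)]
mass0 = sum(p for row in f for cell in row for p in cell)
for _ in range(5):
    f = lbm_step(f, nx, ny)
mass1 = sum(p for row in f for cell in row for p in cell)
```

    A thrombosis model would sit on top of this core, marking nodes as solid as thrombus deposits and letting the bounce-back boundary disturb the flow in turn.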

  13. An agent-based model for queue formation of powered two-wheelers in heterogeneous traffic

    NASA Astrophysics Data System (ADS)

    Lee, Tzu-Chang; Wong, K. I.

    2016-11-01

    This paper presents an agent-based model (ABM) for simulating the queue formation of powered two-wheelers (PTWs) in heterogeneous traffic at a signalized intersection. The main novelty is that the proposed interaction rule describing the position choice behavior of PTWs when queuing in heterogeneous traffic can capture the stochastic nature of the decision making process. The interaction rule is formulated as a multinomial logit model, which is calibrated by using a microscopic traffic trajectory dataset obtained from video footage. The ABM is validated against the survey data for the vehicular trajectory patterns, queuing patterns, queue lengths, and discharge rates. The results demonstrate that the proposed model is capable of replicating the observed queue formation process for heterogeneous traffic.
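The multinomial logit rule at the core of the interaction model can be sketched as follows. The attributes, coefficients, and candidate positions below are invented for illustration; they are not the paper's calibrated values from the trajectory dataset.

```python
import math

def mnl_probabilities(utilities):
    """Multinomial logit choice probabilities from systematic utilities."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical attributes of three candidate queuing positions for a PTW
# (gap width in metres, distance to the stop line); coefficients invented.
beta_gap, beta_dist = 1.2, -0.4
positions = [
    {"gap_width": 1.0, "dist_to_stopline": 2.0},
    {"gap_width": 0.5, "dist_to_stopline": 1.0},
    {"gap_width": 0.2, "dist_to_stopline": 0.0},
]
utils = [beta_gap * p["gap_width"] + beta_dist * p["dist_to_stopline"] for p in positions]
probs = mnl_probabilities(utils)
print([round(p, 3) for p in probs])
```

Sampling a position from these probabilities each time a PTW joins the queue reproduces the stochastic position-choice behaviour the abstract describes.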

  14. Agent based model of effects of task allocation strategies in flat organizations

    NASA Astrophysics Data System (ADS)

    Sobkowicz, Pawel

    2016-09-01

    A common practice in many organizations is to pile the work on the best performers. It is easy to implement by the management and, despite the apparent injustice, appears to be working in many situations. In our work we present a simple agent based model, constructed to simulate this practice and to analyze conditions under which the overall efficiency of the organization (for example measured by the backlog of unresolved issues) breaks down, due to the cumulative effect of the individual overloads. The model confirms that the strategy mentioned above is, indeed, rational: it leads to better global results than an alternative one, using equal workload distribution among all workers. The presented analyses focus on the behavior of the organizations close to the limit of the maximum total throughput and provide results for the growth of the unprocessed backlog in several situations, as well as suggestions related to avoiding such buildup.
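A minimal sketch of the comparison, assuming three workers with unequal speeds and a constant deterministic task stream (all numbers invented): piling work on the fastest workers first keeps the backlog flat as long as total demand stays below total capacity, while equal distribution lets the slowest worker's backlog grow without bound.

```python
def simulate(strategy, steps=1000, arrival=2.9):
    """Toy flat organization: three workers with unequal speeds receive
    a constant stream of tasks under one of two allocation strategies."""
    speeds = [1.5, 1.0, 0.5]              # tasks each worker resolves per step
    backlog = [0.0, 0.0, 0.0]
    for _ in range(steps):
        if strategy == "best_first":      # pile work on the best performers
            remaining = arrival
            for i in sorted(range(3), key=lambda i: -speeds[i]):
                take = min(remaining, speeds[i])
                backlog[i] += take
                remaining -= take
            backlog[0] += remaining       # any overflow still lands on the best
        else:                             # "equal" workload distribution
            for i in range(3):
                backlog[i] += arrival / 3
        for i in range(3):                # each worker processes at its own speed
            backlog[i] = max(0.0, backlog[i] - speeds[i])
    return sum(backlog)

print(simulate("best_first"), simulate("equal"))
```

Here total capacity is 3.0 tasks per step against 2.9 arrivals, so the "best first" strategy clears everything, while equal distribution overloads the slowest worker and the total backlog grows linearly, matching the abstract's claim.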

  15. An agent based multi-optional model for the diffusion of innovations

    NASA Astrophysics Data System (ADS)

    Laciana, Carlos E.; Oteiza-Aguirre, Nicolás

    2014-01-01

    We propose a model for the diffusion of several products competing in a common market, based on a generalization of the Ising model of statistical mechanics (the Potts model). Using an agent-based implementation we analyze two problems: (i) a three-option case, i.e. adopting product A, adopting product B, or non-adoption; and (ii) a four-option case, i.e. adopting product A, product B, both, or neither. In the first case we analyze a launching strategy for one of the two products, which delays its launch with the objective of competing through improvements. Market shares reached by each product are then estimated at market saturation. Finally, simulations are carried out for different social network topologies and varying degrees of uncertainty and population homogeneity.
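Since the paper's exact specification is not reproduced here, the following is only a generic sketch of a three-state Potts-style adoption update, with invented coupling and field parameters: an agent re-samples its option with probability proportional to the exponential of neighbour alignment plus an external (e.g. advertising) field.

```python
import math
import random

def potts_step(states, neighbors, field, beta, rng):
    """One asynchronous update of a 3-state Potts-style adoption model.
    States: 0 = non-adoption, 1 = product A, 2 = product B. The chosen
    agent re-samples its state with weight exp(beta * aligned + field)."""
    i = rng.randrange(len(states))
    weights = []
    for s in range(3):
        aligned = sum(1 for j in neighbors[i] if states[j] == s)
        weights.append(math.exp(beta * aligned + field[s]))
    r = rng.random() * sum(weights)
    for s, w in enumerate(weights):
        r -= w
        if r < 0:
            states[i] = s
            break
    return states

rng = random.Random(7)
n = 10
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}  # ring network
states = [0] * n                       # everyone starts as a non-adopter
for _ in range(2000):
    potts_step(states, ring, field=[0.0, 3.0, 0.0], beta=1.0, rng=rng)
print(states.count(1) / n)             # share that adopted product A
```

With a strong external field favouring product A, most agents end up adopting it; weakening the field and strengthening `beta` shifts the outcome toward neighbourhood-driven clustering.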

  16. Skin Stem Cell Hypotheses and Long Term Clone Survival – Explored Using Agent-based Modelling

    PubMed Central

    Li, X.; Upadhyay, A. K.; Bullock, A. J.; Dicolandrea, T.; Xu, J.; Binder, R. L.; Robinson, M. K.; Finlay, D. R.; Mills, K. J.; Bascom, C. C.; Kelling, C. K.; Isfort, R. J.; Haycock, J. W.; MacNeil, S.; Smallwood, R. H.

    2013-01-01

    Epithelial renewal in skin is achieved by the constant turnover and differentiation of keratinocytes. Three popular hypotheses have been proposed to explain basal keratinocyte regeneration and epidermal homeostasis: 1) asymmetric division (stem-transit amplifying cell); 2) populational asymmetry (progenitor cell with stochastic fate); and 3) populational asymmetry with stem cells. In this study, we investigated lineage dynamics using these hypotheses with a 3D agent-based model of the epidermis. The model simulated the growth and maintenance of the epidermis over three years. The offspring of each proliferative cell was traced. While all lineages were preserved in asymmetric division, the vast majority were lost when assuming populational asymmetry. The third hypothesis provided the most reliable mechanism for self-renewal by preserving genetic heterogeneity in quiescent stem cells, and also inherent mechanisms for skin ageing and the accumulation of genetic mutation. PMID:23712735
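The lineage-loss result under pure populational asymmetry can be illustrated with a toy critical branching process (all probabilities invented, not the paper's calibrated 3D model): when symmetric renewal and symmetric differentiation are equally likely, most lineages drift to extinction.

```python
import random

def lineage_survival(n_lineages, divisions, p_symmetric=0.5, seed=0):
    """Toy populational-asymmetry model: at each division a progenitor
    yields two progenitors (prob p/2), two differentiated cells (prob p/2),
    or one of each (prob 1-p), i.e. a critical branching process."""
    rng = random.Random(seed)
    surviving = 0
    for _ in range(n_lineages):
        progenitors = 1
        for _ in range(divisions):
            if progenitors == 0:
                break
            next_gen = 0
            for _ in range(progenitors):
                r = rng.random()
                if r < p_symmetric / 2:
                    next_gen += 2          # both daughters stay progenitors
                elif r < p_symmetric:
                    next_gen += 0          # both daughters differentiate
                else:
                    next_gen += 1          # asymmetric division
            progenitors = min(next_gen, 1000)  # cap to keep runtime bounded
        if progenitors > 0:
            surviving += 1
    return surviving / n_lineages

frac = lineage_survival(500, 200)
print(frac)
```

Even though the expected progenitor number stays constant, the surviving fraction after 200 division rounds is small, which is the neutral-drift lineage loss the abstract reports for the populational-asymmetry hypothesis.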

  17. The Evolution of ICT Markets: An Agent-Based Model on Complex Networks

    NASA Astrophysics Data System (ADS)

    Zhao, Liangjie; Wu, Bangtao; Chen, Zhong; Li, Li

    Information and communication technology (ICT) products exhibit positive network effects. The dynamic evolution of ICT markets has two intrinsic characteristics: (1) customers are influenced by each other's purchasing decisions; and (2) customers are intelligent agents with bounded rationality. Guided by complex systems theory, we construct an agent-based model, simulated on complex networks, to examine how market evolution can arise from the interactions of customers, which occur when each customer forms expectations about a product's future installed base from the fraction of neighbours in his personal network using the same product. We demonstrate that network effects play an important role in the evolution of market share, and can enable even an inferior product to dominate the whole market. We also find that the intensity of customers' communication influences whether the best initial strategy for firms is to improve product quality or to expand their installed base.
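The installed-base expectation mechanism can be sketched in a few lines. The qualities, network-effect strength, and neighbour configuration below are invented; the decision rule (intrinsic quality plus a network term proportional to the neighbour fraction) is a common simplification, not necessarily the paper's exact utility function.

```python
def choose_product(quality, neighbor_choices, alpha):
    """A boundedly rational consumer picks the product maximizing perceived
    utility = intrinsic quality + alpha * fraction of neighbours already
    using it (a proxy for the expected future installed base)."""
    n = len(neighbor_choices)
    best, best_u = None, float("-inf")
    for product, q in quality.items():
        frac = sum(1 for c in neighbor_choices if c == product) / n
        u = q + alpha * frac
        if u > best_u:
            best, best_u = product, u
    return best

# An inferior product B can win once its local installed base is large enough.
quality = {"A": 1.0, "B": 0.8}
print(choose_product(quality, ["B", "B", "B", "A"], alpha=1.5))
```

With `alpha = 0` (no network effect) the same consumer would pick the higher-quality product A, which is exactly the inferior-product lock-in effect the abstract describes.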

  18. Evolving nutritional strategies in the presence of competition: a geometric agent-based model.

    PubMed

    Senior, Alistair M; Charleston, Michael A; Lihoreau, Mathieu; Buhl, Jerome; Raubenheimer, David; Simpson, Stephen J

    2015-03-01

    Access to nutrients is a key factor governing development, reproduction and ultimately fitness. Within social groups, contest-competition can fundamentally affect nutrient access, potentially leading to reproductive asymmetry among individuals. Previously, agent-based models have been combined with the Geometric Framework of nutrition to provide insight into how nutrition and social interactions affect one another. Here, we expand this modelling approach by incorporating evolutionary algorithms to explore how contest-competition over nutrient acquisition might affect the evolution of animal nutritional strategies. Specifically, we model tolerance of nutrient excesses and deficits when ingesting nutritionally imbalanced foods, which we term 'nutritional latitude'; a higher degree of nutritional latitude constitutes a higher tolerance of nutritional excess and deficit. Our results indicate that a transition between two alternative strategies occurs at moderate to high levels of competition. When competition is low, individuals display a low level of nutritional latitude and regularly switch foods in search of an optimum. When food is scarce and contest-competition is intense, high nutritional latitude appears optimal, and individuals continue to consume an imbalanced food for longer periods before attempting to switch to an alternative. However, the relative balance of nutrients within available foods also strongly influences at what levels of competition, if any, transitions between these two strategies occur. Our models imply that competition combined with reproductive skew in social groups can play a role in the evolution of diet breadth. We discuss how the integration of agent-based, nutritional and evolutionary modelling may be applied in future studies to further understand the evolution of nutritional strategies across social and ecological contexts.

  19. Estimating Impacts of Climate Change Policy on Land Use: An Agent-Based Modelling Approach

    PubMed Central

    2015-01-01

    Agriculture is important to New Zealand’s economy. Like other primary producers, New Zealand strives to increase agricultural output while maintaining environmental integrity. Utilising modelling to explore the economic, environmental and land use impacts of policy is critical to understand the likely effects on the sector. Key deficiencies within existing land use and land cover change models are the lack of heterogeneity in farmers and their behaviour, the role that social networks play in information transfer, and the abstraction of the global and regional economic aspects within local-scale approaches. To resolve these issues we developed the Agent-based Rural Land Use New Zealand model. The model utilises a partial equilibrium economic model and an agent-based decision-making framework to explore how the cumulative effects of individual farmer’s decisions affect farm conversion and the resulting land use at a catchment scale. The model is intended to assist in the development of policy to shape agricultural land use intensification in New Zealand. We illustrate the model, by modelling the impact of a greenhouse gas price on farm-level land use, net revenue, and environmental indicators such as nutrient losses and soil erosion for key enterprises in the Hurunui and Waiau catchments of North Canterbury in New Zealand. Key results from the model show that farm net revenue is estimated to increase over time regardless of the greenhouse gas price. Net greenhouse gas emissions are estimated to decline over time, even under a no GHG price baseline, due to an expansion of forestry on low productivity land. Higher GHG prices provide a greater net reduction of emissions. While social and geographic network effects have minimal impact on net revenue and environmental outputs for the catchment, they do have an effect on the spatial arrangement of land use and in particular the clustering of enterprises. PMID:25996591

  20. The Evolution of Cooperation in Managed Groundwater Systems: An Agent-Based Modelling Approach

    NASA Astrophysics Data System (ADS)

    Castilla Rho, J. C.; Mariethoz, G.; Rojas, R. F.; Andersen, M. S.; Kelly, B. F.; Holley, C.

    2014-12-01

    Human interactions with groundwater systems often exhibit complex features that hinder the sustainable management of the resource. This leads to costly and persistent conflicts over groundwater at the catchment scale. One possible way to address these conflicts is by gaining a better understanding of how social and groundwater dynamics coevolve using agent-based models (ABM). Such models allow exploring 'bottom-up' solutions (i.e., self-organised governance systems), where the behaviour of individual agents (e.g., farmers) results in the emergence of mutual cooperation among groundwater users. There is significant empirical evidence indicating that this kind of 'bottom-up' approach may lead to more enduring and sustainable outcomes, compared to conventional 'top-down' strategies such as centralized control and water right schemes (Ostrom 1990). New modelling tools are needed to study these concepts systematically and efficiently. Our model uses a conceptual framework to study cooperation and the emergence of social norms as initially proposed by Axelrod (1986), which we adapted to groundwater management. We developed an ABM that integrates social mechanisms and the physics of subsurface flow. The model explicitly represents feedback between groundwater conditions and social dynamics, capturing the spatial structure of these interactions and the potential effects on cooperation levels in an agricultural setting. Using this model, we investigate a series of mechanisms that may trigger norms supporting cooperative strategies, which can be sustained and become stable over time. For example, farmers in a self-monitoring community can be more efficient at achieving the objective of sustainable groundwater use than government-imposed regulation. Our coupled model thus offers a platform for testing new schemes promoting cooperation and improved resource use, which can be used as a basis for policy design. Importantly, we hope to raise awareness of agent-based modelling as
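The Axelrod-style norms mechanism the abstract builds on can be sketched with a toy irrigation game. Every number here (pumping rates, recharge, sanction size, monitoring probability) is invented; the point is only to show why community self-monitoring can make defection unprofitable.

```python
import random

def season_payoffs(strategies, punish_prob, rng):
    """One irrigation season in a toy norms game. Cooperators pump 1 unit,
    defectors 2; pumping above recharge imposes a shared drawdown cost;
    defectors caught by self-monitoring neighbours pay a sanction."""
    pumped = [2.0 if s == "defect" else 1.0 for s in strategies]
    recharge = len(strategies) * 1.0          # sustainable if all cooperate
    drawdown = max(0.0, sum(pumped) - recharge) / len(strategies)
    payoffs = []
    for s, p in zip(strategies, pumped):
        pay = p - drawdown                     # crop value minus shared cost
        if s == "defect" and rng.random() < punish_prob:
            pay -= 3.0                         # sanction from the community
        payoffs.append(pay)
    return payoffs

rng = random.Random(3)
group = ["cooperate"] * 9 + ["defect"]
trials = 1000
coop_avg = defect_avg = 0.0
for _ in range(trials):
    pays = season_payoffs(group, punish_prob=0.8, rng=rng)
    coop_avg += sum(pays[:9]) / 9 / trials
    defect_avg += pays[9] / trials
print(round(coop_avg, 2), round(defect_avg, 2))
```

With a high monitoring probability the lone defector earns less on average than the cooperators; drop `punish_prob` toward zero and defection pays, which is the norm-collapse regime Axelrod's framework is designed to probe.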

  1. Evolving Nutritional Strategies in the Presence of Competition: A Geometric Agent-Based Model

    PubMed Central

    Senior, Alistair M.; Charleston, Michael A.; Lihoreau, Mathieu; Buhl, Jerome; Raubenheimer, David; Simpson, Stephen J.

    2015-01-01

    Access to nutrients is a key factor governing development, reproduction and ultimately fitness. Within social groups, contest-competition can fundamentally affect nutrient access, potentially leading to reproductive asymmetry among individuals. Previously, agent-based models have been combined with the Geometric Framework of nutrition to provide insight into how nutrition and social interactions affect one another. Here, we expand this modelling approach by incorporating evolutionary algorithms to explore how contest-competition over nutrient acquisition might affect the evolution of animal nutritional strategies. Specifically, we model tolerance of nutrient excesses and deficits when ingesting nutritionally imbalanced foods, which we term ‘nutritional latitude’; a higher degree of nutritional latitude constitutes a higher tolerance of nutritional excess and deficit. Our results indicate that a transition between two alternative strategies occurs at moderate to high levels of competition. When competition is low, individuals display a low level of nutritional latitude and regularly switch foods in search of an optimum. When food is scarce and contest-competition is intense, high nutritional latitude appears optimal, and individuals continue to consume an imbalanced food for longer periods before attempting to switch to an alternative. However, the relative balance of nutrients within available foods also strongly influences at what levels of competition, if any, transitions between these two strategies occur. Our models imply that competition combined with reproductive skew in social groups can play a role in the evolution of diet breadth. We discuss how the integration of agent-based, nutritional and evolutionary modelling may be applied in future studies to further understand the evolution of nutritional strategies across social and ecological contexts. PMID:25815976

  2. Estimating impacts of climate change policy on land use: an agent-based modelling approach.

    PubMed

    Morgan, Fraser J; Daigneault, Adam J

    2015-01-01

    Agriculture is important to New Zealand's economy. Like other primary producers, New Zealand strives to increase agricultural output while maintaining environmental integrity. Utilising modelling to explore the economic, environmental and land use impacts of policy is critical to understand the likely effects on the sector. Key deficiencies within existing land use and land cover change models are the lack of heterogeneity in farmers and their behaviour, the role that social networks play in information transfer, and the abstraction of the global and regional economic aspects within local-scale approaches. To resolve these issues we developed the Agent-based Rural Land Use New Zealand model. The model utilises a partial equilibrium economic model and an agent-based decision-making framework to explore how the cumulative effects of individual farmer's decisions affect farm conversion and the resulting land use at a catchment scale. The model is intended to assist in the development of policy to shape agricultural land use intensification in New Zealand. We illustrate the model, by modelling the impact of a greenhouse gas price on farm-level land use, net revenue, and environmental indicators such as nutrient losses and soil erosion for key enterprises in the Hurunui and Waiau catchments of North Canterbury in New Zealand. Key results from the model show that farm net revenue is estimated to increase over time regardless of the greenhouse gas price. Net greenhouse gas emissions are estimated to decline over time, even under a no GHG price baseline, due to an expansion of forestry on low productivity land. Higher GHG prices provide a greater net reduction of emissions. While social and geographic network effects have minimal impact on net revenue and environmental outputs for the catchment, they do have an effect on the spatial arrangement of land use and in particular the clustering of enterprises.

  3. Generic Procedure for Coupling the PHREEQC Geochemical Modeling Framework with Flow and Solute Transport Simulators

    NASA Astrophysics Data System (ADS)

    Wissmeier, L. C.; Barry, D. A.

    2009-12-01

    Computer simulations of water availability and quality play an important role in state-of-the-art water resources management. However, many of the most utilized software programs focus either on physical flow and transport phenomena (e.g., MODFLOW, MT3DMS, FEFLOW, HYDRUS) or on geochemical reactions (e.g., MINTEQ, PHREEQC, CHESS, ORCHESTRA). In recent years, several couplings between the two genres of programs have evolved in order to consider interactions between flow and biogeochemical reactivity (e.g., HP1, PHWAT). Software coupling procedures can be categorized as ‘close couplings’, where programs pass information via the memory stack at runtime, and ‘remote couplings’, where the information is exchanged at each time step via input/output files. The former generally involves modifying the software codes and therefore requires expert programming skills. We present a generic recipe for remotely coupling the PHREEQC geochemical modeling framework and flow and solute transport (FST) simulators. The iterative scheme relies on operator splitting with continuous re-initialization of PHREEQC and the FST of choice at each time step. Since PHREEQC calculates the geochemistry of aqueous solutions in contact with soil minerals, the procedure is primarily designed for couplings to FSTs for liquid phase flow in natural environments. It requires access to initial conditions and numerical parameters such as the time and space discretization in the input text file for the FST, and control of the FST via commands to the operating system (batch on Windows; bash/shell on Unix/Linux). The coupling procedure is based on PHREEQC’s capability to save the state of a simulation, with all solid, liquid and gaseous species, as a PHREEQC input file by making use of the dump file option in the TRANSPORT keyword. The output from one reaction calculation step is therefore reused as input for the following reaction step where changes in element amounts due to advection
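The operator-splitting idea behind such remote couplings can be sketched without either real program: each time step, a transport solver advects concentrations, then a reaction step (standing in for a re-initialized PHREEQC run driven by input/output files) updates the chemistry cell by cell. The first-order decay "reaction" below is a placeholder, not a real geochemical model.

```python
# Minimal operator-splitting sketch for a 1D column: transport step,
# then reaction step, repeated each time step. All parameters invented.

def advect(conc, courant):
    """Explicit upwind advection on a 1D column."""
    new = conc[:]
    for i in range(1, len(conc)):
        new[i] = conc[i] + courant * (conc[i - 1] - conc[i])
    return new

def react(conc, k, dt):
    """Placeholder reaction step: first-order decay in every cell
    (a PHREEQC call would go here in the real coupling)."""
    return [c * (1.0 - k * dt) for c in conc]

conc = [1.0] + [0.0] * 9     # inlet cell starts at concentration 1.0
for _ in range(50):
    conc = advect(conc, courant=0.5)
    conc[0] = 1.0            # inlet boundary condition re-applied each step
    conc = react(conc, k=0.1, dt=0.5)
print([round(c, 3) for c in conc])
```

The resulting profile decays monotonically along the column, combining transport from the inlet with losses in the reaction step, which is the qualitative behaviour a split flow/chemistry coupling should reproduce.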

  4. Evolutionary Agent-based Models to design distributed water management strategies

    NASA Astrophysics Data System (ADS)

    Giuliani, M.; Castelletti, A.; Reed, P. M.

    2012-12-01

    There is growing awareness in the scientific community that the traditional centralized approach to water resources management, as described in much of the water resources literature, provides an ideal optimal solution, which is certainly useful to quantify the best physically achievable performance, but is generally inapplicable. Most real world water resources management problems are indeed characterized by the presence of multiple, distributed and institutionally-independent decision-makers. Multi-Agent Systems provide a potentially more realistic alternative framework to model multiple and self-interested decision-makers in a credible context. Each decision-maker can be represented by an agent who, being self-interested, acts according to local objective functions and produces negative externalities on system level objectives. Different levels of coordination can potentially be included in the framework by designing coordination mechanisms to drive the current decision-making structure toward the global system efficiency. Yet, the identification of effective coordination strategies can be particularly complex in modern institutional contexts and current practice is dependent on largely ad-hoc coordination strategies. In this work we propose a novel Evolutionary Agent-based Modeling (EAM) framework that enables a mapping of fully uncoordinated and centrally coordinated solutions into their relative "many-objective" tradeoffs using multiobjective evolutionary algorithms. Then, by analysing the conflicts between local individual agent and global system level objectives it is possible to more fully understand the causes, consequences, and potential solution strategies for coordination failures. Game-theoretic criteria have value for identifying the most interesting alternatives from a policy making point of view as well as the coordination mechanisms that can be applied to obtain these interesting solutions. The proposed approach is numerically tested on a

  5. A modular framework for matter flux simulation at the catchment scale

    NASA Astrophysics Data System (ADS)

    Kraft, P.; Breuer, L.; Vaché, K. B.; Frede, H.-G.

    2009-04-01

    Modeling nutrient fluxes in a catchment is a complex and interdisciplinary task. Building and improving simulation tools for such complex systems is often constrained by the expertise of the scientists engaged: since many different fields of science are involved, such as vadose zone and groundwater hydrology, plant growth, atmospheric exchange, soil chemistry, soil microbiology, stream physics and stream chemistry, a single work group cannot excel in all of them. As a result, either the parts of the system where no scientist involved is an expert include rough simplifications, or a "complete" group is too big to maintain the system over a longer period. Many approaches exist to create complex models that integrate processes for all sub-domains, but tight integration bears the risk of freezing a specific state of science into the complex system. A model infrastructure is therefore needed that takes the complex feedback loops across domain boundaries (e.g. between soil moisture and plant growth) into consideration and is still flexible enough to adapt to new findings in any of the scientific fields. This type of infrastructure can be obtained from a set of independent but connectible models. The new Catchment Model Framework (cmf), a module for subsurface water and solute transport, is an example of an independent yet open and easily extensible framework for simulating water and solute transport processes. Openness is gained by implementing the model as an extension to the Python programming language. Coupling cmf with models for other system compartments that also provide a Python interface, such as plant growth, biogeochemical or atmospheric dispersion models, can easily be done. The models used in the coupling process can be spatially explicit models, plot-scale models with one instance per mesh node of the landscape model, or pure reaction functions using the integration methods of cmf. The concept of extending an existing and
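The loose-coupling idea, two independent models advancing in lockstep and exchanging state each step, can be sketched as follows. Both model classes are invented toy stand-ins (e.g. for a cmf water-transport model and a plant-growth model); none of this is cmf's actual API.

```python
# Toy demonstration of coupling two independent models through a shared
# time loop with state exchange (soil moisture out, water uptake back).

class SoilWater:
    """Stand-in for a water-transport model: a single moisture store."""
    def __init__(self):
        self.moisture = 0.5
    def step(self, uptake, rain=0.02):
        self.moisture = min(1.0, max(0.0, self.moisture + rain - uptake))

class Plant:
    """Stand-in for a plant-growth model with moisture-limited growth."""
    def __init__(self):
        self.biomass = 1.0
    def step(self, moisture):
        growth = 0.01 * moisture      # daily growth limited by soil moisture
        self.biomass += growth
        return 2.0 * growth           # transpiration demand fed back to soil

soil, plant = SoilWater(), Plant()
uptake = 0.0
for day in range(100):
    soil.step(uptake)                     # water model uses last uptake
    uptake = plant.step(soil.moisture)    # plant model uses new moisture
print(round(soil.moisture, 3), round(plant.biomass, 3))
```

The feedback loop (moisture limits growth, growth drives uptake, uptake depletes moisture) is exactly the cross-domain coupling the abstract argues a modular framework must support.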

  6. Dynamic calibration of agent-based models using data assimilation

    PubMed Central

    Ward, Jonathan A.; Evans, Andrew J.; Malleson, Nicolas S.

    2016-01-01

    A widespread approach to investigating the dynamical behaviour of complex social systems is via agent-based models (ABMs). In this paper, we describe how such models can be dynamically calibrated using the ensemble Kalman filter (EnKF), a standard method of data assimilation. Our goal is twofold. First, we want to present the EnKF in a simple setting for the benefit of ABM practitioners who are unfamiliar with it. Second, we want to illustrate to data assimilation experts the value of using such methods in the context of ABMs of complex social systems and the new challenges these types of model present. We work towards these goals within the context of a simple question of practical value: how many people are there in Leeds (or any other major city) right now? We build a hierarchy of exemplar models that we use to demonstrate how to apply the EnKF and calibrate these using open data of footfall counts in Leeds. PMID:27152214
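A minimal EnKF analysis step can be sketched for a one-dimensional "how many people are in the city" state. The prior spread, observation value, and error below are invented, and the stochastic (perturbed-observation) EnKF variant is an assumption; the paper's exact formulation is not reproduced here.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_std, H, rng):
    """Stochastic ensemble Kalman filter analysis step.
    ensemble: (n_members, n_state); H: (n_obs, n_state) observation operator."""
    n, _ = ensemble.shape
    X = ensemble - ensemble.mean(axis=0)            # state anomalies
    Y = X @ H.T                                     # predicted-obs anomalies
    P_yy = Y.T @ Y / (n - 1) + np.eye(H.shape[0]) * obs_std**2
    P_xy = X.T @ Y / (n - 1)
    K = P_xy @ np.linalg.inv(P_yy)                  # Kalman gain
    perturbed = obs + rng.normal(0.0, obs_std, size=(n, H.shape[0]))
    innovations = perturbed - ensemble @ H.T
    return ensemble + innovations @ K.T

rng = np.random.default_rng(0)
prior = rng.normal(10.0, 3.0, size=(200, 1))        # prior counts (thousands)
H = np.array([[1.0]])                               # footfall observes the state directly
posterior = enkf_update(prior, np.array([15.0]), 1.0, H, rng)
print(prior.mean(), posterior.mean())
```

The analysis pulls the ensemble mean toward the (more precise) observation and shrinks the spread, which is the calibration effect the paper exploits for ABM states.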

  7. Endogenizing geopolitical boundaries with agent-based modeling

    PubMed Central

    Cederman, Lars-Erik

    2002-01-01

    Agent-based modeling promises to overcome the reification of actors. Whereas this common, but limiting, assumption makes a lot of sense during periods characterized by stable actor boundaries, other historical junctures, such as the end of the Cold War, exhibit far-reaching and swift transformations of actors' spatial and organizational existence. Moreover, because actors cannot be assumed to remain constant in the long run, analysis of macrohistorical processes virtually always requires “sociational” endogenization. This paper presents a series of computational models, implemented with the software package REPAST, which trace complex macrohistorical transformations of actors, be they hierarchically organized as relational networks or as collections of symbolic categories. With respect to the former, dynamic networks featuring emergent compound actors, with agent compartments represented on a spatial grid, capture organizational domination of the territorial state. In addition, models of “tagged” social processes allow the analyst to show how democratic states predicate their behavior on categorical traits. Finally, categorical schemata that select out politically relevant cultural traits in ethnic landscapes formalize a constructivist notion of national identity in conformance with the qualitative literature on nationalism. This “finite-agent method”, representing both states and nations as higher-level structures superimposed on a lower-level grid of primitive agents or cultural traits, avoids reification of agency. Furthermore, it opens the door to explicit analysis of entity processes, such as the integration and disintegration of actors, as well as boundary transformations. PMID:12011409

  8. Dynamic calibration of agent-based models using data assimilation.

    PubMed

    Ward, Jonathan A; Evans, Andrew J; Malleson, Nicolas S

    2016-04-01

    A widespread approach to investigating the dynamical behaviour of complex social systems is via agent-based models (ABMs). In this paper, we describe how such models can be dynamically calibrated using the ensemble Kalman filter (EnKF), a standard method of data assimilation. Our goal is twofold. First, we want to present the EnKF in a simple setting for the benefit of ABM practitioners who are unfamiliar with it. Second, we want to illustrate to data assimilation experts the value of using such methods in the context of ABMs of complex social systems and the new challenges these types of model present. We work towards these goals within the context of a simple question of practical value: how many people are there in Leeds (or any other major city) right now? We build a hierarchy of exemplar models that we use to demonstrate how to apply the EnKF and calibrate these using open data of footfall counts in Leeds. PMID:27152214

  9. Agent-based modelling of consumer energy choices

    NASA Astrophysics Data System (ADS)

    Rai, Varun; Henry, Adam Douglas

    2016-06-01

    Strategies to mitigate global climate change should be grounded in a rigorous understanding of energy systems, particularly the factors that drive energy demand. Agent-based modelling (ABM) is a powerful tool for representing the complexities of energy demand, such as social interactions and spatial constraints. Unlike other approaches for modelling energy demand, ABM is not limited to studying perfectly rational agents or to abstracting micro details into system-level equations. Instead, ABM provides the ability to represent behaviours of energy consumers -- such as individual households -- using a range of theories, and to examine how the interaction of heterogeneous agents at the micro-level produces macro outcomes of importance to the global climate, such as the adoption of low-carbon behaviours and technologies over space and time. We provide an overview of ABM work in the area of consumer energy choices, with a focus on identifying specific ways in which ABM can improve understanding of both fundamental scientific and applied aspects of the demand side of energy to aid the design of better policies and programmes. Future research needs for improving the practice of ABM to better understand energy demand are also discussed.

  10. Multi-agent based control of large-scale complex systems employing distributed dynamic inference engine

    NASA Astrophysics Data System (ADS)

    Zhang, Daili

    Increasing societal demand for automation has led to considerable efforts to control large-scale complex systems, especially in the area of autonomous intelligent control methods. The control system of a large-scale complex system needs to satisfy four system-level requirements: robustness, flexibility, reusability, and scalability. Corresponding to these four requirements, there arise four major challenges. First, it is difficult to get accurate and complete information. Second, the system may be physically highly distributed. Third, the system evolves very quickly. Fourth, emergent global behaviors of the system can be caused by small disturbances at the component level. The Multi-Agent Based Control (MABC) method, as an implementation of distributed intelligent control, has been the focus of research since the 1970s, in an effort to solve the above-mentioned problems in controlling large-scale complex systems. However, to the author's best knowledge, all MABC systems for large-scale complex systems with significant uncertainties are problem-specific and thus difficult to extend to other domains or larger systems. This situation is partly due to the control architecture of multiple agents being determined by agent-to-agent coupling and interaction mechanisms. Therefore, the research objective of this dissertation is to develop a comprehensive, generalized framework for the control system design of general large-scale complex systems with significant uncertainties, with a focus on distributed control architecture design and distributed inference engine design. A Hybrid Multi-Agent Based Control (HyMABC) architecture is proposed by combining a hierarchical control architecture and a module control architecture with logical replication rings. First, it decomposes a complex system hierarchically; second, it combines the components at the same level into a module, and then designs common interfaces for all of the components in the same module; third, replications

  11. A Framework for Model-Based Inquiry through Agent-Based Programming

    ERIC Educational Resources Information Center

    Xiang, Lin; Passmore, Cynthia

    2015-01-01

    There has been increased recognition in the past decades that model-based inquiry (MBI) is a promising approach for cultivating deep understandings by helping students unite phenomena and underlying mechanisms. Although multiple technology tools have been used to improve the effectiveness of MBI, there are not enough detailed examinations of how…

  12. PLANNING AND RESPONSE IN THE AFTERMATH OF A LARGE CRISIS: AN AGENT-BASED INFORMATICS FRAMEWORK*

    PubMed Central

    Barrett, Christopher; Bisset, Keith; Chandan, Shridhar; Chen, Jiangzhuo; Chungbaek, Youngyun; Eubank, Stephen; Evrenosoğlu, Yaman; Lewis, Bryan; Lum, Kristian; Marathe, Achla; Marathe, Madhav; Mortveit, Henning; Parikh, Nidhi; Phadke, Arun; Reed, Jeffrey; Rivers, Caitlin; Saha, Sudip; Stretz, Paula; Swarup, Samarth; Thorp, James; Vullikanti, Anil; Xie, Dawen

    2014-01-01

    We present a synthetic information and modeling environment that can allow policy makers to study various counter-factual experiments in the event of a large human-initiated crisis. The specific scenario we consider is a ground detonation caused by an improvised nuclear device in a large urban region. In contrast to earlier work in this area that focuses largely on the prompt effects on human health and injury, we focus on co-evolution of individual and collective behavior and its interaction with the differentially damaged infrastructure. This allows us to study short term secondary and tertiary effects. The present environment is suitable for studying the dynamical outcomes over a two week period after the initial blast. A novel computing and data processing architecture is described; the architecture allows us to represent multiple co-evolving infrastructures and social networks at a highly resolved temporal, spatial, and individual scale. The representation allows us to study the emergent behavior of individuals as well as specific strategies to reduce casualties and injuries that exploit the spatial and temporal nature of the secondary and tertiary effects. A number of important conclusions are obtained using the modeling environment. For example, the studies decisively show that deploying ad hoc communication networks to reach individuals in the affected area is likely to have a significant impact on the overall casualties and injuries. PMID:25580055

  13. Experimental Evidence Supported by Simulations of a Very High H{sub 2} Diffusion in Metal Organic Framework Materials

    SciTech Connect

    Salles, F.; Maurin, G.; Jobic, H.; Koza, M. M.; Llewellyn, P. L.; Devic, T.; Serre, C.; Ferey, G.

    2008-06-20

    Quasielastic neutron scattering measurements are combined with molecular dynamics simulations to extract the self-diffusion coefficient of hydrogen in the metal organic frameworks MIL-47(V) and MIL-53(Cr). We find that the diffusivity of hydrogen at low loading is about 2 orders of magnitude higher than in zeolites. Such a high mobility has never been experimentally observed before in any nanoporous materials, although it was predicted in carbon nanotubes. Either 1D or 3D diffusion mechanisms are elucidated depending on the chemical features of the MIL framework.
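
    Self-diffusion coefficients of the kind extracted here are tied to the Einstein relation, MSD(t) = 2dDt, which links a particle's mean squared displacement to its diffusivity in d dimensions. As a generic illustration only (a random-walk toy model with made-up walker counts, not the QENS/MD methodology of the paper), D can be recovered from simulated trajectories like this:

```python
import random

def simulate_walks(n_walkers=2000, n_steps=200, step=1.0):
    """Ensemble of unbiased 1D random walks; returns a list of position
    snapshots, one per time step (all walkers start at the origin)."""
    positions = [[0.0] * n_walkers]
    for _ in range(n_steps):
        positions.append([x + random.choice((-step, step))
                          for x in positions[-1]])
    return positions

def msd(positions):
    """Mean squared displacement from the origin at each time step."""
    n = len(positions[0])
    return [sum(x * x for x in snap) / n for snap in positions]

if __name__ == "__main__":
    random.seed(0)
    curve = msd(simulate_walks())
    # Einstein relation in d dimensions: MSD(t) = 2*d*D*t; here d = 1.
    t_max = len(curve) - 1
    D = curve[-1] / (2 * 1 * t_max)
    print(f"estimated D = {D:.3f} (exact value 0.5 for unit steps per unit time)")
```

    In a molecular dynamics analysis the same fit is done over the linear regime of the MSD curve rather than from a single endpoint.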

  14. Infectio: a Generic Framework for Computational Simulation of Virus Transmission between Cells

    PubMed Central

    Yakimovich, Artur; Yakimovich, Yauhen; Schmid, Michael; Mercer, Jason; Sbalzarini, Ivo F.

    2016-01-01

    ABSTRACT Viruses spread between cells, tissues, and organisms by cell-free and cell-cell mechanisms, depending on the cell type, the nature of the virus, or the phase of the infection cycle. The mode of viral transmission has a large impact on disease development, the outcome of antiviral therapies or the efficacy of gene therapy protocols. The transmission mode of viruses can be addressed in tissue culture systems using live-cell imaging. Yet even in relatively simple cell cultures, the mechanisms of viral transmission are difficult to distinguish. Here we present a cross-platform software framework called “Infectio,” which is capable of simulating transmission phenotypes in tissue culture of virtually any virus. Infectio can estimate interdependent biological parameters, for example for vaccinia virus infection, and differentiate between cell-cell and cell-free virus spreading. Infectio assists in elucidating virus transmission mechanisms, a feature useful for designing strategies of perturbing or enhancing viral transmission. The complexity of the Infectio software is low compared to that of other software commonly used to quantitate features of cell biological images, which yields stable and relatively error-free output from Infectio. The software is open source (GPLv3 license), and operates on the major platforms (Windows, Mac, and Linux). The complete source code can be downloaded from http://infectio.github.io/index.html. IMPORTANCE Infectio presents a generalized platform to analyze virus infection spread between cells. It allows the simulation of plaque phenotypes from image-based assays. Viral plaques are the result of virus spreading from primary infected cells to neighboring cells. This is a complex process and involves neighborhood effects at cell-cell contact sites or fluid dynamics in the extracellular medium. Infectio differentiates between two major modes of virus transmission between cells, allowing in silico testing of hypotheses about
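
    Plaque growth of the kind Infectio simulates can be caricatured with a toy grid model. The sketch below is not Infectio's actual API; the probabilities are invented for illustration of how cell-cell (neighbour) and cell-free (long-range) transmission combine:

```python
import random

def spread(grid=40, steps=25, p_contact=0.35, p_free=0.002, seed=1):
    """Toy plaque-growth model on a grid of cells. Each step, every infected
    cell infects each of its 4 neighbours with probability p_contact
    (cell-cell transmission) and a uniformly random cell with probability
    p_free (cell-free transmission through the medium)."""
    random.seed(seed)
    infected = {(grid // 2, grid // 2)}  # a single primary infected cell
    for _ in range(steps):
        new = set()
        for i, j in infected:
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < grid and 0 <= nj < grid and random.random() < p_contact:
                    new.add((ni, nj))
            if random.random() < p_free:
                new.add((random.randrange(grid), random.randrange(grid)))
        infected |= new
    return infected

if __name__ == "__main__":
    plaque = spread()
    print(f"{len(plaque)} of {40 * 40} cells infected")
```

    Raising p_free fragments the plaque into satellite foci rather than one compact patch, which is the kind of phenotype difference between transmission modes that Infectio is designed to distinguish.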

  15. Infectio: a Generic Framework for Computational Simulation of Virus Transmission between Cells.

    PubMed

    Yakimovich, Artur; Yakimovich, Yauhen; Schmid, Michael; Mercer, Jason; Sbalzarini, Ivo F; Greber, Urs F

    2016-01-01

    Viruses spread between cells, tissues, and organisms by cell-free and cell-cell mechanisms, depending on the cell type, the nature of the virus, or the phase of the infection cycle. The mode of viral transmission has a large impact on disease development, the outcome of antiviral therapies or the efficacy of gene therapy protocols. The transmission mode of viruses can be addressed in tissue culture systems using live-cell imaging. Yet even in relatively simple cell cultures, the mechanisms of viral transmission are difficult to distinguish. Here we present a cross-platform software framework called "Infectio," which is capable of simulating transmission phenotypes in tissue culture of virtually any virus. Infectio can estimate interdependent biological parameters, for example for vaccinia virus infection, and differentiate between cell-cell and cell-free virus spreading. Infectio assists in elucidating virus transmission mechanisms, a feature useful for designing strategies of perturbing or enhancing viral transmission. The complexity of the Infectio software is low compared to that of other software commonly used to quantitate features of cell biological images, which yields stable and relatively error-free output from Infectio. The software is open source (GPLv3 license), and operates on the major platforms (Windows, Mac, and Linux). The complete source code can be downloaded from http://infectio.github.io/index.html. IMPORTANCE Infectio presents a generalized platform to analyze virus infection spread between cells. It allows the simulation of plaque phenotypes from image-based assays. Viral plaques are the result of virus spreading from primary infected cells to neighboring cells. This is a complex process and involves neighborhood effects at cell-cell contact sites or fluid dynamics in the extracellular medium. Infectio differentiates between two major modes of virus transmission between cells, allowing in silico testing of hypotheses about spreading

  16. On Complexities of Impact Simulation of Fiber Reinforced Polymer Composites: A Simplified Modeling Framework

    PubMed Central

    Alemi-Ardakani, M.; Milani, A. S.; Yannacopoulos, S.

    2014-01-01

    Impact modeling of fiber reinforced polymer composites is a complex and challenging task, in particular for practitioners with less experience in advanced coding and user-defined subroutines. Different numerical algorithms have been developed over the past decades for impact modeling of composites, yet a considerable gap often exists between predicted and experimental observations. In this paper, after a review of reported sources of complexities in impact modeling of fiber reinforced polymer composites, two simplified approaches are presented for fast simulation of out-of-plane impact response of these materials considering four main effects: (a) strain rate dependency of the mechanical properties, (b) difference between tensile and flexural bending responses, (c) delamination, and (d) the geometry of fixture (clamping conditions). In the first approach, it is shown that by applying correction factors to the quasistatic material properties, which are often readily available from material datasheets, the role of these four sources in modeling impact response of a given composite may be accounted for. As a result a rough estimation of the dynamic force response of the composite can be attained. To show the application of the approach, a twill woven polypropylene/glass reinforced thermoplastic composite laminate has been tested under 200 J impact energy and was modeled in Abaqus/Explicit via the built-in Hashin damage criteria. X-ray microtomography was used to investigate the presence of delamination inside the impacted sample. Finally, as a second and much simpler modeling approach it is shown that applying only a single correction factor over all material properties at once can still yield a reasonable prediction. Both advantages and limitations of the simplified modeling framework are addressed in the performed case study. PMID:25431787

  17. On complexities of impact simulation of fiber reinforced polymer composites: a simplified modeling framework.

    PubMed

    Alemi-Ardakani, M; Milani, A S; Yannacopoulos, S

    2014-01-01

    Impact modeling of fiber reinforced polymer composites is a complex and challenging task, in particular for practitioners with less experience in advanced coding and user-defined subroutines. Different numerical algorithms have been developed over the past decades for impact modeling of composites, yet a considerable gap often exists between predicted and experimental observations. In this paper, after a review of reported sources of complexities in impact modeling of fiber reinforced polymer composites, two simplified approaches are presented for fast simulation of out-of-plane impact response of these materials considering four main effects: (a) strain rate dependency of the mechanical properties, (b) difference between tensile and flexural bending responses, (c) delamination, and (d) the geometry of fixture (clamping conditions). In the first approach, it is shown that by applying correction factors to the quasistatic material properties, which are often readily available from material datasheets, the role of these four sources in modeling impact response of a given composite may be accounted for. As a result a rough estimation of the dynamic force response of the composite can be attained. To show the application of the approach, a twill woven polypropylene/glass reinforced thermoplastic composite laminate has been tested under 200 J impact energy and was modeled in Abaqus/Explicit via the built-in Hashin damage criteria. X-ray microtomography was used to investigate the presence of delamination inside the impacted sample. Finally, as a second and much simpler modeling approach it is shown that applying only a single correction factor over all material properties at once can still yield a reasonable prediction. Both advantages and limitations of the simplified modeling framework are addressed in the performed case study. PMID:25431787

  19. KMCLib: A general framework for lattice kinetic Monte Carlo (KMC) simulations

    NASA Astrophysics Data System (ADS)

    Leetmaa, Mikael; Skorodumova, Natalia V.

    2014-09-01

    KMCLib is a general framework for lattice kinetic Monte Carlo (KMC) simulations. The program can handle simulations of the diffusion and reaction of millions of particles in one, two, or three dimensions, and is designed to be easily extended and customized by the user to allow for the development of complex custom KMC models for specific systems without having to modify the core functionality of the program. Analysis modules and on-the-fly elementary step diffusion rate calculations can be implemented as plugins following a well-defined API. The plugin modules are loosely coupled to the core KMCLib program via the Python scripting language. KMCLib is written as a Python module with a backend C++ library. After initial compilation of the backend library KMCLib is used as a Python module; input to the program is given as a Python script executed using a standard Python interpreter. We give a detailed description of the features and implementation of the code and demonstrate its scaling behavior and parallel performance with a simple one-dimensional A-B-C lattice KMC model and a more complex three-dimensional lattice KMC model of oxygen-vacancy diffusion in a fluorite structured metal oxide. KMCLib can keep track of individual particle movements and includes tools for mean square displacement analysis, and is therefore particularly well suited for studying diffusion processes at surfaces and in solids. Catalogue identifier: AESZ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AESZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 49 064 No. of bytes in distributed program, including test data, etc.: 1 575 172 Distribution format: tar.gz Programming language: Python and C++. Computer: Any computer that can run a C++ compiler and a Python interpreter. Operating system: Tested on Ubuntu 12
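
    The core loop of a lattice KMC code of this kind is the rejection-free ("n-fold way"/BKL) algorithm: enumerate the enabled events, advance the clock by an exponential waiting time drawn from the total rate, and execute one event. The sketch below is a generic illustration in plain Python with invented parameters, not KMCLib's API:

```python
import math
import random

def kmc_hop_1d(n_sites=50, n_particles=10, rate=1.0, t_end=5.0, seed=2):
    """Rejection-free lattice KMC: particles hop to left/right neighbours on a
    periodic 1D lattice at a fixed rate, blocked when the target is occupied.
    Returns the final particle positions and the simulated time."""
    random.seed(seed)
    pos = random.sample(range(n_sites), n_particles)
    occ = [False] * n_sites
    for p in pos:
        occ[p] = True
    t = 0.0
    while t < t_end:
        # 1. Enumerate all enabled events: (particle index, empty target site).
        events = [(k, q) for k, p in enumerate(pos)
                  for q in ((p - 1) % n_sites, (p + 1) % n_sites)
                  if not occ[q]]
        if not events:
            break
        total_rate = rate * len(events)
        # 2. Advance the clock by an exponential waiting time.
        t += -math.log(1.0 - random.random()) / total_rate
        # 3. Pick one event (uniformly, since all rates here are equal) and do it.
        k, q = random.choice(events)
        occ[pos[k]], occ[q] = False, True
        pos[k] = q
    return pos, t

if __name__ == "__main__":
    final_pos, t = kmc_hop_1d()
    print(f"{len(final_pos)} particles after simulated time t = {t:.2f}")
```

    A production framework like KMCLib generalizes this loop with per-event rates, on-the-fly rate calculations, and efficient event bookkeeping instead of full re-enumeration.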

  20. Agent Based Modeling of Human Gut Microbiome Interactions and Perturbations

    PubMed Central

    Shashkova, Tatiana; Popenko, Anna; Tyakht, Alexander; Peskov, Kirill; Kosinsky, Yuri; Bogolubsky, Lev; Raigorodskii, Andrei; Ischenko, Dmitry; Alexeev, Dmitry; Govorun, Vadim

    2016-01-01

    Background Intestinal microbiota plays an important role in human health. It is involved in digestion and protects the host against external pathogens. Examination of intestinal microbiome interactions is required to understand the community's influence on host health. Studies of the microbiome can provide insight into methods of improving health, including specific clinical procedures for modifying an individual's microbial community composition and microbiota correction by colonizing with new bacterial species or through dietary changes. Methodology/Principal Findings In this work we report an agent-based model of interactions between two bacterial species and between those species and the gut. The model is based on reactions describing bacterial fermentation of polysaccharides to acetate and propionate and fermentation of acetate to butyrate. Antibiotic treatment was chosen as the disturbance factor and used to investigate the stability of the system. System recovery after antibiotic treatment was analyzed as a function of the number of feedback interactions inside the community, the duration of therapy, and the amount of antibiotics. Bacterial species are known to mutate and acquire resistance to antibiotics. The ability to mutate was treated as a stochastic process; under this assumption, the ratio of sensitive to resistant bacteria was calculated during antibiotic therapy and recovery. Conclusion/Significance The model confirms the hypothesis that feedback mechanisms are necessary for the functionality and stability of the system after a disturbance. A high fraction of the bacterial community was shown to mutate during antibiotic treatment, though sensitive strains could become dominant after recovery. The recovery of sensitive strains is explained by the fitness cost of resistance. The model demonstrates not only the quantitative dynamics of the bacterial species, but also the ability to observe the emergent spatial structure and its alteration depending on various feedback mechanisms.
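
    The antibiotic-perturbation experiment described above can be caricatured with a very small stochastic model. The sketch below is not the authors' model; all rates and update rules are invented for illustration of how resistant mutants arise under treatment and then pay a fitness cost during recovery:

```python
import random

def antibiotic_course(n0=1000, mut_rate=1e-3, kill=0.4, growth=0.3,
                      fitness_cost=0.1, t_treat=20, t_recover=30, seed=3):
    """Toy stochastic model of a bacterial population under antibiotics.
    Sensitive cells are killed during treatment and may mutate to a
    resistant type, which pays a growth-rate fitness cost.
    Returns a per-step history of (sensitive, resistant) counts."""
    random.seed(seed)
    sens, res = n0, 0
    history = []
    for t in range(t_treat + t_recover):
        # Mutation: each sensitive cell may switch to the resistant type.
        mutants = sum(random.random() < mut_rate for _ in range(sens))
        sens -= mutants
        res += mutants
        # Antibiotic kills only sensitive cells during the treatment window.
        if t < t_treat:
            sens = sum(random.random() >= kill for _ in range(sens))
        # Division: resistance reduces the effective growth probability.
        sens += sum(random.random() < growth for _ in range(sens))
        res += sum(random.random() < growth - fitness_cost for _ in range(res))
        history.append((sens, res))
    return history

if __name__ == "__main__":
    hist = antibiotic_course()
    print("end of treatment (sensitive, resistant):", hist[19])
    print("after recovery   (sensitive, resistant):", hist[-1])
```

    Even this toy shows the qualitative pattern reported in the abstract: resistant cells dominate while the drug is present, but the surviving sensitive strain can outgrow them again once treatment stops, because its growth rate is not penalized.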

  1. A comprehensive simulation framework for imaging single particles and biomolecules at the European X-ray Free-Electron Laser

    PubMed Central

    Yoon, Chun Hong; Yurkov, Mikhail V.; Schneidmiller, Evgeny A.; Samoylova, Liubov; Buzmakov, Alexey; Jurek, Zoltan; Ziaja, Beata; Santra, Robin; Loh, N. Duane; Tschentscher, Thomas; Mancuso, Adrian P.

    2016-01-01

    The advent of newer, brighter, and more coherent X-ray sources, such as X-ray Free-Electron Lasers (XFELs), represents a tremendous growth in the potential to apply coherent X-rays to determine the structure of materials from the micron-scale down to the Angstrom-scale. There is a significant need for a multi-physics simulation framework to perform source-to-detector simulations for a single particle imaging experiment, including (i) the multidimensional simulation of the X-ray source; (ii) simulation of the wave-optics propagation of the coherent XFEL beams; (iii) atomistic modelling of photon-material interactions; (iv) simulation of the time-dependent diffraction process, including incoherent scattering; (v) assembling noisy and incomplete diffraction intensities into a three-dimensional data set using the Expansion-Maximisation-Compression (EMC) algorithm and (vi) phase retrieval to obtain structural information. We demonstrate the framework by simulating a single-particle experiment for a nitrogenase iron protein using parameters of the SPB/SFX instrument of the European XFEL. This exercise demonstrably yields interpretable consequences for structure determination that are crucial yet currently unavailable for experiment design. PMID:27109208

  2. A comprehensive simulation framework for imaging single particles and biomolecules at the European X-ray Free-Electron Laser.

    PubMed

    Yoon, Chun Hong; Yurkov, Mikhail V; Schneidmiller, Evgeny A; Samoylova, Liubov; Buzmakov, Alexey; Jurek, Zoltan; Ziaja, Beata; Santra, Robin; Loh, N Duane; Tschentscher, Thomas; Mancuso, Adrian P

    2016-01-01

    The advent of newer, brighter, and more coherent X-ray sources, such as X-ray Free-Electron Lasers (XFELs), represents a tremendous growth in the potential to apply coherent X-rays to determine the structure of materials from the micron-scale down to the Angstrom-scale. There is a significant need for a multi-physics simulation framework to perform source-to-detector simulations for a single particle imaging experiment, including (i) the multidimensional simulation of the X-ray source; (ii) simulation of the wave-optics propagation of the coherent XFEL beams; (iii) atomistic modelling of photon-material interactions; (iv) simulation of the time-dependent diffraction process, including incoherent scattering; (v) assembling noisy and incomplete diffraction intensities into a three-dimensional data set using the Expansion-Maximisation-Compression (EMC) algorithm and (vi) phase retrieval to obtain structural information. We demonstrate the framework by simulating a single-particle experiment for a nitrogenase iron protein using parameters of the SPB/SFX instrument of the European XFEL. This exercise demonstrably yields interpretable consequences for structure determination that are crucial yet currently unavailable for experiment design. PMID:27109208

  3. A comprehensive simulation framework for imaging single particles and biomolecules at the European X-ray Free-Electron Laser

    DOE PAGES

    Yoon, Chun Hong; Yurkov, Mikhail V.; Schneidmiller, Evgeny A.; Samoylova, Liubov; Buzmakov, Alexey; Jurek, Zoltan; Ziaja, Beata; Santra, Robin; Loh, N. Duane; Tschentscher, Thomas; et al

    2016-04-25

    The advent of newer, brighter, and more coherent X-ray sources, such as X-ray Free-Electron Lasers (XFELs), represents a tremendous growth in the potential to apply coherent X-rays to determine the structure of materials from the micron-scale down to the Angstrom-scale. There is a significant need for a multi-physics simulation framework to perform source-to-detector simulations for a single particle imaging experiment, including (i) the multidimensional simulation of the X-ray source; (ii) simulation of the wave-optics propagation of the coherent XFEL beams; (iii) atomistic modelling of photon-material interactions; (iv) simulation of the time-dependent diffraction process, including incoherent scattering; (v) assembling noisy and incomplete diffraction intensities into a three-dimensional data set using the Expansion-Maximisation-Compression (EMC) algorithm and (vi) phase retrieval to obtain structural information. Furthermore, we demonstrate the framework by simulating a single-particle experiment for a nitrogenase iron protein using parameters of the SPB/SFX instrument of the European XFEL. This exercise demonstrably yields interpretable consequences for structure determination that are crucial yet currently unavailable for experiment design.

  4. A comprehensive simulation framework for imaging single particles and biomolecules at the European X-ray Free-Electron Laser

    NASA Astrophysics Data System (ADS)

    Yoon, Chun Hong; Yurkov, Mikhail V.; Schneidmiller, Evgeny A.; Samoylova, Liubov; Buzmakov, Alexey; Jurek, Zoltan; Ziaja, Beata; Santra, Robin; Loh, N. Duane; Tschentscher, Thomas; Mancuso, Adrian P.

    2016-04-01

    The advent of newer, brighter, and more coherent X-ray sources, such as X-ray Free-Electron Lasers (XFELs), represents a tremendous growth in the potential to apply coherent X-rays to determine the structure of materials from the micron-scale down to the Angstrom-scale. There is a significant need for a multi-physics simulation framework to perform source-to-detector simulations for a single particle imaging experiment, including (i) the multidimensional simulation of the X-ray source; (ii) simulation of the wave-optics propagation of the coherent XFEL beams; (iii) atomistic modelling of photon-material interactions; (iv) simulation of the time-dependent diffraction process, including incoherent scattering; (v) assembling noisy and incomplete diffraction intensities into a three-dimensional data set using the Expansion-Maximisation-Compression (EMC) algorithm and (vi) phase retrieval to obtain structural information. We demonstrate the framework by simulating a single-particle experiment for a nitrogenase iron protein using parameters of the SPB/SFX instrument of the European XFEL. This exercise demonstrably yields interpretable consequences for structure determination that are crucial yet currently unavailable for experiment design.

  5. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach

    PubMed Central

    2016-01-01

    Background Computer networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. Purpose It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. Method We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. Results The conducted experiments demonstrated two important results. First, a CABC-based modeling approach such as Agent-based Modeling can be effective for modeling complex problems in the domain of IoT. Second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach. PMID:26812235

  7. Agent-Based Model with Asymmetric Trading and Herding for Complex Financial Systems

    PubMed Central

    Chen, Jun-Jie; Zheng, Bo; Tan, Lei

    2013-01-01

    Background For complex financial systems, the negative and positive return-volatility correlations, i.e., the so-called leverage and anti-leverage effects, are particularly important for the understanding of the price dynamics. However, the microscopic origination of the leverage and anti-leverage effects is still not understood, and how to produce these effects in agent-based modeling remains open. On the other hand, in constructing microscopic models, it is a promising conception to determine model parameters from empirical data rather than from statistical fitting of the results. Methods To study the microscopic origination of the return-volatility correlation in financial systems, we take into account the individual and collective behaviors of investors in real markets, and construct an agent-based model. The agents are linked with each other and trade in groups, and particularly, two novel microscopic mechanisms, i.e., investors’ asymmetric trading and herding in bull and bear markets, are introduced. Further, we propose effective methods to determine the key parameters in our model from historical market data. Results With the model parameters determined for six representative stock-market indices in the world, respectively, we obtain the corresponding leverage or anti-leverage effect from the simulation, and the effect is in agreement with the empirical one on amplitude and duration. At the same time, our model produces other features of the real markets, such as the fat-tail distribution of returns and the long-term correlation of volatilities. Conclusions We reveal that for the leverage and anti-leverage effects, both the investors’ asymmetric trading and herding are essential generation mechanisms. Among the six markets, however, the investors’ trading is approximately symmetric for the five markets which exhibit the leverage effect, thus contributing very little. These two microscopic mechanisms and the methods for the determination of the key
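
    The leverage effect that this model reproduces is usually quantified as a return-volatility correlation, for example L(τ) = ⟨r(t) r(t+τ)²⟩ / ⟨r²⟩² (one common normalisation convention). The sketch below computes this quantity on a synthetic return series whose dynamics are invented purely for illustration, not fitted to any market:

```python
import random

def leverage_corr(returns, tau):
    """L(tau) = <r(t) * r(t+tau)^2> / <r^2>^2; negative values at tau > 0
    indicate the leverage effect (volatility rises after price drops)."""
    n = len(returns) - tau
    num = sum(returns[t] * returns[t + tau] ** 2 for t in range(n)) / n
    z = (sum(r * r for r in returns) / len(returns)) ** 2
    return num / z

if __name__ == "__main__":
    random.seed(4)
    # Synthetic returns with built-in asymmetry: volatility is pumped up
    # after negative returns (invented dynamics for illustration only).
    r, vol = [], 1.0
    for _ in range(20000):
        x = random.gauss(0.0, vol)
        r.append(x)
        vol = 0.8 * vol + 0.2 * (1.0 + (1.0 if x < 0 else 0.0))
    print("L(1) =", round(leverage_corr(r, 1), 3), "(negative => leverage effect)")
```

    In an agent-based study the same estimator would be applied to the simulated price series and compared against the empirical curve on both amplitude and decay time.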

  8. Learning Natural Selection in 4th Grade with Multi-Agent-Based Computational Models

    NASA Astrophysics Data System (ADS)

    Dickes, Amanda Catherine; Sengupta, Pratim

    2013-06-01

    In this paper, we investigate how elementary school students develop multi-level explanations of population dynamics in a simple predator-prey ecosystem, through scaffolded interactions with a multi-agent-based computational model (MABM). The term "agent" in an MABM indicates individual computational objects or actors (e.g., cars), and these agents obey simple rules assigned or manipulated by the user (e.g., speeding up, slowing down, etc.). It is the interactions between these agents, based on the rules assigned by the user, that give rise to emergent, aggregate-level behavior (e.g., formation and movement of the traffic jam). Natural selection is such an emergent phenomenon, which has been shown to be challenging for novices (K16 students) to understand. Whereas prior research on learning evolutionary phenomena with MABMs has typically focused on high school students and beyond, we investigate how elementary students (4th graders) develop multi-level explanations of some introductory aspects of natural selection—species differentiation and population change—through scaffolded interactions with an MABM that simulates predator-prey dynamics in a simple birds-butterflies ecosystem. We conducted a semi-clinical interview based study with ten participants, in which we focused on the following: a) identifying the nature of learners' initial interpretations of salient events or elements of the represented phenomena, b) identifying the roles these interpretations play in the development of their multi-level explanations, and c) how attending to different levels of the relevant phenomena can make explicit different mechanisms to the learners. In addition, our analysis also shows that although there were differences between high- and low-performing students (in terms of being able to explain population-level behaviors) in the pre-test, these differences disappeared in the post-test.
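
    The agents-to-emergence idea described above, where simple per-agent rules produce aggregate population dynamics, can be sketched with a minimal predator-prey ABM. Every parameter below is hypothetical and chosen only to keep the toy model stable; it is not the birds-butterflies model used in the study:

```python
import random
from collections import Counter

def simulate(n_prey=200, n_pred=50, width=30, height=30, ticks=50,
             p_prey_birth=0.2, p_pred_death=0.1, capacity=2000, seed=5):
    """Toy predator-prey ABM on a torus grid. Each tick: every agent takes
    one random step; a predator sharing a cell with prey eats one and
    reproduces; unfed predators die with probability p_pred_death; prey
    reproduce while the population is below a carrying capacity.
    Returns the (prey, predator) counts per tick."""
    random.seed(seed)
    moves = ((1, 0), (-1, 0), (0, 1), (0, -1))

    def walk(c):
        dx, dy = random.choice(moves)
        return ((c[0] + dx) % width, (c[1] + dy) % height)

    rand_cell = lambda: (random.randrange(width), random.randrange(height))
    prey = [rand_cell() for _ in range(n_prey)]
    pred = [rand_cell() for _ in range(n_pred)]
    history = [(len(prey), len(pred))]
    for _ in range(ticks):
        prey = [walk(c) for c in prey]
        pred = [walk(c) for c in pred]
        left = Counter(prey)          # prey remaining per cell
        next_pred = []
        for c in pred:
            if left[c] > 0:           # eat one prey, survive and reproduce
                left[c] -= 1
                next_pred += [c, c]
            elif random.random() >= p_pred_death:
                next_pred.append(c)
        prey = list(left.elements())
        if len(prey) < capacity:      # density-limited prey reproduction
            prey += [c for c in prey if random.random() < p_prey_birth]
        pred = next_pred
        history.append((len(prey), len(pred)))
    return history

if __name__ == "__main__":
    hist = simulate()
    print("(prey, predators) at start:", hist[0], " at end:", hist[-1])
```

    The population-level boom-and-crash behaviour emerges from the per-agent rules alone; nothing in the code manipulates aggregate counts directly, which is exactly the multi-level relationship the interviewed students were asked to explain.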

  9. Combination HIV Prevention among MSM in South Africa: Results from Agent-based Modeling

    PubMed Central

    Brookmeyer, Ron; Boren, David; Baral, Stefan D.; Bekker, Linda- Gail; Phaswana-Mafuya, Nancy; Beyrer, Chris; Sullivan, Patrick S.

    2014-01-01

    HIV prevention trials have demonstrated the effectiveness of a number of behavioral and biomedical interventions. HIV prevention packages are combinations of interventions and offer potential to significantly increase the effectiveness of any single intervention. Estimates of the effectiveness of prevention packages are important for guiding the development of prevention strategies and for characterizing effect sizes before embarking on large scale trials. Unfortunately, most research to date has focused on testing single interventions rather than HIV prevention packages. Here we report the results from agent-based modeling of the effectiveness of HIV prevention packages for men who have sex with men (MSM) in South Africa. We consider packages consisting of four components: antiretroviral therapy for HIV infected persons with CD4 count <350; PrEP for high risk uninfected persons; behavioral interventions to reduce rates of unprotected anal intercourse (UAI); and campaigns to increase HIV testing. We considered 163 HIV prevention packages corresponding to different intensity levels of the four components. We performed 2252 simulation runs of our agent-based model to evaluate those packages. We found that a four component package consisting of a 15% reduction in the rate of UAI, 50% PrEP coverage of high risk uninfected persons, 50% reduction in persons who never test for HIV, and 50% ART coverage over and above persons already receiving ART at baseline, could prevent 33.9% of infections over 5 years (95% confidence interval, 31.5, 36.3). The package components with the largest incremental prevention effects were UAI reduction and PrEP coverage. The impact of increased HIV testing was magnified in the presence of PrEP. We find that HIV prevention packages that include both behavioral and biomedical components can in combination prevent significant numbers of infections with levels of coverage, acceptance and adherence that are potentially achievable among MSM in

  10. Combination HIV prevention among MSM in South Africa: results from agent-based modeling.

    PubMed

    Brookmeyer, Ron; Boren, David; Baral, Stefan D; Bekker, Linda-Gail; Phaswana-Mafuya, Nancy; Beyrer, Chris; Sullivan, Patrick S

    2014-01-01

    HIV prevention trials have demonstrated the effectiveness of a number of behavioral and biomedical interventions. HIV prevention packages are combinations of interventions and offer potential to significantly increase the effectiveness of any single intervention. Estimates of the effectiveness of prevention packages are important for guiding the development of prevention strategies and for characterizing effect sizes before embarking on large scale trials. Unfortunately, most research to date has focused on testing single interventions rather than HIV prevention packages. Here we report the results from agent-based modeling of the effectiveness of HIV prevention packages for men who have sex with men (MSM) in South Africa. We consider packages consisting of four components: antiretroviral therapy for HIV infected persons with CD4 count <350; PrEP for high risk uninfected persons; behavioral interventions to reduce rates of unprotected anal intercourse (UAI); and campaigns to increase HIV testing. We considered 163 HIV prevention packages corresponding to different intensity levels of the four components. We performed 2252 simulation runs of our agent-based model to evaluate those packages. We found that a four component package consisting of a 15% reduction in the rate of UAI, 50% PrEP coverage of high risk uninfected persons, 50% reduction in persons who never test for HIV, and 50% ART coverage over and above persons already receiving ART at baseline, could prevent 33.9% of infections over 5 years (95% confidence interval, 31.5, 36.3). The package components with the largest incremental prevention effects were UAI reduction and PrEP coverage. The impact of increased HIV testing was magnified in the presence of PrEP. We find that HIV prevention packages that include both behavioral and biomedical components can in combination prevent significant numbers of infections with levels of coverage, acceptance and adherence that are potentially achievable among MSM in

  11. Holistic flood risk assessment using agent-based modelling: the case of Sint Maarten Island

    NASA Astrophysics Data System (ADS)

    Abayneh Abebe, Yared; Vojinovic, Zoran; Nikolic, Igor; Hammond, Michael; Sanchez, Arlex; Pelling, Mark

    2015-04-01

    Floods in coastal regions are regarded as one of the most dangerous and harmful disasters. Though commonly referred to as natural disasters, coastal floods are also attributable to various social, economic, historical and political issues. Rapid urbanisation in coastal areas combined with climate change and poor governance can lead to a significant increase in the risk of pluvial flooding coinciding with fluvial and coastal flooding posing a greater risk of devastation in coastal communities. Disasters that can be triggered by hydro-meteorological events are interconnected and interrelated with both human activities and natural processes. They, therefore, require holistic approaches to help understand their complexity in order to design and develop adaptive risk management approaches that minimise social and economic losses and environmental impacts, and increase resilience to such events. Being located in the North Atlantic Ocean, Sint Maarten is frequently subjected to hurricanes. In addition, the stormwater catchments and streams on Sint Maarten have several unique characteristics that contribute to the severity of flood-related impacts. Urban environments are usually situated in low-lying areas, with little consideration for stormwater drainage, and as such are subject to flash flooding. Hence, Sint Maarten authorities drafted policies to minimise the risk of flood-related disasters on the island. In this study, an agent-based model is designed and applied to understand the implications of introduced policies and regulations, and to understand how different actors' behaviours influence the formation, propagation and accumulation of flood risk. The agent-based model built for this study is based on the MAIA meta-model, which helps to decompose, structure and conceptualize socio-technical systems with an agent-oriented perspective, and is developed using the NetLogo simulation environment. The agents described in this model are households and businesses, and

  12. A task-oriented modular and agent-based collaborative design mechanism for distributed product development

    NASA Astrophysics Data System (ADS)

    Liu, Jinfei; Chen, Ming; Wang, Lei; Wu, Qidi

    2014-05-01

    The rapid expansion of enterprises makes product collaborative design (PCD) a critical issue in distributed heterogeneous environments, but as large-scale networked collaborative tasks become more complicated, neither unified task decomposition and allocation methodologies nor Agent-based network management platforms can satisfy the increasing demands. In this paper, to meet the requirements of PCD for distributed product development, a collaborative design mechanism based on modularity and Agent technology is presented. First, a top-down 4-tier process model based on task-oriented modules and Agents is constructed for PCD after analyzing the mapping relationships between requirements and functions in collaborative design. Second, on the basis of sub-task decomposition for PCD using a mixed method, a mathematical model of task-oriented modularization based on multi-objective optimization is established to maximize the module cohesion degree and minimize the module coupling degree, while treating the module executability degree as a constraint. The mathematical model is optimized and simulated by a modified PSO, and the decomposed modules are obtained. Finally, the Agent structure model for collaborative design is put forward, and the optimal matching Agents are selected using a similarity algorithm to implement the different task-modules through an integrated reasoning and decision-making mechanism with the behavioral model of collaborative design Agents. Experimental studies on automobile collaborative design verify the feasibility and efficiency of this task-oriented modular, Agent-based collaborative design methodology in a distributed heterogeneous environment. On this basis, an integrative automobile collaborative R&D platform is developed. This research provides an effective platform for automobile manufacturing enterprises to achieve PCD, and helps to promote product numeralization collaborative R&D and

  13. Agent Based Modeling of Atherosclerosis: A Concrete Help in Personalized Treatments

    NASA Astrophysics Data System (ADS)

    Pappalardo, Francesco; Cincotti, Alessandro; Motta, Alfredo; Pennisi, Marzio

    Atherosclerosis, a pathology affecting arterial blood vessels, is one of the most common diseases of developed countries. We present studies of increased atherosclerosis risk using an agent-based model of atherogenesis that has been previously validated against clinical data. It is well known that the major risk factor in atherosclerosis is a persistently high level of low density lipoprotein (LDL) concentration. However, it is not known whether a short period of high LDL concentration can cause irreversible damage, or whether reduction of the LDL concentration (through either lifestyle changes or drugs) can drastically or partially reduce the already acquired risk. We simulated four different clinical situations in a large set of virtual patients (200 per clinical scenario). In the first, the patients' lifestyle maintains the LDL concentration in the no-risk range; this is the control case simulation. The second case is represented by patients with high LDL levels and a delay in applying appropriate treatments. The third scenario is characterized by patients with high LDL levels treated with specific drugs such as statins. Finally, we simulated patients subject to several oxidative events (smoking, a sedentary lifestyle, consumption of alcoholic drinks, and so forth) that effectively increase the risk of LDL oxidation. These preliminary results obviously need to be clinically investigated. It is clear, however, that SimAthero has the power to concretely help medical doctors and clinicians in choosing personalized treatments for the prevention of atherosclerosis damage.
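
    The four-scenario cohort design described above can be sketched with a toy model. The sketch below is purely illustrative and is not the validated SimAthero model: the damage rule, thresholds, and rates are invented assumptions, chosen only to show how scenario comparison over 200 virtual patients per arm might be organised.

```python
import random

def simulate_patient(months, ldl, oxidative_rate, rng):
    """Toy accumulation of irreversible oxidised-LDL damage: each month,
    a random oxidative event converts any above-threshold LDL into
    plaque 'damage'. Purely illustrative; not the SimAthero model."""
    damage = 0.0
    for m in range(months):
        if rng.random() < oxidative_rate:
            damage += max(ldl(m) - 1.0, 0.0)  # only above-threshold LDL oxidises
    return damage

def cohort_mean(ldl, oxidative_rate=0.1, n=200, months=120, seed=0):
    """Mean damage over a cohort of virtual patients (one rng per cohort,
    so scenarios see identical sequences of oxidative events)."""
    rng = random.Random(seed)
    return sum(simulate_patient(months, ldl, oxidative_rate, rng)
               for _ in range(n)) / n

control   = cohort_mean(lambda m: 0.9)                     # LDL kept in the no-risk range
untreated = cohort_mean(lambda m: 1.8)                     # persistently high LDL
statin    = cohort_mean(lambda m: 1.8 if m < 24 else 1.0)  # treated after a 2-year delay
```

    Sharing one random stream per cohort makes the arms directly comparable: the delayed-treatment arm accumulates damage only during its untreated window, so its mean falls between the control and untreated arms.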

  14. A simulation framework for auditory discrimination experiments: Revealing the importance of across-frequency processing in speech perception.

    PubMed

    Schädler, Marc René; Warzybok, Anna; Ewert, Stephan D; Kollmeier, Birger

    2016-05-01

    A framework for simulating auditory discrimination experiments, based on an approach from Schädler, Warzybok, Hochmuth, and Kollmeier [(2015). Int. J. Audiol. 54, 100-107] which was originally designed to predict speech recognition thresholds, is extended to also predict psychoacoustic thresholds. The proposed framework is used to assess the suitability of different auditory-inspired feature sets for a range of auditory discrimination experiments that included psychoacoustic as well as speech recognition experiments in noise. The considered experiments were 2 kHz tone-in-broadband-noise simultaneous masking depending on the tone length, spectral masking with simultaneously presented tone signals and narrow-band noise maskers, and German Matrix sentence test reception threshold in stationary and modulated noise. The employed feature sets included spectro-temporal Gabor filter bank features, Mel-frequency cepstral coefficients, logarithmically scaled Mel-spectrograms, and the internal representation of the Perception Model from Dau, Kollmeier, and Kohlrausch [(1997). J. Acoust. Soc. Am. 102(5), 2892-2905]. The proposed framework was successfully employed to simulate all experiments with a common parameter set and obtain objective thresholds with fewer assumptions than traditional modeling approaches. Depending on the feature set, the simulated reference-free thresholds were found to agree with, and hence to predict, empirical data from the literature. Across-frequency processing was found to be crucial for accurately modeling the lower speech reception thresholds observed in modulated noise compared to stationary noise.

  15. A robust framework for soft tissue simulations with application to modeling brain tumor mass effect in 3D MR images.

    PubMed

    Hogea, Cosmina; Biros, George; Abraham, Feby; Davatzikos, Christos

    2007-12-01

    We present a framework for black-box and flexible simulation of soft tissue deformation for medical imaging and surgical planning applications. Our main motivation in the present work is to develop robust algorithms that allow batch processing for registration of brains with tumors to statistical atlases of normal brains and construction of brain tumor atlases. We describe a fully Eulerian formulation able to handle large deformations effortlessly, with a level-set-based approach for evolving fronts. We use a regular grid-fictitious domain method approach, in which we approximate coefficient discontinuities, distributed forces and boundary conditions. This approach circumvents the need for unstructured mesh generation, which is often a bottleneck in the modeling and simulation pipeline. Our framework employs penalty approaches to impose boundary conditions and uses a matrix-free implementation coupled with a multigrid-accelerated Krylov solver. The overall scheme results in a scalable method with minimal storage requirements and optimal algorithmic complexity. We illustrate the potential of our framework to simulate realistic brain tumor mass effects at reduced computational cost, for aiding the registration process towards the construction of brain tumor atlases. PMID:18029982

  17. From Agents to Continuous Change via Aesthetics: Learning Mechanics with Visual Agent-Based Computational Modeling

    ERIC Educational Resources Information Center

    Sengupta, Pratim; Farris, Amy Voss; Wright, Mason

    2012-01-01

    Novice learners find motion as a continuous process of change challenging to understand. In this paper, we present a pedagogical approach based on agent-based, visual programming to address this issue. Integrating agent-based programming, in particular, Logo programming, with curricular science has been shown to be challenging in previous research…

  18. The Agent-based Approach: A New Direction for Computational Models of Development.

    ERIC Educational Resources Information Center

    Schlesinger, Matthew; Parisi, Domenico

    2001-01-01

    Introduces the concepts of online and offline sampling and highlights the role of online sampling in agent-based models of learning and development. Compares the strengths of each approach for modeling particular developmental phenomena and research questions. Describes a recent agent-based model of infant causal perception. Discusses limitations…

  19. The Trick Simulation Toolkit: A NASA/Open source Framework for Running Time Based Physics Models

    NASA Technical Reports Server (NTRS)

    Penn, John M.; Lin, Alexander S.

    2016-01-01

    This paper describes the design and use of the Trick Simulation Toolkit, a simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes Trick's design goals and how the development environment attempts to achieve those goals. It describes how Trick is used in some of the many training and engineering simulations at NASA. Finally, it describes the Trick NASA/Open source project on Github.

  20. Agent-Based Modeling of Chronic Diseases: A Narrative Review and Future Research Directions.

    PubMed

    Li, Yan; Lawley, Mark A; Siscovick, David S; Zhang, Donglan; Pagán, José A

    2016-01-01

    The United States is experiencing an epidemic of chronic disease. As the US population ages, health care providers and policy makers urgently need decision models that provide systematic, credible prediction regarding the prevention and treatment of chronic diseases to improve population health management and medical decision-making. Agent-based modeling is a promising systems science approach that can model complex interactions and processes related to chronic health conditions, such as adaptive behaviors, feedback loops, and contextual effects. This article introduces agent-based modeling by providing a narrative review of agent-based models of chronic disease and identifying the characteristics of various chronic health conditions that must be taken into account to build effective clinical- and policy-relevant models. We also identify barriers to adopting agent-based models to study chronic diseases. Finally, we discuss future research directions of agent-based modeling applied to problems related to specific chronic health conditions. PMID:27236380

  1. Agent-Based Modeling of Chronic Diseases: A Narrative Review and Future Research Directions

    PubMed Central

    Lawley, Mark A.; Siscovick, David S.; Zhang, Donglan; Pagán, José A.

    2016-01-01

    The United States is experiencing an epidemic of chronic disease. As the US population ages, health care providers and policy makers urgently need decision models that provide systematic, credible prediction regarding the prevention and treatment of chronic diseases to improve population health management and medical decision-making. Agent-based modeling is a promising systems science approach that can model complex interactions and processes related to chronic health conditions, such as adaptive behaviors, feedback loops, and contextual effects. This article introduces agent-based modeling by providing a narrative review of agent-based models of chronic disease and identifying the characteristics of various chronic health conditions that must be taken into account to build effective clinical- and policy-relevant models. We also identify barriers to adopting agent-based models to study chronic diseases. Finally, we discuss future research directions of agent-based modeling applied to problems related to specific chronic health conditions. PMID:27236380

  2. An agent-based approach to modelling the effects of extreme events on global food prices

    NASA Astrophysics Data System (ADS)

    Schewe, Jacob; Otto, Christian; Frieler, Katja

    2015-04-01

    Extreme climate events such as droughts or heat waves affect agricultural production in major food producing regions and therefore can influence the price of staple foods on the world market. There is evidence that recent dramatic spikes in grain prices were at least partly triggered by actual and/or expected supply shortages. The reaction of the market to supply changes is however highly nonlinear and depends on complex and interlinked processes such as warehousing, speculation, and export restrictions. Here we present for the first time an agent-based modelling framework that accounts, in simplified terms, for these processes and allows us to estimate the reaction of world food prices to supply shocks on a short (monthly) timescale. We test the basic model using observed historical supply, demand, and price data of wheat as a major food grain. Further, we illustrate how the model can be used in conjunction with biophysical crop models to assess the effect of future changes in extreme event regimes on the volatility of food prices. In particular, the explicit representation of storage dynamics makes it possible to investigate the potentially nonlinear interaction between simultaneous extreme events in different food producing regions, or between several consecutive events in the same region, which may both occur more frequently under future global warming.
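
    The nonlinear price response to a supply shock buffered by storage can be illustrated with a toy monthly loop. All functional forms and numbers below are invented for illustration and are not taken from the paper; the only idea carried over is that price reacts convexly to the stock-to-use ratio, so the same shock hurts more when stocks are low.

```python
def simulate_prices(months=24, demand=100.0, supply=102.0,
                    shock_month=12, shock_size=0.3, initial_stock=50.0):
    """Toy storage-and-price loop: a one-month supply shock is partly
    buffered by carry-over stocks, and the price reacts nonlinearly to
    the stock-to-use ratio. Illustrative assumptions throughout."""
    stock = initial_stock
    prices = []
    for m in range(months):
        s = supply * ((1.0 - shock_size) if m == shock_month else 1.0)
        stock = max(stock + s - demand, 0.0)   # carry-over storage
        stu = max(stock / demand, 0.05)        # stock-to-use ratio, floored
        prices.append(100.0 * stu ** -0.5)     # convex inverse-demand response
    return prices

prices = simulate_prices()
# the price spikes in the shock month and relaxes as stocks rebuild
```

    Because stocks carry over between months, a second shock arriving before stocks have rebuilt would hit a lower stock-to-use ratio and produce a disproportionately larger spike, which is the nonlinear interaction between consecutive events mentioned in the abstract.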

  3. Agent-based modeling of hyporheic dissolved organic carbon transport and transformation

    NASA Astrophysics Data System (ADS)

    Gabrielsen, P. J.; Wilson, J. L.; Pullin, M.

    2011-12-01

    Dissolved organic carbon (DOC) is a complex suite of organic compounds present in natural ecosystems, and is particularly studied in river and stream systems. The hyporheic zone (HZ), a region of surface water-shallow groundwater exchange, has been identified as a hotspot of DOC processing and is generally regarded as a net sink of organic matter. More recent studies into stream DOC have shifted to examining DOC quality rather than bulk quantity. DOC quality variability has been linked to hydrologic and climatic variability, both focuses of current climate change research. A new agent-based model in the NetLogo modeling environment couples hydrologic transport with chemical and biological transformation of DOC to simulate changing DOC quality in hyporheic flow. A pore-scale model implements a Lattice Boltzmann fluid dynamic model and surficial interactions to simulate sorption and microbial uptake. Upscaled to a stream meander scale, this model displays spatial variation and evolution of DOC quality. Model output metrics are correlated to field sample analytical results from a hyporheic meander of the East Fork Jemez River, Sandoval Co., NM.

  4. A standard protocol for describing individual-based and agent-based models

    USGS Publications Warehouse

    Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.

    2006-01-01

    Simulation models that describe autonomous individual organisms (individual based models, IBM) or agents (agent-based models, ABM) have become a widely used tool, not only in ecology, but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD as a first step for establishing a more detailed common format of the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.
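
    The three blocks and seven elements of ODD can be captured as a simple reusable template. The helper below is an illustrative sketch of how a modeller might scaffold a description, not part of the protocol itself; the function name and rendering format are invented.

```python
# The seven ODD elements, grouped into the protocol's three blocks
# (Overview, Design concepts, Details).
ODD_TEMPLATE = {
    "Overview": [
        "Purpose",
        "State variables and scales",
        "Process overview and scheduling",
    ],
    "Design concepts": ["Design concepts"],
    "Details": ["Initialization", "Input", "Submodels"],
}

def odd_skeleton(model_name, sections):
    """Render a plain-text ODD description, marking missing elements TODO."""
    lines = [f"ODD description: {model_name}"]
    for block, elements in ODD_TEMPLATE.items():
        lines.append(f"== {block} ==")
        for element in elements:
            lines.append(f"- {element}: {sections.get(element, 'TODO')}")
    return "\n".join(lines)

text = odd_skeleton("WolfSheepABM", {"Purpose": "Explore predator-prey cycles."})
```

    Starting from such a skeleton makes it hard to forget an element, which is precisely the reproducibility problem the protocol targets.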

  5. Quantitative Agent Based Model of Opinion Dynamics: Polish Elections of 2015.

    PubMed

    Sobkowicz, Pawel

    2016-01-01

    We present results of an abstract, agent based model of opinion dynamics simulations based on the emotion/information/opinion (E/I/O) approach, applied to a strongly polarized society, corresponding to the Polish political scene between 2005 and 2015. Under certain conditions the model leads to metastable coexistence of two subcommunities of comparable size (supporting the corresponding opinions), which corresponds to the bipartisan split found in Poland. Spurred by the recent breakdown of this political duopoly, which occurred in 2015, we present a model extension that describes both the long term coexistence of the two opposing opinions and a rapid, transitory change due to the appearance of a third party alternative. We provide quantitative comparison of the model with the results of polls and elections in Poland, testing the assumptions related to the modeled processes and the parameters used in the simulations. It is shown that, when the propaganda messages of the two incumbent parties differ in emotional tone, the political status quo may be unstable. The asymmetry of the emotions within the support bases of the two parties allows one of them to be 'invaded' by a newcomer third party very quickly, while the second remains immune to such invasion.

  6. Study of the attractor structure of an agent-based sociological model

    NASA Astrophysics Data System (ADS)

    Timpanaro, André M.; Prado, Carmen P. C.

    2011-03-01

    The Sznajd model is a sociophysics model based on the Potts model and used for describing opinion propagation in a society. It employs an agent-based approach and interaction rules favouring pairs of agreeing agents. It has been successfully employed in modeling some properties and scale features of both proportional and majority elections (see for instance the works of A. T. Bernardes and R. N. Costa Filho), but its stationary states are always consensus states. In order to explain more complicated behaviours, we have modified the bounded confidence idea (introduced earlier in other opinion models, such as the Deffuant model) with the introduction of prejudices and biases (we call this modification confidence rules), and have adapted it to the discrete Sznajd model. This generalized Sznajd model is able to reproduce almost all of the previous versions of the Sznajd model by using appropriate choices of parameters. We solved the attractor structure of the resulting model in a mean-field approach and made Monte Carlo simulations in a Barabási-Albert network. These simulations show great similarities with the mean-field results for the tested cases of 3 and 4 opinions. The dynamical systems approach that we devised allows for a deeper understanding of the potential of the Sznajd model as an opinion propagation model and can be easily extended to other models, like the voter model. Our modification of the bounded confidence rule can also be readily applied to other opinion propagation models.
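
    The agreeing-pair interaction rule at the heart of the Sznajd model is compact enough to sketch directly. The version below is the basic two-opinion model on a ring, not the generalized confidence-rule model or the Barabási-Albert network simulations of the paper; function names and parameters are illustrative.

```python
import random

def sznajd_step(s, rng):
    """One Sznajd update on a ring: a randomly chosen pair of agreeing
    neighbours imposes its shared opinion on the two outer neighbours."""
    n = len(s)
    i = rng.randrange(n)
    j = (i + 1) % n
    if s[i] == s[j]:
        s[(i - 1) % n] = s[i]
        s[(j + 1) % n] = s[i]

def run_to_consensus(n=30, max_steps=1_000_000, seed=42):
    """Iterate from a random initial condition until consensus (the
    generic absorbing state of the basic model) or a step budget."""
    rng = random.Random(seed)
    s = [rng.choice((-1, 1)) for _ in range(n)]
    steps = 0
    while len(set(s)) > 1 and steps < max_steps:
        sznajd_step(s, rng)
        steps += 1
    return s, steps

state, steps = run_to_consensus()
```

    Because only agreeing pairs act, domains of shared opinion can only grow, which is why the basic model always coarsens toward consensus; the confidence rules discussed in the abstract are what open up richer attractor structures.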

  7. Quantitative Agent Based Model of Opinion Dynamics: Polish Elections of 2015

    PubMed Central

    Sobkowicz, Pawel

    2016-01-01

    We present results of an abstract, agent based model of opinion dynamics simulations based on the emotion/information/opinion (E/I/O) approach, applied to a strongly polarized society, corresponding to the Polish political scene between 2005 and 2015. Under certain conditions the model leads to metastable coexistence of two subcommunities of comparable size (supporting the corresponding opinions), which corresponds to the bipartisan split found in Poland. Spurred by the recent breakdown of this political duopoly, which occurred in 2015, we present a model extension that describes both the long term coexistence of the two opposing opinions and a rapid, transitory change due to the appearance of a third party alternative. We provide quantitative comparison of the model with the results of polls and elections in Poland, testing the assumptions related to the modeled processes and the parameters used in the simulations. It is shown that, when the propaganda messages of the two incumbent parties differ in emotional tone, the political status quo may be unstable. The asymmetry of the emotions within the support bases of the two parties allows one of them to be 'invaded' by a newcomer third party very quickly, while the second remains immune to such invasion. PMID:27171226

  8. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.

  9. The role of research efficiency in the evolution of scientific productivity and impact: An agent-based model

    NASA Astrophysics Data System (ADS)

    You, Zhi-Qiang; Han, Xiao-Pu; Hadzibeganovic, Tarik

    2016-02-01

    We introduce an agent-based model to investigate the effects of production efficiency (PE) and hot field tracing capability (HFTC) on productivity and impact of scientists embedded in a competitive research environment. Agents compete to publish and become cited by occupying the nodes of a citation network calibrated by real-world citation datasets. Our Monte-Carlo simulations reveal that differences in individual performance are strongly related to PE, whereas HFTC alone cannot provide sustainable academic careers under intensely competitive conditions. Remarkably, the negative effect of high competition levels on productivity can be buffered by elevated research efficiency if simultaneously HFTC is sufficiently low.

  10. Validation of an open-source framework for the simulation of blood flow in biomedical devices

    NASA Astrophysics Data System (ADS)

    Quaini, Annalisa; Passerini, Tiziano; Villa, Umberto; Veneziani, Alessandro; Canic, Suncica

    2013-11-01

    We discuss the validation of an open source framework for the solution of problems arising in hemodynamics. The framework is assessed through experimental data for fluid flow in an idealized medical device with rigid boundaries. The core of the framework is an open source parallel finite element library that features several algorithms for fluid problems. The numerical results for the flow in the idealized medical device are in good quantitative agreement with the measured axial components of the velocity and pressures for flow rates corresponding to laminar, transitional, and turbulent regimes. A detailed account of the methods is provided. Support through grants NSF DMS-1109189 and NIH R01 HL70531 is gratefully acknowledged.

  11. Development of a Dynamically Configurable,Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: aerospace system and component representation using a hierarchical object-oriented component model, which enables the use of multimodels and enforces component interoperability; a collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models; and development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process, and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.

  12. Molecular simulations for energy, environmental and pharmaceutical applications of nanoporous materials: from zeolites, metal-organic frameworks to protein crystals.

    PubMed

    Jiang, Jianwen; Babarao, Ravichandar; Hu, Zhongqiao

    2011-07-01

    Nanoporous materials have widespread applications in chemical industry, but the pathway from laboratory synthesis and testing to practical utilization of nanoporous materials is substantially challenging and requires fundamental understanding from the bottom up. With ever-growing computational resources, molecular simulations have become an indispensable tool for material characterization, screening and design. This tutorial review summarizes the recent simulation studies in zeolites, metal-organic frameworks and protein crystals, and provides a molecular overview for energy, environmental and pharmaceutical applications of nanoporous materials with increasing degree of complexity in building blocks. It is demonstrated that molecular-level studies can bridge the gap between physical and engineering sciences, unravel microscopic insights that are otherwise experimentally inaccessible, and assist in the rational design of new materials. The review is concluded with major challenges in future simulation exploration of novel nanoporous materials for emerging applications.

  13. Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Briggs, Jeffery L.

    2008-01-01

    The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so, ROSE frees the modeler to develop a library of standard modeling processes, such as Design of Experiments, optimizers, parameter studies, and sensitivity studies, which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well-defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
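    The separation that ROSE describes can be illustrated with a minimal sketch (class and method names here are invented for illustration, not ROSE's actual API): the model exposes a uniform interface, and execution processes such as a parameter study are written against that interface rather than against any particular model.

    ```python
    from abc import ABC, abstractmethod

    class Model(ABC):
        """Uniform model interface an execution process relies on."""
        @abstractmethod
        def set_input(self, name, value): ...
        @abstractmethod
        def run(self): ...
        @abstractmethod
        def get_output(self, name): ...

    class ParameterStudy:
        """Reusable execution process: sweep one input, record one output."""
        def __init__(self, input_name, values, output_name):
            self.input_name = input_name
            self.values = values
            self.output_name = output_name

        def execute(self, model):
            results = []
            for v in self.values:
                model.set_input(self.input_name, v)
                model.run()
                results.append(model.get_output(self.output_name))
            return results

    class Parabola(Model):
        """Trivial stand-in model: y = x**2."""
        def __init__(self):
            self.inputs, self.outputs = {}, {}
        def set_input(self, name, value):
            self.inputs[name] = value
        def run(self):
            self.outputs["y"] = self.inputs["x"] ** 2
        def get_output(self, name):
            return self.outputs[name]

    study = ParameterStudy("x", [0, 1, 2, 3], "y")
    print(study.execute(Parabola()))  # [0, 1, 4, 9]
    ```

    The same `ParameterStudy` instance could drive any other `Model` subclass unchanged, which is the point of divorcing execution from the model.
    
    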

  14. Agent-based modeling and systems dynamics model reproduction.

    SciTech Connect

    North, M. J.; Macal, C. M.

    2009-01-01

    Reproducibility is a pillar of the scientific endeavour. We view computer simulations as laboratories for electronic experimentation and therefore as tools for science. Recent studies have addressed model reproduction and found it to be surprisingly difficult to replicate published findings. There have been enough failed simulation replications to raise the question, 'can computer models be fully replicated?' This paper answers in the affirmative by reporting on a successful reproduction study using Mathematica, Repast and Swarm for the Beer Game supply chain model. The reproduction process was valuable because it demonstrated the original result's robustness across modelling methodologies and implementation environments.
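    A toy version of the Beer Game dynamics can be written in a few lines; this is not the model replicated in the paper, only a hedged sketch of the mechanism (all parameters and the ordering heuristic are invented): four stages pass orders upstream, receive shipments with a one-period delay, and each stage orders what it was asked for plus a correction toward a target stock, which amplifies demand swings upstream (the bullwhip effect).

    ```python
    def simulate(periods=24, target=12):
        stages = 4                           # retailer ... factory
        inventory = [target] * stages
        orders = [4] * stages                # last order placed by each stage
        order_log = [[] for _ in range(stages)]
        for t in range(periods):
            demand = 4 if t < 4 else 8       # step change in customer demand
            for i in range(stages):
                inventory[i] += orders[i]    # shipment = last period's order
                ask = demand if i == 0 else orders[i - 1]
                inventory[i] -= ask          # negative inventory = backlog
                # anchor-and-adjust rule: reorder what was asked, plus
                # half of the gap to the target inventory level
                orders[i] = max(0, ask + (target - inventory[i]) // 2)
                order_log[i].append(orders[i])
        return order_log

    log = simulate()
    # upstream orders swing more widely than retail demand (bullwhip)
    print(max(log[0]), max(log[3]))
    ```
    
    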

  15. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    NASA Astrophysics Data System (ADS)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
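    The key ingredient of the JFNK method mentioned above is that the Jacobian is never assembled: a Krylov solver only needs the action J(u)·v, which can be approximated by a finite difference of the residual. A minimal sketch with an invented toy residual (not MOOSE code):

    ```python
    import numpy as np

    def F(u):
        """Toy nonlinear residual; F(u) = 0 at u = (1, 2)."""
        return np.array([u[0] ** 2 - 1.0, u[0] * u[1] - 2.0])

    def jac_vec(F, u, v, eps=1e-7):
        """Matrix-free approximation of J(u) @ v via a finite difference."""
        return (F(u + eps * v) - F(u)) / eps

    u = np.array([1.0, 2.0])
    v = np.array([1.0, 0.0])
    # analytic Jacobian at u is [[2*u0, 0], [u1, u0]] = [[2, 0], [2, 1]],
    # so J @ v should be approximately (2, 2)
    print(jac_vec(F, u, v))
    ```

    Inside a full JFNK solve, each GMRES iteration would call `jac_vec` instead of multiplying by a stored matrix, which is what makes tightly coupled multiphysics systems tractable.
    
    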

  16. Atomistic simulation studies on the dynamics and thermodynamics of nonpolar molecules within the zeolite imidazolate framework-8.

    PubMed

    Pantatosaki, Evangelia; Pazzona, Federico G; Megariotis, Gregory; Papadopoulos, George K

    2010-02-25

    Statistical-mechanics-based simulation studies at the atomistic level of argon (Ar), methane (CH(4)), and hydrogen (H(2)) sorbed in the zeolite imidazolate framework-8 (ZIF-8) are reported. ZIF-8 is a product of a special kind of chemical process, recently termed reticular synthesis, which has generated a class of materials of critical importance as molecular binders. In this work, we explore the mechanisms that govern the sorption thermodynamics and kinetics of nonpolar sorbates possessing different sizes and strengths of interaction with the metal-organic framework to understand the outstanding properties of this novel class of sorbents, as revealed by experiments published elsewhere. For this purpose, we have developed an in-house modeling procedure involving calculations of sorption isotherms, partial internal energies, various probability density functions, and molecular dynamics for the simulation of the sorbed phase over a wide range of occupancies and temperatures within a digitally reconstructed unit cell of ZIF-8. The results showed that sorbates perceive a marked energetic inhomogeneity within the atomic framework of the metal-organic material under study, resulting in free energy barriers that give rise to inflections in the sorption isotherms and guide the dynamics of guest molecules.
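    The inflected isotherms that such energetic inhomogeneity produces are often rationalized with a closed-form dual-site Langmuir model: a strong site fills at low pressure, a weak site only at high pressure. This is not the paper's atomistic model, and all parameter values below are invented for illustration.

    ```python
    def dual_site_loading(p, q1=1.0, b1=5.0, q2=1.0, b2=0.05):
        """Loading from a strong site (q1, b1) plus a weak site (q2, b2)."""
        return q1 * b1 * p / (1 + b1 * p) + q2 * b2 * p / (1 + b2 * p)

    # sweep pressure over several decades; the weak site's late filling
    # produces the characteristic step/inflection in the isotherm
    pressures = [0.01 * 2 ** k for k in range(14)]
    loadings = [dual_site_loading(p) for p in pressures]
    print([round(q, 2) for q in loadings])
    ```
    
    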

  17. A new numerical framework for simulating the control of weather and climate on the evolution of soil-mantled hillslopes

    NASA Astrophysics Data System (ADS)

    Bovy, Benoît; Braun, Jean; Demoulin, Alain

    2016-06-01

    We present a new numerical framework for simulating short- to long-term hillslope evolution. This modeling framework, to which we have given the name CLICHE (CLImate Control on Hillslope Evolution), aims to better capture the control of climate on soil dynamics. It allows the use of realistic forcing that involves, through a specific time discretization scheme, the variability of both the temperature and precipitation at time scales ranging from daily rainfall events to the climatic oscillations of the Quaternary, also including seasonal variability. Two simple models of soil temperature and soil water balance link the climatic inputs to derived quantities that enter the computation of the soil flux, such as the surface water discharge and the depth of the non-frozen soil layer. Using this framework together with a multi-process parameterization of soil transport, we apply an original method to calculate hillslope effective diffusivity as a function of climate. This allows us to demonstrate the ability of the model to simulate observed rates of hillslope erosion under different climates (cold and temperate) with a single set of parameter values. Numerical experiments furthermore suggest a potential high peak of sediment transport on hillslopes during the glacial-interglacial transitions of the Quaternary. We finally discuss the need to improve the parameterization of the soil production and transport processes in order to explicitly account for other key controlling factors that are also climate-sensitive, such as biological activity.
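    The role of an effective diffusivity can be seen in the simplest hillslope model, linear diffusion dz/dt = D d²z/dx². The sketch below (not the CLICHE code; grid, parameters, and the fixed-elevation boundaries are invented for illustration) uses an explicit finite-difference scheme, where D would be made climate-dependent in the framework above.

    ```python
    import numpy as np

    def evolve(z, D, dx, dt, steps):
        """Explicit finite-difference linear hillslope diffusion."""
        z = z.copy()
        for _ in range(steps):
            curvature = (z[2:] - 2 * z[1:-1] + z[:-2]) / dx ** 2
            z[1:-1] += dt * D * curvature   # endpoints held fixed
        return z

    x = np.linspace(0.0, 100.0, 101)
    z0 = 10.0 * np.exp(-(((x - 50.0) / 10.0) ** 2))  # initial ridge
    # stability requires D * dt / dx**2 <= 0.5; here it is 0.1
    z = evolve(z0, D=0.01, dx=1.0, dt=10.0, steps=1000)
    print(z.max() < z0.max())  # diffusion lowers the crest
    ```
    
    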

  18. Bayesian networks and agent-based modeling approach for urban land-use and population density change: a BNAS model

    NASA Astrophysics Data System (ADS)

    Kocabas, Verda; Dragicevic, Suzana

    2013-10-01

    Land-use change models grounded in complexity theory such as agent-based models (ABMs) are increasingly being used to examine evolving urban systems. The objective of this study is to develop a spatial model that simulates land-use change under the influence of human land-use choice behavior. This is achieved by integrating the key physical and social drivers of land-use change using Bayesian networks (BNs) coupled with agent-based modeling. The BNAS model, integrated Bayesian network-based agent system, presented in this study uses geographic information systems, ABMs, BNs, and influence diagram principles to model population change on an irregular spatial structure. The model is parameterized with historical data and then used to simulate 20 years of future population and land-use change for the City of Surrey, British Columbia, Canada. The simulation results identify feasible new urban areas for development around the main transportation corridors. The obtained new development areas and the projected population trajectories, with the "what-if" scenario capabilities, can provide insights for urban planners toward better and more informed land-use policy or decision-making processes.
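    The BN-ABM coupling can be sketched very simply: a conditional probability table (standing in for a learned Bayesian network) gives each agent's probability of developing a cell given its drivers. Everything below, including the variables and probabilities, is invented for illustration and is not the BNAS model itself.

    ```python
    import random

    # P(develop | near_transport, high_demand) - a hand-specified CPT
    P_DEVELOP = {
        (True, True): 0.8, (True, False): 0.5,
        (False, True): 0.3, (False, False): 0.05,
    }

    def step(cells, high_demand, rng):
        """Each undeveloped cell develops with its CPT-given probability."""
        for c in cells:
            if not c["developed"]:
                p = P_DEVELOP[(c["near_transport"], high_demand)]
                if rng.random() < p:
                    c["developed"] = True
        return cells

    rng = random.Random(42)
    cells = [{"near_transport": i % 2 == 0, "developed": False}
             for i in range(1000)]
    for _ in range(5):
        step(cells, high_demand=True, rng=rng)

    near = sum(c["developed"] for c in cells if c["near_transport"])
    far = sum(c["developed"] for c in cells if not c["near_transport"])
    print(near > far)  # development clusters near transport corridors
    ```
    
    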

  19. Dynamics of bloggers’ communities: Bipartite networks from empirical data and agent-based modeling

    NASA Astrophysics Data System (ADS)

    Mitrović, Marija; Tadić, Bosiljka

    2012-11-01

    We present an analysis of the empirical data and the agent-based modeling of the emotional behavior of users on the Web portals where the user interaction is mediated by posted comments, like Blogs and Diggs. We consider the dataset of discussion-driven popular Diggs, in which all comments are screened by machine-learning emotion detection in the text, to determine positive and negative valence (attractiveness and aversiveness) of each comment. By mapping the data onto a suitable bipartite network, we perform an analysis of the network topology and the related time-series of the emotional comments. The agent-based model is then introduced to simulate the dynamics and to capture the emergence of the emotional behaviors and communities. The agents are linked to posts on a bipartite network, whose structure evolves through their actions on the posts. The emotional states (arousal and valence) of each agent fluctuate in time, subject to the current contents of the posts to which the agent is exposed. By an agent's action on a post, its current emotions are transferred to the post. The model rules and the key parameters are inferred from the considered empirical data to ensure their realistic values and mutual consistency. The model assumes that the emotional arousal over posts drives the agent's action. The simulations are performed for the case of constant flux of agents and the results are analyzed in full analogy with the empirical data. The main conclusions are that the emotion-driven dynamics leads to long-range temporal correlations and emergent networks with community structure, comparable with the ones in the empirical system of popular posts. In view of pure emotion-driven agents' actions, this type of comparison provides a quantitative measure for the role of emotions in the dynamics on real blogs. Furthermore, the model reveals the underlying mechanisms which relate the post popularity with the emotion dynamics and the prevalence of negative
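    The arousal-driven action rule described above can be sketched as a toy loop (all parameters, thresholds, and update rules below are invented for illustration, not the paper's calibrated model): exposure to a post nudges an agent's valence, rising arousal eventually triggers an action, and the action transfers the agent's valence to the post before the agent relaxes.

    ```python
    import random

    def simulate(n_agents=50, n_posts=5, steps=200, threshold=0.6, seed=1):
        rng = random.Random(seed)
        arousal = [rng.random() for _ in range(n_agents)]
        valence = [rng.uniform(-1, 1) for _ in range(n_agents)]
        post_valence = [0.0] * n_posts
        actions = 0
        for _ in range(steps):
            i = rng.randrange(n_agents)      # agent exposed this step
            p = rng.randrange(n_posts)       # post the agent reads
            # the post's emotional content nudges the agent's valence
            valence[i] = 0.9 * valence[i] + 0.1 * post_valence[p]
            arousal[i] = min(1.0, arousal[i] + rng.uniform(0.0, 0.2))
            if arousal[i] > threshold:       # arousal drives the action
                # the action transfers the agent's valence to the post
                post_valence[p] = 0.5 * (post_valence[p] + valence[i])
                arousal[i] *= 0.3            # relaxation after acting
                actions += 1
        return actions, post_valence

    actions, post_valence = simulate()
    print(actions, [round(v, 2) for v in post_valence])
    ```
    
    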

  20. A Generalized Framework for Different Drought Indices: Testing its Suitability in a Simulation of the last two Millennia for Europe

    NASA Astrophysics Data System (ADS)

    Raible, Christoph C.; Baerenbold, Oliver; Gomez-Navarro, Juan Jose

    2016-04-01

    Over the past decades, different drought indices have been suggested in the literature. This study tackles the problem of how to characterize drought by defining a general framework and proposing a generalized family of drought indices that is flexible regarding the use of different water balance models. The sensitivity of various indices and their skill in representing drought conditions are evaluated using a regional model simulation in Europe spanning the last two millennia as test bed. The framework combines an exponentially damped memory with a normalization method based on quantile mapping. Both approaches are more robust and physically meaningful compared to the existing methods used to define drought indices. Still, the framework is flexible with respect to the water balance, enabling users to adapt the index formulation to the data availability of different locations. Based on the framework, indices built on water balances of different complexity are compared with each other. The comparison shows that a drought index considering only precipitation in the water balance is sufficient for Western
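    The two ingredients named above, an exponentially damped memory of the water balance and a quantile-mapping normalization, can be sketched as follows (function names, the decay constant, and the toy series are all invented; the normalization here uses empirical ranks mapped to standard-normal quantiles, one simple form of quantile mapping):

    ```python
    import math
    from statistics import NormalDist

    def damped_memory(series, tau=6.0):
        """Exponentially damped accumulation of a monthly water balance."""
        out, acc, decay = [], 0.0, math.exp(-1.0 / tau)
        for x in series:
            acc = decay * acc + x
            out.append(acc)
        return out

    def quantile_normalize(values):
        """Empirical quantile mapping onto standard-normal values."""
        n = len(values)
        order = sorted(range(n), key=lambda i: values[i])
        z = [0.0] * n
        for rank, i in enumerate(order):
            z[i] = NormalDist().inv_cdf((rank + 0.5) / n)  # plotting position
        return z

    balance = [5, -3, 0, 2, -6, -4, 1, 3, -2, 0, 4, -5] * 3  # toy P - E series
    index = quantile_normalize(damped_memory(balance))
    print(min(index) < -1 and max(index) > 1)  # standardized drought index
    ```
    
    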